Making sure you set money aside for retirement regularly is hard enough. Add in the fact that investing your hard-earned money may seem complicated, and it’s easy to understand why 43% of millennials have less than $5,000 invested for retirement. However, the alternative to managing your own retirement portfolio is paying fees for someone else to do it. If you have absolutely no desire to control your own wealth, it may make sense to outsource it.
On the other hand, if you are hoping to accumulate as much money as possible, you’ll want to go the DIY route, since those advisory fees, paid year after year, can make a big dent in your wealth. Thankfully, the 3-Fund Portfolio might be the perfect way to invest if you want to keep things simple, low-fee, and highly effective.
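To see what that fee drag looks like, here is a minimal sketch in Python (the starting balance, return, and fee figures are illustrative assumptions, not a forecast):

principal, years, gross, fee = 10_000, 30, 0.07, 0.01  # assumed balance, horizon, gross return, advisory fee

diy = principal * (1 + gross) ** years            # self-managed, no advisory fee
advised = principal * (1 + gross - fee) ** years  # same gross return, minus a 1% annual fee

print(f"DIY:     ${diy:,.0f}")      # ~$76,123
print(f"Advised: ${advised:,.0f}")  # ~$57,435 — nearly $19,000 less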
What Is A 3-Fund Portfolio?
A 3-Fund Portfolio is simply an investment portfolio composed of only three assets, which are typically low-cost index funds. It is a type of lazy portfolio, since it requires very little maintenance on your part.
Portfolio managers make decisions about investment mix and policy, matching investments to objectives, asset allocation for individuals and institutions, and balancing risk against performance. Portfolio management is about strengths, weaknesses, opportunities, and threats in the choice of debt vs. equity, domestic vs. international, growth vs. safety, and other trade-offs encountered in the attempt to maximize return at a given appetite for risk.
Portfolio managers are presented with investment ideas by internal buy-side analysts and sell-side analysts from investment banks. It is their job to sift through the relevant information and use their judgment to buy and sell securities. Throughout the day they read reports, talk to company managers, and monitor industry and economic trends, looking for the right company and time to invest the portfolio’s capital.
A team of analysts and researchers is ultimately responsible for establishing an investment strategy, selecting appropriate investments, and allocating each investment properly for a fund or asset management vehicle. In the case of mutual and exchange-traded funds (ETFs), there are two forms of portfolio management: passive and active. Passive management simply tracks a market index, commonly referred to as indexing or index investing.
Active management involves a single manager, co-managers, or a team of managers who attempt to beat the market return by actively managing a fund’s portfolio through investment decisions based on research and decisions on individual holdings. Closed-end funds are generally actively managed.
Modern portfolio theory was introduced in a 1952 doctoral thesis by Harry Markowitz; see Markowitz model. It assumes that an investor wants to maximize a portfolio’s expected return contingent on any given amount of risk. For portfolios that meet this criterion, known as efficient portfolios, achieving a higher expected return requires taking on more risk, so investors are faced with a trade-off between risk and expected return. This risk-expected return relationship of efficient portfolios is graphically represented by a curve known as the efficient frontier.
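In modern notation (a sketch, not Markowitz’s original presentation), with portfolio weights $w$, expected returns $\mu$, and covariance matrix $\Sigma$, each efficient portfolio solves

$$\min_{w}\; w^{\top}\Sigma w \quad \text{subject to} \quad w^{\top}\mu = \mu^{*}, \qquad w^{\top}\mathbf{1} = 1,$$

where $\mu^{*}$ is a target expected return; varying $\mu^{*}$ traces out the efficient frontier.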
All efficient portfolios, each represented by a point on the efficient frontier, are well-diversified. However, optimizing portfolios when return distributions are non-Gaussian is mathematically challenging, and ignoring higher moments can lead to significant over-investment in risky securities, especially when volatility is high.
Portfolio optimization often takes place in two stages: optimizing weights of asset classes to hold, and optimizing weights of assets within the same asset class. An example of the former would be choosing the proportions placed in equities versus bonds, while an example of the latter would be choosing the proportions of the stock sub-portfolio placed in stocks X, Y, and Z. Equities and bonds have fundamentally different financial characteristics and different systematic risk, and hence can be viewed as separate asset classes; holding some of the portfolio in each class provides some diversification, and holding various specific assets within each class affords further diversification. By using such a two-step procedure, one eliminates non-systematic risks both on the individual-asset and the asset-class level. For the specific formulas for efficient portfolios, see Portfolio separation in mean-variance analysis.
One approach to portfolio optimization is to specify a von Neumann–Morgenstern utility function defined over final portfolio wealth; the expected value of utility is to be maximized. To reflect a preference for higher rather than lower returns, this objective function is increasing in wealth, and to reflect risk aversion it is concave. For realistic utility functions in the presence of many assets that can be held, this approach, while theoretically the most defensible, can be computationally intensive.
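As a toy illustration of this approach (the return distributions and risk-aversion parameter below are assumptions for the example, not recommendations), a grid search over a two-asset mix that maximizes expected CRRA utility might look like this in Python:

import numpy as np

rng = np.random.default_rng(1)

# Assumed, illustrative annual return distributions for two assets
stock = rng.normal(0.08, 0.18, 100_000)
bond = rng.normal(0.03, 0.05, 100_000)

def crra_utility(wealth, gamma=3.0):
    # Constant relative risk aversion: increasing in wealth, concave
    return wealth ** (1 - gamma) / (1 - gamma)

weights = np.linspace(0, 1, 101)
eu = [crra_utility(np.maximum(1 + w * stock + (1 - w) * bond, 1e-9)).mean()
      for w in weights]

print(f"Utility-maximizing stock weight: {weights[np.argmax(eu)]:.2f}")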
Harry Markowitz developed the “critical line method”, a general procedure for quadratic programming that can handle additional linear constraints and upper and lower bounds on holdings. Moreover, in this context, the approach provides a method for determining the entire set of efficient portfolios. Its application here was later explicated by William Sharpe.
Portfolio optimization is usually done subject to constraints, such as regulatory constraints, or illiquidity. These constraints can lead to portfolio weights that focus on a small sub-sample of assets within the portfolio. When the portfolio optimization process is subject to other constraints such as taxes, transaction costs, and management fees, the optimization process may result in an under-diversified portfolio.
Investors may be forbidden by law to hold some assets. In some cases, unconstrained portfolio optimization would lead to short-selling of some assets. However short-selling can be forbidden. Sometimes it is impractical to hold an asset because the associated tax cost is too high. In such cases appropriate constraints must be imposed on the optimization process.
Transaction costs are the costs of trading in order to change the portfolio weights. Since the optimal portfolio changes with time, there is an incentive to re-optimize frequently. However, trading too often incurs excessive transaction costs, so the optimal strategy is to find the frequency of re-optimization and trading that appropriately trades off avoiding transaction costs against avoiding an out-of-date set of portfolio proportions.
Investment is a forward-looking activity, and thus the covariances of returns must be forecast rather than observed. Portfolio optimization assumes the investor has some risk aversion, and that stock prices may differ significantly between their historical or forecast values and what is actually experienced. In particular, financial crises are characterized by a significant increase in the correlation of stock price movements, which may seriously degrade the benefits of diversification.
In a mean-variance optimization framework, accurate estimation of the variance-covariance matrix is paramount. Quantitative techniques that use Monte Carlo simulation with the Gaussian copula and well-specified marginal distributions are effective. It is important that the modeling process allows for empirical characteristics in stock returns such as autoregression, asymmetric volatility, skewness, and kurtosis. Not accounting for these attributes can lead to severe estimation error in the correlations, variances, and covariances, with negative biases of as much as 70% of the true values.
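A minimal sketch of that simulation approach in Python (the correlation matrix and Student-t marginals below are assumed purely for illustration):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Assumed inputs: a two-asset correlation matrix and heavy-tailed t marginals
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
df = 4  # low degrees of freedom, so the marginals exhibit excess kurtosis

# Gaussian copula: correlated normals -> uniforms -> specified marginals
z = rng.standard_normal((100_000, 2)) @ np.linalg.cholesky(corr).T
u = stats.norm.cdf(z)
returns = stats.t.ppf(u, df) * 0.01  # scaled to daily-return magnitudes

print(np.corrcoef(returns, rowvar=False))  # the dependence structure survives the mapping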
Other optimization strategies that focus on minimizing tail risk (e.g., value at risk, conditional value at risk) in investment portfolios are popular amongst risk-averse investors. To minimize exposure to tail risk, forecasts of asset returns using Monte Carlo simulation with vine copulas to allow for lower (left) tail dependence (e.g., Clayton, Rotated Gumbel) across large portfolios of assets are most suitable. (Tail) risk parity focuses on the allocation of risk, rather than the allocation of capital.
According to the US Department of Labor, workplace injuries cost an estimated $161.5 billion yearly. In Wholesale and Retail Trade (WRT) establishments, lost workday injuries are caused mainly by slips, trips, and falls. A study in the United States in 2020 found that falls accounted for 33% of nonfatal injuries, making it the highest cause of preventable workplace nonfatal injuries. Moreover, falls were the third highest cause of preventable fatal workplace injuries at 21%.
According to The National Institute for Occupational Safety and Health (NIOSH), factors that can lead to workplace injuries include:
Workplace factors – Slippery surface, loose floor coverings, obstructed vision by boxes or containers, poor lighting, lack of maintenance of walking surfaces.
Work organization factors – High working pace that may cause workers to rush, tasks involving handling greasy or liquid materials that may make surfaces slippery.
Individual factors – Age, worker fatigue, and poor eyesight may affect vision and balance, and inappropriate footwear can cause tripping or slipping.
However, most WRT establishments have difficulty ensuring all health and safety protocols are adhered to both by employees and customers. The problem increases in a high-density environment with heavy human traffic. Managers are adopting innovative ways to complement the traditional solutions in the WRT stores.
Artificial Intelligence (AI), the Internet of Things (IoT), and Machine Learning (ML) have combined to detect, analyze, alert, and prevent hazards in the workplace. Workplace safety is significantly improved using real-time responses.
Computer vision
Computer vision uses digital inputs from images and videos to derive meaningful information, which a computer then analyzes to detect hazards and defects.
SeeChange (AI provider) and Keymakr Inc. (data-annotation service provider) partnered to leverage AI in preventing slips, trips, and falls using existing CCTV cameras in Asda (supermarket chain in the UK) stores. Keymakr’s SaaS platform empowers SeeChange’s SpillDetect tool to detect liquid spills automatically. The system then sends notifications to the staff on the location of the hazard.
According to Michael Abramov, CEO of Keylabs, Keymakr’s SaaS platform: “AI can be leveraged to detect accidents as soon as they happen, and AI-based smart checkout systems can eliminate the human-error factor. Implementing AI can save buyers and business owners from such dangers.”
Abramov says that AI does not suffer from fatigue and can monitor non-stop, tracking “the position of products on the shelves (and alert of a dangerous positioning)” and “the condition of the floors (and report any incidents, such as spilled products or products that have fallen off shelves).” He adds: “That’s not all of it, as AI surveillance systems can monitor the entire store, providing insights into customer behaviors and preventing thefts.”
relEYEble solutions offers computer vision services that integrate with existing cameras to detect the areas with the highest traffic in a store and monitor access to the premises. This helps reduce injuries caused by overcrowding and by limited access to exits in case of an emergency.
Fire detection systems traditionally have a response time of 3-5 minutes after a fire breaks out. That delay can be crucial, especially for large, fast-spreading fires, because it eats into the firefighting response time. Computer vision can detect fires from about 50m away and give an alert within 10-15 seconds. When connected to a PA system, it can make an immediate announcement providing the fire’s exact location and the best exit route.
Ergonomic sensors
Injuries from manual handling tasks are reduced through ergonomic training of workers. Feedback on optimum movement is sent to the worker to self-correct, paving the way for behavioral change.
One such company offering this solution is Soter Analytics. Soter devices worn on the shoulder, headset, helmet, and/or back monitor the risk of injury in real time. The gadgets are paired with a mobile application to deliver tailored coaching to a specific worker for a particular task. Studies have shown that hazardous movement is reduced by 30-70%. Managers also have access to the data from the Soter devices in real time. The managers can then use the data to:
Identify hazards.
Filter hazard risk by task, department, or individual.
Identify priority areas requiring more focus.
According to Coca-Cola Amatil Limited (CCA), it reduced the risk from manual handling by approximately 35% after using Soter’s SoterCoach and Clip&Go solutions for six months. Mr. Shawn Rush from Giant Eagle stated that the risk from hazardous movement was reduced by nearly 50% for the team members who participated in the process.
Predictive data and analytics
Predictive analytics uses various data obtained from the organization and analyzes it to forecast potential scenarios. The data collected and used in analytics includes root causes, complaints, and suggestions.
HGS Digital’s solution collects data, analyzes it, and runs what-if scenarios to determine the reasons for injuries and provide corrective action to mitigate the problem. Once the data is entered into the program, the tool analyzes the information without needing to be explicitly programmed.
Case management software
i-Sight is case management software similar to the HGS Digital solution. Unlike HGS, i-Sight only collects, tracks, and provides comprehensive reports; you have to use this information to prevent workplace injuries. i-Sight tracks and reports incidents such as:
Accidents
Injuries
Slips and falls
Fatalities
Near misses
Dangerous exposures
Managers can use the i-Sight dashboard to monitor incident reports and possible trends to identify high-risk areas or employees that require urgent attention.
Self-braking trolleys
Autonomous vehicles (AVs) are usually associated with cars. According to Anthony Ireson from Ford of Europe, supermarket trolleys can also use the technology.
The trolley comes with a pre-collision assist to help customers avoid accidents or reduce the effect of a collision. The sensors on the trolley detect people and objects ahead in its path. The self-braking trolley automatically applies the brakes when it detects a potential collision.
Although the trolley is still a prototype in the Ford shop, its application would make runaway trolleys a thing of the past, reducing accidents.
Robotics
Engineers from West Virginia University are developing robots to safeguard workers from workplace hazards. The robots detect risks found on floor surfaces in WRT establishments. Besides providing situational awareness, the robots would provide walkability maps and continually monitor the risks. Unlike other computer vision systems that use a site’s existing CCTV cameras, the robots would be equipped with built-in cameras to reduce the risk of being misled by surface appearance. The robots would also drive on the surface to better assess slip risk.
The development of the robots focuses on three key factors:
Identification and evaluation of holistic risks involving the operation of the robots in the working spaces.
Use of robots in other aspects, such as shopping guides.
Effect of walkability maps and the robots on employees’ injury risk.
I’m a writer fascinated by the obvious and hidden dynamics between tech, culture and politics and how societies and habits are shaped by this interplay.
What you do in small, almost undetectable moments of your life has the biggest impact.
Breakthroughs don’t change your life. Microhabits do. Benjamin Hardy compares this concept to compounding interest and how, given the choice, most people would take $1,000,000 in their bank account right now as opposed to a penny that doubles in value over the course of the month.
What most people don’t realize is that those who take the big payout end up with significantly less money than those who opt for the cent per day. He explains: “The doubling penny actually ends up being $10.7 million dollars. Yet, the majority of the growth happens at the very end, and most people aren’t patient enough for the big return. The live-for-the-moment culture of today stops people from investing.”
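Hardy’s arithmetic checks out for a 31-day month, as a two-line Python check shows:

penny_total = 0.01 * 2 ** 30   # a penny doubling daily: day 1 is $0.01, day 31 is 2^30 cents
print(f"${penny_total:,.2f}")  # $10,737,418.24 — the “$10.7 million” Hardy cites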
The point is that if you want to have a completely different life in a year or two, you need to start now, and you need to start small. Here are 22 impactful microhabits you can begin tonight.
1. Try to be rejected more.
Every day, reach out to one or two people who you’d like to work with, even if you are certain they would have no reason to respond.
It could be a potential employer, an organization at which you’d like to speak, or even a book agent, or client you’d love to work with. You might not hear back at first, but eventually, you will get a response from someone. You have nothing to lose, but potentially a lot to gain.
2. Write one paragraph.
Whether you have a book you’ve always dreamt of authoring, a business plan that’s been in the back of your mind for a while, or even just a blog you want to start, write just a few sentences each day. The momentum will build on its own and you’ll find yourself effortlessly writing more and more… but commit to just beginning with one paragraph.
3. Check your bank account.
Make it a habit to check in on all of your accounts at least once a day. If that sounds like a lot, it’s because it is. But what’s important is that you’re keeping yourself aware of exactly what you have, and where it’s going. Getting a better grip on your finances begins with having a consistently accurate mental layout of your accounts.
4. Get used to maintenance.
Aspirational tropes want you to believe that living your best life is like running a victory lap every day. In reality, it is more like being willing to tend to the unglamorous maintenance of things, like chores, cleaning, healthy cooking, staying current on bills and work assignments, or making time for exercise.
The quality of your life will be directly and drastically improved if you can incorporate necessary maintenance into your daily routine, and learn to see it as something that helps you rather than hinders you from having a great time.
5. Choose comfort for your future self over comfort right now.
If you want to change your life, you need to start considering the needs and wants of your future self over the ones you have right now. Prioritizing how you feel and what you want in the moment is what led you here. Instead, commit to making choices for the benefit of your future self. The idea that “being present” means disregarding anything but your most base instincts and desires is not enlightenment; it is self-destruction.
6. Be more responsive.
If someone sends a text, answer it when you see it. As often as you are able, respond to important emails as they come in. This will ensure that you aren’t left with a backlog of work that needs to be tended to.
7. Be less reactive.
When you see or hear something that immediately enrages you or upsets you (even if it’s just a negative thought that crops up in your head) before reacting to it and pouring your energy into it, question it. Figure out where it came from, and ask yourself whom your reaction to it would serve. Learning to take that micro-pause between a stimulus and your response will change the way you look at everything.
8. Fulfill your base needs.
You are not a machine, but in some ways, your body and life do require that you fuel them in certain ways to keep running. Eat when you are hungry. Sleep when you are tired. Trying to deny the importance of your most basic requirements for functioning does not mean you are busy and important; it means you are ignorant and setting yourself up for a breakdown or burnout.
9. Curate your sphere of influence.
You know that the people you spend the most time with have a significant impact on who you will become.
But do you also realize that what you are surrounding yourself with and putting into your head is having just as much, if not even more, of an effect on you? Take a serious look at who you follow online and what their presence on your newsfeed does for you, or perhaps how cluttered your home or office space is. This is your environment, and it is having a silent, and often subconscious, impact on you at all times.
10. Take action when you want to do something.
In The 5 Second Rule, Mel Robbins explains that a lot of what holds people back is those few seconds between when you have an amazing idea and when your brain interferes. She says that to really move your life forward, you need to act on your ideas before you convince yourself not to.
11. Take action when you don’t.
At the same time, it’s imperative to learn that just because you do not feel like doing something does not mean you are incapable of doing it. Your feelings do not impact your ability.
12. Read more.
If you aren’t someone who can get through a book, that’s okay. But it’s not an excuse to stop learning, growing and developing yourself. Follow people on social media that post or share interesting articles and ideas. Read a news story in the morning. Listen to an audiobook on your commute. How much you read is directly related to your self-growth, and your self-growth is directly related to your external success.
13. Scroll less.
Whereas sifting through TV channels was once the mindless pastime of choice, now it’s scrolling through news feeds. Train yourself to limit your “scroll” time each day. Try one of those browser extensions that give you a set amount of time you can spend on a website in a day before blocking access, or apps that count how many times you open social media apps. You don’t have to delete them entirely, but you should be mindful that you’re not spending multiple hours a day effectively doing nothing.
14. Observe your patterns.
Instead of being critical of yourself when you notice that you’re procrastinating or engaging in an unhealthy behavior, notice what prompts it. Notice what you’re doing when you feel most at ease, most inspired, or most frustrated. Observe yourself as a third party; treat your life like something you are studying. Get to know what you react to and how — this can help you direct your life.
15. Practice saying “no.”
Your energy is limited each day. Make sure it is only going toward what you truly care about. You should not feel bad about saying “no” to some things. It is ultimately a means of self-preservation.
16. Practice diverting your attention.
When you have a self-defeating thought, the solution isn’t usually to mull on it until you arrive at a different conclusion. The solution is usually to distract yourself with something productive. Get better at diverting your attention to something that helps you, not negative thoughts that can lead to a spiral.
17. Share your ideas consistently and clearly.
Having ideas is great, but they won’t go anywhere if you aren’t able to articulate them, or come up with an action plan that allows you to implement them.
18. Use what you have.
The next time you have the urge to go pick up dinner or a new outfit for the weekend, challenge yourself just once to wear what’s in your closet or eat what’s in your pantry, even if you don’t particularly want to.
19. Drink one more glass of water.
Don’t worry about pressuring yourself to get all of the recommended eight cups down perfectly. Just focus on drinking one more. Then, when that’s part of your routine, add another.
20. Eat one less unhealthy snack.
Don’t worry about trying to completely overhaul your diet and perfect every single thing that crosses your lips. Focus only on foregoing one single unhealthy choice that you’d make on any given day. Just one.
21. Create open portals for people to reach and contact you for what you want to do.
Make sure that you are consistently making your information available to those who may want to reach out to you. Your personal website and online presence are the new résumé, so make sure you are consistently updating and improving them, and making it easy for others to understand what you do and how to reach you.
22. Begin each day asking yourself: “How can I change my life today?”
Get out of the mindset that you have to “get through” the day and get into the mindset that the coming hours are filled with open-ended potential for you to take action that will change your life forever. The only difference is your willingness to see things differently, and your effort in trying to make them better.
For someone performing their first technical SEO audit, the results can be both overwhelming and intimidating. Often, you can’t see the wood for the trees and have no idea how to fix things or where to even begin.
After years of working with clients, especially as the head of tech SEO for a U.K. agency, I’ve found technical SEO audits to be a near-daily occurrence. With that, I know how important it is, especially for newer SEOs, to understand what each issue is and why it is important.
Understanding issues found within a technical audit allows you to analyze a site fully and come up with a comprehensive strategy.
In this guide, I am going to walk you through a step-by-step process for a successful tech audit and explain what each issue is and, perhaps more importantly, where it should lie on your priority list.
Whether it’s to make improvements on your own site or recommendations for your first client, this guide will help you to complete a technical SEO audit successfully and confidently in eight steps.
But first, let’s clarify some basics.
What is a technical SEO audit?
Technical SEO is the core foundation of any website. A technical SEO audit is an imperative part of site maintenance that analyzes the technical aspects of your website.
An audit will check if a site is optimized properly for the various search engines, including Google, Bing, Yahoo, etc.
This includes ensuring there are no issues related to crawlability and indexation that prevent search engines from allowing your site to appear on the search engine results pages (SERPs).
An audit involves analyzing all elements of your site to make sure that you have not missed out on anything that could be hindering the optimization process. In many cases, some minor changes can improve your ranking significantly.
Also, an audit can highlight technical problems your website has that you may not be aware of, such as hreflang errors, canonical issues, or mixed content problems.
Generally speaking, I always like to do an initial audit on a new site—whether that is one I just built or one I am seeing for the first time from a client—and then on a quarterly basis.
I think it is advisable to get into good habits with regular audits as part of ongoing site maintenance. This is especially true if you are working with a site that is continuously publishing new content.
It is also a good idea to perform an SEO audit when you notice that your rankings are stagnant or declining.
What do you need from a client before completing a technical audit?
Even if a client comes to me with goals that are not necessarily “tech SEO focused,” such as link building or creating content, it is important to remember that any technical issue can impede the success of the work we do going forward.
It is always important to assess the technical aspects of the site, offer advice on how to make improvements, and explain how those technical issues may impact the work we intend to do together.
With that said, if you intend on performing a technical audit on a website that is not your own, at a minimum, you will need access to the Google Search Console and Google Analytics accounts for that site.
How to perform a technical SEO audit in eight steps
For the most part, technical SEO audits are not easy. Unless you have a very small, simple business site that was perfectly built by an expert SEO, you’re likely going to run into some technical issues along the way.
Often, especially with more complex sites, such as those with a large number of pages or those in multiple languages, audits can be like an ever-evolving puzzle that can take days or even weeks to crack.
Regardless of whether you are looking to audit your own small site or a large one for a new client, I’m going to walk you through the eight steps that will help you to identify and fix some of the most common technical issues.
Step 1. Crawl your website
All you need to get started here is to set up a project in Ahrefs’ Site Audit, which you can even access for free as part of Ahrefs Webmaster Tools.
This tool scans your website to check how many URLs there are, how many are indexable, how many are not, and how many have issues.
From this, the audit tool creates an in-depth report on everything it finds to help you identify and fix any issues that are hindering your site’s performance.
Of course, more advanced issues may need further investigation that involves other tools, such as Google Search Console. But our audit tool does a great job at highlighting key issues, especially for beginner SEOs.
First, to run an audit with Site Audit, you will need to ensure your website is connected to your Ahrefs account as a project. The easiest way to do this is via Google Search Console, although you can verify your ownership by adding a DNS record or HTML file.
Once your ownership is verified, it is a good idea to check the Site Audit settings before running your first crawl. If you have a bigger site, it is always best to increase the crawl speed before you start.
There are a number of standard settings in place. For a small, personal site, these settings may be fine as they are. However, a setting like the maximum number of pages crawled under “Limits” is something you may want to alter for bigger projects.
Also, if you are looking for in-depth insight on Core Web Vitals (CWV), you may want to add your Google API key here too.
Once happy with the settings, you can run a new crawl under the “Site Audit” tab.
Initially, after running the audit, you will be directed to the “Overview” page. This will give you a top-level view of what the tool has found, including the number of indexable vs. non-indexable pages, top issues, and an overall website health score out of 100.
This gives you a quick, easy-to-understand proxy metric for overall website health.
From here, you can head over to the “All issues” tab. This breaks down all of the problems the crawler has found, how much of a priority they are to be fixed, and how to fix them.
This report, alongside other tools, can help you to start identifying the issues that may be hindering your performance on the SERPs.
Step 2. Spotting crawlability and indexation issues
If your site has pages that can’t be crawled by search engines, your website may not be indexed correctly, if at all. If your website does not appear in the index, it cannot be found by users.
Ensuring that search bots can crawl your website and collect data from it correctly means search engines can accurately place your site on the SERPs and you can rank for those all-important keywords.
There are a few things you need to consider when looking for crawlability issues:
Indexation errors
Robots.txt errors
Sitemap issues
Optimizing the crawl budget
Identifying indexation issues
Priority: High
Ensuring your pages are indexed is imperative if you want to appear anywhere on Google.
The simplest way to check how your site is indexed is by heading to Google Search Console and checking the Coverage report. Here, you can see exactly which pages are indexed, which pages have warnings, as well as which ones are excluded and why:
Note that pages will only appear in the search results if they are indexed without any issues.
If your pages are not being indexed, there are a number of issues that may be causing this. We will take a look at the top few below, but you can also check our other guide for a more in-depth walkthrough.
Checking the robots.txt file
Priority: High
The robots.txt file is arguably the most straightforward file on your website. But it is something that people consistently get wrong. Although you may advise search engines on how to crawl your site, it is easy to make errors.
Most search engines, especially Google, like to abide by the rules you set out in the robots.txt file. So if you tell a search engine not to crawl and/or index certain URLs or even your entire site by accident, that’s what will happen.
This is what the robots.txt file, which tells search engines not to crawl any pages, looks like:
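User-agent: *
Disallow: /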
Often, these instructions are left in the file even after the site goes live, preventing the entire site from being crawled. This is that rare easy fix that acts as a panacea for your SEO.
You can also check whether a single page is accessible and indexed by typing the URL into the Google Search Console search bar. If it’s not indexed yet and it’s accessible, you can “Request Indexing.”
The Coverage report in Google Search Console can also let you know if you’re blocking certain pages in robots.txt despite them being indexed:
Robots meta tags
A robots meta tag is an HTML snippet that tells search engines how to crawl or index a certain page. It’s placed into the <head> section of a webpage and looks like this:
<meta name="robots" content="noindex" />
This noindex is the most common one. And as you’ve guessed, it tells search engines not to index the page. We also often see the following robots meta tag on pages across whole websites:
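<meta name="robots" content="index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1" />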
This tells Google to use any of your content freely on its SERPs. The Yoast SEO plugin for WordPress adds this by default unless you add noindex or nosnippet directives.
If there are no robots meta tags on the page, search engines consider that as index, follow, meaning that they can index the page and crawl all links on it.
But noindex actually has a lot of uses:
Thin pages with little or no value for the user
Pages in the staging environment
Admin and thank-you pages
Internal search results
PPC landing pages
Pages about upcoming promotions, contests, or product launches
Duplicate content (use canonical tags to suggest the best version for indexing)
But improper use also happens to be a top indexability issue. Using the wrong attribute accidentally can have a detrimental effect on your presence on the SERPs, so remember to use it with care.
Sitemap issues
An XML sitemap helps Google to navigate all of the important pages on your website. Considering crawlers can’t stop and ask for directions, a sitemap ensures Google has a set of instructions when it comes to crawling and indexing your website.
But much like crawlers can be accidentally blocked via the robots.txt file, pages can be left out of the sitemap, meaning they likely won’t get prioritized for crawling.
Also, by having pages in your sitemap that shouldn’t be there, such as broken pages, you can confuse crawlers and affect your crawl budget (more on that next).
You can check sitemap issues in Site Audit: Site Audit > All issues > Other.
The main thing here is to ensure that all of the important pages that you want to have indexed are within your sitemap and avoid including anything else.
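For reference, a minimal sitemap following the sitemaps.org protocol looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/important-page/</loc>
    <lastmod>2022-08-01</lastmod>
  </url>
</urlset>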
Optimizing the crawl budget
A crawl budget refers to how many pages a search engine will crawl on your site, and how quickly it crawls them.
A variety of things influence the crawl budget. These include the number of resources on the website, as well as how valuable Google deems your indexable pages to be.
Having a big crawl budget does not guarantee that you will rank at the top of the SERPs. But if all of your critical pages are not crawled due to crawl budget concerns, it is possible that those pages may not be indexed.
Your pages are likely being scanned as part of your daily crawl budget if they are popular, receive organic traffic and links, and are well-linked internally across your site.
New pages—as well as those that are not linked internally or externally, e.g., those found on newer sites—may not be crawled as frequently, if at all.
For larger sites with millions of pages or sites that are often updated, crawl budget can be an issue. In general, if you have a large number of pages that aren’t being crawled or updated as frequently as you want, you should think about looking to speed up crawling.
Using the Crawl Stats report in Google Search Console can give you insight into how your site is being crawled and any issues that may have been flagged by the Googlebot.
Step 3. Check your on-page fundamentals
It is important to check your on-page fundamentals. Although many SEOs may tell you that on-page issues like those with meta descriptions aren’t a big deal, I personally think addressing them is part of good SEO housekeeping.
Even Google’s John Mueller previously stated that having multiple H1 tags on a webpage isn’t an issue. However, let’s think about SEO as a points system.
If you and a competitor have sites that stand shoulder to shoulder on the SERP, then even the most basic of issues could be the catalyst that determines who ranks at the top. So in my opinion, even the most basic of housekeeping issues should be addressed.
So let’s take a look at the following:
Page titles and title tags
Meta descriptions
Canonical tags
Hreflang tags
Structured data
Page titles and title tags
Priority: Medium
Title tags have a lot more value than most people give them credit for. Their job is to let Google and site visitors know what a webpage is about—like this:
Here’s what it looks like in raw HTML format:
<title>How to Craft the Perfect SEO Title Tag (Our 4-Step Process)</title>
In recent years, title tags have sparked a lot of debate in the SEO world. Google, it turns out, is likely to modify your title tag if it doesn’t like it.
One of the biggest reasons Google rewrites title tags is that they are simply too long. This is one issue that is highlighted within Site Audit.
In general, it is good practice to ensure all of your pages have title tags, none of which are longer than 60 characters.
Meta descriptions
A meta description is an HTML attribute that describes the contents of a page. It may be displayed as a snippet under the title tag in the search results to give further context.
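The tag itself lives in the page’s <head>; the content here is purely illustrative:

<meta name="description" content="Learn how to run a technical SEO audit in eight beginner-friendly steps." />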
More visitors will click on your website in the search results if it has a captivating meta description. Even though Google only shows meta descriptions in the results around 37% of the time, it is still important to ensure your most important pages have great ones.
You can find out if any meta descriptions are missing, as well as if they are too long or too short.
But writing meta descriptions is more than just filling a space. It’s about enticing potential site visitors.
Canonical tags
A canonical tag (rel=“canonical”) specifies the primary version of duplicate or near-duplicate pages. To put it another way, if you have roughly the same content available under several URLs, you should be using canonical tags to designate which version is the primary one that should be indexed.
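The tag sits in the <head> of each duplicate and points at the version you want indexed (the URL is a placeholder):

<link rel="canonical" href="https://example.com/primary-page/" />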
Canonical tags are an important part of SEO, mainly because Google doesn’t like duplicate content. Also, using canonical tags incorrectly (or not at all) can seriously affect your crawl budget.
If spiders are wasting their time crawling duplicate pages, it can mean that valuable pages are being missed.
You can find duplicate content issues in Site Audit: Site Audit > Reports > Duplicates > Issues.
Hreflang tags
Although hreflang is seemingly yet another simple HTML tag, it is possibly the most complex SEO element to get your head around.
The hreflang tag is imperative for sites in multiple languages. If you have many versions of the same page in a different language or target different parts of the world—for example, one version in English for the U.S. and one version in French for France—you need hreflang tags.
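For the U.S./France example above, the annotations would look something like this (the URLs are placeholders):

<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr-fr/" />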
Translating a website is time-consuming and costly because you’ll need to put in the effort to ensure all versions show up in the relevant search results. But it does give a better user experience by catering to users who consume content in different languages.
Plus, as clusters of multiple-language pages share each other’s ranking signals, using hreflang tags correctly can have a direct impact on rankings. This is alluded to by Gary Illyes from Google in this video.
You can find hreflang tag issues in Site Audit under localization: Site Audit > All issues > Localization.
Structured data
Structured data, often referred to as schema markup, has a number of valuable uses in SEO.
Most prominently, structured data is used to help get rich results or features in the Knowledge Panel. Here’s a great example: When working with recipes, more details are given about each result, such as the rating.
You also get a feature in the Knowledge Panel that shows what a chocolate chip cookie is (along with some nutritional information):
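Behind a result like that is JSON-LD markup along these lines (a pared-down sketch; the values are invented for illustration):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Chocolate Chip Cookies",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "ratingCount": "120"
  }
}
</script>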
Step 4. Image optimization
Image optimization is often overlooked when it comes to SEO. However, it has a number of benefits, including:
Improved load speed.
More traffic from Google Images.
More engaging user experience.
Improved accessibility.
Image issues can be found in the main audit report: Site Audit > Reports > Images.
Broken images
Priority: High
Broken images cannot be displayed on your website. This makes for a bad user experience in general, but it can also look spammy, giving visitors the impression that the site is not well maintained or professional.
This can be especially problematic for anyone who monetizes their website, as it can make the website seem less trustworthy.
Image file size too large
Priority: High
Large images on your website can seriously impact your site speed and performance. Ideally, you want to display images in the smallest possible size and in an appropriate format, such as WebP.
The best option is to optimize the image file size before uploading the image to your website. Tools like TinyJPG can optimize your images before they’re added to your site.
If you are looking to optimize existing images, there are tools available, especially for more popular content management systems (CMSs) like WordPress. Plugins such as Imagify or WP-Optimize are great examples.
HTTPS page links to HTTP image
Priority: Medium
HTTPS pages that link to HTTP images cause what is called “mixed content issues.” This means that a page is loaded securely via HTTPS. But a resource it links to, such as an image or video, is on an insecure HTTP connection.
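For example, a page served over HTTPS that embeds an image over plain HTTP triggers the warning (the URLs are placeholders):

<!-- on https://example.com/page/ -->
<img src="http://example.com/images/product.jpg" alt="Product photo" />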
Mixed content is a security issue. For those who monetize sites with display ads, it can even prevent ad providers from allowing ads on your site. It also degrades the user experience of your website.
By default, certain browsers block insecure resource requests. If your page depends on these insecure resources, it may not function correctly when they are blocked.
Missing alt text
Priority: Low
Alt text, or alternative text, describes an image on a website. It is an incredibly important part of image optimization, as it improves accessibility on your website for millions of people throughout the world who are visually impaired.
Often, those with a visual impairment use screen readers, which convert images into audio. Essentially, this is describing the image to the site visitor. Properly optimized alt text allows screen readers to inform site users with visual impairments exactly what they are seeing.
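In HTML, alt text is simply an attribute on the image tag (this example is hypothetical):

<img src="chocolate-chip-cookies.jpg" alt="A stack of chocolate chip cookies on a white plate" />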
Alt text can also serve as anchor text for image links, help you to rank on Google Images, and improve topical relevance.
Step 5. Internal links
When most people think of “links” for SEO, they think about backlinks: how to build them, how many they should have, and so on.
What many people don’t realize is the sheer importance of internal linking. In fact, internal links are like the jelly to backlinks’ peanut butter. Can you have one without the other? Sure. Are they always better together? You bet!
Not only do internal links help your external link building efforts, but they also make for a better website experience for both search engines and users.
The proper siloing of topics using internal linking creates an easy-to-understand topical roadmap for everyone who comes across your site. This has a number of benefits:
Creates relevancy for keywords
Helps ensure all content is crawled
Makes it easy for visitors to find relevant content or products
Of course, when done right, all of this makes sense. But internal links should be audited when you first get your hands on a site because things may not be as orderly as you’ll want.
4xx status codes
Priority: High
Go to Site Audit > Internal pages > Issues tab > 4XX page.
Here, you can see all of your site’s broken internal pages.
These are problematic because they waste “link equity” and provide users with a negative experience.
Here are a few options for dealing with these issues:
Bring back the broken page at the same address (if deleted by accident)
Redirect the broken page to a more appropriate location; all internal links referring to it should be updated or removed
Orphan pages
Priority: High
Go to Site Audit > Links > Issues tab > Orphan page (has no incoming internal links).
Here, we highlight pages that have zero internal links pointing to them.
There are two reasons why indexable pages should not be orphaned:
Internal links will not pass PageRank because there are none.
They won’t be found by Google unless you upload your sitemap through Google Search Console or there are backlinks to them from crawled pages on other websites.
If your website has multiple orphaned pages, filter the list from high to low for organic traffic. If internal links are added to orphaned pages that still receive organic traffic, those pages will likely gain far more traffic.
Step 6. External links
External links are hyperlinks within your pages that link to another domain. That means all of your backlinks—the links to your website from another one—are someone else’s external links.
See how the magic of the internet is invisibly woven together? *mind-blown emoji*
External links are often used to back up sources in the form of citations. For example, if I am writing a blog post and discussing metrics from a study, I’ll externally link to where I found that authoritative source.
Linking to credible sources makes your own website more credible to both visitors and search engines. This is because you show that your information is backed up with sound research.
Linking to other websites is a great way to provide value to your users. Oftentimes, links help users to find out more, to check out your sources, and to better understand how your content is relevant to the questions that they have.
As always, just like anything else, external links can cause issues. These can be found in the audit report (similar to internal links): Site Audit > All issues > Links.
As you can see from the image above, links are broken down into indexable and not indexable and you can find the same issues across both categories. However, each issue has a different predetermined importance level—depending on whether the link is indexable or not.
Page has links to broken page
Priority: High (if indexable)
This issue can refer to both internal and external links and simply means that the URLs linked to are returning a 4XX return code. These links damage the user experience for visitors and can impair the credibility of your site.
Page has no outgoing links
Priority: High (if indexable)
Again, this issue refers to both internal and external links and essentially means a page has no links from it at all. This means the page is a “dead end” for your site visitors and search engines. Bummer.
But in regards to external links specifically, if your page has no outgoing links, it affects all of the benefits of external links as discussed above.
Step 7. Site speed and performance
Site speed has become quite a hot topic among the SEO community in recent times, especially after Google announced that mobile speed is indeed a ranking factor.
Since May 2021, speed metrics known as Core Web Vitals (CWV) have been utilized by Google to rank pages. They use Largest Contentful Paint (LCP) to assess visual load, Cumulative Layout Shift (CLS) to test visual stability, and First Input Delay (FID) to measure interactivity.
Google’s goal is to improve user experience because, let’s face it, no one likes a slow website. In today’s society, the need for instant gratification encourages site visitors to leave before they finish what they intend to do.
Within the Ahrefs audit report, you can find information about site speed: Site Audit > Reports > Performance > Overview.
Recommendation
You can get detailed page speed data from Google PageSpeed Insights if you enable Core Web Vitals in the Crawl settings.
There are also a number of excellent speed testing tools available, including PageSpeed Insights from Google and my personal favorite, GTmetrix.
Speed optimization for sites that are very slow can be a complex process. However, for beginners, it is advisable to use one of the available tools such as WPRocket or NitroPack (both paid) to significantly improve site speed.
Step 8. Mobile-friendliness
In the world we now live in, more individuals than ever before are continuously utilizing mobile devices. For example, mobile shopping currently has 60% of the market, according to Datareportal’s 300-page study.
It is no wonder that over the last few years, Google has looked to switch to mobile-first indexing.
From a technical standpoint, it is good practice to run a second audit on your site using Ahrefs’ mobile crawler. As a standard, Ahrefs’ audit tool uses a desktop crawl to audit your site; however, this can easily be changed under “Crawl Settings” within your “Project Settings.”
Our comparison function will compare your mobile and desktop sites and inform you what has changed or if any “new” issues have arisen once you have crawled your site a second time, e.g., problems that exist only on mobile.
From here, you can select any of the “New,” “Added,” or “Removed” numbers to determine what has changed with respect to each problem.
In all honesty, this is just scratching the surface when it comes to performing a technical SEO audit. Each of the points above could easily fill an entire blog post, and there are additional, more advanced issues such as pagination, log file analysis, and advanced site architecture.
However, for someone looking to learn where to get started in order to successfully complete a technical SEO audit, this is a great place to begin.
Whenever you perform a technical SEO audit, you’ll always have tons to fix. The important thing is to get your priorities straight first. Luckily, Ahrefs’ Site Audit gives you a predefined priority rating for each issue.
One thing to keep in mind, though, is that regardless of the issue, its importance depends on the website or page you’re working on. For example, the main pages you want to rank will always take priority over pages you don’t want to index.
Jenny is an award-winning SEO consultant who specializes in using branded PR to maximize SEO results for clients by building E-A-T and has an extensive background in niche affiliate and technical SEO.
Economist Justin Wolfers recently asked: “Lemme ask one of those tone deaf economist questions that annoy almost everyone. Today, many families learned that the amount they owe on their mortgage has declined—in real terms—by 9.1% over the past year. Why do we hear so little about this? Why don’t we see folks celebrating?”
Some other economists agreed with him, at least in terms of how people think of economics. Many non-economists quickly came in to explain their thought processes—that the points, while technically correct, were out of context and touch.
Essentially, the critics made two points that are as accurate as the technicalities Wolfers and company related. First, people are squeezed from all quarters, not just housing. Second, the U.S. is becoming a country not of poverty but of entrenched poorness—that is, “poor” in the sense of “small in worth” or “less than adequate,” by the Merriam-Webster definition.
It is true that as inflation increases, the monetary value of a loan with terms that established lower interest rates decreases in favor of the borrower, at least while inflation is running hot. If the total remaining on the mortgage, including interest and principal, is $X, then over the last year it’s now 9.1% less expensive because the value of the dollars is falling. The mortgage likely has no inflation escalation rider.
Now, that mortgage only remains 9.1% less expensive if there is no deflation. You still get savings even if inflation drops to a lower rate, because the value of what a dollar can buy continues to fall, as it does pretty much every year anyway. This is one of the advantages of owning a home. The amount you owe drops in real terms because there is some degree of inflation in virtually every year; unless you have an adjustable-rate mortgage (a bad idea in the long run that might make sense in specific circumstances in the immediate future), you’re locked in at the level of cheaper dollars.
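To make that concrete, a quick sketch in Python (the $300,000 balance is an assumption, and the headline 9.1% figure treats the inflation rate itself as the real decline, a standard small-rate approximation):

balance = 300_000   # assumed remaining mortgage balance
inflation = 0.091   # year-over-year inflation

real_value = balance / (1 + inflation)   # the balance measured in last year's dollars
print(f"${real_value:,.0f}")             # ~$274,977
print(f"{1 - 1 / (1 + inflation):.1%}")  # 8.3% exact decline, ~9.1% as an approximation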
There’s nothing new with that and it’s how a lot of people build wealth over time. Then they, in theory, pass that property down to their children, who now have greater wealth that, in theory, can get passed down in turn, and so on. The growth of wealth becomes a multi-generational process. The longer you’re around, the greater an advantage you have.
There are two other ways you build value as a homeowner. One is, on the whole, there will be some appreciation in value over time. That comes without additional payments. The other is one of those “you get a benefit because you’re not doing something else that would cost more” kind of financial planning arguments. If you don’t own, you’re a renter and the amount you pay climbs each year. If you do own, then there’s an annual additional amount you don’t have to pay, which is a savings.
That doesn’t mean that homeowners don’t pay more every year because there’s more to owning a house than the payment. Taxes, utilities, maintenance and repair, upgrades, and so on see regularly rising costs. Still, this remains a case that things could be much worse, and you are ahead in some significant ways.
So, why aren’t people dancing in the street? The first reason the critics note is that housing, while a significant cost, isn’t the only place where people are getting hit. For many years, costs in important areas of living—healthcare, childcare, education, energy (both electricity and heating and cooling)—have risen significantly faster than incomes in real terms, driving up everyday expenses and leaving pay increases in the dusty plains of personal financial ledgers. Personal savings rates are dropping; credit card debt has again reached new heights.
One reason you don’t see conga lines in the street is because people are anxious about the economy and their position in it. Consumer sentiment is up a touch from June, as the newest University of Michigan polling shows, but that’s still down massively from a year earlier. If a patient is in bed with a serious illness and a doctor tells them that they don’t have an additional one, they might be glad to hear it and yet not be in a position to leap to their feet.
The second criticism is even stronger, in a social sense. If housing ownership is at about 65% in the country, should people clap for joy as they see a third of the country having to struggle much harder? When many who are not in a position to own homes are their children or nieces and nephews or kids of friends or younger people they work with? You can be thankful that you weren’t part of a massive traffic accident and yet reluctant to outwardly rejoice so as not to rub others’ noses in the dirt.
My credits include Fortune, the Wall Street Journal, the New York Times Magazine, Zenger News, NBC News, CBS Moneywatch, Technology Review, The Fiscal Times, and…