Saying no at work is hard, especially when you are early in your career or you are really passionate about what you do. Often there is a huge amount of guilt attached, questioning whether you are a team player or not wanting to let your manager down.
But learning when to say no is one of the most important skills to develop in the workplace. Not only does it protect you from being overworked and taken advantage of, but it also helps protect the passion and drive you have for your job. Too often, eager employees are cursed with saying yes to everything, leaving them exhausted, frustrated, and resentful of the job they once loved.
Other times, you may find yourself subject to poor management or unethical behavior if you are asked to complete a task that you know you shouldn’t be doing. Saying no sets a strong boundary with the asker and reinforces that their request is wrong.
Below are some scenarios where you should say no at work and how to do it.
The task interferes with your actual responsibilities
Before saying no to a task, it’s important to have a clear understanding of your actual responsibilities. Review your job description, talk to your manager about priorities, and ask for clarification if needed. Make sure you’re not simply hesitant to take on a new task because it’s unfamiliar or challenging.
How to say no: “I would love to help, but I don’t have the capacity at the moment.” This response acknowledges the request while also setting a boundary. It’s important to be honest about your workload and priorities, and to avoid overcommitting yourself. This response also shows that you’re willing to help in the future when you have more capacity.
Workplace communication is the process of exchanging information and ideas, both verbally and non-verbally between one person or group and another person or group within an organization. It includes e-mails, videoconferencing, text messages, notes, calls, etc. Effective communication is critical in getting the job done, as well as building a sense of trust and increasing productivity.
Workers may have different cultures and backgrounds, and may expect different ways of working and understanding how things should be done within an organization’s workplace culture. To strengthen employee cooperation and avoid missed deadlines or activity that could affect the company negatively, effective communication is crucial. Ineffective communication leads to communication gaps, which causes confusion, wastes time, and reduces productivity.
Managers and lower-level employees must be able to interact clearly and effectively with each other through verbal communication and non-verbal communication to achieve specific business goals. Effective communication with clients also plays a vital role in the development of an organization and the success of any business. When communicating, nonverbal communication must also be taken into consideration. How a person delivers a message has a large impact.
Another important aspect of effective workplace communication is taking into consideration the different backgrounds of employees. While diversity enriches the environment, it can also cause communication barriers. Difficulties arise when a coworker’s cultural background leads him or her to think differently than another. It is for this reason that knowing about intercultural communication at work and learning how to treat others without offending them can bring several benefits to the company.
Workplace communication can be more than the transmission of facts and direct expectations. This communication can be about the forming of relationships amongst the staff and stakeholders, i.e. those inside or outside the organization that are affected in some way by the organization (a simple example would be stockholders). The communication that builds relationships can form or be affected by organizational culture.
People like creating to-do lists. It makes them feel productive and useful, as each completed task gets a well-deserved check mark beside it. In theory, all the tasks on a daily list get completed. Then, you can sit back on your couch and congratulate yourself for finishing a hard day’s work.
But in practice, it can leave you feeling stressed and not very productive after all. Here’s the problem with creating lists: there’s no priority in sequence. For example, let’s take a look at a list of things I want to do today:
Write article
Have meeting with prospect
Talk with family and friends
Check email
Exercise
Research article ideas
Share posts on social media
Vacuum
Catch up on Better Call Saul
Phew, that’s a lot of ground to cover in a day. In a to-do list, our goal is to get rid of the pending tasks as soon as possible. The fastest way is to do the easiest tasks first, such as checking email, sharing posts on social media, and watching a TV episode while doing the first two.
Except, by doing this you would never get around to the important tasks. You would feel exhausted mentally halfway through and probably run out of time. Instead, I suggest sorting out the tasks in order of importance. One useful tool for doing so is the Eisenhower Matrix.
The principle behind the Eisenhower Matrix is that we should separate tasks that are important from those that are urgent. So what’s the difference, then? Urgent tasks are those that need to be dealt with immediately. We react to a situation and must resolve a problem right away.
Important tasks, on the other hand, are crucial to a long-term goal. They may or may not need to be handled right away, but if we want to improve in an area, we should be focusing on what’s important.
Unfortunately, we often end up working on tasks that fall more into the urgent category than the important category. The Eisenhower Matrix prevents this from happening by keeping our priorities focused.
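The sorting rule behind the matrix is simple enough to sketch in a few lines of Python (the task labels and the urgent/important assignments below are just illustrative):

```python
# Eisenhower Matrix sketch: each task is flagged urgent and/or important,
# then mapped to one of the four quadrants.
def quadrant(urgent, important):
    if important and urgent:
        return "Do first"            # top left: handle it today
    if important and not urgent:
        return "Schedule"            # top right: put it on the calendar
    if urgent and not important:
        return "Delegate/automate"   # bottom left: hand it off
    return "Eliminate"               # bottom right: drop it

tasks = {
    "Write article": (False, True),            # (urgent, important)
    "Check email": (True, False),
    "Catch up on Better Call Saul": (False, False),
}

for name, (urgent, important) in tasks.items():
    print(f"{name}: {quadrant(urgent, important)}")
```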
Check out the tasks above mapped out on the matrix:
Ideally, you should just have one or two activities in the top left quadrant that you absolutely must do during the day. For the activities in the top right quadrant, it’s important to schedule them so that you don’t keep putting these things off.
In my case, I choose to exercise in the afternoon and catch up with family and friends in the evenings. Meetings are put in my calendar, and research is scheduled sometime in the next few days.
My email system is divided up into different categories, so my emails are sorted automatically. Sharing on social media can be automated using a number of apps, such as Buffer. I love using this method to decide what I need to do for the day. It gets me super-focused on what’s important and eliminates those unnecessary tasks that eat up a lot of time without giving much benefit in return.
For someone performing their first technical SEO audit, the results can be both overwhelming and intimidating. Often, you can’t see the wood for the trees and have no idea how to fix things or where to even begin.
After years of working with clients, especially as the head of tech SEO for a U.K. agency, I’ve found technical SEO audits to be a near-daily occurrence. With that, I know how important it is, especially for newer SEOs, to understand what each issue is and why it is important.
Understanding issues found within a technical audit allows you to analyze a site fully and come up with a comprehensive strategy.
In this guide, I am going to walk you through a step-by-step process for a successful tech audit but also explain what each issue is and, perhaps more importantly, where it should lie on your priority list.
Whether it’s to make improvements on your own site or recommendations for your first client, this guide will help you to complete a technical SEO audit successfully and confidently in eight steps.
But first, let’s clarify some basics.
What is a technical SEO audit?
Technical SEO is the core foundation of any website, and a technical SEO audit is an essential part of site maintenance that analyzes the technical aspects of your site.

An audit checks whether a site is properly optimized for search engines such as Google, Bing, and Yahoo.
This includes ensuring there are no issues related to crawlability and indexation that prevent search engines from allowing your site to appear on the search engine results pages (SERPs).
An audit involves analyzing all elements of your site to make sure that you have not missed out on anything that could be hindering the optimization process. In many cases, some minor changes can improve your ranking significantly.
Also, an audit can highlight technical problems your website has that you may not be aware of, such as hreflang errors, canonical issues, or mixed content problems.
Generally speaking, I always like to do an initial audit on a new site—whether that is one I just built or one I am seeing for the first time from a client—and then on a quarterly basis.
I think it is advisable to get into good habits with regular audits as part of ongoing site maintenance. This is especially true if you are working with a site that is continuously publishing new content.
It is also a good idea to perform an SEO audit when you notice that your rankings are stagnant or declining.
What do you need from a client before completing a technical audit?
Even if a client comes to me with goals that are not necessarily “tech SEO focused,” such as link building or creating content, it is important to remember that any technical issue can impede the success of the work we do going forward.
It is always important to assess the technical aspects of the site, offer advice on how to make improvements, and explain how those technical issues may impact the work we intend to do together.
With that said, if you intend on performing a technical audit on a website that is not your own, at a minimum, you will need access to the Google Search Console and Google Analytics accounts for that site.
How to perform a technical SEO audit in eight steps
For the most part, technical SEO audits are not easy. Unless you have a very small, simple business site that was perfectly built by an expert SEO, you’re likely going to run into some technical issues along the way.
Often, especially with more complex sites, such as those with a large number of pages or those in multiple languages, audits can be like an ever-evolving puzzle that can take days or even weeks to crack.
Regardless of whether you are looking to audit your own small site or a large one for a new client, I’m going to walk you through the eight steps that will help you to identify and fix some of the most common technical issues.
Step 1. Crawl your website
All you need to get started here is to set up a project in Ahrefs’ Site Audit, which you can even access for free as part of Ahrefs Webmaster Tools.
This tool scans your website to check how many URLs there are, how many are indexable, how many are not, and how many have issues.
From this, the audit tool creates an in-depth report on everything it finds to help you identify and fix any issues that are hindering your site’s performance.
Of course, more advanced issues may need further investigation that involves other tools, such as Google Search Console. But our audit tool does a great job at highlighting key issues, especially for beginner SEOs.
First, to run an audit with Site Audit, you will need to ensure your website is connected to your Ahrefs account as a project. The easiest way to do this is via Google Search Console, although you can verify your ownership by adding a DNS record or HTML file.
Once your ownership is verified, it is a good idea to check the Site Audit settings before running your first crawl. If you have a bigger site, it is always best to increase the crawl speed before you start.
There are a number of standard settings in place. For a small, personal site, these settings may be fine as they are. However, settings like the maximum number of pages crawled under “Limits” is something you may want to alter for bigger projects.
Also, if you are looking for in-depth insight on Core Web Vitals (CWV), you may want to add your Google API key here too.
Once happy with the settings, you can run a new crawl under the “Site Audit” tab.
Initially, after running the audit, you will be directed to the “Overview” page. This will give you a top-level view of what the tool has found, including the number of indexable vs. non-indexable pages, top issues, and an overall website health score out of 100.
This gives you a quick and easy-to-understand proxy metric for overall website health.
From here, you can head over to the “All issues” tab. This breaks down all of the problems the crawler has found, how much of a priority they are to be fixed, and how to fix them.
This report, alongside other tools, can help you to start identifying the issues that may be hindering your performance on the SERPs.
Step 2. Spotting crawlability and indexation issues
If your site has pages that can’t be crawled by search engines, your website may not be indexed correctly, if at all. If your website does not appear in the index, it cannot be found by users.
Ensuring that search bots can crawl your website and collect data from it correctly means search engines can accurately place your site on the SERPs and you can rank for those all-important keywords.
There are a few things you need to consider when looking for crawlability issues:
Indexation errors
Robots.txt errors
Sitemap issues
Optimizing the crawl budget
Identifying indexation issues
Priority: High
Ensuring your pages are indexed is imperative if you want to appear anywhere on Google.
The simplest way to check how your site is indexed is by heading to Google Search Console and checking the Coverage report. Here, you can see exactly which pages are indexed, which pages have warnings, as well as which ones are excluded and why:
Note that pages will only appear in the search results if they are indexed without any issues.
If your pages are not being indexed, there are a number of issues that may be causing this. We will take a look at the top few below, but you can also check our other guide for a more in-depth walkthrough.
Checking the robots.txt file
Priority: High
The robots.txt file is arguably the most straightforward file on your website, yet it is something that people consistently get wrong. Although it lets you advise search engines on how to crawl your site, it is easy to make costly errors in it.
Most search engines, especially Google, like to abide by the rules you set out in the robots.txt file. So if you tell a search engine not to crawl and/or index certain URLs or even your entire site by accident, that’s what will happen.
This is what the robots.txt file, which tells search engines not to crawl any pages, looks like:
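For reference, a robots.txt file that blocks all crawlers from the entire site is just two lines:

```
User-agent: *
Disallow: /
```

Scoping the Disallow rule to specific paths (or removing it) lets search engines crawl the rest of the site again.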
Often, these instructions are left within the file even after the site goes live, preventing the site from being crawled. When that happens, removing the leftover directive is one of those rare, easy fixes with an outsized impact on your SEO.
You can also check whether a single page is accessible and indexed by typing the URL into the Google Search Console search bar. If it’s not indexed yet and it’s accessible, you can “Request Indexing.”
The Coverage report in Google Search Console can also let you know if you’re blocking certain pages in robots.txt despite them being indexed:
Checking robots meta tags

A robots meta tag is an HTML snippet that tells search engines how to crawl or index a certain page. It’s placed into the <head> section of a webpage and looks like this:
<meta name="robots" content="noindex" />
This noindex is the most common one. And as you’ve guessed, it tells search engines not to index the page. We also often see the following robots meta tag on pages across whole websites:
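A common example, which sets the snippet and preview directives to their most permissive values, looks like this:

```html
<meta name="robots" content="max-snippet:-1, max-image-preview:large, max-video-preview:-1" />
```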
This tells Google to use any of your content freely on its SERPs. The Yoast SEO plugin for WordPress adds this by default unless you add noindex or nosnippet directives.
If there are no robots meta tags on the page, search engines consider that as index, follow, meaning that they can index the page and crawl all links on it.
But noindex actually has a lot of uses:
Thin pages with little or no value for the user
Pages in the staging environment
Admin and thank-you pages
Internal search results
PPC landing pages
Pages about upcoming promotions, contests, or product launches
Duplicate content (use canonical tags to suggest the best version for indexing)
But improper use also happens to be a top indexability issue. Using the wrong attribute accidentally can have a detrimental effect on your presence on the SERPs, so remember to use it with care.
Checking the XML sitemap

An XML sitemap helps Google to navigate all of the important pages on your website. Considering crawlers can’t stop and ask for directions, a sitemap ensures Google has a set of instructions when it comes to crawling and indexing your website.
But much like crawlers can be accidentally blocked via the robots.txt file, pages can be left out of the sitemap, meaning they likely won’t get prioritized for crawling.
Also, by having pages in your sitemap that shouldn’t be there, such as broken pages, you can confuse crawlers and affect your crawl budget (more on that next).
You can check sitemap issues in Site Audit: Site Audit > All issues > Other.
The main thing here is to ensure that all of the important pages that you want to have indexed are within your sitemap and avoid including anything else.
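For reference, a minimal XML sitemap listing a single page looks like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/important-page/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>
```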
Optimizing the crawl budget

A crawl budget refers to how many pages a search engine will crawl, and how quickly.
A variety of things influence the crawl budget. These include the number of resources on the website, as well as how valuable Google deems your indexable pages to be.
Having a big crawl budget does not guarantee that you will rank at the top of the SERPs. But if all of your critical pages are not crawled due to crawl budget concerns, it is possible that those pages may not be indexed.
Your pages are likely being scanned as part of your daily crawl budget if they are popular, receive organic traffic and links, and are well-linked internally across your site.
New pages—as well as those that are not linked internally or externally, e.g., those found on newer sites—may not be crawled as frequently, if at all.
For larger sites with millions of pages or sites that are often updated, crawl budget can be an issue. In general, if you have a large number of pages that aren’t being crawled or updated as frequently as you want, you should think about looking to speed up crawling.
Using the Crawl Stats report in Google Search Console can give you insight into how your site is being crawled and any issues that may have been flagged by the Googlebot.
Step 3. Check your on-page fundamentals

It is important to check your on-page fundamentals. Although many SEOs may tell you that on-page issues like those with meta descriptions aren’t a big deal, I personally think addressing them is part of good SEO housekeeping.
Even Google’s John Mueller previously stated that having multiple H1 tags on a webpage isn’t an issue. However, let’s think about SEO as a points system.
If you and a competitor have sites that stand shoulder to shoulder on the SERP, then even the most basic of issues could be the catalyst that determines who ranks at the top. So in my opinion, even the most basic of housekeeping issues should be addressed.
So let’s take a look at the following:
Page titles and title tags
Meta descriptions
Canonical tags
Hreflang tags
Structured data
Page titles and title tags
Priority: Medium
Title tags have a lot more value than most people give them credit for. Their job is to let Google and site visitors know what a webpage is about—like this:
Here’s what it looks like in raw HTML format:
<title>How to Craft the Perfect SEO Title Tag (Our 4-Step Process)</title>
In recent years, title tags have sparked a lot of debate in the SEO world. Google, it turns out, is likely to modify your title tag if it doesn’t like it.
One of the biggest reasons Google rewrites title tags is that they are simply too long. This is one issue that is highlighted within Site Audit.
In general, it is good practice to ensure all of your pages have title tags, none of which are longer than 60 characters.
Meta descriptions

A meta description is an HTML attribute that describes the contents of a page. It may be displayed as a snippet under the title tag in the search results to give further context.

More visitors will click on your website in the search results if it has a captivating meta description. Even though Google only displays meta descriptions around 37% of the time, it is still important to ensure your most important pages have great ones.
Within Site Audit, you can find out if any meta descriptions are missing, as well as whether they are too long or too short.
But writing meta descriptions is more than just filling a space. It’s about enticing potential site visitors.
Canonical tags

A canonical tag (rel="canonical") specifies the primary version of duplicate or near-duplicate pages. To put it another way, if you have roughly the same content available under several URLs, you should be using canonical tags to designate which version is the primary one that should be indexed.
Canonical tags are an important part of SEO, mainly because Google doesn’t like duplicate content. Also, using canonical tags incorrectly (or not at all) can seriously affect your crawl budget.
If spiders are wasting their time crawling duplicate pages, it can mean that valuable pages are being missed.
You can find duplicate content issues in Site Audit: Site Audit > Reports > Duplicates > Issues.
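The tag itself is a single line in the <head> of the duplicate page, pointing at the version you want indexed (the URL here is a placeholder):

```html
<link rel="canonical" href="https://example.com/primary-page/" />
```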
Hreflang tags

Although hreflang is seemingly yet another simple HTML tag, it is possibly the most complex SEO element to get your head around.
The hreflang tag is imperative for sites in multiple languages. If you have many versions of the same page in a different language or target different parts of the world—for example, one version in English for the U.S. and one version in French for France—you need hreflang tags.
Translating a website is time-consuming and costly, and you’ll need to put in extra effort to ensure all versions show up in the relevant search results. But it gives a better user experience by catering to users who consume content in different languages.
Plus, as clusters of multiple-language pages share each other’s ranking signals, using hreflang tags correctly can have a direct impact on rankings. This is alluded to by Google’s Gary Illyes in this video.
You can find hreflang tag issues in Site Audit under localization: Site Audit > All issues > Localization.
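As an illustration, a page with a U.S. English version and a French version for France would carry a set of reciprocal tags like this (all URLs are placeholders):

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr-fr/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```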
Structured data

Structured data, often referred to as schema markup, has a number of valuable uses in SEO.
Most prominently, structured data is used to help get rich results or features in the Knowledge Panel. Here’s a great example: When working with recipes, more details are given about each result, such as the rating.
You also get a feature in the Knowledge Panel that shows what a chocolate chip cookie is (along with some nutritional information):
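Structured data is typically added as a JSON-LD snippet in the page’s <head>. A minimal, illustrative example for a recipe page might look like this (the values are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Chocolate Chip Cookies",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "ratingCount": "312"
  }
}
</script>
```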
Step 4. Image optimization

Image optimization is often overlooked when it comes to SEO. However, it has a number of benefits, including:
Improved load speed
More traffic from Google Images
A more engaging user experience
Improved accessibility
Image issues can be found in the main audit report: Site Audit > Reports > Images.
Broken images
Priority: High
Broken images cannot be displayed on your website. This makes for a bad user experience in general, but it can also look spammy, giving visitors the impression that the site is not well maintained or professional.
This can be especially problematic for anyone who monetizes their website, as it can make the website seem less trustworthy.
Image file size too large
Priority: High
Large images on your website can seriously impact your site speed and performance. Ideally, you want to display images in the smallest possible size and in an appropriate format, such as WebP.
The best option is to optimize the image file size before uploading the image to your website. Tools like TinyJPG can optimize your images before they’re added to your site.
If you are looking to optimize existing images, there are tools available, especially for more popular content management systems (CMSs) like WordPress. Plugins such as Imagify or WP-Optimize are great examples.
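If you want to serve WebP while keeping a fallback for older browsers, the standard <picture> element handles this (file names are placeholders):

```html
<picture>
  <source srcset="/images/hero.webp" type="image/webp">
  <img src="/images/hero.jpg" alt="Description of the image" width="800" height="450">
</picture>
```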
HTTPS page links to HTTP image
Priority: Medium
HTTPS pages that link to HTTP images cause what is called “mixed content issues.” This means that a page is loaded securely via HTTPS. But a resource it links to, such as an image or video, is on an insecure HTTP connection.
Mixed content is a security issue. For those who monetize sites with display ads, it can even prevent ad providers from allowing ads on your site. It also degrades the user experience of your website.
By default, certain browsers block insecure resource requests. If your page relies on these insecure resources, it may not function correctly when they are blocked.
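One common stopgap while you update the underlying URLs is the upgrade-insecure-requests directive, which asks the browser to fetch HTTP resources over HTTPS instead:

```html
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
```

Note that this only helps if the resources are actually available over HTTPS; the proper fix is still to update the links themselves.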
Missing alt text
Priority: Low
Alt text, or alternative text, describes an image on a website. It is an incredibly important part of image optimization, as it improves accessibility on your website for millions of people throughout the world who are visually impaired.
Often, those with a visual impairment use screen readers, which convert images into audio. Essentially, this is describing the image to the site visitor. Properly optimized alt text allows screen readers to inform site users with visual impairments exactly what they are seeing.
Alt text can also serve as anchor text for image links, help you to rank on Google Images, and improve topical relevance.
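Alt text is set via the alt attribute on the image tag; the description here is just an example:

```html
<img src="/images/golden-retriever.jpg" alt="A golden retriever catching a frisbee in a park">
```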
Step 5. Internal links

When most people think of “links” for SEO, they think about backlinks—how to build them, how many you should have, and so on.
What many people don’t realize is the sheer importance of internal linking. In fact, internal links are like the jelly to backlinks’ peanut butter. Can you have one without the other? Sure. Are they always better together? You bet!
Not only do internal links help your external link building efforts, but they also make for a better website experience for both search engines and users.
The proper siloing of topics using internal linking creates an easy-to-understand topical roadmap for everyone who comes across your site. This has a number of benefits:
Creates relevancy for keywords
Helps ensure all content is crawled
Makes it easy for visitors to find relevant content or products
Of course, when done right, all of this makes sense. But internal links should be audited when you first get your hands on a site because things may not be as orderly as you’d want.
4xx status codes
Priority: High
Go to Site Audit > Internal pages > Issues tab > 4XX page.
Here, you can see all of your site’s broken internal pages.
These are problematic because they waste “link equity” and provide users with a negative experience.
Here are a few options for dealing with these issues:
Bring back the broken page at the same address (if deleted by accident)
Redirect the broken page to a more appropriate location; all internal links referring to it should be updated or removed
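On an Apache server, for example, a single mod_alias rule in .htaccess can 301-redirect a broken URL to its new home (the paths are placeholders); nginx and most CMSs offer equivalents:

```
Redirect 301 /old-broken-page/ /new-location/
```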
Orphan pages
Priority: High
Go to Site Audit > Links > Issues tab > Orphan page (has no incoming internal links).
Here, we highlight pages that have zero internal links pointing to them.
There are two reasons why indexable pages should not be orphaned:
Internal links will not pass PageRank because there are none.
They won’t be found by Google unless you submit your sitemap through Google Search Console or they have backlinks from crawled pages on other websites.
If your website has multiple orphaned pages, filter the list from high to low by organic traffic. Adding internal links to orphaned pages that still receive organic traffic will likely help them gain far more traffic.
Step 6. External links

External links are hyperlinks within your pages that point to another domain. That means all of your backlinks—the links to your website from other sites—are someone else’s external links.
See how the magic of the internet is invisibly woven together? *mind-blown emoji*
External links are often used to back up sources in the form of citations. For example, if I am writing a blog post and discussing metrics from a study, I’ll externally link to where I found that authoritative source.
Linking to credible sources makes your own website more credible to both visitors and search engines. This is because you show that your information is backed up with sound research.
Linking to other websites is a great way to provide value to your users. Oftentimes, links help users to find out more, check out your sources, and better understand how your content is relevant to the questions they have.
As always, just like anything else, external links can cause issues. These can be found in the audit report (similar to internal links): Site Audit > All issues > Links.
As you can see from the image above, links are broken down into indexable and non-indexable, and you can find the same issues across both categories. However, each issue has a different predetermined importance level, depending on whether the link is indexable or not.
Page has links to broken page
Priority: High (if indexable)
This issue can refer to both internal and external links and simply means that the URLs linked to are returning a 4XX return code. These links damage the user experience for visitors and can impair the credibility of your site.
Page has no outgoing links
Priority: High (if indexable)
Again, this issue refers to both internal and external links and essentially means a page has no links from it at all. This means the page is a “dead end” for your site visitors and search engines. Bummer.
But in regards to external links specifically, if your page has no outgoing links, it affects all of the benefits of external links as discussed above.
Step 7. Site speed and performance
Site speed has become quite a hot topic among the SEO community in recent times, especially after Google announced that mobile speed is indeed a ranking factor.
Since May 2021, speed metrics known as Core Web Vitals (CWV) have been utilized by Google to rank pages. They use Largest Contentful Paint (LCP) to assess visual load, Cumulative Layout Shift (CLS) to test visual stability, and First Input Delay (FID) to measure interactivity.
Google’s goal is to improve user experience because, let’s face it, no one likes a slow website. In today’s society, the need for instant gratification encourages site visitors to leave before they finish what they intend to do.
Within the Ahrefs audit report, you can find information about site speed: Site Audit > Reports > Performance > Overview.
Recommendation
You can get detailed page speed data from Google PageSpeed Insights if you enable Core Web Vitals in the Crawl settings.
There are also a number of excellent speed testing tools available, including PageSpeed Insights from Google and my personal favorite, GTmetrix.
Speed optimization for very slow sites can be a complex process. For beginners, however, it is advisable to use a tool such as WP Rocket or NitroPack (both paid) to significantly improve site speed.
Step 8. Mobile-friendliness
In the world we live in today, more people than ever are using mobile devices. For example, mobile shopping currently accounts for 60% of the market, according to Datareportal’s 300-page study.
It is no wonder that over the last few years, Google has looked to switch to mobile-first indexing.
From a technical standpoint, it is good practice to run a second audit on your site using Ahrefs’ mobile crawler. As a standard, Ahrefs’ audit tool uses a desktop crawl to audit your site; however, this can easily be changed under “Crawl Settings” within your “Project Settings.”
Once you have crawled your site a second time, Ahrefs’ comparison function will compare your mobile and desktop crawls and tell you what has changed or whether any “new” issues have arisen, e.g., problems that exist only on mobile.
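Under the hood, this kind of comparison is just a set difference between the issues found in each crawl. A minimal sketch (the issue names below are hypothetical labels, not Ahrefs' actual issue identifiers):

```python
def diff_issues(first_crawl, second_crawl):
    """Compare issue sets from two crawls (e.g., desktop vs. mobile).

    Returns the issues that appeared in the second crawl and the issues
    that disappeared, mirroring the "Added"/"Removed" idea in a
    crawl-comparison report.
    """
    first, second = set(first_crawl), set(second_crawl)
    return {
        "added": sorted(second - first),    # only in the second crawl
        "removed": sorted(first - second),  # resolved since the first crawl
    }

# Hypothetical issue lists exported from two crawls:
desktop_issues = ["broken-link", "missing-title"]
mobile_issues = ["broken-link", "viewport-not-set"]
print(diff_issues(desktop_issues, mobile_issues))
# {'added': ['viewport-not-set'], 'removed': ['missing-title']}
```

An issue that shows up only in the `added` list of the mobile crawl is exactly the "exists only on mobile" case described above.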
From here, you can select any of the “New,” “Added,” or “Removed” numbers to determine what has changed with respect to each problem.
In all honesty, this is just scratching the surface when it comes to performing a technical SEO audit. Each of the points above could easily fill an entire blog post of its own, and there are additional, more advanced topics like pagination, log file analysis, and advanced site architecture.
However, for someone looking to get started with a technical SEO audit, this is a great place to begin.
Whenever you perform a technical SEO audit, you’ll always have tons to fix. The important thing is to get your priorities straight first. Luckily, Ahrefs’ Site Audit gives you a predefined priority rating for each issue.
One thing to keep in mind, though, is that regardless of the issue, its importance depends on the website or page you’re working on. For example, the main pages you want to rank will always take priority over pages you don’t want to index.
Jenny is an award-winning SEO consultant who specializes in using branded PR to maximize SEO results for clients by building E-A-T and has an extensive background in niche affiliate and technical SEO.
Shoppers browsing at Houston’s Galleria mall on Black Friday. Photo: Brandon Bell/Getty Images
Marketers are staging sweepstakes, quizzes and events to gather people’s personal information and build detailed profiles. New privacy protections put in place by tech giants and governments are threatening the flow of user data that companies rely on to target consumers with online ads.
As a result, companies are taking matters into their own hands. Across nearly every sector, from brewers to fast-food chains to makers of consumer products, marketers are rushing to collect their own information on consumers, seeking to build millions of detailed customer profiles.
Gathering such data has long been a priority, but there is newfound urgency. Until now, most advertisers have depended heavily on data from business partners, including tech giants and ad-technology firms, to determine how to focus their ads. But all of the traditional tactics are under assault.
Apple Inc. rolled out a change on its devices this year that restricts how users can be tracked. Google is planning a similar push for its popular Chrome browser. New privacy laws in California and Europe are adding to the squeeze on data.
So brands are deploying an array of tactics to persuade users to surrender data to the brand itself—loyalty programs, sweepstakes, newsletters, quizzes, polls and QR codes, those pixelated black-and-white squares that have become ubiquitous during the pandemic.
Avocados From Mexico, a nonprofit marketing organization that represents avocado growers and packers, is encouraging people to submit grocery receipts to earn points exchangeable for avocado-themed sportswear. It is also conducting a contest for the chance to win a truck. To enter, consumers scan QR codes on in-store displays and enter their name, birthday, email and phone number.
“We have a limited window to figure this out, and everybody’s scrambling” to do so, said Ivonne Kinser, vice president of marketing for the avocado group. It has managed to capture roughly 50 million device IDs—the numbers associated with mobile devices—and is working to link them to names and email addresses. The group plans to use the customer information for ad targeting and to make its ads more relevant to its customers.
Building detailed profiles of customers can be costly, since it requires sophisticated software and data science expertise. “We can do a little bit at a time, but it will take years,” Ms. Kinser said. Consumer packaged-goods companies, in particular, will likely struggle to get meaningful quantities of data, since many don’t sell directly to their customers.
No matter how successful brands are in these efforts, they will have a minuscule amount of user data compared with giants like Facebook, Google and Amazon.com Inc. Marketers will still spend huge sums to advertise on those platforms for the foreseeable future. But by having their own robust databases, companies could make their online ad campaigns less costly and more effective.
Miller High Life ran an online contest this summer to give away a branded patio set. The lucky winner got a bar, stools and neon signs. The company’s prize was the personal details of almost 40,000 people who signed up, including emails, birthdays and phone numbers. The reason it asks for birthdays is to validate ages, since it’s an alcohol brand.
Molson Coors Beverage Co., Miller’s parent company, said as more people opt out of being tracked by apps, having more customer data can help keep its ad costs from rising when it buys digital ads across social media channels and from online publishers using automated ad-buying systems.
Molson has conducted more than 300 data-collection efforts this year, including sweepstakes and contests at bars around the country. Many customers signing up in the contests agree to let the brewer store their information and use it for marketing purposes.
“You could think it’s a bad thing, like, we’re trying to access people’s information, but people actually have no problem sharing that information because they’re getting a benefit out of it as well,” said Sofia Colucci, global vice president of marketing for the Miller family of brands.
The Milwaukee-based brewer currently has more than a million customer profiles and says it is hoping to increase that to at least 13 million by 2025. Apple’s new privacy policy, introduced in April, requires apps to ask users if they want to be tracked. According to Flurry, a mobile-app analytics provider, U.S. users opt into tracking only about 18% of the times they encounter the Apple privacy prompt.
The upshot is that major apps, including Facebook, will have less data over time to help brands target ads on their platforms. Apple declined to comment. Reaching desirable audiences on Facebook is already getting more expensive for e-commerce brands. The company, whose parent is now known as Meta Platforms Inc., said Apple’s change hurt its sales growth in the most recent quarter. Meta said it is working on technology to mitigate the issues.
Buying and targeting online ads has long been helped by cookies, tiny files stored in a browser that carry information about a person’s online behavior. Google, a unit of Alphabet Inc., has said that by late 2023 it plans to pull the plug on third-party cookies within Chrome, in the interest of user privacy.
Google recently tested a new form of ad targeting that would let marketers direct their ads at large cohorts, such as people interested in travel. In some cases, Google will let marketers use their own customer data to target individuals on Google properties such as YouTube—another move that makes it important for companies to collect their own data.
Developing strong relationships with customers, always critical for marketers, “becomes even more vital in a privacy-first world,” David Temkin, Google’s director of product management for ads privacy and trust, said in a written statement.
California’s Consumer Privacy Act and Europe’s General Data Protection Regulation have both made it more difficult for ad-tech firms and data brokers to collect information that brands can use, helping put the onus on companies to gather data themselves.
Companies aren’t after just a few personal details. Many aim to log most of the interactions they have with customers, to flesh out what is called a “golden record.” Such a high-quality customer record might include dozens, even hundreds, of data points, including the store locations people visit, the items they typically buy, how much they spend and what they do on the company’s website.
This kind of information doesn’t just help with online-ad targeting but also lets brands personalize other parts of their marketing, from the offers they send people to which products are displayed to customers online.
PepsiCo Inc., which began to get more serious about data collection several years ago, already has roughly 75 million customer records and is looking to double that in two years. The data pile has helped the snack and beverage giant save tens of millions of dollars, said Shyam Venugopal, senior vice president of global media and commercial capabilities.
Buying ads on platforms such as Facebook and Snap Inc. is more expensive if marketers use those companies’ data, several marketing executives said. In North America, most of PepsiCo’s online ad targeting now uses its own customer data, so the costs are lower, according to Mr. Venugopal. Its campaigns are also more effective at reaching the right audiences, he said.
Partly to expand its cache of data, PepsiCo has launched an online store for its Mountain Dew Game Fuel brand aimed at gamers. About 35,000 people registered in the first six months and provided some personal information, Mr. Venugopal said.
Companies in retail, travel and hospitality are well positioned to harvest data because they deal directly with consumers. Many such companies have long invested in loyalty programs that offer perks such as fare discounts or hotel-room upgrades, and have already built customer databases for personalizing marketing.
Dining chain Chili’s Grill & Bar has about nine million active loyalty members, and its records contain about 50 different bits of information, including how many times a person ordered certain foods such as burgers, fajitas, ribs or a kids meal, the company said. Chili’s also has some emails, phone numbers and purchase history for 50 million customers who aren’t active loyalty members, which it can use for ad targeting.
In an example of how the data help to tailor messages, ads sent to someone who frequently orders appetizers might say, “Come in for a free app,” said Michael Breed, senior vice president of marketing at Chili’s, which is owned by Brinker International Inc. He credits the chain’s stash of customer data for helping avoid major fallout from the policy change Apple made.
Some retailers that saw a surge in online sales early in the pandemic supercharged their data collection. “It allowed companies in a very natural way to know a lot more about you,” said Chris Chapo, former vice president of advanced analytics for Amperity, a marketing technology firm.
In 2020, Dick’s Sporting Goods Inc. added 8.5 million new loyalty-program members, or athletes, as it calls them. The company has more than 20 million loyalty members.
Dick’s loyalty-member profiles can include up to 325 data points and customer traits. These include the purchases members make, whether they have children, what draws their attention on the website, how much they have spent with Dick’s over 12 months and what is their “lifetime value”—an estimate of how much they will eventually spend with the company.
Molson began ratcheting up its efforts in reaction to the European privacy laws. A pivotal moment came in 2019, when Brad Feinberg, vice president of media and consumer engagement for North America, paid a visit to a bar in Madison, Wis., where a field marketing manager was hosting a contest. Patrons put their names in a fish bowl for the chance to win two tickets to a football game.
Mr. Feinberg asked the marketing manager what he did with the bowl of names after the contest. “I throw them in the garbage,” the manager replied, according to Mr. Feinberg.
He realized how much data Molson was failing to capture, given hundreds of such events it held weekly. He eventually persuaded the company to invest in data collection and set data goals for each of its 80 brands. Molson said its customer-records collection has helped it save more than $300,000 this year on data fees when buying online ads.