Organic search remained an important source of traffic for websites in 2020. According to BrightEdge, 68% of online experiences begin with a search engine, and 53.3% of all website traffic comes from organic search. Yet many companies lose out on this valuable traffic because their websites aren’t properly optimized for search.
In the past year of conducting Visibility Audits for our SEO clients, we’ve uncovered a pattern of SEO mistakes that hurt websites’ digital visibility, whether the site belongs to a large ecommerce company or a small, locally owned business. We’ve assembled these findings into our 2020 Visibility Roadblocks report, which is now available for download.
Unoptimized on-page SEO
The title tag is an important ranking factor that appears as the blue link in search results. It tells searchers (and search engines) what your page is about, and it should include your target keyword as early in the tag as possible. A title tag is different from the <h1> tag, though the two are commonly confused: the <h1> is the main header on the actual page of content.
SEO best practices dictate that title tags be 30-60 characters long and unique to the page.
It is also recommended that the title tag and <h1> tag be different from each other. This gives you the opportunity to incorporate long-tail keywords or keyword variations in your on-page elements. Yet websites are commonly plagued by problems with title tag length, duplicate title tags, and matching on-page elements.
When we uncover issues with on-page SEO, we export the errors using Screaming Frog and optimize the elements so they follow SEO best practices. Title tags are rewritten so they are unique, include the page’s target keyword, and fall within the recommended character length. Header tags are also adjusted so they properly structure the content and aren’t exactly the same as the title tag.
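For illustration, here’s a minimal Python sketch of the kind of check a crawler like Screaming Frog runs at scale. It assumes the requests and beautifulsoup4 libraries, and the URLs are placeholders for the pages you want to audit:

```python
# A minimal sketch of an on-page title/<h1> audit; the URLs below are
# placeholders for the pages you actually want to check.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

titles_seen = {}
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = (soup.title.string or "").strip() if soup.title else ""
    h1_tag = soup.find("h1")
    h1 = h1_tag.get_text(strip=True) if h1_tag else ""

    # Flag title tags outside the recommended 30-60 character range.
    if not 30 <= len(title) <= 60:
        print(f"{url}: title tag is {len(title)} characters")

    # Flag pages where the title tag and <h1> match exactly.
    if title and title == h1:
        print(f"{url}: title tag matches the <h1>")

    # Flag title tags duplicated across pages.
    if title in titles_seen:
        print(f"{url}: duplicate title tag (also on {titles_seen[title]})")
    else:
        titles_seen[title] = url
```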
Once these obvious issues are fixed, we look at the website as a whole to see whether other on-page elements can be improved. Reviewing the website’s current keyword rankings and keyword opportunities helps us find places to better optimize title tags, meta descriptions, and header tags with target keywords, keyword variations, and more compelling calls to action. This work is relatively low effort but can have a significant impact on keyword rankings and organic visibility.
Need to improve your on-page SEO? Find our content optimization checklist here.
Slow page speed
Google confirmed page speed as a ranking factor in 2018, and it will only grow in importance. Google has stated that page experience signals will roll out in May 2021 and will “measure how users perceive the experience of interacting with a web page.” This spells trouble for the many websites struggling with slow page speed. Pages that load too slowly frustrate searchers and cause them to leave. In fact, even a one-second delay results in 7% fewer conversions.
Of the websites we evaluated in 2020, 90% had slow page speed, making it a top-priority fix for marketers. You can use a free SEO audit tool like Google PageSpeed Insights or GTmetrix to evaluate your website’s page speed and get recommendations on how to improve it. At Pure Visibility, we work with developers to prioritize page speed improvements and suggest how to implement them.
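If you’d rather script the check, the PageSpeed Insights v5 API exposes the same Lighthouse data the web tool shows. Here’s a minimal Python sketch; the page URL is a placeholder, and Google recommends adding an API key for anything beyond occasional use:

```python
# A minimal sketch querying the PageSpeed Insights v5 API; pass a "key"
# parameter with an API key for regular use.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}
data = requests.get(API, params=params, timeout=60).json()

# Lighthouse reports performance as a 0-1 score.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")

# List the audits that didn't pass cleanly, e.g. render-blocking resources.
for audit in data["lighthouseResult"]["audits"].values():
    if audit.get("score") is not None and audit["score"] < 0.9:
        print(f"- {audit['title']}")
```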
Find a list of useful SEO audit tools here.
Inaccurate XML sitemap
According to Yoast, “a good XML sitemap acts as a roadmap of your website that leads Google to all your important pages. XML sitemaps can be good for SEO, as they allow Google to quickly find your essential website pages, even if your internal linking isn’t perfect.”
In many cases, the XML sitemaps we encounter do not reflect the URLs on the website and/or have not been submitted to Google Search Console. For example, we often see sitemaps list URLs that 301 redirect rather than their final destination URLs. It’s also common to see URLs in the sitemap that should not be crawled or surfaced in a Google search, like a login page.
We help clients address these issues by using an SEO crawler tool to identify all the pages on the website and review which ones should be indexed and which, like login pages, should not. We can then list the URLs that are missing from the XML sitemap and need to be added, as well as the URLs that should be removed so they won’t be crawled and indexed.
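As a rough illustration of one part of that audit, here’s a minimal Python sketch that parses a standard sitemap.xml and flags listed URLs that redirect or error. The sitemap URL is a placeholder, and a real crawler would also handle sitemap index files and servers that reject HEAD requests:

```python
# A minimal sketch comparing sitemap entries against live responses.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    # Don't follow redirects: a sitemap should list final destination URLs.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if 300 <= resp.status_code < 400:
        print(f"{url} redirects ({resp.status_code}) to {resp.headers.get('Location')}")
    elif resp.status_code != 200:
        print(f"{url} returned {resp.status_code}")
```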
Learn how to build and submit a sitemap here.
Toxic backlinks
Backlinks are links from other websites back to yours. The quantity and quality of your backlinks are one of the most important ranking factors for Google.
Unfortunately, it’s not uncommon for spammy websites to link to yours. Google has gotten better at identifying and ignoring these low-quality links, but they can still be a problem. If there are too many toxic backlinks, especially from link schemes, they send the wrong message to Google about your authority and can hurt your keyword rankings. In the worst-case scenario, Google may issue a manual action, which can remove your website from the search results entirely.
To avoid this, we recommend that clients with a questionable backlink profile review the lowest-quality links and create a list of links to disavow. This disavow file is then uploaded to Google Search Console so Google knows to disregard the toxic backlinks.
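For reference, a disavow file is a plain text file with one entry per line: a full URL to disavow a single page, or a domain: prefix to disavow every link from a domain, with #-prefixed lines treated as comments. The domains below are placeholders:

```
# Disavow a single spammy page:
https://spam.example.net/cheap-links.html

# Disavow every link from an entire domain:
domain:link-scheme.example.org
```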
Find instructions on disavowing links to your website here.
Avoid the top SEO issues of 2020
Strong SEO is essential to ranking well in the search results and driving more organic traffic to your website. Download our 2020 Visibility Roadblocks report to uncover some of the most common SEO mistakes and set yourself up for success.