We can all appreciate the power of search engine optimization to help drive more traffic to a website by improving its online visibility. With an estimated 67% of clicks going to the first five organic results, there’s a clear advantage to holding a prominent spot on the search engine results page. But with an ever-changing search landscape that includes regular updates to Google’s algorithm, it’s vital to know what’s hindering your website’s search performance so you can address those issues.
In the past year of conducting Visibility Audits for our SEO clients, we’ve uncovered a pattern of SEO mistakes that hurt websites’ digital visibility—whether the site belongs to a locally owned business or a large ecommerce company. We’ve compiled these findings into our 2019 Visibility Roadblocks report, which is now available for download.
Here are the top 5 issues hindering rankings in 2019:
1. Page speed optimization
Across the board, every single website we looked at was missing opportunities to optimize its page load times. This was true not only for sites that took more than 3 seconds to load, but also for quick-loading sites that still had room for improvement.
We can tell you (from firsthand experience!) that page speed can be difficult to address, especially when you are using a content management system, such as WordPress, with lots of plugins. But in the eyes of Google and other search engines, your website simply can’t load too quickly. Consider these statistics on the impact of page load time:
- 40% of users will abandon a webpage if it takes more than 3 seconds to load.
- Every 1 second delay in load time results in a 7% loss in conversions.
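As a back-of-envelope illustration of that second statistic, here is a minimal sketch that projects conversions after a load-time delay. It assumes the 7% loss compounds for each additional second, which is a simplifying assumption on our part—the statistic itself doesn’t specify how losses accumulate—and the function name is ours, for illustration only.

```python
def projected_conversions(baseline, delay_seconds, loss_per_second=0.07):
    """Estimate conversions remaining after a load-time delay.

    Assumes the per-second loss compounds (a simplifying assumption;
    the underlying statistic does not specify compounding).
    """
    return baseline * (1 - loss_per_second) ** delay_seconds


# A site converting 1,000 visitors that slows down by 2 seconds:
print(round(projected_conversions(1000, 2)))  # roughly 865 conversions
```

Even under this rough model, two seconds of added latency erases well over a hundred conversions per thousand—reason enough to treat page speed as a priority.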
To address page speed issues, you first need to know what’s wrong. Our favorite tools for figuring this out are Google PageSpeed Insights and GTmetrix. Pay closest attention to your website’s mobile page load times: Google’s mobile-first indexing means your site’s mobile performance is used to rank search results, even for desktop users.
50% of the sites we audited loaded in less than 3 seconds on a desktop. Only 25% loaded in 3 seconds or less on mobile devices.
2. H1 tag errors
H1 tag issues are common and, fortunately, fairly easy to fix—but that doesn’t mean they should be taken lightly. If you need a refresher, the h1 tag is the highest-level heading on a webpage and an important SEO element that can directly influence your page’s keyword rankings.
H1 tags should follow a few simple rules to be effective:
- Every page should have an h1 tag defined.
- There should be only one h1 per page.
- The h1 tag should be different from a page’s title tag.
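The three rules above can be checked programmatically. Here is a minimal sketch using Python’s standard-library `html.parser` to flag missing h1 tags, multiple h1 tags, and an h1 that duplicates the title tag; the class and function names are ours, for illustration.

```python
from html.parser import HTMLParser


class HeadingAudit(HTMLParser):
    """Collects the <title> text and every <h1> on a page."""

    def __init__(self):
        super().__init__()
        self._in_title = False
        self._in_h1 = False
        self.title = ""
        self.h1s = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self._in_h1 = True
            self.h1s.append("")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._in_h1:
            self.h1s[-1] += data


def audit_h1(html):
    """Return a list of h1-rule violations found in the page HTML."""
    parser = HeadingAudit()
    parser.feed(html)
    issues = []
    if not parser.h1s:
        issues.append("missing h1")
    elif len(parser.h1s) > 1:
        issues.append("multiple h1 tags")
    if parser.h1s and parser.title.strip() == parser.h1s[0].strip():
        issues.append("h1 matches title tag")
    return issues
```

Run `audit_h1` on each page’s HTML (fetched however you like); an empty list means the page passes all three rules.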
Many content management systems will, by default, take the text entered as the “title” when creating a new page and automatically use it for both the h1 tag and the title tag. Keep this in mind as you create content, and take the time to learn how to edit a page’s title tag through your content management system so you can fix this issue. You may need a third-party plugin, such as Yoast SEO for WordPress. (More on title tags below.)
67% of the websites we audited were missing h1 tags, while 58% had h1 tags that matched a page’s title tag.
3. Duplicate title tags
Like h1 tags, title tags are fundamental SEO elements that are important for helping search engines understand and rank a webpage. Title tags are also important because they are used as the link text that is displayed in search results.
Duplicate title tags are often the result of settings within a content management system that auto-generate the title tag when a new page is created, usually from a combination of the website name and the new page’s h1. However, having more than one page with the same title tag can also happen naturally, simply because any business is likely to have several pages and/or blog posts covering the same or very similar topics.
You can uncover your website’s duplicate title tags by using a tool like Screaming Frog SEO Spider to scan your site. Once you have a list, you can set about correcting them using SEO best practices. You may need to install a plugin to be able to easily update title tags on your site (such as Yoast SEO, which we mention above).
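If you’ve exported a crawl (for example, a URL-to-title list from Screaming Frog), grouping the duplicates is straightforward. This is a minimal sketch—the function name and the expected input format are our own assumptions, not part of any tool’s API:

```python
from collections import defaultdict


def find_duplicate_titles(pages):
    """Given a dict mapping URL -> title tag text (e.g., exported from
    a site crawl), return title -> list of URLs for every title that
    appears on more than one page. Comparison is case-insensitive.
    """
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}


crawl = {
    "/": "Acme Co | Home",
    "/about": "Acme Co | Home",   # accidentally reused the homepage title
    "/contact": "Contact Us | Acme Co",
}
print(find_duplicate_titles(crawl))  # flags "/" and "/about"
```

Each group in the output is a set of pages whose title tags need to be differentiated.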
83% of the websites we audited had multiple pages with the same title tags.
4. Local SEO issues
In general, we’ve found local optimization techniques to be an underappreciated aspect of SEO. It’s pretty common to think local SEO is irrelevant to many types of business, but that’s far from the truth. It’s important for every website to ensure that certain key pieces of information (i.e. business name, address, and phone number) are consistent and properly formatted and that their Google My Business (GMB) listing is accurate and up to date. Why? Because it verifies your business’s authenticity with Google.
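Inconsistent formatting is one of the most common NAP problems—“(555) 123-4567” on one page and “+1 555 123 4567” on another refer to the same number but don’t match as text. A minimal sketch of a consistency check, normalizing US-style phone strings before comparing them (the function names are ours, for illustration):

```python
import re


def normalize_phone(raw):
    """Strip a phone string down to bare digits, dropping a leading
    US country code, so differently formatted copies compare equal."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]
    return digits


def nap_phone_consistent(variants):
    """True when every phone variant found across the site resolves
    to the same underlying number."""
    return len({normalize_phone(v) for v in variants}) == 1
```

The same normalize-then-compare approach applies to business names and addresses, though those need fuzzier matching than digits do.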
Claiming and optimizing your GMB profile is not a difficult task. We provide a downloadable tip sheet on the process in our blog post on using Google Posts.
- Your business should be listed under at least one accurate category; adding as many as possible improves your listing’s effectiveness.
- There should be, at a minimum, five appropriate and high-quality photos for your business. Regularly adding new photos (and removing outdated ones) will keep your listing optimized.
- Ideally, you should have as many reviews as possible. Encourage reviews on Google as well as on other platforms, such as Facebook; Google will display reviews from some other platforms in your knowledge panel listing.
If you are a business with multiple locations (not to mention multiple people managing these locations), things can get trickier. We wrote an extensive blog post on how to optimize Google My Business for multiple locations.
70% of websites we audited had onsite errors in local SEO, while 80% had not optimized their Google My Business accounts.
5. Toxic backlinks
Three-quarters of the websites we audited had “toxic” links from other websites to theirs. This means the websites linking to them were viewed by search engines as low-quality and potentially spammy. Backlinks from these types of websites can reflect badly on yours, and search engines may factor the quantity of toxic backlinks into their assessment of your website’s authority.
In an ideal world, you should have zero toxic backlinks, but controlling who links to your site is difficult. We recommend that you aim for 5% or less of these undesirable links.
You can identify toxic backlinks by using a tool like SEMrush’s Backlink Audit, then carefully reviewing each one. Any link coming from a website that is completely inappropriate, offensive, or spammy should be disavowed by submitting a disavow file to Google Search Console. However, be sure to do so with care: websites linking to you that are simply lower quality but otherwise appropriate can safely be ignored.
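Once you’ve reviewed your list, the disavow file itself is plain text: `#` comment lines, full URLs for individual pages, and `domain:` entries to disavow an entire site. A minimal sketch of assembling one (the function name and parameters are ours, for illustration):

```python
def build_disavow_file(urls, domains, note="reviewed toxic backlinks"):
    """Assemble the plain-text disavow format Google Search Console
    accepts: '#' comment lines, full URLs, and 'domain:' entries.
    Entries are de-duplicated and sorted for easier review."""
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"


content = build_disavow_file(
    urls=["http://spam.example/some-page"],
    domains=["link-farm.example"],
)
print(content)
```

Save the result as a `.txt` file and upload it through the disavow links tool in Google Search Console—after the careful review described above, since disavowing legitimate links can hurt you.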
75% of the websites we audited had an undesirable number of toxic backlinks.
Recognize common SEO errors so you can avoid them.
In our review of Visibility Audits from 2019, another thing was clear: companies of all shapes and sizes are subject to the same simple SEO oversights that hamper online visibility. Large organizations with subcontracted web vendors fared no better than small businesses with a do-it-yourself team. Every organization that relies on a website for any level of promotion should take the time to periodically evaluate the visibility roadblocks affecting their site.
Download our 2019 Visibility Roadblocks report and check your own website for these common issues—you may be surprised at what you uncover!
Need help identifying your website’s SEO issues? Our SEO packages include a comprehensive Visibility Audit and ongoing optimization.