How Different Technical Website Issues Are Impeding Your SEO Success
Signing your first (or most recent) SEO contract can be really exciting. You are pumped because your optimization specialist is going to rock your website, you will see increased visits and conversions, and your business is about to thrive on the web.
A good SEO will perform a comprehensive site audit before they bring you on as a client. Lots of technical site issues can be caught with a thorough audit, but some things just don't make themselves known until access to the website is granted and the optimizer has time to really get their hands dirty.
Before off-site optimization takes off for your website, it’s important to make sure your website is in good working condition, free of common technical issues that can really impede your site’s SEO success. This is especially important since the Penguin and Panda algorithms were unleashed by Google.
Types of Technical Issues
There are so many technical issues that can affect your site’s performance, ability to be indexed, and how it is indexed, that I’m only going to cover a sampling of them.
Broken Links – Internal and External
Broken links on your website are a problem for both search engines and your users. Who wants to go to a page that doesn’t exist? Most content management systems these days offer a link checking module or plugin, and there are options out there that will spider your website looking for broken links if you don’t use a CMS. If you have a lot of broken internal links, the search engines are going to see a site with lots of 404s that doesn’t get the maintenance it needs – a surefire way to let them know that your site isn’t what their users need.
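If you want a sense of what those link-checking tools do under the hood, the core of a link spider is small. Here's a minimal sketch in Python (standard library only) of the extraction step; in practice you would then request each collected URL and flag anything that doesn't come back with a 200:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags so they can be checked later."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return every anchor href found in an HTML snippet, in order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

Feeding each fetched page through `extract_links` and testing the resulting URLs is essentially what a broken-link plugin automates for you.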
404: Page Not Found Errors
This one is big. Clean up the 404 errors on your site and set up 301 redirects to new and updated URLs so that your site can thrive. 404s are bad for visitors and really bad for search engines. Set up a custom 404 page with links to your top content, as well, so it serves up more useful information than a bare "Page not found." Google Webmaster Tools and Google Analytics can help you find your problem children so you can whip them back into shape.
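Conceptually, the redirect cleanup boils down to a lookup table from retired URLs to their replacements. A rough sketch of the logic your server or CMS applies, with hypothetical paths standing in for your own:

```python
# Hypothetical mapping of retired URLs to their replacements.
REDIRECTS = {
    "/old-services.html": "/services/",
    "/2012-pricing.html": "/pricing/",
}

def resolve(path):
    """Return (status, location): a 301 with the new URL for a known
    old page, or a 404 with no location so the custom error page
    (with its links to top content) can take over."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None
```

On a real server this table would live in your .htaccess or CMS redirect settings, but the decision being made is the same.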
Duplicate Content
Whether you, your CMS, or your old SEO company is to blame, duplicate content on your site is a no-go. While there is no true duplicate-content penalty, non-indexation can be penalty enough. Your site needs to provide users with unique, high-quality content. There's no reason to visit (and therefore rank) your website if you're just copying everyone else's stuff.
As far as CMS-driven duplicate content goes, blogs are notorious for this. Your SEO should know how to make your archives work for you instead of against you, so that you can keep your archives and still see positive ranks in the search engines.
Bad URLs
There are a lot of things that can qualify as bad URLs, including, but not limited to:
- dynamic URLs – ones that include query parameters, session IDs, etc.
- unfriendly URLs – ones that don’t include human-friendly words
- buried pages – URLs with lots and lots of subfolders
Dynamic URLs can pose all kinds of problems. Query parameters and session IDs can result in URLs that are either too complicated for search engines to index, causing them to give up, or that create duplicate content problems, especially in the case of multiple sort options or unique session IDs for each visit. Switching your site to static URLs sooner rather than later will make link building and indexation much easier.
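If a full move to static URLs has to wait, collapsing the noisy parameters is a start. A sketch in Python, assuming a hypothetical list of parameters that don't change the page content (you would adjust it for your own site):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only affect tracking, sorting, or session state,
# not the content itself -- an illustrative list, not a standard one.
NOISE_PARAMS = {"sessionid", "sort", "utm_source", "utm_medium"}

def canonicalize(url):
    """Drop session/sort parameters so variant URLs collapse to one
    canonical form, reducing accidental duplicate content."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in NOISE_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

Running every crawled URL through a normalizer like this shows how many "different" pages on your site are really the same page wearing different parameters.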
Unfriendly URLs, such as example.com/pageID6729.html, make it difficult for users and search engines to figure out what your site is about. Search engines expect your URL to correspond with the content on your page, and so do users. Human-friendly URLs are also more memorable and easier to link to. When updating your URLs, make sure you use 301 redirects to ensure no one gets lost on pages that don't exist anymore.
Buried and/or deep URLs are trickier. While it's nice to have your content sorted into categories or main navigation items, such as recipes, tips, and tools, having tons of categorical information in your URL can make it too long to be useful. It's hard to say how many folders deep is too deep, but use your best judgment here. Shorter URLs are easier to share and more memorable. Shoot for something more like /recipes/slow-cooker-chicken-curry-rice.html instead of /recipes/slow-cooker/chicken/middle-eastern/slow-cooker-chicken-curry-rice.html.
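One crude way to audit this across a whole site is simply to count folders in each path. A tiny Python helper, purely as an illustration:

```python
def folder_depth(path):
    """Count how many subfolders a URL path buries its page under.
    The final segment is treated as the page itself."""
    segments = [s for s in path.split("/") if s]
    return max(0, len(segments) - 1)
```

Sorting your crawled URLs by `folder_depth` quickly surfaces the most deeply buried pages, which are usually the first candidates for a flatter structure.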
Bad Robots.txt Files
There's a very specific structure to robots.txt files, and if something is out of order, it's likely to be ignored. Google will still try to make sense of a malformed robots.txt file, but Bing is stricter and has been known to throw out the file and do its own thing if it contains too many errors.
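You can sanity-check your own rules with Python's built-in parser, which interprets a robots.txt file much the way a crawler does. A quick sketch with made-up rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: block one folder, allow the rest.
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)
```

Calling `rp.can_fetch("*", url)` then tells you whether a well-behaved crawler would consider that URL off-limits, so you can confirm your file blocks only what you intended before a search engine reads it.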
Slow Site Speed
The major search engines have made it clear that your site's loading speed impacts your ability to rank. As internet speeds get quicker, users grow less patient waiting for a site to load. Improving your site's loading speed is important for your conversions, too. A study showed that you can expect about a 7% drop in conversions for every additional second your site takes to load beyond two seconds.
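To see what that figure means for a concrete page, here's an illustrative Python calculation. The compounding model is my assumption for the sketch; the study only supplies the rough per-second figure:

```python
def expected_conversion_rate(base_rate, load_seconds, threshold=2.0, drop=0.07):
    """Illustrate the cited figure: roughly a 7% relative drop in
    conversions for each second of load time beyond two seconds.
    Compounding per extra second is an assumption of this sketch."""
    extra_seconds = max(0.0, load_seconds - threshold)
    return base_rate * (1 - drop) ** extra_seconds
```

Under this model, a page that converts 5% of visitors at a two-second load would convert about 4.65% at three seconds, which is why shaving even one second can be worth real money.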
Browser Incompatibility
Your site doesn't have to be compatible with every browser, but if it breaks in newer versions of IE, Firefox, or Google's Chrome, your conversions will be hurting. Search engines will take note if lots of folks are skipping your site in the SERPs and going back to click through to a different website. Besides, why would you want to direct traffic to a site that isn't going to work for them anyway?
Why Prioritize Technical Issues?
When you are excited about getting your marketing campaign going, it’s hard to hear that there is going to be any sort of delay, even if your SEO is trying to get things done with your best interests in mind.
If the technical issues on your site are not yet affecting your performance and ranking, it's only a matter of time before the next Panda or Penguin update changes that. A good SEO knows that correcting technical issues is a priority for ensuring the longevity of your SEO efforts, so you can spend the next few years building your site and the community around it instead of digging out of a major algorithm hit that you may never recover from. As many as 87% of websites affected by Google's Panda had still not recovered one year later.
Make sure to get your site cleaned up and working in tip-top shape before you start calling the search engines’ and the public’s attention to it. You’ll be thankful you did, and reap the rewards for years to come.