Google’s Andrey Lipattsev has said that the three most important website ranking factors are content, backlinks and RankBrain (Google’s artificial intelligence algorithm). While that’s all well and good, it assumes that Google can view all the content on a website and work out where its links are meant to go.
As an SEO consultant I’ve been asked to help on several projects where a web developer has inadvertently destroyed a site’s SEO, sometimes acting on their own initiative, but more often by simply following a client’s instructions without realizing the consequences.
This guide is intended to help both web developers and businesses avoid making these mistakes, or if you have a site that’s performing very badly in the search rankings, yet has good content and links, to identify an alternative factor that might be at play. I’ve focused on the SEO mistakes that are truly devastating, yet surprisingly still occur on a regular basis.
I’ve also included details of how to fix each mistake.
Making text invisible to search engines
Most professional web developers will use a popular CMS like Drupal, WordPress or Magento. These have good in-built SEO by default, which means you have to go out of your way to make text invisible to search engines. If you have a custom CMS, however, there are various ways that you can inadvertently make text invisible. They are:
1. Showing text as an image
Google is starting to be able to read text in images, but it still rarely indexes it, and if large chunks of your content are in the form of text displayed as images, it will be very bad for your SEO. It’s also not mobile friendly: the text will either be too small to read on mobile devices, or visitors will have to scroll horizontally to read it.
If you want to show text over an image, simply apply the image with the background-image property in CSS and place real text over it. If your aim is to use a fancy font that most people won’t have installed on their computer, there are now various ways to display web fonts on your website.
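As a rough sketch (the class name, image path and font are hypothetical), the CSS for real, indexable text over a background image might look like this:

```css
/* The decorative image is applied as a background... */
.hero {
  background-image: url("images/banner.jpg"); /* hypothetical path */
  background-size: cover;
  padding: 60px 30px;
}

/* ...while the heading remains real text that search engines can index */
.hero h1 {
  color: #ffffff;
  font-family: "Lobster", cursive; /* hypothetical web font, loaded via @font-face or a font service */
}
```

The heading stays in the HTML as ordinary text, so search engines and mobile browsers treat it normally, while visitors see it styled over the image.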
2. Creating the entire site in Flash
Adobe Flash may still have its role on websites, provided you offer an alternative for visitors who can’t view it, but designing your entire site in Flash is not smart. Google struggles to read text inside Flash animations and will index the entire animation as a single page, which makes creating a whole site in Flash highly inadvisable.
3. Using Frames
Frames allow you to show information in multiple views: part of the page stays the same while the other part (the frame) changes. They can be useful for embedding social media content from another website, for example, but some websites use them to display the main content of each page. Because the contents of the frame don’t appear in the page’s source code, search engines don’t index them as part of the page; see why frames don’t work for SEO for more information.
It’s not quite accurate that the content of the frame is invisible to Google: Google often finds the frame content and indexes it separately, so internal pages of your site then appear in Google’s index without the rest of the page around them.
This causes all kinds of problems for SEO and user experience. To resolve this, switch to a CMS that doesn’t use frames, or edit the site to embed the frame as part of each page instead.
4. Using only cookies to control the language of a website
If your site uses a cookie to decide which language to display, search engines, which crawl without cookies, will only ever see the default language, so the rest of your content is never indexed. To fix this problem, implement a system where each language is shown under a separate domain, sub-domain or sub-folder (e.g. example.com/fr for French, example.com/de for German, etc.). If your existing CMS doesn’t allow this, visit how to translate a website for a questionnaire that guides you through potential solutions.
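Once each language lives at its own URL, you can also add hreflang tags in the head of each page so Google knows which language version to show to which audience. A sketch, with example URLs:

```html
<!-- In the <head> of every page, list each language version of that page -->
<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
```

Each language version should carry the same set of tags, so the versions all point to each other.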
If you think one of the above problems applies to you, search in Google for site:example.com, replacing example.com with your domain name. This shows you every page of your website that Google has indexed.
Avoiding a Google penalty
While we’re discussing invisible text: sometimes webmasters want to remove text from view and use CSS to hide it rather than simply deleting it. If you hide links or a significant amount of text, Google may consider it manipulative and will penalise your site because of it. It’s therefore better to delete unwanted text, not just hide it.
Other development decisions that can get a site penalised include any of the following:
– Allowing dofollow links in unmonitored comments
– Not applying security updates
– Linking all of a developer’s clients’ sites together
Google say the following in their guidelines:
“Any links intended to manipulate a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.”
The items on the above list all risk you being penalized for manipulative outgoing links. If you allow unmonitored dofollow links in comments, someone will add links to unscrupulous third-party websites. If you don’t apply security updates, someone will eventually hack into your site and add outgoing links. If a developer links all their clients’ sites together, it creates lots of irrelevant links between sites on the same IP address, which is also considered a manipulative SEO practice (linking every site to the developer’s own site with a ‘web design by …’ credit is fine).
If you feel you must add irrelevant links for some reason, add rel="nofollow" to the <a> tag. This tells Google to ignore them, although you’re better off removing irrelevant links altogether. The key is not to allow unknown entities to add dofollow links out of your site, and to only add relevant links that make sense to actual visitors.
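For example (the URLs are placeholders):

```html
<!-- Normal link: passes ranking benefit to the target site -->
<a href="https://example.com/some-page">Relevant resource</a>

<!-- Nofollow link: tells Google not to count it for ranking -->
<a href="https://example.com/some-page" rel="nofollow">Irrelevant or untrusted link</a>
```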
5. Adding an automated translation to a site
Google also consider presenting pages with automated translations to be spam. Either add a manual translation to your site, or embed Google’s translation tool in your site, so people can use it themselves, without you risking a penalty for automated content.
6. Keyword stuffing
Finally, keyword stuffing (e.g. “if you want to buy a used car, second-hand car or used car for sale …”) is also considered a manipulative SEO practice and, particularly if done excessively, will get your site penalized.
SEO drops after a change of CMS or domain name
If you’ve experienced a sudden drop in traffic after changing domain names or your CMS, it’s worth reviewing the SEO on your new site. By far the most common cause in this situation is that old pages aren’t redirecting to the new ones, so this is the first thing to review.
7. Changing domain name
When you change domain name, you should 301 redirect (permanently forward) every page from the old domain to the equivalent page on the new one. As well as helping visitors, this means that Google and other search engines will count links going to your old domain as if they were going to the new one. This is why changing domains without 301 redirects is so bad: Google treats your new site as unrelated to your old one and you lose literally all benefit from the links and the domain’s history.
To 301 redirect every page add the following code to the .htaccess file on your old domain (example.com is the old domain, new-example.com is the new one).
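A typical version of that code, assuming an Apache server with mod_rewrite enabled (substitute your own domains), is:

```apache
RewriteEngine On
# Send every request on the old domain to the same path on the new one
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ https://new-example.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, which is what tells search engines to transfer the old pages’ link value to the new URLs.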
You can also tell Google your domain name has changed, using the Change of Address tool in Google Search Console, as an extra step to ensure you keep the benefit of the links.
8. Changing CMS
If you switch to a different CMS, keeping the same URL structure is the simplest solution. If that isn’t possible, then 301 redirecting at least the most visited and most linked-to pages to their equivalent new URLs is essential for both SEO and user-friendliness. CMS systems like WordPress and Drupal have modules that let you do this easily; alternatively you can edit the .htaccess file, adding the following code for each page:
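A single-page rule of this kind might look like the following (the paths are hypothetical — use your own old and new URLs):

```apache
# Permanently redirect one old URL to its new equivalent
Redirect 301 /old-page.html https://www.example.com/new-page/
```

Repeat one line per page, with the old path first and the full new URL second.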
Sometimes when you switch to a different CMS it’s tempting to delete a large amount of content from the site, but this can also cause a substantial drop in traffic. I’ve seen one site lose over 90% of its traffic simply because the owner deleted old pages he didn’t think were important. If this has happened to you, hopefully you have a backup of the files that you can restore; if not, archive.org keeps a history of websites and you may be able to copy content from there.
9. Deleting content
If content is no longer accurate and has to be deleted, it’s important to 301 redirect the deleted URL to a relevant live page (or the home page if no other page is relevant), so that you keep the value of old links. If a link arrives at a 404 Not Found page, 100% of its value is lost. If content could provide a useful archive for visitors but you don’t want to feature it prominently, you’re best keeping it live, removing its link from the main menu structure and linking to it from a separate block on your archived pages instead.
Other SEO mistakes to avoid
I’ll wrap up by mentioning a couple more mistakes that won’t destroy a website’s SEO, but will badly damage it.
10. Using a Splash home page
A splash home page is one where you have an image, or perhaps an animation, and the words ‘Enter site’ that you click to reach the actual content. Most visitors find these annoying: we live in an impatient world and internet users want to access information immediately. It’s also bad for SEO because it dilutes the benefit of incoming links. When incoming links arrive at a page with nothing useful on it (there’s nothing to optimize on a splash page), you miss the opportunity to optimize the home page, and the benefit those links pass on to other pages is diluted, a bit like passing water through a funnel.
11. Omitting the Title or Description Metatag or using identical ones for every page
Another SEO mistake to avoid is neglecting Meta Titles and Descriptions, whether by omitting them completely or by using identical ones on every page. The Meta Title and Description are what’s shown in the search results: if they’re omitted or not used properly, they won’t include relevant keywords for each page and you will have a lower Click Through Rate (CTR). Both of these are bad for SEO and mean that Google will move you down the rankings.
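For instance, a used-car dealer’s page might use something like this (the business name and wording are invented for illustration):

```html
<title>Used Cars for Sale in Manchester | Example Motors</title>
<meta name="description" content="Browse quality used cars for sale in Manchester, all with a 12-month warranty. Part exchange welcome.">
```

Each page should get its own title and description, written around the keywords that page targets.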
To create a website with good SEO it’s essential to ensure that all the pages, metatags and text are visible to search engines, incoming links arrive at a relevant page and outgoing links can’t be hijacked.
Each of these mistakes comes from trying to achieve a goal in a way that has negative consequences; in every case there is an alternative solution that achieves the same goal without damaging the site.
About the author:
Martin Woods is a Drupal web designer and multilingual SEO consultant for Indigoextra Ltd with 17 years’ experience. When he’s not designing websites and managing international SEO campaigns, he spends his time homeschooling two teenage boys and playing Ultimate Frisbee.
Martin has also written over 500 cryptic crosswords for The Big Issue in the UK.