When hiring a web developer to build you a new website, it’s easy to presume that your website will rank well in the search engines. After all, if you have a fantastic website, Google will love it too, right?
Not so fast. Your new website might look great, and it might have a lot of clever bells and whistles that justified the price tag, but that doesn’t ensure that every piece of the puzzle is in place. Suppose SEO (Search Engine Optimisation) is not given proper weight throughout the web development process. In that case, you could be wasting your money unless you intend to buy most of your website traffic.
If you want to take advantage of the abundance of free organic traffic that ranking on the first page of Google and other major search engines can bring to your website, it’ll need to be “SEO-friendly”.
Having worked as an SEO specialist for the past 15 years, both as a freelancer and at established digital marketing agencies, I’ve worked on countless websites on behalf of a wide variety of businesses. If you’re looking at getting a website built, be it a new one or a redesign of an existing site, this article gives you a checklist of common SEO mistakes so you can make sure your developer avoids them.
Before any work happens on your website, your developer needs to understand your business goals. Part of this process should involve some keyword research. Essentially, you want to understand the most relevant phrases to your business that people enter into Google search, and their average monthly search volumes based on your geographic markets.
There are quite a few keyword research tools available. Almost all of these tools get their keyword data from the Google Ads platform. To access Google Ads, all you need is a Google account.
Armed with some keyword research, you have the basis for an SEO plan for your website. You and your developer now know which of your website’s related search terms are most popular and can figure out which pages should include them in the content.
When you load a web page in your browser (Chrome, Firefox, Safari etc.), a page title appears in the tab. This title is also what shows up in organic search results. It is created with a line of HTML code known as the title tag. The page title is an essential element for on-page SEO and should contain your most valuable target keywords. Too often, though, developers leave the page title as the default name of the web page. Google cuts off page titles after about 65 characters, but a title should still be more than 30 characters long.
Another important meta tag for SEO is the page description tag. Google will typically show up to 156 characters of a description tag. This tag should be concise and enticing to encourage searchers to click on your link and visit the page. Description meta tags are not a direct ranking factor for Google, but it does affect click-through-rates, which are factored into the search rankings.
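As a sketch, here is what the page title and description tags look like in a page’s HTML head. The business name, domain and wording are made-up placeholders, not recommendations:

```html
<head>
  <!-- Page title: shows in the browser tab and as the clickable headline in
       search results. Aim for roughly 30-65 characters, keywords up front. -->
  <title>Emergency Plumber Sydney | Acme Plumbing</title>

  <!-- Description: Google typically shows up to ~156 characters of this as
       the snippet under your link, so write it to earn the click. -->
  <meta name="description"
        content="Fast, licensed emergency plumbers across Sydney. Upfront pricing and 24/7 call-outs. Call now for a free quote.">
</head>
```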
Social sharing meta tags are not so important for SEO, but they do matter for your presence on social networks. These tags specify the page title, description and image to use when a page is shared to social sites like Facebook and Twitter. If you want your previews to look good in social media posts, these meta tags should also be defined on your web pages.
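Social sharing tags follow the Open Graph protocol (used by Facebook, LinkedIn and most platforms) plus Twitter’s card format. A brief sketch, with placeholder values throughout:

```html
<!-- Open Graph tags: control the preview title, description and image
     when this page is shared on social networks -->
<meta property="og:title" content="Emergency Plumber Sydney | Acme Plumbing">
<meta property="og:description" content="Fast, licensed emergency plumbers across Sydney, 24/7.">
<meta property="og:image" content="https://www.example.com/images/share-card.jpg">
<meta property="og:url" content="https://www.example.com/">

<!-- Twitter card tag: requests a large image preview when the link is shared -->
<meta name="twitter:card" content="summary_large_image">
```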
If your website’s menu system is confusing or difficult to use, not only will your visitors not find what they’re looking for, Google will also find it difficult to crawl your website and rank all of your pages in search.
If your website only has a few pages, then a flat menu is fine. If your website has lots of pages and category pages, such as an e-commerce website, your navigation menu should be able to drill down appropriately with sub-items.
All of the pages on your website should be no more than 3-4 clicks away from your home page. Any more than that and those pages won’t be considered valuable and are unlikely to be indexed in search results.
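To illustrate, a drill-down menu is simply nested lists in the page’s HTML; the categories below are invented placeholders:

```html
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/shop/">Shop</a>
      <!-- Sub-items let visitors (and search engine crawlers) drill down
           from category pages to products within a few clicks -->
      <ul>
        <li><a href="/shop/taps/">Taps</a></li>
        <li><a href="/shop/sinks/">Sinks</a></li>
      </ul>
    </li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```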
An XML sitemap is a list of all your web page URLs laid out in a particular format that search engines can easily read. XML sitemaps make it easy for search engines to know about all your web pages, along with other index-worthy elements of your website.
Many websites use separate XML sitemaps for each type of web page, and even have individual XML sitemaps for images and videos. An e-commerce website, for example, might have one sitemap for the main pages such as the home page, about, contact and shipping info; another for product categories; another for product pages; yet another for images; and yet another for videos.
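To sketch the idea, those separate sitemaps are usually tied together by a sitemap index file, following the sitemaps.org format. The domain and file names here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index: points search engines to the individual sitemap
     for each content type on the website -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-products.xml</loc></sitemap>
</sitemapindex>
```

Each individual sitemap is then just a simple list of `<url><loc>…</loc></url>` entries for that content type.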
Image and video sitemaps are useful if you want those elements to show up specifically in image or video search results in Google and other search engines.
To get Google to make use of your XML sitemaps, you need to submit them to Google Search Console. You can access it with your Google account, where ownership of the website will need to be verified. If your website is new, submitting sitemaps to Google Search Console is the best way to get Google to crawl it for the first time and get your pages indexed. Otherwise, Google will need to find your website through a link to it from another website, and that is hardly a guarantee.
Bing has a similar process to Google for submitting XML sitemaps in its Webmaster Tools. It is a good idea to submit your sitemaps to both Google Search Console and Bing Webmaster Tools.
Iframes are code snippets that display content on a page that actually lives on a different page; the content could even come from another website altogether. This method of pulling content from other pages doesn’t fool Google, which will credit the source page for the content with a place in the search results rather than the page using the iframe.
Websites often use iframes to embed YouTube videos, slideshows from Slideshare or PDF documents. These uses are acceptable and won’t negatively affect your search rankings. If your developer uses iframes to embed text and images that should be part of your page content, then this can be a problem for your SEO and is, quite frankly, just lazy design.
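An acceptable use looks like this: YouTube’s standard embed code is an iframe (the video ID below is a placeholder):

```html
<!-- Embedding a video via an iframe is fine: the video is meant to live
     on YouTube, not to substitute for your own page copy -->
<iframe width="560" height="315"
        src="https://www.youtube.com/embed/VIDEO_ID"
        title="Product demo video"
        allowfullscreen></iframe>
```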
The majority of searches now take place on mobile devices, so if your website isn’t mobile-friendly, your chances of getting good rankings on Google will be limited.
Back in November 2016, Google announced that they were working towards eventually using the mobile version of a website’s content to rank web pages. In March 2020, Google confirmed that 70% of all websites had shifted over to mobile-first indexing and that all remaining websites will be indexed on a mobile-first basis starting September 2020.
So the writing is on the wall: if your developer is stuck in desktop mode for your web design and can’t create a great mobile user experience, it simply won’t fly in Google search.
Google has a useful tool to help to make sure the mobile user experience is adequate, appropriately named “Mobile-Friendly Test”.
Time is valuable, and slow loading web pages create a poor user experience, especially on mobile devices. Google doesn’t like slow loading websites either and has made page loading time a factor when ranking pages in search results.
As mentioned in item 6 above, most people now search on mobile devices, so making sure your pages load fast on mobile devices is particularly essential to today’s SEO.
If your website is slow to load on mobile devices, getting on the first page of Google is going to be tough in 2020. The maximum load time you should be aiming for is under 4 seconds from your target geographical areas.
Striking the optimal balance between creating a beautiful and impressive website and keeping the load times down is a challenge for developers. Too many bells and whistles, so to speak, come at a cost in load times.
Possibly the biggest culprit for slow loading pages is large image files. There is often the temptation from both designers and website owners to use high-resolution images. To speed things up, you’ll need to keep the image file sizes to under 160kb. The smaller, the better, so it is worthwhile to compromise a bit on the image quality to get the load time down as much as possible. I try to go for around 80-100kb for large images, but in any case, I’d use 160kb as a hard limit. If you are showcasing a large gallery of high-quality photos, make sure they aren’t all being rendered when your page first loads.
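On that last point, modern browsers support native lazy loading, which defers off-screen images until the visitor scrolls near them. A brief sketch, with placeholder file names:

```html
<!-- loading="lazy" tells the browser not to fetch this image until the
     visitor scrolls close to it, keeping the initial page load fast -->
<img src="/gallery/photo-01.jpg" alt="Finished kitchen renovation"
     width="800" height="600" loading="lazy">
```

Setting explicit width and height also lets the browser reserve space for the image, so the page doesn’t jump around while it loads.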
There are a lot of other ways to speed up your website, from using a CDN (Content Delivery Network) and leveraging browser caching to making the on-page code as efficient as possible.
Google provides a very useful tool for analysing website speed called Google PageSpeed Insights. It will generate a score for the desktop and mobile versions of your pages as well as list everything that is causing slow-downs and how to fix them.
Heading tags are the visible headings and subheadings represented on the page using HTML tags such as <H1>Your Title</H1> or <H2>Your Sub-Title</H2>. They create a hierarchy of titles in your on-page text. HTML defines six levels, H1 through H6, but most pages don’t need to go beyond H3 or H4.
The H1 title is usually placed at the top of a page, it should be unique from your page title meta tag, and there should only be one per page. The H1 title should contain your most relevant keywords to the topic of the page, describing what it is about in under 70 characters.
Web developers who are not SEO-savvy often do things like making every title on a page an H1, or using H1 tags in the website template for content that’s replicated across different pages.
You can use H2 headings multiple times on a page, and they are useful for improving the readability of your text. No one wants to see long slabs of text, and sub-headings help readers find the information that’s most important to them. They’re also good opportunities to point search engines towards ranking the page for your keywords.
H3 and H4 heading tags can be used to break down your content even further if necessary. The text under these sub-headings is seen as less important, though. They are often used for widget and table titles too.
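Putting that together, a well-structured page might use headings like this (the topic and wording are made-up examples):

```html
<!-- One H1 per page, containing the page's most relevant keywords -->
<h1>Emergency Plumbing Services in Sydney</h1>

<!-- H2s break the page into its main sub-topics -->
<h2>Blocked Drains</h2>
<p>...</p>

<h2>Burst Pipes</h2>
<p>...</p>

<!-- H3s break a sub-topic down further where needed -->
<h3>What to Do Before We Arrive</h3>
<p>...</p>
```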
Structured Data, otherwise known as Schema data, is code used to tell search engines about, or “highlight”, specific information related to what a web page is essentially about.
Not only that, but structured data can also influence Google to show additional information as “rich snippets” within the search results. These snippets include aggregate ratings of up to 5 stars from customer reviews, images, price information, publishing date, video player thumbnail and more. Enhancements like these can make your search result stand out and drive higher click-through-rates.
Search results enhanced with rich snippets, made possible with Structured Data, are easy to spot: they’re the listings with star ratings, prices and thumbnails.
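Structured data is usually added as a JSON-LD script using the schema.org vocabulary. A minimal sketch for a product page follows; every name and value is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Mixer Tap",
  "image": "https://www.example.com/images/mixer-tap.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "AUD"
  }
}
</script>
```

With markup like this in place, Google may show the star rating and price directly in the search result for that page.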
If your website isn’t using structured data, you’re at a disadvantage against those that are. It has to be appropriately applied, though. Structured data has only been widely known about for around five years, so only a small number of web developers are aware of it, and even fewer know how to implement it properly.
It is even difficult to find CMS (Content Management System) plugins that do the job of implementing structured data properly or offer all the options that would make it work best for every kind of business.
Google has a useful tool for validating your website’s structured data and will report any errors or warnings about the implementation.
I’ve come across this problem several times and it has devastated search rankings for websites that had been redesigned and severely held back new websites from getting organic search traffic. It’s such a simple oversight, but the consequences to a business are huge.
While a new website is built on a staging server, a setting is used to block search engine bots from crawling the website. This makes sense because the staging server won’t be on your live domain name yet and you don’t want search engines indexing an unfinished website on a server that isn’t meant to be public.
Then the website goes live…it’s showtime. However, days and even weeks pass, and Google hasn’t indexed anything. The reason might be that the setting blocking the search bots is still active on the live website.
On website platforms like WordPress, all that needs to be done to unblock your website from being crawled is to untick a single box and then click “save”.
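Under the hood, that tick-box typically controls a robots meta tag in each page’s HTML head (and on some setups, a robots.txt rule as well). If you see this tag on your live website, search engines are being told to stay away:

```html
<!-- Tells search engines not to index the page or follow its links.
     Sensible on a staging server; disastrous on a live website. -->
<meta name="robots" content="noindex, nofollow">
```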
All that lost traffic over a single tick-box!
Websites Should be Built to be Search-Friendly
When investing in getting a website built, often the SEO fundamentals are overlooked. Making sure your website is search engine friendly means that you can take advantage of what is essentially free web traffic. Paying for ongoing specialist SEO services is a wise investment that can enhance your presence in the search engines even further.
But planning for success in organic search before and during the website design and build process will pay dividends far more quickly than going back over it after launch, figuring out which search terms to target and fixing all the issues.
Hopefully, reading this article will help you avoid these ten common SEO mistakes that web developers make, so your new website can hit the ground running, bringing you sustained and ever-increasing organic search traffic.