First words are important words. If you’re going to put your brand name in your page title, it usually should come after the description of the page content.
If you’re having to do redirects, don’t do them at “database level” – e.g. in the backend of your CMS. Your site will be faster if you do the redirects “higher up”, such as in the htaccess file. Faster sites are good for users and rank better.
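For Apache sites, a redirect at this level lives in the .htaccess file. A minimal sketch (the URLs are made up – your paths will differ):

```apache
# Serve a permanent (301) redirect at web-server level,
# before the CMS or database is ever involved.
Redirect 301 /old-page/ https://www.example.com/new-page/
```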
It’s a common mistake to use robots.txt to block access to the CSS / theme files of your site. You should let Google access these, so it can accurately render your site and have a better understanding of the content.
“Keywords Everywhere” Chrome plugin is a nice, free way to get search volumes, search value and suggestions overlaid with every search you do. I have it on all of the time and over the months and years, you build a good ‘feel’ for search competitiveness and how other people search.
Pop-ups and interstitials generally annoy users, you too, right? Since Jan 2017, Google has specifically stated that websites that obscure content with them and similar will likely not rank as well. Here are some examples from Google of things to avoid: https://webmasters.googleblog.com/2016/08/helping-users-easily-access-content-on.html
Competitors copying your content? One of the many things you can do is file a DMCA notice directly with Google. This can remove your competitor’s content from search results. Here is the link to do so: https://www.google.com/webmasters/tools/dmca-notice?pli=1 N.B. There are consequences for fake DMCA notices.
If you’re AB testing different page designs on live URLs, make sure you use the canonical tag so you don’t confuse search engines with duplicate content while the test is live.
Bonus – Video course preview: Canonical tags for A/B tests
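As a sketch (URLs are hypothetical): while the test runs, the variant page's head points back at the original, so search engines consolidate signals to one URL.

```html
<!-- On the variant page, e.g. https://www.example.com/landing-b/ -->
<!-- The canonical points at the original page being tested against: -->
<link rel="canonical" href="https://www.example.com/landing/">
```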
Ayima Redirect Path Chrome plugin allows you to live view what redirects (JS/301/302 etc) are happening and if you’re getting chains of them. You can get it here: https://chrome.google.com/webstore/detail/redirect-path/aomidfkchockcldhbkggjokdkkebmdll?hl=en
If your web page URLs work both with and without a trailing slash (/), search engines can see two identical copies of each page. Pick whether you want a trailing slash or not and set up permanent 301 redirects from one version to the other. Failing to do so can result in the ‘two’ pages competing against each other in the SERPs and ranking worse than a single one.
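If you settle on trailing slashes and run Apache, a sketch of the rewrite rule might look like this (test carefully before deploying – the file check stops assets like images being redirected):

```apache
RewriteEngine On
# Don't add a slash to real files (images, CSS, etc.)
RewriteCond %{REQUEST_FILENAME} !-f
# 301 any slash-less path to its trailing-slash version
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```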
If you have paginated (page 1, 2, 3, 4 etc.) content, there is special “Prev” and “Next” markup you can use to help search engines better understand what is going on. [Deprecated! Google has since confirmed rel=“prev”/“next” is no longer an indexing signal for them, but it is for other search engines.] More info: https://webmasters.googleblog.com/2011/09/pagination-with-relnext-and-relprev.html
Google has always flatly denied that social media posts on platforms like Facebook – and their associated ‘likes’ and engagements – directly impact your rankings in any way, and there is no good evidence whatsoever that they do. If someone is insistent about this, look closely – you may be dealing with a clown! 🤡
How long a domain has been live and receiving links for, building up a reputation plays a part in how well the pages on it can rank. Older is better.
Want to know where you rank? Googling it will just frustrate you. Even with incognito mode, you’re not going to get a fair representation of the rankings. Google Search Console will give you some average rankings – but only for terms they choose. I’d recommend SEMRush. It’s super cheap and will give you loads of keyword ranking data for your site, your competition and specific terms you want to track. You can get a trial here: https://www.semrush.com/sem/ (aff)
Bonus – Video course preview: Google Search Console overview
Add a self referential canonical tag to all canonical pages. This means if someone scrapes your content or links to it with query strings, it’s still clear to Google which version to give credit to. More info: https://support.google.com/webmasters/answer/139066?hl=en
Bonus – Video course preview: Canonical tags
You can find easy link opportunities by using tools such as Majestic’s Clique Hunter. Specify a few competitor sites and you’ll get a list of links that all or most of your competitors have that you do not. This helps close the gap on where all your competitors are being talked about and you’re not. More info: https://blog.majestic.com/company/clique-hunter/
Watch this brilliant video from Lukasz Zelezny on SEO tips you can implement tonight! https://www.youtube.com/watch?v=dXdqmVnP5pg
Using a VPN is a good idea in general, but it’s really helpful for SEO. With a service like ProtonVPN, you can click to change cities or countries and see what different search results look like.
Google can index PDF documents just fine and it actually renders them as HTML. This means links in PDF documents count as normal weblinks – PDFs are pretty easy to share, too….
This week, I saw a company that had been told by an agency their site was slow. It really wasn’t (consistent ~3.0s TTI). You can test site speed (and more) yourself using Google’s PageSpeed / Lighthouse audit tools. I’ll do some tips the next few days about these tools, as they are commonly misused and misinterpreted. PageSpeed: https://developers.google.com/speed/pagespeed/insights/ Lighthouse: https://web.dev/
If you use the page speed tools from yesterday, keep in mind that results can vary every test you run, depending on all kinds of factors. If you are going to use these tools, run multiple tests on multiple pages and get some averages.
If you have enough traffic, Google’s Pagespeed Insight tool will give you “Field data” – this is an amazingly useful average speed, directly from your user’s browsers. It will give you a much better idea of how your site is performing outside of the ‘spot checks’ we spoke about in previous tips. Google Pagespeed Insights tool is here: https://developers.google.com/speed/pagespeed/insights/
Have you heard that 50% of searches by 2020 will be voice searches? They won’t, it’s complete rubbish.
1 in 5 searches that happen in Google are unique and have never happened before. The vast majority of searches conducted are terms that have fewer than 10 searches per month. If you’re just picking key phrases based on volume from “keyword research”, you’re missing the lion’s share of traffic and making life hard for yourself, as lots of other people are doing the same.
Check the last 12 months in Google Analytics, if you’ve got content pages with no traffic – it’s maybe time to consider consolidating, redoing or removing those pages.
Key phrases mentioned in the reviews written about you on Google My Business help the visibility of your company for those terms.
Google do not use UX engagement metrics directly as part of their algorithm (CTR, dwell time etc). They have said this consistently for years and last week, Gary Illyes from Google referred to such theories as “made up crap” in a Reddit SEO AMA. In my opinion, same goes for direct ‘social engagement metrics’.
If you’re serving multiple countries on one website, it is almost always better to do this with sub-folders, rather than sub-domains or separate TLDs. This means: mywebsite.com/en-gb/ and mywebsite.com/fr-fr/ are almost always preferable to: en-gb.mywebsite.com and fr-fr.mywebsite.com.
You need other websites to link to your website pages if you want to rank well in Google. This means if you consider SEO to be a one-off, checkbox task of completing items on an audit, you are unlikely to see success. Technical SEO gives you the foundation to build on, not the finished article. #backtobasics
Have a play with Google Trends! It is useful to see trends in searches, when they happen every week, month or year. How much do they vary or are they trending up or down? Here’s a funny trend for two searches (different Y axis) for searches around ‘solar eclipse’ and ‘my eyes hurt’ 🙂
You can do some basic brand monitoring for free with Google Alerts. This gives you the opportunity to do ‘link reclamation’ – when websites are mentioning your brand or website and not giving you that link. Strike up a friendly conversation, offer them some more value, detail, insight and get that request in to get the link 🙂
Registering for Google My Business, which is free, is how you can start ranking in the local map box results.
Stuck for good content ideas? Put a broad subject (like ‘digital marketing’) into AnswerThePublic and you’ll get a list of the types of questions people are asking in Google!
Screaming Frog is a tool with a free, limited version that allows you to quickly ‘crawl’ your pages like a search engine would, to see issues such as 404s or duplicate page titles.
Bonus – Video course preview: Installing Screaming Frog and running your first crawl
Video is often overlooked – YouTube is the second largest search engine in the UK. There is more to SEO than just Google search!
Want a better chance that your videos will appear in search results? Then create video sitemaps! Video sitemaps give additional information to search engines about videos hosted on your pages and help them rank.
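A minimal sketch of a single video sitemap entry (all URLs and titles here are invented, and the surrounding <urlset> element must declare the video namespace – see Google’s video sitemap documentation for the full schema):

```xml
<!-- One <url> entry inside a <urlset> that declares
     xmlns:video="http://www.google.com/schemas/sitemap-video/1.1" -->
<url>
  <loc>https://www.example.com/videos/how-to-tile/</loc>
  <video:video>
    <video:thumbnail_loc>https://www.example.com/thumbs/tile.jpg</video:thumbnail_loc>
    <video:title>How to tile a bathroom</video:title>
    <video:description>A step-by-step tiling walkthrough.</video:description>
    <video:content_loc>https://www.example.com/video/tile.mp4</video:content_loc>
  </video:video>
</url>
```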
Don’t stress about linking to other websites where it’s relevant and useful to the user. That’s how the web works and is absolutely fine!
Did you know that sending someone a free product to review and get a link is against Google’s guidelines and comes under ‘link schemes’ that could land you with a penalty?
The factors to rank in the local map pack results are different to ‘normal’ rankings (but there is overlap).
How much is your organic traffic worth? One way to get a good estimation is to find out how much it would cost to buy that search traffic through paid search. The cost per click (CPC) of a keyword is set by market demand and can be used as a barometer for the value of your rankings. Tools like SEMrush can do this for you automatically. In this example, the estimated monthly value of the organic traffic is £5,700.
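The arithmetic behind that estimate is simple enough to sketch yourself. All figures below are invented for illustration – plug in your own volumes, your measured (or estimated) CTR per position, and the CPCs from a keyword tool:

```python
# Estimated monthly organic value ≈
#   sum over keywords of (monthly searches × CTR at your position × CPC)
keywords = [
    # (monthly searches, estimated CTR for your position, CPC in £)
    (12000, 0.28, 1.10),  # ranking #1, CTR ~28%
    (8000,  0.10, 0.75),  # ranking #3-4
    (3000,  0.03, 2.40),  # ranking lower on page one
]

monthly_value = sum(volume * ctr * cpc for volume, ctr, cpc in keywords)
print(f"Estimated monthly organic value: £{monthly_value:,.2f}")
```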
If you want your images indexed, you need an img tag within the HTML – images pulled in via CSS backgrounds won’t be picked up for image search.
Google Trends has a commonly-overlooked ability to trend YouTube searches.
Domain age, or at least the component parts of it such as how long links have existed to it, play a part in ranking. It is almost impossible to rank a brand new domain for any competitive term.
Examining your raw server log files can be a worthwhile exercise. You can see directly how Googlebot is interacting with your site and if it’s getting stuck somewhere or getting error responses.
When doing a site migration, don’t forget to migrate URLs that aren’t within your site’s internal link structure. This could include links to pages with marketing parameters, for instance. These ‘hidden’ URLs contribute to your ranking, commonly get overlooked, and can result in permanent ranking drops after migration.
If you’re trying to do a crawl of your site with a tool like Screaming Frog and getting 403 errors, this can be because many Web Application Firewalls or services like Cloudflare will default to blocking crawlers imitating Googlebot. Get around this by setting your crawl user-agent to Screaming Frog SEO Spider or another non-search engine one (or get yourself whitelisted by IT!)
Bonus – Video course preview: Crawling with different user-agents
Don’t focus on the specifics of algorithm updates – if you’re having to do that, your underlying SEO strategy probably isn’t right. Algorithm updates primarily represent Google overcoming technical hurdles while still driving toward the same end goal.
Golden rule of SEO – there is absolutely no ‘SEO change’ you should do on your site that will make the user experience worse. None. No exceptions.
Ideally, you want just one h1 on the page and it should be descriptive of the page content for the user. Naturally, your page title and h1 will normally be similar.
Canonical tags are not a directive. Do not try and use them on pages that are not similar – Google will just ignore them. I recently confirmed this with an SEO test too.
Bonus – Video course preview: When to use canonical tags
Struggling to get interesting data to make a narrative to get links? Did you know Google has a Dataset Search? You can search for publicly available datasets to get inspiration and save huge amounts of time.
With a reasonable number of results, a ‘view all’ page is optimal over paginated content. Research shows ‘view all’ pages are also preferred by users. Google says: “To improve the user experience, when we detect that a content series (e.g. page-1.html, page-2.html, etc.) also contains a single-page version (e.g. page-all.html), we’re now making a larger effort to return the single-page version in search results.”
If you’re trying to stop content getting indexed, remember not to have it in robots.txt – or the crawler will never get to the page to see your noindex tag!
Bonus – Video course preview: Robots.txt introduction
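To illustrate the trap with a made-up path – if /private-page/ carries a noindex tag, this robots.txt rule guarantees the tag is never read:

```text
# Counter-productive: the noindex tag on /private-page/ will never be seen,
# because the crawler is blocked from fetching the page at all.
User-agent: *
Disallow: /private-page/
```

Instead, remove the Disallow, let the page be crawled, and use a meta robots noindex tag (or an X-Robots-Tag response header) on the page itself.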
Cannibalisation is when you have more than one URL targeting the same intent / key phrase. It is one of the main problems that causes otherwise technically optimised sites with decent content to rank very poorly.
Making a visual crawl map is a fast way to get a bird’s eye view of your content structure and see if you have any problems. One of my favourite tools to do this is Sitebulb. Neither Sitebulb nor Patrick Hathaway compensated me for this post. They just made a really good tool. There’s a trial version to check out.
Bonus – Video course preview: Getting started with Sitebulb
If you want content to rank well over months/years, you need to design your site to link to it from ‘high up’ in your site hierarchy. It’s generally a mistake to post evergreen content in a chronological blog, as it will slowly disappear deeper into your site, more clicks away. If it’s evergreen and always relevant, it should always be prominent.
Google had a bug last week that caused millions of pages to become de-indexed seemingly at random – sometimes even big companies’ home pages. This bug is now fixed – so if it affected you, there is no need to panic, these URLs should resolve automatically. If you’re in a rush (who isn’t), you can speed up re-indexing by submitting the de-indexed URL via your Google Search Console account.
If you discontinue a popular model/product on your e-commerce site, rather than delete the page, update it to explain the product is discontinued and link to the nearest alternative products. This is more helpful to the user and prevents the loss of organic traffic.
A specific ‘keyword density’ is not a thing, so don’t waste your time on it. Apart from the fact that text analysis goes far beyond keyword density and tf-idf, chasing a density means you’re writing for robots and not for humans – and therefore missing the point. The algorithm is only ever trying to describe what is best for humans, so start from there.
www or non-www, pick one! Then redirect (301) one to the other. Did you know that Google and other search engines count URLs with and without www as different (and therefore duplicate) pages?
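If you pick the www version and run Apache, a sketch of the rule (hostname is a placeholder – swap in your own domain and test before deploying):

```apache
RewriteEngine On
# Catch requests to the bare domain...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...and 301 them to the www version, preserving the path
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```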
Despite what they profess, the ‘build your own site’ platforms like SquareSpace and Wix are not optimal for SEO. While they can be great start points, it’s unlikely you’ll get great rankings with those sites. Even bigger platforms such as Shopify don’t allow you to edit your robots.txt file!
Do broken link reclamation. Check server logs or use a tool like Majestic to identify sites that are linking to malformed URLs. Set up 301 redirects for these to reclaim the links and get the extra traffic.
Do not underestimate the power of ranking in Google Images. A huge number of searches are visual, so it is worth making sure your image assets are properly marked up and optimised.
The site: operator in Google is useful to see if you have major indexing problems, for instance if you have a 20 page site but find 5,000 indexed pages – or vice-versa – you have a 5,000 page site but only 20 are in the index. However! It will not give you an accurate count of the number of pages included in the index, so don’t use it to try and measure index coverage!
If you’re using schema, don’t use fragmented snippets, tie them together with @id – e.g. this Article belongs to this WebPage, written by this Author that belongs to this Organisation, which own this Website – build the graph!
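A minimal sketch of what tying snippets together looks like (all names and URLs invented; “Organization” is the schema.org spelling):

```json
{
  "@context": "https://schema.org",
  "@graph": [
    { "@type": "Organization",
      "@id": "https://www.example.com/#organization",
      "name": "Example Ltd" },
    { "@type": "WebSite",
      "@id": "https://www.example.com/#website",
      "publisher": { "@id": "https://www.example.com/#organization" } },
    { "@type": "WebPage",
      "@id": "https://www.example.com/post/#webpage",
      "isPartOf": { "@id": "https://www.example.com/#website" } },
    { "@type": "Person",
      "@id": "https://www.example.com/#author",
      "name": "Jane Author" },
    { "@type": "Article",
      "@id": "https://www.example.com/post/#article",
      "mainEntityOfPage": { "@id": "https://www.example.com/post/#webpage" },
      "author": { "@id": "https://www.example.com/#author" } }
  ]
}
```

Each node references the others by @id, so the separate snippets read as one connected graph rather than isolated islands.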
When doing a site migration, try and change as few things as possible. E.g. if you can do the http → https move first, on its own, do that. It will make it easier to diagnose and fix the root cause of any issues.
If you don’t have a strategy to get people to link to you, it’s going to be almost impossible to obtain competitive rankings. Links are still the life blood of rankings. Here is a recent test example. The site does not rank for years. It gets an influx of links (top graph) and the search visibility shoots up (bottom graph). The site loses links (orange, top graph) and search visibility falls (bottom graph).
Google has just announced both Search and Assistant support for FAQ and How-to structured data. (h/t Andrew Martin). Find out more: https://webmasters.googleblog.com/2019/05/new-in-structured-data-faq-and-how-to.html
“The content comes before the format, you don’t ‘need an infographic’, you don’t ‘need a video’. Come up with the content idea, then decide how to frame it” – Brilliant advice (think I got the quote right) from Stacey MacNaught last night at #SearchNorwich.
Dominating Google is about getting your information in multiple places not just your own sites. Or just making Google think you have 512 arms 🙂
Part of being ‘the best’ result comes with format. Google is bringing AR directly to search results. Your product, in the consumer’s home. Doesn’t get much more powerful than that! Find out more in our podcast.
You can easily noindex pages within robots.txt using: Noindex: /page-dont-want-indexed/ Useful as a band-aid for sites you’re struggling to get changes on. Google does say it does not officially support this usage – but it does work (currently!). Might be helpful for those with long dev queues.
Bonus – Video course preview: Robots vs Noindex usage
Got a showroom? It’s not expensive to get a 360 photo done for your Google My Business and it will help you attract more in-store visitors.
Avoid a common mistake if you’re targeting multiple countries/languages by making sure you use the ‘x-default hreflang’ on your region/language selector page.
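A sketch with made-up URLs – these tags go on every page in the set, and the selector page becomes the fallback for visitors who match no listed locale:

```html
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/">
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/">
<!-- The country/language selector page catches everyone else: -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```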
If you’re using Google Search Console and it looks like data is missing or you are getting “not part of property” errors, be aware – Google classes http, https, www and non-www versions of your site as different properties! Therefore, you need them all added to your Google Search Console, and make sure you pick the right one (the one your site uses / redirects to) when making changes!
Bonus – Video course preview: Adding and verifying Google Search Console
Bounce rate is not a ranking factor. A high bounce rate can be good in some cases, it needs to be taken in context with searcher intent.
You cannot “optimise for Rankbrain” – ‘Rankbrain’ is the name of one component of Google search that specifically deals with queries Google has not seen before using AI to try and understand intent. Rankbrain deals with approximately 15% of queries (around 3,000 a second).
“Google has 200 different ranking factors, each with 50 different variables”. Have you heard this? That’s what we were told almost 10 years ago by Matt Cutts from Google. This is not reflective of how Google works in 2019 and someone saying this to you should raise a red flag – it’s super out of date information!
Make sure you’re only specifying hreflang with one method (on-page, sitemap, headers). I’ve seen numerous problems caused with conflicting tags – so check you’re only using the one method!
Having an empty ‘voucher code’ box as the last step of your checkout can kill your conversion rate as you send people off on a wild goose chase to find one! It’s always worth having a “[brand name] vouchers, offers and coupons” page – it will always rank first and if you have no offers on, you can let people know so they don’t feel they are missing out!
Correctly categorising your business with “Google My Business” is vital to appear for generic map-based searches.
It is worth looking at the last 12 months Analytics data and seeing what pages you have that get no traffic and asking why. It’s a great way to see what your content weak spots are, what needs improving, rewriting or sometimes – just deleting.
Don’t add keywords in your Google My Business name, it can get you penalised.
Google’s cached render of your page is based on the First Meaningful Paint. This means pages with loading screens/elements that persist too long may be cached in that loading state, and Googlebot won’t understand what is on your page.
I’ve mentioned cannibalisation before (many pages trying to rank for the same keywords) and how this can have a drastic impact on a site’s ranking. Well, thanks to Hannah Rampton, you now have a free tool that you can use to check your site for cannibalisation. https://strategiq.co/how-to-identify-keyword-cannibalisation/
The recent Google ‘diversity’ update that limited how many organic results one site can have, usually to 2, does not include ‘special’ results such as rich snippets or Google news etc. This means it’s worth considering what other angles you can use to dominate SERP real estate!
Not sure where to focus first? There are rarely ‘quick wins’ within SEO, but focussing on your content that ranks in positions 3-10 can be the fastest way to gain traffic, as most of it is locked up in those top 3 positions on a regular SERP. You can pull a report like this quickly with a tool like SEMrush (aff).
If you’re really thinking about your audience, their intent and getting people that know the subject to write your content – you don’t really need to worry about what TF-IDF is, or how it works.
Having an all secure site (https not http) using SSL/TLS is a great idea for many reasons – it is also a ranking factor in Google! Secure sites rank better and Google recently said it can be used to settle ‘on the fence’ rankings where most other things are equal.
Sometimes blindly following Google’s advice is not in your best interest (in the short term, at least). Here is Lily Ray demonstrating traffic loss after implementing FAQ schema markup.
When calculating organic traffic at risk when completing a website migration, remember to only calculate from unbranded traffic – it is highly unlikely you’ll lose traffic on brand terms during a migration. In some cases, this can be a significant amount of traffic and spoil your forecasts.
If you don’t have GA access or you inherit a new GA account with limited historical data, you can find historical URLs of a site by replacing ‘example.co.uk’ with the domain you want from the link: https://web.archive.org/cdx/search/cdx?url=example.co.uk&matchType=domain&fl=original&collapse=urlkey&limit=500000
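That query string can be built programmatically if you’re doing this for several domains. A small sketch using only the standard library (the endpoint and parameters are the same ones as in the link above):

```python
from urllib.parse import urlencode

def wayback_cdx_url(domain: str, limit: int = 500000) -> str:
    """Build the Wayback Machine CDX query listing archived URLs for a domain."""
    params = {
        "url": domain,
        "matchType": "domain",   # include subdomains
        "fl": "original",        # return just the original URL column
        "collapse": "urlkey",    # de-duplicate repeat captures of the same URL
        "limit": limit,
    }
    return "https://web.archive.org/cdx/search/cdx?" + urlencode(params)

print(wayback_cdx_url("example.co.uk"))
```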
“Those aren’t my competitors!” – You have both business competitors, who you are likely aware of – and you have search competitors – the ones that rank above you for the keywords you want. These are the people that you’ll be competing with in SEO and you can use a tool like SEMrush to quickly identify which websites overlap with you on how many keywords and which ones.
Name, Address, Phone (NAP) citations are important for local SEO and ranking in the map box. This means having your main business address listed as your accountant’s (common practice in the UK) can be very detrimental to your SEO!
There is no such thing as a ‘duplicate content penalty’. Unless your site is pure spam, you’re not going to be harmed if someone copies a page of yours or if you have some copied content. It may get filtered out of a search result, but you’re not going to get your site penalised.
URLs are case sensitive. This means search engines will consider: mysite.com/pageone and mysite.com/PageOne as different pages! Stick to lowercase where possible for your main, navigable and indexable URLs to make sharing and ranking easier!
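A sketch of the server-side normalisation, assuming (as most sites can) that there are no legitimately mixed-case paths:

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_redirect_target(url: str):
    """Return the lowercase URL to 301 to, or None if no redirect is needed."""
    parts = urlsplit(url)
    # Only lowercase the path; query strings may be case-sensitive values
    lowered = parts._replace(path=parts.path.lower())
    target = urlunsplit(lowered)
    return target if target != url else None

print(lowercase_redirect_target("https://mysite.com/PageOne"))
# → https://mysite.com/pageone
```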
You should not be hiring generalist copywriters to write your content. Competition is fierce and your users (and Google) are looking for genuine expertise and insight – not a rehashed article made from reading 10 others that already exist. Not convinced? It’s spelled out for you in Google’s webmaster advice:
There is a difference between an algorithm update and a penalty. If you lose a lot of traffic or rankings because of an algorithm update, this is not a penalty and there may not be anything you can “fix”. Google is simply evaluating in a slightly new way, to closer match their goals. We’ve done a deep dive into Google penalties on this week’s Search with Candour podcast that you can listen to here: https://withcandour.co.uk/blog/episode-16-all-about-google-penalties
Links to your site from posts on platforms like Facebook and LinkedIn do not help your ranking in Google.
Patents are a good way to get an idea of what might be happening behind the scenes when you are observing results. It is worth keeping in mind that just because Google has a patent for something, it’s not necessarily used in ranking – but it gives you a good idea of what is coming. For instance, here is a patent that builds on Google’s Knowledge Graph / entity model, which tries to attach internet sentiment about that entity as a ranking factor. In essence, to start promoting web results for businesses that people have had good experiences with. Full patent for the geeky.
Paying for Google Ads does not improve your organic Google ranking. I had someone tell me it does yesterday. It doesn’t. It really, really doesn’t.
If you’re using the Lighthouse Chrome extension to audit a site, make sure you launch it within an incognito tab as other extensions can impact results. To do this, there is a switch to ‘allow in incognito’ within Chrome.
Do not claim your Google My Business short URL for the moment! There is a current issue Google is experiencing that is causing Google My Business pages to vanish, as if suspended in many cases when these URLs are claimed. (I think you were affected also Taylor Gathercole?) Technically, it’s not suspended – it is due to CID syncing, but same end effect for the user!
If you’re building a new site, SEO considerations need to happen right at the start. How will you handle the migration? What schema are you using? Which content is evergreen and which is chronological? How are you going to avoid cannibalisation? It’s not a plan to think you can “do the SEO” after the site is built.
Quick and dirty keyword cannibalisation check, use this search in Google: site:yoursite.com intitle:”key phrase to rank for” This will only return results for your website where you have the key phrase you want to rank for in the title. If this returns multiple pages, you may be confusing search engines as to which page you want to rank for this term. Consider consolidating, redirecting or canonicalising as appropriate.
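If you’re checking a list of target phrases, the query is easy to generate in bulk. A trivial sketch (the site and phrase are placeholders):

```python
def cannibalisation_query(site: str, phrase: str) -> str:
    """Build the quick-and-dirty Google cannibalisation check query."""
    return f'site:{site} intitle:"{phrase}"'

for phrase in ["blue widgets", "widget repair"]:
    print(cannibalisation_query("yoursite.com", phrase))
```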
Technical one, so hold your breath! If you see a Google SERP experiment (new type of result), it is possible to share these with other people by finding and manually setting the NID cookie. Here is a tutorial on how to do this: https://valentin.app/nid.html
Straight from Google, “Pages blocked by robots.txt cannot be crawled by Googlebot. However, if a disallowed page has links pointing to it, Google can determine that it is worth being indexed despite not being able to crawl the page.”
Bonus – Video course preview: Testing robots.txt rules
If you’re running A/B tests, do not use the noindex tag on your variant pages. As I put in unsolicited #SEO tip number 29, use a rel=canonical tag. This also means you’ll pick up the benefit of any links your variant pages get. This is 100% what Google advises too.
Almost 25% of all SERPs have a featured snippet, if you’re not tracking them – what are you doing? You can use tools such as SEMrush to keep tabs on the types of SERP features that are appearing in your niche.
Moving pages or migrating domain? The best way to handle redirects in 2019 is at the ‘edge’. This means setting the redirect up at the CDN/edge layer, so it is served from a node close to the user without a round-trip to your origin server.
Less is more when it comes to Local SEO and Google My Business categories. Fewer, more specific business categorisations will get you better results than trying to cover everything.
30% increase in organic traffic after 1 new piece of ‘pillar’ content attracts links. The internet is not short on quantity of content; it’s short on quality. My first ever unsolicited #SEO tip was about how the churn of ‘500 word blog posts’ is normally a waste of time. A great first win for a new SEO client: we got them to invest far more in one piece of content, rather than spreading it out. The result – links, coverage and, after a year of churning out content and seeing no real results, a 30% uplift in organic traffic. Baby steps that will lead to sustained growth now we have buy-in! If you’re on a tight content budget – do less, but do it better!
Remember to NoIndex your dev and staging environments so they don’t end up exposed in search results and potentially mess up your live site rankings. (Bonus tip: remember to make the site indexable when it goes live – just embarrassing otherwise!)
The hreflang tag can be used cross-domain! So if you’ve got your .co.uk your .com.au or even another language site, you can help Google understand the relationship and have them rank better! 🌍
Google use contained programs in a framework called ‘Twiddlers’ which re-rank results. Twiddlers focus on an individual metric, work in isolation from each other and try to improve SERP quality. This means a factor might not be in the ‘core algorithm’, but it might be affected by a conditionally run Twiddler that alters the result. For example, there is a Twiddler that runs on YouTube queries that will improve the result of a matching video if that channel has many videos that are also a close match to the search (implying the channel is specialist).
If you’re planning on doing a site migration, don’t fall for setting an arbitrary deadline or timeframe. Take a look at your Analytics data and plan it so you can do launch in your quietest period. This will help minimise any traffic losses you are likely to take.
Getting a domain with a backlink history can really help give you a kick start. I hear Rob Kerry is launching a tool soon that will make this a lot easier. https://piggy.domains/