
Diggity Marketing SEO News Roundup – August 2020

If you prefer to live life on the cutting edge, this roundup is for you. Several of SEO’s most innovative minds have published pieces this month, and we’ve got them covered here so you won’t miss out.

It starts with some chunky guides. You’ll learn how to get the perfect length for a blog post in any niche, some advice on doing a 100K launch with no ads, plus a neat method you can use to scale keyword research using Python (don’t worry, you can copy the code out of the article).

After that, we’ve got two case studies for you. Learn what the data says about how to discover any website’s traffic, and whether Google is effective at crawling tabbed content.

Google dominates our news section this month. You’ll catch up with announcements about a new paid GMB profile, the death of the structured data testing tool, what Google revealed during their recent congressional hearing, and what was up with all the recent rank fluctuations.

How Long Should A Blog Post Be to Win the SEO Game?

Marta Szyndlar of Surfer brings us this new assessment of which blog lengths work best for SEO. There have been other studies on the question; the article links to several of them (HubSpot, Neil Patel, and Backlinko) in its first section. However, Surfer takes a different approach altogether.

Marta argues that an "ideal length" is worthless to apply to most types of content, because there's evidence that Google prefers different lengths depending on the topic. It's not just the algorithm, either: sometimes searchers want fast answers, and they'll bounce from a long-winded article.

Source: surferseo.com

The article quotes a personal experiment by Matthew Woodward, who culled more than 20,000 words from an article after realizing his own content was many times the length of his competitors'. That action alone took his result from the 7th page to the 1st.

The question becomes: How do you find the sweet spot? This guide recommends a simple competitor analysis process where you average out the right length from the top 10-50 results. You’re going to want to use tools to do it, but the guide covers the manual method first.

That method involves simply clicking through the results, copying all the content (including any comments, because Google considers those part of the count, too), recording the word counts in a sheet, and then averaging them.

Watch out. It’s possible to go wrong with even these simple instructions if you pick the incorrect results. The guide reminds you to try to match intent when considering competitors. Do not average results with a different form or purpose, or results that are apparent outliers.
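If you'd rather script the averaging than do it by hand, here's a minimal Python sketch of the idea. The word counts are made-up examples, and the two-standard-deviation cutoff is just one reasonable way to honor the guide's warning about outliers, not part of Surfer's method:

```python
# Minimal sketch: average competitor word counts while dropping outliers.
# The counts below are hypothetical; substitute the ones you collected
# from the top-ranking pages (comments included).

from statistics import mean, stdev

competitor_word_counts = [1850, 2100, 1720, 2400, 9800, 1950, 2050]

avg = mean(competitor_word_counts)
sd = stdev(competitor_word_counts)

# Discard results more than 2 standard deviations from the mean,
# mirroring the guide's advice to ignore apparent outliers.
filtered = [c for c in competitor_word_counts if abs(c - avg) <= 2 * sd]

print(f"Suggested target length: ~{mean(filtered):.0f} words")
```

With the sample data above, the 9,800-word outlier gets dropped and the suggested target lands around 2,000 words.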

If you are a Surfer user, the following sections detail how you can use the various tools in the suite to do this work. You can sweep up all this information from the SERPs page, or just enter each URL into another tool to see the ideal length.

Our next piece moves out of content with some advice on how to promote just about any product without using ads.

How to Do a $100k Launch with No Ads in a Single Month

This great tweet-thread by Charles Floate covers the methods he used to generate more than $100k over the 31 days that followed the launch of a new product. He did it without paying for a single ad.

The product, in this case, was an eBook. Not everyone has the kind of reputation Charles does. His credibility certainly played a role in driving sales, but the process he used has implications for many types of products—notably how he generated hype.

After optimizing his marketplace and landing page, he ran a promotional campaign built entirely on organic tactics, each broken down in the thread.

All of these tactics can be used alone or combined with the rest. None of them even has a price tag if you handle them yourself. The closest thing to an ad buy here is working with affiliates, and the affiliates only get paid when you do.

The next piece coming up will also appeal to the DIY’ers out there. It’s going to show you how to combine Python and a free Google account to scale keyword research.

Using Python & Google Sheets to Scale Keyword Research for Local SEO

Skyler Reeves of Ardent Growth brings us this new process for saving a lot of time on keyword research. It should be said that this guide may come across as a little complex for newer SEOs, but the process it describes could be valuable for a professional agency.

The guide covers the theory and explanation of the process, along with how it can be used to collect better data, build more accurate predictions, and even automate the tricky parts.

Source: ardentgrowth.com

It starts with a Google Sheet filled with standard crawl data (quickly yanked from a free audit tool like Screaming Frog). In addition to information like URLs, categories, links, and the number of live sessions, you'll be looking at the existing keywords and how they compare to the best ones.
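As a toy illustration of that starting point, here's a minimal Python sketch that loads a Screaming Frog export into a DataFrame before it goes into the sheet. The file name and column headers are assumptions based on a typical export, not the guide's exact setup:

```python
# Minimal sketch: pull standard crawl data out of a Screaming Frog
# CSV export. "internal_html.csv" and the column names are assumptions;
# check the headers in your own export.

import pandas as pd

crawl = pd.read_csv("internal_html.csv")

# Keep only the fields this workflow cares about.
wanted = ["Address", "Inlinks", "Word Count"]
crawl = crawl[[col for col in wanted if col in crawl.columns]]

print(crawl.head())  # paste or upload this into the Google Sheet
```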

The guide doesn't cover how to find the best keywords (it assumes you understand that already). It skips right to assessing your competitors, having you record a handful of data points from the top 3 competing pages for each of your target URLs.

With this data in hand, you can apply a series of provided formulas to rapidly determine what pages need your attention, and how much potential they have for improvement.
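The article supplies its own formulas; as a stand-in to show the shape of that scoring step, here's a simplified sketch. The score below (remaining search volume weighted by distance from position 1) is my own illustrative assumption, not Ardent Growth's actual math:

```python
# Simplified stand-in for the guide's scoring formulas: rank pages by
# how much room they have to grow. Not the article's exact method.

import pandas as pd

pages = pd.DataFrame({
    "url": ["/plumbing", "/hvac-repair", "/electrical"],
    "current_position": [8, 3, 15],      # hypothetical rankings
    "search_volume": [1200, 900, 2400],  # hypothetical monthly volume
})

# Pages already near position 1 have little room left to grow,
# so weight volume by the distance from the top spot.
pages["opportunity"] = (
    pages["search_volume"] * (pages["current_position"] - 1) / pages["current_position"]
)

print(pages.sort_values("opportunity", ascending=False))
```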

Once the sheet is set up (which will take some time and elbow grease), each subsequent analysis takes about 5 minutes. A whole website can be analyzed, and the recommendations justified with data, in an impressively short time.

Sometimes the best research is the data you lift from people already doing the right thing. Competitor analysis is the subject of the first case study we’ll be covering. It’s going to show you how to find out how much traffic any website gets.

Find Out How Much Traffic ANY Website Gets: 3-Step Analysis (With TEMPLATE)

This first case study isn’t a study in itself, but a process for performing mini studies when you need to know a competitor or research target’s traffic.

Robbie Richards takes us through a process that can help you determine how much traffic a target website gets and which pages and keywords drive it.

This process uses SEMrush, but understanding how it works may give you the insight you need to do the same research with the tools you already have at hand. The data is recorded in a spreadsheet template that is provided for download.

First, Robbie argues, you need to analyze a website's traffic based on how the site is monetized. For example, with AdSense or other ad revenue, the goal is driving ad impressions, so you should focus on the top traffic-driving pages.

For an eCommerce store, you need to look at product category subfolders to see where they’re getting traffic with commercial intent. For an affiliate site, you’ll need to dissect traffic by keyword modifiers like “best,” “alternative,” “top.”
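To make the modifier idea concrete, here's a small Python sketch that slices a keyword export by those affiliate modifiers. The keywords, traffic numbers, and column names are invented for illustration:

```python
# Minimal sketch: split a keyword export by affiliate-intent modifiers.
# The keywords and traffic numbers below are invented examples.

import pandas as pd

keywords = pd.DataFrame({
    "keyword": ["best running shoes", "nike pegasus alternative",
                "shoe sizing chart", "top trail shoes"],
    "traffic": [5400, 1300, 880, 2100],
})

# Match whole words so "top" doesn't catch words like "laptop".
pattern = r"\b(?:best|alternative|top)\b"
commercial = keywords[keywords["keyword"].str.contains(pattern, case=False)]

print(commercial)
print(f"Commercial-intent traffic: {commercial['traffic'].sum()}")
```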

Once you know what information you want to track, the guide details a 3-step process for pulling it all together.

The steps are covered using different SEMrush tools, but the article closes with some alternatives: SimilarWeb, Alexa, and Ahrefs have many of the same functions. You'll even get screenshots showing where to find the data in each tool.

Our next case study is also concerned with traffic, but more specifically, with the effect that tabbed content has on it.

SEO Split-Testing Lessons from SearchPilot: Bringing Content Out of Tabs

Emily Potter of SearchPilot brings us this quick case study on the issue of tabbed content.

Tabbed content is content that stays partially hidden until a visitor clicks something like "read more"; accordions and drop-downs are other examples of the style. In the past, SEOs have had trouble pinning Google down on the subject.

Many SEOs prefer this type of content because it makes for much cleaner pages. It’s almost necessary for the mobile versions of many sites.

However, there are some lingering doubts about whether Google is effective at crawling content that’s been tabbed. Some employees, such as Gary Illyes, have (vaguely) suggested that it won’t interfere with crawling. What does the data say?

In a series of tests, tabs were removed from product descriptions. The effect was tested on both accordion content and a set of four tabs.

The results were surprisingly conclusive. In the cases where the tabs were removed, there was a 12% uplift in live sessions, and the change appeared in less than a month.

The researchers noted that the effect was even more pronounced for mobile versions. It’s important to note that this was only one experiment with a specific type of website, but it is also easy to test for yourself on your own sites.
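If you want to run a rough version of this check yourself, the arithmetic is simple. The sketch below just compares session totals before and after untabbing, which is far cruder than SearchPilot's controlled split tests; the figures are invented:

```python
# Rough before/after check, NOT a controlled split test like SearchPilot's.
# Session counts are invented; substitute your own analytics exports.

control_sessions = 14_200  # period with content hidden in tabs
variant_sessions = 15_904  # period with content brought out of tabs

uplift = (variant_sessions - control_sessions) / control_sessions
print(f"Session uplift: {uplift:.1%}")  # -> 12.0%
```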

Now, we’re ready to move on to the latest news. This month, it’s all about Google, starting with its announcement of a new paid tier of GMB.

Google offers ‘upgraded’ GMB profile with Google Guaranteed badge for $50 per month

Google recently announced the launch of a subscription service for GMB. It adds a Google Guaranteed badge of authenticity (certifying that Google considers the business to be real and in good standing) for the cost of $50 a month.

This badge has already existed for a couple of years but was limited to Local Services Ads. The new upgrade allows businesses to list their services as guaranteed even if they're not running ads. Here's how the badge currently appears in the ad program:

The form may change before full implementation. Google has already taken criticism from advertisers who feel that the company is positioning itself as a competitor to its own ad-buying clients.

As new programs are introduced, a lot of the old ones are going away. Google recently announced it was closing the structured data testing tool permanently.

Google Shutters Structured Data Testing Tool

The structured data testing tool is officially being retired in the next couple of months, Google has announced. If you’ve relied on this tool, don’t worry. All of the functions and many more new ones will be part of the Rich Results tool.

The Rich Results tool is not new, but it has picked up many features since it was first introduced. At that time, it only recognized recipes, jobs, movies, and courses. Now, it can adequately assess all types of structured data.

It includes features that help you discover all the search enhancements a piece of markup is eligible for. It will also show you both the mobile and desktop renders of a given result, and let you test either a specific code snippet or an entire URL.

Google was happy to share the last two bits of news with us, but the next story had to be coaxed out by a congressional committee. Let’s look at what we’ve learned from the released document so far.

US Congress Investigation Suggests Google Uses Clicks & User Data In Search

Rand Fishkin has some thoughts on the latest findings from the recent congressional hearing. In this thread, he posted many of the documents and provided some commentary. They offer rare official confirmation (if any SEOs were still waiting for it) that Google uses clicks and other user data in search.

Internal documents also seemed to reveal that Google expressed early concern about the possibility of competition in verticals like travel and local search. They openly discussed the significant competitive advantage that their search data gave them.

While the public confirmation is new, the substance won't surprise most SEOs. Google has vaguely denied many similar accusations, but the trend lines have always pointed toward further consolidation of searches into its own services.

We’ve only seen the beginning of this process, as many other tech companies face accusations of monopoly. These reports may form the origins of a reform movement, and SEOs should watch it carefully to see where it goes.

Speaking of seeing where things go, many of us experienced a massive ranking disruption on August 10th. A few days out, we finally have an idea of what happened. 

That Massive Google Update Was a Glitch & Bug – Search Results Back to Normal

If you saw your ranking data dip and dive all over the place earlier this month, we now know why. It wasn’t an update (as many expected), but a massive glitch. 

Google's John Mueller reached out after several SEOs reported bizarre fluctuations. According to John, the issue had been noticed and fixed, though no additional details were offered.

We’re still waiting to learn what happened, but no one seems to be having problems at this point. Across the board, most signals seem to have returned to normal.

Got Questions or Comments?

Join the discussion here on Facebook.
