Parsing Google Fred and Other Quality Updates: How to Prepare or Recover

Just when you think you have everything under control, Google makes another change. If you are involved in any form of SEO, you know how daunting it can be to keep up with Google algorithm updates. Your heart stops every time you read a headline that says something to the effect of “Google Rolls Out Algorithm Update!” You are also hesitant to log into Twitter out of fear that Google has been at it again and everyone is tweeting about it. From the Mobile First Index to the Owl update to Google Fred, the tide is always shifting.

It is important to step back a moment and take a look at what Google is trying to do.

Google is focused on the experience that users have when they use its search engine.

What does that mean to us? Nearly every change that Google has made involves quality in one way or another. While I could create an entire article series that would run for weeks, if not months, about Google updates, the focus of this article is on Google Fred.

Note: Although Google Fred occurred around March 7th or 8th, it was again a topic of conversation at the recent SMX Advanced conference. That is when Gary Illyes, a Google webmaster trends analyst, took the stage for an AMA session and answered the audience’s questions. As expected, Fred came up in the conversation.

What We Know About Google Fred
In March 2017, there was a lot of chatter among SEOs and webmasters regarding rankings and traffic. Initially, Google didn’t say much about the update, neither confirming nor denying it. However, Illyes did say that the search engine is constantly making updates and stated that from then on, he would call every update Fred. About two weeks after Google Fred occurred, the search engine confirmed that there had indeed been an update. At that time, the conversation was very similar to what it was just a couple of weeks ago at SMX Advanced. Illyes directed the audience to the quality section of Google’s Webmaster Guidelines for anyone who wanted to know what Fred targeted.

If it has been a while since you visited the quality guideline section, here is a verbatim list of what is covered:

  • Automatically generated content
  • Sneaky redirects
  • Link schemes
  • Cloaking
  • Hidden text and links
  • Doorway pages
  • Scraped content
  • Affiliate programs
  • Irrelevant keywords
  • Creating pages with malicious behavior
  • User-generated spam

From what I have seen after this update, the sites that have been negatively impacted had poor quality links or content.

Is It Really Anything New?
Even before Google rolled out the Panda and Penguin algorithm updates years ago, which caused waves in the SEO world, its quality guidelines were already in place. Those updates finally added “teeth” to the warnings Google had been giving. What Google has done over the years is lower its tolerance for what is considered low quality. Just the other day I had a conversation with an SEO company that had burned a client’s domain. Their excuse? Google changed and didn’t like the backlinks pointing to the website. My response: no, Google didn’t suddenly develop a dislike for spam. Those links were likely always bad.

What Fred (and Other Quality-Related) Updates Mean to You
Whether it is Fred or another update, there are activities you should be doing on a regular basis to ensure your website is “quality” in the eyes of the search engines, including backlink audits and content audits.

First and foremost, start with monitoring your website traffic in comparison to known Google updates. Doing so will help you keep an eye on how these updates might be impacting your website. A great tool for this task is the Panguin Tool. It is offered for free and is easy to connect to your Google Analytics account. The tool matches up each update with your existing Google Analytics organic sessions, as shown below. Each line is color-coded to the named update.

Don’t rely solely on this chart to determine whether you were impacted by a Google update, but it at least points you in the right direction so you know which areas to explore further. For example, if your traffic dropped around a Penguin update, start with a backlink audit, but also explore other possibilities, such as a change to your CMS or a seasonal sales cycle.
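If you want to sanity-check a suspected hit with your own data, a short script can compare traffic in the windows before and after an update date. This is a minimal sketch, assuming you have exported daily organic sessions from Google Analytics into a date-keyed dictionary; the function name and the 14-day window are my own illustrative choices, not a standard tool:

```python
from datetime import date, timedelta

def impact_ratio(daily_sessions, update_date, window_days=14):
    """Compare average daily organic sessions in the windows before
    and after an update date. A ratio well below 1.0 suggests the
    update (or something else around that date) is worth investigating.
    Returns None if either window has no data."""
    before = [daily_sessions[update_date - timedelta(days=d)]
              for d in range(1, window_days + 1)
              if update_date - timedelta(days=d) in daily_sessions]
    after = [daily_sessions[update_date + timedelta(days=d)]
             for d in range(1, window_days + 1)
             if update_date + timedelta(days=d) in daily_sessions]
    if not before or not after:
        return None
    return (sum(after) / len(after)) / (sum(before) / len(before))
```

For example, running this with Fred’s rough date of March 8, 2017 and seeing a ratio around 0.5 would mean organic sessions dropped by about half after the update, which is your cue to dig into the quality guidelines above.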

Backlink Audits
There has been a lot of talk about how Google now views spammy links, but regardless, you should know what websites are linking to yours. A backlink review should be on your monthly SEO maintenance list, meaning you are using a tool, such as Moz, Ahrefs or Majestic, to monitor your backlinks. Google Search Console and Bing Webmaster Tools are also great resources for links. Make it your goal to compile as large a list as you can, and as you review it, mark the links you will remove. Bruce Clay created a free crowdsourcing tool you can use to see what other people are disavowing. The site is worth checking out: Disavowfiles.com.
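Once you have marked the links you want gone (and exhausted removal requests), the last resort is a disavow file. Here is a minimal sketch of turning your flagged URLs into that file; the function name is my own, but the output follows Google’s documented disavow format, where `domain:` lines disavow a whole domain and `#` starts a comment:

```python
from urllib.parse import urlparse

def build_disavow_file(flagged_urls, disavow_whole_domains=True):
    """Turn a list of backlink URLs marked for removal into the text
    of a disavow file. Google's format accepts full URLs or
    'domain:example.com' lines; lines starting with '#' are comments."""
    lines = ["# Disavow file generated from monthly backlink audit"]
    if disavow_whole_domains:
        # Collapse individual URLs down to unique domains.
        domains = sorted({urlparse(u).netloc for u in flagged_urls})
        lines += [f"domain:{d}" for d in domains]
    else:
        lines += sorted(set(flagged_urls))
    return "\n".join(lines) + "\n"
```

Disavowing at the domain level is usually the safer call for truly spammy sites, since the same network tends to link from many pages.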

Content Audits
From my experience, most people are already aware that they should be auditing backlinks. Yet, they are not as familiar with auditing content. Not only can a content audit help you create a full inventory of pages on the website, it can also help you determine if pages should be revised, removed or consolidated. If you want to streamline your content audits, consider using URL Profiler. It crawls the URLs you provide and reports a variety of URL-specific metrics by pulling from Google Analytics, Google Search Console, Moz, Majestic, Ahrefs and other data sources. Be prepared to wait a while; depending on the size of the website, it has taken me upwards of five hours to run a crawl through URL Profiler. What you are left with is a wealth of metrics that you can sort, allowing you to identify weak pages. When running an audit, I check the areas reflected in the URL Profiler screenshot below.

If the website is fairly large, start with the pages that are most important to your company or client, and work your way down the list. I set up a custom sort in Excel that looks like the following: Sessions (lowest to highest) > Time on Page > Exits > Exit Rate. In addition, I look for duplicate content and low word counts. I always include a notes column in the spreadsheet to detail why a page was flagged as weak and the recommended action steps.
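The same triage can be scripted if your audit export is too big for comfortable spreadsheet work. This is a rough sketch, assuming each page is a dictionary of metrics; the key names, the ascending secondary sort keys, and the 250-word “thin content” threshold are my own assumptions for illustration:

```python
def rank_weak_pages(pages, min_word_count=250):
    """Sort a content-audit export using the order described above:
    Sessions (lowest to highest), then Time on Page, Exits, Exit Rate.
    Adds a 'notes' field explaining why a page looks weak."""
    ranked = sorted(pages, key=lambda p: (p["sessions"],
                                          p["time_on_page"],
                                          p["exits"],
                                          p["exit_rate"]))
    for p in ranked:
        notes = []
        if p["sessions"] == 0:
            notes.append("no organic sessions")
        if p.get("word_count", 0) < min_word_count:
            notes.append("thin content")
        # Pages with no automatic flag still deserve a manual look.
        p["notes"] = "; ".join(notes) or "review"
    return ranked
```

The weakest pages surface at the top of the list, which mirrors starting your manual review at the lowest-session rows in Excel.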

Be Prepared for What Comes Next
Sure, we don’t know what changes might be coming our way next, but if our focus matches the search engines’, meaning a high-quality website, we will be in much better shape. You can read more about what Google considers to be quality in my previous article, A Look at Google’s ‘Quality Rater Guidelines’ Over Time: How to Put Information Into Action.

Mindy Weinstein is the founder and president of Market MindShift, as well as a national speaker, trainer and digital marketing strategist. She teaches part-time at Grand Canyon University and has been a search geek since 2007.
