Originally published on: http://feedproxy.google.com/~r/WordStreamBlog/~3/NVbzIPQmMnU/google-fred-update
On March 7, 2017, webmasters, site owners, and search engine optimization professionals began to panic. Some sites had seen a massive decrease in traffic seemingly overnight. Frantic SEOs tried to reassure their clients, some of whom had seen a drop in traffic of up to 90%.
This was how most folks learned of the latest major Google algorithm update that came to be known as the “Fred” update, the name jokingly given to it by Google Webmaster Trends Analyst Gary Illyes.
Google constantly makes adjustments to and updates its search algorithms throughout the year. Few updates, however, have the kind of immediate impact that Fred did. Now, almost six months after Fred rolled out, some sites are still struggling to regain the traffic they lost – some, but not all.
In this post, we’ll be taking a detailed look at the Google Fred update. We’ll examine the immediate and longer-term impacts of the change and its intended purpose, as well as what you can do to plan for and mitigate potential declines in traffic resulting from future updates.
What Was the Google Fred Update?
The Fred update was an adjustment to Google’s search ranking algorithms that was implemented on March 7, 2017. Initially, Google chose not to announce the update. It was this lack of forewarning that prompted many SEOs and webmasters to fly into a panic when they examined their analytics data on March 8.
To date, Google has yet to officially confirm the Fred update, even after fallout from it spread throughout the search marketing media ecosystem.
What Was the Purpose of Google’s Fred Update?
Unsurprisingly, Google is reticent to reveal the motivation behind the Google Fred update, just as it has been with previous major updates such as the Hummingbird update that was deployed in September 2013. Many SEOs believe, however, that the primary factor behind Fred was quality – specifically, how aggressive monetization tactics used by some sites were, in Google’s view, negatively impacting the experience of its users.
Image via Tydal Wave Creative
Which Sites Were Hit Hardest by Fred?
In the wake of Fred, many SEOs began exploring which sites had been hit the hardest by the update. Although Google has yet to confirm any of the speculation regarding its latest algorithm change, there’s some compelling evidence to support the theory that Fred was designed to penalize sites that were prioritizing monetization over user experience.
In an analysis of 100 websites affected by the Fred update, Barry Schwartz found that the majority of the sites he studied shared similar characteristics, namely that they were all primarily content-driven and featured aggressive advertising placement.
Image via The Verge
Obviously, these criteria could apply to millions of websites, but millions of sites weren’t decimated by the Fred update. So what happened? Well, the evidence certainly seems to suggest that sites prioritizing monetization over the user experience were hit hardest, specifically sites with very heavy advertising inventory and thin, questionably useful content.
Schwartz’s theory about the connection between the Fred update and aggressive monetization held water. In his write-up of his research, Schwartz noted that webmasters of several sites reported subsequent gains in traffic after removing or significantly reducing their ad inventory, reinforcing the connection between sleazy advertising practices and traffic penalties.
Interestingly, though, not all webmasters reported drastic reductions in traffic following the rollout of the Fred update. Some actually saw remarkable gains in traffic – with some experiencing increases of more than 100% after the rollout, as Glenn Gabe reported in his write-up at G-Squared Interactive.
The screenshots above are from Google Analytics data taken from one of Gabe’s clients. Rather than reporting a decrease in traffic, this site owner saw a remarkable increase in traffic of 125% overnight.
Images via Glenn Gabe/G-Squared Interactive
What’s really interesting, though, is the fact that this major spike in traffic wasn’t the result of SEO work done on or around March 7, 2017 – it had been done quite some time before Fred was rolled out. This suggests, as Schwartz noted in his analysis, that Fred wasn’t a dramatic overnight change, but rather a tightening of algorithmic adjustments that were already in place.
What Kind of ‘Aggressive Monetization’ Was Hit by Fred?
Few analysts – particularly working SEOs – have been forthcoming about the precise kinds of sites hit hardest by the Fred update. Naming-and-shaming sites with terrible advertising isn’t exactly conducive to professional self-preservation, after all. Some SEOs have, however, outlined some of the issues their clients experienced following the rollout of Fred, which should serve as a cautionary tale to site owners who are tempted to cut corners for the sake of revenue.
Barry Schwartz listed a sampling of 20 or so unique URLs that owners of sites hit hard by Fred had shared with him for research purposes. Schwartz was quick to point out that not all of the sites shared with him (privately or otherwise) emphasized monetization over user experience – but many did. Several of the sites listed in Schwartz’s write-up were portal sites, as seen below:
An example of a portal site that was negatively impacted by the Google Fred update, via Barry Schwartz
Poor-Quality Backlinks Targeted by Fred
Evidence suggests that it wasn’t just excessively heavy ad placement that may have resulted in traffic penalties when Fred rolled out. According to some research, poor-quality backlinks were also a target.
Both StatCounter and TechWyse reported that poor-quality backlinks were a commonality across many sites hit hard by the update. This means backlinks from low-authority domains, out-of-date or broken links, and downright “unnatural” links were all fair game for Fred – so much so that some SEOs believe link quality to be the defining characteristic of the entire update.
Link profile metrics for the WordStream blog taken from Moz’s Open Site Explorer
How to React to Fred – And Future Google Updates
Whether as a result of poor-quality links, excessive ad placement, thin content, or a combination of all three, Fred was hugely disruptive to many websites. Of course, this could be viewed as a good thing for users, which, in turn, is good business for Google.
Living in perpetual fear of Google’s whims has long been the norm for SEOs, but there are several steps you can take to minimize the risk that your site will be hit by future algorithmic changes. If you’re worried about the potential impact of Google updates, here are some measures you may want to consider.
Focus on the Overall Quality of Your Site
One of the most fascinating things about Google Fred is how some sites that were hit pretty hard by the update managed to recover.
In his write-up, Glenn Gabe gave three examples of sites that were affected by Fred in some way: one positively, one negatively, and one that occupies what Gabe calls “the grey area,” a middling position in which a site may not be quite where it needs to be but has also shown improvements.
Check out this screenshot of traffic data from Google Analytics for the example site in Gabe’s grey area:
Image via Glenn Gabe/G-Squared Interactive
As you can see, the site took a substantial hit when Fred rolled out. However, because the site owner had been focusing on improving the overall quality of the site per Gabe’s recommendations (publishing only quality content, fixing UX problems, and addressing site-wide technical SEO issues), the site actually managed to recover almost completely from the initial hit.
I’d strongly recommend you read Gabe’s post to learn more about how this site recovered.
Conduct a Thorough Site Audit
Before you freak out (note: this step is optional), it’s vital that you understand precisely what factors could put you at risk of penalization under future algorithmic updates. This means conducting a thorough, comprehensive audit of your entire site. Yes, every last page.
Image via netsmartz
There are two main ways to accomplish a site audit for SEO: you can do it yourself, or you can hire an SEO professional. The DIY route is very attractive for obvious reasons. It can save you plenty of money, and shields you from any potential indignities if you’ve been engaging in questionable SEO techniques. Going it alone can, however, be a major time-sink – and if you’ve never done an SEO audit before, you may end up with inaccurate or just plain misleading results.
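If you do go the DIY route, even a simple script can surface the most obvious problem pages. Below is a minimal, hypothetical sketch in Python that flags “thin” pages by word count. The 300-word threshold is an illustrative assumption, not a Google-confirmed cutoff, and in practice you’d feed it text extracted from a crawl of your own site.

```python
# Minimal thin-content audit sketch. The word-count floor is an
# assumed heuristic for illustration, NOT an official Google threshold:
# it simply flags pages worth a manual review (improve, merge, or remove).

def find_thin_pages(pages, min_words=300):
    """pages: dict mapping URL -> extracted page text.
    Returns a list of (url, word_count) tuples for pages under
    min_words, sorted thinnest-first."""
    thin = []
    for url, text in pages.items():
        word_count = len(text.split())
        if word_count < min_words:
            thin.append((url, word_count))
    return sorted(thin, key=lambda pair: pair[1])

if __name__ == "__main__":
    # Hypothetical example pages (URLs and contents are made up)
    sample = {
        "https://example.com/guide": "word " * 800,
        "https://example.com/tag/widgets": "word " * 40,
        "https://example.com/about": "word " * 120,
    }
    for url, count in find_thin_pages(sample):
        print(f"{url}: {count} words")
```

Pair the word counts with analytics data (bounce rate, time on page) before deciding a page is truly thin; a short page that fully answers a query is not the problem Fred targeted.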
Only Publish Top-Quality Content
By now, this should go without saying, but if your site is still relying on the “quantity over quality” publishing model, it’s time to radically reevaluate your approach.
Although Google’s ranking signals remain one of the most closely guarded secrets in the world, we do know that Google rewards sites that offer strongly relevant, actionable content. The more relevant and useful a site’s content is, the better the user experience will be, which is what Google wants – for users and advertisers.
Producing top-quality content consistently is one of the greatest challenges any marketing team can face. The time and financial commitments required to maintain a regular editorial calendar are considerable, but the benefits can be even greater. Whether you’re aiming to publish content daily (as we do here at the WordStream blog) or once per week, it’s crucial that you focus on producing the very best content you can as consistently as you can.
To help you accomplish this, here are some resources you may find useful:
Our 13 Best Content Marketing Tips… Ever!
The Best Content Marketing Tools for Creation, Promotion, Syndication & More
What Is Long-Form Content and Why Does It Work?
The Seriously Comprehensive Guide to B2B Content Marketing
SEO Basics: Complete Beginner’s Guide to Search Engine Optimization
Clean Up Your Link Profile
Whether or not link quality was the defining factor in the Fred update, cleaning up your link profile should be high on your to-do list in preparation for future Google updates.
A great place to start evaluating your link profile is by using Moz’s Open Site Explorer. Available as a free tool and on a subscription basis for more advanced functionality, OSE can tell you which of your links are strong and from reputable sources, and which ones are toxic and should be excised. Eliminating broken and low-quality links protects you from potential penalties, and also makes your site more user-friendly – and that’s always a good thing in Google’s all-seeing eyes.
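Once you’ve identified toxic inbound links you can’t get removed at the source, Google’s disavow tool accepts a plain-text file with one `domain:` entry or URL per line and `#` comment lines. Here’s a small sketch that assembles such a file; the spammy domains and URLs in the example are made up, and deciding which links actually belong in it is the judgment call the script can’t make for you.

```python
# Build a disavow file in the plain-text format Google's disavow tool
# accepts: one URL or "domain:example.com" entry per line, plus "#"
# comment lines. The example domains/URLs below are fictional.

def build_disavow_file(domains=(), urls=(), note=""):
    lines = []
    if note:
        lines.append(f"# {note}")
    # Disavowing a whole domain covers every link from that domain.
    lines.extend(f"domain:{d}" for d in sorted(set(domains)))
    # Individual URLs disavow only that specific linking page.
    lines.extend(sorted(set(urls)))
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    content = build_disavow_file(
        domains=["spammy-linkfarm.example", "paid-links.example"],
        urls=["http://blog.example/low-quality-post"],
        note="Toxic links identified in site audit",
    )
    print(content, end="")
```

Upload the resulting file through Google Search Console’s disavow links page; treat disavowal as a last resort after link-removal requests have failed.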
Avoid Oversaturation of Online Ads
Remember back in the good old days when connecting to the internet meant enduring several minutes of the anguished, near-deafening cries of a telephone being murdered, and a website’s content was virtually indistinguishable from the horrifically ugly banner ads surrounding it?
Yeah, that’s not really a viable approach anymore.
Cramming every spare pixel of real estate on your site with ads might be temporarily effective in driving short-term revenues, but it also tells Google that you’re more concerned with making money than providing useful, relevant content for your users. This not only makes your site significantly less appealing to prospective customers – harming your conversion rates – but also paints a huge target on your back.
By all means use any and all ad formats you can to monetize your website, but tread carefully. The more ads you feature on your site, the greater the risk you could be penalized by future changes to Google’s search algorithm.
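If you want a rough sense of how ad-heavy a page is, you can compare the number of ad slots to the amount of body text. The sketch below is an illustrative heuristic only, not any Google-confirmed metric, and the markers it matches (iframes and elements with an `ad` class) are assumptions you’d adjust to however your own templates mark ad units.

```python
# Rough ad-density heuristic (illustration only, NOT a Google metric):
# count ad-slot markers versus visible words on a page. The markers
# matched here -- iframes and elements with an "ad" class -- are
# assumptions; adapt them to your own site's templates.
from html.parser import HTMLParser

class AdDensityParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.ad_slots = 0
        self.words = 0

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class") or ""
        if tag == "iframe" or "ad" in classes.split():
            self.ad_slots += 1

    def handle_data(self, data):
        self.words += len(data.split())

def ad_density(html):
    """Return (ad_slots, words, ad slots per 100 words)."""
    parser = AdDensityParser()
    parser.feed(html)
    per_100 = (parser.ad_slots / parser.words * 100) if parser.words else 0.0
    return parser.ad_slots, parser.words, per_100
```

Whatever threshold you settle on, the trend matters more than the absolute number: if each redesign pushes the ratio of ads to content upward, you’re drifting toward exactly the profile Fred appeared to punish.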
Ready, Freddy, Go
Google updates its algorithms as often as multiple times per day. Most of these changes are minor tweaks and updates that even the most eagle-eyed SEO wouldn’t notice. That’s what makes large updates like Hummingbird and Fred all the more newsworthy and crucial to prepare for.
Hopefully your site or those of your clients weren’t too badly affected by Fred. If they were, you’ve undoubtedly already taken some action to rectify matters. If not, it’s important to remember that sleazy SEO tactics that may have worked in the past aren’t likely to remain viable strategies for long.