We’ve all been there. You wake up, flick open Search Engine Journal (or whichever industry news you follow) and those two words hit you like a frying pan to the face:


The logical response? Panic!

I’m kidding, of course, but a lot of people do react this way, and hey, I totally get it. Nobody likes having the ground pulled from under their feet, and that’s how it can feel sometimes.

From Penguin to Panda, from ‘Mobilegeddon’ to Medic; it’s no exaggeration to say that Google updates have defined not only moments but entire eras of SEO. It can be unsettling to find out that things are changing yet again, just when you thought you had it nailed.

The thing is, though, when you’ve been doing SEO for a long time, you start to realise that these updates are not actually pivotal moments at all. Things are always changing in SEO, and these updates are just markers in the sand. If you keep pace with web trends, you’ll already be way ahead of any algorithm update Google can throw at you.

Or to look at it another way, if you ever feel comfortable in SEO, then you’re doing it wrong.

This is a guide to Google algorithm updates based on my experience. I’ll hopefully clear a few things up, take a historical view to get some perspective, and give you a steer for dealing with the fallout of future updates.

Get some perspective and a positive outlook

First of all, let’s tackle the elephant in the room… the conspiracy theories.

Every time an update happens, the tin-foil hat brigade comes out in full force. Swathes of disgruntled site owners take to industry forums and social platforms to enlighten the world about how Google is out to get you. They’ll tell you that every move Google makes is about lowering the quality of organic results to get more people spending on PPC ads.

How do they know? Well, of course, because their website is perfect and their SEO is better than anyone else’s, possibly the best in the whole world, yet for some unfathomable reason Google disagreed with that and took their rankings away. The only logical explanation is foul play, right?

This couldn’t be further from the truth. The people who push this theory tend to be butt-hurt site owners who have been hit by this update (and very likely others in the past). Dig a little deeper and you’ll probably find that their websites are straight from the dizzy design heights of 2013 and their SEO tactics would have been considered spammy in 2011. But rather than admit any fault of their own, they buy into the ‘evil Google’ conspiracy.

You rarely hear people who benefitted from an update talking about how that update made the search results worse. Go figure.
The first (and perhaps the biggest) piece of advice I can offer is to not buy into this way of thinking, even if deep down you believe it, even if you’re a naturally suspicious person. It’s not productive and it’s not conducive to the mindset needed to proactively improve your SEO. The moment you believe the game is rigged against you is the moment you either give up on SEO altogether or worse still, give in to the very worst kinds of black hat SEO.

Ignore the negativity. Take everything anybody says (including me) with a pinch of salt and stay focussed on getting results. This race never ends and you need a winner’s attitude to stay in it!

So what is Google playing at?

Google’s algorithm is constantly being tweaked and changed to achieve a singular goal… to make the organic search results as good as they can be. As technology moves forward, those goalposts move, so they need to stay ahead of the curve.

In fact, Google makes at least one update to their algorithm pretty much every day of the year. It’s just that some of them are bigger than others. The big ones get named (sometimes by Google, sometimes by SEOs) and talked about the most: Panda, Penguin, Pigeon, Possum, Hawk, Fred, RankBrain, Maccabees, Brackets… the list goes on.

Google knows that when people use their search engine they do so for the organic results, not the paid ads. By serving up industry-leading organic results, they keep the majority of the search market on their platform. But, of course, they also know that a small percentage of searchers will, inevitably, click an ad. And that’s how Google makes money.

So yes… it’s about making more ad revenue. They have shareholders and organic search results don’t make them any money. But there will always be enough advertisers willing to buy PPC ads, and the prices of those ads are dictated by market dynamics. Each industry has margins that can be achieved and they define how much people are willing to bid for clicks.

The goal for Google is simply to get (and keep) more people searching. If the organic results are great and get more people using Google, then the knock-on effect is that there will be more people clicking on ads. Generally speaking, for every 5 people who use Google, 1 will click an ad, and that means Google makes some revenue.

So then, the aim of every update is a ‘quality improvement’.

Ok fine, but how does Google determine what qualifies as a ‘quality improvement’?

Now you’re asking the right questions!

If you only take one thing away from this article, it should be that Google looks at every angle to determine ‘quality’. That includes technical, on-page, and off-page ranking factors. In other words, they look at your code, your design, your content, and your marketing, broken down across hundreds of individual signals, each fed into the algorithm to be combined, crunched, and weighted.
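To make that idea concrete, here is a purely illustrative sketch of how many individual, normalised signals might be weighted and combined into a single quality score. The signal names and weights below are invented for the example; they are not real Google ranking factors, and nobody outside Google knows the real ones.

```python
# Purely illustrative: a toy model of combining many quality signals
# into one weighted score. Signal names and weights are invented --
# they are NOT real Google ranking factors.

def combined_quality_score(signals: dict[str, float],
                           weights: dict[str, float]) -> float:
    """Weighted average of normalised signal values (each 0.0-1.0)."""
    total_weight = sum(weights.values())
    return sum(signals[name] * weight
               for name, weight in weights.items()) / total_weight

page_signals = {
    "page_speed": 0.9,       # hypothetical: fast pages score higher
    "mobile_usability": 0.8,
    "content_depth": 0.4,    # thin content drags the total down
    "link_quality": 0.7,
}
signal_weights = {
    "page_speed": 1.0,
    "mobile_usability": 1.5,
    "content_depth": 2.0,
    "link_quality": 1.5,
}

print(round(combined_quality_score(page_signals, signal_weights), 3))
```

The point of the sketch: one weak signal (here, thin content) pulls the combined score down even when everything else is strong, which is why “average” anywhere leaves you exposed.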

If you are neglecting any part of this, settling even for average (let alone sub-standard), then there is an exceedingly high likelihood that, at some point, you will be on the wrong side of an algorithm update.

People put so much emphasis on reacting to algorithm updates, trying to decipher them and to work out which piece of the SEO puzzle was targeted, that they forget the bigger picture. Instead, if you are proactive about improving your user experience, content, and marketing, then you’re unlikely to ever fall foul of an update.

You should know what defines ‘good’ when it comes to websites: speed, mobile usability, content, site structure, relevant marketing, etc. Without wanting to sound obtuse, just do it all right. Don’t get bogged down with this quality signal or that quality signal… just aim for high-quality across the board. That’s really what Google is looking for and every update better enables them to reward it.

Of course, you may not have the time, resources, knowledge, or tools to do this yourself. That’s where a good agency can come in very handy.

If you need help, give us a shout. Auditing websites, getting penalties lifted, and driving positive SEO results is what we’re all about.

A Brief History of SEO

If you want to know where something is going, it helps to know where it has been. By understanding the evolution of Google so far, you start to get a very clear picture of where they are heading:

Google in 1998–2003: Monthly Updates

A long time ago, in the very early days of SEO, Google was pretty static. They would do a monthly data refresh to reorder rankings and that was it. The ranking signals were pretty rudimentary by today’s standards, basically amounting to keyword-matching and link-counting.

In 2003, they moved to an “everflux” model, where data was updated in real-time, which made it easier for SEOs to make changes and see the results almost instantly (once crawled and indexed).

Google in 2003–2011: Everflux & The Spam Years

The next 8 years were mostly about new feature releases (e.g. Google Places, personalised results, Instant Search) and generally making Google faster and more efficient. Examples include the release of the Caffeine algorithm in 2010 (a rewritten and much-improved crawling/indexing infrastructure), XML sitemap compatibility, and real-time search suggestions.

This was a ‘golden age’ for SEO where very basic spammy tactics worked a charm and the rewards were great. When I say basic, I mean we’re talking meta-keywords, keyword stuffing, cloaking, and comment spam. Not so great for users, but wonderful for people selling SEO.

Of course, this didn’t last forever. In 2011 they released the Panda update to target on-page spam, followed by Penguin to target link spam in 2012, and SEO grew up.

Google in 2011–2014: Penguin & Panda Updates

Penguin and Panda were game-changers. Overnight, the SEO conversation pivoted from tricks and techniques to marketing strategy. Updates and penalties dominated the headspace of SEOs.

Some of the old-school SEO tactics still worked, but everybody knew they were a ticking time-bomb. Sure enough, people using these tactics sooner or later found themselves unable to compete. This was the birth of many of the conspiracies we debunked earlier.

During this era, Google would release versions of Panda and Penguin much like a software company might roll out versioned releases. There were major overhauls where the algorithm itself changed in the way it processed data (i.e. ranking signals), and there were minor tweaks and “data refreshes” released in between. For example:

  • Penguin 1.0 – April 2012
  • Penguin 1.1 – May 2012 (data refresh)
  • Penguin 1.2 – October 2012 (data refresh)
  • Penguin 2.0 – May 2013 (algorithm overhaul + data refresh)
  • Penguin 2.1 – October 2013 (data refresh)
  • Penguin 3.0 – October 2014 (algorithm overhaul + data refresh)
  • and so on

If you got hit with a Penguin penalty, you had to fix the problem and then wait patiently for Google to roll out the next version or data refresh before that penalty could be lifted. Famously, the gap between Penguin 2.1 and Penguin 3.0 was over a year long! Website owners with penalties simply had to wait. It was brutal.

This remained the norm until December 2014.

Google in 2014–2016: Return of Everflux

Just in time for Christmas in 2014, Google shifted Penguin to an “everflux” model, with its data refreshed continuously rather than in discrete versions. Over the following couple of years it became a genuinely real-time algorithm, and with Penguin 4.0 in September 2016 it was officially ‘baked in’ to the main, core algorithm.

You could get a Penguin penalty one day, then put the work in to rectify the issues and have it lifted almost immediately.

This soon became the norm, building on the ground-up rewrite Google had completed back in 2013, when everything they had learned up to that point went into creating a more lightweight and real-time algorithm called Hummingbird. That algorithm is still in use today, albeit with some additions made along the way, including mobile usability signals and a switch to mobile-first indexing.

Google in 2016–2018: Mobile & AI

2016 was when the world really woke up to the fact that mobile traffic was the future. Google spotted this early and started moving their entire operation to become mobile-first.

They released a number of updates focussed on site speed and mobile usability, they made their own UI mobile-first (incrementally), and finally, they switched to a ‘mobile-first index’ where rankings are primarily determined by mobile site versions. These caused step-changes in an otherwise fairly fluid and real-time algorithm.

That real-time algorithm landscape was a blessing because you could push right to the limit and correct if you overstepped. Any serious digital marketers who were interested in conversions as well as rankings were already making their sites mobile-friendly and fast. That wasn’t an issue for anyone with any sense.

The real-time algorithm turned out to be a double-edged sword though. The ability for Google to process huge amounts of data in real-time led to an interesting side-effect: it created a perfect environment for machine-learning.

The two most notable examples of machine-learning in Google are:

RankBrain (late 2015): helps Google to better understand search queries it has never seen before, and to select the most valuable content to show based on user interaction metrics such as dwell time.

BERT (late 2019): a pre-trained natural language processing (NLP) model, used to help Google understand the meaning, context, and intent behind search queries and content.

When these machine-learning parts of Google are introduced, get smarter, and are updated, you can’t really ‘optimise’ for them in quite the same way as you can optimise for other ranking signals.

Pagespeed? Sure, make it faster.

Mobile usability? Sure, update the mobile UX.

RankBrain and BERT? If they don’t like what they see, unfortunately, it’s a fairly holistic judgement that has been made, and you need to take a holistic approach to improve the ‘quality’ of your site to turn things around.

And so we come back around full-circle. You need to stop optimising for this update or that update and instead focus on general quality!

Google in 2018–present: The Broad Core Update Era

In August 2018, Google released an update that the SEO industry dubbed the ‘Medic Update’. It seemingly hit ‘your money or your life’ (aka YMYL) sites hardest and coincided with Google updating their search quality rater guidelines, placing more focus and emphasis on ‘E-A-T’ signals (expertise, authoritativeness, and trustworthiness).

People read way too much into that and started trying to optimise their sites for ‘E-A-T’ and a whole industry has even sprung up around it. To this day, the myth persists, largely due to a very old-school way of thinking about SEO. In reality, ‘E-A-T’ is at best a goal, not a ranking signal. It paints a picture of what every site should be trying to achieve, but it’s measured using other, more tangible ranking signals in combination.

That’s key. It’s all about combinations of ranking signals. That’s the way the machine-learning parts of Google see things, and it’s increasingly the way Google approaches algorithm updates in general.

Over the few months immediately after Medic, there were strong signs of several other updates happening, but Google didn’t confirm them. Interestingly though, nobody seemed to be able to pinpoint any specific areas of focus for these updates. Some tried, but few agreed.

Then in March 2019, Google released a big update to their algorithm, confirming it on Twitter, but they flatly refused to give any meaningful information about it. They simply told people to work on overall quality. In fact, they pointed people to those quality rater guidelines that talk about E-A-T.
What Google were saying was, in effect: “Just be good at all of it”.

To drive this point home, they simply called that update the ‘March 2019 Core Update’. Ever since then, a lot of Google’s updates have used this naming convention. People try to analyse them and they’ll say that each one included some combination of ‘a little bit of Penguin’ or ‘an update to Medic’ or ‘a reversal’ of some previous update. But as time goes on, this is getting harder and harder to follow.

This is intentional. Google is doing you a favour by forcing you to see the bigger picture. They are forcing you to take a holistic approach.

Admittedly, for a data-driven control freak like me, it doesn’t do much for the old anxiety levels. But I get it. Once you accept that you simply have to be the best website to get the best rankings, it can actually be empowering!

If you do ever fancy geeking out over the full history of Google’s algorithm, check out this awesome timeline from Moz. It’s almost like reading a story (if you start at the bottom and work up) and helps you to get into their mindset. It’s also pretty handy to have bookmarked as it gets updated with new info all the time.

Ok, you’ve made your point. But I did get hit by a Google update. What should I do?

Ok, fair question. Here are the things you can and should do.

1. Be Patient

Google has a habit of rolling out updates and then reversing them a week or two later, sometimes even a few months down the line. And even if they don’t reverse an update, they will almost certainly tweak it a few times.

Truth be told, they also have a habit of screwing up and then fixing their mess a few days later.

Remember too that Google will often need to recrawl a lot of websites and content before it can process a new or changed ranking signal and apply the results across its entire index. Their link graph won’t refresh overnight. It can take two weeks just for an algorithm to fully roll out.

Don’t make knee-jerk reactions. Let everyone else do that while you watch intently and let the dust settle. Then, when you’ve learned from what others have done, you can make informed decisions.

2. Question Everything

When you read a Facebook post from some ‘guru’ telling you that this update was “definitely Penguin-related”, that might be more an indication of how spammy that person’s backlink profile is and not that accurate a description of the update.

People will find correlations in small data sets, but the reality is there are very few people on the planet who are qualified enough (and have access to enough data) to truly understand what an update might have been targeting – especially in this modern age of broad core updates. Even then, those people will often be running very different SEO strategies to yours. An affiliate SEO or PBN marketer might have lots of data, but that data won’t be meaningful for your online fashion brand or construction company website, for example.

If you do choose to put stock into any of these gurus, give them time. They’ve been known to jump to a conclusion only to backtrack a week later. Data analysis at this scale takes time. Stick to the main industry news outlets where stories and findings are heavily vetted and peer-reviewed, and take everything with a pinch of salt.

3. Do Your Own Analysis

Sure, it’s easy to play the blame game. You can point to mysterious and unattainable “E-A-T” signals, or say that Google has once again furthered their evil plan to get you spending more on PPC, but that’s not going to help you.

Stay calm, stay logical, and look at the winners and losers. When Google promotes websites in search results, they are literally telling you what they are looking for. Look beyond your falling rankings and try to spot the sites making the biggest gains.

What do they have or do that you don’t? Once you know that, you can either start competing on those metrics or, if that’s not possible, look for new keywords where you can!
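To make that winners-and-losers comparison systematic, a sketch like the one below can help. It assumes you have rank-tracking data for a keyword from before and after an update; the site names and positions are invented purely for illustration.

```python
# A minimal sketch of "do your own analysis": given keyword rankings
# before and after an update, list the biggest winners and losers.
# All site names and positions below are invented for illustration.

def rank_deltas(before: dict[str, int],
                after: dict[str, int]) -> list[tuple[str, int]]:
    """Positive delta = moved up the rankings (e.g. position 8 -> 3 is +5).
    Sorted from biggest winner to biggest loser."""
    deltas = [(site, before[site] - after[site])
              for site in before if site in after]
    return sorted(deltas, key=lambda pair: pair[1], reverse=True)

# Hypothetical average positions for one keyword, before/after an update
before = {"site-a.example": 8, "site-b.example": 2, "site-c.example": 15}
after = {"site-a.example": 3, "site-b.example": 6, "site-c.example": 14}

for site, delta in rank_deltas(before, after):
    print(f"{site}: {delta:+d}")
```

Run this across your whole keyword set and the consistent winners are the sites worth studying: whatever they share is your best clue to what the update rewarded.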

4. Accept Some Responsibility

The reality is that the issues you’re facing are probably much closer to home than you would like to admit. But you have to face the facts.

When was the last time you did a full SEO audit? I don’t just mean a crawl report; I mean a full, sleeves rolled up, digging deep, hands in the mud, foundation to ceiling, technical, on-page, UX, and off-page SEO audit.

Rather than waiting around for other people to find answers for you, get stuck in. If you find problems, create strategies to tackle them. Even if they are not the particular focus of this recent update, they still need fixing.

Remember, in today’s algorithm landscape, the updates are holistic so your optimisation should be too.

If you need help with an SEO audit, penalty analysis, or ongoing ranking improvements, please get in touch and we’d be happy to help.