The Many Twists and Turns in a Storied History of SEO

Bozboz
25 May 2020

We can’t believe it’s been a little over 26 years since the first search engine launched.
Well, a lot’s changed since then.

Introduction 

In 1993, a recent graduate of the University of Stirling created the web’s first search engine; it was called JumpStation. JumpStation only searched page titles and headings and had no ranking system to speak of. But despite its limited functionality, it changed everything.

That graduate, Jonathan Fletcher, would become known as the ‘father of the search engine’.

At that time, the web was largely made up of academic papers. And being able to search these papers via a keyword search was revolutionary. It was like having an automatically updating Yellow Pages accessible at the touch of a button.

But in 1997, a terrible mistake was made. Google.com launched. 

With it came the culmination of Larry Page and Sergey Brin’s Stanford PhD research project: PageRank.

In the beginning there was PageRank

PageRank was a simple ranking algorithm for the web and, on paper, it was hugely promising. The idea was to reward a site with a higher rank based on how many links it had from other sites (and how many links those sites had in turn).
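
To make the idea concrete, here’s a minimal sketch of PageRank-style scoring in TypeScript (our own simplification, not Google’s actual code - the real thing runs over billions of pages and handles far more edge cases):

    // A toy PageRank: each page splits its score among the pages it links
    // to, plus a small 'random jump' share, iterated until scores settle.
    function pageRank(links: number[][], damping = 0.85, iterations = 50): number[] {
      const n = links.length;
      let ranks: number[] = new Array(n).fill(1 / n);
      for (let i = 0; i < iterations; i++) {
        const next: number[] = new Array(n).fill((1 - damping) / n);
        for (let page = 0; page < n; page++) {
          for (const target of links[page]) {
            // A link passes on a share of the linking page's own rank,
            // so links from highly-ranked pages are worth more
            next[target] += (damping * ranks[page]) / links[page].length;
          }
        }
        ranks = next;
      }
      return ranks;
    }

    // Pages 0 and 1 both link to page 2; page 2 links back to page 0.
    // Page 2 ends up with the highest score: it has the most links in.
    console.log(pageRank([[2], [2], [0]]));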

It was a genius idea - PhD-worthy, in fact. But in isolation, it was open to abuse.

And if PageRank was the victim of that abuse, SEO was the crime.

Search Engine Optimisation became a rapidly growing industry after Google launched; Google’s success allowed the industry to sustain itself.

But if Google hadn’t done it, another search engine would have. And PageRank isn’t the only way to rank content. In fact, Google now uses around 200 ranking factors (that we know of); PageRank is just one of them. Each one has been shaped by common sense - or by the genius of search engineers. Google’s monopoly on search is owed mainly to their early and continuing innovation.

And that innovation is incredibly important to understand.

For over 20 years, Google has worked to refine its ranking factors to make them fairer. All to ensure that the user gets what they want - not the webmaster, the content creator or the ecommerce giant.

But this user-centric approach isn’t completely selfless. 

Google benefits from keeping users online. Because at their heart, they’re a data company and they make money from advertising. The longer they keep users within their ecosystem, the more they understand their intent. And, crucially, the more ads they can serve.

The Old Testament of search

The development of those ranking methods brought with it a responsibility for Google. They knew that if they just pushed users to low-quality websites, or served too many ads, they’d weaken the web. Ultimately, everyone would lose out.

So, what did they do?

Simple. They devised ways to police the web.

At first, Google’s ranking commandments were easy to follow: 

  • Have a crawlable site
  • Use simple common terms
  • Stuff your title with terms
  • Stuff your page with terms
  • Get lots of links from highly-linked sites

These were fairly empowering rules from Google. Each one was a positive action that a site could take. But site owners took advantage. Blackhat SEO rose from the murky depths: methods designed not to improve the content of a site, but to improve its rank.

By 2010, this had become a widespread problem. So, Google unleashed the Panda and Penguin algorithm updates.

And by taking the initial rules to their logical conclusions, they punished sites that:

  • duplicated their content 
  • used the exact phrase they wanted to rank for in links to their site
  • bought links
  • created their own low-quality links (like forum comments)
  • created their own website networks full of links
  • created widgets and toolbars full of links

A famous example of Google’s wrath was Interflora.

In 2013, Interflora was just coming down from the seasonal high of Valentine’s Day. They’d been savvy and employed SEO tactics to buy a horde of blog and local news links. Most of the content behind these links was the same too; it was duplicated, with identical exact-match anchor text.

A few days after Valentine’s, they disappeared from Google’s search results altogether.

So, what happened?

Well, Google had applied a manual action. The rules of the game had radically changed, and Google was cracking down. Sites could still build links and use keywords in content, but they had to be careful.

Follow the rules or feel the wrath:

  • Don’t duplicate content
  • Don’t pay for links
  • Don’t spam low-quality links
  • Don’t fake link networks

This change was drastic, and overnight the SEO industry went into a panic. Google wasn’t empowering sites anymore; it was punishing them.

But nothing lasts forever.

Forgiveness

Google decided to offer everyone an olive branch: sites could redeem themselves using something called the disavow tool. This allowed them to distance themselves from any bad links that had been built up over the years.

It was explicitly stated that disavowing - and then requesting a manual review - was the way to recover from penalties. If a site didn’t have a manual action but had seen its ranks drop after big algorithm updates, disavowal was encouraged too. Essentially, you could now ask Google for forgiveness directly.
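
For reference, a disavow file is just a plain text list - one link or domain per line - uploaded through Search Console. A hypothetical example (the domains here are made up):

    # Lines starting with # are comments
    # Disavow a single spammy page that links to us
    http://spam-blog.example/cheap-links-post.html
    # Disavow every link from an entire domain
    domain:link-farm.example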

But disavowal wasn’t just for forgiveness. 

Prior to this, Google’s algorithm changes had often involved training a machine on a dataset of human assessments of web pages. Google set guidelines and employed cheap labour to rate sites, generating models that the algorithms could interpret and copy. If a human gave certain sites good ‘scores’, then a machine could find the patterns and apply them to any site.

Disavowal was a simpler version of this. Google got huge numbers of people to submit their own scores for what made a bad site - for free. Simpler, cheaper and globally scalable.

But in 2018, the game changed - again. 

Google’s Webmaster Trends Analyst admitted that in most cases disavowal was no longer needed. 

“If we can recognise them, we can just ignore them,” they said of bad backlinks.

And it’s important to recognise just how pivotal that statement is - especially the word “ignore”.

Previously, links could be deemed toxic; they could have a negative impact on your ranks. Now, they were simply worthless. Google had once needed toxic links to carry a penalty, to deter people from building them. But they’d become so good at recognising them - due in no small part to sites disavowing their blackhat links - that the deterrent was no longer necessary.

Cue present day...

The New Testament of search

Google’s now adept at countering most blackhat techniques - without focusing on punishments.

So, what does this mean for Google, sites and users?

Essentially, Google can focus on empowerment again. And this time it’s shifting focus to where it should have been all along...

The user.

With small updates occurring constantly, Google has largely gone dark on when major algorithm updates are released. But it’s not impossible to know what to do.

Google’s 200 (known) ranking signals are all positive factors that sites should work towards.

But where do you start?

It’s simple. And in 2020, Google will even help you do it.

Lighthouse is a tool that Google has built into Chrome. You can run it for free, and it’ll put a single webpage through its paces. The result is a report that covers that page’s technical traits in four areas: Performance, Accessibility, Best Practices and SEO.
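
The easiest way to run it is from the Lighthouse tab in Chrome’s DevTools, but it can also be driven from a Node script. Here’s a rough sketch (assuming Node with lighthouse and chrome-launcher installed via npm; example.com stands in for your own page):

    import lighthouse from 'lighthouse';
    import * as chromeLauncher from 'chrome-launcher';

    async function auditPage(url: string) {
      // Launch a headless Chrome instance for Lighthouse to drive
      const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
      const result = await lighthouse(url, {
        port: chrome.port,
        output: 'json',
        onlyCategories: ['performance', 'accessibility', 'best-practices', 'seo'],
      });
      // Each category is scored from 0 to 1; multiply by 100 to get the
      // familiar Lighthouse score out of 100
      for (const category of Object.values(result!.lhr.categories)) {
        console.log(`${category.title}: ${Math.round((category.score ?? 0) * 100)}`);
      }
      // Shut the browser down once the audit is finished
      await chrome.kill();
    }

    auditPage('https://example.com');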

At Bozboz, we distil all of that history into a simple set of principles:

  1. Let users quickly use your site
  2. Let users easily access your site
  3. Let users easily use your site
  4. Let bots easily crawl your site
  5. Create high-quality user experiences
  6. Research to stay up to date

Using these principles, we help our clients build the best possible experiences for their users. The stuff that’s useful, engaging and keeps them coming back for more.

---

Not sure whether your site is serving users properly? Don’t worry, we can help. Drop us a message or give us a call on 01273 727 581. We’d love to hear from you.
