This document provides an overview of recent changes in SEO, including several major Google algorithm updates that have occurred over the last few months. It discusses topics like rewarding expertise, authority, and trustworthiness (EAT); the importance of site speed; and how to recover from algorithm drops. The document also mentions updates to rel=prev/next, Moz Domain Authority, mobile-first indexing, richer results in search engine result pages, and increased focus on images, video, and visual search.
2. Private and confidential
I’m Adam
• 18 months client side - PPC, SEO, Web Dev
• Joined Impression in summer 2017
• Strategy and planning across all areas of digital marketing.
3. Our story
Founded in November 2012, Impression
has grown to be one of the UK’s premier
agencies, working with brands across the
globe to drive higher returns on their digital
marketing investments.
Our combined experience across in-house and agency roles means
we’re well placed to understand how you work. We built our agency
in response to what we believe were ineffective practices we’d
witnessed amongst other agencies and we strive to do better,
delivering tangible ROI every day.
hello@impression.co.uk
4. What we do
We help businesses just like yours to make more money by
generating new leads and making more profitable online sales.
SEO
CONTENT
PPC
CRO
DIGITAL PR
ANALYTICS
5. Today...
Algorithms, tech SEO, SERPs, Page Speed, Moz DA, JavaScript, Image Search, Googlebot and beyond...
6. Algorithm Updates
• Biggest update in years
• Others over the last two months
• High volatility in SERPs
“Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes.”
GOOGLE SEARCH LIAISON
https://twitter.com/searchliaison
7. Florida 2
• Broad core update
• Similarities to Florida
• Rewarding EAT
“As with any update, some sites may note drops or gains. There’s nothing wrong with these pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded…”
GOOGLE SEARCH LIAISON
https://twitter.com/searchliaison
8. Florida 2 - How to recover
• Secure your site
• Site Speed
• Focus on EAT
• Trust Signals
9. Google will always place emphasis on the following:
• Great websites
• Great content
• Great promotion
Whatever happens with the algorithm...
10. rel=prev/next
• rel=prev/next is no longer used as an indexing signal
• Used to highlight the order of paginated pages
• Google can figure out page sequencing for itself
• Hints towards SPAs and JS?
“As we evaluated our indexing signals, we decided to retire rel=prev/next. Studies show that users love single page content…”
“Know and do what’s best for *your* users!”
GOOGLE WEBMASTERS
https://twitter.com/googlewmc
11. Moz DA Update
• DA has always been scored out of 100
• The update expanded the statistical variance of DA as a relative metric
• It’s important to remember that DA is both relative and an estimated emulation of Google’s ranking algorithm.
“Google doesn’t use Moz’s Domain Authority as a Ranking Factor”
RUSS JONES
https://twitter.com/rjonessx
12. Mobile First
• The project resulted in a faster web
• The slowest 33% of sites got 15%-20% faster
• 200m URLs submitted over 1bn times to Google PageSpeed Insights
13. SERP Features
• Increase in richer results in SERPs
• The presence of images, videos and rich snippets all increased
• Implications for digital marketers?
14. FAQs, Q&As and How-tos
• Schema mark-up
• Enables rich snippets and could lead to increased CTR
• Aids Google Assistant
• But how does this benefit the business?
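As a minimal sketch of the mark-up behind FAQ rich snippets: FAQ results are driven by schema.org FAQPage JSON-LD embedded in the page. The question and answer text below are invented for illustration, and `buildFaqSchema` is a hypothetical helper, not part of any library:

```javascript
// Sketch: build schema.org FAQPage structured data, the JSON-LD that
// powers FAQ rich snippets. Question/answer content is invented.
function buildFaqSchema(faqs) {
  return {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map(({ question, answer }) => ({
      "@type": "Question",
      name: question,
      acceptedAnswer: { "@type": "Answer", text: answer },
    })),
  };
}

const jsonLd = JSON.stringify(
  buildFaqSchema([
    { question: "Do you offer SEO audits?", answer: "Yes - on request." },
  ]),
  null,
  2
);

// Embed the result in the page head or body as:
// <script type="application/ld+json">…</script>
console.log(jsonLd);
```

The same pattern applies to Q&A and How-to mark-up, swapping `FAQPage` for the relevant schema.org type.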
15. Google gets a new crawler
• “Evergreen” crawler will run on the latest version of Chromium at all times
• This might mean that Google can crawl modern sites more easily
17. Image Search
• Image Search being talked about more and more by Google
• Search is becoming more visual
• Paid placements muscling in on image SERPs
19. “SEO Flatliners”
• Highly curious anecdotal story about a month-long penalty
• 1 month - to the minute
• Be prepared for turbulence
As is typical with discussions about the SEO industry, we’re going to start by discussing algorithm updates. Normally, I don’t like to focus on algorithm updates too much. We are very rarely told anything concrete about them and, as I’ll get to later on, there are certain things that will never change, regardless of how the algorithm is updated.
However, this time round, we are coming off the back of some of the biggest updates to the algorithm in a long time and I simply can’t get by without mentioning them in quite some detail.
Unofficially named. The original Florida update in 2003 was rolled out to fight spammy content; Google was trying to position itself as the undisputed leader at a time when the search engine market was much more competitive. Florida 2 is similar, and named accordingly, in that it does not focus on any one market but places emphasis on EAT quality: Expertise, Authoritativeness and Trustworthiness.
How to “recover” - I say recover here in very strong sarcastic air quotes because, if your site has been “hit”, I don’t see it as “recovering”. If your site has been hit then, according to Google’s guidelines, public or otherwise, there are competitors out there answering queries better than you are and providing a better user experience in some way.
Firstly, and this is just general best practice, please have a secure site.
Increase your site speed. Google is as keen on this as ever. I covered previously how you can go about that in particular, but grab me at the end if you have specific questions on it. I’ll talk a little later about how proud Google is of the work they have done on site speed.
Now, analyse your site’s content, particularly on your most important and/or content-heavy pages. Is there any way you can improve the experience on those pages for your users? E.g. category page descriptions.
I stole this slide from a colleague’s talk from a while back - and that just goes to show that the principles stand the test of time. Google exists to deliver the best answers to its customers’ queries. It will determine this based on what it believes the customer wants. Google is pretty clever; a lot of the time, it knows what the customer wants. Within your sector, industry or niche, take a look at what the best websites are doing and do it better. No ranking algorithm will ever punish high-quality web builds, high-quality content and high-quality promotion.
So algorithm changes are important, they are worth keeping an eye on to learn from, and they do sometimes tell us what Google is prioritising.
Google announced that, actually for some time now, they haven’t relied on rel=prev/next to navigate properly from page to page. Their crawlers are intelligent enough to understand which page comes next through internal linking. This isn’t to say that rel is now redundant, as it can of course make navigating site content easier for users. However, in the same announcement, Google alluded to the benefit of serving content on one page. Whilst this doesn’t necessarily mean the promotion of SPAs and JS frameworks, it coincides with another crawler-based announcement and signals a generally more favourable outlook for JS builds.
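For reference, the retired mark-up in question is a pair of `<link>` tags in the page head. A small sketch (the base URL is invented, and `paginationLinks` is just an illustrative helper) that generates those tags for a paginated series:

```javascript
// Sketch: generate the now-retired rel=prev/next <link> tags for page
// `page` of a `totalPages`-page series. The base URL is invented.
function paginationLinks(baseUrl, page, totalPages) {
  const tags = [];
  // First page has no "prev"; last page has no "next".
  if (page > 1) tags.push(`<link rel="prev" href="${baseUrl}?page=${page - 1}">`);
  if (page < totalPages) tags.push(`<link rel="next" href="${baseUrl}?page=${page + 1}">`);
  return tags;
}

console.log(paginationLinks("https://example.com/blog", 2, 5).join("\n"));
```

Per the announcement, Google now infers this sequencing from internal links, though the tags may still help other consumers of the page.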
JavaScript in web development is on the rise. As a framework approach, it offers great flexibility, personalisation features and dynamic features. Netflix, for example, is built using JavaScript. Whilst Netflix is an app/web app, meaning it isn’t indexed by Google, it still gives you an insight into what JS can do.
The issue with JS is that Google is not very good at crawling it. Google’s JS crawler is slow, outdated and inefficient.
There are two parts to Google - crawl and index. With JS, Google is not able to crawl efficiently because the rendering service that Google uses is old. Google has introduced a new approach called dynamic rendering - the fact that they are bringing this out suggests they are not going to improve the crawler itself any time soon.
Work with devs to ensure important attributes are HTML-based.
If not, and you’re working with a JS framework, look into a pre-rendering or server-side rendering solution. I’d recommend a bespoke approach.
Server-side rendering takes the JS code, renders it into HTML, and then serves that HTML to Google.
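Dynamic rendering, in sketch form, means detecting known crawlers by user agent and serving them a pre-rendered HTML snapshot, while regular users get the client-side JS app. A minimal illustration - the bot list is simplified, and `renderSnapshot`/`serveApp` are hypothetical callbacks standing in for a real prerenderer and app server:

```javascript
// Sketch of dynamic rendering: pre-rendered HTML for known bots,
// the normal JS bundle for everyone else. Bot list is simplified.
const BOT_PATTERNS = [/Googlebot/i, /bingbot/i, /DuckDuckBot/i];

function isBot(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// Hypothetical request handler: renderSnapshot() would run the JS app
// server-side (e.g. via a headless browser) and return static HTML.
function handleRequest(userAgent, renderSnapshot, serveApp) {
  return isBot(userAgent) ? renderSnapshot() : serveApp();
}

const response = handleRequest(
  "Mozilla/5.0 (compatible; Googlebot/2.1)",
  () => "<html><body>Pre-rendered content</body></html>",
  () => '<html><body><div id="app"></div><script src="bundle.js"></script></body></html>'
);
console.log(response);
```

In a real deployment the snapshot would come from a headless-Chrome prerender step or a framework’s own server-side rendering, rather than a hand-written string.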