Marketers are always wary of Google algorithm updates. Every time Google rolls one out, they get nervous and have no option but to roll with the punches. And Google releases hundreds of minor adjustments and a few significant tweaks to its algorithm every year.
In this article, we will discuss your ideal strategy in the wake of such updates so that they do not affect your traffic and rankings. We will also explain how you can use past algorithm updates to understand current trends and deal with them.
In the past decade, the core objective of Google algorithm updates was to prevent manipulation of search algorithms by unscrupulous companies, so that only relevant sites appear at the top of SERPs, resulting in a better search experience for users.
Now that Google has taken search experience to the next level, the search engine not only wants its users to get relevant results for their search queries, but it also wants them to be happy with the websites appearing on top of SERPs.
Previous updates focused mainly on how a web page ranks on Google.
Google also wants to make sure its users love the content and their overall experience on the website.
So, all the authoritativeness and backlinks are useless if the website doesn’t offer a good user experience. For better or worse, the focus is now shifting toward user experience.
This is crucial to understand because we want our readers to be ready not just for one particular update rolling out in the future, but for any algorithm update Google may announce in the coming years.
Apart from the regular algorithm updates, Google announces some sweeping updates every year, so webmasters should prepare proactively and update their websites accordingly.
Let’s discuss some significant algorithm updates that reshaped common practices and changed the landscape of SEO across the web.
The first serious update that shook the web world was the Panda, which Google announced in 2011. It was aimed at penalizing poor-quality content that offered very little value and was overstuffed with keywords.
Now, let’s discuss the context of this update. Google announced it at a time when most companies engaged in content farming: poor-quality content created for search engines rather than people. It was mostly irrelevant, barely helpful to users, and keyword-stuffed. The aim of writing such articles was to game the search engine algorithms and win top rankings. As soon as Google decided to deal with this sneaky practice, Panda was rolled out to penalize those sites.
The timeline for the Panda algorithm update indicates that Google targeted thin content, high ad-to-content ratios, and various other quality issues. Google rolled it out in phases across different countries, and finally, in September 2012, announced Panda #20, a significant update affecting 2.4% of queries.
Again, in May 2014, Google announced a major Panda update that affected 7.5% of English-language queries, and with these updates, Google largely tackled duplicate and poor-quality content. Digital marketing teams across the globe started creating high-quality content that offered real value to users. With this, the nefarious practice of keyword stuffing took a back seat and was replaced with the natural inclusion of relevant keywords.
Google announced the Penguin update just a year after Panda was first rolled out in 2011, but its aim was entirely different. It was launched to deal with another significant menace in the web world: artificial link building used as a “hack” to win top rankings.
You may be familiar with the fact that Google has treated backlinks as a sign of a site’s authority, popularity, and trustworthiness since its inception, and it openly uses them as a ranking signal.
Some unscrupulous webmasters decided to take advantage of this and started manipulating Google’s love for backlinks as a hack to rank on top. Low-quality and spammy backlinks degraded Google’s search results as low-quality websites started ranking high. Google took this head-on with the Penguin update and started penalizing the sites involved in the practice. Soon a significant shift happened, and business owners and webmasters started auditing their links and removing low-quality, irrelevant, and spammy ones.
Google announced the Hummingbird update to provide a better experience to searchers by showing related results. For example, if someone searches for “small business,” Google will use different search intents to show the results. So, the results may include websites offering small business loans, business consultancy and networking services, small business ideas, small business trends, and government websites that connect entrepreneurs with lenders.
Even though the results don’t look like an exact match to the query, they are relevant and valuable for the searcher. So, this update expected marketers to focus on related keywords as well, apart from the main keywords.
If you want to keep your website “update proof” and don’t want to lose traffic with every damn update, the best strategy is to learn from past updates.
Here are a few tips you can proactively use to protect your website against any future algorithm updates:
The central objective of Google’s updates was, to a great extent, explained by Ben Gomes, Google’s Vice-President of Engineering: “our goal is to get you the exact answer you’re searching for faster.”
So, relevancy, accuracy, and speed are what Google wants to achieve with these updates. That makes understanding users’ search intent essential. Marketers should focus on relevance and user experience, and that will cover almost everything Google expects from a website.
So, let’s begin with user experience. First, check how long it takes for your website to load. Have a look at the navigation. Is it user-friendly? Can you get to the sections you need within three clicks at most? Are too many ads bothering you or preventing you from seeing the content on your web pages?
Recent changes to Google’s algorithm indicate that sites with poor user experience will get penalized regardless of other factors that might have propelled them to the top ranks.
According to a startling Google study, as page load time increases from one to ten seconds, the probability of a mobile site visitor bouncing increases by 123%. So, if you are serious about retaining visitors to your site, you should minimize page load time.
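If you want a quick, rough baseline before reaching for a full audit tool, you can simply time how long the page’s HTML takes to download. This is only a sketch: it measures the HTML fetch, not the complete render (images, scripts, layout) that tools like Google PageSpeed Insights report, and the three-second threshold below is an illustrative assumption, not a Google rule.

```python
import time
import urllib.request

def page_load_seconds(url, fetch=None, timeout=30):
    """Time a full download of the page's HTML. A rough proxy only:
    real page-load time also includes assets and rendering."""
    if fetch is None:
        def fetch(u):
            with urllib.request.urlopen(u, timeout=timeout) as resp:
                resp.read()  # download the whole response body
    start = time.monotonic()
    fetch(url)
    return time.monotonic() - start

# Usage (requires network access; the URL is a placeholder):
# elapsed = page_load_seconds("https://example.com")
# print(f"HTML fetched in {elapsed:.2f}s")
# if elapsed > 3:  # illustrative threshold, not a Google rule
#     print("Slow response; start with server and asset optimization.")
```

The `fetch` parameter just lets you swap in a stub for testing; in normal use you only pass the URL.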
Reducing the number of ads, especially near critical navigation buttons, is also a good way of optimizing the UX. Keep in mind that users won’t close three random ads; they’ll close the website and move to another site instead.
Here’s a checklist you can follow to determine whether your website has been affected by a recent Google algorithm update:
Keep tracking Google updates and find out if there have been any recently. Most Google algorithm updates, which occur almost daily, are industry-specific and affect only particular verticals. Their objective is to improve the search results. Broader updates, on the other hand, are referred to as core updates, and these happen several times a year.
If you want to know about the latest algorithm updates and how they can affect your website, check out Moz’s list of Google algorithm updates. The following screenshot explains the updates that happened in 2021 and how they impacted search results. For example, the latest update was detected by many SEO tools in February 2021, and it was named “Featured Snippets Drop.” Its likely impact was that Google Search started showing featured snippets less often (SEL).
Don’t panic as soon as you sense an update taking place. Let the dust settle, which takes around a week, and avoid reacting in the meantime. Once the data stops fluctuating, you can initiate repairs, referring to Google Analytics and Google Search Console and taking cues from these channels.
Now it’s time to get back to Google Analytics and Google Search Console and choose different date ranges to see any traffic fluctuations. To access Search Console via GA, go to Acquisition > Search Console.
Sometimes, this fluctuation can be due to an external factor such as running an email or any other promotion campaign, or a big event or holiday, so you should choose a broader date range.
If you see an overall decline in your traffic or a sharp drop in your organic sessions or leads, it can be due to a Google algorithm update.
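To make that check concrete, here is a minimal sketch that compares average daily organic sessions across two exported date ranges and flags a drop worth investigating. The 20% threshold, the numbers, and the list-of-sessions data shape are illustrative assumptions, not anything GA prescribes.

```python
def flag_traffic_drop(sessions_before, sessions_after, threshold=-20.0):
    """Compare average daily organic sessions across two periods.
    Returns (flagged, percent_change); flags a potential algorithm-update
    hit when sessions fall by more than `threshold` percent (an
    illustrative cutoff, not a Google metric)."""
    avg_before = sum(sessions_before) / len(sessions_before)
    avg_after = sum(sessions_after) / len(sessions_after)
    change = (avg_after - avg_before) / avg_before * 100
    return change <= threshold, change

# Example: sessions averaging 1000/day fall to 650/day (a 35% drop)
hit, change = flag_traffic_drop([980, 1020, 1000], [640, 660, 650])
print(hit, round(change, 1))  # True -35.0
```

In practice you would feed this daily session counts exported from GA for, say, the week before and the week after a suspected update, after ruling out the external factors mentioned above.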
Log in to Your Site’s Google Analytics Account > Go to Acquisition > Search Console > Landing Pages
If traffic drop is sitewide, it could be due to an algorithm update.
Similarly, you can also check it across different devices.
Acquisition > Search Console > Devices > Click on Mobile
Usually, a sitewide traffic drop is a strong indication that your website has been affected by an algorithm update.
There is another method that you can try with ease. If you feel that your website has been affected by an update, check the search landscape on Google. Also, check the current position of your website as compared to your competition:
a. Is there a significant change in the positions?
b. Which site is ranking at the position your site used to rank before?
c. Are you still ranking but in no-click search results, snippets, or paid ads?
d. Compare the rankings of your web pages to those that are now ranking well. Compare the keywords you are targeting against theirs. Find out if they are using any keyword variations or content strategies you aren’t.
e. Have they implemented any specific landing page strategies you haven’t tried so far that may help you boost your rankings?
Sometimes you may notice a drop in traffic due to spammy links. As Google may penalize your site anytime it detects spammy links, it’s essential to check your backlink profile regularly. You can use Google Search Console to audit your backlinks, manage your backlink profile, and improve its overall health.
Here is the step-by-step process for checking your backlink profile using SEO SpyGlass and Google Search Console:
Open SEO SpyGlass and enter your domain name. SEO SpyGlass provides a comprehensive list of backlinks as it uses its own index to find backlinks.
For better results, you can sync your Google Analytics and Google Search Console account with SEO SpyGlass. To do this, put a tick in front of the Enable Expert Options line, and click Next.
You can also turn it on using the “Preferences drop-down” menu of the tool.
Once Google Analytics and GSC are connected to the tool, it will start retrieving the backlink data. However, the process may take some time.
Google ignores occasional bad backlinks, but an overwhelming number of suspicious backlinks may attract a manual penalty. Here is how to identify toxic backlinks.
In the tool, go to Backlink Profile and then click the Penalty Risk section in the drop-down menu. On this screen, the tool provides the estimated penalty risk for each backlink domain.
A backlink with a penalty risk of over 70% is considered a harmful and high-risk backlink. These are marked red, and you should remove those backlinks.
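If you export that penalty-risk report, the filtering step is straightforward. A minimal sketch, assuming the export is a list of (domain, risk-percent) pairs; the data shape, domain names, and the 70% cutoff from the rule of thumb above are all illustrative:

```python
def high_risk_backlinks(backlinks, risk_threshold=70.0):
    """Return domains whose estimated penalty risk exceeds the threshold.
    `backlinks` is assumed to be (domain, risk_percent) pairs exported
    from an SEO tool; the structure is illustrative, not a fixed format."""
    return [domain for domain, risk in backlinks if risk > risk_threshold]

report = [
    ("trusted-blog.example", 12.0),
    ("spammy-directory.example", 88.5),
    ("link-farm.example", 95.0),
]
print(high_risk_backlinks(report))
# ['spammy-directory.example', 'link-farm.example']
```

The resulting list is exactly what you would carry into the disavow step described next.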
The best option to remove toxic backlinks to your website is to disavow them. You can do it with the help of the Google Disavow tool. Google wants you to use this tool to inform them that these backlinks should be ignored when assessing your site for ranking purposes.
To do it, create a “disavow” file that collects all the toxic links, and submit this file through the Google Disavow tool.
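For reference, a disavow file is a plain text file with one entry per line: either a full URL, or a `domain:` prefix to disavow an entire domain, with `#` marking comment lines. A minimal example (the domains are placeholders):

```text
# Backlink audit, March 2021
# Disavow a single spammy page
http://spammy-directory.example/links/page1.html

# Disavow entire link-farm domains
domain:link-farm.example
domain:paid-links.example
```

Disavowing whole domains is usually safer than listing individual URLs, since spammy sites often link from many pages.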
If you are using SEO SpyGlass, select the backlinks you want to remove, right-click, and choose the Disavow backlinks option.
Export the file from the tool and submit it to the Google Disavow tool as described previously.
You may not notice any immediate impact of this activity on your backlink profile, but Google will soon start ignoring these backlinks.
You should do a backlink health check for your website frequently and remove toxic and harmful links this way.
As emphasized above, Google wants to offer the best user experience and exact search results faster. So, if you notice a drop in rankings or traffic, check whether your website provides:
a. Excellent user experience
b. Great and compelling content
c. Excellent usability
To keep yourself abreast of the latest Google algorithm updates, you may refer to the following trackers.
A. Moz Update History: This list is maintained by Moz, co-founded by SEO guru Rand Fishkin. It’s a comprehensive and up-to-date list of algorithm updates, so you can identify trends.
B. Google Webmaster Central Blog: Monitor this resource to stay updated on the latest algorithm updates, straight from Google.
C. SEJ’s Google Algorithm History: Search Engine Journal’s Google algorithm history gives you a sneak peek into the latest as well as previous updates. You can also filter the updates, for instance by the year an update happened.
D. Twitter: After Matt Cutts, the most reliable source of getting information about algorithm updates and Google’s “insider information” is Twitter. Apart from Google, you can follow the likes of:
E. SEO Tools: Apart from that, you can use the following tools to track Google algorithm updates:
If you feel that your site has been penalized following an algorithm update, here is the step-by-step approach to recover and reclaim your lost rankings:
As explained above, check all previous updates and trends through update trackers and identify what might have gone wrong. There can be several reasons for a penalty, such as duplicate or poor-quality content, spammy and irrelevant backlinks (backlinks from other niches and industries), keyword abuse, poor navigation, poor user experience, and site usability issues.
Visit Google Webmaster Tools to identify such issues and how to fix them.
Once you identify the possible reasons for a penalty, make algorithm-friendly changes to reclaim your lost rankings. You can remove low-quality, spammy backlinks, remove duplicate content, add high-quality content, and improve the user experience on the site.
You should also pay attention to user experience and related issues such as:
A. Site load speed
B. Poor navigation where it becomes difficult to access the content you are looking for
C. Too many ads that may annoy a visitor
D. Bad layout and website design
The search engine uses a complex system that retrieves data from its search index to instantly deliver the best possible results for a search term or query. Search engines use algorithms and various ranking factors to return web pages ranked by relevance on their SERPs.
Google changes its algorithm anywhere from about 500 to a few thousand times per year, but only a few of those are significant updates. Minor updates may not affect your website immediately, but there is always a pattern and trend behind them. Therefore, it’s crucial to keep an eye on these changes to prevent any penalty.
If you are a business owner or an SEO expert, you should keep track of small and significant changes that could impact your website so that you can change your SEO strategy and tactics accordingly.
A Google algorithm update can either boost or hurt your:
EAT is a popular acronym in SEO that stands for expertise, authoritativeness, and trustworthiness.
The EAT concept is quite important because, according to Google, it helps measure the relevance of a website and decide whether to rank it higher.
So, always try to publish well-researched content written or compiled by an expert. Content written by an authority boosts trustworthiness.
For example, have a look at the TherapyTribe blog. TherapyTribe is an online directory of psychiatrists, and the content it publishes is created by certified mental health and psychiatry professionals.
Take the article “What are Bulimia, Anorexia, & Eating Disorders?” It is written by Emily Mendez, M.S., Ed.S., a certified psychotherapist and an expert in mental health and substance abuse cases. This increases the trustworthiness of the website for its visitors.
Websites with unverified and shallow content are often penalized by Google, as the information they provide can be potentially harmful to readers.
Dealing with thousands of minor and a couple of major algorithm updates every year is no cakewalk. Still, the right tools, such as keyword density checkers, backlink finders, and the Google algorithm update trackers mentioned above, will help you puzzle it out.
If your website is affected by a Google algorithm update, go to Moz or SEJ’s trackers mentioned above and study the past algorithm updates to figure out the exact reason.
Use Google Search Central, previously known as Google Webmasters, to find ways to fix it. You can find many tools and useful study materials there. Use it along with the tools listed above, and soon you’ll be on your way to reclaiming your lost rankings and traffic.
If you are an agency and don’t have the required tools and expertise, you can hire result-oriented white label SEO services from DashClicks for your clients. It is not only affordable but also hassle-free and time-saving.