In this short case study, I’ll argue that Google’s automated spam algorithms detected unintentional cloaking on a client’s website, and explain how we fixed it.
TL;DR – Here’s a quick summary:
- Client launches a new Shopify theme (no content or URL changes, just design).
- Rankings drop after a few days.
- We discover hidden code in the page source that search engines can see but users can’t.
- We inform the client about the potential dangers and provide a course of action.
- Client’s developer removes the hidden code following our advice, and we re-submit the pages to Google Search Console.
- Rankings return in a few days.
For more details, here’s a timeline of events:
August 10 2022
Our client launches a new Shopify theme. There are no changes to content or URL structure; it’s strictly a new design.
August 17 2022
Rankings for the client’s category pages fall through the floor.
We start investigating as soon as we see the Semrush report.
To say I was sweating like a turkey on Christmas Eve would be an understatement.
After a lot of digging around, we found the potential problem – a hidden, duplicate H1 tag within the source code:
<h1 class="collection-header__title lg:text-[42px] text-2xl capitalize">Blue Widgets</h1>
There was a rule in the CSS that hides the H1 tag from users, while search engines can still see it:
h1.collection-header__title {
  display: none !important;
}
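This kind of conflict can be caught programmatically. Here’s a minimal sketch that scans raw CSS text for rules that both target an H1 and set display: none. It uses a naive regex rather than a full CSS parser, so it ignores nesting and media queries; the function name is our own, not part of any library:

```python
import re

def find_hidden_h1_rules(css_text):
    """Return selectors that target an h1 and set display: none."""
    hidden = []
    # Match each "selector { ... }" rule block (naive; ignores nesting/media queries).
    for match in re.finditer(r"([^{}]+)\{([^{}]*)\}", css_text):
        selector, body = match.group(1).strip(), match.group(2)
        if "h1" in selector and re.search(r"display\s*:\s*none", body):
            hidden.append(selector)
    return hidden

css = """
h1.collection-header__title {
  display: none !important;
}
"""
print(find_hidden_h1_rules(css))  # ['h1.collection-header__title']
```

Anything this flags is worth a manual look: hiding a heading can be legitimate (e.g. for layout reasons), but a hidden, keyword-rich H1 is exactly the pattern that reads as cloaking.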
There were two problems with this:
- The page already had a visible H1 tag for “Blue Widgets”, so the hidden one created a duplicate.
- The hidden, keyword-bearing heading opened us up to spam action for cloaking and keyword stuffing.
We alerted the client immediately about the dangers and gave them a course of action.
Here’s an email snippet:
September 10 2022
The client’s web developer confirms they have removed the duplicate H1 tag on the category pages.
We re-submit the pages to be re-crawled through Google Search Console.
September 13 2022
Rankings shoot back up to their original positions.
Now, I know what you’re thinking: “correlation doesn’t equal causation”, and I completely agree.
But here’s the reasoning behind my hypothesis:
Google’s Spam Policies for Google Search documentation states (emphasis mine), “We detect policy-violating content and behaviors both through automated systems and, as needed, human review that can result in a manual action. Sites that violate our policies may rank lower in results or not appear in results at all.”
Given the website’s rankings returned almost instantly following the removal of cloaked content, we can presume that Google’s spam algorithms lifted a filter that was applied to the client’s category pages.
What do you think? I’d love to get your feedback.
If nothing else, we hope this case study helps when troubleshooting ranking drops.
Check your website for inadvertent cloaking by running a ‘Command-F’ search in your CSS files for display: none and making sure everything is as it should be.
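If you’d rather script that search than eyeball it, here’s a small sketch that walks a directory of CSS files and reports every occurrence of display: none with its file and line number. The "assets" path is just an example; point it at wherever your theme stores its stylesheets:

```python
import re
from pathlib import Path

def scan_css_for_display_none(root):
    """Find every 'display: none' in .css files under root.
    Returns (file, line number, line text) tuples for manual review."""
    hits = []
    pattern = re.compile(r"display\s*:\s*none", re.IGNORECASE)
    for path in Path(root).rglob("*.css"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if pattern.search(line):
                hits.append((str(path), lineno, line.strip()))
    return hits

# Example: scan a hypothetical theme assets folder.
for file, lineno, line in scan_css_for_display_none("assets"):
    print(f"{file}:{lineno}: {line}")
```

Not every hit is a problem; display: none has plenty of legitimate uses. The goal is simply a checklist of places to verify that nothing keyword-bearing is hidden from users.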
We’re an SEO company in Perth, WA. Contact us if you’re looking to improve your online presence in Search.