They Did Everything the AI Tools Told Them To
The business was a specialist services firm — the kind where organic search traffic converts at a meaningful rate because people searching for what they offer are actively looking to buy. They had been growing steadily through word of mouth and a small amount of organic traffic from a handful of well-ranked pages. When they discovered AI SEO tools, the appeal was obvious: they could produce more content faster, optimise their meta descriptions at scale, and fix their schema markup without needing a developer.
They did all of this diligently for three months. They published fourteen new blog posts. They rewrote meta descriptions across their entire site.
They added schema markup generated by an AI tool on their key pages. Six weeks later, the organic traffic drop was visible in their Google Search Console data. At three months, it was undeniable.
They came to us confused and frustrated — they had done more SEO work than ever before and their rankings had gone backwards.
Why AI-Generated SEO Gets Penalised
Google's ranking systems have become significantly better at detecting content and metadata that were generated at scale without genuine human judgment or expertise behind them. The Helpful Content update and subsequent core algorithm updates explicitly target content that was produced primarily for search engines rather than for real readers — regardless of who or what produced it. AI-generated content fails the Helpful Content criteria for a consistent reason: it is statistically average.
It covers topics in the way that most content about those topics covers them. It does not reflect the first-hand experience, specific expertise, or original perspective that Google uses as a positive signal under its E-E-A-T framework (Experience, Expertise, Authoritativeness, Trustworthiness). When fourteen articles that all read like confident, well-structured summaries of things that are already widely published appear on a site in quick succession, the pattern is detectable.
The content looks like it was generated rather than written by someone who actually knows the subject.
The 11 Specific Issues We Found
Our audit identified eleven specific problems. In the content category: all fourteen new blog posts had the same structural pattern — a definition, three to five subheadings with one paragraph each, and a conclusion. There were no specific examples, no original data, no named author, and no perspective that could not have been produced by any AI model trained on the same topic.
Three posts were near-duplicates of each other on adjacent topics. The writing was correct but lacked the depth signals Google associates with genuine expertise. In the technical category: the AI-generated meta descriptions were too long on nine pages and duplicated on four page pairs — common outputs of AI tools that generate descriptions without checking the existing site architecture.
The schema markup added by the AI tool contained incorrect property values on six pages and a broken LocalBusiness schema that was missing required fields, making it invalid rather than helpful. The internal linking structure had not been updated to reflect the new content — the fourteen new posts had almost no internal links pointing to them, making them effectively invisible to Googlebot regardless of their quality. And the canonical tags on three older pages had been incorrectly overwritten during the meta description update.
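The broken LocalBusiness markup could have been caught with a basic automated check before publishing. Here is a minimal sketch of that kind of validator — the required/recommended field lists are assumptions based on Google's general structured-data guidance, not the client's actual audit checklist:

```python
import json

# Fields treated here as required vs recommended for a LocalBusiness
# rich result -- an assumption for illustration, check Google's current
# structured-data documentation for the authoritative list.
REQUIRED_FIELDS = {"name", "address"}
RECOMMENDED_FIELDS = {"telephone", "url", "openingHours", "priceRange"}

def check_local_business(jsonld: str) -> list[str]:
    """Return a list of problems found in a LocalBusiness JSON-LD blob."""
    problems = []
    try:
        data = json.loads(jsonld)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    if data.get("@type") != "LocalBusiness":
        problems.append("@type is not LocalBusiness")
    for field in REQUIRED_FIELDS:
        if not data.get(field):
            problems.append(f"missing required field: {field}")
    for field in RECOMMENDED_FIELDS - data.keys():
        problems.append(f"missing recommended field: {field}")
    return problems

# Example: markup missing an address, like the invalid schema in the audit.
broken = json.dumps({"@context": "https://schema.org",
                     "@type": "LocalBusiness",
                     "name": "Example Ltd"})
print(check_local_business(broken))
```

A check like this takes minutes to run and would have flagged all six pages with incorrect property values before Google ever crawled them.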
The Difference Between AI-Assisted and AI-Dependent SEO
The right way to use AI in SEO is as a research and drafting tool, not as the author and publisher. AI is genuinely useful for: identifying related keywords and topic clusters you had not considered, generating a first-draft outline that a human expert then rewrites and adds genuine perspective to, checking meta description lengths and flagging duplicates at scale, and automating the repetitive formatting work around schema templates. What AI cannot do reliably: produce content that reflects genuine first-hand experience, make editorial judgments about what is actually useful to a specific reader, or catch the structural SEO errors that require understanding your specific site architecture.
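The "checking at scale" tasks in that list are exactly where automation shines. A minimal sketch of a meta-description audit — the page URLs and the 155-character threshold are illustrative assumptions; in practice the input would come from a site crawl:

```python
from collections import defaultdict

META_MAX = 155  # a common snippet-truncation threshold, used here as a rule of thumb

def audit_meta_descriptions(pages: dict[str, str]):
    """Flag overlong and duplicated meta descriptions across a site.

    `pages` maps URL -> meta description text.
    """
    overlong = [url for url, desc in pages.items() if len(desc) > META_MAX]
    by_desc = defaultdict(list)
    for url, desc in pages.items():
        by_desc[desc.strip().lower()].append(url)
    duplicates = [urls for urls in by_desc.values() if len(urls) > 1]
    return overlong, duplicates

# Hypothetical pages, two sharing a description and one far too long.
pages = {
    "/services": "Specialist services for businesses that need expert help.",
    "/about": "Specialist services for businesses that need expert help.",
    "/blog/long": "x" * 200,
}
overlong, duplicates = audit_meta_descriptions(pages)
print(overlong)     # pages exceeding the length threshold
print(duplicates)   # groups of pages sharing one description
```

Note this is the inverse of what the AI tools did to this client: here the automation checks human-controlled output, rather than generating output unchecked.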
The business that came to us had moved from AI-assisted to AI-dependent — they had removed the human judgment layer entirely. That is where the damage happened.
What We Fixed and In What Order
We worked in two phases. Phase one was the technical fixes — these could be deployed quickly and would stop the active damage. We corrected all schema markup to valid values, restored the incorrectly overwritten canonical tags, fixed the duplicate meta descriptions, trimmed the overlong descriptions to under 155 characters, and built a proper internal linking structure connecting the new posts to the relevant existing pages.
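The internal-linking gap — fourteen posts with almost no inbound links — is the kind of structural problem a simple graph check makes visible. A sketch of an orphan-page detector, with a hypothetical site graph standing in for real crawl data:

```python
def find_orphan_pages(all_pages: set[str], links: dict[str, set[str]]) -> set[str]:
    """Pages with no inbound internal links -- hard for crawlers that
    discover content by following links to find.

    `links` maps each page URL to the set of URLs it links out to.
    """
    linked_to = set().union(*links.values()) if links else set()
    return all_pages - linked_to - {"/"}  # the homepage needs no inbound link

# Hypothetical graph: two new posts, only one of them linked internally.
links = {
    "/": {"/services", "/blog/post-1"},
    "/services": {"/"},
}
all_pages = {"/", "/services", "/blog/post-1", "/blog/post-2"}
print(find_orphan_pages(all_pages, links))  # the orphaned new post
```

Run against a real crawl, this surfaces every page that exists in the sitemap but is invisible to link-following discovery.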
Phase one was live within ten days. Phase two was the content remediation — slower, but where the compounding gains come from. We identified the four new posts that covered topics the business had genuine expertise in and rewrote them with a named author, first-person experience, specific client examples (anonymised), and original perspective.
We consolidated the three near-duplicate posts into a single comprehensive guide. We removed the remaining seven posts and replaced them with 301 redirects to the most relevant existing pages. Phase two was complete at four weeks.
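When replacing removed posts with 301 redirects, the map itself should be sanity-checked before deployment — chains and loops waste crawl budget and dilute the signal a clean redirect passes. A minimal sketch with hypothetical URLs:

```python
def check_redirect_map(redirects: dict[str, str]) -> list[str]:
    """Flag redirect chains and self-loops in a 301 map before deploying it."""
    problems = []
    for src, dst in redirects.items():
        if src == dst:
            problems.append(f"loop: {src} redirects to itself")
        elif dst in redirects:
            problems.append(f"chain: {src} -> {dst} -> {redirects[dst]}")
    return problems

# Hypothetical map for removed posts; the second entry points at another
# removed URL, creating a two-hop chain.
redirects = {
    "/blog/removed-post-1": "/services",
    "/blog/removed-post-2": "/blog/removed-post-1",
}
print(check_redirect_map(redirects))
```

The fix for a flagged chain is to point every source directly at the final destination, so each removed URL resolves in a single hop.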
The Results: 47% More Organic Sessions in 60 Days
The impact of the technical fixes was visible in Google Search Console within two weeks — crawl errors resolved, rich result eligibility restored on the schema-corrected pages, and the duplicate content signals cleared. The content changes took longer to flow through to rankings, as expected. Sixty days after phase two was complete, organic sessions were 47% above the pre-AI-experiment baseline — and 164% above the three-month trough.
The four rewritten posts with genuine expert authorship are now ranking on page one for their target terms. The consolidated guide replaced three thin posts and ranks in position four for the primary keyword — the original three posts did not rank at all. The business now uses AI for research and first drafts only, with a human review and rewrite step before anything is published.
That combination — AI speed, human judgment — is where the durable SEO gains come from.
Have you been using AI for SEO and seen your traffic drop?
We offer a free 30-minute SEO audit call. We will look at your Google Search Console data, identify the specific issues, and tell you exactly what to fix first — no commitment required.
Get a Free SEO Audit
Frequently Asked Questions
Does Google penalise AI-generated content?
Google does not penalise content for being AI-generated per se. It penalises content that is unhelpful, thin, or produced primarily for search engines rather than for readers — which AI-generated content frequently is when published without human editorial review. The Helpful Content system and E-E-A-T signals both work against content that lacks genuine expertise, first-hand experience, and original perspective.
Why did my SEO rankings drop after using AI tools?
The most common causes are: thin or duplicate content that triggers Google's Helpful Content system, technically broken schema markup generated by AI tools, duplicate or overlong meta descriptions, and a lack of internal linking to new AI-generated pages. Any one of these can cause ranking drops; multiple issues occurring simultaneously can cause significant traffic loss.
How do I fix an SEO penalty from AI-generated content?
Start with the technical issues: fix broken schema, correct canonical tags, deduplicate meta descriptions, and build internal links to new content. Then address the content quality: rewrite AI-generated articles with genuine expert perspective, consolidate near-duplicate posts, and add named authorship with real credentials. Expect ranking recovery to take four to twelve weeks from the point the fixes are deployed.
Is AI content good or bad for SEO?
AI-assisted content — where AI is used for research and drafting and a human expert provides the genuine perspective, examples, and editorial judgment — can rank well. AI-dependent content — where AI is the author and publisher with no meaningful human review — consistently underperforms and is increasingly likely to be downranked by Google's quality systems.