Current Stats
- Ranked pages: 0
- Top 10 rankings: 0
- Indexed pages: 1/10
- Monthly visitors: 0
- Published posts: 0
Many SEO agencies run SEO experiments constantly. You could say that everything we do is, to some extent, an experiment, but you’d probably be wrong: we make constant tweaks, and SEO is a constantly evolving discipline, but all good SEO agencies know exactly how to do good SEO.
With that caveat out of the way, I’m going to tell you why I’m conducting this ‘sketchy’ SEO experiment. I decided against using the word ‘bad’, because not everything I’m going to do in this experiment is going to be bad SEO.

We already know that downright terrible SEO is a waste of time. An experiment in truly bad SEO would contribute nothing. If everything I do in this experiment is awful SEO practice, then I’m not going to learn anything.
There will be aspects and sub-projects in this experiment that I would never use on client projects, but like many seasoned SEO professionals, I always have a few websites of my own where I can take bigger risks.
Breaking stuff is a great way to learn, as long as the risk is mitigated or transferred. This is why I’m calling this experiment ‘sketchy’, rather than ‘bad’.
You may have noticed that I’m not using the royal ‘we’ when discussing this project; that’s because it’s very much a one-man mission. I will seek input and opinions from my team and associates, but to have total control of the outcome, I’m going to do this experiment as a solo mission.
Still with me? Great, let’s crack on!
The hypothesis for the sketchy AI SEO experiment
Google disregards AI-generated content
So many people have tested the theory that Google disregards AI content that it’s pretty much fact at this stage. There are so many AI SEO experiment write-ups that it has almost become a meme in the industry. So why am I adding noise to this debate?
I have read some brilliant reports from SEO industry peers who have tested the impact of AI content. AI has been around for many, many years, but it has recently become lodged in the consciousness of the whole of society. AI is no longer technology used only by industry boffins; it has (to a certain extent) been democratised.
Some of the excellent experiments out there were carried out some time ago. The results were noted, and then the experiments appear to have been abandoned, at least as far as I know. It’s my intention that my own AI SEO experiment will run for a good long time.
I need to conduct my own experiment so that I can see all the data, not just the data reported in a write-up that someone else has done. Regardless of how great a writer is, they simply cannot include ALL the data. I want total ownership of all the data so I can get a deeper understanding of the current and future impact of AI on the world of SEO.
The assumption
Because some of the ground covered in this experiment has been extensively covered before, I’m expecting the timeline of this test to run roughly like this:
- Launch site
- Add AI-generated articles
- Gain some rankings
- Rankings tank
As mentioned earlier, other write-ups seem to stop after the rankings have tanked. I want to add an extra step and see if I can recover rankings.

The AI experiment parameters
Time
I am tracking every minute I’m spending on this project. I’ll occasionally update you on how much time I’ve invested/wasted on this experiment.
The website builder
I will build a WordPress website because I don’t want the CMS I use to give the experiment any specific advantages or disadvantages.
Some might say that using WordPress gives the experiment an advantage over, say, Wix, but I’ve chosen WordPress because it is so ubiquitous.
I will be using a free WordPress theme. This probably means I won’t be able to build with the Genesis framework that I love, but it’s a fairer test to use a freebie.
The topic
I’m going to build a website that focuses on a topic I know nothing about. I want to take any prior knowledge of the subject out of the equation. My theory is that if I were to create a test website on a topic I know well, then that would take away some of the reliance on AI.
Hosting
I’m going to build the site on run-of-the-mill, standard cPanel hosting. Not cheap, but not expensive. Building the site on a platform like WP Engine would potentially give the test site an advantage, pulling it away from being utterly average.
Domain name
This is one of the parts of the experiment where I am deviating from the advice that I would give anyone else. I’m going to use an aged, deleted domain. I’ll explain why further down in this article, but for now, it’s sufficient to say that I want some backlinks so I’m not starting from a standing start.
DNS and CDN
I’m going to leave the DNS with whoever I register the domain name with. I would usually recommend using a very good DNS service like Amazon’s Route 53 for a very marginal SEO gain, but leaving the domain with the registrar is more in line with how someone who isn’t an SEO pro might operate.
I’m also not going to use a CDN. Although the world of CDNs is very accessible now, I don’t think many site-builders would use one. So I’m not going to be using a CDN, though this might change if I can’t get any traction.
SEO Tools
I will be using all the tools Semrush has to offer. As far as I’m concerned, it is the industry leader, so it feels like the right tool to use. I will also be using Screaming Frog – because it’s awesome.
Because I’m using Semrush, I will be doing some keyword research.
I will be using Google Search Console and Google Analytics.
AI Tools
I will be using GPT-4, and any plugins that are available. I acknowledge that GPT-4 is not a free tool, but I want to be able to use the data analysis tools to see if they can offer a significant advantage.
I’m going to use some of the Semrush AI tools, including the SEO Writing Assistant. I’m going to accept every suggestion that the Writing Assistant makes, or as close as I can get. I’m also using Grammarly, which is AI too (I guess).

What content will I use AI for?
Everything. Flippin’ everything. From the logo on the site, to the standard pages, through to the blog. I will use DALL·E to generate any images I need.
Where my AI experiment might differ from others is that I’m going to use Semrush for keyword research, for building keyword clusters and for creating content briefs. I plan to use the Bing connection in GPT-4 to do live research before writing articles.
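For what it’s worth, the image side is the easiest bit to illustrate. I’ll be working in the ChatGPT interface rather than scripting anything, but if you wanted to automate the DALL·E step with the OpenAI Python SDK, a minimal sketch would look something like this (the model name, prompt and filename are placeholders, not what the experiment site will actually use):

```python
# Minimal sketch: generating a blog image with DALL·E via the OpenAI API.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set.
# The prompt and filename below are placeholders.
from openai import OpenAI
import urllib.request

client = OpenAI()

response = client.images.generate(
    model="dall-e-3",
    prompt="A clean, flat illustration for a blog post about the site's topic",
    size="1024x1024",
    n=1,
)

# The API returns a temporary URL for the generated image; save it locally
# so it can be uploaded to the WordPress media library later.
image_url = response.data[0].url
urllib.request.urlretrieve(image_url, "featured-image.png")
print(f"Saved image from {image_url}")
```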
Safety nets and AI tests
I’m going to run every bit of content through Copyscape. I’m creating a slightly frivolous experiment that is as much for funsies as anything else, but I definitely don’t want to accidentally plagiarise anyone.
I’m going to use free AI content detection tools to get a gauge on how well GPT-4 is writing. What I’d like to see is content that passes the test and is deemed to be human-generated. My theory is that Google has FAR better AI detection tech than any of the free tools. If they haven’t, then they bloomin’ well should have!
One of the key aspects of this experiment is that I’m not just letting AI run wild. Everything is going to be very carefully prompted, to take it as far away from lazy use of AI as possible. We already know lazy AI use doesn’t work.
Keeping track of the experiment
I will update the metrics section at the top of this article each time I write an update on the experiment. All updates will be added to this page.

21/10/23 – Kick off!
Buying the domain
I wouldn’t normally buy an expired domain, but I want there to be some backlinks to kick-start this project. I have found a domain name that has one of my primary target keywords in it. I haven’t chosen an exact-match domain because I think it will directly help rankings (it won’t), but because the chances of the existing backlinks being niche-relevant are much higher.
Domain stats:
- 22 current domain backlinks
- Domain originally registered in 2008
- DMOZ listed (old skool!)
I’ve looked on the Internet Archive to see if the domain was used for anything unsavoury, and it looks clean. The domain expired a few months ago, but that’s fine.
The domain is a co.uk, but I’m yet to be convinced that the flavour of TLD you use makes any difference to rankings. The topic for the site is focussed on a British sub-niche of a larger niche, so having a UK domain is a very marginal advantage from a user experience perspective.
I have registered the domain for five years; long domain registrations are reputed to be a minor ranking signal. Five years is the current maximum length for a UK domain name.
The following tasks were completed today:
- Added the domain to cPanel hosting
- Set up free SSL
- Installed WordPress using the Softaculous auto-installer
- Picked a free WordPress theme
- Added Yoast (free version) and Google Tag Manager plugins
- Set up Google Tag Manager, Google Analytics and Google Search Console
- Created a Semrush project
- Used GPT-4 to generate a list of ping update services and added them to the site settings (sketched below)
- Created initial keyword clusters in Semrush
I also added the first bits of content and submitted some of the pages to Search Console to request indexing.
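As an aside, for anyone wondering what those ping update services from the task list actually do: every time you publish, WordPress fires a weblogUpdates.ping XML-RPC call at each service listed under Settings > Writing. A rough standalone equivalent in Python looks like the sketch below (the site name and URL are placeholders):

```python
# Rough sketch of what WordPress does with the ping update services list:
# one weblogUpdates.ping XML-RPC call per service whenever a post is published.
# The site name and URL below are placeholders.
import xmlrpc.client

PING_SERVICES = [
    "http://rpc.pingomatic.com/",
    # ...plus whatever else ends up in Settings > Writing > Update Services
]

for endpoint in PING_SERVICES:
    server = xmlrpc.client.ServerProxy(endpoint)
    try:
        result = server.weblogUpdates.ping("My Test Site", "https://example.co.uk/")
        print(endpoint, result)  # typically a dict like {'flerror': False, 'message': '...'}
    except Exception as exc:
        print(endpoint, "failed:", exc)
```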
It was impossible to get any of the content to pass an AI content test, no matter how much editing I did. So either the detection tools are really good now, or they simply don’t work.
12/11/23 – Adding content
Posts/pages added today – 9
The site is getting no traffic and has no rankings, which isn’t a surprise considering that it only has one blog post! Search Console is informing me that the homepage is now indexed, but only the homepage.
The time has come to shove some more posts on the site. Here’s the workflow I’m going to use:
- Use Semrush to do keyword research and create a keyword cluster
- Generate a list of primary and secondary target keywords using Semrush
- Use this ‘master copywriter’ engineered prompt for GPT-4
- Check all articles with Grammarly
- Use DALL·E to generate images for the articles
- Submit new posts to Search Console
This method of creating clusters lends itself nicely to creating a load of articles that can all be assigned to one blog post category on the website. That way, I can use a different workflow for a different category the next time I add content. I’m hoping this will make comparing results from the different workflows much easier.
I’m still creating articles one at a time with this method, but in future, I want to test bulk-generating content.
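Out of curiosity, here’s a very rough sketch of what that bulk step could look like: loop over a keyword cluster, ask GPT-4 for a draft via the OpenAI API, and push each one into WordPress as an unpublished draft through the REST API. The keywords, prompt, site URL and credentials are all placeholders (this is emphatically not my ‘master copywriter’ prompt), and a WordPress application password would be needed for the authenticated call:

```python
# Very rough sketch of a bulk content workflow: one draft per keyword in a
# cluster, pushed to WordPress as an unpublished draft for manual review.
# Keywords, prompt, site URL and credentials are all placeholders.
import requests
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

KEYWORD_CLUSTER = ["example keyword one", "example keyword two"]  # placeholder cluster
WP_API = "https://example.co.uk/wp-json/wp/v2/posts"              # placeholder site
WP_AUTH = ("wp-username", "application-password-here")            # WordPress application password

for keyword in KEYWORD_CLUSTER:
    # Placeholder prompt: not the engineered 'master copywriter' prompt used
    # in the experiment.
    completion = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are an experienced copywriter."},
            {"role": "user", "content": f"Write a blog post targeting the keyword: {keyword}"},
        ],
    )
    draft = completion.choices[0].message.content

    # Create the post as a draft so nothing goes live without a human review.
    response = requests.post(
        WP_API,
        auth=WP_AUTH,
        json={"title": keyword.title(), "content": draft, "status": "draft"},
        timeout=60,
    )
    response.raise_for_status()
    print(f"Created draft for '{keyword}': {response.json()['link']}")
```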
I also installed the free version of the Interlink Manager plugin. The theory is that the plugin automatically creates all your internal links for you. This sounds like a terrible idea, but I’ll happily chuck it at this test site to see what happens. I’m guessing that if the links are generated dynamically, rather than being written to the pages, then Google probably won’t see the links anyway!
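If I want to check whether the plugin’s links actually make it into the HTML that a crawler fetches, the quickest test is to pull a page’s raw source and count the internal links in it. A minimal sketch, assuming the requests and beautifulsoup4 packages (the URLs are placeholders):

```python
# Quick check: do the plugin's internal links exist in the raw HTML that a
# crawler would fetch, or are they only injected client-side?
# The URLs are placeholders; assumes requests and beautifulsoup4 are installed.
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.co.uk/some-blog-post/"
SITE_ROOT = "https://example.co.uk"

html = requests.get(PAGE_URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

internal_links = [
    a["href"] for a in soup.find_all("a", href=True)
    if a["href"].startswith(SITE_ROOT) or a["href"].startswith("/")
]

print(f"Internal links in server-rendered HTML: {len(internal_links)}")
for href in internal_links:
    print(" -", href)
```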
Last Updated on November 12, 2023