Let’s face it, artificial intelligence has brought about a big change in the way online content is created and published. Many websites have already adopted AI to produce high-quality, SEO-optimized content.
But how does Google react to AI-generated content? Does it treat it as human-written content or as spam?
Google expert Martin Splitt recently answered this question, explaining how Googlebot crawls, renders and processes AI content.
In this article, we’ll look at how Google handles content created with artificial intelligence.
Martin Splitt talks about how Google handles AI-generated content
Martin Splitt discussed how Google handles AI-generated content during a webinar called Exploring the Art of Rendering.
He was responding to an audience member who wanted to know whether the influx of AI content online could hinder Google’s ability to process information when it comes to crawling and rendering.
The question was:
“They say content production is increasing because of AI, which increases loads on crawling and rendering. Is it likely that rendering processes need to be simplified?”
In other words, the audience member wanted to know whether Google has taken any specific steps to cope with the growing volume of online content, given that the load on crawling and rendering will increase considerably.
Initially, Martin said:
“No, I don’t think so.”
He then explained how the search engine identifies poor-quality pages during crawling, and how it proceeds once it finds them.
How does it detect AI-generated content?
One of the biggest concerns professionals have about AI is Google’s ability to detect the text it generates.
Martin also addressed this issue during his talk.
He stated:
“So we do detection or quality control at several stages, and most s****l content doesn’t necessarily need JavaScript to show us how s****l it is.
So if we see that it’s s****l content beforehand, we skip rendering; what’s the point?
If we see, ok, this looks absolute… we can be very certain that it’s crap, and that JavaScript would just add more crap, so bye. If it’s an empty page, we can tell ourselves that we don’t know.
People don’t usually put empty pages here, so let’s at least try to render. And then, when the render returns a turd, we say to ourselves, yeah okay, fair enough, it was a turd.
So that’s already happening. It’s nothing new. AI may increase the scale, but it doesn’t change much. Rendering is not the culprit here.”
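To visualize the flow Martin describes, here is a minimal sketch in Python. Everything in it is a hypothetical stand-in: the heuristics and function names (looks_clearly_low_quality, is_effectively_empty, render_with_javascript) are illustrative assumptions, not Google’s actual signals or internals.

```python
import re

# Hypothetical sketch of the staged quality checks Martin describes.
# The heuristics below are illustrative stand-ins, not Google's signals.

def looks_clearly_low_quality(html: str) -> bool:
    # Stand-in heuristic: flag pages stuffed with spammy boilerplate.
    spam_markers = ("buy cheap", "click here now", "guaranteed #1 ranking")
    return any(marker in html.lower() for marker in spam_markers)

def is_effectively_empty(html: str) -> bool:
    # Stand-in: almost no visible text in the raw HTML; the real content
    # may only appear once JavaScript runs.
    text = re.sub(r"<[^>]+>", "", html).strip()
    return len(text) < 50

def render_with_javascript(html: str) -> str:
    # Placeholder for a headless-browser render step.
    return html

def process_page(html: str) -> str:
    # Stage 1: pre-render check on the raw HTML.
    if looks_clearly_low_quality(html):
        return "skip rendering"            # "what's the point?"
    # Stage 2: empty pages are ambiguous, so render before judging.
    if is_effectively_empty(html):
        rendered = render_with_javascript(html)
        if looks_clearly_low_quality(rendered) or is_effectively_empty(rendered):
            return "drop after rendering"  # the render "returned a turd"
    return "process normally"
```

The point of the sketch is simply that rendering is skipped or applied depending on what earlier, cheaper checks already reveal.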
AI content goes through the same quality checks
Martin’s response shows that Google doesn’t perform AI detection as such; rather, it uses several means to determine whether or not content is of good quality.
This is quite normal: Google’s algorithm is not designed to detect AI-generated content as such, but it picks up low-quality content automatically, whatever its origin.
Remember the Helpful Content system, which aimed to reward sites that produced high-quality content for web users.
On this subject, Danny Sullivan wrote:
“...we’re making a series of improvements to the search engine to make it easier for web users to find useful content written by and for people.”
In other words, Google isn’t just highlighting content written for humans, but also content written by humans. The implication is that the engine can differentiate human-generated content from machine-generated content, as well as spot poor-quality content.
Google researchers made a similar point in a paper on page quality:
“This paper postulates that detectors trained to distinguish between human- and machine-written text are effective predictors of the linguistic quality of web pages, outperforming a baseline supervised spam classifier.”
Put simply, language quality, however the content was produced, is a key aspect of SEO and online visibility.
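To make the researchers’ idea concrete, here is a small illustrative sketch, not taken from the paper: a detector’s “machine-written” probability is reused as an inverse quality signal. The scoring heuristic is a crude placeholder for a trained classifier.

```python
# Illustrative sketch of the paper's idea: reuse a human-vs-machine text
# detector as a proxy for language quality. The "detector" below is a
# crude placeholder, not the model the researchers actually used.

def machine_written_probability(text: str) -> float:
    # Placeholder scorer: treats highly repetitive text as more
    # "machine-like". A real detector would be a trained classifier.
    words = text.lower().split()
    if not words:
        return 1.0
    unique_ratio = len(set(words)) / len(words)
    return max(0.0, min(1.0, 1.0 - unique_ratio))

def language_quality_score(text: str) -> float:
    # The paper's insight: pages that score as "human-like" tend to read
    # as higher quality, so quality can be estimated as 1 - P(machine).
    return 1.0 - machine_written_probability(text)

print(language_quality_score("buy cheap buy cheap buy cheap buy cheap"))   # low
print(language_quality_score("A short, varied sentence reads naturally."))  # high
```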
For his part, Martin said:
“...we perform detection or quality control at several stages…
So it’s already happening. It’s not something new. AI may increase the scale, but it doesn’t change much.”
In short:
- Google is not applying any new measures to AI-generated content;
- The search engine uses quality detection for both content written by humans and content written by AI.
What is Googlebot rendering of web pages?
Googlebot is simply the name of Google’s indexing robot, which crawls the web and retrieves information from web pages to add to the search engine’s index.
Googlebot’s rendering of web pages, meanwhile, is the process that enables the robot to see pages as users do, taking into account visual elements, dynamic content, JavaScript, CSS and other resources.
It is through Googlebot’s rendering of web pages that the search engine assesses the quality, relevance and usefulness of web pages for web users.
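To illustrate why rendering matters, here is a small made-up example: the raw HTML of a JavaScript-driven page contains almost nothing to index, and the visible content only exists after scripts run. The page and the helper function are both hypothetical.

```python
import re

# Made-up example of a JavaScript-driven page: the raw HTML is almost
# empty, and the visible content is injected by a script at load time.
RAW_HTML = """
<html>
  <body>
    <div id="app"></div>
    <script>
      document.getElementById('app').innerText = 'Full article text...';
    </script>
  </body>
</html>
"""

def visible_text(html: str) -> str:
    # Approximate what a non-rendering crawler would extract: strip the
    # script bodies, then the remaining tags.
    without_scripts = re.sub(r"<script.*?</script>", "", html, flags=re.S)
    return re.sub(r"<[^>]+>", "", without_scripts).strip()

print(repr(visible_text(RAW_HTML)))  # '' -- nothing to index before rendering
# Only after executing the JavaScript (e.g. in a headless browser) would
# Googlebot see "Full article text..." and be able to assess the page.
```

This is exactly the ambiguous “empty page” case Martin mentions: the raw HTML looks empty, so it’s worth rendering before judging.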
Martin Splitt’s full presentation is available in the webinar video, Exploring the Art of Rendering.
In a nutshell
In conclusion, Google treats AI-generated content the same way it treats content written by humans: it must comply with the search engine’s quality guidelines.