Did you know that Google Search checks about 4 billion host names every day for robots.txt purposes? Gary Illyes said in the December Search Off The Record podcast, "we have about 4 billion host names that we check every single day for robots.txt."
He said this at the 20:31 mark in the video. He added that if they check 4 billion host names every day, then "the number of sites is probably over or very likely over 4 billion."
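For context, a robots.txt check is the lookup any well-behaved crawler does before fetching a URL from a host. Here is a minimal sketch of what a single such check looks like using Python's standard-library urllib.robotparser; the host and URL are illustrative placeholders, and this is of course not how Google does it at 4-billion-hostname scale:

```python
from urllib import robotparser

# Illustrative only: fetch and evaluate one host's robots.txt,
# the kind of per-hostname check described in the podcast.
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # hypothetical host
rp.read()  # downloads and parses the robots.txt file

# Ask whether a given user agent may fetch a given URL on that host.
print(rp.can_fetch("Googlebot", "https://example.com/some-page"))
```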
I spotted this video via Glenn Gabe:
Google's Gary Illyes in the latest SOTR Podcast: Google has about 4 billion hostnames that it checks every single day for robots.txt https://t.co/Irc2outOM4 pic.twitter.com/lyb68pnR7d
— Glenn Gabe (@glenngabe) December 22, 2023
Here is the transcript:
GARY ILLYES: Yeah, and I mean, that's one of the things that we brought up early on. If we implement something or if we come up with or suggest something that would work, it should not put more strain on publishers, because if you think about it, if you go through our robots.txt cache, you can see that we have about 4 billion host names that we check every single day for robots.txt. Now, let's say that all of those have subdirectories, for example. So the number of sites is probably over or very likely over 4 billion.
JOHN MUELLER: How many of those are in Search Console? I wonder.
GARY ILLYES: John, stop it.
JOHN MUELLER: I'm sorry.
GARY ILLYES: Anyway, so if you have 4 billion hostnames plus a bunch more in subdirectories, then how do you implement something that won't make them go bankrupt when they want to implement some opt-out mechanism?
JOHN MUELLER: It's difficult.
GARY ILLYES: It's difficult. And I know that people are frustrated that we don't have something already. But it's not something to–
MARTIN SPLITT: Be taken lightly, yeah.
GARY ILLYES: Yeah.
Here is the video embed at the start time:
Forum discussion at X.