Google has updated its help documentation for Google crawlers and added the Google Safety crawler to the list of special-case crawlers. This crawler isn't new, but Google decided to document it because they "received many questions" about it over the past year.
The Google Safety crawler goes by the full user agent string of "Google-Safety."

The Google-Safety user agent handles abuse-specific crawling, such as malware discovery for publicly posted links on Google properties.

The Google-Safety user agent ignores robots.txt rules.
Why does it ignore robots.txt? I assume because it has to check pages and directories that may not be safe in order to protect searchers and users.
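Since robots.txt directives won't stop this crawler, site owners who want visibility into its visits would have to look at their server logs instead. Below is a minimal sketch of that idea in Python; the log file path and format are assumptions for illustration, and only the "Google-Safety" user agent token comes from Google's documentation.

```python
# Minimal sketch: flag access-log lines from the Google-Safety crawler
# by matching its user agent token. The log path and log format are
# assumptions; only the "Google-Safety" token is from Google's docs.
import re
import sys

SAFETY_UA = re.compile(r"Google-Safety", re.IGNORECASE)


def find_safety_hits(log_path: str) -> list[str]:
    """Return log lines whose user agent string mentions Google-Safety."""
    hits = []
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if SAFETY_UA.search(line):
                hits.append(line.rstrip())
    return hits


if __name__ == "__main__":
    # Default to a hypothetical access.log in the current directory.
    path = sys.argv[1] if len(sys.argv) > 1 else "access.log"
    for hit in find_safety_hits(path):
        print(hit)
```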
Forum discussion at X.

Image credit to Lizzi.