There is an interesting discussion on LinkedIn around what happens when a robots.txt serves a 503 for two months while the rest of the site is accessible. Gary Illyes from Google said that when other pages on the site are reachable and accessible, that makes a big difference, but when those other pages are not, then "you're out of luck," he wrote.
Note, he specified the home page and other "important" pages as needing to be accessible…
The thread was posted by Carlos Sánchez Donate on LinkedIn, where he asked, "what would happen if the robots.txt is 503 for 2 months and the rest of the site is accessible?"
Gary Illyes from Google responded:
I'm not sure if we need to add more nuance to it; see last sentence. One aspect that's usually left out is whether our crawlers can consistently reach the homepage (or some other important pages? don't remember) while the robotstxt is unreachable. If they can, then the site would be in an okay, albeit limbo, state, but still served. If we get errors for the important page too, you're out of luck. With robotstxt http errors you really just want to focus on fixing the reachability as soon as possible.
The question was whether there should be more clarification on robots.txt 5xx error handling in the documentation, or whether it isn't worth addressing.
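If you want to sanity check where a site stands on this, a quick script can surface the two facts Gary describes: whether robots.txt is returning a 5xx and whether the home page is still reachable. Below is a minimal sketch using Python's standard library; example.com is a placeholder domain, and it only reports what your own client sees, not how Googlebot is actually being served.

```python
# Minimal diagnostic sketch: report the HTTP status of a site's robots.txt
# and home page. "example.com" is a placeholder; this shows what your client
# sees, which may differ from what Googlebot is served.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def status_of(url: str) -> str:
    req = Request(url, headers={"User-Agent": "status-check/1.0"})
    try:
        with urlopen(req, timeout=10) as resp:
            return str(resp.status)          # 2xx/3xx responses
    except HTTPError as e:                   # 4xx/5xx responses land here
        return str(e.code)
    except URLError as e:                    # DNS failures, timeouts, refused connections
        return f"unreachable ({e.reason})"

for url in ("https://example.com/robots.txt", "https://example.com/"):
    print(url, "->", status_of(url))
```

If robots.txt shows a 5xx but the home page returns a 200, that is the "limbo" situation described above; if both error out, fixing server reachability comes first.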
It is a super interesting thread, so I recommend you scan through it if it interests you. Of course, most of you will say, just fix the 5xx errors and don't worry about this. But many SEOs like to wonder about the what-if situations.
Here is a screenshot of this discussion, but again, there is a lot more there, so check it out:
Forum discussion at LinkedIn.