Gary Illyes shared a fun little tidbit on LinkedIn about robots.txt files. He said that only a tiny number of robots.txt files are over 500 kilobytes. Most robots.txt files have just a few lines of text, so this makes sense, but it's still a nice piece of data.
Gary looked at over a billion robots.txt files that Google Search knows about and said only 7,188 of them were over 500 KiB. That's less than 0.000719% of them.
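The share works out to a vanishingly small fraction; a quick sketch of the arithmetic, assuming a round one-billion denominator for the files Google knows about:

```python
# Rough arithmetic from Gary's numbers: 7,188 oversized files out of
# roughly 1 billion robots.txt files (the round figure is an assumption).
oversized = 7_188
total = 1_000_000_000

pct = oversized / total * 100  # share of files over 500 KiB, as a percentage
print(f"{pct:.6f}%")  # → 0.000719%
```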
He wrote, “One would think that out of the billions (yes, with a 🐝) of robots.txt files Google knows of more than 7188 would be larger in byte size than the 500kiB processing limit. Alas. No.”
Yea, the SEO point here is that Google can process up to 500 KiB of your robots.txt file, but most of these files don't even come close to that size.
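If you're curious where your own file falls, a minimal sketch (not Google's implementation) of checking a robots.txt payload against that 500 KiB processing limit:

```python
# Google processes at most the first 500 KiB of a robots.txt file.
LIMIT_BYTES = 500 * 1024  # 500 KiB = 512,000 bytes

def over_processing_limit(robots_txt: bytes) -> bool:
    """True if the file is larger than the 500 KiB processing limit."""
    return len(robots_txt) > LIMIT_BYTES

# A typical robots.txt is only a few lines, nowhere near the limit.
typical = b"User-agent: *\nDisallow: /search\nSitemap: https://example.com/sitemap.xml\n"
print(over_processing_limit(typical))  # → False
```

In practice you'd fetch `https://yoursite.com/robots.txt` and pass the response body in; the comparison itself is just a byte count.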
Forum discussion at LinkedIn.