The Googlebot documentation was updated today. What's your take?
It says:

> Googlebot can crawl the first 15MB of content in an HTML file or supported text-based file. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of content for indexing. Other crawlers may have different limits.
15MB is a LOT of data.
Imagine 15 million characters multiplied by the number of sites available on the Internet...
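To get a feel for the scale, here is a minimal Python sketch of that 15MB budget as a byte count. The limit value comes from the quoted docs; the helper function and sample page are purely illustrative:

```python
# Googlebot's documented crawl window: the first 15 MB of the file.
GOOGLEBOT_LIMIT = 15 * 1024 * 1024  # 15,728,640 bytes

def within_crawl_limit(html_bytes: bytes, limit: int = GOOGLEBOT_LIMIT) -> bool:
    """Return True if the whole document fits inside the first `limit` bytes."""
    return len(html_bytes) <= limit

# Even a hefty half-megabyte HTML page is nowhere near the limit.
page = b"<html>" + b"x" * 500_000 + b"</html>"
print(within_crawl_limit(page))  # True
```

Note that the limit applies to the HTML file itself, not to images or other resources it references, so very few real pages come close to it.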
I think 15MB is an upper limit rather than a typical page size. I've also seen some e-commerce websites where Googlebot ends up crawling unwanted content.