What is Robots.txt?
A robots.txt file tells search engine crawlers which URLs they can or cannot access on a website. It’s mainly used to prevent search engines from wasting crawl budget on unimportant pages. The robots.txt file doesn’t, however, prevent webpages from being indexed: a disallowed URL can still appear in search results if other pages link to it. To keep a page out of the index entirely, use a noindex directive instead.
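As a minimal sketch, a robots.txt file lives at the root of the domain (e.g. https://www.example.com/robots.txt) and might look like the following; the blocked paths and the sitemap URL here are hypothetical:

```
# Apply the rules below to all crawlers.
User-agent: *

# Hypothetical low-value sections to keep out of the crawl budget.
Disallow: /search/
Disallow: /cart/

# Optionally tell crawlers where the XML sitemap lives (URL is an assumption).
Sitemap: https://www.example.com/sitemap.xml
```

The User-agent line selects which crawler the rules apply to (* matches all of them), and each Disallow line names a path prefix that crawler should skip.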