Robots.txt

The Robots Exclusion Protocol (also known as "robots.txt") is a standard used by websites to tell web crawlers and other web robots which parts of the site they are allowed to visit. A site publishes its rules in a plain-text file named robots.txt at the root of the site. Compliance is voluntary: the file advises cooperating crawlers rather than enforcing access control.
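
As a minimal sketch, a robots.txt file served at a site's root (for example, https://example.com/robots.txt) might look like the following. The directives (User-agent, Disallow, Allow) are standard; the paths and the "BadBot" agent name are hypothetical:

    # Rules for all crawlers
    User-agent: *
    Disallow: /private/
    Allow: /

    # Block one specific crawler entirely (hypothetical agent name)
    User-agent: BadBot
    Disallow: /

A cooperating crawler fetches and parses this file before requesting other URLs. The sketch below shows one way to do that check in Python using the standard library's urllib.robotparser module; the site, crawler name, and page URL are placeholders:

    # Minimal sketch: consult robots.txt before crawling a page.
    # The domain, user-agent string, and path are hypothetical.
    from urllib import robotparser

    parser = robotparser.RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()  # download and parse the site's robots.txt

    # can_fetch() returns True if this user agent may crawl the URL
    url = "https://example.com/private/page.html"
    if parser.can_fetch("MyCrawler", url):
        print("Allowed to crawl:", url)
    else:
        print("Disallowed by robots.txt:", url)

Because the protocol is advisory, a check like this is a courtesy performed by the crawler itself; nothing in robots.txt prevents a non-compliant robot from fetching the disallowed URLs anyway.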