Crawler directives tell search engines how to treat your pages: a robots.txt file controls which URLs they may crawl, while meta robots tags control whether a crawled page may be indexed or its links followed.
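As a minimal sketch (the paths here are illustrative, not a recommendation), a robots.txt rule and a meta robots tag might look like this:

```
# robots.txt — placed at the site root; governs crawling
User-agent: *
Disallow: /admin/
```

```html
<!-- meta robots — placed in a page's <head>; governs indexing -->
<meta name="robots" content="noindex, follow">
```

Note that the two work at different stages: if robots.txt blocks a page from being crawled, the crawler never sees its meta robots tag, so a `noindex` directive there would have no effect.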