A robots.txt file typically contains instructions about which pages should or should not be crawled. For example:
User-agent: *
Disallow: /private/
This tells all crawlers (User-agent: *) not to crawl any pages under the /private/ directory. Note that robots.txt controls crawling, not indexing: a disallowed page can still appear in search results if other sites link to it. It's important to consider carefully which parts of your site to restrict, balancing crawl efficiency against site visibility.
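To see how these rules behave in practice, here is a minimal sketch using Python's standard-library urllib.robotparser to test whether a given URL may be fetched. The example.com domain and the wildcard user-agent are illustrative assumptions, not details from the text above.

import urllib.robotparser

# Point the parser at a site's robots.txt (hypothetical domain).
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetches and parses the robots.txt file

# can_fetch() applies the same User-agent / Disallow matching
# that a well-behaved crawler performs before requesting a page.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False if /private/ is disallowed
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True if no rule blocks it

Checking URLs this way before crawling is how polite crawlers honor the rules above, since robots.txt is advisory and nothing enforces it on the server side.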