Robots.txt

Robots.txt is a tiny but powerful file that sits in your website's root directory. It tells search engine bots which parts of your site they may crawl and which to stay out of. Think of it as a "Do Not Enter" sign for specific parts of your website. One caveat: it only controls crawling, not indexing — a blocked page can still show up in search results if other sites link to it.

You can use it to keep crawlers away from admin panels, internal search results, or duplicate pages. The syntax is simple: group "Disallow" rules under a "User-agent" line that names the bot they apply to. But be careful — one wrong line can accidentally block your entire website from Google. Always double-check your robots.txt file after making changes.
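A minimal robots.txt might look like this (the paths and domain are just examples):

```
# Rules for all crawlers
User-agent: *
# Keep bots out of the admin panel and internal search pages
Disallow: /admin/
Disallow: /search

# Everything not disallowed is crawlable by default.
# Optionally point crawlers at your sitemap:
Sitemap: https://www.example.com/sitemap.xml
```

And here is the mistake to watch for: a single `Disallow: /` under `User-agent: *` tells every bot to skip the entire site — exactly the one-wrong-line scenario described above.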
