The robots.txt file on your website tells search engines how you want them to crawl it — specifically, which directories or files you don't want appearing in the SERPs.

For DNN this deserves special attention, because you will often see DNN's supporting directories show up in search results.

The easiest way to avoid this is to take the sample robots.txt file that DotNetNuke.com provides and build your way up from there.
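As a rough illustration, a DNN-oriented robots.txt typically blocks the framework's supporting directories while leaving the rest of the site crawlable. The directory list below is a sketch of the common pattern, not the official sample — use the actual file from DotNetNuke.com as your starting point:

```
# Sketch of a DNN-style robots.txt (directory list is illustrative;
# start from the official DotNetNuke.com sample instead)
User-agent: *
Disallow: /admin/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /bin/
Disallow: /Components/
Disallow: /Install/
Disallow: /Providers/
```

Each `Disallow` line keeps crawlers out of one directory; everything not listed remains open to indexing.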

You can then use Google's robots.txt testing tool to check your file and make sure all your ducks are in a row.
