This module allows advanced MediaSpace admins to configure the rules that search engines follow when crawling the site.
When this module is enabled, the site exposes a "robots.txt" file containing the content configured by the admin.
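For illustration only, a typical robots.txt might look like the sketch below. The directives shown (the "/admin/" path and the sitemap URL) are placeholder assumptions, not defaults provided by the module; the actual content is whatever the admin enters in the module configuration.

    # Example robots.txt (placeholder paths and URL)
    User-agent: *
    Disallow: /admin/
    Allow: /

    Sitemap: https://example.com/sitemap.xml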
For more information, please visit:
http://www.robotstxt.org/robotstxt.html
https://en.wikipedia.org/wiki/Robots_exclusion_standard
This module is not relevant when the site is restricted to logged-in users only, since search engines cannot crawl it in that case.