Protection from search bots
Posted: September 16th, 2011, 5:25 pm
This may be a leap, but I was expecting pages protected by s2Member to carry a "noindex" meta tag. Is there a way to enforce that, so that search bots do not index those pages?
Since the site is dynamic and content is edited by users (not by a webmaster), I don't think manually updating a robots.txt file every time content changes is a good answer.
Please let me know if I have overlooked an s2Member feature that does this, or whether (as an alternative) there is an automated way to update robots.txt so it covers all of the pages that s2Member guards.
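To make concrete what I have in mind, here is a rough, untested sketch of the kind of hook I was hoping s2Member (or a small snippet in a theme's functions.php) could provide. It assumes an s2Member conditional along the lines of is_protected_by_s2member() that accepts a post ID; that name is my guess from the API docs, so please treat it as a placeholder.

<?php
// Rough sketch, untested: print a "noindex" robots meta tag for any
// singular post/page that s2Member protects, so search bots skip it.
// ASSUMPTION: s2Member exposes a conditional named is_protected_by_s2member()
// that accepts a post ID -- I have not verified the exact name/signature.
function my_noindex_s2member_protected() {
    if (!is_singular())
        return; // only individual posts/pages matter here

    $post_id = get_queried_object_id();

    if (function_exists('is_protected_by_s2member') && is_protected_by_s2member($post_id))
        echo '<meta name="robots" content="noindex,nofollow" />' . "\n";
}
add_action('wp_head', 'my_noindex_s2member_protected');
?>

Something along those lines would avoid touching robots.txt at all, which is why a meta-tag approach seems preferable for a site where users add and edit content themselves.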
Thanks