Protection from search bots

Posted: September 16th, 2011, 5:25 pm
by vbsql7
This may be a leap, but I was expecting protected pages that are managed by s2Member to have a "noindex" meta tag. Is there a way to enforce that so that the search bots do not index the page?

Since the site is dynamic and its content is edited by users rather than a webmaster, frequently updating a robots.txt file by hand is not a practical answer.

Please let me know if I have overlooked an s2Member feature that does this, or if (as an alternative) there is an automated way to update robots.txt to cover all of the pages that s2Member guards.
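For what it's worth, the automated robots.txt update described above could be sketched roughly as follows. This is a generic illustration, not an s2Member feature: the list of protected paths is a placeholder for whatever your site would report, and `build_robots_txt` is a hypothetical helper name.

```python
# Sketch: regenerate robots.txt from a list of protected paths.
# The paths used here are placeholders; a real script would pull
# them from the site's own list of s2Member-protected pages.

def build_robots_txt(protected_paths, user_agent="*"):
    """Return robots.txt text disallowing each protected path."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in sorted(protected_paths)]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    paths = ["/members-only/", "/downloads/premium/"]
    print(build_robots_txt(paths), end="")
```

Note that robots.txt only asks well-behaved crawlers not to fetch those URLs; it does not itself protect the content.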

Thanks

Re: Protection from search bots

Posted: September 25th, 2011, 3:32 pm
by Cristián Lávaque
Bots don't get access to protected content. After all, bots are not logged in to your site, so they can only see what non-members can see. Protected content won't be shown to them; usually they'll be redirected to the Membership Options Page instead.
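The behavior described above can be sketched in a few lines. This is an illustrative model of how a membership plugin generally handles an anonymous request, not s2Member's actual code; the function and URL names are assumptions.

```python
# Sketch: a visitor who is not logged in (a crawler never is) does
# not receive protected content; the request is redirected to the
# Membership Options Page instead. Names here are illustrative.

MEMBERSHIP_OPTIONS_URL = "/membership-options/"

def serve(page_is_protected: bool, user_logged_in: bool, content: str):
    """Return (status, body) the way a membership plugin might."""
    if page_is_protected and not user_logged_in:
        # Redirect; the protected content is never sent, so there
        # is nothing for a search bot to index.
        return 302, MEMBERSHIP_OPTIONS_URL
    return 200, content
```

Since a crawler is never logged in, the only response it ever gets for a protected page is the redirect, which is why the protected content itself does not end up in a search index.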