Since the site is dynamic and its content is edited by users (not by a webmaster), frequently updating the robots.txt file by hand is not a workable answer.
Please let me know if I have overlooked an s2Member feature that does this, or if (as an alternative) there is an automated way to update robots.txt so that it covers all of the pages s2Member guards.
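For what it's worth, the kind of automation I have in mind would look something like this. This is only a rough sketch in Python rather than WordPress/PHP, purely for illustration; the function name and the hard-coded path list are hypothetical, and in practice the list of protected pages would have to come from whatever s2Member stores about restricted content:

```python
# Hypothetical sketch: regenerate robots.txt from a list of protected URIs.
# In a real setup the list would be queried from the CMS (e.g. the pages
# s2Member marks as members-only); here it is hard-coded for illustration.

def build_robots_txt(protected_paths, sitemap_url=None):
    """Return robots.txt text disallowing each protected path for all crawlers."""
    lines = ["User-agent: *"]
    for path in sorted(set(protected_paths)):
        # robots.txt rules are root-relative, so normalize the leading slash.
        if not path.startswith("/"):
            path = "/" + path
        lines.append(f"Disallow: {path}")
    if sitemap_url:
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Example run with two made-up protected paths.
    print(build_robots_txt(["/members/", "downloads/premium/"]))
```

A script like this could be run on a schedule (or hooked into the save event for a page) so robots.txt stays in sync without anyone editing it manually.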
Thanks

Statistics: Posted by vbsql7 — September 16th, 2011, 5:25 pm