robots.txt: Disallow /quarantine

Restrict crawlers from the quarantined-content gate via robots.txt.

Also add a robots=noindex,nofollow meta tag to the /quarantine page.
Author: Florence Yeun
Date: 2015-07-24 10:16:02 -07:00
Parent: cd4f3b1f1d
Commit: 4868801ada

2 changed files with 4 additions and 2 deletions


@@ -89,10 +89,11 @@ class PostController(ApiController):
     )
     def GET_quarantine(self, dest):
         sr = UrlParser(dest).get_subreddit()
-        return BoringPage(_("opt in to potentially offensive content?"),
+        return BoringPage(
+            _("opt in to potentially offensive content?"),
             content=Quarantine(sr.name),
             show_sidebar=False,
-        ).render()
+            robots='noindex,nofollow').render()
 
     @validate(VModhash(fatal=False),
               over18 = nop('over18'),


@@ -54,5 +54,6 @@ Disallow: /reddits/search
 Disallow: /search
 Disallow: /r/*/search
 Disallow: /over18
+Disallow: /quarantine
 Allow: /
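
As a quick offline sanity check (not part of this commit), Python's standard urllib.robotparser can confirm that the added rule blocks the gate page while ordinary paths stay crawlable. This is only a sketch: it assumes a User-agent: * group earlier in robots.txt (not visible in this hunk), copies just the directives shown above, and uses /r/some_subreddit purely as an illustrative crawlable path.

# Sanity-check sketch, not reddit code: exercise the updated robots.txt
# rules with the standard library.
from urllib.robotparser import RobotFileParser

# Assumed "User-agent: *" header; the Disallow/Allow lines are copied
# from the hunk above.
rules = """\
User-agent: *
Disallow: /over18
Disallow: /quarantine
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The quarantine opt-in gate should now be off limits to compliant crawlers.
assert not parser.can_fetch("*", "/quarantine")
# An ordinary path (hypothetical example) remains crawlable via Allow: /.
assert parser.can_fetch("*", "/r/some_subreddit")

The robots='noindex,nofollow' argument in the controller change complements this: per the commit message it adds a robots meta tag to the gate page, which tells search engines not to index the page or follow its links if they reach it by some other route.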