This will allow approval rules to react to things being removed by the
spam filter. Currently, AutoMod often checks items before the spam
filter finishes processing them, so it has already decided that it
doesn't need to auto-approve an item before the filter has even
decided to remove it.
The server side was correctly adding the box but the client side was
ignoring it because r.config.gold was false. This removes the
r.config.gold check and relies on the comment visits box being there to
indicate if the user should have access to the feature.
In reworking new comment highlighting I introduced a regression that
caused child comments to share the timestamp of their parent regardless
of their own time. This was caused by an insufficiently specific
selector.
The structure of a nested comment view looks like:
<div class="comment">
  <div class="entry">
    <p class="tagline">
      <time>
      <time class="edited-timestamp">
  <div class="child">
    <div class="listing">
      <div class="comment">
        ...
Selecting '.tagline time' from beneath '.comment' would pick up child
comment timestamps as well and we'd overwrite their timestamp cache. We
would also pick up edited timestamps, but that doesn't appear to do
anything bad since they're not live.
This fixes the bug by specifically sticking to the direct descendants.
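A minimal sketch of the scoped selection (illustrative, not the actual
code; a jQuery-era implementation would likely use something like
.children() instead of the modern :scope syntax):

```javascript
// Hypothetical sketch of the fix: restrict timestamp selection to a single
// comment's own tagline. The ':scope >' prefix means only this comment's
// direct .entry is considered, so timestamps inside .child > .listing >
// .comment are never matched; :not(.edited-timestamp) skips edit times.
function ownTimestamps(comment) {
  return comment.querySelectorAll(
    ':scope > .entry > .tagline > time:not(.edited-timestamp)');
}
```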
Previously, the server would check the user's previous visits when
rendering a comment page and add comment-period-N classes to comments
depending on where they fell in relation to those visits. The client
side would then add or remove a new-comment class to every comment with
the appropriate (or older) comment-period class on first load or when
the previous visit selection changed.
This removes that server-side addition of comment-period-N classes and
replaces it with ScrollUpdater-based updating of comments based on their
actual timestamps. The goal is to reduce some server-side ugliness and
extraneous memcached lookups.
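The client-side rule that replaces the server classes boils down to a
timestamp comparison (a sketch with hypothetical names; the real code
hangs off the ScrollUpdater):

```javascript
// Hypothetical sketch: decide "new" purely from the comment's own timestamp,
// comparing it against the selected previous-visit time. No server-rendered
// comment-period-N classes are needed.
function isNewComment(commentTime, lastVisitTime) {
  // Both arguments are epoch milliseconds; a comment is "new" if it was
  // posted after the visit the user selected.
  return commentTime > lastVisitTime;
}
```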
Previously, this would cause silent crashes when trying to save if the
user attempted to use an unhashable type (generally a list) as the
value for `standard`, such as: "standard: [one, two]".
Previously, an invalid search check key like
"body+title (includes)#name" would fail to match the regex, and an
error would then be thrown when parse_match_fields_key() proceeded with
the match object being None. This caused the wiki validation to simply
fail to save with no error displayed at all.
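The shape of the fix is a null-check on the match before parsing
continues (a sketch; the function name mirrors the one above, but the
regex pattern and error type here are illustrative, not AutoMod's
actual ones):

```javascript
// Hypothetical sketch: validate the search-check key up front instead of
// letting a failed regex produce a null match that crashes later parsing.
// The pattern below is illustrative, not the real AutoMod pattern.
var MATCH_FIELDS_KEY = /^[\w+]+(\s*\(includes\))?$/;

function parseMatchFieldsKey(key) {
  var m = MATCH_FIELDS_KEY.exec(key);
  if (m === null) {
    // Surface a real validation error instead of crashing silently.
    throw new Error('invalid search check key: ' + key);
  }
  return m[0];
}
```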
This controller wraps up common functionality for controllers
that only serve endpoints that require OAuth to access. This includes
appropriate pagecaching (or lack thereof) and forced authentication
methods.
For loading new ads, use the visibilitychange event if supported (it
generally is). This means that, in theory, a new ad should load in one
of the following cases:
1. Active tab changes.
2. Browser is minimized then maximized.
3. Browser window is covered up then uncovered.
4. OS goes to sleep/is locked then woken up/unlocked.
This makes a lot more sense than the current trigger, which is just focus.
Unfortunately support for cases 2-4 is spotty, but almost all browsers support
case 1.
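A sketch of the trigger (the handler factory and its names are
illustrative; in the page it would be attached with
document.addEventListener):

```javascript
// Hypothetical sketch: fire the ad loader only when the page becomes
// visible again, rather than on every focus event.
function makeVisibilityHandler(loadAd, isHidden) {
  return function () {
    // visibilitychange fires both when the page is hidden and when it is
    // shown again; only the latter should trigger a new ad.
    if (!isHidden()) {
      loadAd();
    }
  };
}

// In the browser (sketch):
//   document.addEventListener('visibilitychange',
//       makeVisibilityHandler(loadNewAd,
//                             function () { return document.hidden; }));
```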
Loads a new ad when the user re-focuses the window, under the following
conditions:
1. Ad must be the active item in the spotlight box.
2. Ad must be visible (in the viewport and not hidden).
3. More than 1.5 seconds must have elapsed since the last ad was loaded.
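The three conditions above might be checked like this (the state shape
and names are hypothetical):

```javascript
// Hypothetical sketch of the gating logic; field names are illustrative.
var MIN_AD_INTERVAL_MS = 1500;

function shouldLoadNewAd(state, now) {
  return state.adIsActiveItem &&  // 1. active item in the spotlight box
         state.adIsVisible &&     // 2. in the viewport and not hidden
         now - state.lastAdLoadTime > MIN_AD_INTERVAL_MS;  // 3. throttle
}
```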
As reported in reddit/reddit#1291, we've been loading some images in our embed
widgets (the old ones, not the new comment embeds) over http. This causes
warnings in most browsers when the embedding page is loaded over https, since
we're dropping down to insecure elements.
Now we're always loading them over https. Alternatively, we could use
protocol-relative urls, but I figure there's no harm in always using https, and
it's simpler and causes fewer weird issues with browsers.
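The rewrite amounts to a small normalization (helper name
hypothetical):

```javascript
// Hypothetical sketch: normalize an image URL to https, covering both
// plain http URLs and protocol-relative ones (//example.com/img.png).
function forceHttps(url) {
  return url.replace(/^(https?:)?\/\//, 'https://');
}
```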
If we can't figure out a good image to hint as a thumbnail for a page via
`og:image`, we set it to the reddit snoo icon. However, we have been making
this a protocol-relative url. This doesn't appear to be against [the spec][0],
but it does create problems for some scrapers.
Now we force it to be an https url, which should resolve some of those issues.
[0]: http://opengraphprotocol.org/#url
Previously, automatic_reddits had two effects: they were added to the
list of default subscriptions, and also always forced to the front page
as long as the user hadn't unsubscribed. This change makes it so that
they are no longer added to the list of defaults, so that we can force
/r/modnews to the front page for mods without also effectively
subscribing every user to it by default.
The VUser way made it too easy to make a mistake. Something like
VUser('password')
wouldn't even check the password unless the `default` kwarg was set!