Splits promo traffic into separate tabs: one for the link traffic, one showing
campaign traffic stats in a table, and one for settings (which is just the
viewer list right now).
Gated by VSponsorAdmin for testing/feedback.
Updates traffic viewer emails to use new message string locations. Fixes
a bug that was preventing emails from being sent when a new traffic viewer
was added.
Allows admins/sponsors to edit some campaign fields even after the ad has gone
live. Target can be changed at any time. Bid can only be changed for freebies.
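A rough sketch of the intended rule; all names here (validate_live_edit,
is_freebie) are hypothetical, not the actual promote code:

    # Illustrative sketch only; the field names and the freebie check are
    # assumptions, not the real promote controller logic.
    def validate_live_edit(campaign, field, is_sponsor_admin):
        """Return True if `field` may still be edited on a live campaign."""
        if not is_sponsor_admin:
            return False
        if field == "target":
            # targeting may be changed at any time
            return True
        if field == "bid":
            # bids are only editable on freebie campaigns
            return campaign.is_freebie
        return False
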
Items at the end of the query are almost guaranteed to be a huge jump
from those in the non-pruned part of the query. To avoid confusion,
we'll cut the query off at MAX_CACHED_ITEMS, which is still imperfect
but better.
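One way to picture the cutoff (a sketch only; where the trim actually
happens and the helper names are assumptions):

    MAX_CACHED_ITEMS = 1000  # illustrative value

    def truncate_cached_query(items):
        # Trim the cached list so callers never see the stale, never-pruned
        # tail where the sort values jump wildly.
        return items[:MAX_CACHED_ITEMS]
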
Using safe_set_attr keeps the Account from being marked dirty, but it sets
the attributes directly on the object rather than in the _t dict. If the
account were later _commit()'ed, it would get stuck with these extra
attributes.
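A toy illustration of the distinction (not the real r2 Thing implementation):
normal assignment lands in _t and marks the object dirty, while a
safe_set_attr-style bypass leaves the value sitting on the instance.

    # Toy only; real Things track dirtiness and storage differently.
    class ToyThing(object):
        def __init__(self):
            object.__setattr__(self, "_t", {})       # persisted data attrs
            object.__setattr__(self, "_dirty", False)

        def __setattr__(self, name, value):
            # normal assignment: recorded in _t and the thing becomes dirty
            self._t[name] = value
            object.__setattr__(self, "_dirty", True)

        def safe_set_attr(self, name, value):
            # bypass: lands in the instance __dict__; _t and _dirty untouched
            object.__setattr__(self, name, value)

    a = ToyThing()
    a.safe_set_attr("extra", 1)
    assert not a._dirty and "extra" not in a._t
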
If a client sends a user to the authorization page with a
"duration=permanent" parameter, we'll create a refresh token in addition
to the initial access token. When the client fetches the initial access
token with the authorization code, the refresh token will also be returned.
The client can then obtain new access tokens using the refresh token
instead of an authorization code.
Both refresh tokens and access tokens will be displayed on the user's apps
page (as well as the IP history page).
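A client-side sketch of that flow using the usual OAuth2 parameter names;
the endpoint URL, client credentials, redirect URI, and the requests
dependency are all placeholders here:

    import requests

    TOKEN_URL = "https://www.reddit.com/api/v1/access_token"  # assumed
    client_auth = ("CLIENT_ID", "CLIENT_SECRET")  # HTTP basic auth placeholders

    # 1. Exchange the authorization code. Because the user authorized with
    #    duration=permanent, the response also carries a refresh_token.
    tokens = requests.post(TOKEN_URL, auth=client_auth, data={
        "grant_type": "authorization_code",
        "code": "CODE_FROM_REDIRECT",
        "redirect_uri": "https://example.com/callback",
    }).json()
    access_token = tokens["access_token"]
    refresh_token = tokens["refresh_token"]

    # 2. Later, trade the refresh token for a new access token; no new
    #    authorization code is required.
    tokens = requests.post(TOKEN_URL, auth=client_auth, data={
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
    }).json()
    access_token = tokens["access_token"]
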
Promotion calendar code was still using the link.campaigns attribute, which
no longer exists. This change updates it to use PromoCampaign things instead.
Fixes a bug where new promotions were not showing up in promoted/graph.
cloudsearch_q and scraper_q rely on external services (Amazon CloudSearch
and embed.ly, respectively) that must be configured manually before they
will run properly. To avoid a spew of useless error messages after
installation, we'll just set their consumer counts to zero.
template0 is the base template and should be immutable, while template1 is
the site-customized version. PostgreSQL defaults to template1 when doing
CREATE DATABASE. Unfortunately, in some situations template1 will have an
encoding set that makes it impossible to create the reddit database with
UTF-8 encoding, so the database is now created from template0 explicitly.
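A sketch of the workaround (database name, user, and locale settings are
assumptions, and psycopg2 stands in for whatever the install actually uses):

    import psycopg2

    # CREATE DATABASE cannot run inside a transaction block.
    conn = psycopg2.connect(dbname="postgres", user="postgres")
    conn.autocommit = True
    with conn.cursor() as cur:
        # Cloning template0 lets us pick the encoding freely; cloning
        # template1 would force us to inherit whatever encoding it has.
        cur.execute("""
            CREATE DATABASE reddit
                WITH TEMPLATE template0
                     ENCODING 'UTF8'
                     LC_COLLATE 'C'
                     LC_CTYPE 'C'
        """)
    conn.close()
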
Automated runs of the installer need to be able to configure the domain
ahead of time to ensure that the site will come up in a valid state. The
install script will now check for an environment variable, REDDIT_DOMAIN,
and use that value if it is present. If it is not present, the domain will
default to the previous value, "reddit.local".
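In Python terms the fallback is simply the following (the real install
script is shell; this just mirrors the logic):

    import os

    domain = os.environ.get("REDDIT_DOMAIN", "reddit.local")
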
If prunings fail, the number of items in a fast-moving cached query could
grow rather large. This change prevents a bad case where a pruning ends up
so big that it fails itself, causing the query to snowball in size.
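A sketch of the guard, with all names assumed: cap the list before doing
the real pruning work so a single pruning never has to rewrite an unbounded
amount of data, even if earlier prunings failed and let the query grow.

    MAX_CACHED_ITEMS = 1000  # illustrative value, as in the sketch above

    def prune_cached_query(items, is_stale):
        # Cap first, then prune; an oversized pruning tends to fail and
        # make the backlog worse.
        items = items[:MAX_CACHED_ITEMS]
        return [item for item in items if not is_stale(item)]
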
Update isn't even possible for the pure-Cassandra queries, and is dreadfully
slow on the SQL-backed queries. If we need to recalculate queries, that
should be done from a map/reduce job.
The original CachedQuery assumed an underlying SQL query to be cached. The
new CassandraCachedQuery can cache arbitrary Things by a specified sort,
while SqlCachedQuery keeps the same behavior as the old CachedQuery.
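A structural sketch of the resulting hierarchy; only the class names come
from this change, the bodies are assumptions:

    class CachedQuery(object):
        """Common caching/sorting/pruning machinery."""
        def __init__(self, sort):
            self.sort = sort

    class SqlCachedQuery(CachedQuery):
        """Backed by an underlying SQL query; the old CachedQuery behavior."""
        def __init__(self, query, sort):
            CachedQuery.__init__(self, sort)
            self.query = query

    class CassandraCachedQuery(CachedQuery):
        """Caches arbitrary Things, ordered by the supplied sort."""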