Messaging/commenting
===
 - add confidence sorting to comments
   * common values are precomputed for speedier response
   * best is made the default sort on comment pages
 - messages are now sent to users when they are made a moderator or contributor, or are banned
 - UI updates to the messaging page, including "show parent" functionality on messages
 - Remove the rate-limit on comments on your own self-posts
 - Give users some leeway in editing their comments: don't show an edit star if the edit is within the first few minutes of a comment's lifetime
 - Office Assistant will help users when they write to admins
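
The new "best" comment sort is commonly implemented as the lower bound of the Wilson score interval on the up/down vote ratio; here is a minimal sketch of that approach (the exact formula and confidence level used by this commit are assumptions, not shown in the changelog):

```python
from math import sqrt

def confidence(ups, downs):
    """Lower bound of the Wilson score interval for a comment's
    up/down votes -- one standard way to implement a "best" sort."""
    n = ups + downs
    if n == 0:
        return 0.0
    z = 1.281551565545   # z-score for an (assumed) 80% confidence level
    p = float(ups) / n
    left = p + z * z / (2 * n)
    right = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    under = 1 + z * z / n
    return (left - right) / under
```

Precomputing this value for common (ups, downs) pairs, as the bullet above notes, avoids redoing the square root on every page render.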

Backend
===
 - replace the postgres-based query_queue with an AMQP-based one
   * set up AMQP queues for async tasks such as search updates and the scrapers
   * service-monitor updates, adding queue-tracking support
 - Allow find_recent_broken_things to specify both from_time and to_time
 - add an ini-file parameter (disallow_db_writes) to disallow db writes (to create read-only reddit instances for crawlers)
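
The read-only switch can be sketched like this; only the parameter name (disallow_db_writes) comes from this commit's ini diff, and modern Python's configparser stands in for the Python 2 ConfigParser of the era:

```python
import configparser

def db_writes_allowed(ini_text):
    # read the flag from ini text; missing flag means writes are allowed
    cfg = configparser.ConfigParser()
    cfg.read_string(ini_text)
    return not cfg.getboolean("DEFAULT", "disallow_db_writes",
                              fallback=False)

def guard_writes(write_fn, ini_text):
    # wrap a db-write function so read-only crawler instances become no-ops
    def guarded(*args, **kwargs):
        if not db_writes_allowed(ini_text):
            return None
        return write_fn(*args, **kwargs)
    return guarded
```

In the real codebase the flag would be checked once at startup rather than per call; this is just the shape of the idea.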

New features
===
 - self-serve advertisement:
   * complete overhaul of sponsored link code
   * functions for talking with authorize.net
   * added pay domain and https support
   * added ability to share traffic from sponsored links
   * auto-reject promotions that are too old and unpaid for
 - awards
 - allow the widget's links to have a target (in case it is iframed)
 - automatic_reddits:
   * Don't show automatic_reddits in the horizontal topbar
 - Listing numbers are always in order with no gaps
 - add support for CSS sprites for common images (r2.lib.contrib.nymph)
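
CSS sprites combine many small images into one file so a page needs a single HTTP request for all of them; each image is then selected with a background-position offset. nymph's actual interface isn't shown here, so this is only a sketch of the core bookkeeping for a vertically stacked sheet:

```python
def sprite_offsets(images):
    """Given (name, width, height) tuples, compute each image's
    CSS background-position inside one vertically stacked sprite
    sheet, plus the total sheet height."""
    offsets = {}
    y = 0
    for name, w, h in images:
        offsets[name] = "background-position: 0 -%dpx" % y
        y += h
    return offsets, y
```

A real sprite builder (like nymph) would also rewrite the stylesheet's url() references to point at the combined image.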

Admin
===
 - added a takedown page for dealing with DMCA requests properly
   * status code 404 on takedown pages
   * JSON returns same string as in the explanation text
   * nofollow on markdown in explanation
   * title and image optional
 - Added /c/(comment_id) for admins
 - update the JS to rate-limit voting, commenting, and anything else that could just as easily be done by a script kiddie to cheat
 - make ad frame dynamic and add tracking pixel
 - add the ability to add a sponsored banner to the rightbox of a reddit
 - add the ability to show custom css on cnamed and/or non-cnamed versions of a reddit
 - allow us to ignore reports from report-spammers.
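
Rate limits of this kind are typically a sliding window over recent action timestamps. The commit's version lives in JS and server-side validators; the idea can be sketched in a few lines (all names here are illustrative, not from the codebase):

```python
import time

class RateLimiter:
    """Allow at most `limit` actions per `period` seconds (sliding window)."""

    def __init__(self, limit, period):
        self.limit = limit
        self.period = period
        self.stamps = []

    def allow(self, now=None):
        now = time.time() if now is None else now
        cutoff = now - self.period
        # drop timestamps that have fallen out of the window
        self.stamps = [t for t in self.stamps if t > cutoff]
        if len(self.stamps) < self.limit:
            self.stamps.append(now)
            return True
        return False
```

Client-side JS throttling like this only deters casual abuse; the server still has to enforce its own limits (as the VRatelimit validators in the diff below already do).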

Bugfixes
===
 - Fix sorting of duplicate links (patch by Chromakode)
 - fix traffic bug on main traffic page when it is the first of the month.
 - toolbar redirects to comments page on self posts rather than generating the frame
 - half-assed unicode handling in menus giving us bugs again.  Switched to the whole-ass approach
 - added Thing._byID36
 - Support /help/foo/bar
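
Thing._byID36 looks things up by their base-36 id, the short ids that appear in reddit URLs and fullnames. The conversion itself is simple; a sketch (from36 is just Python's int with a base, to36 is the inverse):

```python
DIGITS = "0123456789abcdefghijklmnopqrstuvwxyz"

def to36(n):
    """Convert a non-negative int to a base-36 string."""
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, 36)
        out.append(DIGITS[r])
    return "".join(reversed(out))

def from36(s):
    return int(s, 36)
```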
Author: KeyserSosa
Date:   2009-12-01 13:42:06 -08:00
Commit: bf9f43ccac (parent 1f1f0606f5)
195 changed files with 17270 additions and 3196 deletions


@@ -21,11 +21,14 @@
################################################################################
# Javascript files to be compressified
js_targets = jquery.js jquery.json.js jquery.reddit.js reddit.js
js_targets = jquery.js jquery.json.js jquery.reddit.js reddit.js ui.core.js ui.datepicker.js sponsored.js
# CSS targets
css_targets = reddit.css reddit-ie6-hax.css reddit-ie7-hax.css mobile.css spreadshirt.css
main_css = reddit.css
css_targets = reddit-ie6-hax.css reddit-ie7-hax.css mobile.css spreadshirt.css
SED=sed
CAT=cat
CSS_COMPRESS = $(SED) -e 's/ \+/ /' -e 's/\/\*.*\*\///g' -e 's/: /:/' | grep -v "^ *$$"
package = r2
static_dir = $(package)/public/static
@@ -41,7 +44,8 @@ PRIVATEREPOS = $(shell python -c 'exec "try: import r2admin; print r2admin.__pat
JSTARGETS := $(foreach js, $(js_targets), $(static_dir)/$(js))
CSSTARGETS := $(foreach css, $(css_targets), $(static_dir)/$(css))
RTLCSS = $(CSSTARGETS:.css=-rtl.css)
MAINCSS := $(foreach css, $(main_css), $(static_dir)/$(css))
RTLCSS = $(CSSTARGETS:.css=-rtl.css) $(MAINCSS:.css=-rtl.css)
MD5S = $(JSTARGETS:=.md5) $(CSSTARGETS:=.md5)
@@ -67,10 +71,10 @@ $(JSTARGETS): $(static_dir)/%.js : $(static_dir)/js/%.js
$(JSCOMPRESS) < $< > $@
$(CSSTARGETS): $(static_dir)/%.css : $(static_dir)/css/%.css
$(SED) -e 's/ \+/ /' \
-e 's/\/\*.*\*\///g' \
-e 's/: /:/' \
$< | grep -v "^ *$$" > $@
$(CAT) $< | $(CSS_COMPRESS) > $@
$(MAINCSS): $(static_dir)/%.css : $(static_dir)/css/%.css
python r2/lib/contrib/nymph.py $< | $(CSS_COMPRESS) > $@
$(RTLCSS): %-rtl.css : %.css
$(SED) -e "s/left/>####</g" \

r2/draw_load.py (new file, 87 lines)

@@ -0,0 +1,87 @@
from __future__ import with_statement
import Image, ImageDraw

colors = ["#FFFFFF", "#f0f5FF",
          "#E2ECFF", "#d6f5cb",
          "#CAFF98", "#e4f484",
          "#FFEA71", "#ffdb81",
          "#FF9191", "#FF0000"]

def get_load_level(host, nlevels = 8):
    # default number of cpus shall be 1
    ncpus = getattr(host, "ncpu", 1)
    # color code in nlevels levels
    return _load_int(host.load(), ncpus, nlevels = nlevels)

def _load_int(current, max_val, nlevels = 8):
    i = min(max(int(nlevels * current / max_val + 0.4), 0), nlevels + 1)
    return colors[i]

def draw_load(row_size = 12, width = 200, out_file = "/tmp/load.png"):
    from r2.lib import services
    a = services.AppServiceMonitor()
    hosts = list(a)
    number = (len([x for x in hosts if x.services]) +
              len([x for x in hosts if x.database]) +
              sum(len(x.queue.queues) for x in hosts if x.queue)) + 3
    im = Image.new("RGB", (width, number * row_size + 2))
    draw = ImageDraw.Draw(im)

    def draw_box(label, color, center = False):
        ypos = draw_box.ypos
        xpos = 1
        if center:
            w, h = draw.textsize(label)
            xpos = (width - w) / 2
        draw.rectangle(((1, ypos), (width - 2, ypos + row_size)), color)
        draw.text((xpos, ypos + 1), label, fill = "#000000")
        draw_box.ypos += row_size
    draw_box.ypos = 0

    draw_box(" ==== DATABASES ==== ", "#BBBBBB", center = True)
    for host in hosts:
        if host.database:
            draw_box(" %s load: %s" % (host.host, host.load()),
                     get_load_level(host))

    draw_box(" ==== SERVICES ==== ", "#BBBBBB", center = True)
    for host in hosts:
        if host.services:
            draw_box(" %s load: %s" % (host.host, host.load()),
                     get_load_level(host))

    draw_box(" ==== QUEUES ==== ", "#BBBBBB", center = True)
    for host in hosts:
        if host.queue:
            for name, data in host.queue:
                max_len = host.queue.max_length(name)
                draw_box(" %16s: %5s / %5s" % (name, data(), max_len),
                         _load_int(data(), max_len))

    with open(out_file, 'w') as handle:
        im.save(handle, "PNG")

def merge_images(out_file, file_names):
    images = []
    width = 0
    height = 0
    for f in file_names:
        images.append(Image.open(f))
        w, h = images[-1].size
        width = max(w, width)
        height += h
    total = Image.new("RGB", (width, height))
    height = 0
    for im in images:
        total.paste(im, (0, height))
        w, h = im.size
        height += h
    with open(out_file, 'w') as handle:
        total.save(handle, "PNG")


@@ -18,12 +18,33 @@ memcaches = 127.0.0.1:11211
permacaches = 127.0.0.1:11211
rendercaches = 127.0.0.1:11211
rec_cache = 127.0.0.1:11311
# site tracking urls. All urls are assumed to be to an image unless
# otherwise noted:
tracker_url =
adtracker_url =
adframetracker_url =
# for tracking clicks. Should be the url of a redirector
clicktracker_url =
traffic_url =
databases = main, comment, vote, change, email, query_queue
# for sponsored links:
payment_domain = https://pay.localhost/
authorizenetname =
authorizenetkey =
authorizenetapi =
min_promote_bid = 20
max_promote_bid = 9999
min_promote_future = 2
amqp_host = localhost:5672
amqp_user = guest
amqp_pass = guest
amqp_virtual_host = /
databases = main, comment, vote, change, email, authorize, award
#db name db host user, pass
main_db = newreddit, 127.0.0.1, ri, password
@@ -32,7 +53,8 @@ comment2_db = newreddit, 127.0.0.1, ri, password
vote_db = newreddit, 127.0.0.1, ri, password
change_db = changed, 127.0.0.1, ri, password
email_db = email, 127.0.0.1, ri, password
query_queue_db = query_queue, 127.0.0.1, ri, password
authorize_db = authorize, 127.0.0.1, ri, password
award_db = award, 127.0.0.1, ri, password
db_app_name = reddit
db_create_tables = True
@@ -65,6 +87,10 @@ db_table_report_account_comment = relation, account, comment, comment
db_table_report_account_message = relation, account, message, main
db_table_report_account_subreddit = relation, account, subreddit, main
db_table_award = thing, award
db_table_trophy = relation, account, award, award
disallow_db_writes = False
###
# Other magic settings
@@ -88,16 +114,22 @@ allowed_css_linked_domains = my.domain.com, my.otherdomain.com
css_killswitch = False
max_sr_images = 20
show_awards = False
login_cookie = reddit_session
domain = localhost
domain_prefix =
media_domain = localhost
default_sr = localhost
automatic_reddits =
admins =
sponsors =
paid_sponsors =
page_cache_time = 30
static_path = /static/
useragent = Mozilla/5.0 (compatible; bot/1.0; ChangeMe)
allow_shutdown = False
solr_url =
solr_cache_time = 300
@@ -124,7 +156,6 @@ MODWINDOW = 2
HOT_PAGE_AGE = 1
#
media_period = 10 minutes
rising_period = 12 hours
new_incubation = 90 seconds


@@ -60,6 +60,8 @@ def error_mapper(code, message, environ, global_conf=None, **kw):
d['cnameframe'] = 1
if environ.get('REDDIT_NAME'):
d['srname'] = environ.get('REDDIT_NAME')
if environ.get('REDDIT_TAKEDOWN'):
d['takedown'] = environ.get('REDDIT_TAKEDOWN')
#preserve x-sup-id when 304ing
if code == 304:
@@ -288,7 +290,7 @@ class DomainMiddleware(object):
sr_redirect = None
for sd in list(sub_domains):
# subdomains to disregard completely
if sd in ('www', 'origin', 'beta'):
if sd in ('www', 'origin', 'beta', 'pay'):
continue
# subdomains which change the extension
elif sd == 'm':
@@ -297,6 +299,7 @@ class DomainMiddleware(object):
environ['reddit-domain-extension'] = sd
elif (len(sd) == 2 or (len(sd) == 5 and sd[2] == '-')) and self.lang_re.match(sd):
environ['reddit-prefer-lang'] = sd
environ['reddit-domain-prefix'] = sd
else:
sr_redirect = sd
sub_domains.remove(sd)
@@ -357,6 +360,7 @@ class ExtensionMiddleware(object):
'mobile' : ('mobile', 'text/html; charset=UTF-8'),
'png' : ('png', 'image/png'),
'css' : ('css', 'text/css'),
'csv' : ('csv', 'text/csv; charset=UTF-8'),
'api' : (api_type(), 'application/json; charset=UTF-8'),
'json' : (api_type(), 'application/json; charset=UTF-8'),
'json-html' : (api_type('html'), 'application/json; charset=UTF-8')}


@@ -34,6 +34,7 @@ def make_map(global_conf={}, app_conf={}):
mc('/login', controller='front', action='login')
mc('/logout', controller='front', action='logout')
mc('/verify', controller='front', action='verify')
mc('/adminon', controller='front', action='adminon')
mc('/adminoff', controller='front', action='adminoff')
mc('/submit', controller='front', action='submit')
@@ -51,6 +52,7 @@ def make_map(global_conf={}, app_conf={}):
mc('/reddits/create', controller='front', action='newreddit')
mc('/reddits/search', controller='front', action='search_reddits')
mc('/reddits/login', controller='front', action='login')
mc('/reddits/:where', controller='reddits', action='listing',
where = 'popular',
requirements=dict(where="popular|new|banned"))
@@ -69,8 +71,9 @@ def make_map(global_conf={}, app_conf={}):
mc('/widget', controller='buttons', action='widget_demo_page')
mc('/bookmarklets', controller='buttons', action='bookmarklets')
mc('/stats', controller='front', action='stats')
mc('/awards', controller='front', action='awards')
mc('/i18n', controller='feedback', action='i18n')
mc('/feedback', controller='feedback', action='feedback')
mc('/ad_inq', controller='feedback', action='ad_inq')
@@ -79,6 +82,10 @@ def make_map(global_conf={}, app_conf={}):
mc('/admin/i18n/:action', controller='i18n')
mc('/admin/i18n/:action/:lang', controller='i18n')
mc('/admin/awards', controller='awards')
mc('/admin/awards/:awardcn/:action', controller='awards',
requirements=dict(action="give|winners"))
mc('/admin/:action', controller='admin')
mc('/user/:username/about', controller='user', action='about',
@@ -115,8 +122,20 @@ def make_map(global_conf={}, app_conf={}):
mc('/framebuster/:what/:blah',
controller='front', action = 'framebuster')
mc('/promote/edit_promo/:link', controller='promote', action = 'edit_promo')
mc('/promote/:action', controller='promote')
mc('/promoted/edit_promo/:link',
controller='promote', action = 'edit_promo')
mc('/promoted/pay/:link',
controller='promote', action = 'pay')
mc('/promoted/graph',
controller='promote', action = 'graph')
mc('/promoted/:action', controller='promote',
requirements = dict(action = "new_promo"))
mc('/promoted/:sort', controller='promote', action = "listing")
mc('/promoted/', controller='promoted', action = "listing",
sort = "")
mc('/health', controller='health', action='health')
mc('/shutdown', controller='health', action='shutdown')
mc('/', controller='hot', action='listing')
@@ -137,7 +156,7 @@ def make_map(global_conf={}, app_conf={}):
requirements=dict(action="password|random|framebuster"))
mc('/:action', controller='embed',
requirements=dict(action="help|blog"))
mc('/help/:anything', controller='embed', action='help')
mc('/help/*anything', controller='embed', action='help')
mc('/goto', controller='toolbar', action='goto')
mc('/tb/:id', controller='toolbar', action='tb')
@@ -145,6 +164,8 @@ def make_map(global_conf={}, app_conf={}):
requirements=dict(action="toolbar|inner|login"))
mc('/toolbar/comments/:id', controller='toolbar', action='comments')
mc('/c/:comment_id', controller='front', action='comment_by_id')
mc('/s/*rest', controller='toolbar', action='s')
# additional toolbar-related rules just above the catchall
@@ -152,6 +173,8 @@ def make_map(global_conf={}, app_conf={}):
mc('/resetpassword/:key', controller='front',
action='resetpassword')
mc('/verification/:key', controller='front',
action='verify_email')
mc('/resetpassword', controller='front',
action='resetpassword')
@@ -165,6 +188,8 @@ def make_map(global_conf={}, app_conf={}):
requirements=dict(action="login|register"))
mc('/api/gadget/click/:ids', controller = 'api', action='gadget', type='click')
mc('/api/gadget/:type', controller = 'api', action='gadget')
mc('/api/:action', controller='promote',
requirements=dict(action="promote|unpromote|new_promo|link_thumb|freebie|promote_note|update_pay|refund|traffic_viewer|rm_traffic_viewer"))
mc('/api/:action', controller='api')
mc('/captcha/:iden', controller='captcha', action='captchaimg')
@@ -184,6 +209,8 @@ def make_map(global_conf={}, app_conf={}):
mc('/authorize_embed', controller = 'front', action = 'authorize_embed')
mc("/ads/", controller = "front", action = "ad")
mc("/ads/:reddit", controller = "front", action = "ad")
# This route handles displaying the error page and
# graphics used in the 404/500
# error pages. It should likely stay at the top


@@ -33,6 +33,7 @@ api('templated', NullJsonTemplate)
# class specific overrides
api('link', LinkJsonTemplate)
api('promotedlink', PromotedLinkJsonTemplate)
api('comment', CommentJsonTemplate)
api('message', MessageJsonTemplate)
api('subreddit', SubredditJsonTemplate)
@@ -46,3 +47,4 @@ api('account', AccountJsonTemplate)
api('organiclisting', OrganicListingJsonTemplate)
api('reddittraffic', TrafficJsonTemplate)
api('takedownpane', TakedownJsonTemplate)


@@ -37,6 +37,7 @@ from listingcontroller import MyredditsController
from feedback import FeedbackController
from front import FrontController
from health import HealthController
from buttons import ButtonsController
from captcha import CaptchaController
from embed import EmbedController
@@ -44,6 +45,7 @@ from error import ErrorController
from post import PostController
from toolbar import ToolbarController
from i18n import I18nController
from awards import AwardsController
from promotecontroller import PromoteController
from mediaembed import MediaembedController


@@ -31,9 +31,10 @@ from r2.models.subreddit import Default as DefaultSR
import r2.models.thing_changes as tc
from r2.lib.utils import get_title, sanitize_url, timeuntil, set_last_modified
from r2.lib.utils import query_string, to36, timefromnow, link_from_url
from r2.lib.utils import query_string, link_from_url, timefromnow, worker
from r2.lib.utils import timeago
from r2.lib.pages import FriendList, ContributorList, ModList, \
BannedList, BoringPage, FormPage, NewLink, CssError, UploadedImage, \
BannedList, BoringPage, FormPage, CssError, UploadedImage, \
ClickGadget
from r2.lib.pages.things import wrap_links, default_thing_wrapper
@@ -44,31 +45,20 @@ from r2.lib.captcha import get_iden
from r2.lib.strings import strings
from r2.lib.filters import _force_unicode, websafe_json, websafe, spaceCompress
from r2.lib.db import queries
from r2.lib import amqp, promote
from r2.lib.media import force_thumbnail, thumbnail_url
from r2.lib.comment_tree import add_comment, delete_comment
from r2.lib import tracking, sup, cssfilter, emailer
from r2.lib.subreddit_search import search_reddits
from simplejson import dumps
from datetime import datetime, timedelta
from md5 import md5
from r2.lib.promote import promote, unpromote, get_promoted
class ApiController(RedditController):
"""
Controller which deals with almost all AJAX site interaction.
"""
def response_func(self, kw):
data = dumps(kw)
if request.method == "GET" and request.GET.get("callback"):
return "%s(%s)" % (websafe_json(request.GET.get("callback")),
websafe_json(data))
return self.sendstring(data)
@validatedForm()
def ajax_login_redirect(self, form, jquery, dest):
form.redirect("/login" + query_string(dict(dest=dest)))
@@ -89,7 +79,7 @@ class ApiController(RedditController):
@validatedForm(VCaptcha(),
name=VRequired('name', errors.NO_NAME),
email=ValidEmails('email', num = 1),
reason = VOneOf('reason', ('ad_inq', 'feedback')),
reason = VOneOf('reason', ('ad_inq', 'feedback', "i18n")),
message=VRequired('text', errors.NO_TEXT),
)
def POST_feedback(self, form, jquery, name, email, reason, message):
@@ -98,14 +88,17 @@ class ApiController(RedditController):
form.has_errors('text', errors.NO_TEXT) or
form.has_errors('captcha', errors.BAD_CAPTCHA)):
if reason != 'ad_inq':
emailer.feedback_email(email, message, name, reply_to = '')
else:
if reason == 'ad_inq':
emailer.ad_inq_email(email, message, name, reply_to = '')
elif reason == 'i18n':
emailer.i18n_email(email, message, name, reply_to = '')
else:
emailer.feedback_email(email, message, name, reply_to = '')
form.set_html(".status", _("thanks for your message! "
"you should hear back from us shortly."))
form.set_inputs(text = "", captcha = "")
form.find(".spacer").hide()
form.find(".btn").hide()
POST_ad_inq = POST_feedback
@@ -134,8 +127,6 @@ class ApiController(RedditController):
if g.write_query_queue:
queries.new_message(m, inbox_rel)
@validatedForm(VUser(),
VCaptcha(),
ValidDomain('url'),
@@ -148,7 +139,8 @@ class ApiController(RedditController):
save = VBoolean('save'),
selftext = VSelfText('text'),
kind = VOneOf('kind', ['link', 'self', 'poll']),
then = VOneOf('then', ('tb', 'comments'), default='comments'))
then = VOneOf('then', ('tb', 'comments'),
default='comments'))
def POST_submit(self, form, jquery, url, selftext, kind, title, save,
sr, ip, then):
#backwards compatability
@@ -223,7 +215,7 @@ class ApiController(RedditController):
#set the ratelimiter
if should_ratelimit:
VRatelimit.ratelimit(rate_user=True, rate_ip = True,
VRatelimit.ratelimit(rate_user=True, rate_ip = True,
prefix = "rate_submit_")
#update the queries
@@ -231,6 +223,9 @@ class ApiController(RedditController):
queries.new_link(l)
queries.new_vote(v)
# also notifies the searchchanges
worker.do(lambda: amqp.add_item('new_link', l._fullname))
#update the modified flags
set_last_modified(c.user, 'overview')
set_last_modified(c.user, 'submitted')
@@ -239,9 +234,6 @@ class ApiController(RedditController):
#update sup listings
sup.add_update(c.user, 'submitted')
# flag search indexer that something has changed
tc.changed(l)
if then == 'comments':
path = add_sr(l.make_permalink_slow())
elif then == 'tb':
@@ -250,7 +242,6 @@ class ApiController(RedditController):
form.redirect(path)
@validatedForm(VRatelimit(rate_ip = True,
rate_user = True,
prefix = 'fetchtitle_'),
@@ -286,7 +277,7 @@ class ApiController(RedditController):
@validatedForm(VRatelimit(rate_ip = True, prefix = 'login_',
error = errors.WRONG_PASSWORD),
user = VLogin(['user', 'passwd']),
dest = nop('dest'),
dest = VDestination(),
rem = VBoolean('rem'),
reason = VReason('reason'))
def POST_login(self, form, jquery, user, dest, rem, reason):
@@ -303,7 +294,7 @@ class ApiController(RedditController):
name = VUname(['user']),
email = ValidEmails("email", num = 1),
password = VPassword(['passwd', 'passwd2']),
dest = nop('dest'),
dest = VDestination(),
rem = VBoolean('rem'),
reason = VReason('reason'))
def POST_register(self, form, jquery, name, email,
@@ -347,7 +338,7 @@ class ApiController(RedditController):
@noresponse(VUser(),
VModhash(),
container = VByName('id'))
def POST_leave_moderator(self, container):
def POST_leavemoderator(self, container):
"""
Handles self-removal as moderator from a subreddit as rendered
in the subreddit sidebox on any of that subreddit's pages.
@@ -358,7 +349,7 @@ class ApiController(RedditController):
@noresponse(VUser(),
VModhash(),
container = VByName('id'))
def POST_leave_contributor(self, container):
def POST_leavecontributor(self, container):
"""
same comment as for POST_leave_moderator.
"""
@@ -371,7 +362,7 @@ class ApiController(RedditController):
nuser = VExistingUname('name'),
iuser = VByName('id'),
container = VByName('container'),
type = VOneOf('type', ('friend', 'moderator',
type = VOneOf('type', ('friend', 'moderator',
'contributor', 'banned')))
def POST_unfriend(self, nuser, iuser, container, type):
"""
@@ -435,8 +426,9 @@ class ApiController(RedditController):
form.set_html(".status:first", _("added"))
if new and cls:
user_row = cls().user_row(friend)
jquery("table").insert_table_rows(user_row)
jquery("#" + type + "-table").show(
).find("table").insert_table_rows(user_row)
if type != 'friend':
msg = strings.msg_add_friend.get(type)
subj = strings.subj_add_friend.get(type)
@@ -447,14 +439,19 @@ class ApiController(RedditController):
title = container.title)
msg = msg % d
subj = subj % d
Message._new(c.user, friend, subj, msg, ip)
item, inbox_rel = Message._new(c.user, friend,
subj, msg, ip)
if g.write_query_queue:
queries.new_message(item, inbox_rel)
@validatedForm(VUser('curpass', default = ''),
VModhash(),
VModhash(),
email = ValidEmails("email", num = 1),
password = VPassword(['newpass', 'verpass']))
def POST_update(self, form, jquery, email, password):
password = VPassword(['newpass', 'verpass']),
verify = VBoolean("verify"))
def POST_update(self, form, jquery, email, password, verify):
"""
handles /prefs/update for updating email address and password.
"""
@@ -467,11 +464,20 @@ class ApiController(RedditController):
# currently) apply it
updated = False
if (not form.has_errors("email", errors.BAD_EMAILS) and
email and (not hasattr(c.user,'email') or c.user.email != email)):
c.user.email = email
c.user._commit()
form.set_html('.status', _('your email has been updated'))
updated = True
email):
if (not hasattr(c.user,'email') or c.user.email != email):
c.user.email = email
# unverified email for now
c.user.email_verified = None
c.user._commit()
updated = True
if verify:
# TODO: rate limit this?
emailer.verify_email(c.user, request.referer)
form.set_html('.status',
_("you should be getting a verification email shortly."))
else:
form.set_html('.status', _('your email has been updated'))
# change password
if (password and
@@ -512,6 +518,8 @@ class ApiController(RedditController):
if not thing: return
'''for deleting all sorts of things'''
thing._deleted = True
if getattr(thing, "promoted", None) is not None:
promote.delete_promo(thing)
thing._commit()
# flag search indexer that something has changed
@@ -535,9 +543,13 @@ class ApiController(RedditController):
thing = VByName('id'))
def POST_report(self, thing):
'''for reporting...'''
if (thing and not thing._deleted and
not (hasattr(thing, "promoted") and thing.promoted)):
Report.new(c.user, thing)
if not thing or thing._deleted:
return
elif c.user._spam or c.user.ignorereports:
return
elif getattr(thing, 'promoted', False):
return
Report.new(c.user, thing)
@validatedForm(VUser(),
VModhash(),
@@ -555,7 +567,10 @@ class ApiController(RedditController):
kind = 'link'
item.selftext = text
item.editted = True
if (item._date < timeago('60 seconds')
or (item._ups + item._downs > 2)):
item.editted = True
item._commit()
tc.changed(item)
@@ -591,7 +606,8 @@ class ApiController(RedditController):
link = Link._byID(parent.link_id, data = True)
parent_comment = parent
sr = parent.subreddit_slow
if not sr.should_ratelimit(c.user, 'comment'):
if ((link.is_self and link.author_id == c.user._id)
or not sr.should_ratelimit(c.user, 'comment')):
should_ratelimit = False
#remove the ratelimit error if the user's karma is high
@@ -616,12 +632,13 @@ class ApiController(RedditController):
comment, ip)
item.parent_id = parent._id
else:
item, inbox_rel = Comment._new(c.user, link, parent_comment,
comment, ip)
item, inbox_rel = Comment._new(c.user, link, parent_comment,
comment, ip)
Vote.vote(c.user, item, True, ip)
# flag search indexer that something has changed
tc.changed(item)
# will also update searchchanges as appropriate
worker.do(lambda: amqp.add_item('new_comment', item._fullname))
#update last modified
set_last_modified(c.user, 'overview')
set_last_modified(c.user, 'commented')
@@ -724,7 +741,7 @@ class ApiController(RedditController):
def POST_vote(self, dir, thing, ip, vote_type):
ip = request.ip
user = c.user
if not thing:
if not thing or thing._deleted:
return
# TODO: temporary hack until we migrate the rest of the vote data
@@ -732,6 +749,8 @@ class ApiController(RedditController):
g.log.debug("POST_vote: ignoring old vote on %s" % thing._fullname)
return
# in a lock to prevent duplicate votes from people
# double-clicking the arrows
with g.make_lock('vote_lock(%s,%s)' % (c.user._id36, thing._id36)):
dir = (True if dir > 0
else False if dir < 0
@@ -846,26 +865,30 @@ class ApiController(RedditController):
return self.abort(403,'forbidden')
c.site.del_image(name)
c.site._commit()
@validatedForm(VSrModerator(),
VModhash())
def POST_delete_sr_header(self, form, jquery):
VModhash(),
sponsor = VInt("sponsor", min = 0, max = 1))
def POST_delete_sr_header(self, form, jquery, sponsor):
"""
Called when the user request that the header on a sr be reset.
"""
# just in case we need to kill this feature from XSS
if g.css_killswitch:
return self.abort(403,'forbidden')
if c.site.header:
if sponsor and c.user_is_admin:
c.site.sponsorship_img = None
c.site._commit()
elif c.site.header:
# reset the header image on the page
jquery('#header-img').attr("src", DefaultSR.header)
c.site.header = None
c.site._commit()
# reset the header image on the page
form.find('#header-img').attr("src", DefaultSR.header)
# hide the button which started this
form.find('#delete-img').hide()
form.find('.delete-img').hide()
# hide the preview box
form.find('#img-preview-container').hide()
form.find('.img-preview-container').hide()
# reset the status boxes
form.set_html('.img-status', _("deleted"))
@@ -885,8 +908,10 @@ class ApiController(RedditController):
VModhash(),
file = VLength('file', max_length=1024*500),
name = VCssName("name"),
header = nop('header'))
def POST_upload_sr_img(self, file, header, name):
form_id = VLength('formid', max_length = 100),
header = VInt('header', max=1, min=0),
sponsor = VInt('sponsor', max=1, min=0))
def POST_upload_sr_img(self, file, header, sponsor, name, form_id):
"""
Called on /about/stylesheet when an image needs to be replaced
or uploaded, as well as on /about/edit for updating the
@@ -909,13 +934,16 @@ class ApiController(RedditController):
try:
cleaned = cssfilter.clean_image(file,'PNG')
if header:
num = None # there is one and only header, and it is unnumbered
# there is one and only header, and it is unnumbered
resource = None
elif sponsor and c.user_is_admin:
resource = "sponsor"
elif not name:
# error if the name wasn't specified or didn't satisfy
# the validator
errors['BAD_CSS_NAME'] = _("bad image name")
else:
num = c.site.add_image(name, max_num = g.max_sr_images)
resource = c.site.add_image(name, max_num = g.max_sr_images)
c.site._commit()
except cssfilter.BadImage:
@@ -931,15 +959,18 @@ class ApiController(RedditController):
else:
# with the image num, save the image an upload to s3. the
# header image will be of the form "${c.site._fullname}.png"
# while any other image will be ${c.site._fullname}_${num}.png
new_url = cssfilter.save_sr_image(c.site, cleaned, num = num)
# while any other image will be ${c.site._fullname}_${resource}.png
new_url = cssfilter.save_sr_image(c.site, cleaned,
resource = resource)
if header:
c.site.header = new_url
elif sponsor and c.user_is_admin:
c.site.sponsorship_img = new_url
c.site._commit()
return UploadedImage(_('saved'), new_url, name,
errors = errors).render()
errors = errors, form_id = form_id).render()
@validatedForm(VUser(),
VModhash(),
@@ -950,21 +981,27 @@ class ApiController(RedditController):
name = VSubredditName("name"),
title = VLength("title", max_length = 100),
domain = VCnameDomain("domain"),
description = VLength("description", max_length = 500),
description = VLength("description", max_length = 1000),
lang = VLang("lang"),
over_18 = VBoolean('over_18'),
show_media = VBoolean('show_media'),
type = VOneOf('type', ('public', 'private', 'restricted')),
ip = ValidIP(),
ad_type = VOneOf('ad', ('default', 'basic', 'custom')),
ad_file = VLength('ad-location', max_length = 500),
sponsor_name =VLength('sponsorship-name', max_length = 500),
sponsor_url = VLength('sponsorship-url', max_length = 500),
css_on_cname = VBoolean("css_on_cname"),
)
def POST_site_admin(self, form, jquery, name ='', ip = None, sr = None, **kw):
def POST_site_admin(self, form, jquery, name, ip, sr, ad_type, ad_file,
sponsor_url, sponsor_name, **kw):
# the status button is outside the form -- have to reset by hand
form.parent().set_html('.status', "")
redir = False
kw = dict((k, v) for k, v in kw.iteritems()
if k in ('name', 'title', 'domain', 'description', 'over_18',
'show_media', 'type', 'lang',))
'show_media', 'type', 'lang', "css_on_cname"))
#if a user is banned, return rate-limit errors
if c.user._spam:
@@ -976,11 +1013,8 @@ class ApiController(RedditController):
if cname_sr and (not sr or sr != cname_sr):
c.errors.add(errors.USED_CNAME)
if not sr and form.has_errors(None, errors.RATELIMIT):
# this form is a little odd in that the error field
# doesn't occur within the form, so we need to manually
# set this text
form.parent().find('.RATELIMIT').html(c.errors[errors.RATELIMIT].message).show()
if not sr and form.has_errors("ratelimit", errors.RATELIMIT):
pass
elif not sr and form.has_errors("name", errors.SUBREDDIT_EXISTS,
errors.BAD_SR_NAME):
form.find('#example_name').hide()
@@ -996,13 +1030,17 @@ class ApiController(RedditController):
#sending kw is ok because it was sanitized above
sr = Subreddit._new(name = name, author_id = c.user._id, ip = ip,
**kw)
# will also update search
worker.do(lambda: amqp.add_item('new_subreddit', sr._fullname))
Subreddit.subscribe_defaults(c.user)
# make sure this user is on the admin list of that site!
if sr.add_subscriber(c.user):
sr._incr('_ups', 1)
sr.add_moderator(c.user)
sr.add_contributor(c.user)
redir = sr.path + "about/edit/?created=true"
redir = sr.path + "about/edit/?created=true"
if not c.user_is_admin:
VRatelimit.ratelimit(rate_user=True,
rate_ip = True,
@@ -1010,9 +1048,20 @@ class ApiController(RedditController):
#editting an existing reddit
elif sr.is_moderator(c.user) or c.user_is_admin:
if c.user_is_admin:
sr.ad_type = ad_type
if ad_type != "custom":
ad_file = Subreddit._defaults['ad_file']
sr.ad_file = ad_file
sr.sponsorship_url = sponsor_url or None
sr.sponsorship_name = sponsor_name or None
#assume sr existed, or was just built
old_domain = sr.domain
if not sr.domain:
del kw['css_on_cname']
for k, v in kw.iteritems():
setattr(sr, k, v)
sr._commit()
@@ -1028,6 +1077,8 @@ class ApiController(RedditController):
if redir:
form.redirect(redir)
else:
jquery.refresh()
@noresponse(VUser(), VModhash(),
VSrCanBan('id'),
@@ -1108,7 +1159,7 @@ class ApiController(RedditController):
user = c.user if c.user_is_loggedin else None
if not link or not link.subreddit_slow.can_view(user):
return self.abort(403,'forbidden')
if children:
builder = CommentBuilder(link, CommentSortMenu.operator(sort),
children)
@@ -1124,7 +1175,7 @@ class ApiController(RedditController):
cm.child = None
else:
items.append(cm.child)
return items
# assumes there is at least one child
# a = _children(items[0].child.things)
@@ -1191,10 +1242,11 @@ class ApiController(RedditController):
return
else:
emailer.password_email(user)
form.set_html(".status", _("an email will be sent to that account's address shortly"))
form.set_html(".status",
_("an email will be sent to that account's address shortly"))
@validatedForm(cache_evt = VCacheKey('reset', ('key', 'name')),
@validatedForm(cache_evt = VCacheKey('reset', ('key',)),
password = VPassword(['passwd', 'passwd2']))
def POST_resetpassword(self, form, jquery, cache_evt, password):
if form.has_errors('name', errors.EXPIRED):
@@ -1280,6 +1332,91 @@ class ApiController(RedditController):
tr._is_enabled = True
@validatedForm(VAdmin(),
award = VByName("fullname"),
colliding_award=VAwardByCodename(("codename", "fullname")),
codename = VLength("codename", max_length = 100),
title = VLength("title", max_length = 100),
imgurl = VLength("imgurl", max_length = 1000))
def POST_editaward(self, form, jquery, award, colliding_award, codename,
title, imgurl):
if form.has_errors(("codename", "title", "imgurl"), errors.NO_TEXT):
pass
if form.has_errors(("codename"), errors.INVALID_OPTION):
form.set_html(".status", "some other award has that codename")
pass
if form.has_error():
return
if award is None:
Award._new(codename, title, imgurl)
form.set_html(".status", "saved. reload to see it.")
return
award.codename = codename
award.title = title
award.imgurl = imgurl
award._commit()
form.set_html(".status", _('saved'))
@validatedForm(VAdmin(),
award = VByName("fullname"),
description = VLength("description", max_length=1000),
url = VLength("url", max_length=1000),
cup_hours = VFloat("cup_hours",
coerce=False, min=0, max=24 * 365),
recipient = VExistingUname("recipient"))
def POST_givetrophy(self, form, jquery, award, description,
url, cup_hours, recipient):
if form.has_errors("award", errors.NO_TEXT):
pass
if form.has_errors("recipient", errors.USER_DOESNT_EXIST):
pass
if form.has_errors("recipient", errors.NO_USER):
pass
if form.has_errors("fullname", errors.NO_TEXT):
pass
if form.has_errors("cup_hours", errors.BAD_NUMBER):
pass
if form.has_error():
return
if cup_hours:
cup_seconds = int(cup_hours * 3600)
cup_expiration = timefromnow("%s seconds" % cup_seconds)
else:
cup_expiration = None
t = Trophy._new(recipient, award, description=description,
url=url, cup_expiration=cup_expiration)
form.set_html(".status", _('saved'))
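POST_givetrophy above turns a fractional `cup_hours` into an absolute expiration via `timefromnow`; the conversion in isolation, as a hypothetical helper (the real `timefromnow` parses a "%s seconds" string):

```python
from datetime import datetime, timedelta

def cup_expiration_from_hours(cup_hours, now=None):
    """Falsy cup_hours means no cup; otherwise expire that many hours out."""
    if not cup_hours:
        return None
    now = now or datetime.utcnow()
    cup_seconds = int(cup_hours * 3600)   # same truncation as the controller
    return now + timedelta(seconds=cup_seconds)

start = datetime(2009, 1, 1)
expires = cup_expiration_from_hours(1.5, now=start)
```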
@validatedForm(VAdmin(),
account = VExistingUname("account"))
def POST_removecup(self, form, jquery, account):
if not account:
return self.abort404()
account.remove_cup()
@validatedForm(VAdmin(),
trophy = VTrophy("trophy_fn"))
def POST_removetrophy(self, form, jquery, trophy):
if not trophy:
return self.abort404()
recipient = trophy._thing1
award = trophy._thing2
trophy._delete()
Trophy.by_account(recipient, _update=True)
Trophy.by_award(award, _update=True)
@validatedForm(links = VByName('links', thing_cls = Link, multiple = True),
show = VByName('show', thing_cls = Link, multiple = False))
def POST_fetch_links(self, form, jquery, links, show):
@@ -1300,113 +1437,6 @@ class ApiController(RedditController):
setattr(c.user, "pref_" + ui_elem, False)
c.user._commit()
@noresponse(VSponsor(),
thing = VByName('id'))
def POST_unpromote(self, thing):
if not thing: return
unpromote(thing)
@validatedForm(VSponsor(),
ValidDomain('url'),
ip = ValidIP(),
l = VLink('link_id'),
title = VTitle('title'),
url = VUrl(['url', 'sr'], allow_self = False),
sr = VSubmitSR('sr'),
subscribers_only = VBoolean('subscribers_only'),
disable_comments = VBoolean('disable_comments'),
expire = VOneOf('expire', ['nomodify',
'expirein', 'cancel']),
timelimitlength = VInt('timelimitlength',1,1000),
timelimittype = VOneOf('timelimittype',
['hours','days','weeks']))
def POST_edit_promo(self, form, jquery, ip,
title, url, sr, subscribers_only,
disable_comments,
expire = None,
timelimitlength = None, timelimittype = None,
l = None):
if isinstance(url, str):
# VUrl may have modified the URL to make it valid, like
# adding http://
form.set_input('url', url)
elif isinstance(url, tuple) and isinstance(url[0], Link):
# there's already one or more links with this URL, but
# we're allowing multiple submissions, so we really just
# want the URL
url = url[0].url
if form.has_errors('title', errors.NO_TEXT, errors.TOO_LONG):
pass
elif form.has_errors('url', errors.NO_URL, errors.BAD_URL):
pass
elif ( (not l or url != l.url) and
form.has_errors('url', errors.NO_URL, errors.ALREADY_SUB) ):
#if url == l.url, we're just editing something else
pass
elif form.has_errors('sr', errors.SUBREDDIT_NOEXIST,
errors.SUBREDDIT_NOTALLOWED):
pass
elif (expire == 'expirein' and
form.has_errors('timelimitlength', errors.BAD_NUMBER)):
pass
elif l:
l.title = title
old_url = l.url
l.url = url
l.is_self = False
l.promoted_subscribersonly = subscribers_only
l.disable_comments = disable_comments
if expire == 'cancel':
l.promote_until = None
elif expire == 'expirein' and timelimitlength and timelimittype:
l.promote_until = timefromnow("%d %s" % (timelimitlength,
timelimittype))
l._commit()
l.update_url_cache(old_url)
form.redirect('/promote/edit_promo/%s' % to36(l._id))
else:
l = Link._submit(title, url, c.user, sr, ip)
if expire == 'expirein' and timelimitlength and timelimittype:
promote_until = timefromnow("%d %s" % (timelimitlength,
timelimittype))
else:
promote_until = None
l._commit()
promote(l, subscribers_only = subscribers_only,
promote_until = promote_until,
disable_comments = disable_comments)
form.redirect('/promote/edit_promo/%s' % to36(l._id))
def GET_link_thumb(self, *a, **kw):
"""
See GET_upload_sr_image for rationale
"""
return "nothing to see here."
@validate(VSponsor(),
link = VByName('link_id'),
file = VLength('file', 500*1024))
def POST_link_thumb(self, link=None, file=None):
errors = dict(BAD_CSS_NAME = "", IMAGE_ERROR = "")
try:
force_thumbnail(link, file)
except cssfilter.BadImage:
# if the image doesn't clean up nicely, abort
errors["IMAGE_ERROR"] = _("bad image")
if any(errors.values()):
return UploadedImage("", "", "upload", errors = errors).render()
else:
return UploadedImage(_('saved'), thumbnail_url(link), "",
errors = errors).render()
@validatedForm(type = VOneOf('type', ('click'), default = 'click'),
links = VByName('ids', thing_cls = Link, multiple = True))
def GET_gadget(self, form, jquery, type, links):
@@ -1434,26 +1464,33 @@ class ApiController(RedditController):
c.user.pref_frame_commentspanel = False
c.user._commit()
@validatedForm(promoted = VByName('ids', thing_cls = Link, multiple = True))
def POST_onload(self, form, jquery, promoted, *a, **kw):
if not promoted:
return
# make sure that they are really promoted
promoted = [ l for l in promoted if l.promoted ]
for l in promoted:
dest = l.url
@validatedForm(promoted = VByName('ids', thing_cls = Link,
multiple = True),
sponsorships = VByName('ids', thing_cls = Subreddit,
multiple = True))
def POST_onload(self, form, jquery, promoted, sponsorships, *a, **kw):
def add_tracker(dest, where, what):
jquery.set_tracker(
l._fullname,
tracking.PromotedLinkInfo.gen_url(fullname=l._fullname,
where,
tracking.PromotedLinkInfo.gen_url(fullname=what,
ip = request.ip),
tracking.PromotedLinkClickInfo.gen_url(fullname = l._fullname,
tracking.PromotedLinkClickInfo.gen_url(fullname = what,
dest = dest,
ip = request.ip)
)
if promoted:
# make sure that they are really promoted
promoted = [ l for l in promoted if l.promoted ]
for l in promoted:
add_tracker(l.url, l._fullname, l._fullname)
if sponsorships:
for s in sponsorships:
add_tracker(s.sponsorship_url, s._fullname,
"%s_%s" % (s._fullname, s.sponsorship_name))
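`add_tracker` above emits two URLs per promoted thing: an impression-pixel URL and a click-redirect URL carrying the final destination. A rough python3 sketch of that shape; `TRACK_BASE` and this `gen_url` are made up for illustration and are not `tracking.PromotedLinkInfo`'s real API:

```python
from urllib.parse import urlencode

TRACK_BASE = "http://tracker.example.invalid"   # hypothetical pixel host

def gen_url(kind, **params):
    """Serialize tracking parameters into a stable query string."""
    return "%s/%s?%s" % (TRACK_BASE, kind, urlencode(sorted(params.items())))

def add_tracker(dest, where, what, ip):
    # mirrors the controller: one impression URL keyed on `what`,
    # plus a click URL that also carries the final destination
    imp = gen_url("imp", fullname=what, ip=ip)
    click = gen_url("click", fullname=what, dest=dest, ip=ip)
    return where, imp, click

where, imp, click = add_tracker("http://example.com/", "t3_abc", "t3_abc", "10.0.0.1")
```

Sponsorships reuse the same helper with a composite fullname (`"%s_%s"`), so one subreddit can track several named sponsorship slots.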
@json_validate(query = nop('query'))
def POST_search_reddit_names(self, query):
names = []
@@ -1469,7 +1506,7 @@ class ApiController(RedditController):
wrapped = wrap_links(link)
wrapped = list(wrapped)[0]
return spaceCompress(websafe(wrapped.link_child.content()))
return websafe(spaceCompress(wrapped.link_child.content()))
@validatedForm(link = VByName('name', thing_cls = Link, multiple = False),
color = VOneOf('color', spreadshirt.ShirtPane.colors),



@@ -0,0 +1,54 @@
# The contents of this file are subject to the Common Public Attribution
# License Version 1.0. (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
# http://code.reddit.com/LICENSE. The License is based on the Mozilla Public
# License Version 1.1, but Sections 14 and 15 have been added to cover use of
# software over a computer network and provide for limited attribution for the
# Original Developer. In addition, Exhibit A has been modified to be consistent
# with Exhibit B.
#
# Software distributed under the License is distributed on an "AS IS" basis,
# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for
# the specific language governing rights and limitations under the License.
#
# The Original Code is Reddit.
#
# The Original Developer is the Initial Developer. The Initial Developer of the
# Original Code is CondeNet, Inc.
#
# All portions of the code written by CondeNet are Copyright (c) 2006-2009
# CondeNet, Inc. All Rights Reserved.
################################################################################
from pylons import request, g
from reddit_base import RedditController
from r2.lib.pages import AdminPage, AdminAwards
from r2.lib.pages import AdminAwardGive, AdminAwardWinners
from validator import *
class AwardsController(RedditController):
@validate(VAdmin())
def GET_index(self):
res = AdminPage(content = AdminAwards(),
title = 'awards').render()
return res
@validate(VAdmin(),
award = VAwardByCodename('awardcn'))
def GET_give(self, award):
if award is None:
abort(404, 'page not found')
res = AdminPage(content = AdminAwardGive(award),
title='give an award').render()
return res
@validate(VAdmin(),
award = VAwardByCodename('awardcn'))
def GET_winners(self, award):
if award is None:
abort(404, 'page not found')
res = AdminPage(content = AdminAwardWinners(award),
title='award winners').render()
return res


@@ -44,16 +44,18 @@ class EmbedController(RedditController):
# Add "edit this page" link if the user is allowed to edit the wiki
if c.user_is_loggedin and c.user.can_wiki():
edit_text = _('edit this page')
read_first = _('read this first')
yes_you_can = _("yes, it's okay!")
read_first = _('just read this first.')
url = "http://code.reddit.com/wiki" + websafe(fp) + "?action=edit"
edittag = """
<div class="editlink">
<hr/>
<a href="%s">%s</a>&#32;
(<a href="/help/editing_help">%s</a>)
(<b>%s&#32;
<a href="/help/editing_help">%s</a></b>)
</div>
""" % (url, edit_text, read_first)
""" % (url, edit_text, yes_you_can, read_first)
output.append(edittag)
@@ -69,6 +71,8 @@ class EmbedController(RedditController):
fp = request.path.rstrip("/")
u = "http://code.reddit.com/wiki" + fp + '?stripped=1'
g.log.debug("Pulling %s for help" % u)
try:
content = proxyurl(u)
return self.rendercontent(content, fp)


@@ -33,6 +33,7 @@ try:
# the stack trace won't be presented to the user in production
from reddit_base import RedditController
from r2.models.subreddit import Default, Subreddit
from r2.models.link import Link
from r2.lib import pages
from r2.lib.strings import strings, rand_strings
except Exception, e:
@@ -45,7 +46,7 @@ except Exception, e:
# kill this app
import os
os._exit(1)
redditbroke = \
'''<html>
<head>
@@ -131,10 +132,14 @@ class ErrorController(RedditController):
code = request.GET.get('code', '')
srname = request.GET.get('srname', '')
takedown = request.GET.get('takedown', "")
if srname:
c.site = Subreddit._by_name(srname)
if c.render_style not in self.allowed_render_styles:
return str(code)
elif takedown and code == '404':
link = Link._by_fullname(takedown)
return pages.TakedownPage(link).render()
elif code == '403':
return self.send403()
elif code == '500':


@@ -25,6 +25,7 @@ from copy import copy
error_list = dict((
('USER_REQUIRED', _("please login to do that")),
('VERIFIED_USER_REQUIRED', _("you need to set a valid email address to do that.")),
('NO_URL', _('a url is required')),
('BAD_URL', _('you should check that url')),
('BAD_CAPTCHA', _('care to try these again?')),
@@ -45,7 +46,8 @@ error_list = dict((
('USER_DOESNT_EXIST', _("that user doesn't exist")),
('NO_USER', _('please enter a username')),
('INVALID_PREF', "that preference isn't valid"),
('BAD_NUMBER', _("that number isn't in the right range")),
('BAD_NUMBER', _("that number isn't in the right range (%(min)d to %(max)d)")),
('BAD_BID', _("your bid must be at least $%(min)d per day and no more than $%(max)d in total.")),
('ALREADY_SUB', _("that link has already been submitted")),
('SUBREDDIT_EXISTS', _('that reddit already exists')),
('SUBREDDIT_NOEXIST', _('that reddit doesn\'t exist')),
@@ -64,7 +66,12 @@ error_list = dict((
('BAD_EMAILS', _('the following emails are invalid: %(emails)s')),
('NO_EMAILS', _('please enter at least one email address')),
('TOO_MANY_EMAILS', _('please only share to %(num)s emails at a time.')),
('BAD_DATE', _('please provide a date of the form mm/dd/yyyy')),
('BAD_DATE_RANGE', _('the dates need to be in order and not identical')),
('BAD_FUTURE_DATE', _('please enter a date at least %(day)s days in the future')),
('BAD_PAST_DATE', _('please enter a date at least %(day)s days in the past')),
('BAD_ADDRESS', _('address problem: %(message)s')),
('BAD_CARD', _('card problem: %(message)s')),
('TOO_LONG', _("this is too long (max: %(max_length)s)")),
('NO_TEXT', _('we need something here')),
))
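The messages above carry named %-placeholders (e.g. `%(min)d`, `%(max_length)s`) so each validator can fill in its own limits at render time. A self-contained sketch of that interpolation with two of the messages, minus the `_()` translation wrapper:

```python
error_list = {
    "BAD_NUMBER": "that number isn't in the right range (%(min)d to %(max)d)",
    "TOO_LONG": "this is too long (max: %(max_length)s)",
}

def error_message(code, **params):
    """Fill an error template with the validator's own limits."""
    return error_list[code] % params

# e.g. VFloat("cup_hours", min=0, max=24 * 365) would report:
msg = error_message("BAD_NUMBER", min=0, max=24 * 365)
```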
@@ -123,3 +130,4 @@ class ErrorSet(object):
del self.errors[pair]
class UserRequiredException(Exception): pass
class VerifiedUserRequiredException(Exception): pass


@@ -22,20 +22,26 @@
from reddit_base import RedditController
from pylons import c, request
from pylons.i18n import _
from r2.lib.pages import FormPage, Feedback, Captcha
from r2.lib.pages import FormPage, Feedback, Captcha, PaneStack, SelfServeBlurb
class FeedbackController(RedditController):
def GET_ad_inq(self):
title = _("inquire about advertising on reddit")
return FormPage('advertise',
content = Feedback(title=title,
action='ad_inq'),
content = PaneStack([SelfServeBlurb(),
Feedback(title=title,
action='ad_inq')]),
loginbox = False).render()
def GET_feedback(self):
title = _("send reddit feedback")
return FormPage('feedback',
content = Feedback(title=title,
action='feedback'),
content = Feedback(title=title, action='feedback'),
loginbox = False).render()
def GET_i18n(self):
title = _("help translate reddit into your language")
return FormPage('help translate',
content = Feedback(title=title, action='i18n'),
loginbox = False).render()


@@ -102,7 +102,46 @@ class FrontController(RedditController):
"""The 'what is my password' page"""
return BoringPage(_("password"), content=Password()).render()
@validate(cache_evt = VCacheKey('reset', ('key', 'name')),
@validate(VUser(),
dest = VDestination())
def GET_verify(self, dest):
if c.user.email_verified:
content = InfoBar(message = strings.email_verified)
if dest:
return self.redirect(dest)
else:
content = PaneStack(
[InfoBar(message = strings.verify_email),
PrefUpdate(email = True, verify = True,
password = False)])
return BoringPage(_("verify email"), content = content).render()
@validate(VUser(),
cache_evt = VCacheKey('email_verify', ('key',)),
key = nop('key'),
dest = VDestination(default = "/prefs/update"))
def GET_verify_email(self, cache_evt, key, dest):
if c.user_is_loggedin and c.user.email_verified:
cache_evt.clear()
return self.redirect(dest)
elif not (cache_evt.user and
key == passhash(cache_evt.user.name, cache_evt.user.email)):
content = PaneStack(
[InfoBar(message = strings.email_verify_failed),
PrefUpdate(email = True, verify = True,
password = False)])
return BoringPage(_("verify email"), content = content).render()
elif c.user != cache_evt.user:
# wrong user. Log them out and try again.
self.logout()
return self.redirect(request.fullpath)
else:
cache_evt.clear()
c.user.email_verified = True
c.user._commit()
return self.redirect(dest)
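GET_verify_email above only accepts a key matching `passhash(user.name, user.email)` for the current account, so the emailed link goes stale if the address changes. A sketch of that check with a hypothetical `passhash` (the real one lives in r2 and is salted differently):

```python
import hashlib

def passhash(name, email, salt="verify"):
    """Hypothetical keyed digest standing in for r2's passhash."""
    raw = "%s|%s|%s" % (salt, name, email)
    return hashlib.sha1(raw.encode()).hexdigest()

def key_is_valid(user_name, user_email, key):
    # the emailed key must match the *current* name/email pair
    return key == passhash(user_name, user_email)

key = passhash("alice", "alice@example.com")
ok = key_is_valid("alice", "alice@example.com", key)
stale = key_is_valid("alice", "new@example.com", key)   # address changed
```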
@validate(cache_evt = VCacheKey('reset', ('key',)),
key = nop('key'))
def GET_resetpassword(self, cache_evt, key):
"""page hit once a user has been sent a password reset email
@@ -129,11 +168,13 @@ class FrontController(RedditController):
"""The (now depricated) details page. Content on this page
has been subsubmed by the presence of the LinkInfoBar on the
rightbox, so it is only useful for Admin-only wizardry."""
return DetailsPage(link = article).render()
return DetailsPage(link = article, expand_children=False).render()
@validate(article = VLink('article'))
def GET_shirt(self, article):
if not can_view_link_comments(article):
abort(403, 'forbidden')
if g.spreadshirt_url:
from r2.lib.spreadshirt import ShirtPage
return ShirtPage(link = article).render()
@@ -147,12 +188,18 @@ class FrontController(RedditController):
def GET_comments(self, article, comment, context, sort, num_comments):
"""Comment page for a given 'article'."""
if comment and comment.link_id != article._id:
return self.abort404()
if not c.default_sr and c.site._id != article.sr_id:
return self.abort404()
if not article.subreddit_slow.can_view(c.user):
sr = Subreddit._byID(article.sr_id, True)
if sr.name == g.takedown_sr:
request.environ['REDDIT_TAKEDOWN'] = article._fullname
return self.abort404()
if not c.default_sr and c.site._id != sr._id:
return self.abort404()
if not can_view_link_comments(article):
abort(403, 'forbidden')
#check for 304
@@ -188,8 +235,7 @@ class FrontController(RedditController):
displayPane.append(PermalinkMessage(article.make_permalink_slow()))
# insert reply box only for logged in user
if c.user_is_loggedin and article.subreddit_slow.can_comment(c.user)\
and not is_api():
if c.user_is_loggedin and can_comment_link(article) and not is_api():
#no comment box for permalinks
displayPane.append(UserText(item = article, creating = True,
post_form = 'comment',
@@ -200,7 +246,7 @@ class FrontController(RedditController):
displayPane.append(listing.listing())
loc = None if c.focal_comment or context is not None else 'comments'
res = LinkInfoPage(link = article, comment = comment,
content = displayPane,
subtitle = _("comments"),
@@ -300,10 +346,10 @@ class FrontController(RedditController):
return self.abort404()
return EditReddit(content = pane).render()
def GET_stats(self):
"""The stats page."""
return BoringPage(_("stats"), content = UserStats()).render()
def GET_awards(self):
"""The awards page."""
return BoringPage(_("awards"), content = UserAwards()).render()
# filter for removing punctuation which could be interpreted as lucene syntax
related_replace_regex = re.compile('[?\\&|!{}+~^()":*-]+')
@@ -315,6 +361,9 @@ class FrontController(RedditController):
"""Related page: performs a search using title of article as
the search query."""
if not can_view_link_comments(article):
abort(403, 'forbidden')
title = c.site.name + ((': ' + article.title) if hasattr(article, 'title') else '')
query = self.related_replace_regex.sub(self.related_replace_with,
@@ -335,8 +384,10 @@ class FrontController(RedditController):
@base_listing
@validate(article = VLink('article'))
def GET_duplicates(self, article, num, after, reverse, count):
links = link_duplicates(article)
if not can_view_link_comments(article):
abort(403, 'forbidden')
links = link_duplicates(article)
builder = IDBuilder([ link._fullname for link in links ],
num = num, after = after, reverse = reverse,
count = count, skip = False)
@@ -492,6 +543,12 @@ class FrontController(RedditController):
c.response.content = ''
return c.response
@validate(VAdmin(),
comment = VCommentByID('comment_id'))
def GET_comment_by_id(self, comment):
href = comment.make_permalink_slow(context=5, anchor=True)
return self.redirect(href)
@validate(VUser(),
VSRSubmitPage(),
url = VRequired('url', None),
@@ -609,15 +666,23 @@ class FrontController(RedditController):
return self.abort404()
@validate(VSponsor(),
@validate(VTrafficViewer('article'),
article = VLink('article'))
def GET_traffic(self, article):
res = LinkInfoPage(link = article,
content = PromotedTraffic(article)
if c.render_style == 'csv':
c.response.content = content.as_csv()
return c.response
return LinkInfoPage(link = article,
comment = None,
content = PromotedTraffic(article)).render()
return res
content = content).render()
@validate(VAdmin())
def GET_site_traffic(self):
return BoringPage("traffic",
content = RedditTraffic()).render()
def GET_ad(self, reddit = None):
return Dart_Ad(reddit).render(style="html")


@@ -0,0 +1,52 @@
from threading import Thread
import os
import time
from pylons.controllers.util import abort
from pylons import c, g
from reddit_base import RedditController
from r2.lib.utils import worker
class HealthController(RedditController):
def shutdown(self):
thread_pool = c.thread_pool
def _shutdown():
#give busy threads 30 seconds to finish up
for s in xrange(30):
busy = thread_pool.track_threads()['busy']
if not busy:
break
time.sleep(1)
thread_pool.shutdown()
worker.join()
os._exit(3)
t = Thread(target = _shutdown)
t.setDaemon(True)
t.start()
def GET_health(self):
c.dontcache = True
if g.shutdown:
if g.shutdown == 'init':
self.shutdown()
g.shutdown = 'shutdown'
abort(503, 'service temporarily unavailable')
else:
c.response_content_type = 'text/plain'
c.response.content = "i'm still alive!"
return c.response
def GET_shutdown(self):
if not g.allow_shutdown:
self.abort404()
c.dontcache = True
#this will make the next health-check initiate the shutdown
g.shutdown = 'init'
c.response_content_type = 'text/plain'
c.response.content = 'shutting down...'
return c.response
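GET_health above flips to 503 once `g.shutdown` is set, and `shutdown()` polls the thread pool for up to 30 seconds before exiting. The polling loop in isolation; `busy_count` is a callable standing in for `thread_pool.track_threads()['busy']`, and the sleep is injectable so the sketch runs instantly:

```python
def drain(busy_count, max_wait=30, sleep=lambda s: None):
    """Poll up to max_wait seconds for in-flight requests to finish.

    Returns how many seconds were spent waiting; the real controller
    would then call thread_pool.shutdown(), worker.join() and os._exit(3).
    """
    waited = 0
    for _ in range(max_wait):
        if busy_count() == 0:   # no busy threads left: safe to stop
            break
        sleep(1)
        waited += 1
    return waited

counts = [2, 1, 0]              # simulated busy-thread counts per poll
waited = drain(lambda: counts.pop(0))
```

Running the check from a load balancer means an instance answering 503 is drained out of rotation before the process actually exits.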


@@ -38,6 +38,7 @@ from r2.lib.jsontemplates import is_api
from r2.lib.solrsearch import SearchQuery
from r2.lib.utils import iters, check_cheating, timeago
from r2.lib import sup
from r2.lib.promote import PromoteSR
from admin import admin_profile_query
@@ -97,7 +98,6 @@ class ListingController(RedditController):
show_sidebar = self.show_sidebar,
nav_menus = self.menus,
title = self.title(),
infotext = self.infotext,
**self.render_params).render()
return res
@@ -135,10 +135,17 @@ class ListingController(RedditController):
return b
def keep_fn(self):
return None
def keep(item):
wouldkeep = item.keep_item(item)
if getattr(item, "promoted", None) is not None:
return False
return wouldkeep
return keep
def listing(self):
"""Listing to generate from the builder"""
if c.site.path == PromoteSR.path and not c.user_is_sponsor:
abort(403, 'forbidden')
listing = LinkListing(self.builder_obj, show_nums = self.show_nums)
return listing.listing()
@@ -189,9 +196,12 @@ class HotController(FixListing, ListingController):
if o_links:
# get links in proximity to pos
l = min(len(o_links) - 3, 8)
disp_links = [o_links[(i + pos) % len(o_links)] for i in xrange(-2, l)]
b = IDBuilder(disp_links, wrap = self.builder_wrapper)
disp_links = [o_links[(i + pos) % len(o_links)]
for i in xrange(-2, l)]
def keep_fn(item):
return item.likes is None and item.keep_item(item)
b = IDBuilder(disp_links, wrap = self.builder_wrapper,
skip = True, keep_fn = keep_fn)
o = OrganicListing(b,
org_links = o_links,
visible_link = o_links[pos],
@@ -276,7 +286,10 @@ class NewController(ListingController):
for things like the spam filter and thumbnail fetcher to
act on them before releasing them into the wild"""
wouldkeep = item.keep_item(item)
if c.user_is_loggedin and (c.user_is_admin or item.subreddit.is_moderator(c.user)):
if item.promoted is not None:
return False
elif c.user_is_loggedin and (c.user_is_admin or
item.subreddit.is_moderator(c.user)):
# let admins and moderators see them regardless
return wouldkeep
elif wouldkeep and c.user_is_loggedin and c.user._id == item.author_id:
@@ -379,7 +392,6 @@ class RecommendedController(ListingController):
class UserController(ListingController):
render_cls = ProfilePage
skip = False
show_nums = False
def title(self):
@@ -393,6 +405,14 @@ class UserController(ListingController):
% dict(user = self.vuser.name, site = c.site.name)
return title
# TODO: this might not be the place to do this
skip = True
def keep_fn(self):
# keep promotions off of profile pages.
def keep(item):
return getattr(item, "promoted", None) is None
return keep
def query(self):
q = None
if self.where == 'overview':
@@ -475,7 +495,6 @@ class MessageController(ListingController):
w = Wrapped(thing)
w.render_class = Message
w.to_id = c.user._id
w.subject = _('comment reply')
w.was_comment = True
w.permalink, w._fullname = p, f
return w


@@ -26,6 +26,7 @@ from r2.lib.emailer import opt_in, opt_out
from pylons import request, c, g
from validator import *
from pylons.i18n import _
from r2.models import *
import sha
def to_referer(func, **params):
@@ -37,7 +38,7 @@ def to_referer(func, **params):
class PostController(ApiController):
def response_func(self, kw):
def api_wrapper(self, kw):
return Storage(**kw)
#TODO: feature disabled for now
@@ -103,11 +104,16 @@ class PostController(ApiController):
pref_num_comments = VInt('num_comments', 1, g.max_comments,
default = g.num_comments),
pref_show_stylesheets = VBoolean('show_stylesheets'),
pref_show_promote = VBoolean('show_promote'),
all_langs = nop('all-langs', default = 'all'))
def POST_options(self, all_langs, pref_lang, **kw):
#temporary. eventually we'll change pref_clickgadget to an
#integer preference
kw['pref_clickgadget'] = kw['pref_clickgadget'] and 5 or 0
if c.user.pref_show_promote is None:
kw['pref_show_promote'] = None
elif not kw.get('pref_show_promote'):
kw['pref_show_promote'] = False
self.set_options(all_langs, pref_lang, **kw)
u = UrlParser(c.site.path + "prefs")
@@ -115,14 +121,14 @@ class PostController(ApiController):
if c.cname:
u.put_in_frame()
return self.redirect(u.unparse())
def GET_over18(self):
return BoringPage(_("over 18?"),
content = Over18()).render()
@validate(over18 = nop('over18'),
uh = nop('uh'),
dest = nop('dest'))
dest = VDestination(default = '/'))
def POST_over18(self, over18, uh, dest):
if over18 == 'yes':
if c.user_is_loggedin and c.user.valid_hash(uh):
@@ -199,3 +205,4 @@ class PostController(ApiController):
def GET_login(self, *a, **kw):
return self.redirect('/login' + query_string(dict(dest="/")))


@@ -22,6 +22,7 @@
from validator import *
from pylons.i18n import _
from r2.models import *
from r2.lib.authorize import get_account_info, edit_profile
from r2.lib.pages import *
from r2.lib.pages.things import wrap_links
from r2.lib.menus import *
@@ -29,48 +30,413 @@ from r2.controllers import ListingController
from r2.controllers.reddit_base import RedditController
from r2.lib.promote import get_promoted
from r2.lib.promote import get_promoted, STATUS, PromoteSR
from r2.lib.utils import timetext
from r2.lib.media import force_thumbnail, thumbnail_url
from r2.lib import cssfilter
from datetime import datetime
class PromoteController(RedditController):
@validate(VSponsor())
def GET_index(self):
return self.GET_current_promos()
class PromoteController(ListingController):
skip = False
where = 'promoted'
render_cls = PromotePage
@validate(VSponsor())
def GET_current_promos(self):
render_list = list(wrap_links(get_promoted()))
for x in render_list:
if x.promote_until:
x.promote_expires = timetext(datetime.now(g.tz) - x.promote_until)
page = PromotePage('current_promos',
content = PromotedLinks(render_list))
@property
def title_text(self):
return _('promoted by you')
def query(self):
q = Link._query(Link.c.sr_id == PromoteSR._id)
if not c.user_is_sponsor:
# get user's own promotions
q._filter(Link.c.author_id == c.user._id)
q._filter(Link.c._spam == (True, False),
Link.c.promoted == (True, False))
q._sort = desc('_date')
if self.sort == "future_promos":
q._filter(Link.c.promote_status == STATUS.unseen)
elif self.sort == "pending_promos":
if c.user_is_admin:
q._filter(Link.c.promote_status == STATUS.pending)
else:
q._filter(Link.c.promote_status == (STATUS.unpaid,
STATUS.unseen,
STATUS.accepted,
STATUS.rejected))
elif self.sort == "unpaid_promos":
q._filter(Link.c.promote_status == STATUS.unpaid)
elif self.sort == "live_promos":
q._filter(Link.c.promote_status == STATUS.promoted)
return q
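The query() above maps each listing sort onto one or more `promote_status` values. The same mapping expressed as a plain filter; the STATUS ordering here is assumed for illustration only (the real constants live in `r2.lib.promote`):

```python
class STATUS:
    # assumed ordering, for illustration only
    unpaid, unseen, accepted, rejected, pending, promoted, finished = range(7)

SORT_STATUSES = {
    "future_promos": (STATUS.unseen,),
    "unpaid_promos": (STATUS.unpaid,),
    "live_promos": (STATUS.promoted,),
}

def filter_promos(links, sort):
    """Keep only links whose status matches the requested sort."""
    wanted = SORT_STATUSES.get(sort)
    if wanted is None:
        return list(links)            # sorts with no status filter
    return [l for l in links if l["status"] in wanted]

links = [{"id": 1, "status": STATUS.unseen},
         {"id": 2, "status": STATUS.promoted}]
live = filter_promos(links, "live_promos")
```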
@validate(VPaidSponsor(),
VVerifiedUser())
def GET_listing(self, sort = "", **env):
self.sort = sort
return ListingController.GET_listing(self, **env)
GET_index = GET_listing
return page.render()
@validate(VSponsor())
# To open up: VSponsor -> VVerifiedUser
@validate(VPaidSponsor(),
VVerifiedUser())
def GET_new_promo(self):
page = PromotePage('new_promo',
content = PromoteLinkForm())
return page.render()
return PromotePage('content', content = PromoteLinkForm()).render()
@validate(VSponsor(),
@validate(VSponsor('link'),
link = VLink('link'))
def GET_edit_promo(self, link):
sr = Subreddit._byID(link.sr_id)
listing = wrap_links(link)
if link.promoted is None:
return self.abort404()
rendered = wrap_links(link)
timedeltatext = ''
if link.promote_until:
timedeltatext = timetext(link.promote_until - datetime.now(g.tz),
resultion=2)
form = PromoteLinkForm(sr = sr, link = link,
listing = listing,
form = PromoteLinkForm(link = link,
listing = rendered,
timedeltatext = timedeltatext)
page = PromotePage('new_promo', content = form)
return page.render()
@validate(VPaidSponsor(),
VVerifiedUser())
def GET_graph(self):
content = Promote_Graph()
if c.user_is_sponsor and c.render_style == 'csv':
c.response.content = content.as_csv()
return c.response
return PromotePage("grpaph", content = content).render()
### POST controllers below
@validatedForm(VSponsor(),
link = VByName("link"),
bid = VBid('bid', "link"))
def POST_freebie(self, form, jquery, link, bid):
if link and link.promoted is not None and bid:
promote.auth_paid_promo(link, c.user, -1, bid)
jquery.refresh()
@validatedForm(VSponsor(),
link = VByName("link"),
note = nop("note"))
def POST_promote_note(self, form, jquery, link, note):
if link and link.promoted is not None:
form.find(".notes").children(":last").after(
"<p>" + promote.promotion_log(link, note, True) + "</p>")
@validatedForm(VSponsor(),
link = VByName("link"),
refund = VFloat("refund"))
def POST_refund(self, form, jquery, link, refund):
if link:
# make sure we don't refund more than we should
author = Account._byID(link.author_id)
promote.refund_promo(link, author, refund)
jquery.refresh()
@noresponse(VSponsor(),
thing = VByName('id'))
def POST_promote(self, thing):
if thing:
now = datetime.now(g.tz)
# make accepted if unseen or already rejected
if thing.promote_status in (promote.STATUS.unseen,
promote.STATUS.rejected):
promote.accept_promo(thing)
# if not finished and the dates are current
elif (thing.promote_status < promote.STATUS.finished and
thing._date <= now and thing.promote_until > now):
# if already pending, cron job must have failed. Promote.
if thing.promote_status == promote.STATUS.accepted:
promote.pending_promo(thing)
promote.promote(thing)
@noresponse(VSponsor(),
thing = VByName('id'),
reason = nop("reason"))
def POST_unpromote(self, thing, reason):
if thing:
if (c.user_is_sponsor and
(thing.promote_status in (promote.STATUS.unpaid,
promote.STATUS.unseen,
promote.STATUS.accepted,
promote.STATUS.promoted)) ):
promote.reject_promo(thing, reason = reason)
else:
promote.unpromote(thing)
# TODO: when opening up, may have to refactor
@validatedForm(VPaidSponsor('link_id'),
VModhash(),
VRatelimit(rate_user = True,
rate_ip = True,
prefix = 'create_promo_'),
ip = ValidIP(),
l = VLink('link_id'),
title = VTitle('title'),
url = VUrl('url', allow_self = False),
dates = VDateRange(['startdate', 'enddate'],
future = g.min_promote_future,
reference_date = promote.promo_datetime_now,
business_days = True,
admin_override = True),
disable_comments = VBoolean("disable_comments"),
set_clicks = VBoolean("set_maximum_clicks"),
max_clicks = VInt("maximum_clicks", min = 0),
set_views = VBoolean("set_maximum_views"),
max_views = VInt("maximum_views", min = 0),
bid = VBid('bid', 'link_id'))
def POST_new_promo(self, form, jquery, l, ip, title, url, dates,
disable_comments,
set_clicks, max_clicks, set_views, max_views, bid):
should_ratelimit = False
if not c.user_is_sponsor:
set_clicks = False
set_views = False
should_ratelimit = True
if not set_clicks:
max_clicks = None
if not set_views:
max_views = None
if not should_ratelimit:
c.errors.remove((errors.RATELIMIT, 'ratelimit'))
# demangle URL in canonical way
if url:
if isinstance(url, (unicode, str)):
form.set_inputs(url = url)
elif isinstance(url, tuple) or isinstance(url[0], Link):
# there's already one or more links with this URL, but
# we're allowing multiple submissions, so we really just
# want the URL
url = url[0].url
# check dates and date range
start, end = [x.date() for x in dates] if dates else (None, None)
if not l or (l._date.date(), l.promote_until.date()) == (start,end):
if (form.has_errors('startdate', errors.BAD_DATE,
errors.BAD_FUTURE_DATE) or
form.has_errors('enddate', errors.BAD_DATE,
errors.BAD_FUTURE_DATE, errors.BAD_DATE_RANGE)):
return
# dates have been validated at this point. Next validate title, etc.
if (form.has_errors('title', errors.NO_TEXT,
errors.TOO_LONG) or
form.has_errors('url', errors.NO_URL, errors.BAD_URL) or
form.has_errors('bid', errors.BAD_BID) or
(not l and jquery.has_errors('ratelimit', errors.RATELIMIT))):
return
elif l:
if l.promote_status == promote.STATUS.finished:
form.parent().set_html(".status",
_("that promoted link is already finished."))
else:
# we won't penalize for changes of dates provided
# the submission isn't pending (or promoted, or
# finished)
changed = False
if dates and not promote.update_promo_dates(l, *dates):
form.parent().set_html(".status",
_("too late to change the date."))
else:
changed = True
# check for changes in the url and title
if promote.update_promo_data(l, title, url):
changed = True
# sponsors can change the bid value (at the expense of making
# the promotion a freebie)
if c.user_is_sponsor and bid != l.promote_bid:
promote.auth_paid_promo(l, c.user, -1, bid)
promote.accept_promo(l)
changed = True
if c.user_is_sponsor:
l.maximum_clicks = max_clicks
l.maximum_views = max_views
changed = True
l.disable_comments = disable_comments
l._commit()
if changed:
jquery.refresh()
# no link so we are creating a new promotion
elif dates:
promote_start, promote_end = dates
# check that the bid satisfies the minimum
duration = max((promote_end - promote_start).days, 1)
if bid / duration >= g.min_promote_bid:
l = promote.new_promotion(title, url, c.user, ip,
promote_start, promote_end, bid,
disable_comments = disable_comments,
max_clicks = max_clicks,
max_views = max_views)
# if the submitter is a sponsor (or implicitly an admin) we can
# fast-track the approval and auto-accept the bid
if c.user_is_sponsor:
promote.auth_paid_promo(l, c.user, -1, bid)
promote.accept_promo(l)
# register a vote
v = Vote.vote(c.user, l, True, ip)
# set the rate limiter
if should_ratelimit:
VRatelimit.ratelimit(rate_user=True, rate_ip = True,
prefix = "create_promo_",
seconds = 60)
form.redirect(promote.promo_edit_url(l))
else:
c.errors.add(errors.BAD_BID,
msg_params = dict(min=g.min_promote_bid,
max=g.max_promote_bid),
field = 'bid')
form.set_error(errors.BAD_BID, "bid")
@validatedForm(VSponsor('container'),
VModhash(),
user = VExistingUname('name'),
thing = VByName('container'))
def POST_traffic_viewer(self, form, jquery, user, thing):
"""
Adds a user to the list of users allowed to view a promoted
link's traffic page.
"""
if not form.has_errors("name",
errors.USER_DOESNT_EXIST, errors.NO_USER):
form.set_inputs(name = "")
form.set_html(".status:first", _("added"))
if promote.add_traffic_viewer(thing, user):
user_row = TrafficViewerList(thing).user_row(user)
jquery("#traffic-table").show(
).find("table").insert_table_rows(user_row)
# send the user a message
msg = strings.msg_add_friend.get("traffic")
subj = strings.subj_add_friend.get("traffic")
if msg and subj:
d = dict(url = thing.make_permalink_slow(),
traffic_url = promote.promo_traffic_url(thing),
title = thing.title)
msg = msg % d
subj = subj % d
item, inbox_rel = Message._new(c.user, user,
subj, msg, request.ip)
if g.write_query_queue:
queries.new_message(item, inbox_rel)
@validatedForm(VSponsor('container'),
VModhash(),
iuser = VByName('id'),
thing = VByName('container'))
def POST_rm_traffic_viewer(self, form, jquery, iuser, thing):
if thing and iuser:
promote.rm_traffic_viewer(thing, iuser)
@validatedForm(VSponsor('link'),
link = VByName("link"),
customer_id = VInt("customer_id", min = 0),
bid = VBid("bid", "link"),
pay_id = VInt("account", min = 0),
edit = VBoolean("edit"),
address = ValidAddress(["firstName", "lastName",
"company", "address",
"city", "state", "zip",
"country", "phoneNumber"],
usa_only = True),
creditcard = ValidCard(["cardNumber", "expirationDate",
"cardCode"]))
def POST_update_pay(self, form, jquery, bid, link, customer_id, pay_id,
edit, address, creditcard):
address_modified = not pay_id or edit
if address_modified:
if (form.has_errors(["firstName", "lastName", "company", "address",
"city", "state", "zip",
"country", "phoneNumber"],
errors.BAD_ADDRESS) or
form.has_errors(["cardNumber", "expirationDate", "cardCode"],
errors.BAD_CARD)):
pass
else:
pay_id = edit_profile(c.user, address, creditcard, pay_id)
if form.has_errors('bid', errors.BAD_BID) or not bid:
pass
# if link is in use or finished, don't make a change
elif link.promote_status == promote.STATUS.promoted:
form.set_html(".status",
_("that link is currently promoted. "
"you can't update your bid now."))
elif link.promote_status == promote.STATUS.finished:
form.set_html(".status",
_("that promotion is already over, so updating "
"your bid is kind of pointless, don't you think?"))
# don't create or modify a transaction if no changes have been made.
elif (link.promote_status > promote.STATUS.unpaid and
not address_modified and
getattr(link, "promote_bid", "") == bid):
form.set_html(".status",
_("no changes needed to be made"))
elif pay_id:
# valid bid and created or existing bid id.
# check if already a transaction
if promote.auth_paid_promo(link, c.user, pay_id, bid):
form.redirect(promote.promo_edit_url(link))
else:
form.set_html(".status",
_("failed to authenticate card. sorry."))
@validate(VSponsor("link"),
article = VLink("link"))
def GET_pay(self, article):
data = get_account_info(c.user)
# no need for admins to play in the credit card area
if c.user_is_loggedin and c.user._id != article.author_id:
return self.abort404()
content = PaymentForm(link = article,
customer_id = data.customerProfileId,
profiles = data.paymentProfiles)
res = LinkInfoPage(link = article,
content = content)
return res.render()
def GET_link_thumb(self, *a, **kw):
"""
See GET_upload_sr_image for rationale
"""
return "nothing to see here."
@validate(VSponsor("link_id"),
link = VByName('link_id'),
file = VLength('file', 500*1024))
def POST_link_thumb(self, link=None, file=None):
errors = dict(BAD_CSS_NAME = "", IMAGE_ERROR = "")
try:
force_thumbnail(link, file)
except cssfilter.BadImage:
# if the image doesn't clean up nicely, abort
errors["IMAGE_ERROR"] = _("bad image")
if any(errors.values()):
return UploadedImage("", "", "upload", errors = errors).render()
else:
if not c.user_is_sponsor:
promote.unapproved_promo(link)
return UploadedImage(_('saved'), thumbnail_url(link), "",
errors = errors).render()


@@ -42,6 +42,7 @@ from Cookie import CookieError
from datetime import datetime
import sha, simplejson, locale
from urllib import quote, unquote
from simplejson import dumps
from r2.lib.tracking import encrypt, decrypt
@@ -409,10 +410,16 @@ def base_listing(fn):
@validate(num = VLimit('limit'),
after = VByName('after'),
before = VByName('before'),
count = VCount('count'))
count = VCount('count'),
target = VTarget("target"))
def new_fn(self, before, **env):
if c.render_style == "htmllite":
c.link_target = env.get("target")
elif "target" in env:
del env["target"]
kw = build_arg_list(fn, env)
#turn before into after/reverse
kw['reverse'] = False
if before:
@@ -454,8 +461,12 @@ class RedditController(BaseController):
c.cookies[g.login_cookie] = Cookie(value='')
def pre(self):
c.start_time = datetime.now(g.tz)
g.cache.caches = (LocalCache(),) + g.cache.caches[1:]
c.domain_prefix = request.environ.get("reddit-domain-prefix",
g.domain_prefix)
#check if user-agent needs a dose of rate-limiting
if not c.error_page:
ratelimit_agents()
@@ -506,6 +517,11 @@ class RedditController(BaseController):
c.have_messages = c.user.msgtime
c.user_is_admin = maybe_admin and c.user.name in g.admins
c.user_is_sponsor = c.user_is_admin or c.user.name in g.sponsors
if not g.disallow_db_writes:
c.user.update_last_visit(c.start_time)
#TODO: temporary
c.user_is_paid_sponsor = c.user.name.lower() in g.paid_sponsors
c.over18 = over18()
@@ -544,7 +560,7 @@ class RedditController(BaseController):
elif not c.user.pref_show_stylesheets and not c.cname:
c.allow_styles = False
#if the site has a cname, but we're not using it
elif c.site.domain and not c.cname:
elif c.site.domain and c.site.css_on_cname and not c.cname:
c.allow_styles = False
#check content cache
@@ -608,6 +624,7 @@ class RedditController(BaseController):
and request.method == 'GET'
and not c.user_is_loggedin
and not c.used_cache
and not c.dontcache
and response.status_code != 503
and response.content and response.content[0]):
g.rendercache.set(self.request_key(),
@@ -645,3 +662,11 @@ class RedditController(BaseController):
merged = copy(request.get)
merged.update(dict)
return request.path + utils.query_string(merged)
def api_wrapper(self, kw):
data = dumps(kw)
if request.method == "GET" and request.GET.get("callback"):
return "%s(%s)" % (websafe_json(request.GET.get("callback")),
websafe_json(data))
return self.sendstring(data)
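The `api_wrapper` above serializes a keyword dict to JSON and, on GET requests that supply a `callback` parameter, wraps the payload JSONP-style. A minimal standalone sketch of that pattern (`wrap_jsonp` and `_CALLBACK_RE` are hypothetical names; the real code routes the callback through `websafe_json` rather than rejecting bad names outright):

```python
import json
import re

# Hypothetical sanitizer: accept only a plain JS identifier
# (dots allowed for namespaced callbacks like jQuery.cb).
_CALLBACK_RE = re.compile(r"^[A-Za-z_$][\w$]*(\.[A-Za-z_$][\w$]*)*$")

def wrap_jsonp(data, callback=None):
    """Serialize data as JSON; wrap it in callback(...) when a valid
    JSONP callback name is supplied, as api_wrapper does for GETs."""
    body = json.dumps(data)
    if callback and _CALLBACK_RE.match(callback):
        return "%s(%s)" % (callback, body)
    return body
```

An invalid callback name simply falls back to the bare JSON body, which keeps the endpoint usable for ordinary AJAX callers.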


@@ -89,6 +89,8 @@ class ToolbarController(RedditController):
"/tb/$id36, show a given link with the toolbar"
if not link:
return self.abort404()
elif link.is_self:
return self.redirect(link.url)
res = Frame(title = link.title,
url = link.url,
@@ -160,7 +162,7 @@ class ToolbarController(RedditController):
wrapper = make_wrapper(render_class = StarkComment,
target = "_top")
b = TopCommentBuilder(link, CommentSortMenu.operator('top'),
b = TopCommentBuilder(link, CommentSortMenu.operator('confidence'),
wrap = wrapper)
listing = NestedListing(b, num = 10, # TODO: add config var


@@ -22,7 +22,7 @@
from pylons import c, request, g
from pylons.i18n import _
from pylons.controllers.util import abort
from r2.lib import utils, captcha
from r2.lib import utils, captcha, promote
from r2.lib.filters import unkeep_space, websafe, _force_unicode
from r2.lib.db.operators import asc, desc
from r2.lib.template_helpers import add_sr
@@ -30,12 +30,36 @@ from r2.lib.jsonresponse import json_respond, JQueryResponse, JsonResponse
from r2.lib.jsontemplates import api_type
from r2.models import *
from r2.lib.authorize import Address, CreditCard
from r2.controllers.errors import errors, UserRequiredException
from r2.controllers.errors import VerifiedUserRequiredException
from copy import copy
from datetime import datetime, timedelta
import re, inspect
import pycountry
def visible_promo(article):
is_promo = getattr(article, "promoted", None) is not None
is_author = (c.user_is_loggedin and
c.user._id == article.author_id)
# promos are visible only if comments are not disabled and the
# user is either the author or the link is live/previously live.
if is_promo:
return (not article.disable_comments and
(is_author or
article.promote_status >= promote.STATUS.promoted))
# not a promo, therefore it is visible
return True
def can_view_link_comments(article):
return (article.subreddit_slow.can_view(c.user) and
visible_promo(article))
def can_comment_link(article):
return (article.subreddit_slow.can_comment(c.user) and
visible_promo(article))
class Validator(object):
default_param = None
@@ -110,6 +134,8 @@ def validate(*simple_vals, **param_vals):
return fn(self, *a, **kw)
except UserRequiredException:
return self.intermediate_redirect('/login')
except VerifiedUserRequiredException:
return self.intermediate_redirect('/verify')
return newfn
return val
@@ -138,7 +164,10 @@ def api_validate(response_function):
simple_vals, param_vals, *a, **kw)
except UserRequiredException:
responder.send_failure(errors.USER_REQUIRED)
return self.response_func(responder.make_response())
return self.api_wrapper(responder.make_response())
except VerifiedUserRequiredException:
responder.send_failure(errors.VERIFIED_USER_REQUIRED)
return self.api_wrapper(responder.make_response())
return newfn
return val
return _api_validate
@@ -147,12 +176,12 @@ def api_validate(response_function):
@api_validate
def noresponse(self, self_method, responder, simple_vals, param_vals, *a, **kw):
self_method(self, *a, **kw)
return self.response_func({})
return self.api_wrapper({})
@api_validate
def json_validate(self, self_method, responder, simple_vals, param_vals, *a, **kw):
r = self_method(self, *a, **kw)
return self.response_func(r)
return self.api_wrapper(r)
@api_validate
def validatedForm(self, self_method, responder, simple_vals, param_vals,
@@ -175,7 +204,7 @@ def validatedForm(self, self_method, responder, simple_vals, param_vals,
if val:
return val
else:
return self.response_func(responder.make_response())
return self.api_wrapper(responder.make_response())
@@ -209,22 +238,58 @@ class VRequired(Validator):
else:
return item
class VLink(Validator):
def __init__(self, param, redirect = True, *a, **kw):
class VThing(Validator):
def __init__(self, param, thingclass, redirect = True, *a, **kw):
Validator.__init__(self, param, *a, **kw)
self.thingclass = thingclass
self.redirect = redirect
def run(self, link_id):
if link_id:
def run(self, thing_id):
if thing_id:
try:
aid = int(link_id, 36)
return Link._byID(aid, True)
tid = int(thing_id, 36)
thing = self.thingclass._byID(tid, True)
if thing.__class__ != self.thingclass:
raise TypeError("Expected %s, got %s" %
(self.thingclass, thing.__class__))
return thing
except (NotFound, ValueError):
if self.redirect:
abort(404, 'page not found')
else:
return None
class VLink(VThing):
def __init__(self, param, redirect = True, *a, **kw):
VThing.__init__(self, param, Link, redirect=redirect, *a, **kw)
class VCommentByID(VThing):
def __init__(self, param, redirect = True, *a, **kw):
VThing.__init__(self, param, Comment, redirect=redirect, *a, **kw)
class VAward(VThing):
def __init__(self, param, redirect = True, *a, **kw):
VThing.__init__(self, param, Award, redirect=redirect, *a, **kw)
class VAwardByCodename(Validator):
def run(self, codename, required_fullname=None):
if not codename:
return self.set_error(errors.NO_TEXT)
try:
a = Award._by_codename(codename)
except NotFound:
a = None
if a and required_fullname and a._fullname != required_fullname:
return self.set_error(errors.INVALID_OPTION)
else:
return a
class VTrophy(VThing):
def __init__(self, param, redirect = True, *a, **kw):
VThing.__init__(self, param, Trophy, redirect=redirect, *a, **kw)
class VMessage(Validator):
def run(self, message_id):
if message_id:
@@ -425,10 +490,45 @@ class VAdmin(Validator):
if not c.user_is_admin:
abort(404, "page not found")
class VSponsor(Validator):
class VVerifiedUser(VUser):
def run(self):
if not c.user_is_sponsor:
abort(403, 'forbidden')
VUser.run(self)
if not c.user.email_verified:
raise VerifiedUserRequiredException
class VSponsor(VVerifiedUser):
def user_test(self, thing):
return (thing.author_id == c.user._id)
def run(self, link_id = None):
VVerifiedUser.run(self)
if c.user_is_sponsor:
return
elif link_id:
try:
if '_' in link_id:
t = Link._by_fullname(link_id, True)
else:
aid = int(link_id, 36)
t = Link._byID(aid, True)
if self.user_test(t):
return
except (NotFound, ValueError):
pass
abort(403, 'forbidden')
class VTrafficViewer(VSponsor):
def user_test(self, thing):
return (VSponsor.user_test(self, thing) or
promote.is_traffic_viewer(thing, c.user))
# TODO: temporary validator to be replaced with VUser once we get the
# bugs worked out
class VPaidSponsor(VSponsor):
def run(self, link_id = None):
if c.user_is_paid_sponsor:
return
VSponsor.run(self, link_id)
class VSrModerator(Validator):
def run(self):
@@ -496,8 +596,10 @@ class VSubmitParent(VByName):
if isinstance(parent, Message):
return parent
else:
sr = parent.subreddit_slow
if c.user_is_loggedin and sr.can_comment(c.user):
link = parent
if isinstance(parent, Comment):
link = Link._byID(parent.link_id)
if c.user_is_loggedin and can_comment_link(link):
return parent
#else
abort(403, "forbidden")
@@ -614,6 +716,13 @@ class VExistingUname(VRequired):
VRequired.__init__(self, item, errors.NO_USER, *a, **kw)
def run(self, name):
if name and name.startswith('~') and c.user_is_admin:
try:
user_id = int(name[1:])
return Account._byID(user_id)
except (NotFound, ValueError):
return self.error(errors.USER_DOESNT_EXIST)
# make sure the name satisfies our user name regexp before
# bothering to look it up.
name = chkuser(name)
@@ -630,31 +739,74 @@ class VUserWithEmail(VExistingUname):
if not user or not hasattr(user, 'email') or not user.email:
return self.error(errors.NO_EMAIL_FOR_USER)
return user
class VBoolean(Validator):
def run(self, val):
return val != "off" and bool(val)
class VInt(Validator):
def __init__(self, param, min=None, max=None, *a, **kw):
self.min = min
self.max = max
class VNumber(Validator):
def __init__(self, param, min=None, max=None, coerce = True,
error = errors.BAD_NUMBER, *a, **kw):
self.min = self.cast(min) if min is not None else None
self.max = self.cast(max) if max is not None else None
self.coerce = coerce
self.error = error
Validator.__init__(self, param, *a, **kw)
def cast(self, val):
raise NotImplementedError
def run(self, val):
if not val:
return
try:
val = int(val)
val = self.cast(val)
if self.min is not None and val < self.min:
val = self.min
if self.coerce:
val = self.min
else:
raise ValueError, ""
elif self.max is not None and val > self.max:
val = self.max
if self.coerce:
val = self.max
else:
raise ValueError, ""
return val
except ValueError:
self.set_error(errors.BAD_NUMBER)
self.set_error(self.error, msg_params = dict(min=self.min,
max=self.max))
class VInt(VNumber):
def cast(self, val):
return int(val)
class VFloat(VNumber):
def cast(self, val):
return float(val)
class VBid(VNumber):
def __init__(self, bid, link_id):
self.duration = 1
VNumber.__init__(self, (bid, link_id), min = g.min_promote_bid,
max = g.max_promote_bid, coerce = False,
error = errors.BAD_BID)
def cast(self, val):
return float(val)/self.duration
def run(self, bid, link_id):
if link_id:
try:
link = Thing._by_fullname(link_id, return_dict = False,
data=True)
self.duration = max((link.promote_until - link._date).days, 1)
except NotFound:
pass
if VNumber.run(self, bid):
return float(bid)
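`VNumber`'s `coerce` flag decides whether an out-of-range value is clamped to the nearest bound or rejected outright (`VBid` passes `coerce = False` so a bad bid raises an error instead of being silently adjusted). A small framework-free sketch of that clamp-vs-reject choice (`check_number` is a hypothetical name):

```python
def check_number(val, lo=None, hi=None, coerce_to_range=True):
    """Parse val as an int. Out-of-range values are clamped into
    [lo, hi] when coerce_to_range is True, rejected (None) otherwise,
    mirroring VNumber's coerce flag. Returns None on unparsable input."""
    try:
        n = int(val)
    except (TypeError, ValueError):
        return None
    if lo is not None and n < lo:
        return lo if coerce_to_range else None
    if hi is not None and n > hi:
        return hi if coerce_to_range else None
    return n
```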
class VCssName(Validator):
"""
@@ -728,7 +880,7 @@ class VRatelimit(Validator):
field = 'ratelimit')
else:
self.set_error(self.error)
@classmethod
def ratelimit(self, rate_user = False, rate_ip = False, prefix = "rate_",
seconds = None):
@@ -750,28 +902,33 @@ class VCommentIDs(Validator):
comments = Comment._byID(cids, data=True, return_dict = False)
return comments
class VCacheKey(Validator):
def __init__(self, cache_prefix, param, *a, **kw):
class CachedUser(object):
def __init__(self, cache_prefix, user, key):
self.cache_prefix = cache_prefix
self.user = None
self.key = None
Validator.__init__(self, param, *a, **kw)
self.user = user
self.key = key
def clear(self):
if self.key and self.cache_prefix:
g.cache.delete(str(self.cache_prefix + "_" + self.key))
def run(self, key, name):
self.key = key
class VCacheKey(Validator):
def __init__(self, cache_prefix, param, *a, **kw):
self.cache_prefix = cache_prefix
Validator.__init__(self, param, *a, **kw)
def run(self, key):
c_user = CachedUser(self.cache_prefix, None, key)
if key:
uid = g.cache.get(str(self.cache_prefix + "_" + self.key))
uid = g.cache.get(str(self.cache_prefix + "_" + key))
if uid:
try:
self.user = Account._byID(uid, data = True)
c_user.user = Account._byID(uid, data = True)
except NotFound:
return
#found everything we need
return self
return c_user
self.set_error(errors.EXPIRED)
class VOneOf(Validator):
@@ -894,3 +1051,154 @@ class ValidDomain(Validator):
if url and is_banned_domain(url):
self.set_error(errors.BANNED_DOMAIN)
class VDate(Validator):
"""
Date checker that accepts string inputs in %m/%d/%Y format.
Optional parameters include 'past' and 'future' which specify how
far (in days) into the past or future the date must be to be
acceptable.
NOTE: the 'future' param takes precedence during evaluation.
Error conditions:
* BAD_DATE on malformed date strings (strptime parse failure)
* BAD_FUTURE_DATE and BAD_PAST_DATE on respective range errors.
"""
def __init__(self, param, future=None, past = None,
admin_override = False,
reference_date = lambda : datetime.now(g.tz),
business_days = False):
self.future = future
self.past = past
# are weekends to be excluded from the interval?
self.business_days = business_days
# function for generating "now"
self.reference_date = reference_date
# do we let admins override date range checking?
self.override = admin_override
Validator.__init__(self, param)
def run(self, date):
now = self.reference_date()
override = c.user_is_sponsor and self.override
try:
date = datetime.strptime(date, "%m/%d/%Y")
if not override:
# can't put in __init__ since we need the date on the fly
future = utils.make_offset_date(now, self.future,
business_days = self.business_days)
past = utils.make_offset_date(now, self.past, future = False,
business_days = self.business_days)
if self.future is not None and date.date() < future.date():
self.set_error(errors.BAD_FUTURE_DATE,
{"day": future.days})
elif self.past is not None and date.date() > past.date():
self.set_error(errors.BAD_PAST_DATE,
{"day": past.days})
return date.replace(tzinfo=g.tz)
except (ValueError, TypeError):
self.set_error(errors.BAD_DATE)
class VDateRange(VDate):
"""
Adds range validation to VDate. In addition to satisfying
future/past requirements in VDate, two date fields must be
provided and they must be in order.
Additional Error conditions:
* BAD_DATE_RANGE if start_date is not less than end_date
"""
def run(self, *a):
try:
start_date, end_date = [VDate.run(self, x) for x in a]
if not start_date or not end_date or end_date < start_date:
self.set_error(errors.BAD_DATE_RANGE)
return (start_date, end_date)
except ValueError:
# insufficient number of arguments provided (expect 2)
self.set_error(errors.BAD_DATE_RANGE)
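`VDateRange` builds on `VDate`: both strings must parse, and the end date must not precede the start. A self-contained sketch of the same check using VDate's %m/%d/%Y format (`parse_date_range` is a hypothetical name; timezone handling and the past/future offsets are omitted):

```python
from datetime import datetime

def parse_date_range(start_s, end_s, fmt="%m/%d/%Y"):
    """Parse two date strings and verify they form an ordered range;
    returns (start, end) on success, None on a parse failure or an
    out-of-order pair, mirroring VDateRange's BAD_DATE_RANGE check."""
    try:
        start = datetime.strptime(start_s, fmt)
        end = datetime.strptime(end_s, fmt)
    except (TypeError, ValueError):
        return None
    if end < start:
        return None
    return (start, end)
```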
class VDestination(Validator):
def __init__(self, param = 'dest', default = "", **kw):
self.default = default
Validator.__init__(self, param, **kw)
def run(self, dest):
return dest or request.referer or self.default
class ValidAddress(Validator):
def __init__(self, param, usa_only = True):
self.usa_only = usa_only
Validator.__init__(self, param)
def set_error(self, msg, field):
Validator.set_error(self, errors.BAD_ADDRESS,
dict(message=msg), field = field)
def run(self, firstName, lastName, company, address,
city, state, zipCode, country, phoneNumber):
if not firstName:
self.set_error(_("please provide a first name"), "firstName")
elif not lastName:
self.set_error(_("please provide a last name"), "lastName")
elif not address:
self.set_error(_("please provide an address"), "address")
elif not city:
self.set_error(_("please provide your city"), "city")
elif not state:
self.set_error(_("please provide your state"), "state")
elif not zipCode:
self.set_error(_("please provide your zip or post code"), "zip")
elif (not self.usa_only and
(not country or not pycountry.countries.get(alpha2=country))):
self.set_error(_("please pick a country"), "country")
else:
if self.usa_only:
country = 'United States'
else:
country = pycountry.countries.get(alpha2=country).name
return Address(firstName = firstName,
lastName = lastName,
company = company or "",
address = address,
city = city, state = state,
zip = zipCode, country = country,
phoneNumber = phoneNumber or "")
class ValidCard(Validator):
valid_ccn = re.compile(r"\d{13,16}")
valid_date = re.compile(r"\d\d\d\d-\d\d")
valid_ccv = re.compile(r"\d{3,4}")
def set_error(self, msg, field):
Validator.set_error(self, errors.BAD_CARD,
dict(message=msg), field = field)
def run(self, cardNumber, expirationDate, cardCode):
if not self.valid_ccn.match(cardNumber or ""):
self.set_error(_("credit card numbers should be 13 to 16 digits"),
"cardNumber")
elif not self.valid_date.match(expirationDate or ""):
self.set_error(_("dates should be YYYY-MM"), "expirationDate")
elif not self.valid_ccv.match(cardCode or ""):
self.set_error(_("card verification codes should be 3 or 4 digits"),
"cardCode")
else:
return CreditCard(cardNumber = cardNumber,
expirationDate = expirationDate,
cardCode = cardCode)
class VTarget(Validator):
target_re = re.compile("^[\w_-]{3,20}$")
def run(self, name):
if name and self.target_re.match(name):
return name

File diff suppressed because it is too large.

r2/r2/lib/amqp.py (new file, 211 lines)

@@ -0,0 +1,211 @@
# The contents of this file are subject to the Common Public Attribution
# License Version 1.0. (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
# http://code.reddit.com/LICENSE. The License is based on the Mozilla Public
# License Version 1.1, but Sections 14 and 15 have been added to cover use of
# software over a computer network and provide for limited attribution for the
# Original Developer. In addition, Exhibit A has been modified to be consistent
# with Exhibit B.
#
# Software distributed under the License is distributed on an "AS IS" basis,
# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for
# the specific language governing rights and limitations under the License.
#
# The Original Code is Reddit.
#
# The Original Developer is the Initial Developer. The Initial Developer of the
# Original Code is CondeNet, Inc.
#
# All portions of the code written by CondeNet are Copyright (c) 2006-2009
# CondeNet, Inc. All Rights Reserved.
################################################################################
from threading import local
from datetime import datetime
import os
import sys
import time
import errno
import socket
from amqplib import client_0_8 as amqp
from r2.lib.cache import LocalCache
from pylons import g
amqp_host = g.amqp_host
amqp_user = g.amqp_user
amqp_pass = g.amqp_pass
log = g.log
amqp_virtual_host = g.amqp_virtual_host
connection = None
channel = local()
have_init = False
#there are two ways of interacting with this module: add_item and
#handle_items. add_item should only be called from the utils.worker
#thread since it might block for an arbitrary amount of time while
#trying to get a connection to amqp.
def get_connection():
global connection
global have_init
while not connection:
try:
connection = amqp.Connection(host = amqp_host,
userid = amqp_user,
password = amqp_pass,
virtual_host = amqp_virtual_host,
insist = False)
except (socket.error, IOError):
print 'error connecting to amqp'
time.sleep(1)
#don't run init_queue until someone actually needs it. this allows
#the app server to start and serve most pages if amqp isn't
#running
if not have_init:
init_queue()
have_init = True
def get_channel(reconnect = False):
global connection
global channel
global log
# Periodic (and increasing with uptime) errors appearing when
# connection object is still present, but appears to have been
# closed. This checks that the connection is still open.
if connection and connection.channels is None:
log.error("Error: amqp.py, connection object with no available channels. Reconnecting...")
connection = None
if not connection or reconnect:
channel.chan = None
connection = None
get_connection()
if not getattr(channel, 'chan', None):
channel.chan = connection.channel()
return channel.chan
def init_queue():
from r2.models import admintools
exchange = 'reddit_exchange'
chan = get_channel()
#we'll have one exchange for now
chan.exchange_declare(exchange=exchange,
type='direct',
durable=True,
auto_delete=False)
#prec_links queue
chan.queue_declare(queue='prec_links',
durable=True,
exclusive=False,
auto_delete=False)
chan.queue_bind(routing_key='prec_links',
queue='prec_links',
exchange=exchange)
chan.queue_declare(queue='scraper_q',
durable=True,
exclusive=False,
auto_delete=False)
chan.queue_declare(queue='searchchanges_q',
durable=True,
exclusive=False,
auto_delete=False)
# new_link
chan.queue_bind(routing_key='new_link',
queue='scraper_q',
exchange=exchange)
chan.queue_bind(routing_key='new_link',
queue='searchchanges_q',
exchange=exchange)
# new_subreddit
chan.queue_bind(routing_key='new_subreddit',
queue='searchchanges_q',
exchange=exchange)
# new_comment (nothing here for now)
# while new items will be put here automatically, we also need a
# way to specify that the item has changed by hand
chan.queue_bind(routing_key='searchchanges_q',
queue='searchchanges_q',
exchange=exchange)
admintools.admin_queues(chan, exchange)
def add_item(routing_key, body, message_id = None):
"""adds an item onto a queue. If the connection to amqp is lost it
will try to reconnect and then call itself again."""
if not amqp_host:
print "Ignoring amqp message %r to %r" % (body, routing_key)
return
chan = get_channel()
msg = amqp.Message(body,
timestamp = datetime.now(),
delivery_mode = 2)
if message_id:
msg.properties['message_id'] = message_id
try:
chan.basic_publish(msg,
exchange = 'reddit_exchange',
routing_key = routing_key)
except Exception as e:
if getattr(e, 'errno', None) == errno.EPIPE:
get_channel(True)
add_item(routing_key, body, message_id)
else:
raise
def handle_items(queue, callback, ack = True, limit = 1, drain = False):
"""Call callback() on every item in a particular queue. If the
connection to the queue is lost, it will die. Intended to be
used as a long-running process."""
# debuffer stdout so that logging comes through more real-time
sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
chan = get_channel()
while True:
msg = chan.basic_get(queue)
if not msg and drain:
return
elif not msg:
time.sleep(1)
continue
items = []
#reset the local cache
g.cache.caches = (LocalCache(),) + g.cache.caches[1:]
while msg:
items.append(msg)
if len(items) >= limit:
break # the innermost loop only
msg = chan.basic_get(queue)
callback(items)
if ack:
for item in items:
chan.basic_ack(item.delivery_tag)
def empty_queue(queue):
"""debug function to completely erase the contents of a queue"""
chan = get_channel()
chan.queue_purge(queue)
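`handle_items` pulls messages in batches of up to `limit`, runs the callback on each batch, and only acks afterwards. The batching loop can be sketched without a broker, using an in-memory deque as a stand-in for the channel (`handle_batches` is a hypothetical name; connection handling, draining, and acks are omitted):

```python
from collections import deque

def handle_batches(pending, callback, limit=1):
    """Drain an in-memory deque in batches of up to `limit` items,
    calling callback on each batch -- a broker-free sketch of the
    inner batching loop in handle_items above."""
    while pending:
        items = []
        while pending and len(items) < limit:
            items.append(pending.popleft())
        callback(items)

# usage: collect the batch sizes produced for 7 items with limit=3
sizes = []
q = deque(range(7))
handle_batches(q, lambda batch: sizes.append(len(batch)), limit=3)
```

Acking only after `callback(items)` returns means a crash mid-batch leaves the messages unacked and redeliverable, which is the at-least-once behavior the consumers above rely on.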


@@ -50,7 +50,11 @@ class Globals(object):
'num_serendipity',
'sr_dropdown_threshold',
]
float_props = ['min_promote_bid',
'max_promote_bid',
]
bool_props = ['debug', 'translator',
'sqlprinting',
'template_debug',
@@ -58,8 +62,11 @@ class Globals(object):
'enable_doquery',
'use_query_cache',
'write_query_queue',
'show_awards',
'css_killswitch',
'db_create_tables']
'db_create_tables',
'disallow_db_writes',
'allow_shutdown']
tuple_props = ['memcaches',
'rec_cache',
@@ -67,8 +74,10 @@ class Globals(object):
'rendercaches',
'admins',
'sponsors',
# TODO: temporary until we open it up to all users
'paid_sponsors',
'monitored_servers',
'default_srs',
'automatic_reddits',
'agents',
'allowed_css_linked_domains']
@@ -103,14 +112,19 @@ class Globals(object):
if not k.startswith("_") and not hasattr(self, k):
if k in self.int_props:
v = int(v)
elif k in self.float_props:
v = float(v)
elif k in self.bool_props:
v = self.to_bool(v)
elif k in self.tuple_props:
v = tuple(self.to_iter(v))
setattr(self, k, v)
self.paid_sponsors = set(x.lower() for x in self.paid_sponsors)
# initialize caches
mc = Memcache(self.memcaches, pickleProtocol = 1)
self.memcache = mc
self.cache = CacheChain((LocalCache(), mc))
self.permacache = Memcache(self.permacaches, pickleProtocol = 1)
self.rendercache = Memcache(self.rendercaches, pickleProtocol = 1)
@@ -173,6 +187,11 @@ class Globals(object):
self.log.addHandler(logging.StreamHandler())
if self.debug:
self.log.setLevel(logging.DEBUG)
else:
self.log.setLevel(logging.WARNING)
# set log level for pycountry which is chatty
logging.getLogger('pycountry.db').setLevel(logging.CRITICAL)
if not self.media_domain:
self.media_domain = self.domain
@@ -190,26 +209,35 @@ class Globals(object):
self.reddit_host = socket.gethostname()
self.reddit_pid = os.getpid()
#the shutdown toggle
self.shutdown = False
#if we're going to use the query_queue, we need amqp
if self.write_query_queue and not self.amqp_host:
raise Exception("amqp_host must be defined to use the query queue")
@staticmethod
def to_bool(x):
return (x.lower() == 'true') if x else None
@staticmethod
def to_iter(v, delim = ','):
-return (x.strip() for x in v.split(delim))
+return (x.strip() for x in v.split(delim) if x)
def load_db_params(self, gc):
self.databases = tuple(self.to_iter(gc['databases']))
self.db_params = {}
if not self.databases:
return
dbm = db_manager.db_manager()
-db_params = ('name', 'db_host', 'db_user', 'db_pass',
-'pool_size', 'max_overflow')
+db_param_names = ('name', 'db_host', 'db_user', 'db_pass',
+'pool_size', 'max_overflow')
for db_name in self.databases:
conf_params = self.to_iter(gc[db_name + '_db'])
-params = dict(zip(db_params, conf_params))
+params = dict(zip(db_param_names, conf_params))
dbm.engines[db_name] = db_manager.get_engine(**params)
self.db_params[db_name] = params
dbm.type_db = dbm.engines[gc['type_db']]
dbm.relation_type_db = dbm.engines[gc['rel_type_db']]
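The `Globals` hunks above coerce raw ini-file strings into typed values by checking each key against the `int_props`/`float_props`/`bool_props`/`tuple_props` lists. A minimal sketch of the same idea (the prop names and `coerce_config` helper here are illustrative, not reddit's API):

```python
def to_bool(x):
    # mirrors Globals.to_bool: empty string coerces to None
    return (x.lower() == 'true') if x else None

def to_iter(v, delim=','):
    # mirrors the fixed to_iter: skip empty entries, e.g. trailing commas
    return (x.strip() for x in v.split(delim) if x)

INT_PROPS = {'num_serendipity'}
BOOL_PROPS = {'disallow_db_writes'}
TUPLE_PROPS = {'automatic_reddits'}

def coerce_config(raw):
    """Coerce a dict of ini-file strings into typed values, dispatching
    on which prop list the key belongs to."""
    out = {}
    for k, v in raw.items():
        if k in INT_PROPS:
            v = int(v)
        elif k in BOOL_PROPS:
            v = to_bool(v)
        elif k in TUPLE_PROPS:
            v = tuple(to_iter(v))
        out[k] = v
    return out

cfg = coerce_config({'num_serendipity': '5',
                     'disallow_db_writes': 'True',
                     'automatic_reddits': 'a, b,'})
```

The `if x` filter added to `to_iter` is what makes a trailing comma in the ini file harmless rather than producing an empty-string entry.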


@@ -0,0 +1,22 @@
# The contents of this file are subject to the Common Public Attribution
# License Version 1.0. (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
# http://code.reddit.com/LICENSE. The License is based on the Mozilla Public
# License Version 1.1, but Sections 14 and 15 have been added to cover use of
# software over a computer network and provide for limited attribution for the
# Original Developer. In addition, Exhibit A has been modified to be consistent
# with Exhibit B.
#
# Software distributed under the License is distributed on an "AS IS" basis,
# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for
# the specific language governing rights and limitations under the License.
#
# The Original Code is Reddit.
#
# The Original Developer is the Initial Developer. The Initial Developer of the
# Original Code is CondeNet, Inc.
#
# All portions of the code written by CondeNet are Copyright (c) 2006-2009
# CondeNet, Inc. All Rights Reserved.
################################################################################
from interaction import *

r2/r2/lib/authorize/api.py (new file, 589 lines)

@@ -0,0 +1,589 @@
"""
For talking to authorize.net credit card payments via their XML api.
This file consists mostly of wrapper classes for dealing with their
API, while the actual useful functions live in interaction.py
"""
from pylons import g
from httplib import HTTPSConnection
from urlparse import urlparse
import socket, re
from BeautifulSoup import BeautifulStoneSoup
from r2.lib.utils import iters, Storage
from r2.models import NotFound
from r2.models.bidding import CustomerID, PayID, ShippingAddress
# list of the most common errors.
Errors = Storage(TESTMODE = "E00009",
TRANSACTION_FAIL = "E00027",
DUPLICATE_RECORD = "E00039",
RECORD_NOT_FOUND = "E00040",
TOO_MANY_PAY_PROFILES = "E00042",
TOO_MANY_SHIP_ADDRESSES = "E00043")
class AuthorizeNetException(Exception):
pass
class SimpleXMLObject(object):
"""
All API transactions are done with authorize.net using XML, so
here's a class for generating and extracting structured data from
XML.
"""
_keys = []
def __init__(self, **kw):
self._used_keys = self._keys if self._keys else kw.keys()
for k in self._used_keys:
if not hasattr(self, k):
setattr(self, k, kw.get(k, ""))
@staticmethod
def simple_tag(name, content, **attrs):
attrs = " ".join('%s="%s"' % (k, v) for k, v in attrs.iteritems())
if attrs: attrs = " " + attrs
return ("<%(name)s%(attrs)s>%(content)s</%(name)s>" %
dict(name = name, content = content, attrs = attrs))
def toXML(self):
content = []
def process(k, v):
if isinstance(v, SimpleXMLObject):
v = v.toXML()
if v is not None:
content.append(self.simple_tag(k, v))
for k in self._used_keys:
v = getattr(self, k)
if isinstance(v, iters):
for val in v:
process(k, val)
else:
process(k, v)
return self._wrapper("".join(content))
@classmethod
def fromXML(cls, data):
kw = {}
for k in cls._keys:
d = data.find(k.lower())
if d and d.contents:
kw[k] = unicode(d.contents[0])
return cls(**kw)
def __repr__(self):
return "<%s {%s}>" % (self.__class__.__name__,
",".join("%s=%s" % (k, repr(getattr(self, k)))
for k in self._used_keys))
def _name(self):
name = self.__class__.__name__
return name[0].lower() + name[1:]
def _wrapper(self, content):
return content
class Auth(SimpleXMLObject):
_keys = ["name", "transactionKey"]
class Address(SimpleXMLObject):
_keys = ["firstName", "lastName", "company", "address",
"city", "state", "zip", "country", "phoneNumber",
"faxNumber",
"customerPaymentProfileId",
"customerAddressId" ]
def __init__(self, **kw):
kw['customerPaymentProfileId'] = kw.get("customerPaymentProfileId",
None)
kw['customerAddressId'] = kw.get("customerAddressId", None)
SimpleXMLObject.__init__(self, **kw)
class CreditCard(SimpleXMLObject):
_keys = ["cardNumber", "expirationDate", "cardCode"]
class Profile(SimpleXMLObject):
"""
Converts a user into a Profile object.
"""
_keys = ["merchantCustomerId", "description",
"email", "customerProfileId", "paymentProfiles", "shipToList",
"validationMode"]
def __init__(self, user, paymentProfiles, address,
validationMode = None):
SimpleXMLObject.__init__(self, merchantCustomerId = user._fullname,
description = user.name, email = "",
paymentProfiles = paymentProfiles,
shipToList = address,
validationMode = validationMode,
customerProfileId=CustomerID.get_id(user))
class PaymentProfile(SimpleXMLObject):
_keys = ["billTo", "payment", "customerPaymentProfileId", "validationMode"]
def __init__(self, billTo, card, paymentId = None,
validationMode = None):
SimpleXMLObject.__init__(self, billTo = billTo,
customerPaymentProfileId = paymentId,
payment = SimpleXMLObject(creditCard = card),
validationMode = validationMode)
@classmethod
def fromXML(cls, res):
payid = int(res.customerpaymentprofileid.contents[0])
return cls(Address.fromXML(res.billto),
CreditCard.fromXML(res.payment), payid)
class Order(SimpleXMLObject):
_keys = ["invoiceNumber", "description", "purchaseOrderNumber"]
class Transaction(SimpleXMLObject):
_keys = ["amount", "customerProfileId", "customerPaymentProfileId",
"transId", "order"]
def __init__(self, amount, profile_id, pay_id, trans_id = None,
order = None):
SimpleXMLObject.__init__(self, amount = amount,
customerProfileId = profile_id,
customerPaymentProfileId = pay_id,
transId = trans_id,
order = order)
def _wrapper(self, content):
return self.simple_tag(self._name(), content)
# authorize and charge
class ProfileTransAuthCapture(Transaction): pass
# only authorize (no charge is made)
class ProfileTransAuthOnly(Transaction): pass
# charge only (requires previous auth_only)
class ProfileTransPriorAuthCapture(Transaction): pass
# stronger than above: charge even on decline (not sure why you would want to)
class ProfileTransCaptureOnly(Transaction): pass
# refund a transaction
class ProfileTransRefund(Transaction): pass
# void a transaction
class ProfileTransVoid(Transaction): pass
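Each of these empty subclasses serializes under its own XML tag because `_name()` lowercases the first character of the class name and `_wrapper` uses it as the enclosing element. A small sketch of that convention (here `sorted()` is used for deterministic attribute order, an assumption; the original iterates a dict):

```python
def tag_name(cls):
    # lowercase the first character of the class name, as _name() does
    name = cls.__name__
    return name[0].lower() + name[1:]

def simple_tag(name, content, **attrs):
    # same shape as SimpleXMLObject.simple_tag
    attr_str = " ".join('%s="%s"' % (k, v) for k, v in sorted(attrs.items()))
    if attr_str:
        attr_str = " " + attr_str
    return "<%s%s>%s</%s>" % (name, attr_str, content, name)

class ProfileTransAuthOnly:
    pass

wrapped = simple_tag(tag_name(ProfileTransAuthOnly), "<amount>9.99</amount>")
```

So picking a transaction type is just picking a class, and the serializer does the rest.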
#-----
class AuthorizeNetRequest(SimpleXMLObject):
_keys = ["merchantAuthentication"]
@property
def merchantAuthentication(self):
return Auth(name = g.authorizenetname,
transactionKey = g.authorizenetkey)
def _wrapper(self, content):
return ('<?xml version="1.0" encoding="utf-8"?>' +
self.simple_tag(self._name(), content,
xmlns="AnetApi/xml/v1/schema/AnetApiSchema.xsd"))
def make_request(self):
g.log.error("Authorize.net request:")
g.log.error(self.toXML())
u = urlparse(g.authorizenetapi)
try:
conn = HTTPSConnection(u.hostname, u.port)
conn.request("POST", u.path, self.toXML(),
{"Content-type": "text/xml"})
res = conn.getresponse()
res = self.handle_response(res.read())
conn.close()
return res
except socket.error:
return False
def is_error_code(self, res, code):
return (res.message.code and res.message.code.contents and
res.message.code.contents[0] == code)
def process_error(self, res):
raise AuthorizeNetException, res
_autoclose_re = re.compile("<([^/]+)/>")
def _autoclose_handler(self, m):
return "<%(m)s></%(m)s>" % dict(m = m.groups()[0])
def handle_response(self, res):
print "RESPONSE:"
print res
res = self._autoclose_re.sub(self._autoclose_handler, res)
res = BeautifulStoneSoup(res, markupMassage=False)
if res.resultcode.contents[0] == u"Ok":
return self.process_response(res)
else:
return self.process_error(res)
def process_response(self, res):
raise NotImplementedError
class CustomerRequest(AuthorizeNetRequest):
_keys = AuthorizeNetRequest._keys + ["customerProfileId"]
def __init__(self, user, **kw):
if isinstance(user, int):
cust_id = user
self._user = None
else:
cust_id = CustomerID.get_id(user)
self._user = user
AuthorizeNetRequest.__init__(self, customerProfileId = cust_id, **kw)
# --- real request classes below
class CreateCustomerProfileRequest(AuthorizeNetRequest):
"""
Create a new user object on authorize.net and return the new object ID.
Handles the case of already existing users on either end
gracefully and will update the Account object accordingly.
"""
_keys = AuthorizeNetRequest._keys + ["profile", "validationMode"]
def __init__(self, user, validationMode = None):
# cache the user object passed in
self._user = user
AuthorizeNetRequest.__init__(self,
profile = Profile(user, None, None),
validationMode = validationMode)
def process_response(self, res):
customer_id = int(res.customerprofileid.contents[0])
CustomerID.set(self._user, customer_id)
return customer_id
def make_request(self):
# don't send a new request if the user already has an id
return (CustomerID.get_id(self._user) or
AuthorizeNetRequest.make_request(self))
re_lost_id = re.compile("A duplicate record with id (\d+) already exists")
def process_error(self, res):
if self.is_error_code(res, Errors.DUPLICATE_RECORD):
# D'oh. We lost one
m = self.re_lost_id.match(res.find("text").contents[0]).groups()
CustomerID.set(self._user, m[0])
# otherwise, we might have sent a user that already had a customer ID
cust_id = CustomerID.get_id(self._user)
if cust_id:
return cust_id
return AuthorizeNetRequest.process_error(self, res)
class CreateCustomerPaymentProfileRequest(CustomerRequest):
"""
Adds a payment profile to an existing user object. The profile
includes a valid address and a credit card number.
"""
_keys = (CustomerRequest._keys + ["paymentProfile", "validationMode"])
def __init__(self, user, address, creditcard, validationMode = None):
CustomerRequest.__init__(self, user,
paymentProfile = PaymentProfile(address,
creditcard),
validationMode = validationMode)
def process_response(self, res):
pay_id = int(res.customerpaymentprofileid.contents[0])
PayID.add(self._user, pay_id)
return pay_id
def process_error(self, res):
if self.is_error_code(res, Errors.DUPLICATE_RECORD):
u, data = GetCustomerProfileRequest(self._user).make_request()
profiles = data.paymentProfiles
if len(profiles) == 1:
return profiles[0].customerPaymentProfileId
return
return CustomerRequest.process_error(self,res)
class CreateCustomerShippingAddressRequest(CustomerRequest):
"""
Adds a shipping address.
"""
_keys = CustomerRequest._keys + ["address"]
def process_response(self, res):
pay_id = int(res.customeraddressid.contents[0])
ShippingAddress.add(self._user, pay_id)
return pay_id
def process_error(self, res):
if self.is_error_code(res, Errors.DUPLICATE_RECORD):
return
return CustomerRequest.process_error(self, res)
class GetCustomerPaymentProfileRequest(CustomerRequest):
_keys = CustomerRequest._keys + ["customerPaymentProfileId"]
"""
Gets a payment profile by user Account object and authorize.net
profileid of the payment profile.
Error handling: make_request returns None if the id generates a
RECORD_NOT_FOUND error from the server. The user object is
cleaned up in either case; if the user object lacked the (valid)
pay id, it is added to its list, while if the pay id is invalid,
it is removed from the user object.
"""
def __init__(self, user, profileid):
CustomerRequest.__init__(self, user,
customerPaymentProfileId=profileid)
def process_response(self, res):
# add the id to the user object in case something has gone wrong
PayID.add(self._user, self.customerPaymentProfileId)
return PaymentProfile.fromXML(res.paymentprofile)
def process_error(self, res):
if self.is_error_code(res, Errors.RECORD_NOT_FOUND):
PayID.delete(self._user, self.customerPaymentProfileId)
return CustomerRequest.process_error(self,res)
class GetCustomerShippingAddressRequest(CustomerRequest):
"""
Same as GetCustomerPaymentProfileRequest except with shipping addresses.
Error handling is identical.
"""
_keys = CustomerRequest._keys + ["customerAddressId"]
def __init__(self, user, shippingid):
CustomerRequest.__init__(self, user,
customerAddressId=shippingid)
def process_response(self, res):
# add the id to the user object in case something has gone wrong
ShippingAddress.add(self._user, self.customerAddressId)
return Address.fromXML(res.address)
def process_error(self, res):
if self.is_error_code(res, Errors.RECORD_NOT_FOUND):
ShippingAddress.delete(self._user, self.customerAddressId)
return CustomerRequest.process_error(self,res)
class GetCustomerProfileIdsRequest(AuthorizeNetRequest):
"""
Get a list of all customer ids that have been recorded with
authorize.net
"""
def process_response(self, res):
return [int(x.contents[0]) for x in res.ids.findAll('numericstring')]
class GetCustomerProfileRequest(CustomerRequest):
"""
Given a user, find their customer information.
"""
def process_response(self, res):
from r2.models import Account
fullname = res.merchantcustomerid.contents[0]
name = res.description.contents[0]
customer_id = int(res.customerprofileid.contents[0])
acct = Account._by_name(name)
# make sure we are updating the correct account!
if acct._fullname == fullname:
CustomerID.set(acct, customer_id)
else:
raise AuthorizeNetException, \
"account name doesn't match authorize.net account"
# parse the ship-to list, and make sure the Account is up to date
ship_to = []
for profile in res.findAll("shiptolist"):
a = Address.fromXML(profile)
ShippingAddress.add(acct, a.customerAddressId)
ship_to.append(a)
# parse the payment profiles, and ditto
profiles = []
for profile in res.findAll("paymentprofiles"):
a = Address.fromXML(profile)
cc = CreditCard.fromXML(profile.payment)
payprof = PaymentProfile(a, cc,int(a.customerPaymentProfileId))
PayID.add(acct, a.customerPaymentProfileId)
profiles.append(payprof)
return acct, Profile(acct, profiles, ship_to)
class DeleteCustomerProfileRequest(CustomerRequest):
"""
Delete a customer profile
"""
def process_response(self, res):
if self._user:
CustomerID.delete(self._user)
return
def process_error(self, res):
if self.is_error_code(res, Errors.RECORD_NOT_FOUND):
CustomerID.delete(self._user)
return CustomerRequest.process_error(self,res)
class DeleteCustomerPaymentProfileRequest(GetCustomerPaymentProfileRequest):
"""
Delete a customer payment profile
"""
def process_response(self, res):
PayID.delete(self._user,self.customerPaymentProfileId)
return True
def process_error(self, res):
if self.is_error_code(res, Errors.RECORD_NOT_FOUND):
PayID.delete(self._user,self.customerPaymentProfileId)
return GetCustomerPaymentProfileRequest.process_error(self,res)
class DeleteCustomerShippingAddressRequest(GetCustomerShippingAddressRequest):
"""
Delete a customer shipping address
"""
def process_response(self, res):
ShippingAddress.delete(self._user, self.customerAddressId)
return True
def process_error(self, res):
if self.is_error_code(res, Errors.RECORD_NOT_FOUND):
ShippingAddress.delete(self._user, self.customerAddressId)
return GetCustomerShippingAddressRequest.process_error(self, res)
# TODO
#class UpdateCustomerProfileRequest(AuthorizeNetRequest):
# _keys = (AuthorizeNetRequest._keys + ["profile"])
#
# def __init__(self, user):
# profile = Profile(user, None, None)
# AuthorizeNetRequest.__init__(self, profile = profile)
class UpdateCustomerPaymentProfileRequest(CreateCustomerPaymentProfileRequest):
"""
For updating the user's payment profile
"""
def __init__(self, user, paymentid, address, creditcard,
validationMode = None):
CustomerRequest.__init__(self, user,
paymentProfile=PaymentProfile(address,
creditcard,
paymentid),
validationMode = validationMode)
def process_response(self, res):
return self.paymentProfile.customerPaymentProfileId
class UpdateCustomerShippingAddressRequest(
CreateCustomerShippingAddressRequest):
"""
For updating the user's shipping address
"""
def __init__(self, user, address_id, address):
address.customerAddressId = address_id
CreateCustomerShippingAddressRequest.__init__(self, user,
address = address)
def process_response(self, res):
return True
class CreateCustomerProfileTransactionRequest(AuthorizeNetRequest):
_keys = AuthorizeNetRequest._keys + ["transaction", "extraOptions"]
# unlike every other response we get back, this api function
# returns CSV data of the response with no field labels. these
# are used in package_response to zip this data into a usable
# storage.
response_keys = ("response_code",
"response_subcode",
"response_reason_code",
"response_reason_text",
"authorization_code",
"avs_response",
"trans_id",
"invoice_number",
"description",
"amount", "method",
"transaction_type",
"customerID",
"firstName", "lastName",
"company", "address", "city", "state",
"zip", "country",
"phoneNumber", "faxNumber", "email",
"shipTo_firstName", "shipTo_lastName",
"shipTo_company", "shipTo_address",
"shipTo_city", "shipTo_state",
"shipTo_zip", "shipTo_country",
"tax", "duty", "freight",
"tax_exempt", "po_number", "md5",
"cav_response")
# list of casts for the response fields given above
response_types = dict(response_code = int,
response_subcode = int,
response_reason_code = int,
trans_id = int)
def __init__(self, **kw):
from pylons import g
self._extra = kw.get("extraOptions", {})
#if g.debug:
# self._extra['x_test_request'] = "TRUE"
AuthorizeNetRequest.__init__(self, **kw)
@property
def extraOptions(self):
return "<![CDATA[%s]]>" % "&".join("%s=%s" % x
for x in self._extra.iteritems())
def process_response(self, res):
return (True, self.package_response(res))
def process_error(self, res):
if self.is_error_code(res, Errors.TRANSACTION_FAIL):
return (False, self.package_response(res))
elif self.is_error_code(res, Errors.TESTMODE):
return (None, None)
return AuthorizeNetRequest.process_error(self,res)
def package_response(self, res):
content = res.directresponse.contents[0]
s = Storage(zip(self.response_keys, content.split(',')))
for name, cast in self.response_types.iteritems():
try:
s[name] = cast(s[name])
except ValueError:
pass
return s
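Unlike the XML responses elsewhere in this API, `directResponse` comes back as unlabeled CSV, which is why `package_response` zips it against the positional `response_keys` and then casts the known numeric fields. A self-contained sketch of that approach (the key list is truncated to the first seven fields for illustration):

```python
RESPONSE_KEYS = ("response_code", "response_subcode",
                 "response_reason_code", "response_reason_text",
                 "authorization_code", "avs_response", "trans_id")
RESPONSE_TYPES = {"response_code": int, "response_subcode": int,
                  "response_reason_code": int, "trans_id": int}

def package_response(content):
    """Zip the unlabeled CSV fields onto positional names and cast the
    known integer fields, as package_response does above."""
    s = dict(zip(RESPONSE_KEYS, content.split(',')))
    for name, cast in RESPONSE_TYPES.items():
        try:
            s[name] = cast(s[name])
        except (KeyError, ValueError):
            pass
    return s

res = package_response("1,1,1,This transaction has been approved.,ABC123,Y,1234")
```

Swallowing the `ValueError` means a field the gateway leaves blank simply stays a string rather than aborting the whole parse.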


@@ -0,0 +1,176 @@
from api import *
from pylons import g
from r2.models.bidding import Bid
# useful test data:
test_card = dict(AMEX = ("370000000000002" , 1234),
DISCOVER = ("6011000000000012" , 123),
MASTERCARD = ("5424000000000015" , 123),
VISA = ("4007000000027" , 123),
# visa card which generates error codes based on the amount
ERRORCARD = ("4222222222222" , 123))
test_card = Storage((k, CreditCard(cardNumber = x, expirationDate="2011-11",
cardCode = y)) for k, (x, y) in
test_card.iteritems())
test_address = Address(firstName = "John", lastName = "Doe",
address = "123 Fake St.",
city = "Anytown", state = "MN", zip = "12346")
def get_account_info(user, recursed = False):
# if we don't have an ID for the user, try to make one
if not CustomerID.get_id(user):
cust_id = CreateCustomerProfileRequest(user).make_request()
# if we do have a customerid, we should be able to fetch it from authorize
try:
u, data = GetCustomerProfileRequest(user).make_request()
except AuthorizeNetException:
u = None
# if the user and the returned user don't match, delete the
# current customer_id and recurse
if u != user:
if not recursed:
CustomerID.delete(user)
return get_account_info(user, True)
else:
raise AuthorizeNetException, "error creating user"
return data
def edit_profile(user, address, creditcard, pay_id = None):
if pay_id:
return UpdateCustomerPaymentProfileRequest(
user, pay_id, address, creditcard).make_request()
else:
return CreateCustomerPaymentProfileRequest(
user, address, creditcard).make_request()
def _make_transaction(trans_cls, amount, user, pay_id,
order = None, trans_id = None, test = None):
"""
private function for handling transactions (since the data is
effectively the same regardless of trans_cls)
"""
# format the amount
if amount:
amount = "%.2f" % amount
# lookup customer ID
cust_id = CustomerID.get_id(user)
# create a new transaction
trans = trans_cls(amount, cust_id, pay_id, trans_id = trans_id,
order = order)
extra = {}
# the optional test field makes the transaction a test, and will
# make the response be the error code corresponding to int(test).
if isinstance(test, int):
extra = dict(x_test_request = "TRUE",
x_card_num = test_card.ERRORCARD.cardNumber,
x_amount = test)
# using the transaction, generate a transaction request and make it
req = CreateCustomerProfileTransactionRequest(transaction = trans,
extraOptions = extra)
return req.make_request()
def auth_transaction(amount, user, payid, thing, test = None):
# use negative pay_ids to identify freebies, coupons, or anything
# that doesn't require a CC.
if payid < 0:
trans_id = -thing._id
# update previous freebie transactions if we can
try:
bid = Bid.one(thing_id = thing._id,
pay_id = payid)
bid.bid = amount
bid.auth()
except NotFound:
bid = Bid._new(trans_id, user, payid, thing._id, amount)
return bid.transaction
elif int(payid) in PayID.get_ids(user):
order = Order(invoiceNumber = "%dT%d" % (user._id, thing._id))
success, res = _make_transaction(ProfileTransAuthOnly,
amount, user, payid,
order = order, test = test)
if success:
Bid._new(res.trans_id, user, payid, thing._id, amount)
return res.trans_id
elif res is None:
# we are in test mode!
return auth_transaction(amount, user, -1, thing, test = test)
# duplicate transaction, which is bad, but not horrible. Log
# the transaction id, creating a new bid if necessary.
elif (res.response_code, res.response_reason_code) == (3,11):
try:
Bid.one(res.trans_id)
except NotFound:
Bid._new(res.trans_id, user, payid, thing._id, amount)
return res.trans_id
def void_transaction(user, trans_id, test = None):
bid = Bid.one(trans_id)
bid.void()
# verify that the transaction has the correct ownership
if bid.account_id == user._id and trans_id > 0:
res = _make_transaction(ProfileTransVoid,
None, user, None, trans_id = trans_id,
test = test)
return res
def charge_transaction(user, trans_id, test = None):
bid = Bid.one(trans_id)
bid.charged()
if trans_id < 0:
# freebies are automatically approved
return True
elif bid.account_id == user._id:
res = _make_transaction(ProfileTransPriorAuthCapture,
bid.bid, user,
bid.pay_id, trans_id = trans_id,
test = test)
return bool(res)
def refund_transaction(amount, user, trans_id, test = None):
bid = Bid.one(trans_id)
if trans_id > 0:
# create a new bid to identify the refund
success, res = _make_transaction(ProfileTransRefund,
amount, user, bid.pay_id,
trans_id = trans_id,
test = test)
if success:
bid = Bid._new(res.trans_id, user, -1, bid.thing_id, amount)
bid.refund()
return bool(res.trans_id)
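The helpers above move a bid through a fixed lifecycle: authorize only (`ProfileTransAuthOnly`), then either capture the prior auth (`ProfileTransPriorAuthCapture`) or void it, with refunds only possible after a capture. A toy state machine makes the flow explicit; this `Bid` class is purely illustrative and is not the real `r2.models.bidding.Bid`:

```python
class Bid:
    """Toy stand-in for the Bid model, tracking the states the
    auth/charge/void/refund helpers above move a transaction through."""
    def __init__(self, amount):
        self.amount = amount
        self.status = "new"

    def auth(self):
        # auth_transaction: authorize only, no money moves yet
        assert self.status == "new"
        self.status = "authorized"

    def charged(self):
        # charge_transaction: capture a prior authorization
        assert self.status == "authorized"
        self.status = "charged"

    def void(self):
        # void_transaction: cancel before capture
        assert self.status in ("new", "authorized")
        self.status = "void"

    def refund(self):
        # refund_transaction: only a captured charge can be refunded
        assert self.status == "charged"
        self.status = "refunded"

bid = Bid(20.0)
bid.auth()
bid.charged()
bid.refund()
```

The negative-pay-id convention in `auth_transaction` sidesteps this whole flow for freebies, which is why `charge_transaction` approves `trans_id < 0` without talking to the gateway.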


@@ -90,6 +90,8 @@ class BaseController(WSGIController):
request.environ['pylons.routes_dict']['action'] = \
meth + '_' + action
c.thread_pool = environ['paste.httpserver.thread_pool']
c.response = Response()
res = WSGIController.__call__(self, environ, start_response)
return res


@@ -0,0 +1,93 @@
import re, sys, Image, os, hashlib, StringIO
class Spriter(object):
spritable = re.compile(r" *background-image: *url\((.*)\) *.*/\* *SPRITE *\*/")
def __init__(self, padding = (4, 4),
css_path = '/static/', actual_path = "r2/public/static/"):
self.images = []
self.im_lookup = {}
self.ypos = [0]
self.padding = padding
self.css_path = css_path
self.actual_path = actual_path
def make_sprite(self, line):
path, = self.spritable.findall(line)
path = re.sub("^" + self.css_path, self.actual_path, path)
if os.path.exists(path):
if path in self.im_lookup:
i = self.im_lookup[path]
else:
im = Image.open(path)
self.images.append(im)
self.im_lookup[path] = len(self.images) - 1
self.ypos.append(self.ypos[-1] + im.size[1] +
2 * self.padding[1])
i = len(self.images) - 1
return "\n".join([" background-image: url(%(sprite)s);",
" background-position: %dpx %spx;" %
(-self.padding[0], "%(pos_" + str(i) + ")s"),
""])
return line
def finish(self, out_file, out_string):
width = 2 * self.padding[0] + max(i.size[0] for i in self.images)
height = sum((i.size[1] + 2 * self.padding[1]) for i in self.images)
master = Image.new(mode = "RGBA", size = (width, height),
color = (0,0,0,0))
for i, image in enumerate(self.images):
master.paste(image,
(self.padding[0], self.padding[1] + self.ypos[i]))
master.save(os.path.join(self.actual_path, out_file))
d = dict(('pos_' + str(i), -self.padding[1] - y)
for i, y in enumerate(self.ypos))
h = hashlib.md5(master.tostring()).hexdigest()
d['sprite'] = os.path.join(self.css_path, "%s?v=%s" % (out_file, h))
return out_string % d
def process_css(incss, out_file = 'sprite.png', css_path = "/static/"):
s = Spriter(css_path = css_path)
out = StringIO.StringIO()
with open(incss, 'r') as handle:
for line in handle:
if s.spritable.match(line):
out.write(s.make_sprite(line))
else:
out.write(line.replace('%', '%%'))
return s.finish(out_file, out.getvalue())
if __name__ == '__main__':
import sys
print process_css(sys.argv[-1])
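The `ypos` bookkeeping in `Spriter` stacks images vertically with padding above and below each one, and the CSS then addresses each image with a negative `background-position` offset. That arithmetic can be isolated in a few lines (`sprite_layout` is an illustrative helper, not part of nymph):

```python
def sprite_layout(heights, pad_y=4):
    """Compute the CSS background-position y-offset of each image when
    stacked into one sprite with pad_y padding above and below each,
    mirroring the ypos bookkeeping in Spriter."""
    ypos = [0]
    for h in heights:
        # each image consumes its own height plus padding on both sides
        ypos.append(ypos[-1] + h + 2 * pad_y)
    # CSS positions the sprite with negative offsets into the image
    return [-(pad_y + y) for y in ypos[:-1]]

offsets = sprite_layout([16, 16, 32])
```

Three icons of heights 16, 16, and 32 with 4px padding land at offsets -4, -28, and -52, which is exactly what `finish()` substitutes into the `%(pos_N)s` placeholders.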


@@ -365,7 +365,7 @@ def clean_image(data,format):
return ret
-def save_sr_image(sr, data, num = None):
+def save_sr_image(sr, data, resource = None):
"""
uploads image data to s3 as a PNG and returns its new url. Urls
will be of the form:
@@ -383,11 +383,12 @@ def save_sr_image(sr, data, num = None):
f.write(data)
f.flush()
-resource = g.s3_thumb_bucket + sr._fullname
-if num is not None:
-resource += '_' + str(num)
-resource += '.png'
+if resource is not None:
+resource = "_%s" % resource
+else:
+resource = ""
+resource = g.s3_thumb_bucket + sr._fullname + resource + ".png"
s3cp.send_file(f.name, resource, 'image/png', 'public-read',
None, False)
finally:


@@ -4,10 +4,11 @@ from r2.lib.db.thing import Thing, Merge
from r2.lib.db.operators import asc, desc, timeago
from r2.lib.db import query_queue
from r2.lib.db.sorts import epoch_seconds
-from r2.lib.utils import fetch_things2, worker, tup
+from r2.lib.utils import fetch_things2, worker, tup, UniqueIterator
from r2.lib.solrsearch import DomainSearchQuery
from datetime import datetime
import itertools
from pylons import g
query_cache = g.permacache
@@ -82,26 +83,38 @@ class CachedResults(object):
return tuple(lst)
def can_insert(self):
"""True if a new item can just be inserted, which is when the
query is only sorted by date."""
return self.query._sort == [desc('_date')]
"""True if a new item can just be inserted rather than
rerunning the query. This is only true in some
circumstances, which includes having no time rules, and
being sorted descending"""
if self.query._sort in ([desc('_date')],
[desc('_hot'), desc('_date')],
[desc('_score'), desc('_date')],
[desc('_controversy'), desc('_date')]):
if not any(r.lval.name == '_date'
for r in self.query._rules):
# if no time-rule is specified, then it's 'all'
return True
return False
def can_delete(self):
"""True if a item can be removed from the listing, always true for now."""
return True
def insert(self, items):
"""Inserts the item at the front of the cached data. Assumes the query
is sorted by date descending"""
"""Inserts the item into the cached data. This only works
under certain criteria, see can_insert."""
self.fetch()
t = [ self.make_item_tuple(item) for item in tup(items) ]
if len(t) == 1 and self.data and t[0][1:] > self.data[0][1:]:
self.data.insert(0, t[0])
else:
self.data.extend(t)
self.data = list(set(self.data))
self.data.sort(key=lambda x: x[1:], reverse=True)
# insert the new items, remove the duplicates (keeping the one
# being inserted over the stored value if applicable), and
# sort the result
data = itertools.chain(t, self.data)
data = UniqueIterator(data, key = lambda x: x[0])
data = sorted(data, key=lambda x: x[1:], reverse=True)
data = list(data)
self.data = data
query_cache.set(self.iden, self.data[:precompute_limit])
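The new insert path merges the incoming tuples with the cached ones, drops duplicates by fullname (the tuple's first element, with the incoming copy winning since it comes first in the chain), then re-sorts descending by the remaining sort values. A minimal standalone sketch of that idiom, with UniqueIterator approximated by a plain seen-set and the function name hypothetical:

```python
import itertools

def merge_cached(new_items, cached, limit=1000):
    """Merge (fullname, sortval, ...) tuples, keeping the incoming copy
    of any duplicate fullname, sorted descending by the sort values."""
    seen = set()
    merged = []
    for item in itertools.chain(new_items, cached):
        # first occurrence wins: new_items are chained ahead of cached
        if item[0] not in seen:
            seen.add(item[0])
            merged.append(item)
    merged.sort(key=lambda x: x[1:], reverse=True)
    return merged[:limit]

print(merge_cached([('t3_a', 5)], [('t3_a', 3), ('t3_b', 4)]))
# [('t3_a', 5), ('t3_b', 4)]
```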
@@ -316,6 +329,7 @@ def add_queries(queries, insert_items = None, delete_items = None):
log('Adding precomputed query %s' % q)
query_queue.add_query(q)
worker.do(_add_queries)
#can be rewritten to be more efficient
def all_queries(fn, obj, *param_lists):
@@ -381,12 +395,12 @@ def new_vote(vote):
if not isinstance(item, Link):
return
if vote.valid_thing:
if vote.valid_thing and not item._spam and not item._deleted:
sr = item.subreddit_slow
results = all_queries(get_links, sr, ('hot', 'new'), ['all'])
results = [get_links(sr, 'hot', 'all')]
results.extend(all_queries(get_links, sr, ('top', 'controversial'), db_times.keys()))
#results.append(get_links(sr, 'toplinks', 'all'))
add_queries(results)
add_queries(results, insert_items = item)
#must update both because we don't know if it's a changed vote
if vote._name == '1':
@@ -443,7 +457,14 @@ def ban(things):
if links:
add_queries([get_spam_links(sr)], insert_items = links)
add_queries([get_links(sr, 'new', 'all')], delete_items = links)
# rip it out of the listings. bam!
results = [get_links(sr, 'hot', 'all'),
get_links(sr, 'new', 'all'),
get_links(sr, 'top', 'all'),
get_links(sr, 'controversial', 'all')]
results.extend(all_queries(get_links, sr, ('top', 'controversial'), db_times.keys()))
add_queries(results, delete_items = links)
#if comments:
# add_queries([get_spam_comments(sr)], insert_items = comments)
@@ -459,8 +480,14 @@ def unban(things):
if links:
add_queries([get_spam_links(sr)], delete_items = links)
# put it back in 'new'
add_queries([get_links(sr, 'new', 'all')], insert_items = links)
# put it back in the listings
results = [get_links(sr, 'hot', 'all'),
get_links(sr, 'new', 'all'),
get_links(sr, 'top', 'all'),
get_links(sr, 'controversial', 'all')]
results.extend(all_queries(get_links, sr, ('top', 'controversial'), db_times.keys()))
add_queries(results, insert_items = links)
#if comments:
# add_queries([get_spam_comments(sr)], delete_items = comments)

View File

@@ -1,117 +1,51 @@
from __future__ import with_statement
from r2.lib.workqueue import WorkQueue
from r2.lib.db.tdb_sql import make_metadata, create_table, index_str
import cPickle as pickle
from datetime import datetime
from urllib2 import Request, urlopen
from urllib import urlencode
from threading import Lock
import time
import sqlalchemy as sa
from sqlalchemy.exceptions import SQLError
from r2.lib import amqp
from pylons import g
tz = g.tz
#the set of query idens currently being processed
running = set()
running_lock = Lock()
def make_query_queue_table():
engine = g.dbm.engines['query_queue']
metadata = make_metadata(engine)
table = sa.Table(g.db_app_name + '_query_queue', metadata,
sa.Column('iden', sa.String, primary_key = True),
sa.Column('query', sa.Binary),
sa.Column('date', sa.DateTime(timezone = True)))
date_idx = index_str(table, 'date', 'date')
create_table(table, [date_idx])
return table
query_queue_table = make_query_queue_table()
working_prefix = 'working_'
prefix = 'prec_link_'
TIMEOUT = 120
def add_query(cached_results):
"""Adds a CachedResults instance to the queue db, ignoring duplicates"""
d = dict(iden = cached_results.query._iden(),
query = pickle.dumps(cached_results, -1),
date = datetime.now(tz))
try:
query_queue_table.insert().execute(d)
except SQLError, e:
#don't worry about inserting duplicates
if not 'IntegrityError' in str(e):
raise
def remove_query(iden):
"""Removes a row identified with iden from the query queue. To be
called after a CachedResults is updated."""
table = query_queue_table
d = table.delete(table.c.iden == iden)
d.execute()
def get_query():
"""Gets the next query off the queue, ignoring the currently running
queries."""
table = query_queue_table
s = table.select(order_by = sa.asc(table.c.date), limit = 1)
s.append_whereclause(sa.and_(*[table.c.iden != i for i in running]))
r = s.execute().fetchone()
if r:
return r.iden, r.query
else:
return None, None
def make_query_job(iden, pickled_cr):
"""Creates a job to send to the query worker. Updates a cached result
then removes the query from both the queue and the running set. If
sending the job fails, the query is only removed from the running
set."""
precompute_worker = g.query_queue_worker
def job():
try:
finished = False
r = Request(url = precompute_worker + '/doquery',
data = urlencode([('query', pickled_cr)]),
#this header prevents pylons from turning the
#parameter into unicode, which breaks pickling
headers = {'x-dont-decode':'true'})
urlopen(r)
finished = True
finally:
with running_lock:
running.remove(iden)
#if finished is false, we'll leave the query in the db
#so we can try again later (e.g. in the event the
#worker is down)
if finished:
remove_query(iden)
return job
amqp.add_item('prec_links', pickle.dumps(cached_results, -1))
def run():
"""Pull jobs from the queue, creates a job, and sends them to a
WorkQueue for processing."""
num_workers = g.num_query_queue_workers
wq = WorkQueue(num_workers = num_workers)
wq.start()
def callback(msgs):
for msg in msgs: # will be len==1
# r2.lib.db.queries.CachedResults
cr = pickle.loads(msg.body)
iden = cr.query._iden()
while True:
job = None
#limit the total number of jobs in the WorkQueue. we don't
#need to load the entire db queue right away (the db queue can
#get quite large).
if len(running) < 2 * num_workers:
with running_lock:
iden, pickled_cr = get_query()
if pickled_cr is not None:
if not iden in running:
running.add(iden)
job = make_query_job(iden, pickled_cr)
wq.add(job)
working_key = working_prefix + iden
key = prefix + iden
#if we didn't find a job, sleep before trying again
if not job:
time.sleep(1)
last_time = g.memcache.get(key)
# check to see if we've computed this job since it was
# added to the queue
if last_time and last_time > msg.timestamp:
print 'skipping, already computed ', key
return
# check if someone else is working on this
elif not g.memcache.add(working_key, 1, TIMEOUT):
print 'skipping, someone else is working', working_key
return
cr = pickle.loads(msg.body)
print 'working: ', iden, cr.query._rules
start = datetime.now()
cr.update()
done = datetime.now()
q_time_s = (done - msg.timestamp).seconds
proc_time_s = (done - start).seconds + ((done - start).microseconds/1000000.0)
print ('processed %s in %.6f seconds after %d seconds in queue'
% (iden, proc_time_s, q_time_s))
g.memcache.set(key, datetime.now())
g.memcache.delete(working_key)
amqp.handle_items('prec_links', callback, limit = 1)
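The callback above leans on two memcache keys per query: a 'prec_link_' timestamp to skip work already recomputed since the message was queued, and an atomic add() on a 'working_' key as a cheap distributed lock. A standalone sketch of that claim pattern, with a hypothetical in-memory stand-in for memcache:

```python
class FakeMemcache:
    """In-memory stand-in for the memcache calls the worker relies on."""
    def __init__(self):
        self.d = {}
    def get(self, key):
        return self.d.get(key)
    def set(self, key, val):
        self.d[key] = val
    def add(self, key, val, time=0):
        # like memcache add(): only succeeds if the key is absent
        if key in self.d:
            return False
        self.d[key] = val
        return True
    def delete(self, key):
        self.d.pop(key, None)

def try_claim(cache, iden, queued_at):
    # skip if the query was recomputed after this message was enqueued
    last_time = cache.get('prec_link_' + iden)
    if last_time and last_time > queued_at:
        return 'already computed'
    # atomic add acts as a lock: losing the race means someone else has it
    if not cache.add('working_' + iden, 1, 120):
        return 'someone else is working'
    return 'claimed'

mc = FakeMemcache()
print(try_claim(mc, 'abc', 10))   # 'claimed'
print(try_claim(mc, 'abc', 10))   # 'someone else is working'
```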

View File

@@ -19,7 +19,7 @@
# All portions of the code written by CondeNet are Copyright (c) 2006-2009
# CondeNet, Inc. All Rights Reserved.
################################################################################
from math import log
from math import log, sqrt
from datetime import datetime, timedelta
from pylons import g
@@ -46,3 +46,28 @@ def controversy(ups, downs):
"""The controversy sort."""
return float(ups + downs) / max(abs(score(ups, downs)), 1)
def _confidence(ups, downs):
"""The confidence sort.
http://www.evanmiller.org/how-not-to-sort-by-average-rating.html"""
n = float(ups + downs)
if n == 0:
return 0
z = 1.0 #1.0 = 85%, 1.6 = 95%
phat = float(ups) / n
return (phat+z*z/(2*n)-z*sqrt((phat*(1-phat)+z*z/(4*n))/n))/(1+z*z/n)
up_range = 400
down_range = 100
confidences = []
for ups in xrange(up_range):
for downs in xrange(down_range):
confidences.append(_confidence(ups, downs))
def confidence(ups, downs):
if ups + downs == 0:
return 0
elif ups < up_range and downs < down_range:
return confidences[downs + ups * down_range]
else:
return _confidence(ups, downs)
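_confidence is the lower bound of the Wilson score interval. A quick self-contained sketch (modern Python, same formula) shows why it beats a naive ups/(ups+downs) average for small samples:

```python
from math import sqrt

def wilson_lower_bound(ups, downs, z=1.0):
    # z = 1.0 ~= 85% confidence, z = 1.6 ~= 95%
    n = float(ups + downs)
    if n == 0:
        return 0.0
    phat = ups / n
    return (phat + z*z/(2*n) - z*sqrt((phat*(1-phat) + z*z/(4*n))/n)) / (1 + z*z/n)

# a single upvote has a perfect naive average but a low confidence score,
# so a well-voted comment with a few downvotes still sorts above it
print(wilson_lower_bound(1, 0) < wilson_lower_bound(100, 10))   # True
```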

View File

@@ -273,7 +273,10 @@ def get_rel_type_id(name):
return rel_types_name[name][0]
def get_write_table(tables):
return tables[0]
if g.disallow_db_writes:
raise Exception("not so fast! writes are not allowed on this app.")
else:
return tables[0]
def get_read_table(tables):
#shortcut with 1 entry
@@ -320,14 +323,15 @@ def get_read_table(tables):
#add in the over-connected machines with a 1% weight
ip_weights.extend((ip, .01) for ip in no_connections)
#rebalance the weights
total_weight = sum(w[1] for w in ip_weights)
ip_weights = [(ip, weight / total_weight)
for ip, weight in ip_weights]
#rebalance the weights
total_weight = sum(w[1] for w in ip_weights)
ip_weights = [(ip, weight / total_weight)
for ip, weight in ip_weights]
r = random.random()
for ip, load in ip_weights:
if r < load:
# print "db ip: %s" % str(ips[ip][0].metadata.bind.url.host)
return ips[ip]
else:
r = r - load
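The loop above is standard roulette-wheel selection over normalized weights: draw r in [0, 1) and walk the list, subtracting each weight until r lands inside a slot. A self-contained sketch under a hypothetical name, with the random draw injectable so the walk is deterministic:

```python
import random

def weighted_choice(weighted, rng=random.random):
    """weighted: list of (item, weight) pairs with weights summing to ~1.0."""
    r = rng()
    for item, load in weighted:
        if r < load:
            return item
        r -= load
    return weighted[-1][0]   # guard against float round-off

# r=0.85: 0.85 >= 0.6, r becomes 0.25; 0.25 < 0.3, so 'b' is chosen
print(weighted_choice([('a', 0.6), ('b', 0.3), ('c', 0.1)], rng=lambda: 0.85))
```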

View File

@@ -103,14 +103,13 @@ class DataThing(object):
else:
old_val = self._t.get(attr, self._defaults.get(attr))
self._t[attr] = val
if make_dirty and val != old_val:
self._dirties[attr] = (old_val, val)
def __getattr__(self, attr):
#makes pickling work for some reason
if attr.startswith('__'):
raise AttributeError
raise AttributeError, attr
try:
if hasattr(self, '_t'):
@@ -302,6 +301,23 @@ class DataThing(object):
else:
return filter(None, (bases.get(i) for i in ids))
@classmethod
def _byID36(cls, id36s, return_dict = True, **kw):
id36s, single = tup(id36s, True)
# will fail if it's not a string
ids = [ int(x, 36) for x in id36s ]
things = cls._byID(ids, return_dict=True, **kw)
if single:
return things.values()[0]
elif return_dict:
return things
else:
return things.values()
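_byID36 accepts the base-36 string ids used in permalinks and converts them to the integer ids _byID expects via int(x, 36). A quick sketch of the round trip; the to36 encoder here is a hypothetical helper, not part of the diff:

```python
def to36(n):
    """Encode a non-negative int in lowercase base 36."""
    digits = '0123456789abcdefghijklmnopqrstuvwxyz'
    if n == 0:
        return '0'
    out = []
    while n:
        n, rem = divmod(n, 36)
        out.append(digits[rem])
    return ''.join(reversed(out))

print(int('c3', 36))   # 435
print(to36(435))       # 'c3'
```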
@classmethod
def _by_fullname(cls, names,
return_dict = True,
@@ -454,6 +470,10 @@ class Thing(DataThing):
def _controversy(self):
return sorts.controversy(self._ups, self._downs)
@property
def _confidence(self):
return sorts.confidence(self._ups, self._downs)
@classmethod
def _build(cls, id, bases):
return cls(bases.ups, bases.downs, bases.date,

View File

@@ -21,51 +21,61 @@
################################################################################
from email.MIMEText import MIMEText
from pylons.i18n import _
from pylons import c, g, request
from r2.lib.pages import PasswordReset, Share, Mail_Opt
from r2.lib.utils import timeago
from r2.models import passhash, Email, Default, has_opted_out
from r2.config import cache
from pylons import c, g
from r2.lib.utils import timeago, query_string
from r2.models import passhash, Email, Default, has_opted_out, Account
import os, random, datetime
import smtplib, traceback, sys
def email_address(name, address):
return '"%s" <%s>' % (name, address) if name else address
feedback = email_address('reddit feedback', g.feedback_email)
def send_mail(msg, fr, to):
session = smtplib.SMTP(g.smtp_server)
session.sendmail(fr, to, msg.as_string())
session.quit()
def simple_email(to, fr, subj, body):
def utf8(s):
return s.encode('utf8') if isinstance(s, unicode) else s
msg = MIMEText(utf8(body))
msg.set_charset('utf8')
msg['To'] = utf8(to)
msg['From'] = utf8(fr)
msg['Subject'] = utf8(subj)
send_mail(msg, fr, to)
def password_email(user):
key = passhash(random.randint(0, 1000), user.email)
passlink = 'http://' + g.domain + '/resetpassword/' + key
print "Generated password reset link: " + passlink
cache.set("reset_%s" %key, user._id, time=1800)
simple_email(user.email, 'reddit@reddit.com',
'reddit.com password reset',
PasswordReset(user=user, passlink=passlink).render(style='email'))
import traceback, sys, smtplib
def _feedback_email(email, body, kind, name='', reply_to = ''):
"""Function for handling feedback and ad_inq emails. Adds an
email to the mail queue, addressed to the feedback account."""
Email.handler.add_to_queue(c.user if c.user_is_loggedin else None,
None, [feedback], name, email,
datetime.datetime.now(),
request.ip, kind, body = body,
reply_to = reply_to)
g.feedback_email, name, email,
kind, body = body, reply_to = reply_to)
def _system_email(email, body, kind, reply_to = "", thing = None):
"""
For sending email from the system to a user (reply address will be
feedback and the name will be reddit.com)
"""
Email.handler.add_to_queue(c.user if c.user_is_loggedin else None,
email, g.domain, g.feedback_email,
kind, body = body, reply_to = reply_to,
thing = thing)
def verify_email(user, dest):
"""
For verifying an email address
"""
from r2.lib.pages import VerifyEmail
key = passhash(user.name, user.email)
user.email_verified = False
user._commit()
emaillink = ('http://' + g.domain + '/verification/' + key
+ query_string(dict(dest=dest)))
print "Generated email verification link: " + emaillink
g.cache.set("email_verify_%s" %key, user._id, time=1800)
_system_email(user.email,
VerifyEmail(user=user,
emaillink = emaillink).render(style='email'),
Email.Kind.VERIFY_EMAIL)
def password_email(user):
"""
For resetting a user's password.
"""
from r2.lib.pages import PasswordReset
key = passhash(random.randint(0, 1000), user.email)
passlink = 'http://' + g.domain + '/resetpassword/' + key
print "Generated password reset link: " + passlink
g.cache.set("reset_%s" %key, user._id, time=1800)
_system_email(user.email,
PasswordReset(user=user,
passlink=passlink).render(style='email'),
Email.Kind.RESET_PASSWORD)
def feedback_email(email, body, name='', reply_to = ''):
"""Queues a feedback email to the feedback account."""
@@ -77,41 +87,49 @@ def ad_inq_email(email, body, name='', reply_to = ''):
return _feedback_email(email, body, Email.Kind.ADVERTISE, name = name,
reply_to = reply_to)
def i18n_email(email, body, name='', reply_to = ''):
"""Queues a ad_inq email to the feedback account."""
return _feedback_email(email, body, Email.Kind.HELP_TRANSLATE, name = name,
reply_to = reply_to)
def share(link, emails, from_name = "", reply_to = "", body = ""):
"""Queues a 'share link' email."""
now = datetime.datetime.now(g.tz)
ival = now - timeago(g.new_link_share_delay)
date = max(now,link._date + ival)
Email.handler.add_to_queue(c.user, link, emails, from_name, g.share_reply,
date, request.ip, Email.Kind.SHARE,
body = body, reply_to = reply_to)
def send_queued_mail():
Email.handler.add_to_queue(c.user, emails, from_name, g.share_reply,
Email.Kind.SHARE, date = date,
body = body, reply_to = reply_to,
thing = link)
def send_queued_mail(test = False):
"""sends mail from the mail queue to smtplib for delivery. Also,
on successes, empties the mail queue and adds all emails to the
sent_mail list."""
from r2.lib.pages import PasswordReset, Share, Mail_Opt, VerifyEmail, Promo_Email
now = datetime.datetime.now(g.tz)
if not c.site:
c.site = Default
clear = False
session = smtplib.SMTP(g.smtp_server)
# convenience function for sending the mail to the singly-defined session and
# marking the mail as sent.
if not test:
session = smtplib.SMTP(g.smtp_server)
def sendmail(email):
try:
session.sendmail(email.fr_addr, email.to_addr,
email.to_MIMEText().as_string())
email.set_sent(rejected = False)
if test:
print email.to_MIMEText().as_string()
else:
session.sendmail(email.fr_addr, email.to_addr,
email.to_MIMEText().as_string())
email.set_sent(rejected = False)
# exception happens only for local recipient that doesn't exist
except (smtplib.SMTPRecipientsRefused, smtplib.SMTPSenderRefused,
UnicodeDecodeError):
# handle error and print, but don't stall the rest of the queue
print "Handled error sending mail (traceback to follow)"
traceback.print_exc(file = sys.stdout)
print "Handled error sending mail (traceback to follow)"
traceback.print_exc(file = sys.stdout)
email.set_sent(rejected = True)
try:
for email in Email.get_unsent(now):
@@ -124,38 +142,32 @@ def send_queued_mail():
msg_hash = email.msg_hash,
link = email.thing,
body = email.body).render(style = "email")
try:
email.subject = _("[reddit] %(user)s has shared a link with you") % \
{"user": email.from_name()}
except UnicodeDecodeError:
email.subject = _("[reddit] a user has shared a link with you")
sendmail(email)
elif email.kind == Email.Kind.OPTOUT:
email.body = Mail_Opt(msg_hash = email.msg_hash,
leave = True).render(style = "email")
email.subject = _("[reddit] email removal notice")
sendmail(email)
elif email.kind == Email.Kind.OPTIN:
email.body = Mail_Opt(msg_hash = email.msg_hash,
leave = False).render(style = "email")
email.subject = _("[reddit] email addition notice")
sendmail(email)
elif email.kind in (Email.Kind.ACCEPT_PROMO,
Email.Kind.REJECT_PROMO,
Email.Kind.QUEUED_PROMO,
Email.Kind.LIVE_PROMO,
Email.Kind.BID_PROMO,
Email.Kind.FINISHED_PROMO,
Email.Kind.NEW_PROMO):
email.body = Promo_Email(link = email.thing,
kind = email.kind,
body = email.body).render(style="email")
elif email.kind in (Email.Kind.FEEDBACK, Email.Kind.ADVERTISE):
if email.kind == Email.Kind.FEEDBACK:
email.subject = "[feedback] feedback from '%s'" % \
email.from_name()
else:
email.subject = "[ad_inq] feedback from '%s'" % \
email.from_name()
sendmail(email)
# handle failure
else:
# handle unknown types here
elif email.kind not in Email.Kind:
email.set_sent(rejected = True)
continue
sendmail(email)
finally:
session.quit()
if not test:
session.quit()
# clear is true if anything was found and processed above
if clear:
@@ -168,9 +180,7 @@ def opt_out(msg_hash):
address has been opted out of receiving any future mail)"""
email, added = Email.handler.opt_out(msg_hash)
if email and added:
Email.handler.add_to_queue(None, None, [email], "reddit.com",
datetime.datetime.now(g.tz),
'127.0.0.1', Email.Kind.OPTOUT)
_system_email(email, "", Email.Kind.OPTOUT)
return email, added
def opt_in(msg_hash):
@@ -178,7 +188,33 @@ def opt_in(msg_hash):
from our opt out list)"""
email, removed = Email.handler.opt_in(msg_hash)
if email and removed:
Email.handler.add_to_queue(None, None, [email], "reddit.com",
datetime.datetime.now(g.tz),
'127.0.0.1', Email.Kind.OPTIN)
_system_email(email, "", Email.Kind.OPTIN)
return email, removed
def _promo_email(thing, kind, body = ""):
a = Account._byID(thing.author_id)
return _system_email(a.email, body, kind, thing = thing,
reply_to = "selfservicesupport@reddit.com")
def new_promo(thing):
return _promo_email(thing, Email.Kind.NEW_PROMO)
def promo_bid(thing):
return _promo_email(thing, Email.Kind.BID_PROMO)
def accept_promo(thing):
return _promo_email(thing, Email.Kind.ACCEPT_PROMO)
def reject_promo(thing, reason = ""):
return _promo_email(thing, Email.Kind.REJECT_PROMO, reason)
def queue_promo(thing):
return _promo_email(thing, Email.Kind.QUEUED_PROMO)
def live_promo(thing):
return _promo_email(thing, Email.Kind.LIVE_PROMO)
def finished_promo(thing):
return _promo_email(thing, Email.Kind.FINISHED_PROMO)

View File

@@ -127,7 +127,6 @@ code_re = re.compile('<code>([^<]+)</code>')
a_re = re.compile('>([^<]+)</a>')
fix_url = re.compile('&lt;(http://[^\s\'\"\]\)]+)&gt;')
#TODO markdown should be looked up in batch?
#@memoize('markdown')
def safemarkdown(text, nofollow=False, target=None):

View File

@@ -78,10 +78,12 @@ class JsonResponse(object):
def has_errors(self, field_name, *errors, **kw):
have_error = False
field_name = tup(field_name)
for error_name in errors:
if (error_name, field_name) in c.errors:
self.set_error(error_name, field_name)
have_error = True
for fname in field_name:
if (error_name, fname) in c.errors:
self.set_error(error_name, fname)
have_error = True
return have_error
def _things(self, things, action, *a, **kw):

View File

@@ -74,6 +74,10 @@ class JsonTemplate(Template):
def render(self, thing = None, *a, **kw):
return ObjectTemplate({})
class TakedownJsonTemplate(JsonTemplate):
def render(self, thing = None, *a, **kw):
return thing.explanation
class TableRowTemplate(JsonTemplate):
def cells(self, thing):
raise NotImplementedError
@@ -202,11 +206,12 @@ class LinkJsonTemplate(ThingJsonTemplate):
domain = "domain",
title = "title",
url = "url",
author = "author",
author = "author",
thumbnail = "thumbnail",
media = "media_object",
media_embed = "media_embed",
selftext = "selftext",
selftext_html= "selftext_html",
num_comments = "num_comments",
subreddit = "subreddit",
subreddit_id = "subreddit_id")
@@ -228,6 +233,8 @@ class LinkJsonTemplate(ThingJsonTemplate):
elif attr == 'subreddit_id':
return thing.subreddit._fullname
elif attr == 'selftext':
return thing.selftext
elif attr == 'selftext_html':
return safemarkdown(thing.selftext)
return ThingJsonTemplate.thing_attr(self, thing, attr)
@@ -237,6 +244,9 @@ class LinkJsonTemplate(ThingJsonTemplate):
return d
class PromotedLinkJsonTemplate(LinkJsonTemplate):
_data_attrs_ = LinkJsonTemplate.data_attrs(promoted = "promoted")
del _data_attrs_['author']
class CommentJsonTemplate(ThingJsonTemplate):
_data_attrs_ = ThingJsonTemplate.data_attrs(ups = "upvotes",
@@ -293,8 +303,13 @@ class MoreCommentJsonTemplate(CommentJsonTemplate):
def kind(self, wrapped):
return "more"
def thing_attr(self, thing, attr):
if attr in ('body', 'body_html'):
return ""
return CommentJsonTemplate.thing_attr(self, thing, attr)
def rendered_data(self, wrapped):
return ThingJsonTemplate.rendered_data(self, wrapped)
return CommentJsonTemplate.rendered_data(self, wrapped)
class MessageJsonTemplate(ThingJsonTemplate):
_data_attrs_ = ThingJsonTemplate.data_attrs(new = "new",
@@ -309,9 +324,9 @@ class MessageJsonTemplate(ThingJsonTemplate):
def thing_attr(self, thing, attr):
if attr == "was_comment":
return hasattr(thing, "was_comment")
return thing.was_comment
elif attr == "context":
return ("" if not hasattr(thing, "was_comment")
return ("" if not thing.was_comment
else thing.permalink + "?context=3")
elif attr == "dest":
return thing.to.name

View File

@@ -23,17 +23,17 @@
from pylons import g, config
from r2.models.link import Link
from r2.lib.workqueue import WorkQueue
from r2.lib import s3cp
from r2.lib.utils import timeago, fetch_things2
from r2.lib.utils import TimeoutFunction, TimeoutFunctionException
from r2.lib.db.operators import desc
from r2.lib.scraper import make_scraper, str_to_image, image_to_str, prepare_image
from r2.lib import amqp
import tempfile
from Queue import Queue
import traceback
s3_thumbnail_bucket = g.s3_thumb_bucket
media_period = g.media_period
threads = 20
log = g.log
@@ -41,6 +41,7 @@ def thumbnail_url(link):
"""Given a link, returns the url for its thumbnail based on its fullname"""
return 'http:/%s%s.png' % (s3_thumbnail_bucket, link._fullname)
def upload_thumb(link, image):
"""Given a link and an image, uploads the image to s3 into an image
based on the link's fullname"""
@@ -52,26 +53,6 @@ def upload_thumb(link, image):
s3cp.send_file(f.name, resource, 'image/png', 'public-read', None, False)
log.debug('thumbnail %s: %s' % (link._fullname, thumbnail_url(link)))
def make_link_info_job(results, link, useragent):
"""Returns a unit of work to send to a work queue that downloads a
link's thumbnail and media object. Places the result in the results
dict"""
def job():
try:
scraper = make_scraper(link.url)
thumbnail = scraper.thumbnail()
media_object = scraper.media_object()
if thumbnail:
upload_thumb(link, thumbnail)
results[link] = (thumbnail, media_object)
except:
log.warning('error fetching %s %s' % (link._fullname, link.url))
raise
return job
def update_link(link, thumbnail, media_object):
"""Sets the link's has_thumbnail and media_object attributes iin the
@@ -84,40 +65,48 @@ def update_link(link, thumbnail, media_object):
link._commit()
def process_new_links(period = media_period, force = False):
"""Fetches links from the last period and sets their media
properties. If force is True, it will fetch properties for links
even if the properties already exist"""
links = Link._query(Link.c._date > timeago(period), sort = desc('_date'),
data = True)
results = {}
jobs = []
for link in fetch_things2(links):
if link.is_self or link.promoted:
continue
elif not force and (link.has_thumbnail or link.media_object):
continue
jobs.append(make_link_info_job(results, link, g.useragent))
def set_media(link, force = False):
if link.is_self:
return
if not force and link.promoted:
return
elif not force and (link.has_thumbnail or link.media_object):
return
scraper = make_scraper(link.url)
#send links to a queue
wq = WorkQueue(jobs, num_workers = 20, timeout = 30)
wq.start()
wq.jobs.join()
thumbnail = scraper.thumbnail()
media_object = scraper.media_object()
#when the queue is finished, do the db writes in this thread
for link, info in results.items():
update_link(link, info[0], info[1])
if thumbnail:
upload_thumb(link, thumbnail)
def set_media(link):
"""Sets the media properties for a single link."""
results = {}
make_link_info_job(results, link, g.useragent)()
update_link(link, *results[link])
update_link(link, thumbnail, media_object)
def force_thumbnail(link, image_data):
image = str_to_image(image_data)
image = prepare_image(image)
upload_thumb(link, image)
update_link(link, thumbnail = True, media_object = None)
def run():
def process_msgs(msgs):
def _process_link(fname):
print "media: Processing %s" % fname
link = Link._by_fullname(fname, data=True, return_dict=False)
set_media(link)
for msg in msgs:
fname = msg.body
try:
TimeoutFunction(_process_link, 30)(fname)
except TimeoutFunctionException:
print "Timed out on %s" % fname
except KeyboardInterrupt:
raise
except:
print "Error fetching %s" % fname
print traceback.format_exc()
amqp.handle_items('scraper_q', process_msgs, limit=1)

View File

@@ -1,4 +1,3 @@
# The contents of this file are subject to the Common Public Attribution
# License Version 1.0. (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
@@ -62,6 +61,7 @@ menu = MenuHandler(hot = _('hot'),
more = _('more'),
relevance = _('relevance'),
controversial = _('controversial'),
confidence = _('best'),
saved = _('saved {toolbar}'),
recommended = _('recommended'),
rising = _('rising'),
@@ -83,7 +83,6 @@ menu = MenuHandler(hot = _('hot'),
adminon = _("turn admin on"),
adminoff = _("turn admin off"),
prefs = _("preferences"),
stats = _("stats"),
submit = _("submit"),
help = _("help"),
blog = _("the reddit blog"),
@@ -115,27 +114,31 @@ menu = MenuHandler(hot = _('hot'),
sent = _("sent"),
# comments
comments = _("comments {toolbar}"),
related = _("related"),
details = _("details"),
duplicates = _("other discussions (%(num)s)"),
shirt = _("shirt"),
traffic = _("traffic"),
traffic = _("traffic stats"),
# reddits
home = _("home"),
about = _("about"),
edit = _("edit"),
banned = _("banned"),
edit = _("edit this reddit"),
moderators = _("edit moderators"),
contributors = _("edit contributors"),
banned = _("ban users"),
banusers = _("ban users"),
popular = _("popular"),
create = _("create"),
mine = _("my reddits"),
i18n = _("translate site"),
i18n = _("help translate"),
awards = _("awards"),
promoted = _("promoted"),
reporters = _("reporters"),
reports = _("reports"),
reports = _("reported links"),
reportedauth = _("reported authors"),
info = _("info"),
share = _("share"),
@@ -148,9 +151,15 @@ menu = MenuHandler(hot = _('hot'),
deleted = _("deleted"),
reported = _("reported"),
promote = _('promote'),
new_promo = _('new promoted link'),
current_promos = _('promoted links'),
promote = _('self-serve'),
new_promo = _('create promotion'),
my_current_promos = _('my promoted links'),
current_promos = _('all promoted links'),
future_promos = _('unapproved'),
graph = _('analytics'),
live_promos = _('live'),
unpaid_promos = _('unpaid'),
pending_promos = _('pending')
)
def menu_style(type):
@@ -179,7 +188,10 @@ class NavMenu(Styled):
base_path = '', separator = '|', **kw):
self.options = options
self.base_path = base_path
kw['style'], kw['css_class'] = menu_style(type)
#add the menu style, but preserve existing css_class parameter
kw['style'], css_class = menu_style(type)
kw['css_class'] = css_class + ' ' + kw.get('css_class', '')
#used by flatlist to delimit menu items
self.separator = separator
@@ -223,11 +235,11 @@ class NavButton(Styled):
nocname=False, opt = '', aliases = [],
target = "", style = "plain", **kw):
# keep original dest to check against c.location when rendering
aliases = set(a.rstrip('/') for a in aliases)
aliases.add(dest.rstrip('/'))
aliases = set(_force_unicode(a.rstrip('/')) for a in aliases)
aliases.add(_force_unicode(dest.rstrip('/')))
self.request_params = dict(request.GET)
self.stripped_path = request.path.rstrip('/').lower()
self.stripped_path = _force_unicode(request.path.rstrip('/').lower())
Styled.__init__(self, style = style, sr_path = sr_path,
nocname = nocname, target = target,
@@ -264,6 +276,8 @@ class NavButton(Styled):
else:
if self.stripped_path == self.bare_path:
return True
if self.bare_path and self.stripped_path.startswith(self.bare_path):
return True
if self.stripped_path in self.aliases:
return True
@@ -388,10 +402,13 @@ class SortMenu(SimpleGetMenu):
return operators.desc('_score')
elif sort == 'controversial':
return operators.desc('_controversy')
elif sort == 'confidence':
return operators.desc('_confidence')
class CommentSortMenu(SortMenu):
"""Sort menu for comments pages"""
options = ('hot', 'new', 'controversial', 'top', 'old')
default = 'confidence'
options = ('hot', 'new', 'controversial', 'top', 'old', 'confidence')
class SearchSortMenu(SortMenu):
"""Sort menu for search pages."""

View File

@@ -22,6 +22,7 @@
"""
One-time use functions to migrate from one reddit-version to another
"""
from r2.lib.promote import *
def add_allow_top_to_srs():
"Add the allow_top property to all stored subreddits"
@@ -33,3 +34,104 @@ def add_allow_top_to_srs():
sort = desc('_date'))
for sr in fetch_things2(q):
sr.allow_top = True; sr._commit()
def convert_promoted():
"""
should only need to be run once to update old style promoted links
to the new style.
"""
from r2.lib.utils import fetch_things2
from r2.lib import authorize
q = Link._query(Link.c.promoted == (True, False),
sort = desc("_date"))
sr_id = PromoteSR._id
bid = 100
with g.make_lock(promoted_lock_key):
promoted = {}
set_promoted({})
for l in fetch_things2(q):
print "updating:", l
try:
if not l._loaded: l._load()
# move the promotion into the promo subreddit
l.sr_id = sr_id
# set it to accepted (since some of the update functions
# check that it is not already promoted)
l.promote_status = STATUS.accepted
author = Account._byID(l.author_id)
l.promote_trans_id = authorize.auth_transaction(bid, author, -1, l)
l.promote_bid = bid
l.maximum_clicks = None
l.maximum_views = None
# set the dates
start = getattr(l, "promoted_on", l._date)
until = getattr(l, "promote_until", None) or \
(l._date + timedelta(1))
l.promote_until = None
update_promo_dates(l, start, until)
# mark it as promoted if it was promoted when we got there
if l.promoted and l.promote_until > datetime.now(g.tz):
l.promote_status = STATUS.pending
else:
l.promote_status = STATUS.finished
if not hasattr(l, "disable_comments"):
l.disable_comments = False
# add it to the auction list
if l.promote_status == STATUS.pending and l._fullname not in promoted:
promoted[l._fullname] = auction_weight(l)
l._commit()
except AttributeError:
print "BAD THING:", l
print promoted
set_promoted(promoted)
# run what is normally in a cron job to clear out finished promos
#promote_promoted()
def store_market():
"""
create index ix_promote_date_actual_end on promote_date(actual_end);
create index ix_promote_date_actual_start on promote_date(actual_start);
create index ix_promote_date_start_date on promote_date(start_date);
create index ix_promote_date_end_date on promote_date(end_date);
alter table promote_date add column account_id bigint;
create index ix_promote_date_account_id on promote_date(account_id);
alter table promote_date add column bid real;
alter table promote_date add column refund real;
"""
for p in PromoteDates.query().all():
l = Link._by_fullname(p.thing_name, True)
if hasattr(l, "promote_bid") and hasattr(l, "author_id"):
p.account_id = l.author_id
p._commit()
PromoteDates.update(l, l._date, l.promote_until)
PromoteDates.update_bid(l)
def subscribe_to_blog_and_annoucements(filename):
import re
from time import sleep
from r2.models import Account, Subreddit
r_blog = Subreddit._by_name("blog")
r_announcements = Subreddit._by_name("announcements")
contents = file(filename).read()
numbers = [ int(s) for s in re.findall("\d+", contents) ]
# d = Account._byID(numbers, data=True)
# for i, account in enumerate(d.values()):
for i, account_id in enumerate(numbers):
account = Account._byID(account_id, data=True)
for sr in r_blog, r_announcements:
if sr.add_subscriber(account):
sr._incr("_ups", 1)
print ("%d: subscribed %s to %s" % (i, account.name, sr.name))
else:
print ("%d: didn't subscribe %s to %s" % (i, account.name, sr.name))


@@ -101,7 +101,7 @@ def normalized_hot_cached(sr_ids):
if not items:
continue
top_score = max(items[0]._hot, 1)
top_score = max(max(x._hot for x in items), 1)
if items:
results.extend((l, l._hot / top_score) for l in items)


@@ -24,7 +24,7 @@ from r2.lib.memoize import memoize
from r2.lib.normalized_hot import get_hot, only_recent
from r2.lib import count
from r2.lib.utils import UniqueIterator, timeago
from r2.lib.promote import get_promoted
from r2.lib.promote import random_promoted
from pylons import c
@@ -45,36 +45,28 @@ def insert_promoted(link_names, sr_ids, logged_in):
Inserts promoted links into an existing organic list. Destructive
on `link_names'
"""
promoted_items = get_promoted()
promoted_items = random_promoted()
if not promoted_items:
return
def my_keepfn(l):
if l.promoted_subscribersonly and l.sr_id not in sr_ids:
return False
else:
return keep_link(l)
# no point in running the builder over more promoted links than
# we'll even use
max_promoted = max(1,len(link_names)/promoted_every_n)
# in the future, we may want to weight this sorting somehow
random.shuffle(promoted_items)
# remove any that the user has acted on
builder = IDBuilder(promoted_items,
skip = True, keep_fn = my_keepfn,
num = max_promoted)
def keep(item):
if c.user_is_loggedin and c.user._id == item.author_id:
return True
else:
return item.keep_item(item)
builder = IDBuilder(promoted_items, keep_fn = keep,
skip = True, num = max_promoted)
promoted_items = builder.get_items()[0]
if not promoted_items:
return
#make a copy before we start messing with things
orig_promoted = list(promoted_items)
# don't insert one at the head of the list 50% of the time for
# logged in users, and 50% of the time for logged-off users when
# the pool of promoted links is less than 3 (to avoid showing the
@@ -82,11 +74,6 @@ def insert_promoted(link_names, sr_ids, logged_in):
if (logged_in or len(promoted_items) < 3) and random.choice((True,False)):
promoted_items.insert(0, None)
#repeat the same promoted links for non logged in users
if not logged_in:
while len(promoted_items) * promoted_every_n < len(link_names):
promoted_items.extend(orig_promoted)
# insert one promoted item for every N items
for i, item in enumerate(promoted_items):
pos = i * promoted_every_n + i
@@ -106,11 +93,9 @@ def cached_organic_links(user_id, langs):
sr_ids = Subreddit.user_subreddits(user)
sr_count = count.get_link_counts()
#only use links from reddits that you're subscribed to
link_names = filter(lambda n: sr_count[n][1] in sr_ids, sr_count.keys())
link_names.sort(key = lambda n: sr_count[n][0])
#potentially add a up and coming link
if random.choice((True, False)) and sr_ids:
sr = Subreddit._byID(random.choice(sr_ids))
@@ -122,6 +107,8 @@ def cached_organic_links(user_id, langs):
new_item = random.choice(items[1:4])
link_names.insert(0, new_item._fullname)
insert_promoted(link_names, sr_ids, user_id is not None)
# remove any that the user has acted on
builder = IDBuilder(link_names,
skip = True, keep_fn = keep_link,
@@ -133,8 +120,6 @@ def cached_organic_links(user_id, langs):
if user_id:
update_pos(0)
insert_promoted(link_names, sr_ids, user_id is not None)
# remove any duplicates caused by insert_promoted if the user is logged in
if user_id:
link_names = list(UniqueIterator(link_names))


@@ -26,6 +26,7 @@ from r2.lib.menus import NamedButton, NavButton, menu, NavMenu
class AdminSidebar(Templated):
def __init__(self, user):
Templated.__init__(self)
self.user = user
@@ -46,7 +47,9 @@ class AdminPage(Reddit):
buttons = []
if g.translator:
buttons.append(NavButton(menu.i18n, ""))
buttons.append(NavButton(menu.i18n, "i18n"))
buttons.append(NavButton(menu.awards, "awards"))
admin_menu = NavMenu(buttons, title='show', base_path = '/admin',
type="lightdrop")


@@ -138,9 +138,9 @@ class LineGraph(object):
def __init__(self, xydata, colors = ("FF4500", "336699"),
width = 300, height = 175):
series = zip(*xydata)
self.xdata = DataSeries(series[0])
self.ydata = map(DataSeries, series[1:])
self.width = width
@@ -150,13 +150,13 @@ class LineGraph(object):
def google_chart(self, multiy = True, ylabels = [], title = "",
bar_fmt = True):
xdata, ydata = self.xdata, self.ydata
# Bar format makes the line chart look like it is a series of
# contiguous bars without the boundary line between each bar.
if bar_fmt:
xdata = DataSeries(range(len(self.xdata))).toBarX()
ydata = [y.toBarY() for y in self.ydata]
# TODO: currently we are only supporting time series. Make general
xaxis = make_date_axis_labels(self.xdata)

File diff suppressed because it is too large.


@@ -24,7 +24,9 @@ from r2.lib.wrapped import Wrapped
from r2.models import LinkListing, make_wrapper, Link, IDBuilder, PromotedLink, Thing
from r2.lib.utils import tup
from r2.lib.strings import Score
from pylons import c
from r2.lib.promote import promo_edit_url, promo_traffic_url
from datetime import datetime
from pylons import c, g
class PrintableButtons(Styled):
def __init__(self, style, thing,
@@ -61,6 +63,18 @@ class LinkButtons(PrintableButtons):
show_distinguish = (is_author and thing.can_ban
and getattr(thing, "expand_children", False))
kw = {}
if thing.promoted is not None:
now = datetime.now(g.tz)
promotable = (thing._date <= now and thing.promote_until > now)
kw = dict(promo_url = promo_edit_url(thing),
promote_bid = thing.promote_bid,
promote_status = getattr(thing, "promote_status", 0),
user_is_sponsor = c.user_is_sponsor,
promotable = promotable,
traffic_url = promo_traffic_url(thing),
is_author = thing.is_author)
PrintableButtons.__init__(self, 'linkbuttons', thing,
# user existence and preferences
is_loggedin = c.user_is_loggedin,
@@ -76,7 +90,10 @@ class LinkButtons(PrintableButtons):
show_delete = show_delete,
show_report = show_report,
show_distinguish = show_distinguish,
show_comments = comments)
show_comments = comments,
# promotion
promoted = thing.promoted,
**kw)
class CommentButtons(PrintableButtons):
def __init__(self, thing, delete = True, report = True):
@@ -105,12 +122,13 @@ class MessageButtons(PrintableButtons):
def __init__(self, thing, delete = False, report = True):
was_comment = getattr(thing, 'was_comment', False)
permalink = thing.permalink if was_comment else ""
PrintableButtons.__init__(self, "messagebuttons", thing,
profilepage = c.profilepage,
permalink = permalink,
was_comment = was_comment,
was_comment = was_comment,
can_reply = c.user_is_loggedin,
parent_id = getattr(thing, "parent_id", None),
show_report = True,
show_delete = False)
@@ -120,7 +138,7 @@ def default_thing_wrapper(**params):
w = Wrapped(thing)
style = params.get('style', c.render_style)
if isinstance(thing, Link):
if thing.promoted:
if thing.promoted is not None:
w.render_class = PromotedLink
w.rowstyle = 'promoted link'
elif style == 'htmllite':


@@ -22,50 +22,426 @@
from __future__ import with_statement
from r2.models import *
from r2.lib import authorize
from r2.lib import emailer, filters
from r2.lib.memoize import memoize
from datetime import datetime
from r2.lib.template_helpers import get_domain
from r2.lib.utils import Enum
from pylons import g, c
from datetime import datetime, timedelta
import random
promoted_memo_lifetime = 30
promoted_memo_key = 'cached_promoted_links'
promoted_lock_key = 'cached_promoted_links_lock'
promoted_memo_key = 'cached_promoted_links2'
promoted_lock_key = 'cached_promoted_links_lock2'
def promote(thing, subscribers_only = False, promote_until = None,
disable_comments = False):
STATUS = Enum("unpaid", "unseen", "accepted", "rejected",
"pending", "promoted", "finished")
thing.promoted = True
thing.promoted_on = datetime.now(g.tz)
PromoteSR = 'promos'
try:
PromoteSR = Subreddit._new(name = PromoteSR,
title = "promoted links",
author_id = -1,
type = "public",
ip = '0.0.0.0')
except SubredditExists:
PromoteSR = Subreddit._by_name(PromoteSR)
if c.user:
thing.promoted_by = c.user._id
def promo_traffic_url(l):
domain = get_domain(cname = False, subreddit = False)
return "http://%s/traffic/%s/" % (domain, l._id36)
if promote_until:
thing.promote_until = promote_until
def promo_edit_url(l):
domain = get_domain(cname = False, subreddit = False)
return "http://%s/promoted/edit_promo/%s" % (domain, l._id36)
if disable_comments:
thing.disable_comments = True
# These could be done with relationships, but that seems overkill as
# we never query based on user and only check per-thing
def is_traffic_viewer(thing, user):
return (c.user_is_sponsor or user._id == thing.author_id or
user._id in getattr(thing, "promo_traffic_viewers", set()))
if subscribers_only:
thing.promoted_subscribersonly = True
def add_traffic_viewer(thing, user):
viewers = getattr(thing, "promo_traffic_viewers", set()).copy()
if user._id not in viewers:
viewers.add(user._id)
thing.promo_traffic_viewers = viewers
thing._commit()
return True
return False
def rm_traffic_viewer(thing, user):
viewers = getattr(thing, "promo_traffic_viewers", set()).copy()
if user._id in viewers:
viewers.remove(user._id)
thing.promo_traffic_viewers = viewers
thing._commit()
return True
return False
def traffic_viewers(thing):
return sorted(getattr(thing, "promo_traffic_viewers", set()))
# logging routine for keeping track of diffs
def promotion_log(thing, text, commit = False):
"""
For logging all sorts of things
"""
name = c.user.name if c.user_is_loggedin else "<MAGIC>"
log = list(getattr(thing, "promotion_log", []))
now = datetime.now(g.tz).strftime("%Y-%m-%d %H:%M:%S")
text = "[%s: %s] %s" % (name, now, text)
log.append(text)
# copy (and fix encoding) to make _dirty
thing.promotion_log = map(filters._force_utf8, log)
if commit:
thing._commit()
return text
def new_promotion(title, url, user, ip, promote_start, promote_until, bid,
disable_comments = False,
max_clicks = None, max_views = None):
"""
Creates a new promotion with the provided title, etc., and sets its
status to be 'unpaid'.
"""
l = Link._submit(title, url, user, PromoteSR, ip)
l.promoted = True
l.promote_until = None
l.promote_status = STATUS.unpaid
l.promote_trans_id = 0
l.promote_bid = bid
l.maximum_clicks = max_clicks
l.maximum_views = max_views
l.disable_comments = disable_comments
update_promo_dates(l, promote_start, promote_until)
promotion_log(l, "promotion created")
l._commit()
# the user has posted a promotion, so enable the promote menu unless
# they have already opted out
if user.pref_show_promote is not False:
user.pref_show_promote = True
user._commit()
emailer.new_promo(l)
return l
def update_promo_dates(thing, start_date, end_date, commit = True):
if thing and thing.promote_status < STATUS.pending or c.user_is_admin:
if (thing._date != start_date or
thing.promote_until != end_date):
promotion_log(thing, "duration updated (was %s -> %s)" %
(thing._date, thing.promote_until))
thing._date = start_date
thing.promote_until = end_date
PromoteDates.update(thing, start_date, end_date)
if commit:
thing._commit()
return True
return False
def update_promo_data(thing, title, url, commit = True):
if thing and (thing.url != url or thing.title != title):
if thing.title != title:
promotion_log(thing, "title updated (was '%s')" %
thing.title)
if thing.url != url:
promotion_log(thing, "url updated (was '%s')" %
thing.url)
old_url = thing.url
thing.url = url
thing.title = title
if not c.user_is_sponsor:
unapproved_promo(thing)
thing.update_url_cache(old_url)
if commit:
thing._commit()
return True
return False
def refund_promo(thing, user, refund):
cur_refund = getattr(thing, "promo_refund", 0)
refund = min(refund, thing.promote_bid - cur_refund)
if refund > 0:
thing.promo_refund = cur_refund + refund
if authorize.refund_transaction(refund, user, thing.promote_trans_id):
promotion_log(thing, "payment update: refunded '%.2f'" % refund)
else:
promotion_log(thing, "payment update: refund failed")
if thing.promote_status in (STATUS.promoted, STATUS.finished):
PromoteDates.update_bid(thing)
thing._commit()
def auth_paid_promo(thing, user, pay_id, bid):
"""
promotes a promotion from 'unpaid' to 'unseen'.
In the case that a bid already exists on the current promotion, the
previous transaction is voided and replaced with the new bid.
"""
if thing.promote_status == STATUS.finished:
return
elif (thing.promote_status > STATUS.unpaid and
thing.promote_trans_id):
# void the existing transaction
authorize.void_transaction(user, thing.promote_trans_id)
# create a new transaction and update the bid
trans_id = authorize.auth_transaction(bid, user, pay_id, thing)
thing.promote_bid = bid
if trans_id is not None:
# we won't reset to unseen if already approved and the payment went ok
promotion_log(thing, "updated payment and/or bid: SUCCESS")
if trans_id < 0:
promotion_log(thing, "FREEBIE")
thing.promote_status = max(thing.promote_status, STATUS.unseen)
thing.promote_trans_id = trans_id
else:
# something bad happened.
promotion_log(thing, "updated payment and/or bid: FAILED")
thing.promote_status = STATUS.unpaid
thing.promote_trans_id = 0
PromoteDates.update_bid(thing)
# commit last to guarantee consistency
thing._commit()
emailer.promo_bid(thing)
return bool(trans_id)
with g.make_lock(promoted_lock_key):
promoted = get_promoted_direct()
promoted.append(thing._fullname)
set_promoted(promoted)
def unapproved_promo(thing):
"""
revert status of a promoted link to unseen.
NOTE: if the promotion is live, this has the side effect of
bumping it from the live queue pending an admin's intervention to
put it back in place.
"""
# only reinforce pending if it hasn't been seen yet.
if STATUS.unseen < thing.promote_status < STATUS.finished:
promotion_log(thing, "status update: unapproved")
unpromote(thing, status = STATUS.unseen)
def accept_promo(thing):
"""
Accept promotion and set its status as accepted if not already
charged, else pending.
"""
if thing.promote_status < STATUS.pending:
bid = Bid.one(thing.promote_trans_id)
if bid.status == Bid.STATUS.CHARGE:
thing.promote_status = STATUS.pending
# repromote if already promoted before
if hasattr(thing, "promoted_on"):
promote(thing)
else:
emailer.queue_promo(thing)
else:
thing.promote_status = STATUS.accepted
promotion_log(thing, "status update: accepted")
emailer.accept_promo(thing)
thing._commit()
def reject_promo(thing, reason = ""):
"""
Reject promotion and set its status as rejected
Here, we use unpromote so that we can also remove a promotion from
the queue if it has become promoted.
"""
unpromote(thing, status = STATUS.rejected)
promotion_log(thing, "status update: rejected. Reason: '%s'" % reason)
emailer.reject_promo(thing, reason)
def delete_promo(thing):
"""
deleted promotions have to be specially dealt with. Reject the
promo and void any associated transactions.
"""
thing.promoted = False
thing.unpromoted_on = datetime.now(g.tz)
thing.promote_until = None
thing._deleted = True
reject_promo(thing, reason = "The promotion was deleted by the user")
if thing.promote_trans_id > 0:
user = Account._byID(thing.author_id)
authorize.void_transaction(user, thing.promote_trans_id)
def pending_promo(thing):
"""
For an accepted promotion within the proper time interval, charge
the account of the user and set the new status as pending.
"""
if thing.promote_status == STATUS.accepted and thing.promote_trans_id:
user = Account._byID(thing.author_id)
# TODO: check for charge failures/recharges, etc
if authorize.charge_transaction(user, thing.promote_trans_id):
promotion_log(thing, "status update: pending")
thing.promote_status = STATUS.pending
thing.promote_paid = thing.promote_bid
thing._commit()
emailer.queue_promo(thing)
else:
promotion_log(thing, "status update: charge failure")
thing._commit()
#TODO: email rejection?
def promote(thing, batch = False):
"""
Given a promotion with pending status, set the status to promoted
and move it into the promoted queue.
"""
if thing.promote_status == STATUS.pending:
promotion_log(thing, "status update: live")
PromoteDates.log_start(thing)
thing.promoted_on = datetime.now(g.tz)
thing.promote_status = STATUS.promoted
thing._commit()
emailer.live_promo(thing)
if not batch:
with g.make_lock(promoted_lock_key):
promoted = get_promoted_direct()
if thing._fullname not in promoted:
promoted[thing._fullname] = auction_weight(thing)
set_promoted(promoted)
def unpromote(thing, batch = False, status = STATUS.finished):
"""
unpromote a link with provided status, removing it from the
current promotional queue.
"""
if status == STATUS.finished:
PromoteDates.log_end(thing)
emailer.finished_promo(thing)
thing.unpromoted_on = datetime.now(g.tz)
promotion_log(thing, "status update: finished")
thing.promote_status = status
thing._commit()
if not batch:
with g.make_lock(promoted_lock_key):
promoted = get_promoted_direct()
if thing._fullname in promoted:
del promoted[thing._fullname]
set_promoted(promoted)
# batch methods for moving promotions into the pending queue, and
# setting status as pending.
# dates are referenced to UTC, while we want promos to change at (roughly)
# midnight eastern-US.
# TODO: make this a config parameter
timezone_offset = -5 # hours
timezone_offset = timedelta(0, timezone_offset * 3600)
def promo_datetime_now():
return datetime.now(g.tz) + timezone_offset
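The offset above is what promo_datetime_now applies: dates are stored relative to UTC, but promotions should flip at roughly midnight eastern-US. A minimal sketch of the shifted clock, with "now" passed in explicitly for clarity (the real function reads datetime.now(g.tz)):

```python
from datetime import datetime, timedelta

# promos change at (roughly) midnight eastern-US, so shift UTC back 5 hours
timezone_offset = timedelta(hours=-5)

def promo_datetime_now(now_utc):
    return now_utc + timezone_offset

# 03:00 UTC on June 2 is still June 1 for promotion purposes
shifted = promo_datetime_now(datetime(2009, 6, 2, 3, 0))
```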
def generate_pending(date = None, test = False):
"""
Look up links that are to be promoted on the provided date (the
default is now plus one day) and set their status as pending if
they have been accepted. This results in credit cards being charged.
"""
date = date or (promo_datetime_now() + timedelta(1))
links = Link._by_fullname([p.thing_name for p in
PromoteDates.for_date(date)],
data = True,
return_dict = False)
for l in links:
if l._deleted and l.promote_status != STATUS.rejected:
print "DELETING PROMO", l
# deleted promos should never be made pending
delete_promo(l)
elif l.promote_status == STATUS.accepted:
if test:
print "Would have made pending: (%s, %s)" % \
(l, l.make_permalink(None))
else:
pending_promo(l)
def promote_promoted(test = False):
"""
make promotions that are no longer supposed to be active
'finished' and find all pending promotions that are supposed to be
promoted and promote them.
"""
from r2.lib.traffic import load_traffic
with g.make_lock(promoted_lock_key):
promoted = [ x for x in get_promoted_direct()
if x != thing._fullname ]
now = promo_datetime_now()
set_promoted(promoted)
promoted = Link._by_fullname(get_promoted_direct().keys(),
data = True, return_dict = False)
promos = {}
for l in promoted:
keep = True
if l.promote_until < now:
keep = False
maximum_clicks = getattr(l, "maximum_clicks", None)
maximum_views = getattr(l, "maximum_views", None)
if maximum_clicks or maximum_views:
# grab the traffic
traffic = load_traffic("day", "thing", l._fullname)
if traffic:
# (unique impressions, number impressions,
# unique clicks, number of clicks)
traffic = [y for x, y in traffic]
traffic = map(sum, zip(*traffic))
uimp, nimp, ucli, ncli = traffic
if maximum_clicks and maximum_clicks < ncli:
keep = False
if maximum_views and maximum_views < nimp:
keep = False
if not keep:
if test:
print "Would have unpromoted: (%s, %s)" % \
(l, l.make_permalink(None))
else:
unpromote(l, batch = True)
new_promos = Link._query(Link.c.promote_status == (STATUS.pending,
STATUS.promoted),
Link.c.promoted == True,
data = True)
for l in new_promos:
if l.promote_until > now and l._date <= now:
if test:
print "Would have promoted: %s" % l
else:
promote(l, batch = True)
promos[l._fullname] = auction_weight(l)
elif l.promote_until <= now:
if test:
print "Would have unpromoted: (%s, %s)" % \
(l, l.make_permalink(None))
else:
unpromote(l, batch = True)
# remove unpaid promos that are scheduled to run on today or before
unpaid_promos = Link._query(Link.c.promoted == True,
Link.c.promote_status == STATUS.unpaid,
Link.c._date < now,
Link.c._deleted == False,
data = True)
for l in unpaid_promos:
if test:
print "Would have rejected: %s" % promo_edit_url(l)
else:
reject_promo(l, reason = "We're sorry, but this sponsored link was not set up for payment before the appointed date. Please add payment info and move the date into the future if you would like to resubmit. Also please feel free to email us at selfservicesupport@reddit.com if you believe this email is in error.")
if test:
print promos
else:
set_promoted(promos)
return promos
def auction_weight(link):
duration = (link.promote_until - link._date).days
return duration and link.promote_bid / duration
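auction_weight spreads a promotion's total bid over its scheduled duration, so links compete on bid-per-day rather than total bid; the `duration and ...` idiom returns 0 for a zero-day window instead of dividing by zero. A standalone sketch of the same calculation, with hypothetical numbers:

```python
from datetime import datetime, timedelta

def auction_weight(bid, start, end):
    # weight is bid-per-day; `duration and ...` avoids division by zero
    duration = (end - start).days
    return duration and bid / duration

start = datetime(2009, 1, 1)
three_day_weight = auction_weight(30.0, start, start + timedelta(days=3))
zero_day_weight = auction_weight(30.0, start, start)
```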
def set_promoted(link_names):
# caller is assumed to execute me inside a lock if necessary
@@ -81,55 +457,60 @@ def get_promoted():
return get_promoted_direct()
def get_promoted_direct():
return g.permacache.get(promoted_memo_key, [])
return g.permacache.get(promoted_memo_key, {})
def expire_promoted():
"""
To be called periodically (e.g. by `cron') to clean up
promoted links past their expiration date
"""
with g.make_lock(promoted_lock_key):
link_names = set(get_promoted_direct())
links = Link._by_fullname(link_names, data=True, return_dict = False)
link_names = []
expired_names = []
for x in links:
if (not x.promoted
or x.promote_until and x.promote_until < datetime.now(g.tz)):
g.log.info('Unpromoting %s' % x._fullname)
unpromote(x)
expired_names.append(x._fullname)
else:
link_names.append(x._fullname)
set_promoted(link_names)
return expired_names
def get_promoted_slow():
# to be used only by a human at a terminal
with g.make_lock(promoted_lock_key):
links = Link._query(Link.c.promoted == True,
links = Link._query(Link.c.promote_status == STATUS.promoted,
Link.c.promoted == True,
data = True)
link_names = [ x._fullname for x in links ]
link_names = dict((x._fullname, auction_weight(x)) for x in links)
set_promoted(link_names)
return link_names
#deprecated
def promote_builder_wrapper(alternative_wrapper):
def wrapper(thing):
if isinstance(thing, Link) and thing.promoted:
w = Wrapped(thing)
w.render_class = PromotedLink
w.rowstyle = 'promoted link'
return w
else:
return alternative_wrapper(thing)
return wrapper
def random_promoted():
"""
return a list of the currently promoted items, randomly choosing
the order of the list based on the bid-weighting.
"""
bids = get_promoted()
market = sum(bids.values())
if market:
# get a list of current promotions, sorted by their bid amount
promo_list = bids.keys()
# sort by bids and use the thing_id as the tie breaker (for
# consistent sorting)
promo_list.sort(key = lambda x: (bids[x], x), reverse = True)
if len(bids) > 1:
# pick a number, any number
n = random.uniform(0, 1)
for i, p in enumerate(promo_list):
n -= bids[p] / market
if n < 0:
return promo_list[i:] + promo_list[:i]
return promo_list
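The selection loop above is a weighted rotation: a start index is drawn with probability proportional to each link's share of the total bid market, and the bid-sorted list is rotated so that link comes first, keeping every promotion visible while showing higher bids first more often. A self-contained sketch over a hypothetical bid dict (written to also run under Python 3, unlike the Python 2 code above):

```python
import random

def weighted_rotation(bids):
    """Rotate the bid-sorted key list, starting from a key chosen with
    probability proportional to its share of the total bid market."""
    market = sum(bids.values())
    if not market:
        return []
    # sort by bid, with the key itself as a tie-breaker for stable ordering
    promo_list = sorted(bids, key=lambda x: (bids[x], x), reverse=True)
    if len(promo_list) > 1:
        n = random.uniform(0, 1)
        for i, p in enumerate(promo_list):
            n -= bids[p] / market
            if n < 0:
                return promo_list[i:] + promo_list[:i]
    return promo_list

bids = {"t3_a": 30.0, "t3_b": 10.0}
rotation = weighted_rotation(bids)  # "t3_a" leads roughly 75% of the time
```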
def test_random_promoted(n = 1000):
promos = get_promoted()
market = sum(promos.values())
if market:
res = {}
for i in xrange(n):
key = random_promoted()[0]
res[key] = res.get(key, 0) + 1
print "%10s expected actual E/A" % "thing"
print "------------------------------------"
for k, v in promos.iteritems():
expected = float(v) / market * 100
actual = float(res.get(k, 0)) / n * 100
print "%10s %6.2f%% %6.2f%% %6.2f" % \
(k, expected, actual, expected / actual if actual else 0)


@@ -36,7 +36,7 @@ class ShellProcess(object):
self.proc = subprocess.Popen(cmd, shell = True,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
ntries = int(math.ceil(timeout / sleepcycle))
for n in xrange(ntries):
if self.proc.poll() is not None:
@@ -45,7 +45,7 @@ class ShellProcess(object):
else:
print "Process timeout: '%s'" % cmd
os.kill(self.proc.pid, signal.SIGTERM)
self.output, self.error = self.proc.communicate()
self.rcode = self.proc.poll()
@@ -56,8 +56,8 @@ class ShellProcess(object):
def read(self):
return self.output
class AppServiceMonitor(Templated):
cache_key = "service_datalogger_data_"
cache_key_small = "service_datalogger_db_summary_"
@@ -76,7 +76,7 @@ class AppServiceMonitor(Templated):
"""
def __init__(self, hosts = None):
def __init__(self, hosts = None, queue_length_max = {}):
"""
hosts is a list of machine hostnames to be tracked.
"""
@@ -87,7 +87,7 @@ class AppServiceMonitor(Templated):
dbase, ip = list(g.to_iter(getattr(g, db + "_db")))[:2]
try:
name = socket.gethostbyaddr(ip)[0]
for host in g.monitored_servers:
if (name == host or
("." in host and name.endswith("." + host)) or
@@ -97,6 +97,13 @@ class AppServiceMonitor(Templated):
print "error resolving host: %s" % ip
self._db_info = db_info
q_host = g.amqp_host.split(':')[0]
if q_host:
# list of machines that have amqp queues
self._queue_hosts = set([q_host, socket.gethostbyaddr(q_host)[0]])
# dictionary of max lengths for each queue
self._queue_length_max = queue_length_max
self.hostlogs = []
Templated.__init__(self)
@@ -135,7 +142,7 @@ class AppServiceMonitor(Templated):
def server_load(self, mach_name):
h = self.from_cache(host)
return h.load.most_recent()
def __iter__(self):
if not self.hostlogs:
self.hostlogs = [self.from_cache(host) for host in self._hosts]
@@ -152,13 +159,16 @@ class AppServiceMonitor(Templated):
h = HostLogger(host, self)
while True:
h.monitor(srvname, *a, **kw)
self.set_cache(h)
if loop:
time.sleep(loop_time)
else:
break
def is_queue(self, host):
name = socket.gethostbyaddr(host)[0]
return name in self._queue_hosts or host in self._queue_hosts
def is_db_machine(self, host):
"""
@@ -177,7 +187,7 @@ class DataLogger(object):
of the interval provided or returns the last element if no
interval is provided
"""
def __init__(self, maxlen = 30):
self._list = []
self.maxlen = maxlen
@@ -186,7 +196,6 @@ class DataLogger(object):
self._list.append((value, datetime.utcnow()))
if len(self._list) > self.maxlen:
self._list = self._list[-self.maxlen:]
def __call__(self, average = None):
time = datetime.utcnow()
@@ -208,7 +217,6 @@ class DataLogger(object):
else:
return [0, None]
class Service(object):
def __init__(self, name, pid, age):
self.name = name
@@ -221,6 +229,29 @@ class Service(object):
def last_update(self):
return max(x.most_recent()[1] for x in [self.mem, self.cpu])
class AMQueueP(object):
default_max_queue = 1000
def __init__(self, max_lengths = {}):
self.queues = {}
self.max_lengths = max_lengths
def track(self, cmd = "rabbitmqctl"):
for line in ShellProcess("%s list_queues" % cmd):
try:
name, length = line.split('\t')
length = int(length.strip(' \n'))
self.queues.setdefault(name, DataLogger()).add(length)
except ValueError:
continue
def max_length(self, name):
return self.max_lengths.get(name, self.default_max_queue)
def __iter__(self):
for x in sorted(self.queues.keys()):
yield (x, self.queues[x])
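AMQueueP.track parses `rabbitmqctl list_queues` output, which lists one queue per line as a tab-separated name/length pair; lines that don't fit that shape (the banner, the trailing summary) raise ValueError on unpacking or int() and are skipped. A standalone sketch of just that parsing step, over made-up output:

```python
def parse_queue_listing(lines):
    # one queue per line: "<name>\t<length>"; skip anything else
    lengths = {}
    for line in lines:
        try:
            name, length = line.split('\t')
            lengths[name] = int(length.strip(' \n'))
        except ValueError:
            continue
    return lengths

output = ["Listing queues ...\n",
          "searchchanges_q\t42\n",
          "scraper_q\t0\n",
          "...done.\n"]
queue_lengths = parse_queue_listing(output)
```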
class Database(object):
@@ -277,12 +308,14 @@ class HostLogger(object):
self.load = DataLogger()
self.services = {}
db_info = master.is_db_machine(host)
is_queue = master.is_queue(host)
self.ini_db_names = db_info.keys()
self.db_names = set(name for name, ip in db_info.itervalues())
self.db_ips = set(ip for name, ip in db_info.itervalues())
self.database = Database() if self.db_names else None
self.queue = AMQueueP(master._queue_length_max) if is_queue else None
self.ncpu = 0
try:
@@ -320,7 +353,8 @@ class HostLogger(object):
def monitor(self, srvname,
srv_params = {}, top_params = {}, db_params = {}):
srv_params = {}, top_params = {}, db_params = {},
queue_params = {}):
# (re)populate the service listing
if srvname:
for name, status, pid, t in supervise_list(**srv_params):
@@ -338,6 +372,9 @@ class HostLogger(object):
self.database.track(**check_database(self.db_names,
**db_params))
if self.queue:
self.queue.track(**queue_params)
foo = ShellProcess('/usr/bin/env uptime').read()
foo = foo.split("load average")[1].split(':')[1].strip(' ')
self.load.add(float(foo.split(' ')[0].strip(',')))
@@ -542,6 +579,4 @@ def monitor_cache_lifetime(minutes, retest = 10, ntest = -1,
if [] in keys:
keys = filter(None, keys)
ntest -= 1


@@ -21,25 +21,28 @@
################################################################################
"""
Module for reddit-level communication with
Solr. Contains functions for indexing (`reindex_all`, `changed`)
Solr. Contains functions for indexing (`reindex_all`, `run_changed`)
and searching (`search_things`). Uses pysolr (placed in r2.lib)
for lower-level communication with Solr
"""
from __future__ import with_statement
from r2.models import *
from r2.lib.contrib import pysolr
from r2.lib.contrib.pysolr import SolrError
from r2.lib.utils import timeago, set_emptying_cache, IteratorChunker
from r2.lib.utils import psave, pload, unicode_safe, tup
from r2.lib.cache import SelfEmptyingCache
from Queue import Queue
from threading import Thread
import time
from datetime import datetime, date
from time import strftime
from pylons import g,config
from pylons import g, config
from r2.models import *
from r2.lib.contrib import pysolr
from r2.lib.contrib.pysolr import SolrError
from r2.lib.utils import timeago
from r2.lib.utils import unicode_safe, tup
from r2.lib.cache import SelfEmptyingCache
from r2.lib import amqp
## Changes to the list of searchable languages will require changes to
## Solr's configuration (specifically, the fields that are searched)
@@ -49,7 +52,8 @@ searchable_langs = set(['dk','nl','en','fi','fr','de','it','no','nn','pt',
## Adding types is a matter of adding the class to indexed_types here,
## adding the fields from that type to search_fields below, and adding
## those fields to Solr's configuration
indexed_types = (Subreddit, Link)
indexed_types = (Subreddit, Link)
class Field(object):
"""
@@ -402,42 +406,6 @@ def reindex_all(types = None, delete_all_first=False):
q.put(e,timeout=30)
raise e
def changed(commit=True,optimize=False,delete_old=True):
"""
Run by `cron` (through `paster run`) on a schedule to update
all Things that have been created or have changed since the
last run. Things add themselves to a `thing_changes` table,
which we read, find the Things, tokenise, and re-submit them
to Solr
"""
set_emptying_cache()
with SolrConnection(commit=commit,optimize=optimize) as s:
changes = thing_changes.get_changed()
if changes:
max_date = max(x[1] for x in changes)
changed = IteratorChunker(x[0] for x in changes)
while not changed.done:
chunk = changed.next_chunk(200)
# chunk =:= [(Fullname,Date) | ...]
chunk = Thing._by_fullname(chunk,
data=True, return_dict=False)
chunk = [x for x in chunk if not x._spam and not x._deleted]
to_delete = [x for x in chunk if x._spam or x._deleted]
# note: anything marked as spam or deleted is not
# updated in the search database. Since these are
# filtered out in the UI, that's probably fine.
if len(chunk) > 0:
chunk = tokenize_things(chunk)
s.add(chunk)
for i in to_delete:
s.delete(id=i._fullname)
if delete_old:
thing_changes.clear_changes(max_date = max_date)
def combine_searchterms(terms):
"""
@@ -728,3 +696,38 @@ def get_after(fullnames, fullname, num):
return fullnames[i+1:i+num+1]
else:
return fullnames[:num]
def run_commit(optimize=False):
with SolrConnection(commit=True, optimize=optimize) as s:
pass
def run_changed(drain=False):
"""
Run by `cron` (through `paster run`) on a schedule to update
all Things that have been created or have changed since the
last run. Note: unlike many queue-using functions, this one is
run from cron and totally drains the queue before terminating
"""
def _run_changed(msgs):
print "changed: Processing %d items" % len(msgs)
fullnames = set([x.body for x in msgs])
things = Thing._by_fullname(fullnames, data=True, return_dict=False)
things = [x for x in things if isinstance(x, indexed_types)]
update_things = [x for x in things if not x._spam and not x._deleted]
delete_things = [x for x in things if x._spam or x._deleted]
with SolrConnection() as s:
if update_things:
tokenized = tokenize_things(update_things)
s.add(tokenized)
if delete_things:
for i in delete_things:
s.delete(id=i._fullname)
amqp.handle_items('searchchanges_q', _run_changed, limit=1000,
drain=drain)
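The core of _run_changed is a partition: queued fullnames are deduplicated, resolved to things, and then split so that live things are re-submitted to Solr while spam/deleted things are removed from the index. A sketch of that split over hypothetical stand-in objects (Thing, Solr, and amqp aren't needed to see the logic):

```python
class FakeThing(object):
    """Stand-in for a Thing with the attributes run_changed inspects."""
    def __init__(self, fullname, spam=False, deleted=False):
        self._fullname = fullname
        self._spam = spam
        self._deleted = deleted

def partition_for_search(things):
    # live things get (re)indexed; spam/deleted things get dropped instead
    updates = [t for t in things if not t._spam and not t._deleted]
    deletes = [t for t in things if t._spam or t._deleted]
    return updates, deletes

things = [FakeThing("t3_a"),
          FakeThing("t3_b", spam=True),
          FakeThing("t3_c", deleted=True)]
updates, deletes = partition_for_search(things)
```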


@@ -125,7 +125,7 @@ class ShirtPage(LinkInfoPage):
ShirtPane(self.link)))
class ShirtPane(Templated):
default_color = "white"
default_color = "black"
default_size = "large"
default_style = "men"


@@ -30,7 +30,7 @@ hooks to the UI are the same.
import helpers as h
from pylons import g
from pylons.i18n import _, ungettext
import random
import random, locale
__all__ = ['StringHandler', 'strings', 'PluralManager', 'plurals',
'Score', 'rand_strings']
@@ -58,7 +58,7 @@ string_dict = dict(
float_label = _("%(num)5.3f %(thing)s"),
# this is for Japanese, which treats people counts differently
person_label = _("%(num)d %(persons)s"),
person_label = _('<span class="number">%(num)s</span>&#32;<span class="word">%(persons)s</span>'),
firsttext = _("reddit is a source for what's new and popular online. vote on links that you like or dislike and help decide what's popular, or submit your own!"),
@@ -79,14 +79,16 @@ string_dict = dict(
friend = None,
moderator = _("you have been added as a moderator to [%(title)s](%(url)s)."),
contributor = _("you have been added as a contributor to [%(title)s](%(url)s)."),
banned = _("you have been banned from posting to [%(title)s](%(url)s).")
banned = _("you have been banned from posting to [%(title)s](%(url)s)."),
traffic = _('you have been added to the list of users able to see [traffic for the sponsored link "%(title)s"](%(traffic_url)s).')
),
subj_add_friend = dict(
friend = None,
moderator = _("you are a moderator"),
contributor = _("you are a contributor"),
banned = _("you've been banned")
banned = _("you've been banned"),
traffic = _("you can view traffic on a promoted link")
),
sr_messages = dict(
@@ -96,7 +98,7 @@ string_dict = dict(
moderator = _('below are the reddits that you have moderator access to.')
),
sr_subscribe = _('click the `add` or `remove` buttons to choose which reddits appear on your front page.'),
sr_subscribe = _('click the `+frontpage` or `-frontpage` buttons to choose which reddits appear on your front page.'),
searching_a_reddit = _('you\'re searching within the [%(reddit_name)s](%(reddit_link)s) reddit. '+
'you can also search within [all reddits](%(all_reddits_link)s)'),
@@ -125,6 +127,9 @@ string_dict = dict(
submit_link = _("""You are submitting a link. The key to a successful submission is interesting content and a descriptive title."""),
submit_text = _("""You are submitting a text-based post. Speak your mind. A title is required, but expanding further in the text field is not. Beginning your title with "vote up if" is violation of intergalactic law."""),
iphone_first = _("You should consider using [reddit's free iphone app](http://itunes.com/apps/iredditfree)."),
verify_email = _("we're going to need to verify your email address for you to proceed."),
email_verified = _("your email address has been verified"),
email_verify_failed = _("Verification failed. Please try again."),
)
class StringHandler(object):
@@ -194,6 +199,7 @@ plurals = PluralManager([P_("comment", "comments"),
P_("subreddit", "subreddits"),
# people
P_("reader", "readers"),
P_("subscriber", "subscribers"),
P_("contributor", "contributors"),
P_("moderator", "moderators"),
@@ -225,10 +231,19 @@ class Score(object):
return strings.points_label % dict(num=max(x,0),
point=plurals.N_points(x))
@staticmethod
def _people(x, label):
return strings.person_label % \
dict(num = locale.format("%d", x, True),
persons = label(x))
@staticmethod
def subscribers(x):
return strings.person_label % \
dict(num = x, persons = plurals.N_subscribers(x))
return Score._people(x, plurals.N_subscribers)
@staticmethod
def readers(x):
return Score._people(x, plurals.N_readers)
@staticmethod
def none(x):


@@ -76,14 +76,6 @@ def class_dict():
res = ', '.join(classes)
return unsafe('{ %s }' % res)
def path_info():
loc = dict(path = request.path,
params = dict(request.get))
return unsafe(simplejson.dumps(loc))
def replace_render(listing, item, render_func):
def _replace_render(style = None, display = True):
"""
@@ -143,25 +135,31 @@ def replace_render(listing, item, render_func):
com_cls = 'comments'
replacements['numcomments'] = com_label
replacements['commentcls'] = com_cls
replacements['display'] = "" if display else "style='display:none'"
if hasattr(item, "render_score"):
# replace the score stub
(replacements['scoredislikes'],
replacements['scoreunvoted'],
replacements['scorelikes']) = item.render_score
# compute the timesince here so we don't end up caching it
if hasattr(item, "_date"):
replacements['timesince'] = timesince(item._date)
if hasattr(item, "promoted") and item.promoted is not None:
from r2.lib import promote
# promoted links are special in their date handling
replacements['timesince'] = timesince(item._date -
promote.timezone_offset)
else:
replacements['timesince'] = timesince(item._date)
renderer = render_func or item.render
res = renderer(style = style, **replacements)
if isinstance(res, (str, unicode)):
return unsafe(res)
return res
return _replace_render
def get_domain(cname = False, subreddit = True, no_www = False):
@@ -186,7 +184,7 @@ def get_domain(cname = False, subreddit = True, no_www = False):
"""
# locally cache these lookups as this gets run in a loop in add_props
domain = g.domain
domain_prefix = g.domain_prefix
domain_prefix = c.domain_prefix
site = c.site
ccname = c.cname
if not no_www and domain_prefix:
@@ -308,19 +306,51 @@ def panel_size(state):
"the frame.cols of the reddit-toolbar's inner frame"
return '400px, 100%' if state == 'expanded' else '0px, 100%'
def find_author_class(thing, attribs, gray):
#assume attribs is sorted
author = thing.author
author_cls = "author"
# Appends to the list "attrs" a tuple of:
# <priority (higher trumps lower), letter,
# css class, i18n'ed mouseover label, hyperlink (opt), img (opt)>
def add_attr(attrs, code, label=None, link=None):
from r2.lib.template_helpers import static
extra_class = ''
attribs.sort()
img = None
if gray:
author_cls += " gray"
if code == 'F':
priority = 1
cssclass = 'friend'
if not label:
label = _('friend')
if not link:
link = '/prefs/friends'
elif code == 'S':
priority = 2
cssclass = 'submitter'
if not label:
label = _('submitter')
if not link:
raise ValueError ("Need a link")
elif code == 'M':
priority = 3
cssclass = 'moderator'
if not label:
raise ValueError ("Need a label")
if not link:
raise ValueError ("Need a link")
elif code == 'A':
priority = 4
cssclass = 'admin'
if not label:
label = _('reddit admin, speaking officially')
if not link:
link = '/help/faq#Whomadereddit'
elif code == 'trophy':
img = (static('award.png'), '!', 11, 8)
priority = 99
cssclass = 'recent-trophywinner'
if not label:
raise ValueError ("Need a label")
if not link:
raise ValueError ("Need a link")
else:
raise ValueError ("Got weird code [%s]" % code)
for priority, abbv, css_class, label, attr_link in attribs:
author_cls += " " + css_class
return author_cls
attrs.append( (priority, code, cssclass, label, link, img) )
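The new `add_attr` helper accumulates `(priority, letter, css class, label, link, img)` tuples that the templates later sort and render, with higher priority trumping lower. A reduced sketch of that contract — the codes, classes, and defaults are taken from the hunk, but the table-driven form and the `author_classes` renderer are illustrative:

```python
def add_attr(attrs, code, label=None, link=None):
    # Reduced mapping: friend, submitter, admin (see the hunk for 'M'
    # and 'trophy', which require explicit labels/links).
    table = {
        'F': (1, 'friend', 'friend', '/prefs/friends'),
        'S': (2, 'submitter', 'submitter', None),
        'A': (4, 'admin', 'reddit admin, speaking officially',
              '/help/faq#Whomadereddit'),
    }
    if code not in table:
        raise ValueError("Got weird code [%s]" % code)
    priority, cssclass, default_label, default_link = table[code]
    attrs.append((priority, code, cssclass,
                  label or default_label, link or default_link))

def author_classes(attrs):
    # Sort by priority so higher-priority classes come last.
    return ' '.join(css for _, _, css, _, _ in sorted(attrs))
```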


@@ -140,7 +140,7 @@ class UserInfo(Info):
self.site = safe_str(c.site.name if c.site else '')
self.lang = safe_str(c.lang if c.lang else '')
self.cname = safe_str(c.cname)
class PromotedLinkInfo(Info):
_tracked = []
tracker_url = g.adtracker_url
@@ -174,7 +174,17 @@ class PromotedLinkClickInfo(PromotedLinkInfo):
def tracking_url(self):
s = (PromotedLinkInfo.tracking_url(self) + '&url=' + self.dest)
return s
class AdframeInfo(PromotedLinkInfo):
tracker_url = g.adframetracker_url
@classmethod
def make_hash(cls, ip, fullname):
return sha.new("%s%s" % (fullname,
g.tracking_secret)).hexdigest()
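`make_hash` above signs a fullname with the server-side `g.tracking_secret` so the ad-frame tracking pixel URL can't be forged. A standalone sketch, with `hashlib` standing in for the old Python 2 `sha` module and a placeholder secret (note the `ip` argument in the hunk is unused by the hash itself):

```python
import hashlib

TRACKING_SECRET = "not-the-real-secret"  # placeholder for g.tracking_secret

def make_tracking_hash(fullname, secret=TRACKING_SECRET):
    """SHA-1 of fullname + secret, hex-encoded, matching the scheme
    in AdframeInfo.make_hash."""
    data = ("%s%s" % (fullname, secret)).encode("utf-8")
    return hashlib.sha1(data).hexdigest()
```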
def benchmark(n = 10000):
"""on my humble desktop machine, this gives ~150 microseconds per gen_url"""
import time


@@ -61,9 +61,17 @@ def load_traffic_uncached(interval, what, iden,
return []
@memoize("cached_traffic", time = 60)
def load_traffic(interval, what, iden,
def load_traffic(interval, what, iden = '',
start_time = None, stop_time = None,
npoints = None):
"""
interval = (hour, day, month)
what = (reddit, lang, thing, promos)
iden is the specific thing (reddit name, language name, thing
fullname) that one is seeking traffic for.
"""
res = load_traffic_uncached(interval, what, iden,
start_time = start_time, stop_time = stop_time,
npoints = npoints)


@@ -387,11 +387,10 @@ class Translator(LoggedSlots):
message = msg,
locale = self.locale)
key = ts.md5
if self.enabled.has_key(key):
ts.enabled = self.enabled[key]
if ts.enabled:
indx += 1
ts.index = indx
if not self.enabled.has_key(key) or self.enabled[key]:
indx += 1
ts.index = indx
while msgstr.match(line):
r, translation, line = get_next_str_block(line, handle)
ts.add(translation)
@@ -455,6 +454,7 @@ class Translator(LoggedSlots):
out_file = file + ".mo"
cmd = 'msgfmt -o "%s" "%s"' % (out_file, file)
print cmd
with os.popen(cmd) as handle:
x = handle.read()
if include_index:


@@ -1,72 +0,0 @@
# The contents of this file are subject to the Common Public Attribution
# License Version 1.0. (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
# http://code.reddit.com/LICENSE. The License is based on the Mozilla Public
# License Version 1.1, but Sections 14 and 15 have been added to cover use of
# software over a computer network and provide for limited attribution for the
# Original Developer. In addition, Exhibit A has been modified to be consistent
# with Exhibit B.
#
# Software distributed under the License is distributed on an "AS IS" basis,
# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for
# the specific language governing rights and limitations under the License.
#
# The Original Code is Reddit.
#
# The Original Developer is the Initial Developer. The Initial Developer of the
# Original Code is CondeNet, Inc.
#
# All portions of the code written by CondeNet are Copyright (c) 2006-2009
# CondeNet, Inc. All Rights Reserved.
################################################################################
import sqlalchemy as sa
from r2.models import Account, Vote, Link
from r2.lib.db import tdb_sql as tdb
from r2.lib import utils
from pylons import g
cache = g.cache
def top_users():
tt, dt = tdb.get_thing_table(Account._type_id)
karma = dt.alias()
s = sa.select([tt.c.thing_id],
sa.and_(tt.c.spam == False,
tt.c.deleted == False,
karma.c.thing_id == tt.c.thing_id,
karma.c.key == 'link_karma'),
order_by = sa.desc(sa.cast(karma.c.value, sa.Integer)),
limit = 10)
rows = s.execute().fetchall()
return [r.thing_id for r in rows]
def top_user_change(period = '1 day'):
rel = Vote.rel(Account, Link)
rt, account, link, dt = tdb.get_rel_table(rel._type_id)
author = dt.alias()
date = utils.timeago(period)
s = sa.select([author.c.value, sa.func.sum(sa.cast(rt.c.name, sa.Integer))],
sa.and_(rt.c.date > date,
author.c.thing_id == rt.c.rel_id,
author.c.key == 'author_id'),
group_by = author.c.value,
order_by = sa.desc(sa.func.sum(sa.cast(rt.c.name, sa.Integer))),
limit = 10)
rows = s.execute().fetchall()
return [(int(r.value), r.sum) for r in rows]
def calc_stats():
top = top_users()
top_day = top_user_change('1 day')
top_week = top_user_change('1 week')
return (top, top_day, top_week)
def set_stats():
cache.set('stats', calc_stats())


@@ -23,6 +23,7 @@ from urllib import unquote_plus, quote_plus, urlopen, urlencode
from urlparse import urlparse, urlunparse
from threading import local, Thread
import Queue
import signal
from copy import deepcopy
import cPickle as pickle
import re, datetime, math, random, string, sha, os
@@ -199,6 +200,17 @@ def strips(text, remove):
"""
return rstrips(lstrips(text, remove), remove)
class Enum(Storage):
def __init__(self, *a):
self.name = tuple(a)
Storage.__init__(self, ((e, i) for i, e in enumerate(a)))
def __contains__(self, item):
if isinstance(item, int):
return item in self.values()
else:
return Storage.__contains__(self, item)
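The new `Enum(Storage)` maps each name to its index and answers membership for both names and integer values. A dict-backed sketch — the real class inherits a web.py-style `Storage` (a dict with attribute access), which this stand-in replaces:

```python
class Enum(dict):
    """Name -> index mapping that also supports `value in enum` for ints."""
    def __init__(self, *names):
        self.name = tuple(names)
        super().__init__((n, i) for i, n in enumerate(names))

    def __contains__(self, item):
        if isinstance(item, int):
            return item in self.values()
        return super().__contains__(item)

    def __getattr__(self, attr):
        # Storage-style attribute access: STATUS.CHARGE -> STATUS['CHARGE']
        try:
            return self[attr]
        except KeyError:
            raise AttributeError(attr)
```

This is how `Bid.STATUS = Enum("AUTH", "CHARGE", "REFUND", "VOID")` in the new bidding module gets used.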
class Results():
def __init__(self, sa_ResultProxy, build_fn, do_batch=False):
self.rp = sa_ResultProxy
@@ -887,7 +899,7 @@ def set_emptying_cache():
from r2.lib.cache import SelfEmptyingCache
g.cache.caches = [SelfEmptyingCache(),] + list(g.cache.caches[1:])
def find_recent_broken_things(from_time = None, delete = False):
def find_recent_broken_things(from_time = None, to_time = None, delete = False):
"""
Occasionally (usually during app-server crashes), Things will
be partially written out to the database. Things missing data
@@ -896,11 +908,10 @@ def find_recent_broken_things(from_time = None, delete = False):
them as appropriate.
"""
from r2.models import Link,Comment
from pylons import g
if not from_time:
from_time = timeago("1 hour")
to_time = timeago("60 seconds")
from_time = from_time or timeago('1 hour')
to_time = to_time or datetime.now(g.tz)
for (cls,attrs) in ((Link,('author_id','sr_id')),
(Comment,('author_id','sr_id','body','link_id'))):
@@ -921,7 +932,6 @@ def find_broken_things(cls,attrs,from_time,to_time,delete = False):
getattr(t,a)
except AttributeError:
# that failed; let's explicitly load it, and try again
print "Reloading %s" % t._fullname
t._load()
try:
getattr(t,a)
@@ -943,37 +953,23 @@ def lineno():
import inspect
print "%s\t%s" % (datetime.now(),inspect.currentframe().f_back.f_lineno)
class IteratorChunker(object):
def __init__(self,it):
self.it = it
self.done=False
def next_chunk(self,size):
chunk = []
if not self.done:
try:
for i in xrange(size):
chunk.append(self.it.next())
except StopIteration:
self.done=True
return chunk
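`IteratorChunker` pulls fixed-size chunks until the underlying iterator is exhausted. A standalone copy of the class, modernized to Python 3's `next()` for the sake of a runnable example:

```python
class IteratorChunker:
    """Wrap an iterator and hand back lists of at most `size` items."""
    def __init__(self, it):
        self.it = iter(it)
        self.done = False

    def next_chunk(self, size):
        chunk = []
        if not self.done:
            try:
                for _ in range(size):
                    chunk.append(next(self.it))
            except StopIteration:
                self.done = True
        return chunk
```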
def IteratorFilter(iterator, fn):
for x in iterator:
if fn(x):
yield x
def UniqueIterator(iterator):
def UniqueIterator(iterator, key = lambda x: x):
"""
Takes an iterator and returns an iterator that returns only the
first occurrence of each entry
"""
so_far = set()
def no_dups(x):
if x in so_far:
k = key(x)
if k in so_far:
return False
else:
so_far.add(x)
so_far.add(k)
return True
return IteratorFilter(iterator, no_dups)
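The hunk above adds a `key` argument so callers can dedupe on a derived value (e.g. a fullname) rather than on the item itself. A standalone sketch of both helpers, lowercased for the example:

```python
def iterator_filter(iterator, fn):
    for x in iterator:
        if fn(x):
            yield x

def unique_iterator(iterator, key=lambda x: x):
    """Yield only the first occurrence of each key(x)."""
    so_far = set()
    def no_dups(x):
        k = key(x)
        if k in so_far:
            return False
        so_far.add(k)
        return True
    return iterator_filter(iterator, no_dups)
```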
@@ -1087,7 +1083,7 @@ def link_from_url(path, filter_spam = False, multiple = True):
elif a.sr_id not in subs and b.sr_id in subs:
return 1
else:
return cmp(a._hot, b._hot)
return cmp(b._hot, a._hot)
links = sorted(links, cmp = cmp_links)
# among those, show them the hottest one
@@ -1106,3 +1102,62 @@ def link_duplicates(article):
return duplicates
class TimeoutFunctionException(Exception):
pass
class TimeoutFunction:
"""Force an operation to timeout after N seconds. Works with POSIX
signals, so it's not safe to use in a multi-threaded environment"""
def __init__(self, function, timeout):
self.timeout = timeout
self.function = function
def handle_timeout(self, signum, frame):
raise TimeoutFunctionException()
def __call__(self, *args):
# can only be called from the main thread
old = signal.signal(signal.SIGALRM, self.handle_timeout)
signal.alarm(self.timeout)
try:
result = self.function(*args)
finally:
signal.alarm(0)
signal.signal(signal.SIGALRM, old)
return result
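`TimeoutFunction` arms SIGALRM around the wrapped call, so it only works on POSIX and only from the main thread. Typical use looks like this (a standalone copy of the class from the hunk, for illustration):

```python
import signal

class TimeoutFunctionException(Exception):
    pass

class TimeoutFunction:
    """Abort function(*args) with an exception after `timeout` seconds.
    SIGALRM-based, so POSIX-only and main-thread-only."""
    def __init__(self, function, timeout):
        self.timeout = timeout
        self.function = function

    def handle_timeout(self, signum, frame):
        raise TimeoutFunctionException()

    def __call__(self, *args):
        old = signal.signal(signal.SIGALRM, self.handle_timeout)
        signal.alarm(self.timeout)
        try:
            result = self.function(*args)
        finally:
            # always disarm the alarm and restore the previous handler
            signal.alarm(0)
            signal.signal(signal.SIGALRM, old)
        return result
```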
def make_offset_date(start_date, interval, future = True,
business_days = False):
"""
Generates a date in the future or past "interval" days from start_date.
Can optionally give weekends no weight in the calculation if
"business_days" is set to true.
"""
if interval is not None:
interval = int(interval)
if business_days:
weeks = interval / 7
dow = start_date.weekday()
if future:
future_dow = (dow + interval) % 7
if dow > future_dow or future_dow > 4:
weeks += 1
else:
future_dow = (dow - interval) % 7
if dow < future_dow or future_dow > 4:
weeks += 1
interval += 2 * weeks
if future:
return start_date + timedelta(interval)
return start_date - timedelta(interval)
return start_date
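`make_offset_date` counts an interval in days and, with `business_days` set, pads two extra days for every weekend boundary the interval crosses. A self-contained copy (only the `timedelta` import and `//` integer division are added for the example):

```python
from datetime import timedelta

def make_offset_date(start_date, interval, future=True, business_days=False):
    """Date `interval` days from start_date; with business_days=True,
    weekends add two padding days per week boundary crossed."""
    if interval is not None:
        interval = int(interval)
        if business_days:
            weeks = interval // 7
            dow = start_date.weekday()
            if future:
                future_dow = (dow + interval) % 7
                if dow > future_dow or future_dow > 4:
                    weeks += 1
            else:
                future_dow = (dow - interval) % 7
                if dow < future_dow or future_dow > 4:
                    weeks += 1
            interval += 2 * weeks
        if future:
            return start_date + timedelta(interval)
        return start_date - timedelta(interval)
    return start_date
```

For example, five business days after a Monday lands on the following Monday.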
def to_csv(table):
# commas and linebreaks must result in a quoted string
def quote_commas(x):
if ',' in x or '\n' in x:
return u'"%s"' % x.replace('"', '""')
return x
return u"\n".join(u','.join(quote_commas(y) for y in x)
for x in table)
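`to_csv` quotes only cells containing commas or linebreaks, doubling any embedded double quotes inside a quoted cell. A runnable copy:

```python
def to_csv(table):
    """Serialize rows of strings to CSV; cells with commas or linebreaks
    are quoted, with embedded double quotes doubled."""
    def quote_commas(x):
        if ',' in x or '\n' in x:
            return u'"%s"' % x.replace('"', '""')
        return x
    return u"\n".join(u','.join(quote_commas(y) for y in x)
                      for x in table)
```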


@@ -28,9 +28,11 @@ import time
log = g.log
class WorkQueue(object):
"""A WorkQueue is a queue that takes a number of functions and runs
them in parallel"""
global_env = g._current_obj()
def __init__(self, jobs = [], num_workers = 5, timeout = None):
"""Creates a WorkQueue that will process jobs with num_workers
@@ -54,7 +56,7 @@ class WorkQueue(object):
for worker, start_time in self.workers.items():
if (not worker.isAlive() or
self.timeout
and datetime.now() - start_time > self.timeout):
self.work_count.get_nowait()
self.jobs.task_done()
@@ -62,13 +64,23 @@ class WorkQueue(object):
time.sleep(1)
def _init_thread(self, job, global_env):
# make sure that pylons.g is available for the worker thread
g._push_object(global_env)
try:
job()
finally:
# free it up
g._pop_object()
def run(self):
"""The main thread for the queue. Pull a job off the job queue and
create a thread for it."""
while True:
job = self.jobs.get()
work_thread = Thread(target = job)
work_thread = Thread(target = self._init_thread,
args=(job, self.global_env))
work_thread.setDaemon(True)
self.work_count.put(True)
self.workers[work_thread] = datetime.now()
@@ -117,4 +129,3 @@ def test():
#q.wait()
print 'DONE'


@@ -147,7 +147,7 @@ class Templated(object):
style, cache = not debug)
except AttributeError:
raise NoTemplateFound, (repr(self), style)
return template
def cache_key(self, *a):
@@ -209,7 +209,6 @@ class Templated(object):
"""
from pylons import c, g
style = style or c.render_style or 'html'
# prepare (and store) the list of cachable items.
primary = False
if not isinstance(c.render_tracker, dict):
@@ -385,7 +384,7 @@ class CachedTemplate(Templated):
# these values are needed to render any link on the site, and
# a menu is just a set of links, so we best cache against
# them.
keys = [c.user_is_loggedin, c.user_is_admin,
keys = [c.user_is_loggedin, c.user_is_admin, c.domain_prefix,
c.render_style, c.cname, c.lang, c.site.path,
template_hash]
keys = [make_cachable(x, *a) for x in keys]


@@ -26,6 +26,8 @@ from builder import *
from vote import *
from report import *
from subreddit import *
from award import *
from bidding import *
from mail_queue import Email, has_opted_out, opt_count
from admintools import *
import thing_changes


@@ -23,11 +23,12 @@ from r2.lib.db.thing import Thing, Relation, NotFound
from r2.lib.db.operators import lower
from r2.lib.db.userrel import UserRel
from r2.lib.memoize import memoize
from r2.lib.utils import modhash, valid_hash, randstr
from pylons import g
import time, sha
from copy import copy
from datetime import datetime, timedelta
class AccountExists(Exception): pass
@@ -58,12 +59,17 @@ class Account(Thing):
report_made = 0,
report_correct = 0,
report_ignored = 0,
cup_date = None,
spammer = 0,
sort_options = {},
has_subscribed = False,
pref_media = 'subreddit',
share = {},
wiki_override = None,
email = "",
email_verified = None,
ignorereports = False,
pref_show_promote = None,
)
def karma(self, kind, sr = None):
@@ -143,7 +149,22 @@ class Account(Thing):
self._t.get('comment_karma', 0)))
return karmas
def update_last_visit(self, current_time):
from admintools import apply_updates
apply_updates(self)
prev_visit = getattr(self, 'last_visit', None)
if prev_visit and current_time - prev_visit < timedelta(0, 3600):
return
g.log.debug ("Updating last visit for %s" % self.name)
self.last_visit = current_time
self._commit()
def make_cookie(self, timestr = None, admin = False):
if not self._loaded:
self._load()
@@ -186,6 +207,14 @@ class Account(Thing):
else:
raise NotFound, 'Account %s' % name
# Admins only, since it's not memoized
@classmethod
def _by_name_multiple(cls, name):
q = cls._query(lower(Account.c.name) == name.lower(),
Account.c._spam == (True, False),
Account.c._deleted == (True, False))
return list(q)
@property
def friends(self):
return self.friend_ids()
@@ -235,9 +264,30 @@ class Account(Thing):
share['recent'] = emails
self.share = share
def extend_cup(self, new_expiration):
if self.cup_date and self.cup_date > new_expiration:
return
self.cup_date = new_expiration
self._commit()
def remove_cup(self):
if not self.cup_date:
return
self.cup_date = None
self._commit()
def should_show_cup(self):
# FIX ME.
# this is being called inside builder (Bad #1) in the
# listing loop. On machines that are not allowed to write to
# the db, this generates an exception (Bad #2) on every
# listing page with users with cups.
return False # temporarily disable cups
if self.cup_date and self.cup_date < datetime.now(g.tz):
self.cup_date = None
self._commit()
return self.cup_date
class FakeAccount(Account):
_nodb = True
@@ -314,6 +364,7 @@ def register(name, password):
return a
class Friend(Relation(Account, Account)): pass
Account.__bases__ += (UserRel('friend', Friend, disable_reverse_ids_fn = True),)
class DeletedUser(FakeAccount):


@@ -20,9 +20,9 @@
# CondeNet, Inc. All Rights Reserved.
################################################################################
from r2.lib.utils import tup
from r2.lib.filters import websafe
from r2.models import Report, Account
from r2.models.thing_changes import changed
from r2.lib.db import queries
from pylons import g
@@ -30,26 +30,41 @@ from datetime import datetime
from copy import copy
class AdminTools(object):
def spam(self, things, auto, moderator_banned, banner, date = None, **kw):
from r2.lib.db import queries
things = [x for x in tup(things) if not x._spam]
Report.accept(things, True)
things = [ x for x in tup(things) if not x._spam ]
for t in things:
t._spam = True
ban_info = copy(getattr(t, 'ban_info', {}))
ban_info.update(auto = auto,
moderator_banned = moderator_banned,
banner = banner,
banned_at = date or datetime.now(g.tz),
**kw)
if isinstance(banner, dict):
ban_info['banner'] = banner[t._fullname]
else:
ban_info['banner'] = banner
t.ban_info = ban_info
t._commit()
changed(t)
self.author_spammer(things, True)
if not auto:
self.author_spammer(things, True)
self.set_last_sr_ban(things)
queries.ban(things)
def unspam(self, things, unbanner = None):
from r2.lib.db import queries
things = [x for x in tup(things) if x._spam]
Report.accept(things, False)
things = [ x for x in tup(things) if x._spam ]
for t in things:
ban_info = copy(getattr(t, 'ban_info', {}))
ban_info['unbanned_at'] = datetime.now(g.tz)
@@ -59,7 +74,11 @@ class AdminTools(object):
t._spam = False
t._commit()
changed(t)
# auto is always False for unbans
self.author_spammer(things, False)
self.set_last_sr_ban(things)
queries.unban(things)
def author_spammer(self, things, spam):
@@ -77,6 +96,24 @@ class AdminTools(object):
author = authors[aid]
author._incr('spammer', len(author_things) if spam else -len(author_things))
def set_last_sr_ban(self, things):
by_srid = {}
for thing in things:
if hasattr(thing, 'sr_id'):
by_srid.setdefault(thing.sr_id, []).append(thing)
if by_srid:
srs = Subreddit._byID(by_srid.keys(), data=True, return_dict=True)
for sr_id, sr_things in by_srid.iteritems():
sr = srs[sr_id]
sr.last_mod_action = datetime.now(g.tz)
sr._commit()
sr._incr('mod_actions', len(sr_things))
def admin_queues(self, chan, exchange):
pass
admintools = AdminTools()
def is_banned_IP(ip):
@@ -91,6 +128,9 @@ def valid_thing(v, karma):
def valid_user(v, sr, karma):
return True
def apply_updates(user):
pass
def update_score(obj, up_change, down_change, new_valid_thing, old_valid_thing):
obj._incr('_ups', up_change)
obj._incr('_downs', down_change)
@@ -99,6 +139,9 @@ def compute_votes(wrapper, item):
wrapper.upvotes = item._ups
wrapper.downvotes = item._downs
def ip_span(ip):
ip = websafe(ip)
return '<!-- %s -->' % ip
try:
from r2admin.models.admintools import *

r2/r2/models/award.py (new file, 106 lines)

@@ -0,0 +1,106 @@
# The contents of this file are subject to the Common Public Attribution
# License Version 1.0. (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
# http://code.reddit.com/LICENSE. The License is based on the Mozilla Public
# License Version 1.1, but Sections 14 and 15 have been added to cover use of
# software over a computer network and provide for limited attribution for the
# Original Developer. In addition, Exhibit A has been modified to be consistent
# with Exhibit B.
#
# Software distributed under the License is distributed on an "AS IS" basis,
# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for
# the specific language governing rights and limitations under the License.
#
# The Original Code is Reddit.
#
# The Original Developer is the Initial Developer. The Initial Developer of the
# Original Code is CondeNet, Inc.
#
# All portions of the code written by CondeNet are Copyright (c) 2006-2008
# CondeNet, Inc. All Rights Reserved.
################################################################################
from r2.lib.db.thing import Thing, Relation, NotFound
from r2.lib.db.userrel import UserRel
from r2.lib.db.operators import desc, lower
from r2.lib.db import queries
from r2.lib.memoize import memoize
from r2.models import Account
from pylons import c, g, request
class Award (Thing):
@classmethod
@memoize('award.all_awards')
def _all_awards_cache(cls):
return [ a._id for a in Award._query(limit=100) ]
@classmethod
def _all_awards(cls, _update=False):
all = Award._all_awards_cache(_update=_update)
return Award._byID(all, data=True).values()
@classmethod
def _new(cls, codename, title, imgurl):
# print "Creating new award codename=%s title=%s imgurl=%s" % (
# codename, title, imgurl)
a = Award(codename=codename, title=title, imgurl=imgurl)
a._commit()
Award._all_awards_cache(_update=True)
@classmethod
def _by_codename(cls, codename):
q = cls._query(lower(Award.c.codename) == codename.lower())
q._limit = 1
award = list(q)
if award:
return cls._byID(award[0]._id, True)
else:
raise NotFound, 'Award %s' % codename
class Trophy(Relation(Account, Award)):
@classmethod
def _new(cls, recipient, award, description = None,
url = None, cup_expiration = None):
# The "name" column of the relation can't be a constant or else a
# given account would not be allowed to win a given award more than
# once. So we're setting it to the string form of the timestamp.
# Still, we won't have that date just yet, so for a moment we're
# setting it to "trophy".
t = Trophy(recipient, award, "trophy")
t._name = str(t._date)
if description:
t.description = description
if url:
t.url = url
if cup_expiration:
recipient.extend_cup(cup_expiration)
t._commit()
Trophy.by_account(recipient, _update=True)
Trophy.by_award(award, _update=True)
@classmethod
@memoize('trophy.by_account')
def by_account(cls, account):
q = Trophy._query(Trophy.c._thing1_id == account._id,
eager_load = True, thing_data = True,
data = True,
sort = desc('_date'))
q._limit = 50
return list(q)
@classmethod
@memoize('trophy.by_award')
def by_award(cls, award):
q = Trophy._query(Trophy.c._thing2_id == award._id,
eager_load = True, thing_data = True,
data = True,
sort = desc('_date'))
q._limit = 500
return list(q)

r2/r2/models/bidding.py (new file, 442 lines)

@@ -0,0 +1,442 @@
# The contents of this file are subject to the Common Public Attribution
# License Version 1.0. (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
# http://code.reddit.com/LICENSE. The License is based on the Mozilla Public
# License Version 1.1, but Sections 14 and 15 have been added to cover use of
# software over a computer network and provide for limited attribution for the
# Original Developer. In addition, Exhibit A has been modified to be consistent
# with Exhibit B.
#
# Software distributed under the License is distributed on an "AS IS" basis,
# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for
# the specific language governing rights and limitations under the License.
#
# The Original Code is Reddit.
#
# The Original Developer is the Initial Developer. The Initial Developer of the
# Original Code is CondeNet, Inc.
#
# All portions of the code written by CondeNet are Copyright (c) 2006-2009
# CondeNet, Inc. All Rights Reserved.
################################################################################
from sqlalchemy import Column, String, DateTime, Date, Float, Integer, \
func as safunc, and_, or_
from sqlalchemy.exceptions import IntegrityError
from sqlalchemy.schema import PrimaryKeyConstraint
from sqlalchemy.orm import sessionmaker, scoped_session
from sqlalchemy.orm.exc import NoResultFound
from sqlalchemy.databases.postgres import PGBigInteger as BigInteger, \
PGInet as Inet
from sqlalchemy.ext.declarative import declarative_base
from pylons import g
from r2.lib.utils import Enum
from r2.models.account import Account
from r2.lib.db.thing import Thing, NotFound
from pylons import request
from r2.lib.memoize import memoize
import datetime
engine = g.dbm.engines['authorize']
# Allocate a session maker for communicating object changes with the back end
Session = sessionmaker(autocommit = True, autoflush = True, bind = engine)
# allocate a SQLalchemy base class for auto-creation of tables based
# on class fields.
# NB: any class that inherits from this class will result in a table
# being created, and subclassing doesn't work, hence the
# object-inheriting interface classes.
Base = declarative_base(bind = engine)
class Sessionized(object):
"""
Interface class for wrapping up the "session" in the 0.5 ORM
required for all database communication. This allows subclasses
to have a "query" and "commit" method that doesn't require
managing of the session.
"""
session = Session()
def __init__(self, *a, **kw):
"""
Common init used by all other classes in this file. Allows
for object-creation based on the __table__ field which is
created by Base (further explained in _disambiguate_args).
"""
for k, v in self._disambiguate_args(None, *a, **kw):
setattr(self, k.name, v)
@classmethod
def _new(cls, *a, **kw):
"""
Just like __init__, except the new object is committed to the
db before being returned.
"""
obj = cls(*a, **kw)
obj._commit()
return obj
def _commit(self):
"""
Commits current object to the db.
"""
with self.session.begin():
self.session.add(self)
def _delete(self):
"""
Deletes current object from the db.
"""
with self.session.begin():
self.session.delete(self)
@classmethod
def query(cls, **kw):
"""
Ubiquitous class-level query function.
"""
q = cls.session.query(cls)
if kw:
q = q.filter_by(**kw)
return q
@classmethod
def _disambiguate_args(cls, filter_fn, *a, **kw):
"""
Used in _lookup and __init__ to interpret *a as being a list
of args to match columns in the same order as __table__.c
For example, if a class Foo has fields a and b, this function
allows the two to work identically:
>>> foo = Foo(a = 'arg1', b = 'arg2')
>>> foo = Foo('arg1', 'arg2')
Additionally, this function invokes _make_storable on each of
the values in the arg list (including *a as well as
kw.values())
"""
args = []
cols = filter(filter_fn, cls.__table__.c)
for k, v in zip(cols, a):
if not kw.has_key(k.name):
args.append((k, cls._make_storable(v)))
else:
raise TypeError,\
"got multiple arguments for '%s'" % k.name
cols = dict((x.name, x) for x in cls.__table__.c)
for k, v in kw.iteritems():
if cols.has_key(k):
args.append((cols[k], cls._make_storable(v)))
return args
@classmethod
def _make_storable(self, val):
if isinstance(val, Account):
return val._id
elif isinstance(val, Thing):
return val._fullname
else:
return val
@classmethod
def _lookup(cls, multiple, *a, **kw):
"""
Generates and executes a query matching *a against the
primary keys of the current class's table.
The primary key nature can be overridden by providing an
explicit list of columns to search.
This function is only a convenience function, and is called
only by one() and lookup().
"""
args = cls._disambiguate_args(lambda x: x.primary_key, *a, **kw)
res = cls.query().filter(and_(*[k == v for k, v in args]))
try:
res = res.all() if multiple else res.one()
# res.one() will raise NoResultFound, while all() will
# return an empty list. This will make the response
# uniform
if not res:
raise NoResultFound
except NoResultFound:
raise NotFound, "%s with %s" % \
(cls.__name__,
",".join("%s=%s" % x for x in args))
return res
@classmethod
def lookup(cls, *a, **kw):
"""
Returns all objects which match the kw list, or primary keys
that match the *a.
"""
return cls._lookup(True, *a, **kw)
@classmethod
def one(cls, *a, **kw):
"""
Same as lookup, but returns only a single result.
"""
return cls._lookup(False, *a, **kw)
@classmethod
def add(cls, key, *a):
try:
cls.one(key, *a)
except NotFound:
cls(key, *a)._commit()
@classmethod
def delete(cls, key, *a):
try:
cls.one(key, *a)._delete()
except NotFound:
pass
@classmethod
def get(cls, key):
try:
return cls.lookup(key)
except NotFound:
return []
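The Sessionized helpers above give every model a small, uniform CRUD vocabulary: lookup() returns all matches or raises NotFound, one() returns exactly one, add() is create-if-missing, and get() swallows NotFound into an empty list. A self-contained sketch of that contract, using a plain dict in place of SQLAlchemy (the Pair class and its fields are hypothetical, not part of the codebase):

```python
class NotFound(Exception):
    pass

class Pair(object):
    """Toy stand-in for a Sessionized model keyed on (key, value)."""
    _store = {}

    def __init__(self, key, value):
        self.key, self.value = key, value

    def _commit(self):
        self._store[(self.key, self.value)] = self

    @classmethod
    def lookup(cls, key):
        # all matches, or NotFound -- like Sessionized.lookup
        res = [v for (k, _), v in cls._store.items() if k == key]
        if not res:
            raise NotFound("%s with key=%s" % (cls.__name__, key))
        return res

    @classmethod
    def one(cls, key, value):
        # exactly one match, or NotFound -- like Sessionized.one
        try:
            return cls._store[(key, value)]
        except KeyError:
            raise NotFound("%s with key=%s" % (cls.__name__, key))

    @classmethod
    def add(cls, key, value):
        # create only if missing -- like Sessionized.add
        try:
            cls.one(key, value)
        except NotFound:
            cls(key, value)._commit()

    @classmethod
    def get(cls, key):
        # empty list instead of an exception -- like Sessionized.get
        try:
            return cls.lookup(key)
        except NotFound:
            return []

Pair.add("user1", "card1")
Pair.add("user1", "card1")       # idempotent: no duplicate row
print(len(Pair.get("user1")))    # 1
print(Pair.get("user2"))         # []
```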
class CustomerID(Sessionized, Base):
__tablename__ = "authorize_account_id"
account_id = Column(BigInteger, primary_key = True,
autoincrement = False)
authorize_id = Column(BigInteger)
def __repr__(self):
return "<AuthNetID(%s)>" % self.authorize_id
@classmethod
def set(cls, user, _id):
try:
existing = cls.one(user)
existing.authorize_id = _id
existing._commit()
except NotFound:
cls(user, _id)._commit()
@classmethod
def get_id(cls, user):
try:
return cls.one(user).authorize_id
except NotFound:
return
class PayID(Sessionized, Base):
__tablename__ = "authorize_pay_id"
account_id = Column(BigInteger, primary_key = True,
autoincrement = False)
pay_id = Column(BigInteger, primary_key = True,
autoincrement = False)
def __repr__(self):
return "<%s(%d)>" % (self.__class__.__name__, self.pay_id)
@classmethod
def get_ids(cls, key):
return [int(x.pay_id) for x in cls.get(key)]
class ShippingAddress(Sessionized, Base):
__tablename__ = "authorize_ship_id"
account_id = Column(BigInteger, primary_key = True,
autoincrement = False)
ship_id = Column(BigInteger, primary_key = True,
autoincrement = False)
def __repr__(self):
return "<%s(%d)>" % (self.__class__.__name__, self.ship_id)
class Bid(Sessionized, Base):
__tablename__ = "bids"
STATUS = Enum("AUTH", "CHARGE", "REFUND", "VOID")
# will be unique from authorize
transaction = Column(BigInteger, primary_key = True,
autoincrement = False)
# identifying characteristics
account_id = Column(BigInteger, index = True, nullable = False)
pay_id = Column(BigInteger, index = True, nullable = False)
thing_id = Column(BigInteger, index = True, nullable = False)
# breadcrumbs
ip = Column(Inet)
date = Column(DateTime(timezone = True), default = safunc.now(),
nullable = False)
# bid information:
bid = Column(Float, nullable = False)
charge = Column(Float)
status = Column(Integer, nullable = False,
default = STATUS.AUTH)
@classmethod
def _new(cls, trans_id, user, pay_id, thing_id, bid):
bid = Bid(trans_id, user, pay_id,
thing_id, getattr(request, 'ip', '0.0.0.0'), bid = bid)
bid._commit()
return bid
def set_status(self, status):
if self.status != status:
self.status = status
self._commit()
def auth(self):
self.set_status(self.STATUS.AUTH)
def void(self):
self.set_status(self.STATUS.VOID)
def charged(self):
self.set_status(self.STATUS.CHARGE)
def refund(self):
self.set_status(self.STATUS.REFUND)
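STATUS above comes from the Enum helper added to r2.lib.utils in this same commit. A sketch of its assumed shape: each name maps to its integer index (so the value fits the integer status column), and a reverse .name map recovers the label, as PromotedLink.add_props later does with promote.STATUS.name[status]:

```python
def Enum(*names):
    # assumed shape of r2.lib.utils.Enum: attribute access yields the
    # index, and .name maps the index back to its label
    attrs = dict((n, i) for i, n in enumerate(names))
    attrs["name"] = dict((i, n) for i, n in enumerate(names))
    return type("Enum", (object,), attrs)()

STATUS = Enum("AUTH", "CHARGE", "REFUND", "VOID")
print(STATUS.AUTH)      # 0
print(STATUS.name[2])   # REFUND
```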
class PromoteDates(Sessionized, Base):
__tablename__ = "promote_date"
thing_name = Column(String, primary_key = True, autoincrement = False)
account_id = Column(BigInteger, index = True, autoincrement = False)
start_date = Column(Date(), nullable = False, index = True)
end_date = Column(Date(), nullable = False, index = True)
actual_start = Column(DateTime(timezone = True), index = True)
actual_end = Column(DateTime(timezone = True), index = True)
bid = Column(Float)
refund = Column(Float)
@classmethod
def update(cls, thing, start_date, end_date):
try:
promo = cls.one(thing)
promo.start_date = start_date.date()
promo.end_date = end_date.date()
promo._commit()
except NotFound:
promo = cls._new(thing, thing.author_id, start_date, end_date)
@classmethod
def update_bid(cls, thing):
bid = thing.promote_bid
refund = 0
if thing.promote_trans_id < 0:
refund = bid
elif hasattr(thing, "promo_refund"):
refund = thing.promo_refund
promo = cls.one(thing)
promo.bid = bid
promo.refund = refund
promo._commit()
@classmethod
def log_start(cls, thing):
promo = cls.one(thing)
promo.actual_start = datetime.datetime.now(g.tz)
promo._commit()
cls.update_bid(thing)
@classmethod
def log_end(cls, thing):
promo = cls.one(thing)
promo.actual_end = datetime.datetime.now(g.tz)
promo._commit()
cls.update_bid(thing)
@classmethod
def for_date(cls, date):
if isinstance(date, datetime.datetime):
date = date.date()
q = cls.query().filter(and_(cls.start_date <= date,
cls.end_date > date))
return q.all()
@classmethod
def for_date_range(cls, start_date, end_date, account_id = None):
if isinstance(start_date, datetime.datetime):
start_date = start_date.date()
if isinstance(end_date, datetime.datetime):
end_date = end_date.date()
# Three cases to be included:
# 1) start date is in the provided interval
start_inside = and_(cls.start_date >= start_date,
cls.start_date < end_date)
# 2) end date is in the provided interval
end_inside = and_(cls.end_date >= start_date,
cls.end_date < end_date)
# 3) interval is a subset of a promoted interval
surrounds = and_(cls.start_date <= start_date,
cls.end_date >= end_date)
q = cls.query().filter(or_(start_inside, end_inside, surrounds))
if account_id is not None:
q = q.filter(cls.account_id == account_id)
return q.all()
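The three ORed cases in for_date_range are together equivalent to the standard interval-overlap test; a small sketch verifying that equivalence over toy integer dates (function names are mine):

```python
def overlaps_three_case(s, e, qs, qe):
    # the three conditions built in for_date_range above
    start_inside = qs <= s < qe
    end_inside = qs <= e < qe
    surrounds = s <= qs and e >= qe
    return start_inside or end_inside or surrounds

def overlaps_direct(s, e, qs, qe):
    # a promotion [s, e] touches the query window [qs, qe) iff it starts
    # before the window ends and ends on or after the window start
    return s < qe and e >= qs

# exhaustively check the equivalence on small ranges (s <= e, qs < qe)
for s in range(5):
    for e in range(s, 5):
        for qs in range(5):
            for qe in range(qs + 1, 6):
                assert overlaps_three_case(s, e, qs, qe) == \
                       overlaps_direct(s, e, qs, qe)
print("equivalent")
```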
@classmethod
@memoize('promodates.bid_history', time = 10 * 60)
def bid_history(cls, start_date, end_date = None, account_id = None):
end_date = end_date or datetime.datetime.now(g.tz)
q = cls.for_date_range(start_date, end_date, account_id = account_id)
d = start_date.date()
end_date = end_date.date()
res = []
while d < end_date:
bid = 0
refund = 0
for i in q:
end = i.actual_end.date() if i.actual_end else i.end_date
start = i.actual_start.date() if i.actual_start else None
if start and start <= d and end > d:
duration = float((end - start).days)
bid += i.bid / duration
refund += i.refund / duration
res.append([d, bid, refund])
d += datetime.timedelta(1)
return res
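bid_history above spreads each promotion's bid evenly over the days it actually ran. The arithmetic in isolation (toy dates; the helper name is mine): a 70.0 bid that ran for 7 days contributes 10.0 to each day's total.

```python
import datetime

def daily_contribution(bid, start, end):
    # per-day share of a bid over [start, end), as in bid_history above
    duration = float((end - start).days)
    return bid / duration

start = datetime.date(2009, 2, 1)
end = datetime.date(2009, 2, 8)
print(daily_contribution(70.0, start, end))  # 10.0
```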
@classmethod
@memoize('promodates.top_promoters', time = 10 * 60)
def top_promoters(cls, start_date, end_date = None):
end_date = end_date or datetime.datetime.now(g.tz)
q = cls.for_date_range(start_date, end_date)
d = start_date
res = []
accounts = Account._byID([i.account_id for i in q],
return_dict = True, data = True)
res = {}
for i in q:
if i.bid is not None and i.actual_start is not None:
r = res.setdefault(i.account_id, [0, 0, set()])
r[0] += i.bid
r[1] += i.refund
r[2].add(i.thing_name)
res = [ ([accounts[k]] + v) for (k, v) in res.iteritems() ]
res.sort(key = lambda x: x[1] - x[2], reverse = True)
return res
# do all the leg work of creating/connecting to tables
Base.metadata.create_all()


@@ -35,53 +35,15 @@ from r2.lib import utils
from r2.lib.db import operators
from r2.lib.cache import sgm
from r2.lib.comment_tree import link_comments
from copy import deepcopy, copy
import time
from datetime import datetime,timedelta
from admintools import compute_votes, admintools
from admintools import compute_votes, admintools, ip_span
EXTRA_FACTOR = 1.5
MAX_RECURSION = 10
# Appends to the list "attrs" a tuple of:
# <priority (higher trumps lower), letter,
# css class, i18n'ed mouseover label, hyperlink (or None)>
def add_attr(attrs, code, label=None, link=None):
if code == 'F':
priority = 1
cssclass = 'friend'
if not label:
label = _('friend')
if not link:
link = '/prefs/friends'
elif code == 'S':
priority = 2
cssclass = 'submitter'
if not label:
label = _('submitter')
if not link:
raise ValueError ("Need a link")
elif code == 'M':
priority = 3
cssclass = 'moderator'
if not label:
raise ValueError ("Need a label")
if not link:
raise ValueError ("Need a link")
elif code == 'A':
priority = 4
cssclass = 'admin'
if not label:
label = _('reddit admin, speaking officially')
if not link:
link = '/help/faq#Whomadereddit'
else:
raise ValueError ("Got weird code [%s]" % code)
attrs.append( (priority, code, cssclass, label, link) )
class Builder(object):
def __init__(self, wrap = Wrapped, keep_fn = None):
self.wrap = wrap
@@ -94,11 +56,13 @@ class Builder(object):
return item.keep_item(item)
def wrap_items(self, items):
from r2.lib.template_helpers import add_attr
user = c.user if c.user_is_loggedin else None
#get authors
#TODO pull the author stuff into add_props for links and
#comments and messages?
try:
aids = set(l.author_id for l in items)
except AttributeError:
@@ -151,8 +115,7 @@ class Builder(object):
w.author = None
w.friend = False
# List of tuples <priority (higher trumps lower), letter,
# css class, i18n'ed mouseover label, hyperlink (or None)>
# List of tuples (see add_attr() for details)
w.attribs = []
w.distinguished = None
@@ -181,6 +144,13 @@ class Builder(object):
getattr(item, "author_id", None) in mods):
add_attr(w.attribs, 'M', label=modlabel, link=modlink)
if (g.show_awards and w.author
and w.author.should_show_cup()):
add_attr(w.attribs, 'trophy', label=
_("%(user)s recently won a trophy! click here to see it.")
% {'user':w.author.name},
link = "/user/%s" % w.author.name)
if hasattr(item, "sr_id"):
w.subreddit = subreddits[item.sr_id]
@@ -214,6 +184,11 @@ class Builder(object):
count += 1
if c.user_is_admin and getattr(item, 'ip', None):
w.ip_span = ip_span(item.ip)
else:
w.ip_span = ""
# if the user can ban things on a given subreddit, or an
# admin, then allow them to see that the item is spam, and
# add the other spam-related display attributes
@@ -365,12 +340,12 @@ class QueryBuilder(Builder):
#skip and count
while new_items and (not self.num or num_have < self.num):
i = new_items.pop(0)
count = count - 1 if self.reverse else count + 1
if not (self.must_skip(i) or self.skip and not self.keep_item(i)):
items.append(i)
num_have += 1
if self.wrap:
i.num = count
if self.wrap:
count = count - 1 if self.reverse else count + 1
i.num = count
last_item = i
#unprewrap the last item


@@ -48,9 +48,7 @@ class Link(Thing, Printable):
media_object = None,
has_thumbnail = False,
promoted = None,
promoted_subscribersonly = False,
promote_until = None,
promoted_by = None,
pending = False,
disable_comments = False,
selftext = '',
ip = '0.0.0.0')
@@ -211,8 +209,16 @@ class Link(Thing, Printable):
@staticmethod
def wrapped_cache_key(wrapped, style):
s = Printable.wrapped_cache_key(wrapped, style)
if wrapped.promoted is not None:
s.extend([getattr(wrapped, "promote_status", -1),
wrapped.disable_comments,
wrapped._date,
wrapped.promote_until,
c.user_is_sponsor,
wrapped.url, repr(wrapped.title)])
if style == "htmllite":
s.append(request.get.has_key('twocolumn'))
s.extend([request.get.has_key('twocolumn'),
c.link_target])
elif style == "xml":
s.append(request.GET.has_key("nothumbs"))
s.append(getattr(wrapped, 'media_object', {}))
@@ -221,7 +227,15 @@ class Link(Thing, Printable):
def make_permalink(self, sr, force_domain = False):
from r2.lib.template_helpers import get_domain
p = "comments/%s/%s/" % (self._id36, title_to_url(self.title))
if not c.cname and not force_domain:
# promoted links belong to a separate subreddit and shouldn't
# include that in the path
if self.promoted is not None:
if force_domain:
res = "http://%s/%s" % (get_domain(cname = False,
subreddit = False), p)
else:
res = "/%s" % p
elif not c.cname and not force_domain:
res = "/r/%s/%s" % (sr.name, p)
elif sr != c.site or force_domain:
res = "http://%s/%s" % (get_domain(cname = (c.cname and
@@ -229,6 +243,11 @@ class Link(Thing, Printable):
subreddit = not c.cname), p)
else:
res = "/%s" % p
# WARNING: If we ever decide to add any ?foo=bar&blah parameters
# here, Comment.make_permalink will need to be updated or else
# it will fail.
return res
def make_permalink_slow(self, force_domain = False):
@@ -256,6 +275,7 @@ class Link(Thing, Printable):
saved = Link._saved(user, wrapped) if user_is_loggedin else {}
hidden = Link._hidden(user, wrapped) if user_is_loggedin else {}
#clicked = Link._clicked(user, wrapped) if user else {}
clicked = {}
@@ -264,17 +284,18 @@ class Link(Thing, Printable):
if not hasattr(item, "score_fmt"):
item.score_fmt = Score.number_only
item.pref_compress = user.pref_compress
if user.pref_compress:
if user.pref_compress and item.promoted is None:
item.render_css_class = "compressed link"
item.score_fmt = Score.points
elif pref_media == 'on':
elif pref_media == 'on' and not user.pref_compress:
show_media = True
elif pref_media == 'subreddit' and item.subreddit.show_media:
show_media = True
elif (item.promoted
and item.has_thumbnail
and pref_media != 'off'):
show_media = True
elif item.promoted and item.has_thumbnail:
if user_is_loggedin and item.author_id == user._id:
show_media = True
elif pref_media != 'off' and not user.pref_compress:
show_media = True
if not show_media:
item.thumbnail = ""
@@ -358,7 +379,9 @@ class Link(Thing, Printable):
else:
item.href_url = item.url
if pref_frame and not item.is_self:
# show the toolbar if the preference is set and the link
# is neither a promoted link nor a self post
if pref_frame and not item.is_self and not item.promoted:
item.mousedown_url = item.tblink
else:
item.mousedown_url = None
@@ -370,6 +393,8 @@ class Link(Thing, Printable):
item._deleted,
item._spam))
item.is_author = (user == item.author)
# bits that we will render stubs (to make the cached
# version more flexible)
item.num = CachedVariable("num")
@@ -377,7 +402,7 @@ class Link(Thing, Printable):
item.commentcls = CachedVariable("commentcls")
item.midcolmargin = CachedVariable("midcolmargin")
item.comment_label = CachedVariable("numcomments")
if user_is_loggedin:
incr_counts(wrapped)
@@ -402,32 +427,21 @@ class PromotedLink(Link):
@classmethod
def add_props(cls, user, wrapped):
# prevents cyclic dependencies
from r2.lib import promote
Link.add_props(user, wrapped)
user_is_sponsor = c.user_is_sponsor
try:
if user_is_sponsor:
promoted_by_ids = set(x.promoted_by
for x in wrapped
if hasattr(x,'promoted_by'))
promoted_by_accounts = Account._byID(promoted_by_ids,
data=True)
else:
promoted_by_accounts = {}
except NotFound:
# since this is just cosmetic, we can skip it altogether
# if one isn't found or is broken
promoted_by_accounts = {}
status_dict = dict((v, k) for k, v in promote.STATUS.iteritems())
for item in wrapped:
# these are potentially paid for placement
item.nofollow = True
item.user_is_sponsor = user_is_sponsor
if item.promoted_by in promoted_by_accounts:
item.promoted_by_name = promoted_by_accounts[item.promoted_by].name
status = getattr(item, "promote_status", -1)
if item.is_author or c.user_is_sponsor:
item.rowstyle = "link " + promote.STATUS.name[status].lower()
else:
# keep the template from trying to read it
item.promoted_by = None
item.rowstyle = "link promoted"
# Run this last
Printable.add_props(user, wrapped)
@@ -443,7 +457,7 @@ class Comment(Thing, Printable):
def _delete(self):
link = Link._byID(self.link_id, data = True)
link._incr('num_comments', -1)
@classmethod
def _new(cls, author, link, parent, body, ip):
c = Comment(body = body,
@@ -462,12 +476,16 @@ class Comment(Thing, Printable):
link._incr('num_comments', 1)
inbox_rel = None
to = None
if parent:
to = Account._byID(parent.author_id)
# only global admins can be message spammed.
if not c._spam or to.name in g.admins:
inbox_rel = Inbox._add(to, c, 'inbox')
elif link.is_self:
to = Account._byID(link.author_id)
inbox_rel = None
# only global admins can be message spammed.
if to and (not c._spam or to.name in g.admins):
inbox_rel = Inbox._add(to, c, 'inbox')
return (c, inbox_rel)
@@ -497,16 +515,23 @@ class Comment(Thing, Printable):
s.extend([wrapped.body])
return s
def make_permalink(self, link, sr=None):
return link.make_permalink(sr) + self._id36
def make_permalink(self, link, sr=None, context=None, anchor=False):
url = link.make_permalink(sr) + self._id36
if context:
url += "?context=%d" % context
if anchor:
url += "#%s" % self._id36
return url
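The new context/anchor handling composes a comment permalink in three parts: the link's permalink plus the comment's id36, an optional ?context= query, and an optional #fragment anchor. A self-contained sketch (the path and id36 below are hypothetical values):

```python
def comment_permalink(link_permalink, id36, context=None, anchor=False):
    # mirrors Comment.make_permalink above
    url = link_permalink + id36
    if context:
        url += "?context=%d" % context
    if anchor:
        url += "#%s" % id36
    return url

print(comment_permalink("/comments/6nw57/example/", "c0b6xx",
                        context=3, anchor=True))
# /comments/6nw57/example/c0b6xx?context=3#c0b6xx
```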
def make_permalink_slow(self):
def make_permalink_slow(self, context=None, anchor=False):
l = Link._byID(self.link_id, data=True)
return self.make_permalink(l, l.subreddit_slow)
return self.make_permalink(l, l.subreddit_slow,
context=context, anchor=anchor)
@classmethod
def add_props(cls, user, wrapped):
from r2.models.builder import add_attr
from r2.lib.template_helpers import add_attr
from r2.lib import promote
#fetch parent links
@@ -517,11 +542,13 @@ class Comment(Thing, Printable):
for cm in wrapped:
if not hasattr(cm, 'sr_id'):
cm.sr_id = links[cm.link_id].sr_id
subreddits = Subreddit._byID(set(cm.sr_id for cm in wrapped),
data=True,return_dict=False)
can_reply_srs = set(s._id for s in subreddits if s.can_comment(user)) \
if c.user_is_loggedin else set()
can_reply_srs.add(promote.PromoteSR._id)
min_score = user.pref_min_comment_score
@@ -539,7 +566,6 @@ class Comment(Thing, Printable):
if (item.link._score <= 1 or item.score < 3 or
item.link._spam or item._spam or item.author._spam):
item.nofollow = True
else:
item.nofollow = False
@@ -651,7 +677,7 @@ class MoreChildren(MoreComments):
pass
class Message(Thing, Printable):
_defaults = dict(reported = 0,)
_defaults = dict(reported = 0, was_comment = False)
_data_int_props = Thing._data_int_props + ('reported', )
cache_ignore = set(["to"]).union(Printable.cache_ignore)
@@ -684,6 +710,14 @@ class Message(Thing, Printable):
#load the "to" field if required
to_ids = set(w.to_id for w in wrapped)
tos = Account._byID(to_ids, True) if to_ids else {}
links = Link._byID(set(l.link_id for l in wrapped if l.was_comment),
data = True,
return_dict = True)
subreddits = Subreddit._byID(set(l.sr_id for l in links.values()),
data = True, return_dict = True)
parents = Comment._byID(set(l.parent_id for l in wrapped
if hasattr(l, "parent_id") and l.was_comment),
data = True, return_dict = True)
for item in wrapped:
item.to = tos[item.to_id]
@@ -692,6 +726,23 @@ class Message(Thing, Printable):
else:
item.new = False
item.score_fmt = Score.none
item.message_style = ""
if item.was_comment:
link = links[item.link_id]
sr = subreddits[link.sr_id]
item.link_title = link.title
item.link_permalink = link.make_permalink(sr)
if hasattr(item, "parent_id"):
item.subject = _('comment reply')
item.message_style = "comment-reply"
parent = parents[item.parent_id]
item.parent = parent._fullname
item.parent_permalink = parent.make_permalink(link, sr)
else:
item.subject = _('post reply')
item.message_style = "post-reply"
# Run this last
Printable.add_props(user, wrapped)


@@ -25,13 +25,13 @@ from email.MIMEText import MIMEText
import sqlalchemy as sa
from sqlalchemy.databases.postgres import PGInet, PGBigInteger
from r2.lib.db.tdb_sql import make_metadata
from r2.models.thing_changes import changed, index_str, create_table
from r2.lib.utils import Storage, timeago
from r2.lib.db.tdb_sql import make_metadata, index_str, create_table
from r2.lib.utils import Storage, timeago, Enum, tup
from account import Account
from r2.lib.db.thing import Thing
from r2.lib.memoize import memoize
from pylons import g
from pylons import g, request
from pylons.i18n import _
def mail_queue(metadata):
return sa.Table(g.db_app_name + '_mail_queue', metadata,
@@ -129,11 +129,11 @@ class EmailHandler(object):
self.queue_table = mail_queue(self.metadata)
indices = [index_str(self.queue_table, "date", "date"),
index_str(self.queue_table, 'kind', 'kind')]
create_table(self.queue_table, indices, force = force)
create_table(self.queue_table, indices)
self.opt_table = opt_out(self.metadata)
indices = [index_str(self.opt_table, 'email', 'email')]
create_table(self.opt_table, indices, force = force)
create_table(self.opt_table, indices)
self.track_table = sent_mail_table(self.metadata)
self.reject_table = sent_mail_table(self.metadata, name = "reject_mail")
@@ -148,8 +148,8 @@ class EmailHandler(object):
index_str(tab, 'msg_hash', 'msg_hash'),
]
create_table(self.track_table, sent_indices(self.track_table), force = force)
create_table(self.reject_table, sent_indices(self.reject_table), force = force)
create_table(self.track_table, sent_indices(self.track_table))
create_table(self.reject_table, sent_indices(self.reject_table))
def __repr__(self):
return "<email-handler>"
@@ -202,15 +202,20 @@ class EmailHandler(object):
return res[0][0] if res and res[:1] else None
def add_to_queue(self, user, thing, emails, from_name, fr_addr, date, ip,
kind, body = "", reply_to = ""):
def add_to_queue(self, user, emails, from_name, fr_addr, kind,
date = None, ip = None,
body = "", reply_to = "", thing = None):
s = self.queue_table
hashes = []
for email in emails:
if not date:
date = datetime.datetime.now(g.tz)
if not ip:
ip = getattr(request, "ip", "127.0.0.1")
for email in tup(emails):
uid = user._id if user else 0
tid = thing._fullname if thing else ""
key = sha.new(str((email, from_name, uid, tid, ip, kind, body,
datetime.datetime.now()))).hexdigest()
datetime.datetime.now(g.tz)))).hexdigest()
s.insert().values({s.c.to_addr : email,
s.c.account_id : uid,
s.c.from_name : from_name,
@@ -285,8 +290,35 @@ class EmailHandler(object):
class Email(object):
handler = EmailHandler()
Kind = ["SHARE", "FEEDBACK", "ADVERTISE", "OPTOUT", "OPTIN"]
Kind = Storage((e, i) for i, e in enumerate(Kind))
Kind = Enum("SHARE", "FEEDBACK", "ADVERTISE", "OPTOUT", "OPTIN",
"VERIFY_EMAIL", "RESET_PASSWORD",
"BID_PROMO",
"ACCEPT_PROMO",
"REJECT_PROMO",
"QUEUED_PROMO",
"LIVE_PROMO",
"FINISHED_PROMO",
"NEW_PROMO",
"HELP_TRANSLATE",
)
subjects = {
Kind.SHARE : _("[reddit] %(user)s has shared a link with you"),
Kind.FEEDBACK : _("[feedback] feedback from '%(user)s'"),
Kind.ADVERTISE : _("[ad_inq] feedback from '%(user)s'"),
Kind.OPTOUT : _("[reddit] email removal notice"),
Kind.OPTIN : _("[reddit] email addition notice"),
Kind.RESET_PASSWORD : _("[reddit] reset your password"),
Kind.VERIFY_EMAIL : _("[reddit] verify your email address"),
Kind.BID_PROMO : _("[reddit] your bid has been accepted"),
Kind.ACCEPT_PROMO : _("[reddit] your promotion has been accepted"),
Kind.REJECT_PROMO : _("[reddit] your promotion has been rejected"),
Kind.QUEUED_PROMO : _("[reddit] your promotion has been queued"),
Kind.LIVE_PROMO : _("[reddit] your promotion is now live"),
Kind.FINISHED_PROMO : _("[reddit] your promotion has finished"),
Kind.NEW_PROMO : _("[reddit] your promotion has been created"),
Kind.HELP_TRANSLATE : _("[i18n] translation offer from '%(user)s'"),
}
def __init__(self, user, thing, email, from_name, date, ip, banned_ip,
kind, msg_hash, body = '', from_addr = '',
@@ -302,9 +334,14 @@ class Email(object):
self.kind = kind
self.sent = False
self.body = body
self.subject = ''
self.msg_hash = msg_hash
self.reply_to = reply_to
self.subject = self.subjects.get(kind, "")
try:
self.subject = self.subject % dict(user = self.from_name())
except UnicodeDecodeError:
self.subject = self.subject % dict(user = "a user")
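The subject lookup above %-interpolates the sender's name into a per-Kind template, falling back to "a user" when the name can't be decoded (the Python 2 UnicodeDecodeError branch). A sketch using the SHARE template from the table above (the helper name is mine):

```python
subjects = {"SHARE": "[reddit] %(user)s has shared a link with you"}

def subject_for(kind, user_name):
    tmpl = subjects.get(kind, "")
    try:
        return tmpl % dict(user=user_name)
    except UnicodeDecodeError:
        # py2 fallback: undecodable byte-string names
        return tmpl % dict(user="a user")

print(subject_for("SHARE", "alice"))
# [reddit] alice has shared a link with you
```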
def from_name(self):
if not self.user:


@@ -20,7 +20,6 @@
# CondeNet, Inc. All Rights Reserved.
################################################################################
from r2.models import *
from r2.lib import promote
from r2.lib.utils import fetch_things2
import string
@@ -62,9 +61,6 @@ def create_links(num):
sr = random.choice(subreddits)
l = Link._submit(title, url, user, sr, '127.0.0.1')
if random.choice(([False] * 50) + [True]):
promote.promote(l)
def by_url_cache():
q = Link._query(Link.c._spam == (True,False),


@@ -40,7 +40,7 @@ class Printable(object):
'render_score', 'score', '_score',
'upvotes', '_ups',
'downvotes', '_downs',
'subreddit_slow',
'subreddit_slow', '_deleted', '_spam',
'cachable', 'make_permalink', 'permalink',
'timesince', 'votehash'
])


@@ -50,16 +50,24 @@ class Subreddit(Thing, Printable):
description = '',
allow_top = True,
images = {},
ad_type = None,
ad_file = os.path.join(g.static_path, 'ad_default.html'),
reported = 0,
valid_votes = 0,
show_media = False,
css_on_cname = True,
domain = None,
mod_actions = 0,
sponsorship_url = None,
sponsorship_img = None,
sponsorship_name = None,
)
_data_int_props = ('mod_actions',)
sr_limit = 50
@classmethod
def _new(self, name, title, author_id, ip, lang = g.lang, type = 'public',
def _new(cls, name, title, author_id, ip, lang = g.lang, type = 'public',
over_18 = False, **kw):
with g.make_lock('create_sr_' + name.lower()):
try:
@@ -265,8 +273,9 @@ class Subreddit(Thing, Printable):
else:
item.subscriber = bool(rels.get((item, user, 'subscriber')))
item.moderator = bool(rels.get((item, user, 'moderator')))
item.contributor = bool(item.moderator or \
rels.get((item, user, 'contributor')))
item.contributor = bool(item.type != 'public' and
(item.moderator or
rels.get((item, user, 'contributor'))))
item.score = item._ups
# override "voting" score behavior (it will override the use of
# item.score in builder.py to be ups-downs)
@@ -274,6 +283,12 @@ class Subreddit(Thing, Printable):
base_score = item.score - (1 if item.likes else 0)
item.voting_score = [(base_score + x - 1) for x in range(3)]
item.score_fmt = Score.subscribers
#will seem less horrible when add_props is in pages.py
from r2.lib.pages import UserText
item.usertext = UserText(item, item.description)
Printable.add_props(user, wrapped)
#TODO: make this work
cache_ignore = set(["subscribers"]).union(Printable.cache_ignore)
@@ -290,7 +305,7 @@ class Subreddit(Thing, Printable):
pop_reddits = Subreddit._query(Subreddit.c.type == ('public',
'restricted'),
sort=desc('_downs'),
limit = limit * 1.5 if limit else None,
limit = limit,
data = True,
read_cache = True,
write_cache = True,
@@ -301,14 +316,7 @@ class Subreddit(Thing, Printable):
if not c.over18:
pop_reddits._filter(Subreddit.c.over_18 == False)
# evaluate the query and remove the ones with
# allow_top==False. Note that because this filtering is done
# after the query is run, if there are a lot of top reddits
# with allow_top==False, we may return fewer than `limit`
# results.
srs = filter(lambda sr: sr.allow_top, pop_reddits)
return srs[:limit] if limit else srs
return list(pop_reddits)
@classmethod
def default_subreddits(cls, ids = True, limit = g.num_default_reddits):
@@ -318,8 +326,26 @@ class Subreddit(Thing, Printable):
An optional kw argument 'limit' is defaulted to g.num_default_reddits
"""
srs = cls.top_lang_srs(c.content_langs, limit)
return [s._id for s in srs] if ids else srs
# If we ever have much more than two of these, we should update
# _by_name to support lists of them
auto_srs = [ Subreddit._by_name(n) for n in g.automatic_reddits ]
srs = cls.top_lang_srs(c.content_langs, limit + len(auto_srs))
rv = []
for i, s in enumerate(srs):
if len(rv) >= limit:
break
if s in auto_srs:
continue
rv.append(s)
rv = auto_srs + rv
if ids:
return [ sr._id for sr in rv ]
else:
return rv
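default_subreddits above pins the automatic reddits to the front of the top-by-language list, skips the duplicates, and caps the non-automatic portion at the original limit. Extracted as a self-contained sketch (toy subreddit names; the function name is mine):

```python
def merge_defaults(auto, top, limit):
    # same merge as the loop above: up to `limit` non-automatic reddits,
    # with the automatic ones prepended
    rv = []
    for s in top:
        if len(rv) >= limit:
            break
        if s in auto:
            continue
        rv.append(s)
    return auto + rv

print(merge_defaults(["announcements"],
                     ["pics", "announcements", "funny", "aww"], 2))
# ['announcements', 'pics', 'funny']
```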
@classmethod
@memoize('random_reddits', time = 1800)
@@ -516,6 +542,7 @@ class AllSR(FakeSubreddit):
title = 'all'
def get_links(self, sort, time):
from r2.lib import promote
from r2.models import Link
from r2.lib.db import queries
q = Link._query(sort = queries.db_sort(sort))


@@ -21,89 +21,14 @@
################################################################################
import sqlalchemy as sa
from r2.lib.db.tdb_sql import make_metadata
from r2.lib.utils import worker
from pylons import g
def index_str(table, name, on, where = None):
index_str = 'create index idx_%s_' % name
index_str += table.name
index_str += ' on '+ table.name + ' (%s)' % on
if where:
index_str += ' where %s' % where
return index_str
def create_table(table, index_commands=None, force = False):
t = table
if g.db_create_tables:
if not t.bind.has_table(t.name) or force:
try:
t.create(checkfirst = False)
except: pass
if index_commands:
for i in index_commands:
try:
t.bind.execute(i)
except: pass
from r2.lib import amqp
from r2.lib.utils import worker
def change_table(metadata):
return sa.Table(g.db_app_name + '_changes', metadata,
sa.Column('fullname', sa.String, nullable=False,
primary_key = True),
sa.Column('thing_type', sa.Integer, nullable=False),
sa.Column('date',
sa.DateTime(timezone = True),
default = sa.func.now(),
nullable = False)
)
def make_change_tables(force = False):
engine = g.dbm.engines['change']
metadata = make_metadata(engine)
table = change_table(metadata)
indices = [
index_str(table, 'fullname', 'fullname'),
index_str(table, 'date', 'date')
]
create_table(table, indices, force = force)
return table
_change_table = make_change_tables()
def changed(thing):
def _changed():
d = dict(fullname = thing._fullname,
thing_type = thing._type_id)
try:
_change_table.insert().execute(d)
except sa.exceptions.SQLError:
t = _change_table
t.update(t.c.fullname == thing._fullname,
values = {t.c.date: sa.func.now()}).execute()
from r2.lib.solrsearch import indexed_types
if isinstance(thing, indexed_types):
worker.do(_changed)
def _where(cls = None, min_date = None, max_date = None):
t = _change_table
where = []
if cls:
where.append(t.c.thing_type == cls._type_id)
if min_date:
where.append(t.c.date > min_date)
if max_date:
where.append(t.c.date <= max_date)
if where:
return sa.and_(*where)
def get_changed(cls = None, min_date = None, limit = None):
t = _change_table
res = sa.select([t.c.fullname, t.c.date], _where(cls, min_date = min_date),
order_by = t.c.date, limit = limit).execute()
return res.fetchall()
def clear_changes(cls = None, min_date=None, max_date=None):
t = _change_table
t.delete(_where(cls, min_date = min_date, max_date = max_date)).execute()
amqp.add_item('searchchanges_q', thing._fullname,
message_id = thing._fullname)
worker.do(_changed)


@@ -64,10 +64,9 @@ class Vote(MultiRelation('vote',
#check for old vote
rel = cls.rel(sub, obj)
oldvote = list(rel._query(rel.c._thing1_id == sub._id,
rel.c._thing2_id == obj._id,
data = True))
oldvote = rel._fast_query(sub, obj, ['-1', '0', '1']).values()
oldvote = filter(None, oldvote)
amount = 1 if dir is True else 0 if dir is None else -1
is_new = False
@@ -90,6 +89,7 @@ class Vote(MultiRelation('vote',
oldamount = 0
v = rel(sub, obj, str(amount))
v.author_id = obj.author_id
v.sr_id = sr._id
v.ip = ip
old_valid_thing = v.valid_thing = (valid_thing(v, karma))
v.valid_user = (v.valid_thing and valid_user(v, sr, karma)

(6 binary image files added)


@@ -144,3 +144,23 @@ div.popup {
.shirt .shirt-container .main.red .caption.big .byline {
background-image: none;
}
.usertext .bottom-area .usertext-buttons {display: inline; }
.arrow.upmod {
background-image: url(/static/aupmod.gif);
background-position: 0 0;
}
.arrow.downmod {
background-image: url(/static/adownmod.gif);
background-position: 0 0;
}
.arrow.up {
background-image: url(/static/aupgray.gif);
background-position: 0 0;
}
.arrow.down {
background-image: url(/static/adowngray.gif);
background-position: 0 0;
}


@@ -2,3 +2,11 @@
border: none;
height: 17px;
}
.entry.unvoted .score.unvoted { display: inline; }
.entry.unvoted div.score.unvoted { display: inline; }
/* Award name font is too big in IE */
.award-square .award-name {
font-size:18px;
}

File diff suppressed because it is too large


@@ -1,12 +1,16 @@
.shirt {
margin-left: 30px;
width: 600px;
}
.shirt h2 {
margin: 30px;
text-align: center;
font-size: large;
color: gray;
}
.shirt p {
font-family: arial;
font-size: larger;
color: #111;
}
.shirt h2 {
font-family: arial;
color:gray;
font-size:x-large;
font-weight:normal;
}
.shirt .shirt-container { margin: 10px; }
.shirt .shirt-container .left {

(7 binary image files added)

r2/r2/public/static/js/jquery-1.3.1.js (vendored new file, 4241 lines; diff suppressed because it is too large)

(a second file's diff suppressed because one or more lines are too long)


@@ -1 +1 @@
jquery-1.2.6.js
jquery-1.3.1.min.js


@@ -20,7 +20,11 @@ $.log = function(message) {
alert(message);
};
$.debug = $.log;
$.debug = function(message) {
if ($.with_default(reddit.debug, false)) {
return $.log(message);
}
}
$.fn.debug = function() {
$.debug($(this));
return $(this);
@@ -149,6 +153,9 @@ $.request = function(op, parameters, worker_in, block, type, get_only) {
var action = op;
var worker = worker_in;
if (rate_limit(op))
return;
/* we have a lock if we are not blocking or if we have gotten a lock */
var have_lock = !$.with_default(block, false) || acquire_ajax_lock(action);
@@ -198,6 +205,31 @@ var upmod_cls = "upmod";
var down_cls = "down";
var downmod_cls = "downmod";
rate_limit = function() {
/* default rate-limit duration (in milliseconds) */
var default_rate_limit = 333;
/* rate limit on a per-action basis (also in ms, 0 = don't rate limit) */
var rate_limits = {"vote": 333, "comment": 5000,
"ignore": 0, "ban": 0, "unban": 0};
var last_dates = {};
/* paranoia: copy global functions used to avoid tampering. */
var defined = $.defined;
var with_default = $.with_default;
var _Date = Date;
return function(action) {
var now = new _Date();
var last_date = last_dates[action];
var allowed_interval = with_default(rate_limits[action],
default_rate_limit);
last_dates[action] = now;
/* true = being rate limited */
return (defined(last_date) && now - last_date < allowed_interval);
};
}()
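The rate limiter above is an immediately-invoked closure that keeps the per-action timestamps and interval table private. A minimal standalone sketch of the same pattern (`makeRateLimiter` is a hypothetical name; intervals are in milliseconds, and `0` disables limiting for that action):

```javascript
// Sketch of the per-action rate-limit closure used above (not the actual
// reddit code). Returns true when the call should be rate limited.
var makeRateLimiter = function(limits, defaultInterval) {
  var lastDates = {}; // private: last call time per action
  return function(action) {
    var now = Date.now();
    var last = lastDates[action];
    var interval = (limits[action] !== undefined) ? limits[action]
                                                  : defaultInterval;
    lastDates[action] = now;
    // true = a previous call to this action happened within the interval
    return last !== undefined && now - last < interval;
  };
};
```

Because the state lives in the closure rather than on a global object, a script can't reset the timestamps to bypass the limit; the code above goes further and also captures copies of `Date` and the helper functions against tampering.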
$.fn.vote = function(vh, callback) {
/* for vote to work, $(this) should be the clicked arrow */
if($(this).hasClass("arrow")) {
@@ -230,7 +262,7 @@ $.fn.vote = function(vh, callback) {
entry.addClass('unvoted')
.removeClass('likes dislikes');
});
$.request("vote", {id: things.filter(":first").thing_id(),
dir : dir, vh : vh});
}


@@ -53,13 +53,16 @@ function post_form(form, where, statusfunc, nametransformfunc, block) {
}
};
function get_form_fields(form, fields) {
function get_form_fields(form, fields, filter_func) {
fields = fields || {};
if (!filter_func)
filter_func = function(x) { return true; };
/* consolidate the form's inputs for submission */
$(form).find("select, input, textarea").not(".gray, :disabled").each(function() {
if (($(this).attr("type") != "radio" &&
$(this).attr("type") != "checkbox") ||
$(this).attr("checked"))
var type = $(this).attr("type");
if (filter_func(this) &&
( (type != "radio" && type != "checkbox") ||
$(this).attr("checked")) )
fields[$(this).attr("name")] = $(this).attr("value");
});
if (fields.id == null) {
@@ -73,6 +76,16 @@ function simple_post_form(form, where, fields, block) {
return false;
};
function post_pseudo_form(form, where, block) {
var filter_func = function(x) {
var parent = $(x).parents("form:first");
return (parent.length == 0 || parent.get(0) == $(form).get(0))
};
$(form).find(".error").not(".status").hide();
$(form).find(".status").html(reddit.status_msg.submitting).show();
$.request(where, get_form_fields(form, {}, filter_func), null, block);
return false;
}
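`get_form_fields` above collects submittable inputs, skipping unchecked radio/checkbox inputs and anything the caller's filter rejects (which `post_pseudo_form` uses to exclude inputs belonging to a different nested form). The filtering logic, minus the jQuery DOM traversal, can be sketched like this (`collectFields` is a hypothetical name operating on plain objects):

```javascript
// Sketch of the field-collection logic in get_form_fields above, without
// the DOM: keep an input only if the filter accepts it and, for radio or
// checkbox inputs, only if it is checked.
function collectFields(inputs, filterFunc) {
  var fields = {};
  filterFunc = filterFunc || function() { return true; };
  inputs.forEach(function(input) {
    var toggle = input.type === "radio" || input.type === "checkbox";
    if (filterFunc(input) && (!toggle || input.checked)) {
      fields[input.name] = input.value;
    }
  });
  return fields;
}
```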
function emptyInput(elem, msg) {
if (! $(elem).attr("value") || $(elem).attr("value") == msg )
@@ -274,8 +287,13 @@ function unsubscribe(reddit_name) {
function friend(user_name, container_name, type) {
return function() {
$.request("friend",
{name: user_name, container: container_name, type: type});
if (!reddit.logged) {
showcover();
}
else {
$.request("friend",
{name: user_name, container: container_name, type: type});
}
}
};
@@ -298,6 +316,19 @@ function cancelShare(elem) {
return cancelToggleForm(elem, ".sharelink", ".share-button");
};
function reject_promo(elem) {
$(elem).thing().find(".rejection-form").show().find("textarea").focus();
}
function cancel_reject_promo(elem) {
$(elem).thing().find(".rejection-form").hide();
}
function complete_reject_promo(elem) {
$(elem).thing().removeClass("accepted").addClass("rejected")
.find(".reject_promo").remove();
}
/* Comment generation */
function helpon(elem) {
$(elem).parents(".usertext-edit:first").children(".markhelp:first").show();
@@ -463,7 +494,8 @@ function updateEventHandlers(thing) {
thing = $(thing);
var listing = thing.parent();
$(thing).filter(".promotedlink").bind("onshow", function() {
$(thing).filter(".promotedlink, .sponsorshipbox")
.bind("onshow", function() {
var id = $(this).thing_id();
if($.inArray(id, reddit.tofetch) != -1) {
$.request("onload", {ids: reddit.tofetch.join(",")});
@@ -472,6 +504,7 @@ function updateEventHandlers(thing) {
var tracker = reddit.trackers[id];
if($.defined(tracker)) {
$(this).find("a.title").attr("href", tracker.click).end()
.find("a.thumbnail").attr("href", tracker.click).end()
.find("img.promote-pixel")
.attr("src", tracker.show);
delete reddit.trackers[id];
@@ -605,7 +638,7 @@ function fetch_title() {
var status = url_field.find(".title-status");
var url = $("#url").val();
if (url) {
status.show().text("loading...");
status.show().text(reddit.status_msg.loading);
error.hide();
$.request("fetch_title", {url: url});
}
@@ -878,14 +911,20 @@ function comment_reply_for_elem(elem) {
}
function edit_usertext(elem) {
show_edit_usertext($(elem).thing().find(".usertext:first"));
var t = $(elem).thing();
t.find(".edit-usertext:first").parent("li").andSelf().hide();
show_edit_usertext(t.find(".usertext:first"));
}
function cancel_usertext(elem) {
hide_edit_usertext($(elem).thing().find(".usertext:first"));
var t = $(elem).thing();
t.find(".edit-usertext:first").parent("li").andSelf().show();
hide_edit_usertext(t.find(".usertext:first"));
}
function save_usertext(elem) {
var t = $(elem).thing();
t.find(".edit-usertext:first").parent("li").andSelf().show();
}
function reply(elem) {
@@ -1044,6 +1083,34 @@ function check_some_langs(elem) {
$(elem).parents("form").find("#some-langs").attr("checked", true);
}
function fetch_parent(elem, parent_permalink, parent_id) {
$(elem).css("color", "red").html(reddit.status_msg.loading);
var thing = $(elem).thing();
var parentdiv = thing.find(".body .parent");
if (parentdiv.length == 0) {
var parent = '';
$.getJSON(parent_permalink, function(response) {
$.each(response, function() {
if (this && this.data.children) {
$.each(this.data.children, function() {
if(this.data.name == parent_id) {
parent= this.data.body_html;
}
});
}
});
if(parent) {
/* make a parent div for the contents of the fetch */
thing.find(".body .md").before('<div class="parent rounded">' +
$.unsafe(parent) +
'</div>');
}
$(elem).parent("li").andSelf().remove();
});
}
return false;
}
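`fetch_parent` above implements the new show-parent feature by fetching the parent's permalink as JSON and scanning each listing's children for the comment whose `name` matches `parent_id`. The scan itself can be sketched separately (`findParentBody` is a hypothetical helper name; the listing shape follows the reddit JSON API):

```javascript
// Sketch of the listing scan in fetch_parent above: return the body_html
// of the child whose fullname matches parentId, or "" if not found.
function findParentBody(response, parentId) {
  var parent = '';
  response.forEach(function(listing) {
    if (listing && listing.data && listing.data.children) {
      listing.data.children.forEach(function(child) {
        if (child.data.name === parentId) {
          parent = child.data.body_html;
        }
      });
    }
  });
  return parent;
}
```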
/* The ready method */
$(function() {
/* set function to be called on thing creation/replacement,
@@ -1074,5 +1141,4 @@ $(function() {
/* visually mark the last-clicked entry */
last_click();
});


@@ -0,0 +1,64 @@
function update_box(elem) {
$(elem).prevAll("*[type=checkbox]:first").attr('checked', true);
};
function update_bid(elem) {
var form = $(elem).parents("form:first");
var bid = parseFloat(form.find("*[name=bid]").val());
var ndays = ((Date.parse(form.find("*[name=enddate]").val()) -
Date.parse(form.find("*[name=startdate]").val())) / (86400*1000));
$("#bid-field span.gray").html("[Current campaign totals " +
"<b>$" + (bid/ndays).toFixed(2) +
"</b> per day for <b>" + ndays + " day(s)</b>]");
$("#duration span.gray")
.html( ndays == 1 ? "(1 day)" : "(" + ndays + " days)");
}
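`update_bid` above spreads the campaign bid evenly across the day span between the start and end dates (86400 * 1000 ms per day). The arithmetic can be sketched in isolation (`campaignTotals` is a hypothetical helper; date strings are whatever `Date.parse` accepts):

```javascript
// Sketch of the campaign arithmetic in update_bid above: number of days
// in the campaign, and the bid in dollars per day, formatted to cents.
var MS_PER_DAY = 86400 * 1000;
function campaignTotals(bid, startdate, enddate) {
  var ndays = (Date.parse(enddate) - Date.parse(startdate)) / MS_PER_DAY;
  return { ndays: ndays, perDay: (bid / ndays).toFixed(2) };
}
```

For example, a $100 bid from 01/01 to 01/05 spans 4 days at $25.00 per day.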
var dateFromInput = function(selector, offset) {
if(selector) {
var input = $(selector);
if(input.length) {
var d = new Date();
offset = $.with_default(offset, 0);
d.setTime(Date.parse(input.val()) + offset);
return d;
}
}
};
function attach_calendar(where, min_date_src, max_date_src, callback) {
$(where).siblings(".datepicker").mousedown(function() {
$(this).addClass("clicked active");
}).click(function() {
$(this).removeClass("clicked")
.not(".selected").siblings("input").focus().end()
.removeClass("selected");
}).end()
.focus(function() {
var target = $(this);
var dp = $(this).siblings(".datepicker");
if (dp.children().length == 0) {
dp.each(function() {
$(this).datepicker(
{
defaultDate: dateFromInput(target),
minDate: dateFromInput(min_date_src, 86400 * 1000),
maxDate: dateFromInput(max_date_src),
prevText: "&laquo;", nextText: "&raquo;",
altField: "#" + target.attr("id"),
onSelect: function() {
$(dp).addClass("selected").removeClass("clicked");
$(target).blur();
if(callback) callback(this);
}
})
})
.addClass("drop-choices");
};
dp.addClass("inuse active");
}).blur(function() {
$(this).siblings(".datepicker").not(".clicked").removeClass("inuse");
}).click(function() {
$(this).siblings(".datepicker.inuse").addClass("active");
});
}


@@ -0,0 +1,519 @@
/*
* jQuery UI 1.7.1
*
* Copyright (c) 2009 AUTHORS.txt (http://jqueryui.com/about)
* Dual licensed under the MIT (MIT-LICENSE.txt)
* and GPL (GPL-LICENSE.txt) licenses.
*
* http://docs.jquery.com/UI
*/
;jQuery.ui || (function($) {
var _remove = $.fn.remove,
isFF2 = $.browser.mozilla && (parseFloat($.browser.version) < 1.9);
//Helper functions and ui object
$.ui = {
version: "1.7.1",
// $.ui.plugin is deprecated. Use the proxy pattern instead.
plugin: {
add: function(module, option, set) {
var proto = $.ui[module].prototype;
for(var i in set) {
proto.plugins[i] = proto.plugins[i] || [];
proto.plugins[i].push([option, set[i]]);
}
},
call: function(instance, name, args) {
var set = instance.plugins[name];
if(!set || !instance.element[0].parentNode) { return; }
for (var i = 0; i < set.length; i++) {
if (instance.options[set[i][0]]) {
set[i][1].apply(instance.element, args);
}
}
}
},
contains: function(a, b) {
return document.compareDocumentPosition
? a.compareDocumentPosition(b) & 16
: a !== b && a.contains(b);
},
hasScroll: function(el, a) {
//If overflow is hidden, the element might have extra content, but the user wants to hide it
if ($(el).css('overflow') == 'hidden') { return false; }
var scroll = (a && a == 'left') ? 'scrollLeft' : 'scrollTop',
has = false;
if (el[scroll] > 0) { return true; }
// TODO: determine which cases actually cause this to happen
// if the element doesn't have the scroll set, see if it's possible to
// set the scroll
el[scroll] = 1;
has = (el[scroll] > 0);
el[scroll] = 0;
return has;
},
isOverAxis: function(x, reference, size) {
//Determines when x coordinate is over "b" element axis
return (x > reference) && (x < (reference + size));
},
isOver: function(y, x, top, left, height, width) {
//Determines when x, y coordinates is over "b" element
return $.ui.isOverAxis(y, top, height) && $.ui.isOverAxis(x, left, width);
},
keyCode: {
BACKSPACE: 8,
CAPS_LOCK: 20,
COMMA: 188,
CONTROL: 17,
DELETE: 46,
DOWN: 40,
END: 35,
ENTER: 13,
ESCAPE: 27,
HOME: 36,
INSERT: 45,
LEFT: 37,
NUMPAD_ADD: 107,
NUMPAD_DECIMAL: 110,
NUMPAD_DIVIDE: 111,
NUMPAD_ENTER: 108,
NUMPAD_MULTIPLY: 106,
NUMPAD_SUBTRACT: 109,
PAGE_DOWN: 34,
PAGE_UP: 33,
PERIOD: 190,
RIGHT: 39,
SHIFT: 16,
SPACE: 32,
TAB: 9,
UP: 38
}
};
// WAI-ARIA normalization
if (isFF2) {
var attr = $.attr,
removeAttr = $.fn.removeAttr,
ariaNS = "http://www.w3.org/2005/07/aaa",
ariaState = /^aria-/,
ariaRole = /^wairole:/;
$.attr = function(elem, name, value) {
var set = value !== undefined;
return (name == 'role'
? (set
? attr.call(this, elem, name, "wairole:" + value)
: (attr.apply(this, arguments) || "").replace(ariaRole, ""))
: (ariaState.test(name)
? (set
? elem.setAttributeNS(ariaNS,
name.replace(ariaState, "aaa:"), value)
: attr.call(this, elem, name.replace(ariaState, "aaa:")))
: attr.apply(this, arguments)));
};
$.fn.removeAttr = function(name) {
return (ariaState.test(name)
? this.each(function() {
this.removeAttributeNS(ariaNS, name.replace(ariaState, ""));
}) : removeAttr.call(this, name));
};
}
//jQuery plugins
$.fn.extend({
remove: function() {
// Safari has a native remove event which actually removes DOM elements,
// so we have to use triggerHandler instead of trigger (#3037).
$("*", this).add(this).each(function() {
$(this).triggerHandler("remove");
});
return _remove.apply(this, arguments );
},
enableSelection: function() {
return this
.attr('unselectable', 'off')
.css('MozUserSelect', '')
.unbind('selectstart.ui');
},
disableSelection: function() {
return this
.attr('unselectable', 'on')
.css('MozUserSelect', 'none')
.bind('selectstart.ui', function() { return false; });
},
scrollParent: function() {
var scrollParent;
if(($.browser.msie && (/(static|relative)/).test(this.css('position'))) || (/absolute/).test(this.css('position'))) {
scrollParent = this.parents().filter(function() {
return (/(relative|absolute|fixed)/).test($.curCSS(this,'position',1)) && (/(auto|scroll)/).test($.curCSS(this,'overflow',1)+$.curCSS(this,'overflow-y',1)+$.curCSS(this,'overflow-x',1));
}).eq(0);
} else {
scrollParent = this.parents().filter(function() {
return (/(auto|scroll)/).test($.curCSS(this,'overflow',1)+$.curCSS(this,'overflow-y',1)+$.curCSS(this,'overflow-x',1));
}).eq(0);
}
return (/fixed/).test(this.css('position')) || !scrollParent.length ? $(document) : scrollParent;
}
});
//Additional selectors
$.extend($.expr[':'], {
data: function(elem, i, match) {
return !!$.data(elem, match[3]);
},
focusable: function(element) {
var nodeName = element.nodeName.toLowerCase(),
tabIndex = $.attr(element, 'tabindex');
return (/input|select|textarea|button|object/.test(nodeName)
? !element.disabled
: 'a' == nodeName || 'area' == nodeName
? element.href || !isNaN(tabIndex)
: !isNaN(tabIndex))
// the element and all of its ancestors must be visible
// the browser may report that the area is hidden
&& !$(element)['area' == nodeName ? 'parents' : 'closest'](':hidden').length;
},
tabbable: function(element) {
var tabIndex = $.attr(element, 'tabindex');
return (isNaN(tabIndex) || tabIndex >= 0) && $(element).is(':focusable');
}
});
// $.widget is a factory to create jQuery plugins
// taking some boilerplate code out of the plugin code
function getter(namespace, plugin, method, args) {
function getMethods(type) {
var methods = $[namespace][plugin][type] || [];
return (typeof methods == 'string' ? methods.split(/,?\s+/) : methods);
}
var methods = getMethods('getter');
if (args.length == 1 && typeof args[0] == 'string') {
methods = methods.concat(getMethods('getterSetter'));
}
return ($.inArray(method, methods) != -1);
}
$.widget = function(name, prototype) {
var namespace = name.split(".")[0];
name = name.split(".")[1];
// create plugin method
$.fn[name] = function(options) {
var isMethodCall = (typeof options == 'string'),
args = Array.prototype.slice.call(arguments, 1);
// prevent calls to internal methods
if (isMethodCall && options.substring(0, 1) == '_') {
return this;
}
// handle getter methods
if (isMethodCall && getter(namespace, name, options, args)) {
var instance = $.data(this[0], name);
return (instance ? instance[options].apply(instance, args)
: undefined);
}
// handle initialization and non-getter methods
return this.each(function() {
var instance = $.data(this, name);
// constructor
(!instance && !isMethodCall &&
$.data(this, name, new $[namespace][name](this, options))._init());
// method call
(instance && isMethodCall && $.isFunction(instance[options]) &&
instance[options].apply(instance, args));
});
};
// create widget constructor
$[namespace] = $[namespace] || {};
$[namespace][name] = function(element, options) {
var self = this;
this.namespace = namespace;
this.widgetName = name;
this.widgetEventPrefix = $[namespace][name].eventPrefix || name;
this.widgetBaseClass = namespace + '-' + name;
this.options = $.extend({},
$.widget.defaults,
$[namespace][name].defaults,
$.metadata && $.metadata.get(element)[name],
options);
this.element = $(element)
.bind('setData.' + name, function(event, key, value) {
if (event.target == element) {
return self._setData(key, value);
}
})
.bind('getData.' + name, function(event, key) {
if (event.target == element) {
return self._getData(key);
}
})
.bind('remove', function() {
return self.destroy();
});
};
// add widget prototype
$[namespace][name].prototype = $.extend({}, $.widget.prototype, prototype);
// TODO: merge getter and getterSetter properties from widget prototype
// and plugin prototype
$[namespace][name].getterSetter = 'option';
};
$.widget.prototype = {
_init: function() {},
destroy: function() {
this.element.removeData(this.widgetName)
.removeClass(this.widgetBaseClass + '-disabled' + ' ' + this.namespace + '-state-disabled')
.removeAttr('aria-disabled');
},
option: function(key, value) {
var options = key,
self = this;
if (typeof key == "string") {
if (value === undefined) {
return this._getData(key);
}
options = {};
options[key] = value;
}
$.each(options, function(key, value) {
self._setData(key, value);
});
},
_getData: function(key) {
return this.options[key];
},
_setData: function(key, value) {
this.options[key] = value;
if (key == 'disabled') {
this.element
[value ? 'addClass' : 'removeClass'](
this.widgetBaseClass + '-disabled' + ' ' +
this.namespace + '-state-disabled')
.attr("aria-disabled", value);
}
},
enable: function() {
this._setData('disabled', false);
},
disable: function() {
this._setData('disabled', true);
},
_trigger: function(type, event, data) {
var callback = this.options[type],
eventName = (type == this.widgetEventPrefix
? type : this.widgetEventPrefix + type);
event = $.Event(event);
event.type = eventName;
// copy original event properties over to the new event
// this would happen if we could call $.event.fix instead of $.Event
// but we don't have a way to force an event to be fixed multiple times
if (event.originalEvent) {
for (var i = $.event.props.length, prop; i;) {
prop = $.event.props[--i];
event[prop] = event.originalEvent[prop];
}
}
this.element.trigger(event, data);
return !($.isFunction(callback) && callback.call(this.element[0], event, data) === false
|| event.isDefaultPrevented());
}
};
$.widget.defaults = {
disabled: false
};
/** Mouse Interaction Plugin **/
$.ui.mouse = {
_mouseInit: function() {
var self = this;
this.element
.bind('mousedown.'+this.widgetName, function(event) {
return self._mouseDown(event);
})
.bind('click.'+this.widgetName, function(event) {
if(self._preventClickEvent) {
self._preventClickEvent = false;
event.stopImmediatePropagation();
return false;
}
});
// Prevent text selection in IE
if ($.browser.msie) {
this._mouseUnselectable = this.element.attr('unselectable');
this.element.attr('unselectable', 'on');
}
this.started = false;
},
// TODO: make sure destroying one instance of mouse doesn't mess with
// other instances of mouse
_mouseDestroy: function() {
this.element.unbind('.'+this.widgetName);
// Restore text selection in IE
($.browser.msie
&& this.element.attr('unselectable', this._mouseUnselectable));
},
_mouseDown: function(event) {
// don't let more than one widget handle mouseStart
// TODO: figure out why we have to use originalEvent
event.originalEvent = event.originalEvent || {};
if (event.originalEvent.mouseHandled) { return; }
// we may have missed mouseup (out of window)
(this._mouseStarted && this._mouseUp(event));
this._mouseDownEvent = event;
var self = this,
btnIsLeft = (event.which == 1),
elIsCancel = (typeof this.options.cancel == "string" ? $(event.target).parents().add(event.target).filter(this.options.cancel).length : false);
if (!btnIsLeft || elIsCancel || !this._mouseCapture(event)) {
return true;
}
this.mouseDelayMet = !this.options.delay;
if (!this.mouseDelayMet) {
this._mouseDelayTimer = setTimeout(function() {
self.mouseDelayMet = true;
}, this.options.delay);
}
if (this._mouseDistanceMet(event) && this._mouseDelayMet(event)) {
this._mouseStarted = (this._mouseStart(event) !== false);
if (!this._mouseStarted) {
event.preventDefault();
return true;
}
}
// these delegates are required to keep context
this._mouseMoveDelegate = function(event) {
return self._mouseMove(event);
};
this._mouseUpDelegate = function(event) {
return self._mouseUp(event);
};
$(document)
.bind('mousemove.'+this.widgetName, this._mouseMoveDelegate)
.bind('mouseup.'+this.widgetName, this._mouseUpDelegate);
// preventDefault() is used to prevent the selection of text here -
// however, in Safari, this causes select boxes not to be selectable
// anymore, so this fix is needed
($.browser.safari || event.preventDefault());
event.originalEvent.mouseHandled = true;
return true;
},
_mouseMove: function(event) {
// IE mouseup check - mouseup happened when mouse was out of window
if ($.browser.msie && !event.button) {
return this._mouseUp(event);
}
if (this._mouseStarted) {
this._mouseDrag(event);
return event.preventDefault();
}
if (this._mouseDistanceMet(event) && this._mouseDelayMet(event)) {
this._mouseStarted =
(this._mouseStart(this._mouseDownEvent, event) !== false);
(this._mouseStarted ? this._mouseDrag(event) : this._mouseUp(event));
}
return !this._mouseStarted;
},
_mouseUp: function(event) {
$(document)
.unbind('mousemove.'+this.widgetName, this._mouseMoveDelegate)
.unbind('mouseup.'+this.widgetName, this._mouseUpDelegate);
if (this._mouseStarted) {
this._mouseStarted = false;
this._preventClickEvent = (event.target == this._mouseDownEvent.target);
this._mouseStop(event);
}
return false;
},
_mouseDistanceMet: function(event) {
return (Math.max(
Math.abs(this._mouseDownEvent.pageX - event.pageX),
Math.abs(this._mouseDownEvent.pageY - event.pageY)
) >= this.options.distance
);
},
_mouseDelayMet: function(event) {
return this.mouseDelayMet;
},
// These are placeholder methods, to be overriden by extending plugin
_mouseStart: function(event) {},
_mouseDrag: function(event) {},
_mouseStop: function(event) {},
_mouseCapture: function(event) { return true; }
};
$.ui.mouse.defaults = {
cancel: null,
distance: 1,
delay: 0
};
})(jQuery);

File diff suppressed because it is too large.

Binary file added (767 B).

Some files were not shown because too many files have changed in this diff.