Compare commits

..

15 Commits

Author SHA1 Message Date
FoxxMD
79db9d3848 feat: Add dns lookup caching
Reduces dns traffic by 95%
2024-06-13 14:16:46 -04:00
FoxxMD
4e3ef71c73 fix(mhs): Check for authentication error response
MHS will return a 200 HTTP response but with a body detailing the auth error
2024-02-21 09:38:41 -05:00
FoxxMD
c1c0f02c91 fix(notes): Fix user/mod note existing check bugs
* Render note content to check criteria so templated content isn't compared against rendered content
* Fix cached mod note acted on id to have correct prefix based on thing type
* Fix references mod action not checking the note for the acted-on activity
2023-10-06 11:32:17 -04:00
FoxxMD
da45925f0c feat(history): Store rule failure context in results 2023-10-06 11:30:49 -04:00
MHFDoge
e465f2f1e7 Update configuration.md
(cherry picked from commit 6e37fc4eb7)
2023-05-23 11:41:49 -04:00
FoxxMD
00bc917296 fix: activity fetch reporting wrong raw count
unfiltered doesn't need to be initialized until it is actually used
2023-05-23 11:41:17 -04:00
FoxxMD
8080a0b058 feat: Add banned user lookup functionality
* Retrieve author's ban status in subreddit on a per-name basis
* Cache author's ban status for AuthorTTL
2023-05-17 11:49:13 -04:00
FoxxMD
f7dc9222d6 feat(template): Add flair text props to template rendering for submissions and authors
#141
2023-04-21 10:06:52 -04:00
FoxxMD
021e5c524b feat(cache): Cache reddit request errors when fetching activities
Use a small in-memory cache to store Reddit API response errors for 5 seconds to prevent wasting API calls
2023-01-30 12:22:12 -05:00
FoxxMD
e5fe4589e0 fix: Prevent non-serious errors from contributing to retry counts
Do not increase retry count if any error in the stack was marked as non-serious.
2023-01-30 12:20:43 -05:00
FoxxMD
580a9c8fe6 feat(filter): Implement own-subreddit placeholder for subreddit filtering
When subreddit criteria name is `{{subreddit}}` CM checks the given subreddit against the name of the subreddit the bot is currently operating in.
2022-12-16 10:26:42 -05:00
FoxxMD
ef372e531e fix(database): Prevent usage of LIMIT in session storage driver when db backend is mysql/mariadb
Related to freshgiammi-lab/connect-typeorm#8

Closes #128
2022-11-29 09:47:55 -05:00
FoxxMD
fde2836208 chore: remove comments about wrong endpoints
At some point maybe this was fixed by reddit silently?
2022-11-28 14:36:33 -05:00
Matt Foxx
021dd5b0c5 Merge pull request #130 from rysie/bug/selecftlair-fix 2022-11-28 14:35:39 -05:00
Marcin Macinski
5bd38d367a assignFlair doesn't work with flair_template_id 2022-11-25 17:13:49 +01:00
22 changed files with 269 additions and 194 deletions

View File

@@ -36,7 +36,7 @@ configuration.
* **FILE** -- Values specified in a YAML/JSON configuration file using the structure [in the schema](https://json-schema.app/view/%23?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Fcontext-mod%2Fmaster%2Fsrc%2FSchema%2FOperatorConfig.json)
* When reading the **schema** if the variable is available at a level of configuration other than **FILE** it will be
noted with the same symbol as above. The value shown is the default.
* **ARG** -- Values specified as CLI arguments to the program (see [ClI Usage](#cli-usage) below)
* **ARG** -- Values specified as CLI arguments to the program (see [CLI Usage](#cli-usage) below)
## File Configuration (Recommended)

View File

@@ -61,13 +61,14 @@ All Actions with `content` have access to this data:
Additionally, `author` has these properties accessible:
| Name | Description | Example |
|----------------|-------------------------------------|----------|
| `age` | (Approximate) Age of account | 3 months |
| `linkKarma` | Amount of link karma | 10 |
| `commentKarma` | Amount of comment karma | 3 |
| `totalKarma` | Combined link+comment karma | 13 |
| `verified` | Does account have a verified email? | true |
| Name | Description | Example |
|----------------|-----------------------------------------------------------------------------------|------------|
| `age` | (Approximate) Age of account | 3 months |
| `linkKarma` | Amount of link karma | 10 |
| `commentKarma` | Amount of comment karma | 3 |
| `totalKarma` | Combined link+comment karma | 13 |
| `verified` | Does account have a verified email? | true |
| `flairText` | The text of the Flair assigned to the Author in this subreddit, if one is present | Test Flair |
NOTE: Accessing these properties may require an additional API call so use sparingly on high-volume comments
@@ -84,13 +85,14 @@ Produces:
If the **Activity** is a Submission these additional properties are accessible:
| Name | Description | Example |
|---------------|-----------------------------------------------------------------|-------------------------|
| `upvoteRatio` | The upvote ratio | 100% |
| `nsfw` | If the submission is marked as NSFW | true |
| `spoiler` | If the submission is marked as a spoiler | true |
| `url` | If the submission was a link then this is the URL for that link | http://example.com |
| `title` | The title of the submission | Test post please ignore |
| Name | Description | Example |
|-------------------|-----------------------------------------------------------------|-------------------------|
| `upvoteRatio` | The upvote ratio | 100% |
| `nsfw` | If the submission is marked as NSFW | true |
| `spoiler` | If the submission is marked as a spoiler | true |
| `url` | If the submission was a link then this is the URL for that link | http://example.com |
| `title` | The title of the submission | Test post please ignore |
| `link_flair_text` | The flair text assigned to this submission | Test Flair |
### Comments

55
package-lock.json generated
View File

@@ -31,6 +31,7 @@
"body-parser": "^1.19.0",
"cache-manager": "^3.4.4",
"cache-manager-redis-store": "^2.0.0",
"cacheable-lookup": "^6.1.0",
"command-exists": "^1.2.9",
"commander": "^8.0.0",
"comment-json": "^4.1.1",
@@ -46,7 +47,6 @@
"express-session-cache-manager": "^1.0.2",
"express-socket.io-session": "^1.3.5",
"fast-deep-equal": "^3.1.3",
"fixed-size-list": "^0.3.0",
"globrex": "^0.1.2",
"got": "^11.8.2",
"he": "^1.2.0",
@@ -2396,9 +2396,9 @@
}
},
"node_modules/cacheable-lookup": {
"version": "5.0.4",
"resolved": "https://registry.npmjs.org/cacheable-lookup/-/cacheable-lookup-5.0.4.tgz",
"integrity": "sha512-2/kNscPhpcxrOigMZzbiWF7dz8ilhb/nIHU3EyZiXWXpeq/au8qJ8VhdftMkty3n7Gj6HIGalQG8oiBNB3AJgA==",
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/cacheable-lookup/-/cacheable-lookup-6.1.0.tgz",
"integrity": "sha512-KJ/Dmo1lDDhmW2XDPMo+9oiy/CeqosPguPCrgcVzKyZrL6pM1gU2GmPY/xo6OQPTUaA/c0kwHuywB4E6nmT9ww==",
"engines": {
"node": ">=10.6.0"
}
@@ -3980,14 +3980,6 @@
"micromatch": "^4.0.2"
}
},
"node_modules/fixed-size-list": {
"version": "0.3.0",
"resolved": "https://registry.npmjs.org/fixed-size-list/-/fixed-size-list-0.3.0.tgz",
"integrity": "sha512-c6I8wEE4ZtjKz35BaodH7yWuWmcaUVQwgBeNcI3LxJu79YH+ezHvf1oS9VkgJmyVy5eQ8Wh6jNVcj2rB4rgVgA==",
"dependencies": {
"mitt": "^1.2.0"
}
},
"node_modules/flat": {
"version": "5.0.2",
"resolved": "https://registry.npmjs.org/flat/-/flat-5.0.2.tgz",
@@ -4430,6 +4422,14 @@
"url": "https://github.com/sindresorhus/got?sponsor=1"
}
},
"node_modules/got/node_modules/cacheable-lookup": {
"version": "5.0.4",
"resolved": "https://registry.npmjs.org/cacheable-lookup/-/cacheable-lookup-5.0.4.tgz",
"integrity": "sha512-2/kNscPhpcxrOigMZzbiWF7dz8ilhb/nIHU3EyZiXWXpeq/au8qJ8VhdftMkty3n7Gj6HIGalQG8oiBNB3AJgA==",
"engines": {
"node": ">=10.6.0"
}
},
"node_modules/graceful-fs": {
"version": "4.2.10",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.10.tgz",
@@ -6122,11 +6122,6 @@
"resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.6.tgz",
"integrity": "sha512-Jsjnk4bw3YJqYzbdyBiNsPWHPfO++UGG749Cxs6peCu5Xg4nrena6OVxOYxrQTqww0Jmwt+Ref8rggumkTLz9Q=="
},
"node_modules/mitt": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/mitt/-/mitt-1.2.0.tgz",
"integrity": "sha512-r6lj77KlwqLhIUku9UWYes7KJtsczvolZkzp8hbaDPPaE24OmWl5s539Mytlj22siEQKosZ26qCBgda2PKwoJw=="
},
"node_modules/mkdirp": {
"version": "0.5.6",
"resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.6.tgz",
@@ -12338,9 +12333,9 @@
}
},
"cacheable-lookup": {
"version": "5.0.4",
"resolved": "https://registry.npmjs.org/cacheable-lookup/-/cacheable-lookup-5.0.4.tgz",
"integrity": "sha512-2/kNscPhpcxrOigMZzbiWF7dz8ilhb/nIHU3EyZiXWXpeq/au8qJ8VhdftMkty3n7Gj6HIGalQG8oiBNB3AJgA=="
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/cacheable-lookup/-/cacheable-lookup-6.1.0.tgz",
"integrity": "sha512-KJ/Dmo1lDDhmW2XDPMo+9oiy/CeqosPguPCrgcVzKyZrL6pM1gU2GmPY/xo6OQPTUaA/c0kwHuywB4E6nmT9ww=="
},
"cacheable-request": {
"version": "7.0.2",
@@ -13602,14 +13597,6 @@
"micromatch": "^4.0.2"
}
},
"fixed-size-list": {
"version": "0.3.0",
"resolved": "https://registry.npmjs.org/fixed-size-list/-/fixed-size-list-0.3.0.tgz",
"integrity": "sha512-c6I8wEE4ZtjKz35BaodH7yWuWmcaUVQwgBeNcI3LxJu79YH+ezHvf1oS9VkgJmyVy5eQ8Wh6jNVcj2rB4rgVgA==",
"requires": {
"mitt": "^1.2.0"
}
},
"flat": {
"version": "5.0.2",
"resolved": "https://registry.npmjs.org/flat/-/flat-5.0.2.tgz",
@@ -13922,6 +13909,13 @@
"lowercase-keys": "^2.0.0",
"p-cancelable": "^2.0.0",
"responselike": "^2.0.0"
},
"dependencies": {
"cacheable-lookup": {
"version": "5.0.4",
"resolved": "https://registry.npmjs.org/cacheable-lookup/-/cacheable-lookup-5.0.4.tgz",
"integrity": "sha512-2/kNscPhpcxrOigMZzbiWF7dz8ilhb/nIHU3EyZiXWXpeq/au8qJ8VhdftMkty3n7Gj6HIGalQG8oiBNB3AJgA=="
}
}
},
"graceful-fs": {
@@ -15213,11 +15207,6 @@
"resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.6.tgz",
"integrity": "sha512-Jsjnk4bw3YJqYzbdyBiNsPWHPfO++UGG749Cxs6peCu5Xg4nrena6OVxOYxrQTqww0Jmwt+Ref8rggumkTLz9Q=="
},
"mitt": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/mitt/-/mitt-1.2.0.tgz",
"integrity": "sha512-r6lj77KlwqLhIUku9UWYes7KJtsczvolZkzp8hbaDPPaE24OmWl5s539Mytlj22siEQKosZ26qCBgda2PKwoJw=="
},
"mkdirp": {
"version": "0.5.6",
"resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.6.tgz",

View File

@@ -53,6 +53,7 @@
"body-parser": "^1.19.0",
"cache-manager": "^3.4.4",
"cache-manager-redis-store": "^2.0.0",
"cacheable-lookup": "^6.1.0",
"command-exists": "^1.2.9",
"commander": "^8.0.0",
"comment-json": "^4.1.1",
@@ -68,7 +69,6 @@
"express-session-cache-manager": "^1.0.2",
"express-socket.io-session": "^1.3.5",
"fast-deep-equal": "^3.1.3",
"fixed-size-list": "^0.3.0",
"globrex": "^0.1.2",
"got": "^11.8.2",
"he": "^1.2.0",

View File

@@ -67,13 +67,15 @@ export class BanAction extends Action {
// @ts-ignore
const fetchedSub = await item.subreddit.fetch();
const fetchedName = await item.author.name;
const bannedUser = await fetchedSub.banUser({
const banData = {
name: fetchedName,
banMessage: renderedContent === undefined ? undefined : renderedContent,
banReason: renderedReason,
banNote: renderedNote,
duration: this.duration
});
};
const bannedUser = await fetchedSub.banUser(banData);
await this.resources.addUserToSubredditBannedUserCache(banData)
touchedEntities.push(bannedUser);
}
return {

View File

@@ -57,8 +57,18 @@ export class ModNoteAction extends Action {
// nothing to do!
noteCheckResult = 'existingNoteCheck=false so no existing note checks were performed.';
} else {
const contextualCheck = {...this.existingNoteCheck};
let contextualNotes: string[] | undefined = undefined;
if(this.existingNoteCheck.note !== undefined && this.existingNoteCheck.note !== null) {
contextualNotes = [];
const notes = Array.isArray(this.existingNoteCheck.note) ? this.existingNoteCheck.note : [this.existingNoteCheck.note];
for(const n of notes) {
contextualNotes.push((await this.renderContent(n, item, ruleResults, actionResults) as string))
}
contextualCheck.note = contextualNotes;
}
const noteCheckCriteriaResult = await this.resources.isAuthor(item, {
modActions: [this.existingNoteCheck]
modActions: [contextualCheck]
});
noteCheckPassed = noteCheckCriteriaResult.passed;
const {details} = buildFilterCriteriaSummary(noteCheckCriteriaResult);

View File

@@ -43,11 +43,7 @@ export class FlairAction extends Action {
if (item instanceof Submission) {
if(!this.dryRun) {
if (this.flair_template_id) {
// typings are wrong for this function, flair_template_id should be accepted
// assignFlair uses /api/flair (mod endpoint)
// selectFlair uses /api/selectflair (self endpoint for user to choose their own flair for submission)
// @ts-ignore
await item.assignFlair({flair_template_id: this.flair_template_id}).then(() => {});
await item.selectFlair({flair_template_id: this.flair_template_id}).then(() => {});
item.link_flair_template_id = this.flair_template_id;
} else {
await item.assignFlair({text: renderedText, cssClass: renderedCss}).then(() => {});

View File

@@ -54,8 +54,18 @@ export class UserNoteAction extends Action {
// nothing to do!
noteCheckResult = 'existingNoteCheck=false so no existing note checks were performed.';
} else {
const contextualCheck = {...this.existingNoteCheck};
let contextualNotes: string[] | undefined = undefined;
if(this.existingNoteCheck.note !== undefined && this.existingNoteCheck.note !== null) {
contextualNotes = [];
const notes = Array.isArray(this.existingNoteCheck.note) ? this.existingNoteCheck.note : [this.existingNoteCheck.note];
for(const n of notes) {
contextualNotes.push((await this.renderContent(n, item, ruleResults, actionResults) as string))
}
contextualCheck.note = contextualNotes;
}
const noteCheckCriteriaResult = await this.resources.isAuthor(item, {
userNotes: [this.existingNoteCheck]
userNotes: [contextualCheck]
});
noteCheckPassed = noteCheckCriteriaResult.passed;
const {details} = buildFilterCriteriaSummary(noteCheckCriteriaResult);

View File

@@ -572,7 +572,7 @@ class Bot implements BotInstanceFunctions {
if (stream !== undefined) {
this.logger.info('Restarting SHARED COMMENT STREAM due to a subreddit config change');
stream.end('Replacing with a new stream with updated subreddits');
processed = stream.processedBuffer;
processed = stream.processed;
}
if (sharedCommentsSubreddits.length > 100) {
this.logger.warn(`SHARED COMMENT STREAM => Reddit can only combine 100 subreddits for getting new Comments but this bot has ${sharedCommentsSubreddits.length}`);
@@ -605,7 +605,7 @@ class Bot implements BotInstanceFunctions {
if (stream !== undefined) {
this.logger.info('Restarting SHARED SUBMISSION STREAM due to a subreddit config change');
stream.end('Replacing with a new stream with updated subreddits');
processed = stream.processedBuffer;
processed = stream.processed;
}
if (sharedSubmissionsSubreddits.length > 100) {
this.logger.warn(`SHARED SUBMISSION STREAM => Reddit can only combine 100 subreddits for getting new Submissions but this bot has ${sharedSubmissionsSubreddits.length}`);

View File

@@ -408,3 +408,9 @@ export interface RuleResultsTemplateData {
export interface GenericContentTemplateData extends BaseTemplateData, Partial<RuleResultsTemplateData>, Partial<ActionResultsTemplateData> {
item?: (SubmissionTemplateData | CommentTemplateData)
}
export type SubredditPlaceholderType = '{{subreddit}}';
export const subredditPlaceholder: SubredditPlaceholderType = '{{subreddit}}';
export const asSubredditPlaceholder = (val: any): val is SubredditPlaceholderType => {
return typeof val === 'string' && val.toLowerCase() === '{{subreddit}}';
}
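
For reference, a standalone sketch of how the new placeholder guard behaves (the inputs here are hypothetical; the function body mirrors the one added above):

```typescript
// Any casing of the literal '{{subreddit}}' counts as the placeholder;
// anything else falls through to the normal name/regex handling.
const asSubredditPlaceholder = (val: any): val is '{{subreddit}}' => {
    return typeof val === 'string' && val.toLowerCase() === '{{subreddit}}';
};

console.log(asSubredditPlaceholder('{{Subreddit}}'));  // true
console.log(asSubredditPlaceholder('mealtimevideos')); // false -- treated as a subreddit name/regex instead
```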

View File

@@ -4,7 +4,7 @@ import {
DurationComparor,
ModeratorNameCriteria,
ModeratorNames, ModActionType,
ModUserNoteLabel, RelativeDateTimeMatch
ModUserNoteLabel, RelativeDateTimeMatch, SubredditPlaceholderType
} from "../Atomic";
import {ActivityType, MaybeActivityType} from "../Reddit";
import {GenericComparison, parseGenericValueComparison} from "../Comparisons";
@@ -57,7 +57,7 @@ export interface SubredditCriteria {
}
export interface StrongSubredditCriteria extends SubredditCriteria {
name?: RegExp
name?: RegExp | SubredditPlaceholderType
}
export const defaultStrongSubredditCriteriaOptions = {

View File

@@ -1,4 +1,5 @@
import {Comment, RedditUser, Submission, Subreddit} from "snoowrap/dist/objects";
import { BannedUser } from "snoowrap/dist/objects/Subreddit";
import { ValueOf } from "ts-essentials";
import {CMError} from "../../Utils/Errors";
@@ -166,3 +167,14 @@ export interface RedditRemovalMessageOptions {
title?: string
lock?: boolean
}
export interface CMBannedUser extends SnoowrapBannedUser {
user: RedditUser
}
export interface SnoowrapBannedUser extends Omit<BannedUser, 'id'> {
days_left: number | null
rel_id?: string
id?: string
}

View File

@@ -311,21 +311,24 @@ export class HistoryRule extends Rule {
let criteriaMet = false;
let failCriteriaResult: string = '';
const criteriaResultsSummary = criteriaResults.map(x => this.generateResultDataFromCriteria(x, true).result).join(this.condition === 'OR' ? ' OR ' : ' AND ');
if (this.condition === 'OR') {
criteriaMet = criteriaResults.some(x => x.triggered);
if(!criteriaMet) {
failCriteriaResult = `${FAIL} No criteria was met`;
failCriteriaResult = `${FAIL} No criteria was met => ${criteriaResultsSummary}`;
}
} else {
criteriaMet = criteriaResults.every(x => x.triggered);
if(!criteriaMet) {
if(criteriaResults.some(x => x.triggered)) {
const met = criteriaResults.filter(x => x.triggered);
failCriteriaResult = `${FAIL} ${met.length} out of ${criteriaResults.length} criteria met but Rule required all be met. Set log level to debug to see individual results`;
failCriteriaResult = `${FAIL} ${met.length} out of ${criteriaResults.length} criteria met but Rule required all be met => ${criteriaResultsSummary}`;
const results = criteriaResults.map(x => this.generateResultDataFromCriteria(x, true));
this.logger.debug(`\r\n ${results.map(x => x.result).join('\r\n')}`);
} else {
failCriteriaResult = `${FAIL} No criteria was met`;
failCriteriaResult = `${FAIL} No criteria was met => ${criteriaResultsSummary}`;
}
}
}
@@ -335,8 +338,8 @@ export class HistoryRule extends Rule {
const refCriteriaResults = criteriaResults.find(x => x.triggered);
const resultData = this.generateResultDataFromCriteria(refCriteriaResults);
this.logger.verbose(`${PASS} ${resultData.result}`);
return Promise.resolve([true, this.getResult(true, resultData)]);
this.logger.verbose(`${PASS} ${criteriaResultsSummary}`);
return Promise.resolve([true, this.getResult(true, {data: resultData.data, result: criteriaResultsSummary})]);
} else {
// log failures for easier debugging
for(const res of criteriaResults) {

View File

@@ -241,6 +241,8 @@ export class MHSRule extends Rule {
res = await this.callMHS(content);
if(res.response.toLowerCase() === 'success') {
await this.resources.cache.set(key, res, {ttl: this.resources.ttl.wikiTTL});
} else if(res.response.toLowerCase().includes('authentication failure')) {
throw new CMError(`MHS Request failed with Authentication failure. You most likely need to generate a new API key.`);
}
return res;
}

View File

@@ -167,7 +167,15 @@ export class ModNote {
if (referenceItem === undefined) {
throw new CMError('Criteria wants to check if mod note references activity but not activity was given.');
}
const isCurrentActivity = this.action.actedOn !== undefined && referenceItem !== undefined && this.action.actedOn.name === referenceItem.name;
let isCurrentActivity = false;
if(referenceItem !== undefined) {
if(this.action.actedOn !== undefined) {
isCurrentActivity = this.action.actedOn.name === referenceItem.name;
}
if(isCurrentActivity === false && this.note !== undefined && this.note.actedOn !== undefined) {
isCurrentActivity = this.note.actedOn.name === referenceItem.name;
}
}
if ((referencesCurrentActivity === true && !isCurrentActivity) || (referencesCurrentActivity === false && isCurrentActivity)) {
return false;
}

View File

@@ -33,9 +33,19 @@ export class ModUserNote {
}
toRaw(): ModUserNoteRaw {
let id = undefined;
if(this.actedOn !== undefined) {
if(this.actedOn instanceof Submission) {
id = `t3_${this.actedOn.id}`;
} else if(this.actedOn instanceof Comment) {
id = `t1_${this.actedOn.id}`;
} else if(this.actedOn instanceof RedditUser) {
id = `t2_${this.actedOn.id}`;
}
}
return {
note: this.note,
reddit_id: this.actedOn !== undefined ? this.actedOn.id : undefined,
reddit_id: id,
label: this.label
}
}

View File

@@ -7,7 +7,6 @@ import {mergeArr, parseDuration, random} from "../util";
import { Logger } from "winston";
import {ErrorWithCause} from "pony-cause";
import dayjs, {Dayjs as DayjsObj} from "dayjs";
import { FixedSizeList } from 'fixed-size-list'
type Awaitable<T> = Promise<T> | T;
@@ -15,12 +14,10 @@ interface RCBPollingOptions<T> extends SnooStormOptions {
subreddit: string,
enforceContinuity?: boolean
logger: Logger
sort?: string
name?: string,
processed?: FixedSizeList<T[keyof T]>
processed?: Set<T[keyof T]>
label?: string
dateCutoff?: boolean
maxHistory?: number
}
interface RCBPollConfiguration<T> extends PollConfiguration<T>,RCBPollingOptions<T> {
@@ -43,9 +40,6 @@ export class SPoll<T extends RedditContent<object>> extends Poll<T> {
name: string = 'Reddit Stream';
logger: Logger;
subreddit: string;
// using a fixed sized "regular" array means slightly more memory usage vs. a Set when holding N items
// BUT now we can limit N items to something reasonable instead of having a crazy big Set with all items seen since stream was started
processedBuffer: FixedSizeList<T[keyof T]>;
constructor(options: RCBPollConfiguration<T>) {
super(options);
@@ -60,7 +54,6 @@ export class SPoll<T extends RedditContent<object>> extends Poll<T> {
label = 'Polling',
processed,
dateCutoff,
maxHistory = 300,
} = options;
this.subreddit = subreddit;
this.name = name !== undefined ? name : this.name;
@@ -74,10 +67,8 @@ export class SPoll<T extends RedditContent<object>> extends Poll<T> {
// if we pass in processed on init the intention is to "continue" from where the previous stream left off
// WITHOUT new start behavior
if (processed !== undefined) {
this.processedBuffer = processed;
this.processed = processed;
this.newStart = false;
} else {
this.processedBuffer = new FixedSizeList<T[keyof T]>(maxHistory);
}
clearInterval(this.interval);
@@ -106,14 +97,14 @@ export class SPoll<T extends RedditContent<object>> extends Poll<T> {
}
for (const item of batch) {
const id = item[self.identifier];
if (self.processedBuffer.data.some(x => x === id)) {
if (self.processed.has(id)) {
anyAlreadySeen = true;
continue;
}
// add new item to list and set as processed
newItems.push(item);
self.processedBuffer.add(id);
self.processed.add(id);
}
page++;
}

View File

@@ -56,7 +56,7 @@ import {
} from "../util";
import {
ActivityDispatch,
CacheConfig,
CacheConfig, CacheOptions,
Footer,
HistoricalStatsDisplay,
ResourceStats, StrongTTLConfig,
@@ -104,7 +104,7 @@ import {
UserNoteCriteria
} from "../Common/Infrastructure/Filters/FilterCriteria";
import {
ActivitySourceValue,
ActivitySourceValue, asSubredditPlaceholder,
ConfigFragmentParseFunc,
DurationVal,
EventRetentionPolicyRange,
@@ -115,7 +115,7 @@ import {
ModUserNoteLabel,
RelativeDateTimeMatch,
statFrequencies,
StatisticFrequencyOption,
StatisticFrequencyOption, SubredditPlaceholderType,
WikiContext
} from "../Common/Infrastructure/Atomic";
import {
@@ -136,9 +136,9 @@ import {Duration} from "dayjs/plugin/duration";
import {
ActivityType,
AuthorHistorySort,
CachedFetchedActivitiesResult,
CachedFetchedActivitiesResult, CMBannedUser,
FetchedActivitiesResult, MaybeActivityType, RedditUserLike,
SnoowrapActivity,
SnoowrapActivity, SnoowrapBannedUser,
SubredditLike,
SubredditRemovalReason
} from "../Common/Infrastructure/Reddit";
@@ -161,7 +161,8 @@ import {ActionResultEntity} from "../Common/Entities/ActionResultEntity";
import {ActivitySource} from "../Common/ActivitySource";
import {SubredditResourceOptions} from "../Common/Subreddit/SubredditResourceInterfaces";
import {SubredditStats} from "./Stats";
import {CMCache} from "../Common/Cache";
import {CMCache, createCacheManager} from "../Common/Cache";
import {BannedUser, BanOptions} from "snoowrap/dist/objects/Subreddit";
export const DEFAULT_FOOTER = '\r\n*****\r\nThis action was performed by [a bot.]({{botLink}}) Mention a moderator or [send a modmail]({{modmailLink}}) if you have any ideas, questions, or concerns about this action.';
@@ -187,6 +188,7 @@ export class SubredditResources {
database: DataSource
client: ExtendedSnoowrap
cache: CMCache
memoryCache: CMCache
cacheSettingsHash?: string;
thirdPartyCredentials: ThirdPartyCredentialsJsonConfig;
delayedItems: ActivityDispatch[] = [];
@@ -244,6 +246,12 @@ export class SubredditResources {
}
this.cache = cache;
this.cache.setLogger(this.logger);
const memoryCacheOpts: CacheOptions = {
store: 'memory',
max: 10,
ttl: 10
};
this.memoryCache = new CMCache(createCacheManager(memoryCacheOpts), memoryCacheOpts, false, undefined, {}, this.logger);
this.subredditStats = new SubredditStats(database, managerEntity, cache, statFrequency, this.logger);
@@ -861,6 +869,47 @@ export class SubredditResources {
}
}
async getSubredditBannedUser(val: string | RedditUser): Promise<CMBannedUser | undefined> {
const subName = this.subreddit.display_name;
const name = getActivityAuthorName(val);
const hash = `sub-${subName}-banned-${name}`;
if (this.ttl.authorTTL !== false) {
const cachedBanData = (await this.cache.get(hash)) as undefined | null | false | SnoowrapBannedUser;
if (cachedBanData !== undefined && cachedBanData !== null) {
this.logger.debug(`Cache Hit: Subreddit Banned User ${subName} ${name}`);
if(cachedBanData === false) {
return undefined;
}
return {...cachedBanData, user: new RedditUser({name: cachedBanData.name}, this.client, false)};
}
}
let bannedUsers = await this.subreddit.getBannedUsers({name});
let bannedUser: CMBannedUser | undefined;
if(bannedUsers.length > 0) {
const banData = bannedUsers[0] as SnoowrapBannedUser;
bannedUser = {...banData, user: new RedditUser({name: banData.name}, this.client, false)};
}
if (this.ttl.authorTTL !== false) {
// @ts-ignore
await this.cache.set(hash, bannedUsers.length > 0 ? bannedUsers[0] as SnoowrapBannedUser : false, {ttl: this.ttl.subredditTTL});
}
return bannedUser;
}
async addUserToSubredditBannedUserCache(data: BanOptions) {
if (this.ttl.authorTTL !== false) {
const subName = this.subreddit.display_name;
const name = getActivityAuthorName(data.name);
const hash = `sub-${subName}-banned-${name}`;
const banData: SnoowrapBannedUser = {date: dayjs().unix(), name: data.name, days_left: data.duration ?? null, note: data.banNote ?? ''};
await this.cache.set(hash, banData, {ttl: this.ttl.authorTTL})
}
}
async hasSubreddit(name: string) {
if (this.ttl.subredditTTL !== false) {
const hash = `sub-${name}`;
@@ -1080,6 +1129,9 @@ export class SubredditResources {
async getActivities(user: RedditUser, options: ActivityWindowCriteria, listingData: NamedListing, prefetchedActivities: SnoowrapActivity[] = []): Promise<FetchedActivitiesResult> {
let cacheKey: string | undefined;
let fromCache = false;
try {
let pre: SnoowrapActivity[] = [];
@@ -1087,7 +1139,6 @@ export class SubredditResources {
let apiCount = 1;
let preMaxTrigger: undefined | string;
let rawCount: number = 0;
let fromCache = false;
const hashObj = cloneDeep(options);
@@ -1100,13 +1151,23 @@ export class SubredditResources {
const userName = getActivityAuthorName(user);
const hash = objectHash.sha1(hashObj);
const cacheKey = `${userName}-${listingData.name}-${hash}`;
cacheKey = `${userName}-${listingData.name}-${hash}`;
if (this.ttl.authorTTL !== false) {
if (this.useSubredditAuthorCache) {
hashObj.subreddit = this.subreddit;
}
// check for cached request error!
//
// we cache reddit API request errors for 403/404 (suspended/shadowban) in memory so that
// we don't waste API calls making the same call repetitively since we know what the result will always be
const cachedRequestError = await this.memoryCache.get(cacheKey) as undefined | null | Error;
if(cachedRequestError !== undefined && cachedRequestError !== null) {
fromCache = true;
this.logger.debug(`In-memory cache found reddit request error for key ${cacheKey}. Must have been <5 sec ago. Throwing to save API calls!`);
throw cachedRequestError;
}
const cacheVal = await this.cache.get(cacheKey);
if(cacheVal === undefined || cacheVal === null) {
@@ -1226,7 +1287,7 @@ export class SubredditResources {
}
preFilteredPrefetchedActivities = await this.filterListingWithHistoryOptions(preFilteredPrefetchedActivities, user, options.filterOn?.pre);
}
let unFilteredItems: SnoowrapActivity[] | undefined = [...preFilteredPrefetchedActivities];
let unFilteredItems: SnoowrapActivity[] | undefined = undefined;
pre = pre.concat(preFilteredPrefetchedActivities);
const { func: listingFunc } = listingData;
@@ -1285,7 +1346,7 @@ export class SubredditResources {
if(satisfiedPreEndtime !== undefined || satisfiedPreCount !== undefined) {
if(unFilteredItems === undefined) {
unFilteredItems = [];
unFilteredItems = [...preFilteredPrefetchedActivities];
}
// window has pre filtering, need to check if fallback max would be hit
if(satisfiedPreEndtime !== undefined) {
@@ -1343,9 +1404,14 @@ export class SubredditResources {
} catch (err: any) {
if(isStatusError(err)) {
switch(err.statusCode) {
case 404:
throw new SimpleError('Reddit returned a 404 for user history. Likely this user is shadowbanned.', {isSerious: false});
case 403:
case 404:
if(!fromCache && cacheKey !== undefined) {
await this.memoryCache.set(cacheKey, err, {ttl: 5});
}
if(err.statusCode === 404) {
throw new SimpleError('Reddit returned a 404 for user history. Likely this user is shadowbanned.', {isSerious: false});
}
throw new MaybeSeriousErrorWithCause('Reddit returned a 403 for user history, likely this user is suspended.', {cause: err, isSerious: false});
default:
throw err;
@@ -1891,8 +1957,14 @@ export class SubredditResources {
if (crit[k] !== undefined) {
switch (k) {
case 'name':
const nameReg = crit[k] as RegExp;
if(!nameReg.test(subreddit.display_name)) {
const nameReg = crit[k] as RegExp | SubredditPlaceholderType;
// placeholder {{subreddit}} tests as true if the given subreddit matches the subreddit this bot is processing the activity from
if (asSubredditPlaceholder(nameReg)) {
if (this.subreddit.display_name !== subreddit.display_name) {
log.debug(`Failed: Expected => ${k}:${crit[k]} (${this.subreddit.display_name}) | Found => ${k}:${subreddit.display_name}`)
return false
}
} else if (!nameReg.test(subreddit.display_name)) {
return false;
}
break;
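
A minimal standalone sketch of the request-error caching pattern added above, assuming a plain Map in place of the project's CMCache wrapper (names here are illustrative):

```typescript
// Remember a failed lookup's error for a few seconds so an immediate retry
// fails fast instead of spending another Reddit API call on a known outcome
// (e.g. 403 suspended / 404 shadowbanned).
const errorCache = new Map<string, { err: Error, expiresAt: number }>();
const ERROR_TTL_MS = 5000;

async function fetchWithErrorCache(cacheKey: string, doRequest: () => Promise<string[]>): Promise<string[]> {
    const cached = errorCache.get(cacheKey);
    if (cached !== undefined && cached.expiresAt > Date.now()) {
        throw cached.err; // same request failed <5s ago, rethrow without calling the API
    }
    try {
        return await doRequest();
    } catch (err) {
        errorCache.set(cacheKey, { err: err as Error, expiresAt: Date.now() + ERROR_TTL_MS });
        throw err;
    }
}
```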

View File

@@ -46,83 +46,6 @@ import {ActionResultEntity} from "../Common/Entities/ActionResultEntity";
export const BOT_LINK = 'https://www.reddit.com/r/ContextModBot/comments/otz396/introduction_to_contextmodbot';
export interface AuthorTypedActivitiesOptions extends ActivityWindowCriteria {
type?: 'comment' | 'submission',
}
export const isSubreddit = async (subreddit: Subreddit, stateCriteria: SubredditCriteria | StrongSubredditCriteria, logger?: Logger) => {
delete stateCriteria.stateDescription;
if (Object.keys(stateCriteria).length === 0) {
return true;
}
const crit = isStrongSubredditState(stateCriteria) ? stateCriteria : toStrongSubredditState(stateCriteria, {defaultFlags: 'i'});
const log: Logger | undefined = logger !== undefined ? logger.child({leaf: 'Subreddit Check'}, mergeArr) : undefined;
return await (async () => {
for (const k of Object.keys(crit)) {
// @ts-ignore
if (crit[k] !== undefined) {
switch (k) {
case 'name':
const nameReg = crit[k] as RegExp;
if(!nameReg.test(subreddit.display_name)) {
return false;
}
break;
case 'isUserProfile':
const entity = parseRedditEntity(subreddit.display_name);
const entityIsUserProfile = entity.type === 'user';
if(crit[k] !== entityIsUserProfile) {
if(log !== undefined) {
log.debug(`Failed: Expected => ${k}:${crit[k]} | Found => ${k}:${entityIsUserProfile}`)
}
return false
}
break;
case 'over18':
case 'over_18':
// handling an edge case where user may have confused Comment/Submission state "over_18" with SubredditState "over18"
// @ts-ignore
if (crit[k] !== subreddit.over18) {
if(log !== undefined) {
// @ts-ignore
log.debug(`Failed: Expected => ${k}:${crit[k]} | Found => ${k}:${subreddit.over18}`)
}
return false
}
break;
default:
// @ts-ignore
if (subreddit[k] !== undefined) {
// @ts-ignore
if (crit[k] !== subreddit[k]) {
if(log !== undefined) {
// @ts-ignore
log.debug(`Failed: Expected => ${k}:${crit[k]} | Found => ${k}:${subreddit[k]}`)
}
return false
}
} else {
if(log !== undefined) {
log.warn(`Tried to test for Subreddit property '${k}' but it did not exist`);
}
}
break;
}
}
}
if(log !== undefined) {
log.debug(`Passed: ${JSON.stringify(stateCriteria)}`);
}
return true;
})() as boolean;
}
const renderContentCommentTruncate = truncateStringToLength(50);
const shortTitleTruncate = truncateStringToLength(15);
@@ -177,6 +100,7 @@ export const renderContent = async (template: string, data: TemplateContext = {}
conditional.spoiler = activity.spoiler;
conditional.op = true;
conditional.upvoteRatio = `${activity.upvote_ratio * 100}%`;
conditional.link_flair_text = activity.link_flair_text;
} else {
conditional.op = activity.is_submitter;
}
@@ -199,6 +123,7 @@ export const renderContent = async (template: string, data: TemplateContext = {}
author.commentKarma = auth.comment_karma;
author.totalKarma = auth.comment_karma + auth.link_karma;
author.verified = auth.has_verified_email;
author.flairText = activity.author_flair_text;
}
const templateData: any = {

View File

@@ -12,6 +12,7 @@ import {Logger} from "winston";
import {WebSetting} from "../../Common/WebEntities/WebSetting";
import {ErrorWithCause} from "pony-cause";
import {createCacheManager} from "../../Common/Cache";
import {MysqlDriver} from "typeorm/driver/mysql/MysqlDriver";
export interface CacheManagerStoreOptions {
prefix?: string
@@ -103,7 +104,12 @@ export class DatabaseStorageProvider extends StorageProvider {
}
createSessionStore(options?: TypeormStoreOptions): Store {
return new TypeormStore(options).connect(this.clientSessionRepo)
// https://github.com/freshgiammi-lab/connect-typeorm#implement-the-session-entity
// https://github.com/freshgiammi-lab/connect-typeorm/issues/8
// usage of LIMIT in subquery is not supported by mariadb/mysql
// limitSubquery: false -- turns off LIMIT usage
const realOptions = this.database.driver instanceof MysqlDriver ? {...options, limitSubquery: false} : options;
return new TypeormStore(realOptions).connect(this.clientSessionRepo)
}
async getSessionSecret(): Promise<string | undefined> {

View File

@@ -1,6 +1,9 @@
import winston from 'winston';
import 'winston-daily-rotate-file';
import dayjs from 'dayjs';
import http from 'http';
import https from 'https';
import CacheableLookup from 'cacheable-lookup';
import utc from 'dayjs/plugin/utc.js';
import advancedFormat from 'dayjs/plugin/advancedFormat';
import tz from 'dayjs/plugin/timezone';
@@ -9,7 +12,6 @@ import relTime from 'dayjs/plugin/relativeTime.js';
import sameafter from 'dayjs/plugin/isSameOrAfter.js';
import samebefore from 'dayjs/plugin/isSameOrBefore.js';
import weekOfYear from 'dayjs/plugin/weekOfYear.js';
import {Manager} from "./Subreddit/Manager";
import {Command, Argument} from 'commander';
import {
@@ -40,6 +42,17 @@ dayjs.extend(tz);
dayjs.extend(advancedFormat);
dayjs.extend(weekOfYear);
const cacheable = new CacheableLookup({
// cache dns entries for 60 seconds
maxTtl: 60,
// fallback to node lookup for 10 minutes in the event of a failure
fallbackDuration: 600
});
// replace node native request agents, globally, so they use cached dns lookups
cacheable.install(http.globalAgent);
cacheable.install(https.globalAgent);
const commentReg = parseLinkIdentifier([COMMENT_URL_ID]);
const submissionReg = parseLinkIdentifier([SUBMISSION_URL_ID]);
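
For comparison, a sketch of scoping the cached lookup to a single agent rather than the global agents (not what the commit does; the URL is a placeholder):

```typescript
import https from 'https';
import CacheableLookup from 'cacheable-lookup';

// Same cacheable-lookup instance, but installed on one keep-alive agent so only
// requests made through that agent reuse cached DNS answers.
const cacheable = new CacheableLookup({ maxTtl: 60 });
const agent = new https.Agent({ keepAlive: true });
cacheable.install(agent);

https.get('https://example.com', { agent }, (res) => {
    console.log(`status: ${res.statusCode}`);
    res.resume();
});
```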

View File

@@ -46,7 +46,14 @@ import {ErrorWithCause, stackWithCauses} from "pony-cause";
import stringSimilarity from 'string-similarity';
import calculateCosineSimilarity from "./Utils/StringMatching/CosineSimilarity";
import levenSimilarity from "./Utils/StringMatching/levenSimilarity";
import {isRateLimitError, isRequestError, isScopeError, isStatusError, SimpleError} from "./Utils/Errors";
import {
isRateLimitError,
isRequestError,
isScopeError,
isSeriousError,
isStatusError,
SimpleError
} from "./Utils/Errors";
import merge from "deepmerge";
import {RulePremise} from "./Common/Entities/RulePremise";
import {RuleResultEntity as RuleResultEntity} from "./Common/Entities/RuleResultEntity";
@@ -70,7 +77,7 @@ import {
import {
ActivitySourceData,
ActivitySourceTypes,
ActivitySourceValue,
ActivitySourceValue, asSubredditPlaceholder,
ConfigFormat,
DurationVal,
ExternalUrlContext,
@@ -83,7 +90,7 @@ import {
RelativeDateTimeMatch,
statFrequencies,
StatisticFrequency,
StatisticFrequencyOption,
StatisticFrequencyOption, subredditPlaceholder, SubredditPlaceholderType,
UrlContext,
WikiContext
} from "./Common/Infrastructure/Atomic";
@@ -1096,16 +1103,22 @@ export const createRetryHandler = (opts: RetryOptions, logger: Logger) => {
// if it's a request error but not a known "oh probably just a reddit blip" status code treat it as other, which should usually have a lower retry max
}
// linear backoff
otherRetryCount++;
let prefix = '';
if(isSeriousError(err)) {
// linear backoff
otherRetryCount++;
} else {
prefix = 'NON-SERIOUS ';
}
let msg = redditApiError ? `Error occurred while making a request to Reddit (${otherRetryCount}/${maxOtherRetry} in ${clearRetryCountAfter} minutes) but it was NOT a well-known "reddit blip" error.` : `Non-request error occurred (${otherRetryCount}/${maxOtherRetry} in ${clearRetryCountAfter} minutes).`;
if (maxOtherRetry < otherRetryCount) {
logger.warn(`${msg} Exceeded max allowed.`);
logger.warn(`${prefix}${msg} Exceeded max allowed.`);
return false;
}
if(waitOnRetry) {
const ms = (4 * 1000) * otherRetryCount;
logger.warn(`${msg} Will wait ${formatNumber(ms / 1000)} seconds before retrying`);
logger.warn(`${prefix}${msg} Will wait ${formatNumber(ms / 1000)} seconds before retrying`);
await sleep(ms);
}
return true;
@@ -1557,7 +1570,7 @@ export const testMaybeStringRegex = (test: string, subject: string, defaultFlags
}
export const isStrongSubredditState = (value: SubredditCriteria | StrongSubredditCriteria) => {
return value.name === undefined || value.name instanceof RegExp;
return value.name === undefined || value.name instanceof RegExp || asSubredditPlaceholder(value.name);
}
export const asStrongSubredditState = (value: any): value is StrongSubredditCriteria => {
@@ -1575,21 +1588,26 @@ export const toStrongSubredditState = (s: SubredditCriteria, opts?: StrongSubred
let nameValOriginallyRegex = false;
let nameReg: RegExp | undefined;
let nameReg: RegExp | undefined | SubredditPlaceholderType;
if (nameValRaw !== undefined) {
if (!(nameValRaw instanceof RegExp)) {
let nameVal = nameValRaw.trim();
nameReg = parseStringToRegex(nameVal, defaultFlags);
if (nameReg === undefined) {
// if sub state has `isUserProfile=true` and config did not provide a regex then
// assume the user wants to use the value in "name" to look for a user profile so we prefix created regex with u_
const parsedEntity = parseRedditEntity(nameVal, isUserProfile !== undefined && isUserProfile ? 'user' : 'subreddit');
// technically they could provide "u_Username" as the value for "name" and we will then match on it regardless of isUserProfile
// but like...why would they do that? There shouldn't be any subreddits that start with u_ that aren't user profiles anyway(?)
const regPrefix = parsedEntity.type === 'user' ? 'u_' : '';
nameReg = parseStringToRegex(`/^${regPrefix}${nameVal}$/`, defaultFlags);
if(asSubredditPlaceholder(nameVal)) {
nameReg = subredditPlaceholder;
nameValOriginallyRegex = false;
} else {
nameValOriginallyRegex = true;
nameReg = parseStringToRegex(nameVal, defaultFlags);
if (nameReg === undefined) {
// if sub state has `isUserProfile=true` and config did not provide a regex then
// assume the user wants to use the value in "name" to look for a user profile so we prefix created regex with u_
const parsedEntity = parseRedditEntity(nameVal, isUserProfile !== undefined && isUserProfile ? 'user' : 'subreddit');
// technically they could provide "u_Username" as the value for "name" and we will then match on it regardless of isUserProfile
// but like...why would they do that? There shouldn't be any subreddits that start with u_ that aren't user profiles anyway(?)
const regPrefix = parsedEntity.type === 'user' ? 'u_' : '';
nameReg = parseStringToRegex(`/^${regPrefix}${nameVal}$/`, defaultFlags);
} else {
nameValOriginallyRegex = true;
}
}
} else {
nameValOriginallyRegex = true;