Compare commits


3 Commits

| Author | SHA1 | Message | Date |
|--------|------------|---------|------|
| FoxxMD | 9e364c3e28 | Refactor cache instantiation to use async method | 2022-10-17 11:42:58 -04:00 |
| FoxxMD | acd4b54d8e | Implement typings for cache-manager-redis-store@3^ | 2022-10-17 11:42:39 -04:00 |
| FoxxMD | 19f4ff0b53 | feat: Update to cache-manager-redis-store ^3.0.0 (also bump node minor version to be in line with cache-manager-redis-store) | 2022-10-17 11:42:04 -04:00 |
83 changed files with 2405 additions and 3979 deletions

.nvmrc
View File

@@ -1 +1 @@
16.14.2
16.18.0

View File

@@ -153,7 +153,7 @@ An **Action** is some action the bot can take against the checked Activity (comm
* For **Operator/Bot maintainers** see **[Operation Guide](/docs/operator/README.md)**
* For **Moderators**
* Start with the [Subreddit/Moderator docs](/docs/subreddit/README.md) or [Moderator Getting Started guide](/docs/subreddit/gettingStarted.md)
* Refer to the [Subreddit Components Documentation](/docs/subreddit/components) or the [subreddit-ready examples](/docs/subreddit/components/cookbook)
* Refer to the [Subreddit Components Documentation](/docs/subreddit/components) or the [subreddit-ready examples](/docs/subreddit/components/subredditReady)
* as well as the [schema](https://json-schema.app/view/%23?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Fcontext-mod%2Fmaster%2Fsrc%2FSchema%2FApp.json) which has
* fully annotated configuration data/structure
* generated examples in json/yaml

View File

@@ -95,7 +95,7 @@ The Retention Policy can be specified at operator level, bot, subreddit *overrid
operator:
name: u/MyRedditAccount
databaseConfig:
retention: '3 months' # each subreddit will retain 3 months of recorded events
retention: '3 months' # each subreddit will retain 3 months of recorded events
bots:
# all subreddits this bot moderates will have 3 month retention
- name: u/OneBotAccount

View File

@@ -80,7 +80,7 @@ Use the [Configuration Reference](/docs/subreddit/components/README.md) to learn
Additionally, refer to [How It Works](/docs/README.md#how-it-works) and [Core Concepts](/docs/README.md#concepts) to learn the basics of CM configuration.
After you have the basics under your belt you could use the [subreddit configurations cookbook](/docs/subreddit/components/cookbook) to familiarize yourself with a complete configuration and ways to use CM.
After you have the basics under your belt you could use the [subreddit-ready example configurations](/docs/subreddit/components/subredditReady) to familiarize yourself with a complete configuration and ways to use CM.
# Guest Access

View File

@@ -70,7 +70,6 @@ This list is not exhaustive. [For complete documentation on a subreddit's config
* [Rule Order](#rule-order)
* [Configuration Re-use and Caching](#configuration-re-use-and-caching)
* [Partial Configurations](#partial-configurations)
* [Sharing Configs Between Subreddits](#sharing-full-configs-as-runs)
* [Subreddit-ready examples](#subreddit-ready-examples)
# Runs
@@ -537,25 +536,6 @@ actions:
targets: string # 'self' or 'parent' or 'https://reddit.com/r/someSubreddit/21nfdi....'
```
### Comment As Subreddit
ContextMod can comment [as the subreddit](https://www.reddit.com/r/modnews/comments/wpy5c8/announcing_remove_as_a_subreddit/) using the `/u/subreddit-ModTeam` account with some restrictions:
* The activity being replied to must ALREADY BE REMOVED.
* You can use the [Remove Action](#remove) beforehand to ensure this is the case.
* The created comment will always be stickied and distinguished
Usage:
```yaml
actions:
- kind: comment
asModTeam: true
content: string # required, the content of the comment
lock: boolean # lock the comment after creation
targets: string # 'self' or 'parent' or 'https://reddit.com/r/someSubreddit/21nfdi....'
```
### Submission
Create a Submission [Schema Documentation](https://json-schema.app/view/%23/%23%2Fdefinitions%2FSubmissionCheckJson/%23%2Fdefinitions%2FSubmissionActionJson?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Freddit-context-bot%2Fmaster%2Fsrc%2FSchema%2FApp.json)
@@ -774,7 +754,7 @@ actions:
- kind: usernote
type: spamwarn
content: 'Usernote message'
existingNoteCheck: boolean # if true (default) then the usernote will not be added if the same note appears for this activity
allowDuplicate: boolean # if false then the usernote will not be added if the same note appears for this activity
```
### Mod Note
@@ -799,7 +779,6 @@ actions:
type: SPAM_WATCH
content: 'a note only mods can see message' # optional
referenceActivity: boolean # if true the Note will be linked to the Activity being processed
existingNoteCheck: boolean # if true (default) then the note will not be added if the same note appears for this activity
```
# Filters
@@ -1267,49 +1246,6 @@ The object contains:
* `path` -- REQUIRED string following rules above
* `ttl` -- OPTIONAL, number of seconds to cache the URL result. Defaults to `WikiTTL`
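As a hedged sketch of the two forms (the wiki page `botconfig/myFragment` and its placement inside a `rules` list are placeholder assumptions, not values from the docs above):
```yaml
rules:
  # string form -- cached for the default WikiTTL
  - 'wiki:botconfig/myFragment'
  # object form -- same path, with an explicit cache TTL in seconds
  - path: 'wiki:botconfig/myFragment'
    ttl: 300
```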
### Sharing Full Configs as Runs
If the Fragment fetched by CM is a "full config" (including `runs`, `polling`, etc...) that could be used as a valid config for another subreddit then CM will extract and use the **Runs** from that config.
**However, the config must also explicitly allow access for use as a Fragment.** This is to prevent subreddits that share a Bot account from accidentally (or intentionally) gaining access to another subreddit's config with permissions.
#### Sharing
The config that will be shared (accessed at `wiki:botconfig/contextbot|SharingSubreddit`) must have the `sharing` property defined at its top-level. If `sharing` is not defined access will be denied for all subreddits.
```yaml
sharing: false # deny access to all subreddits (default when sharing is not defined)
polling:
- newComm
runs:
# ...
```
```yaml
sharing: true # any subreddit can use this config (reddit account must also be able to access wiki page)
```
```yaml
# when a list is given all subreddit names that match any from the list are ALLOWED to access the config
# list can be regular expressions or case-insensitive strings
sharing:
- mealtimevideos
- videos
- '/Ask.*/i'
```
```yaml
# if `exclude` is used then any subreddit name that is NOT on this list can access the config
# list can be regular expressions or case-insensitive strings
sharing:
exclude:
- mealtimevideos
- videos
- '/Ask.*/i'
```
#### Examples
**Replacing A Rule with a URL Fragment**
@@ -1332,7 +1268,7 @@ runs:
subreddits:
- MyBadSubreddit
window: 7 days
actions:
actions:
- kind: report
content: 'uses freekarma subreddits and bad subreddits'
```
@@ -1367,33 +1303,6 @@ runs:
content: 'uses freekarma subreddits'
```
**Using Another Subreddit's Config**
```yaml
runs:
- `wiki:botconfig/contextbot|SharingSubreddit`
- name: MySubredditSpecificRun
checks:
- name: Free Karma Alert
description: Check if author has posted in 'freekarma' subreddits
kind: submission
rules:
- 'wiki:freeKarmaFrag'
actions:
- kind: report
content: 'uses freekarma subreddits'
```
In `r/SharingSubreddit`:
```yaml
sharing: true
runs:
- name: ARun
# ...
```
# Subreddit-Ready Examples
Refer to the [Subreddit Cookbook Examples](/docs/subreddit/components/cookbook) section to find ready-to-use configurations for common scenarios (spam, freekarma blocking, etc...). This is also a good place to familiarize yourself with what complete configurations look like.
Refer to the [Subreddit-Ready Examples](/docs/subreddit/components/subredditReady) section to find ready-to-use configurations for common scenarios (spam, freekarma blocking, etc...). This is also a good place to familiarize yourself with what complete configurations look like.

View File

@@ -8,24 +8,7 @@ The **Attribution** rule will aggregate an Author's content Attribution (youtube
Consult the [schema](https://json-schema.app/view/%23/%23%2Fdefinitions%2FCheckJson/%23%2Fdefinitions%2FAttributionJSONConfig?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Fcontext-mod%2Fmaster%2Fsrc%2FSchema%2FApp.json) for a complete reference of the rule's properties.
# [Template Variables](/docs/subreddit/actionTemplating.md)
### Examples
| Name | Description | Example |
|------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | 1 Attribution(s) met the threshold of >= 20%, with 6 (40%) of 15 Total -- window: 3 years |
| `triggeredDomainCount` | Number of domains that met the threshold | 1 |
| `window` | Number or duration of Activities considered from window | 3 years |
| `largestCount` | The count from the largest aggregated domain | 6 |
| `largestPercentage` | The percentage of Activities the largest aggregated domain comprises | 40% |
| `smallestCount` | The count from the smallest aggregated domain | 1 |
| `smallestPercentage` | The percentage of Activities the smallest aggregated domain comprises | 6% |
| `countRange` | A convenience string displaying "smallestCount - largestCount" or just one number if both are the same | 5 |
| `percentRange` | A convenience string displaying "smallestPercentage - largestPercentage" or just one percentage if both are the same | 34% |
| `domainsDelim` | A comma-delimited list of all the domain URLs that met the threshold | youtube.com/example1, youtube.com/example2, reuters.com |
| `titlesDelim` | A comma-delimited list of friendly-names of the domain if one is present, otherwise the URL (IE youtube.com/c/34ldfa343 => "My Youtube Channel Title") | My Channel A, My Channel B, reuters.com |
| `threshold` | The threshold you configured for this Rule to trigger | `>= 20%` |
# Examples
* Self Promotion as percentage of all Activities [YAML](/docs/subreddit/components/attribution/redditSelfPromoAll.yaml) | [JSON](/docs/subreddit/components/attribution/redditSelfPromoAll.json5) - Check if Author is submitting much more than they comment.
* Self Promotion as percentage of all Activities [YAML](/docs/subreddit/components/attribution/redditSelfPromoAll.yaml) | [JSON](/docs/subreddit/components/attribution/redditSelfPromoAll.json5) - Check if Author is submitting much more than they comment.
* Self Promotion as percentage of Submissions [YAML](/docs/subreddit/components/attribution/redditSelfPromoSubmissionsOnly.yaml) | [JSON](/docs/subreddit/components/attribution/redditSelfPromoSubmissionsOnly.json5) - Check if any of Author's aggregated submission origins are >10% of their submissions

View File

@@ -9,7 +9,7 @@ The **Author** rule triggers if any [AuthorCriteria](https://json-schema.app/vie
* author's subreddit flair text
* author's subreddit flair css
* author's subreddit mod status
* [Toolbox User Notes](/docs/subreddit/components/userNotes)
* [Toolbox User Notes](/docs/subreddit/components/userNotes)
The Author **Rule** is best used in conjunction with other Rules to short-circuit a Check based on who the Author is. It is easier to use a Rule to do this than to write **author filters** for every Rule (and makes Rules more re-usable).
@@ -18,10 +18,10 @@ Consult the [schema](https://json-schema.app/view/%23%2Fdefinitions%2FAuthorRule
### Examples
* Basic examples
* Flair new user Submission [YAML](/docs/subreddit/components/author/flairNewUserSubmission.yaml) | [JSON](/docs/subreddit/components/author/flairNewUserSubmission.json5) - If the Author does not have the `vet` flair then flair the Submission with `New User`
* Flair vetted user Submission [YAML](/docs/subreddit/components/author/flairNewUserSubmission.yaml) | [JSON](/docs/subreddit/components/author/flairNewUserSubmission.json5) - If the Author does have the `vet` flair then flair the Submission with `Vetted`
* Flair new user Submission [YAML](/docs/subreddit/components/author/flairNewUserSubmission.yaml) | [JSON](/docs/subreddit/components/author/flairNewUserSubmission.json5) - If the Author does not have the `vet` flair then flair the Submission with `New User`
* Flair vetted user Submission [YAML](/docs/subreddit/components/author/flairNewUserSubmission.yaml) | [JSON](/docs/subreddit/components/author/flairNewUserSubmission.json5) - If the Author does have the `vet` flair then flair the Submission with `Vetted`
* Used with other Rules
* Ignore vetted user [YAML](/docs/subreddit/components/author/flairNewUserSubmission.yaml) | [JSON](/docs/subreddit/components/author/flairNewUserSubmission.json5) - Short-circuit the Check if the Author has the `vet` flair
* Ignore vetted user [YAML](/docs/subreddit/components/author/flairNewUserSubmission.yaml) | [JSON](/docs/subreddit/components/author/flairNewUserSubmission.json5) - Short-circuit the Check if the Author has the `vet` flair
## Filter
@@ -35,7 +35,7 @@ All **Rules** and **Checks** have an optional `authorIs` property that takes an
### Examples
* Skip recent activity check based on author [YAML](/docs/subreddit/components/author/authorFilter.yaml) | [JSON](/docs/subreddit/components/author/authorFilter.json5) - Skip a Recent Activity check for a set of subreddits if the Author of the Submission has any set of flairs.
* Skip recent activity check based on author [YAML](/docs/subreddit/components/author/authorFilter.yaml) | [JSON](/docs/subreddit/components/author/authorFilter.json5) - Skip a Recent Activity check for a set of subreddits if the Author of the Submission has any set of flairs.
## Flair users and submissions
@@ -45,4 +45,4 @@ Consult [User Flair schema](https://json-schema.app/view/%23%2Fdefinitions%2FUse
### Examples
* OnlyFans submissions [YAML](/docs/subreddit/components/author/onlyfansFlair.yaml) | [JSON](/docs/subreddit/components/author/onlyfansFlair.json5) - Check whether submitter has typical OF keywords in their profile and flair both author + submission accordingly.
* OnlyFans submissions [YAML](/docs/subreddit/components/author/onlyfansFlair.yaml) | [JSON](/docs/subreddit/components/author/onlyfansFlair.json5) - Check whether submitter has typical OF keywords in their profile and flair both author + submission accordingly.

View File

@@ -40,7 +40,7 @@
// for this to pass the Author of the Submission must not have the flair "Supreme Memer" and have the name "user1" or "user2"
{
"flairText": ["Supreme Memer"],
"name": ["user1","user2"]
"names": ["user1","user2"]
},
{
// for this to pass the Author of the Submission must not have the flair "Decent Memer"

View File

@@ -30,7 +30,7 @@ runs:
# for this to pass the Author of the Submission must not have the flair "Supreme Memer" and have the name "user1" or "user2"
- flairText:
- Supreme Memer
name:
names:
- user1
- user2
# for this to pass the Author of the Submission must not have the flair "Decent Memer"

View File

@@ -1,201 +0,0 @@
# ContextMod Cookbook
Here you will find useful configs for CM that provide real-world functionality. This is where you should look first for **"how do i..."** questions.
## How To Use
Each recipe includes what type of config piece it is (Rule, Check, Action, Run, etc...). Keep this in mind before copy-pasting to make sure it goes in the right place in your config.
### Copy-Pasting
If the type is **Check** or **Run** the recipe contents will have instructions in the comments on how to use it as a **full subreddit config** OR **by itself (default).** If not Check/Run then when copy-pasting you will need to ensure it is placed in the correct spot in your config.
### As Config Fragment
**Checks, Runs, Actions, and Rule** recipes can be referenced in your config without copy-pasting by using them as [Config Fragments.](/docs/subreddit/components/README.md#partial-configurations) These need to be placed in the correct spot in your config, just like copy-pasting, but only require the URL of the recipe instead of all the code.
To use a recipe as a fragment **copy** the URL of the config and insert into your config like this:
```yaml
- 'url:https://URL_TO_CONFIG'
```
EXAMPLE: Using the **Config** link from the [Free Karma](#remove-submissions-from-users-who-have-used-freekarma-subs-to-bypass-karma-checks) check below -- copy the **Config** link and insert it into a full subreddit config like this:
<details>
<summary>Config</summary>
```yaml
polling:
- newSub
runs:
- name: MyFirstRun
checks:
# freekarma check
- 'url:https://github.com/FoxxMD/context-mod/blob/master/docs/subreddit/components/cookbook/freekarma.yaml'
- name: MyRegularCheck
kind: submission
# ...
```
</details>
# Recipes
## Spam Prevention
### Remove submissions from users who have used 'freekarma' subs to bypass karma checks
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/freekarma.yaml)
If the user has any activity (comment/submission) in known freekarma subreddits in the past (100 activities) then remove the submission.
### Remove submissions that are consecutively spammed by the author
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/crosspostSpam.yaml)
If the user has crossposted the same submission in the past (100 activities) 4 or more times in a row then remove the submission.
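The linked config is not reproduced on this page; below is a rough sketch of the shape such a check could take, modeled on the `repeatActivity` example at the bottom of this comparison. The rule name and exact values are illustrative, not the recipe's actual contents:
```yaml
- name: crosspostSpam
  description: Remove submission if crossposted 4 or more times in a row
  kind: submission
  rules:
    - name: xpost
      kind: repeatActivity
      threshold: '>= 4' # 4 or more of the same submission in a row
      gapAllowance: 2   # tolerate up to 2 unrelated activities between repeats
      window: 100
  actions:
    - kind: remove
      enable: false # enable after confirming the behavior is acceptable
      note: 'Crossposted same submission {{rules.xpost.largestRepeat}}x'
```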
### Remove submissions if a user is flooding new
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/floodingNewSubmissions.yaml)
If the user has made more than 4 submissions in your subreddit in the last 24 hours then new submissions are removed and the user is tagged with a modnote.
### Remove submissions posted in diametrically-opposed subreddit
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/diametricSpam.yaml)
If the user makes the same submission to other subreddits that are "thematically" opposed to your subreddit it is probably spam. This check removes it. Detects all types of submissions (including images).
### Remove comments that are consecutively spammed by the author
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/commentSpam.yaml)
If the user made the same comment (with some fuzzy matching) 4 or more times in a row in the past (100 activities or 6 months) then remove the comment.
### Remove comment if it is a chat invite link spam
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/chatSpam.yaml)
This rule goes a step further than automod can by being more discretionary about how it handles this type of spam.
* Remove the comment if:
* Comment being checked contains **only** a chat link (no other text) OR
* Chat links appear **anywhere** in three or more of the last 100 comments the Author has made
This way ContextMod can more easily distinguish between these use cases for a user commenting with a chat link:
* actual spammers who only spam a chat link
* users who may comment with a link but have context for it either in the current comment or in their history
* users who may comment with a link but it's a one-off event (no other links historically)
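The chat-link config is only linked above, so here is a rough, hedged sketch of how such a check could be shaped using only constructs that appear elsewhere on this page (regex criteria with `totalMatchThreshold` and a `window`). The rule names, the placeholder link pattern, and the `condition: OR` join are illustrative assumptions, and the history criterion only approximates "three or more of the last 100 comments":
```yaml
- name: chatInviteSpam
  description: Remove chat invite link spam
  kind: comment
  # assumed: trigger the check if EITHER rule below matches
  condition: OR
  rules:
    # comment body is nothing but a chat invite link
    - name: onlyChatLink
      kind: regex
      criteria:
        # placeholder pattern -- substitute the chat services you care about
        - regex: '/^\s*https?:\/\/(chat\.whatsapp\.com|t\.me)\/\S+\s*$/i'
    # roughly: chat links appear 3 or more times in the last 100 comments
    - name: chatLinkHistory
      kind: regex
      criteria:
        - regex: '/https?:\/\/(chat\.whatsapp\.com|t\.me)\/\S+/i'
          totalMatchThreshold: '>= 3'
          window:
            count: 100
            fetch: comments
          testOn:
            - body
  actions:
    - kind: remove
      enable: false # enable after confirming the behavior is acceptable
      note: chat invite link spam
```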
## Repost Detection
### Remove comments reposted from youtube video submissions
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/youtubeCommentRepost.yaml)
**Requires the bot to have an API Key for Youtube.**
Removes comment on reddit if the same comment is found on the youtube video the submission is for.
### Remove comments reposted from reddit submissions
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/commentRepost.yaml)
Checks top-level comments on submissions younger than 30 minutes:
* Finds other reddit submissions based on crosspost/duplicates/title/URL, takes top 10 submissions based on # of upvotes
* If this comment matches any top comments from those other submissions with at least 85% sameness then it is considered a repost and removed
### Remove reposted reddit submission
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/submissionRepost.yaml)
Checks reddit for top posts with a **Title** that is 90% or more similar to the submission being checked and removes it, if found.
## Self Promotion
### Remove link submissions where the user's history is comprised of 10% or more of the same link
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/selfPromo.yaml)
If the link origin (youtube author, twitter author, etc. or regular domain for non-media links)
* comprises 10% or more of the user's **entire** history in the past (100 activities or 6 months)
* or comprises 10% or more of the user's **submission** history in the past (100 activities or 6 months) and the user has low engagement (<50% of history is comments or >40% of comments are as OP)
then remove the submission
### Remove submissions posted in 'newtube' subreddits
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/newtube.yaml)
If the user makes the same submission to a 'newtube' or self-promotional subreddit it is removed and a modnote is added.
## Safety
### Remove comments on brigaded submissions when user has no history
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/brigadingNoHistory.yaml)
Users commenting on a brigaded submission (identified by a special submission flair) have their comment history checked -- if they have no participation in your subreddit then the comment is removed.
### Remove submissions from users with a history of sex solicitation
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/sexSolicitationHistory.yaml)
If the author of a submission has submissions in their history that match common reddit "sex solicitation" tags (MFA, R4F, M4F, etc...) the submission is removed and a modnote added.
This is particularly useful for subreddits with underage audiences or mentally/emotionally vulnerable groups.
The check can be modified to remove comments by changing `kind: submission` to `kind: comment`
## Verification
### Verify users from r/TranscribersOfReddit
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/transcribersOfReddit.yaml)
[r/TranscribersOfReddit](https://www.reddit.com/r/transcribersofreddit) is a community of volunteers transcribing images and videos, across reddit, into plain text.
This Check detects their standard transcription template and also checks they have a history in r/transcribersofreddit -- then approves the comment and flairs the user with **Transcriber ✍️**
### Require submission authors have prior subreddit participation
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/requireNonOPParticipation.yaml)
Submission is removed if the author has **less than 5 non-OP comments** in your subreddit prior to making the submission.
### Require submission authors make a top-level comment within 15 minutes of posting
* Type: **Check**
* [Config](/docs/subreddit/components/cookbook/requireNonOPParticipation.yaml)
After making a submission the author must make a top-level comment with a regex-checkable pattern within X minutes. If the comment is not made the submission is removed.
## Monitoring
### Sticky a comment on popular submissions
* Type: **Run**
* [Config](/docs/subreddit/components/cookbook/popularSubmissionMonitoring.yaml)
This **Run** should come after any other Runs you have that may remove a Submission.
The Run will cause CM to check new submissions for 3 hours at a 10 minute interval. The bot will then make a comment and sticky it WHEN it detects the number of upvotes is abnormal for how long the Submission has been "alive".

View File

@@ -1,44 +0,0 @@
#polling:
# - newComm
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Report comments from users with no history in the subreddit IF the submission is flaired as being brigaded
# optionally, remove comment
#
- name: Brigading No History
kind: comment
# only runs on comments in a submission with a link flair css class of 'brigaded'
itemIs:
- submissionState:
# can use any or all of these to detect brigaded submission
- link_flair_css: brigaded
#flairTemplate: 123-1234
#link_flair_text: Restricted
rules:
- name: noHistory
kind: recentActivity
# check last 100 activities that have not been removed
window:
count: 100
filterOn:
post:
commentState:
include:
- removed: false
thresholds:
# triggers if user has only one activity (this one) in your subreddit
- subreddits:
- MYSUBREDDIT
threshold: '<= 1'
actions:
- kind: report
enable: true
content: User has no history in subreddit
- kind: remove
enable: false
note: User has no history in subreddit

View File

@@ -1,71 +0,0 @@
#polling:
# - newSub
# - newComm
#runs:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a series of RUNS
- name: approvals
checks:
- name: approveSubmissionOnComment
description: Approve an unapproved submission when OP comments with the magic words
kind: comment
itemIs:
# only check comment if submission is not approved and this comment is by OP
- submissionState:
- approved: false
op: true
rules:
- name: OPMagic
kind: regex
criteria:
# YOU NEED TO EDIT THIS REGEX TO MATCH THE PATTERN THE OP'S COMMENT SHOULD HAVE IN ORDER TO VERIFY THE SUBMISSION
- regex: '/Say Please/i'
actions:
- kind: approve
targets:
- parent
- self
# cancel any delayed dispatched actions
- kind: cancelDispatch
# tell action to look for delayed items matching the parent (submission)
target: parent
# submission must have 'subVerification' identifier
identifier: subVerification
- name: verification
checks:
- name: waitForVerification
description: Delay processing this submission for 15 minutes
kind: submission
itemIs:
# only dispatch if this is the first time we are seeing this submission
- source:
- "poll:newSub"
- user
actions:
- kind: dispatch
target: self
# unique identifier which is a nice hint in the UI and also allows targeting this item while it is delayed
identifier: subVerification
delay: "15 minutes"
# when it is reprocessed go directly to the 'verification' run, skipping everything else
goto: verification
- name: removeNoVerification
description: Remove submission if it is not verified after delay
kind: submission
itemIs:
# only process this submission if it comes from dispatch with the 'subVerification' identifier and is NOT approved after 15 minutes
- source: "dispatch:subVerification"
approved: false
actions:
# if this submission is being processed it has been 15 minutes and it was not cancelled by an OP comment
- kind: remove
enable: true
- kind: comment
enable: true
lock: true
distinguish: true
content: 'Your submission has been removed because you did not follow verification instructions within 15 minutes of posting.'

View File

@@ -1,54 +0,0 @@
#polling:
# - newComm
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Checks top-level comments on submissions younger than 30 minutes:
# * Finds other reddit submissions based on crosspost/duplicates/title/URL, takes top 10 submissions based on # of upvotes
# * If this comment matches any comments from those other submissions with at least 85% sameness then it is considered a repost
#
# optionally, bans user if they have more than one modnote for comment reposts
#
- name: commRepost
description: Check if comment has been reposted from reddit
kind: comment
itemIs:
- removed: false
approved: false
op: false
# top level comments only
depth: '< 1'
submissionState:
- age: '< 30 minutes'
condition: AND
rules:
- name: commRepost
kind: repost
criteria:
- searchOn:
- external
actions:
- kind: remove
spam: true
note: 'reposted comment from reddit with {{rules.commrepost.closestSameness}}% sameness'
- kind: ban
authorIs:
# if the author has more than one spamwatch usernote then just ban em
include:
- modActions:
- noteType: SPAM_WATCH
note: "/comment repost.*/i"
search: total
count: "> 1"
message: You have been banned for repeated spammy behavior including reposting reddit comments
note: reddit comment repost + spammy behavior
reason: reddit comment repost + spammy behavior
- name: commRepostModNote
kind: modnote
content: 'Reddit comment repost with {{rules.commrepost.closestSameness}}% sameness'
type: SPAM_WATCH

View File

@@ -1,34 +0,0 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
- name: diametricSpam
description: Check if author has posted the same image in opposite subs
kind: submission
rules:
- name: recent
kind: recentActivity
useSubmissionAsReference: true
# requires your subreddit to be running on a CM instance that supports image processing
imageDetection:
enable: true
threshold: 5
lookAt: submissions
window: 30
thresholds:
- threshold: ">= 1"
subreddits:
- AnotherSubreddit
actions:
- kind: remove
enable: true
content: "Posted same image in {{rules.recent.subSummary}}"
- kind: comment
distinguish: true
sticky: true
lock: true
content: 'You have posted the same image in another subreddit ({{rules.recent.subSummary}}) that does not make sense given the theme of this subreddit. We consider this spam and it has been removed.'

View File

@@ -1,34 +0,0 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Add a mod note to users who are making more than 4 submissions a day
# and optionally remove new submissions by them
#
- name: Flooding New
description: Detect users making more than 4 submissions in 24 hours
kind: submission
rules:
- name: Recent In Sub
kind: recentActivity
useSubmissionAsReference: false
window:
duration: 24 hours
fetch: submissions
thresholds:
- subreddits:
# change this to your subreddit
- MYSUBREDDIT
threshold: "> 4"
actions:
- kind: modnote
type: SPAM_WATCH
content: '{{rules.recentinsub.totalCount}} submissions in the last 24 hours'
- kind: remove
enable: false
note: '{{rules.recentinsub.totalCount}} submissions in the last 24 hours'

View File

@@ -1,45 +0,0 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Remove submissions from users who have recent activity in freekarma subs in the last 100 activities
#
- name: freekarma removal
description: Remove submission if user has used freekarma sub recently
kind: submission
rules:
- name: freekarma
kind: recentActivity
window: 100
useSubmissionAsReference: false
thresholds:
- subreddits:
- FreeKarma4U
- FreeKarma4You
- freekarmaforyou
- KarmaFarming4Pros
- KarmaStore
- upvote
- promote
- shamelessplug
- upvote
- FreeUpVotes
- GiveMeKarma
- nsfwkarma
- GetFreeKarmaAnyTime
- freekarma2021
- FreeKarma2022
- KarmaRocket
- FREEKARMA4PORN
actions:
- kind: report
enable: false
content: 'Remove => {{rules.freekarma.totalCount}} activities in freekarma subs'
- kind: remove
enable: true
note: '{{rules.freekarma.totalCount}} activities in freekarma subs'

View File

@@ -1,55 +0,0 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Add a mod note to users who make a submission that is also posted to a 'newtube' subreddit
# and optionally remove new submission
#
- name: Newtube Submission
description: Tag user if submission was posted in 'newtube' subreddit
kind: submission
rules:
- name: newTube
kind: recentActivity
window:
count: 100
fetch: submissions
thresholds:
- subreddits:
- AdvertiseYourVideos
- BrandNewTube
- FreeKarma4U
- FreeKarma4You
- KarmaStore
- GetMoreSubsYT
- GetMoreViewsYT
- NewTubers
- promote
- PromoteGamingVideos
- shamelessplug
- SelfPromotionYouTube
- SmallYTChannel
- SmallYoutubers
- upvote
- youtubestartups
- YouTube_startups
- YoutubeSelfPromotions
- YoutubeSelfPromotion
- YouTubeSubscribeBoost
- youtubepromotion
- YTPromo
- Youtubeviews
- YouTube_startups
actions:
- name: newtubeModTag
kind: modnote
type: SPAM_WATCH
content: 'New Tube => {{rules.newtube.subSummary}}{{rules.newtubeall.subSummary}}'
- kind: remove
enable: false
note: 'New Tube => {{rules.newtube.subSummary}}{{rules.newtubeall.subSummary}}'

View File

@@ -1,89 +0,0 @@
polling:
- newSub
runs:
- name: MyRegularRun
itemIs:
# regular run/checks should only run on new activities or if from dashboard
- source:
- 'poll:newSub'
- 'poll:newComm'
- 'user'
checks:
- name: RuleBreakingCheck1
kind: submission
# ...
#
# your regular checks go here
#
# assuming if a Submission makes it through all of your Checks then it is "OK"
# to be Approved or generally will be visible in the subreddit (valid for monitoring for r/All)
# -- at the end of the Run add a Dispatch action
- name: Dispatch For Popular Monitoring
kind: submission
actions:
- kind: dispatch
identifier: 'popular'
# CM will wait 5 minutes before processing this submission again
delay: '5 minutes'
target: 'self'
# a separate run that only processes Submissions from dispatch:popular
- name: PopularWatch
itemIs:
- source: 'dispatch:popular'
checks:
# each check here looks at submission age and tests upvotes against what you think is probably r/All number of votes
# in descending age (oldest first)
# NOTE: You should change the 'age' and 'score' tests to fit the traffic volume for your subreddit!
- name: Two Hour Check
kind: submission
itemIs:
- age: '>= 2 hours'
score: '> 100'
actions:
- kind: comment
name: popularComment
content: 'Looks like this thread is getting a lot of attention. Greetings r/All! Please keep it civil.'
sticky: true
distinguish: true
lock: true
- name: One Hour Check
kind: submission
itemIs:
- age: '>= 1 hours'
score: '> 50'
actions:
- popularComment
- name: Thirty Minute Check
kind: submission
itemIs:
- age: '>= 30 minutes'
score: '> 25'
actions:
- popularComment
- name: Ten Minute Check
kind: submission
itemIs:
- age: '>= 10 minutes'
score: '> 10'
actions:
- popularComment
# finally, if none of the popular checks passed re-dispatch submission to be checked in another 10 minutes
- name: Delay Popular Check
kind: submission
postTrigger:
# don't need to add this to Actioned Events
recordTo: false
itemIs:
# only monitor until submission is 3 hours old
- age: '<= 3 hours'
actions:
- kind: dispatch
identifier: 'popular'
delay: '10 minutes'
target: 'self'

View File

@@ -1,51 +0,0 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Report submissions by users with less than 5 non-OP comments in our subreddit
# and optionally remove the submission
#
- name: RequireEngagement
description: Remove submission if author has less than X non-op comments in our subreddit
kind: submission
rules:
- name: LittleEngagement
kind: recentActivity
lookAt: comments
useSubmissionAsReference: false
# bot will check the last 100 NON-OP comments from user's history
window:
count: 100
fetch: comments
filterOn:
post:
commentState:
- op: false
thresholds:
subreddits:
- MYSUBREDDIT
# rule is "triggered" if there are LESS THAN 5 comments in our subreddit in the window specified (currently 100 non-op comments)
threshold: '< 5'
actions:
- kind: report # report the submission
enable: true
# the text of the report
content: 'User has <5 non-OP comments in last 100 comments'
- kind: remove # remove the submission
enable: false
note: 'User has <5 non-OP comments in last 100 comments'
- kind: comment # reply to submission with a comment
enable: false
# contents of the comment
content: We require users to have a minimum level of engagement (>5 comments on other people's posts) in our subreddit before making submissions. Your submission has been automatically removed.
sticky: true
distinguish: true
lock: true

View File

@@ -1,33 +0,0 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Remove submission if user has any "redditor for [sex]..." submissions in their history
# and optionally bans user
#
- name: sexSpamHistory
description: Detect sex spam language in recent history and ban if found (most likely a bot)
kind: submission
rules:
- kind: regex
name: redditorFor
criteria:
# matches if text has common "looking for" acronym like F4M R4A etc...
- regex: '/[RFM]4[a-zA-Z\s0-9]/i'
totalMatchThreshold: "> 1"
window: 100
testOn:
- body
- title
actions:
- kind: remove
enable: true
note: 'Has sex solicitation submission history: {{rules.redditorfor.matchSample}}'
- kind: modnote
type: ABUSE_WARNING
content: 'Has sex solicitation submission history: {{rules.redditorfor.matchSample}}'

View File

@@ -1,31 +0,0 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
- name: BotRepost
description: Remove submission if it is likely a repost
kind: submission
rules:
# search reddit for similar submissions to see if it is a repost
- name: subRepost
kind: repost
criteria:
- searchOn:
# match sameness of found Submissions using their title against the title of the Submission being checked
- kind: title
# sameness (confidence) % of a title required to consider Submission being checked as a repost
matchScore: 90
actions:
# report the submission
- kind: report
enable: true
content: '{{rules.subrepost.closestSameness}} confidence this is a repost.'
# remove the submission
- kind: remove
enable: false
note: '{{rules.subrepost.closestSameness}} confidence this is a repost.'

View File

@@ -1,41 +0,0 @@
#polling:
# - newComm
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Detect top-level comments by users from r/transcribersofreddit
# and approve/flair the user
#
- name: transcriber comment
description: approve/flair transcribed video comment
kind: comment
itemIs:
# top-level comments
depth: '< 1'
condition: AND
rules:
- name: transcribedVideoFormat
kind: regex
criteria:
- regex: '/^[\n\r\s]*\*Video Transcription\*[\n\r]+---[\S\s]+---/gim'
- name: transcribersActivity
kind: recentActivity
window:
count: 100
duration: 1 week
useSubmissionAsReference: false
thresholds:
- subreddits:
- transcribersofreddit
actions:
- kind: approve
- name: flairTranscriber
kind: flair
authorIs:
exclude:
- flairText:
- Transcriber ✍️
text: Transcriber ✍️

View File

@@ -1,47 +0,0 @@
#polling:
# - newComm
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# If the submission is a youtube video CM will check top comments on the video and remove the comment if it is at least 85% the same
# optionally, bans user if they have more than one modnote for comment reposts
#
- name: commRepostYT
description: Check if comment has been reposted from youtube
kind: comment
itemIs:
- removed: false
approved: false
op: false
condition: AND
rules:
- name: commRepost
kind: repost
criteria:
- searchOn:
- external
actions:
- kind: remove
spam: true
note: 'reposted comment from youtube with {{rules.commrepostyt.closestSameness}}% sameness'
- kind: ban
authorIs:
# if the author has more than one spamwatch usernote then just ban em
include:
- modActions:
- noteType: SPAM_WATCH
note: "/comment repost.*/i"
search: total
count: "> 1"
message: You have been banned for repeated spammy behavior including reposting youtube comments
note: yt comment repost + spammy behavior
reason: yt comment repost + spammy behavior
- name: commRepostYTModNote
kind: modnote
content: 'YT comment repost with {{rules.commrepostyt.closestSameness}}% sameness'
type: SPAM_WATCH

View File

@@ -46,23 +46,5 @@ Example:
### Examples
* Low Comment Engagement [YAML](/docs/subreddit/components/history/lowEngagement.yaml) | [JSON](/docs/subreddit/components/history/lowEngagement.json5) - Check if Author is submitting much more than they comment.
* OP Comment Engagement [YAML](/docs/subreddit/components/history/opOnlyEngagement.yaml) | [JSON](/docs/subreddit/components/history/opOnlyEngagement.json5) - Check if Author is mostly engaging only in their own content
# [Template Variables](/docs/subreddit/actionTemplating.md)
| Name | Description | Example |
|----------------------|------------------------------------------------------------------------|----------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | Filtered Activities (7) were < 10 Items (2 months) |
| `activityTotal` | Total number of activities from window | 50 |
| `filteredTotal` | Total number of activities filtered from window | 7 |
| `filteredPercent` | Percentage of activities filtered from window | 14% |
| `submissionTotal` | Total number of filtered submissions from window | 4 |
| `submissionPercent` | Percentage of filtered submissions from window | 8% |
| `commentTotal` | Total number of filtered comments from window | 3 |
| `commentPercent` | Percentage of filtered comments from window | 6% |
| `opTotal` | Total number of comments as OP from filtered comments | 2 |
| `opPercent` | Percentage of comments as OP from filtered comments | 66% |
| `thresholdSummary` | A text summary of the first Criteria triggered with totals/percentages | Filtered Activities (7) were < 10 Items |
| `subredditBreakdown` | A markdown list of filtered activities by subreddit | * SubredditA - 5 (71%) \n * Subreddit B - 2 (28%) |
| `window` | Number or duration of Activities considered from window | 2 months |
* Low Comment Engagement [YAML](/docs/subreddit/components/history/lowEngagement.yaml) | [JSON](/docs/subreddit/components/history/lowEngagement.json5) - Check if Author is submitting much more than they comment.
* OP Comment Engagement [YAML](/docs/subreddit/components/history/opOnlyEngagement.yaml) | [JSON](/docs/subreddit/components/history/opOnlyEngagement.json5) - Check if Author is mostly engaging only in their own content

View File

@@ -116,16 +116,6 @@ rules:
criteria: #... if specified, overrides parent-level criteria
```
# [Template Variables](/docs/subreddit/actionTemplating.md)
| Name | Description | Example |
|-----------------|-------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | Current Activity MHS Test: ✓ Confidence test (>= 90) PASSED MHS confidence of 99.85% Flagged pass condition of true (toxic) MATCHED MHS flag 'toxic' |
| `window` | Number or duration of Activities considered from window | 1 activities |
| `criteriaTest` | MHS value to test against | MHS confidence is > 95% |
| `totalMatching` | Total number of activities (current + historical) that matched `criteriaTest` | 1 |
# Examples
Report if MHS flags as toxic
@@ -143,8 +133,7 @@ Report if MHS flags as toxic with 95% confidence
```yaml
rules:
- kind: mhs
criteria:
confidence: '>= 95'
confidence: '>= 95'
actions:
- kind: report
content: 'MHS flagged => {{rules.mhs.summary}}'
@@ -169,9 +158,8 @@ Approve if MHS flags as NOT toxic with 95% confidence
```yaml
rules:
- kind: mhs
criteria:
confidence: '>= 95'
flagged: false
confidence: '>= 95'
flagged: false
actions:
- kind: approve
```

View File

@@ -27,19 +27,5 @@ Consult the [schema](https://json-schema.app/view/%23%2Fdefinitions%2FRecentActi
### Examples
* Free Karma Subreddits [YAML](/docs/subreddit/components/recentActivity/freeKarma.yaml) | [JSON](/docs/subreddit/components/recentActivity/freeKarma.json5) - Check if the Author has recently posted in any "free karma" subreddits
* Submission in Free Karma Subreddits [YAML](/docs/subreddit/components/recentActivity/freeKarmaOnSubmission.yaml) | [JSON](/docs/subreddit/components/recentActivity/freeKarmaOnSubmission.json5) - Check if the Author has posted the Submission this check is running on in any "free karma" subreddits recently
# [Template Variables](/docs/subreddit/actionTemplating.md)
| Name | Description | Example |
|----------------------|------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | 9 activities found in 2 of the specified subreddits (out of 21 total) MET threshold of >= 1 activities -- subreddits: SubredditA, SubredditB |
| `window` | Number or duration of Activities considered from window | 100 activities |
| `subSummary` | Comma-delimited list of subreddits matched by the criteria | SubredditA, SubredditB |
| `subCount` | Number of subreddits that match the criteria | 2 |
| `totalCount` | Total number of activities found by criteria | 9 |
| `threshold` | The threshold used to trigger the rule | `>= 1` |
| `karmaThreshold` | If present, the karma threshold used to trigger the rule | `> 5` |
| `combinedKarma` | Total number of karma gained from the matched activities | 10 |
| `subredditBreakdown` | A markdown list of filtered activities by subreddit | * SubredditA - 5 (71%) \n * Subreddit B - 2 (28%) |
* Free Karma Subreddits [YAML](/docs/subreddit/components/recentActivity/freeKarma.yaml) | [JSON](/docs/subreddit/components/recentActivity/freeKarma.json5) - Check if the Author has recently posted in any "free karma" subreddits
* Submission in Free Karma Subreddits [YAML](/docs/subreddit/components/recentActivity/freeKarmaOnSubmission.yaml) | [JSON](/docs/subreddit/components/recentActivity/freeKarmaOnSubmission.json5) - Check if the Author has posted the Submission this check is running on in any "free karma" subreddits recently

View File

@@ -11,19 +11,12 @@ Which can then be used in conjunction with a [`window`](https://github.com/FoxxM
### Examples
* Trigger if regex matches against the current activity - [YAML](/docs/subreddit/components/regex/matchAnyCurrentActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchAnyCurrentActivity.json5)
* Trigger if regex matches 5 times against the current activity - [YAML](/docs/subreddit/components/regex/matchThresholdCurrentActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchThresholdCurrentActivity.json5)
* Trigger if regex matches against any part of a Submission - [YAML](/docs/subreddit/components/regex/matchSubmissionParts.yaml) | [JSON](/docs/subreddit/components/regex/matchSubmissionParts.json5)
* Trigger if regex matches any of Author's last 10 activities - [YAML](/docs/subreddit/components/regex/matchHistoryActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchHistoryActivity.json5)
* Trigger if regex matches at least 3 of Author's last 10 activities - [YAML](/docs/subreddit/components/regex/matchActivityThresholdHistory.json5) | [JSON](/docs/subreddit/components/regex/matchActivityThresholdHistory.json5)
* Trigger if there are 5 regex matches in the Author's last 10 activities - [YAML](/docs/subreddit/components/regex/matchTotalHistoryActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchTotalHistoryActivity.json5)
* Trigger if there are 5 regex matches in the Author's last 10 comments - [YAML](/docs/subreddit/components/regex/matchSubsetHistoryActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchSubsetHistoryActivity.json5)
* Remove comments that are spamming discord links - [YAML](/docs/subreddit/components/regex/removeDiscordSpam.yaml) | [JSON](/docs/subreddit/components/regex/removeDiscordSpam.json5)
* Trigger if regex matches against the current activity - [YAML](/docs/subreddit/components/regex/matchAnyCurrentActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchAnyCurrentActivity.json5)
* Trigger if regex matches 5 times against the current activity - [YAML](/docs/subreddit/components/regex/matchThresholdCurrentActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchThresholdCurrentActivity.json5)
* Trigger if regex matches against any part of a Submission - [YAML](/docs/subreddit/components/regex/matchSubmissionParts.yaml) | [JSON](/docs/subreddit/components/regex/matchSubmissionParts.json5)
* Trigger if regex matches any of Author's last 10 activities - [YAML](/docs/subreddit/components/regex/matchHistoryActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchHistoryActivity.json5)
* Trigger if regex matches at least 3 of Author's last 10 activities - [YAML](/docs/subreddit/components/regex/matchActivityThresholdHistory.json5) | [JSON](/docs/subreddit/components/regex/matchActivityThresholdHistory.json5)
* Trigger if there are 5 regex matches in the Author's last 10 activities - [YAML](/docs/subreddit/components/regex/matchTotalHistoryActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchTotalHistoryActivity.json5)
* Trigger if there are 5 regex matches in the Author's last 10 comments - [YAML](/docs/subreddit/components/regex/matchSubsetHistoryActivity.yaml) | [JSON](/docs/subreddit/components/regex/matchSubsetHistoryActivity.json5)
* Remove comments that are spamming discord links - [YAML](/docs/subreddit/components/regex/removeDiscordSpam.yaml) | [JSON](/docs/subreddit/components/regex/removeDiscordSpam.json5)
* Differs from just using automod because this config can allow one-off/organic links from users who DO NOT spam discord links but will still remove the comment if the user is spamming them
# [Template Variables](/docs/subreddit/actionTemplating.md)
| Name | Description | Example |
|---------------|---------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | Criteria 1 ✓ -- Activity Match ✓ => 1 > 0 (Threshold > 0) and 1 Total Matches (Window: 1 Item) -- Matched Values: "example.com/test" |
| `matchSample` | A comma-delimited list of matches from activities | "example.com/test" |

View File

@@ -47,16 +47,5 @@ With only `gapAllowance: 2` this rule **would trigger** because the 1 and 2
## Examples
* Crosspost Spamming [JSON](/docs/subreddit/components/repeatActivity/crosspostSpamming.json5) | [YAML](/docs/subreddit/components/repeatActivity/crosspostSpamming.yaml) - Check if an Author is spamming their Submissions across multiple subreddits
* Burst-posting [JSON](/docs/subreddit/components/repeatActivity/burstPosting.json5) | [YAML](/docs/subreddit/components/repeatActivity/burstPosting.yaml) - Check if Author is crossposting their Submissions in short bursts
# [Template Variables](/docs/subreddit/actionTemplating.md)
| Name | Description | Example |
|-----------------------|---------------------------------------------------------|-------------------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | 1 of 1 unique items repeated >= 7 times, largest repeat: 22 |
| `window` | Number or duration of Activities considered from window | 100 activities |
| `threshold` | Number of repeats that trigger rule | `>= 7` |
| `totalTriggeringSets` | Number of sets of repeats that matched threshold | 1 |
| `largestRepeat` | The largest number of repeats in a single set | 22 |
| `gapAllowance` | Number of non-repeat activities allowed between repeats | 2 |
* Crosspost Spamming [JSON](/docs/subreddit/components/repeatActivity/crosspostSpamming.json5) | [YAML](/docs/subreddit/components/repeatActivity/crosspostSpamming.yaml) - Check if an Author is spamming their Submissions across multiple subreddits
* Burst-posting [JSON](/docs/subreddit/components/repeatActivity/burstPosting.json5) | [YAML](/docs/subreddit/components/repeatActivity/burstPosting.yaml) - Check if Author is crossposting their Submissions in short bursts

View File

@@ -267,15 +267,6 @@ When the rule is run in a **Comment Check** you may specify text comparisons (li
* **minWordCount** -- The minimum number of words a comment must have
* **caseSensitive** -- If the match comparison should be case-sensitive (defaults to `false`)
# [Template Variables](/docs/subreddit/actionTemplating.md)
| Name | Description | Example |
|-------------------|--------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | Searched top 0 comments in top 10 most popular submissions, 50 Youtube comments and found 2 reposts. --- Closest Match => >> The thought of this is terrifying << from Youtube (https://youtube.com/watch?v=example) with 100.00% sameness. |
| `closestSummary` | If rule was triggered, the reposted activity type and where it came from | matched a comment from youtube |
| `closestSameness` | If rule was triggered, the sameness of repost to the current activity | 100% |
# Examples
Examples of a *full* CM configuration, including the Repost Rule, in various scenarios. In each scenario the parts of the configuration that affect the rule are indicated.

View File

@@ -181,16 +181,3 @@ rules:
actions:
- kind: remove
```
# [Template Variables](/docs/subreddit/actionTemplating.md)
| Name | Description | Example |
|---------------------------|--------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------|
| `result` | Summary of rule results (also found in Actioned Events) | Current Activity Sentiment -0.35 (-0.45) PASSED sentiment test < -0.3 |
| `triggered` | Boolean if rule was triggered or not | true |
| `sentimentTest` | The sentiment value test | `< -0.3` |
| `historicalSentimentTest` | The sentiment value test used for historical activities | `< -0.3` |
| `averageScore` | The averaged score (equal weights) for all sentiment analysis tests run on the current activity | -0.35 |
| `averageWindowScore` | The averaged score (equal weights) for all sentiment analysis tests run on historical activities | -0.35 |
| `window` | Number or duration of Activities considered from window | 100 activities |
| `totalMatching` | Number of activities that passed the sentimentTest | 1 |

View File

@@ -0,0 +1,78 @@
Provided here are **complete, ready-to-go configurations** that can be copy-pasted straight into your configuration wiki page to get going with ContextMod immediately.
These configurations attempt to provide sensible, non-destructive, default behavior for some common scenarios and subreddit types.
In most cases these will perform decently out-of-the-box but they are not perfect. You should still monitor bot behavior to see how it performs and will most likely still need to tweak these configurations to get your desired behavior.
All actions for these configurations are non-destructive in that:
* All instances where an activity would be modified (remove/ban/approve) will have `dryRun: true` set to prevent the action from actually being performed
* These instances will also have a `report` action detailing the action that would have been performed
**You will have to remove the `report` action and `dryRun` settings yourself.** This is to ensure that you understand the behavior the bot will be performing. If you are unsure of this you should leave them in place until you are certain the behavior the bot is performing is acceptable.
**YAML** is the same format as **automoderator**
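As a minimal sketch of that pattern (the report text is a placeholder; the linked configurations contain the full checks):
```yaml
actions:
  # stand-in for the destructive action -- reports what WOULD have been done
  - kind: report
    content: 'Remove => would have removed this submission'
  # the real action, held back by dryRun
  - kind: remove
    dryRun: true # delete this line (and the report action above) to go live
```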
## Submission-based Behavior
### Remove submissions from users who have used 'freekarma' subs to bypass karma checks
[YAML](/docs/subreddit/components/subredditReady/freekarma.yaml) | [JSON](/docs/subreddit/components/subredditReady/freekarma.json5)
If the user has any activity (comment/submission) in known freekarma subreddits in the past (50 activities or 6 months) then remove the submission.
### Remove submissions from users who have crossposted the same submission 4 or more times
[YAML](/docs/subreddit/components/subredditReady/crosspostSpam.yaml) | [JSON](/docs/subreddit/components/subredditReady/crosspostSpam.json5)
If the user has crossposted the same submission in the past (50 activities or 6 months) 4 or more times in a row then remove the submission.
### Remove submissions from users who have crossposted or used 'freekarma' subs
[YAML](/docs/subreddit/components/subredditReady/freeKarmaOrCrosspostSpam.yaml) | [JSON](/docs/subreddit/components/subredditReady/freeKarmaOrCrosspostSpam.json5)
Will remove the submission if either of the above two behaviors is detected.
### Remove link submissions where the same link makes up 10% or more of the user's history
[YAML](/docs/subreddit/components/subredditReady/selfPromo.yaml) | [JSON](/docs/subreddit/components/subredditReady/selfPromo.json5)
If the link origin (youtube author, twitter author, etc., or the regular domain for non-media links)
* comprises 10% or more of the user's **entire** history in the past (100 activities or 6 months)
* or comprises 10% or more of the user's **submission** history in the past (100 activities or 6 months) and the user has low engagement (<50% of their history is comments or >40% of their comments are as OP)
then remove the submission (a sketch of the 10% attribution criteria is shown below).
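A minimal sketch of just the first condition, using the `attribution` rule with the aggregated `AGG:SELF` domain shorthand from the linked configs (see those files for the low-engagement branch):
```yaml
rules:
  - name: attr
    kind: attribution
    criteria:
      # link origin makes up 10% or more of the user's last 100 activities / 6 months
      - threshold: '>= 10%'
        window:
          count: 100
          duration: 6 months
        domains:
          - AGG:SELF
```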
## Comment-based Behavior
### Remove comment if the user has posted the same comment 4 or more times in a row
[YAML](/docs/subreddit/components/subredditReady/commentSpam.yaml) | [JSON](/docs/subreddit/components/subredditReady/commentSpam.json5)
If the user made the same comment (with some fuzzy matching) 4 or more times in a row in the past (50 activities or 6 months) then remove the comment.
### Remove comment if it is discord invite link spam
[YAML](/docs/subreddit/components/subredditReady/discordSpam.yaml) | [JSON](/docs/subreddit/components/subredditReady/discordSpam.json5)
This configuration goes a step further than automod by being more discerning about how it handles this type of spam.
* Remove the comment and **ban a user** if:
  * Comment being checked contains **only** a discord link (no other text) AND
  * Discord links appear **anywhere** in three or more of the last 10 comments the Author has made
otherwise...
* Remove the comment if:
  * Comment being checked contains **only** a discord link (no other text) OR
  * Comment contains a discord link **anywhere** AND
    * Discord links appear **anywhere** in three or more of the last 10 comments the Author has made
Using these checks ContextMod can more easily distinguish between these use cases for a user commenting with a discord link:
* actual spammers who only spam a discord link
* users who may comment with a link but have context for it either in the current comment or in their history
* users who may comment with a link but it's a one-off event (no other links historically)
Additionally, you could modify either or both of these checks so that one-off discord link comments are not removed but comments from users with a historical trend of spamming links still are.
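For instance, a history-only variant would keep just the historical regex criteria from the linked configs and drop the link-only branch (sketch only; adjust the regex and window to taste):
```yaml
rules:
  - name: linkAnywhereHistoricalSpam
    kind: regex
    criteria:
      - name: contains links anywhere historically
        # single quotes are required to escape special characters
        regex: '/^.*(discord\.gg\/[\w\d]+).*$/i'
        # trigger only when 3 or more of the last 10 comments contain a discord link
        totalMatchThreshold: '>= 3'
        lookAt: comments
        window: 10
```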

View File

@@ -0,0 +1,46 @@
{
"polling": ["newComm"],
"runs": [
{
"checks": [
{
//
// Stop users who spam the same comment many times
//
// Remove a COMMENT if the user has posted the same comment at least 4 times in a row in recent history
//
"name": "low xp comment spam",
"description": "X-posted comment >=4x",
"kind": "comment",
"condition": "AND",
"rules": [
{
"name": "xPostLow",
"kind": "repeatActivity",
"gapAllowance": 2,
"threshold": ">= 4",
"window": {
"count": 50,
"duration": "6 months"
}
},
],
"actions": [
// remove this after confirming behavior is acceptable
{
"kind": "report",
"content": "Remove=> Posted same comment {{rules.xpostlow.largestRepeat}}x times"
},
//
//
{
"kind": "remove",
// remove the line below after confirming behavior is acceptable
"dryRun": true
}
]
}
]
}
],
}

View File

@@ -1,18 +1,14 @@
#polling:
# - newComm
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Remove comments by users who spam the same comment many times
#
polling:
- newComm
runs:
- checks:
# Stop users who spam the same comment many times
- name: low xp comment spam
description: X-posted comment >=4x
kind: comment
condition: AND
rules:
- name: xPostLowComm
- name: xPostLow
kind: repeatActivity
# number of "non-repeat" comments allowed between "repeat comments"
gapAllowance: 2
@@ -20,13 +16,11 @@
threshold: '>= 4'
# retrieve either the last 50 comments or 6 months of history, whichever is less
window:
count: 100
count: 50
duration: 6 months
actions:
- kind: report
enable: false
content: 'Remove => Posted same comment {{rules.xpostlowcomm.largestRepeat}}x times'
enable: true
content: 'Remove => Posted same comment {{rules.xpostlow.largestRepeat}}x times'
- kind: remove
enable: true
note: 'Posted same comment {{rules.xpostlowcomm.largestRepeat}}x times'

View File

@@ -0,0 +1,81 @@
{
"polling": ["unmoderated"],
"runs": [
{
"checks": [
{
//
// Stop users who post low-effort, crossposted spam
//
// Remove a SUBMISSION if the user has crossposted it at least 4 times in recent history AND
// less than 50% of their activity is comments OR more than 40% of those comments are as OP (in their own submissions)
//
"name": "low xp spam and engagement",
"description": "X-posted 4x and low comment engagement",
"kind": "submission",
"itemIs": [
{
"removed": false
}
],
"condition": "AND",
"rules": [
{
"name": "xPostLow",
"kind": "repeatActivity",
"gapAllowance": 2,
"threshold": ">= 4",
"window": {
"count": 50,
"duration": "6 months"
}
},
{
"name": "lowOrOpComm",
"kind": "history",
"criteriaJoin": "OR",
"criteria": [
{
"window": {
"count": 100,
"duration": "6 months"
},
"comment": "< 50%"
},
{
"window": {
"count": 100,
"duration": "6 months"
},
"comment": "> 40% OP"
}
]
}
],
"actions": [
// remove this after confirming behavior is acceptable
{
"kind": "report",
"content": "Remove=>{{rules.xpostlow.largestRepeat}} X-P => {{rules.loworopcomm.thresholdSummary}}"
},
//
//
{
"kind": "remove",
// remove the line below after confirming behavior is acceptable
"dryRun": true
},
// optionally remove "dryRun" from below if you want to leave a comment on removal
// PROTIP: the comment is bland, you should make it better
{
"kind": "comment",
"content": "Your submission has been removed because you cross-posted it {{rules.xpostlow.largestRepeat}} times and you have very low engagement outside of making submissions",
"distinguish": true,
"dryRun": true
}
]
}
]
}
],
}

View File

@@ -1,11 +1,7 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
polling:
- unmoderated
runs:
- checks:
# stop users who post low-effort, crossposted spam submissions
#
# Remove a SUBMISSION if the user has crossposted it at least 4 times in recent history AND
@@ -22,7 +18,7 @@
gapAllowance: 2
threshold: '>= 4'
window:
count: 100
count: 50
duration: 6 months
- name: lowOrOpComm
kind: history
@@ -38,15 +34,12 @@
comment: '> 40% OP'
actions:
- kind: report
enable: false
enable: true
content: >-
Remove=>{{rules.xpostlow.largestRepeat}} X-P =>
{{rules.loworopcomm.thresholdSummary}}
- kind: remove
enable: true
note: 'Repeated submission {{rules.xpostlow.largestRepeat}}x and low comment engagement'
- kind: comment
enable: true
content: >-

View File

@@ -0,0 +1,79 @@
{
"polling": ["newComm"],
"runs": [
{
"checks": [
{
"name": "ban discord only spammer",
"description": "ban a user who spams only a discord link many times historically",
"kind": "comment",
"condition": "AND",
"rules": [
"linkOnlySpam",
"linkAnywhereHistoricalSpam",
],
"actions": [
{
"kind": "remove"
},
{
"kind": "ban",
"content": "spamming discord links"
}
]
},
{
"name": "remove discord spam",
"description": "remove comments from users who only link to discord or mention discord link many times historically",
"kind": "comment",
"condition": "OR",
"rules": [
{
"name": "linkOnlySpam",
"kind": "regex",
"criteria": [
{
"name": "only link",
"regex": "/^.*(discord\\.gg\\/[\\w\\d]+)$/i",
}
]
},
{
"condition": "AND",
"rules": [
{
"name": "linkAnywhereSpam",
"kind": "regex",
"criteria": [
{
"name": "contains link anywhere",
"regex": "/^.*(discord\\.gg\\/[\\w\\d]+).*$/i",
}
]
},
{
"name": "linkAnywhereHistoricalSpam",
"kind": "regex",
"criteria": [
{
"name": "contains links anywhere historically",
"regex": "/^.*(discord\\.gg\\/[\\w\\d]+).*$/i",
"totalMatchThreshold": ">= 3",
"lookAt": "comments",
"window": 10
}
]
}
]
}
],
"actions": [
{
"kind": "remove"
}
]
}
]
}
],
}

View File

@@ -1,18 +1,9 @@
#polling:
# - newComm
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
#
# Remove comments from users who spam discord and telegram links
# -- differs from just using automod:
# 1) removes comment if it is ONLY discord/telegram link
# 2) if not *only* link then checks user's history to see if link is spammed many times and only removes if it is
#
- name: ban chat only spammer
description: ban a user who spams only a chat link many times historically
polling:
- newComm
runs:
- checks:
- name: ban discord only spammer
description: ban a user who spams only a discord link many times historically
kind: comment
condition: AND
rules:
@@ -22,9 +13,9 @@
- kind: remove
- kind: ban
content: spamming discord links
- name: remove chat spam
- name: remove discord spam
description: >-
remove comments from users who only link to chat or mention chat
remove comments from users who only link to discord or mention discord
link many times historically
kind: comment
condition: OR
@@ -33,9 +24,8 @@
kind: regex
criteria:
- name: only link
# https://regexr.com/70j9m
# single quotes are required to escape special characters
regex: '/^\s*((?:discord\.gg|t\.me|telegram\.me|telegr\.im)\/[\w\d]+)\s*$/i'
regex: '/^.*(discord\.gg\/[\w\d]+)$/i'
- condition: AND
rules:
- name: linkAnywhereSpam
@@ -43,16 +33,15 @@
criteria:
- name: contains link anywhere
# single quotes are required to escape special characters
regex: '/((?:discord\.gg|t\.me|telegram\.me|telegr\.im)\/[\w\d]+)/i'
regex: '/^.*(discord\.gg\/[\w\d]+).*$/i'
- name: linkAnywhereHistoricalSpam
kind: regex
criteria:
- name: contains links anywhere historically
# single quotes are required to escape special characters
regex: '/((?:discord\.gg|t\.me|telegram\.me|telegr\.im)\/[\w\d]+)/i'
regex: '/^.*(discord\.gg\/[\w\d]+).*$/i'
totalMatchThreshold: '>= 3'
lookAt: comments
window: 100
window: 10
actions:
- kind: remove
note: Chat spam link

View File

@@ -0,0 +1,142 @@
{
"polling": [
"unmoderated"
],
"runs": [
{
"checks": [
{
//
// Stop users who post low-effort, crossposted spam
//
// Remove a SUBMISSION if the user has crossposted it at least 4 times in recent history AND
// less than 50% of their activity is comments OR more than 40% of those comments are as OP (in their own submissions)
//
"name": "remove on low xp spam and engagement",
"description": "X-posted 4x and low comment engagement",
"kind": "submission",
"itemIs": [
{
"removed": false
}
],
"condition": "AND",
"rules": [
{
"name": "xPostLow",
"kind": "repeatActivity",
"gapAllowance": 2,
"threshold": ">= 4",
"window": {
"count": 50,
"duration": "6 months"
}
},
{
"name": "lowOrOpComm",
"kind": "history",
"criteriaJoin": "OR",
"criteria": [
{
"window": {
"count": 100,
"duration": "6 months"
},
"comment": "< 50%"
},
{
"window": {
"count": 100,
"duration": "6 months"
},
"comment": "> 40% OP"
}
]
}
],
"actions": [
// remove this after confirming behavior is acceptable
{
"kind": "report",
"content": "Remove=>{{rules.xpostlow.largestRepeat}} X-P => {{rules.loworopcomm.thresholdSummary}}"
},
//
//
{
"kind": "remove",
// remove the line below after confirming behavior is acceptable
"dryRun": true
},
// optionally remove "dryRun" from below if you want to leave a comment on removal
// PROTIP: the comment is bland, you should make it better
{
"kind": "comment",
"content": "Your submission has been removed because you cross-posted it {{rules.xpostlow.largestRepeat}} times and you have very low engagement outside of making submissions",
"distinguish": true,
"dryRun": true
}
]
},
{
//
// Remove submissions from users who have recent activity in freekarma subs within the last 50 activities or 6 months (whichever is less)
//
"name": "freekarma removal",
"description": "Remove submission if user has used freekarma sub recently",
"kind": "submission",
"itemIs": [
{
"removed": false
}
],
"condition": "AND",
"rules": [
{
"name": "freekarma",
"kind": "recentActivity",
"window": {
"count": 50,
"duration": "6 months"
},
"useSubmissionAsReference": false,
"thresholds": [
{
"subreddits": [
"FreeKarma4U",
"FreeKarma4You",
"KarmaStore",
"promote",
"shamelessplug",
"upvote"
]
}
]
}
],
"actions": [
// remove this after confirming behavior is acceptable
{
"kind": "report",
"content": "Remove=> {{rules.newtube.totalCount}} activities in freekarma subs"
},
//
//
{
"kind": "remove",
// remove the line below after confirming behavior is acceptable
"dryRun": true
},
// optionally remove "dryRun" from below if you want to leave a comment on removal
// PROTIP: the comment is bland, you should make it better
{
"kind": "comment",
"content": "Your submission has been removed because you have recent activity in 'freekarma' subs",
"distinguish": true,
"dryRun": true
}
]
}
]
}
]
}

View File

@@ -0,0 +1,85 @@
polling:
- unmoderated
runs:
- checks:
# stop users who post low-effort, crossposted spam submissions
#
# Remove a SUBMISSION if the user has crossposted it at least 4 times in recent history AND
# less than 50% of their activity is comments OR more than 40% of those comments are as OP (in their own submissions)
- name: remove on low xp spam and engagement
description: X-posted 4x and low comment engagement
kind: submission
itemIs:
- removed: false
condition: AND
rules:
- name: xPostLow
kind: repeatActivity
gapAllowance: 2
threshold: '>= 4'
window:
count: 50
duration: 6 months
- name: lowOrOpComm
kind: history
criteriaJoin: OR
criteria:
- window:
count: 100
duration: 6 months
comment: < 50%
- window:
count: 100
duration: 6 months
comment: '> 40% OP'
actions:
- kind: report
enable: true
content: >-
Remove=>{{rules.xpostlow.largestRepeat}} X-P =>
{{rules.loworopcomm.thresholdSummary}}
- kind: remove
enable: false
- kind: comment
enable: true
content: >-
Your submission has been removed because you cross-posted it
{{rules.xpostlow.largestRepeat}} times and you have very low
engagement outside of making submissions
distinguish: true
dryRun: true
# Remove submissions from users who have recent activity in freekarma subs within the last 50 activities or 6 months (whichever is less)
- name: freekarma removal
description: Remove submission if user has used freekarma sub recently
kind: submission
itemIs:
- removed: false
condition: AND
rules:
- name: freekarma
kind: recentActivity
window:
count: 50
duration: 6 months
useSubmissionAsReference: false
thresholds:
- subreddits:
- FreeKarma4U
- FreeKarma4You
- KarmaStore
- promote
- shamelessplug
- upvote
actions:
- kind: report
enable: true
content: 'Remove=> {{rules.freekarma.totalCount}} activities in freekarma subs'
- kind: remove
enable: false
- kind: comment
enable: true
content: >-
Your submission has been removed because you have recent activity in
'freekarma' subs
distinguish: true
dryRun: true

View File

@@ -0,0 +1,68 @@
{
"polling": [
"unmoderated"
],
"runs": [
{
"checks": [
{
//
// Remove submissions from users who have recent activity in freekarma subs within the last 50 activities or 6 months (whichever is less)
//
"name": "freekarma removal",
"description": "Remove submission if user has used freekarma sub recently",
"kind": "submission",
"itemIs": [
{
"removed": false
}
],
"condition": "AND",
"rules": [
{
"name": "freekarma",
"kind": "recentActivity",
"window": {
"count": 50,
"duration": "6 months"
},
"useSubmissionAsReference": false,
"thresholds": [
{
"subreddits": [
"FreeKarma4U",
"FreeKarma4You",
"KarmaStore",
"upvote"
]
}
]
}
],
"actions": [
// remove this after confirming behavior is acceptable
{
"kind": "report",
"content": "Remove=> {{rules.freekarma.totalCount}} activities in freekarma subs"
},
//
//
{
"kind": "remove",
// remove the line below after confirming behavior is acceptable
"dryRun": true,
},
// optionally remove "dryRun" from below if you want to leave a comment on removal
// PROTIP: the comment is bland, you should make it better
{
"kind": "comment",
"content": "Your submission has been removed because you have recent activity in 'freekarma' subs",
"distinguish": true,
"dryRun": true,
}
]
}
]
}
],
}

View File

@@ -0,0 +1,36 @@
polling:
- unmoderated
runs:
- checks:
# Remove submissions from users who have recent activity in freekarma subs within the last 50 activities or 6 months (whichever is less)
- name: freekarma removal
description: Remove submission if user has used freekarma sub recently
kind: submission
itemIs:
- removed: false
condition: AND
rules:
- name: freekarma
kind: recentActivity
window:
count: 50
duration: 6 months
useSubmissionAsReference: false
thresholds:
- subreddits:
- FreeKarma4U
- FreeKarma4You
- KarmaStore
- upvote
actions:
- kind: report
enable: true
content: 'Remove=> {{rules.freekarma.totalCount}} activities in freekarma subs'
- kind: remove
enable: true
- kind: comment
enable: false
content: >-
Your submission has been removed because you have recent activity in
'freekarma' subs
distinguish: true

View File

@@ -0,0 +1,108 @@
{
"polling": [
"unmoderated"
],
"runs": [
{
"checks": [
{
//
// Stop users who make link submissions with a self-promotional agenda (with reddit's suggested 10% rule)
// https://www.reddit.com/wiki/selfpromotion#wiki_guidelines_for_self-promotion_on_reddit
//
// Remove a SUBMISSION if the link comprises more than or equal to 10% of the user's history (100 activities or 6 months) OR
//
// if link comprises 10% of submission history (100 activities or 6 months)
// AND less than 50% of their activity is comments OR more than 40% of those comments are as OP (in their own submissions)
//
"name": "Self-promo all AND low engagement",
"description": "Self-promo is >10% for all or just sub and low comment engagement",
"kind": "submission",
"condition": "OR",
"rules": [
{
"name": "attr",
"kind": "attribution",
"criteria": [
{
"threshold": ">= 10%",
"window": {
"count": 100,
"duration": "6 months"
},
"domains": [
"AGG:SELF"
]
}
],
},
{
"condition": "AND",
"rules": [
{
"name": "attrsub",
"kind": "attribution",
"criteria": [
{
"threshold": ">= 10%",
"thresholdOn": "submissions",
"window": {
"count": 100,
"duration": "6 months"
},
"domains": [
"AGG:SELF"
]
}
]
},
{
"name": "lowOrOpComm",
"kind": "history",
"criteriaJoin": "OR",
"criteria": [
{
"window": {
"count": 100,
"duration": "6 months"
},
"comment": "< 50%"
},
{
"window": {
"count": 100,
"duration": "6 months"
},
"comment": "> 40% OP"
}
]
}
]
}
],
"actions": [
{
"kind": "report",
"content": "{{rules.attr.largestPercent}}{{rules.attrsub.largestPercent}} of {{rules.attr.activityTotal}}{{rules.attrsub.activityTotal}} items ({{rules.attr.window}}{{rules.attrsub.window}}){{#rules.loworopcomm.thresholdSummary}} => {{rules.loworopcomm.thresholdSummary}}{{/rules.loworopcomm.thresholdSummary}}"
},
//
//
{
"kind": "remove",
// remove the line below after confirming behavior is acceptable
"dryRun": true
},
// optionally remove "dryRun" from below if you want to leave a comment on removal
// PROTIP: the comment is bland, you should make it better
{
"kind": "comment",
"content": "Your submission has been removed it comprises 10% or more of your recent history ({{rules.attr.largestPercent}}{{rules.attrsub.largestPercent}}). This is against [reddit's self promotional guidelines.](https://www.reddit.com/wiki/selfpromotion#wiki_guidelines_for_self-promotion_on_reddit)",
"distinguish": true,
"dryRun": true
}
]
}
]
}
]
}

View File

@@ -1,10 +1,7 @@
#polling:
# - newSub
#runs:
# - checks:
#### Uncomment the code above to use this as a FULL subreddit config
####
#### Otherwise copy-paste the code below to use as a CHECK
polling:
- unmoderated
runs:
- checks:
#
# Stop users who make link submissions with a self-promotional agenda (with reddit's suggested 10% rule)
# https://www.reddit.com/wiki/selfpromotion#wiki_guidelines_for_self-promotion_on_reddit
@@ -61,11 +58,8 @@
({{rules.attr.window}}{{rules.attrsub.window}}){{#rules.loworopcomm.thresholdSummary}}
=>
{{rules.loworopcomm.thresholdSummary}}{{/rules.loworopcomm.thresholdSummary}}
- kind: remove
enable: true
note: '>10% of author's history is content from this creator'
enable: false
- kind: comment
enable: true
content: >-
@@ -75,3 +69,4 @@
is against [reddit's self promotional
guidelines.](https://www.reddit.com/wiki/selfpromotion#wiki_guidelines_for_self-promotion_on_reddit)
distinguish: true
dryRun: true

View File

@@ -24,7 +24,7 @@ Consult the [schema](https://json-schema.app/view/%23%2Fdefinitions%2FUserNoteCr
### Examples
* Do not tag user with Good User note [JSON](/docs/subreddit/components/userNotes/usernoteFilter.json5) | [YAML](/docs/subreddit/components/userNotes/usernoteFilter.yaml)
* Do not tag user with Good User note [JSON](/docs/subreddit/components/userNotes/usernoteFilter.json5) | [YAML](/docs/subreddit/components/userNotes/usernoteFilter.yaml)
## Action
@@ -33,4 +33,4 @@ A User Note can also be added to the Author of a Submission or Comment with the
### Examples
* Add note on user doing self promotion [JSON](/docs/subreddit/components/userNotes/usernoteSP.json5) | [YAML](/docs/subreddit/components/userNotes/usernoteSP.yaml)
* Add note on user doing self promotion [JSON](/docs/subreddit/components/userNotes/usernoteSP.json5) | [YAML](/docs/subreddit/components/userNotes/usernoteSP.yaml)

View File

@@ -104,7 +104,7 @@ If you already have a configuration you may skip the below step and go directly
### Using an Example Config
Visit the [Examples](https://github.com/FoxxMD/context-mod/tree/master/docs/examples) folder to find various examples of individual rules or see the [subreddit cookbook examples.](/docs/subreddit/components/cookbook)
Visit the [Examples](https://github.com/FoxxMD/context-mod/tree/master/docs/examples) folder to find various examples of individual rules or see the [subreddit-ready examples.](/docs/subreddit/components/subredditReady)
After you have found a configuration to use as a starting point:

274
package-lock.json generated
View File

@@ -29,7 +29,7 @@
"autolinker": "^3.14.3",
"body-parser": "^1.19.0",
"cache-manager": "^3.4.4",
"cache-manager-redis-store": "^2.0.0",
"cache-manager-redis-store": "^3.0.1",
"commander": "^8.0.0",
"comment-json": "^4.1.1",
"connect-typeorm": "^2.0.0",
@@ -95,7 +95,6 @@
"@tsconfig/node14": "^1.0.0",
"@types/async": "^3.2.7",
"@types/cache-manager": "^3.4.2",
"@types/cache-manager-redis-store": "^2.0.0",
"@types/chai": "^4.3.0",
"@types/chai-as-promised": "^7.1.5",
"@types/cookie-parser": "^1.4.2",
@@ -138,7 +137,7 @@
"typescript-json-schema": "~0.53"
},
"engines": {
"node": ">=16"
"node": ">=16.18.0"
},
"optionalDependencies": {
"better-sqlite3": "^7.5.0",
@@ -979,6 +978,59 @@
"resolved": "https://registry.npmjs.org/@nlpjs/slot/-/slot-4.22.17.tgz",
"integrity": "sha512-cNYcxf9DKB+fnRa2NxT5wbWq5j57R1WCTXLWI/1Cyycr227IP7GN7qaD4RbkzotBFFB8wm63UHod9frzmuiXxg=="
},
"node_modules/@redis/bloom": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/@redis/bloom/-/bloom-1.0.2.tgz",
"integrity": "sha512-EBw7Ag1hPgFzdznK2PBblc1kdlj5B5Cw3XwI9/oG7tSn85/HKy3X9xHy/8tm/eNXJYHLXHJL/pkwBpFMVVefkw==",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/client": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/@redis/client/-/client-1.3.0.tgz",
"integrity": "sha512-XCFV60nloXAefDsPnYMjHGtvbtHR8fV5Om8cQ0JYqTNbWcQo/4AryzJ2luRj4blveWazRK/j40gES8M7Cp6cfQ==",
"dependencies": {
"cluster-key-slot": "1.1.0",
"generic-pool": "3.8.2",
"yallist": "4.0.0"
},
"engines": {
"node": ">=14"
}
},
"node_modules/@redis/graph": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/@redis/graph/-/graph-1.0.1.tgz",
"integrity": "sha512-oDE4myMCJOCVKYMygEMWuriBgqlS5FqdWerikMoJxzmmTUErnTRRgmIDa2VcgytACZMFqpAOWDzops4DOlnkfQ==",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/json": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/@redis/json/-/json-1.0.4.tgz",
"integrity": "sha512-LUZE2Gdrhg0Rx7AN+cZkb1e6HjoSKaeeW8rYnt89Tly13GBI5eP4CwDVr+MY8BAYfCg4/N15OUrtLoona9uSgw==",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/search": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/@redis/search/-/search-1.1.0.tgz",
"integrity": "sha512-NyFZEVnxIJEybpy+YskjgOJRNsfTYqaPbK/Buv6W2kmFNaRk85JiqjJZA5QkRmWvGbyQYwoO5QfDi2wHskKrQQ==",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/time-series": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/@redis/time-series/-/time-series-1.0.3.tgz",
"integrity": "sha512-OFp0q4SGrTH0Mruf6oFsHGea58u8vS/iI5+NpYdicaM+7BgqBZH8FFvNZ8rYYLrUO/QRqMq72NpXmxLVNcdmjA==",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@sindresorhus/is": {
"version": "4.6.0",
"resolved": "https://registry.npmjs.org/@sindresorhus/is/-/is-4.6.0.tgz",
@@ -1163,16 +1215,6 @@
"integrity": "sha512-71aBXoFYXZW4TnDHHH8gExw2lS28BZaWeKefgsiJI7QYZeJfUEbMKw6CQtzGjlYQcGIWwB76hcCrkVA3YHSvsw==",
"dev": true
},
"node_modules/@types/cache-manager-redis-store": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/@types/cache-manager-redis-store/-/cache-manager-redis-store-2.0.1.tgz",
"integrity": "sha512-8QuccvcPieh1xM/5kReE76SfdcIdEB0ePc+54ah/NBuK2eG+6O50SX4WKoJX81UxGdW3sh/WlDaDNqjnqxWNsA==",
"dev": true,
"dependencies": {
"@types/cache-manager": "*",
"@types/redis": "^2.8.0"
}
},
"node_modules/@types/cacheable-request": {
"version": "6.0.2",
"resolved": "https://registry.npmjs.org/@types/cacheable-request/-/cacheable-request-6.0.2.tgz",
@@ -1456,15 +1498,6 @@
"resolved": "https://registry.npmjs.org/@types/range-parser/-/range-parser-1.2.4.tgz",
"integrity": "sha512-EEhsLsD6UsDM1yFhAvy0Cjr6VwmpMWqFBCb9w07wVugF7w9nfajxLuVmngTIpgS6svCnm6Vaw+MZhoDCKnOfsw=="
},
"node_modules/@types/redis": {
"version": "2.8.32",
"resolved": "https://registry.npmjs.org/@types/redis/-/redis-2.8.32.tgz",
"integrity": "sha512-7jkMKxcGq9p242exlbsVzuJb57KqHRhNl4dHoQu2Y5v9bCAbtIXXH0R3HleSQW4CTOqpHIYUW3t6tpUj4BVQ+w==",
"dev": true,
"dependencies": {
"@types/node": "*"
}
},
"node_modules/@types/responselike": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/@types/responselike/-/responselike-1.0.0.tgz",
@@ -2350,14 +2383,14 @@
}
},
"node_modules/cache-manager-redis-store": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/cache-manager-redis-store/-/cache-manager-redis-store-2.0.0.tgz",
"integrity": "sha512-bWLWlUg6nCYHiJLCCYxY2MgvwvKnvlWwrbuynrzpjEIhfArD2GC9LtutIHFEPeyGVQN6C+WEw+P3r+BFBwhswg==",
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/cache-manager-redis-store/-/cache-manager-redis-store-3.0.1.tgz",
"integrity": "sha512-o560kw+dFqusC9lQJhcm6L2F2fMKobJ5af+FoR2PdnMVdpQ3f3Bz6qzvObTGyvoazQJxjQNWgMQeChP4vRTuXQ==",
"dependencies": {
"redis": "^3.0.2"
"redis": "^4.3.1"
},
"engines": {
"node": ">= 8.3"
"node": ">= 16.18.0"
}
},
"node_modules/cacheable-lookup": {
@@ -2641,6 +2674,14 @@
"mimic-response": "^1.0.0"
}
},
"node_modules/cluster-key-slot": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/cluster-key-slot/-/cluster-key-slot-1.1.0.tgz",
"integrity": "sha512-2Nii8p3RwAPiFwsnZvukotvow2rIHM+yQ6ZcBXGHdniadkYGZYiGmkHJIbZPIV9nfv7m/U1IPMVVcAhoWFeklw==",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/code-point-at": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/code-point-at/-/code-point-at-1.1.0.tgz",
@@ -3210,14 +3251,6 @@
"integrity": "sha1-hMbhWbgZBP3KWaDvRM2HDTElD5o=",
"optional": true
},
"node_modules/denque": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/denque/-/denque-1.5.1.tgz",
"integrity": "sha512-XwE+iZ4D6ZUB7mfYRMb5wByE8L74HCn30FBN7sWnXksWc1LO1bPDl67pBR9o/kC4z/xSNAwkMYcGgqDV3BE3Hw==",
"engines": {
"node": ">=0.10"
}
},
"node_modules/depd": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/depd/-/depd-2.0.0.tgz",
@@ -4166,6 +4199,14 @@
"node": ">=10"
}
},
"node_modules/generic-pool": {
"version": "3.8.2",
"resolved": "https://registry.npmjs.org/generic-pool/-/generic-pool-3.8.2.tgz",
"integrity": "sha512-nGToKy6p3PAbYQ7p1UlWl6vSPwfwU6TMSWK7TTu+WUY4ZjyZQGniGGt2oNVvyNSpyZYSB43zMXVLcBm08MTMkg==",
"engines": {
"node": ">= 4"
}
},
"node_modules/gensync": {
"version": "1.0.0-beta.2",
"resolved": "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz",
@@ -7798,45 +7839,16 @@
}
},
"node_modules/redis": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/redis/-/redis-3.1.2.tgz",
"integrity": "sha512-grn5KoZLr/qrRQVwoSkmzdbw6pwF+/rwODtrOr6vuBRiR/f3rjSTGupbF90Zpqm2oenix8Do6RV7pYEkGwlKkw==",
"version": "4.3.1",
"resolved": "https://registry.npmjs.org/redis/-/redis-4.3.1.tgz",
"integrity": "sha512-cM7yFU5CA6zyCF7N/+SSTcSJQSRMEKN0k0Whhu6J7n9mmXRoXugfWDBo5iOzGwABmsWKSwGPTU5J4Bxbl+0mrA==",
"dependencies": {
"denque": "^1.5.0",
"redis-commands": "^1.7.0",
"redis-errors": "^1.2.0",
"redis-parser": "^3.0.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/node-redis"
}
},
"node_modules/redis-commands": {
"version": "1.7.0",
"resolved": "https://registry.npmjs.org/redis-commands/-/redis-commands-1.7.0.tgz",
"integrity": "sha512-nJWqw3bTFy21hX/CPKHth6sfhZbdiHP6bTawSgQBlKOVRG7EZkfHbbHwQJnrE4vsQf0CMNE+3gJ4Fmm16vdVlQ=="
},
"node_modules/redis-errors": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/redis-errors/-/redis-errors-1.2.0.tgz",
"integrity": "sha1-62LSrbFeTq9GEMBK/hUpOEJQq60=",
"engines": {
"node": ">=4"
}
},
"node_modules/redis-parser": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/redis-parser/-/redis-parser-3.0.0.tgz",
"integrity": "sha1-tm2CjNyv5rS4pCin3vTGvKwxyLQ=",
"dependencies": {
"redis-errors": "^1.0.0"
},
"engines": {
"node": ">=4"
"@redis/bloom": "1.0.2",
"@redis/client": "1.3.0",
"@redis/graph": "1.0.1",
"@redis/json": "1.0.4",
"@redis/search": "1.1.0",
"@redis/time-series": "1.0.3"
}
},
"node_modules/reflect-metadata": {
@@ -11115,6 +11127,46 @@
"resolved": "https://registry.npmjs.org/@nlpjs/slot/-/slot-4.22.17.tgz",
"integrity": "sha512-cNYcxf9DKB+fnRa2NxT5wbWq5j57R1WCTXLWI/1Cyycr227IP7GN7qaD4RbkzotBFFB8wm63UHod9frzmuiXxg=="
},
"@redis/bloom": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/@redis/bloom/-/bloom-1.0.2.tgz",
"integrity": "sha512-EBw7Ag1hPgFzdznK2PBblc1kdlj5B5Cw3XwI9/oG7tSn85/HKy3X9xHy/8tm/eNXJYHLXHJL/pkwBpFMVVefkw==",
"requires": {}
},
"@redis/client": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/@redis/client/-/client-1.3.0.tgz",
"integrity": "sha512-XCFV60nloXAefDsPnYMjHGtvbtHR8fV5Om8cQ0JYqTNbWcQo/4AryzJ2luRj4blveWazRK/j40gES8M7Cp6cfQ==",
"requires": {
"cluster-key-slot": "1.1.0",
"generic-pool": "3.8.2",
"yallist": "4.0.0"
}
},
"@redis/graph": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/@redis/graph/-/graph-1.0.1.tgz",
"integrity": "sha512-oDE4myMCJOCVKYMygEMWuriBgqlS5FqdWerikMoJxzmmTUErnTRRgmIDa2VcgytACZMFqpAOWDzops4DOlnkfQ==",
"requires": {}
},
"@redis/json": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/@redis/json/-/json-1.0.4.tgz",
"integrity": "sha512-LUZE2Gdrhg0Rx7AN+cZkb1e6HjoSKaeeW8rYnt89Tly13GBI5eP4CwDVr+MY8BAYfCg4/N15OUrtLoona9uSgw==",
"requires": {}
},
"@redis/search": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/@redis/search/-/search-1.1.0.tgz",
"integrity": "sha512-NyFZEVnxIJEybpy+YskjgOJRNsfTYqaPbK/Buv6W2kmFNaRk85JiqjJZA5QkRmWvGbyQYwoO5QfDi2wHskKrQQ==",
"requires": {}
},
"@redis/time-series": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/@redis/time-series/-/time-series-1.0.3.tgz",
"integrity": "sha512-OFp0q4SGrTH0Mruf6oFsHGea58u8vS/iI5+NpYdicaM+7BgqBZH8FFvNZ8rYYLrUO/QRqMq72NpXmxLVNcdmjA==",
"requires": {}
},
"@sindresorhus/is": {
"version": "4.6.0",
"resolved": "https://registry.npmjs.org/@sindresorhus/is/-/is-4.6.0.tgz",
@@ -11214,16 +11266,6 @@
"integrity": "sha512-71aBXoFYXZW4TnDHHH8gExw2lS28BZaWeKefgsiJI7QYZeJfUEbMKw6CQtzGjlYQcGIWwB76hcCrkVA3YHSvsw==",
"dev": true
},
"@types/cache-manager-redis-store": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/@types/cache-manager-redis-store/-/cache-manager-redis-store-2.0.1.tgz",
"integrity": "sha512-8QuccvcPieh1xM/5kReE76SfdcIdEB0ePc+54ah/NBuK2eG+6O50SX4WKoJX81UxGdW3sh/WlDaDNqjnqxWNsA==",
"dev": true,
"requires": {
"@types/cache-manager": "*",
"@types/redis": "^2.8.0"
}
},
"@types/cacheable-request": {
"version": "6.0.2",
"resolved": "https://registry.npmjs.org/@types/cacheable-request/-/cacheable-request-6.0.2.tgz",
@@ -11507,15 +11549,6 @@
"resolved": "https://registry.npmjs.org/@types/range-parser/-/range-parser-1.2.4.tgz",
"integrity": "sha512-EEhsLsD6UsDM1yFhAvy0Cjr6VwmpMWqFBCb9w07wVugF7w9nfajxLuVmngTIpgS6svCnm6Vaw+MZhoDCKnOfsw=="
},
"@types/redis": {
"version": "2.8.32",
"resolved": "https://registry.npmjs.org/@types/redis/-/redis-2.8.32.tgz",
"integrity": "sha512-7jkMKxcGq9p242exlbsVzuJb57KqHRhNl4dHoQu2Y5v9bCAbtIXXH0R3HleSQW4CTOqpHIYUW3t6tpUj4BVQ+w==",
"dev": true,
"requires": {
"@types/node": "*"
}
},
"@types/responselike": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/@types/responselike/-/responselike-1.0.0.tgz",
@@ -12253,11 +12286,11 @@
}
},
"cache-manager-redis-store": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/cache-manager-redis-store/-/cache-manager-redis-store-2.0.0.tgz",
"integrity": "sha512-bWLWlUg6nCYHiJLCCYxY2MgvwvKnvlWwrbuynrzpjEIhfArD2GC9LtutIHFEPeyGVQN6C+WEw+P3r+BFBwhswg==",
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/cache-manager-redis-store/-/cache-manager-redis-store-3.0.1.tgz",
"integrity": "sha512-o560kw+dFqusC9lQJhcm6L2F2fMKobJ5af+FoR2PdnMVdpQ3f3Bz6qzvObTGyvoazQJxjQNWgMQeChP4vRTuXQ==",
"requires": {
"redis": "^3.0.2"
"redis": "^4.3.1"
}
},
"cacheable-lookup": {
@@ -12470,6 +12503,11 @@
"mimic-response": "^1.0.0"
}
},
"cluster-key-slot": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/cluster-key-slot/-/cluster-key-slot-1.1.0.tgz",
"integrity": "sha512-2Nii8p3RwAPiFwsnZvukotvow2rIHM+yQ6ZcBXGHdniadkYGZYiGmkHJIbZPIV9nfv7m/U1IPMVVcAhoWFeklw=="
},
"code-point-at": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/code-point-at/-/code-point-at-1.1.0.tgz",
@@ -12933,11 +12971,6 @@
"integrity": "sha1-hMbhWbgZBP3KWaDvRM2HDTElD5o=",
"optional": true
},
"denque": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/denque/-/denque-1.5.1.tgz",
"integrity": "sha512-XwE+iZ4D6ZUB7mfYRMb5wByE8L74HCn30FBN7sWnXksWc1LO1bPDl67pBR9o/kC4z/xSNAwkMYcGgqDV3BE3Hw=="
},
"depd": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/depd/-/depd-2.0.0.tgz",
@@ -13675,6 +13708,11 @@
"json-bigint": "^1.0.0"
}
},
"generic-pool": {
"version": "3.8.2",
"resolved": "https://registry.npmjs.org/generic-pool/-/generic-pool-3.8.2.tgz",
"integrity": "sha512-nGToKy6p3PAbYQ7p1UlWl6vSPwfwU6TMSWK7TTu+WUY4ZjyZQGniGGt2oNVvyNSpyZYSB43zMXVLcBm08MTMkg=="
},
"gensync": {
"version": "1.0.0-beta.2",
"resolved": "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz",
@@ -16455,32 +16493,16 @@
}
},
"redis": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/redis/-/redis-3.1.2.tgz",
"integrity": "sha512-grn5KoZLr/qrRQVwoSkmzdbw6pwF+/rwODtrOr6vuBRiR/f3rjSTGupbF90Zpqm2oenix8Do6RV7pYEkGwlKkw==",
"version": "4.3.1",
"resolved": "https://registry.npmjs.org/redis/-/redis-4.3.1.tgz",
"integrity": "sha512-cM7yFU5CA6zyCF7N/+SSTcSJQSRMEKN0k0Whhu6J7n9mmXRoXugfWDBo5iOzGwABmsWKSwGPTU5J4Bxbl+0mrA==",
"requires": {
"denque": "^1.5.0",
"redis-commands": "^1.7.0",
"redis-errors": "^1.2.0",
"redis-parser": "^3.0.0"
}
},
"redis-commands": {
"version": "1.7.0",
"resolved": "https://registry.npmjs.org/redis-commands/-/redis-commands-1.7.0.tgz",
"integrity": "sha512-nJWqw3bTFy21hX/CPKHth6sfhZbdiHP6bTawSgQBlKOVRG7EZkfHbbHwQJnrE4vsQf0CMNE+3gJ4Fmm16vdVlQ=="
},
"redis-errors": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/redis-errors/-/redis-errors-1.2.0.tgz",
"integrity": "sha1-62LSrbFeTq9GEMBK/hUpOEJQq60="
},
"redis-parser": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/redis-parser/-/redis-parser-3.0.0.tgz",
"integrity": "sha1-tm2CjNyv5rS4pCin3vTGvKwxyLQ=",
"requires": {
"redis-errors": "^1.0.0"
"@redis/bloom": "1.0.2",
"@redis/client": "1.3.0",
"@redis/graph": "1.0.1",
"@redis/json": "1.0.4",
"@redis/search": "1.1.0",
"@redis/time-series": "1.0.3"
}
},
"reflect-metadata": {

View File

@@ -24,7 +24,7 @@
"initMigration": "npm run typeorm -- migration:generate -t 1642180264563 -d ormconfig.js \"src/Common/Migrations/Database/init\""
},
"engines": {
"node": ">=16"
"node": ">=16.18.0"
},
"keywords": [],
"author": "",
@@ -49,7 +49,7 @@
"autolinker": "^3.14.3",
"body-parser": "^1.19.0",
"cache-manager": "^3.4.4",
"cache-manager-redis-store": "^2.0.0",
"cache-manager-redis-store": "^3.0.1",
"commander": "^8.0.0",
"comment-json": "^4.1.1",
"connect-typeorm": "^2.0.0",
@@ -115,7 +115,6 @@
"@tsconfig/node14": "^1.0.0",
"@types/async": "^3.2.7",
"@types/cache-manager": "^3.4.2",
"@types/cache-manager-redis-store": "^2.0.0",
"@types/chai": "^4.3.0",
"@types/chai-as-promised": "^7.1.5",
"@types/cookie-parser": "^1.4.2",

View File

@@ -1,15 +1,9 @@
import Action, {ActionJson, ActionOptions} from "./index";
import {Comment, VoteableContent} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {activityIsRemoved, renderContent} from "../Utils/SnoowrapUtils";
import {renderContent} from "../Utils/SnoowrapUtils";
import {ActionProcessResult, Footer, RequiredRichContent, RichContent, RuleResult} from "../Common/interfaces";
import {
asComment,
asSubmission,
getActivitySubredditName,
parseRedditThingsFromLink,
truncateStringToLength
} from "../util";
import {asComment, asSubmission, parseRedditThingsFromLink, truncateStringToLength} from "../util";
import {RuleResultEntity} from "../Common/Entities/RuleResultEntity";
import {runCheckOptions} from "../Subreddit/Manager";
import {ActionTarget, ActionTypes, ArbitraryActionTarget} from "../Common/Infrastructure/Atomic";
@@ -24,7 +18,6 @@ export class CommentAction extends Action {
distinguish: boolean = false;
footer?: false | string;
targets: ArbitraryActionTarget[]
asModTeam: boolean;
constructor(options: CommentActionOptions) {
super(options);
@@ -34,14 +27,12 @@ export class CommentAction extends Action {
sticky = false,
distinguish = false,
footer,
targets = ['self'],
asModTeam = false,
targets = ['self']
} = options;
this.footer = footer;
this.content = content;
this.lock = lock;
this.sticky = sticky;
this.asModTeam = asModTeam;
this.distinguish = distinguish;
if (!Array.isArray(targets)) {
this.targets = [targets];
@@ -113,48 +104,17 @@ export class CommentAction extends Action {
continue;
}
if(this.asModTeam) {
if(!targetItem.can_mod_post) {
const noMod = `[${targetIdentifier}] Cannot comment as subreddit because bot is not a moderator`;
this.logger.warn(noMod);
targetResults.push(noMod);
continue;
}
if(getActivitySubredditName(targetItem) !== this.resources.subreddit.display_name) {
const wrongSubreddit = `[${targetIdentifier}] Will not comment as subreddit because Activity did not occur in the same subreddit as the bot is moderating`;
this.logger.warn(wrongSubreddit);
targetResults.push(wrongSubreddit);
continue;
}
if(!activityIsRemoved(targetItem)) {
const notRemoved = `[${targetIdentifier}] Cannot comment as subreddit because Activity IS NOT REMOVED.`
this.logger.warn(notRemoved);
targetResults.push(notRemoved);
continue;
}
}
let modifiers = [];
let reply: Comment;
if (!dryRun) {
if(this.asModTeam) {
try {
reply = await this.client.addRemovalMessage(targetItem, renderedContent, 'public_as_subreddit',{lock: this.lock});
} catch (e: any) {
this.logger.warn(new CMError('Could not comment as subreddit', {cause: e}));
targetResults.push(`Could not comment as subreddit: ${e.message}`);
continue;
}
} else {
// @ts-ignore
reply = await targetItem.reply(renderedContent);
}
// @ts-ignore
reply = await targetItem.reply(renderedContent);
// add to recent so we ignore activity when/if it is discovered by polling
await this.resources.setRecentSelf(reply);
touchedEntities.push(reply);
}
if (!this.asModTeam && this.lock && targetItem.can_mod_post) {
if (this.lock && targetItem.can_mod_post) {
if (!targetItem.can_mod_post) {
this.logger.warn(`[${targetIdentifier}] Cannot lock because bot is not a moderator`);
} else {
@@ -167,7 +127,7 @@ export class CommentAction extends Action {
}
}
if (!this.asModTeam && this.distinguish) {
if (this.distinguish) {
if (!targetItem.can_mod_post) {
this.logger.warn(`[${targetIdentifier}] Cannot lock Distinguish/Sticky because bot is not a moderator`);
} else {
@@ -243,16 +203,6 @@ export interface CommentActionConfig extends RequiredRichContent, Footer {
* If target is not self/parent then CM assumes the value is a reddit permalink and will attempt to make a comment to that Activity
* */
targets?: ArbitraryActionTarget | ArbitraryActionTarget[]
/**
* Comment "as subreddit" using the "/u/subreddit-ModTeam" account
*
* RESTRICTIONS:
*
* * Target activity must ALREADY BE REMOVED
* * Will always distinguish and sticky the created comment
* */
asModTeam?: boolean
}
export interface CommentActionOptions extends CommentActionConfig, ActionOptions {

View File

@@ -1,6 +1,7 @@
import {ActionJson, ActionConfig, ActionOptions} from "./index";
import Action from "./index";
import {Comment} from "snoowrap";
import {renderContent} from "../Utils/SnoowrapUtils";
import {UserNoteJson} from "../Subreddit/UserNotes";
import Submission from "snoowrap/dist/objects/Submission";
import {ActionProcessResult, RuleResult} from "../Common/interfaces";
@@ -8,34 +9,19 @@ import {RuleResultEntity} from "../Common/Entities/RuleResultEntity";
import {runCheckOptions} from "../Subreddit/Manager";
import {ActionTypes, UserNoteType} from "../Common/Infrastructure/Atomic";
import {ActionResultEntity} from "../Common/Entities/ActionResultEntity";
import {
FullUserNoteCriteria,
toFullUserNoteCriteria, UserNoteCriteria
} from "../Common/Infrastructure/Filters/FilterCriteria";
import {buildFilterCriteriaSummary} from "../util";
export class UserNoteAction extends Action {
content: string;
type: UserNoteType;
existingNoteCheck?: UserNoteCriteria
allowDuplicate: boolean;
constructor(options: UserNoteActionOptions) {
super(options);
const {type, content = '', existingNoteCheck = true, allowDuplicate} = options;
const {type, content = '', allowDuplicate = false} = options;
this.type = type;
this.content = content;
if(typeof existingNoteCheck !== 'boolean') {
this.existingNoteCheck = existingNoteCheck;
} else {
let exNotecheck: boolean;
if(allowDuplicate !== undefined) {
exNotecheck = !allowDuplicate;
} else {
exNotecheck = existingNoteCheck;
}
this.existingNoteCheck = this.generateCriteriaFromDuplicateConvenience(exNotecheck);
}
this.allowDuplicate = allowDuplicate;
}
getKind(): ActionTypes {
@@ -47,30 +33,25 @@ export class UserNoteAction extends Action {
const renderedContent = (await this.renderContent(this.content, item, ruleResults, actionResults) as string);
this.logger.verbose(`Note:\r\n(${this.type}) ${renderedContent}`);
let noteCheckPassed: boolean = true;
let noteCheckResult: undefined | string;
if(this.existingNoteCheck === undefined) {
// nothing to do!
noteCheckResult = 'existingNoteCheck=false so no existing note checks were performed.';
} else {
const noteCheckCriteriaResult = await this.resources.isAuthor(item, {
userNotes: [this.existingNoteCheck]
});
noteCheckPassed = noteCheckCriteriaResult.passed;
const {details} = buildFilterCriteriaSummary(noteCheckCriteriaResult);
noteCheckResult = `${noteCheckPassed ? 'Existing note check condition succeeded' : 'Will not add note because existing note check condition failed'} -- ${details.join(' ')}`;
if (!this.allowDuplicate) {
const notes = await this.resources.userNotes.getUserNotes(item.author);
let existingNote = notes.find((x) => x.link !== null && x.link.includes(item.id));
if(existingNote === undefined && notes.length > 0) {
const lastNote = notes[notes.length - 1];
// possibly notes don't have a reference link so check if last one has same text
if(lastNote.link === null && lastNote.text === renderedContent) {
existingNote = lastNote;
}
}
if (existingNote !== undefined && existingNote.noteType === this.type) {
this.logger.info(`Will not add note because one already exists for this Activity (${existingNote.time.local().format()}) and allowDuplicate=false`);
return {
dryRun,
success: false,
result: `Will not add note because one already exists for this Activity (${existingNote.time.local().format()}) and allowDuplicate=false`
};
}
}
this.logger.info(noteCheckResult);
if (!noteCheckPassed) {
return {
dryRun,
success: false,
result: noteCheckResult
};
}
if (!dryRun) {
await this.resources.userNotes.addUserNote(item, this.type, renderedContent, this.name !== undefined ? `(Action ${this.name})` : '');
} else if (!await this.resources.userNotes.warningExists(this.type)) {
@@ -83,23 +64,11 @@ export class UserNoteAction extends Action {
}
}
generateCriteriaFromDuplicateConvenience(val: boolean): UserNoteCriteria | undefined {
if(val) {
return {
type: this.type,
note: this.content !== '' && this.content !== undefined && this.content !== null ? [this.content] : undefined,
search: 'current',
count: '< 1'
};
}
return undefined;
}
protected getSpecificPremise(): object {
return {
content: this.content,
type: this.type,
existingNoteCheck: this.existingNoteCheck
allowDuplicate: this.allowDuplicate
}
}
}
@@ -107,29 +76,10 @@ export class UserNoteAction extends Action {
export interface UserNoteActionConfig extends ActionConfig,UserNoteJson {
/**
* Add Note even if a Note already exists for this Activity
*
* USE `existingNoteCheck` INSTEAD
*
* @examples [false]
* @default false
* @deprecated
* */
allowDuplicate?: boolean,
/**
* Check if there is an existing Note matching some criteria before adding the Note.
*
* If this check passes then the Note is added. The value may be a boolean or UserNoteCriteria.
*
* Boolean convenience:
*
* * If `true` or undefined then CM generates a UserNoteCriteria that passes only if there is NO existing note matching note criteria
* * If `false` then no check is performed and Note is always added
*
* @examples [true]
* @default true
* */
existingNoteCheck?: boolean | UserNoteCriteria,
}
export interface UserNoteActionOptions extends Omit<UserNoteActionConfig, 'authorIs' | 'itemIs'>, ActionOptions {

View File

@@ -1,224 +0,0 @@
import {SPoll} from "../Subreddit/Streams";
import Snoowrap from "snoowrap";
import {Cache} from "cache-manager";
import {
BotInstanceConfig,
StrongCache,
StrongTTLConfig,
ThirdPartyCredentialsJsonConfig,
TTLConfig
} from "../Common/interfaces";
import winston, {Logger} from "winston";
import {DataSource, Repository} from "typeorm";
import {
EventRetentionPolicyRange
} from "../Common/Infrastructure/Atomic";
import {InvokeeType} from "../Common/Entities/InvokeeType";
import {RunStateType} from "../Common/Entities/RunStateType";
import {buildCachePrefix, cacheStats, mergeArr, toStrongTTLConfig} from "../util";
import objectHash from "object-hash";
import {runMigrations} from "../Common/Migrations/CacheMigrationUtils";
import {CMError} from "../Utils/Errors";
import {DEFAULT_FOOTER, SubredditResources} from "../Subreddit/SubredditResources";
import {SubredditResourceConfig, SubredditResourceOptions} from "../Common/Subreddit/SubredditResourceInterfaces";
import {buildCacheOptionsFromProvider, CMCache, createCacheManager} from "../Common/Cache";
export class BotResourcesManager {
resources: Map<string, SubredditResources> = new Map();
authorTTL: number = 10000;
enabled: boolean = true;
modStreams: Map<string, SPoll<Snoowrap.Submission | Snoowrap.Comment>> = new Map();
defaultCache: CMCache;
defaultCacheConfig: StrongCache
defaultCacheMigrated: boolean = false;
cacheType: string = 'none';
cacheHash: string;
ttlDefaults: StrongTTLConfig
defaultThirdPartyCredentials: ThirdPartyCredentialsJsonConfig;
logger: Logger;
botAccount?: string;
defaultDatabase: DataSource
botName!: string
retention?: EventRetentionPolicyRange
invokeeRepo: Repository<InvokeeType>
runTypeRepo: Repository<RunStateType>
constructor(config: BotInstanceConfig, logger: Logger) {
const {
caching: {
authorTTL,
userNotesTTL,
wikiTTL,
commentTTL,
submissionTTL,
subredditTTL,
filterCriteriaTTL,
modNotesTTL,
selfTTL,
provider,
},
name,
credentials: {
reddit,
...thirdParty
},
database,
databaseConfig: {
retention
} = {},
caching,
} = config;
caching.provider.prefix = buildCachePrefix([caching.provider.prefix, 'SHARED']);
const {...relevantCacheSettings} = caching;
this.cacheHash = objectHash.sha1(relevantCacheSettings);
this.defaultCacheConfig = caching;
this.defaultThirdPartyCredentials = thirdParty;
this.defaultDatabase = database;
this.ttlDefaults = toStrongTTLConfig({
authorTTL,
userNotesTTL,
wikiTTL,
commentTTL,
submissionTTL,
filterCriteriaTTL,
subredditTTL,
selfTTL,
modNotesTTL
});
this.botName = name as string;
this.logger = logger;
this.invokeeRepo = this.defaultDatabase.getRepository(InvokeeType);
this.runTypeRepo = this.defaultDatabase.getRepository(RunStateType);
this.retention = retention;
const options = provider;
this.cacheType = options.store;
const cache = createCacheManager(options);
this.defaultCache = new CMCache(cache, options, true, caching.provider.prefix, this.ttlDefaults, this.logger);
}
get(subName: string): SubredditResources | undefined {
if (this.resources.has(subName)) {
return this.resources.get(subName) as SubredditResources;
}
return undefined;
}
async set(subName: string, initOptions: SubredditResourceConfig): Promise<SubredditResources> {
let hash = 'default';
const {caching, credentials, retention, ...init} = initOptions;
const res = this.get(subName);
let opts: SubredditResourceOptions = {
cache: this.defaultCache,
cacheType: this.cacheType,
cacheSettingsHash: hash,
ttl: this.ttlDefaults,
thirdPartyCredentials: credentials ?? this.defaultThirdPartyCredentials,
prefix: this.defaultCacheConfig.provider.prefix,
database: this.defaultDatabase,
botName: this.botName,
retention: retention ?? this.retention,
...init,
};
if (caching !== undefined) {
const {
provider = this.defaultCacheConfig.provider,
...rest
} = caching;
opts.ttl = toStrongTTLConfig({
...this.ttlDefaults,
...rest
});
const candidateProvider = buildCacheOptionsFromProvider(provider);
const defaultPrefix = candidateProvider.prefix;
const subPrefix = defaultPrefix === this.defaultCacheConfig.provider.prefix ? buildCachePrefix([(defaultPrefix !== undefined ? defaultPrefix.replace('SHARED', '') : defaultPrefix), subName]) : candidateProvider.prefix;
candidateProvider.prefix = subPrefix;
if(this.defaultCache.equalProvider(candidateProvider)) {
opts.cache = this.defaultCache;
} else if(res !== undefined && res.cache.equalProvider(candidateProvider)) {
opts.cache = res.cache;
} else {
opts.cache = new CMCache(createCacheManager(candidateProvider), candidateProvider, false, this.defaultCache.providerOptions.prefix, opts.ttl, this.logger);
await runMigrations(opts.cache.cache, opts.cache.logger, candidateProvider.prefix);
}
} else if (!this.defaultCacheMigrated) {
await runMigrations(this.defaultCache.cache, this.logger, opts.prefix);
this.defaultCacheMigrated = true;
}
let resource: SubredditResources;
if (res === undefined) {
resource = new SubredditResources(subName, {
...opts,
botAccount: this.botAccount
});
this.resources.set(subName, resource);
} else {
// just set non-cache related settings
resource = res;
resource.botAccount = this.botAccount;
}
await resource.configure(opts);
return resource;
}
async destroy(subName: string) {
const res = this.get(subName);
if (res !== undefined) {
await res.destroy();
this.resources.delete(subName);
}
}
async getPendingSubredditInvites(): Promise<(string[])> {
const subredditNames = await this.defaultCache.get(`modInvites`);
if (subredditNames !== undefined && subredditNames !== null) {
return subredditNames as string[];
}
return [];
}
async addPendingSubredditInvite(subreddit: string): Promise<void> {
if (subreddit === null || subreddit === undefined || subreddit == '') {
throw new CMError('Subreddit name cannot be empty');
}
let subredditNames = await this.defaultCache.get(`modInvites`) as (string[] | undefined | null);
if (subredditNames === undefined || subredditNames === null) {
subredditNames = [];
}
const cleanName = subreddit.trim();
if (subredditNames.some(x => x.trim().toLowerCase() === cleanName.toLowerCase())) {
throw new CMError(`An invite for the Subreddit '${subreddit}' already exists`);
}
subredditNames.push(cleanName);
await this.defaultCache.set(`modInvites`, subredditNames, {ttl: 0});
return;
}
async deletePendingSubredditInvite(subreddit: string): Promise<void> {
let subredditNames = await this.defaultCache.get(`modInvites`) as (string[] | undefined | null);
if (subredditNames === undefined || subredditNames === null) {
subredditNames = [];
}
subredditNames = subredditNames.filter(x => x.toLowerCase() !== subreddit.trim().toLowerCase());
await this.defaultCache.set(`modInvites`, subredditNames, {ttl: 0});
return;
}
async clearPendingSubredditInvites(): Promise<void> {
await this.defaultCache.del(`modInvites`);
return;
}
}

View File

@@ -25,6 +25,7 @@ import {
import {Manager} from "../Subreddit/Manager";
import {ExtendedSnoowrap, ProxiedSnoowrap} from "../Utils/SnoowrapClients";
import {CommentStream, ModQueueStream, SPoll, SubmissionStream, UnmoderatedStream} from "../Subreddit/Streams";
import {BotResourcesManager} from "../Subreddit/SubredditResources";
import LoggedError from "../Utils/LoggedError";
import pEvent from "p-event";
import {
@@ -62,7 +63,6 @@ import {Guest, GuestEntityData} from "../Common/Entities/Guest/GuestInterfaces";
import {guestEntitiesToAll, guestEntityToApiGuest} from "../Common/Entities/Guest/GuestEntity";
import {SubredditInvite} from "../Common/Entities/SubredditInvite";
import {dayjsDTFormat} from "../Common/defaults";
import {BotResourcesManager} from "./ResourcesManager";
class Bot implements BotInstanceFunctions {

View File

@@ -1,13 +1,9 @@
import {CacheProvider} from "../Infrastructure/Atomic";
import {CacheOptions, StrongTTLConfig} from "../interfaces";
import {cacheOptDefaults} from "../defaults";
import cacheManager, {Cache, CachingConfig, WrapArgsType} from "cache-manager";
import redisStore from "cache-manager-redis-store";
import {CacheOptions} from "../interfaces";
import cacheManager, {Cache} from "cache-manager";
import {redisStore} from "cache-manager-redis-store";
import {create as createMemoryStore} from "../../Utils/memoryStore";
import winston, {Logger} from "winston";
import {mergeArr, parseStringToRegex, redisScanIterator} from "../../util";
import globrex from "globrex";
import objectHash from "object-hash";
import {CacheProvider} from "../Infrastructure/Atomic";
import {cacheOptDefaults} from "../defaults";
export const buildCacheOptionsFromProvider = (provider: CacheProvider | any): CacheOptions => {
if (typeof provider === 'string') {
@@ -22,18 +18,24 @@ export const buildCacheOptionsFromProvider = (provider: CacheProvider | any): Ca
...provider,
}
}
export const createCacheManager = (options: CacheOptions): Cache => {
const {store, max, ttl = 60, host = 'localhost', port, auth_pass, db, prefix, ...rest} = options;
export const createCacheManager = async (options: CacheOptions): Promise<Cache> => {
const {store, max, ttl = 60, host = 'localhost', port, auth_pass, db, ...rest} = options;
switch (store) {
case 'none':
return cacheManager.caching({store: 'none', max, ttl});
case 'redis':
const rStore = await redisStore(
{
socket: {
host,
port
},
password: auth_pass,
database: db,
}
);
return cacheManager.caching({
store: redisStore,
host,
port,
auth_pass,
db,
store: rStore,
ttl,
...rest,
});
@@ -43,168 +45,3 @@ export const createCacheManager = (options: CacheOptions): Cache => {
return cacheManager.caching({store: {create: createMemoryStore}, max, ttl, shouldCloneBeforeSet: false});
}
}
export class CMCache {
pruneInterval?: any;
prefix?: string
cache: Cache
isDefaultCache: boolean
defaultPrefix?: string
providerOptions: CacheOptions;
logger!: Logger;
constructor(cache: Cache, providerOptions: CacheOptions, defaultCache: boolean, defaultPrefix: string | undefined, ttls: Partial<StrongTTLConfig>, logger: Logger) {
this.cache = cache;
this.providerOptions = providerOptions
this.isDefaultCache = defaultCache;
this.prefix = this.providerOptions.prefix ?? '';
this.defaultPrefix = defaultPrefix ?? '';
this.setLogger(logger);
this.setPruneInterval(ttls);
}
setLogger(logger: Logger) {
this.logger = logger.child({labels: ['Cache']}, mergeArr);
}
equalProvider(candidate: CacheOptions) {
return objectHash.sha1(candidate) === objectHash.sha1(this.providerOptions);
}
setPruneInterval(ttls: Partial<StrongTTLConfig>) {
if (this.providerOptions.store === 'memory' && !this.isDefaultCache) {
if (this.pruneInterval !== undefined) {
clearInterval(this.pruneInterval);
}
const min = Math.min(60, ...Object.values(ttls).filter(x => typeof x === 'number' && x !== 0) as number[]);
if (min > 0) {
// set default prune interval
this.pruneInterval = setInterval(() => {
// @ts-ignore
this.cache?.store.prune();
this.logger.debug('Pruned cache');
// prune interval should be twice the smallest TTL
}, min * 1000 * 2)
}
}
}
async getCacheKeyCount() {
if (this.cache.store.keys !== undefined) {
if (this.providerOptions.store === 'redis') {
const keys = await this.cache.store.keys(`${this.prefix}*`);
return keys.length;
}
return (await this.cache.store.keys()).length;
}
return 0;
}
async interactWithCacheByKeyPattern(pattern: string | RegExp, action: 'get' | 'delete') {
let patternIsReg = pattern instanceof RegExp;
let regPattern: RegExp;
let globPattern = pattern;
const cacheDict: Record<string, any> = {};
if (typeof pattern === 'string') {
const possibleRegPattern = parseStringToRegex(pattern, 'ig');
if (possibleRegPattern !== undefined) {
regPattern = possibleRegPattern;
patternIsReg = true;
} else {
if (this.prefix !== undefined && !pattern.includes(this.prefix)) {
// need to add wildcard to beginning of pattern so that the regex will still match a key with a prefix
globPattern = `${this.prefix}${pattern}`;
}
// @ts-ignore
const result = globrex(globPattern, {flags: 'i'});
regPattern = result.regex;
}
} else {
regPattern = pattern;
}
if (this.providerOptions.store === 'redis') {
// @ts-ignore
const redisClient = this.cache.store.getClient();
if (patternIsReg) {
// scan all and test key by regex
for await (const key of redisClient.scanIterator()) {
if (regPattern.test(key) && (this.prefix === undefined || key.includes(this.prefix))) {
if (action === 'delete') {
await redisClient.del(key)
} else {
cacheDict[key] = await redisClient.get(key);
}
}
}
} else {
// not a regex means we can use glob pattern (more efficient!)
for await (const key of redisScanIterator(redisClient, {MATCH: globPattern})) {
if (action === 'delete') {
await redisClient.del(key)
} else {
cacheDict[key] = await redisClient.get(key);
}
}
}
} else if (this.cache.store.keys !== undefined) {
for (const key of await this.cache.store.keys()) {
if (regPattern.test(key) && (this.prefix === undefined || key.includes(this.prefix))) {
if (action === 'delete') {
await this.cache.del(key)
} else {
cacheDict[key] = await this.cache.get(key);
}
}
}
}
return cacheDict;
}
async deleteCacheByKeyPattern(pattern: string | RegExp) {
return await this.interactWithCacheByKeyPattern(pattern, 'delete');
}
async getCacheByKeyPattern(pattern: string | RegExp) {
return await this.interactWithCacheByKeyPattern(pattern, 'get');
}
get store() {
return this.cache.store;
}
del(key: string, shared = false): Promise<any> {
return this.cache.del(`${shared ? this.defaultPrefix : this.prefix}${key}`);
}
get<T>(key: string, shared = false): Promise<T | undefined> {
return this.cache.get(`${shared ? this.defaultPrefix : this.prefix}${key}`);
}
reset(): Promise<void> {
return this.cache.reset();
}
set<T>(key: string, value: T, options?: CachingConfig & {shared?: boolean}): Promise<T> {
const {shared = false} = options || {};
return this.cache.set(`${shared ? this.defaultPrefix : this.prefix}${key}`, value, options);
}
wrap<T>(...args: WrapArgsType<T>[]): Promise<T> {
const options: any = args.length >= 3 ? args[2] : {};
const {shared = false} = options || {};
args[0] = `${shared ? this.defaultPrefix : this.prefix}${args[0]}`;
return this.cache.wrap(...args);
}
async destroy() {
if (this.pruneInterval !== undefined && this.providerOptions.store === 'memory' && !this.isDefaultCache) {
clearInterval(this.pruneInterval);
this.cache?.reset();
}
}
}
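
A minimal consumer-side sketch of the async `createCacheManager` factory shown above. The import path, the `bootstrapCache` name, and the concrete connection/TTL values are illustrative assumptions, not part of this change; the point is only that callers must now `await` cache construction before using `get`/`set`.

```typescript
import { createCacheManager } from "./Cache"; // assumed import path, for illustration only

async function bootstrapCache() {
    // The factory now resolves asynchronously because the redis store itself is async.
    const cache = await createCacheManager({
        store: 'redis',     // 'memory' or 'none' are also accepted, per the switch above
        host: 'localhost',  // placeholder connection values
        port: 6379,
        ttl: 60,
    });
    await cache.set('example:key', { hello: 'world' }, { ttl: 30 });
    return await cache.get('example:key');
}
```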

View File

@@ -294,7 +294,7 @@ export type UserNoteType =
export const userNoteTypes = ['gooduser', 'spamwatch', 'spamwarn', 'abusewarn', 'ban', 'permban', 'botban'];
export type ConfigFragmentParseFunc = (data: object, fetched: boolean, subreddit?: string) => object | object[];
export type ConfigFragmentValidationFunc = (data: object, fetched: boolean) => boolean;
export interface WikiContext {
wiki: string

View File

@@ -8,9 +8,8 @@ import {
} from "../Atomic";
import {ActivityType, MaybeActivityType} from "../Reddit";
import {GenericComparison, parseGenericValueComparison} from "../Comparisons";
import {parseStringToRegexOrLiteralSearch, toModNoteLabel} from "../../../util";
import {parseStringToRegexOrLiteralSearch} from "../../../util";
import { Submission, Comment } from "snoowrap";
import {RedditUser} from "snoowrap/dist/objects";
/**
* Different attributes a `Subreddit` can be in. Only include a property if you want to check it.
@@ -120,34 +119,6 @@ export interface UserNoteCriteria extends UserSubredditHistoryCriteria {
* @examples ["spamwarn"]
* */
type: string;
/**
* The content of the Note to search For.
*
* * Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.
* * Can also be Regular Expression if wrapped in forward slashes IE '\/test.*\/i'
* */
note?: string | string[]
/*
* Does this note link to the currently processing Activity?
* */
referencesCurrentActivity?: boolean
}
export interface FullUserNoteCriteria extends Omit<UserNoteCriteria, 'note'> {
note?: RegExp[]
}
export const toFullUserNoteCriteria = (val: UserNoteCriteria): FullUserNoteCriteria => {
const {note} = val;
let notesVal = undefined;
if (note !== undefined) {
const notesArr = Array.isArray(note) ? note : [note];
notesVal = notesArr.map(x => parseStringToRegexOrLiteralSearch(x));
}
return {
...val,
note: notesVal
}
}
export interface ModActionCriteria extends UserSubredditHistoryCriteria {
@@ -159,9 +130,6 @@ export interface ModActionCriteria extends UserSubredditHistoryCriteria {
export interface FullModActionCriteria extends Omit<ModActionCriteria, 'count'> {
type?: ModActionType[]
count?: GenericComparison
/*
* Does this action/note link to the currently processing Activity?
* */
activityType?: MaybeActivityType[]
}
@@ -170,14 +138,8 @@ export interface ModNoteCriteria extends ModActionCriteria {
note?: string | string[]
}
export interface FullModNoteCriteria extends FullModActionCriteria {
export interface FullModNoteCriteria extends FullModActionCriteria, Omit<ModNoteCriteria, 'note' | 'count' | 'type' | 'activityType'> {
noteType?: ModUserNoteLabel[]
/**
* The content of the Note to search For.
*
* * Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.
* * Can also be Regular Expression if wrapped in forward slashes IE '\/test.*\/i'
* */
note?: RegExp[]
}
@@ -206,15 +168,7 @@ export const toFullModNoteCriteria = (val: ModNoteCriteria): FullModNoteCriteria
acc.count = parseGenericValueComparison(rawVal);
break;
case 'activityType':
if(rawVal === false) {
acc[k] = rawVal
} else {
acc[k] = rawVal.toLowerCase();
}
break;
case 'noteType':
acc[k] = rawVal.map((x: string) => toModNoteLabel(x));
break;
case 'referencesCurrentActivity':
acc[k] = rawVal;
break;
@@ -242,7 +196,7 @@ export interface FullModLogCriteria extends FullModActionCriteria, Omit<ModLogCr
description?: RegExp[]
}
const arrayableModLogProps = ['type','activityType','action','description','details'];
const arrayableModLogProps = ['type','activityType','action','description','details', 'type'];
export const asModLogCriteria = (val: any): val is ModLogCriteria => {
return val !== null && typeof val === 'object' && !asModNoteCriteria(val) && ('action' in val || 'details' in val || 'description' in val || 'activityType' in val || 'search' in val || 'count' in val || 'type' in val);
@@ -267,17 +221,9 @@ export const toFullModLogCriteria = (val: ModLogCriteria): FullModLogCriteria =>
acc.count = parseGenericValueComparison(rawVal);
break;
case 'activityType':
if(rawVal === false) {
acc[k] = rawVal
} else {
acc[k] = rawVal.toLowerCase();
}
break;
case 'type':
acc[k] = rawVal.map((x: string) => x.toUpperCase());
break;
case 'referencesCurrentActivity':
acc[k] = rawVal;
acc[k as keyof FullModLogCriteria] = rawVal;
break;
case 'action':
case 'description':
@@ -315,25 +261,14 @@ export interface AuthorCriteria {
*
* * If `true` then passes if ANY css is assigned
* * If `false` then passes if NO css is assigned
* * If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.
* @examples ["red"]
* */
flairCssClass?: boolean | string | string[],
/**
* The (user) flair background color (or list of) from the subreddit to match against
*
* * If `true` then passes if ANY css background color is assigned
* * If `false` then passes if NO css background is assigned
* * If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes.
* */
flairBackgroundColor?: boolean | string | string[],
/**
* A (user) flair text value (or list of) from the subreddit to match against
*
* * If `true` then passes if ANY text is assigned
* * If `false` then passes if NO text is assigned
* * If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.
*
* @examples ["Approved"]
* */
@@ -344,7 +279,6 @@ export interface AuthorCriteria {
*
* * If `true` then passes if ANY template is assigned
* * If `false` then passed if NO template is assigned
* * If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.
*
* */
flairTemplate?: boolean | string | string[]
@@ -653,13 +587,6 @@ export const cmToSnoowrapActivityMap: Record<string, keyof (Submission & Comment
flairCssClass: 'author_flair_css_class',
}
export const cmToSnoowrapAuthorMap: Record<string, keyof (Submission & Comment)> = {
flairText: 'author_flair_text',
flairCssClass: 'author_flair_css_class',
flairTemplate: 'author_flair_template_id',
flairBackgroundColor: 'author_flair_background_color',
}
export const cmActivityProperties = ['submissionState', 'score', 'reports', 'removed', 'deleted', 'filtered', 'age', 'title'];
/**

View File

@@ -41,10 +41,6 @@ export interface FilterOptionsJson<T> {
}
export const asFilterOptionsJson = <T>(val: any): val is FilterOptionsJson<T> => {
return val !== null && typeof val === 'object' && (val.include !== undefined || val.exclude !== undefined);
}
export interface FilterOptionsConfig<T> extends FilterOptionsJson<T> {
/**
@@ -70,7 +66,7 @@ export interface FilterOptions<T> extends FilterOptionsConfig<T> {
export type MinimalOrFullFilter<T> = MaybeAnonymousCriteria<T>[] | FilterOptions<T>
export type MinimalOrFullMaybeAnonymousFilter<T> = MaybeAnonymousCriteria<T>[] | FilterOptionsConfig<T>
export type MinimalOrFullFilterJson<T> = MaybeAnonymousOrStringCriteria<T> | MaybeAnonymousOrStringCriteria<T>[] | FilterOptionsJson<T>
export type MinimalOrFullFilterJson<T> = MaybeAnonymousOrStringCriteria<T>[] | FilterOptionsJson<T>
export type StructuredFilter<T> = Omit<T, 'authorIs' | 'itemIs'> & {
itemIs?: MinimalOrFullFilter<TypedActivityState>
authorIs?: MinimalOrFullFilter<AuthorCriteria>

View File

@@ -1,6 +1,4 @@
import {Comment, RedditUser, Submission, Subreddit} from "snoowrap/dist/objects";
import { ValueOf } from "ts-essentials";
import {CMError} from "../../Utils/Errors";
import {Comment, Submission} from "snoowrap/dist/objects";
export type ActivityType = 'submission' | 'comment';
export type MaybeActivityType = ActivityType | false;
@@ -23,22 +21,8 @@ export type AuthorHistorySortTime = 'hour' | 'day' | 'week' | 'month' | 'year' |
export type AuthorHistoryType = 'comment' | 'submission' | 'overview';
export type SnoowrapActivity = Submission | Comment;
type valueof<T> = T[keyof T]
/*
* Depending on what caching provider is used the results from cache can either be
*
* * full-fat SnoowrapActivities (memory provider keeps everything in memory!)
* * OR json-serialized objects of the data from those activities (all other cache providers)
*
* we don't know which they are until we retrieve them.
* */
export type SnoowrapLike = Record<keyof SnoowrapActivity, valueof<SnoowrapActivity>>;
export type RedditUserLike = Record<keyof RedditUser, valueof<RedditUser>>;
export type SubredditLike = Record<keyof Subreddit, valueof<Subreddit>>;
export interface CachedFetchedActivitiesResult {
pre: SnoowrapActivity[] | SnoowrapLike[]
pre: SnoowrapActivity[]
rawCount: number
apiCount: number
preMaxTrigger?: string | null
@@ -46,7 +30,6 @@ export interface CachedFetchedActivitiesResult {
export interface FetchedActivitiesResult extends CachedFetchedActivitiesResult {
post: SnoowrapActivity[]
pre: SnoowrapActivity[]
}
export type ReportType = 'mod' | 'user';
@@ -121,48 +104,3 @@ export interface SubredditActivityBreakdownByType {
submission: SubredditActivityBreakdown[]
comment: SubredditActivityBreakdown[]
}
/**
* * `comment` -> reply to activity with comment as the bot account
* * `commentModTeam` -> reply to activity with comment "as subreddit" (modteam account)
* * `modmail` -> send a modmail as the bot account
* * `modmailSubreddit` -> send a modmail as the subreddit
* */
export type CMRemovalMessageType = 'comment' | 'commentModTeam' | 'modmailSubreddit' | 'modmail';
export const cmRemovalMessageType = ['comment', 'commentModTeam', 'modmailSubreddit', 'modmail'];
export const toCmRemovalMessageType = (val: string): CMRemovalMessageType => {
switch(val.toLowerCase()){
case 'comment':
return 'comment';
case 'modmail':
return 'modmail';
case 'commentmodteam':
return 'commentModTeam';
case 'modmailsubreddit':
return 'modmailSubreddit';
default:
throw new CMError(`Removal Message Type '${val}' was not recognized. Valid types: ${cmRemovalMessageType.join(', ')}`);
}
}
export type RedditRemovalMessageType = 'public' | 'private' | 'private_exposed' | 'public_as_subreddit';
export const redditRemoveMessageTypes = ['public', 'private', 'private_exposed', 'public_as_subreddit'];
export const cmToRedditRemovalReason = (cmType: CMRemovalMessageType): RedditRemovalMessageType => {
switch (cmType) {
case 'comment':
return 'public';
case 'commentModTeam':
return 'public_as_subreddit';
case 'modmail':
return 'private_exposed';
case 'modmailSubreddit':
return 'private';
}
}
export interface RedditRemovalMessageOptions {
title?: string
lock?: boolean
}

View File

@@ -1,50 +0,0 @@
import {
ActivityDispatch,
CacheConfig,
Footer,
StrongTTLConfig,
ThirdPartyCredentialsJsonConfig,
TTLConfig
} from "../interfaces";
import {Cache} from "cache-manager";
import {Subreddit} from "snoowrap/dist/objects";
import {DataSource} from "typeorm";
import {Logger} from "winston";
import {ExtendedSnoowrap} from "../../Utils/SnoowrapClients";
import {ManagerEntity} from "../Entities/ManagerEntity";
import {Bot} from "../Entities/Bot";
import {EventRetentionPolicyRange, StatisticFrequencyOption} from "../Infrastructure/Atomic";
import {CMCache} from "../Cache";
export interface SubredditResourceOptions extends Footer {
ttl: StrongTTLConfig
cache: CMCache
cacheType: string;
cacheSettingsHash: string
subreddit: Subreddit,
database: DataSource
logger: Logger;
client: ExtendedSnoowrap;
prefix?: string;
thirdPartyCredentials: ThirdPartyCredentialsJsonConfig
delayedItems?: ActivityDispatch[]
botAccount?: string
botName: string
managerEntity: ManagerEntity
botEntity: Bot
statFrequency: StatisticFrequencyOption
retention?: EventRetentionPolicyRange
footer?: false | string
}
export interface SubredditResourceConfig extends Footer {
caching?: CacheConfig,
subreddit: Subreddit,
logger: Logger;
client: ExtendedSnoowrap
credentials?: ThirdPartyCredentialsJsonConfig
managerEntity: ManagerEntity
botEntity: Bot
statFrequency: StatisticFrequencyOption
retention?: EventRetentionPolicyRange
}

View File

@@ -163,3 +163,9 @@ declare module 'wink-sentiment' {
export default sentiment;
}
declare module 'cache-manager-redis-store' {
import {RedisClientOptions} from "@redis/client";
import {Cache, CachingConfig} from "cache-manager";
export function redisStore(config: RedisClientOptions & Partial<CachingConfig>): Promise<Cache>;
}
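
A hedged sketch of how this ambient typing is exercised by the v3 store API: `redisStore` resolves asynchronously and the resolved store object is handed to `cache-manager`, mirroring the `createCacheManager` refactor earlier in this compare. Host, port, and TTL values are placeholders.

```typescript
import cacheManager from "cache-manager";
import { redisStore } from "cache-manager-redis-store";

// Sketch only: connection values are placeholders.
async function buildRedisCache() {
    const store = await redisStore({
        socket: { host: 'localhost', port: 6379 },
        ttl: 60,
    });
    // The awaited store is passed directly as the cache-manager store.
    return cacheManager.caching({ store, ttl: 60 });
}
```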

View File

@@ -45,4 +45,4 @@ export const filterCriteriaDefault: FilterCriteriaDefaults = {
export const defaultDataDir = path.resolve(__dirname, '../..');
export const defaultConfigFilenames = ['config.json', 'config.yaml'];
export const VERSION = '0.13.2';
export const VERSION = '0.13.1';

View File

@@ -481,8 +481,6 @@ export interface TTLConfig {
modNotesTTL?: number | boolean;
}
export type StrongTTLConfig = Record<keyof Required<TTLConfig>, number | false>;
export interface CacheConfig extends TTLConfig {
/**
* The cache provider and, optionally, a custom configuration for that provider
@@ -492,9 +490,32 @@ export interface CacheConfig extends TTLConfig {
* To specify another `provider` but use its default configuration set this property to a string of one of the available providers: `memory`, `redis`, or `none`
* */
provider?: CacheProvider | CacheOptions
/**
* The **maximum** number of Events that the cache should store triggered result summaries for
*
* These summaries are viewable through the Web UI.
*
* The value specified by a subreddit cannot be larger than the value set by the Operator for the global/bot config (if set)
*
* @default 25
* @example [25]
* */
actionedEventsMax?: number
}
export interface OperatorCacheConfig extends CacheConfig {
/**
* The **default** number of Events that the cache will store triggered result summaries for
*
* These summaries are viewable through the Web UI.
*
* The value specified cannot be larger than `actionedEventsMax` for the global/bot config (if set)
*
* @default 25
* @example [25]
* */
actionedEventsDefault?: number
}
export interface Footer {
@@ -651,33 +672,6 @@ export interface ManagerOptions {
*
* */
retention?: EventRetentionPolicyRange
/**
* Enables config sharing
*
* * (Default) When `false` sharing is not enabled
* When `true` any bot that can access this bot's config wiki page can use it
* * When an object, use `include` or `exclude` to define subreddits that can access this config
* */
sharing?: boolean | string[] | SharingACLConfig
}
export interface SharingACLConfig {
/**
* A list of subreddits, or regular expressions for subreddit names, that are allowed to access this config
* */
include?: string[]
/**
* A list of subreddits, or regular expressions for subreddit names, that are NOT allowed to access this config
*
* If `include` is defined this property is ignored
* */
exclude?: string[]
}
export interface StrongSharingACLConfig {
include?: RegExp[]
exclude?: RegExp[]
}
export interface ThresholdCriteria {
@@ -759,6 +753,8 @@ export type StrongCache = {
modNotesTTL: number | boolean,
filterCriteriaTTL: number | boolean,
provider: CacheOptions
actionedEventsMax?: number,
actionedEventsDefault: number,
}
/**
@@ -818,11 +814,6 @@ export interface CacheOptions {
* */
max?: number
/**
* A prefix to add to all keys
* */
prefix?: string
[key:string]: any
}
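
A small sketch of how `actionedEventsDefault` and `actionedEventsMax` are meant to interact, based on the resolution logic later in this compare (the default is clamped to the max with `Math.min`). The import path and the values below are illustrative assumptions, not taken from a real config.

```typescript
import { OperatorCacheConfig } from "./Common/interfaces"; // assumed path

// Illustrative only: with these values the effective default becomes
// Math.min(actionedEventsDefault, actionedEventsMax) = 50.
const operatorCaching: OperatorCacheConfig = {
    provider: 'memory',
    actionedEventsDefault: 100, // used when a bot/subreddit does not specify its own value
    actionedEventsMax: 50,      // ceiling that lower-level configs cannot exceed
};
```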

View File

@@ -8,7 +8,7 @@ import {
overwriteMerge,
parseBool, parseExternalUrl, parseUrlContext, parseWikiContext, randomId,
readConfigFile,
removeUndefinedKeys, resolvePathFromEnvWithRelative, toStrongSharingACLConfig
removeUndefinedKeys, resolvePathFromEnvWithRelative
} from "./util";
import Ajv, {Schema} from 'ajv';
@@ -41,7 +41,7 @@ import {
BotCredentialsJsonConfig,
BotCredentialsConfig,
OperatorFileConfig,
PostBehavior, SharingACLConfig
PostBehavior
} from "./Common/interfaces";
import {isRuleSetJSON, RuleSetConfigData, RuleSetConfigHydratedData, RuleSetConfigObject} from "./Rule/RuleSet";
import deepEqual from "fast-deep-equal";
@@ -74,11 +74,10 @@ import {ErrorWithCause} from "pony-cause";
import {RunConfigHydratedData, RunConfigData, RunConfigObject} from "./Run";
import {AuthorRuleConfig} from "./Rule/AuthorRule";
import {
CacheProvider, ConfigFormat, ConfigFragmentParseFunc,
CacheProvider, ConfigFormat, ConfigFragmentValidationFunc,
PollOn
} from "./Common/Infrastructure/Atomic";
import {
asFilterOptionsJson,
FilterCriteriaDefaults,
FilterCriteriaDefaultsJson,
MaybeAnonymousOrStringCriteria, MinimalOrFullFilter, MinimalOrFullFilterJson, NamedCriteria
@@ -170,7 +169,7 @@ export class ConfigBuilder {
return validateJson<SubredditConfigData>(config, appSchema, this.logger);
}
async hydrateConfigFragment<T>(val: IncludesData | string | object, resource: SubredditResources, parseFunc?: ConfigFragmentParseFunc, subreddit?: string): Promise<T[]> {
async hydrateConfigFragment<T>(val: IncludesData | string | object, resource: SubredditResources, validateFunc?: ConfigFragmentValidationFunc): Promise<T[]> {
let includes: IncludesData | undefined = undefined;
if(typeof val === 'string') {
const strContextResult = parseUrlContext(val);
@@ -198,7 +197,7 @@ export class ConfigBuilder {
}
}
const resolvedFragment = await resource.getConfigFragment(includes, parseFunc);
const resolvedFragment = await resource.getConfigFragment(includes, validateFunc);
if(Array.isArray(resolvedFragment)) {
return resolvedFragment
}
@@ -231,36 +230,18 @@ export class ConfigBuilder {
let hydratedRunArr: RunConfigData | RunConfigData[];
try {
hydratedRunArr = await this.hydrateConfigFragment<RunConfigData>(r, resource, <RunConfigData>(data: any, fetched: boolean, subreddit?: string) => {
if(data.runs !== undefined && subreddit !== undefined) {
const sharing: boolean | SharingACLConfig = data.sharing ?? false;
if(sharing === false) {
throw new ConfigParseError(`The resource defined at ${r} does not have sharing enabled.`);
} else if(sharing !== true) {
const strongAcl = toStrongSharingACLConfig(sharing);
if(strongAcl.include !== undefined) {
if(!strongAcl.include.some(x => x.test(resource.subreddit.display_name))) {
throw new ConfigParseError(`The resource defined at ${r} does not have sharing enabled for this subreddit.`);
}
} else if(strongAcl.exclude !== undefined) {
if(strongAcl.exclude.some(x => x.test(resource.subreddit.display_name))) {
throw new ConfigParseError(`The resource defined at ${r} does not have sharing enabled for this subreddit.`);
}
}
}
}
const runDataVals = data.runs !== undefined ? data.runs : data;
if (!fetched) {
if (Array.isArray(runDataVals)) {
for (const runData of runDataVals) {
hydratedRunArr = await this.hydrateConfigFragment<RunConfigData>(r, resource, <RunConfigData>(data: object, fetched: boolean) => {
if (fetched) {
if (Array.isArray(data)) {
for (const runData of data) {
validateJson<RunConfigData>(runData, runSchema, this.logger);
}
} else {
validateJson<RunConfigData>(runDataVals, runSchema, this.logger);
validateJson<RunConfigData>(data, runSchema, this.logger);
}
return runDataVals;
return true;
}
return runDataVals;
return true;
});
} catch (e: any) {
throw new CMError(`Could not fetch or validate Run #${runIndex}`, {cause: e});
@@ -291,9 +272,9 @@ export class ConfigBuilder {
} else {
validateJson<ActivityCheckConfigHydratedData>(data, checkSchema, this.logger);
}
return data;
return true;
}
return data;
return true;
});
} catch (e: any) {
throw new CMError(`Could not fetch or validate Check Config Fragment #${checkIndex} in Run #${runIndex}`, {cause: e});
@@ -382,14 +363,12 @@ export class ConfigBuilder {
return validatedHydratedConfig;
}
async parseToHydrated(config: SubredditConfigData, resource: SubredditResources) {
return await this.hydrateConfig(config, resource);
}
async parseToStructured(hydratedConfig: SubredditConfigHydratedData, filterCriteriaDefaultsFromBot?: FilterCriteriaDefaults, postCheckBehaviorDefaultsFromBot: PostBehavior = {}): Promise<RunConfigObject[]> {
async parseToStructured(config: SubredditConfigData, resource: SubredditResources, filterCriteriaDefaultsFromBot?: FilterCriteriaDefaults, postCheckBehaviorDefaultsFromBot: PostBehavior = {}): Promise<RunConfigObject[]> {
let namedRules: Map<string, RuleConfigObject> = new Map();
let namedActions: Map<string, ActionConfigObject> = new Map();
const {filterCriteriaDefaults, postCheckBehaviorDefaults} = hydratedConfig;
const {filterCriteriaDefaults, postCheckBehaviorDefaults} = config;
const hydratedConfig = await this.hydrateConfig(config, resource);
const {runs: realRuns = []} = hydratedConfig;
@@ -549,7 +528,7 @@ const parseFilterJson = <T>(addToFilter: FilterJsonFuncArg<T>) => (val: MinimalO
for (const v of val) {
addToFilter(v);
}
} else if(asFilterOptionsJson<T>(val)) {
} else {
const {include = [], exclude = []} = val;
for (const v of include) {
addToFilter(v);
@@ -567,6 +546,46 @@ export const extractNamedFilters = (config: SubredditConfigHydratedData, namedAu
const parseAuthorIs = parseFilterJson(addToAuthors);
const parseItemIs = parseFilterJson(addToItems);
// const parseAuthorIs = (val: MinimalOrFullFilterJson<AuthorCriteria> | undefined) => {
// if (val === undefined) {
// return;
// }
// if (Array.isArray(val)) {
// for (const v of val) {
// addToAuthors(v);
// }
// } else {
// const {include = [], exclude = []} = val;
// for (const v of include) {
// addToAuthors(v);
// }
// for (const v of exclude) {
// addToAuthors(v);
// }
// }
// }
// const parseItemIs = (val: MinimalOrFullFilterJson<TypedActivityState> | undefined) => {
// if (val === undefined) {
// return;
// }
// if (Array.isArray(val)) {
// for (const v of val) {
// addToItems(v);
// }
// } else {
// const {include = [], exclude = []} = val;
// for (const v of include) {
// addToItems(v);
// }
// for (const v of exclude) {
// addToItems(v);
// }
// }
// }
//const namedRules = new Map();
const {
filterCriteriaDefaults,
runs = []
@@ -642,36 +661,34 @@ export const insertNameFilters = (namedAuthorFilters: Map<string, NamedCriteria<
authorIs: undefined,
itemIs: undefined,
}
if (val.authorIs !== undefined) {
if(val.authorIs !== undefined) {
if (Array.isArray(val.authorIs)) {
runnableOpts.authorIs = val.authorIs.map(x => getNamedAuthorOrReturn(x))
} else if (asFilterOptionsJson<AuthorCriteria>(val.authorIs)) {
const {include, exclude, ...rest} = val.authorIs;
runnableOpts.authorIs = {...rest};
if (include !== undefined) {
} else {
runnableOpts.authorIs = {};
const {include, exclude} = val.authorIs;
if(include !== undefined) {
runnableOpts.authorIs.include = include.map(x => getNamedAuthorOrReturn(x))
} else if (exclude !== undefined) {
}
if(exclude !== undefined) {
runnableOpts.authorIs.exclude = exclude.map(x => getNamedAuthorOrReturn(x))
}
} else {
// assume object is criteria
runnableOpts.authorIs = [getNamedAuthorOrReturn(val.authorIs)];
}
}
if (val.itemIs !== undefined) {
if(val.itemIs !== undefined) {
if (Array.isArray(val.itemIs)) {
runnableOpts.itemIs = val.itemIs.map(x => getNamedItemOrReturn(x))
} else if (asFilterOptionsJson<TypedActivityState>(val.itemIs)) {
const {include, exclude, ...rest} = val.itemIs;
runnableOpts.itemIs = {...rest};
if (include !== undefined) {
} else {
runnableOpts.itemIs = {};
const {include, exclude} = val.itemIs;
if(include !== undefined) {
runnableOpts.itemIs.include = include.map(x => getNamedItemOrReturn(x))
} else if (exclude !== undefined) {
}
if(exclude !== undefined) {
runnableOpts.itemIs.exclude = exclude.map(x => getNamedItemOrReturn(x))
}
} else {
// assume object is criteria
runnableOpts.itemIs = [getNamedItemOrReturn(val.itemIs)];
}
}
@@ -1229,6 +1246,8 @@ export const buildOperatorConfigWithDefaults = async (data: OperatorJsonConfig):
let cache: StrongCache;
let defaultProvider: CacheOptions;
let opActionedEventsMax: number | undefined;
let opActionedEventsDefault: number = 25;
const dataDir = process.env.DATA_DIR ?? defaultDataDir;
@@ -1240,10 +1259,16 @@ export const buildOperatorConfigWithDefaults = async (data: OperatorJsonConfig):
cache = {
...cacheTTLDefaults,
provider: defaultProvider,
actionedEventsDefault: opActionedEventsDefault,
};
} else {
const {provider, ...restConfig} = opCache;
const {provider, actionedEventsMax, actionedEventsDefault = opActionedEventsDefault, ...restConfig} = opCache;
if (actionedEventsMax !== undefined && actionedEventsMax !== null) {
opActionedEventsMax = actionedEventsMax;
opActionedEventsDefault = Math.min(actionedEventsDefault, actionedEventsMax);
}
if (typeof provider === 'string') {
defaultProvider = {
@@ -1261,6 +1286,8 @@ export const buildOperatorConfigWithDefaults = async (data: OperatorJsonConfig):
cache = {
...cacheTTLDefaults,
...restConfig,
actionedEventsMax: opActionedEventsMax,
actionedEventsDefault: opActionedEventsDefault,
provider: defaultProvider,
}
}
@@ -1399,6 +1426,8 @@ export const buildBotConfig = (data: BotInstanceJsonConfig, opConfig: OperatorCo
const {
snoowrap: snoowrapOp,
caching: {
actionedEventsMax: opActionedEventsMax,
actionedEventsDefault: opActionedEventsDefault = 25,
provider: defaultProvider,
} = {},
userAgent,
@@ -1453,18 +1482,28 @@ export const buildBotConfig = (data: BotInstanceJsonConfig, opConfig: OperatorCo
botCache = {
...cacheTTLDefaults,
actionedEventsDefault: opActionedEventsDefault,
actionedEventsMax: opActionedEventsMax,
provider: {...defaultProvider as CacheOptions}
};
} else {
const {
provider,
actionedEventsMax = opActionedEventsMax,
actionedEventsDefault = opActionedEventsDefault,
...restConfig
} = caching;
botActionedEventsDefault = actionedEventsDefault;
if (actionedEventsMax !== undefined) {
botActionedEventsDefault = Math.min(actionedEventsDefault, actionedEventsMax);
}
if (typeof provider === 'string') {
botCache = {
...cacheTTLDefaults,
...restConfig,
actionedEventsDefault: botActionedEventsDefault,
provider: {
store: provider as CacheProvider,
...cacheOptDefaults
@@ -1475,6 +1514,8 @@ export const buildBotConfig = (data: BotInstanceJsonConfig, opConfig: OperatorCo
botCache = {
...cacheTTLDefaults,
...restConfig,
actionedEventsDefault: botActionedEventsDefault,
actionedEventsMax,
provider: {
store,
...cacheOptDefaults,

View File

@@ -229,7 +229,7 @@ export class MHSRule extends Rule {
protected async getMHSResponse(content: string): Promise<MHSResponse> {
const hash = objectHash.sha1({content});
const key = `mhs-${hash}`;
if (this.resources.ttl.wikiTTL !== false) {
if (this.resources.wikiTTL !== false) {
let res = await this.resources.cache.get(key) as undefined | null | MHSResponse;
if(res !== undefined && res !== null) {
// don't cache bad responses
@@ -240,7 +240,7 @@ export class MHSRule extends Rule {
}
res = await this.callMHS(content);
if(res.response.toLowerCase() === 'success') {
await this.resources.cache.set(key, res, {ttl: this.resources.ttl.wikiTTL});
await this.resources.cache.set(key, res, {ttl: this.resources.wikiTTL});
}
return res;
}
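
The hunk above keeps the content-hash caching pattern intact while changing which TTL property it reads. Below is a condensed, hedged restatement of that pattern for reference (hash the content for a stable key, reuse a cached hit, only cache successful responses); the standalone function and its parameter names are placeholders, not the actual class method.

```typescript
import objectHash from "object-hash";

type MHSResponse = { response: string; [k: string]: any };

// Placeholder signature sketching the caching pattern from getMHSResponse above.
async function cachedMHSCall(
    content: string,
    cache: { get: (k: string) => Promise<any>, set: (k: string, v: any, o?: any) => Promise<any> },
    wikiTTL: number | false,
    callMHS: (content: string) => Promise<MHSResponse>
): Promise<MHSResponse> {
    const key = `mhs-${objectHash.sha1({ content })}`; // stable key per content
    if (wikiTTL !== false) {
        const hit = await cache.get(key);
        if (hit !== undefined && hit !== null) {
            return hit; // reuse previously cached successful response
        }
    }
    const res = await callMHS(content);
    if (wikiTTL !== false && res.response.toLowerCase() === 'success') {
        await cache.set(key, res, { ttl: wikiTTL }); // don't cache bad responses
    }
    return res;
}
```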

View File

@@ -205,23 +205,6 @@
]
]
},
"flairBackgroundColor": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": [
"string",
"boolean"
]
}
],
"description": "The (user) flair background color (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css background color is assigned\n* If `false` then passes if NO css background is assigned\n* If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes."
},
"flairCssClass": {
"anyOf": [
{
@@ -237,7 +220,7 @@
]
}
],
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned",
"examples": [
"red"
]
@@ -257,7 +240,7 @@
]
}
],
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes."
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned"
},
"flairText": {
"anyOf": [
@@ -274,7 +257,7 @@
]
}
],
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned",
"examples": [
"Approved"
]
@@ -2989,7 +2972,7 @@
"properties": {
"allowDuplicate": {
"default": false,
"description": "Add Note even if a Note already exists for this Activity\n\nUSE `existingNoteCheck` INSTEAD",
"description": "Add Note even if a Note already exists for this Activity",
"examples": [
false
],
@@ -3045,21 +3028,6 @@
],
"type": "boolean"
},
"existingNoteCheck": {
"anyOf": [
{
"$ref": "#/definitions/UserNoteCriteria"
},
{
"type": "boolean"
}
],
"default": true,
"description": "Check if there is an existing Note matching some criteria before adding the Note.\n\nIf this check passes then the Note is added. The value may be a boolean or UserNoteCriteria.\n\nBoolean convenience:\n\n* If `true` or undefined then CM generates a UserNoteCriteria that passes only if there is NO existing note matching note criteria\n* If `false` then no check is performed and Note is always added",
"examples": [
true
]
},
"itemIs": {
"anyOf": [
{
@@ -3127,23 +3095,6 @@
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The content of the Note to search For.\n\n* Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.\n* Can also be Regular Expression if wrapped in forward slashes IE '\\/test.*\\/i'"
},
"referencesCurrentActivity": {
"type": "boolean"
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",

View File

@@ -663,23 +663,6 @@
]
]
},
"flairBackgroundColor": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": [
"string",
"boolean"
]
}
],
"description": "The (user) flair background color (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css background color is assigned\n* If `false` then passes if NO css background is assigned\n* If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes."
},
"flairCssClass": {
"anyOf": [
{
@@ -695,7 +678,7 @@
]
}
],
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned",
"examples": [
"red"
]
@@ -715,7 +698,7 @@
]
}
],
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes."
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned"
},
"flairText": {
"anyOf": [
@@ -732,7 +715,7 @@
]
}
],
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned",
"examples": [
"Approved"
]
@@ -1040,6 +1023,11 @@
},
"CacheConfig": {
"properties": {
"actionedEventsMax": {
"default": 25,
"description": "The **maximum** number of Events that the cache should store triggered result summaries for\n\nThese summaries are viewable through the Web UI.\n\nThe value specified by a subreddit cannot be larger than the value set by the Operator for the global/bot config (if set)",
"type": "number"
},
"authorTTL": {
"default": 60,
"description": "Amount of time, in seconds, author activity history (Comments/Submission) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache\n\n* ENV => `AUTHOR_TTL`\n* ARG => `--authorTTL <sec>`",
@@ -1198,10 +1186,6 @@
],
"type": "number"
},
"prefix": {
"description": "A prefix to add to all keys",
"type": "string"
},
"store": {
"$ref": "#/definitions/CacheProvider"
},
@@ -5962,25 +5946,6 @@
],
"type": "object"
},
"SharingACLConfig": {
"properties": {
"exclude": {
"description": "A list of subreddits, or regular expressions for subreddit names, that are NOT allowed to access this config\n\nIf `include` is defined this property is ignored",
"items": {
"type": "string"
},
"type": "array"
},
"include": {
"description": "A list of subreddits, or regular expressions for subreddit names, that are allowed to access this config",
"items": {
"type": "string"
},
"type": "array"
}
},
"type": "object"
},
"SubmissionActionJson": {
"description": "Reply to the Activity. For a submission the reply will be a top-level comment.",
"properties": {
@@ -6889,7 +6854,7 @@
"properties": {
"allowDuplicate": {
"default": false,
"description": "Add Note even if a Note already exists for this Activity\n\nUSE `existingNoteCheck` INSTEAD",
"description": "Add Note even if a Note already exists for this Activity",
"examples": [
false
],
@@ -6945,21 +6910,6 @@
],
"type": "boolean"
},
"existingNoteCheck": {
"anyOf": [
{
"$ref": "#/definitions/UserNoteCriteria"
},
{
"type": "boolean"
}
],
"default": true,
"description": "Check if there is an existing Note matching some criteria before adding the Note.\n\nIf this check passes then the Note is added. The value may be a boolean or UserNoteCriteria.\n\nBoolean convenience:\n\n* If `true` or undefined then CM generates a UserNoteCriteria that passes only if there is NO existing note matching note criteria\n* If `false` then no check is performed and Note is always added",
"examples": [
true
]
},
"itemIs": {
"anyOf": [
{
@@ -7027,23 +6977,6 @@
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The content of the Note to search For.\n\n* Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.\n* Can also be Regular Expression if wrapped in forward slashes IE '\\/test.*\\/i'"
},
"referencesCurrentActivity": {
"type": "boolean"
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
@@ -7228,23 +7161,6 @@
},
"minItems": 1,
"type": "array"
},
"sharing": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"$ref": "#/definitions/SharingACLConfig"
},
{
"type": "boolean"
}
],
"description": "Enables config sharing\n\n* (Default) When `false` sharing is not enabled\n* When `true` any bot that can access this bot's config wiki page can use inpm t\n* When an object, use `include` or `exclude` to define subreddits that can access this config"
}
},
"type": "object"

View File

@@ -677,23 +677,6 @@
]
]
},
"flairBackgroundColor": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": [
"string",
"boolean"
]
}
],
"description": "The (user) flair background color (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css background color is assigned\n* If `false` then passes if NO css background is assigned\n* If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes."
},
"flairCssClass": {
"anyOf": [
{
@@ -709,7 +692,7 @@
]
}
],
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned",
"examples": [
"red"
]
@@ -729,7 +712,7 @@
]
}
],
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes."
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned"
},
"flairText": {
"anyOf": [
@@ -746,7 +729,7 @@
]
}
],
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned",
"examples": [
"Approved"
]
@@ -6287,7 +6270,7 @@
"properties": {
"allowDuplicate": {
"default": false,
"description": "Add Note even if a Note already exists for this Activity\n\nUSE `existingNoteCheck` INSTEAD",
"description": "Add Note even if a Note already exists for this Activity",
"examples": [
false
],
@@ -6343,21 +6326,6 @@
],
"type": "boolean"
},
"existingNoteCheck": {
"anyOf": [
{
"$ref": "#/definitions/UserNoteCriteria"
},
{
"type": "boolean"
}
],
"default": true,
"description": "Check if there is an existing Note matching some criteria before adding the Note.\n\nIf this check passes then the Note is added. The value may be a boolean or UserNoteCriteria.\n\nBoolean convenience:\n\n* If `true` or undefined then CM generates a UserNoteCriteria that passes only if there is NO existing note matching note criteria\n* If `false` then no check is performed and Note is always added",
"examples": [
true
]
},
"itemIs": {
"anyOf": [
{
@@ -6425,23 +6393,6 @@
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The content of the Note to search For.\n\n* Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.\n* Can also be Regular Expression if wrapped in forward slashes IE '\\/test.*\\/i'"
},
"referencesCurrentActivity": {
"type": "boolean"
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",

View File

@@ -49,23 +49,6 @@
]
]
},
"flairBackgroundColor": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": [
"string",
"boolean"
]
}
],
"description": "The (user) flair background color (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css background color is assigned\n* If `false` then passes if NO css background is assigned\n* If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes."
},
"flairCssClass": {
"anyOf": [
{
@@ -81,7 +64,7 @@
]
}
],
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned",
"examples": [
"red"
]
@@ -101,7 +84,7 @@
]
}
],
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes."
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned"
},
"flairText": {
"anyOf": [
@@ -118,7 +101,7 @@
]
}
],
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned",
"examples": [
"Approved"
]
@@ -491,10 +474,6 @@
],
"type": "number"
},
"prefix": {
"description": "A prefix to add to all keys",
"type": "string"
},
"store": {
"$ref": "#/definitions/CacheProvider"
},
@@ -989,18 +968,8 @@
"credentials": {
"$ref": "#/definitions/InfluxCredentials"
},
"debug": {
"type": "boolean"
},
"defaultTags": {
"$ref": "#/definitions/Record<string,string>"
},
"useKeepAliveAgent": {
"type": "boolean"
},
"writeOptions": {
"$ref": "#/definitions/WriteOptions",
"description": "Options used by{@linkWriteApi}."
}
},
"required": [
@@ -1602,6 +1571,16 @@
},
"OperatorCacheConfig": {
"properties": {
"actionedEventsDefault": {
"default": 25,
"description": "The **default** number of Events that the cache will store triggered result summaries for\n\nThese summaries are viewable through the Web UI.\n\nThe value specified cannot be larger than `actionedEventsMax` for the global/bot config (if set)",
"type": "number"
},
"actionedEventsMax": {
"default": 25,
"description": "The **maximum** number of Events that the cache should store triggered result summaries for\n\nThese summaries are viewable through the Web UI.\n\nThe value specified by a subreddit cannot be larger than the value set by the Operator for the global/bot config (if set)",
"type": "number"
},
"authorTTL": {
"default": 60,
"description": "Amount of time, in seconds, author activity history (Comments/Submission) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache\n\n* ENV => `AUTHOR_TTL`\n* ARG => `--authorTTL <sec>`",
@@ -2304,23 +2283,6 @@
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The content of the Note to search For.\n\n* Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.\n* Can also be Regular Expression if wrapped in forward slashes IE '\\/test.*\\/i'"
},
"referencesCurrentActivity": {
"type": "boolean"
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
@@ -2380,94 +2342,6 @@
}
},
"type": "object"
},
"WriteOptions": {
"description": "Options used by{@linkWriteApi}.",
"properties": {
"batchSize": {
"description": "max number of records/lines to send in a batch",
"type": "number"
},
"consistency": {
"description": "InfluxDB Enterprise write consistency as explained in https://docs.influxdata.com/enterprise_influxdb/v1.9/concepts/clustering/#write-consistency",
"enum": [
"all",
"any",
"one",
"quorum"
],
"type": "string"
},
"defaultTags": {
"$ref": "#/definitions/Record<string,string>",
"description": "default tags, unescaped"
},
"exponentialBase": {
"description": "base for the exponential retry delay",
"type": "number"
},
"flushInterval": {
"description": "delay between data flushes in milliseconds, at most `batch size` records are sent during flush",
"type": "number"
},
"gzipThreshold": {
"description": "When specified, write bodies larger than the threshold are gzipped",
"type": "number"
},
"headers": {
"additionalProperties": {
"type": "string"
},
"description": "HTTP headers that will be sent with every write request",
"type": "object"
},
"maxBatchBytes": {
"description": "max size of a batch in bytes",
"type": "number"
},
"maxBufferLines": {
"description": "the maximum size of retry-buffer (in lines)",
"type": "number"
},
"maxRetries": {
"description": "max count of retries after the first write fails",
"type": "number"
},
"maxRetryDelay": {
"description": "maximum delay when retrying write (milliseconds)",
"type": "number"
},
"maxRetryTime": {
"description": "max time (millis) that can be spent with retries",
"type": "number"
},
"minRetryDelay": {
"description": "minimum delay when retrying write (milliseconds)",
"type": "number"
},
"randomRetry": {
"description": "randomRetry indicates whether the next retry delay is deterministic (false) or random (true).\nThe deterministic delay starts with `minRetryDelay * exponentialBase` and it is multiplied\nby `exponentialBase` until it exceeds `maxRetryDelay`.\nWhen random is `true`, the next delay is computed as a random number between next retry attempt (upper)\nand the lower number in the deterministic sequence. `random(retryJitter)` is added to every returned value.",
"type": "boolean"
},
"retryJitter": {
"description": "add `random(retryJitter)` milliseconds delay when retrying HTTP calls",
"type": "number"
}
},
"required": [
"batchSize",
"exponentialBase",
"flushInterval",
"maxBatchBytes",
"maxBufferLines",
"maxRetries",
"maxRetryDelay",
"maxRetryTime",
"minRetryDelay",
"randomRetry",
"retryJitter"
],
"type": "object"
}
},
"description": "Configuration for application-level settings IE for running the bot instance\n\n* To load a JSON configuration **from the command line** use the `-c` cli argument EX: `node src/index.js -c /path/to/JSON/config.json`\n* To load a JSON configuration **using an environmental variable** use `OPERATOR_CONFIG` EX: `OPERATOR_CONFIG=/path/to/JSON/config.json`",

View File

@@ -595,23 +595,6 @@
]
]
},
"flairBackgroundColor": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": [
"string",
"boolean"
]
}
],
"description": "The (user) flair background color (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css background color is assigned\n* If `false` then passes if NO css background is assigned\n* If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes."
},
"flairCssClass": {
"anyOf": [
{
@@ -627,7 +610,7 @@
]
}
],
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned",
"examples": [
"red"
]
@@ -647,7 +630,7 @@
]
}
],
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes."
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned"
},
"flairText": {
"anyOf": [
@@ -664,7 +647,7 @@
]
}
],
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned",
"examples": [
"Approved"
]
@@ -3962,23 +3945,6 @@
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The content of the Note to search For.\n\n* Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.\n* Can also be Regular Expression if wrapped in forward slashes IE '\\/test.*\\/i'"
},
"referencesCurrentActivity": {
"type": "boolean"
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",

View File

@@ -560,23 +560,6 @@
]
]
},
"flairBackgroundColor": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": [
"string",
"boolean"
]
}
],
"description": "The (user) flair background color (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css background color is assigned\n* If `false` then passes if NO css background is assigned\n* If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes."
},
"flairCssClass": {
"anyOf": [
{
@@ -592,7 +575,7 @@
]
}
],
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned",
"examples": [
"red"
]
@@ -612,7 +595,7 @@
]
}
],
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes."
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned"
},
"flairText": {
"anyOf": [
@@ -629,7 +612,7 @@
]
}
],
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned",
"examples": [
"Approved"
]
@@ -3927,23 +3910,6 @@
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The content of the Note to search For.\n\n* Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.\n* Can also be Regular Expression if wrapped in forward slashes IE '\\/test.*\\/i'"
},
"referencesCurrentActivity": {
"type": "boolean"
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",

View File

@@ -674,23 +674,6 @@
]
]
},
"flairBackgroundColor": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": [
"string",
"boolean"
]
}
],
"description": "The (user) flair background color (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css background color is assigned\n* If `false` then passes if NO css background is assigned\n* If string or list of strings then color is matched, case-insensitive, without #. String may also be a regular expression enclosed in forward slashes."
},
"flairCssClass": {
"anyOf": [
{
@@ -706,7 +689,7 @@
]
}
],
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair css class (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY css is assigned\n* If `false` then passes if NO css is assigned",
"examples": [
"red"
]
@@ -726,7 +709,7 @@
]
}
],
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes."
"description": "A (user) flair template id (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY template is assigned\n* If `false` then passed if NO template is assigned"
},
"flairText": {
"anyOf": [
@@ -743,7 +726,7 @@
]
}
],
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned\n* If string or list of strings then text is matched, case-insensitive. String may also be a regular expression enclosed in forward slashes.",
"description": "A (user) flair text value (or list of) from the subreddit to match against\n\n* If `true` then passes if ANY text is assigned\n* If `false` then passes if NO text is assigned",
"examples": [
"Approved"
]
@@ -6484,7 +6467,7 @@
"properties": {
"allowDuplicate": {
"default": false,
"description": "Add Note even if a Note already exists for this Activity\n\nUSE `existingNoteCheck` INSTEAD",
"description": "Add Note even if a Note already exists for this Activity",
"examples": [
false
],
@@ -6540,21 +6523,6 @@
],
"type": "boolean"
},
"existingNoteCheck": {
"anyOf": [
{
"$ref": "#/definitions/UserNoteCriteria"
},
{
"type": "boolean"
}
],
"default": true,
"description": "Check if there is an existing Note matching some criteria before adding the Note.\n\nIf this check passes then the Note is added. The value may be a boolean or UserNoteCriteria.\n\nBoolean convenience:\n\n* If `true` or undefined then CM generates a UserNoteCriteria that passes only if there is NO existing note matching note criteria\n* If `false` then no check is performed and Note is always added",
"examples": [
true
]
},
"itemIs": {
"anyOf": [
{
@@ -6622,23 +6590,6 @@
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The content of the Note to search For.\n\n* Can be a single string or list of strings to search for. Each string will be searched for case-insensitive, as a subset of note content.\n* Can also be Regular Expression if wrapped in forward slashes IE '\\/test.*\\/i'"
},
"referencesCurrentActivity": {
"type": "boolean"
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",

View File

@@ -53,7 +53,10 @@ import {Submission, Comment, Subreddit} from 'snoowrap/dist/objects';
import {activityIsRemoved, ItemContent, itemContentPeek} from "../Utils/SnoowrapUtils";
import LoggedError from "../Utils/LoggedError";
import {
SubredditResources
BotResourcesManager,
SubredditResourceConfig,
SubredditResources,
SubredditResourceSetOptions
} from "./SubredditResources";
import {SPoll, UnmoderatedStream, ModQueueStream, SubmissionStream, CommentStream} from "./Streams";
import EventEmitter from "events";
@@ -104,9 +107,6 @@ import {InfluxClient} from "../Common/Influx/InfluxClient";
import { Point } from "@influxdata/influxdb-client";
import {NormalizedManagerResponse} from "../Web/Common/interfaces";
import {guestEntityToApiGuest} from "../Common/Entities/Guest/GuestEntity";
import {BotResourcesManager} from "../Bot/ResourcesManager";
import {SubredditResourceConfig} from "../Common/Subreddit/SubredditResourceInterfaces";
import objectHash from "object-hash";
export interface RunningState {
state: RunState,
@@ -210,7 +210,6 @@ export class Manager extends EventEmitter implements RunningStates {
startedAt?: DayjsObj;
validConfigLoaded: boolean = false;
lastParseConfigHash?: string;
eventsState: RunningState = {
state: STOPPED,
@@ -274,8 +273,8 @@ export class Manager extends EventEmitter implements RunningStates {
data.historical = this.resources.getHistoricalDisplayStats();
data.cache = resStats.cache;
data.cache.currentKeyCount = await this.resources.getCacheKeyCount();
data.cache.isShared = this.resources.cache.isDefaultCache;
data.cache.provider = this.resources.cache.providerOptions.store;
data.cache.isShared = this.resources.cacheSettingsHash === 'default';
data.cache.provider = this.resources.cacheType;
}
return data;
}
@@ -382,7 +381,7 @@ export class Manager extends EventEmitter implements RunningStates {
this.eventsSampleInterval = setInterval((function(self) {
return function() {
const et = self.resources !== undefined ? self.resources.subredditStats.stats.historical.eventsCheckedTotal : 0;
const et = self.resources !== undefined ? self.resources.stats.historical.eventsCheckedTotal : 0;
const rollingSample = self.eventsSample.slice(0, 7)
rollingSample.unshift(et)
self.eventsSample = rollingSample;
@@ -404,7 +403,7 @@ export class Manager extends EventEmitter implements RunningStates {
this.rulesUniqueSampleInterval = setInterval((function(self) {
return function() {
const rollingSample = self.rulesUniqueSample.slice(0, 7)
const rt = self.resources !== undefined ? self.resources.subredditStats.stats.historical.rulesRunTotal - self.resources.subredditStats.stats.historical.rulesCachedTotal : 0;
const rt = self.resources !== undefined ? self.resources.stats.historical.rulesRunTotal - self.resources.stats.historical.rulesCachedTotal : 0;
rollingSample.unshift(rt);
self.rulesUniqueSample = rollingSample;
const diff = self.rulesUniqueSample.reduceRight((acc: number[], curr, index) => {
@@ -701,9 +700,7 @@ export class Manager extends EventEmitter implements RunningStates {
this.logger.info('Subreddit-specific options updated');
this.logger.info('Building Runs and Checks...');
const hydratedConfig = await configBuilder.hydrateConfig(validJson, this.resources);
this.lastParseConfigHash = objectHash.sha1(hydratedConfig);
const structuredRuns = await configBuilder.parseToStructured(hydratedConfig, this.filterCriteriaDefaults, this.postCheckBehaviorDefaults);
const structuredRuns = await configBuilder.parseToStructured(validJson, this.resources, this.filterCriteriaDefaults, this.postCheckBehaviorDefaults);
let runs: Run[] = [];
@@ -808,21 +805,16 @@ export class Manager extends EventEmitter implements RunningStates {
async parseConfiguration(causedBy: Invokee = 'system', force: boolean = false, options?: ManagerStateChangeOption) {
const {reason, suppressNotification = false, suppressChangeEvent = false} = options || {};
if(this.resources === undefined) {
await this.setResourceManager();
}
//this.wikiUpdateRunning = true;
this.lastWikiCheck = dayjs();
let wikiPageChanged = false;
try {
let sourceData: string;
let wiki: WikiPage;
try {
try {
const {val, wikiPage} = await this.resources.getWikiPage({wiki: this.wikiLocation}, {force: true});
wiki = wikiPage as WikiPage;
//sourceData = val as string;
// @ts-ignore
wiki = await this.getWikiPage();
} catch (err: any) {
if(err.cause !== undefined && isStatusError(err.cause) && err.cause.statusCode === 404) {
// try to create it
@@ -836,16 +828,26 @@ export class Manager extends EventEmitter implements RunningStates {
}
}
const revisionDate = dayjs.unix(wiki.revision_date);
if(this.lastWikiRevision !== undefined) {
if(this.lastWikiRevision.isSame(revisionDate)) {
this.logger.verbose('Config wiki has not changed since last check, going ahead with other checks...');
} else {
wikiPageChanged = true;
this.logger.info(`Updating config due to stale wiki page (${dayjs.duration(dayjs().diff(revisionDate)).humanize()} old)`)
if (!force && this.validConfigLoaded && (this.lastWikiRevision !== undefined && this.lastWikiRevision.isSame(revisionDate))) {
// nothing to do, we already have this revision
//this.wikiUpdateRunning = false;
if (force) {
this.logger.info('Config is up to date');
}
} else {
return false;
}
if (force) {
this.logger.info('Config update was forced');
} else if (!this.validConfigLoaded) {
this.logger.info('Trying to load (new?) config now since there is no valid config loaded');
} else if (this.lastWikiRevision !== undefined) {
this.logger.info(`Updating config due to stale wiki page (${dayjs.duration(dayjs().diff(revisionDate)).humanize()} old)`)
}
if(this.queueState.state === RUNNING) {
this.logger.verbose('Waiting for activity processing queue to pause before continuing config update');
await this.pauseQueue(causedBy);
}
this.lastWikiRevision = revisionDate;
@@ -855,8 +857,8 @@ export class Manager extends EventEmitter implements RunningStates {
}
if (sourceData.replace('\r\n', '').trim() === '') {
this.logger.error(`Wiki page contents is empty. The bot cannot run until this subreddit's wiki page has a valid config added!`);
throw new ConfigParseError(`Wiki page contents is empty. The bot cannot run until this subreddit's wiki page has a valid config added!`);
this.logger.error(`Wiki page contents was empty`);
throw new ConfigParseError('Wiki page contents was empty');
}
const [format, configObj, jsonErr, yamlErr] = parseFromJsonOrYamlToObject(sourceData);
@@ -876,23 +878,6 @@ export class Manager extends EventEmitter implements RunningStates {
throw new ConfigParseError('Could not parse wiki page contents as JSON or YAML')
}
if (!wikiPageChanged && this.validConfigLoaded && this.lastParseConfigHash !== undefined && !force) {
// need to check if hydrated is different from current
const hydratedRuns = await this.buildHydratedRuns(configObj.toJS());
const hydratedHash = objectHash.sha1(hydratedRuns);
if (hydratedHash === this.lastParseConfigHash) {
this.logger.info('Config is up to date');
return false;
} else {
this.logger.info('Hydrated config differed from wiki contents, continuing with update.');
}
}
if(this.queueState.state === RUNNING) {
this.logger.verbose('Waiting for activity processing queue to pause before continuing config update');
await this.pauseQueue(causedBy);
}
await this.parseConfigurationFromObject(configObj.toJS(), suppressChangeEvent);
this.logger.info('Checks updated');
@@ -912,12 +897,6 @@ export class Manager extends EventEmitter implements RunningStates {
}
}
async buildHydratedRuns(configObj: object) {
const configBuilder = new ConfigBuilder({logger: this.logger});
const validJson = configBuilder.validateJson(configObj);
return await configBuilder.hydrateConfig(validJson, this.resources);
}
async handleActivity(activity: (Submission | Comment), options: runCheckOptions): Promise<void> {
const checkType = isSubmission(activity) ? 'Submission' : 'Comment';
let item = activity,
@@ -995,7 +974,7 @@ export class Manager extends EventEmitter implements RunningStates {
const itemId = await item.id;
if(await this.resources.hasRecentSelf(item)) {
let recentMsg = `Found in Activities recently (last ${this.resources.ttl.selfTTL} seconds) modified/created by this bot`;
let recentMsg = `Found in Activities recently (last ${this.resources.selfTTL} seconds) modified/created by this bot`;
if(force) {
this.logger.debug(`${recentMsg} but will run anyway because "force" option was true.`);
} else {
@@ -1899,6 +1878,33 @@ export class Manager extends EventEmitter implements RunningStates {
}
}
// @ts-ignore
async getWikiPage(location: string = this.wikiLocation) {
let wiki: WikiPage;
try {
// @ts-ignore
wiki = await this.subreddit.getWikiPage(location).fetch();
} catch (err: any) {
if (isStatusError(err)) {
const error = err.statusCode === 404 ? 'does not exist' : 'is not accessible';
let reasons = [];
if (!this.client.scope.includes('wikiread')) {
reasons.push(`Bot does not have 'wikiread' oauth permission`);
} else {
const modPermissions = await this.getModPermissions();
if (!modPermissions.includes('all') && !modPermissions.includes('wiki')) {
reasons.push(`Bot does not have required mod permissions ('all' or 'wiki') to read restricted wiki pages`);
}
}
throw new CMError(`Wiki page ${generateFullWikiUrl(this.subreddit, location)} ${error} (${err.statusCode})${reasons.length > 0 ? ` because: ${reasons.join(' | ')}` : '.'}`, {cause: err});
} else {
throw new CMError(`Wiki page ${generateFullWikiUrl(this.subreddit, location)} could not be read`, {cause: err});
}
}
return wiki;
}
toNormalizedManager(): NormalizedManagerResponse {
return {
name: this.displayLabel,
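
One side of the diff above reduces config refreshing to a two-stage staleness check: the wiki revision date is compared first, and only when the page is unchanged (and a valid config is already loaded) does it fall back to hashing the hydrated config. A condensed sketch of that decision, with hypothetical helper names and the queue/notification handling omitted:

```typescript
// Condensed restatement of the staleness check shown on one side of the diff above.
// Helper names are hypothetical; queue pausing and notifications are omitted.
interface StalenessContext {
    force: boolean;
    validConfigLoaded: boolean;
    lastWikiRevision?: number;               // unix seconds of the last seen revision
    lastParseConfigHash?: string;            // objectHash.sha1 of the last hydrated config
    currentWikiRevision: number;
    hydrateAndHash: () => Promise<string>;   // hydrate wiki contents + fetched fragments, then hash
}

async function shouldReparseConfig(ctx: StalenessContext): Promise<boolean> {
    const wikiChanged = ctx.lastWikiRevision === undefined
        || ctx.lastWikiRevision !== ctx.currentWikiRevision;
    if (ctx.force || !ctx.validConfigLoaded || wikiChanged) {
        return true;
    }
    // wiki unchanged and a valid config is loaded: only reparse when the hydrated
    // config (which may pull in external fragments) actually differs
    if (ctx.lastParseConfigHash !== undefined) {
        return (await ctx.hydrateAndHash()) !== ctx.lastParseConfigHash;
    }
    return true;
}
```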

View File

@@ -3,23 +3,10 @@ import {Submission, RedditUser, Comment, Subreddit} from "snoowrap/dist/objects"
import {ModUserNote, ModUserNoteRaw} from "./ModUserNote";
//import {ExtendedSnoowrap} from "../../Utils/SnoowrapClients";
import dayjs, {Dayjs} from "dayjs";
import {
asComment,
asSubmission,
generateSnoowrapEntityFromRedditThing,
isComment,
isSubmission,
parseRedditFullname
} from "../../util";
import {generateSnoowrapEntityFromRedditThing, parseRedditFullname} from "../../util";
import Snoowrap from "snoowrap";
import {ModActionType, ModUserNoteLabel} from "../../Common/Infrastructure/Atomic";
import {MaybeActivityType, RedditThing, SnoowrapActivity} from "../../Common/Infrastructure/Reddit";
import {
FullModActionCriteria,
FullModLogCriteria,
FullModNoteCriteria
} from "../../Common/Infrastructure/Filters/FilterCriteria";
import {CMError} from "../../Utils/Errors";
import {RedditThing} from "../../Common/Infrastructure/Reddit";
export interface ModNoteSnoowrapPopulated extends Omit<ModNoteRaw, 'subreddit' | 'user'> {
subreddit: Subreddit
@@ -95,12 +82,8 @@ export class ModNote {
}
this.action = new ModAction(data.mod_action_data, client);
if (this.action.actedOn instanceof RedditUser) {
if(this.action.actedOn.id === this.user.id) {
this.action.actedOn = this.user;
}/* else if(data.operator !== undefined) {
this.action.actedOn.name = data.operator;
}*/
if (this.action.actedOn instanceof RedditUser && this.action.actedOn.id === this.user.id) {
this.action.actedOn = this.user;
}
this.note = new ModUserNote(data.user_note_data, client);
@@ -133,112 +116,4 @@ export class ModNote {
toJSON() {
return this.toRaw();
}
matchesModActionCriteria(fullCrit: FullModActionCriteria, referenceItem?: SnoowrapActivity) {
const {count: {duration} = {}, activityType, referencesCurrentActivity, type} = fullCrit;
let cutoffDate: Dayjs | undefined;
if (duration !== undefined) {
// filter out any notes that occur before time range
cutoffDate = dayjs().subtract(duration);
if (this.createdAt.isBefore(cutoffDate)) {
return false;
}
}
if (activityType !== undefined) {
const anyMatch = activityType.some((a: MaybeActivityType) => {
switch (a) {
case 'submission':
return isSubmission(this.action.actedOn);
case 'comment':
return isComment(this.action.actedOn);
case false:
return this.action.actedOn === undefined || (!asSubmission(this.action.actedOn) && !asComment(this.action.actedOn));
}
});
if (!anyMatch) {
return false;
}
}
if (referencesCurrentActivity !== undefined) {
if (referenceItem === undefined) {
throw new CMError('Criteria wants to check if mod note references activity but no activity was given.');
}
const isCurrentActivity = this.action.actedOn !== undefined && referenceItem !== undefined && this.action.actedOn.name === referenceItem.name;
if ((referencesCurrentActivity === true && !isCurrentActivity) || (referencesCurrentActivity === false && isCurrentActivity)) {
return false;
}
}
if (type !== undefined) {
if (!type.includes((this.type as ModActionType))) {
return false
}
}
return true;
}
matchesModLogCriteria(fullCrit: FullModLogCriteria, referenceItem: SnoowrapActivity) {
if (!this.matchesModActionCriteria({
type: ['NOTE'], // default to filtering by note type but allow overriding?
...fullCrit
}, referenceItem)) {
return false;
}
const fullCritEntries = Object.entries(fullCrit);
for (const [k, v] of fullCritEntries) {
const key = k.toLocaleLowerCase();
switch (key) {
case 'description':
case 'action':
case 'details':
const actionPropVal = this.action[key] as string;
if (actionPropVal === undefined) {
return false;
}
const anyPropMatch = v.some((y: RegExp) => y.test(actionPropVal));
if (!anyPropMatch) {
return false;
}
break;
}
}
return true;
}
matchesModNoteCriteria(fullCrit: FullModNoteCriteria, referenceItem: SnoowrapActivity) {
if(!this.matchesModActionCriteria(fullCrit, referenceItem)) {
return false;
}
const fullCritEntries = Object.entries(fullCrit);
for (const [k, v] of fullCritEntries) {
const key = k.toLocaleLowerCase();
switch (key) {
case 'notetype':
if (!v.map((x: ModUserNoteLabel) => x.toUpperCase()).includes((this.note.label as ModUserNoteLabel))) {
return false
}
break;
case 'note':
const actionPropVal = this.note.note;
if (actionPropVal === undefined) {
return false;
}
const anyPropMatch = v.some((y: RegExp) => y.test(actionPropVal));
if (!anyPropMatch) {
return false;
}
break;
}
}
return true;
}
}
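
The three matchers on one side of the diff above layer on each other: the note and mod-log checks first run the generic action check (mod-log defaulting `type` to `['NOTE']`), then apply their own field tests. A compressed outline with simplified, stand-in types:

```typescript
// Compressed outline of how the matchers above compose; criteria and note shapes
// are simplified stand-ins for the real FullMod*Criteria interfaces.
type ActionCrit = { type?: string[] };
type NoteCrit = ActionCrit & { noteType?: string[]; note?: RegExp[] };

const matchesAction = (actionType: string, crit: ActionCrit): boolean =>
    crit.type === undefined || crit.type.includes(actionType);

const matchesNote = (note: { type: string; label: string; text: string }, crit: NoteCrit): boolean => {
    if (!matchesAction(note.type, crit)) return false;                                    // shared base check
    if (crit.noteType !== undefined && !crit.noteType.includes(note.label)) return false; // label filter
    if (crit.note !== undefined && !crit.note.some(r => r.test(note.text))) return false; // content filter
    return true;
};

console.log(matchesNote(
    { type: 'NOTE', label: 'SPAM_WATCH', text: 'watch for crypto spam' },
    { noteType: ['SPAM_WATCH'], note: [/crypto/i] },
)); // true
```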

View File

@@ -1,287 +0,0 @@
import {Between, DataSource, Repository} from "typeorm";
import {TotalStat} from "../../Common/Entities/Stats/TotalStat";
import {TimeSeriesStat} from "../../Common/Entities/Stats/TimeSeriesStat";
import {statFrequencies, StatisticFrequencyOption} from "../../Common/Infrastructure/Atomic";
import {ManagerEntity} from "../../Common/Entities/ManagerEntity";
import {HistoricalStatsDisplay, ResourceStats} from "../../Common/interfaces";
import {createHistoricalDisplayDefaults} from "../../Common/defaults";
import dayjs from "dayjs";
import {ErrorWithCause} from "pony-cause";
import {cacheStats, formatNumber, frequencyEqualOrLargerThanMin, mergeArr} from "../../util";
import {Cache} from "cache-manager";
import winston, {Logger} from "winston";
import {CMCache} from "../../Common/Cache";
export class SubredditStats {
totalStatsRepo: Repository<TotalStat>
totalStatsEntities?: TotalStat[];
tsStatsRepo: Repository<TimeSeriesStat>
timeSeriesStatsEntities?: TimeSeriesStat[];
statFrequency: StatisticFrequencyOption
historicalSaveInterval?: any;
managerEntity: ManagerEntity;
cache: CMCache;
protected logger: Logger;
init: boolean = false;
stats: {
cache: ResourceStats
historical: HistoricalStatsDisplay
timeSeries: HistoricalStatsDisplay
};
constructor(database: DataSource, managerEntity: ManagerEntity, cache: CMCache, statFrequency: StatisticFrequencyOption, logger: Logger) {
this.totalStatsRepo = database.getRepository(TotalStat);
this.tsStatsRepo = database.getRepository(TimeSeriesStat);
this.statFrequency = statFrequency;
this.managerEntity = managerEntity;
this.cache = cache;
if (logger === undefined) {
const alogger = winston.loggers.get('app')
this.logger = alogger.child({labels: [this.managerEntity.name, 'Stats']}, mergeArr);
} else {
this.logger = logger.child({labels: ['Stats']}, mergeArr);
}
this.stats = {
cache: cacheStats(),
historical: createHistoricalDisplayDefaults(),
timeSeries: createHistoricalDisplayDefaults(),
};
}
async initStats(force: boolean = false) {
if (!this.init || force) {
try {
let currentStats: HistoricalStatsDisplay = createHistoricalDisplayDefaults();
const totalStats = await this.totalStatsRepo.findBy({managerId: this.managerEntity.id});
if (totalStats.length === 0) {
const now = dayjs();
const statEntities: TotalStat[] = [];
for (const [k, v] of Object.entries(currentStats)) {
statEntities.push(new TotalStat({
metric: k,
value: v,
manager: this.managerEntity,
createdAt: now,
}));
}
await this.totalStatsRepo.save(statEntities);
this.totalStatsEntities = statEntities;
} else {
this.totalStatsEntities = totalStats;
for (const [k, v] of Object.entries(currentStats)) {
const matchedStat = totalStats.find(x => x.metric === k);
if (matchedStat !== undefined) {
currentStats[k] = matchedStat.value;
} else {
this.logger.warn(`Could not find historical stat matching '${k}' in the database, will default to 0`);
currentStats[k] = v;
}
}
}
this.stats.historical = currentStats;
} catch (e) {
this.logger.error(new ErrorWithCause('Failed to init historical stats', {cause: e}));
}
try {
if (this.statFrequency !== false) {
let currentStats: HistoricalStatsDisplay = createHistoricalDisplayDefaults();
let startRange = dayjs().set('second', 0);
for (const unit of statFrequencies) {
if (unit !== 'week' && !frequencyEqualOrLargerThanMin(unit, this.statFrequency)) {
startRange = startRange.set(unit, 0);
}
if (unit === 'week' && this.statFrequency === 'week') {
// make sure we get beginning of week
startRange = startRange.week(startRange.week());
}
}
// set end range by +1 of whatever unit we are using
const endRange = this.statFrequency === 'week' ? startRange.clone().week(startRange.week() + 1) : startRange.clone().set(this.statFrequency, startRange.get(this.statFrequency) + 1);
const tsStats = await this.tsStatsRepo.findBy({
managerId: this.managerEntity.id,
granularity: this.statFrequency,
// make sure its inclusive!
_createdAt: Between(startRange.clone().subtract(1, 'second').toDate(), endRange.clone().add(1, 'second').toDate())
});
if (tsStats.length === 0) {
const statEntities: TimeSeriesStat[] = [];
for (const [k, v] of Object.entries(currentStats)) {
statEntities.push(new TimeSeriesStat({
metric: k,
value: v,
granularity: this.statFrequency,
manager: this.managerEntity,
createdAt: startRange,
}));
}
this.timeSeriesStatsEntities = statEntities;
} else {
this.timeSeriesStatsEntities = tsStats;
}
for (const [k, v] of Object.entries(currentStats)) {
const matchedStat = this.timeSeriesStatsEntities.find(x => x.metric === k);
if (matchedStat !== undefined) {
currentStats[k] = matchedStat.value;
} else {
this.logger.warn(`Could not find time series stat matching '${k}' in the database, will default to 0`);
currentStats[k] = v;
}
}
}
} catch (e) {
this.logger.error(new ErrorWithCause('Failed to init frequency (time series) stats', {cause: e}));
}
this.init = true;
}
}
updateHistoricalStats(data: Partial<HistoricalStatsDisplay>) {
for (const [k, v] of Object.entries(data)) {
if (this.stats.historical[k] !== undefined && v !== undefined) {
this.stats.historical[k] += v;
}
if (this.stats.timeSeries[k] !== undefined && v !== undefined) {
this.stats.timeSeries[k] += v;
}
}
}
getHistoricalDisplayStats(): HistoricalStatsDisplay {
return this.stats.historical;
}
async saveHistoricalStats() {
if (this.totalStatsEntities !== undefined) {
for (const [k, v] of Object.entries(this.stats.historical)) {
const matchedStatIndex = this.totalStatsEntities.findIndex(x => x.metric === k);
if (matchedStatIndex !== -1) {
this.totalStatsEntities[matchedStatIndex].value = v;
} else {
this.logger.warn(`Could not find historical stat matching '${k}' in total stats??`);
}
}
await this.totalStatsRepo.save(this.totalStatsEntities);
}
if (this.timeSeriesStatsEntities !== undefined) {
for (const [k, v] of Object.entries(this.stats.timeSeries)) {
const matchedStatIndex = this.timeSeriesStatsEntities.findIndex(x => x.metric === k);
if (matchedStatIndex !== -1) {
this.timeSeriesStatsEntities[matchedStatIndex].value = v;
} else {
this.logger.warn(`Could not find time series stat matching '${k}' in total stats??`);
}
}
await this.tsStatsRepo.save(this.timeSeriesStatsEntities);
}
}
setHistoricalSaveInterval() {
this.historicalSaveInterval = setInterval((function (self) {
return async () => {
await self.saveHistoricalStats();
}
})(this), 10000);
}
getCacheTotals() {
return Object.values(this.stats.cache).reduce((acc, curr) => ({
miss: acc.miss + curr.miss,
req: acc.req + curr.requests,
}), {miss: 0, req: 0});
}
getCacheStats() {
return this.stats.cache;
}
async getCacheStatsForManager() {
const totals = this.getCacheTotals();
const cacheKeys = Object.keys(this.stats.cache);
const res = {
cache: {
// TODO could probably combine these two
totalRequests: totals.req,
totalMiss: totals.miss,
missPercent: `${formatNumber(totals.miss === 0 || totals.req === 0 ? 0 : (totals.miss / totals.req) * 100, {toFixed: 0})}%`,
types: await cacheKeys.reduce(async (accProm, curr) => {
const acc = await accProm;
// calculate miss percent
const per = acc[curr].miss === 0 ? 0 : formatNumber(acc[curr].miss / acc[curr].requests) * 100;
acc[curr].missPercent = `${formatNumber(per, {toFixed: 0})}%`;
// calculate average identifier hits
const idCache = acc[curr].identifierRequestCount;
// @ts-expect-error
const idKeys = await idCache.store.keys() as string[];
if (idKeys.length > 0) {
let hits = 0;
for (const k of idKeys) {
hits += await idCache.get(k) as number;
}
acc[curr].identifierAverageHit = formatNumber(hits / idKeys.length);
}
if (acc[curr].requestTimestamps.length > 1) {
// calculate average time between request
const diffData = acc[curr].requestTimestamps.reduce((accTimestampData, curr: number) => {
if (accTimestampData.last === 0) {
accTimestampData.last = curr;
return accTimestampData;
}
accTimestampData.diffs.push(curr - accTimestampData.last);
accTimestampData.last = curr;
return accTimestampData;
}, {last: 0, diffs: [] as number[]});
const avgDiff = diffData.diffs.reduce((acc, curr) => acc + curr, 0) / diffData.diffs.length;
acc[curr].averageTimeBetweenHits = formatNumber(avgDiff / 1000);
}
const {requestTimestamps, identifierRequestCount, ...rest} = acc[curr];
// @ts-ignore
acc[curr] = rest;
return acc;
}, Promise.resolve({...this.stats.cache}))
}
}
return res;
}
async incrementCacheTypeStat(cacheType: keyof ResourceStats, hash?: string, miss?: boolean) {
if (this.stats.cache[cacheType] === undefined) {
this.logger.warn(`Cache type ${cacheType} does not exist. Fix this!`);
}
if (hash !== undefined) {
await this.stats.cache[cacheType].identifierRequestCount.set(hash, (await this.stats.cache[cacheType].identifierRequestCount.wrap(hash, () => 0) as number) + 1);
}
this.stats.cache[cacheType].requestTimestamps.push(Date.now());
this.stats.cache[cacheType].requests++;
if (miss === true) {
this.stats.cache[cacheType].miss++;
}
}
resetCacheStats() {
this.stats.cache = cacheStats();
}
async destroy() {
if (this.historicalSaveInterval !== undefined) {
clearInterval(this.historicalSaveInterval);
await this.saveHistoricalStats();
}
}
}
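
The per-type cache statistics assembled above come down to a couple of small aggregates. A minimal illustration of the miss-percent and average-seconds-between-hits math; the sample numbers are made up.

```typescript
// Minimal illustration of the aggregate math used for cache stats above; sample data is made up.
const requests = 40, miss = 6;
const missPercent = `${((miss / requests) * 100).toFixed(0)}%`;          // "15%"

const requestTimestamps = [1000, 3000, 4000, 9000];                      // ms
const diffs = requestTimestamps.slice(1).map((t, i) => t - requestTimestamps[i]);
const averageTimeBetweenHits = (diffs.reduce((a, c) => a + c, 0) / diffs.length) / 1000;
console.log(missPercent, averageTimeBetweenHits);                        // "15%" ~2.67 (seconds)
```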

File diff suppressed because it is too large

View File

@@ -16,10 +16,6 @@ import {Cache} from 'cache-manager';
import {isScopeError} from "../Utils/Errors";
import {ErrorWithCause} from "pony-cause";
import {UserNoteType} from "../Common/Infrastructure/Atomic";
import {CMCache} from "../Common/Cache";
import {FullUserNoteCriteria, UserNoteCriteria} from "../Common/Infrastructure/Filters/FilterCriteria";
import {parseGenericValueOrPercentComparison} from "../Common/Infrastructure/Comparisons";
import {SnoowrapActivity} from "../Common/Infrastructure/Reddit";
interface RawUserNotesPayload {
ver: number,
@@ -71,7 +67,7 @@ export class UserNotes {
moderators?: RedditUser[];
logger: Logger;
identifier: string;
cache: CMCache
cache: Cache
cacheCB: Function;
mod?: RedditUser;
@@ -81,7 +77,7 @@ export class UserNotes {
debounceCB: any;
batchCount: number = 0;
constructor(ttl: number | boolean, subreddit: string, client: Snoowrap, logger: Logger, cache: CMCache, cacheCB: Function) {
constructor(ttl: number | boolean, subreddit: string, client: Snoowrap, logger: Logger, cache: Cache, cacheCB: Function) {
this.notesTTL = ttl === true ? 0 : ttl;
this.subreddit = client.getSubreddit(subreddit);
this.logger = logger;
@@ -255,44 +251,6 @@ export class UserNote {
}
public matches(criteria: FullUserNoteCriteria, item?: SnoowrapActivity) {
if (criteria.type !== undefined) {
if(typeof this.noteType === 'string') {
if(this.noteType.toLowerCase() !== criteria.type.toLowerCase().trim()) {
return false
}
} else {
return false;
}
}
if (criteria.note !== undefined && !criteria.note.some(x => x.test(this.text ?? ''))) {
return false;
}
if(criteria.referencesCurrentActivity !== undefined) {
if(criteria.referencesCurrentActivity) {
if(item === undefined) {
return false;
}
if(this.link === null) {
return false;
}
if(!this.link.includes(item.id)) {
return false;
}
} else if(this.link !== null && item !== undefined && this.link.includes(item.id)) {
return false;
}
}
const {duration} = parseGenericValueOrPercentComparison(criteria.count ?? '>= 1');
if (duration !== undefined) {
const cutoffDate = dayjs().subtract(duration);
if (this.time.isBefore(cutoffDate)) {
return false;
}
}
return true;
}
public toRaw(constants: UserNotesConstants): RawNote {
let m = this.modIndex;
if(m === undefined && this.moderator !== undefined) {

View File

@@ -1,14 +1,10 @@
import Snoowrap, {Listing, RedditUser} from "snoowrap";
import {Submission, Subreddit, Comment} from "snoowrap/dist/objects";
import {asSubmission, parseSubredditName} from "../util";
import {parseSubredditName} from "../util";
import {ModUserNoteLabel} from "../Common/Infrastructure/Atomic";
import {CreateModNoteData, ModNote, ModNoteRaw, ModNoteSnoowrapPopulated} from "../Subreddit/ModNotes/ModNote";
import {CMError, isStatusError, SimpleError} from "./Errors";
import {
RawSubredditRemovalReasonData, RedditRemovalMessageOptions,
RedditRemovalMessageType,
SnoowrapActivity
} from "../Common/Infrastructure/Reddit";
import {RawSubredditRemovalReasonData, SnoowrapActivity} from "../Common/Infrastructure/Reddit";
// const proxyFactory = (endpoint: string) => {
// return class ProxiedSnoowrap extends Snoowrap {
@@ -208,36 +204,6 @@ export class ExtendedSnoowrap extends Snoowrap {
method: 'get'
}) as RawSubredditRemovalReasonData;
}
// @ts-ignore
async addRemovalMessage(item: SnoowrapActivity, message: string, type: RedditRemovalMessageType, options: RedditRemovalMessageOptions = {}) {
const {
lock = false,
// in the body, title must be a non-empty string or else reddit throws an error
// -- it is only used if sending modmail
title = 'NOT USED'
} = options;
try {
const body: any = {
item_id: [item.name],
message,
type,
lock_comment: lock,
title,
};
const reply = await this.oauthRequest({
uri: `api/v1/modactions/${asSubmission(item) ? 'removal_link_message' : 'removal_comment_message'}`,
method: 'post',
body,
});
return new Comment(reply, this, true);
} catch (e: any) {
if (e.message.includes('The specified id is invalid')) {
throw new CMError('Activity must be REMOVED before a message can be sent.', {cause: e});
}
throw e;
}
}
}
export class RequestTrackingSnoowrap extends ExtendedSnoowrap {

View File

@@ -25,7 +25,7 @@ export type TypeormStoreOptions = Partial<SessionOptions & {
}>;
interface IWebStorageProvider {
createSessionStore(options?: CacheManagerStoreOptions | TypeormStoreOptions): Store
createSessionStore(options?: CacheManagerStoreOptions | TypeormStoreOptions): Promise<Store>
getSessionSecret(): Promise<string | undefined>
@@ -49,7 +49,7 @@ abstract class StorageProvider implements IWebStorageProvider {
this.logger = logger.child({labels: ['Web', 'Storage', ...loggerLabels]}, mergeArr);
}
abstract createSessionStore(options?: CacheManagerStoreOptions | TypeormStoreOptions): Store;
abstract createSessionStore(options?: CacheManagerStoreOptions | TypeormStoreOptions): Promise<Store>;
abstract getSessionSecret(): Promise<string | undefined>;
@@ -58,24 +58,24 @@ abstract class StorageProvider implements IWebStorageProvider {
export class CacheStorageProvider extends StorageProvider {
protected cache: Cache;
protected cache: Promise<Cache>;
constructor(caching: CacheOptions & StorageProviderOptions) {
super(caching);
const {logger, invitesMaxAge, loggerLabels, ...restCache } = caching;
this.cache = createCacheManager({...restCache, prefix: buildCachePrefix(['web'])}) as Cache;
this.cache = createCacheManager({...restCache, prefix: buildCachePrefix(['web'])}) as Promise<Cache>;
this.logger.debug('Using CACHE');
if (caching.store === 'none') {
this.logger.warn(`Using 'none' as cache provider means no one will be able to access the interface since sessions will never be persisted!`);
}
}
createSessionStore(options?: CacheManagerStoreOptions): Store {
return new CacheManagerStore(this.cache, {prefix: 'sess:'});
async createSessionStore(options?: CacheManagerStoreOptions): Promise<Store> {
return new CacheManagerStore((await this.cache), {prefix: 'sess:'});
}
async getSessionSecret() {
const val = await this.cache.get(`sessionSecret`);
const val = await (await this.cache).get(`sessionSecret`);
if (val === null || val === undefined) {
return undefined;
}
@@ -83,7 +83,7 @@ export class CacheStorageProvider extends StorageProvider {
}
async setSessionSecret(secret: string) {
await this.cache.set('sessionSecret', secret, {ttl: 0});
await (await this.cache).set('sessionSecret', secret, {ttl: 0});
}
}
@@ -102,7 +102,7 @@ export class DatabaseStorageProvider extends StorageProvider {
this.logger.debug('Using DATABASE');
}
createSessionStore(options?: TypeormStoreOptions): Store {
async createSessionStore(options?: TypeormStoreOptions): Promise<Store> {
return new TypeormStore(options).connect(this.clientSessionRepo)
}
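
The async conversions above follow from the cache itself now being created asynchronously (a Redis-backed store has to connect before it can be used). Below is a stripped-down sketch of how that Promise propagates to every consumer; the types and factory are stand-ins, not the project's actual API.

```typescript
// Stripped-down sketch of why createSessionStore and friends become async once the
// underlying cache is a Promise. Types and factory are stand-ins, not the real API.
interface SimpleCache {
    get(key: string): Promise<unknown>;
    set(key: string, value: unknown): Promise<void>;
}

// stand-in for an async store factory such as cache-manager-redis-store@3,
// which must establish a connection before the store is usable
async function createCacheAsync(prefix: string): Promise<SimpleCache> {
    const store = new Map<string, unknown>();
    await Promise.resolve(); // simulate the async connection step
    return {
        get: async (key) => store.get(`${prefix}:${key}`),
        set: async (key, value) => { store.set(`${prefix}:${key}`, value); },
    };
}

class SessionProviderSketch {
    protected cache: Promise<SimpleCache>;

    constructor() {
        // construction stays synchronous; each use site awaits the Promise instead
        this.cache = createCacheAsync('web');
    }

    async getSessionSecret(): Promise<unknown> {
        return (await this.cache).get('sessionSecret');
    }
}
```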

View File

@@ -324,7 +324,7 @@ const webClient = async (options: OperatorConfigWithFileContext) => {
cookie: {
maxAge: sessionMaxAge * 1000,
},
store: sessionStoreProvider.createSessionStore(sessionStorage === 'database' ? {
store: await sessionStoreProvider.createSessionStore(sessionStorage === 'database' ? {
cleanupLimit: 2,
ttl: sessionMaxAge
} : {}),

View File

@@ -151,7 +151,7 @@
problems.append(wrapper);
}
window.canSave = <%= canSave ?? false %>;
window.canSave = <%= canSave %>;
window.isGuest = false;
if (searchParams.get('subreddit') === null) {
@@ -342,35 +342,6 @@
return new Promise((resolve, reject) => resolve(url));
}
// https://stackoverflow.com/a/65996386
// if the user is accessing CM from an unsecure context
// -- http and NOT http://localhost
// then navigator.clipboard is not available and we need to fallback to execCommand
function copyToClipboard(textToCopy) {
// navigator clipboard api needs a secure context (https)
if (navigator.clipboard && window.isSecureContext) {
// navigator clipboard api method
return navigator.clipboard.writeText(textToCopy);
} else {
// text area method
let textArea = document.createElement("textarea");
textArea.value = textToCopy;
// make the textarea out of viewport
textArea.style.position = "fixed";
textArea.style.left = "-999999px";
textArea.style.top = "-999999px";
document.body.appendChild(textArea);
textArea.focus();
textArea.select();
return new Promise((res, rej) => {
// here the magic happens
document.execCommand('copy') ? res() : rej();
textArea.remove();
});
}
}
window.addEventListener('load', function () {
var searchParams = new URLSearchParams(window.location.search);
@@ -471,10 +442,7 @@
copy.insertAdjacentHTML('beforeend', `<a class="hover:bg-gray-400 no-underline rounded-md py-1 px-3 border" href="">Copy ID <span style="display:inline" class="iconify" data-icon="clarity:copy-to-clipboard-line"></span></a>`);
copy.addEventListener('click', e => {
e.preventDefault();
copyToClipboard(reason.id)
.catch((e) => {
console.log(`Could not copy ID ${reason.id} to clipboard due to an error`, e);
});
navigator.clipboard.writeText(reason.id);
});
node.appendChild(copy);
reasonsList.appendChild(node);

View File

@@ -29,12 +29,12 @@ import {
RuleResult,
RuleSetResult,
RunResult,
SearchAndReplaceRegExp, SharingACLConfig,
StringComparisonOptions, StrongSharingACLConfig, StrongTTLConfig, TTLConfig
SearchAndReplaceRegExp,
StringComparisonOptions
} from "./Common/interfaces";
import InvalidRegexError from "./Utils/InvalidRegexError";
import {accessSync, constants, promises} from "fs";
import {cacheTTLDefaults, VERSION} from "./Common/defaults";
import {VERSION} from "./Common/defaults";
import cacheManager from "cache-manager";
import Autolinker from 'autolinker';
import {LEVEL, MESSAGE} from "triple-beam";
@@ -1762,32 +1762,6 @@ export const cacheStats = (): ResourceStats => {
};
}
export const toStrongTTLConfig = (data: TTLConfig): StrongTTLConfig => {
const {
userNotesTTL = cacheTTLDefaults.userNotesTTL,
authorTTL = cacheTTLDefaults.authorTTL,
wikiTTL = cacheTTLDefaults.wikiTTL,
filterCriteriaTTL = cacheTTLDefaults.filterCriteriaTTL,
selfTTL = cacheTTLDefaults.selfTTL,
submissionTTL = cacheTTLDefaults.submissionTTL,
commentTTL = cacheTTLDefaults.commentTTL,
subredditTTL = cacheTTLDefaults.subredditTTL,
modNotesTTL = cacheTTLDefaults.modNotesTTL,
} = data;
return {
authorTTL: authorTTL === true ? 0 : authorTTL,
submissionTTL: submissionTTL === true ? 0 : submissionTTL,
commentTTL: commentTTL === true ? 0 : commentTTL,
subredditTTL: subredditTTL === true ? 0 : subredditTTL,
wikiTTL: wikiTTL === true ? 0 : wikiTTL,
filterCriteriaTTL: filterCriteriaTTL === true ? 0 : filterCriteriaTTL,
modNotesTTL: modNotesTTL === true ? 0 : modNotesTTL,
selfTTL: selfTTL === true ? 0 : selfTTL,
userNotesTTL: userNotesTTL === true ? 0 : userNotesTTL,
};
}
export const randomId = () => crypto.randomBytes(20).toString('hex');
export const intersect = (a: Array<any>, b: Array<any>) => {
@@ -2475,22 +2449,14 @@ export const mergeFilters = (objectConfig: RunnableBaseJson, filterDefs: FilterC
let derivedAuthorIs: AuthorOptions = buildFilter(authorIsDefault);
if (authorIsBehavior === 'merge') {
derivedAuthorIs = {
excludeCondition: authorIs.excludeCondition ?? derivedAuthorIs.excludeCondition,
include: addNonConflictingCriteria(derivedAuthorIs.include, authorIs.include),
exclude: addNonConflictingCriteria(derivedAuthorIs.exclude, authorIs.exclude),
}
derivedAuthorIs = merge.all([authorIs, authorIsDefault], {arrayMerge: removeFromSourceIfKeysExistsInDestination});
} else if (!filterIsEmpty(authorIs)) {
derivedAuthorIs = authorIs;
}
let derivedItemIs: ItemOptions = buildFilter(itemIsDefault);
if (itemIsBehavior === 'merge') {
derivedItemIs = {
excludeCondition: itemIs.excludeCondition ?? derivedItemIs.excludeCondition,
include: addNonConflictingCriteria(derivedItemIs.include, itemIs.include),
exclude: addNonConflictingCriteria(derivedItemIs.exclude, itemIs.exclude),
}
derivedItemIs = merge.all([itemIs, itemIsDefault], {arrayMerge: removeFromSourceIfKeysExistsInDestination});
} else if (!filterIsEmpty(itemIs)) {
derivedItemIs = itemIs;
}
@@ -2498,23 +2464,6 @@ export const mergeFilters = (objectConfig: RunnableBaseJson, filterDefs: FilterC
return [derivedAuthorIs, derivedItemIs];
}
export const addNonConflictingCriteria = <T>(defaultCriteria: NamedCriteria<T>[] = [], explicitCriteria: NamedCriteria<T>[] = []): NamedCriteria<T>[] => {
if(explicitCriteria.length === 0) {
return defaultCriteria;
}
const allExplicitKeys = Array.from(explicitCriteria.reduce((acc, curr) => {
Object.keys(curr.criteria).forEach(key => acc.add(key));
return acc;
}, new Set()));
const nonConflicting = defaultCriteria.filter(x => {
return intersect(Object.keys(x.criteria), allExplicitKeys).length === 0;
});
if(nonConflicting.length > 0) {
return explicitCriteria.concat(nonConflicting);
}
return explicitCriteria;
}
export const filterIsEmpty = (obj: FilterOptions<any>): boolean => {
return (obj.include === undefined || obj.include.length === 0) && (obj.exclude === undefined || obj.exclude.length === 0);
}
@@ -2826,9 +2775,9 @@ export const parseRedditFullname = (str: string): RedditThing | undefined => {
export const generateSnoowrapEntityFromRedditThing = (data: RedditThing, client: Snoowrap) => {
switch(data.type) {
case 'comment':
return new Comment({name: data.val, id: data.id}, client, false);
return new Comment({id: data.val}, client, false);
case 'submission':
return new Submission({name: data.val, id: data.id}, client, false);
return new Submission({id: data.val}, client, false);
case 'user':
return new RedditUser({id: data.val}, client, false);
case 'subreddit':
@@ -3050,18 +2999,3 @@ export const generateFullWikiUrl = (subreddit: Subreddit | string, location: str
const subName = subreddit instanceof Subreddit ? subreddit.url : `r/${subreddit}/`;
return `https://reddit.com${subName}wiki/${location}`
}
export const toStrongSharingACLConfig = (data: SharingACLConfig | string[]): StrongSharingACLConfig => {
if (Array.isArray(data)) {
return {
include: data.map(x => parseStringToRegexOrLiteralSearch(x))
}
} else if (data.include !== undefined) {
return {
include: data.include.map(x => parseStringToRegexOrLiteralSearch(x))
}
}
return {
exclude: (data.exclude ?? []).map(x => parseStringToRegexOrLiteralSearch(x))
}
}
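
The `addNonConflictingCriteria` helper on one side of the diff above keeps a default criteria entry only when none of its keys appear in any user-supplied entry. A tiny worked example, reimplemented in condensed form for illustration; criteria contents are hypothetical and the types are simplified.

```typescript
// Condensed reimplementation of the merge rule shown above, for illustration only.
type Named = { criteria: Record<string, unknown> };

const keepNonConflicting = (defaults: Named[], explicit: Named[]): Named[] => {
    const userKeys = new Set(explicit.flatMap(e => Object.keys(e.criteria)));
    const kept = defaults.filter(d => Object.keys(d.criteria).every(k => !userKeys.has(k)));
    return explicit.concat(kept);
};

const defaultExclude: Named[] = [{ criteria: { isMod: true } }];

console.log(keepNonConflicting(defaultExclude, [{ criteria: { age: '> 1 hour' } }]));
// -> [ { criteria: { age: '> 1 hour' } }, { criteria: { isMod: true } } ]   (no shared keys, default kept)

console.log(keepNonConflicting(defaultExclude, [{ criteria: { age: '> 1 hour', isMod: true } }]));
// -> [ { criteria: { age: '> 1 hour', isMod: true } } ]                     (isMod conflicts, default dropped)
```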

View File

@@ -16,7 +16,7 @@ import Snoowrap from "snoowrap";
import {getResource, getSnoowrap, getSubreddit, sampleActivity} from "./testFactory";
import {Subreddit as SubredditEntity} from "../src/Common/Entities/Subreddit";
import {Activity} from '../src/Common/Entities/Activity';
import {cmToSnoowrapActivityMap, cmToSnoowrapAuthorMap} from "../src/Common/Infrastructure/Filters/FilterCriteria";
import {cmToSnoowrapActivityMap} from "../src/Common/Infrastructure/Filters/FilterCriteria";
import {SnoowrapActivity} from "../src/Common/Infrastructure/Reddit";
dayjs.extend(dduration);
@@ -89,7 +89,10 @@ describe('Author Criteria', function () {
});
for(const prop of ['flairCssClass', 'flairTemplate', 'flairText']) {
let activityPropName = cmToSnoowrapAuthorMap[prop] ?? prop;
let activityPropName = cmToSnoowrapActivityMap[prop] ?? prop;
if(activityPropName === 'link_flair_template_id') {
activityPropName = 'author_flair_template_id';
}
it(`Should detect specific ${prop} as single string`, async function () {
assert.isTrue((await resource.isAuthor(testAuthor({}, 'submission',{
@@ -101,16 +104,6 @@ describe('Author Criteria', function () {
[activityPropName]: 'test',
}), {[prop]: ['foo','test']}, true)).passed);
});
it(`Should detect non-regex specific ${prop} as case-insensitive`, async function () {
assert.isTrue((await resource.isAuthor(testAuthor({}, 'submission',{
[activityPropName]: 'TeSt',
}), {[prop]: 'test'}, true)).passed);
});
it(`Should detect non-regex specific ${prop} is a subset of string`, async function () {
assert.isTrue((await resource.isAuthor(testAuthor({}, 'submission',{
[activityPropName]: 'This is a test phrase',
}), {[prop]: 'test'}, true)).passed);
});
it(`Should detect specific ${prop} is not in criteria`, async function () {
assert.isFalse((await resource.isAuthor(testAuthor({}, 'submission',{
[activityPropName]: 'test',
@@ -132,14 +125,14 @@ describe('Author Criteria', function () {
[activityPropName]: '',
}), {[prop]: 'foo'}, true)).passed);
});
it(`Should detect ${prop} as Regular Expression`, async function () {
assert.isTrue((await resource.isAuthor(testAuthor({}, 'submission',{
[activityPropName]: 'test',
}), {[prop]: '/te.*/'}, true)).passed);
assert.isTrue((await resource.isAuthor(testAuthor({}, 'submission',{
[activityPropName]: 'test',
}), {[prop]: ['foo','/te.*/']}, true)).passed);
});
/*it(`Should detect ${prop} as Regular Expression`, async function () {
assert.isTrue((await resource.isItem(new Submission({
[activityPropName]: 'test'
}, snoowrap, false), {[prop]: '/te.*!/'}, NoopLogger, true)).passed);
assert.isTrue((await resource.isItem(new Submission({
[activityPropName]: 'test'
}, snoowrap, false), {[prop]: ['foo', '/t.*!/']}, NoopLogger, true)).passed);
});*/
}
// TODO isMod

View File

@@ -1,228 +0,0 @@
import {describe, it} from 'mocha';
import {assert} from 'chai';
import {insertNameFilters} from "../src/ConfigBuilder";
import {
authorAgeDayCrit, authorAgeMonthCrit,
authorFlair1Crit,
fullAuthorAnonymousAll,
fullAuthorAnonymousExclude,
fullAuthorAnonymousInclude,
fullAuthorFullExclude,
fullAuthorFullInclude,
fullItemAnonymousAll,
fullItemAnonymousExclude,
fullItemAnonymousInclude,
fullItemFullAll,
fullItemFullExclude,
fullItemFullInclude,
itemApprovedCrit,
itemRemovedCrit,
maybeAnonymousFullAuthorFilter,
minimalAuthorFilter,
namedAuthorFilter,
namedAuthorFilters,
namedItemFilter,
namedItemFilters
} from "./testFactory";
import {buildFilter, mergeFilters} from "../src/util";
import {FilterOptions, FilterOptionsConfig, NamedCriteria} from "../src/Common/Infrastructure/Filters/FilterShapes";
import {AuthorCriteria} from "../src";
import {filterCriteriaDefault} from "../src/Common/defaults";
const namedFilters = insertNameFilters(namedAuthorFilters, namedItemFilters);
describe('Filter Building', function () {
describe('Convert string or plain objects/arrays to AT LEAST filters with anonymous criteria', function () {
describe('Author Filter', function () {
describe('Anonymous Filters', function () {
it('Accepts plain object', function () {
const filters = namedFilters({authorIs: authorAgeDayCrit()})
assert.deepEqual(filters.authorIs, [{criteria: authorAgeDayCrit()}]);
});
it('Accepts plain array of objects', function () {
const filters = namedFilters({authorIs: [authorAgeDayCrit(), authorFlair1Crit()]})
assert.deepEqual(filters.authorIs, [{criteria: authorAgeDayCrit()}, {criteria: authorFlair1Crit()}]);
});
it('Accepts full anonymous include config and returns full anonymous include config', function () {
const authFull = namedFilters({authorIs: fullAuthorAnonymousInclude()})
assert.deepEqual(authFull.authorIs, fullAuthorFullInclude());
});
it('Accepts full anonymous exclude config and returns full anonymous exclude config', function () {
const authFull = namedFilters({authorIs: fullAuthorAnonymousExclude()})
assert.deepEqual(authFull.authorIs, fullAuthorFullExclude());
});
it('Accepts full anonymous include-exclude config and returns full anonymous include with no exclude', function () {
const authFull = namedFilters({authorIs: fullAuthorAnonymousAll()})
assert.deepEqual(authFull.authorIs, fullAuthorFullInclude());
});
});
describe('Named Filters', function () {
it('Inserts named filter from plain array', function () {
const filters = namedFilters({authorIs: ['test1Author', authorFlair1Crit()]})
assert.deepEqual(filters.authorIs, [namedAuthorFilter(), {criteria: authorFlair1Crit()}]);
});
});
});
describe('Item Filter', function () {
describe('Anonymous Filters', function () {
it('Accepts plain object', function () {
const filters = namedFilters({itemIs: itemRemovedCrit()})
assert.deepEqual(filters.itemIs, [{criteria: itemRemovedCrit()}]);
});
it('Accepts plain array of objects', function () {
const filters = namedFilters({itemIs: [itemRemovedCrit(), itemApprovedCrit()]})
assert.deepEqual(filters.itemIs, [{criteria: itemRemovedCrit()}, {criteria: itemApprovedCrit()}]);
});
it('Accepts full anonymous include config and returns full anonymous include config', function () {
const fullFilter = namedFilters({itemIs: fullItemAnonymousInclude()})
assert.deepEqual(fullFilter.itemIs, fullItemFullInclude());
});
it('Accepts full anonymous exclude config and returns full anonymous exclude config', function () {
const fullFilter = namedFilters({itemIs: fullItemAnonymousExclude()})
assert.deepEqual(fullFilter.itemIs, fullItemFullExclude());
});
it('Accepts full anonymous include-exclude config and returns full anonymous include with no exclude', function () {
const fullFilter = namedFilters({itemIs: fullItemAnonymousAll()})
assert.deepEqual(fullFilter.itemIs, fullItemFullAll());
});
});
describe('Named Filters', function () {
it('Inserts named filter from plain array', function () {
const filters = namedFilters({itemIs: ['test1Item', itemApprovedCrit()]})
assert.deepEqual(filters.itemIs, [namedItemFilter(), {criteria: itemApprovedCrit()}]);
});
});
});
describe('Full Filter', function () {
it('Accepts and returns full filter', function () {
const filters = namedFilters({
itemIs: ['test1Item', itemApprovedCrit()],
authorIs: ['test1Author', authorFlair1Crit()]
})
assert.deepEqual(filters.itemIs, [namedItemFilter(), {criteria: itemApprovedCrit()}]);
assert.deepEqual(filters.authorIs, [namedAuthorFilter(), {criteria: authorFlair1Crit()}]);
});
});
});
describe('Convert hydrated/anonymous criteria to full FilterOptions', function () {
describe('Author Filter', function () {
it('Converts minimal (array) filter into include', function () {
const opts = buildFilter(minimalAuthorFilter());
assert.deepEqual(opts, {
include: (minimalAuthorFilter() as NamedCriteria<AuthorCriteria>[]).map((x) => ({
...x,
name: undefined
})),
excludeCondition: 'OR',
exclude: []
});
});
it('Converts anonymous full filter into FilterOptions', function () {
const opts = buildFilter(maybeAnonymousFullAuthorFilter());
assert.deepEqual(opts, {
include: [
{
criteria: authorAgeDayCrit()
},
{
criteria: authorAgeMonthCrit()
}
].map((x) => ({...x, name: undefined})),
excludeCondition: undefined,
exclude: [
{
criteria: authorAgeDayCrit()
},
{
criteria: authorAgeMonthCrit()
}
].map((x) => ({...x, name: undefined}))
})
});
});
});
describe('Filter merging', function () {
it('Merges (adds) when user-defined filter and defaults filter are present', function () {
const [author] = mergeFilters({
authorIs: {
exclude: [
{
criteria: {age: '> 1 hour'}
}]
}
}, filterCriteriaDefault);
assert.deepEqual(author.exclude, [
{
criteria: {age: '> 1 hour'},
name: undefined
}, {
criteria: {isMod: true},
name: undefined
}]);
});
it('Does not merge when user-defined filter and defaults filter are present with conflicting properties', function () {
const [author] = mergeFilters({
authorIs: {
exclude: [{
criteria: {
age: '> 1 hour',
isMod: true
}
}]
}
}, filterCriteriaDefault);
assert.deepEqual(author.exclude, [{criteria: {age: '> 1 hour', isMod: true}, name: undefined}]);
});
it('User-defined filter replaces defaults filter when replace behavior is set', function () {
const [author] = mergeFilters({
authorIs: {
exclude: [
{
criteria: {age: '> 1 hour'}
}
]
}
}, {
authorIsBehavior: 'replace',
authorIs: {
exclude: [
{
criteria: {name: ['test']}
}]
}
});
assert.deepEqual(author.exclude, [{criteria: {age: '> 1 hour'}, name: undefined}]);
});
it('Ignores mods by default', function () {
const [author] = mergeFilters({}, filterCriteriaDefault);
assert.deepEqual(author.exclude, [
{
criteria: {isMod: true},
name: undefined
}]);
});
});
});
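
The assertions above pin down the normalization contract these tests exercise through `namedFilters`, `buildFilter`, and `mergeFilters`: bare criteria objects are wrapped as `{criteria: ...}`, string entries are resolved against the named-filter maps, and when both `include` and `exclude` are supplied the `exclude` list is dropped. The sketch below restates that contract in plain TypeScript, inferred from the test expectations only; the helper and type names are illustrative stand-ins, not the project's actual implementation.

```typescript
// Illustrative sketch only: inferred from the test expectations above, not the
// project's actual implementation. Names and types are simplified stand-ins.
type Criteria = Record<string, unknown>;

interface NamedCriteriaLike {
    name?: string
    criteria: Criteria
}

type AnonymousOrNamed = Criteria | string;

interface MinimalFilterShape {
    include?: AnonymousOrNamed[]
    exclude?: AnonymousOrNamed[]
}

interface HydratedFilterShape {
    include?: NamedCriteriaLike[]
    exclude?: NamedCriteriaLike[]
}

// Wrap a bare criteria object, or resolve a string against the named-filter map.
// Lookup is case-insensitive, which the fixtures imply (maps use lowercased keys
// while the tests reference 'test1Author' / 'test1Item').
const hydrate = (val: AnonymousOrNamed, named: Map<string, NamedCriteriaLike>): NamedCriteriaLike => {
    if (typeof val === 'string') {
        const found = named.get(val.toLowerCase());
        if (found === undefined) {
            throw new Error(`No named filter defined for '${val}'`);
        }
        return found;
    }
    return {criteria: val};
};

const normalizeFilter = (
    val: Criteria | AnonymousOrNamed[] | MinimalFilterShape,
    named: Map<string, NamedCriteriaLike> = new Map()
): NamedCriteriaLike[] | HydratedFilterShape => {
    // plain array of criteria objects and/or named-filter references
    if (Array.isArray(val)) {
        return val.map(v => hydrate(v, named));
    }
    const shape = val as MinimalFilterShape;
    if (shape.include !== undefined || shape.exclude !== undefined) {
        // include wins: when both lists are present, exclude is dropped, matching
        // the "include-exclude config ... with exclude dropped" cases above
        if (shape.include !== undefined) {
            return {include: shape.include.map(v => hydrate(v, named))};
        }
        return {exclude: (shape.exclude ?? []).map(v => hydrate(v, named))};
    }
    // single bare criteria object
    return [{criteria: val as Criteria}];
};

// e.g. normalizeFilter(['test1Author', {flairText: 'flair 1'}],
//   new Map([['test1author', {name: 'test1Author', criteria: {age: '> 1 day'}}]]))
// yields the same shape asserted in 'Inserts named filter from plain array' above
```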

View File

@@ -243,16 +243,6 @@ describe('Item Criteria', function () {
[activityPropName]: 'test',
}, snoowrap, false), {[prop]: ['foo','test']}, NoopLogger, true)).passed);
});
it(`Should detect specific ${prop}, case-insensitive`, async function () {
assert.isTrue((await resource.isItem(new Submission({
[activityPropName]: 'test',
}, snoowrap, false), {[prop]: 'TeSt'}, NoopLogger, true)).passed);
});
it(`Should detect specific ${prop}, when not a regex, is subset of string`, async function () {
assert.isTrue((await resource.isItem(new Submission({
[activityPropName]: 'this is a test phrase',
}, snoowrap, false), {[prop]: 'test'}, NoopLogger, true)).passed);
});
it(`Should detect specific ${prop} is not in criteria`, async function () {
assert.isFalse((await resource.isItem(new Submission({
[activityPropName]: 'test',

View File

@@ -1,7 +1,7 @@
import {OperatorConfig, OperatorJsonConfig} from "../src/Common/interfaces";
import Snoowrap from "snoowrap";
import Bot from "../src/Bot/index"
import {buildOperatorConfigWithDefaults, insertNameFilters} from "../src/ConfigBuilder";
import {buildOperatorConfigWithDefaults} from "../src/ConfigBuilder";
import {App} from "../src/App";
import {YamlOperatorConfigDocument} from "../src/Common/Config/Operator";
import {NoopLogger} from "../src/Utils/loggerFactory";
@@ -10,13 +10,6 @@ import {Bot as BotEntity} from "../src/Common/Entities/Bot";
import {SubredditResources} from "../src/Subreddit/SubredditResources";
import {Subreddit, Comment, Submission} from 'snoowrap/dist/objects';
import dayjs from 'dayjs';
import {
FilterOptions, MaybeAnonymousCriteria,
MinimalOrFullFilter,
MinimalOrFullFilterJson, MinimalOrFullMaybeAnonymousFilter, NamedCriteria
} from "../src/Common/Infrastructure/Filters/FilterShapes";
import {AuthorCriteria} from "../src";
import {TypedActivityState} from "../src/Common/Infrastructure/Filters/FilterCriteria";
const mockSnoowrap = new Snoowrap({userAgent: 'test', accessToken: 'test'});
@@ -87,10 +80,10 @@ export const getBot = async () => {
await bot.cacheManager.set('test', {
logger: NoopLogger,
caching: {
authorTTL: false,
submissionTTL: false,
commentTTL: false,
provider: 'memory'
authorTTL: false,
submissionTTL: false,
commentTTL: false,
provider: 'memory'
},
subreddit: bot.client.getSubreddit('test'),
client: bot.client,
@@ -152,12 +145,12 @@ export const sampleActivity = {
}, snoowrap, true)
},
commentRemoved: (snoowrap = mockSnoowrap) => {
return new Comment({
can_mod_post: true,
banned_at_utc: dayjs().subtract(10, 'minutes').unix(),
removed: true,
replies: ''
}, snoowrap, true);
return new Comment({
can_mod_post: true,
banned_at_utc: dayjs().subtract(10, 'minutes').unix(),
removed: true,
replies: ''
}, snoowrap, true);
},
submissionDeleted: (snoowrap = mockSnoowrap) => {
return new Submission({
@@ -207,153 +200,3 @@ export const sampleActivity = {
}
}
}
export const authorAgeDayCrit = (): AuthorCriteria => ({
age: '> 1 day'
});
export const authorAgeMonthCrit = (): AuthorCriteria => ({
age: '> 1 month'
});
export const authorFlair1Crit = (): AuthorCriteria => ({
flairText: 'flair 1'
});
export const authorFlair2Crit = (): AuthorCriteria => ({
flairText: 'flair 2'
});
export const fullAuthorFullInclude = (): FilterOptions<AuthorCriteria> => ({
include: [
{
criteria: authorAgeDayCrit()
},
{
criteria: authorFlair1Crit()
}
]
})
export const fullAuthorFullExclude = (): FilterOptions<AuthorCriteria> => ({
exclude: [
{
criteria: authorAgeMonthCrit()
},
{
criteria: authorFlair2Crit()
}
]
})
export const fullAuthorFullAll = (): FilterOptions<AuthorCriteria> => ({
include: fullAuthorFullInclude().include,
})
export const fullAuthorAnonymousInclude = (): MinimalOrFullFilterJson<AuthorCriteria> => ({
include: [
authorAgeDayCrit(),
authorFlair1Crit()
]
})
export const fullAuthorAnonymousExclude = (): MinimalOrFullFilterJson<AuthorCriteria> => ({
exclude: [
authorAgeMonthCrit(),
authorFlair2Crit()
]
});
export const fullAuthorAnonymousAll = (): MinimalOrFullFilterJson<AuthorCriteria> => ({
include: (fullAuthorAnonymousInclude() as FilterOptions<AuthorCriteria>).include,
exclude: (fullAuthorAnonymousExclude() as FilterOptions<AuthorCriteria>).exclude,
})
export const namedAuthorFilter = (): NamedCriteria<AuthorCriteria> => ({
name: 'test1Author',
criteria: authorAgeDayCrit()
});
export const itemRemovedCrit = (): TypedActivityState => ({
removed: false
});
export const itemApprovedCrit = (): TypedActivityState => ({
approved: true
});
export const itemFlairCrit = (): TypedActivityState => ({
link_flair_text: ['test1','test2']
});
export const fullItemFullInclude = (): FilterOptions<TypedActivityState> => ({
include: [
{
criteria: itemRemovedCrit()
},
{
criteria: itemApprovedCrit()
}
]
})
export const fullItemFullExclude = (): FilterOptions<TypedActivityState> => ({
exclude: [
{
criteria: itemRemovedCrit()
},
{
criteria: itemFlairCrit()
}
]
})
export const fullItemFullAll = (): FilterOptions<TypedActivityState> => ({
include: fullItemFullInclude().include,
})
export const fullItemAnonymousInclude = (): MinimalOrFullFilterJson<TypedActivityState> => ({
include: [
itemRemovedCrit(),
itemApprovedCrit()
]
})
export const fullItemAnonymousExclude = (): MinimalOrFullFilterJson<TypedActivityState> => ({
exclude: [
itemRemovedCrit(),
itemFlairCrit()
]
});
export const fullItemAnonymousAll = (): MinimalOrFullFilterJson<TypedActivityState> => ({
include: (fullItemAnonymousInclude() as FilterOptions<TypedActivityState>).include,
exclude: (fullItemAnonymousExclude() as FilterOptions<TypedActivityState>).exclude,
})
export const namedItemFilter = (): NamedCriteria<TypedActivityState> => ({
name: 'test1Item',
criteria: itemRemovedCrit()
});
export const namedAuthorFilters = new Map([['test1author', namedAuthorFilter()]]);
export const namedItemFilters = new Map([['test1item', namedItemFilter()]]);
export const minimalAuthorFilter = (): MinimalOrFullMaybeAnonymousFilter<AuthorCriteria> => ([
{
criteria: authorAgeDayCrit()
},
{
criteria: authorAgeMonthCrit()
}
]);
export const maybeAnonymousFullAuthorFilter = (): MinimalOrFullMaybeAnonymousFilter<AuthorCriteria> => ({
include: [
{
criteria: authorAgeDayCrit()
},
{
criteria: authorAgeMonthCrit()
}
],
exclude: [
{
criteria: authorAgeDayCrit()
},
{
criteria: authorAgeMonthCrit()
}
]
});
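
For reference, the hydrated fixtures above differ from their anonymous counterparts only in that each bare criteria object is wrapped as `{criteria: ...}`. A hypothetical one-line helper, not part of this changeset, makes that relationship explicit:

```typescript
// Hypothetical helper, not part of the project: wraps bare criteria objects the
// same way the hydrated fixtures above do.
const wrap = <T>(criteria: T[]): { criteria: T }[] => criteria.map((c) => ({criteria: c}));

// e.g. wrap([authorAgeDayCrit(), authorFlair1Crit()]) deep-equals fullAuthorFullInclude().include,
// and wrap([itemRemovedCrit(), itemFlairCrit()]) deep-equals fullItemFullExclude().exclude
```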