Mirror of https://github.com/FoxxMD/context-mod.git (synced 2026-01-14 07:57:57 -05:00)

Compare commits (57 commits)
| SHA1 |
|---|
| 3292d011fa |
| d7cab4092d |
| 0370e592f9 |
| 116d06733a |
| 22a8a694a7 |
| 2ed24eee11 |
| 8822d8520a |
| 9832292a5b |
| 7a86c722fa |
| 2ca4043c02 |
| 4da8a0b353 |
| 492ff78b13 |
| 64a0b0890d |
| 546daddd49 |
| f91d81029f |
| 68ee1718e0 |
| c0d19ede39 |
| bb05d64428 |
| 1977c7317f |
| 6f784d5aa2 |
| 4b5c9b82e4 |
| 0315ad23ae |
| da70753f42 |
| 661a0ae440 |
| d59f1b63d1 |
| 7542947029 |
| 2d02434e7e |
| e2824ea94c |
| 1c94548947 |
| 2073e3f650 |
| 90b8f481ec |
| 9ad9092e9e |
| 12adfe9975 |
| 83dceddae8 |
| 99b46cb97f |
| 3ac07cb3e2 |
| d7f08d4e27 |
| 338f393969 |
| 57e930ca8a |
| af3b917b57 |
| d01bcc53fe |
| e2fe2b4745 |
| 785099b20c |
| 726ceb03d2 |
| 1c37771591 |
| 67aeaea5f1 |
| a8ac4b8497 |
| 71571d3672 |
| 2799b6caeb |
| e8f94ad1be |
| 4411d1a413 |
| c919532aac |
| 522ba33377 |
| 3a18cc219f |
| 554d7dd86e |
| 29c3924ab7 |
| 5551f2c63f |
**.gitignore** (vendored, 2 lines changed)

```
@@ -383,7 +383,7 @@ dist
**/src/**/*.js
**/tests/**/*.js
**/tests/**/*.map
!src/Web/assets/public/yaml/*
!src/Web/assets/**
**/src/**/*.map
/**/*.sqlite
/**/*.bak
```
```
@@ -111,7 +111,9 @@ COPY --from=build --chown=abc:abc /app /app

RUN npm install --production \
&& npm cache clean --force \
&& chown abc:abc node_modules
&& chown abc:abc node_modules \
&& rm -rf node_modules/ts-node \
&& rm -rf node_modules/typescript

ENV NPM_CONFIG_LOGLEVEL debug
```
````
@@ -39,10 +39,13 @@ This list is not exhaustive. [For complete documentation on a subreddit's config
* [Message](#message)
* [Remove](#remove)
* [Report](#report)
* [UserNote](#usernote)
* [Toolbox UserNote](#usernote)
* [Mod Note](#mod-note)
* [Filters](#filters)
* [Filter Types](#filter-types)
* [Author Filter](#author-filter)
* [Mod Notes/Actions](#mod-actionsnotes-filter)
* [Toolbox UserNotes](#toolbox-usernotes-filter)
* [Item Filter](#item-filter)
* [Subreddit Filter](#subreddit-filter)
* [Named Filters](#named-filters)

@@ -651,6 +654,28 @@ actions:
allowDuplicate: boolean # if false then the usernote will not be added if the same note appears for this activity
```

### Mod Note

Add a [Mod Note](https://www.reddit.com/r/modnews/comments/t8vafc/announcing_mod_notes/) for the Author of the Activity.

* `type` must be one of the [valid note labels](https://www.reddit.com/dev/api#POST_api_mod_notes):
* BOT_BAN
* PERMA_BAN
* BAN
* ABUSE_WARNING
* SPAM_WARNING
* SPAM_WATCH
* SOLID_CONTRIBUTOR
* HELPFUL_USER

```yaml
actions:
- kind: modnote
type: SPAM_WATCH
content: 'a note only mods can see message' # optional
referenceActivity: boolean # if true the Note will be linked to the Activity being processed
```

# Filters

**Filters** are an additional channel for determining if an Event should be processed by ContextMod. They differ from **Rules** in several key ways:

@@ -732,6 +757,14 @@ There are two types of Filter. Both types have the same "shape" in the configura
Test the Author of an Activity. See [Schema documentation](https://json-schema.app/view/%23%2Fdefinitions%2FAuthorCriteria?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Freddit-context-bot%2Fedge%2Fsrc%2FSchema%2FApp.json) for all possible Author Criteria

#### Mod Actions/Notes Filter

See [Mod Actions/Notes](/docs/subreddit/components/modActions/README.md#mod-action-filter) documentation.

#### Toolbox UserNotes Filter

See [UserNotes](/docs/subreddit/components/userNotes/README.md) documentation

### Item Filter

Test for properties of an Activity:
````
**docs/subreddit/components/modActions/README.md** (new file, 152 lines)

# Table of Contents

* [Overview](#overview)
* [Mod Note Action](#mod-note-action)
* [Mod Action Filter](#mod-action-filter)
* [API Usage](#api-usage)
* [When To Use?](#when-to-use)
* [Examples](#examples)

# Overview

[Mod Notes](https://www.reddit.com/r/modnews/comments/t8vafc/announcing_mod_notes/) is a feature for New Reddit that allows moderators to add short, categorizable notes to Users of their subreddit, optionally associating the note with a submission/comment the User made. They are inspired by [Toolbox User Notes](https://www.reddit.com/r/toolbox/wiki/docs/usernotes), which are also [supported by ContextMod](/docs/subreddit/components/userNotes). Reddit's **Mod Notes** also combine [Moderation Log](https://mods.reddithelp.com/hc/en-us/articles/360022402312-Moderation-Log) actions (**Mod Actions**) for the selected User alongside moderator notes, enabling a full "overview" of moderator interactions with a User in their subreddit.

ContextMod supports adding **Mod Notes** to an Author using an [Action](/docs/subreddit/components/README.md#mod-note) and using **Mod Actions/Mod Notes** as criteria in an [Author Filter](/docs/subreddit/components/README.md#author-filter).

# Mod Note Action

[**Schema Reference**](https://json-schema.app/view/%23%2Fdefinitions%2FModNoteActionJson?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Freddit-context-bot%2Fedge%2Fsrc%2FSchema%2FApp.json)

* `type` must be one of the [valid note labels](https://www.reddit.com/dev/api#POST_api_mod_notes):
  * BOT_BAN
  * PERMA_BAN
  * BAN
  * ABUSE_WARNING
  * SPAM_WARNING
  * SPAM_WATCH
  * SOLID_CONTRIBUTOR
  * HELPFUL_USER

```yaml
actions:
  - kind: modnote
    type: SPAM_WATCH
    content: 'a note only mods can see message' # optional
    referenceActivity: boolean # if true the Note will be linked to the Activity being processed
```

# Mod Action Filter

ContextMod can use **Mod Actions** (from the moderation log) and **Mod Notes** in an [Author Filter](/docs/subreddit/components/README.md#author-filter).

## API Usage

Notes/Actions are **not** included in the data Reddit returns for either an Author or an Activity. This means that, in most cases, ContextMod is required to make **one additional API call to Reddit during Activity processing** if Notes/Actions are used as part of an **Author Filter**.

The impact of this additional call is greatest when the Author Filter is used as part of a **Comment Check** or runs for **every Activity**, such as part of a Run. Take this example:

No Mod Action filtering

* CM makes 1 API call to return new comments, finding 10 new comments across 6 users
* Processing each comment, with no other filters, requires 0 additional calls
* At the end of processing 10 comments, CM has used a total of 1 API call.

Mod Action Filtering Used

* CM makes 1 API call to return new comments, finding 10 new comments across 6 users
* Processing each comment, with a mod action filter, requires 1 additional API call per user
* At the end of processing 10 comments, CM has used a total of **7 API calls**

### When To Use?

In general, **do not** use Mod Actions in a Filter if:

* The filter is on a [**Comment** Check](/docs/subreddit/components/README.md#checks) and your subreddit has a high volume of Comments
* The filter is on a [Run](/docs/subreddit/components/README.md#runs) and your subreddit has a high volume of Activities

If you need Mod Notes-like functionality for a high-volume subreddit, consider using [Toolbox UserNotes](/docs/subreddit/components/userNotes) instead.

In general, **do** use Mod Actions in a Filter if:

* The filter is on a [**Submission** Check](/docs/subreddit/components/README.md#checks)
* The filter is part of an [Author **Rule**](/docs/subreddit/components/README.md#author) that is processed as **late as possible in the rule order for a Check**
* Your subreddit has a low volume of Activities (less than 100 combined submissions/comments in a 10 minute period, for example)
* The filter is on an Action

## Usage and Examples

Filtering by Mod Actions/Notes in an Author Filter is done using the `modActions` property:

```yaml
age: '> 1 month'
# ...
modActions:
  - ...
```

There are two valid shapes for the Mod Action criteria: [ModLogCriteria](https://json-schema.app/view/%23%2Fdefinitions%2FModLogCriteria?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Freddit-context-bot%2Fedge%2Fsrc%2FSchema%2FApp.json) and [ModNoteCriteria](https://json-schema.app/view/%23%2Fdefinitions%2FModNoteCriteria?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Freddit-context-bot%2Fedge%2Fsrc%2FSchema%2FApp.json).

### ModLogCriteria

Used for filtering by **Moderation Log** actions *and/or general notes*.

* `activityType` -- Optional. If the Mod Action is associated with an activity, specify it here. A list or one of:
  * `submission`
  * `comment`
* `type` -- Optional. The type of Mod Log Action. A list or one of:
  * `INVITE`
  * `NOTE`
  * `REMOVAL`
  * `SPAM`
  * `APPROVAL`
* `description` -- additional mod log description (string) to filter by -- not documented by reddit. Can be a string or regex string-like `/.* test/i`
* `details` -- additional mod log details (string) to filter by -- not documented by reddit. Can be a string or regex string-like `/.* test/i`

```yaml
activityType: submission
type:
  - REMOVAL
  - SPAM
search: total
count: '> 3 in 1 week'
```

### ModNoteCriteria

Inherits `activityType` from ModLogCriteria. If either of the below properties is included on the criteria then any other ModLogCriteria-specific properties are **ignored**.

* `note` -- the contents of the note to match against. Can be one of, or a list of, strings/regex string-like `/.* test/i`
* `noteType` -- the type of the note, if specified (see [Mod Note Action](#mod-note-action) type). Can be one of, or a list of, strings/regex string-like `/.* test/i`

```yaml
noteType: SOLID_CONTRIBUTOR
search: total
count: '> 3 in 1 week'
```

### Examples

Author has more than 2 submission approvals in the last month

```yaml
type: APPROVAL
activityType: submission
search: total
count: '> 2 in 1 month'
```

Author has at least 1 BAN note

```yaml
noteType: BAN
search: total
count: '>= 1'
```

Author has at least 3 notes which include the words "self" and "promotion" in the last month

```yaml
note: '/self.*promo/i'
activityType: submission
search: total
count: '>= 3 in 1 month'
```
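The criteria snippets above are meant to live inside an Author Filter. As a rough sketch (the check name, rule placeholder, and report text are made up, and the exact `authorIs` nesting should be confirmed against the Schema references linked above), a Submission Check using a Mod Note criteria might look like:

```yaml
checks:
  - name: watchRepeatSpammers   # placeholder name
    kind: submission
    authorIs:
      include:
        - modActions:
            - noteType: SPAM_WATCH
              search: total
              count: '>= 2 in 1 month'
    rules:
      # ... your rules here ...
    actions:
      - kind: report
        content: 'Author has 2+ SPAM_WATCH notes in the last month'
```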
**package-lock.json** (generated, 4813 lines changed): diff suppressed because it is too large.
**package.json** (25 lines changed)

```
@@ -5,7 +5,8 @@
"main": "index.js",
"scripts": {
"test": "nyc ./node_modules/.bin/_mocha 'tests/**/*.test.ts'",
"build": "tsc",
"build": "tsc && npm run bundle-front",
"bundle-front": "browserify src/Web/assets/browser.js | terser --compress --mangle > src/Web/assets/public/browserBundle.js",
"start": "node src/index.js run",
"schema": "npm run -s schema-app & npm run -s schema-ruleset & npm run -s schema-rule & npm run -s schema-action & npm run -s schema-config",
"schema-app": "typescript-json-schema tsconfig.json JSONConfig --out src/Schema/App.json --required --tsNodeRegister --refs --validationKeywords deprecationMessage",

@@ -29,6 +30,13 @@
"dependencies": {
"@awaitjs/express": "^0.8.0",
"@googleapis/youtube": "^2.0.0",
"@nlpjs/core": "^4.23.4",
"@nlpjs/lang-de": "^4.23.4",
"@nlpjs/lang-en": "^4.23.4",
"@nlpjs/lang-es": "^4.23.4",
"@nlpjs/lang-fr": "^4.23.4",
"@nlpjs/language": "^4.22.7",
"@nlpjs/nlp": "^4.23.5",
"@stdlib/regexp-regexp": "^0.0.6",
"ajv": "^7.2.4",
"ansi-regex": ">=5.0.1",

@@ -43,7 +51,6 @@
"cookie-parser": "^1.3.5",
"dayjs": "^1.10.5",
"deepmerge": "^4.2.2",
"delimiter-stream": "^3.0.1",
"ejs": "^3.1.6",
"env-cmd": "^10.1.0",
"es6-error": "^4.1.1",

@@ -52,16 +59,15 @@
"express-session-cache-manager": "^1.0.2",
"express-socket.io-session": "^1.3.5",
"fast-deep-equal": "^3.1.3",
"fuse.js": "^6.4.6",
"globrex": "^0.1.2",
"got": "^11.8.2",
"he": "^1.2.0",
"http-proxy": "^1.18.1",
"image-size": "^1.0.0",
"json5": "^2.2.0",
"jsonwebtoken": "^8.5.1",
"leven": "^3.1.0",
"lodash": "^4.17.21",
"logform": "^2.4.0",
"lru-cache": "^6.0.0",
"migrate": "github:johsunds/node-migrate#49b0054de0a9295857aa8b8eea9a3cdeb2643913",
"mustache": "^4.2.0",

@@ -77,9 +83,7 @@
"patch-package": "^6.4.7",
"pixelmatch": "^5.2.1",
"pony-cause": "^1.1.1",
"pretty-print-json": "^1.0.3",
"reflect-metadata": "^0.1.13",
"safe-stable-stringify": "^1.1.1",
"snoostorm": "^1.5.2",
"snoowrap": "^1.23.0",
"socket.io": "^4.1.3",

@@ -89,8 +93,9 @@
"triple-beam": "^1.3.0",
"typeorm": "^0.3.4",
"typeorm-logger-adaptor": "^1.1.0",
"typescript": "^4.3.4",
"vader-sentiment": "^1.1.3",
"webhook-discord": "^3.7.7",
"wink-sentiment": "^5.0.2",
"winston": "github:FoxxMD/winston#fbab8de969ecee578981c77846156c7f43b5f01e",
"winston-daily-rotate-file": "^4.5.5",
"winston-duplex": "^0.1.1",

@@ -106,6 +111,7 @@
"@types/cache-manager": "^3.4.2",
"@types/cache-manager-redis-store": "^2.0.0",
"@types/chai": "^4.3.0",
"@types/chai-as-promised": "^7.1.5",
"@types/cookie-parser": "^1.4.2",
"@types/express": "^4.17.13",
"@types/express-session": "^1.17.4",

@@ -130,16 +136,19 @@
"@types/string-similarity": "^4.0.0",
"@types/tcp-port-used": "^1.0.0",
"@types/triple-beam": "^1.3.2",
"browserify": "^17.0.0",
"chai": "^4.3.6",
"chai-as-promised": "^7.1.1",
"mocha": "^9.2.1",
"nyc": "^15.1.0",
"source-map-support": "^0.5.21",
"terser": "^5.13.1",
"ts-essentials": "^9.1.2",
"ts-json-schema-generator": "^0.93.0",
"ts-mockito": "^2.6.1",
"ts-node": "^10.7.0",
"tsconfig-paths": "^3.13.0",
"typescript": "^4.3.4",
"typescript": "^4.6.4",
"typescript-json-schema": "~0.53"
},
"optionalDependencies": {
```
```
@@ -17,6 +17,7 @@ import {DispatchAction, DispatchActionJson} from "./DispatchAction";
import {CancelDispatchAction, CancelDispatchActionJson} from "./CancelDispatchAction";
import ContributorAction, {ContributorActionJson} from "./ContributorAction";
import {StructuredFilter} from "../Common/Infrastructure/Filters/FilterShapes";
import {ModNoteAction, ModNoteActionJson} from "./ModNoteAction";

export function actionFactory
(config: StructuredActionJson, logger: Logger, subredditName: string, resources: SubredditResources, client: ExtendedSnoowrap, emitter: EventEmitter): Action {

@@ -47,6 +48,8 @@ export function actionFactory
return new CancelDispatchAction({...config as StructuredFilter<CancelDispatchActionJson>, logger, subredditName, resources, client, emitter})
case 'contributor':
return new ContributorAction({...config as StructuredFilter<ContributorActionJson>, logger, subredditName, resources, client, emitter})
case 'modnote':
return new ModNoteAction({...config as StructuredFilter<ModNoteActionJson>, logger, subredditName, resources, client, emitter})
default:
throw new Error('rule "kind" was not recognized.');
}
```
```
@@ -7,6 +7,7 @@ import Comment from "snoowrap/dist/objects/Comment";
import {RuleResultEntity} from "../Common/Entities/RuleResultEntity";
import {runCheckOptions} from "../Subreddit/Manager";
import {ActionTarget, ActionTypes} from "../Common/Infrastructure/Atomic";
import {asComment, asSubmission} from "../util";

export class ApproveAction extends Action {

@@ -29,22 +30,24 @@ export class ApproveAction extends Action {
const dryRun = this.getRuntimeAwareDryrun(options);
const touchedEntities = [];

const realTargets = item instanceof Submission ? ['self'] : this.targets;
const realTargets = asSubmission(item) ? ['self'] : this.targets;

let msg: string[] = [];

for(const target of realTargets) {
let targetItem = item;
if(target !== 'self' && item instanceof Comment) {
if(target !== 'self' && asComment(item)) {
targetItem = await this.resources.getActivity(this.client.getSubmission(item.link_id));
}

// @ts-ignore
if (targetItem.approved) {
const msg = `${target === 'self' ? 'Item' : 'Comment\'s parent Submission'} is already approved`;
msg.push(`${target === 'self' ? 'Item' : 'Comment\'s parent Submission'} is already approved??`);
this.logger.warn(msg);
return {
dryRun,
success: false,
result: msg
result: msg.join('|')
}
}

@@ -53,6 +56,9 @@
if(target !== 'self' && !(targetItem instanceof Submission)) {
// @ts-ignore
targetItem = await this.client.getSubmission((item as Comment).link_id).fetch();
msg.push(`Approving parent Submission ${targetItem.name}`);
} else {
msg.push(`Approving self ${targetItem.name}`);
}
// @ts-ignore
touchedEntities.push(await targetItem.approve());

@@ -70,6 +76,7 @@
}

return {
result: msg.join(' | '),
dryRun,
success: true,
touchedEntities
```
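The `asSubmission`/`asComment` helpers that replace the `instanceof` checks above live in `src/util` and are not shown in this diff. Purely as a hypothetical illustration of why a duck-typed guard is preferable here (snoowrap objects are often lazily-fetched proxies rather than plain class instances), such a guard could look like the sketch below. The real implementations may differ.

```ts
import Submission from "snoowrap/dist/objects/Submission";
import Comment from "snoowrap/dist/objects/Comment";

// Hypothetical sketch only. Reddit fullnames prefix the id with the thing type:
// t1_ = comment, t3_ = submission (link), so a prefix check works even on proxy objects.
export const asSubmission = (val: any): val is Submission =>
    val !== null && typeof val === 'object' && typeof val.name === 'string' && val.name.startsWith('t3_');

export const asComment = (val: any): val is Comment =>
    val !== null && typeof val === 'object' && typeof val.name === 'string' && val.name.startsWith('t1_');
```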
```
@@ -71,12 +71,7 @@ export class CancelDispatchAction extends Action {
} else {
matchedDispatchIdentifier = this.identifiers.filter(x => x !== null).includes(x.identifier);
}
const matched = matchedId && matchedDispatchIdentifier;
if(matched && x.processing) {
this.logger.debug(`Cannot remove ${isSubmission(x.activity) ? 'Submission' : 'Comment'} ${x.activity.name} because it is currently processing`);
return false;
}
return matched;
return matchedId && matchedDispatchIdentifier;
});
let cancelCrit;
if (this.identifiers === undefined) {
```
**src/Action/ModNoteAction.ts** (new file, 108 lines)

```ts
import {ActionJson, ActionConfig, ActionOptions} from "./index";
import Action from "./index";
import {Comment} from "snoowrap";
import {renderContent} from "../Utils/SnoowrapUtils";
import Submission from "snoowrap/dist/objects/Submission";
import {ActionProcessResult, RichContent} from "../Common/interfaces";
import {toModNoteLabel} from "../util";
import {RuleResultEntity} from "../Common/Entities/RuleResultEntity";
import {runCheckOptions} from "../Subreddit/Manager";
import {ActionTypes, ModUserNoteLabel} from "../Common/Infrastructure/Atomic";
import {ModNote} from "../Subreddit/ModNotes/ModNote";


export class ModNoteAction extends Action {
    content: string;
    type?: string;
    allowDuplicate: boolean;
    referenceActivity: boolean

    constructor(options: ModNoteActionOptions) {
        super(options);
        const {type, content = '', allowDuplicate = false, referenceActivity = true} = options;
        this.type = type;
        this.content = content;
        this.allowDuplicate = allowDuplicate;
        this.referenceActivity = referenceActivity;
    }

    getKind(): ActionTypes {
        return 'modnote';
    }

    protected getSpecificPremise(): object {
        return {
            content: this.content,
            type: this.type,
            allowDuplicate: this.allowDuplicate,
            referenceActivity: this.referenceActivity,
        }
    }

    async process(item: Comment | Submission, ruleResults: RuleResultEntity[], options: runCheckOptions): Promise<ActionProcessResult> {
        const dryRun = this.getRuntimeAwareDryrun(options);

        const modLabel = this.type !== undefined ? toModNoteLabel(this.type) : undefined;

        const content = await this.resources.getContent(this.content, item.subreddit);
        const renderedContent = await renderContent(content, item, ruleResults, this.resources.userNotes);
        this.logger.verbose(`Note:\r\n(${this.type}) ${renderedContent}`);

        // TODO see what changes are made for bulk fetch of notes before implementing this
        // https://www.reddit.com/r/redditdev/comments/t8w861/new_mod_notes_api/
        // if (!this.allowDuplicate) {
        //     const notes = await this.resources.userNotes.getUserNotes(item.author);
        //     let existingNote = notes.find((x) => x.link !== null && x.link.includes(item.id));
        //     if(existingNote === undefined && notes.length > 0) {
        //         const lastNote = notes[notes.length - 1];
        //         // possibly notes don't have a reference link so check if last one has same text
        //         if(lastNote.link === null && lastNote.text === renderedContent) {
        //             existingNote = lastNote;
        //         }
        //     }
        //     if (existingNote !== undefined && existingNote.noteType === this.type) {
        //         this.logger.info(`Will not add note because one already exists for this Activity (${existingNote.time.local().format()}) and allowDuplicate=false`);
        //         return {
        //             dryRun,
        //             success: false,
        //             result: `Will not add note because one already exists for this Activity (${existingNote.time.local().format()}) and allowDuplicate=false`
        //         };
        //     }
        // }
        if (!dryRun) {
            await this.resources.addModNote({
                label: modLabel,
                note: renderedContent,
                activity: this.referenceActivity ? item : undefined,
                subreddit: this.resources.subreddit,
                user: item.author
            });
        }
        return {
            success: true,
            dryRun,
            result: `${modLabel !== undefined ? `(${modLabel})` : ''} ${renderedContent}`
        }
    }
}

export interface ModNoteActionConfig extends ActionConfig, RichContent {
    /**
     * Add Note even if a Note already exists for this Activity
     * @examples [false]
     * @default false
     * */
    allowDuplicate?: boolean,
    type?: ModUserNoteLabel
    referenceActivity?: boolean
}

export interface ModNoteActionOptions extends Omit<ModNoteActionConfig, 'authorIs' | 'itemIs'>, ActionOptions {
}

/**
 * Add a Toolbox User Note to the Author of this Activity
 * */
export interface ModNoteActionJson extends ModNoteActionConfig, ActionJson {
    kind: 'modnote'
}
```
```
@@ -434,7 +434,7 @@ export abstract class Check extends RunnableBase implements Omit<ICheck, 'postTr
checkSum.postBehavior = this.postFail.behavior;
}

behaviorT = checkSum.triggered ? 'Trigger' : 'Fail';
behaviorT = checkResult.triggered ? 'Trigger' : 'Fail';

switch (checkSum.postBehavior.toLowerCase()) {
case 'next':
```
```
@@ -1,6 +1,6 @@
import YamlConfigDocument from "../YamlConfigDocument";
import JsonConfigDocument from "../JsonConfigDocument";
import {YAMLMap, YAMLSeq} from "yaml";
import {YAMLMap, YAMLSeq, Pair, Scalar} from "yaml";
import {BotInstanceJsonConfig, OperatorJsonConfig} from "../../interfaces";
import {assign} from 'comment-json';

@@ -15,10 +15,12 @@ export class YamlOperatorConfigDocument extends YamlConfigDocument implements Op
if (bots === undefined) {
this.parsed.add({key: 'bots', value: [botData]});
} else if (botData.name !== undefined) {
// overwrite if we find an existing
// granularly overwrite (merge) if we find an existing
const existingIndex = bots.items.findIndex(x => (x as YAMLMap).get('name') === botData.name);
if (existingIndex !== -1) {
this.parsed.setIn(['bots', existingIndex], botData);
const botObj = this.parsed.getIn(['bots', existingIndex]) as YAMLMap;
const mergedVal = mergeObjectToYaml(botData, botObj);
this.parsed.setIn(['bots', existingIndex], mergedVal);
} else {
this.parsed.addIn(['bots'], botData);
}

@@ -32,6 +34,24 @@ export class YamlOperatorConfigDocument extends YamlConfigDocument implements Op
}
}

export const mergeObjectToYaml = (source: object, target: YAMLMap) => {
    for (const [k, v] of Object.entries(source)) {
        if (target.has(k)) {
            const targetProp = target.get(k);
            if (targetProp instanceof YAMLMap && typeof v === 'object') {
                const merged = mergeObjectToYaml(v, targetProp);
                target.set(k, merged)
            } else {
                // since target prop and value are not both objects don't bother merging, just overwrite (primitive or array)
                target.set(k, v);
            }
        } else {
            target.add({key: k, value: v});
        }
    }
    return target;
}

export class JsonOperatorConfigDocument extends JsonConfigDocument implements OperatorConfigDocumentInterface {
addBot(botData: BotInstanceJsonConfig) {
if (this.parsed.bots === undefined) {
```
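A minimal sketch of how the `mergeObjectToYaml` helper above behaves when new bot config is merged into an already-parsed YAML document. The document text and keys are made up for illustration, and the helper is assumed to be importable from this module:

```ts
import { parseDocument, YAMLMap } from "yaml";
// import { mergeObjectToYaml } from "./YamlOperatorConfigDocument"; // assumed import path

const doc = parseDocument(`
bots:
  - name: MyBot # comments attached to untouched nodes are preserved
    polling: 30
`);

// Grab the YAMLMap node for the existing bot entry, mirroring how addBot() looks it up
const existingBot = doc.getIn(['bots', 0]) as YAMLMap;

// Existing scalar keys are overwritten, unknown keys are added,
// and nested maps are merged recursively instead of being replaced wholesale.
mergeObjectToYaml({ polling: 60, credentials: { clientId: 'abc' } }, existingBot);

// doc.toString() now serializes MyBot with polling: 60 plus a new credentials map,
// while keeping the original comment on the untouched name entry.
```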
```
@@ -79,6 +79,7 @@ export class ActionPremise extends TimeAwareRandomBaseEntity {
this.active = data.active ?? true;
this.configHash = objectHash.sha1(data.config);
this.manager = data.manager;
this.managerId = data.manager.id;
this.name = data.name;

const {
```
```
@@ -152,6 +152,9 @@ export class DispatchedEntity extends TimeAwareRandomBaseEntity {

async toActivityDispatch(client: ExtendedSnoowrap): Promise<ActivityDispatch> {
const redditThing = parseRedditFullname(this.activityId);
if(redditThing === undefined) {
throw new Error(`Could not parse reddit ID from value '${this.activityId}'`);
}
let activity: Comment | Submission;
if (redditThing?.type === 'comment') {
// @ts-ignore

@@ -161,12 +164,12 @@
activity = await client.getSubmission(redditThing.id);
}
activity.author = new RedditUser({name: this.author}, client, false);
activity.id = redditThing.id;
return {
id: this.id,
queuedAt: this.createdAt,
activity,
delay: this.delay,
processing: false,
action: this.action,
goto: this.goto,
onExistingFound: this.onExistingFound,
```
```
@@ -83,6 +83,7 @@ export class RulePremise extends TimeAwareRandomBaseEntity {
this.active = data.active ?? true;
this.configHash = objectHash.sha1(data.config);
this.manager = data.manager;
this.managerId = data.manager.id;
this.name = data.name;

const {
```
```
@@ -2,7 +2,6 @@ import fetch from "node-fetch";
import {Submission} from "snoowrap/dist/objects";
import {URL} from "url";
import {absPercentDifference, getSharpAsync, isValidImageURL} from "../util";
import sizeOf from "image-size";
import {Sharp} from "sharp";
import {blockhash} from "./blockhash/blockhash";
import {SimpleError} from "../Utils/Errors";
```
```
@@ -185,4 +185,67 @@ export type ActionTypes =
| 'userflair'
| 'dispatch'
| 'cancelDispatch'
| 'contributor';
| 'contributor'
| 'modnote';

/**
 * Test the calculated VADER sentiment (compound) score for an Activity using this comparison. Can be either a numerical or natural language
 *
 * Sentiment values range from extremely negative to extremely positive in a numerical range of -1 to +1:
 *
 * * -0.6 => extremely negative
 * * -0.3 => very negative
 * * -0.1 => negative
 * * 0 => neutral
 * * 0.1 => positive
 * * 0.3 => very positive
 * * 0.6 => extremely positive
 *
 * The below examples are all equivalent. You can use either set of values as the value for `sentiment` (numerical comparisons or natural language)
 *
 * * `>= 0.1` = `is positive`
 * * `<= -0.3` = `is very negative`
 * * `< 0.1` = `is not positive`
 * * `> -0.3` = `is not very negative`
 *
 * Special case:
 *
 * * `is neutral` equates to `> -0.1 and < 0.1`
 * * `is not neutral` equates to `< -0.1 or > 0.1`
 *
 * ContextMod uses a normalized, weighted average from these sentiment tools:
 *
 * * NLP.js (english, french, german, and spanish) https://github.com/axa-group/nlp.js/blob/master/docs/v3/sentiment-analysis.md
 * * (english only) vaderSentiment-js https://github.com/vaderSentiment/vaderSentiment-js/
 * * (english only) wink-sentiment https://github.com/winkjs/wink-sentiment
 *
 * More about the sentiment algorithms used:
 * * VADER https://github.com/cjhutto/vaderSentiment
 * * AFINN http://corpustext.com/reference/sentiment_afinn.html
 * * Senticon https://ieeexplore.ieee.org/document/8721408
 * * Pattern https://github.com/clips/pattern
 * * wink https://github.com/winkjs/wink-sentiment
 *
 * @pattern ((>|>=|<|<=)\s*(-?\d?\.?\d+))|((not)?\s*(very|extremely)?\s*(positive|neutral|negative))
 * @examples ["is negative", "> 0.2"]
 * */
export type VaderSentimentComparison = string;

export type ModUserNoteLabel =
    'BOT_BAN'
    | 'PERMA_BAN'
    | 'BAN'
    | 'ABUSE_WARNING'
    | 'SPAM_WARNING'
    | 'SPAM_WATCH'
    | 'SOLID_CONTRIBUTOR'
    | 'HELPFUL_USER';

export const modUserNoteLabels = ['BOT_BAN', 'PERMA_BAN', 'BAN', 'ABUSE_WARNING', 'SPAM_WARNING', 'SPAM_WATCH', 'SOLID_CONTRIBUTOR', 'HELPFUL_USER'];

export type ModActionType =
    'INVITE' |
    'NOTE' |
    'REMOVAL' |
    'SPAM' |
    'APPROVAL';
```
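As a small illustration, the exported `modUserNoteLabels` array above can back a runtime check for the `ModUserNoteLabel` union. This guard is only a sketch; the project's own `toModNoteLabel` helper in `src/util` is not shown in this diff and may behave differently:

```ts
import { ModUserNoteLabel, modUserNoteLabels } from "./Infrastructure/Atomic"; // assumed relative path

const isModUserNoteLabel = (val: string): val is ModUserNoteLabel =>
    modUserNoteLabels.includes(val);

isModUserNoteLabel('SPAM_WATCH');               // true
isModUserNoteLabel('WARNING');                  // false, not one of the eight valid labels
isModUserNoteLabel('spam_watch'.toUpperCase()); // config values can be normalized to upper case first
```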
```
@@ -1,15 +1,141 @@
import {StringOperator} from "./Atomic";
import {Duration} from "dayjs/plugin/duration";
import InvalidRegexError from "../../Utils/InvalidRegexError";
import dayjs, {Dayjs, OpUnitType} from "dayjs";
import {SimpleError} from "../../Utils/Errors";
import { parseDuration } from "../../util";

export interface DurationComparison {
    operator: StringOperator,
    duration: Duration
}

export interface GenericComparison {
export interface GenericComparison extends HasDisplayText {
    operator: StringOperator,
    value: number,
    isPercent: boolean,
    extra?: string,
    displayText: string,
    duration?: Duration
}

export interface HasDisplayText {
    displayText: string
}

export interface RangedComparison extends HasDisplayText {
    range: [number, number]
    not: boolean
}

export const asGenericComparison = (val: any): val is GenericComparison => {
    return typeof val === 'object' && 'value' in val;
}

export const GENERIC_VALUE_COMPARISON = /^\s*(?<opStr>>|>=|<|<=)\s*(?<value>-?\d?\.?\d+)(?<extra>\s+.*)*$/
export const GENERIC_VALUE_COMPARISON_URL = 'https://regexr.com/60dq4';
export const parseGenericValueComparison = (val: string, options?: {
    requireDuration?: boolean,
    reg?: RegExp
}): GenericComparison => {

    const {
        requireDuration = false,
        reg = GENERIC_VALUE_COMPARISON,
    } = options || {};

    const matches = val.match(reg);

    if (matches === null) {
        throw new InvalidRegexError(reg, val)
    }

    const groups = matches.groups as any;

    let duration: Duration | undefined;

    if(typeof groups.extra === 'string' && groups.extra.trim() !== '') {
        try {
            duration = parseDuration(groups.extra, false);
        } catch (e) {
            // if it returns an invalid regex it just means they didn't include a duration
            if (requireDuration || !(e instanceof InvalidRegexError)) {
                throw e;
            }
        }
    } else if(requireDuration) {
        throw new SimpleError(`Comparison must contain a duration value but none was found. Given: ${val}`);
    }

    const displayParts = [`${groups.opStr} ${groups.value}`];
    const hasPercent = typeof groups.percent === 'string' && groups.percent.trim() !== '';
    if(hasPercent) {
        displayParts.push('%');
    }

    return {
        operator: groups.opStr as StringOperator,
        value: Number.parseFloat(groups.value),
        isPercent: hasPercent,
        extra: groups.extra,
        displayText: displayParts.join(''),
        duration
    }
}
const GENERIC_VALUE_PERCENT_COMPARISON = /^\s*(?<opStr>>|>=|<|<=)\s*(?<value>\d+)\s*(?<percent>%)?(?<extra>.*)$/
const GENERIC_VALUE_PERCENT_COMPARISON_URL = 'https://regexr.com/60a16';
export const parseGenericValueOrPercentComparison = (val: string, options?: {requireDuration: boolean}): GenericComparison => {
    return parseGenericValueComparison(val, {...(options ?? {}), reg: GENERIC_VALUE_PERCENT_COMPARISON});
}
/**
 * Named groups: operator, time, unit
 * */
const DURATION_COMPARISON_REGEX: RegExp = /^\s*(?<opStr>>|>=|<|<=)\s*(?<time>\d+)\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\s*$/;
const DURATION_COMPARISON_REGEX_URL = 'https://regexr.com/609n8';
export const parseDurationComparison = (val: string): DurationComparison => {
    const matches = val.match(DURATION_COMPARISON_REGEX);
    if (matches === null) {
        throw new InvalidRegexError(DURATION_COMPARISON_REGEX, val, DURATION_COMPARISON_REGEX_URL)
    }
    const groups = matches.groups as any;
    const dur: Duration = dayjs.duration(groups.time, groups.unit);
    if (!dayjs.isDuration(dur)) {
        throw new SimpleError(`Parsed value '${val}' did not result in a valid Dayjs Duration`);
    }
    return {
        operator: groups.opStr as StringOperator,
        duration: dur
    }
}
export const dateComparisonTextOp = (val1: Dayjs, strOp: StringOperator, val2: Dayjs, granularity?: OpUnitType): boolean => {
    switch (strOp) {
        case '>':
            return val1.isBefore(val2, granularity);
        case '>=':
            return val1.isSameOrBefore(val2, granularity);
        case '<':
            return val1.isAfter(val2, granularity);
        case '<=':
            return val1.isSameOrAfter(val2, granularity);
        default:
            throw new Error(`${strOp} was not a recognized operator`);
    }
}
export const compareDurationValue = (comp: DurationComparison, date: Dayjs) => {
    const dateToCompare = dayjs().subtract(comp.duration.asSeconds(), 'seconds');
    return dateComparisonTextOp(date, comp.operator, dateToCompare);
}
export const comparisonTextOp = (val1: number, strOp: string, val2: number): boolean => {
    switch (strOp) {
        case '>':
            return val1 > val2;
        case '>=':
            return val1 >= val2;
        case '<':
            return val1 < val2;
        case '<=':
            return val1 <= val2;
        default:
            throw new Error(`${strOp} was not a recognized operator`);
    }
}
```
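A quick usage sketch of the parsing helpers above, using the same `'> 3 in 1 week'` style of value that appears in the Mod Actions documentation. The import path and the `matchingNotes` variable are illustrative:

```ts
import dayjs from "dayjs";
import {
    parseGenericValueOrPercentComparison,
    parseDurationComparison,
    comparisonTextOp,
    compareDurationValue
} from "./Infrastructure/Comparisons"; // assumed relative path; dayjs duration plugins are assumed registered app-wide

// '> 3 in 1 week' => operator '>', value 3, isPercent false, displayText '> 3',
// with the trailing ' in 1 week' handed to parseDuration for the optional duration
const cmp = parseGenericValueOrPercentComparison('> 3 in 1 week');

const matchingNotes = 5; // hypothetical number of notes found within the window
const passes = comparisonTextOp(matchingNotes, cmp.operator, cmp.value); // 5 > 3 => true

// Duration comparisons (e.g. account age tests) go through the dedicated helpers:
const ageCmp = parseDurationComparison('> 1 month');
const created = dayjs().subtract(2, 'month');
const oldEnough = compareDurationValue(ageCmp, created); // true: created more than 1 month ago
```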
```
@@ -1,4 +1,14 @@
import {CompareValue, CompareValueOrPercent, DurationComparor, ModeratorNameCriteria, ModeratorNames} from "../Atomic";
import {
    CompareValue,
    CompareValueOrPercent,
    DurationComparor,
    ModeratorNameCriteria,
    ModeratorNames, ModActionType,
    ModUserNoteLabel
} from "../Atomic";
import {ActivityType} from "../Reddit";
import {GenericComparison, parseGenericValueComparison} from "../Comparisons";
import {parseStringToRegexOrLiteralSearch} from "../../../util";

/**
 * Different attributes a `Subreddit` can be in. Only include a property if you want to check it.

@@ -55,42 +65,40 @@ export const defaultStrongSubredditCriteriaOptions = {

export type FilterCriteriaDefaultBehavior = 'replace' | 'merge';

export interface UserNoteCriteria {
    /**
     * User Note type key to search for
     * @examples ["spamwarn"]
     * */
    type: string;
export interface UserSubredditHistoryCriteria {
    /**
     * Number of occurrences of this type. Ignored if `search` is `current`
     *
     * A string containing a comparison operator and/or a value to compare number of occurrences against
     *
     * The syntax is `(< OR > OR <= OR >=) <number>[percent sign] [ascending|descending]`
     * The syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`
     *
     * If `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`
     *
     * @examples [">= 1"]
     * @default ">= 1"
     * @pattern ^\s*(?<opStr>>|>=|<|<=)\s*(?<value>\d+)\s*(?<percent>%?)\s*(?<extra>asc.*|desc.*)*$
     * @pattern ^\s*(?<opStr>>|>=|<|<=)\s*(?<value>\d+)\s*(?<percent>%?)\s*(?<duration>in\s+\d+\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\s*(?<extra>asc.*|desc.*)*$
     * */
    count?: string;

    /**
     * How to test the notes for this Author:
     * How to test the Toolbox Notes or Mod Actions for this Author:
     *
     * ### current
     *
     * Only the most recent note is checked for `type`
     * Only the most recent note is checked for criteria
     *
     * ### total
     *
     * The `count` comparison of `type` must be found within all notes
     * `count` comparison of mod actions/notes must be found within all history
     *
     * * EX `count: > 3` => Must have more than 3 notes of `type`, total
     * * EX `count: <= 25%` => Must have 25% or less of notes of `type`, total
     * * EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week
     *
     * ### consecutive
     *
     * The `count` **number** of `type` notes must be found in a row.
     * The `count` **number** of mod actions/notes must be found in a row.
     *
     * You may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`
     *

@@ -104,7 +112,126 @@ export interface UserNoteCriteria {
    search?: 'current' | 'consecutive' | 'total'
}

export const authorCriteriaProperties = ['name', 'flairCssClass', 'flairText', 'flairTemplate', 'isMod', 'userNotes', 'age', 'linkKarma', 'commentKarma', 'totalKarma', 'verified', 'shadowBanned', 'description', 'isContributor'];
export interface UserNoteCriteria extends UserSubredditHistoryCriteria {
    /**
     * User Note type key to search for
     * @examples ["spamwarn"]
     * */
    type: string;
}

export interface ModActionCriteria extends UserSubredditHistoryCriteria {
    type?: ModActionType | ModActionType[]
    activityType?: ActivityType | ActivityType[]
}

export interface FullModActionCriteria extends Omit<ModActionCriteria, 'count'> {
    type?: ModActionType[]
    count?: GenericComparison
    activityType?: ActivityType[]
}

export interface ModNoteCriteria extends ModActionCriteria {
    noteType?: ModUserNoteLabel | ModUserNoteLabel[]
    note?: string | string[]
}

export interface FullModNoteCriteria extends FullModActionCriteria, Omit<ModNoteCriteria, 'note' | 'count' | 'type' | 'activityType'> {
    noteType?: ModUserNoteLabel[]
    note?: RegExp[]
}

const arrayableModNoteProps = ['activityType','noteType','note'];

export const asModNoteCriteria = (val: any): val is ModNoteCriteria => {
    return val !== null && typeof val === 'object' && ('noteType' in val || 'note' in val);
}

export const toFullModNoteCriteria = (val: ModNoteCriteria): FullModNoteCriteria => {

    const result = Object.entries(val).reduce((acc: FullModNoteCriteria, curr) => {
        const [k,v] = curr;

        if(v === undefined) {
            return acc;
        }

        const rawVal = arrayableModNoteProps.includes(k) && !Array.isArray(v) ? [v] : v;

        switch(k) {
            case 'search':
                acc.search = rawVal;
                break;
            case 'count':
                acc.count = parseGenericValueComparison(rawVal);
                break;
            case 'activityType':
            case 'noteType':
                acc[k] = rawVal;
                break;
            case 'note':
                acc[k] = rawVal.map((x: string) => parseStringToRegexOrLiteralSearch(x))
        }

        return acc;
    }, {});

    result.type = ['NOTE'];
    return result;
}


export interface ModLogCriteria extends ModActionCriteria {
    action?: string | string[]
    details?: string | string[]
    description?: string | string[]
}

export interface FullModLogCriteria extends FullModActionCriteria, Omit<ModLogCriteria, 'action' | 'details' | 'description' | 'count' | 'type' | 'activityType'> {
    action?: RegExp[]
    details?: RegExp[]
    description?: RegExp[]
}

const arrayableModLogProps = ['type','activityType','action','description','details', 'type'];

export const asModLogCriteria = (val: any): val is ModLogCriteria => {
    return val !== null && typeof val === 'object' && !asModNoteCriteria(val) && ('action' in val || 'details' in val || 'description' in val || 'activityType' in val || 'search' in val || 'count' in val || 'type' in val);
}

export const toFullModLogCriteria = (val: ModLogCriteria): FullModLogCriteria => {

    return Object.entries(val).reduce((acc: FullModLogCriteria, curr) => {
        const [k,v] = curr;

        if(v === undefined) {
            return acc;
        }

        const rawVal = arrayableModLogProps.includes(k) && !Array.isArray(v) ? [v] : v;

        switch(k) {
            case 'search':
                acc.search = rawVal;
                break;
            case 'count':
                acc.count = parseGenericValueComparison(rawVal);
                break;
            case 'activityType':
            case 'type':
                acc[k as keyof FullModLogCriteria] = rawVal;
                break;
            case 'action':
            case 'description':
            case 'details':
                acc[k as keyof FullModLogCriteria] = rawVal.map((x: string) => parseStringToRegexOrLiteralSearch(x))
        }

        return acc;
    }, {});
}

export const authorCriteriaProperties = ['name', 'flairCssClass', 'flairText', 'flairTemplate', 'isMod', 'userNotes', 'modActions', 'age', 'linkKarma', 'commentKarma', 'totalKarma', 'verified', 'shadowBanned', 'description', 'isContributor'];

/**
 * Criteria with which to test against the author of an Activity. The outcome of the test is based on:

@@ -159,6 +286,8 @@ export interface AuthorCriteria {
     * */
    userNotes?: UserNoteCriteria[]

    modActions?: (ModNoteCriteria | ModLogCriteria)[]

    /**
     * Test the age of the Author's account (when it was created) against this comparison
     *

@@ -228,7 +357,35 @@ export interface AuthorCriteria {
     * Is the author an approved user (contributor)?
     * */
    isContributor?: boolean
} // properties calculated/derived by CM -- not provided as plain values by reddit
}

/**
 * When testing AuthorCriteria test properties in order of likelihood to require an API call to complete
 * */
export const orderedAuthorCriteriaProps: (keyof AuthorCriteria)[] = [
    'name', // never needs an api call, returned/cached with activity info
    // none of these normally need api calls unless activity is a skeleton generated by CM (not normal)
    // all are part of cached activity data
    'flairCssClass',
    'flairText',
    'flairTemplate',
    // usernotes are cached longer than author by default (5 min vs 60 seconds)
    'userNotes',
    // requires fetching/getting cached author.
    // If fetching and user is shadowbanned none of the individual author data below will be retrievable either so always do this first
    'shadowBanned',
    // individual props require fetching/getting cached
    'age',
    'linkKarma',
    'commentKarma',
    'totalKarma',
    'verified',
    'description',
    'isMod', // requires fetching mods for subreddit
    'isContributor', // requires fetching contributors for subreddit
    'modActions', // requires fetching mod notes/actions for author (shortest cache TTL)
];

export interface ActivityState {
    /**
     * * true/false => test whether Activity is removed or not
```
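To tie the two criteria shapes back to config, here is a sketch of how a raw `modActions` entry might be discriminated and normalized with the guards defined above. The config object and import path are illustrative:

```ts
import {
    asModNoteCriteria,
    asModLogCriteria,
    toFullModNoteCriteria,
    toFullModLogCriteria
} from "./Infrastructure/Filters/FilterCriteria"; // assumed relative path

// As it might appear under authorIs -> modActions in a subreddit config
const raw: any = { noteType: 'SPAM_WATCH', count: '> 3 in 1 week', search: 'total' };

if (asModNoteCriteria(raw)) {
    // noteType/note present => Mod Note criteria; type is forced to ['NOTE']
    const full = toFullModNoteCriteria(raw);
    // full.noteType => ['SPAM_WATCH'], full.count => parsed GenericComparison, full.type => ['NOTE']
} else if (asModLogCriteria(raw)) {
    // otherwise action/details/description/type/etc. => Mod Log criteria with regex-parsed string fields
    const full = toFullModLogCriteria(raw);
}
```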
**src/Common/LangaugeProcessing.ts** (new file, 493 lines)

```ts
import {containerBootstrap} from '@nlpjs/core';
import {Language, LanguageGuess, LanguageType} from '@nlpjs/language';
import {Nlp} from '@nlpjs/nlp';
import {SentimentIntensityAnalyzer} from 'vader-sentiment';
import wink from 'wink-sentiment';
import {SnoowrapActivity} from "./Infrastructure/Reddit";
import {
    asGenericComparison, comparisonTextOp,
    GenericComparison,
    parseGenericValueComparison,
    RangedComparison
} from "./Infrastructure/Comparisons";
import {asSubmission, between, formatNumber} from "../util";
import {CMError, MaybeSeriousErrorWithCause} from "../Utils/Errors";
import InvalidRegexError from "../Utils/InvalidRegexError";
import {StringOperator} from "./Infrastructure/Atomic";
import {LangEs} from "@nlpjs/lang-es";
import {LangDe} from "@nlpjs/lang-de";
import {LangEn} from "@nlpjs/lang-en";
import {LangFr} from "@nlpjs/lang-fr";

export type SentimentAnalysisType = 'vader' | 'afinn' | 'senticon' | 'pattern' | 'wink';

export const sentimentQuantifier = {
    'extremely negative': -0.6,
    'very negative': -0.3,
    'negative': -0.1,
    'neutral': 0,
    'positive': 0.1,
    'very positive': 0.3,
    'extremely positive': 0.6,
}

export const sentimentQuantifierRanges = [
    {
        range: [Number.MIN_SAFE_INTEGER, -0.6],
        quant: 'extremely negative'
    },
    {
        range: [-0.6, -0.3],
        quant: 'very negative'
    },
    {
        range: [-0.3, -0.1],
        quant: 'negative'
    },
    {
        range: [-0.1, 0.1],
        quant: 'neutral'
    },
    {
        range: [0.1, 0.3],
        quant: 'positive'
    },
    {
        range: [0.3, 0.6],
        quant: 'very positive'
    },
    {
        range: [0.6, Number.MAX_SAFE_INTEGER],
        quant: 'extremely positive'
    }
]

const scoreToSentimentText = (val: number) => {
    for (const segment of sentimentQuantifierRanges) {
        if (between(val, segment.range[0], segment.range[1], false, true)) {
            return segment.quant;
        }
    }
    throw new Error('should not hit this!');
}

export interface SentimentResult {
    comparative: number
    type: SentimentAnalysisType
    sentiment: string
    weight: number
    tokens: number
    matchedTokens?: number,
    usableResult: true | string
}

export interface StringSentiment {
    results: SentimentResult[]
    score: number
    scoreWeighted: number
    sentiment: string
    sentimentWeighted: string
    guessedLanguage: LanguageGuessResult
    usedLanguage: LanguageType
    usableScore: boolean
    reason?: string
}

export interface ActivitySentiment extends StringSentiment {
    activity: SnoowrapActivity
}

export interface StringSentimentTestResult extends StringSentiment {
    passes: boolean
    test: GenericComparison | RangedComparison
}

export interface ActivitySentimentTestResult extends StringSentimentTestResult {
    activity: SnoowrapActivity
}

export interface ActivitySentimentOptions {
    testOn?: ('title' | 'body')[]
    /**
     * Make the analyzer assume a language if it cannot determine one itself.
     *
     * This is very useful for the analyzer when it is parsing short pieces of content. For example, if you know your subreddit is majority english speakers this will make the analyzer return "neutral" sentiment instead of "not detected language".
     *
     * Defaults to 'en'
     *
     * @example ["en"]
     * @default en
     * */
    defaultLanguage?: string | null | false

    /**
     * Helps the analyzer coerce a low confidence language guess into a known/used language in two ways:
     *
     * If the analyzer's
     * * *best* guess is NOT one of these
     * * but it did guess one of these
     * * and its guess is above requiredLanguageConfidence score then use the hinted language instead of best guess
     * * OR text content is very short (4 words or less)
     * * and the best guess was below the requiredLanguageConfidence score
     * * and none of guesses was a hinted language then use the defaultLanguage
     *
     * Defaults to popular romance languages: ['en', 'es', 'de', 'fr']
     *
     * @example [["en", "es", "de", "fr"]]
     * @default ["en", "es", "de", "fr"]
     * */
    languageHints?: string[]

    /**
     * Required confidence to use a guessed language as the best guess. Score from 0 to 1.
     *
     * Defaults to 0.9
     *
     * @example [0.9]
     * @default 0.9
     * */
    requiredLanguageConfidence?: number
}

export type SentimentCriteriaTest = GenericComparison | RangedComparison;

export const availableSentimentLanguages = ['en', 'es', 'de', 'fr'];

export const textComparison = /(?<not>not)?\s*(?<modifier>very|extremely)?\s*(?<sentiment>positive|neutral|negative)/i;

export const parseTextToNumberComparison = (val: string): RangedComparison | GenericComparison => {

    let genericError: Error | undefined;
    try {
        return parseGenericValueComparison(val);
    } catch (e) {
        genericError = e as Error;
        // now try text match
    }

    const matches = val.match(textComparison);
    if (matches === null) {
        const textError = new InvalidRegexError(textComparison, val);
        throw new CMError(`Sentiment value did not match a valid numeric comparison or valid text: \n ${genericError.message} \n ${textError.message}`);
    }
    const groups = matches.groups as any;

    const negate = groups.not !== undefined && groups.not !== '';

    if (groups.sentiment === 'neutral') {
        if (negate) {
            return {
                displayText: 'not neutral (not -0.1 to 0.1)',
                range: [-0.1, 0.1],
                not: true,
            }
        }
        return {
            displayText: 'is neutral (-0.1 to 0.1)',
            range: [-0.1, 0.1],
            not: false
        }
    }

    const compoundSentimentText = `${groups.modifier !== undefined && groups.modifier !== '' ? `${groups.modifier} ` : ''}${groups.sentiment}`.toLocaleLowerCase();
    // @ts-ignore
    const numericVal = sentimentQuantifier[compoundSentimentText] as number;
    if (numericVal === undefined) {
        throw new CMError(`Sentiment given did not match any known phrases: '${compoundSentimentText}'`);
    }

    let operator: StringOperator;
    if (negate) {
        operator = numericVal > 0 ? '<' : '>';
    } else {
        operator = numericVal > 0 ? '>=' : '<=';
    }

    return {
        operator,
        value: numericVal,
        isPercent: false,
        displayText: `is${negate ? ' not ' : ' '}${compoundSentimentText} (${operator} ${numericVal})`
    }
}
```
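A few concrete inputs and the outputs they produce with `parseTextToNumberComparison` as defined above (the values follow directly from the `sentimentQuantifier` table; the import path is assumed):

```ts
import { parseTextToNumberComparison } from "./LangaugeProcessing"; // assumed relative path

parseTextToNumberComparison('>= 0.1');
// numeric form, handled by parseGenericValueComparison => { operator: '>=', value: 0.1, ... }

parseTextToNumberComparison('is very positive');
// => { operator: '>=', value: 0.3, isPercent: false, displayText: 'is very positive (>= 0.3)' }

parseTextToNumberComparison('not very negative');
// => { operator: '>', value: -0.3, isPercent: false, displayText: 'is not very negative (> -0.3)' }

parseTextToNumberComparison('is neutral');
// => { range: [-0.1, 0.1], not: false, displayText: 'is neutral (-0.1 to 0.1)' }
```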
let nlp: Nlp;
|
||||
let container: any;
|
||||
|
||||
const bootstrapNlp = async () => {
|
||||
|
||||
container = await containerBootstrap();
|
||||
container.use(Language);
|
||||
container.use(Nlp);
|
||||
container.use(LangEs);
|
||||
container.use(LangDe);
|
||||
container.use(LangEn);
|
||||
container.use(LangFr);
|
||||
nlp = container.get('nlp');
|
||||
nlp.settings.autoSave = false;
|
||||
nlp.addLanguage('en');
|
||||
nlp.addLanguage('es');
|
||||
nlp.addLanguage('de');
|
||||
nlp.addLanguage('fr');
|
||||
nlp.nluManager.guesser.processExtraSentences();
|
||||
await nlp.train();
|
||||
}
|
||||
|
||||
export const getNlp = async () => {
|
||||
if (nlp === undefined) {
|
||||
await bootstrapNlp();
|
||||
}
|
||||
|
||||
return nlp;
|
||||
}
|
||||
|
||||
export const getActivityContent = (item: SnoowrapActivity, options?: ActivitySentimentOptions): string => {
|
||||
const {
|
||||
testOn = ['body', 'title'],
|
||||
} = options || {};
|
||||
|
||||
// determine what content we are testing
|
||||
let contents: string[] = [];
|
||||
if (asSubmission(item)) {
|
||||
for (const l of testOn) {
|
||||
switch (l) {
|
||||
case 'title':
|
||||
contents.push(item.title);
|
||||
break;
|
||||
case 'body':
|
||||
if (item.is_self) {
|
||||
contents.push(item.selftext);
|
||||
}
|
||||
break;
|
||||
}
|
||||
}
|
||||
} else {
|
||||
contents.push(item.body)
|
||||
}
|
||||
|
||||
return contents.join(' ');
|
||||
}
|
||||
export const getLanguageTypeFromValue = async (val: string): Promise<LanguageType> => {

    if (nlp === undefined) {
        await bootstrapNlp();
    }

    const langObj = container.get('Language') as Language;

    const cleanVal = val.trim().toLocaleLowerCase();

    const foundLang = Object.values(langObj.languagesAlpha2).find(x => x.alpha2 === cleanVal || x.alpha3 === cleanVal || x.name.toLocaleLowerCase() === cleanVal);
    if (foundLang === undefined) {
        throw new MaybeSeriousErrorWithCause(`Could not find Language with identifier '${val}'`, {isSerious: false});
    }
    const {alpha2, alpha3, name: language} = foundLang;
    return {
        alpha2,
        alpha3,
        language
    };
}

export interface LanguageGuessResult {
    bestGuess: LanguageGuess
    guesses: LanguageGuess[]
    requiredConfidence: number
    sparse: boolean
    language: LanguageType
    usedDefault: boolean
}

export const getContentLanguage = async (content: string, options?: ActivitySentimentOptions): Promise<LanguageGuessResult> => {

    const {
        defaultLanguage = 'en',
        requiredLanguageConfidence = 0.9,
        languageHints = availableSentimentLanguages
    } = options || {};

    if (nlp === undefined) {
        await bootstrapNlp();
    }

    const spaceNormalizedTokens = content.trim().split(' ').filter(x => x !== ''.trim());

    const lang = container.get('Language') as Language;
    // would like to improve this https://github.com/axa-group/nlp.js/issues/761
    const guesses = lang.guess(content, null, 4);
    let bestLang = guesses[0];
    const shortContent = spaceNormalizedTokens.length <= 4;

    const altBest = languageHints.includes(bestLang.alpha2) ? undefined : guesses.find(x => x.score >= requiredLanguageConfidence && languageHints.includes(x.alpha2));

    // coerce best guess into a supported language that has a good enough confidence
    if(!shortContent && altBest !== undefined) {
        bestLang = altBest;
    }

    let usedLang: LanguageType = bestLang;
    let usedDefault = false;

    if (typeof defaultLanguage === 'string' && (bestLang.score < requiredLanguageConfidence || (shortContent && !languageHints.includes(bestLang.alpha2)))) {
        usedLang = await getLanguageTypeFromValue(defaultLanguage);
        usedDefault = true;
    }

    return {
        guesses,
        bestGuess: bestLang,
        requiredConfidence: requiredLanguageConfidence,
        sparse: shortContent,
        language: usedLang,
        usedDefault
    }
}
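A minimal usage sketch of the guess-with-fallback behavior above (the sample text and option values are illustrative):

```typescript
const guess = await getContentLanguage('Ceci est un commentaire assez long pour deviner la langue', {
    defaultLanguage: 'en',              // substituted when confidence is too low or content is sparse
    requiredLanguageConfidence: 0.9
});
if (guess.usedDefault) {
    // best guess fell below the confidence threshold, or the content had <= 4 tokens
    // and was not in one of the hinted languages
}
console.log(guess.language.alpha2, guess.bestGuess.score);
```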
export const getActivitySentiment = async (item: SnoowrapActivity, options?: ActivitySentimentOptions): Promise<ActivitySentiment> => {

    const result = await getStringSentiment(getActivityContent(item, options), options);

    return {
        ...result,
        activity: item
    }
}

export const getStringSentiment = async (contentStr: string, options?: ActivitySentimentOptions): Promise<StringSentiment> => {

    const langResult = await getContentLanguage(contentStr, options);

    let usedLanguage: LanguageType = langResult.language;

    const spaceNormalizedTokens = contentStr.trim().split(' ').filter(x => x !== ''.trim());

    const results: SentimentResult[] = [];

    const nlpResult = await nlp.process(langResult.language.alpha2, contentStr);

    results.push({
        comparative: nlpResult.sentiment.average,
        type: nlpResult.sentiment.type as SentimentAnalysisType,
        sentiment: scoreToSentimentText(nlpResult.sentiment.average),
        weight: 1,
        matchedTokens: nlpResult.sentiment.numHits,
        tokens: nlpResult.sentiment.numWords,
        usableResult: availableSentimentLanguages.includes(langResult.language.alpha2) ? true : (nlpResult.sentiment.numHits / nlpResult.sentiment.numWords) >= 0.5 ? true : `${langResult.sparse ? 'Content was too short to guess language' : 'Unsupported language'} and less than 50% of tokens matched`,
    });

    // only run vader/wink if either
    //
    // * content was short which means we aren't confident on language guess
    // * OR language is english (guessed or explicitly set as language fallback by user due to low confidence)
    //
    if (langResult.sparse || langResult.language.alpha2 === 'en') {

        // neg post neu are ratios of *recognized* tokens in the content
        // when neu is close to 1 its either extremely neutral or no tokens were recognized
        const vaderScore = SentimentIntensityAnalyzer.polarity_scores(contentStr);
        const vaderRes: SentimentResult = {
            comparative: vaderScore.compound,
            type: 'vader',
            sentiment: scoreToSentimentText(vaderScore.compound),
            // may want to weight higher in the future...
            weight: 1,
            tokens: spaceNormalizedTokens.length,
            usableResult: langResult.language.alpha2 === 'en' ? true : (vaderScore.neu < 0.5 ? true : `Unable to guess language and unable to determine if more than 50% of tokens are negative or not matched`)
        };
        results.push(vaderRes);

        const winkScore = wink(contentStr);
        const matchedTokens = winkScore.tokenizedPhrase.filter(x => x.score !== undefined);
        const matchedMeaningfulTokens = winkScore.tokenizedPhrase.filter(x => x.tag === 'word' || x.tag === 'emoji');
        // normalizedScore is range of -5 to +5 -- convert to -1 to +1
        const winkAdjusted = (winkScore.normalizedScore * 2) / 10;
        const winkRes: SentimentResult = {
            comparative: winkAdjusted,
            type: 'wink',
            sentiment: scoreToSentimentText(winkAdjusted),
            weight: 1,
            matchedTokens: matchedTokens.length,
            tokens: winkScore.tokenizedPhrase.length,
            usableResult: langResult.language.alpha2 === 'en' ? true : ((matchedTokens.length / matchedMeaningfulTokens.length) > 0.5 ? true : 'Unable to guess language and less than 50% of tokens matched')
        };
        results.push(winkRes);

        if ((vaderRes.usableResult == true || winkRes.usableResult === true) && usedLanguage.alpha2 !== 'en') {
            // since we are confident enough to use one of these then we are assuming language is mostly english
            usedLanguage = await getLanguageTypeFromValue('en');
        }
    }

    const score = results.reduce((acc, curr) => acc + curr.comparative, 0) / results.length;
    const sentiment = scoreToSentimentText(score);

    const weightSum = results.reduce((acc, curr) => acc + curr.weight, 0);
    const weightedScores = results.reduce((acc, curr) => acc + (curr.weight * curr.comparative), 0);
    const weightedScore = weightedScores / weightSum;
    const weightedSentiment = scoreToSentimentText(weightedScore);

    const actSentResult: StringSentiment = {
        results,
        score,
        sentiment,
        scoreWeighted: weightedScore,
        sentimentWeighted: weightedSentiment,
        guessedLanguage: langResult,
        usedLanguage,
        usableScore: results.filter(x => x.usableResult === true).length > 0,
    }

    if (!actSentResult.usableScore) {
        if (actSentResult.guessedLanguage.sparse) {
            actSentResult.reason = 'Content may be supported language but was too short to guess accurately and no algorithm matched enough tokens to be considered confident.';
        } else {
            actSentResult.reason = 'Unsupported language'
        }
    }

    return actSentResult;
}
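The final score is a weight-normalized average of whichever analyzers ran. A worked example with the current equal weights (the scores are made up):

```typescript
// Suppose the analyzers returned these comparative scores for an English comment:
//   nlp.js => 0.40, vader => 0.20, wink => 0.30 (each with weight 1)
// scoreWeighted = (1*0.40 + 1*0.20 + 1*0.30) / (1 + 1 + 1) = 0.30
const result = await getStringSentiment('some english text to analyze');
console.log(result.scoreWeighted, result.sentimentWeighted, result.usableScore);
```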
export const testActivitySentiment = async (item: SnoowrapActivity, criteria: SentimentCriteriaTest, options?: ActivitySentimentOptions): Promise<ActivitySentimentTestResult> => {
    const sentimentResult = await getActivitySentiment(item, options);

    const testResult = testSentiment(sentimentResult, criteria);

    return {
        ...testResult,
        activity: item
    }
}

export const testSentiment = (sentimentResult: StringSentiment, criteria: SentimentCriteriaTest): StringSentimentTestResult => {

    if (!sentimentResult.usableScore) {
        return {
            passes: false,
            test: criteria,
            ...sentimentResult,
        }
    }

    if (asGenericComparison(criteria)) {
        return {
            passes: comparisonTextOp(sentimentResult.scoreWeighted, criteria.operator, criteria.value),
            test: criteria,
            ...sentimentResult,
        }
    } else {
        if (criteria.not) {
            return {
                passes: sentimentResult.scoreWeighted < criteria.range[0] || sentimentResult.scoreWeighted > criteria.range[1],
                test: criteria,
                ...sentimentResult,
            }
        }
        return {
            passes: sentimentResult.scoreWeighted >= criteria.range[0] || sentimentResult.scoreWeighted <= criteria.range[1],
            test: criteria,
            ...sentimentResult,
        }
    }
}
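A short sketch tying the pieces together with the ranged 'neutral' criteria produced by the parser at the top of this file:

```typescript
const neutral = parseTextToNumberComparison('is neutral');   // RangedComparison over [-0.1, 0.1]
const outcome = testSentiment(await getStringSentiment('thanks for the update'), neutral);
// Intended to pass when scoreWeighted falls within the range; always fails (passes: false)
// when no analyzer produced a usable result, with outcome.reason explaining why.
```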
@@ -0,0 +1,97 @@
import {MigrationInterface, QueryRunner, Table, TableIndex} from "typeorm"

const index = (prefix: string, columns: string[], unique = true) => new TableIndex({
    name: `IDX_${unique ? 'UN_' : ''}${prefix}_${columns.join('-')}_MIG`,
    columnNames: columns,
    isUnique: unique,
});

const idIndex = (prefix: string, unique: boolean) => index(prefix, ['id'], unique);

export class indexes1653586738904 implements MigrationInterface {

    public async up(queryRunner: QueryRunner): Promise<void> {

        queryRunner.connection.logger.logSchemaBuild('Starting Index Add/Update Migration');
        queryRunner.connection.logger.logSchemaBuild('IF YOU HAVE A LARGE DATABASE THIS MAY TAKE SEVERAL MINUTES! DO NOT STOP CONTEXTMOD WHILE MIGRATION IS IN PROGRESS!');

        // unique ids due to random id
        const uniqueIdTableNames = [
            'Manager',
            'CMEvent',
            'FilterResult',
            'FilterCriteriaResult',
            'RunnableResult',
            'RulePremise',
            'RuleResult',
            'RuleSetResult',
            'ActionPremise',
            'ActionResult',
            'CheckResult',
            'RunResult'
        ];

        for (const tableName of uniqueIdTableNames) {
            const cmTable = await queryRunner.getTable(tableName);
            await queryRunner.createIndex(cmTable as Table, idIndex(tableName, true));
        }

        // additional indexes

        const actSource = await queryRunner.getTable('ActivitySource');
        await queryRunner.createIndex(actSource as Table, idIndex('ActivitySource', false));

        const event = await queryRunner.getTable('CMEvent');
        await queryRunner.createIndices(event as Table, [index('CMEvent', ['activity_id'], false)]);

        // FilterCriteriaResult criteriaId filterResultId

        const fcrTable = await queryRunner.getTable('FilterCriteriaResult');
        await queryRunner.createIndices(fcrTable as Table, [
            index('FilterCriteriaResult', ['criteriaId'], false),
            index('FilterCriteriaResult', ['filterResultId'], false)
        ]);

        // FilterCriteria id

        const fcTable = await queryRunner.getTable('FilterCriteria');
        await queryRunner.createIndices(fcTable as Table, [
            idIndex('FilterCriteriaResult', false),
        ]);

        // RunnableResult resultId runnableId

        const rrTable = await queryRunner.getTable('RunnableResult');
        await queryRunner.createIndices(rrTable as Table, [
            index('RunnableResult', ['resultId'], false),
            index('RunnableResult', ['runnableId'], false)
        ]);

        // ActionResult checkResultId premiseId

        const arTable = await queryRunner.getTable('ActionResult');
        await queryRunner.createIndices(arTable as Table, [
            index('ActionResult', ['checkResultId'], false),
            index('ActionResult', ['premiseId'], false)
        ]);

        // CheckResult runId

        const crTable = await queryRunner.getTable('CheckResult');
        await queryRunner.createIndices(crTable as Table, [
            index('CheckResult', ['runId'], false),
        ]);

        // RunResult eventId

        const runResTable = await queryRunner.getTable('RunResult');
        await queryRunner.createIndices(runResTable as Table, [
            index('RunResult', ['eventId'], false),
        ]);
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
    }

}
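For clarity, the index helper above composes names directly from its template string, e.g.:

```typescript
index('CMEvent', ['activity_id'], false)
// => TableIndex { name: 'IDX_CMEvent_activity_id_MIG', columnNames: ['activity_id'], isUnique: false }
idIndex('Manager', true)
// => TableIndex { name: 'IDX_UN_Manager_id_MIG', columnNames: ['id'], isUnique: true }
```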
src/Common/Typings/support.d.ts
@@ -3,9 +3,11 @@ declare module 'snoowrap/dist/errors' {
    export interface InvalidUserError extends Error {

    }

    export interface NoCredentialsError extends Error {

    }

    export interface InvalidMethodCallError extends Error {

    }
@@ -26,9 +28,138 @@ declare module 'snoowrap/dist/errors' {
}

declare module 'winston-null' {
    import TransportStream from "winston-transport";
    import TransportStream from "winston-transport";

    export class NullTransport extends TransportStream {

    }
}

declare module '@nlpjs/*' {

    declare interface SentimentResult {
        score: number,
        average: number,
        numWords: number,
        numHits: number,
        type: string,
        language: string
    }

    declare interface NLPSentimentResult extends Omit<SentimentResult, 'language'> {
        vote: string
        locale: string
    }

    declare module '@nlpjs/language' {

        export interface LanguageType {
            alpha3: string,
            alpha2: string,
            language: string,
        }

        export interface LanguageObj {
            alpha3: string,
            alpha2: string,
            name: string,
        }

        export interface LanguageGuess extends LanguageType {
            score: number
        }

        export class Language {
            guess(val: string, allowedList?: string[] | null, limit?: number): LanguageGuess[];

            guessBest(val: string, allowedList?: string[] | null): LanguageGuess;

            /**
             * Key is alpha2 lang IE en es de fr
             * */
            languagesAlpha2: Record<string, LanguageObj>;
            /**
             * Key is alpha3 lang IE eng spa deu fra
             * */
            languagesAlpha3: Record<string, LanguageObj>;
        }
    }

    declare module '@nlpjs/sentiment' {

        declare interface SentimentPipelineResult {
            utterance: string
            locale: string
            settings: { tag: string }
            tokens: string[]
            sentiment: SentimentResult
        }

        declare interface SentimentPipelineInput {
            utterance: string
            locale: string

            [key: string]: any
        }

        export class SentimentAnalyzer {
            constructor(settings?: { language?: string }, container?: any)

            container: any

            process(srcInput: SentimentPipelineInput, settings?: object): Promise<SentimentPipelineResult>
        }
    }

    declare module '@nlpjs/nlp' {

        declare interface NlpResult {
            locale: string
            language: string
            languageGuessed: boolean
            sentiment: NLPSentimentResult
        }

        export class Nlp {
            settings: any;
            nluManager: any;

            constructor(settings?: { language?: string }, container?: any)

            // locale language languageGuessed sentiment
            process(locale: string, utterance?: string, srcContext?: object, settings?: object): Promise<NlpResult>
            addLanguage(locale: string)
            train(): Promise<any>;

        }
    }

    declare module '@nlpjs/lang-es' {
        export const LangEs: any
    }
    declare module '@nlpjs/lang-en' {
        export const LangEn: any
    }
    declare module '@nlpjs/lang-de' {
        export const LangDe: any
    }
    declare module '@nlpjs/lang-fr' {
        export const LangFr: any
    }
    declare module '@nlpjs/nlu' {
        export const Nlu: any
    }

    declare module '@nlpjs/core' {
        export const Container: any
        export const containerBootstrap: any
    }
}

declare module 'wink-sentiment' {
    function sentiment(phrase: string): { score: number, normalizedScore: number, tokenizedPhrase: any[] };

    export default sentiment;
}

src/Common/Typings/vader-sentiment.d.ts
@@ -0,0 +1,50 @@
declare module 'vader-sentiment' {
    export const REGEX_REMOVE_PUNCTUATION: RegExp;
    export const B_INCR: number;
    export const B_DECR: number;
    export const C_INCR: number;
    export const N_SCALER: number;
    export const PUNC_LIST: string[];
    export const NEGATE: string[];
    export const BOOSTER_DICT: Record<string, number>;
    export const SPECIAL_CASE_IDIOMS: Record<string, number>;

    export interface Scores {
        neg: number
        neu: number
        pos: number
        compound: number
    }

    export function negated(input_words: string[], include_nt: boolean = true): boolean;
    export function normalize(score: number, alpha: number): number;
    export function allcap_differential(words: string[]): boolean;
    export function scalar_inc_dec(word: string, valence: number, is_cap_diff: boolean): number
    export function is_upper_function(word: string): boolean

    export class SentiText {
        public text: string;
        public words_and_emoticons: string[];
        public is_cap_diff: boolean;

        constructor(text: string);

        get_words_plus_punc(): Record<string, string>;
        get_words_and_emoticons(): string[];
    }

    export class SentimentIntensityAnalyzer {

        static polarity_scores(text: string): Scores;
        static sentiment_valence(valence: number, sentiText: SentiText, item: string, index: number, sentiments: number[]);
        static least_check(valence: number, words_and_emoticons: string[], index: number): number;
        static but_check(words_and_emoticons: string[], sentiments: number[]): number[]
        static idioms_check(valence: number, words_and_emoticons: string[], index: number): number;
        static never_check(valence: number, words_and_emoticons: string[], start_i: number, index: number): number
        static punctuation_emphasis(sum_s: any, text: string);
        static amplify_ep(text: string): number;
        static amplify_qm(text: string): number;
        static sift_sentiment_scores(sentiments: number[]): number[];
        static score_valence(sentiments: number[], text: string): Scores;
    }
}
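These declarations only cover the pieces of vader-sentiment the bot actually touches. Typical usage, mirroring the call in the sentiment code earlier, looks like:

```typescript
import { SentimentIntensityAnalyzer } from 'vader-sentiment';

const scores = SentimentIntensityAnalyzer.polarity_scores('This bot is surprisingly helpful!');
// scores.neg / scores.neu / scores.pos are ratios of recognized tokens,
// scores.compound is the normalized -1 to +1 value used as the comparative score upstream
```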
@@ -3,7 +3,17 @@ import path from "path";
import {FilterCriteriaDefaults} from "./Infrastructure/Filters/FilterShapes";

export const cacheOptDefaults = {ttl: 60, max: 500, checkPeriod: 600};
export const cacheTTLDefaults = {authorTTL: 60, userNotesTTL: 300, wikiTTL: 300, submissionTTL: 60, commentTTL: 60, filterCriteriaTTL: 60, subredditTTL: 600, selfTTL: 60};
export const cacheTTLDefaults = {
    authorTTL: 60,
    userNotesTTL: 300,
    modNotesTTL: 60,
    wikiTTL: 300,
    submissionTTL: 60,
    commentTTL: 60,
    filterCriteriaTTL: 60,
    subredditTTL: 600,
    selfTTL: 60
};

export const createHistoricalDisplayDefaults = (): HistoricalStatsDisplay => ({
    checksRunTotal: 0,
@@ -32,4 +42,4 @@ export const filterCriteriaDefault: FilterCriteriaDefaults = {
export const defaultDataDir = path.resolve(__dirname, '../..');
export const defaultConfigFilenames = ['config.json', 'config.yaml'];

export const VERSION = '0.10.12';
export const VERSION = '0.11.1';

@@ -461,6 +461,17 @@ export interface TTLConfig {
     * @default 50
     * */
    selfTTL?: number | boolean

    /**
     * Amount of time, in seconds, Mod Notes should be cached
     *
     * * If `0` or `true` will cache indefinitely (not recommended)
     * * If `false` will not cache
     *
     * @examples [60]
     * @default 60
     * */
    modNotesTTL?: number | boolean;
}

export interface CacheConfig extends TTLConfig {
@@ -737,6 +748,7 @@ export type StrongCache = {
    commentTTL: number | boolean,
    subredditTTL: number | boolean,
    selfTTL: number | boolean,
    modNotesTTL: number | boolean,
    filterCriteriaTTL: number | boolean,
    provider: CacheOptions
    actionedEventsMax?: number,
@@ -1676,6 +1688,7 @@ export interface LogInfo {
    labels?: string[]
    bot?: string
    user?: string
    transport?: string[]
}

export interface ActionResult extends ActionProcessResult {
@@ -1942,7 +1955,6 @@ export interface ActivityDispatch extends Omit<ActivityDispatchConfig, 'delay'|
    author: string
    delay: Duration
    tardyTolerant?: boolean | Duration
    processing: boolean
    action?: string
    type: ActivitySourceTypes
    dryRun?: boolean
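Tying the caching additions in these hunks together, a minimal sketch of a TTL block that sets the new option (values are illustrative; the shape follows the TTLConfig interface above):

```typescript
const caching: TTLConfig = {
    authorTTL: 60,
    userNotesTTL: 300,
    modNotesTTL: 60,   // new: cache Mod Notes for 60 seconds; false disables caching, 0 or true caches indefinitely
    wikiTTL: 300
};
```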
@@ -18,7 +18,9 @@ import {RepostRuleJSONConfig} from "../Rule/RepostRule";
import {DispatchActionJson} from "../Action/DispatchAction";
import {CancelDispatchActionJson} from "../Action/CancelDispatchAction";
import {ContributorActionJson} from "../Action/ContributorAction";
import {SentimentRuleJSONConfig} from "../Rule/SentimentRule";
import {ModNoteActionJson} from "../Action/ModNoteAction";

export type RuleObjectJsonTypes = RecentActivityRuleJSONConfig | RepeatActivityJSONConfig | AuthorRuleJSONConfig | AttributionJSONConfig | HistoryJSONConfig | RegexRuleJSONConfig | RepostRuleJSONConfig
export type RuleObjectJsonTypes = RecentActivityRuleJSONConfig | RepeatActivityJSONConfig | AuthorRuleJSONConfig | AttributionJSONConfig | HistoryJSONConfig | RegexRuleJSONConfig | RepostRuleJSONConfig | SentimentRuleJSONConfig

export type ActionJson = CommentActionJson | FlairActionJson | ReportActionJson | LockActionJson | RemoveActionJson | ApproveActionJson | BanActionJson | UserNoteActionJson | MessageActionJson | UserFlairActionJson | DispatchActionJson | CancelDispatchActionJson | ContributorActionJson | string;
export type ActionJson = CommentActionJson | FlairActionJson | ReportActionJson | LockActionJson | RemoveActionJson | ApproveActionJson | BanActionJson | UserNoteActionJson | MessageActionJson | UserFlairActionJson | DispatchActionJson | CancelDispatchActionJson | ContributorActionJson | ModNoteActionJson | string;

@@ -10,10 +10,9 @@ import {getAttributionIdentifier} from "../Utils/SnoowrapUtils";
import dayjs from "dayjs";
import {
    asSubmission, buildFilter, buildSubredditFilter,
    comparisonTextOp, convertSubredditsRawToStrong,
    convertSubredditsRawToStrong,
    FAIL,
    formatNumber, getActivitySubredditName, isActivityWindowConfig, isSubmission,
    parseGenericValueOrPercentComparison,
    parseSubredditName,
    PASS, windowConfigToWindowCriteria
} from "../util";
@@ -27,6 +26,7 @@ import {
    HistoryFiltersOptions
} from "../Common/Infrastructure/ActivityWindow";
import {FilterOptions} from "../Common/Infrastructure/Filters/FilterShapes";
import {comparisonTextOp, parseGenericValueOrPercentComparison} from "../Common/Infrastructure/Comparisons";


export interface AttributionCriteria {

@@ -8,10 +8,9 @@ import Submission from "snoowrap/dist/objects/Submission";
import dayjs from "dayjs";
import {
    asSubmission,
    comparisonTextOp,
    FAIL,
    formatNumber, getActivitySubredditName, historyFilterConfigToOptions, isSubmission,
    parseGenericValueOrPercentComparison, parseSubredditName,
    parseSubredditName,
    PASS,
    percentFromString, removeUndefinedKeys, toStrongSubredditState, windowConfigToWindowCriteria
} from "../util";
@@ -20,6 +19,7 @@ import {SubredditCriteria} from "../Common/Infrastructure/Filters/FilterCriteria
import {CompareValueOrPercent} from "../Common/Infrastructure/Atomic";
import {ActivityWindowConfig, ActivityWindowCriteria} from "../Common/Infrastructure/ActivityWindow";
import {ErrorWithCause} from "pony-cause";
import {comparisonTextOp, parseGenericValueOrPercentComparison} from "../Common/Infrastructure/Comparisons";

export interface CommentThresholdCriteria extends ThresholdCriteria {
    /**

@@ -11,7 +11,7 @@ import {
    asSubmission, bitsToHexLength,
    // blockHashImage,
    compareImages,
    comparisonTextOp, convertSubredditsRawToStrong,
    convertSubredditsRawToStrong,
    FAIL,
    formatNumber,
    getActivitySubredditName, imageCompareMaxConcurrencyGuess,
@@ -19,7 +19,7 @@ import {
    isSubmission,
    isValidImageURL,
    objectToStringSummary,
    parseGenericValueOrPercentComparison, parseRedditEntity,
    parseRedditEntity,
    parseStringToRegex,
    parseSubredditName,
    parseUsableLinkIdentifier,
@@ -41,6 +41,7 @@ import {
    SubredditCriteria
} from "../Common/Infrastructure/Filters/FilterCriteria";
import {ActivityWindow, ActivityWindowConfig} from "../Common/Infrastructure/ActivityWindow";
import {comparisonTextOp, parseGenericValueOrPercentComparison} from "../Common/Infrastructure/Comparisons";

const parseLink = parseUsableLinkIdentifier();


@@ -3,8 +3,7 @@ import {Comment} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {
    asSubmission,
    comparisonTextOp, FAIL, isExternalUrlSubmission, isSubmission, parseGenericValueComparison,
    parseGenericValueOrPercentComparison, parseRegex, parseStringToRegex,
    FAIL, isExternalUrlSubmission, isSubmission, parseRegex, parseStringToRegex,
    PASS, triggeredIndicator, windowConfigToWindowCriteria
} from "../util";
import {
@@ -14,6 +13,11 @@ import dayjs from 'dayjs';
import {SimpleError} from "../Utils/Errors";
import {JoinOperands} from "../Common/Infrastructure/Atomic";
import {ActivityWindowConfig} from "../Common/Infrastructure/ActivityWindow";
import {
    comparisonTextOp,
    parseGenericValueComparison,
    parseGenericValueOrPercentComparison
} from "../Common/Infrastructure/Comparisons";

export interface RegexCriteria {
    /**

@@ -3,12 +3,10 @@ import {Comment, RedditUser} from "snoowrap";
import {
    activityWindowText,
    asSubmission,
    comparisonTextOp,
    FAIL,
    getActivitySubredditName, isActivityWindowConfig,
    isExternalUrlSubmission,
    isRedditMedia,
    parseGenericValueComparison,
    parseSubredditName,
    parseUsableLinkIdentifier as linkParser,
    PASS,
@@ -23,7 +21,6 @@ import {
} from "../Common/interfaces";
import Submission from "snoowrap/dist/objects/Submission";
import dayjs from "dayjs";
import Fuse from 'fuse.js'
import {StrongSubredditCriteria, SubredditCriteria} from "../Common/Infrastructure/Filters/FilterCriteria";
import {
    ActivityWindow,
@@ -31,6 +28,7 @@ import {
    ActivityWindowCriteria,
    HistoryFiltersOptions
} from "../Common/Infrastructure/ActivityWindow";
import {comparisonTextOp, parseGenericValueComparison} from "../Common/Infrastructure/Comparisons";

const parseUsableLinkIdentifier = linkParser();


@@ -3,11 +3,8 @@ import {Listing, SearchOptions} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import Comment from "snoowrap/dist/objects/Comment";
import {
    compareDurationValue,
    comparisonTextOp,
    FAIL, formatNumber,
    isRepostItemResult, parseDurationComparison, parseGenericValueComparison,
    parseUsableLinkIdentifier,
    isRepostItemResult, parseUsableLinkIdentifier,
    PASS, searchAndReplace, stringSameness, triggeredIndicator, windowConfigToWindowCriteria, wordCount
} from "../util";
import {
@@ -18,13 +15,16 @@ import {
} from "../Common/interfaces";
import objectHash from "object-hash";
import {getAttributionIdentifier} from "../Utils/SnoowrapUtils";
import Fuse from "fuse.js";
import leven from "leven";
import {YoutubeClient, commentsAsRepostItems} from "../Utils/ThirdParty/YoutubeClient";
import dayjs from "dayjs";
import {rest} from "lodash";
import {CompareValue, DurationComparor, JoinOperands, SearchFacetType} from "../Common/Infrastructure/Atomic";
import {ActivityWindow, ActivityWindowConfig} from "../Common/Infrastructure/ActivityWindow";
import {
    compareDurationValue, comparisonTextOp,
    parseDurationComparison,
    parseGenericValueComparison
} from "../Common/Infrastructure/Comparisons";

const parseYtIdentifier = parseUsableLinkIdentifier();


@@ -10,6 +10,7 @@ import {SubredditResources} from "../Subreddit/SubredditResources";
import Snoowrap from "snoowrap";
import {RepostRule, RepostRuleJSONConfig} from "./RepostRule";
import {StructuredFilter} from "../Common/Infrastructure/Filters/FilterShapes";
import {SentimentRule, SentimentRuleJSONConfig} from "./SentimentRule";

export function ruleFactory
(config: StructuredRuleJson, logger: Logger, subredditName: string, resources: SubredditResources, client: Snoowrap): Rule {
@@ -37,7 +38,10 @@ export function ruleFactory
        case 'repost':
            cfg = config as StructuredFilter<RepostRuleJSONConfig>;
            return new RepostRule({...cfg, logger, subredditName, resources, client});
        case 'sentiment':
            cfg = config as StructuredFilter<SentimentRuleJSONConfig>;
            return new SentimentRule({...cfg, logger, subredditName, resources, client});
        default:
            throw new Error('rule "kind" was not recognized.');
            throw new Error(`Rule with kind '${config.kind}' was not recognized.`);
    }
}
src/Rule/SentimentRule.ts
@@ -0,0 +1,248 @@
import {Rule, RuleJSONConfig, RuleOptions} from "./index";
import {Comment} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {
    formatNumber,
    triggeredIndicator, windowConfigToWindowCriteria
} from "../util";

import dayjs from 'dayjs';
import {map as mapAsync} from 'async';
import {
    comparisonTextOp,
    GenericComparison,
    parseGenericValueOrPercentComparison,
    RangedComparison
} from "../Common/Infrastructure/Comparisons";
import {ActivityWindowConfig, ActivityWindowCriteria} from "../Common/Infrastructure/ActivityWindow";
import {VaderSentimentComparison} from "../Common/Infrastructure/Atomic";
import {RuleResult} from "../Common/interfaces";
import {SnoowrapActivity} from "../Common/Infrastructure/Reddit";
import {
    ActivitySentimentOptions,
    ActivitySentimentTestResult,
    parseTextToNumberComparison,
    testActivitySentiment
} from "../Common/LangaugeProcessing";

export class SentimentRule extends Rule {

    sentimentVal: string;
    sentiment: GenericComparison | RangedComparison;

    historical?: HistoricalSentiment;

    testOn: ('title' | 'body')[]

    constructor(options: SentimentRuleOptions) {
        super(options);

        this.sentimentVal = options.sentiment;
        this.sentiment = parseTextToNumberComparison(options.sentiment);
        this.testOn = options.testOn ?? ['title', 'body'];

        if(options.historical !== undefined) {
            const {
                window,
                sentiment: historicalSentiment = this.sentimentVal,
                mustMatchCurrent = false,
                totalMatching = '> 0',
            } = options.historical

            this.historical = {
                sentiment: parseTextToNumberComparison(historicalSentiment),
                sentimentVal: historicalSentiment,
                window: windowConfigToWindowCriteria(window),
                mustMatchCurrent,
                totalMatching: parseGenericValueOrPercentComparison(totalMatching),
            };
        }
    }

    getKind(): string {
        return 'sentiment';
    }

    getSpecificPremise(): object {
        return {
            sentiment: this.sentiment,
        }
    }

    protected async process(item: Submission | Comment): Promise<[boolean, RuleResult]> {

        let ogResult = await this.testActivity(item, this.sentiment);
        let historicResults: ActivitySentimentTestResult[] | undefined;

        if(this.historical !== undefined && (!this.historical.mustMatchCurrent || ogResult.passes)) {
            const {
                sentiment = this.sentiment,
                window,
            } = this.historical;
            const history = await this.resources.getAuthorActivities(item.author, window);

            historicResults = await mapAsync(history, async (x: SnoowrapActivity) => await this.testActivity(x, sentiment)); // history.map(x => this.testActivity(x, sentiment));
        }

        const logSummary: string[] = [];

        const sentimentTest = this.sentiment.displayText;
        const historicalSentimentTest = this.historical !== undefined ? this.historical.sentiment.displayText : undefined;

        let triggered = false;
        let averageScore: number;
        let averageWindowScore: number | undefined;
        let humanWindow: string | undefined;
        let historicalPassed: string | undefined;
        let totalMatchingText: string | undefined;

        if(historicResults === undefined) {
            triggered = ogResult.passes;
            averageScore = ogResult.scoreWeighted;
            logSummary.push(`${triggeredIndicator(triggered)} Current Activity Sentiment '${ogResult.sentiment} (${ogResult.scoreWeighted})' ${triggered ? 'PASSED' : 'DID NOT PASS'} sentiment test '${sentimentTest}'`);
            if(!triggered && this.historical !== undefined && this.historical.mustMatchCurrent) {
                logSummary.push(`Did not check Historical because 'mustMatchCurrent' is true`);
            }
        } else {

            const {
                totalMatching,
                sentiment,
            } = this.historical as HistoricalSentiment;

            totalMatchingText = totalMatching.displayText;
            const allResults = historicResults
            const passed = allResults.filter(x => x.passes);
            averageScore = passed.reduce((acc, curr) => acc + curr.scoreWeighted,0) / passed.length;
            averageWindowScore = allResults.reduce((acc, curr) => acc + curr.scoreWeighted,0) / allResults.length;

            const firstActivity = allResults[0].activity;
            const lastActivity = allResults[allResults.length - 1].activity;

            const humanRange = dayjs.duration(dayjs(firstActivity.created_utc * 1000).diff(dayjs(lastActivity.created_utc * 1000))).humanize();

            humanWindow = `${allResults.length} Activities (${humanRange})`;

            const {operator, value, isPercent} = totalMatching;
            if(isPercent) {
                const passPercentVal = passed.length/allResults.length
                triggered = comparisonTextOp(passPercentVal, operator, (value/100));
                historicalPassed = `${passed.length} (${formatNumber(passPercentVal)}%)`;
            } else {
                triggered = comparisonTextOp(passed.length, operator, value);
                historicalPassed = `${passed.length}`;
            }
            logSummary.push(`${triggeredIndicator(triggered)} ${historicalPassed} historical activities of ${humanWindow} passed sentiment test '${sentiment.displayText}' which ${triggered ? 'MET' : 'DID NOT MEET'} threshold '${totalMatching.displayText}'`);
        }

        const result = logSummary.join(' || ');
        this.logger.verbose(result);

        return Promise.resolve([triggered, this.getResult(triggered, {
            result,
            data: {
                results: {
                    triggered,
                    sentimentTest,
                    historicalSentimentTest,
                    averageScore,
                    averageWindowScore,
                    window: humanWindow,
                    totalMatching: totalMatchingText
                }
            }
        })]);
    }

    protected async testActivity(a: (Submission | Comment), criteria: GenericComparison | RangedComparison): Promise<ActivitySentimentTestResult> {
        return await testActivitySentiment(a, criteria, {testOn: this.testOn});
    }
}

/**
 * Test the Sentiment of Activities from the Author history
 *
 * If this is defined then the `totalMatching` threshold must pass for the Rule to trigger
 *
 * If `sentiment` is defined here it overrides the top-level `sentiment` value
 *
 * */
interface HistoricalSentimentConfig {
    window: ActivityWindowConfig

    sentiment?: VaderSentimentComparison

    /**
     * When `true` the original Activity being checked MUST match desired sentiment before the Rule considers any history
     *
     * @default false
     * */
    mustMatchCurrent?: boolean

    /**
     * A string containing a comparison operator and a value to compare Activities from history that pass the given `sentiment` comparison
     *
     * The syntax is `(< OR > OR <= OR >=) <number>[percent sign]`
     *
     * * EX `> 12` => greater than 12 activities passed given `sentiment` comparison
     * * EX `<= 10%` => less than 10% of all Activities from history passed given `sentiment` comparison
     *
     * @pattern ^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$
     * @default "> 0"
     * @examples ["> 0","> 10%"]
     * */
    totalMatching: string
}

interface HistoricalSentiment extends Omit<HistoricalSentimentConfig, 'sentiment' | 'window' | 'totalMatching'> {
    sentiment: GenericComparison | RangedComparison,
    sentimentVal: string
    window: ActivityWindowCriteria
    totalMatching: GenericComparison
}

interface SentimentConfig extends ActivitySentimentOptions {

    sentiment: VaderSentimentComparison

    /**
     * Test the Sentiment of Activities from the Author history
     *
     * If this is defined then the `totalMatching` threshold must pass for the Rule to trigger
     *
     * If `sentiment` is defined here it overrides the top-level `sentiment` value
     *
     * */
    historical?: HistoricalSentimentConfig

    /**
     * Which content from an Activity to test for `sentiment` against
     *
     * Only used if the Activity being tested is a Submission -- Comments are only tested against their body
     *
     * If more than one type of content is specified then all text is tested together as one string
     *
     * @default ["title", "body"]
     * */
    testOn?: ('title' | 'body')[]
}

export interface SentimentRuleOptions extends SentimentConfig, RuleOptions {
}

/**
 * Test the calculated VADER sentiment for an Activity to determine if the text context is negative, neutral, or positive in tone.
 *
 * More about VADER Sentiment: https://github.com/cjhutto/vaderSentiment
 *
 * */
export interface SentimentRuleJSONConfig extends SentimentConfig, RuleJSONConfig {
    /**
     * @examples ["sentiment"]
     * */
    kind: 'sentiment'
}

export default SentimentRule;
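A minimal sketch of a rule configuration using the shapes defined above (a hypothetical example, not taken from the repo's own samples):

```typescript
const angryHistoryRule: SentimentRuleJSONConfig = {
    kind: 'sentiment',
    sentiment: 'is very negative',   // natural-language form of '<= -0.3'
    testOn: ['title', 'body'],
    historical: {
        window: '90 days',           // Author history to consider
        mustMatchCurrent: true,      // only look at history if the current Activity is also very negative
        totalMatching: '> 10%'       // trigger when more than 10% of that history matches
    }
};
```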
@@ -185,7 +185,7 @@ export interface RuleJSONConfig extends IRule {
     * The kind of rule to run
     * @examples ["recentActivity", "repeatActivity", "author", "attribution", "history"]
     */
    kind: 'recentActivity' | 'repeatActivity' | 'author' | 'attribution' | 'history' | 'regex' | 'repost'
    kind: 'recentActivity' | 'repeatActivity' | 'author' | 'attribution' | 'history' | 'regex' | 'repost' | 'sentiment'
}

@@ -12,6 +12,7 @@
"flair",
"lock",
"message",
"modnote",
"remove",
"report",
"userflair",
@@ -137,6 +138,19 @@
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"modActions": {
"items": {
"anyOf": [
{
"$ref": "#/definitions/ModNoteCriteria"
},
{
"$ref": "#/definitions/ModLogCriteria"
}
]
},
"type": "array"
},
"name": {
"description": "A list of reddit usernames (case-insensitive) to match against. Do not include the \"u/\" prefix\n\n EX to match against /u/FoxxMD and /u/AnotherUser use [\"FoxxMD\",\"AnotherUser\"]",
"examples": [
@@ -405,6 +419,241 @@
},
"type": "object"
},
"ModLogCriteria": {
"properties": {
"action": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
]
},
"activityType": {
"anyOf": [
{
"items": {
"enum": [
"comment",
"submission"
],
"type": "string"
},
"type": "array"
},
{
"enum": [
"comment",
"submission"
],
"type": "string"
}
]
},
"count": {
"default": ">= 1",
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
"examples": [
">= 1"
],
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"description": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
]
},
"details": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
]
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
"enum": [
"consecutive",
"current",
"total"
],
"examples": [
"current"
],
"type": "string"
},
"type": {
"anyOf": [
{
"items": {
"enum": [
"APPROVAL",
"INVITE",
"NOTE",
"REMOVAL",
"SPAM"
],
"type": "string"
},
"type": "array"
},
{
"enum": [
"APPROVAL",
"INVITE",
"NOTE",
"REMOVAL",
"SPAM"
],
"type": "string"
}
]
}
},
"type": "object"
},
"ModNoteCriteria": {
"properties": {
"activityType": {
"anyOf": [
{
"items": {
"enum": [
"comment",
"submission"
],
"type": "string"
},
"type": "array"
},
{
"enum": [
"comment",
"submission"
],
"type": "string"
}
]
},
"count": {
"default": ">= 1",
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
"examples": [
">= 1"
],
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"note": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
]
},
"noteType": {
"anyOf": [
{
"items": {
"enum": [
"ABUSE_WARNING",
"BAN",
"BOT_BAN",
"HELPFUL_USER",
"PERMA_BAN",
"SOLID_CONTRIBUTOR",
"SPAM_WARNING",
"SPAM_WATCH"
],
"type": "string"
},
"type": "array"
},
{
"enum": [
"ABUSE_WARNING",
"BAN",
"BOT_BAN",
"HELPFUL_USER",
"PERMA_BAN",
"SOLID_CONTRIBUTOR",
"SPAM_WARNING",
"SPAM_WATCH"
],
"type": "string"
}
]
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
"enum": [
"consecutive",
"current",
"total"
],
"examples": [
"current"
],
"type": "string"
},
"type": {
"anyOf": [
{
"items": {
"enum": [
"APPROVAL",
"INVITE",
"NOTE",
"REMOVAL",
"SPAM"
],
"type": "string"
},
"type": "array"
},
{
"enum": [
"APPROVAL",
"INVITE",
"NOTE",
"REMOVAL",
"SPAM"
],
"type": "string"
}
]
}
},
"type": "object"
},
"ModeratorNameCriteria": {
"properties": {
"behavior": {
@@ -660,16 +909,16 @@
"properties": {
"count": {
"default": ">= 1",
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [ascending|descending]`",
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
"examples": [
">= 1"
],
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<extra>asc.*|desc.*)*$",
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"search": {
"default": "current",
"description": "How to test the notes for this Author:\n\n### current\n\nOnly the most recent note is checked for `type`\n\n### total\n\nThe `count` comparison of `type` must be found within all notes\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n\n### consecutive\n\nThe `count` **number** of `type` notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
"enum": [
"consecutive",
"current",
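To connect the ModNoteCriteria and ModLogCriteria definitions above to their usage, here is a hedged sketch of an author criteria object using the new modActions property (field values are illustrative):

```typescript
const flaggedAuthor = {
    modActions: [
        {
            // ModNoteCriteria: most recent Mod Note is a SPAM_WATCH
            type: 'NOTE',
            noteType: 'SPAM_WATCH',
            search: 'current'
        },
        {
            // ModLogCriteria: more than 2 removals on comments within the last week
            type: 'REMOVAL',
            activityType: 'comment',
            search: 'total',
            count: '> 2 in 1 week'
        }
    ]
};
```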
@@ -651,6 +651,19 @@
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"modActions": {
"items": {
"anyOf": [
{
"$ref": "#/definitions/ModNoteCriteria"
},
{
"$ref": "#/definitions/ModLogCriteria"
}
]
},
"type": "array"
},
"name": {
"description": "A list of reddit usernames (case-insensitive) to match against. Do not include the \"u/\" prefix\n\n EX to match against /u/FoxxMD and /u/AnotherUser use [\"FoxxMD\",\"AnotherUser\"]",
"examples": [
@@ -967,6 +980,17 @@
"boolean"
]
},
"modNotesTTL": {
"default": 60,
"description": "Amount of time, in seconds, Mod Notes should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"provider": {
"anyOf": [
{
@@ -1419,6 +1443,9 @@
{
"$ref": "#/definitions/ContributorActionJson"
},
{
"$ref": "#/definitions/ModNoteActionJson"
},
{
"type": "string"
}
@@ -1581,6 +1608,9 @@
{
"$ref": "#/definitions/RepostRuleJSONConfig"
},
{
"$ref": "#/definitions/SentimentRuleJSONConfig"
},
{
"$ref": "#/definitions/RuleSetJson"
},
@@ -2914,6 +2944,60 @@
},
"type": "object"
},
"HistoricalSentimentConfig": {
"description": "Test the Sentiment of Activities from the Author history\n\nIf this is defined then the `totalMatching` threshold must pass for the Rule to trigger\n\nIf `sentiment` is defined here it overrides the top-level `sentiment` value",
"properties": {
"mustMatchCurrent": {
"default": false,
"description": "When `true` the original Activity being checked MUST match desired sentiment before the Rule considers any history",
"type": "boolean"
},
"sentiment": {
"description": "Test the calculated VADER sentiment (compound) score for an Activity using this comparison. Can be either a numerical or natural language\n\nSentiment values range from extremely negative to extremely positive in a numerical range of -1 to +1:\n\n* -0.6 => extremely negative\n* -0.3 => very negative\n* -0.1 => negative\n* 0 => neutral\n* 0.1 => positive\n* 0.3 => very positive\n* 0.6 => extremely positive\n\nThe below examples are all equivocal. You can use either set of values as the value for `sentiment` (numerical comparisons or natural langauge)\n\n* `>= 0.1` = `is positive`\n* `<= 0.3` = `is very negative`\n* `< 0.1` = `is not positive`\n* `> -0.3` = `is not very negative`\n\nSpecial case:\n\n* `is neutral` equates to `> -0.1 and < 0.1`\n* `is not neutral` equates to `< -0.1 or > 0.1`\n\nContextMod uses a normalized, weighted average from these sentiment tools:\n\n* NLP.js (english, french, german, and spanish) https://github.com/axa-group/nlp.js/blob/master/docs/v3/sentiment-analysis.md\n* (english only) vaderSentiment-js https://github.com/vaderSentiment/vaderSentiment-js/\n* (english only) wink-sentiment https://github.com/winkjs/wink-sentiment\n\nMore about the sentiment algorithms used:\n* VADER https://github.com/cjhutto/vaderSentiment\n* AFINN http://corpustext.com/reference/sentiment_afinn.html\n* Senticon https://ieeexplore.ieee.org/document/8721408\n* Pattern https://github.com/clips/pattern\n* wink https://github.com/winkjs/wink-sentiment",
"examples": [
"is negative",
"> 0.2"
],
"pattern": "((>|>=|<|<=)\\s*(-?\\d?\\.?\\d+))|((not)?\\s*(very|extremely)?\\s*(positive|neutral|negative))",
"type": "string"
},
"totalMatching": {
"default": "> 0",
"description": "A string containing a comparison operator and a value to compare Activities from history that pass the given `sentiment` comparison\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 12` => greater than 12 activities passed given `sentiment` comparison\n* EX `<= 10%` => less than 10% of all Activities from history passed given `sentiment` comparison",
"examples": [
"> 0",
"> 10%"
],
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"window": {
"anyOf": [
{
"$ref": "#/definitions/DurationObject"
},
{
"$ref": "#/definitions/FullActivityWindowConfig"
},
{
"type": [
"string",
"number"
]
}
],
"description": "A value to define the range of Activities to retrieve.\n\nAcceptable values:\n\n**`ActivityWindowCriteria` object**\n\nAllows specify multiple range properties and more specific behavior\n\n**A `number` of Activities to retrieve**\n\n* EX `100` => 100 Activities\n\n*****\n\nAny of the below values that specify the amount of time to subtract from `NOW` to create a time range IE `NOW <---> [duration] ago`\n\nAcceptable values:\n\n**A `string` consisting of a value and a [Day.js](https://day.js.org/docs/en/durations/creating#list-of-all-available-units) time UNIT**\n\n* EX `9 days` => Range is `NOW <---> 9 days ago`\n\n**A [Day.js](https://day.js.org/docs/en/durations/creating) `object`**\n\n* EX `{\"days\": 90, \"minutes\": 15}` => Range is `NOW <---> 90 days and 15 minutes ago`\n\n**An [ISO 8601 duration](https://en.wikipedia.org/wiki/ISO_8601#Durations) `string`**\n\n* EX `PT15M` => 15 minutes => Range is `NOW <----> 15 minutes ago`",
"examples": [
"90 days"
]
}
},
"required": [
"totalMatching",
"window"
],
"type": "object"
},
"HistoryCriteria": {
"description": "Criteria will only trigger if ALL present thresholds (comment, submission, total) are met",
"properties": {
@@ -3424,6 +3508,366 @@
],
"type": "object"
},
"ModLogCriteria": {
"properties": {
"action": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
]
},
"activityType": {
"anyOf": [
{
"items": {
"enum": [
"comment",
"submission"
],
"type": "string"
},
"type": "array"
},
{
"enum": [
"comment",
"submission"
],
"type": "string"
}
]
},
"count": {
"default": ">= 1",
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
"examples": [
">= 1"
],
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
"type": "string"
},
"description": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
]
},
"details": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
]
},
"search": {
"default": "current",
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
"enum": [
"consecutive",
"current",
"total"
],
"examples": [
"current"
],
"type": "string"
},
"type": {
"anyOf": [
{
"items": {
"enum": [
"APPROVAL",
"INVITE",
"NOTE",
"REMOVAL",
"SPAM"
],
"type": "string"
},
"type": "array"
},
{
"enum": [
"APPROVAL",
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModNoteActionJson": {
|
||||
"description": "Add a Toolbox User Note to the Author of this Activity",
|
||||
"properties": {
|
||||
"allowDuplicate": {
|
||||
"default": false,
|
||||
"description": "Add Note even if a Note already exists for this Activity",
|
||||
"examples": [
|
||||
false
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
"authorIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/AuthorCriteria"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<AuthorCriteria>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<AuthorCriteria>"
|
||||
}
|
||||
],
|
||||
"description": "If present then these Author criteria are checked before running the Check. If criteria fails then the Check will fail."
|
||||
},
|
||||
"content": {
|
||||
"description": "The Content to submit for this Action. Content is interpreted as reddit-flavored Markdown.\n\nIf value starts with `wiki:` then the proceeding value will be used to get a wiki page from the current subreddit\n\n * EX `wiki:botconfig/mybot` tries to get `https://reddit.com/r/currentSubreddit/wiki/botconfig/mybot`\n\nIf the value starts with `wiki:` and ends with `|someValue` then `someValue` will be used as the base subreddit for the wiki page\n\n* EX `wiki:replytemplates/test|ContextModBot` tries to get `https://reddit.com/r/ContextModBot/wiki/replytemplates/test`\n\nIf the value starts with `url:` then the value is fetched as an external url and expects raw text returned\n\n* EX `url:https://pastebin.com/raw/38qfL7mL` tries to get the text response of `https://pastebin.com/raw/38qfL7mL`\n\nIf none of the above is used the value is treated as the raw context\n\n * EX `this is **bold** markdown text` => \"this is **bold** markdown text\"\n\nAll Content is rendered using [mustache](https://github.com/janl/mustache.js/#templates) to enable [Action Templating](https://github.com/FoxxMD/context-mod#action-templating).\n\nThe following properties are always available in the template (view individual Rules to see rule-specific template data):\n```\nitem.kind => The type of Activity that was checked (comment/submission)\nitem.author => The name of the Author of the Activity EX FoxxMD\nitem.permalink => A permalink URL to the Activity EX https://reddit.com/r/yourSub/comments/o1h0i0/title_name/1v3b7x\nitem.url => If the Activity is Link Sumbission then the external URL\nitem.title => If the Activity is a Submission then the title of that Submission\nrules => An object containing RuleResults of all the rules run for this check. See Action Templating for more details on naming\n```",
|
||||
"examples": [
|
||||
"This is the content of a comment/report/usernote",
|
||||
"this is **bold** markdown text",
|
||||
"wiki:botconfig/acomment"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"dryRun": {
|
||||
"default": false,
|
||||
"description": "If `true` the Action will not make the API request to Reddit to perform its action.",
|
||||
"examples": [
|
||||
false,
|
||||
true
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
"enable": {
|
||||
"default": true,
|
||||
"description": "If set to `false` the Action will not be run",
|
||||
"examples": [
|
||||
true
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
"itemIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubmissionState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/CommentState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<TypedActivityState>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<TypedActivityState>"
|
||||
}
|
||||
],
|
||||
"description": "A list of criteria to test the state of the `Activity` against before running the check.\n\nIf any set of criteria passes the Check will be run. If the criteria fails then the Check will fail.\n\n* @examples [[{\"over_18\": true, \"removed': false}]]"
|
||||
},
|
||||
"kind": {
|
||||
"description": "The type of action that will be performed",
|
||||
"enum": [
|
||||
"modnote"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"name": {
|
||||
"description": "An optional, but highly recommended, friendly name for this Action. If not present will default to `kind`.\n\nCan only contain letters, numbers, underscore, spaces, and dashes",
|
||||
"examples": [
|
||||
"myDescriptiveAction"
|
||||
],
|
||||
"pattern": "^[a-zA-Z]([\\w -]*[\\w])?$",
|
||||
"type": "string"
|
||||
},
|
||||
"referenceActivity": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"type": {
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"kind"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"ModNoteCriteria": {
|
||||
"properties": {
|
||||
"activityType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"note": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"noteType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
"total"
|
||||
],
|
||||
"examples": [
|
||||
"current"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModeratorNameCriteria": {
|
||||
"properties": {
|
||||
"behavior": {
|
||||
@@ -4864,6 +5308,9 @@
|
||||
{
|
||||
"$ref": "#/definitions/RepostRuleJSONConfig"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/SentimentRuleJSONConfig"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
@@ -5107,6 +5554,149 @@
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"SentimentRuleJSONConfig": {
|
||||
"description": "Test the calculated VADER sentiment for an Activity to determine if the text context is negative, neutral, or positive in tone.\n\nMore about VADER Sentiment: https://github.com/cjhutto/vaderSentiment",
|
||||
"properties": {
|
||||
"authorIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/AuthorCriteria"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<AuthorCriteria>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<AuthorCriteria>"
|
||||
}
|
||||
],
|
||||
"description": "If present then these Author criteria are checked before running the Check. If criteria fails then the Check will fail."
|
||||
},
|
||||
"defaultLanguage": {
|
||||
"anyOf": [
|
||||
{
|
||||
"enum": [
|
||||
false
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
{
|
||||
"type": [
|
||||
"null",
|
||||
"string"
|
||||
]
|
||||
}
|
||||
],
|
||||
"default": "en",
|
||||
"description": "Make the analyzer assume a language if it cannot determine one itself.\n\nThis is very useful for the analyzer when it is parsing short pieces of content. For example, if you know your subreddit is majority english speakers this will make the analyzer return \"neutral\" sentiment instead of \"not detected language\".\n\nDefaults to 'en'"
|
||||
},
|
||||
"historical": {
|
||||
"$ref": "#/definitions/HistoricalSentimentConfig",
|
||||
"description": "Test the Sentiment of Activities from the Author history\n\nIf this is defined then the `totalMatching` threshold must pass for the Rule to trigger\n\nIf `sentiment` is defined here it overrides the top-level `sentiment` value"
|
||||
},
|
||||
"itemIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubmissionState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/CommentState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<TypedActivityState>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<TypedActivityState>"
|
||||
}
|
||||
],
|
||||
"description": "A list of criteria to test the state of the `Activity` against before running the check.\n\nIf any set of criteria passes the Check will be run. If the criteria fails then the Check will fail.\n\n* @examples [[{\"over_18\": true, \"removed': false}]]"
|
||||
},
|
||||
"kind": {
|
||||
"description": "The kind of rule to run",
|
||||
"enum": [
|
||||
"sentiment"
|
||||
],
|
||||
"examples": [
|
||||
"sentiment"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"languageHints": {
|
||||
"default": [
|
||||
"en",
|
||||
"es",
|
||||
"de",
|
||||
"fr"
|
||||
],
|
||||
"description": "Helps the analyzer coerce a low confidence language guess into a known-used languages in two ways:\n\nIf the analyzer's\n * *best* guess is NOT one of these\n * but it did guess one of these\n * and its guess is above requiredLanguageConfidence score then use the hinted language instead of best guess\n * OR text content is very short (4 words or less)\n * and the best guess was below the requiredLanguageConfidence score\n * and none of guesses was a hinted language then use the defaultLanguage\n\nDefaults to popular romance languages: ['en', 'es', 'de', 'fr']",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"name": {
|
||||
"description": "An optional, but highly recommended, friendly name for this rule. If not present will default to `kind`.\n\nCan only contain letters, numbers, underscore, spaces, and dashes\n\nname is used to reference Rule result data during Action content templating. See CommentAction or ReportAction for more details.",
|
||||
"examples": [
|
||||
"myNewRule"
|
||||
],
|
||||
"pattern": "^[a-zA-Z]([\\w -]*[\\w])?$",
|
||||
"type": "string"
|
||||
},
|
||||
"requiredLanguageConfidence": {
|
||||
"default": 0.9,
|
||||
"description": "Required confidence to use a guessed language as the best guess. Score from 0 to 1.\n\nDefaults to 0.9",
|
||||
"type": "number"
|
||||
},
|
||||
"sentiment": {
|
||||
"description": "Test the calculated VADER sentiment (compound) score for an Activity using this comparison. Can be either a numerical or natural language\n\nSentiment values range from extremely negative to extremely positive in a numerical range of -1 to +1:\n\n* -0.6 => extremely negative\n* -0.3 => very negative\n* -0.1 => negative\n* 0 => neutral\n* 0.1 => positive\n* 0.3 => very positive\n* 0.6 => extremely positive\n\nThe below examples are all equivocal. You can use either set of values as the value for `sentiment` (numerical comparisons or natural langauge)\n\n* `>= 0.1` = `is positive`\n* `<= 0.3` = `is very negative`\n* `< 0.1` = `is not positive`\n* `> -0.3` = `is not very negative`\n\nSpecial case:\n\n* `is neutral` equates to `> -0.1 and < 0.1`\n* `is not neutral` equates to `< -0.1 or > 0.1`\n\nContextMod uses a normalized, weighted average from these sentiment tools:\n\n* NLP.js (english, french, german, and spanish) https://github.com/axa-group/nlp.js/blob/master/docs/v3/sentiment-analysis.md\n* (english only) vaderSentiment-js https://github.com/vaderSentiment/vaderSentiment-js/\n* (english only) wink-sentiment https://github.com/winkjs/wink-sentiment\n\nMore about the sentiment algorithms used:\n* VADER https://github.com/cjhutto/vaderSentiment\n* AFINN http://corpustext.com/reference/sentiment_afinn.html\n* Senticon https://ieeexplore.ieee.org/document/8721408\n* Pattern https://github.com/clips/pattern\n* wink https://github.com/winkjs/wink-sentiment",
|
||||
"examples": [
|
||||
"is negative",
|
||||
"> 0.2"
|
||||
],
|
||||
"pattern": "((>|>=|<|<=)\\s*(-?\\d?\\.?\\d+))|((not)?\\s*(very|extremely)?\\s*(positive|neutral|negative))",
|
||||
"type": "string"
|
||||
},
|
||||
"testOn": {
|
||||
"default": [
|
||||
"title",
|
||||
"body"
|
||||
],
|
||||
"description": "Which content from an Activity to test for `sentiment` against\n\nOnly used if the Activity being tested is a Submission -- Comments are only tested against their body\n\nIf more than one type of content is specified then all text is tested together as one string",
|
||||
"items": {
|
||||
"enum": [
|
||||
"body",
|
||||
"title"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"kind",
|
||||
"sentiment"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"SubmissionCheckJson": {
|
||||
"properties": {
|
||||
"actions": {
|
||||
@@ -5164,6 +5754,9 @@
|
||||
{
|
||||
"$ref": "#/definitions/ContributorActionJson"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/ModNoteActionJson"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
@@ -5326,6 +5919,9 @@
|
||||
{
|
||||
"$ref": "#/definitions/RepostRuleJSONConfig"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/SentimentRuleJSONConfig"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/RuleSetJson"
|
||||
},
|
||||
@@ -5829,16 +6425,16 @@
|
||||
"properties": {
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [ascending|descending]`",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the notes for this Author:\n\n### current\n\nOnly the most recent note is checked for `type`\n\n### total\n\nThe `count` comparison of `type` must be found within all notes\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n\n### consecutive\n\nThe `count` **number** of `type` notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
|
||||
@@ -119,6 +119,19 @@
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"modActions": {
"items": {
"anyOf": [
{"$ref": "#/definitions/ModNoteCriteria"},
{"$ref": "#/definitions/ModLogCriteria"}
]
},
"type": "array"
},
"name": {
"description": "A list of reddit usernames (case-insensitive) to match against. Do not include the \"u/\" prefix\n\n EX to match against /u/FoxxMD and /u/AnotherUser use [\"FoxxMD\",\"AnotherUser\"]",
"examples": [
@@ -965,6 +978,241 @@
},
"type": "object"
},
"ModLogCriteria": {
|
||||
"properties": {
|
||||
"action": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"activityType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"description": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"details": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
"total"
|
||||
],
|
||||
"examples": [
|
||||
"current"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModNoteCriteria": {
|
||||
"properties": {
|
||||
"activityType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"note": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"noteType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
"total"
|
||||
],
|
||||
"examples": [
|
||||
"current"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModeratorNameCriteria": {
|
||||
"properties": {
|
||||
"behavior": {
|
||||
@@ -1226,6 +1474,17 @@
|
||||
"boolean"
|
||||
]
|
||||
},
|
||||
"modNotesTTL": {
|
||||
"default": 60,
|
||||
"description": "Amount of time, in seconds, Mod Notes should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
|
||||
"examples": [
|
||||
60
|
||||
],
|
||||
"type": [
|
||||
"number",
|
||||
"boolean"
|
||||
]
|
||||
},
|
||||
"provider": {
|
||||
"anyOf": [
|
||||
{
|
||||
@@ -22,6 +22,9 @@
{"$ref": "#/definitions/RepostRuleJSONConfig"},
{"$ref": "#/definitions/SentimentRuleJSONConfig"},
{"type": "string"}
@@ -574,6 +577,19 @@
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"modActions": {
"items": {
"anyOf": [
{"$ref": "#/definitions/ModNoteCriteria"},
{"$ref": "#/definitions/ModLogCriteria"}
]
},
"type": "array"
},
"name": {
"description": "A list of reddit usernames (case-insensitive) to match against. Do not include the \"u/\" prefix\n\n EX to match against /u/FoxxMD and /u/AnotherUser use [\"FoxxMD\",\"AnotherUser\"]",
"examples": [
@@ -1297,6 +1313,60 @@
},
"type": "object"
},
"HistoricalSentimentConfig": {
|
||||
"description": "Test the Sentiment of Activities from the Author history\n\nIf this is defined then the `totalMatching` threshold must pass for the Rule to trigger\n\nIf `sentiment` is defined here it overrides the top-level `sentiment` value",
|
||||
"properties": {
|
||||
"mustMatchCurrent": {
|
||||
"default": false,
|
||||
"description": "When `true` the original Activity being checked MUST match desired sentiment before the Rule considers any history",
|
||||
"type": "boolean"
|
||||
},
|
||||
"sentiment": {
|
||||
"description": "Test the calculated VADER sentiment (compound) score for an Activity using this comparison. Can be either a numerical or natural language\n\nSentiment values range from extremely negative to extremely positive in a numerical range of -1 to +1:\n\n* -0.6 => extremely negative\n* -0.3 => very negative\n* -0.1 => negative\n* 0 => neutral\n* 0.1 => positive\n* 0.3 => very positive\n* 0.6 => extremely positive\n\nThe below examples are all equivocal. You can use either set of values as the value for `sentiment` (numerical comparisons or natural langauge)\n\n* `>= 0.1` = `is positive`\n* `<= 0.3` = `is very negative`\n* `< 0.1` = `is not positive`\n* `> -0.3` = `is not very negative`\n\nSpecial case:\n\n* `is neutral` equates to `> -0.1 and < 0.1`\n* `is not neutral` equates to `< -0.1 or > 0.1`\n\nContextMod uses a normalized, weighted average from these sentiment tools:\n\n* NLP.js (english, french, german, and spanish) https://github.com/axa-group/nlp.js/blob/master/docs/v3/sentiment-analysis.md\n* (english only) vaderSentiment-js https://github.com/vaderSentiment/vaderSentiment-js/\n* (english only) wink-sentiment https://github.com/winkjs/wink-sentiment\n\nMore about the sentiment algorithms used:\n* VADER https://github.com/cjhutto/vaderSentiment\n* AFINN http://corpustext.com/reference/sentiment_afinn.html\n* Senticon https://ieeexplore.ieee.org/document/8721408\n* Pattern https://github.com/clips/pattern\n* wink https://github.com/winkjs/wink-sentiment",
|
||||
"examples": [
|
||||
"is negative",
|
||||
"> 0.2"
|
||||
],
|
||||
"pattern": "((>|>=|<|<=)\\s*(-?\\d?\\.?\\d+))|((not)?\\s*(very|extremely)?\\s*(positive|neutral|negative))",
|
||||
"type": "string"
|
||||
},
|
||||
"totalMatching": {
|
||||
"default": "> 0",
|
||||
"description": "A string containing a comparison operator and a value to compare Activities from history that pass the given `sentiment` comparison\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 12` => greater than 12 activities passed given `sentiment` comparison\n* EX `<= 10%` => less than 10% of all Activities from history passed given `sentiment` comparison",
|
||||
"examples": [
|
||||
"> 0",
|
||||
"> 10%"
|
||||
],
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"window": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/DurationObject"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FullActivityWindowConfig"
|
||||
},
|
||||
{
|
||||
"type": [
|
||||
"string",
|
||||
"number"
|
||||
]
|
||||
}
|
||||
],
|
||||
"description": "A value to define the range of Activities to retrieve.\n\nAcceptable values:\n\n**`ActivityWindowCriteria` object**\n\nAllows specify multiple range properties and more specific behavior\n\n**A `number` of Activities to retrieve**\n\n* EX `100` => 100 Activities\n\n*****\n\nAny of the below values that specify the amount of time to subtract from `NOW` to create a time range IE `NOW <---> [duration] ago`\n\nAcceptable values:\n\n**A `string` consisting of a value and a [Day.js](https://day.js.org/docs/en/durations/creating#list-of-all-available-units) time UNIT**\n\n* EX `9 days` => Range is `NOW <---> 9 days ago`\n\n**A [Day.js](https://day.js.org/docs/en/durations/creating) `object`**\n\n* EX `{\"days\": 90, \"minutes\": 15}` => Range is `NOW <---> 90 days and 15 minutes ago`\n\n**An [ISO 8601 duration](https://en.wikipedia.org/wiki/ISO_8601#Durations) `string`**\n\n* EX `PT15M` => 15 minutes => Range is `NOW <----> 15 minutes ago`",
|
||||
"examples": [
|
||||
"90 days"
|
||||
]
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"totalMatching",
|
||||
"window"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"HistoryCriteria": {
|
||||
"description": "Criteria will only trigger if ALL present thresholds (comment, submission, total) are met",
|
||||
"properties": {
|
||||
@@ -1585,6 +1655,241 @@
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModLogCriteria": {
|
||||
"properties": {
|
||||
"action": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"activityType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"description": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"details": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
"total"
|
||||
],
|
||||
"examples": [
|
||||
"current"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModNoteCriteria": {
|
||||
"properties": {
|
||||
"activityType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"note": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"noteType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
"total"
|
||||
],
|
||||
"examples": [
|
||||
"current"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModeratorNameCriteria": {
|
||||
"properties": {
|
||||
"behavior": {
|
||||
@@ -2721,6 +3026,149 @@
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"SentimentRuleJSONConfig": {
|
||||
"description": "Test the calculated VADER sentiment for an Activity to determine if the text context is negative, neutral, or positive in tone.\n\nMore about VADER Sentiment: https://github.com/cjhutto/vaderSentiment",
|
||||
"properties": {
|
||||
"authorIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/AuthorCriteria"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<AuthorCriteria>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<AuthorCriteria>"
|
||||
}
|
||||
],
|
||||
"description": "If present then these Author criteria are checked before running the Check. If criteria fails then the Check will fail."
|
||||
},
|
||||
"defaultLanguage": {
|
||||
"anyOf": [
|
||||
{
|
||||
"enum": [
|
||||
false
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
{
|
||||
"type": [
|
||||
"null",
|
||||
"string"
|
||||
]
|
||||
}
|
||||
],
|
||||
"default": "en",
|
||||
"description": "Make the analyzer assume a language if it cannot determine one itself.\n\nThis is very useful for the analyzer when it is parsing short pieces of content. For example, if you know your subreddit is majority english speakers this will make the analyzer return \"neutral\" sentiment instead of \"not detected language\".\n\nDefaults to 'en'"
|
||||
},
|
||||
"historical": {
|
||||
"$ref": "#/definitions/HistoricalSentimentConfig",
|
||||
"description": "Test the Sentiment of Activities from the Author history\n\nIf this is defined then the `totalMatching` threshold must pass for the Rule to trigger\n\nIf `sentiment` is defined here it overrides the top-level `sentiment` value"
|
||||
},
|
||||
"itemIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubmissionState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/CommentState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<TypedActivityState>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<TypedActivityState>"
|
||||
}
|
||||
],
|
||||
"description": "A list of criteria to test the state of the `Activity` against before running the check.\n\nIf any set of criteria passes the Check will be run. If the criteria fails then the Check will fail.\n\n* @examples [[{\"over_18\": true, \"removed': false}]]"
|
||||
},
|
||||
"kind": {
|
||||
"description": "The kind of rule to run",
|
||||
"enum": [
|
||||
"sentiment"
|
||||
],
|
||||
"examples": [
|
||||
"sentiment"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"languageHints": {
|
||||
"default": [
|
||||
"en",
|
||||
"es",
|
||||
"de",
|
||||
"fr"
|
||||
],
|
||||
"description": "Helps the analyzer coerce a low confidence language guess into a known-used languages in two ways:\n\nIf the analyzer's\n * *best* guess is NOT one of these\n * but it did guess one of these\n * and its guess is above requiredLanguageConfidence score then use the hinted language instead of best guess\n * OR text content is very short (4 words or less)\n * and the best guess was below the requiredLanguageConfidence score\n * and none of guesses was a hinted language then use the defaultLanguage\n\nDefaults to popular romance languages: ['en', 'es', 'de', 'fr']",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"name": {
|
||||
"description": "An optional, but highly recommended, friendly name for this rule. If not present will default to `kind`.\n\nCan only contain letters, numbers, underscore, spaces, and dashes\n\nname is used to reference Rule result data during Action content templating. See CommentAction or ReportAction for more details.",
|
||||
"examples": [
|
||||
"myNewRule"
|
||||
],
|
||||
"pattern": "^[a-zA-Z]([\\w -]*[\\w])?$",
|
||||
"type": "string"
|
||||
},
|
||||
"requiredLanguageConfidence": {
|
||||
"default": 0.9,
|
||||
"description": "Required confidence to use a guessed language as the best guess. Score from 0 to 1.\n\nDefaults to 0.9",
|
||||
"type": "number"
|
||||
},
|
||||
"sentiment": {
|
||||
"description": "Test the calculated VADER sentiment (compound) score for an Activity using this comparison. Can be either a numerical or natural language\n\nSentiment values range from extremely negative to extremely positive in a numerical range of -1 to +1:\n\n* -0.6 => extremely negative\n* -0.3 => very negative\n* -0.1 => negative\n* 0 => neutral\n* 0.1 => positive\n* 0.3 => very positive\n* 0.6 => extremely positive\n\nThe below examples are all equivocal. You can use either set of values as the value for `sentiment` (numerical comparisons or natural langauge)\n\n* `>= 0.1` = `is positive`\n* `<= 0.3` = `is very negative`\n* `< 0.1` = `is not positive`\n* `> -0.3` = `is not very negative`\n\nSpecial case:\n\n* `is neutral` equates to `> -0.1 and < 0.1`\n* `is not neutral` equates to `< -0.1 or > 0.1`\n\nContextMod uses a normalized, weighted average from these sentiment tools:\n\n* NLP.js (english, french, german, and spanish) https://github.com/axa-group/nlp.js/blob/master/docs/v3/sentiment-analysis.md\n* (english only) vaderSentiment-js https://github.com/vaderSentiment/vaderSentiment-js/\n* (english only) wink-sentiment https://github.com/winkjs/wink-sentiment\n\nMore about the sentiment algorithms used:\n* VADER https://github.com/cjhutto/vaderSentiment\n* AFINN http://corpustext.com/reference/sentiment_afinn.html\n* Senticon https://ieeexplore.ieee.org/document/8721408\n* Pattern https://github.com/clips/pattern\n* wink https://github.com/winkjs/wink-sentiment",
|
||||
"examples": [
|
||||
"is negative",
|
||||
"> 0.2"
|
||||
],
|
||||
"pattern": "((>|>=|<|<=)\\s*(-?\\d?\\.?\\d+))|((not)?\\s*(very|extremely)?\\s*(positive|neutral|negative))",
|
||||
"type": "string"
|
||||
},
|
||||
"testOn": {
|
||||
"default": [
|
||||
"title",
|
||||
"body"
|
||||
],
|
||||
"description": "Which content from an Activity to test for `sentiment` against\n\nOnly used if the Activity being tested is a Submission -- Comments are only tested against their body\n\nIf more than one type of content is specified then all text is tested together as one string",
|
||||
"items": {
|
||||
"enum": [
|
||||
"body",
|
||||
"title"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"kind",
|
||||
"sentiment"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"SubmissionState": {
|
||||
"description": "Different attributes a `Submission` can be in. Only include a property if you want to check it.",
|
||||
"examples": [
|
||||
@@ -2969,16 +3417,16 @@
|
||||
"properties": {
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [ascending|descending]`",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the notes for this Author:\n\n### current\n\nOnly the most recent note is checked for `type`\n\n### total\n\nThe `count` comparison of `type` must be found within all notes\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n\n### consecutive\n\nThe `count` **number** of `type` notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
|
||||
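For reference, a minimal sketch of how the new `in timeRange` clause of `count` might be used in a Toolbox UserNote criteria block. This assumes the author criteria property for Toolbox notes is `userNotes` (an assumption; check the project's filter documentation) and the note `type` value is hypothetical, since Toolbox note types are defined per subreddit:

```yaml
authorIs:
  - userNotes:
      - type: spamwarn          # hypothetical Toolbox note type
        search: total
        # more than 2 notes of this type created within the last 2 weeks
        count: '> 2 in 2 weeks'
```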
@@ -548,6 +548,19 @@
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"modActions": {
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/ModNoteCriteria"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/ModLogCriteria"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"name": {
|
||||
"description": "A list of reddit usernames (case-insensitive) to match against. Do not include the \"u/\" prefix\n\n EX to match against /u/FoxxMD and /u/AnotherUser use [\"FoxxMD\",\"AnotherUser\"]",
|
||||
"examples": [
|
||||
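The `modActions` property added in this hunk accepts an array whose items are either ModNoteCriteria or ModLogCriteria objects (both defined later in this schema). A hedged sketch of how it might appear inside an author filter, with illustrative values:

```yaml
authorIs:
  - modActions:
      # ModNoteCriteria: most recent mod note on the author is a SPAM_WARNING
      - type: NOTE
        noteType: SPAM_WARNING
        search: current
      # ModLogCriteria: at least one removal logged against the author's comments
      - type: REMOVAL
        activityType: comment
        search: total
        count: '>= 1'
```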
@@ -1271,6 +1284,60 @@
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"HistoricalSentimentConfig": {
|
||||
"description": "Test the Sentiment of Activities from the Author history\n\nIf this is defined then the `totalMatching` threshold must pass for the Rule to trigger\n\nIf `sentiment` is defined here it overrides the top-level `sentiment` value",
|
||||
"properties": {
|
||||
"mustMatchCurrent": {
|
||||
"default": false,
|
||||
"description": "When `true` the original Activity being checked MUST match desired sentiment before the Rule considers any history",
|
||||
"type": "boolean"
|
||||
},
|
||||
"sentiment": {
|
||||
"description": "Test the calculated VADER sentiment (compound) score for an Activity using this comparison. Can be either a numerical or natural language\n\nSentiment values range from extremely negative to extremely positive in a numerical range of -1 to +1:\n\n* -0.6 => extremely negative\n* -0.3 => very negative\n* -0.1 => negative\n* 0 => neutral\n* 0.1 => positive\n* 0.3 => very positive\n* 0.6 => extremely positive\n\nThe below examples are all equivocal. You can use either set of values as the value for `sentiment` (numerical comparisons or natural langauge)\n\n* `>= 0.1` = `is positive`\n* `<= 0.3` = `is very negative`\n* `< 0.1` = `is not positive`\n* `> -0.3` = `is not very negative`\n\nSpecial case:\n\n* `is neutral` equates to `> -0.1 and < 0.1`\n* `is not neutral` equates to `< -0.1 or > 0.1`\n\nContextMod uses a normalized, weighted average from these sentiment tools:\n\n* NLP.js (english, french, german, and spanish) https://github.com/axa-group/nlp.js/blob/master/docs/v3/sentiment-analysis.md\n* (english only) vaderSentiment-js https://github.com/vaderSentiment/vaderSentiment-js/\n* (english only) wink-sentiment https://github.com/winkjs/wink-sentiment\n\nMore about the sentiment algorithms used:\n* VADER https://github.com/cjhutto/vaderSentiment\n* AFINN http://corpustext.com/reference/sentiment_afinn.html\n* Senticon https://ieeexplore.ieee.org/document/8721408\n* Pattern https://github.com/clips/pattern\n* wink https://github.com/winkjs/wink-sentiment",
|
||||
"examples": [
|
||||
"is negative",
|
||||
"> 0.2"
|
||||
],
|
||||
"pattern": "((>|>=|<|<=)\\s*(-?\\d?\\.?\\d+))|((not)?\\s*(very|extremely)?\\s*(positive|neutral|negative))",
|
||||
"type": "string"
|
||||
},
|
||||
"totalMatching": {
|
||||
"default": "> 0",
|
||||
"description": "A string containing a comparison operator and a value to compare Activities from history that pass the given `sentiment` comparison\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 12` => greater than 12 activities passed given `sentiment` comparison\n* EX `<= 10%` => less than 10% of all Activities from history passed given `sentiment` comparison",
|
||||
"examples": [
|
||||
"> 0",
|
||||
"> 10%"
|
||||
],
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"window": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/DurationObject"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FullActivityWindowConfig"
|
||||
},
|
||||
{
|
||||
"type": [
|
||||
"string",
|
||||
"number"
|
||||
]
|
||||
}
|
||||
],
|
||||
"description": "A value to define the range of Activities to retrieve.\n\nAcceptable values:\n\n**`ActivityWindowCriteria` object**\n\nAllows specify multiple range properties and more specific behavior\n\n**A `number` of Activities to retrieve**\n\n* EX `100` => 100 Activities\n\n*****\n\nAny of the below values that specify the amount of time to subtract from `NOW` to create a time range IE `NOW <---> [duration] ago`\n\nAcceptable values:\n\n**A `string` consisting of a value and a [Day.js](https://day.js.org/docs/en/durations/creating#list-of-all-available-units) time UNIT**\n\n* EX `9 days` => Range is `NOW <---> 9 days ago`\n\n**A [Day.js](https://day.js.org/docs/en/durations/creating) `object`**\n\n* EX `{\"days\": 90, \"minutes\": 15}` => Range is `NOW <---> 90 days and 15 minutes ago`\n\n**An [ISO 8601 duration](https://en.wikipedia.org/wiki/ISO_8601#Durations) `string`**\n\n* EX `PT15M` => 15 minutes => Range is `NOW <----> 15 minutes ago`",
|
||||
"examples": [
|
||||
"90 days"
|
||||
]
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"totalMatching",
|
||||
"window"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
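Reading the HistoricalSentimentConfig definition above as config, a minimal sketch of the `historical` block inside a sentiment rule (all values are illustrative, not recommendations):

```yaml
historical:
  mustMatchCurrent: true         # the Activity being checked must match sentiment before history is considered
  sentiment: 'is very negative'  # overrides the rule's top-level sentiment for history
  totalMatching: '> 20%'         # more than 20% of retrieved Activities must match
  window: '90 days'              # range of Author history to retrieve
```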
"HistoryCriteria": {
|
||||
"description": "Criteria will only trigger if ALL present thresholds (comment, submission, total) are met",
|
||||
"properties": {
|
||||
@@ -1559,6 +1626,241 @@
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ModLogCriteria": {
|
||||
"properties": {
|
||||
"action": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"activityType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"description": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"details": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
"total"
|
||||
],
|
||||
"examples": [
|
||||
"current"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
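A short, hedged example of ModLogCriteria as it might be written in config. The values are illustrative; `type` uses the enum above, while `action`, `details`, and `description` would match free-form mod log text:

```yaml
# at least 2 removals recorded against the author's comments within the last month
- type: REMOVAL
  activityType: comment
  search: total
  count: '>= 2 in 1 month'
```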
"ModNoteCriteria": {
|
||||
"properties": {
|
||||
"activityType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"comment",
|
||||
"submission"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"note": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"noteType": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"ABUSE_WARNING",
|
||||
"BAN",
|
||||
"BOT_BAN",
|
||||
"HELPFUL_USER",
|
||||
"PERMA_BAN",
|
||||
"SOLID_CONTRIBUTOR",
|
||||
"SPAM_WARNING",
|
||||
"SPAM_WATCH"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
"total"
|
||||
],
|
||||
"examples": [
|
||||
"current"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"APPROVAL",
|
||||
"INVITE",
|
||||
"NOTE",
|
||||
"REMOVAL",
|
||||
"SPAM"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
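A corresponding hedged sketch of ModNoteCriteria. The schema shown here does not spell out how `note` text is matched, so only the label and search mode are used:

```yaml
# most recent mod note on the author carries the SPAM_WATCH label
- type: NOTE
  noteType: SPAM_WATCH
  search: current
```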
"ModeratorNameCriteria": {
|
||||
"properties": {
|
||||
"behavior": {
|
||||
@@ -2695,6 +2997,149 @@
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"SentimentRuleJSONConfig": {
|
||||
"description": "Test the calculated VADER sentiment for an Activity to determine if the text context is negative, neutral, or positive in tone.\n\nMore about VADER Sentiment: https://github.com/cjhutto/vaderSentiment",
|
||||
"properties": {
|
||||
"authorIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/AuthorCriteria"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<AuthorCriteria>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<AuthorCriteria>"
|
||||
}
|
||||
],
|
||||
"description": "If present then these Author criteria are checked before running the Check. If criteria fails then the Check will fail."
|
||||
},
|
||||
"defaultLanguage": {
|
||||
"anyOf": [
|
||||
{
|
||||
"enum": [
|
||||
false
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
{
|
||||
"type": [
|
||||
"null",
|
||||
"string"
|
||||
]
|
||||
}
|
||||
],
|
||||
"default": "en",
|
||||
"description": "Make the analyzer assume a language if it cannot determine one itself.\n\nThis is very useful for the analyzer when it is parsing short pieces of content. For example, if you know your subreddit is majority english speakers this will make the analyzer return \"neutral\" sentiment instead of \"not detected language\".\n\nDefaults to 'en'"
|
||||
},
|
||||
"historical": {
|
||||
"$ref": "#/definitions/HistoricalSentimentConfig",
|
||||
"description": "Test the Sentiment of Activities from the Author history\n\nIf this is defined then the `totalMatching` threshold must pass for the Rule to trigger\n\nIf `sentiment` is defined here it overrides the top-level `sentiment` value"
|
||||
},
|
||||
"itemIs": {
|
||||
"anyOf": [
|
||||
{
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubmissionState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/CommentState"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/NamedCriteria<TypedActivityState>"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/FilterOptionsJson<TypedActivityState>"
|
||||
}
|
||||
],
|
||||
"description": "A list of criteria to test the state of the `Activity` against before running the check.\n\nIf any set of criteria passes the Check will be run. If the criteria fails then the Check will fail.\n\n* @examples [[{\"over_18\": true, \"removed': false}]]"
|
||||
},
|
||||
"kind": {
|
||||
"description": "The kind of rule to run",
|
||||
"enum": [
|
||||
"sentiment"
|
||||
],
|
||||
"examples": [
|
||||
"sentiment"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"languageHints": {
|
||||
"default": [
|
||||
"en",
|
||||
"es",
|
||||
"de",
|
||||
"fr"
|
||||
],
|
||||
"description": "Helps the analyzer coerce a low confidence language guess into a known-used languages in two ways:\n\nIf the analyzer's\n * *best* guess is NOT one of these\n * but it did guess one of these\n * and its guess is above requiredLanguageConfidence score then use the hinted language instead of best guess\n * OR text content is very short (4 words or less)\n * and the best guess was below the requiredLanguageConfidence score\n * and none of guesses was a hinted language then use the defaultLanguage\n\nDefaults to popular romance languages: ['en', 'es', 'de', 'fr']",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"name": {
|
||||
"description": "An optional, but highly recommended, friendly name for this rule. If not present will default to `kind`.\n\nCan only contain letters, numbers, underscore, spaces, and dashes\n\nname is used to reference Rule result data during Action content templating. See CommentAction or ReportAction for more details.",
|
||||
"examples": [
|
||||
"myNewRule"
|
||||
],
|
||||
"pattern": "^[a-zA-Z]([\\w -]*[\\w])?$",
|
||||
"type": "string"
|
||||
},
|
||||
"requiredLanguageConfidence": {
|
||||
"default": 0.9,
|
||||
"description": "Required confidence to use a guessed language as the best guess. Score from 0 to 1.\n\nDefaults to 0.9",
|
||||
"type": "number"
|
||||
},
|
||||
"sentiment": {
|
||||
"description": "Test the calculated VADER sentiment (compound) score for an Activity using this comparison. Can be either a numerical or natural language\n\nSentiment values range from extremely negative to extremely positive in a numerical range of -1 to +1:\n\n* -0.6 => extremely negative\n* -0.3 => very negative\n* -0.1 => negative\n* 0 => neutral\n* 0.1 => positive\n* 0.3 => very positive\n* 0.6 => extremely positive\n\nThe below examples are all equivocal. You can use either set of values as the value for `sentiment` (numerical comparisons or natural langauge)\n\n* `>= 0.1` = `is positive`\n* `<= 0.3` = `is very negative`\n* `< 0.1` = `is not positive`\n* `> -0.3` = `is not very negative`\n\nSpecial case:\n\n* `is neutral` equates to `> -0.1 and < 0.1`\n* `is not neutral` equates to `< -0.1 or > 0.1`\n\nContextMod uses a normalized, weighted average from these sentiment tools:\n\n* NLP.js (english, french, german, and spanish) https://github.com/axa-group/nlp.js/blob/master/docs/v3/sentiment-analysis.md\n* (english only) vaderSentiment-js https://github.com/vaderSentiment/vaderSentiment-js/\n* (english only) wink-sentiment https://github.com/winkjs/wink-sentiment\n\nMore about the sentiment algorithms used:\n* VADER https://github.com/cjhutto/vaderSentiment\n* AFINN http://corpustext.com/reference/sentiment_afinn.html\n* Senticon https://ieeexplore.ieee.org/document/8721408\n* Pattern https://github.com/clips/pattern\n* wink https://github.com/winkjs/wink-sentiment",
|
||||
"examples": [
|
||||
"is negative",
|
||||
"> 0.2"
|
||||
],
|
||||
"pattern": "((>|>=|<|<=)\\s*(-?\\d?\\.?\\d+))|((not)?\\s*(very|extremely)?\\s*(positive|neutral|negative))",
|
||||
"type": "string"
|
||||
},
|
||||
"testOn": {
|
||||
"default": [
|
||||
"title",
|
||||
"body"
|
||||
],
|
||||
"description": "Which content from an Activity to test for `sentiment` against\n\nOnly used if the Activity being tested is a Submission -- Comments are only tested against their body\n\nIf more than one type of content is specified then all text is tested together as one string",
|
||||
"items": {
|
||||
"enum": [
|
||||
"body",
|
||||
"title"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"kind",
|
||||
"sentiment"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
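Pulling the SentimentRuleJSONConfig properties together, a minimal sketch of a sentiment rule as it might appear in a subreddit config (thresholds and the rule name are illustrative, not recommendations):

```yaml
rules:
  - kind: sentiment
    name: hostileTone
    sentiment: 'is very negative'
    testOn:                        # only used for Submissions; Comments always test the body
      - title
      - body
    defaultLanguage: en
    requiredLanguageConfidence: 0.9
    historical:
      totalMatching: '> 10%'
      window: '30 days'
```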
"SubmissionState": {
|
||||
"description": "Different attributes a `Submission` can be in. Only include a property if you want to check it.",
|
||||
"examples": [
|
||||
@@ -2943,16 +3388,16 @@
|
||||
"properties": {
|
||||
"count": {
|
||||
"default": ">= 1",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [ascending|descending]`",
|
||||
"description": "Number of occurrences of this type. Ignored if `search` is `current`\n\nA string containing a comparison operator and/or a value to compare number of occurrences against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [in timeRange] [ascending|descending]`\n\nIf `timeRange` is given then only notes/mod actions that occur between timeRange and NOW will be returned. `timeRange` is ignored if search is `current`",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<value>\\d+)\\s*(?<percent>%?)\\s*(?<duration>in\\s+\\d+\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?))?\\s*(?<extra>asc.*|desc.*)*$",
|
||||
"type": "string"
|
||||
},
|
||||
"search": {
|
||||
"default": "current",
|
||||
"description": "How to test the notes for this Author:\n\n### current\n\nOnly the most recent note is checked for `type`\n\n### total\n\nThe `count` comparison of `type` must be found within all notes\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n\n### consecutive\n\nThe `count` **number** of `type` notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"description": "How to test the Toolbox Notes or Mod Actions for this Author:\n\n### current\n\nOnly the most recent note is checked for criteria\n\n### total\n\n`count` comparison of mod actions/notes must be found within all history\n\n* EX `count: > 3` => Must have more than 3 notes of `type`, total\n* EX `count: <= 25%` => Must have 25% or less of notes of `type`, total\n* EX: `count: > 3 in 1 week` => Must have more than 3 notes within the last week\n\n### consecutive\n\nThe `count` **number** of mod actions/notes must be found in a row.\n\nYou may also specify the time-based order in which to search the notes by specifying `ascending (asc)` or `descending (desc)` in the `count` value. Default is `descending`\n\n* EX `count: >= 3` => Must have 3 or more notes of `type` consecutively, in descending order\n* EX `count: < 2` => Must have less than 2 notes of `type` consecutively, in descending order\n* EX `count: > 4 asc` => Must have greater than 4 notes of `type` consecutively, in ascending order",
|
||||
"enum": [
|
||||
"consecutive",
|
||||
"current",
|
||||
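As a quick illustration of the `consecutive` search mode with the ordering hint described above (criteria shape abbreviated):

```yaml
search: consecutive
count: '>= 3 asc'   # three or more matching notes/actions in a row, evaluated oldest first
```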
@@ -3016,6 +3461,9 @@
|
||||
{
|
||||
"$ref": "#/definitions/RepostRuleJSONConfig"
|
||||
},
|
||||
{
|
||||
"$ref": "#/definitions/SentimentRuleJSONConfig"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
import Snoowrap, {Comment, Subreddit, WikiPage} from "snoowrap";
|
||||
import Snoowrap, {WikiPage} from "snoowrap";
|
||||
import {Logger} from "winston";
|
||||
import {SubmissionCheck} from "../Check/SubmissionCheck";
|
||||
import {CommentCheck} from "../Check/CommentCheck";
|
||||
@@ -42,7 +42,7 @@ import {
|
||||
SYSTEM,
|
||||
USER, RuleResult, DatabaseStatisticsOperatorConfig
|
||||
} from "../Common/interfaces";
|
||||
import Submission from "snoowrap/dist/objects/Submission";
|
||||
import {Submission, Comment, Subreddit} from 'snoowrap/dist/objects';
|
||||
import {activityIsRemoved, ItemContent, itemContentPeek} from "../Utils/SnoowrapUtils";
|
||||
import LoggedError from "../Utils/LoggedError";
|
||||
import {
|
||||
@@ -229,6 +229,8 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
rulesUniqueRollingAvg: number = 0;
|
||||
actionedEvents: ActionedEvent[] = [];
|
||||
|
||||
delayedQueueInterval: any;
|
||||
|
||||
processEmitter: EventEmitter = new EventEmitter();
|
||||
|
||||
activityRepo!: Repository<Activity>;
|
||||
@@ -272,12 +274,11 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
return {
|
||||
id: x.id,
|
||||
activityId: x.activity.name,
|
||||
permalink: x.activity.permalink,
|
||||
permalink: x.activity.permalink, // TODO construct this without having to fetch activity
|
||||
submissionId: asComment(x.activity) ? x.activity.link_id : undefined,
|
||||
author: x.author,
|
||||
queuedAt: x.queuedAt.unix(),
|
||||
durationMilli: x.delay.asSeconds(),
|
||||
duration: x.delay.humanize(),
|
||||
duration: x.delay.asSeconds(),
|
||||
source: `${x.action}${x.identifier !== undefined ? ` (${x.identifier})` : ''}`,
|
||||
subreddit: this.subreddit.display_name_prefixed
|
||||
}
|
||||
@@ -399,6 +400,45 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
}
|
||||
})(this), 10000);
|
||||
|
||||
this.delayedQueueInterval = setInterval((function(self) {
|
||||
return function() {
|
||||
if(!self.queue.paused && self.resources !== undefined) {
|
||||
let index = 0;
|
||||
let anyQueued = false;
|
||||
for(const ar of self.resources.delayedItems) {
|
||||
if(ar.queuedAt.add(ar.delay).isSameOrBefore(dayjs())) {
|
||||
anyQueued = true;
|
||||
self.logger.info(`Activity ${ar.activity.name} dispatched at ${ar.queuedAt.format('HH:mm:ss z')} (delayed for ${ar.delay.humanize()}) is now being queued.`, {leaf: 'Delayed Activities'});
|
||||
self.firehose.push({
|
||||
activity: ar.activity,
|
||||
options: {
|
||||
refresh: true,
|
||||
// @ts-ignore
|
||||
source: ar.identifier === undefined ? ar.type : `${ar.type}:${ar.identifier}`,
|
||||
initialGoto: ar.goto,
|
||||
activitySource: {
|
||||
id: ar.id,
|
||||
queuedAt: ar.queuedAt,
|
||||
delay: ar.delay,
|
||||
action: ar.action,
|
||||
goto: ar.goto,
|
||||
identifier: ar.identifier,
|
||||
type: ar.type
|
||||
},
|
||||
dryRun: ar.dryRun,
|
||||
}
|
||||
});
|
||||
self.resources.removeDelayedActivity(ar.id);
|
||||
}
|
||||
index++;
|
||||
}
|
||||
if(!anyQueued) {
|
||||
self.logger.debug('No Activities ready to queue', {leaf: 'Delayed Activities'});
|
||||
}
|
||||
}
|
||||
}
|
||||
})(this), 5000); // every 5 seconds
|
||||
|
||||
this.processEmitter.on('notify', (payload: NotificationEventPayload) => {
|
||||
this.notificationManager.handle(payload.type, payload.title, payload.body, payload.causedBy, payload.logLevel);
|
||||
});
|
||||
@@ -449,7 +489,7 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
//
|
||||
// if we insert the same item again because it is currently being processed AND THEN we get the item AGAIN we only want to update the newest meta
|
||||
// so search the array backwards to get the newest only
|
||||
const queuedItemIndex = findLastIndex(this.queuedItemsMeta, x => x.id === task.activity.id);
|
||||
const queuedItemIndex = findLastIndex(this.queuedItemsMeta, x => x.id === task.activity.name);
|
||||
if(queuedItemIndex !== -1) {
|
||||
const itemMeta = this.queuedItemsMeta[queuedItemIndex];
|
||||
let msg = `Item ${itemMeta.id} is already ${itemMeta.state}.`;
|
||||
@@ -458,11 +498,11 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
this.queuedItemsMeta.splice(queuedItemIndex, 1, {...itemMeta, shouldRefresh: true});
|
||||
} else {
|
||||
this.logger.debug(`${msg} Re-queuing item but will also refresh data before processing.`);
|
||||
this.queuedItemsMeta.push({id: task.activity.id, shouldRefresh: true, state: 'queued'});
|
||||
this.queuedItemsMeta.push({id: task.activity.name, shouldRefresh: true, state: 'queued'});
|
||||
this.queue.push(task);
|
||||
}
|
||||
} else {
|
||||
this.queuedItemsMeta.push({id: task.activity.id, shouldRefresh: false, state: 'queued'});
|
||||
this.queuedItemsMeta.push({id: task.activity.name, shouldRefresh: false, state: 'queued'});
|
||||
this.queue.push(task);
|
||||
}
|
||||
|
||||
@@ -493,40 +533,6 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
, 1);
|
||||
}
|
||||
|
||||
protected async startDelayQueue() {
|
||||
while(this.queueState.state === RUNNING) {
|
||||
let index = 0;
|
||||
for(const ar of this.resources.delayedItems) {
|
||||
if(!ar.processing && ar.queuedAt.add(ar.delay).isSameOrBefore(dayjs())) {
|
||||
this.logger.info(`Delayed Activity ${ar.activity.name} is being queued.`);
|
||||
await this.firehose.push({
|
||||
activity: ar.activity,
|
||||
options: {
|
||||
refresh: true,
|
||||
// @ts-ignore
|
||||
source: ar.identifier === undefined ? ar.type : `${ar.type}:${ar.identifier}`,
|
||||
initialGoto: ar.goto,
|
||||
activitySource: {
|
||||
id: ar.id,
|
||||
queuedAt: ar.queuedAt,
|
||||
delay: ar.delay,
|
||||
action: ar.action,
|
||||
goto: ar.goto,
|
||||
identifier: ar.identifier,
|
||||
type: ar.type
|
||||
},
|
||||
dryRun: ar.dryRun,
|
||||
}
|
||||
});
|
||||
this.resources.delayedItems.splice(index, 1, {...ar, processing: true});
|
||||
}
|
||||
index++;
|
||||
}
|
||||
// sleep for 5 seconds
|
||||
await sleep(5000);
|
||||
}
|
||||
}
|
||||
|
||||
protected generateQueue(maxWorkers: number) {
|
||||
if (maxWorkers > 1) {
|
||||
this.logger.warn(`Setting max queue workers above 1 (specified: ${maxWorkers}) may have detrimental effects to log readability and api usage. Consult the documentation before using this advanced/experimental feature.`);
|
||||
@@ -538,7 +544,7 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
await sleep(this.delayBy * 1000);
|
||||
}
|
||||
|
||||
const queuedItemIndex = this.queuedItemsMeta.findIndex(x => x.id === task.activity.id);
|
||||
const queuedItemIndex = this.queuedItemsMeta.findIndex(x => x.id === task.activity.name);
|
||||
try {
|
||||
const itemMeta = this.queuedItemsMeta[queuedItemIndex];
|
||||
this.queuedItemsMeta.splice(queuedItemIndex, 1, {...itemMeta, state: 'processing'});
|
||||
@@ -551,9 +557,6 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
} finally {
|
||||
// always remove item meta regardless of success or failure since we are done with it meow
|
||||
this.queuedItemsMeta.splice(queuedItemIndex, 1);
|
||||
if(task.options.activitySource?.id !== undefined) {
|
||||
await this.resources.removeDelayedActivity(task.options.activitySource?.id);
|
||||
}
|
||||
}
|
||||
}
|
||||
, maxWorkers);
|
||||
@@ -875,7 +878,6 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
const checkType = isSubmission(activity) ? 'Submission' : 'Comment';
|
||||
let item = activity,
|
||||
runtimeShouldRefresh = false;
|
||||
const itemId = await item.id;
|
||||
|
||||
const {
|
||||
delayUntil,
|
||||
@@ -885,6 +887,14 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
force = false,
|
||||
} = options;
|
||||
|
||||
if(refresh) {
|
||||
this.logger.verbose(`Refreshed data`);
|
||||
// @ts-ignore
|
||||
item = await activity.refresh();
|
||||
}
|
||||
|
||||
const itemId = await item.id;
|
||||
|
||||
if(await this.resources.hasRecentSelf(item)) {
|
||||
let recentMsg = `Found in Activities recently (last ${this.resources.selfTTL} seconds) modified/created by this bot`;
|
||||
if(force) {
|
||||
@@ -963,7 +973,6 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
delay: dayjs.duration(remaining, 'seconds'),
|
||||
id: 'notUsed',
|
||||
queuedAt: dayjs(),
|
||||
processing: false,
|
||||
activity,
|
||||
author: getActivityAuthorName(activity.author),
|
||||
});
|
||||
@@ -977,7 +986,7 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
}
|
||||
// refresh signal from firehose if activity was ingested multiple times before processing or re-queued while processing
|
||||
// want to make sure we have the most recent data
|
||||
if(runtimeShouldRefresh || refresh) {
|
||||
if(runtimeShouldRefresh) {
|
||||
this.logger.verbose(`Refreshed data`);
|
||||
// @ts-ignore
|
||||
item = await activity.refresh();
|
||||
@@ -1348,7 +1357,6 @@ export class Manager extends EventEmitter implements RunningStates {
|
||||
state: RUNNING,
|
||||
causedBy
|
||||
}
|
||||
this.startDelayQueue();
|
||||
if(!suppressNotification) {
|
||||
this.notificationManager.handle('runStateChanged', 'Queue Started', reason, causedBy);
|
||||
}
|
||||
|
||||
src/Subreddit/ModNotes/ModAction.ts (new file, 53 lines)
@@ -0,0 +1,53 @@
|
||||
import {Submission, RedditUser, Comment, Subreddit, PrivateMessage} from "snoowrap/dist/objects"
|
||||
import {generateSnoowrapEntityFromRedditThing, parseRedditFullname} from "../../util"
|
||||
import Snoowrap from "snoowrap";
|
||||
|
||||
//import {ExtendedSnoowrap} from "../../Utils/SnoowrapClients";
|
||||
|
||||
export interface ModActionRaw {
|
||||
action?: string | null
|
||||
reddit_id?: string | null
|
||||
details?: string | null
|
||||
description?: string | null
|
||||
}
|
||||
|
||||
export class ModAction {
|
||||
action?: string
|
||||
actedOn?: RedditUser | Submission | Comment | Subreddit | PrivateMessage
|
||||
details?: string
|
||||
description?: string
|
||||
|
||||
constructor(data: ModActionRaw | undefined, client: Snoowrap) {
|
||||
const {
|
||||
action,
|
||||
reddit_id,
|
||||
details,
|
||||
description
|
||||
} = data || {};
|
||||
this.action = action !== null ? action : undefined;
|
||||
this.details = details !== null ? details : undefined;
|
||||
this.description = description !== null ? description : undefined;
|
||||
|
||||
if (reddit_id !== null && reddit_id !== undefined) {
|
||||
const thing = parseRedditFullname(reddit_id);
|
||||
if (thing !== undefined) {
|
||||
this.actedOn = generateSnoowrapEntityFromRedditThing(thing, client);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
toRaw(): ModActionRaw {
|
||||
return {
|
||||
action: this.action,
|
||||
details: this.details,
|
||||
reddit_id: this.actedOn !== undefined ? this.actedOn.id : undefined,
|
||||
description: this.description
|
||||
}
|
||||
}
|
||||
|
||||
toJSON() {
|
||||
return this.toRaw();
|
||||
}
|
||||
}
|
||||
|
||||
export default ModAction;
|
||||
src/Subreddit/ModNotes/ModNote.ts (new file, 119 lines)
@@ -0,0 +1,119 @@
|
||||
import {ModAction, ModActionRaw} from "./ModAction";
|
||||
import {Submission, RedditUser, Comment, Subreddit} from "snoowrap/dist/objects"
|
||||
import {ModUserNote, ModUserNoteRaw} from "./ModUserNote";
|
||||
//import {ExtendedSnoowrap} from "../../Utils/SnoowrapClients";
|
||||
import dayjs, {Dayjs} from "dayjs";
|
||||
import {generateSnoowrapEntityFromRedditThing, parseRedditFullname} from "../../util";
|
||||
import Snoowrap from "snoowrap";
|
||||
import {ModActionType, ModUserNoteLabel} from "../../Common/Infrastructure/Atomic";
|
||||
import {RedditThing} from "../../Common/Infrastructure/Reddit";
|
||||
|
||||
export interface ModNoteSnoowrapPopulated extends Omit<ModNoteRaw, 'subreddit' | 'user'> {
|
||||
subreddit: Subreddit
|
||||
user: RedditUser
|
||||
}
|
||||
|
||||
export interface CreateModNoteData {
|
||||
user: RedditUser
|
||||
subreddit: Subreddit
|
||||
activity?: Submission | Comment | RedditUser
|
||||
label?: ModUserNoteLabel
|
||||
note?: string
|
||||
}
|
||||
|
||||
export const asCreateModNoteData = (val: any): val is CreateModNoteData => {
|
||||
if(val !== null && typeof val === 'object') {
|
||||
return val.user instanceof RedditUser && val.subreddit instanceof Subreddit && typeof val.note === 'string';
|
||||
}
|
||||
return false;
|
||||
}
|
||||
|
||||
|
||||
export interface ModNoteRaw {
|
||||
subreddit: string
|
||||
subreddit_id: string
|
||||
|
||||
user: string
|
||||
user_id: string
|
||||
|
||||
operator: string
|
||||
operator_id: string
|
||||
|
||||
id: string
|
||||
created_at: number
|
||||
cursor?: string
|
||||
type: ModActionType | string
|
||||
mod_action_data: ModActionRaw
|
||||
user_note_data: ModUserNoteRaw
|
||||
}
|
||||
|
||||
export class ModNote {
|
||||
|
||||
createdBy: RedditUser | Subreddit
|
||||
createdByName?: string
|
||||
createdAt: Dayjs
|
||||
action: ModAction
|
||||
note: ModUserNote
|
||||
user: RedditUser
|
||||
operatorVal: string
|
||||
cursor?: string
|
||||
id: string
|
||||
subreddit: Subreddit
|
||||
type: ModActionType | string
|
||||
|
||||
|
||||
constructor(data: ModNoteRaw, client: Snoowrap) {
|
||||
|
||||
this.createdByName = data.operator;
|
||||
this.createdAt = dayjs.unix(data.created_at);
|
||||
this.id = data.id;
|
||||
this.type = data.type;
|
||||
this.cursor = data.cursor;
|
||||
|
||||
this.subreddit = new Subreddit({display_name: data.subreddit, id: data.subreddit_id}, client, false);
|
||||
this.user = new RedditUser({name: data.user, id: data.user_id}, client, false);
|
||||
|
||||
this.operatorVal = data.operator;
|
||||
|
||||
const opThing = parseRedditFullname(data.operator_id) as RedditThing;
|
||||
this.createdBy = generateSnoowrapEntityFromRedditThing(opThing, client) as RedditUser | Subreddit;
|
||||
if (this.createdBy instanceof RedditUser) {
|
||||
this.createdBy.name = data.operator;
|
||||
}
|
||||
|
||||
this.action = new ModAction(data.mod_action_data, client);
|
||||
if (this.action.actedOn instanceof RedditUser && this.action.actedOn.id === this.user.id) {
|
||||
this.action.actedOn = this.user;
|
||||
}
|
||||
|
||||
this.note = new ModUserNote(data.user_note_data, client);
|
||||
if (this.note.actedOn instanceof RedditUser && this.note.actedOn.id === this.user.id) {
|
||||
this.note.actedOn = this.user;
|
||||
}
|
||||
}
|
||||
|
||||
toRaw(): ModNoteRaw {
|
||||
return {
|
||||
subreddit: this.subreddit.display_name,
|
||||
subreddit_id: this.subreddit.id,
|
||||
|
||||
user: this.user.name,
|
||||
user_id: this.user.id,
|
||||
|
||||
operator: this.operatorVal,
|
||||
operator_id: this.createdBy.id,
|
||||
|
||||
mod_action_data: this.action.toRaw(),
|
||||
|
||||
id: this.id,
|
||||
user_note_data: this.note.toRaw(),
|
||||
created_at: this.createdAt.unix(),
|
||||
type: this.type,
|
||||
cursor: this.cursor
|
||||
}
|
||||
}
|
||||
|
||||
toJSON() {
|
||||
return this.toRaw();
|
||||
}
|
||||
}
|
||||
src/Subreddit/ModNotes/ModUserNote.ts (new file, 48 lines)
@@ -0,0 +1,48 @@
|
||||
import {Comment, PrivateMessage, RedditUser, Submission} from "snoowrap/dist/objects";
|
||||
import {ModUserNoteLabel} from "../../Common/Infrastructure/Atomic";
|
||||
//import {ExtendedSnoowrap} from "../../Utils/SnoowrapClients";
|
||||
import {generateSnoowrapEntityFromRedditThing, parseRedditFullname} from "../../util";
|
||||
import Snoowrap from "snoowrap";
|
||||
|
||||
export interface ModUserNoteRaw {
|
||||
note?: string | null
|
||||
reddit_id?: string | null
|
||||
label?: string | null
|
||||
}
|
||||
|
||||
export class ModUserNote {
|
||||
note?: string
|
||||
actedOn?: RedditUser | Submission | Comment | PrivateMessage
|
||||
label?: ModUserNoteLabel
|
||||
|
||||
constructor(data: ModUserNoteRaw | undefined, client: Snoowrap) {
|
||||
const {
|
||||
note,
|
||||
reddit_id,
|
||||
label
|
||||
} = data || {};
|
||||
this.note = note !== null ? note : undefined;
|
||||
this.label = label !== null ? label as ModUserNoteLabel : undefined;
|
||||
|
||||
if (reddit_id !== null && reddit_id !== undefined) {
|
||||
const thing = parseRedditFullname(reddit_id);
|
||||
if (thing !== undefined) {
|
||||
this.actedOn = generateSnoowrapEntityFromRedditThing(thing, client) as RedditUser | Submission | Comment;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
toRaw(): ModUserNoteRaw {
|
||||
return {
|
||||
note: this.note,
|
||||
reddit_id: this.actedOn !== undefined ? this.actedOn.id : undefined,
|
||||
label: this.label
|
||||
}
|
||||
}
|
||||
|
||||
toJSON() {
|
||||
return this.toRaw();
|
||||
}
|
||||
}
|
||||
|
||||
export default ModUserNote;
|
||||
@@ -1,11 +1,12 @@
|
||||
import {Poll, SnooStormOptions} from "snoostorm"
|
||||
import Snoowrap, {Listing} from "snoowrap";
|
||||
import Snoowrap, {Listing, RedditContent} from "snoowrap";
|
||||
import {EventEmitter} from "events";
|
||||
import {PollConfiguration} from "snoostorm/out/util/Poll";
|
||||
import {DEFAULT_POLLING_INTERVAL} from "../Common/interfaces";
|
||||
import {mergeArr, parseDuration, random} from "../util";
|
||||
import { Logger } from "winston";
|
||||
import {ErrorWithCause} from "pony-cause";
|
||||
import dayjs, {Dayjs as DayjsObj} from "dayjs";
|
||||
|
||||
type Awaitable<T> = Promise<T> | T;
|
||||
|
||||
@@ -16,13 +17,15 @@ interface RCBPollingOptions<T> extends SnooStormOptions {
|
||||
name?: string,
|
||||
processed?: Set<T[keyof T]>
|
||||
label?: string
|
||||
dateCutoff?: boolean
|
||||
}
|
||||
|
||||
interface RCBPollConfiguration<T> extends PollConfiguration<T>,RCBPollingOptions<T> {
|
||||
get: () => Promise<Listing<T>>
|
||||
dateCutoff: boolean
|
||||
}
|
||||
|
||||
export class SPoll<T extends object> extends Poll<T> {
|
||||
export class SPoll<T extends RedditContent<object>> extends Poll<T> {
|
||||
identifier: keyof T;
|
||||
getter: () => Promise<Listing<T>>;
|
||||
frequency;
|
||||
@@ -31,6 +34,8 @@ export class SPoll<T extends object> extends Poll<T> {
|
||||
// -- that is, we don't want to emit the items we immediately fetch on a fresh poll start since they existed "before" polling started
|
||||
newStart: boolean = true;
|
||||
enforceContinuity: boolean;
|
||||
useDateCutoff: boolean;
|
||||
dateCutoff?: DayjsObj;
|
||||
randInterval?: { clear: () => void };
|
||||
name: string = 'Reddit Stream';
|
||||
logger: Logger;
|
||||
@@ -47,7 +52,8 @@ export class SPoll<T extends object> extends Poll<T> {
|
||||
name,
|
||||
subreddit,
|
||||
label = 'Polling',
|
||||
processed
|
||||
processed,
|
||||
dateCutoff,
|
||||
} = options;
|
||||
this.subreddit = subreddit;
|
||||
this.name = name !== undefined ? name : this.name;
|
||||
@@ -56,6 +62,7 @@ export class SPoll<T extends object> extends Poll<T> {
|
||||
this.getter = get;
|
||||
this.frequency = frequency;
|
||||
this.enforceContinuity = enforceContinuity;
|
||||
this.useDateCutoff = dateCutoff;
|
||||
|
||||
// if we pass in processed on init the intention is to "continue" from where the previous stream left off
|
||||
// WITHOUT new start behavior
|
||||
@@ -80,7 +87,7 @@ export class SPoll<T extends object> extends Poll<T> {
|
||||
// but only continue iterating if stream enforces continuity and we've only seen new items so far
|
||||
while(page === 1 || (self.enforceContinuity && !self.newStart && !anyAlreadySeen)) {
|
||||
if(page !== 1) {
|
||||
self.logger.debug(`Did not find any already seen activities and continuity is enforced. This probably means there were more new items than 1 api call can return. Fetching next page (${page})...`);
|
||||
self.logger.debug(`Did not find any already seen Activities and continuity is enforced. This probably means there were more new Activities than 1 api call can return. Fetching next page (page ${page})...`);
|
||||
// @ts-ignore
|
||||
batch = await batch.fetchMore({amount: 100});
|
||||
}
|
||||
@@ -95,24 +102,67 @@ export class SPoll<T extends object> extends Poll<T> {
|
||||
continue;
|
||||
}
|
||||
|
||||
// Emit for new items and add it to the list
|
||||
// add new item to list and set as processed
|
||||
newItems.push(item);
|
||||
self.processed.add(id);
|
||||
// but don't emit on new start since we are "buffering" already existing activities
|
||||
if(!self.newStart) {
|
||||
self.emit("item", item);
|
||||
}
|
||||
}
|
||||
page++;
|
||||
}
|
||||
const newItemMsg = `Found ${newItems.length} new items out of ${batch.length} returned`;
|
||||
|
||||
if(self.newStart) {
|
||||
self.logger.debug(`${newItemMsg} but will ignore all on first start.`);
|
||||
|
||||
self.logger.debug(`Found ${newItems.length} unseen Activities out of ${batch.length} returned, but will ignore all on first start.`);
|
||||
self.emit("listing", []);
|
||||
|
||||
if(self.useDateCutoff && self.dateCutoff === undefined) {
|
||||
self.logger.debug('Cutoff date should be used for filtering unseen Activities but none was set. Will determine date based on newest Activity returned from first polling results.');
|
||||
if(newItems.length === 0) {
|
||||
// no items found, cutoff is now
|
||||
self.dateCutoff = dayjs();
|
||||
self.logger.debug(`Cutoff date set to NOW (${self.dateCutoff.format('YYYY-MM-DD HH:mm:ssZ')}) since no unseen Activities returned. Unseen Activities will only be returned if newer than this date.`);
|
||||
} else {
|
||||
// set cutoff date for new items from the newest items found
|
||||
const sorted = [...newItems];
|
||||
sorted.sort((a, z) => z.created_utc - a.created_utc);
|
||||
self.dateCutoff = dayjs.unix(sorted[0].created_utc);
|
||||
self.logger.debug(`Cutoff date set to newest unseen Activity found, ${self.dateCutoff.format('YYYY-MM-DD HH:mm:ssZ')}. Unseen Activities will only be returned if newer than this date.`);
|
||||
}
|
||||
}
|
||||
|
||||
} else {
|
||||
self.logger.debug(newItemMsg);
|
||||
|
||||
// applies mostly (only?) to 'unmoderated' polling
|
||||
//
|
||||
// scenario:
|
||||
// * polling unmoderated for many subreddits and unmoderated has not been clearing out for awhile so it has many (100's) of items
|
||||
// * a moderator, or CM, iterates through list and actions items so the list is shorter
|
||||
// * CM polling unmoderated and finds "unseen" items that don't appear in unprocessed list
|
||||
//
|
||||
// these "unseen" are OLDER than the "newest" seen items we have got from polling because CM only got the first page of unmoderated items
|
||||
// so now CM emits them as "new" and CM starts processing them. If it continues to process them then more and more 'unseen old' items continue to appear in stream,
|
||||
// creating a feedback loop where CM eventually processes the entire backlog of unmoderated items
|
||||
//
|
||||
// this is UNWANTED behavior. CM should only ever process items added to polling sources after it starts monitoring them.
|
||||
//
|
||||
// to address this we use a cutoff date determined from the newest activity returned from the first polling call (or current datetime if none returned)
|
||||
// then we make sure any 'new' items (unseen by CM) are newer than this cutoff date
|
||||
//
|
||||
// -- this is the default behavior for all polling sources except modqueue. See comments on that class below for why.
|
||||
const unixCutoff = self.useDateCutoff && self.dateCutoff !== undefined ? self.dateCutoff.unix() : undefined;
|
||||
const validNewItems = unixCutoff === undefined || newItems.length === 0 ? newItems : newItems.filter(x => x.created_utc >= unixCutoff);
|
||||
|
||||
if(validNewItems.length !== newItems.length && self.dateCutoff !== undefined) {
|
||||
self.logger.warn(`${newItems.length - validNewItems.length} unseen Activities were created before cutoff date (${self.dateCutoff.format('YYYY-MM-DD HH:mm:ssZ')}) and have been filtered out.`);
|
||||
}
|
||||
self.logger.debug(`Found ${validNewItems.length} valid, unseen Activities out of ${batch.length} returned`);
|
||||
|
||||
// only emit if not new start since we are "buffering" already existing activities
|
||||
for(const item of validNewItems) {
|
||||
self.emit('item', item);
|
||||
}
|
||||
|
||||
// Emit the new listing of all new items
|
||||
self.emit("listing", newItems);
|
||||
self.emit("listing", validNewItems);
|
||||
}
|
||||
// no longer new start on n+1 interval
|
||||
self.newStart = false;
|
||||
@@ -146,6 +196,7 @@ export class SPoll<T extends object> extends Poll<T> {
|
||||
this.logger.debug(msg);
|
||||
this.running = false;
|
||||
this.newStart = true;
|
||||
this.dateCutoff = undefined;
|
||||
super.end();
|
||||
}
|
||||
}
|
||||
@@ -159,6 +210,7 @@ export class UnmoderatedStream extends SPoll<Snoowrap.Submission | Snoowrap.Comm
|
||||
get: async () => client.getSubreddit(options.subreddit).getUnmoderated(options),
|
||||
identifier: "id",
|
||||
name: 'Unmoderated',
|
||||
dateCutoff: true,
|
||||
...options,
|
||||
});
|
||||
}
|
||||
@@ -173,6 +225,9 @@ export class ModQueueStream extends SPoll<Snoowrap.Submission | Snoowrap.Comment
|
||||
get: async () => client.getSubreddit(options.subreddit).getModqueue(options),
|
||||
identifier: "id",
|
||||
name: 'Modqueue',
|
||||
// cannot use cutoff date since 'new' items in this list are based on when they were reported, not when the item was created
|
||||
// and unfortunately there is no way to use that "reported at" time since reddit doesn't include it in the returned items
|
||||
dateCutoff: false,
|
||||
...options,
|
||||
});
|
||||
}
|
||||
@@ -187,6 +242,7 @@ export class SubmissionStream extends SPoll<Snoowrap.Submission | Snoowrap.Comme
|
||||
get: async () => client.getNew(options.subreddit, options),
|
||||
identifier: "id",
|
||||
name: 'Submission',
|
||||
dateCutoff: true,
|
||||
...options,
|
||||
});
|
||||
}
|
||||
@@ -201,6 +257,7 @@ export class CommentStream extends SPoll<Snoowrap.Submission | Snoowrap.Comment>
|
||||
get: async () => client.getNewComments(options.subreddit, options),
|
||||
identifier: "id",
|
||||
name: 'Comment',
|
||||
dateCutoff: true,
|
||||
...options,
|
||||
});
|
||||
}
|
||||
|
||||
@@ -17,8 +17,6 @@ import {
|
||||
buildCacheOptionsFromProvider,
|
||||
buildCachePrefix,
|
||||
cacheStats,
|
||||
compareDurationValue,
|
||||
comparisonTextOp,
|
||||
createCacheManager,
|
||||
escapeRegex,
|
||||
FAIL,
|
||||
@@ -35,10 +33,7 @@ import {
|
||||
isUser,
|
||||
hashString,
|
||||
mergeArr,
|
||||
parseDurationComparison,
|
||||
parseExternalUrl,
|
||||
parseGenericValueComparison,
|
||||
parseGenericValueOrPercentComparison,
|
||||
parseRedditEntity,
|
||||
parseStringToRegex,
|
||||
parseWikiContext,
|
||||
@@ -58,7 +53,12 @@ import {
|
||||
frequencyEqualOrLargerThanMin,
|
||||
parseDurationValToDuration,
|
||||
windowConfigToWindowCriteria,
|
||||
asStrongSubredditState, convertSubredditsRawToStrong, filterByTimeRequirement
|
||||
asStrongSubredditState,
|
||||
convertSubredditsRawToStrong,
|
||||
filterByTimeRequirement,
|
||||
asSubreddit,
|
||||
modActionCriteriaSummary,
|
||||
parseRedditFullname
|
||||
} from "../util";
|
||||
import LoggedError from "../Utils/LoggedError";
|
||||
import {
|
||||
@@ -111,16 +111,19 @@ import {RuleSetResultEntity} from "../Common/Entities/RuleSetResultEntity";
|
||||
import {RulePremise} from "../Common/Entities/RulePremise";
|
||||
import cloneDeep from "lodash/cloneDeep";
|
||||
import {
|
||||
AuthorCriteria, CommentState, RequiredAuthorCrit,
|
||||
asModLogCriteria,
|
||||
asModNoteCriteria,
|
||||
AuthorCriteria, CommentState, ModLogCriteria, ModNoteCriteria, orderedAuthorCriteriaProps, RequiredAuthorCrit,
|
||||
StrongSubredditCriteria, SubmissionState,
|
||||
SubredditCriteria, TypedActivityState, TypedActivityStates,
|
||||
SubredditCriteria, toFullModLogCriteria, toFullModNoteCriteria, TypedActivityState, TypedActivityStates,
|
||||
UserNoteCriteria
|
||||
} from "../Common/Infrastructure/Filters/FilterCriteria";
|
||||
import {
|
||||
ActivitySource, DurationVal,
|
||||
EventRetentionPolicyRange,
|
||||
JoinOperands,
|
||||
ModeratorNameCriteria, statFrequencies, StatisticFrequency,
|
||||
ModActionType,
|
||||
ModeratorNameCriteria, ModUserNoteLabel, statFrequencies, StatisticFrequency,
|
||||
StatisticFrequencyOption
|
||||
} from "../Common/Infrastructure/Atomic";
|
||||
import {
|
||||
@@ -139,12 +142,20 @@ import {
|
||||
import {Duration} from "dayjs/plugin/duration";
|
||||
import {
|
||||
|
||||
ActivityType,
|
||||
AuthorHistorySort,
|
||||
CachedFetchedActivitiesResult, FetchedActivitiesResult,
|
||||
SnoowrapActivity
|
||||
} from "../Common/Infrastructure/Reddit";
|
||||
import {AuthorCritPropHelper} from "../Common/Infrastructure/Filters/AuthorCritPropHelper";
|
||||
import {NoopLogger} from "../Utils/loggerFactory";
|
||||
import {
|
||||
compareDurationValue, comparisonTextOp,
|
||||
parseDurationComparison,
|
||||
parseGenericValueComparison,
|
||||
parseGenericValueOrPercentComparison
|
||||
} from "../Common/Infrastructure/Comparisons";
|
||||
import {asCreateModNoteData, CreateModNoteData, ModNote, ModNoteRaw} from "./ModNotes/ModNote";
|
||||
|
||||
export const DEFAULT_FOOTER = '\r\n*****\r\nThis action was performed by [a bot.]({{botLink}}) Mention a moderator or [send a modmail]({{modmailLink}}) if you have any ideas, questions, or concerns about this action.';
|
||||
|
||||
@@ -210,6 +221,7 @@ export class SubredditResources {
|
||||
protected submissionTTL: number | false = cacheTTLDefaults.submissionTTL;
|
||||
protected commentTTL: number | false = cacheTTLDefaults.commentTTL;
|
||||
protected filterCriteriaTTL: number | false = cacheTTLDefaults.filterCriteriaTTL;
|
||||
protected modNotesTTL: number | false = cacheTTLDefaults.modNotesTTL;
|
||||
public selfTTL: number | false = cacheTTLDefaults.selfTTL;
|
||||
name: string;
|
||||
botName: string;
|
||||
@@ -259,6 +271,7 @@ export class SubredditResources {
|
||||
submissionTTL,
|
||||
commentTTL,
|
||||
subredditTTL,
|
||||
modNotesTTL,
|
||||
},
|
||||
botName,
|
||||
database,
|
||||
@@ -300,6 +313,7 @@ export class SubredditResources {
|
||||
this.subredditTTL = subredditTTL === true ? 0 : subredditTTL;
|
||||
this.wikiTTL = wikiTTL === true ? 0 : wikiTTL;
|
||||
this.filterCriteriaTTL = filterCriteriaTTL === true ? 0 : filterCriteriaTTL;
|
||||
this.modNotesTTL = modNotesTTL === true ? 0 : modNotesTTL;
|
||||
this.selfTTL = selfTTL === true ? 0 : selfTTL;
|
||||
this.subreddit = subreddit;
|
||||
this.thirdPartyCredentials = thirdPartyCredentials;
|
||||
@@ -440,6 +454,7 @@ export class SubredditResources {
|
||||
const now = dayjs();
|
||||
for(const dAct of dispatchedActivities) {
|
||||
const shouldDispatchAt = dAct.createdAt.add(dAct.delay.asSeconds(), 'seconds');
|
||||
let tardyHint = '';
|
||||
if(shouldDispatchAt.isBefore(now)) {
|
||||
let tardyHint = `Activity ${dAct.activityId} queued at ${dAct.createdAt.format('YYYY-MM-DD HH:mm:ssZ')} for ${dAct.delay.humanize()} is now LATE`;
|
||||
if(dAct.tardyTolerant === true) {
|
||||
@@ -453,7 +468,8 @@ export class SubredditResources {
|
||||
// see if its within tolerance
|
||||
const latest = shouldDispatchAt.add(dAct.tardyTolerant);
|
||||
if(latest.isBefore(now)) {
|
||||
tardyHint += `and IS NOT within tardy tolerance of ${dAct.tardyTolerant.humanize()} of planned dispatch time so will be dropped`;
|
||||
tardyHint += ` and IS NOT within tardy tolerance of ${dAct.tardyTolerant.humanize()} of planned dispatch time so will be dropped`;
|
||||
this.logger.warn(tardyHint);
|
||||
await this.removeDelayedActivity(dAct.id);
|
||||
continue;
|
||||
} else {
|
||||
@@ -461,8 +477,14 @@ export class SubredditResources {
|
||||
}
|
||||
}
|
||||
}
|
||||
// TODO make this less api heavy
|
||||
this.delayedItems.push(await dAct.toActivityDispatch(this.client))
|
||||
if(tardyHint !== '') {
|
||||
this.logger.warn(tardyHint);
|
||||
}
|
||||
try {
|
||||
this.delayedItems.push(await dAct.toActivityDispatch(this.client))
|
||||
} catch (e) {
|
||||
this.logger.warn(new ErrorWithCause(`Unable to add Activity ${dAct.activityId} from database delayed activities to in-app delayed activities queue`, {cause: e}));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -475,7 +497,7 @@ export class SubredditResources {
|
||||
|
||||
async removeDelayedActivity(id: string) {
|
||||
await this.dispatchedActivityRepo.delete(id);
|
||||
this.delayedItems.filter(x => x.id !== id);
|
||||
this.delayedItems = this.delayedItems.filter(x => x.id !== id);
|
||||
}
|
||||
|
||||
async initStats() {
|
||||
@@ -742,7 +764,7 @@ export class SubredditResources {
|
||||
req: acc.req + curr.requests,
|
||||
}), {miss: 0, req: 0});
|
||||
const cacheKeys = Object.keys(this.stats.cache);
|
||||
return {
|
||||
const res = {
|
||||
cache: {
|
||||
// TODO could probably combine these two
|
||||
totalRequests: totals.req,
|
||||
@@ -770,24 +792,29 @@ export class SubredditResources {
|
||||
|
||||
if(acc[curr].requestTimestamps.length > 1) {
|
||||
// calculate average time between request
|
||||
const diffData = acc[curr].requestTimestamps.reduce((acc, curr: number) => {
|
||||
if(acc.last === 0) {
|
||||
acc.last = curr;
|
||||
return acc;
|
||||
const diffData = acc[curr].requestTimestamps.reduce((accTimestampData, curr: number) => {
|
||||
if(accTimestampData.last === 0) {
|
||||
accTimestampData.last = curr;
|
||||
return accTimestampData;
|
||||
}
|
||||
acc.diffs.push(curr - acc.last);
|
||||
acc.last = curr;
|
||||
return acc;
|
||||
accTimestampData.diffs.push(curr - accTimestampData.last);
|
||||
accTimestampData.last = curr;
|
||||
return accTimestampData;
|
||||
},{last: 0, diffs: [] as number[]});
|
||||
const avgDiff = diffData.diffs.reduce((acc, curr) => acc + curr, 0) / diffData.diffs.length;
|
||||
|
||||
acc[curr].averageTimeBetweenHits = formatNumber(avgDiff/1000);
|
||||
}
|
||||
|
||||
const {requestTimestamps, identifierRequestCount, ...rest} = acc[curr];
|
||||
// @ts-ignore
|
||||
acc[curr] = rest;
|
||||
|
||||
return acc;
|
||||
}, Promise.resolve(this.stats.cache))
|
||||
}, Promise.resolve({...this.stats.cache}))
|
||||
}
|
||||
}
|
||||
return res;
|
||||
}
|
||||
|
||||
setLogger(logger: Logger) {
|
||||
@@ -906,39 +933,43 @@ export class SubredditResources {
|
||||
return await item.fetch();
|
||||
}
|
||||
} catch (err: any) {
|
||||
this.logger.error('Error while trying to fetch a cached activity', err);
|
||||
throw err.logged;
|
||||
throw new ErrorWithCause('Error while trying to fetch a cached Activity', {cause: err});
|
||||
}
|
||||
}
|
||||
|
||||
// @ts-ignore
|
||||
public async setActivity(item: Submission | Comment, tryToFetch = true)
|
||||
{
|
||||
let hash = '';
|
||||
if(this.submissionTTL !== false && isSubmission(item)) {
|
||||
hash = `sub-${item.name}`;
|
||||
if(tryToFetch && item instanceof Submission) {
|
||||
// @ts-ignore
|
||||
const itemToCache = await item.fetch();
|
||||
await this.cache.set(hash, itemToCache, {ttl: this.submissionTTL});
|
||||
return itemToCache;
|
||||
} else {
|
||||
// @ts-ignore
|
||||
await this.cache.set(hash, item, {ttl: this.submissionTTL});
|
||||
return item;
|
||||
}
|
||||
} else if(this.commentTTL !== false){
|
||||
hash = `comm-${item.name}`;
|
||||
if(tryToFetch && item instanceof Comment) {
|
||||
// @ts-ignore
|
||||
const itemToCache = await item.fetch();
|
||||
await this.cache.set(hash, itemToCache, {ttl: this.commentTTL});
|
||||
return itemToCache;
|
||||
} else {
|
||||
// @ts-ignore
|
||||
await this.cache.set(hash, item, {ttl: this.commentTTL});
|
||||
return item;
|
||||
try {
|
||||
let hash = '';
|
||||
if (this.submissionTTL !== false && isSubmission(item)) {
|
||||
hash = `sub-${item.name}`;
|
||||
if (tryToFetch && item instanceof Submission) {
|
||||
// @ts-ignore
|
||||
const itemToCache = await item.fetch();
|
||||
await this.cache.set(hash, itemToCache, {ttl: this.submissionTTL});
|
||||
return itemToCache;
|
||||
} else {
|
||||
// @ts-ignore
|
||||
await this.cache.set(hash, item, {ttl: this.submissionTTL});
|
||||
return item;
|
||||
}
|
||||
} else if (this.commentTTL !== false) {
|
||||
hash = `comm-${item.name}`;
|
||||
if (tryToFetch && item instanceof Comment) {
|
||||
// @ts-ignore
|
||||
const itemToCache = await item.fetch();
|
||||
await this.cache.set(hash, itemToCache, {ttl: this.commentTTL});
|
||||
return itemToCache;
|
||||
} else {
|
||||
// @ts-ignore
|
||||
await this.cache.set(hash, item, {ttl: this.commentTTL});
|
||||
return item;
|
||||
}
|
||||
}
|
||||
return item;
|
||||
} catch (e) {
|
||||
throw new ErrorWithCause('Error occurred while trying to add Activity to cache', {cause: e});
|
||||
}
|
||||
}
|
||||
|
||||
@@ -987,11 +1018,19 @@ export class SubredditResources {
|
||||
}
|
||||
|
||||
// @ts-ignore
|
||||
async getSubreddit(item: Submission | Comment, logger = this.logger) {
|
||||
async getSubreddit(item: Submission | Comment | Subreddit | string, logger = this.logger) {
|
||||
let subName = '';
|
||||
if (typeof item === 'string') {
|
||||
subName = item;
|
||||
} else if (asSubreddit(item)) {
|
||||
subName = item.display_name;
|
||||
} else if (asSubmission(item) || asComment(item)) {
|
||||
subName = getActivitySubredditName(item);
|
||||
}
|
||||
try {
|
||||
let hash = '';
|
||||
const subName = getActivitySubredditName(item);
|
||||
if (this.subredditTTL !== false) {
|
||||
|
||||
hash = `sub-${subName}`;
|
||||
await this.stats.cache.subreddit.identifierRequestCount.set(hash, (await this.stats.cache.subreddit.identifierRequestCount.wrap(hash, () => 0) as number) + 1);
|
||||
this.stats.cache.subreddit.requestTimestamps.push(Date.now());
|
||||
@@ -1002,7 +1041,7 @@ export class SubredditResources {
|
||||
return new Subreddit(cachedSubreddit, this.client, false);
|
||||
}
|
||||
// @ts-ignore
|
||||
const subreddit = await this.client.getSubreddit(subName).fetch() as Subreddit;
|
||||
const subreddit = await (item instanceof Subreddit ? item : this.client.getSubreddit(subName)).fetch() as Subreddit;
|
||||
this.stats.cache.subreddit.miss++;
|
||||
// @ts-ignore
|
||||
await this.cache.set(hash, subreddit, {ttl: this.subredditTTL});
|
||||
@@ -1010,12 +1049,12 @@ export class SubredditResources {
|
||||
return subreddit as Subreddit;
|
||||
} else {
|
||||
// @ts-ignore
|
||||
let subreddit = await this.client.getSubreddit(subName);
|
||||
let subreddit = await (item instanceof Subreddit ? item : this.client.getSubreddit(subName)).fetch();
|
||||
|
||||
return subreddit as Subreddit;
|
||||
}
|
||||
} catch (err: any) {
|
||||
this.logger.error('Error while trying to fetch a cached activity', err);
|
||||
this.logger.error('Error while trying to fetch a cached subreddit', err);
|
||||
throw err.logged;
|
||||
}
|
||||
}
|
||||
@@ -1116,6 +1155,84 @@ export class SubredditResources {
return false;
}

async getAuthorModNotesByActivityAuthor(activity: Comment | Submission) {
const author = activity.author instanceof RedditUser ? activity.author : getActivityAuthorName(activity.author);
if (activity.subreddit.display_name !== this.subreddit.display_name) {
throw new SimpleError(`Can only get Modnotes for current moderator subreddit, Activity is from ${activity.subreddit.display_name}`, {isSerious: false});
}
return this.getAuthorModNotes(author);
}

async getAuthorModNotes(val: RedditUser | string) {

const authorName = typeof val === 'string' ? val : val.name;
if (authorName === '[deleted]') {
throw new SimpleError(`User is '[deleted]', cannot retrieve`, {isSerious: false});
}
const subredditName = this.subreddit.display_name

const hash = `authorModNotes-${subredditName}-${authorName}`;

if (this.modNotesTTL !== false) {
const cachedModNoteData = await this.cache.get(hash) as ModNoteRaw[] | null | undefined;
if (cachedModNoteData !== undefined && cachedModNoteData !== null) {
this.logger.debug(`Cache Hit: Author ModNotes ${authorName} in ${subredditName}`);

return cachedModNoteData.map(x => {
const note = new ModNote(x, this.client);
note.subreddit = this.subreddit;
if (val instanceof RedditUser) {
note.user = val;
}
return note;
});
}
}

const fetchedNotes = (await this.client.getModNotes(this.subreddit, val)).notes.map(x => {
x.subreddit = this.subreddit;
if (val instanceof RedditUser) {
x.user = val;
}
return x;
});

if (this.modNotesTTL !== false) {
// @ts-ignore
await this.cache.set(hash, fetchedNotes, {ttl: this.modNotesTTL});
}

return fetchedNotes;
}

async addModNote(note: CreateModNoteData | ModNote): Promise<ModNote> {
let data: CreateModNoteData;
if (asCreateModNoteData(note)) {
data = note;
} else {
data = {
user: note.user,
subreddit: this.subreddit,
activity: note.note.actedOn as Submission | Comment | RedditUser | undefined,
label: note.note.label,
note: note.note.note ?? '',
}
}

const newNote = await this.client.addModNote(data);

if (this.modNotesTTL !== false) {
const hash = `authorModNotes-${this.subreddit.display_name}-${data.user.name}`;
const cachedModNoteData = await this.cache.get(hash) as ModNoteRaw[] | null | undefined;
if (cachedModNoteData !== undefined && cachedModNoteData !== null) {
this.logger.debug(`Adding new Note ${newNote.id} to Author ${data.user.name} Note cache`);
await this.cache.set(hash, [newNote, ...cachedModNoteData], {ttl: this.modNotesTTL});
}
}

return newNote;
}
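
For orientation, a minimal, self-contained sketch of the write-through caching pattern the two methods above implement: a per-author cache key, a note list fetched once, and a newly created note prepended to the cached copy rather than invalidating it. The types and client below are illustrative stand-ins, not the actual ContextMod classes; the real code uses cache-manager with `modNotesTTL` rather than a plain `Map`.

```typescript
// Illustrative stand-ins for the real entities/client used by SubredditResources
interface Note { id: string; author: string; }
interface NoteClient {
    fetchNotes(subreddit: string, author: string): Promise<Note[]>;
    createNote(subreddit: string, author: string, text: string): Promise<Note>;
}

class AuthorNoteCacheSketch {
    // the real code uses cache-manager with a TTL; a Map keeps this sketch self-contained
    private cache = new Map<string, Note[]>();

    constructor(private subreddit: string, private client: NoteClient) {}

    private key(author: string): string {
        return `authorModNotes-${this.subreddit}-${author}`;
    }

    async getNotes(author: string): Promise<Note[]> {
        const cached = this.cache.get(this.key(author));
        if (cached !== undefined) {
            return cached; // cache hit, no API call
        }
        const fetched = await this.client.fetchNotes(this.subreddit, author);
        this.cache.set(this.key(author), fetched);
        return fetched;
    }

    async addNote(author: string, text: string): Promise<Note> {
        const created = await this.client.createNote(this.subreddit, author, text);
        const cached = this.cache.get(this.key(author));
        if (cached !== undefined) {
            // write-through: prepend the new note instead of dropping the cached list
            this.cache.set(this.key(author), [created, ...cached]);
        }
        return created;
    }
}
```
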
// @ts-ignore
|
||||
async getAuthor(val: RedditUser | string) {
|
||||
const authorName = typeof val === 'string' ? val : val.name;
|
||||
@@ -1158,7 +1275,7 @@ export class SubredditResources {
|
||||
return user;
|
||||
} catch (err) {
|
||||
if(isStatusError(err) && err.statusCode === 404) {
|
||||
throw new SimpleError(`Reddit returned a 404 for User '${authorName}'. Likely this user is shadowbanned.`, {isSerious: false});
|
||||
throw new SimpleError(`Reddit returned a 404 for User '${authorName}'. Likely this user is shadowbanned.`, {isSerious: false, code: 404});
|
||||
}
|
||||
throw new ErrorWithCause(`Could not retrieve User '${authorName}'`, {cause: err});
|
||||
}
|
||||
@@ -2134,7 +2251,7 @@ export class SubredditResources {
|
||||
propResultsMap.age!.found = created.format('MMMM D, YYYY h:mm A Z');
|
||||
break;
|
||||
case 'title':
|
||||
if((item instanceof Comment)) {
|
||||
if(asComment(item)) {
|
||||
const titleWarn ='`title` is not allowed in `itemIs` criteria when the main Activity is a Comment';
|
||||
log.debug(titleWarn);
|
||||
propResultsMap.title!.passed = true;
|
||||
@@ -2153,7 +2270,7 @@ export class SubredditResources {
|
||||
}
|
||||
break;
|
||||
case 'isRedditMediaDomain':
|
||||
if((item instanceof Comment)) {
|
||||
if(asComment(item)) {
|
||||
const mediaWarn = '`isRedditMediaDomain` is not allowed in `itemIs` criteria when the main Activity is a Comment';
|
||||
log.debug(mediaWarn);
|
||||
propResultsMap.isRedditMediaDomain!.passed = true;
|
||||
@@ -2240,7 +2357,7 @@ export class SubredditResources {
|
||||
propResultsMap[k]!.passed = criteriaPassWithIncludeBehavior(propResultsMap[k]!.found === itemOptVal, include);
|
||||
break;
|
||||
case 'op':
|
||||
if(isSubmission(item)) {
|
||||
if(asSubmission(item)) {
|
||||
const opWarn = `On a Submission the 'op' property will always be true. Did you mean to use this on a comment instead?`;
|
||||
log.debug(opWarn);
|
||||
propResultsMap.op!.passed = true;
|
||||
@@ -2251,7 +2368,7 @@ export class SubredditResources {
|
||||
propResultsMap.op!.passed = criteriaPassWithIncludeBehavior(propResultsMap.op!.found === itemOptVal, include);
|
||||
break;
|
||||
case 'depth':
|
||||
if(isSubmission(item)) {
|
||||
if(asSubmission(item)) {
|
||||
const depthWarn = `Cannot test for 'depth' on a Submission`;
|
||||
log.debug(depthWarn);
|
||||
propResultsMap.depth!.passed = true;
|
||||
@@ -2362,6 +2479,8 @@ export class SubredditResources {
|
||||
ex = v.map(x => {
|
||||
if (asUserNoteCriteria(x)) {
|
||||
return userNoteCriteriaSummary(x);
|
||||
} else if(asModNoteCriteria(x) || asModLogCriteria(x)) {
|
||||
return modActionCriteriaSummary(x);
|
||||
}
|
||||
return x;
|
||||
});
|
||||
@@ -2375,45 +2494,29 @@ export class SubredditResources {
|
||||
return acc;
|
||||
}, {});
|
||||
|
||||
const {shadowBanned} = authorOpts;
|
||||
const keys = Object.keys(propResultsMap) as (keyof AuthorCriteria)[]
|
||||
let orderedKeys: (keyof AuthorCriteria)[] = [];
|
||||
|
||||
if (shadowBanned !== undefined) {
|
||||
try {
|
||||
// @ts-ignore
|
||||
await item.author.fetch();
|
||||
// user is not shadowbanned
|
||||
// if criteria specifies they SHOULD be shadowbanned then return false now
|
||||
if (shadowBanned) {
|
||||
propResultsMap.shadowBanned!.found = false;
|
||||
propResultsMap.shadowBanned!.passed = false;
|
||||
}
|
||||
} catch (err: any) {
|
||||
if (isStatusError(err) && err.statusCode === 404) {
|
||||
// user is shadowbanned
|
||||
// if criteria specifies they should not be shadowbanned then return false now
|
||||
if (!shadowBanned) {
|
||||
propResultsMap.shadowBanned!.found = true;
|
||||
propResultsMap.shadowBanned!.passed = false;
|
||||
}
|
||||
} else {
|
||||
throw err;
|
||||
}
|
||||
// push existing keys that should be ordered to the front of the list
|
||||
for(const oProp of orderedAuthorCriteriaProps) {
|
||||
if(keys.includes(oProp)) {
|
||||
orderedKeys.push(oProp);
|
||||
}
|
||||
}
|
||||
|
||||
// then add any keys not included as ordered but that exist onto the end of the list
|
||||
// this way when we iterate all properties of the criteria we test all props that (probably) don't require API calls first
|
||||
orderedKeys = orderedKeys.concat(keys.filter(x => !orderedKeys.includes(x)));
|
||||
|
||||
|
||||
if (propResultsMap.shadowBanned === undefined || propResultsMap.shadowBanned.passed === undefined) {
|
||||
try {
|
||||
const authorName = getActivityAuthorName(item.author);
|
||||
|
||||
const keys = Object.keys(propResultsMap) as (keyof AuthorCriteria)[]
|
||||
|
||||
let shouldContinue = true;
|
||||
for (const k of keys) {
|
||||
if (k === 'shadowBanned') {
|
||||
// we have already taken care of this with shadowban check above
|
||||
continue;
|
||||
for (const k of orderedKeys) {
|
||||
|
||||
if(propResultsMap.shadowBanned !== undefined && propResultsMap.shadowBanned!.found === true) {
|
||||
// if we've determined the user is shadowbanned we can't get any info about them anyways so end criteria testing early
|
||||
break;
|
||||
}
|
||||
|
||||
// none of the criteria below are returned if the user is suspended
|
||||
@@ -2438,8 +2541,30 @@ export class SubredditResources {
|
||||
|
||||
const authorOptVal = definedAuthorOpts[k];
|
||||
|
||||
//if (authorOpts[k] !== undefined) {
|
||||
switch (k) {
|
||||
case 'shadowBanned':
|
||||
|
||||
const isShadowBannedTest = async () => {
|
||||
try {
|
||||
// @ts-ignore
|
||||
await user();
|
||||
return false;
|
||||
} catch (err: any) {
|
||||
// see this.getAuthor() catch block
|
||||
if('code' in err && err.code === 404) {
|
||||
return true
|
||||
}
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
|
||||
propResultsMap.shadowBanned!.found = await isShadowBannedTest();
|
||||
const shadowPassed = (propResultsMap.shadowBanned!.found && authorOptVal === true) || (!propResultsMap.shadowBanned!.found && authorOptVal === false);
|
||||
propResultsMap.shadowBanned!.passed = criteriaPassWithIncludeBehavior(shadowPassed, include);
|
||||
if(propResultsMap.shadowBanned!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
break;
|
||||
case 'name':
|
||||
const nameVal = authorOptVal as RequiredAuthorCrit['name'];
|
||||
const authPass = () => {
|
||||
@@ -2453,7 +2578,7 @@ export class SubredditResources {
|
||||
}
|
||||
const authResult = authPass();
|
||||
propResultsMap.name!.found = authorName;
|
||||
propResultsMap.name!.passed = !((include && !authResult) || (!include && authResult));
|
||||
propResultsMap.name!.passed = criteriaPassWithIncludeBehavior(authResult, include);
|
||||
if (!propResultsMap.name!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2478,7 +2603,7 @@ export class SubredditResources {
|
||||
cssResult = opts.some(x => x.trim().toLowerCase() === css.trim().toLowerCase())
|
||||
}
|
||||
|
||||
propResultsMap.flairCssClass!.passed = !((include && !cssResult) || (!include && cssResult));
|
||||
propResultsMap.flairCssClass!.passed = criteriaPassWithIncludeBehavior(cssResult, include);
|
||||
if (!propResultsMap.flairCssClass!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2502,7 +2627,7 @@ export class SubredditResources {
|
||||
const opts = Array.isArray(authorOptVal) ? authorOptVal as string[] : [authorOptVal] as string[];
|
||||
textResult = opts.some(x => x.trim().toLowerCase() === text.trim().toLowerCase())
|
||||
}
|
||||
propResultsMap.flairText!.passed = !((include && !textResult) || (!include && textResult));
|
||||
propResultsMap.flairText!.passed = criteriaPassWithIncludeBehavior(textResult, include);
|
||||
if (!propResultsMap.flairText!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2526,7 +2651,7 @@ export class SubredditResources {
|
||||
templateResult = opts.some(x => x.trim() === templateId);
|
||||
}
|
||||
|
||||
propResultsMap.flairTemplate!.passed = !((include && !templateResult) || (!include && templateResult));
|
||||
propResultsMap.flairTemplate!.passed = criteriaPassWithIncludeBehavior(templateResult, include);
|
||||
if (!propResultsMap.flairTemplate!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2536,7 +2661,7 @@ export class SubredditResources {
|
||||
const isModerator = mods.some(x => x.name === authorName) || authorName.toLowerCase() === 'automoderator';
|
||||
const modMatch = authorOptVal === isModerator;
|
||||
propResultsMap.isMod!.found = isModerator;
|
||||
propResultsMap.isMod!.passed = !((include && !modMatch) || (!include && modMatch));
|
||||
propResultsMap.isMod!.passed = criteriaPassWithIncludeBehavior(modMatch, include);
|
||||
if (!propResultsMap.isMod!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2546,7 +2671,7 @@ export class SubredditResources {
|
||||
const isContributor= contributors.some(x => x.name === authorName);
|
||||
const contributorMatch = authorOptVal === isContributor;
|
||||
propResultsMap.isContributor!.found = isContributor;
|
||||
propResultsMap.isContributor!.passed = !((include && !contributorMatch) || (!include && contributorMatch));
|
||||
propResultsMap.isContributor!.passed = criteriaPassWithIncludeBehavior(contributorMatch, include);
|
||||
if (!propResultsMap.isContributor!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2556,7 +2681,7 @@ export class SubredditResources {
|
||||
const authorAge = dayjs.unix((await user()).created);
|
||||
const ageTest = compareDurationValue(parseDurationComparison(await authorOpts.age as string), authorAge);
|
||||
propResultsMap.age!.found = authorAge.fromNow(true);
|
||||
propResultsMap.age!.passed = !((include && !ageTest) || (!include && ageTest));
|
||||
propResultsMap.age!.passed = criteriaPassWithIncludeBehavior(ageTest, include);
|
||||
if (!propResultsMap.age!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2573,7 +2698,7 @@ export class SubredditResources {
|
||||
lkMatch = comparisonTextOp(item.author.link_karma, lkCompare.operator, lkCompare.value);
|
||||
}
|
||||
propResultsMap.linkKarma!.found = tk;
|
||||
propResultsMap.linkKarma!.passed = !((include && !lkMatch) || (!include && lkMatch));
|
||||
propResultsMap.linkKarma!.passed = criteriaPassWithIncludeBehavior(lkMatch, include);
|
||||
if (!propResultsMap.linkKarma!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2589,7 +2714,7 @@ export class SubredditResources {
|
||||
ckMatch = comparisonTextOp(item.author.comment_karma, ckCompare.operator, ckCompare.value);
|
||||
}
|
||||
propResultsMap.commentKarma!.found = ck;
|
||||
propResultsMap.commentKarma!.passed = !((include && !ckMatch) || (!include && ckMatch));
|
||||
propResultsMap.commentKarma!.passed = criteriaPassWithIncludeBehavior(ckMatch, include);
|
||||
if (!propResultsMap.commentKarma!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2603,7 +2728,7 @@ export class SubredditResources {
|
||||
}
|
||||
const tkMatch = comparisonTextOp(totalKarma, tkCompare.operator, tkCompare.value);
|
||||
propResultsMap.totalKarma!.found = totalKarma;
|
||||
propResultsMap.totalKarma!.passed = !((include && !tkMatch) || (!include && tkMatch));
|
||||
propResultsMap.totalKarma!.passed = criteriaPassWithIncludeBehavior(tkMatch, include);
|
||||
if (!propResultsMap.totalKarma!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2613,7 +2738,7 @@ export class SubredditResources {
|
||||
const verified = (await user()).has_verified_mail;
|
||||
const vMatch = verified === authorOpts.verified as boolean;
|
||||
propResultsMap.verified!.found = verified;
|
||||
propResultsMap.verified!.passed = !((include && !vMatch) || (!include && vMatch));
|
||||
propResultsMap.verified!.passed = criteriaPassWithIncludeBehavior(vMatch, include);
|
||||
if (!propResultsMap.verified!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
@@ -2639,7 +2764,7 @@ export class SubredditResources {
|
||||
}
|
||||
}
|
||||
propResultsMap.description!.found = typeof desc === 'string' ? truncateStringToLength(50)(desc) : desc;
|
||||
propResultsMap.description!.passed = !((include && !passed) || (!include && passed));
|
||||
propResultsMap.description!.passed = criteriaPassWithIncludeBehavior(passed, include);
|
||||
if (!propResultsMap.description!.passed) {
|
||||
shouldContinue = false;
|
||||
} else {
|
||||
@@ -2656,8 +2781,10 @@ export class SubredditResources {
|
||||
value,
|
||||
operator,
|
||||
isPercent,
|
||||
duration,
|
||||
extra = ''
|
||||
} = parseGenericValueOrPercentComparison(count);
|
||||
const cutoffDate = duration === undefined ? undefined : dayjs().subtract(duration);
|
||||
const order = extra.includes('asc') ? 'ascending' : 'descending';
|
||||
switch (search) {
|
||||
case 'current':
|
||||
@@ -2672,29 +2799,32 @@ export class SubredditResources {
|
||||
}
|
||||
break;
|
||||
case 'consecutive':
|
||||
let orderedNotes = notes;
|
||||
if (isPercent) {
|
||||
throw new SimpleError(`When comparing UserNotes with 'consecutive' search 'count' cannot be a percentage. Given: ${count}`);
|
||||
}
|
||||
|
||||
let orderedNotes = cutoffDate === undefined ? notes : notes.filter(x => x.time.isSameOrAfter(cutoffDate));
|
||||
if (order === 'descending') {
|
||||
orderedNotes = [...notes];
|
||||
orderedNotes.reverse();
|
||||
}
|
||||
let currCount = 0;
|
||||
let maxCount = 0;
|
||||
for (const note of orderedNotes) {
|
||||
if (note.noteType === type) {
|
||||
currCount++;
|
||||
maxCount = Math.max(maxCount, currCount);
|
||||
} else {
|
||||
currCount = 0;
|
||||
}
|
||||
if (isPercent) {
|
||||
throw new SimpleError(`When comparing UserNotes with 'consecutive' search 'count' cannot be a percentage. Given: ${count}`);
|
||||
}
|
||||
foundNoteResult.push(`Found ${currCount} ${type} consecutively`);
|
||||
if (comparisonTextOp(currCount, operator, value)) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
foundNoteResult.push(`Found ${currCount} ${type} consecutively`);
|
||||
if (comparisonTextOp(currCount, operator, value)) {
|
||||
return true;
|
||||
}
|
||||
break;
|
||||
case 'total':
|
||||
const filteredNotes = notes.filter(x => x.noteType === type);
|
||||
const filteredNotes = notes.filter(x => x.noteType === type && cutoffDate === undefined || (x.time.isSameOrAfter(cutoffDate)));
|
||||
if (isPercent) {
|
||||
// avoid divide by zero
|
||||
const percent = notes.length === 0 ? 0 : filteredNotes.length / notes.length;
|
||||
@@ -2715,11 +2845,222 @@ export class SubredditResources {
|
||||
}
|
||||
const noteResult = notePass();
|
||||
propResultsMap.userNotes!.found = foundNoteResult.join(' | ');
|
||||
propResultsMap.userNotes!.passed = !((include && !noteResult) || (!include && noteResult));
|
||||
propResultsMap.userNotes!.passed = criteriaPassWithIncludeBehavior(noteResult, include);
|
||||
if (!propResultsMap.userNotes!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
break;
|
||||
case 'modActions':
|
||||
const modActions = await this.getAuthorModNotesByActivityAuthor(item);
|
||||
// TODO convert these prior to running filter so we don't have to do it every time
|
||||
const actionCriterias = authorOptVal as (ModNoteCriteria | ModLogCriteria)[];
|
||||
let actionResult: string[] = [];
|
||||
|
||||
const actionsPass = () => {
|
||||
|
||||
for (const actionCriteria of actionCriterias) {
|
||||
|
||||
const {search = 'current', count = '>= 1'} = actionCriteria;
|
||||
|
||||
|
||||
const {
|
||||
value,
|
||||
operator,
|
||||
isPercent,
|
||||
duration,
|
||||
extra = ''
|
||||
} = parseGenericValueOrPercentComparison(count);
|
||||
const cutoffDate = duration === undefined ? undefined : dayjs().subtract(duration);
|
||||
|
||||
let actionsToUse: ModNote[] = [];
|
||||
if(asModNoteCriteria(actionCriteria)) {
|
||||
actionsToUse = actionsToUse.filter(x => x.type === 'NOTE');
|
||||
} else {
|
||||
actionsToUse = modActions;
|
||||
}
|
||||
|
||||
if(search === 'current' && actionsToUse.length > 0) {
|
||||
actionsToUse = [actionsToUse[0]];
|
||||
}
|
||||
|
||||
let validActions: ModNote[] = [];
|
||||
if (asModLogCriteria(actionCriteria)) {
|
||||
const fullCrit = toFullModLogCriteria(actionCriteria);
|
||||
const fullCritEntries = Object.entries(fullCrit);
|
||||
validActions = actionsToUse.filter(x => {
|
||||
|
||||
// filter out any notes that occur before time range
|
||||
if(cutoffDate !== undefined && x.createdAt.isBefore(cutoffDate)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
for (const [k, v] of fullCritEntries) {
|
||||
const key = k.toLocaleLowerCase();
|
||||
if (['count', 'search'].includes(key)) {
|
||||
continue;
|
||||
}
|
||||
switch (key) {
|
||||
case 'type':
|
||||
if (!v.includes((x.type as ModActionType))) {
|
||||
return false
|
||||
}
|
||||
break;
|
||||
case 'activitytype':
|
||||
const anyMatch = v.some((a: ActivityType) => {
|
||||
switch (a) {
|
||||
case 'submission':
|
||||
if (x.action.actedOn instanceof Submission) {
|
||||
return true;
|
||||
}
|
||||
break;
|
||||
case 'comment':
|
||||
if (x.action.actedOn instanceof Comment) {
|
||||
return true;
|
||||
}
|
||||
break;
|
||||
}
|
||||
});
|
||||
if (!anyMatch) {
|
||||
return false;
|
||||
}
|
||||
break;
|
||||
case 'description':
|
||||
case 'action':
|
||||
case 'details':
|
||||
const actionPropVal = x.action[key] as string;
|
||||
if (actionPropVal === undefined) {
|
||||
return false;
|
||||
}
|
||||
const anyPropMatch = v.some((y: RegExp) => y.test(actionPropVal));
|
||||
if (!anyPropMatch) {
|
||||
return false;
|
||||
}
|
||||
} // case end
|
||||
|
||||
} // for each end
|
||||
|
||||
return true;
|
||||
}); // filter end
|
||||
} else if(asModNoteCriteria(actionCriteria)) {
|
||||
const fullCrit = toFullModNoteCriteria(actionCriteria as ModNoteCriteria);
|
||||
const fullCritEntries = Object.entries(fullCrit);
|
||||
validActions = actionsToUse.filter(x => {
|
||||
|
||||
// filter out any notes that occur before time range
|
||||
if(cutoffDate !== undefined && x.createdAt.isBefore(cutoffDate)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
for (const [k, v] of fullCritEntries) {
|
||||
const key = k.toLocaleLowerCase();
|
||||
if (['count', 'search'].includes(key)) {
|
||||
continue;
|
||||
}
|
||||
switch (key) {
|
||||
case 'notetype':
|
||||
if (!v.map((x: ModUserNoteLabel) => x.toUpperCase()).includes((x.note.label as ModUserNoteLabel))) {
|
||||
return false
|
||||
}
|
||||
break;
|
||||
case 'note':
|
||||
const actionPropVal = x.note.note;
|
||||
if (actionPropVal === undefined) {
|
||||
return false;
|
||||
}
|
||||
const anyPropMatch = v.some((y: RegExp) => y.test(actionPropVal));
|
||||
if (!anyPropMatch) {
|
||||
return false;
|
||||
}
|
||||
break;
|
||||
case 'activitytype':
|
||||
const anyMatch = v.some((a: ActivityType) => {
|
||||
switch (a) {
|
||||
case 'submission':
|
||||
if (x.action.actedOn instanceof Submission) {
|
||||
return true;
|
||||
}
|
||||
break;
|
||||
case 'comment':
|
||||
if (x.action.actedOn instanceof Comment) {
|
||||
return true;
|
||||
}
|
||||
break;
|
||||
}
|
||||
});
|
||||
if (!anyMatch) {
|
||||
return false;
|
||||
}
|
||||
break;
|
||||
} // case end
|
||||
|
||||
} // for each end
|
||||
|
||||
return true;
|
||||
}); // filter end
|
||||
} else {
|
||||
throw new SimpleError(`Could not determine if a modActions criteria was for Mod Log or Mod Note. Given: ${JSON.stringify(actionCriteria)}`);
|
||||
}
|
||||
|
||||
switch (search) {
|
||||
case 'current':
|
||||
if (validActions.length === 0) {
|
||||
actionResult.push('No Mod Actions present');
|
||||
} else {
|
||||
actionResult.push('Current Action matches criteria');
|
||||
return true;
|
||||
}
|
||||
break;
|
||||
case 'consecutive':
|
||||
if (isPercent) {
|
||||
throw new SimpleError(`When comparing Mod Actions with 'search: consecutive' the 'count' value cannot be a percentage. Given: ${count}`);
|
||||
}
|
||||
const validActionIds = validActions.map(x => x.id);
|
||||
const order = extra.includes('asc') ? 'ascending' : 'descending';
|
||||
let orderedActions = actionsToUse;
|
||||
if(order === 'descending') {
|
||||
orderedActions = [...actionsToUse];
|
||||
orderedActions.reverse();
|
||||
}
|
||||
let currCount = 0;
|
||||
let maxCount = 0;
|
||||
for(const action of orderedActions) {
|
||||
if(validActionIds.includes(action.id)) {
|
||||
currCount++;
|
||||
maxCount = Math.max(maxCount, currCount);
|
||||
} else {
|
||||
currCount = 0;
|
||||
}
|
||||
}
|
||||
actionResult.push(`Found maximum of ${maxCount} consecutive Mod Actions that matched criteria`);
|
||||
if (comparisonTextOp(currCount, operator, value)) {
|
||||
return true;
|
||||
}
|
||||
break;
|
||||
case 'total':
|
||||
if (isPercent) {
|
||||
// avoid divide by zero
|
||||
const percent = notes.length === 0 ? 0 : validActions.length / actionsToUse.length;
|
||||
actionResult.push(`${formatNumber(percent)}% of ${actionsToUse.length} matched criteria`);
|
||||
if (comparisonTextOp(percent, operator, value / 100)) {
|
||||
return true;
|
||||
}
|
||||
} else {
|
||||
actionResult.push(`${validActions.length} matched criteria`);
|
||||
if (comparisonTextOp(validActions.length, operator, value)) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
}
|
||||
} // criteria for loop ends
|
||||
return false;
|
||||
}
|
||||
const actionsResult = actionsPass();
|
||||
propResultsMap.modActions!.found = actionResult.join(' | ');
|
||||
propResultsMap.modActions!.passed = criteriaPassWithIncludeBehavior(actionsResult, include);
|
||||
if (!propResultsMap.modActions!.passed) {
|
||||
shouldContinue = false;
|
||||
}
|
||||
break;
|
||||
}
|
||||
//}
|
||||
if (!shouldContinue) {
|
||||
@@ -2728,12 +3069,11 @@ export class SubredditResources {
|
||||
}
|
||||
} catch (err: any) {
|
||||
if (isStatusError(err) && err.statusCode === 404) {
|
||||
throw new SimpleError('Reddit returned a 404 while trying to retrieve User profile. It is likely this user is shadowbanned.', {isSerious: false});
|
||||
throw new SimpleError('Reddit returned a 404 while trying to retrieve User profile. It is likely this user is shadowbanned.', {isSerious: false, code: 404});
|
||||
} else {
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// gather values and determine overall passed
|
||||
const propResults = Object.values(propResultsMap);
|
||||
@@ -2893,6 +3233,7 @@ export class BotResourcesManager {
|
||||
submissionTTL,
|
||||
subredditTTL,
|
||||
filterCriteriaTTL,
|
||||
modNotesTTL,
|
||||
selfTTL,
|
||||
provider,
|
||||
actionedEventsMax,
|
||||
@@ -2915,7 +3256,7 @@ export class BotResourcesManager {
|
||||
this.defaultCacheConfig = caching;
|
||||
this.defaultThirdPartyCredentials = thirdParty;
|
||||
this.defaultDatabase = database;
|
||||
this.ttlDefaults = {authorTTL, userNotesTTL, wikiTTL, commentTTL, submissionTTL, filterCriteriaTTL, subredditTTL, selfTTL};
|
||||
this.ttlDefaults = {authorTTL, userNotesTTL, wikiTTL, commentTTL, submissionTTL, filterCriteriaTTL, subredditTTL, selfTTL, modNotesTTL};
|
||||
this.botName = name as string;
|
||||
this.logger = logger;
|
||||
this.invokeeRepo = this.defaultDatabase.getRepository(InvokeeType);
|
||||
|
||||
@@ -1,6 +1,9 @@
import Snoowrap, {Listing} from "snoowrap";
import {Subreddit} from "snoowrap/dist/objects";
import Snoowrap, {Listing, RedditUser} from "snoowrap";
import {Submission, Subreddit, Comment} from "snoowrap/dist/objects";
import {parseSubredditName} from "../util";
import {ModUserNoteLabel} from "../Common/Infrastructure/Atomic";
import {CreateModNoteData, ModNote, ModNoteRaw, ModNoteSnoowrapPopulated} from "../Subreddit/ModNotes/ModNote";
import {SimpleError} from "./Errors";

// const proxyFactory = (endpoint: string) => {
// return class ProxiedSnoowrap extends Snoowrap {
@@ -14,6 +17,26 @@ import {parseSubredditName} from "../util";
// }
// }

export interface ModNoteGetOptions {
before?: string,
filter?: ModUserNoteLabel,
limit?: number
}

export interface ModNotesRaw {
mod_notes: ModNoteSnoowrapPopulated[]
start_cursor: string
end_cursor: string
has_next_page: boolean
}

export interface ModNotesResponse {
notes: ModNote[]
startCursor: string
endCursor: string
isFinished: boolean
}

export class ExtendedSnoowrap extends Snoowrap {

constructor(args: any) {
@@ -53,6 +76,70 @@ export class ExtendedSnoowrap extends Snoowrap {
}
});
}

async getModNotes(subreddit: Subreddit | string, user: RedditUser | string, options: ModNoteGetOptions = {limit: 100}): Promise<ModNotesResponse> {

const authorName = typeof user === 'string' ? user : user.name;
if(authorName === '[deleted]') {
throw new SimpleError(`User is '[deleted]', cannot retrieve`, {isSerious: false});
}
const subredditName = typeof subreddit === 'string' ? subreddit : subreddit.display_name;

const data: any = {
subreddit: subredditName,
user: authorName,
...options
};
const response = await this.oauthRequest({
uri: `/api/mod/notes`,
method: 'get',
qs: data
}) as ModNotesRaw;

// TODO get all mod notes (iterate pages if has_next_page)
return {

// "undo" the _populate function snoowrap uses to replace user/subreddit keys with Proxies
// because we want to store the "raw" response data when caching (where user/subreddit keys are strings) so we can construct ModNote from either api response or cache using same data
notes: response.mod_notes.map(x => {
return new ModNote({
...x,
subreddit: x.subreddit.display_name,
user: x.user.name,
}, this);

}),
startCursor: response.start_cursor,
endCursor: response.end_cursor,
isFinished: !response.has_next_page
}
}

/**
* Add a Mod Note
*
* @see https://www.reddit.com/dev/api#POST_api_mod_notes
* */
async addModNote(data: CreateModNoteData): Promise<ModNote> {
const {note, label} = data;

const requestData: any = {
note,
label,
subreddit: data.subreddit.display_name,
user: data.user.name,
}
if(data.activity !== undefined) {
requestData.reddit_id = data.activity.id;
}

const response = await this.oauthRequest({
uri: `/api/mod/notes`,
method: 'post',
form: requestData
}) as { created: ModNoteRaw };
return new ModNote(response.created, this);
}
}
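
A hypothetical usage sketch for the two endpoints added above. Everything below is a placeholder (credentials, subreddit and user names, import path), and the label string assumes `ModUserNoteLabel` accepts Reddit's uppercase note labels; it is not part of this changeset.

```typescript
import {ExtendedSnoowrap} from "./Utils/SnoowrapClients";

const client = new ExtendedSnoowrap({
    userAgent: 'script:modnote-example:v1 (by /u/someMod)', // placeholder credentials
    clientId: 'xxx',
    clientSecret: 'xxx',
    refreshToken: 'xxx',
});

const run = async () => {
    // fetch the most recent notes for an author (paging is not implemented yet, per the TODO above)
    const {notes, isFinished} = await client.getModNotes('mySubreddit', 'someUser', {limit: 100});
    console.log(`Fetched ${notes.length} notes (finished fetching all: ${isFinished})`);

    // create a new note; the label mirrors Reddit's allowed note labels
    await client.addModNote({
        subreddit: client.getSubreddit('mySubreddit'),
        user: client.getUser('someUser'),
        label: 'SPAM_WATCH',
        note: 'added from the usage sketch',
    });
};

run().catch(console.error);
```
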
export class RequestTrackingSnoowrap extends ExtendedSnoowrap {
|
||||
|
||||
@@ -17,7 +17,7 @@ import {
|
||||
convertSubredditsRawToStrong,
|
||||
getActivityAuthorName,
|
||||
getActivitySubredditName,
|
||||
isStrongSubredditState,
|
||||
isStrongSubredditState, isSubmission,
|
||||
mergeArr,
|
||||
normalizeName,
|
||||
parseDurationValToDuration,
|
||||
@@ -344,7 +344,7 @@ export const getAttributionIdentifier = (sub: Submission, useParentMediaDomain =

export const activityIsRemoved = (item: Submission | Comment): boolean => {
if(item.can_mod_post) {
if (item instanceof Submission) {
if (asSubmission(item)) {
// when automod filters a post it gets this category
return item.banned_at_utc !== null && item.removed_by_category !== 'automod_filtered';
}
@@ -352,7 +352,7 @@ export const activityIsRemoved = (item: Submission | Comment): boolean => {
// so if we want to processing filtered comments we need to check for this
return item.banned_at_utc !== null && item.removed;
} else {
if (item instanceof Submission) {
if (asSubmission(item)) {
return item.removed_by_category === 'moderator' || item.removed_by_category === 'deleted';
}
// in subreddits the bot does not mod it is not possible to tell the difference between a comment that was removed by the user and one that was removed by a mod
@@ -362,7 +362,7 @@ export const activityIsRemoved = (item: Submission | Comment): boolean => {

export const activityIsFiltered = (item: Submission | Comment): boolean => {
if(item.can_mod_post) {
if (item instanceof Submission) {
if (asSubmission(item)) {
// when automod filters a post it gets this category
return item.banned_at_utc !== null && item.removed_by_category === 'automod_filtered';
}
@@ -375,7 +375,7 @@ export const activityIsFiltered = (item: Submission | Comment): boolean => {
}

export const activityIsDeleted = (item: Submission | Comment): boolean => {
if (item instanceof Submission) {
if (asSubmission(item)) {
return item.removed_by_category === 'deleted';
}
return item.author.name === '[deleted]'
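
The hunks above swap `instanceof` checks for the duck-typed `asSubmission` guard, since Activities revived from cache are plain objects that fail `instanceof`. A rough sketch of what such a structural guard can look like, keyed on Reddit fullname prefixes (`t3_` for submissions, `t1_` for comments); this is illustrative only and the real helpers in `src/util` may test differently.

```typescript
import {Comment, Submission} from "snoowrap/dist/objects";

// Illustrative structural guards: cached activities are deserialized as plain
// objects, so `instanceof Submission` is false even when the shape is right.
// Reddit fullnames encode the kind: "t3_" = submission/link, "t1_" = comment.
export const asSubmissionSketch = (value: any): value is Submission =>
    value !== null && typeof value === 'object'
    && typeof value.name === 'string' && value.name.startsWith('t3_');

export const asCommentSketch = (value: any): value is Comment =>
    value !== null && typeof value === 'object'
    && typeof value.name === 'string' && value.name.startsWith('t1_');
```
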
@@ -3,6 +3,7 @@ import {MysqlConnectionOptions} from "typeorm/driver/mysql/MysqlConnectionOption
|
||||
import {MongoConnectionOptions} from "typeorm/driver/mongodb/MongoConnectionOptions";
|
||||
import {PostgresConnectionOptions} from "typeorm/driver/postgres/PostgresConnectionOptions";
|
||||
import {resolve, parse as parsePath} from 'path';
|
||||
// https://stackoverflow.com/questions/49618719/why-does-typeorm-need-reflect-metadata
|
||||
import "reflect-metadata";
|
||||
import {DataSource} from "typeorm";
|
||||
import {castToBool, fileOrDirectoryIsWriteable, mergeArr, resolvePath} from "../util";
|
||||
|
||||
@@ -88,7 +88,8 @@ export class CacheStorageProvider extends StorageProvider {
|
||||
|
||||
constructor(caching: CacheOptions & StorageProviderOptions) {
|
||||
super(caching);
|
||||
this.cache = createCacheManager({...caching, prefix: buildCachePrefix(['web'])}) as Cache;
|
||||
const {logger, invitesMaxAge, loggerLabels, ...restCache } = caching;
|
||||
this.cache = createCacheManager({...restCache, prefix: buildCachePrefix(['web'])}) as Cache;
|
||||
this.logger.debug('Using CACHE');
|
||||
if (caching.store === 'none') {
|
||||
this.logger.warn(`Using 'none' as cache provider means no one will be able to access the interface since sessions will never be persisted!`);
|
||||
|
||||
@@ -29,8 +29,6 @@ import session, {Session, SessionData} from "express-session";
|
||||
import Snoowrap, {Subreddit} from "snoowrap";
|
||||
import {getLogger} from "../../Utils/loggerFactory";
|
||||
import EventEmitter from "events";
|
||||
import stream, {Readable, Writable, Transform} from "stream";
|
||||
import winston from "winston";
|
||||
import tcpUsed from "tcp-port-used";
|
||||
import http from "http";
|
||||
import jwt from 'jsonwebtoken';
|
||||
@@ -39,13 +37,6 @@ import got from 'got';
|
||||
import sharedSession from "express-socket.io-session";
|
||||
import dayjs from "dayjs";
|
||||
import httpProxy from 'http-proxy';
|
||||
import normalizeUrl from 'normalize-url';
|
||||
import GotRequest from "got/dist/source/core";
|
||||
import {prettyPrintJson} from "pretty-print-json";
|
||||
// @ts-ignore
|
||||
import DelimiterStream from 'delimiter-stream';
|
||||
import {pipeline} from 'stream/promises';
|
||||
import {defaultBotStatus} from "../Common/defaults";
|
||||
import {arrayMiddle, booleanMiddle} from "../Common/middleware";
|
||||
import {BotInstance, CMInstanceInterface} from "../interfaces";
|
||||
import { URL } from "url";
|
||||
@@ -54,8 +45,6 @@ import Autolinker from "autolinker";
|
||||
import path from "path";
|
||||
import {ExtendedSnoowrap} from "../../Utils/SnoowrapClients";
|
||||
import ClientUser from "../Common/User/ClientUser";
|
||||
import {BotStatusResponse, InviteData} from "../Common/interfaces";
|
||||
import {TransformableInfo} from "logform";
|
||||
import {SimpleError} from "../../Utils/Errors";
|
||||
import {ErrorWithCause} from "pony-cause";
|
||||
import {CMInstance} from "./CMInstance";
|
||||
@@ -65,8 +54,6 @@ import { ActionPremise } from "../../Common/Entities/ActionPremise";
|
||||
import {CacheStorageProvider, DatabaseStorageProvider} from "./StorageProvider";
|
||||
import {nanoid} from "nanoid";
|
||||
import {MigrationService} from "../../Common/MigrationService";
|
||||
import {WebSetting} from "../../Common/WebEntities/WebSetting";
|
||||
import {CheckResultEntity} from "../../Common/Entities/CheckResultEntity";
|
||||
import {RuleResultEntity} from "../../Common/Entities/RuleResultEntity";
|
||||
import {RuleSetResultEntity} from "../../Common/Entities/RuleSetResultEntity";
|
||||
import { PaginationAwareObject } from "../Common/util";
|
||||
@@ -95,13 +82,26 @@ app.use((req, res, next) => {
|
||||
}
|
||||
});
|
||||
|
||||
const staticHeaders = (res: express.Response, path: string, stat: object) => {
|
||||
res.setHeader('X-Robots-Tag', 'noindex');
|
||||
}
|
||||
const staticOpts = {
|
||||
setHeaders: staticHeaders
|
||||
}
|
||||
|
||||
app.use(bodyParser.urlencoded({extended: false}));
|
||||
//app.use(cookieParser());
|
||||
app.set('views', `${__dirname}/../assets/views`);
|
||||
app.set('view engine', 'ejs');
|
||||
app.use('/public', express.static(`${__dirname}/../assets/public`));
|
||||
app.use('/monaco', express.static(`${__dirname}/../../../node_modules/monaco-editor/`));
|
||||
app.use('/schemas', express.static(`${__dirname}/../../Schema/`));
|
||||
app.use('/public', express.static(`${__dirname}/../assets/public`, staticOpts));
|
||||
app.use('/monaco', express.static(`${__dirname}/../../../node_modules/monaco-editor/`, staticOpts));
|
||||
app.use('/schemas', express.static(`${__dirname}/../../Schema/`, staticOpts));
|
||||
|
||||
app.use((req, res, next) => {
|
||||
// https://developers.google.com/search/docs/advanced/crawling/block-indexing#http-response-header
|
||||
res.setHeader('X-Robots-Tag', 'noindex');
|
||||
next();
|
||||
});
|
||||
|
||||
const userAgent = `web:contextBot:web`;
|
||||
|
||||
@@ -606,106 +606,10 @@ const webClient = async (options: OperatorConfig) => {
|
||||
const cmInstances: CMInstance[] = [];
|
||||
let init = false;
|
||||
const formatter = defaultFormat();
|
||||
const formatTransform = formatter.transform as (info: TransformableInfo, opts?: any) => TransformableInfo;
|
||||
|
||||
let server: http.Server,
|
||||
io: SocketServer;
|
||||
|
||||
const startLogStream = (sessionData: Session & Partial<SessionData>, user: Express.User) => {
|
||||
// @ts-ignore
|
||||
const sessionId = sessionData.id as string;
|
||||
|
||||
if(connectedUsers[sessionId] !== undefined) {
|
||||
|
||||
const delim = new DelimiterStream({
|
||||
delimiter: '\r\n',
|
||||
});
|
||||
|
||||
const currInstance = cmInstances.find(x => x.getName() === sessionData.botId);
|
||||
if(currInstance !== undefined) {
|
||||
const ac = new AbortController();
|
||||
const options = {
|
||||
signal: ac.signal,
|
||||
};
|
||||
|
||||
const retryFn = (retryCount = 0, err: any = undefined) => {
|
||||
const delim = new DelimiterStream({
|
||||
delimiter: '\r\n',
|
||||
});
|
||||
|
||||
if(err !== undefined) {
|
||||
// @ts-ignore
|
||||
currInstance.logger.warn(new ErrorWithCause(`Log streaming encountered an error, trying to reconnect (retries: ${retryCount})`, {cause: err}), {user: user.name});
|
||||
}
|
||||
const gotStream = got.stream.get(`${currInstance.normalUrl}/logs`, {
|
||||
retry: {
|
||||
limit: 5,
|
||||
},
|
||||
headers: {
|
||||
'Authorization': `Bearer ${createToken(currInstance, user)}`,
|
||||
},
|
||||
searchParams: {
|
||||
limit: sessionData.limit,
|
||||
sort: sessionData.sort,
|
||||
level: sessionData.level,
|
||||
stream: true,
|
||||
streamObjects: true,
|
||||
formatted: false,
|
||||
}
|
||||
});
|
||||
|
||||
if(err !== undefined) {
|
||||
gotStream.once('data', () => {
|
||||
currInstance.logger.info('Streaming resumed', {instance: currInstance.getName(), user: user.name});
|
||||
});
|
||||
}
|
||||
|
||||
gotStream.retryCount = retryCount;
|
||||
const s = pipeline(
|
||||
gotStream,
|
||||
delim,
|
||||
options
|
||||
) as Promise<void>;
|
||||
|
||||
// ECONNRESET
|
||||
s.catch((err) => {
|
||||
if(err.code !== 'ABORT_ERR' && err.code !== 'ERR_STREAM_PREMATURE_CLOSE') {
|
||||
// @ts-ignore
|
||||
currInstance.logger.error(new ErrorWithCause('Unexpected error, or too many retries, occurred while streaming logs', {cause: err}), {user: user.name});
|
||||
}
|
||||
});
|
||||
|
||||
|
||||
delim.on('data', (c: any) => {
|
||||
const logObj = JSON.parse(c) as LogInfo;
|
||||
let subredditMessage;
|
||||
let allMessage;
|
||||
if(logObj.subreddit !== undefined) {
|
||||
const {subreddit, bot, ...rest} = logObj
|
||||
// @ts-ignore
|
||||
subredditMessage = formatLogLineToHtml(formatter.transform(rest)[MESSAGE], rest.timestamp);
|
||||
}
|
||||
if(logObj.bot !== undefined) {
|
||||
const {bot, ...rest} = logObj
|
||||
// @ts-ignore
|
||||
allMessage = formatLogLineToHtml(formatter.transform(rest)[MESSAGE], rest.timestamp);
|
||||
}
|
||||
// @ts-ignore
|
||||
let formattedMessage = formatLogLineToHtml(formatter.transform(logObj)[MESSAGE], logObj.timestamp);
|
||||
io.to(sessionId).emit('log', {...logObj, subredditMessage, allMessage, formattedMessage});
|
||||
});
|
||||
|
||||
gotStream.once('retry', retryFn);
|
||||
}
|
||||
|
||||
retryFn();
|
||||
|
||||
return ac;
|
||||
}
|
||||
return undefined;
|
||||
}
|
||||
}
|
||||
|
||||
try {
|
||||
server = await app.listen(port);
|
||||
io = new SocketServer(server);
|
||||
@@ -965,29 +869,7 @@ const webClient = async (options: OperatorConfig) => {
|
||||
|
||||
res.render('status', {
|
||||
instances: shownInstances,
|
||||
bots: resp.bots.map((x: BotStatusResponse) => {
|
||||
const {subreddits = []} = x;
|
||||
const subredditsWithSimpleLogs = subreddits.map(y => {
|
||||
let transformedLogs: string[];
|
||||
if(y.name === 'All') {
|
||||
// only need to remove bot name here
|
||||
transformedLogs = (y.logs as LogInfo[]).map((z: LogInfo) => {
|
||||
const {bot, ...rest} = z;
|
||||
// @ts-ignore
|
||||
return formatLogLineToHtml(formatter.transform(rest)[MESSAGE] as string, rest.timestamp);
|
||||
});
|
||||
} else {
|
||||
transformedLogs = (y.logs as LogInfo[]).map((z: LogInfo) => {
|
||||
const {bot, subreddit, ...rest} = z;
|
||||
// @ts-ignore
|
||||
return formatLogLineToHtml(formatter.transform(rest)[MESSAGE] as string, rest.timestamp);
|
||||
});
|
||||
}
|
||||
y.logs = transformedLogs;
|
||||
return y;
|
||||
});
|
||||
return {...x, subreddits: subredditsWithSimpleLogs};
|
||||
}),
|
||||
bots: resp.bots,
|
||||
botId: (req.instance as CMInstance).getName(),
|
||||
instanceId: (req.instance as CMInstance).getName(),
|
||||
isOperator: isOp,
|
||||
@@ -1022,7 +904,7 @@ const webClient = async (options: OperatorConfig) => {
|
||||
|
||||
app.postAsync('/config', [ensureAuthenticatedApi, defaultSession, instanceWithPermissions, botWithPermissions(true)], async (req: express.Request, res: express.Response) => {
|
||||
const {subreddit} = req.query as any;
|
||||
const {location, data, create = false} = req.body as any;
|
||||
const {location, data, reason = 'Updated through CM Web', create = false} = req.body as any;
|
||||
|
||||
const client = new ExtendedSnoowrap({
|
||||
userAgent,
|
||||
@@ -1036,7 +918,7 @@ const webClient = async (options: OperatorConfig) => {
|
||||
const wiki = await client.getSubreddit(subreddit).getWikiPage(location);
|
||||
await wiki.edit({
|
||||
text: data,
|
||||
reason: create ? 'Created Config through CM Web' : 'Updated through CM Web'
|
||||
reason
|
||||
});
|
||||
} catch (err: any) {
|
||||
res.status(500);
|
||||
@@ -1398,42 +1280,6 @@ const webClient = async (options: OperatorConfig) => {
|
||||
clearSockStreams(socket.id);
|
||||
socket.join(session.id);
|
||||
|
||||
socket.on('viewing', (data) => {
|
||||
if(user !== undefined) {
|
||||
const {subreddit, bot: botVal} = data;
|
||||
const currBot = cmInstances.find(x => x.getName() === session.botId);
|
||||
if(currBot !== undefined) {
|
||||
|
||||
if(liveInterval !== undefined) {
|
||||
clearInterval(liveInterval)
|
||||
}
|
||||
|
||||
const liveEmit = async () => {
|
||||
try {
|
||||
const resp = await got.get(`${currBot.normalUrl}/liveStats`, {
|
||||
headers: {
|
||||
'Authorization': `Bearer ${createToken(currBot, user)}`,
|
||||
},
|
||||
searchParams: {
|
||||
bot: botVal,
|
||||
subreddit
|
||||
}
|
||||
});
|
||||
const stats = JSON.parse(resp.body);
|
||||
io.to(session.id).emit('liveStats', stats);
|
||||
} catch (err: any) {
|
||||
currBot.logger.error(new ErrorWithCause('Could not retrieve live stats', {cause: err}));
|
||||
}
|
||||
}
|
||||
|
||||
// do an initial get
|
||||
liveEmit();
|
||||
// and then every 5 seconds after that
|
||||
liveInterval = setInterval(async () => await liveEmit(), 5000);
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
if(session.botId !== undefined) {
|
||||
const bot = cmInstances.find(x => x.getName() === session.botId);
|
||||
if(bot !== undefined) {
|
||||
@@ -1456,12 +1302,8 @@ const webClient = async (options: OperatorConfig) => {
|
||||
|
||||
// only setup streams if the user can actually access them (not just a web operator)
|
||||
if(session.authBotId !== undefined) {
|
||||
// streaming logs and stats from client
|
||||
// streaming stats from client
|
||||
const newStreams: (AbortController | NodeJS.Timeout)[] = [];
|
||||
const ac = startLogStream(session, user);
|
||||
if(ac !== undefined) {
|
||||
newStreams.push(ac);
|
||||
}
|
||||
const interval = setInterval(async () => {
|
||||
try {
|
||||
const resp = await got.get(`${bot.normalUrl}/stats`, {
|
||||
|
||||
@@ -31,13 +31,13 @@ const addBot = () => {
const {
bots: botsFromConfig = []
} = req.botApp.fileConfig.document.toJS();
if(botsFromConfig.length === 0 || botsFromConfig.some(x => x.name !== botData.name)) {
if (botsFromConfig.length === 0 || !botsFromConfig.some(x => x.name === botData.name)) {
req.botApp.logger.warn('Overwriting existing bot with the same name BUT this bot does not exist in the operator CONFIG FILE. You should check how you have provided config before next start or else this bot may be started twice (once from file, once from arg/env)');

}

await existingBot.destroy('system');
req.botApp.bots.filter(x => x.botAccount !== botData.name);
const existingBotIndex = req.botApp.bots.findIndex(x => x.botAccount === botData.name);
req.botApp.bots.splice(existingBotIndex, 1);
}
|
||||
req.botApp.fileConfig.document.addBot(botData);
|
||||
@@ -55,13 +55,8 @@ const addBot = () => {
|
||||
return res.status(500).json(result);
|
||||
}
|
||||
await newBot.testClient();
|
||||
await newBot.buildManagers();
|
||||
newBot.runManagers('system').catch((err) => {
|
||||
req.botApp.logger.error(`Unexpected error occurred while running Bot ${newBot.botName}. Bot must be re-built to restart`);
|
||||
if (!err.logged || !(err instanceof LoggedError)) {
|
||||
req.botApp.logger.error(err);
|
||||
}
|
||||
});
|
||||
// return response early so client doesn't have to wait for all managers to be built
|
||||
res.json(result);
|
||||
} catch (err: any) {
|
||||
result.success = false;
|
||||
if (newBot.error === undefined) {
|
||||
@@ -73,7 +68,22 @@ const addBot = () => {
|
||||
req.botApp.logger.error(err);
|
||||
}
|
||||
}
|
||||
return res.json(result);
|
||||
|
||||
try {
|
||||
await newBot.buildManagers();
|
||||
newBot.runManagers('system').catch((err) => {
|
||||
req.botApp.logger.error(`Unexpected error occurred while running Bot ${newBot.botName}. Bot must be re-built to restart`);
|
||||
if (!err.logged || !(err instanceof LoggedError)) {
|
||||
req.botApp.logger.error(err);
|
||||
}
|
||||
});
|
||||
} catch (err: any) {
|
||||
req.botApp.logger.error(`Bot ${newBot.botName} cannot recover from this error and must be re-built`);
|
||||
if (!err.logged || !(err instanceof LoggedError)) {
|
||||
req.botApp.logger.error(err);
|
||||
}
|
||||
}
|
||||
return;
|
||||
}
|
||||
return [...middleware, response];
|
||||
}
|
||||
|
||||
@@ -136,6 +136,9 @@ const liveStats = () => {
|
||||
acc[curr].missPercent = `${formatNumber(per, {toFixed: 0})}%`;
|
||||
acc[curr].identifierAverageHit = formatNumber(acc[curr].identifierAverageHit);
|
||||
acc[curr].averageTimeBetweenHits = formatNumber(acc[curr].averageTimeBetweenHits)
|
||||
|
||||
delete acc[curr].requestTimestamps;
|
||||
delete acc[curr].identifierRequestCount;
|
||||
return acc;
|
||||
}, cumRaw);
|
||||
const cacheReq = subManagerData.reduce((acc, curr) => acc + curr.stats.cache.totalRequests, 0);
|
||||
|
||||
@@ -27,6 +27,15 @@ const logs = () => {
|
||||
booleanMiddle([{
|
||||
name: 'stream',
|
||||
defaultVal: false
|
||||
}, {
|
||||
name: 'formatted',
|
||||
defaultVal: true,
|
||||
}, {
|
||||
name: 'transports',
|
||||
defaultVal: false
|
||||
}, {
|
||||
name: 'streamObjects',
|
||||
defaultVal: false
|
||||
}])
|
||||
];
|
||||
|
||||
@@ -37,20 +46,56 @@ const logs = () => {
|
||||
const userName = req.user?.name as string;
|
||||
const isOperator = req.user?.isInstanceOperator(req.botApp);
|
||||
const realManagers = req.botApp.bots.map(x => req.user?.accessibleSubreddits(x).map(x => x.displayLabel)).flat() as string[];
|
||||
const {level = 'verbose', stream, limit = 200, sort = 'descending', streamObjects = false, formatted = true} = req.query;
|
||||
const {level = 'verbose', stream, limit = 200, sort = 'descending', streamObjects = false, formatted: formattedVal = true, transports: transportsVal = false} = req.query;

const formatted = formattedVal as boolean;
const transports = transportsVal as boolean;

let bots: Bot[] = [];
if(req.serverBot !== undefined) {
bots = [req.serverBot];
} else {
bots = req.user?.accessibleBots(req.botApp.bots) as Bot[];
}

let managers: Manager[] = [];

if(req.manager !== undefined) {
managers = [req.manager];
} else {
for(const b of bots) {
managers = managers.concat(req.user?.accessibleSubreddits(b) as Manager[]);
}
}

//const allReq = req.query.subreddit !== undefined && (req.query.subreddit as string).toLowerCase() === 'all';

if (stream) {

const requestedManagers = managers.map(x => x.displayLabel);
const requestedBots = bots.map(x => x.botName);

const origin = req.header('X-Forwarded-For') ?? req.header('host');
try {
logger.stream().on('log', (log: LogInfo) => {
if (isLogLineMinLevel(log, level as string)) {
const {subreddit: subName, user} = log;
if (isOperator || (subName !== undefined && (realManagers.includes(subName) || (user !== undefined && user.includes(userName))))) {
const {subreddit: subName, bot, user} = log;
let canAccess = false;
if(user !== undefined && user.includes(userName)) {
canAccess = true;
} else if(subName !== undefined || bot !== undefined) {
if(subName === undefined) {
canAccess = requestedBots.includes(bot);
} else {
canAccess = requestedManagers.includes(subName);
}
} else if(isOperator) {
canAccess = true;
}

if (canAccess) {
if(streamObjects) {
let obj: any = log;
if(!formatted) {
const {[MESSAGE]: fMessage, ...rest} = log;
obj = rest;
}
let obj: any = transformLog(log, {formatted, transports});
res.write(`${JSON.stringify(obj)}\r\n`);
} else if(formatted) {
res.write(`${log[MESSAGE]}\r\n`)
@@ -74,12 +119,6 @@ const logs = () => {
res.destroy();
}
} else {
let bots: Bot[] = [];
if(req.serverBot !== undefined) {
bots = [req.serverBot];
} else {
bots = req.user?.accessibleBots(req.botApp.bots) as Bot[];
}

const allReq = req.query.subreddit !== undefined && (req.query.subreddit as string).toLowerCase() === 'all';

@@ -114,15 +153,9 @@ const logs = () => {
botArr.push({
name: b.getBotName(),
system: systemLogs,
all: formatted ? allLogs.map(x => {
const {[MESSAGE]: fMessage, ...rest} = x;
return {...rest, formatted: fMessage};
}) : allLogs,
all: allLogs.map(x => transformLog(x, {formatted, transports })),
subreddits: allReq ? [] : [...managerLogs.entries()].reduce((acc: any[], curr) => {
const l = formatted ? curr[1].map(x => {
const {[MESSAGE]: fMessage, ...rest} = x;
return {...rest, formatted: fMessage};
}) : curr[1];
const l = curr[1].map(x => transformLog(x, {formatted, transports }));
acc.push({name: curr[0], logs: l});
return acc;
}, [])
@@ -135,4 +168,23 @@ const logs = () => {
return [...middleware, response];
}

const transformLog = (obj: LogInfo, options: { formatted: boolean, transports: boolean }) => {
const {
[MESSAGE]: fMessage,
transport,
//@ts-ignore
name, // name is the name of the last transport
...rest
} = obj;
const transformed: any = rest;
if (options.formatted) {
transformed.formatted = fMessage;
}
if (options.transports) {
transformed.transport = transport;
transformed.name = name;
}
return transformed;
}

export default logs;
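Note: `transformLog` above is what shapes each line written to the response when `streamObjects=true`. A minimal sketch of how it behaves on a winston log record; field values here are made up for illustration, and the helper is assumed to be in scope as defined above:

```typescript
// Illustration only: how transformLog (above) reshapes a winston log record.
// MESSAGE is the Symbol that winston/triple-beam attach the fully rendered line to.
import { MESSAGE } from 'triple-beam';

const record: any = {
    level: 'info',
    message: 'Event processed',
    subreddit: 'mealtimevideos',            // hypothetical example values
    transport: ['console', 'stream'],
    name: 'stream',
    [MESSAGE]: '2022-05-01 12:00:00 info   : {mealtimevideos} Event processed'
};

// formatted=true copies the rendered line onto a plain `formatted` property;
// transports=false strips the transport metadata from the payload entirely.
const payload = transformLog(record, { formatted: true, transports: false });
// => { level: 'info', message: 'Event processed', subreddit: 'mealtimevideos', formatted: '2022-05-01 ...' }
```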
@@ -66,13 +66,13 @@ const status = () => {

const subManagerData = [];
for (const m of req.user?.accessibleSubreddits(bot) as Manager[]) {
const logs = req.manager === undefined || allReq || req.manager.getDisplay() === m.getDisplay() ? filterLogs(m.logs, {
level: (level as string),
// @ts-ignore
sort,
limit: limit as string,
returnType: 'object'
}) as LogInfo[]: [];
// const logs = req.manager === undefined || allReq || req.manager.getDisplay() === m.getDisplay() ? filterLogs(m.logs, {
// level: (level as string),
// // @ts-ignore
// sort,
// limit: limit as string,
// returnType: 'object'
// }) as LogInfo[]: [];

let retention = 'Unknown';
if (m.resources !== undefined) {
@@ -89,7 +89,7 @@ const status = () => {
const sd = {
name: m.displayLabel,
//linkName: s.replace(/\W/g, ''),
logs: logs || [], // provide a default empty value in case we truly have not logged anything for this subreddit yet
logs: [], // provide a default empty value in case we truly have not logged anything for this subreddit yet
botState: m.managerState,
eventsState: m.eventsState,
queueState: m.queueState,
@@ -237,14 +237,14 @@ const status = () => {
const sharedSub = subManagerData.find(x => x.stats.cache.isShared);
const sharedCount = sharedSub !== undefined ? sharedSub.stats.cache.currentKeyCount : 0;
const scopes = req.user?.isInstanceOperator(bot) ? bot.client.scope : [];
const allSubLogs = subManagerData.map(x => x.logs).flat().sort(logSortFunc(sort as string)).slice(0, (limit as number) + 1);
const allLogs = filterLogs([...allSubLogs, ...(req.user?.isInstanceOperator(req.botApp) ? bot.logs : bot.logs.filter(x => x.user === req.user?.name))], {
level: (level as string),
// @ts-ignore
sort,
limit: limit as string,
returnType: 'object'
}) as LogInfo[];
//const allSubLogs = subManagerData.map(x => x.logs).flat().sort(logSortFunc(sort as string)).slice(0, (limit as number) + 1);
// const allLogs = filterLogs([...allSubLogs, ...(req.user?.isInstanceOperator(req.botApp) ? bot.logs : bot.logs.filter(x => x.user === req.user?.name))], {
// level: (level as string),
// // @ts-ignore
// sort,
// limit: limit as string,
// returnType: 'object'
// }) as LogInfo[];
let allManagerData: any = {
name: 'All',
status: bot.running ? 'RUNNING' : 'NOT RUNNING',
@@ -261,7 +261,7 @@ const status = () => {
causedBy: SYSTEM
},
dryRun: boolToString(bot.dryRun === true),
logs: allLogs,
logs: [],
checks: checks,
softLimit: bot.softLimit,
hardLimit: bot.hardLimit,
@@ -41,6 +41,12 @@ const server = addAsync(express());
server.use(bodyParser.json());
server.use(bodyParser.urlencoded({extended: false}));

server.use((req, res, next) => {
// https://developers.google.com/search/docs/advanced/crawling/block-indexing#http-response-header
res.setHeader('X-Robots-Tag', 'noindex');
next();
});

declare module 'express-session' {
interface SessionData {
user: string,
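A quick way to verify the new `X-Robots-Tag` middleware. Neither `supertest` nor an exported `app` is shown in this diff, so treat this purely as an assumed test sketch:

```typescript
import request from 'supertest';   // assumed dev dependency, not part of this changeset
import { app } from './server';    // hypothetical export of the express instance above

it('marks every response as non-indexable', async () => {
    // any route will do since the header is set by application-wide middleware
    await request(app)
        .get('/login')
        .expect('X-Robots-Tag', 'noindex');
});
```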
5
src/Web/assets/browser.js
Normal file
@@ -0,0 +1,5 @@
const logform = require('logform');
const tripleBeam = require('triple-beam');

window.format = logform.format;
window.beam = tripleBeam;
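browser.js is the entry that exposes `logform` and `triple-beam` to the page as `window.format` and `window.beam`. The committed `browserBundle.js` below is presumably generated from it by a CommonJS bundler; a sketch using browserify's Node API (an assumption, the actual build step is not part of this diff):

```typescript
// build-browser-bundle.ts -- hypothetical build script, not part of this commit
import browserify from 'browserify';
import { createWriteStream } from 'fs';

browserify('src/Web/assets/browser.js')
    .bundle()   // resolves the require() calls in browser.js above
    .pipe(createWriteStream('src/Web/assets/public/browserBundle.js'));
```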
160
src/Web/assets/public/TextEncoderStream.js
Normal file
@@ -0,0 +1,160 @@
|
||||
// Copyright 2016 Google Inc.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
// Polyfill for TextEncoderStream and TextDecoderStream
|
||||
|
||||
(function() {
|
||||
'use strict';
|
||||
|
||||
if (typeof self.TextEncoderStream === 'function' &&
|
||||
typeof self.TextDecoderStream === 'function') {
|
||||
// The constructors exist. Assume that they work and don't replace them.
|
||||
return;
|
||||
}
|
||||
|
||||
if (typeof self.TextEncoder !== 'function') {
|
||||
throw new ReferenceError('TextEncoder implementation required');
|
||||
}
|
||||
|
||||
if (typeof self.TextDecoder !== 'function') {
|
||||
throw new ReferenceError('TextDecoder implementation required');
|
||||
}
|
||||
|
||||
// These symbols end up being different for every realm, so mixing objects
|
||||
// created in one realm with methods created in another fails.
|
||||
const codec = Symbol('codec');
|
||||
const transform = Symbol('transform');
|
||||
|
||||
class TextEncoderStream {
|
||||
constructor() {
|
||||
this[codec] = new TextEncoder();
|
||||
this[transform] =
|
||||
new TransformStream(new TextEncodeTransformer(this[codec]));
|
||||
}
|
||||
}
|
||||
|
||||
class TextDecoderStream {
|
||||
constructor(label = undefined, options = undefined) {
|
||||
this[codec] = new TextDecoder(label, options);
|
||||
this[transform] = new TransformStream(
|
||||
new TextDecodeTransformer(this[codec]));
|
||||
}
|
||||
}
|
||||
|
||||
// ECMAScript class syntax will create getters that are non-enumerable, but we
|
||||
// need them to be enumerable in WebIDL-style, so we add them manually.
|
||||
// "readable" and "writable" are always delegated to the TransformStream
|
||||
// object. Properties specified in |properties| are delegated to the
|
||||
// underlying TextEncoder or TextDecoder.
|
||||
function addDelegatingProperties(prototype, properties) {
|
||||
for (const transformProperty of ['readable', 'writable']) {
|
||||
addGetter(prototype, transformProperty, function() {
|
||||
return this[transform][transformProperty];
|
||||
});
|
||||
}
|
||||
|
||||
for (const codecProperty of properties) {
|
||||
addGetter(prototype, codecProperty, function() {
|
||||
return this[codec][codecProperty];
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
function addGetter(prototype, property, getter) {
|
||||
Object.defineProperty(prototype, property,
|
||||
{
|
||||
configurable: true,
|
||||
enumerable: true,
|
||||
get: getter
|
||||
});
|
||||
}
|
||||
|
||||
addDelegatingProperties(TextEncoderStream.prototype, ['encoding']);
|
||||
addDelegatingProperties(TextDecoderStream.prototype,
|
||||
['encoding', 'fatal', 'ignoreBOM']);
|
||||
|
||||
class TextEncodeTransformer {
|
||||
constructor() {
|
||||
this._encoder = new TextEncoder();
|
||||
this._carry = undefined;
|
||||
}
|
||||
|
||||
transform(chunk, controller) {
|
||||
chunk = String(chunk);
|
||||
if (this._carry !== undefined) {
|
||||
chunk = this._carry + chunk;
|
||||
this._carry = undefined;
|
||||
}
|
||||
const terminalCodeUnit = chunk.charCodeAt(chunk.length - 1);
|
||||
if (terminalCodeUnit >= 0xD800 && terminalCodeUnit < 0xDC00) {
|
||||
this._carry = chunk.substring(chunk.length - 1);
|
||||
chunk = chunk.substring(0, chunk.length - 1);
|
||||
}
|
||||
const encoded = this._encoder.encode(chunk);
|
||||
if (encoded.length > 0) {
|
||||
controller.enqueue(encoded);
|
||||
}
|
||||
}
|
||||
|
||||
flush(controller) {
|
||||
if (this._carry !== undefined) {
|
||||
controller.enqueue(this._encoder.encode(this._carry));
|
||||
this._carry = undefined;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
class TextDecodeTransformer {
|
||||
constructor(decoder) {
|
||||
this._decoder = new TextDecoder(decoder.encoding, {
|
||||
fatal: decoder.fatal,
|
||||
ignoreBOM: decoder.ignoreBOM
|
||||
});
|
||||
}
|
||||
|
||||
transform(chunk, controller) {
|
||||
const decoded = this._decoder.decode(chunk, {stream: true});
|
||||
if (decoded != '') {
|
||||
controller.enqueue(decoded);
|
||||
}
|
||||
}
|
||||
|
||||
flush(controller) {
|
||||
// If {fatal: false} is in options (the default), then the final call to
|
||||
// decode() can produce extra output (usually the unicode replacement
|
||||
// character 0xFFFD). When fatal is true, this call is just used for its
|
||||
// side-effect of throwing a TypeError exception if the input is
|
||||
// incomplete.
|
||||
var output = this._decoder.decode();
|
||||
if (output !== '') {
|
||||
controller.enqueue(output);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function exportAs(name, value) {
|
||||
// Make it stringify as [object <name>] rather than [object Object].
|
||||
value.prototype[Symbol.toStringTag] = name;
|
||||
Object.defineProperty(self, name,
|
||||
{
|
||||
configurable: true,
|
||||
enumerable: false,
|
||||
writable: true,
|
||||
value
|
||||
});
|
||||
}
|
||||
|
||||
exportAs('TextEncoderStream', TextEncoderStream);
|
||||
exportAs('TextDecoderStream', TextDecoderStream);
|
||||
})();
|
||||
1
src/Web/assets/public/browserBundle.js
Normal file
File diff suppressed because one or more lines are too long
120
src/Web/assets/public/logUtils.js
Normal file
@@ -0,0 +1,120 @@
const SPLAT = window.beam.SPLAT;
const {combine, printf, timestamp, label, splat, errors} = window.format;

window.formattedTime = (short, full) => `<span class="has-tooltip"><span style="margin-top:35px" class='tooltip rounded shadow-lg p-1 bg-gray-100 text-black space-y-3 p-2 text-left'>${full}</span><span>${short}</span></span>`;
window.formatLogLineToHtml = (log, timestamp = undefined) => {
const val = typeof log === 'string' ? log : log[window.beam.MESSAGE];
const logContent = Autolinker.link(val, {
email: false,
phone: false,
mention: false,
hashtag: false,
stripPrefix: false,
sanitizeHtml: true,
})
.replace(/(\s*debug\s*):/i, '<span class="debug blue">$1</span>:')
.replace(/(\s*warn\s*):/i, '<span class="warn yellow">$1</span>:')
.replace(/(\s*info\s*):/i, '<span class="info green">$1</span>:')
.replace(/(\s*error\s*):/i, '<span class="error red">$1</span>:')
.replace(/(\s*verbose\s*):/i, '<span class="error purple">$1</span>:')
.replaceAll('\n', '<br />');
//.replace(HYPERLINK_REGEX, '<a target="_blank" href="$&">$&</a>');
let line;

let timestampString = timestamp;
if(timestamp === undefined && typeof log !== 'string') {
timestampString = log.timestamp;
}

if(timestampString !== undefined) {
const timeStampReplacement = formattedTime(dayjs(timestampString).format('HH:mm:ss z'), timestampString);
const splitLine = logContent.split(timestampString);
line = `<div class="logLine">${splitLine[0]}${timeStampReplacement}<span style="white-space: pre-wrap">${splitLine[1]}</span></div>`;
} else {
line = `<div style="white-space: pre-wrap" class="logLine">${logContent}</div>`
}
return line;
}

window.formatNumber = (val, options) => {
const {
toFixed = 2,
defaultVal = null,
prefix = '',
suffix = '',
round,
} = options || {};
let parsedVal = typeof val === 'number' ? val : Number.parseFloat(val);
if (Number.isNaN(parsedVal)) {
return defaultVal;
}
let prefixStr = prefix;
const {enable = false, indicate = true, type = 'round'} = round || {};
if (enable && !Number.isInteger(parsedVal)) {
switch (type) {
case 'round':
parsedVal = Math.round(parsedVal);
break;
case 'ceil':
parsedVal = Math.ceil(parsedVal);
break;
case 'floor':
parsedVal = Math.floor(parsedVal);
}
if (indicate) {
prefixStr = `~${prefix}`;
}
}
const localeString = parsedVal.toLocaleString(undefined, {
minimumFractionDigits: toFixed,
maximumFractionDigits: toFixed,
});
return `${prefixStr}${localeString}${suffix}`;
};
logFormatter = printf(({
level,
message,
labels = ['App'],
subreddit,
bot,
instance,
leaf,
itemId,
timestamp,
durationMs,
// @ts-ignore
[SPLAT]: splatObj,
stack,
...rest
}) => {
let stringifyValue = splatObj !== undefined ? JSON.stringify(splatObj) : '';
let msg = message;
let stackMsg = '';
if (stack !== undefined) {
const stackArr = stack.split('\n');
const stackTop = stackArr[0];
const cleanedStack = stackArr
.slice(1) // don't need actual error message since we are showing it as msg
.join('\n'); // rejoin with newline to preserve formatting
stackMsg = `\n${cleanedStack}`;
if (msg === undefined || msg === null || typeof message === 'object') {
msg = stackTop;
} else {
stackMsg = `\n${stackTop}${stackMsg}`
}
}

let nodes = labels;
if (leaf !== null && leaf !== undefined && !nodes.includes(leaf)) {
nodes.push(leaf);
}
const labelContent = `${nodes.map((x) => `[${x}]`).join(' ')}`;

return `${timestamp} ${level.padEnd(7)}: ${instance !== undefined ? `|${instance}| ` : ''}${bot !== undefined ? `~${bot}~ ` : ''}${subreddit !== undefined ? `{${subreddit}} ` : ''}${labelContent} ${msg}${durationMs !== undefined ? ` Elapsed: ${durationMs}ms (${window.formatNumber(durationMs/1000)}s) ` : ''}${stringifyValue !== '' ? ` ${stringifyValue}` : ''}${stackMsg}`;
});

window.formatLog = (logObj) => {
const formatted = logFormatter.transform(logObj);
const html = window.formatLogLineToHtml(formatted);
return {...formatted, html};
}
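A sketch of how the helpers in logUtils.js are meant to be chained on the page: a raw log object (as returned by `/api/logs` with `formatted=false`) goes through `formatLog` and the resulting `html` is injected into a log container. Values are illustrative:

```typescript
// Illustration of the logUtils.js helpers above; assumes browserBundle.js has
// already populated window.format / window.beam (see browser.js earlier in this diff).
const raw: any = {
    level: 'info',
    message: 'Checks complete',          // hypothetical example values
    subreddit: 'mealtimevideos',
    bot: 'u/MyCMBot',
    labels: ['Polling'],
    timestamp: '2022-05-01 12:00:00'
};

const rendered = (window as any).formatLog(raw);
// rendered now carries the plain formatted line (under window.beam.MESSAGE)
// plus an `html` property ready to be dropped into the page.
document.querySelector('.logs')?.insertAdjacentHTML('afterbegin', rendered.html);
```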
@@ -57,7 +57,9 @@
</span>
</span>
| <input id="configUrl" class="text-black placeholder-gray-500 rounded mx-2" style="min-width:400px;" placeholder="URL of a config to load"/> <a href="#" id="loadConfig">Load</a>
<span id="saveTip">
<div id="editWrapper" class="my-2">
<label style="display: none" for="reason">Edit Reason</label><input id="reason" class="text-black placeholder-gray-500 rounded mr-2" style="min-width:400px;" placeholder="Edit Reason: Updated through CM Web"/>
<span id="saveTip">
<span style="margin-top:30px; z-index:100" class="tooltip rounded shadow-lg p-1 bg-gray-100 text-black -mt-2 space-y-3 p-2 text-left">
<div>In order to <strong id="configPageActionType">save</strong> a configuration to a subreddit's wiki page you must re-authorize ContextMod with Reddit to get the following permissions:</div>
<ul class="list-inside list-disc" id="reauthPermissions">
@@ -67,7 +69,7 @@
<div><b><a href="#" id="doAuthorize">Click Here to re-authorize</a></b></div>
</span>
<span>
| <a id="doSave">Save</a>
<a id="doSave">Save</a>
<svg id="saveQuestionIcon" xmlns="http://www.w3.org/2000/svg"
class="h-4 w-4 inline-block cursor-help"
fill="none"
@@ -76,6 +78,7 @@
</svg>
</span>
</span>
</div>
<div id="error" class="font-semibold"></div>
<select style="display:none;" id="schema-selection">
<option value="bot.yaml">Bot Configuration</option>
@@ -138,12 +141,14 @@
const saveLink = document.querySelector('#doSave');
saveLink.classList.remove('isDisabled');
saveLink.href = '#';
document.querySelector('#reason').style.display = 'initial';
} else {
document.querySelector('#saveTip').classList.add('has-tooltip');
document.querySelector('#saveQuestionIcon').style.display = 'initial';
const saveLink = document.querySelector('#doSave');
saveLink.classList.add('isDisabled');
saveLink.href = '';
document.querySelector('#reason').style.display = 'none';
}
}

@@ -196,18 +201,24 @@
return;
}

const payload = {
location: window.wikiLocation,
create: window.creatingWikiPage,
data: window.ed.getModel().getValue(),
};

const reasonVal = document.querySelector('#reason').value;
if(reasonVal.trim() !== '') {
payload.reason = reasonVal;
}

fetch(`${document.location.origin}/config${document.location.search}`, {
method: 'POST',
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json'
},
body: JSON.stringify({
//...data,
location: window.wikiLocation,
create: window.creatingWikiPage,
data: window.ed.getModel().getValue()
})
body: JSON.stringify(payload)
}).then((resp) => {
if (!resp.ok) {
resp.text().then(data => {
@@ -217,6 +228,9 @@
if(window.creatingWikiPage) {
window.isCreatingWikiPage(false);
}

document.querySelector('#reason').value = '';

document.querySelector('#error').innerHTML = `Wiki saved!`;
window.dirty = false;
setTimeout(() => {
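For clarity, the JSON body the save handler now posts to `/config` has the following shape; the interface name is only for illustration and is inferred from the fetch call above:

```typescript
// Inferred from the fetch() call above -- illustrative only.
interface ConfigSavePayload {
    location: string;   // window.wikiLocation, the wiki page being edited
    create: boolean;    // window.creatingWikiPage, true when the page does not exist yet
    data: string;       // the editor contents (window.ed.getModel().getValue())
    reason?: string;    // only included when the new "Edit Reason" input is non-empty
}
```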
@@ -8,5 +8,8 @@
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width,initial-scale=1.0">
<!-- This is internal tool or at the very least a private site, should not be indexed -->
<!-- https://developers.google.com/search/docs/advanced/crawling/block-indexing#meta-tag -->
<meta name="robots" content="noindex">
<!--icons from https://heroicons.com -->
</head>
@@ -672,41 +672,10 @@
dayjs.extend(window.dayjs_plugin_duration)
dayjs.extend(window.dayjs_plugin_relativeTime)
dayjs.extend(window.dayjs_plugin_isSameOrAfter)
window.formattedTime = (short, full) => `<span class="has-tooltip"><span style="margin-top:35px" class='tooltip rounded shadow-lg p-1 bg-gray-100 text-black space-y-3 p-2 text-left'>${full}</span><span>${short}</span></span>`;
window.formatLogLineToHtml = (log, timestamp = undefined) => {
const val = typeof log === 'string' ? log : log['MESSAGE'];
const logContent = Autolinker.link(val, {
email: false,
phone: false,
mention: false,
hashtag: false,
stripPrefix: false,
sanitizeHtml: true,
})
.replace(/(\s*debug\s*):/i, '<span class="debug blue">$1</span>:')
.replace(/(\s*warn\s*):/i, '<span class="warn yellow">$1</span>:')
.replace(/(\s*info\s*):/i, '<span class="info green">$1</span>:')
.replace(/(\s*error\s*):/i, '<span class="error red">$1</span>:')
.replace(/(\s*verbose\s*):/i, '<span class="error purple">$1</span>:')
.replaceAll('\n', '<br />');
//.replace(HYPERLINK_REGEX, '<a target="_blank" href="$&">$&</a>');
let line = '';

let timestampString = timestamp;
if(timestamp === undefined && typeof log !== 'string') {
timestampString = log.timestamp;
}

if(timestampString !== undefined) {
const timeStampReplacement = formattedTime(dayjs(timestampString).format('HH:mm:ss z'), timestampString);
const splitLine = logContent.split(timestampString);
line = `<div class="logLine">${splitLine[0]}${timeStampReplacement}<span style="white-space: pre-wrap">${splitLine[1]}</span></div>`;
} else {
line = `<div style="white-space: pre-wrap" class="logLine">${logContent}</div>`
}
return line;
}
</script>
<script src="public/TextEncoderStream.js"></script>
<script src="public/browserBundle.js"></script>
<script src="public/logUtils.js"></script>
<script>
window.sort = 'desc';
@@ -878,46 +847,370 @@
|
||||
history.pushState(null, '', newRelativePathQuery);
|
||||
}
|
||||
const activeSub = document.querySelector(`[data-subreddit="${subreddit}"][data-bot="${bot}"].sub`);
|
||||
if(!activeSub.classList.contains('seen')) {
|
||||
//firstSub.classList.add('seen');
|
||||
|
||||
//subreddit = firstSub.dataset.subreddit;
|
||||
//bot = subSection.dataset.bot;
|
||||
level = document.querySelector(`[data-subreddit="${subreddit}"] [data-type="level"]`).value;
|
||||
sort = document.querySelector(`[data-subreddit="${subreddit}"] [data-type="sort"]`).value;
|
||||
limitSel = document.querySelector(`[data-subreddit="${subreddit}"] [data-type="limit"]`).value;
|
||||
|
||||
fetch(`/api/logs?instance=<%= instanceId %>&bot=${bot}&subreddit=${subreddit}&level=${level}&sort=${sort}&limit=${limitSel}&stream=false&formatted=true`).then((resp) => {
|
||||
if (!resp.ok) {
|
||||
console.error('Response was not OK from logs GET');
|
||||
} else {
|
||||
resp.json().then((data) => {
|
||||
const logContainer = document.querySelector(`[data-subreddit="${subreddit}"] .logs`);
|
||||
const logLines = (subreddit.toLowerCase() === 'all' ? data[0].all : data[0].subreddits[0].logs).map(x => {
|
||||
let fString = x.formatted;
|
||||
if(x.bot !== undefined) {
|
||||
fString = fString.replace(`~${x.bot}~ `, '');
|
||||
}
|
||||
if(x.subreddit !== undefined && subreddit !== 'All') {
|
||||
fString = fString.replace(`{${x.subreddit}} `, '');
|
||||
}
|
||||
return window.formatLogLineToHtml(fString, x.timestamp)
|
||||
}).join('');
|
||||
logContainer.insertAdjacentHTML('afterbegin', logLines);
|
||||
activeSub.classList.add('seen');
|
||||
});
|
||||
}
|
||||
}).catch((err) => {
|
||||
console.log(err);
|
||||
});
|
||||
|
||||
}
|
||||
if(window.socket !== undefined) {
|
||||
window.socket.emit('viewing', {bot, subreddit});
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
let recentlySeen = new Map();
|
||||
|
||||
|
||||
function getLogBlock(bot, subreddit) {
|
||||
|
||||
console.debug(`Getting initial logs for ${bot} ${subreddit}`);
|
||||
|
||||
level = document.querySelector(`[data-subreddit="${subreddit}"] [data-type="level"]`).value;
|
||||
sort = document.querySelector(`[data-subreddit="${subreddit}"] [data-type="sort"]`).value;
|
||||
limitSel = document.querySelector(`[data-subreddit="${subreddit}"] [data-type="limit"]`).value;
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
fetch(`/api/logs?instance=<%= instanceId %>&bot=${bot}&subreddit=${subreddit}&level=${level}&sort=${sort}&limit=${limitSel}&stream=false&formatted=false`).then((resp) => {
|
||||
if (!resp.ok) {
|
||||
console.error('Response was not OK from logs GET');
|
||||
reject('Response was not OK from logs GET');
|
||||
} else {
|
||||
resp.json().then((data) => {
|
||||
const logContainer = document.querySelector(`.sub[data-bot="${bot}"] .logs[data-subreddit="${subreddit}"]`);
|
||||
const logLines = (subreddit.toLowerCase() === 'all' ? data[0].all : data[0].subreddits[0].logs).map(x => {
|
||||
const logObj = window.formatLog(x);
|
||||
let fString = logObj[window.beam.MESSAGE];
|
||||
if(x.bot !== undefined) {
|
||||
fString = fString.replace(`~${x.bot}~ `, '');
|
||||
}
|
||||
if(x.subreddit !== undefined && subreddit !== 'All') {
|
||||
fString = fString.replace(`{${x.subreddit}} `, '');
|
||||
}
|
||||
return window.formatLogLineToHtml(fString, x.timestamp)
|
||||
}).join('');
|
||||
logContainer.insertAdjacentHTML('afterbegin', logLines);
|
||||
console.debug(`Done with initial logs for ${bot} ${subreddit}`);
|
||||
resolve();
|
||||
//activeSub.classList.add('seen');
|
||||
});
|
||||
}
|
||||
}).catch((err) => {
|
||||
console.log(err);
|
||||
reject(err);
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
// https://stackoverflow.com/a/66394121/1469797
|
||||
function onVisible(element, callback) {
|
||||
new IntersectionObserver((entries, observer) => {
|
||||
entries.forEach(entry => {
|
||||
if(entry.intersectionRatio > 0) {
|
||||
callback(element);
|
||||
//observer.disconnect();
|
||||
}
|
||||
});
|
||||
}).observe(element);
|
||||
}
|
||||
|
||||
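`onVisible` above attaches an IntersectionObserver to each subreddit panel so logs and stats are only fetched once a tab actually scrolls into view (the `observer.disconnect()` call is intentionally left commented out, so the callback fires again on every re-entry). A minimal usage sketch with a made-up callback:

```typescript
// Sketch only -- mirrors how the dashboard code below uses onVisible().
const panel = document.querySelector('.sub[data-subreddit="All"]');
if (panel !== null) {
    onVisible(panel, (el: Element) => {
        // runs every time the panel becomes visible, since disconnect() is not called
        console.debug(`panel ${(el as HTMLElement).dataset.subreddit} is on screen`);
    });
}
```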
function getStreamingLogs(sub, bot) {
|
||||
|
||||
console.debug(`Getting stream for ${bot} ${sub}`);
|
||||
|
||||
level = document.querySelector(`[data-subreddit="${sub}"] [data-type="level"]`).value;
|
||||
sort = document.querySelector(`[data-subreddit="${sub}"] [data-type="sort"]`).value;
|
||||
limitSel = document.querySelector(`[data-subreddit="${sub}"] [data-type="limit"]`).value;
|
||||
|
||||
const logContainer = document.querySelector(`.sub[data-bot="${bot}"] .logs[data-subreddit="${sub}"]`);
|
||||
|
||||
let textBuffer = '';
|
||||
|
||||
var controller = new AbortController();
|
||||
var signal = controller.signal;
|
||||
|
||||
let lastFlush;
|
||||
let bufferTimeout;
|
||||
|
||||
let bufferedLogs = [];
|
||||
|
||||
const formattedMsg = (x) => {
|
||||
const logObj = window.formatLog(x);
|
||||
let fString = logObj[window.beam.MESSAGE];
|
||||
if(x.bot !== undefined) {
|
||||
fString = fString.replace(`~${x.bot}~ `, '');
|
||||
}
|
||||
if(x.subreddit !== undefined && sub !== 'All') {
|
||||
fString = fString.replace(`{${x.subreddit}} `, '');
|
||||
}
|
||||
return window.formatLogLineToHtml(fString, x.timestamp);
|
||||
}
|
||||
|
||||
const flushLogs = () => {
|
||||
let existingLogs;
|
||||
|
||||
//const el = document.querySelector(`[data-subreddit="${sub}"][data-bot="${bot}"].sub`);
|
||||
//const logContainer = el.querySelector(`.logs`);
|
||||
|
||||
if(window.sort === 'desc' || window.sort === 'descending') {
|
||||
bufferedLogs.forEach((l) => {
|
||||
logContainer.insertAdjacentHTML('afterbegin', formattedMsg(l));
|
||||
})
|
||||
existingLogs = Array.from(logContainer.querySelectorAll(`.logLine`));
|
||||
logContainer.replaceChildren(...existingLogs.slice(0, limitSel));
|
||||
} else {
|
||||
bufferedLogs.forEach((l) => {
|
||||
logContainer.insertAdjacentHTML('beforeend', formattedMsg(l));
|
||||
existingLogs = Array.from(logContainer.querySelectorAll(`.logLine`));
|
||||
const overLimit = limitSel - existingLogs.length;
|
||||
logContainer.replaceChildren(...existingLogs.slice(overLimit -1, limitSel));
|
||||
})
|
||||
}
|
||||
lastFlush = Date.now();
|
||||
bufferedLogs = [];
|
||||
}
|
||||
|
||||
const fetchPromise = fetch(`/api/logs?instance=<%= instanceId %>&bot=${bot}&subreddit=${sub}&level=${level}&sort=${sort}&limit=${limitSel}&stream=true&streamObjects=true&formatted=false`, {signal})
|
||||
.then(response => response.body)
|
||||
.then(rs =>
|
||||
rs.pipeThrough(new TextDecoderStream())
|
||||
.pipeThrough(new TransformStream({
|
||||
transform(chunk, controller) {
|
||||
textBuffer += chunk;
|
||||
const lines = textBuffer.split('\n');
|
||||
for (const line of lines.slice(0, -1)) {
|
||||
controller.enqueue(line);
|
||||
}
|
||||
textBuffer = lines.slice(-1)[0];
|
||||
},
|
||||
flush(controller) {
|
||||
if (textBuffer) {
|
||||
controller.enqueue(textBuffer);
|
||||
}
|
||||
}
|
||||
}))
|
||||
|
||||
// Parse JSON objects
|
||||
.pipeThrough(new TransformStream({
|
||||
transform(line, controller) {
|
||||
if (line) {
|
||||
controller.enqueue(
|
||||
JSON.parse(line)
|
||||
);
|
||||
}
|
||||
}
|
||||
}))
|
||||
).catch((e) => {
|
||||
console.warn(e);
|
||||
});
|
||||
|
||||
fetchPromise.then(async res => {
|
||||
const reader = res.getReader();
|
||||
let keepReading = true;
|
||||
while(keepReading) {
|
||||
const {done, value} = await reader.read();
|
||||
if(done) {
|
||||
keepReading = false;
|
||||
console.debug('done');
|
||||
}
|
||||
if(value) {
|
||||
//console.log(`((Logged For ${bot} ${sub})) ${value.message}`);
|
||||
|
||||
bufferedLogs.push(value);
|
||||
|
||||
if(lastFlush !== undefined && bufferTimeout !== undefined && ((Date.now() - lastFlush)/1000) > 3) {
|
||||
//console.log('Immediate flush');
|
||||
clearTimeout(bufferTimeout);
|
||||
bufferTimeout = undefined;
|
||||
flushLogs();
|
||||
} else {
|
||||
//console.log('Using timeout');
|
||||
clearTimeout(bufferTimeout);
|
||||
bufferTimeout = setTimeout(() => {flushLogs();}, 1000);
|
||||
}
|
||||
}
|
||||
}
|
||||
/* function read() {
|
||||
reader.read().then(({done, value}) => {
|
||||
if(done) {
|
||||
console.log('done');
|
||||
return;
|
||||
}
|
||||
if(value) {
|
||||
console.log(value);
|
||||
read();
|
||||
}
|
||||
});
|
||||
}
|
||||
read();*/
|
||||
}).catch((e) => {
|
||||
if(e.name !== 'AbortError') {
|
||||
console.error(e);
|
||||
}
|
||||
});
|
||||
|
||||
const existing = recentlySeen.get(`${bot}.${sub}`) || {};
|
||||
recentlySeen.set(`${bot}.${sub}`, {...existing, fetch: fetchPromise, controller});
|
||||
}
|
||||
|
||||
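The streaming handler above buffers incoming log objects and flushes them to the DOM after a one second quiet period, or immediately once more than three seconds have passed since the last flush, so a chatty subreddit cannot delay rendering indefinitely. The same strategy as a standalone helper, a sketch rather than code from this commit:

```typescript
// Generic sketch of the flush strategy used in getStreamingLogs() above.
function createBufferedFlush<T>(flush: (items: T[]) => void, quietMs = 1000, maxWaitMs = 3000) {
    let buffer: T[] = [];
    let lastFlush = Date.now();
    let timer: ReturnType<typeof setTimeout> | undefined;

    return (item: T) => {
        buffer.push(item);
        if (timer !== undefined && Date.now() - lastFlush > maxWaitMs) {
            clearTimeout(timer);          // waited long enough: flush right away
            timer = undefined;
            flush(buffer);
            buffer = [];
            lastFlush = Date.now();
        } else {
            clearTimeout(timer);          // otherwise debounce until the stream goes quiet
            timer = setTimeout(() => {
                flush(buffer);
                buffer = [];
                lastFlush = Date.now();
            }, quietMs);
        }
    };
}
```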
function updateLiveStats(resp) {
|
||||
let el;
|
||||
let isAll = resp.name.toLowerCase() === 'all';
|
||||
if(isAll) {
|
||||
// got all
|
||||
el = document.querySelector(`[data-subreddit="All"][data-bot="${resp.bot}"].sub`);
|
||||
} else {
|
||||
// got subreddit
|
||||
el = document.querySelector(`[data-subreddit="${resp.name}"].sub`);
|
||||
}
|
||||
|
||||
if(resp.system.running && el.classList.contains('offline')) {
|
||||
el.classList.remove('offline');
|
||||
} else if(!resp.system.running && !el.classList.contains('offline')) {
|
||||
el.classList.add('offline');
|
||||
}
|
||||
|
||||
el.querySelector('.runningActivities').innerHTML = resp.runningActivities;
|
||||
el.querySelector('.queuedActivities').innerHTML = resp.queuedActivities;
|
||||
el.querySelector('.delayedItemsCount').innerHTML = resp.delayedItems.length;
|
||||
el.querySelector('.delayedItemsList').innerHTML = 'No delayed Items!';
|
||||
if(resp.delayedItems.length > 0) {
|
||||
el.querySelector('.delayedItemsList').innerHTML = '';
|
||||
const now = dayjs();
|
||||
const sorted = resp.delayedItems.map(x => ({...x, queuedAtUnix: x.queuedAt, queuedAt: dayjs.unix(x.queuedAt), dispatchAt: dayjs.unix(x.queuedAt + x.duration)}));
|
||||
sorted.sort((a, b) => {
|
||||
return a.dispatchAt.isSameOrAfter(b.dispatchAt) ? 1 : -1
|
||||
});
|
||||
const delayedItemDivs = sorted.map(x => {
|
||||
const diffUntilNow = x.dispatchAt.diff(now);
|
||||
const durationUntilNow = dayjs.duration(diffUntilNow, 'ms');
|
||||
const queuedAtDisplay = x.queuedAt.format('HH:mm:ss z');
|
||||
const durationDayjs = dayjs.duration(x.duration, 'seconds');
|
||||
const durationDisplay = durationDayjs.humanize();
|
||||
const cancelLink = `<a href="#" data-id="${x.id}" data-subreddit="${x.subreddit}" class="delayCancel">CANCEL</a>`;
|
||||
return `<div>A <a href="https://reddit.com${x.permalink}">${x.submissionId !== undefined ? 'Comment' : 'Submssion'}</a>${isAll ? ` in <a href="https://reddit.com${x.subreddit}">${x.subreddit}</a> ` : ''} by <a href="https://reddit.com/u/${x.author}">${x.author}</a> queued by ${x.source} at ${queuedAtDisplay} for ${durationDisplay} (dispatches ${durationUntilNow.humanize(true)}) -- ${cancelLink}</div>`;
|
||||
});
|
||||
el.querySelector('.delayedItemsList').insertAdjacentHTML('afterbegin', delayedItemDivs.join(''));
|
||||
el.querySelectorAll('.delayedItemsList .delayCancel').forEach(elm => {
|
||||
elm.addEventListener('click', e => {
|
||||
e.preventDefault();
|
||||
const id = e.target.dataset.id;
|
||||
const subreddit = e.target.dataset.subreddit;
|
||||
fetch(`/api/delayed?instance=<%= instanceId %>&bot=${resp.bot}&subreddit=${subreddit}&id=${id}`, {
|
||||
method: 'DELETE'
|
||||
}).then((resp) => {
|
||||
if (!resp.ok) {
|
||||
console.error('Response was not OK from delay cancel');
|
||||
} else {
|
||||
console.log('Removed ok');
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
el.querySelector('.allStats .eventsCount').innerHTML = resp.stats.historical.eventsCheckedTotal;
|
||||
el.querySelector('.allStats .checksCount').innerHTML = resp.stats.historical.checksTriggeredTotal;
|
||||
el.querySelector('.allStats .rulesCount').innerHTML = resp.stats.historical.rulesTriggeredTotal;
|
||||
el.querySelector('.allStats .actionsCount').innerHTML = resp.stats.historical.actionsRunTotal;
|
||||
|
||||
if(isAll) {
|
||||
for(const elm of ['apiAvg','apiLimit','apiDepletion','nextHeartbeat', 'nextHeartbeatHuman', 'limitReset', 'limitResetHuman', 'nannyMode', 'startedAtHuman']) {
|
||||
el.querySelector(`#${elm}`).innerHTML = resp[elm];
|
||||
}
|
||||
el.querySelector(`.botStatus`).innerHTML = resp.system.running ? 'ONLINE' : 'OFFLINE';
|
||||
} else {
|
||||
if(el.querySelector('.modPermissionsCount').innerHTML != resp.permissions.length) {
|
||||
el.querySelector('.modPermissionsCount').innerHTML = resp.permissions.length;
|
||||
el.querySelector('.modPermissionsList').innerHTML = '';
|
||||
el.querySelector('.modPermissionsList').insertAdjacentHTML('afterbegin', resp.permissions.map(x => `<li class="font-mono">${x}</li>`).join(''));
|
||||
}
|
||||
|
||||
for(const elm of ['botState', 'queueState', 'eventsState']) {
|
||||
const state = resp[elm];
|
||||
el.querySelector(`.${elm}`).innerHTML = `${state.state}${state.causedBy === 'system' ? '' : ' (user)'}`;
|
||||
}
|
||||
for(const elm of ['startedAt', 'startedAtHuman', 'wikiLastCheck', 'wikiLastCheckHuman', 'wikiRevision', 'wikiRevisionHuman', 'validConfig', 'delayBy']) {
|
||||
el.querySelector(`.${elm}`).innerHTML = resp[elm];
|
||||
}
|
||||
el.querySelector(`.commentCheckCount`).innerHTML = resp.checks.comments;
|
||||
el.querySelector(`.submissionCheckCount`).innerHTML = resp.checks.submissions;
|
||||
|
||||
const newInner = resp.pollingInfo.map(x => `<li>${x}</li>`).join('');
|
||||
if(el.querySelector(`.pollingInfo`).innerHTML !== newInner) {
|
||||
el.querySelector(`.pollingInfo`).innerHTML = newInner;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function getLiveStats(bot, sub) {
|
||||
console.debug(`Getting live stats for ${bot} ${sub}`)
|
||||
return fetch(`/api/liveStats?instance=<%= instanceId %>&bot=${bot}&subreddit=${sub}`)
|
||||
.then(response => response.json())
|
||||
.then(resp => updateLiveStats(resp));
|
||||
}
|
||||
|
||||
document.querySelectorAll('.sub').forEach(el => {
|
||||
const sub = el.dataset.subreddit;
|
||||
const bot = el.dataset.bot;
|
||||
//console.log(`Focused on ${bot} ${sub}`);
|
||||
onVisible(el, () => {
|
||||
console.debug(`Focused on ${bot} ${sub}`);
|
||||
|
||||
const identifier = `${bot}.${sub}`;
|
||||
|
||||
recentlySeen.forEach((value, key) => {
|
||||
const {timeout, liveStatsInt, ...rest} = value;
|
||||
if(key === identifier && timeout !== undefined) {
|
||||
|
||||
console.debug('Clearing timeout on own already set');
|
||||
clearTimeout(timeout);
|
||||
recentlySeen.set(key, rest);
|
||||
|
||||
} else if(key !== identifier) {
|
||||
|
||||
// stop live stats for tabs we are not viewing
|
||||
clearInterval(liveStatsInt);
|
||||
|
||||
// set timeout for logs we are not viewing
|
||||
if(timeout === undefined) {
|
||||
const t = setTimeout(() => {
|
||||
const k = key;
|
||||
const val = recentlySeen.get(k);
|
||||
if(val !== undefined) {
|
||||
const {controller} = val;
|
||||
console.debug(`timeout expired, stopping live data for ${k}`);
|
||||
if(controller !== undefined) {
|
||||
console.debug('Stopping logs');
|
||||
controller.abort();
|
||||
}
|
||||
// if(liveStatInt !== undefined) {
|
||||
// console.log('Stopping live stats');
|
||||
// clearInterval(liveStatInt);
|
||||
// }
|
||||
recentlySeen.delete(k);
|
||||
}
|
||||
},15000);
|
||||
recentlySeen.set(key, {timeout: t, liveStatsInt, ...rest});
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
if(!recentlySeen.has(identifier)) {
|
||||
getLogBlock(bot, sub).then(() => {
|
||||
getStreamingLogs(sub, bot);
|
||||
});
|
||||
}
|
||||
|
||||
// always get live stats for tab we just started viewing
|
||||
getLiveStats(bot, sub).then(() => {
|
||||
let liveStatsInt;
|
||||
const liveStatFunc = () => {
|
||||
getLiveStats(bot, sub).catch(() => {
|
||||
// stop interval if live stat encounters an error
|
||||
clearInterval(liveStatsInt);
|
||||
})
|
||||
};
|
||||
liveStatsInt = setInterval(liveStatFunc, 5000);
|
||||
recentlySeen.set(identifier, {liveStatsInt});
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
|
||||
var searchParams = new URLSearchParams(window.location.search);
|
||||
const shownSub = searchParams.get('sub') || 'All'
|
||||
@@ -992,85 +1285,9 @@
|
||||
let shownBot = searchParams.get('bot');
|
||||
window.socket.emit('viewing', {bot: shownBot, subreddit: shownSub});
|
||||
|
||||
socket.on("log", data => {
|
||||
const {
|
||||
subreddit,
|
||||
bot,
|
||||
subredditMessage,
|
||||
allMessage,
|
||||
formattedMessage
|
||||
} = data;
|
||||
if(bot === undefined && subreddit === undefined) {
|
||||
const sys = bufferedBot.get('system');
|
||||
if(sys !== undefined) {
|
||||
sys.set('All', sys.get('All').concat(formattedMessage));
|
||||
bufferedBot.set('system', sys);
|
||||
}
|
||||
}
|
||||
if(bot !== undefined) {
|
||||
bufferedBot.set('All', bufferedBot.get('All').concat(allMessage));
|
||||
// TODO web logging
|
||||
// socket.on('log')
|
||||
|
||||
const buffBot = bufferedBot.get(bot) || newBufferedLogs();
|
||||
buffBot.set('All', buffBot.get('All').concat(allMessage));
|
||||
if (subreddit !== undefined) {
|
||||
buffBot.set(subreddit, (buffBot.get(subreddit) || []).concat(subredditMessage));
|
||||
}
|
||||
bufferedBot.set(bot, buffBot);
|
||||
}
|
||||
|
||||
|
||||
const flushLogs = () => {
|
||||
bufferedBot.forEach((subLogs, botName) => {
|
||||
if(botName === 'All') {
|
||||
return;
|
||||
}
|
||||
subLogs.forEach((logs, subKey) => {
|
||||
// check sub exists -- may be a web log
|
||||
const el = document.querySelector(`[data-subreddit="${subKey}"][data-bot="${botName}"].sub.seen`);
|
||||
if(null !== el) {
|
||||
const limit = Number.parseInt(document.querySelector(`[data-subreddit="${subKey}"] [data-type="limit"]`).value);
|
||||
const logContainer = el.querySelector(`.logs`);
|
||||
let existingLogs;
|
||||
if(window.sort === 'desc' || window.sort === 'descending') {
|
||||
logs.forEach((l) => {
|
||||
logContainer.insertAdjacentHTML('afterbegin', l);
|
||||
})
|
||||
existingLogs = Array.from(el.querySelectorAll(`.logs .logLine`));
|
||||
logContainer.replaceChildren(...existingLogs.slice(0, limit));
|
||||
} else {
|
||||
logs.forEach((l) => {
|
||||
logContainer.insertAdjacentHTML('beforeend', l);
|
||||
existingLogs = Array.from(el.querySelectorAll(`.logs .logLine`));
|
||||
const overLimit = limit - existingLogs.length;
|
||||
logContainer.replaceChildren(...existingLogs.slice(overLimit -1, limit));
|
||||
})
|
||||
}
|
||||
}
|
||||
});
|
||||
});
|
||||
lastFlush = Date.now();
|
||||
bufferedBot = newBufferedBot();
|
||||
//bufferedLogs = newBufferedLogs();
|
||||
//console.log('Flushed Logs');
|
||||
}
|
||||
|
||||
if(lastFlush !== undefined && bufferTimeout !== undefined && ((Date.now() - lastFlush)/1000) > 3) {
|
||||
//console.log('Immediate flush');
|
||||
clearTimeout(bufferTimeout);
|
||||
bufferTimeout = undefined;
|
||||
flushLogs();
|
||||
} else {
|
||||
//console.log('Using timeout');
|
||||
clearTimeout(bufferTimeout);
|
||||
bufferTimeout = setTimeout(() => {flushLogs();}, 1000);
|
||||
}
|
||||
});
|
||||
socket.on("logClear", data => {
|
||||
data.forEach((obj) => {
|
||||
const n = obj.name === 'all' ? 'All' : obj.name;
|
||||
document.querySelector(`[data-subreddit="${n}"].logs`).innerHTML = obj.logs;
|
||||
})
|
||||
});
|
||||
const subIndicators = ['red', 'green', 'yellow'];
|
||||
socket.on('opStats', (resp) => {
|
||||
for(const b of resp) {
|
||||
@@ -1100,94 +1317,6 @@
|
||||
}
|
||||
|
||||
});
|
||||
socket.on('liveStats', (resp) => {
|
||||
let el;
|
||||
let isAll = resp.name.toLowerCase() === 'all';
|
||||
if(isAll) {
|
||||
// got all
|
||||
el = document.querySelector(`[data-subreddit="All"][data-bot="${resp.bot}"].sub`);
|
||||
} else {
|
||||
// got subreddit
|
||||
el = document.querySelector(`[data-subreddit="${resp.name}"].sub`);
|
||||
}
|
||||
|
||||
if(resp.system.running && el.classList.contains('offline')) {
|
||||
el.classList.remove('offline');
|
||||
} else if(!resp.system.running && !el.classList.contains('offline')) {
|
||||
el.classList.add('offline');
|
||||
}
|
||||
|
||||
el.querySelector('.runningActivities').innerHTML = resp.runningActivities;
|
||||
el.querySelector('.queuedActivities').innerHTML = resp.queuedActivities;
|
||||
el.querySelector('.delayedItemsCount').innerHTML = resp.delayedItems.length;
|
||||
el.querySelector('.delayedItemsList').innerHTML = 'No delayed Items!';
|
||||
if(resp.delayedItems.length > 0) {
|
||||
el.querySelector('.delayedItemsList').innerHTML = '';
|
||||
const now = dayjs();
|
||||
const sorted = resp.delayedItems.map(x => ({...x, dispatchAt: dayjs.unix(x.queuedAt + (x.durationMilli))}));
|
||||
sorted.sort((a, b) => {
|
||||
return a.dispatchAt.isSameOrAfter(b.dispatchAt) ? 1 : -1
|
||||
});
|
||||
const delayedItemDivs = sorted.map(x => {
|
||||
const diffUntilNow = x.dispatchAt.diff(now)
|
||||
const durationUntilNow = dayjs.duration(diffUntilNow);
|
||||
const cancelLink = `<a href="#" data-id="${x.id}" data-subreddit="${x.subreddit}" class="delayCancel">CANCEL</a>`;
|
||||
return `<div>A <a href="https://reddit.com${x.permalink}">${x.submissionId !== undefined ? 'Comment' : 'Submssion'}</a>${isAll ? ` in <a href="https://reddit.com${x.subreddit}">${x.subreddit}</a> ` : ''} by <a href="https://reddit.com/u/${x.author}">${x.author}</a> queued by ${x.source} at ${dayjs.unix(x.queuedAt).format('HH:mm:ss z')} for ${x.duration} (dispatches in ${durationUntilNow.humanize()}) -- ${cancelLink}</div>`;
|
||||
});
|
||||
el.querySelector('.delayedItemsList').insertAdjacentHTML('afterbegin', delayedItemDivs.join(''));
|
||||
el.querySelectorAll('.delayedItemsList .delayCancel').forEach(elm => {
|
||||
elm.addEventListener('click', e => {
|
||||
e.preventDefault();
|
||||
const id = e.target.dataset.id;
|
||||
const subreddit = e.target.dataset.subreddit;
|
||||
fetch(`/api/delayed?instance=<%= instanceId %>&bot=${resp.bot}&subreddit=${subreddit}&id=${id}`, {
|
||||
method: 'DELETE'
|
||||
}).then((resp) => {
|
||||
if (!resp.ok) {
|
||||
console.error('Response was not OK from delay cancel');
|
||||
} else {
|
||||
console.log('Removed ok');
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
el.querySelector('.allStats .eventsCount').innerHTML = resp.stats.historical.eventsCheckedTotal;
|
||||
el.querySelector('.allStats .checksCount').innerHTML = resp.stats.historical.checksTriggeredTotal;
|
||||
el.querySelector('.allStats .rulesCount').innerHTML = resp.stats.historical.rulesTriggeredTotal;
|
||||
el.querySelector('.allStats .actionsCount').innerHTML = resp.stats.historical.actionsRunTotal;
|
||||
|
||||
if(isAll) {
|
||||
for(const elm of ['apiAvg','apiLimit','apiDepletion','nextHeartbeat', 'nextHeartbeatHuman', 'limitReset', 'limitResetHuman', 'nannyMode', 'startedAtHuman']) {
|
||||
el.querySelector(`#${elm}`).innerHTML = resp[elm];
|
||||
}
|
||||
el.querySelector(`.botStatus`).innerHTML = resp.system.running ? 'ONLINE' : 'OFFLINE';
|
||||
} else {
|
||||
if(el.querySelector('.modPermissionsCount').innerHTML != resp.permissions.length) {
|
||||
el.querySelector('.modPermissionsCount').innerHTML = resp.permissions.length;
|
||||
el.querySelector('.modPermissionsList').innerHTML = '';
|
||||
el.querySelector('.modPermissionsList').insertAdjacentHTML('afterbegin', resp.permissions.map(x => `<li class="font-mono">${x}</li>`).join(''));
|
||||
}
|
||||
|
||||
for(const elm of ['botState', 'queueState', 'eventsState']) {
|
||||
const state = resp[elm];
|
||||
el.querySelector(`.${elm}`).innerHTML = `${state.state}${state.causedBy === 'system' ? '' : ' (user)'}`;
|
||||
}
|
||||
for(const elm of ['startedAt', 'startedAtHuman', 'wikiLastCheck', 'wikiLastCheckHuman', 'wikiRevision', 'wikiRevisionHuman', 'validConfig', 'delayBy']) {
|
||||
el.querySelector(`.${elm}`).innerHTML = resp[elm];
|
||||
}
|
||||
el.querySelector(`.commentCheckCount`).innerHTML = resp.checks.comments;
|
||||
el.querySelector(`.submissionCheckCount`).innerHTML = resp.checks.submissions;
|
||||
|
||||
const newInner = resp.pollingInfo.map(x => `<li>${x}</li>`).join('');
|
||||
if(el.querySelector(`.pollingInfo`).innerHTML !== newInner) {
|
||||
el.querySelector(`.pollingInfo`).innerHTML = newInner;
|
||||
}
|
||||
}
|
||||
|
||||
//console.log(resp);
|
||||
});
|
||||
});
|
||||
|
||||
socket.on('disconnect', () => {
|
||||
|
||||
244
src/util.ts
@@ -1,6 +1,5 @@
|
||||
import winston, {Logger} from "winston";
|
||||
import jsonStringify from 'safe-stable-stringify';
|
||||
import dayjs, {Dayjs, OpUnitType} from 'dayjs';
|
||||
import dayjs, {Dayjs} from 'dayjs';
|
||||
import {Duration} from 'dayjs/plugin/duration.js';
|
||||
import Ajv from "ajv";
|
||||
import {InvalidOptionArgumentError} from "commander";
|
||||
@@ -39,7 +38,7 @@ import redisStore from "cache-manager-redis-store";
|
||||
import Autolinker from 'autolinker';
|
||||
import {create as createMemoryStore} from './Utils/memoryStore';
|
||||
import {LEVEL, MESSAGE} from "triple-beam";
|
||||
import {Comment, RedditUser, Submission} from "snoowrap/dist/objects";
|
||||
import {Comment, PrivateMessage, RedditUser, Submission, Subreddit} from "snoowrap/dist/objects";
|
||||
import reRegExp from '@stdlib/regexp-regexp';
|
||||
import fetch from "node-fetch";
|
||||
import ImageData from "./Common/ImageData";
|
||||
@@ -55,10 +54,14 @@ import {RuleResultEntity as RuleResultEntity} from "./Common/Entities/RuleResult
|
||||
import {nanoid} from "nanoid";
|
||||
import {
|
||||
ActivityState,
|
||||
asModLogCriteria,
|
||||
asModNoteCriteria,
|
||||
AuthorCriteria,
|
||||
authorCriteriaProperties,
|
||||
CommentState,
|
||||
defaultStrongSubredditCriteriaOptions,
|
||||
ModLogCriteria,
|
||||
ModNoteCriteria,
|
||||
StrongSubredditCriteria,
|
||||
SubmissionState,
|
||||
SubredditCriteria,
|
||||
@@ -71,14 +74,14 @@ import {
|
||||
CacheProvider,
|
||||
ConfigFormat,
|
||||
DurationVal,
|
||||
ModUserNoteLabel,
|
||||
modUserNoteLabels,
|
||||
RedditEntity,
|
||||
RedditEntityType,
|
||||
statFrequencies,
|
||||
StatisticFrequency,
|
||||
StatisticFrequencyOption,
|
||||
StringOperator
|
||||
StatisticFrequencyOption
|
||||
} from "./Common/Infrastructure/Atomic";
|
||||
import {DurationComparison, GenericComparison} from "./Common/Infrastructure/Comparisons";
|
||||
import {
|
||||
AuthorOptions,
|
||||
FilterCriteriaDefaults,
|
||||
@@ -110,6 +113,7 @@ import {
|
||||
HistoryFiltersOptions
|
||||
} from "./Common/Infrastructure/ActivityWindow";
|
||||
import {RunnableBaseJson} from "./Common/Infrastructure/Runnable";
|
||||
import Snoowrap from "snoowrap";
|
||||
|
||||
|
||||
//import {ResembleSingleCallbackComparisonResult} from "resemblejs";
|
||||
@@ -239,7 +243,7 @@ export const defaultFormat = (defaultLabel = 'App') => printf(({
|
||||
stack,
|
||||
...rest
|
||||
}) => {
|
||||
let stringifyValue = splatObj !== undefined ? jsonStringify(splatObj) : '';
|
||||
let stringifyValue = splatObj !== undefined ? JSON.stringify(splatObj) : '';
|
||||
let msg = message;
|
||||
let stackMsg = '';
|
||||
if (stack !== undefined) {
|
||||
@@ -258,7 +262,7 @@ export const defaultFormat = (defaultLabel = 'App') => printf(({
|
||||
}
|
||||
|
||||
let nodes = labels;
|
||||
if (leaf !== null && leaf !== undefined) {
|
||||
if (leaf !== null && leaf !== undefined && !nodes.includes(leaf)) {
|
||||
nodes.push(leaf);
|
||||
}
|
||||
const labelContent = `${nodes.map((x: string) => `[${x}]`).join(' ')}`;
|
||||
@@ -551,6 +555,8 @@ export const filterCriteriaPropertySummary = <T>(val: FilterCriteriaPropertyResu
|
||||
const expectedStrings = crit.map((x: any) => {
|
||||
if (asUserNoteCriteria(x)) {
|
||||
return userNoteCriteriaSummary(x);
|
||||
} else if(asModNoteCriteria(x) || asModLogCriteria(x)) {
|
||||
return modActionCriteriaSummary(x);
|
||||
}
|
||||
return x;
|
||||
}).join(' OR ');
|
||||
@@ -709,76 +715,16 @@ export const isActivityWindowConfig = (val: any): val is FullActivityWindowConfi
|
||||
return false;
|
||||
}
|
||||
|
||||
export const comparisonTextOp = (val1: number, strOp: string, val2: number): boolean => {
|
||||
switch (strOp) {
|
||||
case '>':
|
||||
return val1 > val2;
|
||||
case '>=':
|
||||
return val1 >= val2;
|
||||
case '<':
|
||||
return val1 < val2;
|
||||
case '<=':
|
||||
return val1 <= val2;
|
||||
default:
|
||||
throw new Error(`${strOp} was not a recognized operator`);
|
||||
}
|
||||
}
|
||||
|
||||
const GENERIC_VALUE_COMPARISON = /^\s*(?<opStr>>|>=|<|<=)\s*(?<value>\d+)(?<extra>\s+.*)*$/
|
||||
const GENERIC_VALUE_COMPARISON_URL = 'https://regexr.com/60dq4';
|
||||
export const parseGenericValueComparison = (val: string): GenericComparison => {
|
||||
const matches = val.match(GENERIC_VALUE_COMPARISON);
|
||||
if (matches === null) {
|
||||
throw new InvalidRegexError(GENERIC_VALUE_COMPARISON, val, GENERIC_VALUE_COMPARISON_URL)
|
||||
}
|
||||
const groups = matches.groups as any;
|
||||
|
||||
return {
|
||||
operator: groups.opStr as StringOperator,
|
||||
value: Number.parseFloat(groups.value),
|
||||
isPercent: false,
|
||||
extra: groups.extra,
|
||||
displayText: `${groups.opStr} ${groups.value}`
|
||||
}
|
||||
}
|
||||
|
||||
const GENERIC_VALUE_PERCENT_COMPARISON = /^\s*(?<opStr>>|>=|<|<=)\s*(?<value>\d+)\s*(?<percent>%?)(?<extra>.*)$/
|
||||
const GENERIC_VALUE_PERCENT_COMPARISON_URL = 'https://regexr.com/60a16';
|
||||
export const parseGenericValueOrPercentComparison = (val: string): GenericComparison => {
|
||||
const matches = val.match(GENERIC_VALUE_PERCENT_COMPARISON);
|
||||
if (matches === null) {
|
||||
throw new InvalidRegexError(GENERIC_VALUE_PERCENT_COMPARISON, val, GENERIC_VALUE_PERCENT_COMPARISON_URL)
|
||||
}
|
||||
const groups = matches.groups as any;
|
||||
|
||||
return {
|
||||
operator: groups.opStr as StringOperator,
|
||||
value: Number.parseFloat(groups.value),
|
||||
isPercent: groups.percent !== '',
|
||||
extra: groups.extra,
|
||||
displayText: `${groups.opStr} ${groups.value}${groups.percent === undefined || groups.percent === '' ? '': '%'}`
|
||||
}
|
||||
}
|
||||
|
||||
export const dateComparisonTextOp = (val1: Dayjs, strOp: StringOperator, val2: Dayjs, granularity?: OpUnitType): boolean => {
|
||||
switch (strOp) {
|
||||
case '>':
|
||||
return val1.isBefore(val2, granularity);
|
||||
case '>=':
|
||||
return val1.isSameOrBefore(val2, granularity);
|
||||
case '<':
|
||||
return val1.isAfter(val2, granularity);
|
||||
case '<=':
|
||||
return val1.isSameOrAfter(val2, granularity);
|
||||
default:
|
||||
throw new Error(`${strOp} was not a recognized operator`);
|
||||
}
|
||||
}
|
||||
|
||||
// string must only contain ISO8601 optionally wrapped by whitespace
const ISO8601_REGEX: RegExp = /^(-?)P(?=\d|T\d)(?:(\d+)Y)?(?:(\d+)M)?(?:(\d+)([DW]))?(?:T(?:(\d+)H)?(?:(\d+)M)?(?:(\d+(?:\.\d+)?)S)?)?$/;
// finds ISO8601 in any part of a string
const ISO8601_SUBSTRING_REGEX: RegExp = /(-?)P(?=\d|T\d)(?:(\d+)Y)?(?:(\d+)M)?(?:(\d+)([DW]))?(?:T(?:(\d+)H)?(?:(\d+)M)?(?:(\d+(?:\.\d+)?)S)?)?/;
// string must only duration optionally wrapped by whitespace
const DURATION_REGEX: RegExp = /^\s*(?<time>\d+)\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\s*$/;
export const parseDuration = (val: string): Duration => {
let matches = val.match(DURATION_REGEX);
// finds duration in any part of the string
const DURATION_SUBSTRING_REGEX: RegExp = /(?<time>\d+)\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)/;
export const parseDuration = (val: string, strict = true): Duration => {
let matches = val.match(strict ? DURATION_REGEX : DURATION_SUBSTRING_REGEX);
if (matches !== null) {
const groups = matches.groups as any;
const dur: Duration = dayjs.duration(groups.time, groups.unit);
@@ -787,7 +733,7 @@ export const parseDuration = (val: string): Duration => {
}
return dur;
}
matches = val.match(ISO8601_REGEX);
matches = val.match(strict ? ISO8601_REGEX : ISO8601_SUBSTRING_REGEX);
if (matches !== null) {
const dur: Duration = dayjs.duration(val);
if (!dayjs.isDuration(dur)) {
@@ -795,32 +741,7 @@ export const parseDuration = (val: string): Duration => {
}
return dur;
}
throw new InvalidRegexError([DURATION_REGEX, ISO8601_REGEX], val)
}

/**
* Named groups: operator, time, unit
* */
const DURATION_COMPARISON_REGEX: RegExp = /^\s*(?<opStr>>|>=|<|<=)\s*(?<time>\d+)\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\s*$/;
const DURATION_COMPARISON_REGEX_URL = 'https://regexr.com/609n8';
export const parseDurationComparison = (val: string): DurationComparison => {
const matches = val.match(DURATION_COMPARISON_REGEX);
if (matches === null) {
throw new InvalidRegexError(DURATION_COMPARISON_REGEX, val, DURATION_COMPARISON_REGEX_URL)
}
const groups = matches.groups as any;
const dur: Duration = dayjs.duration(groups.time, groups.unit);
if (!dayjs.isDuration(dur)) {
throw new SimpleError(`Parsed value '${val}' did not result in a valid Dayjs Duration`);
}
return {
operator: groups.opStr as StringOperator,
duration: dur
}
}
export const compareDurationValue = (comp: DurationComparison, date: Dayjs) => {
const dateToCompare = dayjs().subtract(comp.duration.asSeconds(), 'seconds');
return dateComparisonTextOp(date, comp.operator, dateToCompare);
throw new InvalidRegexError([(strict ? DURATION_REGEX : DURATION_SUBSTRING_REGEX), (strict ? ISO8601_REGEX : ISO8601_SUBSTRING_REGEX)], val)
}
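`parseDuration` now accepts a `strict` flag: by default the whole string must be a duration, while `strict = false` pulls the first duration-looking substring out of free text. A few examples of what the regex pairs above should accept (a sketch of expected behaviour, not test output):

```typescript
// Behaviour implied by DURATION_REGEX / DURATION_SUBSTRING_REGEX above (sketch).
parseDuration('2 hours');                         // ok in both modes
parseDuration('PT30M');                           // ISO8601, ok in both modes
parseDuration('remind me in 2 hours', false);     // non-strict: extracts "2 hours"
// parseDuration('remind me in 2 hours');         // strict (default): throws InvalidRegexError
```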
const SUBREDDIT_NAME_REGEX: RegExp = /^\s*(?:\/r\/|r\/)*(\w+)*\s*$/;
@@ -1449,6 +1370,18 @@ export const parseStringToRegex = (val: string, defaultFlags?: string): RegExp |
return new RegExp(result[1], flags);
}

export const parseStringToRegexOrLiteralSearch = (val: string, defaultFlags: string = 'i'): RegExp => {
const maybeRegex = parseStringToRegex(val, defaultFlags);
if (maybeRegex !== undefined) {
return maybeRegex;
}
const literalSearchRegex = parseStringToRegex(`/${escapeRegex(val.trim())}/`, defaultFlags);
if (literalSearchRegex === undefined) {
throw new SimpleError(`Could not convert test value to a valid regex: ${val}`);
}
return literalSearchRegex;
}
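The new `parseStringToRegexOrLiteralSearch` helper means a config value no longer has to be written as a `/.../` pattern: anything that does not parse as a regex literal is escaped and wrapped into a (by default case-insensitive) literal search. Expected behaviour, sketched:

```typescript
// Expected behaviour of parseStringToRegexOrLiteralSearch (sketch).
const explicit = parseStringToRegexOrLiteralSearch('/^auto.?mod/i');  // parsed as a real regex
const literal = parseStringToRegexOrLiteralSearch('bad bot.');        // falls back to an escaped, literal pattern

explicit.test('automod removed this');    // true
literal.test('what a BAD BOT. honestly'); // true -- default flags are 'i'
literal.test('bad bots');                 // false -- the '.' is escaped, not a wildcard
```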
export const parseRegex = (reg: RegExp, val: string): RegExResult => {
|
||||
|
||||
if(reg.global) {
|
||||
@@ -1766,12 +1699,25 @@ export const difference = (a: Array<any>, b: Array<any>) => {
|
||||
return Array.from(setMinus(a, b));
|
||||
}
|
||||
|
||||
// can use 'in' operator to check if object has a property with name WITHOUT TRIGGERING a snoowrap proxy to fetch
|
||||
export const isSubreddit = (value: any) => {
|
||||
try {
|
||||
return value !== null && typeof value === 'object' && (value instanceof Subreddit || ('id' in value && value.id !== undefined && value.id.includes('t5_')) || 'display_name' in value);
|
||||
} catch (e) {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
export const asSubreddit = (value: any): value is Subreddit => {
|
||||
return isSubreddit(value);
|
||||
}
|
||||
|
||||
/**
 * Cached activities lose type information when deserialized so need to check properties as well to see if the object is the shape of a Submission
 * */
export const isSubmission = (value: any) => {
    try {
        return value !== null && typeof value === 'object' && (value instanceof Submission || (value.name !== undefined && value.name.includes('t3_')));
        return value !== null && typeof value === 'object' && (value instanceof Submission || ('name' in value && value.name !== undefined && value.name.includes('t3_')));
    } catch (e) {
        return false;
    }

@@ -1783,7 +1729,7 @@ export const asSubmission = (value: any): value is Submission => {

export const isComment = (value: any) => {
    try {
        return value !== null && typeof value === 'object' && (value instanceof Comment || value.name.includes('t1_'));
        return value !== null && typeof value === 'object' && (value instanceof Comment || ('name' in value && value.name !== undefined && value.name.includes('t1_')));
    } catch (e) {
        return false;
    }

@@ -1799,7 +1745,7 @@ export const asActivity = (value: any): value is (Submission | Comment) => {

export const isUser = (value: any) => {
    try {
        return value !== null && typeof value === 'object' && (value instanceof RedditUser || value.name.includes('t2_'));
        return value !== null && typeof value === 'object' && (value instanceof RedditUser || ('name' in value && value.name !== undefined && value.name.includes('t2_')));
    } catch(e) {
        return false;
    }

@@ -1821,6 +1767,20 @@ export const userNoteCriteriaSummary = (val: UserNoteCriteria): string => {
    return `${val.count === undefined ? '>= 1' : val.count} of ${val.search === undefined ? 'current' : val.search} notes is ${val.type}`;
}

export const modActionCriteriaSummary = (val: (ModNoteCriteria | ModLogCriteria)): string => {
    const isNote = asModNoteCriteria(val);
    const preamble = `${val.count === undefined ? '>= 1' : val.count} of ${val.search === undefined ? 'current' : val.search} ${isNote ? 'notes' : 'actions'} is`;
    const filters = Object.entries(val).reduce((acc: string[], curr) => {
        if(['count', 'search'].includes(curr[0])) {
            return acc;
        }
        const vals = Array.isArray(curr[1]) ? curr[1] : [curr[1]];
        acc.push(`${curr[0]}: ${vals.join(' ,')}`)
        return acc;
    }, []);
    return `${preamble} ${filters.join(' || ')}`;
}
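Given the implementation above, a criteria object would be summarized roughly as follows; the field values and the `asModNoteCriteria` outcome are assumed for illustration:

```ts
// illustrative only; field names follow the ModNoteCriteria shape used above
const criteria = {
    count: '> 2',
    search: 'total',
    type: ['NOTE'],
    noteType: ['SPAM_WARNING', 'SPAM_WATCH']
};

// assuming asModNoteCriteria() recognizes the noteType key, this yields roughly:
// "> 2 of total notes is type: NOTE || noteType: SPAM_WARNING ,SPAM_WATCH"
modActionCriteriaSummary(criteria as any);
```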
/**
 * Serialized activities store subreddit and user properties as their string representations (instead of proxy)
 * */

@@ -2489,6 +2449,30 @@ export const normalizeCriteria = <T extends AuthorCriteria | TypedActivityState
    if(criteria.description !== undefined) {
        criteria.description = Array.isArray(criteria.description) ? criteria.description : [criteria.description];
    }
    if(criteria.modActions !== undefined) {
        criteria.modActions.map((x, index) => {
            const common = {
                ...x,
                type: x.type === undefined ? undefined : (Array.isArray(x.type) ? x.type : [x.type])
            }
            if(asModNoteCriteria(x)) {
                return {
                    ...common,
                    noteType: x.noteType === undefined ? undefined : (Array.isArray(x.noteType) ? x.noteType : [x.noteType]),
                    note: x.note === undefined ? undefined : (Array.isArray(x.note) ? x.note : [x.note]),
                }
            } else if(asModLogCriteria(x)) {
                return {
                    ...common,
                    action: x.action === undefined ? undefined : (Array.isArray(x.action) ? x.action : [x.action]),
                    details: x.details === undefined ? undefined : (Array.isArray(x.details) ? x.details : [x.details]),
                    description: x.description === undefined ? undefined : (Array.isArray(x.description) ? x.description : [x.description]),
                    activityType: x.activityType === undefined ? undefined : (Array.isArray(x.activityType) ? x.activityType : [x.activityType]),
                }
            }
            return common;
        })
    }
}

    return {

@@ -2652,6 +2636,22 @@ export const parseRedditFullname = (str: string): RedditThing | undefined => {
    }
}

export const generateSnoowrapEntityFromRedditThing = (data: RedditThing, client: Snoowrap) => {
    switch(data.type) {
        case 'comment':
            return new Comment({id: data.val}, client, false);
        case 'submission':
            return new Submission({id: data.val}, client, false);
        case 'user':
            return new RedditUser({id: data.val}, client, false);
        case 'subreddit':
            return new Subreddit({id: data.val}, client, false);
        case 'message':
            return new PrivateMessage({id: data.val}, client, false)
    }
}
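A sketch of how a fullname string might be resolved into a lazy snoowrap entity, assuming `parseRedditFullname` maps prefixes such as `t3_` to the `RedditThing` shape (`type`/`val`) used in the switch above; the credentials and the result shape shown in comments are placeholders:

```ts
import Snoowrap from 'snoowrap';

// placeholder credentials, shown only to make the sketch self-contained
const client = new Snoowrap({
    userAgent: 'context-mod-example',
    clientId: 'xxx',
    clientSecret: 'xxx',
    refreshToken: 'xxx'
});

// assumed result shape: { type: 'submission', val: 'abc123' }
const thing = parseRedditFullname('t3_abc123');
if (thing !== undefined) {
    // the third constructor argument (false) appears to mark the entity as not yet fetched,
    // so no API call happens until the entity is actually used
    const submission = generateSnoowrapEntityFromRedditThing(thing, client);
}
```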
export const activityDispatchConfigToDispatch = (config: ActivityDispatchConfig, activity: (Comment | Submission), type: ActivitySourceTypes, {action, dryRun}: {action?: string, dryRun?: boolean} = {}): ActivityDispatch => {
    let tolerantVal: boolean | Duration | undefined;
    if (config.tardyTolerant !== undefined) {

@@ -2666,7 +2666,6 @@ export const activityDispatchConfigToDispatch = (config: ActivityDispatchConfig,
        delay: parseDurationValToDuration(config.delay),
        tardyTolerant: tolerantVal,
        queuedAt: dayjs().utc(),
        processing: false,
        id: nanoid(16),
        activity,
        action,

@@ -2793,3 +2792,34 @@ export const filterByTimeRequirement = (satisfiedEndtime: Dayjs, listSlice: Snoo

    return [truncatedItems.length !== listSlice.length, truncatedItems]
}

export const between = (val: number, a: number, b: number, inclusiveMin: boolean = false, inclusiveMax: boolean = false): boolean => {
    var min = Math.min(a, b),
        max = Math.max(a, b);

    if(!inclusiveMin && !inclusiveMax) {
        return val > min && val < max;
    }
    if(inclusiveMin && inclusiveMax) {
        return val >= min && val <= max;
    }
    if(inclusiveMin) {
        return val >= min && val < max;
    }

    // inclusive max
    return val > min && val <= max;
}
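Quick illustration of the bounds handling (note that `a` and `b` may be passed in either order):

```ts
between(5, 1, 10);               // true  (exclusive on both ends by default)
between(10, 1, 10);              // false
between(10, 1, 10, false, true); // true  (inclusive max)
between(5, 10, 1);               // true  (min/max are derived with Math.min/Math.max)
```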
export const toModNoteLabel = (val: string): ModUserNoteLabel => {
    const cleanVal = val.trim().toUpperCase();
    if (asModNoteLabel(cleanVal)) {
        return cleanVal;
    }
    throw new Error(`${val} is not a valid mod note label. Must be one of: ${modUserNoteLabels.join(', ')}`);
}

export const asModNoteLabel = (val: string): val is ModUserNoteLabel => {
    return modUserNoteLabels.includes(val);
}
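These helpers normalize user supplied labels before use; roughly, and assuming `SPAM_WATCH` and `SOLID_CONTRIBUTOR` are among `modUserNoteLabels`:

```ts
toModNoteLabel(' spam_watch ');      // => 'SPAM_WATCH' (trimmed, upper-cased, then validated)
asModNoteLabel('SOLID_CONTRIBUTOR'); // => true, narrowing the string to ModUserNoteLabel
toModNoteLabel('not-a-label');       // throws: not one of modUserNoteLabels
```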
tests/languageProcessing.test.ts (Normal file, 342 lines)
@@ -0,0 +1,342 @@
import {describe, it} from 'mocha';
import chai,{assert} from 'chai';
import chaiAsPromised from 'chai-as-promised';
import {
    getContentLanguage,
    getLanguageTypeFromValue,
    getStringSentiment, parseTextToNumberComparison,
    testSentiment
} from "../src/Common/LangaugeProcessing";
import {GenericComparison, RangedComparison} from "../src/Common/Infrastructure/Comparisons";

chai.use(chaiAsPromised);

const longNeutralEnglish = "This is a normal english sentence without emotion";
const longNeutralEnglish2 = 'I am neutral on the current subject';
const longNeutralEnglish3 = 'The midterms were an election that happened';
const longNegativeEnglish = "I hate when idiots drive their bad cars terribly. 😡";
const longPositiveEnglish = "We love to be happy and laugh on this wonderful, amazing day";

const shortIndistinctEnglish = "metal gear";
const shortIndistinctEnglish2 = "idk hole ref";

const shortPositiveEnglish = "haha fun";
const shortNegativeEnglish = "fuck you";
const shortSlangPositiveEnglish = "lol lmao";
const shortSlangNegativeEnglish = "get fuked";

const longIndonesian = "setiap kali scroll mesti nampak dia nie haih";
const shortIndonesian = "Saya bangga saya rasis";
const shortPolish = 'Dobry wieczór';
const longRussian = 'Чит на золото для аватарии без скачивания бесплатно';
const longItalian = 'Sembra ormai passato un secolo, visto che gli anime sono praticamente scomparsi dalla televisione.';

const shortRomanian = 'Tu știi unde sta?';
const longRomanian = 'Deci , daca aveti chef de un mic protest , va astept la aceste coordonate';

const longFrench = "J’approuve et à ce moment là ça se soigne plus malheureusement";

const longSpanish = "La segunda parece una mezcla entre una convención de fanáticos de los monster truck y un vertedero.";
const longPositiveSpanish = 'me encanta esta hermosa cancion';
const longPositiveSpanish2 = 'Increíble muy divertido gracias por compartir';

const longGerman = "bin mir auch sicher, dass zb mein 65er halb so viel wiegt wie ein kasten Bier";

const shortEmojiNegative = "France 😫 😞 :(";
const shortEmojiPositive = "France 😂 😄 😁";

describe('Language Detection', function () {

    describe('Derives language from user input', async function () {
        it('gets from valid, case-insensitive alpha2', async function () {
            const lang = await getLanguageTypeFromValue('eN');
            assert.equal(lang.alpha2, 'en');
        });
        it('gets from valid, case-insensitive alpha3', async function () {
            const lang = await getLanguageTypeFromValue('eNg');
            assert.equal(lang.alpha2, 'en');
        });
        it('gets from valid, case-insensitive language name', async function () {
            const lang = await getLanguageTypeFromValue('EnGlIsH');
            assert.equal(lang.alpha2, 'en');
        });

        it('throws on invalid value', function () {
            assert.isRejected(getLanguageTypeFromValue('pofdsfa'))
        });
    })

    describe('Recognizes the language in moderately long content well', function () {
        it('should recognize english', async function () {
            const lang = await getContentLanguage(longPositiveEnglish);
            assert.equal(lang.language.alpha2, 'en');
            assert.isFalse(lang.usedDefault);
            assert.isAtLeast(lang.bestGuess.score, 0.9);
        });
        it('should recognize french', async function () {
            const lang = await getContentLanguage(longFrench);
            assert.equal(lang.language.alpha2, 'fr');
            assert.isFalse(lang.usedDefault);
            assert.isAtLeast(lang.bestGuess.score, 0.9);
        });
        it('should recognize spanish', async function () {
            const lang = await getContentLanguage(longSpanish);
            assert.equal(lang.language.alpha2, 'es');
            assert.isFalse(lang.usedDefault);
            assert.isAtLeast(lang.bestGuess.score, 0.9);
        });
        it('should recognize german', async function () {
            const lang = await getContentLanguage(longGerman);
            assert.equal(lang.language.alpha2, 'de');
            assert.isFalse(lang.usedDefault);
            assert.isAtLeast(lang.bestGuess.score, 0.9);
        });
        it('should recognize indonesian', async function () {
            const lang = await getContentLanguage(longIndonesian);
            assert.equal(lang.language.alpha2, 'id');
            assert.isFalse(lang.usedDefault);
            assert.isAtLeast(lang.bestGuess.score, 0.9);
        });
    });

    describe('Correctly handles short content classification', function () {
        it('uses default language', async function () {

            for (const content of [shortIndistinctEnglish, shortIndistinctEnglish2, shortIndonesian]) {
                const lang = await getContentLanguage(content);
                assert.equal(lang.language.alpha2, 'en', content);
                assert.isTrue(lang.usedDefault, content);
            }
        });

        it('uses best guess when default language is not provided', async function () {

            for (const content of [shortIndistinctEnglish, shortIndistinctEnglish2, shortIndonesian]) {
                const lang = await getContentLanguage(content, {defaultLanguage: false});
                assert.isFalse(lang.usedDefault);
            }
        });
    });
});

describe('Sentiment', function() {

    describe('Is conservative when no default language is used for short content', function() {

        it('should return unusable result for short, ambiguous english content', async function() {
            for(const content of [shortIndistinctEnglish, shortIndistinctEnglish2]) {
                const res = await getStringSentiment(content, {defaultLanguage: false});
                assert.isFalse(res.usableScore);
            }
        });

        it('should return unusable result for short, non-english content', async function() {
            for(const content of [shortIndonesian, shortPolish, shortRomanian]) {
                const res = await getStringSentiment(content, {defaultLanguage: false});
                assert.isFalse(res.usableScore);
            }
        });

    });

    describe('Is conservative when language confidence is high for unsupported languages', function() {

        it('should return unusable result for long, non-english content', async function() {
            for(const content of [longIndonesian, longRussian, longItalian, longRomanian]) {
                const res = await getStringSentiment(content);
                assert.isFalse(res.usableScore, content);
            }
        });
    });

    describe('vader/wink supersedes low confidence language guess', function() {

        it('should return usable result when valid words found by vader/wink', async function() {
            for(const content of [shortPositiveEnglish,shortNegativeEnglish]) {
                const res = await getStringSentiment(content, {defaultLanguage: false});
                assert.isTrue(res.usableScore);
            }
        });

        it('should return usable result when valid slang found by vader/wink', async function() {
            for(const content of [shortSlangPositiveEnglish,shortSlangNegativeEnglish]) {
                const res = await getStringSentiment(content, {defaultLanguage: false});
                assert.isTrue(res.usableScore);
            }
        });

        it('should return usable result when valid emojis found by vader/wink', async function() {
            for(const content of [shortEmojiPositive,shortEmojiNegative]) {
                const res = await getStringSentiment(content, {defaultLanguage: false});
                assert.isTrue(res.usableScore);
            }
        });
    })

    describe('Detects correct sentiment', function() {

        describe('In English', function() {

            it('should detect positive sentiment', async function() {
                for(const content of [shortEmojiPositive,longPositiveEnglish, shortPositiveEnglish, shortSlangPositiveEnglish]) {
                    const res = await getStringSentiment(content);
                    assert.isTrue(res.usableScore);
                    assert.isAtLeast(res.scoreWeighted, 0.1);
                }
            });

            it('should detect negative sentiment', async function() {
                for(const content of [shortEmojiNegative,longNegativeEnglish, shortNegativeEnglish, shortSlangNegativeEnglish]) {
                    const res = await getStringSentiment(content);
                    assert.isTrue(res.usableScore);
                    assert.isAtMost(res.scoreWeighted, -0.1);
                }
            });

            it('should detect neutral sentiment', async function() {
                for(const content of [longNeutralEnglish, longNeutralEnglish2, longNeutralEnglish3]) {
                    const res = await getStringSentiment(content);
                    assert.isTrue(res.usableScore, content);
                    assert.isAtMost(res.scoreWeighted, 0.1, content);
                    assert.isAtLeast(res.scoreWeighted, -0.1, content);
                }
            });

            it('should detect neutral sentiment for short content when english is default language', async function() {
                for(const content of [shortIndistinctEnglish, shortIndistinctEnglish2, shortPolish]) {
                    const res = await getStringSentiment(content);
                    assert.isTrue(res.usableScore);
                    assert.isAtMost(res.scoreWeighted, 0.1, content);
                    assert.isAtLeast(res.scoreWeighted, -0.1, content);
                }
            });
        });

        describe('In Spanish', function() {
            it('should detect positive ', async function() {
                for(const content of [longPositiveSpanish, longPositiveSpanish2]) {
                    const res = await getStringSentiment(content);
                    assert.isTrue(res.usableScore, longPositiveSpanish2);
                    assert.isAtLeast(res.scoreWeighted, 0.1, longPositiveSpanish2);
                }
            });
        });
    });

    describe('Testing', function () {

        describe('Parsing user input to comparison', function() {

            it(`parses 'is neutral'`, function() {
                const res = parseTextToNumberComparison('is neutral') as RangedComparison;
                assert.deepEqual(res.range, [-0.1, 0.1]);
                assert.isFalse(res.not);
            });

            it(`parses 'is not neutral'`, function() {
                const res = parseTextToNumberComparison('is not neutral') as RangedComparison;
                assert.deepEqual(res.range, [-0.1, 0.1]);
                assert.isTrue(res.not);
            });

            it(`parses 'is positive'`, function() {
                const res = parseTextToNumberComparison('is positive') as GenericComparison;
                assert.equal(res.operator, '>=');
                assert.equal(res.value, 0.1);
            });

            it(`parses 'is very positive'`, function() {
                const res = parseTextToNumberComparison('is very positive') as GenericComparison;
                assert.equal(res.operator, '>=');
                assert.equal(res.value, 0.3);
            });

            it(`parses 'is extremely positive'`, function() {
                const res = parseTextToNumberComparison('is extremely positive') as GenericComparison;
                assert.equal(res.operator, '>=');
                assert.equal(res.value, 0.6);
            });

            it(`parses 'is negative'`, function() {
                const res = parseTextToNumberComparison('is negative') as GenericComparison;
                assert.equal(res.operator, '<=');
                assert.equal(res.value, -0.1);
            });

            it(`parses 'is very negative'`, function() {
                const res = parseTextToNumberComparison('is very negative') as GenericComparison;
                assert.equal(res.operator, '<=');
                assert.equal(res.value, -0.3);
            });

            it(`parses 'is extremely negative'`, function() {
                const res = parseTextToNumberComparison('is extremely negative') as GenericComparison;
                assert.equal(res.operator, '<=');
                assert.equal(res.value, -0.6);
            });

            it(`parses negative negations`, function() {
                const res = parseTextToNumberComparison('is not extremely negative') as GenericComparison;
                assert.equal(res.operator, '>');
                assert.equal(res.value, -0.6);
            });

            it(`parses positive negations`, function() {
                const res = parseTextToNumberComparison('is not positive') as GenericComparison;
                assert.equal(res.operator, '<');
                assert.equal(res.value, 0.1);
            });

        });

        it('should fail test if score is unusable', async function() {

            const comparison = parseTextToNumberComparison('is positive');

            for(const content of [shortIndistinctEnglish, shortIndistinctEnglish2, shortPolish, longRomanian]) {
                const sentimentResult = await getStringSentiment(content, {defaultLanguage: false});

                const testResult = testSentiment(sentimentResult, comparison);
                assert.isFalse(testResult.passes);
            }
        });

        it('should handle generic comparisons', async function() {

            const comparison = parseTextToNumberComparison('is positive');

            for(const content of [shortEmojiPositive,longPositiveEnglish, shortPositiveEnglish, shortSlangPositiveEnglish]) {
                const sentimentResult = await getStringSentiment(content, {defaultLanguage: false});

                const testResult = testSentiment(sentimentResult, comparison);
                assert.isTrue(testResult.passes);
            }
        });

        it('should handle ranged comparisons', async function() {

            const comparison = parseTextToNumberComparison('is neutral');

            for(const content of [longNeutralEnglish, longNeutralEnglish2, longNeutralEnglish3]) {
                const sentimentResult = await getStringSentiment(content, {defaultLanguage: false});

                const testResult = testSentiment(sentimentResult, comparison);
                assert.isTrue(testResult.passes);
            }
        });

        it('should handle negated ranged comparisons', async function() {

            const comparison = parseTextToNumberComparison('is not neutral');

            for(const content of [longPositiveEnglish, longPositiveSpanish, longNegativeEnglish]) {
                const sentimentResult = await getStringSentiment(content, {defaultLanguage: false});

                const testResult = testSentiment(sentimentResult, comparison);
                assert.isTrue(testResult.passes, content);
            }
        });
    });
});
@@ -3,13 +3,18 @@ import {assert} from 'chai';
import {
    COMMENT_URL_ID,
    parseDuration,
    parseDurationComparison,
    parseGenericValueComparison,
    parseGenericValueOrPercentComparison, parseLinkIdentifier,
    parseLinkIdentifier,
    parseRedditEntity, removeUndefinedKeys, SUBMISSION_URL_ID
} from "../src/util";
import dayjs from "dayjs";
import dduration, {DurationUnitType} from 'dayjs/plugin/duration.js';
import dduration, {Duration, DurationUnitType} from 'dayjs/plugin/duration.js';
import {
    parseDurationComparison,
    parseGenericValueComparison,
    parseGenericValueOrPercentComparison
} from "../src/Common/Infrastructure/Comparisons";

dayjs.extend(dduration);


describe('Non-temporal Comparison Operations', function () {

@@ -51,12 +56,23 @@ describe('Non-temporal Comparison Operations', function () {
        const withoutPercent = parseGenericValueOrPercentComparison('<= 3');
        assert.isFalse(withoutPercent.isPercent)
    })
    it('should parse comparison with time component', function() {
        const val = parseGenericValueComparison('> 3 in 2 months');
        assert.equal(val.value, 3);
        assert.isFalse(val.isPercent);
        assert.exists(val.duration);
        assert.equal(dayjs.duration(2, 'months').milliseconds(), (val.duration as Duration).milliseconds());
    });
    it('should parse percentage comparison with time component', function() {
        const val = parseGenericValueOrPercentComparison('> 3% in 2 months');
        assert.equal(val.value, 3);
        assert.isTrue(val.isPercent);
        assert.exists(val.duration);
        assert.equal(dayjs.duration(2, 'months').milliseconds(), (val.duration as Duration).milliseconds());
    });
});

describe('Parsing Temporal Values', function () {
    before('Extend DayJS', function () {
        dayjs.extend(dduration);
    });

    describe('Temporal Comparison Operations', function () {
        it('should throw if no operator sign', function () {

@@ -150,7 +166,7 @@ describe('Parsing Reddit Entity strings', function () {
describe('Config Parsing', function () {
    describe('Deep pruning of undefined keys on config objects', function () {
        it('removes undefined keys from objects', function () {
            const obj = {
            const obj: {keyA: string, keyB: string, keyC?: string } = {
                keyA: 'foo',
                keyB: 'bar',
                keyC: undefined

@@ -166,7 +182,7 @@ describe('Config Parsing', function () {
            assert.isUndefined(removeUndefinedKeys(obj))
        })
        it('ignores arrays', function () {
            const obj = {
            const obj: { keyA?: string, keyB: string, keyC: any[] } = {
                keyA: undefined,
                keyB: 'bar',
                keyC: ['foo', 'bar']