Compare commits
119 Commits
Commit SHA1s:

```
d2d945db2c, c5018183e0, c5358f196d, 1d9f8245f9, 20b37f3a40, 910f7f79ef,
641892cd3e, 1dfb9779e7, 40111c54a2, b4745e3b45, 838da497ce, 01755eada5,
1ff59ad6e8, d8fd8e6140, 255ffdb417, f0199366a0, 20c724cab5, a670975f14,
ee13feaf57, 23a24b4448, a11b667d5e, 269b1620b9, 6dee734440, 3aea422eff,
e707e5a9a8, 2a24eea3a5, 8ad8297c0e, 0b94a14ac1, a04e0d2a9b, 3a1348c370,
507818037f, 2c1f6daf4f, fef79472fe, 885e3fa765, 0b2c0e6451, 15806b5f1f,
bf42cdf356, e21acd86db, 5dca1c9602, 5274584d92, 1d386c53a5, d6e351b195,
ea32dc0b62, dca57bb19e, 43919f7f9c, a176b51148, 75ac5297df, 0ef2b99bd6,
9596a476b5, 92f52cada5, a482e852c5, e9055e5205, df2c40d9c1, fc4eeb47fa,
9fb3eaa611, 23394ab5c2, 5417b26417, b6d638d6c5, af1dd09e2d, c42e56c68f,
561a007850, 465c3c9acf, 6cee8691f5, cfb228de73, 82a1a393de, 2fd1ffed19,
7b00e1c54b, bb2c5f076c, 8a9212def2, a9a5bd0066, f27b4a03e9, ce87285283,
220c6cdd8b, 17440025b9, 2655ae6041, a5d7b473a0, 67a04c6cc6, c687ddbe57,
980ff7da02, 0f84a7cf6b, 51a93439bb, 18f115987b, 34faf56d5d, d09a2df1e0,
5349171913, e283d81fdf, a606d6558c, cc058388d0, 4bbd170c1d, c817716aa1,
33f9b4a091, 8d8e4405e0, ee302ee430, acbac54903, 3858070cee, ac5ace1f61,
3d79a9217a, 4b6261517c, d1960c68bb, a8cc40e95d, 5c76f9ab1c, a5d3c809aa,
3b905e6961, 707547effc, 6b02350d96, 7ff8094156, 82c673c8a6, 7f742d3a30,
2442fc2483, e762cc29ef, 88db6767eb, 161251a943, 6e4b1b68e3, a6212897b3,
7b8a89e918, efd31c5f21, 868bac9f1a, adf18cc7ee, 3f1d1bc6d0
```
**`.github/workflows/dockerhub.yml`** (4 changes, vendored)

```diff
@@ -6,7 +6,7 @@ on:
       - 'master'
       - 'edge'
     tags:
-      - 'v*.*.*'
+      - '*.*.*'
   # don't trigger if just updating docs
   paths-ignore:
     - '**.md'
@@ -36,8 +36,6 @@ jobs:
           type=raw,value=latest,enable=${{ endsWith(github.ref, 'master') }}
           type=ref,event=branch,enable=${{ !endsWith(github.ref, 'master') }}
           type=semver,pattern={{version}}
           type=semver,pattern={{major}}.{{minor}}
           type=semver,pattern={{major}}
-        flavor: |
-          latest=false
```
**`README.md`** (35 changes)

```diff
@@ -15,7 +15,10 @@ An example of the above that Context Bot can do now:

 Some feature highlights:
 * Simple rule-action behavior can be combined to create any level of complexity in behavior
-* One instance can manage all moderated subreddits for the authenticated account
+* Server/client architecture
+  * Default/no configuration runs "All In One" behavior
+  * Additional configuration allows web interface to connect to multiple servers
+  * Each server instance can run multiple reddit accounts as bots
 * **Per-subreddit configuration** is handled by JSON stored in the subreddit wiki
 * Any text-based actions (comment, submission, message, usernotes, ban, etc...) can be configured via a wiki page or raw text in JSON and support [mustache](https://mustache.github.io) [templating](/docs/actionTemplating.md)
 * History-based rules support multiple "valid window" types -- [ISO 8601 Durations](https://en.wikipedia.org/wiki/ISO_8601#Durations), [Day.js Durations](https://day.js.org/docs/en/durations/creating), and submission/comment count limits.
@@ -27,7 +30,7 @@ Some feature highlights:
 * Support for [Toolbox User Notes](https://www.reddit.com/r/toolbox/wiki/docs/usernotes) as criteria or Actions (writing notes)
 * Docker container support
 * Event notification via Discord
-* **Web interface** for monitoring and administration
+* **Web interface** for monitoring, administration, and oauth bot authentication

 # Table of Contents

@@ -89,9 +92,12 @@ Context Bot's configuration can be written in JSON, [JSON5](https://json5.org/)

 ## Web UI and Screenshots

-RCB comes equipped with a web interface designed for use by both moderators and bot operators. Some feature highlights:
+### Dashboard
+
+CM comes equipped with a dashboard designed for use by both moderators and bot operators.
+
 * Authentication via Reddit OAuth -- only accessible if you are the bot operator or a moderator of a subreddit the bot moderates
 * Connect to multiple ContextMod instances (specified in configuration)
 * Monitor API usage/rates
 * Monitoring and administration **per subreddit:**
 * Start/stop/pause various bot components
@@ -102,10 +108,31 @@ RCB comes equipped with a web interface designed for use by both moderators and

 

-Additionally, a helper webpage is available to help initial setup of your bot with reddit's oauth authentication. [Learn more about using the oauth helper.](docs/botAuthentication.md#cm-oauth-helper-recommended)
+### Bot Setup/Authentication
+
+A bot oauth helper allows operators to define oauth credentials/permissions and then generate unique, one-time invite links that allow moderators to authenticate their own bots without operator assistance. [Learn more about using the oauth helper.](docs/botAuthentication.md#cm-oauth-helper-recommended)
+
+Operator view/invite link generation:
+
+
+
+Moderator view/invite and authorization:
+
+
+
+### Configuration Editor
+
+A built-in editor using [monaco-editor](https://microsoft.github.io/monaco-editor/) makes editing configurations easy:
+
+* Automatic JSON syntax validation and formatting
+* Automatic Schema (subreddit or operator) validation
+* All properties are annotated via hover popups
+* Unauthenticated view via `yourdomain.com/config`
+* Authenticated view loads subreddit configurations by simple link found on the subreddit dashboard
+* Switch schemas to edit either subreddit or operator configurations
+
+

 ## License

 [MIT](/LICENSE)
```
**`cliff.toml`** (new file, 67 lines)

```toml
# configuration file for git-cliff (0.1.0)

[changelog]
# changelog header
header = """
# Changelog
All notable changes to this project will be documented in this file.\n
"""
# template for the changelog body
# https://tera.netlify.app/docs/#introduction
body = """
{% if version %}\
## [{{ version | replace(from="v", to="") }}] - {{ timestamp | date(format="%Y-%m-%d") }}
{% else %}\
## [unreleased]
{% endif %}\
{% for group, commits in commits | group_by(attribute="group") %}
### {{ group | upper_first }}
{% for commit in commits
| filter(attribute="scope")
| sort(attribute="scope") %}
- *({{commit.scope}})* {{ commit.message | upper_first }}
{%- if commit.breaking %}
{% raw %}  {% endraw %}- **BREAKING**: {{commit.breaking_description}}
{%- endif -%}
{%- endfor -%}
{%- for commit in commits %}
{%- if commit.scope -%}
{% else -%}
- *(No Category)* {{ commit.message | upper_first }}
{% if commit.breaking -%}
{% raw %}  {% endraw %}- **BREAKING**: {{commit.breaking_description}}
{% endif -%}
{% endif -%}
{% endfor -%}
{% endfor %}
"""
# remove the leading and trailing whitespaces from the template
trim = true
# changelog footer
footer = """
<!-- generated by git-cliff -->
"""

[git]
# allow only conventional commits
# https://www.conventionalcommits.org
conventional_commits = true
# regex for parsing and grouping commits
commit_parsers = [
    { message = "^feat", group = "Features"},
    { message = "^fix", group = "Bug Fixes"},
    { message = "^doc", group = "Documentation"},
    { message = "^perf", group = "Performance"},
    { message = "^refactor", group = "Refactor"},
    { message = "^style", group = "Styling"},
    { message = "^test", group = "Testing"},
    { message = "^chore\\(release\\): prepare for", skip = true},
    { message = "^chore", group = "Miscellaneous Tasks"},
    { body = ".*security", group = "Security"},
]
# filter out the commits that are not matched by commit parsers
filter_commits = false
# glob pattern for matching git tags
tag_pattern = "v[0-9]*"
# regex for skipping tags
skip_tags = "v0.1.0-beta.1"
```
```diff
@@ -100,7 +100,7 @@ Find detailed descriptions of all the Rules, with examples, below:
 * [Repeat Activity](/docs/examples/repeatActivity)
 * [History](/docs/examples/history)
 * [Author](/docs/examples/author)
-* Regex
+* [Regex](/docs/examples/regex)

 ### Rule Set

```
```diff
@@ -12,6 +12,7 @@ At the end of this process you will have this info:
 * clientSecret
 * refreshToken
 * accessToken
+* redirectUri

 **Note:** If you already have this information you can skip this guide **but make sure your redirect uri is correct if you plan on using the web interface.**

@@ -28,11 +29,11 @@ At the end of this process you will have this info:
 Visit [your reddit preferences](https://www.reddit.com/prefs/apps) and at the bottom of the page go through the **create an(other) app** process.
 * Give it a **name**
 * Choose **web app**
-* If you know what you will use for **redirect uri** go ahead and use it, otherwise use **http://localhost:8085** for now
+* If you know what you will use for **redirect uri** go ahead and use it, otherwise use **http://localhost:8085/callback**

 Click **create app**.

-Then write down your **Client ID, Client Secret, and redirect uri** somewhere (or keep this webpage open)
+Then write down your **Client ID, Client Secret, and Redirect Uri** somewhere (or keep this webpage open)

 # Authenticate Your Bot Account

@@ -42,18 +43,31 @@ There are **two ways** you can authenticate your bot account. It is recommended

 This method will use CM's built in oauth flow. It is recommended because it will ensure your bot is authenticated with the correct oauth permissions.

-### Start CM with Client ID/Secret
+### Start CM with Client ID/Secret and Operator

-Start the application while providing the **Client ID** and **Client Secret** you received. Refer to the [operator config guide](/docs/operatorConfiguration.md) if you need help with this.
+Start the application and provide these to your configuration:
+
+* **Client ID**
+* **Client Secret**
+* **Redirect URI**
+* **Operator**
+
+It is important you define **Operator** because the auth route is **protected.** You must login to the application in order to access the route.
+
+Refer to the [operator config guide](/docs/operatorConfiguration.md) if you need help with this.

 Examples:

-* CLI - `node src/index.js --clientId=myId --clientSecret=mySecret`
-* Docker - `docker run -e "CLIENT_ID=myId" -e "CLIENT_SECRET=mySecret" foxxmd/context-mod`
+* CLI - `node src/index.js --clientId=myId --clientSecret=mySecret --redirectUri="http://localhost:8085/callback" --operator=FoxxMD`
+* Docker - `docker run -e "CLIENT_ID=myId" -e "CLIENT_SECRET=mySecret" -e "OPERATOR=FoxxMD" -e "REDIRECT_URI=http://localhost:8085/callback" foxxmd/context-mod`

-Then open the CM web interface (default is [http://localhost:8085](http://localhost:8085))
+### Create An Auth Invite

-Follow the directions in the helper to finish authenticating your bot and get your credentials (Access Token and Refresh Token)
+Then open the CM web interface (default is [http://localhost:8085](http://localhost:8085)) and login.
+
+After logging in you should be automatically redirected to the auth page. If you are not then visit [http://localhost:8085/auth/helper](http://localhost:8085/auth/helper)
+
+Follow the directions in the helper to create an **auth invite link.** Open this link and then follow the directions to authenticate your bot. At the end of the process you will receive an **Access Token** and **Refresh Token**

 ## Aardvark OAuth Helper

@@ -88,6 +102,7 @@ At the end of the last step you chose you should now have this information saved
 * clientSecret
 * refreshToken
 * accessToken
+* redirectUri

 This is all the information you need to run your bot with CM.

```
```diff
@@ -16,6 +16,7 @@ This directory contains examples of valid, ready-to-go configurations for Context
 * [Repeat Activity](/docs/examples/repeatActivity)
 * [History](/docs/examples/history)
 * [Author](/docs/examples/author)
+* [Regex](/docs/examples/regex)
 * [Toolbox User Notes](/docs/examples/userNotes)
 * [Advanced Concepts](/docs/examples/advancedConcepts)
 * [Rule Sets](/docs/examples/advancedConcepts/ruleSets.json5)

```
**`docs/examples/regex/README.md`** (new file, 20 lines)

The **Regex** rule matches on text content from a comment or submission in the same way automod uses regex. The rule, however, provides additional functionality automod does not:

* Can set the **number** of matches that trigger the rule (`matchThreshold`)

Which can then be used in conjunction with a [`window`](https://github.com/FoxxMD/context-mod/blob/master/docs/activitiesWindow.md) to match against activities from the history of the Author of the Activity being checked (including the Activity being checked):

* Can set the **number of Activities** that meet the `matchThreshold` to trigger the rule (`activityMatchThreshold`)
* Can set the **number of total matches** across all Activities to trigger the rule (`totalMatchThreshold`)
* Can set the **type of Activities** to check (`lookAt`)
* When an Activity is a Submission can **specify which parts of the Submission to match against** IE title, body, and/or url (`testOn`)

### Examples

* [Trigger if regex matches against the current activity](/docs/examples/regex/matchAnyCurrentActivity.json5)
* [Trigger if regex matches 5 times against the current activity](/docs/examples/regex/matchThresholdCurrentActivity.json5)
* [Trigger if regex matches against any part of a Submission](/docs/examples/regex/matchSubmissionParts.json5)
* [Trigger if regex matches any of Author's last 10 activities](/docs/examples/regex/matchHistoryActivity.json5)
* [Trigger if regex matches at least 3 of Author's last 10 activities](/docs/examples/regex/matchActivityThresholdHistory.json5)
* [Trigger if there are 5 regex matches in the Author's last 10 activities](/docs/examples/regex/matchTotalHistoryActivity.json5)
* [Trigger if there are 5 regex matches in the Author's last 10 comments](/docs/examples/regex/matchSubsetHistoryActivity.json5)
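These options compose. As an illustrative sketch only (assembled from the documented properties above, not one of the shipped example files -- names and values are hypothetical), several of them can be combined in a single criteria entry:

```json5
// hypothetical combination of the documented options above -- goes inside "rules": []
{
  "name": "swear-combined",
  "kind": "regex",
  "criteria": [
    {
      "regex": "/fuck|shit|damn/",
      // an individual activity only counts if it has more than 2 matches
      "matchThreshold": "> 2",
      // more than 3 activities from the window must meet matchThreshold
      "activityMatchThreshold": "> 3",
      // check the current activity plus the author's last 10 activities
      "window": 10,
      // only consider comments from the window
      "lookAt": "comments",
    },
  ]
}
```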
**`docs/examples/regex/matchActivityThresholdHistory.json5`** (new file, 20 lines)

```json5
// goes inside
// "rules": []
{
  "name": "swear",
  "kind": "regex",
  "criteria": [
    // triggers if more than 3 activities in the last 10 match the regex
    {
      "regex": "/fuck|shit|damn/",
      // this differs from "totalMatchThreshold"
      //
      // activityMatchThreshold => # of activities from window must match regex
      // totalMatchThreshold => # of matches across all activities from window must match regex
      "activityMatchThreshold": "> 3",
      // if `window` is specified it tells the rule to check the current activity as well as the activities returned from `window`
      // learn more about `window` here https://github.com/FoxxMD/context-mod/blob/master/docs/activitiesWindow.md
      "window": 10,
    },
  ]
}
```
**`docs/examples/regex/matchAnyCurrentActivity.json5`** (new file, 14 lines)

```json5
// goes inside
// "rules": []
{
  "name": "swear",
  "kind": "regex",
  "criteria": [
    // triggers if current activity has more than 0 matches
    {
      "regex": "/fuck|shit|damn/",
      // if "matchThreshold" is not specified it defaults to this -- default behavior is to trigger if there are any matches
      // "matchThreshold": "> 0"
    },
  ]
}
```
**`docs/examples/regex/matchHistoryActivity.json5`** (new file, 15 lines)

```json5
// goes inside
// "rules": []
{
  "name": "swear",
  "kind": "regex",
  "criteria": [
    // triggers if any activity in the last 10 (including current activity) match the regex
    {
      "regex": "/fuck|shit|damn/",
      // if `window` is specified it tells the rule to check the current activity as well as the activities returned from `window`
      // learn more about `window` here https://github.com/FoxxMD/context-mod/blob/master/docs/activitiesWindow.md
      "window": 10,
    },
  ]
}
```
**`docs/examples/regex/matchSubmissionParts.json5`** (new file, 19 lines)

```json5
// goes inside
// "rules": []
{
  "name": "swear",
  "kind": "regex",
  "criteria": [
    {
      // triggers if the current activity has more than 0 matches
      // if the activity is a submission then matches against title, body, and url
      // if "testOn" is not provided then `title, body` are the defaults
      "regex": "/fuck|shit|damn/",
      "testOn": [
        "title",
        "body",
        "url"
      ]
    },
  ]
}
```
**`docs/examples/regex/matchSubsetHistoryActivity.json5`** (new file, 23 lines)

```json5
// goes inside
// "rules": []
{
  "name": "swear",
  "kind": "regex",
  "criteria": [
    // triggers if there are more than 5 regex matches in the last 10 activities (comments only)
    {
      "regex": "/fuck|shit|damn/",
      // this differs from "activityMatchThreshold"
      //
      // activityMatchThreshold => # of activities from window must match regex
      // totalMatchThreshold => # of matches across all activities from window must match regex
      "totalMatchThreshold": "> 5",
      // if `window` is specified it tells the rule to check the current activity as well as the activities returned from `window`
      // learn more about `window` here https://github.com/FoxxMD/context-mod/blob/master/docs/activitiesWindow.md
      "window": 10,
      // determines which activities from window to consider
      // defaults to "all" (submissions and comments)
      "lookAt": "comments",
    },
  ]
}
```
**`docs/examples/regex/matchThresholdCurrentActivity.json5`** (new file, 13 lines)

```json5
// goes inside
// "rules": []
{
  "name": "swear",
  "kind": "regex",
  "criteria": [
    {
      "regex": "/fuck|shit|damn/",
      // triggers if current activity has greater than 5 matches
      "matchThreshold": "> 5"
    },
  ]
}
```
**`docs/examples/regex/matchTotalHistoryActivity.json5`** (new file, 21 lines)

```json5
// goes inside
// "rules": []
{
  "name": "swear",
  "kind": "regex",
  "criteria": [
    // triggers if there are more than 5 regex matches in the last 10 activities (comments or submission)
    {
      // triggers if there are more than 5 *total matches* across the last 10 activities
      "regex": "/fuck|shit|damn/",
      // this differs from "activityMatchThreshold"
      //
      // activityMatchThreshold => # of activities from window must match regex
      // totalMatchThreshold => # of matches across all activities from window must match regex
      "totalMatchThreshold": "> 5",
      // if `window` is specified it tells the rule to check the current activity as well as the activities returned from `window`
      // learn more about `window` here https://github.com/FoxxMD/context-mod/blob/master/docs/activitiesWindow.md
      "window": 10,
    },
  ]
}
```
```diff
@@ -4,7 +4,9 @@ This getting started guide is for **reddit moderators** -- that is, someone who

 # Table of Contents

 * [Prior Knowledge](#prior-knowledge)
-* [Mod the Bot](#mod-the-bot)
+* [Choose A Bot](#choose-a-bot)
+  * [Use The Operator's Bot](#use-the-operators-bot)
+  * [Bring Your Own Bot (BYOB)](#bring-your-own-bot-byob)
 * [Creating Configuration](#configuring-the-bot)
 * [Monitor the Bot](#monitor-the-bot)

@@ -15,9 +17,26 @@ Before continuing with this guide you should first make sure you understand how
 * [How It Works](/docs#how-it-works)
 * [Core Concepts](/docs#concepts)

-# Mod The Bot
+# Choose A Bot

-First ensure that you are in communication with the **operator** for this bot. The bot **will not automatically accept a moderator invitation,** it must be manually done by the bot operator. This is an intentional barrier to ensure moderators and the operator are familiar with their respective needs and have some form of trust.
+First determine what bot (reddit account) you want to run ContextMod with. (You may have already discussed this with your operator)
+
+## Use the Operator's Bot
+
+If the Operator has communicated that **you should add a bot they control as a moderator** to your subreddit this is the option you will use.
+
+**Pros:**
+
+* Do not have to create and keep track of another reddit account
+* Easiest option in terms of setup for both moderators and operator
+
+**Cons:**
+
+* Shared api quota among other moderated subreddits (not great for high-volume subreddits)
+
+___
+
+Ensure that you are in communication with the **operator** for this bot. The bot **will not automatically accept a moderator invitation,** it must be manually done by the bot operator. This is an intentional barrier to ensure moderators and the operator are familiar with their respective needs and have some form of trust.

 Now invite the bot to moderate your subreddit. The bot should have at least these permissions:

@@ -27,6 +46,30 @@ Now invite the bot to moderate your subreddit. The bot should have at least these

 Additionally, the bot must have the **Manage Wiki Pages** permission if you plan to use [Toolbox User Notes](https://www.reddit.com/r/toolbox/wiki/docs/usernotes). If you are not planning on using this feature and do not want the bot to have this permission then you **must** ensure the bot has visibility to the configuration wiki page (detailed below).

+## Bring Your Own Bot (BYOB)
+
+If the operator has communicated that **they want to use a bot you control** this is the option you will use.
+
+**Pros:**
+
+* **Dedicated API quota**
+  * This is basically a requirement if your subreddit has high-volume activity and you plan on running checks on comments
+* More security guarantees since you control the account
+  * **Note:** authenticating an account does NOT give the operator access to view or change the email/password for the account
+* Established history in your subreddit
+
+**Cons:**
+
+* More setup required for both moderators and operators
+
+___
+
+The **operator** will send you an **invite link** that you will use to authenticate your bot with the operator's application. Example link: `https://operatorsUrl.com/auth/invite?invite=4kf9n3o03ncd4nd`
+
+Review the information shown on the invite link webpage and then follow the directions shown to authorize your bot for the operator.
+
+**Note:** There is information displayed **after** authentication that you will need to communicate to your operator -- **Refresh** and **Access** token values. Make sure you save these somewhere as the invite link is **one-use only.**
+
 # Configuring the Bot

 ## Setup wiki page

```
```diff
@@ -52,7 +52,7 @@ tsc -p .

 # Bot Authentication

-Next you need to create your bot and authenticate it with Reddit. Follow the [bot authentication guide](/docs/botAuthentication.md) to complete this step.
+Next you need to create a bot and authenticate it with Reddit. Follow the [bot authentication guide](/docs/botAuthentication.md) to complete this step.

 # Instance Configuration

```
````diff
@@ -14,17 +14,15 @@ activities the Bot runs on.

 # Minimum Required Configuration

 The minimum required configuration variables to run the bot on subreddits are:

-* clientId
-* clientSecret
-* refreshToken
-* accessToken
+|    property    |   Server And Web   |    Server Only     | Web/Bot-Auth Only  |
+|:--------------:|:------------------:|:------------------:|:------------------:|
+|   `clientId`   | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
+| `clientSecret` | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
+| `redirectUri`  | :heavy_check_mark: |        :x:         | :heavy_check_mark: |
+| `refreshToken` | :heavy_check_mark: | :heavy_check_mark: |        :x:         |
+| `accessToken`  | :heavy_check_mark: | :heavy_check_mark: |        :x:         |

 However, only **clientId** and **clientSecret** are required to run the **oauth helper** mode in order to generate the last two configuration variables.

-Refer to the **[Bot Authentication guide](/docs/botAuthentication.md)** to retrieve the above credentials.
+Refer to the **[Bot Authentication guide](/docs/botAuthentication.md)** to retrieve credentials.

 # Defining Configuration

@@ -48,6 +46,17 @@ noted with the same symbol as above. The value shown is the default.

 [**See the Operator Config Schema here**](https://json-schema.app/view/%23?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Fcontext-mod%2Fmaster%2Fsrc%2FSchema%2FOperatorConfig.json)

+## Defining Multiple Bots or CM Instances
+
+One ContextMod instance can
+
+* Run multiple bots (multiple reddit accounts -- each as a bot)
+* Connect to many other, independent, ContextMod instances
+
+However, the default configuration (using **ENV/ARG**) assumes your intention is to run one bot (one reddit account) on one CM instance without these additional features. This is to make this mode of operation easier for users with this intention.
+
+To take advantage of these additional features you **must** use a **FILE** configuration. Learn about how this works and how to configure this scenario in the [Architecture Documentation.](/docs/serverClientArchitecture.md)
+
 ## CLI Usage

 Running CM from the command line is accomplished with the following command:

@@ -112,14 +121,18 @@ Below are examples of the minimum required config to run the application using a

 Using **FILE**
 <details>

-```json
+```json5
 {
-  "credentials": {
-    "clientId": "f4b4df1c7b2",
-    "clientSecret": "34v5q1c56ub",
-    "refreshToken": "34_f1w1v4",
-    "accessToken": "p75_1c467b2"
-  }
+  "bots": [
+    {
+      "credentials": {
+        "clientId": "f4b4df1c7b2",
+        "clientSecret": "34v5q1c56ub",
+        "refreshToken": "34_f1w1v4",
+        "accessToken": "p75_1c467b2"
+      }
+    }
+  ]
 }
 ```

@@ -157,10 +170,8 @@ An example of using multiple configuration levels together IE all are provided t

 ```json
 {
   "credentials": {
     "clientId": "f4b4df1c7b2",
     "refreshToken": "34_f1w1v4",
     "accessToken": "p75_1c467b2"
   },
   "logging": {
     "level": "debug"
   }
 }
 ```

@@ -173,9 +184,10 @@ An example of using multiple configuration levels together IE all are provided t

 ```
 CLIENT_SECRET=34v5q1c56ub
 REFRESH_TOKEN=34_f1w1v4
 ACCESS_TOKEN=p75_1c467b2
 SUBREDDITS=sub1,sub2,sub3
 PORT=9008
 LOG_LEVEL=DEBUG
 ```

 </details>

@@ -185,7 +197,7 @@ LOG_LEVEL=DEBUG

 <details>

 ```
-node src/index.js run --subreddits=sub1
+node src/index.js run --subreddits=sub1 --clientId=34v5q1c56ub
 ```

 </details>

@@ -202,6 +214,52 @@ port: 9008

 log level: debug
 ```

+## Configuring Client for Many Instances
+
+See the [Architecture Docs](/docs/serverClientArchitecture.md) for more information.
+
+<details>
+
+```json5
+{
+  "bots": [
+    {
+      "credentials": {
+        "clientId": "f4b4df1c7b2",
+        "clientSecret": "34v5q1c56ub",
+        "refreshToken": "34_f1w1v4",
+        "accessToken": "p75_1c467b2"
+      }
+    }
+  ],
+  "web": {
+    "credentials": {
+      "clientId": "f4b4df1c7b2",
+      "clientSecret": "34v5q1c56ub",
+      "redirectUri": "http://localhost:8085/callback"
+    },
+    "clients": [
+      // server application running on this same CM instance
+      {
+        "host": "localhost:8095",
+        "secret": "localSecret"
+      },
+      // a server application running somewhere else
+      {
+        // api endpoint and port
+        "host": "mySecondContextMod.com:8095",
+        "secret": "anotherSecret"
+      }
+    ]
+  },
+  "api": {
+    "secret": "localSecret",
+  }
+}
+```
+
+</details>
+
 # Cache Configuration

 CM implements two caching backend **providers**. By default all providers use `memory`:
````
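To make the sentence above concrete, here is a hypothetical sketch only -- the `caching` property name, `store` value, and connection fields below are illustrative assumptions, not confirmed schema; see the Operator Config Schema linked above for the real shape. A non-default cache provider would presumably be configured with connection details in place of the `memory` default:

```json5
// hypothetical sketch -- property and provider names are illustrative assumptions
{
  "caching": {
    // "memory" is the documented default; an external, persistent provider
    // (assumed here to be redis-style) would need host/port details
    "provider": {
      "store": "redis",
      "host": "localhost",
      "port": 6379
    }
  }
}
```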
**`docs/screenshots/editor.jpg`** (new binary file, 125 KiB)

**`docs/screenshots/oauth-invite.jpg`** (new binary file, 148 KiB)

Two existing screenshots updated: 126 KiB → 226 KiB and 332 KiB → 479 KiB.
71
docs/serverClientArchitecture.md
Normal file
@@ -0,0 +1,71 @@
# Overview

ContextMod's high-level functionality is separated into two **independently run** applications.

Each application consists of an [Express](https://expressjs.com/) web server that executes the core logic for that application; the two communicate via HTTP API calls.

Applications:

* **Server** -- Responsible for **running the bots** and providing an API to retrieve information on and interact with them, e.g. start/stop a bot, reload its config, retrieve operational status, etc.
* **Client** -- Responsible for serving the **web interface** and handling the bot OAuth authentication flow between operators and moderators.

Both applications operate independently and can be run individually. Which one is run is determined by environment variables, the operator config, or CLI arguments.

# Authentication

Communication between the applications is secured using [JSON Web Tokens](https://github.com/mikenicholson/passport-jwt) signed/encoded with a **shared secret** (HMAC algorithm). The secret is defined in the operator configuration.
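The token exchange can be illustrated with a minimal sketch of HMAC-SHA256 signing using only Node's `crypto` module. This is not ContextMod's actual implementation (which uses the `jsonwebtoken` and `passport-jwt` libraries); the function names and payload fields here are illustrative:

```typescript
import { createHmac } from "crypto";

// base64url-encode a JSON value (JWT segments are base64url, not plain base64)
const b64url = (obj: object): string =>
    Buffer.from(JSON.stringify(obj)).toString("base64url");

// Build a token shaped like a JWT: header.payload.signature, where the
// signature is an HMAC of the first two segments keyed by the shared secret
export function signToken(payload: object, secret: string): string {
    const header = b64url({ alg: "HS256", typ: "JWT" });
    const body = b64url(payload);
    const sig = createHmac("sha256", secret)
        .update(`${header}.${body}`)
        .digest("base64url");
    return `${header}.${body}.${sig}`;
}

// Verify by recomputing the signature with the same shared secret
export function verifyToken(token: string, secret: string): boolean {
    const [header, body, sig] = token.split(".");
    const expected = createHmac("sha256", secret)
        .update(`${header}.${body}`)
        .digest("base64url");
    return sig === expected;
}
```

Because verification only recomputes the HMAC, a token signed by the Client verifies on the Server only when both were configured with the same secret, which is why `api.secret` and the Client's `secret` for that connection must match.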
# Configuration

## Default Mode

**ContextMod is designed to operate in a "monolith" mode by default.**

This is done by assuming that when configuration is provided by **environment variables or CLI arguments** the user's intention is to run the client/server together with only one bot, as if ContextMod were a monolith application. When using these configuration types the same values are passed to both the server and the client to ensure interoperability/transparent usage for the operator. Some examples of this in the **operator configuration**:

* The **shared secret** for both client/server cannot be defined using env/cli -- at runtime a random string is generated and set as the value of `secret` on both the `api` and `web` properties.
* The `bots` array cannot be defined using env/cli -- a single entry is generated by the configuration parser using the combined values provided from env/cli.
* The `PORT` env/cli argument only applies to the `client` web server, to guarantee the default port for the `server` web server is used (so the `client` can connect to the `server`).

**The end result of this default behavior is that an operator who does not care about running multiple CM instances does not need to know or understand anything about the client/server architecture.**
## Server

To run a ContextMod instance as **server only (headless):**

* Config file -- define top-level `"mode": "server"`
* ENV -- `MODE=server`
* CLI -- `node src/index.js run server`

The relevant sections of the **operator configuration** for the **Server** are:

* [`operator.name`](https://json-schema.app/view/%23/%23%2Fproperties%2Foperator?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Fcontext-mod%2Fmaster%2Fsrc%2FSchema%2FOperatorConfig.json) -- Defines the reddit users who will have full access to this server regardless of moderator status
* `api`

### [`api`](https://json-schema.app/view/%23/%23%2Fproperties%2Fapi?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Fcontext-mod%2Fmaster%2Fsrc%2FSchema%2FOperatorConfig.json)

* `port` -- The port the Server will listen on for incoming API requests. Cannot be the same as the Client's port (when running on the same host)
* `secret` -- The **shared secret** used to verify incoming API requests from an authenticated Client.
* `friendly` -- An optional string used to identify this **Server** on the client. It is recommended to provide this; otherwise it defaults to `host:port`
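Putting the properties above together, a minimal **server only** operator config file might look like the following sketch (all values are illustrative placeholders; see the linked schema for the authoritative shape):

```json5
{
  // run only the Server (headless) application
  "mode": "server",
  "operator": {
    // reddit users with full access to this server
    "name": ["aRedditUsername"]
  },
  "api": {
    "port": 8095,
    "secret": "localSecret",
    // shown on the client instead of host:port
    "friendly": "myCMServer"
  }
}
```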
## Client

To run a ContextMod instance as **client only:**

* Config file -- define top-level `"mode": "client"`
* ENV -- `MODE=client`
* CLI -- `node src/index.js run client`

### [`web`](https://json-schema.app/view/%23/%23%2Fproperties%2Fweb?url=https%3A%2F%2Fraw.githubusercontent.com%2FFoxxMD%2Fcontext-mod%2Fmaster%2Fsrc%2FSchema%2FOperatorConfig.json)

In the **operator configuration** the top-level `web` property defines the configuration for the **Client** application.

* `web.credentials` -- Defines the reddit OAuth credentials used to authenticate users for the web interface
  * Must contain a `redirectUri` property to work
  * Credentials are parsed from ENV/CLI credentials when not specified (i.e. will be the same as the default bot)
* `web.operators` -- Parsed from `operator.name` if not specified, i.e. will use the same users as defined for the bot operators
* `port` -- The port the web interface will be served from; defaults to `8085`
* `clients` -- An array of `BotConnection` objects that specify which **Server** instances the web interface should connect to. Each object should have:
  * `host` -- The URL where the server API is listening, e.g. `localhost:8095`
  * `secret` -- The **shared secret** used to sign API calls. **This should be the same as `api.secret` on the server being connected to.**
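Correspondingly, a minimal **client only** config connecting to a single Server might look like this sketch (credential and secret values are the same placeholders used in the larger example above):

```json5
{
  // run only the Client (web interface) application
  "mode": "client",
  "web": {
    "port": 8085,
    "credentials": {
      "clientId": "f4b4df1c7b2",
      "clientSecret": "34v5q1c56ub",
      // required for the oauth flow
      "redirectUri": "http://localhost:8085/callback"
    },
    "clients": [
      {
        "host": "localhost:8095",
        // must match api.secret on the server at this host
        "secret": "localSecret"
      }
    ]
  }
}
```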
package-lock.json (generated, 1172 lines changed)
package.json (21 lines changed)
@@ -26,14 +26,18 @@
"license": "ISC",
"dependencies": {
"@awaitjs/express": "^0.8.0",
"@stdlib/regexp-regexp": "^0.0.6",
"ajv": "^7.2.4",
"async": "^3.2.0",
"autolinker": "^3.14.3",
"body-parser": "^1.19.0",
"cache-manager": "^3.4.4",
"cache-manager-redis-store": "^2.0.0",
"commander": "^8.0.0",
"cookie-parser": "^1.3.5",
"dayjs": "^1.10.5",
"deepmerge": "^4.2.2",
"delimiter-stream": "^3.0.1",
"ejs": "^3.1.6",
"env-cmd": "^10.1.0",
"es6-error": "^4.1.1",
@@ -43,25 +47,36 @@
"express-socket.io-session": "^1.3.5",
"fast-deep-equal": "^3.1.3",
"fuse.js": "^6.4.6",
"got": "^11.8.2",
"he": "^1.2.0",
"http-proxy": "^1.18.1",
"js-yaml": "^4.1.0",
"json5": "^2.2.0",
"jsonwebtoken": "^8.5.1",
"lodash": "^4.17.21",
"lru-cache": "^6.0.0",
"monaco-editor": "^0.27.0",
"mustache": "^4.2.0",
"node-fetch": "^2.6.1",
"normalize-url": "^6.1.0",
"object-hash": "^2.2.0",
"p-event": "^4.2.0",
"passport": "^0.4.1",
"passport-custom": "^1.1.1",
"passport-jwt": "^4.0.0",
"pretty-print-json": "^1.0.3",
"safe-stable-stringify": "^1.1.1",
"snoostorm": "^1.5.2",
"snoowrap": "^1.23.0",
"socket.io": "^4.1.3",
"tcp-port-used": "^1.0.2",
"triple-beam": "^1.3.0",
"typescript": "^4.3.4",
"webhook-discord": "^3.7.7",
"winston": "FoxxMD/winston#fbab8de969ecee578981c77846156c7f43b5f01e",
"winston-daily-rotate-file": "^4.5.5",
"winston-duplex": "^0.1.1",
"winston-transport": "^4.4.0",
"zlib": "^1.0.5"
},
"devDependencies": {
@@ -69,11 +84,14 @@
"@types/async": "^3.2.7",
"@types/cache-manager": "^3.4.2",
"@types/cache-manager-redis-store": "^2.0.0",
"@types/cookie-parser": "^1.4.2",
"@types/express": "^4.17.13",
"@types/express-session": "^1.17.4",
"@types/express-socket.io-session": "^1.3.6",
"@types/he": "^1.1.1",
"@types/http-proxy": "^1.17.7",
"@types/js-yaml": "^4.0.1",
"@types/jsonwebtoken": "^8.5.4",
"@types/lodash": "^4.14.171",
"@types/lru-cache": "^5.1.1",
"@types/memory-cache": "^0.2.1",
@@ -81,7 +99,10 @@
"@types/node": "^15.6.1",
"@types/node-fetch": "^2.5.10",
"@types/object-hash": "^2.1.0",
"@types/passport": "^1.0.7",
"@types/passport-jwt": "^3.0.6",
"@types/tcp-port-used": "^1.0.0",
"@types/triple-beam": "^1.3.2",
"ts-auto-guard": "*",
"ts-json-schema-generator": "^0.93.0",
"typescript-json-schema": "^0.50.1"
@@ -9,28 +9,30 @@ import {UserNoteAction, UserNoteActionJson} from "./UserNoteAction";
import ApproveAction, {ApproveActionConfig} from "./ApproveAction";
import BanAction, {BanActionJson} from "./BanAction";
import {MessageAction, MessageActionJson} from "./MessageAction";
import {SubredditResources} from "../Subreddit/SubredditResources";
import Snoowrap from "snoowrap";

export function actionFactory
(config: ActionJson, logger: Logger, subredditName: string): Action {
(config: ActionJson, logger: Logger, subredditName: string, resources: SubredditResources, client: Snoowrap): Action {
switch (config.kind) {
case 'comment':
return new CommentAction({...config as CommentActionJson, logger, subredditName});
return new CommentAction({...config as CommentActionJson, logger, subredditName, resources, client});
case 'lock':
return new LockAction({...config, logger, subredditName});
return new LockAction({...config, logger, subredditName, resources, client});
case 'remove':
return new RemoveAction({...config, logger, subredditName});
return new RemoveAction({...config, logger, subredditName, resources, client});
case 'report':
return new ReportAction({...config as ReportActionJson, logger, subredditName});
return new ReportAction({...config as ReportActionJson, logger, subredditName, resources, client});
case 'flair':
return new FlairAction({...config as FlairActionJson, logger, subredditName});
return new FlairAction({...config as FlairActionJson, logger, subredditName, resources, client});
case 'approve':
return new ApproveAction({...config as ApproveActionConfig, logger, subredditName});
return new ApproveAction({...config as ApproveActionConfig, logger, subredditName, resources, client});
case 'usernote':
return new UserNoteAction({...config as UserNoteActionJson, logger, subredditName});
return new UserNoteAction({...config as UserNoteActionJson, logger, subredditName, resources, client});
case 'ban':
return new BanAction({...config as BanActionJson, logger, subredditName});
return new BanAction({...config as BanActionJson, logger, subredditName, resources, client});
case 'message':
return new MessageAction({...config as MessageActionJson, logger, subredditName});
return new MessageAction({...config as MessageActionJson, logger, subredditName, resources, client});
default:
throw new Error('rule "kind" was not recognized.');
}

@@ -2,23 +2,33 @@ import {ActionJson, ActionConfig} from "./index";
import Action from "./index";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {RuleResult} from "../Rule";
import {ActionProcessResult} from "../Common/interfaces";

export class ApproveAction extends Action {
getKind() {
return 'Approve';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
//snoowrap typing issue, thinks comments can't be locked
// @ts-ignore
if (item.approved) {
this.logger.warn('Item is already approved');
return {
dryRun,
success: false,
result: 'Item is already approved'
}
}
if (!dryRun) {
// @ts-ignore
await item.approve();
}
return {
dryRun,
success: true,
}
}
}

@@ -3,7 +3,7 @@ import Action from "./index";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {RuleResult} from "../Rule";
import {renderContent} from "../Utils/SnoowrapUtils";
import {Footer} from "../Common/interfaces";
import {ActionProcessResult, Footer} from "../Common/interfaces";

export class BanAction extends Action {

@@ -33,7 +33,7 @@ export class BanAction extends Action {
return 'Ban';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
const content = this.message === undefined ? undefined : await this.resources.getContent(this.message, item.subreddit);
const renderedBody = content === undefined ? undefined : await renderContent(content, item, ruleResults, this.resources.userNotes);
@@ -58,6 +58,11 @@ export class BanAction extends Action {
duration: this.duration
});
}
return {
dryRun,
success: true,
result: `Banned ${item.author.name} ${durText}${this.reason !== undefined ? ` (${this.reason})` : ''}`
};
}
}

@@ -2,8 +2,9 @@ import Action, {ActionJson, ActionOptions} from "./index";
import {Comment} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {renderContent} from "../Utils/SnoowrapUtils";
import {Footer, RequiredRichContent, RichContent} from "../Common/interfaces";
import {ActionProcessResult, Footer, RequiredRichContent, RichContent} from "../Common/interfaces";
import {RuleResult} from "../Rule";
import {truncateStringToLength} from "../util";

export class CommentAction extends Action {
content: string;
@@ -32,7 +33,7 @@ export class CommentAction extends Action {
return 'Comment';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
const content = await this.resources.getContent(this.content, item.subreddit);
const body = await renderContent(content, item, ruleResults, this.resources.userNotes);
@@ -44,7 +45,11 @@ export class CommentAction extends Action {

if(item.archived) {
this.logger.warn('Cannot comment because Item is archived');
return;
return {
dryRun,
success: false,
result: 'Cannot comment because Item is archived'
};
}
let reply: Comment;
if(!dryRun) {
@@ -62,6 +67,19 @@ export class CommentAction extends Action {
// @ts-ignore
await reply.distinguish({sticky: this.sticky});
}
let modifiers = [];
if(this.distinguish) {
modifiers.push('Distinguished');
}
if(this.sticky) {
modifiers.push('Stickied');
}
const modifierStr = modifiers.length === 0 ? '' : `[${modifiers.join(' | ')}]`;
return {
dryRun,
success: true,
result: `${modifierStr}${this.lock ? ' - Locked Author\'s Activity - ' : ''}${truncateStringToLength(100)(body)}`
};
}
}

@@ -2,24 +2,34 @@ import {ActionJson, ActionConfig} from "./index";
import Action from "./index";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {RuleResult} from "../Rule";
import {ActionProcessResult} from "../Common/interfaces";

export class LockAction extends Action {
getKind() {
return 'Lock';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
//snoowrap typing issue, thinks comments can't be locked
// @ts-ignore
if (item.locked) {
this.logger.warn('Item is already locked');
return {
dryRun,
success: false,
result: 'Item is already locked'
};
}
if (!dryRun) {
//snoowrap typing issue, thinks comments can't be locked
// @ts-ignore
await item.lock();
}
return {
dryRun,
success: true
}
}
}

@@ -1,10 +1,18 @@
import Action, {ActionJson, ActionOptions} from "./index";
import {Comment, ComposeMessageParams} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {renderContent, singleton} from "../Utils/SnoowrapUtils";
import {Footer, RequiredRichContent, RichContent} from "../Common/interfaces";
import {renderContent} from "../Utils/SnoowrapUtils";
import {ActionProcessResult, Footer, RequiredRichContent, RichContent} from "../Common/interfaces";
import {RuleResult} from "../Rule";
import {boolToString} from "../util";
import {
asSubmission,
boolToString,
isSubmission,
parseRedditEntity,
REDDIT_ENTITY_REGEX_URL,
truncateStringToLength
} from "../util";
import SimpleError from "../Utils/SimpleError";

export class MessageAction extends Action {
content: string;
@@ -14,6 +22,7 @@ export class MessageAction extends Action {
footer?: false | string;

title?: string;
to?: string;
asSubreddit: boolean;

constructor(options: MessageActionOptions) {
@@ -23,7 +32,9 @@ export class MessageAction extends Action {
asSubreddit,
title,
footer,
to,
} = options;
this.to = to;
this.footer = footer;
this.content = content;
this.asSubreddit = asSubreddit;
@@ -34,7 +45,7 @@ export class MessageAction extends Action {
return 'Message';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
const content = await this.resources.getContent(this.content);
const body = await renderContent(content, item, ruleResults, this.resources.userNotes);
@@ -42,21 +53,38 @@ export class MessageAction extends Action {
const footer = await this.resources.generateFooter(item, this.footer);

const renderedContent = `${body}${footer}`;
// @ts-ignore
const author = await item.author.fetch() as RedditUser;

const client = singleton.getClient();
let recipient = item.author.name;
if(this.to !== undefined) {
// parse to value
try {
const entityData = parseRedditEntity(this.to);
if(entityData.type === 'user') {
recipient = entityData.name;
} else {
recipient = `/r/${entityData.name}`;
}
} catch (err) {
this.logger.error(`'to' field for message was not in a valid format. See ${REDDIT_ENTITY_REGEX_URL} for valid examples`);
this.logger.error(err);
err.logged = true;
throw err;
}
if(recipient.includes('/r/') && this.asSubreddit) {
throw new SimpleError(`Cannot send a message as a subreddit to another subreddit. Requested recipient: ${recipient}`);
}
}

const msgOpts: ComposeMessageParams = {
to: author,
to: recipient,
text: renderedContent,
// @ts-ignore
fromSubreddit: this.asSubreddit ? await item.subreddit.fetch() : undefined,
subject: this.title || `Concerning your ${item instanceof Submission ? 'Submission' : 'Comment'}`,
subject: this.title || `Concerning your ${isSubmission(item) ? 'Submission' : 'Comment'}`,
};

const msgPreview = `\r\n
TO: ${author.name}\r\n
TO: ${recipient}\r\n
Subject: ${msgOpts.subject}\r\n
Sent As Modmail: ${boolToString(this.asSubreddit)}\r\n\r\n
${renderedContent}`;
@@ -64,7 +92,12 @@ export class MessageAction extends Action {
this.logger.verbose(`Message Preview => \r\n ${msgPreview}`);

if (!dryRun) {
await client.composeMessage(msgOpts);
await this.client.composeMessage(msgOpts);
}
return {
dryRun,
success: true,
result: truncateStringToLength(200)(msgPreview)
}
}
}
@@ -75,6 +108,24 @@ export interface MessageActionConfig extends RequiredRichContent, Footer {
* */
asSubreddit: boolean

/**
* Entity to send message to.
*
* If not present Message be will sent to the Author of the Activity being checked.
*
* Valid formats:
*
* * `aUserName` -- send to /u/aUserName
* * `u/aUserName` -- send to /u/aUserName
* * `r/aSubreddit` -- sent to modmail of /r/aSubreddit
*
* **Note:** Reddit does not support sending a message AS a subreddit TO another subreddit
*
* @pattern ^\s*(\/[ru]\/|[ru]\/)*(\w+)*\s*$
* @examples ["aUserName","u/aUserName","r/aSubreddit"]
* */
to?: string

/**
* The title of the message
*

@@ -3,24 +3,33 @@ import Action from "./index";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {RuleResult} from "../Rule";
import {activityIsRemoved} from "../Utils/SnoowrapUtils";
import {ActionProcessResult} from "../Common/interfaces";

export class RemoveAction extends Action {
getKind() {
return 'Remove';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
// issue with snoowrap typings, doesn't think prop exists on Submission
// @ts-ignore
if (activityIsRemoved(item)) {
this.logger.warn('Item is already removed');
return;
return {
dryRun,
success: false,
result: 'Item is already removed',
}
}
if (!dryRun) {
// @ts-ignore
await item.remove();
}

return {
dryRun,
success: true,
}
}
}

@@ -4,7 +4,7 @@ import Snoowrap, {Comment, Submission} from "snoowrap";
import {truncateStringToLength} from "../util";
import {renderContent} from "../Utils/SnoowrapUtils";
import {RuleResult} from "../Rule";
import {RichContent} from "../Common/interfaces";
import {ActionProcessResult, RichContent} from "../Common/interfaces";

// https://www.reddit.com/dev/api/oauth#POST_api_report
// denotes 100 characters maximum
@@ -23,7 +23,7 @@ export class ReportAction extends Action {
return 'Report';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
const content = await this.resources.getContent(this.content, item.subreddit);
const renderedContent = await renderContent(content, item, ruleResults, this.resources.userNotes);
@@ -33,6 +33,12 @@ export class ReportAction extends Action {
// @ts-ignore
await item.report({reason: truncatedContent});
}

return {
dryRun,
success: true,
result: truncatedContent
};
}
}

@@ -2,6 +2,7 @@ import {SubmissionActionConfig} from "./index";
import Action, {ActionJson, ActionOptions} from "../index";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {RuleResult} from "../../Rule";
import {ActionProcessResult} from "../../Common/interfaces";

export class FlairAction extends Action {
text: string;
@@ -20,7 +21,17 @@ export class FlairAction extends Action {
return 'Flair';
}

async process(item: Comment | Submission, ruleResults: RuleResult[]): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
let flairParts = [];
if(this.text !== '') {
flairParts.push(`Text: ${this.text}`);
}
if(this.css !== '') {
flairParts.push(`CSS: ${this.css}`);
}
const flairSummary = flairParts.length === 0 ? 'No flair (unflaired)' : flairParts.join(' | ');
this.logger.verbose(flairSummary);
if (item instanceof Submission) {
if(!this.dryRun) {
// @ts-ignore
@@ -28,6 +39,16 @@ export class FlairAction extends Action {
}
} else {
this.logger.warn('Cannot flair Comment');
return {
dryRun,
success: false,
result: 'Cannot flair Comment',
}
}
return {
dryRun,
success: true,
result: flairSummary
}
}
}

@@ -5,6 +5,7 @@ import {renderContent} from "../Utils/SnoowrapUtils";
import {RuleResult} from "../Rule";
import {UserNote, UserNoteJson} from "../Subreddit/UserNotes";
import Submission from "snoowrap/dist/objects/Submission";
import {ActionProcessResult} from "../Common/interfaces";

export class UserNoteAction extends Action {
@@ -24,7 +25,7 @@ export class UserNoteAction extends Action {
return 'User Note';
}

async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionProcessResult> {
const dryRun = runtimeDryrun || this.dryRun;
const content = await this.resources.getContent(this.content, item.subreddit);
const renderedContent = await renderContent(content, item, ruleResults, this.resources.userNotes);
@@ -35,7 +36,11 @@ export class UserNoteAction extends Action {
const existingNote = notes.find((x) => x.link.includes(item.id));
if (existingNote) {
this.logger.info(`Will not add note because one already exists for this Activity (${existingNote.time.local().format()}) and allowDuplicate=false`);
return;
return {
dryRun,
success: false,
result: `Will not add note because one already exists for this Activity (${existingNote.time.local().format()}) and allowDuplicate=false`
};
}
}
if (!dryRun) {
@@ -43,6 +48,11 @@ export class UserNoteAction extends Action {
} else if (!await this.resources.userNotes.warningExists(this.type)) {
this.logger.warn(`UserNote type '${this.type}' does not exist. If you meant to use this please add it through Toolbox first.`);
}
return {
success: true,
dryRun,
result: `(${this.type}) ${renderedContent}`
}
}
}

@@ -1,14 +1,17 @@
import {Comment, Submission} from "snoowrap";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {Logger} from "winston";
import {RuleResult} from "../Rule";
import ResourceManager, {SubredditResources} from "../Subreddit/SubredditResources";
import {ChecksActivityState, TypedActivityStates} from "../Common/interfaces";
import {SubredditResources} from "../Subreddit/SubredditResources";
import {ActionProcessResult, ActionResult, ChecksActivityState, TypedActivityStates} from "../Common/interfaces";
import Author, {AuthorOptions} from "../Author/Author";
import {mergeArr} from "../util";
import LoggedError from "../Utils/LoggedError";

export abstract class Action {
name?: string;
logger: Logger;
resources: SubredditResources;
client: Snoowrap
authorIs: AuthorOptions;
itemIs: TypedActivityStates;
dryRun: boolean;
@@ -18,6 +21,8 @@ export abstract class Action {
const {
enable = true,
name = this.getKind(),
resources,
client,
logger,
subredditName,
dryRun = false,
@@ -31,8 +36,9 @@ export abstract class Action {
this.name = name;
this.dryRun = dryRun;
this.enabled = enable;
this.resources = ResourceManager.get(subredditName) as SubredditResources;
this.logger = logger.child({labels: [`Action ${this.getActionUniqueName()}`]});
this.resources = resources;
this.client = client;
this.logger = logger.child({labels: [`Action ${this.getActionUniqueName()}`]}, mergeArr);

this.authorIs = {
exclude: exclude.map(x => new Author(x)),
@@ -48,52 +54,68 @@ export abstract class Action {
return this.name === this.getKind() ? this.getKind() : `${this.getKind()} - ${this.name}`;
}

async handle(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<void> {
async handle(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionResult> {
const dryRun = runtimeDryrun || this.dryRun;
let actionRun = false;
const itemPass = await this.resources.testItemCriteria(item, this.itemIs);
if (!itemPass) {
this.logger.verbose(`Activity did not pass 'itemIs' test, Action not run`);
return;
}
const authorRun = async () => {

let actRes: ActionResult = {
kind: this.getKind(),
name: this.getActionUniqueName(),
run: false,
dryRun,
success: false,
};
try {
const itemPass = await this.resources.testItemCriteria(item, this.itemIs);
if (!itemPass) {
this.logger.verbose(`Activity did not pass 'itemIs' test, Action not run`);
actRes.runReason = `Activity did not pass 'itemIs' test, Action not run`;
return actRes;
}
if (this.authorIs.include !== undefined && this.authorIs.include.length > 0) {
for (const auth of this.authorIs.include) {
if (await this.resources.testAuthorCriteria(item, auth)) {
await this.process(item, ruleResults, runtimeDryrun);
return true;
actRes.run = true;
const results = await this.process(item, ruleResults, runtimeDryrun);
return {...actRes, ...results};
}
}
this.logger.verbose('Inclusive author criteria not matched, Action not run');
return false;
}
if (!actionRun && this.authorIs.exclude !== undefined && this.authorIs.exclude.length > 0) {
actRes.runReason = 'Inclusive author criteria not matched';
return actRes;
} else if (this.authorIs.exclude !== undefined && this.authorIs.exclude.length > 0) {
for (const auth of this.authorIs.exclude) {
if (await this.resources.testAuthorCriteria(item, auth, false)) {
await this.process(item, ruleResults, runtimeDryrun);
return true;
actRes.run = true;
const results = await this.process(item, ruleResults, runtimeDryrun);
return {...actRes, ...results};
}
}
this.logger.verbose('Exclusive author criteria not matched, Action not run');
return false;
actRes.runReason = 'Exclusive author criteria not matched';
return actRes;
}
return null;

actRes.run = true;
const results = await this.process(item, ruleResults, runtimeDryrun);
return {...actRes, ...results};
} catch (err) {
if(!(err instanceof LoggedError)) {
this.logger.error(`Encountered error while running`, err);
}
actRes.success = false;
actRes.result = err.message;
return actRes;
}
const authorRunResults = await authorRun();
if (null === authorRunResults) {
await this.process(item, ruleResults, runtimeDryrun);
} else if (!authorRunResults) {
return;
}
this.logger.verbose(`${dryRun ? 'DRYRUN - ' : ''}Done`);
}

abstract process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryun?: boolean): Promise<void>;
abstract process(item: Comment | Submission, ruleResults: RuleResult[], runtimeDryun?: boolean): Promise<ActionProcessResult>;
}

export interface ActionOptions extends ActionConfig {
logger: Logger;
subredditName: string;
resources: SubredditResources
client: Snoowrap
}

export interface ActionConfig extends ChecksActivityState {
502
src/App.ts
@@ -1,490 +1,94 @@
import Snoowrap, {Subreddit} from "snoowrap";
import {Manager} from "./Subreddit/Manager";
import winston, {Logger} from "winston";
import {
    argParseInt,
    createRetryHandler, formatNumber,
    labelledFormat, logLevels,
    parseBool, parseDuration,
    parseFromJsonOrYamlToObject,
    parseSubredditName,
    sleep
} from "./util";
import pEvent from "p-event";
import EventEmitter from "events";
import CacheManager from './Subreddit/SubredditResources';
import dayjs, {Dayjs} from "dayjs";
import LoggedError from "./Utils/LoggedError";
import {ProxiedSnoowrap, RequestTrackingSnoowrap} from "./Utils/SnoowrapClients";
import {ModQueueStream, UnmoderatedStream} from "./Subreddit/Streams";
import {getLogger} from "./Utils/loggerFactory";
import {DurationString, OperatorConfig, PAUSED, RUNNING, STOPPED, SYSTEM, USER} from "./Common/interfaces";
import { Duration } from "dayjs/plugin/duration";
import {singleton} from "./Utils/SnoowrapUtils";

const {transports} = winston;

const snooLogWrapper = (logger: Logger) => {
    return {
        warn: (...args: any[]) => logger.warn(args.slice(0, 2).join(' '), [args.slice(2)]),
        debug: (...args: any[]) => logger.debug(args.slice(0, 2).join(' '), [args.slice(2)]),
        info: (...args: any[]) => logger.info(args.slice(0, 2).join(' '), [args.slice(2)]),
        trace: (...args: any[]) => logger.debug(args.slice(0, 2).join(' '), [args.slice(2)]),
    }
}
import {Invokee, OperatorConfig} from "./Common/interfaces";
import Bot from "./Bot";
import LoggedError from "./Utils/LoggedError";

export class App {

    client: Snoowrap;
    subreddits: string[];
    subManagers: Manager[] = [];
    bots: Bot[]
    logger: Logger;
    wikiLocation: string;
    dryRun?: true | undefined;
    heartbeatInterval: number;
    nextHeartbeat?: Dayjs;
    heartBeating: boolean = false;
    //apiLimitWarning: number;
    softLimit: number | string = 250;
    hardLimit: number | string = 50;
    nannyMode?: 'soft' | 'hard';
    nextExpiration!: Dayjs;
    botName!: string;
    botLink!: string;
    maxWorkers: number;
    startedAt: Dayjs = dayjs();
    sharedModqueue: boolean = false;

    apiSample: number[] = [];
    interval: any;
    apiRollingAvg: number = 0;
    apiEstDepletion?: Duration;
    depletedInSecs: number = 0;
    error: any;

    constructor(config: OperatorConfig) {
        const {
            operator: {
                botName,
                name,
            },
            subreddits: {
                names = [],
                wikiConfig,
                dryRun,
                heartbeatInterval,
            },
            credentials: {
                clientId,
                clientSecret,
                refreshToken,
                accessToken,
            },
            snoowrap: {
                proxy,
                debug,
            },
            polling: {
                sharedMod,
            },
            queue: {
                maxWorkers,
            },
            caching: {
                authorTTL,
                provider: {
                    store
                }
            },
            api: {
                softLimit,
                hardLimit,
            }
            notifications,
            bots = [],
        } = config;

        CacheManager.setDefaultsFromConfig(config);

        this.dryRun = parseBool(dryRun) === true ? true : undefined;
        this.heartbeatInterval = heartbeatInterval;
        //this.apiLimitWarning = argParseInt(apiLimitWarning);
        this.softLimit = softLimit;
        this.hardLimit = hardLimit;
        this.wikiLocation = wikiConfig;
        this.sharedModqueue = sharedMod;
        if(botName !== undefined) {
            this.botName = botName;
        }

        this.logger = getLogger(config.logging);

        this.logger.info(`Operators: ${name.length === 0 ? 'None Specified' : name.join(', ')}`)

        let mw = maxWorkers;
        if(maxWorkers < 1) {
            this.logger.warn(`Max queue workers must be greater than or equal to 1 (Specified: ${maxWorkers})`);
            mw = 1;
        }
        this.maxWorkers = mw;
        this.bots = bots.map(x => new Bot(x, this.logger));

        if (this.dryRun) {
            this.logger.info('Running in DRYRUN mode');
        }

        this.subreddits = names.map(parseSubredditName);

        const creds = {
            userAgent: `web:contextBot:dev`,
            clientId,
            clientSecret,
            refreshToken,
            accessToken,
        };

        const missingCreds = [];
        for(const [k,v] of Object.entries(creds)) {
            if(v === undefined || v === '' || v === null) {
                missingCreds.push(k);
            }
        }
        if(missingCreds.length > 0) {
            this.logger.error('There are credentials missing that would prevent initializing the Reddit API Client and subsequently the rest of the application');
            this.logger.error(`Missing credentials: ${missingCreds.join(', ')}`)
            this.logger.info(`If this is a first-time setup use the 'web' command for a web-based guide to configuring your application`);
            this.logger.info(`Or check the USAGE section of the readme for the correct naming of these arguments/environment variables`);
            throw new LoggedError(`Missing credentials: ${missingCreds.join(', ')}`);
        }

        this.client = proxy === undefined ? new Snoowrap(creds) : new ProxiedSnoowrap({...creds, proxy});
        this.client.config({
            warnings: true,
            maxRetryAttempts: 5,
            debug,
            logger: snooLogWrapper(this.logger.child({labels: ['Snoowrap']})),
            continueAfterRatelimitError: true,
        process.on('uncaughtException', (e) => {
            this.error = e;
        });

        singleton.setClient(this.client);

        const retryHandler = createRetryHandler({maxRequestRetry: 8, maxOtherRetry: 1}, this.logger);

        const modStreamErrorListener = (name: string) => async (err: any) => {
            this.logger.error('Polling error occurred', err);
            const shouldRetry = await retryHandler(err);
            if(shouldRetry) {
                defaultUnmoderatedStream.startInterval();
            } else {
                for(const m of this.subManagers) {
                    if(m.modStreamCallbacks.size > 0) {
                        m.notificationManager.handle('runStateChanged', `${name.toUpperCase()} Polling Stopped`, 'Encountered too many errors from Reddit while polling. Will try to restart on next heartbeat.');
                    }
        process.on('unhandledRejection', (e) => {
            this.error = e;
        });
        process.on('exit', async (code) => {
            if(code === 0) {
                await this.onTerminate();
            } else if(this.error !== undefined) {
                let errMsg;
                if(typeof this.error === 'object' && this.error.message !== undefined) {
                    errMsg = this.error.message;
                } else if(typeof this.error === 'string') {
                    errMsg = this.error;
                }
                this.logger.error(`Mod stream ${name.toUpperCase()} encountered too many errors while polling. Will try to restart on next heartbeat.`);
                await this.onTerminate(`Application exited due to an unexpected error${errMsg !== undefined ? `: ${errMsg}` : ''}`);
            } else {
                await this.onTerminate(`Application exited with unclean exit signal (${code})`);
            }
        }

        const defaultUnmoderatedStream = new UnmoderatedStream(this.client, {subreddit: 'mod'});
        // @ts-ignore
        defaultUnmoderatedStream.on('error', modStreamErrorListener('unmoderated'));
        const defaultModqueueStream = new ModQueueStream(this.client, {subreddit: 'mod'});
        // @ts-ignore
        defaultModqueueStream.on('error', modStreamErrorListener('modqueue'));
        CacheManager.modStreams.set('unmoderated', defaultUnmoderatedStream);
        CacheManager.modStreams.set('modqueue', defaultModqueueStream);
        });
    }

    async onTerminate(reason = 'The application was shutdown') {
        for(const m of this.subManagers) {
            await m.notificationManager.handle('runStateChanged', 'Application Shutdown', reason);
        for(const b of this.bots) {
            for(const m of b.subManagers) {
                await m.notificationManager.handle('runStateChanged', 'Application Shutdown', reason);
            }
            //await b.notificationManager.handle('runStateChanged', 'Application Shutdown', reason);
        }
    }

    async testClient() {
        try {
            // @ts-ignore
            await this.client.getMe();
            this.logger.info('Test API call successful');
        } catch (err) {
            this.logger.error('An error occurred while trying to initialize the Reddit API Client which would prevent the entire application from running.');
            if(err.name === 'StatusCodeError') {
                const authHeader = err.response.headers['www-authenticate'];
                if (authHeader !== undefined && authHeader.includes('insufficient_scope')) {
                    this.logger.error('Reddit responded with a 403 insufficient_scope. Please ensure you have chosen the correct scopes when authorizing your account.');
                } else if(err.statusCode === 401) {
                    this.logger.error('It is likely a credential is missing or incorrect. Check clientId, clientSecret, refreshToken, and accessToken');
                }
                this.logger.error(`Error Message: ${err.message}`);
            } else {
                this.logger.error(err);
            }
            err.logged = true;
            throw err;
        }
    }

    async buildManagers(subreddits: string[] = []) {
        let availSubs = [];
        // @ts-ignore
        const user = await this.client.getMe().fetch();
        this.botLink = `https://reddit.com/user/${user.name}`;
        this.logger.info(`Reddit API Limit Remaining: ${this.client.ratelimitRemaining}`);
        this.logger.info(`Authenticated Account: u/${user.name}`);

        const botNameFromConfig = this.botName !== undefined;
        if(this.botName === undefined) {
            this.botName = `u/${user.name}`;
        }
        this.logger.info(`Bot Name${botNameFromConfig ? ' (from config)' : ''}: ${this.botName}`);

        for (const sub of await this.client.getModeratedSubreddits()) {
            // TODO don't know a way to check permissions yet
            availSubs.push(sub);
        }
        this.logger.info(`${this.botName} is a moderator of these subreddits: ${availSubs.map(x => x.display_name_prefixed).join(', ')}`);

        let subsToRun: Subreddit[] = [];
        const subsToUse = subreddits.length > 0 ? subreddits.map(parseSubredditName) : this.subreddits;
        if (subsToUse.length > 0) {
            this.logger.info(`Operator-defined subreddit constraints detected (CLI argument or environmental variable), will try to run on: ${subsToUse.join(', ')}`);
            for (const sub of subsToUse) {
                const asub = availSubs.find(x => x.display_name.toLowerCase() === sub.toLowerCase())
                if (asub === undefined) {
                    this.logger.warn(`Will not run on ${sub} because it is not modded by, or does not have appropriate permissions to mod with, this client.`);
                } else {
                    // @ts-ignore
                    const fetchedSub = await asub.fetch();
                    subsToRun.push(fetchedSub);
                }
            }
        } else {
            // otherwise assume all moddable subs from client should be run on
            this.logger.info('No user-defined subreddit constraints detected, will try to run on all');
            subsToRun = availSubs;
        }

        let subSchedule: Manager[] = [];
        // get configs for subs we want to run on and build/validate them
        for (const sub of subsToRun) {
            const manager = new Manager(sub, this.client, this.logger, {dryRun: this.dryRun, sharedModqueue: this.sharedModqueue, wikiLocation: this.wikiLocation, botName: this.botName, maxWorkers: this.maxWorkers});
            try {
                await manager.parseConfiguration('system', true, {suppressNotification: true});
            } catch (err) {
                if (!(err instanceof LoggedError)) {
                    this.logger.error(`Config was not valid:`, {subreddit: sub.display_name_prefixed});
                    this.logger.error(err, {subreddit: sub.display_name_prefixed});
                }
            }
            subSchedule.push(manager);
        }
        this.subManagers = subSchedule;
    }

    async heartbeat() {
        try {
            this.heartBeating = true;
            while (true) {
                this.nextHeartbeat = dayjs().add(this.heartbeatInterval, 'second');
                await sleep(this.heartbeatInterval * 1000);
                const heartbeat = `HEARTBEAT -- API Remaining: ${this.client.ratelimitRemaining} | Usage Rolling Avg: ~${formatNumber(this.apiRollingAvg)}/s | Est Depletion: ${this.apiEstDepletion === undefined ? 'N/A' : this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`
                this.logger.info(heartbeat);
                for (const s of this.subManagers) {
                    if(s.botState.state === STOPPED && s.botState.causedBy === USER) {
                        this.logger.debug('Skipping config check/restart on heartbeat due to previously being stopped by user', {subreddit: s.displayLabel});
                        continue;
                    }
                    try {
                        const newConfig = await s.parseConfiguration();
                        if(newConfig || (s.queueState.state !== RUNNING && s.queueState.causedBy === SYSTEM))
                        {
                            await s.startQueue('system', {reason: newConfig ? 'Config updated on heartbeat triggered reload' : 'Heartbeat detected non-running queue'});
    async initBots(causedBy: Invokee = 'system') {
        for (const b of this.bots) {
            if (b.error === undefined) {
                try {
                    await b.testClient();
                    await b.buildManagers();
                    b.runManagers(causedBy).catch((err) => {
                        this.logger.error(`Unexpected error occurred while running Bot ${b.botName}. Bot must be re-built to restart`);
                        if (!err.logged || !(err instanceof LoggedError)) {
                            this.logger.error(err);
                        }
                        if(newConfig || (s.eventsState.state !== RUNNING && s.eventsState.causedBy === SYSTEM))
                        {
                            await s.startEvents('system', {reason: newConfig ? 'Config updated on heartbeat triggered reload' : 'Heartbeat detected non-running events'});
                        }
                        if(s.botState.state !== RUNNING && s.eventsState.state === RUNNING && s.queueState.state === RUNNING) {
                            s.botState = {
                                state: RUNNING,
                                causedBy: 'system',
                            }
                        }
                    } catch (err) {
                        this.logger.info('Stopping event polling to prevent activity processing queue from backing up. Will be restarted when config update succeeds.')
                        await s.stopEvents('system', {reason: 'Invalid config will cause events to pile up in queue. Will be restarted when config update succeeds (next heartbeat).'});
                        if(!(err instanceof LoggedError)) {
                            this.logger.error(err, {subreddit: s.displayLabel});
                        }
                        if(this.nextHeartbeat !== undefined) {
                            this.logger.info(`Will retry parsing config on next heartbeat (in ${dayjs.duration(this.nextHeartbeat.diff(dayjs())).humanize()})`, {subreddit: s.displayLabel});
                        }
                    }
                }
                await this.runModStreams(true);
            }
        } catch (err) {
            this.logger.error('Error occurred during heartbeat', err);
            throw err;
        } finally {
            this.nextHeartbeat = undefined;
            this.heartBeating = false;
        }
    }

    async runModStreams(notify = false) {
        for(const [k,v] of CacheManager.modStreams) {
            if(!v.running && v.listeners('item').length > 0) {
                v.startInterval();
                this.logger.info(`Starting default ${k.toUpperCase()} mod stream`);
                if(notify) {
                    for(const m of this.subManagers) {
                        if(m.modStreamCallbacks.size > 0) {
                            m.notificationManager.handle('runStateChanged', `${k.toUpperCase()} Polling Started`, 'Polling was successfully restarted on heartbeat.');
                        }
                    }
                }
            }
        }
    }

    async runManagers() {
        if(this.subManagers.every(x => !x.validConfigLoaded)) {
            this.logger.warn('All managers have invalid configs!');
        }
        for (const manager of this.subManagers) {
            if (manager.validConfigLoaded && manager.botState.state !== RUNNING) {
                await manager.start('system', {reason: 'Caused by application startup'});
            }
        }

        await this.runModStreams();

        if (this.heartbeatInterval !== 0 && !this.heartBeating) {
            this.heartbeat();
        }
        this.runApiNanny();

        const emitter = new EventEmitter();
        await pEvent(emitter, 'end');
    }

    async runApiNanny() {
        while(true) {
            await sleep(10000);
            this.nextExpiration = dayjs(this.client.ratelimitExpiration);
            const nowish = dayjs().add(10, 'second');
            if(nowish.isAfter(this.nextExpiration)) {
                // it's possible no api calls are being made because of a hard limit
                // need to make an api call to update this
                // @ts-ignore
                await this.client.getMe();
                this.nextExpiration = dayjs(this.client.ratelimitExpiration);
            }
            const rollingSample = this.apiSample.slice(0, 7)
            rollingSample.unshift(this.client.ratelimitRemaining);
            this.apiSample = rollingSample;
            const diff = this.apiSample.reduceRight((acc: number[], curr, index) => {
                if(this.apiSample[index + 1] !== undefined) {
                    const d = Math.abs(curr - this.apiSample[index + 1]);
                    if(d === 0) {
                        return [...acc, 0];
                    }
                    return [...acc, d/10];
                }
                return acc;
            }, []);
            this.apiRollingAvg = diff.reduce((acc, curr) => acc + curr,0) / diff.length; // api requests per second
            this.depletedInSecs = this.client.ratelimitRemaining / this.apiRollingAvg; // number of seconds until current remaining limit is 0
            this.apiEstDepletion = dayjs.duration({seconds: this.depletedInSecs});
            this.logger.debug(`API Usage Rolling Avg: ${formatNumber(this.apiRollingAvg)}/s | Est Depletion: ${this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`);

            let hardLimitHit = false;
            if(typeof this.hardLimit === 'string') {
                const hardDur = parseDuration(this.hardLimit);
                hardLimitHit = hardDur.asSeconds() > this.apiEstDepletion.asSeconds();
            } else {
                hardLimitHit = this.hardLimit > this.client.ratelimitRemaining;
            }

            if(hardLimitHit) {
                if(this.nannyMode === 'hard') {
                    continue;
                }
                this.logger.info(`Detected HARD LIMIT of ${this.hardLimit} remaining`, {leaf: 'Api Nanny'});
                this.logger.info(`API Remaining: ${this.client.ratelimitRemaining} | Usage Rolling Avg: ${this.apiRollingAvg}/s | Est Depletion: ${this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`, {leaf: 'Api Nanny'});
                this.logger.info(`All subreddit event polling has been paused`, {leaf: 'Api Nanny'});

                for(const m of this.subManagers) {
                    m.pauseEvents('system');
                    m.notificationManager.handle('runStateChanged', 'Hard Limit Triggered', `Hard Limit of ${this.hardLimit} hit (API Remaining: ${this.client.ratelimitRemaining}). Subreddit event polling has been paused.`, 'system', 'warn');
                }

                this.nannyMode = 'hard';
                continue;
            }

            let softLimitHit = false;
            if(typeof this.softLimit === 'string') {
                const softDur = parseDuration(this.softLimit);
                softLimitHit = softDur.asSeconds() > this.apiEstDepletion.asSeconds();
            } else {
                softLimitHit = this.softLimit > this.client.ratelimitRemaining;
            }

            if(softLimitHit) {
                if(this.nannyMode === 'soft') {
                    continue;
                }
                this.logger.info(`Detected SOFT LIMIT of ${this.softLimit} remaining`, {leaf: 'Api Nanny'});
                this.logger.info(`API Remaining: ${this.client.ratelimitRemaining} | Usage Rolling Avg: ${formatNumber(this.apiRollingAvg)}/s | Est Depletion: ${this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`, {leaf: 'Api Nanny'});
                this.logger.info('Trying to detect heavy usage subreddits...', {leaf: 'Api Nanny'});
                let threshold = 0.5;
                let offenders = this.subManagers.filter(x => {
                    const combinedPerSec = x.eventsRollingAvg + x.rulesUniqueRollingAvg;
                    return combinedPerSec > threshold;
                });
                if(offenders.length === 0) {
                    threshold = 0.25;
                    // reduce threshold
                    offenders = this.subManagers.filter(x => {
                        const combinedPerSec = x.eventsRollingAvg + x.rulesUniqueRollingAvg;
                        return combinedPerSec > threshold;
                    });
                }

                if(offenders.length > 0) {
                    this.logger.info(`Slowing subreddits using >- ${threshold}req/s:`, {leaf: 'Api Nanny'});
                    for(const m of offenders) {
                        m.delayBy = 1.5;
                        m.logger.info(`SLOW MODE (Currently ~${formatNumber(m.eventsRollingAvg + m.rulesUniqueRollingAvg)}req/sec)`, {leaf: 'Api Nanny'});
                        m.notificationManager.handle('runStateChanged', 'Soft Limit Triggered', `Soft Limit of ${this.softLimit} hit (API Remaining: ${this.client.ratelimitRemaining}). Subreddit queue processing will be slowed to 1.5 seconds per.`, 'system', 'warn');
                } catch (err) {
                    if (b.error === undefined) {
                        b.error = err.message;
                    }
                } else {
                    this.logger.info(`Couldn't detect specific offenders, slowing all...`, {leaf: 'Api Nanny'});
                    for(const m of this.subManagers) {
                        m.delayBy = 1.5;
                        m.logger.info(`SLOW MODE (Currently ~${formatNumber(m.eventsRollingAvg + m.rulesUniqueRollingAvg)}req/sec)`, {leaf: 'Api Nanny'});
                        m.notificationManager.handle('runStateChanged', 'Soft Limit Triggered', `Soft Limit of ${this.softLimit} hit (API Remaining: ${this.client.ratelimitRemaining}). Subreddit queue processing will be slowed to 1.5 seconds per.`, 'system', 'warn');
                    this.logger.error(`Bot ${b.botName} cannot recover from this error and must be re-built`);
                    if (!err.logged || !(err instanceof LoggedError)) {
                        this.logger.error(err);
                    }
                }
                this.nannyMode = 'soft';
                continue;
            }
        }
    }

    if(this.nannyMode !== undefined) {
        this.logger.info('Turning off due to better conditions...', {leaf: 'Api Nanny'});
        for(const m of this.subManagers) {
            if(m.delayBy !== undefined) {
                m.delayBy = undefined;
                m.notificationManager.handle('runStateChanged', 'Normal Processing Resumed', 'Slow Mode has been turned off due to better API conditions', 'system');
            }
            if(m.queueState.state === PAUSED && m.queueState.causedBy === SYSTEM) {
                m.startQueue('system', {reason: 'API Nanny has been turned off due to better API conditions'});
            }
            if(m.eventsState.state === PAUSED && m.eventsState.causedBy === SYSTEM) {
                await m.startEvents('system', {reason: 'API Nanny has been turned off due to better API conditions'});
            }
        }
        this.nannyMode = undefined;
    }
    async destroy(causedBy: Invokee) {
        this.logger.info('Stopping all bots...');
        for(const b of this.bots) {
            await b.destroy(causedBy);
        }
    }
}

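The nanny's depletion estimate in `runApiNanny` above can be sketched standalone: given samples of `ratelimitRemaining` taken every 10 seconds (newest first), usage per second is the average of successive differences divided by the sample interval, and time-to-depletion is the newest remaining count divided by that rate. A minimal sketch; the function name is illustrative, not part of the codebase:

```typescript
// Estimate API requests/sec and seconds until the remaining limit hits 0,
// from periodic samples of the remaining rate limit (newest sample first).
function estimateDepletion(samples: number[], intervalSecs = 10): { perSec: number, depletedInSecs: number } {
    const diffs: number[] = [];
    for (let i = 0; i < samples.length - 1; i++) {
        // absolute difference between adjacent samples, normalized to per-second
        diffs.push(Math.abs(samples[i] - samples[i + 1]) / intervalSecs);
    }
    const perSec = diffs.reduce((acc, curr) => acc + curr, 0) / diffs.length;
    return { perSec, depletedInSecs: samples[0] / perSec };
}

// e.g. remaining limit sampled every 10s, newest first:
console.log(estimateDepletion([540, 570, 600])); // { perSec: 3, depletedInSecs: 180 }
```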
@@ -65,7 +65,7 @@ export interface AuthorCriteria {
     *
     * [See] https://regexr.com/609n8 for example
     *
     * @pattern ^\s*(?<opStr>>|>=|<|<=)\s*(?<time>\d+)\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\s*$
     * @pattern ^\s*(>|>=|<|<=)\s*(\d+)\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\s*$
     * */
    age?: DurationComparor

@@ -99,6 +99,13 @@ export interface AuthorCriteria {
     * Does Author's account have a verified email?
     * */
    verified?: boolean

    /**
     * Is the author shadowbanned?
     *
     * This is determined by trying to retrieve the author's profile. If a 404 is returned it is likely they are shadowbanned
     * */
    shadowBanned?: boolean
}

export class Author implements AuthorCriteria {
@@ -112,6 +119,7 @@ export class Author implements AuthorCriteria {
    linkKarma?: string;
    totalKarma?: string;
    verified?: boolean;
    shadowBanned?: boolean;

    constructor(options: AuthorCriteria) {
        this.name = options.name;
@@ -123,6 +131,7 @@ export class Author implements AuthorCriteria {
        this.commentKarma = options.commentKarma;
        this.linkKarma = options.linkKarma;
        this.totalKarma = options.totalKarma;
        this.shadowBanned = options.shadowBanned;
    }
}

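The `age` criteria string accepted by `AuthorCriteria` can be validated with the `@pattern` regex shown in its doc comment above; a minimal sketch of parsing it, assuming only that pattern (the function name is illustrative):

```typescript
// Matches DurationComparor strings like "> 2 weeks" or "<= 90 days",
// using the @pattern from the AuthorCriteria `age` doc comment.
const AGE_PATTERN = /^\s*(?<opStr>>|>=|<|<=)\s*(?<time>\d+)\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\s*$/;

function parseAgeCriteria(val: string): { op: string, time: number, unit: string } | undefined {
    const m = val.match(AGE_PATTERN);
    if (m === null || m.groups === undefined) {
        return undefined; // string does not match the documented pattern
    }
    return { op: m.groups.opStr, time: Number(m.groups.time), unit: m.groups.unit };
}

console.log(parseAgeCriteria('> 2 weeks'));  // { op: '>', time: 2, unit: 'weeks' }
console.log(parseAgeCriteria('2 fortnights')); // undefined
```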
576
src/Bot/index.ts
Normal file
@@ -0,0 +1,576 @@
|
||||
import Snoowrap, {Subreddit} from "snoowrap";
|
||||
import {Logger} from "winston";
|
||||
import dayjs, {Dayjs} from "dayjs";
|
||||
import {Duration} from "dayjs/plugin/duration";
|
||||
import EventEmitter from "events";
|
||||
import {BotInstanceConfig, Invokee, PAUSED, RUNNING, STOPPED, SYSTEM, USER} from "../Common/interfaces";
|
||||
import {
|
||||
createRetryHandler,
|
||||
formatNumber,
|
||||
mergeArr,
|
||||
parseBool,
|
||||
parseDuration,
|
||||
parseSubredditName,
|
||||
sleep,
|
||||
snooLogWrapper
|
||||
} from "../util";
|
||||
import {Manager} from "../Subreddit/Manager";
|
||||
import {ProxiedSnoowrap} from "../Utils/SnoowrapClients";
|
||||
import {ModQueueStream, UnmoderatedStream} from "../Subreddit/Streams";
|
||||
import {BotResourcesManager} from "../Subreddit/SubredditResources";
|
||||
import LoggedError from "../Utils/LoggedError";
|
||||
import pEvent from "p-event";
|
||||
|
||||
|
||||
class Bot {
|
||||
|
||||
client!: Snoowrap;
|
||||
logger!: Logger;
|
||||
wikiLocation: string;
|
||||
dryRun?: true | undefined;
|
||||
running: boolean = false;
|
||||
subreddits: string[];
|
||||
excludeSubreddits: string[];
|
||||
subManagers: Manager[] = [];
|
||||
heartbeatInterval: number;
|
||||
nextHeartbeat: Dayjs = dayjs();
|
||||
heartBeating: boolean = false;
|
||||
|
||||
softLimit: number | string = 250;
|
||||
hardLimit: number | string = 50;
|
||||
nannyMode?: 'soft' | 'hard';
|
||||
nannyRunning: boolean = false;
|
||||
nextNannyCheck: Dayjs = dayjs().add(10, 'second');
|
||||
nannyRetryHandler: Function;
|
||||
nextExpiration: Dayjs = dayjs();
|
||||
botName?: string;
|
||||
botLink?: string;
|
||||
botAccount?: string;
|
||||
maxWorkers: number;
|
||||
startedAt: Dayjs = dayjs();
|
||||
sharedModqueue: boolean = false;
|
||||
|
||||
apiSample: number[] = [];
|
||||
apiRollingAvg: number = 0;
|
||||
apiEstDepletion?: Duration;
|
||||
depletedInSecs: number = 0;
|
||||
|
||||
error: any;
|
||||
emitter: EventEmitter = new EventEmitter();
|
||||
|
||||
cacheManager: BotResourcesManager;
|
||||
|
||||
getBotName = () => {
|
||||
return this.botName;
|
||||
}
|
||||
|
||||
getUserAgent = () => {
|
||||
return `web:contextMod:${this.botName}`
|
||||
}
|
||||
|
||||
constructor(config: BotInstanceConfig, logger: Logger) {
|
||||
const {
|
||||
notifications,
|
||||
name,
|
||||
subreddits: {
|
||||
names = [],
|
||||
exclude = [],
|
||||
wikiConfig,
|
||||
dryRun,
|
||||
heartbeatInterval,
|
||||
},
|
||||
credentials: {
|
||||
clientId,
|
||||
clientSecret,
|
||||
refreshToken,
|
||||
accessToken,
|
||||
},
|
||||
snoowrap: {
|
||||
proxy,
|
||||
debug,
|
||||
},
|
||||
polling: {
|
||||
sharedMod,
|
||||
},
|
||||
queue: {
|
||||
maxWorkers,
|
||||
},
|
||||
caching: {
|
||||
authorTTL,
|
||||
provider: {
|
||||
store
|
||||
}
|
||||
},
|
||||
nanny: {
|
||||
softLimit,
|
||||
hardLimit,
|
||||
}
|
||||
} = config;
|
||||
|
||||
this.cacheManager = new BotResourcesManager(config);
|
||||
|
||||
this.dryRun = parseBool(dryRun) === true ? true : undefined;
|
||||
this.softLimit = softLimit;
|
||||
this.hardLimit = hardLimit;
|
||||
this.wikiLocation = wikiConfig;
|
||||
this.heartbeatInterval = heartbeatInterval;
|
||||
this.sharedModqueue = sharedMod;
|
||||
if(name !== undefined) {
|
||||
this.botName = name;
|
||||
}
|
||||
|
||||
const getBotName = this.getBotName;
|
||||
const getUserName = this.getUserAgent;
|
||||
|
||||
this.logger = logger.child({
|
||||
get bot() {
|
||||
return getBotName();
|
||||
}
|
||||
}, mergeArr);
|
||||
|
||||
let mw = maxWorkers;
|
||||
if(maxWorkers < 1) {
|
||||
this.logger.warn(`Max queue workers must be greater than or equal to 1 (Specified: ${maxWorkers})`);
|
||||
mw = 1;
|
||||
}
|
||||
this.maxWorkers = mw;
|
||||
|
||||
if (this.dryRun) {
|
||||
this.logger.info('Running in DRYRUN mode');
|
||||
}
|
||||
|
||||
this.subreddits = names.map(parseSubredditName);
|
||||
this.excludeSubreddits = exclude.map(parseSubredditName);
|
||||
|
||||
let creds: any = {
|
||||
get userAgent() { return getUserName() },
|
||||
clientId,
|
||||
clientSecret,
|
||||
refreshToken,
|
||||
accessToken,
|
||||
};
|
||||
|
||||
const missingCreds = [];
|
||||
for(const [k,v] of Object.entries(creds)) {
|
||||
if(v === undefined || v === '' || v === null) {
|
||||
missingCreds.push(k);
|
||||
}
|
||||
}
|
||||
if(missingCreds.length > 0) {
|
||||
this.logger.error('There are credentials missing that would prevent initializing the Reddit API Client and subsequently the rest of the application');
|
||||
this.logger.error(`Missing credentials: ${missingCreds.join(', ')}`)
|
||||
this.logger.info(`If this is a first-time setup use the 'web' command for a web-based guide to configuring your application`);
|
||||
this.logger.info(`Or check the USAGE section of the readme for the correct naming of these arguments/environment variables`);
|
||||
this.error = `Missing credentials: ${missingCreds.join(', ')}`;
|
||||
//throw new LoggedError(`Missing credentials: ${missingCreds.join(', ')}`);
|
||||
}
|
||||
|
||||
        try {
            this.client = proxy === undefined ? new Snoowrap(creds) : new ProxiedSnoowrap({...creds, proxy});
            this.client.config({
                warnings: true,
                maxRetryAttempts: 5,
                debug,
                logger: snooLogWrapper(this.logger.child({labels: ['Snoowrap']}, mergeArr)),
                continueAfterRatelimitError: true,
            });
        } catch (err) {
            if (this.error === undefined) {
                this.error = err.message;
                this.logger.error(err);
            }
        }

        const retryHandler = createRetryHandler({maxRequestRetry: 8, maxOtherRetry: 1}, this.logger);
        this.nannyRetryHandler = createRetryHandler({maxRequestRetry: 5, maxOtherRetry: 1}, this.logger);

        const modStreamErrorListener = (name: string) => async (err: any) => {
            this.logger.error('Polling error occurred', err);
            const shouldRetry = await retryHandler(err);
            if (shouldRetry) {
                defaultUnmoderatedStream.startInterval();
            } else {
                for (const m of this.subManagers) {
                    if (m.modStreamCallbacks.size > 0) {
                        m.notificationManager.handle('runStateChanged', `${name.toUpperCase()} Polling Stopped`, 'Encountered too many errors from Reddit while polling. Will try to restart on next heartbeat.');
                    }
                }
                this.logger.error(`Mod stream ${name.toUpperCase()} encountered too many errors while polling. Will try to restart on next heartbeat.`);
            }
        };

        const defaultUnmoderatedStream = new UnmoderatedStream(this.client, {subreddit: 'mod'});
        // @ts-ignore
        defaultUnmoderatedStream.on('error', modStreamErrorListener('unmoderated'));
        const defaultModqueueStream = new ModQueueStream(this.client, {subreddit: 'mod'});
        // @ts-ignore
        defaultModqueueStream.on('error', modStreamErrorListener('modqueue'));
        this.cacheManager.modStreams.set('unmoderated', defaultUnmoderatedStream);
        this.cacheManager.modStreams.set('modqueue', defaultModqueueStream);

        process.on('uncaughtException', (e) => {
            this.error = e;
        });
        process.on('unhandledRejection', (e) => {
            this.error = e;
        });
        process.on('exit', async (code) => {
            if (code === 0) {
                await this.onTerminate();
            } else if (this.error !== undefined) {
                let errMsg;
                if (typeof this.error === 'object' && this.error.message !== undefined) {
                    errMsg = this.error.message;
                } else if (typeof this.error === 'string') {
                    errMsg = this.error;
                }
                await this.onTerminate(`Application exited due to an unexpected error${errMsg !== undefined ? `: ${errMsg}` : ''}`);
            } else {
                await this.onTerminate(`Application exited with unclean exit signal (${code})`);
            }
        });
    }

    async onTerminate(reason = 'The application was shutdown') {
        for (const m of this.subManagers) {
            await m.notificationManager.handle('runStateChanged', 'Application Shutdown', reason);
        }
    }

    async testClient() {
        try {
            // @ts-ignore
            await this.client.getMe();
            this.logger.info('Test API call successful');
        } catch (err) {
            this.logger.error('An error occurred while trying to initialize the Reddit API Client which would prevent the entire application from running.');
            if (err.name === 'StatusCodeError') {
                const authHeader = err.response.headers['www-authenticate'];
                if (authHeader !== undefined && authHeader.includes('insufficient_scope')) {
                    this.logger.error('Reddit responded with a 403 insufficient_scope. Please ensure you have chosen the correct scopes when authorizing your account.');
                } else if (err.statusCode === 401) {
                    this.logger.error('It is likely a credential is missing or incorrect. Check clientId, clientSecret, refreshToken, and accessToken');
                }
                this.logger.error(`Error Message: ${err.message}`);
            } else {
                this.logger.error(err);
            }
            this.error = `Error occurred while testing Reddit API client: ${err.message}`;
            err.logged = true;
            throw err;
        }
    }

    async buildManagers(subreddits: string[] = []) {
        let availSubs = [];
        // @ts-ignore
        const user = await this.client.getMe().fetch();
        this.botLink = `https://reddit.com/user/${user.name}`;
        this.botAccount = `u/${user.name}`;
        this.logger.info(`Reddit API Limit Remaining: ${this.client.ratelimitRemaining}`);
        this.logger.info(`Authenticated Account: u/${user.name}`);

        const botNameFromConfig = this.botName !== undefined;
        if (this.botName === undefined) {
            this.botName = `u/${user.name}`;
        }
        this.logger.info(`Bot Name${botNameFromConfig ? ' (from config)' : ''}: ${this.botName}`);

        for (const sub of await this.client.getModeratedSubreddits()) {
            // TODO don't know a way to check permissions yet
            availSubs.push(sub);
        }
        this.logger.info(`u/${user.name} is a moderator of these subreddits: ${availSubs.map(x => x.display_name_prefixed).join(', ')}`);

        let subsToRun: Subreddit[] = [];
        const subsToUse = subreddits.length > 0 ? subreddits.map(parseSubredditName) : this.subreddits;
        if (subsToUse.length > 0) {
            this.logger.info(`Operator-defined subreddit constraints detected (CLI argument or environment variable), will try to run on: ${subsToUse.join(', ')}`);
            for (const sub of subsToUse) {
                const asub = availSubs.find(x => x.display_name.toLowerCase() === sub.toLowerCase());
                if (asub === undefined) {
                    this.logger.warn(`Will not run on ${sub} because this client is not a moderator of it or does not have the appropriate mod permissions.`);
                } else {
                    // @ts-ignore
                    const fetchedSub = await asub.fetch();
                    subsToRun.push(fetchedSub);
                }
            }
        } else {
            if (this.excludeSubreddits.length > 0) {
                this.logger.info(`Will run on all moderated subreddits except user-defined exclusions: ${this.excludeSubreddits.join(', ')}`);
                const normalExcludes = this.excludeSubreddits.map(x => x.toLowerCase());
                subsToRun = availSubs.filter(x => !normalExcludes.includes(x.display_name.toLowerCase()));
            } else {
                this.logger.info('No user-defined subreddit constraints detected, will run on all moderated subreddits');
                subsToRun = availSubs;
            }
        }

        let subSchedule: Manager[] = [];
        // get configs for subs we want to run on and build/validate them
        for (const sub of subsToRun) {
            const manager = new Manager(sub, this.client, this.logger, this.cacheManager, {dryRun: this.dryRun, sharedModqueue: this.sharedModqueue, wikiLocation: this.wikiLocation, botName: this.botName, maxWorkers: this.maxWorkers});
            try {
                await manager.parseConfiguration('system', true, {suppressNotification: true});
            } catch (err) {
                if (!(err instanceof LoggedError)) {
                    this.logger.error(`Config was not valid:`, {subreddit: sub.display_name_prefixed});
                    this.logger.error(err, {subreddit: sub.display_name_prefixed});
                }
            }
            subSchedule.push(manager);
        }
        this.subManagers = subSchedule;
    }

    async destroy(causedBy: Invokee) {
        this.logger.info('Stopping heartbeat and nanny processes, may take up to 5 seconds...');
        const processWait = pEvent(this.emitter, 'healthStopped');
        this.running = false;
        await processWait;
        for (const manager of this.subManagers) {
            await manager.stop(causedBy, {reason: 'App rebuild'});
        }
        this.logger.info('Bot is stopped.');
    }

    async runModStreams(notify = false) {
        for (const [k, v] of this.cacheManager.modStreams) {
            if (!v.running && v.listeners('item').length > 0) {
                v.startInterval();
                this.logger.info(`Starting default ${k.toUpperCase()} mod stream`);
                if (notify) {
                    for (const m of this.subManagers) {
                        if (m.modStreamCallbacks.size > 0) {
                            await m.notificationManager.handle('runStateChanged', `${k.toUpperCase()} Polling Started`, 'Polling was successfully restarted on heartbeat.');
                        }
                    }
                }
            }
        }
    }

    async runManagers(causedBy: Invokee = 'system') {
        if (this.subManagers.every(x => !x.validConfigLoaded)) {
            this.logger.warn('All managers have invalid configs!');
            this.error = 'All managers have invalid configs';
        }
        for (const manager of this.subManagers) {
            if (manager.validConfigLoaded && manager.botState.state !== RUNNING) {
                await manager.start(causedBy, {reason: 'Caused by application startup'});
            }
        }

        await this.runModStreams();

        this.running = true;
        this.nextNannyCheck = dayjs().add(10, 'second');
        this.nextHeartbeat = dayjs().add(this.heartbeatInterval, 'second');
        await this.healthLoop();
    }

    async healthLoop() {
        while (this.running) {
            await sleep(5000);
            if (!this.running) {
                break;
            }
            if (dayjs().isSameOrAfter(this.nextNannyCheck)) {
                try {
                    await this.runApiNanny();
                    this.nextNannyCheck = dayjs().add(10, 'second');
                } catch (err) {
                    this.logger.info('Delaying next nanny check for 2 minutes due to emitted error');
                    this.nextNannyCheck = dayjs().add(120, 'second');
                }
            }
            if (dayjs().isSameOrAfter(this.nextHeartbeat)) {
                try {
                    await this.heartbeat();
                } catch (err) {
                    this.logger.error(`Error occurred during heartbeat check: ${err.message}`);
                }
                this.nextHeartbeat = dayjs().add(this.heartbeatInterval, 'second');
            }
        }
        this.emitter.emit('healthStopped');
    }

    async heartbeat() {
        const heartbeat = `HEARTBEAT -- API Remaining: ${this.client.ratelimitRemaining} | Usage Rolling Avg: ~${formatNumber(this.apiRollingAvg)}/s | Est Depletion: ${this.apiEstDepletion === undefined ? 'N/A' : this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`;
        this.logger.info(heartbeat);
        for (const s of this.subManagers) {
            if (s.botState.state === STOPPED && s.botState.causedBy === USER) {
                this.logger.debug('Skipping config check/restart on heartbeat due to previously being stopped by user', {subreddit: s.displayLabel});
                continue;
            }
            try {
                const newConfig = await s.parseConfiguration();
                if (newConfig || (s.queueState.state !== RUNNING && s.queueState.causedBy === SYSTEM)) {
                    await s.startQueue('system', {reason: newConfig ? 'Config updated on heartbeat triggered reload' : 'Heartbeat detected non-running queue'});
                }
                if (newConfig || (s.eventsState.state !== RUNNING && s.eventsState.causedBy === SYSTEM)) {
                    await s.startEvents('system', {reason: newConfig ? 'Config updated on heartbeat triggered reload' : 'Heartbeat detected non-running events'});
                }
                if (s.botState.state !== RUNNING && s.eventsState.state === RUNNING && s.queueState.state === RUNNING) {
                    s.botState = {
                        state: RUNNING,
                        causedBy: 'system',
                    };
                }
            } catch (err) {
                this.logger.info('Stopping event polling to prevent activity processing queue from backing up. Will be restarted when config update succeeds.');
                await s.stopEvents('system', {reason: 'Invalid config will cause events to pile up in queue. Will be restarted when config update succeeds (next heartbeat).'});
                if (!(err instanceof LoggedError)) {
                    this.logger.error(err, {subreddit: s.displayLabel});
                }
                if (this.nextHeartbeat !== undefined) {
                    this.logger.info(`Will retry parsing config on next heartbeat (in ${dayjs.duration(this.nextHeartbeat.diff(dayjs())).humanize()})`, {subreddit: s.displayLabel});
                }
            }
        }
        await this.runModStreams(true);
    }

    async runApiNanny() {
        try {
            this.nextExpiration = dayjs(this.client.ratelimitExpiration);
            const nowish = dayjs().add(10, 'second');
            if (nowish.isAfter(this.nextExpiration)) {
                // it's possible no api calls are being made because of a hard limit
                // need to make an api call to update this
                let shouldRetry = true;
                while (shouldRetry) {
                    try {
                        // @ts-ignore
                        await this.client.getMe();
                        shouldRetry = false;
                    } catch (err) {
                        shouldRetry = await this.nannyRetryHandler(err);
                        if (!shouldRetry) {
                            throw err;
                        }
                    }
                }
                this.nextExpiration = dayjs(this.client.ratelimitExpiration);
            }
            const rollingSample = this.apiSample.slice(0, 7);
            rollingSample.unshift(this.client.ratelimitRemaining);
            this.apiSample = rollingSample;
            const diff = this.apiSample.reduceRight((acc: number[], curr, index) => {
                if (this.apiSample[index + 1] !== undefined) {
                    const d = Math.abs(curr - this.apiSample[index + 1]);
                    if (d === 0) {
                        return [...acc, 0];
                    }
                    return [...acc, d / 10];
                }
                return acc;
            }, []);
            this.apiRollingAvg = diff.reduce((acc, curr) => acc + curr, 0) / diff.length; // api requests per second
            this.depletedInSecs = this.client.ratelimitRemaining / this.apiRollingAvg; // number of seconds until current remaining limit is 0
            this.apiEstDepletion = dayjs.duration({seconds: this.depletedInSecs});
            this.logger.debug(`API Usage Rolling Avg: ${formatNumber(this.apiRollingAvg)}/s | Est Depletion: ${this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`);

            let hardLimitHit = false;
            if (typeof this.hardLimit === 'string') {
                const hardDur = parseDuration(this.hardLimit);
                hardLimitHit = hardDur.asSeconds() > this.apiEstDepletion.asSeconds();
            } else {
                hardLimitHit = this.hardLimit > this.client.ratelimitRemaining;
            }

            if (hardLimitHit) {
                if (this.nannyMode === 'hard') {
                    return;
                }
                this.logger.info(`Detected HARD LIMIT of ${this.hardLimit} remaining`, {leaf: 'Api Nanny'});
                this.logger.info(`API Remaining: ${this.client.ratelimitRemaining} | Usage Rolling Avg: ${this.apiRollingAvg}/s | Est Depletion: ${this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`, {leaf: 'Api Nanny'});
                this.logger.info(`All subreddit event polling has been paused`, {leaf: 'Api Nanny'});

                for (const m of this.subManagers) {
                    m.pauseEvents('system');
                    m.notificationManager.handle('runStateChanged', 'Hard Limit Triggered', `Hard Limit of ${this.hardLimit} hit (API Remaining: ${this.client.ratelimitRemaining}). Subreddit event polling has been paused.`, 'system', 'warn');
                }

                this.nannyMode = 'hard';
                return;
            }

            let softLimitHit = false;
            if (typeof this.softLimit === 'string') {
                const softDur = parseDuration(this.softLimit);
                softLimitHit = softDur.asSeconds() > this.apiEstDepletion.asSeconds();
            } else {
                softLimitHit = this.softLimit > this.client.ratelimitRemaining;
            }

            if (softLimitHit) {
                if (this.nannyMode === 'soft') {
                    return;
                }
                this.logger.info(`Detected SOFT LIMIT of ${this.softLimit} remaining`, {leaf: 'Api Nanny'});
                this.logger.info(`API Remaining: ${this.client.ratelimitRemaining} | Usage Rolling Avg: ${formatNumber(this.apiRollingAvg)}/s | Est Depletion: ${this.apiEstDepletion.humanize()} (${formatNumber(this.depletedInSecs, {toFixed: 0})} seconds)`, {leaf: 'Api Nanny'});
                this.logger.info('Trying to detect heavy usage subreddits...', {leaf: 'Api Nanny'});
                let threshold = 0.5;
                let offenders = this.subManagers.filter(x => {
                    const combinedPerSec = x.eventsRollingAvg + x.rulesUniqueRollingAvg;
                    return combinedPerSec > threshold;
                });
                if (offenders.length === 0) {
                    // reduce threshold
                    threshold = 0.25;
                    offenders = this.subManagers.filter(x => {
                        const combinedPerSec = x.eventsRollingAvg + x.rulesUniqueRollingAvg;
                        return combinedPerSec > threshold;
                    });
                }

                if (offenders.length > 0) {
                    this.logger.info(`Slowing subreddits using > ${threshold}req/s:`, {leaf: 'Api Nanny'});
                    for (const m of offenders) {
                        m.delayBy = 1.5;
                        m.logger.info(`SLOW MODE (Currently ~${formatNumber(m.eventsRollingAvg + m.rulesUniqueRollingAvg)}req/sec)`, {leaf: 'Api Nanny'});
                        m.notificationManager.handle('runStateChanged', 'Soft Limit Triggered', `Soft Limit of ${this.softLimit} hit (API Remaining: ${this.client.ratelimitRemaining}). Subreddit queue processing will be slowed to 1.5 seconds per.`, 'system', 'warn');
                    }
                } else {
                    this.logger.info(`Couldn't detect specific offenders, slowing all...`, {leaf: 'Api Nanny'});
                    for (const m of this.subManagers) {
                        m.delayBy = 1.5;
                        m.logger.info(`SLOW MODE (Currently ~${formatNumber(m.eventsRollingAvg + m.rulesUniqueRollingAvg)}req/sec)`, {leaf: 'Api Nanny'});
                        m.notificationManager.handle('runStateChanged', 'Soft Limit Triggered', `Soft Limit of ${this.softLimit} hit (API Remaining: ${this.client.ratelimitRemaining}). Subreddit queue processing will be slowed to 1.5 seconds per.`, 'system', 'warn');
                    }
                }
                this.nannyMode = 'soft';
                return;
            }

            if (this.nannyMode !== undefined) {
                this.logger.info('Turning off due to better conditions...', {leaf: 'Api Nanny'});
                for (const m of this.subManagers) {
                    if (m.delayBy !== undefined) {
                        m.delayBy = undefined;
                        m.notificationManager.handle('runStateChanged', 'Normal Processing Resumed', 'Slow Mode has been turned off due to better API conditions', 'system');
                    }
                    if (m.queueState.state === PAUSED && m.queueState.causedBy === SYSTEM) {
                        m.startQueue('system', {reason: 'API Nanny has been turned off due to better API conditions'});
                    }
                    if (m.eventsState.state === PAUSED && m.eventsState.causedBy === SYSTEM) {
                        await m.startEvents('system', {reason: 'API Nanny has been turned off due to better API conditions'});
                    }
                }
                this.nannyMode = undefined;
            }

        } catch (err) {
            this.logger.error(`Error occurred during nanny loop: ${err.message}`);
            throw err;
        }
    }
}

export default Bot;

@@ -1,6 +1,7 @@
import {Check, CheckOptions, userResultCacheDefault, UserResultCacheOptions} from "./index";
import {CommentState} from "../Common/interfaces";
import {CommentState, UserResultCache} from "../Common/interfaces";
import {Submission, Comment} from "snoowrap/dist/objects";
import {RuleResult} from "../Rule";

export interface CommentCheckOptions extends CheckOptions {
    cacheUserResult?: UserResultCacheOptions;
@@ -9,20 +10,12 @@ export interface CommentCheckOptions extends CheckOptions {
export class CommentCheck extends Check {
    itemIs: CommentState[];

    cacheUserResult: Required<UserResultCacheOptions>;

    constructor(options: CommentCheckOptions) {
        super(options);
        const {
            itemIs = [],
            cacheUserResult = {},
        } = options;

        this.cacheUserResult = {
            ...userResultCacheDefault,
            ...cacheUserResult
        }

        this.itemIs = itemIs;
        this.logSummary();
    }
@@ -31,7 +24,7 @@ export class CommentCheck extends Check {
        super.logSummary('comment');
    }

    async getCacheResult(item: Submission | Comment): Promise<boolean | undefined> {
    async getCacheResult(item: Submission | Comment): Promise<UserResultCache | undefined> {
        if (this.cacheUserResult.enable) {
            return await this.resources.getCommentCheckCacheResult(item as Comment, {
                name: this.name,
@@ -42,13 +35,22 @@ export class CommentCheck extends Check {
        return undefined;
    }

    async setCacheResult(item: Submission | Comment, result: boolean): Promise<void> {
    async setCacheResult(item: Submission | Comment, result: UserResultCache): Promise<void> {
        if (this.cacheUserResult.enable) {
            const {result: outcome, ruleResults} = result;

            const res: UserResultCache = {
                result: outcome,
                // don't need to cache rule results if check was not triggered
                // since we only use rule results for actions
                ruleResults: outcome ? ruleResults : []
            };

            await this.resources.setCommentCheckCacheResult(item as Comment, {
                name: this.name,
                authorIs: this.authorIs,
                itemIs: this.itemIs
            }, result, this.cacheUserResult.ttl)
            }, res, this.cacheUserResult.ttl)
        }
    }
}

@@ -1,6 +1,7 @@
import {Check, CheckOptions} from "./index";
import {SubmissionState} from "../Common/interfaces";
import {SubmissionState, UserResultCache} from "../Common/interfaces";
import {Submission, Comment} from "snoowrap/dist/objects";
import {RuleResult} from "../Rule";

export class SubmissionCheck extends Check {
    itemIs: SubmissionState[];
@@ -15,11 +16,4 @@ export class SubmissionCheck extends Check {
    logSummary() {
        super.logSummary('submission');
    }

    async getCacheResult(item: Submission | Comment) {
        return undefined;
    }

    async setCacheResult(item: Submission | Comment, result: boolean) {
    }
}

@@ -2,7 +2,7 @@ import {RuleSet, IRuleSet, RuleSetJson, RuleSetObjectJson} from "../Rule/RuleSet
import {IRule, isRuleSetResult, Rule, RuleJSONConfig, RuleResult, RuleSetResult} from "../Rule";
import Action, {ActionConfig, ActionJson} from "../Action";
import {Logger} from "winston";
import {Comment, Submission} from "snoowrap";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {actionFactory} from "../Action/ActionFactory";
import {ruleFactory} from "../Rule/RuleFactory";
import {
@@ -16,18 +16,19 @@ import {
    truncateStringToLength
} from "../util";
import {
    ActionResult,
    ChecksActivityState,
    CommentState,
    JoinCondition,
    JoinOperands,
    SubmissionState,
    TypedActivityStates
    TypedActivityStates, UserResultCache
} from "../Common/interfaces";
import * as RuleSchema from '../Schema/Rule.json';
import * as RuleSetSchema from '../Schema/RuleSet.json';
import * as ActionSchema from '../Schema/Action.json';
import {ActionObjectJson, RuleJson, RuleObjectJson, ActionJson as ActionTypeJson} from "../Common/types";
import ResourceManager, {SubredditResources} from "../Subreddit/SubredditResources";
import {SubredditResources} from "../Subreddit/SubredditResources";
import {Author, AuthorCriteria, AuthorOptions} from "../Author/Author";

const checkLogName = truncateStringToLength(25);

@@ -45,20 +46,25 @@ export abstract class Check implements ICheck {
        include: AuthorCriteria[],
        exclude: AuthorCriteria[]
    };
    cacheUserResult: Required<UserResultCacheOptions>;
    dryRun?: boolean;
    notifyOnTrigger: boolean;
    resources: SubredditResources;
    client: Snoowrap;

    constructor(options: CheckOptions) {
        const {
            enable = true,
            name,
            resources,
            description,
            client,
            condition = 'AND',
            rules = [],
            actions = [],
            notifyOnTrigger = false,
            subredditName,
            cacheUserResult = {},
            itemIs = [],
            authorIs: {
                include = [],
@@ -73,7 +79,8 @@ export abstract class Check implements ICheck {

        const ajv = createAjvFactory(this.logger);

        this.resources = ResourceManager.get(subredditName) as SubredditResources;
        this.resources = resources;
        this.client = client;

        this.name = name;
        this.description = description;
@@ -84,6 +91,10 @@ export abstract class Check implements ICheck {
            exclude: exclude.map(x => new Author(x)),
            include: include.map(x => new Author(x)),
        }
        this.cacheUserResult = {
            ...userResultCacheDefault,
            ...cacheUserResult
        }
        this.dryRun = dryRun;
        for (const r of rules) {
            if (r instanceof Rule || r instanceof RuleSet) {
@@ -94,12 +105,12 @@ export abstract class Check implements ICheck {
                let ruleErrors: any = [];
                if (valid) {
                    const ruleConfig = r as RuleSetObjectJson;
                    this.rules.push(new RuleSet({...ruleConfig, logger: this.logger, subredditName}));
                    this.rules.push(new RuleSet({...ruleConfig, logger: this.logger, subredditName, resources: this.resources, client: this.client}));
                } else {
                    setErrors = ajv.errors;
                    valid = ajv.validate(RuleSchema, r);
                    if (valid) {
                        this.rules.push(ruleFactory(r as RuleJSONConfig, this.logger, subredditName));
                        this.rules.push(ruleFactory(r as RuleJSONConfig, this.logger, subredditName, this.resources, this.client));
                    } else {
                        ruleErrors = ajv.errors;
                        const leastErrorType = setErrors.length < ruleErrors ? 'RuleSet' : 'Rule';
@@ -123,7 +134,7 @@ export abstract class Check implements ICheck {
                this.actions.push(actionFactory({
                    ...aj,
                    dryRun: this.dryRun || aj.dryRun
                }, this.logger, subredditName));
                }, this.logger, subredditName, this.resources, this.client));
                // @ts-ignore
                a.logger = this.logger;
            } else {
@@ -166,10 +177,14 @@ export abstract class Check implements ICheck {
        }
    }

    abstract getCacheResult(item: Submission | Comment) : Promise<boolean | undefined>;
    abstract setCacheResult(item: Submission | Comment, result: boolean): void;
    async getCacheResult(item: Submission | Comment) : Promise<UserResultCache | undefined> {
        return undefined;
    }

    async runRules(item: Submission | Comment, existingResults: RuleResult[] = []): Promise<[boolean, RuleResult[]]> {
    async setCacheResult(item: Submission | Comment, result: UserResultCache): Promise<void> {
    }

    async runRules(item: Submission | Comment, existingResults: RuleResult[] = []): Promise<[boolean, RuleResult[], boolean?]> {
        try {
            let allRuleResults: RuleResult[] = [];
            let allResults: (RuleResult | RuleSetResult)[] = [];
@@ -178,7 +193,7 @@ export abstract class Check implements ICheck {
            const cacheResult = await this.getCacheResult(item);
            if (cacheResult !== undefined) {
                this.logger.verbose(`Skipping rules run because result was found in cache, Check Triggered Result: ${cacheResult}`);
                return [cacheResult, allRuleResults];
                return [cacheResult.result, cacheResult.ruleResults, true];
            }

            const itemPass = await this.resources.testItemCriteria(item, this.itemIs);
@@ -260,23 +275,27 @@ export abstract class Check implements ICheck {
        }
    }

    async runActions(item: Submission | Comment, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<Action[]> {
    async runActions(item: Submission | Comment, ruleResults: RuleResult[], runtimeDryrun?: boolean): Promise<ActionResult[]> {
        const dr = runtimeDryrun || this.dryRun;
        this.logger.debug(`${dr ? 'DRYRUN - ' : ''}Running Actions`);
        const runActions: Action[] = [];
        const runActions: ActionResult[] = [];
        for (const a of this.actions) {
            if (!a.enabled) {
                runActions.push({
                    kind: a.getKind(),
                    name: a.getActionUniqueName(),
                    run: false,
                    success: false,
                    runReason: 'Not enabled',
                    dryRun: (a.dryRun || dr) || false,
                });
                this.logger.info(`Action ${a.getActionUniqueName()} not run because it is not enabled.`);
                continue;
            }
            try {
                await a.handle(item, ruleResults, runtimeDryrun);
                runActions.push(a);
            } catch (err) {
                this.logger.error(`Action ${a.getActionUniqueName()} encountered an error while running`, err);
            }
            const res = await a.handle(item, ruleResults, runtimeDryrun);
            runActions.push(res);
        }
        this.logger.info(`${dr ? 'DRYRUN - ' : ''}Ran Actions: ${runActions.map(x => x.getActionUniqueName()).join(' | ')}`);
        this.logger.info(`${dr ? 'DRYRUN - ' : ''}Ran Actions: ${runActions.map(x => x.name).join(' | ')}`);
        return runActions;
    }
}
@@ -331,6 +350,9 @@ export interface CheckOptions extends ICheck {
    logger: Logger
    subredditName: string
    notifyOnTrigger?: boolean
    resources: SubredditResources
    client: Snoowrap
    cacheUserResult?: UserResultCacheOptions;
}

export interface CheckJson extends ICheck {
@@ -365,6 +387,8 @@ export interface CheckJson extends ICheck {
     * @default false
     * */
    notifyOnTrigger?: boolean,

    cacheUserResult?: UserResultCacheOptions;
}

export interface SubmissionCheckJson extends CheckJson {
@@ -382,6 +406,9 @@ export interface SubmissionCheckJson extends CheckJson {
 * 3. The rule results are not likely to change while cache is valid
 * */
export interface UserResultCacheOptions {
    /**
     * @default false
     * */
    enable?: boolean,
    /**
     * The amount of time, in seconds, to cache this result
@@ -390,17 +417,23 @@ export interface UserResultCacheOptions {
     * @examples [60]
     * */
    ttl?: number,
    /**
     * In the event the cache returns a triggered result should the actions for the check also be run?
     *
     * @default true
     * */
    runActions?: boolean
}

export const userResultCacheDefault: Required<UserResultCacheOptions> = {
    enable: false,
    ttl: 60,
    runActions: true,
}

export interface CommentCheckJson extends CheckJson {
    kind: 'comment'
    itemIs?: CommentState[]
    cacheUserResult?: UserResultCacheOptions
}

export type CheckStructuredJson = SubmissionCheckStructuredJson | CommentCheckStructuredJson;

@@ -1,2 +1,2 @@
export const cacheOptDefaults = {ttl: 60, max: 500, checkPeriod: 600};
export const cacheTTLDefaults = {authorTTL: 60, userNotesTTL: 300, wikiTTL: 300, submissionTTL: 60, commentTTL: 60, filterCriteriaTTL: 60};
export const cacheTTLDefaults = {authorTTL: 60, userNotesTTL: 300, wikiTTL: 300, submissionTTL: 60, commentTTL: 60, filterCriteriaTTL: 60, subredditTTL: 600};

@@ -1,12 +1,12 @@
import {Logger} from "winston";
import {
    buildCacheOptionsFromProvider,
    buildCacheOptionsFromProvider, buildCachePrefix,
    createAjvFactory,
    mergeArr,
    normalizeName,
    overwriteMerge,
    parseBool, randomId,
    readJson,
    readConfigFile,
    removeUndefinedKeys
} from "./util";
import {CommentCheck} from "./Check/CommentCheck";
@@ -25,7 +25,13 @@ import {
    OperatorConfig,
    PollingOptions,
    PollingOptionsStrong,
    PollOn, StrongCache, CacheProvider, CacheOptions
    PollOn,
    StrongCache,
    CacheProvider,
    CacheOptions,
    BotInstanceJsonConfig,
    BotInstanceConfig,
    RequiredWebRedditCredentials
} from "./Common/interfaces";
import {isRuleSetJSON, RuleSetJson, RuleSetObjectJson} from "./Rule/RuleSet";
import deepEqual from "fast-deep-equal";
@@ -37,6 +43,7 @@ import {operatorConfig} from "./Utils/CommandConfig";
import merge from 'deepmerge';
import * as process from "process";
import {cacheOptDefaults, cacheTTLDefaults} from "./Common/defaults";
import objectHash from "object-hash";

export interface ConfigBuilderOptions {
    logger: Logger,
@@ -246,85 +253,104 @@ export const insertNamedActions = (actions: Array<ActionJson>, namedActions: Map
    return strongActions;
}

export const parseOpConfigFromArgs = (args: any): OperatorJsonConfig => {
export const parseDefaultBotInstanceFromArgs = (args: any): BotInstanceJsonConfig => {
    const {
        subreddits,
        clientId,
        clientSecret,
        accessToken,
        refreshToken,
        redirectUri,
        wikiConfig,
        dryRun,
        heartbeat,
        softLimit,
        heartbeat,
        hardLimit,
        authorTTL,
        operator,
        operatorDisplay,
        snooProxy,
        snooDebug,
        sharedMod,
        logLevel,
        logDir,
        port,
        sessionSecret,
        caching,
        web
    } = args || {};

    const data = {
        operator: {
            name: operator,
            display: operatorDisplay
        },
        credentials: {
            clientId,
            clientSecret,
            accessToken,
            refreshToken,
            redirectUri,
        },
        snoowrap: {
            proxy: snooProxy,
            debug: snooDebug,
        },
        subreddits: {
            names: subreddits,
            wikiConfig,
            dryRun,
            heartbeatInterval: heartbeat,
            dryRun
        },
        polling: {
            sharedMod,
        },
        nanny: {
            softLimit,
            hardLimit
        }
    }
    return removeUndefinedKeys(data) as BotInstanceJsonConfig;
}

export const parseOpConfigFromArgs = (args: any): OperatorJsonConfig => {
|
||||
const {
|
||||
clientId,
|
||||
clientSecret,
|
||||
redirectUri,
|
||||
operator,
|
||||
operatorDisplay,
|
||||
logLevel,
|
||||
logDir,
|
||||
port,
|
||||
sessionSecret,
|
||||
web,
|
||||
mode,
|
||||
caching,
|
||||
authorTTL,
|
||||
} = args || {};
|
||||
|
||||
const data = {
|
||||
mode,
|
||||
operator: {
|
||||
name: operator,
|
||||
display: operatorDisplay
|
||||
},
|
||||
logging: {
|
||||
level: logLevel,
|
||||
path: logDir === true ? `${process.cwd()}/logs` : undefined,
|
||||
},
|
||||
snoowrap: {
|
||||
proxy: snooProxy,
|
||||
debug: snooDebug,
|
||||
caching: {
|
||||
provider: caching,
|
||||
authorTTL
|
||||
},
|
||||
web: {
|
||||
enabled: web,
|
||||
port,
|
||||
session: {
|
||||
secret: sessionSecret
|
||||
},
|
||||
credentials: {
|
||||
clientId,
|
||||
clientSecret,
|
||||
redirectUri,
|
||||
}
|
||||
},
|
||||
polling: {
|
||||
sharedMod,
|
||||
},
|
||||
caching: {
|
||||
provider: caching,
|
||||
authorTTL
|
||||
},
|
||||
api: {
|
||||
softLimit,
|
||||
hardLimit
|
||||
}
|
||||
}
|
||||
|
||||
return removeUndefinedKeys(data) as OperatorJsonConfig;
|
||||
}
|
||||
|
||||
const parseListFromEnv = (val: string|undefined) => {
|
||||
const parseListFromEnv = (val: string | undefined) => {
|
||||
let listVals: undefined | string[];
|
||||
if(val === undefined) {
|
||||
if (val === undefined) {
|
||||
return listVals;
|
||||
}
|
||||
const trimmedVal = val.trim();
|
||||
@@ -343,54 +369,65 @@ const parseListFromEnv = (val: string|undefined) => {
|
||||
return listVals;
|
||||
}
|
||||
|
||||
export const parseOpConfigFromEnv = (): OperatorJsonConfig => {
|
||||
export const parseDefaultBotInstanceFromEnv = (): BotInstanceJsonConfig => {
|
||||
const data = {
|
||||
operator: {
|
||||
name: parseListFromEnv(process.env.OPERATOR),
|
||||
display: process.env.OPERATOR_DISPLAY
|
||||
},
|
||||
credentials: {
|
||||
clientId: process.env.CLIENT_ID,
|
||||
clientSecret: process.env.CLIENT_SECRET,
|
||||
accessToken: process.env.ACCESS_TOKEN,
|
||||
refreshToken: process.env.REFRESH_TOKEN,
|
||||
redirectUri: process.env.REDIRECT_URI,
|
||||
},
|
||||
subreddits: {
|
||||
names: parseListFromEnv(process.env.SUBREDDITS),
|
||||
wikiConfig: process.env.WIKI_CONFIG,
|
||||
heartbeatInterval: process.env.HEARTBEAT !== undefined ? parseInt(process.env.HEARTBEAT) : undefined,
|
||||
dryRun: parseBool(process.env.DRYRUN, undefined),
|
||||
heartbeatInterval: process.env.HEARTBEAT !== undefined ? parseInt(process.env.HEARTBEAT) : undefined,
|
||||
},
|
||||
snoowrap: {
|
||||
proxy: process.env.PROXY,
|
||||
debug: parseBool(process.env.SNOO_DEBUG, undefined),
|
||||
},
|
||||
polling: {
|
||||
sharedMod: parseBool(process.env.SHARE_MOD),
|
||||
},
|
||||
nanny: {
|
||||
softLimit: process.env.SOFT_LIMIT !== undefined ? parseInt(process.env.SOFT_LIMIT) : undefined,
|
||||
hardLimit: process.env.HARD_LIMIT !== undefined ? parseInt(process.env.HARD_LIMIT) : undefined
|
||||
},
|
||||
};
|
||||
return removeUndefinedKeys(data) as BotInstanceJsonConfig;
|
||||
}
|
||||
|
||||
export const parseOpConfigFromEnv = (): OperatorJsonConfig => {
|
||||
const data = {
|
||||
mode: process.env.MODE !== undefined ? process.env.MODE as ('all' | 'server' | 'client') : undefined,
|
||||
operator: {
|
||||
name: parseListFromEnv(process.env.OPERATOR),
|
||||
display: process.env.OPERATOR_DISPLAY
|
||||
},
|
||||
logging: {
|
||||
// @ts-ignore
|
||||
level: process.env.LOG_LEVEL,
|
||||
path: process.env.LOG_DIR === 'true' ? `${process.cwd()}/logs` : undefined,
|
||||
},
|
||||
snoowrap: {
|
||||
proxy: process.env.PROXY,
|
||||
debug: parseBool(process.env.SNOO_DEBUG, undefined),
|
||||
},
|
||||
web: {
|
||||
enabled: process.env.WEB !== undefined ? parseBool(process.env.WEB) : undefined,
|
||||
port: process.env.PORT !== undefined ? parseInt(process.env.PORT) : undefined,
|
||||
session: {
|
||||
provider: process.env.SESSION_PROVIDER,
|
||||
secret: process.env.SESSION_SECRET
|
||||
}
|
||||
},
|
||||
polling: {
|
||||
sharedMod: parseBool(process.env.SHARE_MOD),
|
||||
},
|
||||
caching: {
|
||||
provider: {
|
||||
// @ts-ignore
|
||||
store: process.env.CACHING as (CacheProvider | undefined)
|
||||
},
|
||||
authorTTL: process.env.AUTHOR_TTL !== undefined ? parseInt(process.env.AUTHOR_TTL) : undefined
|
||||
},
|
||||
api: {
|
||||
softLimit: process.env.SOFT_LIMIT !== undefined ? parseInt(process.env.SOFT_LIMIT) : undefined,
|
||||
hardLimit: process.env.HARD_LIMIT !== undefined ? parseInt(process.env.HARD_LIMIT) : undefined
|
||||
web: {
|
||||
port: process.env.PORT !== undefined ? parseInt(process.env.PORT) : undefined,
|
||||
session: {
|
||||
provider: process.env.SESSION_PROVIDER,
|
||||
secret: process.env.SESSION_SECRET
|
||||
},
|
||||
credentials: {
|
||||
clientId: process.env.CLIENT_ID,
|
||||
clientSecret: process.env.CLIENT_SECRET,
|
||||
redirectUri: process.env.REDIRECT_URI,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
@@ -439,7 +476,7 @@ export const parseOperatorConfigFromSources = async (args: any): Promise<Operato
|
||||
if (operatorConfig !== undefined) {
|
||||
let rawConfig;
|
||||
try {
|
||||
rawConfig = await readJson(operatorConfig, {log: initLogger});
|
||||
rawConfig = await readConfigFile(operatorConfig, {log: initLogger}) as object;
|
||||
} catch (err) {
|
||||
initLogger.error('Cannot continue app startup because operator config file was not parseable.');
|
||||
err.logged = true;
|
||||
@@ -452,150 +489,276 @@ export const parseOperatorConfigFromSources = async (args: any): Promise<Operato
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
const configFromArgs = parseOpConfigFromArgs(args);
|
||||
const configFromEnv = parseOpConfigFromEnv();
|
||||
const opConfigFromArgs = parseOpConfigFromArgs(args);
|
||||
const opConfigFromEnv = parseOpConfigFromEnv();
|
||||
|
||||
const mergedConfig = merge.all([configFromEnv, configFromFile, configFromArgs], {
|
||||
const defaultBotInstanceFromArgs = parseDefaultBotInstanceFromArgs(args);
|
||||
const defaultBotInstanceFromEnv = parseDefaultBotInstanceFromEnv();
|
||||
const {bots: botInstancesFromFile = [], ...restConfigFile} = configFromFile;
|
||||
|
||||
const mergedConfig = merge.all([opConfigFromEnv, restConfigFile, opConfigFromArgs], {
|
||||
arrayMerge: overwriteMerge,
|
||||
});
|
||||
|
||||
return removeUndefinedKeys(mergedConfig) as OperatorJsonConfig;
|
||||
const defaultBotInstance = merge.all([defaultBotInstanceFromEnv, defaultBotInstanceFromArgs], {
|
||||
arrayMerge: overwriteMerge,
|
||||
}) as BotInstanceJsonConfig;
|
||||
|
||||
if (configFromFile.caching !== undefined) {
|
||||
defaultBotInstance.caching = configFromFile.caching;
|
||||
}
|
||||
|
||||
let botInstances = [];
|
||||
if (botInstancesFromFile.length === 0) {
|
||||
botInstances = [defaultBotInstance];
|
||||
} else {
|
||||
botInstances = botInstancesFromFile.map(x => merge.all([defaultBotInstance, x], {arrayMerge: overwriteMerge}));
|
||||
}
|
||||
|
||||
return removeUndefinedKeys({...mergedConfig, bots: botInstances}) as OperatorJsonConfig;
|
||||
}
|
||||
|
||||
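The merge above uses deepmerge with an `overwriteMerge` array strategy so that later sources win (ENV < file < CLI args), objects combine recursively, and arrays are replaced wholesale. A minimal self-contained sketch of that precedence; `mergeAll` and `isPlainObject` here are hypothetical stand-ins, not the project's helpers from `deepmerge`/`./util`:

```typescript
// Sketch: later sources win, plain objects merge recursively,
// arrays and scalars from a later source overwrite earlier ones.
type Plain = Record<string, any>;

const isPlainObject = (v: any): v is Plain =>
    typeof v === 'object' && v !== null && !Array.isArray(v);

const mergeAll = (sources: Plain[]): Plain =>
    sources.reduce((acc, src) => {
        const out: Plain = {...acc};
        for (const [k, v] of Object.entries(src)) {
            if (isPlainObject(out[k]) && isPlainObject(v)) {
                out[k] = mergeAll([out[k], v]);
            } else {
                out[k] = v; // arrays are overwritten, not concatenated
            }
        }
        return out;
    }, {});

// Same precedence order as parseOperatorConfigFromSources: ENV < file < args
const fromEnv = {subreddits: {names: ['envSub']}, api: {softLimit: 250}};
const fromFile = {subreddits: {names: ['fileSub1', 'fileSub2']}};
const fromArgs = {api: {softLimit: 100}};
const merged = mergeAll([fromEnv, fromFile, fromArgs]);
// merged.subreddits.names is ['fileSub1', 'fileSub2']; merged.api.softLimit is 100
```

The overwrite strategy matters for lists like `subreddits.names`: a config file's list replaces the ENV list instead of being appended to it.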
export const buildOperatorConfigWithDefaults = (data: OperatorJsonConfig): OperatorConfig => {
    const {
        mode = 'all',
        operator: {
            name = [],
            display = 'Anonymous',
            botName,
        } = {},
        credentials: {
            clientId: ci,
            clientSecret: cs,
            ...restCred
        } = {},
        subreddits: {
            names = [],
            wikiConfig = 'botconfig/contextbot',
            heartbeatInterval = 300,
            dryRun
        } = {},
        logging: {
            level = 'verbose',
            path,
        } = {},
        snoowrap = {},
        caching: opCache,
        web: {
            enabled = true,
            port = 8085,
            maxLogs = 200,
            caching: webCaching = {},
            session: {
                secret = randomId(),
                provider: sessionProvider = { store: 'memory' },
            } = {}
                maxAge: sessionMaxAge = 86400,
            } = {},
            invites: {
                maxAge: inviteMaxAge = 0,
            } = {},
            clients,
            credentials: webCredentials,
            operators,
        } = {},
        polling: {
            sharedMod = false,
            limit = 100,
            interval = 30,
        } = {},
        queue: {
            maxWorkers = 1,
        } = {},
        caching,
        api: {
            softLimit = 250,
            hardLimit = 50
            port: apiPort = 8095,
            secret: apiSecret = randomId(),
            friendly,
        } = {},
        bots = [],
    } = data;

    let cache: StrongCache;
    let defaultProvider: CacheOptions;
    let opActionedEventsMax: number | undefined;
    let opActionedEventsDefault: number = 25;

    if(caching === undefined) {
    if (opCache === undefined) {
        defaultProvider = {
            store: 'memory',
            ...cacheOptDefaults
        };
        cache = {
            ...cacheTTLDefaults,
            provider: {
                store: 'memory',
                ...cacheOptDefaults
            }
            provider: defaultProvider,
            actionedEventsDefault: opActionedEventsDefault,
        };

    } else {
        const {provider, ...restConfig} = caching;
        const {provider, actionedEventsMax, actionedEventsDefault = opActionedEventsDefault, ...restConfig} = opCache;

        if (actionedEventsMax !== undefined && actionedEventsMax !== null) {
            opActionedEventsMax = actionedEventsMax;
            opActionedEventsDefault = Math.min(actionedEventsDefault, actionedEventsMax);
        }

        if (typeof provider === 'string') {
            cache = {
                ...cacheTTLDefaults,
                ...restConfig,
                provider: {
                    store: provider as CacheProvider,
                    ...cacheOptDefaults
                }
            }
            defaultProvider = {
                store: provider as CacheProvider,
                ...cacheOptDefaults
            };
        } else {
            const {ttl = 60, max = 500, store = 'memory', ...rest} = provider || {};
            cache = {
                ...cacheTTLDefaults,
                ...restConfig,
                provider: {
                    store,
                    ...cacheOptDefaults,
                    ...rest,
                },
            }
            defaultProvider = {
                store,
                ...cacheOptDefaults,
                ...rest,
            };
        }
        cache = {
            ...cacheTTLDefaults,
            ...restConfig,
            actionedEventsMax: opActionedEventsMax,
            actionedEventsDefault: opActionedEventsDefault,
            provider: defaultProvider,
        }
    }

    const config: OperatorConfig = {
        operator: {
            name: typeof name === 'string' ? [name] : name,
            display,
            botName,
        },
        credentials: {
    let hydratedBots: BotInstanceConfig[] = bots.map(x => {
        const {
            name: botName,
            polling: {
                sharedMod = false,
                limit = 100,
                interval = 30,
            } = {},
            queue: {
                maxWorkers = 1,
            } = {},
            caching,
            nanny: {
                softLimit = 250,
                hardLimit = 50
            } = {},
            snoowrap = {},
            credentials: {
                clientId: ci,
                clientSecret: cs,
                ...restCred
            } = {},
            subreddits: {
                names = [],
                exclude = [],
                wikiConfig = 'botconfig/contextbot',
                dryRun,
                heartbeatInterval = 300,
            } = {},
        } = x;

        let botCache: StrongCache;
        let botActionedEventsDefault: number;

        if (caching === undefined) {

            botCache = {
                ...cacheTTLDefaults,
                actionedEventsDefault: opActionedEventsDefault,
                actionedEventsMax: opActionedEventsMax,
                provider: {
                    store: 'memory',
                    ...cacheOptDefaults
                }
            };
        } else {
            const {
                provider,
                actionedEventsMax = opActionedEventsMax,
                actionedEventsDefault = opActionedEventsDefault,
                ...restConfig
            } = caching;

            botActionedEventsDefault = actionedEventsDefault;
            if(actionedEventsMax !== undefined) {
                botActionedEventsDefault = Math.min(actionedEventsDefault, actionedEventsMax);
            }

            if (typeof provider === 'string') {
                botCache = {
                    ...cacheTTLDefaults,
                    ...restConfig,
                    actionedEventsDefault: botActionedEventsDefault,
                    provider: {
                        store: provider as CacheProvider,
                        ...cacheOptDefaults
                    }
                }
            } else {
                const {ttl = 60, max = 500, store = 'memory', ...rest} = provider || {};
                botCache = {
                    ...cacheTTLDefaults,
                    ...restConfig,
                    actionedEventsDefault: botActionedEventsDefault,
                    actionedEventsMax,
                    provider: {
                        store,
                        ...cacheOptDefaults,
                        ...rest,
                    },
                }
            }
        }

        const botCreds = {
            clientId: (ci as string),
            clientSecret: (cs as string),
            ...restCred,
        };
        if (botCache.provider.prefix === undefined || botCache.provider.prefix === defaultProvider.prefix) {
            // need to provide unique prefix to bot
            botCache.provider.prefix = buildCachePrefix([botCache.provider.prefix, 'bot', (botName || objectHash.sha1(botCreds))]);
        }
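The prefix step above gives each bot its own cache namespace (falling back to a hash of its credentials when it has no name) so bots sharing one cache store do not collide. A hypothetical sketch of what `buildCachePrefix` plausibly does; the real helper lives in `./util` and may differ:

```typescript
// Hypothetical stand-in for buildCachePrefix from ./util: drop undefined/empty
// parts, then join the rest with '-' to namespace one bot's cache keys.
const buildCachePrefixSketch = (parts: (string | undefined)[]): string =>
    parts
        .filter((p): p is string => p !== undefined && p.trim() !== '')
        .map(p => p.trim())
        .join('-');

// Distinct bots with no explicit prefix get distinct namespaces
const prefixA = buildCachePrefixSketch([undefined, 'bot', 'u_MyBot']);
const prefixB = buildCachePrefixSketch(['shared', 'bot', 'u_OtherBot']);
// prefixA is 'bot-u_MyBot'; prefixB is 'shared-bot-u_OtherBot'
```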
        return {
            name: botName,
            snoowrap,
            subreddits: {
                names,
                exclude,
                wikiConfig,
                heartbeatInterval,
                dryRun,
            },
            credentials: botCreds,
            caching: botCache,
            polling: {
                sharedMod,
                limit,
                interval,
            },
            queue: {
                maxWorkers,
            },
            nanny: {
                softLimit,
                hardLimit
            }
        }

    });

    const defaultOperators = typeof name === 'string' ? [name] : name;

    const config: OperatorConfig = {
        mode,
        operator: {
            name: defaultOperators,
            display,
        },
        logging: {
            level,
            path
        },
        snoowrap,
        subreddits: {
            names,
            wikiConfig,
            heartbeatInterval,
            dryRun,
        },
        caching: cache,
        web: {
            enabled,
            port,
            caching: {
                ...defaultProvider,
                ...webCaching
            },
            invites: {
                maxAge: inviteMaxAge,
            },
            session: {
                secret,
                provider: typeof sessionProvider === 'string' ? {
                    ...buildCacheOptionsFromProvider({
                        ttl: 86400000,
                        store: sessionProvider,
                    })
                } : {
                    ...buildCacheOptionsFromProvider(sessionProvider),
                    ttl: 86400000,
                },
                maxAge: sessionMaxAge,
            },
            maxLogs,
        },
        caching: cache,
        polling: {
            sharedMod,
            limit,
            interval,
        },
        queue: {
            maxWorkers,
            clients: clients === undefined ? [{host: 'localhost:8095', secret: apiSecret}] : clients,
            credentials: webCredentials as RequiredWebRedditCredentials,
            operators: operators || defaultOperators,
        },
        api: {
            softLimit,
            hardLimit
        }
            port: apiPort,
            secret: apiSecret,
            friendly
        },
        bots: hydratedBots,
    };

    return config;
@@ -5,9 +5,10 @@ import Submission from "snoowrap/dist/objects/Submission";
import {getAttributionIdentifier} from "../Utils/SnoowrapUtils";
import dayjs from "dayjs";
import {
    asSubmission,
    comparisonTextOp,
    FAIL,
    formatNumber,
    formatNumber, getActivitySubredditName, isSubmission,
    parseGenericValueOrPercentComparison,
    parseSubredditName,
    PASS
@@ -52,8 +53,6 @@ export interface AttributionCriteria {
    /**
     * A list of domains whose Activities will be tested against `threshold`.
     *
     * If this is present then `aggregateOn` is ignored.
     *
     * The values are tested as partial strings so you do not need to include full URLs, just the part that matters.
     *
     * EX `["youtube"]` will match submissions with the domain `https://youtube.com/c/aChannel`
@@ -97,18 +96,19 @@ export interface AttributionCriteria {
    exclude?: string[],

    /**
     * If `domains` is not specified this list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`
     * This list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`
     *
     * * If `media` is included then aggregate author's submission history which reddit recognizes as media (youtube, vimeo, etc.)
     * * If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or reddit image/video (i.redd.it / v.redd.it)
     * * If `link` is included then aggregate author's submission history which are external links but not media
     * * If `redditMedia` is included then aggregate on author's submission history which are media hosted on reddit: galleries, videos, and images (i.redd.it / v.redd.it)
     * * If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or domain is `reddit.com`
     * * If `link` is included then aggregate author's submission history which are external links and not recognized as `media` by reddit
     *
     * If nothing is specified or list is empty (default) all domains are aggregated
     *
     * @default undefined
     * @examples [[]]
     * */
    aggregateOn?: ('media' | 'self' | 'link')[],
    aggregateOn?: ('media' | 'redditMedia' | 'self' | 'link')[],

    /**
     * Should the criteria consolidate recognized media domains into the parent domain?
@@ -190,9 +190,9 @@ export class AttributionRule extends Rule {
        let activities = thresholdOn === 'submissions' ? await this.resources.getAuthorSubmissions(item.author, {window: window}) : await this.resources.getAuthorActivities(item.author, {window: window});
        activities = activities.filter(act => {
            if (include.length > 0) {
                return include.some(x => x === act.subreddit.display_name.toLowerCase());
                return include.some(x => x === getActivitySubredditName(act).toLowerCase());
            } else if (exclude.length > 0) {
                return !exclude.some(x => x === act.subreddit.display_name.toLowerCase())
                return !exclude.some(x => x === getActivitySubredditName(act).toLowerCase())
            }
            return true;
        });
@@ -219,7 +219,7 @@ export class AttributionRule extends Rule {

        const realDomains: DomainInfo[] = domains.map(x => {
            if(x === SUBMISSION_DOMAIN) {
                if(!(item instanceof Submission)) {
                if(!(asSubmission(item))) {
                    throw new SimpleError('Cannot run Attribution Rule with the domain SELF:AGG on a Comment');
                }
                return getAttributionIdentifier(item, consolidateMediaDomains);
@@ -228,21 +228,28 @@ export class AttributionRule extends Rule {
        });
        const realDomainIdents = realDomains.map(x => x.aliases).flat(1).map(x => x.toLowerCase());

        const submissions: Submission[] = thresholdOn === 'submissions' ? activities as Submission[] : activities.filter(x => x instanceof Submission) as Submission[];
        const submissions: Submission[] = thresholdOn === 'submissions' ? activities as Submission[] : activities.filter(x => isSubmission(x)) as Submission[];
        const aggregatedSubmissions = submissions.reduce((acc: Map<string, DomainAgg>, sub) => {
            const domainInfo = getAttributionIdentifier(sub, consolidateMediaDomains)

            let domainType = 'link';
            if(sub.secure_media !== undefined && sub.secure_media !== null) {
                domainType = 'media';
            } else if((sub.is_self || sub.is_video || sub.domain === 'i.redd.it')) {
            if(sub.is_video || ['i.redd.it','v.redd.it'].includes(sub.domain)
                // @ts-ignore
                || sub.gallery_data !== undefined) {
                domainType = 'redditMedia';
            } else if(sub.is_self || sub.domain === 'reddit.com') {
                domainType = 'self';
            } else if(sub.secure_media !== undefined && sub.secure_media !== null) {
                domainType = 'media';
            }
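The new classification order above (reddit-hosted media first, then self/`reddit.com`, then externally-recognized media, else plain link) can be sketched in isolation. The `SubLike` shape below only mimics the handful of snoowrap `Submission` fields the diff reads and is not the real class:

```typescript
// Sketch of the new domainType ordering from the diff above.
interface SubLike {
    is_video?: boolean;
    is_self?: boolean;
    domain: string;
    gallery_data?: object;      // present on gallery posts
    secure_media?: object | null; // reddit-recognized external media (youtube, etc.)
}

type DomainType = 'redditMedia' | 'self' | 'media' | 'link';

const classify = (sub: SubLike): DomainType => {
    if (sub.is_video || ['i.redd.it', 'v.redd.it'].includes(sub.domain) || sub.gallery_data !== undefined) {
        return 'redditMedia';
    } else if (sub.is_self || sub.domain === 'reddit.com') {
        return 'self';
    } else if (sub.secure_media !== undefined && sub.secure_media !== null) {
        return 'media';
    }
    return 'link';
};

classify({domain: 'v.redd.it'});                     // 'redditMedia'
classify({domain: 'self.AskReddit', is_self: true}); // 'self'
classify({domain: 'youtube.com', secure_media: {}}); // 'media'
classify({domain: 'example.com'});                   // 'link'
```

Note the ordering matters: before this change a reddit-hosted video with `secure_media` set would have been counted as `media`; now reddit-hosted checks win.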
            if(realDomains.length === 0 && aggregateOn.length !== 0) {
            if(aggregateOn.length !== 0) {
                if(domainType === 'media' && !aggregateOn.includes('media')) {
                    return acc;
                }
                if(domainType === 'redditMedia' && !aggregateOn.includes('redditMedia')) {
                    return acc;
                }
                if(domainType === 'self' && !aggregateOn.includes('self')) {
                    return acc;
                }

@@ -1,17 +1,19 @@

import {ActivityWindowType, CompareValueOrPercent, ThresholdCriteria} from "../Common/interfaces";
import {ActivityWindowType, CompareValueOrPercent, SubredditState, ThresholdCriteria} from "../Common/interfaces";
import {Rule, RuleJSONConfig, RuleOptions, RuleResult} from "./index";
import Submission from "snoowrap/dist/objects/Submission";
import {getAuthorActivities} from "../Utils/SnoowrapUtils";
import dayjs from "dayjs";
import {
    asSubmission,
    comparisonTextOp,
    FAIL,
    formatNumber,
    formatNumber, getActivitySubredditName, isSubmission,
    parseGenericValueOrPercentComparison, parseSubredditName,
    PASS,
    percentFromString
    percentFromString, toStrongSubredditState
} from "../util";
import {Comment} from "snoowrap";

export interface CommentThresholdCriteria extends ThresholdCriteria {
    /**
@@ -23,42 +25,56 @@ export interface CommentThresholdCriteria extends ThresholdCriteria {
    asOp?: boolean
}
/**
 * If both `submission` and `comment` are defined then criteria will only trigger if BOTH thresholds are met
 * Criteria will only trigger if ALL present thresholds (comment, submission, total) are met
 * */
export interface HistoryCriteria {

    /**
     * A string containing a comparison operator and a value to compare submissions against
     * A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) submissions against
     *
     * The syntax is `(< OR > OR <= OR >=) <number>[percent sign]`
     *
     * * EX `> 100` => greater than 100 submissions
     * * EX `<= 75%` => submissions are equal to or less than 75% of all Activities
     * * EX `> 100` => greater than 100 filtered submissions
     * * EX `<= 75%` => filtered submissions are equal to or less than 75% of unfiltered Activities
     *
     * @pattern ^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$
     * */
    submission?: CompareValueOrPercent
    /**
     * A string containing a comparison operator and a value to compare comments against
     * A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) comments against
     *
     * The syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`
     *
     * * EX `> 100` => greater than 100 comments
     * * EX `<= 75%` => comments are equal to or less than 75% of all Activities
     * * EX `<= 75%` => comments are equal to or less than 75% of unfiltered Activities
     *
     * If your string also contains the text `OP` somewhere **after** `<number>[percent sign]`...:
     *
     * * EX `> 100 OP` => greater than 100 comments as OP
     * * EX `<= 25% as OP` => Comments as OP were less than or equal to 25% of **all Comments**
     * * EX `> 100 OP` => greater than 100 filtered comments as OP
     * * EX `<= 25% as OP` => **Filtered** comments as OP were less than or equal to 25% of **unfiltered Comments**
     *
     * @pattern ^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$
     * */
    comment?: CompareValueOrPercent

    /**
     * A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`) activities against
     *
     * **Note:** This is only useful if using `include` or `exclude` otherwise percent will always be 100% and total === activityTotal
     *
     * The syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`
     *
     * * EX `> 100` => greater than 100 filtered activities
     * * EX `<= 75%` => filtered activities are equal to or less than 75% of all Activities
     *
     * @pattern ^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$
     * */
    total?: CompareValueOrPercent
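The comparison syntax documented above can be exercised directly against its `@pattern`. `parseComparison` below is an illustrative sketch, not the project's `parseGenericValueOrPercentComparison` from `../util`:

```typescript
// Illustrative parser for the documented comparison syntax, built on the
// @pattern from the JSDoc above.
const COMPARISON = /^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$/;

interface ParsedComparison {
    operator: string;
    value: number;
    isPercent: boolean;
    extra: string; // trailing text such as 'OP'
}

const parseComparison = (str: string): ParsedComparison => {
    const m = str.match(COMPARISON);
    if (m === null) {
        throw new Error(`Invalid comparison: ${str}`);
    }
    return {operator: m[1], value: Number(m[2]), isPercent: m[3] === '%', extra: m[4].trim()};
};

parseComparison('<= 75%');   // {operator: '<=', value: 75, isPercent: true, extra: ''}
parseComparison('> 100 OP'); // {operator: '>', value: 100, isPercent: false, extra: 'OP'}
```

The trailing capture group is how a qualifier like `OP` survives parsing even though only the operator and number are strictly validated.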
    window: ActivityWindowType

    /**
     * The minimum number of activities that must exist from the `window` results for this criteria to run
     * The minimum number of **filtered** activities that must exist from the `window` results for this criteria to run
     * @default 5
     * */
    minActivityCount?: number
@@ -68,8 +84,9 @@ export interface HistoryCriteria {
export class HistoryRule extends Rule {
    criteria: HistoryCriteria[];
    condition: 'AND' | 'OR';
    include: string[];
    exclude: string[];
    include: (string | SubredditState)[];
    exclude: (string | SubredditState)[];
    activityFilterFunc: (x: Submission|Comment) => Promise<boolean> = async (x) => true;

    constructor(options: HistoryOptions) {
        super(options);
@@ -85,8 +102,41 @@ export class HistoryRule extends Rule {
        if (this.criteria.length === 0) {
            throw new Error('Must provide at least one HistoryCriteria');
        }
        this.include = include.map(x => parseSubredditName(x).toLowerCase());
        this.exclude = exclude.map(x => parseSubredditName(x).toLowerCase());

        this.include = include;
        this.exclude = exclude;

        if(this.include.length > 0) {
            const subStates = include.map((x) => {
                if(typeof x === 'string') {
                    return toStrongSubredditState({name: x, stateDescription: x}, {defaultFlags: 'i', generateDescription: true});
                }
                return toStrongSubredditState(x, {defaultFlags: 'i', generateDescription: true});
            });
            this.activityFilterFunc = async (x: Submission|Comment) => {
                for(const ss of subStates) {
                    if(await this.resources.testSubredditCriteria(x, ss)) {
                        return true;
                    }
                }
                return false;
            };
        } else if(this.exclude.length > 0) {
            const subStates = exclude.map((x) => {
                if(typeof x === 'string') {
                    return toStrongSubredditState({name: x, stateDescription: x}, {defaultFlags: 'i', generateDescription: true});
                }
                return toStrongSubredditState(x, {defaultFlags: 'i', generateDescription: true});
            });
            this.activityFilterFunc = async (x: Submission|Comment) => {
                for(const ss of subStates) {
                    if(await this.resources.testSubredditCriteria(x, ss)) {
                        return false;
                    }
                }
                return true;
            };
        }
    }
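The include/exclude predicate built in the constructor above has simple semantics: `include` keeps an activity if ANY state matches, `exclude` drops it if ANY state matches. A standalone sketch; `inMemes` below is a hypothetical stand-in for the async `resources.testSubredditCriteria` call:

```typescript
// Sketch: include = keep activities matching ANY state; exclude = drop them.
type Matcher<T> = (item: T) => Promise<boolean>;

const buildFilter = <T>(states: Matcher<T>[], mode: 'include' | 'exclude'): Matcher<T> =>
    async (item: T) => {
        for (const matches of states) {
            if (await matches(item)) {
                // a match means "keep" under include, "drop" under exclude
                return mode === 'include';
            }
        }
        // no match: include drops, exclude keeps
        return mode === 'exclude';
    };

// Hypothetical matcher standing in for resources.testSubredditCriteria
const inMemes = async (act: {subreddit: string}) => act.subreddit.toLowerCase() === 'memes';
const keep = buildFilter([inMemes], 'include');
const drop = buildFilter([inMemes], 'exclude');
// await keep({subreddit: 'Memes'}) resolves true; await drop({subreddit: 'Memes'}) resolves false
```

Making the filter async is what lets the new `SubredditState` form consult cached subreddit data instead of only comparing lowercase names.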
    getKind(): string {
@@ -107,25 +157,23 @@ export class HistoryRule extends Rule {

        for (const criteria of this.criteria) {

            const {comment, window, submission, minActivityCount = 5} = criteria;
            const {comment, window, submission, total, minActivityCount = 5} = criteria;

            let activities = await this.resources.getAuthorActivities(item.author, {window: window});
            activities = activities.filter(act => {
                if (this.include.length > 0) {
                    return this.include.some(x => x === act.subreddit.display_name.toLowerCase());
                } else if (this.exclude.length > 0) {
                    return !this.exclude.some(x => x === act.subreddit.display_name.toLowerCase())
            const filteredActivities = [];
            for(const a of activities) {
                if(await this.activityFilterFunc(a)) {
                    filteredActivities.push(a);
                }
                return true;
            });
            }

            if (activities.length < minActivityCount) {
            if (filteredActivities.length < minActivityCount) {
                continue;
            }

            const activityTotal = activities.length;
            const {submissionTotal, commentTotal, opTotal} = activities.reduce((acc, act) => {
                if(act instanceof Submission) {
                if(asSubmission(act)) {
                    return {...acc, submissionTotal: acc.submissionTotal + 1};
                }
                let a = {...acc, commentTotal: acc.commentTotal + 1};
@@ -134,6 +182,24 @@ export class HistoryRule extends Rule {
                }
                return a;
            },{submissionTotal: 0, commentTotal: 0, opTotal: 0});
            let fSubmissionTotal = submissionTotal;
            let fCommentTotal = commentTotal;
            let fOpTotal = opTotal;
            if(activities.length !== filteredActivities.length) {
                const filteredCounts = filteredActivities.reduce((acc, act) => {
                    if(asSubmission(act)) {
                        return {...acc, submissionTotal: acc.submissionTotal + 1};
                    }
                    let a = {...acc, commentTotal: acc.commentTotal + 1};
                    if(act.is_submitter) {
                        a.opTotal = a.opTotal + 1;
                    }
                    return a;
                },{submissionTotal: 0, commentTotal: 0, opTotal: 0});
                fSubmissionTotal = filteredCounts.submissionTotal;
                fCommentTotal = filteredCounts.commentTotal;
                fOpTotal = filteredCounts.opTotal;
            }

            let commentTrigger = undefined;
            if(comment !== undefined) {
@@ -142,15 +208,15 @@ export class HistoryRule extends Rule {
                if(isPercent) {
                    const per = value / 100;
                    if(asOp) {
                        commentTrigger = comparisonTextOp(opTotal / commentTotal, operator, per);
                        commentTrigger = comparisonTextOp(fOpTotal / commentTotal, operator, per);
                    } else {
                        commentTrigger = comparisonTextOp(commentTotal / activityTotal, operator, per);
                        commentTrigger = comparisonTextOp(fCommentTotal / activityTotal, operator, per);
                    }
                } else {
                    if(asOp) {
                        commentTrigger = comparisonTextOp(opTotal, operator, value);
                        commentTrigger = comparisonTextOp(fOpTotal, operator, value);
                    } else {
                        commentTrigger = comparisonTextOp(commentTotal, operator, value);
                        commentTrigger = comparisonTextOp(fCommentTotal, operator, value);
                    }
                }
            }
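As a quick arithmetic check of the percent form above (filtered count divided by an unfiltered total, compared against the threshold), here is a minimal sketch; `compare` is illustrative and not the project's `comparisonTextOp` from `../util`:

```typescript
// Minimal sketch of a text-operator comparison like comparisonTextOp.
const compare = (val: number, operator: string, expected: number): boolean => {
    switch (operator) {
        case '>': return val > expected;
        case '>=': return val >= expected;
        case '<': return val < expected;
        case '<=': return val <= expected;
        default: throw new Error(`Unknown operator: ${operator}`);
    }
};

// `<= 25% as OP` with 5 filtered OP comments out of 40 total comments:
// 5 / 40 = 0.125 and 0.125 <= 0.25, so the comment threshold triggers.
compare(5 / 40, '<=', 25 / 100); // true
```

Note the denominators in the diff stay unfiltered (`commentTotal`, `activityTotal`) while only the numerators switch to the filtered `f*` counts, which is what makes "filtered X as a share of all activity" comparisons possible.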
@@ -160,27 +226,40 @@ export class HistoryRule extends Rule {
|
||||
const {operator, value, isPercent} = parseGenericValueOrPercentComparison(submission);
|
||||
if(isPercent) {
|
||||
const per = value / 100;
|
||||
submissionTrigger = comparisonTextOp(submissionTotal / activityTotal, operator, per);
|
||||
submissionTrigger = comparisonTextOp(fSubmissionTotal / activityTotal, operator, per);
|
||||
} else {
|
||||
submissionTrigger = comparisonTextOp(submissionTotal, operator, value);
|
||||
submissionTrigger = comparisonTextOp(fSubmissionTotal, operator, value);
|
||||
}
|
||||
}
|
||||
|
||||
let totalTrigger = undefined;
|
||||
if(total !== undefined) {
|
||||
const {operator, value, isPercent} = parseGenericValueOrPercentComparison(total);
|
||||
if(isPercent) {
|
||||
const per = value / 100;
|
||||
totalTrigger = comparisonTextOp(filteredActivities.length / activityTotal, operator, per);
|
||||
} else {
|
||||
totalTrigger = comparisonTextOp(filteredActivities.length, operator, value);
|
||||
}
|
||||
}
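Every trigger check in the hunks above follows the same parse-then-compare pattern: a comparison string such as `>= 3` or `<= 25%` is parsed by `parseGenericValueOrPercentComparison` and evaluated with `comparisonTextOp`. A simplified sketch of that helper pair (the real implementations live in `../util` and support richer syntax; these stand-ins are illustrative assumptions):

```typescript
// Simplified stand-ins for the helpers used above.
interface Comparison {
    operator: string;
    value: number;
    isPercent: boolean;
}

function parseGenericValueOrPercentComparison(input: string): Comparison {
    const match = input.match(/^\s*(>=|<=|>|<)\s*(\d+(?:\.\d+)?)\s*(%?)/);
    if (match === null) {
        throw new Error(`Invalid comparison: ${input}`);
    }
    return {
        operator: match[1],
        value: Number(match[2]),
        isPercent: match[3] === '%',
    };
}

function comparisonTextOp(val: number, operator: string, compare: number): boolean {
    switch (operator) {
        case '>': return val > compare;
        case '>=': return val >= compare;
        case '<': return val < compare;
        case '<=': return val <= compare;
        default: throw new Error(`Unknown operator: ${operator}`);
    }
}

// e.g. "are 3 filtered submissions out of 20 total activities <= 25%?"
const {operator, value, isPercent} = parseGenericValueOrPercentComparison('<= 25%');
const triggered = isPercent
    ? comparisonTextOp(3 / 20, operator, value / 100)
    : comparisonTextOp(3, operator, value);
```

Percent comparisons divide the parsed value by 100 and compare against a ratio of filtered count to `activityTotal`, which is exactly what the `fSubmissionTotal / activityTotal` style checks above do.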

const firstActivity = activities[0];
const lastActivity = activities[activities.length - 1];

const activityTotalWindow = dayjs.duration(dayjs(firstActivity.created_utc * 1000).diff(dayjs(lastActivity.created_utc * 1000)));
const activityTotalWindow = activities.length === 0 ? dayjs.duration(0, 's') : dayjs.duration(dayjs(firstActivity.created_utc * 1000).diff(dayjs(lastActivity.created_utc * 1000)));

criteriaResults.push({
criteria,
activityTotal,
activityTotalWindow,
submissionTotal,
commentTotal,
opTotal,
submissionTotal: fSubmissionTotal,
commentTotal: fCommentTotal,
opTotal: fOpTotal,
filteredTotal: filteredActivities.length,
submissionTrigger,
commentTrigger,
triggered: (submissionTrigger === undefined || submissionTrigger === true) && (commentTrigger === undefined || commentTrigger === true)
totalTrigger,
triggered: (submissionTrigger === undefined || submissionTrigger === true) && (commentTrigger === undefined || commentTrigger === true) && (totalTrigger === undefined || totalTrigger === true)
});
}

@@ -223,36 +302,50 @@ export class HistoryRule extends Rule {
activityTotalWindow,
submissionTotal,
commentTotal,
filteredTotal,
opTotal,
criteria: {
comment,
submission,
total,
window,
},
criteria,
triggered,
submissionTrigger,
commentTrigger,
totalTrigger,
} = results;

const data: any = {
activityTotal,
submissionTotal,
commentTotal,
filteredTotal,
opTotal,
commentPercent: formatNumber((commentTotal/activityTotal)*100),
submissionPercent: formatNumber((submissionTotal/activityTotal)*100),
opPercent: formatNumber((opTotal/commentTotal)*100),
filteredPercent: formatNumber((filteredTotal/activityTotal)*100),
criteria,
window: typeof window === 'number' ? `${activityTotal} Items` : activityTotalWindow.humanize(true),
window: typeof window === 'number' || activityTotal === 0 ? `${activityTotal} Items` : activityTotalWindow.humanize(true),
triggered,
submissionTrigger,
commentTrigger,
totalTrigger,
};

let thresholdSummary = [];
let totalSummary;
let submissionSummary;
let commentSummary;
if(total !== undefined) {
const {operator, value, isPercent, displayText} = parseGenericValueOrPercentComparison(total);
const suffix = !isPercent ? 'Items' : `(${formatNumber((filteredTotal/activityTotal)*100)}%) of ${activityTotal} Total`;
totalSummary = `${includePassFailSymbols ? `${totalTrigger ? PASS : FAIL} ` : ''}Filtered Activities (${filteredTotal}) were${totalTrigger ? '' : ' not'} ${displayText} ${suffix}`;
data.totalSummary = totalSummary;
thresholdSummary.push(totalSummary);
}
if(submission !== undefined) {
const {operator, value, isPercent, displayText} = parseGenericValueOrPercentComparison(submission);
const suffix = !isPercent ? 'Items' : `(${formatNumber((submissionTotal/activityTotal)*100)}%) of ${activityTotal} Total`;
@@ -298,21 +391,45 @@ interface HistoryConfig {
condition?: 'AND' | 'OR'

/**
* Only include Submissions from this list of Subreddits (by name, case-insensitive)
* If present, activities will be counted only if they are found in this list of Subreddits.
*
* EX `["mealtimevideos","askscience"]`
* @examples ["mealtimevideos","askscience"]
* @minItems 1
* Each value in the list can be either:
*
* * string (name of subreddit)
* * regular expression to run on the subreddit name
* * `SubredditState`
*
* EX `["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]`
*
* **Note:** This affects **post-window retrieval** activities. So that:
*
* * `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering
* * all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**
* * -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`
*
* @examples [["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]]
* */
include?: string[],
include?: (string | SubredditState)[],
/**
* Do not include Submissions from this list of Subreddits (by name, case-insensitive)
* If present, activities will be counted only if they are **NOT** found in this list of Subreddits
*
* EX `["mealtimevideos","askscience"]`
* @examples ["mealtimevideos","askscience"]
* @minItems 1
* Each value in the list can be either:
*
* * string (name of subreddit)
* * regular expression to run on the subreddit name
* * `SubredditState`
*
* EX `["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]`
*
* **Note:** This affects **post-window retrieval** activities. So that:
*
* * `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering
* * all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**
* * -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`
*
* @examples [["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]]
* */
exclude?: string[],
exclude?: (string | SubredditState)[],
}
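The doc comments above describe `include`/`exclude` entries that may be a bare subreddit name, a regex literal string like `/onlyfans*/i`, or a `SubredditState` object. A hedged sketch of how such mixed entries could be normalized into name matchers (the real normalization is `toStrongSubredditState` in `../util`; the `SubredditState` shape and matching behavior here are simplified assumptions):

```typescript
// Hypothetical, simplified shape; the real SubredditState in
// ../Common/interfaces carries more criteria (over18, quarantine, etc.).
interface SubredditState {
    name?: string;
    over18?: boolean;
}

// Normalize each include/exclude entry into a predicate on a subreddit name.
// Plain strings may themselves be regex literals like "/onlyfans*/i".
function toMatcher(entry: string | SubredditState): (name: string) => boolean {
    if (typeof entry === 'string') {
        const regexLiteral = entry.match(/^\/(.+)\/([a-z]*)$/);
        if (regexLiteral !== null) {
            const reg = new RegExp(regexLiteral[1], regexLiteral[2] || 'i');
            return (name) => reg.test(name);
        }
        // bare name => case-insensitive exact match (mirrors defaultFlags: 'i')
        return (name) => name.toLowerCase() === entry.toLowerCase();
    }
    return (name) => entry.name === undefined
        || name.toLowerCase() === entry.name.toLowerCase();
}

const matchers = ['askscience', '/onlyfans*/i'].map(toMatcher);
const included = (sub: string) => matchers.some(m => m(sub));
```

An `exclude` list would simply negate the same test, which is how the rules above build their filter functions.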

export interface HistoryOptions extends HistoryConfig, RuleOptions {

@@ -2,25 +2,26 @@ import {Rule, RuleJSONConfig, RuleOptions, RulePremise, RuleResult} from "./inde
import {Comment, VoteableContent} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {
activityWindowText,
comparisonTextOp, FAIL, formatNumber,
parseGenericValueOrPercentComparison, parseSubredditName,
activityWindowText, asSubmission,
comparisonTextOp, FAIL, formatNumber, getActivitySubredditName, isSubmission, objectToStringSummary,
parseGenericValueOrPercentComparison, parseStringToRegex, parseSubredditName,
parseUsableLinkIdentifier,
PASS
PASS, toStrongSubredditState
} from "../util";
import {
ActivityWindow,
ActivityWindowCriteria,
ActivityWindowType,
ReferenceSubmission,
SubredditCriteria
ActivityWindowType, CommentState,
ReferenceSubmission, StrongSubredditState, SubmissionState,
SubredditCriteria, SubredditState
} from "../Common/interfaces";
import {SubredditResources} from "../Subreddit/SubredditResources";

const parseLink = parseUsableLinkIdentifier();

export class RecentActivityRule extends Rule {
window: ActivityWindowType;
thresholds: SubThreshold[];
thresholds: ActivityThreshold[];
useSubmissionAsReference: boolean;
lookAt?: 'comments' | 'submissions';

@@ -67,14 +68,14 @@ export class RecentActivityRule extends Rule {

let viableActivity = activities;
if (this.useSubmissionAsReference) {
if (!(item instanceof Submission)) {
if (!asSubmission(item)) {
this.logger.warn('Cannot use post as reference because triggered item is not a Submission');
} else if (item.is_self) {
this.logger.warn('Cannot use post as reference because triggered Submission is not a link type');
} else {
const usableUrl = parseLink(await item.url);
viableActivity = viableActivity.filter((x) => {
if (!(x instanceof Submission)) {
if (!asSubmission(x)) {
return false;
}
if (x.url === undefined) {
@@ -84,29 +85,59 @@ export class RecentActivityRule extends Rule {
});
}
}
const groupedActivity = viableActivity.reduce((grouped, activity) => {
const s = activity.subreddit.display_name.toLowerCase();
grouped[s] = (grouped[s] || []).concat(activity);
return grouped;
}, {} as Record<string, (Submission | Comment)[]>);


const summaries = [];
let totalTriggeredOn;
for (const triggerSet of this.thresholds) {
let currCount = 0;
const presentSubs = [];
const {threshold = '>= 1', subreddits = []} = triggerSet;
for (const sub of subreddits.map(x => parseSubredditName(x))) {
const isub = sub.toLowerCase();
const {[isub]: tSub = []} = groupedActivity;
if (tSub.length > 0) {
currCount += tSub.length;
presentSubs.push(sub);
const presentSubs: string[] = [];
let combinedKarma = 0;
const {
threshold = '>= 1',
subreddits = [],
karma: karmaThreshold,
commentState,
submissionState,
} = triggerSet;

// convert subreddits array into entirely StrongSubredditState
const subStates: StrongSubredditState[] = subreddits.map((x) => {
if(typeof x === 'string') {
return toStrongSubredditState({name: x, stateDescription: x}, {defaultFlags: 'i', generateDescription: true});
}
return toStrongSubredditState(x, {defaultFlags: 'i', generateDescription: true});
});

for(const activity of viableActivity) {
if(asSubmission(activity) && submissionState !== undefined) {
if(!(await this.resources.testItemCriteria(activity, [submissionState]))) {
continue;
}
} else if(commentState !== undefined) {
if(!(await this.resources.testItemCriteria(activity, [commentState]))) {
continue;
}
}
let inSubreddits = false;
for(const ss of subStates) {
const res = await this.resources.testSubredditCriteria(activity, ss);
if(res) {
inSubreddits = true;
break;
}
}
if(inSubreddits) {
currCount++;
combinedKarma += activity.score;
const pSub = getActivitySubredditName(activity);
if(!presentSubs.includes(pSub)) {
presentSubs.push(pSub);
}
}
}

const {operator, value, isPercent} = parseGenericValueOrPercentComparison(threshold);
let sum = {subsWithActivity: presentSubs, subreddits, count: currCount, threshold, triggered: false, testValue: currCount.toString()};
let sum = {subsWithActivity: presentSubs, combinedKarma, karmaThreshold, subreddits: subStates.map(x => x.stateDescription), count: currCount, threshold, triggered: false, testValue: currCount.toString()};
if (isPercent) {
sum.testValue = `${formatNumber((currCount / viableActivity.length) * 100)}%`;
if (comparisonTextOp(currCount / viableActivity.length, operator, value / 100)) {
@@ -117,6 +148,15 @@ export class RecentActivityRule extends Rule {
sum.triggered = true;
totalTriggeredOn = sum;
}
// if we would trigger on threshold need to also test for karma
if(totalTriggeredOn !== undefined && karmaThreshold !== undefined) {
const {operator: opKarma, value: valueKarma} = parseGenericValueOrPercentComparison(karmaThreshold);
if(!comparisonTextOp(combinedKarma, opKarma, valueKarma)) {
sum.triggered = false;
totalTriggeredOn = undefined;
}
}

summaries.push(sum);
// if either trigger condition is hit end the iteration early
if (totalTriggeredOn !== undefined) {
@@ -150,10 +190,15 @@ export class RecentActivityRule extends Rule {
subreddits = [],
subsWithActivity = [],
threshold,
triggered
triggered,
combinedKarma,
karmaThreshold,
} = summary;
const relevantSubs = subsWithActivity.length === 0 ? subreddits : subsWithActivity;
const totalSummary = `${testValue} activities over ${relevantSubs.length} subreddits ${triggered ? 'met' : 'did not meet'} threshold of ${threshold}`;
let totalSummary = `${testValue} activities over ${relevantSubs.length} subreddits${karmaThreshold !== undefined ? ` with ${combinedKarma} combined karma` : ''} ${triggered ? 'met' : 'did not meet'} threshold of ${threshold}${karmaThreshold !== undefined ? ` and ${karmaThreshold} combined karma` : ''}`;
if(triggered && subsWithActivity.length > 0) {
totalSummary = `${totalSummary} -- subreddits: ${subsWithActivity.join(', ')}`;
}
return {
result: totalSummary,
data: {
@@ -163,7 +208,8 @@ export class RecentActivityRule extends Rule {
subCount: relevantSubs.length,
totalCount: count,
threshold,
testValue
testValue,
karmaThreshold,
}
};
}
@@ -175,7 +221,16 @@ export class RecentActivityRule extends Rule {
* @minProperties 1
* @additionalProperties false
* */
export interface SubThreshold extends SubredditCriteria {
export interface ActivityThreshold {
/**
* When present, a Submission will only be counted if it meets this criteria
* */
submissionState?: SubmissionState
/**
* When present, a Comment will only be counted if it meets this criteria
* */
commentState?: CommentState

/**
* A string containing a comparison operator and a value to compare recent activities against
*
@@ -191,6 +246,35 @@ export interface SubThreshold extends SubredditCriteria {
* @examples [">= 1"]
* */
threshold?: string

/**
* Test the **combined karma** from Activities found in the specified subreddits
*
* Value is a string containing a comparison operator and a number of **combined karma** to compare against
*
* If specified then both `threshold` and `karma` must be met for this `SubThreshold` to be satisfied
*
* The syntax is `(< OR > OR <= OR >=) <number>`
*
* * EX `> 50` => greater than 50 combined karma for all found Activities in specified subreddits
*
* @pattern ^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$
* */
karma?: string

/**
* Activities will be counted if they are found in this list of Subreddits
*
* Each value in the list can be either:
*
* * string (name of subreddit)
* * regular expression to run on the subreddit name
* * `SubredditState`
*
* EX `["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]`
* @examples [["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]]
* */
subreddits?: (string | SubredditState)[]
}
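As the `karma` doc comment above states, when both `threshold` and `karma` are present, both comparisons must pass for the threshold set to be satisfied. A compact illustration of that two-stage check (simplified inline types and a hypothetical `thresholdMet` helper, not the rule's actual method):

```typescript
interface ActivityLike {
    score: number; // karma of a single submission or comment
}

function thresholdMet(
    matched: ActivityLike[],
    threshold: { operator: string; value: number },
    karma?: { operator: string; value: number },
): boolean {
    const compare = (val: number, op: string, target: number): boolean => {
        switch (op) {
            case '>': return val > target;
            case '>=': return val >= target;
            case '<': return val < target;
            case '<=': return val <= target;
            default: return false;
        }
    };
    // first gate: activity count must meet the threshold comparison
    if (!compare(matched.length, threshold.operator, threshold.value)) {
        return false;
    }
    // second gate (optional): combined karma must also pass
    if (karma !== undefined) {
        const combined = matched.reduce((sum, a) => sum + a.score, 0);
        return compare(combined, karma.operator, karma.value);
    }
    return true;
}
```

This mirrors the rule logic above, where `sum.triggered` is reset to `false` when the count comparison passed but the `combinedKarma` comparison did not.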

interface RecentActivityConfig extends ActivityWindow, ReferenceSubmission {
@@ -203,7 +287,7 @@ interface RecentActivityConfig extends ActivityWindow, ReferenceSubmission {
* A list of subreddits/count criteria that may trigger this rule. ANY SubThreshold will trigger this rule.
* @minItems 1
* */
thresholds: SubThreshold[],
thresholds: ActivityThreshold[],
}

export interface RecentActivityRuleOptions extends RecentActivityConfig, RuleOptions {

@@ -2,14 +2,16 @@ import {Rule, RuleJSONConfig, RuleOptions, RuleResult} from "./index";
import {Comment} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {
comparisonTextOp, FAIL, isExternalUrlSubmission, parseGenericValueComparison,
parseGenericValueOrPercentComparison, parseRegex,
PASS
asSubmission,
comparisonTextOp, FAIL, isExternalUrlSubmission, isSubmission, parseGenericValueComparison,
parseGenericValueOrPercentComparison, parseRegex, parseStringToRegex,
PASS, triggeredIndicator
} from "../util";
import {
ActivityWindowType, JoinOperands,
} from "../Common/interfaces";
import dayjs from 'dayjs';
import SimpleError from "../Utils/SimpleError";

export interface RegexCriteria {
/**
@@ -21,17 +23,11 @@ export interface RegexCriteria {
/**
* A valid Regular Expression to test content against
*
* Do not wrap expression in forward slashes
* If no flags are specified then the **global** flag is used by default
*
* EX For the expression `/reddit|FoxxMD/` use the value should be `reddit|FoxxMD`
*
* @examples ["reddit|FoxxMD"]
* @examples ["/reddit|FoxxMD/ig"]
* */
regex: string,
/**
* Regex flags to use
* */
regexFlags?: string,

/**
* Which content from an Activity to test the regex against
@@ -134,12 +130,11 @@ export class RegexRule extends Rule {

let criteriaResults = [];

for (const criteria of this.criteria) {
for (const [index, criteria] of this.criteria.entries()) {

const {
name,
name = (index + 1),
regex,
regexFlags,
testOn: testOnVals = ['title', 'body'],
lookAt = 'all',
matchThreshold = '> 0',
@@ -157,7 +152,10 @@ export class RegexRule extends Rule {
}, []);

// check regex
const reg = new RegExp(regex);
const reg = parseStringToRegex(regex, 'g');
if(reg === undefined) {
throw new SimpleError(`Value given for regex on Criteria ${name} was not valid: ${regex}`);
}
// ok cool its a valid regex

const matchComparison = parseGenericValueComparison(matchThreshold);
@@ -176,7 +174,7 @@ export class RegexRule extends Rule {

// first lets see if the activity we are checking satisfies thresholds
// since we may be able to avoid api calls to get history
let actMatches = this.getMatchesFromActivity(item, testOn, reg, regexFlags);
let actMatches = this.getMatchesFromActivity(item, testOn, reg);
matches = matches.concat(actMatches).slice(0, 100);
matchCount += actMatches.length;

@@ -226,7 +224,7 @@ export class RegexRule extends Rule {

for (const h of history) {
activitiesTested++;
const aMatches = this.getMatchesFromActivity(h, testOn, reg, regexFlags);
const aMatches = this.getMatchesFromActivity(h, testOn, reg);
matches = matches.concat(aMatches).slice(0, 100);
matchCount += aMatches.length;
const matched = comparisonTextOp(aMatches.length, matchComparison.operator, matchComparison.value);
@@ -300,30 +298,35 @@ export class RegexRule extends Rule {
let index = 0;
for (const c of criteriaResults) {
index++;
let msg = `Crit ${c.criteria.name || index} ${c.triggered ? PASS : FAIL}`;
let msg = `Criteria ${c.criteria.name || `#${index}`} ${triggeredIndicator(c.triggered)}`;
if (c.activityThresholdMet !== undefined) {
msg = `${msg} -- Activity Match=> ${c.activityThresholdMet ? PASS : FAIL} ${c.activitiesMatchedCount} ${c.criteria.activityMatchThreshold} (Threshold ${c.criteria.matchThreshold})`;
msg = `${msg} -- Activity Match ${triggeredIndicator(c.activityThresholdMet)} => ${c.activitiesMatchedCount} ${c.criteria.activityMatchThreshold} (Threshold ${c.criteria.matchThreshold})`;
}
if (c.totalThresholdMet !== undefined) {
msg = `${msg} -- Total Matches=> ${c.totalThresholdMet ? PASS : FAIL} ${c.matchCount} ${c.criteria.totalMatchThreshold}`;
msg = `${msg} -- Total Matches ${triggeredIndicator(c.totalThresholdMet)} => ${c.matchCount} ${c.criteria.totalMatchThreshold}`;
} else {
msg = `${msg} and ${c.matchCount} Total Matches`;
}
msg = `${msg} (Window: ${c.criteria.window})`;
logSummary.push(msg);
if(c.matches.length > 0) {
let matchSample = `-- Matched Values: ${c.matches.slice(0, 3).map(x => `"${x}"`).join(', ')}${c.matches.length > 3 ? `, and ${c.matches.length - 3} more...` : ''}`;
logSummary.push(`${msg} ${matchSample}`);
} else {
logSummary.push(msg);
}
}

const result = `${criteriaMet ? PASS : FAIL} ${logSummary.join(' || ')}`;
const result = `${triggeredIndicator(criteriaMet)} ${logSummary.join(' || ')}`;
this.logger.verbose(result);

return Promise.resolve([criteriaMet, this.getResult(criteriaMet, {result, data: criteriaResults})]);
}

protected getMatchesFromActivity(a: (Submission | Comment), testOn: string[], reg: RegExp, flags?: string): string[] {
protected getMatchesFromActivity(a: (Submission | Comment), testOn: string[], reg: RegExp): string[] {
let m: string[] = [];
// determine what content we are testing
let contents: string[] = [];
if (a instanceof Submission) {
if (asSubmission(a)) {
for (const l of testOn) {
switch (l) {
case 'title':
@@ -346,7 +349,7 @@ export class RegexRule extends Rule {
}

for (const c of contents) {
const results = parseRegex(reg, c, flags);
const results = parseRegex(reg, c);
if (results.matched) {
m = m.concat(results.matches);
}

@@ -1,12 +1,18 @@
import {Rule, RuleJSONConfig, RuleOptions, RuleResult} from "./index";
import {Comment} from "snoowrap";
import {
activityWindowText,
comparisonTextOp, FAIL, isExternalUrlSubmission, isRedditMedia,
activityWindowText, asSubmission,
comparisonTextOp, FAIL, getActivitySubredditName, isExternalUrlSubmission, isRedditMedia,
parseGenericValueComparison, parseSubredditName,
parseUsableLinkIdentifier as linkParser, PASS
parseUsableLinkIdentifier as linkParser, PASS, toStrongSubredditState
} from "../util";
import {ActivityWindow, ActivityWindowType, ReferenceSubmission} from "../Common/interfaces";
import {
ActivityWindow,
ActivityWindowType,
ReferenceSubmission,
StrongSubredditState,
SubredditState
} from "../Common/interfaces";
import Submission from "snoowrap/dist/objects/Submission";
import dayjs from "dayjs";
import Fuse from 'fuse.js'
@@ -25,7 +31,7 @@ interface RepeatActivityReducer {

const getActivityIdentifier = (activity: (Submission | Comment), length = 200) => {
let identifier: string;
if (activity instanceof Submission) {
if (asSubmission(activity)) {
if (activity.is_self) {
identifier = `${activity.title}${activity.selftext.slice(0, length)}`;
} else if(isRedditMedia(activity)) {
@@ -50,8 +56,9 @@ export class RepeatActivityRule extends Rule {
gapAllowance?: number;
useSubmissionAsReference: boolean;
lookAt: 'submissions' | 'all';
include: string[];
exclude: string[];
include: (string | SubredditState)[];
exclude: (string | SubredditState)[];
activityFilterFunc: (x: Submission|Comment) => Promise<boolean> = async (x) => true;
keepRemoved: boolean;
minWordCount: number;

@@ -74,8 +81,40 @@ export class RepeatActivityRule extends Rule {
this.window = window;
this.gapAllowance = gapAllowance;
this.useSubmissionAsReference = useSubmissionAsReference;
this.include = include.map(x => parseSubredditName(x).toLowerCase());
this.exclude = exclude.map(x => parseSubredditName(x).toLowerCase());
this.include = include;
this.exclude = exclude;

if(this.include.length > 0) {
const subStates = include.map((x) => {
if(typeof x === 'string') {
return toStrongSubredditState({name: x, stateDescription: x}, {defaultFlags: 'i', generateDescription: true});
}
return toStrongSubredditState(x, {defaultFlags: 'i', generateDescription: true});
});
this.activityFilterFunc = async (x: Submission|Comment) => {
for(const ss of subStates) {
if(await this.resources.testSubredditCriteria(x, ss)) {
return true;
}
}
return false;
};
} else if(this.exclude.length > 0) {
const subStates = exclude.map((x) => {
if(typeof x === 'string') {
return toStrongSubredditState({name: x, stateDescription: x}, {defaultFlags: 'i', generateDescription: true});
}
return toStrongSubredditState(x, {defaultFlags: 'i', generateDescription: true});
});
this.activityFilterFunc = async (x: Submission|Comment) => {
for(const ss of subStates) {
if(await this.resources.testSubredditCriteria(x, ss)) {
return false;
}
}
return true;
};
}
this.lookAt = lookAt;
}

@@ -96,17 +135,10 @@ export class RepeatActivityRule extends Rule {

async process(item: Submission|Comment): Promise<[boolean, RuleResult]> {
let referenceUrl;
if(item instanceof Submission && this.useSubmissionAsReference) {
if(asSubmission(item) && this.useSubmissionAsReference) {
referenceUrl = await item.url;
}

let filterFunc = (x: any) => true;
if(this.include.length > 0) {
filterFunc = (x: Submission|Comment) => this.include.includes(x.subreddit.display_name.toLowerCase());
} else if(this.exclude.length > 0) {
filterFunc = (x: Submission|Comment) => !this.exclude.includes(x.subreddit.display_name.toLowerCase());
}

let activities: (Submission | Comment)[] = [];
switch (this.lookAt) {
case 'submissions':
@@ -117,13 +149,14 @@ export class RepeatActivityRule extends Rule {
break;
}

const condensedActivities = activities.reduce((acc: RepeatActivityReducer, activity: (Submission | Comment), index: number) => {
const condensedActivities = await activities.reduce(async (accProm: Promise<RepeatActivityReducer>, activity: (Submission | Comment), index: number) => {
const acc = await accProm;
const {openSets = [], allSets = []} = acc;

let identifier = getActivityIdentifier(activity);
const isUrl = isExternalUrlSubmission(activity);
let fu = new Fuse([identifier], !isUrl ? fuzzyOptions : {...fuzzyOptions, distance: 5});
const validSub = filterFunc(activity);
const validSub = await this.activityFilterFunc(activity);
let minMet = identifier.length >= this.minWordCount;

let updatedAllSets = [...allSets];
@@ -174,7 +207,7 @@ export class RepeatActivityRule extends Rule {

return {openSets: updatedOpenSets, allSets: updatedAllSets};

}, {openSets: [], allSets: []});
}, Promise.resolve({openSets: [], allSets: []}));

const allRepeatSets = [...condensedActivities.allSets, ...condensedActivities.openSets];

@@ -223,7 +256,7 @@ export class RepeatActivityRule extends Rule {
|
||||
};
|
||||
for (let set of value) {
|
||||
const test = comparisonTextOp(set.length, operator, thresholdValue);
|
||||
const md = set.map((x: (Comment | Submission)) => `[${x instanceof Submission ? x.title : getActivityIdentifier(x, 50)}](https://reddit.com${x.permalink}) in ${x.subreddit_name_prefixed} on ${dayjs(x.created_utc * 1000).utc().format()}`);
|
||||
const md = set.map((x: (Comment | Submission)) => `[${asSubmission(x) ? x.title : getActivityIdentifier(x, 50)}](https://reddit.com${x.permalink}) in ${x.subreddit_name_prefixed} on ${dayjs(x.created_utc * 1000).utc().format()}`);
|
||||
|
||||
summaryData.sets.push(set);
|
||||
summaryData.largestTrigger = Math.max(summaryData.largestTrigger, set.length);
|
||||
@@ -294,21 +327,31 @@ interface RepeatActivityConfig extends ActivityWindow, ReferenceSubmission {
|
||||
* */
|
||||
gapAllowance?: number,
|
||||
/**
|
||||
* Only include Submissions from this list of Subreddits (by name, case-insensitive)
|
||||
* If present, activities will be counted only if they are found in this list of Subreddits
|
||||
*
|
||||
* EX `["mealtimevideos","askscience"]`
|
||||
* @examples ["mealtimevideos","askscience"]
|
||||
* @minItems 1
|
||||
* Each value in the list can be either:
|
||||
*
|
||||
* * string (name of subreddit)
|
||||
* * regular expression to run on the subreddit name
|
||||
* * `SubredditState`
|
||||
*
|
||||
* EX `["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]`
|
||||
* @examples [["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]]
|
||||
* */
|
||||
include?: string[],
|
||||
include?: (string | SubredditState)[],
|
||||
/**
|
||||
* Do not include Submissions from this list of Subreddits (by name, case-insensitive)
|
||||
* If present, activities will be counted only if they are **NOT** found in this list of Subreddits
|
||||
*
|
||||
* EX `["mealtimevideos","askscience"]`
|
||||
* @examples ["mealtimevideos","askscience"]
|
||||
* @minItems 1
|
||||
* Each value in the list can be either:
|
||||
*
|
||||
* * string (name of subreddit)
|
||||
* * regular expression to run on the subreddit name
|
||||
* * `SubredditState`
*
* EX `["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]`
* @examples [["mealtimevideos","askscience", "/onlyfans*\/i", {"over18": true}]]
* */
exclude?: string[],
exclude?: (string | SubredditState)[],

/**
* If present determines which activities to consider for gapAllowance.

@@ -6,29 +6,31 @@ import {AttributionJSONConfig, AttributionRule} from "./AttributionRule";
import {Logger} from "winston";
import HistoryRule, {HistoryJSONConfig} from "./HistoryRule";
import RegexRule, {RegexRuleJSONConfig} from "./RegexRule";
import {SubredditResources} from "../Subreddit/SubredditResources";
import Snoowrap from "snoowrap";

export function ruleFactory
(config: RuleJSONConfig, logger: Logger, subredditName: string): Rule {
(config: RuleJSONConfig, logger: Logger, subredditName: string, resources: SubredditResources, client: Snoowrap): Rule {
let cfg;
switch (config.kind) {
case 'recentActivity':
cfg = config as RecentActivityRuleJSONConfig;
return new RecentActivityRule({...cfg, logger, subredditName});
return new RecentActivityRule({...cfg, logger, subredditName, resources, client});
case 'repeatActivity':
cfg = config as RepeatActivityJSONConfig;
return new RepeatActivityRule({...cfg, logger, subredditName});
return new RepeatActivityRule({...cfg, logger, subredditName, resources, client});
case 'author':
cfg = config as AuthorRuleJSONConfig;
return new AuthorRule({...cfg, logger, subredditName});
return new AuthorRule({...cfg, logger, subredditName, resources, client});
case 'attribution':
cfg = config as AttributionJSONConfig;
return new AttributionRule({...cfg, logger, subredditName});
return new AttributionRule({...cfg, logger, subredditName, resources, client});
case 'history':
cfg = config as HistoryJSONConfig;
return new HistoryRule({...cfg, logger, subredditName});
return new HistoryRule({...cfg, logger, subredditName, resources, client});
case 'regex':
cfg = config as RegexRuleJSONConfig;
return new RegexRule({...cfg, logger, subredditName});
return new RegexRule({...cfg, logger, subredditName, resources, client});
default:
throw new Error('rule "kind" was not recognized.');
}
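The hunk above threads shared dependencies (the subreddit's resources and the Snoowrap client) through the factory instead of having each rule resolve them itself. A minimal, self-contained sketch of that spread-and-append pattern; the interfaces and class names here are illustrative stand-ins, not the project's actual types:

```typescript
// Sketch of the dependency-threading pattern from the diff: shared
// resources are passed into the factory and spread into every rule.
// All names below are hypothetical, not the project's real classes.
interface Resources { cacheTTL: number }
interface RuleConfig { kind: 'recentActivity' | 'regex'; name: string }

class FakeRule {
    name: string;
    resources: Resources;
    constructor(opts: RuleConfig & { resources: Resources }) {
        this.name = opts.name;
        this.resources = opts.resources;
    }
}

function fakeRuleFactory(config: RuleConfig, resources: Resources): FakeRule {
    switch (config.kind) {
        case 'recentActivity':
        case 'regex':
            // spread the parsed config and append shared dependencies,
            // mirroring `{...cfg, logger, subredditName, resources, client}`
            return new FakeRule({ ...config, resources });
        default:
            throw new Error('rule "kind" was not recognized.');
    }
}

const rule = fakeRuleFactory({ kind: 'regex', name: 'myRule' }, { cacheTTL: 60 });
```

Injecting the dependencies this way (rather than the `ResourceManager.get(...)` lookup removed later in this diff) keeps the rules testable and avoids a hidden singleton.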

@@ -1,5 +1,5 @@
import {IRule, Triggerable, Rule, RuleJSONConfig, RuleResult, RuleSetResult} from "./index";
import {Comment, Submission} from "snoowrap";
import Snoowrap, {Comment, Submission} from "snoowrap";
import {ruleFactory} from "./RuleFactory";
import {createAjvFactory, mergeArr} from "../util";
import {Logger} from "winston";
@@ -7,6 +7,7 @@ import {JoinCondition, JoinOperands} from "../Common/interfaces";
import * as RuleSchema from '../Schema/Rule.json';
import Ajv from 'ajv';
import {RuleJson, RuleObjectJson} from "../Common/types";
import {SubredditResources} from "../Subreddit/SubredditResources";

export class RuleSet implements IRuleSet {
rules: Rule[] = [];
@@ -24,7 +25,7 @@ export class RuleSet implements IRuleSet {
} else {
const valid = ajv.validate(RuleSchema, r);
if (valid) {
this.rules.push(ruleFactory(r as RuleJSONConfig, logger, options.subredditName));
this.rules.push(ruleFactory(r as RuleJSONConfig, logger, options.subredditName, options.resources, options.client));
} else {
this.logger.warn('Could not build rule because of JSON errors', {}, {errors: ajv.errors, obj: r});
}
@@ -85,6 +86,8 @@ export interface RuleSetOptions extends IRuleSet {
rules: Array<IRule | RuleJSONConfig>,
logger: Logger
subredditName: string
resources: SubredditResources
client: Snoowrap
}

/**

@@ -1,8 +1,8 @@
import {Comment} from "snoowrap";
import Snoowrap, {Comment} from "snoowrap";
import Submission from "snoowrap/dist/objects/Submission";
import {Logger} from "winston";
import {findResultByPremise, mergeArr} from "../util";
import ResourceManager, {SubredditResources} from "../Subreddit/SubredditResources";
import {SubredditResources} from "../Subreddit/SubredditResources";
import {ChecksActivityState, TypedActivityStates} from "../Common/interfaces";
import Author, {AuthorOptions} from "../Author/Author";

@@ -12,6 +12,8 @@ export interface RuleOptions {
itemIs?: TypedActivityStates;
logger: Logger
subredditName: string;
resources: SubredditResources
client: Snoowrap
}

export interface RulePremise {
@@ -30,6 +32,11 @@ export interface RuleResult extends ResultContext {
triggered: (boolean | null)
}

export type FormattedRuleResult = RuleResult & {
triggered: string
result: string
}

export interface RuleSetResult {
results: RuleResult[],
condition: 'OR' | 'AND',
@@ -50,6 +57,7 @@ export abstract class Rule implements IRule, Triggerable {
authorIs: AuthorOptions;
itemIs: TypedActivityStates;
resources: SubredditResources;
client: Snoowrap;

constructor(options: RuleOptions) {
const {
@@ -61,9 +69,12 @@ export abstract class Rule implements IRule, Triggerable {
} = {},
itemIs = [],
subredditName,
resources,
client,
} = options;
this.name = name;
this.resources = ResourceManager.get(subredditName) as SubredditResources;
this.resources = resources;
this.client = client;

this.authorIs = {
exclude: exclude.map(x => new Author(x)),

@@ -21,7 +21,7 @@
"properties": {
"age": {
"description": "Test the age of the Author's account (when it was created) against this comparison\n\nThe syntax is `(< OR > OR <= OR >=) <number> <unit>`\n\n* EX `> 100 days` => Passes if Author's account is older than 100 days\n* EX `<= 2 months` => Passes if Author's account is younger than or equal to 2 months\n\nUnit must be one of [DayJS Duration units](https://day.js.org/docs/en/durations/creating)\n\n[See] https://regexr.com/609n8 for example",
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<time>\\d+)\\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
"type": "string"
},
"commentKarma": {
@@ -69,6 +69,10 @@
},
"type": "array"
},
"shadowBanned": {
"description": "Is the author shadowbanned?\n\nThis is determined by trying to retrieve the author's profile. If a 404 is returned it is likely they are shadowbanned",
"type": "boolean"
},
"totalKarma": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
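The comparison `pattern` used throughout these schema hunks can be exercised directly. A quick sketch of what its capture groups yield, using the pattern exactly as written in the schema:

```typescript
// The schema's comparison pattern: operator, number, optional percent
// sign, and a trailing free-form group.
const comparisonPattern = /^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$/;

// Absolute comparison, e.g. "greater than 100"
const abs = '> 100'.match(comparisonPattern);
// abs[1] => '>', abs[2] => '100', abs[3] => '' (no percent sign)

// Percentage comparison, e.g. "at most 75% of all Activities"
const pct = '<= 75%'.match(comparisonPattern);
// pct[1] => '<=', pct[2] => '75', pct[3] => '%'
```

Note the alternation order `>|>=|<|<=` still matches `>=` correctly because the regex engine backtracks when the anchored remainder fails after matching the shorter `>`.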
@@ -154,6 +158,16 @@
"removed": {
"type": "boolean"
},
"reports": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"score": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"spam": {
"type": "boolean"
},
@@ -213,6 +227,16 @@
"removed": {
"type": "boolean"
},
"reports": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"score": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"spam": {
"type": "boolean"
},

@@ -1,6 +1,72 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"definitions": {
"ActivityThreshold": {
"additionalProperties": false,
"description": "At least one count property must be present. If both are present then either can trigger the rule",
"minProperties": 1,
"properties": {
"commentState": {
"$ref": "#/definitions/CommentState",
"description": "When present, a Comment will only be counted if it meets this criteria",
"examples": [
{
"op": true,
"removed": false
}
]
},
"karma": {
"description": "Test the **combined karma** from Activities found in the specified subreddits\n\nValue is a string containing a comparison operator and a number of **combined karma** to compare against\n\nIf specified then both `threshold` and `karma` must be met for this `SubThreshold` to be satisfied\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 50` => greater than 50 combined karma for all found Activities in specified subreddits",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"submissionState": {
"$ref": "#/definitions/SubmissionState",
"description": "When present, a Submission will only be counted if it meets this criteria",
"examples": [
{
"over_18": true,
"removed": false
}
]
},
"subreddits": {
"description": "Activities will be counted if they are found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
"examples": [
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"type": "array"
},
"threshold": {
"default": ">= 1",
"description": "A string containing a comparison operator and a value to compare recent activities against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 3` => greater than 3 activities found in the listed subreddits\n* EX `<= 75%` => number of Activities in the subreddits listed are equal to or less than 75% of all Activities\n\n**Note:** If you use percentage comparison here as well as `useSubmissionAsReference` then \"all Activities\" is only pertains to Activities that had the Link of the Submission, rather than all Activities from this window.",
"examples": [
">= 1"
],
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
}
},
"type": "object"
},
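The new `subreddits` items accept either a plain name, a `/pattern/flags` string, or a `SubredditState` object (the `anyOf` above). A sketch of how one entry might be normalized and matched; the function and interface names are illustrative, not the project's actual implementation:

```typescript
// Hypothetical matcher for one `subreddits` entry from the schema above:
// strings are subreddit names or slash-delimited regexes, objects are
// SubredditState criteria.
interface SubredditState { over18?: boolean }
type SubredditEntry = string | SubredditState;

function matchesEntry(entry: SubredditEntry, name: string, over18: boolean): boolean {
    if (typeof entry === 'string') {
        const m = entry.match(/^\/(.*)\/([a-z]*)$/);
        if (m !== null) {
            // regex form, e.g. "/onlyfans*/i"
            return new RegExp(m[1], m[2]).test(name);
        }
        // plain name, compared case-insensitively
        return entry.toLowerCase() === name.toLowerCase();
    }
    // SubredditState form: every present criterion must match
    return entry.over18 === undefined || entry.over18 === over18;
}

matchesEntry('askscience', 'AskScience', false); // true (case-insensitive name)
matchesEntry('/onlyfans*/i', 'onlyfansss', true); // true (regex)
matchesEntry({ over18: true }, 'gonewild', true); // true (state criteria)
```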
"ActivityWindowCriteria": {
"additionalProperties": false,
"description": "Multiple properties that may be used to define what range of Activity to retrieve.\n\nMay specify one, or both properties along with the `satisfyOn` property, to affect the retrieval behavior.",
@@ -167,7 +233,7 @@
"properties": {
"aggregateOn": {
"default": "undefined",
"description": "If `domains` is not specified this list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`\n\n* If `media` is included then aggregate author's submission history which reddit recognizes as media (youtube, vimeo, etc.)\n* If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or reddit image/video (i.redd.it / v.redd.it)\n* If `link` is included then aggregate author's submission history which is external links but not media\n\nIf nothing is specified or list is empty (default) all domains are aggregated",
"description": "This list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`\n\n* If `media` is included then aggregate author's submission history which reddit recognizes as media (youtube, vimeo, etc.)\n* If `redditMedia` is included then aggregate on author's submissions history which are media hosted on reddit: galleries, videos, and images (i.redd.it / v.redd.it)\n* If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or domain is `reddit.com`\n* If `link` is included then aggregate author's submission history which is external links and not recognized as `media` by reddit\n\nIf nothing is specified or list is empty (default) all domains are aggregated",
"examples": [
[
]
@@ -176,6 +242,7 @@
"enum": [
"link",
"media",
"redditMedia",
"self"
],
"type": "string"
@@ -195,7 +262,7 @@
[
]
],
"description": "A list of domains whose Activities will be tested against `threshold`.\n\nIf this is present then `aggregateOn` is ignored.\n\nThe values are tested as partial strings so you do not need to include full URLs, just the part that matters.\n\nEX `[\"youtube\"]` will match submissions with the domain `https://youtube.com/c/aChannel`\nEX `[\"youtube.com/c/bChannel\"]` will NOT match submissions with the domain `https://youtube.com/c/aChannel`\n\nIf you wish to aggregate on self-posts for a subreddit use the syntax `self.[subreddit]` EX `self.AskReddit`\n\n**If this Rule is part of a Check for a Submission and you wish to aggregate on the domain of the Submission use the special string `AGG:SELF`**\n\nIf nothing is specified or list is empty (default) aggregate using `aggregateOn`",
"description": "A list of domains whose Activities will be tested against `threshold`.\n\nThe values are tested as partial strings so you do not need to include full URLs, just the part that matters.\n\nEX `[\"youtube\"]` will match submissions with the domain `https://youtube.com/c/aChannel`\nEX `[\"youtube.com/c/bChannel\"]` will NOT match submissions with the domain `https://youtube.com/c/aChannel`\n\nIf you wish to aggregate on self-posts for a subreddit use the syntax `self.[subreddit]` EX `self.AskReddit`\n\n**If this Rule is part of a Check for a Submission and you wish to aggregate on the domain of the Submission use the special string `AGG:SELF`**\n\nIf nothing is specified or list is empty (default) aggregate using `aggregateOn`",
"items": {
"type": "string"
},
@@ -384,7 +451,7 @@
"properties": {
"age": {
"description": "Test the age of the Author's account (when it was created) against this comparison\n\nThe syntax is `(< OR > OR <= OR >=) <number> <unit>`\n\n* EX `> 100 days` => Passes if Author's account is older than 100 days\n* EX `<= 2 months` => Passes if Author's account is younger than or equal to 2 months\n\nUnit must be one of [DayJS Duration units](https://day.js.org/docs/en/durations/creating)\n\n[See] https://regexr.com/609n8 for example",
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<time>\\d+)\\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
"type": "string"
},
"commentKarma": {
@@ -432,6 +499,10 @@
},
"type": "array"
},
"shadowBanned": {
"description": "Is the author shadowbanned?\n\nThis is determined by trying to retrieve the author's profile. If a 404 is returned it is likely they are shadowbanned",
"type": "boolean"
},
"totalKarma": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
@@ -686,7 +757,112 @@
],
"type": "object"
},
"CacheConfig": {
"properties": {
"actionedEventsMax": {
"default": 25,
"description": "The **maximum** number of Events that the cache should store triggered result summaries for\n\nThese summaries are viewable through the Web UI.\n\nThe value specified by a subreddit cannot be larger than the value set by the Operator for the global/bot config (if set)",
"type": "number"
},
"authorTTL": {
"default": 60,
"description": "Amount of time, in seconds, author activity history (Comments/Submission) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache\n\n* ENV => `AUTHOR_TTL`\n* ARG => `--authorTTL <sec>`",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"commentTTL": {
"default": 60,
"description": "Amount of time, in seconds, a comment should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"filterCriteriaTTL": {
"default": 60,
"description": "Amount of time, in seconds, to cache filter criteria results (`authorIs` and `itemIs` results)\n\nThis is especially useful if when polling high-volume comments and your checks rely on author/item filters\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"provider": {
"anyOf": [
{
"$ref": "#/definitions/CacheOptions"
},
{
"enum": [
"memory",
"none",
"redis"
],
"type": "string"
}
],
"description": "The cache provider and, optionally, a custom configuration for that provider\n\nIf not present or `null` provider will be `memory`.\n\nTo specify another `provider` but use its default configuration set this property to a string of one of the available providers: `memory`, `redis`, or `none`"
},
"submissionTTL": {
"default": 60,
"description": "Amount of time, in seconds, a submission should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"subredditTTL": {
"default": 600,
"description": "Amount of time, in seconds, a subreddit (attributes) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
600
],
"type": [
"number",
"boolean"
]
},
"userNotesTTL": {
"default": 300,
"description": "Amount of time, in seconds, [Toolbox User Notes](https://www.reddit.com/r/toolbox/wiki/docs/usernotes) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
300
],
"type": [
"number",
"boolean"
]
},
"wikiTTL": {
"default": 300,
"description": "Amount of time, in seconds, wiki content pages should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
300
],
"type": [
"number",
"boolean"
]
}
},
"type": "object"
},
"CacheOptions": {
"additionalProperties": {
},
"description": "Configure granular settings for a cache provider with this object",
"properties": {
"auth_pass": {
@@ -1067,6 +1243,16 @@
"removed": {
"type": "boolean"
},
"reports": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"score": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"spam": {
"type": "boolean"
},
@@ -1247,23 +1433,28 @@
"type": "object"
},
"HistoryCriteria": {
"description": "If both `submission` and `comment` are defined then criteria will only trigger if BOTH thresholds are met",
"description": "Criteria will only trigger if ALL present thresholds (comment, submission, total) are met",
"properties": {
"comment": {
"description": "A string containing a comparison operator and a value to compare comments against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 comments\n* EX `<= 75%` => comments are equal to or less than 75% of all Activities\n\nIf your string also contains the text `OP` somewhere **after** `<number>[percent sign]`...:\n\n* EX `> 100 OP` => greater than 100 comments as OP\n* EX `<= 25% as OP` => Comments as OP were less then or equal to 25% of **all Comments**",
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) comments against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 comments\n* EX `<= 75%` => comments are equal to or less than 75% of unfiltered Activities\n\nIf your string also contains the text `OP` somewhere **after** `<number>[percent sign]`...:\n\n* EX `> 100 OP` => greater than 100 filtered comments as OP\n* EX `<= 25% as OP` => **Filtered** comments as OP were less then or equal to 25% of **unfiltered Comments**",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"minActivityCount": {
"default": 5,
"description": "The minimum number of activities that must exist from the `window` results for this criteria to run",
"description": "The minimum number of **filtered** activities that must exist from the `window` results for this criteria to run",
"type": "number"
},
"name": {
"type": "string"
},
"submission": {
"description": "A string containing a comparison operator and a value to compare submissions against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 100` => greater than 100 submissions\n* EX `<= 75%` => submissions are equal to or less than 75% of all Activities",
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) submissions against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 100` => greater than 100 filtered submissions\n* EX `<= 75%` => filtered submissions are equal to or less than 75% of unfiltered Activities",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"total": {
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`) activities against\n\n**Note:** This is only useful if using `include` or `exclude` otherwise percent will always be 100% and total === activityTotal\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 filtered activities\n* EX `<= 75%` => filtered activities are equal to or less than 75% of all Activities",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
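The `HistoryCriteria` descriptions above distinguish **filtered** counts (post `include`/`exclude`) from the unfiltered window total, which becomes the denominator when a percentage is used. A sketch of how such a comparison might be evaluated; the helper name is illustrative, not the project's actual code:

```typescript
// Hypothetical evaluator for a comparison string like '<= 75%':
// percentages divide the filtered count by the unfiltered window total,
// absolute values use the filtered count directly.
function passesThreshold(test: string, filtered: number, windowTotal: number): boolean {
    const m = test.match(/^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)/);
    if (m === null) {
        throw new Error(`could not parse comparison: ${test}`);
    }
    const value = m[3] === '%' ? (filtered / windowTotal) * 100 : filtered;
    const target = Number(m[2]);
    switch (m[1]) {
        case '>': return value > target;
        case '>=': return value >= target;
        case '<': return value < target;
        default: return value <= target;
    }
}

passesThreshold('<= 75%', 60, 100); // 60% of the window => true
passesThreshold('> 100', 150, 200); // 150 filtered activities => true
```

This also illustrates the note on `total`: without `include`/`exclude` the filtered count equals the window total, so a percentage there is always 100%.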
@@ -1332,27 +1523,51 @@
"type": "array"
},
"exclude": {
"description": "Do not include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are **NOT** found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`\n\n**Note:** This affects **post-window retrieval** activities. So that:\n\n* `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering\n* all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**\n* -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"include": {
"description": "Only include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are found in this list of Subreddits.\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`\n\n **Note:** This affects **post-window retrieval** activities. So that:\n\n* `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering\n* all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**\n* -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"itemIs": {
@@ -1567,6 +1782,16 @@
"title": {
"description": "The title of the message\n\nIf not specified will be defaulted to `Concerning your [Submission/Comment]`",
"type": "string"
},
"to": {
"description": "Entity to send message to.\n\nIf not present Message be will sent to the Author of the Activity being checked.\n\nValid formats:\n\n* `aUserName` -- send to /u/aUserName\n* `u/aUserName` -- send to /u/aUserName\n* `r/aSubreddit` -- sent to modmail of /r/aSubreddit\n\n**Note:** Reddit does not support sending a message AS a subreddit TO another subreddit",
"examples": [
"aUserName",
"u/aUserName",
"r/aSubreddit"
],
"pattern": "^\\s*(\\/[ru]\\/|[ru]\\/)*(\\w+)*\\s*$",
"type": "string"
}
},
"required": [
@@ -1759,7 +1984,7 @@
"thresholds": {
"description": "A list of subreddits/count criteria that may trigger this rule. ANY SubThreshold will trigger this rule.",
"items": {
"$ref": "#/definitions/SubThreshold"
"$ref": "#/definitions/ActivityThreshold"
},
"minItems": 1,
"type": "array"
@@ -1796,6 +2021,49 @@
],
"type": "object"
},
"RegExp": {
"properties": {
"dotAll": {
"type": "boolean"
},
"flags": {
"type": "string"
},
"global": {
"type": "boolean"
},
"ignoreCase": {
"type": "boolean"
},
"lastIndex": {
"type": "number"
},
"multiline": {
"type": "boolean"
},
"source": {
"type": "string"
},
"sticky": {
"type": "boolean"
},
"unicode": {
"type": "boolean"
}
},
"required": [
"dotAll",
"flags",
"global",
"ignoreCase",
"lastIndex",
"multiline",
"source",
"sticky",
"unicode"
],
"type": "object"
},
"RegexCriteria": {
"properties": {
"activityMatchThreshold": {
@@ -1834,16 +2102,12 @@
"type": "string"
},
"regex": {
"description": "A valid Regular Expression to test content against\n\nDo not wrap expression in forward slashes\n\nEX For the expression `/reddit|FoxxMD/` use the value should be `reddit|FoxxMD`",
"description": "A valid Regular Expression to test content against\n\nIf no flags are specified then the **global** flag is used by default",
"examples": [
"reddit|FoxxMD"
"/reddit|FoxxMD/ig"
],
"type": "string"
},
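This hunk changes `regex` to accept a literal-style value such as `/reddit|FoxxMD/ig`, with embedded flags replacing the separate `regexFlags` property removed just below. A sketch of how such a string could be parsed, consistent with the new description's "global by default" behavior; the project's actual parsing logic may differ:

```typescript
// Hypothetical parser for the new `regex` value format: accepts either
// a slash-delimited literal with flags ("/pattern/flags") or a bare
// pattern, defaulting to the global flag per the updated description.
function parseRegex(val: string): RegExp {
    const m = val.match(/^\/(.*)\/([a-z]*)$/s);
    if (m !== null) {
        // slash-delimited: use the embedded flags, defaulting to global
        return new RegExp(m[1], m[2] === '' ? 'g' : m[2]);
    }
    // bare pattern: global flag by default
    return new RegExp(val, 'g');
}

const r = parseRegex('/reddit|FoxxMD/ig');
// r.source === 'reddit|FoxxMD'; r.flags === 'gi' (the flags getter sorts)
```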
|
||||
"regexFlags": {
|
||||
"description": "Regex flags to use",
|
||||
"type": "string"
|
||||
},
|
||||
"testOn": {
|
||||
"default": [
|
||||
"title",
|
||||
@@ -2082,15 +2346,27 @@
|
||||
]
|
||||
},
|
||||
"exclude": {
|
||||
"description": "Do not include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
|
||||
"description": "If present, activities will be counted only if they are **NOT** found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
|
||||
"examples": [
|
||||
"mealtimevideos",
|
||||
"askscience"
|
||||
[
|
||||
"mealtimevideos",
|
||||
"askscience",
|
||||
"/onlyfans*/i",
|
||||
{
|
||||
"over18": true
|
||||
}
|
||||
]
|
||||
],
|
||||
"items": {
|
||||
"type": "string"
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubredditState"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"minItems": 1,
|
||||
"type": "array"
|
||||
},
|
||||
"gapAllowance": {
|
||||
@@ -2098,15 +2374,27 @@
|
||||
"type": "number"
|
||||
},
|
||||
"include": {
|
||||
"description": "Only include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
|
||||
"description": "If present, activities will be counted only if they are found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
|
||||
"examples": [
|
||||
"mealtimevideos",
|
||||
"askscience"
|
||||
[
|
||||
"mealtimevideos",
|
||||
"askscience",
|
||||
"/onlyfans*/i",
|
||||
{
|
||||
"over18": true
|
||||
}
|
||||
]
|
||||
],
|
||||
"items": {
|
||||
"type": "string"
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubredditState"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"minItems": 1,
|
||||
"type": "array"
|
||||
},
|
||||
"itemIs": {
|
||||
@@ -2333,40 +2621,6 @@
],
"type": "object"
},
"SubThreshold": {
"additionalProperties": false,
"description": "At least one count property must be present. If both are present then either can trigger the rule",
"minProperties": 1,
"properties": {
"subreddits": {
"description": "A list of Subreddits (by name, case-insensitive) to look for.\n\nEX [\"mealtimevideos\",\"askscience\"]",
"examples": [
[
"mealtimevideos",
"askscience"
]
],
"items": {
"type": "string"
},
"minItems": 1,
"type": "array"
},
"threshold": {
"default": ">= 1",
"description": "A string containing a comparison operator and a value to compare recent activities against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 3` => greater than 3 activities found in the listed subreddits\n* EX `<= 75%` => number of Activities in the subreddits listed is equal to or less than 75% of all Activities\n\n**Note:** If you use percentage comparison here as well as `useSubmissionAsReference` then \"all Activities\" only pertains to Activities that had the Link of the Submission, rather than all Activities from this window.",
"examples": [
">= 1"
],
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
}
},
"required": [
"subreddits"
],
"type": "object"
},
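To illustrate the `SubThreshold` shape defined above, a hypothetical fragment combining both count properties (the subreddit names come from the schema's own examples; the threshold string matches the `pattern` shown):

```json
{
  "subreddits": ["mealtimevideos", "askscience"],
  "threshold": "> 3"
}
```

Per the `description`, when both properties are present either one can trigger the rule.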
"SubmissionCheckJson": {
|
||||
"properties": {
|
||||
"actions": {
|
||||
@@ -2439,6 +2693,10 @@
}
]
},
"cacheUserResult": {
"$ref": "#/definitions/UserResultCacheOptions",
"description": "Cache the result of this check based on the comment author and the submission id\n\nThis is useful in this type of scenario:\n\n1. This check is configured to run on comments for specific submissions with high volume activity\n2. The rules being run are not dependent on the content of the comment\n3. The rule results are not likely to change while cache is valid"
},
"condition": {
"default": "AND",
"description": "Under what condition should a set of run `Rule` objects be considered \"successful\"?\n\nIf `OR` then **any** triggered `Rule` object results in success.\n\nIf `AND` then **all** `Rule` objects must be triggered to result in success.",
@@ -2583,6 +2841,16 @@
"removed": {
"type": "boolean"
},
"reports": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"score": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"spam": {
"type": "boolean"
},
@@ -2599,70 +2867,40 @@
},
"type": "object"
},
"SubredditCacheConfig": {
"SubredditState": {
"description": "Different attributes a `Subreddit` can be in. Only include a property if you want to check it.",
"examples": [
{
"over18": true
}
],
"properties": {
"authorTTL": {
"default": 60,
"description": "Amount of time, in seconds, author activities (Comments/Submission) should be cached",
"examples": [
60
],
"type": "number"
},
"commentTTL": {
"default": 60,
"description": "Amount of time, in seconds, a comment should be cached",
"examples": [
60
],
"type": "number"
},
"filterCriteriaTTL": {
"default": 60,
"description": "Amount of time, in seconds, to cache filter criteria results (`authorIs` and `itemIs` results)\n\nThis is especially useful when polling high-volume comments and your checks rely on author/item filters",
"examples": [
60
],
"type": "number"
},
"provider": {
"name": {
"anyOf": [
{
"$ref": "#/definitions/CacheOptions"
"$ref": "#/definitions/RegExp"
},
{
"enum": [
"memory",
"none",
"redis"
],
"type": "string"
}
],
"description": "The name of the subreddit.\n\nCan be a normal string (will check case-insensitive) or a regular expression\n\nEX `[\"mealtimevideos\", \"/onlyfans*\\/i\"]`",
"examples": [
"mealtimevideos",
"/onlyfans*/i"
]
},
"submissionTTL": {
"default": 60,
"description": "Amount of time, in seconds, a submission should be cached",
"examples": [
60
],
"type": "number"
"over18": {
"description": "Is subreddit NSFW/over 18?\n\n**Note**: This is a **mod-controlled** flag so it is up to the mods of the subreddit to correctly mark their subreddit as NSFW",
"type": "boolean"
},
"userNotesTTL": {
"default": 300,
"description": "Amount of time, in seconds, [Toolbox User Notes](https://www.reddit.com/r/toolbox/wiki/docs/usernotes) should be cached",
"examples": [
300
],
"type": "number"
"quarantine": {
"description": "Is subreddit quarantined?",
"type": "boolean"
},
"wikiTTL": {
"default": 300,
"description": "Amount of time, in seconds, wiki content pages should be cached",
"examples": [
300
],
"type": "number"
"stateDescription": {
"description": "A friendly description of what this State is trying to parse",
"type": "string"
}
},
"type": "object"
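Putting the `SubredditState` properties from this hunk together, a hypothetical criteria object might look like the following (the `name` regex and `over18` value come from the schema's own examples; the `stateDescription` text is purely illustrative):

```json
{
  "name": "/onlyfans*/i",
  "over18": true,
  "stateDescription": "NSFW onlyfans-style subreddits"
}
```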
@@ -2810,6 +3048,12 @@
"description": "Cache the result of this check based on the comment author and the submission id\n\nThis is useful in this type of scenario:\n\n1. This check is configured to run on comments for specific submissions with high volume activity\n2. The rules being run are not dependent on the content of the comment\n3. The rule results are not likely to change while cache is valid",
"properties": {
"enable": {
"default": false,
"type": "boolean"
},
"runActions": {
"default": true,
"description": "In the event the cache returns a triggered result should the actions for the check also be run?",
"type": "boolean"
},
"ttl": {
@@ -2826,7 +3070,7 @@
},
"properties": {
"caching": {
"$ref": "#/definitions/SubredditCacheConfig",
"$ref": "#/definitions/CacheConfig",
"description": "Per-subreddit config for caching TTL values. If set to `false` caching is disabled."
},
"checks": {

@@ -1,7 +1,184 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"definitions": {
"BotConnection": {
"description": "Configuration required to connect to a CM Server",
"properties": {
"host": {
"description": "The hostname and port the CM Server is listening on EX `localhost:8085`",
"type": "string"
},
"secret": {
"description": "The **shared secret** used to sign API calls from the Client to the Server.\n\nThis value should be the same as what is specified in the target CM's `api.secret` configuration",
"type": "string"
}
},
"required": [
"host",
"secret"
],
"type": "object"
},
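For reference, a complete `BotConnection` object satisfying the definition above (the `host` value is taken from the schema's example; the `secret` is a placeholder that must match the target CM's `api.secret`):

```json
{
  "host": "localhost:8085",
  "secret": "aRandomString"
}
```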
"BotInstanceJsonConfig": {
|
||||
"description": "The configuration for an **individual reddit account** ContextMod will run as a bot.\n\nMultiple bot configs may be specified (one per reddit account).\n\n**NOTE:** If `bots` is not specified in a `FILE` then a default `bot` is generated using `ENV/ARG` values IE `CLIENT_ID`, etc...but if `bots` IS specified the default is not generated.",
|
||||
"properties": {
|
||||
"caching": {
|
||||
"$ref": "#/definitions/OperatorCacheConfig",
|
||||
"description": "Settings to configure the default caching behavior for this bot\n\nEvery setting not specified will default to what is specified by the global operator caching config"
|
||||
},
|
||||
"credentials": {
|
||||
"$ref": "#/definitions/RedditCredentials",
|
||||
"description": "Credentials required for the bot to interact with Reddit's API\n\nThese credentials will provided to both the API and Web interface unless otherwise specified with the `web.credentials` property\n\nRefer to the [required credentials table](https://github.com/FoxxMD/context-mod/blob/master/docs/operatorConfiguration.md#minimum-required-configuration) to see what is necessary to run the bot.",
|
||||
"examples": [
|
||||
{
|
||||
"accessToken": "p75_1c467b2",
|
||||
"clientId": "f4b4df1_9oiu",
|
||||
"clientSecret": "34v5q1c564_yt7",
|
||||
"redirectUri": "http://localhost:8085/callback",
|
||||
"refreshToken": "34_f1w1v4"
|
||||
}
|
||||
]
|
||||
},
|
||||
"name": {
|
||||
"type": "string"
|
||||
},
|
||||
"nanny": {
|
||||
"description": "Settings related to managing heavy API usage.",
|
||||
"properties": {
|
||||
"hardLimit": {
|
||||
"default": 50,
|
||||
"description": "When `api limit remaining` reaches this number the application will pause all event polling until the api limit is reset.",
|
||||
"examples": [
|
||||
50
|
||||
],
|
||||
"type": "number"
|
||||
},
|
||||
"softLimit": {
|
||||
"default": 250,
|
||||
"description": "When `api limit remaining` reaches this number the application will attempt to put heavy-usage subreddits in a **slow mode** where activity processed is slowed to one every 1.5 seconds until the api limit is reset.",
|
||||
"examples": [
|
||||
250
|
||||
],
|
||||
"type": "number"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"notifications": {
|
||||
"$ref": "#/definitions/NotificationConfig",
|
||||
"description": "Settings to configure 3rd party notifications for when behavior occurs"
|
||||
},
|
||||
"polling": {
|
||||
"allOf": [
|
||||
{
|
||||
"$ref": "#/definitions/PollingDefaults"
|
||||
},
|
||||
{
|
||||
"properties": {
|
||||
"sharedMod": {
|
||||
"default": false,
|
||||
"description": "If set to `true` all subreddits polling unmoderated/modqueue with default polling settings will share a request to \"r/mod\"\notherwise each subreddit will poll its own mod view\n\n* ENV => `SHARE_MOD`\n* ARG => `--shareMod`",
|
||||
"type": "boolean"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
}
|
||||
],
|
||||
"description": "Settings related to default polling configurations for subreddits"
|
||||
},
|
||||
"queue": {
|
||||
"description": "Settings related to default configurations for queue behavior for subreddits",
|
||||
"properties": {
|
||||
"maxWorkers": {
|
||||
"default": 1,
|
||||
"description": "Set the number of maximum concurrent workers any subreddit can use.\n\nSubreddits may define their own number of max workers in their config but the application will never allow any subreddit's max workers to be larger than the operator\n\nNOTE: Do not increase this unless you are certain you know what you are doing! The default is suitable for the majority of use cases.",
|
||||
"examples": [
|
||||
1
|
||||
],
|
||||
"type": "number"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"snoowrap": {
|
||||
"description": "Settings to control some [Snoowrap](https://github.com/not-an-aardvark/snoowrap) behavior",
|
||||
"properties": {
|
||||
"debug": {
|
||||
"description": "Manually set the debug status for snoowrap\n\nWhen snoowrap has `debug: true` it will log the http status response of reddit api requests to at the `debug` level\n\n* Set to `true` to always output\n* Set to `false` to never output\n\nIf not present or `null` will be set based on `logLevel`\n\n* ENV => `SNOO_DEBUG`\n* ARG => `--snooDebug`",
|
||||
"type": "boolean"
|
||||
},
|
||||
"proxy": {
|
||||
"description": "Proxy all requests to Reddit's API through this endpoint\n\n* ENV => `PROXY`\n* ARG => `--proxy <proxyEndpoint>`",
|
||||
"examples": [
|
||||
"http://localhost:4443"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"subreddits": {
|
||||
"description": "Settings related to bot behavior for subreddits it is managing",
|
||||
"properties": {
|
||||
"dryRun": {
|
||||
"default": false,
|
||||
"description": "If `true` then all subreddits will run in dry run mode, overriding configurations\n\n* ENV => `DRYRUN`\n* ARG => `--dryRun`",
|
||||
"examples": [
|
||||
false
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
"exclude": {
|
||||
"description": "Names of subreddits the bot should NOT run, based on what subreddits it moderates\n\nThis setting is ignored if `names` is specified",
|
||||
"examples": [
|
||||
[
|
||||
"mealtimevideos",
|
||||
"programminghumor"
|
||||
]
|
||||
],
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"heartbeatInterval": {
|
||||
"default": 300,
|
||||
"description": "Interval, in seconds, to perform application heartbeat\n\nOn heartbeat the application does several things:\n\n* Log output with current api rate remaining and other statistics\n* Tries to retrieve and parse configurations for any subreddits with invalid configuration state\n* Restarts any bots stopped/paused due to polling issues, general errors, or invalid configs (if new config is valid)\n\n* ENV => `HEARTBEAT`\n* ARG => `--heartbeat <sec>`",
|
||||
"examples": [
|
||||
300
|
||||
],
|
||||
"type": "number"
|
||||
},
|
||||
"names": {
|
||||
"description": "Names of subreddits for bot to run on\n\nIf not present or `null` bot will run on all subreddits it is a moderator of\n\n* ENV => `SUBREDDITS` (comma-separated)\n* ARG => `--subreddits <list...>`",
|
||||
"examples": [
|
||||
[
|
||||
"mealtimevideos",
|
||||
"programminghumor"
|
||||
]
|
||||
],
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"wikiConfig": {
|
||||
"default": "botconfig/contextbot",
|
||||
"description": "The default relative url to the ContextMod wiki page EX `https://reddit.com/r/subreddit/wiki/<path>`\n\n* ENV => `WIKI_CONFIG`\n* ARG => `--wikiConfig <path>`",
|
||||
"examples": [
|
||||
"botconfig/contextbot"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"CacheOptions": {
|
||||
"additionalProperties": {
|
||||
},
|
||||
"description": "Configure granular settings for a cache provider with this object",
|
||||
"properties": {
|
||||
"auth_pass": {
|
||||
@@ -153,6 +330,114 @@
],
"type": "object"
},
"OperatorCacheConfig": {
"properties": {
"actionedEventsDefault": {
"default": 25,
"description": "The **default** number of Events that the cache will store triggered result summaries for\n\nThese summaries are viewable through the Web UI.\n\nThe value specified cannot be larger than `actionedEventsMax` for the global/bot config (if set)",
"type": "number"
},
"actionedEventsMax": {
"default": 25,
"description": "The **maximum** number of Events that the cache should store triggered result summaries for\n\nThese summaries are viewable through the Web UI.\n\nThe value specified by a subreddit cannot be larger than the value set by the Operator for the global/bot config (if set)",
"type": "number"
},
"authorTTL": {
"default": 60,
"description": "Amount of time, in seconds, author activity history (Comments/Submission) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache\n\n* ENV => `AUTHOR_TTL`\n* ARG => `--authorTTL <sec>`",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"commentTTL": {
"default": 60,
"description": "Amount of time, in seconds, a comment should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"filterCriteriaTTL": {
"default": 60,
"description": "Amount of time, in seconds, to cache filter criteria results (`authorIs` and `itemIs` results)\n\nThis is especially useful when polling high-volume comments and your checks rely on author/item filters\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"provider": {
"anyOf": [
{
"$ref": "#/definitions/CacheOptions"
},
{
"enum": [
"memory",
"none",
"redis"
],
"type": "string"
}
],
"description": "The cache provider and, optionally, a custom configuration for that provider\n\nIf not present or `null` provider will be `memory`.\n\nTo specify another `provider` but use its default configuration set this property to a string of one of the available providers: `memory`, `redis`, or `none`"
},
"submissionTTL": {
"default": 60,
"description": "Amount of time, in seconds, a submission should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
60
],
"type": [
"number",
"boolean"
]
},
"subredditTTL": {
"default": 600,
"description": "Amount of time, in seconds, a subreddit (attributes) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
600
],
"type": [
"number",
"boolean"
]
},
"userNotesTTL": {
"default": 300,
"description": "Amount of time, in seconds, [Toolbox User Notes](https://www.reddit.com/r/toolbox/wiki/docs/usernotes) should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
300
],
"type": [
"number",
"boolean"
]
},
"wikiTTL": {
"default": 300,
"description": "Amount of time, in seconds, wiki content pages should be cached\n\n* If `0` or `true` will cache indefinitely (not recommended)\n* If `false` will not cache",
"examples": [
300
],
"type": [
"number",
"boolean"
]
}
},
"type": "object"
},
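Tying the `OperatorCacheConfig` properties above together, a hypothetical `caching` block exercising the new number-or-boolean TTL types (values are illustrative; per the descriptions, `false` disables caching for that item):

```json
{
  "provider": "redis",
  "authorTTL": 60,
  "commentTTL": false,
  "subredditTTL": 600,
  "wikiTTL": 300
}
```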
"PollingDefaults": {
|
||||
"properties": {
|
||||
"delayUntil": {
|
||||
@@ -177,104 +462,18 @@
}
},
"type": "object"
}
},
"description": "Configuration for application-level settings IE for running the bot instance\n\n* To load a JSON configuration **from the command line** use the `-c` cli argument EX: `node src/index.js -c /path/to/JSON/config.json`\n* To load a JSON configuration **using an environmental variable** use `OPERATOR_CONFIG` EX: `OPERATOR_CONFIG=/path/to/JSON/config.json`",
"properties": {
"api": {
"description": "Settings related to managing heavy API usage.",
"properties": {
"hardLimit": {
"default": 50,
"description": "When `api limit remaining` reaches this number the application will pause all event polling until the api limit is reset.",
"examples": [
50
],
"type": "number"
},
"softLimit": {
"default": 250,
"description": "When `api limit remaining` reaches this number the application will attempt to put heavy-usage subreddits in a **slow mode** where activity processed is slowed to one every 1.5 seconds until the api limit is reset.",
"examples": [
250
],
"type": "number"
}
},
"type": "object"
},
"caching": {
"description": "Settings to configure the default caching behavior for each subreddit",
"properties": {
"authorTTL": {
"default": 60,
"description": "Amount of time, in seconds, author activity history (Comments/Submission) should be cached\n\n* ENV => `AUTHOR_TTL`\n* ARG => `--authorTTL <sec>`",
"examples": [
60
],
"type": "number"
},
"commentTTL": {
"default": 60,
"description": "Amount of time, in seconds, a comment should be cached",
"examples": [
60
],
"type": "number"
},
"filterCriteriaTTL": {
"default": 60,
"description": "Amount of time, in seconds, to cache filter criteria results (`authorIs` and `itemIs` results)\n\nThis is especially useful when polling high-volume comments and your checks rely on author/item filters",
"examples": [
60
],
"type": "number"
},
"provider": {
"anyOf": [
{
"$ref": "#/definitions/CacheOptions"
},
{
"enum": [
"memory",
"none",
"redis"
],
"type": "string"
}
],
"description": "The cache provider and, optionally, a custom configuration for that provider\n\nIf not present or `null` provider will be `memory`.\n\nTo specify another `provider` but use its default configuration set this property to a string of one of the available providers: `memory`, `redis`, or `none`"
},
"submissionTTL": {
"default": 60,
"description": "Amount of time, in seconds, a submission should be cached",
"examples": [
60
],
"type": "number"
},
"userNotesTTL": {
"default": 300,
"description": "Amount of time, in seconds, [Toolbox User Notes](https://www.reddit.com/r/toolbox/wiki/docs/usernotes) should be cached",
"examples": [
300
],
"type": "number"
},
"wikiTTL": {
"default": 300,
"description": "Amount of time, in seconds, wiki content pages should be cached",
"examples": [
300
],
"type": "number"
"RedditCredentials": {
"description": "Credentials required for the bot to interact with Reddit's API\n\nThese credentials will be provided to both the API and Web interface unless otherwise specified with the `web.credentials` property\n\nRefer to the [required credentials table](https://github.com/FoxxMD/context-mod/blob/master/docs/operatorConfiguration.md#minimum-required-configuration) to see what is necessary to run the bot.",
"examples": [
{
"accessToken": "p75_1c467b2",
"clientId": "f4b4df1_9oiu",
"clientSecret": "34v5q1c564_yt7",
"redirectUri": "http://localhost:8085/callback",
"refreshToken": "34_f1w1v4"
}
},
"type": "object"
},
"credentials": {
"description": "The credentials required for the bot to interact with Reddit's API\n\n**Note:** Only `clientId` and `clientSecret` are required for initial setup (to use the oauth helper) **but ALL are required to properly run the bot.**",
],
"properties": {
"accessToken": {
"description": "Access token retrieved from authenticating an account with your Reddit Application\n\n* ENV => `ACCESS_TOKEN`\n* ARG => `--accessToken <token>`",
@@ -297,14 +496,6 @@
],
"type": "string"
},
"redirectUri": {
"description": "Redirect URI for your Reddit application\n\nOnly required if running ContextMod with a web interface (and after using oauth helper)\n\n* ENV => `REDIRECT_URI`\n* ARG => `--redirectUri <uri>`",
"examples": [
"http://localhost:8085"
],
"format": "uri",
"type": "string"
},
"refreshToken": {
"description": "Refresh token retrieved from authenticating an account with your Reddit Application\n\n* ENV => `REFRESH_TOKEN`\n* ARG => `--refreshToken <token>`",
"examples": [
@@ -315,6 +506,75 @@
},
"type": "object"
},
"WebCredentials": {
"description": "Separate credentials for the web interface can be provided when also running the api.\n\nAll properties not specified will default to values given in ENV/ARG credential properties\n\nRefer to the [required credentials table](https://github.com/FoxxMD/context-mod/blob/master/docs/operatorConfiguration.md#minimum-required-configuration) to see what is necessary for the web interface.",
"examples": [
{
"clientId": "f4b4df1_9oiu",
"clientSecret": "34v5q1c564_yt7",
"redirectUri": "http://localhost:8085/callback"
}
],
"properties": {
"clientId": {
"description": "Client ID for your Reddit application",
"examples": [
"f4b4df1_9oiu"
],
"type": "string"
},
"clientSecret": {
"description": "Client Secret for your Reddit application",
"examples": [
"34v5q1c564_yt7"
],
"type": "string"
},
"redirectUri": {
"description": "Redirect URI for your Reddit application\n\nUsed for:\n\n* accessing the web interface for monitoring bots\n* authenticating an account to use for a bot instance\n\n* ENV => `REDIRECT_URI`\n* ARG => `--redirectUri <uri>`",
"examples": [
"http://localhost:8085/callback"
],
"type": "string"
}
},
"type": "object"
}
},
"description": "Configuration for application-level settings IE for running the bot instance\n\n* To load a JSON configuration **from the command line** use the `-c` cli argument EX: `node src/index.js -c /path/to/JSON/config.json`\n* To load a JSON configuration **using an environmental variable** use `OPERATOR_CONFIG` EX: `OPERATOR_CONFIG=/path/to/JSON/config.json`",
"properties": {
"api": {
"description": "Configuration for the **Server** application. See [Architecture Documentation](https://github.com/FoxxMD/context-mod/blob/master/docs/serverClientArchitecture.md) for more info",
"properties": {
"friendly": {
"description": "A friendly name for this server. This will override `friendly` in `BotConnection` if specified.",
"type": "string"
},
"port": {
"default": 8095,
"description": "The port the server listens on for API requests",
"examples": [
8095
],
"type": "number"
},
"secret": {
"description": "The **shared secret** used to verify API requests come from an authenticated client.\n\nUse this same value for the `secret` value in a `BotConnection` object to connect to this Server",
"type": "string"
}
},
"type": "object"
},
"bots": {
"items": {
"$ref": "#/definitions/BotInstanceJsonConfig"
},
"type": "array"
},
"caching": {
"$ref": "#/definitions/OperatorCacheConfig",
"description": "Settings to configure the default caching behavior globally\n\nThese settings will be used by each bot, and subreddit, that does not specify their own"
},
"logging": {
"description": "Settings to configure global logging defaults",
"properties": {
@@ -343,6 +603,16 @@
},
"type": "object"
},
"mode": {
"default": "all",
"description": "Mode to run ContextMod in\n\n* `all` (default) - Run the api and the web interface\n* `client` - Run web interface only\n* `server` - Run the api/bots only",
"enum": [
"all",
"client",
"server"
],
"type": "string"
},
"notifications": {
"$ref": "#/definitions/NotificationConfig",
"description": "Settings to configure 3rd party notifications for when ContextMod behavior occurs"
@@ -350,13 +620,6 @@
"operator": {
"description": "Settings related to the user(s) running this ContextMod instance and information on the bot",
"properties": {
"botName": {
"description": "The name to use when identifying the bot. Defaults to name of the authenticated Reddit account IE `u/yourBotAccount`",
"examples": [
"u/yourBotAccount"
],
"type": "string"
},
"display": {
"description": "A **public** name to display to users of the web interface. Use this to help moderators using your bot identify who the operator is in case they need to contact you.\n\nLeave undefined for no public name to be displayed.\n\n* ENV => `OPERATOR_DISPLAY`\n* ARG => `--operatorDisplay <name>`",
"examples": [
@@ -387,105 +650,63 @@
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"polling": {
|
||||
"allOf": [
|
||||
{
|
||||
"$ref": "#/definitions/PollingDefaults"
|
||||
},
|
||||
{
|
||||
"properties": {
|
||||
"sharedMod": {
|
||||
"default": false,
|
||||
"description": "If set to `true` all subreddits polling unmoderated/modqueue with default polling settings will share a request to \"r/mod\"\notherwise each subreddit will poll its own mod view\n\n* ENV => `SHARE_MOD`\n* ARG => `--shareMod`",
|
||||
"type": "boolean"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
}
|
||||
],
|
||||
"description": "Settings related to default polling configurations for subreddits"
|
||||
},
|
||||
"queue": {
|
||||
"description": "Settings related to default configurations for queue behavior for subreddits",
|
||||
"properties": {
|
||||
"maxWorkers": {
|
||||
"default": 1,
|
||||
"description": "Set the number of maximum concurrent workers any subreddit can use.\n\nSubreddits may define their own number of max workers in their config but the application will never allow any subreddit's max workers to be larger than the operator\n\nNOTE: Do not increase this unless you are certain you know what you are doing! The default is suitable for the majority of use cases.",
|
||||
"examples": [
|
||||
1
|
||||
],
|
||||
"type": "number"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"snoowrap": {
|
||||
"description": "Settings to control some [Snoowrap](https://github.com/not-an-aardvark/snoowrap) behavior",
|
||||
"properties": {
|
||||
"debug": {
|
||||
"description": "Manually set the debug status for snoowrap\n\nWhen snoowrap has `debug: true` it will log the http status response of reddit api requests to at the `debug` level\n\n* Set to `true` to always output\n* Set to `false` to never output\n\nIf not present or `null` will be set based on `logLevel`\n\n* ENV => `SNOO_DEBUG`\n* ARG => `--snooDebug`",
|
||||
"type": "boolean"
|
||||
},
|
||||
"proxy": {
|
||||
"description": "Proxy all requests to Reddit's API through this endpoint\n\n* ENV => `PROXY`\n* ARG => `--proxy <proxyEndpoint>`",
|
||||
"examples": [
|
||||
"http://localhost:4443"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"subreddits": {
|
||||
"description": "Settings related to bot behavior for subreddits it is managing",
|
||||
"properties": {
|
||||
"dryRun": {
|
||||
"default": false,
|
||||
"description": "If `true` then all subreddits will run in dry run mode, overriding configurations\n\n* ENV => `DRYRUN`\n* ARG => `--dryRun`",
|
||||
"examples": [
|
||||
false
|
||||
],
|
||||
"type": "boolean"
|
||||
},
|
||||
"heartbeatInterval": {
|
||||
"default": 300,
|
||||
"description": "Interval, in seconds, to perform application heartbeat\n\nOn heartbeat the application does several things:\n\n* Log output with current api rate remaining and other statistics\n* Tries to retrieve and parse configurations for any subreddits with invalid configuration state\n* Restarts any bots stopped/paused due to polling issues, general errors, or invalid configs (if new config is valid)\n\n* ENV => `HEARTBEAT`\n* ARG => `--heartbeat <sec>`",
|
||||
"examples": [
|
||||
300
|
||||
],
|
||||
"type": "number"
|
||||
},
|
||||
"names": {
|
||||
"description": "Names of subreddits for bot to run on\n\nIf not present or `null` bot will run on all subreddits it is a moderator of\n\n* ENV => `SUBREDDITS` (comma-separated)\n* ARG => `--subreddits <list...>`",
|
||||
"examples": [
|
||||
[
|
||||
"mealtimevideos",
|
||||
"programminghumor"
|
||||
]
|
||||
],
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"wikiConfig": {
|
||||
"default": "botconfig/contextbot",
|
||||
"description": "The default relative url to the ContextMod wiki page EX `https://reddit.com/r/subreddit/wiki/<path>`\n\n* ENV => `WIKI_CONFIG`\n* ARG => `--wikiConfig <path>`",
|
||||
"examples": [
|
||||
"botconfig/contextbot"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"web": {
|
||||
"description": "Settings for the web interface",
|
||||
"properties": {
|
||||
"enabled": {
|
||||
"default": true,
|
||||
"description": "Whether the web server interface should be started\n\nIn most cases this does not need to be specified as the application will automatically detect if it is possible to start it --\nuse this to specify \"cli only\" behavior if you encounter errors with port/address or are paranoid\n\n* ENV => `WEB`\n* ARG => `node src/index.js run [interface]` -- interface can be `web` or `cli`",
|
||||
"type": "boolean"
|
||||
"caching": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/CacheOptions"
|
||||
},
|
||||
{
|
||||
"enum": [
|
||||
"memory",
|
||||
"redis"
|
||||
],
|
||||
"type": "string"
|
||||
}
|
||||
],
|
||||
"description": "Caching provider to use for session and invite data\n\nIf none is provided the top-level caching provider is used"
|
||||
},
|
||||
"clients": {
|
||||
"description": "A list of CM Servers this Client should connect to.\n\nIf not specified a default `BotConnection` for this instance is generated",
|
||||
"examples": [
|
||||
[
|
||||
{
|
||||
"host": "localhost:8095",
|
||||
"secret": "aRandomString"
|
||||
}
|
||||
]
|
||||
],
"items": {
"$ref": "#/definitions/BotConnection"
},
"type": "array"
},
"credentials": {
"$ref": "#/definitions/WebCredentials",
"description": "Separate credentials for the web interface can be provided when also running the api.\n\nAll properties not specified will default to values given in ENV/ARG credential properties\n\nRefer to the [required credentials table](https://github.com/FoxxMD/context-mod/blob/master/docs/operatorConfiguration.md#minimum-required-configuration) to see what is necessary for the web interface.",
"examples": [
{
"clientId": "f4b4df1_9oiu",
"clientSecret": "34v5q1c564_yt7",
"redirectUri": "http://localhost:8085/callback"
}
]
},
"invites": {
"description": "Settings related to oauth flow invites",
"properties": {
"maxAge": {
"default": 0,
"description": "Number of seconds an invite should be valid for\n\nIf `0` or not specified (default) invites do not expire",
"examples": [
0
],
"type": "number"
}
},
"type": "object"
},
"logLevel": {
"description": "The default log level to filter to in the web interface\n\nIf not specified or `null` will be same as global `logLevel`",
@@ -506,6 +727,19 @@
],
"type": "number"
},
"operators": {
"description": "The name, or names, of the Reddit accounts, without prefix, that the operators of this **web interface** use.\n\n**Note:** This is **not the same** as the top-level `operator` property. This allows specified users to see the status of all `clients` but **not** access to them -- that must still be specified in the `operator.name` property in the configuration of each bot.\n\n\nEX -- User is /u/FoxxMD then `\"name\": [\"FoxxMD\"]`",
"examples": [
[
"FoxxMD",
"AnotherUser"
]
],
"items": {
"type": "string"
},
"type": "array"
},
"port": {
"default": 8085,
"description": "The port for the web interface\n\n* ENV => `PORT`\n* ARG => `--port <number>`",
@@ -517,27 +751,16 @@
"session": {
"description": "Settings to configure the behavior of user sessions -- the session is what the web interface uses to identify logged in users.",
"properties": {
"provider": {
"anyOf": [
{
"$ref": "#/definitions/CacheOptions"
},
{
"enum": [
"memory",
"redis"
],
"type": "string"
}
],
"default": "memory",
"description": "The cache provider to use.\n\nThe default should be sufficient for almost all use cases",
"maxAge": {
"default": 86400,
"description": "Number of seconds a session should be valid for.\n\nDefault is 1 day",
"examples": [
"memory"
]
86400
],
"type": "number"
},
"secret": {
"description": "The secret value used to encrypt session data\n\nIf provider is persistent (redis) specifying a value here will ensure sessions are valid between application restarts\n\nWhen not present or `null` a random string is generated on application start",
"description": "The secret value used to encrypt session data\n\nIf provider is persistent (`redis`) specifying a value here will ensure sessions are valid between application restarts\n\nWhen not present or `null` a random string is generated on application start",
"examples": [
"definitelyARandomString"
],

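Pulling together the defaults and example values documented in the operator-config hunks above, a minimal operator configuration exercising these options could look like the following (illustrative values only, taken from the schema's own `examples`):

```json
{
  "subreddits": {
    "dryRun": false,
    "heartbeatInterval": 300,
    "names": ["mealtimevideos", "programminghumor"],
    "wikiConfig": "botconfig/contextbot"
  },
  "web": {
    "enabled": true,
    "port": 8085,
    "invites": { "maxAge": 0 },
    "session": { "maxAge": 86400, "secret": "definitelyARandomString" }
  }
}
```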
@@ -24,6 +24,72 @@
}
],
"definitions": {
"ActivityThreshold": {
"additionalProperties": false,
"description": "At least one count property must be present. If both are present then either can trigger the rule",
"minProperties": 1,
"properties": {
"commentState": {
"$ref": "#/definitions/CommentState",
"description": "When present, a Comment will only be counted if it meets this criteria",
"examples": [
{
"op": true,
"removed": false
}
]
},
"karma": {
"description": "Test the **combined karma** from Activities found in the specified subreddits\n\nValue is a string containing a comparison operator and a number of **combined karma** to compare against\n\nIf specified then both `threshold` and `karma` must be met for this `SubThreshold` to be satisfied\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 50` => greater than 50 combined karma for all found Activities in specified subreddits",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"submissionState": {
"$ref": "#/definitions/SubmissionState",
"description": "When present, a Submission will only be counted if it meets this criteria",
"examples": [
{
"over_18": true,
"removed": false
}
]
},
"subreddits": {
"description": "Activities will be counted if they are found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
"examples": [
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"type": "array"
},
"threshold": {
"default": ">= 1",
"description": "A string containing a comparison operator and a value to compare recent activities against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 3` => greater than 3 activities found in the listed subreddits\n* EX `<= 75%` => number of Activities in the subreddits listed are equal to or less than 75% of all Activities\n\n**Note:** If you use percentage comparison here as well as `useSubmissionAsReference` then \"all Activities\" only pertains to Activities that had the Link of the Submission, rather than all Activities from this window.",
"examples": [
">= 1"
],
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
}
},
"type": "object"
},
"ActivityWindowCriteria": {
"additionalProperties": false,
"description": "Multiple properties that may be used to define what range of Activity to retrieve.\n\nMay specify one, or both properties along with the `satisfyOn` property, to affect the retrieval behavior.",
@@ -113,7 +179,7 @@
"properties": {
"aggregateOn": {
"default": "undefined",
"description": "If `domains` is not specified this list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`\n\n* If `media` is included then aggregate author's submission history which reddit recognizes as media (youtube, vimeo, etc.)\n* If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or reddit image/video (i.redd.it / v.redd.it)\n* If `link` is included then aggregate author's submission history which is external links but not media\n\nIf nothing is specified or list is empty (default) all domains are aggregated",
"description": "This list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`\n\n* If `media` is included then aggregate author's submission history which reddit recognizes as media (youtube, vimeo, etc.)\n* If `redditMedia` is included then aggregate on author's submissions history which are media hosted on reddit: galleries, videos, and images (i.redd.it / v.redd.it)\n* If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or domain is `reddit.com`\n* If `link` is included then aggregate author's submission history which is external links and not recognized as `media` by reddit\n\nIf nothing is specified or list is empty (default) all domains are aggregated",
"examples": [
[
]
@@ -122,6 +188,7 @@
"enum": [
"link",
"media",
"redditMedia",
"self"
],
"type": "string"
@@ -141,7 +208,7 @@
[
]
],
"description": "A list of domains whose Activities will be tested against `threshold`.\n\nIf this is present then `aggregateOn` is ignored.\n\nThe values are tested as partial strings so you do not need to include full URLs, just the part that matters.\n\nEX `[\"youtube\"]` will match submissions with the domain `https://youtube.com/c/aChannel`\nEX `[\"youtube.com/c/bChannel\"]` will NOT match submissions with the domain `https://youtube.com/c/aChannel`\n\nIf you wish to aggregate on self-posts for a subreddit use the syntax `self.[subreddit]` EX `self.AskReddit`\n\n**If this Rule is part of a Check for a Submission and you wish to aggregate on the domain of the Submission use the special string `AGG:SELF`**\n\nIf nothing is specified or list is empty (default) aggregate using `aggregateOn`",
"description": "A list of domains whose Activities will be tested against `threshold`.\n\nThe values are tested as partial strings so you do not need to include full URLs, just the part that matters.\n\nEX `[\"youtube\"]` will match submissions with the domain `https://youtube.com/c/aChannel`\nEX `[\"youtube.com/c/bChannel\"]` will NOT match submissions with the domain `https://youtube.com/c/aChannel`\n\nIf you wish to aggregate on self-posts for a subreddit use the syntax `self.[subreddit]` EX `self.AskReddit`\n\n**If this Rule is part of a Check for a Submission and you wish to aggregate on the domain of the Submission use the special string `AGG:SELF`**\n\nIf nothing is specified or list is empty (default) aggregate using `aggregateOn`",
"items": {
"type": "string"
},
@@ -330,7 +397,7 @@
"properties": {
|
||||
"age": {
|
||||
"description": "Test the age of the Author's account (when it was created) against this comparison\n\nThe syntax is `(< OR > OR <= OR >=) <number> <unit>`\n\n* EX `> 100 days` => Passes if Author's account is older than 100 days\n* EX `<= 2 months` => Passes if Author's account is younger than or equal to 2 months\n\nUnit must be one of [DayJS Duration units](https://day.js.org/docs/en/durations/creating)\n\n[See] https://regexr.com/609n8 for example",
|
||||
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<time>\\d+)\\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
|
||||
"type": "string"
|
||||
},
|
||||
"commentKarma": {
|
||||
@@ -378,6 +445,10 @@
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"shadowBanned": {
|
||||
"description": "Is the author shadowbanned?\n\nThis is determined by trying to retrieve the author's profile. If a 404 is returned it is likely they are shadowbanned",
|
||||
"type": "boolean"
|
||||
},
|
||||
"totalKarma": {
|
||||
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
@@ -538,6 +609,16 @@
|
||||
"removed": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"reports": {
|
||||
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"score": {
|
||||
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"spam": {
|
||||
"type": "boolean"
|
||||
},
|
||||
@@ -611,23 +692,28 @@
|
||||
"type": "object"
|
||||
},
|
||||
"HistoryCriteria": {
|
||||
"description": "If both `submission` and `comment` are defined then criteria will only trigger if BOTH thresholds are met",
|
||||
"description": "Criteria will only trigger if ALL present thresholds (comment, submission, total) are met",
|
||||
"properties": {
|
||||
"comment": {
|
||||
"description": "A string containing a comparison operator and a value to compare comments against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 comments\n* EX `<= 75%` => comments are equal to or less than 75% of all Activities\n\nIf your string also contains the text `OP` somewhere **after** `<number>[percent sign]`...:\n\n* EX `> 100 OP` => greater than 100 comments as OP\n* EX `<= 25% as OP` => Comments as OP were less then or equal to 25% of **all Comments**",
|
||||
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) comments against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 comments\n* EX `<= 75%` => comments are equal to or less than 75% of unfiltered Activities\n\nIf your string also contains the text `OP` somewhere **after** `<number>[percent sign]`...:\n\n* EX `> 100 OP` => greater than 100 filtered comments as OP\n* EX `<= 25% as OP` => **Filtered** comments as OP were less then or equal to 25% of **unfiltered Comments**",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"minActivityCount": {
|
||||
"default": 5,
|
||||
"description": "The minimum number of activities that must exist from the `window` results for this criteria to run",
|
||||
"description": "The minimum number of **filtered** activities that must exist from the `window` results for this criteria to run",
|
||||
"type": "number"
|
||||
},
|
||||
"name": {
|
||||
"type": "string"
|
||||
},
|
||||
"submission": {
|
||||
"description": "A string containing a comparison operator and a value to compare submissions against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 100` => greater than 100 submissions\n* EX `<= 75%` => submissions are equal to or less than 75% of all Activities",
|
||||
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) submissions against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 100` => greater than 100 filtered submissions\n* EX `<= 75%` => filtered submissions are equal to or less than 75% of unfiltered Activities",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"total": {
|
||||
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`) activities against\n\n**Note:** This is only useful if using `include` or `exclude` otherwise percent will always be 100% and total === activityTotal\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 filtered activities\n* EX `<= 75%` => filtered activities are equal to or less than 75% of all Activities",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
@@ -696,27 +782,51 @@
"type": "array"
},
"exclude": {
"description": "Do not include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are **NOT** found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`\n\n**Note:** This affects **post-window retrieval** activities. So that:\n\n* `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering\n* all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**\n* -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"include": {
"description": "Only include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are found in this list of Subreddits.\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`\n\n **Note:** This affects **post-window retrieval** activities. So that:\n\n* `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering\n* all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**\n* -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"itemIs": {
@@ -830,7 +940,7 @@
"thresholds": {
"description": "A list of subreddits/count criteria that may trigger this rule. ANY SubThreshold will trigger this rule.",
"items": {
"$ref": "#/definitions/SubThreshold"
"$ref": "#/definitions/ActivityThreshold"
},
"minItems": 1,
"type": "array"
@@ -867,6 +977,49 @@
],
"type": "object"
},
"RegExp": {
|
||||
"properties": {
|
||||
"dotAll": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"flags": {
|
||||
"type": "string"
|
||||
},
|
||||
"global": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"ignoreCase": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"lastIndex": {
|
||||
"type": "number"
|
||||
},
|
||||
"multiline": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"source": {
|
||||
"type": "string"
|
||||
},
|
||||
"sticky": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"unicode": {
|
||||
"type": "boolean"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"dotAll",
|
||||
"flags",
|
||||
"global",
|
||||
"ignoreCase",
|
||||
"lastIndex",
|
||||
"multiline",
|
||||
"source",
|
||||
"sticky",
|
||||
"unicode"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"RegexCriteria": {
|
||||
"properties": {
|
||||
"activityMatchThreshold": {
|
||||
@@ -905,16 +1058,12 @@
|
||||
"type": "string"
|
||||
},
|
||||
"regex": {
|
||||
"description": "A valid Regular Expression to test content against\n\nDo not wrap expression in forward slashes\n\nEX For the expression `/reddit|FoxxMD/` use the value should be `reddit|FoxxMD`",
|
||||
"description": "A valid Regular Expression to test content against\n\nIf no flags are specified then the **global** flag is used by default",
|
||||
"examples": [
|
||||
"reddit|FoxxMD"
|
||||
"/reddit|FoxxMD/ig"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"regexFlags": {
|
||||
"description": "Regex flags to use",
|
||||
"type": "string"
|
||||
},
|
||||
"testOn": {
|
||||
"default": [
|
||||
"title",
|
||||
@@ -1076,15 +1225,27 @@
|
||||
]
|
||||
},
"exclude": {
"description": "Do not include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are **NOT** found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"gapAllowance": {
@@ -1092,15 +1253,27 @@
"type": "number"
},
"include": {
"description": "Only include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"itemIs": {
@@ -1190,40 +1363,6 @@
],
"type": "object"
},
"SubThreshold": {
"additionalProperties": false,
"description": "At least one count property must be present. If both are present then either can trigger the rule",
"minProperties": 1,
"properties": {
"subreddits": {
"description": "A list of Subreddits (by name, case-insensitive) to look for.\n\nEX [\"mealtimevideos\",\"askscience\"]",
"examples": [
[
"mealtimevideos",
"askscience"
]
],
"items": {
"type": "string"
},
"minItems": 1,
"type": "array"
},
"threshold": {
"default": ">= 1",
"description": "A string containing a comparison operator and a value to compare recent activities against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 3` => greater than 3 activities found in the listed subreddits\n* EX `<= 75%` => number of Activities in the subreddits listed are equal to or less than 75% of all Activities\n\n**Note:** If you use percentage comparison here as well as `useSubmissionAsReference` then \"all Activities\" only pertains to Activities that had the Link of the Submission, rather than all Activities from this window.",
"examples": [
">= 1"
],
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
}
},
"required": [
"subreddits"
],
"type": "object"
|
||||
},
|
||||
"SubmissionState": {
|
||||
"description": "Different attributes a `Submission` can be in. Only include a property if you want to check it.",
|
||||
"examples": [
|
||||
@@ -1267,6 +1406,16 @@
|
||||
"removed": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"reports": {
|
||||
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"score": {
|
||||
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"spam": {
|
||||
"type": "boolean"
|
||||
},
|
||||
@@ -1283,6 +1432,44 @@
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"SubredditState": {
|
||||
"description": "Different attributes a `Subreddit` can be in. Only include a property if you want to check it.",
|
||||
"examples": [
|
||||
{
|
||||
"over18": true
|
||||
}
|
||||
],
|
||||
"properties": {
|
||||
"name": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/RegExp"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
],
|
||||
"description": "The name the subreddit.\n\nCan be a normal string (will check case-insensitive) or a regular expression\n\nEX `[\"mealtimevideos\", \"/onlyfans*\\/i\"]`",
|
||||
"examples": [
|
||||
"mealtimevideos",
|
||||
"/onlyfans*/i"
|
||||
]
|
||||
},
|
||||
"over18": {
|
||||
"description": "Is subreddit NSFW/over 18?\n\n**Note**: This is **mod-controlled flag** so it is up to the mods of the subreddit to correctly mark their subreddit as NSFW",
|
||||
"type": "boolean"
|
||||
},
|
||||
"quarantine": {
|
||||
"description": "Is subreddit quarantined?",
|
||||
"type": "boolean"
|
||||
},
|
||||
"stateDescription": {
|
||||
"description": "A friendly description of what this State is trying to parse",
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"UserNoteCriteria": {
|
||||
"properties": {
|
||||
"count": {
|
||||
|
||||
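Nearly every comparison property in the rule schema below shares the same `pattern`. As a rough sketch of how such a string breaks down (illustrative only -- this is not the project's actual parser, and `parseComparison` is a hypothetical name), the pattern's four capture groups can be unpacked like this:

```javascript
// The shared comparison pattern from the schema:
// ^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$
const COMPARISON = /^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$/;

// Parse a comparison string like ">= 1" or "<= 75%" into its parts.
// Returns null when the string does not match the schema pattern.
function parseComparison(str) {
  const m = str.match(COMPARISON);
  if (m === null) {
    return null;
  }
  return {
    operator: m[1],          // one of >, >=, <, <=
    value: Number(m[2]),     // the numeric threshold
    isPercent: m[3] === '%', // percent comparison vs absolute count
    extra: m[4].trim(),      // trailing modifiers such as "OP"
  };
}
```

For example, `parseComparison('<= 25% as OP')` yields the `<=` operator, the value `25`, a percent flag, and `as OP` as the trailing modifier, mirroring the `HistoryCriteria.comment` examples.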
@@ -1,6 +1,72 @@
|
||||
{
|
||||
"$schema": "http://json-schema.org/draft-07/schema#",
|
||||
"definitions": {
|
||||
"ActivityThreshold": {
|
||||
"additionalProperties": false,
|
||||
"description": "At least one count property must be present. If both are present then either can trigger the rule",
|
||||
"minProperties": 1,
|
||||
"properties": {
|
||||
"commentState": {
|
||||
"$ref": "#/definitions/CommentState",
|
||||
"description": "When present, a Comment will only be counted if it meets this criteria",
|
||||
"examples": [
|
||||
{
|
||||
"op": true,
|
||||
"removed": false
|
||||
}
|
||||
]
|
||||
},
|
||||
"karma": {
|
||||
"description": "Test the **combined karma** from Activities found in the specified subreddits\n\nValue is a string containing a comparison operator and a number of **combined karma** to compare against\n\nIf specified then both `threshold` and `karma` must be met for this `SubThreshold` to be satisfied\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 50` => greater than 50 combined karma for all found Activities in specified subreddits",
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
},
|
||||
"submissionState": {
|
||||
"$ref": "#/definitions/SubmissionState",
|
||||
"description": "When present, a Submission will only be counted if it meets this criteria",
|
||||
"examples": [
|
||||
{
|
||||
"over_18": true,
|
||||
"removed": false
|
||||
}
|
||||
]
|
||||
},
|
||||
"subreddits": {
|
||||
"description": "Activities will be counted if they are found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
|
||||
"examples": [
|
||||
[
|
||||
"mealtimevideos",
|
||||
"askscience",
|
||||
"/onlyfans*/i",
|
||||
{
|
||||
"over18": true
|
||||
}
|
||||
]
|
||||
],
|
||||
"items": {
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubredditState"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"threshold": {
|
||||
"default": ">= 1",
|
||||
"description": "A string containing a comparison operator and a value to compare recent activities against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 3` => greater than 3 activities found in the listed subreddits\n* EX `<= 75%` => number of Activities in the subreddits listed are equal to or less than 75% of all Activities\n\n**Note:** If you use percentage comparison here as well as `useSubmissionAsReference` then \"all Activities\" is only pertains to Activities that had the Link of the Submission, rather than all Activities from this window.",
|
||||
"examples": [
|
||||
">= 1"
|
||||
],
|
||||
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"ActivityWindowCriteria": {
|
||||
"additionalProperties": false,
|
||||
"description": "Multiple properties that may be used to define what range of Activity to retrieve.\n\nMay specify one, or both properties along with the `satisfyOn` property, to affect the retrieval behavior.",
|
||||
@@ -90,7 +156,7 @@
|
||||
"properties": {
|
||||
"aggregateOn": {
|
||||
"default": "undefined",
|
||||
"description": "If `domains` is not specified this list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`\n\n* If `media` is included then aggregate author's submission history which reddit recognizes as media (youtube, vimeo, etc.)\n* If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or reddit image/video (i.redd.it / v.redd.it)\n* If `link` is included then aggregate author's submission history which is external links but not media\n\nIf nothing is specified or list is empty (default) all domains are aggregated",
|
||||
"description": "This list determines which categories of domains should be aggregated on. All aggregated domains will be tested against `threshold`\n\n* If `media` is included then aggregate author's submission history which reddit recognizes as media (youtube, vimeo, etc.)\n* If `redditMedia` is included then aggregate on author's submissions history which are media hosted on reddit: galleries, videos, and images (i.redd.it / v.redd.it)\n* If `self` is included then aggregate on author's submission history which are self-post (`self.[subreddit]`) or domain is `reddit.com`\n* If `link` is included then aggregate author's submission history which is external links and not recognized as `media` by reddit\n\nIf nothing is specified or list is empty (default) all domains are aggregated",
"examples": [
[
]
@@ -99,6 +165,7 @@
"enum": [
"link",
"media",
"redditMedia",
"self"
],
"type": "string"
@@ -118,7 +185,7 @@
[
]
],
"description": "A list of domains whose Activities will be tested against `threshold`.\n\nIf this is present then `aggregateOn` is ignored.\n\nThe values are tested as partial strings so you do not need to include full URLs, just the part that matters.\n\nEX `[\"youtube\"]` will match submissions with the domain `https://youtube.com/c/aChannel`\nEX `[\"youtube.com/c/bChannel\"]` will NOT match submissions with the domain `https://youtube.com/c/aChannel`\n\nIf you wish to aggregate on self-posts for a subreddit use the syntax `self.[subreddit]` EX `self.AskReddit`\n\n**If this Rule is part of a Check for a Submission and you wish to aggregate on the domain of the Submission use the special string `AGG:SELF`**\n\nIf nothing is specified or list is empty (default) aggregate using `aggregateOn`",
"description": "A list of domains whose Activities will be tested against `threshold`.\n\nThe values are tested as partial strings so you do not need to include full URLs, just the part that matters.\n\nEX `[\"youtube\"]` will match submissions with the domain `https://youtube.com/c/aChannel`\nEX `[\"youtube.com/c/bChannel\"]` will NOT match submissions with the domain `https://youtube.com/c/aChannel`\n\nIf you wish to aggregate on self-posts for a subreddit use the syntax `self.[subreddit]` EX `self.AskReddit`\n\n**If this Rule is part of a Check for a Submission and you wish to aggregate on the domain of the Submission use the special string `AGG:SELF`**\n\nIf nothing is specified or list is empty (default) aggregate using `aggregateOn`",
"items": {
"type": "string"
},
@@ -307,7 +374,7 @@
"properties": {
"age": {
"description": "Test the age of the Author's account (when it was created) against this comparison\n\nThe syntax is `(< OR > OR <= OR >=) <number> <unit>`\n\n* EX `> 100 days` => Passes if Author's account is older than 100 days\n* EX `<= 2 months` => Passes if Author's account is younger than or equal to 2 months\n\nUnit must be one of [DayJS Duration units](https://day.js.org/docs/en/durations/creating)\n\nSee https://regexr.com/609n8 for an example",
"pattern": "^\\s*(?<opStr>>|>=|<|<=)\\s*(?<time>\\d+)\\s*(?<unit>days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\\s*$",
"type": "string"
},
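The duration-comparison pattern above (the unnamed-group variant) can be exercised directly. A minimal TypeScript sketch; `parseDurationCompare` is a hypothetical helper for illustration, not part of the application:

```typescript
// Duration comparison pattern from the schema above (unnamed-group variant).
const durationCompare = /^\s*(>|>=|<|<=)\s*(\d+)\s*(days?|weeks?|months?|years?|hours?|minutes?|seconds?|milliseconds?)\s*$/;

// Parse a comparison string like "> 100 days" into its parts, or null if invalid.
function parseDurationCompare(input: string): { op: string; value: number; unit: string } | null {
    const match = input.match(durationCompare);
    if (match === null) {
        return null;
    }
    const [, op, value, unit] = match;
    return { op, value: Number(value), unit };
}

console.log(parseDurationCompare('> 100 days'));  // { op: '>', value: 100, unit: 'days' }
console.log(parseDurationCompare('<= 2 months')); // { op: '<=', value: 2, unit: 'months' }
console.log(parseDurationCompare('2 fortnights')); // null (no operator, unrecognized unit)
```

Note that the pattern anchors on both ends, so trailing text (unlike the generic karma/score pattern, which allows a free-form suffix) makes the whole string invalid.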
"commentKarma": {
@@ -355,6 +422,10 @@
},
"type": "array"
},
"shadowBanned": {
"description": "Is the author shadowbanned?\n\nThis is determined by trying to retrieve the author's profile. If a 404 is returned it is likely they are shadowbanned",
"type": "boolean"
},
"totalKarma": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
@@ -515,6 +586,16 @@
"removed": {
"type": "boolean"
},
"reports": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"score": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"spam": {
"type": "boolean"
},
@@ -588,23 +669,28 @@
"type": "object"
},
"HistoryCriteria": {
"description": "If both `submission` and `comment` are defined then criteria will only trigger if BOTH thresholds are met",
"description": "Criteria will only trigger if ALL present thresholds (comment, submission, total) are met",
"properties": {
"comment": {
"description": "A string containing a comparison operator and a value to compare comments against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 comments\n* EX `<= 75%` => comments are equal to or less than 75% of all Activities\n\nIf your string also contains the text `OP` somewhere **after** `<number>[percent sign]`...:\n\n* EX `> 100 OP` => greater than 100 comments as OP\n* EX `<= 25% as OP` => Comments as OP were less than or equal to 25% of **all Comments**",
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) comments against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 comments\n* EX `<= 75%` => comments are equal to or less than 75% of unfiltered Activities\n\nIf your string also contains the text `OP` somewhere **after** `<number>[percent sign]`...:\n\n* EX `> 100 OP` => greater than 100 filtered comments as OP\n* EX `<= 25% as OP` => **Filtered** comments as OP were less than or equal to 25% of **unfiltered Comments**",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"minActivityCount": {
"default": 5,
"description": "The minimum number of activities that must exist from the `window` results for this criteria to run",
"description": "The minimum number of **filtered** activities that must exist from the `window` results for this criteria to run",
"type": "number"
},
"name": {
"type": "string"
},
"submission": {
"description": "A string containing a comparison operator and a value to compare submissions against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 100` => greater than 100 submissions\n* EX `<= 75%` => submissions are equal to or less than 75% of all Activities",
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`, if present) submissions against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 100` => greater than 100 filtered submissions\n* EX `<= 75%` => filtered submissions are equal to or less than 75% of unfiltered Activities",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"total": {
"description": "A string containing a comparison operator and a value to compare **filtered** (using `include` or `exclude`) activities against\n\n**Note:** This is only useful if using `include` or `exclude` otherwise percent will always be 100% and total === activityTotal\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign] [OP]`\n\n* EX `> 100` => greater than 100 filtered activities\n* EX `<= 75%` => filtered activities are equal to or less than 75% of all Activities",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
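The generic comparison pattern shared by `comment`, `submission`, and `total` captures the operator, the number, an optional percent sign, and any trailing text (such as `OP`). A minimal sketch of how such a string could be broken apart; `parseCompare` and the `Comparison` shape are illustrative assumptions, not the application's API:

```typescript
// Generic comparison pattern used throughout the schema: operator, number,
// optional percent sign, and free-form trailing text (e.g. "OP" / "as OP").
const genericCompare = /^\s*(>|>=|<|<=)\s*(\d+)\s*(%?)(.*)$/;

interface Comparison {
    op: string;
    value: number;
    isPercent: boolean;
    asOp: boolean; // trailing text mentions OP
}

function parseCompare(input: string): Comparison | null {
    const match = input.match(genericCompare);
    if (match === null) {
        return null;
    }
    const [, op, value, percent, rest] = match;
    return { op, value: Number(value), isPercent: percent === '%', asOp: rest.includes('OP') };
}

console.log(parseCompare('<= 25% as OP')); // { op: '<=', value: 25, isPercent: true, asOp: true }
console.log(parseCompare('> 100'));        // { op: '>', value: 100, isPercent: false, asOp: false }
```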
@@ -673,27 +759,51 @@
"type": "array"
},
"exclude": {
"description": "Do not include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are **NOT** found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`\n\n**Note:** This affects **post-window retrieval** activities. So that:\n\n* `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering\n* all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**\n* -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"include": {
"description": "Only include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are found in this list of Subreddits.\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`\n\n**Note:** This affects **post-window retrieval** activities. So that:\n\n* `activityTotal` is number of activities retrieved from `window` -- NOT post-filtering\n* all comparisons using **percentages** will compare **post-filtering** results against **activity count from window**\n* -- to run this rule where all activities are only from include/exclude filtering instead use include/exclude in `window`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"itemIs": {
@@ -807,7 +917,7 @@
"thresholds": {
"description": "A list of subreddits/count criteria that may trigger this rule. ANY SubThreshold will trigger this rule.",
"items": {
"$ref": "#/definitions/SubThreshold"
"$ref": "#/definitions/ActivityThreshold"
},
"minItems": 1,
"type": "array"
@@ -844,6 +954,49 @@
],
"type": "object"
},
"RegExp": {
"properties": {
"dotAll": {
"type": "boolean"
},
"flags": {
"type": "string"
},
"global": {
"type": "boolean"
},
"ignoreCase": {
"type": "boolean"
},
"lastIndex": {
"type": "number"
},
"multiline": {
"type": "boolean"
},
"source": {
"type": "string"
},
"sticky": {
"type": "boolean"
},
"unicode": {
"type": "boolean"
}
},
"required": [
"dotAll",
"flags",
"global",
"ignoreCase",
"lastIndex",
"multiline",
"source",
"sticky",
"unicode"
],
"type": "object"
},
"RegexCriteria": {
"properties": {
"activityMatchThreshold": {
@@ -882,16 +1035,12 @@
"type": "string"
},
"regex": {
"description": "A valid Regular Expression to test content against\n\nDo not wrap the expression in forward slashes\n\nEX For the expression `/reddit|FoxxMD/` the value should be `reddit|FoxxMD`",
"description": "A valid Regular Expression to test content against\n\nIf no flags are specified then the **global** flag is used by default",
"examples": [
"reddit|FoxxMD"
"/reddit|FoxxMD/ig"
],
"type": "string"
},
"regexFlags": {
"description": "Regex flags to use",
"type": "string"
},
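The new description says a bare expression defaults to the **global** flag while a `/pattern/flags` string keeps its own flags. A sketch of that documented behavior under stated assumptions; `toRegex` is a hypothetical helper, not the application's implementation:

```typescript
// A bare expression gets the global flag by default; a "/pattern/flags"
// string keeps whatever flags it declares.
function toRegex(value: string): RegExp {
    const literal = value.match(/^\/(.*)\/([a-z]*)$/);
    if (literal !== null) {
        const [, source, flags] = literal;
        return new RegExp(source, flags === '' ? 'g' : flags);
    }
    return new RegExp(value, 'g');
}

console.log(toRegex('reddit|FoxxMD').flags);     // "g"
console.log(toRegex('/reddit|FoxxMD/ig').flags); // "gi" (RegExp#flags is returned sorted)
```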
"testOn": {
"default": [
"title",
@@ -1053,15 +1202,27 @@
]
},
"exclude": {
"description": "Do not include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
"description": "If present, activities will be counted only if they are **NOT** found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
"examples": [
"mealtimevideos",
"askscience"
[
"mealtimevideos",
"askscience",
"/onlyfans*/i",
{
"over18": true
}
]
],
"items": {
"type": "string"
"anyOf": [
{
"$ref": "#/definitions/SubredditState"
},
{
"type": "string"
}
]
},
"minItems": 1,
"type": "array"
},
"gapAllowance": {
|
||||
@@ -1069,15 +1230,27 @@
|
||||
"type": "number"
|
||||
},
|
||||
"include": {
|
||||
"description": "Only include Submissions from this list of Subreddits (by name, case-insensitive)\n\nEX `[\"mealtimevideos\",\"askscience\"]`",
|
||||
"description": "If present, activities will be counted only if they are found in this list of Subreddits\n\nEach value in the list can be either:\n\n * string (name of subreddit)\n * regular expression to run on the subreddit name\n * `SubredditState`\n\nEX `[\"mealtimevideos\",\"askscience\", \"/onlyfans*\\/i\", {\"over18\": true}]`",
|
||||
"examples": [
|
||||
"mealtimevideos",
|
||||
"askscience"
|
||||
[
|
||||
"mealtimevideos",
|
||||
"askscience",
|
||||
"/onlyfans*/i",
|
||||
{
|
||||
"over18": true
|
||||
}
|
||||
]
|
||||
],
|
||||
"items": {
|
||||
"type": "string"
|
||||
"anyOf": [
|
||||
{
|
||||
"$ref": "#/definitions/SubredditState"
|
||||
},
|
||||
{
|
||||
"type": "string"
|
||||
}
|
||||
]
|
||||
},
|
||||
"minItems": 1,
|
||||
"type": "array"
|
||||
},
|
||||
"itemIs": {
|
||||
@@ -1167,40 +1340,6 @@
],
"type": "object"
},
"SubThreshold": {
"additionalProperties": false,
"description": "At least one count property must be present. If both are present then either can trigger the rule",
"minProperties": 1,
"properties": {
"subreddits": {
"description": "A list of Subreddits (by name, case-insensitive) to look for.\n\nEX [\"mealtimevideos\",\"askscience\"]",
"examples": [
[
"mealtimevideos",
"askscience"
]
],
"items": {
"type": "string"
},
"minItems": 1,
"type": "array"
},
"threshold": {
"default": ">= 1",
"description": "A string containing a comparison operator and a value to compare recent activities against\n\nThe syntax is `(< OR > OR <= OR >=) <number>[percent sign]`\n\n* EX `> 3` => greater than 3 activities found in the listed subreddits\n* EX `<= 75%` => number of Activities in the subreddits listed is equal to or less than 75% of all Activities\n\n**Note:** If you use percentage comparison here as well as `useSubmissionAsReference` then \"all Activities\" only pertains to Activities that had the Link of the Submission, rather than all Activities from this window.",
"examples": [
">= 1"
],
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
}
},
"required": [
"subreddits"
],
"type": "object"
},
"SubmissionState": {
"description": "Different attributes a `Submission` can be in. Only include a property if you want to check it.",
"examples": [
@@ -1244,6 +1383,16 @@
"removed": {
"type": "boolean"
},
"reports": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"score": {
"description": "A string containing a comparison operator and a value to compare against\n\nThe syntax is `(< OR > OR <= OR >=) <number>`\n\n* EX `> 100` => greater than 100",
"pattern": "^\\s*(>|>=|<|<=)\\s*(\\d+)\\s*(%?)(.*)$",
"type": "string"
},
"spam": {
"type": "boolean"
},
@@ -1260,6 +1409,44 @@
},
"type": "object"
},
"SubredditState": {
"description": "Different attributes a `Subreddit` can be in. Only include a property if you want to check it.",
"examples": [
{
"over18": true
}
],
"properties": {
"name": {
"anyOf": [
{
"$ref": "#/definitions/RegExp"
},
{
"type": "string"
}
],
"description": "The name of the subreddit.\n\nCan be a normal string (checked case-insensitively) or a regular expression\n\nEX `[\"mealtimevideos\", \"/onlyfans*\\/i\"]`",
"examples": [
"mealtimevideos",
"/onlyfans*/i"
]
},
"over18": {
"description": "Is the subreddit NSFW/over 18?\n\n**Note**: This is a **mod-controlled** flag so it is up to the mods of the subreddit to correctly mark their subreddit as NSFW",
"type": "boolean"
},
"quarantine": {
"description": "Is the subreddit quarantined?",
"type": "boolean"
},
"stateDescription": {
"description": "A friendly description of what this State is trying to parse",
"type": "string"
}
},
"type": "object"
},
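The `name` property accepts either a plain string (matched case-insensitively) or a `/pattern/flags` regular expression string. A small sketch of how a subreddit name could be matched against either form; `matchesName` is an illustrative helper, not part of the application:

```typescript
// Match a subreddit name against a string-or-regex criteria value.
// "/pattern/flags" strings are treated as regexes; plain strings are
// compared case-insensitively.
function matchesName(subreddit: string, criteria: string): boolean {
    const regexLiteral = criteria.match(/^\/(.*)\/([a-z]*)$/);
    if (regexLiteral !== null) {
        const [, source, flags] = regexLiteral;
        return new RegExp(source, flags).test(subreddit);
    }
    return subreddit.toLowerCase() === criteria.toLowerCase();
}

console.log(matchesName('AskScience', 'askscience'));    // true
console.log(matchesName('onlyfans101', '/onlyfans*/i')); // true
console.log(matchesName('askreddit', 'askscience'));     // false
```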
"UserNoteCriteria": {
"properties": {
"count": {
@@ -1,85 +0,0 @@
import {addAsync, Router} from '@awaitjs/express';
import express from 'express';
import Snoowrap from "snoowrap";
import {permissions} from "../util";
import {getLogger} from "../Utils/loggerFactory";
import {OperatorConfig} from "../Common/interfaces";

const app = addAsync(express());
const router = Router();
app.set('views', `${__dirname}/views`);
app.set('view engine', 'ejs');

app.use(router);

const helperServer = async function (options: OperatorConfig) {
    let rUri: string;

    const {
        credentials: {
            clientId,
            clientSecret,
            redirectUri
        },
        web: {
            port
        }
    } = options;

    const server = await app.listen(port);
    const logger = getLogger(options);
    logger.info(`Helper UI started: http://localhost:${port}`);
    app.getAsync('/', async (req, res) => {
        res.render('helper', {
            redirectUri
        });
    });

    app.getAsync('/auth', async (req, res) => {
        rUri = req.query.redirect as string;
        let permissionsList = permissions;

        const includeWikiEdit = (req.query.wikiEdit as any).toString() === "1";
        if (!includeWikiEdit) {
            permissionsList = permissionsList.filter(x => x !== 'wikiedit');
        }
        const authUrl = Snoowrap.getAuthUrl({
            clientId,
            scope: permissionsList,
            redirectUri: rUri as string,
            permanent: true,
        });
        return res.redirect(authUrl);
    });

    app.getAsync(/.*callback$/, async (req, res) => {
        const {error, code} = req.query as any;
        if (error !== undefined) {
            let errContent: string;
            switch (error) {
                case 'access_denied':
                    errContent = 'You must <b>Allow</b> this application to connect in order to proceed.';
                    break;
                default:
                    errContent = error;
            }
            return res.render('error', {error: errContent});
        }
        const client = await Snoowrap.fromAuthCode({
            userAgent: `web:contextBot:web`,
            clientId,
            clientSecret,
            redirectUri: rUri,
            code: code as string,
        });
        // @ts-ignore
        const user = await client.getMe();

        res.render('callback', {
            accessToken: client.accessToken,
            refreshToken: client.refreshToken,
        });
    });
}

export default helperServer;
@@ -1,21 +0,0 @@
/* adapted from https://cdn.jsdelivr.net/npm/pretty-print-json@1.0/dist/pretty-print-json.css */
.json-key { color: brown; }
.json-string { color: olive; }
.json-number { color: navy; }
.json-boolean { color: teal; }
.json-null { color: dimgray; }
.json-mark { color: black; }
a.json-link { color: purple; transition: all 400ms; }
a.json-link:visited { color: slategray; }
a.json-link:hover { color: blueviolet; }
a.json-link:active { color: slategray; }
.dark .json-key { color: indianred; }
.dark .json-string { color: darkkhaki; }
.dark .json-number { color: deepskyblue; }
.dark .json-boolean { color: mediumseagreen; }
.dark .json-null { color: darkorange; }
.dark .json-mark { color: silver; }
.dark a.json-link { color: mediumorchid; }
.dark a.json-link:visited { color: slategray; }
.dark a.json-link:hover { color: violet; }
.dark a.json-link:active { color: silver; }
@@ -1,781 +0,0 @@
|
||||
import {addAsync, Router} from '@awaitjs/express';
|
||||
import express from 'express';
|
||||
import bodyParser from 'body-parser';
|
||||
import session from 'express-session';
|
||||
import {Cache} from 'cache-manager';
|
||||
// @ts-ignore
|
||||
import CacheManagerStore from 'express-session-cache-manager'
|
||||
import Snoowrap from "snoowrap";
|
||||
import {App} from "../App";
|
||||
import dayjs from 'dayjs';
|
||||
import {Writable} from "stream";
|
||||
import winston from 'winston';
|
||||
import {Server as SocketServer} from 'socket.io';
|
||||
import sharedSession from 'express-socket.io-session';
|
||||
import Submission from "snoowrap/dist/objects/Submission";
|
||||
import EventEmitter from "events";
|
||||
import tcpUsed from 'tcp-port-used';
|
||||
import { prettyPrintJson } from 'pretty-print-json';
|
||||
|
||||
import {
|
||||
boolToString, cacheStats,
|
||||
COMMENT_URL_ID, createCacheManager,
|
||||
filterLogBySubreddit,
|
||||
formatLogLineToHtml, formatNumber,
|
||||
isLogLineMinLevel,
|
||||
LogEntry, parseFromJsonOrYamlToObject,
|
||||
parseLinkIdentifier,
|
||||
parseSubredditLogName, parseSubredditName,
|
||||
pollingInfo, SUBMISSION_URL_ID
|
||||
} from "../util";
|
||||
import {Manager} from "../Subreddit/Manager";
|
||||
import {getLogger} from "../Utils/loggerFactory";
|
||||
import LoggedError from "../Utils/LoggedError";
|
||||
import {OperatorConfig, ResourceStats, RUNNING, STOPPED, SYSTEM, USER} from "../Common/interfaces";
|
||||
import http from "http";
|
||||
import SimpleError from "../Utils/SimpleError";
|
||||
|
||||
const app = addAsync(express());
|
||||
const router = Router();
|
||||
|
||||
app.use(router);
|
||||
app.use(bodyParser.json());
|
||||
app.set('views', `${__dirname}/views`);
|
||||
app.set('view engine', 'ejs');
|
||||
|
||||
interface ConnectedUserInfo {
|
||||
subreddits: string[],
|
||||
level?: string,
|
||||
user: string
|
||||
}
|
||||
|
||||
const commentReg = parseLinkIdentifier([COMMENT_URL_ID]);
|
||||
const submissionReg = parseLinkIdentifier([SUBMISSION_URL_ID]);
|
||||
|
||||
const connectedUsers: Map<string, ConnectedUserInfo> = new Map();
|
||||
|
||||
const availableLevels = ['error', 'warn', 'info', 'verbose', 'debug'];
|
||||
|
||||
let operatorSessionIds: string[] = [];
|
||||
|
||||
declare module 'express-session' {
|
||||
interface SessionData {
|
||||
user: string,
|
||||
subreddits: string[],
|
||||
lastCheck?: number,
|
||||
limit?: number,
|
||||
sort?: string,
|
||||
level?: string,
|
||||
}
|
||||
}
|
||||
|
||||
const subLogMap: Map<string, LogEntry[]> = new Map();
|
||||
|
||||
const emitter = new EventEmitter();
|
||||
const stream = new Writable()
|
||||
|
||||
const rcbServer = function (options: OperatorConfig): ([() => Promise<void>, App]) {
|
||||
|
||||
const {
|
||||
credentials: {
|
||||
clientId,
|
||||
clientSecret,
|
||||
redirectUri
|
||||
},
|
||||
operator: {
|
||||
name,
|
||||
display,
|
||||
},
|
||||
web: {
|
||||
port,
|
||||
session: {
|
||||
provider,
|
||||
secret,
|
||||
},
|
||||
maxLogs,
|
||||
},
|
||||
} = options;
|
||||
|
||||
const opNames = name.map(x => x.toLowerCase());
|
||||
let bot: App;
|
||||
let botSubreddits: string[] = [];
|
||||
|
||||
stream._write = (chunk, encoding, next) => {
|
||||
let logLine = chunk.toString();
|
||||
const now = Date.now();
|
||||
const logEntry: LogEntry = [now, logLine];
|
||||
|
||||
const subName = parseSubredditLogName(logLine);
|
||||
if (subName !== undefined && (botSubreddits.length === 0 || botSubreddits.includes(subName))) {
|
||||
const subLogs = subLogMap.get(subName) || [];
|
||||
subLogs.unshift(logEntry);
|
||||
subLogMap.set(subName, subLogs.slice(0, maxLogs + 1));
|
||||
} else {
|
||||
const appLogs = subLogMap.get('app') || [];
|
||||
appLogs.unshift(logEntry);
|
||||
subLogMap.set('app', appLogs.slice(0, maxLogs + 1));
|
||||
}
|
||||
|
||||
emitter.emit('log', logLine);
|
||||
next();
|
||||
}
|
||||
const streamTransport = new winston.transports.Stream({
|
||||
stream,
|
||||
})
|
||||
|
||||
const logger = getLogger({...options.logging, additionalTransports: [streamTransport]})
|
||||
|
||||
// need to return App to main so that we can handle app shutdown on SIGTERM and discriminate between normal shutdown and crash on error
|
||||
bot = new App(options);
|
||||
|
||||
const serverFunc = async function () {
|
||||
|
||||
if (await tcpUsed.check(port)) {
|
||||
throw new SimpleError(`Specified port for web interface (${port}) is in use or not available. Cannot start web server.`);
|
||||
}
|
||||
|
||||
let server: http.Server,
|
||||
io: SocketServer;
|
||||
|
||||
try {
|
||||
server = await app.listen(port);
|
||||
io = new SocketServer(server);
|
||||
} catch (err) {
|
||||
logger.error('Error occurred while initializing web or socket.io server', err);
|
||||
err.logged = true;
|
||||
throw err;
|
||||
}
|
||||
|
||||
logger.info(`Web UI started: http://localhost:${port}`);
|
||||
|
||||
await bot.testClient();
|
||||
|
||||
app.use('/public', express.static(`${__dirname}/public`));
|
||||
|
||||
await bot.buildManagers();
|
||||
botSubreddits = bot.subManagers.map(x => x.displayLabel);
|
||||
// TODO potentially prune subLogMap of user keys? shouldn't have happened this early though
|
||||
|
||||
if (provider.store === 'none') {
|
||||
logger.warn(`Cannot use 'none' for session store or else no one can use the interface...falling back to 'memory'`);
|
||||
provider.store = 'memory';
|
||||
}
|
||||
const sessionObj = session({
|
||||
cookie: {
|
||||
maxAge: provider.ttl,
|
||||
},
|
||||
store: new CacheManagerStore(createCacheManager(provider) as Cache),
|
||||
resave: false,
|
||||
saveUninitialized: false,
|
||||
secret,
|
||||
});
|
||||
|
||||
app.use(sessionObj);
|
||||
io.use(sharedSession(sessionObj));
|
||||
|
||||
io.on("connection", function (socket) {
|
||||
// @ts-ignore
|
||||
if (socket.handshake.session.user !== undefined) {
|
||||
// @ts-ignore
|
||||
socket.join(socket.handshake.session.id);
|
||||
// @ts-ignore
|
||||
connectedUsers.set(socket.handshake.session.id, {
|
||||
// @ts-ignore
|
||||
subreddits: socket.handshake.session.subreddits,
|
||||
// @ts-ignore
|
||||
level: socket.handshake.session.level,
|
||||
// @ts-ignore
|
||||
user: socket.handshake.session.user
|
||||
});
|
||||
|
||||
// @ts-ignore
|
||||
if (opNames.includes(socket.handshake.session.user.toLowerCase())) {
|
||||
// @ts-ignore
|
||||
operatorSessionIds.push(socket.handshake.session.id)
|
||||
}
|
||||
}
|
||||
});
|
||||
io.on('disconnect', (socket) => {
|
||||
// @ts-ignore
|
||||
connectedUsers.delete(socket.handshake.session.id);
|
||||
operatorSessionIds = operatorSessionIds.filter(x => x !== socket.handshake.session.id)
|
||||
});
|
||||
|
||||
const redditUserMiddleware = async (req: express.Request, res: express.Response, next: Function) => {
|
||||
if (req.session.user === undefined) {
|
||||
return res.redirect('/login');
|
||||
}
|
||||
next();
|
||||
}
|
||||
|
||||
const booleanMiddle = (boolParams: string[] = []) => async (req: express.Request, res: express.Response, next: Function) => {
|
||||
if (req.query !== undefined) {
|
||||
for (const b of boolParams) {
|
||||
const bVal = req.query[b] as any;
|
||||
if (bVal !== undefined) {
|
||||
let truthyVal: boolean;
|
||||
if (bVal === 'true' || bVal === true || bVal === 1 || bVal === '1') {
|
||||
truthyVal = true;
|
||||
} else if (bVal === 'false' || bVal === false || bVal === 0 || bVal === '0') {
|
||||
truthyVal = false;
|
||||
} else {
|
||||
res.status(400);
|
||||
res.send(`Expected query parameter ${b} to be a truthy value. Got "${bVal}" but must be one of these: true/false, 1/0`);
|
||||
return;
|
||||
}
|
||||
// @ts-ignore
|
||||
req.query[b] = truthyVal;
|
||||
}
|
||||
}
|
||||
}
|
||||
next();
|
||||
}
|
||||
|
||||
app.getAsync('/logout', async (req, res) => {
|
||||
// @ts-ignore
|
||||
req.session.destroy();
|
||||
res.send('Bye!');
|
||||
})
|
||||
|
||||
app.getAsync('/login', async (req, res) => {
|
||||
if (redirectUri === undefined) {
|
||||
return res.render('error', {error: `No <b>redirectUri</b> was specified through environmental variables or program argument. This must be provided in order to use the web interface.`});
|
||||
}
|
||||
const authUrl = Snoowrap.getAuthUrl({
|
||||
clientId,
|
||||
scope: ['identity', 'mysubreddits'],
|
||||
redirectUri: redirectUri as string,
|
||||
permanent: false,
|
||||
});
|
||||
return res.redirect(authUrl);
|
||||
});
|
||||
|
||||
app.getAsync(/.*callback$/, async (req, res) => {
|
||||
const {error, code} = req.query as any;
|
||||
if (error !== undefined) {
|
||||
let errContent: string;
|
||||
switch (error) {
|
||||
case 'access_denied':
|
||||
errContent = 'You must <b>Allow</b> this application to connect in order to proceed.';
|
||||
break;
|
||||
default:
|
||||
errContent = error;
|
||||
}
|
||||
return res.render('error', {error: errContent, operatorDisplay: display});
|
||||
}
|
||||
const client = await Snoowrap.fromAuthCode({
|
||||
userAgent: `web:contextBot:web`,
|
||||
clientId,
|
||||
clientSecret,
|
||||
redirectUri: redirectUri as string,
|
||||
code: code as string,
|
||||
});
|
||||
// @ts-ignore
|
||||
const user = await client.getMe().name as string;
|
||||
const subs = await client.getModeratedSubreddits();
|
||||
|
||||
req.session['user'] = user;
|
||||
// @ts-ignore
|
||||
req.session['subreddits'] = opNames.includes(user.toLowerCase()) ? bot.subManagers.map(x => x.displayLabel) : subs.reduce((acc: string[], x) => {
|
||||
const sm = bot.subManagers.find(y => y.subreddit.display_name === x.display_name);
|
||||
if (sm !== undefined) {
|
||||
return acc.concat(sm.displayLabel);
|
||||
}
|
||||
return acc;
|
||||
}, []);
|
||||
req.session['lastCheck'] = dayjs().unix();
|
||||
res.redirect('/');
|
||||
});
|
||||
|
||||
    app.use('/', redditUserMiddleware);
    app.getAsync('/', async (req, res) => {
        const {
            subreddits = [],
            user: userVal,
            limit = 200,
            level = 'verbose',
            sort = 'descending',
            lastCheck
        } = req.session;
        const user = userVal as string;
        const isOperator = opNames.includes(user.toLowerCase());

        if ((req.session.subreddits as string[]).length === 0 && !isOperator) {
            return res.render('noSubs', {operatorDisplay: display});
        }

        const logs = filterLogBySubreddit(subLogMap, req.session.subreddits, {
            level,
            operator: isOperator,
            user,
            // @ts-ignore
            sort,
            limit
        });

        const subManagerData = [];
        for (const s of subreddits) {
            const m = bot.subManagers.find(x => x.displayLabel === s) as Manager;
            const sd = {
                name: s,
                //linkName: s.replace(/\W/g, ''),
                logs: logs.get(s) || [], // provide a default empty value in case we truly have not logged anything for this subreddit yet
                botState: m.botState,
                eventsState: m.eventsState,
                queueState: m.queueState,
                indicator: 'gray',
                queuedActivities: m.queue.length(),
                runningActivities: m.queue.running(),
                maxWorkers: m.queue.concurrency,
                subMaxWorkers: m.subMaxWorkers || bot.maxWorkers,
                globalMaxWorkers: bot.maxWorkers,
                validConfig: boolToString(m.validConfigLoaded),
                dryRun: boolToString(m.dryRun === true),
                pollingInfo: m.pollOptions.length === 0 ? ['nothing :('] : m.pollOptions.map(pollingInfo),
                checks: {
                    submissions: m.submissionChecks === undefined ? 0 : m.submissionChecks.length,
                    comments: m.commentChecks === undefined ? 0 : m.commentChecks.length,
                },
                wikiLocation: m.wikiLocation,
                wikiHref: `https://reddit.com/r/${m.subreddit.display_name}/wiki/${m.wikiLocation}`,
                wikiRevisionHuman: m.lastWikiRevision === undefined ? 'N/A' : `${dayjs.duration(dayjs().diff(m.lastWikiRevision)).humanize()} ago`,
                wikiRevision: m.lastWikiRevision === undefined ? 'N/A' : m.lastWikiRevision.local().format('MMMM D, YYYY h:mm A Z'),
                wikiLastCheckHuman: `${dayjs.duration(dayjs().diff(m.lastWikiCheck)).humanize()} ago`,
                wikiLastCheck: m.lastWikiCheck.local().format('MMMM D, YYYY h:mm A Z'),
                stats: await m.getStats(),
                startedAt: 'Not Started',
                startedAtHuman: 'Not Started',
                delayBy: m.delayBy === undefined ? 'No' : `Delayed by ${m.delayBy} sec`,
            };
            // TODO replace indicator data with js on client page
            let indicator;
            if (m.botState.state === RUNNING && m.queueState.state === RUNNING && m.eventsState.state === RUNNING) {
                indicator = 'green';
            } else if (m.botState.state === STOPPED && m.queueState.state === STOPPED && m.eventsState.state === STOPPED) {
                indicator = 'red';
            } else {
                indicator = 'yellow';
            }
            sd.indicator = indicator;
            if (m.startedAt !== undefined) {
                const dur = dayjs.duration(dayjs().diff(m.startedAt));
                sd.startedAtHuman = `${dur.humanize()} ago`;
                sd.startedAt = m.startedAt.local().format('MMMM D, YYYY h:mm A Z');

                if (sd.stats.cache.totalRequests > 0) {
                    const minutes = dur.asMinutes();
                    if (minutes < 10) {
                        sd.stats.cache.requestRate = formatNumber((10 / minutes) * sd.stats.cache.totalRequests, {
                            toFixed: 0,
                            round: {enable: true, indicate: true}
                        });
                    } else {
                        sd.stats.cache.requestRate = formatNumber(sd.stats.cache.totalRequests / (minutes / 10), {
                            toFixed: 0,
                            round: {enable: true, indicate: true}
                        });
                    }
                } else {
                    sd.stats.cache.requestRate = 0;
                }
            }
            subManagerData.push(sd);
        }
        const totalStats = subManagerData.reduce((acc, curr) => {
            return {
                checks: {
                    submissions: acc.checks.submissions + curr.checks.submissions,
                    comments: acc.checks.comments + curr.checks.comments,
                },
                eventsCheckedTotal: acc.eventsCheckedTotal + curr.stats.eventsCheckedTotal,
                checksRunTotal: acc.checksRunTotal + curr.stats.checksRunTotal,
                checksTriggeredTotal: acc.checksTriggeredTotal + curr.stats.checksTriggeredTotal,
                rulesRunTotal: acc.rulesRunTotal + curr.stats.rulesRunTotal,
                rulesCachedTotal: acc.rulesCachedTotal + curr.stats.rulesCachedTotal,
                rulesTriggeredTotal: acc.rulesTriggeredTotal + curr.stats.rulesTriggeredTotal,
                actionsRunTotal: acc.actionsRunTotal + curr.stats.actionsRunTotal,
                maxWorkers: acc.maxWorkers + curr.maxWorkers,
                subMaxWorkers: acc.subMaxWorkers + curr.subMaxWorkers,
                globalMaxWorkers: acc.globalMaxWorkers + curr.globalMaxWorkers,
                runningActivities: acc.runningActivities + curr.runningActivities,
                queuedActivities: acc.queuedActivities + curr.queuedActivities,
            };
        }, {
            checks: {
                submissions: 0,
                comments: 0,
            },
            eventsCheckedTotal: 0,
            checksRunTotal: 0,
            checksTriggeredTotal: 0,
            rulesRunTotal: 0,
            rulesCachedTotal: 0,
            rulesTriggeredTotal: 0,
            actionsRunTotal: 0,
            maxWorkers: 0,
            subMaxWorkers: 0,
            globalMaxWorkers: 0,
            runningActivities: 0,
            queuedActivities: 0,
        });
        const {
            checks,
            maxWorkers,
            globalMaxWorkers,
            subMaxWorkers,
            runningActivities,
            queuedActivities,
            ...rest
        } = totalStats;

        let cumRaw = subManagerData.reduce((acc, curr) => {
            Object.keys(curr.stats.cache.types as ResourceStats).forEach((k) => {
                acc[k].requests += curr.stats.cache.types[k].requests;
                acc[k].miss += curr.stats.cache.types[k].miss;
            });
            return acc;
        }, cacheStats());
        cumRaw = Object.keys(cumRaw).reduce((acc, curr) => {
            const per = acc[curr].miss === 0 ? 0 : formatNumber(acc[curr].miss / acc[curr].requests) * 100;
            // @ts-ignore
            acc[curr].missPercent = `${formatNumber(per, {toFixed: 0})}%`;
            return acc;
        }, cumRaw);
        const cacheReq = subManagerData.reduce((acc, curr) => acc + curr.stats.cache.totalRequests, 0);
        const cacheMiss = subManagerData.reduce((acc, curr) => acc + curr.stats.cache.totalMiss, 0);
        let allManagerData: any = {
            name: 'All',
            linkName: 'All',
            indicator: 'green',
            maxWorkers,
            globalMaxWorkers,
            subMaxWorkers,
            runningActivities,
            queuedActivities,
            botState: {
                state: RUNNING,
                causedBy: SYSTEM
            },
            dryRun: boolToString(bot.dryRun === true),
            logs: logs.get('all'),
            checks: checks,
            softLimit: bot.softLimit,
            hardLimit: bot.hardLimit,
            stats: {
                ...rest,
                cache: {
                    currentKeyCount: await bot.subManagers[0].resources.getCacheKeyCount(),
                    isShared: false,
                    totalRequests: cacheReq,
                    totalMiss: cacheMiss,
                    missPercent: `${formatNumber(cacheMiss === 0 || cacheReq === 0 ? 0 : (cacheMiss / cacheReq) * 100, {toFixed: 0})}%`,
                    types: {
                        ...cumRaw,
                    }
                }
            },
        };
        if (allManagerData.logs === undefined) {
            // this shouldn't happen but saw an edge case where it potentially did
            logger.warn(`Logs for 'all' were undefined but should always have a default empty value`);
        }
        // if(isOperator) {
        allManagerData.startedAt = bot.startedAt.local().format('MMMM D, YYYY h:mm A Z');
        allManagerData.heartbeatHuman = dayjs.duration({seconds: bot.heartbeatInterval}).humanize();
        allManagerData.heartbeat = bot.heartbeatInterval;
        allManagerData = {...allManagerData, ...opStats(bot)};
        //}

        const botDur = dayjs.duration(dayjs().diff(bot.startedAt));
        if (allManagerData.stats.cache.totalRequests > 0) {
            const minutes = botDur.asMinutes();
            if (minutes < 10) {
                allManagerData.stats.cache.requestRate = formatNumber((10 / minutes) * allManagerData.stats.cache.totalRequests, {
                    toFixed: 0,
                    round: {enable: true, indicate: true}
                });
            } else {
                allManagerData.stats.cache.requestRate = formatNumber(allManagerData.stats.cache.totalRequests / (minutes / 10), {
                    toFixed: 0,
                    round: {enable: true, indicate: true}
                });
            }
        } else {
            allManagerData.stats.cache.requestRate = 0;
        }

        const data = {
            userName: user,
            system: {
                startedAt: bot.startedAt.local().format('MMMM D, YYYY h:mm A Z'),
                ...opStats(bot),
            },
            subreddits: [allManagerData, ...subManagerData],
            show: 'All',
            botName: bot.botName,
            botLink: bot.botLink,
            operatorDisplay: display,
            isOperator,
            operators: opNames.length === 0 ? 'None Specified' : name.join(', '),
            logSettings: {
                //limit: [10, 20, 50, 100, 200].map(x => `<a class="capitalize ${limit === x ? 'font-bold no-underline pointer-events-none' : ''}" data-limit="${x}" href="logs/settings/update?limit=${x}">${x}</a>`).join(' | '),
                limitSelect: [10, 20, 50, 100, 200].map(x => `<option ${limit === x ? 'selected' : ''} class="capitalize ${limit === x ? 'font-bold' : ''}" data-value="${x}">${x}</option>`).join(' | '),
                //sort: ['ascending', 'descending'].map(x => `<a class="capitalize ${sort === x ? 'font-bold no-underline pointer-events-none' : ''}" data-sort="${x}" href="logs/settings/update?sort=${x}">${x}</a>`).join(' | '),
                sortSelect: ['ascending', 'descending'].map(x => `<option ${sort === x ? 'selected' : ''} class="capitalize ${sort === x ? 'font-bold' : ''}" data-value="${x}">${x}</option>`).join(' '),
                //level: availableLevels.map(x => `<a class="capitalize log-${x} ${level === x ? `font-bold no-underline pointer-events-none` : ''}" data-log="${x}" href="logs/settings/update?level=${x}">${x}</a>`).join(' | '),
                levelSelect: availableLevels.map(x => `<option ${level === x ? 'selected' : ''} class="capitalize log-${x} ${level === x ? `font-bold` : ''}" data-value="${x}">${x}</option>`).join(' '),
            },
        };
        if (req.query.sub !== undefined) {
            const encoded = encodeURI(req.query.sub as string).toLowerCase();
            const shouldShow = data.subreddits.find(x => x.name.toLowerCase() === encoded);
            if (shouldShow !== undefined) {
                data.show = shouldShow.name;
            }
        }

        res.render('status', data);
    });

    app.getAsync('/logs/settings/update', async function (req, res) {
        const e = req.query;
        for (const [setting, val] of Object.entries(req.query)) {
            switch (setting) {
                case 'limit':
                    req.session.limit = Number.parseInt(val as string);
                    break;
                case 'sort':
                    req.session.sort = val as string;
                    break;
                case 'level':
                    req.session.level = val as string;
                    break;
            }
        }
        const {limit = 200, level = 'verbose', sort = 'descending', user} = req.session;

        res.send('OK');

        const subMap = filterLogBySubreddit(subLogMap, req.session.subreddits, {
            level,
            operator: opNames.includes((user as string).toLowerCase()),
            user,
            limit,
            sort: (sort as 'descending' | 'ascending'),
        });
        const subArr: any = [];
        subMap.forEach((v: string[], k: string) => {
            subArr.push({name: k, logs: v.join('')});
        });
        io.emit('logClear', subArr);
    });

    app.use('/config', [redditUserMiddleware]);
    app.getAsync('/config', async (req, res) => {
        const {subreddit} = req.query as any;
        if (!(req.session.subreddits as string[]).includes(subreddit)) {
            return res.render('error', {error: 'Cannot retrieve the config for a subreddit you do not manage or that the bot does not run on', operatorDisplay: display});
        }
        const manager = bot.subManagers.find(x => x.displayLabel === subreddit);
        if (manager === undefined) {
            return res.render('error', {error: 'Cannot retrieve the config for a subreddit you do not manage or that the bot does not run on', operatorDisplay: display});
        }

        // @ts-ignore
        const wiki = await manager.subreddit.getWikiPage(manager.wikiLocation).fetch();
        const [obj, jsonErr, yamlErr] = parseFromJsonOrYamlToObject(wiki.content_md);
        res.render('config', {
            config: prettyPrintJson.toHtml(obj, {quoteKeys: true, indent: 2}),
            botName: bot.botName,
            botLink: bot.botLink,
            operatorDisplay: display,
        });
    });

    app.use('/action', [redditUserMiddleware, booleanMiddle(['force'])]);
    app.getAsync('/action', async (req, res) => {
        const {type, action, subreddit, force = false} = req.query as any;
        let subreddits: string[] = [];
        if (subreddit === 'All') {
            subreddits = req.session.subreddits as string[];
        } else if ((req.session.subreddits as string[]).includes(subreddit)) {
            subreddits = [subreddit];
        }

        for (const s of subreddits) {
            const manager = bot.subManagers.find(x => x.displayLabel === s);
            if (manager === undefined) {
                logger.warn(`Manager for ${s} does not exist`, {subreddit: `/u/${req.session.user}`});
                continue;
            }
            const mLogger = manager.logger;
            mLogger.info(`/u/${req.session.user} invoked '${action}' action for ${type} on ${manager.displayLabel}`);
            try {
                switch (action) {
                    case 'start':
                        if (type === 'bot') {
                            await manager.start('user');
                        } else if (type === 'queue') {
                            manager.startQueue('user');
                        } else {
                            await manager.startEvents('user');
                        }
                        break;
                    case 'stop':
                        if (type === 'bot') {
                            await manager.stop('user');
                        } else if (type === 'queue') {
                            await manager.stopQueue('user');
                        } else {
                            manager.stopEvents('user');
                        }
                        break;
                    case 'pause':
                        if (type === 'queue') {
                            await manager.pauseQueue('user');
                        } else {
                            manager.pauseEvents('user');
                        }
                        break;
                    case 'reload':
                        const prevQueueState = manager.queueState.state;
                        const newConfig = await manager.parseConfiguration('user', force);
                        if (newConfig === false) {
                            mLogger.info('Config was up-to-date');
                        }
                        if (newConfig && prevQueueState === RUNNING) {
                            await manager.startQueue(USER);
                        }
                        break;
                    case 'check':
                        if (type === 'unmoderated') {
                            const activities = await manager.subreddit.getUnmoderated({limit: 100});
                            for (const a of activities.reverse()) {
                                manager.queue.push({
                                    checkType: a instanceof Submission ? 'Submission' : 'Comment',
                                    activity: a,
                                });
                            }
                        } else {
                            const activities = await manager.subreddit.getModqueue({limit: 100});
                            for (const a of activities.reverse()) {
                                manager.queue.push({
                                    checkType: a instanceof Submission ? 'Submission' : 'Comment',
                                    activity: a,
                                });
                            }
                        }
                        break;
                }
            } catch (err) {
                if (!(err instanceof LoggedError)) {
                    mLogger.error(err, {subreddit: manager.displayLabel});
                }
            }
        }
        res.send('OK');
    });

    app.use('/check', [redditUserMiddleware, booleanMiddle(['dryRun'])]);
    app.getAsync('/check', async (req, res) => {
        const {url, dryRun, subreddit} = req.query as any;

        let a;
        const commentId = commentReg(url);
        if (commentId !== undefined) {
            // @ts-ignore
            a = await bot.client.getComment(commentId);
        }
        if (a === undefined) {
            const submissionId = submissionReg(url);
            if (submissionId !== undefined) {
                // @ts-ignore
                a = await bot.client.getSubmission(submissionId);
            }
        }

        if (a === undefined) {
            logger.error('Could not parse Comment or Submission ID from given URL', {subreddit: `/u/${req.session.user}`});
            return res.send('OK');
        } else {
            // @ts-ignore
            const activity = await a.fetch();
            const sub = await activity.subreddit.display_name;

            let manager = subreddit === 'All' ? bot.subManagers.find(x => x.subreddit.display_name === sub) : bot.subManagers.find(x => x.displayLabel === subreddit);

            if (manager === undefined || !(req.session.subreddits as string[]).includes(manager.displayLabel)) {
                let msg = 'Activity does not belong to a subreddit you moderate or the bot runs on.';
                if (subreddit === 'All') {
                    msg = `${msg} If you want to test an Activity against a Subreddit's config it does not belong to then switch to that Subreddit's tab first.`;
                }
                logger.error(msg, {subreddit: `/u/${req.session.user}`});
                return res.send('OK');
            }

            // will run dryrun if specified or if running activity on subreddit it does not belong to
            const dr: boolean | undefined = (dryRun || manager.subreddit.display_name !== sub) ? true : undefined;
            manager.logger.info(`/u/${req.session.user} running${dr === true ? ' DRY RUN ' : ' '}check on${manager.subreddit.display_name !== sub ? ' FOREIGN ACTIVITY ' : ' '}${url}`);
            await manager.runChecks(activity instanceof Submission ? 'Submission' : 'Comment', activity, {dryRun: dr});
        }
        res.send('OK');
    });

    setInterval(() => {
        // refresh op stats every 30 seconds
        io.emit('opStats', opStats(bot));
        // if (operatorSessionId !== undefined) {
        //     io.to(operatorSessionId).emit('opStats', opStats(bot));
        // }
    }, 30000);

    emitter.on('log', (log) => {
        const emittedSessions = [];
        const subName = parseSubredditLogName(log);
        if (subName !== undefined) {
            for (const [id, info] of connectedUsers) {
                const {subreddits, level = 'verbose', user} = info;
                if (isLogLineMinLevel(log, level) && (subreddits.includes(subName) || subName.includes(user))) {
                    emittedSessions.push(id);
                    io.to(id).emit('log', formatLogLineToHtml(log));
                }
            }
        }
        if (operatorSessionIds.length > 0) {
            for (const id of operatorSessionIds) {
                io.to(id).emit('opStats', opStats(bot));
                if (subName === undefined || !emittedSessions.includes(id)) {
                    const {level = 'verbose'} = connectedUsers.get(id) || {};
                    if (isLogLineMinLevel(log, level)) {
                        io.to(id).emit('log', formatLogLineToHtml(log));
                    }
                }
            }
        }
    });

    await bot.runManagers();
}

return [serverFunc, bot];
};

const opStats = (bot: App) => {
    const limitReset = dayjs(bot.client.ratelimitExpiration);
    const nextHeartbeat = bot.nextHeartbeat !== undefined ? bot.nextHeartbeat.local().format('MMMM D, YYYY h:mm A Z') : 'N/A';
    const nextHeartbeatHuman = bot.nextHeartbeat !== undefined ? `in ${dayjs.duration(bot.nextHeartbeat.diff(dayjs())).humanize()}` : 'N/A';
    return {
        startedAtHuman: `${dayjs.duration(dayjs().diff(bot.startedAt)).humanize()}`,
        nextHeartbeat,
        nextHeartbeatHuman,
        apiLimit: bot.client.ratelimitRemaining,
        apiAvg: formatNumber(bot.apiRollingAvg),
        nannyMode: bot.nannyMode || 'Off',
        apiDepletion: bot.apiEstDepletion === undefined ? 'Not Calculated' : bot.apiEstDepletion.humanize(),
        limitReset,
        limitResetHuman: `in ${dayjs.duration(limitReset.diff(dayjs())).humanize()}`,
    };
};

export default rcbServer;
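The status route above normalizes cache request counts to a per-10-minute rate, extrapolating upward when uptime is under 10 minutes. A minimal standalone sketch of that normalization (the helper name is hypothetical; the real code routes the value through `formatNumber` with rounding options):

```typescript
// Hypothetical helper mirroring the requestRate calculation used for cache stats:
// under 10 minutes of uptime the count is extrapolated up to a full 10-minute
// window; past that, the total is divided by the number of 10-minute windows elapsed.
function requestRatePer10Min(totalRequests: number, uptimeMinutes: number): number {
    if (totalRequests <= 0) {
        return 0;
    }
    if (uptimeMinutes < 10) {
        return Math.round((10 / uptimeMinutes) * totalRequests);
    }
    return Math.round(totalRequests / (uptimeMinutes / 10));
}
```

For example, 50 requests over 5 minutes of uptime extrapolates to 100 per 10-minute window, while 200 requests over 20 minutes averages down to 100.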
@@ -1,65 +0,0 @@
<html>
<head>
    <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/tailwindcss/2.0.3/tailwind.min.css"
          integrity="sha512-wl80ucxCRpLkfaCnbM88y4AxnutbGk327762eM9E/rRTvY/ZGAHWMZrYUq66VQBYMIYDFpDdJAOGSLyIPHZ2IQ=="
          crossorigin="anonymous"/>
    <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/tailwindcss/2.0.3/tailwind-dark.min.css"
          integrity="sha512-WvyKyiVHgInX5UQt67447ExtRRZG/8GUijaq1MpqTNYp8wY4/EJOG5bI80sRp/5crDy4Z6bBUydZI2OFV3Vbtg=="
          crossorigin="anonymous"/>
    <script src="https://code.iconify.design/1/1.0.4/iconify.min.js"></script>
    <link rel="stylesheet" href="public/themeToggle.css">
    <link rel="stylesheet" href="public/app.css">
    <link rel="stylesheet" href="public/json.css">
    <title>CM for <%= botName %></title>
    <!--<title><%# `CM for /u/${botName}`%></title>-->
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width,initial-scale=1.0">
    <!--icons from https://heroicons.com -->
</head>
<body style="user-select: none;" class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 bg-gray-100 dark:bg-gray-800 font-sans">
    <%- include('partials/authTitle') %>
    <div class="container mx-auto">
        <div class="grid">
            <div class="bg-white dark:bg-gray-700 dark:text-white">
                <div class="p-6 md:px-10 md:py-6 space-y-3">
                    <div>Note: Comments have been removed</div>
                    <pre style="user-select: text;"><%- config %></pre>
                </div>
            </div>
        </div>
    </div>
</div>
<script>
    document.querySelectorAll('.theme').forEach(el => {
        el.addEventListener('click', e => {
            e.preventDefault();
            if (e.target.id === 'dark') {
                document.body.classList.add('dark');
                localStorage.setItem('ms-dark', 'yes');
            } else {
                document.body.classList.remove('dark');
                localStorage.setItem('ms-dark', 'no');
            }
            document.querySelectorAll('.theme').forEach(el => {
                el.classList.remove('font-bold', 'no-underline', 'pointer-events-none');
            });
            e.target.classList.add('font-bold', 'no-underline', 'pointer-events-none');
        })
    })

    document.querySelector("#themeToggle").checked = localStorage.getItem('ms-dark') !== 'no';
    document.querySelector("#themeToggle").onchange = (e) => {
        if (e.target.checked === true) {
            document.body.classList.add('dark');
            localStorage.setItem('ms-dark', 'yes');
        } else {
            document.body.classList.remove('dark');
            localStorage.setItem('ms-dark', 'no');
        }
    }
</script>
</body>
</html>
@@ -1,60 +0,0 @@
<html>
<%- include('partials/head', {title: 'CM OAuth Helper'}) %>
<body class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 bg-gray-100 dark:bg-gray-800 font-sans">
    <%- include('partials/title', {title: ' OAuth Helper'}) %>
    <div class="container mx-auto">
        <div class="grid">
            <div class="bg-white dark:bg-gray-500 dark:text-white">
                <div class="p-6 md:px-10 md:py-6">
                    <div class="text-xl mb-4">Hi! Looks like you're setting up your bot. To get running:</div>
                    <div class="text-lg text-semibold my-3">1. Set your redirect URL</div>
                    <input id="redirectUri" style="min-width:500px;"
                           class="text-black placeholder-gray-500 rounded mt-2 mb-3 p-2" value="<%= redirectUri %>">
                    <div class="my-2">
                        <input type="checkbox" id="wikiedit" name="wikiedit" checked>
                        <label for="wikiedit">Include <span class="font-mono">wikiedit</span> permission for Toolbox User Notes</label>
                    </div>
                    <div class="space-y-3">
                        <div>This is the URL Reddit will redirect you to once you have authorized an account to be used with your application.</div>
                        <div>The input field has been pre-filled with either:
                            <ul class="list-inside list-disc">
                                <li>What you provided to the program as an argument/environmental variable, or</li>
                                <li>The current URL in your browser that would be used -- if you are using a reverse proxy this may be different, so double check</li>
                            </ul>
                        </div>
                        <div>Make sure it matches what is found in the <b>redirect uri</b> for your <a target="_blank" href="https://www.reddit.com/prefs/apps">application on Reddit</a> and <b>it must end with "callback"</b></div>
                    </div>
                    <div class="text-lg text-semibold my-3">2. Login to Reddit with the account that will be the bot</div>
                    Protip: Login to Reddit in an Incognito session, then open this URL in a new tab.
                    <div class="text-lg text-semibold my-3">3. <a id="doAuth" href="">Authorize your bot account</a></div>
                </div>
            </div>
        </div>
    </div>
</div>
<script>
    if (document.querySelector('#redirectUri').value === '') {
        document.querySelector('#redirectUri').value = `${document.location.href}callback`;
    }

    document.querySelector('#doAuth').addEventListener('click', e => {
        e.preventDefault();
        const wikiEdit = document.querySelector('#wikiedit').checked ? 1 : 0;
        const url = `${document.location.href}auth?redirect=${document.querySelector('#redirectUri').value}&wikiEdit=${wikiEdit}`;
        window.location.href = url;
    })
</script>
</body>
</html>
@@ -1,13 +0,0 @@
<div class="space-x-4 p-6 md:px-10 md:py-6 leading-6 font-semibold bg-gray-800 text-white">
    <div class="container mx-auto">
        <div class="flex items-center justify-between">
            <div class="flex items-center flex-grow pr-4">
                <div class="px-4 width-full relative">
                    <div><a href="https://github.com/FoxxMD/context-mod">ContextMod</a> <%= title %></div>
                </div>
            </div>
            <div class="flex items-center flex-end text-sm">
            </div>
        </div>
    </div>
</div>
@@ -5,23 +5,26 @@ import {CommentCheck} from "../Check/CommentCheck";
import {
    cacheStats,
    createRetryHandler,
    determineNewResults, formatNumber,
    mergeArr, parseFromJsonOrYamlToObject, pollingInfo, sleep, totalFromMapStats,
    determineNewResults, findLastIndex, formatNumber,
    mergeArr, parseFromJsonOrYamlToObject, pollingInfo, resultsSummary, sleep, totalFromMapStats, triggeredIndicator,
} from "../util";
import {Poll} from "snoostorm";
import pEvent from "p-event";
import {RuleResult} from "../Rule";
import {ConfigBuilder, buildPollingOptions} from "../ConfigBuilder";
import {
    ActionedEvent,
    ActionResult,
    DEFAULT_POLLING_INTERVAL,
    DEFAULT_POLLING_LIMIT, Invokee,
    ManagerOptions, ManagerStateChangeOption, PAUSED,
    PollingOptionsStrong, RUNNING, RunState, STOPPED, SYSTEM, USER
    PollingOptionsStrong, ResourceStats, RUNNING, RunState, STOPPED, SYSTEM, USER
} from "../Common/interfaces";
import Submission from "snoowrap/dist/objects/Submission";
import {activityIsRemoved, itemContentPeek} from "../Utils/SnoowrapUtils";
import LoggedError from "../Utils/LoggedError";
import ResourceManager, {
import {
    BotResourcesManager,
    SubredditResourceConfig,
    SubredditResources,
    SubredditResourceSetOptions
@@ -35,6 +38,7 @@ import {queue, QueueObject} from 'async';
import {JSONConfig} from "../JsonConfig";
import {CheckStructuredJson} from "../Check";
import NotificationManager from "../Notification/NotificationManager";
import action from "../Web/Server/routes/authenticated/user/action";

export interface RunningState {
    state: RunState,
@@ -45,6 +49,7 @@ export interface runCheckOptions {
    checkNames?: string[],
    delayUntil?: number,
    dryRun?: boolean,
    refresh?: boolean,
}

export interface CheckTask {
@@ -60,6 +65,45 @@ export interface RuntimeManagerOptions extends ManagerOptions {
    maxWorkers: number;
}

export interface ManagerStats {
    eventsCheckedTotal: number
    eventsCheckedSinceStartTotal: number
    eventsAvg: number
    checksRunTotal: number
    checksRunSinceStartTotal: number
    checksTriggered: number
    checksTriggeredTotal: number
    checksTriggeredSinceStart: number
    checksTriggeredSinceStartTotal: number
    rulesRunTotal: number
    rulesRunSinceStartTotal: number
    rulesCachedTotal: number
    rulesCachedSinceStartTotal: number
    rulesTriggeredTotal: number
    rulesTriggeredSinceStartTotal: number
    rulesAvg: number
    actionsRun: number
    actionsRunTotal: number
    actionsRunSinceStart: number,
    actionsRunSinceStartTotal: number
    cache: {
        provider: string,
        currentKeyCount: number,
        isShared: boolean,
        totalRequests: number,
        totalMiss: number,
        missPercent: string,
        requestRate: number,
        types: ResourceStats
    },
}

interface QueuedIdentifier {
    id: string,
    shouldRefresh: boolean
    state: 'queued' | 'processing'
}

export class Manager {
    subreddit: Subreddit;
    client: Snoowrap;
@@ -79,9 +123,19 @@ export class Manager {
    modStreamCallbacks: Map<string, any> = new Map();
    dryRun?: boolean;
    sharedModqueue: boolean;
    cacheManager: BotResourcesManager;
    globalDryRun?: boolean;
    emitter: EventEmitter = new EventEmitter();
    queue: QueueObject<CheckTask>;
    // firehose is used to ensure all activities from different polling streams are unique
    // that is -- if the same activity is in both modqueue and unmoderated we don't want to process it twice or use stale data
    //
    // so all activities get queued to firehose, which keeps track of items by id (using queuedItemsMeta)
    // and ensures that if any activities are ingested while they are ALSO currently queued or being worked on they are properly handled by either
    // 1) if queued, do not re-queue but instead tell the worker to refresh before processing
    // 2) if currently processing then re-queue but also refresh before processing
    firehose: QueueObject<CheckTask>;
    queuedItemsMeta: QueuedIdentifier[] = [];
    globalMaxWorkers: number;
    subMaxWorkers?: number;

@@ -130,8 +184,9 @@ export class Manager {
|
||||
rulesUniqueRollingAvg: number = 0;
|
||||
actionsRun: Map<string, number> = new Map();
|
||||
actionsRunSinceStart: Map<string, number> = new Map();
|
||||
actionedEvents: ActionedEvent[] = [];
|
||||
|
||||
getStats = async () => {
|
||||
getStats = async (): Promise<ManagerStats> => {
|
||||
const data: any = {
|
||||
eventsCheckedTotal: this.eventsCheckedTotal,
|
||||
eventsCheckedSinceStartTotal: this.eventsCheckedSinceStartTotal,
|
||||
@@ -166,7 +221,7 @@ export class Manager {
|
||||
};
|
||||
|
||||
if (this.resources !== undefined) {
|
||||
const resStats = this.resources.getStats();
|
||||
const resStats = await this.resources.getStats();
|
||||
|
||||
data.cache = resStats.cache;
|
||||
data.cache.currentKeyCount = await this.resources.getCacheKeyCount();
|
||||
@@ -184,7 +239,7 @@ export class Manager {
|
||||
return this.displayLabel;
|
||||
}
|
||||
|
||||
constructor(sub: Subreddit, client: Snoowrap, logger: Logger, opts: RuntimeManagerOptions = {botName: 'ContextMod', maxWorkers: 1}) {
|
||||
constructor(sub: Subreddit, client: Snoowrap, logger: Logger, cacheManager: BotResourcesManager, opts: RuntimeManagerOptions = {botName: 'ContextMod', maxWorkers: 1}) {
|
||||
const {dryRun, sharedModqueue = false, wikiLocation = 'botconfig/contextbot', botName, maxWorkers} = opts;
|
||||
this.displayLabel = opts.nickname || `${sub.display_name_prefixed}`;
|
||||
const getLabels = this.getCurrentLabels;
|
||||
@@ -207,9 +262,11 @@ export class Manager {
|
||||
this.botName = botName;
|
||||
this.globalMaxWorkers = maxWorkers;
|
||||
this.notificationManager = new NotificationManager(this.logger, this.subreddit, this.displayLabel, botName);
|
||||
this.cacheManager = cacheManager;
|
||||
|
||||
this.queue = this.generateQueue(this.getMaxWorkers(this.globalMaxWorkers));
|
||||
this.queue.pause();
|
||||
this.firehose = this.generateFirehose();
|
||||
|
||||
this.eventsSampleInterval = setInterval((function(self) {
|
||||
return function() {
|
||||
@@ -270,6 +327,32 @@ export class Manager {
|
||||
return maxWorkers;
|
||||
}
|
||||
|
||||
protected generateFirehose() {
|
||||
return queue(async (task: CheckTask, cb) => {
|
||||
// items in queuedItemsMeta will be processing FIFO so earlier elements (by index) are older
|
||||
//
|
||||
// if we insert the same item again because it is currently being processed AND THEN we get the item AGAIN we only want to update the newest meta
|
||||
// so search the array backwards to get the neweset only
|
||||
const queuedItemIndex = findLastIndex(this.queuedItemsMeta, x => x.id === task.activity.id);
|
||||
if(queuedItemIndex !== -1) {
|
||||
const itemMeta = this.queuedItemsMeta[queuedItemIndex];
|
||||
let msg = `Item ${itemMeta.id} is already ${itemMeta.state}.`;
|
||||
if(itemMeta.state === 'queued') {
|
||||
this.logger.debug(`${msg} Flagging to refresh data before processing.`);
|
||||
this.queuedItemsMeta.splice(queuedItemIndex, 1, {...itemMeta, shouldRefresh: true});
|
||||
} else {
|
||||
this.logger.debug(`${msg} Re-queuing item but will also refresh data before processing.`);
|
||||
this.queuedItemsMeta.push({id: task.activity.id, shouldRefresh: true, state: 'queued'});
|
||||
this.queue.push(task);
|
||||
}
|
||||
} else {
|
||||
this.queuedItemsMeta.push({id: task.activity.id, shouldRefresh: false, state: 'queued'});
|
||||
this.queue.push(task);
|
||||
}
|
||||
}
|
||||
, 1);
|
||||
}

protected generateQueue(maxWorkers: number) {
if (maxWorkers > 1) {
this.logger.warn(`Setting max queue workers above 1 (specified: ${maxWorkers}) may have detrimental effects on log readability and API usage. Consult the documentation before using this advanced/experimental feature.`);
@@ -280,7 +363,16 @@ export class Manager {
this.logger.debug(`SOFT API LIMIT MODE: Delaying Event run by ${this.delayBy} seconds`);
await sleep(this.delayBy * 1000);
}
await this.runChecks(task.checkType, task.activity, task.options);

const queuedItemIndex = this.queuedItemsMeta.findIndex(x => x.id === task.activity.id);
try {
const itemMeta = this.queuedItemsMeta[queuedItemIndex];
this.queuedItemsMeta.splice(queuedItemIndex, 1, {...itemMeta, state: 'processing'});
await this.runChecks(task.checkType, task.activity, {...task.options, refresh: itemMeta.shouldRefresh});
} finally {
// always remove item meta regardless of success or failure since we are done with it meow
this.queuedItemsMeta.splice(queuedItemIndex, 1);
}
}
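The worker-side half of the meta lifecycle is the try/finally above: mark the entry as `processing`, run checks with the refresh flag, and always drop the meta entry afterwards. A synchronous sketch (the real worker is async and runs inside the `async.queue` callback):

```typescript
type Meta = { id: string, state: 'queued' | 'processing', shouldRefresh: boolean };

const queuedItemsMeta: Meta[] = [];

// Worker-side sketch: flip the entry to 'processing', run, and always drop the
// meta entry afterwards -- even when processing throws -- mirroring the
// try/finally in generateQueue.
function processItem(id: string, run: (refresh: boolean) => void): void {
    const idx = queuedItemsMeta.findIndex(x => x.id === id);
    try {
        const itemMeta = queuedItemsMeta[idx];
        queuedItemsMeta.splice(idx, 1, {...itemMeta, state: 'processing'});
        run(itemMeta.shouldRefresh);
    } finally {
        // stale meta must never survive, or future ingests of the same
        // activity would be treated as duplicates forever
        queuedItemsMeta.splice(idx, 1);
    }
}
```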
, maxWorkers);
q.error((err, task) => {
@@ -342,9 +434,10 @@ export class Manager {
footer,
logger: this.logger,
subreddit: this.subreddit,
caching
caching,
client: this.client,
};
this.resources = ResourceManager.set(this.subreddit.display_name, resourceConfig);
this.resources = this.cacheManager.set(this.subreddit.display_name, resourceConfig);
this.resources.setLogger(this.logger);

this.logger.info('Subreddit-specific options updated');
@@ -358,7 +451,9 @@ export class Manager {
...jCheck,
dryRun: this.dryRun || jCheck.dryRun,
logger: this.logger,
subredditName: this.subreddit.display_name
subredditName: this.subreddit.display_name,
resources: this.resources,
client: this.client,
};
if (jCheck.kind === 'comment') {
commentChecks.push(new CommentCheck(checkConfig));
@@ -473,8 +568,11 @@ export class Manager {
checkNames = [],
delayUntil,
dryRun,
refresh = false,
} = options || {};

let wasRefreshed = false;

if (delayUntil !== undefined) {
const created = dayjs.unix(item.created_utc);
const diff = dayjs().diff(created, 's');
@@ -483,8 +581,16 @@ export class Manager {
await sleep(delayUntil - diff);
// @ts-ignore
item = await activity.refresh();
wasRefreshed = true;
}
}
// refresh signal from firehose if activity was ingested multiple times before processing or re-queued while processing
// we want to make sure we have the most recent data
if(!wasRefreshed && refresh === true) {
this.logger.verbose('Refreshed data (probably due to signal from firehose)');
// @ts-ignore
item = await activity.refresh();
}

const startingApiLimit = this.client.ratelimitRemaining;

@@ -501,10 +607,23 @@ export class Manager {
let checksRun = 0;
let actionsRun = 0;
let totalRulesRun = 0;
let runActions: Action[] = [];
let runActions: ActionResult[] = [];
let actionedEvent: ActionedEvent = {
subreddit: this.subreddit.display_name_prefixed,
activity: {
peek: ePeek,
link: item.permalink
},
author: item.author.name,
timestamp: Date.now(),
check: '',
ruleSummary: '',
ruleResults: [],
actionResults: [],
}
let triggered = false;

try {
let triggered = false;
for (const check of checks) {
if (checkNames.length > 0 && !checkNames.map(x => x.toLowerCase()).some(x => x === check.name.toLowerCase())) {
this.logger.warn(`Check ${check.name} not in array of requested checks to run, skipping...`);
@@ -516,14 +635,22 @@ export class Manager {
}
checksRun++;
triggered = false;
let isFromCache = false;
let currentResults: RuleResult[] = [];
try {
const [checkTriggered, checkResults] = await check.runRules(item, allRuleResults);
await check.setCacheResult(item, checkTriggered);
const [checkTriggered, checkResults, fromCache = false] = await check.runRules(item, allRuleResults);
isFromCache = fromCache;
if(!fromCache) {
await check.setCacheResult(item, {result: checkTriggered, ruleResults: checkResults});
}
currentResults = checkResults;
totalRulesRun += checkResults.length;
allRuleResults = allRuleResults.concat(determineNewResults(allRuleResults, checkResults));
triggered = checkTriggered;
if(triggered && fromCache && !check.cacheUserResult.runActions) {
this.logger.info('Check was triggered but cache result options specified NOT to run actions...counting as check NOT triggered');
triggered = false;
}
} catch (e) {
if (e.logged !== true) {
this.logger.warn(`Running rules for Check ${check.name} failed due to uncaught exception`, e);
@@ -531,13 +658,20 @@ export class Manager {
}

if (triggered) {
actionedEvent.check = check.name;
actionedEvent.ruleResults = currentResults;
if(isFromCache) {
actionedEvent.ruleSummary = `Check result was found in cache: ${triggeredIndicator(true)}`;
} else {
actionedEvent.ruleSummary = resultsSummary(currentResults, check.condition);
}
this.checksTriggered.set(check.name, (this.checksTriggered.get(check.name) || 0) + 1);
this.checksTriggeredSinceStart.set(check.name, (this.checksTriggeredSinceStart.get(check.name) || 0) + 1);
runActions = await check.runActions(item, currentResults.filter(x => x.triggered), dryRun);
actionsRun = runActions.length;

if(check.notifyOnTrigger) {
const ar = runActions.map(x => x.getActionUniqueName()).join(', ');
const ar = runActions.map(x => x.name).join(', ');
this.notificationManager.handle('eventActioned', 'Check Triggered', `Check "${check.name}" was triggered on Event: \n\n ${ePeek} \n\n with the following actions run: ${ar}`);
}
break;
@@ -567,9 +701,13 @@ export class Manager {
this.rulesTriggeredSinceStartTotal += triggeredRulesTotal;

for (const a of runActions) {
const name = a.getActionUniqueName();
const name = a.name;
this.actionsRun.set(name, (this.actionsRun.get(name) || 0) + 1);
this.actionsRunSinceStart.set(name, (this.actionsRunSinceStart.get(name) || 0) + 1)
this.actionsRunSinceStart.set(name, (this.actionsRunSinceStart.get(name) || 0) + 1);
}
actionedEvent.actionResults = runActions;
if(triggered) {
await this.resources.addActionedEvent(actionedEvent);
}

this.logger.verbose(`Run Stats: Checks ${checksRun} | Rules => Total: ${totalRulesRun} Unique: ${allRuleResults.length} Cached: ${totalRulesRun - allRuleResults.length} Rolling Avg: ~${formatNumber(this.rulesUniqueRollingAvg)}/s | Actions ${actionsRun}`);
@@ -604,7 +742,7 @@ export class Manager {
if (limit === DEFAULT_POLLING_LIMIT && interval === DEFAULT_POLLING_INTERVAL && this.sharedModqueue) {
modStreamType = 'unmoderated';
// use default mod stream from resources
stream = ResourceManager.modStreams.get('unmoderated') as SPoll<Snoowrap.Submission | Snoowrap.Comment>;
stream = this.cacheManager.modStreams.get('unmoderated') as SPoll<Snoowrap.Submission | Snoowrap.Comment>;
} else {
stream = new UnmoderatedStream(this.client, {
subreddit: this.subreddit.display_name,
@@ -617,7 +755,7 @@ export class Manager {
if (limit === DEFAULT_POLLING_LIMIT && interval === DEFAULT_POLLING_INTERVAL) {
modStreamType = 'modqueue';
// use default mod stream from resources
stream = ResourceManager.modStreams.get('modqueue') as SPoll<Snoowrap.Submission | Snoowrap.Comment>;
stream = this.cacheManager.modStreams.get('modqueue') as SPoll<Snoowrap.Submission | Snoowrap.Comment>;
} else {
stream = new ModQueueStream(this.client, {
subreddit: this.subreddit.display_name,
@@ -671,7 +809,7 @@ export class Manager {
checkType = 'Comment';
}
if (checkType !== undefined) {
this.queue.push({checkType, activity: item, options: {delayUntil}})
this.firehose.push({checkType, activity: item, options: {delayUntil}})
}
};

@@ -697,29 +835,6 @@ export class Manager {
}
}

async handle(): Promise<void> {
if (this.submissionChecks.length === 0 && this.commentChecks.length === 0) {
this.logger.warn('No submission or comment checks to run! Bot will not run.');
return;
}

try {
for (const s of this.streams) {
s.startInterval();
}
this.startedAt = dayjs();
this.running = true;
this.manuallyStopped = false;
this.logger.info('Bot Running');

await pEvent(this.emitter, 'end');
} catch (err) {
this.logger.error('Too many request errors occurred or an unhandled error was encountered, manager is stopping');
} finally {
this.stop();
}
}

startQueue(causedBy: Invokee = 'system', options?: ManagerStateChangeOption) {
const {reason, suppressNotification = false} = options || {};
if(this.queueState.state === RUNNING) {
@@ -727,14 +842,19 @@ export class Manager {
} else if (!this.validConfigLoaded) {
this.logger.warn('Cannot start activity processing queue while manager has an invalid configuration');
} else {
if(this.queueState.state === STOPPED) {
// extra precaution to make sure queue meta is cleared before starting queue
this.queuedItemsMeta = [];
}
this.queue.resume();
this.firehose.resume();
this.logger.info(`Activity processing queue started RUNNING with ${this.queue.length()} queued activities`);
this.queueState = {
state: RUNNING,
causedBy
}
if(!suppressNotification) {
this.notificationManager.handle('runStateChanged', 'Queue Started', reason, causedBy)
this.notificationManager.handle('runStateChanged', 'Queue Started', reason, causedBy);
}
}
}
@@ -796,7 +916,9 @@ export class Manager {
this.logger.verbose(`Activity processing queue is stopping...waiting for ${this.queue.running()} activities to finish processing`);
}
this.logger.info(`Activity processing queue stopped by ${causedBy} and ${this.queue.length()} queued activities cleared (waited ${dayjs().diff(pauseWaitStart, 's')} seconds while activity processing finished)`);
this.firehose.kill();
this.queue.kill();
this.queuedItemsMeta = [];
}

this.queueState = {
@@ -876,7 +998,7 @@ export class Manager {
this.streams = [];
for (const [k, v] of this.modStreamCallbacks) {
const stream = ResourceManager.modStreams.get(k) as Poll<Snoowrap.Submission | Snoowrap.Comment>;
const stream = this.cacheManager.modStreams.get(k) as Poll<Snoowrap.Submission | Snoowrap.Comment>;
stream.removeListener('item', v);
}
this.startedAt = undefined;

@@ -1,29 +1,30 @@
import Snoowrap, {RedditUser} from "snoowrap";
import Snoowrap, {RedditUser, Subreddit} from "snoowrap";
import objectHash from 'object-hash';
import {
activityIsDeleted, activityIsFiltered,
activityIsRemoved,
AuthorActivitiesOptions,
AuthorTypedActivitiesOptions, BOT_LINK,
getAuthorActivities, singleton,
getAuthorActivities,
testAuthorCriteria
} from "../Utils/SnoowrapUtils";
import Subreddit from 'snoowrap/dist/objects/Subreddit';
import winston, {Logger} from "winston";
import fetch from 'node-fetch';
import {
buildCacheOptionsFromProvider,
cacheStats, createCacheManager,
formatNumber,
asSubmission,
buildCacheOptionsFromProvider, buildCachePrefix,
cacheStats, comparisonTextOp, createCacheManager,
formatNumber, getActivityAuthorName, getActivitySubredditName, isStrongSubredditState,
mergeArr,
parseExternalUrl,
parseWikiContext
parseExternalUrl, parseGenericValueComparison,
parseWikiContext, toStrongSubredditState
} from "../util";
import LoggedError from "../Utils/LoggedError";
import {
BotInstanceConfig,
CacheOptions, CommentState,
Footer, OperatorConfig, ResourceStats, SubmissionState,
SubredditCacheConfig, TTLConfig, TypedActivityStates
Footer, OperatorConfig, ResourceStats, StrongCache, SubmissionState,
CacheConfig, TTLConfig, TypedActivityStates, UserResultCache, ActionedEvent, SubredditState, StrongSubredditState
} from "../Common/interfaces";
import UserNotes from "./UserNotes";
import Mustache from "mustache";
@@ -33,13 +34,15 @@ import {SPoll} from "./Streams";
import {Cache} from 'cache-manager';
import {Submission, Comment} from "snoowrap/dist/objects";
import {cacheTTLDefaults} from "../Common/defaults";
import {check} from "tcp-port-used";

export const DEFAULT_FOOTER = '\r\n*****\r\nThis action was performed by [a bot.]({{botLink}}) Mention a moderator or [send a modmail]({{modmailLink}}) if you have any ideas, questions, or concerns about this action.';

export interface SubredditResourceConfig extends Footer {
caching?: SubredditCacheConfig,
caching?: CacheConfig,
subreddit: Subreddit,
logger: Logger;
client: Snoowrap
}

interface SubredditResourceOptions extends Footer {
@@ -49,28 +52,35 @@ interface SubredditResourceOptions extends Footer {
cacheSettingsHash: string
subreddit: Subreddit,
logger: Logger;
client: Snoowrap;
prefix?: string;
actionedEventsMax: number;
}

export interface SubredditResourceSetOptions extends SubredditCacheConfig, Footer {
export interface SubredditResourceSetOptions extends CacheConfig, Footer {
}

export class SubredditResources {
//enabled!: boolean;
protected useSubredditAuthorCache!: boolean;
protected authorTTL: number = cacheTTLDefaults.authorTTL;
protected wikiTTL: number = cacheTTLDefaults.wikiTTL;
protected submissionTTL: number = cacheTTLDefaults.submissionTTL;
protected commentTTL: number = cacheTTLDefaults.commentTTL;
protected filterCriteriaTTL: number = cacheTTLDefaults.filterCriteriaTTL;
protected authorTTL: number | false = cacheTTLDefaults.authorTTL;
protected subredditTTL: number | false = cacheTTLDefaults.subredditTTL;
protected wikiTTL: number | false = cacheTTLDefaults.wikiTTL;
protected submissionTTL: number | false = cacheTTLDefaults.submissionTTL;
protected commentTTL: number | false = cacheTTLDefaults.commentTTL;
protected filterCriteriaTTL: number | false = cacheTTLDefaults.filterCriteriaTTL;
name: string;
protected logger: Logger;
userNotes: UserNotes;
footer: false | string = DEFAULT_FOOTER;
subreddit: Subreddit
client: Snoowrap
cache: Cache
cacheType: string
cacheSettingsHash?: string;
pruneInterval?: any;
prefix?: string
actionedEventsMax: number;

stats: { cache: ResourceStats };

@@ -83,22 +93,34 @@ export class SubredditResources {
authorTTL,
wikiTTL,
filterCriteriaTTL,
submissionTTL,
commentTTL,
subredditTTL,
},
cache,
prefix,
cacheType,
actionedEventsMax,
cacheSettingsHash,
client,
} = options || {};

this.cacheSettingsHash = cacheSettingsHash;
this.cache = cache;
this.prefix = prefix;
this.client = client;
this.cacheType = cacheType;
this.authorTTL = authorTTL;
this.wikiTTL = wikiTTL;
this.filterCriteriaTTL = filterCriteriaTTL;
this.actionedEventsMax = actionedEventsMax;
this.authorTTL = authorTTL === true ? 0 : authorTTL;
this.submissionTTL = submissionTTL === true ? 0 : submissionTTL;
this.commentTTL = commentTTL === true ? 0 : commentTTL;
this.subredditTTL = subredditTTL === true ? 0 : subredditTTL;
this.wikiTTL = wikiTTL === true ? 0 : wikiTTL;
this.filterCriteriaTTL = filterCriteriaTTL === true ? 0 : filterCriteriaTTL;
this.subreddit = subreddit;
this.name = name;
if (logger === undefined) {
const alogger = winston.loggers.get('default')
const alogger = winston.loggers.get('app')
this.logger = alogger.child({labels: [this.name, 'Resource Cache']}, mergeArr);
} else {
this.logger = logger.child({labels: ['Resource Cache']}, mergeArr);
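The constructor changes above widen each TTL field to `number | false` and map a config value of `true` to `0`. A sketch of that normalization with a hypothetical helper name (`normalizeTTL` does not exist in the diff; the real code inlines the ternary per field):

```typescript
// Hypothetical helper mirroring the `ttl === true ? 0 : ttl` normalization above:
// `true`  -> 0      (a ttl of 0 is treated as "cache without expiry")
// `false` -> false  (caching for this data type is disabled entirely)
// number  -> number (ttl in seconds)
type TTLValue = number | boolean;

function normalizeTTL(val: TTLValue): number | false {
    return val === true ? 0 : val;
}
```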
@@ -109,13 +131,14 @@ export class SubredditResources {
};

const cacheUseCB = (miss: boolean) => {
this.stats.cache.userNotes.requestTimestamps.push(Date.now());
this.stats.cache.userNotes.requests++;
this.stats.cache.userNotes.miss += miss ? 1 : 0;
}
this.userNotes = new UserNotes(userNotesTTL, this.subreddit, this.logger, this.cache, cacheUseCB)

if(this.cacheType === 'memory' && this.cacheSettingsHash !== 'default') {
const min = Math.min(...([wikiTTL, authorTTL, userNotesTTL].filter(x => x !== 0)));
const min = Math.min(...([this.wikiTTL, this.authorTTL, this.submissionTTL, this.commentTTL, this.filterCriteriaTTL].filter(x => typeof x === 'number' && x !== 0) as number[]));
if(min > 0) {
// set default prune interval
this.pruneInterval = setInterval(() => {
@@ -130,28 +153,64 @@ export class SubredditResources {

async getCacheKeyCount() {
if (this.cache.store.keys !== undefined) {
if(this.cacheType === 'redis') {
return (await this.cache.store.keys(`${this.prefix}*`)).length;
}
return (await this.cache.store.keys()).length;
}
return 0;
}

getStats() {
async getStats() {
const totals = Object.values(this.stats.cache).reduce((acc, curr) => ({
miss: acc.miss + curr.miss,
req: acc.req + curr.requests,
}), {miss: 0, req: 0});
const cacheKeys = Object.keys(this.stats.cache);
return {
cache: {
// TODO could probably combine these two
totalRequests: totals.req,
totalMiss: totals.miss,
missPercent: `${formatNumber(totals.miss === 0 || totals.req === 0 ? 0 : (totals.miss/totals.req) * 100, {toFixed: 0})}%`,
types: Object.keys(this.stats.cache).reduce((acc, curr) => {
types: await cacheKeys.reduce(async (accProm, curr) => {
const acc = await accProm;
// calculate miss percent

const per = acc[curr].miss === 0 ? 0 : formatNumber(acc[curr].miss / acc[curr].requests) * 100;
// @ts-ignore
acc[curr].missPercent = `${formatNumber(per, {toFixed: 0})}%`;

// calculate average identifier hits

const idCache = acc[curr].identifierRequestCount;
// @ts-expect-error
const idKeys = await idCache.store.keys() as string[];
if(idKeys.length > 0) {
let hits = 0;
for (const k of idKeys) {
hits += await idCache.get(k) as number;
}
acc[curr].identifierAverageHit = formatNumber(hits/idKeys.length);
}

if(acc[curr].requestTimestamps.length > 1) {
// calculate average time between requests
const diffData = acc[curr].requestTimestamps.reduce((acc, curr: number) => {
if(acc.last === 0) {
acc.last = curr;
return acc;
}
acc.diffs.push(curr - acc.last);
acc.last = curr;
return acc;
},{last: 0, diffs: [] as number[]});
const avgDiff = diffData.diffs.reduce((acc, curr) => acc + curr, 0) / diffData.diffs.length;

acc[curr].averageTimeBetweenHits = formatNumber(avgDiff/1000);
}

return acc;
}, this.stats.cache)
}, Promise.resolve(this.stats.cache))
}
}
}
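The stats math in `getStats` reduces per-type counters into totals and a miss percentage, and turns the raw request timestamps into an average gap. A sketch with plain objects (the interface name and field subset are illustrative, not the real `ResourceStats` shape):

```typescript
// Minimal stand-in for one cache type's counters
interface TypeStats { requests: number, miss: number, requestTimestamps: number[] }

// Aggregate all types into totals and a miss percentage, as getStats does
function summarize(stats: Record<string, TypeStats>) {
    const totals = Object.values(stats).reduce((acc, curr) => ({
        miss: acc.miss + curr.miss,
        req: acc.req + curr.requests,
    }), {miss: 0, req: 0});
    const missPercent = totals.req === 0 ? 0 : (totals.miss / totals.req) * 100;
    return {totalRequests: totals.req, totalMiss: totals.miss, missPercent};
}

// Average seconds between consecutive request timestamps (given in ms),
// mirroring the requestTimestamps reduce above
function avgSecondsBetween(timestamps: number[]): number {
    const diffs: number[] = [];
    for (let i = 1; i < timestamps.length; i++) {
        diffs.push(timestamps[i] - timestamps[i - 1]);
    }
    return (diffs.reduce((a, c) => a + c, 0) / diffs.length) / 1000;
}
```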
@@ -160,31 +219,48 @@ export class SubredditResources {
this.logger = logger.child({labels: ['Resource Cache']}, mergeArr);
}

async getActionedEvents(): Promise<ActionedEvent[]> {
return await this.cache.wrap(`actionedEvents-${this.subreddit.display_name}`, () => []);
}

async addActionedEvent(ae: ActionedEvent) {
const events = await this.cache.wrap(`actionedEvents-${this.subreddit.display_name}`, () => []) as ActionedEvent[];
events.unshift(ae);
await this.cache.set(`actionedEvents-${this.subreddit.display_name}`, events.slice(0, this.actionedEventsMax), {ttl: 0});
}
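`addActionedEvent` maintains a bounded "newest first" list capped at `actionedEventsMax`. A sketch of that pattern with a plain array (the real code persists the list through cache-manager with `ttl: 0`):

```typescript
// Bounded recent-events list: newest entry goes to the front,
// anything past the cap is dropped from the tail.
function addBounded<T>(events: T[], ev: T, max: number): T[] {
    events.unshift(ev);          // newest first
    return events.slice(0, max); // drop the oldest beyond the cap
}
```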

async getActivity(item: Submission | Comment) {
try {
if (item instanceof Submission && this.submissionTTL > 0) {
let hash = '';
if (this.submissionTTL !== false && asSubmission(item)) {
hash = `sub-${item.name}`;
await this.stats.cache.submission.identifierRequestCount.set(hash, (await this.stats.cache.submission.identifierRequestCount.wrap(hash, () => 0) as number) + 1);
this.stats.cache.submission.requestTimestamps.push(Date.now());
this.stats.cache.submission.requests++;
const cachedSubmission = await this.cache.get(`sub-${item.name}`);
if (cachedSubmission !== undefined) {
const cachedSubmission = await this.cache.get(hash);
if (cachedSubmission !== undefined && cachedSubmission !== null) {
this.logger.debug(`Cache Hit: Submission ${item.name}`);
return cachedSubmission;
}
// @ts-ignore
const submission = await item.fetch();
this.stats.cache.submission.miss++;
await this.cache.set(`sub-${item.name}`, submission, {ttl: this.submissionTTL});
await this.cache.set(hash, submission, {ttl: this.submissionTTL});
return submission;
} else if (this.commentTTL > 0) {
} else if (this.commentTTL !== false) {
hash = `comm-${item.name}`;
await this.stats.cache.comment.identifierRequestCount.set(hash, (await this.stats.cache.comment.identifierRequestCount.wrap(hash, () => 0) as number) + 1);
this.stats.cache.comment.requestTimestamps.push(Date.now());
this.stats.cache.comment.requests++;
const cachedComment = await this.cache.get(`comm-${item.name}`);
if (cachedComment !== undefined) {
const cachedComment = await this.cache.get(hash);
if (cachedComment !== undefined && cachedComment !== null) {
this.logger.debug(`Cache Hit: Comment ${item.name}`);
return cachedComment;
}
// @ts-ignore
const comment = await item.fetch();
this.stats.cache.comment.miss++;
await this.cache.set(`comm-${item.name}`, comment, {ttl: this.commentTTL});
await this.cache.set(hash, comment, {ttl: this.commentTTL});
return comment;
} else {
// @ts-ignore
@@ -196,28 +272,72 @@ export class SubredditResources {
}
}

async getAuthorActivities(user: RedditUser, options: AuthorTypedActivitiesOptions): Promise<Array<Submission | Comment>> {
if (this.authorTTL > 0) {
const userName = user.name;
const hashObj: any = {...options, userName};
if (this.useSubredditAuthorCache) {
hashObj.subreddit = this.name;
// @ts-ignore
async getSubreddit(item: Submission | Comment) {
try {
let hash = '';
if (this.subredditTTL !== false) {
hash = `sub-${getActivitySubredditName(item)}`;
await this.stats.cache.subreddit.identifierRequestCount.set(hash, (await this.stats.cache.subreddit.identifierRequestCount.wrap(hash, () => 0) as number) + 1);
this.stats.cache.subreddit.requestTimestamps.push(Date.now());
this.stats.cache.subreddit.requests++;
const cachedSubreddit = await this.cache.get(hash);
if (cachedSubreddit !== undefined && cachedSubreddit !== null) {
this.logger.debug(`Cache Hit: Subreddit ${item.subreddit.display_name}`);
// @ts-ignore
return cachedSubreddit as Subreddit;
}
// @ts-ignore
const subreddit = await this.client.getSubreddit(getActivitySubredditName(item)).fetch() as Subreddit;
this.stats.cache.subreddit.miss++;
// @ts-ignore
await this.cache.set(hash, subreddit, {ttl: this.subredditTTL});
// @ts-ignore
return subreddit as Subreddit;
} else {
// @ts-ignore
let subreddit = await this.client.getSubreddit(getActivitySubredditName(item));

return subreddit as Subreddit;
}
const hash = objectHash.sha1({...options, userName});
} catch (err) {
this.logger.error('Error while trying to fetch a cached activity', err);
throw err.logged;
}
}

async getAuthorActivities(user: RedditUser, options: AuthorTypedActivitiesOptions): Promise<Array<Submission | Comment>> {
const userName = getActivityAuthorName(user);
if (this.authorTTL !== false) {
const hashObj: any = options;
if (this.useSubredditAuthorCache) {
hashObj.subreddit = this.subreddit;
}
const hash = `authorActivities-${userName}-${options.type || 'overview'}-${objectHash.sha1(hashObj)}`;

this.stats.cache.author.requests++;
await this.stats.cache.author.identifierRequestCount.set(userName, (await this.stats.cache.author.identifierRequestCount.wrap(userName, () => 0) as number) + 1);
this.stats.cache.author.requestTimestamps.push(Date.now());
let miss = false;
const cacheVal = await this.cache.wrap(hash, async () => {
miss = true;
if(typeof user === 'string') {
// @ts-ignore
user = await this.client.getUser(userName);
}
return await getAuthorActivities(user, options);
}, {ttl: this.authorTTL});
if (!miss) {
this.logger.debug(`Cache Hit: ${userName} (${options.type || 'overview'})`);
this.logger.debug(`Cache Hit: ${userName} (Hash ${hash})`);
} else {
this.stats.cache.author.miss++;
}
return cacheVal as Array<Submission | Comment>;
}
if(typeof user === 'string') {
// @ts-ignore
user = await this.client.getUser(userName);
}
return await getAuthorActivities(user, options);
}
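The new cache key in `getAuthorActivities` embeds the user name and listing type alongside a hash of the options, so the same request always lands on the same key. A sketch of that key construction, using Node's `crypto` with sorted-key JSON as a stand-in for the `object-hash` package the real code uses:

```typescript
import { createHash } from 'crypto';

// Stand-in for object-hash's sha1 (the real code uses the `object-hash`
// package): hash a JSON encoding with sorted keys so the same options
// always map to the same cache key regardless of property order.
function sha1Stable(obj: Record<string, unknown>): string {
    const json = JSON.stringify(obj, Object.keys(obj).sort());
    return createHash('sha1').update(json).digest('hex');
}

function authorActivitiesKey(userName: string, options: {type?: string, limit?: number}): string {
    return `authorActivities-${userName}-${options.type || 'overview'}-${sha1Stable(options)}`;
}
```

Putting the user name and type in the key (rather than only in the hashed object) also makes cache entries human-readable when inspecting the store.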

@@ -249,12 +369,14 @@ export class SubredditResources {
}

// try to get cached value first
let hash = `${subreddit.display_name}-${cacheKey}`;
if (this.wikiTTL > 0) {
let hash = `${subreddit.display_name}-content-${cacheKey}`;
if (this.wikiTTL !== false) {
await this.stats.cache.content.identifierRequestCount.set(cacheKey, (await this.stats.cache.content.identifierRequestCount.wrap(cacheKey, () => 0) as number) + 1);
this.stats.cache.content.requestTimestamps.push(Date.now());
this.stats.cache.content.requests++;
const cachedContent = await this.cache.get(hash);
if (cachedContent !== undefined) {
this.logger.debug(`Cache Hit: ${cacheKey}`);
if (cachedContent !== undefined && cachedContent !== null) {
this.logger.debug(`Content Cache Hit: ${cacheKey}`);
return cachedContent as string;
} else {
this.stats.cache.content.miss++;
@@ -269,9 +391,7 @@ export class SubredditResources {
if (wikiContext.subreddit === undefined || wikiContext.subreddit.toLowerCase() === subreddit.display_name) {
sub = subreddit;
} else {
// @ts-ignore
const client = singleton.getClient();
sub = client.getSubreddit(wikiContext.subreddit);
sub = this.client.getSubreddit(wikiContext.subreddit);
}
try {
// @ts-ignore
@@ -300,25 +420,73 @@ export class SubredditResources {
}
}

if (this.wikiTTL > 0) {
if (this.wikiTTL !== false) {
this.cache.set(hash, wikiContent, {ttl: this.wikiTTL});
}

return wikiContent;
}

async testSubredditCriteria(item: (Comment | Submission), state: SubredditState | StrongSubredditState) {
if(Object.keys(state).length === 0) {
return true;
}
// optimize for name-only criteria checks
// -- we don't need to store cache results for this since we know the subreddit name is always available from the item (no request required)
const critCount = Object.entries(state).filter(([key, val]) => {
return val !== undefined && !['name','stateDescription'].includes(key);
}).length;
if(critCount === 0) {
const subName = getActivitySubredditName(item);
return await this.isSubreddit({display_name: subName} as Subreddit, state, this.logger);
}

if (this.filterCriteriaTTL !== false) {
try {
const hash = `subredditCrit-${getActivitySubredditName(item)}-${objectHash.sha1(state)}`;
await this.stats.cache.subredditCrit.identifierRequestCount.set(hash, (await this.stats.cache.subredditCrit.identifierRequestCount.wrap(hash, () => 0) as number) + 1);
this.stats.cache.subredditCrit.requestTimestamps.push(Date.now());
this.stats.cache.subredditCrit.requests++;
const cachedItem = await this.cache.get(hash);
if (cachedItem !== undefined && cachedItem !== null) {
this.logger.debug(`Cache Hit: Subreddit Check on ${getActivitySubredditName(item)} (Hash ${hash})`);
return cachedItem as boolean;
}
const itemResult = await this.isSubreddit(await this.getSubreddit(item), state, this.logger);
this.stats.cache.subredditCrit.miss++;
await this.cache.set(hash, itemResult, {ttl: this.filterCriteriaTTL});
return itemResult;
} catch (err) {
if (err.logged !== true) {
this.logger.error('Error occurred while testing subreddit criteria', err);
}
throw err;
}
}

return await this.isSubreddit(await this.getSubreddit(item), state, this.logger);
}
|
||||
|
||||
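The cache keys above combine a prefix, an identifier, and a hash of the criteria object (`subredditCrit-<name>-<sha1>`). A minimal sketch of such key construction, assuming order-insensitive object hashing similar to `objectHash.sha1` (the helper name `criteriaCacheKey` is hypothetical, not from the project):

```typescript
import { createHash } from 'crypto';

// Hypothetical helper mirroring the `subredditCrit-<name>-<sha1>` key scheme.
// objectHash.sha1 hashes objects independent of key order; this sketch
// approximates that by hashing entries in sorted-key order.
function criteriaCacheKey(prefix: string, name: string, criteria: Record<string, unknown>): string {
    const canonical = JSON.stringify(
        Object.keys(criteria).sort().map(k => [k, criteria[k]])
    );
    const sha = createHash('sha1').update(canonical).digest('hex');
    return `${prefix}-${name}-${sha}`;
}
```

Because the key is derived only from the subreddit name and the criteria content, repeated checks of the same criteria hit the same cache entry regardless of which activity triggered them.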
    async testAuthorCriteria(item: (Comment | Submission), authorOpts: AuthorCriteria, include = true) {
-       if (this.filterCriteriaTTL > 0) {
-           const hashObj = {itemId: item.id, ...authorOpts, include};
-           const hash = `authorCrit-${objectHash.sha1(hashObj)}`;
+       if (this.filterCriteriaTTL !== false) {
+           // in the criteria check we only actually use the `item` to get the author flair
+           // which will be the same for the entire subreddit
+           //
+           // so we can create a hash only using subreddit-author-criteria
+           // and ignore the actual item
+           const hashObj = {...authorOpts, include};
+           const userName = getActivityAuthorName(item.author);
+           const hash = `authorCrit-${this.subreddit.display_name}-${userName}-${objectHash.sha1(hashObj)}`;
            await this.stats.cache.authorCrit.identifierRequestCount.set(hash, (await this.stats.cache.authorCrit.identifierRequestCount.wrap(hash, () => 0) as number) + 1);
            this.stats.cache.authorCrit.requestTimestamps.push(Date.now());
            this.stats.cache.authorCrit.requests++;
            let miss = false;
            const cachedAuthorTest = await this.cache.wrap(hash, async () => {
                miss = true;
                return await testAuthorCriteria(item, authorOpts, include, this.userNotes);
-           }, {ttl: this.authorTTL});
+           }, {ttl: this.filterCriteriaTTL});
            if (!miss) {
-               this.logger.debug(`Cache Hit: Author Check on ${item.id}`);
+               this.logger.debug(`Cache Hit: Author Check on ${userName} (Hash ${hash})`);
            } else {
                this.stats.cache.authorCrit.miss++;
            }
@@ -328,32 +496,31 @@ export class SubredditResources {
        return await testAuthorCriteria(item, authorOpts, include, this.userNotes);
    }
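The `miss` flag pattern above works because `cache.wrap` only invokes its factory when the key is absent. A self-contained sketch of that pattern (`MiniCache` is an illustrative stand-in, not the project's cache-manager):

```typescript
// Minimal stand-in for a cache with a `wrap` method: return the cached value
// if present, otherwise run the factory and store its result.
class MiniCache {
    private store = new Map<string, unknown>();

    async wrap<T>(key: string, factory: () => Promise<T>): Promise<T> {
        if (this.store.has(key)) {
            return this.store.get(key) as T;
        }
        const val = await factory();
        this.store.set(key, val);
        return val;
    }
}

// A flag set inside the factory tells the caller whether the value came
// from cache (factory skipped) or was freshly computed (factory ran).
async function checkWithMissTracking(cache: MiniCache, key: string): Promise<{ value: number, miss: boolean }> {
    let miss = false;
    const value = await cache.wrap(key, async () => {
        miss = true; // factory ran => the key was not cached
        return 42;
    });
    return { value, miss };
}
```

This avoids a separate `get`-then-`set` round trip while still letting the caller log cache hits and update miss statistics.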
-   async testItemCriteria(i: (Comment | Submission), s: TypedActivityStates) {
-       if (this.filterCriteriaTTL > 0) {
+   async testItemCriteria(i: (Comment | Submission), activityStates: TypedActivityStates) {
+       if (this.filterCriteriaTTL !== false) {
            let item = i;
-           let states = s;
+           let states = activityStates;
            // optimize for submission only checks on comment item
            if (item instanceof Comment && states.length === 1 && Object.keys(states[0]).length === 1 && (states[0] as CommentState).submissionState !== undefined) {
                // get submission
-               const client = singleton.getClient();
                // @ts-ignore
-               const subProxy = await client.getSubmission(await i.link_id);
+               const subProxy = await this.client.getSubmission(await i.link_id);
                // @ts-ignore
                item = await this.getActivity(subProxy);
                states = (states[0] as CommentState).submissionState as SubmissionState[];
            }
            try {
-               const hashObj = {itemId: item.name, ...states};
-               const hash = `itemCrit-${objectHash.sha1(hashObj)}`;
+               const hash = `itemCrit-${item.name}-${objectHash.sha1(states)}`;
                await this.stats.cache.itemCrit.identifierRequestCount.set(hash, (await this.stats.cache.itemCrit.identifierRequestCount.wrap(hash, () => 0) as number) + 1);
                this.stats.cache.itemCrit.requestTimestamps.push(Date.now());
                this.stats.cache.itemCrit.requests++;
                const cachedItem = await this.cache.get(hash);
-               if (cachedItem !== undefined) {
-                   this.logger.debug(`Cache Hit: Item Check on ${item.name}`);
+               if (cachedItem !== undefined && cachedItem !== null) {
+                   this.logger.debug(`Cache Hit: Item Check on ${item.name} (Hash ${hash})`);
                    return cachedItem as boolean;
                }
                const itemResult = await this.isItem(item, states, this.logger);
                this.stats.cache.itemCrit.miss++;
-               const res = await this.cache.set(hash, itemResult, {ttl: this.filterCriteriaTTL});
+               await this.cache.set(hash, itemResult, {ttl: this.filterCriteriaTTL});
                return itemResult;
            } catch (err) {
                if (err.logged !== true) {
@@ -363,7 +530,50 @@ export class SubredditResources {
            }
        }

-       return await this.isItem(i, s, this.logger);
+       return await this.isItem(i, activityStates, this.logger);
    }

    async isSubreddit (subreddit: Subreddit, stateCriteria: SubredditState | StrongSubredditState, logger: Logger) {
        delete stateCriteria.stateDescription;

        if (Object.keys(stateCriteria).length === 0) {
            return true;
        }

        const crit = isStrongSubredditState(stateCriteria) ? stateCriteria : toStrongSubredditState(stateCriteria, {defaultFlags: 'i'});

        const log = logger.child({leaf: 'Subreddit Check'}, mergeArr);

        return await (async () => {
            for (const k of Object.keys(crit)) {
                // @ts-ignore
                if (crit[k] !== undefined) {
                    switch (k) {
                        case 'name':
                            const nameReg = crit[k] as RegExp;
                            if(!nameReg.test(subreddit.display_name)) {
                                return false;
                            }
                            break;
                        default:
                            // @ts-ignore
                            if (crit[k] !== undefined) {
                                // @ts-ignore
                                if (crit[k] !== subreddit[k]) {
                                    // @ts-ignore
                                    log.debug(`Failed: Expected => ${k}:${crit[k]} | Found => ${k}:${subreddit[k]}`)
                                    return false
                                }
                            } else {
                                log.warn(`Tried to test for Subreddit property '${k}' but it did not exist`);
                            }
                            break;
                    }
                }
            }
            log.debug(`Passed: ${JSON.stringify(stateCriteria)}`);
            return true;
        })() as boolean;
    }
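`toStrongSubredditState(..., {defaultFlags: 'i'})` presumably turns a plain `name` string into a case-insensitive `RegExp` so the `nameReg.test(...)` branch above works for both input forms. A hedged sketch of that conversion (the helper name and exact behavior here are assumptions, not the project's code):

```typescript
// Convert a subreddit name criterion into an anchored RegExp. A literal
// string becomes a case-insensitive exact match; an existing RegExp is
// passed through unchanged. (Sketch only; the real helper may differ.)
function nameToRegex(name: string | RegExp, defaultFlags = 'i'): RegExp {
    if (name instanceof RegExp) {
        return name;
    }
    // escape regex metacharacters so a literal name matches literally
    const escaped = name.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
    return new RegExp(`^${escaped}$`, defaultFlags);
}
```

Normalizing once up front lets the check loop treat `name` uniformly as a `RegExp` instead of branching on the criterion's type at every comparison.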
    async isItem (item: Submission | Comment, stateCriteria: TypedActivityStates, logger: Logger) {
@@ -371,7 +581,7 @@ export class SubredditResources {
            return true;
        }

-       const log = logger.child({leaf: 'Item Check'});
+       const log = logger.child({leaf: 'Item Check'}, mergeArr);

        for (const crit of stateCriteria) {
            const pass = await (async () => {
@@ -385,9 +595,8 @@ export class SubredditResources {
                            continue;
                        }
                        // get submission
-                       const client = singleton.getClient();
                        // @ts-ignore
-                       const subProxy = await client.getSubmission(await item.link_id);
+                       const subProxy = await this.client.getSubmission(await item.link_id);
                        // @ts-ignore
                        const sub = await this.getActivity(subProxy);
                        // @ts-ignore
@@ -396,6 +605,22 @@ export class SubredditResources {
                            return false;
                        }
                        break;
+                   case 'score':
+                       const scoreCompare = parseGenericValueComparison(crit[k] as string);
+                       if(!comparisonTextOp(item.score, scoreCompare.operator, scoreCompare.value)) {
+                           // @ts-ignore
+                           log.debug(`Failed: Expected => ${k}:${crit[k]} | Found => ${k}:${item.score}`)
+                           return false
+                       }
+                       break;
+                   case 'reports':
+                       const reportCompare = parseGenericValueComparison(crit[k] as string);
+                       if(!comparisonTextOp(item.num_reports, reportCompare.operator, reportCompare.value)) {
+                           // @ts-ignore
+                           log.debug(`Failed: Expected => ${k}:${crit[k]} | Found => ${k}:${item.num_reports}`)
+                           return false
+                       }
+                       break;
                    case 'removed':
                        const removed = activityIsRemoved(item);
                        if (removed !== crit['removed']) {
@@ -464,36 +689,26 @@ export class SubredditResources {
                            return false
                        }
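The added `score` and `reports` cases parse comparison strings such as `"> 5"` into an operator and a numeric value before applying `comparisonTextOp`. A minimal sketch of that parse-and-compare flow (the real `parseGenericValueComparison` in `../util` likely accepts more forms, e.g. percentages):

```typescript
interface GenericComparison {
    operator: string;
    value: number;
}

// Parse strings like "> 5", "<= 0", "== 3" into operator + value.
// (Sketch of the idea only; the project's parser is more featureful.)
function parseComparison(input: string): GenericComparison {
    const match = input.trim().match(/^([<>]=?|==?)\s*(-?\d+(?:\.\d+)?)/);
    if (match === null) {
        throw new Error(`Could not parse comparison from '${input}'`);
    }
    return { operator: match[1], value: Number(match[2]) };
}

// Apply the parsed operator, mirroring what comparisonTextOp does.
function compare(val: number, op: string, target: number): boolean {
    switch (op) {
        case '>': return val > target;
        case '>=': return val >= target;
        case '<': return val < target;
        case '<=': return val <= target;
        default: return val === target;
    }
}
```

Keeping criteria as human-readable strings in config and parsing them at check time is what lets moderators write rules like `score: '> 5'` directly in YAML.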
-   async getCommentCheckCacheResult(item: Comment, checkConfig: object): Promise<boolean | undefined> {
-       const criteria = {
-           author: item.author.name,
-           submission: item.link_id,
-           ...checkConfig
-       }
-       const hash = objectHash.sha1(criteria);
+   async getCommentCheckCacheResult(item: Comment, checkConfig: object): Promise<UserResultCache | undefined> {
+       const userName = getActivityAuthorName(item.author);
+       const hash = `commentUserResult-${userName}-${item.link_id}-${objectHash.sha1(checkConfig)}`;
        this.stats.cache.commentCheck.requests++;
-       const result = await this.cache.get(hash) as boolean | undefined;
+       let result = await this.cache.get(hash) as UserResultCache | undefined | null;
+       if(result === null) {
+           result = undefined;
+       }
        if(result === undefined) {
            this.stats.cache.commentCheck.miss++;
        }
-       this.logger.debug(`Cache Hit: Comment Check for ${item.author.name} in Submission ${item.link_id}`);
+       this.logger.debug(`Cache Hit: Comment Check for ${userName} in Submission ${item.link_id} (Hash ${hash})`);
        return result;
    }

-   async setCommentCheckCacheResult(item: Comment, checkConfig: object, result: boolean, ttl: number) {
-       const criteria = {
-           author: item.author.name,
-           submission: item.link_id,
-           ...checkConfig
-       }
-       const hash = objectHash.sha1(criteria);
-       // don't set if result is already cached
-       if(undefined !== await this.cache.get(hash)) {
-           this.logger.debug(`Check result already cached for User ${item.author.name} on Submission ${item.link_id}`);
-       } else {
-           await this.cache.set(hash, result, { ttl });
-           this.logger.debug(`Cached check result '${result}' for User ${item.author.name} on Submission ${item.link_id} for ${ttl} seconds`);
-       }
+   async setCommentCheckCacheResult(item: Comment, checkConfig: object, result: UserResultCache, ttl: number) {
+       const userName = getActivityAuthorName(item.author);
+       const hash = `commentUserResult-${userName}-${item.link_id}-${objectHash.sha1(checkConfig)}`
+       await this.cache.set(hash, result, { ttl });
+       this.logger.debug(`Cached check result '${result.result}' for User ${userName} on Submission ${item.link_id} for ${ttl} seconds (Hash ${hash})`);
    }
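The added `result === null` normalization matters because some cache backends report a missing key as `null` (redis-style stores, for example) while in-memory stores use `undefined`. The same idea, generically:

```typescript
// Normalize a cache read so callers only ever see T | undefined,
// regardless of whether the backing store signals "missing" with
// null or undefined.
async function getNormalized<T>(
    get: (key: string) => Promise<T | null | undefined>,
    key: string
): Promise<T | undefined> {
    const raw = await get(key);
    return raw === null ? undefined : raw;
}
```

Without this, a `null` from one store would be treated as a cached falsy result instead of a miss, silently skipping the real check.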
    async generateFooter(item: Submission | Comment, actionFooter?: false | string) {
@@ -510,18 +725,21 @@ export class SubredditResources {
        }
    }

-class SubredditResourcesManager {
+export class BotResourcesManager {
    resources: Map<string, SubredditResources> = new Map();
    authorTTL: number = 10000;
    enabled: boolean = true;
    modStreams: Map<string, SPoll<Snoowrap.Submission | Snoowrap.Comment>> = new Map();
-   defaultCache!: Cache;
+   defaultCache: Cache;
+   defaultCacheConfig: StrongCache
    cacheType: string = 'none';
-   cacheHash!: string;
-   ttlDefaults!: Required<TTLConfig>;
+   cacheHash: string;
+   ttlDefaults: Required<TTLConfig>;
    actionedEventsMaxDefault?: number;
+   actionedEventsDefault: number;
    pruneInterval: any;

-   setDefaultsFromConfig(config: OperatorConfig) {
+   constructor(config: BotInstanceConfig) {
        const {
            caching: {
                authorTTL,
@@ -529,39 +747,43 @@ class SubredditResourcesManager {
                wikiTTL,
                commentTTL,
                submissionTTL,
+               subredditTTL,
                filterCriteriaTTL,
                provider,
+               actionedEventsMax,
+               actionedEventsDefault,
            },
            name,
            credentials,
            caching,
        } = config;
-       this.cacheHash = objectHash.sha1(caching);
-       this.setTTLDefaults({authorTTL, userNotesTTL, wikiTTL, commentTTL, submissionTTL, filterCriteriaTTL});
-       this.setDefaultCache(provider);
-   }
+       caching.provider.prefix = buildCachePrefix([caching.provider.prefix, 'SHARED']);
+       const {actionedEventsMax: eMax, actionedEventsDefault: eDef, ...relevantCacheSettings} = caching;
+       this.cacheHash = objectHash.sha1(relevantCacheSettings);
+       this.defaultCacheConfig = caching;
+       this.ttlDefaults = {authorTTL, userNotesTTL, wikiTTL, commentTTL, submissionTTL, filterCriteriaTTL, subredditTTL};

-   setDefaultCache(options: CacheOptions) {
+       const options = provider;
        this.cacheType = options.store;
+       this.actionedEventsMaxDefault = actionedEventsMax;
+       this.actionedEventsDefault = actionedEventsDefault;
        this.defaultCache = createCacheManager(options);
-       if(this.cacheType === 'memory') {
-           const min = Math.min(...([this.ttlDefaults.wikiTTL, this.ttlDefaults.authorTTL, this.ttlDefaults.userNotesTTL].filter(x => x !== 0)));
-           if(min > 0) {
+       if (this.cacheType === 'memory') {
+           const min = Math.min(...([this.ttlDefaults.wikiTTL, this.ttlDefaults.authorTTL, this.ttlDefaults.userNotesTTL].filter(x => typeof x === 'number' && x !== 0) as number[]));
+           if (min > 0) {
                // set default prune interval
                this.pruneInterval = setInterval(() => {
                    // @ts-ignore
                    this.defaultCache?.store.prune();
                    // kinda hacky but whatever
-                   const logger = winston.loggers.get('default');
+                   const logger = winston.loggers.get('app');
                    logger.debug('Pruned Shared Cache');
                    // prune interval should be twice the smallest TTL
-               },min * 1000 * 2)
+               }, min * 1000 * 2)
            }
        }
    }
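The constructor above sizes the memory-cache prune interval as twice the smallest non-zero numeric TTL, skipping TTLs that are `false` (caching disabled). That calculation in isolation:

```typescript
// Compute the prune interval in milliseconds from a set of TTLs (seconds).
// TTLs may be `false` (disabled) or 0; only positive numbers count.
// Returns undefined when no pruning is needed.
function pruneIntervalMs(ttls: Array<number | false>): number | undefined {
    const numeric = ttls.filter((x): x is number => typeof x === 'number' && x !== 0);
    if (numeric.length === 0) {
        return undefined;
    }
    const min = Math.min(...numeric);
    // prune at twice the smallest TTL so expired entries linger at most
    // roughly one extra TTL period before being evicted
    return min > 0 ? min * 1000 * 2 : undefined;
}
```

The added `typeof x === 'number'` guard in the diff exists precisely because TTLs can now be `false`: without it, `Math.min` would coerce `false` to `0` and break the interval calculation.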
    setTTLDefaults(def: Required<TTLConfig>) {
        this.ttlDefaults = def;
    }

    get(subName: string): SubredditResources | undefined {
        if (this.resources.has(subName)) {
            return this.resources.get(subName) as SubredditResources;
@@ -578,11 +800,13 @@ class SubredditResourcesManager {
            cacheType: this.cacheType,
            cacheSettingsHash: hash,
            ttl: this.ttlDefaults,
+           prefix: this.defaultCacheConfig.provider.prefix,
+           actionedEventsMax: this.actionedEventsMaxDefault !== undefined ? Math.min(this.actionedEventsDefault, this.actionedEventsMaxDefault) : this.actionedEventsDefault,
            ...init,
        };

        if(caching !== undefined) {
-           const {provider = 'memory', ...rest} = caching;
+           const {provider = this.defaultCacheConfig.provider, actionedEventsMax = this.actionedEventsDefault, ...rest} = caching;
            let cacheConfig = {
                provider: buildCacheOptionsFromProvider(provider),
                ttl: {
@@ -594,10 +818,16 @@ class SubredditResourcesManager {
            // only need to create a private cache if these settings are actually different than the default
            if(hash !== this.cacheHash) {
                const {provider: trueProvider, ...trueRest} = cacheConfig;
+               const defaultPrefix = trueProvider.prefix;
+               const subPrefix = defaultPrefix === this.defaultCacheConfig.provider.prefix ? buildCachePrefix([(defaultPrefix !== undefined ? defaultPrefix.replace('SHARED', '') : defaultPrefix), subName]) : trueProvider.prefix;
+               trueProvider.prefix = subPrefix;
+               const eventsMax = this.actionedEventsMaxDefault !== undefined ? Math.min(actionedEventsMax, this.actionedEventsMaxDefault) : actionedEventsMax;
                opts = {
                    cache: createCacheManager(trueProvider),
+                   actionedEventsMax: eventsMax,
                    cacheType: trueProvider.store,
                    cacheSettingsHash: hash,
+                   prefix: subPrefix,
                    ...init,
                    ...trueRest,
                };
@@ -625,7 +855,3 @@ class SubredditResourcesManager {
        return resource;
    }
}

-const manager = new SubredditResourcesManager();
-
-export default manager;
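`buildCachePrefix([caching.provider.prefix, 'SHARED'])` appears to namespace cache keys by joining the defined prefix parts, so subreddit-specific caches can later swap `SHARED` for the subreddit name. A sketch under the assumption that parts are joined with a separator (the `:` separator and trimming behavior here are guesses, not taken from the project):

```typescript
// Hypothetical sketch of a cache-prefix builder: drop undefined/empty
// parts, trim stray separators, and join the rest.
function buildPrefix(parts: Array<string | undefined>): string {
    return parts
        .filter((x): x is string => x !== undefined && x !== '')
        .map(x => x.replace(/^:+|:+$/g, ''))
        .join(':');
}
```

A deterministic prefix scheme like this is what makes it safe for multiple bots (or multiple subreddits with private caches) to share one backing store without key collisions.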
@@ -1,6 +1,13 @@
import dayjs, {Dayjs} from "dayjs";
import {Comment, RedditUser, WikiPage} from "snoowrap";
-import {COMMENT_URL_ID, deflateUserNotes, inflateUserNotes, parseLinkIdentifier, SUBMISSION_URL_ID} from "../util";
+import {
+    COMMENT_URL_ID,
+    deflateUserNotes, getActivityAuthorName,
+    inflateUserNotes,
+    isScopeError,
+    parseLinkIdentifier,
+    SUBMISSION_URL_ID
+} from "../util";
import Subreddit from "snoowrap/dist/objects/Subreddit";
import {Logger} from "winston";
import LoggedError from "../Utils/LoggedError";
@@ -48,7 +55,7 @@ export interface RawNote {
export type UserNotesConstants = Pick<any, "users" | "warnings">;

export class UserNotes {
-   notesTTL: number;
+   notesTTL: number | false;
    subreddit: Subreddit;
    wiki: WikiPage;
    moderators?: RedditUser[];
@@ -63,8 +70,8 @@ export class UserNotes {
    debounceCB: any;
    batchCount: number = 0;

-   constructor(ttl: number, subreddit: Subreddit, logger: Logger, cache: Cache, cacheCB: Function) {
-       this.notesTTL = ttl;
+   constructor(ttl: number | boolean, subreddit: Subreddit, logger: Logger, cache: Cache, cacheCB: Function) {
+       this.notesTTL = ttl === true ? 0 : ttl;
        this.subreddit = subreddit;
        this.logger = logger;
        this.wiki = subreddit.getWikiPage('usernotes');
@@ -74,10 +81,11 @@ export class UserNotes {
    }
    async getUserNotes(user: RedditUser): Promise<UserNote[]> {
+       const userName = getActivityAuthorName(user);
        let notes: UserNote[] | undefined = [];

        if (this.users !== undefined) {
-           notes = this.users.get(user.name);
+           notes = this.users.get(userName);
            if (notes !== undefined) {
                this.logger.debug('Returned cached notes');
                return notes;
@@ -85,7 +93,7 @@ export class UserNotes {
        }

        const payload = await this.retrieveData();
-       const rawNotes = payload.blob[user.name];
+       const rawNotes = payload.blob[userName];
        if (rawNotes !== undefined) {
            if (this.moderators === undefined) {
                this.moderators = await this.subreddit.getModerators();
@@ -94,7 +102,7 @@ export class UserNotes {
            // sort in ascending order by time
            notes.sort((a, b) => a.time.isBefore(b.time) ? -1 : 1);
            if (this.notesTTL > 0 && this.cache !== undefined) {
-               this.users.set(user.name, notes);
+               this.users.set(userName, notes);
            }
            return notes;
        } else {
@@ -105,6 +113,7 @@ export class UserNotes {
    async addUserNote(item: (Submission|Comment), type: string | number, text: string = ''): Promise<UserNote>
    {
        const payload = await this.retrieveData();
+       const userName = getActivityAuthorName(item.author);

        // idgaf
        // @ts-ignore
@@ -120,16 +129,16 @@ export class UserNotes {
        }
        const newNote = new UserNote(dayjs(), text, mod, type, `https://reddit.com${item.permalink}`);

-       if(payload.blob[item.author.name] === undefined) {
-           payload.blob[item.author.name] = {ns: []};
+       if(payload.blob[userName] === undefined) {
+           payload.blob[userName] = {ns: []};
        }
-       payload.blob[item.author.name].ns.push(newNote.toRaw(payload.constants));
+       payload.blob[userName].ns.push(newNote.toRaw(payload.constants));

        await this.saveData(payload);
        if(this.notesTTL > 0) {
-           const currNotes = this.users.get(item.author.name) || [];
+           const currNotes = this.users.get(userName) || [];
            currNotes.push(newNote);
-           this.users.set(item.author.name, currNotes);
+           this.users.set(userName, currNotes);
        }
        return newNote;
    }
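Toolbox usernotes keep the per-user notes map in a compressed `blob`; `deflateUserNotes`/`inflateUserNotes` plausibly round-trip it through zlib and base64. A sketch of that round trip (an assumption about the util helpers, not their actual source):

```typescript
import { deflateSync, inflateSync } from 'zlib';

// Compress a notes map to a base64 string suitable for embedding in the
// usernotes wiki page JSON. (Sketch of the presumed format.)
function deflateBlob(blob: object): string {
    return deflateSync(Buffer.from(JSON.stringify(blob), 'utf-8')).toString('base64');
}

// Reverse: decode base64, inflate, and parse back into an object.
function inflateBlob(raw: string): object {
    return JSON.parse(inflateSync(Buffer.from(raw, 'base64')).toString('utf-8'));
}
```

Compressing the blob keeps the wiki page under Reddit's size limits even for subreddits with thousands of notes, at the cost of the inflate step on every read, which is exactly why the class caches the parsed payload.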
@@ -144,7 +153,7 @@ export class UserNotes {
        let cacheMiss;
        if (this.notesTTL > 0) {
            const cachedPayload = await this.cache.get(this.identifier);
-           if (cachedPayload !== undefined) {
+           if (cachedPayload !== undefined && cachedPayload !== null) {
                this.cacheCB(false);
                return cachedPayload as unknown as RawUserNotesPayload;
            }
@@ -153,14 +162,15 @@ export class UserNotes {
        }

        try {
-           if(cacheMiss && this.debounceCB !== undefined) {
-               // timeout is still delayed. its our wiki data and we want it now! cm cacheworth 877 cache now
-               this.logger.debug(`Detected missed cache on usernotes retrieval while batch (${this.batchCount}) save is in progress, executing save immediately before retrieving new notes...`);
-               clearTimeout(this.saveDebounce);
-               await this.debounceCB();
-               this.debounceCB = undefined;
-               this.saveDebounce = undefined;
-           }
+           // DISABLED for now because I think it's causing issues
+           // if(cacheMiss && this.debounceCB !== undefined) {
+           //     // timeout is still delayed. its our wiki data and we want it now! cm cacheworth 877 cache now
+           //     this.logger.debug(`Detected missed cache on usernotes retrieval while batch (${this.batchCount}) save is in progress, executing save immediately before retrieving new notes...`);
+           //     clearTimeout(this.saveDebounce);
+           //     await this.debounceCB();
+           //     this.debounceCB = undefined;
+           //     this.saveDebounce = undefined;
+           // }
            // @ts-ignore
            this.wiki = await this.subreddit.getWikiPage('usernotes').fetch();
            const wikiContent = this.wiki.content_md;
@@ -169,7 +179,7 @@ export class UserNotes {

            userNotes.blob = inflateUserNotes(userNotes.blob);

-           if (this.notesTTL > 0) {
+           if (this.notesTTL !== false) {
                await this.cache.set(`${this.subreddit.display_name}-usernotes`, userNotes, {ttl: this.notesTTL});
                this.users = new Map();
            }
@@ -187,30 +197,36 @@ export class UserNotes {
        const blob = deflateUserNotes(payload.blob);
        const wikiPayload = {text: JSON.stringify({...payload, blob}), reason: 'ContextBot edited usernotes'};
        try {
-           if (this.notesTTL > 0) {
+           if (this.notesTTL !== false) {
                // DISABLED for now because if it fails it throws an uncaught rejection
                // and need to figure out how to handle this other than just logging (want to interrupt action flow too?)
                //
                // debounce usernote save by 5 seconds -- effectively batch usernote saves
                //
                // so that if we are processing a ton of checks that write user notes we aren't calling to save the wiki page on every call
                // since we also have everything in cache (most likely...)
                //
                // TODO might want to increase timeout to 10 seconds
-               if(this.saveDebounce !== undefined) {
-                   clearTimeout(this.saveDebounce);
-               }
-               this.debounceCB = (async function () {
-                   const p = wikiPayload;
-                   // @ts-ignore
-                   const self = this as UserNotes;
-                   // @ts-ignore
-                   self.wiki = await self.subreddit.getWikiPage('usernotes').edit(p);
-                   self.logger.debug(`Batch saved ${self.batchCount} usernotes`);
-                   self.debounceCB = undefined;
-                   self.saveDebounce = undefined;
-                   self.batchCount = 0;
-               }).bind(this);
-               this.saveDebounce = setTimeout(this.debounceCB,5000);
-               this.batchCount++;
-               this.logger.debug(`Saving Usernotes has been debounced for 5 seconds (${this.batchCount} batched)`)
+               // if(this.saveDebounce !== undefined) {
+               //     clearTimeout(this.saveDebounce);
+               // }
+               // this.debounceCB = (async function () {
+               //     const p = wikiPayload;
+               //     // @ts-ignore
+               //     const self = this as UserNotes;
+               //     // @ts-ignore
+               //     self.wiki = await self.subreddit.getWikiPage('usernotes').edit(p);
+               //     self.logger.debug(`Batch saved ${self.batchCount} usernotes`);
+               //     self.debounceCB = undefined;
+               //     self.saveDebounce = undefined;
+               //     self.batchCount = 0;
+               // }).bind(this);
+               // this.saveDebounce = setTimeout(this.debounceCB,5000);
+               // this.batchCount++;
+               // this.logger.debug(`Saving Usernotes has been debounced for 5 seconds (${this.batchCount} batched)`)

+               // @ts-ignore
+               await this.subreddit.getWikiPage('usernotes').edit(wikiPayload);
                await this.cache.set(this.identifier, payload, {ttl: this.notesTTL});
                this.users = new Map();
            } else {
@@ -220,7 +236,13 @@ export class UserNotes {

            return payload as RawUserNotesPayload;
        } catch (err) {
-           const msg = `Could not edit usernotes. Make sure at least one moderator has used toolbox and usernotes before and that this account has editing permissions`;
+           let msg = 'Could not edit usernotes.';
+           // Make sure at least one moderator has used toolbox and usernotes before and that this account has editing permissions`;
+           if(isScopeError(err)) {
+               msg = `${msg} The bot account did not have sufficient OAUTH scope to perform this action. You must re-authenticate the bot and ensure it has 'wikiedit' permissions.`
+           } else {
+               msg = `${msg} Make sure at least one moderator has used toolbox, created a usernote, and that this account has editing permissions for the wiki page.`;
+           }
            this.logger.error(msg, err);
            throw new LoggedError(msg);
        }
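The disabled block above debounced wiki writes so bursts of usernote saves collapse into a single edit. The pattern, stripped to its essentials with a shortened window (illustrative class, not the project's code):

```typescript
// Debounced batch saver: every save() resets the timer, so only the
// last call in a burst actually triggers the (expensive) write.
class BatchedSaver {
    batchCount = 0;   // saves queued in the current window
    saves = 0;        // real writes performed
    private timer?: ReturnType<typeof setTimeout>;

    save(windowMs = 50) {
        if (this.timer !== undefined) {
            clearTimeout(this.timer);
        }
        this.batchCount++;
        this.timer = setTimeout(() => {
            this.saves++; // one real write for the whole batch
            this.batchCount = 0;
            this.timer = undefined;
        }, windowMs);
    }
}
```

The tradeoff the comments allude to is that the deferred write runs outside the caller's async flow: if it rejects later there is no caller left to catch it, which is presumably why the block is disabled pending better error handling.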
src/Utils/AbortToken.ts (new file, 25 lines)
@@ -0,0 +1,25 @@
// https://gist.github.com/pygy/6290f78b078e22418821b07d8d63f111#gistcomment-3408351
class AbortToken {
    private readonly abortSymbol = Symbol('cancelled');
    private abortPromise: Promise<any>;
    private resolve!: Function; // Works due to promise init

    constructor() {
        this.abortPromise = new Promise(res => this.resolve = res);
    }

    public async wrap<T>(p: PromiseLike<T>): Promise<T> {
        const result = await Promise.race([p, this.abortPromise]);
        if (result === this.abortSymbol) {
            throw new Error('aborted');
        }

        return result;
    }

    public abort() {
        this.resolve(this.abortSymbol);
    }
}

export default AbortToken;
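Typical use of `AbortToken`: race a slow operation against cancellation. Note the underlying promise is not actually cancelled; the race just stops the caller from waiting on it. The class is re-declared here so the sketch stands alone, and `fetchWithAbort` is an illustrative caller, not project code:

```typescript
class AbortToken {
    private readonly abortSymbol = Symbol('cancelled');
    private abortPromise: Promise<any>;
    private resolve!: Function;

    constructor() {
        this.abortPromise = new Promise(res => this.resolve = res);
    }

    async wrap<T>(p: PromiseLike<T>): Promise<T> {
        const result = await Promise.race([p, this.abortPromise]);
        if (result === this.abortSymbol) {
            throw new Error('aborted');
        }
        return result;
    }

    abort() {
        this.resolve(this.abortSymbol);
    }
}

// Illustrative caller: wraps a slow promise so abort() unblocks it early.
async function fetchWithAbort(token: AbortToken): Promise<string> {
    const slow = new Promise<string>(res => setTimeout(() => res('done'), 1000));
    return token.wrap(slow); // throws 'aborted' if token.abort() fires first
}
```

Resolving (rather than rejecting) the internal promise with a private `Symbol` is the trick: no outside code can ever produce that sentinel value, so a race result equal to it unambiguously means abort.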
@@ -40,7 +40,7 @@ export const hardLimit = new commander.Option('--hardLimit <limit>', 'When API l
    .argParser(argParseInt);

export const dryRun = new commander.Option('--dryRun', 'Set all subreddits in dry run mode, overriding configurations (default: process.env.DRYRUN || false)')
-   .argParser(parseBoolWithDefault(undefined));
+   .default(undefined);

export const checks = new commander.Option('-h, --checks <checkNames...>', 'An optional list of Checks, by name, that should be run. If none are specified all Checks for the Subreddit the Activity is in will be run');
@@ -13,10 +13,16 @@ import {
    TypedActivityStates
} from "../Common/interfaces";
import {
-   compareDurationValue, comparisonTextOp,
-   isActivityWindowCriteria,
-   normalizeName, parseDuration,
-   parseDurationComparison, parseGenericValueComparison, parseGenericValueOrPercentComparison, parseSubredditName,
+   compareDurationValue,
+   comparisonTextOp, getActivityAuthorName,
+   isActivityWindowCriteria, isStatusError,
+   normalizeName,
+   parseDuration,
+   parseDurationComparison,
+   parseGenericValueComparison,
+   parseGenericValueOrPercentComparison,
+   parseRuleResultsToMarkdownSummary,
+   parseSubredditName,
    truncateStringToLength
} from "../util";
import UserNotes from "../Subreddit/UserNotes";
@@ -119,17 +125,25 @@ export async function getAuthorActivities(user: RedditUser, options: AuthorTyped

    let items: Array<Submission | Comment> = [];
    //let count = 1;
-   let listing;
-   switch (options.type) {
-       case 'comment':
-           listing = await user.getComments({limit: chunkSize});
-           break;
-       case 'submission':
-           listing = await user.getSubmissions({limit: chunkSize});
-           break;
-       default:
-           listing = await user.getOverview({limit: chunkSize});
-           break;
-   }
+   let listing = [];
+   try {
+       switch (options.type) {
+           case 'comment':
+               listing = await user.getComments({limit: chunkSize});
+               break;
+           case 'submission':
+               listing = await user.getSubmissions({limit: chunkSize});
+               break;
+           default:
+               listing = await user.getOverview({limit: chunkSize});
+               break;
+       }
+   } catch (err) {
+       if(isStatusError(err) && err.statusCode === 404) {
+           throw new SimpleError('Reddit returned a 404 for user history. Likely this user is shadowbanned.');
+       } else {
+           throw err;
+       }
+   }
    let hitEnd = false;
    let offset = chunkSize;
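The `isStatusError` guard used above is presumably a type guard over errors that carry an HTTP `statusCode` (as snoowrap request errors do), narrowing `unknown` so `err.statusCode` can be read safely. A plausible sketch; the real guard in `../util` may check differently:

```typescript
// Hypothetical type guard: an Error carrying an HTTP status code.
interface StatusError extends Error {
    statusCode: number;
}

function isStatusError(err: unknown): err is StatusError {
    return err instanceof Error && typeof (err as any).statusCode === 'number';
}
```

With the guard, the 404 branch above stays type-safe while any non-HTTP error falls through and is rethrown unchanged.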
@@ -305,185 +319,217 @@ export const renderContent = async (template: string, data: (Submission | Commen
        };
    }, {});

-   const view = {item: templateData, rules: normalizedRuleResults};
+   const view = {item: templateData, ruleSummary: parseRuleResultsToMarkdownSummary(ruleResults), rules: normalizedRuleResults};
    const rendered = Mustache.render(template, view) as string;
    return he.decode(rendered);
}
export const testAuthorCriteria = async (item: (Comment | Submission), authorOpts: AuthorCriteria, include = true, userNotes: UserNotes) => {
    // @ts-ignore
    const author: RedditUser = await item.author;
    for (const k of Object.keys(authorOpts)) {
        // @ts-ignore
        if (authorOpts[k] !== undefined) {
            switch (k) {
                case 'name':
                    const authPass = () => {
                        // @ts-ignore
                        for (const n of authorOpts[k]) {
                            if (n.toLowerCase() === author.name.toLowerCase()) {
                                return true;
                            }
                        }
                        return false;
                    }
                    const authResult = authPass();
                    if ((include && !authResult) || (!include && authResult)) {
                        return false;
                    }
                    break;
                case 'flairCssClass':
                    const css = await item.author_flair_css_class;
                    const cssPass = () => {
                        // @ts-ignore
                        for (const c of authorOpts[k]) {
                            if (c === css) {
                                return true;
                            }
                        }
                        return false;
                    }
                    const cssResult = cssPass();
                    if ((include && !cssResult) || (!include && cssResult)) {
                        return false;
                    }
                    break;
                case 'flairText':
                    const text = await item.author_flair_text;
                    const textPass = () => {
                        // @ts-ignore
                        for (const c of authorOpts[k]) {
                            if (c === text) {
                                return true;
                            }
                        }
                        return false;
                    };
                    const textResult = textPass();
                    if ((include && !textResult) || (!include && textResult)) {
                        return false;
                    }
                    break;
                case 'isMod':
                    const mods: RedditUser[] = await item.subreddit.getModerators();
                    const isModerator = mods.some(x => x.name === item.author.name);
                    const modMatch = authorOpts.isMod === isModerator;
                    if ((include && !modMatch) || (!include && modMatch)) {
                        return false;
                    }
                    break;
                case 'age':
                    const ageTest = compareDurationValue(parseDurationComparison(await authorOpts.age as string), dayjs.unix(await item.author.created));
                    if ((include && !ageTest) || (!include && ageTest)) {
                        return false;
                    }
                    break;
                case 'linkKarma':
                    const lkCompare = parseGenericValueOrPercentComparison(await authorOpts.linkKarma as string);
                    let lkMatch;
                    if (lkCompare.isPercent) {
                        // @ts-ignore
                        const tk = author.total_karma as number;
                        lkMatch = comparisonTextOp(author.link_karma / tk, lkCompare.operator, lkCompare.value / 100);
                    } else {
                        lkMatch = comparisonTextOp(author.link_karma, lkCompare.operator, lkCompare.value);
                    }
                    if ((include && !lkMatch) || (!include && lkMatch)) {
                        return false;
                    }
                    break;
                case 'commentKarma':
                    const ckCompare = parseGenericValueOrPercentComparison(await authorOpts.commentKarma as string);
                    let ckMatch;
                    if (ckCompare.isPercent) {
                        // @ts-ignore
                        const ck = author.total_karma as number;
                        ckMatch = comparisonTextOp(author.comment_karma / ck, ckCompare.operator, ckCompare.value / 100);
                    } else {
                        ckMatch = comparisonTextOp(author.comment_karma, ckCompare.operator, ckCompare.value);
                    }
                    if ((include && !ckMatch) || (!include && ckMatch)) {
                        return false;
                    }
                    break;
                case 'totalKarma':
                    const tkCompare = parseGenericValueComparison(await authorOpts.totalKarma as string);
                    if (tkCompare.isPercent) {
                        throw new SimpleError(`'totalKarma' value on AuthorCriteria cannot be a percentage`);
                    }
                    // @ts-ignore
                    const totalKarma = author.total_karma as number;
                    const tkMatch = comparisonTextOp(totalKarma, tkCompare.operator, tkCompare.value);
                    if ((include && !tkMatch) || (!include && tkMatch)) {
                        return false;
                    }
                    break;
case 'verified':
|
||||
const vMatch = await author.has_verified_mail === authorOpts.verified as boolean;
|
||||
if ((include && !vMatch) || (!include && vMatch)) {
|
||||
return false;
|
||||
}
|
||||
break;
|
||||
case 'userNotes':
    const notes = await userNotes.getUserNotes(item.author);
    const notePass = () => {
        for (const noteCriteria of authorOpts[k] as UserNoteCriteria[]) {
            const {count = '>= 1', search = 'current', type} = noteCriteria;
            const {
                value,
                operator,
                isPercent,
                extra = ''
            } = parseGenericValueOrPercentComparison(count);
            const order = extra.includes('asc') ? 'ascending' : 'descending';
            switch (search) {
                case 'current':
                    if (notes.length > 0 && notes[notes.length - 1].noteType === type) {
                        return true;
                    }
                    break;
                case 'consecutive':
                    let orderedNotes = notes;
                    if (order === 'descending') {
                        orderedNotes = [...notes];
                        orderedNotes.reverse();
                    }
                    let currCount = 0;
                    for (const note of orderedNotes) {
                        if (note.noteType === type) {
                            currCount++;
                        } else {
                            currCount = 0;
                        }
                        if (isPercent) {
                            throw new SimpleError(`When comparing UserNotes with 'consecutive' search 'count' cannot be a percentage. Given: ${count}`);
                        }
                        if (comparisonTextOp(currCount, operator, value)) {
                            return true;
                        }
                    }
                    break;
                case 'total':
                    if (isPercent) {
                        if (comparisonTextOp(notes.filter(x => x.noteType === type).length / notes.length, operator, value / 100)) {
                            return true;
                        }
                    } else if (comparisonTextOp(notes.filter(x => x.noteType === type).length, operator, value)) {
                        return true;
                    }
            }
        }
        return false;
    }
    const noteResult = notePass();
    if ((include && !noteResult) || (!include && noteResult)) {
        return false;
    }
    break;
const {shadowBanned, ...rest} = authorOpts;

if (shadowBanned !== undefined) {
    try {
        // @ts-ignore
        await item.author.fetch();
        // user is not shadowbanned
        // if criteria specifies they SHOULD be shadowbanned then return false now
        if (shadowBanned) {
            return false;
        }
    } catch (err) {
        if (isStatusError(err) && err.statusCode === 404) {
            // user is shadowbanned
            // if criteria specifies they should not be shadowbanned then return false now
            if (!shadowBanned) {
                return false;
            }
        } else {
            throw err;
        }
    }
}
return true;
try {
    const authorName = getActivityAuthorName(item.author);

    for (const k of Object.keys(rest)) {
        // @ts-ignore
        if (authorOpts[k] !== undefined) {
            switch (k) {
                case 'name':
                    const authPass = () => {
                        // @ts-ignore
                        for (const n of authorOpts[k]) {
                            if (n.toLowerCase() === authorName.toLowerCase()) {
                                return true;
                            }
                        }
                        return false;
                    }
                    const authResult = authPass();
                    if ((include && !authResult) || (!include && authResult)) {
                        return false;
                    }
                    break;
                case 'flairCssClass':
                    const css = await item.author_flair_css_class;
                    const cssPass = () => {
                        // @ts-ignore
                        for (const c of authorOpts[k]) {
                            if (c === css) {
                                return true;
                            }
                        }
                        return false;
                    }
                    const cssResult = cssPass();
                    if ((include && !cssResult) || (!include && cssResult)) {
                        return false;
                    }
                    break;
                case 'flairText':
                    const text = await item.author_flair_text;
                    const textPass = () => {
                        // @ts-ignore
                        for (const c of authorOpts[k]) {
                            if (c === text) {
                                return true;
                            }
                        }
                        return false;
                    };
                    const textResult = textPass();
                    if ((include && !textResult) || (!include && textResult)) {
                        return false;
                    }
                    break;
                case 'isMod':
                    const mods: RedditUser[] = await item.subreddit.getModerators();
                    const isModerator = mods.some(x => x.name === authorName);
                    const modMatch = authorOpts.isMod === isModerator;
                    if ((include && !modMatch) || (!include && modMatch)) {
                        return false;
                    }
                    break;
                case 'age':
                    const ageTest = compareDurationValue(parseDurationComparison(await authorOpts.age as string), dayjs.unix(await item.author.created));
                    if ((include && !ageTest) || (!include && ageTest)) {
                        return false;
                    }
                    break;
                case 'linkKarma':
                    const lkCompare = parseGenericValueOrPercentComparison(await authorOpts.linkKarma as string);
                    let lkMatch;
                    if (lkCompare.isPercent) {
                        // @ts-ignore
                        const tk = await item.author.total_karma as number;
                        lkMatch = comparisonTextOp(item.author.link_karma / tk, lkCompare.operator, lkCompare.value / 100);
                    } else {
                        lkMatch = comparisonTextOp(item.author.link_karma, lkCompare.operator, lkCompare.value);
                    }
                    if ((include && !lkMatch) || (!include && lkMatch)) {
                        return false;
                    }
                    break;
                case 'commentKarma':
                    const ckCompare = parseGenericValueOrPercentComparison(await authorOpts.commentKarma as string);
                    let ckMatch;
                    if (ckCompare.isPercent) {
                        // @ts-ignore
                        const ck = await item.author.total_karma as number;
                        ckMatch = comparisonTextOp(item.author.comment_karma / ck, ckCompare.operator, ckCompare.value / 100);
                    } else {
                        ckMatch = comparisonTextOp(item.author.comment_karma, ckCompare.operator, ckCompare.value);
                    }
                    if ((include && !ckMatch) || (!include && ckMatch)) {
                        return false;
                    }
                    break;
                case 'totalKarma':
                    const tkCompare = parseGenericValueComparison(await authorOpts.totalKarma as string);
                    if (tkCompare.isPercent) {
                        throw new SimpleError(`'totalKarma' value on AuthorCriteria cannot be a percentage`);
                    }
                    // @ts-ignore
                    const totalKarma = await item.author.total_karma as number;
                    const tkMatch = comparisonTextOp(totalKarma, tkCompare.operator, tkCompare.value);
                    if ((include && !tkMatch) || (!include && tkMatch)) {
                        return false;
                    }
                    break;
                case 'verified':
                    const vMatch = await item.author.has_verified_mail === authorOpts.verified as boolean;
                    if ((include && !vMatch) || (!include && vMatch)) {
                        return false;
                    }
                    break;
                case 'userNotes':
                    const notes = await userNotes.getUserNotes(item.author);
                    const notePass = () => {
                        for (const noteCriteria of authorOpts[k] as UserNoteCriteria[]) {
                            const {count = '>= 1', search = 'current', type} = noteCriteria;
                            const {
                                value,
                                operator,
                                isPercent,
                                extra = ''
                            } = parseGenericValueOrPercentComparison(count);
                            const order = extra.includes('asc') ? 'ascending' : 'descending';
                            switch (search) {
                                case 'current':
                                    if (notes.length > 0 && notes[notes.length - 1].noteType === type) {
                                        return true;
                                    }
                                    break;
                                case 'consecutive':
                                    let orderedNotes = notes;
                                    if (order === 'descending') {
                                        orderedNotes = [...notes];
                                        orderedNotes.reverse();
                                    }
                                    let currCount = 0;
                                    for (const note of orderedNotes) {
                                        if (note.noteType === type) {
                                            currCount++;
                                        } else {
                                            currCount = 0;
                                        }
                                        if (isPercent) {
                                            throw new SimpleError(`When comparing UserNotes with 'consecutive' search 'count' cannot be a percentage. Given: ${count}`);
                                        }
                                        if (comparisonTextOp(currCount, operator, value)) {
                                            return true;
                                        }
                                    }
                                    break;
                                case 'total':
                                    if (isPercent) {
                                        if (comparisonTextOp(notes.filter(x => x.noteType === type).length / notes.length, operator, value / 100)) {
                                            return true;
                                        }
                                    } else if (comparisonTextOp(notes.filter(x => x.noteType === type).length, operator, value)) {
                                        return true;
                                    }
                            }
                        }
                        return false;
                    }
                    const noteResult = notePass();
                    if ((include && !noteResult) || (!include && noteResult)) {
                        return false;
                    }
                    break;
            }
        }
    }
    return true;
} catch (err) {
    if (isStatusError(err) && err.statusCode === 404) {
        throw new SimpleError('Reddit returned a 404 while trying to retrieve User profile. It is likely this user is shadowbanned.');
    } else {
        throw err;
    }
}
}

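The shadowban handling above leans on a Reddit behavior: fetching the profile of a shadowbanned user fails with a 404, while any other error is unrelated and should propagate. That pattern can be sketched standalone (the `StatusError` class and `fetchProfile` callback here are hypothetical stand-ins for the real snoowrap call and error type):

```typescript
// Hedged sketch of the shadowban test used above: a profile fetch that
// rejects with a 404 means the user is likely shadowbanned; any other
// failure is re-thrown. StatusError is an illustrative stand-in.
class StatusError extends Error {
    constructor(public statusCode: number) {
        super(`HTTP ${statusCode}`);
    }
}

const isShadowbanned = async (fetchProfile: () => Promise<unknown>): Promise<boolean> => {
    try {
        await fetchProfile();
        return false; // profile resolved, user is visible
    } catch (err) {
        if (err instanceof StatusError && err.statusCode === 404) {
            return true; // profile hidden, likely shadowbanned
        }
        throw err; // unrelated failure, surface it
    }
};
```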
export interface ItemContent {
@@ -504,7 +550,8 @@ export const itemContentPeek = async (item: (Comment | Submission), peekLength =
        peek = `${truncatePeek(item.title)} by ${author} https://reddit.com${item.permalink}`;

    } else if (item instanceof Comment) {
        content = truncatePeek(item.body);
        // replace newlines with spaces to make peek more compact
        content = truncatePeek(item.body.replaceAll('\n', ' '));
        peek = `${truncatePeek(content)} by ${author} in https://reddit.com${item.permalink}`;
    }

@@ -606,6 +653,9 @@ export const getAttributionIdentifier = (sub: Submission, useParentMediaDomain =
    if (displayDomain === '') {
        displayDomain = domain;
    }
    if (domainIdents.length === 0 && domain !== '') {
        domainIdents.push(domain);
    }

    return {display: displayDomain, domain, aliases: domainIdents, provider, mediaType};
}
@@ -636,24 +686,3 @@ export const activityIsDeleted = (item: Submission | Comment): boolean => {
    }
    return item.author.name === '[deleted]'
}

class ClientSingleton {
    client!: Snoowrap;

    constructor(client?: Snoowrap) {
        if (client !== undefined) {
            this.client = client;
        }
    }

    setClient(client: Snoowrap) {
        this.client = client;
    }

    getClient(): Snoowrap {
        return this.client;
    }
}

// quick little hack to get access to the client without having to pass it all the way down the chain
export const singleton = new ClientSingleton();

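The comment above states the intent: a module-level singleton so deeply nested code can grab the shared client without it being passed down every call chain. A minimal generic sketch of the same shape (the `Holder` and `FakeClient` names are hypothetical, not part of the codebase):

```typescript
// Generic version of the ClientSingleton pattern above; FakeClient is a
// placeholder standing in for the real Snoowrap client type.
class Holder<T> {
    private value?: T;

    set(value: T) {
        this.value = value;
    }

    get(): T {
        if (this.value === undefined) {
            // mirrors the non-null assertion (client!) in the original:
            // accessing before setClient() is a programmer error
            throw new Error('client accessed before it was set');
        }
        return this.value;
    }
}

interface FakeClient { name: string }
const clientHolder = new Holder<FakeClient>();
```

Set once at startup (`clientHolder.set({name: 'bot'})`); any module importing `clientHolder` can then call `.get()` without threading the client through intermediate layers, at the cost of a hidden dependency.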
@@ -1,29 +1,42 @@
import {labelledFormat, logLevels} from "../util";
import winston, {Logger} from "winston";
import {DuplexTransport} from "winston-duplex";

const {transports} = winston;

export const getLogger = (options: any, name = 'default'): Logger => {
export const getLogger = (options: any, name = 'app'): Logger => {
    if (!winston.loggers.has(name)) {
        const {
            path,
            level,
            additionalTransports = [],
            defaultLabel = 'App',
        } = options || {};

        const consoleTransport = new transports.Console();
        const consoleTransport = new transports.Console({
            handleExceptions: true,
            // @ts-expect-error
            handleRejections: true,
        });

        const myTransports = [
            consoleTransport,
            new DuplexTransport({
                stream: {
                    transform(chunk, e, cb) {
                        cb(null, chunk);
                    },
                    objectMode: true,
                },
                name: 'duplex',
                dump: false,
                handleExceptions: true,
                // @ts-expect-error
                handleRejections: true,
            }),
            ...additionalTransports,
        ];

        let errorTransports = [consoleTransport];

        for (const a of additionalTransports) {
            myTransports.push(a);
            errorTransports.push(a);
        }

        if (path !== undefined && path !== '') {
            const rotateTransport = new winston.transports.DailyRotateFile({
                dirname: path,
@@ -31,21 +44,19 @@ export const getLogger = (options: any, name = 'default'): Logger => {
                symlinkName: 'contextBot-current.log',
                filename: 'contextBot-%DATE%.log',
                datePattern: 'YYYY-MM-DD',
                maxSize: '5m'
                maxSize: '5m',
                handleExceptions: true,
                handleRejections: true,
            });
            // @ts-ignore
            myTransports.push(rotateTransport);
            // @ts-ignore
            errorTransports.push(rotateTransport);
        }

        const loggerOptions = {
            level: level || 'info',
            format: labelledFormat(),
            format: labelledFormat(defaultLabel),
            transports: myTransports,
            levels: logLevels,
            exceptionHandlers: errorTransports,
            rejectionHandlers: errorTransports,
        };

        winston.loggers.add(name, loggerOptions);

1113  src/Web/Client/index.ts  Normal file
108  src/Web/Common/defaults.ts  Normal file
@@ -0,0 +1,108 @@
import {BotStats, BotStatusResponse, SubredditDataResponse} from "./interfaces";
import {ManagerStats, RunningState} from "../../Subreddit/Manager";
import {Invokee, RunState} from "../../Common/interfaces";
import {cacheStats} from "../../util";

const managerStats: ManagerStats = {
    actionsRun: 0,
    actionsRunSinceStart: 0,
    actionsRunSinceStartTotal: 0,
    actionsRunTotal: 0,
    cache: {
        currentKeyCount: 0,
        isShared: false,
        missPercent: "-",
        provider: "-",
        requestRate: 0,
        totalMiss: 0,
        totalRequests: 0,
        types: cacheStats()
    },
    checksRunSinceStartTotal: 0,
    checksRunTotal: 0,
    checksTriggered: 0,
    checksTriggeredSinceStart: 0,
    checksTriggeredSinceStartTotal: 0,
    checksTriggeredTotal: 0,
    eventsAvg: 0,
    eventsCheckedSinceStartTotal: 0,
    eventsCheckedTotal: 0,
    rulesAvg: 0,
    rulesCachedSinceStartTotal: 0,
    rulesCachedTotal: 0,
    rulesRunSinceStartTotal: 0,
    rulesRunTotal: 0,
    rulesTriggeredSinceStartTotal: 0,
    rulesTriggeredTotal: 0,
};
const botStats: BotStats = {
    apiAvg: '-',
    apiDepletion: "-",
    apiLimit: 0,
    limitReset: '-',
    limitResetHuman: "-",
    nannyMode: "-",
    nextHeartbeat: "-",
    nextHeartbeatHuman: "-",
    startedAtHuman: "-"
};

const runningState: RunningState = {
    causedBy: '-' as Invokee,
    state: '-' as RunState,
}

const sub: SubredditDataResponse = {
    botState: runningState,
    checks: {comments: 0, submissions: 0},
    delayBy: "-",
    dryRun: false,
    eventsState: runningState,
    globalMaxWorkers: 0,
    hardLimit: 0,
    heartbeat: 0,
    heartbeatHuman: "-",
    indicator: "-",
    logs: [],
    maxWorkers: 0,
    name: "-",
    pollingInfo: [],
    queueState: runningState,
    queuedActivities: 0,
    runningActivities: 0,
    softLimit: 0,
    startedAt: "-",
    startedAtHuman: "-",
    stats: managerStats,
    subMaxWorkers: 0,
    validConfig: false,
    wikiHref: "-",
    wikiLastCheck: "-",
    wikiLastCheckHuman: "-",
    wikiLocation: "-",
    wikiRevision: "-",
    wikiRevisionHuman: "-"
};

export const defaultBotStatus = (subreddits: string[] = []) => {

    const subs: SubredditDataResponse[] = [
        {
            ...sub,
            name: 'All',
        },
        ...subreddits.map(x => ({...sub, name: x}))
    ];

    const data: BotStatusResponse = {
        subreddits: subs,
        system: {
            startedAt: '-',
            running: false,
            account: '-',
            name: '-',
            ...botStats,
        }
    };
    return data;
}
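`defaultBotStatus` above builds one placeholder response per subreddit by spreading a shared template object and overriding only `name`. That template-spread pattern in isolation (`SubStatus` and `defaultSubs` are hypothetical minimal shapes, not the real interfaces):

```typescript
// The template-spread pattern used by defaultBotStatus above, reduced to a
// minimal hypothetical shape: one 'All' entry plus one entry per subreddit.
// Spreading copies the template, so the shared object is never mutated.
interface SubStatus {
    name: string;
    validConfig: boolean;
}

const template: SubStatus = {name: '-', validConfig: false};

const defaultSubs = (subreddits: string[]): SubStatus[] => [
    {...template, name: 'All'},
    ...subreddits.map(x => ({...template, name: x})),
];
```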
59  src/Web/Common/interfaces.ts  Normal file
@@ -0,0 +1,59 @@
import {ManagerStats, RunningState} from "../../Subreddit/Manager";

export interface BotStats {
    startedAtHuman: string,
    nextHeartbeat: string,
    nextHeartbeatHuman: string,
    apiLimit: number,
    apiAvg: string | number,
    nannyMode: string,
    apiDepletion: string,
    limitReset: string | number,
    limitResetHuman: string
}

export interface SubredditDataResponse {
    name: string
    logs: string[]
    botState: RunningState
    eventsState: RunningState
    queueState: RunningState
    indicator: string
    queuedActivities: number
    runningActivities: number
    maxWorkers: number
    subMaxWorkers: number
    globalMaxWorkers: number
    validConfig: string | boolean
    dryRun: string | boolean
    pollingInfo: string[]
    checks: {
        submissions: number
        comments: number
    }
    wikiLocation: string
    wikiHref: string
    wikiRevisionHuman: string
    wikiRevision: string
    wikiLastCheckHuman: string
    wikiLastCheck: string
    stats: ManagerStats
    startedAt: string
    startedAtHuman: string
    delayBy: string
    softLimit?: number
    hardLimit?: number
    heartbeatHuman?: string
    heartbeat: number
}

export interface BotStatusResponse {
    system: BotStats & {
        startedAt: string,
        name: string,
        running: boolean,
        error?: string,
        account: string,
    }
    subreddits: SubredditDataResponse[]
}
33  src/Web/Common/middleware.ts  Normal file
@@ -0,0 +1,33 @@
import {Request, Response} from 'express';

export interface boolOptions {
    name: string,
    defaultVal: any
}

export const booleanMiddle = (boolParams: (string | boolOptions)[] = []) => async (req: Request, res: Response, next: Function) => {
    for (const b of boolParams) {
        const opts = typeof b === 'string' ? {name: b, defaultVal: undefined} : b as boolOptions;

        const bVal = req.query[opts.name] as any;
        if (bVal !== undefined) {
            let truthyVal: boolean;
            if (bVal === 'true' || bVal === true || bVal === 1 || bVal === '1') {
                truthyVal = true;
            } else if (bVal === 'false' || bVal === false || bVal === 0 || bVal === '0') {
                truthyVal = false;
            } else {
                res.status(400);
                return res.send(`Expected query parameter ${opts.name} to be a truthy value. Got "${bVal}" but must be one of these: true/false, 1/0`);
            }
            // @ts-ignore
            req.query[opts.name] = truthyVal;
        } else if (opts.defaultVal !== undefined) {
            req.query[opts.name] = opts.defaultVal;
        } else {
            res.status(400);
            return res.send(`Expected query parameter ${opts.name} to be a truthy value but it was missing. Must be one of these: true/false, 1/0`);
        }
    }
    next();
}
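`booleanMiddle` above normalizes boolean-ish query-string flags before the route handlers see them. The coercion rule it applies can be isolated as a pure function (`coerceBool` is an illustrative name, not part of the codebase):

```typescript
// The truthy/falsy normalization booleanMiddle applies to query params,
// extracted as a pure function; returns undefined for unparseable input
// (the case where the middleware would respond with a 400).
const coerceBool = (val: unknown): boolean | undefined => {
    if (val === 'true' || val === true || val === 1 || val === '1') {
        return true;
    }
    if (val === 'false' || val === false || val === 0 || val === '0') {
        return false;
    }
    return undefined;
};
```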
22  src/Web/Common/util.ts  Normal file
@@ -0,0 +1,22 @@
import {App} from "../../App";
import {BotStats} from "./interfaces";
import dayjs from "dayjs";
import {formatNumber} from "../../util";
import Bot from "../../Bot";

export const opStats = (bot: Bot): BotStats => {
    const limitReset = dayjs(bot.client.ratelimitExpiration);
    const nextHeartbeat = bot.nextHeartbeat !== undefined ? bot.nextHeartbeat.local().format('MMMM D, YYYY h:mm A Z') : 'N/A';
    const nextHeartbeatHuman = bot.nextHeartbeat !== undefined ? `in ${dayjs.duration(bot.nextHeartbeat.diff(dayjs())).humanize()}` : 'N/A'
    return {
        startedAtHuman: `${dayjs.duration(dayjs().diff(bot.startedAt)).humanize()}`,
        nextHeartbeat,
        nextHeartbeatHuman,
        apiLimit: bot.client !== undefined ? bot.client.ratelimitRemaining : 0,
        apiAvg: formatNumber(bot.apiRollingAvg),
        nannyMode: bot.nannyMode || 'Off',
        apiDepletion: bot.apiEstDepletion === undefined ? 'Not Calculated' : bot.apiEstDepletion.humanize(),
        limitReset: limitReset.format(),
        limitResetHuman: `in ${dayjs.duration(limitReset.diff(dayjs())).humanize()}`,
    }
}
27  src/Web/Server/interfaces.ts  Normal file
@@ -0,0 +1,27 @@
import { Request } from "express";
import {App} from "../../App";
import Bot from "../../Bot";

// export interface ServerRequest extends Request {
//     botApp: App
//     bot?: Bot
//     //user?: AuthenticatedUser
// }
//
// export interface ServerRequestRedditor extends ServerRequest {
//     user?: AuthenticatedRedditUser
// }
//
// export interface AuthenticatedUser extends Express.User {
//     machine: boolean
// }
//
// export interface AuthenticatedRedditUser extends AuthenticatedUser {
//     name: string
//     subreddits: string[]
//     isOperator: boolean
//     realManagers: string[]
//     moderatedManagers: string[]
//     realBots: string[]
//     moderatedBots: string[]
// }
56  src/Web/Server/middleware.ts  Normal file
@@ -0,0 +1,56 @@
import {Request, Response} from "express";
import Bot from "../../Bot";

export const authUserCheck = (userRequired: boolean = true) => async (req: Request, res: Response, next: Function) => {
    if (req.isAuthenticated()) {
        if (userRequired && req.user.machine) {
            return res.status(403).send('Must be authenticated as a user to access this route');
        }
        return next();
    } else {
        return res.status(401).send('Must be authenticated to access this route');
    }
}

export const botRoute = (required = true) => async (req: Request, res: Response, next: Function) => {
    const {bot: botVal} = req.query;
    if (botVal === undefined) {
        if (required) {
            return res.status(400).send("Must specify 'bot' parameter");
        }
        return next();
    }
    const botStr = botVal as string;

    if (req.user !== undefined) {
        if (req.user.realBots === undefined || !req.user.realBots.map(x => x.toLowerCase()).includes(botStr.toLowerCase())) {
            return res.status(404).send(`Bot named ${botStr} does not exist or you do not have permission to access it.`);
        }
        req.serverBot = req.botApp.bots.find(x => x.botName === botStr) as Bot;
        return next();
    }
    return next();
}
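`botRoute` above matches the requested bot name against the user's `realBots` list case-insensitively, treating a missing list as no access. That check can be sketched as a pure helper (`canAccessBot` is an illustrative name, not part of the codebase):

```typescript
// Case-insensitive membership test mirroring the botRoute permission check
// above: an undefined list means the user may access no bots.
const canAccessBot = (realBots: string[] | undefined, botName: string): boolean =>
    (realBots ?? []).map(x => x.toLowerCase()).includes(botName.toLowerCase());
```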
export const subredditRoute = (required = true) => async (req: Request, res: Response, next: Function) => {

    const bot = req.serverBot;

    const {subreddit} = req.query as any;
    if (subreddit === undefined && required === false) {
        next();
    } else {
        const {name: userName, realManagers = [], isOperator} = req.user as Express.User;
        if (!isOperator && !realManagers.includes(subreddit)) {
            return res.status(400).send('Cannot access route for subreddit you do not manage or is not run by the bot')
        }
        const manager = bot.subManagers.find(x => x.displayLabel === subreddit);
        if (manager === undefined) {
            return res.status(400).send('Cannot access route for subreddit you do not manage or is not run by the bot')
        }

        req.manager = manager;

        next();
    }
}
35  src/Web/Server/routes/authenticated/applicationRoutes.ts  Normal file
@@ -0,0 +1,35 @@
import {Router} from '@awaitjs/express';
import {Request, Response} from 'express';
import {authUserCheck} from "../../middleware";

const router = Router();
router.use(authUserCheck(false));

interface OperatorData {
    name: string[]
    display?: string
    friendly?: string
}

export const heartbeat = (opData: OperatorData) => {
    const response = async (req: Request, res: Response) => {
        if (req.botApp === undefined) {
            return res.status(500).send('Application is initializing, try again in a few seconds');
        }
        const heartbeatData = {
            subreddits: req.botApp.bots.map(y => y.subManagers.map(x => x.subreddit.display_name)).flat(),
            bots: req.botApp.bots.map(x => ({botName: x.botName, subreddits: x.subManagers.map(y => y.displayLabel), running: x.running})),
            operators: opData.name,
            operatorDisplay: opData.display,
            friendly: opData.friendly,
            //friendly: req.botApp !== undefined ? req.botApp.botName : undefined,
            //running: req.botApp !== undefined ? req.botApp.heartBeating : false,
            //nanny: req.botApp !== undefined ? req.botApp.nannyMode : undefined,
            //botName: req.botApp !== undefined ? req.botApp.botName : undefined,
            //botLink: req.botApp !== undefined ? req.botApp.botLink : undefined,
            //error: req.botApp.error,
        };
        return res.json(heartbeatData);
    };
    return [authUserCheck(false), response];
}
96  src/Web/Server/routes/authenticated/user/action.ts  Normal file
@@ -0,0 +1,96 @@
import express, {Request, Response} from 'express';
import {RUNNING, USER} from "../../../../../Common/interfaces";
import Submission from "snoowrap/dist/objects/Submission";
import LoggedError from "../../../../../Utils/LoggedError";
import winston from "winston";
import {authUserCheck, botRoute} from "../../../middleware";
import {booleanMiddle} from "../../../../Common/middleware";

const action = async (req: express.Request, res: express.Response) => {
    const bot = req.serverBot;

    const {type, action, subreddit, force = false} = req.query as any;
    const {name: userName, realManagers = [], isOperator} = req.user as Express.User;
    let subreddits: string[] = [];
    if (subreddit === 'All') {
        subreddits = realManagers;
    } else if (realManagers.includes(subreddit)) {
        subreddits = [subreddit];
    }

    for (const s of subreddits) {
        const manager = bot.subManagers.find(x => x.displayLabel === s);
        if (manager === undefined) {
            winston.loggers.get('app').warn(`Manager for ${s} does not exist`, {subreddit: `/u/${userName}`});
            continue;
        }
        const mLogger = manager.logger;
        mLogger.info(`/u/${userName} invoked '${action}' action for ${type} on ${manager.displayLabel}`);
        try {
            switch (action) {
                case 'start':
                    if (type === 'bot') {
                        await manager.start('user');
                    } else if (type === 'queue') {
                        manager.startQueue('user');
                    } else {
                        await manager.startEvents('user');
                    }
                    break;
                case 'stop':
                    if (type === 'bot') {
                        await manager.stop('user');
                    } else if (type === 'queue') {
                        await manager.stopQueue('user');
                    } else {
                        manager.stopEvents('user');
                    }
                    break;
                case 'pause':
                    if (type === 'queue') {
                        await manager.pauseQueue('user');
                    } else {
                        manager.pauseEvents('user');
                    }
                    break;
                case 'reload':
                    const prevQueueState = manager.queueState.state;
                    const newConfig = await manager.parseConfiguration('user', force);
                    if (newConfig === false) {
                        mLogger.info('Config was up-to-date');
                    }
                    if (newConfig && prevQueueState === RUNNING) {
                        await manager.startQueue(USER);
                    }
                    break;
                case 'check':
                    if (type === 'unmoderated') {
                        const activities = await manager.subreddit.getUnmoderated({limit: 100});
                        for (const a of activities.reverse()) {
                            await manager.firehose.push({
                                checkType: a instanceof Submission ? 'Submission' : 'Comment',
                                activity: a,
                            });
                        }
                    } else {
                        const activities = await manager.subreddit.getModqueue({limit: 100});
                        for (const a of activities.reverse()) {
                            await manager.firehose.push({
                                checkType: a instanceof Submission ? 'Submission' : 'Comment',
                                activity: a,
                            });
                        }
                    }
                    break;
            }
        } catch (err) {
            if (!(err instanceof LoggedError)) {
                mLogger.error(err, {subreddit: manager.displayLabel});
            }
        }
    }
    res.send('OK');
};

const actionRoute = [authUserCheck(), botRoute(), booleanMiddle(['force']), action];
export default actionRoute;
50  src/Web/Server/routes/authenticated/user/addBot.ts  Normal file
@@ -0,0 +1,50 @@
import {Request, Response} from 'express';
import {BotInstanceConfig} from "../../../../../Common/interfaces";
import {authUserCheck} from "../../../middleware";
import Bot from "../../../../../Bot";
import LoggedError from "../../../../../Utils/LoggedError";

const addBot = () => {

    const middleware = [
        authUserCheck(),
    ];

    const response = async (req: Request, res: Response) => {

        if (!(req.user as Express.User).isOperator) {
            return res.status(401).send("Must be an Operator to use this route");
        }

        const newBot = new Bot(req.body as BotInstanceConfig, req.botApp.logger);
        req.botApp.bots.push(newBot);
        let result: any = {stored: true};
        try {
            if (newBot.error !== undefined) {
                result.error = newBot.error;
                return res.status(500).json(result);
            }
            await newBot.testClient();
            await newBot.buildManagers();
            newBot.runManagers('user').catch((err) => {
                req.botApp.logger.error(`Unexpected error occurred while running Bot ${newBot.botName}. Bot must be re-built to restart`);
                if (!err.logged || !(err instanceof LoggedError)) {
                    req.botApp.logger.error(err);
                }
            });
        } catch (err) {
            if (newBot.error === undefined) {
                newBot.error = err.message;
                result.error = err.message;
            }
            req.botApp.logger.error(`Bot ${newBot.botName} cannot recover from this error and must be re-built`);
            if (!err.logged || !(err instanceof LoggedError)) {
                req.botApp.logger.error(err);
            }
        }
        return res.json(result);
    }
    return [...middleware, response];
}

export default addBot;
97  src/Web/Server/routes/authenticated/user/index.ts  Normal file
@@ -0,0 +1,97 @@
import {Request, Response} from 'express';
import {authUserCheck, botRoute, subredditRoute} from "../../../middleware";
import Submission from "snoowrap/dist/objects/Submission";
import winston from 'winston';
import {COMMENT_URL_ID, parseLinkIdentifier, SUBMISSION_URL_ID} from "../../../../../util";
import {booleanMiddle} from "../../../../Common/middleware";
import {Manager} from "../../../../../Subreddit/Manager";
import {ActionedEvent} from "../../../../../Common/interfaces";

const commentReg = parseLinkIdentifier([COMMENT_URL_ID]);
const submissionReg = parseLinkIdentifier([SUBMISSION_URL_ID]);

const config = async (req: Request, res: Response) => {
    const manager = req.manager as Manager;

    // @ts-ignore
    const wiki = await manager.subreddit.getWikiPage(manager.wikiLocation).fetch();
    return res.send(wiki.content_md);
};
export const configRoute = [authUserCheck(), botRoute(), subredditRoute(), config];

const actionedEvents = async (req: Request, res: Response) => {
    let managers: Manager[] = [];
    const manager = req.manager as Manager | undefined;
    if (manager !== undefined) {
        managers.push(manager);
    } else {
        for (const manager of req.serverBot.subManagers) {
            if ((req.user?.realManagers as string[]).includes(manager.displayLabel)) {
                managers.push(manager);
            }
        }
    }

    let events: ActionedEvent[] = [];
    for (const m of managers) {
        if (m.resources !== undefined) {
            events = events.concat(await m.resources.getActionedEvents());
        }
    }

    events.sort((a, b) => b.timestamp - a.timestamp);

    return res.json(events);
};
export const actionedEventsRoute = [authUserCheck(), botRoute(), subredditRoute(false), actionedEvents];

const action = async (req: Request, res: Response) => {
    const bot = req.serverBot;

    const {url, dryRun, subreddit} = req.query as any;
    const {name: userName, realManagers = [], isOperator} = req.user as Express.User;

    let a;
    const commentId = commentReg(url);
    if (commentId !== undefined) {
        // @ts-ignore
        a = await bot.client.getComment(commentId);
    }
    if (a === undefined) {
        const submissionId = submissionReg(url);
        if (submissionId !== undefined) {
            // @ts-ignore
            a = await bot.client.getSubmission(submissionId);
        }
    }

    if (a === undefined) {
        winston.loggers.get('app').error('Could not parse Comment or Submission ID from given URL', {subreddit: `/u/${userName}`});
        return res.send('OK');
    } else {
        // @ts-ignore
        const activity = await a.fetch();
        const sub = await activity.subreddit.display_name;

        let manager = subreddit === 'All' ? bot.subManagers.find(x => x.subreddit.display_name === sub) : bot.subManagers.find(x => x.displayLabel === subreddit);

        if (manager === undefined || (!realManagers.includes(manager.displayLabel))) {
            let msg = 'Activity does not belong to a subreddit you moderate or the bot runs on.';
            if (subreddit === 'All') {
                msg = `${msg} If you want to test an Activity against a Subreddit's config it does not belong to then switch to that Subreddit's tab first.`
            }
            winston.loggers.get('app').error(msg, {subreddit: `/u/${userName}`});
            return res.send('OK');
        }

        // will run dryrun if specified or if running activity on subreddit it does not belong to
        const dr: boolean | undefined = (dryRun || manager.subreddit.display_name !== sub) ? true : undefined;
        manager.logger.info(`/u/${userName} running${dr === true ? ' DRY RUN ' : ' '}check on${manager.subreddit.display_name !== sub ? ' FOREIGN ACTIVITY ' : ' '}${url}`);
        await manager.runChecks(activity instanceof Submission ? 'Submission' : 'Comment', activity, {dryRun: dr})
    }
    res.send('OK');
};

export const actionRoute = [authUserCheck(), botRoute(), booleanMiddle(['dryRun']), action];
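The `action` handler above first tries to resolve the given URL as a comment permalink and only falls back to treating it as a submission permalink. A minimal sketch of that fallback, using illustrative regexes (assumptions here, not the project's actual `parseLinkIdentifier` implementation):

```typescript
// Hypothetical extractors; the real project builds these via
// parseLinkIdentifier([COMMENT_URL_ID]) / parseLinkIdentifier([SUBMISSION_URL_ID]).
const commentId = (url: string): string | undefined =>
    url.match(/\/comments\/\w+\/[^/]+\/(\w+)/)?.[1];
const submissionId = (url: string): string | undefined =>
    url.match(/\/comments\/(\w+)/)?.[1];

// Try the more specific comment match first, then fall back to submission.
const resolve = (url: string): string | undefined =>
    commentId(url) ?? submissionId(url);

console.log(resolve('https://reddit.com/r/test/comments/abc123/title/def456/')); // def456
console.log(resolve('https://reddit.com/r/test/comments/abc123/title/'));        // abc123
```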
76  src/Web/Server/routes/authenticated/user/logs.ts  (Normal file)
@@ -0,0 +1,76 @@
import {Router} from '@awaitjs/express';
import {Request, Response} from 'express';
import {filterLogBySubreddit, isLogLineMinLevel, LogEntry, parseSubredditLogName} from "../../../../../util";
import {Transform} from "stream";
import winston from "winston";
import pEvent from "p-event";
import {getLogger} from "../../../../../Utils/loggerFactory";
import {booleanMiddle} from "../../../../Common/middleware";
import {authUserCheck, botRoute} from "../../../middleware";
import {LogInfo} from "../../../../../Common/interfaces";
import {MESSAGE} from "triple-beam";

// TODO update logs api
const logs = (subLogMap: Map<string, LogEntry[]>) => {
    const middleware = [
        authUserCheck(),
        booleanMiddle([{
            name: 'stream',
            defaultVal: false
        }])
    ];

    const response = async (req: Request, res: Response) => {
        const logger = winston.loggers.get('app');

        const {name: userName, realManagers = [], isOperator} = req.user as Express.User;
        const {level = 'verbose', stream, limit = 200, sort = 'descending', streamObjects = false} = req.query;
        if (stream) {
            const origin = req.header('X-Forwarded-For') ?? req.header('host');
            try {
                logger.stream().on('log', (log: LogInfo) => {
                    if (isLogLineMinLevel(log, level as string)) {
                        const {subreddit: subName} = log;
                        if (isOperator || (subName !== undefined && (realManagers.includes(subName) || subName.includes(userName)))) {
                            if (streamObjects) {
                                res.write(`${JSON.stringify(log)}\r\n`);
                            } else {
                                res.write(`${log[MESSAGE]}\r\n`)
                            }
                        }
                    }
                });
                logger.info(`${userName} from ${origin} => CONNECTED`);
                await pEvent(req, 'close');
                console.log('Request closed detected with "close" listener');
                res.destroy();
                return;
            } catch (e) {
                if (e.code !== 'ECONNRESET') {
                    logger.error(e);
                }
            } finally {
                logger.info(`${userName} from ${origin} => DISCONNECTED`);
                res.destroy();
            }
        } else {
            const logs = filterLogBySubreddit(subLogMap, realManagers, {
                level: (level as string),
                operator: isOperator,
                user: userName,
                sort: sort as 'descending' | 'ascending',
                limit: Number.parseInt((limit as string))
            });
            const subArr: any = [];
            logs.forEach((v: string[], k: string) => {
                subArr.push({name: k, logs: v.join('')});
            });
            return res.json(subArr);
        }
    };

    return [...middleware, response];
}

export default logs;
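When `streamObjects=true`, the handler above writes each log record as a single JSON object terminated by `\r\n`. A client-side parsing sketch (the function name is illustrative; a real client would also have to buffer partial network chunks, since this assumes the input holds only whole lines):

```typescript
// Split a streamObjects payload into parsed log records.
function parseLogStream(chunk: string): Array<Record<string, unknown>> {
    return chunk
        .split('\r\n')
        .filter(line => line.length > 0) // drop the empty tail after the final \r\n
        .map(line => JSON.parse(line));
}

const sample = '{"level":"info","message":"a"}\r\n{"level":"error","message":"b"}\r\n';
const records = parseLogStream(sample);
console.log(records.length);   // 2
console.log(records[1].level); // error
```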
306  src/Web/Server/routes/authenticated/user/status.ts  (Normal file)
@@ -0,0 +1,306 @@
import {Request, Response} from 'express';
import {
    boolToString,
    cacheStats,
    filterLogBySubreddit,
    formatNumber,
    intersect,
    LogEntry,
    pollingInfo
} from "../../../../../util";
import {Manager} from "../../../../../Subreddit/Manager";
import dayjs from "dayjs";
import {ResourceStats, RUNNING, STOPPED, SYSTEM} from "../../../../../Common/interfaces";
import {BotStatusResponse} from "../../../../Common/interfaces";
import winston from "winston";
import {opStats} from "../../../../Common/util";
import {authUserCheck, botRoute} from "../../../middleware";
import Bot from "../../../../../Bot";

const status = () => {
    const middleware = [
        authUserCheck(),
        //botRoute(),
    ];

    const response = async (req: Request, res: Response) => {
        let bots: Bot[] = [];
        const {
            limit = 200,
            level = 'verbose',
            sort = 'descending',
        } = req.query;

        // @ts-ignore
        const botLogMap = req.botLogs as Map<string, Map<string, LogEntry[]>>;
        // @ts-ignore
        const systemLogs = req.systemLogs as LogEntry[];

        if (req.serverBot !== undefined) {
            bots = [req.serverBot];
        } else {
            bots = (req.user as Express.User).isOperator ? req.botApp.bots : req.botApp.bots.filter(x => intersect(req.user?.subreddits as string[], x.subManagers.map(y => y.subreddit.display_name)));
        }
        const botResponses: BotStatusResponse[] = [];
        for (const b of bots) {
            botResponses.push(await botStatResponse(b, req, botLogMap));
        }
        const system: any = {};
        if ((req.user as Express.User).isOperator) {
            // @ts-ignore
            system.logs = filterLogBySubreddit(new Map([['app', systemLogs]]), [], {level, sort, limit, operator: true}).get('app');
        }
        const response = {
            bots: botResponses,
            system: system,
        };
        return res.json(response);
    }

    const botStatResponse = async (bot: Bot, req: Request, botLogMap: Map<string, Map<string, LogEntry[]>>) => {
        const {
            //subreddits = [],
            //user: userVal,
            limit = 200,
            level = 'verbose',
            sort = 'descending',
            lastCheck
        } = req.query;

        const {name: userName, realManagers = [], isOperator} = req.user as Express.User;
        const user = userName as string;
        const subreddits = realManagers;
        //const isOperator = opNames.includes(user.toLowerCase())

        const logs = filterLogBySubreddit(botLogMap.get(bot.botName as string) || new Map(), realManagers, {
            level: (level as string),
            operator: isOperator,
            user,
            // @ts-ignore
            sort,
            limit: Number.parseInt((limit as string))
        });

        const subManagerData = [];
        for (const s of subreddits) {
            const m = bot.subManagers.find(x => x.displayLabel === s) as Manager;
            if (m === undefined) {
                continue;
            }
            const sd = {
                name: s,
                //linkName: s.replace(/\W/g, ''),
                logs: logs.get(s) || [], // provide a default empty value in case we truly have not logged anything for this subreddit yet
                botState: m.botState,
                eventsState: m.eventsState,
                queueState: m.queueState,
                indicator: 'gray',
                queuedActivities: m.queue.length(),
                runningActivities: m.queue.running(),
                maxWorkers: m.queue.concurrency,
                subMaxWorkers: m.subMaxWorkers || bot.maxWorkers,
                globalMaxWorkers: bot.maxWorkers,
                validConfig: boolToString(m.validConfigLoaded),
                dryRun: boolToString(m.dryRun === true),
                pollingInfo: m.pollOptions.length === 0 ? ['nothing :('] : m.pollOptions.map(pollingInfo),
                checks: {
                    submissions: m.submissionChecks === undefined ? 0 : m.submissionChecks.length,
                    comments: m.commentChecks === undefined ? 0 : m.commentChecks.length,
                },
                wikiLocation: m.wikiLocation,
                wikiHref: `https://reddit.com/r/${m.subreddit.display_name}/wiki/${m.wikiLocation}`,
                wikiRevisionHuman: m.lastWikiRevision === undefined ? 'N/A' : `${dayjs.duration(dayjs().diff(m.lastWikiRevision)).humanize()} ago`,
                wikiRevision: m.lastWikiRevision === undefined ? 'N/A' : m.lastWikiRevision.local().format('MMMM D, YYYY h:mm A Z'),
                wikiLastCheckHuman: `${dayjs.duration(dayjs().diff(m.lastWikiCheck)).humanize()} ago`,
                wikiLastCheck: m.lastWikiCheck.local().format('MMMM D, YYYY h:mm A Z'),
                stats: await m.getStats(),
                startedAt: 'Not Started',
                startedAtHuman: 'Not Started',
                delayBy: m.delayBy === undefined ? 'No' : `Delayed by ${m.delayBy} sec`,
            };
            // TODO replace indicator data with js on client page
            let indicator;
            if (m.botState.state === RUNNING && m.queueState.state === RUNNING && m.eventsState.state === RUNNING) {
                indicator = 'green';
            } else if (m.botState.state === STOPPED && m.queueState.state === STOPPED && m.eventsState.state === STOPPED) {
                indicator = 'red';
            } else {
                indicator = 'yellow';
            }
            sd.indicator = indicator;
            if (m.startedAt !== undefined) {
                const dur = dayjs.duration(dayjs().diff(m.startedAt));
                sd.startedAtHuman = `${dur.humanize()} ago`;
                sd.startedAt = m.startedAt.local().format('MMMM D, YYYY h:mm A Z');

                if (sd.stats.cache.totalRequests > 0) {
                    const minutes = dur.asMinutes();
                    if (minutes < 10) {
                        sd.stats.cache.requestRate = formatNumber((10 / minutes) * sd.stats.cache.totalRequests, {
                            toFixed: 0,
                            round: {enable: true, indicate: true}
                        });
                    } else {
                        sd.stats.cache.requestRate = formatNumber(sd.stats.cache.totalRequests / (minutes / 10), {
                            toFixed: 0,
                            round: {enable: true, indicate: true}
                        });
                    }
                } else {
                    sd.stats.cache.requestRate = 0;
                }
            }
            subManagerData.push(sd);
        }
        const totalStats = subManagerData.reduce((acc, curr) => {
            return {
                checks: {
                    submissions: acc.checks.submissions + curr.checks.submissions,
                    comments: acc.checks.comments + curr.checks.comments,
                },
                eventsCheckedTotal: acc.eventsCheckedTotal + curr.stats.eventsCheckedTotal,
                checksRunTotal: acc.checksRunTotal + curr.stats.checksRunTotal,
                checksTriggeredTotal: acc.checksTriggeredTotal + curr.stats.checksTriggeredTotal,
                rulesRunTotal: acc.rulesRunTotal + curr.stats.rulesRunTotal,
                rulesCachedTotal: acc.rulesCachedTotal + curr.stats.rulesCachedTotal,
                rulesTriggeredTotal: acc.rulesTriggeredTotal + curr.stats.rulesTriggeredTotal,
                actionsRunTotal: acc.actionsRunTotal + curr.stats.actionsRunTotal,
                maxWorkers: acc.maxWorkers + curr.maxWorkers,
                subMaxWorkers: acc.subMaxWorkers + curr.subMaxWorkers,
                globalMaxWorkers: acc.globalMaxWorkers + curr.globalMaxWorkers,
                runningActivities: acc.runningActivities + curr.runningActivities,
                queuedActivities: acc.queuedActivities + curr.queuedActivities,
            };
        }, {
            checks: {
                submissions: 0,
                comments: 0,
            },
            eventsCheckedTotal: 0,
            checksRunTotal: 0,
            checksTriggeredTotal: 0,
            rulesRunTotal: 0,
            rulesCachedTotal: 0,
            rulesTriggeredTotal: 0,
            actionsRunTotal: 0,
            maxWorkers: 0,
            subMaxWorkers: 0,
            globalMaxWorkers: 0,
            runningActivities: 0,
            queuedActivities: 0,
        });
        const {
            checks,
            maxWorkers,
            globalMaxWorkers,
            subMaxWorkers,
            runningActivities,
            queuedActivities,
            ...rest
        } = totalStats;

        let cumRaw = subManagerData.reduce((acc, curr) => {
            Object.keys(curr.stats.cache.types as ResourceStats).forEach((k) => {
                acc[k].requests += curr.stats.cache.types[k].requests;
                acc[k].miss += curr.stats.cache.types[k].miss;
                // @ts-ignore
                acc[k].identifierAverageHit += (typeof curr.stats.cache.types[k].identifierAverageHit === 'string' ? Number.parseFloat(curr.stats.cache.types[k].identifierAverageHit) : curr.stats.cache.types[k].identifierAverageHit);
                acc[k].averageTimeBetweenHits += curr.stats.cache.types[k].averageTimeBetweenHits === 'N/A' ? 0 : Number.parseFloat(curr.stats.cache.types[k].averageTimeBetweenHits)
            });
            return acc;
        }, cacheStats());
        cumRaw = Object.keys(cumRaw).reduce((acc, curr) => {
            const per = acc[curr].miss === 0 ? 0 : formatNumber(acc[curr].miss / acc[curr].requests) * 100;
            // @ts-ignore
            acc[curr].missPercent = `${formatNumber(per, {toFixed: 0})}%`;
            acc[curr].identifierAverageHit = formatNumber(acc[curr].identifierAverageHit);
            acc[curr].averageTimeBetweenHits = formatNumber(acc[curr].averageTimeBetweenHits)
            return acc;
        }, cumRaw);
        const cacheReq = subManagerData.reduce((acc, curr) => acc + curr.stats.cache.totalRequests, 0);
        const cacheMiss = subManagerData.reduce((acc, curr) => acc + curr.stats.cache.totalMiss, 0);
        const sharedSub = subManagerData.find(x => x.stats.cache.isShared);
        const sharedCount = sharedSub !== undefined ? sharedSub.stats.cache.currentKeyCount : 0;
        let allManagerData: any = {
            name: 'All',
            status: bot.running ? 'RUNNING' : 'NOT RUNNING',
            indicator: bot.running ? 'green' : 'grey',
            maxWorkers,
            globalMaxWorkers,
            subMaxWorkers,
            runningActivities,
            queuedActivities,
            botState: {
                state: RUNNING,
                causedBy: SYSTEM
            },
            dryRun: boolToString(bot.dryRun === true),
            logs: logs.get('all'),
            checks: checks,
            softLimit: bot.softLimit,
            hardLimit: bot.hardLimit,
            stats: {
                ...rest,
                cache: {
                    currentKeyCount: sharedCount + subManagerData.reduce((acc, curr) => curr.stats.cache.isShared ? acc : acc + curr.stats.cache.currentKeyCount, 0),
                    isShared: false,
                    totalRequests: cacheReq,
                    totalMiss: cacheMiss,
                    missPercent: `${formatNumber(cacheMiss === 0 || cacheReq === 0 ? 0 : (cacheMiss / cacheReq) * 100, {toFixed: 0})}%`,
                    types: {
                        ...cumRaw,
                    }
                }
            },
        };
        if (allManagerData.logs === undefined) {
            // this should not happen but an edge case was observed where it potentially did
            winston.loggers.get('app').warn(`Logs for 'all' were undefined but should always have a default empty value`);
        }
        // if(isOperator) {
        allManagerData.startedAt = bot.startedAt.local().format('MMMM D, YYYY h:mm A Z');
        allManagerData.heartbeatHuman = dayjs.duration({seconds: bot.heartbeatInterval}).humanize();
        allManagerData.heartbeat = bot.heartbeatInterval;
        allManagerData = {...allManagerData, ...opStats(bot)};
        //}

        const botDur = dayjs.duration(dayjs().diff(bot.startedAt))
        if (allManagerData.stats.cache.totalRequests > 0) {
            const minutes = botDur.asMinutes();
            if (minutes < 10) {
                allManagerData.stats.cache.requestRate = formatNumber((10 / minutes) * allManagerData.stats.cache.totalRequests, {
                    toFixed: 0,
                    round: {enable: true, indicate: true}
                });
            } else {
                allManagerData.stats.cache.requestRate = formatNumber(allManagerData.stats.cache.totalRequests / (minutes / 10), {
                    toFixed: 0,
                    round: {enable: true, indicate: true}
                });
            }
        } else {
            allManagerData.stats.cache.requestRate = 0;
        }

        const data: BotStatusResponse = {
            system: {
                startedAt: bot.startedAt.local().format('MMMM D, YYYY h:mm A Z'),
                running: bot.running,
                error: bot.error,
                account: bot.botAccount as string,
                name: bot.botName as string,
                ...opStats(bot),
            },
            subreddits: [allManagerData, ...subManagerData],
        };

        return data;
    };

    return [...middleware, response];
}

export default status;
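The cache `requestRate` computed above normalizes request totals to a per-10-minute rate, extrapolating upward when uptime is under 10 minutes. The same arithmetic as a standalone sketch (the function name is illustrative, not from the source):

```typescript
function requestRatePer10Min(totalRequests: number, uptimeMinutes: number): number {
    if (totalRequests === 0) return 0;
    return uptimeMinutes < 10
        ? (10 / uptimeMinutes) * totalRequests  // extrapolate a short sample up to 10 minutes
        : totalRequests / (uptimeMinutes / 10); // average over full 10-minute windows
}

console.log(requestRatePer10Min(50, 5));   // 100
console.log(requestRatePer10Min(300, 30)); // 100
console.log(requestRatePer10Min(0, 42));   // 0
```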
232  src/Web/Server/server.ts  (Normal file)
@@ -0,0 +1,232 @@
import {addAsync, Router} from '@awaitjs/express';
import express, {Request, Response} from 'express';
import bodyParser from 'body-parser';
import {App} from "../../App";
import {Transform} from "stream";
import winston from 'winston';
import {Server as SocketServer} from 'socket.io';
import {Strategy as JwtStrategy, ExtractJwt} from 'passport-jwt';
import passport from 'passport';
import tcpUsed from 'tcp-port-used';

import {
    intersect,
    LogEntry, parseBotLogName,
    parseSubredditLogName
} from "../../util";
import {getLogger} from "../../Utils/loggerFactory";
import LoggedError from "../../Utils/LoggedError";
import {Invokee, LogInfo, OperatorConfig} from "../../Common/interfaces";
import http from "http";
import SimpleError from "../../Utils/SimpleError";
import {heartbeat} from "./routes/authenticated/applicationRoutes";
import logs from "./routes/authenticated/user/logs";
import status from './routes/authenticated/user/status';
import {actionedEventsRoute, actionRoute, configRoute} from "./routes/authenticated/user";
import action from "./routes/authenticated/user/action";
import {authUserCheck, botRoute} from "./middleware";
import {opStats} from "../Common/util";
import Bot from "../../Bot";
import addBot from "./routes/authenticated/user/addBot";
import dayjs from "dayjs";

const server = addAsync(express());
server.use(bodyParser.json());
server.use(bodyParser.urlencoded({extended: false}));

declare module 'express-session' {
    interface SessionData {
        user: string,
        subreddits: string[],
        lastCheck?: number,
        limit?: number,
        sort?: string,
        level?: string,
    }
}

const subLogMap: Map<string, LogEntry[]> = new Map();
const systemLogs: LogEntry[] = [];
const botLogMap: Map<string, Map<string, LogEntry[]>> = new Map();

const botSubreddits: Map<string, string[]> = new Map();

const rcbServer = async function (options: OperatorConfig) {

    const {
        operator: {
            name,
            display,
        },
        api: {
            secret: secret,
            port,
            friendly,
        }
    } = options;

    const opNames = name.map(x => x.toLowerCase());
    let app: App;

    const logger = getLogger({...options.logging});

    logger.stream().on('log', (log: LogInfo) => {
        const logEntry: LogEntry = [dayjs(log.timestamp).unix(), log];

        const {bot: botName, subreddit: subName} = log;

        if (botName === undefined) {
            systemLogs.unshift(logEntry);
            // trim in place; slice() returns a copy, so its result must not be discarded
            systemLogs.splice(201);
        } else {
            const botLog = botLogMap.get(botName) || new Map();

            if (subName === undefined) {
                const appLogs = botLog.get('app') || [];
                appLogs.unshift(logEntry);
                botLog.set('app', appLogs.slice(0, 200 + 1));
            } else {
                let botSubs = botSubreddits.get(botName) || [];
                if (botSubs.length === 0 && app !== undefined) {
                    const b = app.bots.find(x => x.botName === botName);
                    if (b !== undefined) {
                        botSubs = b.subManagers.map(x => x.displayLabel);
                        botSubreddits.set(botName, botSubs);
                    }
                }
                if (botSubs.length === 0 || botSubs.includes(subName)) {
                    const subLogs = botLog.get(subName) || [];
                    subLogs.unshift(logEntry);
                    botLog.set(subName, subLogs.slice(0, 200 + 1));
                }
            }
            botLogMap.set(botName, botLog);
        }
    })

    if (await tcpUsed.check(port)) {
        throw new SimpleError(`Specified port for API (${port}) is in use or not available. Cannot start API.`);
    }

    let httpServer: http.Server,
        io: SocketServer;

    try {
        httpServer = await server.listen(port);
        io = new SocketServer(httpServer);
    } catch (err) {
        logger.error('Error occurred while initializing web or socket.io server', err);
        err.logged = true;
        throw err;
    }

    logger.info(`API started => localhost:${port}`);

    passport.use(new JwtStrategy({
        secretOrKey: secret,
        jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
    }, function (jwtPayload, done) {
        const {name, subreddits = [], machine = true} = jwtPayload.data;
        if (machine) {
            return done(null, {machine});
        }
        const isOperator = opNames.includes(name.toLowerCase());
        let moderatedBots: string[] = [];
        let moderatedManagers: string[] = [];
        let realBots: string[] = [];
        let realManagers: string[] = [];
        if (app !== undefined) {
            const modBots = app.bots.filter(x => intersect(subreddits, x.subManagers.map(y => y.subreddit.display_name)));
            moderatedBots = modBots.map(x => x.botName as string);
            moderatedManagers = [...new Set(modBots.map(x => x.subManagers.map(y => y.displayLabel)).flat())];
            realBots = isOperator ? app.bots.map(x => x.botName as string) : moderatedBots;
            realManagers = isOperator ? [...new Set(app.bots.map(x => x.subManagers.map(y => y.displayLabel)).flat())] : moderatedManagers
        }

        return done(null, {
            name,
            subreddits,
            isOperator,
            machine: false,
            moderatedManagers,
            realManagers,
            moderatedBots,
            realBots,
        });
    }));

    server.use(passport.authenticate('jwt', {session: false}));
    server.use((req, res, next) => {
        req.botApp = app;
        next();
    });

    server.getAsync('/heartbeat', ...heartbeat({name, display, friendly}));

    server.getAsync('/logs', ...logs(subLogMap));

    server.getAsync('/stats', [authUserCheck(), botRoute(false)], async (req: Request, res: Response) => {
        let bots: Bot[] = [];
        if (req.serverBot !== undefined) {
            bots = [req.serverBot];
        } else {
            bots = (req.user as Express.User).isOperator ? req.botApp.bots : req.botApp.bots.filter(x => intersect(req.user?.subreddits as string[], x.subManagers.map(y => y.subreddit.display_name)));
        }
        const resp = [];
        for (const b of bots) {
            resp.push({name: b.botName, data: await opStats(b)});
        }
        return res.json(resp);
    });
    const passLogs = async (req: Request, res: Response, next: Function) => {
        // @ts-ignore
        req.botLogs = botLogMap;
        // @ts-ignore
        req.systemLogs = systemLogs;
        next();
    }
    server.getAsync('/status', passLogs, ...status())

    server.getAsync('/config', ...configRoute);

    server.getAsync('/events', ...actionedEventsRoute);

    server.getAsync('/action', ...action);

    server.getAsync('/check', ...actionRoute);

    server.getAsync('/addBot', ...addBot());

    const initBot = async (causedBy: Invokee = 'system') => {
        if (app !== undefined) {
            logger.info('A bot instance already exists. Attempting to stop event/queue processing first before building new bot.');
            await app.destroy(causedBy);
        }
        const newApp = new App(options);
        if (newApp.error === undefined) {
            try {
                await newApp.initBots(causedBy);
            } catch (err) {
                if (newApp.error === undefined) {
                    newApp.error = err.message;
                }
                logger.error('Server is still ONLINE but bot cannot recover from this error and must be re-built');
                if (!err.logged || !(err instanceof LoggedError)) {
                    logger.error(err);
                }
            }
        }
        return newApp;
    }

    server.postAsync('/init', authUserCheck(), async (req, res) => {
        logger.info(`${(req.user as Express.User).name} requested the app to be re-built. Starting rebuild now...`, {subreddit: (req.user as Express.User).name});
        app = await initBot('user');
    });

    logger.info('Beginning bot init on startup...');
    app = await initBot();
};

export default rcbServer;
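The log listener above keeps each in-memory buffer bounded: the newest entry is unshifted to the front and the array is trimmed so at most 201 entries are retained. A self-contained sketch of that pattern (names here are illustrative, not from the source):

```typescript
type Entry = [number, string]; // [unix timestamp, message], mirroring LogEntry's shape

function pushCapped(buffer: Entry[], entry: Entry, max = 201): void {
    buffer.unshift(entry); // newest first
    buffer.splice(max);    // drop anything past the cap, mutating in place
}

const buf: Entry[] = [];
for (let i = 0; i < 250; i++) {
    pushCapped(buf, [i, `log ${i}`]);
}
console.log(buf.length); // 201
console.log(buf[0][0]);  // 249 (newest first)
```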
@@ -24,6 +24,13 @@ a {
    display: inherit;
}

.nestedTabs {
    display: none;
}
.nestedTabs.active {
    display: inline-flex;
}

/*https://stackoverflow.com/a/48386400/1469797*/
.stats {
    display: grid;
@@ -43,6 +50,10 @@ a {
    content: ":";
}

.newRow {
    margin-top: 15px;
}

.has-tooltip {
    /*position: relative;*/
}
@@ -70,3 +81,13 @@ a {
.botStats.hidden {
    display: none;
}

.sub.offline a {
    pointer-events: none;
    text-decoration: none;
}

.sub.offline .logs a {
    pointer-events: initial;
    text-decoration: initial;
}
@@ -14,26 +14,21 @@
                <li>Access Token: <b><%= accessToken %></b></li>
                <li>Refresh Token: <b><%= refreshToken %></b></li>
            </ul>
            <div>Copy these somewhere and then restart the application providing these as either arguments
                or environmental variables as described in the <a
                        href="https://github.com/FoxxMD/context-mod#usage">usage section.</a>
            <% if(locals.addResult !== undefined) { %>
                <div>Result of trying to add bot automatically: <%= addResult %></div>
            <% } else { %>
                <div>Bot was not automatically added to an instance and will need to be manually appended to the configuration...</div>
            <% } %>
            <div>If you are a <b>Moderator</b> then copy the above <b>Tokens</b> and pass them on to the Operator of this ContextMod instance.</div>
            <div>If you are an <b>Operator</b> copy these somewhere and then restart the application providing these as either arguments, environmental variables, or in a json config as described in the <a
                    href="https://github.com/FoxxMD/context-mod/blob/master/docs/operatorConfiguration.md#defining-configuration">configuration guide</a>
            </div>
            </div>
        </div>
    </div>
</div>
</div>
<%- include('partials/footer') %>
</div>
<script>
    if (document.querySelector('#redirectUri').value === '') {
        document.querySelector('#redirectUri').value = `${document.location.href}callback`;
    }

    document.querySelector('#doAuth').addEventListener('click', e => {
        e.preventDefault()
        const url = `${document.location.href}auth?redirect=${document.querySelector('#redirectUri').value}`
        window.location.href = url;
    })
</script>
</body>
</html>
202  src/Web/assets/views/config.ejs  (Normal file)
@@ -0,0 +1,202 @@
<html>
|
||||
<head>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/tailwindcss/2.0.3/tailwind.min.css"
|
||||
integrity="sha512-wl80ucxCRpLkfaCnbM88y4AxnutbGk327762eM9E/rRTvY/ZGAHWMZrYUq66VQBYMIYDFpDdJAOGSLyIPHZ2IQ=="
|
||||
crossorigin="anonymous"/>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/tailwindcss/2.0.3/tailwind-dark.min.css"
|
||||
integrity="sha512-WvyKyiVHgInX5UQt67447ExtRRZG/8GUijaq1MpqTNYp8wY4/EJOG5bI80sRp/5crDy4Z6bBUydZI2OFV3Vbtg=="
|
||||
crossorigin="anonymous"/>
|
||||
<script src="https://code.iconify.design/1/1.0.4/iconify.min.js"></script>
|
||||
<link rel="stylesheet" href="/public/themeToggle.css">
|
||||
<link rel="stylesheet" href="/public/app.css">
|
||||
<title><%= title %></title>
|
||||
<meta charset="utf-8">
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width,initial-scale=1.0">
|
||||
<!--icons from https://heroicons.com -->
|
||||
</head>
<body style="user-select: none;" class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 dark:bg-gray-800 font-sans">
<%- include('partials/title') %>
<div class="container mx-auto">
<div class="grid">
<div class="dark:text-white mb-3 pl-2">
Schema <a href="/config?schema=subreddit" id="subredditSchemaType">Subreddit</a> / <a href="/config?schema=operator" id="operatorSchemaType">Operator</a> |
<span class="has-tooltip">
<span style="z-index:999; margin-top: 30px;" class='tooltip rounded shadow-lg p-3 bg-gray-100 text-black space-y-2'>
<div>Copy + paste your configuration here to get:</div>
<ul class="list-inside list-disc">
<li>
formatting (right click for menu)
</li>
<li>
JSON syntax assist (red squiggly, hover for info)
</li>
<li>
annotated properties (hover for info)
</li>
<li id="schemaTypeList"></li>
</ul>
<div>When done editing hit Ctrl+A (Command+A on macOS) to select all text, then copy + paste back into your wiki/file</div>
</span>
<span class="cursor-help">
How To Use
<span>
<svg xmlns="http://www.w3.org/2000/svg"
class="h-4 w-4 inline-block cursor-help"
fill="none"
viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M8.228 9c.549-1.165 2.03-2 3.772-2 2.21 0 4 1.343 4 3 0 1.4-1.278 2.575-3.006 2.907-.542.104-.994.54-.994 1.093m0 3h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"/>
</svg>
</span>
</span>
</span>
| <input id="configUrl" class="text-black placeholder-gray-500 rounded mx-2" style="min-width:400px;" placeholder="URL of a config to load"/> <a href="#" id="loadConfig">Load</a>
<div id="error" class="font-semibold"></div>
</div>
<div style="min-height: 80vh" id="editor"></div>
</div>
</div>
<%- include('partials/footer') %>
</div>
<script>
document.querySelectorAll('.theme').forEach(el => {
el.addEventListener('click', e => {
e.preventDefault();
if (e.target.id === 'dark') {
document.body.classList.add('dark');
localStorage.setItem('ms-dark', 'yes');
} else {
document.body.classList.remove('dark');
localStorage.setItem('ms-dark', 'no');
}
document.querySelectorAll('.theme').forEach(el => {
el.classList.remove('font-bold', 'no-underline', 'pointer-events-none');
});
e.target.classList.add('font-bold', 'no-underline', 'pointer-events-none');
})
})

document.querySelector("#themeToggle").checked = localStorage.getItem('ms-dark') !== 'no';
document.querySelector("#themeToggle").onchange = (e) => {
if (e.target.checked === true) {
document.body.classList.add('dark');
localStorage.setItem('ms-dark', 'yes');
} else {
document.body.classList.remove('dark');
localStorage.setItem('ms-dark', 'no');
}
}
</script>
<script src="/monaco/dev/vs/loader.js"></script>
<script>
require.config({ paths: { vs: 'monaco/dev/vs' } });

const preamble = [
'// Copy + paste your configuration here to get',
'// formatting, JSON syntax, annotated properties and'
];

var searchParams = new URLSearchParams(window.location.search);

let schemaType;
let schemaFile;
if(searchParams.get('schema') === 'operator') {
schemaType = 'operator';
schemaFile = 'OperatorConfig.json';
preamble.push('// automatic validation of your OPERATOR configuration');
document.querySelector('#schemaTypeList').innerHTML = 'automatic validation of your OPERATOR configuration (yellow squiggly)';
document.querySelector('#operatorSchemaType').classList.add('font-bold', 'no-underline', 'pointer-events-none');
} else {
schemaType = 'subreddit';
schemaFile = 'App.json';
preamble.push('// automatic validation of your SUBREDDIT configuration');
document.querySelector('#schemaTypeList').innerHTML = 'automatic validation of your SUBREDDIT configuration (yellow squiggly)';
document.querySelector('#subredditSchemaType').classList.add('font-bold', 'no-underline', 'pointer-events-none');
}

const schemaUri = `${document.location.origin}/schemas/${schemaFile}`;

require(['vs/editor/editor.main'], function () {
const modelUri = monaco.Uri.parse("a://b/foo.json");
fetch(schemaUri).then((res) => {
res.json().then((schemaData) => {
monaco.languages.json.jsonDefaults.setDiagnosticsOptions({
validate: true,
allowComments: true,
trailingCommas: "ignore",
schemas: [{
uri: schemaUri,
fileMatch: [modelUri.toString()],
schema: schemaData
}]
});
var model = monaco.editor.createModel(preamble.join('\r\n'), "json", modelUri);

document.querySelector('#loadConfig').addEventListener('click', (e) => {
e.preventDefault();
const newUrl = document.querySelector('#configUrl').value;
fetch(newUrl).then((resp) => {
if(!resp.ok) {
resp.text().then(data => {
document.querySelector('#error').innerHTML = `Error occurred while fetching configuration => ${data}`
});
} else {
var sp = new URLSearchParams();
sp.append('schema', schemaType);
sp.append('url', newUrl);
history.pushState(null, '', `${window.location.pathname}?${sp.toString()}`);
resp.text().then(data => {
//model = monaco.editor.createModel(data, "json", modelUri);
model.setValue(data);
})
}
});
});

let dlUrl = searchParams.get('url');
if(dlUrl === null && searchParams.get('subreddit') !== null) {
dlUrl = `${document.location.origin}/config/content${document.location.search}`
}
if(dlUrl !== null) {
document.querySelector('#configUrl').value = dlUrl;
fetch(dlUrl).then((resp) => {
if(!resp.ok) {
resp.text().then(data => {
document.querySelector('#error').innerHTML = `Error occurred while fetching configuration => ${data}`
});
} else {
resp.text().then(data => {
model.setValue(data);
//model = monaco.editor.createModel(data, "json", modelUri);
var editor = monaco.editor.create(document.getElementById('editor'), {
model,
theme: 'vs-dark',
minimap: {
enabled: false
}
});
})
}
});
} else {
var editor = monaco.editor.create(document.getElementById('editor'), {
model,
theme: 'vs-dark',
minimap: {
enabled: false
}
});
}
})
})
});
</script>
</body>
</html>
@@ -3,7 +3,7 @@
<body class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 dark:bg-gray-800 font-sans">
<%- include('partials/title', {title: ''}) %>
<%- include('partials/title', {title: 'Error'}) %>
<div class="container mx-auto">
<div class="grid">
<div class="bg-white dark:bg-gray-500 dark:text-white">
@@ -11,7 +11,7 @@
<div class="text-xl mb-4">Oops 😬</div>
<div class="space-y-3">
<div>Something went wrong while processing that last request:</div>
<div><%- error %></div>
<div class="space-y-3"><%- error %></div>
<% if(locals.operatorDisplay !== undefined && locals.operatorDisplay !== 'Anonymous') { %>
<div>Operated By: <%= operatorDisplay %></div>
<% } %>
@@ -20,6 +20,7 @@
</div>
</div>
</div>
<%- include('partials/footer') %>
</div>
</body>
</html>
100 src/Web/assets/views/events.ejs Normal file
@@ -0,0 +1,100 @@
<html>
<head>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/tailwindcss/2.0.3/tailwind.min.css"
integrity="sha512-wl80ucxCRpLkfaCnbM88y4AxnutbGk327762eM9E/rRTvY/ZGAHWMZrYUq66VQBYMIYDFpDdJAOGSLyIPHZ2IQ=="
crossorigin="anonymous"/>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/tailwindcss/2.0.3/tailwind-dark.min.css"
integrity="sha512-WvyKyiVHgInX5UQt67447ExtRRZG/8GUijaq1MpqTNYp8wY4/EJOG5bI80sRp/5crDy4Z6bBUydZI2OFV3Vbtg=="
crossorigin="anonymous"/>
<script src="https://code.iconify.design/1/1.0.4/iconify.min.js"></script>
<link rel="stylesheet" href="/public/themeToggle.css">
<link rel="stylesheet" href="/public/app.css">
<title><%= title %></title>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width,initial-scale=1.0">
<!--icons from https://heroicons.com -->
<style>
.peek a {
display: none;
}
</style>
</head>
<body>
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 dark:bg-gray-800 font-sans">
<%- include('partials/title') %>
<div class="container mx-auto">
<div class="grid">
<div class="bg-white dark:bg-gray-500 dark:text-white px-3 py-6 space-y-3">
<% if(data.length === 0) { %>
No events have been actioned yet!
<% } %>
<% data.forEach(function (eRes){ %>
<div class="shadow-lg">
<div class="space-x-4 px-4 p-2 leading-2 font-semibold bg-gray-300 dark:bg-gray-700 dark:text-white">
<div class="flex items-center justify-between">
<div>
<span class="peek"><%- eRes.activity.peek %></span><a target="_blank" href="https://reddit.com<%= eRes.activity.link%>">(Link)</a>
</div>
<div class="flex items-center flex-end">
<%= eRes.subreddit %> @ <%= eRes.timestamp %>
</div>
</div>
</div>
<div class="p-4 pl-6 pt-3 space-y-2">
<div><span class="font-semibold">Check:</span> <%= eRes.check %><span class="px-3">➔</span><%= eRes.ruleSummary %></div>
<div>
<span class="font-semibold">Rules:</span>
<ul class="list-inside list-disc">
<% eRes.ruleResults.forEach(function (ruleResult) { %>
<li><%= ruleResult.name %> - <%= ruleResult.triggered%> - <%= ruleResult.result %></li>
<% }) %>
</ul>
</div>
<div><span class="font-semibold">Actions:</span>
<ul class="list-inside list-disc">
<% eRes.actionResults.forEach(function (aRes) { %>
<li><%= aRes.name %><%= aRes.dryRun %> - <%= aRes.result %></li>
<% }) %>
</ul>
</div>
</div>
</div>
<% }) %>
</div>
</div>
</div>
<%- include('partials/footer') %>
</div>
<script>
document.querySelectorAll('.theme').forEach(el => {
el.addEventListener('click', e => {
e.preventDefault();
if (e.target.id === 'dark') {
document.body.classList.add('dark');
localStorage.setItem('ms-dark', 'yes');
} else {
document.body.classList.remove('dark');
localStorage.setItem('ms-dark', 'no');
}
document.querySelectorAll('.theme').forEach(el => {
el.classList.remove('font-bold', 'no-underline', 'pointer-events-none');
});
e.target.classList.add('font-bold', 'no-underline', 'pointer-events-none');
})
})

document.querySelector("#themeToggle").checked = localStorage.getItem('ms-dark') !== 'no';
document.querySelector("#themeToggle").onchange = (e) => {
if (e.target.checked === true) {
document.body.classList.add('dark');
localStorage.setItem('ms-dark', 'yes');
} else {
document.body.classList.remove('dark');
localStorage.setItem('ms-dark', 'no');
}
}
</script>
</body>
</html>
231 src/Web/assets/views/helper.ejs Normal file
@@ -0,0 +1,231 @@
<html>
<%- include('partials/head', {title: 'CM OAuth Helper'}) %>
<body class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 dark:bg-gray-800 font-sans">
<%- include('partials/title', {title: ' OAuth Helper'}) %>
<div class="container mx-auto">
<div class="grid">
<div class="bg-white dark:bg-gray-500 dark:text-white">
<div class="p-6 md:px-10 md:py-6">
<div class="text-xl mb-4">Hi! Looks like you're setting up your bot. To get running:</div>
<div class="text-lg font-semibold my-3">1. Set your redirect URL</div>
<div class="ml-5">
<input id="redirectUri" style="min-width:500px;"
class="text-black placeholder-gray-500 rounded mt-2 mb-3 p-2" value="<%= redirectUri %>">
<div class="space-y-3">
<div>This is the URL Reddit will redirect you to once you have authorized an account to be
used
with your application.
</div>
<div>The input field has been pre-filled with either:
<ul class="list-inside list-disc">
<li>What you provided to the program as an argument/environment variable or</li>
<li>The current URL in your browser that would be used -- if you are using a reverse
proxy this may be different so double check
</li>
</ul>
</div>
<div>Make sure it matches what is found in the <b>redirect uri</b> for your <a
target="_blank"
href="https://www.reddit.com/prefs/apps">application
on Reddit</a> and <b>it must end with "callback"</b></div>
</div>
</div>
<div class="text-lg font-semibold my-3">2. Optionally, set <b>Client Id</b> and <b>Client Secret</b>
</div>
<div class="ml-5">
<div class="space-y-2">
Leave these fields blank to use the id/secret you provided the application (if any),
otherwise
fill them in.
</div>
<input id="clientId" style="min-width:500px;"
class="text-black placeholder-gray-500 rounded mt-2 mb-3 p-2"
placeholder="<%= locals.clientId !== undefined ? 'Use Provided Client Id' : 'Client Id Not Provided' %>">
<input id="clientSecret" style="min-width:500px; display: block;"
class="text-black placeholder-gray-500 rounded mt-2 mb-3 p-2"
placeholder="<%= locals.clientSecret !== undefined ? 'Use Provided Client Secret' : 'Client Secret Not Provided' %>">
</div>
<div class="text-lg font-semibold my-3">3. Select permissions</div>
<div class="my-2 ml-5">
<div class="space-y-3">
<div>These permissions allow the bot account to perform these actions, <b>in
general.</b></div>
<div>In all cases the subreddits the bot moderates for will <b>also need to give the bot
moderator permissions to do these actions</b> -- this is just an extra precaution if
you are super paranoid.
</div>
<div><b>Note:</b> None of the permissions the bot receives allow it to view/edit the email or
password for the account
</div>
</div>
<div class="mt-1">
<div class="my-3">
<h3 class="font-semibold">Required</h3>
<div>The following permissions are required for the bot to do <i>anything.</i></div>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="identity" name="identity"
checked disabled>
<label for="identity"><span class="font-mono font-semibold">identity</span> required for
the bot to know who it is</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="mysubreddits" name="mysubreddits"
checked disabled>
<label for="mysubreddits"><span class="font-mono font-semibold">mysubreddits</span>
required for the bot to find out what subreddits it moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="read" name="read"
checked disabled>
<label for="read"><span class="font-mono font-semibold">read</span> required for the bot
to be able to access others' activities</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="history" name="history"
checked disabled>
<label for="history"><span class="font-mono font-semibold">history</span> required for
the bot to be able to access other users' history</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="wikiread" name="wikiread"
checked disabled>
<label for="wikiread"><span class="font-mono font-semibold">wikiread</span> required for
the bot to read configurations in the subreddits it moderates</label>
</div>
<div class="my-3">
<h3 class="font-semibold">Recommended</h3>
<div class="mb-1">The following permissions cover what is necessary for the bot to
perform moderation actions
</div>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="privatemessages"
name="privatemessages"
checked>
<label for="privatemessages"><span
class="font-mono font-semibold">privatemessages</span> for the bot to send
messages as itself</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modmail" name="modmail"
checked>
<label for="modmail"><span class="font-mono font-semibold">modmail</span> for the bot to
send messages as a subreddit</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modflair" name="modflair"
checked>
<label for="modflair"><span class="font-mono font-semibold">modflair</span> for the bot
to flair users and submissions in the subreddits it moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modcontributors"
name="modcontributors"
checked>
<label for="modcontributors"><span
class="font-mono font-semibold">modcontributors</span> for the bot to
ban/mute users in the subreddits it moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modposts" name="modposts"
checked>
<label for="modposts"><span class="font-mono font-semibold">modposts</span> for the bot
to approve/remove/nsfw/distinguish/etc. submissions/comments in the subreddits it
moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="report" name="report"
checked>
<label for="report"><span class="font-mono font-semibold">report</span> for the bot to
be able to report submissions/comments</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="submit" name="submit"
checked>
<label for="submit"><span class="font-mono font-semibold">submit</span> for the bot to
reply to submissions/comments</label>
</div>
<div class="my-3">
<h3 class="font-semibold">Optional</h3>
<div>The following permissions cover additional functionality ContextMod can use or may
use in the future
</div>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="wikiedit" name="wikiedit">
<label for="wikiedit"><span class="font-mono font-semibold">wikiedit</span> for the bot
to be able to create/edit
Toolbox
User Notes</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modlog" name="modlog">
<label for="modlog"><span class="font-mono font-semibold">modlog</span> for the bot to
be able to read the moderation log (not currently implemented)</label>
</div>
</div>
</div>
<div class="text-lg font-semibold my-3">4. <a id="doAuth" href="">Create Authorization Invite</a>
</div>
<div class="ml-5 mb-4">
<input id="inviteCode" style="min-width:500px;"
class="text-black placeholder-gray-500 rounded mt-2 mb-3 p-2" placeholder="Invite code value to use. Leave blank to generate a random one."/>
<div class="space-y-3">
<div>A unique link will be generated that you (or someone) will use to authorize a Reddit account with this application.</div>
<div id="inviteLink"></div>
<div id="errorWrapper" class="font-semibold hidden">Error: <span id="error"></span></div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<script>
if (document.querySelector('#redirectUri').value === '') {
document.querySelector('#redirectUri').value = `${document.location.origin}/callback`;
}

document.querySelector('#doAuth').addEventListener('click', e => {
e.preventDefault();
const permissions = {};

document.querySelectorAll('.permissionToggle').forEach((el) => {
permissions[el.id] = el.checked;
});
fetch(`${document.location.origin}/auth/create`, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({
redirect: document.querySelector('#redirectUri').value,
clientId: document.querySelector('#clientId').value,
clientSecret: document.querySelector('#clientSecret').value,
code: document.querySelector("#inviteCode").value === '' ? undefined : document.querySelector("#inviteCode").value,
permissions,
})
}).then((resp) => {
if(!resp.ok) {
document.querySelector("#errorWrapper").classList.remove('hidden');
resp.text().then(t => {
document.querySelector("#error").innerHTML = t;
});
} else {
document.querySelector("#errorWrapper").classList.add('hidden');
document.querySelector("#inviteCode").value = '';
resp.text().then(t => {
document.querySelector("#inviteLink").innerHTML = `Invite Link: <a class="font-semibold" href="${document.location.origin}/auth/invite?invite=${t}">${document.location.origin}/auth/invite?invite=${t}</a>`;
});
}
});

//const url = `${document.location.origin}/auth/init?${params.toString()}`;
//window.location.href = url;
})
</script>
</body>
</html>
153 src/Web/assets/views/invite.ejs Normal file
@@ -0,0 +1,153 @@
<html>
<%- include('partials/head', {title: 'CM OAuth Helper'}) %>
<body class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 dark:bg-gray-800 font-sans">
<%- include('partials/title', {title: ' OAuth Helper'}) %>
<div class="container mx-auto">
<div class="grid">
<div class="bg-white dark:bg-gray-500 dark:text-white">
<div class="p-6 md:px-10 md:py-6">
<div class="text-xl mb-4">Hi! Looks like you're accepting an invite to authorize an account to run on this ContextMod instance:</div>
<div class="text-lg font-semibold my-3">1. Review permissions</div>
<div class="my-2 ml-5">
<div class="space-y-3">
<div>These permissions allow the bot account to perform these actions, <b>in
general.</b> They were pre-selected by the user who sent you this invitation.</div>
<div>In all cases the subreddit this account moderates for will <b>also need to give the bot
moderator permissions to do these actions</b> -- this is just an extra precaution if
you are super paranoid.
</div>
<div><b>Note:</b> None of the permissions the bot receives allow it to view/edit the email or
password for the account
</div>
</div>
<div class="mt-1">
<div class="my-3">
<h3 class="font-semibold">Required</h3>
<div>The following permissions are required for the bot to do <i>anything.</i></div>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="identity" name="identity"
disabled>
<label for="identity"><span class="font-mono font-semibold">identity</span> required for
the bot to know who it is</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="mysubreddits" name="mysubreddits"
disabled>
<label for="mysubreddits"><span class="font-mono font-semibold">mysubreddits</span>
required for the bot to find out what subreddits it moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="read" name="read"
disabled>
<label for="read"><span class="font-mono font-semibold">read</span> required for the bot
to be able to access others' activities</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="history" name="history"
disabled>
<label for="history"><span class="font-mono font-semibold">history</span> required for
the bot to be able to access other users' history</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="wikiread" name="wikiread"
disabled>
<label for="wikiread"><span class="font-mono font-semibold">wikiread</span> required for
the bot to read configurations in the subreddits it moderates</label>
</div>
<div class="my-3">
<h3 class="font-semibold">Recommended</h3>
<div class="mb-1">The following permissions cover what is necessary for the bot to
perform moderation actions
</div>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="privatemessages"
name="privatemessages" disabled>
<label for="privatemessages"><span
class="font-mono font-semibold">privatemessages</span> for the bot to send
messages as itself</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modmail" name="modmail" disabled>
<label for="modmail"><span class="font-mono font-semibold">modmail</span> for the bot to
send messages as a subreddit</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modflair" name="modflair" disabled>
<label for="modflair"><span class="font-mono font-semibold">modflair</span> for the bot
to flair users and submissions in the subreddits it moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modcontributors"
name="modcontributors" disabled>
<label for="modcontributors"><span
class="font-mono font-semibold">modcontributors</span> for the bot to
ban/mute users in the subreddits it moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modposts" name="modposts" disabled>
<label for="modposts"><span class="font-mono font-semibold">modposts</span> for the bot
to approve/remove/nsfw/distinguish/etc. submissions/comments in the subreddits it
moderates</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="report" name="report" disabled>
<label for="report"><span class="font-mono font-semibold">report</span> for the bot to
be able to report submissions/comments</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="submit" name="submit" disabled>
<label for="submit"><span class="font-mono font-semibold">submit</span> for the bot to
reply to submissions/comments</label>
</div>
<div class="my-3">
<h3 class="font-semibold">Optional</h3>
<div>The following permissions cover additional functionality ContextMod can use or may
use in the future
</div>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="wikiedit" name="wikiedit" disabled>
<label for="wikiedit"><span class="font-mono font-semibold">wikiedit</span> for the bot
to be able to create/edit
Toolbox
User Notes</label>
</div>
<div>
<input class="permissionToggle" type="checkbox" id="modlog" name="modlog" disabled>
<label for="modlog"><span class="font-mono font-semibold">modlog</span> for the bot to
be able to read the moderation log (not currently implemented)</label>
</div>
</div>
</div>
<div class="text-lg font-semibold my-3">2. Log in to Reddit with the account that will be the bot
</div>
<div class="ml-5">
<div class="space-y-3">
<div>Protip: Log in to Reddit in an Incognito session, then open this URL in a new tab.</div>
</div>
</div>
<div class="text-lg font-semibold my-3">3. <a id="doAuth" href="">Authorize the account</a>
</div>
</div>
</div>
</div>
</div>
</div>
<script>
const permissions = <%- permissions %>;

for(const [k, v] of Object.entries(permissions)) {
document.querySelector(`input#${k}`).checked = v;
}
document.querySelector('#doAuth').addEventListener('click', e => {
e.preventDefault();
const url = `${document.location.origin}/auth/init?invite=<%= invite %>`;
window.location.href = url;
})
</script>
</body>
</html>
@@ -1,5 +1,5 @@
<html>
<%- include('partials/head', {title: 'CM'}) %>
<%- include('partials/head', {title: 'Access Denied'}) %>
<body class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 dark:bg-gray-800 font-sans">
@@ -10,13 +10,15 @@
<div class="p-6 md:px-10 md:py-6">
<div class="text-xl mb-4">Sorry!</div>
<div class="space-y-3">
<div>Your account was successfully logged in but you do not have access to this ContextMod instance because either:</div>
<div>Your account was successfully logged in but you cannot access this ContextMod client or route for one of these reasons:</div>
<ul class="list-inside list-disc">
<li>The Bot account used by this instance is not a Moderator of any Subreddits you are also a Moderator of or</li>
<li>the Bot is a Moderator of one of your Subreddits but the Operator of this instance is not currently running the instance on your Subreddits.</li>
<li>You are not a moderator of any of the subreddits this client has access to</li>
<li>You are not a moderator of any of the subreddits of the specific instance/bot you are trying to access</li>
<li>The instance(s) running the subreddits you are a moderator of are offline</li>
<li>The bot(s) running the subreddits you are a moderator of are not configured correctly, i.e. the subreddits they moderate could not be retrieved</li>
</ul>
<div>Note: You must <a href="logout">Logout</a> in order for the instance to detect changes in your subreddits/moderator status</div>
<% if(operatorDisplay !== 'Anonymous') { %>
<% if(locals.operatorDisplay !== undefined && locals.operatorDisplay !== 'Anonymous') { %>
<div>Operated By: <%= operatorDisplay %></div>
<% } %>
</div>
@@ -24,6 +26,8 @@
</div>
</div>
</div>
<%- include('partials/footer') %>
</div>
<%- include('partials/themeJs') %>
</body>
</html>
83 src/Web/assets/views/offline.ejs Normal file
@@ -0,0 +1,83 @@
<html>
<%- include('partials/head', {title: undefined}) %>
<body class="">
<script>localStorage.getItem('ms-dark') === 'no' ? document.body.classList.remove('dark') : document.body.classList.add('dark')</script>
<div class="min-w-screen min-h-screen bg-gray-100 dark:bg-gray-800 font-sans">
    <%- include('partials/header') %>
    <%- include('partials/botsTab') %>
    <div class="container mx-auto">
        <%- include('partials/subredditsTab') %>
        <div class="grid">
            <div class="bg-white dark:bg-gray-500 dark:text-white">
                <div class="pb-6 md:px-7">
                    <div class="sub active" data-subreddit="All" data-bot="All">
                        Instance is currently <b>OFFLINE</b>
                        <div class="flex items-center justify-between flex-wrap">
                            <div class="inline-flex items-center">
                            </div>
                            <%- include('partials/logSettings') %>
                        </div>
                        <%- include('partials/loadingIcon') %>
                        <div data-subreddit="All" class="logs font-mono text-sm">
                            <% logs.forEach(function (logEntry){ %>
                                <%- logEntry %>
                            <% }) %>
                        </div>
                    </div>
                </div>
            </div>
        </div>
    </div>
</div>
<%- include('partials/footer') %>
<%- include('partials/instanceTabJs') %>
<%- include('partials/themeJs') %>
<%- include('partials/logSettingsJs') %>
<script src="https://cdn.socket.io/4.1.2/socket.io.min.js" integrity="sha384-toS6mmwu70G0fw54EGlWWeA4z3dyJ+dlXBtSURSKN4vyRFOcxd3Bzjj/AoOwY+Rg" crossorigin="anonymous"></script>
<script>
    window.sort = 'desc';

    // Extract the instance name embedded between pipes in a log line
    const INSTANCE_NAME_LOG_REGEX = /\|(.+?)\|/;
    const parseALogName = (reg) => {
        return (val) => {
            const matches = val.match(reg);
            if (matches === null) {
                return undefined;
            }
            return matches[1];
        }
    }
    const parseInstanceLogName = parseALogName(INSTANCE_NAME_LOG_REGEX);

    let socket = io({
        reconnectionAttempts: 5, // bail after 5 attempts
    });

    const limit = Number.parseInt(document.querySelector(`[data-type="limit"]`).value);

    const instanceURLSP = new URLSearchParams(window.location.search);
    const instanceSP = instanceURLSP.get('instance');

    socket.on("connect", () => {
        document.body.classList.add('connected');
        socket.on("log", data => {
            const el = document.querySelector(`.sub`);
            const bot = parseInstanceLogName(data);
            if (bot === instanceSP) {
                const logContainer = el.querySelector(`.logs`);
                let existingLogs;
                if (window.sort === 'desc') {
                    // newest first: prepend, then trim the tail down to `limit` entries
                    logContainer.insertAdjacentHTML('afterbegin', data);
                    existingLogs = Array.from(el.querySelectorAll(`.logs .logLine`));
                    logContainer.replaceChildren(...existingLogs.slice(0, limit));
                } else {
                    // oldest first: append, then drop entries from the front once over `limit`
                    logContainer.insertAdjacentHTML('beforeend', data);
                    existingLogs = Array.from(el.querySelectorAll(`.logs .logLine`));
                    const overLimit = limit - existingLogs.length;
                    logContainer.replaceChildren(...existingLogs.slice(overLimit - 1, limit));
                }
            }
        });
    });
</script>
</body>
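The offline view above routes incoming socket.io log lines by the instance name embedded between pipes. The curried parser can be exercised on its own (a Node sketch; the sample log lines below are hypothetical, with the pipe-delimited format assumed from the regex):

```javascript
// Same shape as the parser in offline.ejs: take a regex with one capture
// group and return a function that yields the captured text, or undefined
// when the input does not match.
const INSTANCE_NAME_LOG_REGEX = /\|(.+?)\|/;

const parseALogName = (reg) => (val) => {
    const matches = val.match(reg);
    return matches === null ? undefined : matches[1];
};

const parseInstanceLogName = parseALogName(INSTANCE_NAME_LOG_REGEX);

// Hypothetical log lines; only the first contains a |name| segment
console.log(parseInstanceLogName('12:00:00 |myInstance| INFO : started')); // prints "myInstance"
console.log(parseInstanceLogName('a line with no instance name'));         // prints undefined
```

Because the factory closes over the compiled regex, the same helper could build parsers for other delimited log fields without repeating the match/undefined boilerplate in each handler.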
src/Web/assets/views/partials/botsTab.ejs (Normal file, 30 lines)
@@ -0,0 +1,30 @@
<div class="space-x-4 py-1 md:px-10 leading-6 font-semibold bg-gray-500 dark:bg-gray-700 text-white">
    <div class="container mx-auto">
        <% if(locals.bots !== undefined) { %>
        <ul id="botTabs" class="inline-flex flex-wrap">
            <% bots.forEach(function (data){ %>
                <li class="my-3 px-3 dark:text-white">
                    <span data-bot="<%= data.system.name %>" class="rounded-md py-2 px-3 tabSelectWrapper">
                        <a class="tabSelect font-normal pointer hover:font-bold"
                           data-bot="<%= data.system.name %>">
                            <%= data.system.name %>
                        </a>
                        <% if ((data.system.name === 'All' && isOperator) || data.system.name !== 'All') { %>
                            <span class="inline-block mb-0.5 ml-0.5 w-2 h-2 bg-<%= data.system.running ? 'green' : 'red' %>-400 rounded-full"></span>
                        <% } %>
                    </span>
                </li>
            <% }) %>
            <% if(locals.isOperator === true && locals.instanceId !== undefined) { %>
                <li class="my-3 px-3 dark:text-white">
                    <span class="rounded-md py-2 px-3 border">
                        <a class="font-normal pointer hover:font-bold" href="/auth/helper">
                            Add Bot +
                        </a>
                    </span>
                </li>
            <% } %>
        </ul>
        <% } %>
    </div>
</div>
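The status dot in botsTab.ejs renders for every named bot, and for the synthetic 'All' tab only when the viewer is an operator. Read as a standalone predicate (a sketch mirroring the EJS condition, not code from the commit):

```javascript
// Mirrors `(data.system.name === 'All' && isOperator) || data.system.name !== 'All'`
// from botsTab.ejs: any named bot gets a dot; the aggregate 'All' tab only for operators.
const showStatusDot = (botName, isOperator) =>
    (botName === 'All' && isOperator) || botName !== 'All';

console.log(showStatusDot('myBot', false)); // prints true  (named bots always show a dot)
console.log(showStatusDot('All', false));   // prints false (non-operators never see the aggregate dot)
console.log(showStatusDot('All', true));    // prints true  (operators do)
```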
src/Web/assets/views/partials/footer.ejs (Normal file, 5 lines)
@@ -0,0 +1,5 @@
<div class="py-3 flex items-center justify-around font-semibold text-white">
    <div>
        <a href="https://github.com/FoxxMD/context-mod">ContextMod Web</a> created by /u/FoxxMD
    </div>
</div>
@@ -6,9 +6,9 @@
          integrity="sha512-WvyKyiVHgInX5UQt67447ExtRRZG/8GUijaq1MpqTNYp8wY4/EJOG5bI80sRp/5crDy4Z6bBUydZI2OFV3Vbtg=="
          crossorigin="anonymous"/>
    <script src="https://code.iconify.design/1/1.0.4/iconify.min.js"></script>
    <link rel="stylesheet" href="public/themeToggle.css">
    <link rel="stylesheet" href="public/app.css">
    <title><%= title !== undefined ? title : `CM for ${botName}`%></title>
    <link rel="stylesheet" href="/public/themeToggle.css">
    <link rel="stylesheet" href="/public/app.css">
    <title><%= locals.title !== undefined ? title : `${locals.botName !== undefined ? `CM for ${botName}` : 'ContextMod'}`%></title>
    <!--<title><%# `CM for /u/${botName}`%></title>-->
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
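The hunk above swaps bare variable reads for `locals.*` guards. In EJS, referencing an undeclared variable throws a ReferenceError at render time, while a missing property on `locals` is simply `undefined`. The new `<title>` fallback chain can be simulated with a plain object standing in for `locals` (the render data here is hypothetical):

```javascript
// Hypothetical render data: `title` was not passed to the template, `botName` was
const locals = { botName: 'myBot' };

// Same fallback chain as the new <title> expression in the diff:
// explicit title, else "CM for <bot>", else the generic app name
const title = locals.title !== undefined
    ? locals.title
    : (locals.botName !== undefined ? `CM for ${locals.botName}` : 'ContextMod');

console.log(title); // prints "CM for myBot"
```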
src/Web/assets/views/partials/header.ejs (Normal file, 55 lines)
@@ -0,0 +1,55 @@
<div class="space-x-4 p-6 md:px-10 md:py-6 leading-6 font-semibold bg-gray-800 text-white">
    <div class="container mx-auto">
        <div class="flex items-center justify-between">
            <div class="flex items-center flex-grow pr-4">
                <% if(locals.instances !== undefined) { %>
                <ul class="inline-flex flex-wrap">
                    <% instances.forEach(function (data) { %>
                        <li class="my-3 px-3 dark:text-white">
                            <span data-instance="<%= data.friendly %>" class="has-tooltip instanceSelectWrapper rounded-md py-3 px-3">
                                <span class='tooltip rounded shadow-lg p-1 bg-gray-100 text-black' style="margin-top:3em">
                                    <div class="stats">
                                        <% if(data.canAccessLocation) { %>
                                            <label>Location</label>
                                            <span><%= data.normalUrl %></span>
                                        <% } %>
                                        <label>Online</label>
                                        <span><%= data.online %></span>
                                        <label>Running</label>
                                        <span><%= data.running %></span>
                                        <label>Error</label>
                                        <span><%= data.error %></span>
                                    </div>
                                </span>
                                <span>
                                    <a class="instanceSelect hover:font-bold" href="/?instance=<%= data.friendly %>">
                                        <%= data.friendly %>
                                    </a>
                                    <span class="inline-block mb-0.5 ml-0.5 w-2 h-2 bg-<%= data.online ? 'green' : 'gray' %>-400 rounded-full"></span>
                                </span>
                            </span>
                        </li>
                    <% }) %>
                </ul>
                <% } %>
            </div>
            <div class="flex items-center flex-end text-sm">
                <span class="inline-block mr-4">
                    <label style="font-size:2.5px;">
                        <input class='toggle-checkbox' type='checkbox' id="themeToggle" checked>
                        <div class='toggle-slot'>
                            <div class='sun-icon-wrapper'>
                                <div class="iconify sun-icon" data-icon="feather-sun" data-inline="false"></div>
                            </div>
                            <div class='toggle-button'></div>
                            <div class='moon-icon-wrapper'>
                                <div class="iconify moon-icon" data-icon="feather-moon" data-inline="false"></div>
                            </div>
                        </div>
                    </label>
                </span>
                <a href="logout">Logout</a>
            </div>
        </div>
    </div>
</div>
src/Web/assets/views/partials/instanceTabJs.ejs (Normal file, 16 lines)
@@ -0,0 +1,16 @@
<script>
    const instanceSearchParams = new URLSearchParams(window.location.search);

    const instance = instanceSearchParams.get('instance');

    document.querySelectorAll(`[data-instance].instanceSelectWrapper`).forEach((el) => {
        if (el.dataset.instance === instance) {
            el.classList.add('border-2');
            el.querySelector('a.instanceSelect').classList.add('pointer-events-none', 'no-underline', 'font-bold');
        } else {
            el.classList.add('border');
            el.querySelector('a.instanceSelect').classList.add('font-normal', 'pointer');
        }
    });
</script>
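instanceTabJs.ejs applies classes based on whether a tab's `data-instance` matches the `instance` query parameter. The branch can be factored into a DOM-free helper (a sketch, not part of the commit) that returns the classes to add:

```javascript
// Classes mirror those added in instanceTabJs.ejs: the selected tab is
// emphasized and made non-clickable; all other tabs stay as normal links.
const tabClasses = (tabInstance, selectedInstance) => {
    if (tabInstance === selectedInstance) {
        return { wrapper: ['border-2'], link: ['pointer-events-none', 'no-underline', 'font-bold'] };
    }
    return { wrapper: ['border'], link: ['font-normal', 'pointer'] };
};

// Hypothetical instance names: 'prod' is the value of the ?instance= param
console.log(tabClasses('prod', 'prod').link);   // prints [ 'pointer-events-none', 'no-underline', 'font-bold' ]
console.log(tabClasses('dev', 'prod').wrapper); // prints [ 'border' ]
```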
src/Web/assets/views/partials/loadingIcon.ejs (Normal file, 15 lines)
@@ -0,0 +1,15 @@
<svg class="loading" version="1.1" id="L9" xmlns="http://www.w3.org/2000/svg"
     xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
     viewBox="0 0 100 100" xml:space="preserve">
    <path
        d="M73,50c0-12.7-10.3-23-23-23S27,37.3,27,50 M30.9,50c0-10.5,8.5-19.1,19.1-19.1S69.1,39.5,69.1,50">
        <animateTransform
            attributeName="transform"
            attributeType="XML"
            type="rotate"
            dur="1s"
            from="0 50 50"
            to="360 50 50"
            repeatCount="indefinite"/>
    </path>
</svg>
After Width: | Height: | Size: 596 B