33 Commits

Author SHA1 Message Date
Hendrik Eeckhaut
8e2a944243 build: Pin transitive dependencies for tlsn alpha.12 (#115) 2025-10-02 19:45:30 +02:00
Piotr Żelazko
e6b7db5acf fix: ensure transcript decoding supports unicode (#112) 2025-09-16 21:22:34 +08:00
dan
66ec4343e8 chore: update config options + update to alpha.12 (#113)
* chore: update config options

* Alpha.12
---------

Co-authored-by: Hendrik Eeckhaut <hendrik@eeckhaut.org>
2025-06-19 12:23:49 -07:00
Hendrik Eeckhaut
f51ddbf3de ci: notary server has tls disabled by default since alpha.11 (#111) 2025-06-02 11:32:32 +02:00
Hendrik Eeckhaut
1cb664b341 Alpha.11 (#109)
Co-authored-by: yuroitaki <25913766+yuroitaki@users.noreply.github.com>
2025-05-30 10:22:11 +02:00
Hendrik Eeckhaut
4cecbb5334 Use Playwright to test demos (#106)
* Run tests and demos with playwright
* ci: renamed workflow
* Improved demo readmes
* Use a separate page for each test
2025-05-28 08:50:05 +02:00
Hendrik Eeckhaut
8bc8a94948 chore: Use raw.githubusercontent.com instead of swapi for the demos (#105)
+ use a local proxy for testing
+ avoid duplicate github action runs
2025-05-12 16:34:36 +02:00
Hendrik Eeckhaut
8bf3745407 Update to tlsnotary v0.1.0-alpha.10 (#104)
* Update to tlsnotary v0.1.0-alpha.10
* Prove data from GitHub server instead of swapi in tests
* Log browser messages during test execution
2025-04-25 14:00:14 +02:00
Hendrik Eeckhaut
a418082762 chore: prepare 0.1.0-alpha.9.1 release 2025-04-03 10:05:33 +02:00
Hendrik Eeckhaut
07f2645a65 chore: update examples to alpha.9 + improve readme (#102) 2025-04-03 09:52:11 +02:00
Grzegorz Pociejewski
c70abc5eb2 Pin wasm version (#103) 2025-04-03 03:49:04 -04:00
Hendrik Eeckhaut
be717f4260 ci: correctly set auth token for npm publish 2025-03-28 08:58:25 +01:00
Hendrik Eeckhaut
0372b8e8aa Update to alpha.9 + added way to co-develop tlsn-wasm (#99) 2025-03-27 14:50:16 +01:00
Hendrik Eeckhaut
bf0114085f update readme (#98) 2025-03-26 17:06:40 +01:00
Hendrik Eeckhaut
d2ae87106f ci: use npm instead of pnpm (#100) 2025-03-17 17:41:24 +01:00
Hendrik Eeckhaut
9fad45e174 chore: run notary server in test with docker (#95) 2025-03-14 03:48:45 -04:00
tsukino
18268f716f feat: upgrade to alpha.8 (#93) 2025-03-12 12:04:58 -04:00
tsukino
91f74471c7 feat: return raw data from transcript and move parsing to client side (#92) 2025-02-28 04:28:06 -05:00
Tanner
0133efe529 Demo UI (#85)
* feat: webp2p demo ui update

* feat: webp2p demo ui update

* fix: webp2p webpack

* chore: linting

* feat: styled react-ts demo, fixed interactive demo bug

* chore: cleanup

* Improved React demo

* fix: unnessecary dependencies in package

* Improved interactive demo

* Fixed readme

* Removed unused type

---------

Co-authored-by: Hendrik Eeckhaut <hendrik@eeckhaut.org>
2025-02-05 11:21:13 -08:00
Hendrik Eeckhaut
14eee4a2e3 fix: make sure the tests run in ci
#89
2025-01-22 17:37:13 +01:00
fan ka
2b783618d2 fix: change webpack dev server port to avoid websocket conflict (#77)
chore: change webpack dev server port to avoid websocket conflict
2024-12-05 15:54:43 +01:00
Hendrik Eeckhaut
d7b250db45 feat: Improvements to interactive-demo (#78) 2024-12-04 05:53:01 -05:00
tsukino
a82add9a05 feat: Playground for P2P demo using websocket and streams (#76) 2024-12-03 07:54:26 -05:00
tsukino
d64361c785 chore: move interactive-demo inside demo folder (#75) 2024-11-09 10:07:09 +07:00
Hendrik Eeckhaut
1ded4136bf Interactive Prover in Typescript (Verifier in Rust) (#74)
feat: interactive verifier demo

* Verifier in Rust
* Prover in both Rust and TypeScript/React

---------
Co-authored-by: Pete Thomas <pete@xminusone.net>
Co-authored-by: yuroitaki <25913766+yuroitaki@users.noreply.github.com>
2024-11-05 22:41:30 +07:00
tsukino
18a30d32c1 fix: bump tlsn-wasm to alpha.7.1 and move to dependency (#71) 2024-10-04 05:34:46 -04:00
tsukino
b95c8f0159 feat: integrate with alpha.7 (#67) 2024-10-04 04:40:08 -04:00
Hendrik Eeckhaut
756cdb4f1f doc: added warning: do not treat as api yet (#55) 2024-09-20 01:20:16 -04:00
Hendrik Eeckhaut
313896a986 doc: Fix quickstart to work with notary server alpha.6 (#64) 2024-08-16 07:45:20 -04:00
tsukino
7b78fbca61 feat: add helper method and expose wasm types (#63) 2024-08-14 06:04:54 -04:00
tsukino
24aa329210 chore: update readme (#62) 2024-08-12 06:10:21 -04:00
tsukino
e4a8a9c723 feat: upgrade to alpha.6 and new tlsn-wasm prover (#61)
* Revert "chore: release v0.1.0-alpha.6 (#57)"

This reverts commit 292b4263d7.

* parsed json

* add json commitments

* parse json from transcript

* wip

* wip

* wip

* feat: update to alpha.6

* chore: commit wasm pkg directory

* chore: version nump

* fix: test suite for alpha.6

* fix: remove wasm build from ci

* fix: update pnpm lockfile

* fix: remove test:wasm

* fix: linter and add new devDependency
2024-08-12 05:52:19 -04:00
Hendrik Eeckhaut
292b4263d7 chore: release v0.1.0-alpha.6 (#57) 2024-06-26 08:18:49 -04:00
100 changed files with 20475 additions and 8207 deletions

View File

@@ -29,8 +29,7 @@
"build",
"test-build",
"dev-build",
"wasm",
"utils",
"tlsn-wasm",
"webpack.config.js",
"webpack.test.config.js"
]

View File

@@ -1,72 +0,0 @@
name: build
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
env:
PUPPETEER_SKIP_DOWNLOAD: true
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install stable rust toolchain
uses: dtolnay/rust-toolchain@stable
with:
targets: wasm32-unknown-unknown
toolchain: nightly
- name: Use caching
uses: Swatinem/rust-cache@v2
with:
workspaces: wasm/prover
- name: Install Node.js
uses: actions/setup-node@v4
with:
node-version: 18
- uses: pnpm/action-setup@v3
name: Install pnpm
with:
version: 8
run_install: false
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- uses: actions/cache@v4
name: Setup pnpm cache
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Install wasm-pack
run: npm install -g wasm-pack
- name: Install nightly tool-chain
run: rustup component add rust-src --toolchain nightly-x86_64-unknown-linux-gnu
- name: Install dependencies
run: pnpm install
- name: Build WASM
run: npm run build:wasm
- name: Build
run: npm run build
- name: Lint
run: npm run lint

88
.github/workflows/ci.yaml vendored Normal file
View File

@@ -0,0 +1,88 @@
name: ci
on:
pull_request:
release:
types: [published]
jobs:
build-and-test:
name: Build and test
runs-on: ubuntu-latest
env:
RELEASE_MODE: 'dry-run' # dry-run by default, will be set to 'publish' for release builds
services:
notary-server:
image: ghcr.io/tlsnotary/tlsn/notary-server:v0.1.0-alpha.12
ports:
- 7047:7047
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Checkout tlsn
uses: actions/checkout@v4
with:
repository: tlsnotary/tlsn
path: tlsn-wasm/tlsn
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20
cache: 'npm'
- name: Install nightly Rust toolchain
uses: dtolnay/rust-toolchain@stable
with:
targets: wasm32-unknown-unknown
components: rust-src
toolchain: nightly
- name: Use caching
uses: Swatinem/rust-cache@v2.7.7
with:
workspaces: tlsn-wasm/tlsn -> target
cache-on-failure: true
- name: Install wasm-pack
run: curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh
- name: Install dependencies
run: npm ci
- name: Build
run: npm run build
- name: Lint
run: npm run lint
- name: install wstcp
run: cargo install wstcp
- name: Install Chromium (Playwright)
run: npx playwright install --with-deps chromium
- name: Test
run: npm run test
- name: Determine release type (dry-run or publish)
run: |
if [[ $GITHUB_EVENT_NAME == "release" ]]; then
echo "RELEASE_MODE=publish" >> $GITHUB_ENV
else
echo "RELEASE_MODE=dry-run" >> $GITHUB_ENV
fi
- name: Dry-run release (non-release builds)
if: env.RELEASE_MODE == 'dry-run'
run: npm publish --dry-run
- name: Publish to npm (GitHub Release)
if: env.RELEASE_MODE == 'publish'
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
run: |
echo "//registry.npmjs.org/:_authToken=${NODE_AUTH_TOKEN}" > .npmrc
npm publish
rm .npmrc

54
.github/workflows/playwright.yml vendored Normal file
View File

@@ -0,0 +1,54 @@
name: Tests demos
on:
pull_request:
jobs:
test:
timeout-minutes: 60
name: Tests demos
runs-on: ubuntu-latest
services:
notary-server:
image: ghcr.io/tlsnotary/tlsn/notary-server:v0.1.0-alpha.12
env:
NOTARY_SERVER__TLS__ENABLED: false
ports:
- 7047:7047
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: lts/*
- name: build tlsn-js
run: npm ci; npm run build
- name: install wstcp
run: cargo install wstcp
- name: Install Chromium (Playwright)
run: npx playwright install --with-deps chromium
- name: Test react demo
working-directory: demo/react-ts-webpack
continue-on-error: true
run: |
set -e
npm i
npm run test
- name: Test interactive verifier demo
continue-on-error: true
run: |
set -e
cd demo/interactive-demo/verifier-rs
cargo build --release
cd ../prover-ts
npm i
npm run test
- name: Test web-to-web p2p demo
working-directory: demo/react-ts-webpack
continue-on-error: true
run: |
set -e
npm run test
- uses: actions/upload-artifact@v4
if: ${{ !cancelled() }}
with:
name: playwright-report
path: '**/playwright-report/'
retention-days: 30

View File

@@ -1,87 +0,0 @@
name: test
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
env:
LOCAL-NOTARY: true
LOCAL-WS: false
HEADLESS: true
PUPPETEER_SKIP_DOWNLOAD: true
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install stable rust toolchain
uses: dtolnay/rust-toolchain@stable
with:
targets: wasm32-unknown-unknown
toolchain: nightly
- name: Use caching
uses: Swatinem/rust-cache@v2
with:
workspaces: wasm/prover
- name: Install Chrome
uses: browser-actions/setup-chrome@v1
id: setup-chrome
with:
chrome-version: 121.0.6167.85
- name: Set CHROME_PATH environment variable
run: echo "CHROME_PATH=${{ steps.setup-chrome.outputs['chrome-path'] }}" >> $GITHUB_ENV
- name: Install Node.js
uses: actions/setup-node@v4
with:
node-version: 18
- uses: pnpm/action-setup@v3
name: Install pnpm
with:
version: 8
run_install: false
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- uses: actions/cache@v4
name: Setup pnpm cache
with:
path: ${STORE_PATH}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Install wasm-pack
run: npm install -g wasm-pack
- name: Install nightly tool-chain
run: rustup component add rust-src --toolchain nightly-x86_64-unknown-linux-gnu
- name: Install dependencies
run: pnpm install
- name: Build WASM
run: npm run build:wasm
- name: Test WASM
run: npm run test:wasm
- name: Build Test dependencies
run: npm run build:tlsn-binaries
- name: Test
run: npm run test

16
.gitignore vendored
View File

@@ -1,7 +1,3 @@
wasm/prover/pkg
wasm/prover/target
wasm/prover/Cargo.lock
wasm-pack.log
node_modules/
.idea/
.DS_Store
@@ -9,7 +5,11 @@ build/
dev-build/
test-build/
./demo/node_modules
./demo/package-lock.json
package-lock.json
pnpm-lock.yaml
yarn.lock
utils/tlsn
.vscode
# Playwright
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/

View File

@@ -1,6 +1,6 @@
{
"rust-analyzer.linkedProjects": [
"wasm/prover/Cargo.toml"
"demo/interactive-demo/verifier-rs/Cargo.toml",
"demo/interactive-demo/prover-rs/Cargo.toml"
],
"rust-analyzer.cargo.target": "wasm32-unknown-unknown"
}
}

2
demo/interactive-demo/.gitignore vendored Normal file
View File

@@ -0,0 +1,2 @@
**/target/
**/Cargo.lock

View File

@@ -0,0 +1,59 @@
# Interactive Verifier Demo
This demo shows how to use TLSNotary **without a notary**: a direct proof between a prover and a verifier, where the verifier checks both the TLS session and the revealed data.
There are two prover implementations:
- **Rust**
- **TypeScript** (browser)
The verifier is implemented in Rust.
---
## Interactive Verifier Demo with Rust Prover
1. **Start the verifier:**
```bash
cd verifier-rs
cargo run --release
```
2. **Run the prover:**
```bash
cd prover-rs
cargo run --release
```
---
## Interactive Verifier Demo with TypeScript Prover (Browser)
1. **Start the verifier:**
```bash
cd verifier-rs
cargo run --release
```
2. **Set up a websocket proxy for raw.githubusercontent.com**
Browsers cannot make raw TCP connections, so a websocket proxy is required:
```bash
cargo install wstcp
wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443
```
3. **Run the prover in the browser:**
1. **Build tlsn-js**
```bash
cd ..
npm install
npm run build
```
2. **Build and start the TypeScript prover demo**
```bash
cd prover-ts
npm install
npm run dev
```
3. **Open the demo in your browser:**
Go to [http://localhost:8080/](http://localhost:8080/) and click **Start Prover**.
---
**Tip:**
If you encounter issues, make sure all dependencies are installed and the websocket proxy is running before starting the browser demo.
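
For orientation, the flow that either prover walks through is sketched below in Rust. This is a condensed, non-authoritative version of the `prover-rs/src/main.rs` added in this change, assuming the alpha.12 crates pinned in its `Cargo.toml`: connect to the verifier over a websocket, run MPC-TLS against the server through an ordinary TCP socket driven by hyper, then selectively reveal transcript ranges. Error handling is collapsed to `unwrap`, and unlike the real demo the whole transcript is revealed instead of redacting a secret header.

```rust
use async_tungstenite::{tokio::connect_async_with_config, tungstenite::protocol::WebSocketConfig};
use http_body_util::Empty;
use hyper::{body::Bytes, Request};
use hyper_util::rt::TokioIo;
use rangeset::RangeSet;
use tlsn_common::config::ProtocolConfig;
use tlsn_core::ProveConfig;
use tlsn_prover::{Prover, ProverConfig};
use tokio_util::compat::{FuturesAsyncReadCompatExt, TokioAsyncReadCompatExt};
use ws_stream_tungstenite::WsStream;

#[tokio::main]
async fn main() {
    // 1. Connect to the verifier's websocket endpoint and run the MPC setup over it.
    //    WsStream exposes the websocket as an async byte stream the prover can use.
    let (ws, _) = connect_async_with_config(
        "ws://localhost:9816/verify",
        Some(WebSocketConfig::default()),
    )
    .await
    .unwrap();
    let prover = Prover::new(
        ProverConfig::builder()
            .server_name("raw.githubusercontent.com")
            .protocol_config(
                ProtocolConfig::builder()
                    .max_sent_data(1 << 12)
                    .max_recv_data(1 << 14)
                    .build()
                    .unwrap(),
            )
            .build()
            .unwrap(),
    )
    .setup(WsStream::new(ws))
    .await
    .unwrap();

    // 2. Open a plain TCP connection to the TLS server; the prover runs MPC-TLS over it.
    let tcp = tokio::net::TcpStream::connect(("raw.githubusercontent.com", 443))
        .await
        .unwrap();
    let (mpc_tls, prover_fut) = prover.connect(tcp.compat()).await.unwrap();
    let prover_task = tokio::spawn(prover_fut);

    // 3. Drive an ordinary HTTP/1.1 request through the MPC-TLS connection with hyper.
    let (mut sender, conn) =
        hyper::client::conn::http1::handshake(TokioIo::new(mpc_tls.compat()))
            .await
            .unwrap();
    tokio::spawn(conn);
    let request = Request::builder()
        .uri("https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json")
        .header("Host", "raw.githubusercontent.com")
        .header("Connection", "close")
        .method("GET")
        .body(Empty::<Bytes>::new())
        .unwrap();
    let _response = sender.send_request(request).await.unwrap();

    // 4. Decide what to reveal and prove it to the verifier.
    let mut prover = prover_task.await.unwrap().unwrap();
    let sent_all: RangeSet<usize> = [0..prover.transcript().sent().len()].into();
    let recv_all: RangeSet<usize> = [0..prover.transcript().received().len()].into();
    let mut builder = ProveConfig::builder(prover.transcript());
    builder.server_identity(); // reveal the server's DNS name
    let _ = builder.reveal_sent(&sent_all);
    let _ = builder.reveal_recv(&recv_all);
    let config = builder.build().unwrap();
    prover.prove(&config).await.unwrap();
    prover.close().await.unwrap();
}
```

The actual `main.rs` further down additionally sends a `Secret` header, redacts it from the sent data, and reveals only the `name` and `street` fields of the JSON response; the TypeScript prover in `prover-ts` does the equivalent through the `tlsn-js` API.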

View File

@@ -0,0 +1,38 @@
[package]
name = "interactive-networked-prover"
version = "0.1.0"
edition = "2021"
[dependencies]
async-tungstenite = { version = "0.25", features = ["tokio-runtime"] }
futures = "0.3"
http = "1.1"
http-body-util = "0.1"
hyper = { version = "1.1", features = ["client", "http1"] }
hyper-util = { version = "0.1", features = ["full"] }
regex = "1.10.3"
tokio = { version = "1", features = [
"rt",
"rt-multi-thread",
"macros",
"net",
"io-std",
"fs",
] }
tokio-util = { version = "0.7", features = ["compat"] }
tracing = "0.1.40"
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }
uuid = { version = "1.4.1", features = ["v4", "fast-rng"] }
ws_stream_tungstenite = { version = "0.13", features = ["tokio_io"] }
tlsn-core = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.12", package = "tlsn-core" }
tlsn-prover = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.12", package = "tlsn-prover" }
tlsn-common = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.12", package = "tlsn-common" }
spansy = { git = "https://github.com/tlsnotary/tlsn-utils", package = "spansy", branch = "dev" }
rangeset = "0.2.0"
# --- Transitive dependency pins (for TLSNotary alpha.12) ---
aes = "=0.9.0-rc.0"
cipher = "=0.5.0-rc.0"
crypto-common = "=0.2.0-rc.3"
inout = "=0.2.0-rc.5"

View File

@@ -0,0 +1,9 @@
## Interactive Prover
An implementation of the interactive prover in Rust.
## Running the prover
1. Configure the prover settings via the global variables defined in [main.rs](./src/main.rs); make sure the domain of the hardcoded `SERVER_URL` matches `SERVER_DOMAIN` on the verifier side (see the constants sketch after this list).
2. Start the prover by running the following in a terminal at the root of this crate.
```bash
cargo run --release
```
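
Concretely, the globals mentioned in step 1 are plain constants near the top of `main.rs` (shown in full below); the only coupling with the verifier is that the domain of `SERVER_URL` must equal the verifier's hardcoded `SERVER_DOMAIN`. The tiny `main` here is only an illustrative sanity check, not part of the demo:

```rust
// Constants copied from src/main.rs; adjust them to point at your own verifier and server.
const VERIFIER_HOST: &str = "localhost";
const VERIFIER_PORT: u16 = 9816;
// Upper bounds on the data sent to / received from the TLS server.
const MAX_SENT_DATA: usize = 1 << 12;
const MAX_RECV_DATA: usize = 1 << 14;
// The domain of this URL must match SERVER_DOMAIN on the verifier side.
const SERVER_URL: &str = "https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json";

fn main() {
    // Quick check that the URL really points at the domain the verifier expects.
    let uri = SERVER_URL.parse::<http::Uri>().unwrap();
    assert_eq!(uri.authority().unwrap().host(), "raw.githubusercontent.com");
    println!(
        "prover -> {SERVER_URL} via verifier {VERIFIER_HOST}:{VERIFIER_PORT} (max {MAX_SENT_DATA} B sent, {MAX_RECV_DATA} B received)"
    );
}
```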

View File

@@ -0,0 +1,191 @@
use async_tungstenite::{tokio::connect_async_with_config, tungstenite::protocol::WebSocketConfig};
use http_body_util::Empty;
use hyper::{body::Bytes, Request, StatusCode, Uri};
use hyper_util::rt::TokioIo;
use rangeset::RangeSet;
use spansy::{
http::parse_response,
json::{self},
Spanned,
};
use tlsn_common::config::ProtocolConfig;
use tlsn_core::ProveConfig;
use tlsn_prover::{Prover, ProverConfig};
use tokio::io::{AsyncRead, AsyncWrite};
use tokio_util::compat::{FuturesAsyncReadCompatExt, TokioAsyncReadCompatExt};
use tracing::{debug, info};
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt, EnvFilter};
use ws_stream_tungstenite::WsStream;
const TRACING_FILTER: &str = "INFO";
const VERIFIER_HOST: &str = "localhost";
const VERIFIER_PORT: u16 = 9816;
// Maximum number of bytes that can be sent from prover to server
const MAX_SENT_DATA: usize = 1 << 12;
// Maximum number of bytes that can be received by prover from server
const MAX_RECV_DATA: usize = 1 << 14;
const SECRET: &str = "TLSNotary's private key 🤡";
/// Make sure the following url's domain is the same as SERVER_DOMAIN on the verifier side
const SERVER_URL: &str = "https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json";
#[tokio::main]
async fn main() {
tracing_subscriber::registry()
.with(EnvFilter::try_from_default_env().unwrap_or_else(|_| TRACING_FILTER.into()))
.with(tracing_subscriber::fmt::layer())
.init();
run_prover(VERIFIER_HOST, VERIFIER_PORT, SERVER_URL).await;
}
async fn run_prover(verifier_host: &str, verifier_port: u16, server_uri: &str) {
info!("Sending websocket request...");
let request = http::Request::builder()
.uri(format!("ws://{verifier_host}:{verifier_port}/verify",))
.header("Host", verifier_host)
.header("Sec-WebSocket-Key", uuid::Uuid::new_v4().to_string())
.header("Sec-WebSocket-Version", "13")
.header("Connection", "Upgrade")
.header("Upgrade", "Websocket")
.body(())
.unwrap();
let (verifier_ws_stream, _) =
connect_async_with_config(request, Some(WebSocketConfig::default()))
.await
.unwrap();
info!("Websocket connection established!");
let verifier_ws_socket = WsStream::new(verifier_ws_stream);
prover(verifier_ws_socket, server_uri).await;
info!("Proving is successful!");
}
async fn prover<T: AsyncWrite + AsyncRead + Send + Unpin + 'static>(verifier_socket: T, uri: &str) {
debug!("Starting proving...");
let uri = uri.parse::<Uri>().unwrap();
assert_eq!(uri.scheme().unwrap().as_str(), "https");
let server_domain = uri.authority().unwrap().host();
let server_port = uri.port_u16().unwrap_or(443);
// Create prover and connect to verifier.
//
// Perform the setup phase with the verifier.
let prover = Prover::new(
ProverConfig::builder()
.server_name(server_domain)
.protocol_config(
ProtocolConfig::builder()
.max_sent_data(MAX_SENT_DATA)
.max_recv_data(MAX_RECV_DATA)
.build()
.unwrap(),
)
.build()
.unwrap(),
)
.setup(verifier_socket.compat())
.await
.unwrap();
// Connect to TLS Server.
let tls_client_socket = tokio::net::TcpStream::connect((server_domain, server_port))
.await
.unwrap();
// Pass server connection into the prover.
let (mpc_tls_connection, prover_fut) =
prover.connect(tls_client_socket.compat()).await.unwrap();
// Wrap the connection in a TokioIo compatibility layer to use it with hyper.
let mpc_tls_connection = TokioIo::new(mpc_tls_connection.compat());
// Spawn the Prover to run in the background.
let prover_task = tokio::spawn(prover_fut);
// MPC-TLS Handshake.
let (mut request_sender, connection) =
hyper::client::conn::http1::handshake(mpc_tls_connection)
.await
.unwrap();
tokio::spawn(connection);
// MPC-TLS: Send Request and wait for Response.
info!("Send Request and wait for Response");
let request = Request::builder()
.uri(uri.clone())
.header("Host", server_domain)
.header("Connection", "close")
.header("Secret", SECRET)
.method("GET")
.body(Empty::<Bytes>::new())
.unwrap();
let response = request_sender.send_request(request).await.unwrap();
debug!("TLS response: {:?}", response);
assert!(response.status() == StatusCode::OK);
// Create proof for the Verifier.
let mut prover = prover_task.await.unwrap().unwrap();
let mut builder: tlsn_core::ProveConfigBuilder<'_> = ProveConfig::builder(prover.transcript());
// Reveal the DNS name.
builder.server_identity();
let sent_rangeset = redact_and_reveal_sent_data(prover.transcript().sent());
let _ = builder.reveal_sent(&sent_rangeset);
let recv_rangeset = redact_and_reveal_received_data(prover.transcript().received());
let _ = builder.reveal_recv(&recv_rangeset);
let config = builder.build().unwrap();
prover.prove(&config).await.unwrap();
prover.close().await.unwrap();
}
/// Redacts and reveals received data to the verifier.
fn redact_and_reveal_received_data(recv_transcript: &[u8]) -> RangeSet<usize> {
// Extract some information from the received data.
let received_string = String::from_utf8(recv_transcript.to_vec()).unwrap();
debug!("Received data: {}", received_string);
let resp = parse_response(recv_transcript).unwrap();
let body = resp.body.unwrap();
let mut json = json::parse_slice(body.as_bytes()).unwrap();
json.offset(body.content.span().indices().min().unwrap());
let name = json.get("information.name").expect("name field not found");
let street = json
.get("information.address.street")
.expect("street field not found");
let name_start = name.span().indices().min().unwrap() - 9; // 9 is the length of the preceding `"name": "`
let name_end = name.span().indices().max().unwrap() + 1; // include the closing `"`
let street_start = street.span().indices().min().unwrap() - 11; // 11 is the length of the preceding `"street": "`
let street_end = street.span().indices().max().unwrap() + 1; // include the closing `"`
[name_start..name_end + 1, street_start..street_end + 1].into()
}
/// Redacts and reveals sent data to the verifier.
fn redact_and_reveal_sent_data(sent_transcript: &[u8]) -> RangeSet<usize> {
let sent_transcript_len = sent_transcript.len();
let sent_string: String = String::from_utf8(sent_transcript.to_vec()).unwrap();
let secret_start = sent_string.find(SECRET).unwrap();
debug!("Send data: {}", sent_string);
// Reveal everything except for the SECRET.
[
0..secret_start,
secret_start + SECRET.len()..sent_transcript_len,
]
.into()
}

View File

@@ -0,0 +1,8 @@
package-lock.json
# Playwright
node_modules/
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/

View File

@@ -0,0 +1,16 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>TLSNotary React TypeScript Demo</title>
</head>
<body>
<script>
</script>
<div id="root"></div>
</body>
</html>

View File

@@ -0,0 +1,41 @@
{
"name": "prover-ts",
"version": "1.0.0",
"description": "",
"main": "webpack.js",
"scripts": {
"dev": "webpack-dev-server --config webpack.js",
"start": "webpack serve --config webpack.js",
"test": "npx playwright test"
},
"author": "",
"license": "ISC",
"dependencies": {
"@fortawesome/fontawesome-free": "^6.7.1",
"css-loader": "^7.1.2",
"postcss-loader": "^8.1.1",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react-loader-spinner": "^6.1.6",
"style-loader": "^4.0.0",
"tlsn-js": "../../.."
},
"devDependencies": {
"@types/react": "^18.0.26",
"@types/react-dom": "^18.0.10",
"babel-loader": "^9.1.3",
"copy-webpack-plugin": "^11.0.0",
"html-webpack-plugin": "^5.5.0",
"postcss": "^8.4.49",
"postcss-preset-env": "^10.1.1",
"sass": "^1.82.0",
"sass-loader": "^16.0.4",
"source-map-loader": "^5.0.0",
"tailwindcss": "^3.4.16",
"ts-loader": "^9.4.2",
"typescript": "^4.9.4",
"webpack": "^5.75.0",
"webpack-cli": "^4.10.0",
"webpack-dev-server": "^4.11.1"
}
}

View File

@@ -0,0 +1,90 @@
import { defineConfig, devices } from '@playwright/test';
/**
* Read environment variables from file.
* https://github.com/motdotla/dotenv
*/
// import dotenv from 'dotenv';
// import path from 'path';
// dotenv.config({ path: path.resolve(__dirname, '.env') });
/**
* See https://playwright.dev/docs/test-configuration.
*/
export default defineConfig({
testDir: './tests',
/* Run tests in files in parallel */
fullyParallel: true,
/* Fail the build on CI if you accidentally left test.only in the source code. */
forbidOnly: !!process.env.CI,
/* Retry on CI only */
retries: process.env.CI ? 2 : 0,
/* Opt out of parallel tests on CI. */
workers: process.env.CI ? 1 : undefined,
/* Reporter to use. See https://playwright.dev/docs/test-reporters */
reporter: 'html',
/* Shared settings for all the projects below. See https://playwright.dev/docs/api/class-testoptions. */
use: {
/* Base URL to use in actions like `await page.goto('/')`. */
baseURL: 'http://localhost:8080',
/* Collect trace when retrying the failed test. See https://playwright.dev/docs/trace-viewer */
trace: 'on-first-retry',
},
/* Configure projects for major browsers */
projects: [
{
name: 'chromium',
use: { ...devices['Desktop Chrome'] },
},
// {
// name: 'firefox',
// use: { ...devices['Desktop Firefox'] },
// },
// {
// name: 'webkit',
// use: { ...devices['Desktop Safari'] },
// },
/* Test against mobile viewports. */
// {
// name: 'Mobile Chrome',
// use: { ...devices['Pixel 5'] },
// },
// {
// name: 'Mobile Safari',
// use: { ...devices['iPhone 12'] },
// },
/* Test against branded browsers. */
// {
// name: 'Microsoft Edge',
// use: { ...devices['Desktop Edge'], channel: 'msedge' },
// },
// {
// name: 'Google Chrome',
// use: { ...devices['Desktop Chrome'], channel: 'chrome' },
// },
],
/* Run your local dev server before starting the tests */
webServer: [
{
command: 'npm run start',
url: 'http://localhost:8080',
reuseExistingServer: !process.env.CI,
},
{
command: 'wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443',
reuseExistingServer: true,
},
{
command: 'cargo run --release',
cwd: '../verifier-rs',
reuseExistingServer: true,
}
]
});

View File

@@ -0,0 +1,4 @@
const tailwindcss = require("tailwindcss");
module.exports = {
plugins: ["postcss-preset-env", tailwindcss],
};

View File

@@ -0,0 +1,3 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

View File

@@ -0,0 +1,268 @@
import React, { ReactElement, useCallback, useState } from 'react';
import { createRoot } from 'react-dom/client';
import * as Comlink from 'comlink';
import { Watch } from 'react-loader-spinner';
import { Prover as TProver } from 'tlsn-js';
import { type Method } from 'tlsn-wasm';
import './app.scss';
import { HTTPParser } from 'http-parser-js';
import { Reveal, mapStringToRange, subtractRanges } from 'tlsn-js';
const { init, Prover }: any = Comlink.wrap(
new Worker(new URL('./worker.ts', import.meta.url)),
);
const container = document.getElementById('root');
const root = createRoot(container!);
root.render(<App />);
const serverUrl = 'https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json';
// const websocketProxyUrl = `wss://notary.pse.dev/proxy`;
const websocketProxyUrl = 'ws://localhost:55688';
const verifierProxyUrl = 'ws://localhost:9816/verify';
function App(): ReactElement {
const [processing, setProcessing] = useState(false);
const [result, setResult] = useState<string | null>(null);
const onClick = useCallback(async () => {
setProcessing(true);
const url = serverUrl;
const method: Method = 'GET';
const headers = {
secret: "TLSNotary's private key",
'Content-Type': 'application/json',
};
const body = {};
const hostname = new URL(url).hostname;
let prover: TProver;
try {
console.time('setup');
await init({ loggingLevel: 'Info' });
console.log('Setting up Prover for', hostname);
prover = (await new Prover({
serverDns: hostname,
maxRecvData: 2000
})) as TProver;
console.log('Setting up Prover: 1/2');
await prover.setup(verifierProxyUrl);
console.log('Setting up Prover: done');
console.timeEnd('setup');
} catch (error) {
const msg = `Error setting up prover: ${error}`;
console.error(msg);
setResult(msg);
setProcessing(false);
return;
}
let transcript;
try {
console.time('request');
console.log('Sending request to proxy');
const resp = await prover.sendRequest(
`${websocketProxyUrl}?token=${hostname}`,
{ url, method, headers, body },
);
console.log('Response:', resp);
console.log('Wait for transcript');
transcript = await prover.transcript();
console.log('Transcript:', transcript);
console.timeEnd('request');
} catch (error) {
const msg = `Error sending request: ${error}`;
console.error(msg);
setResult(msg);
setProcessing(false);
return;
}
try {
const { sent, recv } = transcript;
const {
info: recvInfo,
headers: recvHeaders,
body: recvBody,
} = parseHttpMessage(Buffer.from(recv), 'response');
const body = JSON.parse(recvBody[0].toString());
console.log("test", body.information.address.street);
console.time('reveal');
const reveal: Reveal = {
sent: subtractRanges(
{ start: 0, end: sent.length },
mapStringToRange(
// Redact the value of the `secret` header sent above so it is not revealed to the verifier.
["secret: TLSNotary's private key"],
Buffer.from(sent).toString('utf-8'),
),
),
recv: [
...mapStringToRange(
[
recvInfo,
`${recvHeaders[4]}: ${recvHeaders[5]}\r\n`,
`${recvHeaders[6]}: ${recvHeaders[7]}\r\n`,
`${recvHeaders[8]}: ${recvHeaders[9]}\r\n`,
`${recvHeaders[10]}: ${recvHeaders[11]}\r\n`,
`${recvHeaders[12]}: ${recvHeaders[13]}`,
`${recvHeaders[14]}: ${recvHeaders[15]}`,
`${recvHeaders[16]}: ${recvHeaders[17]}`,
`${recvHeaders[18]}: ${recvHeaders[19]}`,
`"name": "${body.information.name}"`,
`"street": "${body.information.address.street}"`,
],
Buffer.from(recv).toString('utf-8'),
),
],
server_identity: true,
};
console.log('Start reveal:', reveal);
await prover.reveal(reveal);
console.timeEnd('reveal');
} catch (error) {
console.dir(error);
console.error('Error during data reveal:', error);
setResult(`${error}`);
setProcessing(false);
return;
}
console.log('Ready');
console.log('Unredacted data:', {
sent: transcript.sent,
received: transcript.recv,
});
setResult(
"Unredacted data successfully revealed to Verifier. Check the Verifier's console output to see what exactly was shared and revealed.",
);
setProcessing(false);
}, [setResult, setProcessing]);
return (
<div className="flex flex-col items-center justify-center w-full min-h-screen bg-gray-50 p-4">
<h1 className="text-4xl font-bold text-slate-500 mb-2">TLSNotary</h1>
<span className="text-lg text-gray-600 mb-4">
Interactive Prover Demo
</span>
<div className="text-center text-gray-700 mb-6">
<p>
Before clicking the <span className="font-semibold">Start</span>{' '}
button, make sure the <i>interactive verifier</i> and the{' '}
<i>web socket proxy</i> are running.
</p>
<p>
Check the{' '}
<a href="README.md" className="text-blue-600 hover:underline">
README
</a>{' '}
for the details.
</p>
<table className="text-left table-auto w-full mt-4">
<thead>
<tr>
<th className="px-4 py-2 text-left">Demo Settings</th>
<th className="px-4 py-2 text-left">URL</th>
</tr>
</thead>
<tbody>
<tr>
<td className="border px-4 py-2">Server</td>
<td className="border px-4 py-2">{serverUrl}</td>
</tr>
<tr>
<td className="border px-4 py-2">Verifier</td>
<td className="border px-4 py-2">{verifierProxyUrl}</td>
</tr>
<tr>
<td className="border px-4 py-2">WebSocket Proxy</td>
<td className="border px-4 py-2">{websocketProxyUrl}</td>
</tr>
</tbody>
</table>
</div>
<button
onClick={!processing ? onClick : undefined}
disabled={processing}
className={`px-6 py-2 rounded-lg font-medium text-white
${processing ? 'bg-slate-400 cursor-not-allowed' : 'bg-slate-600 hover:bg-slate-700'}
`}
>
Start Prover
</button>
<div className="mt-6 w-full max-w-3xl text-center">
<b className="text-lg font-medium text-gray-800">Proof: </b>
{!processing && !result ? (
<i className="text-gray-500">Not started yet</i>
) : !result ? (
<div className="flex flex-col items-center justify-center">
<p className="text-gray-700 mb-2">Proving data from GitHub...</p>
<Watch
visible={true}
height="40"
width="40"
radius="48"
color="#4A5568"
ariaLabel="watch-loading"
wrapperStyle={{}}
wrapperClass=""
/>
<p className="text-sm text-gray-500 mt-2">
Open <i>Developer Tools</i> to follow progress
</p>
</div>
) : (
<div className="bg-gray-100 border border-gray-300 p-4 rounded-lg mt-4">
<pre data-testid="proof-data" className="text-left text-sm text-gray-800 whitespace-pre-wrap overflow-auto">
{JSON.stringify(result, null, 2)}
</pre>
</div>
)}
</div>
</div>
);
}
function parseHttpMessage(buffer: Buffer, type: 'request' | 'response') {
const parser = new HTTPParser(
type === 'request' ? HTTPParser.REQUEST : HTTPParser.RESPONSE,
);
const body: Buffer[] = [];
let complete = false;
let headers: string[] = [];
parser.onBody = (t) => {
body.push(t);
};
parser.onHeadersComplete = (res) => {
headers = res.headers;
};
parser.onMessageComplete = () => {
complete = true;
};
parser.execute(buffer);
parser.finish();
if (!complete) throw new Error(`Could not parse ${type.toUpperCase()}`);
return {
info: buffer.toString('utf-8').split('\r\n')[0] + '\r\n',
headers,
body,
};
}

View File

@@ -0,0 +1,7 @@
import * as Comlink from 'comlink';
import init, { Prover } from 'tlsn-js';
Comlink.expose({
init,
Prover,
});

View File

@@ -0,0 +1,12 @@
/** @type {import('tailwindcss').Config} */
module.exports = {
content: ['./src/**/*.{js,jsx,ts,tsx}'],
theme: {
extend: {
colors: {
primary: '#243f5f',
},
},
},
plugins: [],
};

View File

@@ -0,0 +1,16 @@
import { test, expect } from '@playwright/test';
test('has title', async ({ page }) => {
await page.goto('/');
await expect(page).toHaveTitle(/TLSNotary/)
});
test('run demo', async ({ page }) => {
await page.goto('/');
// Click the get started link.
await page.getByRole('button', { name: 'Start Prover' }).click();
await expect(page.getByTestId('proof-data')).toContainText('Unredacted data successfully revealed to Verifier', { timeout: 60000 });
});

View File

@@ -0,0 +1,26 @@
{
"compilerOptions": {
"target": "es5",
"lib": [
"dom",
"dom.iterable",
"esnext"
],
"allowJs": false,
"skipLibCheck": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true,
"strict": true,
"forceConsistentCasingInFileNames": true,
"noFallthroughCasesInSwitch": true,
"module": "esnext",
"moduleResolution": "node",
"resolveJsonModule": true,
"noEmit": false,
"jsx": "react"
},
"include": [
"src/app.tsx",
"src/worker.ts"
]
}

View File

@@ -0,0 +1,153 @@
var webpack = require('webpack'),
path = require('path'),
CopyWebpackPlugin = require('copy-webpack-plugin'),
HtmlWebpackPlugin = require('html-webpack-plugin');
const ASSET_PATH = process.env.ASSET_PATH || '/';
var alias = {};
var fileExtensions = [
'jpg',
'jpeg',
'png',
'gif',
'eot',
'otf',
'svg',
'ttf',
'woff',
'woff2',
];
var options = {
ignoreWarnings: [
/Circular dependency between chunks with runtime/,
/ResizeObserver loop completed with undelivered notifications/,
],
mode: 'development',
entry: {
app: path.join(__dirname, 'src', 'app.tsx'),
},
output: {
filename: '[name].bundle.js',
path: path.resolve(__dirname, 'build'),
clean: true,
publicPath: ASSET_PATH,
},
module: {
rules: [
{
test: new RegExp('.(' + fileExtensions.join('|') + ')$'),
type: 'asset/resource',
exclude: /node_modules/,
},
{
test: /\.html$/,
loader: 'html-loader',
exclude: /node_modules/,
},
{
test: /\.(ts|tsx)$/,
exclude: /node_modules/,
use: [
{
loader: 'source-map-loader',
},
{
loader: require.resolve('ts-loader'),
},
],
},
{
test: /\.(js|jsx)$/,
use: [
{
loader: 'source-map-loader',
},
{
loader: require.resolve('babel-loader'),
},
],
exclude: /node_modules/,
},
{
// look for .css or .scss files
test: /\.(css|scss)$/,
// in the `web` directory
use: [
{
loader: 'style-loader',
},
{
loader: 'css-loader',
options: { importLoaders: 1 },
},
{
loader: 'postcss-loader',
},
{
loader: 'sass-loader',
options: {
sourceMap: true,
},
},
],
},
],
},
resolve: {
alias: alias,
extensions: fileExtensions
.map((extension) => '.' + extension)
.concat(['.js', '.jsx', '.ts', '.tsx', '.css']),
fallback: {
crypto: require.resolve('crypto-browserify'),
stream: require.resolve('stream-browserify'),
vm: require.resolve('vm-browserify'),
},
},
plugins: [
new CopyWebpackPlugin({
patterns: [
{
from: 'node_modules/tlsn-js/build',
to: path.join(__dirname, 'build'),
force: true,
},
],
}),
new CopyWebpackPlugin({
patterns: [{ from: '../README.md', to: 'README.md' }],
}),
new HtmlWebpackPlugin({
template: path.join(__dirname, 'index.ejs'),
filename: 'index.html',
cache: false,
}),
new webpack.ProvidePlugin({
process: 'process/browser',
}),
new webpack.ProvidePlugin({
Buffer: ['buffer', 'Buffer'],
}),
].filter(Boolean),
// Required by wasm-bindgen-rayon, in order to use SharedArrayBuffer on the Web
// Ref:
// - https://github.com/GoogleChromeLabs/wasm-bindgen-rayon#setting-up
// - https://web.dev/i18n/en/coop-coep/
devServer: {
port: 8080,
host: 'localhost',
hot: true,
headers: {
'Cross-Origin-Embedder-Policy': 'require-corp',
'Cross-Origin-Opener-Policy': 'same-origin',
},
client: {
overlay: false,
},
},
};
module.exports = options;

View File

@@ -0,0 +1,44 @@
[package]
name = "interactive-networked-verifier"
version = "0.1.0"
edition = "2021"
[dependencies]
async-trait = "0.1.67"
async-tungstenite = { version = "0.25", features = ["tokio-native-tls"] }
axum = { version = "0.7", features = ["ws"] }
axum-core = "0.4"
base64 = "0.21.0"
eyre = "0.6.12"
futures-util = "0.3.28"
http = { version = "1.1" }
http-body-util = { version = "0.1" }
hyper = { version = "1.1", features = ["client", "http1", "server"] }
hyper-util = { version = "0.1", features = ["full"] }
serde = { version = "1.0.147", features = ["derive"] }
sha1 = "0.10"
tokio = { version = "1", features = [
"rt",
"rt-multi-thread",
"macros",
"net",
"io-std",
"fs",
] }
tokio-util = { version = "0.7", features = ["compat"] }
tower = { version = "0.4.12", features = ["make"] }
tower-service = { version = "0.3" }
tracing = "0.1.40"
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }
ws_stream_tungstenite = { version = "0.13", features = ["tokio_io"] }
tlsn-core = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.12", package = "tlsn-core" }
tlsn-verifier = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.12", package = "tlsn-verifier" }
tlsn-common = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.12", package = "tlsn-common" }
tower-util = "0.3.1"
# --- Transitive dependency pins (for TLSNotary alpha.12) ---
aes = "=0.9.0-rc.0"
cipher = "=0.5.0-rc.0"
crypto-common = "=0.2.0-rc.3"
inout = "=0.2.0-rc.5"

View File

@@ -0,0 +1,14 @@
# verifier-server
An implementation of the interactive verifier server in Rust.
## Running the server
1. Configure the server settings via the global variables defined in [main.rs](./src/main.rs); make sure the hardcoded `SERVER_DOMAIN` matches the domain the prover connects to (the domain of the prover's `SERVER_URL`).
2. Start the server by running the following in a terminal at the root of this crate.
```bash
cargo run --release
```
## WebSocket APIs
### /verify
Verification is performed over a websocket connection to this endpoint, e.g. `ws://localhost:9816/verify` when the verifier runs locally.
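
For reference, this is how the Rust prover in this demo opens that endpoint (adapted from `run_prover` in `../prover-rs/src/main.rs`, assuming the same alpha.12 dependencies). The browser prover does the same thing implicitly when it passes `ws://localhost:9816/verify` to `prover.setup(...)` in `tlsn-js`:

```rust
use async_tungstenite::{tokio::connect_async_with_config, tungstenite::protocol::WebSocketConfig};
use ws_stream_tungstenite::WsStream;

#[tokio::main]
async fn main() {
    // Standard websocket upgrade request against the verifier's /verify route.
    let request = http::Request::builder()
        .uri("ws://localhost:9816/verify")
        .header("Host", "localhost")
        .header("Sec-WebSocket-Key", uuid::Uuid::new_v4().to_string())
        .header("Sec-WebSocket-Version", "13")
        .header("Connection", "Upgrade")
        .header("Upgrade", "Websocket")
        .body(())
        .unwrap();
    let (ws, _response) = connect_async_with_config(request, Some(WebSocketConfig::default()))
        .await
        .unwrap();
    // WsStream exposes the upgraded connection as an async byte stream; prover-rs hands
    // this straight to `Prover::setup` to run the verification protocol with the verifier.
    let _verifier_socket = WsStream::new(ws);
}
```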

View File

@@ -0,0 +1,930 @@
//! The following code is adapted from https://github.com/tokio-rs/axum/blob/axum-v0.7.3/axum/src/extract/ws.rs
//! where we swapped out tokio_tungstenite (https://docs.rs/tokio-tungstenite/latest/tokio_tungstenite/)
//! with async_tungstenite (https://docs.rs/async-tungstenite/latest/async_tungstenite/) so that we can use
//! ws_stream_tungstenite (https://docs.rs/ws_stream_tungstenite/latest/ws_stream_tungstenite/index.html)
//! to get AsyncRead and AsyncWrite implemented for the WebSocket. Any other modification is commented with the prefix "NOTARY_MODIFICATION:"
//!
//! The code is under the following license:
//!
//! Copyright (c) 2019 Axum Contributors
//!
//! Permission is hereby granted, free of charge, to any
//! person obtaining a copy of this software and associated
//! documentation files (the "Software"), to deal in the
//! Software without restriction, including without
//! limitation the rights to use, copy, modify, merge,
//! publish, distribute, sublicense, and/or sell copies of
//! the Software, and to permit persons to whom the Software
//! is furnished to do so, subject to the following
//! conditions:
//!
//! The above copyright notice and this permission notice
//! shall be included in all copies or substantial portions
//! of the Software.
//!
//! THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
//! ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
//! TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
//! PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
//! SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
//! CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
//! OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
//! IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
//! DEALINGS IN THE SOFTWARE.
//!
//!
//! Handle WebSocket connections.
//!
//! # Example
//!
//! ```
//! use axum::{
//! extract::ws::{WebSocketUpgrade, WebSocket},
//! routing::get,
//! response::{IntoResponse, Response},
//! Router,
//! };
//!
//! let app = Router::new().route("/ws", get(handler));
//!
//! async fn handler(ws: WebSocketUpgrade) -> Response {
//! ws.on_upgrade(handle_socket)
//! }
//!
//! async fn handle_socket(mut socket: WebSocket) {
//! while let Some(msg) = socket.recv().await {
//! let msg = if let Ok(msg) = msg {
//! msg
//! } else {
//! // client disconnected
//! return;
//! };
//!
//! if socket.send(msg).await.is_err() {
//! // client disconnected
//! return;
//! }
//! }
//! }
//! # let _: Router = app;
//! ```
//!
//! # Passing data and/or state to an `on_upgrade` callback
//!
//! ```
//! use axum::{
//! extract::{ws::{WebSocketUpgrade, WebSocket}, State},
//! response::Response,
//! routing::get,
//! Router,
//! };
//!
//! #[derive(Clone)]
//! struct AppState {
//! // ...
//! }
//!
//! async fn handler(ws: WebSocketUpgrade, State(state): State<AppState>) -> Response {
//! ws.on_upgrade(|socket| handle_socket(socket, state))
//! }
//!
//! async fn handle_socket(socket: WebSocket, state: AppState) {
//! // ...
//! }
//!
//! let app = Router::new()
//! .route("/ws", get(handler))
//! .with_state(AppState { /* ... */ });
//! # let _: Router = app;
//! ```
//!
//! # Read and write concurrently
//!
//! If you need to read and write concurrently from a [`WebSocket`] you can use
//! [`StreamExt::split`]:
//!
//! ```rust,no_run
//! use axum::{Error, extract::ws::{WebSocket, Message}};
//! use futures_util::{sink::SinkExt, stream::{StreamExt, SplitSink, SplitStream}};
//!
//! async fn handle_socket(mut socket: WebSocket) {
//! let (mut sender, mut receiver) = socket.split();
//!
//! tokio::spawn(write(sender));
//! tokio::spawn(read(receiver));
//! }
//!
//! async fn read(receiver: SplitStream<WebSocket>) {
//! // ...
//! }
//!
//! async fn write(sender: SplitSink<WebSocket, Message>) {
//! // ...
//! }
//! ```
//!
//! [`StreamExt::split`]: https://docs.rs/futures/0.3.17/futures/stream/trait.StreamExt.html#method.split
#![allow(unused)]
use self::rejection::*;
use async_trait::async_trait;
use async_tungstenite::{
tokio::TokioAdapter,
tungstenite::{
self as ts,
protocol::{self, WebSocketConfig},
},
WebSocketStream,
};
use axum::{body::Bytes, extract::FromRequestParts, response::Response, Error};
use axum_core::body::Body;
use futures_util::{
sink::{Sink, SinkExt},
stream::{Stream, StreamExt},
};
use http::{
header::{self, HeaderMap, HeaderName, HeaderValue},
request::Parts,
Method, StatusCode,
};
use hyper_util::rt::TokioIo;
use sha1::{Digest, Sha1};
use std::{
borrow::Cow,
future::Future,
pin::Pin,
task::{Context, Poll},
};
use tracing::error;
/// Extractor for establishing WebSocket connections.
///
/// Note: This extractor requires the request method to be `GET` so it should
/// always be used with [`get`](crate::routing::get). Requests with other methods will be
/// rejected.
///
/// See the [module docs](self) for an example.
#[cfg_attr(docsrs, doc(cfg(feature = "ws")))]
pub struct WebSocketUpgrade<F = DefaultOnFailedUpgrade> {
config: WebSocketConfig,
/// The chosen protocol sent in the `Sec-WebSocket-Protocol` header of the response.
protocol: Option<HeaderValue>,
sec_websocket_key: HeaderValue,
on_upgrade: hyper::upgrade::OnUpgrade,
on_failed_upgrade: F,
sec_websocket_protocol: Option<HeaderValue>,
}
impl<F> std::fmt::Debug for WebSocketUpgrade<F> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.debug_struct("WebSocketUpgrade")
.field("config", &self.config)
.field("protocol", &self.protocol)
.field("sec_websocket_key", &self.sec_websocket_key)
.field("sec_websocket_protocol", &self.sec_websocket_protocol)
.finish_non_exhaustive()
}
}
impl<F> WebSocketUpgrade<F> {
/// The target minimum size of the write buffer to reach before writing the data
/// to the underlying stream.
///
/// The default value is 128 KiB.
///
/// If set to `0` each message will be eagerly written to the underlying stream.
/// It is often more optimal to allow them to buffer a little, hence the default value.
///
/// Note: [`flush`](SinkExt::flush) will always fully write the buffer regardless.
pub fn write_buffer_size(mut self, size: usize) -> Self {
self.config.write_buffer_size = size;
self
}
/// The max size of the write buffer in bytes. Setting this can provide backpressure
/// in the case the write buffer is filling up due to write errors.
///
/// The default value is unlimited.
///
/// Note: The write buffer only builds up past [`write_buffer_size`](Self::write_buffer_size)
/// when writes to the underlying stream are failing. So the **write buffer can not
/// fill up if you are not observing write errors even if not flushing**.
///
/// Note: Should always be at least [`write_buffer_size + 1 message`](Self::write_buffer_size)
/// and probably a little more depending on error handling strategy.
pub fn max_write_buffer_size(mut self, max: usize) -> Self {
self.config.max_write_buffer_size = max;
self
}
/// Set the maximum message size (defaults to 64 megabytes)
pub fn max_message_size(mut self, max: usize) -> Self {
self.config.max_message_size = Some(max);
self
}
/// Set the maximum frame size (defaults to 16 megabytes)
pub fn max_frame_size(mut self, max: usize) -> Self {
self.config.max_frame_size = Some(max);
self
}
/// Allow server to accept unmasked frames (defaults to false)
pub fn accept_unmasked_frames(mut self, accept: bool) -> Self {
self.config.accept_unmasked_frames = accept;
self
}
/// Set the known protocols.
///
/// If the protocol name specified by `Sec-WebSocket-Protocol` header
/// to match any of them, the upgrade response will include `Sec-WebSocket-Protocol` header and
/// return the protocol name.
///
/// The protocols should be listed in decreasing order of preference: if the client offers
/// multiple protocols that the server could support, the server will pick the first one in
/// this list.
///
/// # Examples
///
/// ```
/// use axum::{
/// extract::ws::{WebSocketUpgrade, WebSocket},
/// routing::get,
/// response::{IntoResponse, Response},
/// Router,
/// };
///
/// let app = Router::new().route("/ws", get(handler));
///
/// async fn handler(ws: WebSocketUpgrade) -> Response {
/// ws.protocols(["graphql-ws", "graphql-transport-ws"])
/// .on_upgrade(|socket| async {
/// // ...
/// })
/// }
/// # let _: Router = app;
/// ```
pub fn protocols<I>(mut self, protocols: I) -> Self
where
I: IntoIterator,
I::Item: Into<Cow<'static, str>>,
{
if let Some(req_protocols) = self
.sec_websocket_protocol
.as_ref()
.and_then(|p| p.to_str().ok())
{
self.protocol = protocols
.into_iter()
// FIXME: This will often allocate a new `String` and so is less efficient than it
// could be. But that can't be fixed without breaking changes to the public API.
.map(Into::into)
.find(|protocol| {
req_protocols
.split(',')
.any(|req_protocol| req_protocol.trim() == protocol)
})
.map(|protocol| match protocol {
Cow::Owned(s) => HeaderValue::from_str(&s).unwrap(),
Cow::Borrowed(s) => HeaderValue::from_static(s),
});
}
self
}
/// Provide a callback to call if upgrading the connection fails.
///
/// The connection upgrade is performed in a background task. If that fails this callback
/// will be called.
///
/// By default any errors will be silently ignored.
///
/// # Example
///
/// ```
/// use axum::{
/// extract::{WebSocketUpgrade},
/// response::Response,
/// };
///
/// async fn handler(ws: WebSocketUpgrade) -> Response {
/// ws.on_failed_upgrade(|error| {
/// report_error(error);
/// })
/// .on_upgrade(|socket| async { /* ... */ })
/// }
/// #
/// # fn report_error(_: axum::Error) {}
/// ```
pub fn on_failed_upgrade<C>(self, callback: C) -> WebSocketUpgrade<C>
where
C: OnFailedUpgrade,
{
WebSocketUpgrade {
config: self.config,
protocol: self.protocol,
sec_websocket_key: self.sec_websocket_key,
on_upgrade: self.on_upgrade,
on_failed_upgrade: callback,
sec_websocket_protocol: self.sec_websocket_protocol,
}
}
/// Finalize upgrading the connection and call the provided callback with
/// the stream.
#[must_use = "to set up the WebSocket connection, this response must be returned"]
pub fn on_upgrade<C, Fut>(self, callback: C) -> Response
where
C: FnOnce(WebSocket) -> Fut + Send + 'static,
Fut: Future<Output = ()> + Send + 'static,
F: OnFailedUpgrade,
{
let on_upgrade = self.on_upgrade;
let config = self.config;
let on_failed_upgrade = self.on_failed_upgrade;
let protocol = self.protocol.clone();
tokio::spawn(async move {
let upgraded = match on_upgrade.await {
Ok(upgraded) => upgraded,
Err(err) => {
error!("Something wrong with on_upgrade: {:?}", err);
on_failed_upgrade.call(Error::new(err));
return;
}
};
let upgraded = TokioIo::new(upgraded);
let socket = WebSocketStream::from_raw_socket(
// NOTARY_MODIFICATION: Need to use TokioAdapter to wrap Upgraded which doesn't implement futures crate's AsyncRead and AsyncWrite
TokioAdapter::new(upgraded),
protocol::Role::Server,
Some(config),
)
.await;
let socket = WebSocket {
inner: socket,
protocol,
};
callback(socket).await;
});
#[allow(clippy::declare_interior_mutable_const)]
const UPGRADE: HeaderValue = HeaderValue::from_static("upgrade");
#[allow(clippy::declare_interior_mutable_const)]
const WEBSOCKET: HeaderValue = HeaderValue::from_static("websocket");
let mut builder = Response::builder()
.status(StatusCode::SWITCHING_PROTOCOLS)
.header(header::CONNECTION, UPGRADE)
.header(header::UPGRADE, WEBSOCKET)
.header(
header::SEC_WEBSOCKET_ACCEPT,
sign(self.sec_websocket_key.as_bytes()),
);
if let Some(protocol) = self.protocol {
builder = builder.header(header::SEC_WEBSOCKET_PROTOCOL, protocol);
}
builder.body(Body::empty()).unwrap()
}
}
/// What to do when a connection upgrade fails.
///
/// See [`WebSocketUpgrade::on_failed_upgrade`] for more details.
pub trait OnFailedUpgrade: Send + 'static {
/// Call the callback.
fn call(self, error: Error);
}
impl<F> OnFailedUpgrade for F
where
F: FnOnce(Error) + Send + 'static,
{
fn call(self, error: Error) {
self(error)
}
}
/// The default `OnFailedUpgrade` used by `WebSocketUpgrade`.
///
/// It simply ignores the error.
#[non_exhaustive]
#[derive(Debug)]
pub struct DefaultOnFailedUpgrade;
impl OnFailedUpgrade for DefaultOnFailedUpgrade {
#[inline]
fn call(self, _error: Error) {}
}
#[async_trait]
impl<S> FromRequestParts<S> for WebSocketUpgrade<DefaultOnFailedUpgrade>
where
S: Send + Sync,
{
type Rejection = WebSocketUpgradeRejection;
async fn from_request_parts(parts: &mut Parts, _state: &S) -> Result<Self, Self::Rejection> {
if parts.method != Method::GET {
return Err(MethodNotGet.into());
}
if !header_contains(&parts.headers, header::CONNECTION, "upgrade") {
return Err(InvalidConnectionHeader.into());
}
if !header_eq(&parts.headers, header::UPGRADE, "websocket") {
return Err(InvalidUpgradeHeader.into());
}
if !header_eq(&parts.headers, header::SEC_WEBSOCKET_VERSION, "13") {
return Err(InvalidWebSocketVersionHeader.into());
}
let sec_websocket_key = parts
.headers
.get(header::SEC_WEBSOCKET_KEY)
.ok_or(WebSocketKeyHeaderMissing)?
.clone();
let on_upgrade = parts
.extensions
.remove::<hyper::upgrade::OnUpgrade>()
.ok_or(ConnectionNotUpgradable)?;
let sec_websocket_protocol = parts.headers.get(header::SEC_WEBSOCKET_PROTOCOL).cloned();
Ok(Self {
config: Default::default(),
protocol: None,
sec_websocket_key,
on_upgrade,
sec_websocket_protocol,
on_failed_upgrade: DefaultOnFailedUpgrade,
})
}
}
/// NOTARY_MODIFICATION: Made this function public to be used in service.rs
pub fn header_eq(headers: &HeaderMap, key: HeaderName, value: &'static str) -> bool {
if let Some(header) = headers.get(&key) {
header.as_bytes().eq_ignore_ascii_case(value.as_bytes())
} else {
false
}
}
fn header_contains(headers: &HeaderMap, key: HeaderName, value: &'static str) -> bool {
let header = if let Some(header) = headers.get(&key) {
header
} else {
return false;
};
if let Ok(header) = std::str::from_utf8(header.as_bytes()) {
header.to_ascii_lowercase().contains(value)
} else {
false
}
}
/// A stream of WebSocket messages.
///
/// See [the module level documentation](self) for more details.
#[derive(Debug)]
pub struct WebSocket {
inner: WebSocketStream<TokioAdapter<TokioIo<hyper::upgrade::Upgraded>>>,
protocol: Option<HeaderValue>,
}
impl WebSocket {
/// NOTARY_MODIFICATION: Consume `self` and get the inner [`async_tungstenite::WebSocketStream`].
pub fn into_inner(self) -> WebSocketStream<TokioAdapter<TokioIo<hyper::upgrade::Upgraded>>> {
self.inner
}
/// Receive another message.
///
/// Returns `None` if the stream has closed.
pub async fn recv(&mut self) -> Option<Result<Message, Error>> {
self.next().await
}
/// Send a message.
pub async fn send(&mut self, msg: Message) -> Result<(), Error> {
self.inner
.send(msg.into_tungstenite())
.await
.map_err(Error::new)
}
/// Gracefully close this WebSocket.
pub async fn close(mut self) -> Result<(), Error> {
self.inner.close(None).await.map_err(Error::new)
}
/// Return the selected WebSocket subprotocol, if one has been chosen.
pub fn protocol(&self) -> Option<&HeaderValue> {
self.protocol.as_ref()
}
}
impl Stream for WebSocket {
type Item = Result<Message, Error>;
fn poll_next(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Option<Self::Item>> {
loop {
match futures_util::ready!(self.inner.poll_next_unpin(cx)) {
Some(Ok(msg)) => {
if let Some(msg) = Message::from_tungstenite(msg) {
return Poll::Ready(Some(Ok(msg)));
}
}
Some(Err(err)) => return Poll::Ready(Some(Err(Error::new(err)))),
None => return Poll::Ready(None),
}
}
}
}
impl Sink<Message> for WebSocket {
type Error = Error;
fn poll_ready(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Result<(), Self::Error>> {
Pin::new(&mut self.inner).poll_ready(cx).map_err(Error::new)
}
fn start_send(mut self: Pin<&mut Self>, item: Message) -> Result<(), Self::Error> {
Pin::new(&mut self.inner)
.start_send(item.into_tungstenite())
.map_err(Error::new)
}
fn poll_flush(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Result<(), Self::Error>> {
Pin::new(&mut self.inner).poll_flush(cx).map_err(Error::new)
}
fn poll_close(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Result<(), Self::Error>> {
Pin::new(&mut self.inner).poll_close(cx).map_err(Error::new)
}
}
/// Status code used to indicate why an endpoint is closing the WebSocket connection.
pub type CloseCode = u16;
/// A struct representing the close command.
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct CloseFrame<'t> {
/// The reason as a code.
pub code: CloseCode,
/// The reason as text string.
pub reason: Cow<'t, str>,
}
/// A WebSocket message.
//
// This code comes from https://github.com/snapview/tungstenite-rs/blob/master/src/protocol/message.rs and is under following license:
// Copyright (c) 2017 Alexey Galakhov
// Copyright (c) 2016 Jason Housley
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.
#[derive(Debug, Eq, PartialEq, Clone)]
pub enum Message {
/// A text WebSocket message
Text(String),
/// A binary WebSocket message
Binary(Vec<u8>),
/// A ping message with the specified payload
///
/// The payload here must have a length less than 125 bytes.
///
/// Ping messages will be automatically responded to by the server, so you do not have to worry
/// about dealing with them yourself.
Ping(Vec<u8>),
/// A pong message with the specified payload
///
/// The payload here must have a length less than 125 bytes.
///
/// Pong messages will be automatically sent to the client if a ping message is received, so
/// you do not have to worry about constructing them yourself unless you want to implement a
/// [unidirectional heartbeat](https://tools.ietf.org/html/rfc6455#section-5.5.3).
Pong(Vec<u8>),
/// A close message with the optional close frame.
Close(Option<CloseFrame<'static>>),
}
impl Message {
fn into_tungstenite(self) -> ts::Message {
match self {
Self::Text(text) => ts::Message::Text(text),
Self::Binary(binary) => ts::Message::Binary(binary),
Self::Ping(ping) => ts::Message::Ping(ping),
Self::Pong(pong) => ts::Message::Pong(pong),
Self::Close(Some(close)) => ts::Message::Close(Some(ts::protocol::CloseFrame {
code: ts::protocol::frame::coding::CloseCode::from(close.code),
reason: close.reason,
})),
Self::Close(None) => ts::Message::Close(None),
}
}
fn from_tungstenite(message: ts::Message) -> Option<Self> {
match message {
ts::Message::Text(text) => Some(Self::Text(text)),
ts::Message::Binary(binary) => Some(Self::Binary(binary)),
ts::Message::Ping(ping) => Some(Self::Ping(ping)),
ts::Message::Pong(pong) => Some(Self::Pong(pong)),
ts::Message::Close(Some(close)) => Some(Self::Close(Some(CloseFrame {
code: close.code.into(),
reason: close.reason,
}))),
ts::Message::Close(None) => Some(Self::Close(None)),
// we can ignore `Frame` frames as recommended by the tungstenite maintainers
// https://github.com/snapview/tungstenite-rs/issues/268
ts::Message::Frame(_) => None,
}
}
/// Consume the WebSocket message and return it as binary data.
pub fn into_data(self) -> Vec<u8> {
match self {
Self::Text(string) => string.into_bytes(),
Self::Binary(data) | Self::Ping(data) | Self::Pong(data) => data,
Self::Close(None) => Vec::new(),
Self::Close(Some(frame)) => frame.reason.into_owned().into_bytes(),
}
}
/// Attempt to consume the WebSocket message and convert it to a String.
pub fn into_text(self) -> Result<String, Error> {
match self {
Self::Text(string) => Ok(string),
Self::Binary(data) | Self::Ping(data) | Self::Pong(data) => Ok(String::from_utf8(data)
.map_err(|err| err.utf8_error())
.map_err(Error::new)?),
Self::Close(None) => Ok(String::new()),
Self::Close(Some(frame)) => Ok(frame.reason.into_owned()),
}
}
/// Attempt to get a &str from the WebSocket message;
/// this will try to convert binary data to UTF-8.
pub fn to_text(&self) -> Result<&str, Error> {
match *self {
Self::Text(ref string) => Ok(string),
Self::Binary(ref data) | Self::Ping(ref data) | Self::Pong(ref data) => {
Ok(std::str::from_utf8(data).map_err(Error::new)?)
}
Self::Close(None) => Ok(""),
Self::Close(Some(ref frame)) => Ok(&frame.reason),
}
}
}
impl From<String> for Message {
fn from(string: String) -> Self {
Message::Text(string)
}
}
impl<'s> From<&'s str> for Message {
fn from(string: &'s str) -> Self {
Message::Text(string.into())
}
}
impl<'b> From<&'b [u8]> for Message {
fn from(data: &'b [u8]) -> Self {
Message::Binary(data.into())
}
}
impl From<Vec<u8>> for Message {
fn from(data: Vec<u8>) -> Self {
Message::Binary(data)
}
}
impl From<Message> for Vec<u8> {
fn from(msg: Message) -> Self {
msg.into_data()
}
}
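/// Computes the `Sec-WebSocket-Accept` value for the handshake response:
/// the SHA-1 of the client's `Sec-WebSocket-Key` concatenated with the WebSocket
/// GUID, base64-encoded (see RFC 6455, section 4.2.2).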
fn sign(key: &[u8]) -> HeaderValue {
use base64::engine::Engine as _;
let mut sha1 = Sha1::default();
sha1.update(key);
sha1.update(&b"258EAFA5-E914-47DA-95CA-C5AB0DC85B11"[..]);
let b64 = Bytes::from(base64::engine::general_purpose::STANDARD.encode(sha1.finalize()));
HeaderValue::from_maybe_shared(b64).expect("base64 is a valid value")
}
pub mod rejection {
//! WebSocket specific rejections.
use axum_core::{
__composite_rejection as composite_rejection, __define_rejection as define_rejection,
};
define_rejection! {
#[status = METHOD_NOT_ALLOWED]
#[body = "Request method must be `GET`"]
/// Rejection type for [`WebSocketUpgrade`](super::WebSocketUpgrade).
pub struct MethodNotGet;
}
define_rejection! {
#[status = BAD_REQUEST]
#[body = "Connection header did not include 'upgrade'"]
/// Rejection type for [`WebSocketUpgrade`](super::WebSocketUpgrade).
pub struct InvalidConnectionHeader;
}
define_rejection! {
#[status = BAD_REQUEST]
#[body = "`Upgrade` header did not include 'websocket'"]
/// Rejection type for [`WebSocketUpgrade`](super::WebSocketUpgrade).
pub struct InvalidUpgradeHeader;
}
define_rejection! {
#[status = BAD_REQUEST]
#[body = "`Sec-WebSocket-Version` header did not include '13'"]
/// Rejection type for [`WebSocketUpgrade`](super::WebSocketUpgrade).
pub struct InvalidWebSocketVersionHeader;
}
define_rejection! {
#[status = BAD_REQUEST]
#[body = "`Sec-WebSocket-Key` header missing"]
/// Rejection type for [`WebSocketUpgrade`](super::WebSocketUpgrade).
pub struct WebSocketKeyHeaderMissing;
}
define_rejection! {
#[status = UPGRADE_REQUIRED]
#[body = "WebSocket request couldn't be upgraded since no upgrade state was present"]
/// Rejection type for [`WebSocketUpgrade`](super::WebSocketUpgrade).
///
/// This rejection is returned if the connection cannot be upgraded, for example if the
/// request is HTTP/1.0.
///
/// See [MDN] for more details about connection upgrades.
///
/// [MDN]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Upgrade
pub struct ConnectionNotUpgradable;
}
composite_rejection! {
/// Rejection used for [`WebSocketUpgrade`](super::WebSocketUpgrade).
///
/// Contains one variant for each way the [`WebSocketUpgrade`](super::WebSocketUpgrade)
/// extractor can fail.
pub enum WebSocketUpgradeRejection {
MethodNotGet,
InvalidConnectionHeader,
InvalidUpgradeHeader,
InvalidWebSocketVersionHeader,
WebSocketKeyHeaderMissing,
ConnectionNotUpgradable,
}
}
}
pub mod close_code {
//! Constants for [`CloseCode`]s.
//!
//! [`CloseCode`]: super::CloseCode
/// Indicates a normal closure, meaning that the purpose for which the connection was
/// established has been fulfilled.
pub const NORMAL: u16 = 1000;
/// Indicates that an endpoint is "going away", such as a server going down or a browser having
/// navigated away from a page.
pub const AWAY: u16 = 1001;
/// Indicates that an endpoint is terminating the connection due to a protocol error.
pub const PROTOCOL: u16 = 1002;
/// Indicates that an endpoint is terminating the connection because it has received a type of
/// data it cannot accept (e.g., an endpoint that understands only text data MAY send this if
/// it receives a binary message).
pub const UNSUPPORTED: u16 = 1003;
/// Indicates that no status code was included in a closing frame.
pub const STATUS: u16 = 1005;
/// Indicates an abnormal closure.
pub const ABNORMAL: u16 = 1006;
/// Indicates that an endpoint is terminating the connection because it has received data
/// within a message that was not consistent with the type of the message (e.g., non-UTF-8
/// RFC3629 data within a text message).
pub const INVALID: u16 = 1007;
/// Indicates that an endpoint is terminating the connection because it has received a message
/// that violates its policy. This is a generic status code that can be returned when there is
/// no other more suitable status code (e.g., `UNSUPPORTED` or `SIZE`) or if there is a need to
/// hide specific details about the policy.
pub const POLICY: u16 = 1008;
/// Indicates that an endpoint is terminating the connection because it has received a message
/// that is too big for it to process.
pub const SIZE: u16 = 1009;
/// Indicates that an endpoint (client) is terminating the connection because it has expected
/// the server to negotiate one or more extensions, but the server didn't return them in the
/// response message of the WebSocket handshake. The list of extensions that are needed should
/// be given as the reason for closing. Note that this status code is not used by the server,
/// because it can fail the WebSocket handshake instead.
pub const EXTENSION: u16 = 1010;
/// Indicates that a server is terminating the connection because it encountered an unexpected
/// condition that prevented it from fulfilling the request.
pub const ERROR: u16 = 1011;
/// Indicates that the server is restarting.
pub const RESTART: u16 = 1012;
/// Indicates that the server is overloaded and the client should either connect to a different
/// IP (when multiple targets exist), or reconnect to the same IP when a user has performed an
/// action.
pub const AGAIN: u16 = 1013;
}
#[cfg(test)]
mod tests {
use super::*;
use axum::{body::Body, routing::get, Router};
use http::{Request, Version};
// NOTARY_MODIFICATION: use tower_util instead of tower to make clippy happy
use tower_util::ServiceExt;
#[tokio::test]
async fn rejects_http_1_0_requests() {
let svc = get(|ws: Result<WebSocketUpgrade, WebSocketUpgradeRejection>| {
let rejection = ws.unwrap_err();
assert!(matches!(
rejection,
WebSocketUpgradeRejection::ConnectionNotUpgradable(_)
));
std::future::ready(())
});
let req = Request::builder()
.version(Version::HTTP_10)
.method(Method::GET)
.header("upgrade", "websocket")
.header("connection", "Upgrade")
.header("sec-websocket-key", "6D69KGBOr4Re+Nj6zx9aQA==")
.header("sec-websocket-version", "13")
.body(Body::empty())
.unwrap();
let res = svc.oneshot(req).await.unwrap();
assert_eq!(res.status(), StatusCode::OK);
}
#[allow(dead_code)]
fn default_on_failed_upgrade() {
async fn handler(ws: WebSocketUpgrade) -> Response {
ws.on_upgrade(|_| async {})
}
let _: Router = Router::new().route("/", get(handler));
}
#[allow(dead_code)]
fn on_failed_upgrade() {
async fn handler(ws: WebSocketUpgrade) -> Response {
ws.on_failed_upgrade(|_error: Error| println!("oops!"))
.on_upgrade(|_| async {})
}
let _: Router = Router::new().route("/", get(handler));
}
}

View File

@@ -0,0 +1,193 @@
use axum::{
extract::{Request, State},
response::IntoResponse,
routing::get,
Router,
};
use axum_websocket::{WebSocket, WebSocketUpgrade};
use eyre::eyre;
use hyper::{body::Incoming, server::conn::http1};
use hyper_util::rt::TokioIo;
use std::{
net::{IpAddr, SocketAddr},
sync::Arc,
};
use tlsn_common::config::ProtocolConfigValidator;
use tlsn_core::{VerifierOutput, VerifyConfig};
use tlsn_verifier::{Verifier, VerifierConfig};
use tokio::{
io::{AsyncRead, AsyncWrite},
net::TcpListener,
};
use tokio_util::compat::TokioAsyncReadCompatExt;
use tower_service::Service;
use tracing::{debug, error, info};
use ws_stream_tungstenite::WsStream;
mod axum_websocket;
// Maximum number of bytes that can be sent from prover to server
const MAX_SENT_DATA: usize = 1 << 12;
// Maximum number of bytes that can be received by prover from server
const MAX_RECV_DATA: usize = 1 << 14;
/// Global data that needs to be shared with the axum handlers
#[derive(Clone, Debug)]
struct VerifierGlobals {
pub server_domain: String,
}
pub async fn run_server(
verifier_host: &str,
verifier_port: u16,
server_domain: &str,
) -> Result<(), eyre::ErrReport> {
let verifier_address = SocketAddr::new(
IpAddr::V4(verifier_host.parse().map_err(|err| {
eyre!("Failed to parse verifier host address from server config: {err}")
})?),
verifier_port,
);
let listener = TcpListener::bind(verifier_address)
.await
.map_err(|err| eyre!("Failed to bind server address to tcp listener: {err}"))?;
info!("Listening for TCP traffic at {}", verifier_address);
let protocol = Arc::new(http1::Builder::new());
let router = Router::new()
.route("/verify", get(ws_handler))
.with_state(VerifierGlobals {
server_domain: server_domain.to_string(),
});
loop {
let stream = match listener.accept().await {
Ok((stream, _)) => stream,
Err(err) => {
error!("Failed to connect to prover: {err}");
continue;
}
};
debug!("Received a prover's TCP connection");
let tower_service = router.clone();
let protocol = protocol.clone();
tokio::spawn(async move {
info!("Accepted prover's TCP connection",);
// Reference: https://github.com/tokio-rs/axum/blob/5201798d4e4d4759c208ef83e30ce85820c07baa/examples/low-level-rustls/src/main.rs#L67-L80
let io = TokioIo::new(stream);
let hyper_service = hyper::service::service_fn(move |request: Request<Incoming>| {
tower_service.clone().call(request)
});
// Serve different requests using the same hyper protocol and axum router
let _ = protocol
.serve_connection(io, hyper_service)
// use with_upgrades to upgrade connection to websocket for websocket clients
// and to extract tcp connection for tcp clients
.with_upgrades()
.await;
});
}
}
async fn ws_handler(
ws: WebSocketUpgrade,
State(verifier_globals): State<VerifierGlobals>,
) -> impl IntoResponse {
info!("Received websocket request");
ws.on_upgrade(|socket| handle_socket(socket, verifier_globals))
}
async fn handle_socket(socket: WebSocket, verifier_globals: VerifierGlobals) {
debug!("Upgraded to websocket connection");
let stream = WsStream::new(socket.into_inner());
match verifier(stream, &verifier_globals.server_domain).await {
Ok((sent, received)) => {
info!("Successfully verified {}", &verifier_globals.server_domain);
info!("Verified sent data:\n{}", sent,);
println!("Verified received data:\n{received}",);
}
Err(err) => {
error!("Failed verification using websocket: {err}");
}
}
}
async fn verifier<T: AsyncWrite + AsyncRead + Send + Unpin + 'static>(
socket: T,
server_domain: &str,
) -> Result<(String, String), eyre::ErrReport> {
debug!("Starting verification...");
// Setup Verifier.
let config_validator = ProtocolConfigValidator::builder()
.max_sent_data(MAX_SENT_DATA)
.max_recv_data(MAX_RECV_DATA)
.build()
.unwrap();
let verifier_config = VerifierConfig::builder()
.protocol_config_validator(config_validator)
.build()
.unwrap();
let verifier = Verifier::new(verifier_config);
// Receive authenticated data.
debug!("Starting MPC-TLS verification...");
let verify_config = VerifyConfig::default();
let VerifierOutput {
server_name,
transcript,
..
} = verifier
.verify(socket.compat(), &verify_config)
.await
.unwrap();
let transcript = transcript.expect("prover should have revealed transcript data");
// Check sent data: check host.
debug!("Starting sent data verification...");
let sent = transcript.sent_unsafe().to_vec();
let sent_data = String::from_utf8(sent.clone()).expect("Verifier expected sent data");
sent_data
.find(server_domain)
.ok_or_else(|| eyre!("Verification failed: Expected host {}", server_domain))?;
// Check received data: check that the expected content ("123 Elm Street") is present.
debug!("Starting received data verification...");
let received = transcript.received_unsafe().to_vec();
let response = String::from_utf8(received.clone()).expect("Verifier expected received data");
debug!("Received data: {:?}", response);
response
.find("123 Elm Street")
.ok_or_else(|| eyre!("Verification failed: missing data in received data"))?;
// Check Session info: server name.
if let Some(server_name) = server_name {
if server_name.as_str() != server_domain {
return Err(eyre!("Verification failed: server name mismatches"));
}
} else {
// TODO: https://github.com/tlsnotary/tlsn-js/issues/110
// return Err(eyre!("Verification failed: server name is missing"));
}
let sent_string = bytes_to_redacted_string(&sent)?;
let received_string = bytes_to_redacted_string(&received)?;
Ok((sent_string, received_string))
}
/// Render redacted bytes as `🙈`.
fn bytes_to_redacted_string(bytes: &[u8]) -> Result<String, eyre::ErrReport> {
Ok(String::from_utf8(bytes.to_vec())
.map_err(|err| eyre!("Failed to parse bytes to redacted string: {err}"))?
.replace('\0', "🙈"))
}

View File

@@ -0,0 +1,22 @@
use interactive_networked_verifier::run_server;
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt, EnvFilter};
const TRACING_FILTER: &str = "INFO";
const VERIFIER_HOST: &str = "0.0.0.0";
const VERIFIER_PORT: u16 = 9816;
/// Make sure the following domain is the same in SERVER_URL on the prover side
const SERVER_DOMAIN: &str = "raw.githubusercontent.com";
#[tokio::main]
async fn main() -> Result<(), eyre::ErrReport> {
tracing_subscriber::registry()
.with(EnvFilter::try_from_default_env().unwrap_or_else(|_| TRACING_FILTER.into()))
.with(tracing_subscriber::fmt::layer())
.init();
run_server(VERIFIER_HOST, VERIFIER_PORT, SERVER_DOMAIN).await?;
Ok(())
}

View File

@@ -1 +1,8 @@
package-lock.json
# Playwright
node_modules/
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/

View File

@@ -0,0 +1,78 @@
# TLSNotary in React/TypeScript with `tlsn-js`
This demo shows how to use TLSNotary with a delegated verifier, also known as a **notary**.
In this demo, we request JSON data from a GitHub page, use `tlsn-js` to notarize the TLS request with TLSNotary, and display the attestation and revealed data.
> **Note:**
> This demo uses TLSNotary to notarize **public** data for simplicity. In real-world applications, TLSNotary is especially valuable for notarizing private and sensitive data.
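At its core, the demo uses `tlsn-js` roughly as sketched below. This is a condensed, illustrative version of what `src/app.tsx` does: the real app wraps `Prover` and `Presentation` in a web worker via Comlink, redacts a secret request header, and reveals only selected response fields by computing the commit ranges with `mapStringToRange`/`subtractRanges`, whereas this sketch simply commits to the whole transcript.
```ts
import init, { Prover, Presentation, NotaryServer, Transcript } from 'tlsn-js';

await init({ loggingLevel: 'Info' });

const serverUrl =
  'https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json';

// Connect to the notary and send the request through the websocket proxy.
const notary = NotaryServer.from('http://localhost:7047');
const prover = await new Prover({ serverDns: 'raw.githubusercontent.com', maxRecvData: 2048 });
await prover.setup(await notary.sessionUrl());
await prover.sendRequest('ws://localhost:55688', { url: serverUrl, method: 'GET', headers: {} });

// Commit to the byte ranges of the transcript to reveal (here: everything), then notarize.
const { sent, recv } = await prover.transcript();
const commit = {
  sent: [{ start: 0, end: sent.length }],
  recv: [{ start: 0, end: recv.length }],
};
const outputs = await prover.notarize(commit);

// Build a presentation from the attestation and verify it.
const presentation = await new Presentation({
  attestationHex: outputs.attestation,
  secretsHex: outputs.secrets,
  notaryUrl: outputs.notaryUrl,
  websocketProxyUrl: outputs.websocketProxyUrl,
  reveal: { ...commit, server_identity: false },
});
const verifierOutput = await presentation.verify();
const transcript = new Transcript({
  sent: verifierOutput.transcript?.sent || [],
  recv: verifierOutput.transcript?.recv || [],
});
console.log(verifierOutput.server_name, transcript.recv());
```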
---
## Setup
Before running the demo, you need to start a local notary server and a websocket proxy. If you prefer to use the hosted test servers from PSE, see the section below.
### Websocket Proxy
Browsers cannot make raw TCP connections, so a websocket proxy server is required.
1. **Install [wstcp](https://github.com/sile/wstcp):**
```sh
cargo install wstcp
```
2. **Run a websocket proxy for `https://raw.githubusercontent.com`:**
```sh
wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443
```
> Note: The `raw.githubusercontent.com:443` argument specifies the target server used in this demo.
### Run a Local Notary Server
You also need to run a local notary server for this demo.
- **Using Git and Rust Cargo:**
```sh
git clone https://github.com/tlsnotary/tlsn.git
cd tlsn
cargo run --release --bin notary-server
```
- **Using Docker (from the root of the tlsn-js repo):**
```sh
npm run notary
```
The notary server will now be running and waiting for connections.
---
### Use the PSE Web Proxy and Notary
If you want to use the hosted PSE notary and proxy:
1. Open `app.tsx` in your editor.
2. Replace the notary URL:
```ts
notaryUrl: 'https://notary.pse.dev/v0.1.0-alpha.11',
```
This uses the [PSE](https://pse.dev) notary server to notarize the request. You can use a different or [local notary](#run-a-local-notary-server); a local server is typically faster because of its higher bandwidth and lower network latency.
3. Replace the websocket proxy URL:
```ts
websocketProxyUrl: 'wss://notary.pse.dev/proxy?token=raw.githubusercontent.com',
```
This uses a proxy hosted by [PSE](https://pse.dev). You can use a different or local proxy if you prefer.
---
## Run the Demo
1. **Install dependencies:**
```sh
npm i
```
2. **Start the Webpack Dev Server:**
```sh
npm run dev
```
3. **Open the demo in your browser:**
Go to [http://localhost:8080](http://localhost:8080)
4. **Click one of the "Start Demo" buttons**
5. **Open Developer Tools** and monitor the console logs

View File

@@ -1,89 +0,0 @@
import React, { ReactElement, useCallback, useEffect, useState } from 'react';
import { createRoot } from 'react-dom/client';
import { prove, verify, set_logging_filter } from 'tlsn-js';
import { Proof } from 'tlsn-js/build/types';
import { Watch } from 'react-loader-spinner';
const container = document.getElementById('root');
const root = createRoot(container!);
root.render(<App />);
function App(): ReactElement {
const [processing, setProcessing] = useState(false);
const [result, setResult] = useState<{
time: number;
sent: string;
recv: string;
notaryUrl: string;
} | null>(null);
const [proof, setProof] = useState<Proof | null>(null);
const onClick = useCallback(async () => {
setProcessing(true);
await set_logging_filter('info,tlsn_extension_rs=debug');
const p = await prove('https://swapi.dev/api/people/1', {
method: 'GET',
maxTranscriptSize: 16384,
notaryUrl: 'http://localhost:7047',
websocketProxyUrl: 'ws://localhost:55688',
});
setProof(p);
}, [setProof, setProcessing]);
useEffect(() => {
(async () => {
if (proof) {
const r = await verify(proof);
setResult(r);
setProcessing(false);
}
})();
}, [proof, setResult]);
return (
<div>
<button onClick={!processing ? onClick : undefined} disabled={processing}>
Start demo
</button>
<div>
<b>Proof: </b>
{!processing && !proof ? (
<i>not started</i>
) : !proof ? (
<>
Proving data from swapi...
<Watch
visible={true}
height="40"
width="40"
radius="48"
color="#000000"
ariaLabel="watch-loading"
wrapperStyle={{}}
wrapperClass=""
/>
Open <i>Developer tools</i> to follow progress
</>
) : (
<>
<details>
<summary>View Proof</summary>
<pre>{JSON.stringify(proof, null, 2)}</pre>
</details>
</>
)}
</div>
<div>
<b>Verification: </b>
{!proof ? (
<i>not started</i>
) : !result ? (
<i>verifying</i>
) : (
<pre>{JSON.stringify(result, null, 2)}</pre>
)}
</div>
</div>
);
}

View File

@@ -4,13 +4,13 @@
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>React/Typescrip Example</title>
<title>TLSNotary React TypeScript Demo</title>
</head>
<body>
<script>
</script>
<div id="root"></div>
<script>
</script>
<div id="root"></div>
</body>
</html>

View File

@@ -4,25 +4,43 @@
"description": "",
"main": "webpack.js",
"scripts": {
"dev": "webpack-dev-server --config webpack.js"
"dev": "webpack-dev-server --config webpack.js",
"build": "webpack --config webpack.js",
"start": "webpack serve --config webpack.js",
"test": "npx playwright test"
},
"author": "",
"license": "ISC",
"dependencies": {
"comlink": "^4.4.1",
"css-loader": "^7.1.2",
"http-parser-js": "^0.5.9",
"postcss": "^8.4.49",
"postcss-loader": "^8.1.1",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react-loader-spinner": "^6.1.6",
"sass": "^1.83.1",
"sass-loader": "^16.0.4",
"style-loader": "^4.0.0",
"tailwindcss": "^3.4.17",
"tlsn-js": "../../"
},
"devDependencies": {
"@playwright/test": "^1.52.0",
"@types/node": "^22.15.18",
"@types/react": "^18.0.26",
"@types/react-dom": "^18.0.10",
"babel-loader": "^9.1.3",
"copy-webpack-plugin": "^11.0.0",
"crypto-browserify": "^3.12.0",
"html-webpack-plugin": "^5.5.0",
"postcss-preset-env": "^10.1.3",
"source-map-loader": "^5.0.0",
"stream-browserify": "^3.0.0",
"ts-loader": "^9.4.2",
"typescript": "^4.9.4",
"vm-browserify": "^1.1.2",
"webpack": "^5.75.0",
"webpack-cli": "^4.10.0",
"webpack-dev-server": "^4.11.1"

View File

@@ -0,0 +1,85 @@
import { defineConfig, devices } from '@playwright/test';
/**
* Read environment variables from file.
* https://github.com/motdotla/dotenv
*/
// import dotenv from 'dotenv';
// import path from 'path';
// dotenv.config({ path: path.resolve(__dirname, '.env') });
/**
* See https://playwright.dev/docs/test-configuration.
*/
export default defineConfig({
testDir: './tests',
/* Run tests in files in parallel */
fullyParallel: true,
/* Fail the build on CI if you accidentally left test.only in the source code. */
forbidOnly: !!process.env.CI,
/* Retry on CI only */
retries: process.env.CI ? 2 : 0,
/* Opt out of parallel tests on CI. */
workers: process.env.CI ? 1 : undefined,
/* Reporter to use. See https://playwright.dev/docs/test-reporters */
reporter: 'html',
/* Shared settings for all the projects below. See https://playwright.dev/docs/api/class-testoptions. */
use: {
/* Base URL to use in actions like `await page.goto('/')`. */
baseURL: 'http://localhost:8080',
/* Collect trace when retrying the failed test. See https://playwright.dev/docs/trace-viewer */
trace: 'on-first-retry',
},
/* Configure projects for major browsers */
projects: [
{
name: 'chromium',
use: { ...devices['Desktop Chrome'] },
},
// {
// name: 'firefox',
// use: { ...devices['Desktop Firefox'] },
// },
// {
// name: 'webkit',
// use: { ...devices['Desktop Safari'] },
// },
/* Test against mobile viewports. */
// {
// name: 'Mobile Chrome',
// use: { ...devices['Pixel 5'] },
// },
// {
// name: 'Mobile Safari',
// use: { ...devices['iPhone 12'] },
// },
/* Test against branded browsers. */
// {
// name: 'Microsoft Edge',
// use: { ...devices['Desktop Edge'], channel: 'msedge' },
// },
// {
// name: 'Google Chrome',
// use: { ...devices['Desktop Chrome'], channel: 'chrome' },
// },
],
/* Run your local dev server before starting the tests */
webServer: [
{
command: 'npm run start',
url: 'http://localhost:8080',
reuseExistingServer: !process.env.CI,
},
{
command: 'wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443',
reuseExistingServer: true,
}
]
});

demo/react-ts-webpack/pnpm-lock.yaml generated Normal file

File diff suppressed because it is too large

View File

@@ -0,0 +1,4 @@
const tailwindcss = require("tailwindcss");
module.exports = {
plugins: ["postcss-preset-env", tailwindcss],
};

View File

@@ -0,0 +1,3 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

View File

@@ -0,0 +1,348 @@
import React, { ReactElement, useCallback, useEffect, useState } from 'react';
import { createRoot } from 'react-dom/client';
import * as Comlink from 'comlink';
import { Watch } from 'react-loader-spinner';
import {
Prover as TProver,
Presentation as TPresentation,
Commit,
NotaryServer,
Transcript,
mapStringToRange,
subtractRanges,
} from 'tlsn-js';
import { PresentationJSON } from 'tlsn-js/build/types';
import './app.scss';
import { HTTPParser } from 'http-parser-js';
const { init, Prover, Presentation }: any = Comlink.wrap(
new Worker(new URL('./worker.ts', import.meta.url)),
);
const container = document.getElementById('root');
const root = createRoot(container!);
root.render(<App />);
const local = true; // Toggle between local and remote notary
const notaryUrl = local
? 'http://localhost:7047'
: 'https://notary.pse.dev/v0.1.0-alpha.12';
const websocketProxyUrl = local
? 'ws://localhost:55688'
: 'wss://notary.pse.dev/proxy?token=raw.githubusercontent.com';
const loggingLevel = 'Info'; // https://github.com/tlsnotary/tlsn/blob/main/crates/wasm/src/log.rs#L8
const serverUrl = 'https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json';
const serverDns = 'raw.githubusercontent.com';
function App(): ReactElement {
const [initialized, setInitialized] = useState(false);
const [processing, setProcessing] = useState(false);
const [result, setResult] = useState<any | null>(null);
const [presentationJSON, setPresentationJSON] =
useState<null | PresentationJSON>(null);
useEffect(() => {
(async () => {
await init({ loggingLevel: loggingLevel });
setInitialized(true);
})();
}, []);
const onClick = useCallback(async () => {
setProcessing(true);
const notary = NotaryServer.from(notaryUrl);
console.time('submit');
const prover = (await new Prover({
serverDns: serverDns,
maxRecvData: 2048,
})) as TProver;
await prover.setup(await notary.sessionUrl());
const resp = await prover.sendRequest(websocketProxyUrl, {
url: serverUrl,
method: 'GET',
headers: {
'Content-Type': 'application/json',
secret: 'test_secret',
},
body: {
hello: 'world',
one: 1,
},
});
console.timeEnd('submit');
console.log(resp);
console.time('transcript');
const transcript = await prover.transcript();
const { sent, recv } = transcript;
console.log(new Transcript({ sent, recv }));
console.timeEnd('transcript');
console.time('commit');
const {
info: recvInfo,
headers: recvHeaders,
body: recvBody,
} = parseHttpMessage(Buffer.from(recv), 'response');
const body = JSON.parse(recvBody[0].toString());
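// Selective disclosure: commit to everything that was sent except the `secret`
// request header, and reveal only the response status line, selected headers,
// and the `name` / `street` fields from the JSON body.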
const commit: Commit = {
sent: subtractRanges(
{ start: 0, end: sent.length },
mapStringToRange(
['secret: test_secret'],
Buffer.from(sent).toString('utf-8'),
),
),
recv: [
...mapStringToRange(
[
recvInfo,
`${recvHeaders[4]}: ${recvHeaders[5]}\r\n`,
`${recvHeaders[6]}: ${recvHeaders[7]}\r\n`,
`${recvHeaders[8]}: ${recvHeaders[9]}\r\n`,
`${recvHeaders[10]}: ${recvHeaders[11]}\r\n`,
`${recvHeaders[12]}: ${recvHeaders[13]}`,
`${recvHeaders[14]}: ${recvHeaders[15]}`,
`${recvHeaders[16]}: ${recvHeaders[17]}`,
`${recvHeaders[18]}: ${recvHeaders[19]}`,
`"name": "${body.information.name}"`,
`"street": "${body.information.address.street}"`,
],
Buffer.from(recv).toString('utf-8'),
),
],
};
const notarizationOutputs = await prover.notarize(commit);
console.timeEnd('commit');
console.time('proof');
const presentation = (await new Presentation({
attestationHex: notarizationOutputs.attestation,
secretsHex: notarizationOutputs.secrets,
notaryUrl: notarizationOutputs.notaryUrl,
websocketProxyUrl: notarizationOutputs.websocketProxyUrl,
reveal: { ...commit, server_identity: false },
})) as TPresentation;
console.log(await presentation.serialize());
setPresentationJSON(await presentation.json());
console.timeEnd('proof');
}, [setPresentationJSON, setProcessing]);
const onAltClick = useCallback(async () => {
setProcessing(true);
const proof = await (Prover.notarize as typeof TProver.notarize)({
notaryUrl: notaryUrl,
websocketProxyUrl: websocketProxyUrl,
maxRecvData: 2048,
url: serverUrl,
method: 'GET',
headers: {
'Content-Type': 'application/json',
},
body: {
hello: 'world',
one: 1,
},
commit: {
sent: [{ start: 0, end: 50 }],
recv: [{ start: 0, end: 50 }],
},
});
setPresentationJSON(proof);
}, [setPresentationJSON, setProcessing]);
useEffect(() => {
(async () => {
if (presentationJSON) {
const proof = (await new Presentation(
presentationJSON.data,
)) as TPresentation;
const notary = NotaryServer.from(notaryUrl);
const notaryKey = await notary.publicKey('hex');
const verifierOutput = await proof.verify();
const transcript = new Transcript({
sent: verifierOutput.transcript?.sent || [],
recv: verifierOutput.transcript?.recv || [],
});
const vk = await proof.verifyingKey();
setResult({
time: verifierOutput.connection_info.time,
verifyingKey: Buffer.from(vk.data).toString('hex'),
notaryKey: notaryKey,
serverName: verifierOutput.server_name,
sent: transcript.sent(),
recv: transcript.recv(),
});
setProcessing(false);
}
})();
}, [presentationJSON, setResult]);
return (
<div className="bg-slate-100 min-h-screen p-6 text-slate-800 flex flex-col items-center">
<h1 className="text-2xl font-bold mb-6 text-slate-700">
TLSNotary React TypeScript Demo{' '}
</h1>
<div className="mb-4 text-base font-light max-w-2xl">
<p>
This demo showcases how to use TLSNotary in a React/TypeScript app
with the tlsn-js library. We will fetch a JSON file hosted on GitHub
(raw.githubusercontent.com), notarize the TLS request using TLSNotary,
and verify the proof. The demo runs entirely in the browser.
</p>
<p>
<a
href="https://tlsnotary.org/docs/quick_start/tlsn-js/"
className="text-blue-500 hover:underline"
>
More info
</a>
</p>
<table className="table-auto w-full mt-4">
<thead>
<tr>
<th className="px-4 py-2 text-left">Demo Settings</th>
<th className="px-4 py-2 text-left">URL</th>
</tr>
</thead>
<tbody>
<tr>
<td className="border px-4 py-2">Server</td>
<td className="border px-4 py-2">{serverUrl}</td>
</tr>
<tr>
<td className="border px-4 py-2">Notary Server</td>
<td className="border px-4 py-2">{notaryUrl}</td>
</tr>
<tr>
<td className="border px-4 py-2">WebSocket Proxy</td>
<td className="border px-4 py-2">{websocketProxyUrl}</td>
</tr>
</tbody>
</table>
</div>
<div className="mb-4">
<p className="mb-2 text-base font-light">
There are two versions of the demo: one with a normal config and one
with a helper method.
</p>
<div className="flex justify-center gap-4">
<button
onClick={!processing ? onClick : undefined}
disabled={processing || !initialized}
className={`px-4 py-2 rounded-md text-white shadow-md font-semibold
${processing || !initialized ? 'bg-slate-400 cursor-not-allowed' : 'bg-slate-600 hover:bg-slate-700'}`}
>
Start Demo (Normal config)
</button>
<button
onClick={!processing ? onAltClick : undefined}
disabled={processing || !initialized}
className={`px-4 py-2 rounded-md text-white shadow-md font-semibold
${processing || !initialized ? 'bg-slate-400 cursor-not-allowed' : 'bg-slate-600 hover:bg-slate-700'}`}
>
Start Demo 2 (With helper method)
</button>
</div>
</div>
{processing && (
<div className="mt-6 flex justify-center items-center">
<Watch
visible={true}
height="40"
width="40"
radius="48"
color="#1E293B"
ariaLabel="watch-loading"
wrapperStyle={{}}
wrapperClass=""
/>
</div>
)}
<div className="flex flex-col gap-6 w-full max-w-4xl">
<div className="flex-1 bg-slate-50 border border-slate-200 rounded p-4">
<b className="text-slate-600">Proof: </b>
{!processing && !presentationJSON ? (
<i className="text-slate-500">not started</i>
) : !presentationJSON ? (
<div className="flex flex-col items-start space-y-2">
<span>Proving data from {serverDns}...</span>
<span className="text-slate-500">
Open <i>Developer tools</i> to follow progress
</span>
</div>
) : (
<details className="bg-slate-50 border border-slate-200 rounded p-2">
<summary className="cursor-pointer text-slate-600">
View Proof
</summary>
<pre data-testid="proof-data"
className="mt-2 p-2 bg-slate-100 rounded text-sm text-slate-800 overflow-auto"
style={{ whiteSpace: 'pre-wrap', wordBreak: 'break-all' }}
>
{JSON.stringify(presentationJSON, null, 2)}
</pre>
</details>
)}
</div>
<div className="flex-1 bg-slate-50 border border-slate-200 rounded p-4">
<b className="text-slate-600">Verification: </b>
{!presentationJSON ? (
<i className="text-slate-500">not started</i>
) : !result ? (
<i className="text-slate-500">verifying</i>
) : (
<pre data-testid="verify-data"
className="mt-2 p-2 bg-slate-100 rounded text-sm text-slate-800 overflow-auto"
style={{ whiteSpace: 'pre-wrap', wordBreak: 'break-all' }}
>
{JSON.stringify(result, null, 2)}
</pre>
)}
</div>
</div>
</div>
);
}
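// Parse a raw HTTP message with http-parser-js: returns the status line (`info`),
// a flat array of alternating header names and values, and the body chunks.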
function parseHttpMessage(buffer: Buffer, type: 'request' | 'response') {
const parser = new HTTPParser(
type === 'request' ? HTTPParser.REQUEST : HTTPParser.RESPONSE,
);
const body: Buffer[] = [];
let complete = false;
let headers: string[] = [];
parser.onBody = (t) => {
body.push(t);
};
parser.onHeadersComplete = (res) => {
headers = res.headers;
};
parser.onMessageComplete = () => {
complete = true;
};
parser.execute(buffer);
parser.finish();
if (!complete) throw new Error(`Could not parse ${type.toUpperCase()}`);
return {
info: buffer.toString('utf-8').split('\r\n')[0] + '\r\n',
headers,
body,
};
}

View File

@@ -0,0 +1,9 @@
import * as Comlink from 'comlink';
import init, { Prover, Attestation, Presentation } from 'tlsn-js';
Comlink.expose({
init,
Prover,
Presentation,
Attestation,
});

View File

@@ -0,0 +1,12 @@
/** @type {import('tailwindcss').Config} */
module.exports = {
content: ['./src/**/*.{js,jsx,ts,tsx}'],
theme: {
extend: {
colors: {
primary: '#243f5f',
},
},
},
plugins: [],
};

View File

@@ -0,0 +1,36 @@
import { test, expect } from '@playwright/test';
test('has title', async ({ page }) => {
await page.goto('/');
await expect(page).toHaveTitle(/TLSNotary React TypeScript Demo/)
});
test('run demo (normal)', async ({ page }) => {
test.setTimeout(60000);
await page.goto('/');
// Click the get started link.
await page.getByRole('button', { name: 'Start Demo (Normal config)' }).click();
await expect(page.getByTestId('proof-data')).toContainText('"data":', { timeout: 60000 });
let verify_data = await page.getByTestId('verify-data').innerText();
expect(verify_data).toContain('"serverName": "raw.githubusercontent.com"');
expect(verify_data).toContain('John Doe');
});
test('run demo (helper)', async ({ page }) => {
test.setTimeout(60000);
await page.goto('/');
// Click the get started link.
await page.getByRole('button', { name: 'Start Demo 2 (With helper method)' }).click();
await expect(page.getByTestId('proof-data')).toContainText('"data":', { timeout: 60000 });
// await page.screenshot({ path: 'screenshot.png', fullPage: true });
let verify_data = await page.getByTestId('verify-data').innerText();
expect(verify_data).toContain('"serverName": "raw.githubusercontent.com"');
expect(verify_data).toContain('"recv"');
});

View File

@@ -1,6 +1,6 @@
{
"compilerOptions": {
"target": "es5",
"target": "es2015",
"lib": ["dom", "dom.iterable", "esnext"],
"allowJs": false,
"skipLibCheck": true,
@@ -15,5 +15,7 @@
"noEmit": false,
"jsx": "react"
},
"include": ["app.tsx"]
"include": [
"src/app.tsx"
]
}

View File

@@ -27,7 +27,7 @@ var options = {
],
mode: 'development',
entry: {
app: path.join(__dirname, 'app.tsx'),
app: path.join(__dirname, 'src', 'app.tsx'),
},
output: {
filename: '[name].bundle.js',
@@ -51,6 +51,9 @@ var options = {
test: /\.(ts|tsx)$/,
exclude: /node_modules/,
use: [
{
loader: 'source-map-loader',
},
{
loader: require.resolve('ts-loader'),
},
@@ -68,6 +71,29 @@ var options = {
],
exclude: /node_modules/,
},
{
// look for .css or .scss files
test: /\.(css|scss)$/,
// in the `web` directory
use: [
{
loader: 'style-loader',
},
{
loader: 'css-loader',
options: { importLoaders: 1 },
},
{
loader: 'postcss-loader',
},
{
loader: 'sass-loader',
options: {
sourceMap: true,
},
},
],
},
],
},
resolve: {
@@ -75,6 +101,11 @@ var options = {
extensions: fileExtensions
.map((extension) => '.' + extension)
.concat(['.js', '.jsx', '.ts', '.tsx', '.css']),
fallback: {
crypto: require.resolve('crypto-browserify'),
stream: require.resolve('stream-browserify'),
vm: require.resolve('vm-browserify'),
},
},
plugins: [
new CopyWebpackPlugin({
@@ -91,6 +122,9 @@ var options = {
filename: 'index.html',
cache: false,
}),
new webpack.ProvidePlugin({
process: 'process/browser',
}),
new webpack.ProvidePlugin({
Buffer: ['buffer', 'Buffer'],
}),
@@ -100,10 +134,16 @@ var options = {
// - https://github.com/GoogleChromeLabs/wasm-bindgen-rayon#setting-up
// - https://web.dev/i18n/en/coop-coep/
devServer: {
port: 8080,
host: 'localhost',
hot: true,
headers: {
'Cross-Origin-Embedder-Policy': 'require-corp',
'Cross-Origin-Opener-Policy': 'same-origin',
},
client: {
overlay: false,
},
},
};

demo/web-to-web-p2p/.gitignore vendored Normal file
View File

@@ -0,0 +1,8 @@
package-lock.json
# Playwright
node_modules/
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/

View File

@@ -0,0 +1,24 @@
# Web-to-Web P2P Demo
This project demonstrates peer-to-peer (P2P) communication between two web clients using TLSNotary.
The web prover fetches data from <https://raw.githubusercontent.com> and proves it to the web verifier.
In this demo, the two web clients run in the same browser page (`./src/app.tsx`) and communicate via a simple websocket server (`./server/index.js`).
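Under the hood, the two sides follow roughly the flow sketched below. This is a condensed, illustrative version of `src/app.tsx`: in the demo both peers run in the same page, the classes are proxied through a web worker with Comlink, and the revealed ranges are computed from the parsed HTTP response rather than covering the whole transcript as this sketch does.
```ts
import init, { Prover, Verifier, Transcript } from 'tlsn-js';

await init({ loggingLevel: 'Info' });

// Both peers connect to the local websocket relay (./server/index.js), which
// forwards messages between the 'prover' and 'verifier' clients.
const verifier = await new Verifier({ maxRecvData: 2000 });
await verifier.connect('ws://localhost:3001?id=verifier');

const prover = await new Prover({ serverDns: 'raw.githubusercontent.com', maxRecvData: 2000 });
const proverSetup = prover.setup('ws://localhost:3001?id=prover');

// The verifier must be started before the prover sends its request.
const verified = verifier.verify();
await proverSetup;

await prover.sendRequest('wss://notary.pse.dev/proxy?token=raw.githubusercontent.com', {
  url: 'https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json',
  method: 'GET',
  headers: { 'Content-Type': 'application/json' },
});

// Reveal selected byte ranges of the transcript to the verifier (here: everything).
const { sent, recv } = await prover.transcript();
await prover.reveal({
  sent: [{ start: 0, end: sent.length }],
  recv: [{ start: 0, end: recv.length }],
  server_identity: false,
});

const result = await verified;
const t = new Transcript({
  sent: result.transcript?.sent || [],
  recv: result.transcript?.recv || [],
});
console.log(t.sent(), t.recv());
```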
## Run the demo
1. Run the demo:
```
npm i
npm run dev
```
2. Open <http://localhost:8080/>
3. Click the **Start Demo** button
The Prover panel shows the Prover's output and the Verifier panel shows the Verifier's output. The terminal running the websocket server shows the websocket traffic log.
You can also open the Browser developer tools (F12) to see more TLSNotary protocol logs.
## Project Structure
- `src/`: Contains the source code for the demo.
- `server/`: Contains the WebSocket server code.

View File

@@ -0,0 +1,16 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Web-to-Web P2P Demo</title>
</head>
<body>
<script>
</script>
<div id="root"></div>
</body>
</html>

View File

@@ -0,0 +1,50 @@
{
"name": "web-to-web-p2p",
"version": "1.0.0",
"description": "",
"main": "webpack.js",
"scripts": {
"dev:server": "node ./server/index.js",
"dev:ui": "webpack-dev-server --config webpack.js",
"dev": "concurrently npm:dev:ui npm:dev:server",
"build": "webpack --config webpack.js",
"start:ui": "webpack serve --config webpack.js",
"test": "npm run build && npx playwright test"
},
"author": "",
"license": "ISC",
"dependencies": {
"comlink": "^4.4.1",
"concurrently": "^9.1.0",
"express": "^4.21.1",
"qs": "^6.13.0",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react-loader-spinner": "^6.1.6",
"tailwindcss": "^3.4.14",
"tlsn-js": "../../",
"ws": "^8.18.0"
},
"devDependencies": {
"@types/react": "^18.0.26",
"@types/react-dom": "^18.0.10",
"babel-loader": "^9.1.3",
"copy-webpack-plugin": "^11.0.0",
"crypto-browserify": "^3.12.0",
"css-loader": "^6.7.3",
"html-webpack-plugin": "^5.5.0",
"postcss-loader": "^7.3.3",
"postcss-preset-env": "^9.1.1",
"sass": "^1.57.1",
"sass-loader": "^13.2.0",
"style-loader": "^3.3.1",
"source-map-loader": "^5.0.0",
"stream-browserify": "^3.0.0",
"ts-loader": "^9.4.2",
"typescript": "^4.9.4",
"vm-browserify": "^1.1.2",
"webpack": "^5.75.0",
"webpack-cli": "^4.10.0",
"webpack-dev-server": "^4.11.1"
}
}

View File

@@ -0,0 +1,90 @@
import { defineConfig, devices } from '@playwright/test';
/**
* Read environment variables from file.
* https://github.com/motdotla/dotenv
*/
// import dotenv from 'dotenv';
// import path from 'path';
// dotenv.config({ path: path.resolve(__dirname, '.env') });
/**
* See https://playwright.dev/docs/test-configuration.
*/
export default defineConfig({
testDir: './tests',
/* Run tests in files in parallel */
fullyParallel: true,
/* Fail the build on CI if you accidentally left test.only in the source code. */
forbidOnly: !!process.env.CI,
/* Retry on CI only */
retries: process.env.CI ? 2 : 0,
/* Opt out of parallel tests on CI. */
workers: process.env.CI ? 1 : undefined,
/* Reporter to use. See https://playwright.dev/docs/test-reporters */
reporter: 'html',
/* Shared settings for all the projects below. See https://playwright.dev/docs/api/class-testoptions. */
use: {
/* Base URL to use in actions like `await page.goto('/')`. */
baseURL: 'http://localhost:8080',
/* Collect trace when retrying the failed test. See https://playwright.dev/docs/trace-viewer */
trace: 'on-first-retry',
},
/* Configure projects for major browsers */
projects: [
{
name: 'chromium',
use: { ...devices['Desktop Chrome'] },
},
// {
// name: 'firefox',
// use: { ...devices['Desktop Firefox'] },
// },
// {
// name: 'webkit',
// use: { ...devices['Desktop Safari'] },
// },
/* Test against mobile viewports. */
// {
// name: 'Mobile Chrome',
// use: { ...devices['Pixel 5'] },
// },
// {
// name: 'Mobile Safari',
// use: { ...devices['iPhone 12'] },
// },
/* Test against branded browsers. */
// {
// name: 'Microsoft Edge',
// use: { ...devices['Desktop Edge'], channel: 'msedge' },
// },
// {
// name: 'Google Chrome',
// use: { ...devices['Desktop Chrome'], channel: 'chrome' },
// },
],
/* Run your local dev server before starting the tests */
webServer: [
{
command: 'npm run start:ui',
url: 'http://localhost:8080',
reuseExistingServer: !process.env.CI,
},
{
command: 'wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443',
reuseExistingServer: true,
},
{
command: 'node ./server/index.js',
port: 3001,
reuseExistingServer: !process.env.CI,
}
]
});

View File

@@ -0,0 +1,4 @@
const tailwindcss = require("tailwindcss");
module.exports = {
plugins: ["postcss-preset-env", tailwindcss],
};

View File

@@ -0,0 +1,27 @@
// This file runs a WebSocket server which enables a web prover and web verifier to connect to each other
const express = require('express');
const { createServer } = require('http');
const { WebSocketServer } = require('ws');
const qs = require('qs');
const app = express();
const port = process.env.PORT || 3001;
const server = createServer(app);
const wss = new WebSocketServer({ server });
const clients = new Map();
wss.on('connection', async (client, request) => {
const query = qs.parse((request.url || '').replace(/\/\?/g, ''));
const id = query.id;
clients.set(id, client);
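// Relay every message to the opposite peer (prover -> verifier, verifier -> prover).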
client.on('message', (data) => {
const target = id === 'prover' ? 'verifier' : 'prover';
console.log(target, data.length);
clients.get(target).send(data);
});
});
server.listen(port, () => {
console.log(`ws server listening on port ${port}`);
});

View File

@@ -0,0 +1,3 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

View File

@@ -0,0 +1,341 @@
import React, { ReactElement, useCallback, useEffect, useState } from 'react';
import { createRoot } from 'react-dom/client';
import { Watch } from 'react-loader-spinner';
import * as Comlink from 'comlink';
import {
Prover as TProver,
Verifier as TVerifier,
Commit,
Transcript,
subtractRanges,
mapStringToRange,
} from 'tlsn-js';
import './app.scss';
import WebSocketStream from './stream';
import { HTTPParser } from 'http-parser-js';
const { init, Prover, Verifier }: any = Comlink.wrap(
new Worker(new URL('./worker.ts', import.meta.url)),
);
const container = document.getElementById('root');
const root = createRoot(container!);
root.render(<App />);
let proverLogs: string[] = [];
let verifierLogs: string[] = [];
const p2pProxyUrl = 'ws://localhost:3001';
const serverDns = 'raw.githubusercontent.com';
const webSocketProxy = `wss://notary.pse.dev/proxy?token=${serverDns}`;
const requestUrl = `https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json`;
function App(): ReactElement {
const [ready, setReady] = useState(false);
const [proverMessages, setProverMessages] = useState<string[]>([]);
const [verifierMessages, setVerifierMessages] = useState<string[]>([]);
const [started, setStarted] = useState(false);
// Initialize TLSNotary
useEffect(() => {
(async () => {
await init({ loggingLevel: 'Info' });
setReady(true);
})();
}, []);
// Set up streams for prover and verifier
// This is just for demo purposes. In the future we want to pass in the stream to the
// prover instead of using the websocket url.
useEffect(() => {
(async () => {
(async () => {
const proverStream = new WebSocketStream(`${p2pProxyUrl}?id=prover`);
const reader = await proverStream.reader();
while (true) {
const { done, value } = await reader.read();
if (done) {
console.log('stream finished');
break;
}
console.log(`Received data from stream:`, await value.text());
}
})();
// Set up stream for verifier
(async () => {
const verifierStream = new WebSocketStream(
`${p2pProxyUrl}?id=verifier`,
);
const writer = await verifierStream.writer();
writer.write('Hello');
writer.write('World!');
writer.close();
})();
})();
}, []);
const addProverLog = useCallback((log: string) => {
proverLogs = proverLogs.concat(
`${new Date().toLocaleTimeString()} - ${log}`,
);
setProverMessages(proverLogs);
}, []);
const addVerifierLog = useCallback((log: string) => {
verifierLogs = verifierLogs.concat(
`${new Date().toLocaleTimeString()} - ${log}`,
);
setVerifierMessages(verifierLogs);
}, []);
const start = useCallback(async () => {
if (!ready) return;
setStarted(true);
addProverLog('Instantiate Prover class');
const prover: TProver = await new Prover({
serverDns: serverDns,
maxRecvData: 2000
});
addProverLog('Prover class instantiated');
addVerifierLog('Instantiate Verifier class');
const verifier: TVerifier = await new Verifier({
maxRecvData: 2000
});
addVerifierLog('Verifier class instantiated');
addVerifierLog('Connect verifier to p2p proxy');
// TODO tlsn-wasm: we want to pass in the stream here instead of the websocket url
// The stream is both readable and writable (duplex)
try {
await verifier.connect(`${p2pProxyUrl}?id=verifier`);
} catch (e: any) {
addVerifierLog('Error connecting verifier to p2p proxy');
addVerifierLog(e.message);
return;
}
addVerifierLog('Verifier connected to p2p proxy');
addProverLog('Set up prover and connect to p2p proxy');
// TODO: we also want to pass in the stream here
const proverSetup = prover.setup(`${p2pProxyUrl}?id=prover`);
addProverLog('Prover connected to p2p proxy');
// Wait for prover to finish setting up websocket
// TODO: Make the setup better and avoid this wait
await new Promise((r) => setTimeout(r, 2000));
addVerifierLog('Start verifier');
// This needs to be called before we send the request
// This starts the verifier and makes it wait for the prover to send the request
const verified = verifier.verify();
await proverSetup;
addProverLog('Finished prover setup');
addProverLog('Send request');
try {
await prover.sendRequest(webSocketProxy, {
url: requestUrl,
method: 'GET',
headers: {
'Content-Type': 'application/json',
},
body: {
hello: 'world',
one: 1,
},
});
} catch (e: any) {
addProverLog(`Error sending request to ${requestUrl}`);
addProverLog(e.message);
return;
}
addProverLog('Request sent');
const transcript = await prover.transcript();
addProverLog('Response received');
addProverLog('Transcript sent');
addProverLog(Buffer.from(transcript.sent).toString('utf-8'));
addProverLog('Transcript received');
addProverLog(Buffer.from(transcript.recv).toString('utf-8'));
addProverLog('Revealing data to verifier');
const { sent, recv } = transcript;
const {
info: recvInfo,
headers: recvHeaders,
body: recvBody,
} = parseHttpMessage(Buffer.from(recv), 'response');
const body = JSON.parse(recvBody[0].toString());
// Prover only reveals parts of the transcript to the verifier
const commit: Commit = {
sent: subtractRanges(
{ start: 0, end: sent.length },
mapStringToRange(
['secret: test_secret'],
Buffer.from(sent).toString('utf-8'),
),
),
recv: [
...mapStringToRange(
[
recvInfo,
`${recvHeaders[4]}: ${recvHeaders[5]}\r\n`,
`${recvHeaders[6]}: ${recvHeaders[7]}\r\n`,
`${recvHeaders[8]}: ${recvHeaders[9]}\r\n`,
`${recvHeaders[10]}: ${recvHeaders[11]}\r\n`,
`${recvHeaders[12]}: ${recvHeaders[13]}`,
`${recvHeaders[14]}: ${recvHeaders[15]}`,
`${recvHeaders[16]}: ${recvHeaders[17]}`,
`${recvHeaders[18]}: ${recvHeaders[19]}`,
`"name": "${body.information.name}"`,
`"street": "${body.information.address.street}"`,
],
Buffer.from(recv).toString('utf-8'),
),
],
};
await prover.reveal({ ...commit, server_identity: false });
addProverLog('Data revealed to verifier');
const result = await verified;
addVerifierLog('Verification completed');
const t = new Transcript({
sent: result.transcript?.sent || [],
recv: result.transcript?.recv || [],
});
addVerifierLog('Verified data:');
addVerifierLog(`transcript.sent: ${t.sent()}`);
addVerifierLog(`transcript.recv: ${t.recv()}`);
setStarted(false);
}, [ready]);
return (
<div className="w-screen h-screen flex flex-col bg-slate-100 overflow-hidden">
<div className="w-full p-4 bg-slate-800 text-white flex-shrink-0 shadow-md">
<h1 className="text-xl font-bold">Web-to-Web P2P Demo</h1>
<p className="text-sm mt-1">
This demo showcases peer-to-peer communication between a web prover
and a web verifier using TLSNotary. The prover fetches data from{' '}
<a
href="https://raw.githubusercontent.com/tlsnotary/tlsn/refs/tags/v0.1.0-alpha.12/crates/server-fixture/server/src/data/1kb.json"
target="_blank"
rel="noopener noreferrer"
className="underline text-blue-400 hover:text-blue-300"
>
our GitHub repository
</a>{' '}
and proves it to the verifier.
</p>
</div>
<div className="grid grid-rows-2 grid-cols-2 gap-4 p-4 flex-grow">
<div className="flex flex-col items-center border border-slate-300 bg-white rounded-lg shadow-md row-span-1 col-span-1 p-4 gap-2">
<div className="font-semibold text-slate-700 text-lg">Prover</div>
<div className="flex flex-col text-sm bg-slate-50 border border-slate-200 w-full flex-grow py-2 overflow-y-auto rounded">
{proverMessages.map((m, index) => (
<span
key={index}
data-testid="prover-data"
className="px-3 py-1 text-slate-600 break-all"
>
{m}
</span>
))}
</div>
</div>
<div className="flex flex-col items-center border border-slate-300 bg-white rounded-lg shadow-md row-span-1 col-span-1 p-4 gap-2">
<div className="font-semibold text-slate-700 text-lg">Verifier</div>
<div className="flex flex-col text-sm bg-slate-50 border border-slate-200 w-full flex-grow py-2 overflow-y-auto rounded">
{verifierMessages.map((m, index) => (
<span
key={index}
data-testid="verifier-data"
className="px-3 py-1 text-slate-600 break-all"
>
{m}
</span>
))}
</div>
</div>
<div className="flex flex-row justify-center items-center row-span-1 col-span-2">
<Button
className="bg-slate-800 text-white font-semibold px-6 py-3 rounded-lg shadow-md hover:bg-slate-700 disabled:opacity-50"
disabled={!ready || started}
onClick={start}
>
<div data-testid="start" className="flex items-center">
{ready && !started ? (
<>Start Demo</>
) : (
<Watch
visible={true}
height="40"
width="40"
radius="48"
color="#ffffff"
ariaLabel="watch-loading"
/>
)}
</div>
</Button>
</div>
</div>
</div>
);
}
if ((module as any).hot) {
(module as any).hot.accept();
}
function Button(props: any) {
const { className = '', ...p } = props;
return (
<button
className={`px-4 py-2 bg-slate-300 rounded transition-colors border border-b-slate-400 border-r-slate-400 border-t-white border-l-white hover:bg-slate-200 active:bg-slate-300 active:border-t-slate-400 active:border-l-slate-400 active:border-b-white active:border-r-white disabled:opacity-50 disabled:bg-slate-200 ${className}`}
{...p}
/>
);
}
function parseHttpMessage(buffer: Buffer, type: 'request' | 'response') {
const parser = new HTTPParser(
type === 'request' ? HTTPParser.REQUEST : HTTPParser.RESPONSE,
);
const body: Buffer[] = [];
let complete = false;
let headers: string[] = [];
parser.onBody = (t) => {
body.push(t);
};
parser.onHeadersComplete = (res) => {
headers = res.headers;
};
parser.onMessageComplete = () => {
complete = true;
};
parser.execute(buffer);
parser.finish();
if (!complete) throw new Error(`Could not parse ${type.toUpperCase()}`);
return {
info: buffer.toString('utf-8').split('\r\n')[0] + '\r\n',
headers,
body,
};
}

View File

@@ -0,0 +1,62 @@
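// Wraps a browser WebSocket in a WHATWG ReadableStream/WritableStream pair so it can
// be consumed as a duplex stream; both streams become available once the socket opens.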
export default class WebSocketStream {
client: WebSocket;
readable: Promise<ReadableStream>;
writable: Promise<WritableStream>;
constructor(url: string) {
const client = new WebSocket(url);
const deferredReadable = defer<ReadableStream>();
const deferredWritable = defer<WritableStream>();
this.client = client;
this.readable = deferredReadable.promise;
this.writable = deferredWritable.promise;
client.onopen = () => {
const readable = new ReadableStream({
start(controller) {
client.onmessage = async (event) => {
controller.enqueue(event.data);
};
},
cancel() {
client.close();
},
});
const writable = new WritableStream({
write(chunk) {
client.send(chunk);
},
close() {
client.close();
},
});
deferredReadable.resolve(readable);
deferredWritable.resolve(writable);
};
}
async reader() {
return this.readable.then((stream) => stream.getReader());
}
async writer() {
return this.writable.then((stream) => stream.getWriter());
}
}
function defer<value = any>(): {
promise: Promise<value>;
resolve: (val: value | Promise<value>) => void;
reject: (err: any) => void;
} {
let resolve: (val: value | Promise<value>) => void,
reject: (err: any) => void;
const promise: Promise<value> = new Promise((res, rej) => {
resolve = res;
reject = rej;
});
return { promise, resolve: resolve!, reject: reject! };
}

View File

@@ -0,0 +1,10 @@
import * as Comlink from 'comlink';
import init, { Prover, Attestation, Presentation, Verifier } from 'tlsn-js';
Comlink.expose({
init,
Prover,
Verifier,
Presentation,
Attestation,
});

View File

@@ -0,0 +1,12 @@
/** @type {import('tailwindcss').Config} */
module.exports = {
content: ['./src/**/*.{js,jsx,ts,tsx}'],
theme: {
extend: {
colors: {
primary: '#243f5f',
},
},
},
plugins: [],
};

View File

@@ -0,0 +1,24 @@
import { test, expect } from '@playwright/test';
test('has title', async ({ page }) => {
await page.goto('/');
await expect(page).toHaveTitle(/Web-to-Web P2P Demo/)
});
test('run web-to-web p2p demo', async ({ page }) => {
await page.goto('/');
await page.getByTestId('start').click();
await expect(page.getByTestId('start')).toContainText('Start Demo', { timeout: 60000 });
const proverMessages = await page.getByTestId('prover-data').allTextContents();
expect(proverMessages.some(text => text.includes('Transcript received'))).toBe(true);
// console.log('Verifier Messages:', proverMessages);
expect(proverMessages.some(text => text.includes('"name": "John Doe",'))).toBe(true);
expect(proverMessages.some(text => text.includes('"address": {'))).toBe(true);
const verifierMessages = await page.getByTestId('verifier-data').allTextContents();
expect(verifierMessages.some(text => text.includes('Verification completed'))).toBe(true);
expect(verifierMessages.some(text => text.includes('***"name": "John Doe"*************************"street": "123 Elm Street"***'))).toBe(true);
});

View File

@@ -0,0 +1,21 @@
{
"compilerOptions": {
"target": "es2015",
"lib": ["dom", "dom.iterable", "esnext"],
"allowJs": false,
"skipLibCheck": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true,
"strict": true,
"forceConsistentCasingInFileNames": true,
"noFallthroughCasesInSwitch": true,
"module": "esnext",
"moduleResolution": "node",
"resolveJsonModule": true,
"noEmit": false,
"jsx": "react"
},
"include": [
"src/app.tsx"
]
}

View File

@@ -0,0 +1,150 @@
var webpack = require('webpack'),
path = require('path'),
CopyWebpackPlugin = require('copy-webpack-plugin'),
HtmlWebpackPlugin = require('html-webpack-plugin');
const ASSET_PATH = process.env.ASSET_PATH || '/';
var alias = {};
var fileExtensions = [
'jpg',
'jpeg',
'png',
'gif',
'eot',
'otf',
'svg',
'ttf',
'woff',
'woff2',
];
var options = {
ignoreWarnings: [
/Circular dependency between chunks with runtime/,
/ResizeObserver loop completed with undelivered notifications/,
],
mode: 'development',
entry: {
app: path.join(__dirname, 'src', 'app.tsx'),
},
output: {
filename: '[name].bundle.js',
path: path.resolve(__dirname, 'build'),
clean: true,
publicPath: ASSET_PATH,
},
module: {
rules: [
{
test: new RegExp('.(' + fileExtensions.join('|') + ')$'),
type: 'asset/resource',
exclude: /node_modules/,
},
{
test: /\.html$/,
loader: 'html-loader',
exclude: /node_modules/,
},
{
test: /\.(ts|tsx)$/,
exclude: /node_modules/,
use: [
{
loader: 'source-map-loader',
},
{
loader: require.resolve('ts-loader'),
},
],
},
{
test: /\.(js|jsx)$/,
use: [
{
loader: 'source-map-loader',
},
{
loader: require.resolve('babel-loader'),
},
],
exclude: /node_modules/,
},
{
// look for .css or .scss files
test: /\.(css|scss)$/,
// in the `web` directory
use: [
{
loader: 'style-loader',
},
{
loader: 'css-loader',
options: { importLoaders: 1 },
},
{
loader: 'postcss-loader',
},
{
loader: 'sass-loader',
options: {
sourceMap: true,
},
},
],
},
],
},
resolve: {
alias: alias,
extensions: fileExtensions
.map((extension) => '.' + extension)
.concat(['.js', '.jsx', '.ts', '.tsx', '.css']),
fallback: {
crypto: require.resolve('crypto-browserify'),
stream: require.resolve('stream-browserify'),
vm: require.resolve('vm-browserify'),
},
},
plugins: [
new CopyWebpackPlugin({
patterns: [
{
from: 'node_modules/tlsn-js/build',
to: path.join(__dirname, 'build'),
force: true,
},
],
}),
new HtmlWebpackPlugin({
template: path.join(__dirname, 'index.ejs'),
filename: 'index.html',
cache: false,
}),
new webpack.ProvidePlugin({
process: 'process/browser',
}),
new webpack.ProvidePlugin({
Buffer: ['buffer', 'Buffer'],
}),
].filter(Boolean),
// Required by wasm-bindgen-rayon, in order to use SharedArrayBuffer on the Web
// Ref:
// - https://github.com/GoogleChromeLabs/wasm-bindgen-rayon#setting-up
// - https://web.dev/i18n/en/coop-coep/
devServer: {
port: 8080,
host: 'localhost',
hot: true,
headers: {
'Cross-Origin-Embedder-Policy': 'require-corp',
'Cross-Origin-Opener-Policy': 'same-origin',
},
client: {
overlay: false,
},
},
};
module.exports = options;

package-lock.json generated Normal file

File diff suppressed because it is too large


@@ -1,14 +1,13 @@
{
"name": "tlsn-js",
"version": "v0.1.0-alpha.5.3",
"version": "0.1.0-alpha.12.0",
"description": "",
"repository": "https://github.com/tlsnotary/tlsn-js",
"main": "build/index.js",
"types": "build/index.d.ts",
"main": "build/lib.js",
"types": "build/lib.d.ts",
"files": [
"build/",
"src/",
"wasm/prover/pkg/*",
"readme.md"
],
"scripts": {
@@ -17,51 +16,43 @@
"build:src": "webpack --config webpack.build.config.js",
"build:types": "tsc --project tsconfig.compile.json",
"build:lib": "NODE_ENV=production concurrently npm:build:src npm:build:types",
"build": "npm run build:wasm && npm run build:lib",
"update:wasm": "sh utils/check-wasm.sh -f",
"test:wasm": "cd wasm/prover; wasm-pack test --firefox --release --headless",
"build:wasm": "wasm-pack build --target web wasm/prover",
"build:tlsn-binaries": "sh utils/build-tlsn-binaries.sh",
"build:wasm": "sh tlsn-wasm/build.sh v0.1.0-alpha.12",
"build": "npm run build:lib",
"watch:dev": "webpack --config webpack.web.dev.config.js --watch",
"predev": "sh utils/check-wasm.sh",
"lint:wasm": "cd wasm/prover; cargo clippy --target wasm32-unknown-unknown",
"dev": "concurrently npm:watch:dev npm:serve:test",
"lint:eslint": "eslint . --fix",
"lint:tsc": "tsc --noEmit",
"lint": "concurrently npm:lint:tsc npm:lint:eslint",
"run:test": "TS_NODE_COMPILER_OPTIONS='{\"module\": \"commonjs\"}' mocha -r ts-node/register 'test/testRunner.ts'",
"test": "npm run build:tlsn-binaries && npm run build:test && npm run run:test"
},
"dependencies": {
"comlink": "^4.4.1"
"test": "playwright test",
"notary": "docker run --platform=linux/amd64 -p 7047:7047 --rm ghcr.io/tlsnotary/tlsn/notary-server:v0.1.0-alpha.12"
},
"devDependencies": {
"@types/expect": "^24.3.0",
"@types/mocha": "^10.0.6",
"@playwright/test": "^1.52.0",
"@types/node": "^22.15.18",
"@types/serve-handler": "^6.1.4",
"browserify": "^17.0.0",
"buffer": "^6.0.3",
"comlink": "4.4.1",
"concurrently": "^5.1.0",
"constants-browserify": "^1.0.0",
"copy-webpack-plugin": "^5.0.5",
"copy-webpack-plugin": "^11.0.0",
"crypto-browserify": "^3.12.0",
"eslint": "^8.57.0",
"eslint-config-prettier": "^9.0.0",
"eslint-plugin-prettier": "^5.0.0",
"file-loader": "^5.0.2",
"html-webpack-plugin": "~5.3.2",
"http-parser-js": "^0.5.9",
"https-browserify": "^1.0.0",
"image-webpack-loader": "^6.0.0",
"js-yaml": "^4.1.0",
"mocha": "^10.2.0",
"node-loader": "^0.6.0",
"prettier": "^3.0.2",
"process": "^0.11.10",
"puppeteer": "^21.10.0",
"serve": "14.2.1",
"serve-handler": "^6.1.5",
"stream-browserify": "^3.0.0",
"ts-loader": "^6.2.1",
"ts-mocha": "^10.0.0",
"ts-node": "^10.9.2",
"typescript": "^4.9.5",
"typescript-eslint": "^7.4.0",
@@ -74,5 +65,8 @@
"license": "ISC",
"engines": {
"node": ">= 16.20.2"
},
"dependencies": {
"tlsn-wasm": "0.1.0-alpha.12"
}
}


@@ -0,0 +1,26 @@
import { test, expect } from '@playwright/test';
test('full-integration', async ({ page }) => {
// log browser console messages
page.on('console', (msg) => {
console.log(`[BROWSER ${msg.type().toUpperCase()}] ${msg.text()}`);
});
await page.goto('/full-integration');
await expect(page.getByTestId('full-integration')).toHaveText(/\{.*\}/s, { timeout: 60000 });
const json = await page.getByTestId('full-integration').innerText();
const { sent, recv, server_name, version, meta } = JSON.parse(json);
expect(version).toBe('0.1.0-alpha.12');
expect(new URL(meta.notaryUrl!).protocol).toBe('http:');
expect(server_name).toBe('raw.githubusercontent.com');
expect(sent).toContain('host: raw.githubusercontent.com');
expect(sent).not.toContain('secret: test_secret');
expect(recv).toContain('"id": 1234567890');
expect(recv).toContain('"city": "Anytown"');
expect(recv).toContain('"postalCode": "12345"');
});


@@ -0,0 +1,21 @@
import { test, expect } from '@playwright/test';
test('simple verify', async ({ page }) => {
// log browser console messages
page.on('console', (msg) => {
console.log(`[BROWSER ${msg.type().toUpperCase()}] ${msg.text()}`);
});
await page.goto('/simple-verify');
await expect(page.getByTestId('simple-verify')).toHaveText(/\{.*\}/s);
const json = await page.getByTestId('simple-verify').innerText();
const { sent, recv } = JSON.parse(json);
expect(sent).toContain('host: raw.githubusercontent.com');
expect(recv).toContain('*******************');
expect(recv).toContain('"city": "Anytown"');
expect(recv).toContain('"id": 1234567890');
expect(recv).toContain('"postalCode": "12345"');
});

playwright.config.ts Normal file

@@ -0,0 +1,85 @@
import { defineConfig, devices } from '@playwright/test';
/**
* Read environment variables from file.
* https://github.com/motdotla/dotenv
*/
// import dotenv from 'dotenv';
// import path from 'path';
// dotenv.config({ path: path.resolve(__dirname, '.env') });
/**
* See https://playwright.dev/docs/test-configuration.
*/
export default defineConfig({
testDir: './playwright-test',
/* Run tests in files in parallel */
fullyParallel: true,
/* Fail the build on CI if you accidentally left test.only in the source code. */
forbidOnly: !!process.env.CI,
/* Retry on CI only */
retries: process.env.CI ? 2 : 0,
/* Opt out of parallel tests on CI. */
workers: process.env.CI ? 1 : undefined,
/* Reporter to use. See https://playwright.dev/docs/test-reporters */
reporter: 'html',
/* Shared settings for all the projects below. See https://playwright.dev/docs/api/class-testoptions. */
use: {
/* Base URL to use in actions like `await page.goto('/')`. */
baseURL: 'http://localhost:3001',
/* Collect trace when retrying the failed test. See https://playwright.dev/docs/trace-viewer */
trace: 'on-first-retry',
},
/* Configure projects for major browsers */
projects: [
{
name: 'chromium',
use: { ...devices['Desktop Chrome'] },
},
// {
// name: 'firefox',
// use: { ...devices['Desktop Firefox'] },
// },
// {
// name: 'webkit',
// use: { ...devices['Desktop Safari'] },
// },
/* Test against mobile viewports. */
// {
// name: 'Mobile Chrome',
// use: { ...devices['Pixel 5'] },
// },
// {
// name: 'Mobile Safari',
// use: { ...devices['iPhone 12'] },
// },
/* Test against branded browsers. */
// {
// name: 'Microsoft Edge',
// use: { ...devices['Desktop Edge'], channel: 'msedge' },
// },
// {
// name: 'Google Chrome',
// use: { ...devices['Desktop Chrome'], channel: 'chrome' },
// },
],
/* Run your local dev server before starting the tests */
webServer: [
{
command: 'npm run build:test && npm run serve:test',
url: 'http://localhost:3001',
reuseExistingServer: !process.env.CI,
},
{
command: 'wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443',
reuseExistingServer: true,
},
]
});


@@ -8,54 +8,52 @@ There is a simple react/typescript demo app in `./demo/react-ts-webpack`. The di
Since a web browser doesn't have the ability to make TCP connections, we need to use a websocket proxy server.
To run your own websocket proxy for `https://swapi.dev` **locally**:
To run your own websocket proxy for `https://raw.githubusercontent.com` **locally**:
1. Install [websocat](https://github.com/vi/websocat):
1. Install [wstcp](https://github.com/sile/wstcp):
| tool | command |
|--------|-------------------------------|
| cargo | `cargo install websocat` |
| brew | `brew install websocat` |
| source | https://github.com/vi/websocat|
| Tool | Command |
| ------ | ----------------------------- |
| cargo | `cargo install wstcp` |
| brew | `brew install wstcp` |
| source | https://github.com/sile/wstcp |
2. Run a websocket proxy for `https://swapi.dev`:
2. Run a websocket proxy for `https://raw.githubusercontent.com`:
```sh
websocat --binary -v ws-l:0.0.0.0:55688 tcp:swapi.dev:443
wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443
```
Note the `tcp:swapi.dev:443` argument on the last line, this is the server we will use in this quick start.
Note the `raw.githubusercontent.com:443` argument on the last line; this is the server we will use in this quick start.
### Run a Local Notary Server <a name="local-notary"></a>
For this demo, we also need to run a local notary server.
1. Clone the TLSNotary repository:
```shell
git clone https://github.com/tlsnotary/tlsn.git --branch "v0.1.0-alpha.5"
```
2. Edit the notary server config file (`notary-server/config/config.yaml`) to turn off TLS so that the browser extension can connect to the local notary server without requiring extra steps to accept self-signed certificates in the browser.
```yaml
tls:
enabled: false
```
3. Run the notary server:
```shell
cd notary-server
cargo run --release
```
* Use Docker:
```sh
npm run notary
```
* Or, compile and run the notary server natively:
```sh
# Clone the TLSNotary repository:
git clone https://github.com/tlsnotary/tlsn.git --branch "v0.1.0-alpha.12"
cd tlsn/crates/notary/server/
# Run the notary server
cargo run --release
```
The notary server will now be running in the background waiting for connections.
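If you want to check that the notary is reachable before starting a demo, you can query its `/info` endpoint — the same endpoint `tlsn-js` itself uses to fetch the notary's public key. A minimal sketch, assuming the default `127.0.0.1:7047` address used above:
```ts
// Quick reachability check against the notary's /info endpoint.
// The address is the default from the steps above; adjust it if you changed the port.
const res = await fetch('http://127.0.0.1:7047/info');
if (!res.ok) throw new Error(`Notary not ready: ${res.status}`);
const info = await res.json();
console.log('Notary public key:', info.publicKey);
```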
## `tlsn-js` in a React/Typescript app
### Run the demo
1. Clone the repository
1. Compile tlsn-js
```sh
git clone https://github.com/tlsnotary/tlsn-js
npm i
npm run build
```
2. Go to the demo folder
```sh
cd ./tlsn-js/demo/react-ts-webpack
cd demo/react-ts-webpack
```
3. Install dependencies
```sh

readme.md

@@ -1,94 +1,131 @@
![MIT licensed][mit-badge]
![MIT licensed][mit-badge]
![Apache licensed][apache-badge]
[mit-badge]: https://img.shields.io/badge/license-MIT-blue.svg
[mit-badge]: https://img.shields.io/badge/license-MIT-blue.svg
[apache-badge]: https://img.shields.io/github/license/saltstack/salt
# tlsn-js
NPM Modules for proving and verifying using TLSNotary in the browser.
The prover requires a [notary-server](https://github.com/tlsnotary/notary-server) and a websocket proxy
NPM modules for proving and verifying using TLSNotary in the browser.
> [!IMPORTANT]
> `tlsn-js` is developed for the usage of TLSNotary **in the Browser**. This module does not work in `nodejs`.
> `tlsn-js` is developed specifically for **browser environments** and does **not** work in Node.js.
> [!IMPORTANT]
> The primary goal of `tlsn-js` is to support the development of the [TLSNotary browser extension](https://github.com/tlsnotary/tlsn-extension/).
> **Please do not treat this as a public API (yet).**
## License
This repository is licensed under either of
This repository is licensed under either:
- [Apache License, Version 2.0](http://www.apache.org/licenses/LICENSE-2.0)
- [MIT license](http://opensource.org/licenses/MIT)
- [MIT License](http://opensource.org/licenses/MIT)
at your option.
...at your option.
## Example
```ts
import { prove, verify } from '../src';
## Examples
// To create a proof
const proof = await prove('https://swapi.dev/api/people/1', {
method: 'GET',
headers: {
Connection: 'close',
Accept: 'application/json',
'Accept-Encoding': 'identity',
},
body: '',
maxTranscriptSize: 20000,
notaryUrl: 'https://127.0.0.1:7047',
websocketProxyUrl: 'ws://127.0.0.1:55688',
});
`tlsn-js` can be used in several modes depending on your use case.
// To verify a proof
const result = await verify(proof);
console.log(result);
```
The `./demo` folder contains three demos:
## Running a local websocket proxy for `https://swapi.dev`
- `react-ts-webpack`: Create an attestation with a Notary and render the result.
- `interactive-demo`: Prove data interactively to a Verifier.
- `web-to-web-p2p`: Prove data between two browser peers.
1. Install [websocat](https://github.com/vi/websocat):
## Running a Local WebSocket Proxy
| tool | command |
|--------|-------------------------------|
| cargo | `cargo install websocat` |
| brew | `brew install websocat` |
| source | https://github.com/vi/websocat|
In the demos, we attest data from `https://raw.githubusercontent.com`. Since browsers do not support raw TCP connections, a WebSocket proxy is required:
2. Run a websocket proxy for `https://swapi.dev`:
```sh
websocat --binary -v ws-l:0.0.0.0:55688 tcp:swapi.dev:443
```
1. Install [wstcp](https://github.com/sile/wstcp):
| Tool | Command |
| ------ | ----------------------------- |
| cargo | `cargo install wstcp` |
| brew | `brew install wstcp` |
| source | https://github.com/sile/wstcp |
2. Run a WebSocket proxy for `https://raw.githubusercontent.com`:
```sh
wstcp --bind-addr 127.0.0.1:55688 raw.githubusercontent.com:443
```
## Install as NPM Package
```
```sh
npm install tlsn-js
```
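As a rough sketch of what the installed package exposes via `src/lib.ts`, the one-shot attestation flow looks like this. The target, notary, and proxy URLs are placeholders for the local setup described above, and note that the demos run this code inside a Web Worker via Comlink rather than on the main thread:
```ts
import init, { Prover, Presentation, Transcript } from 'tlsn-js';

(async () => {
  // Load the WASM module and start the thread pool (browser only).
  await init({ loggingLevel: 'Info' });

  // One-shot notarization: send a request through the websocket proxy and
  // have the notary attest the TLS transcript. All URLs are placeholders.
  const presentationJSON = await Prover.notarize({
    url: 'https://raw.githubusercontent.com/tlsnotary/tlsn/main/README.md',
    notaryUrl: 'http://127.0.0.1:7047',
    websocketProxyUrl: 'ws://127.0.0.1:55688',
    maxSentData: 2048,
    maxRecvData: 16384,
  });

  // Reconstruct the presentation from its serialized form and verify it.
  const presentation = new Presentation(presentationJSON.data);
  const { server_name, transcript } = await presentation.verify();
  const t = new Transcript({ sent: transcript.sent, recv: transcript.recv });
  console.log(server_name, t.text());
})();
```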
## Development
> [!IMPORTANT]
> **Note on Rust-to-WASM Compilation**: This project requires compiling Rust into WASM, which needs [`clang`](https://clang.llvm.org/) version 16.0.0 or newer. MacOS users, be aware that Xcode's default `clang` might be older. If you encounter the error `No available targets are compatible with triple "wasm32-unknown-unknown"`, it's likely due to an outdated `clang`. Updating `clang` to a newer version should resolve this issue.
This library wraps the `tlsn-wasm` module.
To work on both `tlsn-wasm` and `tlsn-js` locally, update `package.json`:
```json
"tlsn-wasm": "./tlsn-wasm/pkg"
```
# make sure you have rust installed
# https://www.rust-lang.org/tools/install
Then build `tlsn-wasm`:
```sh
npm run build:wasm
```
Next:
```sh
npm install
# this serves a page that will execute the example code at http://localhost:3001
npm run dev
npm run test
```
> To switch back to the npm-published version of `tlsn-wasm`, delete or reset `package-lock.json` to remove the local path reference.
## Build for NPM
```
```sh
npm install
npm run build
```
## Adding a new test
1. Create a new `new-test.spec.ts` file in the `test/` directory
2. Add your spec file to the entry object fin `webpack.web.dev.config.js`
3. Add a new `div` block to `test/test.ejs` like this: `<div>Testing "new-test":<div id="new-test"></div></div>`. The div id must be the same as the filename.
## Testing
Testing is slightly complex due to the need for browser-based workers.
- Tests live in the `test/` directory.
- The `tests/` directory contains a Playwright test runner that opens a Chromium browser and runs the actual test page.
Some tests require a running Notary. You can start one via Docker:
```sh
npm run notary
```
### Adding a New `tlsn-js` Test
1. Create a `new-test.spec.ts` file in the `test/` directory.
2. Add your spec file to the `entry` object in `webpack.web.dev.config.js`.
3. Create a corresponding `new-test.spec.ts` file in the `playwright-test/` directory (a minimal sketch is shown after this list).
4. Add an `expect()` call for it in `tests/test.spec.ts`.
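For reference, a minimal sketch of such a Playwright spec, following the pattern of the existing specs in `playwright-test/` (`new-test` is a placeholder id and must match the webpack entry name and the rendered `data-testid`):
```ts
import { test, expect } from '@playwright/test';

test('new test', async ({ page }) => {
  // Surface browser console output in the Playwright logs.
  page.on('console', (msg) => {
    console.log(`[BROWSER ${msg.type().toUpperCase()}] ${msg.text()}`);
  });
  await page.goto('/new-test');
  // The in-browser test writes its result into the div rendered by test.ejs.
  await expect(page.getByTestId('new-test')).toHaveText(/\{.*\}/s, { timeout: 60000 });
});
```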
### Testing the Demos
Playwright is also used to test the demos.
```sh
npm install
npm run test
```
- View tests in the browser:
```sh
npx playwright test --ui
```
- Debug tests:
```sh
npx playwright test --debug
```


@@ -1,104 +0,0 @@
import TLSN from './tlsn';
import { DEFAULT_LOGGING_FILTER } from './tlsn';
import { Proof } from './types';
let _tlsn: TLSN;
const current_logging_filter = DEFAULT_LOGGING_FILTER;
async function getTLSN(logging_filter?: string): Promise<TLSN> {
const logging_filter_changed =
logging_filter && logging_filter == current_logging_filter;
if (!logging_filter_changed && _tlsn) return _tlsn;
// @ts-ignore
if (logging_filter) _tlsn = await new TLSN(logging_filter);
else _tlsn = await new TLSN();
return _tlsn;
}
/**
* If you want to change the default logging filter, call this method before calling prove or verify
* For the filter syntax consult: https://docs.rs/tracing-subscriber/latest/tracing_subscriber/filter/struct.EnvFilter.html#example-syntax
* @param logging_filter
*/
export const set_logging_filter = async (logging_filter: string) => {
getTLSN(logging_filter);
};
export const prove = async (
url: string,
options: {
notaryUrl: string;
websocketProxyUrl: string;
method?: string;
headers?: { [key: string]: string };
body?: string;
maxSentData?: number;
maxRecvData?: number;
maxTranscriptSize?: number;
secretHeaders?: string[];
secretResps?: string[];
},
): Promise<Proof> => {
const {
method,
headers = {},
body = '',
maxSentData,
maxRecvData,
maxTranscriptSize = 16384,
notaryUrl,
websocketProxyUrl,
secretHeaders,
secretResps,
} = options;
const tlsn = await getTLSN();
headers['Host'] = new URL(url).host;
headers['Connection'] = 'close';
const proof = await tlsn.prove(url, {
method,
headers,
body,
maxSentData,
maxRecvData,
maxTranscriptSize,
notaryUrl,
websocketProxyUrl,
secretHeaders,
secretResps,
});
return {
...proof,
notaryUrl,
};
};
export const verify = async (
proof: Proof,
publicKeyOverride?: string,
): Promise<{
time: number;
sent: string;
recv: string;
notaryUrl: string;
}> => {
const publicKey =
publicKeyOverride || (await fetchPublicKeyFromNotary(proof.notaryUrl));
const tlsn = await getTLSN();
const result = await tlsn.verify(proof, publicKey);
return {
...result,
notaryUrl: proof.notaryUrl,
};
};
async function fetchPublicKeyFromNotary(notaryUrl: string) {
const res = await fetch(notaryUrl + '/info');
const json: any = await res.json();
if (!json.publicKey) throw new Error('invalid response');
return json.publicKey;
}

src/lib.ts Normal file

@@ -0,0 +1,525 @@
import initWasm, {
initialize,
LoggingLevel,
LoggingConfig,
Attestation as WasmAttestation,
Secrets as WasmSecrets,
type Commit,
type Reveal,
Verifier as WasmVerifier,
Prover as WasmProver,
type ProverConfig,
type Method,
NetworkSetting,
VerifierConfig,
VerifierOutput,
VerifyingKey,
Presentation as WasmPresentation,
build_presentation,
ConnectionInfo,
PartialTranscript,
} from 'tlsn-wasm';
import { arrayToHex, expect, headerToMap, hexToArray } from './utils';
import { PresentationJSON, } from './types';
import { Buffer } from 'buffer';
import { Transcript, subtractRanges, mapStringToRange } from './transcript';
let LOGGING_LEVEL: LoggingLevel = 'Info';
function debug(...args: unknown[]) {
if (['Debug', 'Trace'].includes(LOGGING_LEVEL)) {
console.log('tlsn-js DEBUG', ...args);
}
}
export default async function init(config?: {
loggingLevel?: LoggingLevel;
hardwareConcurrency?: number;
}) {
const {
loggingLevel = 'Info',
hardwareConcurrency = navigator.hardwareConcurrency,
} = config || {};
LOGGING_LEVEL = loggingLevel;
const res = await initWasm();
// 6422528 ~= 6.12 mb
debug('res.memory', res.memory);
debug('res.memory.buffer.length', res.memory.buffer.byteLength);
debug('initialize thread pool');
await initialize(
{
level: loggingLevel,
crate_filters: undefined,
span_events: undefined,
},
hardwareConcurrency,
);
debug('initialized thread pool');
}
export class Prover {
#prover: WasmProver;
#config: ProverConfig;
#verifierUrl?: string;
#websocketProxyUrl?: string;
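/**
 * One-shot helper: open a session with the notary, send the request through
 * the websocket proxy, notarize the transcript (committing to everything
 * unless a custom commit is given) and return a serialized presentation.
 */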
static async notarize(options: {
url: string;
notaryUrl: string;
websocketProxyUrl: string;
method?: Method;
headers?: {
[name: string]: string;
};
body?: unknown;
maxSentData?: number;
maxSentRecords?: number,
maxRecvData?: number;
maxRecvDataOnline?: number;
maxRecvRecordsOnline?: number,
network?: NetworkSetting
deferDecryptionFromStart?: boolean;
commit?: Commit;
serverIdentity?: boolean
clientAuth?: [number[][], number[]];
}): Promise<PresentationJSON> {
const {
url,
method = 'GET',
headers = {},
body,
maxSentData = 1024,
maxSentRecords,
maxRecvData = 1024,
maxRecvDataOnline,
maxRecvRecordsOnline,
network = 'Bandwidth',
deferDecryptionFromStart,
notaryUrl,
websocketProxyUrl,
commit: _commit,
serverIdentity = false,
clientAuth,
} = options;
const hostname = new URL(url).hostname;
const notary = NotaryServer.from(notaryUrl);
const prover = new WasmProver({
server_name: hostname,
max_sent_data: maxSentData,
max_sent_records: maxSentRecords,
max_recv_data: maxRecvData,
max_recv_data_online: maxRecvDataOnline,
max_recv_records_online: maxRecvRecordsOnline,
defer_decryption_from_start: deferDecryptionFromStart,
network: network,
client_auth: clientAuth,
});
await prover.setup(await notary.sessionUrl(maxSentData, maxRecvData));
const headerMap = Prover.getHeaderMap(url, body, headers);
await prover.send_request(websocketProxyUrl + `?token=${hostname}`, {
uri: url,
method,
headers: headerMap,
body,
});
const transcript = prover.transcript();
const commit = _commit || {
sent: [{ start: 0, end: transcript.sent.length }],
recv: [{ start: 0, end: transcript.recv.length }],
};
const { attestation, secrets } = await prover.notarize(commit);
const reveal: Reveal = { ...commit, server_identity: serverIdentity }
const presentation = build_presentation(attestation, secrets, reveal);
return {
version: '0.1.0-alpha.12',
data: arrayToHex(presentation.serialize()),
meta: {
notaryUrl: notary.normalizeUrl(),
websocketProxyUrl: websocketProxyUrl,
},
};
}
constructor(config: {
serverDns: string;
maxSentData?: number;
maxSentRecords?: number,
maxRecvData?: number;
maxRecvDataOnline?: number;
maxRecvRecordsOnline?: number,
deferDecryptionFromStart?: boolean;
network?: NetworkSetting
clientAuth?: [number[][], number[]] | undefined,
}) {
this.#config = {
server_name: config.serverDns,
max_sent_data: config.maxSentData || 1024,
max_sent_records: config.maxSentRecords,
max_recv_data: config.maxRecvData || 1024,
max_recv_data_online: config.maxRecvDataOnline,
max_recv_records_online: config.maxRecvRecordsOnline,
defer_decryption_from_start: config.deferDecryptionFromStart,
network: config.network || 'Bandwidth',
client_auth: config.clientAuth
};
this.#prover = new WasmProver(this.#config);
}
async free() {
return this.#prover.free();
}
async setup(verifierUrl: string): Promise<void> {
this.#verifierUrl = verifierUrl;
return this.#prover.setup(verifierUrl);
}
async transcript(): Promise<{ sent: number[]; recv: number[] }> {
const transcript = this.#prover.transcript();
return { sent: transcript.sent, recv: transcript.recv };
}
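/**
 * Build the header map for a request: always sets Host and Connection: close,
 * and derives Content-Length from the body when one is provided.
 */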
static getHeaderMap(
url: string,
body?: unknown,
headers?: { [key: string]: string },
) {
const hostname = new URL(url).hostname;
const h: { [name: string]: string } = {
Host: hostname,
Connection: 'close',
};
if (typeof body === 'string') {
h['Content-Length'] = body.length.toString();
} else if (typeof body === 'object') {
h['Content-Length'] = JSON.stringify(body).length.toString();
} else if (typeof body === 'number') {
h['Content-Length'] = body.toString().length.toString();
}
const headerMap = headerToMap({
...h,
...headers,
});
return headerMap;
}
async sendRequest(
wsProxyUrl: string,
request: {
url: string;
method?: Method;
headers?: { [key: string]: string };
body?: unknown;
},
): Promise<{
status: number;
headers: { [key: string]: string };
}> {
this.#websocketProxyUrl = wsProxyUrl;
const { url, method = 'GET', headers = {}, body } = request;
const headerMap = Prover.getHeaderMap(url, body, headers);
const resp = await this.#prover.send_request(wsProxyUrl, {
uri: url,
method,
headers: headerMap,
body,
});
debug('prover.sendRequest', resp);
return {
status: resp.status,
headers: resp.headers.reduce(
(acc: { [key: string]: string }, [name, arr]) => {
acc[name] = Buffer.from(arr).toString();
return acc;
},
{},
),
};
}
async notarize(commit?: Commit): Promise<{
attestation: string;
secrets: string;
notaryUrl?: string;
websocketProxyUrl?: string;
}> {
const transcript = await this.transcript();
const output = await this.#prover.notarize(
commit || {
sent: [{ start: 0, end: transcript.sent.length }],
recv: [{ start: 0, end: transcript.recv.length }],
},
);
return {
attestation: arrayToHex(output.attestation.serialize()),
secrets: arrayToHex(output.secrets.serialize()),
notaryUrl: this.#verifierUrl,
websocketProxyUrl: this.#websocketProxyUrl,
};
}
async reveal(reveal: Reveal) {
return this.#prover.reveal(reveal);
}
}
export class Verifier {
#config: VerifierConfig;
#verifier: WasmVerifier;
constructor(config: { maxSentData?: number; maxRecvData?: number; maxSentRecords?: number; maxRecvRecordsOnline?: number }) {
this.#config = {
max_recv_data: config.maxRecvData || 1024,
max_sent_data: config.maxSentData || 1024,
max_sent_records: config.maxSentRecords,
max_recv_records_online: config.maxRecvRecordsOnline,
};
this.#verifier = new WasmVerifier(this.#config);
}
async verify(): Promise<VerifierOutput> {
return this.#verifier.verify();
}
async connect(proverUrl: string): Promise<void> {
return this.#verifier.connect(proverUrl);
}
}
export class Presentation {
#presentation: WasmPresentation;
#notaryUrl?: string;
#websocketProxyUrl?: string;
constructor(
params:
| {
attestationHex: string;
secretsHex: string;
notaryUrl?: string;
websocketProxyUrl?: string;
reveal?: Reveal;
}
| string,
) {
if (typeof params === 'string') {
this.#presentation = WasmPresentation.deserialize(hexToArray(params));
} else {
const attestation = WasmAttestation.deserialize(
hexToArray(params.attestationHex),
);
const secrets = WasmSecrets.deserialize(hexToArray(params.secretsHex));
const transcript = secrets.transcript();
this.#presentation = build_presentation(
attestation,
secrets,
params.reveal || {
sent: [{ start: 0, end: transcript.sent.length }],
recv: [{ start: 0, end: transcript.recv.length }],
server_identity: false,
},
);
this.#websocketProxyUrl = params.websocketProxyUrl;
this.#notaryUrl = params.notaryUrl;
}
}
async free() {
return this.#presentation.free();
}
async serialize() {
return arrayToHex(this.#presentation.serialize());
}
async verifyingKey() {
return this.#presentation.verifying_key();
}
async json(): Promise<PresentationJSON> {
return {
version: '0.1.0-alpha.12',
data: await this.serialize(),
meta: {
notaryUrl: this.#notaryUrl
? NotaryServer.from(this.#notaryUrl).normalizeUrl()
: '',
websocketProxyUrl: this.#websocketProxyUrl,
},
};
}
async verify(): Promise<VerifierOutput> {
const {
server_name = '',
connection_info,
transcript = {
sent: [],
recv: [],
recv_authed: [],
sent_authed: [],
},
} = this.#presentation.verify();
return {
server_name: server_name,
connection_info,
transcript,
};
}
}
export class Attestation {
#attestation: WasmAttestation;
constructor(attestationHex: string) {
this.#attestation = WasmAttestation.deserialize(hexToArray(attestationHex));
}
async free() {
return this.#attestation.free();
}
async verifyingKey() {
return this.#attestation.verifying_key();
}
async serialize() {
return this.#attestation.serialize();
}
}
export class Secrets {
#secrets: WasmSecrets;
constructor(secretsHex: string) {
this.#secrets = WasmSecrets.deserialize(hexToArray(secretsHex));
}
async free() {
return this.#secrets.free();
}
async serialize() {
return this.#secrets.serialize();
}
async transcript() {
return this.#secrets.transcript();
}
}
export class NotaryServer {
#url: string;
static from(url: string) {
return new NotaryServer(url);
}
constructor(url: string) {
this.#url = url;
}
get url() {
return this.#url;
}
async publicKey(encoding: 'pem' | 'hex' = 'hex'): Promise<string> {
const res = await fetch(this.#url + '/info');
const { publicKey } = await res.json();
expect(
typeof publicKey === 'string' && !!publicKey.length,
'invalid public key',
);
if (encoding === 'pem') {
return publicKey!;
}
return Buffer.from(
publicKey!
.replace('-----BEGIN PUBLIC KEY-----', '')
.replace('-----END PUBLIC KEY-----', '')
.replace(/\n/g, ''),
'base64',
)
.slice(23)
.toString('hex');
}
normalizeUrl() {
const url = new URL(this.#url);
let protocol;
if (url.protocol === 'https:' || url.protocol === 'http:') {
protocol = url.protocol;
} else {
protocol = url.protocol === 'wss:' ? 'https:' : 'http:';
}
return `${protocol}//${url.host}`;
}
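/**
 * Request a notarization session from the notary and return the websocket
 * URL the prover should connect to for that session.
 */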
async sessionUrl(
maxSentData?: number,
maxRecvData?: number,
): Promise<string> {
const resp = await fetch(`${this.#url}/session`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
clientType: 'Websocket',
maxRecvData,
maxSentData,
}),
});
const { sessionId } = await resp.json();
expect(
typeof sessionId === 'string' && !!sessionId.length,
'invalid session id',
);
const url = new URL(this.#url);
const protocol = url.protocol === 'https:' ? 'wss' : 'ws';
const pathname = url.pathname;
return `${protocol}://${url.host}${pathname === '/' ? '' : pathname}/notarize?sessionId=${sessionId!}`;
}
}
export {
type LoggingLevel,
type LoggingConfig,
type Commit,
type Method,
type Reveal,
type ProverConfig,
type VerifierConfig,
type VerifyingKey,
type VerifierOutput,
type ConnectionInfo,
type PartialTranscript,
Transcript,
mapStringToRange,
subtractRanges,
};


@@ -1,96 +0,0 @@
import init, {
initThreadPool,
prover,
verify,
setup_tracing_web,
} from '../wasm/prover/pkg/tlsn_extension_rs';
export const DEFAULT_LOGGING_FILTER: string = 'info,tlsn_extension_rs=debug';
export default class TLSN {
private startPromise: Promise<void>;
private resolveStart!: () => void;
private logging_filter: string;
/**
* Initializes a new instance of the TLSN class.
*
* @param logging_filter - Optional logging filter string.
* Defaults to DEFAULT_LOGGING_FILTER
*/
constructor(logging_filter: string = DEFAULT_LOGGING_FILTER) {
this.logging_filter = logging_filter;
this.startPromise = new Promise((resolve) => {
this.resolveStart = resolve;
});
this.start();
}
async start() {
// console.log('start');
const numConcurrency = navigator.hardwareConcurrency;
// console.log('!@# navigator.hardwareConcurrency=', numConcurrency);
await init();
setup_tracing_web(this.logging_filter);
// const res = await init();
// console.log('!@# res.memory=', res.memory);
// 6422528 ~= 6.12 mb
// console.log('!@# res.memory.buffer.length=', res.memory.buffer.byteLength);
await initThreadPool(numConcurrency);
this.resolveStart();
}
async waitForStart() {
return this.startPromise;
}
async prove(
url: string,
options?: {
method?: string;
headers?: { [key: string]: string };
body?: string;
maxSentData?: number;
maxRecvData?: number;
maxTranscriptSize?: number;
notaryUrl?: string;
websocketProxyUrl?: string;
secretHeaders?: string[];
secretResps?: string[];
},
) {
await this.waitForStart();
// console.log('worker', url, {
// ...options,
// notaryUrl: options?.notaryUrl,
// websocketProxyUrl: options?.websocketProxyUrl,
// });
const resProver = await prover(
url,
{
...options,
notaryUrl: options?.notaryUrl,
websocketProxyUrl: options?.websocketProxyUrl,
},
options?.secretHeaders || [],
options?.secretResps || [],
);
const resJSON = JSON.parse(resProver);
// console.log('!@# resProver,resJSON=', { resProver, resJSON });
// console.log('!@# resAfter.memory=', resJSON.memory);
// 1105920000 ~= 1.03 gb
// console.log(
// '!@# resAfter.memory.buffer.length=',
// resJSON.memory?.buffer?.byteLength,
// );
return resJSON;
}
async verify(proof: any, pubkey: string) {
await this.waitForStart();
const raw = await verify(JSON.stringify(proof), pubkey);
return JSON.parse(raw);
}
}

src/transcript.ts Normal file

@@ -0,0 +1,107 @@
import { Buffer } from 'buffer';
export class Transcript {
#sent: number[];
#recv: number[];
constructor(params: { sent: number[]; recv: number[] }) {
this.#recv = params.recv;
this.#sent = params.sent;
}
get raw() {
return {
recv: this.#recv,
sent: this.#sent,
};
}
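// Redacted bytes are zero in the raw transcript; replace them with the
// redaction symbol before decoding to UTF-8.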
recv(redactedSymbol = '*') {
return bytesToUtf8(substituteRedactions(this.#recv, redactedSymbol));
}
sent(redactedSymbol = '*') {
return bytesToUtf8(substituteRedactions(this.#sent, redactedSymbol));
}
text = (redactedSymbol = '*') => {
return {
sent: this.sent(redactedSymbol),
recv: this.recv(redactedSymbol),
};
};
}
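/**
 * Subtract the `negatives` ranges from a single `ranges` span and return the
 * remaining, non-overlapping sub-ranges (used to exclude secrets from a commit).
 */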
export function subtractRanges(
ranges: { start: number; end: number },
negatives: { start: number; end: number }[],
): { start: number; end: number }[] {
const returnVal: { start: number; end: number }[] = [ranges];
negatives
.sort((a, b) => (a.start < b.start ? -1 : 1))
.forEach(({ start, end }) => {
const last = returnVal.pop()!;
if (start < last.start || end > last.end) {
console.error('invalid ranges');
return;
}
if (start === last.start && end === last.end) {
return;
}
if (start === last.start && end < last.end) {
returnVal.push({ start: end, end: last.end });
return;
}
if (start > last.start && end < last.end) {
returnVal.push({ start: last.start, end: start });
returnVal.push({ start: end, end: last.end });
return;
}
if (start > last.start && end === last.end) {
returnVal.push({ start: last.start, end: start });
return;
}
});
return returnVal;
}
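/**
 * Map each secret string to its byte range within `text` (UTF-8 byte offsets);
 * secrets that do not occur in the text are dropped.
 */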
export function mapStringToRange(secrets: string[], text: string) {
return secrets
.map((secret: string) => {
const byteIdx = indexOfString(text, secret);
return byteIdx > -1
? {
start: byteIdx,
end: byteIdx + bytesSize(secret),
}
: null;
})
.filter((data: any) => !!data) as { start: number; end: number }[];
}
function indexOfString(str: string, substr: string): number {
return Buffer.from(str).indexOf(Buffer.from(substr));
}
function bytesSize(str: string): number {
return Buffer.from(str).byteLength;
}
function bytesToUtf8(array: number[]): string {
return Buffer.from(array).toString("utf8");
}
function substituteRedactions(
array: number[],
redactedSymbol: string = "*",
): number[] {
const replaceCharByte = redactedSymbol.charCodeAt(0);
return array.map((byte) => (byte === 0 ? replaceCharByte : byte));
}


@@ -1,98 +1,14 @@
export interface Proof {
session: Session;
substrings: Substrings;
notaryUrl: string;
}
export interface Session {
header: Header;
signature: Signature;
session_info: SessionInfo;
}
export interface SessionInfo {
server_name: ServerName;
handshake_decommitment: HandshakeDecommitment;
}
export interface HandshakeDecommitment {
nonce: number[];
data: Data;
}
export interface Data {
server_cert_details: ServerCERTDetails;
server_kx_details: ServerKxDetails;
client_random: number[];
server_random: number[];
}
export interface ServerCERTDetails {
cert_chain: Array<number[]>;
ocsp_response: number[];
scts: null;
}
export interface ServerKxDetails {
kx_params: number[];
kx_sig: KxSig;
}
export interface KxSig {
scheme: string;
sig: number[];
}
export interface Header {
encoder_seed: number[];
merkle_root: number[];
sent_len: number;
recv_len: number;
handshake_summary: HandshakeSummary;
}
export interface HandshakeSummary {
time: number;
server_public_key: ServerPublicKey;
handshake_commitment: number[];
}
export interface ServerPublicKey {
group: string;
key: number[];
}
export interface ServerName {
Dns: string;
}
export interface Signature {
P256: string;
}
export interface Substrings {
openings: { [key: string]: Opening[] };
inclusion_proof: InclusionProof;
}
export interface InclusionProof {
proof: any[];
total_leaves: number;
}
export interface Opening {
kind?: string;
ranges?: Range[];
direction?: string;
Blake3?: Blake3;
}
export interface Blake3 {
data: number[];
nonce: number[];
}
export interface Range {
export type CommitData = {
start: number;
end: number;
}
};
export type PresentationJSON = {
version: '0.1.0-alpha.7' | '0.1.0-alpha.8' | '0.1.0-alpha.9' | '0.1.0-alpha.10' | '0.1.0-alpha.11' | '0.1.0-alpha.12';
data: string;
meta: {
notaryUrl?: string;
websocketProxyUrl?: string;
pluginUrl?: string;
};
};

src/utils.ts Normal file

@@ -0,0 +1,27 @@
import { Buffer } from 'buffer';
export function expect(cond: any, msg = 'invalid expression') {
if (!cond) throw new Error(msg);
}
export function stringToBuffer(str: string): number[] {
return Buffer.from(str).toJSON().data;
}
export function arrayToHex(uintArr: Uint8Array): string {
return Buffer.from(uintArr).toString('hex');
}
export function hexToArray(hex: string): Uint8Array {
return new Uint8Array(Buffer.from(hex, 'hex'));
}
export function headerToMap(headers: {
[name: string]: string;
}): Map<string, number[]> {
const headerMap: Map<string, number[]> = new Map();
Object.entries(headers).forEach(([key, value]) => {
headerMap.set(key, stringToBuffer(value));
});
return headerMap;
}


@@ -1,6 +0,0 @@
import * as Comlink from 'comlink';
import TLSN from './tlsn';
export default TLSN;
Comlink.expose(TLSN);


@@ -1,4 +0,0 @@
-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEBv36FI4ZFszJa0DQFJ3wWCXvVLFr
cRzMG5kaTeHGoSzDu6cFqx3uEWYpFGo6C0EOUgf+mEgbktLrXocv5yHzKg==
-----END PUBLIC KEY-----


@@ -1,7 +0,0 @@
{
"serverName": "example.com",
"time": 1708595467,
"sent": "GET / HTTP/1.1\r\nhost: example.com\r\naccept: */*\r\naccept-encoding: identity\r\nconnection: close\r\nuser-agent: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX\r\n\r\n",
"recv": "HTTP/1.1 200 OK\r\nAge: 519895\r\nCache-Control: max-age=604800\r\nContent-Type: text/html; charset=UTF-8\r\nDate: Thu, 22 Feb 2024 09:51:08 GMT\r\nEtag: \"3147526947+ident\"\r\nExpires: Thu, 29 Feb 2024 09:51:08 GMT\r\nLast-Modified: Thu, 17 Oct 2019 07:18:26 GMT\r\nServer: ECS (dce/26A0)\r\nVary: Accept-Encoding\r\nX-Cache: HIT\r\nContent-Length: 1256\r\nConnection: close\r\n\r\n<!doctype html>\n<html>\n<head>\n <title>XXXXXXXXXXXXXX</title>\n\n <meta charset=\"utf-8\" />\n <meta http-equiv=\"Content-type\" content=\"text/html; charset=utf-8\" />\n <meta name=\"viewport\" content=\"width=device-width, initial-scale=1\" />\n <style type=\"text/css\">\n body {\n background-color: #f0f0f2;\n margin: 0;\n padding: 0;\n font-family: -apple-system, system-ui, BlinkMacSystemFont, \"Segoe UI\", \"Open Sans\", \"Helvetica Neue\", Helvetica, Arial, sans-serif;\n \n }\n div {\n width: 600px;\n margin: 5em auto;\n padding: 2em;\n background-color: #fdfdff;\n border-radius: 0.5em;\n box-shadow: 2px 3px 7px 2px rgba(0,0,0,0.02);\n }\n a:link, a:visited {\n color: #38488f;\n text-decoration: none;\n }\n @media (max-width: 700px) {\n div {\n margin: 0 auto;\n width: auto;\n }\n }\n </style> \n</head>\n\n<body>\n<div>\n <h1>XXXXXXXXXXXXXX</h1>\n <p>This domain is for use in illustrative examples in documents. You may use this\n domain in literature without prior coordination or asking for permission.</p>\n <p><a href=\"https://www.iana.org/domains/example\">More information...</a></p>\n</div>\n</body>\n</html>\n",
"notaryUrl": "http://localhost"
}

File diff suppressed because it is too large


@@ -0,0 +1,162 @@
import {
Prover as _Prover,
NotaryServer,
Presentation as _Presentation,
Commit,
mapStringToRange,
subtractRanges,
Transcript,
Reveal,
} from '../../src/lib';
import * as Comlink from 'comlink';
import { HTTPParser } from 'http-parser-js';
const { init, Prover, Presentation }: any = Comlink.wrap(
// @ts-ignore
new Worker(new URL('../worker.ts', import.meta.url)),
);
(async function () {
try {
await init({ loggingLevel: 'Debug' });
// @ts-ignore
console.log('test start');
console.time('prove');
const prover = (await new Prover({
serverDns: 'raw.githubusercontent.com',
maxRecvData: 1700,
network: "Bandwidth",
})) as _Prover;
const notary = NotaryServer.from('http://127.0.0.1:7047');
await prover.setup(await notary.sessionUrl());
// const websocketProxyUrl = 'wss://notary.pse.dev/proxy?token=raw.githubusercontent.com';
const websocketProxyUrl = 'ws://127.0.0.1:55688';
await prover.sendRequest(websocketProxyUrl, {
url: 'https://raw.githubusercontent.com/tlsnotary/tlsn/refs/heads/main/crates/server-fixture/server/src/data/protected_data.json',
headers: {
'content-type': 'application/json',
secret: 'test_secret',
},
});
const transcript = await prover.transcript();
const { sent, recv } = transcript;
const {
info: recvInfo,
headers: recvHeaders,
body: recvBody,
} = parseHttpMessage(Buffer.from(recv), 'response');
const body = JSON.parse(recvBody[0].toString());
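// Commit to the full request minus the `secret` header, and only to a
// selected subset of response headers and JSON fields on the received side.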
const commit: Commit = {
sent: subtractRanges(
{ start: 0, end: sent.length },
mapStringToRange(
['secret: test_secret'],
Buffer.from(sent).toString('utf-8'),
),
),
recv: [
...mapStringToRange(
[
recvInfo,
`${recvHeaders[4]}: ${recvHeaders[5]}\r\n`,
`${recvHeaders[6]}: ${recvHeaders[7]}\r\n`,
`${recvHeaders[8]}: ${recvHeaders[9]}\r\n`,
`${recvHeaders[10]}: ${recvHeaders[11]}\r\n`,
`${recvHeaders[12]}: ${recvHeaders[13]}`,
`${recvHeaders[14]}: ${recvHeaders[15]}`,
`${recvHeaders[16]}: ${recvHeaders[17]}`,
`${recvHeaders[18]}: ${recvHeaders[19]}`,
`"id": ${body.id}`,
`"city": "${body.information.address.city}"`,
`"postalCode": "12345"`,
],
Buffer.from(recv).toString('utf-8'),
),
],
};
console.log(commit);
const notarizationOutput = await prover.notarize(commit);
const reveal: Reveal = {
...commit,
server_identity: false,
};
const presentation = (await new Presentation({
attestationHex: notarizationOutput.attestation,
secretsHex: notarizationOutput.secrets,
reveal: reveal,
notaryUrl: notary.url,
websocketProxyUrl: 'wss://notary.pse.dev/proxy',
})) as _Presentation;
console.log('presentation:', await presentation.serialize());
console.timeEnd('prove');
const json = await presentation.json();
console.time('verify');
const { transcript: partialTranscript, server_name } =
await presentation.verify();
const verifyingKey = await presentation.verifyingKey();
console.timeEnd('verify');
console.log('verifyingKey', verifyingKey);
const t = new Transcript({
sent: partialTranscript.sent,
recv: partialTranscript.recv,
});
const sentStr = t.sent();
const recvStr = t.recv();
console.log("Sent:", sentStr);
console.log("Received:", recvStr);
// @ts-ignore
document.getElementById('full-integration').textContent = JSON.stringify({
sent: sentStr,
recv: recvStr,
version: json.version,
meta: json.meta,
server_name
}, null, 2);
} catch (err) {
console.log('caught error from wasm');
console.error(err);
// @ts-ignore
document.getElementById('full-integration').textContent = err.message;
}
})();
function parseHttpMessage(buffer: Buffer, type: 'request' | 'response') {
const parser = new HTTPParser(
type === 'request' ? HTTPParser.REQUEST : HTTPParser.RESPONSE,
);
const body: Buffer[] = [];
let complete = false;
let headers: string[] = [];
parser.onBody = (t) => {
body.push(t);
};
parser.onHeadersComplete = (res) => {
headers = res.headers;
};
parser.onMessageComplete = () => {
complete = true;
};
parser.execute(buffer);
parser.finish();
if (!complete) throw new Error(`Could not parse ${type.toUpperCase()}`);
return {
info: buffer.toString('utf-8').split('\r\n')[0] + '\r\n',
headers,
body,
};
}

File diff suppressed because one or more lines are too long


@@ -1,47 +0,0 @@
import { prove, verify } from '../../src';
import { assert } from '../utils';
(async function () {
try {
// @ts-ignore
console.log('test start');
console.time('prove');
const proof = await prove('https://swapi.dev/api/people/1', {
method: 'GET',
headers: { secret: 'test_secret' },
maxTranscriptSize: 16384,
notaryUrl: process.env.LOCAL_NOTARY
? 'http://localhost:7047'
: 'https://notary.pse.dev/v0.1.0-alpha.5',
websocketProxyUrl: process.env.LOCAL_WS
? 'ws://localhost:55688'
: 'wss://notary.pse.dev/proxy?token=swapi.dev',
secretHeaders: ['test_secret'],
secretResps: ['blond', 'fair'],
});
console.timeEnd('prove');
console.log('Proof: ', JSON.stringify(proof));
console.time('verify');
const result = await verify(proof);
console.timeEnd('verify');
console.log(result);
assert(result.sent.includes('host: swapi.dev'));
assert(result.sent.includes('secret: XXXXXXXXXXX'));
assert(result.recv.includes('Luke Skywalker'));
assert(result.recv.includes('"hair_color":"XXXXX"'));
assert(result.recv.includes('"skin_color":"XXXX"'));
// @ts-ignore
document.getElementById('full-integration-swapi').textContent = 'OK';
} catch (err) {
console.log('caught error from wasm');
console.error(err);
// @ts-ignore
document.getElementById('full-integration-swapi').textContent = err.message;
}
})();


@@ -1,34 +0,0 @@
import { verify } from '../../src';
import simple_proof_redacted from '../assets/simple_proof_redacted.json';
import { assert } from '../utils';
(async function verify_simple() {
try {
const pem = `-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEBv36FI4ZFszJa0DQFJ3wWCXvVLFr\ncRzMG5kaTeHGoSzDu6cFqx3uEWYpFGo6C0EOUgf+mEgbktLrXocv5yHzKg==\n-----END PUBLIC KEY-----`;
const proof = {
...simple_proof_redacted,
notaryUrl: 'http://127.0.0.1:7047',
};
console.log(proof);
console.time('verify');
const result = await verify(proof, pem);
console.timeEnd('verify');
assert(
result.sent.includes(
'user-agent: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX',
),
);
assert(result.recv.includes('<h1>XXXXXXXXXXXXXX</h1>'));
assert(result);
// @ts-ignore
document.getElementById('simple-verify').textContent = 'OK';
} catch (err) {
console.log('caught error from wasm');
console.error(err);
// @ts-ignore
document.getElementById('simple-verify').textContent = err.message;
}
})();


@@ -5,19 +5,21 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<link rel="manifest" href="/manifest.json">
<title>tlsn-js development</title>
<title>
Testing <%= htmlWebpackPlugin.options.testName || 'test' %>
</title>
</head>
<body>
<script>
global = globalThis //<- this should be enough
</script>
<div>Testing "full-integration-swapi":
<div id="full-integration-swapi"></div>
</div>
<div>Testing "simple-verify":
<div id="simple-verify"></div>
</div>
<h1>Testing "<%= htmlWebpackPlugin.options.testName || 'unknown' %>":</h1>
<pre>
<div id="<%= htmlWebpackPlugin.options.testName || 'test' %>"
data-testid="<%= htmlWebpackPlugin.options.testName || 'test' %>">
</div>
</pre>
</body>
</html>


@@ -1,167 +0,0 @@
import puppeteer, { Browser, Page, PuppeteerLaunchOptions } from 'puppeteer';
import { describe, it, before, after } from 'mocha';
const assert = require('assert');
import { exec, ChildProcess } from 'node:child_process';
import * as fs from 'fs';
import path from 'path';
const yaml = require('js-yaml');
const timeout = 300000;
// puppeteer options
let opts: PuppeteerLaunchOptions = {
headless: !!process.env.HEADLESS ? 'new' : false,
slowMo: 100,
timeout: timeout,
};
if (process.env.CHROME_PATH) {
opts = {
...opts,
executablePath: process.env.CHROME_PATH,
};
}
let browser: Browser;
let page: Page;
let server: ChildProcess;
let tlsnServerFixture: ChildProcess;
const spawnTlsnServerFixture = () => {
const tlsnServerFixturePath = './utils/tlsn/tlsn/tlsn-server-fixture/';
// Spawn the server process
// tlsnServerFixture = spawn(tlsnServerFixturePath, []);
tlsnServerFixture = exec(`../target/release/main`, {
cwd: tlsnServerFixturePath,
});
tlsnServerFixture.stdout?.on('data', (data) => {
console.log(`Server: ${data}`);
});
tlsnServerFixture.stderr?.on('data', (data) => {
console.error(`Server Error: ${data}`);
});
};
let localNotaryServer: ChildProcess;
const spawnLocalNotaryServer = async () => {
const localNotaryServerPath = './utils/tlsn/notary-server/';
localNotaryServer = exec(`target/release/notary-server`, {
cwd: localNotaryServerPath,
});
localNotaryServer.stdout?.on('data', (data) => {
console.log(`Server: ${data}`);
});
localNotaryServer.stderr?.on('data', (data) => {
console.error(`Server Error: ${data}`);
});
// wait for the notary server to be ready
while (true) {
try {
const response = await fetch('http://127.0.0.1:7047/info');
if (response.ok) {
return;
}
} catch (error) {
console.error('Waiting for local notary server...', error);
}
await new Promise((resolve) => setTimeout(resolve, 1000));
}
};
const configureNotarySerer = () => {
try {
const configPath = './utils/tlsn/notary-server/config/config.yaml';
const fileContents = fs.readFileSync(configPath, 'utf8');
const data = yaml.load(fileContents) as any;
data.tls.enabled = false;
data.server.host = '127.0.0.1';
const newYaml = yaml.dump(data);
fs.writeFileSync(configPath, newYaml, 'utf8');
console.log('YAML file has been updated.');
} catch (error) {
console.error('Error reading or updating the YAML file:', error);
}
};
// expose variables
before(async function () {
server = exec('serve --config ../serve.json ./test-build -l 3001');
spawnTlsnServerFixture();
configureNotarySerer();
await spawnLocalNotaryServer();
browser = await puppeteer.launch(opts);
page = await browser.newPage();
await page.goto('http://127.0.0.1:3001');
});
// close browser and reset global variables
after(async function () {
console.log('Cleaning up:');
try {
tlsnServerFixture.kill();
console.log('* Stopped TLSN Server Fixture ✅');
localNotaryServer.kill();
console.log('* Stopped Notary Server ✅');
server.kill();
console.log('* Stopped Test Web Server ✅');
await page.close();
await browser.close();
const childProcess = browser.process();
if (childProcess) {
childProcess.kill(9);
}
console.log('* Closed browser ✅');
process.exit(0);
} catch (e) {
console.error(e);
process.exit(0);
}
});
describe('tlsn-js test suite', function () {
fs.readdirSync(path.join(__dirname, 'specs')).forEach((file) => {
const [id] = file.split('.');
it(`Test ID: ${id}`, async function () {
const content = await check(id);
assert(content === 'OK');
});
});
// it('should prove and verify data from the local tlsn-server-fixture', async function () {
// const content = await check('full-integration-swapi');
// assert(content === 'OK');
// });
//
// it('should verify', async function () {
// const content = await check('simple-verify');
// assert(content === 'OK');
// });
});
async function check(testId: string): Promise<string> {
const startTime = Date.now();
const attemptFetchContent = async (): Promise<string> => {
const content = await page.$eval(
`#${testId}`,
(el: any) => el.textContent || '',
);
if (content) return content;
const elapsedTime = Date.now() - startTime;
if (elapsedTime >= timeout) {
throw new Error(
`Timeout: Failed to retrieve content for '#${testId}' within ${timeout} ms.`,
);
}
await new Promise((resolve) => setTimeout(resolve, 1000));
return attemptFetchContent();
};
return attemptFetchContent();
}


@@ -1,3 +0,0 @@
export function assert(expr: any, msg = 'unknown assertion error') {
if (!Boolean(expr)) throw new Error(msg);
}

test/worker.ts Normal file

@@ -0,0 +1,9 @@
import * as Comlink from 'comlink';
import init, { Prover, Presentation, Attestation } from '../src/lib';
Comlink.expose({
init,
Prover,
Presentation,
Attestation,
});


@@ -1 +1,2 @@
tlsn
pkg


@@ -1,9 +1,14 @@
#!/bin/bash
# Run tlsn Server fixture
set -e # Exit on error
# Set the directory to the location of the script
cd "$(dirname "$0")"
VERSION=${1:-origin/dev} # use `dev` branch if no version is set
rm -rf pkg
# Name of the directory where the repo will be cloned
REPO_DIR="tlsn"
@@ -20,13 +25,12 @@ else
fi
# Checkout the specific tag
git checkout "v0.1.0-alpha.5"
git checkout "${VERSION}" --force
git reset --hard
for dir in "tlsn/tlsn-server-fixture/" "notary-server"; do
# Change to the specific subdirectory
cd ${dir}
cd crates/wasm
cargo update
./build.sh
cd ../../
# Build the project
cargo build --release
cd -
done
cp -r crates/wasm/pkg ..


@@ -2,7 +2,7 @@
"compilerOptions": {
"baseUrl": ".",
"module": "esnext",
"target": "es5",
"target": "es2015",
"lib": ["dom", "dom.iterable", "esnext"],
"allowJs": true,
"skipLibCheck": true,


@@ -1,32 +0,0 @@
#!/bin/bash
DIRECTORY="./wasm/prover/target"
FORCE_FLAG=0
# Check if --force or -f flag is passed
for arg in "$@"
do
if [ "$arg" == "--force" ] || [ "$arg" == "-f" ]
then
FORCE_FLAG=1
break
fi
done
# If force flag is set, remove the directories/files and run the npm command
if [ $FORCE_FLAG -eq 1 ]
then
echo "Force flag detected, removing directories and files."
rm -rf ./wasm/prover/pkg
rm -rf ./wasm/prover/target
rm -f ./wasm/prover/Cargo.lock
echo "Running npm run build:wasm"
npm run build:wasm
# If the directory does not exist or is empty, run the npm command
elif [ ! -d "$DIRECTORY" ] || [ -z "$(ls -A $DIRECTORY)" ]
then
echo "Running npm run build:wasm"
npm run build:wasm
else
echo "$DIRECTORY exists and is not empty."
fi


@@ -1,8 +0,0 @@
[target.wasm32-unknown-unknown]
rustflags = [
"-C",
"target-feature=+atomics,+bulk-memory,+mutable-globals",
]
[unstable]
build-std = ["panic_abort", "std"]


@@ -1,93 +0,0 @@
[package]
authors = ["The tlsn-extension Developers"]
description = "tlsn-js library for using TLSNotary in browsers"
edition = "2018"
license = "MIT OR Apache-2.0"
name = "tlsn-extension-rs"
rust-version = "1.56"
version = "0.1.0"
[lib]
crate-type = ["cdylib", "rlib"]
[dependencies]
chrono = "0.4"
elliptic-curve = { version = "0.13.5", features = ["pkcs8"] }
futures = "0.3"
futures-util = "0.3.28"
getrandom = { version = "0.2", features = ["js"] }
js-sys = "0.3.64"
p256 = { version = "0.13", features = ["pem", "ecdsa"] }
rayon = "1.5"
serde = { version = "1.0.147", features = ["derive"] }
serde-wasm-bindgen = "0.6.1"
serde_json = "1.0"
tracing = "0.1"
url = { version = "2.0", features = ["serde"] }
wasm-bindgen = "0.2.87"
wasm-bindgen-futures = "0.4.37"
wasm-bindgen-rayon = "1.0"
pin-project-lite = "0.2.4"
http-body-util = "0.1"
hyper = { version = "1.1", features = ["client", "http1"] }
hyper-util = { version = "0.1", features = ["http1"] }
tracing-subscriber = { version = "0.3", features = ["time","env-filter"] }
tracing-web = "0.1.2"
ring = { version = "0.17", features = ["wasm32_unknown_unknown_js"] }
# time crate: https://crates.io/crates/time
# NOTE: It is required, otherwise "time not implemented on this platform" error happens right after "!@# 2".
# Probably due to tokio's time feature is used in tlsn-prover?
time = { version = "0.3.34", features = ["wasm-bindgen"] }
# Used to calculate elapsed time.
web-time = "1.0"
tlsn-core = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.5", package = "tlsn-core" }
tlsn-prover = { git = "https://github.com/tlsnotary/tlsn.git", tag = "v0.1.0-alpha.5", package = "tlsn-prover", features = [
"tracing",
] }
web-sys = { version = "0.3.4", features = [
"BinaryType",
"Blob",
"ErrorEvent",
"FileReader",
"MessageEvent",
"ProgressEvent",
"WebSocket",
"console",
'Document',
'HtmlElement',
'HtmlInputElement',
'Window',
'Worker',
'Headers',
'Request',
'RequestInit',
'RequestMode',
'Response',
] }
# Use the patched ws_stream_wasm to fix the issue https://github.com/najamelan/ws_stream_wasm/issues/12#issuecomment-1711902958
ws_stream_wasm = { version = "0.7.4", git = "https://github.com/tlsnotary/ws_stream_wasm", rev = "2ed12aad9f0236e5321f577672f309920b2aef51" }
# The `console_error_panic_hook` crate provides better debugging of panics by
# logging them with `console.error`. This is great for development, but requires
# all the `std::fmt` and `std::panicking` infrastructure, so isn't great for
# code size when deploying.
console_error_panic_hook = { version = "0.1.7" }
strum = { version = "0.26.1" }
strum_macros = "0.26.1"
[dev-dependencies]
wasm-bindgen-test = "0.3.34"
[profile.release]
lto = true # Enable Link Time Optimization
opt-level = "z" # Optimize for size
[package.metadata.wasm-pack.profile.release]
wasm-opt = true


@@ -1,88 +0,0 @@
use core::slice;
use std::pin::Pin;
use std::task::{Context, Poll};
use pin_project_lite::pin_project;
pin_project! {
#[derive(Debug)]
pub(crate) struct FuturesIo<T> {
#[pin]
inner: T,
}
}
impl<T> FuturesIo<T> {
/// Create a new `FuturesIo` wrapping the given I/O object.
///
/// # Safety
///
/// This wrapper is only safe to use if the inner I/O object does not under any circumstance
/// read from the buffer passed to `poll_read` in the `futures::AsyncRead` implementation.
pub(crate) unsafe fn new(inner: T) -> Self {
Self { inner }
}
pub(crate) fn into_inner(self) -> T {
self.inner
}
}
impl<T> hyper::rt::Write for FuturesIo<T>
where
T: futures::AsyncWrite + Unpin,
{
fn poll_write(
self: Pin<&mut Self>,
cx: &mut Context<'_>,
buf: &[u8],
) -> Poll<Result<usize, std::io::Error>> {
self.project().inner.poll_write(cx, buf)
}
fn poll_flush(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Result<(), std::io::Error>> {
self.project().inner.poll_flush(cx)
}
fn poll_shutdown(
self: Pin<&mut Self>,
cx: &mut Context<'_>,
) -> Poll<Result<(), std::io::Error>> {
self.project().inner.poll_close(cx)
}
fn poll_write_vectored(
self: Pin<&mut Self>,
cx: &mut Context<'_>,
bufs: &[std::io::IoSlice<'_>],
) -> Poll<Result<usize, std::io::Error>> {
self.project().inner.poll_write_vectored(cx, bufs)
}
}
// Adapted from https://github.com/hyperium/hyper-util/blob/99b77a5a6f75f24bc0bcb4ca74b5f26a07b19c80/src/rt/tokio.rs
impl<T> hyper::rt::Read for FuturesIo<T>
where
T: futures::AsyncRead + Unpin,
{
fn poll_read(
self: Pin<&mut Self>,
cx: &mut Context<'_>,
mut buf: hyper::rt::ReadBufCursor<'_>,
) -> Poll<Result<(), std::io::Error>> {
// Safety: buf_slice should only be written to, so it's safe to convert `&mut [MaybeUninit<u8>]` to `&mut [u8]`.
let buf_slice = unsafe {
slice::from_raw_parts_mut(buf.as_mut().as_mut_ptr() as *mut u8, buf.as_mut().len())
};
let n = match futures::AsyncRead::poll_read(self.project().inner, cx, buf_slice) {
Poll::Ready(Ok(n)) => n,
other => return other.map_ok(|_| ()),
};
unsafe {
buf.advance(n);
}
Poll::Ready(Ok(()))
}
}


@@ -1,74 +0,0 @@
pub(crate) mod hyper_io;
mod request_opt;
mod requests;
pub mod prover;
pub use prover::prover;
pub mod verify;
use tracing::error;
pub use verify::verify;
use wasm_bindgen::prelude::*;
pub use crate::request_opt::{RequestOptions, VerifyResult};
pub use wasm_bindgen_rayon::init_thread_pool;
use js_sys::JSON;
use wasm_bindgen_futures::JsFuture;
use web_sys::{Request, RequestInit, Response};
use std::panic;
use tracing::debug;
use tracing_subscriber::fmt::format::Pretty;
use tracing_subscriber::fmt::time::UtcTime;
use tracing_subscriber::prelude::*;
use tracing_subscriber::EnvFilter;
use tracing_web::{performance_layer, MakeWebConsoleWriter};
extern crate console_error_panic_hook;
#[wasm_bindgen]
pub fn setup_tracing_web(logging_filter: &str) {
let fmt_layer = tracing_subscriber::fmt::layer()
.with_ansi(false) // Only partially supported across browsers
.with_timer(UtcTime::rfc_3339()) // std::time is not available in browsers
// .with_thread_ids(true)
// .with_thread_names(true)
.with_writer(MakeWebConsoleWriter::new()); // write events to the console
let perf_layer = performance_layer().with_details_from_fields(Pretty::default());
let filter_layer = EnvFilter::builder()
.parse(logging_filter)
.unwrap_or_default();
tracing_subscriber::registry()
.with(filter_layer)
.with(fmt_layer)
.with(perf_layer)
.init(); // Install these as subscribers to tracing events
// https://github.com/rustwasm/console_error_panic_hook
panic::set_hook(Box::new(|info| {
error!("panic occurred: {:?}", info);
console_error_panic_hook::hook(info);
}));
debug!("🪵 Logging set up 🪵")
}
pub async fn fetch_as_json_string(url: &str, opts: &RequestInit) -> Result<String, JsValue> {
let request = Request::new_with_str_and_init(url, opts)?;
let window = web_sys::window().expect("Window object");
let resp_value = JsFuture::from(window.fetch_with_request(&request)).await?;
assert!(resp_value.is_instance_of::<Response>());
let resp: Response = resp_value.dyn_into()?;
let json = JsFuture::from(resp.json()?).await?;
let stringified = JSON::stringify(&json)?;
stringified
.as_string()
.ok_or_else(|| JsValue::from_str("Could not stringify JSON"))
}
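
A small sketch (not in the original sources) of how `setup_tracing_web` is typically called: the `logging_filter` argument uses tracing's `EnvFilter` directive syntax, and a string that fails to parse falls back to the default filter via the `unwrap_or_default()` above.

use tlsn_extension_rs::setup_tracing_web;

fn init_logging() {
    // Default every crate to `info`, but log this crate at `debug`.
    setup_tracing_web("info,tlsn_extension_rs=debug");
}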


@@ -1,479 +0,0 @@
use futures::channel::oneshot;
use std::ops::Range;
use tlsn_prover::tls::{Prover, ProverConfig};
use wasm_bindgen_futures::spawn_local;
use web_time::Instant;
use ws_stream_wasm::*;
use crate::hyper_io::FuturesIo;
use crate::request_opt::RequestOptions;
use crate::requests::{ClientType, NotarizationSessionRequest, NotarizationSessionResponse};
pub use wasm_bindgen_rayon::init_thread_pool;
use crate::fetch_as_json_string;
pub use crate::request_opt::VerifyResult;
use futures::AsyncWriteExt;
use http_body_util::{BodyExt, Full};
use hyper::{body::Bytes, Request, StatusCode};
use js_sys::Array;
use strum::EnumMessage;
use tlsn_core::proof::TlsProof;
use url::Url;
use wasm_bindgen::prelude::*;
use web_sys::{Headers, RequestInit, RequestMode};
use tracing::{debug, info, trace};
#[derive(strum_macros::EnumMessage, Debug, Clone, Copy)]
#[allow(dead_code)]
enum ProverPhases {
#[strum(message = "Connect application server with websocket proxy")]
ConnectWsProxy,
#[strum(message = "Build prover config")]
BuildProverConfig,
#[strum(message = "Set up prover")]
SetUpProver,
#[strum(message = "Bind the prover to the server connection")]
BindProverToConnection,
#[strum(message = "Spawn the prover thread")]
SpawnProverThread,
#[strum(message = "Attach the hyper HTTP client to the TLS connection")]
AttachHttpClient,
#[strum(message = "Spawn the HTTP task to be run concurrently")]
SpawnHttpTask,
#[strum(message = "Build request")]
BuildRequest,
#[strum(message = "Start MPC-TLS connection with the server")]
StartMpcConnection,
#[strum(message = "Received response from the server")]
ReceivedResponse,
#[strum(message = "Parsing response from the server")]
ParseResponse,
#[strum(message = "Close the connection to the server")]
CloseConnection,
#[strum(message = "Start notarization")]
StartNotarization,
#[strum(message = "Commit to data")]
Commit,
#[strum(message = "Finalize")]
Finalize,
#[strum(message = "Notarization complete")]
NotarizationComplete,
#[strum(message = "Create Proof")]
CreateProof,
}
fn log_phase(phase: ProverPhases) {
info!("tlsn-js {}: {}", phase as u8, phase.get_message().unwrap());
}
#[wasm_bindgen]
pub async fn prover(
target_url_str: &str,
val: JsValue,
secret_headers: JsValue,
secret_body: JsValue,
) -> Result<String, JsValue> {
debug!("target_url: {}", target_url_str);
let target_url = Url::parse(target_url_str)
.map_err(|e| JsValue::from_str(&format!("Could not parse target_url: {:?}", e)))?;
debug!(
"target_url.host: {}",
target_url
.host()
.ok_or(JsValue::from_str("Could not get target host"))?
);
let options: RequestOptions = serde_wasm_bindgen::from_value(val)
.map_err(|e| JsValue::from_str(&format!("Could not deserialize options: {:?}", e)))?;
debug!("options.notary_url: {}", options.notary_url.as_str());
let start_time = Instant::now();
/*
* Connect to the Notary over WebSocket
*/
let mut opts = RequestInit::new();
opts.method("POST");
// opts.method("GET");
opts.mode(RequestMode::Cors);
// set headers
let headers = Headers::new()
.map_err(|e| JsValue::from_str(&format!("Could not create headers: {:?}", e)))?;
let notary_url = Url::parse(options.notary_url.as_str())
.map_err(|e| JsValue::from_str(&format!("Could not parse notary_url: {:?}", e)))?;
let notary_ssl = notary_url.scheme() == "https" || notary_url.scheme() == "wss";
let notary_host = notary_url.authority();
let notary_path = notary_url.path();
let notary_path_str = if notary_path == "/" { "" } else { notary_path };
headers
.append("Host", notary_host)
.map_err(|e| JsValue::from_str(&format!("Could not append Host header: {:?}", e)))?;
headers
.append("Content-Type", "application/json")
.map_err(|e| {
JsValue::from_str(&format!("Could not append Content-Type header: {:?}", e))
})?;
opts.headers(&headers);
info!("notary_host: {}", notary_host);
// set body
let payload = serde_json::to_string(&NotarizationSessionRequest {
client_type: ClientType::Websocket,
max_sent_data: options.max_sent_data,
max_recv_data: options.max_recv_data,
})
.map_err(|e| JsValue::from_str(&format!("Could not serialize request: {:?}", e)))?;
opts.body(Some(&JsValue::from_str(&payload)));
// url
let url = format!(
"{}://{}{}/session",
if notary_ssl { "https" } else { "http" },
notary_host,
notary_path_str
);
debug!("Request: {}", url);
let rust_string = fetch_as_json_string(&url, &opts)
.await
.map_err(|e| JsValue::from_str(&format!("Could not fetch session: {:?}", e)))?;
let notarization_response =
serde_json::from_str::<NotarizationSessionResponse>(&rust_string)
.map_err(|e| JsValue::from_str(&format!("Could not deserialize response: {:?}", e)))?;
debug!("Response: {}", rust_string);
debug!("Notarization response: {:?}", notarization_response,);
let notary_wss_url = format!(
"{}://{}{}/notarize?sessionId={}",
if notary_ssl { "wss" } else { "ws" },
notary_host,
notary_path_str,
notarization_response.session_id
);
let (_, notary_ws_stream) = WsMeta::connect(notary_wss_url, None)
.await
.expect_throw("assume the notary ws connection succeeds");
let notary_ws_stream_into = notary_ws_stream.into_io();
log_phase(ProverPhases::BuildProverConfig);
let target_host = target_url
.host_str()
.ok_or(JsValue::from_str("Could not get target host"))?;
// Basic default prover config
let mut builder = ProverConfig::builder();
if let Some(max_sent_data) = options.max_sent_data {
builder.max_sent_data(max_sent_data);
}
if let Some(max_recv_data) = options.max_recv_data {
builder.max_recv_data(max_recv_data);
}
let config = builder
.id(notarization_response.session_id)
.server_dns(target_host)
.build()
.map_err(|e| JsValue::from_str(&format!("Could not build prover config: {:?}", e)))?;
// Create a Prover and set it up with the Notary
// This will set up the MPC backend prior to connecting to the server.
log_phase(ProverPhases::SetUpProver);
let prover = Prover::new(config)
.setup(notary_ws_stream_into)
.await
.map_err(|e| JsValue::from_str(&format!("Could not set up prover: {:?}", e)))?;
/*
Connect to the application server through the WebSocket proxy
*/
log_phase(ProverPhases::ConnectWsProxy);
let (_, client_ws_stream) = WsMeta::connect(options.websocket_proxy_url, None)
.await
.expect_throw("assume the client ws connection succeeds");
// Bind the Prover to the server connection.
// The returned `mpc_tls_connection` is an MPC TLS connection to the Server: all data written
// to/read from it will be encrypted/decrypted using MPC with the Notary.
log_phase(ProverPhases::BindProverToConnection);
let (mpc_tls_connection, prover_fut) = prover
.connect(client_ws_stream.into_io())
.await
.map_err(|e| JsValue::from_str(&format!("Could not connect prover to the server: {:?}", e)))?;
let mpc_tls_connection = unsafe { FuturesIo::new(mpc_tls_connection) };
let prover_ctrl = prover_fut.control();
log_phase(ProverPhases::SpawnProverThread);
let (prover_sender, prover_receiver) = oneshot::channel();
let handled_prover_fut = async {
let result = prover_fut.await;
let _ = prover_sender.send(result);
};
spawn_local(handled_prover_fut);
// Attach the hyper HTTP client to the TLS connection
log_phase(ProverPhases::AttachHttpClient);
let (mut request_sender, connection) =
hyper::client::conn::http1::handshake(mpc_tls_connection)
.await
.map_err(|e| JsValue::from_str(&format!("Could not handshake: {:?}", e)))?;
// Spawn the HTTP task to be run concurrently
log_phase(ProverPhases::SpawnHttpTask);
let (connection_sender, connection_receiver) = oneshot::channel();
let connection_fut = connection.without_shutdown();
let handled_connection_fut = async {
let result = connection_fut.await;
let _ = connection_sender.send(result);
};
spawn_local(handled_connection_fut);
log_phase(ProverPhases::BuildRequest);
let mut req_with_header = Request::builder()
.uri(target_url_str)
.method(options.method.as_str());
for (key, value) in options.headers {
info!("adding header: {} - {}", key.as_str(), value.as_str());
req_with_header = req_with_header.header(key.as_str(), value.as_str());
}
let req_with_body = if options.body.is_empty() {
info!("empty body");
req_with_header.body(Full::new(Bytes::default()))
} else {
info!("added body - {}", options.body.as_str());
req_with_header.body(Full::from(options.body))
};
let unwrapped_request = req_with_body
.map_err(|e| JsValue::from_str(&format!("Could not build request: {:?}", e)))?;
log_phase(ProverPhases::StartMpcConnection);
// Defer decryption of the response.
prover_ctrl
.defer_decryption()
.await
.map_err(|e| JsValue::from_str(&format!("failed to enable deferred decryption: {}", e)))?;
// Send the request to the Server and get a response via the MPC TLS connection
let response = request_sender
.send_request(unwrapped_request)
.await
.map_err(|e| JsValue::from_str(&format!("Could not send request: {:?}", e)))?;
log_phase(ProverPhases::ReceivedResponse);
if response.status() != StatusCode::OK {
return Err(JsValue::from_str(&format!(
"Response status is not OK: {:?}",
response.status()
)));
}
log_phase(ProverPhases::ParseResponse);
// Pretty printing :)
let payload = response
.into_body()
.collect()
.await
.map_err(|e| JsValue::from_str(&format!("Could not get response body: {:?}", e)))?
.to_bytes();
let parsed = serde_json::from_str::<serde_json::Value>(&String::from_utf8_lossy(&payload))
.map_err(|e| JsValue::from_str(&format!("Could not parse response: {:?}", e)))?;
let response_pretty = serde_json::to_string_pretty(&parsed)
.map_err(|e| JsValue::from_str(&format!("Could not serialize response: {:?}", e)))?;
info!("Response: {}", response_pretty);
// Close the connection to the server
log_phase(ProverPhases::CloseConnection);
let mut client_socket = connection_receiver
.await
.map_err(|e| {
JsValue::from_str(&format!(
"Could not receive from connection_receiver: {:?}",
e
))
})?
.map_err(|e| JsValue::from_str(&format!("Could not get TlsConnection: {:?}", e)))?
.io
.into_inner();
client_socket
.close()
.await
.map_err(|e| JsValue::from_str(&format!("Could not close socket: {:?}", e)))?;
// The Prover task should be done now, so we can grab it.
log_phase(ProverPhases::StartNotarization);
let prover = prover_receiver
.await
.map_err(|e| {
JsValue::from_str(&format!("Could not receive from prover_receiver: {:?}", e))
})?
.map_err(|e| JsValue::from_str(&format!("Could not get Prover: {:?}", e)))?;
let mut prover = prover.start_notarize();
let secret_headers_vecs = string_list_to_bytes_vec(&secret_headers)?;
let secret_headers_slices: Vec<&[u8]> = secret_headers_vecs
.iter()
.map(|vec| vec.as_slice())
.collect();
// Identify the ranges in the sent transcript that contain the secret headers
let (sent_public_ranges, sent_private_ranges) = find_ranges(
prover.sent_transcript().data(),
secret_headers_slices.as_slice(),
);
let secret_body_vecs = string_list_to_bytes_vec(&secret_body)?;
let secret_body_slices: Vec<&[u8]> =
secret_body_vecs.iter().map(|vec| vec.as_slice()).collect();
// Identify the ranges in the received transcript that contain the secret body values
let (recv_public_ranges, recv_private_ranges) = find_ranges(
prover.recv_transcript().data(),
secret_body_slices.as_slice(),
);
log_phase(ProverPhases::Commit);
let _recv_len = prover.recv_transcript().data().len();
let builder = prover.commitment_builder();
// Commit to the outbound and inbound transcript, isolating the data that contain secrets
let sent_pub_commitment_ids = sent_public_ranges
.iter()
.map(|range| {
builder.commit_sent(range).map_err(|e| {
JsValue::from_str(&format!("Error committing sent pub range: {:?}", e))
})
})
.collect::<Result<Vec<_>, _>>()?;
sent_private_ranges.iter().try_for_each(|range| {
builder
.commit_sent(range)
.map_err(|e| {
JsValue::from_str(&format!("Error committing sent private range: {:?}", e))
})
.map(|_| ())
})?;
let recv_pub_commitment_ids = recv_public_ranges
.iter()
.map(|range| {
builder.commit_recv(range).map_err(|e| {
JsValue::from_str(&format!("Error committing recv public ranges: {:?}", e))
})
})
.collect::<Result<Vec<_>, _>>()?;
recv_private_ranges.iter().try_for_each(|range| {
builder
.commit_recv(range)
.map_err(|e| {
JsValue::from_str(&format!("Error committing recv private range: {:?}", e))
})
.map(|_| ())
})?;
// Finalize, returning the notarized session
log_phase(ProverPhases::Finalize);
let notarized_session = prover
.finalize()
.await
.map_err(|e| JsValue::from_str(&format!("Error finalizing prover: {:?}", e)))?;
log_phase(ProverPhases::NotarizationComplete);
// Create a proof for all committed data in this session
log_phase(ProverPhases::CreateProof);
let session_proof = notarized_session.session_proof();
let mut proof_builder = notarized_session.data().build_substrings_proof();
// Reveal everything except the redacted data (the secret headers in the request and the secret body values in the response)
sent_pub_commitment_ids
.iter()
.chain(recv_pub_commitment_ids.iter())
.try_for_each(|id| {
proof_builder
.reveal_by_id(*id)
.map_err(|e| JsValue::from_str(&format!("Could not reveal commitment: {:?}", e)))
.map(|_| ())
})?;
let substrings_proof = proof_builder
.build()
.map_err(|e| JsValue::from_str(&format!("Could not build proof: {:?}", e)))?;
let proof = TlsProof {
session: session_proof,
substrings: substrings_proof,
};
let res = serde_json::to_string_pretty(&proof)
.map_err(|e| JsValue::from_str(&format!("Could not serialize proof: {:?}", e)))?;
let duration = start_time.elapsed();
info!("!@# request took {} seconds", duration.as_secs());
Ok(res)
}
/// Find the ranges of the public and private parts of a sequence.
///
/// Returns a tuple of `(public, private)` ranges.
fn find_ranges(seq: &[u8], private_seq: &[&[u8]]) -> (Vec<Range<usize>>, Vec<Range<usize>>) {
let mut private_ranges = Vec::new();
for s in private_seq {
for (idx, w) in seq.windows(s.len()).enumerate() {
if w == *s {
private_ranges.push(idx..(idx + w.len()));
}
}
}
let mut sorted_ranges = private_ranges.clone();
sorted_ranges.sort_by_key(|r| r.start);
let mut public_ranges = Vec::new();
let mut last_end = 0;
for r in sorted_ranges {
if r.start > last_end {
public_ranges.push(last_end..r.start);
}
last_end = r.end;
}
if last_end < seq.len() {
public_ranges.push(last_end..seq.len());
}
(public_ranges, private_ranges)
}
fn string_list_to_bytes_vec(secrets: &JsValue) -> Result<Vec<Vec<u8>>, JsValue> {
let array: Array = Array::from(secrets);
let length = array.length();
let mut byte_slices: Vec<Vec<u8>> = Vec::new();
for i in 0..length {
let secret_js: JsValue = array.get(i);
let secret_str: String = secret_js
.as_string()
.ok_or(JsValue::from_str("Could not convert secret to string"))?;
let secret_bytes = secret_str.into_bytes();
byte_slices.push(secret_bytes);
}
Ok(byte_slices)
}
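
To make the redaction logic concrete, here is a hypothetical unit test (not part of the original sources) showing what `find_ranges` returns for a small transcript: each occurrence of a secret becomes a private range and the gaps around it become public ranges.

#[test]
fn find_ranges_splits_public_and_private() {
    let transcript = b"Authorization: secret\r\n";
    let (public, private) = find_ranges(transcript, &[b"secret".as_slice()]);
    // "secret" occupies bytes 15..21; everything around it stays public.
    assert_eq!(private, vec![15..21]);
    assert_eq!(public, vec![0..15, 21..23]);
}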


@@ -1,30 +0,0 @@
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
/// Request options for the Fetch API
// https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch
#[derive(Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct RequestOptions {
pub method: String, // *GET, POST, PUT, DELETE, etc.
// pub mode: String, // no-cors, *cors, same-origin
// pub cache: String, // *default, no-cache, reload, force-cache, only-if-cached
// pub credentials: String, // include, *same-origin, omit
pub headers: HashMap<String, String>,
// pub redirect: String, // manual, *follow, error
// pub referrer_policy: String, // no-referrer, *no-referrer-when-downgrade, origin, origin-when-cross-origin, same-origin, strict-origin, strict-origin-when-cross-origin, unsafe-url
pub body: String, // body data type must match "Content-Type" header
pub max_sent_data: Option<usize>,
pub max_recv_data: Option<usize>,
pub notary_url: String,
pub websocket_proxy_url: String,
}
#[derive(Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct VerifyResult {
pub server_name: String,
pub time: u64,
pub sent: String,
pub recv: String,
}
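
Because of `rename_all = "camelCase"`, the options object handed over from JavaScript uses camelCase keys. A hypothetical example (not in the original sources; the URLs are placeholders) of deserializing such a value; the prover itself goes through `serde_wasm_bindgen` on a `JsValue`, but the field mapping is the same:

#[test]
fn request_options_use_camel_case_keys() {
    let json = r#"{
        "method": "GET",
        "headers": { "Accept": "application/json" },
        "body": "",
        "maxSentData": 4096,
        "maxRecvData": 16384,
        "notaryUrl": "https://notary.example.com",
        "websocketProxyUrl": "wss://proxy.example.com"
    }"#;
    let opts: RequestOptions = serde_json::from_str(json).unwrap();
    assert_eq!(opts.method, "GET");
    assert_eq!(opts.max_sent_data, Some(4096));
    assert_eq!(opts.notary_url, "https://notary.example.com");
}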


@@ -1,29 +0,0 @@
use serde::{Deserialize, Serialize};
/// Response object of the /session API
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct NotarizationSessionResponse {
/// Unique session id that is generated by the notary and shared with the prover
pub session_id: String,
}
/// Request object of the /session API
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct NotarizationSessionRequest {
pub client_type: ClientType,
/// Maximum number of bytes that can be sent.
pub max_sent_data: Option<usize>,
/// Maximum number of bytes that can be received.
pub max_recv_data: Option<usize>,
}
/// Types of client that the prover is using
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
pub enum ClientType {
/// Client that has access to the transport layer
Tcp,
/// Client that cannot directly access the transport layer, e.g. a browser extension
Websocket,
}
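
For reference, a hypothetical test (not in the original sources; the size limits are arbitrary example values) of the JSON body that the prover above POSTs to the notary's /session endpoint:

#[test]
fn session_request_wire_format() {
    let req = NotarizationSessionRequest {
        client_type: ClientType::Websocket,
        max_sent_data: Some(4096),
        max_recv_data: Some(16384),
    };
    // `rename_all = "camelCase"` renames the struct fields; the enum variant keeps its name.
    assert_eq!(
        serde_json::to_string(&req).unwrap(),
        r#"{"clientType":"Websocket","maxSentData":4096,"maxRecvData":16384}"#
    );
}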


@@ -1,104 +0,0 @@
use tracing::info;
use wasm_bindgen::prelude::*;
use crate::request_opt::VerifyResult;
use elliptic_curve::pkcs8::DecodePublicKey;
use std::time::Duration;
use tlsn_core::proof::{SessionProof, TlsProof};
#[wasm_bindgen]
pub async fn verify(proof: &str, notary_pubkey_str: &str) -> Result<String, JsValue> {
let proof: TlsProof = serde_json::from_str(proof)
.map_err(|e| JsValue::from_str(&format!("Could not deserialize proof: {:?}", e)))?;
let TlsProof {
// The session proof establishes the identity of the server and the commitments
// to the TLS transcript.
session,
// The substrings proof proves select portions of the transcript, while redacting
// anything the Prover chose not to disclose.
substrings,
} = proof;
info!(
"notary_pubkey: {} ({} bytes)",
notary_pubkey_str,
notary_pubkey_str.len()
);
session
.verify_with_default_cert_verifier(get_notary_pubkey(notary_pubkey_str)?)
.map_err(|e| JsValue::from_str(&format!("Session verification failed: {:?}", e)))?;
let SessionProof {
// The session header that was signed by the Notary is a succinct commitment to the TLS transcript.
header,
// This is the server name, checked against the certificate chain shared in the TLS handshake.
session_info,
..
} = session;
// The time at which the session was recorded
let time = chrono::DateTime::UNIX_EPOCH + Duration::from_secs(header.time());
// Verify the substrings proof against the session header.
//
// This returns the redacted transcripts
let (mut sent, mut recv) = substrings
.verify(&header)
.map_err(|e| JsValue::from_str(&format!("Could not verify substrings: {:?}", e)))?;
// Replace the bytes which the Prover chose not to disclose with 'X'
sent.set_redacted(b'X');
recv.set_redacted(b'X');
info!("-------------------------------------------------------------------");
info!(
"Successfully verified that the bytes below came from a session with {:?} at {}.",
session_info.server_name, time
);
info!("Note that the bytes which the Prover chose not to disclose are shown as X.");
info!("Bytes sent:");
info!(
"{}",
String::from_utf8(sent.data().to_vec()).map_err(|e| JsValue::from_str(&format!(
"Could not convert sent data to string: {:?}",
e
)))?
);
info!("Bytes received:");
info!(
"{}",
String::from_utf8(recv.data().to_vec()).map_err(|e| JsValue::from_str(&format!(
"Could not convert recv data to string: {:?}",
e
)))?
);
info!("-------------------------------------------------------------------");
let result = VerifyResult {
server_name: String::from(session_info.server_name.as_str()),
time: header.time(),
sent: String::from_utf8(sent.data().to_vec()).map_err(|e| {
JsValue::from_str(&format!("Could not convert sent data to string: {:?}", e))
})?,
recv: String::from_utf8(recv.data().to_vec()).map_err(|e| {
JsValue::from_str(&format!("Could not convert recv data to string: {:?}", e))
})?,
};
let res = serde_json::to_string_pretty(&result)
.map_err(|e| JsValue::from_str(&format!("Could not serialize result: {:?}", e)))?;
Ok(res)
}
#[allow(unused)]
fn print_type_of<T: ?Sized>(_: &T) {
info!("{}", std::any::type_name::<T>());
}
/// Returns a Notary pubkey trusted by this Verifier
fn get_notary_pubkey(pubkey: &str) -> Result<p256::PublicKey, JsValue> {
p256::PublicKey::from_public_key_pem(pubkey)
.map_err(|e| JsValue::from_str(&format!("Could not get notary pubkey: {:?}", e)))
}
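
A hedged sketch (not in the original sources) of a Rust-side caller: `verify` returns the `VerifyResult` as pretty-printed JSON, so it can be deserialized back into the struct re-exported from lib.rs. `proof_json` and `notary_pem` stand in for values a real caller would load.

use tlsn_extension_rs::{verify, VerifyResult};
use wasm_bindgen::JsValue;

async fn check_proof(proof_json: &str, notary_pem: &str) -> Result<VerifyResult, JsValue> {
    let result_json = verify(proof_json, notary_pem).await?;
    // Redacted bytes show up as 'X' in both `sent` and `recv`.
    serde_json::from_str(&result_json)
        .map_err(|e| JsValue::from_str(&format!("Could not parse verify result: {:?}", e)))
}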


@@ -1,61 +0,0 @@
//! Test suite for the Web and headless browsers.
#![cfg(target_arch = "wasm32")]
extern crate wasm_bindgen_test;
use serde_json::Value;
use std::{collections::HashMap, str};
use wasm_bindgen_test::*;
use web_sys::RequestInit;
extern crate tlsn_extension_rs;
use tlsn_extension_rs::*;
macro_rules! log {
( $( $t:tt )* ) => {
web_sys::console::log_1(&format!( $( $t )* ).into());
}
}
wasm_bindgen_test_configure!(run_in_browser);
#[wasm_bindgen_test]
async fn test_fetch() {
let url = "https://swapi.info/api/";
let mut opts = RequestInit::new();
opts.method("GET");
let rust_string: String = tlsn_extension_rs::fetch_as_json_string(&url, &opts)
.await
.unwrap();
assert!(rust_string.contains("starships"));
}
#[wasm_bindgen_test]
async fn verify() {
let pem = str::from_utf8(include_bytes!("../../../test/assets/notary.pem")).unwrap();
let proof = str::from_utf8(include_bytes!(
"../../../test/assets/simple_proof_redacted.json"
))
.unwrap();
let m: HashMap<String, Value> = serde_json::from_str(
&str::from_utf8(include_bytes!(
"../../../test/assets/simple_proof_expected.json"
))
.unwrap(),
)
.unwrap();
let result = tlsn_extension_rs::verify(proof, pem).await.expect("result");
log!("result: {}", &result);
let r: VerifyResult = serde_json::from_str::<VerifyResult>(&result).unwrap();
assert_eq!(r.server_name, m["serverName"]);
assert!(r.recv.contains("<title>XXXXXXXXXXXXXX</title>"));
assert_eq!(r.time, m["time"].as_u64().unwrap());
assert_eq!(r.sent, m["sent"].as_str().unwrap());
assert_eq!(r.recv, m["recv"].as_str().unwrap());
}


@@ -1,20 +0,0 @@
{
"moz:firefoxOptions": {
"prefs": {
"media.navigator.streams.fake": true,
"media.navigator.permission.disabled": true
},
"args": []
},
"goog:chromeOptions": {
"args": [
"--use-fake-device-for-media-stream",
"--use-fake-ui-for-media-stream",
"--headless",
"--disable-gpu",
"--no-sandbox",
"--disable-dev-shm-usage",
"--window-size=1280,800"
]
}
}


@@ -1,6 +1,7 @@
const webpack = require('webpack');
const path = require('path');
const isProd = process.env.NODE_ENV === 'production';
const CopyWebpackPlugin = require('copy-webpack-plugin');
const envPlugin = new webpack.EnvironmentPlugin({
NODE_ENV: 'development',
@@ -30,9 +31,9 @@ module.exports = [
{
mode: isProd ? 'production' : 'development',
entry: {
index: path.join(__dirname, 'src', 'index.ts'),
lib: path.join(__dirname, 'src', 'lib.ts'),
},
target: 'web',
target: 'webworker',
devtool: 'source-map',
resolve: {
extensions: ['.ts', '.js'],
@@ -53,6 +54,30 @@ module.exports = [
},
plugins: [
envPlugin,
new CopyWebpackPlugin({
patterns: [
{
from: 'node_modules/tlsn-wasm/tlsn_wasm.js',
to: path.join(__dirname, 'build'),
force: true,
},
{
from: 'node_modules/tlsn-wasm/tlsn_wasm_bg.wasm',
to: path.join(__dirname, 'build'),
force: true,
},
{
from: 'node_modules/tlsn-wasm/spawn.js',
to: path.join(__dirname, 'build'),
force: true,
},
{
from: 'node_modules/tlsn-wasm/snippets',
to: path.join(__dirname, 'build', 'snippets'),
force: true,
},
],
}),
],
},
];


@@ -1,7 +1,7 @@
const webpack = require('webpack');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const path = require('path');
const { compilerOptions } = require('./tsconfig.json');
const CopyWebpackPlugin = require('copy-webpack-plugin');
const isProd = process.env.NODE_ENV === 'production';
@@ -33,14 +33,17 @@ const rules = [
const rendererRules = [];
const entry = {
'full-integration': path.join(__dirname, 'test', 'e2e', 'full-integration.spec.ts'),
'simple-verify': path.join(__dirname, 'test', 'e2e', 'simple-verify.spec.ts'),
// add more entries as needed
};
module.exports = [
{
target: 'web',
mode: isProd ? 'production' : 'development',
entry: {
'full-integration-swapi.spec': path.join(__dirname, 'test', 'specs', 'full-integration-swapi.spec.ts'),
'simple-verify': path.join(__dirname, 'test', 'specs', 'simple-verify.spec.ts'),
},
entry,
output: {
path: __dirname + '/test-build',
publicPath: '/',
@@ -49,25 +52,6 @@ module.exports = [
devtool: 'source-map',
resolve: {
extensions: ['.ts', '.tsx', '.js', '.jsx', '.png', '.svg'],
modules: [
path.resolve('./node_modules'),
path.resolve(__dirname, compilerOptions.baseUrl),
],
fallback: {
browserify: require.resolve('browserify'),
stream: require.resolve('stream-browserify'),
path: require.resolve('path-browserify'),
crypto: require.resolve('crypto-browserify'),
os: require.resolve('os-browserify/browser'),
http: require.resolve('stream-http'),
https: require.resolve('https-browserify'),
assert: require.resolve('assert/'),
events: require.resolve('events/'),
'ansi-html-community': require.resolve('ansi-html-community'),
'html-entities': require.resolve('html-entities'),
constants: false,
fs: false,
},
},
module: {
rules: [...rules, ...rendererRules],
@@ -80,10 +64,50 @@ module.exports = [
new webpack.ProvidePlugin({
process: 'process',
}),
new CopyWebpackPlugin({
patterns: [
{
from: 'node_modules/tlsn-wasm',
to: path.join(__dirname, 'test-build'),
force: true,
},
],
}),
// Generate an HTML file for each entry
...Object.keys(entry).map(
(name) =>
new HtmlWebpackPlugin({
template: './test/test.ejs',
filename: `${name}.html`,
chunks: [name],
inject: true,
testName: name,
})
),
// Add an index page listing all test pages
new HtmlWebpackPlugin({
template: './test/test.ejs',
filename: `index.html`,
inject: true,
templateContent: () => `
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>tlsn-js test index</title>
</head>
<body>
<h1>tlsn-js test index</h1>
<ul>
${Object.keys(entry)
.map(
(name) =>
`<li><a href="${name}.html">${name}</a></li>`
)
.join('\n')}
</ul>
</body>
</html>
`,
filename: 'index.html',
inject: false,
}),
],
stats: 'minimal',