Merge remote-tracking branch 'engine.io-parser/main' into monorepo

Source: https://github.com/socketio/engine.io-parser
Author: Damien Arrachequesne
Date: 2024-07-08 10:54:44 +02:00
26 changed files with 22810 additions and 0 deletions


@@ -0,0 +1 @@
lib/contrib/*


@@ -0,0 +1,226 @@
# History
- [5.2.2](#522-2024-02-05) (Feb 2024)
- [5.2.1](#521-2023-08-01) (Aug 2023)
- [5.2.0](#520-2023-07-31) (Jul 2023)
- [5.1.0](#510-2023-06-11) (Jun 2023)
- [5.0.7](#507-2023-05-24) (May 2023)
- [5.0.6](#506-2023-01-16) (Jan 2023)
- [5.0.5](#505-2023-01-06) (Jan 2023)
- [5.0.4](#504-2022-04-30) (Apr 2022)
- [5.0.3](#503-2022-01-17) (Jan 2022)
- [5.0.2](#502-2021-11-14) (Nov 2021)
- [5.0.1](#501-2021-10-15) (Oct 2021)
- [**5.0.0**](#500-2021-10-04) (Oct 2021) (major: TypeScript rewrite)
- [4.0.3](#403-2021-08-29) (Aug 2021)
- [4.0.2](#402-2020-12-07) (Dec 2020)
- [2.2.1](#221-2020-09-30) (Sep 2020) (from the [2.2.x](https://github.com/socketio/engine.io-parser/tree/2.2.x) branch)
- [4.0.1](#401-2020-09-10) (Sep 2020)
- [**4.0.0**](#400-2020-09-08) (Sep 2020) (major: Engine.IO v4)
- [2.2.0](#220-2019-09-13) (Sep 2019)
# Release notes
## [5.2.2](https://github.com/socketio/engine.io-parser/compare/5.2.1...5.2.2) (2024-02-05)
### Bug Fixes
* **typescript:** properly import the TransformStream type ([0305b4a](https://github.com/socketio/engine.io-parser/commit/0305b4a7a597e0f070ce8ea17106121f9ab369bc))
## [5.2.1](https://github.com/socketio/engine.io-parser/compare/5.2.0...5.2.1) (2023-08-01)
The format of the WebTransport frame has been slightly updated.
## [5.2.0](https://github.com/socketio/engine.io-parser/compare/5.1.0...5.2.0) (2023-07-31)
### Features
* prepend a header to each WebTransport chunk ([6142324](https://github.com/socketio/engine.io-parser/commit/6142324fa61204393028f3f58f336d053030ea5f))
## [5.1.0](https://github.com/socketio/engine.io-parser/compare/5.0.7...5.1.0) (2023-06-11)
### Features
* implement WebTransport-related encoding/decoding ([bed70a4](https://github.com/socketio/engine.io-parser/commit/bed70a4f2598ebdf96d8ccc1b5d838b1a77a4290))
## [5.0.7](https://github.com/socketio/engine.io-parser/compare/5.0.6...5.0.7) (2023-05-24)
The CommonJS build now includes the TypeScript declarations too, in order to be compatible with the "node16" moduleResolution.
## [5.0.6](https://github.com/socketio/engine.io-parser/compare/5.0.5...5.0.6) (2023-01-16)
The `compile` script was not run before publishing `5.0.5`, so the ESM build did not include the latest changes.
## [5.0.5](https://github.com/socketio/engine.io-parser/compare/5.0.4...5.0.5) (2023-01-06)
### Bug Fixes
* properly encode empty buffer in base64 encoding ([#131](https://github.com/socketio/engine.io-parser/issues/131)) ([351ba82](https://github.com/socketio/engine.io-parser/commit/351ba8245b1aac795646d7e7a9001c8e1d0cc9f2))
## [5.0.4](https://github.com/socketio/engine.io-parser/compare/5.0.3...5.0.4) (2022-04-30)
### Bug Fixes
* add missing file extension for ESM import ([a421bbe](https://github.com/socketio/engine.io-parser/commit/a421bbec7bf43c567c49c608dee604872f6db823))
* **typings:** update the type of RawData ([039b45c](https://github.com/socketio/engine.io-parser/commit/039b45cc65b50acc1f9da42ad605eaccb8ccbcde))
## [5.0.3](https://github.com/socketio/engine.io-parser/compare/5.0.2...5.0.3) (2022-01-17)
## [5.0.2](https://github.com/socketio/engine.io-parser/compare/5.0.1...5.0.2) (2021-11-14)
### Bug Fixes
* add package name in nested package.json ([7e27159](https://github.com/socketio/engine.io-parser/commit/7e271596c3305fb4e4a9fbdcc7fd442e8ff71200))
* fix vite build for CommonJS users ([5f22ed0](https://github.com/socketio/engine.io-parser/commit/5f22ed0527cc80aa0cac415dfd12db2f94f0a855))
## [5.0.1](https://github.com/socketio/engine.io-parser/compare/5.0.0...5.0.1) (2021-10-15)
### Bug Fixes
* fix vite build ([900346e](https://github.com/socketio/engine.io-parser/commit/900346ea34ddc178d80eaabc8ea516d929457855))
## [5.0.0](https://github.com/socketio/engine.io-parser/compare/4.0.3...5.0.0) (2021-10-04)
This release includes the migration to TypeScript. The major bump is due to the new "exports" field in the package.json file.
See also: https://nodejs.org/api/packages.html#packages_package_entry_points
## [4.0.3](https://github.com/socketio/engine.io-parser/compare/4.0.2...4.0.3) (2021-08-29)
### Bug Fixes
* respect the offset and length of TypedArray objects ([6d7dd76](https://github.com/socketio/engine.io-parser/commit/6d7dd76130690afda6c214d5c04305d2bbc4eb4d))
## [4.0.2](https://github.com/socketio/engine.io-parser/compare/4.0.1...4.0.2) (2020-12-07)
### Bug Fixes
* add base64-arraybuffer as prod dependency ([2ccdeb2](https://github.com/socketio/engine.io-parser/commit/2ccdeb277955bed8742a29f2dcbbf57ca95eb12a))
## [2.2.1](https://github.com/socketio/engine.io-parser/compare/2.2.0...2.2.1) (2020-09-30)
## [4.0.1](https://github.com/socketio/engine.io-parser/compare/4.0.0...4.0.1) (2020-09-10)
### Bug Fixes
* use a terser-compatible representation of the separator ([886f9ea](https://github.com/socketio/engine.io-parser/commit/886f9ea7c4e717573152c31320f6fb6c6664061b))
## [4.0.0](https://github.com/socketio/engine.io-parser/compare/v4.0.0-alpha.1...4.0.0) (2020-09-08)
This major release contains the necessary changes for version 4 of the Engine.IO protocol. More information about the new version can be found [here](https://github.com/socketio/engine.io-protocol#difference-between-v3-and-v4).
Encoding changes between v3 and v4:
- encodePacket with string
- input: `{ type: "message", data: "hello" }`
- output in v3: `"4hello"`
- output in v4: `"4hello"`
- encodePacket with binary
- input: `{ type: 'message', data: <Buffer 01 02 03> }`
- output in v3: `<Buffer 04 01 02 03>`
- output in v4: `<Buffer 01 02 03>`
- encodePayload with strings
- input: `[ { type: 'message', data: 'hello' }, { type: 'message', data: '€€€' } ]`
- output in v3: `"6:4hello4:4€€€"`
- output in v4: `"4hello\x1e4€€€"`
- encodePayload with string and binary
- input: `[ { type: 'message', data: 'hello' }, { type: 'message', data: <Buffer 01 02 03> } ]`
- output in v3: `<Buffer 00 06 ff 34 68 65 6c 6c 6f 01 04 ff 04 01 02 03>`
- output in v4: `"4hello\x1ebAQID"`
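The v4 text-payload format summarized above can be reproduced without the parser. Here is a minimal Node.js sketch of the documented encoding (the `encodePayloadV4` helper is hypothetical, for illustration only — it is not this library's API):

```javascript
// Sketch of the Engine.IO v4 payload encoding described above: packets are
// joined with the ASCII record separator (0x1e); binary data is base64-encoded
// and prefixed with "b"; "message" packets are prefixed with their type digit "4".
const SEPARATOR = "\x1e";

function encodePayloadV4(packets) {
  return packets
    .map(({ type, data }) =>
      Buffer.isBuffer(data)
        ? "b" + data.toString("base64")
        : "4" + data // assuming type === "message"
    )
    .join(SEPARATOR);
}

const encoded = encodePayloadV4([
  { type: "message", data: "hello" },
  { type: "message", data: Buffer.from([1, 2, 3]) },
]);
// encoded === "4hello\x1ebAQID", matching the v4 example above
```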
Please note that the parser is now dependency-free! This should help reduce the size of the browser bundle.
### Bug Fixes
* keep track of the buffer initial length ([8edf2d1](https://github.com/socketio/engine.io-parser/commit/8edf2d1478026da442f519c2d2521af43ba01832))
### Features
* restore the upgrade mechanism ([6efedfa](https://github.com/socketio/engine.io-parser/commit/6efedfa0f3048506a4ba99e70674ddf4c0732e0c))
## [4.0.0-alpha.1](https://github.com/socketio/engine.io-parser/compare/v4.0.0-alpha.0...v4.0.0-alpha.1) (2020-05-19)
### Features
* implement the version 4 of the protocol ([cab7db0](https://github.com/socketio/engine.io-parser/commit/cab7db0404e0a69f86a05ececd62c8c31f4d97d5))
## [4.0.0-alpha.0](https://github.com/socketio/engine.io-parser/compare/2.2.0...v4.0.0-alpha.0) (2020-02-04)
### Bug Fixes
* properly decode binary packets ([5085373](https://github.com/socketio/engine.io-parser/commit/50853738e0c6c16f9cee0d7887651155f4b78240))
### Features
* remove packet type when encoding binary packets ([a947ae5](https://github.com/socketio/engine.io-parser/commit/a947ae59a2844e4041db58ff36b270d1528b3bee))
### BREAKING CHANGES
* the packet containing binary data will now be sent without any transformation
Protocol v3: { type: 'message', data: <Buffer 01 02 03> } => <Buffer 04 01 02 03>
Protocol v4: { type: 'message', data: <Buffer 01 02 03> } => <Buffer 01 02 03>
## [2.2.0](https://github.com/socketio/engine.io-parser/compare/2.1.3...2.2.0) (2019-09-13)
* [refactor] Use `Buffer.allocUnsafe` instead of `new Buffer` (#104) ([aedf8eb](https://github.com/socketio/engine.io-parser/commit/aedf8eb29e8bf6aeb5c6cc68965d986c4c958ae2)), closes [#104](https://github.com/socketio/engine.io-parser/issues/104)
### BREAKING CHANGES
* drop support for Node.js 4 (since Buffer.allocUnsafe was added in v5.10.0)
Reference: https://nodejs.org/docs/latest/api/buffer.html#buffer_class_method_buffer_allocunsafe_size
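The difference is easy to demonstrate: `Buffer.allocUnsafe` skips the zero-fill that `Buffer.alloc` (and the deprecated `new Buffer(size)`) performs, so its contents must be written before use. A small illustrative sketch:

```javascript
// Buffer.alloc(size) returns zero-filled memory; Buffer.allocUnsafe(size) may
// contain stale data, which is why callers must overwrite every byte they read.
const safe = Buffer.alloc(4); // <Buffer 00 00 00 00>

const fast = Buffer.allocUnsafe(4); // contents are unspecified
fast.fill(0); // explicit initialization before use
fast.writeUInt8(42, 0);
```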


@@ -0,0 +1,22 @@
(The MIT License)
Copyright (c) 2016 Guillermo Rauch (@rauchg)
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -0,0 +1,158 @@
# engine.io-parser
[![Build Status](https://github.com/socketio/engine.io-parser/workflows/CI/badge.svg?branch=main)](https://github.com/socketio/engine.io-parser/actions)
[![NPM version](https://badge.fury.io/js/engine.io-parser.svg)](https://npmjs.com/package/engine.io-parser)
This is the JavaScript parser for the engine.io protocol encoding,
shared by both
[engine.io-client](https://github.com/socketio/engine.io-client) and
[engine.io](https://github.com/socketio/engine.io).
## How to use
### Standalone
The parser can encode/decode packets, payloads, and payloads as binary
with the following methods: `encodePacket`, `decodePacket`, `encodePayload`,
`decodePayload`.
Example:
```js
const parser = require("engine.io-parser");

const data = Buffer.from([1, 2, 3, 4]);

parser.encodePacket({ type: "message", data }, (encoded) => {
  const decodedData = parser.decodePacket(encoded); // decodedData === data
});
```
### With browserify
Engine.IO Parser is a CommonJS module, which means you can include it with
`require` in the browser and bundle it using [browserify](http://browserify.org/):
1. install the parser package
```shell
npm install engine.io-parser
```
1. write your app code
```js
const parser = require("engine.io-parser");

const testBuffer = new Int8Array(10);
for (let i = 0; i < testBuffer.length; i++) testBuffer[i] = i;

const packets = [
  { type: "message", data: testBuffer.buffer },
  { type: "message", data: "hello" },
];

parser.encodePayload(packets, (encoded) => {
  parser.decodePayload(encoded, (packet, index, total) => {
    const isLast = index + 1 === total;
    if (!isLast) {
      const buffer = new Int8Array(packet.data); // testBuffer
    } else {
      const message = packet.data; // "hello"
    }
  });
});
```
1. build your app bundle
```bash
$ browserify app.js > bundle.js
```
1. include on your page
```html
<script src="/path/to/bundle.js"></script>
```
## Features
- Runs on browser and node.js seamlessly
- Runs inside HTML5 WebWorker
- Can encode and decode packets
- Encodes from/to ArrayBuffer or Blob when in browser, and Buffer or ArrayBuffer in Node
## API
Note: `cb(type)` means the argument is a callback function that is called with a parameter of type `type`.
### Node
- `encodePacket`
  - Encodes a packet.
  - **Parameters**
    - `Object`: the packet to encode, has `type` and `data`.
      - `data`: can be a `String`, `Number`, `Buffer`, `ArrayBuffer`
    - `Boolean`: binary support
    - `Function`: callback, returns the encoded packet (`cb(String)`)
- `decodePacket`
  - Decodes a packet. Data also available as an ArrayBuffer if requested.
  - Returns data as `String` or (`Blob` on browser, `ArrayBuffer` on Node)
  - **Parameters**
    - `String` | `ArrayBuffer`: the packet to decode, has `type` and `data`
    - `String`: optional, the binary type
- `encodePayload`
  - Encodes multiple messages (payload).
  - If any contents are binary, they will be encoded as base64 strings. Base64-encoded strings are marked with a leading `b`.
  - **Parameters**
    - `Array`: an array of packets
    - `Function`: callback, returns the encoded payload (`cb(String)`)
- `decodePayload`
  - Decodes data when a payload is maybe expected. Possible binary contents are decoded from their base64 representation.
  - **Parameters**
    - `String`: the payload
    - `Function`: callback, called as `cb(Object: packet, Number: packet index, Number: packet total)`
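Decoding a payload mirrors the encoding: split on the record separator, then decode each packet. A self-contained sketch of the v4 text format (the `decodePayloadV4` helper and `TYPES` table are illustrative, not part of the package's API):

```javascript
// Sketch of v4 payload decoding: split on the ASCII record separator (0x1e),
// then map the leading character back to a packet type; a leading "b" marks
// a base64-encoded binary packet.
const SEPARATOR = "\x1e";
const TYPES = { 0: "open", 1: "close", 2: "ping", 3: "pong",
                4: "message", 5: "upgrade", 6: "noop" };

function decodePayloadV4(encodedPayload) {
  return encodedPayload.split(SEPARATOR).map((encodedPacket) => {
    if (encodedPacket[0] === "b") {
      return { type: "message", data: Buffer.from(encodedPacket.slice(1), "base64") };
    }
    return { type: TYPES[encodedPacket[0]], data: encodedPacket.slice(1) || undefined };
  });
}

const packets = decodePayloadV4("4hello\x1ebAQID");
// packets[0] → { type: "message", data: "hello" }
// packets[1].data → <Buffer 01 02 03>
```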
## Tests
Standalone tests can be run with `npm test` which will run the node.js tests.
Browser tests are run using [zuul](https://github.com/defunctzombie/zuul).
(You must have zuul setup with a saucelabs account.)
You can run the tests locally using the following command:
```
npm run test:browser
```
## Support
The support channels for `engine.io-parser` are the same as `socket.io`:
- irc.freenode.net **#socket.io**
- [Github Discussions](https://github.com/socketio/socket.io/discussions)
- [Website](https://socket.io)
## Development
To contribute patches, run tests or benchmarks, make sure to clone the
repository:
```bash
git clone git://github.com/socketio/engine.io-parser.git
```
Then:
```bash
cd engine.io-parser
npm ci
```
See the `Tests` section above for how to run tests before submitting any patches.
## License
MIT


@@ -0,0 +1,49 @@
const Benchmark = require('benchmark');
const suite = new Benchmark.Suite();
const parser = require('..');

suite
  .add('encode packet as string', (deferred) => {
    parser.encodePacket({ type: 'message', data: 'test' }, true, () => {
      deferred.resolve();
    });
  }, { defer: true })
  .add('encode packet as binary', (deferred) => {
    parser.encodePacket({ type: 'message', data: Buffer.from([1, 2, 3, 4]) }, true, () => {
      deferred.resolve();
    });
  }, { defer: true })
  .add('encode payload as string', (deferred) => {
    parser.encodePayload([{ type: 'message', data: 'test1' }, { type: 'message', data: 'test2' }], () => {
      deferred.resolve();
    });
  }, { defer: true })
  .add('encode payload as binary', (deferred) => {
    parser.encodePayload([{ type: 'message', data: 'test' }, { type: 'message', data: Buffer.from([1, 2, 3, 4]) }], () => {
      deferred.resolve();
    });
  }, { defer: true })
  .add('decode packet from string', () => {
    parser.decodePacket('4test');
  })
  .add('decode packet from binary', () => {
    parser.decodePacket(Buffer.from([4, 1, 2, 3, 4]));
  })
  .add('decode payload from string', (deferred) => {
    parser.decodePayload('test1\x1etest2');
    deferred.resolve();
  }, { defer: true })
  .add('decode payload from binary', (deferred) => {
    parser.decodePayload('test1\x1ebAQIDBA==', 'nodebuffer');
    deferred.resolve();
  }, { defer: true })
  .on('cycle', function (event) {
    console.log(String(event.target));
  })
  .on('complete', function () {
    console.log('Fastest is ' + this.filter('fastest').map('name'));
  })
  .run({ async: true });


@@ -0,0 +1,26 @@
Current
```
encode packet as string x 175,944 ops/sec ±5.64% (25 runs sampled)
encode packet as binary x 176,945 ops/sec ±16.60% (51 runs sampled)
encode payload as string x 47,836 ops/sec ±9.84% (34 runs sampled)
encode payload as binary x 123,987 ops/sec ±22.03% (53 runs sampled)
decode packet from string x 27,680,068 ops/sec ±0.92% (89 runs sampled)
decode packet from binary x 7,747,089 ops/sec ±1.65% (83 runs sampled)
decode payload from string x 198,908 ops/sec ±27.95% (23 runs sampled)
decode payload from binary x 179,574 ops/sec ±41.32% (23 runs sampled)
```
Results from parser v2 / protocol v3
```
encode packet as string x 228,038 ops/sec ±9.28% (40 runs sampled)
encode packet as binary x 163,392 ops/sec ±8.72% (67 runs sampled)
encode payload as string x 73,457 ops/sec ±14.83% (56 runs sampled)
encode payload as binary x 71,400 ops/sec ±3.63% (75 runs sampled)
decode packet from string x 22,712,325 ops/sec ±3.14% (90 runs sampled)
decode packet from binary x 4,849,781 ops/sec ±1.27% (87 runs sampled)
decode payload from string x 82,514 ops/sec ±49.93% (22 runs sampled)
decode payload from binary x 149,206 ops/sec ±25.90% (76 runs sampled)
```


@@ -0,0 +1,39 @@
const PACKET_TYPES = Object.create(null); // no Map = no polyfill
PACKET_TYPES["open"] = "0";
PACKET_TYPES["close"] = "1";
PACKET_TYPES["ping"] = "2";
PACKET_TYPES["pong"] = "3";
PACKET_TYPES["message"] = "4";
PACKET_TYPES["upgrade"] = "5";
PACKET_TYPES["noop"] = "6";

const PACKET_TYPES_REVERSE = Object.create(null);
Object.keys(PACKET_TYPES).forEach((key) => {
  PACKET_TYPES_REVERSE[PACKET_TYPES[key]] = key;
});

const ERROR_PACKET: Packet = { type: "error", data: "parser error" };

export { PACKET_TYPES, PACKET_TYPES_REVERSE, ERROR_PACKET };

export type PacketType =
  | "open"
  | "close"
  | "ping"
  | "pong"
  | "message"
  | "upgrade"
  | "noop"
  | "error";

// RawData should be "string | Buffer | ArrayBuffer | ArrayBufferView | Blob", but Blob does not exist in Node.js and
// requires to add the dom lib in tsconfig.json
export type RawData = any;

export interface Packet {
  type: PacketType;
  options?: { compress: boolean };
  data?: RawData;
}

export type BinaryType = "nodebuffer" | "arraybuffer" | "blob";


@@ -0,0 +1,64 @@
// imported from https://github.com/socketio/base64-arraybuffer
const chars =
  'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

// Use a lookup table to find the index.
const lookup = typeof Uint8Array === 'undefined' ? [] : new Uint8Array(256);
for (let i = 0; i < chars.length; i++) {
  lookup[chars.charCodeAt(i)] = i;
}

export const encode = (arraybuffer: ArrayBuffer): string => {
  let bytes = new Uint8Array(arraybuffer),
    i,
    len = bytes.length,
    base64 = '';

  for (i = 0; i < len; i += 3) {
    base64 += chars[bytes[i] >> 2];
    base64 += chars[((bytes[i] & 3) << 4) | (bytes[i + 1] >> 4)];
    base64 += chars[((bytes[i + 1] & 15) << 2) | (bytes[i + 2] >> 6)];
    base64 += chars[bytes[i + 2] & 63];
  }

  if (len % 3 === 2) {
    base64 = base64.substring(0, base64.length - 1) + '=';
  } else if (len % 3 === 1) {
    base64 = base64.substring(0, base64.length - 2) + '==';
  }

  return base64;
};

export const decode = (base64: string): ArrayBuffer => {
  let bufferLength = base64.length * 0.75,
    len = base64.length,
    i,
    p = 0,
    encoded1,
    encoded2,
    encoded3,
    encoded4;

  if (base64[base64.length - 1] === '=') {
    bufferLength--;
    if (base64[base64.length - 2] === '=') {
      bufferLength--;
    }
  }

  const arraybuffer = new ArrayBuffer(bufferLength),
    bytes = new Uint8Array(arraybuffer);

  for (i = 0; i < len; i += 4) {
    encoded1 = lookup[base64.charCodeAt(i)];
    encoded2 = lookup[base64.charCodeAt(i + 1)];
    encoded3 = lookup[base64.charCodeAt(i + 2)];
    encoded4 = lookup[base64.charCodeAt(i + 3)];

    bytes[p++] = (encoded1 << 2) | (encoded2 >> 4);
    bytes[p++] = ((encoded2 & 15) << 4) | (encoded3 >> 2);
    bytes[p++] = ((encoded3 & 3) << 6) | (encoded4 & 63);
  }

  return arraybuffer;
};
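A quick way to sanity-check the codec above is to compare it against Node's built-in base64 encoder. The function below is a plain-JavaScript port of the `encode` function (TypeScript annotations removed so the sketch runs standalone):

```javascript
// Plain-JS port of the `encode` function above, checked against Buffer's
// built-in base64 encoding. Three input bytes map to four output characters;
// the trailing "=" padding replaces characters produced from missing bytes.
const chars =
  'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

function encode(arraybuffer) {
  const bytes = new Uint8Array(arraybuffer);
  let base64 = '';
  for (let i = 0; i < bytes.length; i += 3) {
    base64 += chars[bytes[i] >> 2];
    base64 += chars[((bytes[i] & 3) << 4) | (bytes[i + 1] >> 4)];
    base64 += chars[((bytes[i + 1] & 15) << 2) | (bytes[i + 2] >> 6)];
    base64 += chars[bytes[i + 2] & 63];
  }
  if (bytes.length % 3 === 2) {
    base64 = base64.substring(0, base64.length - 1) + '=';
  } else if (bytes.length % 3 === 1) {
    base64 = base64.substring(0, base64.length - 2) + '==';
  }
  return base64;
}

const input = Uint8Array.from([1, 2, 3, 4]).buffer;
const result = encode(input); // "AQIDBA=="
```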


@@ -0,0 +1,72 @@
import {
  ERROR_PACKET,
  PACKET_TYPES_REVERSE,
  Packet,
  BinaryType,
  RawData,
} from "./commons.js";
import { decode } from "./contrib/base64-arraybuffer.js";

const withNativeArrayBuffer = typeof ArrayBuffer === "function";

export const decodePacket = (
  encodedPacket: RawData,
  binaryType?: BinaryType,
): Packet => {
  if (typeof encodedPacket !== "string") {
    return {
      type: "message",
      data: mapBinary(encodedPacket, binaryType),
    };
  }
  const type = encodedPacket.charAt(0);
  if (type === "b") {
    return {
      type: "message",
      data: decodeBase64Packet(encodedPacket.substring(1), binaryType),
    };
  }
  const packetType = PACKET_TYPES_REVERSE[type];
  if (!packetType) {
    return ERROR_PACKET;
  }
  return encodedPacket.length > 1
    ? {
        type: PACKET_TYPES_REVERSE[type],
        data: encodedPacket.substring(1),
      }
    : {
        type: PACKET_TYPES_REVERSE[type],
      };
};

const decodeBase64Packet = (data, binaryType) => {
  if (withNativeArrayBuffer) {
    const decoded = decode(data);
    return mapBinary(decoded, binaryType);
  } else {
    return { base64: true, data }; // fallback for old browsers
  }
};

const mapBinary = (data, binaryType) => {
  switch (binaryType) {
    case "blob":
      if (data instanceof Blob) {
        // from WebSocket + binaryType "blob"
        return data;
      } else {
        // from HTTP long-polling or WebTransport
        return new Blob([data]);
      }
    case "arraybuffer":
    default:
      if (data instanceof ArrayBuffer) {
        // from HTTP long-polling (base64) or WebSocket + binaryType "arraybuffer"
        return data;
      } else {
        // from WebTransport (Uint8Array)
        return data.buffer;
      }
  }
};


@@ -0,0 +1,66 @@
import {
  ERROR_PACKET,
  PACKET_TYPES_REVERSE,
  Packet,
  BinaryType,
  RawData,
} from "./commons.js";

export const decodePacket = (
  encodedPacket: RawData,
  binaryType?: BinaryType,
): Packet => {
  if (typeof encodedPacket !== "string") {
    return {
      type: "message",
      data: mapBinary(encodedPacket, binaryType),
    };
  }
  const type = encodedPacket.charAt(0);
  if (type === "b") {
    const buffer = Buffer.from(encodedPacket.substring(1), "base64");
    return {
      type: "message",
      data: mapBinary(buffer, binaryType),
    };
  }
  if (!PACKET_TYPES_REVERSE[type]) {
    return ERROR_PACKET;
  }
  return encodedPacket.length > 1
    ? {
        type: PACKET_TYPES_REVERSE[type],
        data: encodedPacket.substring(1),
      }
    : {
        type: PACKET_TYPES_REVERSE[type],
      };
};

const mapBinary = (data: RawData, binaryType?: BinaryType) => {
  switch (binaryType) {
    case "arraybuffer":
      if (data instanceof ArrayBuffer) {
        // from WebSocket & binaryType "arraybuffer"
        return data;
      } else if (Buffer.isBuffer(data)) {
        // from HTTP long-polling
        return data.buffer.slice(
          data.byteOffset,
          data.byteOffset + data.byteLength,
        );
      } else {
        // from WebTransport (Uint8Array)
        return data.buffer;
      }
    case "nodebuffer":
    default:
      if (Buffer.isBuffer(data)) {
        // from HTTP long-polling or WebSocket & binaryType "nodebuffer" (default)
        return data;
      } else {
        // from WebTransport (Uint8Array)
        return Buffer.from(data);
      }
  }
};


@@ -0,0 +1,85 @@
import { PACKET_TYPES, Packet, RawData } from "./commons.js";

const withNativeBlob =
  typeof Blob === "function" ||
  (typeof Blob !== "undefined" &&
    Object.prototype.toString.call(Blob) === "[object BlobConstructor]");
const withNativeArrayBuffer = typeof ArrayBuffer === "function";

// ArrayBuffer.isView method is not defined in IE10
const isView = (obj) => {
  return typeof ArrayBuffer.isView === "function"
    ? ArrayBuffer.isView(obj)
    : obj && obj.buffer instanceof ArrayBuffer;
};

const encodePacket = (
  { type, data }: Packet,
  supportsBinary: boolean,
  callback: (encodedPacket: RawData) => void,
) => {
  if (withNativeBlob && data instanceof Blob) {
    if (supportsBinary) {
      return callback(data);
    } else {
      return encodeBlobAsBase64(data, callback);
    }
  } else if (
    withNativeArrayBuffer &&
    (data instanceof ArrayBuffer || isView(data))
  ) {
    if (supportsBinary) {
      return callback(data);
    } else {
      return encodeBlobAsBase64(new Blob([data]), callback);
    }
  }
  // plain string
  return callback(PACKET_TYPES[type] + (data || ""));
};

const encodeBlobAsBase64 = (
  data: Blob,
  callback: (encodedPacket: RawData) => void,
) => {
  const fileReader = new FileReader();
  fileReader.onload = function () {
    const content = (fileReader.result as string).split(",")[1];
    callback("b" + (content || ""));
  };
  return fileReader.readAsDataURL(data);
};

function toArray(data: BufferSource) {
  if (data instanceof Uint8Array) {
    return data;
  } else if (data instanceof ArrayBuffer) {
    return new Uint8Array(data);
  } else {
    return new Uint8Array(data.buffer, data.byteOffset, data.byteLength);
  }
}

let TEXT_ENCODER;

export function encodePacketToBinary(
  packet: Packet,
  callback: (encodedPacket: RawData) => void,
) {
  if (withNativeBlob && packet.data instanceof Blob) {
    return packet.data.arrayBuffer().then(toArray).then(callback);
  } else if (
    withNativeArrayBuffer &&
    (packet.data instanceof ArrayBuffer || isView(packet.data))
  ) {
    return callback(toArray(packet.data));
  }
  encodePacket(packet, false, (encoded) => {
    if (!TEXT_ENCODER) {
      TEXT_ENCODER = new TextEncoder();
    }
    callback(TEXT_ENCODER.encode(encoded));
  });
}

export { encodePacket };


@@ -0,0 +1,46 @@
import { PACKET_TYPES, Packet, RawData } from "./commons.js";

export const encodePacket = (
  { type, data }: Packet,
  supportsBinary: boolean,
  callback: (encodedPacket: RawData) => void,
) => {
  if (data instanceof ArrayBuffer || ArrayBuffer.isView(data)) {
    return callback(
      supportsBinary ? data : "b" + toBuffer(data, true).toString("base64"),
    );
  }
  // plain string
  return callback(PACKET_TYPES[type] + (data || ""));
};

const toBuffer = (data: BufferSource, forceBufferConversion: boolean) => {
  if (
    Buffer.isBuffer(data) ||
    (data instanceof Uint8Array && !forceBufferConversion)
  ) {
    return data;
  } else if (data instanceof ArrayBuffer) {
    return Buffer.from(data);
  } else {
    return Buffer.from(data.buffer, data.byteOffset, data.byteLength);
  }
};

let TEXT_ENCODER;

export function encodePacketToBinary(
  packet: Packet,
  callback: (encodedPacket: RawData) => void,
) {
  if (packet.data instanceof ArrayBuffer || ArrayBuffer.isView(packet.data)) {
    return callback(toBuffer(packet.data, false));
  }
  encodePacket(packet, true, (encoded) => {
    if (!TEXT_ENCODER) {
      // lazily created for compatibility with Node.js 10
      TEXT_ENCODER = new TextEncoder();
    }
    callback(TEXT_ENCODER.encode(encoded));
  });
}


@@ -0,0 +1,214 @@
import { encodePacket, encodePacketToBinary } from "./encodePacket.js";
import { decodePacket } from "./decodePacket.js";
import {
  Packet,
  PacketType,
  RawData,
  BinaryType,
  ERROR_PACKET,
} from "./commons.js";

// we can't import TransformStream as a value because it was added in Node.js v16.5.0, so it would break on older Node.js versions
// reference: https://nodejs.org/api/webstreams.html#class-transformstream
import type { TransformStream } from "node:stream/web";

const SEPARATOR = String.fromCharCode(30); // see https://en.wikipedia.org/wiki/Delimiter#ASCII_delimited_text

const encodePayload = (
  packets: Packet[],
  callback: (encodedPayload: string) => void,
) => {
  // some packets may be added to the array while encoding, so the initial length must be saved
  const length = packets.length;
  const encodedPackets = new Array(length);
  let count = 0;

  packets.forEach((packet, i) => {
    // force base64 encoding for binary packets
    encodePacket(packet, false, (encodedPacket) => {
      encodedPackets[i] = encodedPacket;
      if (++count === length) {
        callback(encodedPackets.join(SEPARATOR));
      }
    });
  });
};

const decodePayload = (
  encodedPayload: string,
  binaryType?: BinaryType,
): Packet[] => {
  const encodedPackets = encodedPayload.split(SEPARATOR);
  const packets = [];
  for (let i = 0; i < encodedPackets.length; i++) {
    const decodedPacket = decodePacket(encodedPackets[i], binaryType);
    packets.push(decodedPacket);
    if (decodedPacket.type === "error") {
      break;
    }
  }
  return packets;
};

export function createPacketEncoderStream() {
  // @ts-expect-error
  return new TransformStream({
    transform(packet: Packet, controller) {
      encodePacketToBinary(packet, (encodedPacket) => {
        const payloadLength = encodedPacket.length;
        let header;
        // inspired by the WebSocket format: https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_servers#decoding_payload_length
        if (payloadLength < 126) {
          header = new Uint8Array(1);
          new DataView(header.buffer).setUint8(0, payloadLength);
        } else if (payloadLength < 65536) {
          header = new Uint8Array(3);
          const view = new DataView(header.buffer);
          view.setUint8(0, 126);
          view.setUint16(1, payloadLength);
        } else {
          header = new Uint8Array(9);
          const view = new DataView(header.buffer);
          view.setUint8(0, 127);
          view.setBigUint64(1, BigInt(payloadLength));
        }
        // first bit indicates whether the payload is plain text (0) or binary (1)
        if (packet.data && typeof packet.data !== "string") {
          header[0] |= 0x80;
        }
        controller.enqueue(header);
        controller.enqueue(encodedPacket);
      });
    },
  });
}

let TEXT_DECODER;

function totalLength(chunks: Uint8Array[]) {
  return chunks.reduce((acc, chunk) => acc + chunk.length, 0);
}

function concatChunks(chunks: Uint8Array[], size: number) {
  if (chunks[0].length === size) {
    return chunks.shift();
  }
  const buffer = new Uint8Array(size);
  let j = 0;
  for (let i = 0; i < size; i++) {
    buffer[i] = chunks[0][j++];
    if (j === chunks[0].length) {
      chunks.shift();
      j = 0;
    }
  }
  if (chunks.length && j < chunks[0].length) {
    chunks[0] = chunks[0].slice(j);
  }
  return buffer;
}

const enum State {
  READ_HEADER,
  READ_EXTENDED_LENGTH_16,
  READ_EXTENDED_LENGTH_64,
  READ_PAYLOAD,
}

export function createPacketDecoderStream(
  maxPayload: number,
  binaryType: BinaryType,
) {
  if (!TEXT_DECODER) {
    TEXT_DECODER = new TextDecoder();
  }
  const chunks: Uint8Array[] = [];
  let state = State.READ_HEADER;
  let expectedLength = -1;
  let isBinary = false;

  // @ts-expect-error
  return new TransformStream({
    transform(chunk: Uint8Array, controller) {
      chunks.push(chunk);
      while (true) {
        if (state === State.READ_HEADER) {
          if (totalLength(chunks) < 1) {
            break;
          }
          const header = concatChunks(chunks, 1);
          isBinary = (header[0] & 0x80) === 0x80;
          expectedLength = header[0] & 0x7f;
          if (expectedLength < 126) {
            state = State.READ_PAYLOAD;
          } else if (expectedLength === 126) {
            state = State.READ_EXTENDED_LENGTH_16;
          } else {
            state = State.READ_EXTENDED_LENGTH_64;
          }
        } else if (state === State.READ_EXTENDED_LENGTH_16) {
          if (totalLength(chunks) < 2) {
            break;
          }
          const headerArray = concatChunks(chunks, 2);
          expectedLength = new DataView(
            headerArray.buffer,
            headerArray.byteOffset,
            headerArray.length,
          ).getUint16(0);
          state = State.READ_PAYLOAD;
        } else if (state === State.READ_EXTENDED_LENGTH_64) {
          if (totalLength(chunks) < 8) {
            break;
          }
          const headerArray = concatChunks(chunks, 8);
          const view = new DataView(
            headerArray.buffer,
            headerArray.byteOffset,
            headerArray.length,
          );
          const n = view.getUint32(0);
          if (n > Math.pow(2, 53 - 32) - 1) {
            // the maximum safe integer in JavaScript is 2^53 - 1
            controller.enqueue(ERROR_PACKET);
            break;
          }
          expectedLength = n * Math.pow(2, 32) + view.getUint32(4);
          state = State.READ_PAYLOAD;
        } else {
          if (totalLength(chunks) < expectedLength) {
            break;
          }
          const data = concatChunks(chunks, expectedLength);
          controller.enqueue(
            decodePacket(
              isBinary ? data : TEXT_DECODER.decode(data),
              binaryType,
            ),
          );
          state = State.READ_HEADER;
        }
        if (expectedLength === 0 || expectedLength > maxPayload) {
          controller.enqueue(ERROR_PACKET);
          break;
        }
      }
    },
  });
}

export const protocol = 4;
export {
  encodePacket,
  encodePayload,
  decodePacket,
  decodePayload,
  Packet,
  PacketType,
  RawData,
  BinaryType,
};
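The length-prefixed framing implemented by `createPacketEncoderStream` above follows the WebSocket scheme: one header byte for payloads shorter than 126 bytes, `126` plus a 16-bit length below 65536, and `127` plus a 64-bit length beyond that, with the high bit of the first byte flagging a binary payload. A standalone sketch of just the header computation (mirroring the code above, not importing it):

```javascript
// Computes the WebSocket-style length header used for WebTransport chunks.
function encodeHeader(payloadLength, isBinary) {
  let header;
  if (payloadLength < 126) {
    header = new Uint8Array(1);
    new DataView(header.buffer).setUint8(0, payloadLength);
  } else if (payloadLength < 65536) {
    header = new Uint8Array(3);
    const view = new DataView(header.buffer);
    view.setUint8(0, 126);
    view.setUint16(1, payloadLength); // big-endian by default
  } else {
    header = new Uint8Array(9);
    const view = new DataView(header.buffer);
    view.setUint8(0, 127);
    view.setBigUint64(1, BigInt(payloadLength));
  }
  if (isBinary) {
    header[0] |= 0x80; // high bit flags a binary payload
  }
  return header;
}

// e.g. a 300-byte text payload → [126, 1, 44] (300 === 256 + 44)
```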

packages/engine.io-parser/package-lock.json (generated, 20920 lines): file diff suppressed because it is too large.


@@ -0,0 +1,59 @@
{
  "name": "engine.io-parser",
  "description": "Parser for the client for the realtime Engine",
  "license": "MIT",
  "version": "5.2.2",
  "main": "./build/cjs/index.js",
  "module": "./build/esm/index.js",
  "exports": {
    "import": "./build/esm/index.js",
    "require": "./build/cjs/index.js"
  },
  "types": "build/esm/index.d.ts",
  "homepage": "https://github.com/socketio/engine.io-parser",
  "devDependencies": {
    "@babel/core": "~7.9.6",
    "@babel/preset-env": "~7.9.6",
    "@types/mocha": "^9.0.0",
    "@types/node": "^16.9.6",
    "babelify": "^10.0.0",
    "benchmark": "^2.1.4",
    "expect.js": "0.3.1",
    "mocha": "^5.2.0",
    "nyc": "~15.0.1",
    "prettier": "^3.2.5",
    "rimraf": "^3.0.2",
    "socket.io-browsers": "^1.0.4",
    "ts-node": "^10.2.1",
    "tsify": "^5.0.4",
    "typescript": "^4.4.3",
    "zuul": "3.11.1",
    "zuul-ngrok": "4.0.0"
  },
  "scripts": {
    "compile": "rimraf ./build && tsc && tsc -p tsconfig.esm.json && ./postcompile.sh",
    "test": "npm run format:check && npm run compile && if test \"$BROWSERS\" = \"1\" ; then npm run test:browser; else npm run test:node; fi",
    "test:node": "nyc mocha -r ts-node/register test/index.ts",
    "test:browser": "zuul test/index.ts --no-coverage",
    "format:check": "prettier --check 'lib/**/*.ts' 'test/**/*.ts'",
    "format:fix": "prettier --write 'lib/**/*.ts' 'test/**/*.ts'",
    "prepack": "npm run compile"
  },
  "repository": {
    "type": "git",
    "url": "git@github.com:socketio/engine.io-parser.git"
  },
  "files": [
    "build/"
  ],
  "browser": {
    "./test/node": "./test/browser",
    "./build/esm/encodePacket.js": "./build/esm/encodePacket.browser.js",
    "./build/esm/decodePacket.js": "./build/esm/decodePacket.browser.js",
    "./build/cjs/encodePacket.js": "./build/cjs/encodePacket.browser.js",
    "./build/cjs/decodePacket.js": "./build/cjs/decodePacket.browser.js"
  },
  "engines": {
    "node": ">=10.0.0"
  }
}


@@ -0,0 +1,4 @@
#!/bin/bash
cp ./support/package.cjs.json ./build/cjs/package.json
cp ./support/package.esm.json ./build/esm/package.json


@@ -0,0 +1,8 @@
{
"name": "engine.io-parser",
"type": "commonjs",
"browser": {
"./encodePacket.js": "./encodePacket.browser.js",
"./decodePacket.js": "./decodePacket.browser.js"
}
}


@@ -0,0 +1,8 @@
{
"name": "engine.io-parser",
"type": "module",
"browser": {
"./encodePacket.js": "./encodePacket.browser.js",
"./decodePacket.js": "./decodePacket.browser.js"
}
}


@@ -0,0 +1,12 @@
const parser = require('.');
parser.encodePayload([
{
type: 'message',
data: '€',
},
{
type: 'message',
data: Buffer.from([1, 2, 3, 4]),
},
], true, console.log);
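The script above feeds `encodePayload` one string packet and one binary packet. On the wire, the individually encoded packets are joined by the 0x1e record separator, and binary data is base64-encoded behind a `b` prefix (see the `"4test\x1ebAQIDBA=="` fixtures in the tests). A minimal sketch, assuming a hypothetical `encodePayloadSketch` helper and Node's `Buffer` for base64:

```typescript
// The payload format joins already-encoded packets with the
// ASCII record separator (0x1e).
const RECORD_SEPARATOR = String.fromCharCode(30); // "\x1e"

// Hypothetical helper, not the library's encodePayload.
function encodePayloadSketch(encodedPackets: string[]): string {
  return encodedPackets.join(RECORD_SEPARATOR);
}

// Binary packets are base64-encoded and prefixed with "b".
const binary = "b" + Buffer.from([1, 2, 3, 4]).toString("base64");
console.log(encodePayloadSketch(["4test", binary])); // "4test" + "\x1e" + "bAQIDBA=="
```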


@@ -0,0 +1,154 @@
import {
decodePacket,
decodePayload,
encodePacket,
encodePayload,
createPacketEncoderStream,
createPacketDecoderStream,
Packet,
} from "..";
import * as expect from "expect.js";
import { areArraysEqual, createArrayBuffer } from "./util";
const withNativeArrayBuffer = typeof ArrayBuffer === "function";
describe("engine.io-parser (browser only)", () => {
describe("single packet", () => {
if (withNativeArrayBuffer) {
it("should encode/decode an ArrayBuffer", (done) => {
const packet: Packet = {
type: "message",
data: createArrayBuffer([1, 2, 3, 4]),
};
encodePacket(packet, true, (encodedPacket) => {
expect(encodedPacket).to.be.an(ArrayBuffer);
expect(areArraysEqual(encodedPacket, packet.data)).to.be(true);
const decodedPacket = decodePacket(encodedPacket, "arraybuffer");
expect(decodedPacket.type).to.eql("message");
expect(areArraysEqual(decodedPacket.data, packet.data)).to.be(true);
done();
});
});
it("should encode/decode an ArrayBuffer as base64", (done) => {
const packet: Packet = {
type: "message",
data: createArrayBuffer([1, 2, 3, 4]),
};
encodePacket(packet, false, (encodedPacket) => {
expect(encodedPacket).to.eql("bAQIDBA==");
const decodedPacket = decodePacket(encodedPacket, "arraybuffer");
expect(decodedPacket.type).to.eql(packet.type);
expect(decodedPacket.data).to.be.an(ArrayBuffer);
expect(areArraysEqual(decodedPacket.data, packet.data)).to.be(true);
done();
});
});
it("should encode a typed array", (done) => {
const buffer = createArrayBuffer([1, 2, 3, 4]);
const data = new Int8Array(buffer, 1, 2);
encodePacket({ type: "message", data }, true, (encodedPacket) => {
expect(encodedPacket).to.eql(data); // unmodified typed array
done();
});
});
}
if (typeof Blob === "function") {
it("should encode/decode a Blob", (done) => {
const packet: Packet = {
type: "message",
data: new Blob(["1234", createArrayBuffer([1, 2, 3, 4])]),
};
encodePacket(packet, true, (encodedPacket) => {
expect(encodedPacket).to.be.a(Blob);
const decodedPacket = decodePacket(encodedPacket, "blob");
expect(decodedPacket.type).to.eql("message");
expect(decodedPacket.data).to.be.a(Blob);
done();
});
});
it("should encode/decode a Blob as base64", (done) => {
const packet: Packet = {
type: "message",
data: new Blob(["1234", createArrayBuffer([1, 2, 3, 4])]),
};
encodePacket(packet, false, (encodedPacket) => {
expect(encodedPacket).to.eql("bMTIzNAECAwQ=");
const decodedPacket = decodePacket(encodedPacket, "blob");
expect(decodedPacket.type).to.eql("message");
expect(decodedPacket.data).to.be.a(Blob);
done();
});
});
}
});
describe("payload", () => {
if (withNativeArrayBuffer) {
it("should encode/decode a string + ArrayBuffer payload", (done) => {
const packets: Packet[] = [
{ type: "message", data: "test" },
{ type: "message", data: createArrayBuffer([1, 2, 3, 4]) },
];
encodePayload(packets, (payload) => {
expect(payload).to.eql("4test\x1ebAQIDBA==");
expect(decodePayload(payload, "arraybuffer")).to.eql(packets);
done();
});
});
it("should encode/decode a string + a 0-length ArrayBuffer payload", (done) => {
const packets: Packet[] = [
{ type: "message", data: "test" },
{ type: "message", data: createArrayBuffer([]) },
];
encodePayload(packets, (payload) => {
expect(payload).to.eql("4test\x1eb");
expect(decodePayload(payload, "arraybuffer")).to.eql(packets);
done();
});
});
}
});
if (typeof TextEncoder === "function") {
describe("createPacketEncoderStream", () => {
it("should encode a binary packet (Blob)", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write({
type: "message",
data: new Blob([Uint8Array.from([1, 2, 3])]),
});
const header = await reader.read();
expect(header.value).to.eql(Uint8Array.of(131));
const payload = await reader.read();
expect(payload.value).to.eql(Uint8Array.of(1, 2, 3));
});
});
describe("createPacketDecoderStream", () => {
it("should decode a binary packet (Blob)", async () => {
const stream = createPacketDecoderStream(1e6, "blob");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(131, 1, 2, 3));
const { value } = await reader.read();
expect(value.type).to.eql("message");
expect(value.data).to.be.a(Blob);
});
});
}
});


@@ -0,0 +1,339 @@
import {
createPacketDecoderStream,
createPacketEncoderStream,
decodePacket,
decodePayload,
encodePacket,
encodePayload,
Packet,
} from "..";
import * as expect from "expect.js";
import { areArraysEqual } from "./util";
import "./node"; // replaced by "./browser" for the tests in the browser (see "browser" field in the package.json file)
describe("engine.io-parser", () => {
describe("single packet", () => {
it("should encode/decode a string", (done) => {
const packet: Packet = { type: "message", data: "test" };
encodePacket(packet, true, (encodedPacket) => {
expect(encodedPacket).to.eql("4test");
expect(decodePacket(encodedPacket)).to.eql(packet);
done();
});
});
it("should fail to decode a malformed packet", () => {
expect(decodePacket("")).to.eql({
type: "error",
data: "parser error",
});
expect(decodePacket("a123")).to.eql({
type: "error",
data: "parser error",
});
});
});
describe("payload", () => {
it("should encode/decode all packet types", (done) => {
const packets: Packet[] = [
{ type: "open" },
{ type: "close" },
{ type: "ping", data: "probe" },
{ type: "pong", data: "probe" },
{ type: "message", data: "test" },
];
encodePayload(packets, (payload) => {
expect(payload).to.eql("0\x1e1\x1e2probe\x1e3probe\x1e4test");
expect(decodePayload(payload)).to.eql(packets);
done();
});
});
it("should fail to decode a malformed payload", () => {
expect(decodePayload("{")).to.eql([
{ type: "error", data: "parser error" },
]);
expect(decodePayload("{}")).to.eql([
{ type: "error", data: "parser error" },
]);
expect(decodePayload('["a123", "a456"]')).to.eql([
{ type: "error", data: "parser error" },
]);
});
});
// note: `describe("", function() { this.skip() } );` was added in mocha@10, which has dropped support for Node.js 10
if (typeof TransformStream === "function") {
describe("createPacketEncoderStream", () => {
it("should encode a plaintext packet", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write({
type: "message",
data: "1€",
});
const header = await reader.read();
expect(header.value).to.eql(Uint8Array.of(5));
const payload = await reader.read();
expect(payload.value).to.eql(Uint8Array.of(52, 49, 226, 130, 172));
});
it("should encode a binary packet (Uint8Array)", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
const data = Uint8Array.of(1, 2, 3);
writer.write({
type: "message",
data,
});
const header = await reader.read();
expect(header.value).to.eql(Uint8Array.of(131));
const payload = await reader.read();
expect(payload.value === data).to.be(true);
});
it("should encode a binary packet (ArrayBuffer)", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write({
type: "message",
data: Uint8Array.of(1, 2, 3).buffer,
});
const header = await reader.read();
expect(header.value).to.eql(Uint8Array.of(131));
const payload = await reader.read();
expect(payload.value).to.eql(Uint8Array.of(1, 2, 3));
});
it("should encode a binary packet (Uint16Array)", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write({
type: "message",
data: Uint16Array.from([1, 2, 257]),
});
const header = await reader.read();
expect(header.value).to.eql(Uint8Array.of(134));
const payload = await reader.read();
expect(payload.value).to.eql(Uint8Array.of(1, 0, 2, 0, 1, 1));
});
it("should encode a binary packet (Uint8Array - medium)", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
const data = new Uint8Array(12345);
writer.write({
type: "message",
data,
});
const header = await reader.read();
expect(header.value).to.eql(Uint8Array.of(254, 48, 57));
const payload = await reader.read();
expect(payload.value === data).to.be(true);
});
it("should encode a binary packet (Uint8Array - big)", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
const data = new Uint8Array(123456789);
writer.write({
type: "message",
data,
});
const header = await reader.read();
expect(header.value).to.eql(
Uint8Array.of(255, 0, 0, 0, 0, 7, 91, 205, 21),
);
const payload = await reader.read();
expect(payload.value === data).to.be(true);
});
});
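The header bytes asserted above (`131`, `254, 48, 57`, `255, 0, 0, 0, 0, 7, 91, 205, 21`) follow a WebSocket-style length prefix: lengths below 126 fit in the first byte, a first byte of 126 introduces a 16-bit big-endian length, 127 introduces a 64-bit one, and bit 0x80 flags a binary packet. A minimal sketch inferred from those expected values, assuming a hypothetical `encodeHeader` helper rather than the stream's actual code:

```typescript
// Hypothetical helper: build the framing header for a payload of
// `length` bytes. Inferred from the test fixtures above.
function encodeHeader(length: number, isBinary: boolean): Uint8Array {
  const binaryBit = isBinary ? 0x80 : 0;
  if (length < 126) {
    // length fits directly in the first byte
    return Uint8Array.of(binaryBit | length);
  }
  if (length < 65536) {
    // 126 => a 16-bit big-endian length follows
    return Uint8Array.of(binaryBit | 126, length >> 8, length & 0xff);
  }
  // 127 => a 64-bit big-endian length follows
  const header = new Uint8Array(9);
  header[0] = binaryBit | 127;
  let remaining = length;
  for (let i = 8; i >= 1; i--) {
    header[i] = remaining % 256; // avoid 32-bit bitwise truncation
    remaining = Math.floor(remaining / 256);
  }
  return header;
}

console.log(encodeHeader(3, true));     // Uint8Array [ 131 ]
console.log(encodeHeader(12345, true)); // Uint8Array [ 254, 48, 57 ]
```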
describe("createPacketDecoderStream", () => {
it("should decode a plaintext packet", async () => {
const stream = createPacketDecoderStream(1e6, "arraybuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(5));
writer.write(Uint8Array.of(52, 49, 226, 130, 172));
const packet = await reader.read();
expect(packet.value).to.eql({
type: "message",
data: "1€",
});
});
it("should decode a plaintext packet (bytes by bytes)", async () => {
const stream = createPacketDecoderStream(1e6, "arraybuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(5));
writer.write(Uint8Array.of(52));
writer.write(Uint8Array.of(49));
writer.write(Uint8Array.of(226));
writer.write(Uint8Array.of(130));
writer.write(Uint8Array.of(172));
writer.write(Uint8Array.of(1));
writer.write(Uint8Array.of(50));
writer.write(Uint8Array.of(1));
writer.write(Uint8Array.of(51));
const { value } = await reader.read();
expect(value).to.eql({ type: "message", data: "1€" });
const pingPacket = await reader.read();
expect(pingPacket.value).to.eql({ type: "ping" });
const pongPacket = await reader.read();
expect(pongPacket.value).to.eql({ type: "pong" });
});
it("should decode a plaintext packet (all bytes at once)", async () => {
const stream = createPacketDecoderStream(1e6, "arraybuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(5, 52, 49, 226, 130, 172, 1, 50, 1, 51));
const { value } = await reader.read();
expect(value).to.eql({ type: "message", data: "1€" });
const pingPacket = await reader.read();
expect(pingPacket.value).to.eql({ type: "ping" });
const pongPacket = await reader.read();
expect(pongPacket.value).to.eql({ type: "pong" });
});
it("should decode a binary packet (ArrayBuffer)", async () => {
const stream = createPacketDecoderStream(1e6, "arraybuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(131, 1, 2, 3));
const { value } = await reader.read();
expect(value.type).to.eql("message");
expect(value.data).to.be.an(ArrayBuffer);
expect(areArraysEqual(value.data, Uint8Array.of(1, 2, 3)));
});
it("should decode a binary packet (ArrayBuffer) (medium)", async () => {
const stream = createPacketDecoderStream(1e6, "arraybuffer");
const payload = new Uint8Array(12345);
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(254));
writer.write(Uint8Array.of(48, 57));
writer.write(payload);
const { value } = await reader.read();
expect(value.type).to.eql("message");
expect(value.data).to.be.an(ArrayBuffer);
expect(areArraysEqual(value.data, payload));
});
it("should decode a binary packet (ArrayBuffer) (big)", async () => {
const stream = createPacketDecoderStream(1e10, "arraybuffer");
const payload = new Uint8Array(123456789);
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(255));
writer.write(Uint8Array.of(0, 0, 0, 0, 7, 91, 205, 21));
writer.write(payload);
const { value } = await reader.read();
expect(value.type).to.eql("message");
expect(value.data).to.be.an(ArrayBuffer);
expect(areArraysEqual(value.data, payload));
});
it("should return an error packet if the length of the payload is too big", async () => {
const stream = createPacketDecoderStream(10, "arraybuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(11));
const packet = await reader.read();
expect(packet.value).to.eql({ type: "error", data: "parser error" });
});
it("should return an error packet if the length of the payload is invalid", async () => {
const stream = createPacketDecoderStream(1e6, "arraybuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(0));
const packet = await reader.read();
expect(packet.value).to.eql({ type: "error", data: "parser error" });
});
it("should return an error packet if the length of the payload is bigger than Number.MAX_SAFE_INTEGER", async () => {
const stream = createPacketDecoderStream(1e6, "arraybuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(255, 1, 0, 0, 0, 0, 0, 0, 0, 0));
const packet = await reader.read();
expect(packet.value).to.eql({ type: "error", data: "parser error" });
});
});
}
});
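The decoder error cases above reduce to three checks on the announced payload length: zero, above `maxPayload`, and beyond `Number.MAX_SAFE_INTEGER`. A minimal sketch of that validation, assuming a hypothetical `isValidLength` helper (the real stream enqueues `{ type: "error", data: "parser error" }` instead of returning a boolean):

```typescript
// Hypothetical helper mirroring the decoder's length guards.
function isValidLength(expectedLength: number, maxPayload: number): boolean {
  return (
    expectedLength > 0 &&                // a 0-length header is malformed
    expectedLength <= maxPayload &&      // reject oversized payloads early
    Number.isSafeInteger(expectedLength) // 64-bit lengths can exceed 2^53 - 1
  );
}

console.log(isValidLength(11, 10));   // false: larger than maxPayload
console.log(isValidLength(0, 1e6));   // false: zero-length header
console.log(isValidLength(100, 1e6)); // true
```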


@@ -0,0 +1,150 @@
import {
decodePacket,
decodePayload,
encodePacket,
encodePayload,
Packet,
createPacketDecoderStream,
createPacketEncoderStream,
} from "..";
import * as expect from "expect.js";
import { areArraysEqual } from "./util";
describe("engine.io-parser (node.js only)", () => {
describe("single packet", () => {
it("should encode/decode a Buffer", (done) => {
const packet: Packet = {
type: "message",
data: Buffer.from([1, 2, 3, 4]),
};
encodePacket(packet, true, (encodedPacket) => {
expect(encodedPacket).to.eql(packet.data); // noop
expect(decodePacket(encodedPacket)).to.eql(packet);
done();
});
});
it("should encode/decode a Buffer as base64", (done) => {
const packet: Packet = {
type: "message",
data: Buffer.from([1, 2, 3, 4]),
};
encodePacket(packet, false, (encodedPacket) => {
expect(encodedPacket).to.eql("bAQIDBA==");
expect(decodePacket(encodedPacket, "nodebuffer")).to.eql(packet);
done();
});
});
it("should encode/decode an ArrayBuffer", (done) => {
const packet: Packet = {
type: "message",
data: Int8Array.from([1, 2, 3, 4]).buffer,
};
encodePacket(packet, true, (encodedPacket) => {
expect(encodedPacket === packet.data).to.be(true);
const decodedPacket = decodePacket(encodedPacket, "arraybuffer");
expect(decodedPacket.type).to.eql(packet.type);
expect(decodedPacket.data).to.be.an(ArrayBuffer);
expect(areArraysEqual(decodedPacket.data, packet.data)).to.be(true);
done();
});
});
it("should encode/decode an ArrayBuffer as base64", (done) => {
const packet: Packet = {
type: "message",
data: Int8Array.from([1, 2, 3, 4]).buffer,
};
encodePacket(packet, false, (encodedPacket) => {
expect(encodedPacket).to.eql("bAQIDBA==");
const decodedPacket = decodePacket(encodedPacket, "arraybuffer");
expect(decodedPacket.type).to.eql(packet.type);
expect(decodedPacket.data).to.be.an(ArrayBuffer);
expect(areArraysEqual(decodedPacket.data, packet.data)).to.be(true);
done();
});
});
it("should encode a typed array", (done) => {
const packet: Packet = {
type: "message",
data: Int16Array.from([257, 258, 259, 260]),
};
encodePacket(packet, true, (encodedPacket) => {
expect(encodedPacket === packet.data).to.be(true);
done();
});
});
it("should encode a typed array (with offset and length)", (done) => {
const buffer = Int8Array.from([1, 2, 3, 4]).buffer;
const data = new Int8Array(buffer, 1, 2);
encodePacket({ type: "message", data }, true, (encodedPacket) => {
expect(encodedPacket).to.eql(Buffer.from([2, 3]));
done();
});
});
it("should decode an ArrayBuffer as ArrayBuffer", () => {
const encodedPacket = Int8Array.from([1, 2, 3, 4]).buffer;
const decodedPacket = decodePacket(encodedPacket, "arraybuffer");
expect(decodedPacket.type).to.eql("message");
expect(decodedPacket.data).to.be.an(ArrayBuffer);
expect(areArraysEqual(decodedPacket.data, encodedPacket)).to.be(true);
});
});
describe("payload", () => {
it("should encode/decode a string + Buffer payload", (done) => {
const packets: Packet[] = [
{ type: "message", data: "test" },
{ type: "message", data: Buffer.from([1, 2, 3, 4]) },
];
encodePayload(packets, (payload) => {
expect(payload).to.eql("4test\x1ebAQIDBA==");
expect(decodePayload(payload, "nodebuffer")).to.eql(packets);
done();
});
});
});
if (typeof TransformStream === "function") {
describe("createPacketEncoderStream", () => {
it("should encode a binary packet (Buffer)", async () => {
const stream = createPacketEncoderStream();
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write({
type: "message",
data: Buffer.of(1, 2, 3),
});
const header = await reader.read();
expect(header.value).to.eql(Uint8Array.of(131));
const payload = await reader.read();
expect(payload.value).to.eql(Uint8Array.of(1, 2, 3));
});
});
describe("createPacketDecoderStream", () => {
it("should decode a binary packet (Buffer)", async () => {
const stream = createPacketDecoderStream(1e6, "nodebuffer");
const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();
writer.write(Uint8Array.of(131, 1, 2, 3));
const { value } = await reader.read();
expect(value.type).to.eql("message");
expect(Buffer.isBuffer(value.data)).to.be(true);
expect(value.data).to.eql(Buffer.from([1, 2, 3]));
});
});
}
});


@@ -0,0 +1,21 @@
const areArraysEqual = (x, y) => {
if (x.byteLength !== y.byteLength) return false;
const xView = new Uint8Array(x),
yView = new Uint8Array(y);
for (let i = 0; i < x.byteLength; i++) {
if (xView[i] !== yView[i]) return false;
}
return true;
};
const createArrayBuffer = (array) => {
// Uint8Array.from() is not defined in IE 10/11
const arrayBuffer = new ArrayBuffer(array.length);
const view = new Uint8Array(arrayBuffer);
for (let i = 0; i < array.length; i++) {
view[i] = array[i];
}
return arrayBuffer;
};
export { areArraysEqual, createArrayBuffer };


@@ -0,0 +1,12 @@
{
"compilerOptions": {
"outDir": "build/esm/",
"target": "es2018",
"module": "esnext",
"moduleResolution": "node",
"declaration": true
},
"include": [
"./lib/**/*"
]
}


@@ -0,0 +1,11 @@
{
"compilerOptions": {
"outDir": "build/cjs/",
"target": "es2018", // Node.js 10 (https://github.com/microsoft/TypeScript/wiki/Node-Target-Mapping)
"module": "commonjs",
"declaration": true
},
"include": [
"./lib/**/*"
]
}


@@ -0,0 +1,44 @@
"use strict";
const browsers = require("socket.io-browsers");
const zuulConfig = (module.exports = {
ui: "mocha-bdd",
// test on localhost by default
local: true,
open: true,
concurrency: 2, // ngrok only accepts two tunnels by default
// if the browser does not send output within 120s of the last output,
// stop testing: something is wrong
browser_output_timeout: 120 * 1000,
browser_open_timeout: 60 * 4 * 1000,
// we want to be notified something is wrong asap, so no retry
browser_retries: 1,
browserify: [
{
plugin: ["tsify", {
target: "es5"
}],
transform: {
name: "babelify",
presets: ["@babel/preset-env"]
}
}
]
});
if (process.env.CI === "true") {
zuulConfig.local = false;
zuulConfig.tunnel = {
type: "ngrok",
bind_tls: true
};
}
const isPullRequest =
process.env.TRAVIS_PULL_REQUEST &&
process.env.TRAVIS_PULL_REQUEST !== "false";
zuulConfig.browsers = isPullRequest ? browsers.pullRequest : browsers.all;