Mirror of https://github.com/meteor/meteor.git (synced 2026-05-02 03:01:46 -04:00)

Merge pull request #13681 from meteor/release-3.3

Release 3.3 (rc.0 available 🚀)
```diff
@@ -115,6 +115,9 @@ build_machine_environment:
     NUM_GROUPS: 12
     RUNNING_AVG_LENGTH: 6
+
+    # Force modern bundler test
+    METEOR_MODERN: true

 jobs:
   Get Ready:
     <<: *build_machine_environment
```
```diff
@@ -18,6 +18,7 @@ env:
   TIMEOUT_SCALE_FACTOR: 20
   METEOR_HEADLESS: true
   SELF_TEST_EXCLUDE: '^NULL-LEAVE-THIS-HERE-NULL$'
+  METEOR_MODERN: true

 jobs:
   test:
```
```diff
@@ -4,7 +4,7 @@ dist: jammy
 sudo: required
 services: xvfb
 node_js:
-  - "22.14.0"
+  - "22.16.0"
 cache:
   directories:
     - ".meteor"
@@ -17,6 +17,7 @@ env:
   - phantom=false
   - PUPPETEER_DOWNLOAD_PATH=~/.npm/chromium
   - TEST_PACKAGES_EXCLUDE=stylus
+  - METEOR_MODERN=true
 addons:
   apt:
     sources:
```
```diff
@@ -213,6 +213,7 @@ redirects:
   /#/full/accounts-setusername: 'api/passwords.html#accounts-setusername'
   /#/full/accounts-addemail: 'api/passwords.html#accounts-addemail'
   /#/full/accounts-removeemail: 'api/passwords.html#accounts-removeemail'
+  /#/full/accounts_replaceemail: 'api/passwords.html#Accounts-replaceEmail'
   /#/full/accounts_verifyemail: 'api/passwords.html#Accounts-verifyEmail'
   /#/full/accounts-finduserbyusername: 'api/passwords.html#accounts-finduserbyusername'
   /#/full/accounts-finduserbyemail: 'api/passwords.html#accounts-finduserbyemail'
```
```diff
@@ -55,6 +55,7 @@
 - `Accounts.sendVerificationEmail`
 - `Accounts.addEmail`
 - `Accounts.removeEmail`
+- `Accounts.replaceEmailAsync`
 - `Accounts.verifyEmail`
 - `Accounts.createUserVerifyingEmail`
 - `Accounts.createUser`
```
docs/history.md (106 changes)
@@ -8,6 +8,112 @@

[//]: # (go to meteor/docs/generators/changelog/docs)

## v3.3.0, 2025-06-11

### Highlights

- Support SWC transpiler and minifier for faster dev and builds [PR#13657](https://github.com/meteor/meteor/pull/13657), [PR#13715](https://github.com/meteor/meteor/pull/13715)
- Switch to `@parcel/watcher` for improved native file watching [PR#13699](https://github.com/meteor/meteor/pull/13699), [#13707](https://github.com/meteor/meteor/pull/13707)
- Default to modern architecture, skip legacy processing [PR#13665](https://github.com/meteor/meteor/pull/13665), [PR#13698](https://github.com/meteor/meteor/pull/13698)
- Optimize SQLite for faster startup and better performance [PR#13702](https://github.com/meteor/meteor/pull/13702)
- Support CPU profiling in Meteor 3 bundler [PR#13650](https://github.com/meteor/meteor/pull/13650)
- Improve `meteor profile`: show rebuild steps and total, support `--build` [PR#16](https://github.com/meteor/performance/pull/16), [PR#13694](https://github.com/meteor/meteor/pull/13694)
- Improve `useFind` and `useSubscribe` React hooks
- Add `replaceEmailAsync` helper to Accounts [PR#13677](https://github.com/meteor/meteor/pull/13677)
- Fix user agent detection and oplog collection filtering
- Refine type definitions for Meteor methods and SSR's ServerSink
- Allow opting out of usage stats with `DO_NOT_TRACK`
- Update Node to 22.16.0 and Express to 5.1.0

All merged PRs: [GitHub PRs 3.3](https://github.com/meteor/meteor/pulls?q=is%3Apr+is%3Amerged+base%3Arelease-3.3)

React packages changelog: [react-meteor-data@4.0.0](https://github.com/meteor/react-packages/tree/master/packages/react-meteor-data/CHANGELOG.md#v400-2025-06-11)

#### Breaking Changes

- File watching strategy switched to `@parcel/watcher`
  - Most setups should be fine, but if issues appear (for example when using WSL with the host filesystem, volumes, or remote setups), switch to polling.
  - Set `METEOR_WATCH_FORCE_POLLING=true` to enable polling.
  - Set `METEOR_WATCH_POLLING_INTERVAL_MS=1000` to adjust the interval.

- `react-meteor-data@4.0.0`
  - Independent from the core; only applies if upgraded manually.
  - `useFind` uses no deps by default [PR#431](https://github.com/meteor/react-packages/pull/431)

#### Internal API changes

- `express@5.1.0`, the dependency used by Meteor's `webapp` package.
  - Deprecates non-native promise usage [#154](https://github.com/pillarjs/router/pull/154)
  - Use `async/await` or `Promise.resolve` when defining endpoints to avoid deprecation warnings.

#### Migration Steps

Run the following command to update your project:

```bash
meteor update --release 3.3
```

To apply the react-meteor-data changes:

```bash
meteor add react-meteor-data@4.0.0
```

**Add this to your `package.json` to enable the new modern build stack:**

```json
"meteor": {
  "modern": true
}
```

> These settings are on by default for new apps.
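For finer-grained control, the compiler changes in this release also read per-target exclusion and verbosity flags under `modern.transpiler`. A hypothetical `package.json` sketch (key names are taken from the `babel-compiler` code in this commit; the values here are purely illustrative):

```json
"meteor": {
  "modern": {
    "transpiler": {
      "verbose": true,
      "excludeApp": ["server/legacy-scripts/"],
      "excludeNodeModules": true,
      "excludeLegacy": true
    }
  }
}
```

Files matched by an exclusion fall back to Babel instead of SWC.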
#### Bumped Meteor Packages

- accounts-base@3.1.1
- accounts-password@3.2.0
- autoupdate@2.0.1
- babel-compiler@7.12.0
- boilerplate-generator@2.0.1
- ddp-client@3.1.1
- ecmascript@0.16.11
- ejson@1.1.5
- meteor@2.1.1
- minifier-js@3.0.2
- modern-browsers@0.2.2
- mongo@2.1.2
- server-render@0.4.3
- socket-stream-client@0.6.1
- standard-minifier-js@3.1.0
- typescript@5.6.4
- webapp@2.0.7
- meteor-tool@3.3.0

#### Bumped NPM Packages

- meteor-node-stubs@1.2.17

#### Special thanks to

✨✨✨

- [@nachocodoner](https://github.com/nachocodoner)
- [@italojs](https://github.com/italojs)
- [@Grubba27](https://github.com/Grubba27)
- [@zodern](https://github.com/zodern)
- [@9Morello](https://github.com/9Morello)
- [@welkinwong](https://github.com/welkinwong)
- [@Poyoman39](https://github.com/Poyoman39)
- [@PedroMarianoAlmeida](https://github.com/PedroMarianoAlmeida)
- [@harryadel](https://github.com/harryadel)
- [@ericm546](https://github.com/ericm546)
- [@StorytellerCZ](https://github.com/StorytellerCZ)

✨✨✨

## v3.0.1, 2024-07-16

### Highlights
```diff
@@ -59,6 +59,8 @@ By default, an email address is added with `{ verified: false }`. Use
 [`Accounts.sendVerificationEmail`](#Accounts-sendVerificationEmail) to send an
 email with a link the user can use to verify their email address.

+{% apibox "Accounts.replaceEmailAsync" %}
+
 {% apibox "Accounts.removeEmail" %}

 {% apibox "Accounts.verifyEmail" %}
```
meteor (3 changes)

```diff
@@ -1,7 +1,6 @@
 #!/usr/bin/env bash

-BUNDLE_VERSION=22.14.0.4
+BUNDLE_VERSION=22.16.0.1

 # OS Check. Put here because here is where we download the precompiled
 # bundles that are arch specific.
```
packages/accounts-base/accounts-base.d.ts (vendored, 2 changes)

```diff
@@ -188,6 +188,8 @@ export namespace Accounts {
   function removeEmail(userId: string, email: string): Promise<void>;

+  function replaceEmailAsync(userId: string, oldEmail: string, newEmail: string, verified?: boolean): Promise<void>;
+
   function onCreateUser(
     func: (options: { profile?: {} | undefined }, user: Meteor.User) => void
   ): void;
```
```diff
@@ -1,6 +1,6 @@
 Package.describe({
   summary: "A user account system",
-  version: "3.1.0",
+  version: "3.1.1",
 });

 Package.onUse((api) => {
```
```diff
@@ -5,7 +5,7 @@ Package.describe({
   // 2.2.x in the future. The version was also bumped to 2.0.0 temporarily
   // during the Meteor 1.5.1 release process, so versions 2.0.0-beta.2
   // through -beta.5 and -rc.0 have already been published.
-  version: "3.1.0",
+  version: "3.2.0",
 });

 Npm.depends({
```
```diff
@@ -1022,6 +1022,52 @@ Meteor.methods(
   }
 });

+/**
+ * @summary Asynchronously replace an email address for a user. Use this instead of directly
+ * updating the database. The operation will fail if there is a different user
+ * with an email only differing in case. If the specified user has an existing
+ * email only differing in case however, we replace it.
+ * @locus Server
+ * @param {String} userId The ID of the user to update.
+ * @param {String} oldEmail The email address to replace.
+ * @param {String} newEmail The new email address to use.
+ * @param {Boolean} [verified] Optional - whether the new email address should
+ *   be marked as verified. Defaults to false.
+ * @importFromPackage accounts-base
+ */
+Accounts.replaceEmailAsync = async (userId, oldEmail, newEmail, verified) => {
+  check(userId, NonEmptyString);
+  check(oldEmail, NonEmptyString);
+  check(newEmail, NonEmptyString);
+  check(verified, Match.Optional(Boolean));
+
+  if (verified === void 0) {
+    verified = false;
+  }
+
+  const user = await getUserById(userId, { fields: { _id: 1 } });
+  if (!user)
+    throw new Meteor.Error(403, "User not found");
+
+  // Ensure no user already has this new email
+  await Accounts._checkForCaseInsensitiveDuplicates(
+    "emails.address",
+    "Email",
+    newEmail,
+    user._id
+  );
+
+  const result = await Meteor.users.updateAsync(
+    { _id: user._id, 'emails.address': oldEmail },
+    { $set: { 'emails.$.address': newEmail, 'emails.$.verified': verified } }
+  );
+
+  if (result.modifiedCount === 0) {
+    throw new Meteor.Error(404, "No user could be found with old email");
+  }
+};
+
 /**
  * @summary Asynchronously add an email address for a user. Use this instead of directly
  * updating the database. The operation will fail if there is a different user
```
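The update in `replaceEmailAsync` relies on Mongo's positional `$` operator: the query matches both the user `_id` and the specific `emails.address` entry, and `$set` rewrites exactly the matched array element. A minimal in-memory sketch of that behavior (an illustrative stand-in, not Meteor's API):

```javascript
// Simulate the positional-$ update used by replaceEmailAsync: match a user
// by _id plus a specific emails.address, then rewrite exactly that entry.
function replaceEmailInDoc(user, oldEmail, newEmail, verified = false) {
  // "$" resolves to the index of the array element matched by the query
  const i = user.emails.findIndex(e => e.address === oldEmail);
  if (i === -1) return { modifiedCount: 0 }; // mirrors the 404 branch above
  user.emails[i] = { address: newEmail, verified };
  return { modifiedCount: 1 };
}

const user = { _id: 'u1', emails: [{ address: 'old@example.com', verified: true }] };
const res = replaceEmailInDoc(user, 'old@example.com', 'new@example.com');
// res.modifiedCount === 1; the entry is now { address: 'new@example.com', verified: false }
```

Note that `verified` resets to `false` unless explicitly passed, matching the implementation's default.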
```diff
@@ -1789,7 +1789,30 @@ if (Meteor.isServer) (() => {
     ]);
   });

+  Tinytest.addAsync("accounts emails - replace email", async test => {
+    const origEmail = `originalemail@test.com`;
+    const userId = await Accounts.createUserAsync({
+      email: origEmail,
+      password: 'password'
+    });
+
+    const newEmail = `newemail@test.com`;
+
+    const u1 = await Accounts._findUserByQuery({ id: userId })
+    test.equal(u1.emails, [
+      { address: origEmail, verified: false }
+    ]);
+
+    await Accounts.replaceEmailAsync(userId, origEmail, newEmail);
+    const u2 = await Accounts._findUserByQuery({ id: userId })
+    test.equal(u2.emails, [
+      { address: newEmail, verified: false }
+    ]);
+  })
+
   Tinytest.addAsync("passwords - remove email",
     async test => {
       const origEmail = `${ Random.id() }@turing.com`;
       const userId = await Accounts.createUser({
```
```diff
@@ -25,6 +25,7 @@
 // The ID of each document is the client architecture, and the fields of
 // the document are the versions described above.

+import { onMessage } from "meteor/inter-process-messaging";
 import { ClientVersions } from "./client_versions.js";

 export const Autoupdate = __meteor_runtime_config__.autoupdate = {
@@ -152,7 +153,6 @@ function enqueueVersionsRefresh() {

 const setupListeners = () => {
   // Listen for messages pertaining to the client-refresh topic.
-  import { onMessage } from "meteor/inter-process-messaging";
   onMessage("client-refresh", enqueueVersionsRefresh);

   // Another way to tell the process to refresh: send SIGHUP signal
```
```diff
@@ -1,6 +1,6 @@
 Package.describe({
   summary: 'Update the client when new client code is available',
-  version: '2.0.0',
+  version: '2.0.1',
 });

 Package.onUse(function(api) {
```
```diff
@@ -1,14 +1,21 @@
 var semver = Npm.require("semver");
 var JSON5 = Npm.require("json5");
+var SWC = Npm.require("@meteorjs/swc-core");
+const reifyCompile = Npm.require("@meteorjs/reify/lib/compiler").compile;
+const reifyAcornParse = Npm.require("@meteorjs/reify/lib/parsers/acorn").parse;
+var fs = Npm.require('fs');
+var path = Npm.require('path');
+var vm = Npm.require('vm');
+var crypto = Npm.require('crypto');

 /**
  * A compiler that can be instantiated with features and used inside
  * Plugin.registerCompiler
  * @param {Object} extraFeatures The same object that getDefaultOptions takes
  */
-BabelCompiler = function BabelCompiler(extraFeatures, modifyBabelConfig) {
+BabelCompiler = function BabelCompiler(extraFeatures, modifyConfig) {
   this.extraFeatures = extraFeatures;
-  this.modifyBabelConfig = modifyBabelConfig;
+  this.modifyConfig = modifyConfig;
   this._babelrcCache = null;
   this._babelrcWarnings = Object.create(null);
   this.cacheDirectory = null;
@@ -18,18 +25,166 @@ var BCp = BabelCompiler.prototype;
 var excludedFileExtensionPattern = /\.(es5|min)\.js$/i;
 var hasOwn = Object.prototype.hasOwnProperty;

+// Check if verbose mode is enabled either in the provided config or in extraFeatures
+BCp.isVerbose = function(config) {
+  if (config?.modern?.transpiler?.verbose) {
+    return true;
+  }
+  if (config?.verbose) {
+    return true;
+  }
+  return !!this.extraFeatures?.verbose;
+};
+
 // There's no way to tell the current Meteor version, but we can infer
 // whether it's Meteor 1.4.4 or earlier by checking the Node version.
 var isMeteorPre144 = semver.lt(process.version, "4.8.1");

 var enableClientTLA = process.env.METEOR_ENABLE_CLIENT_TOP_LEVEL_AWAIT === 'true';

+function compileWithBabel(source, babelOptions, cacheOptions) {
+  return profile('Babel.compile', function () {
+    return Babel.compile(source, babelOptions, cacheOptions);
+  });
+}
+
+function compileWithSwc(source, swcOptions = {}, { features }) {
+  return profile('SWC.compile', function () {
+    // Perform SWC transformation.
+    const transformed = SWC.transformSync(source, swcOptions);
+
+    let content = transformed.code;
+
+    // Preserve Meteor-specific features: reify modules, nested imports, and top-level await support.
+    const result = reifyCompile(content, {
+      parse: reifyAcornParse,
+      generateLetDeclarations: false,
+      ast: false,
+      // Enforce reify options for proper compatibility.
+      avoidModernSyntax: true,
+      enforceStrictMode: false,
+      dynamicImport: true,
+      ...(features.topLevelAwait && { topLevelAwait: true }),
+      ...(features.compileForShell && { moduleAlias: 'module' }),
+      ...((features.modernBrowsers || features.nodeMajorVersion >= 8) && {
+        avoidModernSyntax: false,
+        generateLetDeclarations: true,
+      }),
+    });
+    content = result.code;
+
+    return {
+      code: content,
+      map: JSON.parse(transformed.map),
+      sourceType: 'module',
+    };
+  });
+}
+
+const DEFAULT_MODERN = {
+  transpiler: true,
+};
+
+const normalizeModern = (r = false) => Object.fromEntries(
+  Object.entries(DEFAULT_MODERN).map(([k, def]) => [
+    k,
+    r === true
+      ? def
+      : r === false || r?.[k] === false
+        ? false
+        : typeof r?.[k] === 'object'
+          ? { ...r[k] }
+          : def,
+  ]),
+);
```
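`normalizeModern` folds user config and the `METEOR_MODERN` override into a per-feature map: `true` expands to the defaults, `false` (or a per-key `false`) disables a feature, and an options object is passed through. A standalone copy of the same logic can be exercised directly:

```javascript
// Standalone copy of the normalization above: each feature key resolves to
// its default, to false, or to a user-supplied options object.
const DEFAULT_MODERN = { transpiler: true };

const normalizeModern = (r = false) => Object.fromEntries(
  Object.entries(DEFAULT_MODERN).map(([k, def]) => [
    k,
    r === true
      ? def
      : r === false || r?.[k] === false
        ? false
        : typeof r?.[k] === 'object'
          ? { ...r[k] }
          : def,
  ]),
);

console.log(normalizeModern(true));   // { transpiler: true }
console.log(normalizeModern(false));  // { transpiler: false }
console.log(normalizeModern({}));     // { transpiler: true }  (missing key falls back to default)
console.log(normalizeModern({ transpiler: { verbose: true } })); // { transpiler: { verbose: true } }
```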
```js
let modernForced = JSON.parse(process.env.METEOR_MODERN || "false");

let lastModifiedMeteorConfig;
let lastModifiedMeteorConfigTime;
BCp.initializeMeteorAppConfig = function () {
  if (!lastModifiedMeteorConfig && !fs.existsSync(`${getMeteorAppDir()}/package.json`)) {
    return;
  }
  const currentLastModifiedConfigTime = fs
    .statSync(`${getMeteorAppDir()}/package.json`)
    ?.mtime?.getTime();
  if (currentLastModifiedConfigTime !== lastModifiedMeteorConfigTime) {
    lastModifiedMeteorConfigTime = currentLastModifiedConfigTime;
    lastModifiedMeteorConfig = getMeteorAppPackageJson()?.meteor;
    lastModifiedMeteorConfig = lastModifiedMeteorConfig != null ? {
      ...lastModifiedMeteorConfig,
      modern: normalizeModern(modernForced || lastModifiedMeteorConfig?.modern),
    } : {};

    if (this.isVerbose(lastModifiedMeteorConfig)) {
      logConfigBlock('Meteor Config', lastModifiedMeteorConfig);
    }
  }
  return lastModifiedMeteorConfig;
};

let lastModifiedSwcConfig;
let lastModifiedSwcConfigTime;
BCp.initializeMeteorAppSwcrc = function () {
  const hasSwcRc = fs.existsSync(`${getMeteorAppDir()}/.swcrc`);
  const hasSwcJs = !hasSwcRc && fs.existsSync(`${getMeteorAppDir()}/swc.config.js`);
  if (!lastModifiedSwcConfig && !hasSwcRc && !hasSwcJs) {
    return;
  }
  const swcFile = hasSwcJs ? 'swc.config.js' : '.swcrc';
  const filePath = `${getMeteorAppDir()}/${swcFile}`;
  const fileStats = fs.statSync(filePath);
  const fileModTime = fileStats?.mtime?.getTime();

  let currentLastModifiedConfigTime;
  if (hasSwcJs) {
    // For dynamic JS files, first get the resolved configuration
    const resolvedConfig = lastModifiedSwcConfig || getMeteorAppSwcrc(swcFile);
    // Calculate a hash of the resolved configuration to detect changes
    const contentHash = crypto
      .createHash('sha256')
      .update(JSON.stringify(resolvedConfig))
      .digest('hex');
    // Combine file modification time and content hash to create a unique identifier
    currentLastModifiedConfigTime = `${fileModTime}-${contentHash}`;
    // Store the resolved configuration
    lastModifiedSwcConfig = resolvedConfig;
  } else {
    // For static JSON files, just use the file modification time
    currentLastModifiedConfigTime = fileModTime;
  }

  if (currentLastModifiedConfigTime !== lastModifiedSwcConfigTime) {
    lastModifiedSwcConfigTime = currentLastModifiedConfigTime;
    lastModifiedSwcConfig = getMeteorAppSwcrc(swcFile);

    if (this.isVerbose(lastModifiedMeteorConfig)) {
      logConfigBlock('SWC Config', lastModifiedSwcConfig);
    }
  }
  return lastModifiedSwcConfig;
};

let lastModifiedSwcLegacyConfig;
BCp.initializeMeteorAppLegacyConfig = function () {
  const swcLegacyConfig = convertBabelTargetsForSwc(Babel.getMinimumModernBrowserVersions());
  if (this.isVerbose(lastModifiedMeteorConfig) && !lastModifiedSwcLegacyConfig) {
    logConfigBlock('SWC Legacy Config', swcLegacyConfig);
  }
  lastModifiedSwcLegacyConfig = swcLegacyConfig;
  return lastModifiedSwcConfig;
};
```
```diff
 BCp.processFilesForTarget = function (inputFiles) {
   var compiler = this;

   // Reset this cache for each batch processed.
   this._babelrcCache = null;

+  this.initializeMeteorAppConfig();
+  this.initializeMeteorAppSwcrc();
+  this.initializeMeteorAppLegacyConfig();
+
   inputFiles.forEach(function (inputFile) {
     if (inputFile.supportsLazyCompilation) {
       inputFile.addJavaScript({
@@ -51,6 +206,8 @@ BCp.processFilesForTarget = function (inputFiles) {
 // null to indicate there was an error, and nothing should be added.
 BCp.processOneFileForTarget = function (inputFile, source) {
   this._babelrcCache = this._babelrcCache || Object.create(null);
+  this._swcCache = this._swcCache || Object.create(null);
+  this._swcIncompatible = this._swcIncompatible || Object.create(null);

   if (typeof source !== "string") {
     // Other compiler plugins can call processOneFileForTarget with a
```
```diff
@@ -121,32 +278,196 @@ BCp.processOneFileForTarget = function (inputFile, source) {
     },
   };

-  this.inferTypeScriptConfig(
-    features, inputFile, cacheOptions.cacheDeps);
-
-  var babelOptions = Babel.getDefaultOptions(features);
-  babelOptions.caller = { name: "meteor", arch };
-
-  this.inferExtraBabelOptions(
-    inputFile,
-    babelOptions,
-    cacheOptions.cacheDeps
-  );
-
-  babelOptions.sourceMaps = true;
-  babelOptions.filename =
-    babelOptions.sourceFileName = packageName
-      ? "packages/" + packageName + "/" + inputFilePath
-      : inputFilePath;
-
-  if (this.modifyBabelConfig) {
-    this.modifyBabelConfig(babelOptions, inputFile);
-  }
+  const filename = packageName
+    ? `packages/${packageName}/${inputFilePath}`
+    : inputFilePath;
+
+  const setupBabelOptions = () => {
+    this.inferTypeScriptConfig(features, inputFile, cacheOptions.cacheDeps);
+
+    var babelOptions = Babel.getDefaultOptions(features);
+    babelOptions.caller = { name: "meteor", arch };
+
+    babelOptions.sourceMaps = true;
+    babelOptions.filename = babelOptions.sourceFileName = filename;
+
+    this.inferExtraBabelOptions(inputFile, babelOptions, cacheOptions.cacheDeps);
+
+    if (this.modifyConfig) {
+      this.modifyConfig(babelOptions, inputFile);
+    }
+
+    return babelOptions;
+  };
+
+  const setupSWCOptions = () => {
+    const isTypescriptSyntax = inputFilePath.endsWith('.ts') || inputFilePath.endsWith('.tsx');
+    const hasTSXSupport = inputFilePath.endsWith('.tsx');
+    const hasJSXSupport = inputFilePath.endsWith('.jsx');
+    const isLegacyWebArch = arch.includes('legacy');
+
+    var swcOptions = {
+      jsc: {
+        ...(!isLegacyWebArch && { target: 'es2015' }),
+        parser: {
+          syntax: isTypescriptSyntax ? 'typescript' : 'ecmascript',
+          jsx: hasJSXSupport,
+          tsx: hasTSXSupport,
+        },
+      },
+      module: { type: 'es6' },
+      minify: false,
+      sourceMaps: true,
+      filename,
+      sourceFileName: filename,
+      ...(isLegacyWebArch && {
+        env: { targets: lastModifiedSwcLegacyConfig || {} },
+      }),
+    };
+
+    // Merge with app-level SWC config
+    if (lastModifiedSwcConfig) {
+      swcOptions = deepMerge(swcOptions, lastModifiedSwcConfig, [
+        'env.targets',
+        'module.type',
+      ]);
+    }
+
+    this.inferExtraSWCOptions(inputFile, swcOptions, cacheOptions.cacheDeps);
+
+    if (!!this.extraFeatures?.swc && this.modifyConfig) {
+      this.modifyConfig(swcOptions, inputFile);
+    }
+
+    // Resolve custom baseUrl to an absolute path pointing to the project root
+    if (swcOptions.jsc && swcOptions.jsc.baseUrl) {
+      swcOptions.jsc.baseUrl = path.resolve(process.cwd(), swcOptions.jsc.baseUrl);
+    }
+
+    return swcOptions;
+  };
+
+  var babelOptions = { filename };
   try {
-    var result = profile('Babel.compile', function () {
-      return Babel.compile(source, babelOptions, cacheOptions);
-    });
+    var result = (() => {
+      const isNodeModulesCode = packageName == null && inputFilePath.includes("node_modules/");
+      const isAppCode = packageName == null && !isNodeModulesCode;
+      const isPackageCode = packageName != null;
+      const isLegacyWebArch = arch.includes('legacy');
+
+      const config = lastModifiedMeteorConfig?.modern?.transpiler;
+      const hasModernTranspiler = config != null && config !== false;
+      const shouldSkipSwc =
+        !hasModernTranspiler ||
+        (isAppCode && config?.excludeApp === true) ||
+        (isNodeModulesCode && config?.excludeNodeModules === true) ||
+        (isPackageCode && config?.excludePackages === true) ||
+        (isLegacyWebArch && config?.excludeLegacy === true) ||
+        (isAppCode &&
+          Array.isArray(config?.excludeApp) &&
+          isExcludedConfig(inputFilePath, config?.excludeApp || [])) ||
+        (isNodeModulesCode &&
+          Array.isArray(config?.excludeNodeModules) &&
+          (isExcludedConfig(inputFilePath, config?.excludeNodeModules || []) ||
+            isExcludedConfig(
+              inputFilePath.replace('node_modules/', ''),
+              config?.excludeNodeModules || [],
+              true,
+            ))) ||
+        (isPackageCode &&
+          Array.isArray(config?.excludePackages) &&
+          (isExcludedConfig(packageName, config?.excludePackages || []) ||
+            isExcludedConfig(
+              `${packageName}/${inputFilePath}`,
+              config?.excludePackages || [],
+            )));
+
+      const cacheKey = [
+        toBeAdded.hash,
+        lastModifiedSwcConfigTime,
+        isLegacyWebArch ? 'legacy' : '',
+      ]
+        .filter(Boolean)
+        .join('-');
+      // Determine if SWC should be used based on package and file criteria.
+      const shouldUseSwc =
+        (!shouldSkipSwc || this.extraFeatures?.swc) &&
+        !this._swcIncompatible[cacheKey];
+      let compilation;
+      try {
+        let usedSwc = false;
+        if (shouldUseSwc) {
+          // Create a cache key based on the source hash and the compiler used
+          // Check cache
+          compilation = this.readFromSwcCache({ cacheKey });
+          // Return cached result if found.
+          if (compilation) {
+            if (this.isVerbose(config)) {
+              logTranspilation({
+                usedSwc: true,
+                inputFilePath,
+                packageName,
+                isNodeModulesCode,
+                cacheHit: true,
+                arch,
+              });
+            }
+            return compilation;
+          }
+
+          const swcOptions = setupSWCOptions();
+          compilation = compileWithSwc(
+            source,
+            swcOptions,
+            { features },
+          );
+          // Save result in cache
+          this.writeToSwcCache({ cacheKey, compilation });
+          usedSwc = true;
+        } else {
+          // Set up Babel options only when compiling with Babel
+          babelOptions = setupBabelOptions();
+
+          compilation = compileWithBabel(source, babelOptions, cacheOptions);
+          usedSwc = false;
+        }
+
+        if (this.isVerbose(config)) {
+          logTranspilation({
+            usedSwc,
+            inputFilePath,
+            packageName,
+            isNodeModulesCode,
+            cacheHit: false,
+            arch,
+          });
+        }
+      } catch (e) {
+        this._swcIncompatible[cacheKey] = true;
+        // If SWC fails, fall back to Babel
+
+        babelOptions = setupBabelOptions();
+        compilation = compileWithBabel(source, babelOptions, cacheOptions);
+        if (this.isVerbose(config)) {
+          logTranspilation({
+            usedSwc: false,
+            inputFilePath,
+            packageName,
+            isNodeModulesCode,
+            cacheHit: false,
+            arch,
+            errorMessage: e?.message,
+            ...(e?.message?.includes(
+              'cannot be used outside of module code',
+            ) && {
+              tip: 'Remove nested imports or replace them with require to support SWC and improve speed.',
+            }),
+          });
+        }
+      }
+
+      return compilation;
+    })();
   } catch (e) {
     if (e.loc) {
       // Error is from @babel/parser.
```
```diff
@@ -256,6 +577,15 @@ BCp.inferExtraBabelOptions = function (inputFile, babelOptions, cacheDeps) {
   );
 };

+BCp.inferExtraSWCOptions = function (inputFile, swcOptions, cacheDeps) {
+  if (! inputFile.require ||
+      ! inputFile.findControlFile ||
+      ! inputFile.readAndWatchFile) {
+    return false;
+  }
+  return this._inferFromSwcRc(inputFile, swcOptions, cacheDeps);
+};
+
 BCp._inferFromBabelRc = function (inputFile, babelOptions, cacheDeps) {
   var babelrcPath = inputFile.findControlFile(".babelrc");
   if (babelrcPath) {
@@ -308,6 +638,65 @@ BCp._inferFromPackageJson = function (inputFile, babelOptions, cacheDeps) {
   }
 };

+BCp._inferFromSwcRc = function (inputFile, swcOptions, cacheDeps) {
+  var swcrcPath = inputFile.findControlFile(".swcrc");
+  if (swcrcPath) {
+    if (! hasOwn.call(this._babelrcCache, swcrcPath)) {
+      try {
+        this._babelrcCache[swcrcPath] = {
+          controlFilePath: swcrcPath,
+          controlFileData: JSON.parse(
+            inputFile.readAndWatchFile(swcrcPath)),
+          deps: Object.create(null),
+        };
+      } catch (e) {
+        if (e instanceof SyntaxError) {
+          e.message = ".swcrc is not a valid JSON file: " + e.message;
+        }
+        throw e;
+      }
+    }
+
+    const cacheEntry = this._babelrcCache[swcrcPath];
+
+    if (this._inferHelperForSwc(inputFile, cacheEntry)) {
+      deepMerge(swcOptions, cacheEntry.controlFileData);
+      Object.assign(cacheDeps, cacheEntry.deps);
+      return true;
+    }
+  }
+};
+
+BCp._inferHelperForSwc = function (inputFile, cacheEntry) {
+  if (! cacheEntry.controlFileData) {
+    return false;
+  }
+
+  if (hasOwn.call(cacheEntry, "finalInferHelperForSwcResult")) {
+    // We've already run _inferHelperForSwc and populated
+    // cacheEntry.controlFileData, so we can return early here.
+    return cacheEntry.finalInferHelperForSwcResult;
+  }
+
+  // First, ensure that the current file path is not excluded.
+  if (cacheEntry.controlFileData.exclude) {
+    const exclude = cacheEntry.controlFileData.exclude;
+    const path = inputFile.getPathInPackage();
+
+    if (exclude instanceof Array) {
+      for (let i = 0; i < exclude.length; ++i) {
+        if (path.match(exclude[i])) {
+          return cacheEntry.finalInferHelperForSwcResult = false;
+        }
+      }
+    } else if (path.match(exclude)) {
+      return cacheEntry.finalInferHelperForSwcResult = false;
+    }
+  }
+
+  return cacheEntry.finalInferHelperForSwcResult = true;
+};
+
 BCp._inferHelper = function (inputFile, cacheEntry) {
   if (! cacheEntry.controlFileData) {
     return false;
@@ -564,3 +953,245 @@ function packageNameFromTopLevelModuleId(id) {
   }
   return parts[0];
 }
```
const SwcCacheContext = '.swc-cache';
|
||||
|
||||
BCp.readFromSwcCache = function({ cacheKey }) {
|
||||
// Check in-memory cache.
|
||||
let compilation = this._swcCache[cacheKey];
|
||||
// If not found, try file system cache if enabled.
|
||||
if (!compilation && this.cacheDirectory) {
|
||||
const cacheFilePath = path.join(this.cacheDirectory, SwcCacheContext, `${cacheKey}.json`);
|
||||
if (fs.existsSync(cacheFilePath)) {
|
||||
try {
|
||||
compilation = JSON.parse(fs.readFileSync(cacheFilePath, 'utf8'));
|
||||
// Save back to in-memory cache.
|
||||
this._swcCache[cacheKey] = compilation;
|
||||
} catch (err) {
|
||||
// Ignore any errors reading/parsing the cache.
|
||||
}
|
||||
}
|
||||
}
|
||||
return compilation;
|
||||
};
|
||||
|
||||
BCp.writeToSwcCache = function({ cacheKey, compilation }) {
|
||||
// Save to in-memory cache.
|
||||
this._swcCache[cacheKey] = compilation;
|
||||
// If file system caching is enabled, write asynchronously.
|
||||
if (this.cacheDirectory) {
|
||||
const cacheFilePath = path.join(this.cacheDirectory, SwcCacheContext, `${cacheKey}.json`);
|
||||
try {
|
||||
const writeFileCache = async () => {
|
||||
await fs.promises.mkdir(path.dirname(cacheFilePath), { recursive: true });
|
||||
await fs.promises.writeFile(cacheFilePath, JSON.stringify(compilation), 'utf8');
|
||||
};
|
||||
// Invoke without blocking the main flow.
|
||||
writeFileCache();
|
||||
} catch (err) {
|
||||
// If writing fails, ignore the error.
|
||||
}
|
||||
}
|
||||
};
function getMeteorAppDir() {
  return process.cwd();
}

function getMeteorAppPackageJson() {
  return JSON.parse(
    fs.readFileSync(`${getMeteorAppDir()}/package.json`, 'utf-8'),
  );
}

function getMeteorAppSwcrc(file = '.swcrc') {
  try {
    const filePath = `${getMeteorAppDir()}/${file}`;
    if (file.endsWith('.js')) {
      let content = fs.readFileSync(filePath, 'utf-8');
      // Check if the content uses ES module syntax (export default)
      if (content.includes('export default')) {
        // Transform ES module syntax to CommonJS
        content = content.replace(/export\s+default\s+/, 'module.exports = ');
      }
      const script = new vm.Script(`
        (function() {
          const module = {};
          module.exports = {};
          (function(exports, module) {
            ${content}
          })(module.exports, module);
          return module.exports;
        })()
      `);
      const context = vm.createContext({ process });
      return script.runInContext(context);
    } else {
      // For .swcrc and other JSON files, parse as JSON
      return JSON.parse(fs.readFileSync(filePath, 'utf-8'));
    }
  } catch (e) {
    console.error(`Error parsing ${file} file`, e);
  }
}
const _regexCache = new Map();

function isRegexLike(str) {
  return /[.*+?^${}()|[\]\\]/.test(str);
}

function isExcludedConfig(name, excludeList = [], startsWith) {
  if (!name || !excludeList?.length) return false;
  return excludeList.some(rule => {
    if (name === rule) return true;
    if (startsWith && name.startsWith(rule)) return true;
    if (isRegexLike(rule)) {
      let regex = _regexCache.get(rule);
      if (!regex) {
        try {
          regex = new RegExp(rule);
          _regexCache.set(rule, regex);
        } catch (err) {
          console.warn(`Invalid regex in exclude list: "${rule}"`);
          return false;
        }
      }
      return regex.test(name);
    }

    return false;
  });
}
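`isExcludedConfig` accepts three kinds of rules in one list: exact names, prefixes (when `startsWith` is set), and strings containing regex metacharacters, which are compiled once and cached. A short sketch of each case, with the function reproduced so the example is self-contained (the package names are illustrative):

```javascript
const _regexCache = new Map();

// A rule is treated as a regex if it contains any regex metacharacter.
function isRegexLike(str) {
  return /[.*+?^${}()|[\]\\]/.test(str);
}

function isExcludedConfig(name, excludeList = [], startsWith) {
  if (!name || !excludeList?.length) return false;
  return excludeList.some(rule => {
    if (name === rule) return true;                      // exact match
    if (startsWith && name.startsWith(rule)) return true; // prefix match
    if (isRegexLike(rule)) {                              // regex match, cached
      let regex = _regexCache.get(rule);
      if (!regex) {
        try {
          regex = new RegExp(rule);
          _regexCache.set(rule, regex);
        } catch (err) {
          return false; // invalid patterns never match
        }
      }
      return regex.test(name);
    }
    return false;
  });
}

console.log(isExcludedConfig('react-dom', ['react-dom']));           // exact → true
console.log(isExcludedConfig('react-dom/client', ['react-'], true)); // prefix → true
console.log(isExcludedConfig('lodash.merge', ['^lodash\\.']));       // regex → true
console.log(isExcludedConfig('vue', ['react-dom']));                 // → false
```

Note the asymmetry: a plain string like `'react-dom'` never behaves as a regex (`-` is not a metacharacter), so exact and prefix rules stay cheap string comparisons.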
const disableTextColors = Boolean(JSON.parse(process.env.METEOR_DISABLE_COLORS || "false"));

function color(text, code) {
  return disableTextColors ? text : `\x1b[${code}m${text}\x1b[0m`;
}

function logTranspilation({
  packageName,
  inputFilePath,
  usedSwc,
  cacheHit,
  isNodeModulesCode,
  arch,
  errorMessage = '',
  tip = '',
}) {
  const transpiler = usedSwc ? 'SWC' : 'Babel';
  const transpilerColor = usedSwc ? 32 : 33;
  const label = color('[Transpiler]', 36);
  const transpilerPart = `${label} Used ${color(
    transpiler,
    transpilerColor,
  )} for`;
  const filePathPadded = `${
    packageName ? `${packageName}/` : ''
  }${inputFilePath}`.padEnd(50);
  let rawOrigin = '';
  if (packageName) {
    rawOrigin = `(package)`;
  } else {
    rawOrigin = isNodeModulesCode ? '(node_modules)' : '(app)';
  }
  const originPaddedRaw = rawOrigin.padEnd(35);
  const originPaddedColored = packageName
    ? originPaddedRaw
    : isNodeModulesCode
    ? color(originPaddedRaw, 90)
    : color(originPaddedRaw, 35);
  const cacheStatus = errorMessage
    ? color('⚠️ Fallback', 33)
    : usedSwc
    ? cacheHit
      ? color('🟢 Cache hit', 32)
      : color('🔴 Cache miss', 31)
    : '';
  const archPart = arch ? color(` (${arch})`, 90) : '';
  console.log(
    `${transpilerPart} ${filePathPadded}${originPaddedColored}${cacheStatus}${archPart}`,
  );
  if (errorMessage) {
    console.log();
    console.log(`  ↳ ${color('Error:', 31)} ${errorMessage}`);
    if (tip) {
      console.log();
      console.log(`  ${color('💡 Tip:', 33)} ${tip}`);
    }
    console.log();
  }
}

function logConfigBlock(description, configObject) {
  const label = color('[Config]', 36);
  const descriptionColor = color(description, 90);

  console.log(`${label} ${descriptionColor}`);

  const configLines = JSON.stringify(configObject, null, 2)
    .replace(/"([^"]+)":/g, '$1:')
    .split('\n')
    .map(line => '  ' + line);

  configLines.forEach(line => console.log(line));
  console.log();
}

function deepMerge(target, source, preservePaths = [], inPath = '') {
  for (const key in source) {
    const fullPath = inPath ? `${inPath}.${key}` : key;

    // Skip preserved paths
    if (preservePaths.includes(fullPath)) continue;

    if (
      typeof source[key] === 'object' &&
      source[key] !== null &&
      !Array.isArray(source[key])
    ) {
      target[key] = deepMerge(
        target[key] || {},
        source[key],
        preservePaths,
        fullPath,
      );
    } else {
      target[key] = source[key];
    }
  }
  return target;
}
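`deepMerge` recursively copies keys from `source` into `target`, except for dot-separated paths listed in `preservePaths`, which are left untouched; arrays are replaced wholesale rather than merged. A small sketch of the preserve behavior, using an SWC-style config shape purely as an illustrative example:

```javascript
// Reproduced from above so the example runs standalone.
function deepMerge(target, source, preservePaths = [], inPath = '') {
  for (const key in source) {
    const fullPath = inPath ? `${inPath}.${key}` : key;
    if (preservePaths.includes(fullPath)) continue; // preserved: keep target's value
    if (
      typeof source[key] === 'object' &&
      source[key] !== null &&
      !Array.isArray(source[key])
    ) {
      target[key] = deepMerge(target[key] || {}, source[key], preservePaths, fullPath);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

const base = { jsc: { parser: { syntax: 'ecmascript' }, target: 'es5' } };
const override = { jsc: { parser: { jsx: true }, target: 'es2015' } };

// Preserve jsc.target so the override cannot change it.
const merged = deepMerge(base, override, ['jsc.target']);
console.log(merged);
// → { jsc: { parser: { syntax: 'ecmascript', jsx: true }, target: 'es5' } }
```

Note that `deepMerge` mutates `target` in place as well as returning it, so callers that need the original object untouched must clone it first.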
function convertBabelTargetsForSwc(babelTargets) {
  const allowedEnvs = new Set([
    'chrome', 'opera', 'edge', 'firefox', 'safari',
    'ie', 'ios', 'android', 'node', 'electron'
  ]);

  const filteredTargets = {};
  for (const [env, version] of Object.entries(babelTargets)) {
    if (allowedEnvs.has(env)) {
      // Convert an array version (e.g., [10, 3]) into "10.3", otherwise convert to string.
      filteredTargets[env] = Array.isArray(version) ? version.join('.') : version.toString();
    }
  }

  return filteredTargets;
}
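The conversion above does two things: it drops any environment SWC does not recognize, and it normalizes version values (numbers or arrays like `[10, 3]`) to dotted strings. A quick sketch, where the input targets are made-up example values:

```javascript
// Reproduced from above so the example runs standalone.
function convertBabelTargetsForSwc(babelTargets) {
  const allowedEnvs = new Set([
    'chrome', 'opera', 'edge', 'firefox', 'safari',
    'ie', 'ios', 'android', 'node', 'electron',
  ]);
  const filteredTargets = {};
  for (const [env, version] of Object.entries(babelTargets)) {
    if (allowedEnvs.has(env)) {
      filteredTargets[env] = Array.isArray(version) ? version.join('.') : version.toString();
    }
  }
  return filteredTargets;
}

const swcTargets = convertBabelTargetsForSwc({
  chrome: 49,        // number → '49'
  safari: [10, 3],   // array → '10.3'
  samsung: 5,        // dropped: not in the allowed set
});
console.log(swcTargets); // → { chrome: '49', safari: '10.3' }
```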
/**
 * A compiler that extends BabelCompiler but always uses SWC
 * @param {Object} extraFeatures Additional features to pass to BabelCompiler
 * @param {Function} modifyConfig Function to modify the configuration
 */
SwcCompiler = function SwcCompiler(extraFeatures, modifyConfig) {
  extraFeatures = extraFeatures || {};
  extraFeatures.swc = true;
  BabelCompiler.call(this, extraFeatures, modifyConfig);
};

// Inherit from BabelCompiler
SwcCompiler.prototype = Object.create(BabelCompiler.prototype);
SwcCompiler.prototype.constructor = SwcCompiler;

@@ -1,13 +1,14 @@
Package.describe({
  name: "babel-compiler",
  summary: "Parser/transpiler for ECMAScript 2015+ syntax",
  version: '7.11.3',
  version: '7.12.0',
});

Npm.depends({
  '@meteorjs/babel': '7.20.1',
  'json5': '2.2.3',
  'semver': '7.6.3'
  'semver': '7.6.3',
  "@meteorjs/swc-core": "1.1.3",
});

Package.onUse(function (api) {
@@ -22,4 +23,5 @@ Package.onUse(function (api) {

  api.export('Babel', 'server');
  api.export('BabelCompiler', 'server');
  api.export('SwcCompiler', 'server');
});

@@ -1,8 +1,8 @@
import {readFileSync} from 'fs';
import { create as createStream } from "combined-stream2";

import WebBrowserTemplate from './template-web.browser';
import WebCordovaTemplate from './template-web.cordova';
import { headTemplate as modernHeadTemplate, closeTemplate as modernCloseTemplate } from './template-web.browser';
import { headTemplate as cordovaHeadTemplate, closeTemplate as cordovaCloseTemplate } from './template-web.cordova';

// Copied from webapp_server
const readUtf8FileSync = filename => readFileSync(filename, 'utf8');
@@ -151,11 +151,11 @@ function getTemplate(arch) {
  const prefix = arch.split(".", 2).join(".");

  if (prefix === "web.browser") {
    return WebBrowserTemplate;
    return { headTemplate: modernHeadTemplate, closeTemplate: modernCloseTemplate };
  }

  if (prefix === "web.cordova") {
    return WebCordovaTemplate;
    return { headTemplate: cordovaHeadTemplate, closeTemplate: cordovaCloseTemplate };
  }

  throw new Error("Unsupported arch: " + arch);

@@ -1,6 +1,6 @@
Package.describe({
  summary: "Generates the boilerplate html from program's manifest",
  version: '2.0.0',
  version: '2.0.1',
});

Npm.depends({

@@ -74,7 +74,7 @@ export class Connection {
    if (typeof url === 'object') {
      self._stream = url;
    } else {
      import { ClientStream } from "meteor/socket-stream-client";
      const { ClientStream } = require("meteor/socket-stream-client");

      self._stream = new ClientStream(url, {
        retry: options.retry,

@@ -1,6 +1,6 @@
Package.describe({
  summary: "Meteor's latency-compensated distributed data client",
  version: "3.1.0",
  version: "3.1.1",
  documentation: null,
});

@@ -1,6 +1,6 @@
Package.describe({
  name: 'ecmascript',
  version: '0.16.10',
  version: '0.16.11',
  summary: 'Compiler plugin that supports ES2015+ in all .js files',
  documentation: 'README.md',
});

@@ -1,3 +1,5 @@
import { testExport as oyez } from './runtime-tests.js';

const isNode8OrLater = Meteor.isServer && parseInt(process.versions.node) >= 8;

Tinytest.add('ecmascript - runtime - template literals', test => {
@@ -169,14 +171,12 @@ Tinytest.add('ecmascript - runtime - classes - properties', test => {
    static staticProp = 1234;

    check = self => {
      import { testExport as oyez } from './runtime-tests.js';
      test.equal(oyez, 'oyez');
      test.isTrue(self === this);
      test.equal(this.property, 'property');
    };

    method() {
      import { testExport as oyez } from './runtime-tests.js';
      test.equal(oyez, 'oyez');
    }
  }

@@ -9,6 +9,7 @@ import {
  isInfOrNaN,
  handleError,
} from './utils';
import canonicalStringify from './stringify';

/**
 * @namespace
@@ -395,7 +396,6 @@ EJSON.stringify = handleError((item, options) => {
  let serialized;
  const json = EJSON.toJSONValue(item);
  if (options && (options.canonical || options.indent)) {
    import canonicalStringify from './stringify';
    serialized = canonicalStringify(json, options);
  } else {
    serialized = JSON.stringify(json);

@@ -1,6 +1,6 @@
Package.describe({
  summary: 'Extended and Extensible JSON library',
  version: '1.1.4',
  version: '1.1.5',
});

Package.onUse(function onUse(api) {

@@ -1,6 +1,6 @@
Package.describe({
  summary: "The Meteor command-line tool",
  version: "3.2.2",
  version: "3.3.0",
});

Package.includeTool();
20 packages/meteor/meteor.d.ts (vendored)
@@ -162,14 +162,26 @@ export namespace Meteor {
   * @param name Name of method to invoke
   * @param args Optional method arguments
   */
  function call(name: string, ...args: any[]): any;
  function call<
    Result extends
      | EJSONable
      | EJSONable[]
      | EJSONableProperty
      | EJSONableProperty[]
  >(name: string, ...args: any[]): Result;

  /**
   * Invokes a method with an async stub, passing any number of arguments.
   * @param name Name of method to invoke
   * @param args Optional method arguments
   */
  function callAsync(name: string, ...args: any[]): Promise<any>;
  function callAsync<
    Result extends
      | EJSONable
      | EJSONable[]
      | EJSONableProperty
      | EJSONableProperty[]
  >(name: string, ...args: any[]): Promise<Result> & { stubPromise: Promise<Result>, serverPromise: Promise<Result> };

  interface MethodApplyOptions<
    Result extends
@@ -226,7 +238,7 @@ export namespace Meteor {
      error: global_Error | Meteor.Error | undefined,
      result?: Result
    ) => void
  ): any;
  ): Result;

  /**
   * Invokes a method with an async stub, passing any number of arguments.
@@ -249,7 +261,7 @@ export namespace Meteor {
      error: global_Error | Meteor.Error | undefined,
      result?: Result
    ) => void
  ): Promise<Result>;
  ): Promise<Result> & { stubPromise: Promise<Result>, serverPromise: Promise<Result> };
  /** Method **/

  /** Url **/

@@ -2,7 +2,7 @@

Package.describe({
  summary: "Core Meteor environment",
  version: '2.1.0',
  version: '2.1.1',
});

Package.registerBuildPlugin({

@@ -1,6 +1,6 @@
Package.describe({
  summary: "JavaScript minifier",
  version: '3.0.1',
  version: '3.0.2',
});

Npm.depends({
@@ -49,6 +49,13 @@ Tinytest.add('modern-browsers - versions - basic', function (test) {
    patch: 0,
  }));

  test.isFalse(isModern({
    name: "mobileSafariUI/WKWebView",
    major: 0,
    minor: 0,
    patch: 0,
  }));

  const oldPackageSettings = Meteor.settings.packages;

  Meteor.settings.packages = {
@@ -64,6 +71,13 @@ Tinytest.add('modern-browsers - versions - basic', function (test) {
    patch: 0,
  }));

  test.isTrue(isModern({
    name: "mobileSafariUI/WKWebView",
    major: 0,
    minor: 0,
    patch: 0,
  }));

  Meteor.settings.packages = oldPackageSettings;
});

@@ -93,7 +93,13 @@ function isModern(browser) {
  const entry = hasOwn.call(minimumVersions, lowerCaseName)
    ? minimumVersions[lowerCaseName]
    : undefined;
  if (!entry) {
  if (
    !entry ||
    // When all version numbers are 0, this typically comes from in-app WebView UAs (e.g., iOS WKWebView).
    // We can let users decide whether to treat it as a modern browser
    // via the packageSettings.unknownBrowsersAssumedModern option.
    (browser.major === 0 && browser.minor === 0 && browser.patch === 0)
  ) {
    const packageSettings = Meteor.settings.packages
      ? Meteor.settings.packages['modern-browsers']
      : undefined;

@@ -1,6 +1,6 @@
Package.describe({
  name: 'modern-browsers',
  version: '0.2.1',
  version: '0.2.2',
  summary:
    'API for defining the boundary between modern and legacy ' +
    'JavaScript clients',

@@ -11,6 +11,7 @@ import {
  validateCollectionName
} from './collection_utils';
import { ReplicationMethods } from './methods_replication';
import { watchChangeStream } from './watch_change_stream';

/**
 * @summary Namespace for MongoDB-related items
@@ -263,6 +264,10 @@ Mongo.Collection.ObjectID = Mongo.ObjectID;
 */
Meteor.Collection = Mongo.Collection;

// Allow deny stuff is now in the allow-deny package
Object.assign(Mongo.Collection.prototype, AllowDeny.CollectionPrototype);

// Only now that Mongo.Collection exists do we add the method to its prototype
Object.assign(Mongo.Collection.prototype, { watchChangeStream });
@@ -1,3 +1,5 @@
import { Log } from 'meteor/logging';

export const IndexMethods = {
  // We'll actually design an index API later. For now, we just pass through to
  // Mongo's, but make it synchronous.
@@ -21,8 +23,6 @@ export const IndexMethods = {
    if (self._collection.createIndexAsync) {
      await self._collection.createIndexAsync(index, options);
    } else {
      import { Log } from 'meteor/logging';

      Log.debug(`ensureIndexAsync has been deprecated, please use the new 'createIndexAsync' instead${ options?.name ? `, index name: ${ options.name }` : `, index: ${ JSON.stringify(index) }` }`)
      await self._collection.ensureIndexAsync(index, options);
    }
@@ -54,8 +54,6 @@ export const IndexMethods = {
    ) &&
    Meteor.settings?.packages?.mongo?.reCreateIndexOnOptionMismatch
  ) {
    import { Log } from 'meteor/logging';

    Log.info(`Re-creating index ${ index } for ${ self._name } due to options mismatch.`);
    await self._collection.dropIndexAsync(index);
    await self._collection.createIndexAsync(index, options);
@@ -88,4 +86,4 @@ export const IndexMethods = {
    throw new Error('Can only call dropIndexAsync on server collections');
    await self._collection.dropIndexAsync(index);
  },
}
}

31 packages/mongo/collection/watch_change_stream.js (Normal file)
@@ -0,0 +1,31 @@
/**
 * @summary Watches the MongoDB collection using Change Streams.
 * @locus Server
 * @memberof Mongo.Collection
 * @instance
 * @param {Array} [pipeline] Optional aggregation pipeline to filter Change Stream events.
 * @param {Object} [options] Optional settings for the Change Stream.
 * @returns {ChangeStream} The MongoDB ChangeStream instance.
 * @throws {Error} If called on a client/minimongo collection.
 *
 * @example
 * const changeStream = MyCollection.watchChangeStream([
 *   { $match: { 'operationType': 'insert' } }
 * ]);
 * changeStream.on('change', (change) => {
 *   console.log('Change detected:', change);
 * });
 */

export function watchChangeStream(pipeline = [], options = {}) {
  // Only available on server
  if (typeof Package === 'undefined' || !this.rawCollection) {
    throw new Error('watchChangeStream is only available on server collections');
  }
  const raw = this.rawCollection();
  if (!raw.watch) {
    throw new Error('Underlying collection does not support watch (Change Streams)');
  }
  console.log('[watchChangeStream] Calling raw.watch() with pipeline:', JSON.stringify(pipeline, null, 2), 'and options:', JSON.stringify(options, null, 2));
  return raw.watch(pipeline, options);
}
@@ -37,13 +37,15 @@ export class OplogHandle {
  public _dbName: string;
  private _oplogLastEntryConnection: MongoConnection | null;
  private _oplogTailConnection: MongoConnection | null;
  private _oplogOptions: { excludeCollections?: string[]; includeCollections?: string[] } | null;
  private _oplogOptions: {
    excludeCollections?: string[];
    includeCollections?: string[];
  };
  private _stopped: boolean;
  private _tailHandle: any;
  private _readyPromiseResolver: (() => void) | null;
  private _readyPromise: Promise<void>;
  public _crossbar: any;
  private _baseOplogSelector: any;
  private _catchingUpResolvers: CatchingUpResolver[];
  private _lastProcessedTS: any;
  private _onSkippedEntriesHook: any;
@@ -61,29 +63,24 @@ export class OplogHandle {
    this._resolveTimeout = null;
    this._oplogLastEntryConnection = null;
    this._oplogTailConnection = null;
    this._oplogOptions = null;
    this._stopped = false;
    this._tailHandle = null;
    this._readyPromiseResolver = null;
    this._readyPromise = new Promise(r => this._readyPromiseResolver = r);
    this._crossbar = new DDPServer._Crossbar({
      factPackage: "mongo-livedata", factName: "oplog-watchers"
    });
    this._baseOplogSelector = {
      ns: new RegExp("^(?:" + [
        // @ts-ignore
        Meteor._escapeRegExp(this._dbName + "."),
        // @ts-ignore
        Meteor._escapeRegExp("admin.$cmd"),
      ].join("|") + ")"),

      $or: [
        { op: { $in: ['i', 'u', 'd'] } },
        { op: 'c', 'o.drop': { $exists: true } },
        { op: 'c', 'o.dropDatabase': 1 },
        { op: 'c', 'o.applyOps': { $exists: true } },
      ]
    };
    const includeCollections =
      Meteor.settings?.packages?.mongo?.oplogIncludeCollections;
    const excludeCollections =
      Meteor.settings?.packages?.mongo?.oplogExcludeCollections;
    if (includeCollections?.length && excludeCollections?.length) {
      throw new Error(
        "Can't use both mongo oplog settings oplogIncludeCollections and oplogExcludeCollections at the same time."
      );
    }
    this._oplogOptions = { includeCollections, excludeCollections };

    this._catchingUpResolvers = [];
    this._lastProcessedTS = null;
@@ -95,6 +92,67 @@ export class OplogHandle {
    this._startTrailingPromise = this._startTailing();
  }

  private _getOplogSelector(lastProcessedTS?: any): any {
    const oplogCriteria: any = [
      {
        $or: [
          { op: { $in: ["i", "u", "d"] } },
          { op: "c", "o.drop": { $exists: true } },
          { op: "c", "o.dropDatabase": 1 },
          { op: "c", "o.applyOps": { $exists: true } },
        ],
      },
    ];

    const nsRegex = new RegExp(
      "^(?:" +
        [
          // @ts-ignore
          Meteor._escapeRegExp(this._dbName + "."),
          // @ts-ignore
          Meteor._escapeRegExp("admin.$cmd"),
        ].join("|") +
        ")"
    );

    if (this._oplogOptions.excludeCollections?.length) {
      oplogCriteria.push({
        ns: {
          $regex: nsRegex,
          $nin: this._oplogOptions.excludeCollections.map(
            (collName: string) => `${this._dbName}.${collName}`
          ),
        },
      });
    } else if (this._oplogOptions.includeCollections?.length) {
      oplogCriteria.push({
        $or: [
          { ns: /^admin\.\$cmd/ },
          {
            ns: {
              $in: this._oplogOptions.includeCollections.map(
                (collName: string) => `${this._dbName}.${collName}`
              ),
            },
          },
        ],
      });
    } else {
      oplogCriteria.push({
        ns: nsRegex,
      });
    }
    if (lastProcessedTS) {
      oplogCriteria.push({
        ts: { $gt: lastProcessedTS },
      });
    }

    return {
      $and: oplogCriteria,
    };
  }
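To make the include-collections branch of the selector concrete, here is a simplified, standalone sketch of the shape `_getOplogSelector` produces when `oplogIncludeCollections` is set. This is not the class method itself: `Meteor._escapeRegExp` and the op-filter details are reduced to plain JavaScript, and the database and collection names are illustrative.

```javascript
// Builds the same { $and: [opFilter, nsFilter, tsFilter?] } shape as
// _getOplogSelector for the includeCollections case.
function buildIncludeSelector(dbName, includeCollections, lastProcessedTS) {
  const criteria = [
    {
      // Only ordinary writes and the handful of commands the tailer cares about.
      $or: [
        { op: { $in: ['i', 'u', 'd'] } },
        { op: 'c', 'o.drop': { $exists: true } },
        { op: 'c', 'o.dropDatabase': 1 },
        { op: 'c', 'o.applyOps': { $exists: true } },
      ],
    },
    {
      // admin.$cmd is always tracked; otherwise only the listed collections.
      $or: [
        { ns: /^admin\.\$cmd/ },
        { ns: { $in: includeCollections.map(c => `${dbName}.${c}`) } },
      ],
    },
  ];
  // Only resume past a known timestamp when one is supplied.
  if (lastProcessedTS) criteria.push({ ts: { $gt: lastProcessedTS } });
  return { $and: criteria };
}

const selector = buildIncludeSelector('meteor', ['users', 'posts']);
console.log(selector.$and[1].$or[1].ns.$in); // → [ 'meteor.users', 'meteor.posts' ]
```

Building the timestamp clause inside the same helper is what fixes the waitUntilCaughtUp hang described in the tests further down: the "last entry" query and the tailing query now agree on which documents count.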

  async stop(): Promise<void> {
    if (this._stopped) return;
    this._stopped = true;
@@ -156,10 +214,11 @@ export class OplogHandle {
    let lastEntry: OplogEntry | null = null;

    while (!this._stopped) {
      const oplogSelector = this._getOplogSelector();
      try {
        lastEntry = await this._oplogLastEntryConnection.findOneAsync(
          OPLOG_COLLECTION,
          this._baseOplogSelector,
          oplogSelector,
          { projection: { ts: 1 }, sort: { $natural: -1 } }
        );
        break;
@@ -238,41 +297,11 @@ export class OplogHandle {
      { sort: { $natural: -1 }, projection: { ts: 1 } }
    );

    let oplogSelector: any = { ...this._baseOplogSelector };
    const oplogSelector = this._getOplogSelector(lastOplogEntry?.ts);
    if (lastOplogEntry) {
      oplogSelector.ts = { $gt: lastOplogEntry.ts };
      this._lastProcessedTS = lastOplogEntry.ts;
    }

    const includeCollections = Meteor.settings?.packages?.mongo?.oplogIncludeCollections;
    const excludeCollections = Meteor.settings?.packages?.mongo?.oplogExcludeCollections;

    if (includeCollections?.length && excludeCollections?.length) {
      throw new Error("Can't use both mongo oplog settings oplogIncludeCollections and oplogExcludeCollections at the same time.");
    }

    if (excludeCollections?.length) {
      oplogSelector.ns = {
        $regex: oplogSelector.ns,
        $nin: excludeCollections.map((collName: string) => `${this._dbName}.${collName}`)
      };
      this._oplogOptions = { excludeCollections };
    } else if (includeCollections?.length) {
      oplogSelector = {
        $and: [
          {
            $or: [
              { ns: /^admin\.\$cmd/ },
              { ns: { $in: includeCollections.map((collName: string) => `${this._dbName}.${collName}`) } }
            ]
          },
          { $or: oplogSelector.$or },
          { ts: oplogSelector.ts }
        ]
      };
      this._oplogOptions = { includeCollections };
    }

    const cursorDescription = new CursorDescription(
      OPLOG_COLLECTION,
      oplogSelector,

@@ -9,7 +9,7 @@

Package.describe({
  summary: "Adaptor for using MongoDB and Minimongo over DDP",
  version: "2.1.1",
  version: "2.1.2",
});

Npm.depends({

@@ -281,6 +281,68 @@ process.env.MONGO_OPLOG_URL && Tinytest.addAsync(
  }
);

process.env.MONGO_OPLOG_URL &&
  Tinytest.addAsync(
    "mongo-livedata - oplog - oplogSettings - oplog doesn't get stuck on waitUntilCaughtUp",
    async (test) => {
      try {
        const includeCollectionName = "oplog-a-" + Random.id();
        const excludeCollectionName = "oplog-b-" + Random.id();
        const mongoPackageSettings = {
          oplogIncludeCollections: [includeCollectionName],
        };

        previousMongoPackageSettings = {
          ...(Meteor.settings?.packages?.mongo || {}),
        };
        if (!Meteor.settings.packages) Meteor.settings.packages = {};
        Meteor.settings.packages.mongo = mongoPackageSettings;

        const myOplogHandle = new MongoInternals.OplogHandle(
          process.env.MONGO_OPLOG_URL,
          "meteor"
        );
        await myOplogHandle._startTrailingPromise;
        MongoInternals.defaultRemoteCollectionDriver().mongo._setOplogHandle(
          myOplogHandle
        );

        const IncludeCollection = new Mongo.Collection(includeCollectionName);
        const ExcludeCollection = new Mongo.Collection(excludeCollectionName);
        await IncludeCollection.rawCollection().insertOne({
          include: "yes",
          foo: "bar",
        });

        // Previously, when the last document inserted in the oplog was excluded from the oplog tailing,
        // waitUntilCaughtUp would hang until an oplog-tracked document was inserted.
        // This was preventing the observeChange callbacks from being called.
        await ExcludeCollection.rawCollection().insertOne({
          include: "no",
          foo: "bar",
        });
        const shouldBeTracked = Promise.race([
          new Promise((resolve) => {
            IncludeCollection.find({ include: "yes" }).observeChanges({
              added() {
                resolve(true);
              },
            });
          }),
          new Promise((resolve) => setTimeout(() => resolve(false), 2000)),
        ]);

        test.equal(await shouldBeTracked, true);
      } finally {
        // Reset:
        Meteor.settings.packages.mongo = { ...previousMongoPackageSettings };
        MongoInternals.defaultRemoteCollectionDriver().mongo._setOplogHandle(
          defaultOplogHandle
        );
      }
    }
  );

// TODO this is commented for now, but we need to find out the cause
// PR: https://github.com/meteor/meteor/pull/12057
// Meteor.isServer && Tinytest.addAsync(
@@ -3,7 +3,7 @@ Package.describe({
  summary: 'Compiler for CoffeeScript code, supporting the coffeescript package',
  // This version of the NPM `coffeescript` module, with _1, _2 etc.
  // If you change this, make sure to also update ../coffeescript/package.js to match.
  version: '2.4.2'
  version: '2.4.2',
});

Npm.depends({

@@ -6,12 +6,12 @@ Package.describe({
  // so bumping the version of this package will be how they get newer versions
  // of `coffeescript-compiler`. If you change this, make sure to also update
  // ../coffeescript-compiler/package.js to match.
  version: '2.7.2'
  version: '2.7.3',
});

Package.registerBuildPlugin({
  name: 'compile-coffeescript',
  use: ['caching-compiler@2.0.0-rc300.2', 'ecmascript@0.16.9-rc300.2', 'coffeescript-compiler@2.4.1'],
  use: ['caching-compiler@2.0.1', 'ecmascript@0.16.11', 'coffeescript-compiler@2.4.2'],
  sources: ['compile-coffeescript.js'],
  npmDependencies: {
    // A breaking change was introduced in @babel/runtime@7.0.0-beta.56

@@ -1,6 +1,6 @@
Package.describe({
  name: "server-render",
  version: '0.4.2',
  version: '0.4.3',
  summary: "Generic support for server-side rendering in Meteor apps",
  documentation: "README.md"
});

27 packages/server-render/server-render.d.ts (vendored)
@@ -19,9 +19,34 @@ export interface ClientSink {
  getCookies(): { [key: string]: string };
}

/**
 * Meteor parses the user agent string in an attempt to identify the browser.
 * This is used, for example, to determine whether to serve the modern
 * or the legacy bundle, in case your app uses both.
 */
type IdentifiedBrowser = {
  name: string;
  major: number;
  minor: number;
  patch: number;
}

/**
 * A categorized request is an IncomingMessage with a pre-parsed URL,
 * and additional properties added by Meteor.
 */
export type CategorizedRequest = Omit<http.IncomingMessage, 'url'> & {
  browser: IdentifiedBrowser;
  dynamicHead: string | undefined;
  dynamicBody: string | undefined;
  modern: boolean;
  path: string;
  url: URL;
}

export interface ServerSink extends ClientSink {
  // Server-only:
  request: http.IncomingMessage;
  request: CategorizedRequest;
  arch: string;
  head: string;
  body: string;

@@ -1,6 +1,6 @@
Package.describe({
  name: "socket-stream-client",
  version: '0.6.0',
  version: '0.6.1',
  summary: "Provides the ClientStream abstraction used by ddp-client",
  documentation: "README.md"
});
File diff suppressed because one or more lines are too long
@@ -5,6 +5,14 @@
|
||||
Standard Minifier for JS
|
||||
===
|
||||
|
||||
# Credits and Acknowledgments
|
||||
|
||||
This package uses SWC (Speedy Web Compiler) for JavaScript minification, which provides significant performance improvements over traditional minifiers. The SWC implementation was initially based on the work from [zodern/minify-js-sourcemaps](https://github.com/zodern/minify-js-sourcemaps).
|
||||
|
||||
We also use Terser as a fallback minifier for cases where SWC's stricter module parsing might not be compatible with some Meteor packages.
|
||||
|
||||
Special thanks to the contributors of both projects for making efficient JavaScript minification possible in Meteor applications.
|
||||
|
||||
This package provides a minifier plugin used for Meteor apps by default.
|
||||
|
||||
The behavior of this plugin in development and production modes are depicted below
|
||||
@@ -15,7 +23,7 @@ in the table.
|
||||
|---------------|:-----:|:------:|
|
||||
| Minified | N | Y |
|
||||
| Concatenated | N | Y |
|
||||
| Source Maps | Y | N |
|
||||
| Source Maps | Y | Soon |
|
||||
|
||||
|
||||
|
||||
|
||||
@@ -1,6 +1,6 @@
Package.describe({
  name: 'standard-minifier-js',
  version: '3.0.0',
  version: '3.1.0',
  summary: 'Standard javascript minifiers used with Meteor apps by default.',
  documentation: 'README.md',
});

@@ -12,7 +12,12 @@ Package.registerBuildPlugin({
    'ecmascript'
  ],
  npmDependencies: {
    '@meteorjs/swc-core': '1.1.3',
    'acorn': '8.10.0',
    "@babel/runtime": "7.18.9",
    '@babel/parser': '7.22.7',
    'terser': '5.19.2',
    '@meteorjs/reify': '0.25.4',
  },
  sources: [
    'plugin/minify-js.js',
@@ -1,105 +1,259 @@
|
||||
import { extractModuleSizesTree } from "./stats.js";
|
||||
import fs from 'fs';
|
||||
|
||||
Plugin.registerMinifier({
|
||||
extensions: ['js'],
|
||||
archMatching: 'web',
|
||||
},
|
||||
() => new MeteorMinifier()
|
||||
);
|
||||
export function getConfig() {
  try {
    const meteorAppDir = process.cwd();
    const packageJson = fs.readFileSync(`${meteorAppDir}/package.json`, 'utf8');
    const meteorConfig = JSON.parse(packageJson).meteor;
    return meteorConfig;
  } catch (error) {
    console.log(error);
    return {};
  }
};
|
||||
|
||||
class MeteorMinifier {
|
||||
const statsEnabled = process.env.DISABLE_CLIENT_STATS !== 'true'
|
||||
|
||||
async processFilesForBundle (files, options) {
|
||||
const mode = options.minifyMode;
|
||||
|
||||
// don't minify anything for development
|
||||
if (mode === 'development') {
|
||||
files.forEach(function (file) {
|
||||
file.addJavaScript({
|
||||
data: file.getContentsAsBuffer(),
|
||||
sourceMap: file.getSourceMap(),
|
||||
path: file.getPathInBundle(),
|
||||
});
|
||||
});
|
||||
return;
|
||||
const Meteor = typeof global.Meteor !== 'undefined' ? global.Meteor : {
|
||||
_debug: function(...args) {
|
||||
if (typeof console !== 'undefined' && typeof console.log !== 'undefined' && process.env.NODE_INSPECTOR_IPC) {
|
||||
console.log('[DEBUG]', ...args);
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
// this function tries its best to locate the original source file
|
||||
// that the error being reported was located inside of
|
||||
function maybeThrowMinifyErrorBySourceFile(error, file) {
|
||||
// Profile for test and production environments
|
||||
let Profile;
|
||||
if (typeof global.Profile !== 'undefined') {
|
||||
Profile = global.Profile;
|
||||
} else if (typeof Plugin !== 'undefined' && Plugin.Profile) {
|
||||
Profile = Plugin.Profile;
|
||||
} else {
|
||||
Profile = function (label, func) {
|
||||
return function () {
|
||||
return func.apply(this, arguments);
|
||||
}
|
||||
}
|
||||
Profile.time = function (label, func) {
|
||||
func();
|
||||
}
|
||||
}
|
||||
|
||||
const lines = file.getContentsAsString().split(/\n/);
|
||||
const lineContent = lines[error.line - 1];
|
||||
let swc;
|
||||
|
||||
let originalSourceFileLineNumber = 0;
|
||||
// Register the minifier only when Plugin is available (not in tests)
|
||||
if (typeof Plugin !== 'undefined') {
|
||||
Plugin.registerMinifier({
|
||||
extensions: ['js'],
|
||||
archMatching: 'web',
|
||||
},
|
||||
() => new MeteorMinifier()
|
||||
);
|
||||
}
|
||||
|
||||
// Count backward from the failed line to find the original filename
|
||||
for (let i = (error.line - 1); i >= 0; i--) {
|
||||
let currentLine = lines[i];
|
||||
export class MeteorMinifier {
|
||||
|
||||
// If the line is a boatload of slashes (8 or more), we're in the right place.
|
||||
if (/^\/\/\/{6,}$/.test(currentLine)) {
|
||||
constructor() {
|
||||
this.config = getConfig();
|
||||
}
|
||||
|
||||
// If 4 lines back is the same exact line, we've found the framing.
|
||||
if (lines[i - 4] === currentLine) {
|
||||
_minifyWithSWC(file) {
|
||||
return Profile('_minifyWithSWC', () => {
|
||||
swc = swc || require('@meteorjs/swc-core');
|
||||
const NODE_ENV = process.env.NODE_ENV || 'development';
|
||||
|
||||
let content = file.getContentsAsString();
|
||||
|
||||
// So in that case, 2 lines back is the file path.
|
||||
let originalFilePath = lines[i - 2].substring(3).replace(/\s+\/\//, "");
|
||||
return swc.minifySync(
|
||||
content,
|
||||
{
|
||||
ecma: 5,
|
||||
compress: {
|
||||
drop_debugger: false,
|
||||
|
||||
throw new Error(
|
||||
`terser minification error (${error.name}:${error.message})\n` +
|
||||
`Source file: ${originalFilePath} (${originalSourceFileLineNumber}:${error.col})\n` +
|
||||
`Line content: ${lineContent}\n`);
|
||||
}
|
||||
unused: true,
|
||||
dead_code: true,
|
||||
typeofs: false,
|
||||
|
||||
global_defs: {
|
||||
'process.env.NODE_ENV': NODE_ENV,
|
||||
},
|
||||
},
|
||||
safari10: true,
|
||||
inlineSourcesContent: true
|
||||
}
|
||||
);
|
||||
})();
|
||||
}
|
||||
|
||||
_minifyWithTerser(file) {
|
||||
return Profile('_minifyWithTerser', async () => {
|
||||
let terser = require('terser');
|
||||
const NODE_ENV = process.env.NODE_ENV || 'development';
|
||||
const content = file.getContentsAsString();
|
||||
|
||||
return terser.minify(content, {
|
||||
compress: {
|
||||
drop_debugger: false,
|
||||
unused: false,
|
||||
dead_code: true,
|
||||
global_defs: {
|
||||
"process.env.NODE_ENV": NODE_ENV
|
||||
}
|
||||
originalSourceFileLineNumber++;
|
||||
},
|
||||
// Fix issue meteor/meteor#9866, as explained in this comment:
|
||||
// https://github.com/mishoo/UglifyJS2/issues/1753#issuecomment-324814782
|
||||
// And fix terser issue #117: https://github.com/terser-js/terser/issues/117
|
||||
safari10: true
|
||||
}).then(result => {
|
||||
if (!result) {
|
||||
throw new Error(`Terser produced empty result for ${file.getPathInBundle()}`);
|
||||
}
|
||||
return result;
|
||||
}).catch(error => {
|
||||
throw error;
|
||||
});
|
||||
})();
|
||||
}
|
||||
|
||||
minifyOneFile(file) {
|
||||
return Profile('minifyOneFile', () => {
|
||||
const modern = this.config && (this.config.modern === true || (this.config.modern && this.config.modern.minifier === true));
|
||||
// check if config is an empty object
|
||||
if(this.config && Object.keys(this.config).length === 0 || !modern) {
|
||||
Meteor._debug(`Minifying using Terser | file: ${file.getPathInBundle()}`);
|
||||
return this._minifyWithTerser(file);
|
||||
}
|
||||
|
||||
try {
|
||||
Meteor._debug(`Minifying using SWC | file: ${file.getPathInBundle()}`);
|
||||
return this._minifyWithSWC(file);
|
||||
} catch (swcError) {
|
||||
Meteor._debug(`SWC failed | file: ${file.getPathInBundle()}`);
|
||||
return this._minifyWithTerser(file);
|
||||
}
|
||||
})();
|
||||
}
|
||||
}
|
||||
|
||||
MeteorMinifier.prototype.processFilesForBundle = Profile('processFilesForBundle', async function (files, options) {
|
||||
const mode = options.minifyMode;
|
||||
|
||||
// don't minify anything for development
|
||||
if (mode === 'development') {
|
||||
files.forEach(function (file) {
|
||||
file.addJavaScript({
|
||||
data: file.getContentsAsBuffer(),
|
||||
sourceMap: file.getSourceMap(),
|
||||
path: file.getPathInBundle(),
|
||||
});
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
// this function tries its best to locate the original source file
|
||||
// that the error being reported was located inside of
|
||||
function maybeThrowMinifyErrorBySourceFile(error, file) {
|
||||
const lines = file.getContentsAsString().split(/\n/);
|
||||
const lineContent = lines[error.line - 1];
|
||||
|
||||
let originalSourceFileLineNumber = 0;
|
||||
|
||||
// Count backward from the failed line to find the original filename
|
||||
for (let i = (error.line - 1); i >= 0; i--) {
|
||||
let currentLine = lines[i];
|
||||
|
||||
// If the line is a boatload of slashes (8 or more), we're in the right place.
|
||||
if (/^\/\/\/{6,}$/.test(currentLine)) {
|
||||
|
||||
// If 4 lines back is the same exact line, we've found the framing.
|
||||
if (lines[i - 4] === currentLine) {
|
||||
|
||||
// So in that case, 2 lines back is the file path.
|
||||
let originalFilePath = lines[i - 2].substring(3).replace(/\s+\/\//, "");
|
||||
|
||||
throw new Error(
|
||||
`terser minification error (${error.name}:${error.message})\n` +
|
||||
`Source file: ${originalFilePath} (${originalSourceFileLineNumber}:${error.col})\n` +
|
||||
`Line content: ${lineContent}\n`);
|
||||
}
|
||||
}
|
||||
originalSourceFileLineNumber++;
|
||||
}
|
||||
}
|
||||
|
||||
// this object will collect all the minified code in the
|
||||
// data field and post-minification file sizes in the stats field
|
||||
const toBeAdded = {
|
||||
data: "",
|
||||
stats: Object.create(null)
|
||||
};
|
||||
|
||||
for (let file of files) {
|
||||
// Don't reminify *.min.js.
|
||||
if (/\.min\.js$/.test(file.getPathInBundle())) {
|
||||
toBeAdded.data += file.getContentsAsString();
|
||||
Plugin.nudge();
|
||||
continue;
|
||||
}
|
||||
|
||||
let minified;
|
||||
let label = 'minify file'
|
||||
if (file.getPathInBundle() === 'app/app.js') {
|
||||
label = 'minify app/app.js'
|
||||
}
|
||||
if (file.getPathInBundle() === 'packages/modules.js') {
|
||||
label = 'minify packages/modules.js'
|
||||
}
|
||||
|
||||
// this object will collect all the minified code in the
|
||||
// data field and post-minification file sizes in the stats field
|
||||
const toBeAdded = {
|
||||
data: "",
|
||||
stats: Object.create(null)
|
||||
};
|
||||
|
||||
for await (file of files) {
|
||||
// Don't reminify *.min.js.
|
||||
if (/\.min\.js$/.test(file.getPathInBundle())) {
|
||||
toBeAdded.data += file.getContentsAsString();
|
||||
try {
|
||||
// Need to update this approach for async/await
|
||||
let minifyPromise;
|
||||
Profile.time(label, () => {
|
||||
minifyPromise = this.minifyOneFile(file);
|
||||
});
|
||||
minified = await minifyPromise;
|
||||
|
||||
if (!(minified && typeof minified.code === "string")) {
|
||||
throw new Error(`Invalid minification result for ${file.getPathInBundle()}`);
|
||||
}
|
||||
else {
|
||||
let minified;
|
||||
try {
|
||||
minified = await meteorJsMinify(file.getContentsAsString());
|
||||
}
|
||||
catch (err) {
|
||||
maybeThrowMinifyErrorBySourceFile(err, file);
|
||||
}
|
||||
catch (err) {
|
||||
maybeThrowMinifyErrorBySourceFile(err, file);
|
||||
var filePath = file.getPathInBundle();
|
||||
err.message += " while minifying " + filePath;
|
||||
throw err;
|
||||
}
|
||||
|
||||
throw new Error(`terser minification error (${err.name}:${err.message})\n` +
|
||||
`Bundled file: ${file.getPathInBundle()} (${err.line}:${err.col})\n`);
|
||||
}
|
||||
|
||||
const ast = extractModuleSizesTree(minified.code);
|
||||
|
||||
if (ast) {
|
||||
toBeAdded.stats[file.getPathInBundle()] = [Buffer.byteLength(minified.code), ast];
|
||||
if (statsEnabled) {
|
||||
let tree;
|
||||
Profile.time('extractModuleSizesTree', () => {
|
||||
tree = extractModuleSizesTree(minified.code);
|
||||
if (tree) {
|
||||
toBeAdded.stats[file.getPathInBundle()] = [Buffer.byteLength(minified.code), tree];
|
||||
} else {
|
||||
toBeAdded.stats[file.getPathInBundle()] = Buffer.byteLength(minified.code);
|
||||
}
|
||||
// append the minified code to the "running sum"
|
||||
// of code being minified
|
||||
toBeAdded.data += minified.code;
|
||||
}
|
||||
toBeAdded.data += '\n\n';
|
||||
|
||||
Plugin.nudge();
|
||||
});
|
||||
// Add the minified code outside of the Profile.time
|
||||
toBeAdded.data += minified.code;
|
||||
} else {
|
||||
// If stats are disabled, still need to add the minified code
|
||||
toBeAdded.data += minified.code;
|
||||
}
|
||||
|
||||
// this is where the minified code gets added to one
|
||||
// JS file that is delivered to the client
|
||||
if (files.length) {
|
||||
files[0].addJavaScript(toBeAdded);
|
||||
}
|
||||
toBeAdded.data += '\n\n';
|
||||
|
||||
Plugin.nudge();
|
||||
}
|
||||
}
|
||||
|
||||
// this is where the minified code gets added to one
|
||||
// JS file that is delivered to the client
|
||||
if (files.length) {
|
||||
files[0].addJavaScript(toBeAdded);
|
||||
}
|
||||
});
|
||||
|
||||
@@ -1,6 +1,8 @@
import Visitor from "@meteorjs/reify/lib/visitor.js";
import { findPossibleIndexes } from "@meteorjs/reify/lib/utils.js";
import { Babel } from "meteor/babel-compiler";
import acorn from "acorn";

// This RegExp will be used to scan the source for calls to meteorInstall,
// taking into consideration that the function name may have been mangled
@@ -22,7 +24,22 @@ const meteorInstallRegExp = new RegExp([
export function extractModuleSizesTree(source) {
  const match = meteorInstallRegExp.exec(source);
  if (match) {
    const ast = Babel.parse(source);
    try {
      ast = acorn.parse(source, {
        ecmaVersion: 'latest',
        sourceType: 'script',
        allowAwaitOutsideFunction: true,
        allowImportExportEverywhere: true,
        allowReturnOutsideFunction: true,
        allowHashBang: true,
        checkPrivateFields: false
      });
    }
    catch (e) {
      console.log(`Error while parsing with acorn. Falling back to babel minifier. ${e}`);
      ast = Babel.parse(source);
    }

    let meteorInstallName = "meteorInstall";
    // The minifier may have renamed meteorInstall to something shorter.
    match.some((name, i) => (i > 0 && (meteorInstallName = name)));
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
Package.describe({
|
||||
name: 'typescript',
|
||||
version: '5.6.3',
|
||||
version: '5.6.4',
|
||||
summary:
|
||||
'Compiler plugin that compiles TypeScript and ECMAScript in .ts and .tsx files',
|
||||
documentation: 'README.md',
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
Package.describe({
|
||||
summary: "Serves a Meteor app over HTTP",
|
||||
version: "2.0.6",
|
||||
version: "2.0.7",
|
||||
});
|
||||
|
||||
Npm.depends({
|
||||
|
||||
@@ -132,7 +132,7 @@ var identifyBrowser = function(userAgentString) {
      patch: 0
    };
  }
  var userAgent = lookupUserAgent(userAgentString.substring(0, 150));
  var userAgent = lookupUserAgent(userAgentString);
  return {
    name: camelCase(userAgent.family),
    major: +userAgent.major,

@@ -153,6 +153,25 @@ Tinytest.addAsync("webapp - modern/legacy static files", test => {
  return Promise.all(promises);
});

const specialUserAgent =
  "Mozilla/5.0 (Linux; Android 5.1.1; MI NOTE Pro Build/LMY47V; wv) " +
  "AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/48.0.2564.116 " +
  "Mobile Safari/537.36 baidubrowser/7.7.13.0 (Baidu; P1 5.1.1)";

Tinytest.addAsync("webapp - agent identification", async function (test) {
  const modernBrowser = WebAppInternals.identifyBrowser(modernUserAgent);
  test.equal(modernBrowser.name, "chrome");
  test.equal(modernBrowser.major, 68);
  test.equal(modernBrowser.minor, 0);
  test.equal(modernBrowser.patch, 3440);

  const specialBrowser = WebAppInternals.identifyBrowser(specialUserAgent);
  test.equal(specialBrowser.name, "baiduBrowser");
  test.equal(specialBrowser.major, 7);
  test.equal(specialBrowser.minor, 7);
  test.equal(specialBrowser.patch, 13);
});
|
||||
|
||||
Tinytest.addAsync(
|
||||
"webapp - additional static javascript",
|
||||
async function (test) {
|
||||
|
||||
@@ -42,6 +42,7 @@ const packages = {
|
||||
autopublish: {},
|
||||
"babel-compiler": {
|
||||
serverFiles: ["babel.js", "babel-compiler.js"],
|
||||
ignoredFiles: ["babel-compiler.js"],
|
||||
},
|
||||
"babel-runtime": {},
|
||||
"browser-policy": {},
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"track": "METEOR",
|
||||
"version": "3.2-beta.2",
|
||||
"version": "3.3-rc.0",
|
||||
"recommended": false,
|
||||
"official": false,
|
||||
"description": "Meteor experimental release"
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"track": "METEOR",
|
||||
"version": "3.2.2",
|
||||
"version": "3.3",
|
||||
"recommended": false,
|
||||
"official": true,
|
||||
"description": "The Official Meteor Distribution"
|
||||
|
||||
@@ -5,7 +5,7 @@ set -u
|
||||
|
||||
UNAME=$(uname)
|
||||
ARCH=$(uname -m)
|
||||
NODE_VERSION=22.14.0
|
||||
NODE_VERSION=22.16.0
|
||||
MONGO_VERSION_64BIT=7.0.16
|
||||
MONGO_VERSION_32BIT=3.2.22
|
||||
NPM_VERSION=10.9.2
|
||||
|
||||
@@ -59,6 +59,7 @@ var packageJson = {
|
||||
multipipe: "2.0.1",
|
||||
pathwatcher: "8.1.2",
|
||||
"vscode-nsfw": "2.1.8",
|
||||
"@parcel/watcher": "2.5.1",
|
||||
// The @wry/context package version must be compatible with the
|
||||
// version constraint imposed by optimism/package.json.
|
||||
optimism: "0.16.1",
|
||||
@@ -67,7 +68,8 @@ var packageJson = {
|
||||
"anser": "2.1.1",
|
||||
'xmlbuilder2': '1.8.1',
|
||||
"ws": "7.4.5",
|
||||
"open":"8.4.2"
|
||||
"open":"8.4.2",
|
||||
"acorn": "8.14.1",
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
@@ -261,13 +261,94 @@ export function parseRunTargets(targets) {
|
||||
});
|
||||
};
|
||||
|
||||
const excludableWebArchs = ['web.browser', 'web.browser.legacy', 'web.cordova'];
function filterWebArchs(webArchs, excludeArchsOption) {
  if (excludeArchsOption) {
    const excludeArchs = excludeArchsOption.trim().split(/\s*,\s*/)
      .filter(arch => excludableWebArchs.includes(arch));
    webArchs = webArchs.filter(arch => !excludeArchs.includes(arch));
const DEFAULT_MODERN = {
  transpiler: true,
  webArchOnly: true,
  watcher: true,
};

const normalizeModern = (r = false) => Object.fromEntries(
  Object.entries(DEFAULT_MODERN).map(([k, def]) => [
    k,
    r === true
      ? def
      : r === false || r?.[k] === false
      ? false
      : typeof r?.[k] === 'object'
      ? { ...r[k] }
      : def,
  ]),
);
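Because the diff render strips the +/- markers, `normalizeModern`'s coercion rules are easiest to confirm standalone. This is a self-contained copy of the helper, using the same `DEFAULT_MODERN` shape as above, plus a few illustrative calls.

```javascript
// Standalone copy of the normalizeModern helper from this diff, so its
// coercion rules can be exercised in isolation.
const DEFAULT_MODERN = { transpiler: true, webArchOnly: true, watcher: true };

const normalizeModern = (r = false) => Object.fromEntries(
  Object.entries(DEFAULT_MODERN).map(([k, def]) => [
    k,
    r === true
      ? def                               // modern: true  -> all defaults
      : r === false || r?.[k] === false
      ? false                             // modern: false, or this key disabled
      : typeof r?.[k] === 'object'
      ? { ...r[k] }                       // per-key object config passes through
      : def,                              // key unset -> default
  ]),
);

console.log(normalizeModern(true));               // every feature on
console.log(normalizeModern(false));              // every feature off
console.log(normalizeModern({ watcher: false })); // only the watcher disabled
```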
|
||||
|
||||
|
||||
let modernForced = JSON.parse(process.env.METEOR_MODERN || "false");
|
||||
let meteorConfig;
|
||||
|
||||
function getMeteorConfig(appDir) {
|
||||
if (meteorConfig) return meteorConfig;
|
||||
const packageJsonPath = files.pathJoin(appDir, 'package.json');
|
||||
if (!files.exists(packageJsonPath)) {
|
||||
return false;
|
||||
}
|
||||
const packageJsonFile = files.readFile(packageJsonPath, 'utf8');
|
||||
const packageJson = JSON.parse(packageJsonFile);
|
||||
meteorConfig = packageJson?.meteor;
|
||||
return meteorConfig;
|
||||
}
|
||||
|
||||
function isModernArchsOnlyEnabled(appDir) {
|
||||
const meteorConfig = getMeteorConfig(appDir);
|
||||
return normalizeModern(modernForced || meteorConfig?.modern).webArchOnly !== false;
|
||||
}
|
||||
|
||||
export function isModernWatcherEnabled(appDir) {
|
||||
const meteorConfig = getMeteorConfig(appDir);
|
||||
return normalizeModern(modernForced || meteorConfig?.modern).watcher !== false;
|
||||
}
|
||||
|
||||
function filterWebArchs(webArchs, excludeArchsOption, appDir, options) {
|
||||
const platforms = (options.platforms || []);
|
||||
const isBuildMode = platforms?.length > 0;
|
||||
if (isBuildMode) {
|
||||
// Build Mode
|
||||
const isModernOnlyPlatform = platforms.includes('modern') && !platforms.includes('legacy');
|
||||
if (isModernOnlyPlatform) {
|
||||
webArchs = webArchs.filter(arch => arch !== 'web.browser.legacy');
|
||||
}
|
||||
const hasCordovaPlatforms = platforms.includes('android') || platforms.includes('ios');
|
||||
if (!hasCordovaPlatforms) {
|
||||
webArchs = webArchs.filter(arch => arch !== 'web.cordova');
|
||||
}
|
||||
} else {
|
||||
// Dev & Test Mode
|
||||
const isCordovaDev = (options.args || []).some(arg => ['ios', 'ios-device', 'android', 'android-device'].includes(arg));
|
||||
if (!isCordovaDev) {
|
||||
const excludeArchsOptions = excludeArchsOption ? excludeArchsOption.trim().split(/\s*,\s*/) : [];
|
||||
const hasExcludeArchsOptions = (excludeArchsOptions?.length || 0) > 0;
|
||||
const hasModernArchsOnlyEnabled = appDir && isModernArchsOnlyEnabled(appDir);
|
||||
if (hasExcludeArchsOptions && hasModernArchsOnlyEnabled) {
|
||||
console.warn('modern.webArchOnly and --exclude-archs are both active. If both are set, --exclude-archs takes priority.');
|
||||
}
|
||||
const automaticallyIgnoredLegacyArchs = (!hasExcludeArchsOptions && hasModernArchsOnlyEnabled) ? ['web.browser.legacy', 'web.cordova'] : [];
|
||||
if (hasExcludeArchsOptions || automaticallyIgnoredLegacyArchs.length) {
|
||||
const excludeArchs = [...excludeArchsOptions, ...automaticallyIgnoredLegacyArchs];
|
||||
webArchs = webArchs.filter(arch => !excludeArchs.includes(arch));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const forcedInclArchs = process.env.METEOR_FORCE_INCLUDE_ARCHS;
|
||||
if (forcedInclArchs != null) {
|
||||
const nextInclArchs = forcedInclArchs.trim().split(/\s*,\s*/);
|
||||
webArchs = Array.from(new Set([...webArchs, ...nextInclArchs]));
|
||||
}
|
||||
|
||||
const forcedExclArchs = process.env.METEOR_FORCE_EXCLUDE_ARCHS;
|
||||
if (forcedExclArchs != null) {
|
||||
const nextExclArchs = forcedExclArchs.trim().split(/\s*,\s*/);
|
||||
webArchs = webArchs.filter(_webArch => !nextExclArchs.includes(_webArch));
|
||||
}
|
||||
|
||||
return webArchs;
|
||||
}
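The `METEOR_FORCE_INCLUDE_ARCHS` / `METEOR_FORCE_EXCLUDE_ARCHS` handling at the end of `filterWebArchs` is self-contained enough to exercise on its own. `applyArchEnvOverrides` is a hypothetical name for a standalone sketch of just that final step, with the environment injected so it can run without mutating `process.env`.

```javascript
// Standalone sketch of the env-var arch overrides applied at the end of
// filterWebArchs: forced includes are unioned in, then forced excludes
// are filtered out, so an exclude wins over an include of the same arch.
function applyArchEnvOverrides(webArchs, env = process.env) {
  const forcedIncl = env.METEOR_FORCE_INCLUDE_ARCHS;
  if (forcedIncl != null) {
    const nextInclArchs = forcedIncl.trim().split(/\s*,\s*/);
    webArchs = Array.from(new Set([...webArchs, ...nextInclArchs]));
  }
  const forcedExcl = env.METEOR_FORCE_EXCLUDE_ARCHS;
  if (forcedExcl != null) {
    const nextExclArchs = forcedExcl.trim().split(/\s*,\s*/);
    webArchs = webArchs.filter(arch => !nextExclArchs.includes(arch));
  }
  return webArchs;
}
```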
|
||||
|
||||
@@ -509,11 +590,16 @@ async function doRunCommand(options) {
|
||||
webArchs.push("web.cordova");
|
||||
}
|
||||
}
|
||||
webArchs = filterWebArchs(webArchs, options['exclude-archs']);
|
||||
|
||||
webArchs = filterWebArchs(webArchs, options['exclude-archs'], options.appDir, options);
|
||||
// Set the webArchs to include for compilation later
|
||||
global.includedWebArchs = webArchs;
|
||||
|
||||
const buildMode = options.production ? 'production' : 'development';
|
||||
|
||||
let cordovaRunner;
|
||||
if (!_.isEmpty(runTargets)) {
|
||||
const shouldDisableCordova = Boolean(JSON.parse(process.env.METEOR_CORDOVA_DISABLE || 'false'));
|
||||
if (!shouldDisableCordova && !_.isEmpty(runTargets)) {
|
||||
|
||||
async function prepareCordovaProject() {
|
||||
import { CordovaProject } from '../cordova/project.js';
|
||||
@@ -570,7 +656,7 @@ async function doRunCommand(options) {
|
||||
open(`http://localhost:${options.port}`)
|
||||
}
|
||||
}
|
||||
},
|
||||
},
|
||||
open: options.open,
|
||||
});
|
||||
}
|
||||
@@ -1344,7 +1430,7 @@ var buildCommand = async function (options) {
|
||||
let selectedPlatforms = null;
|
||||
if (options.platforms) {
|
||||
const platformsArray = options.platforms.split(",");
|
||||
|
||||
const excludableWebArchs = ['web.browser', 'web.browser.legacy', 'web.cordova'];
|
||||
platformsArray.forEach(plat => {
|
||||
if (![...excludableWebArchs, 'android', 'ios'].includes(plat)) {
|
||||
throw new Error(`Not allowed platform on '--platforms' flag: ${plat}`)
|
||||
@@ -1391,9 +1477,9 @@ on an OS X system.");
|
||||
// For example, if we want to build only android, there is no need to build
|
||||
// web.browser.
|
||||
let webArchs;
|
||||
const baseWebArchs = projectContext.platformList.getWebArchs();
|
||||
if (selectedPlatforms) {
|
||||
const filteredArchs = projectContext.platformList
|
||||
.getWebArchs()
|
||||
const filteredArchs = baseWebArchs
|
||||
.filter(arch => selectedPlatforms.includes(arch));
|
||||
|
||||
if (
|
||||
@@ -1404,6 +1490,11 @@ on an OS X system.");
|
||||
}
|
||||
|
||||
webArchs = filteredArchs.length ? filteredArchs : undefined;
|
||||
} else {
|
||||
webArchs = filterWebArchs(baseWebArchs, options['exclude-archs'], options.appDir, {
|
||||
...options,
|
||||
platforms: projectContext.platformList.getPlatforms(),
|
||||
});
|
||||
}
|
||||
|
||||
var buildDir = projectContext.getProjectLocalDirectory('build_tar');
|
||||
@@ -1561,7 +1652,7 @@ https://guide.meteor.com/cordova.html#submitting-android
|
||||
});
|
||||
}
|
||||
|
||||
await files.rm_recursive(buildDir);
|
||||
await files.rm_recursive_deferred(buildDir);
|
||||
|
||||
const npmShrinkwrapFilePath = files.pathJoin(bundlePath, 'programs/server/npm-shrinkwrap.json');
|
||||
if (files.exists(npmShrinkwrapFilePath)) {
|
||||
@@ -2113,7 +2204,10 @@ testCommandOptions = {
|
||||
|
||||
'extra-packages': { type: String },
|
||||
|
||||
'exclude-archs': { type: String }
|
||||
'exclude-archs': { type: String },
|
||||
|
||||
// Same as TINYTEST_FILTER
|
||||
filter: { type: String, short: 'f' },
|
||||
}
|
||||
};
|
||||
|
||||
@@ -2134,6 +2228,9 @@ main.registerCommand(Object.assign(
|
||||
});
|
||||
|
||||
async function doTestCommand(options) {
|
||||
if (options.filter) {
|
||||
process.env.TINYTEST_FILTER = options.filter;
|
||||
}
|
||||
// This "metadata" is accessed in a few places. Using a global
|
||||
// variable here was more expedient than navigating the many layers
|
||||
// of abstraction across the build process.
|
||||
@@ -2465,7 +2562,9 @@ var runTestAppForPackages = async function (projectContext, options) {
|
||||
if (options.cordovaRunner) {
|
||||
webArchs.push("web.cordova");
|
||||
}
|
||||
buildOptions.webArchs = filterWebArchs(webArchs, options['exclude-archs']);
|
||||
buildOptions.webArchs = filterWebArchs(webArchs, options['exclude-archs'], projectContext.appDirectory, options);
|
||||
// Set the webArchs to include for compilation later
|
||||
global.includedWebArchs = buildOptions.webArchs;
|
||||
|
||||
if (options.deploy) {
|
||||
// Run the constraint solver and build local packages.
|
||||
@@ -3350,7 +3449,7 @@ const setupBenchmarkSuite = async (profilingPath) => {
|
||||
process.env.GIT_TERMINAL_PROMPT = 0;
|
||||
|
||||
const repoUrl = "https://github.com/meteor/performance";
|
||||
const branch = "v3.2.0";
|
||||
const branch = "v3.3.0";
|
||||
const gitCommand = [
|
||||
`mkdir -p ${profilingPath}`,
|
||||
`git clone --no-checkout --depth 1 --filter=tree:0 --sparse --progress --branch ${branch} --single-branch ${repoUrl} ${profilingPath}`,
|
||||
@@ -3389,9 +3488,10 @@ async function doBenchmarkCommand(options) {
|
||||
|
||||
const meteorSizeEnvs = [
|
||||
!!options['size-only'] && 'METEOR_BUNDLE_SIZE_ONLY=true',
|
||||
!!options['size'] && 'METEOR_BUNDLE_SIZE=true'
|
||||
!!options['size'] && 'METEOR_BUNDLE_SIZE=true',
|
||||
!!options['build'] && 'METEOR_BUNDLE_BUILD=true',
|
||||
].filter(Boolean);
|
||||
const meteorOptions = args.filter(arg => !['--size-only', '--size'].includes(arg));
|
||||
const meteorOptions = args.filter(arg => !['--size-only', '--size', '--build'].includes(arg));
|
||||
|
||||
const profilingCommand = [
|
||||
`${meteorSizeEnvs.join(' ')} ${profilingPath}/scripts/monitor-bundler.sh ${projectContext.projectDir} ${new Date().getTime()} ${meteorOptions.join(' ')}`.trim(),
|
||||
@@ -3407,9 +3507,11 @@ main.registerCommand(
|
||||
name: 'profile',
|
||||
maxArgs: Infinity,
|
||||
options: {
|
||||
...buildCommands.options || {},
|
||||
...runCommandOptions.options || {},
|
||||
'size': { type: Boolean },
|
||||
'size-only': { type: Boolean },
|
||||
'size': { type: Boolean },
|
||||
'size-only': { type: Boolean },
|
||||
'build': { type: Boolean },
|
||||
},
|
||||
catalogRefresh: new catalog.Refresh.Never(),
|
||||
}, doBenchmarkCommand);
|
||||
|
||||
@@ -1173,6 +1173,7 @@ Use METEOR_LOG_DIR=<path-to-directory> to set a custom log directory.
Options:
  --size       monitor both bundle runtime and size
  --size-only  monitor only the bundle size
  --build      monitor build time

The rest of the options for this command are the same as those for the meteor run command.
You can pass typical runtime options (such as --settings, --exclude-archs, etc.)
|
||||
|
||||
@@ -287,7 +287,7 @@ main.captureAndExit = async function (header, title, f) {
|
||||
|
||||
// NB: files required up to this point may not define commands
|
||||
|
||||
require('./commands.js');
|
||||
const { isModernWatcherEnabled } = require('./commands.js');
|
||||
require('./commands-packages.js');
|
||||
require('./commands-packages-query.js');
|
||||
require('./commands-cordova.js');
|
||||
@@ -865,6 +865,7 @@ makeGlobalAsyncLocalStorage().run({}, async function () {
|
||||
var appDir = files.findAppDir();
|
||||
if (appDir) {
|
||||
appDir = files.pathResolve(appDir);
|
||||
global.modernWatcher = isModernWatcherEnabled(appDir);
|
||||
}
|
||||
|
||||
await require('../tool-env/isopackets.js').ensureIsopacketsLoadable();
|
||||
@@ -1543,6 +1544,9 @@ makeGlobalAsyncLocalStorage().run({}, async function () {
|
||||
});
|
||||
}
|
||||
|
||||
// Set the currentCommand in the global object to spread context
|
||||
global.currentCommand = { name: command.name, options };
|
||||
|
||||
var ret = await command.func(options, { rawOptions });
|
||||
|
||||
} catch (e) {
|
||||
|
||||
@@ -358,6 +358,20 @@ export const rm_recursive = Profile("files.rm_recursive", async (path: string) =
  }
});

export const rm_recursive_deferred = Profile("files.rm_recursive_deferred", async (path: string) => {
  // Generate a temp path name for the old build directory
  const oldBuildPath = path + '.old-' + Math.floor(Math.random() * 999999);
  // If the original buildPath exists, rename it first
  if (exists(path)) {
    await rename(path, oldBuildPath);
    // Start deletion of old directory asynchronously without awaiting
    rm_recursive(oldBuildPath).catch(e => {
      // Log error but don't fail the build
      console.error(`Error removing old build directory ${oldBuildPath}:`, e);
    });
  }
});

// Returns the base64 SHA256 of the given file.
export function fileHash(filename: string) {
  const crypto = require('crypto');
|
||||
|
||||
476
tools/fs/safe-watcher-legacy.ts
Normal file
476
tools/fs/safe-watcher-legacy.ts
Normal file
@@ -0,0 +1,476 @@
|
||||
import { FSWatcher, Stats, BigIntStats } from "fs";
|
||||
import { Profile } from "../tool-env/profile";
|
||||
import {
|
||||
statOrNull,
|
||||
convertToOSPath,
|
||||
watchFile,
|
||||
unwatchFile,
|
||||
toPosixPath,
|
||||
pathRelative
|
||||
} from "./files";
|
||||
import {
|
||||
join as nativeJoin
|
||||
} from 'path';
|
||||
import nsfw from 'vscode-nsfw';
|
||||
|
||||
const pathwatcher = require('pathwatcher');
|
||||
|
||||
// Default to prioritizing changed files, but disable that behavior (and
|
||||
// thus prioritize all files equally) if METEOR_WATCH_PRIORITIZE_CHANGED
|
||||
// is explicitly set to a string that parses to a falsy value.
|
||||
var PRIORITIZE_CHANGED = true;
|
||||
if (process.env.METEOR_WATCH_PRIORITIZE_CHANGED &&
|
||||
! JSON.parse(process.env.METEOR_WATCH_PRIORITIZE_CHANGED)) {
|
||||
PRIORITIZE_CHANGED = false;
|
||||
}
|
||||
|
||||
var DEFAULT_POLLING_INTERVAL =
|
||||
+(process.env.METEOR_WATCH_POLLING_INTERVAL_MS || 5000);
|
||||
|
||||
var NO_WATCHER_POLLING_INTERVAL =
|
||||
+(process.env.METEOR_WATCH_POLLING_INTERVAL_MS || 500);
|
||||
|
||||
// This may seem like a long time to wait before actually closing the
// file watchers, but it's to our advantage if they survive restarts.
const WATCHER_CLEANUP_DELAY_MS = 30000;

// Since Linux doesn't have recursive file watching, nsfw has to walk the
// watched folder and create a separate watcher for each subfolder. Until it has a
// way for us to filter which folders it walks, we will continue to use
// pathwatcher to avoid having too many watchers.
let watcherLibrary = process.env.METEOR_WATCHER_LIBRARY ||
  (process.platform === 'linux' ? 'pathwatcher' : 'nsfw');
|
||||
|
||||
// Pathwatcher complains (using console.error, ugh) if you try to watch
|
||||
// two files with the same stat.ino number but different paths on linux, so we have
|
||||
// to deduplicate files by ino.
|
||||
const DEDUPLICATE_BY_INO = watcherLibrary === 'pathwatcher';
|
||||
// Set METEOR_WATCH_FORCE_POLLING environment variable to a truthy value to
|
||||
// force the use of files.watchFile instead of watchLibrary.watch.
|
||||
let watcherEnabled = ! JSON.parse(
|
||||
process.env.METEOR_WATCH_FORCE_POLLING || "false"
|
||||
);
|
||||
|
||||
const entriesByIno = new Map;
|
||||
|
||||
export type SafeWatcher = {
|
||||
close: () => void;
|
||||
}
|
||||
|
||||
type EntryCallback = (event: string) => void;
|
||||
|
||||
interface Entry extends SafeWatcher {
|
||||
callbacks: Set<EntryCallback>;
|
||||
rewatch: () => void;
|
||||
release: (callback: EntryCallback) => void;
|
||||
_fire: (event: string) => void;
|
||||
}
|
||||
|
||||
const entries: Record<string, Entry | null> = Object.create(null);
|
||||
|
||||
// Folders that are watched recursively
|
||||
let watchRoots = new Set<string>();
|
||||
|
||||
// Set of paths for which a change event has been fired, watched with
|
||||
// watchLibrary.watch if available. This could be an LRU cache, but in
|
||||
// practice it should never grow large enough for that to matter.
|
||||
const changedPaths = new Set;
|
||||
|
||||
function hasPriority(absPath: string) {
|
||||
// If we're not prioritizing changed files, then all files have
|
||||
// priority, which means they should be watched with native file
|
||||
// watchers if the platform supports them. If we are prioritizing
|
||||
// changed files, then only changed files get priority.
|
||||
return PRIORITIZE_CHANGED
|
||||
? changedPaths.has(absPath)
|
||||
: true;
|
||||
}
|
||||
|
||||
function acquireWatcher(absPath: string, callback: EntryCallback) {
|
||||
const entry = entries[absPath] || (
|
||||
entries[absPath] = startNewWatcher(absPath));
|
||||
|
||||
// Watches successfully established in the past may have become invalid
|
||||
// because the watched file was deleted or renamed, so we need to make
|
||||
// sure we're still watching every time we call safeWatcher.watch.
|
||||
entry.rewatch();
|
||||
|
||||
// The size of the entry.callbacks Set also serves as a reference count
|
||||
// for this watcher.
|
||||
entry.callbacks.add(callback);
|
||||
|
||||
return entry;
|
||||
}
|
||||
|
||||
function startNewWatcher(absPath: string): Entry {
|
||||
let stat: Stats | BigIntStats | null | undefined = null;
|
||||
|
||||
if (DEDUPLICATE_BY_INO) {
|
||||
stat = statOrNull(absPath);
|
||||
if (stat && stat.ino > 0 && entriesByIno.has(stat.ino)) {
|
||||
const entry = entriesByIno.get(stat.ino);
|
||||
if (entries[absPath] === entry) {
|
||||
return entry;
|
||||
}
|
||||
}
|
||||
} else {
|
||||
let entry = entries[absPath];
|
||||
if (entry) {
|
||||
return entry;
|
||||
}
|
||||
}
|
||||
|
||||
function safeUnwatch() {
|
||||
if (watcher) {
|
||||
watcher.close();
|
||||
watcher = null;
|
||||
if (stat && stat.ino > 0) {
|
||||
entriesByIno.delete(stat.ino);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
let lastWatcherEventTime = Date.now();
|
||||
const callbacks = new Set<EntryCallback>();
|
||||
let watcherCleanupTimer: ReturnType<typeof setTimeout> | null = null;
|
||||
let watcher: FSWatcher | null = null;
|
||||
|
||||
// Determines the polling interval to be used for the fs.watchFile-based
|
||||
// safety net that works on all platforms and file systems.
|
||||
function getPollingInterval() {
|
||||
if (hasPriority(absPath)) {
|
||||
// Regardless of whether we have a native file watcher and it works
|
||||
// correctly on this file system, poll prioritized files (that is,
|
||||
// files that have been changed at least once) at a higher frequency
|
||||
// (every 500ms by default).
|
||||
return NO_WATCHER_POLLING_INTERVAL;
|
||||
}
|
||||
|
||||
if (watcherEnabled || PRIORITIZE_CHANGED) {
|
||||
// As long as native file watching is enabled (even if it doesn't
|
||||
// work correctly) and the developer hasn't explicitly opted out of
|
||||
// the file watching priority system, poll unchanged files at a
|
||||
// lower frequency (every 5000ms by default).
|
||||
return DEFAULT_POLLING_INTERVAL;
|
||||
}
|
||||
|
||||
// If native file watching is disabled and the developer has
|
||||
// explicitly opted out of the priority system, poll everything at the
|
||||
// higher frequency (every 500ms by default). Note that this leads to
|
||||
// higher idle CPU usage, so the developer may want to adjust the
|
||||
// METEOR_WATCH_POLLING_INTERVAL_MS environment variable.
|
||||
return NO_WATCHER_POLLING_INTERVAL;
|
||||
}
|
||||
|
||||
function fire(event: string) {
|
||||
if (event !== "change") {
|
||||
// When we receive a "delete" or "rename" event, the watcher is
|
||||
// probably not going to generate any more notifications for this
|
||||
// file, so we close and nullify the watcher to ensure that
|
||||
// entry.rewatch() will attempt to reestablish the watcher the next
|
||||
// time we call safeWatcher.watch.
|
||||
safeUnwatch();
|
||||
|
||||
// Make sure we don't throttle the watchFile callback after a
|
||||
// "delete" or "rename" event, since it is now our only reliable
|
||||
// source of file change notifications.
|
||||
lastWatcherEventTime = 0;
|
||||
|
||||
} else {
|
||||
changedPaths.add(absPath);
|
||||
rewatch();
|
||||
}
|
||||
|
||||
callbacks.forEach(cb => cb(event));
|
||||
}
|
||||
|
||||
function watchWrapper(event: string) {
|
||||
lastWatcherEventTime = Date.now();
|
||||
fire(event);
|
||||
|
||||
// It's tempting to call unwatchFile(absPath, watchFileWrapper) here,
|
||||
// but previous watcher success is no guarantee of future watcher
|
||||
// reliability. For example, watchLibrary.watch works just fine when file
|
||||
// changes originate from within a Vagrant VM, but changes to shared
|
||||
// files made outside the VM are invisible to watcher, so our only
|
||||
// hope of catching them is to continue polling.
|
||||
}
|
||||
|
||||
function rewatch() {
|
||||
if (hasPriority(absPath)) {
|
||||
if (watcher) {
|
||||
// Already watching; nothing to do.
|
||||
return;
|
||||
}
|
||||
watcher = watchLibraryWatch(absPath, watchWrapper);
|
||||
} else if (watcher) {
|
||||
safeUnwatch();
|
||||
}
|
||||
|
||||
// Since we're about to restart the stat-based file watcher, we don't
|
||||
// want to miss any of its events because of the lastWatcherEventTime
|
||||
// throttling that it attempts to do.
|
||||
lastWatcherEventTime = 0;
|
||||
|
||||
// We use files.watchFile in addition to watcher.watch as a fail-safe
|
||||
// to detect file changes even on network file systems. However
|
||||
// (unless the user disabled watcher or this watcher call failed), we
|
||||
// use a relatively long default polling interval of 5000ms to save
|
||||
// CPU cycles.
|
||||
statWatch(absPath, getPollingInterval(), watchFileWrapper);
|
||||
}
|
||||
|
||||
function watchFileWrapper(newStat: Stats, oldStat: Stats) {
|
||||
if (newStat.ino === 0 &&
|
||||
oldStat.ino === 0 &&
|
||||
+newStat.mtime === +oldStat.mtime) {
|
||||
// Node calls the watchFile listener once with bogus identical stat
|
||||
// objects, which should not trigger a file change event.
|
||||
return;
|
||||
}
|
||||
|
||||
// If a watcher event fired in the last polling interval, ignore
|
||||
// this event.
|
||||
if (Date.now() - lastWatcherEventTime > getPollingInterval()) {
|
||||
fire("change");
|
||||
}
|
||||
}
|
||||
|
||||
const entry = {
|
||||
callbacks,
|
||||
rewatch,
|
||||
|
||||
release(callback: EntryCallback) {
|
||||
if (! entries[absPath]) {
|
||||
return;
|
||||
}
|
||||
|
||||
callbacks.delete(callback);
|
||||
if (callbacks.size > 0) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Once there are no more callbacks in the Set, close both watchers
|
||||
// and nullify the shared data.
|
||||
if (watcherCleanupTimer) {
|
||||
clearTimeout(watcherCleanupTimer);
|
||||
}
|
||||
|
||||
watcherCleanupTimer = setTimeout(() => {
|
||||
if (callbacks.size > 0) {
|
||||
// If another callback was added while the timer was pending, we
|
||||
// can avoid tearing anything down.
|
||||
return;
|
||||
}
|
||||
entry.close();
|
||||
}, WATCHER_CLEANUP_DELAY_MS);
|
||||
},
|
||||
|
||||
close() {
|
||||
if (entries[absPath] !== entry) return;
|
||||
entries[absPath] = null;
|
||||
|
||||
if (watcherCleanupTimer) {
|
||||
clearTimeout(watcherCleanupTimer);
|
||||
watcherCleanupTimer = null;
|
||||
}
|
||||
|
||||
safeUnwatch();
|
||||
|
||||
unwatchFile(absPath, watchFileWrapper);
|
||||
},
|
||||
_fire: fire
|
||||
};
|
||||
|
||||
if (stat && stat.ino > 0) {
|
||||
entriesByIno.set(stat.ino, entry);
|
||||
}
|
||||
|
||||
return entry;
|
||||
}
|
||||
|
||||
export function closeAllWatchers() {
|
||||
Object.keys(entries).forEach(absPath => {
|
||||
const entry = entries[absPath];
|
||||
if (entry) {
|
||||
entry.close();
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
const statWatchers = Object.create(null);
|
||||
|
||||
function statWatch(
|
||||
absPath: string,
|
||||
interval: number,
|
||||
callback: (current: Stats, previous: Stats) => void,
|
||||
) {
|
||||
let statWatcher = statWatchers[absPath];
|
||||
|
||||
if (!statWatcher) {
|
||||
statWatcher = {
|
||||
interval,
|
||||
changeListeners: [],
|
||||
stat: null
|
||||
};
|
||||
statWatchers[absPath] = statWatcher;
|
||||
}
|
||||
|
||||
// If the interval needs to be changed, replace the watcher.
|
||||
// Node will only recreate the watcher with the new interval if all old
|
||||
// watchers are stopped (which unwatchFile does when not passed a
|
||||
// specific listener)
|
||||
if (statWatcher.interval !== interval && statWatcher.stat) {
|
||||
// This stops all stat watchers for the file, not just those created by
|
||||
// statWatch
|
||||
unwatchFile(absPath);
|
||||
statWatcher.stat = null;
|
||||
statWatcher.interval = interval;
|
||||
}
|
||||
|
||||
if (!statWatcher.changeListeners.includes(callback)) {
|
||||
statWatcher.changeListeners.push(callback);
|
||||
}
|
||||
|
||||
if (!statWatcher.stat) {
|
||||
const newStat = watchFile(absPath, {
|
||||
persistent: false, // never persistent
|
||||
interval,
|
||||
}, (newStat, oldStat) => {
|
||||
statWatcher.changeListeners.forEach((
|
||||
listener: (newStat: Stats, oldStat: Stats) => void
|
||||
) => {
|
||||
listener(newStat, oldStat);
|
||||
});
|
||||
});
|
||||
|
||||
newStat.on("stop", () => {
|
||||
if (statWatchers[absPath] === statWatch) {
|
||||
delete statWatchers[absPath];
|
||||
}
|
||||
});
|
||||
|
||||
statWatcher.stat = newStat;
|
||||
}
|
||||
|
||||
return statWatcher;
|
||||
}
|
||||
|
||||
function watchLibraryWatch(absPath: string, callback: EntryCallback) {
|
||||
if (watcherEnabled && watcherLibrary === 'pathwatcher') {
|
||||
try {
|
||||
return pathwatcher.watch(convertToOSPath(absPath), callback);
|
||||
} catch (e: any) {
|
||||
maybeSuggestRaisingWatchLimit(e);
|
||||
// ... ignore the error. We'll still have watchFile, which is good
|
||||
// enough.
|
||||
}
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
let suggestedRaisingWatchLimit = false;
|
||||
|
||||
function maybeSuggestRaisingWatchLimit(error: Error & { errno: number }) {
|
||||
var constants = require('constants');
|
||||
var archinfo = require('../utils/archinfo');
|
||||
if (! suggestedRaisingWatchLimit &&
|
||||
// Note: the not-super-documented require('constants') maps from
|
||||
// strings to SYSTEM errno values. System errno values aren't the same
|
||||
// as the numbers used internally by libuv! Once we're upgraded
|
||||
// to Node 0.12, we'll have the system errno as a string (on 'code'),
|
||||
// but the support for that wasn't in Node 0.10's uv.
|
||||
// See our PR https://github.com/atom/node-pathwatcher/pull/53
|
||||
// (and make sure to read the final commit message, not the original
|
||||
// proposed PR, which had a slightly different interface).
|
||||
error.errno === constants.ENOSPC &&
|
||||
// The only suggestion we currently have is for Linux.
|
||||
archinfo.matches(archinfo.host(), 'os.linux')) {
|
||||
|
||||
// Check suggestedRaisingWatchLimit again because archinfo.host() may
|
||||
// have yielded.
|
||||
if (suggestedRaisingWatchLimit) return;
|
||||
suggestedRaisingWatchLimit = true;
|
||||
|
||||
var Console = require('../console/console.js').Console;
|
||||
if (! Console.isHeadless()) {
|
||||
Console.arrowWarn(
|
||||
"It looks like a simple tweak to your system's configuration will " +
|
||||
"make many tools (including this Meteor command) more efficient. " +
|
||||
"To learn more, see " +
|
||||
Console.url("https://github.com/meteor/docs/blob/master/long-form/file-change-watcher-efficiency.md"));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export const watch = Profile(
|
||||
"safeWatcher.watchLegacy",
|
||||
(absPath: string, callback: EntryCallback) => {
|
||||
const entry = acquireWatcher(absPath, callback);
|
||||
return {
|
||||
close() {
|
||||
entry.release(callback);
|
||||
}
|
||||
} as SafeWatcher;
|
||||
}
|
||||
);
|
||||
|
||||
const fireNames = {
|
||||
[nsfw.actions.CREATED]: 'change',
|
||||
[nsfw.actions.MODIFIED]: 'change',
|
||||
[nsfw.actions.DELETED]: 'delete'
|
||||
}
|
||||
|
||||
export function addWatchRoot(absPath: string) {
|
||||
if (watchRoots.has(absPath) || watcherLibrary !== 'nsfw' || !watcherEnabled) {
|
||||
return;
|
||||
}
|
||||
|
||||
watchRoots.add(absPath);
|
||||
|
||||
// If there already is a watcher for a parent directory, there is no need
|
||||
// to create this watcher.
|
||||
for (const path of watchRoots) {
|
||||
let relativePath = pathRelative(path, absPath);
|
||||
if (
|
||||
path !== absPath &&
|
||||
!relativePath.startsWith('..') &&
|
||||
!relativePath.startsWith('/')
|
||||
) {
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// TODO: check if there are any existing watchers that are children of this
|
||||
// watcher and stop them
|
||||
|
||||
nsfw(
|
||||
convertToOSPath(absPath),
|
||||
(events) => {
|
||||
events.forEach(event => {
|
||||
if(event.action === nsfw.actions.RENAMED) {
|
||||
let oldPath = nativeJoin(event.directory, event.oldFile);
|
||||
let oldEntry = entries[toPosixPath(oldPath)];
|
||||
if (oldEntry) {
|
||||
oldEntry._fire('rename');
|
||||
}
|
||||
|
||||
let path = nativeJoin(event.newDirectory, event.newFile);
|
||||
let newEntry = entries[toPosixPath(path)];
|
||||
if (newEntry) {
|
||||
newEntry._fire('change');
|
||||
}
|
||||
} else {
|
||||
let path = nativeJoin(event.directory, event.file);
|
||||
let entry = entries[toPosixPath(path)];
|
||||
if (entry) {
|
||||
entry._fire(fireNames[event.action]);
|
||||
}
|
||||
}
|
||||
})
|
||||
}
|
||||
).then(watcher => {
|
||||
watcher.start()
|
||||
});
|
||||
}
|
||||
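The acquire/release lifecycle above (one shared entry per path, with the callback Set doubling as a reference count) can be reduced to a small self-contained sketch. All names here are illustrative, not the real safeWatcher API, and the deferred `WATCHER_CLEANUP_DELAY_MS` teardown is collapsed to an immediate close to keep the sketch synchronous:

```typescript
// Reference-counted watcher entries: each path has one shared entry,
// callbacks act as the reference count, and the last release closes it.
type Callback = (event: string) => void;

interface DemoEntry {
  callbacks: Set<Callback>;
  closed: boolean;
}

const demoEntries = new Map<string, DemoEntry>();

function acquire(path: string, cb: Callback): () => void {
  let entry = demoEntries.get(path);
  if (!entry) {
    entry = { callbacks: new Set(), closed: false };
    demoEntries.set(path, entry);
  }
  const e = entry;
  e.callbacks.add(cb);
  return function release() {
    e.callbacks.delete(cb);
    if (e.callbacks.size === 0) {
      // The real code defers this behind a setTimeout so that a quick
      // re-acquire can reuse the entry; here we close immediately.
      e.closed = true;
      demoEntries.delete(path);
    }
  };
}

const events: string[] = [];
const release1 = acquire("/app/main.js", e => events.push("a:" + e));
const release2 = acquire("/app/main.js", e => events.push("b:" + e));

// Both callbacks share one entry.
console.log(demoEntries.size); // 1
demoEntries.get("/app/main.js")!.callbacks.forEach(cb => cb("change"));
console.log(events); // [ 'a:change', 'b:change' ]

release1();
console.log(demoEntries.size); // 1 (still referenced by the second callback)
release2();
console.log(demoEntries.size); // 0 (last release tears the entry down)
```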
File diff suppressed because it is too large
@@ -1,7 +1,7 @@
 import assert from "assert";
 import {WatchSet, readAndWatchFile, sha1} from '../fs/watch';
 import files, {
-  symlinkWithOverwrite, realpath,
+  symlinkWithOverwrite, realpath, rm_recursive_deferred,
 } from '../fs/files';
 import NpmDiscards from './npm-discards';
 import {Profile} from '../tool-env/profile';
@@ -126,7 +126,8 @@ Previous builder: ${previousBuilder.outputPath}, this builder: ${outputPath}`
   async init() {
     // Build the output from scratch
     if (this.resetBuildPath) {
-      await files.rm_recursive(this.buildPath);
+      await files.rm_recursive_deferred(this.buildPath);
+      // Create the new build directory immediately without waiting for deletion
       await files.mkdir_p(this.buildPath, 0o755);
     }
     this.watchSet = new WatchSet();
@@ -881,7 +882,7 @@ Previous builder: ${previousBuilder.outputPath}, this builder: ${outputPath}`
         removed[path] = true;
       } else {
         // directory
-        await files.rm_recursive(absPath);
+        await files.rm_recursive_deferred(absPath);

         // mark all sub-paths as removed, too
         paths.forEach((anotherPath) => {
@@ -904,7 +905,7 @@ Previous builder: ${previousBuilder.outputPath}, this builder: ${outputPath}`

   // Delete the partially-completed bundle. Do not disturb outputPath.
   abort() {
-    return files.rm_recursive(this.buildPath);
+    return files.rm_recursive_deferred(this.buildPath);
   }

   // Returns a WatchSet representing all files that were read from disk by the
@@ -936,7 +937,7 @@ async function atomicallyRewriteFile(path, data, options) {
       // replacing a directory with a file; this is rare (so it can
       // be a slow path) but can legitimately happen if e.g. a developer
       // puts a file where there used to be a directory in their app.
-      await files.rm_recursive(path);
+      await files.rm_recursive_deferred(path);
       files.rename(rpath, path);
     } else {
       throw e;
@@ -1546,6 +1546,9 @@ class Target {
   // with the original sources.
   rewriteSourceMaps() {
     const rewriteSourceMap = function (sm) {
+      if (!sm.sources) {
+        return sm;
+      }
       sm.sources = sm.sources.map(function (path) {
         const prefix = SOURCE_URL_PREFIX;
         if (path.slice(0, prefix.length) === prefix) {
@@ -1832,7 +1832,7 @@ export class PackageSourceBatch {
     if (cacheFilename) {
       // Write asynchronously.
       try {
-        await files.rm_recursive(wildcardCacheFilename);
+        await files.rm_recursive_deferred(wildcardCacheFilename);
       } finally {
         await files.writeFileAtomically(cacheFilename, retAsJSON);
       }
@@ -150,6 +150,7 @@ compiler.compile = Profile(function (packageSource, options) {
   // Isopack#initFromPath).
   var isobuildFeatures = [];
   packageSource.architectures.forEach((sourceArch) => {
+    if (global.includedWebArchs != null && ![...global.includedWebArchs, 'os'].includes(sourceArch.arch)) return;
     sourceArch.uses.forEach((use) => {
       if (!use.weak && isIsobuildFeaturePackage(use.package) &&
           isobuildFeatures.indexOf(use.package) === -1) {
@@ -181,6 +182,7 @@ compiler.compile = Profile(function (packageSource, options) {
     if (architecture.arch === 'web.cordova' && ! includeCordovaUnibuild) {
       continue;
     }
+    if (global.includedWebArchs != null && ![...global.includedWebArchs, 'os'].includes(architecture.arch)) continue;

     // TODO -> Maybe this withCache will bring some problems in other commands.
     await files.withCache(async () => {
@@ -226,6 +228,7 @@ compiler.lint = Profile(function (packageSource, options) {
         && architecture.arch === 'web.cordova') {
       continue;
     }
+    if (global.includedWebArchs != null && ![...global.includedWebArchs, 'os'].includes(architecture.arch)) continue;

     const unibuildWarnings = await lintUnibuild({
       isopack: options.isopack,
@@ -246,6 +249,8 @@ compiler.getMinifiers = async function (packageSource, options) {

   var minifiers = [];
   for (const architecture of packageSource.architectures) {
+    if (global.includedWebArchs != null && ![...global.includedWebArchs, 'os'].includes(architecture.arch)) continue;
+
     var activePluginPackages = await getActivePluginPackages(options.isopack, {
       isopackCache: options.isopackCache,
       uses: architecture.uses
@@ -791,8 +796,9 @@ async function runLinters({inputSourceArch, isopackCache, sources,

     const absPath = files.pathResolve(inputSourceArch.sourceRoot, relPath);
     const hash = optimisticHashOrNull(absPath);
-    const contents = optimisticReadFile(absPath);
-    watchSet.addFile(absPath, hash);
+    if (!watchSet.hasFile(absPath)) {
+      watchSet.addFile(absPath, hash);
+    }

     if (classification.type === "meteor-ignore") {
       // Return after watching .meteorignore files but before adding them
@@ -801,6 +807,7 @@ async function runLinters({inputSourceArch, isopackCache, sources,
       return;
     }

+    const contents = optimisticReadFile(absPath);
     const wrappedSource = {
       relPath, contents, hash, fileOptions,
       arch: inputSourceArch.arch,
@@ -45,7 +45,7 @@ import {

 import { wrap } from "optimism";
 const { compile: reifyCompile } = require("@meteorjs/reify/lib/compiler");
-const { parse: reifyBabelParse } = require("@meteorjs/reify/lib/parsers/babel");
+const { parse: reifyAcornParse } = require("@meteorjs/reify/lib/parsers/acorn");

 import Resolver, { Resolution } from "./resolver";
 import LRUCache from 'lru-cache';
@@ -88,7 +88,7 @@ const reifyCompileWithCache = Profile("reifyCompileWithCache", wrap(function (

   const isLegacy = isLegacyArch(bundleArch);
   let result = reifyCompile(stripHashBang(source), {
-    parse: reifyBabelParse,
+    parse: reifyAcornParse,
     generateLetDeclarations: !isLegacy,
     avoidModernSyntax: isLegacy,
     enforceStrictMode: false,
@@ -977,7 +977,7 @@ export default class ImportScanner {
   private async findImportedModuleIdentifiers(
     file: File,
   ): Promise<Record<string, ImportInfo>> {
-    const fileHash = file.hash;
+    const fileHash = file.hash instanceof Promise ? await file.hash : file.hash;
     if (IMPORT_SCANNER_CACHE.has(fileHash)) {
       return IMPORT_SCANNER_CACHE.get(fileHash) as Record<string, ImportInfo>;
     }
@@ -988,8 +988,8 @@ export default class ImportScanner {
     );

     // there should always be file.hash, but better safe than sorry
-    if (file.hash) {
-      IMPORT_SCANNER_CACHE.set(file.hash, result);
+    if (fileHash) {
+      IMPORT_SCANNER_CACHE.set(fileHash, result);
     }

     return result;
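The `fileHash` change above normalizes a value that may be either a plain hash or a Promise of one before it is used as a cache key, so the synchronous fast path is kept when no await is needed. The pattern in isolation (the function name is illustrative, not from the diff):

```typescript
// Await only when the value is actually a Promise; otherwise use it
// synchronously. Mirrors `file.hash instanceof Promise ? await file.hash
// : file.hash` from the ImportScanner change.
async function resolveMaybePromise<T>(value: T | Promise<T>): Promise<T> {
  return value instanceof Promise ? await value : value;
}

resolveMaybePromise("abc123").then(v => console.log(v));            // abc123
resolveMaybePromise(Promise.resolve("xyz")).then(v => console.log(v)); // xyz
```

Note that an `async` function wraps the result in a Promise either way; the gain is that `instanceof Promise` avoids creating an intermediate microtask for already-resolved plain values inside the original non-wrapped call site.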
@@ -83,19 +83,19 @@ export class IsopackCache {
       // Wipe specific packages.
       for (const packageName of packages) {
         if (self.cacheDir) {
-          await files.rm_recursive(self._isopackDir(packageName));
+          await files.rm_recursive_deferred(self._isopackDir(packageName));
         }
         if (self._pluginCacheDirRoot) {
-          await files.rm_recursive(self._pluginCacheDirForPackage(packageName));
+          await files.rm_recursive_deferred(self._pluginCacheDirForPackage(packageName));
         }
       }
     } else {
       // Wipe all packages.
       if (self.cacheDir) {
-        await files.rm_recursive(self.cacheDir);
+        await files.rm_recursive_deferred(self.cacheDir);
       }
       if (self._pluginCacheDirRoot) {
-        await files.rm_recursive(self._pluginCacheDirRoot);
+        await files.rm_recursive_deferred(self._pluginCacheDirRoot);
       }
     }
   }
@@ -351,7 +351,7 @@ export class IsopackCache {
     } else {
       // Nope! Compile it again. Give it a fresh plugin cache.
       if (pluginCacheDir) {
-        await files.rm_recursive(pluginCacheDir);
+        await files.rm_recursive_deferred(pluginCacheDir);
         files.mkdir_p(pluginCacheDir);
       }
@@ -4,6 +4,7 @@ import LRUCache from "lru-cache";
 import { Profile } from '../tool-env/profile';
 import Visitor from "@meteorjs/reify/lib/visitor.js";
 import { findPossibleIndexes } from "@meteorjs/reify/lib/utils.js";
+import acorn from 'acorn';

 const hasOwn = Object.prototype.hasOwnProperty;
 const objToStr = Object.prototype.toString
@@ -15,7 +16,9 @@ function isRegExp(value) {
 var AST_CACHE = new LRUCache({
   max: Math.pow(2, 12),
   length(ast) {
-    return ast.loc.end.line;
+    // Estimate cached lines based on average length per character
+    const avgCharsPerLine = 40;
+    return Math.ceil(ast.end / avgCharsPerLine);
   }
 });
@@ -28,20 +31,32 @@ function tryToParse(source, hash) {
   let ast;
   try {
     Profile.time('jsAnalyze.parse', () => {
-      ast = parse(source, {
-        strictMode: false,
-        sourceType: 'module',
-        allowImportExportEverywhere: true,
-        allowReturnOutsideFunction: true,
-        allowUndeclaredExports: true,
-        plugins: [
-          // Only plugins for stage 3 features are enabled
-          // Enabling some plugins significantly affects parser performance
-          'importAttributes',
-          'explicitResourceManagement',
-          'decorators'
-        ]
-      });
+      try {
+        ast = acorn.parse(source, {
+          ecmaVersion: 'latest',
+          sourceType: 'script',
+          allowAwaitOutsideFunction: true,
+          allowImportExportEverywhere: true,
+          allowReturnOutsideFunction: true,
+          allowHashBang: true,
+          checkPrivateFields: false,
+        });
+      } catch (error) {
+        ast = parse(source, {
+          strictMode: false,
+          sourceType: 'module',
+          allowImportExportEverywhere: true,
+          allowReturnOutsideFunction: true,
+          allowUndeclaredExports: true,
+          plugins: [
+            // Only plugins for stage 3 features are enabled
+            // Enabling some plugins significantly affects parser performance
+            'importAttributes',
+            'explicitResourceManagement',
+            'decorators'
+          ]
+        });
+      }
     });
   } catch (e) {
     if (typeof e.loc === 'object') {
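The new `length` function above replaces an exact `ast.loc.end.line` count (unavailable once acorn runs without location tracking) with an estimate derived from the AST's end offset. The arithmetic in isolation (the 40-chars-per-line constant comes from the diff; the sample offsets are arbitrary):

```typescript
// Estimate cached source lines from the last character offset of the AST,
// assuming an average line length of 40 characters (per the diff above).
const avgCharsPerLine = 40;

function estimatedLines(astEndOffset: number): number {
  return Math.ceil(astEndOffset / avgCharsPerLine);
}

console.log(estimatedLines(400)); // 10
console.log(estimatedLines(401)); // 11 (partial lines round up)
console.log(estimatedLines(0));   // 0
```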
@@ -87,7 +87,7 @@ meteorNpm.updateDependencies = async function (packageName,
     // It didn't exist, which is exactly what we wanted.
     return false;
   }
-  await files.rm_recursive(newPackageNpmDir);
+  await files.rm_recursive_deferred(newPackageNpmDir);
   return false;
 }
@@ -102,7 +102,7 @@ meteorNpm.updateDependencies = async function (packageName,
   // proceed.
   if (files.exists(packageNpmDir) &&
       ! files.exists(files.pathJoin(packageNpmDir, 'npm-shrinkwrap.json'))) {
-    await files.rm_recursive(packageNpmDir);
+    await files.rm_recursive_deferred(packageNpmDir);
   }

   // with the changes on npm 8, where there were changes to how the packages metadata is given
@@ -114,7 +114,7 @@ meteorNpm.updateDependencies = async function (packageName,
       files.pathJoin(packageNpmDir, 'npm-shrinkwrap.json')
     ));
     if (shrinkwrap.lockfileVersion !== LOCK_FILE_VERSION) {
-      await files.rm_recursive(packageNpmDir);
+      await files.rm_recursive_deferred(packageNpmDir);
     }
   } catch (e) {}
 }
@@ -143,7 +143,7 @@ meteorNpm.updateDependencies = async function (packageName,
     throw e;
   } finally {
     if (files.exists(newPackageNpmDir)) {
-      await files.rm_recursive(newPackageNpmDir);
+      await files.rm_recursive_deferred(newPackageNpmDir);
     }
     tmpDirs = _.without(tmpDirs, newPackageNpmDir);
   }
@@ -384,7 +384,7 @@ Profile("meteorNpm.rebuildIfNonPortable", async function (nodeModulesDir) {
   const rebuildResult = await runNpmCommand(getRebuildArgs(), tempDir);
   if (! rebuildResult.success) {
     buildmessage.error(rebuildResult.error);
-    await files.rm_recursive(tempDir);
+    await files.rm_recursive_deferred(tempDir);
     return false;
   }
@@ -420,7 +420,7 @@ Profile("meteorNpm.rebuildIfNonPortable", async function (nodeModulesDir) {
     await files.renameDirAlmostAtomically(tempPkgDirs[pkgPath], pkgPath);
   }

-  await files.rm_recursive(tempDir);
+  await files.rm_recursive_deferred(tempDir);

   return true;
 });
@@ -644,7 +644,7 @@ var updateExistingNpmDirectory = async function (packageName, newPackageNpmDir,
   }

   if (oldNodeVersion !== currentNodeCompatibilityVersion()) {
-    await files.rm_recursive(nodeModulesDir);
+    await files.rm_recursive_deferred(nodeModulesDir);
   }
 }
@@ -441,9 +441,9 @@ Object.assign(Table.prototype, {
     return "(" + _.times(n, function () { return "?" }).join(",") + ")";
   },

-  find: async function (txn, id) {
+  find: async function (db, id) {
     var self = this;
-    var rows = await txn.query(self._selectQuery, [ id ]);
+    var rows = await db._query(self._selectQuery, [ id ]);
     if (rows.length !== 0) {
       if (rows.length !== 1) {
         throw new Error("Corrupt database (PK violation)");
@@ -813,23 +813,23 @@ Object.assign(RemoteCatalog.prototype, {
   // track, sorted by their orderKey. Returns the empty array if the release
   // track does not exist or does not have any recommended versions.
   getSortedRecommendedReleaseRecords: async function (track, laterThanOrderKey) {
-    var self = this;
-    // XXX releaseVersions content objects are kinda big; if we put
-    // 'recommended' and 'orderKey' in their own columns this could be faster
-    var result = await self._contentQuery(
-      "SELECT content FROM releaseVersions WHERE track=?", track);
-
-    var recommended = _.filter(result, function (v) {
-      if (!v.recommended)
-        return false;
-      return !laterThanOrderKey || v.orderKey > laterThanOrderKey;
-    });
-
-    var recSort = _.sortBy(recommended, function (rec) {
-      return rec.orderKey;
-    });
-    recSort.reverse();
-    return recSort;
+    const hasMinKey = laterThanOrderKey != null;
+
+    // Always use JSON1 to filter & sort directly in SQL
+    const sql = `
+      SELECT content
+        FROM releaseVersions
+       WHERE track = ?
+         AND json_extract(content, '$.recommended') = 1
+         ${hasMinKey ? "AND json_extract(content, '$.orderKey') > ?" : ""}
+       ORDER BY json_extract(content, '$.orderKey') DESC
+    `;
+    const params = hasMinKey
+      ? [track, laterThanOrderKey]
+      : [track];
+
+    // _contentQuery will JSON.parse(content) for you
+    return this._contentQuery(sql, params);
   },

   // Given a release track, returns all version records for this track.
@@ -893,9 +893,7 @@ Object.assign(RemoteCatalog.prototype, {
   // No JSON parsing is performed.
   _columnsQuery: async function (query, params) {
     var self = this;
-    var rows = await self.db.runInTransaction(function (txn) {
-      return txn.query(query, params);
-    });
+    var rows = await self.db._query(query, params);
     return rows;
   },
@@ -946,9 +944,7 @@ Object.assign(RemoteCatalog.prototype, {

   getMetadata: async function(key) {
     var self = this;
-    var row = await self.db.runInTransaction(function (txn) {
-      return self.tableMetadata.find(txn, key);
-    });
+    var row = await self.tableMetadata.find(self.db, key);
     if (row) {
       return JSON.parse(row['content']);
     }
@@ -970,9 +966,7 @@ Object.assign(RemoteCatalog.prototype, {

   shouldShowBanner: async function (releaseName, bannerDate) {
     var self = this;
-    var row = await self.db.runInTransaction(function (txn) {
-      return self.tableBannersShown.find(txn, releaseName);
-    });
+    var row = await self.tableBannersShown.find(self.db, releaseName);
     // We've never printed a banner for this release.
     if (! row)
       return true;
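The `getSortedRecommendedReleaseRecords` rewrite above pushes the recommended/orderKey filtering and the descending sort from underscore helpers into SQLite's `json_extract`. The equivalent selection expressed in plain TypeScript (the sample records are invented for illustration):

```typescript
// Same semantics as the json_extract query above: keep recommended
// records, optionally require orderKey above a threshold, sort by
// orderKey descending.
interface ReleaseRecord { recommended: boolean; orderKey: string; }

function sortedRecommended(
  records: ReleaseRecord[],
  laterThanOrderKey?: string,
): ReleaseRecord[] {
  return records
    .filter(r => r.recommended &&
      (laterThanOrderKey == null || r.orderKey > laterThanOrderKey))
    .sort((a, b) => (a.orderKey < b.orderKey ? 1 : -1));
}

const sample: ReleaseRecord[] = [
  { recommended: true,  orderKey: "0100" },
  { recommended: false, orderKey: "0200" },
  { recommended: true,  orderKey: "0300" },
];

console.log(sortedRecommended(sample).map(r => r.orderKey));         // [ '0300', '0100' ]
console.log(sortedRecommended(sample, "0100").map(r => r.orderKey)); // [ '0300' ]
```

Doing this in SQL avoids materializing and JSON-parsing every record for the track before filtering, which is the point of the "content objects are kinda big" comment the old code carried.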
@@ -1664,7 +1664,7 @@ Object.assign(exports.ReleaseFile.prototype, {
     if (this.isCheckout()) {
       // Only create .meteor/local/dev_bundle if .meteor/release refers to
       // an actual release, and remove it otherwise.
-      await files.rm_recursive(devBundleLink);
+      await files.rm_recursive_deferred(devBundleLink);
       return;
     }
@@ -1819,14 +1819,16 @@ export class MeteorConfig {
   // TODO Implement an API for setting these values?
   get(...keys) {
     let config = this._ensureInitialized();
+    let filteredConfig = keys.length ? {} : config;
     if (config) {
       keys.every(key => {
         if (config && _.has(config, key)) {
           config = config[key];
+          filteredConfig = config[key];
           return true;
         }
         return false;
       });
-      return config;
+      return filteredConfig;
     }
   }
@@ -474,7 +474,7 @@ var launchMongo = async function(options) {
   if (options.multiple) {
     // This is only for testing, so we're OK with incurring the replset
     // setup on each startup.
-    await files.rm_recursive(dbPath);
+    await files.rm_recursive_deferred(dbPath);
     files.mkdir_p(dbPath, 0o755);
   } else if (portFile) {
     var portFileExists = false;
@@ -47,7 +47,7 @@ module.exports = function enable ({ cachePath, createLoader = true } = {}) {
   };

   const reifyVersion = require("@meteorjs/reify/package.json").version;
-  const reifyBabelParse = require("@meteorjs/reify/lib/parsers/babel").parse;
+  const reifyAcornParse = require("@meteorjs/reify/lib/parsers/acorn").parse;
   const reifyCompile = require("@meteorjs/reify/lib/compiler").compile;

   function compileContent (content) {
@@ -55,7 +55,7 @@ module.exports = function enable ({ cachePath, createLoader = true } = {}) {

   try {
     const result = reifyCompile(content, {
-      parse: reifyBabelParse,
+      parse: reifyAcornParse,
       generateLetDeclarations: false,
       ast: false,
     });
@@ -21,6 +21,7 @@
       "client": "client/main.jsx",
       "server": "server/main.js"
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   }
 }

@@ -17,6 +17,7 @@
       "client": "client/main.js",
       "server": "server/main.js"
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   }
 }

@@ -24,6 +24,7 @@
       "client": "client/main.jsx",
       "server": "server/main.js"
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   }
 }

@@ -16,6 +16,7 @@
       "client": "client/main.js",
       "server": "server/main.js"
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   }
 }

@@ -18,6 +18,7 @@
       "client": "client/main.jsx",
       "server": "server/main.js"
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   }
 }

@@ -18,7 +18,8 @@
       "client": "client/entry-meteor.js",
       "server": "server/entry-meteor.js"
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   },
   "devDependencies": {
     "babel-preset-solid": "^1.8.15",

@@ -27,6 +27,7 @@
         ]
       }
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   }
 }

@@ -22,6 +22,7 @@
       "client": "client/main.jsx",
       "server": "server/main.js"
     },
-    "testModule": "tests/main.js"
+    "testModule": "tests/main.js",
+    "modern": true
   }
 }

@@ -25,6 +25,7 @@
       "client": "client/main.tsx",
       "server": "server/main.ts"
     },
-    "testModule": "tests/main.ts"
+    "testModule": "tests/main.ts",
+    "modern": true
   }
 }
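Every skeleton hunk above makes the same change: `"testModule"` gains a trailing comma and a `"modern": true` flag is added to the `meteor` section of package.json. A hedged sketch of how such a flag could be read from a parsed package.json (the field name comes from the diff; the `isModernEnabled` helper is hypothetical, not Meteor's code):

```javascript
// Illustrative only: read the "meteor.modern" flag added in the hunks above.
function isModernEnabled(pkg) {
  return Boolean(pkg && pkg.meteor && pkg.meteor.modern === true);
}

// Shape mirrors the skeleton package.json files in this diff.
const pkg = {
  meteor: {
    mainModule: { client: "client/main.js", server: "server/main.js" },
    testModule: "tests/main.js",
    modern: true,
  },
};

console.log(isModernEnabled(pkg)); // → true
```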
tools/tests/apps/modern/.gitignore (new file, vendored)
@@ -0,0 +1 @@
+node_modules/
tools/tests/apps/modern/.meteor/.finished-upgraders (new file)
@@ -0,0 +1,19 @@
+# This file contains information which helps Meteor properly upgrade your
+# app when you run 'meteor update'. You should check it into version control
+# with your project.
+
+notices-for-0.9.0
+notices-for-0.9.1
+0.9.4-platform-file
+notices-for-facebook-graph-api-2
+1.2.0-standard-minifiers-package
+1.2.0-meteor-platform-split
+1.2.0-cordova-changes
+1.2.0-breaking-changes
+1.3.0-split-minifiers-package
+1.4.0-remove-old-dev-bundle-link
+1.4.1-add-shell-server-package
+1.4.3-split-account-service-packages
+1.5-add-dynamic-import-package
+1.7-split-underscore-from-meteor-base
+1.8.3-split-jquery-from-blaze
tools/tests/apps/modern/.meteor/.gitignore (new file, vendored)
@@ -0,0 +1 @@
+local
tools/tests/apps/modern/.meteor/.id (new file)
@@ -0,0 +1,7 @@
+# This file contains a token that is unique to your project.
+# Check it into your repository along with the rest of this directory.
+# It can be used for purposes such as:
+#   - ensuring you don't accidentally deploy one app on top of another
+#   - providing package authors with aggregated statistics
+
+xh4qomttjyd.znxg6je45aan
tools/tests/apps/modern/.meteor/packages (new file)
@@ -0,0 +1,15 @@
+# Meteor packages used by this project, one per line.
+# Check this file (and the other files in this directory) into your repository.
+#
+# 'meteor add' and 'meteor remove' will edit this file for you,
+# but you can also edit it by hand.
+
+meteor                 # Shared foundation for all Meteor packages
+static-html            # Define static page content in .html files
+standard-minifier-css  # CSS minifier run for production mode
+standard-minifier-js   # JS minifier run for production mode
+es5-shim               # ECMAScript 5 compatibility for older browsers
+ecmascript             # Enable ECMAScript2015+ syntax in app code
+shell-server           # Server-side component of the `meteor shell` command
+webapp                 # Serves a Meteor app over HTTP
+server-render          # Support for server-side rendering
tools/tests/apps/modern/.meteor/platforms (new file)
@@ -0,0 +1,2 @@
+server
+browser
tools/tests/apps/modern/.meteor/release (new file)
@@ -0,0 +1 @@
+none
tools/tests/apps/modern/.meteor/versions (new file)
@@ -0,0 +1,39 @@
+babel-compiler@7.2.4
+babel-runtime@1.3.0
+base64@1.0.11
+blaze-tools@1.0.10
+boilerplate-generator@1.6.0
+caching-compiler@1.2.1
+caching-html-compiler@1.1.3
+dynamic-import@0.5.1
+ecmascript@0.12.4
+ecmascript-runtime@0.7.0
+ecmascript-runtime-client@0.8.0
+ecmascript-runtime-server@0.7.1
+ejson@1.1.0
+es5-shim@4.8.0
+fetch@0.1.0
+html-tools@1.0.11
+htmljs@1.0.11
+inter-process-messaging@0.1.0
+logging@1.1.20
+meteor@1.9.3
+minifier-css@1.4.1
+minifier-js@2.4.0
+modern-browsers@0.1.4-beta181.16
+modules@0.13.0
+modules-runtime@0.10.3
+promise@0.11.2
+random@1.1.0
+routepolicy@1.1.0
+server-render@0.3.1
+shell-server@0.4.0
+spacebars-compiler@1.1.3
+standard-minifier-css@1.5.2
+standard-minifier-js@2.4.0
+static-html@1.2.2
+templating-tools@1.1.2
+tracker@1.2.1
+underscore@1.0.11
+webapp@1.7.3-beta181.16
+webapp-hashing@1.0.9
tools/tests/apps/modern/.swcrc (new file)
@@ -0,0 +1,8 @@
+{
+  "jsc": {
+    "baseUrl": "./",
+    "paths": {
+      "@swcAlias/*": ["swcAlias/*"]
+    }
+  }
+}
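The `.swcrc` above maps the `@swcAlias/*` import prefix onto the `swcAlias/` directory via `jsc.paths` (used later in this diff by `server/alias.js`, which imports `@swcAlias/main`). A minimal sketch of that kind of prefix rewrite; SWC performs the real resolution internally, and `resolveAlias` is a hypothetical helper for illustration:

```javascript
// Illustrative prefix-alias resolver mirroring the jsc.paths mapping above.
const paths = { "@swcAlias/*": ["swcAlias/*"] };

function resolveAlias(request) {
  for (const [pattern, targets] of Object.entries(paths)) {
    const prefix = pattern.slice(0, -1); // strip the trailing "*"
    if (request.startsWith(prefix)) {
      // Substitute the matched suffix into the first target pattern.
      return targets[0].slice(0, -1) + request.slice(prefix.length);
    }
  }
  return request; // not aliased; leave untouched
}

console.log(resolveAlias("@swcAlias/main")); // → "swcAlias/main"
```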
tools/tests/apps/modern/client/main.css (new file)
@@ -0,0 +1,4 @@
+body {
+  padding: 10px;
+  font-family: sans-serif;
+}
tools/tests/apps/modern/client/main.html (new file)
@@ -0,0 +1,21 @@
+<head>
+  <title>Minimal Meteor app</title>
+</head>
+
+<body>
+  <h1>Minimal Meteor app</h1>
+  <p>
+    This Meteor app uses as few Meteor packages as possible, to keep the
+    client JavaScript bundle as small as possible.
+  </p>
+
+  <em id="server-render-target"></em>
+
+  <h2>Learn Meteor!</h2>
+  <ul>
+    <li><a href="https://www.meteor.com/try" target="_blank">Do the Tutorial</a></li>
+    <li><a href="http://guide.meteor.com" target="_blank">Follow the Guide</a></li>
+    <li><a href="https://docs.meteor.com" target="_blank">Read the Docs</a></li>
+    <li><a href="https://forums.meteor.com" target="_blank">Discussions</a></li>
+  </ul>
+</body>
tools/tests/apps/modern/client/main.js (new file)
@@ -0,0 +1 @@
+Meteor.startup(() => {});
tools/tests/apps/modern/package-lock.json (new file, generated, 1119 lines)
(File diff suppressed because it is too large.)
tools/tests/apps/modern/package.json (new file)
@@ -0,0 +1,19 @@
+{
+  "name": "modern",
+  "private": true,
+  "scripts": {
+    "start": "meteor run"
+  },
+  "dependencies": {
+    "@babel/runtime": "^7.23.5",
+    "meteor-node-stubs": "^1.2.12",
+    "react": "^18.3.1"
+  },
+  "meteor": {
+    "mainModule": {
+      "client": "client/main.js",
+      "server": "server/main.js"
+    },
+    "testModule": "tests/main.js"
+  }
+}
tools/tests/apps/modern/server/alias.js (new file)
@@ -0,0 +1 @@
+import '@swcAlias/main';
tools/tests/apps/modern/server/custom-component.js (new file)
@@ -0,0 +1,5 @@
+import React from 'react';
+
+console.log('custom-component.js');
+
+export default <></>;
Some files were not shown because too many files have changed in this diff.