mirror of
https://github.com/meteor/meteor.git
synced 2026-05-02 03:01:46 -04:00
Merge branch 'packaging' into cordova-hcp
Conflicts:
	packages/constraint-solver/constraint-solver-tests.js
	packages/constraint-solver/constraint-solver.js
	packages/less/plugin/compile-less.js
	packages/meteor/plugin/basic-file-types.js
	packages/star-translate/translator.js
	packages/stylus/plugin/compile-stylus.js
	packages/templating/plugin/compile-templates.js
	packages/webapp/webapp_server.js
	tools/bundler.js
	tools/commands.js
	tools/compiler.js
	tools/package-source.js
	tools/run-app.js
	tools/selftest.js
	tools/tests/old/test-bundler-assets.js
	tools/tests/old/test-bundler-options.js
	tools/unipackage.js
2 .mailmap
@@ -27,6 +27,7 @@ GITHUB: codeinthehole <david.winterbottom@gmail.com>
GITHUB: dandv <ddascalescu+github@gmail.com>
GITHUB: davegonzalez <gonzalez.dalex@gmail.com>
GITHUB: ducdigital <duc@ducdigital.com>
GITHUB: duckspeaker <gallo.j@gmail.com>
GITHUB: emgee3 <hello@gravitronic.com>
GITHUB: felixrabe <felix@rabe.io>
GITHUB: FredericoC <frederico.carvalho@3stack.com.au>
@@ -75,3 +76,4 @@ METEOR: sixolet <naomi@meteor.com>
METEOR: Slava <slava@meteor.com>
METEOR: stubailo <sashko@mit.edu>
METEOR: ekatek <ekate@meteor.com>
METEOR: mariapacana <maria.pacana@gmail.com>
78 History.md
@@ -1,4 +1,4 @@
## v.NEXT.NEXT
## v.NEXT

* The `appcache` package now defaults to functioning on all browsers that
  support the AppCache API, rather than a whitelist of browsers. You can still
@@ -6,16 +6,90 @@
  change is that `appcache` is now enabled by default on Firefox, because
  Firefox no longer makes a confusing popup. #2241

* When a call to `match` fails in a method or subscription, log the
  failure on the server. (This matches the behavior described in our docs)
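The entry above can be sketched in plain JavaScript (no Meteor runtime; `runChecked` and `log` are hypothetical stand-ins, not Meteor APIs): the argument check still throws to the caller, but the failure is also written to the server log.

```javascript
// Sketch of "log match failures on the server": run a validator,
// log any failure, and still propagate the error to the caller.
function runChecked(args, validate, log) {
  try {
    validate(args);
  } catch (err) {
    // Logged so it shows up in the server log...
    log('Exception while checking arguments: ' + err.message);
    // ...but the caller still sees the match failure.
    throw err;
  }
}
```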

## v.NEXT
## v0.8.3

#### Blaze

* Refactor Blaze to simplify internals while preserving the public
  API. `UI.Component` has been replaced with `Blaze.View`.

* Fix performance issues and memory leaks concerning event handlers.

* Add `UI.remove`, which removes a template after `UI.render`/`UI.insert`.

* Add `this.autorun` to the template instance, which is like `Deps.autorun`
  but is automatically stopped when the template is destroyed.
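A minimal plain-JS sketch of the lifecycle tie-in described above — `TemplateInstance` and `Computation` here are illustrative stand-ins, not Blaze internals: computations registered through the instance are stopped automatically when the instance is destroyed.

```javascript
// Fake computation: the real Deps.autorun would rerun reactively.
function Computation(runFunc) {
  this.stopped = false;
  runFunc(this);
}
Computation.prototype.stop = function () { this.stopped = true; };

function TemplateInstance() { this._computations = []; }

// instance.autorun: like Deps.autorun, but tracked by the instance.
TemplateInstance.prototype.autorun = function (runFunc) {
  var c = new Computation(runFunc);
  this._computations.push(c);
  return c;
};

// Destroying the instance stops everything it started.
TemplateInstance.prototype.destroy = function () {
  this._computations.forEach(function (c) { c.stop(); });
};
```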

* Create `<a>` tags as SVG elements when they have `xlink:href`
  attributes. (Previously, `<a>` tags inside SVGs were never created as
  SVG elements.) #2178

* Throw an error in `{{foo bar}}` if `foo` is missing or not a function.

* Cursors returned from template helpers for #each should implement
  the `observeChanges` method and don't have to be Minimongo cursors
  (allowing new custom data stores for Blaze like Miniredis).
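The duck type that #each needs can be sketched in plain JS — `StaticCursor` is a hypothetical custom store, not Minimongo: any object with an `observeChanges` method that emits `added(id, fields)` callbacks (and returns a handle with `stop`) can back an `{{#each}}`.

```javascript
// A non-reactive stand-in cursor over a fixed array of documents.
function StaticCursor(docs) { this._docs = docs; }

StaticCursor.prototype.observeChanges = function (callbacks) {
  // Emit one `added` per document; a real store would also emit
  // `changed` and `removed` as its data evolves.
  this._docs.forEach(function (doc, i) {
    callbacks.added(String(i), doc);
  });
  return { stop: function () {} };  // observation handle
};
```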

* Remove warnings when {{#each}} iterates over a list of strings,
  numbers, or other items that contain duplicates. #1980

#### Meteor Accounts

* Fix regression in 0.8.2 where an exception would be thrown if
  `Meteor.loginWithPassword` didn't have a callback. Callbacks to
  `Meteor.loginWithPassword` are now optional again. #2255

* Fix OAuth popup flow in mobile apps that don't support
  `window.opener`. #2302

* Fix "Email already exists" error with MongoDB 2.6. #2238

#### mongo-livedata and minimongo

* Fix performance issue where a large batch of oplog updates could block
  the node event loop for long periods. #2299

* Fix oplog bug resulting in error message "Buffer inexplicably empty". #2274

* Fix regression from 0.8.2 that caused collections to appear empty in
  reactive `findOne()` or `fetch` queries that run before a mutator
  returns. #2275

#### Miscellaneous

* Stop including code by default that automatically refreshes the page
  if JavaScript and CSS don't load correctly. While this code is useful
  in some multi-server deployments, it can cause infinite refresh loops
  if there are errors on the page. Add the `reload-safetybelt` package
  to your app if you want to include this code.

* On the server, `Meteor.startup(c)` now calls `c` immediately if the
  server has already started up, matching the client behavior. #2239
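The new semantics can be sketched in plain JS (`StartupQueue` is a hypothetical name, not Meteor's implementation): callbacks registered before startup are queued, and once started, later callbacks run immediately instead of being lost.

```javascript
function StartupQueue() {
  this.started = false;
  this._queue = [];
}

// Like Meteor.startup: defer until start, or call immediately if started.
StartupQueue.prototype.startup = function (cb) {
  if (this.started) cb();
  else this._queue.push(cb);
};

// Fired once when the server finishes booting.
StartupQueue.prototype.start = function () {
  this.started = true;
  this._queue.splice(0).forEach(function (cb) { cb(); });
};
```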

* Add support for server-side source maps when debugging with
  `node-inspector`.

* Add `WebAppInternals.addStaticJs()` for adding static JavaScript code
  to be served in the app, inline if allowed by `browser-policy`.

* Make the `tinytest/run` method return immediately, so that `wait`
  method calls from client tests don't block on server tests completing.

* Log errors from method invocations on the client if there is no
  callback provided.

* Upgraded dependencies:
  - node: 0.10.29 (from 0.10.28)
  - less: 1.7.1 (from 1.6.1)

Patches contributed by GitHub users Cangit, cmather, duckspeaker, zol.


## v0.8.2

@@ -1 +1 @@
0.8.2
0.8.3
@@ -2189,6 +2189,12 @@ This property provides access to the data context at the top level of
the template. It is updated each time the template is re-rendered.
Access is read-only and non-reactive.

{{> api_box template_autorun}}

You can use `this.autorun` from a [`created`](#template_created) or
[`rendered`](#template_rendered) callback to reactively update the DOM
or the template instance. The Computation is automatically stopped
when the template is destroyed.

<h2 id="ui"><span>Template utilities</span></h2>

@@ -2205,6 +2211,15 @@ any part of the DOM for finer control than just using template inclusions.
You can define helpers and event maps on `UI.body` just like on any
`Template.myTemplate` object.

Helpers on `UI.body` are only available in the `<body>` tags of your
app. To register a global helper, use
[UI.registerHelper](#ui_registerhelper).

Event maps on `UI.body` don't apply to elements added to the body via
`UI.insert`, jQuery, or the DOM API, or to the body element itself.
To handle events on the body, window, or document, use jQuery or the
DOM API.

{{> api_box ui_render}}

This returns a "rendered template" object, which can be passed to
@@ -2238,7 +2253,15 @@ changes.

{{> api_box ui_getelementdata}}

{{> api_box ui_dynamic}}

`UI.dynamic` allows you to include a template by name, where the name
may be calculated by a helper and may change reactively. The `data`
argument is optional, and if it is omitted, the current data context
is used.

For example, if there is a template named "foo", `{{dstache}}> UI.dynamic
template="foo"}}` is equivalent to `{{dstache}}> foo}}`.

{{#api_box eventmaps}}

@@ -2281,8 +2304,8 @@ Example:
  // Fires when any element with the 'accept' class is clicked
  'click .accept': function (event) { ... },

  // Fires when 'accept' is clicked, or a key is pressed
  'keydown, click .accept': function (event) { ... }
  // Fires when 'accept' is clicked or focused, or a key is pressed
  'click .accept, focus .accept, keypress': function (event) { ... }
}

Most events bubble up the document tree from their originating
@@ -1186,6 +1186,11 @@ Template.api.accounts_ui_config = {
    type: "Object",
    descr: "To ask the user for permission to act on their behalf when offline, map the relevant external service to `true`. Currently only supported with Google. See [Meteor.loginWithExternalService](#meteor_loginwithexternalservice) for more details."
  },
  {
    name: "forceApprovalPrompt",
    type: "Boolean",
    descr: "If true, forces the user to approve the app's permissions, even if previously approved. Currently only supported with Google."
  },
  {
    name: "passwordSignupFields",
    type: "String",
@@ -1845,6 +1850,17 @@ Template.api.template_data = {
  descr: ["The data context of this instance's latest invocation."]
};

Template.api.template_autorun = {
  id: "template_autorun",
  name: "<em>this</em>.autorun(runFunc)",
  locus: "Client",
  descr: ["A version of [Deps.autorun](#deps_autorun) that is stopped when the template is destroyed."],
  args: [
    {name: "runFunc",
     type: "Function",
     descr: "The function to run. It receives one argument: a Deps.Computation object."}
  ]
};

Template.api.ui_registerhelper = {
  id: "ui_registerhelper",
@@ -1862,6 +1878,22 @@ Template.api.ui_registerhelper = {
  }]
};

Template.api.ui_dynamic = {
  id: "ui_dynamic",
  name: "{{> UI.dynamic template=templateName [data=dataContext]}}",
  locus: "Client",
  descr: ["Choose a template to include dynamically, by name."],
  args: [
    {name: "templateName",
     type: "String",
     descr: "The name of the template to include."
    },
    {name: "dataContext",
     type: "Object",
     descr: "Optional. The data context in which to include the template."
    }]
};

Template.api.ui_body = {
  id: "ui_body",
  name: "UI.body",
@@ -33,7 +33,7 @@ packages that most any app will use (for example `webapp`, which
handles incoming HTTP connections, and `templating`, which lets you
make HTML templates that automatically update live as data changes).
Then there are optional packages like `email`, which lets your app
send emails, or the Meteor Accounts series (`account-password`,
send emails, or the Meteor Accounts series (`accounts-password`,
`accounts-facebook`, `accounts-ui`, and others) which provide a
full-featured user account system that you can drop right into your
app. And beyond these "official" packages, there are hundreds of
@@ -253,7 +253,8 @@ var toc = [
    {instance: "this", name: "find", id: "template_find"},
    {instance: "this", name: "firstNode", id: "template_firstNode"},
    {instance: "this", name: "lastNode", id: "template_lastNode"},
    {instance: "this", name: "data", id: "template_data"}
    {instance: "this", name: "data", id: "template_data"},
    {instance: "this", name: "autorun", id: "template_autorun"}
  ],
  "UI", [
    "UI.registerHelper",
@@ -262,7 +263,8 @@ var toc = [
    "UI.renderWithData",
    "UI.insert",
    "UI.remove",
    "UI.getElementData"
    "UI.getElementData",
    {name: "{{> UI.dynamic}}", id: "ui_dynamic"}
  ],
  {type: "spacer"},
  {name: "Event maps", style: "noncode"}
@@ -1,5 +1,5 @@
// While galaxy apps are on their own special meteor releases, override
// Meteor.release here.
if (Meteor.isClient) {
  Meteor.release = Meteor.release ? "0.8.2" : undefined;
  Meteor.release = Meteor.release ? "0.8.3" : undefined;
}

@@ -1 +1 @@
0.8.2
0.8.3

@@ -1 +1 @@
0.8.2
0.8.3

@@ -1 +1 @@
0.8.2
0.8.3

@@ -1 +1 @@
0.8.2
0.8.3
@@ -35,6 +35,16 @@ var randomString = function (length) {
  return ret;
};

var preCall = function (name) {
  console.log('> ' + name);
};

var postCall = function (name) {
  return function (err, callback) {
    console.log('< ' + name + ' ' + (err ? 'ERR' : 'OK'));
  };
};

var pickCollection = function () {
  return Random.choice(Collections);
};
@@ -96,7 +106,8 @@ if (Meteor.isServer) {
  Meteor.setInterval(function () {
    var when = +(new Date) - PARAMS.maxAgeSeconds*1000;
    _.each(Collections, function (C) {
      C.remove({when: {$lt: when}});
      preCall('removeMaxAge');
      C.remove({when: {$lt: when}}, postCall('removeMaxAge'));
    });
    // Clear out 5% of the DB each time, steady state. XXX parameterize?
  }, 1000*PARAMS.maxAgeSeconds / 20);
@@ -121,7 +132,8 @@ if (Meteor.isServer) {
    doc.when = +(new Date);

    var C = pickCollection();
    C.insert(doc);
    preCall('insert');
    C.insert(doc, postCall('insert'));
  },
  update: function (processId, field, value) {
    check([processId, field, value], [String]);
@@ -130,15 +142,18 @@ if (Meteor.isServer) {

    var C = pickCollection();
    // update one message.
    C.update({fromProcess: processId}, {$set: modifer}, {multi: false});
    preCall('update');
    C.update({fromProcess: processId}, {$set: modifer}, {multi: false}, postCall('update'));
  },
  remove: function (processId) {
    check(processId, String);
    var C = pickCollection();
    // remove one message.
    var obj = C.findOne({fromProcess: processId});
    if (obj)
      C.remove(obj._id);
    if (obj) {
      preCall('remove');
      C.remove(obj._id, postCall('remove'));
    }
  }
});

@@ -1,8 +1,12 @@
#!/bin/bash

PORT=9000
NUM_CLIENTS=10
DURATION=120
if [ -z "$NUM_CLIENTS" ]; then
    NUM_CLIENTS=10
fi
if [ -z "$DURATION" ]; then
    DURATION=120
fi
REPORT_INTERVAL=10

set -e
@@ -20,7 +24,7 @@ pkill -f "$PROJDIR/.meteor/local/db" || true
../../../meteor reset || true

# start the benchmark app
../../../meteor --production --settings "scenarios/${SCENARIO}.json" --port 9000 &
../../../meteor --production --settings "scenarios/${SCENARIO}.json" --port ${PORT} &
OUTER_PID=$!

echo "Waiting for server to come up"
@@ -30,12 +34,13 @@ function wait_for_port {
    sleep 1
    N=$(($N+1))
    if [ $N -ge $2 ] ; then
      curl -v "$1" || true
      echo "Timed out waiting for port $1"
      exit 2
    fi
  done
}
wait_for_port "http://localhost:9001" 60
wait_for_port "http://localhost:${PORT}" 60


echo "Starting phantoms"
11 examples/unfinished/benchmark/scenarios/scale10.json (new file)
@@ -0,0 +1,11 @@
{
  "params": {
    "numCollections": 1,
    "maxAgeSeconds": 60,
    "insertsPerSecond": 10,
    "updatesPerSecond": 10,
    "removesPerSecond": 1,
    "documentSize": 128,
    "documentNumFields": 2
  }
}

11 examples/unfinished/benchmark/scenarios/scale100.json (new file)
@@ -0,0 +1,11 @@
{
  "params": {
    "numCollections": 1,
    "maxAgeSeconds": 60,
    "insertsPerSecond": 100,
    "updatesPerSecond": 100,
    "removesPerSecond": 10,
    "documentSize": 128,
    "documentNumFields": 2
  }
}

11 examples/unfinished/benchmark/scenarios/scale20.json (new file)
@@ -0,0 +1,11 @@
{
  "params": {
    "numCollections": 1,
    "maxAgeSeconds": 60,
    "insertsPerSecond": 20,
    "updatesPerSecond": 20,
    "removesPerSecond": 2,
    "documentSize": 128,
    "documentNumFields": 2
  }
}

11 examples/unfinished/benchmark/scenarios/scale40.json (new file)
@@ -0,0 +1,11 @@
{
  "params": {
    "numCollections": 1,
    "maxAgeSeconds": 60,
    "insertsPerSecond": 40,
    "updatesPerSecond": 40,
    "removesPerSecond": 4,
    "documentSize": 128,
    "documentNumFields": 2
  }
}

11 examples/unfinished/benchmark/scenarios/scale50.json (new file)
@@ -0,0 +1,11 @@
{
  "params": {
    "numCollections": 1,
    "maxAgeSeconds": 60,
    "insertsPerSecond": 50,
    "updatesPerSecond": 50,
    "removesPerSecond": 5,
    "documentSize": 128,
    "documentNumFields": 2
  }
}
@@ -1 +1 @@
0.8.2
0.8.3

@@ -2,12 +2,14 @@ Accounts.ui = {};

Accounts.ui._options = {
  requestPermissions: {},
  requestOfflineToken: {}
  requestOfflineToken: {},
  forceApprovalPrompt: {}
};

// XXX refactor duplicated code in this function
Accounts.ui.config = function(options) {
  // validate options keys
  var VALID_KEYS = ['passwordSignupFields', 'requestPermissions', 'requestOfflineToken'];
  var VALID_KEYS = ['passwordSignupFields', 'requestPermissions', 'requestOfflineToken', 'forceApprovalPrompt'];
  _.each(_.keys(options), function (key) {
    if (!_.contains(VALID_KEYS, key))
      throw new Error("Accounts.ui.config: Invalid key: " + key);
@@ -56,6 +58,20 @@ Accounts.ui.config = function(options) {
    }
  });
  }

  // deal with `forceApprovalPrompt`
  if (options.forceApprovalPrompt) {
    _.each(options.forceApprovalPrompt, function (value, service) {
      if (service !== 'google')
        throw new Error("Accounts.ui.config: `forceApprovalPrompt` only supported for Google login at the moment.");

      if (Accounts.ui._options.forceApprovalPrompt[service]) {
        throw new Error("Accounts.ui.config: Can't set `forceApprovalPrompt` more than once for " + service);
      } else {
        Accounts.ui._options.forceApprovalPrompt[service] = value;
      }
    });
  }
};
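The validation logic added above can be exercised as a standalone sketch (plain JS, no Meteor or underscore; `setForceApprovalPrompt` is a hypothetical stand-in that mirrors the diff's rules, not a Meteor API): only `google` is accepted, and a service may be configured at most once.

```javascript
// Mirror of the forceApprovalPrompt validation from the diff above.
function setForceApprovalPrompt(state, option) {
  Object.keys(option).forEach(function (service) {
    if (service !== 'google')
      throw new Error("`forceApprovalPrompt` only supported for Google login at the moment.");
    if (state[service])
      throw new Error("Can't set `forceApprovalPrompt` more than once for " + service);
    state[service] = option[service];
  });
}
```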

passwordSignupFields = function () {

@@ -5,7 +5,7 @@


// XXX it'd be cool to also test that the right thing happens if options
// *are* validated, but Accouns.ui._options is global state which makes this hard
// *are* validated, but Accounts.ui._options is global state which makes this hard
// (impossible?)
Tinytest.add('accounts-ui - config validates keys', function (test) {
  test.throws(function () {
@@ -19,4 +19,8 @@ Tinytest.add('accounts-ui - config validates keys', function (test) {
  test.throws(function () {
    Accounts.ui.config({requestPermissions: {facebook: "not an array"}});
  });

  test.throws(function () {
    Accounts.ui.config({forceApprovalPrompt: {facebook: "only google"}});
  });
});
@@ -29,6 +29,8 @@ Template._loginButtonsLoggedOutSingleLoginButton.events({
      options.requestPermissions = Accounts.ui._options.requestPermissions[serviceName];
    if (Accounts.ui._options.requestOfflineToken[serviceName])
      options.requestOfflineToken = Accounts.ui._options.requestOfflineToken[serviceName];
    if (Accounts.ui._options.forceApprovalPrompt[serviceName])
      options.forceApprovalPrompt = Accounts.ui._options.forceApprovalPrompt[serviceName];

    loginWithService(options, callback);
  }
@@ -11,7 +11,7 @@ Cordova.depends({
Package.on_use(function (api) {
  api.use('webapp', 'server');
  api.use(['deps', 'retry'], 'client');
  api.use(['livedata', 'mongo-livedata'], ['client', 'server']);
  api.use(['livedata', 'mongo-livedata', 'underscore'], ['client', 'server']);
  api.use('deps', 'client');
  api.use('reload', 'client', {weak: true});
  api.use('http', 'client.cordova');
@@ -169,6 +169,7 @@ Blaze.InOuterTemplateScope = function (templateView, contentFunc) {
    parentView = parentView.parentView;

  view.onCreated(function () {
    this.originalParentView = this.parentView;
    this.parentView = parentView;
  });
  return view;
@@ -29,6 +29,28 @@ Blaze.DOMRange = function (nodeAndRangeArray) {
};
var DOMRange = Blaze.DOMRange;

// In IE 8, don't use empty text nodes as placeholders
// in empty DOMRanges, use comment nodes instead. Using
// empty text nodes in modern browsers is great because
// it doesn't clutter the web inspector. In IE 8, however,
// it seems to lead in some roundabout way to the OAuth
// pop-up crashing the browser completely. In the past,
// we didn't use empty text nodes on IE 8 because they
// don't accept JS properties, so just use the same logic
// even though we don't need to set properties on the
// placeholder anymore.
DOMRange._USE_COMMENT_PLACEHOLDERS = (function () {
  var result = false;
  var textNode = document.createTextNode("");
  try {
    textNode.someProp = true;
  } catch (e) {
    // IE 8
    result = true;
  }
  return result;
})();

// static methods
DOMRange._insert = function (rangeOrNode, parentElement, nextNode, _isMove) {
  var m = rangeOrNode;
@@ -118,7 +140,10 @@ DOMRange.prototype.attach = function (parentElement, nextNode, _isMove) {
      DOMRange._insert(members[i], parentElement, nextNode, _isMove);
    }
  } else {
    var placeholder = document.createTextNode("");
    var placeholder = (
      DOMRange._USE_COMMENT_PLACEHOLDERS ?
        document.createComment("") :
        document.createTextNode(""));
    this.emptyRangePlaceholder = placeholder;
    parentElement.insertBefore(placeholder, nextNode || null);
  }
@@ -6,7 +6,9 @@ var bindIfIsFunction = function (x, target) {
  };
};

var bindToCurrentDataIfIsFunction = function (x) {
// If `x` is a function, binds the value of `this` for that function
// to the current data context.
var bindDataContext = function (x) {
  if (typeof x === 'function') {
    return function () {
      var data = Blaze.getCurrentData();
@@ -22,6 +24,8 @@ var wrapHelper = function (f) {
  return Blaze.wrapCatchingExceptions(f, 'template helper');
};

// !!! FIX THIS COMMENT !!!
//
// Implements {{foo}} where `name` is "foo"
// and `component` is the component the tag is found in
// (the lexical "self," on which to look for methods).
@@ -45,11 +49,11 @@ Blaze.View.prototype.lookup = function (name, _options) {
    return Blaze._parentData(name.length - 1, true /*_functionWrapped*/);

  } else if (template && (name in template)) {
    return wrapHelper(bindToCurrentDataIfIsFunction(template[name]));
    return wrapHelper(bindDataContext(template[name]));
  } else if (lookupTemplate && Template.__lookup__(name)) {
    return Template.__lookup__(name);
  } else if (UI._globalHelpers[name]) {
    return wrapHelper(bindToCurrentDataIfIsFunction(UI._globalHelpers[name]));
    return wrapHelper(bindDataContext(UI._globalHelpers[name]));
  } else {
    return function () {
      var isCalledAsFunction = (arguments.length > 0);
@@ -49,13 +49,19 @@ Blaze.DOMMaterializer.def({

    var rawAttrs = tag.attrs;
    var children = tag.children;
    if (tagName === 'textarea' && ! (rawAttrs && ('value' in rawAttrs))) {
      // turn TEXTAREA contents into a value attribute.
      // Reactivity in the form of nested Views won't work here
      // because the Views have already been instantiated. To
      // get Views in a textarea they need to be wrapped in a
      // function and provided as the "value" attribute by the
      // compiler.
    if (tagName === 'textarea' && tag.children.length &&
        ! (rawAttrs && ('value' in rawAttrs))) {
      // Provide very limited support for TEXTAREA tags with children
      // rather than a "value" attribute.
      // Reactivity in the form of Views nested in the tag's children
      // won't work. Compilers should compile textarea contents into
      // the "value" attribute of the tag, wrapped in a function if there
      // is reactivity.
      if (typeof rawAttrs === 'function' ||
          HTML.isArray(rawAttrs)) {
        throw new Error("Can't have reactive children of TEXTAREA node; " +
                        "use the 'value' attribute instead.");
      }
      rawAttrs = _.extend({}, rawAttrs || null);
      rawAttrs.value = Blaze._expand(children, self.parentView);
      children = [];
@@ -312,14 +312,21 @@ Blaze.HTMLJSExpander.def({
  }
});

// Return Blaze.currentView, but only if it is being rendered
// (i.e. we are in its render() method).
var currentViewIfRendering = function () {
  var view = Blaze.currentView;
  return (view && view.isInRender) ? view : null;
};

Blaze._expand = function (htmljs, parentView) {
  parentView = parentView || Blaze.currentView;
  parentView = parentView || currentViewIfRendering();
  return (new Blaze.HTMLJSExpander(
    {parentView: parentView})).visit(htmljs);
};

Blaze._expandAttributes = function (attrs, parentView) {
  parentView = parentView || Blaze.currentView;
  parentView = parentView || currentViewIfRendering();
  return (new Blaze.HTMLJSExpander(
    {parentView: parentView})).visitAttributes(attrs);
};
@@ -383,7 +390,7 @@ Blaze.runTemplate = function (t/*, args*/) {
};

Blaze.render = function (content, parentView) {
  parentView = parentView || Blaze.currentView;
  parentView = parentView || currentViewIfRendering();

  var view;
  if (typeof content === 'function') {
@@ -401,7 +408,7 @@ Blaze.render = function (content, parentView) {
Blaze.toHTML = function (htmljs, parentView) {
  if (typeof htmljs === 'function')
    throw new Error("Blaze.toHTML doesn't take a function, just HTMLjs");
  parentView = parentView || Blaze.currentView;
  parentView = parentView || currentViewIfRendering();
  return HTML.toHTML(Blaze._expand(htmljs, parentView));
};

@@ -414,7 +421,7 @@ Blaze.toText = function (htmljs, parentView, textMode) {
    textMode = parentView;
    parentView = null;
  }
  parentView = parentView || Blaze.currentView;
  parentView = parentView || currentViewIfRendering();

  if (! textMode)
    throw new Error("textMode required");
@@ -529,7 +536,11 @@ Blaze._addEventMap = function (view, eventMap, thisInHandler) {
      function (evt) {
        if (! range.containsElement(evt.currentTarget))
          return null;
        return handler.apply(thisInHandler || this, arguments);
        var handlerThis = thisInHandler || this;
        var handlerArgs = arguments;
        return Blaze.withCurrentView(view, function () {
          return handler.apply(handlerThis, handlerArgs);
        });
      },
      range, function (r) {
        return r.parentRange;
@@ -12,8 +12,8 @@ Package.on_use(function (api) {
  // spacebars compiler rather than letting the 'templating' package (which
  // isn't fully supported on the server yet) handle it. That also means that
  // they don't contain the outer "<template>" tag.
  api.add_files(['boilerplate_client.browser.html',
                 'boilerplate_client.cordova.html'],
  api.add_files(['boilerplate_web.browser.html',
                 'boilerplate_web.cordova.html'],
                'server', {isAsset: true});
});

@@ -17,7 +17,7 @@ var insertVersion = function (name, version, ecv, deps) {
    references: [
      { arch: "os", targetSlice: "main", weak: false,
        implied: false, unordered: false },
      { arch: "client", targetSlice: "main", weak: false,
      { arch: "web", targetSlice: "main", weak: false,
        implied: false, unordered: false }]
  };
});
@@ -25,7 +25,7 @@ var insertVersion = function (name, version, ecv, deps) {
    earliestCompatibleVersion: ecv,
    dependencies: constructedDeps });
  Builds.insert({ packageName: name, version: version,
                  buildArchitectures: "client+os" });
                  buildArchitectures: "web+os" });
};
insertVersion("sparky-forms", "1.1.2", "1.0.0", {"forms": "=1.0.1", "sparkle": "=2.1.1"});
insertVersion("sparky-forms", "1.0.0", "1.0.0", {"awesome-dropdown": "=1.4.0"});
@@ -505,7 +505,7 @@ function getCatalogStub (gems) {
  packageVersion.dependencies[name] = {
    constraint: convertConstraints(constraints)[0], // XXX pick first one only
    references: [{
      "arch": "client"
      "arch": "web"
    }, {
      "arch": "os" }]
  };
@@ -1,5 +1,6 @@
var semver = Npm.require('semver');

// Copied from archinfo.matches() in tools/
var archMatches = function (arch, baseArch) {
  return arch.substr(0, baseArch.length) === baseArch &&
    (arch.length === baseArch.length ||
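The hunk above cuts `archMatches` off mid-expression. Its likely completion (an assumption inferred from the visible prefix check, not confirmed by this diff) is a prefix match that only counts at a `.` boundary, so `web.browser` matches `web` but `webapp` does not:

```javascript
// Sketch of the full predicate: `arch` matches `baseArch` when it equals
// it, or extends it with a "."-separated suffix.
var archMatches = function (arch, baseArch) {
  return arch.substr(0, baseArch.length) === baseArch &&
    (arch.length === baseArch.length ||
     arch.charAt(baseArch.length) === '.');
};
```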

@@ -66,9 +67,9 @@ ConstraintSolver.PackagesResolver.prototype._loadPackageInfo = function (
  var unibuilds = {};

  // XXX in theory there might be different archs but in practice they are
  // always "os", "client.browser" and "client.cordova". Fix this once we
  // always "os", "web.browser" and "web.cordova". Fix this once we
  // actually have different archs used.
  var allArchs = ["os", "client.browser", "client.cordova"];
  var allArchs = ["os", "web.browser", "web.cordova"];
  _.each(allArchs, function (arch) {
    var unitName = packageName + "#" + arch;
    unibuilds[unitName] = new ConstraintSolver.UnitVersion(
@@ -179,11 +180,11 @@ ConstraintSolver.PackagesResolver.prototype.resolve = function (
  }

  // split every package name to one or more archs belonging to that package
  // (["foobar"] => ["foobar#os", "foobar#client.browser", ...])
  // (["foobar"] => ["foobar#os", "foobar#web.browser", ...])
  // XXX for now just hardcode in all of the known architectures
  options.upgrade = _.filter(_.flatten(_.map(options.upgrade, function (packageName) {
    return [packageName + "#os", packageName + "#client.browser",
            packageName + "#client.cordova"];
    return [packageName + "#os", packageName + "#web.browser",
            packageName + "#web.cordova"];
  })), _.identity);
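The fan-out performed above can be sketched without underscore (`splitToArchUnits` is an illustrative name, not the solver's API): each package name expands to one `name#arch` unit per known architecture, using the new `web.*` arch names.

```javascript
// Hardcoded arch list, matching the diff's rename to "web.*".
var ALL_ARCHS = ['os', 'web.browser', 'web.cordova'];

// ["foobar"] => ["foobar#os", "foobar#web.browser", "foobar#web.cordova"]
function splitToArchUnits(packageNames) {
  return packageNames.reduce(function (units, name) {
    return units.concat(ALL_ARCHS.map(function (arch) {
      return name + '#' + arch;
    }));
  }, []);
}
```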
|
||||
|
||||
var dc = self._splitDepsToConstraints(dependencies, constraints);
|
||||
@@ -214,7 +215,7 @@ ConstraintSolver.PackagesResolver.prototype.resolve = function (
|
||||
var resultChoices = {};
|
||||
_.each(res, function (uv) {
|
||||
// Since we don't yet define the interface for a an app to depend only on
|
||||
// certain unibuilds of the packages (like only client unibuilds) and we know
|
||||
// certain unibuilds of the packages (like only web unibuilds) and we know
|
||||
// that each unibuild weakly depends on other sibling unibuilds of the same
|
||||
// version, we can safely output the whole package for each unibuild in the
|
||||
// result.
|
||||
@@ -257,7 +258,7 @@ ConstraintSolver.PackagesResolver.prototype.propagateExactDeps =
|
||||
};
|
||||
|
||||
// takes dependencies and constraints and rewrites the names from "foo" to
|
||||
// "foo#os" and "foo#client.browser" and "foo#client.cordova"
|
||||
// "foo#os" and "foo#web.browser" and "foo#web.cordova"
|
||||
// XXX right now creates a dependency for every unibuild it can find
|
||||
ConstraintSolver.PackagesResolver.prototype._splitDepsToConstraints =
|
||||
function (inputDeps, inputConstraints) {
|
||||
@@ -294,7 +295,7 @@ ConstraintSolver.PackagesResolver.prototype._unibuildsForPackage =
|
||||
var unibuilds = [];
|
||||
// XXX hardcode all common architectures assuming that every package has the
|
||||
// same set of architectures.
|
||||
_.each(["os", "client.browser", "client.cordova"], function (arch) {
|
||||
_.each(["os", "web.browser", "web.cordova"], function (arch) {
|
||||
if (self.resolver.unitsVersions[unibuildPrefix + arch])
|
||||
unibuilds.push(unibuildPrefix + arch);
|
||||
});
|
||||
|
||||
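The constraint-solver hunks above repeatedly expand a package name into one unit name per architecture (the merge renames the `client.*` archs to `web.*`). A minimal standalone sketch of that name-splitting step, with a hypothetical helper name and plain JavaScript in place of underscore:

```javascript
// Sketch of the "split package names into per-arch unit names" step shown
// in the hunks above. `splitToArchUnits` is a made-up name for illustration;
// the real logic lives inline in ConstraintSolver.PackagesResolver.
var allArchs = ["os", "web.browser", "web.cordova"];

function splitToArchUnits(packageNames) {
  // "foobar" => ["foobar#os", "foobar#web.browser", "foobar#web.cordova"]
  var result = [];
  packageNames.forEach(function (packageName) {
    allArchs.forEach(function (arch) {
      result.push(packageName + "#" + arch);
    });
  });
  return result;
}
```

The real code additionally filters out falsy entries with `_.filter(..., _.identity)`; the sketch keeps only the expansion itself.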
@@ -3,16 +3,7 @@ var path = Npm.require('path');
var less = Npm.require('less');
var Future = Npm.require('fibers/future');

Plugin.registerSourceHandler("less", function (compileStep) {
// XXX annoying that this is replicated in .css, .less, and .styl
if (! compileStep.archMatches('client')) {
// XXX in the future, might be better to emit some kind of a
// warning if a stylesheet is included on the server, rather than
// silently ignoring it. but that would mean you can't stick .css
// at the top level of your app, which is kind of silly.
return;
}
Plugin.registerSourceHandler("less", {archMatching: 'web'}, function (compileStep) {
var source = compileStep.read().toString('utf8');
var options = {
filename: compileStep.inputPath,
@@ -1456,6 +1456,16 @@ var wrapInternalException = function (exception, context) {
if (!exception || exception instanceof Meteor.Error)
return exception;

// tests can set the 'expected' flag on an exception so it won't go to the
// server log
if (!exception.expected) {
Meteor._debug("Exception " + context, exception.stack);
if (exception.sanitizedError) {
Meteor._debug("Sanitized and reported to the client as:", exception.sanitizedError.message);
Meteor._debug();
}
}

// Did the error contain more details that could have been useful if caught in
// server code (or if thrown from non-client-originated code), but also
// provided a "sanitized" version with more context than 500 Internal server

@@ -1467,11 +1477,6 @@ var wrapInternalException = function (exception, context) {
"is not a Meteor.Error; ignoring");
}

// tests can set the 'expected' flag on an exception so it won't go to the
// server log
if (!exception.expected)
Meteor._debug("Exception " + context, exception.stack);

return new Meteor.Error(500, "Internal server error");
};
@@ -526,16 +526,18 @@ if (Meteor.isClient) {
]);

testAsyncMulti("livedata - publisher errors", (function () {
// Use a separate connection so that we can safely check to see if
// conn._subscriptions is empty.
var conn = new LivedataTest.Connection('/',
{reloadWithOutstanding: true});
var collName = Random.id();
var coll = new Meteor.Collection(collName, {connection: conn});
var conn, collName, coll;
var errorFromRerun;
var gotErrorFromStopper = false;
return [
function (test, expect) {
// Use a separate connection so that we can safely check to see if
// conn._subscriptions is empty.
conn = new LivedataTest.Connection('/',
{reloadWithOutstanding: true});
collName = Random.id();
coll = new Meteor.Collection(collName, {connection: conn});

var testSubError = function (options) {
conn.subscribe("publisherErrors", collName, options, {
onReady: expect(),
@@ -54,7 +54,7 @@ _.extend(LivedataTest.ClientStream.prototype, {
// But _launchConnection calls _cleanup which closes previous connections.
// It's our belief that this stifles future 'open' events, but maybe
// we are wrong?
throw new Error("Got open from inactive client");
throw new Error("Got open from inactive client " + !!self.client);
}

if (self._forcedToDisconnect) {
@@ -2,16 +2,7 @@
we can't exactly define the *.js source file handler in a *.js
source file. */

Plugin.registerSourceHandler("css", function (compileStep) {
// XXX annoying that this is replicated in .css, .less, and .styl
if (! compileStep.archMatches('client')) {
// XXX in the future, might be better to emit some kind of a
// warning if a stylesheet is included on the server, rather than
// silently ignoring it. but that would mean you can't stick .css
// at the top level of your app, which is kind of silly.
return;
}
Plugin.registerSourceHandler("css", {archMatching: 'web'}, function (compileStep) {
compileStep.addStylesheet({
data: compileStep.read().toString('utf8'),
path: compileStep.inputPath
@@ -339,6 +339,21 @@ _.extend(LocalCollection.Cursor.prototype, {
query.movedBefore = wrapCallback(options.movedBefore);
}

if (!options._suppress_initial && !self.collection.paused) {
// XXX unify ordered and unordered interface
var each = ordered
? _.bind(_.each, null, query.results)
: _.bind(query.results.forEach, query.results);
each(function (doc) {
var fields = EJSON.clone(doc);

delete fields._id;
if (ordered)
query.addedBefore(doc._id, fields, null);
query.added(doc._id, fields);
});
}

var handle = new LocalCollection.ObserveHandle;
_.extend(handle, {
collection: self.collection,
@@ -358,30 +373,9 @@ _.extend(LocalCollection.Cursor.prototype, {
handle.stop();
});
}

if (!options._suppress_initial && !self.collection.paused) {
// XXX unify ordered and unordered interface
var each = ordered
? _.bind(_.each, null, query.results)
: _.bind(query.results.forEach, query.results);
each(function (doc) {
var fields = EJSON.clone(doc);

delete fields._id;
if (ordered)
query.addedBefore(doc._id, fields, null);
query.added(doc._id, fields);
});

// run the observe callbacks resulting from the initial contents
// before we leave the observe.
if (self.collection._observeQueue.safeToRunTask()) {
self.collection._observeQueue.drain();
} else if (options.added || options.addedBefore) {
// See #2315.
throw Error("observeChanges called from an observe callback on the same collection cannot differentiate between initial and later adds");
}
}
// run the observe callbacks resulting from the initial contents
// before we leave the observe.
self.collection._observeQueue.drain();

return handle;
}
@@ -3109,28 +3109,3 @@ Tinytest.add("minimongo - fetch in observe", function (test) {
observe.stop();
computation.stop();
});

Tinytest.add("minimongo - observe in observe", function (test) {
var coll = new LocalCollection;
coll.insert({foo: 2});

var observe1AddedCalled = false;
var observe1 = coll.find({foo: 1}).observeChanges({
added: function (id, fields) {
observe1AddedCalled = true;
test.equal(fields, {foo: 1});

// It would be even better if this didn't throw; see #2315.
test.throws(function () {
coll.find({foo: 2}).observeChanges({
added: function () {
}
});
});
}
});
test.isFalse(observe1AddedCalled);
coll.insert({foo: 1});
test.isTrue(observe1AddedCalled);
observe1.stop();
});
@@ -159,7 +159,8 @@ LocalCollection._observeFromObserveChanges = function (cursor, observeCallbacks)
var oldDoc = self.docs.get(id);
var doc = EJSON.clone(oldDoc);
LocalCollection._applyChanges(doc, fields);
observeCallbacks.changed(transform(doc), transform(oldDoc));
observeCallbacks.changed(transform(doc),
transform(EJSON.clone(oldDoc)));
}
},
removed: function (id) {
@@ -176,32 +177,5 @@ LocalCollection._observeFromObserveChanges = function (cursor, observeCallbacks)
var handle = cursor.observeChanges(changeObserver.applyChange);
suppressed = false;

if (changeObserver.ordered) {
// Fetches the current list of documents, in order, as an array. Can be
// called at any time. Internal API assumed by the `observe-sequence`
// package (used by Meteor UI for `#each` blocks). Only defined on ordered
// observes (those that listen on `addedAt` or similar). Continues to work
// after `stop()` is called on the handle.
//
// Because we already materialize the full OrderedDict of all documents, it
// seems nice to provide access to the view rather than making the data
// consumer reconstitute it. This gives the consumer a shot at doing
// something smart with the feed like proxying it, since firing callbacks
// like `changed` and `movedTo` basically requires omniscience (knowing old
// and new documents, old and new indices, and the correct value for
// `before`).
//
// NOTE: If called from an observe callback for a certain change, the result
// is *not* guaranteed to be a snapshot of the cursor up to that
// change. This is because the callbacks are invoked before updating docs.
handle._fetch = function () {
var docsArray = [];
changeObserver.docs.forEach(function (doc) {
docsArray.push(transform(EJSON.clone(doc)));
});
return docsArray;
};
}

return handle;
};
File diff suppressed because it is too large
@@ -4,19 +4,28 @@
if (##SET_CREDENTIAL_TOKEN##) {
var credentialToken = ##TOKEN##;
var credentialSecret = ##SECRET##;
try {
localStorage[##LOCAL_STORAGE_PREFIX## + credentialToken] = credentialSecret;
} catch (err) {
// localStorage didn't work; try window.opener.
window.opener &&
window.opener.Package.oauth.OAuth._handleCredentialSecret(
credentialToken, credentialSecret);
// If window.opener isn't set, we can't do much else, but at least
// close the popup instead of having it hang around on a blank page.
if (window.opener && window.opener.Package &&
window.opener.Package.oauth) {
window.opener.Package.oauth.OAuth._handleCredentialSecret(
credentialToken, credentialSecret);
} else {
try {
localStorage[##LOCAL_STORAGE_PREFIX## + credentialToken] = credentialSecret;
} catch (err) {
// We can't do much else, but at least close the popup instead
// of having it hang around on a blank page.
}
}
}
window.close();
</script>
</head>
<body></body>
<body>
<p>
Login completed. <a href="#" onclick="window.close()">
Click here</a> to close this window.
</p>
<img src="not-a-real-image-for-oauth" style="width:1px;height:1px"
onerror="window.close();" />
</body>
</html>
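The popup change above reorders the hand-off of the OAuth credential secret: deliver it to the opener window first, and only fall back to localStorage. A standalone sketch of that decision, with stubbed `window` and storage objects (all names here are illustrative, not the package's API surface):

```javascript
// Sketch of the reordered credential hand-off in the login-completion popup
// shown above: prefer the opener's OAuth handler, else try localStorage.
// `deliverCredentialSecret` and the stubs are hypothetical test scaffolding.
function deliverCredentialSecret(win, storage, prefix, token, secret) {
  if (win.opener && win.opener.Package && win.opener.Package.oauth) {
    win.opener.Package.oauth.OAuth._handleCredentialSecret(token, secret);
    return "opener";
  }
  try {
    storage[prefix + token] = secret;
    return "localStorage";
  } catch (err) {
    // Storage unavailable; nothing else to do but close the popup.
    return "none";
  }
}
```

The guard checks `win.opener.Package` before dereferencing it, which is exactly the bug the diff fixes in the old `window.opener && window.opener.Package.oauth...` expression.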
@@ -92,17 +92,16 @@ OAuth._handleCredentialSecret = function (credentialToken, secret) {
// Used by accounts-oauth, which needs both a credentialToken and the
// corresponding to credential secret to call the `login` method over DDP.
OAuth._retrieveCredentialSecret = function (credentialToken) {
// Check localStorage first, then check the secrets collected by
// OAuth._handleCredentialSecret. This matches what we do in
// First check the secrets collected by OAuth._handleCredentialSecret,
// then check localStorage. This matches what we do in
// end_of_login_response.html.
var localStorageKey = OAuth._localStorageTokenPrefix +
credentialToken;
var secret = Meteor._localStorage.getItem(localStorageKey);

if (secret) {
var secret = credentialSecrets[credentialToken];
if (! secret) {
var localStorageKey = OAuth._localStorageTokenPrefix +
credentialToken;
secret = Meteor._localStorage.getItem(localStorageKey);
Meteor._localStorage.removeItem(localStorageKey);
} else {
secret = credentialSecrets[credentialToken];
delete credentialSecrets[credentialToken];
}
return secret;
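The hunk above mirrors the popup change on the retrieval side: check the in-memory secrets collected by `_handleCredentialSecret` first, then fall back to localStorage, consuming the secret either way. A standalone sketch with a stubbed storage object (stub names are illustrative):

```javascript
// Sketch of the reordered one-shot lookup in OAuth._retrieveCredentialSecret
// above. The real code uses Meteor._localStorage; this stub stands in so the
// sketch runs anywhere.
var credentialSecrets = {};
var localStorageStub = {
  store: {},
  getItem: function (k) {
    return (k in this.store) ? this.store[k] : null;
  },
  removeItem: function (k) { delete this.store[k]; }
};
var LOCAL_STORAGE_PREFIX = "oauth-secret-"; // assumed prefix, for the sketch

function retrieveCredentialSecret(credentialToken) {
  // Prefer the in-memory secret; fall back to storage. Delete the secret
  // from wherever it was found so it can only be used once.
  var secret = credentialSecrets[credentialToken];
  if (!secret) {
    var key = LOCAL_STORAGE_PREFIX + credentialToken;
    secret = localStorageStub.getItem(key);
    localStorageStub.removeItem(key);
  } else {
    delete credentialSecrets[credentialToken];
  }
  return secret;
}
```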
@@ -11,7 +11,6 @@ var registeredServices = {};
// Internal: Maps from service version to handler function. The
// 'oauth1' and 'oauth2' packages manipulate this directly to register
// for callbacks.
//
OAuth._requestHandlers = {};
@@ -82,11 +82,10 @@ ObserveSequence = {
Deps.nonreactive(function () {
var seqArray; // same structure as `lastSeqArray` above.

// If we were previously observing a cursor, replace lastSeqArray with
// more up-to-date information (specifically, the state of the observe
// before it was stopped, which may be older than the DB).
if (activeObserveHandle) {
lastSeqArray = _.map(activeObserveHandle._fetch(), function (doc) {
// If we were previously observing a cursor, replace lastSeqArray with
// more up-to-date information. Then stop the old observe.
lastSeqArray = _.map(lastSeq.fetch(), function (doc) {
return {_id: doc._id, item: doc};
});
activeObserveHandle.stop();
@@ -94,78 +93,19 @@ ObserveSequence = {
}

if (!seq) {
seqArray = [];
diffArray(lastSeqArray, seqArray, callbacks);
seqArray = seqChangedToEmpty(lastSeqArray, callbacks);
} else if (seq instanceof Array) {
var idsUsed = {};
seqArray = _.map(seq, function (item, index) {
var id;
if (typeof item === 'string') {
// ensure not empty, since other layers (eg DomRange) assume this as well
id = "-" + item;
} else if (typeof item === 'number' ||
typeof item === 'boolean' ||
item === undefined) {
id = item;
} else if (typeof item === 'object') {
id = (item && item._id) || index;
} else {
throw new Error("{{#each}} doesn't support arrays with " +
"elements of type " + typeof item);
}

var idString = idStringify(id);
if (idsUsed[idString]) {
if (typeof item === 'object' && '_id' in item)
warn("duplicate id " + id + " in", seq);
id = Random.id();
} else {
idsUsed[idString] = true;
}

return { _id: id, item: item };
});

diffArray(lastSeqArray, seqArray, callbacks);
seqArray = seqChangedToArray(lastSeqArray, seq, callbacks);
} else if (isStoreCursor(seq)) {
var cursor = seq;
seqArray = [];

var initial = true; // are we observing initial data from cursor?
activeObserveHandle = cursor.observe({
addedAt: function (document, atIndex, before) {
if (initial) {
// keep track of initial data so that we can diff once
// we exit `observe`.
if (before !== null)
throw new Error("Expected initial data from observe in order");
seqArray.push({ _id: document._id, item: document });
} else {
callbacks.addedAt(document._id, document, atIndex, before);
}
},
changedAt: function (newDocument, oldDocument, atIndex) {
callbacks.changedAt(newDocument._id, newDocument, oldDocument,
atIndex);
},
removedAt: function (oldDocument, atIndex) {
callbacks.removedAt(oldDocument._id, oldDocument, atIndex);
},
movedTo: function (document, fromIndex, toIndex, before) {
callbacks.movedTo(
document._id, document, fromIndex, toIndex, before);
}
});
initial = false;

// diff the old sequnce with initial data in the new cursor. this will
// fire `addedAt` callbacks on the initial data.
diffArray(lastSeqArray, seqArray, callbacks);

var result /* [seqArray, activeObserveHandle] */ =
seqChangedToCursor(lastSeqArray, seq, callbacks);
seqArray = result[0];
activeObserveHandle = result[1];
} else {
throw badSequenceError();
}

diffArray(lastSeqArray, seqArray, callbacks);
lastSeq = seq;
lastSeqArray = seqArray;
});
@@ -306,3 +246,73 @@ var diffArray = function (lastSeqArray, seqArray, callbacks) {
}
});
};

seqChangedToEmpty = function (lastSeqArray, callbacks) {
return [];
};

seqChangedToArray = function (lastSeqArray, array, callbacks) {
var idsUsed = {};
var seqArray = _.map(array, function (item, index) {
var id;
if (typeof item === 'string') {
// ensure not empty, since other layers (eg DomRange) assume this as well
id = "-" + item;
} else if (typeof item === 'number' ||
typeof item === 'boolean' ||
item === undefined) {
id = item;
} else if (typeof item === 'object') {
id = (item && item._id) || index;
} else {
throw new Error("{{#each}} doesn't support arrays with " +
"elements of type " + typeof item);
}

var idString = idStringify(id);
if (idsUsed[idString]) {
if (typeof item === 'object' && '_id' in item)
warn("duplicate id " + id + " in", array);
id = Random.id();
} else {
idsUsed[idString] = true;
}

return { _id: id, item: item };
});

return seqArray;
};

seqChangedToCursor = function (lastSeqArray, cursor, callbacks) {
var initial = true; // are we observing initial data from cursor?
var seqArray = [];

var observeHandle = cursor.observe({
addedAt: function (document, atIndex, before) {
if (initial) {
// keep track of initial data so that we can diff once
// we exit `observe`.
if (before !== null)
throw new Error("Expected initial data from observe in order");
seqArray.push({ _id: document._id, item: document });
} else {
callbacks.addedAt(document._id, document, atIndex, before);
}
},
changedAt: function (newDocument, oldDocument, atIndex) {
callbacks.changedAt(newDocument._id, newDocument, oldDocument,
atIndex);
},
removedAt: function (oldDocument, atIndex) {
callbacks.removedAt(oldDocument._id, oldDocument, atIndex);
},
movedTo: function (document, fromIndex, toIndex, before) {
callbacks.movedTo(
document._id, document, fromIndex, toIndex, before);
}
});
initial = false;

return [seqArray, observeHandle];
};
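The `seqChangedToArray` helper extracted above assigns an id to each item of a plain array handed to `{{#each}}`: strings get a `"-"` prefix (so the id is never empty), other primitives are used directly, and objects use their `_id` or fall back to the array index. A minimal sketch of just that id rule, with a hypothetical helper name and the duplicate-id/`Random.id()` handling omitted:

```javascript
// Sketch of the per-item id rule used by seqChangedToArray above.
// `idForItem` is an illustrative name; the real code inlines this logic
// and additionally resolves duplicate ids via idStringify/Random.id().
function idForItem(item, index) {
  if (typeof item === 'string')
    return "-" + item; // prefix ensures a non-empty id
  if (typeof item === 'number' || typeof item === 'boolean' ||
      item === undefined)
    return item;
  if (typeof item === 'object')
    return (item && item._id) || index; // _id if present, else position
  throw new Error("{{#each}} doesn't support arrays with elements of type " +
                  typeof item);
}
```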
@@ -439,9 +439,8 @@ Tinytest.add('observe-sequence - cursor to same cursor', function (test) {
}, [
{addedAt: ["13", {_id: "13", rank: 1}, 0, null]},
{addedAt: ["24", {_id: "24", rank: 2}, 1, null]},
// even if the cursor changes to the same cursor, we diff to see if we
// missed anything during the invalidation, which leads to these
// 'changedAt' events.
// even if the cursor changes to the same cursor, we do a diff,
// which leads to these 'changedAt' events.
{changedAt: ["13", {_id: "13", rank: 1}, {_id: "13", rank: 1}, 0]},
{changedAt: ["24", {_id: "24", rank: 2}, {_id: "24", rank: 2}, 1]},
{addedAt: ["78", {_id: "78", rank: 3}, 2, null]}
@@ -1,7 +1,11 @@
var currentTest = null;

var t = function (versionString, expected, descr) {
currentTest.equal(PackageVersion.parseConstraint(versionString), expected, descr);
currentTest.equal(
_.omit(PackageVersion.parseConstraint(versionString),
'constraintString'),
expected,
descr);
};

var FAIL = function (versionString) {
@@ -13,6 +13,7 @@ Package.on_test(function (api) {
api.use('test-helpers');
api.use('showdown');
api.use('minimongo');
api.use('deps');

api.use('templating', 'client');
api.add_files([
@@ -922,3 +922,41 @@ Hi there!
<template name="spacebars_test_isolated_lookup3">
{{> bar}}--{{> spacebars_test_isolated_lookup_inclusion}}
</template>

<template name="spacebars_test_current_view_in_event">
<span>{{.}}</span>
</template>

<template name="spacebars_test_textarea_attrs">
<textarea {{attrs}}></textarea>
</template>

<template name="spacebars_test_textarea_attrs_contents">
<textarea {{attrs}}>Hello {{name}}</textarea>
</template>

<template name="spacebars_test_textarea_attrs_array_contents">
<textarea {{attrs}} class="bar">Hello {{name}}</textarea>
</template>

<template name="spacebars_test_autorun">
{{#if show}}
{{>spacebars_test_autorun_inner}}
{{/if}}
</template>

<template name="spacebars_test_autorun_inner">
Hello
</template>

<template name="spacebars_test_contentBlock_arg">
{{#spacebars_test_contentBlock_arg_inner}}
{{this.bar}}
{{/spacebars_test_contentBlock_arg_inner}}
</template>

<template name="spacebars_test_contentBlock_arg_inner">
{{#with foo="AAA" bar="BBB"}}
{{this.foo}} {{> UI.contentBlock this}}
{{/with}}
</template>
@@ -2604,3 +2604,163 @@ _.each([1, 2, 3], function (n) {
}
);
});

Tinytest.add('spacebars-tests - template_tests - current view in event handler', function (test) {
var tmpl = Template.spacebars_test_current_view_in_event;

var currentView;
var currentData;

tmpl.events({
'click span': function () {
currentView = Blaze.getCurrentView();
currentData = Blaze.getCurrentData();
}
});

var div = renderToDiv(tmpl, 'blah');
test.equal(canonicalizeHtml(div.innerHTML), '<span>blah</span>');
document.body.appendChild(div);
clickElement(div.querySelector('span'));
$(div).remove();

test.isTrue(currentView);
test.equal(currentData, 'blah');
});

Tinytest.add(
"spacebars-tests - template_tests - textarea attrs", function (test) {
var tmplNoContents = {
tmpl: Template.spacebars_test_textarea_attrs,
hasTextAreaContents: false
};
var tmplWithContents = {
tmpl: Template.spacebars_test_textarea_attrs_contents,
hasTextAreaContents: true
};
var tmplWithContentsAndMoreAttrs = {
tmpl: Template.spacebars_test_textarea_attrs_array_contents,
hasTextAreaContents: true
};

_.each(
[tmplNoContents, tmplWithContents,
tmplWithContentsAndMoreAttrs],
function (tmplInfo) {

var id = new ReactiveVar("textarea-" + Random.id());
var name = new ReactiveVar("one");
var attrs = new ReactiveVar({
id: "textarea-" + Random.id()
});

var div = renderToDiv(tmplInfo.tmpl, {
attrs: function () {
return attrs.get();
},
name: function () {
return name.get();
}
});

// Check that the id and value attribute are as we expect.
// We can't check div.innerHTML because Chrome at least doesn't
// appear to put textarea value attributes in innerHTML.
var textarea = div.querySelector("textarea");
test.equal(textarea.id, attrs.get().id);
test.equal(
textarea.value, tmplInfo.hasTextAreaContents ? "Hello one" : "");
// One of the templates has a separate attribute in addition to
// an attributes dictionary.
if (tmplInfo === tmplWithContentsAndMoreAttrs) {
test.equal($(textarea).attr("class"), "bar");
}

// Change the id, check that the attribute updates reactively.
attrs.set({ id: "textarea-" + Random.id() });
Deps.flush();
test.equal(textarea.id, attrs.get().id);

// Change the name variable, check that the textarea value
// updates reactively.
name.set("two");
Deps.flush();
test.equal(
textarea.value, tmplInfo.hasTextAreaContents ? "Hello two" : "");

if (tmplInfo === tmplWithContentsAndMoreAttrs) {
test.equal($(textarea).attr("class"), "bar");
}

});

});
Tinytest.add(
"spacebars-tests - template_tests - this.autorun",
function (test) {
var tmpl = Template.spacebars_test_autorun;
var tmplInner = Template.spacebars_test_autorun_inner;

// Keep track of the value of `UI._templateInstance()` inside the
// autorun each time it runs.
var autorunTemplateInstances = [];
var actualTemplateInstance;
var returnedComputation;
var computationArg;

var show = new ReactiveVar(true);
var rv = new ReactiveVar("foo");

tmplInner.created = function () {
actualTemplateInstance = this;
returnedComputation = this.autorun(function (c) {
computationArg = c;
rv.get();
autorunTemplateInstances.push(UI._templateInstance());
});
};

tmpl.helpers({
show: function () {
return show.get();
}
});

var div = renderToDiv(tmpl);
test.equal(autorunTemplateInstances.length, 1);
test.equal(autorunTemplateInstances[0], actualTemplateInstance);

// Test that the autorun returned a computation and received a
// computation as an argument.
test.isTrue(returnedComputation instanceof Deps.Computation);
test.equal(returnedComputation, computationArg);

// Make sure the autorun re-runs when `rv` changes, and that it has
// the correct current view.
rv.set("bar");
Deps.flush();
test.equal(autorunTemplateInstances.length, 2);
test.equal(autorunTemplateInstances[1], actualTemplateInstance);

// If the inner template is destroyed, the autorun should be stopped.
show.set(false);
Deps.flush();
rv.set("baz");
Deps.flush();

test.equal(autorunTemplateInstances.length, 2);
test.equal(rv.numListeners(), 0);
}
);

// Test that argument in {{> UI.contentBlock arg}} is evaluated in
// the proper data context.
Tinytest.add(
"spacebars-tests - template_tests - contentBlock argument",
function (test) {
var tmpl = Template.spacebars_test_contentBlock_arg;
var div = renderToDiv(tmpl);
test.equal(canonicalizeHtml(div.innerHTML), 'AAA BBB');
});
@@ -17,10 +17,6 @@ render the template. -->
the template to render) and a `data` property, which can be falsey. -->
<template name="__dynamicWithDataContext">
{{#with chooseTemplate template}}
{{#with ../data}} {{! original 'dataContext' argument to __dynamic}}
{{> ..}} {{! return value from chooseTemplate(template) }}
{{else}} {{! if the 'dataContext' argument was falsey }}
{{> .. ../data}} {{! return value from chooseTemplate(template) }}
{{/with}}
{{> .. ../data}} {{!-- The .. is evaluated inside {{#with ../data}} --}}
{{/with}}
</template>
@@ -205,7 +205,32 @@ Spacebars.dot = function (value, id1/*, id2, ...*/) {
};

Spacebars.TemplateWith = function (argFunc, contentBlock) {
var w = Blaze.With(argFunc, contentBlock);
var w;

// This is a little messy. When we compile `{{> UI.contentBlock}}`, we
// wrap it in Blaze.InOuterTemplateScope in order to skip the intermediate
// parent Views in the current template. However, when there's an argument
// (`{{> UI.contentBlock arg}}`), the argument needs to be evaluated
// in the original scope. There's no good order to nest
// Blaze.InOuterTemplateScope and Spacebars.TemplateWith to achieve this,
// so we wrap argFunc to run it in the "original parentView" of the
// Blaze.InOuterTemplateScope.
//
// To make this better, reconsider InOuterTemplateScope as a primitive.
// Longer term, evaluate expressions in the proper lexical scope.
var wrappedArgFunc = function () {
var viewToEvaluateArg = null;
if (w.parentView && w.parentView.kind === 'InOuterTemplateScope') {
viewToEvaluateArg = w.parentView.originalParentView;
}
if (viewToEvaluateArg) {
return Blaze.withCurrentView(viewToEvaluateArg, argFunc);
} else {
return argFunc();
}
};

w = Blaze.With(wrappedArgFunc, contentBlock);
w.__isTemplateWith = true;
return w;
};
@@ -1,22 +1,26 @@
// 'url' is assigned to in a statement before this.
var page = require('webpage').create();
page.open(url);
setInterval(function() {
var ready = page.evaluate(function () {
if (typeof Meteor !== 'undefined'
&& typeof(Meteor.status) !== 'undefined'
&& Meteor.status().connected) {
Deps.flush();
return DDP._allSubscriptionsReady();
}
return false;
});
if (ready) {
var out = page.content;
out = out.replace(/<script[^>]+>(.|\n|\r)*?<\/script\s*>/ig, '');
out = out.replace('<meta name="fragment" content="!">', '');
console.log(out);
page.open(url, function(status) {
if (status === 'fail')
phantom.exit();
}
}, 100);
setInterval(function() {
var ready = page.evaluate(function () {
if (typeof Meteor !== 'undefined'
&& typeof(Meteor.status) !== 'undefined'
&& Meteor.status().connected) {
Deps.flush();
return DDP._allSubscriptionsReady();
}
return false;
});
if (ready) {
var out = page.content;
out = out.replace(/<script[^>]+>(.|\n|\r)*?<\/script\s*>/ig, '');
out = out.replace('<meta name="fragment" content="!">', '');
console.log(out);
phantom.exit();
}
}, 100);
});
@@ -39,8 +39,8 @@ StarTranslator._translate = function (bundlePath) {
"builtBy": "Star translator",
"programs": [
{
"name": "client.browser",
"arch": "client.browser",
"name": "web.browser",
"arch": "web.browser",
"path": "client.json"
},
{
@@ -99,7 +99,7 @@ StarTranslator._writeClientProg = function (bundlePath, clientProgPath) {
"app.json"),
'utf8'));
var clientManifest = {
"format": "client-program-pre1",
"format": "web-program-pre1",
"manifest": origClientManifest.manifest,
// XXX Haven't updated this for the app.html -> head/body change, but
// surely we don't need to because code in pre-star apps doesn't
@@ -4,16 +4,7 @@ var nib = Npm.require('nib');
 var path = Npm.require('path');
 var Future = Npm.require('fibers/future');

-Plugin.registerSourceHandler("styl", function (compileStep) {
-  // XXX annoying that this is replicated in .css, .less, and .styl
-  if (! compileStep.archMatches('client')) {
-    // XXX in the future, might be better to emit some kind of a
-    // warning if a stylesheet is included on the server, rather than
-    // silently ignoring it. but that would mean you can't stick .css
-    // at the top level of your app, which is kind of silly.
-    return;
-  }
-
+Plugin.registerSourceHandler("styl", {archMatching: 'web'}, function (compileStep) {
   var f = new Future;
   stylus(compileStep.read().toString('utf8'))
     .use(nib())
@@ -1,15 +1,6 @@
 var path = Npm.require('path');

 var doHTMLScanning = function (compileStep, htmlScanner) {
-  if (! compileStep.archMatches("client"))
-    // XXX might be nice to throw an error here, but then we'd have to
-    // make it so that packages.js ignores html files that appear in
-    // the server directories in an app tree.. or, it might be nice to
-    // make html files actually work on the server (against jsdom or
-    // something)
-    return;
-
   // XXX the way we deal with encodings here is sloppy .. should get
   // religion on that
   var contents = compileStep.read().toString('utf8');
@@ -53,7 +44,7 @@ var doHTMLScanning = function (compileStep, htmlScanner) {
 };

 Plugin.registerSourceHandler(
-  "html", {isTemplate: true},
+  "html", {isTemplate: true, archMatching: 'web'},
   function (compileStep) {
     doHTMLScanning(compileStep, html_scanner);
   }
@@ -37,6 +37,9 @@ Template.__updateTemplateInstance = function (view) {
       data: null,
       firstNode: null,
       lastNode: null,
+      autorun: function (f) {
+        return view.autorun(f);
+      },
       __view__: view
     };
   }
@@ -2,6 +2,7 @@ canonicalizeHtml = function(html) {
   var h = html;
   // kill IE-specific comments inserted by DomRange
   h = h.replace(/<!--IE-->/g, '');
+  h = h.replace(/<!---->/g, '');
   // ignore exact text of comments
   h = h.replace(/<!--.*?-->/g, '<!---->');
   // make all tags lowercase
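The comment-normalization steps in the hunk above can be exercised in isolation. This is a sketch, not the actual `canonicalizeHtml` helper; the function name `canonicalizeComments` and the sample HTML are hypothetical.

```javascript
// Sketch of the comment-handling portion of canonicalizeHtml above.
function canonicalizeComments(html) {
  var h = html;
  h = h.replace(/<!--IE-->/g, '');         // kill IE-specific comments from DomRange
  h = h.replace(/<!---->/g, '');           // kill empty comments (the added line)
  h = h.replace(/<!--.*?-->/g, '<!---->'); // ignore exact text of remaining comments
  return h;
}

console.log(canonicalizeComments('<b>x</b><!--IE--><!--note--><!---->'));
// <b>x</b><!---->
```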
@@ -8,18 +8,43 @@ TEST_STATUS = {
   FAILURES: null
 };

+// xUnit format uses XML output
+var XML_CHAR_MAP = {
+  '<': '&lt;',
+  '>': '&gt;',
+  '&': '&amp;',
+  '"': '&quot;',
+  "'": '&apos;'
+};
+
+// Escapes a string for insertion into XML
+var escapeXml = function (s) {
+  return s.replace(/[<>&"']/g, function (c) {
+    return XML_CHAR_MAP[c];
+  });
+}
+
 // Returns a human name for a test
 var getName = function (result) {
   return (result.server ? "S: " : "C: ") +
     result.groupPath.join(" - ") + " - " + result.test;
 };

 // Calls console.log, but returns silently if console.log is not available
 var log = function (/*arguments*/) {
   if (typeof console !== 'undefined') {
     console.log.apply(console, arguments);
   }
 };

+// Logs xUnit output, if xunit output is enabled
+// Output is sent to console.log, prefixed with a magic string 'XUNIT '
+// By grepping for that prefix, the xUnit output can be extracted
+var xunit = function (s) {
+  if (xunitEnabled) {
+    log('XUNIT ' + s);
+  }
+};
+
 var passed = 0;
 var failed = 0;
@@ -31,6 +56,10 @@ var hrefPath = document.location.href.split("/");
 var platform = decodeURIComponent(hrefPath.length && hrefPath[hrefPath.length - 1]);
 if (!platform)
   platform = "local";

+// We enable xUnit output when platform is xunit
+var xunitEnabled = (platform == 'xunit');
+
 var doReport = Meteor &&
   Meteor.settings &&
   Meteor.settings.public &&
@@ -82,10 +111,13 @@ Meteor.startup(function () {
         status: "PENDING",
         events: [],
         server: !!results.server,
-        testPath: testPath
+        testPath: testPath,
+        test: results.test
       };
       report(name, false);
     }
+    // Loop through events, and record status for each test
+    // Also log result if test has finished
     _.each(results.events, function (event) {
       resultSet[name].events.push(event);
       switch (event.type) {
@@ -136,6 +168,7 @@ Meteor.startup(function () {
       });
     },

+    // After test completion, log a quick summary
     function () {
       if (failed > 0) {
         log("~~~~~~~ THERE ARE FAILURES ~~~~~~~");
@@ -153,6 +186,43 @@ Meteor.startup(function () {
         TEST_STATUS.DONE = DONE = true;
       }
     });
+
+    // Also log xUnit output
+    xunit('<testsuite errors="" failures="" name="meteor" skips="" tests="" time="">');
+    _.each(resultSet, function (result, name) {
+      var classname = result.testPath.join('.').replace(/ /g, '-') + (result.server ? "-server" : "-client");
+      var name = result.test.replace(/ /g, '-') + (result.server ? "-server" : "-client");
+      var time = "";
+      var error = "";
+      _.each(result.events, function (event) {
+        switch (event.type) {
+          case "finish":
+            var timeMs = event.timeMs;
+            if (timeMs !== undefined) {
+              time = (timeMs / 1000) + "";
+            }
+            break;
+          case "fail":
+            var details = event.details || {};
+            error = (details.message || '?') + " filename=" + (details.filename || '?') + " line=" + (details.line || '?');
+        }
+      });
+      switch (event.status) {
+        case "FAIL":
+          error = error || '?';
+          break;
+        case "EXPECTED":
+          error = "Expected failure";
+          break;
+      }
+
+      xunit('<testcase classname="' + escapeXml(classname) + '" name="' + escapeXml(name) + '" time="' + time + '">');
+      if (error) {
+        xunit(' <failure message="test failure">' + escapeXml(error) + '</failure>');
+      }
+      xunit('</testcase>');
+    });
+    xunit('</testsuite>');
   },
   ["tinytest"]);
 });
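The XML-escaping helper added above is self-contained enough to run standalone. This sketch mirrors the `XML_CHAR_MAP`/`escapeXml` pair from the hunk; the sample input is hypothetical.

```javascript
// Standalone copy of the XML-escaping helper from the driver.js hunk above.
var XML_CHAR_MAP = {
  '<': '&lt;',
  '>': '&gt;',
  '&': '&amp;',
  '"': '&quot;',
  "'": '&apos;'
};

var escapeXml = function (s) {
  return s.replace(/[<>&"']/g, function (c) {
    return XML_CHAR_MAP[c];
  });
};

console.log(escapeXml('fail: a < b && c "quoted"'));
// fail: a &lt; b &amp;&amp; c &quot;quoted&quot;
```

Note the order-independence of the table-driven approach: because each character is replaced in a single pass, an `&` produced by one substitution is never re-escaped.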
@@ -2,23 +2,40 @@
 // the server. Sets a 'server' flag on test results that came from the
 // server.
 //
-Tinytest._runTestsEverywhere = function (onReport, onComplete, pathPrefix) {
+// Options:
+//   serial     if true, will not run tests in parallel. Currently this means
+//              running the server tests before running the client tests.
+//              Default is currently true (serial operation), but we will likely
+//              change this to false in future.
+Tinytest._runTestsEverywhere = function (onReport, onComplete, pathPrefix, options) {
   var runId = Random.id();
   var localComplete = false;
+  var localStarted = false;
   var remoteComplete = false;
   var done = false;

+  options = _.extend({
+    serial: true
+  }, options);
+  var serial = !!options.serial;
+
   var maybeDone = function () {
     if (!done && localComplete && remoteComplete) {
       done = true;
       onComplete && onComplete();
     }
+    if (serial && remoteComplete && !localStarted) {
+      startLocalTests();
+    }
   };

-  Tinytest._runTests(onReport, function () {
-    localComplete = true;
-    maybeDone();
-  }, pathPrefix);
+  var startLocalTests = function() {
+    localStarted = true;
+    Tinytest._runTests(onReport, function () {
+      localComplete = true;
+      maybeDone();
+    }, pathPrefix);
+  };

   var handle;
@@ -59,4 +76,8 @@ Tinytest._runTestsEverywhere = function (onReport, onComplete, pathPrefix) {
     // XXX better report error
     throw new Error("Test server returned an error");
   });
+
+  if (!serial) {
+    startLocalTests();
+  }
 };
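The serial-vs-parallel gating introduced above can be reduced to a small state machine: when `serial` is true, the local (client) run only starts once the remote (server) run reports completion. This is a hedged sketch, not the Tinytest code; `runEverywhere`, `runRemote`, and `runLocal` are hypothetical stand-ins for the real calls.

```javascript
// Minimal sketch of the serial gating added to Tinytest._runTestsEverywhere.
function runEverywhere(runRemote, runLocal, onComplete, options) {
  options = Object.assign({ serial: true }, options);
  var serial = !!options.serial;
  var localComplete = false, localStarted = false;
  var remoteComplete = false, done = false;

  function maybeDone() {
    if (!done && localComplete && remoteComplete) {
      done = true;
      onComplete && onComplete();
    }
    // The key line: in serial mode, remote completion triggers the local run.
    if (serial && remoteComplete && !localStarted) startLocalTests();
  }

  function startLocalTests() {
    localStarted = true;
    runLocal(function () { localComplete = true; maybeDone(); });
  }

  runRemote(function () { remoteComplete = true; maybeDone(); });
  if (!serial) startLocalTests();
}

var order = [];
runEverywhere(
  function (cb) { order.push('remote'); cb(); },
  function (cb) { order.push('local'); cb(); },
  function () { order.push('done'); });
console.log(order.join(','));
// remote,local,done
```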
@@ -20,7 +20,7 @@ var LONG_SOCKET_TIMEOUT = 120*1000;
 WebApp = {};
 WebAppInternals = {};

-WebApp.defaultArch = 'client.browser';
+WebApp.defaultArch = 'web.browser';

 // XXX maps archs to manifests
 WebApp.clientPrograms = {};
@@ -475,11 +475,11 @@ var runWebAppServer = function () {
     staticFiles = {};
     var getClientManifest = function (clientPath, arch) {
       // read the control for the client we'll be serving up
-      var clientJsonPath = path.join(__meteor_bootstrap__.serverDir,
+      clientJsonPath = path.join(__meteor_bootstrap__.serverDir,
                                     clientPath);
-      var clientDir = path.dirname(clientJsonPath);
-      var clientJson = JSON.parse(readUtf8FileSync(clientJsonPath));
-      if (clientJson.format !== "client-program-pre1")
+      clientDir = path.dirname(clientJsonPath);
+      clientJson = JSON.parse(readUtf8FileSync(clientJsonPath));
+      if (clientJson.format !== "web-program-pre1")
         throw new Error("Unsupported format for client assets: " +
                         JSON.stringify(clientJson.format));
@@ -527,7 +527,7 @@ var runWebAppServer = function () {
       // Exported for tests.
       WebAppInternals.staticFiles = staticFiles;
     } catch (e) {
-      Log.error("Error reloading the client program: " + e.message);
+      Log.error("Error reloading the client program: " + e.stack);
       process.exit(1);
     }
   });
@@ -1,7 +1,5 @@
-=> Meteor 0.8.2: Switch `accounts-password` to use bcrypt on the
-   server. User accounts will seamlessly transition to bcrypt on the
-   next login, but this transition is one-way, so you cannot downgrade a
-   production app once you upgrade to 0.8.2.
+=> Meteor 0.8.3: Performance improvements and a big refactoring of the
+   Blaze internals.

 This release is being downloaded in the background. Update your
-project to Meteor 0.8.2 by running 'meteor update'.
+project to Meteor 0.8.3 by running 'meteor update'.
@@ -141,7 +141,7 @@ Couldn't write the launcher script. Please either:

  (1) Run the following as root:
        cp ~/.meteor/tools/latest/launch-meteor /usr/bin/meteor
- (2) Add ~/.meteor to your path, or
+ (2) Add "$HOME/.meteor" to your path, or
  (3) Rerun this command to try again.

Then to get started, take a look at 'meteor --help' or see the docs at
@@ -153,7 +153,7 @@ else

Now you need to do one of the following:

- (1) Add ~/.meteor to your path, or
+ (1) Add "$HOME/.meteor" to your path, or
  (2) Run this command as root:
        cp ~/.meteor/tools/latest/launch-meteor /usr/bin/meteor
6 scripts/admin/meteor-release-experimental.json Normal file
@@ -0,0 +1,6 @@
+{
+  "track": "METEOR-CORE",
+  "version": "0.9.0-rc0",
+  "recommended": false,
+  "description": "An experimental release of meteor."
+}

7 scripts/admin/meteor-release-official.json Normal file
@@ -0,0 +1,7 @@
+{
+  "track": "METEOR-CORE",
+  "version": "0.9.0",
+  "recommended": false,
+  "official": true,
+  "description": "Meteor release, supported by MDG."
+}
@@ -149,6 +149,9 @@
       ]
     }
   },
+  {
+    "release": "0.8.3"
+  },
   {
     "release": "NEXT"
   }
@@ -59,7 +59,15 @@ var configureS3 = function () {
   return {accessKey: accessKey, secretKey: secretKey};
 };

-var s3Credentials = getS3Credentials();
+var s3Credentials;
+if (process.env.AWS_ACCESS_KEY_ID) {
+  s3Credentials = {};
+  s3Credentials.accessKey = process.env.AWS_ACCESS_KEY_ID;
+  s3Credentials.secretKey = process.env.AWS_SECRET_ACCESS_KEY;
+} else {
+  s3Credentials = getS3Credentials();
+}

 var s3 = new S3({
   accessKeyId: s3Credentials.accessKey,
   secretAccessKey: s3Credentials.secretKey,
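The credential-resolution order added above (standard `AWS_*` environment variables first, interactive lookup as fallback) can be sketched as a pure function. This is an illustration, not the admin script itself; `resolveS3Credentials` and `fallbackPrompt` are hypothetical names, and the key values are fake.

```javascript
// Sketch of env-first credential resolution, mirroring the hunk above.
function resolveS3Credentials(env, fallbackPrompt) {
  if (env.AWS_ACCESS_KEY_ID) {
    return {
      accessKey: env.AWS_ACCESS_KEY_ID,
      secretKey: env.AWS_SECRET_ACCESS_KEY
    };
  }
  // No env vars set: fall back to the interactive/keychain path.
  return fallbackPrompt();
}

var creds = resolveS3Credentials(
  { AWS_ACCESS_KEY_ID: 'AKIAEXAMPLE', AWS_SECRET_ACCESS_KEY: 's3cr3t' },
  function () { throw new Error('should not prompt when env vars are set'); });
console.log(creds.secretKey);
// s3cr3t
```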
@@ -110,7 +110,7 @@ var sessionMethodCaller = function (methodName, options) {
     var conn = options.connection || openAccountsConnection();
     conn.apply(methodName, args, fiberHelpers.firstTimeResolver(fut));
     if (options.timeout !== undefined) {
-      var timer = setTimeout(fiberHelpers.inFiber(function () {
+      var timer = setTimeout(fiberHelpers.bindEnvironment(function () {
         if (!fut.isResolved())
           fut.throw(new Error('Method call timed out'));
       }), options.timeout);
@@ -541,7 +541,7 @@ var logInToGalaxy = function (galaxyName) {
   var authorizeResult;

   try {
-    sendAuthorizeRequest(
+    authorizeResult = sendAuthorizeRequest(
       galaxyClientId,
       galaxyRedirect,
       encodeURIComponent(JSON.stringify(stateInfo))
@@ -821,7 +821,7 @@ exports.pollForRegistrationCompletion = function (options) {
     fut['return'](result);
   });

-  var timer = setTimeout(fiberHelpers.inFiber(function () {
+  var timer = setTimeout(fiberHelpers.bindEnvironment(function () {
     if (! fut.isResolved()) {
       fut['return'](null);
     }
@@ -1,6 +1,7 @@
 var _ = require('underscore');
 var files = require('./files.js');
 var parseStack = require('./parse-stack.js');
+var fiberHelpers = require('./fiber-helpers.js');

 var debugBuild = !!process.env.METEOR_DEBUG_BUILD;

@@ -153,8 +154,9 @@ _.extend(MessageSet.prototype, {
   }
 });

-var currentMessageSet = null;
-var currentJob = null;
+var currentMessageSet = new fiberHelpers.EnvironmentVariable;
+var currentJob = new fiberHelpers.EnvironmentVariable;
+var currentNestingLevel = new fiberHelpers.EnvironmentVariable(0);

 // Create a new MessageSet, run `f` with that as the current
 // MessageSet for the purpose of accumulating and recovering from
@@ -165,37 +167,29 @@ var currentJob = null;
 // begin capturing errors. Alternately you may pass `options`
 // (otherwise optional) and a job will be created for you based on
 // `options`.
 //
-// ** Not compatible with multifiber environments **
-// Using a single fiber to block on i/o is fine however.
 var capture = function (options, f) {
-  var originalMessageSet = currentMessageSet;
   var messageSet = new MessageSet;
-  currentMessageSet = messageSet;
-
-  var originalJob = currentJob;
-  var job = null;
-  if (typeof options === "object") {
-    job = new Job(options);
-    currentMessageSet.jobs.push(job);
-  } else {
-    f = options; // options not actually provided
-  }
-
-  currentJob = job;
-
-  if (debugBuild)
-    console.log("START CAPTURE: " + options.title);
-
-  try {
-    f();
-  } finally {
-    currentMessageSet = originalMessageSet;
-    currentJob = originalJob;
-    if (debugBuild)
-      console.log("DONE CAPTURE: " + options.title);
-  }
+  currentMessageSet.withValue(messageSet, function () {
+    var job = null;
+    if (typeof options === "object") {
+      job = new Job(options);
+      messageSet.jobs.push(job);
+    } else {
+      f = options; // options not actually provided
+    }
+
+    currentJob.withValue(job, function () {
+      var nestingLevel = currentNestingLevel.get();
+      currentNestingLevel.withValue(nestingLevel + 1, function () {
+        debugBuild && console.log("START CAPTURE", nestingLevel, options.title);
+        try {
+          f();
+        } finally {
+          debugBuild && console.log("END CAPTURE", nestingLevel, options.title);
+        }
+      });
+    });
+  });
   return messageSet;
 };

@@ -217,29 +211,26 @@ var enterJob = function (options, f) {
     options = {};
   }

-  if (! currentMessageSet) {
+  if (! currentMessageSet.get()) {
     return f();
   }

   var job = new Job(options);
-  var originalJob = currentJob;
-  if (originalJob)
-    originalJob.children.push(job);
-  currentMessageSet.jobs.push(job);
-  currentJob = job;
+  var originalJob = currentJob.get();
+  originalJob && originalJob.children.push(job);
+  currentMessageSet.get().jobs.push(job);

-  if (debugBuild)
-    console.log("START: " + options.title);
-
-  try {
-    var ret = f();
-  } finally {
-    if (debugBuild)
-      console.log("DONE: " + options.title);
-    currentJob = originalJob;
-  }
-
-  return ret;
+  return currentJob.withValue(job, function () {
+    var nestingLevel = currentNestingLevel.get();
+    return currentNestingLevel.withValue(nestingLevel + 1, function () {
+      debugBuild && console.log("START", nestingLevel, options.title);
+      try {
+        return f();
+      } finally {
+        debugBuild && console.log("DONE", nestingLevel, options.title);
+      }
+    });
+  });
 };

@@ -252,7 +243,7 @@ var jobHasMessages = function () {
     return !! _.find(job.children, search);
   };

-  return currentJob ? search(currentJob) : false;
+  return currentJob.get() ? search(currentJob.get()) : false;
 };

 // Given a function f, return a "marked" version of f. The mark
@@ -294,7 +285,7 @@ var error = function (message, options) {
   if (options.downcase)
     message = message.slice(0,1).toLowerCase() + message.slice(1);

-  if (! currentJob)
+  if (! currentJob.get())
     throw new Error("Error: " + message);

   if (options.secondary && jobHasMessages())
@@ -318,7 +309,7 @@ var error = function (message, options) {
     delete info.useMyCaller;
   }

-  currentJob.addMessage(info);
+  currentJob.get().addMessage(info);
 };

 // Record an exception. The message as well as any file and line
@@ -330,7 +321,7 @@ var error = function (message, options) {
 // actually occurred, rather than the place where the exception was
 // thrown.
 var exception = function (error) {
-  if (! currentJob) {
+  if (! currentJob.get()) {
     // XXX this may be the wrong place to do this, but it makes syntax errors in
     // files loaded via unipackage.load have context.
     if (error instanceof files.FancySyntaxError) {
@@ -345,7 +336,7 @@ var exception = function (error) {
   if (error instanceof files.FancySyntaxError) {
     // No stack, because FancySyntaxError isn't a real Error and has no stack
     // property!
-    currentJob.addMessage({
+    currentJob.get().addMessage({
       message: message,
       file: error.file,
       line: error.line,
@@ -354,7 +345,7 @@ var exception = function (error) {
   } else {
     var stack = parseStack.parse(error);
     var locus = stack[0];
-    currentJob.addMessage({
+    currentJob.get().addMessage({
       message: message,
       stack: stack,
       func: locus.func,
@@ -365,6 +356,16 @@ var exception = function (error) {
   }
 };

+var assertInJob = function () {
+  if (! currentJob.get())
+    throw new Error("Expected to be in a buildmessage job");
+};
+
+var assertInCapture = function () {
+  if (! currentMessageSet.get())
+    throw new Error("Expected to be in a buildmessage capture");
+};
+
 var buildmessage = exports;
 _.extend(exports, {
   capture: capture,
@@ -372,5 +373,7 @@ _.extend(exports, {
   markBoundary: markBoundary,
   error: error,
   exception: exception,
-  jobHasMessages: jobHasMessages
+  jobHasMessages: jobHasMessages,
+  assertInJob: assertInJob,
+  assertInCapture: assertInCapture
 });
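The buildmessage refactor above replaces mutable module-level globals with `EnvironmentVariable` instances whose `withValue` sets a value for the dynamic extent of a callback and restores the previous value even on exceptions. This is a simplified, fiber-free sketch of that pattern; the real `fiberHelpers.EnvironmentVariable` tracks a value per fiber, which is what makes the refactored code safe in multifiber builds.

```javascript
// Simplified sketch of the EnvironmentVariable pattern (not the fiber-aware
// implementation in tools/fiber-helpers.js).
function EnvironmentVariable(defaultValue) {
  this.value = defaultValue;
}
EnvironmentVariable.prototype.get = function () {
  return this.value;
};
EnvironmentVariable.prototype.withValue = function (value, func) {
  var saved = this.value;
  this.value = value;
  try {
    return func();          // value is visible to everything func calls
  } finally {
    this.value = saved;     // restored even if func throws
  }
};

var currentNestingLevel = new EnvironmentVariable(0);
var seen = [];
currentNestingLevel.withValue(1, function () {
  seen.push(currentNestingLevel.get());
  currentNestingLevel.withValue(2, function () {
    seen.push(currentNestingLevel.get());
  });
  seen.push(currentNestingLevel.get()); // inner value restored to 1
});
seen.push(currentNestingLevel.get());   // back to the default 0
console.log(seen.join(','));
// 1,2,1,0
```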
137 tools/bundler.js
@@ -51,13 +51,13 @@
 // really the build tool can lay out the star however it wants.
 //
 //
-// == Format of a program when arch is "client.*" ==
+// == Format of a program when arch is "web.*" ==
 //
 // Standard:
 //
 // /program.json
 //
-//  - format: "client-program-pre1" for this version
+//  - format: "web-program-pre1" for this version
 //
 //  - manifest: array of resources to serve with HTTP, each an object:
 //    - path: path of file relative to program.json
@@ -388,7 +388,7 @@ var Target = function (options) {
   // PackageLoader to use for resolving package dependenices.
   self.packageLoader = options.packageLoader;

-  // Something like "client.w3c" or "os" or "os.osx.x86_64"
+  // Something like "web.browser" or "os" or "os.osx.x86_64"
   self.arch = options.arch;

   // All of the Unibuilds that are to go into this target, in the order
@@ -434,6 +434,7 @@ _.extend(Target.prototype, {
   // on server targets.
   make: function (options) {
     var self = this;
+    buildmessage.assertInCapture();

     // Populate the list of unibuilds to load
     self._determineLoadOrder({
@@ -481,6 +482,8 @@ _.extend(Target.prototype, {
   // package name as a string
   _determineLoadOrder: function (options) {
     var self = this;
+    buildmessage.assertInCapture();
+
     var packageLoader = self.packageLoader;

     // Find the roots
@@ -579,7 +582,7 @@ _.extend(Target.prototype, {
   _emitResources: function () {
     var self = this;

-    var isClient = archinfo.matches(self.arch, "client");
+    var isWeb = archinfo.matches(self.arch, "web");
     var isOs = archinfo.matches(self.arch, "os");

     // Copy their resources into the bundle in order
@@ -617,7 +620,7 @@ _.extend(Target.prototype, {
             : stripLeadingSlash(resource.servePath);
           f.setTargetPathFromRelPath(relPath);

-          if (isClient)
+          if (isWeb)
             f.setUrlFromRelPath(resource.servePath);
           else {
             unibuildAssets[resource.path] = resource.data;
@@ -632,7 +635,7 @@ _.extend(Target.prototype, {
           return; // already handled

         if (_.contains(["js", "css"], resource.type)) {
-          if (resource.type === "css" && ! isClient)
+          if (resource.type === "css" && ! isWeb)
             // XXX might be nice to throw an error here, but then we'd
             // have to make it so that package.js ignores css files
             // that appear in the server directories in an app tree
@@ -655,7 +658,7 @@ _.extend(Target.prototype, {

           f.setTargetPathFromRelPath(relPath);

-          if (isClient) {
+          if (isWeb) {
             f.setUrlFromRelPath(resource.servePath);
           }

@@ -692,7 +695,7 @@ _.extend(Target.prototype, {
         }

         if (_.contains(["head", "body"], resource.type)) {
-          if (! isClient)
+          if (! isWeb)
             throw new Error("HTML segments can only go to the client");
           self[resource.type].push(resource.data);
           return;
@@ -779,7 +782,7 @@ var ClientTarget = function (options) {
   self.head = [];
   self.body = [];

-  if (! archinfo.matches(self.arch, "client"))
+  if (! archinfo.matches(self.arch, "web"))
     throw new Error("ClientTarget targeting something that isn't a client?");
 };

@@ -972,7 +975,7 @@ _.extend(ClientTarget.prototype, {

   // Control file
   builder.writeJson('program.json', {
-    format: "client-program-pre1",
+    format: "web-program-pre1",
     manifest: manifest
   });
   return "program.json";
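The `client` to `web` rename above leans on prefix-style arch matching: an arch like `web.browser` satisfies checks against the `web` family. The sketch below shows the dotted-boundary matching idea only; it is a hedged approximation, and the real `archinfo.matches` in tools/archinfo.js has richer semantics than this.

```javascript
// Conceptual sketch of arch-family matching at dotted boundaries.
function archMatches(arch, prefix) {
  // "web.browser" matches "web"; "webapp" must not.
  return arch === prefix || arch.indexOf(prefix + '.') === 0;
}

console.log(archMatches('web.browser', 'web'));   // true
console.log(archMatches('os.osx.x86_64', 'web')); // false
console.log(archMatches('webapp', 'web'));        // false
```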
@@ -1541,55 +1544,6 @@ var writeSiteArchive = function (targets, outputPath, options) {
     cordovaDependencies: {}
   };

-  // Pick a path in the bundle for each target
-  var paths = {};
-  _.each(targets, function (target, name) {
-    var p = path.join('programs', name);
-    builder.reserve(p, { directory: true });
-    paths[name] = p;
-  });
-
-  // Hack to let servers find relative paths to clients. Should find
-  // another solution eventually (probably some kind of mount
-  // directive that mounts the client bundle in the server at runtime)
-  var getRelativeTargetPath = function (options) {
-    var pathForTarget = function (target) {
-      var name;
-      _.each(targets, function (t, n) {
-        if (t === target)
-          name = n;
-      });
-      if (! name)
-        throw new Error("missing target?");
-
-      if (! (name in paths))
-        throw new Error("missing target path?");
-
-      return paths[name];
-    };
-
-    return path.relative(pathForTarget(options.relativeTo),
-                         pathForTarget(options.forTarget));
-  };
-
-  // Write out each target
-  _.each(targets, function (target, name) {
-    var relControlFilePath =
-      target.write(builder.enter(paths[name]), {
-        includeNodeModulesSymlink: options.includeNodeModulesSymlink,
-        getRelativeTargetPath: getRelativeTargetPath });
-
-    json.programs.push({
-      name: name,
-      arch: target.mostCompatibleArch(),
-      path: path.join(paths[name], relControlFilePath)
-    });
-
-    _.each(target.cordovaDependencies, function (version, name) {
-      json.cordovaDependencies[name] = version;
-    });
-  });
-
   // Tell Galaxy what version of the dependency kit we're using, so
   // it can load the right modules. (Include this even if we copied
   // or symlinked a node_modules, since that's probably enough for
@@ -1624,9 +1578,6 @@ var writeSiteArchive = function (targets, outputPath, options) {
       'utf8')});
   }

-  // Control file
-  builder.writeJson('star.json', json);
-
   // Merge the WatchSet of everything that went into the bundle.
   var clientWatchSet = new watch.WatchSet();
   var serverWatchSet = new watch.WatchSet();
@@ -1640,9 +1591,18 @@ var writeSiteArchive = function (targets, outputPath, options) {
   });

   _.each(targets, function (target, name) {
-    json.programs.push(writeTargetToPath(name, target, builder.buildPath, options));
+    json.programs.push(writeTargetToPath(name, target, builder.buildPath, {
+      includeNodeModulesSymlink: options.includeNodeModulesSymlink,
+      builtBy: options.builtBy,
+      controlProgram: options.controlProgram,
+      releaseName: options.releaseName,
+      getRelativeTargetPath: options.getRelativeTargetPath
+    }));
   });

+  // Control file
+  builder.writeJson('star.json', json);
+
   // We did it!
   builder.complete();

@@ -1680,9 +1640,10 @@ var writeSiteArchive = function (targets, outputPath, options) {
  *
  * - buildOptions: may include
  *   - minify: minify the CSS and JS assets (boolean, default false)
- *   - arch: the server architecture to target (defaults to archinfo.host())
+ *   - serverArch: the server architecture to target
+ *     (defaults to archinfo.host())
- *   - clientArchs: an array of client archs to target
+ *   - webArchs: an array of web archs to target
  *
  * - hasCachedBundle: true if we already have a cached bundle stored in
  *   /build. When true, we only build the new client targets in the bundle.
@@ -1711,23 +1672,12 @@ exports.bundle = function (options) {
   var includeNodeModulesSymlink = !!options.includeNodeModulesSymlink;
   var buildOptions = options.buildOptions || {};

   var appDir = project.project.rootDir;
   if (! release.usingRightReleaseForApp(appDir))
     throw new Error("running wrong release for app?");

   var serverArch = buildOptions.serverArch || archinfo.host();
-  var clientArchs = buildOptions.clientArchs || [ "client.browser", "client.cordova" ];
-
-  var appDir = project.project.rootDir;
-  var packageLoader = project.project.getPackageLoader();
-  var downloaded = project.project._ensurePackagesExistOnDisk(
-    project.project.dependencies, { arch: serverArch, verbose: true });
-
-  if (_.keys(downloaded).length !==
-      _.keys(project.project.dependencies).length) {
-    buildmessage.error("Unable to download package builds for this architecture.");
-    // Recover by returning.
-    return;
-  }
+  var webArchs = buildOptions.webArchs || [ "web.browser" ];

   var releaseName =
     release.current.isCheckout() ? "none" : release.current.name;
@@ -1743,12 +1693,23 @@ exports.bundle = function (options) {
   var messages = buildmessage.capture({
     title: "building the application"
   }, function () {
+    var packageLoader = project.project.getPackageLoader();
+    var downloaded = project.project._ensurePackagesExistOnDisk(
+      project.project.dependencies, { serverArch: serverArch, verbose: true });
+
+    if (_.keys(downloaded).length !==
+        _.keys(project.project.dependencies).length) {
+      buildmessage.error("Unable to download package builds for this architecture.");
+      // Recover by returning.
+      return;
+    }
+
     var controlProgram = null;

-    var makeClientTarget = function (app, clientArch) {
+    var makeClientTarget = function (app, webArch) {
       var client = new ClientTarget({
         packageLoader: packageLoader,
-        arch: clientArch
+        arch: webArch
       });

       client.make({
@@ -1763,7 +1724,7 @@ exports.bundle = function (options) {
     var makeBlankClientTarget = function () {
       var client = new ClientTarget({
         packageLoader: packageLoader,
-        arch: "client.browser"
+        arch: "web.browser"
       });
       client.make({
         minify: buildOptions.minify,
@@ -1807,16 +1768,15 @@ exports.bundle = function (options) {

       var clientTargets = [];
       // Client
-      _.each(clientArchs, function (arch) {
+      _.each(webArchs, function (arch) {
         var client = makeClientTarget(app, arch);
-        targets[arch] = client;
         clientTargets.push(client);
+        targets[arch] = client;
       });

       // Server
       if (! options.hasCachedBundle) {
         var server = makeServerTarget(app, clientTargets);
-        server.clientTargets = clientTargets;
         targets.server = server;
       }
     }
@@ -1918,8 +1878,7 @@ exports.bundle = function (options) {
       _.each(programs, function (p) {
         // Read this directory as a package and create a target from
         // it
-        var pkg = packageCache.packageCache.
-          loadPackageAtPath(p.name, p.path);
+        var pkg = packageCache.packageCache.loadPackageAtPath(p.name, p.path);
         var target;
         switch (p.type) {
         case "server":
@@ -2000,10 +1959,10 @@ exports.bundle = function (options) {
   };

   if (options.hasCachedBundle) {
-    // XXX If we already have a cached bundle, just recreate the new targets.
-    // This might make the contents of "star.json" out of date.
+    // If we already have a cached bundle, just recreate the new targets.
+    // XXX This might make the contents of "star.json" out of date.
     _.each(targets, function (target, name) {
-      writeTargetToPath(name, target, outputPath, options);
+      writeTargetToPath(name, target, outputPath, writeOptions);
       clientWatchSet.merge(target.getWatchSet());
     });
   } else {
@@ -2063,6 +2022,7 @@ exports.bundle = function (options) {
 // without also saying "make its namespace the same as the global
 // namespace." It should be an easy refactor,
 exports.buildJsImage = function (options) {
+  buildmessage.assertInCapture();
   if (options.npmDependencies && ! options.npmDir)
     throw new Error("Must indicate .npm directory to use");
   if (! options.name)
@@ -2114,6 +2074,7 @@ exports.readJsImage = function (controlFilePath) {
 // with a topological sort.
 exports.iterateOverAllUsedUnipackages = function (loader, arch,
                                                   packageNames, callback) {
+  buildmessage.assertInCapture();
   var target = new Target({packageLoader: loader,
                            arch: arch});
   target._determineLoadOrder({packages: packageNames});
@@ -339,23 +339,6 @@ _.extend(baseCatalog.BaseCatalog.prototype, {
|
||||
return self._recordOrRefresh(getDef);
|
||||
},
|
||||
|
||||
// Given a name and a version of a package, return a path on disk
|
||||
// from which we can load it. If we don't have it on disk (we
|
||||
// haven't downloaded it, or it just plain doesn't exist in the
|
||||
// catalog) return null.
|
||||
//
|
||||
// Doesn't download packages. Downloading should be done at the time
|
||||
// that .meteor/versions is updated.
|
||||
getLoadPathForPackage: function (name, version) {
|
||||
var self = this;
|
||||
|
||||
var packageDir = tropohouse.default.packagePath(name, version);
|
||||
if (fs.existsSync(packageDir)) {
|
||||
return packageDir;
|
||||
}
|
||||
return null;
|
||||
},
|
||||
|
||||
// Reload catalog data to account for new information if needed.
|
||||
refresh: function () {
|
||||
throw new Error("no such thing as a base refresh");
|
||||
|
||||
@@ -22,13 +22,13 @@ var Future = require('fibers/future');
|
||||
var catalog = exports;
|
||||
|
||||
/////////////////////////////////////////////////////////////////////////////////////
|
||||
// Server Catalog
|
||||
// Official Catalog
|
||||
/////////////////////////////////////////////////////////////////////////////////////
|
||||
|
||||
// The serverlog syncs up with the server. It doesn't care about local
|
||||
// packages. When the user wants information about the state of the package
|
||||
// world (ex: search), we should use this catalog first.
|
||||
var ServerCatalog = function () {
|
||||
// The official catalog syncs up with the package server. It doesn't care about
|
||||
// local packages. When the user wants information about the state of the
|
||||
// package world (ex: search), we should use this catalog first.
|
||||
var OfficialCatalog = function () {
|
||||
var self = this;
|
||||
|
||||
// Set this to true if we are not going to connect to the remote package
|
||||
@@ -41,9 +41,9 @@ var ServerCatalog = function () {
|
||||
BaseCatalog.call(self);
|
||||
};
|
||||
|
||||
util.inherits(ServerCatalog, BaseCatalog);
|
||||
util.inherits(OfficialCatalog, BaseCatalog);
|
||||
|
||||
_.extend(ServerCatalog.prototype, {
|
||||
_.extend(OfficialCatalog.prototype, {
|
||||
initialize : function (options) {
|
||||
var self = this;
|
||||
options = options || {};
|
||||
@@ -245,6 +245,9 @@ _.extend(CompleteCatalog.prototype, {
|
||||
var self = this;
|
||||
opts = opts || {};
|
||||
self._requireInitialized();
|
||||
// XXX for now, just doing the assertion if we have to call project
|
||||
// stuff. but oh, this will be improved.
|
||||
opts.ignoreProjectDeps || buildmessage.assertInCapture();
|
||||
|
||||
// Kind of a hack, as per specification. We don't have a constraint solver
|
||||
// initialized yet. We are probably trying to build the constraint solver
|
||||
@@ -556,6 +559,8 @@ _.extend(CompleteCatalog.prototype, {
|
||||
// compiled and built in the directory, and null otherwise.
|
||||
_maybeGetUpToDateBuild : function (name, constraintSolverOpts) {
|
||||
var self = this;
|
||||
buildmessage.assertInCapture();
|
||||
|
||||
var sourcePath = self.effectiveLocalPackages[name];
|
||||
var buildDir = path.join(sourcePath, '.build.' + name);
|
||||
if (fs.existsSync(buildDir)) {
|
||||
@@ -595,6 +600,8 @@ _.extend(CompleteCatalog.prototype, {
|
||||
// that we use is in the catalog already, we build it here.
|
||||
_build : function (name, onStack, constraintSolverOpts) {
|
||||
var self = this;
|
||||
buildmessage.assertInCapture();
|
||||
|
||||
var unip = null;
|
||||
|
||||
if (! _.has(self.unbuilt, name)) {
|
||||
@@ -781,6 +788,7 @@ _.extend(CompleteCatalog.prototype, {
|
||||
rebuildLocalPackages: function (namedPackages) {
|
||||
var self = this;
|
||||
self._requireInitialized();
|
||||
buildmessage.assertInCapture();
|
||||
|
||||
// Clear any cached builds in the package cache.
|
||||
packageCache.packageCache.refresh();
|
||||
@@ -846,6 +854,7 @@ _.extend(CompleteCatalog.prototype, {
|
||||
getLoadPathForPackage: function (name, version, constraintSolverOpts) {
|
||||
var self = this;
|
||||
self._requireInitialized();
|
||||
buildmessage.assertInCapture();
|
||||
constraintSolverOpts = constraintSolverOpts || {};
|
||||
|
||||
// Check local packages first.
|
||||
@@ -876,7 +885,7 @@ _.extend(CompleteCatalog.prototype, {
|
||||
// This is the catalog that's used to answer the specific question of "so what's
|
||||
// on the server?". It does not contain any local catalogs. Typically, we call
|
||||
// catalog.official.refresh() to update data.json.
|
||||
catalog.official = new ServerCatalog();
|
||||
catalog.official = new OfficialCatalog();
|
||||
|
||||
// This is the catalog that's used to actually drive the constraint solver: it
|
||||
// contains local packages, and since local packages always beat server
|
||||
|
||||
@@ -38,23 +38,7 @@ var getReleaseOrPackageRecord = function(name) {
|
||||
if (rec)
|
||||
rel = true;
|
||||
}
|
||||
return { record: rec, release: rel };
|
||||
};
|
||||
|
||||
|
||||
// Checks to see if you are an authorized maintainer for a given
|
||||
// release/package. If you are not, calls process.exit
|
||||
// explaining that you can't take that action.
|
||||
// record: package or track record
|
||||
// action: string for error handling
|
||||
var checkAuthorizedPackageMaintainer = function (record, action) {
|
||||
var authorized = _.indexOf(
|
||||
_.pluck(record.maintainers, 'username'), auth.loggedInUsername());
|
||||
if (authorized == -1) {
|
||||
process.stderr.write('You are not an authorized maintainer of ' + record.name + ".\n");
|
||||
process.stderr.write('Only authorized maintainers may ' + action + ".\n");
|
||||
return 1;;
|
||||
}
|
||||
return { record: rec, isRelease: rel };
|
||||
};
|
||||
|
||||
// Returns a pretty list suitable for showing to the user. Input is an
|
||||
@@ -113,7 +97,17 @@ main.registerCommand({
|
||||
// Refresh the catalog, caching the remote package data on the server. We can
|
||||
// optimize the workflow by using this data to weed out obviously incorrect
|
||||
// submissions before they ever hit the wire.
|
||||
catalog.official.refresh(true);
|
||||
catalog.official.refresh();
|
||||
var packageName = path.basename(options.packageDir);
|
||||
|
||||
// Fail early if the package already exists.
|
||||
if (options.create) {
|
||||
if (catalog.official.getPackage(packageName)) {
|
||||
process.stderr.write("Package already exists. To create a new version of an existing "+
|
||||
"package, do not use the --create flag! \n");
|
||||
return 2;
|
||||
}
|
||||
};
|
||||
|
||||
try {
|
||||
var conn = packageClient.loggedInPackagesConnection();
|
||||
@@ -134,7 +128,6 @@ main.registerCommand({
|
||||
var messages = buildmessage.capture(
|
||||
{ title: "building the package" },
|
||||
function () {
|
||||
var packageName = path.basename(options.packageDir);
|
||||
|
||||
if (! utils.validPackageName(packageName)) {
|
||||
buildmessage.error("Invalid package name:", packageName);
|
||||
@@ -148,32 +141,44 @@ main.registerCommand({
|
||||
if (buildmessage.jobHasMessages())
|
||||
return; // already have errors, so skip the build
|
||||
|
||||
var directDeps =
|
||||
compiler.determineBuildTimeDependencies(packageSource).directDependencies;
|
||||
project._ensurePackagesExistOnDisk(directDeps);
|
||||
var deps =
|
||||
compiler.determineBuildTimeDependencies(packageSource).packageDependencies;
|
||||
project._ensurePackagesExistOnDisk(deps);
|
||||
|
||||
compileResult = compiler.compile(packageSource, { officialBuild: true });
|
||||
});
|
||||
|
||||
if (messages.hasMessages()) {
|
||||
process.stdout.write(messages.formatMessages());
|
||||
process.stderr.write(messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
|
||||
// We have initialized everything, so perform the publish oepration.
|
||||
var ec = packageClient.publishPackage(
|
||||
packageSource, compileResult, conn, {
|
||||
new: options.create,
|
||||
existingVersion: options['existing-version']
|
||||
});
|
||||
var ec; // XXX maybe combine with messages?
|
||||
messages = buildmessage.capture({
|
||||
title: "publishing the package"
|
||||
}, function () {
|
||||
ec = packageClient.publishPackage(
|
||||
packageSource, compileResult, conn, {
|
||||
new: options.create,
|
||||
existingVersion: options['existing-version']
|
||||
});
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write(messages.formatMessages());
|
||||
return ec || 1;
|
||||
}
|
||||
|
||||
// Warn the user if their package is not good for all architectures.
|
||||
if (compileResult.unipackage.buildArchitectures() !== "browser+os") {
|
||||
var allArchs = compileResult.unipackage.buildArchitectures().split('+');
|
||||
if (_.any(allArchs, function (arch) {
|
||||
return arch.match(/^os\./);
|
||||
})) {
|
||||
process.stdout.write(
|
||||
"\nWARNING: Your package contains binary code and is only compatible with " +
|
||||
archinfo.host() + " architecture.\n" +
|
||||
"Please use publish-for-arch to publish new builds of the package.\n\n");
|
||||
};
|
||||
}
|
||||
|
||||
// We are only publishing one package, so we should close the connection, and
|
||||
// then exit with the previous error code.
|
||||
@@ -246,21 +251,43 @@ main.registerCommand({
|
||||
return 1;
|
||||
}
|
||||
|
||||
var packageSource = new PackageSource;
|
||||
var unipkg;
|
||||
var messages = buildmessage.capture({
|
||||
title: "building package " + name
|
||||
}, function () {
|
||||
var packageSource = new PackageSource;
|
||||
|
||||
// This package source, although it is initialized from a directory is
|
||||
// immutable. It should be built exactly as is. If we need to modify anything,
|
||||
// such as the version lock file, something has gone terribly wrong and we
|
||||
// should throw.
|
||||
packageSource.initFromPackageDir(name, packageDir, {
|
||||
requireVersion: true,
|
||||
immutable: true
|
||||
// This package source, although it is initialized from a directory is
|
||||
// immutable. It should be built exactly as is. If we need to modify
|
||||
// anything, such as the version lock file, something has gone terribly
|
||||
// wrong and we should throw.
|
||||
packageSource.initFromPackageDir(name, packageDir, {
|
||||
requireVersion: true,
|
||||
immutable: true
|
||||
});
|
||||
if (buildmessage.jobHasMessages())
|
||||
return;
|
||||
|
||||
|
||||
// Now compile it! Once again, everything should compile, and if
|
||||
// it doesn't we should fail. Hopefully, of course, we have
|
||||
// tested our stuff before deciding to publish it to the package
|
||||
// server, but we need to be careful.
|
||||
var deps =
|
||||
compiler.determineBuildTimeDependencies(packageSource).packageDependencies;
|
||||
project._ensurePackagesExistOnDisk(deps);
|
||||
|
||||
unipkg = compiler.compile(packageSource, {
|
||||
officialBuild: true
|
||||
}).unipackage;
|
||||
if (buildmessage.jobHasMessages())
|
||||
return;
|
||||
});
|
||||
|
||||
var unipkg = compiler.compile(packageSource, {
|
||||
officialBuild: true
|
||||
}).unipackage;
|
||||
unipkg.saveToPath(path.join(packageDir, '.build.' + packageSource.name));
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write("\n" + messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
|
||||
var conn;
|
||||
try {
|
||||
@@ -269,10 +296,19 @@ main.registerCommand({
|
||||
packageClient.handlePackageServerConnectionError(err);
|
||||
return 1;
|
||||
}
|
||||
packageClient.createAndPublishBuiltPackage(conn, unipkg);
|
||||
|
||||
messages = buildmessage.capture({
|
||||
title: "publishing package " + name
|
||||
}, function () {
|
||||
packageClient.createAndPublishBuiltPackage(conn, unipkg);
|
||||
});
|
||||
|
||||
catalog.official.refresh();
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write("\n" + messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
|
||||
catalog.official.refresh(); // XXX buildmessage.capture?
|
||||
return 0;
|
||||
});
|
||||
|
||||
@@ -386,15 +422,17 @@ main.registerCommand({
|
||||
'. If you are creating a new track, use the --create-track flag.\n');
|
||||
return 1;
|
||||
}
|
||||
var auth = require("./auth.js");
|
||||
var authorized = _.indexOf(
|
||||
_.pluck(trackRecord.maintainers, 'username'), auth.loggedInUsername());
|
||||
if (authorized == -1) {
|
||||
|
||||
// We are going to call the server to check if we are authorized, so that when
|
||||
// we implement things like organizations, we are not handicapped by the
|
||||
// user's meteor version.
|
||||
if (!packageClient.amIAuthorized(relConf.track,conn, true)) {
|
||||
process.stderr.write('\n You are not an authorized maintainer of ' + relConf.track + ".\n");
|
||||
process.stderr.write('Only authorized maintainers may publish new versions.\n');
|
||||
return 1;
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
process.stdout.write(". OK!\n");
|
||||
|
||||
// This is sort of a hidden option to just take your entire meteor checkout
|
||||
@@ -446,7 +484,7 @@ main.registerCommand({
|
||||
// directory, though we really shouldn't be, or, if we ever restructure the
|
||||
// way that we store packages in the meteor directory, we should be sure to
|
||||
// reevaluate what this command actually does.
|
||||
var localPackageDir = path.join(files.getCurrentToolsDir(),"packages");
|
||||
var localPackageDir = path.join(files.getCurrentToolsDir(), "packages");
|
||||
var contents = fs.readdirSync(localPackageDir);
|
||||
var myPackages = {};
|
||||
var toPublish = {};
|
||||
@@ -600,7 +638,7 @@ main.registerCommand({
|
||||
});
|
||||
|
||||
if (messages.hasMessages()) {
|
||||
process.stdout.write("\n" + messages.formatMessages());
|
||||
process.stderr.write("\n" + messages.formatMessages());
|
||||
return 1;
|
||||
};
|
||||
|
||||
@@ -616,20 +654,29 @@ main.registerCommand({
|
||||
};
|
||||
process.stdout.write("Publishing package: " + name + "\n");
|
||||
|
||||
// If we are creating a new package, dsPS will document this for us, so we
|
||||
// don't need to do this here. Though, in the future, once we are done
|
||||
// bootstrapping package servers, we should consider having some extra
|
||||
// checks around this.
|
||||
var pub = packageClient.publishPackage(
|
||||
prebuilt.source,
|
||||
prebuilt.compileResult,
|
||||
conn,
|
||||
opts);
|
||||
var pubEC; // XXX merge with messages?
|
||||
messages = buildmessage.capture({
|
||||
title: "publishing package " + name
|
||||
}, function () {
|
||||
// If we are creating a new package, dsPS will document this for us, so
|
||||
// we don't need to do this here. Though, in the future, once we are
|
||||
// done bootstrapping package servers, we should consider having some
|
||||
// extra checks around this.
|
||||
pubEC = packageClient.publishPackage(
|
||||
prebuilt.source,
|
||||
prebuilt.compileResult,
|
||||
conn,
|
||||
opts);
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write(messages.formatMessages());
|
||||
return pubEC || 1;
|
||||
}
|
||||
|
||||
// If we fail to publish, just exit outright, something has gone wrong.
|
||||
if (pub > 0) {
|
||||
if (pubEC > 0) {
|
||||
process.stderr.write("Failed to publish: " + name + "\n");
|
||||
return 1;
|
||||
return pubEC;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -649,21 +696,26 @@ main.registerCommand({
|
||||
}
|
||||
|
||||
process.stdout.write("Creating a new release version...\n");
|
||||
// Send it over!
|
||||
var record = {
|
||||
track: relConf.track,
|
||||
version: relConf.version,
|
||||
orderKey: relConf.orderKey,
|
||||
description: relConf.description,
|
||||
recommended: !!relConf.recommended,
|
||||
tool: relConf.tool,
|
||||
packages: relConf.packages
|
||||
};
|
||||
var record = {
|
||||
track: relConf.track,
|
||||
version: relConf.version,
|
||||
orderKey: relConf.orderKey,
|
||||
description: relConf.description,
|
||||
recommended: !!relConf.recommended,
|
||||
tool: relConf.tool,
|
||||
packages: relConf.packages
|
||||
};
|
||||
|
||||
var uploadInfo;
|
||||
if (!relConf.patchFrom) {
|
||||
uploadInfo = conn.call('createReleaseVersion', record);
|
||||
} else {
|
||||
uploadInfo = conn.call('createPatchReleaseVersion', record, relConf.patchFrom);
|
||||
try {
|
||||
if (!relConf.patchFrom) {
|
||||
uploadInfo = conn.call('createReleaseVersion', record);
|
||||
} else {
|
||||
uploadInfo = conn.call('createPatchReleaseVersion', record, relConf.patchFrom);
|
||||
}
|
||||
} catch (err) {
|
||||
process.stderr.write("ERROR: " + err + "\n");
|
||||
return 1;
|
||||
}
|
||||
|
||||
// Get it back.
|
||||
@@ -681,13 +733,25 @@ main.registerCommand({
|
||||
|
||||
main.registerCommand({
|
||||
name: 'search',
|
||||
minArgs: 1,
|
||||
minArgs: 0,
|
||||
maxArgs: 1,
|
||||
options: {
|
||||
details: { type: Boolean, required: false }
|
||||
details: { type: Boolean, required: false },
|
||||
mine: {type: Boolean, required: false }
|
||||
},
|
||||
}, function (options) {
|
||||
|
||||
if (options.details && options.mine) {
|
||||
process.stderr.write("You must select a specific package by name to view details. \n");
|
||||
return 1;
|
||||
}
|
||||
|
||||
if (!options.mine && options.args.length === 0) {
|
||||
process.stderr.write("You must search for packages by name or substring. \n");
|
||||
throw new main.ShowUsage;
|
||||
}
|
||||
|
||||
|
||||
catalog.official.refresh();
|
||||
|
||||
if (options.details) {
|
||||
@@ -701,7 +765,7 @@ main.registerCommand({
|
||||
}
|
||||
var versionRecords;
|
||||
var label;
|
||||
if (!allRecord.release) {
|
||||
if (!allRecord.isRelease) {
|
||||
label = "package";
|
||||
var getRelevantRecord = function (version) {
|
||||
var versionRecord =
|
||||
@@ -709,15 +773,18 @@ main.registerCommand({
|
||||
var myBuilds = _.pluck(
|
||||
catalog.official.getAllBuilds(name, version),
|
||||
'buildArchitectures');
|
||||
// This package does not have different builds, so we don't care.
|
||||
if (_.isEqual(myBuilds, ["browser+os"])) {
|
||||
return versionRecord;
|
||||
// Does this package only have a cross-platform build?
|
||||
if (myBuilds.length === 1) {
|
||||
var allArches = myBuilds[0].split('+');
|
||||
if (!_.any(allArches, function (arch) {
|
||||
return arch.match(/^os\./);
|
||||
})) {
|
||||
return versionRecord;
|
||||
}
|
||||
}
|
||||
// This package is only available for some architectures.
|
||||
var myStringBuilds = "";
|
||||
_.each(myBuilds, function (build) {
|
||||
myStringBuilds = myStringBuilds + build.split('+')[1] + " ";
|
||||
});
|
||||
// XXX show in a more human way?
|
||||
var myStringBuilds = myBuilds.join(' ');
|
||||
return _.extend({ buildArchitectures: myStringBuilds },
|
||||
versionRecord);
|
||||
};
|
||||
@@ -786,15 +853,50 @@ main.registerCommand({
|
||||
}
|
||||
process.stdout.write(maintain + "\n");
|
||||
} else {
|
||||
var search = options.args[0];
|
||||
|
||||
|
||||
var allPackages = catalog.official.getAllPackageNames();
|
||||
var allReleases = catalog.official.getAllReleaseTracks();
|
||||
var matchingPackages = [];
|
||||
var matchingReleases = [];
|
||||
|
||||
var selector;
|
||||
if (options.mine) {
|
||||
var myUserName = auth.loggedInUsername();
|
||||
if (!myUserName) {
|
||||
// But couldn't you just grep the data.json for any maintainer? Yeah,
|
||||
// but that's temporary, and won't work once organizations are around.
|
||||
process.stderr.write("Please login so we know who you are. \n");
|
||||
auth.doUsernamePasswordLogin({});
|
||||
myUserName = auth.loggedInUsername();
|
||||
}
|
||||
// In the future, we should consider checking this on the server, but I
|
||||
// suspect the main use of this command will be to deal with the automatic
|
||||
// migration and uncommon in everyday use. From that perspective, it makes
|
||||
// little sense to require you to be online to find out what packages you
|
||||
// own; and the consequence of not mentioning your group packages until
|
||||
// you update to a new version of meteor is not that dire.
|
||||
selector = function (packageName, isRelease) {
|
||||
var record;
|
||||
if (!isRelease) {
|
||||
record = catalog.official.getPackage(packageName);
|
||||
} else {
|
||||
record = catalog.official.getReleaseTrack(packageName);
|
||||
}
|
||||
if (_.indexOf(_.pluck(record.maintainers, 'username'), myUserName) !== -1) {
|
||||
return true;
|
||||
}
|
||||
return false;
|
||||
};
|
||||
} else {
|
||||
var search = options.args[0];
|
||||
selector = function (packageName) {
|
||||
return packageName.match(search);
|
||||
};
|
||||
}
|
||||
|
||||
_.each(allPackages, function (pack) {
|
||||
if (pack.match(search)) {
|
||||
if (selector(pack, false)) {
|
||||
var vr = catalog.official.getLatestVersion(pack);
|
||||
if (vr) {
|
||||
matchingPackages.push(
|
||||
@@ -803,7 +905,7 @@ main.registerCommand({
|
||||
}
|
||||
});
|
||||
_.each(allReleases, function (track) {
|
||||
if (track.match(search)) {
|
||||
if (selector(track, true)) {
|
||||
var vr = catalog.official.getDefaultReleaseVersion(track);
|
||||
if (vr) {
|
||||
var vrlong =
|
||||
@@ -851,14 +953,15 @@ main.registerCommand({
|
||||
}, function (options) {
|
||||
var items = [];
|
||||
|
||||
// Packages that are used by this app
|
||||
var packages = project.getConstraints();
|
||||
// Versions of the packages. We need this to get the right description for the
|
||||
// user, in case it changed between versions.
|
||||
var versions = project.getVersions();
|
||||
var newVersionsAvailable = false;
|
||||
|
||||
var messages = buildmessage.capture(function () {
|
||||
// Packages that are used by this app
|
||||
var packages = project.getConstraints();
|
||||
// Versions of the packages. We need this to get the right description for
|
||||
// the user, in case it changed between versions.
|
||||
var versions = project.getVersions();
|
||||
|
||||
_.each(packages, function (version, name) {
|
||||
if (!version) {
|
||||
version = versions[name];
|
||||
@@ -890,7 +993,7 @@ main.registerCommand({
|
||||
});
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stdout.write("\n" + messages.formatMessages());
|
||||
process.stderr.write("\n" + messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
|
||||
@@ -1098,7 +1201,14 @@ main.registerCommand({
|
||||
|
||||
var solutionPackageVersions = null;
|
||||
var directDependencies = project.getConstraints();
|
||||
var previousVersions = project.getVersions();
|
||||
var previousVersions;
|
||||
var messages = buildmessage.capture(function () {
|
||||
previousVersions = project.getVersions();
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write(messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
var solutionReleaseVersion = _.find(releaseVersionsToTry, function (versionToTry) {
|
||||
var releaseRecord = catalog.complete.getReleaseVersion(releaseTrack, versionToTry);
|
||||
if (!releaseRecord)
|
||||
@@ -1139,8 +1249,16 @@ main.registerCommand({
|
||||
var upgradersToRun = upgraders.upgradersToRun();
|
||||
|
||||
// Write the new versions to .meteor/packages and .meteor/versions.
|
||||
var setV = project.setVersions(solutionPackageVersions,
|
||||
{ alwaysRecord : true });
|
||||
var setV;
|
||||
messages = buildmessage.capture(function () {
|
||||
setV = project.setVersions(solutionPackageVersions,
|
||||
{ alwaysRecord : true });
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write("Error while setting versions:\n" +
|
||||
messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
project.showPackageChanges(previousVersions, solutionPackageVersions, {
|
||||
onDiskPackages: setV.downloaded
|
||||
});
|
||||
@@ -1173,13 +1291,20 @@ main.registerCommand({
|
||||
// We can't update packages when we are not in a release.
|
||||
if (!options.appDir) return 0;
|
||||
|
||||
var versions = project.getVersions();
|
||||
var allPackages = project.getCurrentCombinedConstraints();
|
||||
var upgradePackages;
|
||||
var versions, allPackages;
|
||||
messages = buildmessage.capture(function () {
|
||||
versions = project.getVersions();
|
||||
allPackages = project.getCurrentCombinedConstraints();
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write(messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
|
||||
// If no packages have been specified, then we will send in a request to
|
||||
// update all direct dependencies. If a specific list of packages has been
|
||||
// specified, then only upgrade those.
|
||||
var upgradePackages;
|
||||
if (options.args.length === 0) {
|
||||
upgradePackages = _.pluck(allPackages, 'packageName');
|
||||
} else {
|
||||
@@ -1203,12 +1328,22 @@ main.registerCommand({
|
||||
}
|
||||
|
||||
// Set our versions and download the new packages.
|
||||
setV = project.setVersions(newVersions, {
|
||||
alwaysRecord : true });
|
||||
|
||||
// Display changes: what we have added/removed/upgraded.
|
||||
return project.showPackageChanges(versions, newVersions, {
|
||||
onDiskPackages: setV.downloaded});
|
||||
messages = buildmessage.capture(function () {
|
||||
setV = project.setVersions(newVersions, { alwaysRecord : true });
|
||||
});
|
||||
// XXX cleanup this madness of error handling
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write("Error while setting package versions:\n" +
|
||||
messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
var showExitCode = project.showPackageChanges(
|
||||
versions, newVersions, { onDiskPackages: setV.downloaded });
|
||||
if (!setV.success) {
|
||||
process.stderr.write("Could not install all the requested packages.\n");
|
||||
return 1;
|
||||
}
|
||||
return showExitCode;
|
||||
}
|
||||
return 0;
|
||||
});
|
||||
@@ -1239,9 +1374,16 @@ main.registerCommand({
|
||||
// Read in existing package dependencies.
|
||||
var packages = project.getConstraints();
|
||||
|
||||
// Combine into one object mapping package name to list of
|
||||
// constraints, to pass in to the constraint solver.
|
||||
var allPackages = project.getCurrentCombinedConstraints();
|
||||
var allPackages;
|
||||
var messages = buildmessage.capture(function () {
|
||||
// Combine into one object mapping package name to list of constraints, to
|
||||
// pass in to the constraint solver.
|
||||
allPackages = project.getCurrentCombinedConstraints();
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write(messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
|
||||
// For every package name specified, add it to our list of package
|
||||
// constraints. Don't run the constraint solver until you have added all of
|
||||
@@ -1323,36 +1465,43 @@ main.registerCommand({
|
||||
return 1;
|
||||
}
|
||||
|
||||
// Get the contents of our versions file. We need to pass them to the
|
||||
// constraint solver, because our contract with the user says that we will
|
||||
// never downgrade a dependency.
|
||||
var versions = project.getVersions();
|
||||
var downloaded, versions, newVersions;
|
||||
var messages = buildmessage.capture(function () {
|
||||
// Get the contents of our versions file. We need to pass them to the
|
||||
// constraint solver, because our contract with the user says that we will
|
||||
// never downgrade a dependency.
|
||||
versions = project.getVersions();
|
||||
|
||||
// Call the constraint solver.
|
||||
var resolverOpts = {
|
||||
previousSolution: versions,
|
||||
breaking: !!options.force
|
||||
};
|
||||
newVersions = catalog.complete.resolveConstraints(
|
||||
allPackages,
|
||||
resolverOpts,
|
||||
{ ignoreProjectDeps: true });
|
||||
if ( ! newVersions) {
|
||||
// XXX: Better error handling.
|
||||
process.stderr.write("Cannot resolve package dependencies.\n");
|
||||
return;
|
||||
}
|
||||
|
||||
// Call the constraint solver.
|
||||
var resolverOpts = {
|
||||
previousSolution: versions,
|
||||
breaking: !!options.force
|
||||
};
|
||||
var newVersions = catalog.complete.resolveConstraints(allPackages,
|
||||
resolverOpts,
|
||||
{ ignoreProjectDeps: true });
|
||||
if ( ! newVersions) {
|
||||
// XXX: Better error handling.
|
||||
process.stderr.write("Cannot resolve package dependencies.\n");
|
||||
// Don't tell the user what all the operations were until we finish -- we
|
||||
// don't want to give a false sense of completeness until everything is
|
||||
// written to disk.
|
||||
var messageLog = [];
|
||||
|
||||
// Install the new versions. If all new versions were installed
|
||||
// successfully, then change the .meteor/packages and .meteor/versions to
|
||||
// match expected reality.
|
||||
downloaded = project.addPackages(constraints, newVersions);
|
||||
});
|
||||
if (messages.hasMessages()) {
|
||||
process.stderr.write(messages.formatMessages());
|
||||
return 1;
|
||||
}
|
||||
|
||||
// Don't tell the user what all the operations were until we finish -- we
|
||||
// don't want to give a false sense of completeness until everything is
|
||||
// written to disk.
|
||||
var messageLog = [];
|
||||
|
||||
|
||||
// Install the new versions. If all new versions were installed successfully,
// then change the .meteor/packages and .meteor/versions to match expected
// reality.
var downloaded = project.addPackages(constraints, newVersions);

var ret = project.showPackageChanges(versions, newVersions, {
  onDiskPackages: downloaded});
if (ret !== 0) return ret;
@@ -1412,19 +1561,24 @@ main.registerCommand({
  }
});

var messages = buildmessage.capture(function () {
// Get the contents of our versions file, we will want them in order to
// remove to the user what we removed.
var versions = project.getVersions();

  // Get the contents of our versions file, we will want them in order to remove
  // to the user what we removed.
  var versions = project.getVersions();
// Remove the packages from the project! There is really no way for this to
// fail, unless something has gone horribly wrong, so we don't need to check
// for it.
project.removePackages(packagesToRemove);

  // Remove the packages from the project! There is really no way for this to
  // fail, unless something has gone horribly wrong, so we don't need to check
  // for it.
  project.removePackages(packagesToRemove);

// Retrieve the new dependency versions that we have chosen for this project
// and do some pretty output reporting.
var newVersions = project.getVersions();
  // Retrieve the new dependency versions that we have chosen for this project
  // and do some pretty output reporting.
  var newVersions = project.getVersions();
});
if (messages.hasMessages()) {
  process.stderr.write(messages.formatMessages());
  return 1;
}

// Log that we removed the constraints. It is possible that there are
// constraints that we officially removed that the project still 'depends' on,
@@ -1478,8 +1632,6 @@ main.registerCommand({
  var record = fullRecord.record;
  if (!options.list) {

    checkAuthorizedPackageMaintainer(record, " add or remove maintainers");

    try {
      var conn = packageClient.loggedInPackagesConnection();
    } catch (err) {
@@ -1505,7 +1657,7 @@ main.registerCommand({
      process.stdout.write(" Done!\n");
    }
  } catch (err) {
    process.stdout.write("\n" + err + "\n");
    process.stderr.write("\n" + err + "\n");
  }
  conn.close();
  catalog.official.refresh();
@@ -1547,8 +1699,6 @@ main.registerCommand({
    return 1;
  }

  checkAuthorizedPackageMaintainer(record, " recommend or unrecommend release");

  try {
    var conn = packageClient.loggedInPackagesConnection();
  } catch (err) {
@@ -1569,7 +1719,7 @@ main.registerCommand({
        " is now a recommended release\n");
    }
  } catch (err) {
    process.stdout.write("\n" + err + "\n");
    process.stderr.write("\n" + err + "\n");
  }
  conn.close();
  catalog.official.refresh();
@@ -1602,8 +1752,6 @@ main.registerCommand({
    return 1;
  }

  checkAuthorizedPackageMaintainer(record, " set earliest recommended version");

  try {
    var conn = packageClient.loggedInPackagesConnection();
  } catch (err) {
@@ -1620,7 +1768,7 @@ main.registerCommand({
    conn.call('_setEarliestCompatibleVersion', versionInfo, ecv);
    process.stdout.write("Done!\n");
  } catch (err) {
    process.stdout.write("\n" + err + "\n");
    process.stderr.write("\n" + err + "\n");
  }
  conn.close();
  catalog.official.refresh();
@@ -1647,8 +1795,6 @@ main.registerCommand({
    return 1;
  }

  checkAuthorizedPackageMaintainer(record, " change repository URL");

  try {
    var conn = packageClient.loggedInPackagesConnection();
  } catch (err) {
@@ -1663,7 +1809,7 @@ main.registerCommand({
    conn.call('_changePackageHomepage', name, url);
    process.stdout.write("Done!\n");
  } catch (err) {
    process.stdout.write("\n" + err + "\n");
    process.stderr.write("\n" + err + "\n");
  }
  conn.close();
  catalog.official.refresh();

@@ -165,6 +165,7 @@ main.registerCommand({
  catalog.official.refresh();

  var loadPackages = function (packagesToLoad, loader) {
    buildmessage.assertInCapture();
    _.each(packagesToLoad, function (name) {
      // Calling getPackage on the loader will return a unipackage object, which
      // means that the package will be compiled/downloaded. That we throw the
@@ -173,8 +174,9 @@ main.registerCommand({
    });
  };


  var messages = buildmessage.capture(function () {
  var messages = buildmessage.capture({
    title: 'getting packages ready'
  }, function () {
    // First, build all accessible *local* packages, whether or not this app
    // uses them. Use the "all packages are local" loader.
    loadPackages(catalog.complete.getLocalPackageNames(),
@@ -185,7 +187,7 @@ main.registerCommand({
    // of everything we need from versions have been downloaded. (Calling
    // buildPackages may be redundant, but can't hurt.)
    if (options.appDir) {
      loadPackages(_.keys(project.getVersions), project.getPackageLoader());
      loadPackages(_.keys(project.getVersions()), project.getPackageLoader());
    }

    // Using a release? Get all the packages in the release.
@@ -201,7 +203,7 @@ main.registerCommand({
  });

  if (messages.hasMessages()) {
    process.stdout.write("\n" + messages.formatMessages());
    process.stderr.write("\n" + messages.formatMessages());
    return 1;
  };

@@ -239,7 +241,13 @@ main.registerCommand({
  // those issues are concurrency-related, or possibly some weird
  // order-of-execution of interaction that we are failing to understand. This
  // seems to fix it in a clear and understandable fashion.)
  project.getVersions();
  var messages = buildmessage.capture(function () {
    project.getVersions(); // #StructuredProjectInitialization
  });
  if (messages.hasMessages()) {
    process.stderr.write(messages.formatMessages());
    return 1;
  }

  // XXX factor this out into a {type: host/port}?
  var portMatch = options.port.match(/^(?:(.+):)?([0-9]+)$/);
@@ -496,7 +504,13 @@ main.registerCommand({
  project.setMuted(true);
  project.writeMeteorReleaseVersion(
    release.current.isCheckout() ? "none" : release.current.name);
  project._ensureDepsUpToDate();
  var messages = buildmessage.capture(function () {
    project._ensureDepsUpToDate();
  });
  if (messages.hasMessages()) {
    process.stderr.write(messages.formatMessages());
    return 1;
  }

  process.stdout.write(appPath + ": created");
  if (options.example && options.example !== appPath)
@@ -574,9 +588,17 @@ main.registerCommand({
  var bundlePath = options['directory'] ?
    outputPath : path.join(buildDir, 'bundle');

  var bundler = require(path.join(__dirname, 'bundler.js'));
  var loader = project.getPackageLoader();
  var loader;
  var messages = buildmessage.capture(function () {
    loader = project.getPackageLoader();
  });
  if (messages.hasMessages()) {
    process.stderr.write("Errors prevented bundling your app:\n");
    process.stderr.write(messages.formatMessages());
    return 1;
  }

  var bundler = require(path.join(__dirname, 'bundler.js'));
  var bundleResult = bundler.bundle({
    outputPath: bundlePath,
    buildOptions: {
@@ -590,8 +612,8 @@ main.registerCommand({
    }
  });
  if (bundleResult.errors) {
    process.stdout.write("Errors prevented bundling:\n");
    process.stdout.write(bundleResult.errors.formatMessages());
    process.stderr.write("Errors prevented bundling:\n");
    process.stderr.write(bundleResult.errors.formatMessages());
    return 1;
  }

@@ -772,7 +794,13 @@ main.registerCommand({
  // issues are concurrency-related, or possibly some weird order-of-execution
  // of interaction that we are failing to understand. This seems to fix it in a
  // clear and understandable fashion.)
  project.getVersions();
  var messages = buildmessage.capture(function () {
    project.getVersions(); // #StructuredProjectInitialization
  });
  if (messages.hasMessages()) {
    process.stderr.write(messages.formatMessages());
    return 1;
  }

  if (options.password) {
    if (useGalaxy) {
@@ -1107,7 +1135,8 @@ var getPackagesForTest = function (packages) {
    });

    if (messages.hasMessages()) {
      throw new Error(messages.formatMessages());
      process.stderr.write("\n" + messages.formatMessages());
      return 1;
    }
  }

@@ -1217,7 +1246,7 @@ main.registerCommand({
    if (count)
      console.log("Built " + count + " packages.");
    if (messages.hasMessages()) {
      process.stdout.write("\n" + messages.formatMessages());
      process.stderr.write("\n" + messages.formatMessages());
      return 1;
    }
  });
@@ -1331,7 +1360,7 @@ main.registerCommand({
    _.each(osArches, function (osArch) {
      _.each(release.packages, function (pkgVersion, pkgName) {
        buildmessage.enterJob({
          title: "Looking up " + pkgName + "@" + pkgVersion + " on " + osArch
          title: "looking up " + pkgName + "@" + pkgVersion + " on " + osArch
        }, function () {
          if (!catalog.official.getBuildsForArches(pkgName, pkgVersion, [osArch])) {
            buildmessage.error("missing build of " + pkgName + "@" + pkgVersion +
@@ -1343,7 +1372,7 @@ main.registerCommand({
  });

  if (messages.hasMessages()) {
    process.stdout.write("\n" + messages.formatMessages());
    process.stderr.write("\n" + messages.formatMessages());
    return 1;
  };

@@ -1357,20 +1386,30 @@ main.registerCommand({
  // XXX update to '.meteor' when we combine houses
  var tmpTropo = new tropohouse.Tropohouse(
    path.join(tmpdir, '.meteor0'), catalog.official);
  try {
    tmpTropo.maybeDownloadPackageForArchitectures(
      {packageName: toolPkg.package, version: toolPkg.constraint},
      [osArch], // XXX 'browser' too?
      true);
    _.each(release.packages, function (pkgVersion, pkgName) {
  var messages = buildmessage.capture(function () {
    buildmessage.enterJob({
      title: "downloading tool package " + toolPkg.package + "@" +
        toolPkg.constraint
    }, function () {
      tmpTropo.maybeDownloadPackageForArchitectures(
        {packageName: pkgName, version: pkgVersion},
        [osArch], // XXX 'browser' too?
        {packageName: toolPkg.package, version: toolPkg.constraint},
        [osArch], // XXX 'web.browser' too?
        true);
    });
  } catch (err) {
    console.log(err);
    process.exit(1);
    _.each(release.packages, function (pkgVersion, pkgName) {
      buildmessage.enterJob({
        title: "downloading package " + pkgName + "@" + pkgVersion
      }, function () {
        tmpTropo.maybeDownloadPackageForArchitectures(
          {packageName: pkgName, version: pkgVersion},
          [osArch], // XXX 'web.browser' too?
          true);
      });
    });
  });
  if (messages.hasMessages()) {
    process.stderr.write("\n" + messages.formatMessages());
    return 1;
  }

  // XXX should we include some sort of preliminary package-metadata as well?
@@ -1408,7 +1447,8 @@ main.registerCommand({
main.registerCommand({
  name: 'admin set-banners',
  minArgs: 1,
  maxArgs: 1
  maxArgs: 1,
  hidden: true,
}, function (options) {
  var bannersFile = options.args[0];
  try {
@@ -1493,7 +1533,7 @@ main.registerCommand({
    browserstack: options.browserstack
  };

  return selftest.runTests({
  return selftest.runTests({
    onlyChanged: options.changed,
    offline: offline,
    includeSlowTests: options.slow,

@@ -28,7 +28,7 @@ var compiler = exports;
// end up as watched dependencies. (At least for now, packages only used in
// target creation (eg minifiers and dev-bundle-fetcher) don't require you to
// update BUILT_BY, though you will need to quit and rerun "meteor run".)
compiler.BUILT_BY = 'meteor/11';
compiler.BUILT_BY = 'meteor/12';

// XXX where should this go? I'll make it a random utility function
// for now
@@ -133,10 +133,11 @@ compiler.eachUsedUnibuild = function (
// PackgeSource), or we could have some kind of cache (the ideal place
// for such a cache might be inside the constraint solver, since it
// will know how/when to invalidate it).
var determineBuildTimeDependencies = function
  (packageSource, constraintSolverOpts) {
var determineBuildTimeDependencies = function (packageSource,
                                               constraintSolverOpts) {
  var ret = {};
  constraintSolverOpts = constraintSolverOpts || {};
  constraintSolverOpts = constraintSolverOpts || {};
  constraintSolverOpts.ignoreProjectDeps || buildmessage.assertInCapture();

  // There are some special cases where we know that the package has no source
  // files, which means it can't have any interesting build-time
@@ -334,7 +335,7 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
  _.each(activePluginPackages, function (otherPkg) {
    _.each(otherPkg.getSourceHandlers(), function (sourceHandler, ext) {
      // XXX comparing function text here seems wrong.
      if (ext in allHandlers &&
      if (_.has(allHandlers, ext) &&
          allHandlers[ext].toString() !== sourceHandler.handler.toString()) {
        buildmessage.error(
          "conflict: two packages included in " +
@@ -343,10 +344,16 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
          (otherPkg.name || "the app") + ", " +
          "are both trying to handle ." + ext);
        // Recover by just going with the first handler we saw
      } else {
        allHandlers[ext] = sourceHandler.handler;
        sourceExtensions[ext] = !!sourceHandler.isTemplate;
        return;
      }
      // Is this handler only registered for, say, "web", and we're building,
      // say, "os"?
      if (sourceHandler.archMatching &&
          !archinfo.matches(inputSourceArch.arch, sourceHandler.archMatching)) {
        return;
      }
      allHandlers[ext] = sourceHandler.handler;
      sourceExtensions[ext] = !!sourceHandler.isTemplate;
    });
  });

@@ -402,12 +409,6 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
    var file = watch.readAndWatchFileWithHash(sourceWatchSet, absPath);
    var contents = file.contents;

    // Only add the source file to the WatchSet if it's actually added to
    // the build. This is a hacky workaround because plugins do not register
    // themselves as "client" or "server", so we need to detect whether a file
    // is actually added to the client/server program.
    var sourceIsWatched = false;

    sources.push(relPath);

    if (contents === null) {
@@ -456,11 +457,11 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
    //   information.
    // - pathForSourceMap: If this file is to be included in a source map,
    //   this is the name you should use for it in the map.
    // - rootOutputPath: on client targets, for resources such as
    // - rootOutputPath: on web targets, for resources such as
    //   stylesheet and static assets, this is the root URL that
    //   will get prepended to the paths you pick for your output
    //   files so that you get your own namespace, for example
    //   '/packages/foo'. null on non-client targets
    //   '/packages/foo'. null on non-web targets
    // - fileOptions: any options passed to "api.add_files"; for
    //   use by the plugin. The built-in "js" plugin uses the "bare"
    //   option for files that shouldn't be wrapped in a closure.
@@ -472,11 +473,12 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
    //   file as a Buffer. If n is omitted you get the rest of the
    //   file.
    // - appendDocument({ section: "head", data: "my markup" })
    //   Client targets only. Add markup to the "head" or "body"
    //   Browser targets only. Add markup to the "head" or "body"
    //   Web targets only. Add markup to the "head" or "body"
    //   section of the document.
    // - addStylesheet({ path: "my/stylesheet.css", data: "my css",
    //   sourceMap: "stringified json sourcemap"})
    //   Client targets only. Add a stylesheet to the
    //   Web targets only. Add a stylesheet to the
    //   document. 'path' is a requested URL for the stylesheet that
    //   may or may not ultimately be honored. (Meteor will add
    //   appropriate tags to cause the stylesheet to be loaded. It
@@ -497,10 +499,10 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
    //   a closure, so that its vars are shared with other files
    //   in the module.
    // - addAsset({ path: "my/image.png", data: Buffer })
    //   Add a file to serve as-is over HTTP (client targets) or
    //   Add a file to serve as-is over HTTP (web targets) or
    //   to include as-is in the bundle (os targets).
    //   This time `data` is a Buffer rather than a string. For
    //   client targets, it will be served at the exact path you
    //   web targets, it will be served at the exact path you
    //   request (concatenated with rootOutputPath). For server
    //   targets, the file can be retrieved by passing path to
    //   Assets.getText or Assets.getBinary.
@@ -513,7 +515,7 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
    //   line, column, and func are all optional.
    //
    // XXX for now, these handlers must only generate portable code
    // (code that isn't dependent on the arch, other than 'client'
    // (code that isn't dependent on the arch, other than 'web'
    // vs 'os') -- they can look at the arch that is provided
    // but they can't rely on the running on that particular arch
    // (in the end, an arch-specific unibuild will be emitted only if
@@ -571,26 +573,24 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
        return ret;
      },
      appendDocument: function (options) {
        if (! archinfo.matches(inputSourceArch.arch, "client"))
        if (! archinfo.matches(inputSourceArch.arch, "web"))
          throw new Error("Document sections can only be emitted to " +
                          "client targets");
                          "web targets");
        if (options.section !== "head" && options.section !== "body")
          throw new Error("'section' must be 'head' or 'body'");
        if (typeof options.data !== "string")
          throw new Error("'data' option to appendDocument must be a string");
        sourceIsWatched = true;
        resources.push({
          type: options.section,
          data: new Buffer(options.data, 'utf8')
        });
      },
      addStylesheet: function (options) {
        if (! archinfo.matches(inputSourceArch.arch, "client"))
        if (! archinfo.matches(inputSourceArch.arch, "web"))
          throw new Error("Stylesheets can only be emitted to " +
                          "client targets");
                          "web targets");
        if (typeof options.data !== "string")
          throw new Error("'data' option to addStylesheet must be a string");
        sourceIsWatched = true;
        resources.push({
          type: "css",
          refreshable: true,
@@ -604,9 +604,8 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
          throw new Error("'data' option to addJavaScript must be a string");
        if (typeof options.sourcePath !== "string")
          throw new Error("'sourcePath' option must be supplied to addJavaScript. Consider passing inputPath.");
        if (options.bare && ! archinfo.matches(inputSourceArch.arch, "client"))
          throw new Error("'bare' option may only be used for client targets");
        sourceIsWatched = true;
        if (options.bare && ! archinfo.matches(inputSourceArch.arch, "web"))
          throw new Error("'bare' option may only be used for web targets");
        js.push({
          source: options.data,
          sourcePath: options.sourcePath,
@@ -618,7 +617,6 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
      addAsset: function (options) {
        if (! (options.data instanceof Buffer))
          throw new Error("'data' option to addAsset must be a Buffer");
        sourceIsWatched = true;
        addAsset(options.data, options.path);
      },
      error: function (options) {
@@ -641,9 +639,7 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
      // handler might already have emitted resources)
    }

    if (sourceIsWatched) {
      watchSet.merge(sourceWatchSet);
    }
    watchSet.merge(sourceWatchSet);
  });

  // *** Run Phase 1 link
@@ -745,6 +741,7 @@ var compileUnibuild = function (unipackage, inputSourceArch, packageLoader,
// disk) that were used by the compilation (the source files you'd have to
// ship to a different machine to replicate the build there)
compiler.compile = function (packageSource, options) {
  buildmessage.assertInCapture();
  var sources = [];
  var pluginWatchSet = packageSource.pluginWatchSet.clone();
  var plugins = {};
@@ -911,7 +908,10 @@ var getPluginProviders = function (versions) {
// string). Yes, it is possible that multiple versions of some other
// package might be build-time dependencies (because of plugins).
compiler.getBuildOrderConstraints = function (
  packageSource, constraintSolverOpts) {
    packageSource, constraintSolverOpts) {
  constraintSolverOpts = constraintSolverOpts || {};
  constraintSolverOpts.ignoreProjectDeps || buildmessage.assertInCapture();

  var versions = {}; // map from package name to version to true
  var addVersion = function (version, name) {
    if (name !== packageSource.name) {
@@ -950,6 +950,7 @@ compiler.getBuildOrderConstraints = function (
// build-time dependency has changed.
compiler.checkUpToDate = function (
    packageSource, unipackage, constraintSolverOpts) {
  buildmessage.assertInCapture();

  if (unipackage.forceNotUpToDate) {
    return false;

@@ -156,27 +156,49 @@ _.extend(exports, {
    }
  },

  getLocalPackageCacheFilename: function (serverUrl) {
  // Note: this is NOT guaranteed to return a distinct prefix for every
  // conceivable URL. But it sure ought to return a distinct prefix for every
  // server we actually use.
  getPackageServerFilePrefix: function (serverUrl) {
    var self = this;
    if (!serverUrl) serverUrl = self.getPackageServerUrl();
    var serLen = serverUrl.length;

    // Chop off http:// and https://.
    serverUrl = serverUrl.replace(/^\https:/, '');
    serverUrl = serverUrl.replace(/^\http:/, '');
    serverUrl = serverUrl.slice(2, serLen);
    // Chop off http:// and https:// and trailing slashes.
    serverUrl = serverUrl.replace(/^\https:\/\//, '');
    serverUrl = serverUrl.replace(/^\http:\/\//, '');
    serverUrl = serverUrl.replace(/\/+$/, '');

    // Chop off meteor.com.
    serverUrl = serverUrl.replace(/meteor\.com/, '');
    serverUrl = serverUrl.replace(/\.meteor\.com$/, '');

    // Replace other weird stuff with X.
    serverUrl = serverUrl.replace(/[^a-zA-Z0-9.:-]/g, 'X');

    return serverUrl;
  },

  getPackagesDirectoryName: function (serverUrl) {
    var self = this;

    var prefix = config.getPackageServerFilePrefix();
    if (prefix !== 'packages') {
      prefix = path.join('packages-from-server', prefix);
    }

    return prefix;
  },

  getLocalPackageCacheFilename: function (serverUrl) {
    var self = this;
    var prefix = self.getPackageServerFilePrefix();

    // Should look like 'packages.data.json' in the default case
    // (test-packages.data.json before 0.9.0).
    return serverUrl + "data.json";
    return prefix + ".data.json";
  },

  getPackageStorage: function() {
    var self = this;
    var serverFile = self.getPackageServerUrl() + ".data.json";
    return path.join(tropohouse.default.root,
                     "package-metadata", "v1",
                     self.getLocalPackageCacheFilename());

@@ -5,7 +5,6 @@ var path = require('path');
var fs = require('fs');
var uniload = require('./uniload.js');
var fiberHelpers = require('./fiber-helpers.js');
var Fiber = require('fibers');
var httpHelpers = require('./http-helpers.js');
var auth = require('./auth.js');
var release = require('./release.js');
@@ -206,7 +205,14 @@ exports.deploy = function (options) {

  if (! options.starball && ! messages.hasMessages()) {
    process.stdout.write('Deploying ' + options.app + '. Bundling...\n');
    stats.recordPackages();
    var statsMessages = buildmessage.capture(function () {
      stats.recordPackages();
    });
    if (statsMessages.hasMessages()) {
      process.stdout.write("Error talking to stats server:\n" +
                           statsMessages.formatMessages());
      // ... but continue;
    }
    var bundleResult = bundler.bundle({
      outputPath: bundlePath,
      buildOptions: options.buildOptions

@@ -13,7 +13,6 @@ var config = require('./config.js');
var auth = require('./auth.js');
var utils = require('./utils.js');
var _ = require('underscore');
var inFiber = require('./fiber-helpers.js').inFiber;
var Future = require('fibers/future');
var project = require('./project.js');
var stats = require('./stats.js');
@@ -397,8 +396,16 @@ var bundleAndDeploy = function (options) {
  if (! messages.hasMessages()) {
    var bundler = require('./bundler.js');

    if (options.recordPackageUsage)
      stats.recordPackages(options.appDir);
    if (options.recordPackageUsage) {
      var statsMessages = buildmessage.capture(function () {
        stats.recordPackages(options.appDir);
      });
      if (statsMessages.hasMessages()) {
        process.stdout.write("Error talking to stats server:\n" +
                             statsMessages.formatMessages());
        // ... but continue;
      }
    }

    var bundleResult = bundler.bundle({
      outputPath: bundlePath,

@@ -2,30 +2,6 @@ var _ = require("underscore");
var Fiber = require("fibers");
var Future = require("fibers/future");

// Use this to wrap callbacks that need to run in a fiber, when
// passing callbacks to functions such as setTimeout that aren't
// callback-aware. (In app code we handle this with Meteor.setTimeout,
// but we don't have such a thing in the tools code yet.)
//
// Specifically, given a function f, this returns a new function that
// returns immediately but also runs f (with any provided arguments)
// in a new fiber.
//
// NOTE: It's probably better to not use callbacks. Instead you can
// use Futures to generate synchronous equivalents.
//
// XXX just suck it up and replace setTimeout and clearTimeout,
// globally, with fiberized versions? will this mess up npm dependencies?
exports.inFiber = function (func) {
  return function (/*arguments*/) {
    var self = this;
    var args = arguments;
    new Fiber(function () {
      func.apply(self, args);
    }).run();
  };
};

exports.parallelEach = function (collection, callback, context) {
  var futures = _.map(collection, function () {
    var args = _.toArray(arguments);
@@ -89,3 +65,97 @@ exports.noYieldsAllowed = function (f) {
    Fiber.yield = savedYield;
  }
};

// Borrowed from packages/meteor/dynamics_nodejs.js
// Used by buildmessage

exports.nodeCodeMustBeInFiber = function () {
  if (!Fiber.current) {
    throw new Error("Meteor code must always run within a Fiber. " +
                    "Try wrapping callbacks that you pass to non-Meteor " +
                    "libraries with Meteor.bindEnvironment.");
  }
};

var nextSlot = 0;
exports.EnvironmentVariable = function (defaultValue) {
  var self = this;
  self.slot = 'slot' + nextSlot++;
  self.defaultValue = defaultValue;
};

_.extend(exports.EnvironmentVariable.prototype, {
  get: function () {
    var self = this;
    exports.nodeCodeMustBeInFiber();

    if (!Fiber.current._meteorDynamics)
      return self.defaultValue;
    if (!_.has(Fiber.current._meteorDynamics, self.slot))
      return self.defaultValue;
    return Fiber.current._meteorDynamics[self.slot];
  },

  withValue: function (value, func) {
    var self = this;
    exports.nodeCodeMustBeInFiber();

    if (!Fiber.current._meteorDynamics)
      Fiber.current._meteorDynamics = {};
    var currentValues = Fiber.current._meteorDynamics;

    var saved = _.has(currentValues, self.slot)
          ? currentValues[self.slot] : self.defaultValue;
    currentValues[self.slot] = value;

    try {
      return func();
    } finally {
      currentValues[self.slot] = saved;
    }
  }
});

// This is like Meteor.bindEnvironment.
// Experimentally, we are NOT including onException or _this in this version.
exports.bindEnvironment = function (func) {
  exports.nodeCodeMustBeInFiber();

  var boundValues = _.clone(Fiber.current._meteorDynamics || {});

  return function (/* arguments */) {
    var self = this;
    var args = _.toArray(arguments);

    var runWithEnvironment = function () {
      var savedValues = Fiber.current._meteorDynamics;
      try {
        // Need to clone boundValues in case two fibers invoke this
        // function at the same time
        Fiber.current._meteorDynamics = _.clone(boundValues);
        return func.apply(self, args);
      } finally {
        Fiber.current._meteorDynamics = savedValues;
      }
    };

    if (Fiber.current)
      return runWithEnvironment();
    Fiber(runWithEnvironment).run();
  };
};

// An alternative to bindEnvironment for the rare case where you
// want the callback you're passing to some Node function to start
// a new fiber but *NOT* to inherit the current environment.
// Eg, if you are trying to do the equivalent of start a background
// thread.
exports.inBareFiber = function (func) {
  return function (/*arguments*/) {
    var self = this;
    var args = arguments;
    new Fiber(function () {
      func.apply(self, args);
    }).run();
  };
};

@@ -283,7 +283,13 @@ files.fileHash = function (filename) {
// Returns a base64 SHA256 hash representing a tree on disk. It is not sensitive
// to modtime, uid/gid, or any permissions bits other than the current-user-exec
// bit on normal files.
files.treeHash = function (root) {
files.treeHash = function (root, options) {
  options = _.extend({
    ignore: function (relativePath) {
      return false;
    }
  }, options);

  var crypto = require('crypto');
  var hash = crypto.createHash('sha256');

@@ -296,6 +302,11 @@ files.treeHash = function (root) {
  };

  var traverse = function (relativePath) {
    if (options.ignore(relativePath)) {
      hashLog && hashLog.push('SKIP ' + JSON.stringify(relativePath) + '\n');
      return;
    }

    var absPath = path.join(root, relativePath);
    var stat = fs.lstatSync(absPath);

@@ -69,14 +69,38 @@ Options:


>>> update
Upgrade this project to the latest version of Meteor
Usage: meteor update [--release <release>]
Upgrade this project's dependencies to their latest versions.
Usage: meteor update
meteor update --patch
meteor update --release <release>
meteor update --packages-only
meteor update [packageName packageName2 ...]

Sets the version of Meteor to use with the current project. If a
release is specified with --release, set the project to use that
version. Otherwise download and use the latest release of Meteor.
Updates the meteor release, and then, if applicable, updates the packages
used by the app to the latest versions that don't cause dependency
conflicts with other packages in the app.

Passing the --patch argument will update to the latest patch, if one exists.
Patch releases contain very minor changes, usually bug fixes. Updating to
the latest patch is always recommended. This will try to not update non-core
packages unless strictly necessary.

Passing the --release argument will force update to a specific release of meteor.
This will not update non-core packages unless strictly necessary. It is also
possible that some packages cannot be updated to be compatible with the new
release. If that happens, the app will not build until dependencies on those
packages are removed.

Passing --packages-only will try to update non-core packages to their latest
versions. It will not update the version of meteor. To update individual packages
(for example: 'foo:awesome') pass in their names instead, with no options. ('meteor
update foo:awesome').

Options:
--packages-only  Update the package versions only. Do not update the release.
--patch          Update the release to a patch release only.
--release        Update to a specific release of meteor.

XXX: change to new world order

>>> run-upgrader
Execute a specific upgrader by name. Intended for testing.
@@ -412,13 +436,24 @@ Options:


>>> publish-for-arch
Publishes a bla bla bla.
Builds an already-published package for a new platform

Usage: fry the eggs, then
Usage: meteor publish-for-arch packageName@version

Publishes a
When you publish a package with 'meteor publish' for a package which has
platform-specific components (eg, npm modules with native code), the package
will only be usable on machines of the same architecture that you are currently
on. To make your package's version usable on other architectures, you can use
the publish-for-arch command. (The architectures currently supported by Meteor
are 32-bit Linux, 64-bit Linux, and 64-bit OS X; the 'meteor deploy' servers use
64-bit Linux.)

XXX: WRITE
On a machine of the appropriate architecture, install Meteor and run
$ meteor publish-for-arch packageName@version

You don't need to have a copy of your package's source to do this: Meteor will
automatically download your package's source and dependencies from the package
server.


>>> rebuild
@@ -443,6 +478,7 @@ debugging the Meteor packaging tools themselves.
Search through the package server database.
Usage: meteor search <string>
meteor search --details <package or release name>
meteor search --mine

Search through the meteor package and release database for names containing the
specified substring.
@@ -451,9 +487,12 @@ Use --details to get detailed information on a specific package or release.
Use --details <name>@<version> to get more information on a specific version
of a package or release.

Use --mine to get the list of packages on which you are one of the authorized
maintainers.

Options:
--details  show detailed information on a specific package or release

--mine     list packages on which you are the maintainer

>>> admin maintainers
View or change authorized maintainers for a package
@@ -501,13 +540,7 @@ Usage: meteor admin change-homepage <package name> <new url>
Change the homepage containing package information.


>>> admin set-breaking
Mark a given version as containing breaking changes.
Usage: stuff

Mark a given package version as containing breaking changes.

>>> admin set-banners <banner file config>
>>> admin set-banners
Set banners on published releases.
Usage: meteor admin set-banners <path to banner configuration>

@@ -230,7 +230,7 @@ var loadHelp = function () {
});
};

var longHelp = function (commandName) {
var longHelp = exports.longHelp = function (commandName) {
commandName = commandName.trim();
var parts = commandName.length ? commandName.split(' ') : [];
var node = commands;
@@ -313,11 +313,15 @@ var springboard = function (rel, releaseOverride) {

// XXX split better
try {
tropohouse.default.maybeDownloadPackageForArchitectures(
{packageName: toolsPkg, version: toolsVersion},
[archinfo.host()],
true /* print downloading message */
);
var messages = buildmessage.capture({
title: "downloading tools package " + toolsPkg + "@" + toolsVersion
}, function () {
tropohouse.default.maybeDownloadPackageForArchitectures(
{packageName: toolsPkg, version: toolsVersion},
[archinfo.host()],
true /* print downloading message */
);
});
} catch (err) {
// We have failed to download the tool that we are supposed to springboard
// to! That's bad. Let's exit.
@@ -327,8 +331,12 @@ var springboard = function (rel, releaseOverride) {
rel.getToolsPackageAtVersion() + "\n");
process.exit(1);
}

// XXX support warehouse too
if (messages.hasMessages()) {
process.stderr.write(
"Could not springboard to release: " + rel.name + ".\n" +
messages.formatMessages());
process.exit(1);
}

var packagePath = tropohouse.default.packagePath(toolsPkg, toolsVersion);
var toolUnipackage = new unipackage.Unipackage;
@@ -1157,6 +1165,3 @@ commandName + ": You're not in a Meteor project directory.\n" +
process.exit(ret);
}).run();

// exports
main.longHelp = longHelp;

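The springboard change above swaps a bare download call for one wrapped in `buildmessage.capture`, which collects errors raised during a job so the caller can report them all at once instead of catching exceptions. A toy version of that shape, for illustration only (not Meteor's actual `buildmessage` API):

```javascript
// Minimal capture(): run a job, collect error messages instead of
// throwing, and let the caller inspect them afterwards.
function capture(options, f) {
  var collected = [];
  var job = {
    error: function (message) {
      collected.push(options.title + ': ' + message);
    }
  };
  f(job);
  return {
    hasMessages: function () { return collected.length > 0; },
    formatMessages: function () { return collected.join('\n') + '\n'; }
  };
}
```

The caller then branches on `hasMessages()`, exactly as the new springboard code does.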
@@ -71,6 +71,7 @@ _.extend(PackageCache.prototype, {
// loadPackageAtPath() is called for them, see refresh().
loadPackageAtPath: function (name, loadPath) {
var self = this;
buildmessage.assertInCapture();

// We need to build and load both the test and normal package, which,
// frequently means 2 packages per directory/loadPath. Rather than
@@ -206,6 +207,7 @@ _.extend(PackageCache.prototype, {
// ignore when scanning for source files.)
loadAppAtPath: function (appDir, ignoreFiles) {
var self = this;
buildmessage.assertInCapture();

var packageSource = new PackageSource;
packageSource.initFromAppDir(appDir, ignoreFiles);

@@ -62,12 +62,11 @@ exports.loadCachedServerData = function (packageStorageFile) {
var data = fs.readFileSync(config.getPackageStorage(), 'utf8');
} catch (e) {
if (e.code == 'ENOENT') {
// process.stderr.write("No cached server data found on disk.\n");
return noDataToken;
}
// XXX we should probably return an error to the caller here to
// figure out how to handle it
console.log(e.message);
process.stderr.write("ERROR " + e.message + "\n");
process.exit(1);
}
var ret = noDataToken;
@@ -75,15 +74,19 @@ exports.loadCachedServerData = function (packageStorageFile) {
ret = JSON.parse(data);
} catch (err) {
// XXX error handling
console.log("Could not parse JSON in data.json.");
process.stderr.write(
"ERROR: Could not parse JSON for local package-metadata cache. \n");
// This should only happen if you decided to manually edit this or
// whatever. Regardless, go on and treat this as an empty file.
}
return ret;
};

// Opens a connection to the server, requests and returns new package data that
// we haven't cached on disk. We assume that data is cached chronologically, so
// essentially, we are asking for a diff from the last time that we did this.
// Requests and returns one page of new package data that we haven't cached on
// disk. We assume that data is cached chronologically, so essentially, we are
// asking for a diff from the last time that we did this.
// Takes in:
// - conn: the connection to use (does not have to be logged in)
// - syncToken: a syncToken object to be sent to the server that
// represents the last time that we talked to the server.
// - _optionsForTest:
@@ -97,25 +100,20 @@ exports.loadCachedServerData = function (packageStorageFile) {
//
// Throws a ServiceConnection.ConnectionTimeoutError if the method call
// times out.
var loadRemotePackageData = function (syncToken, _optionsForTest) {
var loadRemotePackageData = function (conn, syncToken, _optionsForTest) {
_optionsForTest = _optionsForTest || {};

var conn = openPackageServerConnection();
var syncOpts;
if (_optionsForTest && _optionsForTest.useShortPages) {
syncOpts = { shortPagesForTest: _optionsForTest.useShortPages };
}
try {
var collectionData;
if (syncOpts) {
collectionData = conn.call(
'syncNewPackageData', syncToken, syncOpts);
} else {
collectionData = conn.call(
'syncNewPackageData', syncToken);
}
} finally {
conn.close();
var collectionData;
if (syncOpts) {
collectionData = conn.call(
'syncNewPackageData', syncToken, syncOpts);
} else {
collectionData = conn.call(
'syncNewPackageData', syncToken);
}
return collectionData;
};
@@ -182,50 +180,67 @@ var writePackageDataToDisk = function (syncToken, data, options) {
// - useShortPages: Boolean. Request short pages of ~3 records from the
// server, instead of ~100 that it would send otherwise
exports.updateServerPackageData = function (cachedServerData, _optionsForTest) {
var self = this;
_optionsForTest = _optionsForTest || {};
var done = false;

var sources = [];
if (cachedServerData.collections) {
sources.push(cachedServerData.collections);
}
var syncToken = cachedServerData.syncToken;
var remoteData;
try {
remoteData = loadRemotePackageData(syncToken, {
useShortPages: _optionsForTest.useShortPages
});
} catch (err) {
console.log(err);
if (err instanceof ServiceConnection.ConnectionTimeoutError) {
return null;
} else {
throw err;
var conn = openPackageServerConnection();

var getSomeData = function () {
var sources = [];
if (cachedServerData.collections) {
sources.push(cachedServerData.collections);
}
var syncToken = cachedServerData.syncToken;
var remoteData;
try {
remoteData = loadRemotePackageData(conn, syncToken, {
useShortPages: _optionsForTest.useShortPages
});
} catch (err) {
process.stderr.write("ERROR " + err.message + "\n");
if (err instanceof ServiceConnection.ConnectionTimeoutError) {
cachedServerData = null;
done = true;
return;
} else {
throw err;
}
}
}

// If there is no new data from the server, don't bother writing things to
// disk.
if (_.isEqual(remoteData.collections, {})) {
return cachedServerData;
}
// If there is no new data from the server, don't bother writing things to
// disk.
// XXX fix for resetData?
if (_.isEqual(remoteData.collections, {})) {
done = true;
return;
}

sources.push(remoteData.collections);
var allCollections = mergeCollections(sources);
var data = {
syncToken: remoteData.syncToken,
formatVersion: "1.0",
collections: allCollections
sources.push(remoteData.collections);
var allCollections = mergeCollections(sources);
var data = {
syncToken: remoteData.syncToken,
formatVersion: "1.0",
collections: allCollections
};
writePackageDataToDisk(remoteData.syncToken, data, {
packageStorageFile: _optionsForTest.packageStorageFile
});

cachedServerData = data;
if (remoteData.upToDate)
done = true;
};
writePackageDataToDisk(remoteData.syncToken, data, {
packageStorageFile: _optionsForTest.packageStorageFile
});

// If we are not done, keep trying!
if (!remoteData.upToDate) {
data = this.updateServerPackageData(data, _optionsForTest);
try {
while (!done) {
getSomeData();
}
} finally {
conn.close();
}

return data;
return cachedServerData;
};
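The rewrite above turns a recursive "fetch one page, then call yourself" pattern into a loop over a single long-lived connection: ask the server for everything newer than the sync token, merge the page into the cache, and repeat until the server reports `upToDate`. A self-contained sketch of that pattern with a fake connection (all names illustrative, not Meteor's API):

```javascript
// Page through a chronologically-ordered remote log using a sync token.
function syncAllPages(conn, cached) {
  var data = { syncToken: cached.syncToken, records: cached.records.slice() };
  var done = false;
  while (!done) {
    var page = conn.call('syncNewPackageData', data.syncToken);
    data.records = data.records.concat(page.records);
    data.syncToken = page.syncToken;
    done = page.upToDate;   // the server has nothing newer than syncToken
  }
  return data;
}

// Fake connection serving a chronological log two records per page; the
// sync token is simply an offset into the log.
var log = ['a', 'b', 'c', 'd', 'e'];
var fakeConn = {
  call: function (method, syncToken) {
    var next = log.slice(syncToken, syncToken + 2);
    return {
      records: next,
      syncToken: syncToken + next.length,
      upToDate: syncToken + next.length >= log.length
    };
  }
};
```

Because the token names a position in the log rather than a page number, a client that stops partway can resume later without refetching what it already has.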

// Returns a logged-in DDP connection to the package server, or null if
@@ -334,6 +349,8 @@ var uploadTarball = function (putUrl, tarball) {
exports.uploadTarball = uploadTarball;

var bundleBuild = function (unipackage) {
buildmessage.assertInJob();

var tempDir = files.mkdtemp('build-package-');
var packageTarName = unipackage.tarballName();
var tarInputDir = path.join(tempDir, packageTarName);
@@ -350,7 +367,18 @@ var bundleBuild = function (unipackage) {
files.createTarball(tarInputDir, buildTarball);

var tarballHash = files.fileHash(buildTarball);
var treeHash = files.treeHash(tarInputDir);
var treeHash = files.treeHash(tarInputDir, {
// We don't include any package.json from an npm module in the tree hash,
// because npm isn't super consistent about what it puts in there (eg, does
// it include the "readme" field?). This ends up leading to spurious
// differences. The tree hash will still notice any actual CODE changes in
// the npm packages.
ignore: function (relativePath) {
var pieces = relativePath.split(path.sep);
return pieces.length && _.last(pieces) === 'package.json'
&& _.contains(pieces, 'npm');
}
});

return {
buildTarball: buildTarball,
@@ -362,6 +390,8 @@ var bundleBuild = function (unipackage) {
exports.bundleBuild = bundleBuild;

var createAndPublishBuiltPackage = function (conn, unipackage) {
buildmessage.assertInJob();

process.stdout.write('Creating package build...\n');
var uploadInfo = conn.call('createPackageBuild', {
packageName: unipackage.name,
@@ -370,6 +400,8 @@ var createAndPublishBuiltPackage = function (conn, unipackage) {
});

var bundleResult = bundleBuild(unipackage);
if (buildmessage.jobHasMessages())
return;

process.stdout.write('Uploading build...\n');
uploadTarball(uploadInfo.uploadUrl,
@@ -382,7 +414,7 @@ var createAndPublishBuiltPackage = function (conn, unipackage) {
bundleResult.tarballHash,
bundleResult.treeHash);
} catch (err) {
console.log(err);
process.stderr.write("ERROR " + err.message + "\n");
return;
}

@@ -425,6 +457,8 @@ exports.handlePackageServerConnectionError = function (error) {
//
// Return 0 on success and an error code otherwise.
exports.publishPackage = function (packageSource, compileResult, conn, options) {
buildmessage.assertInJob();

options = options || {};

if (options.new && options.existingVersion)
@@ -477,9 +511,8 @@ exports.publishPackage = function (packageSource, compileResult, conn, options)
process.stderr.write("Publish failed. \n");
return 1;
}
var authorized = _.indexOf(
_.pluck(packRecord.maintainers, 'username'), auth.loggedInUsername());
if (authorized == -1) {

if (!amIAuthorized(name, conn, false)) {
process.stderr.write('You are not an authorized maintainer of ' + name + ".\n");
process.stderr.write('Only authorized maintainers may publish new versions. \n');
return 1;
@@ -534,7 +567,7 @@ exports.publishPackage = function (packageSource, compileResult, conn, options)
});

if (messages.hasMessages()) {
process.stdout.write(messages.formatMessages());
process.stderr.write(messages.formatMessages());
return 1;
}

@@ -564,8 +597,8 @@ exports.publishPackage = function (packageSource, compileResult, conn, options)
name: packageSource.name
});
} catch (err) {
console.log(err.message);
return 1;
process.stderr.write(err.message + "\n");
return 3;
}

}
@@ -598,9 +631,8 @@ exports.publishPackage = function (packageSource, compileResult, conn, options)
try {
var uploadInfo = conn.call('createPackageVersion', uploadRec);
} catch (err) {
console.log("ERROR:", err.message);
console.log("Package could not be published.");
return 1;
process.stderr.write("ERROR " + err.message + "\n");
return 3;
}

// XXX If package version already exists, print a nice error message
@@ -617,8 +649,8 @@ exports.publishPackage = function (packageSource, compileResult, conn, options)
{ tarballHash: sourceBundleResult.tarballHash,
treeHash: sourceBundleResult.treeHash });
} catch (err) {
console.log(err.message);
return 1;
process.stderr.write("ERROR " + err.message + "\n");
return 3;
}

}
@@ -627,3 +659,29 @@ exports.publishPackage = function (packageSource, compileResult, conn, options)

return 0;
};

// Call the server to ask if we are authorized to update this release or
// package. This is a way to save time before sending data to the server. It
// will ignore most errors (just in case we have a flaky network connection or
// something) and let the method deal with those.
//
// If this returns FALSE, then we are NOT authorized.
// Otherwise, return true.
var amIAuthorized = function (name, conn, isRelease) {
var methodName = "amIAuthorized" +
(isRelease ? "Release" : "Package");

try {
conn.call(methodName, name);
} catch (err) {
if (err.error === 401) {
return false;
}

// We don't know what this error is. Probably we can't contact the server,
// or the like. It would be a pity to fail all operations with the server
// just because a preliminary check fails, so return true for now.
return true;
}
return true;
};

@@ -36,6 +36,7 @@ _.extend(exports.PackageLoader.prototype, {
// XXX rename to throwOnNotFound
getPackage: function (name, options) {
var self = this;
buildmessage.assertInCapture();

options = options || {};
if (options.throwOnError === undefined) {
@@ -87,6 +88,7 @@ _.extend(exports.PackageLoader.prototype, {
// getPackageLoadPath / getPackageVersionLoadPath?
getLoadPathForPackage: function (name) {
var self = this;
buildmessage.assertInCapture();

if (self.uniloadDir) {
var packagePath = path.join(self.uniloadDir, name);
@@ -116,6 +118,7 @@ _.extend(exports.PackageLoader.prototype, {
// that package at the right architecture.
getUnibuild: function (packageName, arch) {
var self = this;
buildmessage.assertInCapture();

var pkg = self.getPackage(packageName, { throwOnError: true });
return pkg.getUnibuildAtArch(arch);

@@ -1,6 +1,7 @@
var fs = require('fs');
var path = require('path');
var _ = require('underscore');
var semver = require('semver');
var sourcemap = require('source-map');

var files = require('./files.js');
@@ -34,10 +35,10 @@ var earliestCompatible = function (version) {
// This is not the place to check to see if version parses as
// semver. That should have been done when we first received it from
// the user.
var m = version.match(/^(\d)+\./);
if (! m)
var parsed = semver.parse(version);
if (! parsed)
throw new Error("not a valid version: " + version);
return m[1] + ".0.0";
return parsed.major + ".0.0";
};
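One plausible reason for switching to `semver.parse` above, beyond robustness: in the old regex `/^(\d)+\./` the `+` sits outside the capture group, so for a multi-digit major version `m[1]` holds only the last digit ("12.3.0" would yield "2.0.0"). A corrected regex version is shown below for illustration only; the diff's `semver` dependency is an npm package and is not reproduced here.

```javascript
// earliestCompatible maps a version to the earliest version sharing its
// major number. Note the + is now *inside* the capture group.
function earliestCompatible(version) {
  var m = version.match(/^(\d+)\./);
  if (!m)
    throw new Error('not a valid version: ' + version);
  return m[1] + '.0.0';
}
```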

// Returns a sort comparator to order files into load order.
@@ -92,6 +93,17 @@ var loadOrderSort = function (templateExtensions) {
};
};

// XXX We currently have a 1 to 1 mapping between 'where' and 'arch'.
// In the future, we may let people specify different 'where' and 'arch'.
var mapWhereToArch = function (where) {
if (where === 'server') {
return 'os';
} else {
// Transform client.* into web.*
return 'web.' + where.split('.').slice(1).join('.');
}
};
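The mapping above is mechanical, and a standalone copy makes its behavior easy to check. This mirrors the function in the hunk for illustration:

```javascript
// 'server' builds for the 'os' arch; any client.* where becomes web.*.
function mapWhereToArch(where) {
  if (where === 'server')
    return 'os';
  return 'web.' + where.split('.').slice(1).join('.');
}
```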

///////////////////////////////////////////////////////////////////////////////
// SourceArch
///////////////////////////////////////////////////////////////////////////////
@@ -200,7 +212,7 @@ var PackageSource = function () {

// Path that will be prepended to the URLs of all resources emitted
// by this package (assuming they don't end up getting
// concatenated). For non-client targets, the only effect this will
// concatenated). For non-web targets, the only effect this will
// have is to change the actual on-disk paths of the files in the
// bundle, for those that care to open up the bundle and look (but
// it's still nice to get it right).
@@ -299,7 +311,7 @@ var PackageSource = function () {
// this option transparent to the user in package.js.
self.noVersionFile = false;

// The list of archs that we can build for. Doesn't include 'client' because
// The list of wheres that we can target. Doesn't include 'client' because
// it is expanded into 'client.*'.
self.allWheres = ['server', 'client.browser', 'client.cordova'];
};
@@ -419,6 +431,10 @@ _.extend(PackageSource.prototype, {
}
}

if (!utils.validPackageName(self.name)) {
buildmessage.error("Package name invalid: " + self.name);
return;
}

if (! fs.existsSync(self.sourceRoot))
throw new Error("putative package directory " + dir + " doesn't exist?");
@@ -694,7 +710,7 @@ _.extend(PackageSource.prototype, {
cordovaDependencies = null;
}

if (! self.version && options.requireVersion) {
if (self.version === null && options.requireVersion) {
if (! buildmessage.jobHasMessages()) {
// Only write the error if there have been no errors so
// far. (Otherwise if there is a parse error we'll always get
@@ -713,13 +729,39 @@ _.extend(PackageSource.prototype, {
// not like we didn't already have to think about this case.
}

if (self.version && ! self.earliestCompatibleVersion) {
self.earliestCompatibleVersion =
earliestCompatible(self.version);
if (self.version !== null && typeof(self.version) !== "string") {
if (!buildmessage.jobHasMessages()) {
buildmessage.error("The package version (specified with "
+ "Package.describe) must be a string.");
}
// Recover by pretending there was no version (see above).
self.version = null;
}

if (!utils.validPackageName(self.name)) {
buildmessage.error("Package name invalid: " + self.name);
if (self.version !== null) {
var parsedVersion = semver.parse(self.version);
if (!parsedVersion) {
if (!buildmessage.jobHasMessages()) {
buildmessage.error(
"The package version (specified with Package.describe) must be "
+ "valid semver (see http://semver.org/).");
}
// Recover by pretending there was no version (see above).
self.version = null;
} else if (parsedVersion.build.length) {
if (!buildmessage.jobHasMessages()) {
buildmessage.error(
"The package version (specified with Package.describe) may not "
+ "contain a plus-separated build ID.");
}
// Recover by pretending there was no version (see above).
self.version = null;
}
}

if (self.version !== null && ! self.earliestCompatibleVersion) {
self.earliestCompatibleVersion =
earliestCompatible(self.version);
}

// source files used
@@ -786,7 +828,7 @@ _.extend(PackageSource.prototype, {
buildmessage.error(
"Invalid 'where' argument: '" + inputWhere + "'",
// skip toWhereArray in addition to the actual API function
{useMyCaller: 1});
{useMyCaller: 2});
}
});
return where;
@@ -797,9 +839,9 @@ _.extend(PackageSource.prototype, {
// used. Can also take literal package objects, if you have
// anonymous packages you want to use (eg, app packages)
//
// @param where 'client', 'client.browser', 'client.cordova', 'server',
// @param where 'web', 'web.browser', 'web.cordova', 'server',
// or an array of those.
// The default is ['client', 'server'].
// The default is ['web', 'server'].
//
// options can include:
//
@@ -893,6 +935,13 @@ _.extend(PackageSource.prototype, {
// you don't fill in dependencies for some of your implies/uses, we will
// look at the packages listed in the release to figure that out.
versionsFrom: function (release) {
if (releaseRecord) {
buildmessage.error("api.versionsFrom may only be specified once.",
{ useMyCaller: true });
// recover by ignoring
return;
}

// If you don't specify a track, use our default.
if (release.indexOf('@') === -1) {
release = catalog.complete.DEFAULT_TRACK + "@" + release;
@@ -906,19 +955,19 @@ _.extend(PackageSource.prototype, {
// catalog may not be initialized, but we are pretty sure that the
// releases are there anyway. This is not the right way to do this
// long term.
releaseRecord = catalog.official.getReleaseVersion(
releaseRecord = catalog.complete.getReleaseVersion(
relInf[0], relInf[1], true);
if (!releaseRecord) {
throw new Error("Unknown release "+ release);
buildmessage.error("Unknown release "+ release);
}
},

// Export symbols from this package.
//
// @param symbols String (eg "Foo") or array of String
// @param where 'client', 'server', 'client.browser', 'client.cordova'
// @param where 'web', 'server', 'web.browser', 'web.cordova'
// or an array of those.
// The default is ['client', 'server'].
// The default is ['web', 'server'].
// @param options 'testOnly', boolean.
export: function (symbols, where, options) {
// Support `api.export("FooTest", {testOnly: true})` without
@@ -1032,9 +1081,10 @@ _.extend(PackageSource.prototype, {
files.rm_recursive(path.join(self.sourceRoot, '.npm', f));
});

// Create source architectures, one for the server and one for each client.
// Create source architectures, one for the server and one for each web
// arch.
_.each(self.allWheres, function (where) {
var arch = (where === 'server') ? 'os' : where;
var arch = mapWhereToArch(where);

// Everything depends on the package 'meteor', which sets up
// the basic environment (except 'meteor' itself, and js-analyze
@@ -1118,8 +1168,9 @@ _.extend(PackageSource.prototype, {
// Determine used packages
var project = require('./project.js').project;
var names = project.getConstraints();
var arch = where === "server" ? "os" : where;
// XXX what about /client.browser/* etc
var arch = mapWhereToArch(where);
// XXX what about /client.browser/* etc, these directories could also
// be for specific client targets.

// Create unibuild
var sourceArch = new SourceArch(self, {
@@ -1238,7 +1289,7 @@ _.extend(PackageSource.prototype, {

// Special case: on the client, JavaScript files in a
// `client/compatibility` directory don't get wrapped in a closure.
if (archinfo.matches(arch, "client") && relPath.match(/\.js$/)) {
if (archinfo.matches(arch, "web") && relPath.match(/\.js$/)) {
var clientCompatSubstr =
path.sep + 'client' + path.sep + 'compatibility' + path.sep;
if ((path.sep + relPath).indexOf(clientCompatSubstr) !== -1)
@@ -1248,7 +1299,7 @@ _.extend(PackageSource.prototype, {
});

// Now look for assets for this unibuild.
var assetDir = archinfo.matches(arch, "client") ? "public" : "private";
var assetDir = archinfo.matches(arch, "web") ? "public" : "private";
var assetDirs = readAndWatchDirectory('', {
include: [new RegExp('^' + assetDir + '/$')]
});
@@ -1383,7 +1434,7 @@ _.extend(PackageSource.prototype, {
});
};

// Both plugins and direct dependencies are objects mapping package name to
// version number. When we write them on disk, we will convert them to
// arrays of <packageName, version> and alphabetized by packageName.
versions["dependencies"] = alphabetize(versions["dependencies"]);

@@ -173,6 +173,7 @@ _.extend(Project.prototype, {
// the package loader for this project. This WILL REWRITE THE VERSIONS FILE.
_ensureDepsUpToDate : function () {
var self = this;
buildmessage.assertInCapture();

// To calculate project dependencies, we need to know what release we are
// on, but to do that, we need to have a rootDirectory. So, we initialize
@@ -198,11 +199,18 @@ _.extend(Project.prototype, {
// Call the constraint solver, using the previous dependencies as the last
// solution. It is useful to set ignoreProjectDeps, but not necessary,
// since self.viableDepSource is false.
var newVersions = catalog.complete.resolveConstraints(
self.combinedConstraints,
{ previousSolution: self.dependencies },
{ ignoreProjectDeps: true }
);
try {
var newVersions = catalog.complete.resolveConstraints(
self.combinedConstraints,
{ previousSolution: self.dependencies },
{ ignoreProjectDeps: true }
);
} catch (err) {
process.stdout.write(
"Could not resolve the specified constraints for this project:\n"
+ err +"\n");
process.exit(1);
}

// Download packages to disk, and rewrite .meteor/versions if it has
// changed.
@@ -213,7 +221,9 @@ _.extend(Project.prototype, {
});

if (!setV.success) {
throw new Error ("Could not install all the requested packages.");
process.stdout.write(
"Could not install all the requested packages. \n");
process.exit(1);
}

// Finally, initialize the package loader.
@@ -354,7 +364,7 @@ _.extend(Project.prototype, {
return 1;

// Show the user the messageLog of packages we added.
if (!self.muted) {
if (!self.muted && !_.isEmpty(versions)) {
_.each(messageLog, function (msg) {
process.stdout.write(msg + "\n");
});
@@ -421,6 +431,7 @@ _.extend(Project.prototype, {
// null if the package is unconstrained.
getCurrentCombinedConstraints : function () {
var self = this;
buildmessage.assertInCapture();
self._ensureDepsUpToDate();
return self.combinedConstraints;
},
@@ -439,6 +450,7 @@ _.extend(Project.prototype, {
// Returns an object mapping package name to its string version.
getVersions : function () {
var self = this;
buildmessage.assertInCapture();
self._ensureDepsUpToDate();
return self.dependencies;
},
@@ -480,6 +492,7 @@ _.extend(Project.prototype, {
// transitive dependencies.
getPackageLoader : function () {
var self = this;
buildmessage.assertInCapture();
self._ensureDepsUpToDate();
return self.packageLoader;
},
@@ -595,6 +608,7 @@ _.extend(Project.prototype, {
// here because this really shouldn't fail (we are just removing things).
removePackages : function (names) {
var self = this;
buildmessage.assertInCapture();
self._removePackageRecords(names);

// Force a recalculation of all the dependencies, and record them to disk.
@@ -615,6 +629,7 @@ _.extend(Project.prototype, {
setVersions: function (newVersions, options) {
var self = this;
options = options || {};
buildmessage.assertInCapture();

var downloaded = self._ensurePackagesExistOnDisk(newVersions);
var ret = {
@@ -671,8 +686,9 @@ _.extend(Project.prototype, {
// that could lead to changes in the versions file.
_ensurePackagesExistOnDisk : function (versions, options) {
var self = this;
buildmessage.assertInCapture();
options = options || {};
var arch = options.arch || archinfo.host();
|
||||
var serverArch = options.serverArch || archinfo.host();
|
||||
var verbose = options.verbose || !self.muted;
|
||||
var downloadedPackages = {};
|
||||
_.each(versions, function (version, name) {
|
||||
@@ -680,7 +696,7 @@ _.extend(Project.prototype, {
|
||||
try {
|
||||
var available = tropohouse.default.maybeDownloadPackageForArchitectures(
|
||||
packageVersionInfo,
|
||||
['browser', arch],
|
||||
[serverArch], // XXX 'web.browser' too?
|
||||
verbose /* print downloading message */
|
||||
);
|
||||
downloadedPackages[name] = version;
|
||||
@@ -711,6 +727,7 @@ _.extend(Project.prototype, {
|
||||
// disk and the operation has failed.
|
||||
addPackages : function (moreDeps, newVersions) {
|
||||
var self = this;
|
||||
buildmessage.assertInCapture();
|
||||
|
||||
// First, we need to make sure that we have downloaded all the packages that
|
||||
// we are going to use. So, go through the versions and call tropohouse to
|
||||
|
||||
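The project.js hunks above wrap constraint resolution in a try/catch so that solver failures print a readable message and exit non-zero instead of surfacing a raw stack trace. A minimal standalone sketch of that pattern, where `resolveConstraints` is a stand-in for `catalog.complete.resolveConstraints` (not the real API):

```javascript
// Stand-in for catalog.complete.resolveConstraints: resolves each named
// package to a fixed version, or throws when given nothing to resolve.
function resolveConstraints(constraints, options) {
  if (!constraints || !constraints.length)
    throw new Error("no constraints given");
  var solution = {};
  constraints.forEach(function (name) { solution[name] = "1.0.0"; });
  return solution;
}

// The pattern from the hunk: catch solver errors, print a friendly
// message, and exit non-zero rather than crashing with a stack trace.
function ensureDepsUpToDate(constraints, previousSolution) {
  var newVersions;
  try {
    newVersions = resolveConstraints(constraints,
                                     { previousSolution: previousSolution });
  } catch (err) {
    process.stdout.write(
      "Could not resolve the specified constraints for this project:\n" +
      err + "\n");
    process.exit(1);
  }
  return newVersions;
}

console.log(JSON.stringify(ensureDepsUpToDate(["meteor"], {})));
```

The same message-then-exit shape replaces the bare `throw` in `setVersions` as well, so the user sees a sentence rather than a traceback.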
@@ -1,8 +1,6 @@
var _ = require('underscore');
var Future = require('fibers/future');
var Fiber = require('fibers');
var files = require('./files.js');
var inFiber = require('./fiber-helpers.js').inFiber;
var release = require('./release.js');

var runLog = require('./run-log.js');

101 tools/run-app.js
@@ -3,13 +3,13 @@ var path = require("path");
var _ = require('underscore');
var Future = require('fibers/future');
var Fiber = require('fibers');
var fiberHelpers = require('./fiber-helpers.js');
var files = require('./files.js');
var watch = require('./watch.js');
var project = require('./project.js').project;
var bundler = require('./bundler.js');
var release = require('./release.js');
var buildmessage = require('./buildmessage.js');
var inFiber = require('./fiber-helpers.js').inFiber;
var runLog = require('./run-log.js');
var catalog = require('./catalog.js');
var packageCache = require('./package-cache.js');
@@ -90,7 +90,7 @@ _.extend(AppProcess.prototype, {

// Send stdout and stderr to the runLog
var eachline = require('eachline');
eachline(self.proc.stdout, 'utf8', function (line) {
eachline(self.proc.stdout, 'utf8', fiberHelpers.inBareFiber(function (line) {
if (line.match(/^LISTENING\s*$/)) {
// This is the child process telling us that it's ready to
// receive connections.
@@ -98,26 +98,26 @@ _.extend(AppProcess.prototype, {
} else {
runLog.logAppOutput(line);
}
});
}));

eachline(self.proc.stderr, 'utf8', function (line) {
eachline(self.proc.stderr, 'utf8', fiberHelpers.inBareFiber(function (line) {
runLog.logAppOutput(line, true);
});
}));

// Watch for exit and for stdio to be fully closed (so that we don't miss
// log lines).
self.proc.on('close', function (code, signal) {
self.proc.on('close', fiberHelpers.inBareFiber(function (code, signal) {
self._maybeCallOnExit(code, signal);
});
}));

self.proc.on('error', function (err) {
self.proc.on('error', fiberHelpers.inBareFiber(function (err) {
runLog.log("=> Couldn't spawn process: " + err.message);

// node docs say that it might make both an 'error' and a
// 'close' callback, so we use a guard to make sure we only call
// onExit once.
self._maybeCallOnExit();
});
}));

// This happens sometimes when we write a keepalive after the app
// is dead. If we don't register a handler, we get a top level
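The hunks above swap plain callbacks for `fiberHelpers.inBareFiber(...)` wrappers wherever a Node stream or process event handler fires, so each callback runs on its own fiber. A dependency-free sketch of the wrapper's shape; the real helper runs `fn` inside `new Fiber(...)` from the `fibers` package, which is stubbed out here:

```javascript
// Minimal sketch of the shape of a helper like fiberHelpers.inBareFiber:
// it returns a wrapper with the same arguments as the original callback,
// so it can be handed directly to eachline() or proc.on(). The real
// helper runs fn inside a fresh Fiber; that part is simulated here so
// the sketch needs no dependencies.
function inBareFiber(fn) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    // Real implementation: new Fiber(function () { fn.apply(null, args); }).run();
    fn.apply(null, args);
  };
}

// Usage mirroring the hunk: wrap a per-line handler before attaching it.
var seen = [];
var onLine = inBareFiber(function (line) {
  if (/^LISTENING\s*$/.test(line)) seen.push("ready");
  else seen.push(line);
});
["LISTENING", "hello"].forEach(onLine);
console.log(seen.join(","));
```

Because the wrapper preserves the callback's arguments, the `('close', ...)` and `('error', ...)` handlers above keep receiving `(code, signal)` and `(err)` unchanged.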
@@ -355,7 +355,9 @@ _.extend(AppRunner.prototype, {
throw new Error("already started?");

self.startFuture = new Future;
self.fiber = new Fiber(function () {
// XXX I think it's correct to not try to use bindEnvironment here:
// the extra fiber should be independent of this one.
self.fiber = Fiber(function () {
self._fiber();
});
self.fiber.run();
@@ -418,12 +420,20 @@ _.extend(AppRunner.prototype, {

// Bundle up the app
var bundlePath = path.join(self.appDir, '.meteor', 'local', 'build');
if (self.recordPackageUsage)
stats.recordPackages(self.appDir);
if (self.recordPackageUsage) {
var statsMessages = buildmessage.capture(function () {
stats.recordPackages(self.appDir);
});
if (statsMessages.hasMessages()) {
process.stdout.write("Error talking to stats server:\n" +
statsMessages.formatMessages());
// ... but continue;
}
}

// Cache the server target because the server will not change inside
// a single invocation of _runOnce().
var cachedBundle;
var cachedServerWatchSet;
var bundleApp = function () {
if (! self.firstRun)
packageCache.packageCache.refresh(true); // pick up changes to packages
@@ -432,16 +442,16 @@ _.extend(AppRunner.prototype, {
outputPath: bundlePath,
includeNodeModulesSymlink: true,
buildOptions: self.buildOptions,
hasCachedBundle: !! cachedBundle
hasCachedBundle: !! cachedServerWatchSet
});

// Overwrite the null elements in the new bundle with the elements from
// the cached bundle. However, we make sure to keep the serverWatchSet
// from the original cached bundle.
if (cachedBundle) {
bundle.serverWatchSet = cachedBundle.serverWatchSet;
// Keep the server watch set from the initial bundle, because subsequent
// bundles will not contain a server target.
if (cachedServerWatchSet) {
bundle.serverWatchSet = cachedServerWatchSet;
} else {
cachedServerWatchSet = bundle.serverWatchSet;
}
cachedBundle = _.defaults(bundle, cachedBundle);

return bundle;
};
@@ -561,34 +571,37 @@ _.extend(AppRunner.prototype, {
// source file to change. Or, for stop() to be called.
var ret = runFuture.wait();

while (ret.outcome === 'changed-refreshable') {
// We stay in this loop as long as only refreshable assets have changed.
// When ret.refreshable becomes false, we restart the server.
bundleResult = bundleApp();
if (bundleResult.errors) {
return {
outcome: 'bundle-fail',
bundleResult: bundleResult
};
try {
while (ret.outcome === 'changed-refreshable') {
// We stay in this loop as long as only refreshable assets have changed.
// When ret.refreshable becomes false, we restart the server.
bundleResult = bundleApp();
if (bundleResult.errors) {
return {
outcome: 'bundle-fail',
bundleResult: bundleResult
};
}

// Establish a watcher on the new files.
setupClientWatcher();

// Notify the server that new client assets have been added to the build.
process.kill(appProcess.proc.pid, 'SIGUSR2');
runLog.logClientRestart();

self.runFuture = new Future;
ret = self.runFuture.wait();
}
} finally {
self.runFuture = null;

// Establish a watcher on the new files.
setupClientWatcher();
self.proxy.setMode("hold");
appProcess.stop();

// Notify the server that new client assets have been added to the build.
process.kill(appProcess.proc.pid, 'SIGUSR2');
runLog.logClientRestart();

self.runFuture = new Future;
ret = self.runFuture.wait();
serverWatcher && serverWatcher.stop();
clientWatcher && clientWatcher.stop();
}
self.runFuture = null;

self.proxy.setMode("hold");
appProcess.stop();

serverWatcher && serverWatcher.stop();
clientWatcher && clientWatcher.stop();

return ret;
},

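The last hunk above moves the refreshable-change loop inside a try block so the proxy, the app process, and the file watchers are torn down on every exit path, including early returns. A self-contained sketch of that control-flow change; the names (`runOnce`, `cleanup`) are illustrative stand-ins, not the real tool APIs:

```javascript
// Sketch of the try/finally restructuring in _runOnce: teardown runs in
// finally, so it happens exactly once whether the loop returns normally,
// returns early on a bundle failure, or throws.
function runOnce(events, cleanup) {
  var i = 0;
  var ret = events[i++];
  try {
    while (ret === 'changed-refreshable') {
      // Only client assets changed: refresh and keep waiting.
      ret = events[i++];
    }
    return ret;
  } finally {
    cleanup();  // runs on every exit path out of the try block
  }
}

var cleanups = 0;
var outcome = runOnce(['changed-refreshable', 'terminated'],
                      function () { cleanups++; });
console.log(outcome + " after " + cleanups + " cleanup");
```

Before this change, an early `return { outcome: 'bundle-fail', ... }` skipped the watcher teardown entirely; the finally block closes that gap.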
@@ -6,12 +6,10 @@ var utils = require('./utils.js');
var release = require('./release.js');
var mongoExitCodes = require('./mongo-exit-codes.js');
var fiberHelpers = require('./fiber-helpers.js');
var inFiber = fiberHelpers.inFiber;
var runLog = require('./run-log.js');

var _ = require('underscore');
var uniload = require('./uniload.js');
var Fiber = require('fibers');
var Future = require('fibers/future');

// Given a Mongo URL, open an interactive Mongo shell on this terminal
@@ -309,7 +307,7 @@ var launchMongo = function (options) {
}
});

procExitHandler = inFiber(function (code, signal) {
procExitHandler = fiberHelpers.bindEnvironment(function (code, signal) {
// Defang subHandle.stop().
proc = null;

@@ -338,7 +336,7 @@ var launchMongo = function (options) {
}
};

var stdoutOnData = inFiber(function (data) {
var stdoutOnData = fiberHelpers.bindEnvironment(function (data) {
// note: don't use "else ifs" in this, because 'data' can have multiple
// lines
if (/config from self or any seed \(EMPTYCONFIG\)/.test(data)) {
@@ -629,7 +627,7 @@ _.extend(MongoRunner.prototype, {

if (self.errorCount < 3) {
// Wait a second, then restart.
self.restartTimer = setTimeout(inFiber(function () {
self.restartTimer = setTimeout(fiberHelpers.bindEnvironment(function () {
self.restartTimer = null;
self._startOrRestart();
}), 1000);

@@ -1,6 +1,6 @@
var _ = require('underscore');
var Fiber = require('fibers');
var inFiber = require('./fiber-helpers.js').inFiber;
var fiberHelpers = require('./fiber-helpers.js');

var Updater = function () {
var self = this;
@@ -17,12 +17,14 @@ _.extend(Updater.prototype, {
if (self.timer)
throw new Error("already running?");

// Check twice a day.
self.timer = setInterval(inFiber(function () {
// Check twice a day. (Should not share buildmessage state with
// the main fiber.)
self.timer = setInterval(fiberHelpers.inBareFiber(function () {
self._check();
}), 12*60*60*1000);

// Also start a check now, but don't block on it.
// Also start a check now, but don't block on it. (This should
// not share buildmessage state with the main fiber.)
new Fiber(function () {
self._check();
}).run();

@@ -10,6 +10,8 @@ var archinfo = require('./archinfo.js');
var packageLoader = require('./package-loader.js');
var Future = require('fibers/future');
var uniload = require('./uniload.js');
var config = require('./config.js');
var buildmessage = require('./buildmessage.js');
var util = require('util');
var child_process = require('child_process');
var webdriver = require('browserstack-webdriver');
@@ -64,14 +66,14 @@ var expectThrows = markStack(function (f) {
});

var getToolsPackage = function () {
buildmessage.assertInCapture();
// Rebuild the tool package --- necessary because we don't actually
// rebuild the tool in the cached version every time.
if (catalog.complete.rebuildLocalPackages([toolPackageName]) !== 1) {
throw Error("didn't rebuild meteor-tool?");
}
var loader = new packageLoader.PackageLoader({versions: null});
var toolPackage = loader.getPackage(toolPackageName);
return toolPackage;
return loader.getPackage(toolPackageName);
};

// Execute a command synchronously, discarding stderr.
@@ -82,6 +84,15 @@ var execFileSync = function (binary, args, opts) {
})().wait();
};

var captureAndThrow = function (f) {
var messages = buildmessage.capture(function () {
f();
});
if (messages.hasMessages()) {
throw Error(messages.formatMessages());
}
};

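The `captureAndThrow` helper added above converts buildmessage-style error reporting (accumulate messages, check afterwards) into a single thrown Error. A standalone sketch of that control flow, with a tiny message collector standing in for `buildmessage.capture` (the real API is not reproduced here):

```javascript
// Sketch of captureAndThrow: run f, gather any messages it reports, and
// surface them all as one thrown Error. The collector object is a
// stand-in for buildmessage's capture context.
function captureAndThrow(f) {
  var messages = [];
  var collector = { error: function (msg) { messages.push(msg); } };
  f(collector);
  if (messages.length)
    throw new Error(messages.join("\n"));
}

// Usage: build steps report problems instead of throwing individually,
// and the caller gets one aggregated failure.
try {
  captureAndThrow(function (m) {
    m.error("could not build meteor-tool");
  });
} catch (e) {
  console.log("caught: " + e.message);
}
```

This is why `_makeWarehouse` below can wrap the whole tool-package build in one `captureAndThrow(...)` call instead of checking for messages after each step.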
///////////////////////////////////////////////////////////////////////////////
// Matcher
///////////////////////////////////////////////////////////////////////////////
@@ -355,6 +366,14 @@ var Sandbox = function (options) {
self.env = {};
self.fakeMongo = options.fakeMongo;

// By default, tests use the package server that this meteor binary is built
// with. If a test is tagged 'test-package-server', it uses the test
// server. Tests that publish packages should have this flag; tests that
// assume that the release's packages can be found on the server should not.
if (runningTest.tags['test-package-server']) {
self.set('METEOR_PACKAGE_SERVER_URL', 'https://test-packages.meteor.com');
}

if (_.has(options, 'warehouse')) {
if (!files.inCheckout())
throw Error("make only use a fake warehouse in a checkout");
@@ -525,11 +544,6 @@ _.extend(Sandbox.prototype, {
self.write(to, contents);
},

mkdir: function (dirPath) {
var self = this;
return fs.mkdirSync(path.join(self.cwd, dirPath));
},

// Delete a file in the sandbox. 'filename' is as in write().
unlink: function (filename) {
var self = this;
@@ -589,12 +603,14 @@ _.extend(Sandbox.prototype, {
// releases containing that tool only (and no packages).
//
// packageServerUrl indicates which package server we think we are using. Use
// the default, if we do not pass this in -- which means that if you
// initialize a warehouse w/o specifying the packageServerUrl and *then* set a
// new PACKAGE_SERVER_URL environment variable, you will be sad.
_makeWarehouse: function (releases, packageServerUrl) {
// the default, if we do not pass this in; you should pass it in any case that
// you will be specifying $METEOR_PACKAGE_SERVER_URL in the environment of a
// command you are running in this sandbox.
_makeWarehouse: function (releases) {
var self = this;
files.mkdir_p(path.join(self.warehouse, 'packages'), 0755);
var serverUrl = self.env.METEOR_PACKAGE_SERVER_URL;
var packagesDirectoryName = config.getPackagesDirectoryName(serverUrl);
files.mkdir_p(path.join(self.warehouse, packagesDirectoryName), 0755);
files.mkdir_p(path.join(self.warehouse, 'package-metadata', 'v1'), 0755);

var stubCatalog = {
@@ -615,15 +631,18 @@ _.extend(Sandbox.prototype, {
// be building some packages besides meteor-tool (so that we can
// build apps that contain core packages).

var toolPackage = getToolsPackage();
var toolPackageDirectory =
'.' + toolPackage.version + '.XXX++'
+ toolPackage.buildArchitectures();
toolPackage.saveToPath(path.join(self.warehouse, 'packages',
toolPackageName, toolPackageDirectory));
var toolPackage, toolPackageDirectory;
captureAndThrow(function () {
toolPackage = getToolsPackage();
toolPackageDirectory = '.' + toolPackage.version + '.XXX++'
+ toolPackage.buildArchitectures();
toolPackage.saveToPath(path.join(self.warehouse, packagesDirectoryName,
toolPackageName, toolPackageDirectory));
});

fs.symlinkSync(toolPackageDirectory,
path.join(self.warehouse, 'packages', toolPackageName,
toolPackage.version));
path.join(self.warehouse, packagesDirectoryName,
toolPackageName, toolPackage.version));
stubCatalog.collections.packages.push({
name: toolPackageName,
_id: utils.randomToken()
@@ -653,14 +672,14 @@ _.extend(Sandbox.prototype, {
});

// Now create each requested release.
_.each(releases, function (config, releaseName) {
_.each(releases, function (configuration, releaseName) {
// Release info
stubCatalog.collections.releaseVersions.push({
track: catalog.complete.DEFAULT_TRACK,
version: releaseName,
orderKey: releaseName,
description: "test release " + releaseName,
recommended: !!config.recommended,
recommended: !!configuration.recommended,
// XXX support multiple tools packages for springboard tests
tool: toolPackageName + "@" + toolPackage.version,
packages: {}
@@ -718,7 +737,7 @@ _.extend(Sandbox.prototype, {
// Insert into builds. Assume the package is available for all
// architectures.
stubCatalog.collections.builds.push({
buildArchitectures: "browser+os",
buildArchitectures: "web.browser+os",
versionId: versionRec._id,
build: buildRec.build,
_id: utils.randomToken()
@@ -726,14 +745,14 @@ _.extend(Sandbox.prototype, {
});
catalog.official.offline = oldOffline;

var config = require("./config.js");
var dataFile = config. getLocalPackageCacheFilename(packageServerUrl);
var dataFile = config.getLocalPackageCacheFilename(serverUrl);
fs.writeFileSync(
path.join(self.warehouse, 'package-metadata', 'v1', dataFile),
JSON.stringify(stubCatalog, null, 2));

// And a cherry on top
fs.symlinkSync(path.join('packages', toolPackageName, toolPackage.version,
fs.symlinkSync(path.join(packagesDirectoryName,
toolPackageName, toolPackage.version,
'meteor-tool-' + archinfo.host(), 'meteor'),
path.join(self.warehouse, 'meteor'));
}
@@ -1026,6 +1045,7 @@ _.extend(Run.prototype, {
self._ensureStarted();

var timeout = self.baseTimeout + self.extraTime;
timeout *= utils.timeoutScaleFactor;
self.extraTime = 0;
return self.stdoutMatcher.match(pattern, timeout, _strict);
}),
@@ -1036,6 +1056,7 @@ _.extend(Run.prototype, {
self._ensureStarted();

var timeout = self.baseTimeout + self.extraTime;
timeout *= utils.timeoutScaleFactor;
self.extraTime = 0;
return self.stderrMatcher.match(pattern, timeout, _strict);
}),
@@ -1081,6 +1102,7 @@ _.extend(Run.prototype, {
self._ensureStarted();

var timeout = self.baseTimeout + self.extraTime;
timeout *= utils.timeoutScaleFactor;
self.extraTime = 0;
self.expectExit();

@@ -1098,6 +1120,7 @@ _.extend(Run.prototype, {

if (self.exitStatus === undefined) {
var timeout = self.baseTimeout + self.extraTime;
timeout *= utils.timeoutScaleFactor;
self.extraTime = 0;

var fut = new Future;
@@ -1556,5 +1579,6 @@ _.extend(exports, {
expectEqual: expectEqual,
expectThrows: expectThrows,
getToolsPackage: getToolsPackage,
execFileSync: execFileSync
execFileSync: execFileSync,
captureAndThrow: captureAndThrow
});

@@ -9,6 +9,7 @@ var project = require("./project.js");
var auth = require("./auth.js");
var ServiceConnection = require("./service-connection.js");
var release = require("./release.js");
var buildmessage = require("./buildmessage.js");

// The name of the package that you add to your app to opt out of
// sending stats.
@@ -33,6 +34,7 @@ var optOutPackageName = "package-stats-opt-out";
// that it is pointing to a root directory with an existing
// .meteor/versions file.
var packageList = function (_currentProjectForTest) {
buildmessage.assertInCapture();
var directDeps = (_currentProjectForTest || project.project).getConstraints();

var versions;
@@ -55,6 +57,7 @@ var packageList = function (_currentProjectForTest) {
};

var recordPackages = function () {
buildmessage.assertInCapture();
// Before doing anything, look at the app's dependencies to see if the
// opt-out package is there; if present, we don't record any stats.
var packages = packageList();
@@ -69,6 +72,7 @@ var recordPackages = function () {
// We do this inside a new fiber to avoid blocking anything on talking
// to the package stats server. If we can't connect, for example, we
// don't care; we'll just miss out on recording these packages.
// This also gives it its own buildmessage state.
Fiber(function () {

var userAgentInfo = {

@@ -2,6 +2,7 @@ var _ = require('underscore');
var release = require('./release.js');
var uniload = require('./uniload.js');
var config = require('./config.js');
var utils = require('./utils.js');

var randomString = function (charsCount) {
var chars = 'abcdefghijklmnopqrstuvwxyz';
@@ -12,7 +13,7 @@ var randomString = function (charsCount) {
return str;
};

exports.accountsCommandTimeoutSecs = 15;
exports.accountsCommandTimeoutSecs = 15 * utils.timeoutScaleFactor;

exports.randomString = randomString;

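The selftest and utils hunks above apply one convention to every test timeout: scale `(baseTimeout + extraTime)` by a global factor, then consume `extraTime` so it only applies to the next wait. A small sketch of that arithmetic, with `timeoutScaleFactor` standing in for `utils.timeoutScaleFactor`:

```javascript
// Sketch of the timeout convention from the hunks above: each matcher
// wait uses (baseTimeout + extraTime) * scale, and extraTime is reset
// to 0 once used. The factor can be raised for slow machines without
// touching every test.
var timeoutScaleFactor = 2;

function nextTimeout(run) {
  var timeout = run.baseTimeout + run.extraTime;
  timeout *= timeoutScaleFactor;
  run.extraTime = 0;  // extra time applies only to the next wait
  return timeout;
}

var run = { baseTimeout: 10, extraTime: 5 };
console.log(nextTimeout(run));  // 30: first wait gets the extra time
console.log(nextTimeout(run));  // 20: back to the scaled base
```

The `accountsCommandTimeoutSecs` change is the same idea applied to a fixed constant: `15 * utils.timeoutScaleFactor` instead of a hard-coded 15 seconds.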
@@ -25,7 +25,7 @@ if (Meteor.isClient) {
oldBackgroundColor = backgroundColor();
Meteor.call("newStylesheet", ++numCssChanges, oldBackgroundColor);
waitingForCssReloadToComplete = false;
Meteor.clearTimeout(handle);
Meteor.clearInterval(handle);
}
}, 500);
}
@@ -44,4 +44,4 @@ if (Meteor.isServer) {
console.log("background-color: " + backgroundColor);
}
});
}
}

@@ -0,0 +1 @@
<head><title>bla</title></head>
@@ -3,9 +3,10 @@ var Sandbox = selftest.Sandbox;
var files = require('../files.js');
var _ = require('underscore');

// Add packages to an app. Change the contents of the packages and their
// dependencies, make sure that the app still refreshes.
selftest.define('constraint solver benchmark', ['slow'], function () {
// Runs all of the constraint-solver tests, including ones that tie up the CPU
// for too long to safely run in the normal test-packages run.
// Only run from checkouts, because test-packages only works on local packages.
selftest.define('constraint solver benchmark', ['slow', 'checkout'], function () {
var s = new Sandbox();
s.set('CONSTRAINT_SOLVER_BENCHMARK', 't');
var run = s.run("test-packages",

@@ -37,20 +37,20 @@ selftest.define("css hot code push", function (options) {
// rgba(0, 0, 0, 0).
run.match(/background-color: (transparent|rgba\(0, 0, 0, 0\))/);

// The server restarts if a new css file is added.
// The server does NOT restart if a new css file is added.
s.write("test.css", "body { background-color: red; }");
run.match("server restarted");
run.match("Client modified -- refreshing");
run.match("numCssChanges: 1");
run.match("background-color: rgb(255, 0, 0)");

s.write("test.css", "body { background-color: blue; }");
run.match("refreshing");
run.match("Client modified -- refreshing");
run.match("numCssChanges: 2");
run.match("background-color: rgb(0, 0, 255)");

// The server restarts if a css file is removed.
// The server does NOT restart if a css file is removed.
s.unlink("test.css");
run.match("server restarted");
run.match("Client modified -- refreshing");
run.match("numCssChanges: 3");
run.match(/background-color: (transparent|rgba\(0, 0, 0, 0\))/);
run.stop();
@@ -119,6 +119,24 @@ selftest.define("javascript hot code push", function (options) {
run.match("client connected: 0");
run.match("jsVar: undefined");

// Break the HTML file. This should kill the server, and print errors.
// (It would be reasonable behavior for this to NOT kill the server, since
// it only affects the client. But this is a regression test for a bug where
// fixing the HTML file wouldn't actually restart the server; that's the
// important part of this test.)
s.write("hot-code-push-test.html", ">");
run.match("Errors prevented startup");
run.match("bad formatting in HTML template");
// Fix it. It should notice, and restart. The client will restart too.
s.write("hot-code-push-test.html", "");
run.match("server restarted");
run.match("client connected: 0");
// Write something else to it. The client should restart.
s.write("hot-code-push-test.html", "<head><title>foo</title></head>");
run.match("Client modified -- refreshing");
run.match("client connected: 1");
run.match("jsVar: undefined");

// Add appcache and ensure that the browser still reloads.
s.write(".meteor/packages", "standard-app-packages \n appcache");
run.match("added appcache");
