You can still include them on the client, but they don’t work in
Safari 4 and IE 8, because semver.js uses ES5 methods including
String#trim, Array#map/filter/forEach, and possibly others.
This should fix any unit test failures in these packages.
This unit test demonstrates a 20-second solving time. Thanks to the
CatalogCache abstraction, the data provided to the solver in the test
is exactly the data it gets when running the “meteor” command in a test
app with a bunch of packages in .meteor/packages and no .meteor/versions
file.
The test is hidden behind an environment variable:
CONSTRAINT_SOLVER_SLOW_TESTS
Previously, “meteor update foo” meant “ignore .meteor/versions for foo”,
which would upgrade if “foo” was a root dependency, and downgrade if foo
was only a transitive dependency.
Now, we make sure to try to upgrade foo even if it is not a root
dependency.
See #3282.
CS.Input is a serializable representation of the “problem.” It includes
the arguments to PackagesResolver#resolve, and also the catalog data
loaded into the CatalogCache. It’s independent of the solver, and
doesn’t even know about PackagesResolver or Resolver.
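A serializable “problem” of this sort might look like the following sketch. The field names here are illustrative assumptions, not the actual CS.Input API; the point is that the whole input, including the catalog data, is plain data that round-trips through JSON without the solver.

```javascript
// Illustrative sketch (field names are assumptions, not the real CS.Input):
// everything the solver needs, as plain serializable data.
var input = {
  dependencies: ["foo", "bar"],              // root dependencies
  constraints: ["foo@1.0.0", "bar@=2.0.0"],  // root constraints
  upgrade: ["foo"],                          // packages to try to upgrade
  previousSolution: { foo: "1.0.0" },        // contents of .meteor/versions
  catalogCache: {                            // dependency data per version
    "foo 1.0.0": ["bar@2.0.0"],
    "bar 2.0.0": []
  }
};

// Because it is plain data, it can be serialized and re-read
// independently of PackagesResolver or Resolver.
var roundTripped = JSON.parse(JSON.stringify(input));
```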
Along the way, get rid of the _testing and _debug flags. “_testing”
came about to avoid running the real cost function on some of the unit
tests, but it doesn’t actually seem to matter anymore for correctness
or performance of the tests. “_debug” was just used to enable some
console.logs, and possibly shouldn’t have been committed in the first
place.
Don’t mutate the “options” object in PackageResolver#resolve, and don’t
pass it on to _getResolverOptions.
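The non-mutation rule can be sketched as follows. The function and option names are hypothetical; the pattern is simply to copy before deriving, so the caller’s object is never written to.

```javascript
// Hypothetical sketch: derive resolver options from the caller's options
// without ever writing to the object the caller passed in.
function getResolverOptions(options) {
  options = options || {};
  return {
    anticipatedPrereleases: options.anticipatedPrereleases || {},
    upgrade: (options.upgrade || []).slice() // copy, don't alias
  };
}

var callerOptions = { upgrade: ["foo"] };
var derived = getResolverOptions(callerOptions);
derived.upgrade.push("bar");
// callerOptions.upgrade is still ["foo"]
```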
At this point, this is rearranging deck chairs on the Titanic, but
making this code more understandable helps me replace it.
PackageResolver no longer loads data from the Catalog. Instead, it
tells CatalogLoader what to load, and it sets up the Resolver based
on what it finds in the CatalogCache.
PackageResolver now creates the Resolver inside resolve(…). If the tool
were to invoke resolve(…) multiple times on the same PackageResolver
(which it doesn’t at the moment), the CatalogCache would persist, but
not the Resolver. (Note that PackageResolver#resolve makes multiple
calls to the same Resolver#resolve internally.)
The purpose of this change is to stop using Resolver to store the
dependency graph. Resolver will be replaced with a logic-solver-based
implementation that will not represent the graph as is, but instead
encode the graph as a satisfiability problem. Meanwhile, CatalogCache
is better at storing the graph than Resolver was, because it is easy
to populate, query, and serialize.
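The idea of encoding the graph as a satisfiability problem can be illustrated with a toy example. This is not the real implementation (which would use a proper SAT solver); it is a sketch showing the shape of the encoding: one boolean variable per package version, and clauses for “at most one version per package,” “a chosen version implies its dependencies,” and “some version of each root dependency is chosen.”

```javascript
// Toy encoding: one boolean variable per package version, plus clauses
// in conjunctive normal form (each clause is a list of literals).
var vars = ["foo 1.0.0", "foo 2.0.0", "bar 1.0.0"];
var clauses = [
  // at most one version of foo: NOT(foo 1.0.0) OR NOT(foo 2.0.0)
  [{ v: "foo 1.0.0", neg: true }, { v: "foo 2.0.0", neg: true }],
  // foo 2.0.0 depends on bar: NOT(foo 2.0.0) OR (bar 1.0.0)
  [{ v: "foo 2.0.0", neg: true }, { v: "bar 1.0.0", neg: false }],
  // root dependency: some version of foo must be chosen
  [{ v: "foo 1.0.0", neg: false }, { v: "foo 2.0.0", neg: false }]
];

// Tiny brute-force satisfiability check; fine for a sketch.
function satisfiable(vars, clauses) {
  for (var mask = 0; mask < (1 << vars.length); mask++) {
    var assignment = {};
    vars.forEach(function (v, i) { assignment[v] = !!(mask & (1 << i)); });
    var ok = clauses.every(function (clause) {
      return clause.some(function (lit) {
        return lit.neg ? !assignment[lit.v] : assignment[lit.v];
      });
    });
    if (ok) return assignment;
  }
  return null;
}

var solution = satisfiable(vars, clauses);
```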
This change brings us back to a functional “devel”.
This is a breaking change to package-version-parser.
A PackageConstraint used to look like this:
```
{
  name: String,
  constraintString: String,
  constraints: [{
    version: String|null,
    type: String
  }]
}
```
Now it looks like this:
```
{
  name: String,
  constraintString: String,
  vConstraint: {
    raw: String,
    alternatives: [{
      versionString: String|null,
      type: String
    }]
  }
}
```
Where (vConstraint instanceof VersionConstraint) and
(vConstraint.raw === constraintString).
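The shape and its invariants can be sketched with a stand-in implementation. This is not the real package-version-parser code; the constraint types ("exactly", "compatible-with", "any-reasonable") and parsing rules here are simplified assumptions.

```javascript
// Stand-in for the described shape, not the real parser.
function VersionConstraint(raw) {
  this.raw = raw;
  this.alternatives = raw.split("||").map(function (piece) {
    piece = piece.trim();
    if (piece === "" || piece === "any-reasonable")
      return { versionString: null, type: "any-reasonable" };
    if (piece.charAt(0) === "=")
      return { versionString: piece.slice(1), type: "exactly" };
    return { versionString: piece, type: "compatible-with" };
  });
}

function PackageConstraint(name, constraintString) {
  this.name = name;
  this.constraintString = constraintString;
  this.vConstraint = new VersionConstraint(constraintString);
}

var pc = new PackageConstraint("foo", "1.0.0 || =2.0.0");
// pc.vConstraint instanceof VersionConstraint
// pc.vConstraint.raw === pc.constraintString
```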
This achieves several desirable changes at once.
* `constraint.constraints` for the disjuncts in “1.0.0||2.0.0”
was confusing. `alternatives` is better.
* Having a class for VersionConstraint will come in handy because
we can add methods to it, and we can use it in the constraint
solver to represent the problem statement.
* The names “vConstraint” and “versionString” are a little verbose,
but there really shouldn’t be a lot of code that dives into this
structure, and I really wanted to avoid anyone ever writing:
`constraint.constraint.alternatives[0].version`, and then wondering
what sort of object that was (not a parsed PackageVersion! we could
parse eagerly but that might be slow).
Mostly I just made parseConstraint clearer and gave the return value a
prototype.
There are new unit tests, and the existing ones have been given much
better names.
PV.parseConstraint also now optionally takes two arguments, a package name and a version constraint. We can eventually propagate this feature to the tool and stop concatenating with “@” so much.
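The two call forms can be sketched like this. The implementation below is a hypothetical stand-in, not the real PV.parseConstraint; it only shows how accepting a separate name and version constraint avoids the "@" concatenation.

```javascript
// Hypothetical sketch of the two call forms: either a combined
// "name@constraint" string, or the name and version constraint separately.
function parseConstraint(nameOrString, versionConstraintString) {
  var name, constraintString;
  if (versionConstraintString === undefined) {
    var at = nameOrString.indexOf("@");
    if (at >= 0) {
      name = nameOrString.slice(0, at);
      constraintString = nameOrString.slice(at + 1);
    } else {
      name = nameOrString;
      constraintString = "";
    }
  } else {
    name = nameOrString;
    constraintString = versionConstraintString;
  }
  return { name: name, constraintString: constraintString };
}

// Both forms yield the same result; no "@" concatenation needed:
// parseConstraint("foo@=1.0.0") and parseConstraint("foo", "=1.0.0")
```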
I’m referring to “foo@=1.0.0” as a PackageConstraint and “=1.0.0” as a version constraint. I’ll probably add a VersionConstraint class too.
It was only used in one place, in resolver.js. If the semantics of constraints are that buildIDs are ignored, that should be implemented when interpreting a constraint, not when parsing it. We don’t use buildIDs ourselves anymore, so the question is rather theoretical, but we’ve always stripped them out of constraint strings, so it’s not even clear why we allow them at all; that is, why we are willing to parse “=1.0.0+foo” as a valid constraint and then throw away the “+foo”.
Bonus: One less place where we conflate two different options objects (utils.parseConstraint and PV.parseConstraint).
This commit fixes a 0.9.3 regression where the calculation of the
topLevelPrereleases object was not updated with the introduction of
disjunction constraints (||) and would always be empty.
topLevelPrereleases and useRCsOK are merged into a single
anticipatedPrereleases option which is now passed in to the
constraint-solver package rather than constructed inside it. (This
object will also be used in ProjectContext as part of PackageDelta
calculation.)
The usedRCs return flag is now called
neededToUseUnanticipatedPrereleases.
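The anticipatedPrereleases option might be shaped roughly as follows. The two-level map layout is an assumption for illustration: a prerelease version listed here is "anticipated" and choosing it would not set the neededToUseUnanticipatedPrereleases flag.

```javascript
// Assumed shape (illustrative): package name -> { version -> true }.
var anticipatedPrereleases = {
  foo: { "1.0.0-rc.1": true }
};

// Hypothetical lookup: is this prerelease version anticipated?
function isAnticipated(pkg, version) {
  return !!(anticipatedPrereleases[pkg] &&
            anticipatedPrereleases[pkg][version]);
}
```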
There were too many things called `constraints`: ConstraintsList objects, arrays and maps of Constraint strings or objects, and so on. Also, “alternatives” is already taken, so use “disjunction.” Consider propagating this rename upstream to package-version-parser, since it’s kind of weird that “constraints” is implicitly a disjunction (rather than, say, a conjunction).
Don’t call a constraint string a version string.
When it comes to pre-releases, Constraints aren’t “context-free.” That’s ok for now, but at least hoist the pre-release check out of the innermost loop. Ideally we’d hoist it higher, but I can see that it’s pretty convenient this way.
Should be identical in behavior to before (differing maybe in some super edge case, but I doubt it).
Consider the following situation:
- app uses package P
- Every version of P contains `api.use('s', 'server')`
- Every version of S contains `api.use('c', 'client')`
- There's nothing else around using S or C
When we bundle our app, we will not end up putting any unibuilds from C
into the bundle. That's fine.
The previous version of the constraint solver understood this, and so C
wasn't even part of the constraint solver solution.
HOWEVER, even though C does not contribute any unibuilds to the app
bundle, we still need to compile C in order to compile S. That's because
our current implementation never compiles only part of an isopack, even
if only part of the isopack will be needed for the app.
The structured project initialization done via ProjectContext will thus
decide to not build or load C, which will make it fail to compile S when
it gets around to compiling the client unibuilds in S.
We could make the model more complex by making it possible to compile
only part of S.
Instead, we'll make the Meteor package model simpler. Constraints, as
far as the constraint solver is concerned, are now at a package level.
So in this case, "C" will actually be part of the project (will end up
in .meteor/versions, etc) even though it will continue to not provide
any part of any of the bundled client or server programs.
This means that nodes in the constraint solver's graph will now just be
package versions, not unibuild versions. That's already the language
that the constraint solver spoke in as far as its inputs, outputs, and
error messages were concerned!
An example of an app that couldn't be built on the isopack-cache branch
before this commit and can be now is
https://github.com/glasser/precise-constraints-example
(It can be built with 1.0, but only by compiling a version of `c` that
isn't part of .meteor/versions!)
Note that this also means that .meteor/versions expresses enough to
allow us to implement a simpler constraint check that doesn't need to do
the full tree walk of the constraint solver. Such a checker would be
implemented as:
- gather root dependencies and constraints (project constraints,
release constraints, etc)
- add the dependencies and constraints from all versions mentioned
in .meteor/versions
- see if the choices made in .meteor/versions satisfy these
dependencies and constraints
This algorithm did NOT work previously, because you couldn't just look
at the constraints coming from `s@0.0.0` and say "they're satisfied"
because you had to know to "ignore" the constraint on c#web.browser
because s#web.browser is not part of the app, which is not a fact that
actually got recorded in .meteor/versions.
(This commit does not implement this simpler constraint check, though.)
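The checker described in the list above might be sketched as follows. This is illustrative only (the commit does not implement it); getDepsAndConstraints is a hypothetical catalog lookup, and the constraint objects here carry a test() predicate for simplicity.

```javascript
// Illustrative sketch of the simpler check enabled by package-level
// constraints: no tree walk, just verify that the choices in
// .meteor/versions satisfy the union of root and per-version
// dependencies and constraints.
//
// getDepsAndConstraints(pkg, version) is a hypothetical lookup returning
// { dependencies: [names], constraints: [{ name, test(version) }] }.
function checkVersions(versions, rootDeps, rootConstraints,
                       getDepsAndConstraints) {
  var deps = rootDeps.slice();
  var constraints = rootConstraints.slice();
  // Add the dependencies and constraints from every chosen version.
  Object.keys(versions).forEach(function (pkg) {
    var info = getDepsAndConstraints(pkg, versions[pkg]);
    deps = deps.concat(info.dependencies);
    constraints = constraints.concat(info.constraints);
  });
  // Every dependency must be chosen, every constraint satisfied.
  return deps.every(function (name) {
    return versions.hasOwnProperty(name);
  }) && constraints.every(function (c) {
    return versions.hasOwnProperty(c.name) && c.test(versions[c.name]);
  });
}
```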
The original benchmark/test translated all constraints into either inexact or exact Meteor constraints — that is, Rails’s `~>1.2.3`, `>1.2.3`, and `>=1.2.3` became the inexact `1.2.3`, and `=1.2.3` became `=1.2.3` — but also gave all versions of all packages an ECV of 0.0.0, meaning all versions are compatible with each other.
In order to make this test pass without ECV, we need to emulate pure `>=` (without incompatibility between major versions) in some other way. So, for each constraint of the form `foo@a.b.c` where `foo` has larger major versions available than `a`, we allow those major versions as well. For example, if we depend on `foo@1.1.11` and the largest version of `foo` is `3.2.0`, we rewrite `1.1.11` to `1.1.11 || 2.0.0 || 3.0.0` in order to emulate a pure `>=1.1.11` with no concern for major version compatibility.
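The rewrite described above can be sketched as a small helper. The function name and version handling are illustrative, but the transformation matches the example in the text.

```javascript
// Sketch of the rewrite: emulate a pure ">=a.b.c" by adding a disjunct
// "M.0.0" for every available major version M larger than a.
function emulateGte(constraintVersion, availableVersions) {
  var major = parseInt(constraintVersion.split(".")[0], 10);
  var majors = {};
  availableVersions.forEach(function (v) {
    var m = parseInt(v.split(".")[0], 10);
    if (m > major) majors[m] = true;
  });
  var disjuncts = [constraintVersion];
  Object.keys(majors).map(Number)
    .sort(function (a, b) { return a - b; })
    .forEach(function (m) { disjuncts.push(m + ".0.0"); });
  return disjuncts.join(" || ");
}

// emulateGte("1.1.11", ["1.1.11", "2.3.0", "3.2.0"])
//   → "1.1.11 || 2.0.0 || 3.0.0"
```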
Normalize the versions in test-data. Importantly, tests still pass!
For example:
- Convert 1.2 to 1.2.0
- Convert 1.2.0.0 to 1.2.0
- Remove 1.2.0.1 completely
- Remove all rcs, betas, alphas, pres, etc.
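The rules above can be sketched as a small normalization function. The prerelease detection here (a simple pattern match) is an assumption for illustration; a null return means the version is removed entirely.

```javascript
// Illustrative normalization following the rules above: pad to three
// parts, truncate "x.y.z.0" to "x.y.z", drop four-part versions with a
// nonzero fourth part, and drop all prerelease versions.
function normalizeVersion(v) {
  if (/[-+]|rc|beta|alpha|pre/i.test(v)) return null; // drop prereleases
  var parts = v.split(".");
  while (parts.length < 3) parts.push("0");
  if (parts.length === 4) {
    if (parts[3] !== "0") return null; // e.g. 1.2.0.1: remove completely
    parts = parts.slice(0, 3);         // e.g. 1.2.0.0 -> 1.2.0
  }
  return parts.join(".");
}

// normalizeVersion("1.2")     → "1.2.0"
// normalizeVersion("1.2.0.0") → "1.2.0"
// normalizeVersion("1.2.0.1") → null
```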
Remove unused “platform” property.
Next step: Make the test not depend on setting ECVs in order to pass.
This was an undocumented and entirely unused feature (only two dummy
packages on the package server have this set to a non-default value).
No attempt is being made to remove the field from existing isopacks or
catalog entries. To continue to support existing clients, the package
server has been modified to ignore any provided
earliestCompatibleVersion and instead always write the default ECV to
the catalog.