Extract Platform to a separate crate.
This moves the `Platform`, `Cfg`, and `CfgExpr` types to a new crate named "cargo-platform". The intent here is to give users of `cargo_metadata` a way of parsing and inspecting cargo's platform values.
Along the way, I rewrote the error handling to remove `failure`, and to slightly improve the output.
I'm having doubts about whether or not this is a good idea. As you can see from the `examples/matches.rs` example, it is nontrivial to use (and it doesn't account for cargo's config values and environment variables). I don't know if anyone will actually use this. If this doesn't seem to have value, I would suggest closing it.
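For reference, a rough sketch of what consuming the new crate might look like is below. The exact API surface (`Platform`/`Cfg` parsing via `FromStr`, `Platform::matches`) is assumed from this PR, and the cfg list and target triple are made-up values that would normally come from `rustc --print cfg --target <triple>` plus cargo's config:

```rust
// Rough sketch, assuming the cargo-platform API as extracted in this PR.
use cargo_platform::{Cfg, Platform};

fn main() {
    // A platform spec as it appears in a `[target.'cfg(...)'.dependencies]`
    // table or in `cargo_metadata` output.
    let platform: Platform = "cfg(all(unix, target_arch = \"x86_64\"))"
        .parse()
        .expect("invalid platform spec");

    // The cfg values for the target, normally obtained by running
    // `rustc --print cfg --target <triple>` (this list is made up).
    let cfgs: Vec<Cfg> = ["unix", "target_arch=\"x86_64\"", "target_os=\"linux\""]
        .iter()
        .map(|c| c.parse().expect("invalid cfg"))
        .collect();

    let triple = "x86_64-unknown-linux-gnu";
    println!("matches: {}", platform.matches(triple, &cfgs));
}
```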
I've also included a sample script, `publish.py`, for publishing cargo itself. I suspect it will need tweaking, but I figure it's a start, and I'm open to feedback.
Don't require the `serde` feature of `url`
This ends up meaning that in full crate compiles `url` doesn't wait for
`serde` to finish, which in turn enables crates like `git2` to start
sooner!
Experiment: Create timing report.
This is just an experiment, so I'm not sure if we'll want to merge it.
This adds an HTML report which gets saved to disk when the build is finished. It is primarily geared for identifying slow dependencies, and for visualizing how pipelining affects the build.
Here's an example: https://ehuss.github.io/cargo-timing.html
You can mouse over the blocks to highlight the reverse-dependencies that are released when a unit finishes. `syn` is a really good example.
It does a few other things, like displaying a message after each unit is finished. See the docs for more information.
This release of `home` removes support for the old `.multirust`
directory. It also fixes the `rustup_home` and `cargo_home` implementations
when the corresponding environment variables are absolute paths.
Instead of treating Windows differently, this just always uses canonical paths
on all platforms, which fixes a problem where symlinks were not treated
correctly.
This also switches `rm_rf` to the `remove_dir_all` crate, because deleting
symbolic links on Windows is difficult.
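As a minimal sketch of that change (not cargo's actual code; the helper below is hypothetical), removing a path might now look roughly like:

```rust
use std::fs;
use std::io;
use std::path::Path;

// Minimal sketch, not cargo's actual implementation: remove a path whether
// it is a file, a symlink, or a directory tree, deferring to the
// remove_dir_all crate for directories since it copes with Windows quirks
// better than std alone.
fn rm_rf(path: &Path) -> io::Result<()> {
    let meta = match path.symlink_metadata() {
        Ok(meta) => meta,
        // Nothing to remove if the path does not exist.
        Err(ref e) if e.kind() == io::ErrorKind::NotFound => return Ok(()),
        Err(e) => return Err(e),
    };
    if meta.is_dir() {
        remove_dir_all::remove_dir_all(path)
    } else {
        // Files and non-directory symlinks are removed directly.
        fs::remove_file(path)
    }
}
```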
Tighten requirements for git2 crates
Bring in a few updates that pull in a newer libgit2 and fix a Windows issue,
as well as updating the `url` dependencies.
Closes #7173
This commit fixes a test in Cargo to work around a seeming regression in
behavior in libgit2 around HTTP authentication. The expected flow for
HTTP authentication with git is that git sends an HTTP request and
receives an "unauthorized" response. It then sends another request with
authorization information and that's what we're testing is received in
the our test.
Previously libgit2 would issue a new HTTP connection if the previous one
was closed, but it looks like changes in libgit2 now require that the
same HTTP connection is used for the initial request and the subsequent
request with authorization information. This broke our test since it's
not using an HTTP compliant server at all and is just some handwritten
TCP reads/writes. The fix here is to basically stay with handwritten TCP
reads/writes but tweak how it happens so it's all on the same HTTP/TCP
connection to match what libgit2 is expecting.
Some extra assertions have also been added to try to prevent deadlocks
from happening in the future and instead make the test fail fast if this
situation comes up again.
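Roughly, the shape of the handwritten server now looks like the sketch below. This is illustrative only, not the actual test code, and the helper is made up; the key point is that both the 401 response and the authorized retry happen on one TCP connection:

```rust
use std::io::{BufRead, BufReader, Write};
use std::net::{TcpListener, TcpStream};

// Read request line + headers until the blank line that ends the header block.
fn read_headers(reader: &mut BufReader<TcpStream>) -> std::io::Result<Vec<String>> {
    let mut headers = Vec::new();
    loop {
        let mut line = String::new();
        reader.read_line(&mut line)?;
        let line = line.trim_end().to_string();
        if line.is_empty() {
            return Ok(headers);
        }
        headers.push(line);
    }
}

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:0")?;
    let (mut stream, _) = listener.accept()?;
    let mut reader = BufReader::new(stream.try_clone()?);

    // First request: no credentials yet. Answer 401 but keep the
    // connection open, since libgit2 now retries on the same connection.
    let first = read_headers(&mut reader)?;
    assert!(!first.iter().any(|h| h.starts_with("Authorization")));
    stream.write_all(
        b"HTTP/1.1 401 Unauthorized\r\n\
          WWW-Authenticate: Basic realm=\"test\"\r\n\
          Content-Length: 0\r\n\r\n",
    )?;

    // Second request arrives on the same connection and must carry the
    // authorization information we are testing for.
    let second = read_headers(&mut reader)?;
    assert!(second.iter().any(|h| h.starts_with("Authorization: Basic")));
    stream.write_all(b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n")?;
    Ok(())
}
```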
These tests take a good amount of time to run locally, and they're also
causing a lot of dependencies to get pulled into rust-lang/rust, so
let's have a separate crate that we just test on our own CI.
Test the Resolver against the varisat Library
Resolution can be reduced to the SAT problem, so this is an alternative implementation of the resolver that uses a SAT library for the hard work. It is intended to be easy to read, as compared to the real resolver, and to run as part of the test suite to make sure the real resolver works as expected. Part of #6120.
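As a toy illustration of the reduction (not the test suite's actual encoding; the packages and clauses below are made up), a tiny resolution question can be posed to varisat like this:

```rust
// Toy example: "root needs some version of b, and at most one version of b
// may be selected" expressed as SAT clauses.
use varisat::{ExtendFormula, Lit, Solver};

fn main() {
    let mut solver = Solver::new();

    // One boolean variable per (crate, version) pair.
    let root = solver.new_lit();
    let b_1 = solver.new_lit();
    let b_2 = solver.new_lit();

    // The root package is always part of the resolution.
    solver.add_clause(&[root]);
    // root depends on b: root => (b_1 or b_2).
    solver.add_clause(&[!root, b_1, b_2]);
    // At most one version of b: not (b_1 and b_2).
    solver.add_clause(&[!b_1, !b_2]);

    let resolvable = solver.solve().unwrap();
    println!("resolvable: {}", resolvable);
    if resolvable {
        // The model tells us which (crate, version) pairs were selected.
        let model: Vec<Lit> = solver.model().unwrap();
        println!("chosen literals: {:?}", model);
    }
}
```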
Some notes on performance:
The initial version did not support public & private deps:
~64 loc, `O(n ln(n))` vars, `O(n ln(n) + n*d)` clauses, 0.5x slower on `prop_passes_validation`
The final version:
~163 loc, `O(dn^2)` vars, `O(dn^3)` clauses, 1.5x slower on `prop_passes_validation`
That comparison makes me feel better about spending months trying to get public & private deps to be fast enough for stabilization.
Parse less JSON on null builds
This commit fixes a performance pathology in Cargo today. Whenever Cargo
generates a lock file (which happens on all invocations of `cargo build`
for example) Cargo will parse the crates.io index to learn about
dependencies. Currently, however, when it parses a crate it parses the
JSON blob for every single version of the crate. With a lock file or with
incremental builds, though, only one of these lines of JSON is relevant.
Measured today, Cargo building Cargo parses 3700 JSON dependencies in the
registry.
This commit implements an optimization that brings down the number of
parsed JSON lines in the registry to precisely the right number
necessary to build a project. For example, Cargo has 150 crates in its
lock file, so now it only parses 150 JSON lines (a 20x reduction from
3700). This in turn can greatly improve Cargo's null build time. Cargo
building Cargo dropped from 120ms to 60ms on a Linux machine and 400ms
to 200ms on a Mac.
The commit internally has a lot more details about how this is done, but
the general idea is to have a cache, optimized for Cargo to read, which is
maintained automatically by Cargo.
Closes #6866
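To illustrate the shape of the idea (a simplified sketch, not cargo's actual cache format or code; the struct and helper are hypothetical), the win comes from only deserializing the index line for the version the lock file already pins:

```rust
use serde::Deserialize;

// A registry index file has one JSON blob per published version; only a
// couple of its fields are shown here.
#[derive(Deserialize, Debug)]
struct IndexEntry {
    name: String,
    vers: String,
}

// Simplified sketch: instead of parsing every version's JSON line, do a
// cheap substring scan for the locked version and fully parse only the
// matching line.
fn parse_locked_version(raw_index_file: &str, locked_vers: &str) -> Option<IndexEntry> {
    raw_index_file
        .lines()
        .filter(|line| line.contains(locked_vers))
        .find_map(|line| {
            let entry: IndexEntry = serde_json::from_str(line).ok()?;
            if entry.vers == locked_vers {
                Some(entry)
            } else {
                None
            }
        })
}
```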
This is intended to fix #6747 where multiple Cargos invoked with the
same jobserver would all have their own token but not actually run
concurrently due to file locking. Instead the fix is that whenever Cargo
blocks for a file lock with a configured global jobserver, a token is
released just before we block and then reacquired afterwards. This way
we should ensure that we're not hogging a cpu/token unnecessarily
without doing any work!
Closes #6747
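In sketch form (assumed, not cargo's exact code; the function name is made up and the blocking lock call via the fs2 crate is purely for illustration), the change amounts to:

```rust
use fs2::FileExt;
use jobserver::Client;
use std::fs::File;
use std::io;

// Sketch only: when running under a shared GNU make jobserver, give our
// token back right before blocking on a file lock, and take a token again
// once the lock is held.
fn lock_without_hogging_a_token(client: Option<&Client>, lock_file: &File) -> io::Result<()> {
    // Release the implicit token so another cargo invocation can do work
    // while we sit in the blocking lock call.
    if let Some(client) = client {
        client.release_raw()?;
    }

    // Block until the exclusive file lock is acquired.
    lock_file.lock_exclusive()?;

    // Re-acquire a token before resuming real work.
    if let Some(client) = client {
        client.acquire_raw()?;
    }
    Ok(())
}
```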