Merge remote-tracking branch 'origin/master' into custom-profile-pr-rfc

This commit is contained in:
Dan Aloni 2019-06-07 21:48:47 +03:00
commit 5d048e0330
49 changed files with 2373 additions and 259 deletions

View file

@ -1,17 +1,51 @@
# Changelog
## Cargo 1.37 (2019-08-15)
[c4fcfb72...HEAD](https://github.com/rust-lang/cargo/compare/c4fcfb72...HEAD)
### Added
- Added `doctest` field to `cargo metadata` to determine if a target's documentation is tested.
[#6953](https://github.com/rust-lang/cargo/pull/6953)
[#6965](https://github.com/rust-lang/cargo/pull/6965)
- (Nightly only): Added [compiler message
caching](https://doc.rust-lang.org/nightly/cargo/reference/unstable.html#cache-messages).
The `-Z cache-messages` flag makes cargo cache the compiler output so that
future runs can redisplay previous warnings.
[#6933](https://github.com/rust-lang/cargo/pull/6933)
### Changed
- `cargo package` now verifies that build scripts do not create empty
directories.
[#6973](https://github.com/rust-lang/cargo/pull/6973)
### Fixed
- Fixed how zsh completions fetch the list of commands.
[#6956](https://github.com/rust-lang/cargo/pull/6956)
- "+ debuginfo" is no longer printed in the build summary when `debug` is set
to 0.
[#6971](https://github.com/rust-lang/cargo/pull/6971)
## Cargo 1.36 (2019-07-04)
[6f3e9c36...c4fcfb72](https://github.com/rust-lang/cargo/compare/6f3e9c36...c4fcfb72)
### Added
- (Nightly only): Added [`-Z install-upgrade`
feature](https://doc.rust-lang.org/nightly/cargo/reference/unstable.html#install-upgrade)
to track details about installed crates and to update them if they are
out-of-date. [#6798](https://github.com/rust-lang/cargo/pull/6798)
- (Nightly only): Added the [`public-dependency`
feature](https://doc.rust-lang.org/nightly/cargo/reference/unstable.html#public-dependency)
which allows tracking public versus private dependencies.
[#6772](https://github.com/rust-lang/cargo/pull/6772)
- Added more detailed documentation on target auto-discovery.
[#6898](https://github.com/rust-lang/cargo/pull/6898)
- (Nightly only): Added build pipelining via the `build.pipelining` config
option (`CARGO_BUILD_PIPELINING` env var).
[#6883](https://github.com/rust-lang/cargo/pull/6883)
- 🔥 Stabilize the `--offline` flag which allows using cargo without a network
connection.
[#6934](https://github.com/rust-lang/cargo/pull/6934)
[#6871](https://github.com/rust-lang/cargo/pull/6871)
### Changed
- `publish = ["crates-io"]` may be added to the manifest to restrict
@ -34,8 +68,38 @@
- Setting a feature on a dependency where that feature points to a *required*
dependency is now an error. Previously it was a warning.
[#6860](https://github.com/rust-lang/cargo/pull/6860)
- The `registry.index` config value now supports relative `file:` URLs.
[#6873](https://github.com/rust-lang/cargo/pull/6873)
- macOS: The `.dSYM` directory is now symbolically linked next to example
binaries without the metadata hash so that debuggers can find it.
[#6891](https://github.com/rust-lang/cargo/pull/6891)
- The default `Cargo.toml` template for new projects now includes a comment
providing a link to the documentation.
[#6881](https://github.com/rust-lang/cargo/pull/6881)
- Some improvements to the wording of the crate download summary.
[#6916](https://github.com/rust-lang/cargo/pull/6916)
[#6920](https://github.com/rust-lang/cargo/pull/6920)
- ✨ Changed `RUST_LOG` environment variable to `CARGO_LOG` so that user code
that uses the `log` crate will not display cargo's debug output.
[#6918](https://github.com/rust-lang/cargo/pull/6918)
- `Cargo.toml` is now always included when packaging, even if it is not listed
in `package.include`.
[#6925](https://github.com/rust-lang/cargo/pull/6925)
- Package include/exclude values now use gitignore patterns instead of glob
patterns. [#6924](https://github.com/rust-lang/cargo/pull/6924)
- Provide a better error message when crates.io times out. Also improve error
messages with other HTTP response codes.
[#6936](https://github.com/rust-lang/cargo/pull/6936)
### Performance
- Resolver performance improvements for some cases.
[#6853](https://github.com/rust-lang/cargo/pull/6853)
- Optimized how cargo reads the index JSON files by caching the results.
[#6880](https://github.com/rust-lang/cargo/pull/6880)
[#6912](https://github.com/rust-lang/cargo/pull/6912)
[#6940](https://github.com/rust-lang/cargo/pull/6940)
- Various performance improvements.
[#6867](https://github.com/rust-lang/cargo/pull/6867)
### Fixed
- More carefully track the on-disk fingerprint information for dependencies.
@ -43,6 +107,14 @@
restarted. [#6832](https://github.com/rust-lang/cargo/pull/6832)
- `cargo run` now correctly passes non-UTF8 arguments to the child process.
[#6849](https://github.com/rust-lang/cargo/pull/6849)
- Fixed bash completion to run on bash 3.2, the stock version in macOS.
[#6905](https://github.com/rust-lang/cargo/pull/6905)
- Various fixes and improvements to zsh completion.
[#6926](https://github.com/rust-lang/cargo/pull/6926)
[#6929](https://github.com/rust-lang/cargo/pull/6929)
- Fix `cargo update` ignoring `-p` arguments if the `Cargo.lock` file was
missing.
[#6904](https://github.com/rust-lang/cargo/pull/6904)
## Cargo 1.35 (2019-05-23)
[6789d8a0...6f3e9c36](https://github.com/rust-lang/cargo/compare/6789d8a0...6f3e9c36)

View file

@ -32,8 +32,8 @@ failure = "0.1.5"
filetime = "0.2"
flate2 = { version = "1.0.3", features = ['zlib'] }
fs2 = "0.4"
git2 = "0.9.0"
git2-curl = "0.10.0"
glob = "0.3.0"
hex = "0.3"
home = "0.3"
@ -43,7 +43,7 @@ jobserver = "0.1.13"
lazycell = "1.2.0"
libc = "0.2"
log = "0.4.6"
libgit2-sys = "0.8.0"
memchr = "2.1.3"
num_cpus = "1.0"
opener = "0.4"
@ -103,6 +103,7 @@ features = [
[dev-dependencies]
bufstream = "0.1"
proptest = "0.9.1"
varisat = "0.2.1"
[[bin]]
name = "cargo"

View file

@ -50,10 +50,12 @@ a list of known community-developed subcommands.
## Releases
High level release notes are available as part of [Rust's release notes][rel].
Detailed release notes are available in this repo at [CHANGELOG.md].
[rel]: https://github.com/rust-lang/rust/blob/master/RELEASES.md
[CHANGELOG.md]: CHANGELOG.md
## Reporting issues

View file

@ -30,6 +30,7 @@ pub fn builtin() -> Vec<App> {
test::cli(),
uninstall::cli(),
update::cli(),
vendor::cli(),
verify_project::cli(),
version::cli(),
yank::cli(),
@ -66,6 +67,7 @@ pub fn builtin_exec(cmd: &str) -> Option<fn(&mut Config, &ArgMatches<'_>) -> Cli
"test" => test::exec,
"uninstall" => uninstall::exec,
"update" => update::exec,
"vendor" => vendor::exec,
"verify-project" => verify_project::exec,
"version" => version::exec,
"yank" => yank::exec,
@ -102,6 +104,7 @@ pub mod search;
pub mod test;
pub mod uninstall;
pub mod update;
pub mod vendor;
pub mod verify_project;
pub mod version;
pub mod yank;

View file

@ -0,0 +1,127 @@
use crate::command_prelude::*;
use cargo::ops;
use std::path::PathBuf;
pub fn cli() -> App {
subcommand("vendor")
.about("Vendor all dependencies for a project locally")
.arg(opt("quiet", "No output printed to stdout").short("q"))
.arg_manifest_path()
.arg(Arg::with_name("path").help("Where to vendor crates (`vendor` by default)"))
.arg(
Arg::with_name("no-delete")
.long("no-delete")
.help("Don't delete older crates in the vendor directory"),
)
.arg(
Arg::with_name("tomls")
.short("s")
.long("sync")
.help("Additional `Cargo.toml` to sync and vendor")
.value_name("TOML")
.multiple(true),
)
.arg(
Arg::with_name("respect-source-config")
.long("respect-source-config")
.help("Respect `[source]` config in `.cargo/config`")
.multiple(true),
)
.arg(
Arg::with_name("no-merge-sources")
.long("no-merge-sources")
.hidden(true),
)
.arg(
Arg::with_name("relative-path")
.long("relative-path")
.hidden(true),
)
.arg(
Arg::with_name("only-git-deps")
.long("only-git-deps")
.hidden(true),
)
.arg(
Arg::with_name("explicit-version")
.short("-x")
.long("explicit-version")
.hidden(true),
)
.arg(
Arg::with_name("disallow-duplicates")
.long("disallow-duplicates")
.hidden(true),
)
.after_help(
"\
This cargo subcommand will vendor all crates.io and git dependencies for a
project into the specified directory at `<path>`. After this command completes,
the vendor directory specified by `<path>` will contain all remote sources from
the specified dependencies. Additional manifests beyond the default one can be
specified with the `-s` option.
The `cargo vendor` command will also print out the configuration necessary
to use the vendored sources, which you will need to add to `.cargo/config`.
",
)
}
pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
// We're doing the vendoring operation ourselves, so we don't actually want
// to respect any of the `source` configuration in Cargo itself. That's
// intended for other consumers of Cargo, but we want to go straight to the
// source, e.g. crates.io, to fetch crates.
if !args.is_present("respect-source-config") {
config.values_mut()?.remove("source");
}
// When we moved `cargo vendor` into Cargo itself we didn't stabilize a few
// flags, so try to provide a helpful error message in that case to ensure
// that users currently using the flag aren't tripped up.
let crates_io_cargo_vendor_flag = if args.is_present("no-merge-sources") {
Some("--no-merge-sources")
} else if args.is_present("relative-path") {
Some("--relative-path")
} else if args.is_present("only-git-deps") {
Some("--only-git-deps")
} else if args.is_present("explicit-version") {
Some("--explicit-version")
} else if args.is_present("disallow-duplicates") {
Some("--disallow-duplicates")
} else {
None
};
if let Some(flag) = crates_io_cargo_vendor_flag {
return Err(failure::format_err!(
"\
the crates.io `cargo vendor` command has now been merged into Cargo itself
and does not support the flag `{}` currently; to continue using the flag you
can execute `cargo-vendor vendor ...`, and if you would like to see this flag
supported in Cargo itself please feel free to file an issue at
https://github.com/rust-lang/cargo/issues/new
",
flag
)
.into());
}
let ws = args.workspace(config)?;
let path = args
.value_of_os("path")
.map(|val| PathBuf::from(val.to_os_string()))
.unwrap_or_else(|| PathBuf::from("vendor"));
ops::vendor(
&ws,
&ops::VendorOptions {
no_delete: args.is_present("no-delete"),
destination: &path,
extra: args
.values_of_os("tomls")
.unwrap_or_default()
.map(|s| PathBuf::from(s.to_os_string()))
.collect(),
},
)?;
Ok(())
}

View file

@ -204,16 +204,19 @@ impl CompileMode {
}
}
/// Returns `true` if this is generating documentation.
pub fn is_doc(self) -> bool {
match self {
CompileMode::Doc { .. } => true,
_ => false,
}
}
/// Returns `true` if this a doc test.
pub fn is_doc_test(self) -> bool {
self == CompileMode::Doctest
}
/// Returns `true` if this is any type of test (test, benchmark, doc test, or
/// check test).
pub fn is_any_test(self) -> bool {

View file

@ -16,7 +16,7 @@ mod target_info;
pub use self::target_info::{FileFlavor, TargetInfo};
/// The build context, containing all information about a build task.
pub struct BuildContext<'a, 'cfg> {
/// The workspace the build is for.
pub ws: &'a Workspace<'cfg>,
/// The cargo configuration.

View file

@ -9,7 +9,7 @@ use lazycell::LazyCell;
use log::info;
use super::{BuildContext, Context, FileFlavor, Kind, Layout};
use crate::core::compiler::{CompileMode, Unit};
use crate::core::{TargetKind, Workspace};
use crate::util::{self, CargoResult};
@ -50,7 +50,7 @@ impl fmt::Display for Metadata {
}
}
pub struct CompilationFiles<'a, 'cfg> {
/// The target directory layout for the host (and target if it is the same as host).
pub(super) host: Layout,
/// The target directory layout for the target (if different from the host).
@ -146,6 +146,8 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
pub fn out_dir(&self, unit: &Unit<'a>) -> PathBuf {
if unit.mode.is_doc() {
self.layout(unit.kind).root().parent().unwrap().join("doc")
} else if unit.mode.is_doc_test() {
panic!("doc tests do not have an out dir");
} else if unit.target.is_custom_build() {
self.build_script_dir(unit)
} else if unit.target.is_example() {
@ -293,41 +295,76 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
unit: &Unit<'a>,
bcx: &BuildContext<'a, 'cfg>,
) -> CargoResult<Arc<Vec<OutputFile>>> {
let ret = match unit.mode {
CompileMode::Check { .. } => {
// This may be confusing. rustc outputs a file named `lib*.rmeta`
// for both libraries and binaries.
let file_stem = self.file_stem(unit);
let path = self.out_dir(unit).join(format!("lib{}.rmeta", file_stem));
vec![OutputFile {
path,
hardlink: None,
export_path: None,
flavor: FileFlavor::Linkable { rmeta: false },
}]
}
CompileMode::Doc { .. } => {
let path = self
.out_dir(unit)
.join(unit.target.crate_name())
.join("index.html");
vec![OutputFile {
path,
hardlink: None,
export_path: None,
flavor: FileFlavor::Normal,
}]
}
CompileMode::RunCustomBuild => {
// At this time, this code path does not handle build script
// outputs.
vec![]
}
CompileMode::Doctest => {
// Doctests are built in a temporary directory and then
// deleted. There is the `--persist-doctests` unstable flag,
// but Cargo does not know about that.
vec![]
}
CompileMode::Test | CompileMode::Build | CompileMode::Bench => {
self.calc_outputs_rustc(unit, bcx)?
}
};
info!("Target filenames: {:?}", ret);
Ok(Arc::new(ret))
}
fn calc_outputs_rustc(
&self,
unit: &Unit<'a>,
bcx: &BuildContext<'a, 'cfg>,
) -> CargoResult<Vec<OutputFile>> {
let mut ret = Vec::new();
let mut unsupported = Vec::new();
let out_dir = self.out_dir(unit);
let link_stem = self.link_stem(unit);
let info = if unit.kind == Kind::Host {
&bcx.host_info
} else {
&bcx.target_info
};
let file_stem = self.file_stem(unit);
let mut add = |crate_type: &str, flavor: FileFlavor| -> CargoResult<()> {
let crate_type = if crate_type == "lib" {
"rlib"
} else {
crate_type
};
let file_types =
info.file_types(crate_type, flavor, unit.target.kind(), bcx.target_triple())?;
match file_types {
Some(types) => {
@ -360,7 +397,6 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
}
Ok(())
};
match *unit.target.kind() {
TargetKind::Bin
| TargetKind::CustomBuild
@ -384,7 +420,7 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
)?;
}
let path = out_dir.join(format!("lib{}.rmeta", file_stem));
if !unit.requires_upstream_objects() {
ret.push(OutputFile {
path,
hardlink: None,
@ -394,8 +430,6 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
}
}
}
if ret.is_empty() {
if !unsupported.is_empty() {
failure::bail!(
@ -413,9 +447,7 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
bcx.target_triple()
);
}
Ok(ret)
}
}
@ -439,6 +471,10 @@ fn compute_metadata<'a, 'cfg>(
cx: &Context<'a, 'cfg>,
metas: &mut HashMap<Unit<'a>, Option<Metadata>>,
) -> Option<Metadata> {
if unit.mode.is_doc_test() {
// Doc tests do not have metadata.
return None;
}
// No metadata for dylibs because of a couple issues:
// - macOS encodes the dylib name in the executable,
// - on Windows, rustc emits multiple files, not all of which we can easily link.
@ -518,10 +554,24 @@ fn compute_metadata<'a, 'cfg>(
// Throw in the rustflags we're compiling with.
// This helps when the target directory is a shared cache for projects with different cargo configs,
// or if the user is experimenting with different rustflags manually.
let mut flags = if unit.mode.is_doc() {
cx.bcx.rustdocflags_args(unit)
} else {
cx.bcx.rustflags_args(unit)
}
.iter();
// Ignore some flags. These may affect reproducible builds if they affect
// the path. The fingerprint will handle recompilation if these change.
while let Some(flag) = flags.next() {
if flag.starts_with("--remap-path-prefix=") {
continue;
}
if flag == "--remap-path-prefix" {
flags.next();
continue;
}
flag.hash(&mut hasher);
}
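The flag-filtering loop above can be sketched as a standalone function (a minimal sketch; `hash_flags` and the `DefaultHasher` choice are illustrative, not cargo's actual metadata hasher):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hash compiler flags while skipping `--remap-path-prefix`, both in its
// `--remap-path-prefix=FROM=TO` form and in its two-argument form.
fn hash_flags(flags: &[String]) -> u64 {
    let mut hasher = DefaultHasher::new();
    let mut iter = flags.iter();
    while let Some(flag) = iter.next() {
        if flag.starts_with("--remap-path-prefix=") {
            continue;
        }
        if flag == "--remap-path-prefix" {
            // Skip the flag's separate value argument as well.
            iter.next();
            continue;
        }
        flag.hash(&mut hasher);
    }
    hasher.finish()
}
```

Two flag lists that differ only in path remapping then hash identically, so the computed metadata (and hence the output filenames) stays stable across machines.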
// Artifacts compiled for the host should have a different metadata

View file

@ -27,7 +27,7 @@ mod compilation_files;
use self::compilation_files::CompilationFiles;
pub use self::compilation_files::{Metadata, OutputFile};
pub struct Context<'a, 'cfg> {
pub bcx: &'a BuildContext<'a, 'cfg>,
pub compilation: Compilation<'cfg>,
pub build_state: Arc<BuildState>,
@ -166,7 +166,7 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
}
}
if unit.mode.is_doc_test() {
// Note that we can *only* doc-test rlib outputs here. A
// staticlib output cannot be linked by the compiler (it just
// doesn't do that). A dylib output, however, can be linked by
@ -472,11 +472,11 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
self.pipelining
// We're only a candidate for requiring an `rmeta` file if we
// ourselves are building an rlib,
&& !parent.requires_upstream_objects()
&& parent.mode == CompileMode::Build
// Our dependency must also be built as an rlib, otherwise the
// object code must be useful in some fashion
&& !dep.requires_upstream_objects()
&& dep.mode == CompileMode::Build
}

View file

@ -25,7 +25,7 @@ use crate::CargoResult;
use log::trace;
use std::collections::{HashMap, HashSet};
struct State<'a, 'cfg, 'tmp> {
cx: &'tmp mut Context<'a, 'cfg>,
waiting_on_download: HashSet<PackageId>,
downloads: Downloads<'a, 'cfg>,
@ -129,7 +129,7 @@ fn compute_deps<'a, 'cfg, 'tmp>(
) -> CargoResult<Vec<(Unit<'a>, UnitFor)>> {
if unit.mode.is_run_custom_build() {
return compute_deps_custom_build(unit, state.cx.bcx);
} else if unit.mode.is_doc() {
// Note: this does not include doc test.
return compute_deps_doc(unit, state);
}

View file

@ -59,7 +59,7 @@
//! Target flags (test/bench/for_host/edition) | ✓ |
//! -C incremental=… flag | ✓ |
//! mtime of sources | ✓[^3] |
//! RUSTFLAGS/RUSTDOCFLAGS | ✓ |
//!
//! [^1]: Build script and bin dependencies are not included.
//!
@ -1015,6 +1015,8 @@ fn calculate<'a, 'cfg>(
}
let mut fingerprint = if unit.mode.is_run_custom_build() {
calculate_run_custom_build(cx, unit)?
} else if unit.mode.is_doc_test() {
panic!("doc tests do not fingerprint");
} else {
calculate_normal(cx, unit)?
};
@ -1066,19 +1068,12 @@ fn calculate_normal<'a, 'cfg>(
// Figure out what the outputs of our unit is, and we'll be storing them
// into the fingerprint as well.
let outputs = cx
.outputs(unit)?
.iter()
.filter(|output| output.flavor != FileFlavor::DebugInfo)
.map(|output| output.path.clone())
.collect();
// Fill out a bunch more information that we'll be tracking typically
// hashed to take up less space on disk as we just need to know when things
@ -1339,7 +1334,8 @@ fn write_fingerprint(loc: &Path, fingerprint: &Fingerprint) -> CargoResult<()> {
pub fn prepare_init<'a, 'cfg>(cx: &mut Context<'a, 'cfg>, unit: &Unit<'a>) -> CargoResult<()> {
let new1 = cx.files().fingerprint_dir(unit);
// Doc tests have no output, thus no fingerprint.
if !new1.exists() && !unit.mode.is_doc_test() {
fs::create_dir(&new1)?;
}

View file

@ -196,7 +196,7 @@ impl<'a, 'cfg> JobQueue<'a, 'cfg> {
// the target as well. This should ensure that edges changed to
// `Metadata` propagate upwards `All` dependencies to anything that
// transitively contains the `Metadata` edge.
if unit.requires_upstream_objects() {
for dep in dependencies.iter() {
depend_on_deps_of_deps(cx, &mut queue_deps, dep);
}
@ -597,11 +597,10 @@ impl<'a, 'cfg> JobQueue<'a, 'cfg> {
// being a compiled package.
Dirty => {
if unit.mode.is_doc() {
self.documented.insert(unit.pkg.package_id());
config.shell().status("Documenting", unit.pkg)?;
} else if unit.mode.is_doc_test() {
// Skip doc test.
} else {
self.compiled.insert(unit.pkg.package_id());
if unit.mode.is_check() {
@ -614,8 +613,7 @@ impl<'a, 'cfg> JobQueue<'a, 'cfg> {
Fresh => {
// If doc tests are last, only print "Fresh" if nothing has been printed.
if self.counts[&unit.pkg.package_id()] == 0
&& !(unit.mode.is_doc_test() && self.compiled.contains(&unit.pkg.package_id()))
{
self.compiled.insert(unit.pkg.package_id());
config.shell().verbose(|c| c.status("Fresh", unit.pkg))?;

View file

@ -125,7 +125,7 @@ fn compile<'a, 'cfg: 'a>(
let job = if unit.mode.is_run_custom_build() {
custom_build::prepare(cx, unit)?
} else if unit.mode.is_doc_test() {
// We run these targets later, so this is just a no-op for now.
Job::new(Work::noop(), Freshness::Fresh)
} else if build_plan {
@ -827,7 +827,7 @@ fn build_base_args<'a, 'cfg>(
if unit.mode.is_check() {
cmd.arg("--emit=dep-info,metadata");
} else if !unit.requires_upstream_objects() {
// Always produce metadata files for rlib outputs. Metadata may be used
// in this session for a pipelined compilation, or it may be used in a
// future Cargo session as part of a pipelined compile.
@ -1309,7 +1309,7 @@ fn replay_output_cache(
}
let contents = fs::read_to_string(&path)?;
for line in contents.lines() {
on_stderr_line(state, line, package_id, &target, &mut options)?;
}
Ok(())
})

View file

@ -50,6 +50,18 @@ pub struct UnitInner<'a> {
pub mode: CompileMode,
}
impl UnitInner<'_> {
/// Returns whether compilation of this unit requires all upstream artifacts
/// to be available.
///
/// This effectively means that this unit is a synchronization point (if the
/// return value is `true`) that all previously pipelined units need to
/// finish in their entirety before this one is started.
pub fn requires_upstream_objects(&self) -> bool {
self.mode.is_any_test() || self.target.kind().requires_upstream_objects()
}
}
impl<'a> Unit<'a> {
pub fn buildkey(&self) -> String {
format!("{}-{}", self.pkg.name(), short_hash(self))

View file

@ -206,6 +206,20 @@ impl TargetKind {
TargetKind::CustomBuild => "build-script",
}
}
/// Returns whether production of this artifact requires the object files
/// from dependencies to be available.
///
/// This only returns `false` when all we're producing is an rlib, otherwise
/// it will return `true`.
pub fn requires_upstream_objects(&self) -> bool {
match self {
TargetKind::Lib(kinds) | TargetKind::ExampleLib(kinds) => {
kinds.iter().any(|k| k.requires_upstream_objects())
}
_ => true,
}
}
}
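The rlib-only rule above can be shown in isolation (a sketch; `CrateType` is a hypothetical stand-in for cargo's lib-kind type):

```rust
// Only a pure-rlib target can skip waiting for upstream object files;
// every other crate type needs real object code from its dependencies.
#[derive(PartialEq)]
enum CrateType {
    Rlib,
    Dylib,
    Staticlib,
}

impl CrateType {
    fn requires_upstream_objects(&self) -> bool {
        *self != CrateType::Rlib
    }
}

fn kind_requires_upstream_objects(kinds: &[CrateType]) -> bool {
    // Mirrors the `kinds.iter().any(...)` check above: a single non-rlib
    // output is enough to force a full (non-pipelined) dependency wait.
    kinds.iter().any(|k| k.requires_upstream_objects())
}
```

So a `crate-type = ["rlib"]` library pipelines, while `["rlib", "dylib"]` does not.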
/// Information about a binary, a library, an example, etc. that is part of the
@ -821,20 +835,6 @@ impl Target {
}
}
pub fn is_bin(&self) -> bool {
self.kind == TargetKind::Bin
}

View file

@ -290,7 +290,7 @@ pub struct PackageSet<'cfg> {
}
/// Helper for downloading crates.
pub struct Downloads<'a, 'cfg> {
set: &'a PackageSet<'cfg>,
/// When a download is started, it is added to this map. The key is a
/// "token" (see `Download::token`). It is removed once the download is

View file

@ -57,7 +57,7 @@ pub type Activations =
/// A type that represents when cargo treats two Versions as compatible.
/// Versions `a` and `b` are compatible if their left-most nonzero digit is the
/// same.
#[derive(Clone, Copy, Eq, PartialEq, Hash, Debug)]
pub enum SemverCompatibility {
Major(NonZeroU64),
Minor(NonZeroU64),

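The "left-most nonzero digit" rule described in the comment can be sketched as follows (a hypothetical standalone version; cargo's real type wraps `NonZeroU64`):

```rust
// Two versions are semver-compatible when their left-most nonzero
// component (major, else minor, else patch) is in the same position
// and has the same value.
#[derive(PartialEq, Debug)]
enum Compat {
    Major(u64),
    Minor(u64),
    Patch(u64),
}

fn compatibility(major: u64, minor: u64, patch: u64) -> Compat {
    if major != 0 {
        Compat::Major(major)
    } else if minor != 0 {
        Compat::Minor(minor)
    } else {
        Compat::Patch(patch)
    }
}
```

For example, `1.2.3` and `1.9.0` both map to `Major(1)` and are compatible, while `0.2.0` and `0.3.0` map to `Minor(2)` vs `Minor(3)` and are not.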
View file

@ -415,7 +415,19 @@ impl Requirements<'_> {
}
fn require_crate_feature(&mut self, package: InternedString, feat: InternedString) {
// If `package` is indeed an optional dependency then we activate the
// feature named `package`, but otherwise if `package` is a required
// dependency then there's no feature associated with it.
if let Some(dep) = self
.summary
.dependencies()
.iter()
.find(|p| p.name_in_toml() == package)
{
if dep.is_optional() {
self.used.insert(package);
}
}
self.deps
.entry(package)
.or_insert((false, BTreeSet::new()))

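The optional-dependency check above amounts to the following (a sketch with a hypothetical `Dep` type; the real code searches `self.summary.dependencies()`):

```rust
use std::collections::HashSet;

struct Dep {
    name: &'static str,
    optional: bool,
}

// A feature named after a dependency only marks that name as "used"
// when the dependency is optional; a required dependency has no
// implicit feature of its own.
fn mark_used(deps: &[Dep], package: &str, used: &mut HashSet<String>) {
    if let Some(dep) = deps.iter().find(|d| d.name == package) {
        if dep.optional {
            used.insert(package.to_string());
        }
    }
}
```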
View file

@ -325,7 +325,7 @@ impl<'de> de::Deserialize<'de> for EncodablePackageId {
}
}
pub struct WorkspaceResolve<'a, 'cfg> {
pub ws: &'a Workspace<'cfg>,
pub resolve: &'a Resolve,
}

View file

@ -125,7 +125,7 @@ pub struct WorkspaceRootConfig {
/// An iterator over the member packages of a workspace, returned by
/// `Workspace::members`
pub struct Members<'a, 'cfg> {
ws: &'a Workspace<'cfg>,
iter: slice::Iter<'a, PathBuf>,
}

View file

@ -3,7 +3,7 @@
#![warn(rust_2018_idioms)]
// Clippy isn't enforced by CI (@alexcrichton isn't a fan).
#![allow(clippy::blacklisted_name)] // frequently used in tests
#![allow(clippy::cognitive_complexity)] // large project
#![allow(clippy::derive_hash_xor_eq)] // there's an intentional incoherence
#![allow(clippy::explicit_into_iter_loop)] // explicit loops are clearer
#![allow(clippy::explicit_iter_loop)] // explicit loops are clearer
@ -26,6 +26,7 @@
#![allow(clippy::unneeded_field_pattern)]
use std::fmt;
use std::io;
use failure::Error;
use log::debug;
@ -147,23 +148,37 @@ fn handle_cause(cargo_err: &Error, shell: &mut Shell) -> bool {
drop(writeln!(shell.err(), " {}", error));
}
fn print_stderror_causes(error: &dyn std::error::Error, shell: &mut Shell) {
let mut cur = std::error::Error::source(error);
while let Some(err) = cur {
print(&err.to_string(), shell);
cur = std::error::Error::source(err);
}
}
let verbose = shell.verbosity();
// The first error has already been printed to the shell.
// Print all remaining errors.
for err in cargo_err.iter_causes() {
// If we're not in verbose mode then print remaining errors until one
// marked as `Internal` appears.
if verbose != Verbose && err.downcast_ref::<Internal>().is_some() {
return false;
}
print(&err.to_string(), shell);
// Using the `failure` crate currently means that when using
// `iter_causes` we're only iterating over the `failure` causes, but
// this doesn't include the causes from the standard library `Error`
// trait. We don't have a great way of getting an `&dyn Error` from a
// `&dyn Fail`, so we currently just special case a few errors that are
// known to maybe have causes and we try to print them here.
//
// Note that this isn't an exhaustive match since causes for
// `std::error::Error` aren't the most common thing in the world.
if let Some(io) = err.downcast_ref::<io::Error>() {
print_stderror_causes(io, shell);
}
}
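The `source()`-chain walk in `print_stderror_causes` can be demonstrated in isolation (the error types here are illustrative, not cargo's):

```rust
use std::error::Error;
use std::fmt;

// A wrapper error whose `source()` exposes an inner cause.
#[derive(Debug)]
struct Outer(Box<dyn Error>);

impl fmt::Display for Outer {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "outer error")
    }
}

impl Error for Outer {
    fn source(&self) -> Option<&(dyn Error + 'static)> {
        Some(self.0.as_ref())
    }
}

// Collect the messages of every cause below `error`, following the
// standard library's `source()` chain exactly as the function above does.
fn collect_causes(error: &dyn Error) -> Vec<String> {
    let mut out = Vec::new();
    let mut cur = error.source();
    while let Some(err) = cur {
        out.push(err.to_string());
        cur = err.source();
    }
    out
}
```

This is the `std::error::Error` side of the chain that `failure`'s `iter_causes` does not see, which is why the special-casing for `io::Error` above is needed.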

View file

@ -115,6 +115,13 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
cx.prepare_units(None, &units)?;
for unit in units.iter() {
if unit.mode.is_doc() || unit.mode.is_doc_test() {
// Cleaning individual rustdoc crates is currently not supported.
// For example, the search index would need to be rebuilt to fully
// remove it (otherwise you're left with lots of broken links).
// Doc tests produce no output.
continue;
}
rm_rf(&cx.files().fingerprint_dir(unit), config)?;
if unit.target.is_custom_build() {
if unit.mode.is_run_custom_build() {

View file

@ -383,7 +383,7 @@ pub fn compile_ws<'a>(
}
if let Some(args) = local_rustdoc_args {
for unit in &units {
if unit.mode.is_doc() || unit.mode.is_doc_test() {
bcx.extra_compiler_args.insert(*unit, args.clone());
}
}
@ -703,7 +703,7 @@ fn generate_targets<'a>(
filter_targets(packages, Target::is_lib, false, bcx.build_config.mode)
{
let Proposal { target, pkg, .. } = proposal;
if bcx.build_config.mode.is_doc_test() && !target.doctestable() {
ws.config().shell().warn(format!(
"doc tests are not supported for crate type(s) `{}` in package `{}`",
target.rustc_crate_types().join(", "),

View file

@ -26,6 +26,7 @@ pub use self::resolve::{
add_overrides, get_resolved_packages, resolve_with_previous, resolve_ws, resolve_ws_precisely,
resolve_ws_with_method,
};
pub use self::vendor::{vendor, VendorOptions};
mod cargo_clean;
mod cargo_compile;
@ -46,3 +47,4 @@ mod fix;
mod lockfile;
mod registry;
mod resolve;
mod vendor;

src/cargo/ops/vendor.rs (new file, 327 additions)
View file

@ -0,0 +1,327 @@
use crate::core::shell::Verbosity;
use crate::core::{GitReference, Workspace};
use crate::ops;
use crate::sources::path::PathSource;
use crate::util::Sha256;
use crate::util::{paths, CargoResult, CargoResultExt, Config};
use failure::bail;
use serde::Serialize;
use std::collections::HashSet;
use std::collections::{BTreeMap, BTreeSet, HashMap};
use std::fs::{self, File};
use std::io::Write;
use std::path::{Path, PathBuf};
pub struct VendorOptions<'a> {
pub no_delete: bool,
pub destination: &'a Path,
pub extra: Vec<PathBuf>,
}
pub fn vendor(ws: &Workspace<'_>, opts: &VendorOptions<'_>) -> CargoResult<()> {
let mut extra_workspaces = Vec::new();
for extra in opts.extra.iter() {
let extra = ws.config().cwd().join(extra);
let ws = Workspace::new(&extra, ws.config())?;
extra_workspaces.push(ws);
}
let workspaces = extra_workspaces.iter().chain(Some(ws)).collect::<Vec<_>>();
let vendor_config =
sync(ws.config(), &workspaces, opts).chain_err(|| "failed to sync".to_string())?;
let shell = ws.config().shell();
if shell.verbosity() != Verbosity::Quiet {
eprint!("To use vendored sources, add this to your .cargo/config for this project:\n\n");
print!("{}", &toml::to_string(&vendor_config).unwrap());
}
Ok(())
}
#[derive(Serialize)]
struct VendorConfig {
source: BTreeMap<String, VendorSource>,
}
#[derive(Serialize)]
#[serde(rename_all = "lowercase", untagged)]
enum VendorSource {
Directory {
directory: PathBuf,
},
Registry {
registry: Option<String>,
#[serde(rename = "replace-with")]
replace_with: String,
},
Git {
git: String,
branch: Option<String>,
tag: Option<String>,
rev: Option<String>,
#[serde(rename = "replace-with")]
replace_with: String,
},
}
fn sync(
config: &Config,
workspaces: &[&Workspace<'_>],
opts: &VendorOptions<'_>,
) -> CargoResult<VendorConfig> {
let canonical_destination = opts.destination.canonicalize();
let canonical_destination = canonical_destination
.as_ref()
.map(|p| &**p)
.unwrap_or(opts.destination);
fs::create_dir_all(&canonical_destination)?;
let mut to_remove = HashSet::new();
if !opts.no_delete {
for entry in canonical_destination.read_dir()? {
let entry = entry?;
to_remove.insert(entry.path());
}
}
// First, attempt to work around rust-lang/cargo#5956. Apparently build
// artifacts sprout up in Cargo's global cache for whatever reason, although
// it's unclear what tool is causing these issues at this time. For now we
// apply a heavy-hammer approach which is to delete Cargo's unpacked version
// of each crate to start off with. After we do this we'll re-resolve and
// redownload again, which should trigger Cargo to re-extract all the
// crates.
//
// Note that errors are largely ignored here as this is a best-effort
// attempt. If anything fails here we basically just move on to the next
// crate to work with.
for ws in workspaces {
let (packages, resolve) =
ops::resolve_ws(ws).chain_err(|| "failed to load pkg lockfile")?;
packages
.get_many(resolve.iter())
.chain_err(|| "failed to download packages")?;
for pkg in resolve.iter() {
// Don't delete actual source code!
if pkg.source_id().is_path() {
if let Ok(path) = pkg.source_id().url().to_file_path() {
if let Ok(path) = path.canonicalize() {
to_remove.remove(&path);
}
}
continue;
}
if pkg.source_id().is_git() {
continue;
}
if let Ok(pkg) = packages.get_one(pkg) {
drop(fs::remove_dir_all(pkg.manifest_path().parent().unwrap()));
}
}
}
let mut checksums = HashMap::new();
let mut ids = BTreeMap::new();
// Next up let's actually download all crates and start storing internal
// tables about them.
for ws in workspaces {
let (packages, resolve) =
ops::resolve_ws(ws).chain_err(|| "failed to load pkg lockfile")?;
packages
.get_many(resolve.iter())
.chain_err(|| "failed to download packages")?;
for pkg in resolve.iter() {
// No need to vendor path crates since they're already in the
// repository
if pkg.source_id().is_path() {
continue;
}
ids.insert(
pkg,
packages
.get_one(pkg)
.chain_err(|| "failed to fetch package")?
.clone(),
);
checksums.insert(pkg, resolve.checksums().get(&pkg).cloned());
}
}
let mut versions = HashMap::new();
for id in ids.keys() {
let map = versions.entry(id.name()).or_insert_with(BTreeMap::default);
if let Some(prev) = map.get(&id.version()) {
bail!(
"found duplicate version of package `{} v{}` \
vendored from two sources:\n\
\n\
\tsource 1: {}\n\
\tsource 2: {}",
id.name(),
id.version(),
prev,
id.source_id()
);
}
map.insert(id.version(), id.source_id());
}
let mut sources = BTreeSet::new();
for (id, pkg) in ids.iter() {
// Next up, copy it to the vendor directory
let src = pkg
.manifest_path()
.parent()
.expect("manifest_path should point to a file");
let max_version = *versions[&id.name()].iter().rev().next().unwrap().0;
let dir_has_version_suffix = id.version() != max_version;
let dst_name = if dir_has_version_suffix {
// E.g., vendor/futures-0.1.13
format!("{}-{}", id.name(), id.version())
} else {
// E.g., vendor/futures
id.name().to_string()
};
sources.insert(id.source_id());
let dst = canonical_destination.join(&dst_name);
to_remove.remove(&dst);
let cksum = dst.join(".cargo-checksum.json");
if dir_has_version_suffix && cksum.exists() {
// Always re-copy directory without version suffix in case the version changed
continue;
}
config.shell().status(
"Vendoring",
&format!("{} ({}) to {}", id, src.to_string_lossy(), dst.display()),
)?;
let _ = fs::remove_dir_all(&dst);
let pathsource = PathSource::new(src, id.source_id(), config);
let paths = pathsource.list_files(pkg)?;
let mut map = BTreeMap::new();
cp_sources(src, &paths, &dst, &mut map)
.chain_err(|| format!("failed to copy over vendored sources for: {}", id))?;
// Finally, emit the metadata about this package
let json = serde_json::json!({
"package": checksums.get(id),
"files": map,
});
File::create(&cksum)?.write_all(json.to_string().as_bytes())?;
}
for path in to_remove {
if path.is_dir() {
paths::remove_dir_all(&path)?;
} else {
paths::remove_file(&path)?;
}
}
// add our vendored source
let mut config = BTreeMap::new();
let merged_source_name = "vendored-sources";
config.insert(
merged_source_name.to_string(),
VendorSource::Directory {
directory: canonical_destination.to_path_buf(),
},
);
// replace original sources with vendor
for source_id in sources {
let name = if source_id.is_default_registry() {
"crates-io".to_string()
} else {
source_id.url().to_string()
};
let source = if source_id.is_default_registry() {
VendorSource::Registry {
registry: None,
replace_with: merged_source_name.to_string(),
}
} else if source_id.is_git() {
let mut branch = None;
let mut tag = None;
let mut rev = None;
if let Some(reference) = source_id.git_reference() {
match *reference {
GitReference::Branch(ref b) => branch = Some(b.clone()),
GitReference::Tag(ref t) => tag = Some(t.clone()),
GitReference::Rev(ref r) => rev = Some(r.clone()),
}
}
VendorSource::Git {
git: source_id.url().to_string(),
branch,
tag,
rev,
replace_with: merged_source_name.to_string(),
}
} else {
panic!("Invalid source ID: {}", source_id)
};
config.insert(name, source);
}
Ok(VendorConfig { source: config })
}
fn cp_sources(
src: &Path,
paths: &[PathBuf],
dst: &Path,
cksums: &mut BTreeMap<String, String>,
) -> CargoResult<()> {
for p in paths {
let relative = p.strip_prefix(&src).unwrap();
match relative.to_str() {
// Skip git config files as they're not relevant to builds most of
// the time and if we respect them (e.g. in git) then it'll
// probably mess with the checksums when a vendor dir is checked
// into someone else's source control
Some(".gitattributes") | Some(".gitignore") | Some(".git") => continue,
// Temporary Cargo files
Some(".cargo-ok") => continue,
// Skip patch-style orig/rej files. Published crates on crates.io
// have `Cargo.toml.orig` which we don't want to use here and
// otherwise these are rarely used as part of the build process.
Some(filename) => {
if filename.ends_with(".orig") || filename.ends_with(".rej") {
continue;
}
}
_ => {}
};
// Join pathname components individually to make sure that the joined
// path uses the correct directory separators everywhere, since
// `relative` may use Unix-style and `dst` may require Windows-style
// backslashes.
let dst = relative
.iter()
.fold(dst.to_owned(), |acc, component| acc.join(&component));
fs::create_dir_all(dst.parent().unwrap())?;
fs::copy(&p, &dst)
.chain_err(|| format!("failed to copy `{}` to `{}`", p.display(), dst.display()))?;
let cksum = Sha256::new().update_path(dst)?.finish_hex();
cksums.insert(relative.to_str().unwrap().replace("\\", "/"), cksum);
}
Ok(())
}
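The directory-naming rule in `sync` above gives the greatest version of each crate the bare crate name, while older duplicate versions get a `name-version` suffix. A standalone sketch of that rule (`vendor_dir_names` is a hypothetical helper, not part of Cargo's API, and it compares version strings lexically rather than with real semver ordering as Cargo does):

```rust
use std::collections::BTreeMap;

/// Map (name, version) pairs to vendor directory names: the greatest
/// version of each crate keeps the bare name, older versions get a
/// `name-version` suffix (e.g. `futures` vs `futures-0.1.13`).
///
/// Note: versions are compared as plain strings here for illustration;
/// Cargo compares actual `semver::Version` values.
fn vendor_dir_names(pkgs: &[(&str, &str)]) -> BTreeMap<(String, String), String> {
    // Collect every version seen per crate name.
    let mut versions: BTreeMap<&str, Vec<&str>> = BTreeMap::new();
    for &(name, ver) in pkgs {
        versions.entry(name).or_default().push(ver);
    }
    for v in versions.values_mut() {
        v.sort();
    }
    let mut out = BTreeMap::new();
    for &(name, ver) in pkgs {
        let max = *versions[name].last().unwrap();
        // Only non-maximal versions get the version suffix.
        let dir = if ver == max {
            name.to_string()
        } else {
            format!("{}-{}", name, ver)
        };
        out.insert((name.to_string(), ver.to_string()), dir);
    }
    out
}
```

This mirrors why `sync` skips re-vendoring suffixed directories whose checksum file already exists: a `name-version` directory is pinned to one immutable version, whereas the unsuffixed directory can change whenever the maximum version moves.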


@ -1,7 +1,5 @@
use std::collections::HashMap;
use std::fmt::{self, Debug, Formatter};
use std::fs::File;
use std::io::Read;
use std::path::{Path, PathBuf};
use serde::Deserialize;
@ -170,23 +168,12 @@ impl<'cfg> Source for DirectorySource<'cfg> {
None => failure::bail!("failed to find entry for `{}` in directory source", id),
};
let mut buf = [0; 16 * 1024];
for (file, cksum) in cksum.files.iter() {
let mut h = Sha256::new();
let file = pkg.root().join(file);
(|| -> CargoResult<()> {
let mut f = File::open(&file)?;
loop {
match f.read(&mut buf)? {
0 => return Ok(()),
n => h.update(&buf[..n]),
}
}
})()
.chain_err(|| format!("failed to calculate checksum of: {}", file.display()))?;
let actual = hex::encode(h.finish());
let actual = Sha256::new()
.update_path(&file)
.chain_err(|| format!("failed to calculate checksum of: {}", file.display()))?
.finish_hex();
if &*actual != cksum {
failure::bail!(
"\


@ -1,9 +1,8 @@
use crate::core::{InternedString, PackageId};
use crate::sources::registry::{MaybeLock, RegistryConfig, RegistryData};
use crate::util::errors::{CargoResult, CargoResultExt};
use crate::util::errors::CargoResult;
use crate::util::paths;
use crate::util::{Config, Filesystem, Sha256};
use hex;
use std::fs::File;
use std::io::prelude::*;
use std::io::SeekFrom;
@ -99,18 +98,8 @@ impl<'cfg> RegistryData for LocalRegistry<'cfg> {
// We don't actually need to download anything per-se, we just need to
// verify the checksum matches the .crate file itself.
let mut state = Sha256::new();
let mut buf = [0; 64 * 1024];
loop {
let n = crate_file
.read(&mut buf)
.chain_err(|| format!("failed to read `{}`", path.display()))?;
if n == 0 {
break;
}
state.update(&buf[..n]);
}
if hex::encode(state.finish()) != checksum {
let actual = Sha256::new().update_file(&crate_file)?.finish_hex();
if actual != checksum {
failure::bail!("failed to verify the checksum of `{}`", pkg)
}


@ -281,9 +281,8 @@ impl<'cfg> RegistryData for RemoteRegistry<'cfg> {
data: &[u8],
) -> CargoResult<File> {
// Verify what we just downloaded
let mut state = Sha256::new();
state.update(data);
if hex::encode(state.finish()) != checksum {
let actual = Sha256::new().update(data).finish_hex();
if actual != checksum {
failure::bail!("failed to verify the checksum of `{}`", pkg)
}


@ -294,6 +294,13 @@ impl Config {
self.values.try_borrow_with(|| self.load_values())
}
pub fn values_mut(&mut self) -> CargoResult<&mut HashMap<String, ConfigValue>> {
match self.values.borrow_mut() {
Some(map) => Ok(map),
None => failure::bail!("config values not loaded yet"),
}
}
// Note: this is used by RLS, not Cargo.
pub fn set_values(&self, values: HashMap<String, ConfigValue>) -> CargoResult<()> {
if self.values.borrow().is_some() {


@ -1,6 +1,9 @@
use self::crypto_hash::{Algorithm, Hasher};
use crate::util::{CargoResult, CargoResultExt};
use crypto_hash;
use std::io::Write;
use std::fs::File;
use std::io::{self, Read, Write};
use std::path::Path;
pub struct Sha256(Hasher);
@ -10,8 +13,28 @@ impl Sha256 {
Sha256(hasher)
}
pub fn update(&mut self, bytes: &[u8]) {
pub fn update(&mut self, bytes: &[u8]) -> &mut Sha256 {
let _ = self.0.write_all(bytes);
self
}
pub fn update_file(&mut self, mut file: &File) -> io::Result<&mut Sha256> {
let mut buf = [0; 64 * 1024];
loop {
let n = file.read(&mut buf)?;
if n == 0 {
break Ok(self);
}
self.update(&buf[..n]);
}
}
pub fn update_path<P: AsRef<Path>>(&mut self, path: P) -> CargoResult<&mut Sha256> {
let path = path.as_ref();
let file = File::open(path)?;
self.update_file(&file)
.chain_err(|| format!("failed to read `{}`", path.display()))?;
Ok(self)
}
pub fn finish(&mut self) -> [u8; 32] {
@ -20,6 +43,10 @@ impl Sha256 {
ret.copy_from_slice(&data[..]);
ret
}
pub fn finish_hex(&mut self) -> String {
hex::encode(self.finish())
}
}
impl Default for Sha256 {

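The new `update_file` helper streams the file through the hasher in 64 KiB chunks instead of loading it into memory. The same chunked-read pattern, sketched with std's `DefaultHasher` standing in for the `crypto-hash` SHA-256 used above (`DefaultHasher` is not cryptographic; it only illustrates the loop structure):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::Hasher;
use std::io::Read;

/// Feed any reader through a hasher in fixed-size chunks, mirroring the
/// `update_file` loop above, without buffering the whole input.
fn hash_reader<R: Read>(mut reader: R) -> std::io::Result<u64> {
    let mut hasher = DefaultHasher::new();
    let mut buf = [0u8; 64 * 1024];
    loop {
        let n = reader.read(&mut buf)?;
        if n == 0 {
            // EOF: the loop is the function's tail expression,
            // so `break Ok(..)` returns the digest.
            break Ok(hasher.finish());
        }
        hasher.write(&buf[..n]);
    }
}
```

Because the hasher consumes a byte stream, feeding it chunk by chunk yields the same digest as feeding the whole buffer at once, which is what makes the fixed 64 KiB buffer safe for files of any size.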

@ -0,0 +1,75 @@
= cargo-vendor(1)
:idprefix: cargo_vendor_
:doctype: manpage
== NAME
cargo-vendor - Vendor all dependencies locally
== SYNOPSIS
`cargo vendor [_OPTIONS_] [_PATH_]`
== DESCRIPTION
This cargo subcommand will vendor all crates.io and git dependencies for a
project into the specified directory at `<path>`. After this command completes
the vendor directory specified by `<path>` will contain all remote sources from
dependencies specified. Additional manifests beyond the default one can be
specified with the `-s` option.
The `cargo vendor` command will also print out the configuration necessary
to use the vendored sources, which you will need to add to `.cargo/config`.
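For example, for a project whose dependencies all come from crates.io, the
printed configuration looks roughly like this (the `directory` value is the
destination path given on the command line; in practice Cargo prints the
canonicalized absolute path):

    [source.crates-io]
    replace-with = "vendored-sources"

    [source.vendored-sources]
    directory = "vendor"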
== OPTIONS
=== Vendor Options
*-s* _MANIFEST_::
*--sync* _MANIFEST_::
Specify extra `Cargo.toml` manifests of workspaces which should also be
vendored and synced to the output.
*--no-delete*::
Don't delete the "vendor" directory when vendoring; instead, keep all
existing contents of the vendor directory.
*--respect-source-config*::
Instead of ignoring `[source]` configuration by default in `.cargo/config`,
read it and use it when downloading crates from crates.io, for example.
=== Manifest Options
include::options-manifest-path.adoc[]
=== Display Options
include::options-display.adoc[]
=== Common Options
include::options-common.adoc[]
include::options-locked.adoc[]
include::section-environment.adoc[]
include::section-exit-status.adoc[]
== EXAMPLES
. Vendor all dependencies into a local "vendor" folder
cargo vendor
. Vendor all dependencies into a local "third-party/vendor" folder
cargo vendor third-party/vendor
. Vendor the current workspace as well as another to "vendor"
cargo vendor -s ../path/to/Cargo.toml
== SEE ALSO
man:cargo[1]


@ -35,7 +35,7 @@ for a Rust API for reading the metadata.</p>
</div>
<div class="listingblock">
<div class="content">
<pre class="highlightjs highlight"><code class="language-javascript hljs" data-lang="javascript">{
<pre class="highlightjs highlight"><code data-lang="javascript" class="language-javascript hljs">{
/* Array of all packages in the workspace.
It also includes all feature-enabled dependencies unless --no-deps is used.
*/


@ -0,0 +1,207 @@
<h2 id="cargo_vendor_name">NAME</h2>
<div class="sectionbody">
<p>cargo-vendor - Vendor all dependencies locally</p>
</div>
<div class="sect1">
<h2 id="cargo_vendor_synopsis">SYNOPSIS</h2>
<div class="sectionbody">
<div class="paragraph">
<p><code>cargo vendor [<em>OPTIONS</em>] [<em>PATH</em>]</code></p>
</div>
</div>
</div>
<div class="sect1">
<h2 id="cargo_vendor_description">DESCRIPTION</h2>
<div class="sectionbody">
<div class="paragraph">
<p>This cargo subcommand will vendor all crates.io and git dependencies for a
project into the specified directory at <code>&lt;path&gt;</code>. After this command completes
the vendor directory specified by <code>&lt;path&gt;</code> will contain all remote sources from
dependencies specified. Additional manifests beyond the default one can be
specified with the <code>-s</code> option.</p>
</div>
<div class="paragraph">
<p>The <code>cargo vendor</code> command will also print out the configuration necessary
to use the vendored sources, which you will need to add to <code>.cargo/config</code>.</p>
</div>
</div>
</div>
<div class="sect1">
<h2 id="cargo_vendor_options">OPTIONS</h2>
<div class="sectionbody">
<div class="sect2">
<h3 id="cargo_vendor_owner_options">Vendor Options</h3>
<div class="dlist">
<dl>
<dt class="hdlist1"><strong>-s</strong> <em>MANIFEST</em></dt>
<dt class="hdlist1"><strong>--sync</strong> <em>MANIFEST</em></dt>
<dd>
<p>Specify extra <code>Cargo.toml</code> manifests of workspaces which should also be
vendored and synced to the output.</p>
</dd>
<dt class="hdlist1"><strong>--no-delete</strong></dt>
<dd>
<p>Don&#8217;t delete the "vendor" directory when vendoring; instead, keep all
existing contents of the vendor directory.</p>
</dd>
<dt class="hdlist1"><strong>--respect-source-config</strong></dt>
<dd>
<p>Instead of ignoring <code>[source]</code> configuration by default in <code>.cargo/config</code>,
read it and use it when downloading crates from crates.io, for example.</p>
</dd>
</dl>
</div>
</div>
<div class="sect2">
<h3 id="cargo_vendor_manifest_options">Manifest Options</h3>
<div class="dlist">
<dl>
<dt class="hdlist1"><strong>--manifest-path</strong> <em>PATH</em></dt>
<dd>
<p>Path to the <code>Cargo.toml</code> file. By default, Cargo searches in the current
directory or any parent directory for the <code>Cargo.toml</code> file.</p>
</dd>
</dl>
</div>
</div>
<div class="sect2">
<h3 id="cargo_vendor_display_options">Display Options</h3>
<div class="dlist">
<dl>
<dt class="hdlist1"><strong>-v</strong></dt>
<dt class="hdlist1"><strong>--verbose</strong></dt>
<dd>
<p>Use verbose output. May be specified twice for "very verbose" output which
includes extra output such as dependency warnings and build script output.
May also be specified with the <code>term.verbose</code>
<a href="reference/config.html">config value</a>.</p>
</dd>
<dt class="hdlist1"><strong>-q</strong></dt>
<dt class="hdlist1"><strong>--quiet</strong></dt>
<dd>
<p>No output printed to stdout.</p>
</dd>
<dt class="hdlist1"><strong>--color</strong> <em>WHEN</em></dt>
<dd>
<p>Control when colored output is used. Valid values:</p>
<div class="ulist">
<ul>
<li>
<p><code>auto</code> (default): Automatically detect if color support is available on the
terminal.</p>
</li>
<li>
<p><code>always</code>: Always display colors.</p>
</li>
<li>
<p><code>never</code>: Never display colors.</p>
</li>
</ul>
</div>
<div class="paragraph">
<p>May also be specified with the <code>term.color</code>
<a href="reference/config.html">config value</a>.</p>
</div>
</dd>
</dl>
</div>
</div>
<div class="sect2">
<h3 id="cargo_vendor_common_options">Common Options</h3>
<div class="dlist">
<dl>
<dt class="hdlist1"><strong>-h</strong></dt>
<dt class="hdlist1"><strong>--help</strong></dt>
<dd>
<p>Prints help information.</p>
</dd>
<dt class="hdlist1"><strong>-Z</strong> <em>FLAG</em>&#8230;&#8203;</dt>
<dd>
<p>Unstable (nightly-only) flags to Cargo. Run <code>cargo -Z help</code> for
details.</p>
</dd>
<dt class="hdlist1"><strong>--frozen</strong></dt>
<dt class="hdlist1"><strong>--locked</strong></dt>
<dd>
<p>Either of these flags requires that the <code>Cargo.lock</code> file is
up-to-date. If the lock file is missing, or it needs to be updated, Cargo will
exit with an error. The <code>--frozen</code> flag also prevents Cargo from
attempting to access the network to determine if it is out-of-date.</p>
<div class="paragraph">
<p>These may be used in environments where you want to assert that the
<code>Cargo.lock</code> file is up-to-date (such as a CI build) or want to avoid network
access.</p>
</div>
</dd>
</dl>
</div>
</div>
</div>
</div>
<div class="sect1">
<h2 id="cargo_vendor_environment">ENVIRONMENT</h2>
<div class="sectionbody">
<div class="paragraph">
<p>See <a href="reference/environment-variables.html">the reference</a> for
details on environment variables that Cargo reads.</p>
</div>
</div>
</div>
<div class="sect1">
<h2 id="cargo_vendor_exit_status">Exit Status</h2>
<div class="sectionbody">
<div class="dlist">
<dl>
<dt class="hdlist1">0</dt>
<dd>
<p>Cargo succeeded.</p>
</dd>
<dt class="hdlist1">101</dt>
<dd>
<p>Cargo failed to complete.</p>
</dd>
</dl>
</div>
</div>
</div>
<div class="sect1">
<h2 id="cargo_vendor_examples">EXAMPLES</h2>
<div class="sectionbody">
<div class="olist arabic">
<ol class="arabic">
<li>
<p>Vendor all dependencies into a local "vendor" folder</p>
<div class="literalblock">
<div class="content">
<pre>cargo vendor</pre>
</div>
</div>
</li>
<li>
<p>Vendor all dependencies into a local "third-party/vendor" folder</p>
<div class="literalblock">
<div class="content">
<pre>cargo vendor third-party/vendor</pre>
</div>
</div>
</li>
<li>
<p>Vendor the current workspace as well as another to "vendor"</p>
<div class="literalblock">
<div class="content">
<pre>cargo vendor -s ../path/to/Cargo.toml</pre>
</div>
</div>
</li>
</ol>
</div>
</div>
</div>
<div class="sect1">
<h2 id="cargo_vendor_see_also">SEE ALSO</h2>
<div class="sectionbody">
<div class="paragraph">
<p><a href="commands/index.html">cargo(1)</a></p>
</div>
</div>
</div>


@ -49,6 +49,7 @@
* [metadata](commands/cargo-metadata.md)
* [pkgid](commands/cargo-pkgid.md)
* [update](commands/cargo-update.md)
* [vendor](commands/cargo-vendor.md)
* [verify-project](commands/cargo-verify-project.md)
* [Package Commands](commands/package-commands.md)
* [init](commands/cargo-init.md)


@ -0,0 +1,4 @@
# cargo vendor
{{#include command-common.html}}
{{#include ../../man/generated/cargo-vendor.html}}


@ -107,12 +107,8 @@ the crates that are present).
A "directory source" is similar to a local registry source where it contains a
number of crates available on the local filesystem, suitable for vendoring
dependencies. Also like local registries, directory sources can primarily be
managed by an external subcommand, [`cargo-vendor`][cargo-vendor],
[available on crates.io][cargo-vendor] and can be
installed with `cargo install cargo-vendor`.
[cargo-vendor]: https://crates.io/crates/cargo-vendor
dependencies. Directory sources are primarily managed with the `cargo vendor`
subcommand.
Directory sources are distinct from local registries though in that they contain
the unpacked version of `*.crate` files, making it more suitable in some


@ -440,7 +440,8 @@ Cargo how to find local unpublished crates.
Platform-specific dependencies take the same format, but are listed under a
`target` section. Normally Rust-like `#[cfg]` syntax will be used to define
`target` section. Normally Rust-like [`#[cfg]`
syntax](../reference/conditional-compilation.html) will be used to define
these sections:
```toml
@ -458,8 +459,18 @@ native = { path = "native/x86_64" }
```
Like with Rust, the syntax here supports the `not`, `any`, and `all` operators
to combine various cfg name/value pairs. Note that the `cfg` syntax has only
been available since Cargo 0.9.0 (Rust 1.8.0).
to combine various cfg name/value pairs.
If you want to know which cfg targets are available on your platform, run
`rustc --print=cfg` from the command line. If you want to know which `cfg`
targets are available for another platform, such as 64-bit Windows,
run `rustc --print=cfg --target=x86_64-pc-windows-msvc`.
Unlike in your Rust source code,
you cannot use `[target.'cfg(feature = "my_crate")'.dependencies]` to add
dependencies based on optional crate features.
Use [the `[features]` section](reference/manifest.html#the-features-section)
instead.
In addition to `#[cfg]` syntax, Cargo also supports listing out the full target
the dependencies would apply to:


@ -270,7 +270,7 @@ private_dep = "2.0.0" # Will be 'private' by default
```
### cache-messages
* Tracking Issue: TODO
* Tracking Issue: [#6986](https://github.com/rust-lang/cargo/issues/6986)
The `cache-messages` feature causes Cargo to cache the messages generated by
the compiler. This is primarily useful if a crate compiles successfully with

src/etc/man/cargo-vendor.1 Normal file

@ -0,0 +1,223 @@
'\" t
.\" Title: cargo-vendor
.\" Author: [see the "AUTHOR(S)" section]
.\" Generator: Asciidoctor 2.0.8
.\" Date: 2019-04-29
.\" Manual: \ \&
.\" Source: \ \&
.\" Language: English
.\"
.TH "CARGO\-VENDOR" "1" "2019-04-29" "\ \&" "\ \&"
.ie \n(.g .ds Aq \(aq
.el .ds Aq '
.ss \n[.ss] 0
.nh
.ad l
.de URL
\fI\\$2\fP <\\$1>\\$3
..
.als MTO URL
.if \n[.g] \{\
. mso www.tmac
. am URL
. ad l
. .
. am MTO
. ad l
. .
. LINKSTYLE blue R < >
.\}
.SH "NAME"
cargo\-vendor \- Vendor all dependencies locally
.SH "SYNOPSIS"
.sp
\fBcargo vendor [\fIOPTIONS\fP] [\fIPATH\fP]\fP
.SH "DESCRIPTION"
.sp
This cargo subcommand will vendor all crates.io and git dependencies for a
project into the specified directory at \fB<path>\fP. After this command completes
the vendor directory specified by \fB<path>\fP will contain all remote sources from
dependencies specified. Additional manifests beyond the default one can be
specified with the \fB\-s\fP option.
.sp
The \fBcargo vendor\fP command will also print out the configuration necessary
to use the vendored sources, which you will need to add to \fB.cargo/config\fP.
.SH "OPTIONS"
.SS "Vendor Options"
.sp
\fB\-s\fP \fIMANIFEST\fP, \fB\-\-sync\fP \fIMANIFEST\fP
.RS 4
Specify extra \fBCargo.toml\fP manifests of workspaces which should also be
vendored and synced to the output.
.RE
.sp
\fB\-\-no\-delete\fP
.RS 4
Don\(cqt delete the "vendor" directory when vendoring; instead, keep all
existing contents of the vendor directory.
.RE
.sp
\fB\-\-respect\-source\-config\fP
.RS 4
Instead of ignoring \fB[source]\fP configuration by default in \fB.cargo/config\fP,
read it and use it when downloading crates from crates.io, for example.
.RE
.SS "Manifest Options"
.sp
\fB\-\-manifest\-path\fP \fIPATH\fP
.RS 4
Path to the \fBCargo.toml\fP file. By default, Cargo searches in the current
directory or any parent directory for the \fBCargo.toml\fP file.
.RE
.SS "Display Options"
.sp
\fB\-v\fP, \fB\-\-verbose\fP
.RS 4
Use verbose output. May be specified twice for "very verbose" output which
includes extra output such as dependency warnings and build script output.
May also be specified with the \fBterm.verbose\fP
.URL "https://doc.rust\-lang.org/cargo/reference/config.html" "config value" "."
.RE
.sp
\fB\-q\fP, \fB\-\-quiet\fP
.RS 4
No output printed to stdout.
.RE
.sp
\fB\-\-color\fP \fIWHEN\fP
.RS 4
Control when colored output is used. Valid values:
.sp
.RS 4
.ie n \{\
\h'-04'\(bu\h'+03'\c
.\}
.el \{\
. sp -1
. IP \(bu 2.3
.\}
\fBauto\fP (default): Automatically detect if color support is available on the
terminal.
.RE
.sp
.RS 4
.ie n \{\
\h'-04'\(bu\h'+03'\c
.\}
.el \{\
. sp -1
. IP \(bu 2.3
.\}
\fBalways\fP: Always display colors.
.RE
.sp
.RS 4
.ie n \{\
\h'-04'\(bu\h'+03'\c
.\}
.el \{\
. sp -1
. IP \(bu 2.3
.\}
\fBnever\fP: Never display colors.
.RE
.sp
May also be specified with the \fBterm.color\fP
.URL "https://doc.rust\-lang.org/cargo/reference/config.html" "config value" "."
.RE
.SS "Common Options"
.sp
\fB\-h\fP, \fB\-\-help\fP
.RS 4
Prints help information.
.RE
.sp
\fB\-Z\fP \fIFLAG\fP...
.RS 4
Unstable (nightly\-only) flags to Cargo. Run \fBcargo \-Z help\fP for
details.
.RE
.sp
\fB\-\-frozen\fP, \fB\-\-locked\fP
.RS 4
Either of these flags requires that the \fBCargo.lock\fP file is
up\-to\-date. If the lock file is missing, or it needs to be updated, Cargo will
exit with an error. The \fB\-\-frozen\fP flag also prevents Cargo from
attempting to access the network to determine if it is out\-of\-date.
.sp
These may be used in environments where you want to assert that the
\fBCargo.lock\fP file is up\-to\-date (such as a CI build) or want to avoid network
access.
.RE
.SH "ENVIRONMENT"
.sp
See \c
.URL "https://doc.rust\-lang.org/cargo/reference/environment\-variables.html" "the reference" " "
for
details on environment variables that Cargo reads.
.SH "EXIT STATUS"
.sp
0
.RS 4
Cargo succeeded.
.RE
.sp
101
.RS 4
Cargo failed to complete.
.RE
.SH "EXAMPLES"
.sp
.RS 4
.ie n \{\
\h'-04' 1.\h'+01'\c
.\}
.el \{\
. sp -1
. IP " 1." 4.2
.\}
Vendor all dependencies into a local "vendor" folder
.sp
.if n .RS 4
.nf
cargo vendor
.fi
.if n .RE
.RE
.sp
.RS 4
.ie n \{\
\h'-04' 2.\h'+01'\c
.\}
.el \{\
. sp -1
. IP " 2." 4.2
.\}
Vendor all dependencies into a local "third\-party/vendor" folder
.sp
.if n .RS 4
.nf
cargo vendor third\-party/vendor
.fi
.if n .RE
.RE
.sp
.RS 4
.ie n \{\
\h'-04' 3.\h'+01'\c
.\}
.el \{\
. sp -1
. IP " 3." 4.2
.\}
Vendor the current workspace as well as another to "vendor"
.sp
.if n .RS 4
.nf
cargo vendor \-s ../path/to/Cargo.toml
.fi
.if n .RE
.RE
.SH "SEE ALSO"
.sp
\fBcargo\fP(1)


@ -91,7 +91,11 @@ fn http_auth_offered() {
let config = paths::home().join(".gitconfig");
let mut config = git2::Config::open(&config).unwrap();
config
.set_str("credential.helper", &script.display().to_string())
.set_str(
"credential.helper",
// This is a bash script so replace `\` with `/` for Windows
&script.display().to_string().replace("\\", "/"),
)
.unwrap();
let p = project()


@ -103,3 +103,47 @@ This may become a hard error in the future; see <https://github.com/rust-lang/ca
")
.run();
}
#[test]
fn collision_doc() {
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
foo2 = { path = "foo2" }
"#,
)
.file("src/lib.rs", "")
.file(
"foo2/Cargo.toml",
r#"
[package]
name = "foo2"
version = "0.1.0"
[lib]
name = "foo"
"#,
)
.file("foo2/src/lib.rs", "")
.build();
p.cargo("doc")
.with_stderr_contains(
"\
[WARNING] output filename collision.
The lib target `foo` in package `foo2 v0.1.0 ([..]/foo/foo2)` has the same output \
filename as the lib target `foo` in package `foo v0.1.0 ([..]/foo)`.
Colliding filename is: [..]/foo/target/doc/foo/index.html
The targets should have unique names.
Consider changing their names to be unique or compiling them separately.
This may become a hard error in the future; see <https://github.com/rust-lang/cargo/issues/6313>.
",
)
.run();
}


@ -1870,3 +1870,44 @@ fn warn_if_default_features() {
"#.trim(),
).run();
}
#[test]
fn no_feature_for_non_optional_dep() {
let p = project()
.file(
"Cargo.toml",
r#"
[project]
name = "foo"
version = "0.0.1"
authors = []
[dependencies]
bar = { path = "bar" }
"#,
)
.file(
"src/main.rs",
r#"
#[cfg(not(feature = "bar"))]
fn main() {
}
"#,
)
.file(
"bar/Cargo.toml",
r#"
[project]
name = "bar"
version = "0.0.1"
authors = []
[features]
a = []
"#,
)
.file("bar/src/lib.rs", "pub fn bar() {}")
.build();
p.cargo("build --features bar/a").run();
}


@ -89,6 +89,7 @@ mod small_fd_limits;
mod test;
mod tool_paths;
mod update;
mod vendor;
mod verify_project;
mod version;
mod warn_on_failure;


@ -9,7 +9,7 @@ use crate::support::registry::Package;
use crate::support::resolver::{
assert_contains, assert_same, dep, dep_kind, dep_loc, dep_req, dep_req_kind, loc_names, names,
pkg, pkg_dep, pkg_id, pkg_loc, registry, registry_strategy, remove_dep, resolve,
resolve_and_validated, resolve_with_config, PrettyPrintRegistry, ToDep, ToPkgId,
resolve_and_validated, resolve_with_config, PrettyPrintRegistry, SatResolve, ToDep, ToPkgId,
};
use proptest::prelude::*;
@ -41,6 +41,7 @@ proptest! {
PrettyPrintRegistry(input) in registry_strategy(50, 20, 60)
) {
let reg = registry(input.clone());
let mut sat_resolve = SatResolve::new(&reg);
// there is only a small chance that any one
// crate will be interesting.
// So we try some of the most complicated.
@ -49,6 +50,7 @@ proptest! {
pkg_id("root"),
vec![dep_req(&this.name(), &format!("={}", this.version()))],
&reg,
Some(&mut sat_resolve),
);
}
}
@ -236,6 +238,18 @@ proptest! {
}
}
#[test]
fn pub_fail() {
let input = vec![
pkg!(("a", "0.0.4")),
pkg!(("a", "0.0.5")),
pkg!(("e", "0.0.6") => [dep_req_kind("a", "<= 0.0.4", Kind::Normal, true),]),
pkg!(("kB", "0.0.3") => [dep_req("a", ">= 0.0.5"),dep("e"),]),
];
let reg = registry(input);
assert!(resolve_and_validated(pkg_id("root"), vec![dep("kB")], &reg, None).is_err());
}
#[test]
fn basic_public_dependency() {
let reg = registry(vec![
@ -245,7 +259,7 @@ fn basic_public_dependency() {
pkg!("C" => [dep("A"), dep("B")]),
]);
let res = resolve_and_validated(pkg_id("root"), vec![dep("C")], &reg).unwrap();
let res = resolve_and_validated(pkg_id("root"), vec![dep("C")], &reg, None).unwrap();
assert_same(
&res,
&names(&[
@ -281,7 +295,7 @@ fn public_dependency_filling_in() {
pkg!("d" => [dep("c"), dep("a"), dep("b")]),
]);
let res = resolve_and_validated(pkg_id("root"), vec![dep("d")], &reg).unwrap();
let res = resolve_and_validated(pkg_id("root"), vec![dep("d")], &reg, None).unwrap();
assert_same(
&res,
&names(&[
@ -316,7 +330,7 @@ fn public_dependency_filling_in_and_update() {
pkg!("C" => [dep("A"),dep("B")]),
pkg!("D" => [dep("B"),dep("C")]),
]);
let res = resolve_and_validated(pkg_id("root"), vec![dep("D")], &reg).unwrap();
let res = resolve_and_validated(pkg_id("root"), vec![dep("D")], &reg, None).unwrap();
assert_same(
&res,
&names(&[
@ -343,7 +357,7 @@ fn public_dependency_skipping() {
];
let reg = registry(input);
resolve(pkg_id("root"), vec![dep("c")], &reg).unwrap();
resolve_and_validated(pkg_id("root"), vec![dep("c")], &reg, None).unwrap();
}
#[test]
@ -363,7 +377,50 @@ fn public_dependency_skipping_in_backtracking() {
];
let reg = registry(input);
resolve(pkg_id("root"), vec![dep("C")], &reg).unwrap();
resolve_and_validated(pkg_id("root"), vec![dep("C")], &reg, None).unwrap();
}
#[test]
fn public_sat_topological_order() {
let input = vec![
pkg!(("a", "0.0.1")),
pkg!(("a", "0.0.0")),
pkg!(("b", "0.0.1") => [dep_req_kind("a", "= 0.0.1", Kind::Normal, true),]),
pkg!(("b", "0.0.0") => [dep("bad"),]),
pkg!("A" => [dep_req("a", "= 0.0.0"),dep_req_kind("b", "*", Kind::Normal, true)]),
];
let reg = registry(input);
assert!(resolve_and_validated(pkg_id("root"), vec![dep("A")], &reg, None).is_err());
}
#[test]
fn public_sat_unused_makes_things_pub() {
let input = vec![
pkg!(("a", "0.0.1")),
pkg!(("a", "0.0.0")),
pkg!(("b", "8.0.1") => [dep_req_kind("a", "= 0.0.1", Kind::Normal, true),]),
pkg!(("b", "8.0.0") => [dep_req("a", "= 0.0.1"),]),
pkg!("c" => [dep_req("b", "= 8.0.0"),dep_req("a", "= 0.0.0"),]),
];
let reg = registry(input);
resolve_and_validated(pkg_id("root"), vec![dep("c")], &reg, None).unwrap();
}
#[test]
fn public_sat_unused_makes_things_pub_2() {
let input = vec![
pkg!(("c", "0.0.2")),
pkg!(("c", "0.0.1")),
pkg!(("a-sys", "0.0.2")),
pkg!(("a-sys", "0.0.1") => [dep_req_kind("c", "= 0.0.1", Kind::Normal, true),]),
pkg!("P" => [dep_req_kind("a-sys", "*", Kind::Normal, true),dep_req("c", "= 0.0.1"),]),
pkg!("A" => [dep("P"),dep_req("c", "= 0.0.2"),]),
];
let reg = registry(input);
resolve_and_validated(pkg_id("root"), vec![dep("A")], &reg, None).unwrap();
}
#[test]
@ -1407,7 +1464,7 @@ fn conflict_store_bug() {
];
let reg = registry(input);
let _ = resolve_and_validated(pkg_id("root"), vec![dep("j")], &reg);
let _ = resolve_and_validated(pkg_id("root"), vec![dep("j")], &reg, None);
}
#[test]
@ -1435,5 +1492,5 @@ fn conflict_store_more_then_one_match() {
]),
];
let reg = registry(input);
let _ = resolve_and_validated(pkg_id("root"), vec![dep("nA")], &reg);
let _ = resolve_and_validated(pkg_id("root"), vec![dep("nA")], &reg, None);
}

View file

@ -1359,3 +1359,28 @@ fn env_rustflags_misspelled_build_script() {
.with_stderr_contains("[WARNING] Cargo does not read `RUST_FLAGS` environment variable. Did you mean `RUSTFLAGS`?")
.run();
}
#[test]
fn remap_path_prefix_ignored() {
// Ensure that --remap-path-prefix does not affect metadata hash.
let p = project().file("src/lib.rs", "").build();
p.cargo("build").run();
let rlibs = p
.glob("target/debug/deps/*.rlib")
.collect::<Result<Vec<_>, _>>()
.unwrap();
assert_eq!(rlibs.len(), 1);
p.cargo("clean").run();
p.cargo("build")
.env(
"RUSTFLAGS",
"--remap-path-prefix=/abc=/zoo --remap-path-prefix /spaced=/zoo",
)
.run();
let rlibs2 = p
.glob("target/debug/deps/*.rlib")
.collect::<Result<Vec<_>, _>>()
.unwrap();
assert_eq!(rlibs, rlibs2);
}

View file

@ -538,9 +538,7 @@ impl Package {
}
pub fn cksum(s: &[u8]) -> String {
let mut sha = Sha256::new();
sha.update(s);
hex::encode(&sha.finish())
Sha256::new().update(s).finish_hex()
}
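The refactored `cksum` above folds hashing and hex formatting into a single chained `finish_hex()` call on cargo's internal test-support `Sha256` wrapper (that API is internal to the test-support crate). The hex step it absorbs is plain byte-to-lowercase-hex conversion, which can be sketched with the standard library alone:

```rust
// Minimal lowercase-hex encoding, equivalent to what the `hex::encode`
// call in the old `cksum` body produced. Illustrative only; the real
// code now does this inside the test-support `Sha256::finish_hex`.
fn to_hex(bytes: &[u8]) -> String {
    bytes.iter().map(|b| format!("{:02x}", b)).collect()
}

fn main() {
    assert_eq!(to_hex(&[0x00, 0xab, 0xff]), "00abff");
    println!("{}", to_hex(b"cargo")); // hex of the ASCII bytes of "cargo"
}
```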
impl Dependency {

View file

@ -11,7 +11,7 @@ use cargo::core::resolver::{self, Method};
use cargo::core::source::{GitReference, SourceId};
use cargo::core::Resolve;
use cargo::core::{Dependency, PackageId, Registry, Summary};
use cargo::util::{CargoResult, Config, ToUrl};
use cargo::util::{CargoResult, Config, Graph, ToUrl};
use proptest::collection::{btree_map, vec};
use proptest::prelude::*;
@ -19,6 +19,7 @@ use proptest::sample::Index;
use proptest::strategy::ValueTree;
use proptest::string::string_regex;
use proptest::test_runner::TestRunner;
use varisat::{self, ExtendFormula};
pub fn resolve(
pkg: PackageId,
@ -32,8 +33,17 @@ pub fn resolve_and_validated(
pkg: PackageId,
deps: Vec<Dependency>,
registry: &[Summary],
sat_resolve: Option<&mut SatResolve>,
) -> CargoResult<Vec<PackageId>> {
let resolve = resolve_with_config_raw(pkg, deps, registry, None)?;
let should_resolve = if let Some(sat) = sat_resolve {
sat.sat_resolve(&deps)
} else {
SatResolve::new(registry).sat_resolve(&deps)
};
let resolve = resolve_with_config_raw(pkg, deps, registry, None);
assert_eq!(resolve.is_ok(), should_resolve);
let resolve = resolve?;
let mut stack = vec![pkg];
let mut used = HashSet::new();
let mut links = HashSet::new();
@ -176,6 +186,274 @@ pub fn resolve_with_config_raw(
resolve
}
const fn num_bits<T>() -> usize {
std::mem::size_of::<T>() * 8
}
fn log_bits(x: usize) -> usize {
if x == 0 {
return 0;
}
(num_bits::<usize>() as u32 - x.leading_zeros()) as usize
}
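`log_bits` returns the bit length of `x` (the number of bits in its binary representation), which is how many commander bits the encoding below allocates to give each of `x` variables a distinct index. A standalone sketch of that behavior, using only the standard library:

```rust
// Standalone copy of the bit-width helpers above, to show what they
// compute: log_bits(x) is the bit length of x.
const fn num_bits<T>() -> usize {
    std::mem::size_of::<T>() * 8
}

fn log_bits(x: usize) -> usize {
    if x == 0 {
        return 0;
    }
    (num_bits::<usize>() as u32 - x.leading_zeros()) as usize
}

fn main() {
    // bit length: 0 -> 0, 1 -> 1, 2..=3 -> 2, 4..=7 -> 3, 8..=15 -> 4, ...
    for (x, expect) in [(0, 0), (1, 1), (3, 2), (4, 3), (8, 4)] {
        assert_eq!(log_bits(x), expect);
    }
}
```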
fn sat_at_most_one(solver: &mut impl varisat::ExtendFormula, vars: &[varisat::Var]) {
if vars.len() <= 1 {
return;
} else if vars.len() == 2 {
solver.add_clause(&[vars[0].negative(), vars[1].negative()]);
return;
} else if vars.len() == 3 {
solver.add_clause(&[vars[0].negative(), vars[1].negative()]);
solver.add_clause(&[vars[0].negative(), vars[2].negative()]);
solver.add_clause(&[vars[1].negative(), vars[2].negative()]);
return;
}
// use the "Binary Encoding" from
// https://www.it.uu.se/research/group/astra/ModRef10/papers/Alan%20M.%20Frisch%20and%20Paul%20A.%20Giannoros.%20SAT%20Encodings%20of%20the%20At-Most-k%20Constraint%20-%20ModRef%202010.pdf
let bits: Vec<varisat::Var> = solver.new_var_iter(log_bits(vars.len())).collect();
for (i, p) in vars.iter().enumerate() {
for b in 0..bits.len() {
solver.add_clause(&[p.negative(), bits[b].lit(((1 << b) & i) > 0)]);
}
}
}
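In the binary encoding above, each variable `p_i` implies that a small set of shared "bit" variables spell out `i` in binary; if two distinct variables were both true, some bit would be forced both ways, so at most one can hold. A brute-force check of that property, with clauses written DIMACS-style as `Vec<i32>` (positive literal = var true, negative = var false) instead of varisat types:

```rust
// Brute-force verification of the binary at-most-one encoding.
// Clause representation is illustrative: literal v > 0 means variable v
// is true, v < 0 means it is false; variables are numbered from 1.

fn at_most_one_binary(n_vars: usize) -> (usize, Vec<Vec<i32>>) {
    // package vars are 1..=n, shared bit vars are n+1..=n+k
    let k = (usize::BITS - n_vars.leading_zeros()) as usize;
    let mut clauses = Vec::new();
    for i in 0..n_vars {
        let p = (i + 1) as i32;
        for b in 0..k {
            let bit = (n_vars + b + 1) as i32;
            // p_i implies bit_b matches the b-th bit of i's binary index
            let lit = if (i >> b) & 1 == 1 { bit } else { -bit };
            clauses.push(vec![-p, lit]);
        }
    }
    (n_vars + k, clauses)
}

fn satisfies(assign: u32, clauses: &[Vec<i32>]) -> bool {
    // `assign` packs variable v into bit (v - 1)
    clauses.iter().all(|c| {
        c.iter().any(|&lit| {
            let val = (assign >> (lit.unsigned_abs() - 1)) & 1 == 1;
            if lit > 0 { val } else { !val }
        })
    })
}

fn main() {
    let n = 5;
    let (total, clauses) = at_most_one_binary(n);
    // enumerate every assignment of all n + k variables
    for assign in 0u32..(1 << total) {
        if satisfies(assign, &clauses) {
            let active = (0..n).filter(|&i| (assign >> i) & 1 == 1).count();
            assert!(active <= 1, "two packages active at once");
        }
    }
    println!("every satisfying assignment activates at most one package");
}
```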
fn sat_at_most_one_by_key<K: std::hash::Hash + Eq>(
cnf: &mut impl varisat::ExtendFormula,
data: impl Iterator<Item = (K, varisat::Var)>,
) -> HashMap<K, Vec<varisat::Var>> {
// for each unique key, at most one of its vars may be true
let mut by_keys: HashMap<K, Vec<varisat::Var>> = HashMap::new();
for (p, v) in data {
by_keys.entry(p).or_default().push(v)
}
for key in by_keys.values() {
sat_at_most_one(cnf, key);
}
by_keys
}
/// Resolution can be reduced to the SAT problem, so this is an alternative implementation
/// of the resolver that uses a SAT library for the hard work. It is intended to be easy to read
/// compared to the real resolver.
///
/// For the subset of functionality currently generated by `registry_strategy`, this will
/// find a valid resolution if one exists. The big thing the real resolver does
/// that this one does not is handle features and optional dependencies.
///
/// The SAT library does not optimize for the newest version,
/// so the selected packages may not match the real resolver.
pub struct SatResolve {
solver: varisat::Solver<'static>,
var_for_is_packages_used: HashMap<PackageId, varisat::Var>,
by_name: HashMap<&'static str, Vec<PackageId>>,
}
impl SatResolve {
pub fn new(registry: &[Summary]) -> Self {
let mut cnf = varisat::CnfFormula::new();
let var_for_is_packages_used: HashMap<PackageId, varisat::Var> = registry
.iter()
.map(|s| (s.package_id(), cnf.new_var()))
.collect();
// no two packages with the same links set
sat_at_most_one_by_key(
&mut cnf,
registry
.iter()
.map(|s| (s.links(), var_for_is_packages_used[&s.package_id()]))
.filter(|(l, _)| l.is_some()),
);
// no two semver compatible versions of the same package
let by_activations_keys = sat_at_most_one_by_key(
&mut cnf,
var_for_is_packages_used
.iter()
.map(|(p, &v)| (p.as_activations_key(), v)),
);
let mut by_name: HashMap<&'static str, Vec<PackageId>> = HashMap::new();
for p in registry.iter() {
by_name
.entry(p.name().as_str())
.or_default()
.push(p.package_id())
}
let empty_vec = vec![];
let mut graph: Graph<PackageId, ()> = Graph::new();
let mut version_selected_for: HashMap<
PackageId,
HashMap<Dependency, HashMap<_, varisat::Var>>,
> = HashMap::new();
// active packages need each of their `deps` to be satisfied
for p in registry.iter() {
graph.add(p.package_id());
for dep in p.dependencies() {
// This can more easily be written as:
// !is_active(p) or one of the things that match dep is_active
// All the complexity, from here to the end, is to support public and private dependencies!
let mut by_key: HashMap<_, Vec<varisat::Lit>> = HashMap::new();
for &m in by_name
.get(dep.package_name().as_str())
.unwrap_or(&empty_vec)
.iter()
.filter(|&p| dep.matches_id(*p))
{
graph.link(p.package_id(), m);
by_key
.entry(m.as_activations_key())
.or_default()
.push(var_for_is_packages_used[&m].positive());
}
let keys: HashMap<_, _> = by_key.keys().map(|&k| (k, cnf.new_var())).collect();
// if `p` is active then we need to select one of the keys
let matches: Vec<_> = keys
.values()
.map(|v| v.positive())
.chain(Some(var_for_is_packages_used[&p.package_id()].negative()))
.collect();
cnf.add_clause(&matches);
// if a key is active then we need to select one of the versions
for (key, vars) in by_key.iter() {
let mut matches = vars.clone();
matches.push(keys[key].negative());
cnf.add_clause(&matches);
}
version_selected_for
.entry(p.package_id())
.or_default()
.insert(dep.clone(), keys);
}
}
let topological_order = graph.sort();
// we already ensure there is only one version for each `activations_key` so we can think of
// `publicly_exports` as being in terms of a set of `activations_key`s
let mut publicly_exports: HashMap<_, HashMap<_, varisat::Var>> = HashMap::new();
for &key in by_activations_keys.keys() {
// everything publicly depends on itself
let var = publicly_exports
.entry(key)
.or_default()
.entry(key)
.or_insert_with(|| cnf.new_var());
cnf.add_clause(&[var.positive()]);
}
// if a `dep` is public then `p` `publicly_exports` all the things that the selected version `publicly_exports`
for &p in topological_order.iter() {
if let Some(deps) = version_selected_for.get(&p) {
let mut p_exports = publicly_exports.remove(&p.as_activations_key()).unwrap();
for (_, versions) in deps.iter().filter(|(d, _)| d.is_public()) {
for (ver, sel) in versions {
for (&export_pid, &export_var) in publicly_exports[ver].iter() {
let our_var =
p_exports.entry(export_pid).or_insert_with(|| cnf.new_var());
cnf.add_clause(&[
sel.negative(),
export_var.negative(),
our_var.positive(),
]);
}
}
}
publicly_exports.insert(p.as_activations_key(), p_exports);
}
}
// we already ensure there is only one version for each `activations_key` so we can think of
// `can_see` as being in terms of a set of `activations_key`s
// and if `p` `publicly_exports` `export` then it `can_see` `export`
let mut can_see: HashMap<_, HashMap<_, varisat::Var>> = HashMap::new();
// if `p` has a `dep` that selected `ver` then it `can_see` all the things that the selected version `publicly_exports`
for (&p, deps) in version_selected_for.iter() {
let p_can_see = can_see.entry(p).or_default();
for (_, versions) in deps.iter() {
for (&ver, sel) in versions {
for (&export_pid, &export_var) in publicly_exports[&ver].iter() {
let our_var = p_can_see.entry(export_pid).or_insert_with(|| cnf.new_var());
cnf.add_clause(&[
sel.negative(),
export_var.negative(),
our_var.positive(),
]);
}
}
}
}
// a package `can_see` only one version of each name
for (_, see) in can_see.iter() {
sat_at_most_one_by_key(&mut cnf, see.iter().map(|((name, _, _), &v)| (name, v)));
}
let mut solver = varisat::Solver::new();
solver.add_formula(&cnf);
// We don't need to `solve` now. We know that "use nothing" will satisfy all the clauses so far.
// But things run faster if we let it spend some time figuring out how the constraints interact before we add assumptions.
solver
.solve()
.expect("docs say it can't error in default config");
SatResolve {
solver,
var_for_is_packages_used,
by_name,
}
}
pub fn sat_resolve(&mut self, deps: &[Dependency]) -> bool {
let mut assumption = vec![];
let mut this_call = None;
// the starting `deps` need to be satisfied
for dep in deps.iter() {
let empty_vec = vec![];
let matches: Vec<varisat::Lit> = self
.by_name
.get(dep.package_name().as_str())
.unwrap_or(&empty_vec)
.iter()
.filter(|&p| dep.matches_id(*p))
.map(|p| self.var_for_is_packages_used[p].positive())
.collect();
if matches.is_empty() {
return false;
} else if matches.len() == 1 {
assumption.extend_from_slice(&matches)
} else {
if this_call.is_none() {
let new_var = self.solver.new_var();
this_call = Some(new_var);
assumption.push(new_var.positive());
}
let mut matches = matches;
matches.push(this_call.unwrap().negative());
self.solver.add_clause(&matches);
}
}
self.solver.assume(&assumption);
self.solver
.solve()
.expect("docs say it can't error in default config")
}
}
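`sat_resolve` reuses one incremental solver across queries by guarding each query's new clauses behind a fresh variable and passing that variable as an assumption: clauses from earlier queries stay in the solver but are inert once their guard is no longer assumed. A small sketch of the guard trick, using the same DIMACS-style literal convention as above rather than varisat's API:

```rust
// Guarded clauses: (¬g ∨ x1 ∨ x2) is inert while the guard g is false,
// and only forces (x1 ∨ x2) when g is assumed true. Literal v > 0 means
// variable v true, v < 0 means false; variables are numbered from 1.
fn lit_true(lit: i32, assign: u32) -> bool {
    let val = (assign >> (lit.unsigned_abs() - 1)) & 1 == 1;
    if lit > 0 { val } else { !val }
}

fn clause_true(clause: &[i32], assign: u32) -> bool {
    clause.iter().any(|&l| lit_true(l, assign))
}

fn main() {
    // vars: 1 = guard g (bit 0), 2 = x1 (bit 1), 3 = x2 (bit 2)
    let guarded = [-1, 2, 3];
    // With g false, the clause holds no matter what x1/x2 are: inert.
    for assign in [0b000u32, 0b010, 0b100, 0b110] {
        assert!(clause_true(&guarded, assign));
    }
    // With g true, the clause really does force x1 or x2.
    assert!(!clause_true(&guarded, 0b001)); // g true, x1/x2 false: violated
    assert!(clause_true(&guarded, 0b011)); // g true, x1 true: satisfied
    println!("guarded clause behaves as expected");
}
```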
pub trait ToDep {
fn to_dep(self) -> Dependency;
}
@ -527,12 +805,11 @@ pub fn registry_strategy(
// => Kind::Development, // Development has no impact so don't gen
_ => panic!("bad index for Kind"),
},
p,
p && k == 0,
))
}
PrettyPrintRegistry(
list_of_pkgid
let mut out: Vec<Summary> = list_of_pkgid
.into_iter()
.zip(dependency_by_pkgid.into_iter())
.map(|(((name, ver), allow_deps), deps)| {
@ -548,8 +825,14 @@ pub fn registry_strategy(
},
)
})
.collect(),
)
.collect();
if reverse_alphabetical {
// make sure the complicated cases are at the end
out.reverse();
}
PrettyPrintRegistry(out)
},
)
}

508
tests/testsuite/vendor.rs Normal file
View file

@ -0,0 +1,508 @@
use crate::support::git;
use crate::support::registry::Package;
use crate::support::{basic_lib_manifest, project, Project};
#[test]
fn vendor_simple() {
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
log = "0.3.5"
"#,
)
.file("src/lib.rs", "")
.build();
Package::new("log", "0.3.5").publish();
p.cargo("vendor --respect-source-config").run();
let lock = p.read_file("vendor/log/Cargo.toml");
assert!(lock.contains("version = \"0.3.5\""));
add_vendor_config(&p);
p.cargo("build").run();
}
fn add_vendor_config(p: &Project) {
p.change_file(
".cargo/config",
r#"
[source.crates-io]
replace-with = 'vendor'
[source.vendor]
directory = 'vendor'
"#,
);
}
#[test]
fn two_versions() {
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
bitflags = "0.8.0"
bar = { path = "bar" }
"#,
)
.file("src/lib.rs", "")
.file(
"bar/Cargo.toml",
r#"
[package]
name = "bar"
version = "0.1.0"
[dependencies]
bitflags = "0.7.0"
"#,
)
.file("bar/src/lib.rs", "")
.build();
Package::new("bitflags", "0.7.0").publish();
Package::new("bitflags", "0.8.0").publish();
p.cargo("vendor --respect-source-config").run();
let lock = p.read_file("vendor/bitflags/Cargo.toml");
assert!(lock.contains("version = \"0.8.0\""));
let lock = p.read_file("vendor/bitflags-0.7.0/Cargo.toml");
assert!(lock.contains("version = \"0.7.0\""));
add_vendor_config(&p);
p.cargo("build").run();
}
#[test]
fn help() {
let p = project().build();
p.cargo("vendor -h").run();
}
#[test]
fn update_versions() {
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
bitflags = "0.7.0"
"#,
)
.file("src/lib.rs", "")
.build();
Package::new("bitflags", "0.7.0").publish();
Package::new("bitflags", "0.8.0").publish();
p.cargo("vendor --respect-source-config").run();
let lock = p.read_file("vendor/bitflags/Cargo.toml");
assert!(lock.contains("version = \"0.7.0\""));
p.change_file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
bitflags = "0.8.0"
"#,
);
p.cargo("vendor --respect-source-config").run();
let lock = p.read_file("vendor/bitflags/Cargo.toml");
assert!(lock.contains("version = \"0.8.0\""));
}
#[test]
fn two_lockfiles() {
let p = project()
.no_manifest()
.file(
"foo/Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
bitflags = "=0.7.0"
"#,
)
.file("foo/src/lib.rs", "")
.file(
"bar/Cargo.toml",
r#"
[package]
name = "bar"
version = "0.1.0"
[dependencies]
bitflags = "=0.8.0"
"#,
)
.file("bar/src/lib.rs", "")
.build();
Package::new("bitflags", "0.7.0").publish();
Package::new("bitflags", "0.8.0").publish();
p.cargo("vendor --respect-source-config -s bar/Cargo.toml --manifest-path foo/Cargo.toml")
.run();
let lock = p.read_file("vendor/bitflags/Cargo.toml");
assert!(lock.contains("version = \"0.8.0\""));
let lock = p.read_file("vendor/bitflags-0.7.0/Cargo.toml");
assert!(lock.contains("version = \"0.7.0\""));
add_vendor_config(&p);
p.cargo("build").cwd("foo").run();
p.cargo("build").cwd("bar").run();
}
#[test]
fn delete_old_crates() {
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
bitflags = "=0.7.0"
"#,
)
.file("src/lib.rs", "")
.build();
Package::new("bitflags", "0.7.0").publish();
Package::new("log", "0.3.5").publish();
p.cargo("vendor --respect-source-config").run();
p.read_file("vendor/bitflags/Cargo.toml");
p.change_file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
log = "=0.3.5"
"#,
);
p.cargo("vendor --respect-source-config").run();
let lock = p.read_file("vendor/log/Cargo.toml");
assert!(lock.contains("version = \"0.3.5\""));
assert!(!p.root().join("vendor/bitflags/Cargo.toml").exists());
}
#[test]
fn ignore_files() {
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
url = "1.4.1"
"#,
)
.file("src/lib.rs", "")
.build();
Package::new("url", "1.4.1")
.file("src/lib.rs", "")
.file("foo.orig", "")
.file(".gitignore", "")
.file(".gitattributes", "")
.file("foo.rej", "")
.publish();
p.cargo("vendor --respect-source-config").run();
let csum = p.read_file("vendor/url/.cargo-checksum.json");
assert!(!csum.contains("foo.orig"));
assert!(!csum.contains(".gitignore"));
assert!(!csum.contains(".gitattributes"));
assert!(!csum.contains(".cargo-ok"));
assert!(!csum.contains("foo.rej"));
}
#[test]
fn included_files_only() {
let git = git::new("a", |p| {
p.file("Cargo.toml", &basic_lib_manifest("a"))
.file("src/lib.rs", "")
.file(".gitignore", "a")
.file("a/b.md", "")
})
.unwrap();
let p = project()
.file(
"Cargo.toml",
&format!(
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
a = {{ git = '{}' }}
"#,
git.url()
),
)
.file("src/lib.rs", "")
.build();
p.cargo("vendor --respect-source-config").run();
let csum = p.read_file("vendor/a/.cargo-checksum.json");
assert!(!csum.contains("a/b.md"));
}
#[test]
fn dependent_crates_in_crates() {
let git = git::new("a", |p| {
p.file(
"Cargo.toml",
r#"
[package]
name = "a"
version = "0.1.0"
[dependencies]
b = { path = 'b' }
"#,
)
.file("src/lib.rs", "")
.file("b/Cargo.toml", &basic_lib_manifest("b"))
.file("b/src/lib.rs", "")
})
.unwrap();
let p = project()
.file(
"Cargo.toml",
&format!(
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
a = {{ git = '{}' }}
"#,
git.url()
),
)
.file("src/lib.rs", "")
.build();
p.cargo("vendor --respect-source-config").run();
p.read_file("vendor/a/.cargo-checksum.json");
p.read_file("vendor/b/.cargo-checksum.json");
}
#[test]
fn vendoring_git_crates() {
let git = git::new("git", |p| {
p.file("Cargo.toml", &basic_lib_manifest("serde_derive"))
.file("src/lib.rs", "")
.file("src/wut.rs", "")
})
.unwrap();
let p = project()
.file(
"Cargo.toml",
&format!(
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies.serde]
version = "0.5.0"
[dependencies.serde_derive]
version = "0.5.0"
[patch.crates-io]
serde_derive = {{ git = '{}' }}
"#,
git.url()
),
)
.file("src/lib.rs", "")
.build();
Package::new("serde", "0.5.0")
.dep("serde_derive", "0.5")
.publish();
Package::new("serde_derive", "0.5.0").publish();
p.cargo("vendor --respect-source-config").run();
p.read_file("vendor/serde_derive/src/wut.rs");
add_vendor_config(&p);
p.cargo("build").run();
}
#[test]
fn git_simple() {
let git = git::new("git", |p| {
p.file("Cargo.toml", &basic_lib_manifest("a"))
.file("src/lib.rs", "")
})
.unwrap();
let p = project()
.file(
"Cargo.toml",
&format!(
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
a = {{ git = '{}' }}
"#,
git.url()
),
)
.file("src/lib.rs", "")
.build();
p.cargo("vendor --respect-source-config").run();
let csum = p.read_file("vendor/a/.cargo-checksum.json");
assert!(csum.contains("\"package\":null"));
}
#[test]
fn git_duplicate() {
let git = git::new("a", |p| {
p.file(
"Cargo.toml",
r#"
[package]
name = "a"
version = "0.1.0"
[dependencies]
b = { path = 'b' }
"#,
)
.file("src/lib.rs", "")
.file("b/Cargo.toml", &basic_lib_manifest("b"))
.file("b/src/lib.rs", "")
})
.unwrap();
let p = project()
.file(
"Cargo.toml",
&format!(
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
a = {{ git = '{}' }}
b = '0.5.0'
"#,
git.url()
),
)
.file("src/lib.rs", "")
.build();
Package::new("b", "0.5.0").publish();
p.cargo("vendor --respect-source-config")
.with_stderr(
"\
[UPDATING] [..]
[UPDATING] [..]
[DOWNLOADING] [..]
[DOWNLOADED] [..]
error: failed to sync
Caused by:
found duplicate version of package `b v0.5.0` vendored from two sources:
<tab>source 1: [..]
<tab>source 2: [..]
",
)
.with_status(101)
.run();
}
#[test]
fn depend_on_vendor_dir_not_deleted() {
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
libc = "0.2.30"
"#,
)
.file("src/lib.rs", "")
.build();
Package::new("libc", "0.2.30").publish();
p.cargo("vendor --respect-source-config").run();
assert!(p.root().join("vendor/libc").is_dir());
p.change_file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
libc = "0.2.30"
[patch.crates-io]
libc = { path = 'vendor/libc' }
"#,
);
p.cargo("vendor --respect-source-config").run();
assert!(p.root().join("vendor/libc").is_dir());
}