git-repack(1)
=============

NAME
----
git-repack - Pack unpacked objects in a repository

SYNOPSIS
--------
[verse]
'git repack' [-a] [-A] [-d] [-f] [-F] [-l] [-n] [-q] [-b] [-m] [--window=<n>] [--depth=<n>] [--threads=<n>] [--keep-pack=<pack-name>] [--write-midx]

DESCRIPTION
-----------
This command is used to combine all objects that do not currently
reside in a "pack", into a pack. It can also be used to re-organize
existing packs into a single, more efficient pack.
A pack is a collection of objects, individually compressed, with
delta compression applied, stored in a single file, with an
associated index file.
Packs are used to reduce the load on mirror systems, backup
engines, disk storage, etc.
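
As a quick illustration (example invocations, not a prescribed workflow),
an incremental repack of loose objects and a full repack that also drops
redundant packs might look like this:

------------------------------------------------
$ git repack -d      # pack loose objects; drop redundant packs and loose objects
$ git repack -a -d   # pack everything referenced into a single pack
------------------------------------------------
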
OPTIONS
-------
-a::
Instead of incrementally packing the unpacked objects,
pack everything referenced into a single pack.
Especially useful when packing a repository that is used
for private development. Use
with `-d`. This will clean up the objects that `git prune`
leaves behind, but `git fsck --full --dangling` shows as
dangling.
+
Note that users fetching over dumb protocols will have to fetch the
whole new pack in order to get any contained object, no matter how many
other objects in that pack they already have locally.
+
Promisor packfiles are repacked separately: if there are packfiles that
have an associated ".promisor" file, these packfiles will be repacked
into another separate pack, and an empty ".promisor" file corresponding
to the new separate pack will be written.
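+
For example (an illustrative invocation, not a required sequence), a full
repack that also removes the packs it makes redundant could be run as:
+
------------------------------------------------
$ git repack -a -d
------------------------------------------------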
-A::
Same as `-a`, unless `-d` is used. Then any unreachable
objects in a previous pack become loose, unpacked objects,
instead of being left in the old pack. Unreachable objects
are never intentionally added to a pack, even when repacking.
This option prevents unreachable objects from being immediately
deleted by way of being left in the old pack and then
removed. Instead, the loose unreachable objects
will be pruned according to normal expiry rules
with the next 'git gc' invocation. See linkgit:git-gc[1].
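+
A minimal sketch of this mode (illustrative only; pruning of the loosened
objects is left to a later `git gc`, as described above):
+
------------------------------------------------
$ git repack -A -d   # unreachable objects become loose instead of being deleted
$ git gc             # a later run prunes them under the normal expiry rules
------------------------------------------------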
-d::
After packing, if the newly created packs make some
existing packs redundant, remove the redundant packs.
Also run 'git prune-packed' to remove redundant
loose object files.
--cruft::
Same as `-a`, unless `-d` is used. Then any unreachable objects
are packed into a separate cruft pack. Unreachable objects can
be pruned using the normal expiry rules with the next `git gc`
invocation (see linkgit:git-gc[1]). Incompatible with `-k`.
--cruft-expiration=<approxidate>::
Expire unreachable objects older than `<approxidate>`
immediately instead of waiting for the next `git gc` invocation.
Only useful with `--cruft -d`.
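+
For example (illustrative invocations; the two-week grace period is only an
example value, not a default stated on this page):
+
------------------------------------------------
$ git repack --cruft -d                                 # keep unreachable objects in a cruft pack
$ git repack --cruft -d --cruft-expiration=2.weeks.ago  # also prune cruft objects older than two weeks
------------------------------------------------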
--max-cruft-size=<n>::
Repack cruft objects into packs as large as `<n>` bytes before
creating new packs. As long as there are enough cruft packs
smaller than `<n>`, repacking will cause a new cruft pack to
be created containing objects from any combined cruft packs,
along with any new unreachable objects. Cruft packs larger than
`<n>` will not be modified. When the new cruft pack is larger
than `<n>` bytes, it will be split into multiple packs, all of
which are guaranteed to be at most `<n>` bytes in size. Only
useful with `--cruft -d`.
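+
A sketch of combining this with `--cruft` (the threshold is an arbitrary
example, given in bytes as described above):
+
------------------------------------------------
$ git repack --cruft -d --max-cruft-size=100000000
------------------------------------------------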
--expire-to=<dir>::
Write a cruft pack containing pruned objects (if any) to the
directory `<dir>`. This option is useful for keeping a copy of
any pruned objects in a separate directory as a backup. Only
useful with `--cruft -d`.
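+
An illustrative invocation (the backup directory is a made-up path and the
expiry value is only an example):
+
------------------------------------------------
$ git repack --cruft -d --cruft-expiration=2.weeks.ago \
    --expire-to=/srv/backup/myrepo-cruft
------------------------------------------------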
-l::
Pass the `--local` option to 'git pack-objects'. See
linkgit:git-pack-objects[1].
-f::
Pass the `--no-reuse-delta` option to `git-pack-objects`, see
linkgit:git-pack-objects[1].
-F::
Pass the `--no-reuse-object` option to `git-pack-objects`, see
linkgit:git-pack-objects[1].
-q::
--quiet::
Show no progress over the standard error stream and pass the `-q`
option to 'git pack-objects'. See linkgit:git-pack-objects[1].
-n::
Do not update the server information with
'git update-server-info'. This option skips
updating local catalog files needed to publish
this repository (or a direct copy of it)
over HTTP or FTP. See linkgit:git-update-server-info[1].
--window=<n>::
--depth=<n>::
These two options affect how the objects contained in the pack are
stored using delta compression. The objects are first internally
sorted by type, size and optionally names and compared against the
other objects within `--window` to see if using delta compression saves
space. `--depth` limits the maximum delta depth; making it too deep
affects the performance on the unpacker side, because delta data needs
to be applied that many times to get to the necessary object.
+
The default value for `--window` is 10 and `--depth` is 50. The maximum
depth is 4095.
--threads=<n>::
This option is passed through to `git pack-objects`.
--window-memory=<n>::
This option provides an additional limit on top of `--window`;
the window size will dynamically scale down so as to not take
up more than '<n>' bytes in memory. This is useful in
repositories with a mix of large and small objects to not run
out of memory with a large window, but still be able to take
advantage of the large window for the smaller objects. The
size can be suffixed with "k", "m", or "g".
`--window-memory=0` makes memory usage unlimited. The default
is taken from the `pack.windowMemory` configuration variable.
Note that the actual memory usage will be the limit multiplied
by the number of threads used by linkgit:git-pack-objects[1].
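+
For example, a more aggressive repack could be sketched as follows (all of
the values are illustrative, not recommendations from this page):
+
------------------------------------------------
$ git repack -a -d -f --window=250 --depth=50 --window-memory=1g
------------------------------------------------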
--max-pack-size=<n>::
Maximum size of each output pack file. The size can be suffixed with
"k", "m", or "g". The minimum size allowed is limited to 1 MiB.
If specified, multiple packfiles may be created, which also
prevents the creation of a bitmap index.
The default is unlimited, unless the config variable
`pack.packSizeLimit` is set. Note that this option may result in
a larger and slower repository; see the discussion in
`pack.packSizeLimit`.
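+
An illustrative invocation that caps each resulting pack at roughly two
gigabytes (the limit itself is only an example):
+
------------------------------------------------
$ git repack -a -d --max-pack-size=2g
------------------------------------------------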
--filter=<filter-spec>::
Remove objects matching the filter specification from the
resulting packfile and put them into a separate packfile. Note
that objects used in the working directory are not filtered
out. So for the split to fully work, it's best to perform it
in a bare repo and to use the `-a` and `-d` options along with
this option. Also `--no-write-bitmap-index` (or the
`repack.writebitmaps` config option set to `false`) should be
used, otherwise writing the bitmap index will fail, as it assumes
a single packfile containing all the objects. See
linkgit:git-rev-list[1] for valid `<filter-spec>` forms.
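+
A sketch of the suggested setup (run in a bare repository; the 1 MiB blob
threshold is only an example filter):
+
------------------------------------------------
$ git repack -a -d --no-write-bitmap-index --filter=blob:limit=1m
------------------------------------------------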
--filter-to=<dir>::
Write the pack containing filtered out objects to the
directory `<dir>`. Only useful with `--filter`. This can be
used for putting the pack on a separate object directory that
is accessed through the Git alternates mechanism. **WARNING:**
If the packfile containing the filtered out objects is not
accessible, the repo can become corrupt as it might not be
possible to access the objects in that packfile. See the
`objects` and `objects/info/alternates` sections of
linkgit:gitrepository-layout[5].
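+
An illustrative sequence (the alternate object store is a made-up path; the
alternates entry points at the object directory, while `--filter-to` names
its `pack/` subdirectory so the resulting pack remains reachable):
+
------------------------------------------------
$ echo /srv/big-objects >> .git/objects/info/alternates
$ git repack -a -d --no-write-bitmap-index \
    --filter=blob:limit=1m --filter-to=/srv/big-objects/pack
------------------------------------------------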
-b::
--write-bitmap-index::
Write a reachability bitmap index as part of the repack. This
only makes sense when used with `-a`, `-A` or `-m`, as the bitmaps
must be able to refer to all reachable objects. This option
overrides the setting of `repack.writeBitmaps`. This option
has no effect if multiple packfiles are created, unless writing a
MIDX (in which case a multi-pack bitmap is created).
--pack-kept-objects::
Include objects in `.keep` files when repacking. Note that we
still do not delete `.keep` packs after `pack-objects` finishes.
This means that we may duplicate objects, but this makes the
option safe to use when there are concurrent pushes or fetches.
This option is generally only useful if you are writing bitmaps
with `-b` or `repack.writeBitmaps`, as it ensures that the
bitmapped packfile has the necessary objects.
--keep-pack=<pack-name>::
Exclude the given pack from repacking. This is the equivalent
of having a `.keep` file on the pack. `<pack-name>` is the
pack file name without leading directory (e.g. `pack-123.pack`).
The option can be specified multiple times to keep multiple
packs.
--unpack-unreachable=<when>::
When loosening unreachable objects, do not bother loosening any
objects older than `<when>`. This can be used to optimize out
the write of any objects that would be immediately pruned by
a follow-up `git prune`.
-k::
--keep-unreachable::
When used with `-ad`, any unreachable objects from existing
packs will be appended to the end of the packfile instead of
being removed. In addition, any unreachable loose objects will
be packed (and their loose counterparts removed).
-i::
--delta-islands::
Pass the `--delta-islands` option to `git-pack-objects`, see
linkgit:git-pack-objects[1].
-g<factor>::
--geometric=<factor>::
Arrange resulting pack structure so that each successive pack
contains at least `<factor>` times the number of objects as the
next-largest pack.
+
`git repack` ensures this by determining a "cut" of packfiles that need
to be repacked into one in order to ensure a geometric progression. It
picks the smallest set of packfiles such that as many of the larger
packfiles (by count of objects contained in each pack) as possible may
be left intact.
+
Unlike other repack modes, the set of objects to pack is determined
uniquely by the set of packs being "rolled-up"; in other words, the
packs that must be combined in order to restore a geometric
progression.
+
Loose objects are implicitly included in this "roll-up", without respect to
their reachability. This is subject to change in the future.
+
When writing a multi-pack bitmap, `git repack` selects the largest resulting
pack as the preferred pack for object selection by the MIDX (see
linkgit:git-multi-pack-index[1]).
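+
A sketch of a typical maintenance run using this mode (the factor of two is
only an example value):
+
------------------------------------------------
$ git repack --geometric=2 -d --write-midx --write-bitmap-index
------------------------------------------------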
-m::
--write-midx::
Write a multi-pack index (see linkgit:git-multi-pack-index[1])
containing the non-redundant packs.

CONFIGURATION
-------------
Various configuration variables affect packing, see
linkgit:git-config[1] (search for "pack" and "delta").
By default, the command passes `--delta-base-offset` option to
'git pack-objects'; this typically results in slightly smaller packs,
but the generated packs are incompatible with versions of Git older than
version 1.4.4. If you need to share your repository with such ancient Git
versions, either directly or via the dumb http protocol, then you
need to set the configuration variable `repack.UseDeltaBaseOffset` to
"false" and repack. Access from old Git versions over the native protocol
is unaffected by this option as the conversion is performed on the fly
as needed in that case.
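
For example (an illustrative sequence, only needed in the compatibility
situation described above):

------------------------------------------------
$ git config repack.useDeltaBaseOffset false
$ git repack -a -d
------------------------------------------------
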
Delta compression is not used on objects larger than the
`core.bigFileThreshold` configuration variable and on files with the
attribute `delta` set to false.

SEE ALSO
--------
linkgit:git-pack-objects[1]
linkgit:git-prune-packed[1]

GIT
---
Part of the linkgit:git[1] suite