Compare commits


59 Commits
v2.47 ... main

Author SHA1 Message Date
Kaushik Lingarkar
a94457d1ce Fall back to full sync when depth-enabled fetch of a sha1 fails
In sha1 mode, when depth is enabled, syncing the revision from
upstream may not work because some servers only allow fetching
named refs. Fetching a specific sha1 may result in an error like
'server does not allow request for unadvertised object'. In this
case, attempt a full sync with depth disabled.

Bug: 410825502
Change-Id: If51bcf18b877cd9491706f5bc3d6fd13c0c3d4f3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/468282
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-04-17 11:46:11 -07:00
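The fallback described above can be sketched as follows. This is a minimal illustration, not repo's real code; `FetchError` and `fetch_fn` are hypothetical stand-ins for repo's error type and its `git fetch` wrapper.

```python
class FetchError(Exception):
    """Hypothetical stand-in for repo's fetch failure exception."""


def fetch_revision(fetch_fn, sha1, depth):
    """Try a depth-limited fetch of a specific sha1, falling back to full.

    `fetch_fn(sha1, depth)` wraps `git fetch`; depth=None means no limit.
    """
    try:
        return fetch_fn(sha1, depth)
    except FetchError:
        # Some servers reject requests for unadvertised objects when
        # fetching by sha1, so retry with depth disabled.
        return fetch_fn(sha1, None)
```

A shallow fetch that fails triggers exactly one retry without the depth restriction.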
Gavin Mak
97dc5c1bd9 project: use --netrc-optional instead of --netrc
Some users are reporting a "curl: (26) .netrc error: no such file"
message on sync caused by a change to curl behavior.
See https://github.com/curl/curl/issues/16163.

Use --netrc-optional which was introduced in curl version 7.9.8
released in 2002.

Bug: 409354839
Change-Id: I8365c6e806968a4ee765a7e023b4bced30489c20
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/467026
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
2025-04-10 11:30:42 -07:00
Mike Frysinger
0214730c9a launcher: switch command quoting to shlex.quote
Minor fix, but just in case, provides properly quoted commands for
people to copy & paste.

Change-Id: Ia9fce5c0df9f51cbed9d49861adcf6821251e46f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/466821
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-04-10 10:23:08 -07:00
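The difference `shlex.quote` makes is easiest to see with an argument containing a space: a plain join produces a command the shell would mis-split, while the quoted join is safe to paste.

```python
import shlex

args = ["repo", "init", "-u", "https://example.com/my manifest.git"]

# Naive join: the shell would split the URL into two words.
print(" ".join(args))

# shlex.quote wraps arguments containing spaces or shell metacharacters,
# and leaves already-safe arguments untouched.
print(" ".join(shlex.quote(a) for a in args))
# → repo init -u 'https://example.com/my manifest.git'
```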
Gavin Mak
daebd6cbc2 sync: Warn about excessive job counts
Warn users if the effective job count specified via `-j`,
`--jobs-network`, or `--jobs-checkout` exceeds a threshold
(currently 100). This encourages users to use more reasonable
values.

Bug: 406868778
Bug: 254914814
Change-Id: I116e2bbaf3dc824c04d1b2fbe52cf9ca5be77b9a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/466801
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-04-09 14:52:22 -07:00
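A minimal sketch of such a check, assuming the threshold of 100 cited above; the function and constant names are illustrative, not repo's actual identifiers.

```python
MAX_JOBS_BEFORE_WARNING = 100  # threshold cited in the commit message


def check_job_count(jobs, flag="-j"):
    """Return a warning string when the requested job count looks excessive,
    or None when the value is reasonable."""
    if jobs > MAX_JOBS_BEFORE_WARNING:
        return (
            f"warning: {flag}={jobs} is very high; values above "
            f"{MAX_JOBS_BEFORE_WARNING} rarely help and can overload the "
            "server or local machine"
        )
    return None
```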
Mike Frysinger
3667de1d0f run_tests: fix running when cwd is not the root
If you try running this from a subdir, then most of the tests fail
because they assume they're running from the top of the source tree.
Change all the tests to actually run there.

For example: cd docs && ../run_tests

Change-Id: I92e17476393a108e56b58e049193b9fd72c5b7ba
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/464841
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-04-03 11:11:04 -07:00
Mike Frysinger
85ee1738e6 run_tests: enable Python 3.8 CI coverage
Change-Id: I507da20d3b7234e9f2a22d7654a6405b362eebaf
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/464541
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-04-02 13:50:29 -07:00
Egor Duda
f070331a4c Fix EROFS error when root fs is mounted read-only
repo attempts to create the /etc/.repo_gitconfig.json file, and fails
if the root file system is mounted read-only. Removing a non-existent
file on a read-only filesystem results in EROFS instead of ENOENT.

Bug: 401018409
Change-Id: I64edc0567fb88649f3fd8cacb65a8780744640d4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/458821
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Egor Duda <egor.duda@gmail.com>
Commit-Queue: Egor Duda <egor.duda@gmail.com>
2025-04-02 06:43:06 -07:00
Mike Frysinger
9ecb80ba26 pager: drop unused global vars
We use global when we need to write to a variable, not read it.
This function only reads, so drop the keyword.

Change-Id: Iee91998fba67fd3e8ebaf2f4a79f95032f70b1c0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/464501
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-04-01 20:59:10 -07:00
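The Python rule behind this cleanup: `global` is only needed to assign to a module-level name; reading one resolves through the module scope automatically. A small illustration (names here are made up, not pager.py's real variables):

```python
_pager_active = False  # module-level state


def active():
    # Reading a module-level name needs no `global` declaration;
    # Python resolves it through the enclosing module scope.
    return _pager_active


def start_pager():
    global _pager_active  # required only because we assign to it
    _pager_active = True
```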
Mike Frysinger
dc8185f2a9 launcher: change RunError to subprocess.CalledProcessError
Since we require Python 3.6 now in the launcher, swap out our custom
RunError class for the standard subprocess one.

Change-Id: Id0ca17c40e22ece03e06366a263ad340963f979d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/464401
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2025-04-01 17:28:26 -07:00
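The standard class carries everything a custom error type would: with `check=True`, a non-zero exit raises `subprocess.CalledProcessError`, which exposes `returncode`, `cmd`, and any captured output.

```python
import subprocess
import sys

try:
    # Run a child Python that exits with status 2.
    subprocess.run(
        [sys.executable, "-c", "raise SystemExit(2)"],
        check=True,
        capture_output=True,
    )
except subprocess.CalledProcessError as e:
    print(f"command {e.cmd} failed with code {e.returncode}")
```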
Mike Frysinger
59b81c84de launcher: change collections.namedtuple to typing.NamedTuple
Since we require Python 3.6 now in the launcher, switch to NamedTuple
so we get better documentation & typing information.

Change-Id: Ic58fdc07db02fc49166eccbbc3e527f474973424
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/463721
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-03-28 19:13:49 -07:00
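The two styles side by side; `RunResult` here is an illustrative example, not repo's actual class. `typing.NamedTuple` keeps tuple semantics while adding annotations, defaults, and a docstring.

```python
import collections
import typing

# Old style: field names as strings, no type information or defaults.
RunResultOld = collections.namedtuple("RunResultOld", ["returncode", "stdout"])


# New style: class syntax with annotations, defaults, and a docstring.
class RunResult(typing.NamedTuple):
    """Result of running a subprocess (illustrative, not repo's real class)."""

    returncode: int
    stdout: str = ""


r = RunResult(returncode=0)
print(r.returncode, repr(r.stdout))  # → 0 ''
```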
Mike Frysinger
507d463600 tox: sync black settings with run_tests
We updated run_tests to use black-25, so update tox too.

Change-Id: I7ee6471fbc78825bd2dbc8c1f8dab9dc10460852
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/463601
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-03-27 14:20:01 -07:00
Mike Frysinger
cd391e77d0 black: update to v25
Requires a little reformatting in the tree.

Change-Id: Iaa40fe0dfca372c49c04cc26edccb5f7b0c2a8ad
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/462883
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2025-03-25 11:20:35 -07:00
Mike Frysinger
8310436be0 run_tests: move test filtering to pytest markers
Move the test disable logic even closer to the exact test that's
disabled.  This way people updating tests have a better chance of
seeing they'll get reduced coverage in the CQ.

Change-Id: I57c1a073a844019798b27e14d742fd32925d9ae8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/462882
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-03-25 11:19:49 -07:00
Mike Frysinger
d5087392ed run_tests: move CQ test skips here
Our recipes have been disabling a bunch of tests.  To increase
visibility, and to make it easier to test changes, move that
logic to this script.

Change-Id: I3894f047715177c0f1d27a2fe4c3490972dab204
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/462881
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-03-25 10:08:54 -07:00
Mike Frysinger
91f428058d run_tests: run all tests all the time
Using a generator w/all() causes the code to exit on the first error.
We really want to see all errors all the time, so use sum() instead.

Change-Id: Ib1adb8de199db9fe727d4b49c890b4d5061e9e6b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/462901
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-03-25 10:07:42 -07:00
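The short-circuit behavior this commit works around is easy to demonstrate: `all()` over a generator stops at the first falsy value, so later test suites never execute, while `sum()` drains the generator and counts failures instead.

```python
ran = []


def run_suite(name, ok):
    ran.append(name)
    return ok


suites = [("fmt", True), ("lint", False), ("unit", True)]

# all() with a generator short-circuits at the first failure.
all(run_suite(n, ok) for n, ok in suites)
print(ran)  # → ['fmt', 'lint']   ("unit" never ran)

# sum() consumes the whole generator, counting failures instead.
ran.clear()
failures = sum(not run_suite(n, ok) for n, ok in suites)
print(ran, failures)  # → ['fmt', 'lint', 'unit'] 1
```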
Mike Frysinger
243df2042e launcher: change RunResult to subprocess.CompletedProcess
Since we require Python 3.6 now in the launcher, swap out our custom
RunResult class for the standard subprocess one.

Change-Id: Idd8598df37c0a952d3ef828df6e250cab03c6589
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/462341
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-03-24 11:49:00 -07:00
Albert Akmukhametov
4b94e773ef Sync: Fix full submodule sync while shallow specified
Git allows submodules to be cloned shallowly [1]. On the other
hand, when repo synchronizes a project with submodules inside, it
ignores the shallow parameter.

When a project contains submodules, project.py parses the .gitmodules
file for URL and path. This parsing does not consider the shallow
option. Consequently, this parameter is not propagated to the newly
created Project instance for that submodule.

[1] https://git-scm.com/docs/gitmodules#Documentation/gitmodules.txt-submoduleltnamegtshallow

Change-Id: I54fc9c69ae1b8e3cda2801202e3f0c7693b718d2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/454261
Tested-by: Albert Akmukhametov <alb.02057@gmail.com>
Commit-Queue: Albert Akmukhametov <alb.02057@gmail.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Никита Сказкоподателев (Nask) <skazkopodatelev@gmail.com>
2025-03-13 09:12:45 -07:00
Josip Sokcevic
fc901b92bb sync: Refresh index before updating repo
If the repo index is stale, reset --keep will refuse to reset the
workspace. An index can be stale if there are any modifications to a
file's inode, including mtime, atime, ownership changes, etc.

Bug: b/375423099
Change-Id: Ibef03d9d8d2babbb107041707281687342ab7a77
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/460022
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-03-13 08:24:35 -07:00
Josip Sokcevic
8d5f032611 gc: Add tags to remote pack list
If tags are omitted from the remote pack list, they must be present in
the local pack. However, local packs don't have promisor objects,
meaning that all blobs must be available locally, and therefore all
missing blobs will be downloaded during the rev-list phase. Git
downloads those sequentially, by invoking a fetch operation
(rev-list/fetch).

Instead of downloading tags' blobs, instruct Git to include all tags in
remote rev-list operation. This change was tested with `git fsck --all`.

R=yiwzhang@google.com

Bug: b/392732561
Change-Id: Id94a40aebbe4f084c952329583d559d296db1a11
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/451422
Reviewed-by: Yiwei Zhang <yiwzhang@google.com>
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
2025-02-05 12:36:27 -08:00
Kaushik Lingarkar
99eca45eb2 Activate submodules
This change moves further towards ensuring Git can understand repo's
submodules. 'submodule init' is used to make the submodules active[1].

[1] https://git-scm.com/docs/gitsubmodules#_active_submodules

Change-Id: I0c20ff1991101fc5be171e566d8fb644aab47200
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/446182
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@oss.qualcomm.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-02-04 08:07:49 -08:00
Kaushik Lingarkar
66685f07ec Use 'gitfile' in submodule checkouts
This change takes another step towards ensuring Git can understand
repo's submodules to some extent. Replace the old '.git' symlink with
gitfile[1] pointing to the bare checkout of the submodule. This is
required for Git's 'recurse submodules' opts to work with repo's
submodules as '.git' is expected to be writable by Git when recursing
over submodules.

[1] https://git-scm.com/docs/gitrepository-layout#_description

Change-Id: I52d15451768ee7bd6db289f4d2b3be5907370d42
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/446181
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@oss.qualcomm.com>
2025-02-04 08:07:49 -08:00
Kaushik Lingarkar
cf9a2a2a76 Update internal filesystem layout for submodules
Change the bare checkout directory for submodules from 'subprojects'
to 'modules'. Git expects bare submodule checkouts to be in the
'modules' directory. If old subproject directories are found, they
will be migrated to the new modules directory. This change is the
first step in ensuring Git can understand repo's submodules to some
extent.

Change-Id: I385029f1bb55d040616d970d6ffb4bb856692520
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/444881
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
2025-02-04 08:07:49 -08:00
Josip Sokcevic
5ae8292fea Revert "sync: skip network half on repo upgrade"
This reverts commit 61224d01fa29bcf54dd6d521e09e09a8c0da77fe.

Reason for revert: the manifest will be updated during the
post-upgrade process, and that can result in a missing object in
LocalHalf, since NetworkHalf is not skipped.

Bug: b/392979411
Change-Id: I8a46e5b54093ed78285c8b30f000bb08a8244179
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/450181
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-01-31 12:01:49 -08:00
Mike Frysinger
dfdf577e98 docs: smart-sync: split out & expand details
The existing documentation on smart-sync behavior is a bit light on
details, and out of date wrt what the code actually does.  Start a
dedicated document and fill it out more.

Change-Id: I1a8a3ac6edf9291d72182ad55db865035d9b683e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/450002
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
2025-01-30 19:17:24 -08:00
Mike Frysinger
747ec83f58 run_tests: update to python 3.11 & pytest 8.3.4
Change-Id: Iffe45d85a54dc380cdd37bbbbe64b058eacad0a9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/449901
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-01-30 10:54:54 -08:00
flexagoon
1711bc23c0 git_config: prefer XDG config location
Currently, repo ignores the XDG path for the git config file, and
creates a new one in the user's home directory. This commit changes the
behavior to prefer the XDG path if it exists, which matches git behavior
and avoids littering the home directory.

Bug: 40012443
Change-Id: Icd3ec6db6b0832f47417bbe98ff9461306b51297
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/448385
Tested-by: lmaor xenix <25misha52@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-01-23 23:47:06 -08:00
Josip Sokcevic
db111d3924 sync: Recover from errors during read-tree
When repo is initializing a git repository, it calls `git read-tree`.
During that operation, git restores the workspace based on the current
index. However, some things can go wrong: a user can run out of disk
space, or, in the case of a partial clone, the user may no longer be
able to reach the remote host. That leaves the affected repository in
a bad state with a partially checked-out workspace. A follow-up repo
sync won't try to fix such a state.

This change removes .git symlink, which will force the next `repo sync`
to redo Git repository setup.

Bug: b/363171216
Bug: b/390161127
Change-Id: I57db4b6cae0ef21826dc7cede4d3bf02cfc3d955
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/447801
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
2025-01-16 09:19:45 -08:00
Josip Sokcevic
3405446a4e gc: Add repack option
When a repository is partially cloned, blobs that are no longer needed
are never removed. To reclaim some disk space, allow users to pass
--repack, which affects only repositories cloned with filter=blob:none
and only if projects are not shared.

Change-Id: I0608172c9eff82fb8a6b6ef703eb109fedb7a6cc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/447722
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-01-14 15:17:34 -08:00
Josip Sokcevic
41a27eb854 gc: extract deletion from Execute method
Change-Id: Icef4f28fbdb9658892611def7589f5eba43c952c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/447721
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
2025-01-14 12:33:45 -08:00
Josip Sokcevic
d93fe60e89 sync: Handle KeyboardInterrupt during checkout
KeyboardInterrupt is handled during NetworkHalf. This patch handles
KeyboardInterrupt during LocalHalf.

Bug: b/372069163
Change-Id: I26847f7ca3cdf1fe57b265b4f6b18cc8102d2921
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/447401
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-01-08 13:36:52 -08:00
Josip Sokcevic
61224d01fa sync: skip network half on repo upgrade
When repo upgrades itself, it will restart itself and rerun sync
command. At that point, we know that network half is already done and we
can just proceed with local half.

R=ddoman@google.com

Bug: b/377567091
Change-Id: I77205b1f2df19891597347d55283a617de3c6634
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/446201
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
2024-12-18 11:49:17 -08:00
Josip Sokcevic
13d6588bf6 gc: Introduce new command to remove old projects
When projects are removed from the manifest, they are only removed
from the worktree and not from .repo/projects and
.repo/project-objects. Keeping data under .repo can be desirable if
the user expects deleted projects to be restored (e.g. checking out a
release branch).

Android has an ongoing effort to remove many stale projects, and this
change allows users to easily free up their disk space.

Bug: b/344018971
Bug: 40013312
Change-Id: Id23c7524a88082ee6db908f9fd69dcd5d0c4f681
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/445921
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
2024-12-18 09:23:49 -08:00
Josip Sokcevic
9500aca754 sync: Delete symlinks relative to client topdir
If repo sync is invoked outside the repo root, and the latest manifest
removes symlinks, repo incorrectly tries to remove the symlink: it
starts from `cwd` instead of the repo root.

Bug: b/113935847
Bug: 40010423
Change-Id: Ia50ea70a376e38c94389880f020c80da3c3f453c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/445901
Tested-by: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2024-12-16 10:23:40 -08:00
Fredrik de Groot
e8a7b9d596 Add smoke test for subcmd forall
After some refactoring earlier, the forall command was
broken briefly, returning after only one run instead
of after all projects.

This test, albeit simple in nature, would have caught that.

Due to the somewhat demanding nature of forall,
a lot more setup was needed than expected but seems
to do its job now so hopefully it catches similar stuff
in the future.

Change-Id: I51e161ff0e7e31a65401211c376f319bda504532
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/445461
Tested-by: Fredrik de Groot <fredrik.de.groot@haleytek.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Fredrik de Groot <fredrik.de.groot@haleytek.com>
2024-12-11 00:30:15 -08:00
Josip Sokcevic
cf411b3f03 Remove gitc support from repo
gitc is no longer available.

Change-Id: I0cbfdf936832f2cdd4876104ae3cc5a6e26154e2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/444841
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-12-03 22:27:56 +00:00
Josip Sokcevic
1feecbd91e branches: Escape percent signs in branch names
If a branch name contains a percent sign, it will be interpreted as a placeholder and color.py will fail to format it.

To avoid this, escape the percent signs prior to calling Coloring
method.

Bug: b/379090488
Change-Id: Id019c776bbf8cbed5c101f2773606f1d32c9e057
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/443801
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-12-03 19:02:20 +00:00
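The escaping trick is standard printf-style hygiene: doubling each '%' turns it into a literal once the string passes through %-formatting, which is how Coloring-style helpers consume their format strings. A minimal demonstration (the helper name is illustrative):

```python
def escape_percents(name):
    # Doubling '%' keeps printf-style formatting from treating a literal
    # percent sign in a branch name as a conversion specifier.
    return name.replace("%", "%%")


branch = "wip-50%-done"
# The escaped form survives %-formatting unchanged:
msg = ("current branch: " + escape_percents(branch)) % ()
print(msg)  # → current branch: wip-50%-done
```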
Peter Kjellerstedt
616e314902 sync: Do not fail to sync a manifest with no projects
Since commit 454fdaf1191c87e5c770ab865a911e10e600e178 (v2.48), syncing a
manifest without any projects would result in:

  Repo command failed: RepoUnhandledExceptionError
          Number of processes must be at least 1

Bug: 377546300
Change-Id: Iaa2f6a3ac64542ad65a19c0eef449f53c09cae67
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/443442
Reviewed-by: Erik Elmeke <erik@haleytek.corp-partner.google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2024-11-26 10:16:03 +00:00
Josip Sokcevic
fafd1ec23e Fix event log command event hierarchy.
command should be cmd_name, to match what git is emitting. This also
fixes arguments, so that only relevant arguments are passed instead
of the entire sys.args, which would contain wrapper information.

Change-Id: Id436accfff511292ec2c56798fffb2306dda38fc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/443741
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-11-22 18:39:41 +00:00
Josip Sokcevic
b1613d741e Make repo installation work without .git
Some tools like jj and cog will not have .git. This change
makes it possible to run all repo commands in such setups.

Change-Id: I7f3845dc970fbaa731c31e0aa48355a4b56ed3a6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/442821
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-11-18 19:36:14 +00:00
Kuang-che Wu
ab2d321104 sync: fix connection error on macOS
With a large number of sync workers, the sync process may fail on
macOS due to connection errors. The root cause is that multiple
workers may attempt to connect to the multiprocessing manager server
at the same time when handling the first job. This can lead to
connection failures if there are too many pending connections, exceeding
the socket listening backlog.

Bug: 377538810
Change-Id: I1924d318d076ca3be61d75daa37bfa8d7dc23ed7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/441541
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-11-06 16:33:17 +00:00
Josip Sokcevic
aada468916 upload: Return correct tuple values in _ProcessResults
Incorrect tuple values were returned with http://go/grev/440221 -
instead of returning (Project, ReviewableBranch), _ProcessResults was
returning (int, ReviewableBranch).

R=jojwang@google.com

Bug: 376731172
Change-Id: I75205f42fd23f5ee6bd8d0c15b18066189b42bd9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/441121
Reviewed-by: Sam Saccone <samccone@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-10-31 21:18:53 +00:00
Allen Webb
1d5098617e worktree: Do not try to fix relative paths
--worktree was broken with incorrect paths in the .git files
whenever the local copy of git populated gitdir with relative paths
instead of absolute paths.

Bug: 376251410
Change-Id: Id32dc1576315218967de2a9bfe43bf7a5a0e7aa6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440801
Commit-Queue: Allen Webb <allenwebb@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Allen Webb <allenwebb@google.com>
2024-10-30 17:03:57 +00:00
Josip Sokcevic
e219c78fe5 forall: Fix returning results early
rc should be returned only after all results are processed.

R=jojwang@google.com

Bug: b/376454189
Change-Id: I8200b9954240dd3e8e9f2ab82494779a3cb38627
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440901
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
2024-10-30 16:11:04 +00:00
joehsu
f9f4df62e0 Use full name of the revision when checking dest-branch
The manifest usually doesn't specify the revision with the full name
(e.g. refs/heads/REV). However, when checking the name of the merge
branch, the full name is used.

This CL uses the full name of the revision when comparing it with the
merge branch.

Bug: b/370919047
Test: repo upload on a project with `dest-branch` set
Change-Id: Ib6fa2f7246beb5bae0a26a70048a7ac03b6c5a2f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438401
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Joe Hsu <joehsu@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-28 23:47:08 +00:00
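The normalization implied above can be sketched as follows; this is a hypothetical helper illustrating the idea, not repo's real function, and it assumes short names map to refs/heads/:

```python
def full_ref_name(rev):
    """Expand a short branch name like 'main' to 'refs/heads/main' so it
    can be compared against a fully-qualified merge branch."""
    if rev.startswith("refs/"):
        return rev
    return "refs/heads/" + rev
```

With this, `full_ref_name(revision) == merge_branch` compares like with like regardless of how the manifest spelled the revision.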
Fredrik de Groot
ebdf0409d2 Add REPO_SKIP_SELF_UPDATE check in sync
The command _PostRepoFetch will try to self-update
during repo sync. That is beneficial, but adds
version uncertainty, failure potential, and slowdowns
in non-interactive scenarios.

Conditionally skip the update if env variable
REPO_SKIP_SELF_UPDATE is defined.

A call to selfupdate works as before, meaning even
with the variable set, it will run the update.

Change-Id: Iab0ef55dc3d3db3cbf1ba1f506c57fbb58a504c3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/439967
Tested-by: Fredrik de Groot <fredrik.de.groot@haleytek.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-10-28 17:46:25 +00:00
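The gating logic amounts to a presence check on the environment variable (any value counts as set). A minimal sketch, with an injectable mapping for testability; the function name is illustrative:

```python
import os


def self_update_allowed(env=None):
    """Return False when REPO_SKIP_SELF_UPDATE is defined (any value).

    An explicit `repo selfupdate` bypasses this check and updates anyway.
    """
    if env is None:
        env = os.environ
    return "REPO_SKIP_SELF_UPDATE" not in env
```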
Fredrik de Groot
303bd963d5 manifest: add optional base check on remove and extend
This adds an optional, built-in checker for
guarding against patches hanging off the wrong
base revision, which is useful if a lower layer of
the manifest changes after a patch was made.
When adding a patch with a new revision using
extend-project or remove-project/project:

          C---D---E patches in project bla
         /
    A---B project bla in manifest state 1

<extend-project name="bla" revision="E" base-rev="B">

If project bla gets updated, in a new snap ID
or by a supplier or similar, to a new state:

          C---D---E patches in project bla
         /
    A---B---F---G project bla in manifest state 2

Parsing will fail because the revision of bla is now G,
giving the choice of creating a new patch branch
from G and updating base-rev, or keeping the previous
branch for some reason and only updating base-rev.
Intended for use in a layered manifest with
hashed revisions. Named refs like branches and tags
also work fine when comparing, but will be misleading
if a branch is used as base-rev.

Change-Id: Ic6211550a7d3cc9656057f6a2087c505b40cad2b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/436777
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Fredrik de Groot <fredrik.de.groot@haleytek.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-28 16:55:10 +00:00
Josip Sokcevic
ae384f8623 [event_log] Stop leaking semaphore resources
With the global state and fork, we are left with uncleaned resources.
Isolate multiprocessing.Value in a function so we stop the leak.

Bug: 353656374
Change-Id: If50bb544bda12b72f00c02bc1d2c0d19de000b88
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440261
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-10-24 16:58:17 +00:00
Kuang-che Wu
70a4e643e6 progress: always show done message
The done message was omitted if the task finished in under 0.5s. This
might confuse users.

Bug: b/371638995
Change-Id: I3fdd2cd8daea16d34fba88457d09397fff71af15
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440222
Tested-by: Kuang-che Wu <kcwu@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-10-24 16:21:28 +00:00
Kuang-che Wu
8da4861b38 subcmds: reduce multiprocessing serialization overhead
Follow the same approach as 39ffd9977e to reduce serialization overhead.

Below benchmarks are tested with 2.7k projects on my workstation
(warm cache). git tracing is disabled for benchmark.

(seconds)              | v2.48 | v2.48 | this CL | this CL
                       |       |  -j32 |         |    -j32
-----------------------------------------------------------
with clean tree state:
branches (none)        |   5.6 |   5.9 |    1.0  |    0.9
status (clean)         |  21.3 |   9.4 |   19.4  |    4.7
diff (none)            |   7.6 |   7.2 |    5.7  |    2.2
prune (none)           |   5.7 |   6.1 |    1.3  |    1.2
abandon (none)         |  19.4 |  18.6 |    0.9  |    0.8
upload (none)          |  19.7 |  18.7 |    0.9  |    0.8
forall -c true         |   7.5 |   7.6 |    0.6  |    0.6
forall -c "git log -1" |  11.3 |  11.1 |    0.6  |    0.6

with branches:
start BRANCH --all     |  21.9 |  20.3 |   13.6  |    2.6
checkout BRANCH        |  29.1 |  27.8 |    1.1  |    1.0
branches (2)           |  28.0 |  28.6 |    1.5  |    1.3
abandon BRANCH         |  29.2 |  27.5 |    9.7  |    2.2

Bug: b/371638995
Change-Id: I53989a3d1e43063587b3f52f852b1c2c56b49412
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440221
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Kuang-che Wu <kcwu@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
2024-10-23 23:34:34 +00:00
Kuang-che Wu
39ffd9977e sync: reduce multiprocessing serialization overhead
Background:
 - Manifest object is large (for projects like Android) in terms of
   serialization cost and size (more than 1mb).
 - Lots of Project objects usually share only a few manifest objects.

Before this CL, Project objects were passed to workers via function
parameters. Function parameters are pickled separately (in chunk). In
other words, manifests are serialized again and again. The major
serialization overhead of repo sync was
  O(manifest_size * projects / chunksize)

This CL uses following tricks to reduce serialization overhead.
 - All projects are pickled in one invocation. Because Project objects
   share manifests, pickle library remembers which objects are already
   seen and avoid the serialization cost.
 - Pass the Project objects to workers at worker initialization time,
   and pass the project index as a function parameter instead. The
   number of workers is much smaller than the number of projects.
 - Worker init state is shared on Linux (fork based), so it requires
   zero serialization for Project objects.

On Linux (fork based), the serialization overhead is
  O(projects)  --- one int per project
On Windows (spawn based), the serialization overhead is
  O(manifest_size * min(workers, projects))

Moreover, use chunksize=1 to avoid the chance that some workers are idle
while other workers still have more than one job in their chunk queue.

Using 2.7k projects as the baseline, originally "repo sync" no-op
sync takes 31s for fetch and 25s for checkout on my Linux workstation.
With this CL, it takes 12s for fetch and 1s for checkout.

Bug: b/371638995
Change-Id: Ifa22072ea54eacb4a5c525c050d84de371e87caa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/439921
Tested-by: Kuang-che Wu <kcwu@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
2024-10-23 02:58:45 +00:00
Kaushik Lingarkar
584863fb5e Fix incremental syncs for prjs with submodules
When performing an incremental sync (re-running repo init with an
updated manifest revision) with --fetch-submodules or sync-s=true,
there is an attempt to get a list of all projects (including
submodules) before projects are actually fetched. However, we can
only list submodules of a project if we have already fetched its
revision. Instead of throwing an error when we don't have the
revision, assume there are no submodules for that project. In the
sync cmd, we already update the list of projects to include
submodules after fetching superprojects.

Change-Id: I48bc68c48b5b10117356b18f5375d17f9a89ec05
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/439761
Commit-Queue: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Tested-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@linaro.org>
2024-10-18 03:55:10 +00:00
Josip Sokcevic
454fdaf119 sync: Always use WORKER_BATCH_SIZE
With 551285fa35ccd0836513e9cf64ee8d3372e5e3f4, the comment about the
number of workers no longer stands: the dict is shared among processes
and real-time information is available.

Using 2.7k projects as the baseline, a chunk size of 4 takes close
to 5 minutes. A chunk size of 32 takes this down to 40s, a reduction of
roughly 8 times, which matches the increase.

R=gavinmak@google.com

Bug: b/371638995
Change-Id: Ida5fd8f7abc44b3b82c02aa0f7f7ae01dff5eb07
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438523
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2024-10-07 18:44:19 +00:00
Josip Sokcevic
f7f9dd4deb project: Handle git sso auth failures as repo exit
If a user is not authenticated, repo continues execution, which will
likely result in more of the same errors being printed. A user is also
likely to SIGTERM the process, resulting in more errors.

This change stops repo sync if any repository can't be fetched due to a
Git authentication failure when using the sso helper. We could extend
this to all Git authentication failures.

Change-Id: I9e471e063450c0a51f25a5e7f12a83064dfb170c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438522
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-03 20:47:50 +00:00
Josip Sokcevic
70ee4dd313 superproject: Remove notice about beta
It's been the default for Android for over a year now, and the notice
is no longer useful.

Change-Id: I53c6f1e7cee8c1b2f408e67d3a6732db3b272bee
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438521
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-03 20:37:18 +00:00
Josip Sokcevic
cfe3095e50 project: run fetch --refetch when unable to parse a commit
Similarly to e59e2ae757623e64f625a9cdadf1c2010ef82b34, handle missing
gc'ed commits by running `git fetch --refetch`.

R=jojwang@google.com

Bug: b/360889369
Bug: b/371000949
Change-Id: I108b870b855d3b9f23665afa134c6e35f7cd2830
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438461
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-03 17:40:37 +00:00
Josip Sokcevic
621de7ed12 Disable git terminal prompt during fetch/clone
A git fetch operation may prompt the user to enter a username and
password. This won't be visible to the user during a repo sync operation
since stdout and stderr are redirected. If that happens, the user may
think repo is doing work and likely won't realize it's stuck waiting on
input.

This patch disables the prompt for clone and fetch operations, so repo
will fail fast.

R=gavinmak@google.com

Bug: b/368644181
Change-Id: I2efa88ae66067587a00678eda155d861034b9127
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438001
Reviewed-by: Nasser Grainawi <nasser.grainawi@linaro.org>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2024-09-26 22:10:36 +00:00
Kaushik Lingarkar
d7ebdf56be init: add --manifest-upstream-branch
When a sha1 is provided to '--manifest-branch', the ref which
is expected to contain that sha1 can be provided using the new
'--manifest-upstream-branch' option. This is useful with
'--current-branch' to avoid having to sync all heads and tags,
or with a commit that comes from a non-head/tag ref (like a
Gerrit change ref).

Change-Id: I46a3e255ca69ed9e809039e58b0c163e02af94ef
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/436717
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Tested-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
2024-09-26 00:52:28 +00:00
Kaushik Lingarkar
fabab4e245 man: regenerate man pages
Change-Id: Icf697eda7d5dcdc87854ad6adf607353c7ba5ac2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/437941
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@linaro.org>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-09-25 20:57:42 +00:00
Brian Norris
b577444a90 project: Copy and link files even with local branches
In the winding maze that constitutes Sync_LocalHalf(), there are paths
in which we don't copy-and-link files. Examples include something like:

  cd some/project/
  repo start head .
  # do some work, make some commit, upload that commit to Gerrit

  [[ ... in the meantime, someone adds a <linkfile ...> for
     some/project/ in the manifest ... ]]

  cd some/project/
  git pull --rebase
  repo sync

In this case, we never hit a `repo rebase` case, which might have saved
us. Instead, the developer is left confused why some/project/ never had
its <linkfile>s created.

Notably, this opens up one more corner case in which <linkfile ... /> or
<copyfile ... /> could potentially clobber existing work in the
destination directory, but there are existing cases where that's true,
and frankly, those seem like bigger holes than this new one.

Change-Id: I394b0e4529023a8ee319dc25d03d513a19251a4a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/437421
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Brian Norris <briannorris@google.com>
Commit-Queue: Brian Norris <briannorris@google.com>
2024-09-19 00:11:52 +00:00
52 changed files with 1783 additions and 670 deletions

View File

@@ -12,6 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import contextlib
 import multiprocessing
 import optparse
 import os
@@ -70,6 +71,14 @@ class Command:
     # migrated subcommands can set it to False.
     MULTI_MANIFEST_SUPPORT = True

+    # Shared data across parallel execution workers.
+    _parallel_context = None
+
+    @classmethod
+    def get_parallel_context(cls):
+        assert cls._parallel_context is not None
+        return cls._parallel_context
+
     def __init__(
         self,
         repodir=None,
@@ -242,9 +251,39 @@ class Command:
         """Perform the action, after option parsing is complete."""
         raise NotImplementedError

-    @staticmethod
+    @classmethod
+    @contextlib.contextmanager
+    def ParallelContext(cls):
+        """Obtains the context, which is shared to ExecuteInParallel workers.
+
+        Callers can store data in the context dict before invocation of
+        ExecuteInParallel. The dict will then be shared to child workers of
+        ExecuteInParallel.
+        """
+        assert cls._parallel_context is None
+        cls._parallel_context = {}
+        try:
+            yield
+        finally:
+            cls._parallel_context = None
+
+    @classmethod
+    def _InitParallelWorker(cls, context, initializer):
+        cls._parallel_context = context
+        if initializer:
+            initializer()
+
+    @classmethod
     def ExecuteInParallel(
-        jobs, func, inputs, callback, output=None, ordered=False
+        cls,
+        jobs,
+        func,
+        inputs,
+        callback,
+        output=None,
+        ordered=False,
+        chunksize=WORKER_BATCH_SIZE,
+        initializer=None,
     ):
         """Helper for managing parallel execution boiler plate.
@@ -269,6 +308,9 @@ class Command:
             output: An output manager. May be progress.Progess or
                 color.Coloring.
             ordered: Whether the jobs should be processed in order.
+            chunksize: The number of jobs processed in batch by parallel
+                workers.
+            initializer: Worker initializer.

         Returns:
             The |callback| function's results are returned.
@@ -278,12 +320,16 @@ class Command:
             if len(inputs) == 1 or jobs == 1:
                 return callback(None, output, (func(x) for x in inputs))
             else:
-                with multiprocessing.Pool(jobs) as pool:
+                with multiprocessing.Pool(
+                    jobs,
+                    initializer=cls._InitParallelWorker,
+                    initargs=(cls._parallel_context, initializer),
+                ) as pool:
                     submit = pool.imap if ordered else pool.imap_unordered
                     return callback(
                         pool,
                         output,
-                        submit(func, inputs, chunksize=WORKER_BATCH_SIZE),
+                        submit(func, inputs, chunksize=chunksize),
                     )
         finally:
             if isinstance(output, progress.Progress):
@@ -501,7 +547,3 @@ class MirrorSafeCommand:
     """Command permits itself to run within a mirror, and does not require a
     working directory.
     """
-
-
-class GitcClientCommand:
-    """Command that requires the local client to be a GITC client."""

View File

@@ -1 +1,2 @@
-black<24
+# NB: Keep in sync with run_tests.vpython3.
+black<26

View File

@@ -141,7 +141,7 @@ Instead, you should use standard Git workflows like [git worktree] or
   (e.g. a local mirror & a public review server) while avoiding duplicating
   the content. However, this can run into problems if different remotes use
   the same path on their respective servers. Best to avoid that.
-* `subprojects/`: Like `projects/`, but for git submodules.
+* `modules/`: Like `projects/`, but for git submodules.
 * `subproject-objects/`: Like `project-objects/`, but for git submodules.
 * `worktrees/`: Bare checkouts of every project synced by the manifest. The
   filesystem layout matches the `<project name=...` setting in the manifest

View File

@@ -107,11 +107,13 @@ following DTD:
     <!ATTLIST extend-project remote CDATA #IMPLIED>
     <!ATTLIST extend-project dest-branch CDATA #IMPLIED>
    <!ATTLIST extend-project upstream CDATA #IMPLIED>
+    <!ATTLIST extend-project base-rev CDATA #IMPLIED>

     <!ELEMENT remove-project EMPTY>
     <!ATTLIST remove-project name CDATA #IMPLIED>
     <!ATTLIST remove-project path CDATA #IMPLIED>
     <!ATTLIST remove-project optional CDATA #IMPLIED>
+    <!ATTLIST remove-project base-rev CDATA #IMPLIED>

     <!ELEMENT repo-hooks EMPTY>
     <!ATTLIST repo-hooks in-project CDATA #REQUIRED>
@@ -229,26 +231,7 @@ At most one manifest-server may be specified. The url attribute
 is used to specify the URL of a manifest server, which is an
 XML RPC service.

-The manifest server should implement the following RPC methods:
-
-  GetApprovedManifest(branch, target)
-
-  Return a manifest in which each project is pegged to a known good revision
-  for the current branch and target. This is used by repo sync when the
-  --smart-sync option is given.
-
-  The target to use is defined by environment variables TARGET_PRODUCT
-  and TARGET_BUILD_VARIANT. These variables are used to create a string
-  of the form $TARGET_PRODUCT-$TARGET_BUILD_VARIANT, e.g. passion-userdebug.
-  If one of those variables or both are not present, the program will call
-  GetApprovedManifest without the target parameter and the manifest server
-  should choose a reasonable default target.
-
-  GetManifest(tag)
-
-  Return a manifest in which each project is pegged to the revision at
-  the specified tag. This is used by repo sync when the --smart-tag option
-  is given.
+See the [smart sync documentation](./smart-sync.md) for more details.

 ### Element submanifest
@@ -433,6 +416,14 @@ project. Same syntax as the corresponding element of `project`.
 Attribute `upstream`: If specified, overrides the upstream of the original
 project. Same syntax as the corresponding element of `project`.

+Attribute `base-rev`: If specified, adds a check against the revision
+to be extended. Manifest parse will fail and give a list of mismatch extends
+if the revisions being extended have changed since base-rev was set.
+Intended for use with layered manifests using hash revisions to prevent
+patch branches hiding newer upstream revisions. Also compares named refs
+like branches or tags but is misleading if branches are used as base-rev.
+Same syntax as the corresponding element of `project`.
+
 ### Element annotation

 Zero or more annotation elements may be specified as children of a
@@ -496,6 +487,14 @@ name. Logic otherwise behaves like both are specified.
 Attribute `optional`: Set to true to ignore remove-project elements with no
 matching `project` element.

+Attribute `base-rev`: If specified, adds a check against the revision
+to be removed. Manifest parse will fail and give a list of mismatch removes
+if the revisions being removed have changed since base-rev was set.
+Intended for use with layered manifests using hash revisions to prevent
+patch branches hiding newer upstream revisions. Also compares named refs
+like branches or tags but is misleading if branches are used as base-rev.
+Same syntax as the corresponding element of `project`.
+
 ### Element repo-hooks

 NB: See the [practical documentation](./repo-hooks.md) for using repo hooks.

View File

@@ -96,6 +96,9 @@ If that tag is valid, then repo will warn and use that commit instead.
 If that tag cannot be verified, it gives up and forces the user to resolve.

+If env variable `REPO_SKIP_SELF_UPDATE` is defined, this will
+bypass the self update algorithm.
+
 ### Force an update

 The `repo selfupdate` command can be used to force an immediate update.

129
docs/smart-sync.md Normal file
View File

@ -0,0 +1,129 @@
# repo Smart Syncing
Repo normally fetches & syncs manifests from the same URL specified during
`repo init`, and that often fetches the latest revisions of all projects in
the manifest. This flow works well for tracking and developing with the
latest code, but often it's desirable to sync to other points. For example,
to get a local build matching a specific release or build to reproduce bugs
reported by other people.
Repo's sync subcommand has support for fetching manifests from a server over
an XML-RPC connection. The local configuration and network API are defined by
repo, but individual projects have to host their own server for the client to
communicate with.
This process is called "smart syncing" -- instead of blindly fetching the latest
revision of all projects and getting an unknown state to develop against, the
client passes a request to the server and is given a matching manifest that
typically specifies specific commits for every project to fetch a known source
state.
[TOC]
## Manifest Configuration
The manifest specifies the server to communicate with via the
[`<manifest-server>` element](manifest-format.md#Element-manifest_server).
This is how the client knows what service to talk to.
```xml
<manifest-server url="https://example.com/your/manifest/server/url" />
```
If the URL starts with `persistent-`, then the
[`git-remote-persistent-https` helper](https://github.com/git/git/blob/HEAD/contrib/persistent-https/README)
is used to communicate with the server.
## Credentials
Credentials may be specified directly in typical `username:password`
[URI syntax](https://en.wikipedia.org/wiki/URI#Syntax) in the
`<manifest-server>` element directly in the manifest.
If they are not specified, `repo sync` has `--manifest-server-username=USERNAME`
and `--manifest-server-password=PASSWORD` options.
If those are not used, then repo will look up the host in your
[`~/.netrc`](https://docs.python.org/3/library/netrc.html) database.
When making the connection, cookies matching the host are automatically loaded
from the cookiejar specified in
[Git's `http.cookiefile` setting](https://git-scm.com/docs/git-config#Documentation/git-config.txt-httpcookieFile).
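The lookup order above (URI userinfo, then command-line options, then `~/.netrc`) can be sketched with Python's stdlib helpers; the function name here is made up for illustration and is not repo's actual code:

```python
import netrc
from urllib.parse import urlparse


def manifest_server_credentials(url, username=None, password=None):
    """Resolve credentials: URI userinfo, then explicit options, then ~/.netrc."""
    parsed = urlparse(url)
    # 1. username:password embedded directly in the <manifest-server> URL.
    if parsed.username:
        return parsed.username, parsed.password
    # 2. --manifest-server-username / --manifest-server-password options.
    if username:
        return username, password
    # 3. Fall back to the ~/.netrc database, keyed by host.
    try:
        auth = netrc.netrc().authenticators(parsed.hostname)
    except (FileNotFoundError, netrc.NetrcParseError):
        auth = None
    if auth:
        login, _account, pw = auth
        return login, pw
    return None, None
```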
## Manifest Server
Unfortunately, there are no public reference implementations. Google has an
internal one for Android, but it is written using Google's internal systems,
so wouldn't be that helpful as a reference.
That said, the XML-RPC API is pretty simple, so any standard XML-RPC server
example would do. Google's internal server uses Python's
[xmlrpc.server.SimpleXMLRPCDispatcher](https://docs.python.org/3/library/xmlrpc.server.html).
## Network API
The manifest server should implement the following RPC methods.
### GetApprovedManifest
> `GetApprovedManifest(branch: str, target: Optional[str]) -> str`
The meaning of `branch` and `target` is not strictly defined. The server may
interpret them however it wants. The recommended interpretation is that the
`branch` matches the manifest branch, and `target` is an identifier for your
project that matches something users would build.
See the client section below for how repo typically generates these values.
The server will return a manifest or an error. If it's an error, repo will
show the output directly to the user to provide a limited feedback channel.
If the user's request is ambiguous and could match multiple manifests, the
server has to decide whether to pick one automatically (and silently such that
the user won't know there were multiple matches), or return an error and force
the user to be more specific.
### GetManifest
> `GetManifest(tag: str) -> str`
The meaning of `tag` is not strictly defined. Projects are encouraged to use
a system where the tag matches a unique source state.
See the client section below for how repo typically generates these values.
The server will return a manifest or an error. If it's an error, repo will
show the output directly to the user to provide a limited feedback channel.
If the user's request is ambiguous and could match multiple manifests, the
server has to decide whether to pick one automatically (and silently such that
the user won't know there were multiple matches), or return an error and force
the user to be more specific.
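Since no public reference implementation exists, here is a minimal sketch of such a server using Python's stdlib XML-RPC support. The manifest store and lookup scheme are invented for illustration; a real server would query a build or release system:

```python
from xmlrpc.server import SimpleXMLRPCServer

# Hypothetical manifest store; keys and contents are placeholders.
APPROVED = {("main", "passion-userdebug"): "<manifest>...</manifest>"}
TAGGED = {"build-1234": "<manifest>...</manifest>"}


def GetApprovedManifest(branch, target=None):
    """Return a pinned manifest for branch/target; a raise becomes an XML-RPC Fault."""
    try:
        return APPROVED[(branch, target)]
    except KeyError:
        raise ValueError(f"no approved manifest for {branch!r}/{target!r}")


def GetManifest(tag):
    """Return the manifest recorded for a unique source-state tag."""
    try:
        return TAGGED[tag]
    except KeyError:
        raise ValueError(f"unknown tag: {tag!r}")


def serve(host="localhost", port=8000):
    # Register both methods under the names repo calls over XML-RPC.
    server = SimpleXMLRPCServer((host, port), allow_none=True)
    server.register_function(GetApprovedManifest)
    server.register_function(GetManifest)
    server.serve_forever()
```

The error-string-to-user feedback channel described above falls out naturally: any exception raised by a handler is serialized as an XML-RPC Fault, which repo shows to the user.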
## Client Options
Once repo has successfully downloaded the manifest from the server, it saves a
copy into `.repo/manifests/smart_sync_override.xml` so users can examine it.
The next time `repo sync` is run, this file is automatically replaced or removed
based on the current set of options.
### --smart-sync
Repo will call `GetApprovedManifest(branch[, target])`.
The `branch` is determined by the current manifest branch as specified by
`--manifest-branch=BRANCH` when running `repo init`.
The `target` is defined by environment variables in the order below. If none
of them match, then `target` is omitted. These variables were chosen because
they match the settings Android build environments automatically set up.
1. `${SYNC_TARGET}`: If defined, the value is used directly.
2. `${TARGET_PRODUCT}-${TARGET_RELEASE}-${TARGET_BUILD_VARIANT}`: If these
variables are all defined, then they are merged with `-` and used.
3. `${TARGET_PRODUCT}-${TARGET_BUILD_VARIANT}`: If these variables are all
defined, then they are merged with `-` and used.
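The resolution order above can be expressed as a short helper (a sketch for illustration, not repo's actual implementation):

```python
import os


def smart_sync_target(env=None):
    """Return the target string for GetApprovedManifest, or None if unset."""
    env = os.environ if env is None else env
    # 1. SYNC_TARGET wins outright if defined.
    if env.get("SYNC_TARGET"):
        return env["SYNC_TARGET"]
    product = env.get("TARGET_PRODUCT")
    release = env.get("TARGET_RELEASE")
    variant = env.get("TARGET_BUILD_VARIANT")
    # 2. Three-part form when all of product/release/variant are defined.
    if product and release and variant:
        return f"{product}-{release}-{variant}"
    # 3. Two-part fallback.
    if product and variant:
        return f"{product}-{variant}"
    # No match: target is omitted from the RPC call.
    return None
```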
### --smart-tag=TAG
Repo will call `GetManifest(TAG)`.

View File

@@ -107,8 +107,8 @@ class GitError(RepoError):
         return self.message


-class GitcUnsupportedError(RepoExitError):
-    """Gitc no longer supported."""
+class GitAuthError(RepoExitError):
+    """Cannot talk to remote due to auth issue."""


 class UploadError(RepoError):

View File

@@ -168,8 +168,10 @@ class EventLog:
             f.write("\n")


-# An integer id that is unique across this invocation of the program.
-_EVENT_ID = multiprocessing.Value("i", 1)
+# An integer id that is unique across this invocation of the program, to be set
+# by the first Add event. We can't set it here since it results in leaked
+# resources (see: https://issues.gerritcodereview.com/353656374).
+_EVENT_ID = None


 def _NextEventId():
@@ -178,6 +180,12 @@ def _NextEventId():
     Returns:
         A unique, to this invocation of the program, integer id.
     """
+    global _EVENT_ID
+    if _EVENT_ID is None:
+        # There is a small chance of race condition - two parallel processes
+        # setting up _EVENT_ID. However, we expect TASK_COMMAND to happen before
+        # mp kicks in.
+        _EVENT_ID = multiprocessing.Value("i", 1)
     with _EVENT_ID.get_lock():
         val = _EVENT_ID.value
         _EVENT_ID.value += 1

View File

@@ -238,9 +238,9 @@ def _build_env(
             s = p + " " + s

         env["GIT_CONFIG_PARAMETERS"] = s
     if "GIT_ALLOW_PROTOCOL" not in env:
-        env[
-            "GIT_ALLOW_PROTOCOL"
-        ] = "file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc"
+        env["GIT_ALLOW_PROTOCOL"] = (
+            "file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc"
+        )
     env["GIT_HTTP_USER_AGENT"] = user_agent.git

     if objdir:
@@ -313,10 +313,13 @@ class GitCommand:
             cwd = None
         command_name = cmdv[0]
         command.append(command_name)
+
+        if command_name in ("fetch", "clone"):
+            env["GIT_TERMINAL_PROMPT"] = "0"
+
         # Need to use the --progress flag for fetch/clone so output will be
-        # displayed as by default git only does progress output if stderr is a
-        # TTY.
-        if sys.stderr.isatty() and command_name in ("fetch", "clone"):
+        # displayed as by default git only does progress output if stderr is
+        # a TTY.
+        if sys.stderr.isatty():
             if "--progress" not in cmdv and "--quiet" not in cmdv:
                 command.append("--progress")
         command.extend(cmdv[1:])
@@ -347,9 +350,9 @@ class GitCommand:
                     "Project": e.project,
                     "CommandName": command_name,
                     "Message": str(e),
-                    "ReturnCode": str(e.git_rc)
-                    if e.git_rc is not None
-                    else None,
+                    "ReturnCode": (
+                        str(e.git_rc) if e.git_rc is not None else None
+                    ),
                     "IsError": log_as_error,
                 }
             )

View File

@@ -90,6 +90,20 @@ class GitConfig:
     @staticmethod
     def _getUserConfig():
+        """Get the user-specific config file.
+
+        Prefers the XDG config location if available, with fallback to
+        ~/.gitconfig
+
+        This matches git behavior:
+        https://git-scm.com/docs/git-config#FILES
+        """
+        xdg_config_home = os.getenv(
+            "XDG_CONFIG_HOME", os.path.expanduser("~/.config")
+        )
+        xdg_config_file = os.path.join(xdg_config_home, "git", "config")
+        if os.path.exists(xdg_config_file):
+            return xdg_config_file
+
         return os.path.expanduser("~/.gitconfig")

     @classmethod

View File

@@ -307,8 +307,6 @@ class Superproject:
             )
             return SyncResult(False, False)

-        _PrintBetaNotice()
-
         should_exit = True
         if not self._remote_url:
             self._LogWarning(
@@ -452,16 +450,6 @@ class Superproject:
         return UpdateProjectsResult(manifest_path, False)


-@functools.lru_cache(maxsize=10)
-def _PrintBetaNotice():
-    """Print the notice of beta status."""
-    print(
-        "NOTICE: --use-superproject is in beta; report any issues to the "
-        "address described in `repo version`",
-        file=sys.stderr,
-    )
-
-
 @functools.lru_cache(maxsize=None)
 def _UseSuperprojectFromConfiguration():
     """Returns the user choice of whether to use superproject."""

View File

@@ -130,10 +130,10 @@ class BaseEventLog:
             "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
         }

-    def StartEvent(self):
+    def StartEvent(self, argv):
         """Append a 'start' event to the current log."""
         start_event = self._CreateEventDict("start")
-        start_event["argv"] = sys.argv
+        start_event["argv"] = argv
         self._log.append(start_event)

     def ExitEvent(self, result):
@@ -159,9 +159,11 @@ class BaseEventLog:
             name: Name of the primary command (ex: repo, git)
             subcommands: List of the sub-commands (ex: version, init, sync)
         """
-        command_event = self._CreateEventDict("command")
+        command_event = self._CreateEventDict("cmd_name")
+        name = f"{name}-"
+        name += "-".join(subcommands)
         command_event["name"] = name
-        command_event["subcommands"] = subcommands
+        command_event["hierarchy"] = name
         self._log.append(command_event)

     def LogConfigEvents(self, config, event_dict_name):

View File

@@ -45,7 +45,6 @@ from command import InteractiveCommand
 from command import MirrorSafeCommand
 from editor import Editor
 from error import DownloadError
-from error import GitcUnsupportedError
 from error import InvalidProjectGroupsError
 from error import ManifestInvalidRevisionError
 from error import ManifestParseError
@@ -308,10 +307,6 @@ class _Repo:
                 outer_client=outer_client,
             )

-        if Wrapper().gitc_parse_clientdir(os.getcwd()):
-            logger.error("GITC is not supported.")
-            raise GitcUnsupportedError()
-
         try:
             cmd = self.commands[name](
                 repodir=self.repodir,
@@ -357,7 +352,7 @@ class _Repo:
             start = time.time()
             cmd_event = cmd.event_log.Add(name, event_log.TASK_COMMAND, start)
             cmd.event_log.SetParent(cmd_event)
-            git_trace2_event_log.StartEvent()
+            git_trace2_event_log.StartEvent(["repo", name] + argv)
             git_trace2_event_log.CommandEvent(name="repo", subcommands=[name])

         def execute_command_helper():

43
man/repo-gc.1 Normal file
View File

@ -0,0 +1,43 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "December 2024" "repo gc" "Repo Manual"
.SH NAME
repo \- repo gc - manual page for repo gc
.SH SYNOPSIS
.B repo
\fI\,gc\/\fR
.SH DESCRIPTION
Summary
.PP
Cleaning up internal repo state.
.SH OPTIONS
.TP
\fB\-h\fR, \fB\-\-help\fR
show this help message and exit
.TP
\fB\-n\fR, \fB\-\-dry\-run\fR
do everything except actually delete
.TP
\fB\-y\fR, \fB\-\-yes\fR
answer yes to all safe prompts
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR
show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help gc` to view the detailed manual.

View File

@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "October 2022" "repo init" "Repo Manual"
+.TH REPO "1" "September 2024" "repo init" "Repo Manual"
 .SH NAME
 repo \- repo init - manual page for repo init
 .SH SYNOPSIS
@@ -28,6 +28,11 @@ manifest repository location
 \fB\-b\fR REVISION, \fB\-\-manifest\-branch\fR=\fI\,REVISION\/\fR
 manifest branch or revision (use HEAD for default)
 .TP
+\fB\-\-manifest\-upstream\-branch\fR=\fI\,BRANCH\/\fR
+when a commit is provided to \fB\-\-manifest\-branch\fR, this
+is the name of the git ref in which the commit can be
+found
+.TP
 \fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
 initial manifest file
 .TP
@@ -163,6 +168,10 @@ The optional \fB\-b\fR argument can be used to select the manifest branch to che
 and use. If no branch is specified, the remote's default branch is used. This is
 equivalent to using \fB\-b\fR HEAD.
 .PP
+The optional \fB\-\-manifest\-upstream\-branch\fR argument can be used when a commit is
+provided to \fB\-\-manifest\-branch\fR (or \fB\-b\fR), to specify the name of the git ref in
+which the commit can be found.
+.PP
 The optional \fB\-m\fR argument can be used to specify an alternate manifest to be
 used. If no manifest is specified, the manifest default.xml will be used.
 .PP

View File

@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "April 2024" "repo manifest" "Repo Manual"
+.TH REPO "1" "December 2024" "repo manifest" "Repo Manual"
 .SH NAME
 repo \- repo manifest - manual page for repo manifest
 .SH SYNOPSIS
@@ -192,11 +192,13 @@ CDATA #IMPLIED>
 <!ATTLIST extend\-project remote CDATA #IMPLIED>
 <!ATTLIST extend\-project dest\-branch CDATA #IMPLIED>
 <!ATTLIST extend\-project upstream CDATA #IMPLIED>
+<!ATTLIST extend\-project base\-rev CDATA #IMPLIED>
 .IP
 <!ELEMENT remove\-project EMPTY>
 <!ATTLIST remove\-project name CDATA #IMPLIED>
 <!ATTLIST remove\-project path CDATA #IMPLIED>
 <!ATTLIST remove\-project optional CDATA #IMPLIED>
+<!ATTLIST remove\-project base\-rev CDATA #IMPLIED>
 .IP
 <!ELEMENT repo\-hooks EMPTY>
 <!ATTLIST repo\-hooks in\-project CDATA #REQUIRED>
@@ -495,6 +497,14 @@ project. Same syntax as the corresponding element of `project`.
 Attribute `upstream`: If specified, overrides the upstream of the original
 project. Same syntax as the corresponding element of `project`.
 .PP
+Attribute `base\-rev`: If specified, adds a check against the revision to be
+extended. Manifest parse will fail and give a list of mismatch extends if the
+revisions being extended have changed since base\-rev was set. Intended for use
+with layered manifests using hash revisions to prevent patch branches hiding
+newer upstream revisions. Also compares named refs like branches or tags but is
+misleading if branches are used as base\-rev. Same syntax as the corresponding
+element of `project`.
+.PP
 Element annotation
 .PP
 Zero or more annotation elements may be specified as children of a project or
@@ -556,6 +566,14 @@ Logic otherwise behaves like both are specified.
 Attribute `optional`: Set to true to ignore remove\-project elements with no
 matching `project` element.
 .PP
+Attribute `base\-rev`: If specified, adds a check against the revision to be
+removed. Manifest parse will fail and give a list of mismatch removes if the
+revisions being removed have changed since base\-rev was set. Intended for use
+with layered manifests using hash revisions to prevent patch branches hiding
+newer upstream revisions. Also compares named refs like branches or tags but is
+misleading if branches are used as base\-rev. Same syntax as the corresponding
+element of `project`.
+.PP
 Element repo\-hooks
 .PP
 NB: See the [practical documentation](./repo\-hooks.md) for using repo hooks.
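The base-rev safety check documented above is easy to state in code. A minimal Python sketch of the comparison (illustrative names and data shapes, not repo's actual API):

```python
def check_base_revs(current_revs, base_revs):
    """Return mismatch messages for projects whose pinned revision drifted.

    current_revs maps project name -> revision currently in the manifest;
    base_revs maps project name -> the base-rev recorded by the layered
    manifest. Any drift means the override could silently hide a newer
    upstream revision, so manifest parsing should fail with the full list.
    """
    failed = []
    for name, base_rev in base_revs.items():
        current = current_revs.get(name)
        if current is not None and current != base_rev:
            failed.append(
                "%s mismatch base %s vs revision %s" % (name, base_rev, current)
            )
    return failed
```

Collecting every mismatch before failing (rather than raising on the first one) lets the user fix all stale base-revs in a single pass.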

View File

@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "April 2024" "repo smartsync" "Repo Manual"
+.TH REPO "1" "September 2024" "repo smartsync" "Repo Manual"
 .SH NAME
 repo \- repo smartsync - manual page for repo smartsync
 .SH SYNOPSIS
@@ -47,6 +47,10 @@ force remove projects with uncommitted modifications
 if projects no longer exist in the manifest. WARNING:
 this may cause loss of data
 .TP
+\fB\-\-rebase\fR
+rebase local commits regardless of whether they are
+published
+.TP
 \fB\-l\fR, \fB\-\-local\-only\fR
 only update working tree, don't fetch
 .TP

View File

@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "April 2024" "repo sync" "Repo Manual"
+.TH REPO "1" "September 2024" "repo sync" "Repo Manual"
 .SH NAME
 repo \- repo sync - manual page for repo sync
 .SH SYNOPSIS
@@ -47,6 +47,10 @@ force remove projects with uncommitted modifications
 if projects no longer exist in the manifest. WARNING:
 this may cause loss of data
 .TP
+\fB\-\-rebase\fR
+rebase local commits regardless of whether they are
+published
+.TP
 \fB\-l\fR, \fB\-\-local\-only\fR
 only update working tree, don't fetch
 .TP
.TP .TP

View File

@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "April 2024" "repo" "Repo Manual"
+.TH REPO "1" "December 2024" "repo" "Repo Manual"
 .SH NAME
 repo \- repository management tool built on top of git
 .SH SYNOPSIS
@@ -79,6 +79,9 @@ Download and checkout a change
 forall
 Run a shell command in each project
 .TP
+gc
+Cleaning up internal repo state.
+.TP
 grep
 Print lines matching a pattern
 .TP

View File

@@ -1014,9 +1014,9 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
     def SetManifestOverride(self, path):
         """Override manifestFile. The caller must call Unload()"""
-        self._outer_client.manifest.manifestFileOverrides[
-            self.path_prefix
-        ] = path
+        self._outer_client.manifest.manifestFileOverrides[self.path_prefix] = (
+            path
+        )

     @property
     def UseLocalManifests(self):
@@ -1445,6 +1445,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
         repo_hooks_project = None
         enabled_repo_hooks = None
+        failed_revision_changes = []
         for node in itertools.chain(*node_list):
             if node.nodeName == "project":
                 project = self._ParseProject(node)
@@ -1471,6 +1472,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
                 remote = self._get_remote(node)
                 dest_branch = node.getAttribute("dest-branch")
                 upstream = node.getAttribute("upstream")
+                base_revision = node.getAttribute("base-rev")
                 named_projects = self._projects[name]
                 if dest_path and not path and len(named_projects) > 1:
@@ -1484,6 +1486,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
                     if groups:
                         p.groups.extend(groups)
                     if revision:
+                        if base_revision:
+                            if p.revisionExpr != base_revision:
+                                failed_revision_changes.append(
+                                    "extend-project name %s mismatch base "
+                                    "%s vs revision %s"
+                                    % (name, base_revision, p.revisionExpr)
+                                )
                         p.SetRevision(revision)
                     if remote_name:
@@ -1558,6 +1567,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
             if node.nodeName == "remove-project":
                 name = node.getAttribute("name")
                 path = node.getAttribute("path")
+                base_revision = node.getAttribute("base-rev")
                 # Name or path needed.
                 if not name and not path:
@@ -1571,6 +1581,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
                 for projname, projects in list(self._projects.items()):
                     for p in projects:
                         if name == projname and not path:
+                            if base_revision:
+                                if p.revisionExpr != base_revision:
+                                    failed_revision_changes.append(
+                                        "remove-project name %s mismatch base "
+                                        "%s vs revision %s"
+                                        % (name, base_revision, p.revisionExpr)
+                                    )
                             del self._paths[p.relpath]
                             if not removed_project:
                                 del self._projects[name]
@@ -1578,6 +1595,17 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
                         elif path == p.relpath and (
                             name == projname or not name
                         ):
+                            if base_revision:
+                                if p.revisionExpr != base_revision:
+                                    failed_revision_changes.append(
+                                        "remove-project path %s mismatch base "
+                                        "%s vs revision %s"
+                                        % (
+                                            p.relpath,
+                                            base_revision,
+                                            p.revisionExpr,
+                                        )
+                                    )
                             self._projects[projname].remove(p)
                             del self._paths[p.relpath]
                             removed_project = p.name
@@ -1597,6 +1625,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
                     "project: %s" % node.toxml()
                 )
+        if failed_revision_changes:
+            raise ManifestParseError(
+                "revision base check failed, rebase patches and update "
+                "base revs for: ",
+                failed_revision_changes,
+            )
+
         # Store repo hooks project information.
         if repo_hooks_project:
             # Store a reference to the Project.
@@ -2021,7 +2056,12 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
         path = path.rstrip("/")
         name = name.rstrip("/")
         relpath = self._JoinRelpath(parent.relpath, path)
-        gitdir = os.path.join(parent.gitdir, "subprojects", "%s.git" % path)
+        subprojects = os.path.join(parent.gitdir, "subprojects", f"{path}.git")
+        modules = os.path.join(parent.gitdir, "modules", path)
+        if platform_utils.isdir(subprojects):
+            gitdir = subprojects
+        else:
+            gitdir = modules
         objdir = os.path.join(
             parent.gitdir, "subproject-objects", "%s.git" % name
         )
@@ -2072,22 +2112,22 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
         # implementation:
         # https://eclipse.googlesource.com/jgit/jgit/+/9110037e3e9461ff4dac22fee84ef3694ed57648/org.eclipse.jgit/src/org/eclipse/jgit/lib/ObjectChecker.java#884
         BAD_CODEPOINTS = {
-            "\u200C",  # ZERO WIDTH NON-JOINER
-            "\u200D",  # ZERO WIDTH JOINER
-            "\u200E",  # LEFT-TO-RIGHT MARK
-            "\u200F",  # RIGHT-TO-LEFT MARK
-            "\u202A",  # LEFT-TO-RIGHT EMBEDDING
-            "\u202B",  # RIGHT-TO-LEFT EMBEDDING
-            "\u202C",  # POP DIRECTIONAL FORMATTING
-            "\u202D",  # LEFT-TO-RIGHT OVERRIDE
-            "\u202E",  # RIGHT-TO-LEFT OVERRIDE
-            "\u206A",  # INHIBIT SYMMETRIC SWAPPING
-            "\u206B",  # ACTIVATE SYMMETRIC SWAPPING
-            "\u206C",  # INHIBIT ARABIC FORM SHAPING
-            "\u206D",  # ACTIVATE ARABIC FORM SHAPING
-            "\u206E",  # NATIONAL DIGIT SHAPES
-            "\u206F",  # NOMINAL DIGIT SHAPES
-            "\uFEFF",  # ZERO WIDTH NO-BREAK SPACE
+            "\u200c",  # ZERO WIDTH NON-JOINER
+            "\u200d",  # ZERO WIDTH JOINER
+            "\u200e",  # LEFT-TO-RIGHT MARK
+            "\u200f",  # RIGHT-TO-LEFT MARK
+            "\u202a",  # LEFT-TO-RIGHT EMBEDDING
+            "\u202b",  # RIGHT-TO-LEFT EMBEDDING
+            "\u202c",  # POP DIRECTIONAL FORMATTING
+            "\u202d",  # LEFT-TO-RIGHT OVERRIDE
+            "\u202e",  # RIGHT-TO-LEFT OVERRIDE
+            "\u206a",  # INHIBIT SYMMETRIC SWAPPING
+            "\u206b",  # ACTIVATE SYMMETRIC SWAPPING
+            "\u206c",  # INHIBIT ARABIC FORM SHAPING
+            "\u206d",  # ACTIVATE ARABIC FORM SHAPING
+            "\u206e",  # NATIONAL DIGIT SHAPES
+            "\u206f",  # NOMINAL DIGIT SHAPES
+            "\ufeff",  # ZERO WIDTH NO-BREAK SPACE
         }
         if BAD_CODEPOINTS & path_codepoints:
             # This message is more expansive than reality, but should be fine.

View File

@@ -40,7 +40,7 @@ def RunPager(globalConfig):

 def TerminatePager():
-    global pager_process, old_stdout, old_stderr
+    global pager_process
     if pager_process:
         sys.stdout.flush()
         sys.stderr.flush()

View File

@@ -156,6 +156,12 @@ def remove(path, missing_ok=False):
                 os.rmdir(longpath)
             else:
                 os.remove(longpath)
+        elif (
+            e.errno == errno.EROFS
+            and missing_ok
+            and not os.path.exists(longpath)
+        ):
+            pass
         elif missing_ok and e.errno == errno.ENOENT:
             pass
         else:
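A standalone sketch of the tolerant delete added above: EROFS is forgiven only when `missing_ok` is set and the path is already absent (e.g. a read-only filesystem that never contained the file). This is an illustration, not repo's actual `platform_utils.remove`:

```python
import errno
import os


def remove(path, missing_ok=False):
    """Delete path, tolerating 'already gone' conditions when missing_ok."""
    try:
        os.remove(path)
    except OSError as e:
        if missing_ok and e.errno == errno.ENOENT:
            pass  # Never existed or already deleted: caller opted in.
        elif (
            missing_ok
            and e.errno == errno.EROFS
            and not os.path.exists(path)
        ):
            pass  # Read-only filesystem, but the file is not there anyway.
        else:
            raise
```

Without the extra branch, cleaning up a path on a read-only mount would fail even when there is nothing to clean up.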

View File

@@ -100,6 +100,7 @@ class Progress:
         self._show = not delay
         self._units = units
         self._elide = elide and _TTY
+        self._quiet = quiet
         # Only show the active jobs section if we run more than one in parallel.
         self._show_jobs = False
@@ -114,13 +115,7 @@
         )
         self._update_thread.daemon = True
-        # When quiet, never show any output. It's a bit hacky, but reusing the
-        # existing logic that delays initial output keeps the rest of the class
-        # clean. Basically we set the start time to years in the future.
-        if quiet:
-            self._show = False
-            self._start += 2**32
-        elif show_elapsed:
+        if not quiet and show_elapsed:
             self._update_thread.start()

     def _update_loop(self):
@@ -160,7 +155,7 @@
             msg = self._last_msg
         self._last_msg = msg
-        if not _TTY or IsTraceToStderr():
+        if not _TTY or IsTraceToStderr() or self._quiet:
             return
         elapsed_sec = time.time() - self._start
@@ -202,7 +197,7 @@
     def end(self):
         self._update_event.set()
-        if not _TTY or IsTraceToStderr() or not self._show:
+        if not _TTY or IsTraceToStderr() or self._quiet:
             return
         duration = duration_str(time.time() - self._start)
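The change above replaces the old far-future start-time hack with an explicit `_quiet` flag checked at every render point. A toy sketch of that pattern (not the real `Progress` class; names are illustrative):

```python
import io
import sys


class QuietableProgress:
    """Tiny progress meter: an explicit quiet flag gates each render call,
    instead of faking a start time years in the future (the removed hack)."""

    def __init__(self, total, quiet=False, out=None):
        self._total = total
        self._done = 0
        self._quiet = quiet
        self._out = out if out is not None else sys.stderr

    def update(self, inc=1):
        self._done += inc
        if self._quiet:
            return  # Keep counting, never render.
        self._out.write("\r%d/%d" % (self._done, self._total))

    def end(self):
        if self._quiet:
            return
        self._out.write("\rdone %d/%d\n" % (self._done, self._total))
```

The explicit flag keeps the bookkeeping (counts, elapsed time) intact while guaranteeing zero output, which the clock trick only approximated.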

View File

@@ -32,6 +32,7 @@ import urllib.parse
 from color import Coloring
 from error import DownloadError
+from error import GitAuthError
 from error import GitError
 from error import ManifestInvalidPathError
 from error import ManifestInvalidRevisionError
@@ -575,7 +576,6 @@ class Project:
         dest_branch=None,
         optimized_fetch=False,
         retry_fetches=0,
-        old_revision=None,
     ):
         """Init a Project object.
@@ -608,7 +608,6 @@ class Project:
                 only fetch from the remote if the sha1 is not present locally.
             retry_fetches: Retry remote fetches n times upon receiving transient
                 error with exponential backoff and jitter.
-            old_revision: saved git commit id for open GITC projects.
         """
         self.client = self.manifest = manifest
         self.name = name
@@ -638,12 +637,15 @@
         self.linkfiles = []
         self.annotations = []
         self.dest_branch = dest_branch
-        self.old_revision = old_revision
         # This will be filled in if a project is later identified to be the
         # project containing repo hooks.
         self.enabled_repo_hooks = []
+        # This will be updated later if the project has submodules and
+        # if they will be synced.
+        self.has_subprojects = False
+
     def RelPath(self, local=True):
         """Return the path for the project relative to a manifest.
@@ -1562,6 +1564,11 @@
             return
         self._InitWorkTree(force_sync=force_sync, submodules=submodules)
+        # TODO(https://git-scm.com/docs/git-worktree#_bugs): Re-evaluate if
+        # submodules can be init when using worktrees once its support is
+        # complete.
+        if self.has_subprojects and not self.use_git_worktrees:
+            self._InitSubmodules()
         all_refs = self.bare_ref.all
         self.CleanPublishedCache(all_refs)
         revid = self.GetRevisionId(all_refs)
@@ -1695,6 +1702,8 @@
                     )
                 )
                 return
+            syncbuf.later1(self, _doff, not verbose)
+            return
         elif pub == head:
             # All published commits are merged, and thus we are a
             # strict subset. We can fast-forward safely.
@@ -2188,24 +2197,27 @@
         def get_submodules(gitdir, rev):
             # Parse .gitmodules for submodule sub_paths and sub_urls.
-            sub_paths, sub_urls = parse_gitmodules(gitdir, rev)
+            sub_paths, sub_urls, sub_shallows = parse_gitmodules(gitdir, rev)
             if not sub_paths:
                 return []
             # Run `git ls-tree` to read SHAs of submodule object, which happen
             # to be revision of submodule repository.
             sub_revs = git_ls_tree(gitdir, rev, sub_paths)
             submodules = []
-            for sub_path, sub_url in zip(sub_paths, sub_urls):
+            for sub_path, sub_url, sub_shallow in zip(
+                sub_paths, sub_urls, sub_shallows
+            ):
                 try:
                     sub_rev = sub_revs[sub_path]
                 except KeyError:
                     # Ignore non-exist submodules.
                     continue
-                submodules.append((sub_rev, sub_path, sub_url))
+                submodules.append((sub_rev, sub_path, sub_url, sub_shallow))
             return submodules

         re_path = re.compile(r"^submodule\.(.+)\.path=(.*)$")
         re_url = re.compile(r"^submodule\.(.+)\.url=(.*)$")
+        re_shallow = re.compile(r"^submodule\.(.+)\.shallow=(.*)$")

         def parse_gitmodules(gitdir, rev):
             cmd = ["cat-file", "blob", "%s:.gitmodules" % rev]
@@ -2219,9 +2231,9 @@
                     gitdir=gitdir,
                 )
             except GitError:
-                return [], []
+                return [], [], []
             if p.Wait() != 0:
-                return [], []
+                return [], [], []
             gitmodules_lines = []
             fd, temp_gitmodules_path = tempfile.mkstemp()
@@ -2238,16 +2250,17 @@
                     gitdir=gitdir,
                 )
                 if p.Wait() != 0:
-                    return [], []
+                    return [], [], []
                 gitmodules_lines = p.stdout.split("\n")
             except GitError:
-                return [], []
+                return [], [], []
             finally:
                 platform_utils.remove(temp_gitmodules_path)
             names = set()
             paths = {}
             urls = {}
+            shallows = {}
             for line in gitmodules_lines:
                 if not line:
                     continue
@@ -2261,10 +2274,16 @@
                     names.add(m.group(1))
                     urls[m.group(1)] = m.group(2)
                     continue
+                m = re_shallow.match(line)
+                if m:
+                    names.add(m.group(1))
+                    shallows[m.group(1)] = m.group(2)
+                    continue
             names = sorted(names)
             return (
                 [paths.get(name, "") for name in names],
                 [urls.get(name, "") for name in names],
+                [shallows.get(name, "") for name in names],
             )

         def git_ls_tree(gitdir, rev, paths):
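The `shallow` handling added above follows the same regex-over-`git config -l` pattern as `path` and `url`. A self-contained sketch exercising the same three patterns on sample lines (the module names and URLs are made up):

```python
import re

# Same shape as the patterns in the diff above; `shallow` is the new one.
RE_PATH = re.compile(r"^submodule\.(.+)\.path=(.*)$")
RE_URL = re.compile(r"^submodule\.(.+)\.url=(.*)$")
RE_SHALLOW = re.compile(r"^submodule\.(.+)\.shallow=(.*)$")


def parse_gitmodules_lines(lines):
    """Return aligned (paths, urls, shallows) lists from `git config -l`
    style output: one entry per submodule name, '' when a key is absent."""
    names = set()
    paths, urls, shallows = {}, {}, {}
    for line in lines:
        for regex, store in (
            (RE_PATH, paths),
            (RE_URL, urls),
            (RE_SHALLOW, shallows),
        ):
            m = regex.match(line)
            if m:
                names.add(m.group(1))
                store[m.group(1)] = m.group(2)
                break
    names = sorted(names)
    return (
        [paths.get(n, "") for n in names],
        [urls.get(n, "") for n in names],
        [shallows.get(n, "") for n in names],
    )
```

Returning three parallel lists keyed by sorted name is what lets the caller `zip()` them back together, exactly as `get_submodules` does in the diff.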
@@ -2293,7 +2312,9 @@
         try:
             rev = self.GetRevisionId()
-        except GitError:
+        except (GitError, ManifestInvalidRevisionError):
+            # The git repo may be outdated (i.e. not fetched yet) and querying
+            # its submodules using the revision may not work; so return here.
             return []
         return get_submodules(self.gitdir, rev)
@@ -2303,7 +2324,7 @@
             # If git repo does not exist yet, querying its submodules will
             # mess up its states; so return here.
             return result
-        for rev, path, url in self._GetSubmodules():
+        for rev, path, url, shallow in self._GetSubmodules():
             name = self.manifest.GetSubprojectName(self, path)
             (
                 relpath,
@@ -2325,6 +2346,7 @@
                 review=self.remote.review,
                 revision=self.remote.revision,
             )
+            clone_depth = 1 if shallow.lower() == "true" else None
             subproject = Project(
                 manifest=self.manifest,
                 name=name,
@@ -2341,10 +2363,13 @@
                 sync_s=self.sync_s,
                 sync_tags=self.sync_tags,
                 parent=self,
+                clone_depth=clone_depth,
                 is_derived=True,
             )
             result.append(subproject)
             result.extend(subproject.GetDerivedSubprojects())
+        if result:
+            self.has_subprojects = True
         return result

     def EnableRepositoryExtension(self, key, value="true", version=1):
@@ -2393,26 +2418,25 @@
         try:
             # if revision (sha or tag) is not present then following function
             # throws an error.
-            self.bare_git.rev_list(
-                "-1",
-                "--missing=allow-any",
-                "%s^0" % self.revisionExpr,
-                "--",
-                log_as_error=False,
-            )
+            revs = [f"{self.revisionExpr}^0"]
+            upstream_rev = None
             if self.upstream:
-                rev = self.GetRemote().ToLocal(self.upstream)
+                upstream_rev = self.GetRemote().ToLocal(self.upstream)
+                revs.append(upstream_rev)
+
             self.bare_git.rev_list(
                 "-1",
                 "--missing=allow-any",
-                "%s^0" % rev,
+                *revs,
                 "--",
                 log_as_error=False,
             )
+
+            if self.upstream:
                 self.bare_git.merge_base(
                     "--is-ancestor",
                     self.revisionExpr,
-                    rev,
+                    upstream_rev,
                     log_as_error=False,
                 )
             return True
@@ -2662,7 +2686,10 @@
             # TODO(b/360889369#comment24): git may gc commits incorrectly.
             # Until the root cause is fixed, retry fetch with --refetch which
             # will bring the repository into a good state.
-            elif gitcmd.stdout and "could not parse commit" in gitcmd.stdout:
+            elif gitcmd.stdout and (
+                "could not parse commit" in gitcmd.stdout
+                or "unable to parse commit" in gitcmd.stdout
+            ):
                 cmd.insert(1, "--refetch")
                 print(
                     "could not parse commit error, retrying with refetch",
@@ -2695,12 +2722,47 @@
                 )
                 # Continue right away so we don't sleep as we shouldn't need to.
                 continue
+            elif (
+                ret == 128
+                and gitcmd.stdout
+                and "fatal: could not read Username" in gitcmd.stdout
+            ):
+                # User needs to be authenticated, and Git wants to prompt for
+                # username and password.
+                print(
+                    "git requires authentication, but repo cannot perform "
+                    "interactive authentication. Check git credentials.",
+                    file=output_redir,
+                )
+                break
+            elif (
+                ret == 128
+                and gitcmd.stdout
+                and "remote helper 'sso' aborted session" in gitcmd.stdout
+            ):
+                # User needs to be authenticated, and Git wants to prompt for
+                # username and password.
+                print(
+                    "git requires authentication, but repo cannot perform "
+                    "interactive authentication.",
+                    file=output_redir,
+                )
+                raise GitAuthError(gitcmd.stdout)
+                break
             elif current_branch_only and is_sha1 and ret == 128:
                 # Exit code 128 means "couldn't find the ref you asked for"; if
                 # we're in sha1 mode, we just tried sync'ing from the upstream
                 # field; it doesn't exist, thus abort the optimization attempt
                 # and do a full sync.
                 break
+            elif depth and is_sha1 and ret == 1:
+                # In sha1 mode, when depth is enabled, syncing the revision
+                # from upstream may not work because some servers only allow
+                # fetching named refs. Fetching a specific sha1 may result
+                # in an error like 'server does not allow request for
+                # unadvertised object'. In this case, attempt a full sync
+                # without depth.
+                break
             elif ret < 0:
                 # Git died with a signal, exit immediately.
                 break
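The fetch-failure branches above amount to classifying the git exit status plus its output into a recovery action. A pure-function sketch of that decision table (names and return values are illustrative, not repo's actual control flow):

```python
def classify_fetch_failure(ret, stdout, current_branch_only, is_sha1, depth):
    """Map a failed `git fetch` to the recovery action the sync loop takes."""
    stdout = stdout or ""
    if ret == 128 and "fatal: could not read Username" in stdout:
        return "auth-error"  # Git wanted to prompt; repo cannot.
    if ret == 128 and "remote helper 'sso' aborted session" in stdout:
        return "auth-error"
    if current_branch_only and is_sha1 and ret == 128:
        return "full-sync"  # Upstream ref missing; drop the optimization.
    if depth and is_sha1 and ret == 1:
        return "full-sync"  # Server refused the unadvertised sha1; drop depth.
    if ret < 0:
        return "abort"  # Git died with a signal.
    return "retry"
```

Note the ordering matters: authentication failures must be recognized before the generic `ret == 128` sha1 fallback, or the loop would pointlessly retry a full sync against a server it cannot authenticate to.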
@@ -2821,7 +2883,14 @@
         # We do not use curl's --retry option since it generally doesn't
         # actually retry anything; code 18 for example, it will not retry on.
-        cmd = ["curl", "--fail", "--output", tmpPath, "--netrc", "--location"]
+        cmd = [
+            "curl",
+            "--fail",
+            "--output",
+            tmpPath,
+            "--netrc-optional",
+            "--location",
+        ]
         if quiet:
             cmd += ["--silent", "--show-error"]
         if os.path.exists(tmpPath):
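The change above swaps `--netrc` for `--netrc-optional`, which consults `~/.netrc` only when it exists instead of erroring out when it is missing (the "curl: (26) .netrc error" from the commit message). A sketch of the command construction with a hypothetical helper name:

```python
def build_curl_cmd(tmp_path, quiet=False):
    """Build the curl download command.

    --netrc-optional (available since curl 7.9.8) reads ~/.netrc if
    present but, unlike --netrc, does not fail when the file is absent.
    """
    cmd = [
        "curl",
        "--fail",
        "--output",
        tmp_path,
        "--netrc-optional",
        "--location",
    ]
    if quiet:
        cmd += ["--silent", "--show-error"]
    return cmd
```

Building the argument vector as a list (rather than a shell string) also sidesteps quoting issues when `tmp_path` contains spaces.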
@@ -2966,6 +3035,17 @@
                 project=self.name,
             )

+    def _InitSubmodules(self, quiet=True):
+        """Initialize the submodules for the project."""
+        cmd = ["submodule", "init"]
+        if quiet:
+            cmd.append("-q")
+        if GitCommand(self, cmd).Wait() != 0:
+            raise GitError(
+                f"{self.name} submodule init",
+                project=self.name,
+            )
+
     def _Rebase(self, upstream, onto=None):
         cmd = ["rebase"]
         if onto is not None:
@@ -3341,20 +3421,25 @@
             setting = fp.read()
         assert setting.startswith("gitdir:")
         git_worktree_path = setting.split(":", 1)[1].strip()
+        # `gitdir` maybe be either relative or absolute depending on the
+        # behavior of the local copy of git, so only convert the path to
+        # relative if it needs to be converted.
+        if os.path.isabs(git_worktree_path):
             # Some platforms (e.g. Windows) won't let us update dotgit in situ
-            # because of file permissions. Delete it and recreate it from scratch
-            # to avoid.
+            # because of file permissions. Delete it and recreate it from
+            # scratch to avoid.
             platform_utils.remove(dotgit)
-        # Use relative path from checkout->worktree & maintain Unix line endings
-        # on all OS's to match git behavior.
+            # Use relative path from checkout->worktree & maintain Unix line
+            # endings on all OS's to match git behavior.
             with open(dotgit, "w", newline="\n") as fp:
                 print(
                     "gitdir:",
                     os.path.relpath(git_worktree_path, self.worktree),
                     file=fp,
                 )
-        # Use relative path from worktree->checkout & maintain Unix line endings
-        # on all OS's to match git behavior.
+            # Use relative path from worktree->checkout & maintain Unix line
+            # endings on all OS's to match git behavior.
             with open(
                 os.path.join(git_worktree_path, "gitdir"), "w", newline="\n"
             ) as fp:
@@ -3379,6 +3464,11 @@
         """
         dotgit = os.path.join(self.worktree, ".git")
+        # If bare checkout of the submodule is stored under the subproject dir,
+        # migrate it.
+        if self.parent:
+            self._MigrateOldSubmoduleDir()
+
         # If using an old layout style (a directory), migrate it.
         if not platform_utils.islink(dotgit) and platform_utils.isdir(dotgit):
             self._MigrateOldWorkTreeGitDir(dotgit, project=self.name)
@ -3389,34 +3479,76 @@ class Project:
self._InitGitWorktree() self._InitGitWorktree()
self._CopyAndLinkFiles() self._CopyAndLinkFiles()
else: else:
# Remove old directory symbolic links for submodules.
if self.parent and platform_utils.islink(dotgit):
platform_utils.remove(dotgit)
init_dotgit = True
if not init_dotgit: if not init_dotgit:
# See if the project has changed. # See if the project has changed.
if os.path.realpath(self.gitdir) != os.path.realpath(dotgit): self._removeBadGitDirLink(dotgit)
platform_utils.remove(dotgit)
if init_dotgit or not os.path.exists(dotgit): if init_dotgit or not os.path.exists(dotgit):
os.makedirs(self.worktree, exist_ok=True) self._createDotGit(dotgit)
platform_utils.symlink(
os.path.relpath(self.gitdir, self.worktree), dotgit
)
if init_dotgit: if init_dotgit:
_lwrite( _lwrite(
os.path.join(dotgit, HEAD), "%s\n" % self.GetRevisionId() os.path.join(self.gitdir, HEAD), f"{self.GetRevisionId()}\n"
) )
# Finish checking out the worktree. # Finish checking out the worktree.
cmd = ["read-tree", "--reset", "-u", "-v", HEAD] cmd = ["read-tree", "--reset", "-u", "-v", HEAD]
try:
if GitCommand(self, cmd).Wait() != 0: if GitCommand(self, cmd).Wait() != 0:
raise GitError( raise GitError(
"Cannot initialize work tree for " + self.name, "Cannot initialize work tree for " + self.name,
project=self.name, project=self.name,
) )
except Exception as e:
# Something went wrong with read-tree (perhaps fetching
# missing blobs), so remove .git to avoid half initialized
# workspace from which repo can't recover on its own.
platform_utils.remove(dotgit)
raise e
if submodules: if submodules:
self._SyncSubmodules(quiet=True) self._SyncSubmodules(quiet=True)
self._CopyAndLinkFiles() self._CopyAndLinkFiles()
+    def _createDotGit(self, dotgit):
+        """Initialize .git path.
+
+        For submodule projects, create a '.git' file using the gitfile
+        mechanism, and for the rest, create a symbolic link.
+        """
+        os.makedirs(self.worktree, exist_ok=True)
+        if self.parent:
+            _lwrite(
+                dotgit,
+                f"gitdir: {os.path.relpath(self.gitdir, self.worktree)}\n",
+            )
+        else:
+            platform_utils.symlink(
+                os.path.relpath(self.gitdir, self.worktree), dotgit
+            )
+
+    def _removeBadGitDirLink(self, dotgit):
+        """Verify .git is initialized correctly, otherwise delete it."""
+        if self.parent and os.path.isfile(dotgit):
+            with open(dotgit) as fp:
+                setting = fp.read()
+            if not setting.startswith("gitdir:"):
+                raise GitError(
+                    f"'.git' in {self.worktree} must start with 'gitdir:'",
+                    project=self.name,
+                )
+            gitdir = setting.split(":", 1)[1].strip()
+            dotgit_path = os.path.normpath(os.path.join(self.worktree, gitdir))
+        else:
+            dotgit_path = os.path.realpath(dotgit)
+        if os.path.realpath(self.gitdir) != dotgit_path:
+            platform_utils.remove(dotgit)
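The new `_createDotGit` uses git's "gitfile" mechanism for submodule worktrees: instead of a `.git` directory or symlink, the worktree gets a one-line `.git` file naming the real git dir, which is also what `git submodule` writes. A minimal standalone sketch of the write/read round trip (the helper names and paths here are illustrative, not repo's API):

```python
import os
import tempfile


def write_dotgit(worktree, gitdir):
    """Point |worktree|/.git at |gitdir| via the gitfile mechanism."""
    os.makedirs(worktree, exist_ok=True)
    with open(os.path.join(worktree, ".git"), "w") as fp:
        fp.write(f"gitdir: {os.path.relpath(gitdir, worktree)}\n")


def read_dotgit(worktree):
    """Resolve the git dir a gitfile points at (as _removeBadGitDirLink does)."""
    with open(os.path.join(worktree, ".git")) as fp:
        setting = fp.read()
    if not setting.startswith("gitdir:"):
        raise ValueError("not a gitfile")
    gitdir = setting.split(":", 1)[1].strip()
    return os.path.normpath(os.path.join(worktree, gitdir))


with tempfile.TemporaryDirectory() as tmp:
    worktree = os.path.join(tmp, "sub")
    gitdir = os.path.join(tmp, "repo.git")
    write_dotgit(worktree, gitdir)
    assert read_dotgit(worktree) == gitdir
```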
     @classmethod
     def _MigrateOldWorkTreeGitDir(cls, dotgit, project=None):
         """Migrate the old worktree .git/ dir style to a symlink.

@@ -3505,6 +3637,28 @@ class Project:
             dotgit,
         )

+    def _MigrateOldSubmoduleDir(self):
+        """Move the old bare checkout in 'subprojects' to 'modules'
+
+        as bare checkouts of submodules are now in 'modules' dir.
+        """
+        subprojects = os.path.join(self.parent.gitdir, "subprojects")
+        if not platform_utils.isdir(subprojects):
+            return
+
+        modules = os.path.join(self.parent.gitdir, "modules")
+        old = self.gitdir
+        new = os.path.splitext(self.gitdir.replace(subprojects, modules))[0]
+        if all(map(platform_utils.isdir, [old, new])):
+            platform_utils.rmtree(old, ignore_errors=True)
+        else:
+            os.makedirs(modules, exist_ok=True)
+            platform_utils.rename(old, new)
+            self.gitdir = new
+            self.UpdatePaths(self.relpath, self.worktree, self.gitdir, self.objdir)
+        if platform_utils.isdir(subprojects) and not os.listdir(subprojects):
+            platform_utils.rmtree(subprojects, ignore_errors=True)
+
     def _get_symlink_error_message(self):
         if platform_utils.isWindows():
             return (
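`_MigrateOldSubmoduleDir` computes the new bare-checkout location by swapping the directory name and letting `os.path.splitext` strip the `.git` suffix that old `subprojects/` checkouts carried. The path below is made up, but the transformation is exactly this:

```python
import os

# Hypothetical old-style bare submodule checkout path.
old = ".repo/projects/app.git/subprojects/third_party/libfoo.git"

# Swap the directory name, then drop the trailing ".git" extension.
new = os.path.splitext(old.replace("subprojects", "modules"))[0]
assert new == ".repo/projects/app.git/modules/third_party/libfoo"
```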


@@ -16,3 +16,8 @@
 line-length = 80
 # NB: Keep in sync with tox.ini.
 target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311'] #, 'py312'
+
+[tool.pytest.ini_options]
+markers = """
+    skip_cq: Skip tests in the CQ. Should be rarely used!
+"""


@@ -16,6 +16,7 @@
 import os
 import re
+import shlex
 import subprocess
 import sys

@@ -35,12 +36,7 @@ KEYID_ECC = "E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39"

 def cmdstr(cmd):
     """Get a nicely quoted shell command."""
-    ret = []
-    for arg in cmd:
-        if not re.match(r"^[a-zA-Z0-9/_.=-]+$", arg):
-            arg = f'"{arg}"'
-        ret.append(arg)
-    return " ".join(ret)
+    return " ".join(shlex.quote(x) for x in cmd)


 def run(opts, cmd, check=True, **kwargs):
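The replaced hand-rolled quoting only wrapped suspicious arguments in double quotes, which breaks on embedded quotes or `$` expansion. `shlex.quote` produces strings that are safe to paste back into a POSIX shell, which is the point of the launcher change:

```python
import shlex


def cmdstr(cmd):
    """Get a nicely quoted shell command."""
    return " ".join(shlex.quote(x) for x in cmd)


# Arguments with spaces or shell metacharacters get single-quoted;
# plain arguments are left alone.
print(cmdstr(["echo", "hello world", "$HOME"]))
# echo 'hello world' '$HOME'
```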

repo

@@ -27,6 +27,7 @@ import platform
 import shlex
 import subprocess
 import sys
+from typing import NamedTuple

 # These should never be newer than the main.py version since this needs to be a

@@ -56,9 +57,14 @@ class Trace:
 trace = Trace()


+def cmdstr(cmd):
+    """Get a nicely quoted shell command."""
+    return " ".join(shlex.quote(x) for x in cmd)
+
+
 def exec_command(cmd):
     """Execute |cmd| or return None on failure."""
-    trace.print(":", " ".join(cmd))
+    trace.print(":", cmdstr(cmd))
     try:
         if platform.system() == "Windows":
             ret = subprocess.call(cmd)

@@ -124,7 +130,7 @@ if not REPO_REV:
 BUG_URL = "https://issues.gerritcodereview.com/issues/new?component=1370071"

 # increment this whenever we make important changes to this script
-VERSION = (2, 45)
+VERSION = (2, 54)

 # increment this if the MAINTAINER_KEYS block is modified
 KEYRING_VERSION = (2, 3)

@@ -215,11 +221,8 @@ repodir = ".repo"  # name of repo's private directory
 S_repo = "repo"  # special repo repository
 S_manifests = "manifests"  # special manifest repository
 REPO_MAIN = S_repo + "/main.py"  # main script
-GITC_CONFIG_FILE = "/gitc/.config"
-GITC_FS_ROOT_DIR = "/gitc/manifest-rw/"

-import collections
 import errno
 import json
 import optparse

@@ -235,11 +238,8 @@ home_dot_repo = os.path.join(repo_config_dir, ".repoconfig")
 gpg_dir = os.path.join(home_dot_repo, "gnupg")


-def GetParser(gitc_init=False):
+def GetParser():
     """Setup the CLI parser."""
-    if gitc_init:
-        sys.exit("repo: fatal: GITC not supported.")
-    else:
-        usage = "repo init [options] [-u] url"
+    usage = "repo init [options] [-u] url"
     parser = optparse.OptionParser(usage=usage)

@@ -282,6 +282,12 @@ def InitParser(parser):
         metavar="REVISION",
         help="manifest branch or revision (use HEAD for default)",
     )
+    group.add_option(
+        "--manifest-upstream-branch",
+        help="when a commit is provided to --manifest-branch, this "
+        "is the name of the git ref in which the commit can be found",
+        metavar="BRANCH",
+    )
     group.add_option(
         "-m",
         "--manifest-name",

@@ -481,16 +487,6 @@ def InitParser(parser):
     return parser


-# This is a poor replacement for subprocess.run until we require Python 3.6+.
-RunResult = collections.namedtuple(
-    "RunResult", ("returncode", "stdout", "stderr")
-)
-
-
-class RunError(Exception):
-    """Error when running a command failed."""
-
-
 def run_command(cmd, **kwargs):
     """Run |cmd| and return its output."""
     check = kwargs.pop("check", False)

@@ -515,7 +511,7 @@ def run_command(cmd, **kwargs):
     # Run & package the results.
     proc = subprocess.Popen(cmd, **kwargs)
     (stdout, stderr) = proc.communicate(input=cmd_input)
-    dbg = ": " + " ".join(cmd)
+    dbg = ": " + cmdstr(cmd)
     if cmd_input is not None:
         dbg += " 0<|"
     if stdout == subprocess.PIPE:

@@ -525,7 +521,9 @@ def run_command(cmd, **kwargs):
     elif stderr == subprocess.STDOUT:
         dbg += " 2>&1"
     trace.print(dbg)
-    ret = RunResult(proc.returncode, decode(stdout), decode(stderr))
+    ret = subprocess.CompletedProcess(
+        cmd, proc.returncode, decode(stdout), decode(stderr)
+    )

     # If things failed, print useful debugging output.
     if check and ret.returncode:

@@ -546,56 +544,13 @@ def run_command(cmd, **kwargs):
         _print_output("stdout", ret.stdout)
         _print_output("stderr", ret.stderr)
-        raise RunError(ret)
+        # This will raise subprocess.CalledProcessError for us.
+        ret.check_returncode()

     return ret
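This swap retires the launcher's homegrown `RunResult`/`RunError` pair in favor of the stdlib equivalents: `subprocess.CompletedProcess` carries the same fields, and its `check_returncode()` raises `subprocess.CalledProcessError` on a nonzero exit, just as `subprocess.run(..., check=True)` would (the command and stderr text below are stand-ins):

```python
import subprocess

ret = subprocess.CompletedProcess(
    args=["git", "fetch"], returncode=128, stdout="", stderr="fatal: oops"
)
try:
    ret.check_returncode()
except subprocess.CalledProcessError as e:
    # The exception exposes the same debugging fields the old RunError carried.
    assert e.returncode == 128
    assert e.cmd == ["git", "fetch"]
    assert e.stderr == "fatal: oops"
```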
-_gitc_manifest_dir = None
-
-
-def get_gitc_manifest_dir():
-    global _gitc_manifest_dir
-    if _gitc_manifest_dir is None:
-        _gitc_manifest_dir = ""
-        try:
-            with open(GITC_CONFIG_FILE) as gitc_config:
-                for line in gitc_config:
-                    match = re.match("gitc_dir=(?P<gitc_manifest_dir>.*)", line)
-                    if match:
-                        _gitc_manifest_dir = match.group("gitc_manifest_dir")
-        except OSError:
-            pass
-    return _gitc_manifest_dir
-
-
-def gitc_parse_clientdir(gitc_fs_path):
-    """Parse a path in the GITC FS and return its client name.
-
-    Args:
-        gitc_fs_path: A subdirectory path within the GITC_FS_ROOT_DIR.
-
-    Returns:
-        The GITC client name.
-    """
-    if gitc_fs_path == GITC_FS_ROOT_DIR:
-        return None
-    if not gitc_fs_path.startswith(GITC_FS_ROOT_DIR):
-        manifest_dir = get_gitc_manifest_dir()
-        if manifest_dir == "":
-            return None
-        if manifest_dir[-1] != "/":
-            manifest_dir += "/"
-        if gitc_fs_path == manifest_dir:
-            return None
-        if not gitc_fs_path.startswith(manifest_dir):
-            return None
-        return gitc_fs_path.split(manifest_dir)[1].split("/")[0]
-    return gitc_fs_path.split(GITC_FS_ROOT_DIR)[1].split("/")[0]
-
-
 class CloneFailure(Exception):
     """Indicate the remote clone of repo itself failed."""

@@ -632,9 +587,9 @@ def check_repo_rev(dst, rev, repo_verify=True, quiet=False):
     return (remote_ref, rev)


-def _Init(args, gitc_init=False):
+def _Init(args):
     """Installs repo by cloning it over the network."""
-    parser = GetParser(gitc_init=gitc_init)
+    parser = GetParser()
     opt, args = parser.parse_args(args)
     if args:
         if not opt.manifest_url:

@@ -714,15 +669,20 @@ def run_git(*args, **kwargs):
             file=sys.stderr,
         )
         sys.exit(1)
-    except RunError:
+    except subprocess.CalledProcessError:
         raise CloneFailure()


-# The git version info broken down into components for easy analysis.
-# Similar to Python's sys.version_info.
-GitVersion = collections.namedtuple(
-    "GitVersion", ("major", "minor", "micro", "full")
-)
+class GitVersion(NamedTuple):
+    """The git version info broken down into components for easy analysis.
+
+    Similar to Python's sys.version_info.
+    """
+
+    major: int
+    minor: int
+    micro: int
+    full: int
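Converting `GitVersion` from a `collections.namedtuple` to a `typing.NamedTuple` keeps it tuple-compatible while moving the description into a docstring and adding per-field annotations. A simplified sketch of how such a tuple gets populated (this is not the launcher's full `ParseGitVersion`, which also runs `git --version` itself; `full` is annotated `str` here because it holds the version string):

```python
import re
from typing import NamedTuple, Optional


class GitVersion(NamedTuple):
    """The git version info broken down into components."""

    major: int
    minor: int
    micro: int
    full: str


def parse_git_version(ver_str: str) -> Optional[GitVersion]:
    """Parse 'git version X.Y.Z...' output; return None if unrecognized."""
    m = re.match(r"git version (\d+)\.(\d+)\.(\d+)(\S*)", ver_str)
    if not m:
        return None
    full = f"{m.group(1)}.{m.group(2)}.{m.group(3)}{m.group(4)}"
    return GitVersion(int(m.group(1)), int(m.group(2)), int(m.group(3)), full)


assert parse_git_version("git version 2.45.1") == GitVersion(2, 45, 1, "2.45.1")
assert parse_git_version("not git") is None
```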
 def ParseGitVersion(ver_str=None):

@@ -888,10 +848,11 @@ def _GetRepoConfig(name):
         return None
     else:
         print(
-            f"repo: error: git {' '.join(cmd)} failed:\n{ret.stderr}",
+            f"repo: error: git {cmdstr(cmd)} failed:\n{ret.stderr}",
             file=sys.stderr,
         )
-        raise RunError()
+        # This will raise subprocess.CalledProcessError for us.
+        ret.check_returncode()


 def _InitHttp():

@@ -1158,7 +1119,7 @@ class _Options:
 def _ExpandAlias(name):
     """Look up user registered aliases."""
     # We don't resolve aliases for existing subcommands. This matches git.
-    if name in {"gitc-init", "help", "init"}:
+    if name in {"help", "init"}:
         return name, []

     alias = _GetRepoConfig(f"alias.{name}")

@@ -1286,10 +1247,6 @@ class Requirements:


 def _Usage():
-    gitc_usage = ""
-    if get_gitc_manifest_dir():
-        gitc_usage = "  gitc-init Initialize a GITC Client.\n"
-
     print(
         """usage: repo COMMAND [ARGS]

@@ -1298,9 +1255,7 @@ repo is not yet installed. Use "repo init" to install it here.

 The most commonly used repo commands are:

   init      Install repo in the current working directory
-"""
-        + gitc_usage
-        + """  help      Display detailed help on a command
+  help      Display detailed help on a command

 For access to the full online help, install repo ("repo init").
 """

@@ -1311,8 +1266,8 @@ For access to the full online help, install repo ("repo init").

 def _Help(args):
     if args:
-        if args[0] in {"init", "gitc-init"}:
-            parser = GetParser(gitc_init=args[0] == "gitc-init")
+        if args[0] in {"init"}:
+            parser = GetParser()
             parser.print_help()
             sys.exit(0)
         else:

@@ -1329,10 +1284,11 @@ def _Help(args):

 def _Version():
     """Show version information."""
+    git_version = ParseGitVersion()
     print("<repo not installed>")
     print(f"repo launcher version {'.'.join(str(x) for x in VERSION)}")
     print(f"  (from {__file__})")
-    print(f"git {ParseGitVersion().full}")
+    print(f"git {git_version.full}" if git_version else "git not installed")
     print(f"Python {sys.version}")

     uname = platform.uname()
     print(f"OS {uname.system} {uname.release} ({uname.version})")

@@ -1365,11 +1321,11 @@ def _RunSelf(wrapper_path):
     my_main = os.path.join(my_dir, "main.py")
     my_git = os.path.join(my_dir, ".git")

-    if os.path.isfile(my_main) and os.path.isdir(my_git):
+    if os.path.isfile(my_main):
         for name in ["git_config.py", "project.py", "subcmds"]:
             if not os.path.exists(os.path.join(my_dir, name)):
                 return None, None
-        return my_main, my_git
+        return my_main, my_git if os.path.isdir(my_git) else None

     return None, None

@@ -1400,23 +1356,11 @@ def main(orig_args):
     # We run this early as we run some git commands ourselves.
     SetGitTrace2ParentSid()

-    repo_main, rel_repo_dir = None, None
-    # Don't use the local repo copy, make sure to switch to the gitc client first.
-    if cmd != "gitc-init":
-        repo_main, rel_repo_dir = _FindRepo()
+    repo_main, rel_repo_dir = _FindRepo()
     wrapper_path = os.path.abspath(__file__)
     my_main, my_git = _RunSelf(wrapper_path)

-    cwd = os.getcwd()
-    if get_gitc_manifest_dir() and cwd.startswith(get_gitc_manifest_dir()):
-        print(
-            "error: repo cannot be used in the GITC local manifest directory."
-            "\nIf you want to work on this GITC client please rerun this "
-            "command from the corresponding client under /gitc/",
-            file=sys.stderr,
-        )
-        sys.exit(1)
-
     if not repo_main:
         # Only expand aliases here since we'll be parsing the CLI ourselves.
         # If we had repo_main, alias expansion would happen in main.py.

@@ -1431,11 +1375,11 @@ def main(orig_args):
             _Version()
         if not cmd:
             _NotInstalled()
-        if cmd == "init" or cmd == "gitc-init":
+        if cmd == "init":
             if my_git:
                 _SetDefaultsTo(my_git)
             try:
-                _Init(args, gitc_init=(cmd == "gitc-init"))
+                _Init(args)
             except CloneFailure:
                 path = os.path.join(repodir, S_repo)
                 print(


@@ -15,16 +15,57 @@
 """Wrapper to run linters and pytest with the right settings."""

+import functools
 import os
 import subprocess
 import sys
-
-import pytest
+from typing import List

 ROOT_DIR = os.path.dirname(os.path.realpath(__file__))


+@functools.lru_cache()
+def is_ci() -> bool:
+    """Whether we're running in our CI system."""
+    return os.getenv("LUCI_CQ") == "yes"
+
+
+def run_pytest(argv: List[str]) -> int:
+    """Returns the exit code from pytest."""
+    if is_ci():
+        argv = ["-m", "not skip_cq"] + argv
+
+    return subprocess.run(
+        [sys.executable, "-m", "pytest"] + argv,
+        check=False,
+        cwd=ROOT_DIR,
+    ).returncode
+
+
+def run_pytest_py38(argv: List[str]) -> int:
+    """Returns the exit code from pytest under Python 3.8."""
+    if is_ci():
+        argv = ["-m", "not skip_cq"] + argv
+
+    try:
+        return subprocess.run(
+            [
+                "vpython3",
+                "-vpython-spec",
+                "run_tests.vpython3.8",
+                "-m",
+                "pytest",
+            ]
+            + argv,
+            check=False,
+            cwd=ROOT_DIR,
+        ).returncode
+    except FileNotFoundError:
+        # Skip if the user doesn't have vpython from depot_tools.
+        return 0
+
+
 def run_black():
     """Returns the exit code from black."""
     # Black by default only matches .py files. We have to list standalone

@@ -38,32 +79,40 @@ def run_black():
     return subprocess.run(
         [sys.executable, "-m", "black", "--check", ROOT_DIR] + extra_programs,
         check=False,
+        cwd=ROOT_DIR,
     ).returncode


 def run_flake8():
     """Returns the exit code from flake8."""
     return subprocess.run(
-        [sys.executable, "-m", "flake8", ROOT_DIR], check=False
+        [sys.executable, "-m", "flake8", ROOT_DIR],
+        check=False,
+        cwd=ROOT_DIR,
     ).returncode


 def run_isort():
     """Returns the exit code from isort."""
     return subprocess.run(
-        [sys.executable, "-m", "isort", "--check", ROOT_DIR], check=False
+        [sys.executable, "-m", "isort", "--check", ROOT_DIR],
+        check=False,
+        cwd=ROOT_DIR,
     ).returncode


 def main(argv):
     """The main entry."""
     checks = (
-        lambda: pytest.main(argv),
+        functools.partial(run_pytest, argv),
+        functools.partial(run_pytest_py38, argv),
         run_black,
         run_flake8,
         run_isort,
     )
-    return 0 if all(not c() for c in checks) else 1
+    # Run all the tests all the time to get full feedback. Don't exit on the
+    # first error as that makes it more difficult to iterate in the CQ.
+    return 1 if sum(c() for c in checks) else 0


 if __name__ == "__main__":


@@ -5,97 +5,92 @@
 # List of available wheels:
 # https://chromium.googlesource.com/infra/infra/+/main/infra/tools/dockerbuild/wheels.md

-python_version: "3.8"
+python_version: "3.11"

 wheel: <
   name: "infra/python/wheels/pytest-py3"
-  version: "version:6.2.2"
+  version: "version:8.3.4"
 >

-# Required by pytest==6.2.2
+# Required by pytest==8.3.4
 wheel: <
   name: "infra/python/wheels/py-py2_py3"
-  version: "version:1.10.0"
+  version: "version:1.11.0"
 >

-# Required by pytest==6.2.2
+# Required by pytest==8.3.4
 wheel: <
   name: "infra/python/wheels/iniconfig-py3"
   version: "version:1.1.1"
 >

-# Required by pytest==6.2.2
+# Required by pytest==8.3.4
 wheel: <
   name: "infra/python/wheels/packaging-py3"
   version: "version:23.0"
 >

-# Required by pytest==6.2.2
+# Required by pytest==8.3.4
 wheel: <
   name: "infra/python/wheels/pluggy-py3"
-  version: "version:0.13.1"
+  version: "version:1.5.0"
 >

-# Required by pytest==6.2.2
+# Required by pytest==8.3.4
 wheel: <
   name: "infra/python/wheels/toml-py3"
   version: "version:0.10.1"
 >

-# Required by pytest==6.2.2
+# Required by pytest==8.3.4
 wheel: <
   name: "infra/python/wheels/pyparsing-py3"
   version: "version:3.0.7"
 >

-# Required by pytest==6.2.2
+# Required by pytest==8.3.4
 wheel: <
   name: "infra/python/wheels/attrs-py2_py3"
   version: "version:21.4.0"
 >

-# Required by packaging==16.8
-wheel: <
-  name: "infra/python/wheels/six-py2_py3"
-  version: "version:1.16.0"
->
-
+# NB: Keep in sync with constraints.txt.
 wheel: <
   name: "infra/python/wheels/black-py3"
-  version: "version:23.1.0"
+  version: "version:25.1.0"
 >

-# Required by black==23.1.0
+# Required by black==25.1.0
 wheel: <
   name: "infra/python/wheels/mypy-extensions-py3"
   version: "version:0.4.3"
 >

-# Required by black==23.1.0
+# Required by black==25.1.0
 wheel: <
   name: "infra/python/wheels/tomli-py3"
   version: "version:2.0.1"
 >

-# Required by black==23.1.0
+# Required by black==25.1.0
 wheel: <
   name: "infra/python/wheels/platformdirs-py3"
   version: "version:2.5.2"
 >

-# Required by black==23.1.0
+# Required by black==25.1.0
 wheel: <
   name: "infra/python/wheels/pathspec-py3"
   version: "version:0.9.0"
 >

-# Required by black==23.1.0
+# Required by black==25.1.0
 wheel: <
   name: "infra/python/wheels/typing-extensions-py3"
   version: "version:4.3.0"
 >

-# Required by black==23.1.0
+# Required by black==25.1.0
 wheel: <
   name: "infra/python/wheels/click-py3"
   version: "version:8.0.3"

run_tests.vpython3.8 Normal file

@@ -0,0 +1,67 @@
+# This is a vpython "spec" file.
+#
+# Read more about `vpython` and how to modify this file here:
+#   https://chromium.googlesource.com/infra/infra/+/main/doc/users/vpython.md
+# List of available wheels:
+#   https://chromium.googlesource.com/infra/infra/+/main/infra/tools/dockerbuild/wheels.md
+
+python_version: "3.8"
+
+wheel: <
+  name: "infra/python/wheels/pytest-py3"
+  version: "version:8.3.4"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/py-py2_py3"
+  version: "version:1.11.0"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/iniconfig-py3"
+  version: "version:1.1.1"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/packaging-py3"
+  version: "version:23.0"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/pluggy-py3"
+  version: "version:1.5.0"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/toml-py3"
+  version: "version:0.10.1"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/tomli-py3"
+  version: "version:2.1.0"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/pyparsing-py3"
+  version: "version:3.0.7"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/attrs-py2_py3"
+  version: "version:21.4.0"
+>
+
+# Required by pytest==8.3.4
+wheel: <
+  name: "infra/python/wheels/exceptiongroup-py3"
+  version: "version:1.1.2"
+>


@@ -70,8 +70,10 @@ It is equivalent to "git branch -D <branchname>".
         else:
             args.insert(0, "'All local branches'")

-    def _ExecuteOne(self, all_branches, nb, project):
+    @classmethod
+    def _ExecuteOne(cls, all_branches, nb, project_idx):
         """Abandon one project."""
+        project = cls.get_parallel_context()["projects"][project_idx]
         if all_branches:
             branches = project.GetBranches()
         else:

@@ -89,7 +91,7 @@ It is equivalent to "git branch -D <branchname>".
             if status is not None:
                 ret[name] = status

-        return (ret, project, errors)
+        return (ret, project_idx, errors)

     def Execute(self, opt, args):
         nb = args[0].split()

@@ -102,7 +104,8 @@ It is equivalent to "git branch -D <branchname>".
         _RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)

         def _ProcessResults(_pool, pm, states):
-            for results, project, errors in states:
+            for results, project_idx, errors in states:
+                project = all_projects[project_idx]
                 for branch, status in results.items():
                     if status:
                         success[branch].append(project)

@@ -111,14 +114,17 @@ It is equivalent to "git branch -D <branchname>".
                     aggregate_errors.extend(errors)
                 pm.update(msg="")

+        with self.ParallelContext():
+            self.get_parallel_context()["projects"] = all_projects
             self.ExecuteInParallel(
                 opt.jobs,
                 functools.partial(self._ExecuteOne, opt.all, nb),
-                all_projects,
+                range(len(all_projects)),
                 callback=_ProcessResults,
                 output=Progress(
                     f"Abandon {nb}", len(all_projects), quiet=opt.quiet
                 ),
+                chunksize=1,
            )


@@ -98,6 +98,22 @@ is shown, then the branch appears in all projects.
 """

     PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

+    @classmethod
+    def _ExpandProjectToBranches(cls, project_idx):
+        """Expands a project into a list of branch names & associated info.
+
+        Args:
+            project_idx: project.Project index
+
+        Returns:
+            List[Tuple[str, git_config.Branch, int]]
+        """
+        branches = []
+        project = cls.get_parallel_context()["projects"][project_idx]
+        for name, b in project.GetBranches().items():
+            branches.append((name, b, project_idx))
+        return branches
+
     def Execute(self, opt, args):
         projects = self.GetProjects(
             args, all_manifests=not opt.this_manifest_only

@@ -107,15 +123,18 @@ is shown, then the branch appears in all projects.
         project_cnt = len(projects)

         def _ProcessResults(_pool, _output, results):
-            for name, b in itertools.chain.from_iterable(results):
+            for name, b, project_idx in itertools.chain.from_iterable(results):
+                b.project = projects[project_idx]
                 if name not in all_branches:
                     all_branches[name] = BranchInfo(name)
                 all_branches[name].add(b)

+        with self.ParallelContext():
+            self.get_parallel_context()["projects"] = projects
             self.ExecuteInParallel(
                 opt.jobs,
-                expand_project_to_branches,
-                projects,
+                self._ExpandProjectToBranches,
+                range(len(projects)),
                 callback=_ProcessResults,
             )

@@ -148,7 +167,10 @@ is shown, then the branch appears in all projects.
         else:
             published = " "

-        hdr("%c%c %-*s" % (current, published, width, name))
+        # A branch name can contain a percent sign, so we need to escape it.
+        # Escape after f-string formatting to properly account for leading
+        # spaces.
+        hdr(f"{current}{published} {name:{width}}".replace("%", "%%"))
         out.write(" |")

         _RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)

@@ -191,19 +213,3 @@ is shown, then the branch appears in all projects.
         else:
             out.write(" in all projects")
         out.nl()
-
-
-def expand_project_to_branches(project):
-    """Expands a project into a list of branch names & associated information.
-
-    Args:
-        project: project.Project
-
-    Returns:
-        List[Tuple[str, git_config.Branch]]
-    """
-    branches = []
-    for name, b in project.GetBranches().items():
-        b.project = project
-        branches.append((name, b))
-    return branches
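The `hdr` change handles branch names containing `%`: the output printer later applies printf-style `%` formatting, so a literal percent must be doubled, and the doubling must happen after the f-string pads the name, or the extra characters would be counted against the field width:

```python
name = "100%-rollout"  # hypothetical branch name containing '%'
width = 16

# Pad first, then escape. Escaping first would lengthen the string
# before the {name:{width}} field width is applied.
line = f"*  {name:{width}}".replace("%", "%%")

# Printf-style formatting later collapses '%%' back to a literal '%'.
assert line % () == f"*  {name:{width}}"
```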


@@ -20,7 +20,6 @@ from command import DEFAULT_LOCAL_JOBS
 from error import GitError
 from error import RepoExitError
 from progress import Progress
-from project import Project
 from repo_logging import RepoLogger

@@ -30,7 +29,7 @@ logger = RepoLogger(__file__)
 class CheckoutBranchResult(NamedTuple):
     # Whether the Project is on the branch (i.e. branch exists and no errors)
     result: bool
-    project: Project
+    project_idx: int
     error: Exception

@@ -62,15 +61,17 @@ The command is equivalent to:
         if not args:
             self.Usage()

-    def _ExecuteOne(self, nb, project):
+    @classmethod
+    def _ExecuteOne(cls, nb, project_idx):
         """Checkout one project."""
         error = None
         result = None
+        project = cls.get_parallel_context()["projects"][project_idx]
         try:
             result = project.CheckoutBranch(nb)
         except GitError as e:
             error = e
-        return CheckoutBranchResult(result, project, error)
+        return CheckoutBranchResult(result, project_idx, error)

     def Execute(self, opt, args):
         nb = args[0]

@@ -83,17 +84,20 @@ The command is equivalent to:
         def _ProcessResults(_pool, pm, results):
             for result in results:
+                project = all_projects[result.project_idx]
                 if result.error is not None:
                     err.append(result.error)
-                    err_projects.append(result.project)
+                    err_projects.append(project)
                 elif result.result:
-                    success.append(result.project)
+                    success.append(project)
                 pm.update(msg="")

+        with self.ParallelContext():
+            self.get_parallel_context()["projects"] = all_projects
             self.ExecuteInParallel(
                 opt.jobs,
                 functools.partial(self._ExecuteOne, nb),
-                all_projects,
+                range(len(all_projects)),
                 callback=_ProcessResults,
                 output=Progress(
                     f"Checkout {nb}", len(all_projects), quiet=opt.quiet
View File

@@ -40,7 +40,8 @@ to the Unix 'patch' command.
             help="paths are relative to the repository root",
         )

-    def _ExecuteOne(self, absolute, local, project):
+    @classmethod
+    def _ExecuteOne(cls, absolute, local, project_idx):
         """Obtains the diff for a specific project.

         Args:

@@ -48,12 +49,13 @@ to the Unix 'patch' command.
             local: a boolean, if True, the path is relative to the local
                 (sub)manifest. If false, the path is relative to the outermost
                 manifest.
-            project: Project to get status of.
+            project_idx: Project index to get status of.

         Returns:
             The status of the project.
         """
         buf = io.StringIO()
+        project = cls.get_parallel_context()["projects"][project_idx]
         ret = project.PrintWorkTreeDiff(absolute, output_redir=buf, local=local)
         return (ret, buf.getvalue())

@@ -71,12 +73,15 @@ to the Unix 'patch' command.
                 ret = 1
             return ret

+        with self.ParallelContext():
+            self.get_parallel_context()["projects"] = all_projects
             return self.ExecuteInParallel(
                 opt.jobs,
                 functools.partial(
                     self._ExecuteOne, opt.absolute, opt.this_manifest_only
                 ),
-                all_projects,
+                range(len(all_projects)),
                 callback=_ProcessResults,
                 ordered=True,
+                chunksize=1,
             )


@@ -233,9 +233,9 @@ synced and their revisions won't be found.
             )
             self.printRevision = self.out.nofmt_printer("revision", fg="yellow")
         else:
-            self.printProject = (
-                self.printAdded
-            ) = self.printRemoved = self.printRevision = self.printText
+            self.printProject = self.printAdded = self.printRemoved = (
+                self.printRevision
+            ) = self.printText

         manifest1 = RepoClient(self.repodir)
         manifest1.Override(args[0], load_local_manifests=False)


@@ -15,7 +15,6 @@
 import errno
 import functools
 import io
-import multiprocessing
 import os
 import re
 import signal
@@ -26,7 +25,6 @@ from color import Coloring
 from command import Command
 from command import DEFAULT_LOCAL_JOBS
 from command import MirrorSafeCommand
-from command import WORKER_BATCH_SIZE
 from error import ManifestInvalidRevisionError
 from repo_logging import RepoLogger
@@ -241,7 +239,6 @@ without iterating through the remaining projects.
            cmd.insert(cmd.index(cn) + 1, "--color")
        mirror = self.manifest.IsMirror
-       rc = 0
        smart_sync_manifest_name = "smart_sync_override.xml"
        smart_sync_manifest_path = os.path.join(
@@ -264,18 +261,10 @@ without iterating through the remaining projects.
        os.environ["REPO_COUNT"] = str(len(projects))
-       try:
-           config = self.manifest.manifestProject.config
-           with multiprocessing.Pool(opt.jobs, InitWorker) as pool:
-               results_it = pool.imap(
-                   functools.partial(
-                       DoWorkWrapper, mirror, opt, cmd, shell, config
-                   ),
-                   enumerate(projects),
-                   chunksize=WORKER_BATCH_SIZE,
-               )
+       def _ProcessResults(_pool, _output, results):
+           rc = 0
            first = True
-           for r, output in results_it:
+           for r, output in results:
                if output:
                    if first:
                        first = False
@@ -290,9 +279,26 @@ without iterating through the remaining projects.
                rc = rc or r
                if r != 0 and opt.abort_on_errors:
                    raise Exception("Aborting due to previous error")
+           return rc
+       try:
+           config = self.manifest.manifestProject.config
+           with self.ParallelContext():
+               self.get_parallel_context()["projects"] = projects
+               rc = self.ExecuteInParallel(
+                   opt.jobs,
+                   functools.partial(
+                       self.DoWorkWrapper, mirror, opt, cmd, shell, config
+                   ),
+                   range(len(projects)),
+                   callback=_ProcessResults,
+                   ordered=True,
+                   initializer=self.InitWorker,
+                   chunksize=1,
+               )
        except (KeyboardInterrupt, WorkerKeyboardInterrupt):
            # Catch KeyboardInterrupt raised inside and outside of workers
-           rc = rc or errno.EINTR
+           rc = errno.EINTR
        except Exception as e:
            # Catch any other exceptions raised
            logger.error(
@@ -300,20 +306,16 @@ without iterating through the remaining projects.
                type(e).__name__,
                e,
            )
-           rc = rc or getattr(e, "errno", 1)
+           rc = getattr(e, "errno", 1)
        if rc != 0:
            sys.exit(rc)
-class WorkerKeyboardInterrupt(Exception):
-   """Keyboard interrupt exception for worker processes."""
-def InitWorker():
+   @classmethod
+   def InitWorker(cls):
        signal.signal(signal.SIGINT, signal.SIG_IGN)
-def DoWorkWrapper(mirror, opt, cmd, shell, config, args):
+   @classmethod
+   def DoWorkWrapper(cls, mirror, opt, cmd, shell, config, project_idx):
    """A wrapper around the DoWork() method.
    Catch the KeyboardInterrupt exceptions here and re-raise them as a
@@ -321,14 +323,18 @@ def DoWorkWrapper(mirror, opt, cmd, shell, config, args):
    with stacktraces and making the parent hang indefinitely.
    """
-   cnt, project = args
+   project = cls.get_parallel_context()["projects"][project_idx]
    try:
-       return DoWork(project, mirror, opt, cmd, shell, cnt, config)
+       return DoWork(project, mirror, opt, cmd, shell, project_idx, config)
    except KeyboardInterrupt:
        print("%s: Worker interrupted" % project.name)
        raise WorkerKeyboardInterrupt()
+class WorkerKeyboardInterrupt(Exception):
+   """Keyboard interrupt exception for worker processes."""
 def DoWork(project, mirror, opt, cmd, shell, cnt, config):
    env = os.environ.copy()
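The `InitWorker` initializer above makes each pool worker ignore SIGINT, so a Ctrl-C is delivered only to the parent, which can then shut the pool down cleanly instead of every child dumping a stack trace. A minimal sketch of that pattern (standalone, not repo's actual class):

```python
import signal


def init_worker():
    # Run inside each pool worker at startup: ignore SIGINT so only the
    # parent process receives KeyboardInterrupt and decides how to exit.
    signal.signal(signal.SIGINT, signal.SIG_IGN)


# Hypothetical usage with a pool:
#   multiprocessing.Pool(jobs, initializer=init_worker)
```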

subcmds/gc.py (new file, 294 lines)

@@ -0,0 +1,294 @@
# Copyright (C) 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from typing import List, Set
from command import Command
from git_command import GitCommand
import platform_utils
from progress import Progress
from project import Project
class Gc(Command):
COMMON = True
helpSummary = "Cleaning up internal repo and Git state."
helpUsage = """
%prog
"""
def _Options(self, p):
p.add_option(
"-n",
"--dry-run",
dest="dryrun",
default=False,
action="store_true",
help="do everything except actually delete",
)
p.add_option(
"-y",
"--yes",
default=False,
action="store_true",
help="answer yes to all safe prompts",
)
p.add_option(
"--repack",
default=False,
action="store_true",
help="repack all projects that use partial clone with "
"filter=blob:none",
)
def _find_git_to_delete(
self, to_keep: Set[str], start_dir: str
) -> Set[str]:
"""Searches no longer needed ".git" directories.
Scans the file system starting from `start_dir` and removes all
directories that end with ".git" that are not in the `to_keep` set.
"""
to_delete = set()
for root, dirs, _ in platform_utils.walk(start_dir):
for directory in dirs:
if not directory.endswith(".git"):
continue
path = os.path.join(root, directory)
if path not in to_keep:
to_delete.add(path)
return to_delete
def delete_unused_projects(self, projects: List[Project], opt):
print(f"Scanning filesystem under {self.repodir}...")
project_paths = set()
project_object_paths = set()
for project in projects:
project_paths.add(project.gitdir)
project_object_paths.add(project.objdir)
to_delete = self._find_git_to_delete(
project_paths, os.path.join(self.repodir, "projects")
)
to_delete.update(
self._find_git_to_delete(
project_object_paths,
os.path.join(self.repodir, "project-objects"),
)
)
if not to_delete:
print("Nothing to clean up.")
return 0
print("Identified the following projects are no longer used:")
print("\n".join(to_delete))
print("")
if not opt.yes:
print(
"If you proceed, any local commits in those projects will be "
"destroyed!"
)
ask = input("Proceed? [y/N] ")
if ask.lower() != "y":
return 1
pm = Progress(
"Deleting",
len(to_delete),
delay=False,
quiet=opt.quiet,
show_elapsed=True,
elide=True,
)
for path in to_delete:
if opt.dryrun:
print(f"\nWould have deleted ${path}")
else:
tmp_path = os.path.join(
os.path.dirname(path),
f"to_be_deleted_{os.path.basename(path)}",
)
platform_utils.rename(path, tmp_path)
platform_utils.rmtree(tmp_path)
pm.update(msg=path)
pm.end()
return 0
def _generate_promisor_files(self, pack_dir: str):
"""Generates promisor files for all pack files in the given directory.
Promisor files are empty files with the same name as the corresponding
pack file but with the ".promisor" extension. They are used by Git.
"""
for root, _, files in platform_utils.walk(pack_dir):
for file in files:
if not file.endswith(".pack"):
continue
with open(os.path.join(root, f"{file[:-4]}promisor"), "w"):
pass
def repack_projects(self, projects: List[Project], opt):
repack_projects = []
# Find all projects eligible for repacking:
# - can't be shared
# - have a specific fetch filter
for project in projects:
if project.config.GetBoolean("extensions.preciousObjects"):
continue
if not project.clone_depth:
continue
if project.manifest.CloneFilterForDepth != "blob:none":
continue
repack_projects.append(project)
if opt.dryrun:
print(f"Would have repacked {len(repack_projects)} projects.")
return 0
pm = Progress(
"Repacking (this will take a while)",
len(repack_projects),
delay=False,
quiet=opt.quiet,
show_elapsed=True,
elide=True,
)
for project in repack_projects:
pm.update(msg=f"{project.name}")
pack_dir = os.path.join(project.gitdir, "tmp_repo_repack")
if os.path.isdir(pack_dir):
platform_utils.rmtree(pack_dir)
os.mkdir(pack_dir)
# Prepare workspace for repacking - remove all unreachable refs and
# their objects.
GitCommand(
project,
["reflog", "expire", "--expire-unreachable=all"],
verify_command=True,
).Wait()
pm.update(msg=f"{project.name} | gc", inc=0)
GitCommand(
project,
["gc"],
verify_command=True,
).Wait()
# Get all objects that are reachable from the remote, and pack them.
pm.update(msg=f"{project.name} | generating list of objects", inc=0)
remote_objects_cmd = GitCommand(
project,
[
"rev-list",
"--objects",
f"--remotes={project.remote.name}",
"--filter=blob:none",
"--tags",
],
capture_stdout=True,
verify_command=True,
)
# Get all local objects and pack them.
local_head_objects_cmd = GitCommand(
project,
["rev-list", "--objects", "HEAD^{tree}"],
capture_stdout=True,
verify_command=True,
)
local_objects_cmd = GitCommand(
project,
[
"rev-list",
"--objects",
"--all",
"--reflog",
"--indexed-objects",
"--not",
f"--remotes={project.remote.name}",
"--tags",
],
capture_stdout=True,
verify_command=True,
)
remote_objects_cmd.Wait()
pm.update(msg=f"{project.name} | remote repack", inc=0)
GitCommand(
project,
["pack-objects", os.path.join(pack_dir, "pack")],
input=remote_objects_cmd.stdout,
capture_stderr=True,
capture_stdout=True,
verify_command=True,
).Wait()
# create promisor file for each pack file
self._generate_promisor_files(pack_dir)
local_head_objects_cmd.Wait()
local_objects_cmd.Wait()
pm.update(msg=f"{project.name} | local repack", inc=0)
GitCommand(
project,
["pack-objects", os.path.join(pack_dir, "pack")],
input=local_head_objects_cmd.stdout + local_objects_cmd.stdout,
capture_stderr=True,
capture_stdout=True,
verify_command=True,
).Wait()
# Swap the old pack directory with the new one.
platform_utils.rename(
os.path.join(project.objdir, "objects", "pack"),
os.path.join(project.objdir, "objects", "pack_old"),
)
platform_utils.rename(
pack_dir,
os.path.join(project.objdir, "objects", "pack"),
)
platform_utils.rmtree(
os.path.join(project.objdir, "objects", "pack_old")
)
pm.end()
return 0
def Execute(self, opt, args):
projects: List[Project] = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
ret = self.delete_unused_projects(projects, opt)
if ret != 0:
return ret
if not opt.repack:
return
return self.repack_projects(projects, opt)
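The `_generate_promisor_files` helper in gc.py above writes an empty `<pack>.promisor` marker next to each `.pack` file so Git treats those packs as coming from a promisor remote (i.e. missing objects can be fetched lazily). A standalone sketch of the same naming rule, using a flat directory listing instead of repo's `platform_utils.walk`:

```python
import os


def generate_promisor_files(pack_dir):
    """For every *.pack in pack_dir, create an empty *.promisor marker."""
    for name in os.listdir(pack_dir):
        if not name.endswith(".pack"):
            continue
        # "pack-abc.pack" -> "pack-abc.promisor"
        marker = name[: -len(".pack")] + ".promisor"
        open(os.path.join(pack_dir, marker), "w").close()
```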

subcmds/grep.py

@@ -23,7 +23,6 @@ from error import GitError
 from error import InvalidArgumentsError
 from error import SilentRepoExitError
 from git_command import GitCommand
-from project import Project
 from repo_logging import RepoLogger
@@ -40,7 +39,7 @@ class GrepColoring(Coloring):
 class ExecuteOneResult(NamedTuple):
    """Result from an execute instance."""
-   project: Project
+   project_idx: int
    rc: int
    stdout: str
    stderr: str
@@ -262,8 +261,10 @@ contain a line that matches both expressions:
            help="Show only file names not containing matching lines",
        )
-   def _ExecuteOne(self, cmd_argv, project):
+   @classmethod
+   def _ExecuteOne(cls, cmd_argv, project_idx):
        """Process one project."""
+       project = cls.get_parallel_context()["projects"][project_idx]
        try:
            p = GitCommand(
                project,
@@ -274,7 +275,7 @@ contain a line that matches both expressions:
                verify_command=True,
            )
        except GitError as e:
-           return ExecuteOneResult(project, -1, None, str(e), e)
+           return ExecuteOneResult(project_idx, -1, None, str(e), e)
        try:
            error = None
@@ -282,10 +283,12 @@ contain a line that matches both expressions:
        except GitError as e:
            rc = 1
            error = e
-       return ExecuteOneResult(project, rc, p.stdout, p.stderr, error)
+       return ExecuteOneResult(project_idx, rc, p.stdout, p.stderr, error)
    @staticmethod
-   def _ProcessResults(full_name, have_rev, opt, _pool, out, results):
+   def _ProcessResults(
+       full_name, have_rev, opt, projects, _pool, out, results
+   ):
        git_failed = False
        bad_rev = False
        have_match = False
@@ -293,9 +296,10 @@ contain a line that matches both expressions:
        errors = []
        for result in results:
+           project = projects[result.project_idx]
            if result.rc < 0:
                git_failed = True
-               out.project("--- project %s ---" % _RelPath(result.project))
+               out.project("--- project %s ---" % _RelPath(project))
                out.nl()
                out.fail("%s", result.stderr)
                out.nl()
@@ -311,9 +315,7 @@ contain a line that matches both expressions:
                ):
                    bad_rev = True
                else:
-                   out.project(
-                       "--- project %s ---" % _RelPath(result.project)
-                   )
+                   out.project("--- project %s ---" % _RelPath(project))
                    out.nl()
                    out.fail("%s", result.stderr.strip())
                    out.nl()
@@ -331,13 +333,13 @@ contain a line that matches both expressions:
                        rev, line = line.split(":", 1)
                        out.write("%s", rev)
                        out.write(":")
-                       out.project(_RelPath(result.project))
+                       out.project(_RelPath(project))
                        out.write("/")
                        out.write("%s", line)
                        out.nl()
                elif full_name:
                    for line in r:
-                       out.project(_RelPath(result.project))
+                       out.project(_RelPath(project))
                        out.write("/")
                        out.write("%s", line)
                        out.nl()
@@ -381,15 +383,18 @@ contain a line that matches both expressions:
            cmd_argv.extend(opt.revision)
        cmd_argv.append("--")
+       with self.ParallelContext():
+           self.get_parallel_context()["projects"] = projects
            git_failed, bad_rev, have_match, errors = self.ExecuteInParallel(
                opt.jobs,
                functools.partial(self._ExecuteOne, cmd_argv),
-               projects,
+               range(len(projects)),
                callback=functools.partial(
-                   self._ProcessResults, full_name, have_rev, opt
+                   self._ProcessResults, full_name, have_rev, opt, projects
                ),
                output=out,
                ordered=True,
+               chunksize=1,
            )
        if git_failed:
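Because workers now return a `project_idx` instead of a Project, the results callback resolves each index against the list that `functools.partial` pre-bound as a leading argument. A small sketch of that resolve-by-index pattern (the `Result` tuple and names here are illustrative):

```python
import functools
from typing import NamedTuple


class Result(NamedTuple):
    # Index into the shared projects list: cheap to pickle across processes.
    project_idx: int
    rc: int


def process_results(projects, results):
    # Resolve each index back to its (illustrative) project name.
    return [(projects[r.project_idx], r.rc) for r in results]


# Pre-binding the list mirrors the callback=functools.partial(...) call above.
handler = functools.partial(process_results, ["repo/a", "repo/b"])
```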

subcmds/init.py

@@ -52,6 +52,10 @@ The optional -b argument can be used to select the manifest branch
 to checkout and use. If no branch is specified, the remote's default
 branch is used. This is equivalent to using -b HEAD.
+The optional --manifest-upstream-branch argument can be used when a commit is
+provided to --manifest-branch (or -b), to specify the name of the git ref in
+which the commit can be found.
 The optional -m argument can be used to specify an alternate manifest
 to be used. If no manifest is specified, the manifest default.xml
 will be used.
@@ -135,6 +139,7 @@ to update the working directory files.
        # manifest project is special and is created when instantiating the
        # manifest which happens before we parse options.
        self.manifest.manifestProject.clone_depth = opt.manifest_depth
+       self.manifest.manifestProject.upstream = opt.manifest_upstream_branch
        clone_filter_for_depth = (
            "blob:none" if (_REPO_ALLOW_SHALLOW == "0") else None
        )
@@ -317,6 +322,12 @@ to update the working directory files.
                " be used with --standalone-manifest."
            )
+       if opt.manifest_upstream_branch and opt.manifest_branch is None:
+           self.OptionParser.error(
+               "--manifest-upstream-branch cannot be used without "
+               "--manifest-branch."
+           )
        if args:
            if opt.manifest_url:
                self.OptionParser.error(

subcmds/prune.py

@@ -27,8 +27,10 @@ class Prune(PagedCommand):
    """
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
-   def _ExecuteOne(self, project):
+   @classmethod
+   def _ExecuteOne(cls, project_idx):
        """Process one project."""
+       project = cls.get_parallel_context()["projects"][project_idx]
        return project.PruneHeads()
    def Execute(self, opt, args):
@@ -41,10 +43,12 @@
        def _ProcessResults(_pool, _output, results):
            return list(itertools.chain.from_iterable(results))
+       with self.ParallelContext():
+           self.get_parallel_context()["projects"] = projects
            all_branches = self.ExecuteInParallel(
                opt.jobs,
                self._ExecuteOne,
-               projects,
+               range(len(projects)),
                callback=_ProcessResults,
                ordered=True,
            )
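The `_ProcessResults` callback above flattens the per-project branch lists with `itertools.chain.from_iterable`, which also silently drops projects that returned no branches. For example:

```python
import itertools

# Each worker returns a list of branches for one project; some are empty.
per_project_branches = [["main", "dev"], [], ["feature-x"]]

# One flat list; empty sublists simply disappear.
all_branches = list(itertools.chain.from_iterable(per_project_branches))
```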

subcmds/start.py

@@ -21,7 +21,6 @@ from error import RepoExitError
 from git_command import git
 from git_config import IsImmutable
 from progress import Progress
-from project import Project
 from repo_logging import RepoLogger
@@ -29,7 +28,7 @@ logger = RepoLogger(__file__)
 class ExecuteOneResult(NamedTuple):
-   project: Project
+   project_idx: int
    error: Exception
@@ -80,18 +79,20 @@ revision specified in the manifest.
        if not git.check_ref_format("heads/%s" % nb):
            self.OptionParser.error("'%s' is not a valid name" % nb)
-   def _ExecuteOne(self, revision, nb, project):
+   @classmethod
+   def _ExecuteOne(cls, revision, nb, default_revisionExpr, project_idx):
        """Start one project."""
        # If the current revision is immutable, such as a SHA1, a tag or
        # a change, then we can't push back to it. Substitute with
        # dest_branch, if defined; or with manifest default revision instead.
        branch_merge = ""
        error = None
+       project = cls.get_parallel_context()["projects"][project_idx]
        if IsImmutable(project.revisionExpr):
            if project.dest_branch:
                branch_merge = project.dest_branch
            else:
-               branch_merge = self.manifest.default.revisionExpr
+               branch_merge = default_revisionExpr
        try:
            project.StartBranch(
@@ -100,7 +101,7 @@
        except Exception as e:
            logger.error("error: unable to checkout %s: %s", project.name, e)
            error = e
-       return ExecuteOneResult(project, error)
+       return ExecuteOneResult(project_idx, error)
    def Execute(self, opt, args):
        nb = args[0]
@@ -120,18 +121,27 @@
        def _ProcessResults(_pool, pm, results):
            for result in results:
                if result.error:
-                   err_projects.append(result.project)
+                   project = all_projects[result.project_idx]
+                   err_projects.append(project)
                    err.append(result.error)
                pm.update(msg="")
+       with self.ParallelContext():
+           self.get_parallel_context()["projects"] = all_projects
            self.ExecuteInParallel(
                opt.jobs,
-               functools.partial(self._ExecuteOne, opt.revision, nb),
-               all_projects,
+               functools.partial(
+                   self._ExecuteOne,
+                   opt.revision,
+                   nb,
+                   self.manifest.default.revisionExpr,
+               ),
+               range(len(all_projects)),
                callback=_ProcessResults,
                output=Progress(
                    f"Starting {nb}", len(all_projects), quiet=opt.quiet
                ),
+               chunksize=1,
            )
        if err_projects:

subcmds/status.py

@@ -88,7 +88,8 @@ the following meanings:
            "projects",
        )
-   def _StatusHelper(self, quiet, local, project):
+   @classmethod
+   def _StatusHelper(cls, quiet, local, project_idx):
        """Obtains the status for a specific project.
        Obtains the status for a project, redirecting the output to
@@ -99,12 +100,13 @@ the following meanings:
            local: a boolean, if True, the path is relative to the local
                (sub)manifest. If false, the path is relative to the outermost
                manifest.
-           project: Project to get status of.
+           project_idx: Project index to get status of.
        Returns:
            The status of the project.
        """
        buf = io.StringIO()
+       project = cls.get_parallel_context()["projects"][project_idx]
        ret = project.PrintWorkTreeStatus(
            quiet=quiet, output_redir=buf, local=local
        )
@@ -143,14 +145,17 @@
                ret += 1
            return ret
+       with self.ParallelContext():
+           self.get_parallel_context()["projects"] = all_projects
            counter = self.ExecuteInParallel(
                opt.jobs,
                functools.partial(
                    self._StatusHelper, opt.quiet, opt.this_manifest_only
                ),
-               all_projects,
+               range(len(all_projects)),
                callback=_ProcessResults,
                ordered=True,
+               chunksize=1,
            )
        if not opt.quiet and len(all_projects) == counter:

subcmds/sync.py

@@ -131,12 +131,17 @@ def _SafeCheckoutOrder(checkouts: List[Project]) -> List[List[Project]]:
    return res
+def _chunksize(projects: int, jobs: int) -> int:
+   """Calculate chunk size for the given number of projects and jobs."""
+   return min(max(1, projects // jobs), WORKER_BATCH_SIZE)
 class _FetchOneResult(NamedTuple):
    """_FetchOne return value.
    Attributes:
        success (bool): True if successful.
-       project (Project): The fetched project.
+       project_idx (int): The fetched project index.
        start (float): The starting time.time().
        finish (float): The ending time.time().
        remote_fetched (bool): True if the remote was actually queried.
@@ -144,7 +149,7 @@ class _FetchOneResult(NamedTuple):
    success: bool
    errors: List[Exception]
-   project: Project
+   project_idx: int
    start: float
    finish: float
    remote_fetched: bool
@@ -177,14 +182,14 @@ class _CheckoutOneResult(NamedTuple):
    Attributes:
        success (bool): True if successful.
-       project (Project): The project.
+       project_idx (int): The project index.
        start (float): The starting time.time().
        finish (float): The ending time.time().
    """
    success: bool
    errors: List[Exception]
-   project: Project
+   project_idx: int
    start: float
    finish: float
@@ -345,6 +350,8 @@ later is required to fix a server side protocol bug.
    # value later on.
    PARALLEL_JOBS = 0
+   _JOBS_WARN_THRESHOLD = 100
    def _Options(self, p, show_smart=True):
        p.add_option(
            "--jobs-network",
@@ -587,7 +594,8 @@
            branch = branch[len(R_HEADS) :]
        return branch
-   def _GetCurrentBranchOnly(self, opt, manifest):
+   @classmethod
+   def _GetCurrentBranchOnly(cls, opt, manifest):
        """Returns whether current-branch or use-superproject options are
        enabled.
@@ -705,7 +713,8 @@
        if need_unload:
            m.outer_client.manifest.Unload()
-   def _FetchProjectList(self, opt, projects):
+   @classmethod
+   def _FetchProjectList(cls, opt, projects):
        """Main function of the fetch worker.
        The projects we're given share the same underlying git object store, so
@@ -717,21 +726,23 @@
            opt: Program options returned from optparse. See _Options().
            projects: Projects to fetch.
        """
-       return [self._FetchOne(opt, x) for x in projects]
+       return [cls._FetchOne(opt, x) for x in projects]
-   def _FetchOne(self, opt, project):
+   @classmethod
+   def _FetchOne(cls, opt, project_idx):
        """Fetch git objects for a single project.
        Args:
            opt: Program options returned from optparse. See _Options().
-           project: Project object for the project to fetch.
+           project_idx: Project index for the project to fetch.
        Returns:
            Whether the fetch was successful.
        """
+       project = cls.get_parallel_context()["projects"][project_idx]
        start = time.time()
        k = f"{project.name} @ {project.relpath}"
-       self._sync_dict[k] = start
+       cls.get_parallel_context()["sync_dict"][k] = start
        success = False
        remote_fetched = False
        errors = []
@@ -741,7 +752,7 @@
                quiet=opt.quiet,
                verbose=opt.verbose,
                output_redir=buf,
-               current_branch_only=self._GetCurrentBranchOnly(
+               current_branch_only=cls._GetCurrentBranchOnly(
                    opt, project.manifest
                ),
                force_sync=opt.force_sync,
@@ -751,7 +762,7 @@
                optimized_fetch=opt.optimized_fetch,
                retry_fetches=opt.retry_fetches,
                prune=opt.prune,
-               ssh_proxy=self.ssh_proxy,
+               ssh_proxy=cls.get_parallel_context()["ssh_proxy"],
                clone_filter=project.manifest.CloneFilter,
                partial_clone_exclude=project.manifest.PartialCloneExclude,
                clone_filter_for_depth=project.manifest.CloneFilterForDepth,
@@ -783,24 +794,20 @@
                type(e).__name__,
                e,
            )
-           del self._sync_dict[k]
            errors.append(e)
            raise
+       finally:
+           del cls.get_parallel_context()["sync_dict"][k]
        finish = time.time()
-       del self._sync_dict[k]
        return _FetchOneResult(
-           success, errors, project, start, finish, remote_fetched
+           success, errors, project_idx, start, finish, remote_fetched
        )
-   @classmethod
-   def _FetchInitChild(cls, ssh_proxy):
-       cls.ssh_proxy = ssh_proxy
    def _GetSyncProgressMessage(self):
        earliest_time = float("inf")
        earliest_proj = None
-       items = self._sync_dict.items()
+       items = self.get_parallel_context()["sync_dict"].items()
        for project, t in items:
            if t < earliest_time:
                earliest_time = t
@@ -808,7 +815,7 @@
        if not earliest_proj:
            # This function is called when sync is still running but in some
-           # cases (by chance), _sync_dict can contain no entries. Return some
+           # cases (by chance), sync_dict can contain no entries. Return some
            # text to indicate that sync is still working.
            return "..working.."
@@ -816,10 +823,19 @@
        jobs = jobs_str(len(items))
        return f"{jobs} | {elapsed_str(elapsed)} {earliest_proj}"
+   @classmethod
+   def InitWorker(cls):
+       # Force connect to the manager server now.
+       # This is good because workers are initialized one by one. Without this,
+       # multiple workers may connect to the manager when handling the first
+       # job at the same time. Then the connection may fail if too many
+       # connections are pending and exceeded the socket listening backlog,
+       # especially on MacOS.
+       len(cls.get_parallel_context()["sync_dict"])
    def _Fetch(self, projects, opt, err_event, ssh_proxy, errors):
        ret = True
-       jobs = opt.jobs_network
        fetched = set()
        remote_fetched = set()
        pm = Progress(
@@ -831,7 +847,6 @@
            elide=True,
        )
-       self._sync_dict = multiprocessing.Manager().dict()
        sync_event = _threading.Event()
        def _MonitorSyncLoop():
@@ -842,19 +857,13 @@
        sync_progress_thread = _threading.Thread(target=_MonitorSyncLoop)
        sync_progress_thread.daemon = True
-       sync_progress_thread.start()
-       objdir_project_map = dict()
-       for project in projects:
-           objdir_project_map.setdefault(project.objdir, []).append(project)
-       projects_list = list(objdir_project_map.values())
-       def _ProcessResults(results_sets):
+       def _ProcessResults(pool, pm, results_sets):
            ret = True
            for results in results_sets:
                for result in results:
                    success = result.success
-                   project = result.project
+                   project = projects[result.project_idx]
                    start = result.start
                    finish = result.finish
                    self._fetch_times.Set(project, finish - start)
@@ -878,58 +887,50 @@
                    fetched.add(project.gitdir)
                    pm.update()
                if not ret and opt.fail_fast:
+                   if pool:
+                       pool.close()
                    break
            return ret
-       # We pass the ssh proxy settings via the class. This allows
-       # multiprocessing to pickle it up when spawning children. We can't pass
-       # it as an argument to _FetchProjectList below as multiprocessing is
-       # unable to pickle those.
-       Sync.ssh_proxy = None
-       # NB: Multiprocessing is heavy, so don't spin it up for one job.
-       if len(projects_list) == 1 or jobs == 1:
-           self._FetchInitChild(ssh_proxy)
-           if not _ProcessResults(
-               self._FetchProjectList(opt, x) for x in projects_list
-           ):
-               ret = False
-       else:
-           # Favor throughput over responsiveness when quiet. It seems that
-           # imap() will yield results in batches relative to chunksize, so
-           # even as the children finish a sync, we won't see the result until
-           # one child finishes ~chunksize jobs. When using a large --jobs
-           # with large chunksize, this can be jarring as there will be a large
-           # initial delay where repo looks like it isn't doing anything and
-           # sits at 0%, but then suddenly completes a lot of jobs all at once.
-           # Since this code is more network bound, we can accept a bit more
+       with self.ParallelContext():
+           self.get_parallel_context()["projects"] = projects
+           self.get_parallel_context()[
+               "sync_dict"
+           ] = multiprocessing.Manager().dict()
+           objdir_project_map = dict()
+           for index, project in enumerate(projects):
+               objdir_project_map.setdefault(project.objdir, []).append(index)
+           projects_list = list(objdir_project_map.values())
+           jobs = max(1, min(opt.jobs_network, len(projects_list)))
+           # We pass the ssh proxy settings via the class. This allows
+           # multiprocessing to pickle it up when spawning children. We can't
+           # pass it as an argument to _FetchProjectList below as
+           # multiprocessing is unable to pickle those.
+           self.get_parallel_context()["ssh_proxy"] = ssh_proxy
+           sync_progress_thread.start()
+           if not opt.quiet:
# CPU overhead with a smaller chunksize so that the user sees more
# immediate & continuous feedback.
if opt.quiet:
chunksize = WORKER_BATCH_SIZE
else:
pm.update(inc=0, msg="warming up") pm.update(inc=0, msg="warming up")
chunksize = 4 try:
with multiprocessing.Pool( ret = self.ExecuteInParallel(
jobs, initializer=self._FetchInitChild, initargs=(ssh_proxy,) jobs,
) as pool:
results = pool.imap_unordered(
functools.partial(self._FetchProjectList, opt), functools.partial(self._FetchProjectList, opt),
projects_list, projects_list,
chunksize=chunksize, callback=_ProcessResults,
output=pm,
# Use chunksize=1 to avoid the chance that some workers are
# idle while other workers still have more than one job in
# their chunk queue.
chunksize=1,
initializer=self.InitWorker,
) )
if not _ProcessResults(results): finally:
ret = False
pool.close()
# Cleanup the reference now that we're done with it, and we're going to
# release any resources it points to. If we don't, later
# multiprocessing usage (e.g. checkouts) will try to pickle and then
# crash.
del Sync.ssh_proxy
sync_event.set() sync_event.set()
pm.end() sync_progress_thread.join()
self._fetch_times.Save() self._fetch_times.Save()
self._local_sync_state.Save() self._local_sync_state.Save()
@ -970,6 +971,8 @@ later is required to fix a server side protocol bug.
if not success: if not success:
err_event.set() err_event.set()
# Call self update, unless requested not to
if os.environ.get("REPO_SKIP_SELF_UPDATE", "0") == "0":
_PostRepoFetch(rp, opt.repo_verify) _PostRepoFetch(rp, opt.repo_verify)
if opt.network_only: if opt.network_only:
# Bail out now; the rest touches the working tree. # Bail out now; the rest touches the working tree.
@ -1015,14 +1018,15 @@ later is required to fix a server side protocol bug.
return _FetchMainResult(all_projects) return _FetchMainResult(all_projects)
@classmethod
def _CheckoutOne( def _CheckoutOne(
self, cls,
detach_head, detach_head,
force_sync, force_sync,
force_checkout, force_checkout,
force_rebase, force_rebase,
verbose, verbose,
project, project_idx,
): ):
"""Checkout work tree for one project """Checkout work tree for one project
@ -1034,11 +1038,12 @@ later is required to fix a server side protocol bug.
force_checkout: Force checking out of the repo content. force_checkout: Force checking out of the repo content.
force_rebase: Force rebase. force_rebase: Force rebase.
verbose: Whether to show verbose messages. verbose: Whether to show verbose messages.
project: Project object for the project to checkout. project_idx: Project index for the project to checkout.
Returns: Returns:
Whether the fetch was successful. Whether the fetch was successful.
""" """
project = cls.get_parallel_context()["projects"][project_idx]
start = time.time() start = time.time()
syncbuf = SyncBuffer( syncbuf = SyncBuffer(
project.manifest.manifestProject.config, detach_head=detach_head project.manifest.manifestProject.config, detach_head=detach_head
@ -1055,6 +1060,8 @@ later is required to fix a server side protocol bug.
verbose=verbose, verbose=verbose,
) )
success = syncbuf.Finish() success = syncbuf.Finish()
except KeyboardInterrupt:
logger.error("Keyboard interrupt while processing %s", project.name)
except GitError as e: except GitError as e:
logger.error( logger.error(
"error.GitError: Cannot checkout %s: %s", project.name, e "error.GitError: Cannot checkout %s: %s", project.name, e
@ -1072,7 +1079,7 @@ later is required to fix a server side protocol bug.
if not success: if not success:
logger.error("error: Cannot checkout %s", project.name) logger.error("error: Cannot checkout %s", project.name)
finish = time.time() finish = time.time()
return _CheckoutOneResult(success, errors, project, start, finish) return _CheckoutOneResult(success, errors, project_idx, start, finish)
def _Checkout(self, all_projects, opt, err_results, checkout_errors): def _Checkout(self, all_projects, opt, err_results, checkout_errors):
"""Checkout projects listed in all_projects """Checkout projects listed in all_projects
@ -1090,7 +1097,9 @@ later is required to fix a server side protocol bug.
ret = True ret = True
for result in results: for result in results:
success = result.success success = result.success
project = result.project project = self.get_parallel_context()["projects"][
result.project_idx
]
start = result.start start = result.start
finish = result.finish finish = result.finish
self.event_log.AddSync( self.event_log.AddSync(
@ -1117,6 +1126,8 @@ later is required to fix a server side protocol bug.
return ret return ret
for projects in _SafeCheckoutOrder(all_projects): for projects in _SafeCheckoutOrder(all_projects):
with self.ParallelContext():
self.get_parallel_context()["projects"] = projects
proc_res = self.ExecuteInParallel( proc_res = self.ExecuteInParallel(
opt.jobs_checkout, opt.jobs_checkout,
functools.partial( functools.partial(
@ -1127,11 +1138,15 @@ later is required to fix a server side protocol bug.
opt.rebase, opt.rebase,
opt.verbose, opt.verbose,
), ),
projects, range(len(projects)),
callback=_ProcessResults, callback=_ProcessResults,
output=Progress( output=Progress(
"Checking out", len(all_projects), quiet=opt.quiet "Checking out", len(all_projects), quiet=opt.quiet
), ),
# Use chunksize=1 to avoid the chance that some workers are
# idle while other workers still have more than one job in
# their chunk queue.
chunksize=1,
) )
self._local_sync_state.Save() self._local_sync_state.Save()
@ -1431,7 +1446,10 @@ later is required to fix a server side protocol bug.
for need_remove_file in need_remove_files: for need_remove_file in need_remove_files:
# Try to remove the updated copyfile or linkfile. # Try to remove the updated copyfile or linkfile.
# So, if the file is not exist, nothing need to do. # So, if the file is not exist, nothing need to do.
platform_utils.remove(need_remove_file, missing_ok=True) platform_utils.remove(
os.path.join(self.client.topdir, need_remove_file),
missing_ok=True,
)
# Create copy-link-files.json, save dest path of "copyfile" and # Create copy-link-files.json, save dest path of "copyfile" and
# "linkfile". # "linkfile".
@ -1486,6 +1504,7 @@ later is required to fix a server side protocol bug.
if manifest_server.startswith("persistent-"): if manifest_server.startswith("persistent-"):
manifest_server = manifest_server[len("persistent-") :] manifest_server = manifest_server[len("persistent-") :]
# Changes in behavior should update docs/smart-sync.md accordingly.
try: try:
server = xmlrpc.client.Server(manifest_server, transport=transport) server = xmlrpc.client.Server(manifest_server, transport=transport)
if opt.smart_sync: if opt.smart_sync:
@ -1711,6 +1730,24 @@ later is required to fix a server side protocol bug.
opt.jobs_network = min(opt.jobs_network, jobs_soft_limit) opt.jobs_network = min(opt.jobs_network, jobs_soft_limit)
opt.jobs_checkout = min(opt.jobs_checkout, jobs_soft_limit) opt.jobs_checkout = min(opt.jobs_checkout, jobs_soft_limit)
# Warn once if effective job counts seem excessively high.
# Prioritize --jobs, then --jobs-network, then --jobs-checkout.
job_options_to_check = (
("--jobs", opt.jobs),
("--jobs-network", opt.jobs_network),
("--jobs-checkout", opt.jobs_checkout),
)
for name, value in job_options_to_check:
if value > self._JOBS_WARN_THRESHOLD:
logger.warning(
"High job count (%d > %d) specified for %s; this may "
"lead to excessive resource usage or diminishing returns.",
value,
self._JOBS_WARN_THRESHOLD,
name,
)
break
def Execute(self, opt, args): def Execute(self, opt, args):
errors = [] errors = []
try: try:
@ -1982,6 +2019,8 @@ def _PostRepoFetch(rp, repo_verify=True, verbose=False):
# We also have to make sure this will switch to an older commit if # We also have to make sure this will switch to an older commit if
# that's the latest tag in order to support release rollback. # that's the latest tag in order to support release rollback.
try: try:
# Refresh index since reset --keep won't do it.
rp.work_git.update_index("-q", "--refresh")
rp.work_git.reset("--keep", new_rev) rp.work_git.reset("--keep", new_rev)
except GitError as e: except GitError as e:
raise RepoUnhandledExceptionError(e) raise RepoUnhandledExceptionError(e)
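The refactor above stops pickling heavyweight Project objects (and the ssh proxy) into each worker: shared state is installed once per worker, and each task carries only a small integer index. A minimal sketch of that pattern, with hypothetical names standing in for repo's parallel-context API:

```python
import multiprocessing

# Per-process state; filled in by the pool initializer in each worker.
_CONTEXT = {}


def _init_worker(projects):
    # Runs once per worker, so tasks only need to carry an index.
    _CONTEXT["projects"] = projects


def _fetch_one(idx):
    # Look up the full object locally instead of unpickling it per task.
    project = _CONTEXT["projects"][idx]
    return idx, f"fetched {project}"


def fetch_all(projects, jobs=2):
    with multiprocessing.Pool(
        jobs, initializer=_init_worker, initargs=(projects,)
    ) as pool:
        # chunksize=1 avoids one worker hoarding queued jobs while
        # others go idle near the end of the run.
        return dict(
            pool.imap_unordered(_fetch_one, range(len(projects)), chunksize=1)
        )
```

This keeps the task payload trivially picklable, which matters because multiprocessing cannot pickle objects like live ssh proxy handles.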

View File

@@ -603,19 +603,22 @@ Gerrit Code Review: https://www.gerritcodereview.com/
                 full_dest = destination
                 if not full_dest.startswith(R_HEADS):
                     full_dest = R_HEADS + full_dest

+                full_revision = branch.project.revisionExpr
+                if not full_revision.startswith(R_HEADS):
+                    full_revision = R_HEADS + full_revision
+
                 # If the merge branch of the local branch is different from
                 # the project's revision AND destination, this might not be
                 # intentional.
                 if (
                     merge_branch
-                    and merge_branch != branch.project.revisionExpr
+                    and merge_branch != full_revision
                     and merge_branch != full_dest
                 ):
                     print(
                         f"For local branch {branch.name}: merge branch "
                         f"{merge_branch} does not match destination branch "
-                        f"{destination}"
+                        f"{destination} and revision {branch.project.revisionExpr}"
                     )
                     print("skipping upload.")
                     print(
@@ -713,16 +716,17 @@ Gerrit Code Review: https://www.gerritcodereview.com/
         merge_branch = p.stdout.strip()
         return merge_branch

-    @staticmethod
-    def _GatherOne(opt, project):
+    @classmethod
+    def _GatherOne(cls, opt, project_idx):
         """Figure out the upload status for |project|."""
+        project = cls.get_parallel_context()["projects"][project_idx]
         if opt.current_branch:
             cbr = project.CurrentBranch
             up_branch = project.GetUploadableBranch(cbr)
             avail = [up_branch] if up_branch else None
         else:
             avail = project.GetUploadableBranches(opt.branch)
-        return (project, avail)
+        return (project_idx, avail)

     def Execute(self, opt, args):
         projects = self.GetProjects(
@@ -732,7 +736,8 @@ Gerrit Code Review: https://www.gerritcodereview.com/
         def _ProcessResults(_pool, _out, results):
             pending = []
             for result in results:
-                project, avail = result
+                project_idx, avail = result
+                project = projects[project_idx]
                 if avail is None:
                     logger.error(
                         'repo: error: %s: Unable to upload branch "%s". '
@@ -743,13 +748,15 @@ Gerrit Code Review: https://www.gerritcodereview.com/
                         project.manifest.branch,
                     )
                 elif avail:
-                    pending.append(result)
+                    pending.append((project, avail))
             return pending

+        with self.ParallelContext():
+            self.get_parallel_context()["projects"] = projects
             pending = self.ExecuteInParallel(
                 opt.jobs,
                 functools.partial(self._GatherOne, opt),
-                projects,
+                range(len(projects)),
                 callback=_ProcessResults,
             )
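The upload fix normalizes the project revision the same way the destination is normalized before comparing, so a short name like "main" and its fully qualified "refs/heads/main" form compare equal. The comparison boils down to this (a simplified sketch, not the exact repo code):

```python
R_HEADS = "refs/heads/"


def fully_qualify(ref):
    """Expand a short branch name to its refs/heads/ form."""
    return ref if ref.startswith(R_HEADS) else R_HEADS + ref


def merge_branch_mismatch(merge_branch, revision, destination):
    """True when the configured merge branch matches neither the
    project revision nor the upload destination, after both have
    been fully qualified."""
    return bool(merge_branch) and merge_branch not in (
        fully_qualify(revision),
        fully_qualify(destination),
    )
```

Without the normalization, a manifest revision of "main" would spuriously mismatch a merge branch of "refs/heads/main" and trigger the "skipping upload" warning.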

View File

@@ -1 +0,0 @@
-gitc_dir=/test/usr/local/google/gitc

View File

@@ -21,6 +21,8 @@ import subprocess
 import unittest
 from unittest import mock

+import pytest
+
 import git_command
 import wrapper

@@ -263,6 +265,7 @@ class UserAgentUnitTest(unittest.TestCase):
         m = re.match(r"^[^ ]+$", os_name)
         self.assertIsNotNone(m)

+    @pytest.mark.skip_cq("TODO(b/266734831): Find out why this fails in CQ")
     def test_smoke_repo(self):
         """Make sure repo UA returns something useful."""
         ua = git_command.user_agent.repo
@@ -271,6 +274,7 @@ class UserAgentUnitTest(unittest.TestCase):
         m = re.match(r"^git-repo/[^ ]+ ([^ ]+) git/[^ ]+ Python/[0-9.]+", ua)
         self.assertIsNotNone(m)

+    @pytest.mark.skip_cq("TODO(b/266734831): Find out why this fails in CQ")
     def test_smoke_git(self):
         """Make sure git UA returns something useful."""
         ua = git_command.user_agent.git

View File

@@ -21,6 +21,7 @@ import tempfile
 import unittest
 from unittest import mock

+import pytest
 from test_manifest_xml import sort_attributes

 import git_superproject
@@ -145,6 +146,7 @@ class SuperprojectTestCase(unittest.TestCase):
         )
         self.assertIsNone(manifest.superproject)

+    @pytest.mark.skip_cq("TODO(b/266734831): Find out why this takes 8m+ in CQ")
     def test_superproject_get_superproject_invalid_url(self):
         """Test with an invalid url."""
         manifest = self.getXmlManifest(
@@ -168,6 +170,7 @@ class SuperprojectTestCase(unittest.TestCase):
         self.assertFalse(sync_result.success)
         self.assertTrue(sync_result.fatal)

+    @pytest.mark.skip_cq("TODO(b/266734831): Find out why this takes 8m+ in CQ")
     def test_superproject_get_superproject_invalid_branch(self):
         """Test with an invalid branch."""
         manifest = self.getXmlManifest(

View File

@@ -150,7 +150,7 @@ class EventLogTestCase(unittest.TestCase):
         <version event>
         <start event>
         """
-        self._event_log_module.StartEvent()
+        self._event_log_module.StartEvent([])
         with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
             log_path = self._event_log_module.Write(path=tempdir)
             self._log_data = self.readLog(log_path)
@@ -213,10 +213,8 @@ class EventLogTestCase(unittest.TestCase):
         <version event>
         <command event>
         """
-        name = "repo"
-        subcommands = ["init" "this"]
         self._event_log_module.CommandEvent(
-            name="repo", subcommands=subcommands
+            name="repo", subcommands=["init", "this"]
         )
         with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
             log_path = self._event_log_module.Write(path=tempdir)
@@ -225,12 +223,10 @@ class EventLogTestCase(unittest.TestCase):
         self.assertEqual(len(self._log_data), 2)
         command_event = self._log_data[1]
         self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
-        self.verifyCommonKeys(command_event, expected_event_name="command")
+        self.verifyCommonKeys(command_event, expected_event_name="cmd_name")
         # Check for 'command' event specific fields.
         self.assertIn("name", command_event)
-        self.assertIn("subcommands", command_event)
-        self.assertEqual(command_event["name"], name)
-        self.assertEqual(command_event["subcommands"], subcommands)
+        self.assertEqual(command_event["name"], "repo-init-this")

     def test_def_params_event_repo_config(self):
         """Test 'def_params' event data outputs only repo config keys.
@@ -382,17 +378,17 @@ class EventLogTestCase(unittest.TestCase):
             socket_path = os.path.join(tempdir, "server.sock")
             server_ready = threading.Condition()
             # Start "server" listening on Unix domain socket at socket_path.
-            try:
-                server_thread = threading.Thread(
-                    target=serverLoggingThread,
-                    args=(socket_path, server_ready, received_traces),
-                )
+            server_thread = threading.Thread(
+                target=serverLoggingThread,
+                args=(socket_path, server_ready, received_traces),
+            )
+            try:
                 server_thread.start()

                 with server_ready:
                     server_ready.wait(timeout=120)

-                self._event_log_module.StartEvent()
+                self._event_log_module.StartEvent([])
                 path = self._event_log_module.Write(
                     path=f"af_unix:{socket_path}"
                 )
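One of the event-log test fixes above removes a subtle bug: `["init" "this"]` is a one-element list, because Python joins adjacent string literals at compile time. A missing comma silently merges two intended elements:

```python
# Adjacent string literals concatenate, so a missing comma in a list
# turns two intended elements into one merged string.
buggy = ["init" "this"]   # one element: "initthis"
fixed = ["init", "this"]  # two elements

assert buggy == ["initthis"]
assert fixed == ["init", "this"]
```

The rewritten test passes the list inline as `subcommands=["init", "this"]`, which makes the typo impossible to reintroduce unnoticed.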

View File

@@ -51,7 +51,7 @@ INVALID_FS_PATHS = (
     "foo~",
     "blah/foo~",
     # Block Unicode characters that get normalized out by filesystems.
-    "foo\u200Cbar",
+    "foo\u200cbar",
     # Block newlines.
     "f\n/bar",
     "f\r/bar",
@@ -1049,6 +1049,91 @@ class RemoveProjectElementTests(ManifestParseTestCase):
         self.assertTrue(found_proj1_path1)
         self.assertTrue(found_proj2)

+    def test_base_revision_checks_on_patching(self):
+        manifest_fail_wrong_tag = self.getXmlManifest(
+            """
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="tag.002" />
+  <project name="project1" path="tests/path1" />
+  <extend-project name="project1" revision="new_hash" base-rev="tag.001" />
+</manifest>
+"""
+        )
+        with self.assertRaises(error.ManifestParseError):
+            manifest_fail_wrong_tag.ToXml()
+
+        manifest_fail_remove = self.getXmlManifest(
+            """
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="project1" path="tests/path1" revision="hash1" />
+  <remove-project name="project1" base-rev="wrong_hash" />
+</manifest>
+"""
+        )
+        with self.assertRaises(error.ManifestParseError):
+            manifest_fail_remove.ToXml()
+
+        manifest_fail_extend = self.getXmlManifest(
+            """
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="project1" path="tests/path1" revision="hash1" />
+  <extend-project name="project1" revision="new_hash" base-rev="wrong_hash" />
+</manifest>
+"""
+        )
+        with self.assertRaises(error.ManifestParseError):
+            manifest_fail_extend.ToXml()
+
+        manifest_fail_unknown = self.getXmlManifest(
+            """
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="project1" path="tests/path1" />
+  <extend-project name="project1" revision="new_hash" base-rev="any_hash" />
+</manifest>
+"""
+        )
+        with self.assertRaises(error.ManifestParseError):
+            manifest_fail_unknown.ToXml()
+
+        manifest_ok = self.getXmlManifest(
+            """
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="project1" path="tests/path1" revision="hash1" />
+  <project name="project2" path="tests/path2" revision="hash2" />
+  <project name="project3" path="tests/path3" revision="hash3" />
+  <project name="project4" path="tests/path4" revision="hash4" />
+  <remove-project name="project1" />
+  <remove-project name="project2" base-rev="hash2" />
+  <project name="project2" path="tests/path2" revision="new_hash2" />
+  <extend-project name="project3" base-rev="hash3" revision="new_hash3" />
+  <extend-project name="project3" base-rev="new_hash3" revision="newer_hash3" />
+  <remove-project path="tests/path4" base-rev="hash4" />
+</manifest>
+"""
+        )
+        found_proj2 = False
+        found_proj3 = False
+        for proj in manifest_ok.projects:
+            if proj.name == "project2":
+                found_proj2 = True
+            if proj.name == "project3":
+                found_proj3 = True
+            self.assertNotEqual(proj.name, "project1")
+            self.assertNotEqual(proj.name, "project4")
+        self.assertTrue(found_proj2)
+        self.assertTrue(found_proj3)
+        self.assertTrue(len(manifest_ok.projects) == 2)
+

 class ExtendProjectElementTests(ManifestParseTestCase):
     """Tests for <extend-project>."""
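The tests above exercise `base-rev` validation: an `<extend-project>` or `<remove-project>` must name the revision it expects the project to currently have, and parsing fails on a mismatch. A hedged sketch of the core check (simplified: it tracks revisions by name only and ignores `<default>` fallbacks and path-based matching, unlike the real parser):

```python
import xml.etree.ElementTree as ET


class ManifestParseError(Exception):
    pass


def check_base_rev(manifest_xml):
    """Reject extend/remove elements whose base-rev disagrees with the
    revision currently recorded for the named project."""
    root = ET.fromstring(manifest_xml)
    revisions = {}
    for node in root:
        if node.tag == "project":
            revisions[node.get("name")] = node.get("revision")
        elif node.tag in ("extend-project", "remove-project"):
            name = node.get("name")
            base = node.get("base-rev")
            if base is not None and revisions.get(name) != base:
                raise ManifestParseError(
                    f"base-rev {base!r} does not match "
                    f"{revisions.get(name)!r} for {name}"
                )
            if node.tag == "extend-project":
                revisions[name] = node.get("revision")
            else:
                revisions.pop(name, None)
```

The guard catches manifest drift: if upstream moves a project's revision, a patch manifest pinned to the old `base-rev` fails loudly instead of silently applying on top of the wrong base.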

View File

@@ -0,0 +1,156 @@
+# Copyright (C) 2024 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the forall subcmd."""
+
+from io import StringIO
+import os
+from shutil import rmtree
+import subprocess
+import tempfile
+import unittest
+from unittest import mock
+
+import git_command
+import manifest_xml
+import project
+import subcmds
+
+
+class AllCommands(unittest.TestCase):
+    """Check registered all_commands."""
+
+    def setUp(self):
+        """Common setup."""
+        self.tempdirobj = tempfile.TemporaryDirectory(prefix="forall_tests")
+        self.tempdir = self.tempdirobj.name
+        self.repodir = os.path.join(self.tempdir, ".repo")
+        self.manifest_dir = os.path.join(self.repodir, "manifests")
+        self.manifest_file = os.path.join(
+            self.repodir, manifest_xml.MANIFEST_FILE_NAME
+        )
+        self.local_manifest_dir = os.path.join(
+            self.repodir, manifest_xml.LOCAL_MANIFESTS_DIR_NAME
+        )
+        os.mkdir(self.repodir)
+        os.mkdir(self.manifest_dir)
+
+    def tearDown(self):
+        """Common teardown."""
+        rmtree(self.tempdir, ignore_errors=True)
+
+    def initTempGitTree(self, git_dir):
+        """Create a new empty git checkout for testing."""
+        # Tests assume that main is the default branch at init, which is
+        # not supported in config until git 2.28.
+        cmd = ["git", "init", "-q"]
+        if git_command.git_require((2, 28, 0)):
+            cmd += ["--initial-branch=main"]
+        else:
+            # Use a template dir for init.
+            templatedir = os.path.join(self.tempdirobj.name, ".test-template")
+            os.makedirs(templatedir)
+            with open(os.path.join(templatedir, "HEAD"), "w") as fp:
+                fp.write("ref: refs/heads/main\n")
+            cmd += ["--template", templatedir]
+        cmd += [git_dir]
+        subprocess.check_call(cmd)
+
+    def getXmlManifestWith8Projects(self):
+        """Create and return a setup of 8 projects with enough dummy
+        files and setup to execute forall."""
+        # Set up a manifest git dir for parsing to work.
+        gitdir = os.path.join(self.repodir, "manifests.git")
+        os.mkdir(gitdir)
+        with open(os.path.join(gitdir, "config"), "w") as fp:
+            fp.write(
+                """[remote "origin"]
+url = https://localhost:0/manifest
+verbose = false
+"""
+            )
+
+        # Add the manifest data.
+        manifest_data = """
+<manifest>
+  <remote name="origin" fetch="http://localhost" />
+  <default remote="origin" revision="refs/heads/main" />
+  <project name="project1" path="tests/path1" />
+  <project name="project2" path="tests/path2" />
+  <project name="project3" path="tests/path3" />
+  <project name="project4" path="tests/path4" />
+  <project name="project5" path="tests/path5" />
+  <project name="project6" path="tests/path6" />
+  <project name="project7" path="tests/path7" />
+  <project name="project8" path="tests/path8" />
+</manifest>
+"""
+        with open(self.manifest_file, "w", encoding="utf-8") as fp:
+            fp.write(manifest_data)
+
+        # Set up 8 empty projects to match the manifest.
+        for x in range(1, 9):
+            os.makedirs(
+                os.path.join(
+                    self.repodir, "projects/tests/path" + str(x) + ".git"
+                )
+            )
+            os.makedirs(
+                os.path.join(
+                    self.repodir, "project-objects/project" + str(x) + ".git"
+                )
+            )
+            git_path = os.path.join(self.tempdir, "tests/path" + str(x))
+            self.initTempGitTree(git_path)
+
+        return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
+
+    # Use mock to capture stdout from the forall run.
+    @unittest.mock.patch("sys.stdout", new_callable=StringIO)
+    def test_forall_all_projects_called_once(self, mock_stdout):
+        """Test that all projects get a command run once each."""
+        manifest_with_8_projects = self.getXmlManifestWith8Projects()
+
+        cmd = subcmds.forall.Forall()
+        cmd.manifest = manifest_with_8_projects
+
+        # Use echoing the project name as the forall command under test.
+        opts, args = cmd.OptionParser.parse_args(["-c", "echo $REPO_PROJECT"])
+        opts.verbose = False
+
+        # Mock so Execute does not fail on the remote check.
+        with mock.patch.object(
+            project.Project, "GetRevisionId", return_value="refs/heads/main"
+        ):
+            # Run the forall command.
+            cmd.Execute(opts, args)
+
+            # Verify that we got every project name in the prints.
+            for x in range(1, 9):
+                self.assertIn("project" + str(x), mock_stdout.getvalue())
+
+            # Split the captured output into lines to count them.
+            line_count = 0
+            for line in mock_stdout.getvalue().split("\n"):
+                # A commented-out print to stderr as a reminder that stdout
+                # is mocked; import sys and uncomment if needed.
+                # print(line, file=sys.stderr)
+                if len(line) > 0:
+                    line_count += 1
+
+            # Verify that we didn't get more lines than expected.
+            assert line_count == 8

View File

@@ -355,6 +355,30 @@ class SafeCheckoutOrder(unittest.TestCase):
         )

+class Chunksize(unittest.TestCase):
+    """Tests for _chunksize."""
+
+    def test_single_project(self):
+        """Single project."""
+        self.assertEqual(sync._chunksize(1, 1), 1)
+
+    def test_low_project_count(self):
+        """Multiple projects, low number of projects to sync."""
+        self.assertEqual(sync._chunksize(10, 1), 10)
+        self.assertEqual(sync._chunksize(10, 2), 5)
+        self.assertEqual(sync._chunksize(10, 4), 2)
+        self.assertEqual(sync._chunksize(10, 8), 1)
+        self.assertEqual(sync._chunksize(10, 16), 1)
+
+    def test_high_project_count(self):
+        """Multiple projects, high number of projects to sync."""
+        self.assertEqual(sync._chunksize(2800, 1), 32)
+        self.assertEqual(sync._chunksize(2800, 16), 32)
+        self.assertEqual(sync._chunksize(2800, 32), 32)
+        self.assertEqual(sync._chunksize(2800, 64), 32)
+        self.assertEqual(sync._chunksize(2800, 128), 21)
+
 class GetPreciousObjectsState(unittest.TestCase):
     """Tests for _GetPreciousObjectsState."""

View File

@@ -17,6 +17,7 @@
 import io
 import os
 import re
+import subprocess
 import sys
 import tempfile
 import unittest
@@ -72,84 +73,11 @@ class RepoWrapperUnitTest(RepoWrapperTestCase):
     def test_init_parser(self):
         """Make sure 'init' GetParser works."""
-        parser = self.wrapper.GetParser(gitc_init=False)
+        parser = self.wrapper.GetParser()
         opts, args = parser.parse_args([])
         self.assertEqual([], args)
         self.assertIsNone(opts.manifest_url)

-    def test_gitc_init_parser(self):
-        """Make sure 'gitc-init' GetParser raises."""
-        with self.assertRaises(SystemExit):
-            self.wrapper.GetParser(gitc_init=True)
-
-    def test_get_gitc_manifest_dir_no_gitc(self):
-        """
-        Test reading a missing gitc config file
-        """
-        self.wrapper.GITC_CONFIG_FILE = fixture("missing_gitc_config")
-        val = self.wrapper.get_gitc_manifest_dir()
-        self.assertEqual(val, "")
-
-    def test_get_gitc_manifest_dir(self):
-        """
-        Test reading the gitc config file and parsing the directory
-        """
-        self.wrapper.GITC_CONFIG_FILE = fixture("gitc_config")
-        val = self.wrapper.get_gitc_manifest_dir()
-        self.assertEqual(val, "/test/usr/local/google/gitc")
-
-    def test_gitc_parse_clientdir_no_gitc(self):
-        """
-        Test parsing the gitc clientdir without gitc running
-        """
-        self.wrapper.GITC_CONFIG_FILE = fixture("missing_gitc_config")
-        self.assertEqual(self.wrapper.gitc_parse_clientdir("/something"), None)
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test"), "test"
-        )
-
-    def test_gitc_parse_clientdir(self):
-        """
-        Test parsing the gitc clientdir
-        """
-        self.wrapper.GITC_CONFIG_FILE = fixture("gitc_config")
-        self.assertEqual(self.wrapper.gitc_parse_clientdir("/something"), None)
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test"), "test"
-        )
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test/"), "test"
-        )
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test/extra"),
-            "test",
-        )
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir(
-                "/test/usr/local/google/gitc/test"
-            ),
-            "test",
-        )
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir(
-                "/test/usr/local/google/gitc/test/"
-            ),
-            "test",
-        )
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir(
-                "/test/usr/local/google/gitc/test/extra"
-            ),
-            "test",
-        )
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/"), None
-        )
-        self.assertEqual(
-            self.wrapper.gitc_parse_clientdir("/test/usr/local/google/gitc/"),
-            None,
-        )

 class SetGitTrace2ParentSid(RepoWrapperTestCase):
     """Check SetGitTrace2ParentSid behavior."""
@@ -198,7 +126,7 @@ class RunCommand(RepoWrapperTestCase):
         self.wrapper.run_command(["true"], check=False)
         self.wrapper.run_command(["true"], check=True)
         self.wrapper.run_command(["false"], check=False)
-        with self.assertRaises(self.wrapper.RunError):
+        with self.assertRaises(subprocess.CalledProcessError):
             self.wrapper.run_command(["false"], check=True)
@@ -431,8 +359,8 @@ class VerifyRev(RepoWrapperTestCase):
     def test_verify_passes(self):
         """Check when we have a valid signed tag."""
-        desc_result = self.wrapper.RunResult(0, "v1.0\n", "")
-        gpg_result = self.wrapper.RunResult(0, "", "")
+        desc_result = subprocess.CompletedProcess([], 0, "v1.0\n", "")
+        gpg_result = subprocess.CompletedProcess([], 0, "", "")
         with mock.patch.object(
             self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
         ):
@@ -443,8 +371,8 @@ class VerifyRev(RepoWrapperTestCase):
     def test_unsigned_commit(self):
         """Check we fall back to signed tag when we have an unsigned commit."""
-        desc_result = self.wrapper.RunResult(0, "v1.0-10-g1234\n", "")
-        gpg_result = self.wrapper.RunResult(0, "", "")
+        desc_result = subprocess.CompletedProcess([], 0, "v1.0-10-g1234\n", "")
+        gpg_result = subprocess.CompletedProcess([], 0, "", "")
         with mock.patch.object(
             self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
         ):
@@ -455,7 +383,7 @@ class VerifyRev(RepoWrapperTestCase):
     def test_verify_fails(self):
         """Check we fall back to signed tag when we have an unsigned commit."""
-        desc_result = self.wrapper.RunResult(0, "v1.0-10-g1234\n", "")
+        desc_result = subprocess.CompletedProcess([], 0, "v1.0-10-g1234\n", "")
         gpg_result = Exception
         with mock.patch.object(
             self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
         ):
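The wrapper tests above replace a custom `RunResult` type with the standard `subprocess.CompletedProcess`, which stubs cleanly into any code that reads `.returncode`/`.stdout`/`.stderr`. A small illustration of the same stubbing pattern (the `Runner` class and `describe` helper here are hypothetical, not repo's API):

```python
import subprocess
from unittest import mock


class Runner:
    """Stand-in for an object with a run_git() method (illustrative)."""

    def run_git(self, *args):
        raise NotImplementedError("patched out in tests")


def describe(runner):
    # Returns the stripped stdout of a `git describe` invocation.
    return runner.run_git("describe", "HEAD").stdout.strip()


runner = Runner()
# CompletedProcess(args, returncode, stdout, stderr) mimics a real run
# without touching git at all.
fake = subprocess.CompletedProcess(["git", "describe", "HEAD"], 0, "v1.0\n", "")
with mock.patch.object(runner, "run_git", return_value=fake):
    version = describe(runner)  # "v1.0"
```

Using the stdlib type means the fakes stay structurally identical to what `subprocess.run(..., capture_output=True, text=True)` actually returns.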