Compare commits

90 Commits

148e1ce81a sync: fix recursive fetching
Commit b2fa30a2b8 ("sync: switch network
fetch to multiprocessing") accidentally changed the variable passed to
the 2nd fetch call from |missing| to |to_fetch| due to a copy & paste
error when the earlier logic was reworked.  Undo that to fix git
submodule fetching.
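The fix can be sketched as follows. This is a hedged reconstruction: only the |missing| and |to_fetch| names come from the commit message; the surrounding helpers and structure are hypothetical.

```python
def fetch_projects(to_fetch, fetch, discover_missing):
    # First pass: fetch the projects we already know about.
    success = fetch(to_fetch)
    # Projects (e.g. git submodules) discovered during the first pass:
    missing = discover_missing(to_fetch)
    if missing:
        # The regression passed |to_fetch| here again, re-fetching the
        # same projects and never fetching the newly-found submodules.
        # The fix passes |missing|.
        success = fetch(missing) and success
    return success
```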

Bug: https://crbug.com/gerrit/14489
Change-Id: I627954f80fd2e80d9d5809b530aa6b0ef9260abb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305262
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 22:43:09 -04:00
32ca6687ae git_config: hoist Windows ssh check earlier
The ssh master logic has never worked under Windows, which is why this
code always returned False when running there (including Cygwin).  But
the OS check was still done while holding the threading lock.  While
that might be a little slower than necessary, it still worked.

The switch from the threading module to the multiprocessing module
subtly changed global behavior under Windows and broke things: the
globals previously stayed valid, but now they get cleared, so the
lock is reset to None in child workers.

We could tweak the logic to pass the lock through, but there isn't
much point when the rest of the code is still disabled in Windows.
So perform the platform check before we grab the lock.  This fixes
the crash, and probably speeds things up a few nanoseconds.

This shouldn't be a problem on Linux systems as the platform fork
will duplicate the existing process memory (including globals).
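The shape of the fix can be sketched like this (function and variable names are approximations, not repo's actual identifiers):

```python
import platform
import threading

_master_keys_lock = threading.Lock()  # may be reset to None in child workers

def ssh_master_supported():
    # The fix: do the platform check *before* grabbing the lock.  The
    # ssh control-master logic has never worked on Windows (or Cygwin),
    # so bail out early; on those platforms the lock global may not
    # even survive into multiprocessing child workers.
    if platform.system().startswith(('Windows', 'CYGWIN')):
        return False
    with _master_keys_lock:
        # ... the real code manages the ssh control master here ...
        return True
```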

Bug: https://crbug.com/gerrit/14480
Change-Id: I1d1da82c6d7bd6b8cdc1f03f640a520ecd047063
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305149
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 19:49:58 -04:00
0ae9503a86 sync: fix print error when handling server error
When converting this logic from print() to the output buffer, this
error codepath should have dropped the use of the file= redirect.

Bug: https://crbug.com/gerrit/14482
Change-Id: Ib484924a2031ba3295c1c1a5b9a2d816b9912279
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305142
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 12:48:42 -04:00
cd89ec147a sync: Fix exception in an existing clone (without partial-clone).
Default the partial_clone_exclude argument to an empty set.

Fixes the following report by Emil Medve.

With this change (up to v2.14.1), on an existing "normal" clone (without partial-clone options) I'm seeing this traceback during `repo selfupdate`:

Traceback (most recent call last):

  File ".../.repo/repo/main.py", line 630, in <module>
    _Main(sys.argv[1:])
  File ".../.repo/repo/main.py", line 604, in _Main
    result = run()
  File ".../.repo/repo/main.py", line 597, in <lambda>
    run = lambda: repo._Run(name, gopts, argv) or 0
  File ".../.repo/repo/main.py", line 261, in _Run
    result = cmd.Execute(copts, cargs)
  File ".../.repo/repo/subcmds/selfupdate.py", line 54, in Execute
    if not rp.Sync_NetworkHalf():
  File ".../.repo/repo/project.py", line 1091, in Sync_NetworkHalf
    if self.name in partial_clone_exclude:
TypeError: argument of type 'NoneType' is not iterable

$ ./run_tests -v

Change-Id: I71e744e4ef2a37b13aa9ba42eba3935e78c4e40a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304082
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-04-22 18:00:32 +00:00
d41eed0b36 sync: fix missing import for -q
A refactor during review dropped this import when the code was
reworked, but it's still needed when using the --quiet setting.

Change-Id: I6d9302ef5a056e52415ea63f35bad592b9dfa75d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303942
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-21 15:37:16 +00:00
d2b086bea9 init: restore default --manifest-name
The merge of the repo & init parser missed this default.

When running `repo init ...` in an existing checkout without the -m
option, repo would error out complaining that -m is required, even
though it didn't require it before.

Change-Id: I58035d48cc413b5d373702b9dc3b9ecd3fd1e900
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303945
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jonathan Nieder <jrn@google.com>
2021-04-21 04:45:59 +00:00
6823bc269d sync: cleanup sleep+retry logic a bit
Make sure we print a message whenever we retry so it's clear to the
user why repo is pausing for a long time, and why repo might have
passed even though it displayed some errors earlier.

Also unify the sleep logic so we don't have two independent methods.
This makes it easier to reason about.

Also don't sleep if we're in the last iteration of the for loop.
There's no point, and it needlessly slows things down when there are
real errors.
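The unified behavior can be sketched as follows (a hedged illustration; the retry counts, backoff, and function names are assumptions, not repo's actual code):

```python
import time

def fetch_with_retries(fetch, retries=3, delay=1.0):
    # Always tell the user why repo is pausing, and never sleep after
    # the final attempt since there is nothing left to retry.
    for attempt in range(retries):
        if fetch():
            return True
        if attempt + 1 < retries:
            print('fetch failed; retrying in %.1fs' % delay)
            time.sleep(delay)
            delay *= 2
    return False
```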

Bug: https://crbug.com/gerrit/12494
Change-Id: Ifceace5b2dde75c2dac39ea5388527dd37376336
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303402
Reviewed-by: Sam Saccone 🐐 <samccone@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-19 16:42:06 +00:00
ad8aa69772 sync: only print error.GitError, don't raise that exception.
In _FetchOne & _CheckOne, only print error.GitError exceptions;
other exceptions are still raised.

Fixes GitError exceptions raised from
/usr/lib/python3.8/multiprocessing/pool.py aborting the repo sync.

Tested the code with the following commands and verified repo sync
continues after fetch error because of an invalid SHA1.

$ ./run_tests -v

$ python3 ~/work/repo/git-repo/repo sync -m manifest_P21623846.xml -j32
...
error.GitError: Cannot fetch platform/vendor/google_devices/redbull/proprietary update-ref: fatal: d5a99e518f09d6abb0c0dfa899594e1ea6232459^0: not a valid SHA1
....

An error like the following occurs when jobs=1:
  error.GitError: Cannot checkout platform/vendor/qcom/sdm845/proprietary/qcrilOemHook: Cannot initialize work tree for platform/vendor/qcom/sdm845/proprietary/qcrilOemHook

Bug: https://crbug.com/gerrit/14392
Change-Id: I8922ad6c07c733125419f5698b0f7e32d70c7905
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303544
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-04-15 22:43:07 +00:00
b5d075d04f command: add a helper for the parallel execution boilerplate
Now that we have a bunch of subcommands doing parallel execution, a
common pattern arises that we can factor out for most of them.  We
leave forall alone as it's a bit too complicated atm to cut over.
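A hypothetical distillation of the shared boilerplate (the name and signature here are assumptions for illustration, not repo's exact helper):

```python
import multiprocessing

def execute_in_parallel(jobs, func, inputs, callback):
    # Run |func| over every input, falling back to serial execution
    # when it isn't worth spawning workers, and hand the result stream
    # to |callback| for processing in the main process.
    if jobs == 1 or len(inputs) <= 1:
        return callback(func(x) for x in inputs)
    with multiprocessing.Pool(jobs) as pool:
        return callback(pool.imap_unordered(func, inputs))
```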

Change-Id: I3617a4f7c66142bcd1ab030cb4cca698a65010ac
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/301942
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
2021-04-15 05:10:16 +00:00
b8bf291ddb tests: Make ReviewableBranchTests.test_smoke work with git < 2.28.0
Bug: https://crbug.com/gerrit/14380
Change-Id: Id015bd98b008e1530ada2c7e4332c67e8e208e25
Signed-off-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303325
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-04-14 16:22:52 +00:00
233badcdd1 list: fix help grammar
Change-Id: Ia642e38532173d59868e0101cc098eab706d715e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303302
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-14 15:25:53 +00:00
0888a083ec help: switch from formatter module to textwrap
Since Python has deprecated the formatter module, switch to the textwrap
module instead for reflowing text.  We weren't really using any other
feature anyways.

Verified by diffing the output before & after the change and making sure
it was the same.

Then made a few tweaks to tighten up the output.
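For reference, the replacement boils down to calls like this (illustrative only; the formatter module was deprecated in Python 3.4 and later removed):

```python
import textwrap

# textwrap reflows a help paragraph much like the deprecated formatter
# module did: fill() rewraps the text to the target width, and
# indent() supplies the leading margin formatter used to provide.
para = ('We left the formatter module behind and reflow help text '
        'with textwrap instead.')
print(textwrap.indent(textwrap.fill(para, width=40), '  '))
```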

Change-Id: I0be1bc2a6661a311b1a4693c80d0f8366320ba55
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303282
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-14 01:00:51 +00:00
e2effe11a5 list: add option to show non-checkedout projects too
Currently, list only shows projects that exist in the checkout, and
doesn't offer any way to list all projects in the manifest (based on
the current settings, or on the options passed to list).  This seems
to be the opposite of what (at least some) users expect, so let's
add an option to show all of them regardless of checkout state.

Change-Id: I94bbdc5bd0ff2a411704fa215e7fc2b60fa3360e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/301263
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-13 22:42:32 +00:00
151701e85f progress: hide progress bar when --quiet
We want progress bars in the default output mode, but not when the
user specifies --quiet.  Add a setting to the Progress bar class so
it takes care of not displaying anything itself rather than having
to update every subcommand to conditionally setup & call the object.
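The idea can be sketched as below (a minimal stand-in, not repo's actual Progress class):

```python
class Progress:
    # A |quiet| flag turns the bar into a no-op, so subcommands can
    # create and update it unconditionally instead of wrapping every
    # call in "if not opt.quiet".
    def __init__(self, title, total=0, quiet=False):
        self._title, self._total, self._done = title, total, 0
        self._quiet = quiet

    def update(self, inc=1):
        self._done += inc
        if self._quiet:
            return
        print('\r%s: %d/%d' % (self._title, self._done, self._total), end='')
```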

Change-Id: I1134993bffc5437bc22e26be11a512125f10597f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303225
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-13 22:25:26 +00:00
9180a07b8f command: make --verbose/--quiet available to all subcommands
Add new CommonOptions entry points to move the existing --jobs to,
and relocate all --verbose/--quiet options to that.  This provides
both a consistent interface for users as well as for code.

Change-Id: Ifaf83b88872421f4749b073c472b4a67ca6c0437
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303224
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-13 22:25:17 +00:00
f32f243ff8 init: Added --partial-clone-exclude option.
The partial-clone-exclude option excludes projects during partial
clone.  It takes a comma-delimited list of project names (from
manifest.xml).  The option is persisted and is used by the sync
command.

A project that has been unpartial'ed will remain unpartial if
that project's name is specified in the --partial-clone-exclude
option.  The project name must match exactly.

Tested with:
$ ./run_tests -v

Bug: [google internal] b/175712967
"I can't "unpartial" my androidx-main checkout"

$ rm -rf androidx-main/
$ mkdir androidx-main/
$ cd androidx-main/
$ repo_dev init -u https://android.googlesource.com/platform/manifest -b androidx-main --partial-clone --clone-filter=blob:limit=10M -m default.xml
$ repo_dev sync -c -j8

+ Verify a project is partial
$ cd frameworks/support/
$ git config -l | grep  'partial'

+ Unpartial a project.
$ /google/bin/releases/android/git_repack/git_unpartial

+ Verify project is unpartial
$ git config -l | grep  'partial'
$ cd ../..

+ Exclude the project so it is not re-partial'ed after init and sync.
$ repo_dev init -u https://android.googlesource.com/platform/manifest -b androidx-main --partial-clone --clone-filter=blob:limit=10M --partial-clone-exclude="platform/frameworks/support,platform/frameworks/support-golden" -m default.xml

+ Verify project is unpartial
$ cd frameworks/support/
$ git config -l | grep  'partial'
$ cd ../..
$ repo_dev sync -c -j8
$ cd frameworks/support/
$ git config -l | grep  'partial'
$ cd ../..

+ Remove the project from exclude list and verify that project is partially cloned.
$ repo_dev init -u https://android.googlesource.com/platform/manifest -b androidx-main --partial-clone --clone-filter=blob:limit=10M --partial-clone-exclude= -m default.xml
$ repo_dev sync -c -j8
$ cd frameworks/support/
$ git config -l | grep  'partial'

Change-Id: Id5dba418eba1d3f54b54e826000406534c0ec196
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303162
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-04-13 15:47:10 +00:00
49de8ef584 sync: add separate --jobs options for different steps
The number of jobs one wants to run against the network tends to
differ from the number of jobs one wants to run when checking out
local projects.  The former is constrained by your internet
connection & server limits, while the latter is constrained by your
local computer's CPU & storage I/O.  People with beefier computers
probably want to keep the network/server jobs bounded a bit lower
than the local/checkout jobs.

Change-Id: Ia27ab682c62c09d244a8a1427b1c65acf0116c1c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/302804
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-09 15:58:56 +00:00
a1051d8baa init: organize command line options a bit
We've grown a lot of options in here and it's hard to make sense of
them.  Add more groups to try and make it easier to pick things out.

Change-Id: I6b9dc0e83f96137f974baf82d3fb86992b857bd2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/302803
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-09 15:58:45 +00:00
65af2602b5 sync: add progress bar to garbage collection phase
This can take a few seconds, if not a lot more, so add a progress bar
so users understand what's going on.

Change-Id: I5b4b54c1bbb9ec18728f979521310f7087afaa5c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/302802
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-09 15:58:21 +00:00
347f9ed393 sync: rework selfupdate logic
The current logic has a downside in that it doesn't sync to the latest
signed version available if the latest commit itself is unsigned.  This
can come up when using the "main" branch as it is sometimes signed, but
often not as it's holding the latest merged commits.  When people use
the main branch, it's to get early testing on versions tagged but not
yet released, and we don't want them to get stuck indefinitely on that
old version of repo.

For example, this series of events:
* "stable" is at v2.12.
* "main" is tagged with v2.13.
* early testers use --repo-rev main to get v2.13.
* new commits are merged to "main".
* "main" is tagged with v2.14.
* new commits are merged to "main".
* devs who had synced in the past to test v2.13 are stuck on v2.13.
  repo sees "main" is unsigned and so doesn't try to upgrade at all.

The only way to get unwedged is to re-run `repo init --repo-rev main`,
or to manually sync once with repo verification disabled, or for us to
leave "main" signed for a while and hope devs will sync in that window.

The new logic is that whenever changes are available, we switch to the
latest signed tag.  We also replace some of the duplicated verification
code in the sync command with the newer wrapper logic.  This handles a
couple of important scenarios in addition to the above:
* rollback (e.g. v2.13.8 -> v2.13.7)
* do not trash uncommitted changes (in case of ad-hoc testing)
* switch tag histories (e.g. v2.13.8 -> v2.13.8-cr1)

Change-Id: I5b45ba1dd26a7c582700ee3711f303dc7538579b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/300122
Reviewed-by: Jonathan Nieder <jrn@google.com>
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-09 03:16:45 +00:00
9a734a3975 init: merge subcmd & wrapper parsers
These are manually kept in sync which is a pain.  Have the init
subcmd reuse the wrapper code directly.

Change-Id: Ica73211422c64377bacc9bb3b1d1a8d9d5f7f4ca
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/302762
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-09 01:04:32 +00:00
6a2f4fb390 repo init: Added --no-partial-clone and made it persist. Bumped version to 2.14.
Save the repo.partialclone setting when the --no-partial-clone option
is passed to init, so repo sync will honor it.

$ ./run_tests -v

Bug: [google internal] b/175712967

$ mkdir androidx-main && cd androidx-main
$ repo init -u https://android.googlesource.com/platform/manifest -b androidx-main --partial-clone --clone-filter=blob:limit=10M
$ repo sync -c -j32
$ cd frameworks/support/ && /google/bin/releases/android/git_repack/git_unpartial
$ git config -l | grep  'partialclonefilter=blob'

Observe partialclone is not enabled.

$ cd ../..
$ repo init -u https://android.googlesource.com/platform/manifest -b androidx-main
$ repo sync -c -j32
$ cd frameworks/support/ && git config -l | grep  'partialclonefilter=blob'

Observe partialclone is enabled.

$ /google/bin/releases/android/git_repack/git_unpartial

Observe partialclone is not enabled.

$ cd ../..
$ repo_dev init -u https://android.googlesource.com/platform/manifest -b androidx-main --no-partial-clone
$ repo sync -c -j32
$ cd frameworks/support/ && git config -l | grep  'partialclonefilter=blob'

Observe partialclone is not enabled.

Change-Id: I4400ad7803b106319856bcd0fffe00bafcdf014e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/302122
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-04-05 05:53:19 +00:00
beea5de842 tox: enable python 3.5 & 3.9 testing
We still support Python 3.5, so make sure it keeps working.

Change-Id: I150158a656b26de6d733316a68a2cbb8b5b99716
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299625
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-01 14:57:05 +00:00
bfbcfd9045 run_tests: fix exit code handling
We need to pass back an int, not a CompletedProcess object.  Switch to
check=False so we don't throw an exception on failure -- we're already
showing pytest's stderr, and will return the non-zero status.

Change-Id: Ib0d3862a09a3963f25025f39a8e34419cf2a54df
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299624
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-01 14:56:57 +00:00
74317d3b01 setup: bump major version
We don't keep this up-to-date in general, but might as well keep
the major version in sync.

Change-Id: I20908005b3b393d384da0ef9342d7c9d094550cb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299622
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-01 14:56:47 +00:00
b2fa30a2b8 sync: switch network fetch to multiprocessing
This avoids GIL limitations with using threads for parallel processing.

This reworks the fetch logic to return results for processing in the
main thread instead of leaving every thread to do its own processing.

We have to tweak the chunking logic a little here because multiprocessing
favors batching over returning immediate results when using a larger value
for chunksize.  When a single job can be quite slow, this tradeoff is not
good UX.
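The chunking tradeoff described above can be seen in how results come back from a pool. In this sketch, the builtin `abs` stands in for the per-project fetch function (the real worker must likewise be picklable); with a large chunksize the pool batches inputs per worker and buffers their results, so one slow job delays everything in its batch, while chunksize=1 hands each result back as soon as it finishes.

```python
import multiprocessing

def fetch_all(projects, jobs=2):
    # chunksize=1 favors immediate results over batching efficiency,
    # which is the better UX when individual jobs can be slow.
    with multiprocessing.Pool(jobs) as pool:
        for result in pool.imap_unordered(abs, projects, chunksize=1):
            yield result
```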

Bug: https://crbug.com/gerrit/12389
Change-Id: I0f0512d15ad7332d1eb28aff52c29d378acc9e1d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298642
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-01 14:52:57 +00:00
d246d1fee7 grep: add --jobs support
Use multiprocessing to run in parallel.  When operating on multiple
projects, this can greatly speed things up.  Across 1000 repos, it
goes from ~40sec to ~16sec with the default -j8.

The output processing does not appear to be a significant bottleneck
-- it accounts for <1sec out of the ~16sec runtime.  Thus we leave
it in the main thread to simplify the code.

Change-Id: I750b72c7711b0c5d26e65d480738fbaac3a69971
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297984
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-01 14:43:19 +00:00
bec4fe8aa3 prune: add --jobs support
Use multiprocessing to run in parallel.  When operating on multiple
projects, this can greatly speed things up.  Across 1000 repos, it
goes from ~10sec to ~4sec with the default -j8.

This is only a simple conversion to get an easy speedup.  It
is currently written to collect all results before displaying them.
If we refactored this module more, we could have it display results
as they came in.

Change-Id: I5caf4ca51df0b7f078f0db104ae5232268482c1c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298643
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-31 16:28:24 +00:00
ddab0604ee forall: handle missing project refs better
If the project exists, but the ref the manifest wants doesn't exist,
don't throw an error (and abort the process in general).  This can
come up with a partially synced tree: the manifest is up-to-date,
but not all the projects have yet been synced.

Bug: https://crbug.com/gerrit/14289
Change-Id: Iba97413c476544223ffe518198c900c2193a00ed
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/301262
Reviewed-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-25 23:08:51 +00:00
2ae44d7029 sync: imply -c if --use-superproject option is used.
Tested the code with the following commands.

$ ./run_tests -v

Bug: [google internal] b/183232698
Bug: https://crbug.com/gerrit/13707

$ repo_dev init -u sso://android.git.corp.google.com/platform/manifest -b master --partial-clone --clone-filter=blob:limit=10M --repo-rev=main --use-superproject
$ repo_dev sync --use-superproject
$ repo_dev sync
  real	0m8.046s
  user	0m2.866s
  sys	0m2.457s

The second repo sync took only 8 seconds; verified by printing that
current_branch_only is True in project.py's Sync_NetworkHalf function.

Change-Id: Ic48efb23ea427dfa36e12a5c49973d6ae776d818
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/301182
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-03-24 15:17:19 +00:00
d1e4fa7015 start: add a --HEAD alias
This makes it easy to use --H as a shortcut, and it roughly matches
the convention of storing HEAD as the revision.

Change-Id: I590bf488518f313e7a593853140316df98262d7e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/301163
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-24 00:32:04 +00:00
323b113f55 forall/list: delete spurious "
Change-Id: I6995d48be1d8fc5d93f4b9fa617fad70b5b3429f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/300902
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-19 21:13:49 +00:00
8367096d02 superproject: Added --depth=1 argument to git fetch command.
Tested the code with the following commands.

$ ./run_tests -v

Without --depth

$ time repo_dev init -u sso://googleplex-android.git.corp.google.com/platform/manifest -b master --repo-rev=main --use-superproject
real    6m48.086s
user    3m27.281s
sys     1m1.386s

With --depth=1
$ time repo_dev init -u sso://googleplex-android.git.corp.google.com/platform/manifest -b master --repo-rev=main --use-superproject

real    2m49.637s
user    2m51.458s
sys     0m39.108s

From dwillemsen@:
 "For me it's the difference between 9m28s using the current code and
  16s using --depth=1 while fetching the superproject."

Bug: [google internal] b/180451672
Bug: https://crbug.com/gerrit/13707
Change-Id: I1c3e4aef4414c4e9dd259fb6e4619da0421896b0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/300922
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-03-19 21:10:39 +00:00
d34af28ac2 forall: allow interactive commands with -j1
Historically forall has been interactive since it ran in serial.
Recent rework in here dropped that to enable parallel processing.
Restore support for interactive commands when running -j1 or with
an explicit --interactive option.

Bug: https://crbug.com/gerrit/14256
Change-Id: I502007186f771914cfd7830846a4e1938b5e1f38
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/300722
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-18 22:13:01 +00:00
a5b40a2845 repo: Add a new "command" event type to git trace2 logging in repo.
Add a new "event": "command", which is emitted when all command
arguments have been processed.

Additional fields:

"name": Name of the primary command (ex: repo, git)

"subcommands"': List of the sub-commands once command-line arguments
                are processed

Examples:

Command: repo --version

Event: {"event": "command", <common fields>,
        "name": "repo",
        "subcommands": ["version"]}

Bug: [google internal] b/178507266
Testing:
- Unit tests
- Verified repo git trace2 logs had expected data

Change-Id: I825bd0ecedee45135382461a4ba10f987f09aef3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/300343
Reviewed-by: Ian Kasprzak <iankaz@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-03-18 14:58:24 +00:00
511a0e54f5 sync: fix reporting of failed local checkouts
The refactor to multiprocessing broke status reporting slightly when
checking out projects.  Make sure we mark the step as failed if any
of the projects failed, not just when --fail-fast is set.

Change-Id: I0efb56ce83b068b2c334046df3fef23d797599c9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299882
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-15 16:54:23 +00:00
8da7b6fc65 bash-completion: initial import based on CrOS version
We've had a limited version of this in CrOS for a long time.  There's
nothing CrOS specific about it, so let's move it to the repo project so
everyone can utilize it.

Change-Id: I04cd94610c1100f3afcd2baf8c8e7ab13e589490
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299202
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-15 16:54:21 +00:00
0458faa502 manifest: allow toplevel project checkouts
Re-allow checking out projects to the top of the repo client checkout.
We add checks to prevent checking out files under .repo/ as that path
is only managed by us, and projects cannot inject content or settings
into it.

Bug: https://crbug.com/gerrit/14156
Bug: https://crbug.com/gerrit/14200
Change-Id: Id6bf9e882f5be748442b2c35bbeaee3549410b25
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299623
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-12 16:31:14 +00:00
68d5d4dfe5 document the new manifest restrictions on name & path settings
Bug: https://crbug.com/gerrit/14156
Change-Id: I473edab1173e6a266d0754c29d5dc7ff761f1359
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299403
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-12 16:30:37 +00:00
a3794e9c6f prune: minor optimization & robustification
If the current project doesn't have any local branches, then there's
nothing to prune, so return right away.  This avoids running a few
git commands when we aren't actually going to use the results, and
it avoids checking repository validity.  Since we aren't going to do
anything in here, no need to check it.

Change-Id: Ie9d5c75a954e42807477299f3e5a63a92fac138b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299742
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-12 05:28:06 +00:00
080877e413 superproject: pass groups to ToXml method.
Added the following methods to XmlManifest class.
+ GetDefaultGroupsStr() - returns 'default,platform-' + platform.system().lower()
+ GetGroupsStr() - Same as gitc_utils.py's _manifest_groups func.

+ Replaced gitc_utils.py's _manifest_groups calls with GetGroupsStr.
+ Used the above methods to get groups in command.py::GetProjects
  and part of init.py.

TODO: clean up these funcs to take structured group data instead
      of passing around strings that need parsing everywhere.
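The default-groups helper described above can be sketched directly from the commit message (the snake_case name here is illustrative; repo uses GetDefaultGroupsStr):

```python
import platform

def get_default_groups_str():
    # The default group string is 'default,platform-<system>',
    # e.g. 'default,platform-linux' on a Linux host.
    return 'default,platform-' + platform.system().lower()
```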

Tested the code with the following commands.

$ ./run_tests -v

Tested the sync code by using repo_dev alias and pointing to this CL
and verified prebuilts/fullsdk-linux directory has all the folders.

Tested repo init and repo sync with --use-superproject and without
--use-superproject argument.

$ repo_dev init -u sso://android.git.corp.google.com/platform/manifest -b androidx-main  --partial-clone --clone-filter=blob:limit=10M --repo-rev=main --use-superproject

$ repo_dev sync -c -j32

Bug: [google internal] b/181804931
Bug: https://crbug.com/gerrit/13707
Change-Id: Ia98585cbfa3a1449710655af55d56241794242b6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299422
Reviewed-by: Jonathan Nieder <jrn@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-03-11 01:24:52 +00:00
9888accb0c project: fix diff printing with embedded %
The recent commit 84230009ee ("project:
make diff tools synchronous") broke repo diff when the output includes
% formats.  Add an explicit format string to fix it.
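The failure mode is easy to reproduce with any %-formatting output helper (this `write` is a minimal stand-in for repo's colored-output API, not its real signature):

```python
def write(fmt, *args):
    # The output helper always applies %-formatting to its message.
    return fmt % args

diff_text = 'dissimilarity index 100%'

# Broken: passing the diff text as the format string makes the literal
# '%' get parsed as a (incomplete) format specifier.
try:
    write(diff_text)
    raise AssertionError('expected ValueError')
except ValueError:
    pass

# Fix: pass an explicit format string and the diff text as an argument.
assert write('%s', diff_text) == diff_text
```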

Bug: https://crbug.com/gerrit/14208
Change-Id: Ie255a43c5b767488616b2b3dd15abc18f93bfab2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299402
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-09 17:00:02 +00:00
5a4c8fde17 init: expose --worktree option
There's a few rough edges here still, but no known corruption ones,
so open it up a bit for people to experiment with.

Bug: https://crbug.com/gerrit/11486
Change-Id: I81e0122ab6d3e032c546c8239dd4f03740676e80
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299242
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-09 16:59:59 +00:00
835a34bdb9 Log repo.* config variables in git trace2 logger.
Bug: [google internal] b/181758736
Testing:
- Unit tests
- Verified repo git trace2 logs had expected data

Change-Id: I9af8a574377bd91115f085808c1271e9dee16a36
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299182
Tested-by: Ian Kasprzak <iankaz@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
2021-03-08 17:32:09 +00:00
ef99ec07b4 superproject: Display status messages during repo init/sync.
Superproject objects accept the optional argument “quiet”.
The following progress messages are displayed if quiet is false.

Display the following message whenever we find we have to make a new
folder (i.e. a new remote), because if you started with repo init for
android and later do googleplex-android, that is when it will be slow.

"<location>: Performing initial setup for superproject; this might take
several minutes.".

After fetch completion, added the following notification:
"<location>: Initial setup for superproject completed."

Tested the code with the following commands.

$ ./run_tests -v

Tested the sync code by using repo_dev alias and pointing to this CL.

$ repo_dev init -u persistent-https://googleplex-android.git.corp.google.com/platform/manifest -b rvc-dev  --partial-clone --clone-filter=blob:limit=10M --repo-rev=main  --use-superproject

Bug: [google internal] b/181178282
Bug: https://crbug.com/gerrit/13707
Change-Id: Ia7fb85c6fb934faaa90c48fc0c55e7f41055f48a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299122
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-03-04 20:07:52 +00:00
934cb0a849 tests: fix duplicate method from copy & paste error
Change-Id: Ib748c61b1e65aee6dff8b97a9753d14c470a827f
Reported-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/299002
Reviewed-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Ian Kasprzak <iankaz@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-04 16:17:11 +00:00
3c0931285c project: fix variable typo
Bug: https://crbug.com/gerrit/11293
Reported-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: I37bac58aa1dc9ecc10e29253d14ff9e6fb42427c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298942
Reviewed-by: Ian Kasprzak <iankaz@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-03 16:45:21 +00:00
5413397204 manifest: relax include name rules for user-specified path
Allow the user to specify relative or absolute or any other funky
path that they want when using `repo init` or `repo sync`.  Our
goal is to restrict the paths in the remote manifest git repo we
cloned from the network, not protect the user from themselves.

Bug: https://crbug.com/gerrit/14156
Change-Id: I1ccfb2a6bd1dce2bd765e261bef0bbf0f8a9beb6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298823
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-02 03:18:57 +00:00
13cb7f799d forall: greatly speed up processing overhead
With the recent commit 0501b29e7a
("status: Use multiprocessing for `repo status -j<num>` instead of
threading"), the limitation with project serialization no longer
applies.  It turns out that ad-hoc logic is expensive.  In the CrOS
checkout (~1000 projects w/8 jobs by default), it adds about ~7sec
overhead to all invocations.  With a fast nop run:
	time repo forall -j8 -c true
This goes from ~11sec to ~4sec -- more than 50% speedup.

Change-Id: Ie6bcccd21eef20440692751b7ebd36c890d5bbcc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298724
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-01 15:58:06 +00:00
819c73954f forall: simplify arg passing to worker children
The ProjectArgs function can be inlined which simplifies it quite a
bit.  We shouldn't need the custom exception handling here either.
This also makes the next commit easier to review.

Change-Id: If3be04f58c302c36a0f20b99de0f67e78beac141
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298723
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-01 15:58:06 +00:00
179a242caa forall: move nested func out to the class
This is in preparation for simplifying the jobs support.  The nested
function is referenced in the options object which can't be pickled,
so pull it out into a static method instead.

Change-Id: I01d3c4eaabcb8b8775ddf22312a6e142c84cb77d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298722
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-01 15:57:32 +00:00
31fabeed54 download: handle shared projects a bit better
If a manifest checksout a project multiple times, repo download isn't
able to accurately pick the right project.  We were just picking the
first result which could be a bit random for the user.  If we hit that
situation, check if the cwd is one of the projects, and if it isn't,
we emit an error and tell the user it's an ambiguous request.

Bug: https://crbug.com/gerrit/13070
Change-Id: Id1059b81330229126b48c7312569b37504808383
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298702
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-01 15:57:17 +00:00
76844ba292 project: skip clone bundles when we've already initialized the object dir
The clone bundle logic assumes there is a one-to-one mapping between the
projects/ and project-objects/ trees.  When using shared projects (where
we checkout different branches from the same project), this would lead us
to fetching the same clone bundle multiple times.  Automatically skip the
clone bundle logic if the project-objects/ dir already exists.

Bug: https://crbug.com/gerrit/10993
Change-Id: I82c6fa1faf8605fd56c104fcea2a43dd4eecbce4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298682
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-03-01 15:57:12 +00:00
6d1faa1db3 git_refs: fix crash with binary . files in .git/refs/
On macOS, the Finder app likes to poop .DS_Store files in every path
that the user browses.  If the user pokes around the .git/ tree, it
could generate a .DS_Store file in there too.  When repo goes to read
all the local refs, it tries to decode this binary file as UTF-8 and
subsequently crashes.

Since paths that begin with . are not valid refs, ignore them like we
already do with paths that end in .lock.  Also extend the check to
ignore directories that match, since that follows the git rules: they
apply to any component in the path, not just the final name.

We don't implement the full valid ref algorithm that git employs as
it's a bit complicated, and we only really need to focus on what will
practically show up locally.

Bug: https://crbug.com/gerrit/14162
Change-Id: I6519f990e33cc58a72fcb00c0f983ad3285ace3d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298662
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Michael Mortensen <mmortensen@google.com>
2021-02-28 16:07:24 +00:00
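The filtering rule described above can be sketched as follows (a hypothetical standalone sketch, not repo's actual git_refs code; the function names are illustrative):

```python
def _is_valid_component(name):
    """Mirror the two checks described above: skip names that begin with
    "." (e.g. macOS .DS_Store) or end with ".lock"."""
    return not (name.startswith('.') or name.endswith('.lock'))

def is_plausible_ref(path):
    """Apply the rule to every component of the path, so directories that
    match (e.g. refs/.finder/) are skipped too, not just the final name."""
    return all(_is_valid_component(part) for part in path.split('/'))

print(is_plausible_ref('refs/heads/main'))       # True
print(is_plausible_ref('refs/heads/.DS_Store'))  # False
print(is_plausible_ref('refs/.finder/main'))     # False
print(is_plausible_ref('refs/heads/main.lock'))  # False
```

As the commit notes, this is deliberately looser than git's full ref-validation algorithm; it only rejects what plausibly shows up locally.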
4510be51c1 git_command: pass GIT_DIR on Windows with /
When using Git under Windows, it seems that Git doesn't always parse
GIT_DIR correctly when it uses the Windows \ form, but does when it
uses / only.

For example, when using worktrees:
$ GIT_DIR='C:\Users\vapier\Desktop\repo\breakpad\tools\test\.git' git worktree list
fatal: not a git repository: ..\..\.repo\worktrees\linux-syscall-support.git\worktrees\test
$ GIT_DIR='C:/Users/vapier/Desktop/repo/breakpad/tools/test/.git' git worktree list
C:/Users/vapier/Desktop/repo/breakpad/.repo/worktrees/linux-syscall-support.git  fd00dbbd0c06 (detached HEAD)
..\..\..\..\..\src\src\third_party\lss\.git                                      fd00dbbd0c06 (detached HEAD)
..\..\..\..\..\tools\test\.git                                                   fd00dbbd0c06 (detached HEAD)

Change-Id: I666c03ae845ecb55d7f9800731ea6987d3e7f401
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298622
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-28 16:07:20 +00:00
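A minimal sketch of the workaround described above (the helper name is invented for illustration, not repo's actual code): rewrite the separators before placing the path into the environment.

```python
def git_dir_for_env(path):
    # Git on Windows parses GIT_DIR more reliably with "/" separators,
    # so convert any "\" to "/" before exporting it.
    return path.replace('\\', '/')

print(git_dir_for_env(r'C:\Users\vapier\Desktop\repo\tools\test\.git'))
# C:/Users/vapier/Desktop/repo/tools/test/.git
```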
a29424ea6d manifest: validate project name & path and include name attributes
These attribute values are used to construct local filesystem paths,
so apply the existing filesystem checks to them.

Bug: https://crbug.com/gerrit/14156
Change-Id: Ibcceecd60fa74f0eb97cd9ed1a9792e139534ed4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298443
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-28 16:07:12 +00:00
a00c5f40e7 manifest: refactor the filesystem checking logic for more reuse
This function is currently written with copyfile & linkfile in mind.
Generalize the logic & function arguments slightly so we can reuse
in more places that make sense.

This changes the validation logic slightly too in that we no longer
allow "." for the dest attribute with copyfile & linkfile, nor for
the src attribute with copyfile.  We already rejected those later on
when checking against the active filesystem, but now we reject them
a little sooner when parsing.

The empty path check isn't a new requirement exactly -- repo used to
crash on it, so it was effectively blocked, but now we diagnose it.

Bug: https://crbug.com/gerrit/14156
Change-Id: I0fdb42a3da60ed149ff1997c5dd4b85da70eec3d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298442
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-28 16:07:12 +00:00
6093d99d13 checkout: add --jobs support
Use multiprocessing to run in parallel.  When operating on multiple
projects, this can speed things up.  Across 1000 repos, it goes from
~9sec to ~5sec with the default -j8.

Change-Id: Ida6dd565db78ff7bac0ecb25d2805e8a1bf78048
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297982
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-27 19:56:24 +00:00
ebf04a4404 sync: switch local checkout to multiprocessing
This avoids GIL limitations with using threads for parallel processing.
In a CrOS checkout with ~1000 repos, the nop case goes from ~6 sec down
to ~4 sec with -j8.  Not a big deal, but shows that this actually works
to speed things up unlike the threading model.

This reworks the checkout logic to return results for processing in the
main thread instead of leaving every thread to do its own processing.

Bug: https://crbug.com/gerrit/12389
Change-Id: I143e5e3f7158e83ea67e2d14e5552153a874248a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298063
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-27 19:55:14 +00:00
8dbc07aced abandon/start: add --jobs support
Use multiprocessing to run in parallel.  When operating on multiple
projects, this can greatly speed things up.  Across 1000 repos, it
goes from ~30sec to ~3sec with the default -j8.

Change-Id: I0dc62d704c022dd02cac0bd67fe79224f4e34095
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297484
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
2021-02-27 19:45:14 +00:00
8d2a6df1fd progress: include execution time summary
We're already keeping track of the start time, so might as
well use it to display overall execution time for steps.

Change-Id: Ib4cf8b2b0dfcdf7b776a84295d59cc569971bdf5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298482
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-26 17:16:29 +00:00
ceba2ddc13 sync: superproject - support for switching hosts and switching branches.
+ superproject will be fetched into a directory with the name
  “<remote name>-superproject.git” instead of the current
  “superproject.git” folder.

+ Deleted _Clone method and added _Init method.

+ _Init method will do “git init --bare <remote>-superproject.git”.
  It will create the folder and set up a bare repository in
  <remote>-superproject.git folder.

+ _Fetch method, will pass <remote url>, <branch> arguments.
  Moved the --filter argument from “git clone” to “git fetch”.
  _Fetch method will execute the following command to fetch
  superproject. Added --no-tags argument.

  master:  git fetch <remote url> --force --no-tags --filter blob:none
  branch:  git fetch <remote url> --force --no-tags --filter blob:none \
           <branch>:<branch>

+ Performance improvements for aosp-master
  ++ repo init performance improved from 35 seconds to 17 seconds.
  ++ repo init --use-superproject is around 5 to 7 seconds slower.
  ++ repo sync --use-superproject is around 3 to 4 minutes faster.

Tested the code with the following commands.

$ ./run_tests -v

Tested the sync code by using repo_dev alias and pointing to this CL.

$ time repo_dev init -u sso://android.git.corp.google.com/platform/manifest -b master --partial-clone --clone-filter=blob:limit=10M --repo-rev=main --use-superproject
...
  real	0m20.648s
  user	0m8.046s
  sys	0m3.271s

+ Without superproject
$ time repo init -u sso://android.git.corp.google.com/platform/manifest -b master --partial-clone --clone-filter=blob:limit=10M --repo-rev=main
  real	0m13.078s
  user	0m9.783s
  sys	0m2.528s

$ time repo_dev sync -c -j32 --use-superproject
...
  real	15m7.072s
  user	110m7.216s
  sys	20m17.559s

+ Without superproject
$ time repo sync -c -j32
...
  real	19m25.644s
  user	91m56.331s
  sys	20m59.170s

Bug: [google internal] b/180492484
Bug: [google internal] b/179470886
Bug: [google internal] b/180124069
Bug: https://crbug.com/gerrit/13709
Bug: https://crbug.com/gerrit/13707

Change-Id: Ib04bd7f1e25ceb75532643e58ad0129300ba3299
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297702
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-02-25 20:45:26 +00:00
45ad1541c5 grep: move nested func out to the class
This is in preparation for adding jobs support.  The nested function
is referenced in the options object which can't be pickled, so pull
it out into a static method instead.

Change-Id: I280ed2bf26390a0203925517a0d17c13053becaa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297983
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 20:13:33 +00:00
7b586f231b sync: capture all git output by default
The default sync output should show a progress bar only for successful
commands, and the error output for any commands that fail.  Implement
that policy here.

Bug: https://crbug.com/gerrit/11293
Change-Id: I85716032201b6e2b45df876b07dd79cb2c1447a5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297905
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 20:13:18 +00:00
fbb95a4342 progress/sync: include active number of jobs
Provide a bit more info to users that things are actively running.

Bug: https://crbug.com/gerrit/11293
Change-Id: Ie8eeaa8804d1ca71cf5c78ad850fa2d17d26208c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297904
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 20:13:18 +00:00
4e05f650e0 progress: always enable always_print_percentage
The idea for skipping some progress updates was to avoid spending
too much time on the progress bar itself.  Unfortunately, for large
projects (100s if not 1000s of repos), we get into the situation
with large/slow checkouts that we skip showing updates when a repo
finishes, but not enough repos finished to increase the percent.

Since the progress bar should be relatively fast compared to the
actual network & local disk operations, have it show an update
whenever the caller requests it.  A test with ~1000 repos shows
that the progress bar in total adds <100ms.

Bug: https://crbug.com/gerrit/11293
Change-Id: I708a0c4bd923c59c7691a5b48ae33eb6fca4cd14
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297903
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 20:13:18 +00:00
23882b33fe init: support -b HEAD as a shortcut to "the default"
When people switch to non-default branches, they sometimes want to
switch back to the default, but don't know the exact name for that
branch.  Add a -b HEAD shortcut for that.

Change-Id: I090230da25f9f5a169608115d483f660f555624f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297843
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 20:12:51 +00:00
92304bff00 project: fix http error retry logic
When sync moved to consume clone output, it merged stdout & stderr,
but the retry logic in this function is based on stderr only.  Move
it over to checking stdout.

Change-Id: I71bdc18ed25c978055952721e3a768289d7a3bd2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297902
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 20:12:42 +00:00
adbd01e0d3 tests: fix init subcmd after url change
My recent 401c6f0725 ("init: make
--manifest-url flag optional") commit broke the unittest.

Change-Id: I19ad0e8c8cbb84ab5474ebc370e00acfe957e136
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298223
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 17:07:11 +00:00
37ac3d626f tests: refactor manifest tests
The XmlManifestTests class is getting to be large and we're only
adding more to it.  Factor out the core logic into a new TestCase
so we can reuse it to better group more tests.

Change-Id: I5113444a4649a70ecfa8d83d3305959a953693f7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298222
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 17:06:56 +00:00
55d6a5a3a2 sync: use superproject if manifest's config has superproject enabled.
If --use-superproject is passed as argument to "repo init", then
--use-superproject need not be specified during "repo sync".

Tested the code with the following commands.

$ time repo_dev sync -c -j32
...
WARNING: --use-superproject is experimental and not for general use

Bug: https://crbug.com/gerrit/13709
Bug: https://crbug.com/gerrit/13707
Change-Id: Ibb33f3038a2515f74a6c4f7cb785d354b26ee680
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298102
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Ian Kasprzak <iankaz@google.com>
2021-02-25 16:35:53 +00:00
6db4097f31 docs: add warnings about repos data model
For people coming across these docs and thinking that repo's methods
are good to replicate, add a note warning them against doing so.

Change-Id: I443a783794313851a6e7ba1c39baebac988bff9a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/298164
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-25 15:48:03 +00:00
f0925c482f platform_utils: delete unused FileDescriptorStreams APIs
Now that we've converted the few users of this over to subprocess APIs,
we don't need this anymore.  It's been a bit hairy to maintain across
different operating systems, so there's no desire to bring it back.

Using multiprocessing Pool to batch things has been working better in
general anyways.

Change-Id: I10769e96f60ecf27a80d8cc2aa0d1b199085252e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297682
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-24 01:45:57 +00:00
be24a54d9c sync: update event is_set API
Python 3 renamed this method from isSet to is_set.

Change-Id: I8f9bb0b302d55873bed3cb20f2d994fa2d082157
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297742
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-23 17:56:49 +00:00
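The rename in question, as a tiny refresher on the Python 3 threading.Event API:

```python
import threading

event = threading.Event()
print(event.is_set())  # False -- the Python 3 spelling (was isSet)
event.set()
print(event.is_set())  # True
event.clear()
print(event.is_set())  # False
```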
c87c1863b1 git_command: switch process capturing over to subprocess
Now that these code paths are all synchronous, there's no need to run
our own poll loop to read & pass thru/save output.  Delete all of that
and just let the subprocess module take care of it all.

Change-Id: Ic27fe71b6f964905cf280ce2b183bb7ee46f4a0d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297422
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-23 00:36:51 +00:00
69b4a9cf21 diff: add --jobs support
Use multiprocessing to run diff in parallel.

Change-Id: I61e973d9c2cde039d5eebe8d0fe8bb63171ef447
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297483
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
2021-02-23 00:31:27 +00:00
fbab6065d4 forall: rewrite parallel logic
This fixes intermingling of parallel jobs and simplifies the code
by switching to subprocess.run.  This also provides stable output
in the order of projects by returning the output as a string that
the main loop outputs.

This drops support for interactive commands, but it's unclear if
anyone was relying on that, and the default behavior (-j2) made
that unreliable.  If it turns out someone still wants this, we can
look at readding it.

Change-Id: I7555b4e7a15aad336667292614f730fb7a90bd26
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297482
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-22 22:58:30 +00:00
15e807cf3c forall: improve pool logic
Use a pool contextmanager to take care of the messy details like
properly cleaning it up when aborting.

Change-Id: I264ebb591c2e67c9a975b6dcc0f14b29cc66a874
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297243
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-22 22:51:46 +00:00
7c871163c8 status: improve parallel execution stability
The status command runs a bunch of jobs in parallel, and each one
is responsible for writing to stdout directly.  When running many
noisy jobs in parallel, output can get intermingled.  Pass down a
StringIO buffer for writing to so we can return the entire output
as a string so the main job can handle displaying it.  This fixes
interleaved output as well as making the output stable: we always
display results in the same project order now.  By switching from
map to imap, this ends up not really adding any overhead.

Bug: https://crbug.com/gerrit/12231
Change-Id: Ic18b07c8074c046ff36e306eb8d392fb34fb6eca
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297242
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
2021-02-22 22:51:34 +00:00
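The buffering pattern described above can be sketched like this (hypothetical names; repo's actual status code differs):

```python
import io
import multiprocessing

def status_worker(project):
    # Write into a private StringIO buffer instead of stdout, so output
    # from parallel workers can never interleave.
    buf = io.StringIO()
    print('project %s: clean' % project, file=buf)
    return buf.getvalue()

def run_status(projects, jobs=2):
    if jobs == 1 or len(projects) == 1:
        return [status_worker(p) for p in projects]
    with multiprocessing.Pool(jobs) as pool:
        # imap (rather than imap_unordered or map) streams results back
        # as they finish but still in input order, giving stable output.
        return list(pool.imap(status_worker, projects))

if __name__ == '__main__':
    for out in run_status(['foo', 'bar']):
        print(out, end='')
```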
6a2400a4d0 command: unify --job option & default values
Extend the Command class to support adding the --jobs option to the
parser if the command declares it supports running in parallel.  Also
pull the default value used for the number of local jobs into the
command module so local commands can share it.

Change-Id: I22b0f8d2cf69875013cec657b8e6c4385549ccac
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297024
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
2021-02-22 22:51:07 +00:00
c5bbea8db3 git_command: make execution synchronous
Every use of GitCommand in the tree just calls Wait as soon as it's
instantiated.  Move the bulk of the logic into the init path to make
the call synchronous to simplify.  We'll clean up the users of the
Wait API in follow-up commits -- having this split makes it easier to
track down regressions.

Change-Id: I1e8c519efa912da723749ff7663558c04c1f491c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297244
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-20 08:41:10 +00:00
5d9c4972e0 use simpler super() magic
Python 3 has a simpler super() style so switch to it to make the
code a little simpler and to stop pylint warnings.

Change-Id: I1b3ccf57ae968d56a9a0bcfc1258fbd8bfa3afee
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297383
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-19 20:06:20 +00:00
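The change amounts to this (a generic before/after illustration, not code from the repo):

```python
class Base:
    def __init__(self, name):
        self.name = name

class Legacy(Base):
    def __init__(self, name):
        # Python 2 compatible spelling: class and instance repeated.
        super(Legacy, self).__init__(name)

class Modern(Base):
    def __init__(self, name):
        # Python 3 only: zero-argument super() resolves the same MRO entry.
        super().__init__(name)

print(Legacy('a').name, Modern('b').name)  # a b
```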
057905fa1d error: fix pickling of all exceptions
Make sure all our custom exceptions can be pickled so that if they
get thrown in a multiprocess subprocess, we don't crash & hang due
to multiprocessing being unable to pickle+unpickle the exception.

Details/examples can be seen in Python reports like:
https://bugs.python.org/issue13751

Change-Id: Iddf14d3952ad4e2867cfc71891d6b6559130df4b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297382
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-19 20:06:03 +00:00
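The failure mode is easy to reproduce (class names here are invented for illustration): an exception whose __init__ signature doesn't match the args it passes to Exception can't be unpickled, which is fatal when multiprocessing ships it back to the parent. One common fix is defining __reduce__:

```python
import pickle

class FragileError(Exception):
    def __init__(self, project, reason):
        # Only the joined string lands in .args, so unpickling calls
        # FragileError('<one arg>') and dies with a TypeError.
        super().__init__('%s: %s' % (project, reason))

class RobustError(Exception):
    def __init__(self, project, reason):
        super().__init__('%s: %s' % (project, reason))
        self.project = project
        self.reason = reason

    def __reduce__(self):
        # Recreate the instance from its original constructor arguments.
        return (type(self), (self.project, self.reason))

try:
    pickle.loads(pickle.dumps(FragileError('foo', 'timeout')))
except TypeError as e:
    print('unpickle failed:', e)

err = pickle.loads(pickle.dumps(RobustError('foo', 'timeout')))
print(err.project, err.reason)  # foo timeout
```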
401c6f0725 init: make --manifest-url flag optional
Since the --manifest-url flag is always required when creating a new
checkout, allow the url to be specified via a positional argument.
This brings it a little closer to the `git clone` UI.

Change-Id: Iaf18e794ae2fa38b20579243d067205cae5fae2f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297322
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jonathan Nieder <jrn@google.com>
2021-02-18 20:38:47 +00:00
8c1e9e62a3 gitc_utils: rewrite to use multiprocessing
This is the only code in the tree that uses GitCommand asynchronously.
Rewrite it to use multiprocessing.Pool as it makes the code a little
bit easier to understand and simpler.

Change-Id: I3ed3b037f24aa1e9dfe8eec9ec21815cdda7678a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297143
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Michael Mortensen <mmortensen@google.com>
2021-02-18 07:11:07 +00:00
84230009ee project: make diff tools synchronous
These are the only users in the tree that process the output as it's
produced.  All others capture all the output first and then process
the results.  However, these functions still don't fully return until
they've finished processing, and these funcs are in turn used in other
synchronous code paths.  So it's unclear whether anyone will notice
that it's slightly slower or less interactive.  Let's try it out and
see if users report issues.

This will allow us to simplify our custom GitCommand code and move it
over to Python's subprocess.run, and will help fix interleaved output
when running multiple commands in parallel (e.g. `repo diff -j8`).

Change-Id: Ida16fafc47119d30a629a8783babeba890515de0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297144
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jonathan Nieder <jrn@google.com>
2021-02-18 03:54:30 +00:00
f37b9827a9 git_command: rework stdin handling
We only provide input to GitCommand in one place, so inline the logic
to be more synchronous and similar to subprocess.run.  This makes the
code simpler and easier to understand.

Change-Id: Ibe498fedf608774bae1f807fc301eb67841c468b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297142
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-17 15:15:16 +00:00
c47a235bc5 trim redundant pass statements
Clean up a few linter warnings.

Change-Id: I531d0263a202435d32d83d87ec24998f4051639c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297062
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-16 19:23:00 +00:00
f307916f22 git_command: use subprocess.run for version info
The code is a bit simpler & easier to reason about.

Change-Id: If125ea7d776cdfa38a0440a2b03583de079c4839
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297023
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-16 16:26:43 +00:00
fb21d6ab64 sync: use subprocess.run to verify tags
The code is a bit simpler & easier to reason about.

Change-Id: I149729c7d01434b08b58cc9715dcf0f0d11201c2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/297022
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-02-16 16:26:41 +00:00
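The general shape of these subprocess.run conversions looks like this (a generic sketch, not the repo's exact invocation -- the git command in the comment is only an example):

```python
import subprocess
import sys

def run_capture(cmd):
    """Run |cmd| synchronously, returning (returncode, stdout, stderr)."""
    result = subprocess.run(
        cmd,
        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
        universal_newlines=True,  # decode bytes to str
        check=False)              # caller inspects returncode itself
    return result.returncode, result.stdout, result.stderr

# A tag check might look like: run_capture(['git', 'verify-tag', tag])
code, out, _ = run_capture([sys.executable, '-c', 'print("ok")'])
print(code, out.strip())  # 0 ok
```

subprocess handles the pipe draining internally, which is what lets the hand-rolled poll loops mentioned in the earlier commits be deleted.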
46 changed files with 1945 additions and 1496 deletions

@@ -14,7 +14,7 @@ jobs:
       fail-fast: false
       matrix:
         os: [ubuntu-latest, macos-latest, windows-latest]
-        python-version: [3.6, 3.7, 3.8]
+        python-version: [3.5, 3.6, 3.7, 3.8, 3.9]
     runs-on: ${{ matrix.os }}
     steps:

@@ -12,6 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import multiprocessing
 import os
 import optparse
 import platform
@@ -21,6 +22,21 @@ import sys
 from event_log import EventLog
 from error import NoSuchProjectError
 from error import InvalidProjectGroupsError
+import progress
+
+
+# Number of projects to submit to a single worker process at a time.
+# This number represents a tradeoff between the overhead of IPC and finer
+# grained opportunity for parallelism.  This particular value was chosen by
+# iterating through powers of two until the overall performance no longer
+# improved.  The performance of this batch size is not a function of the
+# number of cores on the system.
+WORKER_BATCH_SIZE = 32
+
+# How many jobs to run in parallel by default?  This assumes the jobs are
+# largely I/O bound and do not hit the network.
+DEFAULT_LOCAL_JOBS = min(os.cpu_count(), 8)
+
 
 class Command(object):
@@ -32,6 +48,10 @@ class Command(object):
   manifest = None
   _optparse = None
 
+  # Whether this command supports running in parallel.  If greater than 0,
+  # it is the number of parallel jobs to default to.
+  PARALLEL_JOBS = None
+
   def WantPager(self, _opt):
     return False
@@ -66,12 +86,33 @@ class Command(object):
     usage = 'repo %s' % self.NAME
     epilog = 'Run `repo help %s` to view the detailed manual.' % self.NAME
     self._optparse = optparse.OptionParser(usage=usage, epilog=epilog)
+    self._CommonOptions(self._optparse)
     self._Options(self._optparse)
     return self._optparse
 
-  def _Options(self, p):
-    """Initialize the option parser.
+  def _CommonOptions(self, p, opt_v=True):
+    """Initialize the option parser with common options.
+
+    These will show up for *all* subcommands, so use sparingly.
+    NB: Keep in sync with repo:InitParser().
     """
+    g = p.add_option_group('Logging options')
+    opts = ['-v'] if opt_v else []
+    g.add_option(*opts, '--verbose',
+                 dest='output_mode', action='store_true',
+                 help='show all output')
+    g.add_option('-q', '--quiet',
+                 dest='output_mode', action='store_false',
+                 help='only show errors')
+
+    if self.PARALLEL_JOBS is not None:
+      p.add_option(
+          '-j', '--jobs',
+          type=int, default=self.PARALLEL_JOBS,
+          help='number of jobs to run in parallel (default: %s)' % self.PARALLEL_JOBS)
+
+  def _Options(self, p):
+    """Initialize the option parser with subcommand-specific options."""
 
   def _RegisteredEnvironmentOptions(self):
     """Get options that can be set from environment variables.
@@ -97,6 +138,11 @@ class Command(object):
       self.OptionParser.print_usage()
       sys.exit(1)
 
+  def CommonValidateOptions(self, opt, args):
+    """Validate common options."""
+    opt.quiet = opt.output_mode is False
+    opt.verbose = opt.output_mode is True
+
   def ValidateOptions(self, opt, args):
     """Validate the user options & arguments before executing.
@ -112,6 +158,44 @@ class Command(object):
""" """
raise NotImplementedError raise NotImplementedError
@staticmethod
def ExecuteInParallel(jobs, func, inputs, callback, output=None, ordered=False):
"""Helper for managing parallel execution boiler plate.
For subcommands that can easily split their work up.
Args:
jobs: How many parallel processes to use.
func: The function to apply to each of the |inputs|. Usually a
functools.partial for wrapping additional arguments. It will be run
in a separate process, so it must be pickalable, so nested functions
won't work. Methods on the subcommand Command class should work.
inputs: The list of items to process. Must be a list.
callback: The function to pass the results to for processing. It will be
executed in the main thread and process the results of |func| as they
become available. Thus it may be a local nested function. Its return
value is passed back directly. It takes three arguments:
- The processing pool (or None with one job).
- The |output| argument.
- An iterator for the results.
output: An output manager. May be progress.Progress or color.Coloring.
ordered: Whether the jobs should be processed in order.
Returns:
The |callback| function's results are returned.
"""
try:
# NB: Multiprocessing is heavy, so don't spin it up for one job.
if len(inputs) == 1 or jobs == 1:
return callback(None, output, (func(x) for x in inputs))
else:
with multiprocessing.Pool(jobs) as pool:
submit = pool.imap if ordered else pool.imap_unordered
return callback(pool, output, submit(func, inputs, chunksize=WORKER_BATCH_SIZE))
finally:
if isinstance(output, progress.Progress):
output.end()
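The shape of this helper can be sketched independently of repo: a top-level (picklable) worker function is fanned out over a pool, while a local callback consumes the result iterator in the main process. The names below are illustrative, not repo's API.

```python
import functools
import multiprocessing


def _scale(factor, x):
    """Top-level function so it stays picklable for worker processes."""
    return factor * x


def run_parallel(jobs, inputs, ordered=False):
    func = functools.partial(_scale, 10)

    def callback(pool, results):
        # Runs in the main process; may be a nested function.
        return sum(results)

    # Multiprocessing is heavy; don't spin up a pool for one unit of work.
    if jobs == 1 or len(inputs) == 1:
        return callback(None, (func(x) for x in inputs))
    with multiprocessing.Pool(jobs) as pool:
        submit = pool.imap if ordered else pool.imap_unordered
        return callback(pool, submit(func, inputs, chunksize=4))


if __name__ == '__main__':
    print(run_parallel(2, [1, 2, 3]))  # 60
```

With `imap_unordered` the callback sees results as workers finish; `ordered=True` trades a little throughput for input order.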
def _ResetPathToProjectMap(self, projects):
self._by_path = dict((p.worktree, p) for p in projects)
@ -155,9 +239,7 @@ class Command(object):
mp = manifest.manifestProject
if not groups:
groups = manifest.GetGroupsStr()
groups = [x for x in re.split(r'[,\s]+', groups) if x]
if not args:

completion.bash (new file)

@ -0,0 +1,121 @@
# Copyright 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Programmable bash completion. https://github.com/scop/bash-completion
# Complete the list of repo subcommands.
__complete_repo_list_commands() {
local repo=${COMP_WORDS[0]}
(
# Handle completions if running outside of a checkout.
if ! "${repo}" help --all 2>/dev/null; then
repo help 2>/dev/null
fi
) | sed -n '/^ /{s/ \([^ ]\+\) .\+/\1/;p}'
}
# Complete list of all branches available in all projects in the repo client
# checkout.
__complete_repo_list_branches() {
local repo=${COMP_WORDS[0]}
"${repo}" branches 2>/dev/null | \
sed -n '/|/{s/[ *][Pp ] *\([^ ]\+\) .*/\1/;p}'
}
# Complete list of all projects available in the repo client checkout.
__complete_repo_list_projects() {
local repo=${COMP_WORDS[0]}
"${repo}" list -n 2>/dev/null
}
# Complete the repo <command> argument.
__complete_repo_command() {
if [[ ${COMP_CWORD} -ne 1 ]]; then
return 1
fi
local command=${COMP_WORDS[1]}
COMPREPLY=($(compgen -W "$(__complete_repo_list_commands)" -- "${command}"))
return 0
}
# Complete repo subcommands that take <branch> <projects>.
__complete_repo_command_branch_projects() {
local current=$1
if [[ ${COMP_CWORD} -eq 2 ]]; then
COMPREPLY=($(compgen -W "$(__complete_repo_list_branches)" -- "${current}"))
else
COMPREPLY=($(compgen -W "$(__complete_repo_list_projects)" -- "${current}"))
fi
}
# Complete repo subcommands that take only <projects>.
__complete_repo_command_projects() {
local current=$1
COMPREPLY=($(compgen -W "$(__complete_repo_list_projects)" -- "${current}"))
}
# Complete the repo subcommand arguments.
__complete_repo_arg() {
if [[ ${COMP_CWORD} -le 1 ]]; then
return 1
fi
local command=${COMP_WORDS[1]}
local current=${COMP_WORDS[COMP_CWORD]}
case ${command} in
abandon|checkout)
__complete_repo_command_branch_projects "${current}"
return 0
;;
branch|branches|diff|info|list|overview|prune|rebase|smartsync|stage|status|\
sync|upload)
__complete_repo_command_projects "${current}"
return 0
;;
help)
if [[ ${COMP_CWORD} -eq 2 ]]; then
COMPREPLY=(
$(compgen -W "$(__complete_repo_list_commands)" -- "${current}")
)
fi
return 0
;;
start)
if [[ ${COMP_CWORD} -gt 2 ]]; then
COMPREPLY=(
$(compgen -W "$(__complete_repo_list_projects)" -- "${current}")
)
fi
return 0
;;
*)
return 1
;;
esac
}
# Complete the repo arguments.
__complete_repo() {
COMPREPLY=()
__complete_repo_command && return 0
__complete_repo_arg && return 0
return 0
}
complete -F __complete_repo repo


@ -93,6 +93,23 @@ support, see the [manifest-format.md] file.
### Project objects
*** note
**Warning**: Please do not use repo's approach to projects/ & project-objects/
layouts as a model for other tools to implement similar approaches.
It has a number of known downsides like:
* [Symlinks do not work well under Windows](./windows.md).
* Git sometimes replaces symlinks under .git/ with real files (under unknown
circumstances), and then the internal state gets out of sync, and data loss
may ensue.
* When sharing project-objects between multiple project checkouts, Git might
automatically run `gc` or `prune` which may lead to data loss or corruption
(since those operate on leaf projects and miss refs in other leaves). See
https://gerrit-review.googlesource.com/c/git-repo/+/254392 for more details.
Instead, you should use standard Git workflows like [git worktree] or
[gitsubmodules] with [superprojects].
***
* `project.list`: Tracking file used by `repo sync` to determine when projects
are added or removed and need corresponding updates in the checkout.
* `projects/`: Bare checkouts of every project synced by the manifest. The
@ -121,7 +138,7 @@ support, see the [manifest-format.md] file.
(i.e. the path on the remote server) with a `.git` suffix. This has the
same advantages as the `project-objects/` layout above.
This is used when [git worktree]'s are enabled.
### Global settings
@ -131,7 +148,7 @@ Most settings use the `[repo]` section to avoid conflicts with git.
User controlled settings are initialized when running `repo init`.
| Setting | `repo init` Option | Use/Meaning |
|-------------------|---------------------------|-------------|
| manifest.groups | `--groups` & `--platform` | The manifest groups to sync |
| repo.archive | `--archive` | Use `git archive` for checkouts |
| repo.clonebundle | `--clone-bundle` | Whether the initial sync used clone.bundle explicitly |
@ -140,10 +157,11 @@ User controlled settings are initialized when running `repo init`.
| repo.dissociate | `--dissociate` | Dissociate from any reference/mirrors after initial clone |
| repo.mirror | `--mirror` | Checkout is a repo mirror |
| repo.partialclone | `--partial-clone` | Create [partial git clones] |
| repo.partialcloneexclude | `--partial-clone-exclude` | Comma-delimited list of project names (not paths) to exclude while using [partial git clones] |
| repo.reference | `--reference` | Reference repo client checkout |
| repo.submodules | `--submodules` | Sync git submodules |
| repo.superproject | `--use-superproject` | Sync [superproject] |
| repo.worktree | `--worktree` | Use [git worktree] for checkouts |
| user.email | `--config-name` | User's e-mail address; Copied into `.git/config` when checking out a new project |
| user.name | `--config-name` | User's name; Copied into `.git/config` when checking out a new project |
@ -228,7 +246,10 @@ Repo will create & maintain a few files in the user's home directory.
[git-config]: https://git-scm.com/docs/git-config
[git worktree]: https://git-scm.com/docs/git-worktree
[gitsubmodules]: https://git-scm.com/docs/gitsubmodules
[manifest-format.md]: ./manifest-format.md
[local manifests]: ./manifest-format.md#Local-Manifests
[superprojects]: https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
[topic]: https://gerrit-review.googlesource.com/Documentation/intro-user.html#topics
[upload-notify]: https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify


@ -21,6 +21,7 @@ following DTD:
```xml
<!DOCTYPE manifest [
<!ELEMENT manifest (notice?,
remote*,
default?,
@ -252,12 +253,25 @@ name will be prefixed by the parent's.
The project name must match the name Gerrit knows, if Gerrit is
being used for code reviews.
"name" must not be empty, and may not be an absolute path or use "." or ".."
path components. It is always interpreted relative to the remote's fetch
settings, so if a different base path is needed, declare a different remote
with the new settings needed.
These restrictions are not enforced for [Local Manifests].
Attribute `path`: An optional path relative to the top directory
of the repo client where the Git working directory for this project
should be placed. If not supplied the project "name" is used.
If the project has a parent element, its path will be prefixed
by the parent's.
"path" may not be an absolute path or use "." or ".." path components.
These restrictions are not enforced for [Local Manifests].
If you want to place files into the root of the checkout (e.g. a README or
Makefile or another build script), use the [copyfile] or [linkfile] elements
instead.
Attribute `remote`: Name of a previously defined remote element.
If not supplied the remote given by the default element is used.
@ -419,12 +433,15 @@ target manifest to include - it must be a usable manifest on its own.
Attribute `name`: the manifest to include, specified relative to
the manifest repository's root.
"name" may not be an absolute path or use "." or ".." path components.
These restrictions are not enforced for [Local Manifests].
Attribute `groups`: List of additional groups to which all projects
in the included manifest belong. This appends and recurses, meaning
all projects in sub-manifests carry all parent include groups.
Same syntax as the corresponding element of `project`.
## Local Manifests {#local-manifests}
Additional remotes and projects may be added through local manifest
files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.
@ -452,3 +469,8 @@ Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will
be loaded in alphabetical order.
The legacy `$TOP_DIR/.repo/local_manifest.xml` path is no longer supported.
[copyfile]: #Element-copyfile
[linkfile]: #Element-linkfile
[Local Manifests]: #local-manifests


@ -22,12 +22,12 @@ class ManifestParseError(Exception):
""" """
class ManifestInvalidRevisionError(Exception): class ManifestInvalidRevisionError(ManifestParseError):
"""The revision value in a project is incorrect. """The revision value in a project is incorrect.
""" """
class ManifestInvalidPathError(Exception): class ManifestInvalidPathError(ManifestParseError):
"""A path used in <copyfile> or <linkfile> is incorrect. """A path used in <copyfile> or <linkfile> is incorrect.
""" """
@ -37,7 +37,7 @@ class NoManifestException(Exception):
""" """
def __init__(self, path, reason): def __init__(self, path, reason):
super(NoManifestException, self).__init__() super().__init__(path, reason)
self.path = path self.path = path
self.reason = reason self.reason = reason
@ -50,7 +50,7 @@ class EditorError(Exception):
""" """
def __init__(self, reason): def __init__(self, reason):
super(EditorError, self).__init__() super().__init__(reason)
self.reason = reason self.reason = reason
def __str__(self): def __str__(self):
@ -62,7 +62,7 @@ class GitError(Exception):
""" """
def __init__(self, command): def __init__(self, command):
super(GitError, self).__init__() super().__init__(command)
self.command = command self.command = command
def __str__(self): def __str__(self):
@ -74,7 +74,7 @@ class UploadError(Exception):
""" """
def __init__(self, reason): def __init__(self, reason):
super(UploadError, self).__init__() super().__init__(reason)
self.reason = reason self.reason = reason
def __str__(self): def __str__(self):
@ -86,7 +86,7 @@ class DownloadError(Exception):
""" """
def __init__(self, reason): def __init__(self, reason):
super(DownloadError, self).__init__() super().__init__(reason)
self.reason = reason self.reason = reason
def __str__(self): def __str__(self):
@ -98,7 +98,7 @@ class NoSuchProjectError(Exception):
""" """
def __init__(self, name=None): def __init__(self, name=None):
super(NoSuchProjectError, self).__init__() super().__init__(name)
self.name = name self.name = name
def __str__(self): def __str__(self):
@ -112,7 +112,7 @@ class InvalidProjectGroupsError(Exception):
""" """
def __init__(self, name=None): def __init__(self, name=None):
super(InvalidProjectGroupsError, self).__init__() super().__init__(name)
self.name = name self.name = name
def __str__(self): def __str__(self):
@ -128,7 +128,7 @@ class RepoChangedException(Exception):
""" """
def __init__(self, extra_args=None): def __init__(self, extra_args=None):
super(RepoChangedException, self).__init__() super().__init__(extra_args)
self.extra_args = extra_args or [] self.extra_args = extra_args or []
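One practical effect of switching from `super(Foo, self).__init__()` to `super().__init__(args...)` is that forwarding the arguments to the base `Exception` makes instances survive a pickle round trip, which multiprocessing workers depend on when propagating errors. A minimal sketch (the class names here are illustrative, not repo's):

```python
import pickle


class BadError(Exception):
    """Drops its args: unpickling re-calls BadError() with no arguments."""
    def __init__(self, reason):
        super().__init__()  # args not forwarded
        self.reason = reason


class GoodError(Exception):
    """Forwards args to Exception, so it round-trips through pickle."""
    def __init__(self, reason):
        super().__init__(reason)
        self.reason = reason


err = pickle.loads(pickle.dumps(GoodError('boom')))
print(err.reason)  # boom
```

Attempting the same round trip with `BadError` raises a `TypeError`, because pickling records `exc.args` (empty here) and replays them into the constructor.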


@ -162,11 +162,10 @@ def RepoSourceVersion():
proj = os.path.dirname(os.path.abspath(__file__))
env[GIT_DIR] = os.path.join(proj, '.git')
result = subprocess.run([GIT, 'describe', HEAD], stdout=subprocess.PIPE,
encoding='utf-8', env=env, check=False)
if result.returncode == 0:
ver = result.stdout.strip()
if ver.startswith('v'):
ver = ver[1:]
else:
@ -250,7 +249,7 @@ class GitCommand(object):
project,
cmdv,
bare=False,
input=None,
capture_stdout=False,
capture_stderr=False,
merge_output=False,
@ -260,9 +259,6 @@
gitdir=None):
env = self._GetBasicEnv()
if disable_editor:
env['GIT_EDITOR'] = ':'
if ssh_proxy:
@ -289,6 +285,9 @@ class GitCommand(object):
command = [GIT]
if bare:
if gitdir:
# Git on Windows wants its paths only using / for reliability.
if platform_utils.isWindows():
gitdir = gitdir.replace('\\', '/')
env[GIT_DIR] = gitdir
cwd = None
command.append(cmdv[0])
@ -299,13 +298,10 @@
command.append('--progress')
command.extend(cmdv[1:])
stdin = subprocess.PIPE if input else None
stdout = subprocess.PIPE if capture_stdout else None
stderr = (subprocess.STDOUT if merge_output else
(subprocess.PIPE if capture_stderr else None))
if IsTrace():
global LAST_CWD
@ -341,6 +337,8 @@
p = subprocess.Popen(command,
cwd=cwd,
env=env,
encoding='utf-8',
errors='backslashreplace',
stdin=stdin,
stdout=stdout,
stderr=stderr)
@ -351,7 +349,17 @@
_add_ssh_client(p)
self.process = p
if input:
if isinstance(input, str):
input = input.encode('utf-8')
p.stdin.write(input)
p.stdin.close()
try:
self.stdout, self.stderr = p.communicate()
finally:
_remove_ssh_client(p)
self.rc = p.wait()
@staticmethod
def _GetBasicEnv():
@ -371,36 +379,4 @@ class GitCommand(object):
return env
def Wait(self):
return self.rc


@ -145,6 +145,21 @@ class GitConfig(object):
except ValueError:
return None
def DumpConfigDict(self):
"""Returns the current configuration dict.
Configuration data is information only (e.g. logging) and
should not be considered a stable data-source.
Returns:
dict of {<key>, <value>} for git configuration cache.
<value> are strings converted by GetString.
"""
config_dict = {}
for key in self._cache:
config_dict[key] = self.GetString(key)
return config_dict
def GetBoolean(self, name):
"""Returns a boolean from the configuration file.
None : The value was not defined, or is not a boolean.
@ -444,6 +459,11 @@ def init_ssh():
def _open_ssh(host, port=None):
global _ssh_master
# Bail before grabbing the lock if we already know that we aren't going to
# try creating new masters below.
if sys.platform in ('win32', 'cygwin'):
return False
# Acquire the lock. This is needed to prevent opening multiple masters for
# the same host when we're running "repo sync -jN" (for N > 1) _and_ the
# manifest <remote fetch="ssh://xyz"> specifies a different host from the
@ -461,11 +481,8 @@ def _open_ssh(host, port=None):
if key in _master_keys:
return True
if not _ssh_master or 'GIT_SSH' in os.environ:
# Failed earlier, so don't retry.
return False
# We will make two calls to ssh; this is the common part of both calls.


@ -131,11 +131,14 @@ class GitRefs(object):
base = os.path.join(self._gitdir, prefix)
for name in platform_utils.listdir(base):
p = os.path.join(base, name)
# We don't implement the full ref validation algorithm, just the simple
# rules that would show up in local filesystems.
# https://git-scm.com/docs/git-check-ref-format
if name.startswith('.') or name.endswith('.lock'):
pass
elif platform_utils.isdir(p):
self._mtime[prefix] = os.path.getmtime(base)
self._ReadLoose(prefix + name + '/')
else:
self._ReadLoose1(p, prefix + name)
@ -144,7 +147,7 @@ class GitRefs(object):
with open(path) as fd:
mtime = os.path.getmtime(path)
ref_id = fd.readline()
except (OSError, UnicodeError):
return
try:


@ -22,13 +22,13 @@ Examples:
project_commit_ids = superproject.UpdateProjectsRevisionId(projects)
"""
import hashlib
import os
import sys
from error import BUG_REPORT_URL
from git_command import GitCommand
from git_refs import R_HEADS
_SUPERPROJECT_GIT_NAME = 'superproject.git'
_SUPERPROJECT_MANIFEST_NAME = 'superproject_override.xml'
@ -37,11 +37,12 @@ _SUPERPROJECT_MANIFEST_NAME = 'superproject_override.xml'
class Superproject(object):
"""Get commit ids from superproject.
Initializes a local copy of a superproject for the manifest. This allows
lookup of commit ids for all projects. It contains _project_commit_ids which
is a dictionary with project/commit id entries.
"""
def __init__(self, manifest, repodir, superproject_dir='exp-superproject',
quiet=False):
"""Initializes superproject. """Initializes superproject.
Args: Args:
@ -49,17 +50,23 @@ class Superproject(object):
repodir: Path to the .repo/ dir for holding all internal checkout state.
It must be in the top directory of the repo client checkout.
superproject_dir: Relative path under |repodir| to checkout superproject.
quiet: If True then only print the progress messages.
"""
self._project_commit_ids = None
self._manifest = manifest
self._quiet = quiet
self._branch = self._GetBranch()
self._repodir = os.path.abspath(repodir)
self._superproject_dir = superproject_dir
self._superproject_path = os.path.join(self._repodir, superproject_dir)
self._manifest_path = os.path.join(self._superproject_path,
_SUPERPROJECT_MANIFEST_NAME)
git_name = ''
if self._manifest.superproject:
remote_name = self._manifest.superproject['remote'].name
git_name = hashlib.md5(remote_name.encode('utf8')).hexdigest() + '-'
self._work_git_name = git_name + _SUPERPROJECT_GIT_NAME
self._work_git = os.path.join(self._superproject_path, self._work_git_name)
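Hashing the remote name into the bare checkout's directory name keeps superprojects fetched from different remotes from colliding inside the shared `exp-superproject` directory. The naming scheme is easy to reproduce standalone (the helper name and remote value below are illustrative):

```python
import hashlib


def superproject_git_name(remote_name):
    """Mirror of the scheme: '<md5(remote)>-superproject.git'."""
    prefix = hashlib.md5(remote_name.encode('utf8')).hexdigest() + '-'
    return prefix + 'superproject.git'


print(superproject_git_name('aosp'))
```

MD5 is used here only as a stable, filesystem-safe fingerprint of the remote name, not for any security property.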
@property
def project_commit_ids(self):
@ -77,20 +84,18 @@ class Superproject(object):
branch = branch[len(R_HEADS):]
return branch
def _Init(self):
"""Sets up a local Git repository to get a copy of a superproject.
Returns:
True if initialization is successful, or False.
"""
if not os.path.exists(self._superproject_path):
os.mkdir(self._superproject_path)
if not self._quiet and not os.path.exists(self._work_git):
print('%s: Performing initial setup for superproject; this might take '
'several minutes.' % self._work_git)
cmd = ['init', '--bare', self._work_git_name]
p = GitCommand(None,
cmd,
cwd=self._superproject_path,
@ -98,24 +103,27 @@ class Superproject(object):
capture_stderr=True)
retval = p.Wait()
if retval:
print('repo: error: git init call failed with return code: %r, stderr: %r' %
(retval, p.stderr), file=sys.stderr)
return False
return True
def _Fetch(self, url):
"""Fetches a local copy of a superproject for the manifest based on url.
Args:
url: superproject's url.
Returns:
True if fetch is successful, or False.
"""
if not os.path.exists(self._work_git):
print('git fetch missing directory: %s' % self._work_git,
file=sys.stderr)
return False
cmd = ['fetch', url, '--depth', '1', '--force', '--no-tags', '--filter', 'blob:none']
if self._branch:
cmd += [self._branch + ':' + self._branch]
p = GitCommand(None,
cmd,
cwd=self._work_git,
@ -129,7 +137,7 @@ class Superproject(object):
return True
def _LsTree(self):
"""Gets the commit ids for all projects.
Works only in git repositories.
@ -153,14 +161,12 @@
if retval == 0:
data = p.stdout
else:
print('repo: error: git ls-tree call failed with return code: %r, stderr: %r' % (
retval, p.stderr), file=sys.stderr)
return data
def Sync(self):
"""Gets a local copy of a superproject for the manifest.
Returns:
True if sync of superproject is successful, or False.
@ -179,17 +185,12 @@
file=sys.stderr)
return False
if not self._Init():
return False
if not self._Fetch(url):
return False
if not self._quiet:
print('%s: Initial setup for superproject completed.' % self._work_git)
return True
def _GetAllProjectsCommitIds(self):
@ -203,7 +204,8 @@
data = self._LsTree()
if not data:
print('error: git ls-tree failed to return data for superproject',
file=sys.stderr)
return None
# Parse lines like the following to select lines starting with '160000' and # Parse lines like the following to select lines starting with '160000' and
@ -233,7 +235,7 @@ class Superproject(object):
self._superproject_path,
file=sys.stderr)
return None
manifest_str = self._manifest.ToXml(groups=self._manifest.GetGroupsStr()).toxml()
manifest_path = self._manifest_path
try:
with open(manifest_path, 'w', encoding='utf-8') as fp:


@ -132,6 +132,33 @@ class EventLog(object):
exit_event['code'] = result
self._log.append(exit_event)
def CommandEvent(self, name, subcommands):
"""Append a 'command' event to the current log.
Args:
name: Name of the primary command (ex: repo, git)
subcommands: List of the sub-commands (ex: version, init, sync)
"""
command_event = self._CreateEventDict('command')
command_event['name'] = name
command_event['subcommands'] = subcommands
self._log.append(command_event)
def DefParamRepoEvents(self, config):
"""Append a 'def_param' event for each repo.* config key to the current log.
Args:
config: Repo configuration dictionary
"""
# Only output the repo.* config parameters.
repo_config = {k: v for k, v in config.items() if k.startswith('repo.')}
for param, value in repo_config.items():
def_param_event = self._CreateEventDict('def_param')
def_param_event['param'] = param
def_param_event['value'] = value
self._log.append(def_param_event)
def _GetEventTargetPath(self): def _GetEventTargetPath(self):
"""Get the 'trace2.eventtarget' path from git configuration. """Get the 'trace2.eventtarget' path from git configuration.
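The `DefParamRepoEvents` addition filters the full git config down to `repo.*` keys before emitting one event per key. That filtering can be exercised standalone (a minimal sketch; the flat event dict here is illustrative, not trace2's full JSON schema):

```python
def repo_def_params(config):
    """Return one 'def_param' event dict per repo.* config key."""
    return [{'event': 'def_param', 'param': k, 'value': v}
            for k, v in config.items()
            if k.startswith('repo.')]

params = repo_def_params({'repo.mirror': 'false',
                          'user.name': 'someone',
                          'repo.partialclone': 'true'})
```

Non-`repo.*` keys such as `user.name` are dropped, so personal config never reaches the trace2 log.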

View File

@@ -13,6 +13,7 @@
 # limitations under the License.
 
 import os
+import multiprocessing
 import platform
 import re
 import sys
@@ -35,6 +36,15 @@ def parse_clientdir(gitc_fs_path):
   return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path)
 
+def _get_project_revision(args):
+  """Worker for _set_project_revisions to lookup one project remote."""
+  (i, url, expr) = args
+  gitcmd = git_command.GitCommand(
+      None, ['ls-remote', url, expr], capture_stdout=True, cwd='/tmp')
+  rc = gitcmd.Wait()
+  return (i, rc, gitcmd.stdout.split('\t', 1)[0])
+
 def _set_project_revisions(projects):
   """Sets the revisionExpr for a list of projects.
@@ -47,38 +57,24 @@ def _set_project_revisions(projects):
   """
   # Retrieve the commit id for each project based off of it's current
   # revisionExpr and it is not already a commit id.
-  project_gitcmds = [(
-      project, git_command.GitCommand(None,
-                                      ['ls-remote',
-                                       project.remote.url,
-                                       project.revisionExpr],
-                                      capture_stdout=True, cwd='/tmp'))
-      for project in projects if not git_config.IsId(project.revisionExpr)]
-  for proj, gitcmd in project_gitcmds:
-    if gitcmd.Wait():
-      print('FATAL: Failed to retrieve revisionExpr for %s' % proj)
-      sys.exit(1)
-    revisionExpr = gitcmd.stdout.split('\t')[0]
-    if not revisionExpr:
-      raise ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
-                               (proj.remote.url, proj.revisionExpr))
-    proj.revisionExpr = revisionExpr
-
-
-def _manifest_groups(manifest):
-  """Returns the manifest group string that should be synced
-
-  This is the same logic used by Command.GetProjects(), which is used during
-  repo sync
-
-  Args:
-    manifest: The XmlManifest object
-  """
-  mp = manifest.manifestProject
-  groups = mp.config.GetString('manifest.groups')
-  if not groups:
-    groups = 'default,platform-' + platform.system().lower()
-  return groups
+  with multiprocessing.Pool(NUM_BATCH_RETRIEVE_REVISIONID) as pool:
+    results_iter = pool.imap_unordered(
+        _get_project_revision,
+        ((i, project.remote.url, project.revisionExpr)
+         for i, project in enumerate(projects)
+         if not git_config.IsId(project.revisionExpr)),
+        chunksize=8)
+    for (i, rc, revisionExpr) in results_iter:
+      project = projects[i]
+      if rc:
+        print('FATAL: Failed to retrieve revisionExpr for %s' % project.name)
+        pool.terminate()
+        sys.exit(1)
+      if not revisionExpr:
+        pool.terminate()
+        raise ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
+                                 (project.remote.url, project.revisionExpr))
+      project.revisionExpr = revisionExpr
 
 def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
@@ -95,7 +91,7 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
   if paths is None:
     paths = list(manifest.paths.keys())
 
-  groups = [x for x in re.split(r'[,\s]+', _manifest_groups(manifest)) if x]
+  groups = [x for x in re.split(r'[,\s]+', manifest.GetGroupsStr()) if x]
 
   # Convert the paths to projects, and filter them to the matched groups.
   projects = [manifest.paths[p] for p in paths]
@@ -123,11 +119,7 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
   else:
     proj.revisionExpr = gitc_proj.revisionExpr
 
-  index = 0
-  while index < len(projects):
-    _set_project_revisions(
-        projects[index:(index + NUM_BATCH_RETRIEVE_REVISIONID)])
-    index += NUM_BATCH_RETRIEVE_REVISIONID
+  _set_project_revisions(projects)
 
   if gitc_manifest is not None:
     for path, proj in gitc_manifest.paths.items():
@@ -158,7 +150,7 @@ def save_manifest(manifest, client_dir=None):
   else:
     manifest_file = os.path.join(client_dir, '.manifest')
   with open(manifest_file, 'w') as f:
-    manifest.Save(f, groups=_manifest_groups(manifest))
+    manifest.Save(f, groups=manifest.GetGroupsStr())
   # TODO(sbasi/jorg): Come up with a solution to remove the sleep below.
   # Give the GITC filesystem time to register the manifest changes.
   time.sleep(3)
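The switch above from serial `ls-remote` calls to `Pool.imap_unordered` depends on passing each project's index through the worker, so out-of-order results can be matched back to their projects. The pattern, with a stand-in worker instead of git, and using `multiprocessing.dummy` (the thread-backed twin of the `Pool` API) so the sketch runs anywhere without fork/spawn pickling concerns:

```python
from multiprocessing.dummy import Pool  # same API as multiprocessing.Pool

def _square(args):
    """Worker: return (index, rc, result), like _get_project_revision."""
    i, value = args
    return (i, 0, value * value)

def run_indexed(values):
    """Fan work out with imap_unordered, then match results by index."""
    results = [None] * len(values)
    with Pool(4) as pool:
        work = ((i, v) for i, v in enumerate(values))
        for (i, rc, res) in pool.imap_unordered(_square, work, chunksize=2):
            if rc:
                pool.terminate()  # fail fast, as the diff does on git errors
                raise RuntimeError('worker %d failed' % i)
            results[i] = res
    return results
```

Because `imap_unordered` yields results in completion order, the index in the tuple is the only reliable way to write each result back to the right slot.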

View File

@@ -254,8 +254,10 @@ class _Repo(object):
     cmd_event = cmd.event_log.Add(name, event_log.TASK_COMMAND, start)
     cmd.event_log.SetParent(cmd_event)
     git_trace2_event_log.StartEvent()
+    git_trace2_event_log.CommandEvent(name='repo', subcommands=[name])
 
     try:
+      cmd.CommonValidateOptions(copts, cargs)
       cmd.ValidateOptions(copts, cargs)
       result = cmd.Execute(copts, cargs)
     except (DownloadError, ManifestInvalidRevisionError,
@@ -297,6 +299,8 @@ class _Repo(object):
       cmd.event_log.FinishEvent(cmd_event, finish,
                                 result is None or result == 0)
+      git_trace2_event_log.DefParamRepoEvents(
+          cmd.manifest.manifestProject.config.DumpConfigDict())
       git_trace2_event_log.ExitEvent(result)
 
       if gopts.event_log:

View File

@@ -14,6 +14,7 @@
 
 import itertools
 import os
+import platform
 import re
 import sys
 import xml.dom.minidom
@@ -533,7 +534,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
   def _output_manifest_project_extras(self, p, e):
     """Manifests can modify e if they support extra project attributes."""
-    pass
 
   @property
   def paths(self):
@@ -589,6 +589,12 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
       return self.manifestProject.config.GetString('repo.clonefilter')
     return None
 
+  @property
+  def PartialCloneExclude(self):
+    exclude = self.manifest.manifestProject.config.GetString(
+        'repo.partialcloneexclude') or ''
+    return set(x.strip() for x in exclude.split(','))
+
   @property
   def IsMirror(self):
     return self.manifestProject.config.GetBoolean('repo.mirror')
@@ -605,6 +611,17 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
   def HasSubmodules(self):
     return self.manifestProject.config.GetBoolean('repo.submodules')
 
+  def GetDefaultGroupsStr(self):
+    """Returns the default group string for the platform."""
+    return 'default,platform-' + platform.system().lower()
+
+  def GetGroupsStr(self):
+    """Returns the manifest group string that should be synced."""
+    groups = self.manifestProject.config.GetString('manifest.groups')
+    if not groups:
+      groups = self.GetDefaultGroupsStr()
+    return groups
+
   def _Unload(self):
     self._loaded = False
     self._projects = {}
@@ -625,16 +642,22 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
         b = b[len(R_HEADS):]
       self.branch = b
 
+      # The manifestFile was specified by the user which is why we allow include
+      # paths to point anywhere.
       nodes = []
-      nodes.append(self._ParseManifestXml(self.manifestFile,
-                                          self.manifestProject.worktree))
+      nodes.append(self._ParseManifestXml(
+          self.manifestFile, self.manifestProject.worktree,
+          restrict_includes=False))
 
       if self._load_local_manifests and self.local_manifests:
         try:
           for local_file in sorted(platform_utils.listdir(self.local_manifests)):
             if local_file.endswith('.xml'):
               local = os.path.join(self.local_manifests, local_file)
-              nodes.append(self._ParseManifestXml(local, self.repodir))
+              # Since local manifests are entirely managed by the user, allow
+              # them to point anywhere the user wants.
+              nodes.append(self._ParseManifestXml(
+                  local, self.repodir, restrict_includes=False))
         except OSError:
           pass
@@ -652,7 +675,19 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
 
     self._loaded = True
 
-  def _ParseManifestXml(self, path, include_root, parent_groups=''):
+  def _ParseManifestXml(self, path, include_root, parent_groups='',
+                        restrict_includes=True):
+    """Parse a manifest XML and return the computed nodes.
+
+    Args:
+      path: The XML file to read & parse.
+      include_root: The path to interpret include "name"s relative to.
+      parent_groups: The groups to apply to this projects.
+      restrict_includes: Whether to constrain the "name" attribute of includes.
+
+    Returns:
+      List of XML nodes.
+    """
     try:
       root = xml.dom.minidom.parse(path)
     except (OSError, xml.parsers.expat.ExpatError) as e:
@@ -671,6 +706,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
     for node in manifest.childNodes:
       if node.nodeName == 'include':
         name = self._reqatt(node, 'name')
+        if restrict_includes:
+          msg = self._CheckLocalPath(name)
+          if msg:
+            raise ManifestInvalidPathError(
+                '<include> invalid "name": %s: %s' % (name, msg))
         include_groups = ''
         if parent_groups:
           include_groups = parent_groups
@@ -678,13 +718,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
           include_groups = node.getAttribute('groups') + ',' + include_groups
         fp = os.path.join(include_root, name)
         if not os.path.isfile(fp):
-          raise ManifestParseError("include %s doesn't exist or isn't a file"
-                                   % (name,))
+          raise ManifestParseError("include [%s/]%s doesn't exist or isn't a file"
+                                   % (include_root, name))
         try:
           nodes.extend(self._ParseManifestXml(fp, include_root, include_groups))
         # should isolate this to the exact exception, but that's
         # tricky.  actual parsing implementation may vary.
-        except (KeyboardInterrupt, RuntimeError, SystemExit):
+        except (KeyboardInterrupt, RuntimeError, SystemExit, ManifestParseError):
           raise
         except Exception as e:
           raise ManifestParseError(
@@ -980,6 +1020,10 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
     reads a <project> element from the manifest file
     """
     name = self._reqatt(node, 'name')
+    msg = self._CheckLocalPath(name, dir_ok=True)
+    if msg:
+      raise ManifestInvalidPathError(
+          '<project> invalid "name": %s: %s' % (name, msg))
     if parent:
       name = self._JoinName(parent.name, name)
@@ -1000,9 +1044,12 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
     path = node.getAttribute('path')
     if not path:
       path = name
-    if path.startswith('/'):
-      raise ManifestParseError("project %s path cannot be absolute in %s" %
-                               (name, self.manifestFile))
+    else:
+      # NB: The "." project is handled specially in Project.Sync_LocalHalf.
+      msg = self._CheckLocalPath(path, dir_ok=True, cwd_dot_ok=True)
+      if msg:
+        raise ManifestInvalidPathError(
+            '<project> invalid "path": %s: %s' % (path, msg))
 
     rebase = XmlBool(node, 'rebase', True)
     sync_c = XmlBool(node, 'sync-c', False)
@@ -1122,8 +1169,33 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
     return relpath, worktree, gitdir, objdir
 
   @staticmethod
-  def _CheckLocalPath(path, symlink=False):
-    """Verify |path| is reasonable for use in <copyfile> & <linkfile>."""
+  def _CheckLocalPath(path, dir_ok=False, cwd_dot_ok=False):
+    """Verify |path| is reasonable for use in filesystem paths.
+
+    Used with <copyfile> & <linkfile> & <project> elements.
+
+    This only validates the |path| in isolation: it does not check against the
+    current filesystem state.  Thus it is suitable as a first-pass in a parser.
+
+    It enforces a number of constraints:
+    * No empty paths.
+    * No "~" in paths.
+    * No Unicode codepoints that filesystems might elide when normalizing.
+    * No relative path components like "." or "..".
+    * No absolute paths.
+    * No ".git" or ".repo*" path components.
+
+    Args:
+      path: The path name to validate.
+      dir_ok: Whether |path| may force a directory (e.g. end in a /).
+      cwd_dot_ok: Whether |path| may be just ".".
+
+    Returns:
+      None if |path| is OK, a failure message otherwise.
+    """
+    if not path:
+      return 'empty paths not allowed'
+
     if '~' in path:
       return '~ not allowed (due to 8.3 filenames on Windows filesystems)'
@@ -1162,16 +1234,18 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
     # our constructed logic here.  Especially since manifest authors only use
     # / in their paths.
     resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
-    parts = resep.split(path)
+    # Strip off trailing slashes as those only produce '' elements, and we use
+    # parts to look for individual bad components.
+    parts = resep.split(path.rstrip('/'))
 
     # Some people use src="." to create stable links to projects.  Lets allow
     # that but reject all other uses of "." to keep things simple.
-    if parts != ['.']:
+    if not cwd_dot_ok or parts != ['.']:
       for part in set(parts):
         if part in {'.', '..', '.git'} or part.startswith('.repo'):
           return 'bad component: %s' % (part,)
 
-    if not symlink and resep.match(path[-1]):
+    if not dir_ok and resep.match(path[-1]):
       return 'dirs not allowed'
 
     # NB: The two abspath checks here are to handle platforms with multiple
@@ -1203,7 +1277,8 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
     # |src| is the file we read from or path we point to for symlinks.
     # It is relative to the top of the git project checkout.
-    msg = cls._CheckLocalPath(src, symlink=element == 'linkfile')
+    is_linkfile = element == 'linkfile'
+    msg = cls._CheckLocalPath(src, dir_ok=is_linkfile, cwd_dot_ok=is_linkfile)
     if msg:
       raise ManifestInvalidPathError(
           '<%s> invalid "src": %s: %s' % (element, src, msg))
@@ -1302,7 +1377,7 @@ class GitcManifest(XmlManifest):
   def _ParseProject(self, node, parent=None):
     """Override _ParseProject and add support for GITC specific attributes."""
-    return super(GitcManifest, self)._ParseProject(
+    return super()._ParseProject(
         node, parent=parent, old_revision=node.getAttribute('old-revision'))
 
   def _output_manifest_project_extras(self, p, e):
@@ -1326,7 +1401,7 @@ class RepoClient(XmlManifest):
     if manifest_file is None:
       manifest_file = os.path.join(repodir, MANIFEST_FILE_NAME)
     local_manifests = os.path.abspath(os.path.join(repodir, LOCAL_MANIFESTS_DIR_NAME))
-    super(RepoClient, self).__init__(repodir, manifest_file, local_manifests)
+    super().__init__(repodir, manifest_file, local_manifests)
 
     # TODO: Completely separate manifest logic out of the client.
     self.manifest = self
@@ -1341,6 +1416,5 @@ class GitcClient(RepoClient, GitcManifest):
     self.gitc_client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
                                         gitc_client_name)
-    super(GitcManifest, self).__init__(
-        repodir, os.path.join(self.gitc_client_dir, '.manifest'))
+    super().__init__(repodir, os.path.join(self.gitc_client_dir, '.manifest'))
     self.isGitcClient = True
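The `_CheckLocalPath` rules enumerated in its new docstring largely reduce to splitting the path on separators and rejecting bad components. A condensed sketch of just that component check (`check_components` is illustrative, not repo's full validator — it omits the Unicode-normalization and absolute-path rules):

```python
import re

def check_components(path, dir_ok=False, cwd_dot_ok=False):
    """Return None if |path| looks safe, else a failure message."""
    if not path:
        return 'empty paths not allowed'
    resep = re.compile(r'[/\\]')
    # Trailing slashes only produce '' elements; strip them before splitting.
    parts = resep.split(path.rstrip('/'))
    # Allow a bare "." only when the caller opts in (stable project links).
    if not cwd_dot_ok or parts != ['.']:
        for part in set(parts):
            if part in {'.', '..', '.git'} or part.startswith('.repo'):
                return 'bad component: %s' % (part,)
    # A trailing separator forces a directory; reject unless allowed.
    if not dir_ok and resep.match(path[-1]):
        return 'dirs not allowed'
    return None
```

Returning a message rather than raising lets the caller attach element-specific context (`<project>`, `<copyfile>`, `<include>`) to the eventual `ManifestInvalidPathError`.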

View File

@@ -15,11 +15,8 @@
 import errno
 import os
 import platform
-from queue import Queue
-import select
 import shutil
 import stat
-from threading import Thread
 
 
 def isWindows():
@@ -31,161 +28,6 @@ def isWindows():
   return platform.system() == "Windows"
 
-
-class FileDescriptorStreams(object):
-  """ Platform agnostic abstraction enabling non-blocking I/O over a
-  collection of file descriptors. This abstraction is required because
-  fctnl(os.O_NONBLOCK) is not supported on Windows.
-  """
-  @classmethod
-  def create(cls):
-    """ Factory method: instantiates the concrete class according to the
-    current platform.
-    """
-    if isWindows():
-      return _FileDescriptorStreamsThreads()
-    else:
-      return _FileDescriptorStreamsNonBlocking()
-
-  def __init__(self):
-    self.streams = []
-
-  def add(self, fd, dest, std_name):
-    """ Wraps an existing file descriptor as a stream.
-    """
-    self.streams.append(self._create_stream(fd, dest, std_name))
-
-  def remove(self, stream):
-    """ Removes a stream, when done with it.
-    """
-    self.streams.remove(stream)
-
-  @property
-  def is_done(self):
-    """ Returns True when all streams have been processed.
-    """
-    return len(self.streams) == 0
-
-  def select(self):
-    """ Returns the set of streams that have data available to read.
-    The returned streams each expose a read() and a close() method.
-    When done with a stream, call the remove(stream) method.
-    """
-    raise NotImplementedError
-
-  def _create_stream(self, fd, dest, std_name):
-    """ Creates a new stream wrapping an existing file descriptor.
-    """
-    raise NotImplementedError
-
-
-class _FileDescriptorStreamsNonBlocking(FileDescriptorStreams):
-  """ Implementation of FileDescriptorStreams for platforms that support
-  non blocking I/O.
-  """
-  def __init__(self):
-    super(_FileDescriptorStreamsNonBlocking, self).__init__()
-    self._poll = select.poll()
-    self._fd_to_stream = {}
-
-  class Stream(object):
-    """ Encapsulates a file descriptor """
-    def __init__(self, fd, dest, std_name):
-      self.fd = fd
-      self.dest = dest
-      self.std_name = std_name
-      self.set_non_blocking()
-
-    def set_non_blocking(self):
-      import fcntl
-      flags = fcntl.fcntl(self.fd, fcntl.F_GETFL)
-      fcntl.fcntl(self.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
-
-    def fileno(self):
-      return self.fd.fileno()
-
-    def read(self):
-      return self.fd.read(4096)
-
-    def close(self):
-      self.fd.close()
-
-  def _create_stream(self, fd, dest, std_name):
-    stream = self.Stream(fd, dest, std_name)
-    self._fd_to_stream[stream.fileno()] = stream
-    self._poll.register(stream, select.POLLIN)
-    return stream
-
-  def remove(self, stream):
-    self._poll.unregister(stream)
-    del self._fd_to_stream[stream.fileno()]
-    super(_FileDescriptorStreamsNonBlocking, self).remove(stream)
-
-  def select(self):
-    return [self._fd_to_stream[fd] for fd, _ in self._poll.poll()]
-
-
-class _FileDescriptorStreamsThreads(FileDescriptorStreams):
-  """ Implementation of FileDescriptorStreams for platforms that don't support
-  non blocking I/O. This implementation requires creating threads issuing
-  blocking read operations on file descriptors.
-  """
-  def __init__(self):
-    super(_FileDescriptorStreamsThreads, self).__init__()
-    # The queue is shared accross all threads so we can simulate the
-    # behavior of the select() function
-    self.queue = Queue(10)  # Limit incoming data from streams
-
-  def _create_stream(self, fd, dest, std_name):
-    return self.Stream(fd, dest, std_name, self.queue)
-
-  def select(self):
-    # Return only one stream at a time, as it is the most straighforward
-    # thing to do and it is compatible with the select() function.
-    item = self.queue.get()
-    stream = item.stream
-    stream.data = item.data
-    return [stream]
-
-  class QueueItem(object):
-    """ Item put in the shared queue """
-    def __init__(self, stream, data):
-      self.stream = stream
-      self.data = data
-
-  class Stream(object):
-    """ Encapsulates a file descriptor """
-    def __init__(self, fd, dest, std_name, queue):
-      self.fd = fd
-      self.dest = dest
-      self.std_name = std_name
-      self.queue = queue
-      self.data = None
-      self.thread = Thread(target=self.read_to_queue)
-      self.thread.daemon = True
-      self.thread.start()
-
-    def close(self):
-      self.fd.close()
-
-    def read(self):
-      data = self.data
-      self.data = None
-      return data
-
-    def read_to_queue(self):
-      """ The thread function: reads everything from the file descriptor into
-      the shared queue and terminates when reaching EOF.
-      """
-      for line in iter(self.fd.readline, b''):
-        self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, line))
-      self.fd.close()
-      self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, b''))
-
-
 def symlink(source, link_name):
   """Creates a symbolic link pointing to source named link_name.
 
   Note: On Windows, source must exist on disk, as the implementation needs

View File

@@ -25,18 +25,52 @@ _NOT_TTY = not os.isatty(2)
 CSI_ERASE_LINE = '\x1b[2K'
 
+
+def duration_str(total):
+  """A less noisy timedelta.__str__.
+
+  The default timedelta stringification contains a lot of leading zeros and
+  uses microsecond resolution.  This makes for noisy output.
+  """
+  hours, rem = divmod(total, 3600)
+  mins, secs = divmod(rem, 60)
+  ret = '%.3fs' % (secs,)
+  if mins:
+    ret = '%im%s' % (mins, ret)
+  if hours:
+    ret = '%ih%s' % (hours, ret)
+  return ret
+
+
 class Progress(object):
-  def __init__(self, title, total=0, units='', print_newline=False,
-               always_print_percentage=False):
+  def __init__(self, title, total=0, units='', print_newline=False, delay=True,
+               quiet=False):
     self._title = title
     self._total = total
     self._done = 0
-    self._lastp = -1
     self._start = time()
-    self._show = False
+    self._show = not delay
     self._units = units
     self._print_newline = print_newline
-    self._always_print_percentage = always_print_percentage
+    # Only show the active jobs section if we run more than one in parallel.
+    self._show_jobs = False
+    self._active = 0
+
+    # When quiet, never show any output.  It's a bit hacky, but reusing the
+    # existing logic that delays initial output keeps the rest of the class
+    # clean.  Basically we set the start time to years in the future.
+    if quiet:
+      self._show = False
+      self._start += 2**32
+
+  def start(self, name):
+    self._active += 1
+    if not self._show_jobs:
+      self._show_jobs = self._active > 1
+    self.update(inc=0, msg='started ' + name)
+
+  def finish(self, name):
+    self.update(msg='finished ' + name)
+    self._active -= 1
 
   def update(self, inc=1, msg=''):
     self._done += inc
@@ -58,35 +92,40 @@ class Progress(object):
       sys.stderr.flush()
     else:
       p = (100 * self._done) / self._total
-      if self._lastp != p or self._always_print_percentage:
-        self._lastp = p
-        sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s)%s%s%s' % (
+      if self._show_jobs:
+        jobs = '[%d job%s] ' % (self._active, 's' if self._active > 1 else '')
+      else:
+        jobs = ''
+      sys.stderr.write('%s\r%s: %2d%% %s(%d%s/%d%s)%s%s%s' % (
           CSI_ERASE_LINE,
           self._title,
           p,
+          jobs,
           self._done, self._units,
           self._total, self._units,
           ' ' if msg else '', msg,
-          "\n" if self._print_newline else ""))
+          '\n' if self._print_newline else ''))
       sys.stderr.flush()
 
   def end(self):
     if _NOT_TTY or IsTrace() or not self._show:
       return
 
+    duration = duration_str(time() - self._start)
     if self._total <= 0:
-      sys.stderr.write('%s\r%s: %d, done.\n' % (
+      sys.stderr.write('%s\r%s: %d, done in %s\n' % (
           CSI_ERASE_LINE,
           self._title,
-          self._done))
+          self._done,
+          duration))
       sys.stderr.flush()
     else:
       p = (100 * self._done) / self._total
-      sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s), done.\n' % (
+      sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s), done in %s\n' % (
          CSI_ERASE_LINE,
          self._title,
          p,
          self._done, self._units,
-          self._total, self._units))
+          self._total, self._units,
+          duration))
       sys.stderr.flush()
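The `duration_str` helper added above trades timedelta's noisy default rendering for compact h/m/s output, only emitting the larger units when they are nonzero. Reproduced standalone for illustration:

```python
def duration_str(total):
    """A less noisy timedelta.__str__: e.g. 3723.5 -> '1h2m3.500s'."""
    hours, rem = divmod(total, 3600)
    mins, secs = divmod(rem, 60)
    ret = '%.3fs' % (secs,)   # seconds always shown, millisecond precision
    if mins:
        ret = '%im%s' % (mins, ret)
    if hours:
        ret = '%ih%s' % (hours, ret)
    return ret
```

Short syncs get just `5.000s` while hour-long ones get the full `1h2m3.500s`, which is what `Progress.end()` appends after "done in".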

View File

@ -232,7 +232,7 @@ class ReviewableBranch(object):
class StatusColoring(Coloring): class StatusColoring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, 'status') super().__init__(config, 'status')
self.project = self.printer('header', attr='bold') self.project = self.printer('header', attr='bold')
self.branch = self.printer('header', attr='bold') self.branch = self.printer('header', attr='bold')
self.nobranch = self.printer('nobranch', fg='red') self.nobranch = self.printer('nobranch', fg='red')
@ -246,7 +246,7 @@ class StatusColoring(Coloring):
class DiffColoring(Coloring): class DiffColoring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, 'diff') super().__init__(config, 'diff')
self.project = self.printer('header', attr='bold') self.project = self.printer('header', attr='bold')
self.fail = self.printer('fail', fg='red') self.fail = self.printer('fail', fg='red')
@ -832,10 +832,12 @@ class Project(object):
return 'DIRTY' return 'DIRTY'
def PrintWorkTreeDiff(self, absolute_paths=False): def PrintWorkTreeDiff(self, absolute_paths=False, output_redir=None):
"""Prints the status of the repository to stdout. """Prints the status of the repository to stdout.
""" """
out = DiffColoring(self.config) out = DiffColoring(self.config)
if output_redir:
out.redirect(output_redir)
cmd = ['diff'] cmd = ['diff']
if out.is_on: if out.is_on:
cmd.append('--color') cmd.append('--color')
@ -849,6 +851,7 @@ class Project(object):
cmd, cmd,
capture_stdout=True, capture_stdout=True,
capture_stderr=True) capture_stderr=True)
p.Wait()
except GitError as e: except GitError as e:
out.nl() out.nl()
out.project('project %s/' % self.relpath) out.project('project %s/' % self.relpath)
@ -856,16 +859,11 @@ class Project(object):
out.fail('%s', str(e)) out.fail('%s', str(e))
out.nl() out.nl()
return False return False
has_diff = False if p.stdout:
for line in p.process.stdout:
if not hasattr(line, 'encode'):
line = line.decode()
if not has_diff:
out.nl() out.nl()
out.project('project %s/' % self.relpath) out.project('project %s/' % self.relpath)
out.nl() out.nl()
has_diff = True out.write('%s', p.stdout)
print(line[:-1])
return p.Wait() == 0 return p.Wait() == 0
# Publish / Upload ## # Publish / Upload ##
@ -1041,6 +1039,7 @@ class Project(object):
def Sync_NetworkHalf(self, def Sync_NetworkHalf(self,
quiet=False, quiet=False,
verbose=False, verbose=False,
output_redir=None,
is_new=None, is_new=None,
current_branch_only=False, current_branch_only=False,
force_sync=False, force_sync=False,
@ -1051,7 +1050,8 @@ class Project(object):
retry_fetches=0, retry_fetches=0,
prune=False, prune=False,
submodules=False, submodules=False,
clone_filter=None): clone_filter=None,
partial_clone_exclude=set()):
"""Perform only the network IO portion of the sync process. """Perform only the network IO portion of the sync process.
Local working directory/branch state is not affected. Local working directory/branch state is not affected.
""" """
@ -1082,6 +1082,16 @@ class Project(object):
_warn("Cannot remove archive %s: %s", tarpath, str(e)) _warn("Cannot remove archive %s: %s", tarpath, str(e))
self._CopyAndLinkFiles() self._CopyAndLinkFiles()
return True return True
# If the shared object dir already exists, don't try to rebootstrap with a
# clone bundle download. We should have the majority of objects already.
if clone_bundle and os.path.exists(self.objdir):
clone_bundle = False
if self.name in partial_clone_exclude:
clone_bundle = True
clone_filter = None
if is_new is None: if is_new is None:
is_new = not self.Exists is_new = not self.Exists
if is_new: if is_new:
@ -1128,8 +1138,9 @@ class Project(object):
(ID_RE.match(self.revisionExpr) and (ID_RE.match(self.revisionExpr) and
self._CheckForImmutableRevision())): self._CheckForImmutableRevision())):
if not self._RemoteFetch( if not self._RemoteFetch(
initial=is_new, quiet=quiet, verbose=verbose, alt_dir=alt_dir, initial=is_new,
current_branch_only=current_branch_only, quiet=quiet, verbose=verbose, output_redir=output_redir,
alt_dir=alt_dir, current_branch_only=current_branch_only,
tags=tags, prune=prune, depth=depth, tags=tags, prune=prune, depth=depth,
submodules=submodules, force_sync=force_sync, submodules=submodules, force_sync=force_sync,
clone_filter=clone_filter, retry_fetches=retry_fetches): clone_filter=clone_filter, retry_fetches=retry_fetches):
@ -1141,7 +1152,11 @@ class Project(object):
alternates_file = os.path.join(self.gitdir, 'objects/info/alternates') alternates_file = os.path.join(self.gitdir, 'objects/info/alternates')
if os.path.exists(alternates_file): if os.path.exists(alternates_file):
cmd = ['repack', '-a', '-d'] cmd = ['repack', '-a', '-d']
if GitCommand(self, cmd, bare=True).Wait() != 0: p = GitCommand(self, cmd, bare=True, capture_stdout=bool(output_redir),
merge_output=bool(output_redir))
if p.stdout and output_redir:
output_redir.write(p.stdout)
if p.Wait() != 0:
return False return False
platform_utils.remove(alternates_file) platform_utils.remove(alternates_file)
@@ -1217,6 +1232,18 @@ class Project(object):
     self.CleanPublishedCache(all_refs)
     revid = self.GetRevisionId(all_refs)
 
+    # Special case the root of the repo client checkout.  Make sure it doesn't
+    # contain files being checked out to dirs we don't allow.
+    if self.relpath == '.':
+      PROTECTED_PATHS = {'.repo'}
+      paths = set(self.work_git.ls_tree('-z', '--name-only', '--', revid).split('\0'))
+      bad_paths = paths & PROTECTED_PATHS
+      if bad_paths:
+        syncbuf.fail(self,
+                     'Refusing to checkout project that writes to protected '
+                     'paths: %s' % (', '.join(bad_paths),))
+        return
+
     def _doff():
       self._FastForward(revid)
       self._CopyAndLinkFiles()
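The protected-paths hunk above intersects the NUL-delimited output of `git ls-tree -z --name-only` with a small deny-set. A standalone sketch of that parsing step (the helper name and sample tree below are illustrative, not from the source):

```python
# Hypothetical sketch of the protected-path check: parse NUL-delimited
# `git ls-tree -z --name-only` output and intersect it with the set of
# paths a checkout must never write to.
PROTECTED_PATHS = {'.repo'}

def find_bad_paths(ls_tree_output):
    """Return top-level tree entries that collide with protected paths."""
    # `-z` separates entries with NUL and emits a trailing NUL, so drop the
    # empty string produced by the final separator.
    paths = {p for p in ls_tree_output.split('\0') if p}
    return paths & PROTECTED_PATHS

# A tree that tries to write into .repo/ is flagged:
print(find_bad_paths('.repo\0Makefile\0src\0'))  # {'.repo'}
```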
@@ -1688,6 +1715,11 @@ class Project(object):
       if cb is None or name != cb:
         kill.append(name)
 
+    # Minor optimization: If there's nothing to prune, then don't try to read
+    # any project state.
+    if not kill and not cb:
+      return []
+
     rev = self.GetRevisionId(left)
     if cb is not None \
        and not self._revlist(HEAD + '...' + rev) \
@@ -1953,6 +1985,7 @@ class Project(object):
                    initial=False,
                    quiet=False,
                    verbose=False,
+                   output_redir=None,
                    alt_dir=None,
                    tags=True,
                    prune=False,
@@ -2130,29 +2163,27 @@ class Project(object):
     ok = prune_tried = False
     for try_n in range(retry_fetches):
       gitcmd = GitCommand(self, cmd, bare=True, ssh_proxy=ssh_proxy,
-                          merge_output=True, capture_stdout=quiet)
+                          merge_output=True, capture_stdout=quiet or bool(output_redir))
+      if gitcmd.stdout and not quiet and output_redir:
+        output_redir.write(gitcmd.stdout)
       ret = gitcmd.Wait()
       if ret == 0:
         ok = True
         break
 
       # Retry later due to HTTP 429 Too Many Requests.
-      elif ('error:' in gitcmd.stderr and
-            'HTTP 429' in gitcmd.stderr):
-        if not quiet:
-          print('429 received, sleeping: %s sec' % retry_cur_sleep,
-                file=sys.stderr)
-        time.sleep(retry_cur_sleep)
-        retry_cur_sleep = min(retry_exp_factor * retry_cur_sleep,
-                              MAXIMUM_RETRY_SLEEP_SEC)
-        retry_cur_sleep *= (1 - random.uniform(-RETRY_JITTER_PERCENT,
-                                               RETRY_JITTER_PERCENT))
-        continue
+      elif (gitcmd.stdout and
+            'error:' in gitcmd.stdout and
+            'HTTP 429' in gitcmd.stdout):
+        # Fallthru to sleep+retry logic at the bottom.
+        pass
 
-      # If this is not last attempt, try 'git remote prune'.
-      elif (try_n < retry_fetches - 1 and
-            'error:' in gitcmd.stderr and
-            'git remote prune' in gitcmd.stderr and
+      # Try to prune remote branches once in case there are conflicts.
+      # For example, if the remote had refs/heads/upstream, but deleted that and
+      # now has refs/heads/upstream/foo.
+      elif (gitcmd.stdout and
+            'error:' in gitcmd.stdout and
+            'git remote prune' in gitcmd.stdout and
             not prune_tried):
         prune_tried = True
         prunecmd = GitCommand(self, ['remote', 'prune', name], bare=True,
@@ -2160,6 +2191,8 @@ class Project(object):
         ret = prunecmd.Wait()
         if ret:
           break
+        output_redir.write('retrying fetch after pruning remote branches')
+        # Continue right away so we don't sleep as we shouldn't need to.
         continue
 
       elif current_branch_only and is_sha1 and ret == 128:
         # Exit code 128 means "couldn't find the ref you asked for"; if we're
@@ -2169,9 +2202,17 @@ class Project(object):
       elif ret < 0:
         # Git died with a signal, exit immediately
         break
+
+      # Figure out how long to sleep before the next attempt, if there is one.
       if not verbose:
-        print('%s:\n%s' % (self.name, gitcmd.stdout), file=sys.stderr)
-      time.sleep(random.randint(30, 45))
+        output_redir.write('\n%s:\n%s' % (self.name, gitcmd.stdout))
+      if try_n < retry_fetches - 1:
+        output_redir.write('sleeping %s seconds before retrying' % retry_cur_sleep)
+        time.sleep(retry_cur_sleep)
+        retry_cur_sleep = min(retry_exp_factor * retry_cur_sleep,
+                              MAXIMUM_RETRY_SLEEP_SEC)
+        retry_cur_sleep *= (1 - random.uniform(-RETRY_JITTER_PERCENT,
+                                               RETRY_JITTER_PERCENT))
 
     if initial:
       if alt_dir:
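The shared sleep logic that the HTTP 429 and prune cases now fall through to grows the delay exponentially, caps it, and jitters it. A runnable sketch of that step, with assumed values for the constants (the real `RETRY_JITTER_PERCENT`, cap, and growth factor may differ):

```python
import random

# Assumed constants for illustration; the values in the repo source may differ.
MAXIMUM_RETRY_SLEEP_SEC = 3600.0
RETRY_JITTER_PERCENT = 0.1

def next_sleep(cur_sleep, exp_factor=2.0):
    """After sleeping cur_sleep seconds, compute the next (jittered) delay."""
    nxt = min(exp_factor * cur_sleep, MAXIMUM_RETRY_SLEEP_SEC)
    # Jitter by +/-10% so many clients hitting the same server don't all
    # retry in lockstep.
    nxt *= 1 - random.uniform(-RETRY_JITTER_PERCENT, RETRY_JITTER_PERCENT)
    return nxt
```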
@@ -2189,7 +2230,7 @@ class Project(object):
         # Sync the current branch only with depth set to None.
         # We always pass depth=None down to avoid infinite recursion.
         return self._RemoteFetch(
-            name=name, quiet=quiet, verbose=verbose,
+            name=name, quiet=quiet, verbose=verbose, output_redir=output_redir,
             current_branch_only=current_branch_only and depth,
             initial=False, alt_dir=alt_dir,
             depth=None, clone_filter=clone_filter)
@@ -2861,11 +2902,9 @@ class Project(object):
                      bare=False,
                      capture_stdout=True,
                      capture_stderr=True)
-      try:
-        out = p.process.stdout.read()
-        if not hasattr(out, 'encode'):
-          out = out.decode()
-        r = {}
-        if out:
-          out = iter(out[:-1].split('\0'))
-          while out:
+      p.Wait()
+      r = {}
+      out = p.stdout
+      if out:
+        out = iter(out[:-1].split('\0'))
+        while out:
@@ -2901,8 +2940,6 @@ class Project(object):
           info.path = next(out)
           r[info.path] = info
       return r
-    finally:
-      p.Wait()
 
   def GetDotgitPath(self, subpath=None):
     """Return the full path to the .git dir.
@@ -3099,7 +3136,7 @@ class _Later(object):
 
 class _SyncColoring(Coloring):
   def __init__(self, config):
-    Coloring.__init__(self, config, 'reposync')
+    super().__init__(config, 'reposync')
     self.project = self.printer('header', attr='bold')
     self.info = self.printer('info')
     self.fail = self.printer('fail', fg='red')

repo

@@ -147,7 +147,7 @@ if not REPO_REV:
   REPO_REV = 'stable'
 
 # increment this whenever we make important changes to this script
-VERSION = (2, 12)
+VERSION = (2, 14)
 
 # increment this if the MAINTAINER_KEYS block is modified
 KEYRING_VERSION = (2, 3)
@@ -270,11 +270,18 @@ gpg_dir = os.path.join(home_dot_repo, 'gnupg')
 def GetParser(gitc_init=False):
   """Setup the CLI parser."""
   if gitc_init:
-    usage = 'repo gitc-init -u url -c client [options]'
+    usage = 'repo gitc-init -c client [options] [-u] url'
   else:
-    usage = 'repo init -u url [options]'
+    usage = 'repo init [options] [-u] url'
   parser = optparse.OptionParser(usage=usage)
+  InitParser(parser, gitc_init=gitc_init)
+  return parser
+
+
+def InitParser(parser, gitc_init=False):
+  """Setup the CLI parser."""
+  # NB: Keep in sync with command.py:_CommonOptions().
 
   # Logging.
   group = parser.add_option_group('Logging options')
@@ -289,10 +296,24 @@ def GetParser(gitc_init=False):
   group = parser.add_option_group('Manifest options')
   group.add_option('-u', '--manifest-url',
                    help='manifest repository location', metavar='URL')
-  group.add_option('-b', '--manifest-branch',
-                   help='manifest branch or revision', metavar='REVISION')
-  group.add_option('-m', '--manifest-name',
+  group.add_option('-b', '--manifest-branch', metavar='REVISION',
+                   help='manifest branch or revision (use HEAD for default)')
+  group.add_option('-m', '--manifest-name', default='default.xml',
                    help='initial manifest file', metavar='NAME.xml')
+  group.add_option('-g', '--groups', default='default',
+                   help='restrict manifest projects to ones with specified '
+                        'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]',
+                   metavar='GROUP')
+  group.add_option('-p', '--platform', default='auto',
+                   help='restrict manifest projects to ones with a specified '
+                        'platform group [auto|all|none|linux|darwin|...]',
+                   metavar='PLATFORM')
+  group.add_option('--submodules', action='store_true',
+                   help='sync any submodules associated with the manifest repo')
+
+  # Options that only affect manifest project, and not any of the projects
+  # specified in the manifest itself.
+  group = parser.add_option_group('Manifest (only) checkout options')
   cbr_opts = ['--current-branch']
   # The gitc-init subcommand allocates -c itself, but a lot of init users
   # want -c, so try to satisfy both as best we can.
@@ -301,9 +322,23 @@ def GetParser(gitc_init=False):
   group.add_option(*cbr_opts,
                    dest='current_branch_only', action='store_true',
                    help='fetch only current manifest branch from server')
+  group.add_option('--no-tags',
+                   dest='tags', default=True, action='store_false',
+                   help="don't fetch tags in the manifest")
+
+  # These are fundamentally different ways of structuring the checkout.
+  group = parser.add_option_group('Checkout modes')
   group.add_option('--mirror', action='store_true',
                    help='create a replica of the remote repositories '
                         'rather than a client working directory')
+  group.add_option('--archive', action='store_true',
+                   help='checkout an archive instead of a git repository for '
+                        'each project. See git archive.')
+  group.add_option('--worktree', action='store_true',
+                   help='use git-worktree to manage projects')
+
+  # These are fundamentally different ways of structuring the checkout.
+  group = parser.add_option_group('Project checkout optimizations')
   group.add_option('--reference',
                    help='location of mirror directory', metavar='DIR')
   group.add_option('--dissociate', action='store_true',
@@ -314,38 +349,27 @@ def GetParser(gitc_init=False):
   group.add_option('--partial-clone', action='store_true',
                    help='perform partial clone (https://git-scm.com/'
                         'docs/gitrepository-layout#_code_partialclone_code)')
+  group.add_option('--no-partial-clone', action='store_false',
+                   help='disable use of partial clone (https://git-scm.com/'
+                        'docs/gitrepository-layout#_code_partialclone_code)')
+  group.add_option('--partial-clone-exclude', action='store',
+                   help='exclude the specified projects (a comma-delimited '
+                        'project names) from partial clone (https://git-scm.com'
+                        '/docs/gitrepository-layout#_code_partialclone_code)')
   group.add_option('--clone-filter', action='store', default='blob:none',
                    help='filter for use with --partial-clone '
                         '[default: %default]')
-  group.add_option('--worktree', action='store_true',
-                   help=optparse.SUPPRESS_HELP)
-  group.add_option('--archive', action='store_true',
-                   help='checkout an archive instead of a git repository for '
-                        'each project. See git archive.')
-  group.add_option('--submodules', action='store_true',
-                   help='sync any submodules associated with the manifest repo')
   group.add_option('--use-superproject', action='store_true', default=None,
                    help='use the manifest superproject to sync projects')
   group.add_option('--no-use-superproject', action='store_false',
                    dest='use_superproject',
                    help='disable use of manifest superprojects')
-  group.add_option('-g', '--groups', default='default',
-                   help='restrict manifest projects to ones with specified '
-                        'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]',
-                   metavar='GROUP')
-  group.add_option('-p', '--platform', default='auto',
-                   help='restrict manifest projects to ones with a specified '
-                        'platform group [auto|all|none|linux|darwin|...]',
-                   metavar='PLATFORM')
   group.add_option('--clone-bundle', action='store_true',
                    help='enable use of /clone.bundle on HTTP/HTTPS '
                         '(default if not --partial-clone)')
   group.add_option('--no-clone-bundle',
                    dest='clone_bundle', action='store_false',
                    help='disable use of /clone.bundle on HTTP/HTTPS (default if --partial-clone)')
-  group.add_option('--no-tags',
-                   dest='tags', default=True, action='store_false',
-                   help="don't fetch tags in the manifest")
 
   # Tool.
   group = parser.add_option_group('repo Version options')
# Tool. # Tool.
group = parser.add_option_group('repo Version options') group = parser.add_option_group('repo Version options')
@@ -521,6 +545,9 @@ def _Init(args, gitc_init=False):
   """
   parser = GetParser(gitc_init=gitc_init)
   opt, args = parser.parse_args(args)
+  if args:
+    if not opt.manifest_url:
+      opt.manifest_url = args.pop(0)
   if args:
     parser.print_usage()
     sys.exit(1)
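The `_Init` hunk above makes the manifest URL acceptable either as `-u URL` or as a bare positional argument. A minimal optparse sketch of that fallback (`get_parser` is a stand-in, not the real `GetParser`):

```python
import optparse

def get_parser():
    # Stand-in for the repo launcher's parser; only the -u option is modeled.
    parser = optparse.OptionParser(usage='repo init [options] [-u] url')
    group = parser.add_option_group('Manifest options')
    group.add_option('-u', '--manifest-url',
                     help='manifest repository location', metavar='URL')
    return parser

# A bare positional argument is adopted as the manifest URL when -u is absent.
opt, args = get_parser().parse_args(['https://example.com/manifest'])
if args and not opt.manifest_url:
    opt.manifest_url = args.pop(0)
print(opt.manifest_url)  # https://example.com/manifest
```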


@@ -34,8 +34,8 @@ def find_pytest():
     if ret:
       return ret
 
-  print(f'{__file__}: unable to find pytest.', file=sys.stderr)
-  print(f'{__file__}: Try installing: sudo apt-get install python-pytest',
+  print('%s: unable to find pytest.' % (__file__,), file=sys.stderr)
+  print('%s: Try installing: sudo apt-get install python-pytest' % (__file__,),
         file=sys.stderr)
@@ -50,7 +50,7 @@ def main(argv):
   os.environ['PYTHONPATH'] = pythonpath
 
   pytest = find_pytest()
-  return subprocess.run([pytest] + argv, check=True)
+  return subprocess.run([pytest] + argv, check=False).returncode
 
 
 if __name__ == '__main__':
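The `run_tests` change swaps `check=True` for `check=False` so pytest's exit status is returned to the caller instead of raising `CalledProcessError`. The difference in one line:

```python
import subprocess
import sys

# With check=True a non-zero exit raises CalledProcessError; with check=False
# the caller inspects returncode itself, which is what the wrapper needs in
# order to propagate pytest's exit status as its own.
result = subprocess.run([sys.executable, '-c', 'raise SystemExit(2)'],
                        check=False)
print(result.returncode)  # 2
```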


@@ -32,7 +32,7 @@ with open(os.path.join(TOPDIR, 'README.md')) as fp:
 # https://packaging.python.org/tutorials/packaging-projects/
 setuptools.setup(
     name='repo',
-    version='1.13.8',
+    version='2',
     maintainer='Various',
     maintainer_email='repo-discuss@googlegroups.com',
     description='Repo helps manage many Git repositories',
@@ -56,6 +56,6 @@ setuptools.setup(
         'Programming Language :: Python :: 3 :: Only',
         'Topic :: Software Development :: Version Control :: Git',
     ],
-    python_requires='>=3.6',
+    python_requires='>=3.5',
     packages=['subcmds'],
 )


@@ -13,9 +13,11 @@
 # limitations under the License.
 
 from collections import defaultdict
+import functools
+import itertools
 import sys
 
-from command import Command
+from command import Command, DEFAULT_LOCAL_JOBS
 from git_command import git
 from progress import Progress
@@ -31,11 +33,9 @@ deleting it (and all its history) from your local repository.
 
 It is equivalent to "git branch -D <branchname>".
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def _Options(self, p):
-    p.add_option('-q', '--quiet',
-                 action='store_true', default=False,
-                 help='be quiet')
     p.add_option('--all',
                  dest='all', action='store_true',
                  help='delete all branches in all projects')
@@ -51,35 +51,44 @@ It is equivalent to "git branch -D <branchname>".
     else:
       args.insert(0, "'All local branches'")
 
+  def _ExecuteOne(self, all_branches, nb, project):
+    """Abandon one project."""
+    if all_branches:
+      branches = project.GetBranches()
+    else:
+      branches = [nb]
+
+    ret = {}
+    for name in branches:
+      status = project.AbandonBranch(name)
+      if status is not None:
+        ret[name] = status
+    return (ret, project)
+
   def Execute(self, opt, args):
     nb = args[0]
     err = defaultdict(list)
     success = defaultdict(list)
     all_projects = self.GetProjects(args[1:])
 
-    pm = Progress('Abandon %s' % nb, len(all_projects))
-    for project in all_projects:
-      pm.update()
-
-      if opt.all:
-        branches = list(project.GetBranches().keys())
-      else:
-        branches = [nb]
-
-      for name in branches:
-        status = project.AbandonBranch(name)
-        if status is not None:
-          if status:
-            success[name].append(project)
-          else:
-            err[name].append(project)
-    pm.end()
+    def _ProcessResults(_pool, pm, states):
+      for (results, project) in states:
+        for branch, status in results.items():
+          if status:
+            success[branch].append(project)
+          else:
+            err[branch].append(project)
+        pm.update()
+
+    self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, opt.all, nb),
+        all_projects,
+        callback=_ProcessResults,
+        output=Progress('Abandon %s' % (nb,), len(all_projects), quiet=opt.quiet))
 
-    width = 25
-    for name in branches:
-      if width < len(name):
-        width = len(name)
+    width = max(itertools.chain(
+        [25], (len(x) for x in itertools.chain(success, err))))
     if err:
       for br in err.keys():
         err_msg = "error: cannot abandon %s" % br
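The abandon rewrite follows the pattern used throughout this series: a picklable per-project worker returns `(results, project)` and a callback folds the results into shared dicts. A thread-pool sketch of that pattern (`multiprocessing.dummy` stands in for `ExecuteInParallel`, and the project dicts are fabricated):

```python
import functools
from multiprocessing.dummy import Pool  # thread pool with the Pool API

def _execute_one(all_branches, nb, project):
    """Stand-in worker: abandon branch(es) in one project, return (results, project)."""
    branches = project['branches'] if all_branches else [nb]
    return ({name: True for name in branches}, project['name'])

def abandon(jobs, nb, projects, all_branches=False):
    success, err = {}, {}
    worker = functools.partial(_execute_one, all_branches, nb)
    with Pool(jobs) as pool:
        # imap_unordered streams results back as workers finish.
        for results, name in pool.imap_unordered(worker, projects):
            for branch, ok in results.items():
                (success if ok else err).setdefault(branch, []).append(name)
    return success, err

projects = [{'name': 'a', 'branches': ['topic']},
            {'name': 'b', 'branches': ['topic']}]
ok, bad = abandon(2, 'topic', projects)
print(sorted(ok['topic']))  # ['a', 'b']
```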


@@ -13,18 +13,10 @@
 # limitations under the License.
 
 import itertools
-import multiprocessing
 import sys
 
 from color import Coloring
-from command import Command
-
-# Number of projects to submit to a single worker process at a time.
-# This number represents a tradeoff between the overhead of IPC and finer
-# grained opportunity for parallelism. This particular value was chosen by
-# iterating through powers of two until the overall performance no longer
-# improved. The performance of this batch size is not a function of the
-# number of cores on the system.
-WORKER_BATCH_SIZE = 32
+from command import Command, DEFAULT_LOCAL_JOBS
 
 
 class BranchColoring(Coloring):
@@ -103,32 +95,26 @@ the branch appears in, or does not appear in.  If no project list
 is shown, then the branch appears in all projects.
 
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
-  def _Options(self, p):
-    """Add flags to CLI parser for this subcommand."""
-    default_jobs = min(multiprocessing.cpu_count(), 8)
-    p.add_option(
-        '-j',
-        '--jobs',
-        type=int,
-        default=default_jobs,
-        help='Number of worker processes to spawn '
-             '(default: %s)' % default_jobs)
-
   def Execute(self, opt, args):
     projects = self.GetProjects(args)
     out = BranchColoring(self.manifest.manifestProject.config)
     all_branches = {}
     project_cnt = len(projects)
 
-    with multiprocessing.Pool(processes=opt.jobs) as pool:
-      project_branches = pool.imap_unordered(
-          expand_project_to_branches, projects, chunksize=WORKER_BATCH_SIZE)
-
-      for name, b in itertools.chain.from_iterable(project_branches):
+    def _ProcessResults(_pool, _output, results):
+      for name, b in itertools.chain.from_iterable(results):
         if name not in all_branches:
           all_branches[name] = BranchInfo(name)
         all_branches[name].add(b)
 
+    self.ExecuteInParallel(
+        opt.jobs,
+        expand_project_to_branches,
+        projects,
+        callback=_ProcessResults)
+
     names = sorted(all_branches)
 
     if not names:


@@ -12,8 +12,10 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+import functools
 import sys
-from command import Command
+
+from command import Command, DEFAULT_LOCAL_JOBS
 from progress import Progress
@@ -31,28 +33,37 @@ The command is equivalent to:
 
   repo forall [<project>...] -c git checkout <branchname>
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def ValidateOptions(self, opt, args):
     if not args:
       self.Usage()
 
+  def _ExecuteOne(self, nb, project):
+    """Checkout one project."""
+    return (project.CheckoutBranch(nb), project)
+
   def Execute(self, opt, args):
     nb = args[0]
     err = []
     success = []
     all_projects = self.GetProjects(args[1:])
 
-    pm = Progress('Checkout %s' % nb, len(all_projects))
-    for project in all_projects:
-      pm.update()
-
-      status = project.CheckoutBranch(nb)
-      if status is not None:
-        if status:
-          success.append(project)
-        else:
-          err.append(project)
-    pm.end()
+    def _ProcessResults(_pool, pm, results):
+      for status, project in results:
+        if status is not None:
+          if status:
+            success.append(project)
+          else:
+            err.append(project)
+        pm.update()
+
+    self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, nb),
+        all_projects,
+        callback=_ProcessResults,
+        output=Progress('Checkout %s' % (nb,), len(all_projects), quiet=opt.quiet))
 
     if err:
       for p in err:


@@ -32,9 +32,6 @@ The change id will be updated, and a reference to the old
 change id will be added.
 """
 
-  def _Options(self, p):
-    pass
-
   def ValidateOptions(self, opt, args):
     if len(args) != 1:
       self.Usage()
@@ -72,11 +69,9 @@ change id will be added.
     new_msg = self._Reformat(old_msg, sha1)
 
     p = GitCommand(None, ['commit', '--amend', '-F', '-'],
-                   provide_stdin=True,
+                   input=new_msg,
                    capture_stdout=True,
                    capture_stderr=True)
-    p.stdin.write(new_msg)
-    p.stdin.close()
     if p.Wait() != 0:
       print("error: Failed to update commit message", file=sys.stderr)
       sys.exit(1)


@@ -12,7 +12,10 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from command import PagedCommand
+import functools
+import io
+
+from command import DEFAULT_LOCAL_JOBS, PagedCommand
 
 
 class Diff(PagedCommand):
@@ -25,15 +28,42 @@ The -u option causes '%prog' to generate diff output with file paths
 relative to the repository root, so the output can be applied
 to the Unix 'patch' command.
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def _Options(self, p):
     p.add_option('-u', '--absolute',
                  dest='absolute', action='store_true',
                  help='Paths are relative to the repository root')
 
+  def _ExecuteOne(self, absolute, project):
+    """Obtains the diff for a specific project.
+
+    Args:
+      absolute: Paths are relative to the root.
+      project: Project to get status of.
+
+    Returns:
+      The status of the project.
+    """
+    buf = io.StringIO()
+    ret = project.PrintWorkTreeDiff(absolute, output_redir=buf)
+    return (ret, buf.getvalue())
+
   def Execute(self, opt, args):
-    ret = 0
-    for project in self.GetProjects(args):
-      if not project.PrintWorkTreeDiff(opt.absolute):
-        ret = 1
-    return ret
+    all_projects = self.GetProjects(args)
+
+    def _ProcessResults(_pool, _output, results):
+      ret = 0
+      for (state, output) in results:
+        if output:
+          print(output, end='')
+        if not state:
+          ret = 1
+      return ret
+
+    return self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, opt.absolute),
+        all_projects,
+        callback=_ProcessResults,
+        ordered=True)
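The new `_ExecuteOne` above captures each project's diff into an `io.StringIO` passed as `output_redir`, so workers never write to stdout directly and the parent can print results in a stable order. A sketch of that capture step (the worker body is a stand-in for `PrintWorkTreeDiff`):

```python
import io

def print_worktree_diff(output_redir):
    # Stand-in for Project.PrintWorkTreeDiff(..., output_redir=buf): write
    # the diff text to the redirect target instead of stdout, and return
    # the project's status.
    output_redir.write('diff --git a/file b/file\n')
    return True

buf = io.StringIO()
state = print_worktree_diff(buf)
# The parent process decides when and in what order to emit the buffer.
print(state, buf.getvalue().startswith('diff --git'))  # True True
```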


@@ -16,7 +16,7 @@ import re
 import sys
 
 from command import Command
-from error import GitError
+from error import GitError, NoSuchProjectError
 
 CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
@@ -60,6 +60,7 @@ If no project is specified try to use current directory as a project.
       if m:
         if not project:
           project = self.GetProjects(".")[0]
+          print('Defaulting to cwd project', project.name)
         chg_id = int(m.group(1))
         if m.group(2):
           ps_id = int(m.group(2))
@@ -76,7 +77,23 @@ If no project is specified try to use current directory as a project.
             ps_id = max(int(match.group(1)), ps_id)
         to_get.append((project, chg_id, ps_id))
       else:
-        project = self.GetProjects([a])[0]
+        projects = self.GetProjects([a])
+        if len(projects) > 1:
+          # If the cwd is one of the projects, assume they want that.
+          try:
+            project = self.GetProjects('.')[0]
+          except NoSuchProjectError:
+            project = None
+          if project not in projects:
+            print('error: %s matches too many projects; please re-run inside '
+                  'the project checkout.' % (a,), file=sys.stderr)
+            for project in projects:
+              print('  %s/ @ %s' % (project.relpath, project.revisionExpr),
+                    file=sys.stderr)
+            sys.exit(1)
+        else:
+          project = projects[0]
+          print('Defaulting to cwd project', project.name)
     return to_get
 
   def ValidateOptions(self, opt, args):


@@ -13,6 +13,8 @@
 # limitations under the License.
 
 import errno
+import functools
+import io
 import multiprocessing
 import re
 import os
@@ -21,8 +23,8 @@ import sys
 import subprocess
 
 from color import Coloring
-from command import Command, MirrorSafeCommand
-import platform_utils
+from command import DEFAULT_LOCAL_JOBS, Command, MirrorSafeCommand, WORKER_BATCH_SIZE
+from error import ManifestInvalidRevisionError
 
 _CAN_COLOR = [
   'branch',
@@ -43,7 +45,7 @@ class Forall(Command, MirrorSafeCommand):
   helpSummary = "Run a shell command in each project"
   helpUsage = """
 %prog [<project>...] -c <command> [<arg>...]
-%prog -r str1 [str2] ... -c <command> [<arg>...]"
+%prog -r str1 [str2] ... -c <command> [<arg>...]
 """
   helpDescription = """
 Executes the same shell command in each project.
@@ -51,6 +53,11 @@ Executes the same shell command in each project.
 The -r option allows running the command only on projects matching
 regex or wildcard expression.
 
+By default, projects are processed non-interactively in parallel.  If you want
+to run interactive commands, make sure to pass --interactive to force --jobs 1.
+While the processing order of projects is not guaranteed, the order of project
+output is stable.
+
 # Output Formatting
 
 The -p option causes '%prog' to bind pipes to the command's stdin,
@@ -113,12 +120,15 @@ terminal and are not redirected.
 If -e is used, when a command exits unsuccessfully, '%prog' will abort
 without iterating through the remaining projects.
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
-  def _Options(self, p):
-    def cmd(option, opt_str, value, parser):
-      setattr(parser.values, option.dest, list(parser.rargs))
-      while parser.rargs:
-        del parser.rargs[0]
+  @staticmethod
+  def _cmd_option(option, _opt_str, _value, parser):
+    setattr(parser.values, option.dest, list(parser.rargs))
+    while parser.rargs:
+      del parser.rargs[0]
 
+  def _Options(self, p):
     p.add_option('-r', '--regex',
                  dest='regex', action='store_true',
                  help="Execute the command only on projects matching regex or wildcard expression")
@@ -133,7 +143,7 @@ without iterating through the remaining projects.
                  help='Command (and arguments) to execute',
                  dest='command',
                  action='callback',
-                 callback=cmd)
+                 callback=self._cmd_option)
     p.add_option('-e', '--abort-on-errors',
                  dest='abort_on_errors', action='store_true',
                  help='Abort if a command exits unsuccessfully')
@@ -141,45 +151,17 @@ without iterating through the remaining projects.
                  help='Silently skip & do not exit non-zero due missing '
                       'checkouts')
 
-    g = p.add_option_group('Output')
+    g = p.get_option_group('--quiet')
     g.add_option('-p',
                  dest='project_header', action='store_true',
                  help='Show project headers before output')
-    g.add_option('-v', '--verbose',
-                 dest='verbose', action='store_true',
-                 help='Show command error messages')
-    g.add_option('-j', '--jobs',
-                 dest='jobs', action='store', type='int', default=1,
-                 help='number of commands to execute simultaneously')
+    p.add_option('--interactive',
+                 action='store_true',
+                 help='force interactive usage')
 
   def WantPager(self, opt):
     return opt.project_header and opt.jobs == 1
 
-  def _SerializeProject(self, project):
-    """ Serialize a project._GitGetByExec instance.
-
-    project._GitGetByExec is not pickle-able. Instead of trying to pass it
-    around between processes, make a dict ourselves containing only the
-    attributes that we need.
-
-    """
-    if not self.manifest.IsMirror:
-      lrev = project.GetRevisionId()
-    else:
-      lrev = None
-    return {
-      'name': project.name,
-      'relpath': project.relpath,
-      'remote_name': project.remote.name,
-      'lrev': lrev,
-      'rrev': project.revisionExpr,
-      'annotations': dict((a.name, a.value) for a in project.annotations),
-      'gitdir': project.gitdir,
-      'worktree': project.worktree,
-      'upstream': project.upstream,
-      'dest_branch': project.dest_branch,
-    }
-
   def ValidateOptions(self, opt, args):
     if not opt.command:
       self.Usage()
@@ -195,6 +177,11 @@ without iterating through the remaining projects.
       cmd.append(cmd[0])
     cmd.extend(opt.command[1:])
 
+    # Historically, forall operated interactively, and in serial.  If the user
+    # has selected 1 job, then default to interactive mode.
+    if opt.jobs == 1:
+      opt.interactive = True
+
     if opt.project_header \
        and not shell \
        and cmd[0] == 'git':
@ -234,60 +221,50 @@ without iterating through the remaining projects.
os.environ['REPO_COUNT'] = str(len(projects)) os.environ['REPO_COUNT'] = str(len(projects))
pool = multiprocessing.Pool(opt.jobs, InitWorker)
try: try:
config = self.manifest.manifestProject.config config = self.manifest.manifestProject.config
with multiprocessing.Pool(opt.jobs, InitWorker) as pool:
results_it = pool.imap( results_it = pool.imap(
DoWorkWrapper, functools.partial(DoWorkWrapper, mirror, opt, cmd, shell, config),
self.ProjectArgs(projects, mirror, opt, cmd, shell, config)) enumerate(projects),
pool.close() chunksize=WORKER_BATCH_SIZE)
for r in results_it: first = True
for (r, output) in results_it:
if output:
if first:
first = False
elif opt.project_header:
print()
# To simplify the DoWorkWrapper, take care of automatic newlines.
end = '\n'
if output[-1] == '\n':
end = ''
print(output, end=end)
rc = rc or r rc = rc or r
if r != 0 and opt.abort_on_errors: if r != 0 and opt.abort_on_errors:
raise Exception('Aborting due to previous error') raise Exception('Aborting due to previous error')
except (KeyboardInterrupt, WorkerKeyboardInterrupt): except (KeyboardInterrupt, WorkerKeyboardInterrupt):
# Catch KeyboardInterrupt raised inside and outside of workers # Catch KeyboardInterrupt raised inside and outside of workers
print('Interrupted - terminating the pool')
pool.terminate()
rc = rc or errno.EINTR rc = rc or errno.EINTR
except Exception as e: except Exception as e:
# Catch any other exceptions raised # Catch any other exceptions raised
print('Got an error, terminating the pool: %s: %s' % print('forall: unhandled error, terminating the pool: %s: %s' %
(type(e).__name__, e), (type(e).__name__, e),
file=sys.stderr) file=sys.stderr)
pool.terminate()
rc = rc or getattr(e, 'errno', 1) rc = rc or getattr(e, 'errno', 1)
finally:
pool.join()
if rc != 0: if rc != 0:
sys.exit(rc) sys.exit(rc)
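The rewritten `Execute` binds the shared arguments with `functools.partial` and feeds `enumerate(projects)` through `pool.imap` with a chunk size, replacing the hand-rolled argument generator. A self-contained sketch of that dispatch pattern (names are illustrative, not repo's):

```python
import functools
import multiprocessing

def scale(factor, item):
    # Worker: 'factor' arrives pre-bound via functools.partial;
    # 'item' is one (index, value) pair produced by enumerate().
    cnt, value = item
    return (cnt, value * factor)

if __name__ == '__main__':
    items = list(enumerate([3, 1, 4, 1, 5]))
    with multiprocessing.Pool(2) as pool:
        # imap preserves input order, so results line up with indices.
        results = list(pool.imap(functools.partial(scale, 10), items,
                                 chunksize=2))
    print(results)  # [(0, 30), (1, 10), (2, 40), (3, 10), (4, 50)]
```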
def ProjectArgs(self, projects, mirror, opt, cmd, shell, config):
for cnt, p in enumerate(projects):
try:
project = self._SerializeProject(p)
except Exception as e:
print('Project list error on project %s: %s: %s' %
(p.name, type(e).__name__, e),
file=sys.stderr)
return
except KeyboardInterrupt:
print('Project list interrupted',
file=sys.stderr)
return
yield [mirror, opt, cmd, shell, cnt, config, project]
class WorkerKeyboardInterrupt(Exception): class WorkerKeyboardInterrupt(Exception):
""" Keyboard interrupt exception for worker processes. """ """ Keyboard interrupt exception for worker processes. """
pass
def InitWorker(): def InitWorker():
signal.signal(signal.SIGINT, signal.SIG_IGN) signal.signal(signal.SIGINT, signal.SIG_IGN)
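`InitWorker` runs once in each child so that Ctrl-C is delivered only to the parent, which then shuts the pool down on the workers' behalf. The effect can be sketched as:

```python
import multiprocessing
import signal

def init_worker():
    # Children ignore SIGINT so a Ctrl-C interrupts only the parent,
    # which then terminates the pool cleanly.
    signal.signal(signal.SIGINT, signal.SIG_IGN)

def work(n):
    return n * n

if __name__ == '__main__':
    with multiprocessing.Pool(2, init_worker) as pool:
        print(pool.map(work, range(4)))  # [0, 1, 4, 9]
```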
def DoWorkWrapper(args): def DoWorkWrapper(mirror, opt, cmd, shell, config, args):
""" A wrapper around the DoWork() method. """ A wrapper around the DoWork() method.
Catch the KeyboardInterrupt exceptions here and re-raise them as a different, Catch the KeyboardInterrupt exceptions here and re-raise them as a different,
@ -295,11 +272,11 @@ def DoWorkWrapper(args):
and making the parent hang indefinitely. and making the parent hang indefinitely.
""" """
project = args.pop() cnt, project = args
try: try:
return DoWork(project, *args) return DoWork(project, mirror, opt, cmd, shell, cnt, config)
except KeyboardInterrupt: except KeyboardInterrupt:
print('%s: Worker interrupted' % project['name']) print('%s: Worker interrupted' % project.name)
raise WorkerKeyboardInterrupt() raise WorkerKeyboardInterrupt()
@ -311,94 +288,65 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
val = '' val = ''
env[name] = val env[name] = val
setenv('REPO_PROJECT', project['name']) setenv('REPO_PROJECT', project.name)
setenv('REPO_PATH', project['relpath']) setenv('REPO_PATH', project.relpath)
setenv('REPO_REMOTE', project['remote_name']) setenv('REPO_REMOTE', project.remote.name)
setenv('REPO_LREV', project['lrev']) try:
setenv('REPO_RREV', project['rrev']) # If we aren't in a fully synced state and we don't have the ref the manifest
setenv('REPO_UPSTREAM', project['upstream']) # wants, then this will fail. Ignore it for the purposes of this code.
setenv('REPO_DEST_BRANCH', project['dest_branch']) lrev = '' if mirror else project.GetRevisionId()
except ManifestInvalidRevisionError:
lrev = ''
setenv('REPO_LREV', lrev)
setenv('REPO_RREV', project.revisionExpr)
setenv('REPO_UPSTREAM', project.upstream)
setenv('REPO_DEST_BRANCH', project.dest_branch)
setenv('REPO_I', str(cnt + 1)) setenv('REPO_I', str(cnt + 1))
for name in project['annotations']: for annotation in project.annotations:
setenv("REPO__%s" % (name), project['annotations'][name]) setenv("REPO__%s" % (annotation.name), annotation.value)
if mirror: if mirror:
setenv('GIT_DIR', project['gitdir']) setenv('GIT_DIR', project.gitdir)
cwd = project['gitdir'] cwd = project.gitdir
else: else:
cwd = project['worktree'] cwd = project.worktree
if not os.path.exists(cwd): if not os.path.exists(cwd):
# Allow the user to silently ignore missing checkouts so they can run on # Allow the user to silently ignore missing checkouts so they can run on
# partial checkouts (good for infra recovery tools). # partial checkouts (good for infra recovery tools).
if opt.ignore_missing: if opt.ignore_missing:
return 0 return (0, '')
output = ''
if ((opt.project_header and opt.verbose) if ((opt.project_header and opt.verbose)
or not opt.project_header): or not opt.project_header):
print('skipping %s/' % project['relpath'], file=sys.stderr) output = 'skipping %s/' % project.relpath
return 1 return (1, output)
if opt.project_header: if opt.verbose:
stdin = subprocess.PIPE stderr = subprocess.STDOUT
stdout = subprocess.PIPE
stderr = subprocess.PIPE
else: else:
stdin = None stderr = subprocess.DEVNULL
stdout = None
stderr = None
p = subprocess.Popen(cmd, stdin = None if opt.interactive else subprocess.DEVNULL
cwd=cwd,
shell=shell,
env=env,
stdin=stdin,
stdout=stdout,
stderr=stderr)
result = subprocess.run(
cmd, cwd=cwd, shell=shell, env=env, check=False,
encoding='utf-8', errors='replace',
stdin=stdin, stdout=subprocess.PIPE, stderr=stderr)
output = result.stdout
if opt.project_header: if opt.project_header:
if output:
buf = io.StringIO()
out = ForallColoring(config) out = ForallColoring(config)
out.redirect(sys.stdout) out.redirect(buf)
empty = True
errbuf = ''
p.stdin.close()
s_in = platform_utils.FileDescriptorStreams.create()
s_in.add(p.stdout, sys.stdout, 'stdout')
s_in.add(p.stderr, sys.stderr, 'stderr')
while not s_in.is_done:
in_ready = s_in.select()
for s in in_ready:
buf = s.read().decode()
if not buf:
s_in.remove(s)
s.close()
continue
if not opt.verbose:
if s.std_name == 'stderr':
errbuf += buf
continue
if empty and out:
if not cnt == 0:
out.nl()
if mirror: if mirror:
project_header_path = project['name'] project_header_path = project.name
else: else:
project_header_path = project['relpath'] project_header_path = project.relpath
out.project('project %s/', project_header_path) out.project('project %s/' % project_header_path)
out.nl() out.nl()
out.flush() buf.write(output)
if errbuf: output = buf.getvalue()
sys.stderr.write(errbuf) return (result.returncode, output)
sys.stderr.flush()
errbuf = ''
empty = False
s.dest.write(buf)
s.dest.flush()
r = p.wait()
return r
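The new `DoWork` replaces the old `Popen` plus stream-multiplexing loop with a single blocking `subprocess.run` that captures decoded output and returns it to the parent for atomic printing. The core of that change, roughly (a sketch, not repo's exact code):

```python
import subprocess
import sys

def run_captured(cmd, cwd=None, verbose=False, interactive=False):
    # Capture stdout (optionally folding stderr into it) so each
    # project's output can be printed by the parent as one block.
    result = subprocess.run(
        cmd, cwd=cwd, shell=False, check=False,
        encoding='utf-8', errors='replace',
        stdin=None if interactive else subprocess.DEVNULL,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT if verbose else subprocess.DEVNULL)
    return (result.returncode, result.stdout)

rc, output = run_captured([sys.executable, '-c', 'print("hello")'])
```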


@ -47,14 +47,7 @@ use for this GITC client.
""" """
def _Options(self, p): def _Options(self, p):
super(GitcInit, self)._Options(p, gitc_init=True) super()._Options(p, gitc_init=True)
g = p.add_option_group('GITC options')
g.add_option('-f', '--manifest-file',
dest='manifest_file',
help='Optional manifest file to use for this GITC client.')
g.add_option('-c', '--gitc-client',
dest='gitc_client',
help='The name of the gitc_client instance to create or modify.')
def Execute(self, opt, args): def Execute(self, opt, args):
gitc_client = gitc_utils.parse_clientdir(os.getcwd()) gitc_client = gitc_utils.parse_clientdir(os.getcwd())
@ -64,7 +57,7 @@ use for this GITC client.
sys.exit(1) sys.exit(1)
self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(), self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client) gitc_client)
super(GitcInit, self).Execute(opt, args) super().Execute(opt, args)
manifest_file = self.manifest.manifestFile manifest_file = self.manifest.manifestFile
if opt.manifest_file: if opt.manifest_file:


@ -12,10 +12,11 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import functools
import sys import sys
from color import Coloring from color import Coloring
from command import PagedCommand from command import DEFAULT_LOCAL_JOBS, PagedCommand
from error import GitError from error import GitError
from git_command import GitCommand from git_command import GitCommand
@ -61,12 +62,10 @@ contain a line that matches both expressions:
repo grep --all-match -e NODE -e Unexpected repo grep --all-match -e NODE -e Unexpected
""" """
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p): @staticmethod
def carry(option, def _carry_option(_option, opt_str, value, parser):
opt_str,
value,
parser):
pt = getattr(parser.values, 'cmd_argv', None) pt = getattr(parser.values, 'cmd_argv', None)
if pt is None: if pt is None:
pt = [] pt = []
@ -82,9 +81,14 @@ contain a line that matches both expressions:
if value is not None: if value is not None:
pt.append(value) pt.append(value)
def _CommonOptions(self, p):
"""Override common options slightly."""
super()._CommonOptions(p, opt_v=False)
def _Options(self, p):
g = p.add_option_group('Sources') g = p.add_option_group('Sources')
g.add_option('--cached', g.add_option('--cached',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Search the index, instead of the work tree') help='Search the index, instead of the work tree')
g.add_option('-r', '--revision', g.add_option('-r', '--revision',
dest='revision', action='append', metavar='TREEish', dest='revision', action='append', metavar='TREEish',
@ -92,68 +96,134 @@ contain a line that matches both expressions:
g = p.add_option_group('Pattern') g = p.add_option_group('Pattern')
g.add_option('-e', g.add_option('-e',
action='callback', callback=carry, action='callback', callback=self._carry_option,
metavar='PATTERN', type='str', metavar='PATTERN', type='str',
help='Pattern to search for') help='Pattern to search for')
g.add_option('-i', '--ignore-case', g.add_option('-i', '--ignore-case',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Ignore case differences') help='Ignore case differences')
g.add_option('-a', '--text', g.add_option('-a', '--text',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help="Process binary files as if they were text") help="Process binary files as if they were text")
g.add_option('-I', g.add_option('-I',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help="Don't match the pattern in binary files") help="Don't match the pattern in binary files")
g.add_option('-w', '--word-regexp', g.add_option('-w', '--word-regexp',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Match the pattern only at word boundaries') help='Match the pattern only at word boundaries')
g.add_option('-v', '--invert-match', g.add_option('-v', '--invert-match',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Select non-matching lines') help='Select non-matching lines')
g.add_option('-G', '--basic-regexp', g.add_option('-G', '--basic-regexp',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Use POSIX basic regexp for patterns (default)') help='Use POSIX basic regexp for patterns (default)')
g.add_option('-E', '--extended-regexp', g.add_option('-E', '--extended-regexp',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Use POSIX extended regexp for patterns') help='Use POSIX extended regexp for patterns')
g.add_option('-F', '--fixed-strings', g.add_option('-F', '--fixed-strings',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Use fixed strings (not regexp) for pattern') help='Use fixed strings (not regexp) for pattern')
g = p.add_option_group('Pattern Grouping') g = p.add_option_group('Pattern Grouping')
g.add_option('--all-match', g.add_option('--all-match',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Limit match to lines that have all patterns') help='Limit match to lines that have all patterns')
g.add_option('--and', '--or', '--not', g.add_option('--and', '--or', '--not',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Boolean operators to combine patterns') help='Boolean operators to combine patterns')
g.add_option('-(', '-)', g.add_option('-(', '-)',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Boolean operator grouping') help='Boolean operator grouping')
g = p.add_option_group('Output') g = p.add_option_group('Output')
g.add_option('-n', g.add_option('-n',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Prefix the line number to matching lines') help='Prefix the line number to matching lines')
g.add_option('-C', g.add_option('-C',
action='callback', callback=carry, action='callback', callback=self._carry_option,
metavar='CONTEXT', type='str', metavar='CONTEXT', type='str',
help='Show CONTEXT lines around match') help='Show CONTEXT lines around match')
g.add_option('-B', g.add_option('-B',
action='callback', callback=carry, action='callback', callback=self._carry_option,
metavar='CONTEXT', type='str', metavar='CONTEXT', type='str',
help='Show CONTEXT lines before match') help='Show CONTEXT lines before match')
g.add_option('-A', g.add_option('-A',
action='callback', callback=carry, action='callback', callback=self._carry_option,
metavar='CONTEXT', type='str', metavar='CONTEXT', type='str',
help='Show CONTEXT lines after match') help='Show CONTEXT lines after match')
g.add_option('-l', '--name-only', '--files-with-matches', g.add_option('-l', '--name-only', '--files-with-matches',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Show only file names containing matching lines') help='Show only file names containing matching lines')
g.add_option('-L', '--files-without-match', g.add_option('-L', '--files-without-match',
action='callback', callback=carry, action='callback', callback=self._carry_option,
help='Show only file names not containing matching lines') help='Show only file names not containing matching lines')
def _ExecuteOne(self, cmd_argv, project):
"""Process one project."""
try:
p = GitCommand(project,
cmd_argv,
bare=False,
capture_stdout=True,
capture_stderr=True)
except GitError as e:
return (project, -1, None, str(e))
return (project, p.Wait(), p.stdout, p.stderr)
@staticmethod
def _ProcessResults(full_name, have_rev, _pool, out, results):
git_failed = False
bad_rev = False
have_match = False
for project, rc, stdout, stderr in results:
if rc < 0:
git_failed = True
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', stderr)
out.nl()
continue
if rc:
# no results
if stderr:
if have_rev and 'fatal: ambiguous argument' in stderr:
bad_rev = True
else:
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', stderr.strip())
out.nl()
continue
have_match = True
# We cut the last element, to avoid a blank line.
r = stdout.split('\n')
r = r[0:-1]
if have_rev and full_name:
for line in r:
rev, line = line.split(':', 1)
out.write("%s", rev)
out.write(':')
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
elif full_name:
for line in r:
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
else:
for line in r:
print(line)
return (git_failed, bad_rev, have_match)
def Execute(self, opt, args): def Execute(self, opt, args):
out = GrepColoring(self.manifest.manifestProject.config) out = GrepColoring(self.manifest.manifestProject.config)
@ -185,62 +255,13 @@ contain a line that matches both expressions:
cmd_argv.extend(opt.revision) cmd_argv.extend(opt.revision)
cmd_argv.append('--') cmd_argv.append('--')
git_failed = False git_failed, bad_rev, have_match = self.ExecuteInParallel(
bad_rev = False opt.jobs,
have_match = False functools.partial(self._ExecuteOne, cmd_argv),
projects,
for project in projects: callback=functools.partial(self._ProcessResults, full_name, have_rev),
try: output=out,
p = GitCommand(project, ordered=True)
cmd_argv,
bare=False,
capture_stdout=True,
capture_stderr=True)
except GitError as e:
git_failed = True
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', str(e))
out.nl()
continue
if p.Wait() != 0:
# no results
#
if p.stderr:
if have_rev and 'fatal: ambiguous argument' in p.stderr:
bad_rev = True
else:
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', p.stderr.strip())
out.nl()
continue
have_match = True
# We cut the last element, to avoid a blank line.
#
r = p.stdout.split('\n')
r = r[0:-1]
if have_rev and full_name:
for line in r:
rev, line = line.split(':', 1)
out.write("%s", rev)
out.write(':')
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
elif full_name:
for line in r:
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
else:
for line in r:
print(line)
if git_failed: if git_failed:
sys.exit(1) sys.exit(1)


@ -14,7 +14,7 @@
import re import re
import sys import sys
from formatter import AbstractFormatter, DumbWriter import textwrap
from subcmds import all_commands from subcmds import all_commands
from color import Coloring from color import Coloring
@ -84,8 +84,7 @@ Displays detailed usage information about a command.
def __init__(self, gc): def __init__(self, gc):
Coloring.__init__(self, gc, 'help') Coloring.__init__(self, gc, 'help')
self.heading = self.printer('heading', attr='bold') self.heading = self.printer('heading', attr='bold')
self._first = True
self.wrap = AbstractFormatter(DumbWriter())
def _PrintSection(self, heading, bodyAttr): def _PrintSection(self, heading, bodyAttr):
try: try:
@ -95,7 +94,9 @@ Displays detailed usage information about a command.
if body == '' or body is None: if body == '' or body is None:
return return
if not self._first:
self.nl() self.nl()
self._first = False
self.heading('%s%s', header_prefix, heading) self.heading('%s%s', header_prefix, heading)
self.nl() self.nl()
@ -105,7 +106,8 @@ Displays detailed usage information about a command.
body = body.strip() body = body.strip()
body = body.replace('%prog', me) body = body.replace('%prog', me)
asciidoc_hdr = re.compile(r'^\n?#+ (.+)$') # Extract the title, but skip any trailing {#anchors}.
asciidoc_hdr = re.compile(r'^\n?#+ ([^{]+)(\{#.+\})?$')
for para in body.split("\n\n"): for para in body.split("\n\n"):
if para.startswith(' '): if para.startswith(' '):
self.write('%s', para) self.write('%s', para)
@ -120,9 +122,12 @@ Displays detailed usage information about a command.
self.nl() self.nl()
continue continue
self.wrap.add_flowing_data(para) lines = textwrap.wrap(para.replace(' ', ' '), width=80,
self.wrap.end_paragraph(1) break_long_words=False, break_on_hyphens=False)
self.wrap.end_paragraph(0) for line in lines:
self.write('%s', line)
self.nl()
self.nl()
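The `formatter` module was deprecated and removed in Python 3.10, so help paragraphs are now wrapped with `textwrap` directly; `break_long_words=False` and `break_on_hyphens=False` keep flags and paths like `--no-clone-bundle` intact. For example:

```python
import textwrap

para = ('Use --no-clone-bundle to disable use of /clone.bundle '
        'on HTTP/HTTPS when syncing the manifest repository.')
lines = textwrap.wrap(para.replace('  ', ' '), width=40,
                      break_long_words=False, break_on_hyphens=False)
for line in lines:
    print(line)
```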
out = _Out(self.client.globalConfig) out = _Out(self.client.globalConfig)
out._PrintSection('Summary', 'helpSummary') out._PrintSection('Summary', 'helpSummary')


@ -32,9 +32,9 @@ from wrapper import Wrapper
class Init(InteractiveCommand, MirrorSafeCommand): class Init(InteractiveCommand, MirrorSafeCommand):
common = True common = True
helpSummary = "Initialize repo in the current directory" helpSummary = "Initialize a repo client checkout in the current directory"
helpUsage = """ helpUsage = """
%prog [options] %prog [options] [manifest url]
""" """
helpDescription = """ helpDescription = """
The '%prog' command is run once to install and initialize repo. The '%prog' command is run once to install and initialize repo.
@ -42,9 +42,13 @@ The latest repo source code and manifest collection is downloaded
from the server and is installed in the .repo/ directory in the from the server and is installed in the .repo/ directory in the
current working directory. current working directory.
When creating a new checkout, the manifest URL is the only required setting.
It may be specified using the --manifest-url option, or as the first optional
argument.
The optional -b argument can be used to select the manifest branch The optional -b argument can be used to select the manifest branch
to checkout and use. If no branch is specified, the remote's default to checkout and use. If no branch is specified, the remote's default
branch is used. branch is used. This is equivalent to using -b HEAD.
The optional -m argument can be used to specify an alternate manifest The optional -m argument can be used to specify an alternate manifest
to be used. If no manifest is specified, the manifest default.xml to be used. If no manifest is specified, the manifest default.xml
@ -75,117 +79,25 @@ manifest, a subsequent `repo sync` (or `repo sync -d`) is necessary
to update the working directory files. to update the working directory files.
""" """
def _CommonOptions(self, p):
"""Disable due to re-use of Wrapper()."""
def _Options(self, p, gitc_init=False): def _Options(self, p, gitc_init=False):
# Logging Wrapper().InitParser(p, gitc_init=gitc_init)
g = p.add_option_group('Logging options')
g.add_option('-v', '--verbose',
dest='output_mode', action='store_true',
help='show all output')
g.add_option('-q', '--quiet',
dest='output_mode', action='store_false',
help='only show errors')
# Manifest
g = p.add_option_group('Manifest options')
g.add_option('-u', '--manifest-url',
dest='manifest_url',
help='manifest repository location', metavar='URL')
g.add_option('-b', '--manifest-branch',
dest='manifest_branch',
help='manifest branch or revision', metavar='REVISION')
cbr_opts = ['--current-branch']
# The gitc-init subcommand allocates -c itself, but a lot of init users
# want -c, so try to satisfy both as best we can.
if not gitc_init:
cbr_opts += ['-c']
g.add_option(*cbr_opts,
dest='current_branch_only', action='store_true',
help='fetch only current manifest branch from server')
g.add_option('-m', '--manifest-name',
dest='manifest_name', default='default.xml',
help='initial manifest file', metavar='NAME.xml')
g.add_option('--mirror',
dest='mirror', action='store_true',
help='create a replica of the remote repositories '
'rather than a client working directory')
g.add_option('--reference',
dest='reference',
help='location of mirror directory', metavar='DIR')
g.add_option('--dissociate',
dest='dissociate', action='store_true',
help='dissociate from reference mirrors after clone')
g.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
g.add_option('--partial-clone', action='store_true',
dest='partial_clone',
help='perform partial clone (https://git-scm.com/'
'docs/gitrepository-layout#_code_partialclone_code)')
g.add_option('--clone-filter', action='store', default='blob:none',
dest='clone_filter',
help='filter for use with --partial-clone [default: %default]')
# TODO(vapier): Expose option with real help text once this has been in the
# wild for a while w/out significant bug reports. Goal is by ~Sep 2020.
g.add_option('--worktree', action='store_true',
help=optparse.SUPPRESS_HELP)
g.add_option('--archive',
dest='archive', action='store_true',
help='checkout an archive instead of a git repository for '
'each project. See git archive.')
g.add_option('--submodules',
dest='submodules', action='store_true',
help='sync any submodules associated with the manifest repo')
g.add_option('--use-superproject', action='store_true',
help='use the manifest superproject to sync projects')
g.add_option('--no-use-superproject', action='store_false',
dest='use_superproject',
help='disable use of manifest superprojects')
g.add_option('-g', '--groups',
dest='groups', default='default',
help='restrict manifest projects to ones with specified '
'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]',
metavar='GROUP')
g.add_option('-p', '--platform',
dest='platform', default='auto',
help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM')
g.add_option('--clone-bundle', action='store_true',
help='force use of /clone.bundle on HTTP/HTTPS (default if not --partial-clone)')
g.add_option('--no-clone-bundle',
dest='clone_bundle', action='store_false',
help='disable use of /clone.bundle on HTTP/HTTPS (default if --partial-clone)')
g.add_option('--no-tags',
dest='tags', default=True, action='store_false',
help="don't fetch tags in the manifest")
# Tool
g = p.add_option_group('repo Version options')
g.add_option('--repo-url',
dest='repo_url',
help='repo repository location', metavar='URL')
g.add_option('--repo-rev', metavar='REV',
help='repo branch or revision')
g.add_option('--repo-branch', dest='repo_rev',
help=optparse.SUPPRESS_HELP)
g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false',
help='do not verify repo source code')
# Other
g = p.add_option_group('Other options')
g.add_option('--config-name',
dest='config_name', action="store_true", default=False,
help='Always prompt for name/e-mail')
def _RegisteredEnvironmentOptions(self): def _RegisteredEnvironmentOptions(self):
return {'REPO_MANIFEST_URL': 'manifest_url', return {'REPO_MANIFEST_URL': 'manifest_url',
'REPO_MIRROR_LOCATION': 'reference'} 'REPO_MIRROR_LOCATION': 'reference'}
def _CloneSuperproject(self): def _CloneSuperproject(self, opt):
"""Clone the superproject based on the superproject's url and branch.""" """Clone the superproject based on the superproject's url and branch.
Args:
opt: Program options returned from optparse. See _Options().
"""
superproject = git_superproject.Superproject(self.manifest, superproject = git_superproject.Superproject(self.manifest,
self.repodir) self.repodir,
quiet=opt.quiet)
if not superproject.Sync(): if not superproject.Sync():
print('error: git update of superproject failed', file=sys.stderr) print('error: git update of superproject failed', file=sys.stderr)
sys.exit(1) sys.exit(1)
@ -196,7 +108,7 @@ to update the working directory files.
if is_new: if is_new:
if not opt.manifest_url: if not opt.manifest_url:
print('fatal: manifest url (-u) is required.', file=sys.stderr) print('fatal: manifest url is required.', file=sys.stderr)
sys.exit(1) sys.exit(1)
if not opt.quiet: if not opt.quiet:
@ -228,6 +140,11 @@ to update the working directory files.
r.Save() r.Save()
if opt.manifest_branch: if opt.manifest_branch:
if opt.manifest_branch == 'HEAD':
opt.manifest_branch = m.ResolveRemoteHead()
if opt.manifest_branch is None:
print('fatal: unable to resolve HEAD', file=sys.stderr)
sys.exit(1)
m.revisionExpr = opt.manifest_branch m.revisionExpr = opt.manifest_branch
else: else:
if is_new: if is_new:
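With `-b HEAD` (now the documented default behavior), `m.ResolveRemoteHead()` asks the remote which branch its HEAD points at, which amounts to parsing `git ls-remote --symref <url> HEAD` output. A hedged sketch of that parsing (repo's actual implementation may differ in detail):

```python
def resolve_remote_head(ls_remote_output):
    # `git ls-remote --symref <url> HEAD` prints something like:
    #   ref: refs/heads/main\tHEAD
    #   91c0a2a0b1f0...\tHEAD
    # The "ref:" symref line names the remote's default branch.
    for line in ls_remote_output.splitlines():
        if line.startswith('ref: ') and line.endswith('\tHEAD'):
            return line.split()[1]
    return None  # remote did not report a symref; HEAD is unresolvable

out = 'ref: refs/heads/main\tHEAD\n91c0a2a0b1f0\tHEAD\n'
print(resolve_remote_head(out))  # refs/heads/main
```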
@ -256,7 +173,7 @@ to update the working directory files.
groups = [x for x in groups if x] groups = [x for x in groups if x]
groupstr = ','.join(groups) groupstr = ','.join(groups)
if opt.platform == 'auto' and groupstr == 'default,platform-' + platform.system().lower(): if opt.platform == 'auto' and groupstr == self.manifest.GetDefaultGroupsStr():
groupstr = None groupstr = None
m.config.SetString('manifest.groups', groupstr) m.config.SetString('manifest.groups', groupstr)
@ -300,7 +217,7 @@ to update the working directory files.
'in another location.', file=sys.stderr) 'in another location.', file=sys.stderr)
sys.exit(1) sys.exit(1)
if opt.partial_clone: if opt.partial_clone is not None:
if opt.mirror: if opt.mirror:
print('fatal: --mirror and --partial-clone are mutually exclusive', print('fatal: --mirror and --partial-clone are mutually exclusive',
file=sys.stderr) file=sys.stderr)
@ -308,9 +225,14 @@ to update the working directory files.
m.config.SetBoolean('repo.partialclone', opt.partial_clone) m.config.SetBoolean('repo.partialclone', opt.partial_clone)
if opt.clone_filter: if opt.clone_filter:
m.config.SetString('repo.clonefilter', opt.clone_filter) m.config.SetString('repo.clonefilter', opt.clone_filter)
elif m.config.GetBoolean('repo.partialclone'):
opt.clone_filter = m.config.GetString('repo.clonefilter')
else: else:
opt.clone_filter = None opt.clone_filter = None
if opt.partial_clone_exclude is not None:
m.config.SetString('repo.partialcloneexclude', opt.partial_clone_exclude)
if opt.clone_bundle is None: if opt.clone_bundle is None:
opt.clone_bundle = False if opt.partial_clone else True opt.clone_bundle = False if opt.partial_clone else True
else: else:
@ -326,7 +248,8 @@ to update the working directory files.
clone_bundle=opt.clone_bundle, clone_bundle=opt.clone_bundle,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
tags=opt.tags, submodules=opt.submodules, tags=opt.tags, submodules=opt.submodules,
clone_filter=opt.clone_filter): clone_filter=opt.clone_filter,
partial_clone_exclude=self.manifest.PartialCloneExclude):
r = m.GetRemote(m.remote.name) r = m.GetRemote(m.remote.name)
print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr) print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
@ -498,7 +421,15 @@ to update the working directory files.
self.OptionParser.error('--mirror and --archive cannot be used together.') self.OptionParser.error('--mirror and --archive cannot be used together.')
if args: if args:
self.OptionParser.error('init takes no arguments') if opt.manifest_url:
self.OptionParser.error(
'--manifest-url option and URL argument both specified: only use '
'one to select the manifest URL.')
opt.manifest_url = args.pop(0)
if args:
self.OptionParser.error('too many arguments to init')
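The new validation accepts the manifest URL either via `--manifest-url` or as a single positional argument, erroring out if both (or extra arguments) are supplied. The same logic in isolation (a hypothetical standalone version, not repo's actual parser wiring):

```python
import optparse

def parse_init_argv(argv):
    # Accept the manifest URL via -u/--manifest-url or as the first
    # positional argument, but never both.
    parser = optparse.OptionParser()
    parser.add_option('-u', '--manifest-url', dest='manifest_url')
    opts, args = parser.parse_args(argv)
    if args:
        if opts.manifest_url:
            raise ValueError(
                '--manifest-url option and URL argument both specified')
        opts.manifest_url = args.pop(0)
    if args:
        raise ValueError('too many arguments to init')
    return opts.manifest_url
```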
def Execute(self, opt, args): def Execute(self, opt, args):
git_require(MIN_GIT_VERSION_HARD, fail=True) git_require(MIN_GIT_VERSION_HARD, fail=True)
@ -508,9 +439,6 @@ to update the working directory files.
% ('.'.join(str(x) for x in MIN_GIT_VERSION_SOFT),), % ('.'.join(str(x) for x in MIN_GIT_VERSION_SOFT),),
file=sys.stderr) file=sys.stderr)
opt.quiet = opt.output_mode is False
opt.verbose = opt.output_mode is True
rp = self.manifest.repoProject rp = self.manifest.repoProject
# Handle new --repo-url requests. # Handle new --repo-url requests.
@ -537,7 +465,7 @@ to update the working directory files.
self._LinkManifest(opt.manifest_name) self._LinkManifest(opt.manifest_name)
if self.manifest.manifestProject.config.GetBoolean('repo.superproject'): if self.manifest.manifestProject.config.GetBoolean('repo.superproject'):
self._CloneSuperproject() self._CloneSuperproject(opt)
if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror: if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
if opt.config_name or self._ShouldConfigureUser(opt): if opt.config_name or self._ShouldConfigureUser(opt):


@@ -20,11 +20,16 @@ class List(Command, MirrorSafeCommand):
   helpSummary = "List projects and their associated directories"
   helpUsage = """
 %prog [-f] [<project>...]
-%prog [-f] -r str1 [str2]..."
+%prog [-f] -r str1 [str2]...
 """
   helpDescription = """
 List all projects; pass '.' to list the project for the cwd.

+By default, only projects that currently exist in the checkout are shown.  If
+you want to list all projects (using the specified filter settings), use the
+--all option.  If you want to show all projects regardless of the manifest
+groups, then also pass --groups all.
+
 This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
 """
@@ -35,6 +40,9 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
     p.add_option('-g', '--groups',
                  dest='groups',
                  help="Filter the project list based on the groups the project is in")
+    p.add_option('-a', '--all',
+                 action='store_true',
+                 help='Show projects regardless of checkout state')
     p.add_option('-f', '--fullpath',
                  dest='fullpath', action='store_true',
                  help="Display the full work tree path instead of the relative path")
@@ -61,7 +69,7 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
       args: Positional args.  Can be a list of projects to list, or empty.
     """
     if not opt.regex:
-      projects = self.GetProjects(args, groups=opt.groups)
+      projects = self.GetProjects(args, groups=opt.groups, missing_ok=opt.all)
     else:
       projects = self.FindProjects(args)
@@ -79,5 +87,6 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
       else:
         lines.append("%s : %s" % (_getpath(project), project.name))

+    if lines:
-    lines.sort()
-    print('\n'.join(lines))
+      lines.sort()
+      print('\n'.join(lines))


@@ -12,8 +12,10 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import itertools
+
 from color import Coloring
-from command import PagedCommand
+from command import DEFAULT_LOCAL_JOBS, PagedCommand


 class Prune(PagedCommand):
@@ -22,11 +24,26 @@ class Prune(PagedCommand):
   helpUsage = """
 %prog [<project>...]
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
+
+  def _ExecuteOne(self, project):
+    """Process one project."""
+    return project.PruneHeads()

   def Execute(self, opt, args):
-    all_branches = []
-    for project in self.GetProjects(args):
-      all_branches.extend(project.PruneHeads())
+    projects = self.GetProjects(args)
+
+    # NB: Should be able to refactor this module to display summary as results
+    # come back from children.
+    def _ProcessResults(_pool, _output, results):
+      return list(itertools.chain.from_iterable(results))
+
+    all_branches = self.ExecuteInParallel(
+        opt.jobs,
+        self._ExecuteOne,
+        projects,
+        callback=_ProcessResults,
+        ordered=True)

     if not all_branches:
       return

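The prune callback above collects one list of pruned branches from each worker and flattens them with `itertools.chain.from_iterable`. A minimal sketch of that pattern, with `prune_heads` as a hypothetical stand-in for `project.PruneHeads()`:

```python
import itertools

def prune_heads(project):
    # Hypothetical stand-in for project.PruneHeads(): each worker returns a
    # list of pruned-branch names (possibly empty) for one project.
    return ['%s/stale-%d' % (project, i) for i in range(2)]

# Each per-project result is itself a list; the callback flattens the
# list-of-lists into one list of branches, exactly like _ProcessResults.
results = [prune_heads(p) for p in ['foo', 'bar']]
all_branches = list(itertools.chain.from_iterable(results))
```

`chain.from_iterable` avoids building intermediate lists the way repeated `extend()` calls in the old serial loop did.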

@@ -39,7 +39,8 @@ branch but need to incorporate new upstream changes "underneath" them.
 """

   def _Options(self, p):
-    p.add_option('-i', '--interactive',
+    g = p.get_option_group('--quiet')
+    g.add_option('-i', '--interactive',
                  dest="interactive", action="store_true",
                  help="interactive rebase (single project only)")
@@ -52,9 +53,6 @@ branch but need to incorporate new upstream changes "underneath" them.
     p.add_option('--no-ff',
                  dest='ff', default=True, action='store_false',
                  help='Pass --no-ff to git rebase')
-    p.add_option('-q', '--quiet',
-                 dest='quiet', action='store_true',
-                 help='Pass --quiet to git rebase')
     p.add_option('--autosquash',
                  dest='autosquash', action='store_true',
                  help='Pass --autosquash to git rebase')

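The change above drops the per-command `-q/--quiet` definition and instead looks up the option group that already holds the shared `--quiet` flag, attaching `-i` next to it. A small sketch of that `optparse` idiom (the group title and option texts here are illustrative, not repo's actual ones):

```python
import optparse

p = optparse.OptionParser()
# A shared group that already defines --quiet, as the base Command class would.
g = p.add_option_group('Output')
g.add_option('-q', '--quiet', action='store_true', help='only show errors')

# Look up the existing group by one of its option strings and extend it,
# which keeps related options together in --help output.
g2 = p.get_option_group('--quiet')
g2.add_option('-i', '--interactive', action='store_true',
              help='interactive rebase (single project only)')

opts, _ = p.parse_args(['-i'])
```

`get_option_group()` returns the `OptionGroup` containing the given option string, so subcommands can extend common groups without redefining them.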

@@ -38,7 +38,8 @@ The '%prog' command stages files to prepare the next commit.
 """

   def _Options(self, p):
-    p.add_option('-i', '--interactive',
+    g = p.get_option_group('--quiet')
+    g.add_option('-i', '--interactive',
                  dest='interactive', action='store_true',
                  help='use interactive staging')

@@ -12,10 +12,11 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import functools
 import os
 import sys

-from command import Command
+from command import Command, DEFAULT_LOCAL_JOBS
 from git_config import IsImmutable
 from git_command import git
 import gitc_utils
@@ -33,6 +34,7 @@ class Start(Command):
 '%prog' begins a new branch of development, starting from the
 revision specified in the manifest.
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

   def _Options(self, p):
     p.add_option('--all',
@@ -40,7 +42,8 @@ revision specified in the manifest.
                  help='begin branch in all projects')
     p.add_option('-r', '--rev', '--revision', dest='revision',
                  help='point branch at this revision instead of upstream')
-    p.add_option('--head', dest='revision', action='store_const', const='HEAD',
+    p.add_option('--head', '--HEAD',
+                 dest='revision', action='store_const', const='HEAD',
                  help='abbreviation for --rev HEAD')

   def ValidateOptions(self, opt, args):
@@ -51,6 +54,26 @@ revision specified in the manifest.
     if not git.check_ref_format('heads/%s' % nb):
       self.OptionParser.error("'%s' is not a valid name" % nb)

+  def _ExecuteOne(self, revision, nb, project):
+    """Start one project."""
+    # If the current revision is immutable, such as a SHA1, a tag or
+    # a change, then we can't push back to it.  Substitute with
+    # dest_branch, if defined; or with manifest default revision instead.
+    branch_merge = ''
+    if IsImmutable(project.revisionExpr):
+      if project.dest_branch:
+        branch_merge = project.dest_branch
+      else:
+        branch_merge = self.manifest.default.revisionExpr
+
+    try:
+      ret = project.StartBranch(
+          nb, branch_merge=branch_merge, revision=revision)
+    except Exception as e:
+      print('error: unable to checkout %s: %s' % (project.name, e), file=sys.stderr)
+      ret = False
+    return (ret, project)
+
   def Execute(self, opt, args):
     nb = args[0]
     err = []
@@ -82,11 +105,8 @@ revision specified in the manifest.
     if not os.path.exists(os.getcwd()):
       os.chdir(self.manifest.topdir)

-    pm = Progress('Starting %s' % nb, len(all_projects))
-    for project in all_projects:
-      pm.update()
-
-      if self.gitc_manifest:
+    if self.gitc_manifest:
+      pm = Progress('Syncing %s' % nb, len(all_projects), quiet=opt.quiet)
+      for project in all_projects:
         gitc_project = self.gitc_manifest.paths[project.relpath]
         # Sync projects that have not been opened.
         if not gitc_project.already_synced:
@@ -99,22 +119,22 @@ revision specified in the manifest.
           sync_buf = SyncBuffer(self.manifest.manifestProject.config)
           project.Sync_LocalHalf(sync_buf)
           project.revisionId = gitc_project.old_revision
+        pm.update()
+      pm.end()

-      # If the current revision is immutable, such as a SHA1, a tag or
-      # a change, then we can't push back to it.  Substitute with
-      # dest_branch, if defined; or with manifest default revision instead.
-      branch_merge = ''
-      if IsImmutable(project.revisionExpr):
-        if project.dest_branch:
-          branch_merge = project.dest_branch
-        else:
-          branch_merge = self.manifest.default.revisionExpr
-
-      if not project.StartBranch(
-          nb, branch_merge=branch_merge, revision=opt.revision):
-        err.append(project)
-    pm.end()
+    def _ProcessResults(_pool, pm, results):
+      for (result, project) in results:
+        if not result:
+          err.append(project)
+        pm.update()
+
+    self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, opt.revision, nb),
+        all_projects,
+        callback=_ProcessResults,
+        output=Progress('Starting %s' % (nb,), len(all_projects), quiet=opt.quiet))

     if err:
       for p in err:
         print("error: %s/: cannot start %s" % (p.relpath, nb),

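The rewritten `Execute` binds the fixed arguments (`opt.revision` and the branch name) into `_ExecuteOne` with `functools.partial`, so each worker only receives the varying project. A minimal sketch of that binding and the `(result, project)` error collection, with `execute_one` as a hypothetical stand-in for `Start._ExecuteOne`:

```python
import functools

def execute_one(revision, nb, project):
    # Hypothetical stand-in for Start._ExecuteOne: returns (success, project)
    # so the parent can tell which projects failed to start the branch.
    return (project != 'bad', project)

# Bind the per-invocation constants once; workers get only the project.
worker = functools.partial(execute_one, 'refs/heads/main', 'feature-x')

results = [worker(p) for p in ['good', 'bad']]
err = [proj for ok, proj in results if not ok]
```

Returning the project alongside the success flag is what lets the parent print `cannot start` messages after the pool finishes, without any shared mutable state in the workers.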

@@ -14,10 +14,10 @@

 import functools
 import glob
-import multiprocessing
+import io
 import os

-from command import PagedCommand
+from command import DEFAULT_LOCAL_JOBS, PagedCommand
 from color import Coloring
 import platform_utils
@@ -76,16 +76,12 @@ the following meanings:
   d:  deleted   (    in index, not in work tree )
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

   def _Options(self, p):
-    p.add_option('-j', '--jobs',
-                 dest='jobs', action='store', type='int', default=2,
-                 help="number of projects to check simultaneously")
     p.add_option('-o', '--orphans',
                  dest='orphans', action='store_true',
                  help="include objects in working directory outside of repo projects")
-    p.add_option('-q', '--quiet', action='store_true',
-                 help="only print the name of modified projects")

   def _StatusHelper(self, quiet, project):
     """Obtains the status for a specific project.
@@ -100,7 +96,9 @@ the following meanings:
     Returns:
       The status of the project.
     """
-    return project.PrintWorkTreeStatus(quiet=quiet)
+    buf = io.StringIO()
+    ret = project.PrintWorkTreeStatus(quiet=quiet, output_redir=buf)
+    return (ret, buf.getvalue())

   def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring):
     """find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'"""
@@ -120,17 +118,23 @@ the following meanings:
   def Execute(self, opt, args):
     all_projects = self.GetProjects(args)
-    counter = 0

-    if opt.jobs == 1:
-      for project in all_projects:
-        state = project.PrintWorkTreeStatus(quiet=opt.quiet)
-        if state == 'CLEAN':
-          counter += 1
-    else:
-      with multiprocessing.Pool(opt.jobs) as pool:
-        states = pool.map(functools.partial(self._StatusHelper, opt.quiet), all_projects)
-        counter += states.count('CLEAN')
+    def _ProcessResults(_pool, _output, results):
+      ret = 0
+      for (state, output) in results:
+        if output:
+          print(output, end='')
+        if state == 'CLEAN':
+          ret += 1
+      return ret
+
+    counter = self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._StatusHelper, opt.quiet),
+        all_projects,
+        callback=_ProcessResults,
+        ordered=True)
+
     if not opt.quiet and len(all_projects) == counter:
       print('nothing to commit (working directory clean)')
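The status rewrite above redirects each worker's output into an `io.StringIO` buffer and returns it with the state, so the parent process can print per-project output in a stable order instead of letting workers interleave writes to stdout. A minimal sketch, with `status_helper` as a hypothetical stand-in for `Status._StatusHelper`:

```python
import io

def status_helper(quiet, project):
    # Hypothetical worker: write the project's status into a buffer instead
    # of stdout, then hand both the state and the text back to the parent.
    buf = io.StringIO()
    state = 'CLEAN' if project == 'clean' else 'DIRTY'
    if not (quiet and state == 'CLEAN'):
        print('project %s: %s' % (project, state), file=buf)
    return (state, buf.getvalue())

# The parent replays each buffer in order and tallies clean projects,
# mirroring _ProcessResults.
results = [status_helper(False, p) for p in ['clean', 'dirty']]
counter = sum(1 for state, _ in results if state == 'CLEAN')
```

Buffering is what makes `ordered=True` meaningful: output order follows project order, not worker completion order.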


@@ -12,14 +12,15 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import functools
 import http.cookiejar as cookielib
+import io
 import json
+import multiprocessing
 import netrc
 from optparse import SUPPRESS_HELP
 import os
-import re
 import socket
-import subprocess
 import sys
 import tempfile
 import time
@@ -42,20 +43,15 @@ except ImportError:
   def _rlimit_nofile():
     return (256, 256)

-try:
-  import multiprocessing
-except ImportError:
-  multiprocessing = None
-
 import event_log
-from git_command import GIT, git_require
+from git_command import git_require
 from git_config import GetUrlCookieFile
 from git_refs import R_HEADS, HEAD
 import git_superproject
 import gitc_utils
 from project import Project
 from project import RemoteSpec
-from command import Command, MirrorSafeCommand
+from command import Command, MirrorSafeCommand, WORKER_BATCH_SIZE
 from error import RepoChangedException, GitError, ManifestParseError
 import platform_utils
 from project import SyncBuffer
@@ -66,15 +62,6 @@ from manifest_xml import GitcManifest

 _ONE_DAY_S = 24 * 60 * 60

-class _FetchError(Exception):
-  """Internal error thrown in _FetchHelper() when we don't want stack trace."""
-  pass
-
-class _CheckoutError(Exception):
-  """Internal error thrown in _CheckoutOne() when we don't want stack trace."""

 class Sync(Command, MirrorSafeCommand):
   jobs = 1
   common = True
@@ -178,12 +165,20 @@ If the remote SSH daemon is Gerrit Code Review, version 2.0.10 or
 later is required to fix a server side protocol bug.
 """
+  PARALLEL_JOBS = 1
+
+  def _CommonOptions(self, p):
+    try:
+      self.PARALLEL_JOBS = self.manifest.default.sync_j
+    except ManifestParseError:
+      pass
+    super()._CommonOptions(p)

   def _Options(self, p, show_smart=True):
-    try:
-      self.jobs = self.manifest.default.sync_j
-    except ManifestParseError:
-      self.jobs = 1
+    p.add_option('--jobs-network', default=None, type=int, metavar='JOBS',
+                 help='number of network jobs to run in parallel (defaults to --jobs)')
+    p.add_option('--jobs-checkout', default=None, type=int, metavar='JOBS',
+                 help='number of local checkout jobs to run in parallel (defaults to --jobs)')

     p.add_option('-f', '--force-broken',
                  dest='force_broken', action='store_true',
@@ -217,15 +212,6 @@ later is required to fix a server side protocol bug.
     p.add_option('-c', '--current-branch',
                  dest='current_branch_only', action='store_true',
                  help='fetch only current branch from server')
-    p.add_option('-v', '--verbose',
-                 dest='output_mode', action='store_true',
-                 help='show all sync output')
-    p.add_option('-q', '--quiet',
-                 dest='output_mode', action='store_false',
-                 help='only show errors')
-    p.add_option('-j', '--jobs',
-                 dest='jobs', action='store', type='int',
-                 help="projects to fetch simultaneously (default %d)" % self.jobs)
     p.add_option('-m', '--manifest-name',
                  dest='manifest_name',
                  help='temporary manifest to use for this sync', metavar='NAME.xml')
@@ -280,6 +266,16 @@ later is required to fix a server side protocol bug.
       branch = branch[len(R_HEADS):]
     return branch

+  def _UseSuperproject(self, opt):
+    """Returns True if use-superproject option is enabled"""
+    return (opt.use_superproject or
+            self.manifest.manifestProject.config.GetBoolean(
+                'repo.superproject'))
+
+  def _GetCurrentBranchOnly(self, opt):
+    """Returns True if current-branch or use-superproject options are enabled."""
+    return opt.current_branch_only or self._UseSuperproject(opt)
+
   def _UpdateProjectsRevisionId(self, opt, args):
     """Update revisionId of every project with the SHA from superproject.
@@ -295,7 +291,8 @@ later is required to fix a server side protocol bug.
       Returns path to the overriding manifest file.
     """
     superproject = git_superproject.Superproject(self.manifest,
-                                                 self.repodir)
+                                                 self.repodir,
+                                                 quiet=opt.quiet)
     all_projects = self.GetProjects(args,
                                     missing_ok=True,
                                     submodules_ok=opt.fetch_submodules)
@@ -307,140 +304,123 @@ later is required to fix a server side protocol bug.
       self._ReloadManifest(manifest_path)
     return manifest_path

-  def _FetchProjectList(self, opt, projects, sem, *args, **kwargs):
-    """Main function of the fetch threads.
+  def _FetchProjectList(self, opt, projects):
+    """Main function of the fetch worker.
+
+    The projects we're given share the same underlying git object store, so we
+    have to fetch them in serial.

     Delegates most of the work to _FetchHelper.

     Args:
       opt: Program options returned from optparse.  See _Options().
       projects: Projects to fetch.
-      sem: We'll release() this semaphore when we exit so that another thread
-          can be started up.
-      *args, **kwargs: Remaining arguments to pass to _FetchHelper. See the
-          _FetchHelper docstring for details.
     """
-    try:
-      for project in projects:
-        success = self._FetchHelper(opt, project, *args, **kwargs)
-        if not success and opt.fail_fast:
-          break
-    finally:
-      sem.release()
+    return [self._FetchOne(opt, x) for x in projects]

-  def _FetchHelper(self, opt, project, lock, fetched, pm, err_event,
-                   clone_filter):
+  def _FetchOne(self, opt, project):
     """Fetch git objects for a single project.

     Args:
       opt: Program options returned from optparse.  See _Options().
       project: Project object for the project to fetch.
-      lock: Lock for accessing objects that are shared amongst multiple
-          _FetchHelper() threads.
-      fetched: set object that we will add project.gitdir to when we're done
-          (with our lock held).
-      pm: Instance of a Project object.  We will call pm.update() (with our
-          lock held).
-      err_event: We'll set this event in the case of an error (after printing
-          out info about the error).
-      clone_filter: Filter for use in a partial clone.

     Returns:
       Whether the fetch was successful.
     """
-    # We'll set to true once we've locked the lock.
-    did_lock = False
-
-    # Encapsulate everything in a try/except/finally so that:
-    # - We always set err_event in the case of an exception.
-    # - We always make sure we unlock the lock if we locked it.
     start = time.time()
     success = False
+    buf = io.StringIO()
     try:
-      try:
-        success = project.Sync_NetworkHalf(
-          quiet=opt.quiet,
-          verbose=opt.verbose,
-          current_branch_only=opt.current_branch_only,
-          force_sync=opt.force_sync,
-          clone_bundle=opt.clone_bundle,
-          tags=opt.tags, archive=self.manifest.IsArchive,
-          optimized_fetch=opt.optimized_fetch,
-          retry_fetches=opt.retry_fetches,
-          prune=opt.prune,
-          clone_filter=clone_filter)
-        self._fetch_times.Set(project, time.time() - start)
-
-        # Lock around all the rest of the code, since printing, updating a set
-        # and Progress.update() are not thread safe.
-        lock.acquire()
-        did_lock = True
-
-        if not success:
-          err_event.set()
-          print('error: Cannot fetch %s from %s'
-                % (project.name, project.remote.url),
-                file=sys.stderr)
-          if opt.fail_fast:
-            raise _FetchError()
-
-        fetched.add(project.gitdir)
-        pm.update(msg=project.name)
-      except _FetchError:
-        pass
-      except Exception as e:
-        print('error: Cannot fetch %s (%s: %s)'
-              % (project.name, type(e).__name__, str(e)), file=sys.stderr)
-        err_event.set()
-        raise
-    finally:
-      if did_lock:
-        lock.release()
-      finish = time.time()
-      self.event_log.AddSync(project, event_log.TASK_SYNC_NETWORK,
-                             start, finish, success)
-
-    return success
+      success = project.Sync_NetworkHalf(
+          quiet=opt.quiet,
+          verbose=opt.verbose,
+          output_redir=buf,
+          current_branch_only=self._GetCurrentBranchOnly(opt),
+          force_sync=opt.force_sync,
+          clone_bundle=opt.clone_bundle,
+          tags=opt.tags, archive=self.manifest.IsArchive,
+          optimized_fetch=opt.optimized_fetch,
+          retry_fetches=opt.retry_fetches,
+          prune=opt.prune,
+          clone_filter=self.manifest.CloneFilter,
+          partial_clone_exclude=self.manifest.PartialCloneExclude)
+
+      output = buf.getvalue()
+      if opt.verbose and output:
+        print('\n' + output.rstrip())
+
+      if not success:
+        print('error: Cannot fetch %s from %s'
+              % (project.name, project.remote.url),
+              file=sys.stderr)
+    except GitError as e:
+      print('error.GitError: Cannot fetch %s' % str(e), file=sys.stderr)
+    except Exception as e:
+      print('error: Cannot fetch %s (%s: %s)'
+            % (project.name, type(e).__name__, str(e)), file=sys.stderr)
+      raise
+
+    finish = time.time()
+    return (success, project, start, finish)
   def _Fetch(self, projects, opt, err_event):
+    ret = True
+
+    jobs = opt.jobs_network if opt.jobs_network else self.jobs
     fetched = set()
-    lock = _threading.Lock()
-    pm = Progress('Fetching projects', len(projects),
-                  always_print_percentage=opt.quiet)
+    pm = Progress('Fetching', len(projects), delay=False, quiet=opt.quiet)

     objdir_project_map = dict()
     for project in projects:
       objdir_project_map.setdefault(project.objdir, []).append(project)
+    projects_list = list(objdir_project_map.values())

-    threads = set()
-    sem = _threading.Semaphore(self.jobs)
-    for project_list in objdir_project_map.values():
-      # Check for any errors before running any more tasks.
-      # ...we'll let existing threads finish, though.
-      if err_event.isSet() and opt.fail_fast:
-        break
-
-      sem.acquire()
-      kwargs = dict(opt=opt,
-                    projects=project_list,
-                    sem=sem,
-                    lock=lock,
-                    fetched=fetched,
-                    pm=pm,
-                    err_event=err_event,
-                    clone_filter=self.manifest.CloneFilter)
-      if self.jobs > 1:
-        t = _threading.Thread(target=self._FetchProjectList,
-                              kwargs=kwargs)
-        # Ensure that Ctrl-C will not freeze the repo process.
-        t.daemon = True
-        threads.add(t)
-        t.start()
-      else:
-        self._FetchProjectList(**kwargs)
-
-    for t in threads:
-      t.join()
+    def _ProcessResults(results_sets):
+      ret = True
+      for results in results_sets:
+        for (success, project, start, finish) in results:
+          self._fetch_times.Set(project, finish - start)
+          self.event_log.AddSync(project, event_log.TASK_SYNC_NETWORK,
+                                 start, finish, success)
+          # Check for any errors before running any more tasks.
+          # ...we'll let existing jobs finish, though.
+          if not success:
+            ret = False
+          else:
+            fetched.add(project.gitdir)
+          pm.update(msg=project.name)
+        if not ret and opt.fail_fast:
+          break
+      return ret
+
+    # NB: Multiprocessing is heavy, so don't spin it up for one job.
+    if len(projects_list) == 1 or jobs == 1:
+      if not _ProcessResults(self._FetchProjectList(opt, x) for x in projects_list):
+        ret = False
+    else:
+      # Favor throughput over responsiveness when quiet.  It seems that imap()
+      # will yield results in batches relative to chunksize, so even as the
+      # children finish a sync, we won't see the result until one child finishes
+      # ~chunksize jobs.  When using a large --jobs with large chunksize, this
+      # can be jarring as there will be a large initial delay where repo looks
+      # like it isn't doing anything and sits at 0%, but then suddenly completes
+      # a lot of jobs all at once.  Since this code is more network bound, we
+      # can accept a bit more CPU overhead with a smaller chunksize so that the
+      # user sees more immediate & continuous feedback.
+      if opt.quiet:
+        chunksize = WORKER_BATCH_SIZE
+      else:
+        pm.update(inc=0, msg='warming up')
+        chunksize = 4
+      with multiprocessing.Pool(jobs) as pool:
+        results = pool.imap_unordered(
+            functools.partial(self._FetchProjectList, opt),
+            projects_list,
+            chunksize=chunksize)
+        if not _ProcessResults(results):
+          ret = False
+          pool.close()

     pm.end()
     self._fetch_times.Save()
@@ -448,178 +428,108 @@ later is required to fix a server side protocol bug.
     if not self.manifest.IsArchive:
       self._GCProjects(projects, opt, err_event)

-    return fetched
+    return (ret, fetched)

-  def _CheckoutWorker(self, opt, sem, project, *args, **kwargs):
-    """Main function of the fetch threads.
-
-    Delegates most of the work to _CheckoutOne.
-
-    Args:
-      opt: Program options returned from optparse.  See _Options().
-      projects: Projects to fetch.
-      sem: We'll release() this semaphore when we exit so that another thread
-          can be started up.
-      *args, **kwargs: Remaining arguments to pass to _CheckoutOne. See the
-          _CheckoutOne docstring for details.
-    """
-    try:
-      return self._CheckoutOne(opt, project, *args, **kwargs)
-    finally:
-      sem.release()
-
-  def _CheckoutOne(self, opt, project, lock, pm, err_event, err_results):
+  def _CheckoutOne(self, detach_head, force_sync, project):
     """Checkout work tree for one project

     Args:
-      opt: Program options returned from optparse.  See _Options().
+      detach_head: Whether to leave a detached HEAD.
+      force_sync: Force checking out of the repo.
       project: Project object for the project to checkout.
-      lock: Lock for accessing objects that are shared amongst multiple
-          _CheckoutWorker() threads.
-      pm: Instance of a Project object.  We will call pm.update() (with our
-          lock held).
-      err_event: We'll set this event in the case of an error (after printing
-          out info about the error).
-      err_results: A list of strings, paths to git repos where checkout
-          failed.

     Returns:
       Whether the fetch was successful.
     """
-    # We'll set to true once we've locked the lock.
-    did_lock = False
-
-    # Encapsulate everything in a try/except/finally so that:
-    # - We always set err_event in the case of an exception.
-    # - We always make sure we unlock the lock if we locked it.
     start = time.time()
     syncbuf = SyncBuffer(self.manifest.manifestProject.config,
-                         detach_head=opt.detach_head)
+                         detach_head=detach_head)
     success = False
     try:
-      try:
-        project.Sync_LocalHalf(syncbuf, force_sync=opt.force_sync)
-
-        # Lock around all the rest of the code, since printing, updating a set
-        # and Progress.update() are not thread safe.
-        lock.acquire()
-        success = syncbuf.Finish()
-        did_lock = True
-
-        if not success:
-          err_event.set()
-          print('error: Cannot checkout %s' % (project.name),
-                file=sys.stderr)
-          raise _CheckoutError()
-
-        pm.update(msg=project.name)
-      except _CheckoutError:
-        pass
-      except Exception as e:
-        print('error: Cannot checkout %s: %s: %s' %
-              (project.name, type(e).__name__, str(e)),
-              file=sys.stderr)
-        err_event.set()
-        raise
-    finally:
-      if did_lock:
-        if not success:
-          err_results.append(project.relpath)
-        lock.release()
-      finish = time.time()
-      self.event_log.AddSync(project, event_log.TASK_SYNC_LOCAL,
-                             start, finish, success)
-
-    return success
+      project.Sync_LocalHalf(syncbuf, force_sync=force_sync)
+      success = syncbuf.Finish()
+    except GitError as e:
+      print('error.GitError: Cannot checkout %s: %s' %
+            (project.name, str(e)), file=sys.stderr)
+    except Exception as e:
+      print('error: Cannot checkout %s: %s: %s' %
+            (project.name, type(e).__name__, str(e)),
+            file=sys.stderr)
+      raise
+
+    if not success:
+      print('error: Cannot checkout %s' % (project.name), file=sys.stderr)
+    finish = time.time()
+    return (success, project, start, finish)

-  def _Checkout(self, all_projects, opt, err_event, err_results):
+  def _Checkout(self, all_projects, opt, err_results):
"""Checkout projects listed in all_projects """Checkout projects listed in all_projects
Args: Args:
all_projects: List of all projects that should be checked out. all_projects: List of all projects that should be checked out.
opt: Program options returned from optparse. See _Options(). opt: Program options returned from optparse. See _Options().
err_event: We'll set this event in the case of an error (after printing err_results: A list of strings, paths to git repos where checkout failed.
out info about the error).
err_results: A list of strings, paths to git repos where checkout
failed.
""" """
# Only checkout projects with worktrees.
all_projects = [x for x in all_projects if x.worktree]
# Perform checkouts in multiple threads when we are using partial clone. def _ProcessResults(pool, pm, results):
# Without partial clone, all needed git objects are already downloaded, ret = True
# in this situation it's better to use only one process because the checkout for (success, project, start, finish) in results:
# would be mostly disk I/O; with partial clone, the objects are only self.event_log.AddSync(project, event_log.TASK_SYNC_LOCAL,
# downloaded when demanded (at checkout time), which is similar to the start, finish, success)
# Sync_NetworkHalf case and parallelism would be helpful.
if self.manifest.CloneFilter:
syncjobs = self.jobs
else:
syncjobs = 1
lock = _threading.Lock()
pm = Progress('Checking out projects', len(all_projects))
threads = set()
sem = _threading.Semaphore(syncjobs)
for project in all_projects:
# Check for any errors before running any more tasks. # Check for any errors before running any more tasks.
# ...we'll let existing threads finish, though. # ...we'll let existing jobs finish, though.
if err_event.isSet() and opt.fail_fast: if not success:
break ret = False
err_results.append(project.relpath)
if opt.fail_fast:
if pool:
pool.close()
return ret
pm.update(msg=project.name)
return ret
sem.acquire() return self.ExecuteInParallel(
if project.worktree: opt.jobs_checkout if opt.jobs_checkout else self.jobs,
kwargs = dict(opt=opt, functools.partial(self._CheckoutOne, opt.detach_head, opt.force_sync),
sem=sem, all_projects,
project=project, callback=_ProcessResults,
lock=lock, output=Progress('Checking out', len(all_projects), quiet=opt.quiet)) and not err_results
pm=pm,
err_event=err_event,
err_results=err_results)
if syncjobs > 1:
t = _threading.Thread(target=self._CheckoutWorker,
kwargs=kwargs)
# Ensure that Ctrl-C will not freeze the repo process.
t.daemon = True
threads.add(t)
t.start()
else:
self._CheckoutWorker(**kwargs)
for t in threads:
t.join()
pm.end()
   def _GCProjects(self, projects, opt, err_event):
+    pm = Progress('Garbage collecting', len(projects), delay=False, quiet=opt.quiet)
+    pm.update(inc=0, msg='prescan')
+
     gc_gitdirs = {}
     for project in projects:
       # Make sure pruning never kicks in with shared projects.
       if (not project.use_git_worktrees and
               len(project.manifest.GetProjectsWithName(project.name)) > 1):
         if not opt.quiet:
-          print('%s: Shared project %s found, disabling pruning.' %
+          print('\r%s: Shared project %s found, disabling pruning.' %
                 (project.relpath, project.name))
         if git_require((2, 7, 0)):
           project.EnableRepositoryExtension('preciousObjects')
         else:
           # This isn't perfect, but it's the best we can do with old git.
-          print('%s: WARNING: shared projects are unreliable when using old '
+          print('\r%s: WARNING: shared projects are unreliable when using old '
                 'versions of git; please upgrade to git-2.7.0+.'
                 % (project.relpath,),
                 file=sys.stderr)
           project.config.SetString('gc.pruneExpire', 'never')
       gc_gitdirs[project.gitdir] = project.bare_git

-    if multiprocessing:
-      cpu_count = multiprocessing.cpu_count()
-    else:
-      cpu_count = 1
+    pm.update(inc=len(projects) - len(gc_gitdirs), msg='warming up')
+
+    cpu_count = os.cpu_count()
     jobs = min(self.jobs, cpu_count)

     if jobs < 2:
       for bare_git in gc_gitdirs.values():
+        pm.update(msg=bare_git._project.name)
         bare_git.gc('--auto')
+      pm.end()
       return

     config = {'pack.threads': cpu_count // jobs if cpu_count > jobs else 1}
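The `pack.threads` line above splits the CPUs among the concurrent `git gc --auto` invocations so their pack threads do not oversubscribe the machine. A minimal sketch of that arithmetic (the helper name is ours, not repo's):

```python
def pack_threads(cpu_count, jobs):
    """Threads each concurrent `git gc` may use (hypothetical helper).

    With more CPUs than jobs, split them evenly across the jobs;
    otherwise cap each gc at one pack thread so `jobs` concurrent
    processes never exceed cpu_count threads in total.
    """
    return cpu_count // jobs if cpu_count > jobs else 1

print(pack_threads(8, 2))   # 4 threads per gc, 2 gcs -> 8 total
print(pack_threads(4, 8))   # oversubscribed: 1 thread per gc
```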
@@ -628,6 +538,7 @@ later is required to fix a server side protocol bug.
     sem = _threading.Semaphore(jobs)

     def GC(bare_git):
+      pm.start(bare_git._project.name)
       try:
         try:
           bare_git.gc('--auto', config=config)
@@ -637,10 +548,11 @@ later is required to fix a server side protocol bug.
           err_event.set()
           raise
       finally:
+        pm.finish(bare_git._project.name)
         sem.release()

     for bare_git in gc_gitdirs.values():
-      if err_event.isSet() and opt.fail_fast:
+      if err_event.is_set() and opt.fail_fast:
         break
       sem.acquire()
       t = _threading.Thread(target=GC, args=(bare_git,))
@@ -650,6 +562,7 @@ later is required to fix a server side protocol bug.
     for t in threads:
       t.join()
+    pm.end()
   def _ReloadManifest(self, manifest_name=None):
     if manifest_name:
@@ -796,13 +709,14 @@ later is required to fix a server side protocol bug.
     if not opt.local_only:
       start = time.time()
       success = mp.Sync_NetworkHalf(quiet=opt.quiet, verbose=opt.verbose,
-                                    current_branch_only=opt.current_branch_only,
+                                    current_branch_only=self._GetCurrentBranchOnly(opt),
                                     force_sync=opt.force_sync,
                                     tags=opt.tags,
                                     optimized_fetch=opt.optimized_fetch,
                                     retry_fetches=opt.retry_fetches,
                                     submodules=self.manifest.HasSubmodules,
-                                    clone_filter=self.manifest.CloneFilter)
+                                    clone_filter=self.manifest.CloneFilter,
+                                    partial_clone_exclude=self.manifest.PartialCloneExclude)
       finish = time.time()
       self.event_log.AddSync(mp, event_log.TASK_SYNC_NETWORK,
                              start, finish, success)
@@ -845,9 +759,6 @@ later is required to fix a server side protocol bug.
       soft_limit, _ = _rlimit_nofile()
       self.jobs = min(self.jobs, (soft_limit - 5) // 3)

-    opt.quiet = opt.output_mode is False
-    opt.verbose = opt.output_mode is True
-
     if opt.manifest_name:
       self.manifest.Override(opt.manifest_name)
@@ -891,7 +802,7 @@ later is required to fix a server side protocol bug.
       else:
         self._UpdateManifestProject(opt, mp, manifest_name)

-      if opt.use_superproject:
+      if self._UseSuperproject(opt):
         manifest_name = self._UpdateProjectsRevisionId(opt, args)

     if self.gitc_manifest:
@@ -935,7 +846,6 @@ later is required to fix a server side protocol bug.

     err_network_sync = False
     err_update_projects = False
-    err_checkout = False

     self._fetch_times = _FetchTimes(self.manifest)
     if not opt.local_only:
@@ -946,12 +856,14 @@ later is required to fix a server side protocol bug.
       to_fetch.extend(all_projects)
       to_fetch.sort(key=self._fetch_times.Get, reverse=True)

-      fetched = self._Fetch(to_fetch, opt, err_event)
+      success, fetched = self._Fetch(to_fetch, opt, err_event)
+      if not success:
+        err_event.set()

       _PostRepoFetch(rp, opt.repo_verify)
       if opt.network_only:
         # bail out now; the rest touches the working tree
-        if err_event.isSet():
+        if err_event.is_set():
           print('\nerror: Exited sync due to fetch errors.\n', file=sys.stderr)
           sys.exit(1)
         return
@@ -975,10 +887,13 @@ later is required to fix a server side protocol bug.
           if previously_missing_set == missing_set:
             break
           previously_missing_set = missing_set
-          fetched.update(self._Fetch(missing, opt, err_event))
+          success, new_fetched = self._Fetch(missing, opt, err_event)
+          if not success:
+            err_event.set()
+          fetched.update(new_fetched)

     # If we saw an error, exit with code 1 so that other scripts can check.
-    if err_event.isSet():
+    if err_event.is_set():
       err_network_sync = True
       if opt.fail_fast:
         print('\nerror: Exited sync due to fetch errors.\n'
@@ -1000,10 +915,10 @@ later is required to fix a server side protocol bug.
             sys.exit(1)

       err_results = []
-      self._Checkout(all_projects, opt, err_event, err_results)
-      if err_event.isSet():
-        err_checkout = True
       # NB: We don't exit here because this is the last step.
+      err_checkout = not self._Checkout(all_projects, opt, err_results)
+      if err_checkout:
+        err_event.set()

     # If there's a notice that's supposed to print at the end of the sync, print
     # it now...
@@ -1011,7 +926,7 @@ later is required to fix a server side protocol bug.
       print(self.manifest.notice)

     # If we saw an error, exit with code 1 so that other scripts can check.
-    if err_event.isSet():
+    if err_event.is_set():
       print('\nerror: Unable to fully sync the tree.', file=sys.stderr)
       if err_network_sync:
         print('error: Downloading network changes failed.', file=sys.stderr)
@@ -1041,12 +956,25 @@ def _PostRepoUpgrade(manifest, quiet=False):
 def _PostRepoFetch(rp, repo_verify=True, verbose=False):
   if rp.HasChanges:
     print('info: A new version of repo is available', file=sys.stderr)
-    print(file=sys.stderr)
-    if not repo_verify or _VerifyTag(rp):
-      syncbuf = SyncBuffer(rp.config)
-      rp.Sync_LocalHalf(syncbuf)
-      if not syncbuf.Finish():
-        sys.exit(1)
+    wrapper = Wrapper()
+    try:
+      rev = rp.bare_git.describe(rp.GetRevisionId())
+    except GitError:
+      rev = None
+    _, new_rev = wrapper.check_repo_rev(rp.gitdir, rev, repo_verify=repo_verify)
+    # See if we're held back due to missing signed tag.
+    current_revid = rp.bare_git.rev_parse('HEAD')
+    new_revid = rp.bare_git.rev_parse('--verify', new_rev)
+    if current_revid != new_revid:
+      # We want to switch to the new rev, but also not trash any uncommitted
+      # changes.  This helps with local testing/hacking.
+      # If a local change has been made, we will throw that away.
+      # We also have to make sure this will switch to an older commit if that's
+      # the latest tag in order to support release rollback.
+      try:
+        rp.work_git.reset('--keep', new_rev)
+      except GitError as e:
+        sys.exit(str(e))
     print('info: Restarting repo with latest version', file=sys.stderr)
     raise RepoChangedException(['--repo-upgraded'])
   else:
@@ -1057,54 +985,6 @@ def _PostRepoFetch(rp, repo_verify=True, verbose=False):
           file=sys.stderr)
def _VerifyTag(project):
gpg_dir = os.path.expanduser('~/.repoconfig/gnupg')
if not os.path.exists(gpg_dir):
print('warning: GnuPG was not available during last "repo init"\n'
'warning: Cannot automatically authenticate repo."""',
file=sys.stderr)
return True
try:
cur = project.bare_git.describe(project.GetRevisionId())
except GitError:
cur = None
if not cur \
or re.compile(r'^.*-[0-9]{1,}-g[0-9a-f]{1,}$').match(cur):
rev = project.revisionExpr
if rev.startswith(R_HEADS):
rev = rev[len(R_HEADS):]
print(file=sys.stderr)
print("warning: project '%s' branch '%s' is not signed"
% (project.name, rev), file=sys.stderr)
return False
env = os.environ.copy()
env['GIT_DIR'] = project.gitdir
env['GNUPGHOME'] = gpg_dir
cmd = [GIT, 'tag', '-v', cur]
proc = subprocess.Popen(cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
env=env)
out = proc.stdout.read()
proc.stdout.close()
err = proc.stderr.read()
proc.stderr.close()
if proc.wait() != 0:
print(file=sys.stderr)
print(out, file=sys.stderr)
print(err, file=sys.stderr)
print(file=sys.stderr)
return False
return True
class _FetchTimes(object):
  _ALPHA = 0.5
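`_FetchTimes` feeds the `to_fetch.sort(key=self._fetch_times.Get, reverse=True)` call earlier in the diff, so the historically slowest projects start fetching first. With `_ALPHA = 0.5` the recorded duration behaves like an exponentially-weighted moving average; a hedged sketch of that update rule (repo's actual implementation may differ):

```python
_ALPHA = 0.5

def blend_fetch_time(previous, latest, alpha=_ALPHA):
    # EWMA: weight the newest sample by alpha and the history by (1 - alpha),
    # so a one-off slow fetch decays instead of dominating the sort order.
    return alpha * latest + (1 - alpha) * previous

# A project whose fetches jump from 10s to 30s converges gradually:
t = 10.0
for sample in (30.0, 30.0):
    t = blend_fetch_time(t, sample)
print(t)  # 10 -> 20 -> 25
```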

tests/test_error.py (new file, 53 lines)

@@ -0,0 +1,53 @@
# Copyright 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the error.py module."""
import inspect
import pickle
import unittest
import error
class PickleTests(unittest.TestCase):
"""Make sure all our custom exceptions can be pickled."""
def getExceptions(self):
"""Return all our custom exceptions."""
for name in dir(error):
cls = getattr(error, name)
if isinstance(cls, type) and issubclass(cls, Exception):
yield cls
def testExceptionLookup(self):
"""Make sure our introspection logic works."""
classes = list(self.getExceptions())
self.assertIn(error.HookError, classes)
# Don't assert the exact number to avoid being a change-detector test.
self.assertGreater(len(classes), 10)
def testPickle(self):
"""Try to pickle all the exceptions."""
for cls in self.getExceptions():
args = inspect.getfullargspec(cls.__init__).args[1:]
obj = cls(*args)
p = pickle.dumps(obj)
try:
newobj = pickle.loads(p)
except Exception as e: # pylint: disable=broad-except
self.fail('Class %s is unable to be pickled: %s\n'
'Incomplete super().__init__(...) call?' % (cls, e))
self.assertIsInstance(newobj, cls)
self.assertEqual(str(obj), str(newobj))
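The `testPickle` loop above exercises a classic Python pitfall: `Exception` pickles itself as `(cls, self.args)`, so a subclass whose `__init__` does not forward its arguments to `super().__init__(...)` unpickles by calling `cls()` with no arguments and blows up. A standalone illustration (these two classes are ours, not repo's):

```python
import pickle

class BadError(Exception):
    def __init__(self, msg, project):
        super().__init__()        # args lost: self.args == ()
        self.project = project

class GoodError(Exception):
    def __init__(self, msg, project=None):
        super().__init__(msg, project)  # args survive for pickling
        self.project = project

# Dumping succeeds, but loading re-invokes BadError() with no arguments.
try:
    pickle.loads(pickle.dumps(BadError('boom', 'art')))
except TypeError as e:
    print('BadError round-trip failed:', e)

clone = pickle.loads(pickle.dumps(GoodError('boom', 'art')))
print(type(clone).__name__)  # GoodError
```

This is exactly the failure mode the test's "Incomplete super().__init__(...) call?" hint points at.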


@@ -15,6 +15,7 @@
 """Unittests for the git_superproject.py module."""

 import os
+import platform
 import tempfile
 import unittest
 from unittest import mock
@@ -34,6 +35,7 @@ class SuperprojectTestCase(unittest.TestCase):
     self.manifest_file = os.path.join(
         self.repodir, manifest_xml.MANIFEST_FILE_NAME)
     os.mkdir(self.repodir)
+    self.platform = platform.system().lower()

     # The manifest parsing really wants a git repo currently.
     gitdir = os.path.join(self.repodir, 'manifests.git')
@@ -48,8 +50,8 @@ class SuperprojectTestCase(unittest.TestCase):
 <remote name="default-remote" fetch="http://localhost" />
 <default remote="default-remote" revision="refs/heads/main" />
 <superproject name="superproject"/>
-<project path="art" name="platform/art" />
-</manifest>
+<project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
+" /></manifest>
 """)
     self._superproject = git_superproject.Superproject(manifest, self.repodir)
@@ -97,17 +99,17 @@ class SuperprojectTestCase(unittest.TestCase):
     with mock.patch.object(self._superproject, '_GetBranch', return_value='junk'):
       self.assertFalse(superproject.Sync())

-  def test_superproject_get_superproject_mock_clone(self):
-    """Test with _Clone failing."""
-    with mock.patch.object(self._superproject, '_Clone', return_value=False):
+  def test_superproject_get_superproject_mock_init(self):
+    """Test with _Init failing."""
+    with mock.patch.object(self._superproject, '_Init', return_value=False):
       self.assertFalse(self._superproject.Sync())

   def test_superproject_get_superproject_mock_fetch(self):
-    """Test with _Fetch failing and _clone being called."""
-    with mock.patch.object(self._superproject, '_Clone', return_value=True):
+    """Test with _Fetch failing."""
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
       os.mkdir(self._superproject._superproject_path)
       with mock.patch.object(self._superproject, '_Fetch', return_value=False):
-        self.assertTrue(self._superproject.Sync())
+        self.assertFalse(self._superproject.Sync())

   def test_superproject_get_all_project_commit_ids_mock_ls_tree(self):
     """Test with LsTree being a mock."""
@@ -116,7 +118,8 @@ class SuperprojectTestCase(unittest.TestCase):
             '160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00'
             '120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00'
             '160000 commit ade9b7a0d874e25fff4bf2552488825c6f111928\tbuild/bazel\x00')
-    with mock.patch.object(self._superproject, '_Clone', return_value=True):
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
+      with mock.patch.object(self._superproject, '_Fetch', return_value=True):
       with mock.patch.object(self._superproject, '_LsTree', return_value=data):
         commit_ids = self._superproject._GetAllProjectsCommitIds()
         self.assertEqual(commit_ids, {
@@ -141,7 +144,8 @@ class SuperprojectTestCase(unittest.TestCase):
         '<?xml version="1.0" ?><manifest>' +
         '<remote name="default-remote" fetch="http://localhost"/>' +
         '<default remote="default-remote" revision="refs/heads/main"/>' +
-        '<project name="platform/art" path="art" revision="ABCDEF"/>' +
+        '<project name="platform/art" path="art" revision="ABCDEF" ' +
+        'groups="notdefault,platform-' + self.platform + '"/>' +
         '<superproject name="superproject"/>' +
         '</manifest>')
@@ -151,7 +155,7 @@ class SuperprojectTestCase(unittest.TestCase):
     projects = self._superproject._manifest.projects
     data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
             '160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00')
-    with mock.patch.object(self._superproject, '_Clone', return_value=True):
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
       with mock.patch.object(self._superproject, '_Fetch', return_value=True):
         with mock.patch.object(self._superproject,
                                '_LsTree',
@@ -168,7 +172,8 @@ class SuperprojectTestCase(unittest.TestCase):
         '<remote name="default-remote" fetch="http://localhost"/>' +
         '<default remote="default-remote" revision="refs/heads/main"/>' +
         '<project name="platform/art" path="art" ' +
-        'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea"/>' +
+        'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" ' +
+        'groups="notdefault,platform-' + self.platform + '"/>' +
         '<superproject name="superproject"/>' +
         '</manifest>')


@@ -161,6 +161,79 @@ class EventLogTestCase(unittest.TestCase):
     self.assertIn('code', exit_event)
     self.assertEqual(exit_event['code'], 2)
def test_command_event(self):
"""Test and validate 'command' event data is valid.
Expected event log:
<version event>
<command event>
"""
name = 'repo'
    subcommands = ['init', 'this']
self._event_log_module.CommandEvent(name='repo', subcommands=subcommands)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
command_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(command_event, expected_event_name='command')
# Check for 'command' event specific fields.
self.assertIn('name', command_event)
self.assertIn('subcommands', command_event)
self.assertEqual(command_event['name'], name)
self.assertEqual(command_event['subcommands'], subcommands)
def test_def_params_event_repo_config(self):
"""Test 'def_params' event data outputs only repo config keys.
Expected event log:
<version event>
<def_param event>
<def_param event>
"""
config = {
'git.foo': 'bar',
'repo.partialclone': 'true',
'repo.partialclonefilter': 'blob:none',
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 3)
def_param_events = self._log_data[1:]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
for event in def_param_events:
self.verifyCommonKeys(event, expected_event_name='def_param')
# Check for 'def_param' event specific fields.
self.assertIn('param', event)
self.assertIn('value', event)
self.assertTrue(event['param'].startswith('repo.'))
def test_def_params_event_no_repo_config(self):
"""Test 'def_params' event data won't output non-repo config keys.
Expected event log:
<version event>
"""
config = {
'git.foo': 'bar',
'git.core.foo2': 'baz',
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 1)
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
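The two tests above pin down the same behavior from both sides: `DefParamRepoEvents` emits one `def_param` event per key in the `repo.` namespace and silently drops everything else. The filtering they imply is essentially (a sketch, not repo's code):

```python
config = {
    'git.foo': 'bar',
    'git.core.foo2': 'baz',
    'repo.partialclone': 'true',
    'repo.partialclonefilter': 'blob:none',
}

# Only 'repo.' keys become def_param events; git.* keys are ignored.
repo_params = {k: v for k, v in config.items() if k.startswith('repo.')}
print(sorted(repo_params))  # ['repo.partialclone', 'repo.partialclonefilter']
```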
  def test_write_with_filename(self):
    """Test Write() with a path to a file exits with None."""
    self.assertIsNone(self._event_log_module.Write(path='path/to/file'))


@@ -15,6 +15,7 @@
 """Unittests for the manifest_xml.py module."""

 import os
+import platform
 import shutil
 import tempfile
 import unittest
@@ -24,6 +25,73 @@ import error
 import manifest_xml
# Invalid paths that we don't want in the filesystem.
INVALID_FS_PATHS = (
'',
'.',
'..',
'../',
'./',
'.//',
'foo/',
'./foo',
'../foo',
'foo/./bar',
'foo/../../bar',
'/foo',
'./../foo',
'.git/foo',
# Check case folding.
'.GIT/foo',
'blah/.git/foo',
'.repo/foo',
'.repoconfig',
# Block ~ due to 8.3 filenames on Windows filesystems.
'~',
'foo~',
'blah/foo~',
# Block Unicode characters that get normalized out by filesystems.
u'foo\u200Cbar',
)
# Make sure platforms that use path separators (e.g. Windows) are also
# rejected properly.
if os.path.sep != '/':
INVALID_FS_PATHS += tuple(x.replace('/', os.path.sep) for x in INVALID_FS_PATHS)
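Taken together, the entries above outline the rules such a validator has to enforce: no empty or absolute paths, no trailing separators, no `.`/`..` components, no `.git`/`.repo` components (case-folded), no 8.3-style `~` names, and no characters that filesystems may normalize away. A compact check consistent with this list (a sketch only; the real check lives in manifest_xml and may differ):

```python
import re

# Components rejected: '', '.', '..', '.git'/'.repo' in any case,
# '.repoconfig', and anything ending in '~' (8.3 backup names).
_BAD_COMPONENT = re.compile(
    r'^(\.{0,2}|\.git|\.repo|\.repoconfig)$|~$', re.IGNORECASE)

def looks_invalid(path):
    if not path or path.startswith('/') or path.endswith('/'):
        return True
    if '\u200c' in path:  # zero-width char a filesystem can normalize out
        return True
    return any(_BAD_COMPONENT.search(part) for part in path.split('/'))
```

Every entry in the list above trips one of these branches, while ordinary paths like `foo/bar` pass through.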
class ManifestParseTestCase(unittest.TestCase):
"""TestCase for parsing manifests."""
def setUp(self):
self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
self.repodir = os.path.join(self.tempdir, '.repo')
self.manifest_dir = os.path.join(self.repodir, 'manifests')
self.manifest_file = os.path.join(
self.repodir, manifest_xml.MANIFEST_FILE_NAME)
self.local_manifest_dir = os.path.join(
self.repodir, manifest_xml.LOCAL_MANIFESTS_DIR_NAME)
os.mkdir(self.repodir)
os.mkdir(self.manifest_dir)
# The manifest parsing really wants a git repo currently.
gitdir = os.path.join(self.repodir, 'manifests.git')
os.mkdir(gitdir)
with open(os.path.join(gitdir, 'config'), 'w') as fp:
fp.write("""[remote "origin"]
url = https://localhost:0/manifest
""")
def tearDown(self):
shutil.rmtree(self.tempdir, ignore_errors=True)
def getXmlManifest(self, data):
"""Helper to initialize a manifest for testing."""
with open(self.manifest_file, 'w') as fp:
fp.write(data)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
 class ManifestValidateFilePaths(unittest.TestCase):
   """Check _ValidateFilePaths helper.
@@ -54,36 +122,7 @@ class ManifestValidateFilePaths(unittest.TestCase):
   def test_bad_paths(self):
     """Make sure bad paths (src & dest) are rejected."""
-    PATHS = (
-        '..',
-        '../',
-        './',
-        'foo/',
-        './foo',
-        '../foo',
-        'foo/./bar',
-        'foo/../../bar',
-        '/foo',
-        './../foo',
-        '.git/foo',
-        # Check case folding.
-        '.GIT/foo',
-        'blah/.git/foo',
-        '.repo/foo',
-        '.repoconfig',
-        # Block ~ due to 8.3 filenames on Windows filesystems.
-        '~',
-        'foo~',
-        'blah/foo~',
-        # Block Unicode characters that get normalized out by filesystems.
-        u'foo\u200Cbar',
-    )
-    # Make sure platforms that use path separators (e.g. Windows) are also
-    # rejected properly.
-    if os.path.sep != '/':
-      PATHS += tuple(x.replace('/', os.path.sep) for x in PATHS)
-
-    for path in PATHS:
+    for path in INVALID_FS_PATHS:
       self.assertRaises(
           error.ManifestInvalidPathError, self.check_both, path, 'a')
       self.assertRaises(
@@ -146,37 +185,9 @@ class ValueTests(unittest.TestCase):
       manifest_xml.XmlInt(node, 'a')

-class XmlManifestTests(unittest.TestCase):
+class XmlManifestTests(ManifestParseTestCase):
   """Check manifest processing."""

-  def setUp(self):
-    self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
-    self.repodir = os.path.join(self.tempdir, '.repo')
-    self.manifest_dir = os.path.join(self.repodir, 'manifests')
-    self.manifest_file = os.path.join(
-        self.repodir, manifest_xml.MANIFEST_FILE_NAME)
-    self.local_manifest_dir = os.path.join(
-        self.repodir, manifest_xml.LOCAL_MANIFESTS_DIR_NAME)
-    os.mkdir(self.repodir)
-    os.mkdir(self.manifest_dir)
-
-    # The manifest parsing really wants a git repo currently.
-    gitdir = os.path.join(self.repodir, 'manifests.git')
-    os.mkdir(gitdir)
-    with open(os.path.join(gitdir, 'config'), 'w') as fp:
-      fp.write("""[remote "origin"]
-        url = https://localhost:0/manifest
-""")
-
-  def tearDown(self):
-    shutil.rmtree(self.tempdir, ignore_errors=True)
-
-  def getXmlManifest(self, data):
-    """Helper to initialize a manifest for testing."""
-    with open(self.manifest_file, 'w') as fp:
-      fp.write(data)
-    return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
-
   def test_empty(self):
     """Parse an 'empty' manifest file."""
     manifest = self.getXmlManifest(
@@ -221,67 +232,6 @@ class XmlManifestTests(unittest.TestCase):
     self.assertEqual(manifest.repo_hooks_project.name, 'repohooks')
     self.assertEqual(manifest.repo_hooks_project.enabled_repo_hooks, ['a', 'b'])
def test_superproject(self):
"""Check superproject settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'superproject')
self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="test-remote" fetch="http://localhost"/>' +
'<default remote="test-remote" revision="refs/heads/main"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
def test_superproject_with_remote(self):
"""Check superproject settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<remote name="superproject-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="platform/superproject" remote="superproject-remote"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'platform/superproject')
self.assertEqual(manifest.superproject['remote'].name, 'superproject-remote')
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/platform/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<remote name="superproject-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<superproject name="platform/superproject" remote="superproject-remote"/>' +
'</manifest>')
def test_superproject_with_defalut_remote(self):
"""Check superproject settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject" remote="default-remote"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'superproject')
self.assertEqual(manifest.superproject['remote'].name, 'default-remote')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
   def test_unknown_tags(self):
     """Check superproject settings."""
     manifest = self.getXmlManifest("""
@@ -303,51 +253,11 @@ class XmlManifestTests(unittest.TestCase):
         '<superproject name="superproject"/>' +
         '</manifest>')
def test_project_group(self):
"""Check project group settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="test-name" path="test-path"/>
<project name="extras" path="path" groups="g1,g2,g1"/>
</manifest>
""")
self.assertEqual(len(manifest.projects), 2)
# Ordering isn't guaranteed.
result = {
manifest.projects[0].name: manifest.projects[0].groups,
manifest.projects[1].name: manifest.projects[1].groups,
}
project = manifest.projects[0]
self.assertCountEqual(
result['test-name'],
['name:test-name', 'all', 'path:test-path'])
self.assertCountEqual(
result['extras'],
['g1', 'g2', 'g1', 'name:extras', 'all', 'path:path'])
-  def test_project_set_revision_id(self):
-    """Check setting of project's revisionId."""
-    manifest = self.getXmlManifest("""
-<manifest>
-  <remote name="default-remote" fetch="http://localhost" />
-  <default remote="default-remote" revision="refs/heads/main" />
-  <project name="test-name"/>
-</manifest>
-""")
-    self.assertEqual(len(manifest.projects), 1)
-    project = manifest.projects[0]
-    project.SetRevisionId('ABCDEF')
-    self.assertEqual(
-        manifest.ToXml().toxml(),
-        '<?xml version="1.0" ?><manifest>' +
-        '<remote name="default-remote" fetch="http://localhost"/>' +
-        '<default remote="default-remote" revision="refs/heads/main"/>' +
-        '<project name="test-name" revision="ABCDEF"/>' +
-        '</manifest>')
-
-  def test_include_levels(self):
+class IncludeElementTests(ManifestParseTestCase):
+  """Tests for <include>."""
+
+  def test_group_levels(self):
     root_m = os.path.join(self.manifest_dir, 'root.xml')
     with open(root_m, 'w') as fp:
       fp.write("""
@@ -389,3 +299,251 @@ class XmlManifestTests(unittest.TestCase):
         self.assertIn('level2-group', proj.groups)
         # Check level2 proj group not removed.
         self.assertIn('l2g1', proj.groups)
def test_allow_bad_name_from_user(self):
"""Check handling of bad name attribute from the user's input."""
def parse(name):
manifest = self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<include name="{name}" />
</manifest>
""")
# Force the manifest to be parsed.
manifest.ToXml()
# Setup target of the include.
target = os.path.join(self.tempdir, 'target.xml')
with open(target, 'w') as fp:
fp.write('<manifest></manifest>')
# Include with absolute path.
parse(os.path.abspath(target))
# Include with relative path.
parse(os.path.relpath(target, self.manifest_dir))
def test_bad_name_checks(self):
"""Check handling of bad name attribute."""
def parse(name):
# Setup target of the include.
with open(os.path.join(self.manifest_dir, 'target.xml'), 'w') as fp:
fp.write(f'<manifest><include name="{name}"/></manifest>')
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<include name="target.xml" />
</manifest>
""")
# Force the manifest to be parsed.
manifest.ToXml()
# Handle empty name explicitly because a different codepath rejects it.
with self.assertRaises(error.ManifestParseError):
parse('')
for path in INVALID_FS_PATHS:
if not path:
continue
with self.assertRaises(error.ManifestInvalidPathError):
parse(path)
class ProjectElementTests(ManifestParseTestCase):
"""Tests for <project>."""
def test_group(self):
"""Check project group settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="test-name" path="test-path"/>
<project name="extras" path="path" groups="g1,g2,g1"/>
</manifest>
""")
self.assertEqual(len(manifest.projects), 2)
# Ordering isn't guaranteed.
result = {
manifest.projects[0].name: manifest.projects[0].groups,
manifest.projects[1].name: manifest.projects[1].groups,
}
project = manifest.projects[0]
self.assertCountEqual(
result['test-name'],
['name:test-name', 'all', 'path:test-path'])
self.assertCountEqual(
result['extras'],
['g1', 'g2', 'g1', 'name:extras', 'all', 'path:path'])
groupstr = 'default,platform-' + platform.system().lower()
self.assertEqual(groupstr, manifest.GetGroupsStr())
groupstr = 'g1,g2,g1'
manifest.manifestProject.config.SetString('manifest.groups', groupstr)
self.assertEqual(groupstr, manifest.GetGroupsStr())
def test_set_revision_id(self):
"""Check setting of project's revisionId."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="test-name"/>
</manifest>
""")
self.assertEqual(len(manifest.projects), 1)
project = manifest.projects[0]
project.SetRevisionId('ABCDEF')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<project name="test-name" revision="ABCDEF"/>' +
'</manifest>')
def test_trailing_slash(self):
"""Check handling of trailing slashes in attributes."""
def parse(name, path):
return self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="{name}" path="{path}" />
</manifest>
""")
manifest = parse('a/path/', 'foo')
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/foo.git'))
self.assertEqual(manifest.projects[0].objdir,
os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
manifest = parse('a/path', 'foo/')
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/foo.git'))
self.assertEqual(manifest.projects[0].objdir,
os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
manifest = parse('a/path', 'foo//////')
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/foo.git'))
self.assertEqual(manifest.projects[0].objdir,
os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
def test_toplevel_path(self):
"""Check handling of path=. specially."""
def parse(name, path):
return self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="{name}" path="{path}" />
</manifest>
""")
for path in ('.', './', './/', './//'):
manifest = parse('server/path', path)
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/..git'))
def test_bad_path_name_checks(self):
"""Check handling of bad path & name attributes."""
def parse(name, path):
manifest = self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="{name}" path="{path}" />
</manifest>
""")
# Force the manifest to be parsed.
manifest.ToXml()
# Verify the parser is valid by default to avoid buggy tests below.
parse('ok', 'ok')
# Handle empty name explicitly because a different codepath rejects it.
# Empty path is OK because it defaults to the name field.
with self.assertRaises(error.ManifestParseError):
parse('', 'ok')
for path in INVALID_FS_PATHS:
if not path or path.endswith('/'):
continue
with self.assertRaises(error.ManifestInvalidPathError):
parse(path, 'ok')
# We have a dedicated test for path=".".
if path not in {'.'}:
with self.assertRaises(error.ManifestInvalidPathError):
parse('ok', path)
class SuperProjectElementTests(ManifestParseTestCase):
"""Tests for <superproject>."""
def test_superproject(self):
"""Check superproject settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'superproject')
self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="test-remote" fetch="http://localhost"/>' +
'<default remote="test-remote" revision="refs/heads/main"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
def test_remote(self):
"""Check superproject settings with a remote."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<remote name="superproject-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="platform/superproject" remote="superproject-remote"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'platform/superproject')
self.assertEqual(manifest.superproject['remote'].name, 'superproject-remote')
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/platform/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<remote name="superproject-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<superproject name="platform/superproject" remote="superproject-remote"/>' +
'</manifest>')
def test_default_remote(self):
"""Check superproject settings with a default remote."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject" remote="default-remote"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'superproject')
self.assertEqual(manifest.superproject['remote'].name, 'default-remote')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
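The trailing-slash tests above assume the manifest parser normalizes `name`/`path` attribute values before computing `gitdir`/`objdir`. A minimal standard-library sketch of that normalization (`clean()` is a hypothetical helper for illustration, not the parser's actual API):

```python
import posixpath

def clean(attr):
    """Collapse '//' runs and strip trailing slashes, as the tests above
    expect for <project name=...> / path=... values (illustrative
    assumption, not the real manifest_xml helper)."""
    return posixpath.normpath(attr)

print(clean('foo//////'))  # -> foo
print(clean('a/path/'))    # -> a/path
```

This is why `parse('a/path/', 'foo')` and `parse('a/path', 'foo//////')` both resolve to the same `foo.git` / `a/path.git` directories in the assertions above.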
@@ -46,7 +46,7 @@ def TempGitTree():
     templatedir = tempfile.mkdtemp(prefix='.test-template')
     with open(os.path.join(templatedir, 'HEAD'), 'w') as fp:
       fp.write('ref: refs/heads/main\n')
-    cmd += ['--template=', templatedir]
+    cmd += ['--template', templatedir]
     subprocess.check_call(cmd, cwd=tempdir)
     yield tempdir
   finally:
@@ -38,7 +38,7 @@ class InitCommand(unittest.TestCase):
     """Check invalid command line options."""
     ARGV = (
         # Too many arguments.
-        ['asdf'],
+        ['url', 'asdf'],
         # Conflicting options.
         ['--mirror', '--archive'],
@@ -305,8 +305,8 @@ class Requirements(RepoWrapperTestCase):
     reqs = self.wrapper.Requirements({'python': {'hard': sys.version_info}})
     reqs.assert_all()

-  def test_assert_all_old_repo(self):
-    """Check assert_all rejects old repo."""
+  def test_assert_all_old_python(self):
+    """Check assert_all rejects old python."""
     reqs = self.wrapper.Requirements({'python': {'hard': [99999, 0]}})
     with self.assertRaises(SystemExit):
       reqs.assert_all()
@@ -15,13 +15,15 @@
 # https://tox.readthedocs.io/

 [tox]
-envlist = py36, py37, py38
+envlist = py35, py36, py37, py38, py39

 [gh-actions]
 python =
+    3.5: py35
     3.6: py36
     3.7: py37
     3.8: py38
+    3.9: py39

 [testenv]
 deps = pytest