Compare commits

...

240 Commits

04cba4add5 sync: Show number of running fetch jobs
Last of the recent `repo sync` UX changes. Show the number of fetch jobs, e.g.:
"Fetching:  3% (8/251) 0:03 | 8 jobs | 0:01 chromiumos/overlays/chrom.."

Bug: https://crbug.com/gerrit/11293
Change-Id: I1b3dcf3e56ae6731c6c6cb73cfce069b2f374b69
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374920
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
2023-05-25 17:26:22 +00:00
3eacfdf309 upload: use f-string
Change-Id: I91b99a7147c7c3cb5485d5406316c8ffd79f9272
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374914
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-05-25 17:12:18 +00:00
aafed29d34 project: Include tags option during fetch retry
If the original fetch attempt did not want tags, we should continue to
honor that when doing a retry fetch with depth set to None. This seems
to match the intent of the retry based on the inline comment and results
in a significant performance improvement when the original fetch-by-sha1
fails due to the server not allowing requests for unadvertised objects.

Change-Id: Ia26bb31ea9aecc4ba2d3e87fc0c5412472cd98c4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374918
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Tested-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
2023-05-25 12:16:06 +00:00
90f574f02e Parse OpenSSH versions with no SSH_EXTRAVERSION
If the Debian banner is not used, then there won't be a space after the
version number: it'll be followed directly by a comma.
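For illustration (this is not repo's actual parser; the banner strings and regex below are made up), the difference shows up in the SSH version banner like so:
```
import re

# Sketch only: with the Debian banner there is a space after the version;
# without it, a comma follows the version number directly.
banner_debian = "OpenSSH_8.9p1 Ubuntu-3ubuntu0.1, OpenSSL 3.0.2"
banner_plain = "OpenSSH_9.0p1, LibreSSL 3.3.6"
for banner in (banner_debian, banner_plain):
    match = re.match(r"OpenSSH_(\d+)\.(\d+)", banner)
    print(match.groups())  # ('8', '9') then ('9', '0')
```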

Bug: https://crbug.com/gerrit/16903
Change-Id: I12b873f32afc9424f42b772399c346f96ca95a96
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/372875
Tested-by: Saagar Jha <saagarjha@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-05-24 17:33:08 +00:00
551285fa35 sync: Show elapsed time for the longest syncing project
"Last synced: X" is printed only after a project finishes syncing.
Replace that with a message that shows the longest actively syncing
project.

Bug: https://crbug.com/gerrit/11293
Change-Id: I84c7873539d84999772cd554f426b44921521e85
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/372674
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2023-05-18 18:10:24 +00:00
131fc96381 [git_trace2] Add logs for critical cmds
Trace logs emitted from repo are not useful on error for many critical
commands. This change adds errors for critical commands to trace logs.

Change-Id: Ideb9358bee31e540bd84a94327a09ff9b0246a77
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373814
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-05-17 18:06:14 +00:00
2ad5d50874 [trace2] Add absolute time on trace2 exit events
Change-Id: I58aff46bd4ff4ba79286a7f1226e19eb568c34c5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373954
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2023-05-15 22:21:23 +00:00
acb9523eaa SUBMITTING_PATCHES: update with commit queue details
Change-Id: I59dffb8524cb95b3fd4196bcecd18426f09bf9c4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373694
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-05-11 19:27:57 +00:00
041f97725a sync: Fix how sync times for shared projects are recorded
https://gerrit.googlesource.com/git-repo/+/d947858325ae70ff9c0b2f463a9e8c4ffd00002a introduced a moving average of fetch times in 2012.

The code does not handle shared projects; it averages times based on project names, which is incorrect.

Change-Id: I9926122cdb1ecf201887a81e96f5f816d3c2f72a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373574
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2023-05-10 21:14:57 +00:00
3e3340d94f manifest: add support for revision in include
A revision attribute can now be added to a manifest include, so all
projects in an included manifest file can easily change their default
branch without modifying every project in that manifest file.

For example,
the main manifest.xml has an include node containing a revision attribute,
```
<include name="include.xml" revision="r1" />
```
and the include.xml has some projects,
```
<project path="project1_path" name="project1_name" revision="r2" />
<project path="project2_path" name="project2_name" />
```
With this change, the final manifest will have revision="r1" for project2.
```
<project name="project1_name" path="project1_path" revision="r2" />
<project name="project2_name" path="project2_path" revision="r1" />
```

Test: added unit tests to cover the inheritance

Change-Id: I4b8547a7198610ec3a3c6aeb2136e0c0f3557df0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/369714
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Shuchuan Zeng <zengshuchuan@allwinnertech.com>
Tested-by: Shuchuan Zeng <zengshuchuan@allwinnertech.com>
2023-05-05 03:40:28 +00:00
edcaa94ca8 sync: Display total elapsed fetch time
Give users an indication that `repo sync` isn't stuck if it is taking a
long time to fetch.

Bug: https://crbug.com/gerrit/11293
Change-Id: Iccdaec918f86c9cc2db5dc12f9e3eef7ad0bcbda
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/371414
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-05-02 20:51:46 +00:00
7ef5b465cd [SyncAnalysisState] Preserve synctime µs
By default, datetime.isoformat() uses a different format depending on
microseconds: if it is equal to 0, microseconds are omitted; otherwise
they are included.

Setting timespec = 'microseconds' ensures the format is the same
regardless of current time.
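A minimal sketch of the standard-library behavior being pinned down (not repo-specific code):
```
from datetime import datetime

dt = datetime(2023, 4, 27, 20, 19, 34)  # microsecond == 0
print(dt.isoformat())                         # 2023-04-27T20:19:34
print(dt.isoformat(timespec="microseconds"))  # 2023-04-27T20:19:34.000000
```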

Change-Id: Icb1be31eb681247c7e46923cdeabb8f5469c20f0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/371694
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2023-04-27 20:19:34 +00:00
e7e20f4686 tests: do not allow underscores in cli options
We use dashes in --long-options, not underscores, so add a test to
make sure people don't accidentally add them.

Change-Id: Iffbce474d22cf1f6c2042f7882f215875c8df3cf
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/369734
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-04-19 02:53:37 +00:00
99ebf627db upload: Add --no-follow-tags by default to git push
Gerrit does not accept pushing git tags to CLs. Hence, this change disables push.followTags for repo upload.

Fixed: b/155095555
Change-Id: I8d99eac29c0b4b375bdb857ed063914441026fa1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/367736
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-04-05 19:05:45 +00:00
57cb42861d run_tests: Check flake8
This also gets enforced in CQ.

Bug: b/267675342
Change-Id: I8ffcc5d583275072fd61ae65ae4214b36bfa59f3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/366799
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-31 04:25:53 +00:00
e74d9046ee Update abandon to support multiple branches
This change updates the `repo abandon` command to take multiple space-separated branch names as parameters.

Bug: https://crbug.com/gerrit/13354
Change-Id: I00ad7a79872c0e4161f8183843835f25cd515605
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/365524
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-03-24 07:39:28 +00:00
21cc3a9d53 run_tests: Always check black and check it last
https://gerrit-review.googlesource.com/c/git-repo/+/363474/24..25 meant
to improve run_tests UX by letting users rerun it quickly, but it also
removed CQ enforcement of formatting since CQ passes args to run_tests.

Run pytest first so devs don't have to format first, and always check
black formatting so it's enforced in CQ.

Bug: b/267675342
Change-Id: I09544f110a6eb71b0c6c640787e10b04991a804e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/365727
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-24 01:38:20 +00:00
ea2e330e43 Format codebase with black and check formatting in CQ
Apply rules set by https://gerrit-review.googlesource.com/c/git-repo/+/362954/ across the codebase and fix any lingering errors caught
by flake8. Also check black formatting in run_tests (and CQ).

Bug: b/267675342
Change-Id: I972d77649dac351150dcfeb1cd1ad0ea2efc1956
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/363474
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-22 17:46:28 +00:00
1604cf255f Make black with line length 80 repo's code style
Provide a consistent formatting style and tox commands to lint and
format.

Bug: b/267675342
Change-Id: I33ddfe07af8473f4334c347d156246bfb66d4cfe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/362954
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-20 20:37:24 +00:00
75eb8ea935 docs: update Focal Python version
It ships with Python 3.8 by default, not 3.7.

Change-Id: I11401d1098b60285cfdccadb6a06bb33a5f95369
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/361634
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-03-02 22:26:00 +00:00
7fa149b47a upload: Skip upload if merge branch doesn't match project revision and dest_branch.

- This still prevents the case mentioned here:
https://gerrit-review.googlesource.com/c/50300
while also supporting dest_branch.
- Update _GetMergeBranch to get merge branches for any branch, not just
the one we happen to run `repo upload` in. (e.g. for uploading multiple
branches)

Bug: b/27955930
Change-Id: Ia8ee1d6a83a783c984bb2eb308bb11b3a721a95d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/360794
Commit-Queue: Joanna Wang <jojwang@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Joanna Wang <jojwang@google.com>
2023-02-28 14:21:17 +00:00
a56e0e17e2 tests: Change docstring for CopyLinkTestCase
Change-Id: Ic31b8073090abffe4e90cd208b684e99b83d7ef2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/358455
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2023-02-14 21:48:41 +00:00
3ed84466f4 tests: Rework run_tests to use pytest directly and add vpython3 file
Remove logic to handle importing the right version of pytest.
'./run_tests' still works but this allows presubmit builders to test
using 'vpython3 ./run_tests'.

Google-Bug-Id: b/266734831
Change-Id: I6a543c1f4b5b4449e723095b4a70e5228b1ccd34
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/356717
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-02-13 19:19:38 +00:00
48067714ec sync: Remove unused variable
Change-Id: I44ab990c89ab4da82c424bae95e463cabb12fd50
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/357136
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-02-02 18:11:27 +00:00
69427da8c9 Handle KeyboardInterrupt during repo sync
If an interrupt signal is sent to the repo process while sync is running,
repo prints a stack trace for each concurrent job that is currently
running, with no useful information.

Instead, this change captures KeyboardInterrupt in each process and
prints one line about the project currently being processed.
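A rough sketch of the approach, with a hypothetical worker standing in for the real fetch job:
```
import time


def _fetch_one(project_name):
    """Hypothetical worker; time.sleep stands in for the real `git fetch`."""
    try:
        time.sleep(5)
    except KeyboardInterrupt:
        print(f"Interrupted while syncing {project_name}")
        return False
    return True
```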

Change-Id: Ieca760ed862341939396b8186ae04128d769cd56
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/357135
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2023-02-01 23:41:11 +00:00
dccf38e34f Update sync progress
The repo sync progress bar is misleading. Many bug reports claimed that
repo was stuck on the project currently displayed in the progress bar,
but repo sync actually shows the last repository processed. This change
makes that obvious.

Change-Id: I962bf0bc65af7ac0ed98db86e9144f07d9e1f96f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/357134
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2023-02-01 23:38:52 +00:00
7f44d366d0 project: clean up error message
Superproject update failures on single-manifest checkouts had an extra
space.

Bug: b/254523816
Change-Id: I6f71e42337e324a6975c5d6bba487f83abaf054f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/357056
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2023-02-01 23:00:47 +00:00
2aa5d32d70 Update bug tracking links
Update monorail component where actual git-repo bugs are.

Change-Id: I46c68053683d7aa93585bb5633a598f1578b1468
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/357057
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2023-02-01 22:57:36 +00:00
016a25447f git_superproject: Log actual error fmt instead of the entire error message.
Bug: b/258492341
Change-Id: I00678d572712791190ae1ad4e1bcf3cbe04cc1c0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/357114
Tested-by: Joanna Wang <jojwang@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-02-01 22:32:56 +00:00
7eab0eedf2 sync: Silence 'not found in manifest' message
This message can potentially show up when syncing projects with
submodules that are not declared in the manifest, as well as for the
internal '.repo/repo' project, which is likely not desirable from a
user standpoint.

Change-Id: I93d7fcd6e3fd1818357ea4537882a864dea9942c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/355920
Reviewed-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Michael Kelly <mkelly@arista.com>
2023-01-31 21:52:16 +00:00
7e3b65beb7 Enable use of REPO_CONFIG_DIR to customize .repoconfig location
For use cases with multiple instances of repo, e.g. some CI environments.

Bug: https://crbug.com/gerrit/15803
Change-Id: I65c1cfc8f6a98adfeb5efefc7ac6b45bf8e134de
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/356719
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-01-28 02:05:52 +00:00
c3d61ec252 init: Silence the "rm -r .repo and try again" message if quiet
Bug: b/258532367
Change-Id: I53a23aa0b237b0bb5f7e58464936f8c9b0db1311
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/355915
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2023-01-06 16:01:52 +00:00
78e82ec78e Fix flake8 warnings for some files
Change-Id: If67f8660cfb0479f0e710b3566285ef401fcf077
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/355969
Tested-by: Sergiy Belozorov <sergiyb@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-01-05 18:43:12 +00:00
37ae75f27d update_manpages.py: treat regex as raw string
Treat the values in the regex map as raw strings to fix
Invalid escape sequence 'g' (W605).
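For illustration (the pattern and values are made up, not the actual map entries), raw strings keep the backslash intact so flake8 no longer flags W605:
```
import re

# Raw strings avoid the "invalid escape sequence" warning for "\g<1>".
print(re.sub(r"repo-(\d+)", r"release \g<1>", "repo-2"))  # release 2
```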

Change-Id: I53bf5d6bd1e1d6a1d1293e4f55640b6513bf3075
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354698
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-12-13 16:24:07 +00:00
7438aef1ca Use 'backslashreplace' for decode
Resolve TODO as we are now requiring Python 3.
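A small illustration of the decode mode in generic Python (not the actual call site):
```
raw = b"ok \xff done"
print(raw.decode("utf-8", errors="backslashreplace"))  # ok \xff done
```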

Change-Id: I7821627bd5c606276741c98efedaf5b11aecbcc3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354702
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2022-12-13 16:23:46 +00:00
e641281d14 Use print with flush=True instead of stdout.flush
Resolves multiple TODOs. Since we are requiring Python 3,
we move to using print function with flush=True instead of
using sys.stdout.flush().
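The pattern being replaced, sketched generically rather than quoting the actual call sites:
```
import sys

# Before: explicit flush after print().
print("Fetching...", end="")
sys.stdout.flush()

# After: Python 3 print() can flush directly.
print("Fetching...", end="", flush=True)
```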

Change-Id: I54db0344ec78ac81a8d6c6c7e43ee7d301f42f02
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354701
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-12-13 16:23:28 +00:00
035f22abec pylint: remove unused imports
Removed unused imports across multiple files.

Change-Id: Ib5ae4cebf9660e7339b11e3fa592d99f8d51e8d8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354700
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-12-13 16:23:19 +00:00
e0728a5ecd update-manpages: clean up symlink in checkout
We don't want symlinks in the git tree as it causes pain for Windows
users.  We also don't really need it as we can refactor the code we
want to import slightly.

Change-Id: I4537c07c50ee9449e9f53e0f132a386e8ffe16ec
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354356
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-12-12 23:04:40 +00:00
d98f393524 upload: Allow user to configure unusual commit threshold
Add a per-remote option `uploadwarningthreshold` allowing the user to
override how many commits can be uploaded prior to a warning being
displayed.

Change-Id: Ia7e1b2c7de89a0bf9ca1c24cc83dc595b3667437
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354375
Tested-by: David Greenaway <dgreenaway@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-12-12 22:19:57 +00:00
0324e43242 repo_trace: Avoid race conditions with trace_file updating.
Change-Id: I0bc1bb3c8f60465dc6bee5081688a9f163dd8cf8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354515
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Joanna Wang <jojwang@google.com>
2022-12-09 22:49:31 +00:00
8d25584f69 github: enable flake8 postsubmit testing
Change-Id: I8532f52b3016eb491ddeb48463459d74afd36015
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354514
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-12-09 14:32:29 +00:00
0e4f1e7fba Use --negotiation-tip in superproject fetches.
Bug: b/260645739
Change-Id: Ib0cdbb13f130b91ab14df9c60a510f1e27cca8e0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354354
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Joanna Wang <jojwang@google.com>
2022-12-09 14:25:15 +00:00
e815286492 tests: clean up repo_trace._TRACE_FILE patching
Patch this automatically for all tests rather than duplicating the
boilerplate in different test cases.

Change-Id: I391d5c859974cda3d5680d34ede2ce6e9e925838
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354358
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-12-08 22:22:39 +00:00
0ab6b11688 wrapper: switch to functools.lru_cache
No need to implement our own caching logic with newer Python.
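A hedged sketch of the pattern (the function name and return value are made up):
```
import functools


@functools.lru_cache(maxsize=None)
def find_wrapper_path():
    """Hypothetical expensive lookup; computed once, then served from cache."""
    return "/usr/bin/repo"
```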

Change-Id: Idc3243b8e22ff020817b0a4f18c9b86b1222d631
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354357
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2022-12-08 22:22:36 +00:00
a621254b26 tests: drop old unittest.main logic
We use pytest now which doesn't need this boilerplate.

Change-Id: Ib71d90b3f1669897814ee768927b5b595ca8d789
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354355
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-12-08 17:17:30 +00:00
f159ce0f9e sync: fix manifest sync-j handling
Since --jobs defaults to 0, not None, we never pull the value out
of the manifest.  Treat values of 0 and None the same to fix this.

Bug: http://b/239712300
Bug: http://b/260908907
Change-Id: I9b1026682072366616825fd72f90bd90c10a252f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354254
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Sam Saccone <samccone@google.com>
2022-12-08 15:06:24 +00:00
802cd0c601 sync: Fix undefined variable in _FetchOne
If syncing in _FetchOne fails with GitError, sync_result does not get
set. There's already a separate local variable for success; do the same
for remote_fetched instead of referring to the conditionally defined
named tuple.

This bug is originally caused by a combination of ad8aa697 "sync: only
print error.GitError, don't raise that exception." and 1eddca84 "sync:
use namedtuples for internal return values".

Change-Id: I0f9dbafb97f8268044e5a56a6f92cf29bc23ca6a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354176
Tested-by: Karsten Tausche <karsten@fairphone.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-12-08 06:29:00 +00:00
100a214315 sync: finish marking REPO_AUTO_GC=1 as deprecated.
The wrong revision of the change was submitted as
d793553804.

Change-Id: I6f3e4993cf40c30ccf0d69020177db8fe5f76b8c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353934
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Sam Saccone <samccone@google.com>
2022-12-05 18:11:24 +00:00
8051cdb629 test_manifest_config_properties: use assertEqual
The method assertEquals is a deprecated alias for
assertEqual.
See: https://docs.python.org/3/library/unittest.html#deprecated-aliases
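The rename is the standard unittest one, e.g. (illustrative test, not the actual suite):
```
import unittest


class ExampleTest(unittest.TestCase):
    def test_value(self):
        # assertEquals is a deprecated alias; prefer assertEqual.
        self.assertEqual(2 + 2, 4)
```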

Change-Id: Id94ba6d6055bdc18b87c53e8729902bb278855aa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/354035
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2022-12-05 13:09:09 +00:00
43549d8d08 sync: cleanup output when not doing GC
Do not use a progress bar when not doing GC, and restrict activity in
that case to only repairing preciousObject state.

This also includes additional cleanup based on review comments from
previous changes.

Change-Id: I48581c9d25da358bc7ae15f40e98d55bec142331
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353514
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-12-02 22:50:23 +00:00
55b7125d6a Revert "sync: save any cruft after calling git gc."
This bug-cacher related code is no longer needed.

This reverts commit 891e8f72ce.

Change-Id: Ia94a2690ff149427fdcafacd39f5008cd60827d5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353774
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Sam Saccone <samccone@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-12-02 22:40:06 +00:00
d793553804 sync: mark REPO_AUTO_GC=1 as deprecated.
REPO_AUTO_GC was introduced as a way for users to restore the previous
default behavior, since the default changed at the same time as the
option was added.  As such, it should be marked as deprecated, and
removed entirely in a future release.

Change-Id: Ib73d98fbea693e7057cc4587928c225a9e4beab2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353734
Reviewed-by: Sam Saccone <samccone@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-12-02 22:33:11 +00:00
ea5239ddd9 Fix ManifestProject.partial_clone_exclude property.
Bug: b/256358360

Change-Id: Ic6e3a049aa38827123d0324c8b14157562c5986e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353574
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Joanna Wang <jojwang@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2022-12-02 14:57:56 +00:00
1b8714937c release-process: update to use ./release/sign-tag.py
We have a helper script for signing releases now, so point the docs
to that rather than the multiple manual steps.

Change-Id: I309e883dbce1894650e31682d9975cf0d6bdeca3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/352834
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2022-12-01 00:04:24 +00:00
50a2c0e368 wrapper.py: Replacing load_module() with exec_module()
Fixed "DeprecationWarning: the load_module() method is deprecated and
slated for removal in Python 3.12; use exec_module() instead." in
wrapper.py. Additionally removed Python 2 code (imp.load_source()).
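A sketch of the modern importlib pattern (the module name and path are placeholders):
```
import importlib.util

spec = importlib.util.spec_from_file_location("repo_wrapper", "/path/to/repo")
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)  # replaces the deprecated load_module()
```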

Test: tox
Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: Ib7cc19b1c545f6449e034c4b01b582cf6cf4b581
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353237
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-28 15:06:59 +00:00
35af2f8daf Fixed wrapper related warnings in tests
Multiple "Could not find reference" warnings in test_wrapper.py
and test_git_command.py resolved.

Test: tox
Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: Ic254c378bbdae6bc3f8f29682ababb37db76adfe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353235
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-28 13:00:34 +00:00
e287fa760b test_capture: allow both Unix and Windows line sep
On Linux/macOS we allow \n at the end of the line. On Windows we allow
both \r\n and \n. Here we also allow Unix line separators, as the tests
might be executed in, for example, git-shell.

Change-Id: I3975b563cf95407da92e5479980e670eb748b30e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353181
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-28 06:39:31 +00:00
3593a10643 test_bad_path_name_checks: allow Windows path sep
With this change, the test will pass if a path ends with '/' on
Linux/macOS, or ends with either '/' or '\' on Windows.

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: Id7d1b134f9c0bdf7ceaf149af304bbf90cbd7b21
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353180
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-28 04:07:17 +00:00
003684b6e5 test: Fix char encoding issues on windows
Some tests were failing because Windows does not use UTF-8 by default
when executing the tests. Enforcing usage of UTF-8 resolves these
issues.

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: If42f6be2a2b688a6105ecf4fcdb541aade24519a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353179
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-27 17:03:41 +00:00
0297f8312c test: fix path seperator errors on windows
Fix multiple errors when running tests on Windows related to the path
separator being different ('\' instead of '/').

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: I26b44d092b925edecab46a4d88e77dd9dcb8df28
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353178
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-27 17:03:01 +00:00
7b3afcab7a tox: Allow passing positional arguments
Allows us to pass on arguments to run_tests and pytest after -- when
executing tox.
E.g., to run all tests in a test class verbosely:
  tox -- -v tests/test_project.py::ReviewableBranchTests

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: Ibd78856c6d4053c769f3d0b6130ebc8145275f78
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353176
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-27 11:35:46 +00:00
eda6b1ead7 trace: make test timeout after 2min
Before this commit, the test was hanging forever when
run on a Windows host. This should resolve that issue.

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: Id9ea6d54926b797db3d2978a2ae2930088201eec
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353125
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-26 23:30:43 +00:00
4364a79088 tox: Make all tests timeout after 5min
Use pytest-timeout to make sure tests don't get stuck for more than
5 minutes. In future individual tests can exceed this timeout by
being decorated with @pytest.mark.timeout(600).
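For example, with the pytest-timeout plugin installed, an individual test can opt in to a longer limit (the test name is illustrative):
```
import time

import pytest


@pytest.mark.timeout(600)
def test_slow_operation():
    # Allowed up to 10 minutes instead of the global 5-minute cap.
    time.sleep(1)
```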

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: I8f5b61a20230c22a86fd5636297c78f41369449a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353124
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-26 23:30:37 +00:00
a98a5ebc6d Update GH Action test-ci.yml dependencies
Update the versions of the checkout and setup-python actions.
Also make sure we install tox and tox-gh-actions into our venv.
Changes are based on the tox-gh-actions README.

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: I18946a8b41d5a3c350deee3ddbde77b4c0b3bdfe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353123
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-26 00:57:04 +00:00
f8d342beac tox: enable python 3.10 testing
Note that in YAML, Python version 3.10 would be parsed as 3.1,
hence I put all the Python versions in quotes.
More on this:
https://github.com/actions/setup-python/issues/160

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: Iba380a6a6a6de8486486c8981e712c7bf4dfe759
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353019
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-25 12:16:43 +00:00
6d2e8c8237 Resolved DeprecationWarning for currentThread()
From Python 3.10 onwards we see a DeprecationWarning:
currentThread() is deprecated, use current_thread() instead.
The same goes for getName(), which is replaced by the name attribute.
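The replacement uses only the standard threading API:
```
import threading

# Deprecated since Python 3.10: threading.currentThread().getName()
name = threading.current_thread().name
print(name)  # e.g. MainThread
```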

Test: tox (python 3.6 - 3.10)

Signed-off-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Change-Id: I80ec819752a5276cff3b2dadba0ec10cc92d09a4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/353018
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-11-25 08:34:57 +00:00
a24185ee6c Set repo version to 2.30 (current)
Change-Id: Ie01ea8475b978f950471b0a52fc576e59060c6c5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/352694
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2022-11-23 01:45:59 +00:00
d686365449 Extract env building into a testable helper.
Previously env dict building was untested and mixed with other mutative
actions. Extract the dict building into a dedicated function and author
tests to ensure the functionality is working as expected.

BUG: b/255376186
BUG: https://crbug.com/gerrit/16247
Change-Id: I0c88e53eb285c5c3fb27f8e6b3a903aedb8e02a8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351874
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Sam Saccone <samccone@google.com>
2022-11-16 18:26:49 +00:00
d3cadf1856 Do not set ALT object dirs when said path resolves to the same dir.
Due to symlink resolution git was treating this as two different directories even if the paths were the same. This mitigates the git core bug inside of repo (while the git core fix is being worked on).

Bug: b/255376186
Bug: https://crbug.com/gerrit/16247
Change-Id: I12458ee04c307be916851dddd36231997bc8839e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351836
Tested-by: Sam Saccone <samccone@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-11-16 18:26:49 +00:00
fa90f7a36f tests: Fix update-manpages test.
Change-Id: I58d85e06edeb9208a782957acc982e996c026ed2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351854
Reviewed-by: Sam Saccone <samccone@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-11-16 16:53:49 +00:00
bee4efb874 subcmds: display correct path multitree messages
Correct usage of project.relpath for multi manifest workspaces.

Change-Id: Idc32873552fcdae6eec7b03dde2b2f31134b72fd
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/347534
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-11-15 21:13:06 +00:00
f8af33c9f0 update-manpages: explicitly strip color codes
On some systems, help2man produces color codes in the output.  Remove
them to avoid manpage churn.

Also begin adding unit tests.

Change-Id: I3f0204b19d9cae524d3cb5fcfb61ee309b0931fc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/349655
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-11-14 23:46:43 +00:00
ed25be569e repo_trace: drop notification of trace file name.
The trace file is local to the workspace. We shouldn't tell the user
that on every command that they run.

Change-Id: I8674ab485bd5142814a043a225bf8aaca7795752
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351234
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-11-14 23:46:06 +00:00
afd767103e repo_trace: adjust formatting, update man page.
No behavior change in this CL.

Change-Id: Iab1eb01864ea8a5aec3a683200764d20786b42de
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351474
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-11-14 23:46:06 +00:00
b240d28bc0 upload: track projects by path, rather than name
Since the same project can be checked out in multiple paths, we need to
track the "to be uploaded" projects by path, rather than project name.

Bug: crbug.com/gerrit/16260
Test: manual
Change-Id: Ic3dc81bb8acb34886baa6299e90a49c7ba372957
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351054
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-11-14 21:58:10 +00:00
47020ba249 trace: restore Progress indicator.
If we are not tracing to stderr, then we should still have progress
indication.

Change-Id: Ifc9678e1fccbd92251e972fcf25aad6369d60e15
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351195
Reviewed-by: Sam Saccone <samccone@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-11-10 00:44:33 +00:00
5ed8c63942 sync: REPO_AUTO_GC=1 to restore old behavior.
Add an environment variable to restore previous behavior, since the
older version of repo does not support `--auto-gc`.

Change-Id: I874dfb8fc3533a97b8adfd52125eb3d1d75e2f3c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/351194
Reviewed-by: Sam Saccone <samccone@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-11-10 00:44:33 +00:00
24c6314fca Fix TRACE_FILE renaming.
Bug: b/258073923

Change-Id: I997961056388e1550711f73a6310788b5c7ad4d4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/350934
Tested-by: Joanna Wang <jojwang@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-11-09 01:24:49 +00:00
7efab539f0 sync: no garbage collection by default
Adds --auto-gc and --no-auto-gc (default) options to control sync's
behavior around calling `git gc`.

Bug: b/184882274
Change-Id: I4d6ca3b233d79566f27e876ab2d79f238ebc12a9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/344535
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-11-08 19:54:20 +00:00
a3ff64cae5 Improve always-on-trace
Notes to the user need to go to stderr, and tracing should not be on for
fast exiting invocations (such as --help).

This makes it so that release/update-manpages works.

Change-Id: Ib183193c868a78c295a184c01c4532cd53d512eb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/350794
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-11-08 19:54:20 +00:00
776138a938 Merge branch stable into main (--strategy=ours).
This will allow the next repo release to be a fast-forward on stable.

* origin/stable:
  v2.29.7: Revert back to v2.29.5

Change-Id: I3e52f76766807c58f56d3e246fa142ed55ede59b
2022-11-08 18:49:16 +00:00
5fb9c6a5b3 v2.29.7: Revert back to v2.29.5
This change reverts stable to v2.29.5, to fix clients that received
v2.29.6, and keep future updates simpler.

Change-Id: I2f5c52c466b7321665c9699ccdbf98f928483fee
2022-11-08 00:54:56 +00:00
859d3d9580 GitcInit: fix gitc-init failure
Aligns argument usage of refactored GitcManifest (8c1e9cbef
"manifest_xml: refactor manifest parsing from client management") to fix
the `repo gitc-init` error: `fatal: manifest_file must be abspath`.

Change-Id: I1728032cce3f39ed1077bbb7ef714410c2c49e1a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/350374
Tested-by: Woody Lin <woodylin@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-11-04 17:30:40 +00:00
fa8d939c8f sync: clear preciousObjects when set in error.
If this is a project that is not using object sharing (there is only one
copy of the remote project) then clear preciousObjects.

To override this for a project, run:

  git config --replace-all repo.preservePreciousObjects true

Change-Id: If3ea061c631c5ecd44ead84f68576012e2c7405c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/350235
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-11-03 23:01:16 +00:00
a6c52f566a Set tracing to always on and save to .repo/TRACE_FILE.
- add `--trace_to_stderr` option so stderr will include trace outputs and
  any other errors that get sent to stderr, while TRACE_FILE will only
  include trace outputs

piggy-backing on: https://gerrit-review.googlesource.com/c/git-repo/+/349154

Change-Id: I3895a84de4b2784f17fac4325521cd5e72e645e2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/350114
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Joanna Wang <jojwang@google.com>
2022-11-03 21:07:07 +00:00
0d130d2da0 tests: Make the tests pass for Python < 3.8
Before Python 3.8, xml.dom.minidom sorted the attributes of an element
when writing it to a file, while later versions output the attributes
in the order they were created. Avoid these differences by sorting the
attributes for each element before comparing the generated manifests
with the expected ones.

This corresponds to commit 5d58c18, but for new tests introduced since
it was integrated.
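A minimal sketch of one way to compare attributes order-independently (not the test suite's exact helper):
```
import xml.dom.minidom


def attr_dict(element):
    # Compare attributes as a dict so serialization order is irrelevant.
    return dict(element.attributes.items())


a = xml.dom.minidom.parseString('<project name="p" path="x"/>').documentElement
b = xml.dom.minidom.parseString('<project path="x" name="p"/>').documentElement
assert attr_dict(a) == attr_dict(b)
```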

Change-Id: I5c360656a0968e6e8d57eb068c8e87da7dfa61c1
Signed-off-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/349917
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-10-28 17:26:48 +00:00
b750b48f50 init: add --manifest-depth for shallow manifest clone
People rarely care about the history of the manifest repo.  Add a
parameter to specify depth for the manifest.

For now, make the default behavior the same as the current behavior.  At
a future date, the default will be changed to 1.  People who need the
full history should begin passing --manifest-depth=0 to preserve the
behavior when the default changes.

We can't reuse the existing --depth option because that applies to
all projects we clone, not just the manifest repo.

Bug: https://crbug.com/gerrit/16193, https://crbug.com/gerrit/16358
Change-Id: I9130fed3eaed656435c778a85cfe9d04e3a4a6a0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/349814
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-10-27 21:59:09 +00:00
6c8b894d8d Revert "init: change --depth default to 1 for manifest repo"
This reverts commit 076d54652e.

Reason for revert: crbug.com/gerrit/16358

Change-Id: I2970eb50677cca69786f71edffe4aa5271cf139f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/349834
Reviewed-by: Sam Saccone <samccone@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-10-27 21:52:02 +00:00
b6cfa09500 sync: uninitialized variable on mirror sync failure
When repo sync fails, if the workspace is a mirror, an uninitialized
variable is referenced.

Bug: crbug.com/gerrit/16356
Change-Id: I1dba9f92319b9cbfd18460327560a395c88a089f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/349654
Reviewed-by: Sam Saccone <samccone@google.com>
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-10-26 23:13:02 +00:00
78dcd3799b sync: do not require python 3.9
Use pre-3.9 syntax for NamedTuple, so that users do not need to have
python 3.9 or later installed.
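For illustration, "pre-3.9 syntax" means using typing generics rather than subscripting built-in types (the field names here are hypothetical):
```
from typing import Dict, NamedTuple


class FetchTimes(NamedTuple):
    success: bool
    times: Dict[str, float]  # pre-3.9 spelling; dict[str, float] needs 3.9+
```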

Bug: b/255632143, crbug.com/gerrit/16355
Test: manually verified with python 3.8
Change-Id: I488d2d5267ed98d5c55c233cc789e629f1911c9d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/349395
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Jonathan Nieder <jrn@google.com>
2022-10-25 22:46:47 +00:00
acc4c857a0 sync: only use --cruft when git supports it.
git gc --cruft was added in 2.37.0.

Bug: https://crbug.com/gerrit/16270
Change-Id: I71e46741e33472a92f16d6f11c51a23e1e55d869
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/346577
Reviewed-by: Emily Shaffer <emilyshaffer@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-09-22 19:18:48 +00:00
a39af3d432 project: Add a missing call to _CopyAndLinkFiles
If a file that is copied using a <copyfile> tag is modified and not
committed or if it is committed to a detached head, then running `repo
sync` would update the target file as expected. However, if the
modified file is committed to a local branch, then running `repo sync'
would not update the target file as expected.

Change-Id: Ic98e37d1c2e51fd1bf15abf149c7d06190cfd6d2
Signed-off-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/344475
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-09-20 09:24:01 +00:00
4cdfdb7734 manifest: allow extend-project to override dest-branch and upstream
Bug: https://crbug.com/gerrit/16238
Change-Id: Id6eff34791525b3df690e160c911c0286331984b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/345144
Tested-by: Erik Elmeke <erik@haleytek.corp-partner.google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-09-20 04:25:02 +00:00
1eddca8476 sync: use namedtuples for internal return values
Replace tuple returns with namedtuples, to simplify adding new fields.

Extend the Sync_NetworkHalf return value to:
 - success: True if successful (the former return value)
 - remote_fetched: True if we called `git fetch`
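A sketch of the shape described above (the class name is illustrative):
```
from typing import NamedTuple


class SyncNetworkHalfResult(NamedTuple):
    success: bool         # True if successful (the former return value)
    remote_fetched: bool  # True if we called `git fetch`
```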

Change-Id: If63c24c2f849523f77fa19c05bbf23a5e9a20ba9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/344534
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-09-19 22:03:18 +00:00
aefa4d3a29 sync: incorporate review feedback.
This incorporates feedback from
https://gerrit-review.googlesource.com/c/git-repo/+/345114

Change-Id: I04433d6435b967858f1ffb355217d90bc48c1e5d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/345894
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-09-19 22:03:18 +00:00
4ba29c42ca diffmanifests: Handle Missing Projects in Repo Workspace
By default there are 4 categories the diffmanifests api puts the diffs
into - added, removed, changed and unreachable.

Example of command - repo diffmanifests 1.xml 2.xml

added - lists the projects present in the second manifest but not in the
first
removed - lists the projects present in the first manifest but not in the
second
changed - lists the changes and the differences for each project
unreachable - used when the revision value in a project is incorrect

But when there are projects present in both manifests that cannot be
found in the local workspace where the repo was cloned (because of a
different/subset manifest xml), this creates an unhandled exception.

Now we have added a 5th category called 'missing', which handles that
scenario and prints a log message for the user.

Example:
added projects :
        project_2 at revision e6c8a59832c05dc4b6a68cee6bc0feb832181725

removed projects :
        project_1 at revision e6c8a59832c05dc4b6a68cee6bc0feb832181725

changed projects :
        project_3 changed from 3bb890e1286f04e84d505e5db48e0ada89892331 to e434b3736f11537c67590fefadfe4495895e9785

missing projects :
        project_4

Change-Id: I244e8389bff7e95664c29d3dcb61e22308e3a573
Signed-off-by: Shashank Devaraj <shashankkarthik@gmail.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/344774
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-09-15 17:42:08 +00:00
45ef9011c2 update-manpages: force use of active interp
Since the repo wrapper uses #!/usr/bin/python, use the python3 that
this wrapper is actively using.

Change-Id: I03d1e54418d18a504eec628e549b4cc233621c45
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/345294
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-09-12 19:27:09 +00:00
891e8f72ce sync: save any cruft after calling git gc.
This is ENABLED BY DEFAULT due to data corruption potential.  To disable
it, set REPO_BACKUP_OBJECTS=0 in the environment.

While the workspace will grow over time, this provides a recovery path
for an issue where objects are erroneously deleted from the workspace,
resulting in lost work.  Once the root cause is determined, we will be
able to stop saving backups again.

Backups are kept in .git/objects/.repo/pack.bak

Bug: https://crbug.com/gerrit/16247
Change-Id: Ib8b5c9b4bf0dfa9e29606e0f5c881d65996b2a40
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/345114
Reviewed-by: Jonathan Tan <jonathantanmy@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-09-09 19:04:30 +00:00
af8fb132d5 Revert "project: initialize new manifests in temp dirs"
This reverts commit 07d21e6bde.

Reason for revert: crbug.com/gerrit/16230, b/244467766 - breaks aosp-master-with-phones case

Change-Id: Id967d92f8622c2c13356b09e46ece9f20040aabc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/344314
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-09-08 23:28:27 +00:00
4112c07688 sync: Correctly sync multi manifest workspaces
When actually fetching the manifests, start at the correct (sub)
manifest.

Bug: https://crbug.com/gerrit/16198
Change-Id: I39fdd726f1917ef4277a0b7c83663c8f49167466
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343914
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-08-24 19:35:38 +00:00
fbd5dd3a30 manifest_xml: improve topdir accuracy.
Do not include a trailing path separator when submanifest_path is empty.

Bug: https://crbug.com/gerrit/16104
Change-Id: Ia65e355de25bdb1067fe50ab1d47db6e798d5a71
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343674
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-08-22 21:49:49 +00:00
3d27c71dd9 init: hide identify spam when reinitializing
We don't want to keep showing the user config notice when reinitializing
existing checkouts, so hide it.

Change-Id: Id40610bd683396cbff7e1aefc092c8b77c397783
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343536
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-08-22 21:01:35 +00:00
488d54d4ee init: show a notice when reinitializing
Make it clear to users when we're reinitializing an existing checkout
in case they weren't expecting it.

Bug: https://crbug.com/gerrit/12396
Change-Id: I22e89ae041a8e7b147c9d06a82f1302dd5807ae0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343535
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-22 21:01:10 +00:00
5a5cfce1b2 stage: add missing flush before project prompt
Bug: https://crbug.com/gerrit/13223
Change-Id: Ib279d86a52e1035e02d6f7d8f053c3a43e721032
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343555
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-22 20:48:29 +00:00
e6d4b84060 upload: respect --yes with large upload confirmation
If the user passes in --yes, don't prompt them to confirm large uploads.

Bug: https://crbug.com/gerrit/14085
Change-Id: Ic801b21be80ba181801531acd4af5057ec10c11c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343554
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-22 20:48:27 +00:00
d75ca2eb9d launcher: make missing .repo/repo/repo an error
If the specified repo dir doesn't actually have a `repo` program,
we only show a warning before continuing on, and then we fail in
weird ways.  Since we really need the repo dir to contain repo,
have this be fatal and delete the results.

Bug: https://crbug.com/gerrit/13526
Change-Id: Icee4cba96136d470cbb459a81918c40205078f98
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343538
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-22 20:48:02 +00:00
a010a9f4a0 launcher: initialize repo in a temp dir
In case something goes wrong in the initial setup of the repo dir,
clone it into a temporary .repo/repo.tmp/ directory first, and then
rename it only when things have finished fully.

Bug: https://crbug.com/gerrit/13526
Change-Id: Ib0f5a975e4d436b0fb616fac70f5789c4e02a61a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343537
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-22 20:47:50 +00:00
8a54a7eac3 cherry-pick: tighten up output
If stdout or stderr are empty, don't print empty lines.
Also trim any trailing lines so we don't show excess ones.

Change-Id: I65fcea191e1f725be03c064cb510578632466013
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343516
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-08-22 19:38:32 +00:00
63a5657ecf git_command: fix input passing
After reworking this function to use subprocess for output capturing
in commit c87c1863b1 ("git_command:
switch process capturing over to subprocess"), passing input via
stdin write no longer works.  We have to pass it via communicate(),
and we have to pass it a string instead of bytes (since we always
use encoding='utf-8' now).  This is fine since the only user of the
input= setting today is already passing in a string.
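A generic sketch of the fix (using `cat` as a stand-in for the git command): with encoding set, input must be a str and go through communicate():
```
import subprocess

proc = subprocess.Popen(
    ["cat"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    encoding="utf-8",
)
out, _ = proc.communicate(input="hello\n")
print(out)  # hello
```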

Bug: https://crbug.com/gerrit/16151
Change-Id: Ic58db1e568b8f8aa840a6d62c5a157c14aa6d9bc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343515
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-08-22 19:34:59 +00:00
07d21e6bde project: initialize new manifests in temp dirs
If initializing the manifest fails for any reason, don't leave it in
a half complete state.  This can cause problems if/when the user tries
to reinit because different codepaths will be taken.  For example, if
we initialize manifests.git and don't finish probing the remote to see
what default branch it uses, we end up always using "master" even if
that isn't what the remote uses.

To avoid all of this, use .tmp dirs when initializing, and rename to
the final path only after we complete all the right steps.

We should roll this out to all projects we clone, but start with the
manifest project for now.

Bug: https://crbug.com/gerrit/13526
Bug: https://crbug.com/gerrit/15805
Change-Id: I0214338de69ee11e090285c6b0b211052804af06
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343539
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-22 19:34:46 +00:00
076d54652e init: change --depth default to 1 for manifest repo
People rarely care about the history of the manifest repo.  Change
the default to 1 to speed up initial setup.  If people really want
the full history, they can pass --manifest-depth=0.

We can't reuse the existing --depth option because that applies to
all projects we clone, not just the manifest repo.

Bug: https://crbug.com/gerrit/16193
Change-Id: Ideab1712e9ffc743b9bae97903e074d7d429b134
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343435
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-08-18 19:12:21 +00:00
790f4cea7a add a few more docs to existing funcs
Change-Id: I27317a59aba67c05ca1fd333e8f064c0edccb209
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343185
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-18 17:31:01 +00:00
39cb17f7a3 init: use --current-branch by default
People rarely care about having all manifest branches locally.  Change
the default to only pull down the selected branch.  If people want other
branches, the -b option will fetch it automatically, or people can use
--no-current-branch.

This only applies to the manifest project syncing, not the rest of the
projects that are in the checkout.

Bug: https://crbug.com/gerrit/16194
Change-Id: Ia9e7e2f23b8028d82772db72dcc7d6c32260be79
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343434
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-08-18 16:54:17 +00:00
ad1b7bd2e2 start: do not swallow git output all the time
Normally git produces no output when creating or switching branches.
If there's a problem though, we want to show that to the user.  So
switch from capturing all output to running in quiet mode.

Bug: https://crbug.com/gerrit/15819
Change-Id: I7873ecc7c3bacce591899cc9471cb0244eb74541
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343454
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-18 16:43:39 +00:00
3c2d807905 pager: catch startup failures on Windows
If the user's pager settings are broken, display an error message
rather than crash to avoid confusing them.

Bug: https://crbug.com/gerrit/16173
Change-Id: Idc0891da783c68f3a96ac53a82781e34e40421fb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343437
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-18 16:43:16 +00:00
7fa8eedd8f upload: add --push-options tips & doc link
Change-Id: Iee38a80974c53231d1e9f04f7f85b2d0bac96dbb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/342354
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-08-18 16:40:40 +00:00
dede564c3d project: simplify GetRemote a bit
We almost always use self.remote.name when calling self.GetRemote,
so make that the default to simplify the code a bit.

Change-Id: Ifdf6e1370d6b8963b44e6d384b0fac8fa5c4f2ba
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/343184
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-08-17 16:54:21 +00:00
ac76fd3e3a upload: Add ready flag to remove wip
The `--wip` flag allows bulk-pushing changes as work-in-progress. This CL
intends to allow the opposite operation by removing the wip mark on the
CL and setting it to be ready for review.

Change-Id: If0743c5b14829f77be2def5a8547060d06a5648c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/342214
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: William Escande <wescande@google.com>
2022-08-03 20:17:06 +00:00
a8c34d1075 commit-msg: Sync commit-msg from gerrit 3.6.1
This includes:
- Ignore squash commits.
- Update to hash generation.
- Update to Change-Id and trailer generation.

TEST=cp hooks/commit-msg .git/hooks/commit-msg
TEST=git commit -s # right order
TEST=git commit; git commit --amend -s # right order

Change-Id: I4e4a2a02905d330f2863b562d7914fe6567a4118
Signed-off-by: Evan Benn <evanbenn@chromium.org>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/339554
Tested-by: Evan Benn <evanbenn@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-07-29 01:29:19 +00:00
5951e3043f sync: handle smartsync HTML responses better
If the server responds with an HTML page, we should show that to the
user instead of crashing with XML errors.

Bug: https://crbug.com/gerrit/15936
Change-Id: I52e6b781c3bb6a6c9f6ecbe2e0907044876cdc8d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/337519
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-07-28 17:44:21 +00:00
48ea25c6a7 sync: start clearing git settings that disabled pruning
For projects that no longer share their per-project objects directly, we
no longer have to disable the git settings that disable pruning.  See
commit "project: stop directly sharing objects/ between shared projects"
for more details.

Bug: https://crbug.com/gerrit/15553
Change-Id: Ica0e83c3002716424c2bc9111b3b3d3a76c30973
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/337535
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-07-25 17:13:20 +00:00
355f4398d8 sync: rework --jobs to provide better defaults
For --jobs-network, the logic is now:
* If the user specifies --jobs-network, use that.
* Else, if the user specifies --jobs, use that.
* Else, if the manifest specifies sync-j, use that.
* Else, default to 1.
Then we limit the jobs count based on the softlimit RLIMIT_NOFILE.

For --jobs-checkout, the logic is now:
* If the user specifies --jobs-checkout, use that.
* Else, if the user specifies --jobs, use that.
* Else, if the manifest specifies sync-j, use that.
* Else, default to DEFAULT_LOCAL_JOBS which is based on user's ncpus.
Then we limit the jobs count based on the softlimit RLIMIT_NOFILE.

For garbage collecting, the logic is now:
* If the user specifies --jobs, use that.
* Else, if the manifest specifies sync-j, use that.
* Else, default to the user's ncpus.
Then we limit the jobs count based on the softlimit RLIMIT_NOFILE.

Having to factor in the manifest settings makes this more complicated
which is why we delay processing of defaults until after we've synced
the manifest projects.
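
As a rough illustration of that fallback order (names like resolve_jobs and
manifest_sync_j are made up for this sketch, not the actual sync.py symbols):

    import multiprocessing
    import resource

    def cap_jobs(jobs):
        # Keep the job count well under the RLIMIT_NOFILE soft limit so each
        # job has file descriptors to spare.
        soft, _hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        return max(1, min(jobs, soft // 3))

    def resolve_jobs(cli_specific, cli_jobs, manifest_sync_j, fallback):
        # The first value that was explicitly set wins, per the order above.
        for value in (cli_specific, cli_jobs, manifest_sync_j):
            if value:
                return cap_jobs(value)
        return cap_jobs(fallback)

    # e.g. network jobs default to 1, checkout jobs to an ncpus-based value.
    jobs_network = resolve_jobs(None, None, 4, 1)
    jobs_checkout = resolve_jobs(None, None, 4, multiprocessing.cpu_count())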

Bug: http://b/239712300
Change-Id: Id27cda63c76c156f1d63f6a20cb2c4ceeb3d547c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/341394
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-07-25 15:36:43 +00:00
bddc964d93 Fix the printed path of the ".repo" dir after a fresh init.
Apparently, manifest.topdir already contains a trailing slash in some
cases, so a simple string concatenation may or may not lead to double
slashes. It is safer to use os.path.join. See
https://screenshot.googleplex.com/6pSY3QewAeCdAqk

Change-Id: I2411452296b7e78fc975787b675273a48d6b3d85
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/341574
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mateus Azis <azis@google.com>
2022-07-25 15:22:14 +00:00
a8cf575d68 Omit local_manifest groups from superproject override.
When we create superproject_override.xml, do not include projects that
are present from local_manifests/*.  Such projects are fully under the
control of the local_manifests/ file.

Bug: b/238934278
Test: manual, ./run_tests
Change-Id: I40382ceb82d9cf7b8dc7b5f2abed3f6d4d80017e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/340877
Tested-by: Xin Li <delphij@google.com>
Reviewed-by: Xin Li <delphij@google.com>
Reviewed-by: Sam Saccone 🐐 <samccone@google.com>
2022-07-15 23:32:24 +00:00
8501d4602a status, diff: display correct path for multi-manifest
Display the project path relative to the outermost manifest by default,
and relative to the sub manifest only when --this-manifest-only is
specified.

For project-related diagnostic messages, use the outermost manifest for
messages.

Change-Id: I4537d7dd412a2c182e77d6720e95c1b0ef70eb0e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/340754
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-07-14 16:00:18 +00:00
8db78c7d4d project: simplify if-statement
Change-Id: I05e4505b45963fe6e85cf74a669afafd00fc83c0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/340457
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Martin Geisler <mgeisler@google.com>
2022-07-11 17:58:06 +00:00
9fb64ae29c upload: add ‘--ignore-untracked-files’ option
This option will suppress the

    Uncommitted changes in ... (did you forget to amend?)

prompt when there are untracked (unknown) files in the working copy.
The prompt is still shown if tracked files are modified.

Change-Id: Ia3fcc82989b7fad09b69214eda31e2d0dfc14600
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/340456
Tested-by: Martin Geisler <mgeisler@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-07-11 17:57:43 +00:00
d47d9ff1cb man: regenerate
Change-Id: I3ca8ca8f502605b194ebe65b315eda08c51592a6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/340494
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-07-11 16:42:23 +00:00
68d69635c7 Fix Projects.shareable_dirs
If this tree is not using alternates for object sharing, then we need to
continue to call it a shared directory.

Bug: https://bugs.chromium.org/p/gerrit/issues/detail?id=15982
Test: manual
Change-Id: I1750f10b192504ac67f552222f8ddb9809d344fe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/338974
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-06-08 16:49:08 +00:00
ff6b1dae1e Only sync superproject if it will be used.
If the user says `--no-use-superproject`, then do not bother syncing the
superproject.

Also add/update docstrings and comments throughout.

Change-Id: I9cdad706130501bab9a22d3099a1dae605e9c194
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/338975
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-06-08 16:49:08 +00:00
bdcba7dc36 sync: add multi-manifest support
With this change, partial syncs (sync with a project list) are again
supported.

If the updated manifest includes new sub manifests, download them
inheriting options from the parent manifestProject.

Change-Id: Id952f85df2e26d34e38b251973be26434443ff56
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334819
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-05-26 00:03:37 +00:00
1d00a7e2ae project: initial separation of shared project objects
For now, this is opt-in via environment variables:
  - export REPO_USE_ALTERNATES=1

The shared project logic that shares the internal .git/objects/ dir
directly between multiple projects via the project-objects/ tree has
a lot of KI with random corruption.  It all boils down to projects
sharing objects/ but not refs/.  Git operations that use refs to see
what objects are reachable and discard the rest can easily discard
objects that are used by other projects.

Consider this project layout:
<show fs layout>

There are unique refs in each of these trees that are not visible in
the others.  This means it's not safe to run basic operations like
git prune or git gc.

Since we can't share refs (each project needs to have unique refs
like HEAD in order to function), let's change how we share objects.
The old way involved symlinking .git/objects/ to the project-objects
tree.  The new way shares objects using git's info/alternates.

This means project-objects/ will only contain objects that exist in
the remote project.  Local per-project objects (like when creating
branches and making changes) will never be shared.  When running a
prune or gc operation in the per-project state, it will only ever
repack or discard those per-project objects.  The common shared
objects would only be cleaned up when running a common operation
(i.e. by repo itself).

One downside to this for users is if they try blending unrelated
upstream projects.  For example, in CrOS we have multiple kernel
projects (for different versions) checked out.  If a dev fetched the
upstream Linus tree into one of them, the objects & tags would
not be shared with the others, so they would have to fetch the
upstream state for each project.  Annoying, but better than the
current corruption situation we're in now.

Also if the dev runs a manual `git fetch` in the per-project to
sync it up to newer state than the last `repo sync` they ran,
the objects would get duplicated.  However, git operations later
on should eventually dedupe this.
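
A simplified sketch of the alternates mechanism (the helper name and paths
are illustrative; the real logic lives in project.py):

    import os

    def share_objects_via_alternates(project_gitdir, shared_objdir):
        # git reads .git/objects/info/alternates: one object-store path per
        # line.  Objects found there are used but never written to.
        info_dir = os.path.join(project_gitdir, 'objects', 'info')
        os.makedirs(info_dir, exist_ok=True)
        with open(os.path.join(info_dir, 'alternates'), 'w') as f:
            f.write(os.path.abspath(shared_objdir) + '\n')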

Bug: https://crbug.com/gerrit/15553
Change-Id: I313a9b8962f9d439ef98ac0ed37ecfb9e0b3864e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328101
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-05-26 00:02:18 +00:00
3a0a145b0e upload: move label validation to core function
This way we know we don't need to encode the labels.

Change-Id: Ib83ed8f4ed05f00b9d2d06a9dd3f304e4443430e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/337518
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
2022-05-21 19:19:44 +00:00
74737da1ab tests: switch to tempfile.TemporaryDirectory
Now that we don't need to support Python 2, we can switch to this
API for better contextmanager logic.

Change-Id: I2d03e391121886547e7808a3b5c3b470c411533f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/337515
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-05-20 11:38:10 +00:00
0ddb677611 project: fix --use-superproject logic for init.
If init was run with --use-superproject, init failed.

If init was run without the --{no,}use-superproject option, then manifests
with <superproject/> elements were mishandled.

Bug: b/233226285
Test: manual
Change-Id: I737e71c89d2d7c324114f58bf2dc82b40e5beba7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/337534
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-05-20 11:01:28 +00:00
501733c2ab manifest: add submanifest.default_groups attribute
When the user does not specify any manifest groups, this allows the
parent manifest to indicate which manifest groups should be used for
syncing the submanifest.

Change-Id: I88806ed35013d13dd2ab3cd245fcd4f9061112c4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/335474
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-29 18:42:23 +00:00
0165e20fcc project: Do not exit early on --standalone-manifest.
After we successfully download the standalone manifest file, we cannot
exit early.

Bug: https://bugs.chromium.org/p/gerrit/issues/detail?id=15861
Change-Id: Ic47c9f7e9921851f94c6f24fd82b896eff524037
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/335974
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-29 17:13:49 +00:00
0de4fc3001 project: Add missing imports
Some imports were missed when moving manifestProject to project.py

Bug: https://bugs.chromium.org/p/gerrit/issues/detail?id=15861
Change-Id: Id8fffeaa3f88f344a13b5ab44e5403c7edd98f31
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/335554
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
2022-04-21 18:44:26 +00:00
4c11aebeb9 progress: optimize progress bar updates a bit
Rather than erase the entire line first then print out the new content,
print out the new content on top of the old and then erase anything we
didn't update.  This should result in a lot less flashing with faster
terminals.
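
Conceptually the redraw becomes something like this (a sketch, not the
actual progress.py code); '\x1b[K' is the ANSI "erase to end of line"
sequence:

    import sys

    CSI_ERASE_LINE_AFTER = '\x1b[K'  # erase from the cursor to end of line

    def redraw(message):
        # Print the new content over the old line, then clear whatever the
        # old (possibly longer) line left behind, instead of erasing first.
        sys.stderr.write('\r' + message + CSI_ERASE_LINE_AFTER)
        sys.stderr.flush()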

Bug: https://crbug.com/gerrit/11293
Change-Id: Ie2920b0bf3d5e6f920b8631a1c406444b23cd12d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/335214
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-04-19 23:50:48 +00:00
b90a422ab6 Override the manifest for the entire command
When a manifest file is overridden, remember that and keep using the
override for the remainder of the process.  If we need to revert it,
make the override name evaluate to False.

Change-Id: I1eee05fec6988c1ee4a3c751c4b540d5b5d11797
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/335136
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-19 21:28:20 +00:00
a46047a822 sync: refactor use of self.manifest
We need to iterate over multiple manifests, and generally use the
outer_client.manifest for multi-manifest support.  This refactors the
use of self.manifest into a chosen manifest.

Change-Id: I992f21d610c929675e99555ece9c38df4b635839
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334699
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-14 22:24:04 +00:00
5fa912b0d1 Stop passing optparse.Values to git_superproject
Make git_superproject independent of the command line by passing
the specific value instead of requiring the caller to have an
optparse.Values object to pass in.

Flag --use-superproject and --archive as incompatible in subcmds/init.py

Change-Id: Ied7c874b312e151038df903c8af4328f070f387c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/335135
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-14 22:23:16 +00:00
4ada043dc0 ManifestProject: add manifest_platform
And fix most of the other attributes to return the value instead of
None.

Change-Id: Iddcbbeb56238ee082bb1cae30adbd27a2f551f3d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/335134
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-04-14 20:56:45 +00:00
d8de29c447 forall: fix multi-manifest variables.
- REPO_PATH is relative to the root of the client. REPO_OUTERPATH is not
  needed.
- REPO_INNERPATH is relative to the sub manifest root.
- REPO_OUTERPATH is the path for the sub manifest root relative to the
  root of the client.

Change-Id: I031692891cfef2634d1358584d27a6a4df735c20
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334899
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-04-14 14:31:47 +00:00
2cc3ab7663 git_superproject: only print beta notice once.
This eliminates duplicate notices during multi-manifest syncs.

Change-Id: Idcb038ddeb363368637c58c11346ebf8fd2b27ac
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334939
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-04-14 00:07:25 +00:00
d56e2eb421 manifest_xml: use Superproject to hold XML content
Always create Superproject when there is a <superproject> tag, and have
it hold the XML content, similar to how other manifest elements are
handled.

This also adds SetQuiet and SetPrintMessages to Superproject
consistent with manifest.SetUseLocalManifests.

Change-Id: I522bf3da542006575799f0640c67f7052704f266
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334641
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-04-12 15:46:23 +00:00
d52ca421d5 sync: respect sync-c manifest option
The documentation states that a `sync-c` attribute in the manifest file
can set a default for whether only the current branch should be fetched
or all branches. This seems to have been broken for some time.

Commit 7356114 introduced the `--no-current-branch` CLI option and
relied on getting `None` via `optparse` if neither `--current-branch`
nor `--no-current-branch` was set to distinguish it from a boolean
value. If `None` was received, it would read the value from the manifest
option `sync-c`. The parsing went through the utility function
`_GetCurrentBranchOnly` which returned `True` if `--current-branch` had
been given on the command-line, or fell back on the "superproject"
setting, which would either return `True` or `None`. This would
incorrectly make `repo` fall back to the manifest setting even if the
user had given `--no-current-branch` if no superproject was requested --
the manifest became "too powerful":

Command-line         Using superproject  → `current_branch_only`
------------         ------------------  -----------------------
                     No                  From manifest
                     Yes                 True
--current-branch     No                  True
--current-branch     Yes                 True
--no-current-branch  No                  From manifest ← wrong
--no-current-branch  Yes                 True

In commit 0cb6e92 the superproject configuration value reading changed
from something that could return `None` to something that always
returned a boolean. If it returned `False`, this would then incorrectly
make `repo` ignore the manifest option even if neither
`--current-branch` nor `--no-current-branch` had been given. The
manifest default became useless:

Command-line         Using superproject  → `current_branch_only`
------------         ------------------  -----------------------
                     No                  False ← wrong
                     Yes                 True
--current-branch     No                  True
--current-branch     Yes                 True
--no-current-branch  No                  False
--no-current-branch  Yes                 True

By swapping the order in which the command-line option target and the
superproject setting is evaluated, things should work as documented:

Command-line         Using superproject  → `current_branch_only`
------------         ------------------  -----------------------
                     No                  From manifest
                     Yes                 True
--current-branch     No                  True
--current-branch     Yes                 True
--no-current-branch  No                  False
--no-current-branch  Yes                 True
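
A minimal sketch of the corrected evaluation order (function and argument
names are illustrative only):

    def current_branch_only(cli_value, use_superproject, manifest_sync_c):
        # An explicit --current-branch / --no-current-branch wins outright.
        if cli_value is not None:
            return cli_value
        # Superproject implies fetching only the current branch.
        if use_superproject:
            return True
        # Otherwise fall back to the manifest's sync-c default.
        return manifest_sync_c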

Change-Id: I933c232d2fbecc6b9bdc364ebac181798bce9175
Tested-by: Daniel Andersson <daniel.r.andersson@volvocars.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334270
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-08 21:06:37 +00:00
a2ff20dd20 manifest_xml: Add Load and Unload methods
- do not call the internal method from subcmds/sync.py.
- use the correct default groups for submanifests.
- only sync the superproject when we are told to.

Change-Id: I81e4025058f1ee564732b9e17aecc522f6b5f626
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334639
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-04-08 19:52:04 +00:00
55ee304304 Fix sub manifest handling
Also fixes some typos

Change-Id: Id2ba5834ba3a74ed3f29c36d2c0030737dc63e35
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334579
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-06 21:04:46 +00:00
409407a731 init: add multi-manifest support
This moves more of the manifest project handling into ManifestProject.

Change-Id: Iecdafbec18cccdfd8e625753c3bd1bcddf2b227f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334520
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-06 17:02:40 +00:00
d82be3e672 Move manifest config logic into ManifestProject
Use ManifestProject properties for config values.

Change-Id: Ib4ad90b0d9a089916e35615b8058942e6d01dc04
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334519
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-06 16:59:45 +00:00
9b03f15e8e project: add ManifestProject.Sync()
Move the logic to sync a ManifestProject out of subcmds/init.py

Change-Id: Ia9d00f3da1dc3c5dada84c4d19cf9802c2346cb0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334140
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-04-01 15:48:04 +00:00
9b72cf2ba5 project: Isolate ManifestProject from RepoProject
Create RepoProject and ManifestProject, inheriting from MetaProject,
  with methods separated for isolation and clarity.

Change-Id: Ic1d6efc65c99470290fea612e2abaf8670d199f4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/334139
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-03-31 21:02:52 +00:00
5d3291d818 manifest_file must be an absolute path
Correctly pass the full path of the manifest file for the submanifest.
The manifest-name in the <submanifest/> element was being passed in
as given, which caused it to not be found since the current directory
was never set. (b/226333721: fails when manifest-name is given.)

Also verify that the manifest_file passed to XmlManifest() is an
absolute path.

Bug: https://b.corp.google.com/issues/226333721
Change-Id: I23461078233e34562bc2eafeb732cfe8bd38ddc1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/333861
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-03-23 21:18:41 +00:00
244c9a71a6 trace: allow writing traces to a socket
Git can write trace2 events to a Unix domain socket [1]. This can be
specified via Git's `trace2.eventTarget` config option, which we read to
determine where to log our own trace2 events. Currently, if the Git
config specifies a socket as the trace2 target, we fail to log any
traces.

Fix this by adding support for writing to a Unix domain socket,
following the same specification that Git supports.

[1]: https://git-scm.com/docs/api-trace2#_enabling_a_target
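
A bare-bones sketch of what writing to such a target involves, assuming the
af_unix:[stream:|dgram:]<path> syntax described in the linked documentation:

    import json
    import socket

    def write_trace2_event(target, event):
        # target looks like 'af_unix:stream:/tmp/trace.sock'; stream is the
        # default when no socket type is given.
        spec = target[len('af_unix:'):]
        sock_type = socket.SOCK_STREAM
        if spec.startswith('stream:'):
            spec = spec[len('stream:'):]
        elif spec.startswith('dgram:'):
            spec = spec[len('dgram:'):]
            sock_type = socket.SOCK_DGRAM
        with socket.socket(socket.AF_UNIX, sock_type) as sock:
            sock.connect(spec)
            sock.sendall(json.dumps(event).encode('utf-8') + b'\n')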

Change-Id: I928bc22ba04fba603a9132eb055141845fa48ab2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/332339
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Josh Steadmon <steadmon@google.com>
2022-03-16 17:33:07 +00:00
b308db1e2a manifest_xml: group for submanifest projects
Add all projects in a submanifest to the group
submanifest::<path_prefix> for ease in filtering.

Change-Id: Ia6f01f9445f4f8d20fda3402f4d5821c43ceaf7f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/331319
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-02-28 20:08:58 +00:00
cc879a97c3 Add multi-manifest support with <submanifest> element
To be addressed in another change:
 - a partial `repo sync` (with a list of projects/paths to sync)
   requires `--this-tree-only`.

Change-Id: I6c7400bf001540e9d7694fa70934f8f204cb5f57
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/322657
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-02-17 21:57:55 +00:00
87cce68b28 Move local-manifest check to manifest_xml.py
This removes the need for git_superproject to include manifest_xml, and
puts the logic for local_manifest detection in one place.

Change-Id: I4d33ded0542ceea4606a1ea24304f678de20c59e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/330499
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
2022-02-15 22:15:42 +00:00
adaa1d8734 project.py: pass --recurse-submodules={value}
If submodules is False, explicitly pass '=no'.  Uninitialized submodules
may cause the default option to fail.
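
Roughly, the change amounts to always emitting an explicit value (sketch
only; the helper name is invented for illustration):

    def recurse_submodules_arg(submodules):
        # Passing '=no' explicitly avoids git trying to recurse into
        # submodules that were never initialized.
        return '--recurse-submodules=%s' % ('yes' if submodules else 'no')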

Change-Id: Ia00bcba5b69c4b65195f4c469c686a3ef9a4a3ad
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/330159
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: LaMont Jones <lamontjones@google.com>
2022-02-10 23:57:45 +00:00
8e91248655 project: mark gc.log as safe to discard when migrating .git/
This is just a log file that, while useful for humans when gc aborts,
doesn't contain any data, so it's safe to throw away.

Bug: https://crbug.com/gerrit/15619
Change-Id: Ia95e0e281f52260668f7a80b5d5f990e32a8597a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328999
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-01-26 16:46:03 +00:00
630876f9e4 init: add an option --enable-git-lfs-filter
It was reported that git-lfs did not work with git-repo. Specifically,
`git read-tree -u` run by `repo sync` would fail git-lfs's smudge
filter. See https://github.com/github/git-lfs/issues/1422.

In fact, by the time `git read-tree -u` is run, the repository is not
bare. It is just that the working directory is not the same as the
.git directory. git-lfs's filter should work. No one seems to have
delved into that issue.

Today, with newer versions of git-repo and git-lfs, that issue will
not reproduce. Tested with
- git 2.33, git-lfs 2.13 on macOS
- git 2.17, git-lfs 2.3 on Ubuntu

So, it seems fine to add an option --enable-git-lfs-filter, default to
false, and state that it may not work with older versions of git and
git-lfs in the help doc.

Bug: https://crbug.com/gerrit/14516
Change-Id: I8d21854eeeea541e072f63d6b10ad1253b1a9826
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328359
Tested-by: XD Trol <milestonejxd@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-01-26 01:47:20 +00:00
4aa8584ec6 init: make bad --repo-rev settings more clear
If the user passes a bad --repo-rev setting in a new checkout, add a
tip to the error message that their option is probably bad instead of
just saying "unable to resolve".

If the user has already initialized a checkout, we'd display a raw
traceback which would confuse them.  Swallow that and also include
the --repo-rev tip.

Bug: https://crbug.com/gerrit/15610
Change-Id: I5d72513c7b37bf9bb5d19862fcdfaf0d1f44e886
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328820
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-25 18:54:42 +00:00
b550501254 project: Ignore failure to remove the sample hooks
Removing the sample hooks is just cleanup, so if repo cannot remove a
sample hook, that should not cause it to fail.

Change-Id: I716b977da091c22b8f53e134f4fbc114116f9a65
Signed-off-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328635
Reviewed-by: Mike Frysinger <vapier@google.com>
2022-01-22 06:13:10 +00:00
a535ae4418 branches: Fix "not in" handling
If the branch is current, or present in less than half of the projects,
list which projects it is *in*.

Otherwise, correctly detect which projects (by relpath) it is not in.

Previously, the "not in" path would incorrectly list all projects.

Change-Id: Ia153856f577035a51f538b7bf5d3135b70c69d52
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328199
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2022-01-21 17:20:21 +00:00
67d6cdf2bc project: store objects in project-objects directly
In order to stop sharing objects/ directly between shared projects,
we have to fetch the remote objects into project-objects/ manually.
So instead of running git operations in the individual project dirs
and relying on .git/objects being symlinked to project-objects/,
tell git to store any objects it fetches in project-objects/.

We do this by leveraging the GIT_OBJECT_DIRECTORY override.  This
has been in git forever, or at least since v1.7.2 which is what we
already hard require.  This tells git to save new objects to the
specified path no matter where it's being run otherwise.

We still otherwise run git in the project-specific dir so that it
can find the right set of refs that it wants to compare against,
including local refs.  For that reason, we also have to leverage
GIT_ALTERNATE_OBJECT_DIRECTORIES to tell git where to find objects
that are not in the upstream remote.  This way git doesn't blow up
when it can't find objects only associated with local commits.

As it stands right now, the practical result is the same: since we
symlink the project objects/ dir to the project-objects/ tree, the
default objects dir, the one we set $GIT_OBJECT_DIRECTORY to, and
the one we set $GIT_ALTERNATE_OBJECT_DIRECTORIES to are actually
all the same.  So this commit by itself should be safe.  But in a
follow up commit, we can replace the symlink with a separate dir
and git will keep working.
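
A hedged sketch of the environment overrides described above (the helper
and its arguments are invented for illustration):

    import os
    import subprocess

    def fetch_with_shared_objects(project_gitdir, shared_objdir, remote, refspec):
        env = os.environ.copy()
        # Newly fetched objects land in the shared project-objects/ tree.
        env['GIT_OBJECT_DIRECTORY'] = shared_objdir
        # Local-only objects (branches, edits) stay visible during the fetch.
        env['GIT_ALTERNATE_OBJECT_DIRECTORIES'] = os.path.join(
            project_gitdir, 'objects')
        subprocess.run(
            ['git', '--git-dir', project_gitdir, 'fetch', remote, refspec],
            env=env, check=True)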

Bug: https://crbug.com/gerrit/15553
Change-Id: Ie4e654aec3e1ee307eee925a54908a2db6a5869f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328100
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-19 17:24:51 +00:00
152032cca2 project: move --reference handling to project-objects
When using --reference, the path is written to objects/info/alternates.
The path is accessed inconsistently -- sometimes through projects/ (via
self.gitdir) and sometimes through project-objects/ (via self.objdir).
This works because projects/.../objects is a symlink to the objects dir
under project-objects/.  Change all accesses to go through self.objdir.
This will allow us to stop symlinking projects/.../objects without the
reference dir logic breaking.  The projects/ path is going to use its
alternates file for its own needs.

Bug: https://crbug.com/gerrit/15553
Change-Id: I6b452ad1aaffec74ecb7ac1bb9baa3a3a52e076c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/328099
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jack Neus <jackneus@google.com>
2022-01-13 18:27:28 +00:00
a3ac816278 test_project: use os.readlink instead of Path.readlink
Path.readlink is only available on Python 3.9, breaking compatibility
with all python versions below. os.readlink is already used in other
places of this file, so use it here as well.

Change-Id: I5acf8f5334a3e7c8de9cea1939d7e2b9af5f30ae
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327844
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Sebastian Wagner <sebix@sebix.at>
2022-01-11 20:16:42 +00:00
98bb76577d project: prune sample hooks
These hooks are never used and often get stale, so just trim them.
Users rarely look in these dirs to begin with.

Change-Id: Ic785aa55fb7ec84a61376df101127d0018882030
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327538
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-10 17:41:45 +00:00
d33dce0b77 project: drop support for symlinking internal .git files
Since we don't do this anymore, and there probably won't be a need to
bring it back, drop support for it.

Bug: https://crbug.com/gerrit/15460
Change-Id: I7d86706f108c797a5c7962cb1578693d49430367
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327537
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-10 17:41:40 +00:00
89ed8acdbe project: abort a bit earlier before migrating .git/
Verify all the .git/ paths will be handled by the migration logic before
starting the migration.  This way we still abort & log an error, but the
user gets to see it before we put the tree into a state that they have to
manually recover.  Also add a few more known-safe-to-clobber paths.

Bug: https://crbug.com/gerrit/15273
Change-Id: If49d69b341bc960ddcafa30da333fb5ec7145b51
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327557
Reviewed-by: Colin Cross <ccross@android.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-07 20:17:14 +00:00
71e48b7672 Revert "sync: dropped "NOTICE: --use-superproject is in beta ..." message."
This reverts commit d53cb9549a. As long as
repo's reference docs treat this feature as a work in progress and don't
cover it well enough to allow all repo maintainers to easily support it,
it is inconsistent to report to users that it is no longer in beta.
Thanks to vapier@google.com for noticing.

https://crbug.com/gerrit/15527 tracks the required documentation changes
before we'd be ready to roll forward again.

Change-Id: Ic9bd951cfb3c1abf6e1bfa30dfe4afa1c9b7bec6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327337
Reviewed-by: Jonathan Nieder <jrn@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jonathan Nieder <jrn@google.com>
2022-01-06 20:01:03 +00:00
13576a8caf project: stop symlinking info dir under .git/
Unsharing this directory shouldn't be a problem.  The current repo code
treated it as a file, and while that's actually incorrect, files & dirs
are basically treated the same, so it's practically the same.

Let's enumerate each subpath since there aren't that many.

info/refs:
Only used when the project is exported over git dumb transports (i.e.
a http:// server).  Repo never does this, and it's extremely unlikely
any user has ever done this.  Plus, this proposal talks about unsharing
project refs, so this file should get unshared too.

info/grafts:
A user-configurable file that repo never touches.  Might be useful to
share across projects, but probably rarely (if ever) used by developers,
and forcing them to configure it for each project isn't that big of a
deal.

info/exclude:
info/attributes:
User-configurable files that repo never touches.  Doesn't seem like
most users ever touch these, and if they do, having them do it for
each shared project isn't a big deal.

info/sparse-checkout:
Repo doesn't use sparse checkouts, and it's extremely unlikely to even
work if a user tried doing something themselves.

Bug: https://crbug.com/gerrit/15460
Change-Id: I53e44d73a6d7a92da615b46600d8ea51cb46e3ac
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327519
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-06 08:31:45 +00:00
2345906d04 project: stop symlinking description file under .git/
Nothing uses this path.  It’s only for exporting git dirs e.g. for
online gitweb use which probably no one does.  It is not the same
description file as exists on servers we cloned from.  Leaving it
as the default plain text file will simplify code.

We don't undo any existing symlinks if they exist since repo does
not care about them, and their existence doesn't hurt.

Bug: https://crbug.com/gerrit/15460
Change-Id: Ic34fe7c3cfb8f6da844de5be30158f59382b1cc8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327518
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-06 08:29:06 +00:00
41289c62b4 project: stop symlinking svn under .git/
This path only matters to users of `git svn` who manually run it in
local projects after they get a full repo client checkout.  With svn
usage falling in general, and with the fact that the source checkout
now symlinks its .git/ state to the internal projects/ path, we don't
need to manage this anymore.

It means the path won't be shared among multiple local projects that
have the same remote, but so it goes.  It was an optimization only,
not functionality required for correctness.  We want to simplify the
internals to stop messing with git state, and this particular path
doesn't seem worth the effort to maintain.

We don't undo any existing svn symlinks if they exist since repo does
not care about them, and their existence doesn't hurt anything.

Bug: https://crbug.com/gerrit/15460
Change-Id: Ie8496b275bcc589771aa9f4ee874ed2ee6d5241d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327517
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2022-01-06 08:28:37 +00:00
c72bd8486a project: clean up now unused code
Now that we symlink worktree .git/ paths to .repo/projects/, we never
set share_refs=True anywhere, which means all of this logic is dead
code.  Throw it all away.  Do it as a separate commit to make the
parent commit easier to review.

Bug: https://crbug.com/gerrit/15273
Change-Id: If496d39029d3d3bd523ba24c603ce47a63ad9b51
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/326817
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jack Neus <jackneus@google.com>
2022-01-06 04:08:05 +00:00
d53cb9549a sync: dropped "NOTICE: --use-superproject is in beta ..." message.
Tested the code with the following commands.

$ ./run_tests -v

Bug: [google internal] b/209511230
Change-Id: Ia3c6de47709f5276e324a5bb608383aba3b2c562
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/327197
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-12-29 19:07:08 +00:00
cf0ba48649 sync: With --mirror option, don't display no-use-superproject... message.
+ Display 'Defaulting to no-use-superproject because there is no working tree.'
  message if the --use-superproject option is used and we are not using
  superproject because the manifest is either a mirror or an archive.

Tested the code with the following commands.

$ ./run_tests -v

Tested the sync code by using repo_dev alias and pointing to this CL.

$ repo init -u https://android.googlesource.com/mirror/manifest --mirror

$ repo_dev sync
Receiving objects: 100% (3/3), done.eiving objects:  33% (1/3)

$ repo_dev sync --use-superproject
Defaulting to no-use-superproject because there is no working tree.
Fetching:  0% (0/2158) warming up

Bug: https://crbug.com/gerrit/15368
Change-Id: I16b87ee9623315dbc3100b612b1decdaab7ac1dc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/325797
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-12-07 16:46:41 +00:00
2a089cfee4 project: migrate worktree .git/ dirs to symlinks
Historically we created a .git/ subdir in each source checkout and
symlinked individual files to the .repo/projects/ paths.  This layer
of indirection isn't actually needed: the .repo/projects/ paths are
guaranteed to only ever have a 1-to-1 mapping with the actual git
checkout.  So we don't need to worry about having files in .git/ be
isolated.

To that end, change how we manage the actual project checkouts from
a dir full of symlinks (and a few files) to a symlink to the internal
.repo/projects/ dir.  This makes the code simpler & faster.

The directory structure we have today is:
.repo/
  project-objects/chromiumos/third_party/kernel.git/
    <paths omitted as not relevant to this change>
  projects/src/third_party/kernel/
    v3.8.git/
      config
      description   -> …/project-objects/…/config
      FETCH_HEAD
      HEAD
      hooks/        -> …/project-objects/…/hooks/
      info/         -> …/project-objects/…/info/
      logs/
      objects/      -> …/project-objects/…/objects/
      packed-refs
      refs/
      rr-cache/     -> …/project-objects/…/rr-cache/
src/third_party/kernel/
  v3.8/
    .git/
      config        -> …/projects/…/v3.8.git/config
      description   -> …/project-objects/…/v3.8.git/description
      HEAD
      hooks/        -> …/project-objects/…/v3.8.git/hooks/
      index
      info/         -> …/project-objects/…/v3.8.git/info/
      logs/         -> …/projects/…/v3.8.git/logs/
      objects/      -> …/project-objects/…/v3.8.git/objects/
      packed-refs   -> …/projects/…/v3.8.git/packed-refs
      refs/         -> …/projects/…/v3.8.git/refs/
      rr-cache/     -> …/project-objects/…/v3.8.git/rr-cache/

The directory structure we have after this commit:
.repo/
  <nothing changes>
src/third_party/kernel/
  v3.8/
    .git            -> …/projects/…/v3.8.git
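
For illustration only, the migration boils down to something like this (the
real code validates every entry in .git/ before deleting anything):

    import os
    import shutil

    def migrate_worktree_gitdir(worktree, repo_gitdir):
        dotgit = os.path.join(worktree, '.git')
        if os.path.islink(dotgit):
            return  # already a symlink into .repo/projects/
        # The real migration first checks that everything under .git/ is
        # either a symlink into .repo/ or known-safe to discard.
        shutil.rmtree(dotgit)
        os.symlink(os.path.relpath(repo_gitdir, worktree), dotgit)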

Bug: https://crbug.com/gerrit/15273
Change-Id: I9dd8def23fbfb2f4cb209a93f8b1b2b24002a444
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323695
Reviewed-by: Mike Nichols <mikenichols@google.com>
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-12-01 15:27:16 +00:00
4a478edb44 init, sync: fixed flake8 warnings.
Tested:
+ run_tests
+ flake8 subcmds/init.py
+ flake8 subcmds/sync.py

Change-Id: Ie337481d8a210bfc49b0745f75c05a308a0e74d3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/324155
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-11-18 16:22:40 +00:00
6bd89aa657 superproject: Inherit --no-use-superproject with --mirror option.
init.py
+ Similar to opt.archive, gave an error if --mirror option is
  used with --use-superproject.

sync.py
+ Defaulted to --no-use-superproject if manifest is a mirror or
  archive (similar to error at line# 1067).

Tested:
+ run_tests
+ flake8 (will fix known errors in another CL).

$ repo_dev init -u sso://googleplex-android.git.corp.google.com/platform/manifest --use-superproject --mirror
Usage: repo init [options] [manifest url]

main.py: error: --mirror and --use-superproject cannot be used together.

+ repo init and repo sync with --mirror and without --mirror
  options.
  $ repo_dev init -u https://android.googlesource.com/platform/manifest
  $ repo_dev sync
    ...superproject.git: Initial setup for superproject completed.

+ With --mirror option, verified there are no exceptions in git_superproject.py

Bug: [google internal] b/206537893
Change-Id: I059f20e76f0ab36f0587f29779bb53ede4663bd4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323955
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-11-18 01:27:41 +00:00
9c1fc5bc5d sync: Handle tag ref in "upstream" field
repo sync only handles a git tag properly when it is in the "revision"
field. However, "revision locked manifests" (`repo manifest
--revision-as-HEAD`) specifies the tag in the "upstream" field. The
issue is that this tag is not fetched. Only the commit that the tag
points to is fetched. This causes issues as
self._CheckForImmutableRevision() runs and comes to the conclusion that
the tag was changed while in fact it was just not fetched. This causes
a full sync.

File docs/manifest-format.md, section Element-project:
> Attribute upstream: Name of the Git ref in which a sha1 can be found.
Used when syncing a revision locked manifest in -c mode to avoid having
to sync the entire ref space. Project elements not setting their own
upstream will inherit this value.

Change-Id: I0507d3a5f30aee8920a9f820bafedb48dd5db554
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323620
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Robin Schneider <ypid@riseup.net>
2021-11-16 20:43:57 +00:00
333c0a499b project: init hooks in objdir only
objdir is the .repo/project-objects/ dir based on the remote path.
gitdir is the .repo/projects/ dir based on the local source checkout
path.  When we setup the gitdir, we symlink "hooks" to the one in the
objdir.  But when we go to initialize the hooks, we do it via gitdir.
There is a 1-to-many mapping from project-objects to projects, so
initializing via gitdir can be repetitive.  Collapse the hook init
logic to the objdir init path.

Change-Id: I828fca60ce6e125d6706c709cdb2797faa40aa50
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323815
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-11-15 19:50:18 +00:00
fdeb20f43f sync: link the internal-fs-layout doc into checkouts
This should make it easy to discover for people poking around .repo/.

Change-Id: Ie5051551f25127c0592df5e36efba7bb2263e5d4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323701
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-11-15 01:39:53 +00:00
bf40957b38 git-review: add config file
This is used by the `git review` tool that some people use.

Change-Id: I8dac4e1dad155109a05181deaec61e1a74857b1f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323698
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-11-15 01:39:36 +00:00
f9e81c922d SUBMITTING_PATCHES: link to commit message style docs
Change-Id: I2090ebc43fc1c816b941a53dd89dbedf7bc61289
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323696
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jack Neus <jackneus@google.com>
2021-11-15 01:39:16 +00:00
e6601067ed man: refresh pages
Change-Id: I3f2c3ad77c16a76276bba2954887ab9e7605661c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323516
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-11-12 17:30:45 +00:00
3001d6a426 help: fix grammar in help text
Bug: https://crbug.com/gerrit/14838
Change-Id: Ic5000921ba9a1baa086153630ebbb429e3d17642
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323515
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-11-12 17:30:32 +00:00
00c5ea3787 Fix typo for ValueError
which will cause an error log like the one below:
NameError: name 'ValueErrorl' is not defined

Change-Id: I388886b7cf6d700e224c3847b7ba4ba4fe9c041d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/323015
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: 彭杨益 <pyy101727@gmail.com>
2021-11-07 02:32:08 +00:00
0531a623e1 sync: make --prune the default
If a remote deletes a ref, and it points to an object that doesn't
exist locally, we can get into a bad state, and the only way for the
user to recover is to run `repo sync --prune` (and to know that is
the option they need).  The error message is not helpful:

fatal: bad object refs/remotes/cros/firmware-zork-13421.B-master
error: https://chromium.googlesource.com/chromiumos/platform/ec did not send all necessary objects

This situation can also come up when the remote renames refs in a
UNIX FS incompatible way.  For example, replacing refs/heads/foo
with refs/heads/foo/bar.

Also add a --no-prune option for users to disable the behavior.
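
In effect the fetch command construction becomes (sketch only; the real
option plumbing is more involved):

    def fetch_cmd(prune=True):
        # --prune is now on by default; `repo sync --no-prune` turns it off.
        cmd = ['git', 'fetch']
        if prune:
            cmd.append('--prune')
        return cmd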

Bug: https://issuetracker.google.com/203366450
Change-Id: Icf45d838a10938feb091d29800f7e49240830ec3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/322956
Reviewed-by: Andrew Lamb <andrewlamb@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-11-05 20:13:30 +00:00
2273f46cb3 sync: fix --tags option
This has been broken since it was added where --tags was actually
the same as --no-tags.  Oddly, it was copied from init where the
logic is correct.

Bug: https://crbug.com/gerrit/12401
Change-Id: I15b89da1a655176a11bebc22573b25c728055328
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/322955
Reviewed-by: Andrew Lamb <andrewlamb@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-11-05 20:13:02 +00:00
7b9b251a5e project: fix format string in error message
BUG=None

Change-Id: I0b195fd919c6db8cb3547e8d6f4c733f2bd4a535
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/322735
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2021-11-05 16:59:14 +00:00
6251729cb4 superproject: added 'implies -c' in the help of --use-superproject option.
sync.py: deleted unused import errno.

Tested:
$ ./run_tests
$ flake8 repo subcmds/sync.py

Bug: https://crbug.com/gerrit/15208
Change-Id: I2bb3098f5602ded3861e000100766041ad93b53d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/322555
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-11-01 22:15:53 +00:00
11b30b91df Support more url schemes for getting standalone manifest
urllib.request.urlopen also supports file URLs, so call it unless the
scheme is 'gs'.  This adds http, https, and ftp support.
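
Something along these lines (a sketch; the real code paths and gsutil
invocation may differ):

    import subprocess
    import urllib.parse
    import urllib.request

    def fetch_standalone_manifest(url):
        scheme = urllib.parse.urlsplit(url).scheme
        if scheme == 'gs':
            # Google Cloud Storage still goes through gsutil.
            return subprocess.run(['gsutil', 'cat', url], check=True,
                                  capture_output=True, text=True).stdout
        # urlopen handles http, https, ftp, and file URLs.
        with urllib.request.urlopen(url) as f:
            return f.read().decode('utf-8')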

Change-Id: I3f215c3ebd8e6dee29ba14c7e79ed99d37287109
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/322095
Reviewed-by: Michael Kelly <mkelly@arista.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Matt Story <mstory@arista.com>
2021-10-27 13:20:35 +00:00
198838599c fetch: Fix stderr handling for gsutil
Previously gsutil stderr was getting piped into stdout, which
yields bad results if there are non-fatal warnings in stderr.

Additionally, we should fail outright if gsutil fails (by adding
`check = True`) rather than fail later on when we try to sync to
a manifest that is in fact just a stderr dump.
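
That is, roughly (sketch; not the exact fetch helper):

    import subprocess

    def gsutil_cat(url):
        # Keep stderr out of the captured manifest and fail fast if gsutil
        # itself exits non-zero.
        result = subprocess.run(['gsutil', 'cat', url],
                                stdout=subprocess.PIPE,
                                stderr=subprocess.PIPE,
                                check=True, text=True)
        return result.stdout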

BUG=none
TEST=manual runs with bad gs urls

Change-Id: Id71791d0c3f180bd0601ef2c783a8e8e4afa8f59
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/321935
Tested-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-10-26 22:18:28 +00:00
282d0cae89 ssh: handle FileNotFoundError errors
If ssh isn't installed, it throws a distinct error we have to catch.

Bug: https://crbug.com/gerrit/15196
Change-Id: I0660e842c304ce7575f5cb100894d05fd65f9454
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/322055
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-10-26 16:18:45 +00:00
03ff276cd7 sync: properly handle standalone manifests for sync command
sync should not attempt to sync the manifest project if it was
created from a standalone manifest. The current workaround is to
run sync with --nmu.

BUG=none
TEST=manual runs

Change-Id: I2e121af0badf9642143e77c7af89d1c2d993b0f3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/321195
Tested-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-10-15 17:20:00 +00:00
4ee4a45d03 subcmds/sync: Use pack-refs instead of gc for redundant gitdirs.
Previously `git gc` was being run on every gitdir even when they shared
the same objects. Instead only call it once and use pack-refs for the
gitdirs that were not gc'ed.
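
A rough sketch of the idea (the project.objdir/project.gitdir attributes
are assumed here for illustration):

    import subprocess

    def post_sync_gc(projects):
        seen_objdirs = set()
        for project in projects:
            if project.objdir not in seen_objdirs:
                # First project using this object store: do the full gc.
                seen_objdirs.add(project.objdir)
                subprocess.run(['git', '--git-dir', project.gitdir,
                                'gc', '--auto'], check=True)
            else:
                # Shared objects were already collected; just pack the refs.
                subprocess.run(['git', '--git-dir', project.gitdir,
                                'pack-refs', '--all', '--prune'], check=True)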

Bug: https://crbug.com/gerrit/15113
Test: repo sync -j # and check that git pack-refs is called
Change-Id: Icff37ab3ec78cfb44391d8cc7f2d875991532320
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/320275
Tested-by: Allen Webb <allenwebb@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-10-14 12:27:12 +00:00
0f6f16ed17 repo: more arg checking for --standalone-manifest re-inits
`repo init` doesn't do anything on re-init when the checkout has
been initialized using --standalone-manifest. Rather than let the
tool run through its existing flows (which happen to noop), check
the args and explicitly quit if a bare `repo init` is run on a
standalone checkout.

BUG=none
TEST=manual tests

Change-Id: Ie4346ef6df1282ec3e3f8045a08138c93653fece
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/320735
Tested-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-10-11 18:58:11 +00:00
76491590b8 repo: fix bug with --standalone-manifest
We were accidentally always setting manifest.standalone in config,
which was messing up behavior for standard use cases.

BUG=gerrit:15160
TEST=manual runs

Change-Id: Ic80f084ae97de5721aced3bb52d3ea9115f8d833
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/320715
Tested-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-10-11 18:57:57 +00:00
6a74c91f50 sign-launcher: make the help text more automatic
Rather than display "3.0" all the time and confuse people, extract
the version from the launcher we're signing and display that.

Also reformat the text to follow our current practice: upload the
versioned launcher by itself first, and then later copy that over
the default.

And while we're here, add tips for rollbacks.

Change-Id: I1654425c88e5c67d78879f2f33ad685c59be14dc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319637
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-10-06 17:02:56 +00:00
669efd0fd7 subcmds/sync: Disable autoDetach for git gc.
gc.autoDetach is enabled by default which makes 'git gc --auto' return
immediately and run in background. This can lead to a pile up of
operations all using large amounts of memory at the same time. To avoid
this set gc.autoDetach to false so that the garbage collect task waits
for instances to finish before spawning more.
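
The gist, as a sketch, is to pin the config when invoking gc:

    import subprocess

    def run_gc(gitdir):
        # With gc.autoDetach=false, `git gc --auto` stays in the foreground,
        # so repo's own job pool bounds how many collections run at once.
        subprocess.run(['git', '--git-dir', gitdir,
                        '-c', 'gc.autoDetach=false', 'gc', '--auto'],
                       check=True)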

Bug: https://crbug.com/gerrit/15113
Test: repo sync -j # and check the number of 'git gc' processes
Change-Id: Ic0815156ba3db03972968f33f6f9f51e4928f23b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319835
Tested-by: Allen Webb <allenwebb@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-10-05 14:12:01 +00:00
a0f6006ae7 git_config: Fixed test.gitconfig getting updated when running tests.
Moved test_GetSyncAnalysisStateData to GitConfigReadWriteTests class.

Deleted [repo "syncstate*..] data from tests/fixtures/test.gitconfig.

Tested:
./run_tests
...
tests/test_git_config.py::GitConfigReadWriteTests::test_GetSyncAnalysisStateData PASSED [ 84%]
...

Bug: https://crbug.com/gerrit/15103
Change-Id: I8cb89ce10b025994a045106c9c66dd243ae8ba50
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319557
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-30 21:45:09 +00:00
2ddbf8a8bf Merge "Merge history of v2.14.5." into main 2021-09-30 21:37:09 +00:00
445723fd37 Merge history of v2.14.5.
The v2.14.[345] releases were cut on a branch based on v2.14.2.  We
had some regression fixes we wanted in v2.14, but too many risky
changes landed in main since to cut another v2.14.x directly, and
we didn't want to destabilize even more by pushing a v2.15 right
away.  So we branched to keep things healthy.

But people with old checkouts trying to upgrade from those versions
run into an old repo bug where it only selfupdates with fast-forwards,
and repo can't fast-forward from those divergent histories.  So let's
do a merge commit to stitch the history back together.

There's no actual changes in here.

Change-Id: I05a96048e3846321e57c5f5224fb8dcf3c191d35
2021-09-30 21:36:56 +00:00
436bde5137 Merge history of v1.13.11.
For older versions of repo, this would make it easier for it to perform
a self update by making it a fast-forward from the following tags:

  v1.13.9.2, v1.13.9.3, v1.13.9.4, v1.13.10, v1.13.11

Change-Id: Ia75776312eaf802a150db8bd7c0a6dce57914580
2021-09-30 11:49:08 -07:00
4f88206178 trace2_event: Add remove_prefix to fix failing tests on Linux & macOS.
removeprefix is available in Python 3.9. Mac and Linux are running
a version below 3.9, so tests are failing with the following error:
  "AttributeError: 'str' object has no attribute 'removeprefix' "

Replaced removeprefix with a custom function, which we will delete
once Linux and macOS versions are updated.
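
The stand-in is essentially (sketch):

    def remove_prefix(value, prefix):
        # str.removeprefix() equivalent for Python < 3.9.
        return value[len(prefix):] if value.startswith(prefix) else value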

Tested:
$ ./run_tests

Bug: [google internal] b/201453085
Change-Id: I9b4d564ff1176e1b4471805ef05472c1914cd9f9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319375
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-29 21:24:59 +00:00
f88282ccc2 git_config: update error handling with no config file
Now that _do throws an exception when `git` fails, update the logic
that tries to read config files but the file doesn't exist.

Bug: b/192664812
Change-Id: I6417ecd70891b8f2d5f2bdb819f91df69ac4b70c
Test: `repo upload` no longer crashes when .repo/config doesn't exist
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319295
Reviewed-by: Jack Neus <jackneus@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-09-29 01:02:47 +00:00
8967a5aec6 launcher: bump version for new release
Change-Id: I9812185c9dfc11289547f5956c0cbe567d720f7f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319335
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-09-28 20:54:56 +00:00
2f3c3316e4 Update revisionId if required when using extend-project
When a hard revision ID is provided in a regular project tag, the
revisionId is updated as well if it is a commit hash.  The difference
is that if the revisionExpr is a commit, git-repo needs to update
refs/remotes/m/master with update-ref not symbolic-ref, as the latter
must refer to another ref, not to a specific commit.

Change-Id: I215a62dabb30225e480ad2c731416d775fc0c750
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/310963
Tested-by: Michael Kelly <mkelly@arista.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-28 20:12:00 +00:00
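The distinction between the two git commands can be sketched like this; set_manifest_ref and its arguments are illustrative, not repo's actual helper:

```python
import re
import subprocess


def set_manifest_ref(gitdir, ref, revision):
    """Point |ref| (e.g. refs/remotes/m/master) at |revision|.

    A raw commit hash needs `git update-ref`; a branch name needs
    `git symbolic-ref`, which may only point at another ref.
    """
    if re.fullmatch(r"[0-9a-f]{40}", revision):
        subprocess.run(["git", "update-ref", ref, revision],
                       cwd=gitdir, check=True)
    else:
        target = revision if revision.startswith("refs/") \
            else "refs/heads/" + revision
        subprocess.run(["git", "symbolic-ref", ref, target],
                       cwd=gitdir, check=True)
```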
37c21c268b Add 'dest-path' to extend-project to support changing path
This allows us to move the repository to a new location in the source
tree without having to remove-project + add a new project tag.

Change-Id: I4dba6151842e57f6f2b8fe60cda260ecea68b7b4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/310962
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Michael Kelly <mkelly@arista.com>
2021-09-28 20:12:00 +00:00
b12c369e0b superproject: Only trigger enrollment logic when manifest has it.
The current code would check enrollment status when the user did not
explicitly specify --[no-]use-superproject and did not have a remembered
value in their repo client. However, because superproject only makes
sense for manifests that specify one, we should skip the enrollment
logic when the manifest does not.

Address this by checking manifest.superproject prior to proceeding. This
avoids showing the superproject enrollment greeting message, which can be
confusing for developers.

Tested:
  For manifest without superproject:
   - repo sync --use-superproject will still show message for
     superproject;
   - repo sync will not show message regardless of enrollment state
  For manifest with superproject:
   - repo sync will show message and perform enrollment if not
     previously enrolled

Bug: https://crbug.com/gerrit/15039
Change-Id: Ic2be9f9d037f0e7cf3446da474a5a0d0e4bd88da
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319255
Tested-by: Xin Li <delphij@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-28 19:42:01 +00:00
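The decision described above boils down to something like the sketch below; the attribute and argument names are assumptions for illustration, not repo's exact API:

```python
def should_prompt_for_superproject(opt, manifest, remembered_choice):
    """Return True only when the enrollment prompt is actually useful."""
    if opt.use_superproject is not None:
        # User passed --use-superproject or --no-use-superproject explicitly.
        return False
    if remembered_choice is not None:
        # An unexpired choice is already recorded in the repo client.
        return False
    # Only manifests that define a <superproject> trigger enrollment.
    return manifest.superproject is not None
```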
bbe8836494 superproject: Log syncstate's parameter as data-json if it is an array.
All the values of syncstate are strings; check the first and last bytes
to see if the value is an array. For syncstate data, there were no false
positives.

Tested:
$ repo_dev sync

Verified event logged for argv is "data-json".

$./run_tests

Bug: [google internal] b/201102002
Change-Id: Id56adb532b80267f08d09147ac663cdd5987ce87
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319075
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-09-28 18:22:49 +00:00
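A rough sketch of the classification, assuming a helper named config_value_kind; repo's event-log code is structured differently but makes the same first/last-byte check:

```python
def config_value_kind(value):
    """Classify a syncstate config value as 'data' or 'data-json'.

    All syncstate values are strings; if the first and last characters look
    like a JSON array, log the value as a 'data-json' event so it is
    recorded as an array rather than an encoded string.
    """
    if value.startswith("[") and value.endswith("]"):
        return "data-json"
    return "data"


assert config_value_kind('["sync", "-j", "20"]') == "data-json"
assert config_value_kind("2021-09-14T18:16:19.935979Z") == "data"
```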
9d96f58f5f make file removal a bit more robust
Some of the file removal calls are subject to race conditions (if
something else deletes the file), so extend our remove API to have
an option to ignore ENOENT errors.  Then update a number of existing
call sites to use this new functionality.

Change-Id: I31a9090e135452033135337a202a4fc2dbf8b63c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319195
Reviewed-by: Sean McAllister <smcallis@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-09-28 16:06:50 +00:00
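A minimal sketch of the extended removal API; repo's real helper lives in platform_utils and also handles Windows-specific cases:

```python
import errno
import os


def remove(path, missing_ok=False):
    """Delete |path|, optionally tolerating a concurrent deletion."""
    try:
        os.remove(path)
    except OSError as e:
        if missing_ok and e.errno == errno.ENOENT:
            return
        raise
```

Call sites that race with other cleanup can then simply pass missing_ok=True.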
7a1e7e772f repo: add support for reading standalone manifests from disk
BUG=b:192664812
TEST=existing tests (no coverage), manual runs

Change-Id: Ic032417ecfca77d5e0de1b1ff62b30ce8205bfc5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/318715
Tested-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-28 16:03:21 +00:00
c474c9cba1 repo: Add support for standalone manifests
Added --standalone_manifest to repo tool. If set, the
manifest is downloaded directly from the appropriate source
(currently, we only support GS) and used instead of creating
a manifest git checkout. The manifests.git repo is still created to
keep track of various config but is marked as being for a standalone
manifest so that the repo tool doesn't try to run networked git
commands in it.

BUG=b:192664812
TEST=existing tests (no coverage), manual runs

Change-Id: I84378cbc7f8e515eabeccdde9665efc8cd2a9d21
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/312942
Tested-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-28 15:40:46 +00:00
956f7363d1 superproject: Log argv parameter of syncstate as 'data-json'.
Fixed: "we need to make a special case for logging the argv; it
should probably be a "data-json" event so that we log this directly as
an array rather than an encoded string.

Tested:
$ repo_dev sync

Verified event logged for argv is "data-json".

$./run_tests

Bug: [google internal] b/201102002
Change-Id: I18ccec79c73c8dc931cb8afc472b2361db8aea4c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/319055
Reviewed-by: Josh Steadmon <steadmon@google.com>
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-09-27 19:11:14 +00:00
6f8c1bf4ff Fix indent error which would have prevented choice expiration from working.
Change-Id: I077e05eea23ad58d1dde2c9fe5608660a56d03e5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/318815
Tested-by: Xin Li <delphij@google.com>
Reviewed-by: Amith Dsouza <amithds@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
2021-09-27 18:59:21 +00:00
e0b16a22a0 superproject: support a new revision attribute.
Tested:
$ ./run_tests

Verified that a manifest that specifies a superproject revision will use
the specified revision, and that superproject otherwise uses the default revision.

Note that this is a slight behavior change from earlier repo versions,
which would always use the branch name of the manifest itself. However,
the new behavior is more consistent with the regular "project"
element and allows superproject to be used even if it is not enabled
for the particular manifest branch, so we have decided to make the
change as it provides more flexibility and better matches what
other elements do.

Bug: [google internal] b/187868160
Change-Id: I35255ee347aff6e65179f7879d52931f168b477e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/317643
Tested-by: Xin Li <delphij@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-27 06:36:05 +00:00
d669d2dee5 release-process: update distro baseline & add OpenSSH
Stop tracking Ubuntu Trusty & Xenial and Debian Jessie & Stretch
as they only had Python 3.5 available which we've dropped.

Backfill OpenSSH versions since we've started testing for it.

Change-Id: I03183ed97f6e43dce8a00e36cce2956544a26afc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/318835
Reviewed-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-09-24 16:42:56 +00:00
366824937c platform_utils: os.rename exception when src and dst are on different file systems
Symptom: repo sync exception
Root Cause: os.rename only works when source and destination are on the same file system
Solution: use shutil.move

To save disk space, I created symlinks for projects and project-objects, pointing to a folder on another disk:
lrwxrwxrwx  1 owenwen owenwen   47 Jun  9 16:40 project-objects -> /disk3/AndroidLocalRepos/.repo/project-objects/
lrwxrwxrwx  1 owenwen owenwen   40 Jun  9 16:40 projects -> /disk3/AndroidLocalRepos/.repo/projects/

Below is the exception I encountered:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "/disk2/Android11/.repo/repo/subcmds/sync.py", line 550, in _CheckoutOne
    project.Sync_LocalHalf(syncbuf, force_sync=force_sync)
  File "/disk2/Android11/.repo/repo/project.py", line 1251, in Sync_LocalHalf
    self._InitWorkTree(force_sync=force_sync, submodules=submodules)
  File "/disk2/Android11/.repo/repo/project.py", line 2801, in _InitWorkTree
    self._CheckDirReference(self.gitdir, dotgit, share_refs=True)
  File "/disk2/Android11/.repo/repo/project.py", line 2674, in _CheckDirReference
    platform_utils.rename(dst_path, src_path)
  File "/disk2/Android11/.repo/repo/platform_utils.py", line 127, in rename
    os.rename(src, dst)
OSError: [Errno 18] Invalid cross-device link: '/disk2/Android11/system/libhidl/.git/packed-refs' -> '/disk2/Android11/.repo/projects/system/libhidl.git/packed-refs'
"""

Change-Id: Ifda2f16530cc5a8f280169f482ee858f9e5241d3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/316002
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-09-24 08:20:06 +00:00
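The fix amounts to catching the cross-device case and falling back to shutil.move, roughly as sketched below (repo's actual platform_utils.rename may differ in detail):

```python
import errno
import os
import shutil


def rename(src, dst):
    """Rename |src| to |dst|, even across filesystems.

    os.rename() raises EXDEV ("Invalid cross-device link") when the two
    paths live on different filesystems, as with the symlinked .repo
    layout shown above; shutil.move() copies and deletes instead.
    """
    try:
        os.rename(src, dst)
    except OSError as e:
        if e.errno == errno.EXDEV:
            shutil.move(src, dst)
        else:
            raise
```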
a84f43a006 manifest: make repo-hooks more robust wrt element ordering
Currently, repo will fail to sync to a manifest if the definition
of the repo-hooks project comes after the repo-hooks element.

BUG=none
TEST=new test, run_tests

Change-Id: I0bf85625173492af6c6404d4b67543e96e670562
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/318520
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jack Neus <jackneus@google.com>
2021-09-23 21:17:38 +00:00
0468feac39 update-manpages: avoid regen just for datestamp update
To avoid noise due to the passage of time, don't regenerate man pages
if the only thing different is the datestamp in the header.

Change-Id: Ic8d7b08d12e59c66994c0cc2d4ec2d2ed3eb6e6d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/318575
Reviewed-by: Jack Neus <jackneus@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-09-22 19:37:35 +00:00
0ec2029833 superproject: Move enrollment to opt-out when enabled globally
Our internal experiments have been a success so far and we are enrolling
100% of users now.  Instead of asking every two weeks, simply consider a
lack of an unexpired choice as accepting the system default.

With this change the user would still be able to override the system
default with --no-use-superproject, or to permanently set the choice in
user's profile with git config --global repo.superprojectchoice.

Bug: [google internal] b/190688390
Change-Id: Idc77a9cbf88a169d90304169e91f0d722dc4ac8b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/317975
Tested-by: Xin Li <delphij@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
2021-09-20 07:21:22 +00:00
d8e8ae8990 superproject: Log branch and remote url with every log message.
Saved the superproject's remote URL in _remote_url and used it
in the _Fetch function.

Tested:
$ ./run_tests
$ flake8 git_superproject.py
$ repo_dev init --use-superproject -u https://android.googlesource.com/platform/manifest
$ repo_dev sync

   Verified that all log messages have the following format.
   repo superproject branch: <branch> url: <url> warning: <message>

Bug: [google internal] b/200072098
Change-Id: Iac6af7c99225479fd50bc6909396b22e0ce5f76b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/318177
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-09-16 15:46:19 +00:00
6448a4f2af sync: Log repo sync state events as 'data' events.
git_trace2_event_log.py:
+ Added LogDataConfigEvents method to log 'data' events.
  Sync's current_sync_state and previous_sync_state are logged
  as 'data' events in the current log.

  It logs every key/value pair in the |config| argument. Each key is
  prefixed with the |prefix| argument.

  The following are sample events that are logged during repo sync.

   {"event":"data",
   "sid":"repo-20210914T181545Z-P000330c0/repo-20210914T181545Z-P000330c0",
   "thread":"MainThread",
   "time":"2021-09-14T18:16:19.935846Z",
   "key":"previous_sync_state/repo.syncstate.main.synctime",
   "value":"2021-09-14T17:27:11.573717Z"}

   {"event":"data",
   "sid":"repo-20210914T181545Z-P000330c0/repo-20210914T181545Z-P000330c0",
   "thread":"MainThread",
   "time":"2021-09-14T18:16:19.955546Z",
   "key":"current_sync_state/repo.syncstate.main.synctime",
   "value":"2021-09-14T18:16:19.935979Z"}

tests/test_git_trace2_event_log.py:
+ Added unit tests

sync.py:
+ Changed logging calls to LogDataConfigEvents.

Tested:
$ ./run_tests

Tested it by running the following command multiple times.
$ repo_dev sync -j 20
  repo sync has finished successfully

  Verified config data is logged in trace2 event logs.

Bug: [google internal] b/199758376
Change-Id: I75fd830e90c1811ec28510538c99a2632b104e85
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/317823
Reviewed-by: Josh Steadmon <steadmon@google.com>
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-09-14 21:36:12 +00:00
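The shape of LogDataConfigEvents can be sketched as below; the real method in git_trace2_event_log.py also fills in the sid, thread, and timestamp fields shown in the sample events, so treat this as illustrative only:

```python
import json


def log_data_config_events(log_path, config, prefix):
    """Append one 'data' event per |config| entry, keys prefixed by |prefix|."""
    with open(log_path, "a") as f:
        for key, value in config.items():
            event = {
                "event": "data",
                "key": "%s/%s" % (prefix, key),
                "value": value,
            }
            f.write(json.dumps(event) + "\n")


# e.g. log_data_config_events(trace_file, prev_state, "previous_sync_state")
```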
148e1ce81a sync: fix recursive fetching
Commit b2fa30a2b8 ("sync: switch network
fetch to multiprocessing") accidentally changed the variable passed to
the 2nd fetch call from |missing| to |to_fetch| due to a copy & paste
of the earlier changed logic.  Undo that to fix git submodule fetching.

Bug: https://crbug.com/gerrit/14489
Change-Id: I627954f80fd2e80d9d5809b530aa6b0ef9260abb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305262
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 22:43:09 -04:00
32ca6687ae git_config: hoist Windows ssh check earlier
The ssh master logic has never worked under Windows, which is why this
code always returned False when running there (including cygwin).  But
the OS check was still done while holding the threading lock.  While
it might be a little slower than necessary, it still worked.

The switch from the threading module to the multiprocessing module
changed global behavior subtly under Windows and broke things: the
globals previously would stay valid, but now they get cleared.  So
the lock is reset to None in child workers.

We could tweak the logic to pass the lock through, but there isn't
much point when the rest of the code is still disabled in Windows.
So perform the platform check before we grab the lock.  This fixes
the crash, and probably speeds things up a few nanoseconds.

This shouldn't be a problem on Linux systems as the platform fork
will duplicate the existing process memory (including globals).

Bug: https://crbug.com/gerrit/14480
Change-Id: I1d1da82c6d7bd6b8cdc1f03f640a520ecd047063
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305149
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 19:49:58 -04:00
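Conceptually the reordering looks like this; the lock and function names are placeholders for repo's git_config internals:

```python
import multiprocessing
import sys

_master_lock = multiprocessing.Lock()


def try_ssh_control_master():
    """Attempt ssh connection sharing, but bail out early on Windows.

    The platform check happens before the lock is touched, because the
    module-level lock can be reset to None inside Windows worker processes
    spawned by multiprocessing.
    """
    if sys.platform in ("win32", "cygwin"):
        return False
    with _master_lock:
        # ... probe for / start the ssh control master here ...
        return True
```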
0ae9503a86 sync: fix print error when handling server error
When converting this logic from print() to the output buffer, this
error codepath should have dropped the use of the file= redirect.

Bug: https://crbug.com/gerrit/14482
Change-Id: Ib484924a2031ba3295c1c1a5b9a2d816b9912279
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305142
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 12:48:42 -04:00
d92076d930 Revert "Save cookies back to jar when fetching clone.bundle"
This reverts commit 4abf8e6ef8.

The curl process for updating the cookie file is not atomic.  When
fetching many bundles in parallel, we can sometimes corrupt the file
causing it to be cleared.  Since users should manage gitcookies on
their own, leave it read-only.

Bug: https://crbug.com/gerrit/12300
Change-Id: Id472c99b197bc4cf8533c649f8881509f38643c1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/254092
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Mike Frysinger <vapier@google.com>
(cherry picked from commit dc1d0e0c7f)
2020-02-11 21:03:35 -05:00
aeb2eee9d3 repo: bump launcher version
This way we can push out the updated stable branch change.

Change-Id: I72d5dab4523a10dfeb6529796892096aa80eba3c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/254492
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Mike Frysinger <vapier@google.com>
2020-02-12 00:12:14 +00:00
45d1c372a7 project: fix bytes/str encoding when updating git submodules
Since tempfile.mkstemp() returns a file handle in binary mode,
make sure we turn our strings into bytes before writing.

Bug: https://crbug.com/gerrit/12043
Change-Id: I3e84d595e84b8bc12a1fbc7fd0bb3ea0ba2832b0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/254393
Reviewed-by: Michael Mortensen <mmortensen@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
(cherry picked from commit 163d42eb43)
2020-02-11 14:46:13 -05:00
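The essence of the fix, as a hedged sketch (the helper name and the file contents are made up for illustration):

```python
import os
import tempfile


def write_gitmodules_tempfile(content):
    """Write |content| (a str) through the binary fd from mkstemp()."""
    fd, path = tempfile.mkstemp()
    try:
        # os.write() on the raw descriptor expects bytes, so encode first.
        os.write(fd, content.encode("utf-8"))
    finally:
        os.close(fd)
    return path
```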
19607b2817 repo: allow REPO_REV to be an env var
We do this for REPO_URL already.

Bug: https://crbug.com/gerrit/10233
Change-Id: I53410645474b00d900467c96fa5d8446f3a607d3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253552
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Mike Frysinger <vapier@google.com>
(cherry picked from commit 563f1a6512)
2020-02-11 14:44:29 -05:00
68744dbc01 Fixing forall subcommand for Py3
Execution of 'repo forall -p -c' doesn't work with Py3 and ends up
with an error:

Got an error, terminating the pool: TypeError: can only concatenate
str (not "bytes") to str

That's fixed by using the decode() method.

Change-Id: Ice01aaa1822dde8d957b5bf096021dd5a2b7dd51
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253659
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Jiri Tyr <jiri.tyr@gmail.com>
(cherry picked from commit 83a3227b62)
2020-02-10 23:31:45 -05:00
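The decode() fix can be illustrated with a small stand-in for the forall worker; names here are assumptions, not repo's actual code:

```python
import subprocess


def run_in_project(cmd, cwd):
    """Run |cmd| in |cwd| and return its output as str rather than bytes.

    Under Python 3 subprocess output is bytes, so concatenating it with a
    str prefix (as `repo forall -p` does) raises the TypeError quoted above
    unless the output is decoded first.
    """
    output = subprocess.check_output(cmd, cwd=cwd)
    return output.decode("utf-8", errors="replace")
```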
ef412624e9 remove spurious +x bits
These files are not directly executable, so drop the +x bits.

Change-Id: Iaf19a03a497686cc21103e7ddf08073173440dd1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/254076
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
(cherry picked from commit e7c91889a6)
2020-02-10 23:31:03 -05:00
a06ab7d28b find python via env
This allows these scripts to run through the active version of the
virtualenv python when invoked via tox.

Change-Id: Ib52f475b7b20c34d62cfd179a1341da1a08a8b5c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253974
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Mike Frysinger <vapier@google.com>
(cherry picked from commit 1b117db767)
2020-02-10 23:30:58 -05:00
471a7ed5f7 git_config: fix encoding handling in GetUrlCookieFile
Make sure we decode the bytes coming from the subprocess.Popen as
we're treating them as strings.

Change-Id: I44100ca5cd94f68a35d489936292eb641006edbe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253973
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
(cherry picked from commit ded477dbb9)
2020-02-10 23:28:40 -05:00
619a2b5887 Fix inverted logic around [gitc-]init and -c
Instead of not using '-c' for '--current-branch' when using gitc, we
were only using '-c' when using gitc, so we still had the conflict with
the gitc option, and other users still couldn't use '-c'.

Test: repo init -u https://android.googlesource.com/platform/manifest; repo init -c
Test: repo gitc-init -u ... -b ... -c testing
Change-Id: I71e4950a49c281418249f0783c6a2ea34f0d3e2b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253795
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Dan Willemsen <dwillemsen@google.com>
(cherry picked from commit 93293ca47f)
2020-02-07 15:54:52 -05:00
ab15e42fa4 Do not try to fetch default revision for mirrors always
* Mirrors may contain multiple projects, some of which may not
  always contain the default revision.
* Only fetch the default revision explicitly if
  '--current-branch' is set.
* Fixes breakage caused by
  commit 6856f98467
  "Fix repo mirror with --current-branch"

Bug: https://crbug.com/gerrit/12274
Change-Id: Iaafabe2992f76f3644b841f24245d3e19c9515a9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253093
Reviewed-by: Kuang-che Wu <kcwu@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Chirayu Desai <chirayudesai1@gmail.com>
(cherry picked from commit f7b64e3350)
2020-02-06 09:19:35 -05:00
75c02fe4cb init: handle -c conflicts with gitc-init
We keep getting requests for init to support -c.  This conflicts with
gitc-init, which allocates -c for its own use.  Let's make this dynamic
so we keep it with "init" but omit it for "gitc-init".

Bug: https://crbug.com/gerrit/10200
Change-Id: Ibf69c2bbeff638e28e63cb08926fea0c622258db
(cherry picked from commit 66098f707a)
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253392
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2020-02-05 18:04:11 +00:00
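Making the short flag dynamic might look roughly like this; add_current_branch_option is a hypothetical helper, not the code that landed:

```python
import optparse


def add_current_branch_option(parser, command_name):
    """Register --current-branch, with the -c alias only outside gitc-init."""
    flags = ["--current-branch"]
    if command_name != "gitc-init":
        # gitc-init already claims -c for its own use.
        flags.insert(0, "-c")
    parser.add_option(
        *flags,
        dest="current_branch_only",
        action="store_true",
        help="fetch only the current branch from the server",
    )


add_current_branch_option(optparse.OptionParser(), "init")       # -c available
add_current_branch_option(optparse.OptionParser(), "gitc-init")  # long flag only
```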
afd1b4023f repo: point default branch to repo-1
Since this will be feature-frozen for Python 2 users, let's point the
default update branch to "repo-1" rather than "stable", as the latter
will follow master development (and be Python 3-only).

Bug: https://crbug.com/gerrit/10418
Change-Id: Iceff0983684a580dc5c9ec1c60acfb5eda5ce2c4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253172
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Mike Frysinger <vapier@google.com>
2020-02-05 02:56:08 +00:00
119 changed files with 21576 additions and 14714 deletions

18
.flake8
View File

@ -1,15 +1,13 @@
[flake8]
max-line-length=100
ignore=
# E111: Indentation is not a multiple of four
E111,
# E114: Indentation is not a multiple of four (comment)
E114,
max-line-length = 80
per-file-ignores =
# E501: line too long
tests/test_git_superproject.py: E501
extend-ignore =
# E203: Whitespace before ':'
# See https://github.com/PyCQA/pycodestyle/issues/373
E203,
# E402: Module level import not at top of file
E402,
# E731: do not assign a lambda expression, use a def
E731,
# W503: Line break before binary operator
W503,
# W504: Line break after binary operator
W504

23
.github/workflows/flake8-postsubmit.yml vendored Normal file
View File

@ -0,0 +1,23 @@
# GitHub actions workflow.
# https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions
# https://github.com/marketplace/actions/python-flake8
name: Flake8
on:
push:
branches: [main]
jobs:
lint:
name: Python Lint
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.9"
- name: Run flake8
uses: julianwachholz/flake8-action@v2
with:
checkName: "Python Lint"

View File

@ -14,18 +14,18 @@ jobs:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
python-version: [3.6, 3.7, 3.8, 3.9]
python-version: ['3.6', '3.7', '3.8', '3.9', '3.10']
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh-actions
python -m pip install tox tox-gh-actions
- name: Test with tox
run: tox

5
.gitreview Normal file
View File

@ -0,0 +1,5 @@
[gerrit]
host=gerrit-review.googlesource.com
scheme=https
project=git-repo.git
defaultbranch=main

View File

@ -8,7 +8,7 @@ that you can put anywhere in your path.
* Homepage: <https://gerrit.googlesource.com/git-repo/>
* Mailing list: [repo-discuss on Google Groups][repo-discuss]
* Bug reports: <https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo>
* Bug reports: <https://bugs.chromium.org/p/gerrit/issues/list?q=component:Applications%3Erepo>
* Source: <https://gerrit.googlesource.com/git-repo/>
* Overview: <https://source.android.com/source/developing.html>
* Docs: <https://source.android.com/source/using-repo.html>
@ -51,5 +51,5 @@ $ chmod a+rx ~/.bin/repo
[new-bug]: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
[issue tracker]: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
[issue tracker]: https://bugs.chromium.org/p/gerrit/issues/list?q=component:Applications%3Erepo
[repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss

View File

@ -1,19 +1,19 @@
# Submitting Changes
Here's a short overview of the process.
* Make small logical changes.
* [Provide a meaningful commit message][commit-message-style].
* Make sure all code is under the Apache License, 2.0.
* Publish your changes for review.
* `git push origin HEAD:refs/for/main`
* Make corrections if requested.
* [Verify your changes on Gerrit.](#verify)
* [Send to the commit queue for testing & merging.](#cq)
[TOC]
# Short Version
- Make small logical changes.
- Provide a meaningful commit message.
- Check for coding errors and style nits with flake8.
- Make sure all code is under the Apache License, 2.0.
- Publish your changes for review.
- Make corrections if requested.
- Verify your changes on gerrit so they can be submitted.
`git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/main`
# Long Version
## Long Version
I wanted a file describing how to submit patches for repo,
so I started with the one found in the core Git distribution
@ -26,10 +26,11 @@ yourself with the following relevant bits.
## Make separate commits for logically separate changes.
Unless your patch is really trivial, you should not be sending
out a patch that was generated between your working tree and your
commit head. Instead, always make a commit with complete commit
message and generate a series of patches from your repository.
Unless your patch is really trivial, you should not be sending out a patch that
was generated between your working tree and your commit head.
Instead, always make a commit with a complete
[commit message][commit-message-style] and generate a series of patches from
your repository.
It is a good discipline.
Describe the technical detail of the change(s).
@ -38,17 +39,26 @@ If your description starts to get too long, that's a sign that you
probably need to split up your commit to finer grained pieces.
## Check for coding errors and style violations with flake8
## Linting and formatting code
Run `flake8` on changed modules:
Lint any changes by running:
```sh
$ tox -e lint -- file.py
```
flake8 file.py
And format with:
```sh
$ tox -e format -- file.py
```
Note that repo generally follows [Google's Python Style Guide] rather than
[PEP 8], with a couple of notable exceptions:
Or format everything:
```sh
$ tox -e format
```
* Indentation is at 2 columns rather than 4
* The maximum line length is 100 columns rather than 80
Repo uses [black](https://black.readthedocs.io/) with line length of 80 as its
formatter and flake8 as its linter. Repo also follows
[Google's Python Style Guide].
There should be no new errors or warnings introduced.
@ -165,9 +175,16 @@ commit. If you make the requested changes you will need to amend your commit
and push it to the review server again.
## Verify your changes on gerrit
## Verify your changes on Gerrit {#verify}
After you receive a Code-Review+2 from the maintainer, select the Verified
button on the gerrit page for the change. This verifies that you have tested
button on the Gerrit page for the change. This verifies that you have tested
your changes and notifies the maintainer that they are ready to be submitted.
The maintainer will then submit your changes to the repository.
## Merge your changes via the commit queue {#cq}
Once a change is ready to be merged, select the Commit-Queue+2 setting on the
Gerrit page for it. This tells the CI system to test the change, and if it
passes all the checks, automatically merges it.
[commit-message-style]: https://chris.beams.io/posts/git-commit/

View File

@ -17,23 +17,20 @@ import sys
import pager
COLORS = {None: -1,
'normal': -1,
'black': 0,
'red': 1,
'green': 2,
'yellow': 3,
'blue': 4,
'magenta': 5,
'cyan': 6,
'white': 7}
COLORS = {
None: -1,
"normal": -1,
"black": 0,
"red": 1,
"green": 2,
"yellow": 3,
"blue": 4,
"magenta": 5,
"cyan": 6,
"white": 7,
}
ATTRS = {None: -1,
'bold': 1,
'dim': 2,
'ul': 4,
'blink': 5,
'reverse': 7}
ATTRS = {None: -1, "bold": 1, "dim": 2, "ul": 4, "blink": 5, "reverse": 7}
RESET = "\033[m"
@ -56,30 +53,30 @@ def _Color(fg=None, bg=None, attr=None):
code = "\033["
if attr >= 0:
code += chr(ord('0') + attr)
code += chr(ord("0") + attr)
need_sep = True
if fg >= 0:
if need_sep:
code += ';'
code += ";"
need_sep = True
if fg < 8:
code += '3%c' % (ord('0') + fg)
code += "3%c" % (ord("0") + fg)
else:
code += '38;5;%d' % fg
code += "38;5;%d" % fg
if bg >= 0:
if need_sep:
code += ';'
code += ";"
if bg < 8:
code += '4%c' % (ord('0') + bg)
code += "4%c" % (ord("0") + bg)
else:
code += '48;5;%d' % bg
code += 'm'
code += "48;5;%d" % bg
code += "m"
else:
code = ''
code = ""
return code
@ -97,17 +94,17 @@ def SetDefaultColoring(state):
global DEFAULT
state = state.lower()
if state in ('auto',):
if state in ("auto",):
DEFAULT = state
elif state in ('always', 'yes', 'true', True):
DEFAULT = 'always'
elif state in ('never', 'no', 'false', False):
DEFAULT = 'never'
elif state in ("always", "yes", "true", True):
DEFAULT = "always"
elif state in ("never", "no", "false", False):
DEFAULT = "never"
class Coloring(object):
def __init__(self, config, section_type):
self._section = 'color.%s' % section_type
self._section = "color.%s" % section_type
self._config = config
self._out = sys.stdout
@ -115,14 +112,14 @@ class Coloring(object):
if on is None:
on = self._config.GetString(self._section)
if on is None:
on = self._config.GetString('color.ui')
on = self._config.GetString("color.ui")
if on == 'auto':
if on == "auto":
if pager.active or os.isatty(1):
self._on = True
else:
self._on = False
elif on in ('true', 'always'):
elif on in ("true", "always"):
self._on = True
else:
self._on = False
@ -141,7 +138,7 @@ class Coloring(object):
self._out.flush()
def nl(self):
self._out.write('\n')
self._out.write("\n")
def printer(self, opt=None, fg=None, bg=None, attr=None):
s = self
@ -149,6 +146,7 @@ class Coloring(object):
def f(fmt, *args):
s._out.write(c(fmt, *args))
return f
def nofmt_printer(self, opt=None, fg=None, bg=None, attr=None):
@ -157,6 +155,7 @@ class Coloring(object):
def f(fmt):
s._out.write(c(fmt))
return f
def colorer(self, opt=None, fg=None, bg=None, attr=None):
@ -165,12 +164,14 @@ class Coloring(object):
def f(fmt, *args):
output = fmt % args
return ''.join([c, output, RESET])
return "".join([c, output, RESET])
return f
else:
def f(fmt, *args):
return fmt % args
return f
def nofmt_colorer(self, opt=None, fg=None, bg=None, attr=None):
@ -178,29 +179,32 @@ class Coloring(object):
c = self._parse(opt, fg, bg, attr)
def f(fmt):
return ''.join([c, fmt, RESET])
return "".join([c, fmt, RESET])
return f
else:
def f(fmt):
return fmt
return f
def _parse(self, opt, fg, bg, attr):
if not opt:
return _Color(fg, bg, attr)
v = self._config.GetString('%s.%s' % (self._section, opt))
v = self._config.GetString("%s.%s" % (self._section, opt))
if v is None:
return _Color(fg, bg, attr)
v = v.strip().lower()
if v == "reset":
return RESET
elif v == '':
elif v == "":
return _Color(fg, bg, attr)
have_fg = False
for a in v.split(' '):
for a in v.split(" "):
if is_color(a):
if have_fg:
bg = a

View File

@ -25,7 +25,7 @@ import progress
# Are we generating man-pages?
GENERATE_MANPAGES = os.environ.get('_REPO_GENERATE_MANPAGES_') == ' indeed! '
GENERATE_MANPAGES = os.environ.get("_REPO_GENERATE_MANPAGES_") == " indeed! "
# Number of projects to submit to a single worker process at a time.
@ -43,8 +43,7 @@ DEFAULT_LOCAL_JOBS = min(os.cpu_count(), 8)
class Command(object):
"""Base class for any command line action in repo.
"""
"""Base class for any command line action in repo."""
# Singleton for all commands to track overall repo command execution and
# provide event summary to callers. Only used by sync subcommand currently.
@ -52,22 +51,38 @@ class Command(object):
# NB: This is being replaced by git trace2 events. See git_trace2_event_log.
event_log = EventLog()
# Whether this command is a "common" one, i.e. whether the user would commonly
# use it or it's a more uncommon command. This is used by the help command to
# show short-vs-full summaries.
# Whether this command is a "common" one, i.e. whether the user would
# commonly use it or it's a more uncommon command. This is used by the help
# command to show short-vs-full summaries.
COMMON = False
# Whether this command supports running in parallel. If greater than 0,
# it is the number of parallel jobs to default to.
PARALLEL_JOBS = None
def __init__(self, repodir=None, client=None, manifest=None, gitc_manifest=None,
git_event_log=None):
# Whether this command supports Multi-manifest. If False, then main.py will
# iterate over the manifests and invoke the command once per (sub)manifest.
# This is only checked after calling ValidateOptions, so that partially
# migrated subcommands can set it to False.
MULTI_MANIFEST_SUPPORT = True
def __init__(
self,
repodir=None,
client=None,
manifest=None,
gitc_manifest=None,
git_event_log=None,
outer_client=None,
outer_manifest=None,
):
self.repodir = repodir
self.client = client
self.outer_client = outer_client or client
self.manifest = manifest
self.gitc_manifest = gitc_manifest
self.git_event_log = git_event_log
self.outer_manifest = outer_manifest
# Cache for the OptionParser property.
self._optparse = None
@ -85,8 +100,8 @@ class Command(object):
opt_value = getattr(opts, opt_key)
# If the value is set, it means the user has passed it as a command
# line option, and we should use that. Otherwise we can try to set it
# with the value from the corresponding environment variable.
# line option, and we should use that. Otherwise we can try to set
# it with the value from the corresponding environment variable.
if opt_value is not None:
continue
@ -100,11 +115,13 @@ class Command(object):
def OptionParser(self):
if self._optparse is None:
try:
me = 'repo %s' % self.NAME
usage = self.helpUsage.strip().replace('%prog', me)
me = "repo %s" % self.NAME
usage = self.helpUsage.strip().replace("%prog", me)
except AttributeError:
usage = 'repo %s' % self.NAME
epilog = 'Run `repo help %s` to view the detailed manual.' % self.NAME
usage = "repo %s" % self.NAME
epilog = (
"Run `repo help %s` to view the detailed manual." % self.NAME
)
self._optparse = optparse.OptionParser(usage=usage, epilog=epilog)
self._CommonOptions(self._optparse)
self._Options(self._optparse)
@ -116,24 +133,63 @@ class Command(object):
These will show up for *all* subcommands, so use sparingly.
NB: Keep in sync with repo:InitParser().
"""
g = p.add_option_group('Logging options')
opts = ['-v'] if opt_v else []
g.add_option(*opts, '--verbose',
dest='output_mode', action='store_true',
help='show all output')
g.add_option('-q', '--quiet',
dest='output_mode', action='store_false',
help='only show errors')
g = p.add_option_group("Logging options")
opts = ["-v"] if opt_v else []
g.add_option(
*opts,
"--verbose",
dest="output_mode",
action="store_true",
help="show all output",
)
g.add_option(
"-q",
"--quiet",
dest="output_mode",
action="store_false",
help="only show errors",
)
if self.PARALLEL_JOBS is not None:
default = 'based on number of CPU cores'
default = "based on number of CPU cores"
if not GENERATE_MANPAGES:
# Only include active cpu count if we aren't generating man pages.
default = f'%default; {default}'
# Only include active cpu count if we aren't generating man
# pages.
default = f"%default; {default}"
p.add_option(
'-j', '--jobs',
type=int, default=self.PARALLEL_JOBS,
help=f'number of jobs to run in parallel (default: {default})')
"-j",
"--jobs",
type=int,
default=self.PARALLEL_JOBS,
help=f"number of jobs to run in parallel (default: {default})",
)
m = p.add_option_group("Multi-manifest options")
m.add_option(
"--outer-manifest",
action="store_true",
default=None,
help="operate starting at the outermost manifest",
)
m.add_option(
"--no-outer-manifest",
dest="outer_manifest",
action="store_false",
help="do not operate on outer manifests",
)
m.add_option(
"--this-manifest-only",
action="store_true",
default=None,
help="only operate on this (sub)manifest",
)
m.add_option(
"--no-this-manifest-only",
"--all-manifests",
dest="this_manifest_only",
action="store_false",
help="operate on this manifest and its submanifests",
)
def _Options(self, p):
"""Initialize the option parser with subcommand-specific options."""
@ -157,8 +213,7 @@ class Command(object):
return {}
def Usage(self):
"""Display usage and terminate.
"""
"""Display usage and terminate."""
self.OptionParser.print_usage()
sys.exit(1)
@ -166,6 +221,10 @@ class Command(object):
"""Validate common options."""
opt.quiet = opt.output_mode is False
opt.verbose = opt.output_mode is True
if opt.outer_manifest is None:
# By default, treat multi-manifest instances as a single manifest
# from the user's perspective.
opt.outer_manifest = True
def ValidateOptions(self, opt, args):
"""Validate the user options & arguments before executing.
@ -178,12 +237,13 @@ class Command(object):
"""
def Execute(self, opt, args):
"""Perform the action, after option parsing is complete.
"""
"""Perform the action, after option parsing is complete."""
raise NotImplementedError
@staticmethod
def ExecuteInParallel(jobs, func, inputs, callback, output=None, ordered=False):
def ExecuteInParallel(
jobs, func, inputs, callback, output=None, ordered=False
):
"""Helper for managing parallel execution boiler plate.
For subcommands that can easily split their work up.
@ -191,18 +251,21 @@ class Command(object):
Args:
jobs: How many parallel processes to use.
func: The function to apply to each of the |inputs|. Usually a
functools.partial for wrapping additional arguments. It will be run
in a separate process, so it must be pickalable, so nested functions
won't work. Methods on the subcommand Command class should work.
functools.partial for wrapping additional arguments. It will be
run in a separate process, so it must be pickalable, so nested
functions won't work. Methods on the subcommand Command class
should work.
inputs: The list of items to process. Must be a list.
callback: The function to pass the results to for processing. It will be
executed in the main thread and process the results of |func| as they
become available. Thus it may be a local nested function. Its return
value is passed back directly. It takes three arguments:
callback: The function to pass the results to for processing. It
will be executed in the main thread and process the results of
|func| as they become available. Thus it may be a local nested
function. Its return value is passed back directly. It takes
three arguments:
- The processing pool (or None with one job).
- The |output| argument.
- An iterator for the results.
output: An output manager. May be progress.Progess or color.Coloring.
output: An output manager. May be progress.Progess or
color.Coloring.
ordered: Whether the jobs should be processed in order.
Returns:
@ -215,7 +278,11 @@ class Command(object):
else:
with multiprocessing.Pool(jobs) as pool:
submit = pool.imap if ordered else pool.imap_unordered
return callback(pool, output, submit(func, inputs, chunksize=WORKER_BATCH_SIZE))
return callback(
pool,
output,
submit(func, inputs, chunksize=WORKER_BATCH_SIZE),
)
finally:
if isinstance(output, progress.Progress):
output.end()
@ -230,9 +297,7 @@ class Command(object):
project = None
if os.path.exists(path):
oldpath = None
while (path and
path != oldpath and
path != manifest.topdir):
while path and path != oldpath and path != manifest.topdir:
try:
project = self._by_path[path]
break
@ -251,54 +316,99 @@ class Command(object):
pass
return project
def GetProjects(self, args, manifest=None, groups='', missing_ok=False,
submodules_ok=False):
def GetProjects(
self,
args,
manifest=None,
groups="",
missing_ok=False,
submodules_ok=False,
all_manifests=False,
):
"""A list of projects that match the arguments.
Args:
args: a list of (case-insensitive) strings, projects to search for.
manifest: an XmlManifest, the manifest to use, or None for default.
groups: a string, the manifest groups in use.
missing_ok: a boolean, whether to allow missing projects.
submodules_ok: a boolean, whether to allow submodules.
all_manifests: a boolean, if True then all manifests and
submanifests are used. If False, then only the local
(sub)manifest is used.
Returns:
A list of matching Project instances.
"""
if all_manifests:
if not manifest:
manifest = self.manifest.outer_client
all_projects_list = manifest.all_projects
else:
if not manifest:
manifest = self.manifest
all_projects_list = manifest.projects
result = []
mp = manifest.manifestProject
if not groups:
groups = manifest.GetGroupsStr()
groups = [x for x in re.split(r'[,\s]+', groups) if x]
groups = [x for x in re.split(r"[,\s]+", groups) if x]
if not args:
derived_projects = {}
for project in all_projects_list:
if submodules_ok or project.sync_s:
derived_projects.update((p.name, p)
for p in project.GetDerivedSubprojects())
derived_projects.update(
(p.name, p) for p in project.GetDerivedSubprojects()
)
all_projects_list.extend(derived_projects.values())
for project in all_projects_list:
if (missing_ok or project.Exists) and project.MatchesGroups(groups):
if (missing_ok or project.Exists) and project.MatchesGroups(
groups
):
result.append(project)
else:
self._ResetPathToProjectMap(all_projects_list)
for arg in args:
# We have to filter by manifest groups in case the requested project is
# checked out multiple times or differently based on them.
projects = [project for project in manifest.GetProjectsWithName(arg)
if project.MatchesGroups(groups)]
# We have to filter by manifest groups in case the requested
# project is checked out multiple times or differently based on
# them.
projects = [
project
for project in manifest.GetProjectsWithName(
arg, all_manifests=all_manifests
)
if project.MatchesGroups(groups)
]
if not projects:
path = os.path.abspath(arg).replace('\\', '/')
project = self._GetProjectByPath(manifest, path)
path = os.path.abspath(arg).replace("\\", "/")
tree = manifest
if all_manifests:
# Look for the deepest matching submanifest.
for tree in reversed(list(manifest.all_manifests)):
if path.startswith(tree.topdir):
break
project = self._GetProjectByPath(tree, path)
# If it's not a derived project, update path->project mapping and
# search again, as arg might actually point to a derived subproject.
if (project and not project.Derived and (submodules_ok or
project.sync_s)):
# If it's not a derived project, update path->project
# mapping and search again, as arg might actually point to
# a derived subproject.
if (
project
and not project.Derived
and (submodules_ok or project.sync_s)
):
search_again = False
for subproject in project.GetDerivedSubprojects():
self._UpdatePathToProjectMap(subproject)
search_again = True
if search_again:
project = self._GetProjectByPath(manifest, path) or project
project = (
self._GetProjectByPath(manifest, path)
or project
)
if project:
projects = [project]
@ -308,7 +418,10 @@ class Command(object):
for project in projects:
if not missing_ok and not project.Exists:
raise NoSuchProjectError('%s (%s)' % (arg, project.relpath))
raise NoSuchProjectError(
"%s (%s)"
% (arg, project.RelPath(local=not all_manifests))
)
if not project.MatchesGroups(groups):
raise InvalidProjectGroupsError(arg)
@ -316,15 +429,27 @@ class Command(object):
def _getpath(x):
return x.relpath
result.sort(key=_getpath)
return result
def FindProjects(self, args, inverse=False):
def FindProjects(self, args, inverse=False, all_manifests=False):
"""Find projects from command line arguments.
Args:
args: a list of (case-insensitive) strings, projects to search for.
inverse: a boolean, if True, then projects not matching any |args|
are returned.
all_manifests: a boolean, if True then all manifests and
submanifests are used. If False, then only the local
(sub)manifest is used.
"""
result = []
patterns = [re.compile(r'%s' % a, re.IGNORECASE) for a in args]
for project in self.GetProjects(''):
patterns = [re.compile(r"%s" % a, re.IGNORECASE) for a in args]
for project in self.GetProjects("", all_manifests=all_manifests):
paths = [project.name, project.RelPath(local=not all_manifests)]
for pattern in patterns:
match = pattern.search(project.name) or pattern.search(project.relpath)
match = any(pattern.search(x) for x in paths)
if not inverse and match:
result.append(project)
break
@ -333,13 +458,29 @@ class Command(object):
else:
if inverse:
result.append(project)
result.sort(key=lambda project: project.relpath)
result.sort(
key=lambda project: (project.manifest.path_prefix, project.relpath)
)
return result
def ManifestList(self, opt):
"""Yields all of the manifests to traverse.
Args:
opt: The command options.
"""
top = self.outer_manifest
if not opt.outer_manifest or opt.this_manifest_only:
top = self.manifest
yield top
if not opt.this_manifest_only:
for child in top.all_children:
yield child
class InteractiveCommand(Command):
"""Command which requires user interaction on the tty and
must not run within a pager, even if the user asks to.
"""Command which requires user interaction on the tty and must not run
within a pager, even if the user asks to.
"""
def WantPager(self, _opt):
@ -347,8 +488,8 @@ class InteractiveCommand(Command):
class PagedCommand(Command):
"""Command which defaults to output in a pager, as its
display tends to be larger than one screen full.
"""Command which defaults to output in a pager, as its display tends to be
larger than one screen full.
"""
def WantPager(self, _opt):
@ -356,18 +497,16 @@ class PagedCommand(Command):
class MirrorSafeCommand(object):
"""Command permits itself to run within a mirror,
and does not require a working directory.
"""Command permits itself to run within a mirror, and does not require a
working directory.
"""
class GitcAvailableCommand(object):
"""Command that requires GITC to be available, but does
not require the local client to be a GITC client.
"""Command that requires GITC to be available, but does not require the
local client to be a GITC client.
"""
class GitcClientCommand(object):
"""Command that requires the local client to be a GITC
client.
"""
"""Command that requires the local client to be a GITC client."""

View File

@ -50,6 +50,10 @@ For example, if you want to change the manifest branch, you can simply run
For more documentation on the manifest format, including the local_manifests
support, see the [manifest-format.md] file.
* `submanifests/{submanifest.path}/`: The path prefix to the manifest state of
a submanifest included in a multi-manifest checkout. The outermost manifest
manifest state is found adjacent to `submanifests/`.
* `manifests/`: A git checkout of the manifest project. Its `.git/` state
points to the `manifest.git` bare checkout (see below). It tracks the git
branch specified at `repo init` time via `--manifest-branch`.
@ -157,11 +161,13 @@ User controlled settings are initialized when running `repo init`.
| Setting | `repo init` Option | Use/Meaning |
|------------------- |---------------------------|-------------|
| manifest.groups | `--groups` & `--platform` | The manifest groups to sync |
| manifest.standalone | `--standalone-manifest` | Download manifest as static file instead of creating checkout |
| repo.archive | `--archive` | Use `git archive` for checkouts |
| repo.clonebundle | `--clone-bundle` | Whether the initial sync used clone.bundle explicitly |
| repo.clonefilter | `--clone-filter` | Filter setting when using [partial git clones] |
| repo.depth | `--depth` | Create shallow checkouts when cloning |
| repo.dissociate | `--dissociate` | Dissociate from any reference/mirrors after initial clone |
| repo.git-lfs | `--git-lfs` | Enable [Git LFS] support |
| repo.mirror | `--mirror` | Checkout is a repo mirror |
| repo.partialclone | `--partial-clone` | Create [partial git clones] |
| repo.partialcloneexclude | `--partial-clone-exclude` | Comma-delimited list of project names (not paths) to exclude while using [partial git clones] |
@ -217,7 +223,7 @@ The `[remote]` settings are automatically populated/updated from the manifest.
The `[branch]` settings are updated by `repo start` and `git branch`.
| Setting | Subcommands | Use/Meaning |
|-------------------------------|---------------|-------------|
|---------------------------------------|---------------|-------------|
| review.\<url\>.autocopy | upload | Automatically add to `--cc=<value>` |
| review.\<url\>.autoreviewer | upload | Automatically add to `--reviewers=<value>` |
| review.\<url\>.autoupload | upload | Automatically answer "yes" or "no" to all prompts |
@ -225,6 +231,7 @@ The `[branch]` settings are updated by `repo start` and `git branch`.
| review.\<url\>.uploadlabels | upload | Automatically add to `--label=<value>` |
| review.\<url\>.uploadnotify | upload | [Notify setting][upload-notify] to use |
| review.\<url\>.uploadtopic | upload | Default [topic] to use |
| review.\<url\>.uploadwarningthreshold | upload | Warn when attempting to upload more than this many CLs |
| review.\<url\>.username | upload | Override username with `ssh://` review URIs |
| remote.\<remote\>.fetch | sync | Set of refs to fetch |
| remote.\<remote\>.projectname | \<network\> | The name of the project as it exists in Gerrit review |
@ -236,7 +243,9 @@ The `[branch]` settings are updated by `repo start` and `git branch`.
## ~/ dotconfig layout
Repo will create & maintain a few files in the user's home directory.
Repo will create & maintain a few files under the `.repoconfig/` directory.
This is placed in the user's home directory by default but can be changed by
setting `REPO_CONFIG_DIR`.
* `.repoconfig/`: Repo's per-user directory for all random config files/state.
* `.repoconfig/config`: Per-user settings using [git-config] file format.
@ -253,6 +262,7 @@ Repo will create & maintain a few files in the user's home directory.
[git-config]: https://git-scm.com/docs/git-config
[Git LFS]: https://git-lfs.github.com/
[git worktree]: https://git-scm.com/docs/git-worktree
[gitsubmodules]: https://git-scm.com/docs/gitsubmodules
[manifest-format.md]: ./manifest-format.md

View File

@ -26,6 +26,7 @@ following DTD:
remote*,
default?,
manifest-server?,
submanifest*?,
remove-project*,
project*,
extend-project*,
@ -57,6 +58,16 @@ following DTD:
<!ELEMENT manifest-server EMPTY>
<!ATTLIST manifest-server url CDATA #REQUIRED>
<!ELEMENT submanifest EMPTY>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest remote IDREF #IMPLIED>
<!ATTLIST submanifest project CDATA #IMPLIED>
<!ATTLIST submanifest manifest-name CDATA #IMPLIED>
<!ATTLIST submanifest revision CDATA #IMPLIED>
<!ATTLIST submanifest path CDATA #IMPLIED>
<!ATTLIST submanifest groups CDATA #IMPLIED>
<!ATTLIST submanifest default-groups CDATA #IMPLIED>
<!ELEMENT project (annotation*,
project*,
copyfile*,
@ -90,9 +101,12 @@ following DTD:
<!ELEMENT extend-project EMPTY>
<!ATTLIST extend-project name CDATA #REQUIRED>
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project dest-path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ATTLIST extend-project revision CDATA #IMPLIED>
<!ATTLIST extend-project remote CDATA #IMPLIED>
<!ATTLIST extend-project dest-branch CDATA #IMPLIED>
<!ATTLIST extend-project upstream CDATA #IMPLIED>
<!ELEMENT remove-project EMPTY>
<!ATTLIST remove-project name CDATA #REQUIRED>
@ -105,6 +119,7 @@ following DTD:
<!ELEMENT superproject EMPTY>
<!ATTLIST superproject name CDATA #REQUIRED>
<!ATTLIST superproject remote IDREF #IMPLIED>
<!ATTLIST superproject revision CDATA #IMPLIED>
<!ELEMENT contactinfo EMPTY>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
@ -112,6 +127,7 @@ following DTD:
<!ELEMENT include EMPTY>
<!ATTLIST include name CDATA #REQUIRED>
<!ATTLIST include groups CDATA #IMPLIED>
<!ATTLIST include revision CDATA #IMPLIED>
]>
```
@ -234,6 +250,66 @@ the specified tag. This is used by repo sync when the --smart-tag option
is given.
### Element submanifest
One or more submanifest elements may be specified. Each element describes a
single manifest to be checked out as a child.
Attribute `name`: A unique name (within the current (sub)manifest) for this
submanifest. It acts as a default for `revision` below. The same name can be
used for submanifests with different parent (sub)manifests.
Attribute `remote`: Name of a previously defined remote element.
If not supplied the remote given by the default element is used.
Attribute `project`: The manifest project name. The project's name is appended
onto its remote's fetch URL to generate the actual URL to configure the Git
remote with. The URL gets formed as:
${remote_fetch}/${project_name}.git
where ${remote_fetch} is the remote's fetch attribute and
${project_name} is the project's name attribute. The suffix ".git"
is always appended as repo assumes the upstream is a forest of
bare Git repositories. If the project has a parent element, its
name will be prefixed by the parent's.
The project name must match the name Gerrit knows, if Gerrit is
being used for code reviews.
`project` must not be empty, and may not be an absolute path or use "." or ".."
path components. It is always interpreted relative to the remote's fetch
settings, so if a different base path is needed, declare a different remote
with the new settings needed.
If not supplied the remote and project for this manifest will be used: `remote`
cannot be supplied.
Projects from a submanifest and its submanifests are added to the
submanifest::path:<path_prefix> group.
Attribute `manifest-name`: The manifest filename in the manifest project. If
not supplied, `default.xml` is used.
Attribute `revision`: Name of a Git branch (e.g. "main" or "refs/heads/main"),
tag (e.g. "refs/tags/stable"), or a commit hash. If not supplied, `name` is
used.
Attribute `path`: An optional path relative to the top directory
of the repo client where the submanifest repo client top directory
should be placed. If not supplied, `revision` is used.
`path` may not be an absolute path or use "." or ".." path components.
Attribute `groups`: List of additional groups to which all projects
in the included submanifest belong. This appends and recurses, meaning
all projects in submanifests carry all parent submanifest groups.
Same syntax as the corresponding element of `project`.
Attribute `default-groups`: The list of manifest groups to sync if no
`--groups=` parameter was specified at init. When that list is empty, use this
list instead of "default" as the list of groups to sync.
### Element project
One or more project elements may be specified. Each element
@ -336,6 +412,11 @@ against changes to the original manifest.
Attribute `path`: If specified, limit the change to projects checked out
at the specified path, rather than all projects with the given name.
Attribute `dest-path`: If specified, a path relative to the top directory
of the repo client where the Git working directory for this project
should be placed. This is used to move a project in the checkout by
overriding the existing `path` setting.
Attribute `groups`: List of additional groups to which this project
belongs. Same syntax as the corresponding element of `project`.
@ -345,6 +426,12 @@ project. Same syntax as the corresponding element of `project`.
Attribute `remote`: If specified, overrides the remote of the original
project. Same syntax as the corresponding element of `project`.
Attribute `dest-branch`: If specified, overrides the dest-branch of the original
project. Same syntax as the corresponding element of `project`.
Attribute `upstream`: If specified, overrides the upstream of the original
project. Same syntax as the corresponding element of `project`.
### Element annotation
Zero or more annotation elements may be specified as children of a
@ -432,6 +519,11 @@ same meaning as project's name attribute. See the
Attribute `remote`: Name of a previously defined remote element.
If not supplied the remote given by the default element is used.
Attribute `revision`: Name of the Git branch the manifest wants
to track for this superproject. If not supplied the revision given
by the remote element is used if applicable, else the default
element is used.
### Element contactinfo
***
@ -459,9 +551,12 @@ These restrictions are not enforced for [Local Manifests].
Attribute `groups`: List of additional groups to which all projects
in the included manifest belong. This appends and recurses, meaning
all projects in sub-manifests carry all parent include groups.
all projects in included manifests carry all parent include groups.
Same syntax as the corresponding element of `project`.
Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`)
to use as the default revision for all projects in the included manifest.
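As a rough illustration of the appending rule above (a sketch of the documented behavior, not repo's actual implementation), projects reached through nested includes accumulate the groups of every enclosing `<include>`:
```python
# Sketch of the documented group-appending rule; illustrative only.
def effective_groups(project_groups, include_groups_chain):
    """Append each enclosing include's groups onto the project's own groups."""
    groups = list(project_groups)
    for include_groups in include_groups_chain:
        groups.extend(g for g in include_groups if g not in groups)
    return groups


# A project declared with groups="default" inside an include with
# groups="vendor", itself inside an include with groups="board,vendor":
print(effective_groups(["default"], [["vendor"], ["board", "vendor"]]))
# -> ['default', 'vendor', 'board']
```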
## Local Manifests {#local-manifests}
Additional remotes and projects may be added through local manifest

View File

@ -143,23 +143,14 @@ internal processes for accessing the restricted keys.
***
```sh
# Set the gpg key directory.
$ export GNUPGHOME=~/.gnupg/repo/
# Verify the listed key is “Repo Maintainer”.
$ gpg -K
# Pick whatever branch or commit you want to tag.
$ r=main
# Pick the new version.
$ t=1.12.10
$ t=v2.30
# Create the signed tag.
$ git tag -s v$t -u "Repo Maintainer <repo@android.kernel.org>" -m "repo $t" $r
# Create a new signed tag with the current HEAD.
$ ./release/sign-tag.py $t
# Verify the signed tag.
$ git show v$t
$ git show $t
```
### Push the new release
@ -168,11 +159,11 @@ Once you're ready to make the release available to everyone, push it to the
`stable` branch.
Make sure you never push the tag itself to the stable branch!
Only push the commit -- notice the use of `$t` and `$r` below.
Only push the commit -- note the use of `^0` below.
```sh
$ git push https://gerrit-review.googlesource.com/git-repo v$t
$ git push https://gerrit-review.googlesource.com/git-repo $r:stable
$ git push https://gerrit-review.googlesource.com/git-repo $t
$ git push https://gerrit-review.googlesource.com/git-repo $t^0:stable
```
If something goes horribly wrong, you can force push the previous version to the
@ -195,7 +186,9 @@ You can create a short changelog using the command:
```sh
# If you haven't pushed to the stable branch yet, you can use origin/stable.
# If you have pushed, change origin/stable to the previous release tag.
$ git log --format="%h (%aN) %s" --no-merges origin/stable..$r
# This assumes "main" is the current tagged release. If it's newer, change it
# to the current release tag too.
$ git log --format="%h (%aN) %s" --no-merges origin/stable..main
```
## Project References
@ -208,85 +201,132 @@ Things in bold indicate stuff to take note of, but do not guarantee that we
still support them.
Things in italics are things we used to care about but probably don't anymore.
| Date | EOL | [Git][rel-g] | [Python][rel-p] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python |
|:--------:|:------------:|--------------|-----------------|-----------------------------------|-----|--------|
| Oct 2008 | *Oct 2013* | | 2.6.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Date | EOL | [Git][rel-g] | [Python][rel-p] | [SSH][rel-o] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python | SSH |
|:--------:|:------------:|:------------:|:---------------:|:------------:|-----------------------------------|-----|--------|-----|
| Apr 2008 | | | | 5.0 |
| Jun 2008 | | | | 5.1 |
| Oct 2008 | *Oct 2013* | | 2.6.0 | | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Dec 2008 | *Feb 2009* | | 3.0.0 |
| Feb 2009 | *Mar 2012* | | | Debian 5 Lenny | 1.5.6.5 | 2.5.2 |
| Jun 2009 | *Jun 2016* | | 3.1.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Feb 2010 | *Oct 2012* | 1.7.0 | | *10.04 Lucid* - *12.04 Precise* - 12.10 Quantal |
| Apr 2010 | *Apr 2015* | | | *10.04 Lucid* | 1.7.0.4 | 2.6.5 3.1.2 |
| Jul 2010 | *Dec 2019* | | **2.7.0** | 11.04 Natty - **<current>** |
| Oct 2010 | | | | 10.10 Maverick | 1.7.1 | 2.6.6 3.1.3 |
| Feb 2011 | *Feb 2016* | | | Debian 6 Squeeze | 1.7.2.5 | 2.6.6 3.1.3 |
| Apr 2011 | | | | 11.04 Natty | 1.7.4 | 2.7.1 3.2.0 |
| Oct 2011 | *Feb 2016* | | 3.2.0 | 11.04 Natty - 12.10 Quantal |
| Oct 2011 | | | | 11.10 Ocelot | 1.7.5.4 | 2.7.2 3.2.2 |
| Apr 2012 | *Apr 2019* | | | *12.04 Precise* | 1.7.9.5 | 2.7.3 3.2.3 |
| Sep 2012 | *Sep 2017* | | 3.3.0 | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | *Dec 2014* | 1.8.0 | | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | | | | 12.10 Quantal | 1.7.10.4 | 2.7.3 3.2.3 |
| Apr 2013 | | | | 13.04 Raring | 1.8.1.2 | 2.7.4 3.3.1 |
| May 2013 | *May 2018* | | | Debian 7 Wheezy | 1.7.10.4 | 2.7.3 3.2.3 |
| Oct 2013 | | | | 13.10 Saucy | 1.8.3.2 | 2.7.5 3.3.2 |
| Feb 2014 | *Dec 2014* | **1.9.0** | | **14.04 Trusty** |
| Mar 2014 | *Mar 2019* | | **3.4.0** | **14.04 Trusty** - 15.10 Wily / **Jessie** |
| Apr 2014 | **Apr 2022** | | | **14.04 Trusty** | 1.9.1 | 2.7.5 3.4.0 |
| Feb 2009 | | | | 5.2 |
| Feb 2009 | *Mar 2012* | | | | Debian 5 Lenny | 1.5.6.5 | 2.5.2 |
| Jun 2009 | *Jun 2016* | | 3.1.0 | | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Sep 2009 | | | | 5.3 | *10.04 Lucid* |
| Feb 2010 | *Oct 2012* | 1.7.0 | | | *10.04 Lucid* - *12.04 Precise* - 12.10 Quantal |
| Mar 2010 | | | | 5.4 |
| Apr 2010 | | | | 5.5 | 10.10 Maverick |
| Apr 2010 | *Apr 2015* | | | | *10.04 Lucid* | 1.7.0.4 | 2.6.5 3.1.2 | 5.3 |
| Jul 2010 | *Dec 2019* | | *2.7.0* | | 11.04 Natty - *<current>* |
| Aug 2010 | | | | 5.6 |
| Oct 2010 | | | | | 10.10 Maverick | 1.7.1 | 2.6.6 3.1.3 | 5.5 |
| Jan 2011 | | | | 5.7 |
| Feb 2011 | | | | 5.8 | 11.04 Natty |
| Feb 2011 | *Feb 2016* | | | | Debian 6 Squeeze | 1.7.2.5 | 2.6.6 3.1.3 |
| Apr 2011 | | | | | 11.04 Natty | 1.7.4 | 2.7.1 3.2.0 | 5.8 |
| Sep 2011 | | | | 5.9 | *12.04 Precise* |
| Oct 2011 | *Feb 2016* | | 3.2.0 | | 11.04 Natty - 12.10 Quantal |
| Oct 2011 | | | | | 11.10 Ocelot | 1.7.5.4 | 2.7.2 3.2.2 | 5.8 |
| Apr 2012 | | | | 6.0 | 12.10 Quantal |
| Apr 2012 | *Apr 2019* | | | | *12.04 Precise* | 1.7.9.5 | 2.7.3 3.2.3 | 5.9 |
| Aug 2012 | | | | 6.1 | 13.04 Raring |
| Sep 2012 | *Sep 2017* | | 3.3.0 | | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | *Dec 2014* | 1.8.0 | | | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | | | | | 12.10 Quantal | 1.7.10.4 | 2.7.3 3.2.3 | 6.0 |
| Mar 2013 | | | | 6.2 | 13.10 Saucy |
| Apr 2013 | | | | | 13.04 Raring | 1.8.1.2 | 2.7.4 3.3.1 | 6.1 |
| May 2013 | *May 2018* | | | | Debian 7 Wheezy | 1.7.10.4 | 2.7.3 3.2.3 |
| Sep 2013 | | | | 6.3 |
| Oct 2013 | | | | | 13.10 Saucy | 1.8.3.2 | 2.7.5 3.3.2 | 6.2 |
| Nov 2013 | | | | 6.4 |
| Jan 2014 | | | | 6.5 |
| Feb 2014 | *Dec 2014* | **1.9.0** | | | *14.04 Trusty* |
| Mar 2014 | *Mar 2019* | | *3.4.0* | | *14.04 Trusty* - 15.10 Wily / *Jessie* |
| Mar 2014 | | | | 6.6 | *14.04 Trusty* - 14.10 Utopic |
| Apr 2014 | *Apr 2022* | | | | *14.04 Trusty* | 1.9.1 | 2.7.5 3.4.0 | 6.6 |
| May 2014 | *Dec 2014* | 2.0.0 |
| Aug 2014 | *Dec 2014* | **2.1.0** | | 14.10 Utopic - 15.04 Vivid / **Jessie** |
| Oct 2014 | | | | 14.10 Utopic | 2.1.0 | 2.7.8 3.4.2 |
| Aug 2014 | *Dec 2014* | *2.1.0* | | | 14.10 Utopic - 15.04 Vivid / *Jessie* |
| Oct 2014 | | | | 6.7 | 15.04 Vivid |
| Oct 2014 | | | | | 14.10 Utopic | 2.1.0 | 2.7.8 3.4.2 | 6.6 |
| Nov 2014 | *Sep 2015* | 2.2.0 |
| Feb 2015 | *Sep 2015* | 2.3.0 |
| Mar 2015 | | | | 6.8 |
| Apr 2015 | *May 2017* | 2.4.0 |
| Apr 2015 | **Jun 2020** | | | **Debian 8 Jessie** | 2.1.4 | 2.7.9 3.4.2 |
| Apr 2015 | | | | 15.04 Vivid | 2.1.4 | 2.7.9 3.4.3 |
| Jul 2015 | *May 2017* | 2.5.0 | | 15.10 Wily |
| Apr 2015 | *Jun 2020* | | | | *Debian 8 Jessie* | 2.1.4 | 2.7.9 3.4.2 |
| Apr 2015 | | | | | 15.04 Vivid | 2.1.4 | 2.7.9 3.4.3 | 6.7 |
| Jul 2015 | *May 2017* | 2.5.0 | | | 15.10 Wily |
| Jul 2015 | | | | 6.9 | 15.10 Wily |
| Aug 2015 | | | | 7.0 |
| Aug 2015 | | | | 7.1 |
| Sep 2015 | *May 2017* | 2.6.0 |
| Sep 2015 | **Sep 2020** | | **3.5.0** | **16.04 Xenial** - 17.04 Zesty / **Stretch** |
| Oct 2015 | | | | 15.10 Wily | 2.5.0 | 2.7.9 3.4.3 |
| Jan 2016 | *Jul 2017* | **2.7.0** | | **16.04 Xenial** |
| Sep 2015 | *Sep 2020* | | *3.5.0* | | *16.04 Xenial* - 17.04 Zesty / *Stretch* |
| Oct 2015 | | | | | 15.10 Wily | 2.5.0 | 2.7.9 3.4.3 | 6.9 |
| Jan 2016 | *Jul 2017* | *2.7.0* | | | *16.04 Xenial* |
| Feb 2016 | | | | 7.2 | *16.04 Xenial* |
| Mar 2016 | *Jul 2017* | 2.8.0 |
| Apr 2016 | **Apr 2024** | | | **16.04 Xenial** | 2.7.4 | 2.7.11 3.5.1 |
| Jun 2016 | *Jul 2017* | 2.9.0 | | 16.10 Yakkety |
| Apr 2016 | *Apr 2024* | | | | *16.04 Xenial* | 2.7.4 | 2.7.11 3.5.1 | 7.2 |
| Jun 2016 | *Jul 2017* | 2.9.0 | | | 16.10 Yakkety |
| Jul 2016 | | | | 7.3 | 16.10 Yakkety |
| Sep 2016 | *Sep 2017* | 2.10.0 |
| Oct 2016 | | | | 16.10 Yakkety | 2.9.3 | 2.7.11 3.5.1 |
| Nov 2016 | *Sep 2017* | **2.11.0** | | 17.04 Zesty / **Stretch** |
| Dec 2016 | **Dec 2021** | | **3.6.0** | 17.10 Artful - **18.04 Bionic** - 18.10 Cosmic |
| Oct 2016 | | | | | 16.10 Yakkety | 2.9.3 | 2.7.11 3.5.1 | 7.3 |
| Nov 2016 | *Sep 2017* | *2.11.0* | | | 17.04 Zesty / *Stretch* |
| Dec 2016 | **Dec 2021** | | **3.6.0** | | 17.10 Artful - **18.04 Bionic** - 18.10 Cosmic |
| Dec 2016 | | | | 7.4 | 17.04 Zesty / *Debian 9 Stretch* |
| Feb 2017 | *Sep 2017* | 2.12.0 |
| Apr 2017 | | | | 17.04 Zesty | 2.11.0 | 2.7.13 3.5.3 |
| Mar 2017 | | | | 7.5 | 17.10 Artful |
| Apr 2017 | | | | | 17.04 Zesty | 2.11.0 | 2.7.13 3.5.3 | 7.4 |
| May 2017 | *May 2018* | 2.13.0 |
| Jun 2017 | **Jun 2022** | | | **Debian 9 Stretch** | 2.11.0 | 2.7.13 3.5.3 |
| Aug 2017 | *Dec 2019* | 2.14.0 | | 17.10 Artful |
| Jun 2017 | *Jun 2022* | | | | *Debian 9 Stretch* | 2.11.0 | 2.7.13 3.5.3 | 7.4 |
| Aug 2017 | *Dec 2019* | 2.14.0 | | | 17.10 Artful |
| Oct 2017 | *Dec 2019* | 2.15.0 |
| Oct 2017 | | | | 17.10 Artful | 2.14.1 | 2.7.14 3.6.3 |
| Oct 2017 | | | | 7.6 | **18.04 Bionic** |
| Oct 2017 | | | | | 17.10 Artful | 2.14.1 | 2.7.14 3.6.3 | 7.5 |
| Jan 2018 | *Dec 2019* | 2.16.0 |
| Apr 2018 | *Dec 2019* | 2.17.0 | | **18.04 Bionic** |
| Apr 2018 | **Apr 2028** | | | **18.04 Bionic** | 2.17.0 | 2.7.15 3.6.5 |
| Jun 2018 | *Dec 2019* | 2.18.0 |
| Jun 2018 | **Jun 2023** | | 3.7.0 | 19.04 Disco - **20.04 Focal** / **Buster** |
| Sep 2018 | *Dec 2019* | 2.19.0 | | 18.10 Cosmic |
| Oct 2018 | | | | 18.10 Cosmic | 2.19.1 | 2.7.15 3.6.6 |
| Dec 2018 | *Dec 2019* | **2.20.0** | | 19.04 Disco / **Buster** |
| Feb 2019 | *Dec 2019* | 2.21.0 |
| Apr 2019 | | | | 19.04 Disco | 2.20.1 | 2.7.16 3.7.3 |
| Apr 2018 | *Mar 2021* | **2.17.0** | | | **18.04 Bionic** |
| Apr 2018 | | | | 7.7 | 18.10 Cosmic |
| Apr 2018 | **Apr 2028** | | | | **18.04 Bionic** | 2.17.0 | 2.7.15 3.6.5 | 7.6 |
| Jun 2018 | *Mar 2021* | 2.18.0 |
| Jun 2018 | **Jun 2023** | | 3.7.0 | | 19.04 Disco - **Buster** |
| Aug 2018 | | | | 7.8 |
| Sep 2018 | *Mar 2021* | 2.19.0 | | | 18.10 Cosmic |
| Oct 2018 | | | | 7.9 | 19.04 Disco / **Buster** |
| Oct 2018 | | | | | 18.10 Cosmic | 2.19.1 | 2.7.15 3.6.6 | 7.7 |
| Dec 2018 | *Mar 2021* | **2.20.0** | | | 19.04 Disco - 19.10 Eoan / **Buster** |
| Feb 2019 | *Mar 2021* | 2.21.0 |
| Apr 2019 | | | | 8.0 | 19.10 Eoan |
| Apr 2019 | | | | | 19.04 Disco | 2.20.1 | 2.7.16 3.7.3 | 7.9 |
| Jun 2019 | | 2.22.0 |
| Jul 2019 | **Jul 2024** | | | **Debian 10 Buster** | 2.20.1 | 2.7.16 3.7.3 |
| Aug 2019 | | 2.23.0 |
| Oct 2019 | **Oct 2024** | | 3.8.0 |
| Oct 2019 | | | | 19.10 Eoan | 2.20.1 | 2.7.17 3.7.5 |
| Nov 2019 | | 2.24.0 |
| Jan 2020 | | 2.25.0 | | **20.04 Focal** |
| Apr 2020 | **Apr 2030** | | | **20.04 Focal** | 2.25.0 | 2.7.17 3.7.5 |
| Oct 2020 | **Oct 2025** | | 3.9.0 | 21.04 Hirsute / **Bullseye** |
| Dec 2020 | | 2.30.0 | | 21.04 Hirsute / **Bullseye** |
| Apr 2021 | *Jan 2022* | | | 21.04 Hirsute | 2.30.2 | 2.7.18 3.9.4 |
| Aug 2021 | **Aug 2026** | | | **Debian 11 Bullseye** | 2.30.2 | 2.7.18 3.9.2 |
| **Date** | **EOL** | **[Git][rel-g]** | **[Python][rel-p]** | **[Ubuntu][rel-u] / [Debian][rel-d]** | **Git** | **Python** |
| Jul 2019 | **Jul 2024** | | | | **Debian 10 Buster** | 2.20.1 | 2.7.16 3.7.3 | 7.9 |
| Aug 2019 | *Mar 2021* | 2.23.0 |
| Oct 2019 | **Oct 2024** | | 3.8.0 | | **20.04 Focal** - 20.10 Groovy |
| Oct 2019 | | | | 8.1 |
| Oct 2019 | | | | | 19.10 Eoan | 2.20.1 | 2.7.17 3.7.5 | 8.0 |
| Nov 2019 | *Mar 2021* | 2.24.0 |
| Jan 2020 | *Mar 2021* | 2.25.0 | | | **20.04 Focal** |
| Feb 2020 | | | | 8.2 | **20.04 Focal** |
| Mar 2020 | *Mar 2021* | 2.26.0 |
| Apr 2020 | **Apr 2030** | | | | **20.04 Focal** | 2.25.1 | 2.7.17 3.8.2 | 8.2 |
| May 2020 | *Mar 2021* | 2.27.0 | | | 20.10 Groovy |
| May 2020 | | | | 8.3 |
| Jul 2020 | *Mar 2021* | 2.28.0 |
| Sep 2020 | | | | 8.4 | 21.04 Hirsute / **Bullseye** |
| Oct 2020 | *Mar 2021* | 2.29.0 |
| Oct 2020 | | | | | 20.10 Groovy | 2.27.0 | 2.7.18 3.8.6 | 8.3 |
| Oct 2020 | **Oct 2025** | | 3.9.0 | | 21.04 Hirsute / **Bullseye** |
| Dec 2020 | *Mar 2021* | 2.30.0 | | | 21.04 Hirsute / **Bullseye** |
| Mar 2021 | | 2.31.0 |
| Mar 2021 | | | | 8.5 |
| Apr 2021 | | | | 8.6 |
| Apr 2021 | *Jan 2022* | | | | 21.04 Hirsute | 2.30.2 | 2.7.18 3.9.4 | 8.4 |
| Jun 2021 | | 2.32.0 |
| Aug 2021 | | 2.33.0 |
| Aug 2021 | | | | 8.7 |
| Aug 2021 | **Aug 2026** | | | | **Debian 11 Bullseye** | 2.30.2 | 2.7.18 3.9.2 | 8.4 |
| **Date** | **EOL** | **[Git][rel-g]** | **[Python][rel-p]** | **[SSH][rel-o]** | **[Ubuntu][rel-u] / [Debian][rel-d]** | **Git** | **Python** | **SSH** |
[contact]: ../README.md#contact
[rel-d]: https://en.wikipedia.org/wiki/Debian_version_history
[rel-g]: https://en.wikipedia.org/wiki/Git#Releases
[rel-o]: https://www.openssh.com/releasenotes.html
[rel-p]: https://en.wikipedia.org/wiki/History_of_Python#Table_of_versions
[rel-u]: https://en.wikipedia.org/wiki/Ubuntu_version_history#Table_of_versions
[example announcement]: https://groups.google.com/d/topic/repo-discuss/UGBNismWo1M/discussion

View File

@ -36,31 +36,33 @@ class Editor(object):
@classmethod
def _SelectEditor(cls):
e = os.getenv('GIT_EDITOR')
e = os.getenv("GIT_EDITOR")
if e:
return e
if cls.globalConfig:
e = cls.globalConfig.GetString('core.editor')
e = cls.globalConfig.GetString("core.editor")
if e:
return e
e = os.getenv('VISUAL')
e = os.getenv("VISUAL")
if e:
return e
e = os.getenv('EDITOR')
e = os.getenv("EDITOR")
if e:
return e
if os.getenv('TERM') == 'dumb':
if os.getenv("TERM") == "dumb":
print(
"""No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
Tried to fall back to vi but terminal is dumb. Please configure at
least one of these before using this command.""", file=sys.stderr)
least one of these before using this command.""", # noqa: E501
file=sys.stderr,
)
sys.exit(1)
return 'vi'
return "vi"
@classmethod
def EditString(cls, data):
@ -76,22 +78,23 @@ least one of these before using this command.""", file=sys.stderr)
EditorError: The editor failed to run.
"""
editor = cls._GetEditor()
if editor == ':':
if editor == ":":
return data
fd, path = tempfile.mkstemp()
try:
os.write(fd, data.encode('utf-8'))
os.write(fd, data.encode("utf-8"))
os.close(fd)
fd = None
if platform_utils.isWindows():
# Split on spaces, respecting quoted strings
import shlex
args = shlex.split(editor)
shell = False
elif re.compile("^.*[$ \t'].*$").match(editor):
args = [editor + ' "$@"', 'sh']
args = [editor + ' "$@"', "sh"]
shell = True
else:
args = [editor]
@ -101,14 +104,17 @@ least one of these before using this command.""", file=sys.stderr)
try:
rc = subprocess.Popen(args, shell=shell).wait()
except OSError as e:
raise EditorError('editor failed, %s: %s %s'
% (str(e), editor, path))
raise EditorError(
"editor failed, %s: %s %s" % (str(e), editor, path)
)
if rc != 0:
raise EditorError('editor failed with exit status %d: %s %s'
% (rc, editor, path))
raise EditorError(
"editor failed with exit status %d: %s %s"
% (rc, editor, path)
)
with open(path, mode='rb') as fd2:
return fd2.read().decode('utf-8')
with open(path, mode="rb") as fd2:
return fd2.read().decode("utf-8")
finally:
if fd:
os.close(fd)

View File

@ -14,23 +14,19 @@
class ManifestParseError(Exception):
"""Failed to parse the manifest file.
"""
"""Failed to parse the manifest file."""
class ManifestInvalidRevisionError(ManifestParseError):
"""The revision value in a project is incorrect.
"""
"""The revision value in a project is incorrect."""
class ManifestInvalidPathError(ManifestParseError):
"""A path used in <copyfile> or <linkfile> is incorrect.
"""
"""A path used in <copyfile> or <linkfile> is incorrect."""
class NoManifestException(Exception):
"""The required manifest does not exist.
"""
"""The required manifest does not exist."""
def __init__(self, path, reason):
super().__init__(path, reason)
@ -42,8 +38,7 @@ class NoManifestException(Exception):
class EditorError(Exception):
"""Unspecified error from the user's text editor.
"""
"""Unspecified error from the user's text editor."""
def __init__(self, reason):
super().__init__(reason)
@ -54,8 +49,7 @@ class EditorError(Exception):
class GitError(Exception):
"""Unspecified internal error from git.
"""
"""Unspecified internal error from git."""
def __init__(self, command):
super().__init__(command)
@ -66,8 +60,7 @@ class GitError(Exception):
class UploadError(Exception):
"""A bundle upload to Gerrit did not succeed.
"""
"""A bundle upload to Gerrit did not succeed."""
def __init__(self, reason):
super().__init__(reason)
@ -78,8 +71,7 @@ class UploadError(Exception):
class DownloadError(Exception):
"""Cannot download a repository.
"""
"""Cannot download a repository."""
def __init__(self, reason):
super().__init__(reason)
@ -90,8 +82,7 @@ class DownloadError(Exception):
class NoSuchProjectError(Exception):
"""A specified project does not exist in the work tree.
"""
"""A specified project does not exist in the work tree."""
def __init__(self, name=None):
super().__init__(name)
@ -99,13 +90,12 @@ class NoSuchProjectError(Exception):
def __str__(self):
if self.name is None:
return 'in current directory'
return "in current directory"
return self.name
class InvalidProjectGroupsError(Exception):
"""A specified project is not suitable for the specified groups
"""
"""A specified project is not suitable for the specified groups"""
def __init__(self, name=None):
super().__init__(name)
@ -113,7 +103,7 @@ class InvalidProjectGroupsError(Exception):
def __str__(self):
if self.name is None:
return 'in current directory'
return "in current directory"
return self.name

View File

@ -15,9 +15,9 @@
import json
import multiprocessing
TASK_COMMAND = 'command'
TASK_SYNC_NETWORK = 'sync-network'
TASK_SYNC_LOCAL = 'sync-local'
TASK_COMMAND = "command"
TASK_SYNC_NETWORK = "sync-network"
TASK_SYNC_LOCAL = "sync-local"
class EventLog(object):
@ -51,8 +51,16 @@ class EventLog(object):
self._log = []
self._parent = None
def Add(self, name, task_name, start, finish=None, success=None,
try_count=1, kind='RepoOp'):
def Add(
self,
name,
task_name,
start,
finish=None,
success=None,
try_count=1,
kind="RepoOp",
):
"""Add an event to the log.
Args:
@ -68,15 +76,15 @@ class EventLog(object):
A dictionary of the event added to the log.
"""
event = {
'id': (kind, _NextEventId()),
'name': name,
'task_name': task_name,
'start_time': start,
'try': try_count,
"id": (kind, _NextEventId()),
"name": name,
"task_name": task_name,
"start_time": start,
"try": try_count,
}
if self._parent:
event['parent'] = self._parent['id']
event["parent"] = self._parent["id"]
if success is not None or finish is not None:
self.FinishEvent(event, finish, success)
@ -100,15 +108,15 @@ class EventLog(object):
"""
event = self.Add(project.relpath, task_name, start, finish, success)
if event is not None:
event['project'] = project.name
event["project"] = project.name
if project.revisionExpr:
event['revision'] = project.revisionExpr
event["revision"] = project.revisionExpr
if project.remote.url:
event['project_url'] = project.remote.url
event["project_url"] = project.remote.url
if project.remote.fetchUrl:
event['remote_url'] = project.remote.fetchUrl
event["remote_url"] = project.remote.fetchUrl
try:
event['git_hash'] = project.GetCommitRevisionId()
event["git_hash"] = project.GetCommitRevisionId()
except Exception:
pass
return event
@ -122,7 +130,7 @@ class EventLog(object):
Returns:
status string.
"""
return 'pass' if success else 'fail'
return "pass" if success else "fail"
def FinishEvent(self, event, finish, success):
"""Finishes an incomplete event.
@ -135,8 +143,8 @@ class EventLog(object):
Returns:
A dictionary of the event added to the log.
"""
event['status'] = self.GetStatusString(success)
event['finish_time'] = finish
event["status"] = self.GetStatusString(success)
event["finish_time"] = finish
return event
def SetParent(self, event):
@ -153,14 +161,14 @@ class EventLog(object):
Args:
filename: The file to write the log to.
"""
with open(filename, 'w+') as f:
with open(filename, "w+") as f:
for e in self._log:
json.dump(e, f, sort_keys=True)
f.write('\n')
f.write("\n")
# An integer id that is unique across this invocation of the program.
_EVENT_ID = multiprocessing.Value('i', 1)
_EVENT_ID = multiprocessing.Value("i", 1)
def _NextEventId():

fetch.py — new file (49 lines)
View File

@ -0,0 +1,49 @@
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""This module contains functions used to fetch files from various sources."""
import subprocess
import sys
from urllib.parse import urlparse
from urllib.request import urlopen
def fetch_file(url, verbose=False):
"""Fetch a file from the specified source using the appropriate protocol.
Returns:
The contents of the file as bytes.
"""
scheme = urlparse(url).scheme
if scheme == "gs":
cmd = ["gsutil", "cat", url]
try:
result = subprocess.run(
cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, check=True
)
if result.stderr and verbose:
print(
'warning: non-fatal error running "gsutil": %s'
% result.stderr,
file=sys.stderr,
)
return result.stdout
except subprocess.CalledProcessError as e:
print(
'fatal: error running "gsutil": %s' % e.stderr, file=sys.stderr
)
sys.exit(1)
with urlopen(url) as f:
return f.read()
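A brief usage sketch for the new helper (URLs invented; `fetch` must be importable from repo's source tree): plain `http(s)://` URLs go through `urllib`, while `gs://` URLs shell out to the `gsutil` CLI:
```python
from fetch import fetch_file  # assumes repo's source dir is on sys.path

# Any URL urllib understands is fetched directly.
data = fetch_file("https://example.com/default.xml", verbose=True)
print("fetched %d bytes" % len(data))

# Google Cloud Storage URLs require an installed, authenticated gsutil.
# data = fetch_file("gs://my-bucket/default.xml")
```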

View File

@ -16,6 +16,7 @@ import functools
import os
import sys
import subprocess
from typing import Any, Optional
from error import GitError
from git_refs import HEAD
@ -23,7 +24,7 @@ import platform_utils
from repo_trace import REPO_TRACE, IsTrace, Trace
from wrapper import Wrapper
GIT = 'git'
GIT = "git"
# NB: These do not need to be kept in sync with the repo launcher script.
# These may be much newer as it allows the repo launcher to roll between
# different repo releases while source versions might require a newer git.
@ -35,7 +36,7 @@ GIT = 'git'
# git-1.7 is in (EOL) Ubuntu Precise. git-1.9 is in Ubuntu Trusty.
MIN_GIT_VERSION_SOFT = (1, 9, 1)
MIN_GIT_VERSION_HARD = (1, 7, 2)
GIT_DIR = 'GIT_DIR'
GIT_DIR = "GIT_DIR"
LAST_GITDIR = None
LAST_CWD = None
@ -46,17 +47,18 @@ class _GitCall(object):
def version_tuple(self):
ret = Wrapper().ParseGitVersion()
if ret is None:
print('fatal: unable to detect git version', file=sys.stderr)
print("fatal: unable to detect git version", file=sys.stderr)
sys.exit(1)
return ret
def __getattr__(self, name):
name = name.replace('_', '-')
name = name.replace("_", "-")
def fun(*cmdv):
command = [name]
command.extend(cmdv)
return GitCommand(None, command).Wait() == 0
return fun
@ -65,7 +67,7 @@ git = _GitCall()
def RepoSourceVersion():
"""Return the version of the repo.git tree."""
ver = getattr(RepoSourceVersion, 'version', None)
ver = getattr(RepoSourceVersion, "version", None)
# We avoid GitCommand so we don't run into circular deps -- GitCommand needs
# to initialize version info we provide.
@ -73,17 +75,22 @@ def RepoSourceVersion():
env = GitCommand._GetBasicEnv()
proj = os.path.dirname(os.path.abspath(__file__))
env[GIT_DIR] = os.path.join(proj, '.git')
result = subprocess.run([GIT, 'describe', HEAD], stdout=subprocess.PIPE,
stderr=subprocess.DEVNULL, encoding='utf-8',
env=env, check=False)
env[GIT_DIR] = os.path.join(proj, ".git")
result = subprocess.run(
[GIT, "describe", HEAD],
stdout=subprocess.PIPE,
stderr=subprocess.DEVNULL,
encoding="utf-8",
env=env,
check=False,
)
if result.returncode == 0:
ver = result.stdout.strip()
if ver.startswith('v'):
if ver.startswith("v"):
ver = ver[1:]
else:
ver = 'unknown'
setattr(RepoSourceVersion, 'version', ver)
ver = "unknown"
setattr(RepoSourceVersion, "version", ver)
return ver
@ -104,14 +111,14 @@ class UserAgent(object):
"""The operating system name."""
if self._os is None:
os_name = sys.platform
if os_name.lower().startswith('linux'):
os_name = 'Linux'
elif os_name == 'win32':
os_name = 'Win32'
elif os_name == 'cygwin':
os_name = 'Cygwin'
elif os_name == 'darwin':
os_name = 'Darwin'
if os_name.lower().startswith("linux"):
os_name = "Linux"
elif os_name == "win32":
os_name = "Win32"
elif os_name == "cygwin":
os_name = "Cygwin"
elif os_name == "darwin":
os_name = "Darwin"
self._os = os_name
return self._os
@ -121,11 +128,14 @@ class UserAgent(object):
"""The UA when connecting directly from repo."""
if self._repo_ua is None:
py_version = sys.version_info
self._repo_ua = 'git-repo/%s (%s) git/%s Python/%d.%d.%d' % (
self._repo_ua = "git-repo/%s (%s) git/%s Python/%d.%d.%d" % (
RepoSourceVersion(),
self.os,
git.version_tuple().full,
py_version.major, py_version.minor, py_version.micro)
py_version.major,
py_version.minor,
py_version.micro,
)
return self._repo_ua
@ -133,10 +143,11 @@ class UserAgent(object):
def git(self):
"""The UA when running git."""
if self._git_ua is None:
self._git_ua = 'git/%s (%s) git-repo/%s' % (
self._git_ua = "git/%s (%s) git-repo/%s" % (
git.version_tuple().full,
self.os,
RepoSourceVersion())
RepoSourceVersion(),
)
return self._git_ua
@ -144,21 +155,76 @@ class UserAgent(object):
user_agent = UserAgent()
def git_require(min_version, fail=False, msg=''):
def git_require(min_version, fail=False, msg=""):
git_version = git.version_tuple()
if min_version <= git_version:
return True
if fail:
need = '.'.join(map(str, min_version))
need = ".".join(map(str, min_version))
if msg:
msg = ' for ' + msg
print('fatal: git %s or later required%s' % (need, msg), file=sys.stderr)
msg = " for " + msg
print(
"fatal: git %s or later required%s" % (need, msg), file=sys.stderr
)
sys.exit(1)
return False
def _build_env(
_kwargs_only=(),
bare: Optional[bool] = False,
disable_editor: Optional[bool] = False,
ssh_proxy: Optional[Any] = None,
gitdir: Optional[str] = None,
objdir: Optional[str] = None,
):
"""Constucts an env dict for command execution."""
assert _kwargs_only == (), "_build_env only accepts keyword arguments."
env = GitCommand._GetBasicEnv()
if disable_editor:
env["GIT_EDITOR"] = ":"
if ssh_proxy:
env["REPO_SSH_SOCK"] = ssh_proxy.sock()
env["GIT_SSH"] = ssh_proxy.proxy
env["GIT_SSH_VARIANT"] = "ssh"
if "http_proxy" in env and "darwin" == sys.platform:
s = "'http.proxy=%s'" % (env["http_proxy"],)
p = env.get("GIT_CONFIG_PARAMETERS")
if p is not None:
s = p + " " + s
env["GIT_CONFIG_PARAMETERS"] = s
if "GIT_ALLOW_PROTOCOL" not in env:
env[
"GIT_ALLOW_PROTOCOL"
] = "file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc"
env["GIT_HTTP_USER_AGENT"] = user_agent.git
if objdir:
# Set to the place we want to save the objects.
env["GIT_OBJECT_DIRECTORY"] = objdir
alt_objects = os.path.join(gitdir, "objects") if gitdir else None
if alt_objects and os.path.realpath(alt_objects) != os.path.realpath(
objdir
):
# Allow git to search the original place in case of local or unique
# refs that git will attempt to resolve even if we aren't fetching
# them.
env["GIT_ALTERNATE_OBJECT_DIRECTORIES"] = alt_objects
if bare and gitdir is not None:
env[GIT_DIR] = gitdir
return env
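Purely as an illustration (not part of the change itself): a sketch of how the extracted helper behaves, assuming it is run from within repo's own source tree with `git` installed; the paths below are invented.
```python
from git_command import _build_env  # assumes repo's source dir is on sys.path

env = _build_env(
    disable_editor=True,
    bare=True,
    gitdir="/tmp/client/.repo/projects/foo.git",
    objdir="/tmp/client/.repo/project-objects/foo.git",
)
# The editor is neutralized, GIT_DIR points at the bare repo, new objects go
# to the shared object dir, and the gitdir's own objects stay reachable.
assert env["GIT_EDITOR"] == ":"
assert env["GIT_DIR"] == "/tmp/client/.repo/projects/foo.git"
assert env["GIT_OBJECT_DIRECTORY"] == "/tmp/client/.repo/project-objects/foo.git"
assert env["GIT_ALTERNATE_OBJECT_DIRECTORIES"].endswith("foo.git/objects")
```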
class GitCommand(object):
def __init__(self,
"""Wrapper around a single git invocation."""
def __init__(
self,
project,
cmdv,
bare=False,
@ -169,107 +235,112 @@ class GitCommand(object):
disable_editor=False,
ssh_proxy=None,
cwd=None,
gitdir=None):
env = self._GetBasicEnv()
if disable_editor:
env['GIT_EDITOR'] = ':'
if ssh_proxy:
env['REPO_SSH_SOCK'] = ssh_proxy.sock()
env['GIT_SSH'] = ssh_proxy.proxy
env['GIT_SSH_VARIANT'] = 'ssh'
if 'http_proxy' in env and 'darwin' == sys.platform:
s = "'http.proxy=%s'" % (env['http_proxy'],)
p = env.get('GIT_CONFIG_PARAMETERS')
if p is not None:
s = p + ' ' + s
env['GIT_CONFIG_PARAMETERS'] = s
if 'GIT_ALLOW_PROTOCOL' not in env:
env['GIT_ALLOW_PROTOCOL'] = (
'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
env['GIT_HTTP_USER_AGENT'] = user_agent.git
gitdir=None,
objdir=None,
):
if project:
if not cwd:
cwd = project.worktree
if not gitdir:
gitdir = project.gitdir
command = [GIT]
if bare:
if gitdir:
# Git on Windows wants its paths only using / for reliability.
if platform_utils.isWindows():
gitdir = gitdir.replace('\\', '/')
env[GIT_DIR] = gitdir
if objdir:
objdir = objdir.replace("\\", "/")
if gitdir:
gitdir = gitdir.replace("\\", "/")
env = _build_env(
disable_editor=disable_editor,
ssh_proxy=ssh_proxy,
objdir=objdir,
gitdir=gitdir,
bare=bare,
)
command = [GIT]
if bare:
cwd = None
command.append(cmdv[0])
# Need to use the --progress flag for fetch/clone so output will be
# displayed as by default git only does progress output if stderr is a TTY.
if sys.stderr.isatty() and cmdv[0] in ('fetch', 'clone'):
if '--progress' not in cmdv and '--quiet' not in cmdv:
command.append('--progress')
# displayed as by default git only does progress output if stderr is a
# TTY.
if sys.stderr.isatty() and cmdv[0] in ("fetch", "clone"):
if "--progress" not in cmdv and "--quiet" not in cmdv:
command.append("--progress")
command.extend(cmdv[1:])
stdin = subprocess.PIPE if input else None
stdout = subprocess.PIPE if capture_stdout else None
stderr = (subprocess.STDOUT if merge_output else
(subprocess.PIPE if capture_stderr else None))
stderr = (
subprocess.STDOUT
if merge_output
else (subprocess.PIPE if capture_stderr else None)
)
dbg = ""
if IsTrace():
global LAST_CWD
global LAST_GITDIR
dbg = ''
if cwd and LAST_CWD != cwd:
if LAST_GITDIR or LAST_CWD:
dbg += '\n'
dbg += ': cd %s\n' % cwd
dbg += "\n"
dbg += ": cd %s\n" % cwd
LAST_CWD = cwd
if GIT_DIR in env and LAST_GITDIR != env[GIT_DIR]:
if LAST_GITDIR or LAST_CWD:
dbg += '\n'
dbg += ': export GIT_DIR=%s\n' % env[GIT_DIR]
dbg += "\n"
dbg += ": export GIT_DIR=%s\n" % env[GIT_DIR]
LAST_GITDIR = env[GIT_DIR]
dbg += ': '
dbg += ' '.join(command)
if stdin == subprocess.PIPE:
dbg += ' 0<|'
if stdout == subprocess.PIPE:
dbg += ' 1>|'
if stderr == subprocess.PIPE:
dbg += ' 2>|'
elif stderr == subprocess.STDOUT:
dbg += ' 2>&1'
Trace('%s', dbg)
if "GIT_OBJECT_DIRECTORY" in env:
dbg += (
": export GIT_OBJECT_DIRECTORY=%s\n"
% env["GIT_OBJECT_DIRECTORY"]
)
if "GIT_ALTERNATE_OBJECT_DIRECTORIES" in env:
dbg += ": export GIT_ALTERNATE_OBJECT_DIRECTORIES=%s\n" % (
env["GIT_ALTERNATE_OBJECT_DIRECTORIES"]
)
dbg += ": "
dbg += " ".join(command)
if stdin == subprocess.PIPE:
dbg += " 0<|"
if stdout == subprocess.PIPE:
dbg += " 1>|"
if stderr == subprocess.PIPE:
dbg += " 2>|"
elif stderr == subprocess.STDOUT:
dbg += " 2>&1"
with Trace(
"git command %s %s with debug: %s", LAST_GITDIR, command, dbg
):
try:
p = subprocess.Popen(command,
p = subprocess.Popen(
command,
cwd=cwd,
env=env,
encoding='utf-8',
errors='backslashreplace',
encoding="utf-8",
errors="backslashreplace",
stdin=stdin,
stdout=stdout,
stderr=stderr)
stderr=stderr,
)
except Exception as e:
raise GitError('%s: %s' % (command[1], e))
raise GitError("%s: %s" % (command[1], e))
if ssh_proxy:
ssh_proxy.add_client(p)
self.process = p
if input:
if isinstance(input, str):
input = input.encode('utf-8')
p.stdin.write(input)
p.stdin.close()
try:
self.stdout, self.stderr = p.communicate()
self.stdout, self.stderr = p.communicate(input=input)
finally:
if ssh_proxy:
ssh_proxy.remove_client(p)
@ -282,13 +353,15 @@ class GitCommand(object):
This is guaranteed to be side-effect free.
"""
env = os.environ.copy()
for key in (REPO_TRACE,
for key in (
REPO_TRACE,
GIT_DIR,
'GIT_ALTERNATE_OBJECT_DIRECTORIES',
'GIT_OBJECT_DIRECTORY',
'GIT_WORK_TREE',
'GIT_GRAFT_FILE',
'GIT_INDEX_FILE'):
"GIT_ALTERNATE_OBJECT_DIRECTORIES",
"GIT_OBJECT_DIRECTORY",
"GIT_WORK_TREE",
"GIT_GRAFT_FILE",
"GIT_INDEX_FILE",
):
env.pop(key, None)
return env

View File

@ -22,6 +22,7 @@ import re
import ssl
import subprocess
import sys
from typing import Union
import urllib.error
import urllib.request
@ -33,9 +34,9 @@ from git_refs import R_CHANGES, R_HEADS, R_TAGS
# Prefix that is prepended to all the keys of SyncAnalysisState's data
# that is saved in the config.
SYNC_STATE_PREFIX = 'repo.syncstate.'
SYNC_STATE_PREFIX = "repo.syncstate."
ID_RE = re.compile(r'^[0-9a-f]{40}$')
ID_RE = re.compile(r"^[0-9a-f]{40}$")
REVIEW_CACHE = dict()
@ -57,21 +58,19 @@ def IsImmutable(rev):
def _key(name):
parts = name.split('.')
parts = name.split(".")
if len(parts) < 2:
return name.lower()
parts[0] = parts[0].lower()
parts[-1] = parts[-1].lower()
return '.'.join(parts)
return ".".join(parts)
class GitConfig(object):
_ForUser = None
_USER_CONFIG = '~/.gitconfig'
_ForSystem = None
_SYSTEM_CONFIG = '/etc/gitconfig'
_SYSTEM_CONFIG = "/etc/gitconfig"
@classmethod
def ForSystem(cls):
@ -82,13 +81,16 @@ class GitConfig(object):
@classmethod
def ForUser(cls):
if cls._ForUser is None:
cls._ForUser = cls(configfile=os.path.expanduser(cls._USER_CONFIG))
cls._ForUser = cls(configfile=cls._getUserConfig())
return cls._ForUser
@staticmethod
def _getUserConfig():
return os.path.expanduser("~/.gitconfig")
@classmethod
def ForRepository(cls, gitdir, defaults=None):
return cls(configfile=os.path.join(gitdir, 'config'),
defaults=defaults)
return cls(configfile=os.path.join(gitdir, "config"), defaults=defaults)
def __init__(self, configfile, defaults=None, jsonFile=None):
self.file = configfile
@ -102,18 +104,22 @@ class GitConfig(object):
if self._json is None:
self._json = os.path.join(
os.path.dirname(self.file),
'.repo_' + os.path.basename(self.file) + '.json')
".repo_" + os.path.basename(self.file) + ".json",
)
def ClearCache(self):
"""Clear the in-memory cache of config."""
self._cache_dict = None
def Has(self, name, include_defaults=True):
"""Return true if this configuration file has the key.
"""
"""Return true if this configuration file has the key."""
if _key(name) in self._cache:
return True
if include_defaults and self.defaults:
return self.defaults.Has(name, include_defaults=True)
return False
def GetInt(self, name):
def GetInt(self, name: str) -> Union[int, None]:
"""Returns an integer from the configuration file.
This follows the git config syntax.
@ -122,7 +128,7 @@ class GitConfig(object):
name: The key to lookup.
Returns:
None if the value was not defined, or is not a boolean.
None if the value was not defined, or is not an int.
Otherwise, the number itself.
"""
v = self.GetString(name)
@ -131,23 +137,28 @@ class GitConfig(object):
v = v.strip()
mult = 1
if v.endswith('k'):
if v.endswith("k"):
v = v[:-1]
mult = 1024
elif v.endswith('m'):
elif v.endswith("m"):
v = v[:-1]
mult = 1024 * 1024
elif v.endswith('g'):
elif v.endswith("g"):
v = v[:-1]
mult = 1024 * 1024 * 1024
base = 10
if v.startswith('0x'):
if v.startswith("0x"):
base = 16
try:
return int(v, base=base) * mult
except ValueError:
print(
f"warning: expected {name} to represent an integer, got {v} "
"instead",
file=sys.stderr,
)
return None
def DumpConfigDict(self):
@ -165,8 +176,10 @@ class GitConfig(object):
config_dict[key] = self.GetString(key)
return config_dict
def GetBoolean(self, name):
def GetBoolean(self, name: str) -> Union[bool, None]:
"""Returns a boolean from the configuration file.
Returns:
None: The value was not defined, or is not a boolean.
True: The value was set to true or yes.
False: The value was set to false or no.
@ -175,19 +188,23 @@ class GitConfig(object):
if v is None:
return None
v = v.lower()
if v in ('true', 'yes'):
if v in ("true", "yes"):
return True
if v in ('false', 'no'):
if v in ("false", "no"):
return False
print(
f"warning: expected {name} to represent a boolean, got {v} instead",
file=sys.stderr,
)
return None
def SetBoolean(self, name, value):
"""Set the truthy value for a key."""
if value is not None:
value = 'true' if value else 'false'
value = "true" if value else "false"
self.SetString(name, value)
def GetString(self, name, all_keys=False):
def GetString(self, name: str, all_keys: bool = False) -> Union[str, None]:
"""Get the first value for a key, or None if it is not defined.
This configuration file is used first, if the key is not
@ -215,8 +232,8 @@ class GitConfig(object):
"""Set the value(s) for a key.
Only this configuration file is modified.
The supplied value should be either a string,
or a list of strings (to store multiple values).
The supplied value should be either a string, or a list of strings (to
store multiple values), or None (to delete the key).
"""
key = _key(name)
@ -228,7 +245,7 @@ class GitConfig(object):
if value is None:
if old:
del self._cache[key]
self._do('--unset-all', name)
self._do("--unset-all", name)
elif isinstance(value, list):
if len(value) == 0:
@ -239,17 +256,16 @@ class GitConfig(object):
elif old != value:
self._cache[key] = list(value)
self._do('--replace-all', name, value[0])
self._do("--replace-all", name, value[0])
for i in range(1, len(value)):
self._do('--add', name, value[i])
self._do("--add", name, value[i])
elif len(old) != 1 or old[0] != value:
self._cache[key] = [value]
self._do('--replace-all', name, value)
self._do("--replace-all", name, value)
def GetRemote(self, name):
"""Get the remote.$name.* configuration values as an object.
"""
"""Get the remote.$name.* configuration values as an object."""
try:
r = self._remotes[name]
except KeyError:
@ -258,8 +274,7 @@ class GitConfig(object):
return r
def GetBranch(self, name):
"""Get the branch.$name.* configuration values as an object.
"""
"""Get the branch.$name.* configuration values as an object."""
try:
b = self._branches[name]
except KeyError:
@ -269,14 +284,20 @@ class GitConfig(object):
def GetSyncAnalysisStateData(self):
"""Returns data to be logged for the analysis of sync performance."""
return {k: v for k, v in self.DumpConfigDict().items() if k.startswith(SYNC_STATE_PREFIX)}
return {
k: v
for k, v in self.DumpConfigDict().items()
if k.startswith(SYNC_STATE_PREFIX)
}
def UpdateSyncAnalysisState(self, options, superproject_logging_data):
"""Update Config's SYNC_STATE_PREFIX* data with the latest sync data.
Args:
options: Options passed to sync returned from optparse. See _Options().
superproject_logging_data: A dictionary of superproject data that is to be logged.
options: Options passed to sync returned from optparse. See
_Options().
superproject_logging_data: A dictionary of superproject data that is
to be logged.
Returns:
SyncAnalysisState object.
@ -284,23 +305,20 @@ class GitConfig(object):
return SyncAnalysisState(self, options, superproject_logging_data)
def GetSubSections(self, section):
"""List all subsection names matching $section.*.*
"""
"""List all subsection names matching $section.*.*"""
return self._sections.get(section, set())
def HasSection(self, section, subsection=''):
"""Does at least one key in section.subsection exist?
"""
def HasSection(self, section, subsection=""):
"""Does at least one key in section.subsection exist?"""
try:
return subsection in self._sections[section]
except KeyError:
return False
def UrlInsteadOf(self, url):
"""Resolve any url.*.insteadof references.
"""
for new_url in self.GetSubSections('url'):
for old_url in self.GetString('url.%s.insteadof' % new_url, True):
"""Resolve any url.*.insteadof references."""
for new_url in self.GetSubSections("url"):
for old_url in self.GetString("url.%s.insteadof" % new_url, True):
if old_url is not None and url.startswith(old_url):
return new_url + url[len(old_url) :]
return url
@ -311,13 +329,13 @@ class GitConfig(object):
if d is None:
d = {}
for name in self._cache.keys():
p = name.split('.')
p = name.split(".")
if 2 == len(p):
section = p[0]
subsect = ''
subsect = ""
else:
section = p[0]
subsect = '.'.join(p[1:-1])
subsect = ".".join(p[1:-1])
if section not in d:
d[section] = set()
d[section].add(subsect)
@ -345,20 +363,19 @@ class GitConfig(object):
except OSError:
return None
try:
Trace(': parsing %s', self.file)
with Trace(": parsing %s", self.file):
with open(self._json) as fd:
return json.load(fd)
except (IOError, ValueError):
platform_utils.remove(self._json)
platform_utils.remove(self._json, missing_ok=True)
return None
def _SaveJson(self, cache):
try:
with open(self._json, 'w') as fd:
with open(self._json, "w") as fd:
json.dump(cache, fd, indent=2)
except (IOError, TypeError):
if os.path.exists(self._json):
platform_utils.remove(self._json)
platform_utils.remove(self._json, missing_ok=True)
def _ReadGit(self):
"""
@ -368,12 +385,13 @@ class GitConfig(object):
"""
c = {}
d = self._do('--null', '--list')
if d is None:
if not os.path.exists(self.file):
return c
for line in d.rstrip('\0').split('\0'):
if '\n' in line:
key, val = line.split('\n', 1)
d = self._do("--null", "--list")
for line in d.rstrip("\0").split("\0"):
if "\n" in line:
key, val = line.split("\n", 1)
else:
key = line
val = None
@ -387,25 +405,25 @@ class GitConfig(object):
def _do(self, *args):
if self.file == self._SYSTEM_CONFIG:
command = ['config', '--system', '--includes']
command = ["config", "--system", "--includes"]
else:
command = ['config', '--file', self.file, '--includes']
command = ["config", "--file", self.file, "--includes"]
command.extend(args)
p = GitCommand(None,
command,
capture_stdout=True,
capture_stderr=True)
p = GitCommand(None, command, capture_stdout=True, capture_stderr=True)
if p.Wait() == 0:
return p.stdout
else:
GitError('git config %s: %s' % (str(args), p.stderr))
raise GitError("git config %s: %s" % (str(args), p.stderr))
class RepoConfig(GitConfig):
"""User settings for repo itself."""
_USER_CONFIG = '~/.repoconfig/config'
@staticmethod
def _getUserConfig():
repo_config_dir = os.getenv("REPO_CONFIG_DIR", os.path.expanduser("~"))
return os.path.join(repo_config_dir, ".repoconfig/config")
class RefSpec(object):
@ -418,8 +436,8 @@ class RefSpec(object):
@classmethod
def FromString(cls, rs):
lhs, rhs = rs.split(':', 2)
if lhs.startswith('+'):
lhs, rhs = rs.split(":", 2)
if lhs.startswith("+"):
lhs = lhs[1:]
forced = True
else:
@ -435,7 +453,7 @@ class RefSpec(object):
if self.src:
if rev == self.src:
return True
if self.src.endswith('/*') and rev.startswith(self.src[:-1]):
if self.src.endswith("/*") and rev.startswith(self.src[:-1]):
return True
return False
@ -443,28 +461,28 @@ class RefSpec(object):
if self.dst:
if ref == self.dst:
return True
if self.dst.endswith('/*') and ref.startswith(self.dst[:-1]):
if self.dst.endswith("/*") and ref.startswith(self.dst[:-1]):
return True
return False
def MapSource(self, rev):
if self.src.endswith('/*'):
if self.src.endswith("/*"):
return self.dst[:-1] + rev[len(self.src) - 1 :]
return self.dst
def __str__(self):
s = ''
s = ""
if self.forced:
s += '+'
s += "+"
if self.src:
s += self.src
if self.dst:
s += ':'
s += ":"
s += self.dst
return s
URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
URI_ALL = re.compile(r"^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/")
def GetSchemeFromUrl(url):
@ -476,21 +494,25 @@ def GetSchemeFromUrl(url):
@contextlib.contextmanager
def GetUrlCookieFile(url, quiet):
if url.startswith('persistent-'):
if url.startswith("persistent-"):
try:
p = subprocess.Popen(
['git-remote-persistent-https', '-print_config', url],
stdin=subprocess.PIPE, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
["git-remote-persistent-https", "-print_config", url],
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
)
try:
cookieprefix = 'http.cookiefile='
proxyprefix = 'http.proxy='
cookieprefix = "http.cookiefile="
proxyprefix = "http.proxy="
cookiefile = None
proxy = None
for line in p.stdout:
line = line.strip().decode('utf-8')
line = line.strip().decode("utf-8")
if line.startswith(cookieprefix):
cookiefile = os.path.expanduser(line[len(cookieprefix):])
cookiefile = os.path.expanduser(
line[len(cookieprefix) :]
)
if line.startswith(proxyprefix):
proxy = line[len(proxyprefix) :]
# Leave subprocess open, as cookie file may be transient.
@ -500,8 +522,8 @@ def GetUrlCookieFile(url, quiet):
finally:
p.stdin.close()
if p.wait():
err_msg = p.stderr.read().decode('utf-8')
if ' -print_config' in err_msg:
err_msg = p.stderr.read().decode("utf-8")
if " -print_config" in err_msg:
pass # Persistent proxy doesn't support -print_config.
elif not quiet:
print(err_msg, file=sys.stderr)
@ -509,30 +531,30 @@ def GetUrlCookieFile(url, quiet):
if e.errno == errno.ENOENT:
pass # No persistent proxy.
raise
cookiefile = GitConfig.ForUser().GetString('http.cookiefile')
cookiefile = GitConfig.ForUser().GetString("http.cookiefile")
if cookiefile:
cookiefile = os.path.expanduser(cookiefile)
yield cookiefile, None
class Remote(object):
"""Configuration options related to a remote.
"""
"""Configuration options related to a remote."""
def __init__(self, config, name):
self._config = config
self.name = name
self.url = self._Get('url')
self.pushUrl = self._Get('pushurl')
self.review = self._Get('review')
self.projectname = self._Get('projectname')
self.fetch = list(map(RefSpec.FromString,
self._Get('fetch', all_keys=True)))
self.url = self._Get("url")
self.pushUrl = self._Get("pushurl")
self.review = self._Get("review")
self.projectname = self._Get("projectname")
self.fetch = list(
map(RefSpec.FromString, self._Get("fetch", all_keys=True))
)
self._review_url = None
def _InsteadOf(self):
globCfg = GitConfig.ForUser()
urlList = globCfg.GetSubSections('url')
urlList = globCfg.GetSubSections("url")
longest = ""
longestUrl = ""
@ -541,8 +563,9 @@ class Remote(object):
insteadOfList = globCfg.GetString(key, all_keys=True)
for insteadOf in insteadOfList:
if (self.url.startswith(insteadOf)
and len(insteadOf) > len(longest)):
if self.url.startswith(insteadOf) and len(insteadOf) > len(
longest
):
longest = insteadOf
longestUrl = url
@ -575,71 +598,78 @@ class Remote(object):
return None
u = self.review
if u.startswith('persistent-'):
u = u[len('persistent-'):]
if u.split(':')[0] not in ('http', 'https', 'sso', 'ssh'):
u = 'http://%s' % u
if u.endswith('/Gerrit'):
u = u[:len(u) - len('/Gerrit')]
if u.endswith('/ssh_info'):
u = u[:len(u) - len('/ssh_info')]
if not u.endswith('/'):
u += '/'
if u.startswith("persistent-"):
u = u[len("persistent-") :]
if u.split(":")[0] not in ("http", "https", "sso", "ssh"):
u = "http://%s" % u
if u.endswith("/Gerrit"):
u = u[: len(u) - len("/Gerrit")]
if u.endswith("/ssh_info"):
u = u[: len(u) - len("/ssh_info")]
if not u.endswith("/"):
u += "/"
http_url = u
if u in REVIEW_CACHE:
self._review_url = REVIEW_CACHE[u]
elif 'REPO_HOST_PORT_INFO' in os.environ:
host, port = os.environ['REPO_HOST_PORT_INFO'].split()
elif "REPO_HOST_PORT_INFO" in os.environ:
host, port = os.environ["REPO_HOST_PORT_INFO"].split()
self._review_url = self._SshReviewUrl(userEmail, host, port)
REVIEW_CACHE[u] = self._review_url
elif u.startswith('sso:') or u.startswith('ssh:'):
elif u.startswith("sso:") or u.startswith("ssh:"):
self._review_url = u # Assume it's right
REVIEW_CACHE[u] = self._review_url
elif 'REPO_IGNORE_SSH_INFO' in os.environ:
elif "REPO_IGNORE_SSH_INFO" in os.environ:
self._review_url = http_url
REVIEW_CACHE[u] = self._review_url
else:
try:
info_url = u + 'ssh_info'
info_url = u + "ssh_info"
if not validate_certs:
context = ssl._create_unverified_context()
info = urllib.request.urlopen(info_url, context=context).read()
info = urllib.request.urlopen(
info_url, context=context
).read()
else:
info = urllib.request.urlopen(info_url).read()
if info == b'NOT_AVAILABLE' or b'<' in info:
# If `info` contains '<', we assume the server gave us some sort
# of HTML response back, like maybe a login page.
if info == b"NOT_AVAILABLE" or b"<" in info:
# If `info` contains '<', we assume the server gave us
# some sort of HTML response back, like maybe a login
# page.
#
# Assume HTTP if SSH is not enabled or ssh_info doesn't look right.
# Assume HTTP if SSH is not enabled or ssh_info doesn't
# look right.
self._review_url = http_url
else:
info = info.decode('utf-8')
info = info.decode("utf-8")
host, port = info.split()
self._review_url = self._SshReviewUrl(userEmail, host, port)
self._review_url = self._SshReviewUrl(
userEmail, host, port
)
except urllib.error.HTTPError as e:
raise UploadError('%s: %s' % (self.review, str(e)))
raise UploadError("%s: %s" % (self.review, str(e)))
except urllib.error.URLError as e:
raise UploadError('%s: %s' % (self.review, str(e)))
raise UploadError("%s: %s" % (self.review, str(e)))
except HTTPException as e:
raise UploadError('%s: %s' % (self.review, e.__class__.__name__))
raise UploadError(
"%s: %s" % (self.review, e.__class__.__name__)
)
REVIEW_CACHE[u] = self._review_url
return self._review_url + self.projectname
def _SshReviewUrl(self, userEmail, host, port):
username = self._config.GetString('review.%s.username' % self.review)
username = self._config.GetString("review.%s.username" % self.review)
if username is None:
username = userEmail.split('@')[0]
return 'ssh://%s@%s:%s/' % (username, host, port)
username = userEmail.split("@")[0]
return "ssh://%s@%s:%s/" % (username, host, port)
def ToLocal(self, rev):
"""Convert a remote revision string to something we have locally.
"""
if self.name == '.' or IsId(rev):
"""Convert a remote revision string to something we have locally."""
if self.name == "." or IsId(rev):
return rev
if not rev.startswith('refs/'):
if not rev.startswith("refs/"):
rev = R_HEADS + rev
for spec in self.fetch:
@ -649,57 +679,55 @@ class Remote(object):
if not rev.startswith(R_HEADS):
return rev
raise GitError('%s: remote %s does not have %s' %
(self.projectname, self.name, rev))
raise GitError(
"%s: remote %s does not have %s"
% (self.projectname, self.name, rev)
)
def WritesTo(self, ref):
"""True if the remote stores to the tracking ref.
"""
"""True if the remote stores to the tracking ref."""
for spec in self.fetch:
if spec.DestMatches(ref):
return True
return False
def ResetFetch(self, mirror=False):
"""Set the fetch refspec to its default value.
"""
"""Set the fetch refspec to its default value."""
if mirror:
dst = 'refs/heads/*'
dst = "refs/heads/*"
else:
dst = 'refs/remotes/%s/*' % self.name
self.fetch = [RefSpec(True, 'refs/heads/*', dst)]
dst = "refs/remotes/%s/*" % self.name
self.fetch = [RefSpec(True, "refs/heads/*", dst)]
def Save(self):
"""Save this remote to the configuration.
"""
self._Set('url', self.url)
"""Save this remote to the configuration."""
self._Set("url", self.url)
if self.pushUrl is not None:
self._Set('pushurl', self.pushUrl + '/' + self.projectname)
self._Set("pushurl", self.pushUrl + "/" + self.projectname)
else:
self._Set('pushurl', self.pushUrl)
self._Set('review', self.review)
self._Set('projectname', self.projectname)
self._Set('fetch', list(map(str, self.fetch)))
self._Set("pushurl", self.pushUrl)
self._Set("review", self.review)
self._Set("projectname", self.projectname)
self._Set("fetch", list(map(str, self.fetch)))
def _Set(self, key, value):
key = 'remote.%s.%s' % (self.name, key)
key = "remote.%s.%s" % (self.name, key)
return self._config.SetString(key, value)
def _Get(self, key, all_keys=False):
key = 'remote.%s.%s' % (self.name, key)
key = "remote.%s.%s" % (self.name, key)
return self._config.GetString(key, all_keys=all_keys)
class Branch(object):
"""Configuration options related to a single branch.
"""
"""Configuration options related to a single branch."""
def __init__(self, config, name):
self._config = config
self.name = name
self.merge = self._Get('merge')
self.merge = self._Get("merge")
r = self._Get('remote')
r = self._Get("remote")
if r:
self.remote = self._config.GetRemote(r)
else:
@ -707,36 +735,34 @@ class Branch(object):
@property
def LocalMerge(self):
"""Convert the merge spec to a local name.
"""
"""Convert the merge spec to a local name."""
if self.remote and self.merge:
return self.remote.ToLocal(self.merge)
return None
def Save(self):
"""Save this branch back into the configuration.
"""
if self._config.HasSection('branch', self.name):
"""Save this branch back into the configuration."""
if self._config.HasSection("branch", self.name):
if self.remote:
self._Set('remote', self.remote.name)
self._Set("remote", self.remote.name)
else:
self._Set('remote', None)
self._Set('merge', self.merge)
self._Set("remote", None)
self._Set("merge", self.merge)
else:
with open(self._config.file, 'a') as fd:
with open(self._config.file, "a") as fd:
fd.write('[branch "%s"]\n' % self.name)
if self.remote:
fd.write('\tremote = %s\n' % self.remote.name)
fd.write("\tremote = %s\n" % self.remote.name)
if self.merge:
fd.write('\tmerge = %s\n' % self.merge)
fd.write("\tmerge = %s\n" % self.merge)
def _Set(self, key, value):
key = 'branch.%s.%s' % (self.name, key)
key = "branch.%s.%s" % (self.name, key)
return self._config.SetString(key, value)
def _Get(self, key, all_keys=False):
key = 'branch.%s.%s' % (self.name, key)
key = "branch.%s.%s" % (self.name, key)
return self._config.GetString(key, all_keys=all_keys)
@ -745,6 +771,7 @@ class SyncAnalysisState:
This object is versioned.
"""
def __init__(self, config, options, superproject_logging_data):
"""Initializes SyncAnalysisState.
@ -758,23 +785,30 @@ class SyncAnalysisState:
Args:
config: GitConfig object to store all options.
options: Options passed to sync returned from optparse. See _Options().
superproject_logging_data: A dictionary of superproject data that is to be logged.
options: Options passed to sync returned from optparse. See
_Options().
superproject_logging_data: A dictionary of superproject data that is
to be logged.
"""
self._config = config
now = datetime.datetime.utcnow()
self._Set('main.synctime', now.isoformat() + 'Z')
self._Set('main.version', '1')
self._Set('sys.argv', sys.argv)
self._Set("main.synctime", now.isoformat(timespec="microseconds") + "Z")
self._Set("main.version", "1")
self._Set("sys.argv", sys.argv)
for key, value in superproject_logging_data.items():
self._Set(f'superproject.{key}', value)
self._Set(f"superproject.{key}", value)
for key, value in options.__dict__.items():
self._Set(f'options.{key}', value)
self._Set(f"options.{key}", value)
config_items = config.DumpConfigDict().items()
EXTRACT_NAMESPACES = {'repo', 'branch', 'remote'}
self._SetDictionary({k: v for k, v in config_items
if not k.startswith(SYNC_STATE_PREFIX) and
k.split('.', 1)[0] in EXTRACT_NAMESPACES})
EXTRACT_NAMESPACES = {"repo", "branch", "remote"}
self._SetDictionary(
{
k: v
for k, v in config_items
if not k.startswith(SYNC_STATE_PREFIX)
and k.split(".", 1)[0] in EXTRACT_NAMESPACES
}
)
def _SetDictionary(self, data):
"""Save all key/value pairs of |data| dictionary.
@ -792,13 +826,14 @@ class SyncAnalysisState:
Args:
key: Name of the key.
value: |value| could be of any type. If it is 'bool', it will be saved
as a Boolean and for all other types, it will be saved as a String.
value: |value| could be of any type. If it is 'bool', it will be
saved as a Boolean and for all other types, it will be saved as
a String.
"""
if value is None:
return
sync_key = f'{SYNC_STATE_PREFIX}{key}'
sync_key = sync_key.replace('_', '')
sync_key = f"{SYNC_STATE_PREFIX}{key}"
sync_key = sync_key.replace("_", "")
if isinstance(value, str):
self._config.SetString(sync_key, value)
elif isinstance(value, bool):

View File

@ -16,14 +16,14 @@ import os
from repo_trace import Trace
import platform_utils
HEAD = 'HEAD'
R_CHANGES = 'refs/changes/'
R_HEADS = 'refs/heads/'
R_TAGS = 'refs/tags/'
R_PUB = 'refs/published/'
R_WORKTREE = 'refs/worktree/'
R_WORKTREE_M = R_WORKTREE + 'm/'
R_M = 'refs/remotes/m/'
HEAD = "HEAD"
R_CHANGES = "refs/changes/"
R_HEADS = "refs/heads/"
R_TAGS = "refs/tags/"
R_PUB = "refs/published/"
R_WORKTREE = "refs/worktree/"
R_WORKTREE_M = R_WORKTREE + "m/"
R_M = "refs/remotes/m/"
class GitRefs(object):
@ -42,7 +42,7 @@ class GitRefs(object):
try:
return self.all[name]
except KeyError:
return ''
return ""
def deleted(self, name):
if self._phyref is not None:
@ -60,32 +60,32 @@ class GitRefs(object):
self._EnsureLoaded()
return self._symref[name]
except KeyError:
return ''
return ""
def _EnsureLoaded(self):
if self._phyref is None or self._NeedUpdate():
self._LoadAll()
def _NeedUpdate(self):
Trace(': scan refs %s', self._gitdir)
with Trace(": scan refs %s", self._gitdir):
for name, mtime in self._mtime.items():
try:
if mtime != os.path.getmtime(os.path.join(self._gitdir, name)):
if mtime != os.path.getmtime(
os.path.join(self._gitdir, name)
):
return True
except OSError:
return True
return False
def _LoadAll(self):
Trace(': load refs %s', self._gitdir)
with Trace(": load refs %s", self._gitdir):
self._phyref = {}
self._symref = {}
self._mtime = {}
self._ReadPackedRefs()
self._ReadLoose('refs/')
self._ReadLoose("refs/")
self._ReadLoose1(os.path.join(self._gitdir, HEAD), HEAD)
scan = self._symref
@ -101,9 +101,9 @@ class GitRefs(object):
attempts += 1
def _ReadPackedRefs(self):
path = os.path.join(self._gitdir, 'packed-refs')
path = os.path.join(self._gitdir, "packed-refs")
try:
fd = open(path, 'r')
fd = open(path, "r")
mtime = os.path.getmtime(path)
except IOError:
return
@ -112,33 +112,33 @@ class GitRefs(object):
try:
for line in fd:
line = str(line)
if line[0] == '#':
if line[0] == "#":
continue
if line[0] == '^':
if line[0] == "^":
continue
line = line[:-1]
p = line.split(' ')
p = line.split(" ")
ref_id = p[0]
name = p[1]
self._phyref[name] = ref_id
finally:
fd.close()
self._mtime['packed-refs'] = mtime
self._mtime["packed-refs"] = mtime
def _ReadLoose(self, prefix):
base = os.path.join(self._gitdir, prefix)
for name in platform_utils.listdir(base):
p = os.path.join(base, name)
# We don't implement the full ref validation algorithm, just the simple
# rules that would show up in local filesystems.
# We don't implement the full ref validation algorithm, just the
# simple rules that would show up in local filesystems.
# https://git-scm.com/docs/git-check-ref-format
if name.startswith('.') or name.endswith('.lock'):
if name.startswith(".") or name.endswith(".lock"):
pass
elif platform_utils.isdir(p):
self._mtime[prefix] = os.path.getmtime(base)
self._ReadLoose(prefix + name + '/')
self._ReadLoose(prefix + name + "/")
else:
self._ReadLoose1(p, prefix + name)
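For illustration, a minimal standalone sketch of the loose-ref filter described in the comment above (the is_plausible_loose_ref() helper name is hypothetical, not part of repo):
def is_plausible_loose_ref(name):
    # Only the two simple filesystem-level rules mentioned above are applied:
    # skip hidden entries and git lock files.
    return not (name.startswith(".") or name.endswith(".lock"))
assert is_plausible_loose_ref("main")
assert not is_plausible_loose_ref(".hidden")
assert not is_plausible_loose_ref("feature.lock")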
@ -158,7 +158,7 @@ class GitRefs(object):
return
ref_id = ref_id[:-1]
if ref_id.startswith('ref: '):
if ref_id.startswith("ref: "):
self._symref[name] = ref_id[5:]
else:
self._phyref[name] = ref_id

git_superproject.py

@ -12,13 +12,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
"""Provide functionality to get all projects and their commit ids from Superproject.
"""Provide functionality to get projects and their commit ids from Superproject.
For more information on superproject, check out:
https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
Examples:
superproject = Superproject()
superproject = Superproject(manifest, name, remote, revision)
UpdateProjectsResult = superproject.UpdateProjectsRevisionId(projects)
"""
@ -31,11 +31,10 @@ from typing import NamedTuple
from git_command import git_require, GitCommand
from git_config import RepoConfig
from git_refs import R_HEADS
from manifest_xml import LOCAL_MANIFEST_GROUP_PREFIX
from git_refs import GitRefs
_SUPERPROJECT_GIT_NAME = 'superproject.git'
_SUPERPROJECT_MANIFEST_NAME = 'superproject_override.xml'
_SUPERPROJECT_GIT_NAME = "superproject.git"
_SUPERPROJECT_MANIFEST_NAME = "superproject_override.xml"
class SyncResult(NamedTuple):
@ -72,36 +71,60 @@ class Superproject(object):
lookup of commit ids for all projects. It contains _project_commit_ids which
is a dictionary with project/commit id entries.
"""
def __init__(self, manifest, repodir, git_event_log,
superproject_dir='exp-superproject', quiet=False, print_messages=False):
def __init__(
self,
manifest,
name,
remote,
revision,
superproject_dir="exp-superproject",
):
"""Initializes superproject.
Args:
manifest: A Manifest object that is to be written to a file.
repodir: Path to the .repo/ dir for holding all internal checkout state.
It must be in the top directory of the repo client checkout.
git_event_log: A git trace2 event log to log events.
superproject_dir: Relative path under |repodir| to checkout superproject.
quiet: If True then only print the progress messages.
print_messages: if True then print error/warning messages.
name: The unique name of the superproject
remote: The RemoteSpec for the remote.
revision: The name of the git branch to track.
superproject_dir: Relative path under |manifest.subdir| to checkout
superproject.
"""
self._project_commit_ids = None
self._manifest = manifest
self._git_event_log = git_event_log
self._quiet = quiet
self._print_messages = print_messages
self._branch = self._GetBranch()
self._repodir = os.path.abspath(repodir)
self.name = name
self.remote = remote
self.revision = self._branch = revision
self._repodir = manifest.repodir
self._superproject_dir = superproject_dir
self._superproject_path = os.path.join(self._repodir, superproject_dir)
self._manifest_path = os.path.join(self._superproject_path,
_SUPERPROJECT_MANIFEST_NAME)
git_name = ''
if self._manifest.superproject:
remote_name = self._manifest.superproject['remote'].name
git_name = hashlib.md5(remote_name.encode('utf8')).hexdigest() + '-'
self._superproject_path = manifest.SubmanifestInfoDir(
manifest.path_prefix, superproject_dir
)
self._manifest_path = os.path.join(
self._superproject_path, _SUPERPROJECT_MANIFEST_NAME
)
git_name = hashlib.md5(remote.name.encode("utf8")).hexdigest() + "-"
self._remote_url = remote.url
self._work_git_name = git_name + _SUPERPROJECT_GIT_NAME
self._work_git = os.path.join(self._superproject_path, self._work_git_name)
self._work_git = os.path.join(
self._superproject_path, self._work_git_name
)
# The following are command arguments, rather than superproject
# attributes, and were included here originally. They should eventually
# become arguments that are passed down from the public methods, instead
# of being treated as attributes.
self._git_event_log = None
self._quiet = False
self._print_messages = False
def SetQuiet(self, value):
"""Set the _quiet attribute."""
self._quiet = value
def SetPrintMessages(self, value):
"""Set the _print_messages attribute."""
self._print_messages = value
@property
def project_commit_ids(self):
@ -111,32 +134,30 @@ class Superproject(object):
@property
def manifest_path(self):
"""Returns the manifest path if the path exists or None."""
return self._manifest_path if os.path.exists(self._manifest_path) else None
return (
self._manifest_path if os.path.exists(self._manifest_path) else None
)
def _GetBranch(self):
"""Returns the branch name for getting the approved manifest."""
p = self._manifest.manifestProject
b = p.GetBranch(p.CurrentBranch)
if not b:
return None
branch = b.merge
if branch and branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
return branch
def _LogMessage(self, message):
def _LogMessage(self, fmt, *inputs):
"""Logs message to stderr and _git_event_log."""
message = f"{self._LogMessagePrefix()} {fmt.format(*inputs)}"
if self._print_messages:
print(message, file=sys.stderr)
self._git_event_log.ErrorEvent(message, f'{message}')
self._git_event_log.ErrorEvent(message, fmt)
def _LogError(self, message):
def _LogMessagePrefix(self):
"""Returns the prefix string to be logged in each log message"""
return (
f"repo superproject branch: {self._branch} url: {self._remote_url}"
)
def _LogError(self, fmt, *inputs):
"""Logs error message to stderr and _git_event_log."""
self._LogMessage(f'repo superproject error: {message}')
self._LogMessage(f"error: {fmt}", *inputs)
def _LogWarning(self, message):
def _LogWarning(self, fmt, *inputs):
"""Logs warning message to stderr and _git_event_log."""
self._LogMessage(f'repo superproject warning: {message}')
self._LogMessage(f"warning: {fmt}", *inputs)
def _Init(self):
"""Sets up a local Git repository to get a copy of a superproject.
@ -147,48 +168,84 @@ class Superproject(object):
if not os.path.exists(self._superproject_path):
os.mkdir(self._superproject_path)
if not self._quiet and not os.path.exists(self._work_git):
print('%s: Performing initial setup for superproject; this might take '
'several minutes.' % self._work_git)
cmd = ['init', '--bare', self._work_git_name]
p = GitCommand(None,
print(
"%s: Performing initial setup for superproject; this might "
"take several minutes." % self._work_git
)
cmd = ["init", "--bare", self._work_git_name]
p = GitCommand(
None,
cmd,
cwd=self._superproject_path,
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
retval = p.Wait()
if retval:
self._LogWarning(f'git init call failed, command: git {cmd}, '
f'return code: {retval}, stderr: {p.stderr}')
self._LogWarning(
"git init call failed, command: git {}, "
"return code: {}, stderr: {}",
cmd,
retval,
p.stderr,
)
return False
return True
def _Fetch(self, url):
"""Fetches a local copy of a superproject for the manifest based on url.
def _Fetch(self):
"""Fetches a superproject for the manifest based on |_remote_url|.
Args:
url: superproject's url.
This runs git fetch, which stores a local copy of the superproject.
Returns:
True if fetch is successful, or False.
"""
if not os.path.exists(self._work_git):
self._LogWarning(f'git fetch missing directory: {self._work_git}')
self._LogWarning("git fetch missing directory: {}", self._work_git)
return False
if not git_require((2, 28, 0)):
self._LogWarning('superproject requires a git version 2.28 or later')
self._LogWarning(
"superproject requires a git version 2.28 or later"
)
return False
cmd = ['fetch', url, '--depth', '1', '--force', '--no-tags', '--filter', 'blob:none']
cmd = [
"fetch",
self._remote_url,
"--depth",
"1",
"--force",
"--no-tags",
"--filter",
"blob:none",
]
# Check if there is a local ref that we can pass to --negotiation-tip.
# If this is the first fetch, it does not exist yet.
# We use --negotiation-tip to speed up the fetch. Superproject branches
# do not share commits. So this lets git know it only needs to send
# commits reachable from the specified local refs.
rev_commit = GitRefs(self._work_git).get(f"refs/heads/{self.revision}")
if rev_commit:
cmd.extend(["--negotiation-tip", rev_commit])
if self._branch:
cmd += [self._branch + ':' + self._branch]
p = GitCommand(None,
cmd += [self._branch + ":" + self._branch]
p = GitCommand(
None,
cmd,
cwd=self._work_git,
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
retval = p.Wait()
if retval:
self._LogWarning(f'git fetch call failed, command: git {cmd}, '
f'return code: {retval}, stderr: {p.stderr}')
self._LogWarning(
"git fetch call failed, command: git {}, "
"return code: {}, stderr: {}",
cmd,
retval,
p.stderr,
)
return False
return True
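To make the comment about --negotiation-tip concrete, here is a small sketch (placeholder URL, branch, and SHA) that assembles the same shape of fetch command _Fetch() hands to GitCommand:
remote_url = "<superproject-remote-url>"  # placeholder
revision = "main"                         # placeholder branch name
local_tip = "2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea"  # placeholder SHA of refs/heads/main
cmd = ["fetch", remote_url, "--depth", "1", "--force", "--no-tags",
       "--filter", "blob:none"]
if local_tip:
    # Superproject branches do not share history, so advertising the local
    # tip limits negotiation to commits reachable from that ref.
    cmd += ["--negotiation-tip", local_tip]
cmd += [f"{revision}:{revision}"]
print("git " + " ".join(cmd))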
@ -201,80 +258,103 @@ class Superproject(object):
data: data returned from 'git ls-tree ...', otherwise None.
"""
if not os.path.exists(self._work_git):
self._LogWarning(f'git ls-tree missing directory: {self._work_git}')
self._LogWarning(
"git ls-tree missing directory: {}", self._work_git
)
return None
data = None
branch = 'HEAD' if not self._branch else self._branch
cmd = ['ls-tree', '-z', '-r', branch]
branch = "HEAD" if not self._branch else self._branch
cmd = ["ls-tree", "-z", "-r", branch]
p = GitCommand(None,
p = GitCommand(
None,
cmd,
cwd=self._work_git,
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
retval = p.Wait()
if retval == 0:
data = p.stdout
else:
self._LogWarning(f'git ls-tree call failed, command: git {cmd}, '
f'return code: {retval}, stderr: {p.stderr}')
self._LogWarning(
"git ls-tree call failed, command: git {}, "
"return code: {}, stderr: {}",
cmd,
retval,
p.stderr,
)
return data
def Sync(self):
def Sync(self, git_event_log):
"""Gets a local copy of a superproject for the manifest.
Args:
git_event_log: an EventLog, for git tracing.
Returns:
SyncResult
"""
self._git_event_log = git_event_log
if not self._manifest.superproject:
self._LogWarning(f'superproject tag is not defined in manifest: '
f'{self._manifest.manifestFile}')
self._LogWarning(
"superproject tag is not defined in manifest: {}",
self._manifest.manifestFile,
)
return SyncResult(False, False)
print('NOTICE: --use-superproject is in beta; report any issues to the '
'address described in `repo version`', file=sys.stderr)
_PrintBetaNotice()
should_exit = True
url = self._manifest.superproject['remote'].url
if not url:
self._LogWarning(f'superproject URL is not defined in manifest: '
f'{self._manifest.manifestFile}')
if not self._remote_url:
self._LogWarning(
"superproject URL is not defined in manifest: {}",
self._manifest.manifestFile,
)
return SyncResult(False, should_exit)
if not self._Init():
return SyncResult(False, should_exit)
if not self._Fetch(url):
if not self._Fetch():
return SyncResult(False, should_exit)
if not self._quiet:
print('%s: Initial setup for superproject completed.' % self._work_git)
print(
"%s: Initial setup for superproject completed." % self._work_git
)
return SyncResult(True, False)
def _GetAllProjectsCommitIds(self):
"""Get commit ids for all projects from superproject and save them in _project_commit_ids.
"""Get commit ids for all projects from superproject and save them.
Commit ids are saved in _project_commit_ids.
Returns:
CommitIdsResult
"""
sync_result = self.Sync()
sync_result = self.Sync(self._git_event_log)
if not sync_result.success:
return CommitIdsResult(None, sync_result.fatal)
data = self._LsTree()
if not data:
self._LogWarning(f'warning: git ls-tree failed to return data for manifest: '
f'{self._manifest.manifestFile}')
self._LogWarning(
"git ls-tree failed to return data for manifest: {}",
self._manifest.manifestFile,
)
return CommitIdsResult(None, True)
# Parse lines like the following to select lines starting with '160000' and
# build a dictionary with project path (last element) and its commit id (3rd element).
# Parse lines like the following to select lines starting with '160000'
# and build a dictionary with project path (last element) and its commit
# id (3rd element).
#
# 160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00
# 120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00
# 120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00 # noqa: E501
commit_ids = {}
for line in data.split('\x00'):
for line in data.split("\x00"):
ls_data = line.split(None, 3)
if not ls_data:
break
if ls_data[0] == '160000':
if ls_data[0] == "160000":
commit_ids[ls_data[3]] = ls_data[2]
self._project_commit_ids = commit_ids
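A self-contained sketch of the parsing above, reusing the sample lines quoted in the comment; only gitlink ('160000') entries contribute path -> commit id pairs:
sample = (
    "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
    "120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00"
)
commit_ids = {}
for line in sample.split("\x00"):
    ls_data = line.split(None, 3)
    if not ls_data:
        break
    if ls_data[0] == "160000":
        commit_ids[ls_data[3]] = ls_data[2]
print(commit_ids)  # {'art': '2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea'}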
@ -284,18 +364,23 @@ class Superproject(object):
"""Writes manifest to a file.
Returns:
manifest_path: Path name of the file into which the manifest is written, otherwise None.
manifest_path: Path name of the file into which the manifest is
written, otherwise None.
"""
if not os.path.exists(self._superproject_path):
self._LogWarning(f'missing superproject directory: {self._superproject_path}')
self._LogWarning(
"missing superproject directory: {}", self._superproject_path
)
return None
manifest_str = self._manifest.ToXml(groups=self._manifest.GetGroupsStr()).toxml()
manifest_str = self._manifest.ToXml(
groups=self._manifest.GetGroupsStr(), omit_local=True
).toxml()
manifest_path = self._manifest_path
try:
with open(manifest_path, 'w', encoding='utf-8') as fp:
with open(manifest_path, "w", encoding="utf-8") as fp:
fp.write(manifest_str)
except IOError as e:
self._LogError(f'cannot write manifest to : {manifest_path} {e}')
self._LogError("cannot write manifest to : {} {}", manifest_path, e)
return None
return manifest_path
@ -317,17 +402,19 @@ class Superproject(object):
if project.revisionId:
return True
# Skip the project if it comes from the local manifest.
return any(s.startswith(LOCAL_MANIFEST_GROUP_PREFIX) for s in project.groups)
return project.manifest.IsFromLocalManifest(project)
def UpdateProjectsRevisionId(self, projects):
def UpdateProjectsRevisionId(self, projects, git_event_log):
"""Update revisionId of every project in projects with the commit id.
Args:
projects: List of projects whose revisionId needs to be updated.
projects: a list of projects whose revisionId needs to be updated.
git_event_log: an EventLog, for git tracing.
Returns:
UpdateProjectsResult
"""
self._git_event_log = git_event_log
commit_ids_result = self._GetAllProjectsCommitIds()
commit_ids = commit_ids_result.commit_ids
if not commit_ids:
@ -345,8 +432,12 @@ class Superproject(object):
# If superproject doesn't have a commit id for a project, then report an
# error event and continue as if superproject were not in use.
if projects_missing_commit_ids:
self._LogWarning(f'please file a bug using {self._manifest.contactinfo.bugurl} '
f'to report missing commit_ids for: {projects_missing_commit_ids}')
self._LogWarning(
"please file a bug using {} to report missing "
"commit_ids for: {}",
self._manifest.contactinfo.bugurl,
projects_missing_commit_ids,
)
return UpdateProjectsResult(None, False)
for project in projects:
@ -357,89 +448,109 @@ class Superproject(object):
return UpdateProjectsResult(manifest_path, False)
@functools.lru_cache(maxsize=10)
def _PrintBetaNotice():
"""Print the notice of beta status."""
print(
"NOTICE: --use-superproject is in beta; report any issues to the "
"address described in `repo version`",
file=sys.stderr,
)
@functools.lru_cache(maxsize=None)
def _UseSuperprojectFromConfiguration():
"""Returns the user choice of whether to use superproject."""
user_cfg = RepoConfig.ForUser()
time_now = int(time.time())
user_value = user_cfg.GetBoolean('repo.superprojectChoice')
user_value = user_cfg.GetBoolean("repo.superprojectChoice")
if user_value is not None:
user_expiration = user_cfg.GetInt('repo.superprojectChoiceExpire')
if user_expiration is not None and (user_expiration <= 0 or user_expiration >= time_now):
# TODO(b/190688390) - Remove prompt when we are comfortable with the new
# default value.
user_expiration = user_cfg.GetInt("repo.superprojectChoiceExpire")
if (
user_expiration is None
or user_expiration <= 0
or user_expiration >= time_now
):
# TODO(b/190688390) - Remove prompt when we are comfortable with the
# new default value.
if user_value:
print(('You are currently enrolled in Git submodules experiment '
'(go/android-submodules-quickstart). Use --no-use-superproject '
'to override.\n'), file=sys.stderr)
print(
(
"You are currently enrolled in Git submodules "
"experiment (go/android-submodules-quickstart). Use "
"--no-use-superproject to override.\n"
),
file=sys.stderr,
)
else:
print(('You are not currently enrolled in Git submodules experiment '
'(go/android-submodules-quickstart). Use --use-superproject '
'to override.\n'), file=sys.stderr)
print(
(
"You are not currently enrolled in Git submodules "
"experiment (go/android-submodules-quickstart). Use "
"--use-superproject to override.\n"
),
file=sys.stderr,
)
return user_value
# We don't have an unexpired choice, ask for one.
system_cfg = RepoConfig.ForSystem()
system_value = system_cfg.GetBoolean('repo.superprojectChoice')
system_value = system_cfg.GetBoolean("repo.superprojectChoice")
if system_value:
# The system configuration is proposing that we should enable the
# use of superproject. Present this to user for confirmation if we
# are on a TTY, or, when we are not on a TTY, accept the system
# default for this time only.
# use of superproject. Treat the user as enrolled for two weeks.
#
# TODO(b/190688390) - Remove prompt when we are comfortable with the new
# default value.
prompt = ('Repo can now use Git submodules (go/android-submodules-quickstart) '
'instead of manifests to represent the state of the Android '
'superproject, which results in faster syncs and better atomicity.\n\n')
if sys.stdout.isatty():
prompt += 'Would you like to opt in for two weeks (y/N)? '
response = input(prompt).lower()
userchoice = True
time_choiceexpire = time_now + (86400 * 14)
if response in ('y', 'yes'):
userchoice = True
elif response in ('a', 'always'):
userchoice = True
time_choiceexpire = 0
elif response == 'never':
userchoice = False
time_choiceexpire = 0
elif response in ('n', 'no'):
userchoice = False
else:
# Unrecognized user response, assume the intention was no, but
# only for 2 hours instead of 2 weeks to balance between not
# being overly pushy while still retain the opportunity to
# enroll.
userchoice = False
time_choiceexpire = time_now + 7200
user_cfg.SetString('repo.superprojectChoiceExpire', str(time_choiceexpire))
user_cfg.SetBoolean('repo.superprojectChoice', userchoice)
return userchoice
else:
print('Accepting once since we are not on a TTY', file=sys.stderr)
user_cfg.SetString(
"repo.superprojectChoiceExpire", str(time_choiceexpire)
)
user_cfg.SetBoolean("repo.superprojectChoice", userchoice)
print(
"You are automatically enrolled in Git submodules experiment "
"(go/android-submodules-quickstart) for another two weeks.\n",
file=sys.stderr,
)
return True
# For all other cases, we would not use superproject by default.
return False
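A minimal sketch of the choice/expiry bookkeeping above, assuming only the two user config keys it reads (repo.superprojectChoice and repo.superprojectChoiceExpire); an expiration of 0 or less means the choice never expires:
import time
def choice_is_current(choice, expiration, now=None):
    # Mirrors the check above: reuse a stored choice while it has not expired.
    if choice is None:
        return False
    if now is None:
        now = int(time.time())
    return expiration is None or expiration <= 0 or expiration >= now
print(choice_is_current(True, 0))                              # True: never expires
print(choice_is_current(True, int(time.time()) + 86400 * 14))  # True: two weeks out
print(choice_is_current(True, 1))                              # False: expired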
def PrintMessages(opt, manifest):
"""Returns a boolean if error/warning messages are to be printed."""
return opt.use_superproject is not None or manifest.superproject
def PrintMessages(use_superproject, manifest):
"""Returns a boolean if error/warning messages are to be printed.
Args:
use_superproject: option value from optparse.
manifest: manifest to use.
"""
return use_superproject is not None or bool(manifest.superproject)
def UseSuperproject(opt, manifest):
"""Returns a boolean if use-superproject option is enabled."""
def UseSuperproject(use_superproject, manifest):
"""Returns a boolean if use-superproject option is enabled.
if opt.use_superproject is not None:
return opt.use_superproject
Args:
use_superproject: option value from optparse.
manifest: manifest to use.
Returns:
Whether the superproject should be used.
"""
if not manifest.superproject:
# This (sub) manifest does not have a superproject definition.
return False
elif use_superproject is not None:
return use_superproject
else:
client_value = manifest.manifestProject.config.GetBoolean('repo.superproject')
client_value = manifest.manifestProject.use_superproject
if client_value is not None:
return client_value
else:
elif manifest.superproject:
return _UseSuperprojectFromConfiguration()
else:
return False

git_trace2_event_log.py

@ -29,8 +29,10 @@ https://git-scm.com/docs/api-trace2#_the_event_format_target
import datetime
import errno
import json
import os
import socket
import sys
import tempfile
import threading
@ -46,7 +48,8 @@ class EventLog(object):
Each entry contains the following common keys:
- event: The event name
- sid: session-id - Unique string to allow process instance to be identified.
- sid: session-id - Unique string to allow process instance to be
identified.
- thread: The thread name.
- time: is the UTC time of the event.
@ -58,20 +61,23 @@ class EventLog(object):
"""Initializes the event log."""
self._log = []
# Try to get session-id (sid) from environment (setup in repo launcher).
KEY = 'GIT_TRACE2_PARENT_SID'
KEY = "GIT_TRACE2_PARENT_SID"
if env is None:
env = os.environ
now = datetime.datetime.utcnow()
self.start = datetime.datetime.utcnow()
# Save both our sid component and the complete sid.
# We use our sid component (self._sid) as the unique filename prefix and
# the full sid (self._full_sid) in the log itself.
self._sid = 'repo-%s-P%08x' % (now.strftime('%Y%m%dT%H%M%SZ'), os.getpid())
self._sid = "repo-%s-P%08x" % (
self.start.strftime("%Y%m%dT%H%M%SZ"),
os.getpid(),
)
parent_sid = env.get(KEY)
# Append our sid component to the parent sid (if it exists).
if parent_sid is not None:
self._full_sid = parent_sid + '/' + self._sid
self._full_sid = parent_sid + "/" + self._sid
else:
self._full_sid = self._sid
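For illustration, the session id composed above looks roughly like this (the parent sid value is an assumed example of what the repo launcher exports via GIT_TRACE2_PARENT_SID):
import datetime
import os
now = datetime.datetime.utcnow()
sid = "repo-%s-P%08x" % (now.strftime("%Y%m%dT%H%M%SZ"), os.getpid())
parent_sid = "repo-20230525T172622Z-P0000abcd"  # assumed example value
full_sid = parent_sid + "/" + sid if parent_sid else sid
print(full_sid)  # e.g. repo-20230525T172622Z-P0000abcd/repo-20230525T173001Z-P00001a2b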
@ -91,13 +97,13 @@ class EventLog(object):
def _AddVersionEvent(self):
"""Adds a 'version' event at the beginning of current log."""
version_event = self._CreateEventDict('version')
version_event['evt'] = "2"
version_event['exe'] = RepoSourceVersion()
version_event = self._CreateEventDict("version")
version_event["evt"] = "2"
version_event["exe"] = RepoSourceVersion()
self._log.insert(0, version_event)
def _CreateEventDict(self, event_name):
"""Returns a dictionary with the common keys/values for git trace2 events.
"""Returns a dictionary with common keys/values for git trace2 events.
Args:
event_name: The event name.
@ -106,16 +112,16 @@ class EventLog(object):
Dictionary with the common event fields populated.
"""
return {
'event': event_name,
'sid': self._full_sid,
'thread': threading.currentThread().getName(),
'time': datetime.datetime.utcnow().isoformat() + 'Z',
"event": event_name,
"sid": self._full_sid,
"thread": threading.current_thread().name,
"time": datetime.datetime.utcnow().isoformat() + "Z",
}
def StartEvent(self):
"""Append a 'start' event to the current log."""
start_event = self._CreateEventDict('start')
start_event['argv'] = sys.argv
start_event = self._CreateEventDict("start")
start_event["argv"] = sys.argv
self._log.append(start_event)
def ExitEvent(self, result):
@ -124,12 +130,14 @@ class EventLog(object):
Args:
result: Exit code of the event
"""
exit_event = self._CreateEventDict('exit')
exit_event = self._CreateEventDict("exit")
# Consider 'None' success (consistent with event_log result handling).
if result is None:
result = 0
exit_event['code'] = result
exit_event["code"] = result
time_delta = datetime.datetime.utcnow() - self.start
exit_event["t_abs"] = time_delta.total_seconds()
self._log.append(exit_event)
def CommandEvent(self, name, subcommands):
@ -139,9 +147,9 @@ class EventLog(object):
name: Name of the primary command (ex: repo, git)
subcommands: List of the sub-commands (ex: version, init, sync)
"""
command_event = self._CreateEventDict('command')
command_event['name'] = name
command_event['subcommands'] = subcommands
command_event = self._CreateEventDict("command")
command_event["name"] = name
command_event["subcommands"] = subcommands
self._log.append(command_event)
def LogConfigEvents(self, config, event_dict_name):
@ -149,29 +157,54 @@ class EventLog(object):
Args:
config: Configuration dictionary.
event_dict_name: Name of the event dictionary for items to be logged under.
event_dict_name: Name of the event dictionary for items to be logged
under.
"""
for param, value in config.items():
event = self._CreateEventDict(event_dict_name)
event['param'] = param
event['value'] = value
event["param"] = param
event["value"] = value
self._log.append(event)
def DefParamRepoEvents(self, config):
"""Append a 'def_param' event for each repo.* config key to the current log.
"""Append 'def_param' events for repo config keys to the current log.
This appends one event for each repo.* config key.
Args:
config: Repo configuration dictionary
"""
# Only output the repo.* config parameters.
repo_config = {k: v for k, v in config.items() if k.startswith('repo.')}
self.LogConfigEvents(repo_config, 'def_param')
repo_config = {k: v for k, v in config.items() if k.startswith("repo.")}
self.LogConfigEvents(repo_config, "def_param")
def ErrorEvent(self, msg, fmt):
def GetDataEventName(self, value):
"""Returns 'data-json' if the value is an array else returns 'data'."""
return "data-json" if value[0] == "[" and value[-1] == "]" else "data"
def LogDataConfigEvents(self, config, prefix):
"""Append a 'data' event for each entry in |config| to the current log.
For each keyX and valueX of the config, the "key" field of the event is
'|prefix|/keyX' and the "value" of the "key" field is valueX.
Args:
config: Configuration dictionary.
prefix: Prefix for each key that is logged.
"""
for key, value in config.items():
event = self._CreateEventDict(self.GetDataEventName(value))
event["key"] = f"{prefix}/{key}"
event["value"] = value
self._log.append(event)
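A short sketch of how GetDataEventName() above routes values: strings that look like JSON arrays become 'data-json' events, everything else becomes 'data' (the sample values are placeholders):
def get_data_event_name(value):
    return "data-json" if value[0] == "[" and value[-1] == "]" else "data"
print(get_data_event_name('["default", "platform-linux"]'))  # data-json
print(get_data_event_name("1"))                              # data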
def ErrorEvent(self, msg, fmt=None):
"""Append a 'error' event to the current log."""
error_event = self._CreateEventDict('error')
error_event['msg'] = msg
error_event['fmt'] = fmt
error_event = self._CreateEventDict("error")
if fmt is None:
fmt = msg
error_event["msg"] = msg
error_event["fmt"] = fmt
self._log.append(error_event)
def _GetEventTargetPath(self):
@ -181,40 +214,72 @@ class EventLog(object):
path: git config's 'trace2.eventtarget' path if it exists, or None
"""
path = None
cmd = ['config', '--get', 'trace2.eventtarget']
cmd = ["config", "--get", "trace2.eventtarget"]
# TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
# system git config variables.
p = GitCommand(None, cmd, capture_stdout=True, capture_stderr=True,
bare=True)
p = GitCommand(
None, cmd, capture_stdout=True, capture_stderr=True, bare=True
)
retval = p.Wait()
if retval == 0:
# Strip trailing carriage-return in path.
path = p.stdout.rstrip('\n')
path = p.stdout.rstrip("\n")
elif retval != 1:
# `git config --get` is documented to produce an exit status of `1` if
# the requested variable is not present in the configuration. Report any
# other return value as an error.
print("repo: error: 'git config --get' call failed with return code: %r, stderr: %r" % (
retval, p.stderr), file=sys.stderr)
# `git config --get` is documented to produce an exit status of `1`
# if the requested variable is not present in the configuration.
# Report any other return value as an error.
print(
"repo: error: 'git config --get' call failed with return code: "
"%r, stderr: %r" % (retval, p.stderr),
file=sys.stderr,
)
return path
def _WriteLog(self, write_fn):
"""Writes the log out using a provided writer function.
Generate compact JSON output for each item in the log, and write it
using write_fn.
Args:
write_fn: A function that accepts bytes and writes them to a
destination.
"""
for e in self._log:
# Dump in compact encoding mode.
# See 'Compact encoding' in Python docs:
# https://docs.python.org/3/library/json.html#module-json
write_fn(
json.dumps(e, indent=None, separators=(",", ":")).encode(
"utf-8"
)
+ b"\n"
)
def Write(self, path=None):
"""Writes the log out to a file.
"""Writes the log out to a file or socket.
Log is only written if 'path' or 'git config --get trace2.eventtarget'
provide a valid path to write logs to.
provide a valid path (or socket) to write logs to.
Logging filename format follows the git trace2 style of being a unique
(exclusive writable) file.
Args:
path: Path to where logs should be written.
path: Path to where logs should be written. The path may have a
prefix of the form "af_unix:[{stream|dgram}:]", in which case
the path is treated as a Unix domain socket. See
https://git-scm.com/docs/api-trace2#_enabling_a_target for
details.
Returns:
log_path: Path to the log file if log is written, otherwise None
log_path: Path to the log file or socket if log is written,
otherwise None
"""
log_path = None
# If no logging path is specified, get the path from 'trace2.eventtarget'.
# If no logging path is specified, get the path from
# 'trace2.eventtarget'.
if path is None:
path = self._GetEventTargetPath()
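The 'af_unix:' form described in the Write() docstring above can be exercised with an ordinary Unix domain socket. A minimal listener sketch (placeholder socket path; the matching trace2.eventtarget setting would be something like af_unix:stream:/tmp/repo-trace2.sock, an assumed example value) that receives the compact JSON events, one per line:
import os
import socket
SOCK_PATH = "/tmp/repo-trace2.sock"  # placeholder path
if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
    srv.bind(SOCK_PATH)
    srv.listen(1)
    conn, _ = srv.accept()
    with conn:
        data = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:
                break
            data += chunk
print(data.decode("utf-8", "replace"))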
@ -222,32 +287,88 @@ class EventLog(object):
if path is None:
return None
path_is_socket = False
socket_type = None
if isinstance(path, str):
parts = path.split(":", 1)
if parts[0] == "af_unix" and len(parts) == 2:
path_is_socket = True
path = parts[1]
parts = path.split(":", 1)
if parts[0] == "stream" and len(parts) == 2:
socket_type = socket.SOCK_STREAM
path = parts[1]
elif parts[0] == "dgram" and len(parts) == 2:
socket_type = socket.SOCK_DGRAM
path = parts[1]
else:
# Get absolute path.
path = os.path.abspath(os.path.expanduser(path))
else:
raise TypeError('path: str required but got %s.' % type(path))
raise TypeError("path: str required but got %s." % type(path))
# Git trace2 requires a directory to write log to.
# TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
if not os.path.isdir(path):
if not (path_is_socket or os.path.isdir(path)):
return None
# Use NamedTemporaryFile to generate a unique filename as required by git trace2.
if path_is_socket:
if socket_type == socket.SOCK_STREAM or socket_type is None:
try:
with tempfile.NamedTemporaryFile(mode='x', prefix=self._sid, dir=path,
delete=False) as f:
# TODO(https://crbug.com/gerrit/13706): Support writing events as they
# occur.
for e in self._log:
# Dump in compact encoding mode.
# See 'Compact encoding' in Python docs:
# https://docs.python.org/3/library/json.html#module-json
json.dump(e, f, indent=None, separators=(',', ':'))
f.write('\n')
with socket.socket(
socket.AF_UNIX, socket.SOCK_STREAM
) as sock:
sock.connect(path)
self._WriteLog(sock.sendall)
return f"af_unix:stream:{path}"
except OSError as err:
# If we tried to connect to a DGRAM socket using STREAM,
# ignore the attempt and continue to DGRAM below. Otherwise,
# issue a warning.
if err.errno != errno.EPROTOTYPE:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
if socket_type == socket.SOCK_DGRAM or socket_type is None:
try:
with socket.socket(
socket.AF_UNIX, socket.SOCK_DGRAM
) as sock:
self._WriteLog(lambda bs: sock.sendto(bs, path))
return f"af_unix:dgram:{path}"
except OSError as err:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
# Tried to open a socket but couldn't connect (SOCK_STREAM) or write
# (SOCK_DGRAM).
print(
"repo: warning: git trace2 logging failed: could not write to "
"socket",
file=sys.stderr,
)
return None
# Path is an absolute path
# Use NamedTemporaryFile to generate a unique filename as required by
# git trace2.
try:
with tempfile.NamedTemporaryFile(
mode="xb", prefix=self._sid, dir=path, delete=False
) as f:
# TODO(https://crbug.com/gerrit/13706): Support writing events
# as they occur.
self._WriteLog(f.write)
log_path = f.name
except FileExistsError as err:
print('repo: warning: git trace2 logging failed: %r' % err,
file=sys.stderr)
print(
"repo: warning: git trace2 logging failed: %r" % err,
file=sys.stderr,
)
return None
return log_path

gitc_utils.py

@ -14,7 +14,6 @@
import os
import multiprocessing
import platform
import re
import sys
import time
@ -40,9 +39,10 @@ def _get_project_revision(args):
"""Worker for _set_project_revisions to lookup one project remote."""
(i, url, expr) = args
gitcmd = git_command.GitCommand(
None, ['ls-remote', url, expr], capture_stdout=True, cwd='/tmp')
None, ["ls-remote", url, expr], capture_stdout=True, cwd="/tmp"
)
rc = gitcmd.Wait()
return (i, rc, gitcmd.stdout.split('\t', 1)[0])
return (i, rc, gitcmd.stdout.split("\t", 1)[0])
def _set_project_revisions(projects):
@ -55,25 +55,33 @@ def _set_project_revisions(projects):
Args:
projects: List of project objects to set the revisionExpr for.
"""
# Retrieve the commit id for each project based off of it's current
# Retrieve the commit id for each project based off of its current
# revisionExpr, if it is not already a commit id.
with multiprocessing.Pool(NUM_BATCH_RETRIEVE_REVISIONID) as pool:
results_iter = pool.imap_unordered(
_get_project_revision,
((i, project.remote.url, project.revisionExpr)
(
(i, project.remote.url, project.revisionExpr)
for i, project in enumerate(projects)
if not git_config.IsId(project.revisionExpr)),
chunksize=8)
for (i, rc, revisionExpr) in results_iter:
if not git_config.IsId(project.revisionExpr)
),
chunksize=8,
)
for i, rc, revisionExpr in results_iter:
project = projects[i]
if rc:
print('FATAL: Failed to retrieve revisionExpr for %s' % project.name)
print(
"FATAL: Failed to retrieve revisionExpr for %s"
% project.name
)
pool.terminate()
sys.exit(1)
if not revisionExpr:
pool.terminate()
raise ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
(project.remote.url, project.revisionExpr))
raise ManifestParseError(
"Invalid SHA-1 revision project %s (%s)"
% (project.remote.url, project.revisionExpr)
)
project.revisionExpr = revisionExpr
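For clarity, a tiny sketch of the 'git ls-remote <url> <expr>' output handling above (placeholder output): only the SHA in the first tab-separated field is kept as the new revisionExpr.
stdout = "2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\trefs/heads/main\n"  # placeholder output
revision = stdout.split("\t", 1)[0]
print(revision)  # 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea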
@ -86,12 +94,14 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
paths: List of project paths we want to update.
"""
print('Generating GITC Manifest by fetching revision SHAs for each '
'project.')
print(
"Generating GITC Manifest by fetching revision SHAs for each "
"project."
)
if paths is None:
paths = list(manifest.paths.keys())
groups = [x for x in re.split(r'[,\s]+', manifest.GetGroupsStr()) if x]
groups = [x for x in re.split(r"[,\s]+", manifest.GetGroupsStr()) if x]
# Convert the paths to projects, and filter them to the matched groups.
projects = [manifest.paths[p] for p in paths]
@ -106,12 +116,12 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
proj.upstream = proj.revisionExpr
if path not in gitc_manifest.paths:
# Any new projects need their first revision, even if we weren't asked
# for them.
# Any new projects need their first revision, even if we weren't
# asked for them.
projects.append(proj)
elif path not in paths:
# And copy revisions from the previous manifest if we're not updating
# them now.
# And copy revisions from the previous manifest if we're not
# updating them now.
gitc_proj = gitc_manifest.paths[path]
if gitc_proj.old_revision:
proj.revisionExpr = None
@ -124,8 +134,8 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
if gitc_manifest is not None:
for path, proj in gitc_manifest.paths.items():
if proj.old_revision and path in paths:
# If we updated a project that has been started, keep the old-revision
# updated.
# If we updated a project that has been started, keep the
# old-revision updated.
repo_proj = manifest.paths[path]
repo_proj.old_revision = repo_proj.revisionExpr
repo_proj.revisionExpr = None
@ -148,8 +158,8 @@ def save_manifest(manifest, client_dir=None):
if not client_dir:
manifest_file = manifest.manifestFile
else:
manifest_file = os.path.join(client_dir, '.manifest')
with open(manifest_file, 'w') as f:
manifest_file = os.path.join(client_dir, ".manifest")
with open(manifest_file, "w") as f:
manifest.Save(f, groups=manifest.GetGroupsStr())
# TODO(sbasi/jorg): Come up with a solution to remove the sleep below.
# Give the GITC filesystem time to register the manifest changes.

hooks.py

@ -28,8 +28,9 @@ from git_refs import HEAD
class RepoHook(object):
"""A RepoHook contains information about a script to run as a hook.
Hooks are used to run a python script before running an upload (for instance,
to run presubmit checks). Eventually, we may have hooks for other actions.
Hooks are used to run a python script before running an upload (for
instance, to run presubmit checks). Eventually, we may have hooks for other
actions.
This shouldn't be confused with files in the 'repo/hooks' directory. Those
files are copied into each '.git/hooks' folder for each project. Repo-level
@ -52,7 +53,8 @@ class RepoHook(object):
Invalid
"""
def __init__(self,
def __init__(
self,
hook_type,
hooks_project,
repo_topdir,
@ -60,7 +62,8 @@ class RepoHook(object):
bypass_hooks=False,
allow_all_hooks=False,
ignore_hooks=False,
abort_if_user_denies=False):
abort_if_user_denies=False,
):
"""RepoHook constructor.
Params:
@ -78,8 +81,8 @@ class RepoHook(object):
bypass_hooks: If True, then 'Do not run the hook'.
allow_all_hooks: If True, then 'Run the hook without prompting'.
ignore_hooks: If True, then 'Do not abort action if hooks fail'.
abort_if_user_denies: If True, we'll abort running the hook if the user
doesn't allow us to run the hook.
abort_if_user_denies: If True, we'll abort running the hook if the
user doesn't allow us to run the hook.
"""
self._hook_type = hook_type
self._hooks_project = hooks_project
@ -92,77 +95,82 @@ class RepoHook(object):
# Store the full path to the script for convenience.
if self._hooks_project:
self._script_fullpath = os.path.join(self._hooks_project.worktree,
self._hook_type + '.py')
self._script_fullpath = os.path.join(
self._hooks_project.worktree, self._hook_type + ".py"
)
else:
self._script_fullpath = None
def _GetHash(self):
"""Return a hash of the contents of the hooks directory.
We'll just use git to do this. This hash has the property that if anything
changes in the directory we will return a different hash.
We'll just use git to do this. This hash has the property that if
anything changes in the directory we will return a different hash.
SECURITY CONSIDERATION:
This hash only represents the contents of files in the hook directory, not
any other files imported or called by hooks. Changes to imported files
can change the script behavior without affecting the hash.
This hash only represents the contents of files in the hook
directory, not any other files imported or called by hooks. Changes
to imported files can change the script behavior without affecting
the hash.
Returns:
A string representing the hash. This will always be ASCII so that it can
be printed to the user easily.
A string representing the hash. This will always be ASCII so that
it can be printed to the user easily.
"""
assert self._hooks_project, "Must have hooks to calculate their hash."
# We will use the work_git object rather than just calling GetRevisionId().
# That gives us a hash of the latest checked in version of the files that
# the user will actually be executing. Specifically, GetRevisionId()
# doesn't appear to change even if a user checks out a different version
# of the hooks repo (via git checkout) nor if a user commits their own revs.
# We will use the work_git object rather than just calling
# GetRevisionId(). That gives us a hash of the latest checked in version
# of the files that the user will actually be executing. Specifically,
# GetRevisionId() doesn't appear to change even if a user checks out a
# different version of the hooks repo (via git checkout) nor if a user
# commits their own revs.
#
# NOTE: Local (non-committed) changes will not be factored into this hash.
# I think this is OK, since we're really only worried about warning the user
# about upstream changes.
# NOTE: Local (non-committed) changes will not be factored into this
# hash. I think this is OK, since we're really only worried about
# warning the user about upstream changes.
return self._hooks_project.work_git.rev_parse(HEAD)
def _GetMustVerb(self):
"""Return 'must' if the hook is required; 'should' if not."""
if self._abort_if_user_denies:
return 'must'
return "must"
else:
return 'should'
return "should"
def _CheckForHookApproval(self):
"""Check to see whether this hook has been approved.
We'll accept approval of manifest URLs if they're using secure transports.
This way the user can say they trust the manifest hoster. For insecure
hosts, we fall back to checking the hash of the hooks repo.
We'll accept approval of manifest URLs if they're using secure
transports. This way the user can say they trust the manifest hoster.
For insecure hosts, we fall back to checking the hash of the hooks repo.
Note that we ask permission for each individual hook even though we use
the hash of all hooks when detecting changes. We'd like the user to be
able to approve / deny each hook individually. We only use the hash of all
hooks because there is no other easy way to detect changes to local imports.
able to approve / deny each hook individually. We only use the hash of
all hooks because there is no other easy way to detect changes to local
imports.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
HookError: Raised if the user doesn't approve and
abort_if_user_denies was passed to the constructor.
"""
if self._ManifestUrlHasSecureScheme():
return self._CheckForHookApprovalManifest()
else:
return self._CheckForHookApprovalHash()
def _CheckForHookApprovalHelper(self, subkey, new_val, main_prompt,
changed_prompt):
def _CheckForHookApprovalHelper(
self, subkey, new_val, main_prompt, changed_prompt
):
"""Check for approval for a particular attribute and hook.
Args:
subkey: The git config key under [repo.hooks.<hook_type>] to store the
last approved string.
subkey: The git config key under [repo.hooks.<hook_type>] to store
the last approved string.
new_val: The new value to compare against the last approved one.
main_prompt: Message to display to the user to ask for approval.
changed_prompt: Message explaining why we're re-asking for approval.
@ -171,11 +179,11 @@ class RepoHook(object):
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
HookError: Raised if the user doesn't approve and
abort_if_user_denies was passed to the constructor.
"""
hooks_config = self._hooks_project.config
git_approval_key = 'repo.hooks.%s.%s' % (self._hook_type, subkey)
git_approval_key = "repo.hooks.%s.%s" % (self._hook_type, subkey)
# Get the last value that the user approved for this hook; may be None.
old_val = hooks_config.GetString(git_approval_key)
@ -186,35 +194,44 @@ class RepoHook(object):
# Approval matched. We're done.
return True
else:
# Give the user a reason why we're prompting, since they last told
# us to "never ask again".
prompt = 'WARNING: %s\n\n' % (changed_prompt,)
# Give the user a reason why we're prompting, since they last
# told us to "never ask again".
prompt = "WARNING: %s\n\n" % (changed_prompt,)
else:
prompt = ''
prompt = ""
# Prompt the user if we're not on a tty; on a tty we'll assume "no".
if sys.stdout.isatty():
prompt += main_prompt + ' (yes/always/NO)? '
prompt += main_prompt + " (yes/always/NO)? "
response = input(prompt).lower()
print()
# User is doing a one-time approval.
if response in ('y', 'yes'):
if response in ("y", "yes"):
return True
elif response == 'always':
elif response == "always":
hooks_config.SetString(git_approval_key, new_val)
return True
# For anything else, we'll assume no approval.
if self._abort_if_user_denies:
raise HookError('You must allow the %s hook or use --no-verify.' %
self._hook_type)
raise HookError(
"You must allow the %s hook or use --no-verify."
% self._hook_type
)
return False
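As a rough illustration of the approval bookkeeping above (the 'pre-upload' hook type and key values are assumed examples): each attribute is stored under its own git config key, and answering 'always' persists the current value so later runs skip the prompt.
hook_type = "pre-upload"      # assumed example hook type
subkey = "approvedmanifest"   # subkey used for manifest-URL approval
git_approval_key = "repo.hooks.%s.%s" % (hook_type, subkey)
print(git_approval_key)  # repo.hooks.pre-upload.approvedmanifest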
def _ManifestUrlHasSecureScheme(self):
"""Check if the URI for the manifest is a secure transport."""
secure_schemes = ('file', 'https', 'ssh', 'persistent-https', 'sso', 'rpc')
secure_schemes = (
"file",
"https",
"ssh",
"persistent-https",
"sso",
"rpc",
)
parse_results = urllib.parse.urlparse(self._manifest_url)
return parse_results.scheme in secure_schemes
@ -225,10 +242,12 @@ class RepoHook(object):
True if this hook is approved to run; False otherwise.
"""
return self._CheckForHookApprovalHelper(
'approvedmanifest',
"approvedmanifest",
self._manifest_url,
'Run hook scripts from %s' % (self._manifest_url,),
'Manifest URL has changed since %s was allowed.' % (self._hook_type,))
"Run hook scripts from %s" % (self._manifest_url,),
"Manifest URL has changed since %s was allowed."
% (self._hook_type,),
)
def _CheckForHookApprovalHash(self):
"""Check whether the user has approved the hooks repo.
@ -236,15 +255,18 @@ class RepoHook(object):
Returns:
True if this hook is approved to run; False otherwise.
"""
prompt = ('Repo %s run the script:\n'
' %s\n'
'\n'
'Do you want to allow this script to run')
prompt = (
"Repo %s run the script:\n"
" %s\n"
"\n"
"Do you want to allow this script to run"
)
return self._CheckForHookApprovalHelper(
'approvedhash',
"approvedhash",
self._GetHash(),
prompt % (self._GetMustVerb(), self._script_fullpath),
'Scripts have changed since %s was allowed.' % (self._hook_type,))
"Scripts have changed since %s was allowed." % (self._hook_type,),
)
@staticmethod
def _ExtractInterpFromShebang(data):
@ -256,8 +278,8 @@ class RepoHook(object):
data: The file content of the script.
Returns:
The basename of the main script interpreter, or None if a shebang is not
used or could not be parsed out.
The basename of the main script interpreter, or None if a shebang is
not used or could not be parsed out.
"""
firstline = data.splitlines()[:1]
if not firstline:
@ -265,13 +287,13 @@ class RepoHook(object):
# The format here can be tricky.
shebang = firstline[0].strip()
m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', shebang)
m = re.match(r"^#!\s*([^\s]+)(?:\s+([^\s]+))?", shebang)
if not m:
return None
# If using `env`, find the target program.
interp = m.group(1)
if os.path.basename(interp) == 'env':
if os.path.basename(interp) == "env":
interp = m.group(2)
return interp
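A self-contained sketch of the shebang handling above, including the `env` indirection case (the extract_interp() helper name is illustrative):
import os
import re
def extract_interp(data):
    firstline = data.splitlines()[:1]
    if not firstline:
        return None
    m = re.match(r"^#!\s*([^\s]+)(?:\s+([^\s]+))?", firstline[0].strip())
    if not m:
        return None
    interp = m.group(1)
    if os.path.basename(interp) == "env":
        # `#!/usr/bin/env python3` style: the real interpreter is the argument.
        interp = m.group(2)
    return interp
print(extract_interp("#!/usr/bin/env python3\nprint('hi')\n"))  # python3
print(extract_interp("#!/bin/sh\necho hi\n"))                   # /bin/sh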
@ -300,18 +322,18 @@ data = open(path).read()
exec(compile(data, path, 'exec'), context)
context['main'](**kwargs)
""" % {
'path': self._script_fullpath,
'kwargs': json.dumps(kwargs),
'context': json.dumps(context),
"path": self._script_fullpath,
"kwargs": json.dumps(kwargs),
"context": json.dumps(context),
}
# We pass the script via stdin to avoid OS argv limits. It also makes
# unhandled exception tracebacks less verbose/confusing for users.
cmd = [interp, '-c', 'import sys; exec(sys.stdin.read())']
cmd = [interp, "-c", "import sys; exec(sys.stdin.read())"]
proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
proc.communicate(input=script.encode('utf-8'))
proc.communicate(input=script.encode("utf-8"))
if proc.returncode:
raise HookError('Failed to run %s hook.' % (self._hook_type,))
raise HookError("Failed to run %s hook." % (self._hook_type,))
def _ExecuteHookViaImport(self, data, context, **kwargs):
"""Execute the hook code in |data| directly.
@ -327,23 +349,27 @@ context['main'](**kwargs)
# Exec, storing global context in the context dict. We catch exceptions
# and convert to a HookError w/ just the failing traceback.
try:
exec(compile(data, self._script_fullpath, 'exec'), context)
exec(compile(data, self._script_fullpath, "exec"), context)
except Exception:
raise HookError('%s\nFailed to import %s hook; see traceback above.' %
(traceback.format_exc(), self._hook_type))
raise HookError(
"%s\nFailed to import %s hook; see traceback above."
% (traceback.format_exc(), self._hook_type)
)
# Running the script should have defined a main() function.
if 'main' not in context:
if "main" not in context:
raise HookError('Missing main() in: "%s"' % self._script_fullpath)
# Call the main function in the hook. If the hook should cause the
# build to fail, it will raise an Exception. We'll catch that convert
# to a HookError w/ just the failing traceback.
try:
context['main'](**kwargs)
context["main"](**kwargs)
except Exception:
raise HookError('%s\nFailed to run main() for %s hook; see traceback '
'above.' % (traceback.format_exc(), self._hook_type))
raise HookError(
"%s\nFailed to run main() for %s hook; see traceback "
"above." % (traceback.format_exc(), self._hook_type)
)
def _ExecuteHook(self, **kwargs):
"""Actually execute the given hook.
@ -351,9 +377,9 @@ context['main'](**kwargs)
This will run the hook's 'main' function in our python interpreter.
Args:
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
kwargs: Keyword arguments to pass to the hook. These are often
specific to the hook type. For instance, pre-upload hooks will
contain a project_list.
"""
# Keep sys.path and CWD stashed away so that we can always restore them
# upon function exit.
@ -370,17 +396,18 @@ context['main'](**kwargs)
sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]
# Initial global context for the hook to run within.
context = {'__file__': self._script_fullpath}
context = {"__file__": self._script_fullpath}
# Add 'hook_should_take_kwargs' to the arguments to be passed to main.
# We don't actually want hooks to define their main with this argument--
# it's there to remind them that their hook should always take **kwargs.
# Add 'hook_should_take_kwargs' to the arguments to be passed to
# main. We don't actually want hooks to define their main with this
# argument--it's there to remind them that their hook should always
# take **kwargs.
# For instance, a pre-upload hook should be defined like:
# def main(project_list, **kwargs):
#
# This allows us to later expand the API without breaking old hooks.
kwargs = kwargs.copy()
kwargs['hook_should_take_kwargs'] = True
kwargs["hook_should_take_kwargs"] = True
# See what version of python the hook has been written against.
data = open(self._script_fullpath).read()
@ -388,18 +415,20 @@ context['main'](**kwargs)
reexec = False
if interp:
prog = os.path.basename(interp)
if prog.startswith('python2') and sys.version_info.major != 2:
if prog.startswith("python2") and sys.version_info.major != 2:
reexec = True
elif prog.startswith('python3') and sys.version_info.major == 2:
elif prog.startswith("python3") and sys.version_info.major == 2:
reexec = True
# Attempt to execute the hooks through the requested version of Python.
# Attempt to execute the hooks through the requested version of
# Python.
if reexec:
try:
self._ExecuteHookViaReexec(interp, context, **kwargs)
except OSError as e:
if e.errno == errno.ENOENT:
# We couldn't find the interpreter, so fallback to importing.
# We couldn't find the interpreter, so fallback to
# importing.
reexec = False
else:
raise
@ -415,7 +444,9 @@ context['main'](**kwargs)
def _CheckHook(self):
# Bail with a nice error if we can't find the hook.
if not os.path.isfile(self._script_fullpath):
raise HookError('Couldn\'t find repo hook: %s' % self._script_fullpath)
raise HookError(
"Couldn't find repo hook: %s" % self._script_fullpath
)
def Run(self, **kwargs):
"""Run the hook.
@ -424,27 +455,30 @@ context['main'](**kwargs)
this particular hook is not enabled), this is a no-op.
Args:
user_allows_all_hooks: If True, we will never prompt about running the
hook--we'll just assume it's OK to run it.
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
user_allows_all_hooks: If True, we will never prompt about running
the hook--we'll just assume it's OK to run it.
kwargs: Keyword arguments to pass to the hook. These are often
specific to the hook type. For instance, pre-upload hooks will
contain a project_list.
Returns:
True: On success, or when hooks were ignored at the user's request
False: The hook failed. The caller should respond with aborting the action.
Some examples in which False is returned:
False: The hook failed. The caller should respond with aborting the
action. Some examples in which False is returned:
* Finding the hook failed while it was enabled, or
* the user declined to run a required hook (from _CheckForHookApproval)
* the user declined to run a required hook (from
_CheckForHookApproval)
In all these cases the user did not pass the proper arguments to
ignore the result through the option combinations as listed in
AddHookOptionGroup().
"""
# Do not do anything in case bypass_hooks is set, or
# no-op if there is no hooks project or if hook is disabled.
if (self._bypass_hooks or
not self._hooks_project or
self._hook_type not in self._hooks_project.enabled_repo_hooks):
if (
self._bypass_hooks
or not self._hooks_project
or self._hook_type not in self._hooks_project.enabled_repo_hooks
):
return True
passed = True
@ -457,15 +491,21 @@ context['main'](**kwargs)
self._ExecuteHook(**kwargs)
except SystemExit as e:
passed = False
print('ERROR: %s hooks exited with exit code: %s' % (self._hook_type, str(e)),
file=sys.stderr)
print(
"ERROR: %s hooks exited with exit code: %s"
% (self._hook_type, str(e)),
file=sys.stderr,
)
except HookError as e:
passed = False
print('ERROR: %s' % str(e), file=sys.stderr)
print("ERROR: %s" % str(e), file=sys.stderr)
if not passed and self._ignore_hooks:
print('\nWARNING: %s hooks failed, but continuing anyways.' % self._hook_type,
file=sys.stderr)
print(
"\nWARNING: %s hooks failed, but continuing anyways."
% self._hook_type,
file=sys.stderr,
)
passed = True
return passed
@ -478,16 +518,20 @@ context['main'](**kwargs)
manifest: The current active manifest for this command from which we
extract a couple of fields.
opt: Contains the commandline options for the action of this hook.
It should contain the options added by AddHookOptionGroup() in which
we are interested in RepoHook execution.
It should contain the options added by AddHookOptionGroup() in
which we are interested in RepoHook execution.
"""
for key in ('bypass_hooks', 'allow_all_hooks', 'ignore_hooks'):
for key in ("bypass_hooks", "allow_all_hooks", "ignore_hooks"):
kwargs.setdefault(key, getattr(opt, key))
kwargs.update({
'hooks_project': manifest.repo_hooks_project,
'repo_topdir': manifest.topdir,
'manifest_url': manifest.manifestProject.GetRemote('origin').url,
})
kwargs.update(
{
"hooks_project": manifest.repo_hooks_project,
"repo_topdir": manifest.topdir,
"manifest_url": manifest.manifestProject.GetRemote(
"origin"
).url,
}
)
return cls(*args, **kwargs)
@staticmethod
@ -497,13 +541,21 @@ context['main'](**kwargs)
# Note that verify and no-verify are NOT opposites of each other, which
# is why they store to different locations. We are using them to match
# 'git commit' syntax.
group = parser.add_option_group(name + ' hooks')
group.add_option('--no-verify',
dest='bypass_hooks', action='store_true',
help='Do not run the %s hook.' % name)
group.add_option('--verify',
dest='allow_all_hooks', action='store_true',
help='Run the %s hook without prompting.' % name)
group.add_option('--ignore-hooks',
action='store_true',
help='Do not abort if %s hooks fail.' % name)
group = parser.add_option_group(name + " hooks")
group.add_option(
"--no-verify",
dest="bypass_hooks",
action="store_true",
help="Do not run the %s hook." % name,
)
group.add_option(
"--verify",
dest="allow_all_hooks",
action="store_true",
help="Run the %s hook without prompting." % name,
)
group.add_option(
"--ignore-hooks",
action="store_true",
help="Do not abort if %s hooks fail." % name,
)

hooks/commit-msg

@ -1,5 +1,5 @@
#!/bin/sh
# From Gerrit Code Review 3.1.3
# From Gerrit Code Review 3.6.1 c67916dbdc07555c44e32a68f92ffc484b9b34f0
#
# Part of Gerrit Code Review (https://www.gerritcodereview.com/)
#
@ -17,6 +17,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
set -u
# avoid [[ which is not POSIX sh.
if test "$#" != 1 ; then
echo "$0 requires an argument."
@ -29,15 +31,25 @@ if test ! -f "$1" ; then
fi
# Do not create a change id if requested
if test "false" = "`git config --bool --get gerrit.createChangeId`" ; then
if test "false" = "$(git config --bool --get gerrit.createChangeId)" ; then
exit 0
fi
# $RANDOM will be undefined if not using bash, so don't use set -u
random=$( (whoami ; hostname ; date; cat $1 ; echo $RANDOM) | git hash-object --stdin)
# Do not create a change id for squash commits.
if head -n1 "$1" | grep -q '^squash! '; then
exit 0
fi
if git rev-parse --verify HEAD >/dev/null 2>&1; then
refhash="$(git rev-parse HEAD)"
else
refhash="$(git hash-object -t tree /dev/null)"
fi
random=$({ git var GIT_COMMITTER_IDENT ; echo "$refhash" ; cat "$1"; } | git hash-object --stdin)
dest="$1.tmp.${random}"
trap 'rm -f "${dest}"' EXIT
trap 'rm -f "$dest" "$dest-2"' EXIT
if ! git stripspace --strip-comments < "$1" > "${dest}" ; then
echo "cannot strip comments from $1"
@ -49,11 +61,40 @@ if test ! -s "${dest}" ; then
exit 1
fi
reviewurl="$(git config --get gerrit.reviewUrl)"
if test -n "${reviewurl}" ; then
token="Link"
value="${reviewurl%/}/id/I$random"
pattern=".*/id/I[0-9a-f]\{40\}$"
else
token="Change-Id"
value="I$random"
pattern=".*"
fi
if git interpret-trailers --parse < "$1" | grep -q "^$token: $pattern$" ; then
exit 0
fi
# There must be a Signed-off-by trailer for the code below to work. Insert a
# sentinel at the end to make sure there is one.
# Avoid the --in-place option which only appeared in Git 2.8
# Avoid the --if-exists option which only appeared in Git 2.15
if ! git -c trailer.ifexists=doNothing interpret-trailers \
--trailer "Change-Id: I${random}" < "$1" > "${dest}" ; then
echo "cannot insert change-id line in $1"
if ! git interpret-trailers \
--trailer "Signed-off-by: SENTINEL" < "$1" > "$dest-2" ; then
echo "cannot insert Signed-off-by sentinel line in $1"
exit 1
fi
# Make sure the trailer appears before any Signed-off-by trailers by inserting
# it as if it was a Signed-off-by trailer and then use sed to remove the
# Signed-off-by prefix and the Signed-off-by sentinel line.
# Avoid the --in-place option which only appeared in Git 2.8
# Avoid the --where option which only appeared in Git 2.15
if ! git -c trailer.where=before interpret-trailers \
--trailer "Signed-off-by: $token: $value" < "$dest-2" |
sed -re "s/^Signed-off-by: ($token: )/\1/" \
-e "/^Signed-off-by: SENTINEL/d" > "$dest" ; then
echo "cannot insert $token line in $1"
exit 1
fi
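The rewritten hook inserts the Change-Id (or Link) trailer ahead of any Signed-off-by trailers by first adding a sentinel Signed-off-by, letting `git interpret-trailers` place the new value before it, and then cleaning up with sed. A standalone sketch of that pipeline on a made-up commit message; the paths and the Change-Id hash are illustrative only:

    # Assumes git is available; run inside a git checkout to be safe.
    printf 'Fix a bug\n\nSigned-off-by: A Dev <a@example.com>\n' > /tmp/msg
    # Guarantee at least one Signed-off-by trailer exists (the sentinel).
    git interpret-trailers --trailer "Signed-off-by: SENTINEL" < /tmp/msg > /tmp/msg-2
    # Insert the new trailer before the first Signed-off-by, then strip the
    # temporary "Signed-off-by: " prefix and delete the sentinel line.
    git -c trailer.where=before interpret-trailers \
        --trailer "Signed-off-by: Change-Id: I0123456789abcdef0123456789abcdef01234567" \
        < /tmp/msg-2 |
      sed -e 's/^Signed-off-by: \(Change-Id: \)/\1/' \
          -e '/^Signed-off-by: SENTINEL/d'
    # Expected output: the Change-Id trailer sits above the real Signed-off-by.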

498
main.py
View File

@ -37,7 +37,7 @@ except ImportError:
from color import SetDefaultColoring
import event_log
from repo_trace import SetTrace
from repo_trace import SetTrace, Trace, SetTraceToStderr
from git_command import user_agent
from git_config import RepoConfig
from git_trace2_event_log import EventLog
@ -74,59 +74,109 @@ MIN_PYTHON_VERSION_SOFT = (3, 6)
MIN_PYTHON_VERSION_HARD = (3, 6)
if sys.version_info.major < 3:
print('repo: error: Python 2 is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
print(
"repo: error: Python 2 is no longer supported; "
"Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr,
)
sys.exit(1)
else:
if sys.version_info < MIN_PYTHON_VERSION_HARD:
print('repo: error: Python 3 version is too old; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
print(
"repo: error: Python 3 version is too old; "
"Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr,
)
sys.exit(1)
elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
print('repo: warning: your Python 3 version is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
print(
"repo: warning: your Python 3 version is no longer supported; "
"Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr,
)
global_options = optparse.OptionParser(
usage='repo [-p|--paginate|--no-pager] COMMAND [ARGS]',
add_help_option=False)
global_options.add_option('-h', '--help', action='store_true',
help='show this help message and exit')
global_options.add_option('--help-all', action='store_true',
help='show this help message with all subcommands and exit')
global_options.add_option('-p', '--paginate',
dest='pager', action='store_true',
help='display command output in the pager')
global_options.add_option('--no-pager',
dest='pager', action='store_false',
help='disable the pager')
global_options.add_option('--color',
choices=('auto', 'always', 'never'), default=None,
help='control color usage: auto, always, never')
global_options.add_option('--trace',
dest='trace', action='store_true',
help='trace git command execution (REPO_TRACE=1)')
global_options.add_option('--trace-python',
dest='trace_python', action='store_true',
help='trace python command execution')
global_options.add_option('--time',
dest='time', action='store_true',
help='time repo command execution')
global_options.add_option('--version',
dest='show_version', action='store_true',
help='display this version of repo')
global_options.add_option('--show-toplevel',
action='store_true',
help='display the path of the top-level directory of '
'the repo client checkout')
global_options.add_option('--event-log',
dest='event_log', action='store',
help='filename of event log to append timeline to')
global_options.add_option('--git-trace2-event-log', action='store',
help='directory to write git trace2 event log to')
usage="repo [-p|--paginate|--no-pager] COMMAND [ARGS]",
add_help_option=False,
)
global_options.add_option(
"-h", "--help", action="store_true", help="show this help message and exit"
)
global_options.add_option(
"--help-all",
action="store_true",
help="show this help message with all subcommands and exit",
)
global_options.add_option(
"-p",
"--paginate",
dest="pager",
action="store_true",
help="display command output in the pager",
)
global_options.add_option(
"--no-pager", dest="pager", action="store_false", help="disable the pager"
)
global_options.add_option(
"--color",
choices=("auto", "always", "never"),
default=None,
help="control color usage: auto, always, never",
)
global_options.add_option(
"--trace",
dest="trace",
action="store_true",
help="trace git command execution (REPO_TRACE=1)",
)
global_options.add_option(
"--trace-to-stderr",
dest="trace_to_stderr",
action="store_true",
help="trace outputs go to stderr in addition to .repo/TRACE_FILE",
)
global_options.add_option(
"--trace-python",
dest="trace_python",
action="store_true",
help="trace python command execution",
)
global_options.add_option(
"--time",
dest="time",
action="store_true",
help="time repo command execution",
)
global_options.add_option(
"--version",
dest="show_version",
action="store_true",
help="display this version of repo",
)
global_options.add_option(
"--show-toplevel",
action="store_true",
help="display the path of the top-level directory of "
"the repo client checkout",
)
global_options.add_option(
"--event-log",
dest="event_log",
action="store",
help="filename of event log to append timeline to",
)
global_options.add_option(
"--git-trace2-event-log",
action="store",
help="directory to write git trace2 event log to",
)
global_options.add_option(
"--submanifest-path",
action="store",
metavar="REL_PATH",
help="submanifest path",
)
class _Repo(object):
@ -139,13 +189,15 @@ class _Repo(object):
global_options.print_help()
print()
if short:
commands = ' '.join(sorted(self.commands))
commands = " ".join(sorted(self.commands))
wrapped_commands = textwrap.wrap(commands, width=77)
print('Available commands:\n %s' % ('\n '.join(wrapped_commands),))
print('\nRun `repo help <command>` for command-specific details.')
print('Bug reports:', Wrapper().BUG_URL)
print(
"Available commands:\n %s" % ("\n ".join(wrapped_commands),)
)
print("\nRun `repo help <command>` for command-specific details.")
print("Bug reports:", Wrapper().BUG_URL)
else:
cmd = self.commands['help']()
cmd = self.commands["help"]()
if all_commands:
cmd.PrintAllCommandsBody()
else:
@ -154,7 +206,7 @@ class _Repo(object):
def _ParseArgs(self, argv):
"""Parse the main `repo` command line options."""
for i, arg in enumerate(argv):
if not arg.startswith('-'):
if not arg.startswith("-"):
name = arg
glob = argv[:i]
argv = argv[i + 1 :]
@ -177,14 +229,14 @@ class _Repo(object):
if name in self.commands:
return name, []
key = 'alias.%s' % (name,)
key = "alias.%s" % (name,)
alias = RepoConfig.ForRepository(self.repodir).GetString(key)
if alias is None:
alias = RepoConfig.ForUser().GetString(key)
if alias is None:
return name, []
args = alias.strip().split(' ', 1)
args = alias.strip().split(" ", 1)
name = args[0]
if len(args) == 2:
args = shlex.split(args[1])
@ -196,16 +248,13 @@ class _Repo(object):
"""Execute the requested subcommand."""
result = 0
if gopts.trace:
SetTrace()
# Handle options that terminate quickly first.
if gopts.help or gopts.help_all:
self._PrintHelp(short=False, all_commands=gopts.help_all)
return 0
elif gopts.show_version:
# Always allow global --version regardless of subcommand validity.
name = 'version'
name = "version"
elif gopts.show_toplevel:
print(os.path.dirname(self.repodir))
return 0
@ -214,10 +263,40 @@ class _Repo(object):
self._PrintHelp(short=True)
return 1
run = lambda: self._RunLong(name, gopts, argv) or 0
with Trace(
"starting new command: %s",
", ".join([name] + argv),
first_trace=True,
):
if gopts.trace_python:
import trace
tracer = trace.Trace(
count=False,
trace=True,
timing=True,
ignoredirs=set(sys.path[1:]),
)
result = tracer.runfunc(run)
else:
result = run()
return result
def _RunLong(self, name, gopts, argv):
"""Execute the (longer running) requested subcommand."""
result = 0
SetDefaultColoring(gopts.color)
git_trace2_event_log = EventLog()
repo_client = RepoClient(self.repodir)
outer_client = RepoClient(self.repodir)
repo_client = outer_client
if gopts.submanifest_path:
repo_client = RepoClient(
self.repodir,
submanifest_path=gopts.submanifest_path,
outer_client=outer_client,
)
gitc_manifest = None
gitc_client_name = gitc_utils.parse_clientdir(os.getcwd())
if gitc_client_name:
@ -229,38 +308,53 @@ class _Repo(object):
repodir=self.repodir,
client=repo_client,
manifest=repo_client.manifest,
outer_client=outer_client,
outer_manifest=outer_client.manifest,
gitc_manifest=gitc_manifest,
git_event_log=git_trace2_event_log)
git_event_log=git_trace2_event_log,
)
except KeyError:
print("repo: '%s' is not a repo command. See 'repo help'." % name,
file=sys.stderr)
print(
"repo: '%s' is not a repo command. See 'repo help'." % name,
file=sys.stderr,
)
return 1
Editor.globalConfig = cmd.client.globalConfig
if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror:
print("fatal: '%s' requires a working directory" % name,
file=sys.stderr)
print(
"fatal: '%s' requires a working directory" % name,
file=sys.stderr,
)
return 1
if isinstance(cmd, GitcAvailableCommand) and not gitc_utils.get_gitc_manifest_dir():
print("fatal: '%s' requires GITC to be available" % name,
file=sys.stderr)
if (
isinstance(cmd, GitcAvailableCommand)
and not gitc_utils.get_gitc_manifest_dir()
):
print(
"fatal: '%s' requires GITC to be available" % name,
file=sys.stderr,
)
return 1
if isinstance(cmd, GitcClientCommand) and not gitc_client_name:
print("fatal: '%s' requires a GITC client" % name,
file=sys.stderr)
print("fatal: '%s' requires a GITC client" % name, file=sys.stderr)
return 1
try:
copts, cargs = cmd.OptionParser.parse_args(argv)
copts = cmd.ReadEnvironmentOptions(copts)
except NoManifestException as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
print(
"error: in `%s`: %s" % (" ".join([name] + argv), str(e)),
file=sys.stderr,
)
print(
"error: manifest missing or unreadable -- please run init",
file=sys.stderr,
)
return 1
if gopts.pager is not False and not isinstance(cmd, InteractiveCommand):
@ -268,7 +362,7 @@ class _Repo(object):
if gopts.pager:
use_pager = True
else:
use_pager = config.GetBoolean('pager.%s' % name)
use_pager = config.GetBoolean("pager.%s" % name)
if use_pager is None:
use_pager = cmd.WantPager(copts)
if use_pager:
@ -278,32 +372,77 @@ class _Repo(object):
cmd_event = cmd.event_log.Add(name, event_log.TASK_COMMAND, start)
cmd.event_log.SetParent(cmd_event)
git_trace2_event_log.StartEvent()
git_trace2_event_log.CommandEvent(name='repo', subcommands=[name])
git_trace2_event_log.CommandEvent(name="repo", subcommands=[name])
try:
cmd.CommonValidateOptions(copts, cargs)
cmd.ValidateOptions(copts, cargs)
this_manifest_only = copts.this_manifest_only
outer_manifest = copts.outer_manifest
if cmd.MULTI_MANIFEST_SUPPORT or this_manifest_only:
result = cmd.Execute(copts, cargs)
except (DownloadError, ManifestInvalidRevisionError,
NoManifestException) as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
elif outer_manifest and repo_client.manifest.is_submanifest:
# The command does not support multi-manifest, we are using a
# submanifest, and the command line is for the outermost
# manifest. Re-run using the outermost manifest, which will
# recurse through the submanifests.
gopts.submanifest_path = ""
result = self._Run(name, gopts, argv)
else:
# No multi-manifest support. Run the command in the current
# (sub)manifest, and then any child submanifests.
result = cmd.Execute(copts, cargs)
for submanifest in repo_client.manifest.submanifests.values():
spec = submanifest.ToSubmanifestSpec()
gopts.submanifest_path = submanifest.repo_client.path_prefix
child_argv = argv[:]
child_argv.append("--no-outer-manifest")
# Not all subcommands support the 3 manifest options, so
# only add them if the original command includes them.
if hasattr(copts, "manifest_url"):
child_argv.extend(["--manifest-url", spec.manifestUrl])
if hasattr(copts, "manifest_name"):
child_argv.extend(
["--manifest-name", spec.manifestName]
)
if hasattr(copts, "manifest_branch"):
child_argv.extend(["--manifest-branch", spec.revision])
result = self._Run(name, gopts, child_argv) or result
except (
DownloadError,
ManifestInvalidRevisionError,
NoManifestException,
) as e:
print(
"error: in `%s`: %s" % (" ".join([name] + argv), str(e)),
file=sys.stderr,
)
if isinstance(e, NoManifestException):
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
print(
"error: manifest missing or unreadable -- please run init",
file=sys.stderr,
)
result = 1
except NoSuchProjectError as e:
if e.name:
print('error: project %s not found' % e.name, file=sys.stderr)
print("error: project %s not found" % e.name, file=sys.stderr)
else:
print('error: no project in current directory', file=sys.stderr)
print("error: no project in current directory", file=sys.stderr)
result = 1
except InvalidProjectGroupsError as e:
if e.name:
print('error: project group must be enabled for project %s' % e.name, file=sys.stderr)
print(
"error: project group must be enabled for project %s"
% e.name,
file=sys.stderr,
)
else:
print('error: project group must be enabled for the project in the current directory',
file=sys.stderr)
print(
"error: project group must be enabled for the project in "
"the current directory",
file=sys.stderr,
)
result = 1
except SystemExit as e:
if e.code:
@ -316,20 +455,27 @@ class _Repo(object):
minutes, seconds = divmod(remainder, 60)
if gopts.time:
if hours == 0:
print('real\t%dm%.3fs' % (minutes, seconds), file=sys.stderr)
print(
"real\t%dm%.3fs" % (minutes, seconds), file=sys.stderr
)
else:
print('real\t%dh%dm%.3fs' % (hours, minutes, seconds),
file=sys.stderr)
print(
"real\t%dh%dm%.3fs" % (hours, minutes, seconds),
file=sys.stderr,
)
cmd.event_log.FinishEvent(cmd_event, finish,
result is None or result == 0)
cmd.event_log.FinishEvent(
cmd_event, finish, result is None or result == 0
)
git_trace2_event_log.DefParamRepoEvents(
cmd.manifest.manifestProject.config.DumpConfigDict())
cmd.manifest.manifestProject.config.DumpConfigDict()
)
git_trace2_event_log.ExitEvent(result)
if gopts.event_log:
cmd.event_log.Write(os.path.abspath(
os.path.expanduser(gopts.event_log)))
cmd.event_log.Write(
os.path.abspath(os.path.expanduser(gopts.event_log))
)
git_trace2_event_log.Write(gopts.git_trace2_event_log)
return result
@ -339,29 +485,31 @@ def _CheckWrapperVersion(ver_str, repo_path):
"""Verify the repo launcher is new enough for this checkout.
Args:
ver_str: The version string passed from the repo launcher when it ran us.
ver_str: The version string passed from the repo launcher when it ran
us.
repo_path: The path to the repo launcher that loaded us.
"""
# Refuse to work with really old wrapper versions. We don't test these,
# so might as well require a somewhat recent sane version.
# v1.15 of the repo launcher was released in ~Mar 2012.
MIN_REPO_VERSION = (1, 15)
min_str = '.'.join(str(x) for x in MIN_REPO_VERSION)
min_str = ".".join(str(x) for x in MIN_REPO_VERSION)
if not repo_path:
repo_path = '~/bin/repo'
repo_path = "~/bin/repo"
if not ver_str:
print('no --wrapper-version argument', file=sys.stderr)
print("no --wrapper-version argument", file=sys.stderr)
sys.exit(1)
# Pull out the version of the repo launcher we know about to compare.
exp = Wrapper().VERSION
ver = tuple(map(int, ver_str.split('.')))
ver = tuple(map(int, ver_str.split(".")))
exp_str = '.'.join(map(str, exp))
exp_str = ".".join(map(str, exp))
if ver < MIN_REPO_VERSION:
print("""
print(
"""
repo: error:
!!! Your version of repo %s is too old.
!!! We need at least version %s.
@ -369,29 +517,42 @@ repo: error:
!!! You must upgrade before you can continue:
cp %s %s
""" % (ver_str, min_str, exp_str, WrapperPath(), repo_path), file=sys.stderr)
"""
% (ver_str, min_str, exp_str, WrapperPath(), repo_path),
file=sys.stderr,
)
sys.exit(1)
if exp > ver:
print('\n... A new version of repo (%s) is available.' % (exp_str,),
file=sys.stderr)
print(
"\n... A new version of repo (%s) is available." % (exp_str,),
file=sys.stderr,
)
if os.access(repo_path, os.W_OK):
print("""\
print(
"""\
... You should upgrade soon:
cp %s %s
""" % (WrapperPath(), repo_path), file=sys.stderr)
"""
% (WrapperPath(), repo_path),
file=sys.stderr,
)
else:
print("""\
print(
"""\
... New version is available at: %s
... The launcher is run from: %s
!!! The launcher is not writable. Please talk to your sysadmin or distro
!!! to get an update installed.
""" % (WrapperPath(), repo_path), file=sys.stderr)
"""
% (WrapperPath(), repo_path),
file=sys.stderr,
)
def _CheckRepoDir(repo_dir):
if not repo_dir:
print('no --repo-dir argument', file=sys.stderr)
print("no --repo-dir argument", file=sys.stderr)
sys.exit(1)
@ -399,10 +560,10 @@ def _PruneOptions(argv, opt):
i = 0
while i < len(argv):
a = argv[i]
if a == '--':
if a == "--":
break
if a.startswith('--'):
eq = a.find('=')
if a.startswith("--"):
eq = a.find("=")
if eq > 0:
a = a[0:eq]
if not opt.has_option(a):
@ -413,11 +574,11 @@ def _PruneOptions(argv, opt):
class _UserAgentHandler(urllib.request.BaseHandler):
def http_request(self, req):
req.add_header('User-Agent', user_agent.repo)
req.add_header("User-Agent", user_agent.repo)
return req
def https_request(self, req):
req.add_header('User-Agent', user_agent.repo)
req.add_header("User-Agent", user_agent.repo)
return req
@ -428,7 +589,7 @@ def _AddPasswordFromUserInput(handler, msg, req):
if user is None:
print(msg)
try:
user = input('User: ')
user = input("User: ")
password = getpass.getpass()
except KeyboardInterrupt:
return
@ -439,23 +600,28 @@ class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req)
return urllib.request.HTTPBasicAuthHandler.http_error_401(
self, req, fp, code, msg, headers)
self, req, fp, code, msg, headers
)
def http_error_auth_reqed(self, authreq, host, req, headers):
try:
old_add_header = req.add_header
def _add_header(name, val):
val = val.replace('\n', '')
val = val.replace("\n", "")
old_add_header(name, val)
req.add_header = _add_header
return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed(
self, authreq, host, req, headers)
return (
urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed(
self, authreq, host, req, headers
)
)
except Exception:
reset = getattr(self, 'reset_retry_count', None)
reset = getattr(self, "reset_retry_count", None)
if reset is not None:
reset()
elif getattr(self, 'retried', None):
elif getattr(self, "retried", None):
self.retried = 0
raise
@ -464,23 +630,28 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req)
return urllib.request.HTTPDigestAuthHandler.http_error_401(
self, req, fp, code, msg, headers)
self, req, fp, code, msg, headers
)
def http_error_auth_reqed(self, auth_header, host, req, headers):
try:
old_add_header = req.add_header
def _add_header(name, val):
val = val.replace('\n', '')
val = val.replace("\n", "")
old_add_header(name, val)
req.add_header = _add_header
return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed(
self, auth_header, host, req, headers)
return (
urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed(
self, auth_header, host, req, headers
)
)
except Exception:
reset = getattr(self, 'reset_retry_count', None)
reset = getattr(self, "reset_retry_count", None)
if reset is not None:
reset()
elif getattr(self, 'retried', None):
elif getattr(self, "retried", None):
self.retried = 0
raise
@ -493,7 +664,9 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
def http_error_401(self, req, fp, code, msg, headers):
host = req.get_host()
retry = self.http_error_auth_reqed('www-authenticate', host, req, headers)
retry = self.http_error_auth_reqed(
"www-authenticate", host, req, headers
)
return retry
def http_error_auth_reqed(self, auth_header, host, req, headers):
@ -502,8 +675,13 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
authdata = self._negotiate_get_authdata(auth_header, headers)
if self.retried > 3:
raise urllib.request.HTTPError(req.get_full_url(), 401,
"Negotiate auth failed", headers, None)
raise urllib.request.HTTPError(
req.get_full_url(),
401,
"Negotiate auth failed",
headers,
None,
)
else:
self.retried += 1
@ -511,7 +689,7 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
if neghdr is None:
return None
req.add_unredirected_header('Authorization', neghdr)
req.add_unredirected_header("Authorization", neghdr)
response = self.parent.open(req)
srvauth = self._negotiate_get_authdata(auth_header, response.info())
@ -574,8 +752,8 @@ def init_http():
n = netrc.netrc()
for host in n.hosts:
p = n.hosts[host]
mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2])
mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2])
mgr.add_password(p[1], "http://%s/" % host, p[0], p[2])
mgr.add_password(p[1], "https://%s/" % host, p[0], p[2])
except netrc.NetrcParseError:
pass
except IOError:
@ -585,10 +763,12 @@ def init_http():
if kerberos:
handlers.append(_KerberosAuthHandler())
if 'http_proxy' in os.environ:
url = os.environ['http_proxy']
handlers.append(urllib.request.ProxyHandler({'http': url, 'https': url}))
if 'REPO_CURL_VERBOSE' in os.environ:
if "http_proxy" in os.environ:
url = os.environ["http_proxy"]
handlers.append(
urllib.request.ProxyHandler({"http": url, "https": url})
)
if "REPO_CURL_VERBOSE" in os.environ:
handlers.append(urllib.request.HTTPHandler(debuglevel=1))
handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
urllib.request.install_opener(urllib.request.build_opener(*handlers))
@ -598,12 +778,17 @@ def _Main(argv):
result = 0
opt = optparse.OptionParser(usage="repo wrapperinfo -- ...")
opt.add_option("--repo-dir", dest="repodir",
help="path to .repo/")
opt.add_option("--wrapper-version", dest="wrapper_version",
help="version of the wrapper script")
opt.add_option("--wrapper-path", dest="wrapper_path",
help="location of the wrapper script")
opt.add_option("--repo-dir", dest="repodir", help="path to .repo/")
opt.add_option(
"--wrapper-version",
dest="wrapper_version",
help="version of the wrapper script",
)
opt.add_option(
"--wrapper-path",
dest="wrapper_path",
help="location of the wrapper script",
)
_PruneOptions(argv, opt)
opt, argv = opt.parse_args(argv)
@ -614,22 +799,23 @@ def _Main(argv):
Version.wrapper_path = opt.wrapper_path
repo = _Repo(opt.repodir)
try:
init_http()
name, gopts, argv = repo._ParseArgs(argv)
run = lambda: repo._Run(name, gopts, argv) or 0
if gopts.trace_python:
import trace
tracer = trace.Trace(count=False, trace=True, timing=True,
ignoredirs=set(sys.path[1:]))
result = tracer.runfunc(run)
else:
result = run()
if gopts.trace:
SetTrace()
if gopts.trace_to_stderr:
SetTraceToStderr()
result = repo._Run(name, gopts, argv) or 0
except KeyboardInterrupt:
print('aborted by user', file=sys.stderr)
print("aborted by user", file=sys.stderr)
result = 1
except ManifestParseError as mpe:
print('fatal: %s' % mpe, file=sys.stderr)
print("fatal: %s" % mpe, file=sys.stderr)
result = 1
except RepoChangedException as rce:
# If repo changed, re-exec ourselves.
@ -639,13 +825,13 @@ def _Main(argv):
try:
os.execv(sys.executable, [__file__] + argv)
except OSError as e:
print('fatal: cannot restart repo after upgrade', file=sys.stderr)
print('fatal: %s' % e, file=sys.stderr)
print("fatal: cannot restart repo after upgrade", file=sys.stderr)
print("fatal: %s" % e, file=sys.stderr)
result = 128
TerminatePager()
sys.exit(result)
if __name__ == '__main__':
if __name__ == "__main__":
_Main(sys.argv[1:])
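With the startup rework above, trace configuration now happens in _Main before the subcommand runs, and the new --trace-to-stderr flag mirrors trace output. A hypothetical invocation from inside an existing repo client:

    # Write traces to .repo/TRACE_FILE and also echo them to stderr.
    repo --trace --trace-to-stderr status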

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo abandon" "Repo Manual"
.TH REPO "1" "July 2022" "repo abandon" "Repo Manual"
.SH NAME
repo \- repo abandon - manual page for repo abandon
.SH SYNOPSIS
@ -32,5 +32,18 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help abandon` to view the detailed manual.
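The multi-manifest options that now appear in most of these man pages select which (sub)manifests a command walks. Hedged examples using this subcommand (the branch name is made up):

    # Abandon the topic branch in this manifest and every submanifest.
    repo abandon --all-manifests my-topic
    # Limit the operation to the current (sub)manifest only.
    repo abandon --this-manifest-only my-topic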

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo branches" "Repo Manual"
.TH REPO "1" "July 2022" "repo branches" "Repo Manual"
.SH NAME
repo \- repo branches - manual page for repo branches
.SH SYNOPSIS
@ -55,5 +55,18 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help branches` to view the detailed manual.

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo checkout" "Repo Manual"
.TH REPO "1" "July 2022" "repo checkout" "Repo Manual"
.SH NAME
repo \- repo checkout - manual page for repo checkout
.SH SYNOPSIS
@ -24,6 +24,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help checkout` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo cherry-pick" "Repo Manual"
.TH REPO "1" "July 2022" "repo cherry-pick" "Repo Manual"
.SH NAME
repo \- repo cherry-pick - manual page for repo cherry-pick
.SH SYNOPSIS
@ -20,6 +20,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help cherry\-pick` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo diff" "Repo Manual"
.TH REPO "1" "July 2022" "repo diff" "Repo Manual"
.SH NAME
repo \- repo diff - manual page for repo diff
.SH SYNOPSIS
@ -31,5 +31,18 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help diff` to view the detailed manual.

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo diffmanifests" "Repo Manual"
.TH REPO "1" "July 2022" "repo diffmanifests" "Repo Manual"
.SH NAME
repo \- repo diffmanifests - manual page for repo diffmanifests
.SH SYNOPSIS
@ -29,6 +29,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help diffmanifests` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo download" "Repo Manual"
.TH REPO "1" "July 2022" "repo download" "Repo Manual"
.SH NAME
repo \- repo download - manual page for repo download
.SH SYNOPSIS
@ -35,6 +35,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help download` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo forall" "Repo Manual"
.TH REPO "1" "July 2022" "repo forall" "Repo Manual"
.SH NAME
repo \- repo forall - manual page for repo forall
.SH SYNOPSIS
@ -54,6 +54,19 @@ only show errors
.TP
\fB\-p\fR
show project headers before output
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help forall` to view the detailed manual.
.SH DETAILS
@ -93,6 +106,11 @@ REPO_PROJECT is set to the unique name of the project.
.PP
REPO_PATH is the path relative to the root of the client.
.PP
REPO_OUTERPATH is the path of the sub manifest's root relative to the root of
the client.
.PP
REPO_INNERPATH is the path relative to the root of the sub manifest.
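For example, the two new variables can be echoed per project to see where each checkout sits relative to its submanifest and the outer client (a sketch; the output format is illustrative):

    repo forall -c 'echo "$REPO_PROJECT: outer=$REPO_OUTERPATH inner=$REPO_INNERPATH"'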
.PP
REPO_REMOTE is the name of the remote system from the manifest.
.PP
REPO_LREV is the name of the revision from the manifest, translated to a local

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo gitc-delete" "Repo Manual"
.TH REPO "1" "July 2022" "repo gitc-delete" "Repo Manual"
.SH NAME
repo \- repo gitc-delete - manual page for repo gitc-delete
.SH SYNOPSIS
@ -23,6 +23,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help gitc\-delete` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo gitc-init" "Repo Manual"
.TH REPO "1" "October 2022" "repo gitc-init" "Repo Manual"
.SH NAME
repo \- repo gitc-init - manual page for repo gitc-init
.SH SYNOPSIS
@ -41,10 +41,19 @@ platform group [auto|all|none|linux|darwin|...]
.TP
\fB\-\-submodules\fR
sync any submodules associated with the manifest repo
.TP
\fB\-\-standalone\-manifest\fR
download the manifest as a static file rather than
create a git checkout of the manifest repo
.TP
\fB\-\-manifest\-depth\fR=\fI\,DEPTH\/\fR
create a shallow clone of the manifest repo with given
depth (0 for full clone); see git clone (default: 0)
.SS Manifest (only) checkout options:
.TP
\fB\-\-current\-branch\fR
fetch only current manifest branch from server
(default)
.TP
\fB\-\-no\-current\-branch\fR
fetch all manifest branches from server
@ -92,7 +101,8 @@ filter for use with \fB\-\-partial\-clone\fR [default:
blob:none]
.TP
\fB\-\-use\-superproject\fR
use the manifest superproject to sync projects
use the manifest superproject to sync projects;
implies \fB\-c\fR
.TP
\fB\-\-no\-use\-superproject\fR
disable use of manifest superprojects
@ -104,6 +114,12 @@ not \fB\-\-partial\-clone\fR)
\fB\-\-no\-clone\-bundle\fR
disable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
\fB\-\-partial\-clone\fR)
.TP
\fB\-\-git\-lfs\fR
enable Git LFS support
.TP
\fB\-\-no\-git\-lfs\fR
disable Git LFS support
.SS repo Version options:
.TP
\fB\-\-repo\-url\fR=\fI\,URL\/\fR
@ -125,6 +141,19 @@ Optional manifest file to use for this GITC client.
.TP
\fB\-c\fR GITC_CLIENT, \fB\-\-gitc\-client\fR=\fI\,GITC_CLIENT\/\fR
Name of the gitc_client instance to create or modify.
.SS Multi\-manifest:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help gitc\-init` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo grep" "Repo Manual"
.TH REPO "1" "July 2022" "repo grep" "Repo Manual"
.SH NAME
repo \- repo grep - manual page for repo grep
.SH SYNOPSIS
@ -24,6 +24,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.SS Sources:
.TP
\fB\-\-cached\fR

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo help" "Repo Manual"
.TH REPO "1" "July 2022" "repo help" "Repo Manual"
.SH NAME
repo \- repo help - manual page for repo help
.SH SYNOPSIS
@ -26,6 +26,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help help` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo info" "Repo Manual"
.TH REPO "1" "July 2022" "repo info" "Repo Manual"
.SH NAME
repo \- repo info - manual page for repo info
.SH SYNOPSIS
@ -36,5 +36,18 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help info` to view the detailed manual.

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo init" "Repo Manual"
.TH REPO "1" "October 2022" "repo init" "Repo Manual"
.SH NAME
repo \- repo init - manual page for repo init
.SH SYNOPSIS
@ -41,10 +41,19 @@ platform group [auto|all|none|linux|darwin|...]
.TP
\fB\-\-submodules\fR
sync any submodules associated with the manifest repo
.TP
\fB\-\-standalone\-manifest\fR
download the manifest as a static file rather than
create a git checkout of the manifest repo
.TP
\fB\-\-manifest\-depth\fR=\fI\,DEPTH\/\fR
create a shallow clone of the manifest repo with given
depth (0 for full clone); see git clone (default: 0)
.SS Manifest (only) checkout options:
.TP
\fB\-c\fR, \fB\-\-current\-branch\fR
fetch only current manifest branch from server
(default)
.TP
\fB\-\-no\-current\-branch\fR
fetch all manifest branches from server
@ -92,7 +101,8 @@ filter for use with \fB\-\-partial\-clone\fR [default:
blob:none]
.TP
\fB\-\-use\-superproject\fR
use the manifest superproject to sync projects
use the manifest superproject to sync projects;
implies \fB\-c\fR
.TP
\fB\-\-no\-use\-superproject\fR
disable use of manifest superprojects
@ -104,6 +114,12 @@ not \fB\-\-partial\-clone\fR)
\fB\-\-no\-clone\-bundle\fR
disable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
\fB\-\-partial\-clone\fR)
.TP
\fB\-\-git\-lfs\fR
enable Git LFS support
.TP
\fB\-\-no\-git\-lfs\fR
disable Git LFS support
.SS repo Version options:
.TP
\fB\-\-repo\-url\fR=\fI\,URL\/\fR
@ -118,6 +134,19 @@ do not verify repo source code
.TP
\fB\-\-config\-name\fR
Always prompt for name/e\-mail
.SS Multi\-manifest:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help init` to view the detailed manual.
.SH DETAILS
@ -137,6 +166,12 @@ equivalent to using \fB\-b\fR HEAD.
The optional \fB\-m\fR argument can be used to specify an alternate manifest to be
used. If no manifest is specified, the manifest default.xml will be used.
.PP
If the \fB\-\-standalone\-manifest\fR argument is set, the manifest will be downloaded
directly from the specified \fB\-\-manifest\-url\fR as a static file (rather than setting
up a manifest git checkout). With \fB\-\-standalone\-manifest\fR, the manifest will be
fully static and will not be re\-downloaded during subsequent `repo init` and
`repo sync` calls.
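A hedged example of the flow described here (the manifest URL is made up):

    # Fetch the manifest once as a static file; later init/sync calls reuse it.
    repo init --standalone-manifest \
        --manifest-url https://example.com/manifests/default.xml
    repo sync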
.PP
The \fB\-\-reference\fR option can be used to point to a directory that has the content
of a \fB\-\-mirror\fR sync. This will make the working directory use as much data as
possible from the local reference directory when fetching from the server. This

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo list" "Repo Manual"
.TH REPO "1" "July 2022" "repo list" "Repo Manual"
.SH NAME
repo \- repo list - manual page for repo list
.SH SYNOPSIS
@ -47,6 +47,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help list` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo manifest" "Repo Manual"
.TH REPO "1" "October 2022" "repo manifest" "Repo Manual"
.SH NAME
repo \- repo manifest - manual page for repo manifest
.SH SYNOPSIS
@ -40,7 +40,8 @@ format output for humans to read
ignore local manifests
.TP
\fB\-o\fR \-|NAME.xml, \fB\-\-output\-file\fR=\fI\,\-\/\fR|NAME.xml
file to save the manifest to
file to save the manifest to. (Filename prefix for
multi\-tree.)
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR
@ -48,6 +49,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help manifest` to view the detailed manual.
.SH DETAILS
@ -88,6 +102,7 @@ A manifest XML file (e.g. `default.xml`) roughly conforms to the following DTD:
remote*,
default?,
manifest\-server?,
submanifest*?,
remove\-project*,
project*,
extend\-project*,
@ -118,6 +133,16 @@ include*)>
.IP
<!ELEMENT manifest\-server EMPTY>
<!ATTLIST manifest\-server url CDATA #REQUIRED>
.IP
<!ELEMENT submanifest EMPTY>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest remote IDREF #IMPLIED>
<!ATTLIST submanifest project CDATA #IMPLIED>
<!ATTLIST submanifest manifest\-name CDATA #IMPLIED>
<!ATTLIST submanifest revision CDATA #IMPLIED>
<!ATTLIST submanifest path CDATA #IMPLIED>
<!ATTLIST submanifest groups CDATA #IMPLIED>
<!ATTLIST submanifest default\-groups CDATA #IMPLIED>
.TP
<!ELEMENT project (annotation*,
project*,
@ -161,9 +186,12 @@ CDATA #IMPLIED>
<!ELEMENT extend\-project EMPTY>
<!ATTLIST extend\-project name CDATA #REQUIRED>
<!ATTLIST extend\-project path CDATA #IMPLIED>
<!ATTLIST extend\-project dest\-path CDATA #IMPLIED>
<!ATTLIST extend\-project groups CDATA #IMPLIED>
<!ATTLIST extend\-project revision CDATA #IMPLIED>
<!ATTLIST extend\-project remote CDATA #IMPLIED>
<!ATTLIST extend\-project dest\-branch CDATA #IMPLIED>
<!ATTLIST extend\-project upstream CDATA #IMPLIED>
.IP
<!ELEMENT remove\-project EMPTY>
<!ATTLIST remove\-project name CDATA #REQUIRED>
@ -176,6 +204,7 @@ CDATA #IMPLIED>
<!ELEMENT superproject EMPTY>
<!ATTLIST superproject name CDATA #REQUIRED>
<!ATTLIST superproject remote IDREF #IMPLIED>
<!ATTLIST superproject revision CDATA #IMPLIED>
.IP
<!ELEMENT contactinfo EMPTY>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
@ -293,6 +322,65 @@ GetManifest(tag)
Return a manifest in which each project is pegged to the revision at the
specified tag. This is used by repo sync when the \fB\-\-smart\-tag\fR option is given.
.PP
Element submanifest
.PP
One or more submanifest elements may be specified. Each element describes a
single manifest to be checked out as a child.
.PP
Attribute `name`: A unique name (within the current (sub)manifest) for this
submanifest. It acts as a default for `revision` below. The same name can be
used for submanifests with different parent (sub)manifests.
.PP
Attribute `remote`: Name of a previously defined remote element. If not supplied
the remote given by the default element is used.
.PP
Attribute `project`: The manifest project name. The project's name is appended
onto its remote's fetch URL to generate the actual URL to configure the Git
remote with. The URL gets formed as:
.IP
${remote_fetch}/${project_name}.git
.PP
where ${remote_fetch} is the remote's fetch attribute and ${project_name} is the
project's name attribute. The suffix ".git" is always appended as repo assumes
the upstream is a forest of bare Git repositories. If the project has a parent
element, its name will be prefixed by the parent's.
.PP
The project name must match the name Gerrit knows, if Gerrit is being used for
code reviews.
.PP
`project` must not be empty, and may not be an absolute path or use "." or ".."
path components. It is always interpreted relative to the remote's fetch
settings, so if a different base path is needed, declare a different remote with
the new settings needed.
.PP
If not supplied the remote and project for this manifest will be used: `remote`
cannot be supplied.
.PP
Projects from a submanifest and its submanifests are added to the
submanifest::path:<path_prefix> group.
.PP
Attribute `manifest\-name`: The manifest filename in the manifest project. If not
supplied, `default.xml` is used.
.PP
Attribute `revision`: Name of a Git branch (e.g. "main" or "refs/heads/main"),
tag (e.g. "refs/tags/stable"), or a commit hash. If not supplied, `name` is
used.
.PP
Attribute `path`: An optional path relative to the top directory of the repo
client where the submanifest repo client top directory should be placed. If not
supplied, `revision` is used.
.PP
`path` may not be an absolute path or use "." or ".." path components.
.PP
Attribute `groups`: List of additional groups to which all projects in the
included submanifest belong. This appends and recurses, meaning all projects in
submanifests carry all parent submanifest groups. Same syntax as the
corresponding element of `project`.
.PP
Attribute `default\-groups`: The list of manifest groups to sync if no
`\-\-groups=` parameter was specified at init. When that list is empty, use this
list instead of "default" as the list of groups to sync.
.PP
Element project
.PP
One or more project elements may be specified. Each element describes a single
@ -385,6 +473,11 @@ original manifest.
Attribute `path`: If specified, limit the change to projects checked out at the
specified path, rather than all projects with the given name.
.PP
Attribute `dest\-path`: If specified, a path relative to the top directory of the
repo client where the Git working directory for this project should be placed.
This is used to move a project in the checkout by overriding the existing `path`
setting.
.PP
Attribute `groups`: List of additional groups to which this project belongs.
Same syntax as the corresponding element of `project`.
.PP
@ -394,6 +487,12 @@ project. Same syntax as the corresponding element of `project`.
Attribute `remote`: If specified, overrides the remote of the original project.
Same syntax as the corresponding element of `project`.
.PP
Attribute `dest\-branch`: If specified, overrides the dest\-branch of the original
project. Same syntax as the corresponding element of `project`.
.PP
Attribute `upstream`: If specified, overrides the upstream of the original
project. Same syntax as the corresponding element of `project`.
.PP
Element annotation
.PP
Zero or more annotation elements may be specified as children of a project or
@ -477,6 +576,10 @@ project](#element\-project) for more information.
Attribute `remote`: Name of a previously defined remote element. If not supplied
the remote given by the default element is used.
.PP
Attribute `revision`: Name of the Git branch the manifest wants to track for
this superproject. If not supplied the revision given by the remote element is
used if applicable, else the default element is used.
.PP
Element contactinfo
.PP
*** *Note*: This is currently a WIP. ***
@ -502,8 +605,8 @@ restrictions are not enforced for [Local Manifests].
.PP
Attribute `groups`: List of additional groups to which all projects in the
included manifest belong. This appends and recurses, meaning all projects in
sub\-manifests carry all parent include groups. Same syntax as the corresponding
element of `project`.
included manifests carry all parent include groups. Same syntax as the
corresponding element of `project`.
.PP
Local Manifests
.PP

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo overview" "Repo Manual"
.TH REPO "1" "July 2022" "repo overview" "Repo Manual"
.SH NAME
repo \- repo overview - manual page for repo overview
.SH SYNOPSIS
@ -26,6 +26,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help overview` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo prune" "Repo Manual"
.TH REPO "1" "July 2022" "repo prune" "Repo Manual"
.SH NAME
repo \- repo prune - manual page for repo prune
.SH SYNOPSIS
@ -24,5 +24,18 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help prune` to view the detailed manual.

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo rebase" "Repo Manual"
.TH REPO "1" "July 2022" "repo rebase" "Repo Manual"
.SH NAME
repo \- repo rebase - manual page for repo rebase
.SH SYNOPSIS
@ -46,6 +46,19 @@ only show errors
.TP
\fB\-i\fR, \fB\-\-interactive\fR
interactive rebase (single project only)
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help rebase` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo selfupdate" "Repo Manual"
.TH REPO "1" "July 2022" "repo selfupdate" "Repo Manual"
.SH NAME
repo \- repo selfupdate - manual page for repo selfupdate
.SH SYNOPSIS
@ -20,6 +20,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.SS repo Version options:
.TP
\fB\-\-no\-repo\-verify\fR

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo smartsync" "Repo Manual"
.TH REPO "1" "November 2022" "repo smartsync" "Repo Manual"
.SH NAME
repo \- repo smartsync - manual page for repo smartsync
.SH SYNOPSIS
@ -20,11 +20,11 @@ number of CPU cores)
.TP
\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
number of network jobs to run in parallel (defaults to
\fB\-\-jobs\fR)
\fB\-\-jobs\fR or 1)
.TP
\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
number of local checkout jobs to run in parallel
(defaults to \fB\-\-jobs\fR)
(defaults to \fB\-\-jobs\fR or 8)
.TP
\fB\-f\fR, \fB\-\-force\-broken\fR
obsolete option (to be deleted in the future)
@ -80,7 +80,8 @@ password to authenticate with the manifest server
fetch submodules from server
.TP
\fB\-\-use\-superproject\fR
use the manifest superproject to sync projects
use the manifest superproject to sync projects;
implies \fB\-c\fR
.TP
\fB\-\-no\-use\-superproject\fR
disable use of manifest superprojects
@ -89,7 +90,7 @@ disable use of manifest superprojects
fetch tags
.TP
\fB\-\-no\-tags\fR
don't fetch tags
don't fetch tags (default)
.TP
\fB\-\-optimized\-fetch\fR
only fetch projects fixed to sha1 if revision does not
@ -100,6 +101,17 @@ number of times to retry fetches on transient errors
.TP
\fB\-\-prune\fR
delete refs that no longer exist on the remote
(default)
.TP
\fB\-\-no\-prune\fR
do not delete refs that no longer exist on the remote
.TP
\fB\-\-auto\-gc\fR
run garbage collection on all synced projects
.TP
\fB\-\-no\-auto\-gc\fR
do not run garbage collection on any projects
(default)
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR
@ -107,6 +119,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.SS repo Version options:
.TP
\fB\-\-no\-repo\-verify\fR

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo stage" "Repo Manual"
.TH REPO "1" "July 2022" "repo stage" "Repo Manual"
.SH NAME
repo \- repo stage - manual page for repo stage
.SH SYNOPSIS
@ -23,6 +23,19 @@ only show errors
.TP
\fB\-i\fR, \fB\-\-interactive\fR
use interactive staging
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help stage` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo start" "Repo Manual"
.TH REPO "1" "July 2022" "repo start" "Repo Manual"
.SH NAME
repo \- repo start - manual page for repo start
.SH SYNOPSIS
@ -33,6 +33,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help start` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo status" "Repo Manual"
.TH REPO "1" "July 2022" "repo status" "Repo Manual"
.SH NAME
repo \- repo status - manual page for repo status
.SH SYNOPSIS
@ -28,6 +28,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help status` to view the detailed manual.
.SH DETAILS

View File

@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo sync" "Repo Manual"
.TH REPO "1" "November 2022" "repo sync" "Repo Manual"
.SH NAME
repo \- repo sync - manual page for repo sync
.SH SYNOPSIS
@ -20,11 +20,11 @@ number of CPU cores)
.TP
\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
number of network jobs to run in parallel (defaults to
\fB\-\-jobs\fR)
\fB\-\-jobs\fR or 1)
.TP
\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
number of local checkout jobs to run in parallel
(defaults to \fB\-\-jobs\fR)
(defaults to \fB\-\-jobs\fR or 8)
.TP
\fB\-f\fR, \fB\-\-force\-broken\fR
obsolete option (to be deleted in the future)
@ -80,7 +80,8 @@ password to authenticate with the manifest server
fetch submodules from server
.TP
\fB\-\-use\-superproject\fR
use the manifest superproject to sync projects
use the manifest superproject to sync projects;
implies \fB\-c\fR
.TP
\fB\-\-no\-use\-superproject\fR
disable use of manifest superprojects
@ -89,7 +90,7 @@ disable use of manifest superprojects
fetch tags
.TP
\fB\-\-no\-tags\fR
don't fetch tags
don't fetch tags (default)
.TP
\fB\-\-optimized\-fetch\fR
only fetch projects fixed to sha1 if revision does not
@ -100,6 +101,17 @@ number of times to retry fetches on transient errors
.TP
\fB\-\-prune\fR
delete refs that no longer exist on the remote
(default)
.TP
\fB\-\-no\-prune\fR
do not delete refs that no longer exist on the remote
.TP
\fB\-\-auto\-gc\fR
run garbage collection on all synced projects
.TP
\fB\-\-no\-auto\-gc\fR
do not run garbage collection on any projects
(default)
.TP
\fB\-s\fR, \fB\-\-smart\-sync\fR
smart sync using manifest from the latest known good
@ -114,6 +126,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.SS repo Version options:
.TP
\fB\-\-no\-repo\-verify\fR
@ -182,6 +207,9 @@ to a sha1 revision if the sha1 revision does not already exist locally.
The \fB\-\-prune\fR option can be used to remove any refs that no longer exist on the
remote.
.PP
The \fB\-\-auto\-gc\fR option can be used to trigger garbage collection on all projects.
By default, repo does not run garbage collection.
.PP
SSH Connections
.PP
If at least one project remote URL uses an SSH connection (ssh://, git+ssh://,


@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo upload" "Repo Manual"
.TH REPO "1" "August 2022" "repo upload" "Repo Manual"
.SH NAME
repo \- repo upload - manual page for repo upload
.SH SYNOPSIS
@ -54,6 +54,9 @@ upload as a private change (deprecated; use \fB\-\-wip\fR)
\fB\-w\fR, \fB\-\-wip\fR
upload as a work\-in\-progress change
.TP
\fB\-r\fR, \fB\-\-ready\fR
mark change as ready (clears work\-in\-progress setting)
.TP
\fB\-o\fR PUSH_OPTIONS, \fB\-\-push\-option\fR=\fI\,PUSH_OPTIONS\/\fR
additional push options to transmit
.TP
@ -66,6 +69,12 @@ do everything except actually upload the CL
\fB\-y\fR, \fB\-\-yes\fR
answer yes to all safe prompts
.TP
\fB\-\-ignore\-untracked\-files\fR
ignore untracked files in the working copy
.TP
\fB\-\-no\-ignore\-untracked\-files\fR
always ask about untracked files in the working copy
.TP
\fB\-\-no\-cert\-checks\fR
disable verifying ssl certs (unsafe)
.SS Logging options:
@ -75,6 +84,19 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.SS pre\-upload hooks:
.TP
\fB\-\-no\-verify\fR
@ -105,6 +127,12 @@ respective list of users, and emails are sent to any new users. Users passed as
\fB\-\-reviewers\fR must already be registered with the code review system, or the
upload will fail.
.PP
While most normal Gerrit options have dedicated command line options, direct
access to the Gerrit options is available via \fB\-\-push\-options\fR. This is useful when
Gerrit has newer functionality that repo upload doesn't yet support, or doesn't
have plans to support. See the Push Options documentation for more details:
https://gerrit\-review.googlesource.com/Documentation/user\-upload.html#push_options
.PP
Configuration
.PP
review.URL.autoupload:


@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo version" "Repo Manual"
.TH REPO "1" "July 2022" "repo version" "Repo Manual"
.SH NAME
repo \- repo version - manual page for repo version
.SH SYNOPSIS
@ -20,5 +20,18 @@ show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help version` to view the detailed manual.


@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2021" "repo" "Repo Manual"
.TH REPO "1" "November 2022" "repo" "Repo Manual"
.SH NAME
repo \- repository management tool built on top of git
.SH SYNOPSIS
@ -25,6 +25,10 @@ control color usage: auto, always, never
\fB\-\-trace\fR
trace git command execution (REPO_TRACE=1)
.TP
\fB\-\-trace\-to\-stderr\fR
trace outputs go to stderr in addition to
\&.repo/TRACE_FILE
.TP
\fB\-\-trace\-python\fR
trace python command execution
.TP
@ -43,7 +47,10 @@ filename of event log to append timeline to
.TP
\fB\-\-git\-trace2\-event\-log\fR=\fI\,GIT_TRACE2_EVENT_LOG\/\fR
directory to write git trace2 event log to
.SS "The complete list of recognized repo commands are:"
.TP
\fB\-\-submanifest\-path\fR=\fI\,REL_PATH\/\fR
submanifest path
.SS "The complete list of recognized repo commands is:"
.TP
abandon
Permanently abandon a development branch

File diff suppressed because it is too large.


@ -29,7 +29,7 @@ def RunPager(globalConfig):
if not os.isatty(0) or not os.isatty(1):
return
pager = _SelectPager(globalConfig)
if pager == '' or pager == 'cat':
if pager == "" or pager == "cat":
return
if platform_utils.isWindows():
@ -46,8 +46,8 @@ def TerminatePager():
pager_process.stdin.close()
pager_process.wait()
pager_process = None
# Restore initial stdout/err in case there is more output in this process
# after shutting down the pager process
# Restore initial stdout/err in case there is more output in this
# process after shutting down the pager process.
sys.stdout = old_stdout
sys.stderr = old_stderr
@ -55,9 +55,13 @@ def TerminatePager():
def _PipePager(pager):
global pager_process, old_stdout, old_stderr
assert pager_process is None, "Only one active pager process at a time"
# Create pager process, piping stdout/err into its stdin
pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout,
stderr=sys.stderr)
# Create pager process, piping stdout/err into its stdin.
try:
pager_process = subprocess.Popen(
[pager], stdin=subprocess.PIPE, stdout=sys.stdout, stderr=sys.stderr
)
except FileNotFoundError:
sys.exit(f'fatal: cannot start pager "{pager}"')
old_stdout = sys.stdout
old_stderr = sys.stderr
sys.stdout = pager_process.stdin
@ -69,7 +73,6 @@ def _ForkPager(pager):
# This process turns into the pager; a child it forks will
# do the real processing and output back to the pager. This
# is necessary to keep the pager in control of the tty.
#
try:
r, w = os.pipe()
pid = os.fork()
@ -93,32 +96,31 @@ def _ForkPager(pager):
def _SelectPager(globalConfig):
try:
return os.environ['GIT_PAGER']
return os.environ["GIT_PAGER"]
except KeyError:
pass
pager = globalConfig.GetString('core.pager')
pager = globalConfig.GetString("core.pager")
if pager:
return pager
try:
return os.environ['PAGER']
return os.environ["PAGER"]
except KeyError:
pass
return 'less'
return "less"
def _BecomePager(pager):
# Delaying execution of the pager until we have output
# ready works around a long-standing bug in popularly
# available versions of 'less', a better 'more'.
#
_a, _b, _c = select.select([0], [], [0])
os.environ['LESS'] = 'FRSX'
os.environ["LESS"] = "FRSX"
try:
os.execvp(pager, [pager])
except OSError:
os.execv('/bin/sh', ['sh', '-c', pager])
os.execv("/bin/sh", ["sh", "-c", pager])


@ -30,18 +30,24 @@ def isWindows():
def symlink(source, link_name):
"""Creates a symbolic link pointing to source named link_name.
Note: On Windows, source must exist on disk, as the implementation needs
to know whether to create a "File" or a "Directory" symbolic link.
"""
if isWindows():
import platform_utils_win32
source = _validate_winpath(source)
link_name = _validate_winpath(link_name)
target = os.path.join(os.path.dirname(link_name), source)
if isdir(target):
platform_utils_win32.create_dirsymlink(_makelongpath(source), link_name)
platform_utils_win32.create_dirsymlink(
_makelongpath(source), link_name
)
else:
platform_utils_win32.create_filesymlink(_makelongpath(source), link_name)
platform_utils_win32.create_filesymlink(
_makelongpath(source), link_name
)
else:
return os.symlink(source, link_name)
@ -50,8 +56,10 @@ def _validate_winpath(path):
path = os.path.normpath(path)
if _winpath_is_valid(path):
return path
raise ValueError("Path \"%s\" must be a relative path or an absolute "
"path starting with a drive letter".format(path))
raise ValueError(
'Path "{}" must be a relative path or an absolute '
"path starting with a drive letter".format(path)
)
def _winpath_is_valid(path):
@ -77,16 +85,17 @@ def _makelongpath(path):
MAX_PATH limit.
"""
if isWindows():
# Note: MAX_PATH is 260, but, for directories, the maximum value is actually 246.
# Note: MAX_PATH is 260, but, for directories, the maximum value is
# actually 246.
if len(path) < 246:
return path
if path.startswith(u"\\\\?\\"):
if path.startswith("\\\\?\\"):
return path
if not os.path.isabs(path):
return path
# Append prefix and ensure unicode so that the special longpath syntax
# is supported by underlying Win32 API calls
return u"\\\\?\\" + os.path.normpath(path)
return "\\\\?\\" + os.path.normpath(path)
else:
return path
@ -94,7 +103,8 @@ def _makelongpath(path):
def rmtree(path, ignore_errors=False):
"""shutil.rmtree(path) wrapper with support for long paths on Windows.
Availability: Unix, Windows."""
Availability: Unix, Windows.
"""
onerror = None
if isWindows():
path = _makelongpath(path)
@ -103,7 +113,7 @@ def rmtree(path, ignore_errors=False):
def handle_rmtree_error(function, path, excinfo):
# Allow deleting read-only files
# Allow deleting read-only files.
os.chmod(path, stat.S_IWRITE)
function(path)
@ -111,7 +121,8 @@ def handle_rmtree_error(function, path, excinfo):
def rename(src, dst):
"""os.rename(src, dst) wrapper with support for long paths on Windows.
Availability: Unix, Windows."""
Availability: Unix, Windows.
"""
if isWindows():
# On Windows, rename fails if destination exists, see
# https://docs.python.org/2/library/os.html#os.rename
@ -124,17 +135,17 @@ def rename(src, dst):
else:
raise
else:
os.rename(src, dst)
shutil.move(src, dst)
def remove(path):
def remove(path, missing_ok=False):
"""Remove (delete) the file path. This is a replacement for os.remove that
allows deleting read-only files on Windows, with support for long paths and
for deleting directory symbolic links.
Availability: Unix, Windows."""
if isWindows():
longpath = _makelongpath(path)
Availability: Unix, Windows.
"""
longpath = _makelongpath(path) if isWindows() else path
try:
os.remove(longpath)
except OSError as e:
@ -145,10 +156,10 @@ def remove(path):
os.rmdir(longpath)
else:
os.remove(longpath)
elif missing_ok and e.errno == errno.ENOENT:
pass
else:
raise
else:
os.remove(path)
def walk(top, topdown=True, onerror=None, followlinks=False):
@ -182,7 +193,9 @@ def _walk_windows_impl(top, topdown, onerror, followlinks):
for name in dirs:
new_path = os.path.join(top, name)
if followlinks or not islink(new_path):
for x in _walk_windows_impl(new_path, topdown, onerror, followlinks):
for x in _walk_windows_impl(
new_path, topdown, onerror, followlinks
):
yield x
if not topdown:
yield top, dirs, nondirs
@ -219,6 +232,7 @@ def islink(path):
"""
if isWindows():
import platform_utils_win32
return platform_utils_win32.islink(_makelongpath(path))
else:
return os.path.islink(path)
@ -234,6 +248,7 @@ def readlink(path):
"""
if isWindows():
import platform_utils_win32
return platform_utils_win32.readlink(_makelongpath(path))
else:
return os.readlink(path)
@ -251,10 +266,12 @@ def realpath(path):
for c in range(0, 100): # Avoid cycles
if islink(current_path):
target = readlink(current_path)
current_path = os.path.join(os.path.dirname(current_path), target)
current_path = os.path.join(
os.path.dirname(current_path), target
)
else:
basename = os.path.basename(current_path)
if basename == '':
if basename == "":
path_tail.append(current_path)
break
path_tail.append(basename)
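To illustrate the missing_ok parameter added to remove() earlier in this file, here is a minimal sketch of the same idea without the Windows long-path and read-only handling (an approximation, not the helper itself; the path below is made up):

import errno
import os

def remove(path, missing_ok=False):
    try:
        os.remove(path)
    except OSError as e:
        if missing_ok and e.errno == errno.ENOENT:
            return  # The file is already gone; treat that as success.
        raise

remove("/tmp/definitely-not-here", missing_ok=True)  # no exception raised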


@ -19,7 +19,7 @@ from ctypes import c_buffer, c_ubyte, Structure, Union, byref
from ctypes.wintypes import BOOL, BOOLEAN, LPCWSTR, DWORD, HANDLE
from ctypes.wintypes import WCHAR, USHORT, LPVOID, ULONG, LPDWORD
kernel32 = WinDLL('kernel32', use_last_error=True)
kernel32 = WinDLL("kernel32", use_last_error=True)
UCHAR = c_ubyte
@ -31,14 +31,17 @@ ERROR_PRIVILEGE_NOT_HELD = 1314
# Win32 API entry points
CreateSymbolicLinkW = kernel32.CreateSymbolicLinkW
CreateSymbolicLinkW.restype = BOOLEAN
CreateSymbolicLinkW.argtypes = (LPCWSTR, # lpSymlinkFileName In
CreateSymbolicLinkW.argtypes = (
LPCWSTR, # lpSymlinkFileName In
LPCWSTR, # lpTargetFileName In
DWORD) # dwFlags In
DWORD, # dwFlags In
)
# Symbolic link creation flags
SYMBOLIC_LINK_FLAG_FILE = 0x00
SYMBOLIC_LINK_FLAG_DIRECTORY = 0x01
# symlink support for CreateSymbolicLink() starting with Windows 10 (1703, v10.0.14972)
# symlink support for CreateSymbolicLink() starting with Windows 10 (1703,
# v10.0.14972)
SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE = 0x02
GetFileAttributesW = kernel32.GetFileAttributesW
@ -50,13 +53,15 @@ FILE_ATTRIBUTE_REPARSE_POINT = 0x00400
CreateFileW = kernel32.CreateFileW
CreateFileW.restype = HANDLE
CreateFileW.argtypes = (LPCWSTR, # lpFileName In
CreateFileW.argtypes = (
LPCWSTR, # lpFileName In
DWORD, # dwDesiredAccess In
DWORD, # dwShareMode In
LPVOID, # lpSecurityAttributes In_opt
DWORD, # dwCreationDisposition In
DWORD, # dwFlagsAndAttributes In
HANDLE) # hTemplateFile In_opt
HANDLE, # hTemplateFile In_opt
)
CloseHandle = kernel32.CloseHandle
CloseHandle.restype = BOOL
@ -69,14 +74,16 @@ FILE_FLAG_OPEN_REPARSE_POINT = 0x00200000
DeviceIoControl = kernel32.DeviceIoControl
DeviceIoControl.restype = BOOL
DeviceIoControl.argtypes = (HANDLE, # hDevice In
DeviceIoControl.argtypes = (
HANDLE, # hDevice In
DWORD, # dwIoControlCode In
LPVOID, # lpInBuffer In_opt
DWORD, # nInBufferSize In
LPVOID, # lpOutBuffer Out_opt
DWORD, # nOutBufferSize In
LPDWORD, # lpBytesReturned Out_opt
LPVOID) # lpOverlapped Inout_opt
LPVOID, # lpOverlapped Inout_opt
)
# Device I/O control flags and options
FSCTL_GET_REPARSE_POINT = 0x000900A8
@ -86,16 +93,18 @@ MAXIMUM_REPARSE_DATA_BUFFER_SIZE = 0x4000
class GENERIC_REPARSE_BUFFER(Structure):
_fields_ = (('DataBuffer', UCHAR * 1),)
_fields_ = (("DataBuffer", UCHAR * 1),)
class SYMBOLIC_LINK_REPARSE_BUFFER(Structure):
_fields_ = (('SubstituteNameOffset', USHORT),
('SubstituteNameLength', USHORT),
('PrintNameOffset', USHORT),
('PrintNameLength', USHORT),
('Flags', ULONG),
('PathBuffer', WCHAR * 1))
_fields_ = (
("SubstituteNameOffset", USHORT),
("SubstituteNameLength", USHORT),
("PrintNameOffset", USHORT),
("PrintNameLength", USHORT),
("Flags", ULONG),
("PathBuffer", WCHAR * 1),
)
@property
def PrintName(self):
@ -105,11 +114,13 @@ class SYMBOLIC_LINK_REPARSE_BUFFER(Structure):
class MOUNT_POINT_REPARSE_BUFFER(Structure):
_fields_ = (('SubstituteNameOffset', USHORT),
('SubstituteNameLength', USHORT),
('PrintNameOffset', USHORT),
('PrintNameLength', USHORT),
('PathBuffer', WCHAR * 1))
_fields_ = (
("SubstituteNameOffset", USHORT),
("SubstituteNameLength", USHORT),
("PrintNameOffset", USHORT),
("PrintNameLength", USHORT),
("PathBuffer", WCHAR * 1),
)
@property
def PrintName(self):
@ -120,14 +131,19 @@ class MOUNT_POINT_REPARSE_BUFFER(Structure):
class REPARSE_DATA_BUFFER(Structure):
class REPARSE_BUFFER(Union):
_fields_ = (('SymbolicLinkReparseBuffer', SYMBOLIC_LINK_REPARSE_BUFFER),
('MountPointReparseBuffer', MOUNT_POINT_REPARSE_BUFFER),
('GenericReparseBuffer', GENERIC_REPARSE_BUFFER))
_fields_ = (('ReparseTag', ULONG),
('ReparseDataLength', USHORT),
('Reserved', USHORT),
('ReparseBuffer', REPARSE_BUFFER))
_anonymous_ = ('ReparseBuffer',)
_fields_ = (
("SymbolicLinkReparseBuffer", SYMBOLIC_LINK_REPARSE_BUFFER),
("MountPointReparseBuffer", MOUNT_POINT_REPARSE_BUFFER),
("GenericReparseBuffer", GENERIC_REPARSE_BUFFER),
)
_fields_ = (
("ReparseTag", ULONG),
("ReparseDataLength", USHORT),
("Reserved", USHORT),
("ReparseBuffer", REPARSE_BUFFER),
)
_anonymous_ = ("ReparseBuffer",)
def create_filesymlink(source, link_name):
@ -136,25 +152,27 @@ def create_filesymlink(source, link_name):
def create_dirsymlink(source, link_name):
"""Creates a Windows directory symbolic link source pointing to link_name.
"""
"""Creates a Windows directory symbolic link source pointing to link_name.""" # noqa: E501
_create_symlink(source, link_name, SYMBOLIC_LINK_FLAG_DIRECTORY)
def _create_symlink(source, link_name, dwFlags):
if not CreateSymbolicLinkW(link_name, source,
dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
# See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0
# "the unprivileged create flag is unsupported below Windows 10 (1703, v10.0.14972).
# retry without it."
if not CreateSymbolicLinkW(
link_name,
source,
dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE,
):
# See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0 # noqa: E501
# "the unprivileged create flag is unsupported below Windows 10 (1703,
# v10.0.14972). retry without it."
if not CreateSymbolicLinkW(link_name, source, dwFlags):
code = get_last_error()
error_desc = FormatError(code).strip()
if code == ERROR_PRIVILEGE_NOT_HELD:
raise OSError(errno.EPERM, error_desc, link_name)
_raise_winerror(
code,
'Error creating symbolic link \"%s\"'.format(link_name))
code, 'Error creating symbolic link "{}"'.format(link_name)
)
def islink(path):
@ -165,45 +183,48 @@ def islink(path):
def readlink(path):
reparse_point_handle = CreateFileW(path,
reparse_point_handle = CreateFileW(
path,
0,
0,
None,
OPEN_EXISTING,
FILE_FLAG_OPEN_REPARSE_POINT |
FILE_FLAG_BACKUP_SEMANTICS,
None)
FILE_FLAG_OPEN_REPARSE_POINT | FILE_FLAG_BACKUP_SEMANTICS,
None,
)
if reparse_point_handle == INVALID_HANDLE_VALUE:
_raise_winerror(
get_last_error(),
'Error opening symbolic link \"%s\"'.format(path))
get_last_error(), 'Error opening symbolic link "{}"'.format(path)
)
target_buffer = c_buffer(MAXIMUM_REPARSE_DATA_BUFFER_SIZE)
n_bytes_returned = DWORD()
io_result = DeviceIoControl(reparse_point_handle,
io_result = DeviceIoControl(
reparse_point_handle,
FSCTL_GET_REPARSE_POINT,
None,
0,
target_buffer,
len(target_buffer),
byref(n_bytes_returned),
None)
None,
)
CloseHandle(reparse_point_handle)
if not io_result:
_raise_winerror(
get_last_error(),
'Error reading symbolic link \"%s\"'.format(path))
get_last_error(), 'Error reading symbolic link "{}"'.format(path)
)
rdb = REPARSE_DATA_BUFFER.from_buffer(target_buffer)
if rdb.ReparseTag == IO_REPARSE_TAG_SYMLINK:
return rdb.SymbolicLinkReparseBuffer.PrintName
elif rdb.ReparseTag == IO_REPARSE_TAG_MOUNT_POINT:
return rdb.MountPointReparseBuffer.PrintName
# Unsupported reparse point type
# Unsupported reparse point type.
_raise_winerror(
ERROR_NOT_SUPPORTED,
'Error reading symbolic link \"%s\"'.format(path))
ERROR_NOT_SUPPORTED, 'Error reading symbolic link "{}"'.format(path)
)
def _raise_winerror(code, error_desc):
win_error_desc = FormatError(code).strip()
error_desc = "%s: %s".format(error_desc, win_error_desc)
error_desc = "{0}: {1}".format(error_desc, win_error_desc)
raise WinError(code, error_desc)
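One detail worth calling out in the error-message changes above: the old strings mixed printf-style "%s" placeholders with str.format(), which silently leaves the placeholder untouched. A two-line demonstration (the path is made up):

path = r"C:\some\link"
print('Error reading symbolic link "%s"'.format(path))  # "%s" survives verbatim
print('Error reading symbolic link "{}"'.format(path))  # path actually interpolated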


@ -14,15 +14,33 @@
import os
import sys
from time import time
from repo_trace import IsTrace
import time
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
from repo_trace import IsTraceToStderr
_NOT_TTY = not os.isatty(2)
# This will erase all content in the current line (wherever the cursor is).
# It does not move the cursor, so this is usually followed by \r to move to
# column 0.
CSI_ERASE_LINE = '\x1b[2K'
CSI_ERASE_LINE = "\x1b[2K"
# This will erase all content in the current line after the cursor. This is
# useful for partial updates & progress messages as the terminal can display
# it better.
CSI_ERASE_LINE_AFTER = "\x1b[K"
def convert_to_hms(total):
"""Converts a period of seconds to hours, minutes, and seconds."""
hours, rem = divmod(total, 3600)
mins, secs = divmod(rem, 60)
return int(hours), int(mins), secs
def duration_str(total):
@ -31,101 +49,177 @@ def duration_str(total):
The default timedelta stringification contains a lot of leading zeros and
uses microsecond resolution. This makes for noisy output.
"""
hours, rem = divmod(total, 3600)
mins, secs = divmod(rem, 60)
ret = '%.3fs' % (secs,)
hours, mins, secs = convert_to_hms(total)
ret = "%.3fs" % (secs,)
if mins:
ret = '%im%s' % (mins, ret)
ret = "%im%s" % (mins, ret)
if hours:
ret = '%ih%s' % (hours, ret)
ret = "%ih%s" % (hours, ret)
return ret
def elapsed_str(total):
"""Returns seconds in the format [H:]MM:SS.
Does not display a leading zero for minutes if under 10 minutes. This should
be used when displaying elapsed time in a progress indicator.
"""
hours, mins, secs = convert_to_hms(total)
ret = f"{int(secs):>02d}"
if total >= 3600:
# Show leading zeroes if over an hour.
ret = f"{mins:>02d}:{ret}"
else:
ret = f"{mins}:{ret}"
if hours:
ret = f"{hours}:{ret}"
return ret
def jobs_str(total):
return f"{total} job{'s' if total > 1 else ''}"
class Progress(object):
def __init__(self, title, total=0, units='', print_newline=False, delay=True,
quiet=False):
def __init__(
self,
title,
total=0,
units="",
delay=True,
quiet=False,
show_elapsed=False,
elide=False,
):
self._title = title
self._total = total
self._done = 0
self._start = time()
self._start = time.time()
self._show = not delay
self._units = units
self._print_newline = print_newline
self._elide = elide
# Only show the active jobs section if we run more than one in parallel.
self._show_jobs = False
self._active = 0
# Save the last message for displaying on refresh.
self._last_msg = None
self._show_elapsed = show_elapsed
self._update_event = _threading.Event()
self._update_thread = _threading.Thread(
target=self._update_loop,
)
self._update_thread.daemon = True
# When quiet, never show any output. It's a bit hacky, but reusing the
# existing logic that delays initial output keeps the rest of the class
# clean. Basically we set the start time to years in the future.
if quiet:
self._show = False
self._start += 2**32
elif show_elapsed:
self._update_thread.start()
def _update_loop(self):
while True:
self.update(inc=0)
if self._update_event.wait(timeout=1):
return
def _write(self, s):
s = "\r" + s
if self._elide:
col = os.get_terminal_size().columns
if len(s) > col:
s = s[: col - 1] + ".."
sys.stderr.write(s)
sys.stderr.flush()
def start(self, name):
self._active += 1
if not self._show_jobs:
self._show_jobs = self._active > 1
self.update(inc=0, msg='started ' + name)
self.update(inc=0, msg="started " + name)
def finish(self, name):
self.update(msg='finished ' + name)
self.update(msg="finished " + name)
self._active -= 1
def update(self, inc=1, msg=''):
self._done += inc
def update(self, inc=1, msg=None):
"""Updates the progress indicator.
if _NOT_TTY or IsTrace():
Args:
inc: The number of items completed.
msg: The message to display. If None, use the last message.
"""
self._done += inc
if msg is None:
msg = self._last_msg
self._last_msg = msg
if _NOT_TTY or IsTraceToStderr():
return
elapsed_sec = time.time() - self._start
if not self._show:
if 0.5 <= time() - self._start:
if 0.5 <= elapsed_sec:
self._show = True
else:
return
if self._total <= 0:
sys.stderr.write('%s\r%s: %d,' % (
CSI_ERASE_LINE,
self._title,
self._done))
sys.stderr.flush()
self._write(
"%s: %d,%s" % (self._title, self._done, CSI_ERASE_LINE_AFTER)
)
else:
p = (100 * self._done) / self._total
if self._show_jobs:
jobs = '[%d job%s] ' % (self._active, 's' if self._active > 1 else '')
jobs = f"[{jobs_str(self._active)}] "
else:
jobs = ''
sys.stderr.write('%s\r%s: %2d%% %s(%d%s/%d%s)%s%s%s' % (
CSI_ERASE_LINE,
jobs = ""
if self._show_elapsed:
elapsed = f" {elapsed_str(elapsed_sec)} |"
else:
elapsed = ""
self._write(
"%s: %2d%% %s(%d%s/%d%s)%s %s%s"
% (
self._title,
p,
jobs,
self._done, self._units,
self._total, self._units,
' ' if msg else '', msg,
'\n' if self._print_newline else ''))
sys.stderr.flush()
self._done,
self._units,
self._total,
self._units,
elapsed,
msg,
CSI_ERASE_LINE_AFTER,
)
)
def end(self):
if _NOT_TTY or IsTrace() or not self._show:
self._update_event.set()
if _NOT_TTY or IsTraceToStderr() or not self._show:
return
duration = duration_str(time() - self._start)
duration = duration_str(time.time() - self._start)
if self._total <= 0:
sys.stderr.write('%s\r%s: %d, done in %s\n' % (
CSI_ERASE_LINE,
self._title,
self._done,
duration))
sys.stderr.flush()
self._write(
"%s: %d, done in %s%s\n"
% (self._title, self._done, duration, CSI_ERASE_LINE_AFTER)
)
else:
p = (100 * self._done) / self._total
sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s), done in %s\n' % (
CSI_ERASE_LINE,
self._write(
"%s: %3d%% (%d%s/%d%s), done in %s%s\n"
% (
self._title,
p,
self._done, self._units,
self._total, self._units,
duration))
sys.stderr.flush()
self._done,
self._units,
self._total,
self._units,
duration,
CSI_ERASE_LINE_AFTER,
)
)
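For reference, the new [H:]MM:SS formatting above behaves as follows. This is a standalone copy of the helpers so the snippet runs on its own, with expected output in comments:

def convert_to_hms(total):
    """Converts a period of seconds to hours, minutes, and seconds."""
    hours, rem = divmod(total, 3600)
    mins, secs = divmod(rem, 60)
    return int(hours), int(mins), secs

def elapsed_str(total):
    """Returns seconds in the format [H:]MM:SS."""
    hours, mins, secs = convert_to_hms(total)
    ret = f"{int(secs):>02d}"
    if total >= 3600:
        # Show leading zeroes for minutes once the total passes an hour.
        ret = f"{mins:>02d}:{ret}"
    else:
        ret = f"{mins}:{ret}"
    if hours:
        ret = f"{hours}:{ret}"
    return ret

print(elapsed_str(45))    # 0:45
print(elapsed_str(75))    # 1:15
print(elapsed_str(3725))  # 1:02:05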

project.py: 2996 lines changed (diff suppressed because it is too large)

pyproject.toml: new file, 18 lines

@ -0,0 +1,18 @@
# Copyright 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
[tool.black]
line-length = 80
# NB: Keep in sync with tox.ini.
target-version = ['py36', 'py37', 'py38', 'py39', 'py310']


@ -20,6 +20,7 @@ This is intended to be run only by the official Repo release managers.
import argparse
import os
import re
import subprocess
import sys
@ -28,57 +29,104 @@ import util
def sign(opts):
"""Sign the launcher!"""
output = ''
output = ""
for key in opts.keys:
# We use ! at the end of the key so that gpg uses this specific key.
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['gpg', '--homedir', opts.gpgdir, '-u', f'{key}!', '--batch', '--yes',
'--armor', '--detach-sign', '--output', '-', opts.launcher]
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
# Otherwise it uses the key as a lookup into the overall key and uses
# the default signing key. i.e. It will see that KEYID_RSA is a subkey
# of another key, and use the primary key to sign instead of the subkey.
cmd = [
"gpg",
"--homedir",
opts.gpgdir,
"-u",
f"{key}!",
"--batch",
"--yes",
"--armor",
"--detach-sign",
"--output",
"-",
opts.launcher,
]
ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
output += ret.stdout
# Save the combined signatures into one file.
with open(f'{opts.launcher}.asc', 'w', encoding='utf-8') as fp:
with open(f"{opts.launcher}.asc", "w", encoding="utf-8") as fp:
fp.write(output)
def check(opts):
"""Check the signature."""
util.run(opts, ['gpg', '--verify', f'{opts.launcher}.asc'])
util.run(opts, ["gpg", "--verify", f"{opts.launcher}.asc"])
def postmsg(opts):
def get_version(opts):
"""Get the version from |launcher|."""
# Make sure we don't search $PATH when signing the "repo" file in the cwd.
launcher = os.path.join(".", opts.launcher)
cmd = [launcher, "--version"]
ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
m = re.search(r"repo launcher version ([0-9.]+)", ret.stdout)
if not m:
sys.exit(f"{opts.launcher}: unable to detect repo version")
return m.group(1)
def postmsg(opts, version):
"""Helpful info to show at the end for release manager."""
print(f"""
print(
f"""
Repo launcher bucket:
gs://git-repo-downloads/
To upload this launcher directly:
gsutil cp -a public-read {opts.launcher} {opts.launcher}.asc gs://git-repo-downloads/
You should first upload it with a specific version:
gsutil cp -a public-read {opts.launcher} gs://git-repo-downloads/repo-{version}
gsutil cp -a public-read {opts.launcher}.asc gs://git-repo-downloads/repo-{version}.asc
NB: You probably want to upload it with a specific version first, e.g.:
gsutil cp -a public-read {opts.launcher} gs://git-repo-downloads/repo-3.0
gsutil cp -a public-read {opts.launcher}.asc gs://git-repo-downloads/repo-3.0.asc
""")
Then to make it the public default:
gsutil cp -a public-read gs://git-repo-downloads/repo-{version} gs://git-repo-downloads/repo
gsutil cp -a public-read gs://git-repo-downloads/repo-{version}.asc gs://git-repo-downloads/repo.asc
NB: If a rollback is necessary, the GS bucket archives old versions, and may be
accessed by specifying their unique id number.
gsutil ls -la gs://git-repo-downloads/repo gs://git-repo-downloads/repo.asc
gsutil cp -a public-read gs://git-repo-downloads/repo#<unique id> gs://git-repo-downloads/repo
gsutil cp -a public-read gs://git-repo-downloads/repo.asc#<unique id> gs://git-repo-downloads/repo.asc
""" # noqa: E501
)
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('--keyid', dest='keys', default=[], action='append',
help='alternative signing keys to use')
parser.add_argument('launcher',
default=os.path.join(util.TOPDIR, 'repo'), nargs='?',
help='the launcher script to sign')
parser.add_argument(
"-n",
"--dry-run",
dest="dryrun",
action="store_true",
help="show everything that would be done",
)
parser.add_argument(
"--gpgdir",
default=os.path.join(util.HOMEDIR, ".gnupg", "repo"),
help="path to dedicated gpg dir with release keys "
"(default: ~/.gnupg/repo/)",
)
parser.add_argument(
"--keyid",
dest="keys",
default=[],
action="append",
help="alternative signing keys to use",
)
parser.add_argument(
"launcher",
default=os.path.join(util.TOPDIR, "repo"),
nargs="?",
help="the launcher script to sign",
)
return parser
@ -88,27 +136,30 @@ def main(argv):
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
parser.error(f"--gpgdir does not exist: {opts.gpgdir}")
if not os.path.exists(opts.launcher):
parser.error(f'launcher does not exist: {opts.launcher}')
parser.error(f"launcher does not exist: {opts.launcher}")
opts.launcher = os.path.relpath(opts.launcher)
print(f'Signing "{opts.launcher}" launcher script and saving to '
f'"{opts.launcher}.asc"')
print(
f'Signing "{opts.launcher}" launcher script and saving to '
f'"{opts.launcher}.asc"'
)
if opts.keys:
print(f'Using custom keys to sign: {" ".join(opts.keys)}')
else:
print('Using official Repo release keys to sign')
print("Using official Repo release keys to sign")
opts.keys = [util.KEYID_DSA, util.KEYID_RSA, util.KEYID_ECC]
util.import_release_key(opts)
version = get_version(opts)
sign(opts)
check(opts)
postmsg(opts)
postmsg(opts, version)
return 0
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))
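The version detection added in get_version() above boils down to one regular expression. A quick sanity check against a made-up `repo --version` output (the version number and path here are hypothetical):

import re

stdout = "repo launcher version 2.32\n       (from /usr/bin/repo)\n"
m = re.search(r"repo launcher version ([0-9.]+)", stdout)
print(m.group(1) if m else "unable to detect repo version")  # -> 2.32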


@ -35,7 +35,7 @@ import util
KEYID = util.KEYID_DSA
# Regular expression to validate tag names.
RE_VALID_TAG = r'^v([0-9]+[.])+[0-9]+$'
RE_VALID_TAG = r"^v([0-9]+[.])+[0-9]+$"
def sign(opts):
@ -44,11 +44,20 @@ def sign(opts):
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['git', 'tag', '-s', opts.tag, '-u', f'{opts.key}!',
'-m', f'repo {opts.tag}', opts.commit]
cmd = [
"git",
"tag",
"-s",
opts.tag,
"-u",
f"{opts.key}!",
"-m",
f"repo {opts.tag}",
opts.commit,
]
key = 'GNUPGHOME'
print('+', f'export {key}="{opts.gpgdir}"')
key = "GNUPGHOME"
print("+", f'export {key}="{opts.gpgdir}"')
oldvalue = os.getenv(key)
os.putenv(key, opts.gpgdir)
util.run(opts, cmd)
@ -60,21 +69,27 @@ def sign(opts):
def check(opts):
"""Check the signature."""
util.run(opts, ['git', 'tag', '--verify', opts.tag])
util.run(opts, ["git", "tag", "--verify", opts.tag])
def postmsg(opts):
"""Helpful info to show at the end for release manager."""
cmd = ['git', 'rev-parse', 'remotes/origin/stable']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
cmd = ["git", "rev-parse", "remotes/origin/stable"]
ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
current_release = ret.stdout.strip()
cmd = ['git', 'log', '--format=%h (%aN) %s', '--no-merges',
f'remotes/origin/stable..{opts.tag}']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
cmd = [
"git",
"log",
"--format=%h (%aN) %s",
"--no-merges",
f"remotes/origin/stable..{opts.tag}",
]
ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
shortlog = ret.stdout.strip()
print(f"""
print(
f"""
Here's the short log since the last release.
{shortlog}
@ -84,29 +99,39 @@ NB: People will start upgrading to this version immediately.
To roll back a release:
git push origin --force {current_release}:stable -n
""")
"""
)
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(
description=__doc__,
formatter_class=argparse.RawDescriptionHelpFormatter)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('-f', '--force', action='store_true',
help='force signing of any tag')
parser.add_argument('--keyid', dest='key',
help='alternative signing key to use')
parser.add_argument('tag',
help='the tag to create (e.g. "v2.0")')
parser.add_argument('commit', default='HEAD', nargs='?',
help='the commit to tag')
formatter_class=argparse.RawDescriptionHelpFormatter,
)
parser.add_argument(
"-n",
"--dry-run",
dest="dryrun",
action="store_true",
help="show everything that would be done",
)
parser.add_argument(
"--gpgdir",
default=os.path.join(util.HOMEDIR, ".gnupg", "repo"),
help="path to dedicated gpg dir with release keys "
"(default: ~/.gnupg/repo/)",
)
parser.add_argument(
"-f", "--force", action="store_true", help="force signing of any tag"
)
parser.add_argument(
"--keyid", dest="key", help="alternative signing key to use"
)
parser.add_argument("tag", help='the tag to create (e.g. "v2.0")')
parser.add_argument(
"commit", default="HEAD", nargs="?", help="the commit to tag"
)
return parser
@ -116,16 +141,18 @@ def main(argv):
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
parser.error(f"--gpgdir does not exist: {opts.gpgdir}")
if not opts.force and not re.match(RE_VALID_TAG, opts.tag):
parser.error(f'tag "{opts.tag}" does not match regex "{RE_VALID_TAG}"; '
'use --force to sign anyways')
parser.error(
f'tag "{opts.tag}" does not match regex "{RE_VALID_TAG}"; '
"use --force to sign anyways"
)
if opts.key:
print(f'Using custom key to sign: {opts.key}')
print(f"Using custom key to sign: {opts.key}")
else:
print('Using official Repo release key to sign')
print("Using official Repo release key to sign")
opts.key = KEYID
util.import_release_key(opts)
@ -136,5 +163,5 @@ def main(argv):
return 0
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))
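The tag validation above rejects anything that does not look like a dotted version number with a leading "v". A standalone check, with results noted in the comment (the sample tags are invented):

import re

RE_VALID_TAG = r"^v([0-9]+[.])+[0-9]+$"
for tag in ("v2.0", "v2.30.1", "v2", "2.0", "v2.0rc1"):
    print(tag, bool(re.match(RE_VALID_TAG, tag)))
# Only "v2.0" and "v2.30.1" match; the rest require --force to sign.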


@ -18,73 +18,8 @@
This is intended to be run before every official Repo release.
"""
from pathlib import Path
from functools import partial
import argparse
import multiprocessing
import os
import re
import shutil
import subprocess
import sys
import tempfile
TOPDIR = Path(__file__).resolve().parent.parent
MANDIR = TOPDIR.joinpath('man')
import update_manpages
# Load repo local modules.
sys.path.insert(0, str(TOPDIR))
from git_command import RepoSourceVersion
import subcmds
def worker(cmd, **kwargs):
subprocess.run(cmd, **kwargs)
def main(argv):
parser = argparse.ArgumentParser(description=__doc__)
opts = parser.parse_args(argv)
if not shutil.which('help2man'):
sys.exit('Please install help2man to continue.')
# Let repo know we're generating man pages so it can avoid some dynamic
# behavior (like probing active number of CPUs). We use a weird name &
# value to make it less likely for users to set this var themselves.
os.environ['_REPO_GENERATE_MANPAGES_'] = ' indeed! '
# "repo branch" is an alias for "repo branches".
del subcmds.all_commands['branch']
(MANDIR / 'repo-branch.1').write_text('.so man1/repo-branches.1')
version = RepoSourceVersion()
cmdlist = [['help2man', '-N', '-n', f'repo {cmd} - manual page for repo {cmd}',
'-S', f'repo {cmd}', '-m', 'Repo Manual', f'--version-string={version}',
'-o', MANDIR.joinpath(f'repo-{cmd}.1'), TOPDIR.joinpath('repo'),
'-h', f'help {cmd}'] for cmd in subcmds.all_commands]
cmdlist.append(['help2man', '-N', '-n', 'repository management tool built on top of git',
'-S', 'repo', '-m', 'Repo Manual', f'--version-string={version}',
'-o', MANDIR.joinpath('repo.1'), TOPDIR.joinpath('repo'),
'-h', '--help-all'])
with tempfile.TemporaryDirectory() as tempdir:
repo_dir = Path(tempdir) / '.repo'
repo_dir.mkdir()
(repo_dir / 'repo').symlink_to(TOPDIR)
# Run all cmd in parallel, and wait for them to finish.
with multiprocessing.Pool() as pool:
pool.map(partial(worker, cwd=tempdir, check=True), cmdlist)
regex = (
(r'(It was generated by help2man) [0-9.]+', '\g<1>.'),
(r'^\.IP\n(.*:)\n', '.SS \g<1>\n'),
(r'^\.PP\nDescription', '.SH DETAILS'),
)
for path in MANDIR.glob('*.1'):
data = path.read_text()
for pattern, replacement in regex:
data = re.sub(pattern, replacement, data, flags=re.M)
path.write_text(data)
if __name__ == '__main__':
sys.exit(main(sys.argv[1:]))
sys.exit(update_manpages.main(sys.argv[1:]))

release/update_manpages.py: new file, 153 lines

@ -0,0 +1,153 @@
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for generating manual page for all repo commands.
Most code lives in this module so it can be unittested.
"""
from pathlib import Path
from functools import partial
import argparse
import multiprocessing
import os
import re
import shutil
import subprocess
import sys
import tempfile
TOPDIR = Path(__file__).resolve().parent.parent
MANDIR = TOPDIR.joinpath("man")
# Load repo local modules.
sys.path.insert(0, str(TOPDIR))
from git_command import RepoSourceVersion
import subcmds
def worker(cmd, **kwargs):
subprocess.run(cmd, **kwargs)
def main(argv):
parser = argparse.ArgumentParser(description=__doc__)
parser.parse_args(argv)
if not shutil.which("help2man"):
sys.exit("Please install help2man to continue.")
# Let repo know we're generating man pages so it can avoid some dynamic
# behavior (like probing active number of CPUs). We use a weird name &
# value to make it less likely for users to set this var themselves.
os.environ["_REPO_GENERATE_MANPAGES_"] = " indeed! "
# "repo branch" is an alias for "repo branches".
del subcmds.all_commands["branch"]
(MANDIR / "repo-branch.1").write_text(".so man1/repo-branches.1")
version = RepoSourceVersion()
cmdlist = [
[
"help2man",
"-N",
"-n",
f"repo {cmd} - manual page for repo {cmd}",
"-S",
f"repo {cmd}",
"-m",
"Repo Manual",
f"--version-string={version}",
"-o",
MANDIR.joinpath(f"repo-{cmd}.1.tmp"),
"./repo",
"-h",
f"help {cmd}",
]
for cmd in subcmds.all_commands
]
cmdlist.append(
[
"help2man",
"-N",
"-n",
"repository management tool built on top of git",
"-S",
"repo",
"-m",
"Repo Manual",
f"--version-string={version}",
"-o",
MANDIR.joinpath("repo.1.tmp"),
"./repo",
"-h",
"--help-all",
]
)
with tempfile.TemporaryDirectory() as tempdir:
tempdir = Path(tempdir)
repo_dir = tempdir / ".repo"
repo_dir.mkdir()
(repo_dir / "repo").symlink_to(TOPDIR)
# Create a repo wrapper using the active Python executable. We can't
# pass this directly to help2man as it's too simple, so insert it via
# shebang.
data = (TOPDIR / "repo").read_text(encoding="utf-8")
tempbin = tempdir / "repo"
tempbin.write_text(f"#!{sys.executable}\n" + data, encoding="utf-8")
tempbin.chmod(0o755)
# Run all cmd in parallel, and wait for them to finish.
with multiprocessing.Pool() as pool:
pool.map(partial(worker, cwd=tempdir, check=True), cmdlist)
for tmp_path in MANDIR.glob("*.1.tmp"):
path = tmp_path.parent / tmp_path.stem
old_data = path.read_text() if path.exists() else ""
data = tmp_path.read_text()
tmp_path.unlink()
data = replace_regex(data)
# If the only thing that changed was the date, don't refresh. This
# avoids a lot of noise when only one file actually updates.
old_data = re.sub(
r'^(\.TH REPO "1" ")([^"]+)', r"\1", old_data, flags=re.M
)
new_data = re.sub(r'^(\.TH REPO "1" ")([^"]+)', r"\1", data, flags=re.M)
if old_data != new_data:
path.write_text(data)
def replace_regex(data):
"""Replace semantically null regexes in the data.
Args:
data: manpage text.
Returns:
Updated manpage text.
"""
regex = (
(r"(It was generated by help2man) [0-9.]+", r"\g<1>."),
(r"^\033\[[0-9;]*m([^\033]*)\033\[m", r"\g<1>"),
(r"^\.IP\n(.*:)\n", r".SS \g<1>\n"),
(r"^\.PP\nDescription", r".SH DETAILS"),
)
for pattern, replacement in regex:
data = re.sub(pattern, replacement, data, flags=re.M)
return data
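To see what replace_regex() above actually does, feed it a tiny slice of help2man output. The function is copied verbatim so the snippet runs standalone; the sample text is invented for illustration:

import re

def replace_regex(data):
    regex = (
        (r"(It was generated by help2man) [0-9.]+", r"\g<1>."),
        (r"^\033\[[0-9;]*m([^\033]*)\033\[m", r"\g<1>"),
        (r"^\.IP\n(.*:)\n", r".SS \g<1>\n"),
        (r"^\.PP\nDescription", r".SH DETAILS"),
    )
    for pattern, replacement in regex:
        data = re.sub(pattern, replacement, data, flags=re.M)
    return data

sample = (
    "It was generated by help2man 1.48.1.\n"
    ".IP\n"
    "Logging options:\n"
    ".PP\n"
    "Description\n"
)
print(replace_regex(sample))
# The help2man version number is dropped, the ".IP" header becomes a ".SS"
# section, and the "Description" paragraph is renamed to ".SH DETAILS".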


@ -20,54 +20,60 @@ import subprocess
import sys
assert sys.version_info >= (3, 6), 'This module requires Python 3.6+'
assert sys.version_info >= (3, 6), "This module requires Python 3.6+"
TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
HOMEDIR = os.path.expanduser('~')
HOMEDIR = os.path.expanduser("~")
# These are the release keys we sign with.
KEYID_DSA = '8BB9AD793E8E6153AF0F9A4416530D5E920F5C65'
KEYID_RSA = 'A34A13BE8E76BFF46A0C022DA2E75A824AAB9624'
KEYID_ECC = 'E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39'
KEYID_DSA = "8BB9AD793E8E6153AF0F9A4416530D5E920F5C65"
KEYID_RSA = "A34A13BE8E76BFF46A0C022DA2E75A824AAB9624"
KEYID_ECC = "E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39"
def cmdstr(cmd):
"""Get a nicely quoted shell command."""
ret = []
for arg in cmd:
if not re.match(r'^[a-zA-Z0-9/_.=-]+$', arg):
if not re.match(r"^[a-zA-Z0-9/_.=-]+$", arg):
arg = f'"{arg}"'
ret.append(arg)
return ' '.join(ret)
return " ".join(ret)
def run(opts, cmd, check=True, **kwargs):
"""Helper around subprocess.run to include logging."""
print('+', cmdstr(cmd))
print("+", cmdstr(cmd))
if opts.dryrun:
cmd = ['true', '--'] + cmd
cmd = ["true", "--"] + cmd
try:
return subprocess.run(cmd, check=check, **kwargs)
except subprocess.CalledProcessError as e:
print(f'aborting: {e}', file=sys.stderr)
print(f"aborting: {e}", file=sys.stderr)
sys.exit(1)
def import_release_key(opts):
"""Import the public key of the official release repo signing key."""
# Extract the key from our repo launcher.
launcher = getattr(opts, 'launcher', os.path.join(TOPDIR, 'repo'))
launcher = getattr(opts, "launcher", os.path.join(TOPDIR, "repo"))
print(f'Importing keys from "{launcher}" launcher script')
with open(launcher, encoding='utf-8') as fp:
with open(launcher, encoding="utf-8") as fp:
data = fp.read()
keys = re.findall(
r'\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*'
r'\n-----END PGP PUBLIC KEY BLOCK-----\n', data, flags=re.M)
run(opts, ['gpg', '--import'], input='\n'.join(keys).encode('utf-8'))
r"\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*"
r"\n-----END PGP PUBLIC KEY BLOCK-----\n",
data,
flags=re.M,
)
run(opts, ["gpg", "--import"], input="\n".join(keys).encode("utf-8"))
print('Marking keys as fully trusted')
run(opts, ['gpg', '--import-ownertrust'],
input=f'{KEYID_DSA}:6:\n'.encode('utf-8'))
print("Marking keys as fully trusted")
run(
opts,
["gpg", "--import-ownertrust"],
input=f"{KEYID_DSA}:6:\n".encode("utf-8"),
)
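cmdstr() above only quotes arguments containing characters outside a conservative safe set. Copied standalone and run against a made-up command (the key id and path are placeholders):

import re

def cmdstr(cmd):
    """Get a nicely quoted shell command."""
    ret = []
    for arg in cmd:
        if not re.match(r"^[a-zA-Z0-9/_.=-]+$", arg):
            arg = f'"{arg}"'
        ret.append(arg)
    return " ".join(ret)

print(cmdstr(["gpg", "--homedir", "~/.gnupg/repo", "-u", "ABCD1234!"]))
# -> gpg --homedir "~/.gnupg/repo" -u "ABCD1234!"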

repo: 40 lines changed

@ -149,7 +149,7 @@ if not REPO_REV:
BUG_URL = 'https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue'
# increment this whenever we make important changes to this script
VERSION = (2, 15)
VERSION = (2, 32)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (2, 3)
@ -265,7 +265,8 @@ else:
urllib.error = urllib2
home_dot_repo = os.path.expanduser('~/.repoconfig')
repo_config_dir = os.getenv('REPO_CONFIG_DIR', os.path.expanduser('~'))
home_dot_repo = os.path.join(repo_config_dir, '.repoconfig')
gpg_dir = os.path.join(home_dot_repo, 'gnupg')
@ -312,6 +313,14 @@ def InitParser(parser, gitc_init=False):
metavar='PLATFORM')
group.add_option('--submodules', action='store_true',
help='sync any submodules associated with the manifest repo')
group.add_option('--standalone-manifest', action='store_true',
help='download the manifest as a static file '
'rather than create a git checkout of '
'the manifest repo')
group.add_option('--manifest-depth', type='int', default=0, metavar='DEPTH',
help='create a shallow clone of the manifest repo with '
'given depth (0 for full clone); see git clone '
'(default: %default)')
# Options that only affect manifest project, and not any of the projects
# specified in the manifest itself.
@ -321,9 +330,9 @@ def InitParser(parser, gitc_init=False):
# want -c, so try to satisfy both as best we can.
if not gitc_init:
cbr_opts += ['-c']
group.add_option(*cbr_opts,
group.add_option(*cbr_opts, default=True,
dest='current_branch_only', action='store_true',
help='fetch only current manifest branch from server')
help='fetch only current manifest branch from server (default)')
group.add_option('--no-current-branch',
dest='current_branch_only', action='store_false',
help='fetch all manifest branches from server')
@ -368,7 +377,7 @@ def InitParser(parser, gitc_init=False):
help='filter for use with --partial-clone '
'[default: %default]')
group.add_option('--use-superproject', action='store_true', default=None,
help='use the manifest superproject to sync projects')
help='use the manifest superproject to sync projects; implies -c')
group.add_option('--no-use-superproject', action='store_false',
dest='use_superproject',
help='disable use of manifest superprojects')
@ -378,6 +387,11 @@ def InitParser(parser, gitc_init=False):
group.add_option('--no-clone-bundle',
dest='clone_bundle', action='store_false',
help='disable use of /clone.bundle on HTTP/HTTPS (default if --partial-clone)')
group.add_option('--git-lfs', action='store_true',
help='enable Git LFS support')
group.add_option('--no-git-lfs',
dest='git_lfs', action='store_false',
help='disable Git LFS support')
# Tool.
group = parser.add_option_group('repo Version options')
@ -434,8 +448,7 @@ def run_command(cmd, **kwargs):
except UnicodeError:
print('repo: warning: Invalid UTF-8 output:\ncmd: %r\n%r' % (cmd, output),
file=sys.stderr)
# TODO(vapier): Once we require Python 3, use 'backslashreplace'.
return output.decode('utf-8', 'replace')
return output.decode('utf-8', 'backslashreplace')
# Run & package the results.
proc = subprocess.Popen(cmd, **kwargs)
@ -603,17 +616,23 @@ def _Init(args, gitc_init=False):
try:
if not opt.quiet:
print('Downloading Repo source from', url)
dst = os.path.abspath(os.path.join(repodir, S_repo))
dst_final = os.path.abspath(os.path.join(repodir, S_repo))
dst = dst_final + '.tmp'
shutil.rmtree(dst, ignore_errors=True)
_Clone(url, dst, opt.clone_bundle, opt.quiet, opt.verbose)
remote_ref, rev = check_repo_rev(dst, rev, opt.repo_verify, quiet=opt.quiet)
_Checkout(dst, remote_ref, rev, opt.quiet)
if not os.path.isfile(os.path.join(dst, 'repo')):
print("warning: '%s' does not look like a git-repo repository, is "
"REPO_URL set correctly?" % url, file=sys.stderr)
print("fatal: '%s' does not look like a git-repo repository, is "
"--repo-url set correctly?" % url, file=sys.stderr)
raise CloneFailure()
os.rename(dst, dst_final)
except CloneFailure:
print('fatal: double check your --repo-rev setting.', file=sys.stderr)
if opt.quiet:
print('fatal: repo init failed; run without --quiet to see why',
file=sys.stderr)
@ -1307,6 +1326,7 @@ def main(orig_args):
print("fatal: cloning the git-repo repository failed, will remove "
"'%s' " % path, file=sys.stderr)
shutil.rmtree(path, ignore_errors=True)
shutil.rmtree(path + '.tmp', ignore_errors=True)
sys.exit(1)
repo_main, rel_repo_dir = _FindRepo()
else:


@ -15,26 +15,156 @@
"""Logic for tracing repo interactions.
Activated via `repo --trace ...` or `REPO_TRACE=1 repo ...`.
Temporary: Tracing is always on. Set `REPO_TRACE=0` to turn off.
To also include trace output in stderr, use `repo --trace-to-stderr ...`
"""
import sys
import os
import time
import tempfile
from contextlib import ContextDecorator
import platform_utils
# Env var to implicitly turn on tracing.
REPO_TRACE = 'REPO_TRACE'
REPO_TRACE = "REPO_TRACE"
_TRACE = os.environ.get(REPO_TRACE) == '1'
# Temporarily set tracing to always on unless the user explicitly sets it to 0.
_TRACE = os.environ.get(REPO_TRACE) != "0"
_TRACE_TO_STDERR = False
_TRACE_FILE = None
_TRACE_FILE_NAME = "TRACE_FILE"
_MAX_SIZE = 70 # in MiB
_NEW_COMMAND_SEP = "+++++++++++++++NEW COMMAND+++++++++++++++++++"
def IsTraceToStderr():
"""Whether traces are written to stderr."""
return _TRACE_TO_STDERR
def IsTrace():
"""Whether tracing is enabled."""
return _TRACE
def SetTraceToStderr():
"""Enables tracing logging to stderr."""
global _TRACE_TO_STDERR
_TRACE_TO_STDERR = True
def SetTrace():
"""Enables tracing."""
global _TRACE
_TRACE = True
def Trace(fmt, *args):
if IsTrace():
print(fmt % args, file=sys.stderr)
def _SetTraceFile(quiet):
"""Sets the trace file location."""
global _TRACE_FILE
_TRACE_FILE = _GetTraceFile(quiet)
class Trace(ContextDecorator):
"""Used to capture and save git traces."""
def _time(self):
"""Generate nanoseconds of time in a py3.6 safe way"""
return int(time.time() * 1e9)
def __init__(self, fmt, *args, first_trace=False, quiet=True):
"""Initialize the object.
Args:
fmt: The format string for the trace.
*args: Arguments to pass to formatting.
first_trace: Whether this is the first trace of a `repo` invocation.
quiet: Whether to suppress notification of trace file location.
"""
if not IsTrace():
return
self._trace_msg = fmt % args
if not _TRACE_FILE:
_SetTraceFile(quiet)
if first_trace:
_ClearOldTraces()
self._trace_msg = f"{_NEW_COMMAND_SEP} {self._trace_msg}"
def __enter__(self):
if not IsTrace():
return self
print_msg = (
f"PID: {os.getpid()} START: {self._time()} :{self._trace_msg}\n"
)
with open(_TRACE_FILE, "a") as f:
print(print_msg, file=f)
if _TRACE_TO_STDERR:
print(print_msg, file=sys.stderr)
return self
def __exit__(self, *exc):
if not IsTrace():
return False
print_msg = (
f"PID: {os.getpid()} END: {self._time()} :{self._trace_msg}\n"
)
with open(_TRACE_FILE, "a") as f:
print(print_msg, file=f)
if _TRACE_TO_STDERR:
print(print_msg, file=sys.stderr)
return False
def _GetTraceFile(quiet):
"""Get the trace file or create one."""
# TODO: refactor to pass repodir to Trace.
repo_dir = os.path.dirname(os.path.dirname(__file__))
trace_file = os.path.join(repo_dir, _TRACE_FILE_NAME)
if not quiet:
print(f"Trace outputs in {trace_file}", file=sys.stderr)
return trace_file
def _ClearOldTraces():
"""Clear the oldest commands if trace file is too big."""
try:
with open(_TRACE_FILE, "r", errors="ignore") as f:
if os.path.getsize(f.name) / (1024 * 1024) <= _MAX_SIZE:
return
trace_lines = f.readlines()
except FileNotFoundError:
return
while sum(len(x) for x in trace_lines) / (1024 * 1024) > _MAX_SIZE:
for i, line in enumerate(trace_lines):
if "END:" in line and _NEW_COMMAND_SEP in line:
trace_lines = trace_lines[i + 1 :]
break
else:
# The last chunk is bigger than _MAX_SIZE, so just throw everything
# away.
trace_lines = []
while trace_lines and trace_lines[-1] == "\n":
trace_lines = trace_lines[:-1]
# Write to a temporary file with a unique name in the same filesystem
# before replacing the original trace file.
temp_dir, temp_prefix = os.path.split(_TRACE_FILE)
with tempfile.NamedTemporaryFile(
"w", dir=temp_dir, prefix=temp_prefix, delete=False
) as f:
f.writelines(trace_lines)
platform_utils.rename(f.name, _TRACE_FILE)
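The trimming strategy in _ClearOldTraces() above drops whole commands, everything up to and including an END line that carries the command separator, until the remaining log fits the size budget. A toy version with a tiny byte budget in place of the real 70 MiB limit (the trace lines are fabricated):

SEP = "+++++++++++++++NEW COMMAND+++++++++++++++++++"
MAX_BYTES = 200  # the real limit is 70 MiB; shrunk here for the example

trace_lines = [
    f"PID: 1 START: 1 :{SEP} repo sync\n",
    f"PID: 1 END: 2 :{SEP} repo sync\n",
    f"PID: 2 START: 3 :{SEP} repo status\n",
    f"PID: 2 END: 4 :{SEP} repo status\n",
]
while sum(len(x) for x in trace_lines) > MAX_BYTES:
    for i, line in enumerate(trace_lines):
        if "END:" in line and SEP in line:
            # Drop the oldest complete command.
            trace_lines = trace_lines[i + 1:]
            break
    else:
        trace_lines = []  # A single oversized command: throw everything away.

print(trace_lines)  # only the newest command ("repo status") survives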


@ -13,49 +13,40 @@
# See the License for the specific language governing permissions and
# limitations under the License.
"""Wrapper to run pytest with the right settings."""
"""Wrapper to run linters and pytest with the right settings."""
import errno
import os
import shutil
import subprocess
import sys
import pytest
def find_pytest():
"""Try to locate a good version of pytest."""
# If we're in a virtualenv, assume that it's provided the right pytest.
if 'VIRTUAL_ENV' in os.environ:
return 'pytest'
ROOT_DIR = os.path.dirname(os.path.realpath(__file__))
# Use the Python 3 version if available.
ret = shutil.which('pytest-3')
if ret:
return ret
# Hopefully this is a Python 3 version.
ret = shutil.which('pytest')
if ret:
return ret
def run_black():
"""Returns the exit code from black."""
return subprocess.run(
[sys.executable, "-m", "black", "--check", ROOT_DIR], check=False
).returncode
print('%s: unable to find pytest.' % (__file__,), file=sys.stderr)
print('%s: Try installing: sudo apt-get install python-pytest' % (__file__,),
file=sys.stderr)
def run_flake8():
"""Returns the exit code from flake8."""
return subprocess.run(
[sys.executable, "-m", "flake8", ROOT_DIR], check=False
).returncode
def main(argv):
"""The main entry."""
# Add the repo tree to PYTHONPATH as the tests expect to be able to import
# modules directly.
pythonpath = os.path.dirname(os.path.realpath(__file__))
oldpythonpath = os.environ.get('PYTHONPATH', None)
if oldpythonpath is not None:
pythonpath += os.pathsep + oldpythonpath
os.environ['PYTHONPATH'] = pythonpath
pytest = find_pytest()
return subprocess.run([pytest] + argv, check=False).returncode
checks = (
lambda: pytest.main(argv),
run_black,
run_flake8,
)
return 0 if all(not c() for c in checks) else 1
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))
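A note on the exit-status aggregation above: each check returns its exit code (0 means success), all() collapses those into a single pass/fail, and because all() short-circuits over the generator, the remaining checks are skipped as soon as one fails. In miniature, with stand-in lambdas rather than the real checks:

checks = (
    lambda: 0,  # stand-in for pytest.main(argv) succeeding
    lambda: 0,  # stand-in for run_black() succeeding
    lambda: 1,  # stand-in for run_flake8() reporting lint errors
)
print(0 if all(not c() for c in checks) else 1)  # -> 1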

run_tests.vpython3: new file, 125 lines

@ -0,0 +1,125 @@
# This is a vpython "spec" file.
#
# Read more about `vpython` and how to modify this file here:
# https://chromium.googlesource.com/infra/infra/+/main/doc/users/vpython.md
# List of available wheels:
# https://chromium.googlesource.com/infra/infra/+/main/infra/tools/dockerbuild/wheels.md
python_version: "3.8"
wheel: <
name: "infra/python/wheels/pytest-py3"
version: "version:6.2.2"
>
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/py-py2_py3"
version: "version:1.10.0"
>
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/iniconfig-py3"
version: "version:1.1.1"
>
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/packaging-py3"
version: "version:23.0"
>
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/pluggy-py3"
version: "version:0.13.1"
>
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/toml-py3"
version: "version:0.10.1"
>
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/pyparsing-py3"
version: "version:3.0.7"
>
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/attrs-py2_py3"
version: "version:21.4.0"
>
# Required by packaging==16.8
wheel: <
name: "infra/python/wheels/six-py2_py3"
version: "version:1.16.0"
>
wheel: <
name: "infra/python/wheels/black-py3"
version: "version:23.1.0"
>
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/mypy-extensions-py3"
version: "version:0.4.3"
>
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/tomli-py3"
version: "version:2.0.1"
>
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/platformdirs-py3"
version: "version:2.5.2"
>
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/pathspec-py3"
version: "version:0.9.0"
>
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/typing-extensions-py3"
version: "version:4.3.0"
>
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/click-py3"
version: "version:8.0.3"
>
wheel: <
name: "infra/python/wheels/flake8-py2_py3"
version: "version:6.0.0"
>
# Required by flake8==6.0.0
wheel: <
name: "infra/python/wheels/mccabe-py2_py3"
version: "version:0.7.0"
>
# Required by flake8==6.0.0
wheel: <
name: "infra/python/wheels/pyflakes-py2_py3"
version: "version:3.0.1"
>
# Required by flake8==6.0.0
wheel: <
name: "infra/python/wheels/pycodestyle-py2_py3"
version: "version:2.10.0"
>

View File

@ -23,39 +23,39 @@ TOPDIR = os.path.dirname(os.path.abspath(__file__))
# Rip out the first intro paragraph.
with open(os.path.join(TOPDIR, 'README.md')) as fp:
with open(os.path.join(TOPDIR, "README.md")) as fp:
lines = fp.read().splitlines()[2:]
end = lines.index('')
long_description = ' '.join(lines[0:end])
end = lines.index("")
long_description = " ".join(lines[0:end])
# https://packaging.python.org/tutorials/packaging-projects/
setuptools.setup(
name='repo',
version='2',
maintainer='Various',
maintainer_email='repo-discuss@googlegroups.com',
description='Repo helps manage many Git repositories',
name="repo",
version="2",
maintainer="Various",
maintainer_email="repo-discuss@googlegroups.com",
description="Repo helps manage many Git repositories",
long_description=long_description,
long_description_content_type='text/plain',
url='https://gerrit.googlesource.com/git-repo/',
long_description_content_type="text/plain",
url="https://gerrit.googlesource.com/git-repo/",
project_urls={
'Bug Tracker': 'https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo',
"Bug Tracker": "https://bugs.chromium.org/p/gerrit/issues/list?q=component:Applications%3Erepo", # noqa: E501
},
# https://pypi.org/classifiers/
classifiers=[
'Development Status :: 6 - Mature',
'Environment :: Console',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Natural Language :: English',
'Operating System :: MacOS :: MacOS X',
'Operating System :: Microsoft :: Windows :: Windows 10',
'Operating System :: POSIX :: Linux',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3 :: Only',
'Topic :: Software Development :: Version Control :: Git',
"Development Status :: 6 - Mature",
"Environment :: Console",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Natural Language :: English",
"Operating System :: MacOS :: MacOS X",
"Operating System :: Microsoft :: Windows :: Windows 10",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only",
"Topic :: Software Development :: Version Control :: Git",
],
python_requires='>=3.6',
packages=['subcmds'],
python_requires=">=3.6",
packages=["subcmds"],
)

ssh.py (124 lines)
View File

@ -28,21 +28,23 @@ import platform_utils
from repo_trace import Trace
PROXY_PATH = os.path.join(os.path.dirname(__file__), 'git_ssh')
PROXY_PATH = os.path.join(os.path.dirname(__file__), "git_ssh")
def _run_ssh_version():
"""run ssh -V to display the version number"""
return subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
return subprocess.check_output(
["ssh", "-V"], stderr=subprocess.STDOUT
).decode()
def _parse_ssh_version(ver_str=None):
"""parse a ssh version string into a tuple"""
if ver_str is None:
ver_str = _run_ssh_version()
m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
m = re.match(r"^OpenSSH_([0-9.]+)(p[0-9]+)?[\s,]", ver_str)
if m:
return tuple(int(x) for x in m.group(1).split('.'))
return tuple(int(x) for x in m.group(1).split("."))
else:
return ()
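
As a quick illustration of the widened pattern above, the trailing [\s,] lets it match banners both with and without a word after the patch level. The banner strings below are made up but follow the two OpenSSH formats:

    import re

    _VER_RE = re.compile(r"^OpenSSH_([0-9.]+)(p[0-9]+)?[\s,]")

    for banner in (
        "OpenSSH_8.9p1 Ubuntu-3ubuntu0.1, OpenSSL 3.0.2 15 Mar 2022",
        "OpenSSH_9.0p1, LibreSSL 3.3.6",
    ):
        m = _VER_RE.match(banner)
        print(tuple(int(x) for x in m.group(1).split(".")))
    # -> (8, 9) then (9, 0)
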
@ -52,13 +54,16 @@ def version():
"""return ssh version as a tuple"""
try:
return _parse_ssh_version()
except FileNotFoundError:
print("fatal: ssh not installed", file=sys.stderr)
sys.exit(1)
except subprocess.CalledProcessError:
print('fatal: unable to detect ssh version', file=sys.stderr)
print("fatal: unable to detect ssh version", file=sys.stderr)
sys.exit(1)
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
URI_SCP = re.compile(r"^([^@:]*@?[^:/]{1,}):")
URI_ALL = re.compile(r"^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/")
class ProxyManager:
@ -67,8 +72,8 @@ class ProxyManager:
This will take care of sharing state between multiprocessing children, and
make sure that if we crash, we don't leak any of the ssh sessions.
The code should work with a single-process scenario too, and not add too much
overhead due to the manager.
The code should work with a single-process scenario too, and not add too
much overhead due to the manager.
"""
# Path to the ssh program to run which will pass our master settings along.
@ -78,16 +83,17 @@ class ProxyManager:
def __init__(self, manager):
# Protect access to the list of active masters.
self._lock = multiprocessing.Lock()
# List of active masters (pid). These will be spawned on demand, and we are
# responsible for shutting them all down at the end.
# List of active masters (pid). These will be spawned on demand, and we
# are responsible for shutting them all down at the end.
self._masters = manager.list()
# Set of active masters indexed by "host:port" information.
# The value isn't used, but multiprocessing doesn't provide a set class.
self._master_keys = manager.dict()
# Whether ssh masters are known to be broken, so we give up entirely.
self._master_broken = manager.Value('b', False)
self._master_broken = manager.Value("b", False)
# List of active ssh sessions. Clients will be added & removed as
# connections finish, so this list is just for safety & cleanup if we crash.
# connections finish, so this list is just for safety & cleanup if we
# crash.
self._clients = manager.list()
# Path to directory for holding master sockets.
self._sock_path = None
@ -129,7 +135,7 @@ class ProxyManager:
while True:
try:
procs.pop(0)
except:
except: # noqa: E722
break
def close(self):
@ -152,63 +158,71 @@ class ProxyManager:
If one doesn't exist already, we'll create it.
We won't grab any locks, so the caller has to do that. This helps keep the
business logic of actually creating the master separate from grabbing locks.
We won't grab any locks, so the caller has to do that. This helps keep
the business logic of actually creating the master separate from
grabbing locks.
"""
# Check to see whether we already think that the master is running; if we
# think it's already running, return right away.
# Check to see whether we already think that the master is running; if
# we think it's already running, return right away.
if port is not None:
key = '%s:%s' % (host, port)
key = "%s:%s" % (host, port)
else:
key = host
if key in self._master_keys:
return True
if self._master_broken.value or 'GIT_SSH' in os.environ:
if self._master_broken.value or "GIT_SSH" in os.environ:
# Failed earlier, so don't retry.
return False
# We will make two calls to ssh; this is the common part of both calls.
command_base = ['ssh', '-o', 'ControlPath %s' % self.sock(), host]
command_base = ["ssh", "-o", "ControlPath %s" % self.sock(), host]
if port is not None:
command_base[1:1] = ['-p', str(port)]
command_base[1:1] = ["-p", str(port)]
# Since the key wasn't in _master_keys, we think that master isn't running.
# ...but before actually starting a master, we'll double-check. This can
# be important because we can't tell that 'git@myhost.com' is the same
# as 'myhost.com' where "User git" is set up in the user's ~/.ssh/config file.
check_command = command_base + ['-O', 'check']
# Since the key wasn't in _master_keys, we think that master isn't
# running... but before actually starting a master, we'll double-check.
# This can be important because we can't tell that 'git@myhost.com'
# is the same as 'myhost.com' where "User git" is set up in the user's
# ~/.ssh/config file.
check_command = command_base + ["-O", "check"]
with Trace("Call to ssh (check call): %s", " ".join(check_command)):
try:
Trace(': %s', ' '.join(check_command))
check_process = subprocess.Popen(check_command,
check_process = subprocess.Popen(
check_command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
stderr=subprocess.PIPE,
)
check_process.communicate() # read output, but ignore it...
isnt_running = check_process.wait()
if not isnt_running:
# Our double-check found that the master _was_ in fact running. Add to
# the list of keys.
# Our double-check found that the master _was_ in fact
# running. Add to the list of keys.
self._master_keys[key] = True
return True
except Exception:
# Ignore exceptions. We will fall back to the normal command and print
# to the log there.
# Ignore exceptions. We will fall back to the normal command
# and print to the log there.
pass
command = command_base[:1] + ['-M', '-N'] + command_base[1:]
command = command_base[:1] + ["-M", "-N"] + command_base[1:]
p = None
try:
Trace(': %s', ' '.join(command))
with Trace("Call to ssh: %s", " ".join(command)):
p = subprocess.Popen(command)
except Exception as e:
self._master_broken.value = True
print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
% (host, port, str(e)), file=sys.stderr)
print(
"\nwarn: cannot enable ssh control master for %s:%s\n%s"
% (host, port, str(e)),
file=sys.stderr,
)
return False
time.sleep(1)
ssh_died = (p.poll() is not None)
ssh_died = p.poll() is not None
if ssh_died:
return False
@ -223,29 +237,29 @@ class ProxyManager:
This will obtain any necessary locks to avoid inter-process races.
"""
# Bail before grabbing the lock if we already know that we aren't going to
# try creating new masters below.
if sys.platform in ('win32', 'cygwin'):
# Bail before grabbing the lock if we already know that we aren't going
# to try creating new masters below.
if sys.platform in ("win32", "cygwin"):
return False
# Acquire the lock. This is needed to prevent opening multiple masters for
# the same host when we're running "repo sync -jN" (for N > 1) _and_ the
# manifest <remote fetch="ssh://xyz"> specifies a different host from the
# one that was passed to repo init.
# Acquire the lock. This is needed to prevent opening multiple masters
# for the same host when we're running "repo sync -jN" (for N > 1) _and_
# the manifest <remote fetch="ssh://xyz"> specifies a different host
# from the one that was passed to repo init.
with self._lock:
return self._open_unlocked(host, port)
def preconnect(self, url):
"""If |uri| will create a ssh connection, setup the ssh master for it."""
"""If |uri| will create a ssh connection, setup the ssh master for it.""" # noqa: E501
m = URI_ALL.match(url)
if m:
scheme = m.group(1)
host = m.group(2)
if ':' in host:
host, port = host.split(':')
if ":" in host:
host, port = host.split(":")
else:
port = None
if scheme in ('ssh', 'git+ssh', 'ssh+git'):
if scheme in ("ssh", "git+ssh", "ssh+git"):
return self._open(host, port)
return False
@ -264,14 +278,14 @@ class ProxyManager:
if self._sock_path is None:
if not create:
return None
tmp_dir = '/tmp'
tmp_dir = "/tmp"
if not os.path.exists(tmp_dir):
tmp_dir = tempfile.gettempdir()
if version() < (6, 7):
tokens = '%r@%h:%p'
tokens = "%r@%h:%p"
else:
tokens = '%C' # hash of %l%h%p%r
tokens = "%C" # hash of %l%h%p%r
self._sock_path = os.path.join(
tempfile.mkdtemp('', 'ssh-', tmp_dir),
'master-' + tokens)
tempfile.mkdtemp("", "ssh-", tmp_dir), "master-" + tokens
)
return self._sock_path
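
A small sketch of the ControlPath token choice made above; the (6, 7) cutoff comes from the code, and the sample versions are illustrative:

    def control_path_tokens(ssh_version):
        # OpenSSH understands "%C" (a hash of %l%h%p%r) from 6.7 onward; older
        # releases get the explicit remote-user/host/port tokens instead.
        return "%r@%h:%p" if ssh_version < (6, 7) else "%C"

    print(control_path_tokens((6, 6)))  # %r@%h:%p
    print(control_path_tokens((9, 3)))  # %C
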

View File

@ -19,31 +19,29 @@ all_commands = {}
my_dir = os.path.dirname(__file__)
for py in os.listdir(my_dir):
if py == '__init__.py':
if py == "__init__.py":
continue
if py.endswith('.py'):
if py.endswith(".py"):
name = py[:-3]
clsn = name.capitalize()
while clsn.find('_') > 0:
h = clsn.index('_')
while clsn.find("_") > 0:
h = clsn.index("_")
clsn = clsn[0:h] + clsn[h + 1 :].capitalize()
mod = __import__(__name__,
globals(),
locals(),
['%s' % name])
mod = __import__(__name__, globals(), locals(), ["%s" % name])
mod = getattr(mod, name)
try:
cmd = getattr(mod, clsn)
except AttributeError:
raise SyntaxError('%s/%s does not define class %s' % (
__name__, py, clsn))
raise SyntaxError(
"%s/%s does not define class %s" % (__name__, py, clsn)
)
name = name.replace('_', '-')
name = name.replace("_", "-")
cmd.NAME = name
all_commands[name] = cmd
# Add 'branch' as an alias for 'branches'.
all_commands['branch'] = all_commands['branches']
all_commands["branch"] = all_commands["branches"]
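
For orientation, the snake_case-to-class-name walk that the loader above performs, run on the cherry_pick module that appears later in this diff:

    # "cherry_pick.py" -> class "CherryPick", registered as command "cherry-pick".
    name = "cherry_pick"
    clsn = name.capitalize()
    while clsn.find("_") > 0:
        h = clsn.index("_")
        clsn = clsn[0:h] + clsn[h + 1:].capitalize()
    print(clsn)                    # CherryPick
    print(name.replace("_", "-"))  # cherry-pick
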

View File

@ -36,18 +36,27 @@ It is equivalent to "git branch -D <branchname>".
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('--all',
dest='all', action='store_true',
help='delete all branches in all projects')
p.add_option(
"--all",
dest="all",
action="store_true",
help="delete all branches in all projects",
)
def ValidateOptions(self, opt, args):
if not opt.all and not args:
self.Usage()
if not opt.all:
nb = args[0]
if not git.check_ref_format('heads/%s' % nb):
self.OptionParser.error("'%s' is not a valid branch name" % nb)
branches = args[0].split()
invalid_branches = [
x for x in branches if not git.check_ref_format(f"heads/{x}")
]
if invalid_branches:
self.OptionParser.error(
f"{invalid_branches} are not valid branch names"
)
else:
args.insert(0, "'All local branches'")
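
A minimal sketch of the multi-branch validation above; repo's git.check_ref_format helper is stood in for here by the real `git check-ref-format` binary, and the branch names are invented:

    import subprocess

    def check_ref_format(ref):
        # Illustrative stand-in for repo's git.check_ref_format() helper.
        return (
            subprocess.run(
                ["git", "check-ref-format", ref],
                capture_output=True,
                check=False,
            ).returncode
            == 0
        )

    branches = "feature-a bad..name".split()
    invalid_branches = [
        x for x in branches if not check_ref_format(f"heads/{x}")
    ]
    print(invalid_branches)  # ['bad..name']
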
@ -56,7 +65,7 @@ It is equivalent to "git branch -D <branchname>".
if all_branches:
branches = project.GetBranches()
else:
branches = [nb]
branches = nb
ret = {}
for name in branches:
@ -66,49 +75,68 @@ It is equivalent to "git branch -D <branchname>".
return (ret, project)
def Execute(self, opt, args):
nb = args[0]
nb = args[0].split()
err = defaultdict(list)
success = defaultdict(list)
all_projects = self.GetProjects(args[1:])
all_projects = self.GetProjects(
args[1:], all_manifests=not opt.this_manifest_only
)
_RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)
def _ProcessResults(_pool, pm, states):
for (results, project) in states:
for results, project in states:
for branch, status in results.items():
if status:
success[branch].append(project)
else:
err[branch].append(project)
pm.update()
pm.update(msg="")
self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, opt.all, nb),
all_projects,
callback=_ProcessResults,
output=Progress('Abandon %s' % (nb,), len(all_projects), quiet=opt.quiet))
output=Progress(
"Abandon %s" % (nb,), len(all_projects), quiet=opt.quiet
),
)
width = max(itertools.chain(
[25], (len(x) for x in itertools.chain(success, err))))
width = max(
itertools.chain(
[25], (len(x) for x in itertools.chain(success, err))
)
)
if err:
for br in err.keys():
err_msg = "error: cannot abandon %s" % br
print(err_msg, file=sys.stderr)
for proj in err[br]:
print(' ' * len(err_msg) + " | %s" % proj.relpath, file=sys.stderr)
print(
" " * len(err_msg) + " | %s" % _RelPath(proj),
file=sys.stderr,
)
sys.exit(1)
elif not success:
print('error: no project has local branch(es) : %s' % nb,
file=sys.stderr)
print(
"error: no project has local branch(es) : %s" % nb,
file=sys.stderr,
)
sys.exit(1)
else:
# Everything below here is displaying status.
if opt.quiet:
return
print('Abandoned branches:')
print("Abandoned branches:")
for br in success.keys():
if len(all_projects) > 1 and len(all_projects) == len(success[br]):
if len(all_projects) > 1 and len(all_projects) == len(
success[br]
):
result = "all project"
else:
result = "%s" % (
('\n' + ' ' * width + '| ').join(p.relpath for p in success[br]))
print("%s%s| %s\n" % (br, ' ' * (width - len(br)), result))
("\n" + " " * width + "| ").join(
_RelPath(p) for p in success[br]
)
)
print("%s%s| %s\n" % (br, " " * (width - len(br)), result))

View File

@ -21,10 +21,10 @@ from command import Command, DEFAULT_LOCAL_JOBS
class BranchColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'branch')
self.current = self.printer('current', fg='green')
self.local = self.printer('local')
self.notinproject = self.printer('notinproject', fg='red')
Coloring.__init__(self, config, "branch")
self.current = self.printer("current", fg="green")
self.local = self.printer("local")
self.notinproject = self.printer("notinproject", fg="red")
class BranchInfo(object):
@ -98,7 +98,9 @@ is shown, then the branch appears in all projects.
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def Execute(self, opt, args):
projects = self.GetProjects(args)
projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
out = BranchColoring(self.manifest.manifestProject.config)
all_branches = {}
project_cnt = len(projects)
@ -113,12 +115,13 @@ is shown, then the branch appears in all projects.
opt.jobs,
expand_project_to_branches,
projects,
callback=_ProcessResults)
callback=_ProcessResults,
)
names = sorted(all_branches)
if not names:
print(' (no branches)', file=sys.stderr)
print(" (no branches)", file=sys.stderr)
return
width = 25
@ -131,59 +134,61 @@ is shown, then the branch appears in all projects.
in_cnt = len(i.projects)
if i.IsCurrent:
current = '*'
current = "*"
hdr = out.current
else:
current = ' '
current = " "
hdr = out.local
if i.IsPublishedEqual:
published = 'P'
published = "P"
elif i.IsPublished:
published = 'p'
published = "p"
else:
published = ' '
published = " "
hdr('%c%c %-*s' % (current, published, width, name))
out.write(' |')
hdr("%c%c %-*s" % (current, published, width, name))
out.write(" |")
_RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)
if in_cnt < project_cnt:
fmt = out.write
paths = []
non_cur_paths = []
if i.IsSplitCurrent or (in_cnt < project_cnt - in_cnt):
in_type = 'in'
if i.IsSplitCurrent or (in_cnt <= project_cnt - in_cnt):
in_type = "in"
for b in i.projects:
relpath = _RelPath(b.project)
if not i.IsSplitCurrent or b.current:
paths.append(b.project.relpath)
paths.append(relpath)
else:
non_cur_paths.append(b.project.relpath)
non_cur_paths.append(relpath)
else:
fmt = out.notinproject
in_type = 'not in'
in_type = "not in"
have = set()
for b in i.projects:
have.add(b.project)
have.add(_RelPath(b.project))
for p in projects:
if p not in have:
paths.append(p.relpath)
if _RelPath(p) not in have:
paths.append(_RelPath(p))
s = ' %s %s' % (in_type, ', '.join(paths))
s = " %s %s" % (in_type, ", ".join(paths))
if not i.IsSplitCurrent and (width + 7 + len(s) < 80):
fmt = out.current if i.IsCurrent else fmt
fmt(s)
else:
fmt(' %s:' % in_type)
fmt(" %s:" % in_type)
fmt = out.current if i.IsCurrent else out.write
for p in paths:
out.nl()
fmt(width * ' ' + ' %s' % p)
fmt(width * " " + " %s" % p)
fmt = out.write
for p in non_cur_paths:
out.nl()
fmt(width * ' ' + ' %s' % p)
fmt(width * " " + " %s" % p)
else:
out.write(' in all projects')
out.write(" in all projects")
out.nl()

View File

@ -47,7 +47,9 @@ The command is equivalent to:
nb = args[0]
err = []
success = []
all_projects = self.GetProjects(args[1:])
all_projects = self.GetProjects(
args[1:], all_manifests=not opt.this_manifest_only
)
def _ProcessResults(_pool, pm, results):
for status, project in results:
@ -56,20 +58,25 @@ The command is equivalent to:
success.append(project)
else:
err.append(project)
pm.update()
pm.update(msg="")
self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, nb),
all_projects,
callback=_ProcessResults,
output=Progress('Checkout %s' % (nb,), len(all_projects), quiet=opt.quiet))
output=Progress(
"Checkout %s" % (nb,), len(all_projects), quiet=opt.quiet
),
)
if err:
for p in err:
print("error: %s/: cannot checkout %s" % (p.relpath, nb),
file=sys.stderr)
print(
"error: %s/: cannot checkout %s" % (p.relpath, nb),
file=sys.stderr,
)
sys.exit(1)
elif not success:
print('error: no project has branch %s' % nb, file=sys.stderr)
print("error: no project has branch %s" % nb, file=sys.stderr)
sys.exit(1)

View File

@ -17,7 +17,7 @@ import sys
from command import Command
from git_command import GitCommand
CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$')
CHANGE_ID_RE = re.compile(r"^\s*Change-Id: I([0-9a-f]{40})\s*$")
class CherryPick(Command):
@ -39,46 +39,59 @@ change id will be added.
def Execute(self, opt, args):
reference = args[0]
p = GitCommand(None,
['rev-parse', '--verify', reference],
p = GitCommand(
None,
["rev-parse", "--verify", reference],
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
if p.Wait() != 0:
print(p.stderr, file=sys.stderr)
sys.exit(1)
sha1 = p.stdout.strip()
p = GitCommand(None, ['cat-file', 'commit', sha1], capture_stdout=True)
p = GitCommand(None, ["cat-file", "commit", sha1], capture_stdout=True)
if p.Wait() != 0:
print("error: Failed to retrieve old commit message", file=sys.stderr)
print(
"error: Failed to retrieve old commit message", file=sys.stderr
)
sys.exit(1)
old_msg = self._StripHeader(p.stdout)
p = GitCommand(None,
['cherry-pick', sha1],
p = GitCommand(
None,
["cherry-pick", sha1],
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
status = p.Wait()
print(p.stdout, file=sys.stdout)
print(p.stderr, file=sys.stderr)
if p.stdout:
print(p.stdout.strip(), file=sys.stdout)
if p.stderr:
print(p.stderr.strip(), file=sys.stderr)
if status == 0:
# The cherry-pick was applied correctly. We just need to edit the
# commit message.
new_msg = self._Reformat(old_msg, sha1)
p = GitCommand(None, ['commit', '--amend', '-F', '-'],
p = GitCommand(
None,
["commit", "--amend", "-F", "-"],
input=new_msg,
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
if p.Wait() != 0:
print("error: Failed to update commit message", file=sys.stderr)
sys.exit(1)
else:
print('NOTE: When committing (please see above) and editing the commit '
'message, please remove the old Change-Id-line and add:')
print(
"NOTE: When committing (please see above) and editing the "
"commit message, please remove the old Change-Id-line and add:"
)
print(self._GetReference(sha1), file=sys.stderr)
print(file=sys.stderr)
@ -99,7 +112,7 @@ change id will be added.
if not self._IsChangeId(line):
new_msg.append(line)
# Add a blank line between the message and the change id/reference
# Add a blank line between the message and the change id/reference.
try:
if new_msg[-1].strip() != "":
new_msg.append("")
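
A quick check of the Change-Id pattern used above; the trailer value is a made-up 40-character hex string:

    import re

    CHANGE_ID_RE = re.compile(r"^\s*Change-Id: I([0-9a-f]{40})\s*$")

    line = "Change-Id: I" + "0123456789abcdef" * 2 + "01234567"
    print(bool(CHANGE_ID_RE.match(line)))  # True
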

View File

@ -31,39 +31,51 @@ to the Unix 'patch' command.
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('-u', '--absolute',
dest='absolute', action='store_true',
help='paths are relative to the repository root')
p.add_option(
"-u",
"--absolute",
dest="absolute",
action="store_true",
help="paths are relative to the repository root",
)
def _ExecuteOne(self, absolute, project):
def _ExecuteOne(self, absolute, local, project):
"""Obtains the diff for a specific project.
Args:
absolute: Paths are relative to the root.
local: a boolean, if True, the path is relative to the local
(sub)manifest. If false, the path is relative to the outermost
manifest.
project: Project to get status of.
Returns:
The status of the project.
"""
buf = io.StringIO()
ret = project.PrintWorkTreeDiff(absolute, output_redir=buf)
ret = project.PrintWorkTreeDiff(absolute, output_redir=buf, local=local)
return (ret, buf.getvalue())
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
all_projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
def _ProcessResults(_pool, _output, results):
ret = 0
for (state, output) in results:
for state, output in results:
if output:
print(output, end='')
print(output, end="")
if not state:
ret = 1
return ret
return self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, opt.absolute),
functools.partial(
self._ExecuteOne, opt.absolute, opt.this_manifest_only
),
all_projects,
callback=_ProcessResults,
ordered=True)
ordered=True,
)

View File

@ -66,130 +66,188 @@ synced and their revisions won't be found.
"""
def _Options(self, p):
p.add_option('--raw',
dest='raw', action='store_true',
help='display raw diff')
p.add_option('--no-color',
dest='color', action='store_false', default=True,
help='does not display the diff in color')
p.add_option('--pretty-format',
dest='pretty_format', action='store',
metavar='<FORMAT>',
help='print the log using a custom git pretty format string')
p.add_option(
"--raw", dest="raw", action="store_true", help="display raw diff"
)
p.add_option(
"--no-color",
dest="color",
action="store_false",
default=True,
help="does not display the diff in color",
)
p.add_option(
"--pretty-format",
dest="pretty_format",
action="store",
metavar="<FORMAT>",
help="print the log using a custom git pretty format string",
)
def _printRawDiff(self, diff, pretty_format=None):
for project in diff['added']:
self.printText("A %s %s" % (project.relpath, project.revisionExpr))
def _printRawDiff(self, diff, pretty_format=None, local=False):
_RelPath = lambda p: p.RelPath(local=local)
for project in diff["added"]:
self.printText(
"A %s %s" % (_RelPath(project), project.revisionExpr)
)
self.out.nl()
for project in diff['removed']:
self.printText("R %s %s" % (project.relpath, project.revisionExpr))
for project in diff["removed"]:
self.printText(
"R %s %s" % (_RelPath(project), project.revisionExpr)
)
self.out.nl()
for project, otherProject in diff['changed']:
self.printText("C %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr))
for project, otherProject in diff["changed"]:
self.printText(
"C %s %s %s"
% (
_RelPath(project),
project.revisionExpr,
otherProject.revisionExpr,
)
)
self.out.nl()
self._printLogs(project, otherProject, raw=True, color=False, pretty_format=pretty_format)
self._printLogs(
project,
otherProject,
raw=True,
color=False,
pretty_format=pretty_format,
)
for project, otherProject in diff['unreachable']:
self.printText("U %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr))
for project, otherProject in diff["unreachable"]:
self.printText(
"U %s %s %s"
% (
_RelPath(project),
project.revisionExpr,
otherProject.revisionExpr,
)
)
self.out.nl()
def _printDiff(self, diff, color=True, pretty_format=None):
if diff['added']:
def _printDiff(self, diff, color=True, pretty_format=None, local=False):
_RelPath = lambda p: p.RelPath(local=local)
if diff["added"]:
self.out.nl()
self.printText('added projects : \n')
self.printText("added projects : \n")
self.out.nl()
for project in diff['added']:
self.printProject('\t%s' % (project.relpath))
self.printText(' at revision ')
for project in diff["added"]:
self.printProject("\t%s" % (_RelPath(project)))
self.printText(" at revision ")
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['removed']:
if diff["removed"]:
self.out.nl()
self.printText('removed projects : \n')
self.printText("removed projects : \n")
self.out.nl()
for project in diff['removed']:
self.printProject('\t%s' % (project.relpath))
self.printText(' at revision ')
for project in diff["removed"]:
self.printProject("\t%s" % (_RelPath(project)))
self.printText(" at revision ")
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['changed']:
if diff["missing"]:
self.out.nl()
self.printText('changed projects : \n')
self.printText("missing projects : \n")
self.out.nl()
for project, otherProject in diff['changed']:
self.printProject('\t%s' % (project.relpath))
self.printText(' changed from ')
for project in diff["missing"]:
self.printProject("\t%s" % (_RelPath(project)))
self.printText(" at revision ")
self.printRevision(project.revisionExpr)
self.printText(' to ')
self.out.nl()
if diff["changed"]:
self.out.nl()
self.printText("changed projects : \n")
self.out.nl()
for project, otherProject in diff["changed"]:
self.printProject("\t%s" % (_RelPath(project)))
self.printText(" changed from ")
self.printRevision(project.revisionExpr)
self.printText(" to ")
self.printRevision(otherProject.revisionExpr)
self.out.nl()
self._printLogs(project, otherProject, raw=False, color=color,
pretty_format=pretty_format)
self._printLogs(
project,
otherProject,
raw=False,
color=color,
pretty_format=pretty_format,
)
self.out.nl()
if diff['unreachable']:
if diff["unreachable"]:
self.out.nl()
self.printText('projects with unreachable revisions : \n')
self.printText("projects with unreachable revisions : \n")
self.out.nl()
for project, otherProject in diff['unreachable']:
self.printProject('\t%s ' % (project.relpath))
for project, otherProject in diff["unreachable"]:
self.printProject("\t%s " % (_RelPath(project)))
self.printRevision(project.revisionExpr)
self.printText(' or ')
self.printText(" or ")
self.printRevision(otherProject.revisionExpr)
self.printText(' not found')
self.printText(" not found")
self.out.nl()
def _printLogs(self, project, otherProject, raw=False, color=True,
pretty_format=None):
logs = project.getAddedAndRemovedLogs(otherProject,
def _printLogs(
self, project, otherProject, raw=False, color=True, pretty_format=None
):
logs = project.getAddedAndRemovedLogs(
otherProject,
oneline=(pretty_format is None),
color=color,
pretty_format=pretty_format)
if logs['removed']:
removedLogs = logs['removed'].split('\n')
pretty_format=pretty_format,
)
if logs["removed"]:
removedLogs = logs["removed"].split("\n")
for log in removedLogs:
if log.strip():
if raw:
self.printText(' R ' + log)
self.printText(" R " + log)
self.out.nl()
else:
self.printRemoved('\t\t[-] ')
self.printRemoved("\t\t[-] ")
self.printText(log)
self.out.nl()
if logs['added']:
addedLogs = logs['added'].split('\n')
if logs["added"]:
addedLogs = logs["added"].split("\n")
for log in addedLogs:
if log.strip():
if raw:
self.printText(' A ' + log)
self.printText(" A " + log)
self.out.nl()
else:
self.printAdded('\t\t[+] ')
self.printAdded("\t\t[+] ")
self.printText(log)
self.out.nl()
def ValidateOptions(self, opt, args):
if not args or len(args) > 2:
self.OptionParser.error('missing manifests to diff')
self.OptionParser.error("missing manifests to diff")
if opt.this_manifest_only is False:
raise self.OptionParser.error(
"`diffmanifest` only supports the current tree"
)
def Execute(self, opt, args):
self.out = _Coloring(self.client.globalConfig)
self.printText = self.out.nofmt_printer('text')
self.printText = self.out.nofmt_printer("text")
if opt.color:
self.printProject = self.out.nofmt_printer('project', attr='bold')
self.printAdded = self.out.nofmt_printer('green', fg='green', attr='bold')
self.printRemoved = self.out.nofmt_printer('red', fg='red', attr='bold')
self.printRevision = self.out.nofmt_printer('revision', fg='yellow')
self.printProject = self.out.nofmt_printer("project", attr="bold")
self.printAdded = self.out.nofmt_printer(
"green", fg="green", attr="bold"
)
self.printRemoved = self.out.nofmt_printer(
"red", fg="red", attr="bold"
)
self.printRevision = self.out.nofmt_printer("revision", fg="yellow")
else:
self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText
self.printProject = (
self.printAdded
) = self.printRemoved = self.printRevision = self.printText
manifest1 = RepoClient(self.repodir)
manifest1.Override(args[0], load_local_manifests=False)
@ -201,6 +259,15 @@ synced and their revisions won't be found.
diff = manifest1.projectsDiff(manifest2)
if opt.raw:
self._printRawDiff(diff, pretty_format=opt.pretty_format)
self._printRawDiff(
diff,
pretty_format=opt.pretty_format,
local=opt.this_manifest_only,
)
else:
self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format)
self._printDiff(
diff,
color=opt.color,
pretty_format=opt.pretty_format,
local=opt.this_manifest_only,
)

View File

@ -18,7 +18,7 @@ import sys
from command import Command
from error import GitError, NoSuchProjectError
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
CHANGE_RE = re.compile(r"^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$")
class Download(Command):
@ -34,21 +34,36 @@ If no project is specified try to use current directory as a project.
"""
def _Options(self, p):
p.add_option('-b', '--branch',
help='create a new branch first')
p.add_option('-c', '--cherry-pick',
dest='cherrypick', action='store_true',
help="cherry-pick instead of checkout")
p.add_option('-x', '--record-origin', action='store_true',
help='pass -x when cherry-picking')
p.add_option('-r', '--revert',
dest='revert', action='store_true',
help="revert instead of checkout")
p.add_option('-f', '--ff-only',
dest='ffonly', action='store_true',
help="force fast-forward merge")
p.add_option("-b", "--branch", help="create a new branch first")
p.add_option(
"-c",
"--cherry-pick",
dest="cherrypick",
action="store_true",
help="cherry-pick instead of checkout",
)
p.add_option(
"-x",
"--record-origin",
action="store_true",
help="pass -x when cherry-picking",
)
p.add_option(
"-r",
"--revert",
dest="revert",
action="store_true",
help="revert instead of checkout",
)
p.add_option(
"-f",
"--ff-only",
dest="ffonly",
action="store_true",
help="force fast-forward merge",
)
def _ParseChangeIds(self, args):
def _ParseChangeIds(self, opt, args):
if not args:
self.Usage()
@ -60,16 +75,16 @@ If no project is specified try to use current directory as a project.
if m:
if not project:
project = self.GetProjects(".")[0]
print('Defaulting to cwd project', project.name)
print("Defaulting to cwd project", project.name)
chg_id = int(m.group(1))
if m.group(2):
ps_id = int(m.group(2))
else:
ps_id = 1
refs = 'refs/changes/%2.2d/%d/' % (chg_id % 100, chg_id)
output = project._LsRemote(refs + '*')
refs = "refs/changes/%2.2d/%d/" % (chg_id % 100, chg_id)
output = project._LsRemote(refs + "*")
if output:
regex = refs + r'(\d+)'
regex = refs + r"(\d+)"
rcomp = re.compile(regex, re.I)
for line in output.splitlines():
match = rcomp.search(line)
@ -77,73 +92,99 @@ If no project is specified try to use current directory as a project.
ps_id = max(int(match.group(1)), ps_id)
to_get.append((project, chg_id, ps_id))
else:
projects = self.GetProjects([a])
projects = self.GetProjects(
[a], all_manifests=not opt.this_manifest_only
)
if len(projects) > 1:
# If the cwd is one of the projects, assume they want that.
try:
project = self.GetProjects('.')[0]
project = self.GetProjects(".")[0]
except NoSuchProjectError:
project = None
if project not in projects:
print('error: %s matches too many projects; please re-run inside '
'the project checkout.' % (a,), file=sys.stderr)
print(
"error: %s matches too many projects; please "
"re-run inside the project checkout." % (a,),
file=sys.stderr,
)
for project in projects:
print(' %s/ @ %s' % (project.relpath, project.revisionExpr),
file=sys.stderr)
print(
" %s/ @ %s"
% (
project.RelPath(
local=opt.this_manifest_only
),
project.revisionExpr,
),
file=sys.stderr,
)
sys.exit(1)
else:
project = projects[0]
print('Defaulting to cwd project', project.name)
print("Defaulting to cwd project", project.name)
return to_get
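
A worked example of the refs/changes path built above (the change number is invented): Gerrit shards change refs under the last two digits of the change number.

    chg_id = 12345
    refs = "refs/changes/%2.2d/%d/" % (chg_id % 100, chg_id)
    print(refs)  # refs/changes/45/12345/
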
def ValidateOptions(self, opt, args):
if opt.record_origin:
if not opt.cherrypick:
self.OptionParser.error('-x only makes sense with --cherry-pick')
self.OptionParser.error(
"-x only makes sense with --cherry-pick"
)
if opt.ffonly:
self.OptionParser.error('-x and --ff are mutually exclusive options')
self.OptionParser.error(
"-x and --ff are mutually exclusive options"
)
def Execute(self, opt, args):
for project, change_id, ps_id in self._ParseChangeIds(args):
for project, change_id, ps_id in self._ParseChangeIds(opt, args):
dl = project.DownloadPatchSet(change_id, ps_id)
if not dl:
print('[%s] change %d/%d not found'
print(
"[%s] change %d/%d not found"
% (project.name, change_id, ps_id),
file=sys.stderr)
file=sys.stderr,
)
sys.exit(1)
if not opt.revert and not dl.commits:
print('[%s] change %d/%d has already been merged'
print(
"[%s] change %d/%d has already been merged"
% (project.name, change_id, ps_id),
file=sys.stderr)
file=sys.stderr,
)
continue
if len(dl.commits) > 1:
print('[%s] %d/%d depends on %d unmerged changes:'
print(
"[%s] %d/%d depends on %d unmerged changes:"
% (project.name, change_id, ps_id, len(dl.commits)),
file=sys.stderr)
file=sys.stderr,
)
for c in dl.commits:
print(' %s' % (c), file=sys.stderr)
print(" %s" % (c), file=sys.stderr)
if opt.cherrypick:
mode = 'cherry-pick'
mode = "cherry-pick"
elif opt.revert:
mode = 'revert'
mode = "revert"
elif opt.ffonly:
mode = 'fast-forward merge'
mode = "fast-forward merge"
else:
mode = 'checkout'
mode = "checkout"
# We'll combine the branch+checkout operation, but all the rest need a
# dedicated branch start.
if opt.branch and mode != 'checkout':
# We'll combine the branch+checkout operation, but all the rest need
# a dedicated branch start.
if opt.branch and mode != "checkout":
project.StartBranch(opt.branch)
try:
if opt.cherrypick:
project._CherryPick(dl.commit, ffonly=opt.ffonly,
record_origin=opt.record_origin)
project._CherryPick(
dl.commit,
ffonly=opt.ffonly,
record_origin=opt.record_origin,
)
elif opt.revert:
project._Revert(dl.commit)
elif opt.ffonly:
@ -155,6 +196,9 @@ If no project is specified try to use current directory as a project.
project._Checkout(dl.commit)
except GitError:
print('[%s] Could not complete the %s of %s'
% (project.name, mode, dl.commit), file=sys.stderr)
print(
"[%s] Could not complete the %s of %s"
% (project.name, mode, dl.commit),
file=sys.stderr,
)
sys.exit(1)

View File

@ -23,21 +23,26 @@ import sys
import subprocess
from color import Coloring
from command import DEFAULT_LOCAL_JOBS, Command, MirrorSafeCommand, WORKER_BATCH_SIZE
from command import (
DEFAULT_LOCAL_JOBS,
Command,
MirrorSafeCommand,
WORKER_BATCH_SIZE,
)
from error import ManifestInvalidRevisionError
_CAN_COLOR = [
'branch',
'diff',
'grep',
'log',
"branch",
"diff",
"grep",
"log",
]
class ForallColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'forall')
self.project = self.printer('project', attr='bold')
Coloring.__init__(self, config, "forall")
self.project = self.printer("project", attr="bold")
class Forall(Command, MirrorSafeCommand):
@ -84,6 +89,11 @@ REPO_PROJECT is set to the unique name of the project.
REPO_PATH is the path relative to the root of the client.
REPO_OUTERPATH is the path of the sub manifest's root relative to the root of
the client.
REPO_INNERPATH is the path relative to the root of the sub manifest.
REPO_REMOTE is the name of the remote system from the manifest.
REPO_LREV is the name of the revision from the manifest, translated
@ -129,35 +139,61 @@ without iterating through the remaining projects.
del parser.rargs[0]
def _Options(self, p):
p.add_option('-r', '--regex',
dest='regex', action='store_true',
help='execute the command only on projects matching regex or wildcard expression')
p.add_option('-i', '--inverse-regex',
dest='inverse_regex', action='store_true',
help='execute the command only on projects not matching regex or '
'wildcard expression')
p.add_option('-g', '--groups',
dest='groups',
help='execute the command only on projects matching the specified groups')
p.add_option('-c', '--command',
help='command (and arguments) to execute',
dest='command',
action='callback',
callback=self._cmd_option)
p.add_option('-e', '--abort-on-errors',
dest='abort_on_errors', action='store_true',
help='abort if a command exits unsuccessfully')
p.add_option('--ignore-missing', action='store_true',
help='silently skip & do not exit non-zero due missing '
'checkouts')
p.add_option(
"-r",
"--regex",
dest="regex",
action="store_true",
help="execute the command only on projects matching regex or "
"wildcard expression",
)
p.add_option(
"-i",
"--inverse-regex",
dest="inverse_regex",
action="store_true",
help="execute the command only on projects not matching regex or "
"wildcard expression",
)
p.add_option(
"-g",
"--groups",
dest="groups",
help="execute the command only on projects matching the specified "
"groups",
)
p.add_option(
"-c",
"--command",
help="command (and arguments) to execute",
dest="command",
action="callback",
callback=self._cmd_option,
)
p.add_option(
"-e",
"--abort-on-errors",
dest="abort_on_errors",
action="store_true",
help="abort if a command exits unsuccessfully",
)
p.add_option(
"--ignore-missing",
action="store_true",
help="silently skip & do not exit non-zero due missing "
"checkouts",
)
g = p.get_option_group('--quiet')
g.add_option('-p',
dest='project_header', action='store_true',
help='show project headers before output')
p.add_option('--interactive',
action='store_true',
help='force interactive usage')
g = p.get_option_group("--quiet")
g.add_option(
"-p",
dest="project_header",
action="store_true",
help="show project headers before output",
)
p.add_option(
"--interactive", action="store_true", help="force interactive usage"
)
def WantPager(self, opt):
return opt.project_header and opt.jobs == 1
@ -168,90 +204,102 @@ without iterating through the remaining projects.
def Execute(self, opt, args):
cmd = [opt.command[0]]
all_trees = not opt.this_manifest_only
shell = True
if re.compile(r'^[a-z0-9A-Z_/\.-]+$').match(cmd[0]):
if re.compile(r"^[a-z0-9A-Z_/\.-]+$").match(cmd[0]):
shell = False
if shell:
cmd.append(cmd[0])
cmd.extend(opt.command[1:])
# Historically, forall operated interactively, and in serial. If the user
# has selected 1 job, then default to interactive mode.
# Historically, forall operated interactively, and in serial. If the
# user has selected 1 job, then default to interactive mode.
if opt.jobs == 1:
opt.interactive = True
if opt.project_header \
and not shell \
and cmd[0] == 'git':
if opt.project_header and not shell and cmd[0] == "git":
# If this is a direct git command that can enable colorized
# output and the user prefers coloring, add --color into the
# command line because we are going to wrap the command into
# a pipe and git won't know coloring should activate.
#
for cn in cmd[1:]:
if not cn.startswith('-'):
if not cn.startswith("-"):
break
else:
cn = None
if cn and cn in _CAN_COLOR:
class ColorCmd(Coloring):
def __init__(self, config, cmd):
Coloring.__init__(self, config, cmd)
if ColorCmd(self.manifest.manifestProject.config, cn).is_on:
cmd.insert(cmd.index(cn) + 1, '--color')
cmd.insert(cmd.index(cn) + 1, "--color")
mirror = self.manifest.IsMirror
rc = 0
smart_sync_manifest_name = "smart_sync_override.xml"
smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, smart_sync_manifest_name)
self.manifest.manifestProject.worktree, smart_sync_manifest_name
)
if os.path.isfile(smart_sync_manifest_path):
self.manifest.Override(smart_sync_manifest_path)
if opt.regex:
projects = self.FindProjects(args)
projects = self.FindProjects(args, all_manifests=all_trees)
elif opt.inverse_regex:
projects = self.FindProjects(args, inverse=True)
projects = self.FindProjects(
args, inverse=True, all_manifests=all_trees
)
else:
projects = self.GetProjects(args, groups=opt.groups)
projects = self.GetProjects(
args, groups=opt.groups, all_manifests=all_trees
)
os.environ['REPO_COUNT'] = str(len(projects))
os.environ["REPO_COUNT"] = str(len(projects))
try:
config = self.manifest.manifestProject.config
with multiprocessing.Pool(opt.jobs, InitWorker) as pool:
results_it = pool.imap(
functools.partial(DoWorkWrapper, mirror, opt, cmd, shell, config),
functools.partial(
DoWorkWrapper, mirror, opt, cmd, shell, config
),
enumerate(projects),
chunksize=WORKER_BATCH_SIZE)
chunksize=WORKER_BATCH_SIZE,
)
first = True
for (r, output) in results_it:
for r, output in results_it:
if output:
if first:
first = False
elif opt.project_header:
print()
# To simplify the DoWorkWrapper, take care of automatic newlines.
end = '\n'
if output[-1] == '\n':
end = ''
# To simplify the DoWorkWrapper, take care of automatic
# newlines.
end = "\n"
if output[-1] == "\n":
end = ""
print(output, end=end)
rc = rc or r
if r != 0 and opt.abort_on_errors:
raise Exception('Aborting due to previous error')
raise Exception("Aborting due to previous error")
except (KeyboardInterrupt, WorkerKeyboardInterrupt):
# Catch KeyboardInterrupt raised inside and outside of workers
rc = rc or errno.EINTR
except Exception as e:
# Catch any other exceptions raised
print('forall: unhandled error, terminating the pool: %s: %s' %
(type(e).__name__, e),
file=sys.stderr)
rc = rc or getattr(e, 'errno', 1)
print(
"forall: unhandled error, terminating the pool: %s: %s"
% (type(e).__name__, e),
file=sys.stderr,
)
rc = rc or getattr(e, "errno", 1)
if rc != 0:
sys.exit(rc)
@ -267,16 +315,16 @@ def InitWorker():
def DoWorkWrapper(mirror, opt, cmd, shell, config, args):
"""A wrapper around the DoWork() method.
Catch the KeyboardInterrupt exceptions here and re-raise them as a different,
``Exception``-based exception to stop it flooding the console with stacktraces
and making the parent hang indefinitely.
Catch the KeyboardInterrupt exceptions here and re-raise them as a
different, ``Exception``-based exception to stop it flooding the console
with stacktraces and making the parent hang indefinitely.
"""
cnt, project = args
try:
return DoWork(project, mirror, opt, cmd, shell, cnt, config)
except KeyboardInterrupt:
print('%s: Worker interrupted' % project.name)
print("%s: Worker interrupted" % project.name)
raise WorkerKeyboardInterrupt()
@ -285,28 +333,31 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
def setenv(name, val):
if val is None:
val = ''
val = ""
env[name] = val
setenv('REPO_PROJECT', project.name)
setenv('REPO_PATH', project.relpath)
setenv('REPO_REMOTE', project.remote.name)
setenv("REPO_PROJECT", project.name)
setenv("REPO_OUTERPATH", project.manifest.path_prefix)
setenv("REPO_INNERPATH", project.relpath)
setenv("REPO_PATH", project.RelPath(local=opt.this_manifest_only))
setenv("REPO_REMOTE", project.remote.name)
try:
# If we aren't in a fully synced state and we don't have the ref the manifest
# wants, then this will fail. Ignore it for the purposes of this code.
lrev = '' if mirror else project.GetRevisionId()
# If we aren't in a fully synced state and we don't have the ref the
# manifest wants, then this will fail. Ignore it for the purposes of
# this code.
lrev = "" if mirror else project.GetRevisionId()
except ManifestInvalidRevisionError:
lrev = ''
setenv('REPO_LREV', lrev)
setenv('REPO_RREV', project.revisionExpr)
setenv('REPO_UPSTREAM', project.upstream)
setenv('REPO_DEST_BRANCH', project.dest_branch)
setenv('REPO_I', str(cnt + 1))
lrev = ""
setenv("REPO_LREV", lrev)
setenv("REPO_RREV", project.revisionExpr)
setenv("REPO_UPSTREAM", project.upstream)
setenv("REPO_DEST_BRANCH", project.dest_branch)
setenv("REPO_I", str(cnt + 1))
for annotation in project.annotations:
setenv("REPO__%s" % (annotation.name), annotation.value)
if mirror:
setenv('GIT_DIR', project.gitdir)
setenv("GIT_DIR", project.gitdir)
cwd = project.gitdir
else:
cwd = project.worktree
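
The variables exported above are what a command run via `repo forall -c ...` sees in its environment. A hedged sketch of a child script that simply echoes them back (the variable list is taken from the setenv calls; values depend on the checkout):

    import os

    for key in (
        "REPO_PROJECT",
        "REPO_OUTERPATH",
        "REPO_INNERPATH",
        "REPO_PATH",
        "REPO_REMOTE",
        "REPO_LREV",
        "REPO_RREV",
        "REPO_I",
    ):
        print(f"{key}={os.environ.get(key, '')}")
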
@ -315,12 +366,13 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
# Allow the user to silently ignore missing checkouts so they can run on
# partial checkouts (good for infra recovery tools).
if opt.ignore_missing:
return (0, '')
return (0, "")
output = ''
if ((opt.project_header and opt.verbose)
or not opt.project_header):
output = 'skipping %s/' % project.relpath
output = ""
if (opt.project_header and opt.verbose) or not opt.project_header:
output = "skipping %s/" % project.RelPath(
local=opt.this_manifest_only
)
return (1, output)
if opt.verbose:
@ -331,9 +383,17 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
stdin = None if opt.interactive else subprocess.DEVNULL
result = subprocess.run(
cmd, cwd=cwd, shell=shell, env=env, check=False,
encoding='utf-8', errors='replace',
stdin=stdin, stdout=subprocess.PIPE, stderr=stderr)
cmd,
cwd=cwd,
shell=shell,
env=env,
check=False,
encoding="utf-8",
errors="replace",
stdin=stdin,
stdout=subprocess.PIPE,
stderr=stderr,
)
output = result.stdout
if opt.project_header:
@ -344,8 +404,10 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
if mirror:
project_header_path = project.name
else:
project_header_path = project.relpath
out.project('project %s/' % project_header_path)
project_header_path = project.RelPath(
local=opt.this_manifest_only
)
out.project("project %s/" % project_header_path)
out.nl()
buf.write(output)
output = buf.getvalue()

View File

@ -31,16 +31,22 @@ and all locally downloaded sources.
"""
def _Options(self, p):
p.add_option('-f', '--force',
dest='force', action='store_true',
help='force the deletion (no prompt)')
p.add_option(
"-f",
"--force",
dest="force",
action="store_true",
help="force the deletion (no prompt)",
)
def Execute(self, opt, args):
if not opt.force:
prompt = ('This will delete GITC client: %s\nAre you sure? (yes/no) ' %
self.gitc_manifest.gitc_client_name)
prompt = (
"This will delete GITC client: %s\nAre you sure? (yes/no) "
% self.gitc_manifest.gitc_client_name
)
response = input(prompt).lower()
if not response == 'yes':
if not response == "yes":
print('Response was not "yes"\n Exiting...')
sys.exit(1)
platform_utils.rmtree(self.gitc_manifest.gitc_client_dir)

View File

@ -24,6 +24,7 @@ import wrapper
class GitcInit(init.Init, GitcAvailableCommand):
COMMON = True
MULTI_MANIFEST_SUPPORT = False
helpSummary = "Initialize a GITC Client."
helpUsage = """
%prog [options] [client name]
@ -51,24 +52,36 @@ use for this GITC client.
def Execute(self, opt, args):
gitc_client = gitc_utils.parse_clientdir(os.getcwd())
if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client):
print('fatal: Please update your repo command. See go/gitc for instructions.',
file=sys.stderr)
if not gitc_client or (
opt.gitc_client and gitc_client != opt.gitc_client
):
print(
"fatal: Please update your repo command. See go/gitc for "
"instructions.",
file=sys.stderr,
)
sys.exit(1)
self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client)
self.client_dir = os.path.join(
gitc_utils.get_gitc_manifest_dir(), gitc_client
)
super().Execute(opt, args)
manifest_file = self.manifest.manifestFile
if opt.manifest_file:
if not os.path.exists(opt.manifest_file):
print('fatal: Specified manifest file %s does not exist.' %
opt.manifest_file)
print(
"fatal: Specified manifest file %s does not exist."
% opt.manifest_file
)
sys.exit(1)
manifest_file = opt.manifest_file
manifest = GitcManifest(self.repodir, gitc_client)
manifest = GitcManifest(
self.repodir, os.path.join(self.client_dir, ".manifest")
)
manifest.Override(manifest_file)
gitc_utils.generate_gitc_manifest(None, manifest)
print('Please run `cd %s` to view your GITC client.' %
os.path.join(wrapper.Wrapper().GITC_FS_ROOT_DIR, gitc_client))
print(
"Please run `cd %s` to view your GITC client."
% os.path.join(wrapper.Wrapper().GITC_FS_ROOT_DIR, gitc_client)
)

View File

@ -23,9 +23,9 @@ from git_command import GitCommand
class GrepColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'grep')
self.project = self.printer('project', attr='bold')
self.fail = self.printer('fail', fg='red')
Coloring.__init__(self, config, "grep")
self.project = self.printer("project", attr="bold")
self.fail = self.printer("fail", fg="red")
class Grep(PagedCommand):
@ -66,15 +66,15 @@ contain a line that matches both expressions:
@staticmethod
def _carry_option(_option, opt_str, value, parser):
pt = getattr(parser.values, 'cmd_argv', None)
pt = getattr(parser.values, "cmd_argv", None)
if pt is None:
pt = []
setattr(parser.values, 'cmd_argv', pt)
setattr(parser.values, "cmd_argv", pt)
if opt_str == '-(':
pt.append('(')
elif opt_str == '-)':
pt.append(')')
if opt_str == "-(":
pt.append("(")
elif opt_str == "-)":
pt.append(")")
else:
pt.append(opt_str)
@ -86,136 +86,218 @@ contain a line that matches both expressions:
super()._CommonOptions(p, opt_v=False)
def _Options(self, p):
g = p.add_option_group('Sources')
g.add_option('--cached',
action='callback', callback=self._carry_option,
help='Search the index, instead of the work tree')
g.add_option('-r', '--revision',
dest='revision', action='append', metavar='TREEish',
help='Search TREEish, instead of the work tree')
g = p.add_option_group("Sources")
g.add_option(
"--cached",
action="callback",
callback=self._carry_option,
help="Search the index, instead of the work tree",
)
g.add_option(
"-r",
"--revision",
dest="revision",
action="append",
metavar="TREEish",
help="Search TREEish, instead of the work tree",
)
g = p.add_option_group('Pattern')
g.add_option('-e',
action='callback', callback=self._carry_option,
metavar='PATTERN', type='str',
help='Pattern to search for')
g.add_option('-i', '--ignore-case',
action='callback', callback=self._carry_option,
help='Ignore case differences')
g.add_option('-a', '--text',
action='callback', callback=self._carry_option,
help="Process binary files as if they were text")
g.add_option('-I',
action='callback', callback=self._carry_option,
help="Don't match the pattern in binary files")
g.add_option('-w', '--word-regexp',
action='callback', callback=self._carry_option,
help='Match the pattern only at word boundaries')
g.add_option('-v', '--invert-match',
action='callback', callback=self._carry_option,
help='Select non-matching lines')
g.add_option('-G', '--basic-regexp',
action='callback', callback=self._carry_option,
help='Use POSIX basic regexp for patterns (default)')
g.add_option('-E', '--extended-regexp',
action='callback', callback=self._carry_option,
help='Use POSIX extended regexp for patterns')
g.add_option('-F', '--fixed-strings',
action='callback', callback=self._carry_option,
help='Use fixed strings (not regexp) for pattern')
g = p.add_option_group("Pattern")
g.add_option(
"-e",
action="callback",
callback=self._carry_option,
metavar="PATTERN",
type="str",
help="Pattern to search for",
)
g.add_option(
"-i",
"--ignore-case",
action="callback",
callback=self._carry_option,
help="Ignore case differences",
)
g.add_option(
"-a",
"--text",
action="callback",
callback=self._carry_option,
help="Process binary files as if they were text",
)
g.add_option(
"-I",
action="callback",
callback=self._carry_option,
help="Don't match the pattern in binary files",
)
g.add_option(
"-w",
"--word-regexp",
action="callback",
callback=self._carry_option,
help="Match the pattern only at word boundaries",
)
g.add_option(
"-v",
"--invert-match",
action="callback",
callback=self._carry_option,
help="Select non-matching lines",
)
g.add_option(
"-G",
"--basic-regexp",
action="callback",
callback=self._carry_option,
help="Use POSIX basic regexp for patterns (default)",
)
g.add_option(
"-E",
"--extended-regexp",
action="callback",
callback=self._carry_option,
help="Use POSIX extended regexp for patterns",
)
g.add_option(
"-F",
"--fixed-strings",
action="callback",
callback=self._carry_option,
help="Use fixed strings (not regexp) for pattern",
)
g = p.add_option_group('Pattern Grouping')
g.add_option('--all-match',
action='callback', callback=self._carry_option,
help='Limit match to lines that have all patterns')
g.add_option('--and', '--or', '--not',
action='callback', callback=self._carry_option,
help='Boolean operators to combine patterns')
g.add_option('-(', '-)',
action='callback', callback=self._carry_option,
help='Boolean operator grouping')
g = p.add_option_group("Pattern Grouping")
g.add_option(
"--all-match",
action="callback",
callback=self._carry_option,
help="Limit match to lines that have all patterns",
)
g.add_option(
"--and",
"--or",
"--not",
action="callback",
callback=self._carry_option,
help="Boolean operators to combine patterns",
)
g.add_option(
"-(",
"-)",
action="callback",
callback=self._carry_option,
help="Boolean operator grouping",
)
g = p.add_option_group('Output')
g.add_option('-n',
action='callback', callback=self._carry_option,
help='Prefix the line number to matching lines')
g.add_option('-C',
action='callback', callback=self._carry_option,
metavar='CONTEXT', type='str',
help='Show CONTEXT lines around match')
g.add_option('-B',
action='callback', callback=self._carry_option,
metavar='CONTEXT', type='str',
help='Show CONTEXT lines before match')
g.add_option('-A',
action='callback', callback=self._carry_option,
metavar='CONTEXT', type='str',
help='Show CONTEXT lines after match')
g.add_option('-l', '--name-only', '--files-with-matches',
action='callback', callback=self._carry_option,
help='Show only file names containing matching lines')
g.add_option('-L', '--files-without-match',
action='callback', callback=self._carry_option,
help='Show only file names not containing matching lines')
g = p.add_option_group("Output")
g.add_option(
"-n",
action="callback",
callback=self._carry_option,
help="Prefix the line number to matching lines",
)
g.add_option(
"-C",
action="callback",
callback=self._carry_option,
metavar="CONTEXT",
type="str",
help="Show CONTEXT lines around match",
)
g.add_option(
"-B",
action="callback",
callback=self._carry_option,
metavar="CONTEXT",
type="str",
help="Show CONTEXT lines before match",
)
g.add_option(
"-A",
action="callback",
callback=self._carry_option,
metavar="CONTEXT",
type="str",
help="Show CONTEXT lines after match",
)
g.add_option(
"-l",
"--name-only",
"--files-with-matches",
action="callback",
callback=self._carry_option,
help="Show only file names containing matching lines",
)
g.add_option(
"-L",
"--files-without-match",
action="callback",
callback=self._carry_option,
help="Show only file names not containing matching lines",
)
def _ExecuteOne(self, cmd_argv, project):
"""Process one project."""
try:
p = GitCommand(project,
p = GitCommand(
project,
cmd_argv,
bare=False,
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
except GitError as e:
return (project, -1, None, str(e))
return (project, p.Wait(), p.stdout, p.stderr)
@staticmethod
def _ProcessResults(full_name, have_rev, _pool, out, results):
def _ProcessResults(full_name, have_rev, opt, _pool, out, results):
git_failed = False
bad_rev = False
have_match = False
_RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)
for project, rc, stdout, stderr in results:
if rc < 0:
git_failed = True
out.project('--- project %s ---' % project.relpath)
out.project("--- project %s ---" % _RelPath(project))
out.nl()
out.fail('%s', stderr)
out.fail("%s", stderr)
out.nl()
continue
if rc:
# no results
if stderr:
if have_rev and 'fatal: ambiguous argument' in stderr:
if have_rev and "fatal: ambiguous argument" in stderr:
bad_rev = True
else:
out.project('--- project %s ---' % project.relpath)
out.project("--- project %s ---" % _RelPath(project))
out.nl()
out.fail('%s', stderr.strip())
out.fail("%s", stderr.strip())
out.nl()
continue
have_match = True
# We cut the last element, to avoid a blank line.
r = stdout.split('\n')
r = stdout.split("\n")
r = r[0:-1]
if have_rev and full_name:
for line in r:
rev, line = line.split(':', 1)
rev, line = line.split(":", 1)
out.write("%s", rev)
out.write(':')
out.project(project.relpath)
out.write('/')
out.write(":")
out.project(_RelPath(project))
out.write("/")
out.write("%s", line)
out.nl()
elif full_name:
for line in r:
out.project(project.relpath)
out.write('/')
out.project(_RelPath(project))
out.write("/")
out.write("%s", line)
out.nl()
else:
@ -227,41 +309,49 @@ contain a line that matches both expressions:
def Execute(self, opt, args):
out = GrepColoring(self.manifest.manifestProject.config)
cmd_argv = ['grep']
cmd_argv = ["grep"]
if out.is_on:
cmd_argv.append('--color')
cmd_argv.extend(getattr(opt, 'cmd_argv', []))
cmd_argv.append("--color")
cmd_argv.extend(getattr(opt, "cmd_argv", []))
if '-e' not in cmd_argv:
if "-e" not in cmd_argv:
if not args:
self.Usage()
cmd_argv.append('-e')
cmd_argv.append("-e")
cmd_argv.append(args[0])
args = args[1:]
projects = self.GetProjects(args)
projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
full_name = False
if len(projects) > 1:
cmd_argv.append('--full-name')
cmd_argv.append("--full-name")
full_name = True
have_rev = False
if opt.revision:
if '--cached' in cmd_argv:
print('fatal: cannot combine --cached and --revision', file=sys.stderr)
if "--cached" in cmd_argv:
print(
"fatal: cannot combine --cached and --revision",
file=sys.stderr,
)
sys.exit(1)
have_rev = True
cmd_argv.extend(opt.revision)
cmd_argv.append('--')
cmd_argv.append("--")
git_failed, bad_rev, have_match = self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, cmd_argv),
projects,
callback=functools.partial(self._ProcessResults, full_name, have_rev),
callback=functools.partial(
self._ProcessResults, full_name, have_rev, opt
),
output=out,
ordered=True)
ordered=True,
)
if git_failed:
sys.exit(1)
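The long run of `action="callback", callback=self._carry_option` options in this file exists so the command can accept grep flags it does not interpret itself and pass them straight through to `git grep` (they end up in the `cmd_argv` list that Execute extends above). A minimal, stdlib-only sketch of that optparse pattern follows; the `carry_option` helper and the sample flags are illustrative, not repo's exact implementation.

import optparse

def carry_option(option, opt_str, value, parser):
    # Collect the flag (plus its argument, if the option declared a type)
    # onto a cmd_argv list that is later appended after "git grep".
    argv = getattr(parser.values, "cmd_argv", None)
    if argv is None:
        argv = []
        parser.values.cmd_argv = argv
    argv.append(opt_str)
    if value is not None:
        argv.append(value)

p = optparse.OptionParser()
p.add_option("-i", action="callback", callback=carry_option,
             help="ignore case (relayed to git grep)")
p.add_option("-C", action="callback", callback=carry_option,
             metavar="CONTEXT", type="str",
             help="show CONTEXT lines around match")
opts, args = p.parse_args(["-i", "-C", "3", "needle"])
print(getattr(opts, "cmd_argv", []))  # ['-i', '-C', '3']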

View File

@ -18,7 +18,12 @@ import textwrap
from subcmds import all_commands
from color import Coloring
from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand
from command import (
PagedCommand,
MirrorSafeCommand,
GitcAvailableCommand,
GitcClientCommand,
)
import gitc_utils
from wrapper import Wrapper
@ -38,37 +43,41 @@ Displays detailed usage information about a command.
maxlen = 0
for name in commandNames:
maxlen = max(maxlen, len(name))
fmt = ' %%-%ds %%s' % maxlen
fmt = " %%-%ds %%s" % maxlen
for name in commandNames:
command = all_commands[name]()
try:
summary = command.helpSummary.strip()
except AttributeError:
summary = ''
summary = ""
print(fmt % (name, summary))
def _PrintAllCommands(self):
print('usage: repo COMMAND [ARGS]')
print("usage: repo COMMAND [ARGS]")
self.PrintAllCommandsBody()
def PrintAllCommandsBody(self):
print('The complete list of recognized repo commands are:')
print("The complete list of recognized repo commands is:")
commandNames = list(sorted(all_commands))
self._PrintCommands(commandNames)
print("See 'repo help <command>' for more information on a "
'specific command.')
print('Bug reports:', Wrapper().BUG_URL)
print(
"See 'repo help <command>' for more information on a "
"specific command."
)
print("Bug reports:", Wrapper().BUG_URL)
def _PrintCommonCommands(self):
print('usage: repo COMMAND [ARGS]')
print("usage: repo COMMAND [ARGS]")
self.PrintCommonCommandsBody()
def PrintCommonCommandsBody(self):
print('The most commonly used repo commands are:')
print("The most commonly used repo commands are:")
def gitc_supported(cmd):
if not isinstance(cmd, GitcAvailableCommand) and not isinstance(cmd, GitcClientCommand):
if not isinstance(cmd, GitcAvailableCommand) and not isinstance(
cmd, GitcClientCommand
):
return True
if self.client.isGitcClient:
return True
@ -78,21 +87,29 @@ Displays detailed usage information about a command.
return True
return False
commandNames = list(sorted([name
commandNames = list(
sorted(
[
name
for name, command in all_commands.items()
if command.COMMON and gitc_supported(command)]))
if command.COMMON and gitc_supported(command)
]
)
)
self._PrintCommands(commandNames)
print(
"See 'repo help <command>' for more information on a specific command.\n"
"See 'repo help --all' for a complete list of recognized commands.")
print('Bug reports:', Wrapper().BUG_URL)
"See 'repo help <command>' for more information on a specific "
"command.\nSee 'repo help --all' for a complete list of recognized "
"commands."
)
print("Bug reports:", Wrapper().BUG_URL)
def _PrintCommandHelp(self, cmd, header_prefix=''):
def _PrintCommandHelp(self, cmd, header_prefix=""):
class _Out(Coloring):
def __init__(self, gc):
Coloring.__init__(self, gc, 'help')
self.heading = self.printer('heading', attr='bold')
Coloring.__init__(self, gc, "help")
self.heading = self.printer("heading", attr="bold")
self._first = True
def _PrintSection(self, heading, bodyAttr):
@ -100,61 +117,72 @@ Displays detailed usage information about a command.
body = getattr(cmd, bodyAttr)
except AttributeError:
return
if body == '' or body is None:
if body == "" or body is None:
return
if not self._first:
self.nl()
self._first = False
self.heading('%s%s', header_prefix, heading)
self.heading("%s%s", header_prefix, heading)
self.nl()
self.nl()
me = 'repo %s' % cmd.NAME
me = "repo %s" % cmd.NAME
body = body.strip()
body = body.replace('%prog', me)
body = body.replace("%prog", me)
# Extract the title, but skip any trailing {#anchors}.
asciidoc_hdr = re.compile(r'^\n?#+ ([^{]+)(\{#.+\})?$')
asciidoc_hdr = re.compile(r"^\n?#+ ([^{]+)(\{#.+\})?$")
for para in body.split("\n\n"):
if para.startswith(' '):
self.write('%s', para)
if para.startswith(" "):
self.write("%s", para)
self.nl()
self.nl()
continue
m = asciidoc_hdr.match(para)
if m:
self.heading('%s%s', header_prefix, m.group(1))
self.heading("%s%s", header_prefix, m.group(1))
self.nl()
self.nl()
continue
lines = textwrap.wrap(para.replace(' ', ' '), width=80,
break_long_words=False, break_on_hyphens=False)
lines = textwrap.wrap(
para.replace(" ", " "),
width=80,
break_long_words=False,
break_on_hyphens=False,
)
for line in lines:
self.write('%s', line)
self.write("%s", line)
self.nl()
self.nl()
out = _Out(self.client.globalConfig)
out._PrintSection('Summary', 'helpSummary')
out._PrintSection("Summary", "helpSummary")
cmd.OptionParser.print_help()
out._PrintSection('Description', 'helpDescription')
out._PrintSection("Description", "helpDescription")
def _PrintAllCommandHelp(self):
for name in sorted(all_commands):
cmd = all_commands[name](manifest=self.manifest)
self._PrintCommandHelp(cmd, header_prefix='[%s] ' % (name,))
self._PrintCommandHelp(cmd, header_prefix="[%s] " % (name,))
def _Options(self, p):
p.add_option('-a', '--all',
dest='show_all', action='store_true',
help='show the complete list of commands')
p.add_option('--help-all',
dest='show_all_help', action='store_true',
help='show the --help of all commands')
p.add_option(
"-a",
"--all",
dest="show_all",
action="store_true",
help="show the complete list of commands",
)
p.add_option(
"--help-all",
dest="show_all_help",
action="store_true",
help="show the --help of all commands",
)
def Execute(self, opt, args):
if len(args) == 0:
@ -171,7 +199,9 @@ Displays detailed usage information about a command.
try:
cmd = all_commands[name](manifest=self.manifest)
except KeyError:
print("repo: '%s' is not a repo command." % name, file=sys.stderr)
print(
"repo: '%s' is not a repo command." % name, file=sys.stderr
)
sys.exit(1)
self._PrintCommandHelp(cmd)
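The _PrintSection hunk above splits the help body on blank lines, passes indented (preformatted) paragraphs through untouched, turns `# Title {#anchor}` headers into headings, and re-wraps everything else to 80 columns. A stdlib-only sketch of that flow, printing plain text rather than going through repo's Coloring class (the sample body is made up):

import re
import textwrap

asciidoc_hdr = re.compile(r"^\n?#+ ([^{]+)(\{#.+\})?$")

def render(body, width=80):
    for para in body.strip().split("\n\n"):
        if para.startswith(" "):
            # Preformatted block: emit verbatim.
            print(para + "\n")
            continue
        m = asciidoc_hdr.match(para)
        if m:
            # "# Title {#anchor}" -> keep the title, drop the anchor.
            print(m.group(1).strip() + "\n")
            continue
        wrapped = textwrap.wrap(para, width=width,
                                break_long_words=False,
                                break_on_hyphens=False)
        print("\n".join(wrapped) + "\n")

render("# Summary {#summary}\n\n"
       "A long paragraph is re-wrapped to the requested width without "
       "splitting long words or breaking on hyphens.\n\n"
       "    indented lines pass through exactly as written")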

View File

@ -26,45 +26,70 @@ class _Coloring(Coloring):
class Info(PagedCommand):
COMMON = True
helpSummary = "Get info on the manifest branch, current branch or unmerged branches"
helpSummary = (
"Get info on the manifest branch, current branch or unmerged branches"
)
helpUsage = "%prog [-dl] [-o [-c]] [<project>...]"
def _Options(self, p):
p.add_option('-d', '--diff',
dest='all', action='store_true',
help="show full info and commit diff including remote branches")
p.add_option('-o', '--overview',
dest='overview', action='store_true',
help='show overview of all local commits')
p.add_option('-c', '--current-branch',
dest="current_branch", action="store_true",
help="consider only checked out branches")
p.add_option('--no-current-branch',
dest='current_branch', action='store_false',
help='consider all local branches')
p.add_option(
"-d",
"--diff",
dest="all",
action="store_true",
help="show full info and commit diff including remote branches",
)
p.add_option(
"-o",
"--overview",
dest="overview",
action="store_true",
help="show overview of all local commits",
)
p.add_option(
"-c",
"--current-branch",
dest="current_branch",
action="store_true",
help="consider only checked out branches",
)
p.add_option(
"--no-current-branch",
dest="current_branch",
action="store_false",
help="consider all local branches",
)
# Turn this into a warning & remove this someday.
p.add_option('-b',
dest='current_branch', action='store_true',
help=optparse.SUPPRESS_HELP)
p.add_option('-l', '--local-only',
dest="local", action="store_true",
help="disable all remote operations")
p.add_option(
"-b",
dest="current_branch",
action="store_true",
help=optparse.SUPPRESS_HELP,
)
p.add_option(
"-l",
"--local-only",
dest="local",
action="store_true",
help="disable all remote operations",
)
def Execute(self, opt, args):
self.out = _Coloring(self.client.globalConfig)
self.heading = self.out.printer('heading', attr='bold')
self.headtext = self.out.nofmt_printer('headtext', fg='yellow')
self.redtext = self.out.printer('redtext', fg='red')
self.sha = self.out.printer("sha", fg='yellow')
self.text = self.out.nofmt_printer('text')
self.dimtext = self.out.printer('dimtext', attr='dim')
self.heading = self.out.printer("heading", attr="bold")
self.headtext = self.out.nofmt_printer("headtext", fg="yellow")
self.redtext = self.out.printer("redtext", fg="red")
self.sha = self.out.printer("sha", fg="yellow")
self.text = self.out.nofmt_printer("text")
self.dimtext = self.out.printer("dimtext", attr="dim")
self.opt = opt
if not opt.this_manifest_only:
self.manifest = self.manifest.outer_client
manifestConfig = self.manifest.manifestProject.config
mergeBranch = manifestConfig.GetBranch("default").merge
manifestGroups = (manifestConfig.GetString('manifest.groups')
or 'all,-notdefault')
manifestGroups = self.manifest.GetGroupsStr()
self.heading("Manifest branch: ")
if self.manifest.default.revisionExpr:
@ -80,17 +105,17 @@ class Info(PagedCommand):
self.printSeparator()
if not opt.overview:
self.printDiffInfo(args)
self._printDiffInfo(opt, args)
else:
self.printCommitOverview(args)
self._printCommitOverview(opt, args)
def printSeparator(self):
self.text("----------------------------")
self.out.nl()
def printDiffInfo(self, args):
def _printDiffInfo(self, opt, args):
# We let exceptions bubble up to main as they'll be well structured.
projs = self.GetProjects(args)
projs = self.GetProjects(args, all_manifests=not opt.this_manifest_only)
for p in projs:
self.heading("Project: ")
@ -107,7 +132,7 @@ class Info(PagedCommand):
currentBranch = p.CurrentBranch
if currentBranch:
self.heading('Current branch: ')
self.heading("Current branch: ")
self.headtext(currentBranch)
self.out.nl()
@ -134,7 +159,7 @@ class Info(PagedCommand):
if not self.opt.local:
project.Sync_NetworkHalf(quiet=True, current_branch_only=True)
branch = self.manifest.manifestProject.config.GetBranch('default').merge
branch = self.manifest.manifestProject.config.GetBranch("default").merge
if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS) :]
logTarget = R_M + branch
@ -142,18 +167,20 @@ class Info(PagedCommand):
bareTmp = project.bare_git._bare
project.bare_git._bare = False
localCommits = project.bare_git.rev_list(
'--abbrev=8',
'--abbrev-commit',
'--pretty=oneline',
"--abbrev=8",
"--abbrev-commit",
"--pretty=oneline",
logTarget + "..",
'--')
"--",
)
originCommits = project.bare_git.rev_list(
'--abbrev=8',
'--abbrev-commit',
'--pretty=oneline',
"--abbrev=8",
"--abbrev-commit",
"--pretty=oneline",
".." + logTarget,
'--')
"--",
)
project.bare_git._bare = bareTmp
self.heading("Local Commits: ")
@ -179,11 +206,12 @@ class Info(PagedCommand):
self.text(" ".join(split[1:]))
self.out.nl()
def printCommitOverview(self, args):
def _printCommitOverview(self, opt, args):
all_branches = []
for project in self.GetProjects(args):
br = [project.GetUploadableBranch(x)
for x in project.GetBranches()]
for project in self.GetProjects(
args, all_manifests=not opt.this_manifest_only
):
br = [project.GetUploadableBranch(x) for x in project.GetBranches()]
br = [x for x in br if x]
if self.opt.current_branch:
br = [x for x in br if x.name == project.CurrentBranch]
@ -193,29 +221,33 @@ class Info(PagedCommand):
return
self.out.nl()
self.heading('Projects Overview')
self.heading("Projects Overview")
project = None
for branch in all_branches:
if project != branch.project:
project = branch.project
self.out.nl()
self.headtext(project.relpath)
self.headtext(project.RelPath(local=opt.this_manifest_only))
self.out.nl()
commits = branch.commits
date = branch.date
self.text('%s %-33s (%2d commit%s, %s)' % (
branch.name == project.CurrentBranch and '*' or ' ',
self.text(
"%s %-33s (%2d commit%s, %s)"
% (
branch.name == project.CurrentBranch and "*" or " ",
branch.name,
len(commits),
len(commits) != 1 and 's' or '',
date))
len(commits) != 1 and "s" or "",
date,
)
)
self.out.nl()
for commit in commits:
split = commit.split()
self.text('{0:38}{1} '.format('', '-'))
self.text("{0:38}{1} ".format("", "-"))
self.sha(split[0] + " ")
self.text(" ".join(split[1:]))
self.out.nl()
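The `branch.name == project.CurrentBranch and "*" or " "` expressions kept by this reformat are the pre-PEP 308 ternary idiom; they only behave like a conditional because the middle operand ("*") is truthy. A quick equivalence check in plain Python:

for current in (True, False):
    old_style = current and "*" or " "   # legacy and/or chain
    modern = "*" if current else " "     # conditional expression
    assert old_style == modern
    print(repr(old_style))               # '*' then ' '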

View File

@ -13,24 +13,17 @@
# limitations under the License.
import os
import platform
import re
import sys
import urllib.parse
from color import Coloring
from command import InteractiveCommand, MirrorSafeCommand
from error import ManifestParseError
from project import SyncBuffer
from git_config import GitConfig
from git_command import git_require, MIN_GIT_VERSION_SOFT, MIN_GIT_VERSION_HARD
import git_superproject
import platform_utils
from wrapper import Wrapper
class Init(InteractiveCommand, MirrorSafeCommand):
COMMON = True
MULTI_MANIFEST_SUPPORT = True
helpSummary = "Initialize a repo client checkout in the current directory"
helpUsage = """
%prog [options] [manifest url]
@ -53,6 +46,12 @@ The optional -m argument can be used to specify an alternate manifest
to be used. If no manifest is specified, the manifest default.xml
will be used.
If the --standalone-manifest argument is set, the manifest will be downloaded
directly from the specified --manifest-url as a static file (rather than
setting up a manifest git checkout). With --standalone-manifest, the manifest
will be fully static and will not be re-downloaded during subsequent
`repo init` and `repo sync` calls.
The --reference option can be used to point to a directory that
has the content of a --mirror sync. This will make the working
directory use as much data as possible from the local reference
@ -83,238 +82,108 @@ to update the working directory files.
def _Options(self, p, gitc_init=False):
Wrapper().InitParser(p, gitc_init=gitc_init)
m = p.add_option_group("Multi-manifest")
m.add_option(
"--outer-manifest",
action="store_true",
default=True,
help="operate starting at the outermost manifest",
)
m.add_option(
"--no-outer-manifest",
dest="outer_manifest",
action="store_false",
help="do not operate on outer manifests",
)
m.add_option(
"--this-manifest-only",
action="store_true",
default=None,
help="only operate on this (sub)manifest",
)
m.add_option(
"--no-this-manifest-only",
"--all-manifests",
dest="this_manifest_only",
action="store_false",
help="operate on this manifest and its submanifests",
)
def _RegisteredEnvironmentOptions(self):
return {'REPO_MANIFEST_URL': 'manifest_url',
'REPO_MIRROR_LOCATION': 'reference'}
def _CloneSuperproject(self, opt):
"""Clone the superproject based on the superproject's url and branch.
Args:
opt: Program options returned from optparse. See _Options().
"""
superproject = git_superproject.Superproject(self.manifest,
self.repodir,
self.git_event_log,
quiet=opt.quiet)
sync_result = superproject.Sync()
if not sync_result.success:
print('warning: git update of superproject failed, repo sync will not '
'use superproject to fetch source; while this error is not fatal, '
'and you can continue to run repo sync, please run repo init with '
'the --no-use-superproject option to stop seeing this warning',
file=sys.stderr)
if sync_result.fatal and opt.use_superproject is not None:
sys.exit(1)
return {
"REPO_MANIFEST_URL": "manifest_url",
"REPO_MIRROR_LOCATION": "reference",
}
def _SyncManifest(self, opt):
m = self.manifest.manifestProject
is_new = not m.Exists
"""Call manifestProject.Sync with arguments from opt.
if is_new:
if not opt.manifest_url:
print('fatal: manifest url is required.', file=sys.stderr)
sys.exit(1)
if not opt.quiet:
print('Downloading manifest from %s' %
(GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),),
file=sys.stderr)
# The manifest project object doesn't keep track of the path on the
# server where this git is located, so let's save that here.
mirrored_manifest_git = None
if opt.reference:
manifest_git_path = urllib.parse.urlparse(opt.manifest_url).path[1:]
mirrored_manifest_git = os.path.join(opt.reference, manifest_git_path)
if not mirrored_manifest_git.endswith(".git"):
mirrored_manifest_git += ".git"
if not os.path.exists(mirrored_manifest_git):
mirrored_manifest_git = os.path.join(opt.reference,
'.repo/manifests.git')
m._InitGitDir(mirror_git=mirrored_manifest_git)
self._ConfigureDepth(opt)
# Set the remote URL before the remote branch as we might need it below.
if opt.manifest_url:
r = m.GetRemote(m.remote.name)
r.url = opt.manifest_url
r.ResetFetch()
r.Save()
if opt.manifest_branch:
if opt.manifest_branch == 'HEAD':
opt.manifest_branch = m.ResolveRemoteHead()
if opt.manifest_branch is None:
print('fatal: unable to resolve HEAD', file=sys.stderr)
sys.exit(1)
m.revisionExpr = opt.manifest_branch
else:
if is_new:
default_branch = m.ResolveRemoteHead()
if default_branch is None:
# If the remote doesn't have HEAD configured, default to master.
default_branch = 'refs/heads/master'
m.revisionExpr = default_branch
else:
m.PreSync()
groups = re.split(r'[,\s]+', opt.groups)
all_platforms = ['linux', 'darwin', 'windows']
platformize = lambda x: 'platform-' + x
if opt.platform == 'auto':
if (not opt.mirror and
not m.config.GetString('repo.mirror') == 'true'):
groups.append(platformize(platform.system().lower()))
elif opt.platform == 'all':
groups.extend(map(platformize, all_platforms))
elif opt.platform in all_platforms:
groups.append(platformize(opt.platform))
elif opt.platform != 'none':
print('fatal: invalid platform flag', file=sys.stderr)
sys.exit(1)
groups = [x for x in groups if x]
groupstr = ','.join(groups)
if opt.platform == 'auto' and groupstr == self.manifest.GetDefaultGroupsStr():
groupstr = None
m.config.SetString('manifest.groups', groupstr)
if opt.reference:
m.config.SetString('repo.reference', opt.reference)
if opt.dissociate:
m.config.SetBoolean('repo.dissociate', opt.dissociate)
if opt.worktree:
if opt.mirror:
print('fatal: --mirror and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
if opt.submodules:
print('fatal: --submodules and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
m.config.SetBoolean('repo.worktree', opt.worktree)
if is_new:
m.use_git_worktrees = True
print('warning: --worktree is experimental!', file=sys.stderr)
if opt.archive:
if is_new:
m.config.SetBoolean('repo.archive', opt.archive)
else:
print('fatal: --archive is only supported when initializing a new '
'workspace.', file=sys.stderr)
print('Either delete the .repo folder in this workspace, or initialize '
'in another location.', file=sys.stderr)
sys.exit(1)
if opt.mirror:
if is_new:
m.config.SetBoolean('repo.mirror', opt.mirror)
else:
print('fatal: --mirror is only supported when initializing a new '
'workspace.', file=sys.stderr)
print('Either delete the .repo folder in this workspace, or initialize '
'in another location.', file=sys.stderr)
sys.exit(1)
if opt.partial_clone is not None:
if opt.mirror:
print('fatal: --mirror and --partial-clone are mutually exclusive',
file=sys.stderr)
sys.exit(1)
m.config.SetBoolean('repo.partialclone', opt.partial_clone)
if opt.clone_filter:
m.config.SetString('repo.clonefilter', opt.clone_filter)
elif m.config.GetBoolean('repo.partialclone'):
opt.clone_filter = m.config.GetString('repo.clonefilter')
else:
opt.clone_filter = None
if opt.partial_clone_exclude is not None:
m.config.SetString('repo.partialcloneexclude', opt.partial_clone_exclude)
if opt.clone_bundle is None:
opt.clone_bundle = False if opt.partial_clone else True
else:
m.config.SetBoolean('repo.clonebundle', opt.clone_bundle)
if opt.submodules:
m.config.SetBoolean('repo.submodules', opt.submodules)
if opt.use_superproject is not None:
m.config.SetBoolean('repo.superproject', opt.use_superproject)
if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet, verbose=opt.verbose,
clone_bundle=opt.clone_bundle,
current_branch_only=opt.current_branch_only,
tags=opt.tags, submodules=opt.submodules,
Args:
opt: options from optparse.
"""
# Normally this value is set when instantiating the project, but the
# manifest project is special and is created when instantiating the
# manifest which happens before we parse options.
self.manifest.manifestProject.clone_depth = opt.manifest_depth
if not self.manifest.manifestProject.Sync(
manifest_url=opt.manifest_url,
manifest_branch=opt.manifest_branch,
standalone_manifest=opt.standalone_manifest,
groups=opt.groups,
platform=opt.platform,
mirror=opt.mirror,
dissociate=opt.dissociate,
reference=opt.reference,
worktree=opt.worktree,
submodules=opt.submodules,
archive=opt.archive,
partial_clone=opt.partial_clone,
clone_filter=opt.clone_filter,
partial_clone_exclude=self.manifest.PartialCloneExclude):
r = m.GetRemote(m.remote.name)
print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
# Better delete the manifest git dir if we created it; otherwise next
# time (when user fixes problems) we won't go through the "is_new" logic.
if is_new:
platform_utils.rmtree(m.gitdir)
sys.exit(1)
if opt.manifest_branch:
m.MetaBranchSwitch(submodules=opt.submodules)
syncbuf = SyncBuffer(m.config)
m.Sync_LocalHalf(syncbuf, submodules=opt.submodules)
syncbuf.Finish()
if is_new or m.CurrentBranch is None:
if not m.StartBranch('default'):
print('fatal: cannot create default in manifest', file=sys.stderr)
sys.exit(1)
def _LinkManifest(self, name):
if not name:
print('fatal: manifest name (-m) is required.', file=sys.stderr)
sys.exit(1)
try:
self.manifest.Link(name)
except ManifestParseError as e:
print("fatal: manifest '%s' not available" % name, file=sys.stderr)
print('fatal: %s' % str(e), file=sys.stderr)
partial_clone_exclude=opt.partial_clone_exclude,
clone_bundle=opt.clone_bundle,
git_lfs=opt.git_lfs,
use_superproject=opt.use_superproject,
verbose=opt.verbose,
current_branch_only=opt.current_branch_only,
tags=opt.tags,
depth=opt.depth,
git_event_log=self.git_event_log,
manifest_name=opt.manifest_name,
):
sys.exit(1)
def _Prompt(self, prompt, value):
print('%-10s [%s]: ' % (prompt, value), end='')
# TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush()
print("%-10s [%s]: " % (prompt, value), end="", flush=True)
a = sys.stdin.readline().strip()
if a == '':
if a == "":
return value
return a
def _ShouldConfigureUser(self, opt):
def _ShouldConfigureUser(self, opt, existing_checkout):
gc = self.client.globalConfig
mp = self.manifest.manifestProject
# If we don't have local settings, get from global.
if not mp.config.Has('user.name') or not mp.config.Has('user.email'):
if not gc.Has('user.name') or not gc.Has('user.email'):
if not mp.config.Has("user.name") or not mp.config.Has("user.email"):
if not gc.Has("user.name") or not gc.Has("user.email"):
return True
mp.config.SetString('user.name', gc.GetString('user.name'))
mp.config.SetString('user.email', gc.GetString('user.email'))
mp.config.SetString("user.name", gc.GetString("user.name"))
mp.config.SetString("user.email", gc.GetString("user.email"))
if not opt.quiet:
if not opt.quiet and not existing_checkout or opt.verbose:
print()
print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
mp.config.GetString('user.email')))
print("If you want to change this, please re-run 'repo init' with --config-name")
print(
"Your identity is: %s <%s>"
% (
mp.config.GetString("user.name"),
mp.config.GetString("user.email"),
)
)
print(
"If you want to change this, please re-run 'repo init' with "
"--config-name"
)
return False
def _ConfigureUser(self, opt):
@ -323,27 +192,25 @@ to update the working directory files.
while True:
if not opt.quiet:
print()
name = self._Prompt('Your Name', mp.UserName)
email = self._Prompt('Your Email', mp.UserEmail)
name = self._Prompt("Your Name", mp.UserName)
email = self._Prompt("Your Email", mp.UserEmail)
if not opt.quiet:
print()
print('Your identity is: %s <%s>' % (name, email))
print('is this correct [y/N]? ', end='')
# TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush()
print("Your identity is: %s <%s>" % (name, email))
print("is this correct [y/N]? ", end="", flush=True)
a = sys.stdin.readline().strip().lower()
if a in ('yes', 'y', 't', 'true'):
if a in ("yes", "y", "t", "true"):
break
if name != mp.UserName:
mp.config.SetString('user.name', name)
mp.config.SetString("user.name", name)
if email != mp.UserEmail:
mp.config.SetString('user.email', email)
mp.config.SetString("user.email", email)
def _HasColorSet(self, gc):
for n in ['ui', 'diff', 'status']:
if gc.Has('color.%s' % n):
for n in ["ui", "diff", "status"]:
if gc.Has("color.%s" % n):
return True
return False
@ -354,128 +221,161 @@ to update the working directory files.
class _Test(Coloring):
def __init__(self):
Coloring.__init__(self, gc, 'test color display')
Coloring.__init__(self, gc, "test color display")
self._on = True
out = _Test()
print()
print("Testing colorized output (for 'repo diff', 'repo status'):")
for c in ['black', 'red', 'green', 'yellow', 'blue', 'magenta', 'cyan']:
out.write(' ')
out.printer(fg=c)(' %-6s ', c)
out.write(' ')
out.printer(fg='white', bg='black')(' %s ' % 'white')
for c in ["black", "red", "green", "yellow", "blue", "magenta", "cyan"]:
out.write(" ")
out.printer(fg=c)(" %-6s ", c)
out.write(" ")
out.printer(fg="white", bg="black")(" %s " % "white")
out.nl()
for c in ['bold', 'dim', 'ul', 'reverse']:
out.write(' ')
out.printer(fg='black', attr=c)(' %-6s ', c)
for c in ["bold", "dim", "ul", "reverse"]:
out.write(" ")
out.printer(fg="black", attr=c)(" %-6s ", c)
out.nl()
print('Enable color display in this user account (y/N)? ', end='')
# TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush()
print(
"Enable color display in this user account (y/N)? ",
end="",
flush=True,
)
a = sys.stdin.readline().strip().lower()
if a in ('y', 'yes', 't', 'true', 'on'):
gc.SetString('color.ui', 'auto')
if a in ("y", "yes", "t", "true", "on"):
gc.SetString("color.ui", "auto")
def _ConfigureDepth(self, opt):
"""Configure the depth we'll sync down.
Args:
opt: Options from optparse. We care about opt.depth.
"""
# Opt.depth will be non-None if user actually passed --depth to repo init.
if opt.depth is not None:
if opt.depth > 0:
# Positive values will set the depth.
depth = str(opt.depth)
else:
# Negative numbers will clear the depth; passing None to SetString
# will do that.
depth = None
# We store the depth in the main manifest project.
self.manifest.manifestProject.config.SetString('repo.depth', depth)
def _DisplayResult(self, opt):
def _DisplayResult(self):
if self.manifest.IsMirror:
init_type = 'mirror '
init_type = "mirror "
else:
init_type = ''
init_type = ""
if not opt.quiet:
print()
print('repo %shas been initialized in %s' %
(init_type, self.manifest.topdir))
print(
"repo %shas been initialized in %s"
% (init_type, self.manifest.topdir)
)
current_dir = os.getcwd()
if current_dir != self.manifest.topdir:
print('If this is not the directory in which you want to initialize '
'repo, please run:')
print(' rm -r %s/.repo' % self.manifest.topdir)
print('and try again.')
print(
"If this is not the directory in which you want to initialize "
"repo, please run:"
)
print(" rm -r %s" % os.path.join(self.manifest.topdir, ".repo"))
print("and try again.")
def ValidateOptions(self, opt, args):
if opt.reference:
opt.reference = os.path.expanduser(opt.reference)
# Check this here, else manifest will be tagged "not new" and init won't be
# possible anymore without removing the .repo/manifests directory.
if opt.archive and opt.mirror:
self.OptionParser.error('--mirror and --archive cannot be used together.')
# Check this here, else manifest will be tagged "not new" and init won't
# be possible anymore without removing the .repo/manifests directory.
if opt.mirror:
if opt.archive:
self.OptionParser.error(
"--mirror and --archive cannot be used " "together."
)
if opt.use_superproject is not None:
self.OptionParser.error(
"--mirror and --use-superproject cannot be "
"used together."
)
if opt.archive and opt.use_superproject is not None:
self.OptionParser.error(
"--archive and --use-superproject cannot be used " "together."
)
if opt.standalone_manifest and (
opt.manifest_branch or opt.manifest_name != "default.xml"
):
self.OptionParser.error(
"--manifest-branch and --manifest-name cannot"
" be used with --standalone-manifest."
)
if args:
if opt.manifest_url:
self.OptionParser.error(
'--manifest-url option and URL argument both specified: only use '
'one to select the manifest URL.')
"--manifest-url option and URL argument both specified: "
"only use one to select the manifest URL."
)
opt.manifest_url = args.pop(0)
if args:
self.OptionParser.error('too many arguments to init')
self.OptionParser.error("too many arguments to init")
def Execute(self, opt, args):
git_require(MIN_GIT_VERSION_HARD, fail=True)
if not git_require(MIN_GIT_VERSION_SOFT):
print('repo: warning: git-%s+ will soon be required; please upgrade your '
'version of git to maintain support.'
% ('.'.join(str(x) for x in MIN_GIT_VERSION_SOFT),),
file=sys.stderr)
print(
"repo: warning: git-%s+ will soon be required; please upgrade "
"your version of git to maintain support."
% (".".join(str(x) for x in MIN_GIT_VERSION_SOFT),),
file=sys.stderr,
)
rp = self.manifest.repoProject
# Handle new --repo-url requests.
if opt.repo_url:
remote = rp.GetRemote('origin')
remote = rp.GetRemote("origin")
remote.url = opt.repo_url
remote.Save()
# Handle new --repo-rev requests.
if opt.repo_rev:
wrapper = Wrapper()
try:
remote_ref, rev = wrapper.check_repo_rev(
rp.gitdir, opt.repo_rev, repo_verify=opt.repo_verify, quiet=opt.quiet)
branch = rp.GetBranch('default')
rp.gitdir,
opt.repo_rev,
repo_verify=opt.repo_verify,
quiet=opt.quiet,
)
except wrapper.CloneFailure:
err_msg = "fatal: double check your --repo-rev setting."
print(
err_msg,
file=sys.stderr,
)
self.git_event_log.ErrorEvent(err_msg)
sys.exit(1)
branch = rp.GetBranch("default")
branch.merge = remote_ref
rp.work_git.reset('--hard', rev)
rp.work_git.reset("--hard", rev)
branch.Save()
if opt.worktree:
# Older versions of git supported worktree, but had dangerous gc bugs.
git_require((2, 15, 0), fail=True, msg='git gc worktree corruption')
# Older versions of git supported worktree, but had dangerous gc
# bugs.
git_require((2, 15, 0), fail=True, msg="git gc worktree corruption")
# Provide a short notice that we're reinitializing an existing checkout.
# Sometimes developers might not realize that they're in one, or that
# repo doesn't do nested checkouts.
existing_checkout = self.manifest.manifestProject.Exists
if not opt.quiet and existing_checkout:
print(
"repo: reusing existing repo client checkout in",
self.manifest.topdir,
)
self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name)
if self.manifest.manifestProject.config.GetBoolean('repo.superproject'):
self._CloneSuperproject(opt)
if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
if opt.config_name or self._ShouldConfigureUser(opt):
if opt.config_name or self._ShouldConfigureUser(
opt, existing_checkout
):
self._ConfigureUser(opt)
self._ConfigureColor()
self._DisplayResult(opt)
if not opt.quiet:
self._DisplayResult()

View File

@ -36,30 +36,58 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
"""
def _Options(self, p):
p.add_option('-r', '--regex',
dest='regex', action='store_true',
help='filter the project list based on regex or wildcard matching of strings')
p.add_option('-g', '--groups',
dest='groups',
help='filter the project list based on the groups the project is in')
p.add_option('-a', '--all',
action='store_true',
help='show projects regardless of checkout state')
p.add_option('-n', '--name-only',
dest='name_only', action='store_true',
help='display only the name of the repository')
p.add_option('-p', '--path-only',
dest='path_only', action='store_true',
help='display only the path of the repository')
p.add_option('-f', '--fullpath',
dest='fullpath', action='store_true',
help='display the full work tree path instead of the relative path')
p.add_option('--relative-to', metavar='PATH',
help='display paths relative to this one (default: top of repo client checkout)')
p.add_option(
"-r",
"--regex",
dest="regex",
action="store_true",
help="filter the project list based on regex or wildcard matching "
"of strings",
)
p.add_option(
"-g",
"--groups",
dest="groups",
help="filter the project list based on the groups the project is "
"in",
)
p.add_option(
"-a",
"--all",
action="store_true",
help="show projects regardless of checkout state",
)
p.add_option(
"-n",
"--name-only",
dest="name_only",
action="store_true",
help="display only the name of the repository",
)
p.add_option(
"-p",
"--path-only",
dest="path_only",
action="store_true",
help="display only the path of the repository",
)
p.add_option(
"-f",
"--fullpath",
dest="fullpath",
action="store_true",
help="display the full work tree path instead of the relative path",
)
p.add_option(
"--relative-to",
metavar="PATH",
help="display paths relative to this one (default: top of repo "
"client checkout)",
)
def ValidateOptions(self, opt, args):
if opt.fullpath and opt.name_only:
self.OptionParser.error('cannot combine -f and -n')
self.OptionParser.error("cannot combine -f and -n")
# Resolve any symlinks so the output is stable.
if opt.relative_to:
@ -77,16 +105,23 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
args: Positional args. Can be a list of projects to list, or empty.
"""
if not opt.regex:
projects = self.GetProjects(args, groups=opt.groups, missing_ok=opt.all)
projects = self.GetProjects(
args,
groups=opt.groups,
missing_ok=opt.all,
all_manifests=not opt.this_manifest_only,
)
else:
projects = self.FindProjects(args)
projects = self.FindProjects(
args, all_manifests=not opt.this_manifest_only
)
def _getpath(x):
if opt.fullpath:
return x.worktree
if opt.relative_to:
return os.path.relpath(x.worktree, opt.relative_to)
return x.relpath
return x.RelPath(local=opt.this_manifest_only)
lines = []
for project in projects:
@ -99,4 +134,4 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
if lines:
lines.sort()
print('\n'.join(lines))
print("\n".join(lines))

View File

@ -42,77 +42,130 @@ to indicate the remote ref to push changes to via 'repo upload'.
@property
def helpDescription(self):
helptext = self._helpDescription + '\n'
helptext = self._helpDescription + "\n"
r = os.path.dirname(__file__)
r = os.path.dirname(r)
with open(os.path.join(r, 'docs', 'manifest-format.md')) as fd:
with open(os.path.join(r, "docs", "manifest-format.md")) as fd:
for line in fd:
helptext += line
return helptext
def _Options(self, p):
p.add_option('-r', '--revision-as-HEAD',
dest='peg_rev', action='store_true',
help='save revisions as current HEAD')
p.add_option('-m', '--manifest-name',
help='temporary manifest to use for this sync', metavar='NAME.xml')
p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream',
default=True, action='store_false',
help='if in -r mode, do not write the upstream field '
'(only of use if the branch names for a sha1 manifest are '
'sensitive)')
p.add_option('--suppress-dest-branch', dest='peg_rev_dest_branch',
default=True, action='store_false',
help='if in -r mode, do not write the dest-branch field '
'(only of use if the branch names for a sha1 manifest are '
'sensitive)')
p.add_option('--json', default=False, action='store_true',
help='output manifest in JSON format (experimental)')
p.add_option('--pretty', default=False, action='store_true',
help='format output for humans to read')
p.add_option('--no-local-manifests', default=False, action='store_true',
dest='ignore_local_manifests', help='ignore local manifests')
p.add_option('-o', '--output-file',
dest='output_file',
default='-',
help='file to save the manifest to',
metavar='-|NAME.xml')
p.add_option(
"-r",
"--revision-as-HEAD",
dest="peg_rev",
action="store_true",
help="save revisions as current HEAD",
)
p.add_option(
"-m",
"--manifest-name",
help="temporary manifest to use for this sync",
metavar="NAME.xml",
)
p.add_option(
"--suppress-upstream-revision",
dest="peg_rev_upstream",
default=True,
action="store_false",
help="if in -r mode, do not write the upstream field "
"(only of use if the branch names for a sha1 manifest are "
"sensitive)",
)
p.add_option(
"--suppress-dest-branch",
dest="peg_rev_dest_branch",
default=True,
action="store_false",
help="if in -r mode, do not write the dest-branch field "
"(only of use if the branch names for a sha1 manifest are "
"sensitive)",
)
p.add_option(
"--json",
default=False,
action="store_true",
help="output manifest in JSON format (experimental)",
)
p.add_option(
"--pretty",
default=False,
action="store_true",
help="format output for humans to read",
)
p.add_option(
"--no-local-manifests",
default=False,
action="store_true",
dest="ignore_local_manifests",
help="ignore local manifests",
)
p.add_option(
"-o",
"--output-file",
dest="output_file",
default="-",
help="file to save the manifest to. (Filename prefix for "
"multi-tree.)",
metavar="-|NAME.xml",
)
def _Output(self, opt):
# If alternate manifest is specified, override the manifest file that we're using.
# If alternate manifest is specified, override the manifest file that
# we're using.
if opt.manifest_name:
self.manifest.Override(opt.manifest_name, False)
if opt.output_file == '-':
for manifest in self.ManifestList(opt):
output_file = opt.output_file
if output_file == "-":
fd = sys.stdout
else:
fd = open(opt.output_file, 'w')
if manifest.path_prefix:
output_file = (
f"{opt.output_file}:"
f'{manifest.path_prefix.replace("/", "%2f")}'
)
fd = open(output_file, "w")
self.manifest.SetUseLocalManifests(not opt.ignore_local_manifests)
manifest.SetUseLocalManifests(not opt.ignore_local_manifests)
if opt.json:
print('warning: --json is experimental!', file=sys.stderr)
doc = self.manifest.ToDict(peg_rev=opt.peg_rev,
print("warning: --json is experimental!", file=sys.stderr)
doc = manifest.ToDict(
peg_rev=opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream,
peg_rev_dest_branch=opt.peg_rev_dest_branch)
peg_rev_dest_branch=opt.peg_rev_dest_branch,
)
json_settings = {
# JSON style guide says Uunicode characters are fully allowed.
'ensure_ascii': False,
# JSON style guide says Unicode characters are fully
# allowed.
"ensure_ascii": False,
# We use 2 space indent to match JSON style guide.
'indent': 2 if opt.pretty else None,
'separators': (',', ': ') if opt.pretty else (',', ':'),
'sort_keys': True,
"indent": 2 if opt.pretty else None,
"separators": (",", ": ") if opt.pretty else (",", ":"),
"sort_keys": True,
}
fd.write(json.dumps(doc, **json_settings))
else:
self.manifest.Save(fd,
manifest.Save(
fd,
peg_rev=opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream,
peg_rev_dest_branch=opt.peg_rev_dest_branch)
peg_rev_dest_branch=opt.peg_rev_dest_branch,
)
if output_file != "-":
fd.close()
if opt.output_file != '-':
print('Saved manifest to %s' % opt.output_file, file=sys.stderr)
if manifest.path_prefix:
print(
f"Saved {manifest.path_prefix} submanifest to "
f"{output_file}",
file=sys.stderr,
)
else:
print(f"Saved manifest to {output_file}", file=sys.stderr)
def ValidateOptions(self, opt, args):
if args:

View File

@ -34,22 +34,33 @@ are displayed.
"""
def _Options(self, p):
p.add_option('-c', '--current-branch',
dest="current_branch", action="store_true",
help="consider only checked out branches")
p.add_option('--no-current-branch',
dest='current_branch', action='store_false',
help='consider all local branches')
p.add_option(
"-c",
"--current-branch",
dest="current_branch",
action="store_true",
help="consider only checked out branches",
)
p.add_option(
"--no-current-branch",
dest="current_branch",
action="store_false",
help="consider all local branches",
)
# Turn this into a warning & remove this someday.
p.add_option('-b',
dest='current_branch', action='store_true',
help=optparse.SUPPRESS_HELP)
p.add_option(
"-b",
dest="current_branch",
action="store_true",
help=optparse.SUPPRESS_HELP,
)
def Execute(self, opt, args):
all_branches = []
for project in self.GetProjects(args):
br = [project.GetUploadableBranch(x)
for x in project.GetBranches()]
for project in self.GetProjects(
args, all_manifests=not opt.this_manifest_only
):
br = [project.GetUploadableBranch(x) for x in project.GetBranches()]
br = [x for x in br if x]
if opt.current_branch:
br = [x for x in br if x.name == project.CurrentBranch]
@ -60,14 +71,14 @@ are displayed.
class Report(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'status')
self.project = self.printer('header', attr='bold')
self.text = self.printer('text')
Coloring.__init__(self, config, "status")
self.project = self.printer("header", attr="bold")
self.text = self.printer("text")
out = Report(all_branches[0].project.config)
out.text("Deprecated. See repo info -o.")
out.nl()
out.project('Projects Overview')
out.project("Projects Overview")
out.nl()
project = None
@ -76,16 +87,23 @@ are displayed.
if project != branch.project:
project = branch.project
out.nl()
out.project('project %s/' % project.relpath)
out.project(
"project %s/"
% project.RelPath(local=opt.this_manifest_only)
)
out.nl()
commits = branch.commits
date = branch.date
print('%s %-33s (%2d commit%s, %s)' % (
branch.name == project.CurrentBranch and '*' or ' ',
print(
"%s %-33s (%2d commit%s, %s)"
% (
branch.name == project.CurrentBranch and "*" or " ",
branch.name,
len(commits),
len(commits) != 1 and 's' or ' ',
date))
len(commits) != 1 and "s" or " ",
date,
)
)
for commit in commits:
print('%-35s - %s' % ('', commit))
print("%-35s - %s" % ("", commit))

View File

@ -31,10 +31,12 @@ class Prune(PagedCommand):
return project.PruneHeads()
def Execute(self, opt, args):
projects = self.GetProjects(args)
projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
# NB: Should be able to refactor this module to display summary as results
# come back from children.
# NB: Should be able to refactor this module to display summary as
# results come back from children.
def _ProcessResults(_pool, _output, results):
return list(itertools.chain.from_iterable(results))
@ -43,18 +45,19 @@ class Prune(PagedCommand):
self._ExecuteOne,
projects,
callback=_ProcessResults,
ordered=True)
ordered=True,
)
if not all_branches:
return
class Report(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'status')
self.project = self.printer('header', attr='bold')
Coloring.__init__(self, config, "status")
self.project = self.printer("header", attr="bold")
out = Report(all_branches[0].project.config)
out.project('Pending Branches')
out.project("Pending Branches")
out.nl()
project = None
@ -63,19 +66,29 @@ class Prune(PagedCommand):
if project != branch.project:
project = branch.project
out.nl()
out.project('project %s/' % project.relpath)
out.project(
"project %s/"
% project.RelPath(local=opt.this_manifest_only)
)
out.nl()
print('%s %-33s ' % (
branch.name == project.CurrentBranch and '*' or ' ',
branch.name), end='')
print(
"%s %-33s "
% (
branch.name == project.CurrentBranch and "*" or " ",
branch.name,
),
end="",
)
if not branch.base_exists:
print('(ignoring: tracking branch is gone: %s)' % (branch.base,))
print(
"(ignoring: tracking branch is gone: %s)" % (branch.base,)
)
else:
commits = branch.commits
date = branch.date
print('(%2d commit%s, %s)' % (
len(commits),
len(commits) != 1 and 's' or ' ',
date))
print(
"(%2d commit%s, %s)"
% (len(commits), len(commits) != 1 and "s" or " ", date)
)

View File

@ -21,9 +21,9 @@ from git_command import GitCommand
class RebaseColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'rebase')
self.project = self.printer('project', attr='bold')
self.fail = self.printer('fail', fg='red')
Coloring.__init__(self, config, "rebase")
self.project = self.printer("project", attr="bold")
self.fail = self.printer("fail", fg="red")
class Rebase(Command):
@ -39,65 +39,103 @@ branch but need to incorporate new upstream changes "underneath" them.
"""
def _Options(self, p):
g = p.get_option_group('--quiet')
g.add_option('-i', '--interactive',
dest="interactive", action="store_true",
help="interactive rebase (single project only)")
g = p.get_option_group("--quiet")
g.add_option(
"-i",
"--interactive",
dest="interactive",
action="store_true",
help="interactive rebase (single project only)",
)
p.add_option('--fail-fast',
dest='fail_fast', action='store_true',
help='stop rebasing after first error is hit')
p.add_option('-f', '--force-rebase',
dest='force_rebase', action='store_true',
help='pass --force-rebase to git rebase')
p.add_option('--no-ff',
dest='ff', default=True, action='store_false',
help='pass --no-ff to git rebase')
p.add_option('--autosquash',
dest='autosquash', action='store_true',
help='pass --autosquash to git rebase')
p.add_option('--whitespace',
dest='whitespace', action='store', metavar='WS',
help='pass --whitespace to git rebase')
p.add_option('--auto-stash',
dest='auto_stash', action='store_true',
help='stash local modifications before starting')
p.add_option('-m', '--onto-manifest',
dest='onto_manifest', action='store_true',
help='rebase onto the manifest version instead of upstream '
'HEAD (this helps to make sure the local tree stays '
'consistent if you previously synced to a manifest)')
p.add_option(
"--fail-fast",
dest="fail_fast",
action="store_true",
help="stop rebasing after first error is hit",
)
p.add_option(
"-f",
"--force-rebase",
dest="force_rebase",
action="store_true",
help="pass --force-rebase to git rebase",
)
p.add_option(
"--no-ff",
dest="ff",
default=True,
action="store_false",
help="pass --no-ff to git rebase",
)
p.add_option(
"--autosquash",
dest="autosquash",
action="store_true",
help="pass --autosquash to git rebase",
)
p.add_option(
"--whitespace",
dest="whitespace",
action="store",
metavar="WS",
help="pass --whitespace to git rebase",
)
p.add_option(
"--auto-stash",
dest="auto_stash",
action="store_true",
help="stash local modifications before starting",
)
p.add_option(
"-m",
"--onto-manifest",
dest="onto_manifest",
action="store_true",
help="rebase onto the manifest version instead of upstream "
"HEAD (this helps to make sure the local tree stays "
"consistent if you previously synced to a manifest)",
)
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
all_projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
one_project = len(all_projects) == 1
if opt.interactive and not one_project:
print('error: interactive rebase not supported with multiple projects',
file=sys.stderr)
print(
"error: interactive rebase not supported with multiple "
"projects",
file=sys.stderr,
)
if len(args) == 1:
print('note: project %s is mapped to more than one path' % (args[0],),
file=sys.stderr)
print(
"note: project %s is mapped to more than one path"
% (args[0],),
file=sys.stderr,
)
return 1
# Setup the common git rebase args that we use for all projects.
common_args = ['rebase']
common_args = ["rebase"]
if opt.whitespace:
common_args.append('--whitespace=%s' % opt.whitespace)
common_args.append("--whitespace=%s" % opt.whitespace)
if opt.quiet:
common_args.append('--quiet')
common_args.append("--quiet")
if opt.force_rebase:
common_args.append('--force-rebase')
common_args.append("--force-rebase")
if not opt.ff:
common_args.append('--no-ff')
common_args.append("--no-ff")
if opt.autosquash:
common_args.append('--autosquash')
common_args.append("--autosquash")
if opt.interactive:
common_args.append('-i')
common_args.append("-i")
config = self.manifest.manifestProject.config
out = RebaseColoring(config)
out.redirect(sys.stdout)
_RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)
ret = 0
for project in all_projects:
@ -107,30 +145,40 @@ branch but need to incorporate new upstream changes "underneath" them.
cb = project.CurrentBranch
if not cb:
if one_project:
print("error: project %s has a detached HEAD" % project.relpath,
file=sys.stderr)
print(
"error: project %s has a detached HEAD"
% _RelPath(project),
file=sys.stderr,
)
return 1
# ignore branches with detatched HEADs
# Ignore branches with detached HEADs.
continue
upbranch = project.GetBranch(cb)
if not upbranch.LocalMerge:
if one_project:
print("error: project %s does not track any remote branches"
% project.relpath, file=sys.stderr)
print(
"error: project %s does not track any remote branches"
% _RelPath(project),
file=sys.stderr,
)
return 1
# ignore branches without remotes
# Ignore branches without remotes.
continue
args = common_args[:]
if opt.onto_manifest:
args.append('--onto')
args.append("--onto")
args.append(project.revisionExpr)
args.append(upbranch.LocalMerge)
out.project('project %s: rebasing %s -> %s',
project.relpath, cb, upbranch.LocalMerge)
out.project(
"project %s: rebasing %s -> %s",
_RelPath(project),
cb,
upbranch.LocalMerge,
)
out.nl()
out.flush()
@ -152,13 +200,15 @@ branch but need to incorporate new upstream changes "underneath" them.
continue
if needs_stash:
stash_args.append('pop')
stash_args.append('--quiet')
stash_args.append("pop")
stash_args.append("--quiet")
if GitCommand(project, stash_args).Wait() != 0:
ret += 1
if ret:
out.fail('%i projects had errors', ret)
msg_fmt = "%d projects had errors"
self.git_event_log.ErrorEvent(msg_fmt % (ret), msg_fmt)
out.fail(msg_fmt, ret)
out.nl()
return ret

View File

@ -35,13 +35,20 @@ need to be performed by an end-user.
"""
def _Options(self, p):
g = p.add_option_group('repo Version options')
g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false',
help='do not verify repo source code')
g.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true',
help=SUPPRESS_HELP)
g = p.add_option_group("repo Version options")
g.add_option(
"--no-repo-verify",
dest="repo_verify",
default=True,
action="store_false",
help="do not verify repo source code",
)
g.add_option(
"--repo-upgraded",
dest="repo_upgraded",
action="store_true",
help=SUPPRESS_HELP,
)
def Execute(self, opt, args):
rp = self.manifest.repoProject
@ -51,11 +58,9 @@ need to be performed by an end-user.
_PostRepoUpgrade(self.manifest)
else:
if not rp.Sync_NetworkHalf():
if not rp.Sync_NetworkHalf().success:
print("error: can't update repo", file=sys.stderr)
sys.exit(1)
rp.bare_git.gc('--auto')
_PostRepoFetch(rp,
repo_verify=opt.repo_verify,
verbose=True)
rp.bare_git.gc("--auto")
_PostRepoFetch(rp, repo_verify=opt.repo_verify, verbose=True)

View File

@ -21,10 +21,10 @@ from git_command import GitCommand
class _ProjectList(Coloring):
def __init__(self, gc):
Coloring.__init__(self, gc, 'interactive')
self.prompt = self.printer('prompt', fg='blue', attr='bold')
self.header = self.printer('header', attr='bold')
self.help = self.printer('help', fg='red', attr='bold')
Coloring.__init__(self, gc, "interactive")
self.prompt = self.printer("prompt", fg="blue", attr="bold")
self.header = self.printer("header", attr="bold")
self.help = self.printer("help", fg="red", attr="bold")
class Stage(InteractiveCommand):
@ -38,10 +38,14 @@ The '%prog' command stages files to prepare the next commit.
"""
def _Options(self, p):
g = p.get_option_group('--quiet')
g.add_option('-i', '--interactive',
dest='interactive', action='store_true',
help='use interactive staging')
g = p.get_option_group("--quiet")
g.add_option(
"-i",
"--interactive",
dest="interactive",
action="store_true",
help="use interactive staging",
)
def Execute(self, opt, args):
if opt.interactive:
@ -50,39 +54,50 @@ The '%prog' command stages files to prepare the next commit.
self.Usage()
def _Interactive(self, opt, args):
all_projects = [p for p in self.GetProjects(args) if p.IsDirty()]
all_projects = [
p
for p in self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
if p.IsDirty()
]
if not all_projects:
print('no projects have uncommitted modifications', file=sys.stderr)
print("no projects have uncommitted modifications", file=sys.stderr)
return
out = _ProjectList(self.manifest.manifestProject.config)
while True:
out.header(' %s', 'project')
out.header(" %s", "project")
out.nl()
for i in range(len(all_projects)):
project = all_projects[i]
out.write('%3d: %s', i + 1, project.relpath + '/')
out.write(
"%3d: %s",
i + 1,
project.RelPath(local=opt.this_manifest_only) + "/",
)
out.nl()
out.nl()
out.write('%3d: (', 0)
out.prompt('q')
out.write('uit)')
out.write("%3d: (", 0)
out.prompt("q")
out.write("uit)")
out.nl()
out.prompt('project> ')
out.prompt("project> ")
out.flush()
try:
a = sys.stdin.readline()
except KeyboardInterrupt:
out.nl()
break
if a == '':
if a == "":
out.nl()
break
a = a.strip()
if a.lower() in ('q', 'quit', 'exit'):
if a.lower() in ("q", "quit", "exit"):
break
if not a:
continue
@ -99,13 +114,17 @@ The '%prog' command stages files to prepare the next commit.
_AddI(all_projects[a_index - 1])
continue
projects = [p for p in all_projects if a in [p.name, p.relpath]]
projects = [
p
for p in all_projects
if a in [p.name, p.RelPath(local=opt.this_manifest_only)]
]
if len(projects) == 1:
_AddI(projects[0])
continue
print('Bye.')
print("Bye.")
def _AddI(project):
p = GitCommand(project, ['add', '--interactive'], bare=False)
p = GitCommand(project, ["add", "--interactive"], bare=False)
p.Wait()

View File

@ -37,21 +37,34 @@ revision specified in the manifest.
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('--all',
dest='all', action='store_true',
help='begin branch in all projects')
p.add_option('-r', '--rev', '--revision', dest='revision',
help='point branch at this revision instead of upstream')
p.add_option('--head', '--HEAD',
dest='revision', action='store_const', const='HEAD',
help='abbreviation for --rev HEAD')
p.add_option(
"--all",
dest="all",
action="store_true",
help="begin branch in all projects",
)
p.add_option(
"-r",
"--rev",
"--revision",
dest="revision",
help="point branch at this revision instead of upstream",
)
p.add_option(
"--head",
"--HEAD",
dest="revision",
action="store_const",
const="HEAD",
help="abbreviation for --rev HEAD",
)
def ValidateOptions(self, opt, args):
if not args:
self.Usage()
nb = args[0]
if not git.check_ref_format('heads/%s' % nb):
if not git.check_ref_format("heads/%s" % nb):
self.OptionParser.error("'%s' is not a valid name" % nb)
def _ExecuteOne(self, revision, nb, project):
@ -59,7 +72,7 @@ revision specified in the manifest.
# If the current revision is immutable, such as a SHA1, a tag or
# a change, then we can't push back to it. Substitute with
# dest_branch, if defined; or with manifest default revision instead.
branch_merge = ''
branch_merge = ""
if IsImmutable(project.revisionExpr):
if project.dest_branch:
branch_merge = project.dest_branch
@ -68,9 +81,13 @@ revision specified in the manifest.
try:
ret = project.StartBranch(
nb, branch_merge=branch_merge, revision=revision)
nb, branch_merge=branch_merge, revision=revision
)
except Exception as e:
print('error: unable to checkout %s: %s' % (project.name, e), file=sys.stderr)
print(
"error: unable to checkout %s: %s" % (project.name, e),
file=sys.stderr,
)
ret = False
return (ret, project)
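The inline comment above spells out how `start` picks a merge target when the manifest revision is immutable. A minimal sketch of that selection, assuming a project object with `revisionExpr` and `dest_branch` attributes and taking the immutability check and the manifest default revision as parameters (the fallback to the manifest default is paraphrased from the comment; it is not shown in this hunk):

def pick_branch_merge(project, manifest_default_revision, is_immutable):
    """Illustrative sketch: choose what a new topic branch should merge back to."""
    if not is_immutable(project.revisionExpr):
        # A mutable revision (a branch name) can be pushed back to directly.
        return ""
    # SHA-1s, tags, and changes cannot be pushed to; prefer the project's
    # dest_branch, else fall back to the manifest's default revision.
    return project.dest_branch or manifest_default_revision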
@ -81,16 +98,21 @@ revision specified in the manifest.
if not opt.all:
projects = args[1:]
if len(projects) < 1:
projects = ['.'] # start it in the local project by default
projects = ["."] # start it in the local project by default
all_projects = self.GetProjects(projects,
missing_ok=bool(self.gitc_manifest))
all_projects = self.GetProjects(
projects,
missing_ok=bool(self.gitc_manifest),
all_manifests=not opt.this_manifest_only,
)
# This must happen after we find all_projects, since GetProjects may need
# the local directory, which will disappear once we save the GITC manifest.
# This must happen after we find all_projects, since GetProjects may
# need the local directory, which will disappear once we save the GITC
# manifest.
if self.gitc_manifest:
gitc_projects = self.GetProjects(projects, manifest=self.gitc_manifest,
missing_ok=True)
gitc_projects = self.GetProjects(
projects, manifest=self.gitc_manifest, missing_ok=True
)
for project in gitc_projects:
if project.old_revision:
project.already_synced = True
@ -101,17 +123,18 @@ revision specified in the manifest.
# Save the GITC manifest.
gitc_utils.save_manifest(self.gitc_manifest)
# Make sure we have a valid CWD
# Make sure we have a valid CWD.
if not os.path.exists(os.getcwd()):
os.chdir(self.manifest.topdir)
pm = Progress('Syncing %s' % nb, len(all_projects), quiet=opt.quiet)
pm = Progress("Syncing %s" % nb, len(all_projects), quiet=opt.quiet)
for project in all_projects:
gitc_project = self.gitc_manifest.paths[project.relpath]
# Sync projects that have not been opened.
if not gitc_project.already_synced:
proj_localdir = os.path.join(self.gitc_manifest.gitc_client_dir,
project.relpath)
proj_localdir = os.path.join(
self.gitc_manifest.gitc_client_dir, project.relpath
)
project.worktree = proj_localdir
if not os.path.exists(proj_localdir):
os.makedirs(proj_localdir)
@ -119,24 +142,32 @@ revision specified in the manifest.
sync_buf = SyncBuffer(self.manifest.manifestProject.config)
project.Sync_LocalHalf(sync_buf)
project.revisionId = gitc_project.old_revision
pm.update()
pm.update(msg="")
pm.end()
def _ProcessResults(_pool, pm, results):
for (result, project) in results:
for result, project in results:
if not result:
err.append(project)
pm.update()
pm.update(msg="")
self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, opt.revision, nb),
all_projects,
callback=_ProcessResults,
output=Progress('Starting %s' % (nb,), len(all_projects), quiet=opt.quiet))
output=Progress(
"Starting %s" % (nb,), len(all_projects), quiet=opt.quiet
),
)
if err:
for p in err:
print("error: %s/: cannot start %s" % (p.relpath, nb),
file=sys.stderr)
print(
"error: %s/: cannot start %s"
% (p.RelPath(local=opt.this_manifest_only), nb),
file=sys.stderr,
)
msg_fmt = "cannot start %d project(s)"
self.git_event_log.ErrorEvent(msg_fmt % (len(err)), msg_fmt)
sys.exit(1)


@ -79,11 +79,16 @@ the following meanings:
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('-o', '--orphans',
dest='orphans', action='store_true',
help="include objects in working directory outside of repo projects")
p.add_option(
"-o",
"--orphans",
dest="orphans",
action="store_true",
help="include objects in working directory outside of repo "
"projects",
)
def _StatusHelper(self, quiet, project):
def _StatusHelper(self, quiet, local, project):
"""Obtains the status for a specific project.
Obtains the status for a project, redirecting the output to
@ -91,88 +96,107 @@ the following meanings:
Args:
quiet: Whether to reduce the amount of status output.
local: a boolean; if True, the path is relative to the local
(sub)manifest. If False, the path is relative to the outermost
manifest.
project: Project to get status of.
Returns:
The status of the project.
"""
buf = io.StringIO()
ret = project.PrintWorkTreeStatus(quiet=quiet, output_redir=buf)
ret = project.PrintWorkTreeStatus(
quiet=quiet, output_redir=buf, local=local
)
return (ret, buf.getvalue())
def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring):
"""find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'"""
status_header = ' --\t'
"""find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'""" # noqa: E501
status_header = " --\t"
for item in dirs:
if not platform_utils.isdir(item):
outstring.append(''.join([status_header, item]))
outstring.append("".join([status_header, item]))
continue
if item in proj_dirs:
continue
if item in proj_dirs_parents:
self._FindOrphans(glob.glob('%s/.*' % item) +
glob.glob('%s/*' % item),
proj_dirs, proj_dirs_parents, outstring)
self._FindOrphans(
glob.glob("%s/.*" % item) + glob.glob("%s/*" % item),
proj_dirs,
proj_dirs_parents,
outstring,
)
continue
outstring.append(''.join([status_header, item, '/']))
outstring.append("".join([status_header, item, "/"]))
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
all_projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
def _ProcessResults(_pool, _output, results):
ret = 0
for (state, output) in results:
for state, output in results:
if output:
print(output, end='')
if state == 'CLEAN':
print(output, end="")
if state == "CLEAN":
ret += 1
return ret
counter = self.ExecuteInParallel(
opt.jobs,
functools.partial(self._StatusHelper, opt.quiet),
functools.partial(
self._StatusHelper, opt.quiet, opt.this_manifest_only
),
all_projects,
callback=_ProcessResults,
ordered=True)
ordered=True,
)
if not opt.quiet and len(all_projects) == counter:
print('nothing to commit (working directory clean)')
print("nothing to commit (working directory clean)")
if opt.orphans:
proj_dirs = set()
proj_dirs_parents = set()
for project in self.GetProjects(None, missing_ok=True):
proj_dirs.add(project.relpath)
(head, _tail) = os.path.split(project.relpath)
for project in self.GetProjects(
None, missing_ok=True, all_manifests=not opt.this_manifest_only
):
relpath = project.RelPath(local=opt.this_manifest_only)
proj_dirs.add(relpath)
(head, _tail) = os.path.split(relpath)
while head != "":
proj_dirs_parents.add(head)
(head, _tail) = os.path.split(head)
proj_dirs.add('.repo')
proj_dirs.add(".repo")
class StatusColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'status')
self.project = self.printer('header', attr='bold')
self.untracked = self.printer('untracked', fg='red')
Coloring.__init__(self, config, "status")
self.project = self.printer("header", attr="bold")
self.untracked = self.printer("untracked", fg="red")
orig_path = os.getcwd()
try:
os.chdir(self.manifest.topdir)
outstring = []
self._FindOrphans(glob.glob('.*') +
glob.glob('*'),
proj_dirs, proj_dirs_parents, outstring)
self._FindOrphans(
glob.glob(".*") + glob.glob("*"),
proj_dirs,
proj_dirs_parents,
outstring,
)
if outstring:
output = StatusColoring(self.client.globalConfig)
output.project('Objects not within a project (orphans)')
output.project("Objects not within a project (orphans)")
output.nl()
for entry in outstring:
output.untracked(entry)
output.nl()
else:
print('No orphan files or directories')
print("No orphan files or directories")
finally:
# Restore CWD.

File diff suppressed because it is too large.


@ -17,6 +17,7 @@ import functools
import optparse
import re
import sys
from typing import List
from command import DEFAULT_LOCAL_JOBS, InteractiveCommand
from editor import Editor
@ -24,33 +25,74 @@ from error import UploadError
from git_command import GitCommand
from git_refs import R_HEADS
from hooks import RepoHook
from project import ReviewableBranch
UNUSUAL_COMMIT_THRESHOLD = 5
_DEFAULT_UNUSUAL_COMMIT_THRESHOLD = 5
def _ConfirmManyUploads(multiple_branches=False):
if multiple_branches:
print('ATTENTION: One or more branches has an unusually high number '
'of commits.')
def _VerifyPendingCommits(branches: List[ReviewableBranch]) -> bool:
"""Perform basic safety checks on the given set of branches.
Checks whether any branch has a "large" number of commits and, if so,
prompts the user to confirm they want to proceed with the upload.
Returns true if all branches pass the safety check or the user
confirmed. Returns false if the upload should be aborted.
"""
# Determine if any branch has a suspicious number of commits.
many_commits = False
for branch in branches:
# Get the user's unusual threshold for the branch.
#
# Each branch may be configured to have a different threshold.
remote = branch.project.GetBranch(branch.name).remote
key = f"review.{remote.review}.uploadwarningthreshold"
threshold = branch.project.config.GetInt(key)
if threshold is None:
threshold = _DEFAULT_UNUSUAL_COMMIT_THRESHOLD
# If the branch has more commits than the threshold, show a warning.
if len(branch.commits) > threshold:
many_commits = True
break
# If any branch has many commits, prompt the user.
if many_commits:
if len(branches) > 1:
print(
"ATTENTION: One or more branches has an unusually high number "
"of commits."
)
else:
print('ATTENTION: You are uploading an unusually high number of commits.')
print('YOU PROBABLY DO NOT MEAN TO DO THIS. (Did you rebase across '
'branches?)')
answer = input("If you are sure you intend to do this, type 'yes': ").strip()
print(
"ATTENTION: You are uploading an unusually high number of "
"commits."
)
print(
"YOU PROBABLY DO NOT MEAN TO DO THIS. (Did you rebase across "
"branches?)"
)
answer = input(
"If you are sure you intend to do this, type 'yes': "
).strip()
return answer == "yes"
return True
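The threshold lookup in `_VerifyPendingCommits` reads `review.<host>.uploadwarningthreshold` from the project's git config and falls back to the default. A standalone sketch of the same lookup using plain `git config` instead of repo's GitConfig wrapper (the review host below is a made-up example):

import subprocess

_DEFAULT_UNUSUAL_COMMIT_THRESHOLD = 5

def upload_warning_threshold(review_host, cwd="."):
    """Return the configured warning threshold for a review host, else the default."""
    key = f"review.{review_host}.uploadwarningthreshold"
    result = subprocess.run(
        ["git", "config", "--get", key],
        cwd=cwd,
        capture_output=True,
        text=True,
    )
    value = result.stdout.strip()
    if result.returncode != 0 or not value:
        return _DEFAULT_UNUSUAL_COMMIT_THRESHOLD
    return int(value)

# Hypothetical usage: upload_warning_threshold("gerrit.example.com")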
def _die(fmt, *args):
msg = fmt % args
print('error: %s' % msg, file=sys.stderr)
print("error: %s" % msg, file=sys.stderr)
sys.exit(1)
def _SplitEmails(values):
result = []
for value in values:
result.extend([s.strip() for s in value.split(',')])
result.extend([s.strip() for s in value.split(",")])
return result
@ -78,6 +120,13 @@ added to the respective list of users, and emails are sent to any
new users. Users passed as --reviewers must already be registered
with the code review system, or the upload will fail.
While most normal Gerrit options have dedicated command line options,
direct access to the Gerrit options is available via --push-options.
This is useful when Gerrit has newer functionality that %prog doesn't
yet support, or doesn't have plans to support. See the Push Options
documentation for more details:
https://gerrit-review.googlesource.com/Documentation/user-upload.html#push_options
# Configuration
review.URL.autoupload:
@ -142,6 +191,13 @@ review.URL.uploadnotify:
Control e-mail notifications when uploading.
https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify
review.URL.uploadwarningthreshold:
Repo will warn you if you are attempting to upload a large number
of commits in one or more branches. By default, the threshold
is five commits. This option lets you set the warning threshold
to a different value.
# References
Gerrit Code Review: https://www.gerritcodereview.com/
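The help text above notes that --push-options hands values straight through to Gerrit. As a rough illustration (not repo's actual upload code path), each value typically ends up as a `git push -o` argument; the remote, refspec, and option strings below are assumptions for the example:

import subprocess

def push_with_options(remote, refspec, push_options, cwd="."):
    """Illustrative: forward each --push-option value as a `git push -o` argument."""
    cmd = ["git", "push"]
    for opt in push_options or []:
        cmd.extend(["-o", opt])  # e.g. "topic=my-topic"
    cmd.extend([remote, refspec])
    return subprocess.run(cmd, cwd=cwd, check=False)

# Hypothetical usage mirroring `repo upload -o topic=my-topic`:
# push_with_options("origin", "HEAD:refs/for/main", ["topic=my-topic"])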
@ -150,71 +206,169 @@ Gerrit Code Review: https://www.gerritcodereview.com/
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('-t',
dest='auto_topic', action='store_true',
help='send local branch name to Gerrit Code Review')
p.add_option('--hashtag', '--ht',
dest='hashtags', action='append', default=[],
help='add hashtags (comma delimited) to the review')
p.add_option('--hashtag-branch', '--htb',
action='store_true',
help='add local branch name as a hashtag')
p.add_option('-l', '--label',
dest='labels', action='append', default=[],
help='add a label when uploading')
p.add_option('--re', '--reviewers',
type='string', action='append', dest='reviewers',
help='request reviews from these people')
p.add_option('--cc',
type='string', action='append', dest='cc',
help='also send email to these email addresses')
p.add_option('--br', '--branch',
type='string', action='store', dest='branch',
help='(local) branch to upload')
p.add_option('-c', '--current-branch',
dest='current_branch', action='store_true',
help='upload current git branch')
p.add_option('--no-current-branch',
dest='current_branch', action='store_false',
help='upload all git branches')
# Turn this into a warning & remove this someday.
p.add_option('--cbr',
dest='current_branch', action='store_true',
help=optparse.SUPPRESS_HELP)
p.add_option('--ne', '--no-emails',
action='store_false', dest='notify', default=True,
help='do not send e-mails on upload')
p.add_option('-p', '--private',
action='store_true', dest='private', default=False,
help='upload as a private change (deprecated; use --wip)')
p.add_option('-w', '--wip',
action='store_true', dest='wip', default=False,
help='upload as a work-in-progress change')
p.add_option('-o', '--push-option',
type='string', action='append', dest='push_options',
p.add_option(
"-t",
dest="auto_topic",
action="store_true",
help="send local branch name to Gerrit Code Review",
)
p.add_option(
"--hashtag",
"--ht",
dest="hashtags",
action="append",
default=[],
help='additional push options to transmit')
p.add_option('-D', '--destination', '--dest',
type='string', action='store', dest='dest_branch',
metavar='BRANCH',
help='submit for review on this target branch')
p.add_option('-n', '--dry-run',
dest='dryrun', default=False, action='store_true',
help='do everything except actually upload the CL')
p.add_option('-y', '--yes',
default=False, action='store_true',
help='answer yes to all safe prompts')
p.add_option('--no-cert-checks',
dest='validate_certs', action='store_false', default=True,
help='disable verifying ssl certs (unsafe)')
RepoHook.AddOptionGroup(p, 'pre-upload')
help="add hashtags (comma delimited) to the review",
)
p.add_option(
"--hashtag-branch",
"--htb",
action="store_true",
help="add local branch name as a hashtag",
)
p.add_option(
"-l",
"--label",
dest="labels",
action="append",
default=[],
help="add a label when uploading",
)
p.add_option(
"--re",
"--reviewers",
type="string",
action="append",
dest="reviewers",
help="request reviews from these people",
)
p.add_option(
"--cc",
type="string",
action="append",
dest="cc",
help="also send email to these email addresses",
)
p.add_option(
"--br",
"--branch",
type="string",
action="store",
dest="branch",
help="(local) branch to upload",
)
p.add_option(
"-c",
"--current-branch",
dest="current_branch",
action="store_true",
help="upload current git branch",
)
p.add_option(
"--no-current-branch",
dest="current_branch",
action="store_false",
help="upload all git branches",
)
# Turn this into a warning & remove this someday.
p.add_option(
"--cbr",
dest="current_branch",
action="store_true",
help=optparse.SUPPRESS_HELP,
)
p.add_option(
"--ne",
"--no-emails",
action="store_false",
dest="notify",
default=True,
help="do not send e-mails on upload",
)
p.add_option(
"-p",
"--private",
action="store_true",
dest="private",
default=False,
help="upload as a private change (deprecated; use --wip)",
)
p.add_option(
"-w",
"--wip",
action="store_true",
dest="wip",
default=False,
help="upload as a work-in-progress change",
)
p.add_option(
"-r",
"--ready",
action="store_true",
default=False,
help="mark change as ready (clears work-in-progress setting)",
)
p.add_option(
"-o",
"--push-option",
type="string",
action="append",
dest="push_options",
default=[],
help="additional push options to transmit",
)
p.add_option(
"-D",
"--destination",
"--dest",
type="string",
action="store",
dest="dest_branch",
metavar="BRANCH",
help="submit for review on this target branch",
)
p.add_option(
"-n",
"--dry-run",
dest="dryrun",
default=False,
action="store_true",
help="do everything except actually upload the CL",
)
p.add_option(
"-y",
"--yes",
default=False,
action="store_true",
help="answer yes to all safe prompts",
)
p.add_option(
"--ignore-untracked-files",
action="store_true",
default=False,
help="ignore untracked files in the working copy",
)
p.add_option(
"--no-ignore-untracked-files",
dest="ignore_untracked_files",
action="store_false",
help="always ask about untracked files in the working copy",
)
p.add_option(
"--no-cert-checks",
dest="validate_certs",
action="store_false",
default=True,
help="disable verifying ssl certs (unsafe)",
)
RepoHook.AddOptionGroup(p, "pre-upload")
def _SingleBranch(self, opt, branch, people):
project = branch.project
name = branch.name
remote = project.GetBranch(name).remote
key = 'review.%s.autoupload' % remote.review
key = "review.%s.autoupload" % remote.review
answer = project.config.GetBoolean(key)
if answer is False:
@ -224,45 +378,55 @@ Gerrit Code Review: https://www.gerritcodereview.com/
date = branch.date
commit_list = branch.commits
destination = opt.dest_branch or project.dest_branch or project.revisionExpr
print('Upload project %s/ to remote branch %s%s:' %
(project.relpath, destination, ' (private)' if opt.private else ''))
print(' branch %s (%2d commit%s, %s):' % (
destination = (
opt.dest_branch or project.dest_branch or project.revisionExpr
)
print(
"Upload project %s/ to remote branch %s%s:"
% (
project.RelPath(local=opt.this_manifest_only),
destination,
" (private)" if opt.private else "",
)
)
print(
" branch %s (%2d commit%s, %s):"
% (
name,
len(commit_list),
len(commit_list) != 1 and 's' or '',
date))
len(commit_list) != 1 and "s" or "",
date,
)
)
for commit in commit_list:
print(' %s' % commit)
print(" %s" % commit)
print('to %s (y/N)? ' % remote.review, end='')
# TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush()
print("to %s (y/N)? " % remote.review, end="", flush=True)
if opt.yes:
print('<--yes>')
print("<--yes>")
answer = True
else:
answer = sys.stdin.readline().strip().lower()
answer = answer in ('y', 'yes', '1', 'true', 't')
if answer:
if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD:
answer = _ConfirmManyUploads()
if answer:
self._UploadAndReport(opt, [branch], people)
else:
answer = answer in ("y", "yes", "1", "true", "t")
if not answer:
_die("upload aborted by user")
# Perform some basic safety checks prior to uploading.
if not opt.yes and not _VerifyPendingCommits([branch]):
_die("upload aborted by user")
self._UploadAndReport(opt, [branch], people)
def _MultipleBranches(self, opt, pending, people):
projects = {}
branches = {}
script = []
script.append('# Uncomment the branches to upload:')
script.append("# Uncomment the branches to upload:")
for project, avail in pending:
script.append('#')
script.append('# project %s/:' % project.relpath)
project_path = project.RelPath(local=opt.this_manifest_only)
script.append("#")
script.append(f"# project {project_path}/:")
b = {}
for branch in avail:
@ -273,26 +437,34 @@ Gerrit Code Review: https://www.gerritcodereview.com/
commit_list = branch.commits
if b:
script.append('#')
destination = opt.dest_branch or project.dest_branch or project.revisionExpr
script.append('# branch %s (%2d commit%s, %s) to remote branch %s:' % (
script.append("#")
destination = (
opt.dest_branch
or project.dest_branch
or project.revisionExpr
)
script.append(
"# branch %s (%2d commit%s, %s) to remote branch %s:"
% (
name,
len(commit_list),
len(commit_list) != 1 and 's' or '',
len(commit_list) != 1 and "s" or "",
date,
destination))
destination,
)
)
for commit in commit_list:
script.append('# %s' % commit)
script.append("# %s" % commit)
b[name] = branch
projects[project.relpath] = project
branches[project.name] = b
script.append('')
projects[project_path] = project
branches[project_path] = b
script.append("")
script = Editor.EditString("\n".join(script)).split("\n")
project_re = re.compile(r'^#?\s*project\s*([^\s]+)/:$')
branch_re = re.compile(r'^\s*branch\s*([^\s(]+)\s*\(.*')
project_re = re.compile(r"^#?\s*project\s*([^\s]+)/:$")
branch_re = re.compile(r"^\s*branch\s*([^\s(]+)\s*\(.*")
project = None
todo = []
@ -303,28 +475,24 @@ Gerrit Code Review: https://www.gerritcodereview.com/
name = m.group(1)
project = projects.get(name)
if not project:
_die('project %s not available for upload', name)
_die("project %s not available for upload", name)
continue
m = branch_re.match(line)
if m:
name = m.group(1)
if not project:
_die('project for branch %s not in script', name)
branch = branches[project.name].get(name)
_die("project for branch %s not in script", name)
project_path = project.RelPath(local=opt.this_manifest_only)
branch = branches[project_path].get(name)
if not branch:
_die('branch %s not in %s', name, project.relpath)
_die("branch %s not in %s", name, project_path)
todo.append(branch)
if not todo:
_die("nothing uncommented for upload")
many_commits = False
for branch in todo:
if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD:
many_commits = True
break
if many_commits:
if not _ConfirmManyUploads(multiple_branches=True):
# Perform some basic safety checks prior to uploading.
if not opt.yes and not _VerifyPendingCommits(todo):
_die("upload aborted by user")
self._UploadAndReport(opt, todo, people)
@ -332,21 +500,21 @@ Gerrit Code Review: https://www.gerritcodereview.com/
def _AppendAutoList(self, branch, people):
"""
Appends the list of reviewers in the git project's config.
Appends the list of users in the CC list in the git project's config if a
non-empty reviewer list was found.
Appends the list of users in the CC list in the git project's config if
a non-empty reviewer list was found.
"""
name = branch.name
project = branch.project
key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review
key = "review.%s.autoreviewer" % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if raw_list is not None:
people[0].extend([entry.strip() for entry in raw_list.split(',')])
people[0].extend([entry.strip() for entry in raw_list.split(",")])
key = 'review.%s.autocopy' % project.GetBranch(name).remote.review
key = "review.%s.autocopy" % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if raw_list is not None and len(people[0]) > 0:
people[1].extend([entry.strip() for entry in raw_list.split(',')])
people[1].extend([entry.strip() for entry in raw_list.split(",")])
def _FindGerritChange(self, branch):
last_pub = branch.project.WasPublished(branch.name)
@ -356,7 +524,7 @@ Gerrit Code Review: https://www.gerritcodereview.com/
refs = branch.GetPublishedRefs()
try:
# refs/changes/XYZ/N --> XYZ
return refs.get(last_pub).split('/')[-2]
return refs.get(last_pub).split("/")[-2]
except (AttributeError, IndexError):
return ""
@ -367,92 +535,113 @@ Gerrit Code Review: https://www.gerritcodereview.com/
people = copy.deepcopy(original_people)
self._AppendAutoList(branch, people)
# Check if there are local changes that may have been forgotten
# Check if there are local changes that may have been forgotten.
changes = branch.project.UncommitedFiles()
if opt.ignore_untracked_files:
untracked = set(branch.project.UntrackedFiles())
changes = [x for x in changes if x not in untracked]
if changes:
key = 'review.%s.autoupload' % branch.project.remote.review
key = "review.%s.autoupload" % branch.project.remote.review
answer = branch.project.config.GetBoolean(key)
# if they want to auto upload, let's not ask because it could be automated
# If they want to auto upload, let's not ask because it
# could be automated.
if answer is None:
print()
print('Uncommitted changes in %s (did you forget to amend?):'
% branch.project.name)
print('\n'.join(changes))
print('Continue uploading? (y/N) ', end='')
# TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush()
print(
"Uncommitted changes in %s (did you forget to "
"amend?):" % branch.project.name
)
print("\n".join(changes))
print("Continue uploading? (y/N) ", end="", flush=True)
if opt.yes:
print('<--yes>')
a = 'yes'
print("<--yes>")
a = "yes"
else:
a = sys.stdin.readline().strip().lower()
if a not in ('y', 'yes', 't', 'true', 'on'):
if a not in ("y", "yes", "t", "true", "on"):
print("skipping upload", file=sys.stderr)
branch.uploaded = False
branch.error = 'User aborted'
branch.error = "User aborted"
continue
# Check if topic branches should be sent to the server during upload
# Check if topic branches should be sent to the server during
# upload.
if opt.auto_topic is not True:
key = 'review.%s.uploadtopic' % branch.project.remote.review
key = "review.%s.uploadtopic" % branch.project.remote.review
opt.auto_topic = branch.project.config.GetBoolean(key)
def _ExpandCommaList(value):
"""Split |value| up into comma delimited entries."""
if not value:
return
for ret in value.split(','):
for ret in value.split(","):
ret = ret.strip()
if ret:
yield ret
# Check if hashtags should be included.
key = 'review.%s.uploadhashtags' % branch.project.remote.review
hashtags = set(_ExpandCommaList(branch.project.config.GetString(key)))
key = "review.%s.uploadhashtags" % branch.project.remote.review
hashtags = set(
_ExpandCommaList(branch.project.config.GetString(key))
)
for tag in opt.hashtags:
hashtags.update(_ExpandCommaList(tag))
if opt.hashtag_branch:
hashtags.add(branch.name)
# Check if labels should be included.
key = 'review.%s.uploadlabels' % branch.project.remote.review
labels = set(_ExpandCommaList(branch.project.config.GetString(key)))
key = "review.%s.uploadlabels" % branch.project.remote.review
labels = set(
_ExpandCommaList(branch.project.config.GetString(key))
)
for label in opt.labels:
labels.update(_ExpandCommaList(label))
# Basic sanity check on label syntax.
for label in labels:
if not re.match(r'^.+[+-][0-9]+$', label):
print('repo: error: invalid label syntax "%s": labels use forms '
'like CodeReview+1 or Verified-1' % (label,), file=sys.stderr)
sys.exit(1)
# Handle e-mail notifications.
if opt.notify is False:
notify = 'NONE'
notify = "NONE"
else:
key = 'review.%s.uploadnotify' % branch.project.remote.review
key = (
"review.%s.uploadnotify" % branch.project.remote.review
)
notify = branch.project.config.GetString(key)
destination = opt.dest_branch or branch.project.dest_branch
# Make sure our local branch is not setup to track a different remote branch
merge_branch = self._GetMergeBranch(branch.project)
if destination:
if branch.project.dest_branch and not opt.dest_branch:
merge_branch = self._GetMergeBranch(
branch.project, local_branch=branch.name
)
full_dest = destination
if not full_dest.startswith(R_HEADS):
full_dest = R_HEADS + full_dest
if not opt.dest_branch and merge_branch and merge_branch != full_dest:
print('merge branch %s does not match destination branch %s'
% (merge_branch, full_dest))
print('skipping upload.')
print('Please use `--destination %s` if this is intentional'
% destination)
# If the merge branch of the local branch is different from
# the project's revision AND destination, this might not be
# intentional.
if (
merge_branch
and merge_branch != branch.project.revisionExpr
and merge_branch != full_dest
):
print(
f"For local branch {branch.name}: merge branch "
f"{merge_branch} does not match destination branch "
f"{destination}"
)
print("skipping upload.")
print(
f"Please use `--destination {destination}` if this "
"is intentional"
)
branch.uploaded = False
continue
branch.UploadForReview(people,
branch.UploadForReview(
people,
dryrun=opt.dryrun,
auto_topic=opt.auto_topic,
hashtags=hashtags,
@ -460,54 +649,72 @@ Gerrit Code Review: https://www.gerritcodereview.com/
private=opt.private,
notify=notify,
wip=opt.wip,
ready=opt.ready,
dest_branch=destination,
validate_certs=opt.validate_certs,
push_options=opt.push_options)
push_options=opt.push_options,
)
branch.uploaded = True
except UploadError as e:
self.git_event_log.ErrorEvent(f"upload error: {e}")
branch.error = e
branch.uploaded = False
have_errors = True
print(file=sys.stderr)
print('----------------------------------------------------------------------', file=sys.stderr)
print("-" * 70, file=sys.stderr)
if have_errors:
for branch in todo:
if not branch.uploaded:
if len(str(branch.error)) <= 30:
fmt = ' (%s)'
fmt = " (%s)"
else:
fmt = '\n (%s)'
print(('[FAILED] %-15s %-15s' + fmt) % (
branch.project.relpath + '/',
fmt = "\n (%s)"
print(
("[FAILED] %-15s %-15s" + fmt)
% (
branch.project.RelPath(local=opt.this_manifest_only)
+ "/",
branch.name,
str(branch.error)),
file=sys.stderr)
str(branch.error),
),
file=sys.stderr,
)
print()
for branch in todo:
if branch.uploaded:
print('[OK ] %-15s %s' % (
branch.project.relpath + '/',
branch.name),
file=sys.stderr)
print(
"[OK ] %-15s %s"
% (
branch.project.RelPath(local=opt.this_manifest_only)
+ "/",
branch.name,
),
file=sys.stderr,
)
if have_errors:
sys.exit(1)
def _GetMergeBranch(self, project):
p = GitCommand(project,
['rev-parse', '--abbrev-ref', 'HEAD'],
def _GetMergeBranch(self, project, local_branch=None):
if local_branch is None:
p = GitCommand(
project,
["rev-parse", "--abbrev-ref", "HEAD"],
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
p.Wait()
local_branch = p.stdout.strip()
p = GitCommand(project,
['config', '--get', 'branch.%s.merge' % local_branch],
p = GitCommand(
project,
["config", "--get", "branch.%s.merge" % local_branch],
capture_stdout=True,
capture_stderr=True)
capture_stderr=True,
)
p.Wait()
merge_branch = p.stdout.strip()
return merge_branch
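`_GetMergeBranch` shells out to two git commands: `git rev-parse --abbrev-ref HEAD` for the current branch and `git config --get branch.<name>.merge` for its upstream. A self-contained equivalent for readers who want to reproduce the check outside repo (the worktree path is an assumption):

import subprocess

def get_merge_branch(worktree, local_branch=None):
    """Standalone sketch of the two git calls used by _GetMergeBranch."""
    def _git(*args):
        return subprocess.run(
            ["git", *args], cwd=worktree, capture_output=True, text=True
        ).stdout.strip()

    if local_branch is None:
        # Name of the currently checked-out branch, e.g. "my-topic".
        local_branch = _git("rev-parse", "--abbrev-ref", "HEAD")
    # The upstream ref this branch merges to, e.g. "refs/heads/main".
    return _git("config", "--get", "branch.%s.merge" % local_branch)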
@ -524,18 +731,26 @@ Gerrit Code Review: https://www.gerritcodereview.com/
return (project, avail)
def Execute(self, opt, args):
projects = self.GetProjects(args)
projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
def _ProcessResults(_pool, _out, results):
pending = []
for result in results:
project, avail = result
if avail is None:
print('repo: error: %s: Unable to upload branch "%s". '
'You might be able to fix the branch by running:\n'
' git branch --set-upstream-to m/%s' %
(project.relpath, project.CurrentBranch, self.manifest.branch),
file=sys.stderr)
print(
'repo: error: %s: Unable to upload branch "%s". '
"You might be able to fix the branch by running:\n"
" git branch --set-upstream-to m/%s"
% (
project.RelPath(local=opt.this_manifest_only),
project.CurrentBranch,
project.manifest.branch,
),
file=sys.stderr,
)
elif avail:
pending.append(result)
return pending
@ -544,25 +759,50 @@ Gerrit Code Review: https://www.gerritcodereview.com/
opt.jobs,
functools.partial(self._GatherOne, opt),
projects,
callback=_ProcessResults)
callback=_ProcessResults,
)
if not pending:
if opt.branch is None:
print('repo: error: no branches ready for upload', file=sys.stderr)
print(
"repo: error: no branches ready for upload", file=sys.stderr
)
else:
print('repo: error: no branches named "%s" ready for upload' %
(opt.branch,), file=sys.stderr)
print(
'repo: error: no branches named "%s" ready for upload'
% (opt.branch,),
file=sys.stderr,
)
return 1
pending_proj_names = [project.name for (project, available) in pending]
pending_worktrees = [project.worktree for (project, available) in pending]
manifests = {
project.manifest.topdir: project.manifest
for (project, available) in pending
}
ret = 0
for manifest in manifests.values():
pending_proj_names = [
project.name
for (project, available) in pending
if project.manifest.topdir == manifest.topdir
]
pending_worktrees = [
project.worktree
for (project, available) in pending
if project.manifest.topdir == manifest.topdir
]
hook = RepoHook.FromSubcmd(
hook_type='pre-upload', manifest=self.manifest,
opt=opt, abort_if_user_denies=True)
hook_type="pre-upload",
manifest=manifest,
opt=opt,
abort_if_user_denies=True,
)
if not hook.Run(
project_list=pending_proj_names,
worktree_list=pending_worktrees):
return 1
project_list=pending_proj_names, worktree_list=pending_worktrees
):
ret = 1
if ret:
return ret
reviewers = _SplitEmails(opt.reviewers) if opt.reviewers else []
cc = _SplitEmails(opt.cc) if opt.cc else []


@ -33,34 +33,41 @@ class Version(Command, MirrorSafeCommand):
def Execute(self, opt, args):
rp = self.manifest.repoProject
rem = rp.GetRemote(rp.remote.name)
branch = rp.GetBranch('default')
rem = rp.GetRemote()
branch = rp.GetBranch("default")
# These might not be the same. Report them both.
src_ver = RepoSourceVersion()
rp_ver = rp.bare_git.describe(HEAD)
print('repo version %s' % rp_ver)
print(' (from %s)' % rem.url)
print(' (tracking %s)' % branch.merge)
print(' (%s)' % rp.bare_git.log('-1', '--format=%cD', HEAD))
print("repo version %s" % rp_ver)
print(" (from %s)" % rem.url)
print(" (tracking %s)" % branch.merge)
print(" (%s)" % rp.bare_git.log("-1", "--format=%cD", HEAD))
if self.wrapper_path is not None:
print('repo launcher version %s' % self.wrapper_version)
print(' (from %s)' % self.wrapper_path)
print("repo launcher version %s" % self.wrapper_version)
print(" (from %s)" % self.wrapper_path)
if src_ver != rp_ver:
print(' (currently at %s)' % src_ver)
print(" (currently at %s)" % src_ver)
print('repo User-Agent %s' % user_agent.repo)
print('git %s' % git.version_tuple().full)
print('git User-Agent %s' % user_agent.git)
print('Python %s' % sys.version)
print("repo User-Agent %s" % user_agent.repo)
print("git %s" % git.version_tuple().full)
print("git User-Agent %s" % user_agent.git)
print("Python %s" % sys.version)
uname = platform.uname()
if sys.version_info.major < 3:
# Python 3 returns a named tuple, but Python 2 is simpler.
print(uname)
else:
print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
print('CPU %s (%s)' %
(uname.machine, uname.processor if uname.processor else 'unknown'))
print('Bug reports:', Wrapper().BUG_URL)
print(
"OS %s %s (%s)" % (uname.system, uname.release, uname.version)
)
print(
"CPU %s (%s)"
% (
uname.machine,
uname.processor if uname.processor else "unknown",
)
)
print("Bug reports:", Wrapper().BUG_URL)

tests/conftest.py (new file, 25 lines)

@ -0,0 +1,25 @@
# Copyright 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Common fixtures for pytests."""
import pytest
import repo_trace
@pytest.fixture(autouse=True)
def disable_repo_trace(tmp_path):
"""Set an environment marker to relax certain strict checks for test code.""" # noqa: E501
repo_trace._TRACE_FILE = str(tmp_path / "TRACE_FILE_from_test")
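Because the fixture is autouse and function-scoped, every test in the suite gets a per-test trace file under pytest's tmp_path without naming the fixture. A hypothetical test module showing the effect (not part of the repo tests):

import repo_trace

def test_trace_file_is_redirected(tmp_path):
    # The autouse fixture has already pointed _TRACE_FILE under this test's
    # tmp_path, so tests never write to the user's real trace file.
    assert str(tmp_path) in repo_trace._TRACE_FILE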


@ -11,13 +11,3 @@
intk = 10k
intm = 10m
intg = 10g
[repo "syncstate.main"]
synctime = 2021-08-13T18:37:43.928600Z
version = 1
[repo "syncstate.sys"]
argv = ['/usr/bin/pytest-3']
[repo "syncstate.superproject"]
test = false
[repo "syncstate.options"]
verbose = true
mpupdate = false


@ -38,8 +38,8 @@ class GetEditor(EditorTestCase):
def test_basic(self):
"""Basic checking of _GetEditor."""
self.setEditor(':')
self.assertEqual(':', Editor._GetEditor())
self.setEditor(":")
self.assertEqual(":", Editor._GetEditor())
class EditString(EditorTestCase):
@ -47,10 +47,10 @@ class EditString(EditorTestCase):
def test_no_editor(self):
"""Check behavior when no editor is available."""
self.setEditor(':')
self.assertEqual('foo', Editor.EditString('foo'))
self.setEditor(":")
self.assertEqual("foo", Editor.EditString("foo"))
def test_cat_editor(self):
"""Check behavior when editor is `cat`."""
self.setEditor('cat')
self.assertEqual('foo', Editor.EditString('foo'))
self.setEditor("cat")
self.assertEqual("foo", Editor.EditString("foo"))


@ -47,7 +47,9 @@ class PickleTests(unittest.TestCase):
try:
newobj = pickle.loads(p)
except Exception as e: # pylint: disable=broad-except
self.fail('Class %s is unable to be pickled: %s\n'
'Incomplete super().__init__(...) call?' % (cls, e))
self.fail(
"Class %s is unable to be pickled: %s\n"
"Incomplete super().__init__(...) call?" % (cls, e)
)
self.assertIsInstance(newobj, cls)
self.assertEqual(str(obj), str(newobj))


@ -15,6 +15,7 @@
"""Unittests for the git_command.py module."""
import re
import os
import unittest
try:
@ -26,6 +27,44 @@ import git_command
import wrapper
class GitCommandTest(unittest.TestCase):
"""Tests the GitCommand class (via git_command.git)."""
def setUp(self):
def realpath_mock(val):
return val
mock.patch.object(
os.path, "realpath", side_effect=realpath_mock
).start()
def tearDown(self):
mock.patch.stopall()
def test_alternative_setting_when_matching(self):
r = git_command._build_env(
objdir=os.path.join("zap", "objects"), gitdir="zap"
)
self.assertIsNone(r.get("GIT_ALTERNATE_OBJECT_DIRECTORIES"))
self.assertEqual(
r.get("GIT_OBJECT_DIRECTORY"), os.path.join("zap", "objects")
)
def test_alternative_setting_when_different(self):
r = git_command._build_env(
objdir=os.path.join("wow", "objects"), gitdir="zap"
)
self.assertEqual(
r.get("GIT_ALTERNATE_OBJECT_DIRECTORIES"),
os.path.join("zap", "objects"),
)
self.assertEqual(
r.get("GIT_OBJECT_DIRECTORY"), os.path.join("wow", "objects")
)
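The two tests above pin down the environment that `_build_env` produces for object directories: GIT_OBJECT_DIRECTORY always points at objdir, and GIT_ALTERNATE_OBJECT_DIRECTORIES is only set when objdir is not the gitdir's own objects directory. A small sketch of that rule as asserted here (not repo's actual implementation):

import os

def build_object_env(objdir, gitdir):
    """Sketch of the behavior asserted by the tests above."""
    env = {"GIT_OBJECT_DIRECTORY": objdir}
    default_objdir = os.path.join(gitdir, "objects")
    if os.path.realpath(objdir) != os.path.realpath(default_objdir):
        # Objects live elsewhere, so keep gitdir's own objects reachable too.
        env["GIT_ALTERNATE_OBJECT_DIRECTORIES"] = default_objdir
    return env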
class GitCallUnitTest(unittest.TestCase):
"""Tests the _GitCall class (via git_command.git)."""
@ -35,8 +74,8 @@ class GitCallUnitTest(unittest.TestCase):
self.assertIsNotNone(ver)
# We don't dive too deep into the values here to avoid having to update
# whenever git versions change. We do check relative to this min version
# as this is what `repo` itself requires via MIN_GIT_VERSION.
# whenever git versions change. We do check relative to this min
# version as this is what `repo` itself requires via MIN_GIT_VERSION.
MIN_GIT_VERSION = (2, 10, 2)
self.assertTrue(isinstance(ver.major, int))
self.assertTrue(isinstance(ver.minor, int))
@ -49,7 +88,7 @@ class GitCallUnitTest(unittest.TestCase):
self.assertGreaterEqual(ver, MIN_GIT_VERSION)
self.assertLess(ver, (9999, 9999, 9999))
self.assertNotEqual('', ver.full)
self.assertNotEqual("", ver.full)
class UserAgentUnitTest(unittest.TestCase):
@ -58,25 +97,25 @@ class UserAgentUnitTest(unittest.TestCase):
def test_smoke_os(self):
"""Make sure UA OS setting returns something useful."""
os_name = git_command.user_agent.os
# We can't dive too deep because of OS/tool differences, but we can check
# the general form.
m = re.match(r'^[^ ]+$', os_name)
# We can't dive too deep because of OS/tool differences, but we can
# check the general form.
m = re.match(r"^[^ ]+$", os_name)
self.assertIsNotNone(m)
def test_smoke_repo(self):
"""Make sure repo UA returns something useful."""
ua = git_command.user_agent.repo
# We can't dive too deep because of OS/tool differences, but we can check
# the general form.
m = re.match(r'^git-repo/[^ ]+ ([^ ]+) git/[^ ]+ Python/[0-9.]+', ua)
# We can't dive too deep because of OS/tool differences, but we can
# check the general form.
m = re.match(r"^git-repo/[^ ]+ ([^ ]+) git/[^ ]+ Python/[0-9.]+", ua)
self.assertIsNotNone(m)
def test_smoke_git(self):
"""Make sure git UA returns something useful."""
ua = git_command.user_agent.git
# We can't dive too deep because of OS/tool differences, but we can check
# the general form.
m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua)
# We can't dive too deep because of OS/tool differences, but we can
# check the general form.
m = re.match(r"^git/[^ ]+ ([^ ]+) git-repo/[^ ]+", ua)
self.assertIsNotNone(m)
@ -84,8 +123,11 @@ class GitRequireTests(unittest.TestCase):
"""Test the git_require helper."""
def setUp(self):
ver = wrapper.GitVersion(1, 2, 3, 4)
mock.patch.object(git_command.git, 'version_tuple', return_value=ver).start()
self.wrapper = wrapper.Wrapper()
ver = self.wrapper.GitVersion(1, 2, 3, 4)
mock.patch.object(
git_command.git, "version_tuple", return_value=ver
).start()
def tearDown(self):
mock.patch.stopall()
@ -118,5 +160,5 @@ class GitRequireTests(unittest.TestCase):
def test_older_fatal_msg(self):
"""Test fatal require calls with old versions and message."""
with self.assertRaises(SystemExit) as e:
git_command.git_require((2,), fail=True, msg='so sad')
git_command.git_require((2,), fail=True, msg="so sad")
self.assertNotEqual(0, e.code)


@ -22,18 +22,16 @@ import git_config
def fixture(*paths):
"""Return a path relative to test/fixtures.
"""
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
"""Return a path relative to test/fixtures."""
return os.path.join(os.path.dirname(__file__), "fixtures", *paths)
class GitConfigReadOnlyTests(unittest.TestCase):
"""Read-only tests of the GitConfig class."""
def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture.
"""
config_fixture = fixture('test.gitconfig')
"""Create a GitConfig object using the test.gitconfig fixture."""
config_fixture = fixture("test.gitconfig")
self.config = git_config.GitConfig(config_fixture)
def test_GetString_with_empty_config_values(self):
@ -44,7 +42,7 @@ class GitConfigReadOnlyTests(unittest.TestCase):
empty
"""
val = self.config.GetString('section.empty')
val = self.config.GetString("section.empty")
self.assertEqual(val, None)
def test_GetString_with_true_value(self):
@ -55,73 +53,54 @@ class GitConfigReadOnlyTests(unittest.TestCase):
nonempty = true
"""
val = self.config.GetString('section.nonempty')
self.assertEqual(val, 'true')
val = self.config.GetString("section.nonempty")
self.assertEqual(val, "true")
def test_GetString_from_missing_file(self):
"""
Test missing config file
"""
config_fixture = fixture('not.present.gitconfig')
config_fixture = fixture("not.present.gitconfig")
config = git_config.GitConfig(config_fixture)
val = config.GetString('empty')
val = config.GetString("empty")
self.assertEqual(val, None)
def test_GetBoolean_undefined(self):
"""Test GetBoolean on key that doesn't exist."""
self.assertIsNone(self.config.GetBoolean('section.missing'))
self.assertIsNone(self.config.GetBoolean("section.missing"))
def test_GetBoolean_invalid(self):
"""Test GetBoolean on invalid boolean value."""
self.assertIsNone(self.config.GetBoolean('section.boolinvalid'))
self.assertIsNone(self.config.GetBoolean("section.boolinvalid"))
def test_GetBoolean_true(self):
"""Test GetBoolean on valid true boolean."""
self.assertTrue(self.config.GetBoolean('section.booltrue'))
self.assertTrue(self.config.GetBoolean("section.booltrue"))
def test_GetBoolean_false(self):
"""Test GetBoolean on valid false boolean."""
self.assertFalse(self.config.GetBoolean('section.boolfalse'))
self.assertFalse(self.config.GetBoolean("section.boolfalse"))
def test_GetInt_undefined(self):
"""Test GetInt on key that doesn't exist."""
self.assertIsNone(self.config.GetInt('section.missing'))
self.assertIsNone(self.config.GetInt("section.missing"))
def test_GetInt_invalid(self):
"""Test GetInt on invalid integer value."""
self.assertIsNone(self.config.GetBoolean('section.intinvalid'))
self.assertIsNone(self.config.GetBoolean("section.intinvalid"))
def test_GetInt_valid(self):
"""Test GetInt on valid integers."""
TESTS = (
('inthex', 16),
('inthexk', 16384),
('int', 10),
('intk', 10240),
('intm', 10485760),
('intg', 10737418240),
("inthex", 16),
("inthexk", 16384),
("int", 10),
("intk", 10240),
("intm", 10485760),
("intg", 10737418240),
)
for key, value in TESTS:
self.assertEqual(value, self.config.GetInt('section.%s' % (key,)))
def test_GetSyncAnalysisStateData(self):
"""Test config entries with a sync state analysis data."""
superproject_logging_data = {}
superproject_logging_data['test'] = False
options = type('options', (object,), {})()
options.verbose = 'true'
options.mp_update = 'false'
TESTS = (
('superproject.test', 'false'),
('options.verbose', 'true'),
('options.mpupdate', 'false'),
('main.version', '1'),
)
self.config.UpdateSyncAnalysisState(options, superproject_logging_data)
sync_data = self.config.GetSyncAnalysisStateData()
for key, value in TESTS:
self.assertEqual(sync_data[f'{git_config.SYNC_STATE_PREFIX}{key}'], value)
self.assertTrue(sync_data[f'{git_config.SYNC_STATE_PREFIX}main.synctime'])
self.assertEqual(value, self.config.GetInt("section.%s" % (key,)))
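The expected values above document the integer forms GetInt accepts: hex literals plus k/m/g suffixes that scale by powers of 1024. A sketch of that parsing rule, offered as an illustration rather than repo's GetInt implementation:

def parse_git_int(value):
    """Sketch of the integer forms the tests above expect GetInt to accept."""
    value = value.strip().lower()
    mult = 1
    if value and value[-1] in "kmg":
        mult = {"k": 1024, "m": 1024**2, "g": 1024**3}[value[-1]]
        value = value[:-1]
    # int(..., 0) accepts both decimal ("10") and hex ("0x10") literals.
    return int(value, 0) * mult

assert parse_git_int("10k") == 10240
assert parse_git_int("0x10k") == 16384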
class GitConfigReadWriteTests(unittest.TestCase):
@ -138,55 +117,74 @@ class GitConfigReadWriteTests(unittest.TestCase):
def test_SetString(self):
"""Test SetString behavior."""
# Set a value.
self.assertIsNone(self.config.GetString('foo.bar'))
self.config.SetString('foo.bar', 'val')
self.assertEqual('val', self.config.GetString('foo.bar'))
self.assertIsNone(self.config.GetString("foo.bar"))
self.config.SetString("foo.bar", "val")
self.assertEqual("val", self.config.GetString("foo.bar"))
# Make sure the value was actually written out.
config = self.get_config()
self.assertEqual('val', config.GetString('foo.bar'))
self.assertEqual("val", config.GetString("foo.bar"))
# Update the value.
self.config.SetString('foo.bar', 'valll')
self.assertEqual('valll', self.config.GetString('foo.bar'))
self.config.SetString("foo.bar", "valll")
self.assertEqual("valll", self.config.GetString("foo.bar"))
config = self.get_config()
self.assertEqual('valll', config.GetString('foo.bar'))
self.assertEqual("valll", config.GetString("foo.bar"))
# Delete the value.
self.config.SetString('foo.bar', None)
self.assertIsNone(self.config.GetString('foo.bar'))
self.config.SetString("foo.bar", None)
self.assertIsNone(self.config.GetString("foo.bar"))
config = self.get_config()
self.assertIsNone(config.GetString('foo.bar'))
self.assertIsNone(config.GetString("foo.bar"))
def test_SetBoolean(self):
"""Test SetBoolean behavior."""
# Set a true value.
self.assertIsNone(self.config.GetBoolean('foo.bar'))
self.assertIsNone(self.config.GetBoolean("foo.bar"))
for val in (True, 1):
self.config.SetBoolean('foo.bar', val)
self.assertTrue(self.config.GetBoolean('foo.bar'))
self.config.SetBoolean("foo.bar", val)
self.assertTrue(self.config.GetBoolean("foo.bar"))
# Make sure the value was actually written out.
config = self.get_config()
self.assertTrue(config.GetBoolean('foo.bar'))
self.assertEqual('true', config.GetString('foo.bar'))
self.assertTrue(config.GetBoolean("foo.bar"))
self.assertEqual("true", config.GetString("foo.bar"))
# Set a false value.
for val in (False, 0):
self.config.SetBoolean('foo.bar', val)
self.assertFalse(self.config.GetBoolean('foo.bar'))
self.config.SetBoolean("foo.bar", val)
self.assertFalse(self.config.GetBoolean("foo.bar"))
# Make sure the value was actually written out.
config = self.get_config()
self.assertFalse(config.GetBoolean('foo.bar'))
self.assertEqual('false', config.GetString('foo.bar'))
self.assertFalse(config.GetBoolean("foo.bar"))
self.assertEqual("false", config.GetString("foo.bar"))
# Delete the value.
self.config.SetBoolean('foo.bar', None)
self.assertIsNone(self.config.GetBoolean('foo.bar'))
self.config.SetBoolean("foo.bar", None)
self.assertIsNone(self.config.GetBoolean("foo.bar"))
config = self.get_config()
self.assertIsNone(config.GetBoolean('foo.bar'))
self.assertIsNone(config.GetBoolean("foo.bar"))
if __name__ == '__main__':
unittest.main()
def test_GetSyncAnalysisStateData(self):
"""Test config entries with a sync state analysis data."""
superproject_logging_data = {}
superproject_logging_data["test"] = False
options = type("options", (object,), {})()
options.verbose = "true"
options.mp_update = "false"
TESTS = (
("superproject.test", "false"),
("options.verbose", "true"),
("options.mpupdate", "false"),
("main.version", "1"),
)
self.config.UpdateSyncAnalysisState(options, superproject_logging_data)
sync_data = self.config.GetSyncAnalysisStateData()
for key, value in TESTS:
self.assertEqual(
sync_data[f"{git_config.SYNC_STATE_PREFIX}{key}"], value
)
self.assertTrue(
sync_data[f"{git_config.SYNC_STATE_PREFIX}main.synctime"]
)


@ -24,24 +24,25 @@ from unittest import mock
import git_superproject
import git_trace2_event_log
import manifest_xml
import platform_utils
from test_manifest_xml import sort_attributes
class SuperprojectTestCase(unittest.TestCase):
"""TestCase for the Superproject module."""
PARENT_SID_KEY = 'GIT_TRACE2_PARENT_SID'
PARENT_SID_VALUE = 'parent_sid'
SELF_SID_REGEX = r'repo-\d+T\d+Z-.*'
FULL_SID_REGEX = r'^%s/%s' % (PARENT_SID_VALUE, SELF_SID_REGEX)
PARENT_SID_KEY = "GIT_TRACE2_PARENT_SID"
PARENT_SID_VALUE = "parent_sid"
SELF_SID_REGEX = r"repo-\d+T\d+Z-.*"
FULL_SID_REGEX = r"^%s/%s" % (PARENT_SID_VALUE, SELF_SID_REGEX)
def setUp(self):
"""Set up superproject every time."""
self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
self.repodir = os.path.join(self.tempdir, '.repo')
self.tempdirobj = tempfile.TemporaryDirectory(prefix="repo_tests")
self.tempdir = self.tempdirobj.name
self.repodir = os.path.join(self.tempdir, ".repo")
self.manifest_file = os.path.join(
self.repodir, manifest_xml.MANIFEST_FILE_NAME)
self.repodir, manifest_xml.MANIFEST_FILE_NAME
)
os.mkdir(self.repodir)
self.platform = platform.system().lower()
@ -53,53 +54,65 @@ class SuperprojectTestCase(unittest.TestCase):
self.git_event_log = git_trace2_event_log.EventLog(env=env)
# The manifest parsing really wants a git repo currently.
gitdir = os.path.join(self.repodir, 'manifests.git')
gitdir = os.path.join(self.repodir, "manifests.git")
os.mkdir(gitdir)
with open(os.path.join(gitdir, 'config'), 'w') as fp:
fp.write("""[remote "origin"]
with open(os.path.join(gitdir, "config"), "w") as fp:
fp.write(
"""[remote "origin"]
url = https://localhost:0/manifest
""")
"""
)
manifest = self.getXmlManifest("""
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
<project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
<project path="art" name="platform/art" groups="notdefault,platform-"""
+ self.platform
+ """
" /></manifest>
""")
self._superproject = git_superproject.Superproject(manifest, self.repodir,
self.git_event_log)
"""
)
self._superproject = git_superproject.Superproject(
manifest,
name="superproject",
remote=manifest.remotes.get("default-remote").ToRemoteSpec(
"superproject"
),
revision="refs/heads/main",
)
def tearDown(self):
"""Tear down superproject every time."""
platform_utils.rmtree(self.tempdir)
self.tempdirobj.cleanup()
def getXmlManifest(self, data):
"""Helper to initialize a manifest for testing."""
with open(self.manifest_file, 'w') as fp:
with open(self.manifest_file, "w") as fp:
fp.write(data)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
def verifyCommonKeys(self, log_entry, expected_event_name, full_sid=True):
"""Helper function to verify common event log keys."""
self.assertIn('event', log_entry)
self.assertIn('sid', log_entry)
self.assertIn('thread', log_entry)
self.assertIn('time', log_entry)
self.assertIn("event", log_entry)
self.assertIn("sid", log_entry)
self.assertIn("thread", log_entry)
self.assertIn("time", log_entry)
# Do basic data format validation.
self.assertEqual(expected_event_name, log_entry['event'])
self.assertEqual(expected_event_name, log_entry["event"])
if full_sid:
self.assertRegex(log_entry['sid'], self.FULL_SID_REGEX)
self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
else:
self.assertRegex(log_entry['sid'], self.SELF_SID_REGEX)
self.assertRegex(log_entry['time'], r'^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$')
self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
self.assertRegex(log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$")
def readLog(self, log_path):
"""Helper function to read log data into a list."""
log_data = []
with open(log_path, mode='rb') as f:
with open(log_path, mode="rb") as f:
for line in f:
log_data.append(json.loads(line))
return log_data
@ -107,105 +120,133 @@ class SuperprojectTestCase(unittest.TestCase):
def verifyErrorEvent(self):
"""Helper to verify that error event is written."""
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self.git_event_log.Write(path=tempdir)
self.log_data = self.readLog(log_path)
self.assertEqual(len(self.log_data), 2)
error_event = self.log_data[1]
self.verifyCommonKeys(self.log_data[0], expected_event_name='version')
self.verifyCommonKeys(error_event, expected_event_name='error')
self.verifyCommonKeys(self.log_data[0], expected_event_name="version")
self.verifyCommonKeys(error_event, expected_event_name="error")
# Check for 'error' event specific fields.
self.assertIn('msg', error_event)
self.assertIn('fmt', error_event)
self.assertIn("msg", error_event)
self.assertIn("fmt", error_event)
def test_superproject_get_superproject_no_superproject(self):
"""Test with no url."""
manifest = self.getXmlManifest("""
manifest = self.getXmlManifest(
"""
<manifest>
</manifest>
""")
superproject = git_superproject.Superproject(manifest, self.repodir, self.git_event_log)
# Test that exit condition is false when there is no superproject tag.
sync_result = superproject.Sync()
self.assertFalse(sync_result.success)
self.assertFalse(sync_result.fatal)
self.verifyErrorEvent()
"""
)
self.assertIsNone(manifest.superproject)
def test_superproject_get_superproject_invalid_url(self):
"""Test with an invalid url."""
manifest = self.getXmlManifest("""
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="test-remote" fetch="localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
</manifest>
""")
superproject = git_superproject.Superproject(manifest, self.repodir, self.git_event_log)
sync_result = superproject.Sync()
"""
)
superproject = git_superproject.Superproject(
manifest,
name="superproject",
remote=manifest.remotes.get("test-remote").ToRemoteSpec(
"superproject"
),
revision="refs/heads/main",
)
sync_result = superproject.Sync(self.git_event_log)
self.assertFalse(sync_result.success)
self.assertTrue(sync_result.fatal)
def test_superproject_get_superproject_invalid_branch(self):
"""Test with an invalid branch."""
manifest = self.getXmlManifest("""
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="test-remote" fetch="localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
</manifest>
""")
self._superproject = git_superproject.Superproject(manifest, self.repodir,
self.git_event_log)
with mock.patch.object(self._superproject, '_GetBranch', return_value='junk'):
sync_result = self._superproject.Sync()
"""
)
self._superproject = git_superproject.Superproject(
manifest,
name="superproject",
remote=manifest.remotes.get("test-remote").ToRemoteSpec(
"superproject"
),
revision="refs/heads/main",
)
with mock.patch.object(self._superproject, "_branch", "junk"):
sync_result = self._superproject.Sync(self.git_event_log)
self.assertFalse(sync_result.success)
self.assertTrue(sync_result.fatal)
self.verifyErrorEvent()
def test_superproject_get_superproject_mock_init(self):
"""Test with _Init failing."""
with mock.patch.object(self._superproject, '_Init', return_value=False):
sync_result = self._superproject.Sync()
with mock.patch.object(self._superproject, "_Init", return_value=False):
sync_result = self._superproject.Sync(self.git_event_log)
self.assertFalse(sync_result.success)
self.assertTrue(sync_result.fatal)
def test_superproject_get_superproject_mock_fetch(self):
"""Test with _Fetch failing."""
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, "_Init", return_value=True):
os.mkdir(self._superproject._superproject_path)
with mock.patch.object(self._superproject, '_Fetch', return_value=False):
sync_result = self._superproject.Sync()
with mock.patch.object(
self._superproject, "_Fetch", return_value=False
):
sync_result = self._superproject.Sync(self.git_event_log)
self.assertFalse(sync_result.success)
self.assertTrue(sync_result.fatal)
def test_superproject_get_all_project_commit_ids_mock_ls_tree(self):
"""Test with LsTree being a mock."""
data = ('120000 blob 158258bdf146f159218e2b90f8b699c4d85b5804\tAndroid.bp\x00'
'160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
'160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00'
'120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00'
'160000 commit ade9b7a0d874e25fff4bf2552488825c6f111928\tbuild/bazel\x00')
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, '_Fetch', return_value=True):
with mock.patch.object(self._superproject, '_LsTree', return_value=data):
commit_ids_result = self._superproject._GetAllProjectsCommitIds()
self.assertEqual(commit_ids_result.commit_ids, {
'art': '2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea',
'bootable/recovery': 'e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06',
'build/bazel': 'ade9b7a0d874e25fff4bf2552488825c6f111928'
})
data = (
"120000 blob 158258bdf146f159218e2b90f8b699c4d85b5804\tAndroid.bp\x00"
"160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
"160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00"
"120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00"
"160000 commit ade9b7a0d874e25fff4bf2552488825c6f111928\tbuild/bazel\x00"
)
with mock.patch.object(self._superproject, "_Init", return_value=True):
with mock.patch.object(
self._superproject, "_Fetch", return_value=True
):
with mock.patch.object(
self._superproject, "_LsTree", return_value=data
):
commit_ids_result = (
self._superproject._GetAllProjectsCommitIds()
)
self.assertEqual(
commit_ids_result.commit_ids,
{
"art": "2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea",
"bootable/recovery": "e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06",
"build/bazel": "ade9b7a0d874e25fff4bf2552488825c6f111928",
},
)
self.assertFalse(commit_ids_result.fatal)
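Reviewer note: the ls-tree data above is NUL-delimited "<mode> <type> <sha>\t<path>" records, and the expected dict keeps only the gitlink ("commit") entries. A minimal parsing sketch, written for this review and not taken from git_superproject itself:

def parse_gitlink_commit_ids(data):
    """Map path -> commit id for gitlink entries in `ls-tree -z` output."""
    commit_ids = {}
    for record in data.split("\x00"):
        if not record:
            continue
        info, _, path = record.partition("\t")
        _mode, obj_type, obj_id = info.split()
        if obj_type == "commit":  # skip 120000 blob entries like Android.bp
            commit_ids[path] = obj_id
    return commit_ids

assert parse_gitlink_commit_ids(
    "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
) == {"art": "2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea"}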
def test_superproject_write_manifest_file(self):
"""Test with writing manifest to a file after setting revisionId."""
self.assertEqual(len(self._superproject._manifest.projects), 1)
project = self._superproject._manifest.projects[0]
project.SetRevisionId('ABCDEF')
project.SetRevisionId("ABCDEF")
# Create temporary directory so that it can write the file.
os.mkdir(self._superproject._superproject_path)
manifest_path = self._superproject._WriteManifestFile()
self.assertIsNotNone(manifest_path)
with open(manifest_path, 'r') as fp:
with open(manifest_path, "r") as fp:
manifest_xml_data = fp.read()
self.assertEqual(
sort_attributes(manifest_xml_data),
@ -215,117 +256,141 @@ class SuperprojectTestCase(unittest.TestCase):
'<project groups="notdefault,platform-' + self.platform + '" '
'name="platform/art" path="art" revision="ABCDEF" upstream="refs/heads/main"/>'
'<superproject name="superproject"/>'
'</manifest>')
"</manifest>",
)
def test_superproject_update_project_revision_id(self):
"""Test with LsTree being a mock."""
self.assertEqual(len(self._superproject._manifest.projects), 1)
projects = self._superproject._manifest.projects
data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
'160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00')
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, '_Fetch', return_value=True):
with mock.patch.object(self._superproject,
'_LsTree',
return_value=data):
data = (
"160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
"160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00"
)
with mock.patch.object(self._superproject, "_Init", return_value=True):
with mock.patch.object(
self._superproject, "_Fetch", return_value=True
):
with mock.patch.object(
self._superproject, "_LsTree", return_value=data
):
# Create temporary directory so that it can write the file.
os.mkdir(self._superproject._superproject_path)
update_result = self._superproject.UpdateProjectsRevisionId(projects)
update_result = self._superproject.UpdateProjectsRevisionId(
projects, self.git_event_log
)
self.assertIsNotNone(update_result.manifest_path)
self.assertFalse(update_result.fatal)
with open(update_result.manifest_path, 'r') as fp:
with open(update_result.manifest_path, "r") as fp:
manifest_xml_data = fp.read()
self.assertEqual(
sort_attributes(manifest_xml_data),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project groups="notdefault,platform-' + self.platform + '" '
'<project groups="notdefault,platform-'
+ self.platform
+ '" '
'name="platform/art" path="art" '
'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
'<superproject name="superproject"/>'
'</manifest>')
"</manifest>",
)
def test_superproject_update_project_revision_id_no_superproject_tag(self):
"""Test update of commit ids of a manifest without superproject tag."""
manifest = self.getXmlManifest("""
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="test-name"/>
</manifest>
""")
"""
)
self.maxDiff = None
self._superproject = git_superproject.Superproject(manifest, self.repodir,
self.git_event_log)
self.assertEqual(len(self._superproject._manifest.projects), 1)
projects = self._superproject._manifest.projects
project = projects[0]
project.SetRevisionId('ABCDEF')
update_result = self._superproject.UpdateProjectsRevisionId(projects)
self.assertIsNone(update_result.manifest_path)
self.assertFalse(update_result.fatal)
self.verifyErrorEvent()
self.assertIsNone(manifest.superproject)
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="test-name" revision="ABCDEF" upstream="refs/heads/main"/>'
'</manifest>')
'<project name="test-name"/>'
"</manifest>",
)
def test_superproject_update_project_revision_id_from_local_manifest_group(self):
def test_superproject_update_project_revision_id_from_local_manifest_group(
self,
):
"""Test update of commit ids of a manifest that have local manifest no superproject group."""
local_group = manifest_xml.LOCAL_MANIFEST_GROUP_PREFIX + ':local'
manifest = self.getXmlManifest("""
local_group = manifest_xml.LOCAL_MANIFEST_GROUP_PREFIX + ":local"
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<remote name="goog" fetch="http://localhost2" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
<project path="vendor/x" name="platform/vendor/x" remote="goog"
groups=\"""" + local_group + """
groups=\""""
+ local_group
+ """
" revision="master-with-vendor" clone-depth="1" />
<project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
<project path="art" name="platform/art" groups="notdefault,platform-"""
+ self.platform
+ """
" /></manifest>
""")
"""
)
self.maxDiff = None
self._superproject = git_superproject.Superproject(manifest, self.repodir,
self.git_event_log)
self._superproject = git_superproject.Superproject(
manifest,
name="superproject",
remote=manifest.remotes.get("default-remote").ToRemoteSpec(
"superproject"
),
revision="refs/heads/main",
)
self.assertEqual(len(self._superproject._manifest.projects), 2)
projects = self._superproject._manifest.projects
data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00')
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, '_Fetch', return_value=True):
with mock.patch.object(self._superproject,
'_LsTree',
return_value=data):
data = "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
with mock.patch.object(self._superproject, "_Init", return_value=True):
with mock.patch.object(
self._superproject, "_Fetch", return_value=True
):
with mock.patch.object(
self._superproject, "_LsTree", return_value=data
):
# Create temporary directory so that it can write the file.
os.mkdir(self._superproject._superproject_path)
update_result = self._superproject.UpdateProjectsRevisionId(projects)
update_result = self._superproject.UpdateProjectsRevisionId(
projects, self.git_event_log
)
self.assertIsNotNone(update_result.manifest_path)
self.assertFalse(update_result.fatal)
with open(update_result.manifest_path, 'r') as fp:
with open(update_result.manifest_path, "r") as fp:
manifest_xml_data = fp.read()
# Verify platform/vendor/x's project revision hasn't changed.
# Verify platform/vendor/x's project revision hasn't
# changed.
self.assertEqual(
sort_attributes(manifest_xml_data),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<remote fetch="http://localhost2" name="goog"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project groups="notdefault,platform-' + self.platform + '" '
'<project groups="notdefault,platform-'
+ self.platform
+ '" '
'name="platform/art" path="art" '
'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
'<project clone-depth="1" groups="' + local_group + '" '
'name="platform/vendor/x" path="vendor/x" remote="goog" '
'revision="master-with-vendor"/>'
'<superproject name="superproject"/>'
'</manifest>')
"</manifest>",
)
def test_superproject_update_project_revision_id_with_pinned_manifest(self):
"""Test update of commit ids of a pinned manifest."""
manifest = self.getXmlManifest("""
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
@ -333,35 +398,53 @@ class SuperprojectTestCase(unittest.TestCase):
<project path="vendor/x" name="platform/vendor/x" revision="" />
<project path="vendor/y" name="platform/vendor/y"
revision="52d3c9f7c107839ece2319d077de0cd922aa9d8f" />
<project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
<project path="art" name="platform/art" groups="notdefault,platform-"""
+ self.platform
+ """
" /></manifest>
""")
"""
)
self.maxDiff = None
self._superproject = git_superproject.Superproject(manifest, self.repodir,
self.git_event_log)
self._superproject = git_superproject.Superproject(
manifest,
name="superproject",
remote=manifest.remotes.get("default-remote").ToRemoteSpec(
"superproject"
),
revision="refs/heads/main",
)
self.assertEqual(len(self._superproject._manifest.projects), 3)
projects = self._superproject._manifest.projects
data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
'160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tvendor/x\x00')
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, '_Fetch', return_value=True):
with mock.patch.object(self._superproject,
'_LsTree',
return_value=data):
data = (
"160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
"160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tvendor/x\x00"
)
with mock.patch.object(self._superproject, "_Init", return_value=True):
with mock.patch.object(
self._superproject, "_Fetch", return_value=True
):
with mock.patch.object(
self._superproject, "_LsTree", return_value=data
):
# Create temporary directory so that it can write the file.
os.mkdir(self._superproject._superproject_path)
update_result = self._superproject.UpdateProjectsRevisionId(projects)
update_result = self._superproject.UpdateProjectsRevisionId(
projects, self.git_event_log
)
self.assertIsNotNone(update_result.manifest_path)
self.assertFalse(update_result.fatal)
with open(update_result.manifest_path, 'r') as fp:
with open(update_result.manifest_path, "r") as fp:
manifest_xml_data = fp.read()
# Verify platform/vendor/x's project revision hasn't changed.
# Verify platform/vendor/x's project revision hasn't
# changed.
self.assertEqual(
sort_attributes(manifest_xml_data),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project groups="notdefault,platform-' + self.platform + '" '
'<project groups="notdefault,platform-'
+ self.platform
+ '" '
'name="platform/art" path="art" '
'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
'<project name="platform/vendor/x" path="vendor/x" '
@ -369,8 +452,78 @@ class SuperprojectTestCase(unittest.TestCase):
'<project name="platform/vendor/y" path="vendor/y" '
'revision="52d3c9f7c107839ece2319d077de0cd922aa9d8f"/>'
'<superproject name="superproject"/>'
'</manifest>')
"</manifest>",
)
def test_Fetch(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
" /></manifest>
"""
)
self.maxDiff = None
self._superproject = git_superproject.Superproject(
manifest,
name="superproject",
remote=manifest.remotes.get("default-remote").ToRemoteSpec(
"superproject"
),
revision="refs/heads/main",
)
os.mkdir(self._superproject._superproject_path)
os.mkdir(self._superproject._work_git)
with mock.patch.object(self._superproject, "_Init", return_value=True):
with mock.patch(
"git_superproject.GitCommand", autospec=True
) as mock_git_command:
with mock.patch(
"git_superproject.GitRefs.get", autospec=True
) as mock_git_refs:
instance = mock_git_command.return_value
instance.Wait.return_value = 0
mock_git_refs.side_effect = ["", "1234"]
if __name__ == '__main__':
unittest.main()
self.assertTrue(self._superproject._Fetch())
self.assertEqual(
mock_git_command.call_args.args,
(
None,
[
"fetch",
"http://localhost/superproject",
"--depth",
"1",
"--force",
"--no-tags",
"--filter",
"blob:none",
"refs/heads/main:refs/heads/main",
],
),
)
# If branch for revision exists, set as --negotiation-tip.
self.assertTrue(self._superproject._Fetch())
self.assertEqual(
mock_git_command.call_args.args,
(
None,
[
"fetch",
"http://localhost/superproject",
"--depth",
"1",
"--force",
"--no-tags",
"--filter",
"blob:none",
"--negotiation-tip",
"1234",
"refs/heads/main:refs/heads/main",
],
),
)
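The two assertions above pin down the exact argument list handed to GitCommand, including the --negotiation-tip added once a local branch tip exists. A hedged sketch of how such a list can be assembled (illustrative helper name, not the actual git_superproject code):

def build_superproject_fetch_args(url, revision, negotiation_tip=None):
    args = [
        "fetch", url,
        "--depth", "1",
        "--force",
        "--no-tags",
        "--filter", "blob:none",
    ]
    if negotiation_tip:
        # Let the server use the previously fetched tip to send a smaller pack.
        args += ["--negotiation-tip", negotiation_tip]
    args.append(f"{revision}:{revision}")
    return args

# Matches the tail of the second expected call above.
assert build_superproject_fetch_args(
    "http://localhost/superproject", "refs/heads/main", "1234"
)[-3:] == ["--negotiation-tip", "1234", "refs/heads/main:refs/heads/main"]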

View File

@ -16,20 +16,52 @@
import json
import os
import socket
import tempfile
import threading
import unittest
from unittest import mock
import git_trace2_event_log
import platform_utils
def serverLoggingThread(socket_path, server_ready, received_traces):
"""Helper function to receive logs over a Unix domain socket.
Receives messages on the provided socket and appends them to
received_traces.
Args:
socket_path: path to a Unix domain socket on which to listen for traces
server_ready: a threading.Condition used to signal to the caller that
this thread is ready to accept connections
received_traces: a list to which received traces will be appended (after
decoding to a utf-8 string).
"""
platform_utils.remove(socket_path, missing_ok=True)
data = b""
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
sock.bind(socket_path)
sock.listen(0)
with server_ready:
server_ready.notify()
with sock.accept()[0] as conn:
while True:
recved = conn.recv(4096)
if not recved:
break
data += recved
received_traces.extend(data.decode("utf-8").splitlines())
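serverLoggingThread is the receiving side used by test_write_socket further down. A minimal client-side sketch of what an "af_unix:" Write is expected to do, namely connect and send newline-delimited JSON (illustrative function name, not the EventLog implementation):

import json
import socket

def send_traces(socket_path, events):
    """Connect to an AF_UNIX stream socket and send one JSON object per
    line, which is the framing serverLoggingThread splits on."""
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
        sock.connect(socket_path)
        for event in events:
            sock.sendall(json.dumps(event).encode("utf-8") + b"\n")

# Example (assumes a listener is already bound at the given path):
# send_traces("/tmp/server.sock", [{"event": "version"}, {"event": "start"}])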
class EventLogTestCase(unittest.TestCase):
"""TestCase for the EventLog module."""
PARENT_SID_KEY = 'GIT_TRACE2_PARENT_SID'
PARENT_SID_VALUE = 'parent_sid'
SELF_SID_REGEX = r'repo-\d+T\d+Z-.*'
FULL_SID_REGEX = r'^%s/%s' % (PARENT_SID_VALUE, SELF_SID_REGEX)
PARENT_SID_KEY = "GIT_TRACE2_PARENT_SID"
PARENT_SID_VALUE = "parent_sid"
SELF_SID_REGEX = r"repo-\d+T\d+Z-.*"
FULL_SID_REGEX = r"^%s/%s" % (PARENT_SID_VALUE, SELF_SID_REGEX)
def setUp(self):
"""Load the event_log module every time."""
@ -42,29 +74,40 @@ class EventLogTestCase(unittest.TestCase):
self._event_log_module = git_trace2_event_log.EventLog(env=env)
self._log_data = None
def verifyCommonKeys(self, log_entry, expected_event_name, full_sid=True):
def verifyCommonKeys(
self, log_entry, expected_event_name=None, full_sid=True
):
"""Helper function to verify common event log keys."""
self.assertIn('event', log_entry)
self.assertIn('sid', log_entry)
self.assertIn('thread', log_entry)
self.assertIn('time', log_entry)
self.assertIn("event", log_entry)
self.assertIn("sid", log_entry)
self.assertIn("thread", log_entry)
self.assertIn("time", log_entry)
# Do basic data format validation.
self.assertEqual(expected_event_name, log_entry['event'])
if expected_event_name:
self.assertEqual(expected_event_name, log_entry["event"])
if full_sid:
self.assertRegex(log_entry['sid'], self.FULL_SID_REGEX)
self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
else:
self.assertRegex(log_entry['sid'], self.SELF_SID_REGEX)
self.assertRegex(log_entry['time'], r'^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$')
self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
self.assertRegex(log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$")
def readLog(self, log_path):
"""Helper function to read log data into a list."""
log_data = []
with open(log_path, mode='rb') as f:
with open(log_path, mode="rb") as f:
for line in f:
log_data.append(json.loads(line))
return log_data
def remove_prefix(self, s, prefix):
"""Return a copy string after removing |prefix| from |s|, if present or
the original string."""
if s.startswith(prefix):
return s[len(prefix) :]
else:
return s
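On Python 3.9 and newer, str.removeprefix does the same thing; the helper above presumably exists so the tests also run on older interpreters. For example:

s = "prefix/repo.partialclone"
assert s.removeprefix("prefix/") == "repo.partialclone"
assert s.removeprefix("nope/") == s  # unchanged when the prefix is absent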
def test_initial_state_with_parent_sid(self):
"""Test initial state when 'GIT_TRACE2_PARENT_SID' is set by parent."""
self.assertRegex(self._event_log_module.full_sid, self.FULL_SID_REGEX)
@ -84,19 +127,19 @@ class EventLogTestCase(unittest.TestCase):
Expected event log:
<version event>
"""
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
# A log with no added events should only have the version entry.
self.assertEqual(len(self._log_data), 1)
version_event = self._log_data[0]
self.verifyCommonKeys(version_event, expected_event_name='version')
self.verifyCommonKeys(version_event, expected_event_name="version")
# Check for 'version' event specific fields.
self.assertIn('evt', version_event)
self.assertIn('exe', version_event)
self.assertIn("evt", version_event)
self.assertIn("exe", version_event)
# Verify "evt" version field is a string.
self.assertIsInstance(version_event['evt'], str)
self.assertIsInstance(version_event["evt"], str)
def test_start_event(self):
"""Test and validate 'start' event data is valid.
@ -106,17 +149,17 @@ class EventLogTestCase(unittest.TestCase):
<start event>
"""
self._event_log_module.StartEvent()
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
start_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(start_event, expected_event_name='start')
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(start_event, expected_event_name="start")
# Check for 'start' event specific fields.
self.assertIn('argv', start_event)
self.assertTrue(isinstance(start_event['argv'], list))
self.assertIn("argv", start_event)
self.assertTrue(isinstance(start_event["argv"], list))
def test_exit_event_result_none(self):
"""Test 'exit' event data is valid when result is None.
@ -128,18 +171,18 @@ class EventLogTestCase(unittest.TestCase):
<exit event>
"""
self._event_log_module.ExitEvent(None)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
exit_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(exit_event, expected_event_name='exit')
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(exit_event, expected_event_name="exit")
# Check for 'exit' event specific fields.
self.assertIn('code', exit_event)
self.assertIn("code", exit_event)
# 'None' result should convert to 0 (successful) return code.
self.assertEqual(exit_event['code'], 0)
self.assertEqual(exit_event["code"], 0)
def test_exit_event_result_integer(self):
"""Test 'exit' event data is valid when result is an integer.
@ -149,17 +192,17 @@ class EventLogTestCase(unittest.TestCase):
<exit event>
"""
self._event_log_module.ExitEvent(2)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
exit_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(exit_event, expected_event_name='exit')
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(exit_event, expected_event_name="exit")
# Check for 'exit' event specific fields.
self.assertIn('code', exit_event)
self.assertEqual(exit_event['code'], 2)
self.assertIn("code", exit_event)
self.assertEqual(exit_event["code"], 2)
def test_command_event(self):
"""Test and validate 'command' event data is valid.
@ -168,22 +211,24 @@ class EventLogTestCase(unittest.TestCase):
<version event>
<command event>
"""
name = 'repo'
subcommands = ['init' 'this']
self._event_log_module.CommandEvent(name='repo', subcommands=subcommands)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
name = "repo"
subcommands = ["init" "this"]
self._event_log_module.CommandEvent(
name="repo", subcommands=subcommands
)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
command_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(command_event, expected_event_name='command')
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(command_event, expected_event_name="command")
# Check for 'command' event specific fields.
self.assertIn('name', command_event)
self.assertIn('subcommands', command_event)
self.assertEqual(command_event['name'], name)
self.assertEqual(command_event['subcommands'], subcommands)
self.assertIn("name", command_event)
self.assertIn("subcommands", command_event)
self.assertEqual(command_event["name"], name)
self.assertEqual(command_event["subcommands"], subcommands)
def test_def_params_event_repo_config(self):
"""Test 'def_params' event data outputs only repo config keys.
@ -194,26 +239,26 @@ class EventLogTestCase(unittest.TestCase):
<def_param event>
"""
config = {
'git.foo': 'bar',
'repo.partialclone': 'true',
'repo.partialclonefilter': 'blob:none',
"git.foo": "bar",
"repo.partialclone": "true",
"repo.partialclonefilter": "blob:none",
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 3)
def_param_events = self._log_data[1:]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
for event in def_param_events:
self.verifyCommonKeys(event, expected_event_name='def_param')
self.verifyCommonKeys(event, expected_event_name="def_param")
# Check for 'def_param' event specific fields.
self.assertIn('param', event)
self.assertIn('value', event)
self.assertTrue(event['param'].startswith('repo.'))
self.assertIn("param", event)
self.assertIn("value", event)
self.assertTrue(event["param"].startswith("repo."))
def test_def_params_event_no_repo_config(self):
"""Test 'def_params' event data won't output non-repo config keys.
@ -222,17 +267,55 @@ class EventLogTestCase(unittest.TestCase):
<version event>
"""
config = {
'git.foo': 'bar',
'git.core.foo2': 'baz',
"git.foo": "bar",
"git.core.foo2": "baz",
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 1)
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
def test_data_event_config(self):
"""Test 'data' event data outputs all config keys.
Expected event log:
<version event>
<data event>
<data event>
"""
config = {
"git.foo": "bar",
"repo.partialclone": "false",
"repo.syncstate.superproject.hassuperprojecttag": "true",
"repo.syncstate.superproject.sys.argv": ["--", "sync", "protobuf"],
}
prefix_value = "prefix"
self._event_log_module.LogDataConfigEvents(config, prefix_value)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 5)
data_events = self._log_data[1:]
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
for event in data_events:
self.verifyCommonKeys(event)
# Check for 'data' event specific fields.
self.assertIn("key", event)
self.assertIn("value", event)
key = event["key"]
key = self.remove_prefix(key, f"{prefix_value}/")
value = event["value"]
self.assertEqual(
self._event_log_module.GetDataEventName(value), event["event"]
)
self.assertTrue(key in config and value == config[key])
def test_error_event(self):
"""Test and validate 'error' event data is valid.
@ -241,38 +324,45 @@ class EventLogTestCase(unittest.TestCase):
<version event>
<error event>
"""
msg = 'invalid option: --cahced'
fmt = 'invalid option: %s'
msg = "invalid option: --cahced"
fmt = "invalid option: %s"
self._event_log_module.ErrorEvent(msg, fmt)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
error_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(error_event, expected_event_name='error')
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(error_event, expected_event_name="error")
# Check for 'error' event specific fields.
self.assertIn('msg', error_event)
self.assertIn('fmt', error_event)
self.assertEqual(error_event['msg'], msg)
self.assertEqual(error_event['fmt'], fmt)
self.assertIn("msg", error_event)
self.assertIn("fmt", error_event)
self.assertEqual(error_event["msg"], msg)
self.assertEqual(error_event["fmt"], fmt)
def test_write_with_filename(self):
"""Test Write() with a path to a file exits with None."""
self.assertIsNone(self._event_log_module.Write(path='path/to/file'))
self.assertIsNone(self._event_log_module.Write(path="path/to/file"))
def test_write_with_git_config(self):
"""Test Write() uses the git config path when 'git config' call succeeds."""
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with mock.patch.object(self._event_log_module,
'_GetEventTargetPath', return_value=tempdir):
self.assertEqual(os.path.dirname(self._event_log_module.Write()), tempdir)
"""Test Write() uses the git config path when 'git config' call
succeeds."""
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
with mock.patch.object(
self._event_log_module,
"_GetEventTargetPath",
return_value=tempdir,
):
self.assertEqual(
os.path.dirname(self._event_log_module.Write()), tempdir
)
def test_write_no_git_config(self):
"""Test Write() with no git config variable present exits with None."""
with mock.patch.object(self._event_log_module,
'_GetEventTargetPath', return_value=None):
with mock.patch.object(
self._event_log_module, "_GetEventTargetPath", return_value=None
):
self.assertIsNone(self._event_log_module.Write())
def test_write_non_string(self):
@ -280,6 +370,39 @@ class EventLogTestCase(unittest.TestCase):
with self.assertRaises(TypeError):
self._event_log_module.Write(path=1234)
def test_write_socket(self):
"""Test Write() with Unix domain socket for |path| and validate received
traces."""
received_traces = []
with tempfile.TemporaryDirectory(
prefix="test_server_sockets"
) as tempdir:
socket_path = os.path.join(tempdir, "server.sock")
server_ready = threading.Condition()
# Start "server" listening on Unix domain socket at socket_path.
try:
server_thread = threading.Thread(
target=serverLoggingThread,
args=(socket_path, server_ready, received_traces),
)
server_thread.start()
if __name__ == '__main__':
unittest.main()
with server_ready:
server_ready.wait(timeout=120)
self._event_log_module.StartEvent()
path = self._event_log_module.Write(
path=f"af_unix:{socket_path}"
)
finally:
server_thread.join(timeout=5)
self.assertEqual(path, f"af_unix:stream:{socket_path}")
self.assertEqual(len(received_traces), 2)
version_event = json.loads(received_traces[0])
start_event = json.loads(received_traces[1])
self.verifyCommonKeys(version_event, expected_event_name="version")
self.verifyCommonKeys(start_event, expected_event_name="start")
# Check for 'start' event specific fields.
self.assertIn("argv", start_event)
self.assertIsInstance(start_event["argv"], list)

View File

@ -17,39 +17,38 @@
import hooks
import unittest
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = (
'',
'#\n# foo\n',
'# Bad shebang in script\n#!/foo\n'
)
DATA = ("", "#\n# foo\n", "# Bad shebang in script\n#!/foo\n")
for data in DATA:
self.assertIsNone(hooks.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
('#!/foo', '/foo'),
('#! /foo', '/foo'),
('#!/bin/foo ', '/bin/foo'),
('#! /usr/foo ', '/usr/foo'),
('#! /usr/foo -args', '/usr/foo'),
("#!/foo", "/foo"),
("#! /foo", "/foo"),
("#!/bin/foo ", "/bin/foo"),
("#! /usr/foo ", "/usr/foo"),
("#! /usr/foo -args", "/usr/foo"),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)
self.assertEqual(
hooks.RepoHook._ExtractInterpFromShebang(shebang), interp
)
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
('#!/usr/bin/env foo', 'foo'),
('#!/bin/env foo', 'foo'),
('#! /bin/env /bin/foo ', '/bin/foo'),
("#!/usr/bin/env foo", "foo"),
("#!/bin/env foo", "foo"),
("#! /bin/env /bin/foo ", "/bin/foo"),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)
self.assertEqual(
hooks.RepoHook._ExtractInterpFromShebang(shebang), interp
)

File diff suppressed because it is too large

View File

@ -0,0 +1,52 @@
# Copyright 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the platform_utils.py module."""
import os
import tempfile
import unittest
import platform_utils
class RemoveTests(unittest.TestCase):
"""Check remove() helper."""
def testMissingOk(self):
"""Check missing_ok handling."""
with tempfile.TemporaryDirectory() as tmpdir:
path = os.path.join(tmpdir, "test")
# Should not fail.
platform_utils.remove(path, missing_ok=True)
# Should fail.
self.assertRaises(OSError, platform_utils.remove, path)
self.assertRaises(
OSError, platform_utils.remove, path, missing_ok=False
)
# Should not fail if it exists.
open(path, "w").close()
platform_utils.remove(path, missing_ok=True)
self.assertFalse(os.path.exists(path))
open(path, "w").close()
platform_utils.remove(path)
self.assertFalse(os.path.exists(path))
open(path, "w").close()
platform_utils.remove(path, missing_ok=False)
self.assertFalse(os.path.exists(path))
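The test above only relies on remove() deleting a file and optionally tolerating a missing path. A minimal sketch of that contract (illustrative only; the real platform_utils.remove likely handles platform quirks not shown here):

import os

def remove(path, missing_ok=False):
    """Delete a file, optionally ignoring a missing path."""
    try:
        os.remove(path)
    except FileNotFoundError:
        if not missing_ok:
            raise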

View File

@ -16,12 +16,13 @@
import contextlib
import os
import shutil
from pathlib import Path
import subprocess
import tempfile
import unittest
import error
import manifest_xml
import git_command
import git_config
import platform_utils
@ -31,26 +32,20 @@ import project
@contextlib.contextmanager
def TempGitTree():
"""Create a new empty git checkout for testing."""
# TODO(vapier): Convert this to tempfile.TemporaryDirectory once we drop
# Python 2 support entirely.
try:
tempdir = tempfile.mkdtemp(prefix='repo-tests')
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
# Tests need to assume that main is the default branch at init,
# which is not supported in config until 2.28.
cmd = ['git', 'init']
cmd = ["git", "init"]
if git_command.git_require((2, 28, 0)):
cmd += ['--initial-branch=main']
cmd += ["--initial-branch=main"]
else:
# Use template dir for init.
templatedir = tempfile.mkdtemp(prefix='.test-template')
with open(os.path.join(templatedir, 'HEAD'), 'w') as fp:
fp.write('ref: refs/heads/main\n')
cmd += ['--template', templatedir]
templatedir = tempfile.mkdtemp(prefix=".test-template")
with open(os.path.join(templatedir, "HEAD"), "w") as fp:
fp.write("ref: refs/heads/main\n")
cmd += ["--template", templatedir]
subprocess.check_call(cmd, cwd=tempdir)
yield tempdir
finally:
platform_utils.rmtree(tempdir)
class FakeProject(object):
@ -58,12 +53,14 @@ class FakeProject(object):
def __init__(self, worktree):
self.worktree = worktree
self.gitdir = os.path.join(worktree, '.git')
self.name = 'fakeproject'
self.gitdir = os.path.join(worktree, ".git")
self.name = "fakeproject"
self.work_git = project.Project._GitGetByExec(
self, bare=False, gitdir=self.gitdir)
self, bare=False, gitdir=self.gitdir
)
self.bare_git = project.Project._GitGetByExec(
self, bare=True, gitdir=self.gitdir)
self, bare=True, gitdir=self.gitdir
)
self.config = git_config.GitConfig.ForRepository(gitdir=self.gitdir)
@ -76,20 +73,21 @@ class ReviewableBranchTests(unittest.TestCase):
fakeproj = FakeProject(tempdir)
# Generate some commits.
with open(os.path.join(tempdir, 'readme'), 'w') as fp:
fp.write('txt')
fakeproj.work_git.add('readme')
fakeproj.work_git.commit('-mAdd file')
fakeproj.work_git.checkout('-b', 'work')
fakeproj.work_git.rm('-f', 'readme')
fakeproj.work_git.commit('-mDel file')
with open(os.path.join(tempdir, "readme"), "w") as fp:
fp.write("txt")
fakeproj.work_git.add("readme")
fakeproj.work_git.commit("-mAdd file")
fakeproj.work_git.checkout("-b", "work")
fakeproj.work_git.rm("-f", "readme")
fakeproj.work_git.commit("-mDel file")
# Start off with the normal details.
rb = project.ReviewableBranch(
fakeproj, fakeproj.config.GetBranch('work'), 'main')
self.assertEqual('work', rb.name)
fakeproj, fakeproj.config.GetBranch("work"), "main"
)
self.assertEqual("work", rb.name)
self.assertEqual(1, len(rb.commits))
self.assertIn('Del file', rb.commits[0])
self.assertIn("Del file", rb.commits[0])
d = rb.unabbrev_commits
self.assertEqual(1, len(d))
short, long = next(iter(d.items()))
@ -99,9 +97,10 @@ class ReviewableBranchTests(unittest.TestCase):
self.assertTrue(rb.date)
# Now delete the tracking branch!
fakeproj.work_git.branch('-D', 'main')
fakeproj.work_git.branch("-D", "main")
rb = project.ReviewableBranch(
fakeproj, fakeproj.config.GetBranch('work'), 'main')
fakeproj, fakeproj.config.GetBranch("work"), "main"
)
self.assertEqual(0, len(rb.commits))
self.assertFalse(rb.base_exists)
# Hard to assert anything useful about this.
@ -111,7 +110,7 @@ class ReviewableBranchTests(unittest.TestCase):
class CopyLinkTestCase(unittest.TestCase):
"""TestCase for stub repo client checkouts.
It'll have a layout like:
It'll have a layout like this:
tempdir/ # self.tempdir
checkout/ # self.topdir
git-project/ # self.worktree
@ -123,18 +122,19 @@ class CopyLinkTestCase(unittest.TestCase):
"""
def setUp(self):
self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
self.topdir = os.path.join(self.tempdir, 'checkout')
self.worktree = os.path.join(self.topdir, 'git-project')
self.tempdirobj = tempfile.TemporaryDirectory(prefix="repo_tests")
self.tempdir = self.tempdirobj.name
self.topdir = os.path.join(self.tempdir, "checkout")
self.worktree = os.path.join(self.topdir, "git-project")
os.makedirs(self.topdir)
os.makedirs(self.worktree)
def tearDown(self):
shutil.rmtree(self.tempdir, ignore_errors=True)
self.tempdirobj.cleanup()
@staticmethod
def touch(path):
with open(path, 'w'):
with open(path, "w"):
pass
def assertExists(self, path, msg=None):
@ -143,18 +143,19 @@ class CopyLinkTestCase(unittest.TestCase):
return
if msg is None:
msg = ['path is missing: %s' % path]
while path != '/':
msg = ["path is missing: %s" % path]
while path != "/":
path = os.path.dirname(path)
if not path:
# If we're given something like "foo", abort once we get to "".
# If we're given something like "foo", abort once we get to
# "".
break
result = os.path.exists(path)
msg.append('\tos.path.exists(%s): %s' % (path, result))
msg.append("\tos.path.exists(%s): %s" % (path, result))
if result:
msg.append('\tcontents: %r' % os.listdir(path))
msg.append("\tcontents: %r" % os.listdir(path))
break
msg = '\n'.join(msg)
msg = "\n".join(msg)
raise self.failureException(msg)
@ -167,98 +168,99 @@ class CopyFile(CopyLinkTestCase):
def test_basic(self):
"""Basic test of copying a file from a project to the toplevel."""
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
cf = self.CopyFile('foo.txt', 'foo')
cf = self.CopyFile("foo.txt", "foo")
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'foo'))
self.assertExists(os.path.join(self.topdir, "foo"))
def test_src_subdir(self):
"""Copy a file from a subdir of a project."""
src = os.path.join(self.worktree, 'bar', 'foo.txt')
src = os.path.join(self.worktree, "bar", "foo.txt")
os.makedirs(os.path.dirname(src))
self.touch(src)
cf = self.CopyFile('bar/foo.txt', 'new.txt')
cf = self.CopyFile("bar/foo.txt", "new.txt")
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'new.txt'))
self.assertExists(os.path.join(self.topdir, "new.txt"))
def test_dest_subdir(self):
"""Copy a file to a subdir of a checkout."""
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
cf = self.CopyFile('foo.txt', 'sub/dir/new.txt')
self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
cf = self.CopyFile("foo.txt", "sub/dir/new.txt")
self.assertFalse(os.path.exists(os.path.join(self.topdir, "sub")))
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'new.txt'))
self.assertExists(os.path.join(self.topdir, "sub", "dir", "new.txt"))
def test_update(self):
"""Make sure changed files get copied again."""
src = os.path.join(self.worktree, 'foo.txt')
dest = os.path.join(self.topdir, 'bar')
with open(src, 'w') as f:
f.write('1st')
cf = self.CopyFile('foo.txt', 'bar')
src = os.path.join(self.worktree, "foo.txt")
dest = os.path.join(self.topdir, "bar")
with open(src, "w") as f:
f.write("1st")
cf = self.CopyFile("foo.txt", "bar")
cf._Copy()
self.assertExists(dest)
with open(dest) as f:
self.assertEqual(f.read(), '1st')
self.assertEqual(f.read(), "1st")
with open(src, 'w') as f:
f.write('2nd!')
with open(src, "w") as f:
f.write("2nd!")
cf._Copy()
with open(dest) as f:
self.assertEqual(f.read(), '2nd!')
self.assertEqual(f.read(), "2nd!")
def test_src_block_symlink(self):
"""Do not allow reading from a symlinked path."""
src = os.path.join(self.worktree, 'foo.txt')
sym = os.path.join(self.worktree, 'sym')
src = os.path.join(self.worktree, "foo.txt")
sym = os.path.join(self.worktree, "sym")
self.touch(src)
platform_utils.symlink('foo.txt', sym)
platform_utils.symlink("foo.txt", sym)
self.assertExists(sym)
cf = self.CopyFile('sym', 'foo')
cf = self.CopyFile("sym", "foo")
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_symlink_traversal(self):
"""Do not allow reading through a symlink dir."""
realfile = os.path.join(self.tempdir, 'file.txt')
realfile = os.path.join(self.tempdir, "file.txt")
self.touch(realfile)
src = os.path.join(self.worktree, 'bar', 'file.txt')
platform_utils.symlink(self.tempdir, os.path.join(self.worktree, 'bar'))
src = os.path.join(self.worktree, "bar", "file.txt")
platform_utils.symlink(self.tempdir, os.path.join(self.worktree, "bar"))
self.assertExists(src)
cf = self.CopyFile('bar/file.txt', 'foo')
cf = self.CopyFile("bar/file.txt", "foo")
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_copy_from_dir(self):
"""Do not allow copying from a directory."""
src = os.path.join(self.worktree, 'dir')
src = os.path.join(self.worktree, "dir")
os.makedirs(src)
cf = self.CopyFile('dir', 'foo')
cf = self.CopyFile("dir", "foo")
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_dest_block_symlink(self):
"""Do not allow writing to a symlink."""
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
platform_utils.symlink('dest', os.path.join(self.topdir, 'sym'))
cf = self.CopyFile('foo.txt', 'sym')
platform_utils.symlink("dest", os.path.join(self.topdir, "sym"))
cf = self.CopyFile("foo.txt", "sym")
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_dest_block_symlink_traversal(self):
"""Do not allow writing through a symlink dir."""
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
platform_utils.symlink(tempfile.gettempdir(),
os.path.join(self.topdir, 'sym'))
cf = self.CopyFile('foo.txt', 'sym/foo.txt')
platform_utils.symlink(
tempfile.gettempdir(), os.path.join(self.topdir, "sym")
)
cf = self.CopyFile("foo.txt", "sym/foo.txt")
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_copy_to_dir(self):
"""Do not allow copying to a directory."""
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
os.makedirs(os.path.join(self.topdir, 'dir'))
cf = self.CopyFile('foo.txt', 'dir')
os.makedirs(os.path.join(self.topdir, "dir"))
cf = self.CopyFile("foo.txt", "dir")
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
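The CopyFile cases above all reject symlinked or escaping paths with ManifestInvalidPathError. A generic sketch of that kind of containment check (made-up helper, not the project.py implementation):

import os

def check_safe_dest(topdir, relpath):
    """Reject destinations that escape |topdir| or sit behind a symlink."""
    dest = os.path.join(topdir, relpath)
    # Resolve the parent directory and make sure it stays inside topdir.
    parent = os.path.realpath(os.path.dirname(dest))
    root = os.path.realpath(topdir)
    if os.path.commonpath([root, parent]) != root:
        raise ValueError(f"path escapes the checkout: {relpath}")
    # Refuse to write through an existing symlink at the destination.
    if os.path.islink(dest):
        raise ValueError(f"destination is a symlink: {relpath}")
    return dest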
@ -270,68 +272,252 @@ class LinkFile(CopyLinkTestCase):
def test_basic(self):
"""Basic test of linking a file from a project into the toplevel."""
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
lf = self.LinkFile('foo.txt', 'foo')
lf = self.LinkFile("foo.txt", "foo")
lf._Link()
dest = os.path.join(self.topdir, 'foo')
dest = os.path.join(self.topdir, "foo")
self.assertExists(dest)
self.assertTrue(os.path.islink(dest))
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
self.assertEqual(
os.path.join("git-project", "foo.txt"), os.readlink(dest)
)
def test_src_subdir(self):
"""Link to a file in a subdir of a project."""
src = os.path.join(self.worktree, 'bar', 'foo.txt')
src = os.path.join(self.worktree, "bar", "foo.txt")
os.makedirs(os.path.dirname(src))
self.touch(src)
lf = self.LinkFile('bar/foo.txt', 'foo')
lf = self.LinkFile("bar/foo.txt", "foo")
lf._Link()
self.assertExists(os.path.join(self.topdir, 'foo'))
self.assertExists(os.path.join(self.topdir, "foo"))
def test_src_self(self):
"""Link to the project itself."""
dest = os.path.join(self.topdir, 'foo', 'bar')
lf = self.LinkFile('.', 'foo/bar')
dest = os.path.join(self.topdir, "foo", "bar")
lf = self.LinkFile(".", "foo/bar")
lf._Link()
self.assertExists(dest)
self.assertEqual(os.path.join('..', 'git-project'), os.readlink(dest))
self.assertEqual(os.path.join("..", "git-project"), os.readlink(dest))
def test_dest_subdir(self):
"""Link a file to a subdir of a checkout."""
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
lf = self.LinkFile('foo.txt', 'sub/dir/foo/bar')
self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
lf = self.LinkFile("foo.txt", "sub/dir/foo/bar")
self.assertFalse(os.path.exists(os.path.join(self.topdir, "sub")))
lf._Link()
self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'foo', 'bar'))
self.assertExists(os.path.join(self.topdir, "sub", "dir", "foo", "bar"))
def test_src_block_relative(self):
"""Do not allow relative symlinks."""
BAD_SOURCES = (
'./',
'..',
'../',
'foo/.',
'foo/./bar',
'foo/..',
'foo/../foo',
"./",
"..",
"../",
"foo/.",
"foo/./bar",
"foo/..",
"foo/../foo",
)
for src in BAD_SOURCES:
lf = self.LinkFile(src, 'foo')
lf = self.LinkFile(src, "foo")
self.assertRaises(error.ManifestInvalidPathError, lf._Link)
def test_update(self):
"""Make sure changed targets get updated."""
dest = os.path.join(self.topdir, 'sym')
dest = os.path.join(self.topdir, "sym")
src = os.path.join(self.worktree, 'foo.txt')
src = os.path.join(self.worktree, "foo.txt")
self.touch(src)
lf = self.LinkFile('foo.txt', 'sym')
lf = self.LinkFile("foo.txt", "sym")
lf._Link()
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
self.assertEqual(
os.path.join("git-project", "foo.txt"), os.readlink(dest)
)
# Point the symlink somewhere else.
os.unlink(dest)
platform_utils.symlink(self.tempdir, dest)
lf._Link()
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
self.assertEqual(
os.path.join("git-project", "foo.txt"), os.readlink(dest)
)
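The readlink() assertions above expect relative targets. os.path.relpath derives them directly and is purely lexical, so the paths below (made-up examples mirroring the test layout) need not exist:

import os

topdir = "/checkout"
worktree = "/checkout/git-project"

# Link created at <topdir>/foo, pointing at <worktree>/foo.txt:
assert os.path.relpath(
    os.path.join(worktree, "foo.txt"), topdir
) == os.path.join("git-project", "foo.txt")

# Link created at <topdir>/foo/bar, pointing at the project itself:
assert os.path.relpath(
    worktree, os.path.join(topdir, "foo")
) == os.path.join("..", "git-project")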
class MigrateWorkTreeTests(unittest.TestCase):
"""Check _MigrateOldWorkTreeGitDir handling."""
_SYMLINKS = {
"config",
"description",
"hooks",
"info",
"logs",
"objects",
"packed-refs",
"refs",
"rr-cache",
"shallow",
"svn",
}
_FILES = {
"COMMIT_EDITMSG",
"FETCH_HEAD",
"HEAD",
"index",
"ORIG_HEAD",
"unknown-file-should-be-migrated",
}
_CLEAN_FILES = {
"a-vim-temp-file~",
"#an-emacs-temp-file#",
}
@classmethod
@contextlib.contextmanager
def _simple_layout(cls):
"""Create a simple repo client checkout to test against."""
with tempfile.TemporaryDirectory() as tempdir:
tempdir = Path(tempdir)
gitdir = tempdir / ".repo/projects/src/test.git"
gitdir.mkdir(parents=True)
cmd = ["git", "init", "--bare", str(gitdir)]
subprocess.check_call(cmd)
dotgit = tempdir / "src/test/.git"
dotgit.mkdir(parents=True)
for name in cls._SYMLINKS:
(dotgit / name).symlink_to(
f"../../../.repo/projects/src/test.git/{name}"
)
for name in cls._FILES | cls._CLEAN_FILES:
(dotgit / name).write_text(name)
yield tempdir
def test_standard(self):
"""Migrate a standard checkout that we expect."""
with self._simple_layout() as tempdir:
dotgit = tempdir / "src/test/.git"
project.Project._MigrateOldWorkTreeGitDir(str(dotgit))
# Make sure the dir was transformed into a symlink.
self.assertTrue(dotgit.is_symlink())
self.assertEqual(
os.readlink(dotgit),
os.path.normpath("../../.repo/projects/src/test.git"),
)
# Make sure files were moved over.
gitdir = tempdir / ".repo/projects/src/test.git"
for name in self._FILES:
self.assertEqual(name, (gitdir / name).read_text())
# Make sure files were removed.
for name in self._CLEAN_FILES:
self.assertFalse((gitdir / name).exists())
def test_unknown(self):
"""A checkout with unknown files should abort."""
with self._simple_layout() as tempdir:
dotgit = tempdir / "src/test/.git"
(tempdir / ".repo/projects/src/test.git/random-file").write_text(
"one"
)
(dotgit / "random-file").write_text("two")
with self.assertRaises(error.GitError):
project.Project._MigrateOldWorkTreeGitDir(str(dotgit))
# Make sure no content was actually changed.
self.assertTrue(dotgit.is_dir())
for name in self._FILES:
self.assertTrue((dotgit / name).is_file())
for name in self._CLEAN_FILES:
self.assertTrue((dotgit / name).is_file())
for name in self._SYMLINKS:
self.assertTrue((dotgit / name).is_symlink())
class ManifestPropertiesFetchedCorrectly(unittest.TestCase):
"""Ensure properties are fetched properly."""
def setUpManifest(self, tempdir):
repodir = os.path.join(tempdir, ".repo")
manifest_dir = os.path.join(repodir, "manifests")
manifest_file = os.path.join(repodir, manifest_xml.MANIFEST_FILE_NAME)
os.mkdir(repodir)
os.mkdir(manifest_dir)
manifest = manifest_xml.XmlManifest(repodir, manifest_file)
return project.ManifestProject(
manifest, "test/manifest", os.path.join(tempdir, ".git"), tempdir
)
def test_manifest_config_properties(self):
"""Test we are fetching the manifest config properties correctly."""
with TempGitTree() as tempdir:
fakeproj = self.setUpManifest(tempdir)
# Set property using the expected Set method, then ensure
# the property functions are using the correct Get methods.
fakeproj.config.SetString(
"manifest.standalone", "https://chicken/manifest.git"
)
self.assertEqual(
fakeproj.standalone_manifest_url, "https://chicken/manifest.git"
)
fakeproj.config.SetString(
"manifest.groups", "test-group, admin-group"
)
self.assertEqual(
fakeproj.manifest_groups, "test-group, admin-group"
)
fakeproj.config.SetString("repo.reference", "mirror/ref")
self.assertEqual(fakeproj.reference, "mirror/ref")
fakeproj.config.SetBoolean("repo.dissociate", False)
self.assertFalse(fakeproj.dissociate)
fakeproj.config.SetBoolean("repo.archive", False)
self.assertFalse(fakeproj.archive)
fakeproj.config.SetBoolean("repo.mirror", False)
self.assertFalse(fakeproj.mirror)
fakeproj.config.SetBoolean("repo.worktree", False)
self.assertFalse(fakeproj.use_worktree)
fakeproj.config.SetBoolean("repo.clonebundle", False)
self.assertFalse(fakeproj.clone_bundle)
fakeproj.config.SetBoolean("repo.submodules", False)
self.assertFalse(fakeproj.submodules)
fakeproj.config.SetBoolean("repo.git-lfs", False)
self.assertFalse(fakeproj.git_lfs)
fakeproj.config.SetBoolean("repo.superproject", False)
self.assertFalse(fakeproj.use_superproject)
fakeproj.config.SetBoolean("repo.partialclone", False)
self.assertFalse(fakeproj.partial_clone)
fakeproj.config.SetString("repo.depth", "48")
self.assertEqual(fakeproj.depth, "48")
fakeproj.config.SetString("repo.clonefilter", "blob:limit=10M")
self.assertEqual(fakeproj.clone_filter, "blob:limit=10M")
fakeproj.config.SetString(
"repo.partialcloneexclude", "third_party/big_repo"
)
self.assertEqual(
fakeproj.partial_clone_exclude, "third_party/big_repo"
)
fakeproj.config.SetString("manifest.platform", "auto")
self.assertEqual(fakeproj.manifest_platform, "auto")

tests/test_repo_trace.py (new file, 60 lines)
View File

@ -0,0 +1,60 @@
# Copyright 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the repo_trace.py module."""
import os
import unittest
from unittest import mock
import repo_trace
class TraceTests(unittest.TestCase):
"""Check Trace behavior."""
def testTrace_MaxSizeEnforced(self):
content = "git chicken"
with repo_trace.Trace(content, first_trace=True):
pass
first_trace_size = os.path.getsize(repo_trace._TRACE_FILE)
with repo_trace.Trace(content):
pass
self.assertGreater(
os.path.getsize(repo_trace._TRACE_FILE), first_trace_size
)
# Check we clear everything if the last chunk is larger than _MAX_SIZE.
with mock.patch("repo_trace._MAX_SIZE", 0):
with repo_trace.Trace(content, first_trace=True):
pass
self.assertEqual(
first_trace_size, os.path.getsize(repo_trace._TRACE_FILE)
)
# Check we only clear the chunks we need to.
repo_trace._MAX_SIZE = (first_trace_size + 1) / (1024 * 1024)
with repo_trace.Trace(content, first_trace=True):
pass
self.assertEqual(
first_trace_size * 2, os.path.getsize(repo_trace._TRACE_FILE)
)
with repo_trace.Trace(content, first_trace=True):
pass
self.assertEqual(
first_trace_size * 2, os.path.getsize(repo_trace._TRACE_FILE)
)

View File

@ -27,18 +27,24 @@ class SshTests(unittest.TestCase):
def test_parse_ssh_version(self):
"""Check _parse_ssh_version() handling."""
ver = ssh._parse_ssh_version('Unknown\n')
ver = ssh._parse_ssh_version("Unknown\n")
self.assertEqual(ver, ())
ver = ssh._parse_ssh_version('OpenSSH_1.0\n')
ver = ssh._parse_ssh_version("OpenSSH_1.0\n")
self.assertEqual(ver, (1, 0))
ver = ssh._parse_ssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n')
ver = ssh._parse_ssh_version(
"OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n"
)
self.assertEqual(ver, (6, 6, 1))
ver = ssh._parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n')
ver = ssh._parse_ssh_version(
"OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n"
)
self.assertEqual(ver, (7, 6))
ver = ssh._parse_ssh_version("OpenSSH_9.0p1, LibreSSL 3.3.6\n")
self.assertEqual(ver, (9, 0))
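The new LibreSSL case has no space after the patch suffix, only a comma. A regex sketch that covers all four banners above (illustrative, not ssh._parse_ssh_version itself):

import re

def parse_openssh_version(banner):
    """Pull the numeric version out of an OpenSSH banner line."""
    m = re.match(r"^OpenSSH_(\d+)\.(\d+)(?:\.(\d+))?", banner)
    if not m:
        return ()
    return tuple(int(g) for g in m.groups() if g is not None)

assert parse_openssh_version("OpenSSH_9.0p1, LibreSSL 3.3.6\n") == (9, 0)
assert parse_openssh_version("Unknown\n") == ()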
def test_version(self):
"""Check version() handling."""
with mock.patch('ssh._run_ssh_version', return_value='OpenSSH_1.2\n'):
with mock.patch("ssh._run_ssh_version", return_value="OpenSSH_1.2\n"):
self.assertEqual(ssh.version(), (1, 2))
def test_context_manager_empty(self):
@ -51,9 +57,9 @@ class SshTests(unittest.TestCase):
"""Verify orphaned clients & masters get cleaned up."""
with multiprocessing.Manager() as manager:
with ssh.ProxyManager(manager) as ssh_proxy:
client = subprocess.Popen(['sleep', '964853320'])
client = subprocess.Popen(["sleep", "964853320"])
ssh_proxy.add_client(client)
master = subprocess.Popen(['sleep', '964853321'])
master = subprocess.Popen(["sleep", "964853321"])
ssh_proxy.add_master(master)
# If the process still exists, these will throw timeout errors.
client.wait(0)
@ -63,12 +69,12 @@ class SshTests(unittest.TestCase):
"""Check sock() function."""
manager = multiprocessing.Manager()
proxy = ssh.ProxyManager(manager)
with mock.patch('tempfile.mkdtemp', return_value='/tmp/foo'):
# old ssh version uses port
with mock.patch('ssh.version', return_value=(6, 6)):
self.assertTrue(proxy.sock().endswith('%p'))
with mock.patch("tempfile.mkdtemp", return_value="/tmp/foo"):
# Old ssh version uses port.
with mock.patch("ssh.version", return_value=(6, 6)):
self.assertTrue(proxy.sock().endswith("%p"))
proxy._sock_path = None
# new ssh version uses hash
with mock.patch('ssh.version', return_value=(6, 7)):
self.assertTrue(proxy.sock().endswith('%C'))
# New ssh version uses hash.
with mock.patch("ssh.version", return_value=(6, 7)):
self.assertTrue(proxy.sock().endswith("%C"))

View File

@ -25,30 +25,30 @@ class AllCommands(unittest.TestCase):
def test_required_basic(self):
"""Basic checking of registered commands."""
# NB: We don't test all subcommands as we want to avoid "change detection"
# tests, so we just look for the most common/important ones here that are
# unlikely to ever change.
for cmd in {'cherry-pick', 'help', 'init', 'start', 'sync', 'upload'}:
# NB: We don't test all subcommands as we want to avoid "change
# detection" tests, so we just look for the most common/important ones
# here that are unlikely to ever change.
for cmd in {"cherry-pick", "help", "init", "start", "sync", "upload"}:
self.assertIn(cmd, subcmds.all_commands)
def test_naming(self):
"""Verify we don't add things that we shouldn't."""
for cmd in subcmds.all_commands:
# Reject filename suffixes like "help.py".
self.assertNotIn('.', cmd)
self.assertNotIn(".", cmd)
# Make sure all '_' were converted to '-'.
self.assertNotIn('_', cmd)
self.assertNotIn("_", cmd)
# Reject internal python paths like "__init__".
self.assertFalse(cmd.startswith('__'))
self.assertFalse(cmd.startswith("__"))
def test_help_desc_style(self):
"""Force some consistency in option descriptions.
Python's optparse & argparse has a few default options like --help. Their
option description text uses lowercase sentence fragments, so enforce our
options follow the same style so UI is consistent.
Python's optparse & argparse have a few default options like --help.
Their option description text uses lowercase sentence fragments, so
enforce that our options follow the same style so the UI is consistent.
We enforce:
* Text starts with lowercase.
@ -63,11 +63,29 @@ class AllCommands(unittest.TestCase):
c = option.help[0]
self.assertEqual(
c.lower(), c,
msg=f'subcmds/{name}.py: {option.get_opt_string()}: help text '
f'should start with lowercase: "{option.help}"')
c.lower(),
c,
msg=f"subcmds/{name}.py: {option.get_opt_string()}: "
f'help text should start with lowercase: "{option.help}"',
)
self.assertNotEqual(
option.help[-1], '.',
msg=f'subcmds/{name}.py: {option.get_opt_string()}: help text '
f'should not end in a period: "{option.help}"')
option.help[-1],
".",
msg=f"subcmds/{name}.py: {option.get_opt_string()}: "
f'help text should not end in a period: "{option.help}"',
)
def test_cli_option_style(self):
"""Force some consistency in option flags."""
for name, cls in subcmds.all_commands.items():
cmd = cls()
parser = cmd.OptionParser
for option in parser.option_list:
for opt in option._long_opts:
self.assertNotIn(
"_",
opt,
msg=f"subcmds/{name}.py: {opt}: only use dashes in "
"options, not underscores",
)

tests/test_subcmds_init.py

@@ -27,9 +27,7 @@ class InitCommand(unittest.TestCase):
def test_cli_parser_good(self):
"""Check valid command line options."""
ARGV = (
[],
)
ARGV = ([],)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
self.cmd.ValidateOptions(opts, args)
@@ -38,10 +36,9 @@
"""Check invalid command line options."""
ARGV = (
# Too many arguments.
['url', 'asdf'],
["url", "asdf"],
# Conflicting options.
['--mirror', '--archive'],
["--mirror", "--archive"],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)

tests/test_subcmds_sync.py (new file)

@@ -0,0 +1,160 @@
# Copyright (C) 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds/sync.py module."""
import os
import unittest
from unittest import mock
import pytest
import command
from subcmds import sync
@pytest.mark.parametrize(
"use_superproject, cli_args, result",
[
(True, ["--current-branch"], True),
(True, ["--no-current-branch"], True),
(True, [], True),
(False, ["--current-branch"], True),
(False, ["--no-current-branch"], False),
(False, [], None),
],
)
def test_get_current_branch_only(use_superproject, cli_args, result):
"""Test Sync._GetCurrentBranchOnly logic.
Sync._GetCurrentBranchOnly should return True if a superproject is
requested, and otherwise the value of the current_branch_only option.
"""
cmd = sync.Sync()
opts, _ = cmd.OptionParser.parse_args(cli_args)
with mock.patch(
"git_superproject.UseSuperproject", return_value=use_superproject
):
assert cmd._GetCurrentBranchOnly(opts, cmd.manifest) == result
# Used to patch os.cpu_count() for reliable results.
OS_CPU_COUNT = 24
@pytest.mark.parametrize(
"argv, jobs_manifest, jobs, jobs_net, jobs_check",
[
# No user or manifest settings.
([], None, OS_CPU_COUNT, 1, command.DEFAULT_LOCAL_JOBS),
# No user settings, so manifest settings control.
([], 3, 3, 3, 3),
# User settings, but no manifest.
(["--jobs=4"], None, 4, 4, 4),
(["--jobs=4", "--jobs-network=5"], None, 4, 5, 4),
(["--jobs=4", "--jobs-checkout=6"], None, 4, 4, 6),
(["--jobs=4", "--jobs-network=5", "--jobs-checkout=6"], None, 4, 5, 6),
(
["--jobs-network=5"],
None,
OS_CPU_COUNT,
5,
command.DEFAULT_LOCAL_JOBS,
),
(["--jobs-checkout=6"], None, OS_CPU_COUNT, 1, 6),
(["--jobs-network=5", "--jobs-checkout=6"], None, OS_CPU_COUNT, 5, 6),
# User settings with manifest settings.
(["--jobs=4"], 3, 4, 4, 4),
(["--jobs=4", "--jobs-network=5"], 3, 4, 5, 4),
(["--jobs=4", "--jobs-checkout=6"], 3, 4, 4, 6),
(["--jobs=4", "--jobs-network=5", "--jobs-checkout=6"], 3, 4, 5, 6),
(["--jobs-network=5"], 3, 3, 5, 3),
(["--jobs-checkout=6"], 3, 3, 3, 6),
(["--jobs-network=5", "--jobs-checkout=6"], 3, 3, 5, 6),
# Settings that exceed rlimits get capped.
(["--jobs=1000000"], None, 83, 83, 83),
([], 1000000, 83, 83, 83),
],
)
def test_cli_jobs(argv, jobs_manifest, jobs, jobs_net, jobs_check):
"""Tests --jobs option behavior."""
mp = mock.MagicMock()
mp.manifest.default.sync_j = jobs_manifest
cmd = sync.Sync()
opts, args = cmd.OptionParser.parse_args(argv)
cmd.ValidateOptions(opts, args)
with mock.patch.object(sync, "_rlimit_nofile", return_value=(256, 256)):
with mock.patch.object(os, "cpu_count", return_value=OS_CPU_COUNT):
cmd._ValidateOptionsWithManifest(opts, mp)
assert opts.jobs == jobs
assert opts.jobs_network == jobs_net
assert opts.jobs_checkout == jobs_check
class GetPreciousObjectsState(unittest.TestCase):
"""Tests for _GetPreciousObjectsState."""
def setUp(self):
"""Common setup."""
self.cmd = sync.Sync()
self.project = p = mock.MagicMock(
use_git_worktrees=False, UseAlternates=False
)
p.manifest.GetProjectsWithName.return_value = [p]
self.opt = mock.Mock(spec_set=["this_manifest_only"])
self.opt.this_manifest_only = False
def test_worktrees(self):
"""False for worktrees."""
self.project.use_git_worktrees = True
self.assertFalse(
self.cmd._GetPreciousObjectsState(self.project, self.opt)
)
def test_not_shared(self):
"""Singleton project."""
self.assertFalse(
self.cmd._GetPreciousObjectsState(self.project, self.opt)
)
def test_shared(self):
"""Shared project."""
self.project.manifest.GetProjectsWithName.return_value = [
self.project,
self.project,
]
self.assertTrue(
self.cmd._GetPreciousObjectsState(self.project, self.opt)
)
def test_shared_with_alternates(self):
"""Shared project, with alternates."""
self.project.manifest.GetProjectsWithName.return_value = [
self.project,
self.project,
]
self.project.UseAlternates = True
self.assertFalse(
self.cmd._GetPreciousObjectsState(self.project, self.opt)
)
def test_not_found(self):
"""Project not found in manifest."""
self.project.manifest.GetProjectsWithName.return_value = []
self.assertFalse(
self.cmd._GetPreciousObjectsState(self.project, self.opt)
)
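Read together, the GetPreciousObjectsState cases above pin down a simple rule: preciousObjects should only be enabled for a project whose object store is shared (more than one manifest project resolves to the same name), and never when git worktrees or alternates are in use. A hypothetical, simplified predicate that is consistent with those cases (not repo's actual implementation; the real GetProjectsWithName call may take extra arguments, and opt.this_manifest_only would scope the lookup) could look like:

def precious_objects_state(project, opt):
    """Sketch of the behavior the tests above exercise; names are assumed."""
    # Worktree checkouts never mark their objects precious.
    if project.use_git_worktrees:
        return False
    # How many manifest projects share this project's object store?
    # (opt.this_manifest_only would restrict the lookup in the real code.)
    shared = project.manifest.GetProjectsWithName(project.name)
    if len(shared) < 2:
        # Singleton or unknown project: nothing shares its objects.
        return False
    # Alternates already protect shared objects, so preciousObjects stays off.
    return not project.UseAlternates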

tests/test_update_manpages.py (new file)

@@ -0,0 +1,28 @@
# Copyright 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the update_manpages module."""
import unittest
from release import update_manpages
class UpdateManpagesTest(unittest.TestCase):
"""Tests the update-manpages code."""
def test_replace_regex(self):
"""Check that replace_regex works."""
data = "\n\033[1mSummary\033[m\n"
self.assertEqual(update_manpages.replace_regex(data), "\nSummary\n")

tests/test_wrapper.py

@@ -14,11 +14,9 @@
"""Unittests for the wrapper.py module."""
import contextlib
from io import StringIO
import os
import re
import shutil
import sys
import tempfile
import unittest
@@ -26,26 +24,12 @@ from unittest import mock
import git_command
import main
import platform_utils
import wrapper
@contextlib.contextmanager
def TemporaryDirectory():
"""Create a new empty git checkout for testing."""
# TODO(vapier): Convert this to tempfile.TemporaryDirectory once we drop
# Python 2 support entirely.
try:
tempdir = tempfile.mkdtemp(prefix='repo-tests')
yield tempdir
finally:
platform_utils.rmtree(tempdir)
def fixture(*paths):
"""Return a path relative to tests/fixtures.
"""
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
"""Return a path relative to tests/fixtures."""
return os.path.join(os.path.dirname(__file__), "fixtures", *paths)
class RepoWrapperTestCase(unittest.TestCase):
@@ -53,33 +37,36 @@ class RepoWrapperTestCase(unittest.TestCase):
def setUp(self):
"""Load the wrapper module every time."""
wrapper._wrapper_module = None
wrapper.Wrapper.cache_clear()
self.wrapper = wrapper.Wrapper()
class RepoWrapperUnitTest(RepoWrapperTestCase):
"""Tests helper functions in the repo wrapper
"""
"""Tests helper functions in the repo wrapper"""
def test_version(self):
"""Make sure _Version works."""
with self.assertRaises(SystemExit) as e:
with mock.patch('sys.stdout', new_callable=StringIO) as stdout:
with mock.patch('sys.stderr', new_callable=StringIO) as stderr:
with mock.patch("sys.stdout", new_callable=StringIO) as stdout:
with mock.patch("sys.stderr", new_callable=StringIO) as stderr:
self.wrapper._Version()
self.assertEqual(0, e.exception.code)
self.assertEqual('', stderr.getvalue())
self.assertIn('repo launcher version', stdout.getvalue())
self.assertEqual("", stderr.getvalue())
self.assertIn("repo launcher version", stdout.getvalue())
def test_python_constraints(self):
"""The launcher should never require newer than main.py."""
self.assertGreaterEqual(main.MIN_PYTHON_VERSION_HARD,
wrapper.MIN_PYTHON_VERSION_HARD)
self.assertGreaterEqual(main.MIN_PYTHON_VERSION_SOFT,
wrapper.MIN_PYTHON_VERSION_SOFT)
self.assertGreaterEqual(
main.MIN_PYTHON_VERSION_HARD, self.wrapper.MIN_PYTHON_VERSION_HARD
)
self.assertGreaterEqual(
main.MIN_PYTHON_VERSION_SOFT, self.wrapper.MIN_PYTHON_VERSION_SOFT
)
# Make sure the versions are themselves in sync.
self.assertGreaterEqual(wrapper.MIN_PYTHON_VERSION_SOFT,
wrapper.MIN_PYTHON_VERSION_HARD)
self.assertGreaterEqual(
self.wrapper.MIN_PYTHON_VERSION_SOFT,
self.wrapper.MIN_PYTHON_VERSION_HARD,
)
def test_init_parser(self):
"""Make sure 'init' GetParser works."""
@@ -99,48 +86,76 @@ class RepoWrapperUnitTest(RepoWrapperTestCase):
"""
Test reading a missing gitc config file
"""
self.wrapper.GITC_CONFIG_FILE = fixture('missing_gitc_config')
self.wrapper.GITC_CONFIG_FILE = fixture("missing_gitc_config")
val = self.wrapper.get_gitc_manifest_dir()
self.assertEqual(val, '')
self.assertEqual(val, "")
def test_get_gitc_manifest_dir(self):
"""
Test reading the gitc config file and parsing the directory
"""
self.wrapper.GITC_CONFIG_FILE = fixture('gitc_config')
self.wrapper.GITC_CONFIG_FILE = fixture("gitc_config")
val = self.wrapper.get_gitc_manifest_dir()
self.assertEqual(val, '/test/usr/local/google/gitc')
self.assertEqual(val, "/test/usr/local/google/gitc")
def test_gitc_parse_clientdir_no_gitc(self):
"""
Test parsing the gitc clientdir without gitc running
"""
self.wrapper.GITC_CONFIG_FILE = fixture('missing_gitc_config')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/something'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test'), 'test')
self.wrapper.GITC_CONFIG_FILE = fixture("missing_gitc_config")
self.assertEqual(self.wrapper.gitc_parse_clientdir("/something"), None)
self.assertEqual(
self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test"), "test"
)
def test_gitc_parse_clientdir(self):
"""
Test parsing the gitc clientdir
"""
self.wrapper.GITC_CONFIG_FILE = fixture('gitc_config')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/something'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/extra'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'),
'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/'), None)
self.wrapper.GITC_CONFIG_FILE = fixture("gitc_config")
self.assertEqual(self.wrapper.gitc_parse_clientdir("/something"), None)
self.assertEqual(
self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test"), "test"
)
self.assertEqual(
self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test/"), "test"
)
self.assertEqual(
self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/test/extra"),
"test",
)
self.assertEqual(
self.wrapper.gitc_parse_clientdir(
"/test/usr/local/google/gitc/test"
),
"test",
)
self.assertEqual(
self.wrapper.gitc_parse_clientdir(
"/test/usr/local/google/gitc/test/"
),
"test",
)
self.assertEqual(
self.wrapper.gitc_parse_clientdir(
"/test/usr/local/google/gitc/test/extra"
),
"test",
)
self.assertEqual(
self.wrapper.gitc_parse_clientdir("/gitc/manifest-rw/"), None
)
self.assertEqual(
self.wrapper.gitc_parse_clientdir("/test/usr/local/google/gitc/"),
None,
)
class SetGitTrace2ParentSid(RepoWrapperTestCase):
"""Check SetGitTrace2ParentSid behavior."""
KEY = 'GIT_TRACE2_PARENT_SID'
VALID_FORMAT = re.compile(r'^repo-[0-9]{8}T[0-9]{6}Z-P[0-9a-f]{8}$')
KEY = "GIT_TRACE2_PARENT_SID"
VALID_FORMAT = re.compile(r"^repo-[0-9]{8}T[0-9]{6}Z-P[0-9a-f]{8}$")
def test_first_set(self):
"""Test env var not yet set."""
@@ -152,11 +167,11 @@ class SetGitTrace2ParentSid(RepoWrapperTestCase):
def test_append(self):
"""Test env var is appended."""
env = {self.KEY: 'pfx'}
env = {self.KEY: "pfx"}
self.wrapper.SetGitTrace2ParentSid(env)
self.assertIn(self.KEY, env)
value = env[self.KEY]
self.assertTrue(value.startswith('pfx/'))
self.assertTrue(value.startswith("pfx/"))
self.assertRegex(value[4:], self.VALID_FORMAT)
def test_global_context(self):
@@ -173,16 +188,18 @@ class RunCommand(RepoWrapperTestCase):
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_command(['echo', 'hi'], capture_output=True)
self.assertEqual(ret.stdout, 'hi\n')
ret = self.wrapper.run_command(["echo", "hi"], capture_output=True)
# echo command appends OS specific linesep, but on Windows + Git Bash
# we get UNIX ending, so we allow both.
self.assertIn(ret.stdout, ["hi" + os.linesep, "hi\n"])
def test_check(self):
"""Check check handling."""
self.wrapper.run_command(['true'], check=False)
self.wrapper.run_command(['true'], check=True)
self.wrapper.run_command(['false'], check=False)
self.wrapper.run_command(["true"], check=False)
self.wrapper.run_command(["true"], check=True)
self.wrapper.run_command(["false"], check=False)
with self.assertRaises(self.wrapper.RunError):
self.wrapper.run_command(['false'], check=True)
self.wrapper.run_command(["false"], check=True)
class RunGit(RepoWrapperTestCase):
@@ -190,14 +207,14 @@ class RunGit(RepoWrapperTestCase):
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_git('--version')
self.assertIn('git', ret.stdout)
ret = self.wrapper.run_git("--version")
self.assertIn("git", ret.stdout)
def test_check(self):
"""Check check handling."""
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper.run_git('--version-asdfasdf')
self.wrapper.run_git('--version-asdfasdf', check=False)
self.wrapper.run_git("--version-asdfasdf")
self.wrapper.run_git("--version-asdfasdf", check=False)
class ParseGitVersion(RepoWrapperTestCase):
@@ -210,25 +227,26 @@ class ParseGitVersion(RepoWrapperTestCase):
def test_bad_ver(self):
"""Check handling of bad git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='asdf')
ret = self.wrapper.ParseGitVersion(ver_str="asdf")
self.assertIsNone(ret)
def test_normal_ver(self):
"""Check handling of normal git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='git version 2.25.1')
ret = self.wrapper.ParseGitVersion(ver_str="git version 2.25.1")
self.assertEqual(2, ret.major)
self.assertEqual(25, ret.minor)
self.assertEqual(1, ret.micro)
self.assertEqual('2.25.1', ret.full)
self.assertEqual("2.25.1", ret.full)
def test_extended_ver(self):
"""Check handling of extended distro git versions."""
ret = self.wrapper.ParseGitVersion(
ver_str='git version 1.30.50.696.g5e7596f4ac-goog')
ver_str="git version 1.30.50.696.g5e7596f4ac-goog"
)
self.assertEqual(1, ret.major)
self.assertEqual(30, ret.minor)
self.assertEqual(50, ret.micro)
self.assertEqual('1.30.50.696.g5e7596f4ac-goog', ret.full)
self.assertEqual("1.30.50.696.g5e7596f4ac-goog", ret.full)
class CheckGitVersion(RepoWrapperTestCase):
@@ -236,23 +254,29 @@ class CheckGitVersion(RepoWrapperTestCase):
def test_unknown(self):
"""Unknown versions should abort."""
with mock.patch.object(self.wrapper, 'ParseGitVersion', return_value=None):
with mock.patch.object(
self.wrapper, "ParseGitVersion", return_value=None
):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_old(self):
"""Old versions should abort."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(1, 0, 0, '1.0.0')):
self.wrapper,
"ParseGitVersion",
return_value=self.wrapper.GitVersion(1, 0, 0, "1.0.0"),
):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_new(self):
"""Newer versions should run fine."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(100, 0, 0, '100.0.0')):
self.wrapper,
"ParseGitVersion",
return_value=self.wrapper.GitVersion(100, 0, 0, "100.0.0"),
):
self.wrapper._CheckGitVersion()
@@ -263,26 +287,34 @@ class Requirements(RepoWrapperTestCase):
"""Don't crash if the file is missing (old version)."""
testdir = os.path.dirname(os.path.realpath(__file__))
self.assertIsNone(self.wrapper.Requirements.from_dir(testdir))
self.assertIsNone(self.wrapper.Requirements.from_file(
os.path.join(testdir, 'xxxxxxxxxxxxxxxxxxxxxxxx')))
self.assertIsNone(
self.wrapper.Requirements.from_file(
os.path.join(testdir, "xxxxxxxxxxxxxxxxxxxxxxxx")
)
)
def test_corrupt_data(self):
"""If the file can't be parsed, don't blow up."""
self.assertIsNone(self.wrapper.Requirements.from_file(__file__))
self.assertIsNone(self.wrapper.Requirements.from_data(b'x'))
self.assertIsNone(self.wrapper.Requirements.from_data(b"x"))
def test_valid_data(self):
"""Make sure we can parse the file we ship."""
self.assertIsNotNone(self.wrapper.Requirements.from_data(b'{}'))
self.assertIsNotNone(self.wrapper.Requirements.from_data(b"{}"))
rootdir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
self.assertIsNotNone(self.wrapper.Requirements.from_dir(rootdir))
self.assertIsNotNone(self.wrapper.Requirements.from_file(os.path.join(
rootdir, 'requirements.json')))
self.assertIsNotNone(
self.wrapper.Requirements.from_file(
os.path.join(rootdir, "requirements.json")
)
)
def test_format_ver(self):
"""Check format_ver can format."""
self.assertEqual('1.2.3', self.wrapper.Requirements._format_ver((1, 2, 3)))
self.assertEqual('1', self.wrapper.Requirements._format_ver([1]))
self.assertEqual(
"1.2.3", self.wrapper.Requirements._format_ver((1, 2, 3))
)
self.assertEqual("1", self.wrapper.Requirements._format_ver([1]))
def test_assert_all_unknown(self):
"""Check assert_all works with incompatible file."""
@@ -291,44 +323,48 @@ class Requirements(RepoWrapperTestCase):
def test_assert_all_new_repo(self):
"""Check assert_all accepts new enough repo."""
reqs = self.wrapper.Requirements({'repo': {'hard': [1, 0]}})
reqs = self.wrapper.Requirements({"repo": {"hard": [1, 0]}})
reqs.assert_all()
def test_assert_all_old_repo(self):
"""Check assert_all rejects old repo."""
reqs = self.wrapper.Requirements({'repo': {'hard': [99999, 0]}})
reqs = self.wrapper.Requirements({"repo": {"hard": [99999, 0]}})
with self.assertRaises(SystemExit):
reqs.assert_all()
def test_assert_all_new_python(self):
"""Check assert_all accepts new enough python."""
reqs = self.wrapper.Requirements({'python': {'hard': sys.version_info}})
reqs = self.wrapper.Requirements({"python": {"hard": sys.version_info}})
reqs.assert_all()
def test_assert_all_old_python(self):
"""Check assert_all rejects old python."""
reqs = self.wrapper.Requirements({'python': {'hard': [99999, 0]}})
reqs = self.wrapper.Requirements({"python": {"hard": [99999, 0]}})
with self.assertRaises(SystemExit):
reqs.assert_all()
def test_assert_ver_unknown(self):
"""Check assert_ver works with incompatible file."""
reqs = self.wrapper.Requirements({})
reqs.assert_ver('xxx', (1, 0))
reqs.assert_ver("xxx", (1, 0))
def test_assert_ver_new(self):
"""Check assert_ver allows new enough versions."""
reqs = self.wrapper.Requirements({'git': {'hard': [1, 0], 'soft': [2, 0]}})
reqs.assert_ver('git', (1, 0))
reqs.assert_ver('git', (1, 5))
reqs.assert_ver('git', (2, 0))
reqs.assert_ver('git', (2, 5))
reqs = self.wrapper.Requirements(
{"git": {"hard": [1, 0], "soft": [2, 0]}}
)
reqs.assert_ver("git", (1, 0))
reqs.assert_ver("git", (1, 5))
reqs.assert_ver("git", (2, 0))
reqs.assert_ver("git", (2, 5))
def test_assert_ver_old(self):
"""Check assert_ver rejects old versions."""
reqs = self.wrapper.Requirements({'git': {'hard': [1, 0], 'soft': [2, 0]}})
reqs = self.wrapper.Requirements(
{"git": {"hard": [1, 0], "soft": [2, 0]}}
)
with self.assertRaises(SystemExit):
reqs.assert_ver('git', (0, 5))
reqs.assert_ver("git", (0, 5))
class NeedSetupGnuPG(RepoWrapperTestCase):
@@ -336,38 +372,38 @@ class NeedSetupGnuPG(RepoWrapperTestCase):
def test_missing_dir(self):
"""The ~/.repoconfig tree doesn't exist yet."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = os.path.join(tempdir, 'foo')
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
self.wrapper.home_dot_repo = os.path.join(tempdir, "foo")
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_missing_keyring(self):
"""The keyring-version file doesn't exist yet."""
with TemporaryDirectory() as tempdir:
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
self.wrapper.home_dot_repo = tempdir
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_empty_keyring(self):
"""The keyring-version file exists, but is empty."""
with TemporaryDirectory() as tempdir:
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w'):
with open(os.path.join(tempdir, "keyring-version"), "w"):
pass
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_old_keyring(self):
"""The keyring-version file exists, but it's old."""
with TemporaryDirectory() as tempdir:
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1.0\n')
with open(os.path.join(tempdir, "keyring-version"), "w") as fp:
fp.write("1.0\n")
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_new_keyring(self):
"""The keyring-version file exists, and is up-to-date."""
with TemporaryDirectory() as tempdir:
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1000.0\n')
with open(os.path.join(tempdir, "keyring-version"), "w") as fp:
fp.write("1000.0\n")
self.assertFalse(self.wrapper.NeedSetupGnuPG())
@@ -376,14 +412,18 @@ class SetupGnuPG(RepoWrapperTestCase):
def test_full(self):
"""Make sure it works completely."""
with TemporaryDirectory() as tempdir:
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
self.wrapper.home_dot_repo = tempdir
self.wrapper.gpg_dir = os.path.join(self.wrapper.home_dot_repo, 'gnupg')
self.wrapper.gpg_dir = os.path.join(
self.wrapper.home_dot_repo, "gnupg"
)
self.assertTrue(self.wrapper.SetupGnuPG(True))
with open(os.path.join(tempdir, 'keyring-version'), 'r') as fp:
with open(os.path.join(tempdir, "keyring-version"), "r") as fp:
data = fp.read()
self.assertEqual('.'.join(str(x) for x in self.wrapper.KEYRING_VERSION),
data.strip())
self.assertEqual(
".".join(str(x) for x in self.wrapper.KEYRING_VERSION),
data.strip(),
)
class VerifyRev(RepoWrapperTestCase):
@@ -391,30 +431,37 @@ class VerifyRev(RepoWrapperTestCase):
def test_verify_passes(self):
"""Check when we have a valid signed tag."""
desc_result = self.wrapper.RunResult(0, 'v1.0\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
desc_result = self.wrapper.RunResult(0, "v1.0\n", "")
gpg_result = self.wrapper.RunResult(0, "", "")
with mock.patch.object(
self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
):
ret = self.wrapper.verify_rev(
"/", "refs/heads/stable", "1234", True
)
self.assertEqual("v1.0^0", ret)
def test_unsigned_commit(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
desc_result = self.wrapper.RunResult(0, "v1.0-10-g1234\n", "")
gpg_result = self.wrapper.RunResult(0, "", "")
with mock.patch.object(
self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
):
ret = self.wrapper.verify_rev(
"/", "refs/heads/stable", "1234", True
)
self.assertEqual("v1.0^0", ret)
def test_verify_fails(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
desc_result = self.wrapper.RunResult(0, "v1.0-10-g1234\n", "")
gpg_result = Exception
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
with mock.patch.object(
self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
):
with self.assertRaises(Exception):
self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.wrapper.verify_rev("/", "refs/heads/stable", "1234", True)
class GitCheckoutTestCase(RepoWrapperTestCase):
@@ -426,39 +473,47 @@ class GitCheckoutTestCase(RepoWrapperTestCase):
@classmethod
def setUpClass(cls):
# Create a repo to operate on, but do it once per-class.
cls.GIT_DIR = tempfile.mkdtemp(prefix='repo-rev-tests')
cls.tempdirobj = tempfile.TemporaryDirectory(prefix="repo-rev-tests")
cls.GIT_DIR = cls.tempdirobj.name
run_git = wrapper.Wrapper().run_git
remote = os.path.join(cls.GIT_DIR, 'remote')
remote = os.path.join(cls.GIT_DIR, "remote")
os.mkdir(remote)
# Tests need to assume, that main is default branch at init,
# which is not supported in config until 2.28.
if git_command.git_require((2, 28, 0)):
initstr = '--initial-branch=main'
initstr = "--initial-branch=main"
else:
# Use template dir for init.
templatedir = tempfile.mkdtemp(prefix='.test-template')
with open(os.path.join(templatedir, 'HEAD'), 'w') as fp:
fp.write('ref: refs/heads/main\n')
initstr = '--template=' + templatedir
templatedir = tempfile.mkdtemp(prefix=".test-template")
with open(os.path.join(templatedir, "HEAD"), "w") as fp:
fp.write("ref: refs/heads/main\n")
initstr = "--template=" + templatedir
run_git('init', initstr, cwd=remote)
run_git('commit', '--allow-empty', '-minit', cwd=remote)
run_git('branch', 'stable', cwd=remote)
run_git('tag', 'v1.0', cwd=remote)
run_git('commit', '--allow-empty', '-m2nd commit', cwd=remote)
cls.REV_LIST = run_git('rev-list', 'HEAD', cwd=remote).stdout.splitlines()
run_git("init", initstr, cwd=remote)
run_git("commit", "--allow-empty", "-minit", cwd=remote)
run_git("branch", "stable", cwd=remote)
run_git("tag", "v1.0", cwd=remote)
run_git("commit", "--allow-empty", "-m2nd commit", cwd=remote)
cls.REV_LIST = run_git(
"rev-list", "HEAD", cwd=remote
).stdout.splitlines()
run_git('init', cwd=cls.GIT_DIR)
run_git('fetch', remote, '+refs/heads/*:refs/remotes/origin/*', cwd=cls.GIT_DIR)
run_git("init", cwd=cls.GIT_DIR)
run_git(
"fetch",
remote,
"+refs/heads/*:refs/remotes/origin/*",
cwd=cls.GIT_DIR,
)
@classmethod
def tearDownClass(cls):
if not cls.GIT_DIR:
if not cls.tempdirobj:
return
shutil.rmtree(cls.GIT_DIR)
cls.tempdirobj.cleanup()
class ResolveRepoRev(GitCheckoutTestCase):
@@ -466,36 +521,40 @@ class ResolveRepoRev(GitCheckoutTestCase):
def test_explicit_branch(self):
"""Check refs/heads/branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/stable')
self.assertEqual('refs/heads/stable', rrev)
rrev, lrev = self.wrapper.resolve_repo_rev(
self.GIT_DIR, "refs/heads/stable"
)
self.assertEqual("refs/heads/stable", rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/unknown')
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, "refs/heads/unknown")
def test_explicit_tag(self):
"""Check refs/tags/tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
rrev, lrev = self.wrapper.resolve_repo_rev(
self.GIT_DIR, "refs/tags/v1.0"
)
self.assertEqual("refs/tags/v1.0", rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/unknown')
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, "refs/tags/unknown")
def test_branch_name(self):
"""Check branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, "stable")
self.assertEqual("refs/heads/stable", rrev)
self.assertEqual(self.REV_LIST[1], lrev)
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'main')
self.assertEqual('refs/heads/main', rrev)
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, "main")
self.assertEqual("refs/heads/main", rrev)
self.assertEqual(self.REV_LIST[0], lrev)
def test_tag_name(self):
"""Check tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, "v1.0")
self.assertEqual("refs/tags/v1.0", rrev)
self.assertEqual(self.REV_LIST[1], lrev)
def test_full_commit(self):
@@ -514,8 +573,8 @@ class ResolveRepoRev(GitCheckoutTestCase):
def test_unknown(self):
"""Check unknown ref/commit argument."""
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'boooooooya')
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, "boooooooya")
class CheckRepoVerify(RepoWrapperTestCase):
@@ -527,13 +586,17 @@ class CheckRepoVerify(RepoWrapperTestCase):
def test_gpg_initialized(self):
"""Should pass if gpg is setup already."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=False):
with mock.patch.object(
self.wrapper, "NeedSetupGnuPG", return_value=False
):
self.assertTrue(self.wrapper.check_repo_verify(True))
def test_need_gpg_setup(self):
"""Should pass/fail based on gpg setup."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=True):
with mock.patch.object(self.wrapper, 'SetupGnuPG') as m:
with mock.patch.object(
self.wrapper, "NeedSetupGnuPG", return_value=True
):
with mock.patch.object(self.wrapper, "SetupGnuPG") as m:
m.return_value = True
self.assertTrue(self.wrapper.check_repo_verify(True))
@@ -546,26 +609,34 @@ class CheckRepoRev(GitCheckoutTestCase):
def test_verify_works(self):
"""Should pass when verification passes."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', return_value='12345'):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual('12345', lrev)
with mock.patch.object(
self.wrapper, "check_repo_verify", return_value=True
):
with mock.patch.object(
self.wrapper, "verify_rev", return_value="12345"
):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, "stable")
self.assertEqual("refs/heads/stable", rrev)
self.assertEqual("12345", lrev)
def test_verify_fails(self):
"""Should fail when verification fails."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
with mock.patch.object(
self.wrapper, "check_repo_verify", return_value=True
):
with mock.patch.object(
self.wrapper, "verify_rev", side_effect=Exception
):
with self.assertRaises(Exception):
self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
self.wrapper.check_repo_rev(self.GIT_DIR, "stable")
def test_verify_ignore(self):
"""Should pass when verification is disabled."""
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable', repo_verify=False)
self.assertEqual('refs/heads/stable', rrev)
with mock.patch.object(
self.wrapper, "verify_rev", side_effect=Exception
):
rrev, lrev = self.wrapper.check_repo_rev(
self.GIT_DIR, "stable", repo_verify=False
)
self.assertEqual("refs/heads/stable", rrev)
self.assertEqual(self.REV_LIST[1], lrev)
if __name__ == '__main__':
unittest.main()

tox.ini

@@ -15,7 +15,7 @@
# https://tox.readthedocs.io/
[tox]
envlist = py36, py37, py38, py39
envlist = lint, py36, py37, py38, py39, py310
[gh-actions]
python =
@@ -23,11 +23,37 @@ python =
3.7: py37
3.8: py38
3.9: py39
3.10: py310
[testenv]
deps = pytest
commands = {envpython} run_tests
deps =
black
flake8
pytest
pytest-timeout
commands = {envpython} run_tests {posargs}
setenv =
GIT_AUTHOR_NAME = Repo test author
GIT_COMMITTER_NAME = Repo test committer
EMAIL = repo@gerrit.nodomain
[testenv:lint]
skip_install = true
deps =
black
flake8
commands =
black --check {posargs:.}
flake8
[testenv:format]
skip_install = true
deps =
black
flake8
commands =
black {posargs:.}
flake8
[pytest]
timeout = 300

wrapper.py

@@ -12,24 +12,21 @@
# See the License for the specific language governing permissions and
# limitations under the License.
try:
from importlib.machinery import SourceFileLoader
_loader = lambda *args: SourceFileLoader(*args).load_module()
except ImportError:
import imp
_loader = lambda *args: imp.load_source(*args)
import functools
import importlib.machinery
import importlib.util
import os
def WrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
return os.path.join(os.path.dirname(__file__), "repo")
@functools.lru_cache(maxsize=None)
def Wrapper():
global _wrapper_module
if not _wrapper_module:
_wrapper_module = _loader('wrapper', WrapperPath())
return _wrapper_module
modname = "wrapper"
loader = importlib.machinery.SourceFileLoader(modname, WrapperPath())
spec = importlib.util.spec_from_loader(modname, loader)
module = importlib.util.module_from_spec(spec)
loader.exec_module(module)
return module
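The rewritten Wrapper() above drops the deprecated imp fallback in favor of the standard importlib machinery, and functools.lru_cache takes over from the hand-maintained _wrapper_module global as the cache. A minimal, self-contained sketch of the same idiom (the path, function name, and module name here are made up, not repo's) would be:

import functools
import importlib.machinery
import importlib.util


@functools.lru_cache(maxsize=None)
def load_module_from_path(path, modname="example"):
    """Load a Python source file (even without a .py suffix) and cache it."""
    loader = importlib.machinery.SourceFileLoader(modname, path)
    spec = importlib.util.spec_from_loader(modname, loader)
    module = importlib.util.module_from_spec(spec)
    loader.exec_module(module)
    return module


# Hypothetical usage: the repo launcher has no .py extension, which is why
# SourceFileLoader is used here rather than a plain import statement.
# mod = load_module_from_path("/usr/bin/repo", modname="wrapper")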