Compare commits

...

84 Commits
v2.36 ... v2.44

Author SHA1 Message Date
fff1d2d74c ssh: Print details if running the command fails
Change-Id: I87adbdd5fe4eb2709c97ab4c21b414145acf788b
Signed-off-by: Sebastian Schuberth <sschuberth@gmail.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/392915
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Tuan Vo Hung <vohungtuan@gmail.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-03-11 16:40:55 +00:00
4b01a242d8 upload: Fix patchset description destination
Bug: 308467447
Change-Id: I8ad598d39f5fdb24d549d3277ad5fedac203581b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/412477
Reviewed-by: George Engelbrecht <engeg@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-03-08 18:05:36 +00:00
46790229fc sync: Fix sorting for nested projects
The current logic to create checkout layers doesn't work in all cases.
For example, let's assume there are three projects: "foo", "foo/bar" and
"foo-bar". Sorting lexicographical order is incorrect as foo-bar would
be placed between foo and foo/bar, breaking layering logic.

Instead, we split filepaths on the path delimiter (always /) and then
sort lexicographically on the resulting components.

BUG=b:325119758
TEST=./run_tests, manual sync on chromiumos repository

Change-Id: I76924c3cc6ba2bb860d7a3e48406a6bba8f58c10
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/412338
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: George Engelbrecht <engeg@google.com>
2024-03-08 17:58:24 +00:00
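As a minimal illustration of the sorting change described in the commit above (not repo's actual code), using the example paths from the message:

```python
# Paths from the commit message above.
paths = ["foo", "foo/bar", "foo-bar"]

# Plain lexicographic sort: "-" (0x2D) sorts before "/" (0x2F), so
# "foo-bar" lands between "foo" and its nested project "foo/bar".
assert sorted(paths) == ["foo", "foo-bar", "foo/bar"]

# Sorting on the path split into components keeps parents directly
# ahead of their nested projects, which is what the layering needs.
assert sorted(paths, key=lambda p: p.split("/")) == ["foo", "foo/bar", "foo-bar"]
```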
edadb25c02 sync: introduce --force-checkout
In some cases (e.g. in a CI system), it's desirable to be able to
instruct repo to force checkout. This flag passes the --force flag to `git
checkout` operations.

Bug: b/327624021
Change-Id: I579edda546fb8147c4e1a267e2605fcf6e597421
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/411518
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: George Engelbrecht <engeg@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-03-07 17:21:51 +00:00
96edb9b573 upload: Add support for setting patchset description
Bug: 308467447
Change-Id: I7abcbc98131b826120fc9ab85d5b889f90db4b0a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/355968
Tested-by: Sergiy Belozorov <sergiyb@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Sergiy Belozorov <sergiyb@chromium.org>
2024-03-04 18:50:24 +00:00
5554572f02 sync: Introduce git checkout levels
If a repo manifest is updated so that project B is placed within
project A, and if project A had content at B's new location in the old
checkout, then repo sync could break depending on checkout order, since
B can't be checked out before A.

This change introduces checkout levels, which enforce the right sequence of
checkouts while still allowing parallel checkout. In the example
above, A will always be checked out before B.

BUG=b:325119758
TEST=./run_tests, manual sync on ChromeOS repository

Change-Id: Ib3b5e4d2639ca56620a1e4c6bf76d7b1ab805250
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410421
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Greg Edelston <gredelston@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2024-02-27 17:28:33 +00:00
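A rough sketch of how such checkout levels can be derived; the names and structure here are illustrative, not the actual implementation:

```python
def checkout_levels(paths):
    """Assign each project path a level; nested projects land on later levels."""
    levels = {}
    # Component-wise sort guarantees parents are processed before children.
    for path in sorted(paths, key=lambda p: p.split("/")):
        parents = [p for p in levels if path.startswith(p + "/")]
        levels[path] = 1 + max((levels[p] for p in parents), default=0)
    # Projects on the same level have no nesting relation and can be
    # checked out in parallel; levels themselves run strictly in order.
    return levels


print(checkout_levels(["a", "a/b", "c"]))  # {'a': 1, 'a/b': 2, 'c': 1}
```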
97ca50f5f9 git_command: Return None from GetEventTargetPath() if set to empty string
If trace2.eventTarget was set to the empty string,
match git behavior and don't write a trace.

Bug: 319673783
Change-Id: I02b3884ad97551f8a9d7363c2cbe6b0adee6f73e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410518
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Peter Collingbourne <pcc@google.com>
2024-02-26 17:51:11 +00:00
8896b68926 trace: Save trace2 sid in REPO_TRACE file
git-trace2 events contain additional information about what git is doing
under the hood, which repo doesn't have visibility into.

Instead of relying on timestamp information to match REPO_TRACE with
git-trace2 events, add SID information into REPO_TRACE.

Change-Id: I37672a3face81858072c7a3ce34ca3379199dab5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410280
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-02-22 20:55:09 +00:00
fec8cd6704 subcmds: sync: Remove deprecated _AUTO_GC
Opportunistic cleanup. It looks like this deprecated feature was slated
for deletion nearly a year ago.

Bug: None
Test: ./run_tests
Change-Id: I0bd0c0e6acbd1eaee1c0b4945c79038257d22f44
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410198
Reviewed-by: Yiwei Zhang <yiwzhang@google.com>
Commit-Queue: Greg Edelston <gredelston@google.com>
Tested-by: Greg Edelston <gredelston@google.com>
2024-02-20 19:55:15 +00:00
b8139bdcf8 launcher: Set shebang to python3
Some (most?) Linux distros don't have /usr/bin/python unless
python-is-python3 is installed. While package owners can adjust the shebang,
we have seen an increase in the number of bugs filed because extra steps are
required.

Per PEP-0394, python3 is acceptable and should be available wherever Python 3
is supported. We no longer support Python 2, and repo no longer works
with Python 2, so this change makes that explicit.

Bug: 40014585
Change-Id: I9aed90fd470ef601bd33bd596af3df69da69ef5d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/407497
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Максим Паймушкин <maxim.paymushkin@gmail.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-02-07 20:44:32 +00:00
26fa3180fb sync: ensure RepoChangedException propagated
Prior to this change, RepoChangedException would be caught and re-raised
as a different exception. This would prevent the RepoChangedException
handler in main.py from running.

Bug: b/323232806
Change-Id: I9055ff95d439d6ff225206c5bf1755cc718bcfcc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/407144
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-02-06 18:46:19 +00:00
d379e77f44 stop passing project to UpdateManifestError
UpdateManifestError inherits from RepoExitError which inherits
from BaseRepoError. None of them takes project as a keyword argument,
causing errors like "UpdateManifestError() takes no keyword
arguments" in b/317183321.

[1]: https://gerrit.googlesource.com/git-repo/+/449b23b698d7d4b13909667a49a0698eb495eeaa/error.py#144

Bug: b/317183321
Change-Id: I64c3dc502027f9dda56a0824f2712364b4502934
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398997
Commit-Queue: Yiwei Zhang <yiwzhang@google.com>
Tested-by: Yiwei Zhang <yiwzhang@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
2024-02-02 18:35:13 +00:00
4217a82bec project: Rename if deletion fails
If a project contains files not owned by the current user, remove will
fail. In order to ensure repo sync continues to work, rename the
affected project instead, and let the user know about it.

Bug: 321273512
Change-Id: I0779d61fc67042308a0226adea7d98167252a5d3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/404372
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-01-25 21:32:58 +00:00
208f344950 Clean up remaining repo sync log spam.
There are still some verbose messages (e.g. "remote: ...") when doing
repo sync after a couple of days. Let's hide them behind the verbose flag.

Bug: N/A
Test: repo sync
Change-Id: I1408472c95ed80d9555adfe8f92211245c03cf41
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/400855
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Tomasz Wasilczyk <twasilczyk@google.com>
Commit-Queue: Tomasz Wasilczyk <twasilczyk@google.com>
2024-01-05 21:40:43 +00:00
138c8a9ff5 docs: fix some grammar typos
Change-Id: Ie1a32cda67f94b0a2b3329b1be9e03dcbedf39cc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/400917
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-01-04 17:19:33 +00:00
9b57aa00f6 project: Check references during sync
Symbolic references need to be checked each time sync is called, not
only for newly created repositories. For example, it is possible to
rename a project to an already existing name, and without this patch that
will result in a broken git setup: refs/ will still point to the
old repository, whereas all objects will point to the new repository.

Bug: 40013418
Change-Id: I596d29d182986804989f0562fb45090224549b0f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/395798
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-01-03 22:26:07 +00:00
b1d1ece2fb tests: setup user identity for tests
After a6413f5d a GitCommandError is raised.

Since no user identity was set up, the following tests fail:
 - ReviewableBranchTests from test_project.py
 - ResolveRepoRev and CheckRepoRev from test_wrapper.py

Test: ./run_tests
Change-Id: Id7f5772afe22c77fc4c8f8f0b8be1b627ed42187
Signed-off-by: Vitalii Dmitriev <vitalii.dmitriev@unikie.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398658
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Vitalii Dmitriev <dmit.vitalii@gmail.com>
Commit-Queue: Vitalii Dmitriev <dmit.vitalii@gmail.com>
2023-12-20 19:04:57 +00:00
449b23b698 manifest_xml: fix url normalization for inits and remotes
Before the change, repo only normalizes URLs
with the following format:

    git@github.com:foo/bar

It doesn't cover the following case:

   <remote name="org" fetch="git@github.com:org/" />
   <project name="somerepo" remote="org" />

This results in:
   error: Cannot fetch somerepo
     from ssh://git@github.com/org/git@github.com:org/somerepo

This change fixes it by also normalizing this format:

    git@github.com:foo

Test: ./run_tests tests/test_manifest_xml.py
Change-Id: I1ad0f5df0d52c0b7229ba4c9a4db4eecb5c1a003
Signed-off-by: Vitalii Dmitriev <vitalii.dmitriev@unikie.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398337
Commit-Queue: Vitalii Dmitriev <dmit.vitalii@gmail.com>
Tested-by: Vitalii Dmitriev <dmit.vitalii@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-12-20 07:38:49 +00:00
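Illustrative only (not the repo code): the fix described above boils down to also rewriting scp-like remote fetch bases into ssh:// form before joining them with project names, roughly like this:

```python
import re


def normalize_scp_like(url):
    """Rewrite user@host:path into ssh://user@host/path; leave other URLs alone."""
    # No "/" may appear before the ":", matching git's scp-like syntax rules
    # (see the later commit 48e4137eba).
    m = re.match(r"^([^/@]+@[^/:]+):(.*)$", url)
    if m:
        return f"ssh://{m.group(1)}/{m.group(2)}"
    return url


print(normalize_scp_like("git@github.com:org/"))     # ssh://git@github.com/org/
print(normalize_scp_like("git@github.com:foo/bar"))  # ssh://git@github.com/foo/bar
print(normalize_scp_like("https://example.com/x"))   # unchanged
```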
e5fb6e585f git_trace2: Add socket timeout
repo blocks indefinitely until the trace collector receives trace events,
which is not desired. This change adds a fixed timeout to connect and
send operations. It is possible that some events will be lost; repo logs
any failed trace operation.

Bug: b/316227772
Change-Id: I017636421b8e22ae3fcbab9e4eb2bee1d4fbbff4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398717
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
2023-12-19 19:38:52 +00:00
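A minimal sketch of the idea, assuming a UNIX-domain trace collector socket (the socket type and helper name are illustrative): connect and send get a bounded timeout so repo cannot block indefinitely, and a failed send is only logged.

```python
import socket

SOCKET_TIMEOUT_SECONDS = 5  # illustrative value


def send_trace_event(collector_path, payload: bytes):
    try:
        with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
            sock.settimeout(SOCKET_TIMEOUT_SECONDS)  # bounds connect and send
            sock.connect(collector_path)
            sock.sendall(payload)
    except OSError as e:  # socket timeouts raise a subclass of OSError
        # The event is dropped; repo only logs the failed trace operation.
        print(f"trace2: failed to send event: {e}")
```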
48e4137eba manifest_xml: do not allow / before : in scp-like syntax
Since git doesn't treat these as ssh:// URIs, we shouldn't either.

Bug: https://g-issues.gerritcodereview.com/issues/40010331
Change-Id: I001f49be30395187cac447d09cb5a6c29e95768b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398517
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-12-19 18:00:44 +00:00
172c58398b repo: Drop reexec of python3 from check_python_version()
This simplifies check_python_version() since there is no point in trying
to fall back to python3, as we are already running under some Python 3
interpreter.

Change-Id: I9dfdd002b4ef5567e064d3d6ca98ee1f3410fd48
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397759
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2023-12-15 06:49:27 +00:00
aa506db8a7 repo: Remove Python 2 compatibility code
Change-Id: I1f5c691bf94f255442eea95e59ddd93db6213ad8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397758
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2023-12-15 06:48:48 +00:00
14c61d2c9d repo: Remove a Python 2 related comment
The EnvironmentError exception was changed to OSError in commit
ae824fb2fc.

Change-Id: I1b4ff742af409ec848131e82900e885c9f089f0c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397757
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2023-12-14 18:31:51 +00:00
4c80921d22 Don't log spam repo sync by default
Most times, a repo sync after some time (a week or more) results in a bunch
of messages which are not very useful for the average user:
- discarding 1 commits
- Deleting obsolete checkout.

Bug: N/A
Test: repo sync
Change-Id: I881eab61f9f261e98f3656c09e73ddd159ce288c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397038
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Tomasz Wasilczyk <twasilczyk@google.com>
2023-12-08 23:08:46 +00:00
f56484c05b tox: Remove pylint timeout
It's not a valid pylint config

Change-Id: Ida480429a3a86637f26e9fc3a0d6fa2d225d952a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/396921
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2023-12-08 22:55:22 +00:00
a50c4e3bc0 Update commit-msg hook
Modified in https://gerrit-review.googlesource.com/c/gerrit/+/394841.

Change-Id: I381e48fbdb92b33454219dd9d945a1756e551a77
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/395577
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Orgad Shaneh <orgads@gmail.com>
Commit-Queue: Orgad Shaneh <orgads@gmail.com>
Reviewed-by: Ernesto Rodriguez <guez30nesto@gmail.com>
2023-12-04 17:43:33 +00:00
0dd0a830b0 sync: Fix partial sync false positive
If a project is removed from the manifest and a symlink to another
project is placed at the path where the removed project used to exist,
repo will start to warn about partial syncs even though a partial sync
did not occur.

Repro steps:

1) Create a manifest with two projects. Project a -> a/ and project b -> b/
2) Run `repo sync`
3) Remove project b from the manifest.
4) Use `link` in the manifest to link all of Project a to b/

Bug: 314161804
Change-Id: I4a4ac4f70a7038bc7e0c4e0e51ae9fc942411a34
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/395640
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Matt Schulte <matsch@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-11-30 22:36:41 +00:00
9f0ef5d926 repo: add repo main script's directory to PYTHONPATH.
Python 3.11 introduces PYTHONSAFEPATH and the -P flag which, when enabled,
stop Python from prepending the script's directory to sys.path.
This breaks repo because main.py expects its own directory to be part of
Python's import path.

This causes problems with tools that add PYTHONSAFEPATH to python
programs, most notably Bazel.

We simply prepend main.py's directory to PYTHONPATH instead.

Bug: 307767740
Change-Id: I94f3fda50213e450df0d1e2df6a0b8b597416973
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391236
Tested-by: Duy Truong <duytruong@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-11-29 11:50:53 +00:00
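Roughly what the fix amounts to (simplified, names and paths illustrative): make sure main.py's own directory ends up on PYTHONPATH, so imports keep working even when -P / PYTHONSAFEPATH strips the script directory from sys.path.

```python
import os


def prepend_main_dir_to_pythonpath(main_py_path, env):
    """Return env with main.py's directory prepended to PYTHONPATH."""
    main_dir = os.path.dirname(os.path.abspath(main_py_path))
    existing = env.get("PYTHONPATH", "")
    env["PYTHONPATH"] = (
        main_dir if not existing else main_dir + os.pathsep + existing
    )
    return env


# "/opt/repo/main.py" is a hypothetical install location, for illustration.
env = prepend_main_dir_to_pythonpath("/opt/repo/main.py", dict(os.environ))
print(env["PYTHONPATH"].split(os.pathsep)[0])  # /opt/repo
```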
c287428b37 info: Handle undefined mergeBranch
When a repo client is initialized with --standalone-manifest, it doesn't
have a merge branch defined. This results in mergeBranch being None.

Bug: b/308025460
Change-Id: Iebceac0976e5d3adab5300bd8dfc76744a791234
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/393716
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2023-11-20 19:22:53 +00:00
c984e8d4f6 manifest_xml: support nested submanifests
Change-Id: I58f91c6b0db631bb7f55164f41d11d3a349ac94f
Signed-off-by: Guillaume Micouin-Jorda <gmicouin@netcourrier.com>
Signed-off-by: Hadamik Stephan <Stephan.Hadamik@continental-corporation.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/392020
Reviewed-by: Ben PUJOL <pujolbe@gmail.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Roberto Prado <roberto.prado.c@gmail.com>
Commit-Queue: Roberto Prado <roberto.prado.c@gmail.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Roberto Prado <roberto.prado.c@gmail.com>
2023-11-15 13:06:23 +00:00
6d821124e0 repo_logging: Ensure error details are printed
This updates RepoLogger.log_aggregated_errors to print out the error
message of the RepoExitError when there is no list of aggregated
errors.

Previously it would log out:
=======================================================================
Repo command failed: ManifestParseError

This told us what class of error occurred but missed the helpful error
message that developers put in the error. After this change it will now
print out the error message:

=======================================================================
Repo command failed: ManifestParseError
    error parsing manifest /path/to/manifest.xml: no element found:
    line 197, column 0

Change-Id: I4805540fddb5fa9171dbc8912becfa7fdfb1ba67
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/392614
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Joshua Bartel <josh.bartel@garmin.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-11-13 20:51:19 +00:00
560a79727f repo: Use the worktree when checking the repo rev.
Avoids treating the operation as if it were acting on a bare repository,
thereby triggering failures when the Git client is configured with
`safe.bareRepository=explicit`. Repo doesn't actually use a bare repository,
but pointing at the gitdir acts as if it did.

Bug: 307559774
Change-Id: I2c142275b2726a59526729c0b2c54faf728f125d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391554
Commit-Queue: Jason R. Coombs <jaraco@google.com>
Tested-by: Jason R. Coombs <jaraco@google.com>
Tested-by: Emily Shaffer <emilyshaffer@google.com>
Reviewed-by: Emily Shaffer <emilyshaffer@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-11-13 18:21:31 +00:00
8a6d1724d9 git_superproject: tell git that superproject is bare
The superproject is initialized as a bare repo in Superproject:_Init().
That means that later operations must treat it as a bare repository,
specifying the gitdir and setting 'bare' appropriately when launching
GitCommand()s. It's also OK not to specify cwd here because GitCommand()
will drop cwd if bare == True anyway.

With this change, it's possible to run `repo init` and `repo sync` with the
Git config 'safe.bareRepository' set to 'explicit'. This config strengthens
Git's security posture against embedded bare repository attacks like
https://github.com/justinsteven/advisories/blob/main/2022_git_buried_bare_repos_and_fsmonitor_various_abuses.md.

Bug: b/227257481
Change-Id: I954a64c6883d2ca2af9c603e7076fd83b52584e9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389794
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jason R. Coombs <jaraco@google.com>
Tested-by: Emily Shaffer <emilyshaffer@google.com>
Reviewed-by: Emily Shaffer <emilyshaffer@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-11-09 22:13:17 +00:00
3652b497bb Correctly handle schema-less URIs for remote fetch URL
Currently we don't deal with schema-less URIs like
`git@github.com:foo` at all, resulting in a scenario where we append
them to the manifest repo URL.

In order to deal with this, we munge both the manifest URL and the
fetch URL into a format we like and proceed with that.

Bug: https://g-issues.gerritcodereview.com/issues/40010331
Change-Id: I7b79fc4ed276630fdbeb235b94e327b172f0879b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386954
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Michael Kelly <mkelly@arista.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-11-08 05:03:20 +00:00
89f761cfef main: Log ManifestParseError exception messages
This lets us see manifest parsing error messages again.

Change-Id: I2d90b97cfb50e4520f79e75fa0d648c373b49e98
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391477
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Łukasz Patron <priv.luk@gmail.com>
Tested-by: Łukasz Patron <priv.luk@gmail.com>
2023-11-06 19:39:24 +00:00
d32b2dcd15 repo: Remove unreachable code.
Change-Id: I41371feb88c85e9da0656b9fab04057c22d1dcf4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391514
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-11-01 17:02:34 +00:00
b32ccbb66b cleanup: Update codebase to expect Python 3.6
- Bump minimum version to Python 3.6.
- Use f-strings in a lot of places.

Change-Id: I2aa70197230fcec2eff8e7c8eb754f20c08075bb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389034
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-10-31 16:03:54 +00:00
b99272c601 sync: PersistentTransport call parent init
Found via pylint:
  W0231: __init__ method from base class 'Transport'
  is not called (super-init-not-called)

Just fixed for code correctness and to avoid potential future bugs.

Change-Id: Ie1e723c2afe65d026d70ac01a16ee7a40c149834
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390676
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-25 09:06:23 +00:00
b0430b5bc5 sync: TeeStringIO write should return int
Change-Id: I211776a493cad4b005c6e201833e9700def2feb9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390657
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-24 19:29:57 +00:00
1fd5c4bdf2 sync: Fix tracking of broken links
Change-Id: Ice4f4cc745cbac59f356bd4ce1124b6162894e61
Bug: b/113935847
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390434
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
2023-10-24 18:49:20 +00:00
9267d58727 project: Speculative fix for project corruption
When a new shared project is added to the manifest, there's a short window
where objects that are used by other projects can be deleted.

To close that window, set preciousObjects during git init. For
non-shared projects, repo should correct the state in the same execution
instance.

Bug: 288102993
Change-Id: I366f524535ac58c820d51a88599ae2108df9ab48
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390234
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-10-23 16:13:02 +00:00
ae824fb2fc cleanup: convert exceptions to OSError
In Python 3, these exceptions were merged into OSError, so switch
everything over to that.

Change-Id: If876a28b692de5aa5c62a3bdc8c000793ce52c63
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390376
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-21 00:56:10 +00:00
034950b9ee cleanup: delete redundant "r" open mode
Change-Id: I86ebb8c5a9dc3752e8a25f4b11b64c5be3a6429e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390375
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-21 00:55:33 +00:00
0bcffd8656 cleanup: use new dict & set generator styles
Change-Id: Ie34ac33ada7855945c77238da3ce644f8a9f8306
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390374
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-21 00:55:01 +00:00
7393f6bc41 manifest_xml: Fix empty project list when DOCTYPE is present
When parsing the manifest XML, the code looks for a top
level DOM node named "manifest". However, it doesn't check
that it's an element type node so if there is also an XML
document type declaration node present (which has the same
name as the root element) then it selects the wrong node
and hence you end up with no projects defined at all.

Change-Id: I8d101caffbbc2a06e56136ff21302e3f09cfc96b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390357
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Chris Allen <chris.allen@arm.com>
Commit-Queue: Chris Allen <chris.allen@arm.com>
2023-10-20 18:22:59 +00:00
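A small reproduction of the failure mode described above, using only the standard library: with a DOCTYPE present, the first child node named "manifest" is the document type declaration, not the root element, so the lookup must also check the node type.

```python
import xml.dom.minidom

doc = xml.dom.minidom.parseString(
    '<!DOCTYPE manifest><manifest><project name="p"/></manifest>'
)

first = doc.childNodes[0]
# The doctype node shares the root element's name but is not an element.
print(first.nodeName, first.nodeType == first.ELEMENT_NODE)  # manifest False

root = next(
    node
    for node in doc.childNodes
    if node.nodeType == node.ELEMENT_NODE and node.nodeName == "manifest"
)
print(len(root.getElementsByTagName("project")))  # 1
```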
8dd8521854 cleanup: leverage yield from in more places
Change-Id: I4f9cb27d89241d3738486764817b51981444a903
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390274
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-20 17:33:03 +00:00
49c9b06838 git_config: GetBoolean should return bool
Test: tox
Change-Id: Ifc0dc089deef5a3b396d889c9ebfcf8d4f007bf2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390360
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-20 16:41:58 +00:00
3d58d219cb project: using --depth results in error when including submanifests
Fix: https://issues.gerritcodereview.com/issues/40015442
Change-Id: I7fb6c50cf2e438b21181ce1a5893885f09b9ee2b
Signed-off-by: Roberto Vladimir Prado Carranza <roberto.prado.c@gmail.com>
Signed-off-by: Guillaume Micouin-Jorda <gmicouin@netcourrier.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385995
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jerome Couto <jerome.couto@renault.com>
2023-10-20 12:34:34 +00:00
c0aad7de18 repo: drop Python 2 compat logic
Bug: 302871152
Change-Id: Ie7a0219e7ac582cd25c2bc5fb530e2c03bcbcc6e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390034
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-20 05:19:40 +00:00
d4aee6570b delete Python 2 (object) compat
Bug: 302871152
Change-Id: I39636d73a6e1d69efa8ade74f75c5381651e6dc8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390054
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-20 04:51:01 +00:00
024df06ec1 tests: Set HOME to a temporary directory when running tests.
When running the tests in my environment, tests derived from
`test_wrapper.GitCheckoutTestCase` would fail on commit or tag due to
incomplete or incorrect gpg config. Ideally, the tests should not be
dependent on the user's git config. This change ensures $HOME (or the
Windows equivalent) is replaced for the session.

Bug: 302797407

Change-Id: Ib42b712dd7b6602fee6e18329a8c6d52fb9458b9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388235
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-10-17 15:15:55 +00:00
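A minimal sketch, assuming unittest-style tests (class and test names are made up): point HOME and its Windows equivalent at a throwaway directory so the developer's own git/gpg configuration cannot affect commit or tag operations during the run.

```python
import os
import tempfile
import unittest
from unittest import mock


class IsolatedHomeTest(unittest.TestCase):  # hypothetical test case
    def setUp(self):
        tmp = tempfile.TemporaryDirectory()
        self.addCleanup(tmp.cleanup)
        # patch.dict restores the original environment when the test ends.
        patcher = mock.patch.dict(
            os.environ, {"HOME": tmp.name, "USERPROFILE": tmp.name}
        )
        patcher.start()
        self.addCleanup(patcher.stop)

    def test_home_is_isolated(self):
        self.assertEqual(os.environ["HOME"], os.environ["USERPROFILE"])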
45809e51ca tests: added python 3.12
Add the recently released Python 3.12 to our
list of test environments.

Change-Id: I05ec0129ad29c16fff65ddfb389f251571f811a2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389754
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-17 13:58:33 +00:00
331c5dd3e7 github: add python 3.11 to test-ci.yml
Add Python 3.11 to the test matrix.

Change-Id: I0533205b5a10105b3144f770aa08c4c649aaf6be
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389675
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-10-16 22:29:49 +00:00
e848e9f72c github: pin ubuntu to 20.04 to make py36 work
Ubuntu versions newer than 20.04 do not support Python 3.6 as per
https://raw.githubusercontent.com/actions/python-versions/main/versions-manifest.json

Change-Id: I92d8e762a7d05e4b0d6d6e90944ceedbbfa74e57
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389117
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-16 22:26:38 +00:00
1544afe460 python-support: update with current status & guidelines
This doc was written back in 2019 when we were planning the Python 3
migration.  It isn't relevant anymore, and people are reading it thinking
we still support Python 2.  Rewrite it to match current requirements and
to make it clear there is no support for older versions.

Bug: 302871152
Change-Id: I2acf3aee1816a03ee0a70774db8bf4a23713a03f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389455
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-14 06:22:53 +00:00
3b8f9535c7 hooks: drop support for Python 2
Stop running old repohooks via python2.  Abort immediately with a
clear error for the user.

Bug: 302871152
Change-Id: I750c6cbbf3c7950e249512bb1bd023c32587eef5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389454
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-10-13 20:52:46 +00:00
8f4f98582e main: drop Python 2 check
Python 2 can't even parse this code anymore due to syntax changes,
so there's no point in checking for it explicitly.

Bug: 302871152
Change-Id: I9852ace5f5079d037c60fd3ac490d77e074e6875
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389434
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-13 20:08:33 +00:00
8bc5000423 Update logger.warn to logger.warning
Bug: 305035810
Change-Id: Ic2b35d5c3cbe92480c24da612f29382f5d26d4aa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389414
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-13 19:34:26 +00:00
6a7f73bb9a git_command: read1 needs a size in py3.6
Not setting size causes "TypeError: read1() takes exactly one argument
(0 given)" in Python 3.6.
From Python 3.7 onwards, size defaults to -1, which means an arbitrary
number of bytes will be returned.

Compare https://docs.python.org/3.6/library/io.html#io.BufferedReader.read1
and https://docs.python.org/3.7/library/io.html#io.BufferedIOBase.read1
for more details.

Change-Id: Ia4aaf8140ead9493ec650fac167c641569e6a9d8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388718
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-09 17:04:38 +00:00
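The Python 3.6 difference called out above, in isolation: read1() there has no default size, so the call site must pass one explicitly.

```python
import io

buf = io.BufferedReader(io.BytesIO(b"hello world"))
# Passing an explicit size works on Python 3.6+;
# a bare buf.read1() only works on Python 3.7+.
print(buf.read1(4096))  # b'hello world'
```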
23d063bdcd git_command: lru_cache needs maxsize for py36 & 37
Python 3.6 and 3.7 do not have a default value for lru_cache maxsize.
Not setting it would cause:
  TypeError: Expected maxsize to be an integer or None

Change-Id: I32d4fb6a0040a0c24da0b2f29f22f85a36c96531
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388737
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-09 14:08:29 +00:00
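The compatibility point in miniature: on Python 3.6/3.7, lru_cache must be called with a maxsize argument (or None); the bare @functools.lru_cache decorator form only arrived in Python 3.8.

```python
import functools


@functools.lru_cache(maxsize=None)  # works on 3.6+; bare @lru_cache needs 3.8+
def version_tuple():
    # Placeholder body for illustration.
    return (2, 40)


print(version_tuple())
```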
ce0ed799b6 sync: Fix print statement in _PostRepoFetch
R=jasonnc@google.com

Bug: b/303806829
Change-Id: I49075bfb55b842610786e61a0dedfe008cd1296a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388614
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-06 18:39:46 +00:00
2844a5f3cc git_command: Augment underlying git errors with suggestions
This change appends suggestions to the underlying git error to make the
error slightly more actionable.

DD: go/improve-repo-error-reporting & go/tee-repo-stderr

Bug: b/292704435
Change-Id: I2bf8bea5fca42c6a9acd2fadc70f58f22456e027
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387774
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-06 18:21:45 +00:00
47944bbe2e project: Invoke realpath on dotgit for symmetry with gitdir to ensure a short relpath.
Bug: 302680231

Change-Id: Icd01dd2ce62d737a4acb114e729189cd31f6bde9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388234
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-10-05 14:29:29 +00:00
83c66ec661 Reset info logs back to print in sync
Bug: b/292704435
Change-Id: Ib4b4873de726888fc68e476167ff2dcd74ec9045
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387974
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
2023-09-28 19:46:49 +00:00
87058c6ca5 Track expected git errors in logs
Sometimes it is expected that a GitCommand executed in repo fails. In
such cases, indicate in the trace logs that the error was expected.

Bug: b/293344017
Change-Id: If137fae9ef9769258246f5b4494e070345db4a71
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387714
Commit-Queue: Jason Chang <jasonnc@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
2023-09-27 19:05:16 +00:00
b5644160b7 tests: Fix tox error in py36 use virtualenv<20.22.0
tox uses virtualenv under its hood for managing virtual environments.
Virtualenv 20.22.0 dropped support for Python <= 3.6.

Since we want to test against Python 3.6 we need to make sure we use
a version of virtualenv earlier than 20.22.0.

This error was not stopping any tests from passing but was printed
multiple times to stderr when executing the py36 target:

  Error processing line 1 of [...]/.tox/py36/[...]/_virtualenv.pth:

    Traceback (most recent call last):
      File "/usr/lib/python3.6/site.py", line 168, in addpackage
        exec(line)
      File "<string>", line 1, in <module>
      File "[...]/.tox/py36/[...]/_virtualenv.py", line 3
        from __future__ import annotations
                                         ^
    SyntaxError: future feature annotations is not defined

Source: https://tox.wiki/en/latest/faq.html#testing-end-of-life-python-versions
Change-Id: I27bd8200987ecf745108ee8c7561a365f542102a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387694
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-09-27 18:47:04 +00:00
aadd12cb08 Use non-deprecated API for obtaining UTC time
DeprecationWarning: datetime.datetime.utcnow() is deprecated and
scheduled for removal in a future version. Use timezone-aware objects to
represent datetimes in UTC: datetime.datetime.now(datetime.UTC).

Change-Id: Ia2c46fb87c544d98cc2dd68a829f67d4770b479c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386615
Tested-by: Łukasz Patron <priv.luk@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Łukasz Patron <priv.luk@gmail.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-09-18 23:59:37 +00:00
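The replacement the deprecation warning points at, shown side by side (illustrative, not the exact call sites this commit touches):

```python
from datetime import datetime, timezone

old = datetime.utcnow()           # naive UTC timestamp, deprecated in 3.12
new = datetime.now(timezone.utc)  # timezone-aware UTC timestamp
print(old.tzinfo, new.tzinfo)     # None UTC
```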
b8fd19215f main: Use repo logger
Bug: b/292704435
Change-Id: Ica02e4c00994a2f64083bb36e8f4ee8aa45d76bd
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386454
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-18 20:06:30 +00:00
7a1f1f70f0 project: Use repo logger
Bug: b/292704435
Change-Id: I510fc911530db2c84a7ee099fa2905ceac35d0b7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386295
Reviewed-by: Jason Chang <jasonnc@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-14 17:14:40 +00:00
c993c5068e subcmds: Use repo logger
Bug: b/292704435
Change-Id: Ia3a45d87fc0bf0d4a1ba53050d9c3cd2dba20e55
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386236
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-14 17:13:37 +00:00
c3d7c8536c github: add PR closer
We don't accept PRs via GH, so add a job to automatically close them
with an explanation for how to submit.

Change-Id: I5cc3176549a04ff23b04dae1110cd27a58ba1fd3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386134
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-09-13 18:42:18 +00:00
880c621dc6 tests: test_subcmds_sync.py: fix for py3.6 & 3.7
tests/test_subcmds_sync.py::LocalSyncState::test_prune_removed_projects
was failing in Python 3.6 and 3.7 due to topdir not being set with the
following error message:
    TypeError: expected str, bytes or os.PathLike object, not MagicMock

topdir is accessed from within PruneRemovedProjects().

Test: tox with Python 3.6 to 3.11
Change-Id: I7ba5144df0a0126c01776384e2178136c3510091
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382816
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-09-13 18:24:04 +00:00
da6ae1da8b tests: test_git_superproject.py: fix py3.6 & 3.7
tests/test_git_superproject.py::SuperprojectTestCase::test_Fetch was
failing in Python 3.6 and 3.7 because the args attribute was only
introduced in Python 3.8. Fall back on the old way of accessing
the arguments.

Test: tox with Python 3.6 to 3.11
Change-Id: Iae1934a7bce8cbd6b4519e4dbc92d94e21b43435
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382818
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-09-13 18:23:40 +00:00
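The fallback described above, in isolation: mock call objects only grew the .args/.kwargs attributes in Python 3.8, so code that must also run on 3.6/3.7 indexes the call tuple instead.

```python
from unittest import mock

m = mock.Mock()
m("event", retries=3)

call = m.call_args
assert call.args == ("event",)        # Python 3.8+
assert call[0] == ("event",)          # also works on 3.6/3.7
assert call.kwargs == {"retries": 3}  # Python 3.8+
assert call[1] == {"retries": 3}      # also works on 3.6/3.7
```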
5771897459 start: Use repo logger
Bug: b/292704435
Change-Id: I7b8988207dfdcf0ffc283a48499611892ef5187d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385534
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-11 21:38:55 +00:00
56a5a01c65 project: Use IsId instead of ID_RE.match
Change-Id: I8ca83a034400da0cb97cba41415bfc50858a898b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385857
Tested-by: Sylvain Desodt <sylvain.desodt@gmail.com>
Commit-Queue: Sylvain Desodt <sylvain.desodt@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-09-11 12:35:19 +00:00
e9cb391117 project: Optimise GetCommitRevisionId when revisionId is set
When comparing 2 manifests, most of the time is
spent getting the relevant commit id as it relies
on _allrefs which ends up loading all git references.

However, the value from `revisionId` (when it is valid)
could be used directly, leading to a huge performance improvement
(from 180+ seconds to less than 0.01 sec, which is more
than 25000 times faster for manifests with 700+ projects).

Bug: 295282548

Change-Id: I5881aa4b2326cc17bbb4ee91d23293111f76ad7e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385834
Tested-by: Sylvain Desodt <sylvain.desodt@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Sylvain Desodt <sylvain.desodt@gmail.com>
2023-09-11 12:28:25 +00:00
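A sketch of the shape of this optimisation (function and helper names here are illustrative, not repo's actual code): when the manifest already carries a valid commit id, return it directly instead of resolving the revision through all refs.

```python
import re


def is_commit_id(rev):
    # Illustrative stand-in for repo's IsId(): a full 40-character SHA-1.
    return bool(re.fullmatch(r"[0-9a-f]{40}", rev or ""))


def get_commit_revision_id(revision_id, resolve_via_all_refs):
    if revision_id and is_commit_id(revision_id):
        return revision_id  # fast path: no ref loading needed
    # Slow path: the expensive _allrefs walk mentioned above.
    return resolve_via_all_refs()
```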
25d6c7cc10 manifest_xml: use a set instead of (sorted) list in projectsDiff
The logic in projectsDiff performs various operations which
suggest that a set is more appropriate than a list:
 - membership lookup ("in")
 - removal

Also, sorting can be performed on the remaining elements at the
end (which will usually involve a much smaller number of elements).

(The performance gain is invisible in comparison to the time being
spent performing git operations).

Cosmetic changes:
 - the definition of 'fromProj' is moved to be used in more places
 - the values in diff["added"] are added with a single call to extend

Change-Id: I5ed22ba73b50650ca2d3a49a1ae81f02be3b3055
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383434
Tested-by: Sylvain Desodt <sylvain.desodt@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Sylvain Desodt <sylvain.desodt@gmail.com>
2023-09-10 19:24:56 +00:00
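The gist of the change, illustratively: membership tests and removals on the not-yet-matched projects are cheap on a set, and only the leftovers get sorted at the end.

```python
to_project_paths = {"a", "b", "c", "d"}  # set instead of a sorted list
from_project_paths = ["a", "c"]

for path in from_project_paths:
    if path in to_project_paths:       # O(1) membership lookup
        to_project_paths.remove(path)  # O(1) removal

added = sorted(to_project_paths)       # sort only what remains
print(added)                           # ['b', 'd']
```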
f19b310f15 Log ErrorEvent for failing GitCommands
Change-Id: I270af7401cff310349e736bef87e9b381cc4d016
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385054
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
2023-09-06 18:22:33 +00:00
712e62b9b0 logging: Use log.formatter for coloring logs
Bug: b/292704435
Change-Id: Iebdf8fb7666592dc5df2b36aae3185d1fc71bd66
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385514
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-09-06 18:07:55 +00:00
daf2ad38eb sync: Preserve errors on KeyboardInterrupt
If a KeyboardInterrupt is encountered before an error is aggregated, then
the context surrounding the interrupt is lost. This change aggregates
errors as soon as possible for the sync command.

Bug: b/293344017
Change-Id: Iac14f9d59723cc9dedbb960f14fdc1fa5b348ea3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/384974
Tested-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-09-06 17:36:31 +00:00
b861511db9 fix black formatting of standalone programs
Black will only check .py files when given a dir and --check, so list
our few standalone programs explicitly.  This causes the repo launcher
to be reformatted since it was missed in the previous mass reformat.

Bug: b/267675342
Change-Id: Ic90a7f5d84fc02e9fccb05945310fd067e2ed764
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385034
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-09-01 18:08:58 +00:00
e914ec293a sync: Use repo logger within sync
Bug: b/292704435
Change-Id: Iceb3ad5111e656a1ff9730ae5deb032a9b43b4a5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383454
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-08-31 22:29:51 +00:00
1e9f7b9e9e project: Preserve stderr on upload
A previous change captured stderr when uploading git projects. This
change ensures the captured output is still sent to stderr.

Bug: b/297097597
Change-Id: I8314e1017d2a42b7b655fe43ce3c312d397894ca
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/384134
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Sam Saccone <samccone@google.com>
2023-08-28 17:13:44 +00:00
1dbf8b4346 tox.ini: add isort as dependency
A previous change introduced isort, which causes tox
runs to fail for all Python versions. Adding
isort as a dependency resolves these issues.

Change-Id: If3faf78e6928e6e5111b2ef2359351459832431f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/384175
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-08-28 02:08:45 +00:00
60 changed files with 3247 additions and 2369 deletions

View File

@ -0,0 +1,22 @@
# GitHub actions workflow.
# https://docs.github.com/en/actions/learn-github-actions/workflow-syntax-for-github-actions
# https://github.com/superbrothers/close-pull-request
name: Close Pull Request
on:
  pull_request_target:
    types: [opened]
jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - uses: superbrothers/close-pull-request@v3
        with:
          comment: >
            Thanks for your contribution!
            Unfortunately, we don't use GitHub pull requests to manage code
            contributions to this repository.
            Instead, please see [README.md](../blob/HEAD/SUBMITTING_PATCHES.md)
            which provides full instructions on how to get involved.

View File

@ -13,8 +13,9 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
python-version: ['3.6', '3.7', '3.8', '3.9', '3.10']
# ubuntu-20.04 is the last version that supports python 3.6
os: [ubuntu-20.04, macos-latest, windows-latest]
python-version: ['3.6', '3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
runs-on: ${{ matrix.os }}
steps:

View File

@ -103,7 +103,7 @@ def SetDefaultColoring(state):
DEFAULT = "never"
class Coloring(object):
class Coloring:
def __init__(self, config, section_type):
self._section = "color.%s" % section_type
self._config = config
@ -194,7 +194,7 @@ class Coloring(object):
if not opt:
return _Color(fg, bg, attr)
v = self._config.GetString("%s.%s" % (self._section, opt))
v = self._config.GetString(f"{self._section}.{opt}")
if v is None:
return _Color(fg, bg, attr)

View File

@ -46,7 +46,7 @@ class UsageError(RepoExitError):
"""Exception thrown with invalid command usage."""
class Command(object):
class Command:
"""Base class for any command line action in repo."""
# Singleton for all commands to track overall repo command execution and
@ -290,7 +290,7 @@ class Command(object):
output.end()
def _ResetPathToProjectMap(self, projects):
self._by_path = dict((p.worktree, p) for p in projects)
self._by_path = {p.worktree: p for p in projects}
def _UpdatePathToProjectMap(self, project):
self._by_path[project.worktree] = project
@ -476,8 +476,7 @@ class Command(object):
top = self.manifest
yield top
if not opt.this_manifest_only:
for child in top.all_children:
yield child
yield from top.all_children
class InteractiveCommand(Command):
@ -498,11 +497,11 @@ class PagedCommand(Command):
return True
class MirrorSafeCommand(object):
class MirrorSafeCommand:
"""Command permits itself to run within a mirror, and does not require a
working directory.
"""
class GitcClientCommand(object):
class GitcClientCommand:
"""Command that requires the local client to be a GITC client."""

View File

@ -1,47 +1,92 @@
# Supported Python Versions
With Python 2.7 officially going EOL on [01 Jan 2020](https://pythonclock.org/),
we need a support plan for the repo project itself.
Inevitably, there will be a long tail of users who still want to use Python 2 on
their old LTS/corp systems and have little power to change the system.
This documents the current supported Python versions, and tries to provide
guidance for when we decide to drop support for older versions.
## Summary
* Python 3.6 (released Dec 2016) is required by default starting with repo-2.x.
* Older versions of Python (e.g. v2.7) may use the legacy feature-frozen branch
based on repo-1.x.
* Python 3.6 (released Dec 2016) is required starting with repo-2.0.
* Older versions of Python (e.g. v2.7) may use old releases via the repo-1.x
branch, but no support is provided.
## Overview
We provide a branch for Python 2 users that is feature-frozen.
Bugfixes may be added on a best-effort basis or from the community, but largely
no new features will be added, nor is support guaranteed.
Users can select this during `repo init` time via the [repo launcher].
Otherwise the default branches (e.g. stable & main) will be used which will
require Python 3.
This means the [repo launcher] needs to support both Python 2 & Python 3, but
since it doesn't import any other repo code, this shouldn't be too problematic.
The main branch will require Python 3.6 at a minimum.
If the system has an older version of Python 3, then users will have to select
the legacy Python 2 branch instead.
### repo hooks
## repo hooks
Projects that use [repo hooks] run on independent schedules.
They might migrate to Python 3 earlier or later than us.
To support them, we'll probe the shebang of the hook script and if we find an
interpreter in there that indicates a different version than repo is currently
running under, we'll attempt to reexec ourselves under that.
Since it's not possible to detect what version of Python the hooks were written
or tested against, we always import & exec them with the active Python version.
For example, a hook with a header like `#!/usr/bin/python2` will have repo
execute `/usr/bin/python2` to execute the hook code specifically if repo is
currently running Python 3.
If the user's Python is too new for the [repo hooks], then it is up to the hooks
maintainer to update.
For more details, consult the [repo hooks] documentation.
## Repo launcher
The [repo launcher] is an independent script that can support older versions of
Python without holding back the rest of the codebase.
If it detects the current version of Python is too old, it will try to reexec
via a newer version of Python via standard `pythonX.Y` interpreter names.
However, this is provided as a nicety when it is not onerous, and there is no
official support for older versions of Python than the rest of the codebase.
If your default python interpreters are too old to run the launcher even though
you have newer versions installed, your choices are:
* Modify the [repo launcher]'s shebang to suit your environment.
* Download an older version of the [repo launcher] and don't upgrade it.
Be aware that we do not guarantee old repo launchers will work with current
versions of repo. Bug reports using old launchers will not be accepted.
## When to drop support
So far, Python 3.6 has provided most of the interesting features that we want
(e.g. typing & f-strings), and there haven't been features in newer versions
that are critical to us.
That said, let's assume we need functionality that only exists in Python 3.7.
How do we decide when it's acceptable to drop Python 3.6?
1. Review the [Project References](./release-process.md#project-references) to
see what major distros are using the previous version of Python, and when
they go EOL. Generally we care about Ubuntu LTS & current/previous Debian
stable versions.
* If they're all EOL already, then go for it, drop support.
* If they aren't EOL, start a thread on [repo-discuss] to see how the user
base feels about the proposal.
1. Update the "soft" versions in the codebase. This will start warning users
that the older version is deprecated.
* Update [repo](/repo) if the launcher needs updating.
This only helps with people who download newer launchers.
* Update [main.py](/main.py) for the main codebase.
This warns for everyone regardless of [repo launcher] version.
* Update [requirements.json](/requirements.json).
This allows [repo launcher] to display warnings/errors without having
to execute the new codebase. This helps in case of syntax or module
changes where older versions won't even be able to import the new code.
1. After some grace period (ideally at least 2 quarters after the first release
with the updated soft requirements), update the "hard" versions, and then
start using the new functionality.
## Python 2.7 & 3.0-3.5
> **There is no support for these versions.**
> **Do not file bugs if you are using old Python versions.**
> **Any such reports will be marked invalid and ignored.**
> **Upgrade your distro and/or runtime instead.**
Fetch an old version of the [repo launcher]:
```sh
$ curl https://storage.googleapis.com/git-repo-downloads/repo-2.32 > ~/.bin/repo-2.32
$ chmod a+rx ~/.bin/repo-2.32
```
Then initialize an old version of repo:
```sh
$ repo-2.32 init --repo-rev=repo-1 ...
```
[repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss
[repo hooks]: ./repo-hooks.md
[repo launcher]: ../repo

View File

@ -22,7 +22,7 @@ from error import EditorError
import platform_utils
class Editor(object):
class Editor:
"""Manages the user's preferred text editor."""
_editor = None
@ -104,9 +104,7 @@ least one of these before using this command.""", # noqa: E501
try:
rc = subprocess.Popen(args, shell=shell).wait()
except OSError as e:
raise EditorError(
"editor failed, %s: %s %s" % (str(e), editor, path)
)
raise EditorError(f"editor failed, {str(e)}: {editor} {path}")
if rc != 0:
raise EditorError(
"editor failed with exit status %d: %s %s"

View File

@ -21,7 +21,7 @@ TASK_SYNC_NETWORK = "sync-network"
TASK_SYNC_LOCAL = "sync-local"
class EventLog(object):
class EventLog:
"""Event log that records events that occurred during a repo invocation.
Events are written to the log as a consecutive JSON entries, one per line.

View File

@ -13,7 +13,9 @@
# limitations under the License.
import functools
import json
import os
import re
import subprocess
import sys
from typing import Any, Optional
@ -21,7 +23,9 @@ from typing import Any, Optional
from error import GitError
from error import RepoExitError
from git_refs import HEAD
from git_trace2_event_log_base import BaseEventLog
import platform_utils
from repo_logging import RepoLogger
from repo_trace import IsTrace
from repo_trace import REPO_TRACE
from repo_trace import Trace
@ -45,19 +49,22 @@ GIT_DIR = "GIT_DIR"
LAST_GITDIR = None
LAST_CWD = None
DEFAULT_GIT_FAIL_MESSAGE = "git command failure"
ERROR_EVENT_LOGGING_PREFIX = "RepoGitCommandError"
# Common line length limit
GIT_ERROR_STDOUT_LINES = 1
GIT_ERROR_STDERR_LINES = 1
GIT_ERROR_STDERR_LINES = 10
INVALID_GIT_EXIT_CODE = 126
logger = RepoLogger(__file__)
class _GitCall(object):
class _GitCall:
@functools.lru_cache(maxsize=None)
def version_tuple(self):
ret = Wrapper().ParseGitVersion()
if ret is None:
msg = "fatal: unable to detect git version"
print(msg, file=sys.stderr)
logger.error(msg)
raise GitRequireError(msg)
return ret
@ -67,7 +74,7 @@ class _GitCall(object):
def fun(*cmdv):
command = [name]
command.extend(cmdv)
return GitCommand(None, command).Wait() == 0
return GitCommand(None, command, add_event_log=False).Wait() == 0
return fun
@ -105,7 +112,45 @@ def RepoSourceVersion():
return ver
class UserAgent(object):
@functools.lru_cache(maxsize=None)
def GetEventTargetPath():
"""Get the 'trace2.eventtarget' path from git configuration.
Returns:
path: git config's 'trace2.eventtarget' path if it exists, or None
"""
path = None
cmd = ["config", "--get", "trace2.eventtarget"]
# TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
# system git config variables.
p = GitCommand(
None,
cmd,
capture_stdout=True,
capture_stderr=True,
bare=True,
add_event_log=False,
)
retval = p.Wait()
if retval == 0:
# Strip trailing carriage-return in path.
path = p.stdout.rstrip("\n")
if path == "":
return None
elif retval != 1:
# `git config --get` is documented to produce an exit status of `1`
# if the requested variable is not present in the configuration.
# Report any other return value as an error.
logger.error(
"repo: error: 'git config --get' call failed with return code: "
"%r, stderr: %r",
retval,
p.stderr,
)
return path
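For illustration only (this sketch is not part of the diff): the new `GetEventTargetPath()` helper can be called directly; it returns the configured trace2 target path, or `None` when `trace2.eventtarget` is unset or set to an empty string.

```python
from git_command import GetEventTargetPath

target = GetEventTargetPath()
if target is None:
    # Either trace2.eventtarget is unset, or it was set to an empty string.
    print("trace2 event logging is disabled")
else:
    print(f"trace2 events will be written to: {target}")
```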
class UserAgent:
"""Mange User-Agent settings when talking to external services
We follow the style as documented here:
@ -153,12 +198,10 @@ class UserAgent(object):
def git(self):
"""The UA when running git."""
if self._git_ua is None:
self._git_ua = "git/%s (%s) git-repo/%s" % (
git.version_tuple().full,
self.os,
RepoSourceVersion(),
self._git_ua = (
f"git/{git.version_tuple().full} ({self.os}) "
f"git-repo/{RepoSourceVersion()}"
)
return self._git_ua
@ -173,8 +216,8 @@ def git_require(min_version, fail=False, msg=""):
need = ".".join(map(str, min_version))
if msg:
msg = " for " + msg
error_msg = "fatal: git %s or later required%s" % (need, msg)
print(error_msg, file=sys.stderr)
error_msg = f"fatal: git {need} or later required{msg}"
logger.error(error_msg)
raise GitRequireError(error_msg)
return False
@ -200,7 +243,7 @@ def _build_env(
env["GIT_SSH"] = ssh_proxy.proxy
env["GIT_SSH_VARIANT"] = "ssh"
if "http_proxy" in env and "darwin" == sys.platform:
s = "'http.proxy=%s'" % (env["http_proxy"],)
s = f"'http.proxy={env['http_proxy']}'"
p = env.get("GIT_CONFIG_PARAMETERS")
if p is not None:
s = p + " " + s
@ -229,7 +272,7 @@ def _build_env(
return env
class GitCommand(object):
class GitCommand:
"""Wrapper around a single git invocation."""
def __init__(
@ -247,6 +290,8 @@ class GitCommand(object):
gitdir=None,
objdir=None,
verify_command=False,
add_event_log=True,
log_as_error=True,
):
if project:
if not cwd:
@ -257,6 +302,7 @@ class GitCommand(object):
self.project = project
self.cmdv = cmdv
self.verify_command = verify_command
self.stdout, self.stderr = None, None
# Git on Windows wants its paths only using / for reliability.
if platform_utils.isWindows():
@ -276,15 +322,67 @@ class GitCommand(object):
command = [GIT]
if bare:
cwd = None
command.append(cmdv[0])
command_name = cmdv[0]
command.append(command_name)
# Need to use the --progress flag for fetch/clone so output will be
# displayed as by default git only does progress output if stderr is a
# TTY.
if sys.stderr.isatty() and cmdv[0] in ("fetch", "clone"):
if sys.stderr.isatty() and command_name in ("fetch", "clone"):
if "--progress" not in cmdv and "--quiet" not in cmdv:
command.append("--progress")
command.extend(cmdv[1:])
event_log = (
BaseEventLog(env=env, add_init_count=True)
if add_event_log
else None
)
try:
self._RunCommand(
command,
env,
capture_stdout=capture_stdout,
capture_stderr=capture_stderr,
merge_output=merge_output,
ssh_proxy=ssh_proxy,
cwd=cwd,
input=input,
)
self.VerifyCommand()
except GitCommandError as e:
if event_log is not None:
error_info = json.dumps(
{
"ErrorType": type(e).__name__,
"Project": e.project,
"CommandName": command_name,
"Message": str(e),
"ReturnCode": str(e.git_rc)
if e.git_rc is not None
else None,
"IsError": log_as_error,
}
)
event_log.ErrorEvent(
f"{ERROR_EVENT_LOGGING_PREFIX}:{error_info}"
)
event_log.Write(GetEventTargetPath())
if isinstance(e, GitPopenCommandError):
raise
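For illustration only (the field values below are hypothetical, not from the diff): when a git call fails and `add_event_log` is left at its default, the error event written above carries a JSON payload of this shape.

```python
import json

error_info = json.dumps(
    {
        "ErrorType": "GitCommandError",
        "Project": "platform/example",
        "CommandName": "fetch",
        "Message": "git command failure",
        "ReturnCode": "128",
        "IsError": True,
    }
)
# The trace2 error event message is the fixed prefix plus the JSON payload.
event_message = f"RepoGitCommandError:{error_info}"
```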
def _RunCommand(
self,
command,
env,
capture_stdout=False,
capture_stderr=False,
merge_output=False,
ssh_proxy=None,
cwd=None,
input=None,
):
# Set subprocess.PIPE for streams that need to be captured.
stdin = subprocess.PIPE if input else None
stdout = subprocess.PIPE if capture_stdout else None
stderr = (
@ -293,6 +391,30 @@ class GitCommand(object):
else (subprocess.PIPE if capture_stderr else None)
)
# tee_stderr acts like a tee command for stderr, in that, it captures
# stderr from the subprocess and streams it back to sys.stderr, while
# keeping a copy in-memory.
# This allows us to store stderr logs from the subprocess into
# GitCommandError.
# Certain git operations, such as `git push`, write diagnostic output,
# such as the push progress bar, to stderr. To avoid breaking git's UX,
# we must write to sys.stderr as we read from the subprocess. Setting
# encoding or errors makes subprocess return an io.TextIOWrapper, which
# is line buffered; to avoid line buffering while tee-ing stderr, we
# unset these kwargs. See GitCommand._Tee for tee-ing between the
# streams.
# We only tee stderr when the caller does not capture any stream, so the
# existing flow is not disrupted.
# See go/tee-repo-stderr for more context.
tee_stderr = False
kwargs = {"encoding": "utf-8", "errors": "backslashreplace"}
if not (stdin or stdout or stderr):
tee_stderr = True
# stderr will be written back to sys.stderr even though it is
# piped here.
stderr = subprocess.PIPE
kwargs = {}
dbg = ""
if IsTrace():
global LAST_CWD
@ -339,17 +461,16 @@ class GitCommand(object):
command,
cwd=cwd,
env=env,
encoding="utf-8",
errors="backslashreplace",
stdin=stdin,
stdout=stdout,
stderr=stderr,
**kwargs,
)
except Exception as e:
raise GitCommandError(
message="%s: %s" % (command[1], e),
project=project.name if project else None,
command_args=cmdv,
raise GitPopenCommandError(
message=f"{command[1]}: {e}",
project=self.project.name if self.project else None,
command_args=self.cmdv,
)
if ssh_proxy:
@ -358,12 +479,45 @@ class GitCommand(object):
self.process = p
try:
self.stdout, self.stderr = p.communicate(input=input)
if tee_stderr:
# tee_stderr streams stderr to sys.stderr while capturing
# a copy within self.stderr. tee_stderr is only enabled
# when the caller is not capturing any stream itself.
self.stderr = self._Tee(p.stderr, sys.stderr)
else:
self.stdout, self.stderr = p.communicate(input=input)
finally:
if ssh_proxy:
ssh_proxy.remove_client(p)
self.rc = p.wait()
@staticmethod
def _Tee(in_stream, out_stream):
"""Writes text from in_stream to out_stream while recording in buffer.
Args:
in_stream: I/O stream to be read from.
out_stream: I/O stream to write to.
Returns:
A str containing everything read from the in_stream.
"""
buffer = ""
read_size = 1024 if sys.version_info < (3, 7) else -1
chunk = in_stream.read1(read_size)
while chunk:
# Convert to str.
if not hasattr(chunk, "encode"):
chunk = chunk.decode("utf-8", "backslashreplace")
buffer += chunk
out_stream.write(chunk)
out_stream.flush()
chunk = in_stream.read1(read_size)
return buffer
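A minimal sketch of the tee behaviour (illustration only, using an in-memory stream): bytes read from the subprocess stream are echoed to the output stream while a decoded copy is kept for later error reporting.

```python
import io
import sys

# _Tee reads chunks with read1(), decodes bytes, echoes them to out_stream,
# and returns everything it read as one string.
captured = GitCommand._Tee(
    io.BytesIO(b"remote: Enumerating objects...\n"), sys.stderr
)
assert captured == "remote: Enumerating objects...\n"
```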
@staticmethod
def _GetBasicEnv():
"""Return a basic env for running git under.
@ -383,16 +537,14 @@ class GitCommand(object):
env.pop(key, None)
return env
def Wait(self):
if not self.verify_command or self.rc == 0:
return self.rc
def VerifyCommand(self):
if self.rc == 0:
return None
stdout = (
"\n".join(self.stdout.split("\n")[:GIT_ERROR_STDOUT_LINES])
if self.stdout
else None
)
stderr = (
"\n".join(self.stderr.split("\n")[:GIT_ERROR_STDERR_LINES])
if self.stderr
@ -407,6 +559,11 @@ class GitCommand(object):
git_stderr=stderr,
)
def Wait(self):
if self.verify_command:
self.VerifyCommand()
return self.rc
class GitRequireError(RepoExitError):
"""Error raised when git version is unavailable or invalid."""
@ -423,6 +580,29 @@ class GitCommandError(GitError):
raised exclusively from non-zero exit codes returned from git commands.
"""
# Tuples with error formats and suggestions for those errors.
_ERROR_TO_SUGGESTION = [
(
re.compile("couldn't find remote ref .*"),
"Check if the provided ref exists in the remote.",
),
(
re.compile("unable to access '.*': .*"),
(
"Please make sure you have the correct access rights and the "
"repository exists."
),
),
(
re.compile("'.*' does not appear to be a git repository"),
"Are you running this repo command outside of a repo workspace?",
),
(
re.compile("not a git repository"),
"Are you running this repo command outside of a repo workspace?",
),
]
def __init__(
self,
message: str = DEFAULT_GIT_FAIL_MESSAGE,
@ -439,13 +619,40 @@ class GitCommandError(GitError):
self.git_stdout = git_stdout
self.git_stderr = git_stderr
@property
@functools.lru_cache(maxsize=None)
def suggestion(self):
"""Returns helpful next steps for the given stderr."""
if not self.git_stderr:
return self.git_stderr
for err, suggestion in self._ERROR_TO_SUGGESTION:
if err.search(self.git_stderr):
return suggestion
return None
def __str__(self):
args = "[]" if not self.command_args else " ".join(self.command_args)
error_type = type(self).__name__
return f"""{error_type}: {self.message}
Project: {self.project}
Args: {args}
Stdout:
{self.git_stdout}
Stderr:
{self.git_stderr}"""
string = f"{error_type}: '{args}' on {self.project} failed"
if self.message != DEFAULT_GIT_FAIL_MESSAGE:
string += f": {self.message}"
if self.git_stdout:
string += f"\nstdout: {self.git_stdout}"
if self.git_stderr:
string += f"\nstderr: {self.git_stderr}"
if self.suggestion:
string += f"\nsuggestion: {self.suggestion}"
return string
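To show how the suggestion lookup and the new `__str__` format fit together (illustration only; the project name and stderr text are hypothetical):

```python
err = GitCommandError(
    message="fetch failed",
    project="platform/example",
    command_args=["fetch", "origin"],
    git_stderr="fatal: couldn't find remote ref refs/heads/missing",
)
print(err.suggestion)
# Check if the provided ref exists in the remote.
print(err)
# GitCommandError: 'fetch origin' on platform/example failed: fetch failed
# stderr: fatal: couldn't find remote ref refs/heads/missing
# suggestion: Check if the provided ref exists in the remote.
```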
class GitPopenCommandError(GitError):
"""
Error raised when subprocess.Popen fails for a GitCommand
"""

View File

@ -70,7 +70,7 @@ def _key(name):
return ".".join(parts)
class GitConfig(object):
class GitConfig:
_ForUser = None
_ForSystem = None
@ -180,7 +180,7 @@ class GitConfig(object):
config_dict[key] = self.GetString(key)
return config_dict
def GetBoolean(self, name: str) -> Union[str, None]:
def GetBoolean(self, name: str) -> Union[bool, None]:
"""Returns a boolean from the configuration file.
Returns:
@ -370,7 +370,7 @@ class GitConfig(object):
with Trace(": parsing %s", self.file):
with open(self._json) as fd:
return json.load(fd)
except (IOError, ValueError):
except (OSError, ValueError):
platform_utils.remove(self._json, missing_ok=True)
return None
@ -378,7 +378,7 @@ class GitConfig(object):
try:
with open(self._json, "w") as fd:
json.dump(cache, fd, indent=2)
except (IOError, TypeError):
except (OSError, TypeError):
platform_utils.remove(self._json, missing_ok=True)
def _ReadGit(self):
@ -418,7 +418,7 @@ class GitConfig(object):
if p.Wait() == 0:
return p.stdout
else:
raise GitError("git config %s: %s" % (str(args), p.stderr))
raise GitError(f"git config {str(args)}: {p.stderr}")
class RepoConfig(GitConfig):
@ -430,7 +430,7 @@ class RepoConfig(GitConfig):
return os.path.join(repo_config_dir, ".repoconfig/config")
class RefSpec(object):
class RefSpec:
"""A Git refspec line, split into its components:
forced: True if the line starts with '+'
@ -541,7 +541,7 @@ def GetUrlCookieFile(url, quiet):
yield cookiefile, None
class Remote(object):
class Remote:
"""Configuration options related to a remote."""
def __init__(self, config, name):
@ -651,13 +651,11 @@ class Remote(object):
userEmail, host, port
)
except urllib.error.HTTPError as e:
raise UploadError("%s: %s" % (self.review, str(e)))
raise UploadError(f"{self.review}: {str(e)}")
except urllib.error.URLError as e:
raise UploadError("%s: %s" % (self.review, str(e)))
raise UploadError(f"{self.review}: {str(e)}")
except http.client.HTTPException as e:
raise UploadError(
"%s: %s" % (self.review, e.__class__.__name__)
)
raise UploadError(f"{self.review}: {e.__class__.__name__}")
REVIEW_CACHE[u] = self._review_url
return self._review_url + self.projectname
@ -666,7 +664,7 @@ class Remote(object):
username = self._config.GetString("review.%s.username" % self.review)
if username is None:
username = userEmail.split("@")[0]
return "ssh://%s@%s:%s/" % (username, host, port)
return f"ssh://{username}@{host}:{port}/"
def ToLocal(self, rev):
"""Convert a remote revision string to something we have locally."""
@ -715,15 +713,15 @@ class Remote(object):
self._Set("fetch", list(map(str, self.fetch)))
def _Set(self, key, value):
key = "remote.%s.%s" % (self.name, key)
key = f"remote.{self.name}.{key}"
return self._config.SetString(key, value)
def _Get(self, key, all_keys=False):
key = "remote.%s.%s" % (self.name, key)
key = f"remote.{self.name}.{key}"
return self._config.GetString(key, all_keys=all_keys)
class Branch(object):
class Branch:
"""Configuration options related to a single branch."""
def __init__(self, config, name):
@ -762,11 +760,11 @@ class Branch(object):
fd.write("\tmerge = %s\n" % self.merge)
def _Set(self, key, value):
key = "branch.%s.%s" % (self.name, key)
key = f"branch.{self.name}.{key}"
return self._config.SetString(key, value)
def _Get(self, key, all_keys=False):
key = "branch.%s.%s" % (self.name, key)
key = f"branch.{self.name}.{key}"
return self._config.GetString(key, all_keys=all_keys)
@ -795,8 +793,8 @@ class SyncAnalysisState:
to be logged.
"""
self._config = config
now = datetime.datetime.utcnow()
self._Set("main.synctime", now.isoformat(timespec="microseconds") + "Z")
now = datetime.datetime.now(datetime.timezone.utc)
self._Set("main.synctime", now.isoformat(timespec="microseconds"))
self._Set("main.version", "1")
self._Set("sys.argv", sys.argv)
for key, value in superproject_logging_data.items():

View File

@ -28,7 +28,7 @@ R_WORKTREE_M = R_WORKTREE + "m/"
R_M = "refs/remotes/m/"
class GitRefs(object):
class GitRefs:
def __init__(self, gitdir):
self._gitdir = gitdir
self._phyref = None
@ -105,10 +105,8 @@ class GitRefs(object):
def _ReadPackedRefs(self):
path = os.path.join(self._gitdir, "packed-refs")
try:
fd = open(path, "r")
fd = open(path)
mtime = os.path.getmtime(path)
except IOError:
return
except OSError:
return
try:

View File

@ -66,12 +66,12 @@ class UpdateProjectsResult(NamedTuple):
fatal: bool
class Superproject(object):
class Superproject:
"""Get commit ids from superproject.
Initializes a local copy of a superproject for the manifest. This allows
lookup of commit ids for all projects. It contains _project_commit_ids which
is a dictionary with project/commit id entries.
Initializes a bare local copy of a superproject for the manifest. This
allows lookup of commit ids for all projects. It contains
_project_commit_ids which is a dictionary with project/commit id entries.
"""
def __init__(
@ -235,7 +235,8 @@ class Superproject(object):
p = GitCommand(
None,
cmd,
cwd=self._work_git,
gitdir=self._work_git,
bare=True,
capture_stdout=True,
capture_stderr=True,
)
@ -271,7 +272,8 @@ class Superproject(object):
p = GitCommand(
None,
cmd,
cwd=self._work_git,
gitdir=self._work_git,
bare=True,
capture_stdout=True,
capture_stderr=True,
)
@ -381,7 +383,7 @@ class Superproject(object):
try:
with open(manifest_path, "w", encoding="utf-8") as fp:
fp.write(manifest_str)
except IOError as e:
except OSError as e:
self._LogError("cannot write manifest to : {} {}", manifest_path, e)
return None
return manifest_path

View File

@ -1,47 +1,9 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Provide event logging in the git trace2 EVENT format.
The git trace2 EVENT format is defined at:
https://www.kernel.org/pub/software/scm/git/docs/technical/api-trace2.html#_event_format
https://git-scm.com/docs/api-trace2#_the_event_format_target
Usage:
git_trace_log = EventLog()
git_trace_log.StartEvent()
...
git_trace_log.ExitEvent()
git_trace_log.Write()
"""
import datetime
import errno
import json
import os
import socket
import sys
import tempfile
import threading
from git_command import GitCommand
from git_command import GetEventTargetPath
from git_command import RepoSourceVersion
from git_trace2_event_log_base import BaseEventLog
class EventLog(object):
class EventLog(BaseEventLog):
"""Event log that records events that occurred during a repo invocation.
Events are written to the log as consecutive JSON entries, one per line.
@ -58,318 +20,13 @@ class EventLog(object):
https://git-scm.com/docs/api-trace2#_event_format
"""
def __init__(self, env=None):
"""Initializes the event log."""
self._log = []
# Try to get session-id (sid) from environment (setup in repo launcher).
KEY = "GIT_TRACE2_PARENT_SID"
if env is None:
env = os.environ
def __init__(self, **kwargs):
super().__init__(repo_source_version=RepoSourceVersion(), **kwargs)
self.start = datetime.datetime.utcnow()
# Save both our sid component and the complete sid.
# We use our sid component (self._sid) as the unique filename prefix and
# the full sid (self._full_sid) in the log itself.
self._sid = "repo-%s-P%08x" % (
self.start.strftime("%Y%m%dT%H%M%SZ"),
os.getpid(),
)
parent_sid = env.get(KEY)
# Append our sid component to the parent sid (if it exists).
if parent_sid is not None:
self._full_sid = parent_sid + "/" + self._sid
else:
self._full_sid = self._sid
# Set/update the environment variable.
# Environment handling across systems is messy.
try:
env[KEY] = self._full_sid
except UnicodeEncodeError:
env[KEY] = self._full_sid.encode()
# Add a version event to front of the log.
self._AddVersionEvent()
@property
def full_sid(self):
return self._full_sid
def _AddVersionEvent(self):
"""Adds a 'version' event at the beginning of current log."""
version_event = self._CreateEventDict("version")
version_event["evt"] = "2"
version_event["exe"] = RepoSourceVersion()
self._log.insert(0, version_event)
def _CreateEventDict(self, event_name):
"""Returns a dictionary with common keys/values for git trace2 events.
Args:
event_name: The event name.
Returns:
Dictionary with the common event fields populated.
"""
return {
"event": event_name,
"sid": self._full_sid,
"thread": threading.current_thread().name,
"time": datetime.datetime.utcnow().isoformat() + "Z",
}
def StartEvent(self):
"""Append a 'start' event to the current log."""
start_event = self._CreateEventDict("start")
start_event["argv"] = sys.argv
self._log.append(start_event)
def ExitEvent(self, result):
"""Append an 'exit' event to the current log.
Args:
result: Exit code of the event
"""
exit_event = self._CreateEventDict("exit")
# Consider 'None' success (consistent with event_log result handling).
if result is None:
result = 0
exit_event["code"] = result
time_delta = datetime.datetime.utcnow() - self.start
exit_event["t_abs"] = time_delta.total_seconds()
self._log.append(exit_event)
def CommandEvent(self, name, subcommands):
"""Append a 'command' event to the current log.
Args:
name: Name of the primary command (ex: repo, git)
subcommands: List of the sub-commands (ex: version, init, sync)
"""
command_event = self._CreateEventDict("command")
command_event["name"] = name
command_event["subcommands"] = subcommands
self._log.append(command_event)
def LogConfigEvents(self, config, event_dict_name):
"""Append a |event_dict_name| event for each config key in |config|.
Args:
config: Configuration dictionary.
event_dict_name: Name of the event dictionary for items to be logged
under.
"""
for param, value in config.items():
event = self._CreateEventDict(event_dict_name)
event["param"] = param
event["value"] = value
self._log.append(event)
def DefParamRepoEvents(self, config):
"""Append 'def_param' events for repo config keys to the current log.
This appends one event for each repo.* config key.
Args:
config: Repo configuration dictionary
"""
# Only output the repo.* config parameters.
repo_config = {k: v for k, v in config.items() if k.startswith("repo.")}
self.LogConfigEvents(repo_config, "def_param")
def GetDataEventName(self, value):
"""Returns 'data-json' if the value is an array else returns 'data'."""
return "data-json" if value[0] == "[" and value[-1] == "]" else "data"
def LogDataConfigEvents(self, config, prefix):
"""Append a 'data' event for each entry in |config| to the current log.
For each keyX and valueX in the config, the "key" field of the event is
'|prefix|/keyX' and the "value" field is valueX.
Args:
config: Configuration dictionary.
prefix: Prefix for each key that is logged.
"""
for key, value in config.items():
event = self._CreateEventDict(self.GetDataEventName(value))
event["key"] = f"{prefix}/{key}"
event["value"] = value
self._log.append(event)
def ErrorEvent(self, msg, fmt=None):
"""Append a 'error' event to the current log."""
error_event = self._CreateEventDict("error")
if fmt is None:
fmt = msg
error_event["msg"] = f"RepoErrorEvent:{msg}"
error_event["fmt"] = f"RepoErrorEvent:{fmt}"
self._log.append(error_event)
def _GetEventTargetPath(self):
"""Get the 'trace2.eventtarget' path from git configuration.
Returns:
path: git config's 'trace2.eventtarget' path if it exists, or None
"""
path = None
cmd = ["config", "--get", "trace2.eventtarget"]
# TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
# system git config variables.
p = GitCommand(
None, cmd, capture_stdout=True, capture_stderr=True, bare=True
)
retval = p.Wait()
if retval == 0:
# Strip trailing carriage-return in path.
path = p.stdout.rstrip("\n")
elif retval != 1:
# `git config --get` is documented to produce an exit status of `1`
# if the requested variable is not present in the configuration.
# Report any other return value as an error.
print(
"repo: error: 'git config --get' call failed with return code: "
"%r, stderr: %r" % (retval, p.stderr),
file=sys.stderr,
)
return path
def _WriteLog(self, write_fn):
"""Writes the log out using a provided writer function.
Generate compact JSON output for each item in the log, and write it
using write_fn.
Args:
write_fn: A function that accepts bytes and writes them to a
destination.
"""
for e in self._log:
# Dump in compact encoding mode.
# See 'Compact encoding' in Python docs:
# https://docs.python.org/3/library/json.html#module-json
write_fn(
json.dumps(e, indent=None, separators=(",", ":")).encode(
"utf-8"
)
+ b"\n"
)
def Write(self, path=None):
"""Writes the log out to a file or socket.
Log is only written if 'path' or 'git config --get trace2.eventtarget'
provide a valid path (or socket) to write logs to.
Logging filename format follows the git trace2 style of being a unique
(exclusive writable) file.
Args:
path: Path to where logs should be written. The path may have a
prefix of the form "af_unix:[{stream|dgram}:]", in which case
the path is treated as a Unix domain socket. See
https://git-scm.com/docs/api-trace2#_enabling_a_target for
details.
Returns:
log_path: Path to the log file or socket if log is written,
otherwise None
"""
log_path = None
# If no logging path is specified, get the path from
# 'trace2.eventtarget'.
def Write(self, path=None, **kwargs):
if path is None:
path = self._GetEventTargetPath()
return super().Write(path=path, **kwargs)
# If no logging path is specified, exit.
if path is None:
return None
path_is_socket = False
socket_type = None
if isinstance(path, str):
parts = path.split(":", 1)
if parts[0] == "af_unix" and len(parts) == 2:
path_is_socket = True
path = parts[1]
parts = path.split(":", 1)
if parts[0] == "stream" and len(parts) == 2:
socket_type = socket.SOCK_STREAM
path = parts[1]
elif parts[0] == "dgram" and len(parts) == 2:
socket_type = socket.SOCK_DGRAM
path = parts[1]
else:
# Get absolute path.
path = os.path.abspath(os.path.expanduser(path))
else:
raise TypeError("path: str required but got %s." % type(path))
# Git trace2 requires a directory to write log to.
# TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
if not (path_is_socket or os.path.isdir(path)):
return None
if path_is_socket:
if socket_type == socket.SOCK_STREAM or socket_type is None:
try:
with socket.socket(
socket.AF_UNIX, socket.SOCK_STREAM
) as sock:
sock.connect(path)
self._WriteLog(sock.sendall)
return f"af_unix:stream:{path}"
except OSError as err:
# If we tried to connect to a DGRAM socket using STREAM,
# ignore the attempt and continue to DGRAM below. Otherwise,
# issue a warning.
if err.errno != errno.EPROTOTYPE:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
if socket_type == socket.SOCK_DGRAM or socket_type is None:
try:
with socket.socket(
socket.AF_UNIX, socket.SOCK_DGRAM
) as sock:
self._WriteLog(lambda bs: sock.sendto(bs, path))
return f"af_unix:dgram:{path}"
except OSError as err:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
# Tried to open a socket but couldn't connect (SOCK_STREAM) or write
# (SOCK_DGRAM).
print(
"repo: warning: git trace2 logging failed: could not write to "
"socket",
file=sys.stderr,
)
return None
# Path is an absolute path
# Use NamedTemporaryFile to generate a unique filename as required by
# git trace2.
try:
with tempfile.NamedTemporaryFile(
mode="xb", prefix=self._sid, dir=path, delete=False
) as f:
# TODO(https://crbug.com/gerrit/13706): Support writing events
# as they occur.
self._WriteLog(f.write)
log_path = f.name
except FileExistsError as err:
print(
"repo: warning: git trace2 logging failed: %r" % err,
file=sys.stderr,
)
return None
return log_path
def _GetEventTargetPath(self):
return GetEventTargetPath()

View File

@ -0,0 +1,354 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Provide event logging in the git trace2 EVENT format.
The git trace2 EVENT format is defined at:
https://www.kernel.org/pub/software/scm/git/docs/technical/api-trace2.html#_event_format
https://git-scm.com/docs/api-trace2#_the_event_format_target
Usage:
git_trace_log = EventLog()
git_trace_log.StartEvent()
...
git_trace_log.ExitEvent()
git_trace_log.Write()
"""
import datetime
import errno
import json
import os
import socket
import sys
import tempfile
import threading
# Timeout when sending events via socket (applies to connect, send)
SOCK_TIMEOUT = 0.5 # in seconds
# BaseEventLog __init__ Counter that is consistent within the same process
p_init_count = 0
class BaseEventLog:
"""Event log that records events that occurred during a repo invocation.
Events are written to the log as consecutive JSON entries, one per line.
Entries follow the git trace2 EVENT format.
Each entry contains the following common keys:
- event: The event name
- sid: session-id - Unique string to allow process instance to be
identified.
- thread: The thread name.
- time: The UTC time of the event.
Valid 'event' names and event specific fields are documented here:
https://git-scm.com/docs/api-trace2#_event_format
"""
def __init__(
self, env=None, repo_source_version=None, add_init_count=False
):
"""Initializes the event log."""
global p_init_count
p_init_count += 1
self._log = []
# Try to get session-id (sid) from environment (setup in repo launcher).
KEY = "GIT_TRACE2_PARENT_SID"
if env is None:
env = os.environ
self.start = datetime.datetime.now(datetime.timezone.utc)
# Save both our sid component and the complete sid.
# We use our sid component (self._sid) as the unique filename prefix and
# the full sid (self._full_sid) in the log itself.
self._sid = (
f"repo-{self.start.strftime('%Y%m%dT%H%M%SZ')}-P{os.getpid():08x}"
)
if add_init_count:
self._sid = f"{self._sid}-{p_init_count}"
parent_sid = env.get(KEY)
# Append our sid component to the parent sid (if it exists).
if parent_sid is not None:
self._full_sid = parent_sid + "/" + self._sid
else:
self._full_sid = self._sid
# Set/update the environment variable.
# Environment handling across systems is messy.
try:
env[KEY] = self._full_sid
except UnicodeEncodeError:
env[KEY] = self._full_sid.encode()
if repo_source_version is not None:
# Add a version event to front of the log.
self._AddVersionEvent(repo_source_version)
@property
def full_sid(self):
return self._full_sid
def _AddVersionEvent(self, repo_source_version):
"""Adds a 'version' event at the beginning of current log."""
version_event = self._CreateEventDict("version")
version_event["evt"] = "2"
version_event["exe"] = repo_source_version
self._log.insert(0, version_event)
def _CreateEventDict(self, event_name):
"""Returns a dictionary with common keys/values for git trace2 events.
Args:
event_name: The event name.
Returns:
Dictionary with the common event fields populated.
"""
return {
"event": event_name,
"sid": self._full_sid,
"thread": threading.current_thread().name,
"time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
}
def StartEvent(self):
"""Append a 'start' event to the current log."""
start_event = self._CreateEventDict("start")
start_event["argv"] = sys.argv
self._log.append(start_event)
def ExitEvent(self, result):
"""Append an 'exit' event to the current log.
Args:
result: Exit code of the event
"""
exit_event = self._CreateEventDict("exit")
# Consider 'None' success (consistent with event_log result handling).
if result is None:
result = 0
exit_event["code"] = result
time_delta = datetime.datetime.now(datetime.timezone.utc) - self.start
exit_event["t_abs"] = time_delta.total_seconds()
self._log.append(exit_event)
def CommandEvent(self, name, subcommands):
"""Append a 'command' event to the current log.
Args:
name: Name of the primary command (ex: repo, git)
subcommands: List of the sub-commands (ex: version, init, sync)
"""
command_event = self._CreateEventDict("command")
command_event["name"] = name
command_event["subcommands"] = subcommands
self._log.append(command_event)
def LogConfigEvents(self, config, event_dict_name):
"""Append a |event_dict_name| event for each config key in |config|.
Args:
config: Configuration dictionary.
event_dict_name: Name of the event dictionary for items to be logged
under.
"""
for param, value in config.items():
event = self._CreateEventDict(event_dict_name)
event["param"] = param
event["value"] = value
self._log.append(event)
def DefParamRepoEvents(self, config):
"""Append 'def_param' events for repo config keys to the current log.
This appends one event for each repo.* config key.
Args:
config: Repo configuration dictionary
"""
# Only output the repo.* config parameters.
repo_config = {k: v for k, v in config.items() if k.startswith("repo.")}
self.LogConfigEvents(repo_config, "def_param")
def GetDataEventName(self, value):
"""Returns 'data-json' if the value is an array else returns 'data'."""
return "data-json" if value[0] == "[" and value[-1] == "]" else "data"
def LogDataConfigEvents(self, config, prefix):
"""Append a 'data' event for each entry in |config| to the current log.
For each keyX and valueX in the config, the "key" field of the event is
'|prefix|/keyX' and the "value" field is valueX.
Args:
config: Configuration dictionary.
prefix: Prefix for each key that is logged.
"""
for key, value in config.items():
event = self._CreateEventDict(self.GetDataEventName(value))
event["key"] = f"{prefix}/{key}"
event["value"] = value
self._log.append(event)
def ErrorEvent(self, msg, fmt=None):
"""Append a 'error' event to the current log."""
error_event = self._CreateEventDict("error")
if fmt is None:
fmt = msg
error_event["msg"] = f"RepoErrorEvent:{msg}"
error_event["fmt"] = f"RepoErrorEvent:{fmt}"
self._log.append(error_event)
def _WriteLog(self, write_fn):
"""Writes the log out using a provided writer function.
Generate compact JSON output for each item in the log, and write it
using write_fn.
Args:
write_fn: A function that accepts bytes and writes them to a
destination.
"""
for e in self._log:
# Dump in compact encoding mode.
# See 'Compact encoding' in Python docs:
# https://docs.python.org/3/library/json.html#module-json
write_fn(
json.dumps(e, indent=None, separators=(",", ":")).encode(
"utf-8"
)
+ b"\n"
)
def Write(self, path=None):
"""Writes the log out to a file or socket.
Log is only written if 'path' or 'git config --get trace2.eventtarget'
provide a valid path (or socket) to write logs to.
Logging filename format follows the git trace2 style of being a unique
(exclusive writable) file.
Args:
path: Path to where logs should be written. The path may have a
prefix of the form "af_unix:[{stream|dgram}:]", in which case
the path is treated as a Unix domain socket. See
https://git-scm.com/docs/api-trace2#_enabling_a_target for
details.
Returns:
log_path: Path to the log file or socket if log is written,
otherwise None
"""
log_path = None
# If no logging path is specified, exit.
if path is None:
return None
path_is_socket = False
socket_type = None
if isinstance(path, str):
parts = path.split(":", 1)
if parts[0] == "af_unix" and len(parts) == 2:
path_is_socket = True
path = parts[1]
parts = path.split(":", 1)
if parts[0] == "stream" and len(parts) == 2:
socket_type = socket.SOCK_STREAM
path = parts[1]
elif parts[0] == "dgram" and len(parts) == 2:
socket_type = socket.SOCK_DGRAM
path = parts[1]
else:
# Get absolute path.
path = os.path.abspath(os.path.expanduser(path))
else:
raise TypeError("path: str required but got %s." % type(path))
# Git trace2 requires a directory to write log to.
# TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
if not (path_is_socket or os.path.isdir(path)):
return None
if path_is_socket:
if socket_type == socket.SOCK_STREAM or socket_type is None:
try:
with socket.socket(
socket.AF_UNIX, socket.SOCK_STREAM
) as sock:
sock.settimeout(SOCK_TIMEOUT)
sock.connect(path)
self._WriteLog(sock.sendall)
return f"af_unix:stream:{path}"
except OSError as err:
# If we tried to connect to a DGRAM socket using STREAM,
# ignore the attempt and continue to DGRAM below. Otherwise,
# issue a warning.
if err.errno != errno.EPROTOTYPE:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
if socket_type == socket.SOCK_DGRAM or socket_type is None:
try:
with socket.socket(
socket.AF_UNIX, socket.SOCK_DGRAM
) as sock:
self._WriteLog(lambda bs: sock.sendto(bs, path))
return f"af_unix:dgram:{path}"
except OSError as err:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
# Tried to open a socket but couldn't connect (SOCK_STREAM) or write
# (SOCK_DGRAM).
print(
"repo: warning: git trace2 logging failed: could not write to "
"socket",
file=sys.stderr,
)
return None
# Path is an absolute path
# Use NamedTemporaryFile to generate a unique filename as required by
# git trace2.
try:
with tempfile.NamedTemporaryFile(
mode="xb", prefix=self._sid, dir=path, delete=False
) as f:
# TODO(https://crbug.com/gerrit/13706): Support writing events
# as they occur.
self._WriteLog(f.write)
log_path = f.name
except FileExistsError as err:
print(
"repo: warning: git trace2 logging failed: %r" % err,
file=sys.stderr,
)
return None
return log_path
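A short usage sketch of the extracted base class (illustration only; the target directory is an assumption, and `Write()` only writes when given an existing directory or an `af_unix:` socket path):

```python
log = BaseEventLog(repo_source_version="2.44")
log.StartEvent()
log.ExitEvent(0)
# Returns the path of the unique event file, or None if nothing was written.
print(log.Write("/tmp/repo-trace2"))
```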

View File

@ -12,11 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import errno
import json
import os
import re
import subprocess
import sys
import traceback
import urllib.parse
@ -25,7 +22,7 @@ from error import HookError
from git_refs import HEAD
class RepoHook(object):
class RepoHook:
"""A RepoHook contains information about a script to run as a hook.
Hooks are used to run a python script before running an upload (for
@ -183,7 +180,7 @@ class RepoHook(object):
abort_if_user_denies was passed to the constructor.
"""
hooks_config = self._hooks_project.config
git_approval_key = "repo.hooks.%s.%s" % (self._hook_type, subkey)
git_approval_key = f"repo.hooks.{self._hook_type}.{subkey}"
# Get the last value that the user approved for this hook; may be None.
old_val = hooks_config.GetString(git_approval_key)
@ -196,7 +193,7 @@ class RepoHook(object):
else:
# Give the user a reason why we're prompting, since they last
# told us to "never ask again".
prompt = "WARNING: %s\n\n" % (changed_prompt,)
prompt = f"WARNING: {changed_prompt}\n\n"
else:
prompt = ""
@ -244,9 +241,8 @@ class RepoHook(object):
return self._CheckForHookApprovalHelper(
"approvedmanifest",
self._manifest_url,
"Run hook scripts from %s" % (self._manifest_url,),
"Manifest URL has changed since %s was allowed."
% (self._hook_type,),
f"Run hook scripts from {self._manifest_url}",
f"Manifest URL has changed since {self._hook_type} was allowed.",
)
def _CheckForHookApprovalHash(self):
@ -265,7 +261,7 @@ class RepoHook(object):
"approvedhash",
self._GetHash(),
prompt % (self._GetMustVerb(), self._script_fullpath),
"Scripts have changed since %s was allowed." % (self._hook_type,),
f"Scripts have changed since {self._hook_type} was allowed.",
)
@staticmethod
@ -298,43 +294,6 @@ class RepoHook(object):
return interp
def _ExecuteHookViaReexec(self, interp, context, **kwargs):
"""Execute the hook script through |interp|.
Note: Support for this feature should be dropped ~Jun 2021.
Args:
interp: The Python program to run.
context: Basic Python context to execute the hook inside.
kwargs: Arbitrary arguments to pass to the hook script.
Raises:
HookError: When the hooks failed for any reason.
"""
# This logic needs to be kept in sync with _ExecuteHookViaImport below.
script = """
import json, os, sys
path = '''%(path)s'''
kwargs = json.loads('''%(kwargs)s''')
context = json.loads('''%(context)s''')
sys.path.insert(0, os.path.dirname(path))
data = open(path).read()
exec(compile(data, path, 'exec'), context)
context['main'](**kwargs)
""" % {
"path": self._script_fullpath,
"kwargs": json.dumps(kwargs),
"context": json.dumps(context),
}
# We pass the script via stdin to avoid OS argv limits. It also makes
# unhandled exception tracebacks less verbose/confusing for users.
cmd = [interp, "-c", "import sys; exec(sys.stdin.read())"]
proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
proc.communicate(input=script.encode("utf-8"))
if proc.returncode:
raise HookError("Failed to run %s hook." % (self._hook_type,))
def _ExecuteHookViaImport(self, data, context, **kwargs):
"""Execute the hook code in |data| directly.
@ -412,30 +371,13 @@ context['main'](**kwargs)
# See what version of python the hook has been written against.
data = open(self._script_fullpath).read()
interp = self._ExtractInterpFromShebang(data)
reexec = False
if interp:
prog = os.path.basename(interp)
if prog.startswith("python2") and sys.version_info.major != 2:
reexec = True
elif prog.startswith("python3") and sys.version_info.major == 2:
reexec = True
# Attempt to execute the hooks through the requested version of
# Python.
if reexec:
try:
self._ExecuteHookViaReexec(interp, context, **kwargs)
except OSError as e:
if e.errno == errno.ENOENT:
# We couldn't find the interpreter, so fallback to
# importing.
reexec = False
else:
raise
if prog.startswith("python2"):
raise HookError("Python 2 is not supported")
# Run the hook by importing directly.
if not reexec:
self._ExecuteHookViaImport(data, context, **kwargs)
self._ExecuteHookViaImport(data, context, **kwargs)
finally:
# Restore sys.path and CWD.
sys.path = orig_syspath

View File

@ -1,5 +1,5 @@
#!/bin/sh
# From Gerrit Code Review 3.6.1 c67916dbdc07555c44e32a68f92ffc484b9b34f0
# From Gerrit Code Review 3.10.0 d5403dbf335ba7d48977fc95170c3f7027c34659
#
# Part of Gerrit Code Review (https://www.gerritcodereview.com/)
#
@ -31,14 +31,21 @@ if test ! -f "$1" ; then
fi
# Do not create a change id if requested
if test "false" = "$(git config --bool --get gerrit.createChangeId)" ; then
exit 0
fi
create_setting=$(git config --get gerrit.createChangeId)
case "$create_setting" in
false)
exit 0
;;
always)
;;
*)
# Do not create a change id for squash/fixup commits.
if head -n1 "$1" | LC_ALL=C grep -q '^[a-z][a-z]*! '; then
exit 0
fi
;;
esac
# Do not create a change id for squash commits.
if head -n1 "$1" | grep -q '^squash! '; then
exit 0
fi
if git rev-parse --verify HEAD >/dev/null 2>&1; then
refhash="$(git rev-parse HEAD)"
@ -51,7 +58,7 @@ dest="$1.tmp.${random}"
trap 'rm -f "$dest" "$dest-2"' EXIT
if ! git stripspace --strip-comments < "$1" > "${dest}" ; then
if ! cat "$1" | sed -e '/>8/q' | git stripspace --strip-comments > "${dest}" ; then
echo "cannot strip comments from $1"
exit 1
fi
@ -65,7 +72,7 @@ reviewurl="$(git config --get gerrit.reviewUrl)"
if test -n "${reviewurl}" ; then
token="Link"
value="${reviewurl%/}/id/I$random"
pattern=".*/id/I[0-9a-f]\{40\}$"
pattern=".*/id/I[0-9a-f]\{40\}"
else
token="Change-Id"
value="I$random"
@ -92,7 +99,7 @@ fi
# Avoid the --where option which only appeared in Git 2.15
if ! git -c trailer.where=before interpret-trailers \
--trailer "Signed-off-by: $token: $value" < "$dest-2" |
sed -re "s/^Signed-off-by: ($token: )/\1/" \
sed -e "s/^Signed-off-by: \($token: \)/\1/" \
-e "/^Signed-off-by: SENTINEL/d" > "$dest" ; then
echo "cannot insert $token line in $1"
exit 1

main.py
View File

@ -32,6 +32,8 @@ import textwrap
import time
import urllib.request
from repo_logging import RepoLogger
try:
import kerberos
@ -46,6 +48,7 @@ from error import DownloadError
from error import GitcUnsupportedError
from error import InvalidProjectGroupsError
from error import ManifestInvalidRevisionError
from error import ManifestParseError
from error import NoManifestException
from error import NoSuchProjectError
from error import RepoChangedException
@ -69,6 +72,9 @@ from wrapper import Wrapper
from wrapper import WrapperPath
logger = RepoLogger(__file__)
# NB: These do not need to be kept in sync with the repo launcher script.
# These may be much newer as it allows the repo launcher to roll between
# different repo releases while source versions might require a newer python.
@ -81,27 +87,19 @@ from wrapper import WrapperPath
MIN_PYTHON_VERSION_SOFT = (3, 6)
MIN_PYTHON_VERSION_HARD = (3, 6)
if sys.version_info.major < 3:
print(
"repo: error: Python 2 is no longer supported; "
"Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr,
if sys.version_info < MIN_PYTHON_VERSION_HARD:
logger.error(
"repo: error: Python version is too old; "
"Please upgrade to Python %d.%d+.",
*MIN_PYTHON_VERSION_SOFT,
)
sys.exit(1)
else:
if sys.version_info < MIN_PYTHON_VERSION_HARD:
print(
"repo: error: Python 3 version is too old; "
"Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr,
)
sys.exit(1)
elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
print(
"repo: warning: your Python 3 version is no longer supported; "
"Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr,
)
elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
logger.error(
"repo: warning: your Python version is no longer supported; "
"Please upgrade to Python %d.%d+.",
*MIN_PYTHON_VERSION_SOFT,
)
KEYBOARD_INTERRUPT_EXIT = 128 + signal.SIGINT
MAX_PRINT_ERRORS = 5
@ -189,7 +187,7 @@ global_options.add_option(
)
class _Repo(object):
class _Repo:
def __init__(self, repodir):
self.repodir = repodir
self.commands = all_commands
@ -201,9 +199,8 @@ class _Repo(object):
if short:
commands = " ".join(sorted(self.commands))
wrapped_commands = textwrap.wrap(commands, width=77)
print(
"Available commands:\n %s" % ("\n ".join(wrapped_commands),)
)
help_commands = "".join(f"\n {x}" for x in wrapped_commands)
print(f"Available commands:{help_commands}")
print("\nRun `repo help <command>` for command-specific details.")
print("Bug reports:", Wrapper().BUG_URL)
else:
@ -239,7 +236,7 @@ class _Repo(object):
if name in self.commands:
return name, []
key = "alias.%s" % (name,)
key = f"alias.{name}"
alias = RepoConfig.ForRepository(self.repodir).GetString(key)
if alias is None:
alias = RepoConfig.ForUser().GetString(key)
@ -273,10 +270,14 @@ class _Repo(object):
self._PrintHelp(short=True)
return 1
run = lambda: self._RunLong(name, gopts, argv) or 0
git_trace2_event_log = EventLog()
run = (
lambda: self._RunLong(name, gopts, argv, git_trace2_event_log) or 0
)
with Trace(
"starting new command: %s",
"starting new command: %s [sid=%s]",
", ".join([name] + argv),
git_trace2_event_log.full_sid,
first_trace=True,
):
if gopts.trace_python:
@ -293,12 +294,11 @@ class _Repo(object):
result = run()
return result
def _RunLong(self, name, gopts, argv):
def _RunLong(self, name, gopts, argv, git_trace2_event_log):
"""Execute the (longer running) requested subcommand."""
result = 0
SetDefaultColoring(gopts.color)
git_trace2_event_log = EventLog()
outer_client = RepoClient(self.repodir)
repo_client = outer_client
if gopts.submanifest_path:
@ -309,7 +309,7 @@ class _Repo(object):
)
if Wrapper().gitc_parse_clientdir(os.getcwd()):
print("GITC is not supported.", file=sys.stderr)
logger.error("GITC is not supported.")
raise GitcUnsupportedError()
try:
@ -322,32 +322,24 @@ class _Repo(object):
git_event_log=git_trace2_event_log,
)
except KeyError:
print(
"repo: '%s' is not a repo command. See 'repo help'." % name,
file=sys.stderr,
logger.error(
"repo: '%s' is not a repo command. See 'repo help'.", name
)
return 1
Editor.globalConfig = cmd.client.globalConfig
if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror:
print(
"fatal: '%s' requires a working directory" % name,
file=sys.stderr,
)
logger.error("fatal: '%s' requires a working directory", name)
return 1
try:
copts, cargs = cmd.OptionParser.parse_args(argv)
copts = cmd.ReadEnvironmentOptions(copts)
except NoManifestException as e:
print(
"error: in `%s`: %s" % (" ".join([name] + argv), str(e)),
file=sys.stderr,
)
print(
"error: manifest missing or unreadable -- please run init",
file=sys.stderr,
logger.error("error: in `%s`: %s", " ".join([name] + argv), e)
logger.error(
"error: manifest missing or unreadable -- please run init"
)
return 1
@ -451,36 +443,31 @@ class _Repo(object):
except (
DownloadError,
ManifestInvalidRevisionError,
ManifestParseError,
NoManifestException,
) as e:
print(
"error: in `%s`: %s" % (" ".join([name] + argv), str(e)),
file=sys.stderr,
)
logger.error("error: in `%s`: %s", " ".join([name] + argv), e)
if isinstance(e, NoManifestException):
print(
"error: manifest missing or unreadable -- please run init",
file=sys.stderr,
logger.error(
"error: manifest missing or unreadable -- please run init"
)
result = e.exit_code
except NoSuchProjectError as e:
if e.name:
print("error: project %s not found" % e.name, file=sys.stderr)
logger.error("error: project %s not found", e.name)
else:
print("error: no project in current directory", file=sys.stderr)
logger.error("error: no project in current directory")
result = e.exit_code
except InvalidProjectGroupsError as e:
if e.name:
print(
"error: project group must be enabled for project %s"
% e.name,
file=sys.stderr,
logger.error(
"error: project group must be enabled for project %s",
e.name,
)
else:
print(
logger.error(
"error: project group must be enabled for the project in "
"the current directory",
file=sys.stderr,
"the current directory"
)
result = e.exit_code
except SystemExit as e:
@ -547,7 +534,7 @@ def _CheckWrapperVersion(ver_str, repo_path):
repo_path = "~/bin/repo"
if not ver_str:
print("no --wrapper-version argument", file=sys.stderr)
logger.error("no --wrapper-version argument")
sys.exit(1)
# Pull out the version of the repo launcher we know about to compare.
@ -556,7 +543,7 @@ def _CheckWrapperVersion(ver_str, repo_path):
exp_str = ".".join(map(str, exp))
if ver < MIN_REPO_VERSION:
print(
logger.error(
"""
repo: error:
!!! Your version of repo %s is too old.
@ -565,42 +552,44 @@ repo: error:
!!! You must upgrade before you can continue:
cp %s %s
"""
% (ver_str, min_str, exp_str, WrapperPath(), repo_path),
file=sys.stderr,
""",
ver_str,
min_str,
exp_str,
WrapperPath(),
repo_path,
)
sys.exit(1)
if exp > ver:
print(
"\n... A new version of repo (%s) is available." % (exp_str,),
file=sys.stderr,
logger.warning(
"\n... A new version of repo (%s) is available.", exp_str
)
if os.access(repo_path, os.W_OK):
print(
logger.warning(
"""\
... You should upgrade soon:
cp %s %s
"""
% (WrapperPath(), repo_path),
file=sys.stderr,
""",
WrapperPath(),
repo_path,
)
else:
print(
logger.warning(
"""\
... New version is available at: %s
... The launcher is run from: %s
!!! The launcher is not writable. Please talk to your sysadmin or distro
!!! to get an update installed.
"""
% (WrapperPath(), repo_path),
file=sys.stderr,
""",
WrapperPath(),
repo_path,
)
def _CheckRepoDir(repo_dir):
if not repo_dir:
print("no --repo-dir argument", file=sys.stderr)
logger.error("no --repo-dir argument")
sys.exit(1)
@ -804,7 +793,7 @@ def init_http():
mgr.add_password(p[1], "https://%s/" % host, p[0], p[2])
except netrc.NetrcParseError:
pass
except IOError:
except OSError:
pass
handlers.append(_BasicAuthHandler(mgr))
handlers.append(_DigestAuthHandler(mgr))
@ -861,18 +850,7 @@ def _Main(argv):
result = repo._Run(name, gopts, argv) or 0
except RepoExitError as e:
if not isinstance(e, SilentRepoExitError):
exception_name = type(e).__name__
print("fatal: %s" % e, file=sys.stderr)
if e.aggregate_errors:
print(f"{exception_name} Aggregate Errors")
for err in e.aggregate_errors[:MAX_PRINT_ERRORS]:
print(err)
if (
e.aggregate_errors
and len(e.aggregate_errors) > MAX_PRINT_ERRORS
):
diff = len(e.aggregate_errors) - MAX_PRINT_ERRORS
print(f"+{diff} additional errors ...")
logger.log_aggregated_errors(e)
result = e.exit_code
except KeyboardInterrupt:
print("aborted by user", file=sys.stderr)

View File

@ -114,12 +114,40 @@ def XmlInt(node, attr, default=None):
try:
return int(value)
except ValueError:
raise ManifestParseError(
'manifest: invalid %s="%s" integer' % (attr, value)
)
raise ManifestParseError(f'manifest: invalid {attr}="{value}" integer')
class _Default(object):
def normalize_url(url: str) -> str:
"""Mutate input 'url' into normalized form:
* remove trailing slashes
* convert SCP-like syntax to SSH URL
Args:
url: URL to modify
Returns:
The normalized URL.
"""
url = url.rstrip("/")
parsed_url = urllib.parse.urlparse(url)
# This matches patterns like "git@github.com:foo".
scp_like_url_re = r"^[^/:]+@[^/:]+:[^/]+"
# If our URL is missing a schema and matches git's
# SCP-like syntax we should convert it to a proper
# SSH URL instead to make urljoin() happier.
#
# See: https://git-scm.com/docs/git-clone#URLS
if not parsed_url.scheme and re.match(scp_like_url_re, url):
return "ssh://" + url.replace(":", "/", 1)
return url
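Examples of the normalization rules described above (illustration only):

```python
# Trailing slashes are stripped.
assert normalize_url("https://example.com/repo/") == "https://example.com/repo"
# SCP-like syntax becomes a proper SSH URL so urljoin() can handle it.
assert (
    normalize_url("git@example.com:group/project")
    == "ssh://git@example.com/group/project"
)
```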
class _Default:
"""Project defaults within the manifest."""
revisionExpr = None
@ -142,7 +170,7 @@ class _Default(object):
return self.__dict__ != other.__dict__
class _XmlRemote(object):
class _XmlRemote:
def __init__(
self,
name,
@ -182,20 +210,22 @@ class _XmlRemote(object):
def _resolveFetchUrl(self):
if self.fetchUrl is None:
return ""
url = self.fetchUrl.rstrip("/")
manifestUrl = self.manifestUrl.rstrip("/")
# urljoin will get confused over quite a few things. The ones we care
# about here are:
# * no scheme in the base url, like <hostname:port>
# We handle no scheme by replacing it with an obscure protocol, gopher
# and then replacing it with the original when we are done.
if manifestUrl.find(":") != manifestUrl.find("/") - 1:
url = urllib.parse.urljoin("gopher://" + manifestUrl, url)
url = re.sub(r"^gopher://", "", url)
fetch_url = normalize_url(self.fetchUrl)
manifest_url = normalize_url(self.manifestUrl)
# urljoin doesn't like URLs with no scheme in the base URL
# such as file paths. We handle this by prefixing it with
# an obscure protocol, gopher, and replacing it with the
# original after urljoin
if manifest_url.find(":") != manifest_url.find("/") - 1:
fetch_url = urllib.parse.urljoin(
"gopher://" + manifest_url, fetch_url
)
fetch_url = re.sub(r"^gopher://", "", fetch_url)
else:
url = urllib.parse.urljoin(manifestUrl, url)
return url
fetch_url = urllib.parse.urljoin(manifest_url, fetch_url)
return fetch_url
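Why the `gopher://` trick works (illustration only; the manifest URL below is made up): urllib treats gopher as a scheme that supports relative resolution, so a scheme-less `host:port/path` base can still be joined against, and the temporary prefix is stripped afterwards.

```python
import re
import urllib.parse

manifest_url = "example.com:29418/manifests/default"  # no scheme
fetch_url = urllib.parse.urljoin("gopher://" + manifest_url, "..")
print(re.sub(r"^gopher://", "", fetch_url))  # example.com:29418/
```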
def ToRemoteSpec(self, projectName):
fetchUrl = self.resolvedFetchUrl.rstrip("/")
@ -275,7 +305,7 @@ class _XmlSubmanifest:
parent.repodir,
linkFile,
parent_groups=",".join(groups) or "",
submanifest_path=self.relpath,
submanifest_path=os.path.join(parent.path_prefix, self.relpath),
outer_client=outer_client,
default_groups=default_groups,
)
@ -354,7 +384,7 @@ class SubmanifestSpec:
self.groups = groups or []
class XmlManifest(object):
class XmlManifest:
"""manages the repo configuration file"""
def __init__(
@ -727,10 +757,10 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self._output_manifest_project_extras(p, e)
if p.subprojects:
subprojects = set(subp.name for subp in p.subprojects)
subprojects = {subp.name for subp in p.subprojects}
output_projects(p, e, list(sorted(subprojects)))
projects = set(p.name for p in self._paths.values() if not p.parent)
projects = {p.name for p in self._paths.values() if not p.parent}
output_projects(None, root, list(sorted(projects)))
if self._repo_hooks_project:
@ -800,17 +830,17 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
for child in node.childNodes:
if child.nodeType == xml.dom.Node.ELEMENT_NODE:
attrs = child.attributes
element = dict(
(attrs.item(i).localName, attrs.item(i).value)
element = {
attrs.item(i).localName: attrs.item(i).value
for i in range(attrs.length)
)
}
if child.nodeName in SINGLE_ELEMENTS:
ret[child.nodeName] = element
elif child.nodeName in MULTI_ELEMENTS:
ret.setdefault(child.nodeName, []).append(element)
else:
raise ManifestParseError(
'Unhandled element "%s"' % (child.nodeName,)
f'Unhandled element "{child.nodeName}"'
)
append_children(element, child)
@ -857,8 +887,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self._Load()
outer = self._outer_client
yield outer
for tree in outer.all_children:
yield tree
yield from outer.all_children
@property
def all_children(self):
@ -867,8 +896,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
for child in self._submanifests.values():
if child.repo_client:
yield child.repo_client
for tree in child.repo_client.all_children:
yield tree
yield from child.repo_client.all_children
@property
def path_prefix(self):
@ -987,7 +1015,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
@property
def PartialCloneExclude(self):
exclude = self.manifest.manifestProject.partial_clone_exclude or ""
return set(x.strip() for x in exclude.split(","))
return {x.strip() for x in exclude.split(",")}
def SetManifestOverride(self, path):
"""Override manifestFile. The caller must call Unload()"""
@ -1260,18 +1288,19 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
try:
root = xml.dom.minidom.parse(path)
except (OSError, xml.parsers.expat.ExpatError) as e:
raise ManifestParseError(
"error parsing manifest %s: %s" % (path, e)
)
raise ManifestParseError(f"error parsing manifest {path}: {e}")
if not root or not root.childNodes:
raise ManifestParseError("no root node in %s" % (path,))
raise ManifestParseError(f"no root node in {path}")
for manifest in root.childNodes:
if manifest.nodeName == "manifest":
if (
manifest.nodeType == manifest.ELEMENT_NODE
and manifest.nodeName == "manifest"
):
break
else:
raise ManifestParseError("no <manifest> in %s" % (path,))
raise ManifestParseError(f"no <manifest> in {path}")
nodes = []
for node in manifest.childNodes:
@ -1281,7 +1310,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
msg = self._CheckLocalPath(name)
if msg:
raise ManifestInvalidPathError(
'<include> invalid "name": %s: %s' % (name, msg)
f'<include> invalid "name": {name}: {msg}'
)
include_groups = ""
if parent_groups:
@ -1313,7 +1342,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
raise
except Exception as e:
raise ManifestParseError(
"failed parsing included manifest %s: %s" % (name, e)
f"failed parsing included manifest {name}: {e}"
)
else:
if parent_groups and node.nodeName == "project":
@ -1764,13 +1793,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
msg = self._CheckLocalPath(name)
if msg:
raise ManifestInvalidPathError(
'<submanifest> invalid "name": %s: %s' % (name, msg)
f'<submanifest> invalid "name": {name}: {msg}'
)
else:
msg = self._CheckLocalPath(path)
if msg:
raise ManifestInvalidPathError(
'<submanifest> invalid "path": %s: %s' % (path, msg)
f'<submanifest> invalid "path": {path}: {msg}'
)
submanifest = _XmlSubmanifest(
@ -1805,7 +1834,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
msg = self._CheckLocalPath(name, dir_ok=True)
if msg:
raise ManifestInvalidPathError(
'<project> invalid "name": %s: %s' % (name, msg)
f'<project> invalid "name": {name}: {msg}'
)
if parent:
name = self._JoinName(parent.name, name)
@ -1815,7 +1844,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
remote = self._default.remote
if remote is None:
raise ManifestParseError(
"no remote for project %s within %s" % (name, self.manifestFile)
f"no remote for project {name} within {self.manifestFile}"
)
revisionExpr = node.getAttribute("revision") or remote.revision
@ -1836,7 +1865,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
msg = self._CheckLocalPath(path, dir_ok=True, cwd_dot_ok=True)
if msg:
raise ManifestInvalidPathError(
'<project> invalid "path": %s: %s' % (path, msg)
f'<project> invalid "path": {path}: {msg}'
)
rebase = XmlBool(node, "rebase", True)
@ -2093,7 +2122,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if not cwd_dot_ok or parts != ["."]:
for part in set(parts):
if part in {".", "..", ".git"} or part.startswith(".repo"):
return "bad component: %s" % (part,)
return f"bad component: {part}"
if not dir_ok and resep.match(path[-1]):
return "dirs not allowed"
@ -2129,7 +2158,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
msg = cls._CheckLocalPath(dest)
if msg:
raise ManifestInvalidPathError(
'<%s> invalid "dest": %s: %s' % (element, dest, msg)
f'<{element}> invalid "dest": {dest}: {msg}'
)
# |src| is the file we read from or path we point to for symlinks.
@ -2140,7 +2169,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
)
if msg:
raise ManifestInvalidPathError(
'<%s> invalid "src": %s: %s' % (element, src, msg)
f'<{element}> invalid "src": {src}: {msg}'
)
def _ParseCopyFile(self, project, node):
@ -2184,7 +2213,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
v = self._remotes.get(name)
if not v:
raise ManifestParseError(
"remote %s not defined in %s" % (name, self.manifestFile)
f"remote {name} not defined in {self.manifestFile}"
)
return v
@ -2210,7 +2239,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
toProjects = manifest.paths
fromKeys = sorted(fromProjects.keys())
toKeys = sorted(toProjects.keys())
toKeys = set(toProjects.keys())
diff = {
"added": [],
@ -2221,13 +2250,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
}
for proj in fromKeys:
fromProj = fromProjects[proj]
if proj not in toKeys:
diff["removed"].append(fromProjects[proj])
elif not fromProjects[proj].Exists:
diff["removed"].append(fromProj)
elif not fromProj.Exists:
diff["missing"].append(toProjects[proj])
toKeys.remove(proj)
else:
fromProj = fromProjects[proj]
toProj = toProjects[proj]
try:
fromRevId = fromProj.GetCommitRevisionId()
@ -2239,8 +2268,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
diff["changed"].append((fromProj, toProj))
toKeys.remove(proj)
for proj in toKeys:
diff["added"].append(toProjects[proj])
diff["added"].extend(toProjects[proj] for proj in sorted(toKeys))
return diff


@ -57,8 +57,8 @@ def _validate_winpath(path):
if _winpath_is_valid(path):
return path
raise ValueError(
'Path "{}" must be a relative path or an absolute '
"path starting with a drive letter".format(path)
f'Path "{path}" must be a relative path or an absolute '
"path starting with a drive letter"
)
@ -193,10 +193,9 @@ def _walk_windows_impl(top, topdown, onerror, followlinks):
for name in dirs:
new_path = os.path.join(top, name)
if followlinks or not islink(new_path):
for x in _walk_windows_impl(
yield from _walk_windows_impl(
new_path, topdown, onerror, followlinks
):
yield x
)
if not topdown:
yield top, dirs, nondirs


@ -186,9 +186,7 @@ def _create_symlink(source, link_name, dwFlags):
error_desc = FormatError(code).strip()
if code == ERROR_PRIVILEGE_NOT_HELD:
raise OSError(errno.EPERM, error_desc, link_name)
_raise_winerror(
code, 'Error creating symbolic link "{}"'.format(link_name)
)
_raise_winerror(code, f'Error creating symbolic link "{link_name}"')
def islink(path):
@ -210,7 +208,7 @@ def readlink(path):
)
if reparse_point_handle == INVALID_HANDLE_VALUE:
_raise_winerror(
get_last_error(), 'Error opening symbolic link "{}"'.format(path)
get_last_error(), f'Error opening symbolic link "{path}"'
)
target_buffer = c_buffer(MAXIMUM_REPARSE_DATA_BUFFER_SIZE)
n_bytes_returned = DWORD()
@ -227,7 +225,7 @@ def readlink(path):
CloseHandle(reparse_point_handle)
if not io_result:
_raise_winerror(
get_last_error(), 'Error reading symbolic link "{}"'.format(path)
get_last_error(), f'Error reading symbolic link "{path}"'
)
rdb = REPARSE_DATA_BUFFER.from_buffer(target_buffer)
if rdb.ReparseTag == IO_REPARSE_TAG_SYMLINK:
@ -236,11 +234,11 @@ def readlink(path):
return rdb.MountPointReparseBuffer.PrintName
# Unsupported reparse point type.
_raise_winerror(
ERROR_NOT_SUPPORTED, 'Error reading symbolic link "{}"'.format(path)
ERROR_NOT_SUPPORTED, f'Error reading symbolic link "{path}"'
)
def _raise_winerror(code, error_desc):
win_error_desc = FormatError(code).strip()
error_desc = "{0}: {1}".format(error_desc, win_error_desc)
error_desc = f"{error_desc}: {win_error_desc}"
raise WinError(code, error_desc)


@ -52,11 +52,11 @@ def duration_str(total):
uses microsecond resolution. This makes for noisy output.
"""
hours, mins, secs = convert_to_hms(total)
ret = "%.3fs" % (secs,)
ret = f"{secs:.3f}s"
if mins:
ret = "%im%s" % (mins, ret)
ret = f"{mins}m{ret}"
if hours:
ret = "%ih%s" % (hours, ret)
ret = f"{hours}h{ret}"
return ret
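
For reference, a minimal standalone sketch of the reworked formatting above (the real helper gets hours/mins/secs from convert_to_hms; the inline divmod here is only for illustration):

def duration_str(total):
    """Return a human-readable duration such as '1h2m3.500s'."""
    mins, secs = divmod(total, 60)
    hours, mins = divmod(int(mins), 60)
    ret = f"{secs:.3f}s"
    if mins:
        ret = f"{mins}m{ret}"
    if hours:
        ret = f"{hours}h{ret}"
    return ret

# duration_str(3723.5) -> '1h2m3.500s'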
@ -82,7 +82,7 @@ def jobs_str(total):
return f"{total} job{'s' if total > 1 else ''}"
class Progress(object):
class Progress:
def __init__(
self,
title,

File diff suppressed because it is too large.

@ -15,4 +15,4 @@
[tool.black]
line-length = 80
# NB: Keep in sync with tox.ini.
target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311']
target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311'] #, 'py312'

repo (2007 changed lines): file diff suppressed because it is too large.

@ -15,24 +15,16 @@
"""Logic for printing user-friendly logs in repo."""
import logging
import multiprocessing
from color import Coloring
from error import RepoExitError
SEPARATOR = "=" * 80
MAX_PRINT_ERRORS = 5
class LogColoring(Coloring):
"""Coloring outstream for logging."""
def __init__(self, config):
super().__init__(config, "logs")
self.error = self.colorer("error", fg="red")
self.warning = self.colorer("warn", fg="yellow")
class ConfigMock:
class _ConfigMock:
"""Default coloring config to use when Logging.config is not set."""
def __init__(self):
@ -42,34 +34,60 @@ class ConfigMock:
return self.default_values.get(x, None)
class _LogColoring(Coloring):
"""Coloring outstream for logging."""
def __init__(self, config):
super().__init__(config, "logs")
self.error = self.colorer("error", fg="red")
self.warning = self.colorer("warn", fg="yellow")
self.levelMap = {
"WARNING": self.warning,
"ERROR": self.error,
}
class _LogColoringFormatter(logging.Formatter):
"""Coloring formatter for logging."""
def __init__(self, config=None, *args, **kwargs):
self.config = config if config else _ConfigMock()
self.colorer = _LogColoring(self.config)
super().__init__(*args, **kwargs)
def format(self, record):
"""Formats |record| with color."""
msg = super().format(record)
colorer = self.colorer.levelMap.get(record.levelname)
return msg if not colorer else colorer(msg)
class RepoLogger(logging.Logger):
"""Repo Logging Module."""
# Aggregates error-level logs. This is used to generate an error summary
# section at the end of a command execution.
errors = multiprocessing.Manager().list()
def __init__(self, name, config=None, **kwargs):
def __init__(self, name: str, config=None, **kwargs):
super().__init__(name, **kwargs)
self.config = config if config else ConfigMock()
self.colorer = LogColoring(self.config)
handler = logging.StreamHandler()
handler.setFormatter(_LogColoringFormatter(config))
self.addHandler(handler)
def error(self, msg, *args, **kwargs):
"""Print and aggregate error-level logs."""
colored_error = self.colorer.error(msg, *args)
RepoLogger.errors.append(colored_error)
super().error(colored_error, **kwargs)
def warning(self, msg, *args, **kwargs):
"""Print warning-level logs with coloring."""
colored_warning = self.colorer.warning(msg, *args)
super().warning(colored_warning, **kwargs)
def log_aggregated_errors(self):
def log_aggregated_errors(self, err: RepoExitError):
"""Print all aggregated logs."""
super().error(self.colorer.error(SEPARATOR))
super().error(
self.colorer.error("Repo command failed due to following errors:")
self.error(SEPARATOR)
if not err.aggregate_errors:
self.error("Repo command failed: %s", type(err).__name__)
self.error("\t%s", str(err))
return
self.error(
"Repo command failed due to the following `%s` errors:",
type(err).__name__,
)
super().error("\n".join(RepoLogger.errors))
self.error(
"\n".join(str(e) for e in err.aggregate_errors[:MAX_PRINT_ERRORS])
)
diff = len(err.aggregate_errors) - MAX_PRINT_ERRORS
if diff > 0:
self.error("+%d additional errors...", diff)
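
A rough sketch of how the reworked module is meant to be used by a command entry point (run_command is a hypothetical placeholder; RepoLogger, RepoExitError, and log_aggregated_errors come from the diff above):

from error import RepoExitError
from repo_logging import RepoLogger

logger = RepoLogger(__file__)

try:
    run_command()  # hypothetical command entry point
except RepoExitError as err:
    # Emits the separator line, then up to MAX_PRINT_ERRORS aggregated
    # errors, followed by "+N additional errors..." when more remain.
    logger.log_aggregated_errors(err)
    raise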


@ -142,7 +142,7 @@ def _GetTraceFile(quiet):
def _ClearOldTraces():
"""Clear the oldest commands if trace file is too big."""
try:
with open(_TRACE_FILE, "r", errors="ignore") as f:
with open(_TRACE_FILE, errors="ignore") as f:
if os.path.getsize(f.name) / (1024 * 1024) <= _MAX_SIZE:
return
trace_lines = f.readlines()


@ -27,8 +27,16 @@ ROOT_DIR = os.path.dirname(os.path.realpath(__file__))
def run_black():
"""Returns the exit code from black."""
# Black by default only matches .py files. We have to list standalone
# scripts manually.
extra_programs = [
"repo",
"run_tests",
"release/update-manpages",
]
return subprocess.run(
[sys.executable, "-m", "black", "--check", ROOT_DIR], check=False
[sys.executable, "-m", "black", "--check", ROOT_DIR] + extra_programs,
check=False,
).returncode

ssh.py (10 changed lines)

@ -57,8 +57,12 @@ def version():
except FileNotFoundError:
print("fatal: ssh not installed", file=sys.stderr)
sys.exit(1)
except subprocess.CalledProcessError:
print("fatal: unable to detect ssh version", file=sys.stderr)
except subprocess.CalledProcessError as e:
print(
"fatal: unable to detect ssh version"
f" (code={e.returncode}, output={e.stdout})",
file=sys.stderr,
)
sys.exit(1)
@ -165,7 +169,7 @@ class ProxyManager:
# Check to see whether we already think that the master is running; if
# we think it's already running, return right away.
if port is not None:
key = "%s:%s" % (host, port)
key = f"{host}:{port}"
else:
key = host


@ -37,9 +37,7 @@ for py in os.listdir(my_dir):
try:
cmd = getattr(mod, clsn)
except AttributeError:
raise SyntaxError(
"%s/%s does not define class %s" % (__name__, py, clsn)
)
raise SyntaxError(f"{__name__}/{py} does not define class {clsn}")
name = name.replace("_", "-")
cmd.NAME = name


@ -15,7 +15,6 @@
import collections
import functools
import itertools
import sys
from command import Command
from command import DEFAULT_LOCAL_JOBS
@ -23,6 +22,10 @@ from error import RepoError
from error import RepoExitError
from git_command import git
from progress import Progress
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class AbandonError(RepoExitError):
@ -114,7 +117,7 @@ It is equivalent to "git branch -D <branchname>".
all_projects,
callback=_ProcessResults,
output=Progress(
"Abandon %s" % (nb,), len(all_projects), quiet=opt.quiet
f"Abandon {nb}", len(all_projects), quiet=opt.quiet
),
)
@ -126,18 +129,12 @@ It is equivalent to "git branch -D <branchname>".
if err:
for br in err.keys():
err_msg = "error: cannot abandon %s" % br
print(err_msg, file=sys.stderr)
logger.error(err_msg)
for proj in err[br]:
print(
" " * len(err_msg) + " | %s" % _RelPath(proj),
file=sys.stderr,
)
logger.error(" " * len(err_msg) + " | %s", _RelPath(proj))
raise AbandonError(aggregate_errors=aggregate_errors)
elif not success:
print(
"error: no project has local branch(es) : %s" % nb,
file=sys.stderr,
)
logger.error("error: no project has local branch(es) : %s", nb)
raise AbandonError(aggregate_errors=aggregate_errors)
else:
# Everything below here is displaying status.
@ -155,4 +152,4 @@ It is equivalent to "git branch -D <branchname>".
_RelPath(p) for p in success[br]
)
)
print("%s%s| %s\n" % (br, " " * (width - len(br)), result))
print(f"{br}{' ' * (width - len(br))}| {result}\n")


@ -28,7 +28,7 @@ class BranchColoring(Coloring):
self.notinproject = self.printer("notinproject", fg="red")
class BranchInfo(object):
class BranchInfo:
def __init__(self, name):
self.name = name
self.current = 0
@ -174,7 +174,7 @@ is shown, then the branch appears in all projects.
if _RelPath(p) not in have:
paths.append(_RelPath(p))
s = " %s %s" % (in_type, ", ".join(paths))
s = f" {in_type} {', '.join(paths)}"
if not i.IsSplitCurrent and (width + 7 + len(s) < 80):
fmt = out.current if i.IsCurrent else fmt
fmt(s)


@ -13,7 +13,6 @@
# limitations under the License.
import functools
import sys
from typing import NamedTuple
from command import Command
@ -22,6 +21,10 @@ from error import GitError
from error import RepoExitError
from progress import Progress
from project import Project
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class CheckoutBranchResult(NamedTuple):
@ -93,18 +96,15 @@ The command is equivalent to:
all_projects,
callback=_ProcessResults,
output=Progress(
"Checkout %s" % (nb,), len(all_projects), quiet=opt.quiet
f"Checkout {nb}", len(all_projects), quiet=opt.quiet
),
)
if err_projects:
for p in err_projects:
print(
"error: %s/: cannot checkout %s" % (p.relpath, nb),
file=sys.stderr,
)
logger.error("error: %s/: cannot checkout %s", p.relpath, nb)
raise CheckoutCommandError(aggregate_errors=err)
elif not success:
msg = f"error: no project has branch {nb}"
print(msg, file=sys.stderr)
logger.error(msg)
raise MissingBranchError(msg)


@ -18,9 +18,11 @@ import sys
from command import Command
from error import GitError
from git_command import GitCommand
from repo_logging import RepoLogger
CHANGE_ID_RE = re.compile(r"^\s*Change-Id: I([0-9a-f]{40})\s*$")
logger = RepoLogger(__file__)
class CherryPick(Command):
@ -52,7 +54,7 @@ change id will be added.
try:
p.Wait()
except GitError:
print(p.stderr, file=sys.stderr)
logger.error(p.stderr)
raise
sha1 = p.stdout.strip()
@ -67,9 +69,7 @@ change id will be added.
try:
p.Wait()
except GitError:
print(
"error: Failed to retrieve old commit message", file=sys.stderr
)
logger.error("error: Failed to retrieve old commit message")
raise
old_msg = self._StripHeader(p.stdout)
@ -85,14 +85,13 @@ change id will be added.
try:
p.Wait()
except GitError as e:
print(str(e))
print(
logger.error(e)
logger.warning(
"NOTE: When committing (please see above) and editing the "
"commit message, please remove the old Change-Id-line and "
"add:"
"add:\n%s",
self._GetReference(sha1),
)
print(self._GetReference(sha1), file=sys.stderr)
print(file=sys.stderr)
raise
if p.stdout:
@ -115,10 +114,7 @@ change id will be added.
try:
p.Wait()
except GitError:
print(
"error: Failed to update commit message",
file=sys.stderr,
)
logger.error("error: Failed to update commit message")
raise
def _IsChangeId(self, line):


@ -87,25 +87,17 @@ synced and their revisions won't be found.
def _printRawDiff(self, diff, pretty_format=None, local=False):
_RelPath = lambda p: p.RelPath(local=local)
for project in diff["added"]:
self.printText(
"A %s %s" % (_RelPath(project), project.revisionExpr)
)
self.printText(f"A {_RelPath(project)} {project.revisionExpr}")
self.out.nl()
for project in diff["removed"]:
self.printText(
"R %s %s" % (_RelPath(project), project.revisionExpr)
)
self.printText(f"R {_RelPath(project)} {project.revisionExpr}")
self.out.nl()
for project, otherProject in diff["changed"]:
self.printText(
"C %s %s %s"
% (
_RelPath(project),
project.revisionExpr,
otherProject.revisionExpr,
)
f"C {_RelPath(project)} {project.revisionExpr} "
f"{otherProject.revisionExpr}"
)
self.out.nl()
self._printLogs(
@ -118,12 +110,8 @@ synced and their revisions won't be found.
for project, otherProject in diff["unreachable"]:
self.printText(
"U %s %s %s"
% (
_RelPath(project),
project.revisionExpr,
otherProject.revisionExpr,
)
f"U {_RelPath(project)} {project.revisionExpr} "
f"{otherProject.revisionExpr}"
)
self.out.nl()


@ -19,9 +19,11 @@ from command import Command
from error import GitError
from error import NoSuchProjectError
from error import RepoExitError
from repo_logging import RepoLogger
CHANGE_RE = re.compile(r"^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$")
logger = RepoLogger(__file__)
class DownloadCommandError(RepoExitError):
@ -109,21 +111,16 @@ If no project is specified try to use current directory as a project.
except NoSuchProjectError:
project = None
if project not in projects:
print(
logger.error(
"error: %s matches too many projects; please "
"re-run inside the project checkout." % (a,),
file=sys.stderr,
"re-run inside the project checkout.",
a,
)
for project in projects:
print(
" %s/ @ %s"
% (
project.RelPath(
local=opt.this_manifest_only
),
project.revisionExpr,
),
file=sys.stderr,
logger.error(
" %s/ @ %s",
project.RelPath(local=opt.this_manifest_only),
project.revisionExpr,
)
raise NoSuchProjectError()
else:
@ -156,18 +153,21 @@ If no project is specified try to use current directory as a project.
dl = project.DownloadPatchSet(change_id, ps_id)
if not opt.revert and not dl.commits:
print(
"[%s] change %d/%d has already been merged"
% (project.name, change_id, ps_id),
file=sys.stderr,
logger.error(
"[%s] change %d/%d has already been merged",
project.name,
change_id,
ps_id,
)
continue
if len(dl.commits) > 1:
print(
"[%s] %d/%d depends on %d unmerged changes:"
% (project.name, change_id, ps_id, len(dl.commits)),
file=sys.stderr,
logger.error(
"[%s] %d/%d depends on %d unmerged changes:",
project.name,
change_id,
ps_id,
len(dl.commits),
)
for c in dl.commits:
print(" %s" % (c), file=sys.stderr)
@ -204,9 +204,10 @@ If no project is specified try to use current directory as a project.
project._Checkout(dl.commit)
except GitError:
print(
"[%s] Could not complete the %s of %s"
% (project.name, mode, dl.commit),
file=sys.stderr,
logger.error(
"[%s] Could not complete the %s of %s",
project.name,
mode,
dl.commit,
)
raise


@ -28,8 +28,10 @@ from command import DEFAULT_LOCAL_JOBS
from command import MirrorSafeCommand
from command import WORKER_BATCH_SIZE
from error import ManifestInvalidRevisionError
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
_CAN_COLOR = [
"branch",
"diff",
@ -293,10 +295,10 @@ without iterating through the remaining projects.
rc = rc or errno.EINTR
except Exception as e:
# Catch any other exceptions raised
print(
"forall: unhandled error, terminating the pool: %s: %s"
% (type(e).__name__, e),
file=sys.stderr,
logger.error(
"forall: unhandled error, terminating the pool: %s: %s",
type(e).__name__,
e,
)
rc = rc or getattr(e, "errno", 1)
if rc != 0:


@ -24,6 +24,10 @@ from error import InvalidArgumentsError
from error import SilentRepoExitError
from git_command import GitCommand
from project import Project
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class GrepColoring(Coloring):
@ -371,7 +375,7 @@ contain a line that matches both expressions:
if opt.revision:
if "--cached" in cmd_argv:
msg = "fatal: cannot combine --cached and --revision"
print(msg, file=sys.stderr)
logger.error(msg)
raise InvalidArgumentsError(msg)
have_rev = True
cmd_argv.extend(opt.revision)
@ -396,5 +400,5 @@ contain a line that matches both expressions:
sys.exit(0)
elif have_rev and bad_rev:
for r in opt.revision:
print("error: can't search revision %s" % r, file=sys.stderr)
logger.error("error: can't search revision %s", r)
raise GrepCommandError(aggregate_errors=errors)


@ -150,7 +150,7 @@ Displays detailed usage information about a command.
def _PrintAllCommandHelp(self):
for name in sorted(all_commands):
cmd = all_commands[name](manifest=self.manifest)
self._PrintCommandHelp(cmd, header_prefix="[%s] " % (name,))
self._PrintCommandHelp(cmd, header_prefix=f"[{name}] ")
def _Options(self, p):
p.add_option(


@ -97,7 +97,9 @@ class Info(PagedCommand):
self.headtext(self.manifest.default.revisionExpr)
self.out.nl()
self.heading("Manifest merge branch: ")
self.headtext(mergeBranch)
# The manifest might not have a merge branch if it isn't in a git repo,
# e.g. if `repo init --standalone-manifest` is used.
self.headtext(mergeBranch or "")
self.out.nl()
self.heading("Manifest groups: ")
self.headtext(manifestGroups)
@ -248,7 +250,7 @@ class Info(PagedCommand):
for commit in commits:
split = commit.split()
self.text("{0:38}{1} ".format("", "-"))
self.text(f"{'':38}{'-'} ")
self.sha(split[0] + " ")
self.text(" ".join(split[1:]))
self.out.nl()


@ -23,9 +23,12 @@ from error import UpdateManifestError
from git_command import git_require
from git_command import MIN_GIT_VERSION_HARD
from git_command import MIN_GIT_VERSION_SOFT
from repo_logging import RepoLogger
from wrapper import Wrapper
logger = RepoLogger(__file__)
_REPO_ALLOW_SHALLOW = os.environ.get("REPO_ALLOW_SHALLOW")
@ -212,7 +215,7 @@ to update the working directory files.
if not opt.quiet:
print()
print("Your identity is: %s <%s>" % (name, email))
print(f"Your identity is: {name} <{email}>")
print("is this correct [y/N]? ", end="", flush=True)
a = sys.stdin.readline().strip().lower()
if a in ("yes", "y", "t", "true"):
@ -330,11 +333,11 @@ to update the working directory files.
def Execute(self, opt, args):
git_require(MIN_GIT_VERSION_HARD, fail=True)
if not git_require(MIN_GIT_VERSION_SOFT):
print(
"repo: warning: git-%s+ will soon be required; please upgrade "
"your version of git to maintain support."
% (".".join(str(x) for x in MIN_GIT_VERSION_SOFT),),
file=sys.stderr,
logger.warning(
"repo: warning: git-%s+ will soon be required; "
"please upgrade your version of git to maintain "
"support.",
".".join(str(x) for x in MIN_GIT_VERSION_SOFT),
)
rp = self.manifest.repoProject
@ -350,17 +353,14 @@ to update the working directory files.
wrapper = Wrapper()
try:
remote_ref, rev = wrapper.check_repo_rev(
rp.gitdir,
rp.worktree,
opt.repo_rev,
repo_verify=opt.repo_verify,
quiet=opt.quiet,
)
except wrapper.CloneFailure as e:
err_msg = "fatal: double check your --repo-rev setting."
print(
err_msg,
file=sys.stderr,
)
logger.error(err_msg)
self.git_event_log.ErrorEvent(err_msg)
raise RepoUnhandledExceptionError(e)


@ -131,7 +131,7 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
elif opt.path_only and not opt.name_only:
lines.append("%s" % (_getpath(project)))
else:
lines.append("%s : %s" % (_getpath(project), project.name))
lines.append(f"{_getpath(project)} : {project.name}")
if lines:
lines.sort()


@ -17,6 +17,10 @@ import os
import sys
from command import PagedCommand
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class Manifest(PagedCommand):
@ -132,7 +136,7 @@ to indicate the remote ref to push changes to via 'repo upload'.
manifest.SetUseLocalManifests(not opt.ignore_local_manifests)
if opt.json:
print("warning: --json is experimental!", file=sys.stderr)
logger.warning("warning: --json is experimental!")
doc = manifest.ToDict(
peg_rev=opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream,
@ -159,13 +163,13 @@ to indicate the remote ref to push changes to via 'repo upload'.
if output_file != "-":
fd.close()
if manifest.path_prefix:
print(
f"Saved {manifest.path_prefix} submanifest to "
f"{output_file}",
file=sys.stderr,
logger.warning(
"Saved %s submanifest to %s",
manifest.path_prefix,
output_file,
)
else:
print(f"Saved manifest to {output_file}", file=sys.stderr)
logger.warning("Saved manifest to %s", output_file)
def ValidateOptions(self, opt, args):
if args:


@ -83,9 +83,7 @@ class Prune(PagedCommand):
)
if not branch.base_exists:
print(
"(ignoring: tracking branch is gone: %s)" % (branch.base,)
)
print(f"(ignoring: tracking branch is gone: {branch.base})")
else:
commits = branch.commits
date = branch.date


@ -17,6 +17,10 @@ import sys
from color import Coloring
from command import Command
from git_command import GitCommand
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class RebaseColoring(Coloring):
@ -104,17 +108,15 @@ branch but need to incorporate new upstream changes "underneath" them.
one_project = len(all_projects) == 1
if opt.interactive and not one_project:
print(
"error: interactive rebase not supported with multiple "
"projects",
file=sys.stderr,
logger.error(
"error: interactive rebase not supported with multiple projects"
)
if len(args) == 1:
print(
"note: project %s is mapped to more than one path"
% (args[0],),
file=sys.stderr,
logger.warning(
"note: project %s is mapped to more than one path", args[0]
)
return 1
# Setup the common git rebase args that we use for all projects.
@ -145,10 +147,9 @@ branch but need to incorporate new upstream changes "underneath" them.
cb = project.CurrentBranch
if not cb:
if one_project:
print(
"error: project %s has a detached HEAD"
% _RelPath(project),
file=sys.stderr,
logger.error(
"error: project %s has a detached HEAD",
_RelPath(project),
)
return 1
# Ignore branches with detached HEADs.
@ -157,10 +158,9 @@ branch but need to incorporate new upstream changes "underneath" them.
upbranch = project.GetBranch(cb)
if not upbranch.LocalMerge:
if one_project:
print(
"error: project %s does not track any remote branches"
% _RelPath(project),
file=sys.stderr,
logger.error(
"error: project %s does not track any remote branches",
_RelPath(project),
)
return 1
# Ignore branches without remotes.


@ -13,15 +13,18 @@
# limitations under the License.
import optparse
import sys
from command import Command
from command import MirrorSafeCommand
from error import RepoExitError
from repo_logging import RepoLogger
from subcmds.sync import _PostRepoFetch
from subcmds.sync import _PostRepoUpgrade
logger = RepoLogger(__file__)
class SelfupdateError(RepoExitError):
"""Exit error for failed selfupdate command."""
@ -66,7 +69,7 @@ need to be performed by an end-user.
else:
result = rp.Sync_NetworkHalf()
if result.error:
print("error: can't update repo", file=sys.stderr)
logger.error("error: can't update repo")
raise SelfupdateError(aggregate_errors=[result.error])
rp.bare_git.gc("--auto")


@ -17,6 +17,10 @@ import sys
from color import Coloring
from command import InteractiveCommand
from git_command import GitCommand
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class _ProjectList(Coloring):
@ -62,7 +66,7 @@ The '%prog' command stages files to prepare the next commit.
if p.IsDirty()
]
if not all_projects:
print("no projects have uncommitted modifications", file=sys.stderr)
logger.error("no projects have uncommitted modifications")
return
out = _ProjectList(self.manifest.manifestProject.config)


@ -13,7 +13,6 @@
# limitations under the License.
import functools
import sys
from typing import NamedTuple
from command import Command
@ -23,6 +22,10 @@ from git_command import git
from git_config import IsImmutable
from progress import Progress
from project import Project
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class ExecuteOneResult(NamedTuple):
@ -95,10 +98,7 @@ revision specified in the manifest.
nb, branch_merge=branch_merge, revision=revision
)
except Exception as e:
print(
"error: unable to checkout %s: %s" % (project.name, e),
file=sys.stderr,
)
logger.error("error: unable to checkout %s: %s", project.name, e)
error = e
return ExecuteOneResult(project, error)
@ -130,16 +130,16 @@ revision specified in the manifest.
all_projects,
callback=_ProcessResults,
output=Progress(
"Starting %s" % (nb,), len(all_projects), quiet=opt.quiet
f"Starting {nb}", len(all_projects), quiet=opt.quiet
),
)
if err_projects:
for p in err_projects:
print(
"error: %s/: cannot start %s"
% (p.RelPath(local=opt.this_manifest_only), nb),
file=sys.stderr,
logger.error(
"error: %s/: cannot start %s",
p.RelPath(local=opt.this_manifest_only),
nb,
)
msg_fmt = "cannot start %d project(s)"
self.git_event_log.ErrorEvent(


@ -21,11 +21,11 @@ import multiprocessing
import netrc
import optparse
import os
import socket
from pathlib import Path
import sys
import tempfile
import time
from typing import List, NamedTuple, Set
from typing import List, NamedTuple, Set, Union
import urllib.error
import urllib.parse
import urllib.request
@ -56,6 +56,7 @@ from command import MirrorSafeCommand
from command import WORKER_BATCH_SIZE
from error import GitError
from error import RepoChangedException
from error import RepoError
from error import RepoExitError
from error import RepoUnhandledExceptionError
from error import SyncError
@ -74,6 +75,7 @@ from project import DeleteWorktreeError
from project import Project
from project import RemoteSpec
from project import SyncBuffer
from repo_logging import RepoLogger
from repo_trace import Trace
import ssh
from wrapper import Wrapper
@ -81,13 +83,53 @@ from wrapper import Wrapper
_ONE_DAY_S = 24 * 60 * 60
# Env var to implicitly turn auto-gc back on. This was added to allow a user to
# revert a change in default behavior in v2.29.9. Remove after 2023-04-01.
_REPO_AUTO_GC = "REPO_AUTO_GC"
_AUTO_GC = os.environ.get(_REPO_AUTO_GC) == "1"
_REPO_ALLOW_SHALLOW = os.environ.get("REPO_ALLOW_SHALLOW")
logger = RepoLogger(__file__)
def _SafeCheckoutOrder(checkouts: List[Project]) -> List[List[Project]]:
"""Generate a sequence of checkouts that is safe to perform. The client
should checkout everything from n-th index before moving to n+1.
This is only useful if manifest contains nested projects.
E.g. if foo, foo/bar and foo/bar/baz are project paths, then foo needs to
finish before foo/bar can proceed, and foo/bar needs to finish before
foo/bar/baz."""
res = [[]]
current = res[0]
# depth_stack contains a current stack of parent paths.
depth_stack = []
# Checkouts are iterated in the hierarchical order. That way, it can easily
# be determined if the previous checkout is parent of the current checkout.
# We are splitting by the path separator so the final result is
# hierarchical, and not just lexicographical. For example, if the projects
# are: foo, foo/bar, foo-bar, lexicographical order produces foo, foo-bar
# and foo/bar, which doesn't work.
for checkout in sorted(checkouts, key=lambda x: x.relpath.split("/")):
checkout_path = Path(checkout.relpath)
while depth_stack:
try:
checkout_path.relative_to(depth_stack[-1])
except ValueError:
# Path.relative_to returns ValueError if paths are not relative.
# TODO(sokcevic): Switch to is_relative_to once min supported
# version is py3.9.
depth_stack.pop()
else:
if len(depth_stack) >= len(res):
# Another depth created.
res.append([])
break
current = res[len(depth_stack)]
current.append(checkout)
depth_stack.append(checkout_path)
return res
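
To make the layering concrete, a small illustration of the grouping this function is expected to produce for the nested-project example in its docstring plus a sibling with a similar prefix (the Stub namedtuple is a hypothetical stand-in for Project; only .relpath is consulted here):

import collections

Stub = collections.namedtuple("Stub", "relpath")  # stand-in for Project

checkouts = [
    Stub("foo"),
    Stub("foo/bar"),
    Stub("foo-bar"),
    Stub("foo/bar/baz"),
]
layers = _SafeCheckoutOrder(checkouts)
# Expected layering:
#   layers[0]: foo, foo-bar   (independent roots, safe in parallel)
#   layers[1]: foo/bar        (must wait for foo)
#   layers[2]: foo/bar/baz    (must wait for foo/bar)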
class _FetchOneResult(NamedTuple):
"""_FetchOne return value.
@ -118,7 +160,6 @@ class _FetchResult(NamedTuple):
success: bool
projects: Set[str]
errors: List[Exception]
class _FetchMainResult(NamedTuple):
@ -129,7 +170,6 @@ class _FetchMainResult(NamedTuple):
"""
all_projects: List[Project]
errors: List[Exception]
class _CheckoutOneResult(NamedTuple):
@ -161,6 +201,35 @@ class SmartSyncError(SyncError):
"""Smart sync exit error."""
class ManifestInterruptError(RepoError):
"""Aggregate Error to be logged when a user interrupts a manifest update."""
def __init__(self, output, **kwargs):
super().__init__(output, **kwargs)
self.output = output
def __str__(self):
error_type = type(self).__name__
return f"{error_type}:{self.output}"
class TeeStringIO(io.StringIO):
"""StringIO class that can write to an additional destination."""
def __init__(
self, io: Union[io.TextIOWrapper, None], *args, **kwargs
) -> None:
super().__init__(*args, **kwargs)
self.io = io
def write(self, s: str) -> int:
"""Write to additional destination."""
ret = super().write(s)
if self.io is not None:
self.io.write(s)
return ret
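
A brief illustration of the intent, assuming only the class as defined above: the buffer still captures everything written to it, but optionally mirrors each write to a real stream (e.g. stdout during a verbose sync):

import sys

buf = TeeStringIO(sys.stdout)  # echo to stdout while also capturing
buf.write("Fetching project foo\n")  # printed immediately and retained
captured = buf.getvalue()  # "Fetching project foo\n"

quiet = TeeStringIO(None)  # capture only, no echo
quiet.write("kept for error reporting\n")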
class Sync(Command, MirrorSafeCommand):
COMMON = True
MULTI_MANIFEST_SUPPORT = True
@ -213,6 +282,11 @@ directories if they have previously been linked to a different
object directory. WARNING: This may cause data to be lost since
refs may be removed when overwriting.
The --force-checkout option can be used to force git to switch revs even if the
index or the working tree differs from HEAD, and if there are untracked files.
WARNING: This may cause data to be lost since uncommitted changes may be
removed.
The --force-remove-dirty option can be used to remove previously used
projects with uncommitted changes. WARNING: This may cause data to be
lost since uncommitted changes may be removed with projects that no longer
@ -310,6 +384,14 @@ later is required to fix a server side protocol bug.
"point to a different object directory. WARNING: this "
"may cause loss of data",
)
p.add_option(
"--force-checkout",
dest="force_checkout",
action="store_true",
help="force checkout even if it results in throwing away "
"uncommitted modifications. "
"WARNING: this may cause loss of data",
)
p.add_option(
"--force-remove-dirty",
dest="force_remove_dirty",
@ -580,14 +662,15 @@ later is required to fix a server side protocol bug.
superproject_logging_data["superproject"] = False
superproject_logging_data["noworktree"] = True
if opt.use_superproject is not False:
print(
f"{m.path_prefix}: not using superproject because "
"there is no working tree."
logger.warning(
"%s: not using superproject because there is no "
"working tree.",
m.path_prefix,
)
if not use_super:
continue
m.superproject.SetQuiet(opt.quiet)
m.superproject.SetQuiet(not opt.verbose)
print_messages = git_superproject.PrintMessages(
opt.use_superproject, m
)
@ -602,13 +685,13 @@ later is required to fix a server side protocol bug.
need_unload = True
else:
if print_messages:
print(
f"{m.path_prefix}: warning: Update of revisionId from "
"superproject has failed, repo sync will not use "
"superproject to fetch the source. ",
"Please resync with the --no-use-superproject option "
"to avoid this repo warning.",
file=sys.stderr,
logger.warning(
"%s: warning: Update of revisionId from superproject "
"has failed, repo sync will not use superproject to "
"fetch the source. Please resync with the "
"--no-use-superproject option to avoid this repo "
"warning.",
m.path_prefix,
)
if update_result.fatal and opt.use_superproject is not None:
raise SuperprojectError()
@ -645,7 +728,7 @@ later is required to fix a server side protocol bug.
success = False
remote_fetched = False
errors = []
buf = io.StringIO()
buf = TeeStringIO(sys.stdout if opt.verbose else None)
try:
sync_result = project.Sync_NetworkHalf(
quiet=opt.quiet,
@ -672,25 +755,26 @@ later is required to fix a server side protocol bug.
errors.append(sync_result.error)
output = buf.getvalue()
if (opt.verbose or not success) and output:
if output and buf.io is None and not success:
print("\n" + output.rstrip())
if not success:
print(
"error: Cannot fetch %s from %s"
% (project.name, project.remote.url),
file=sys.stderr,
logger.error(
"error: Cannot fetch %s from %s",
project.name,
project.remote.url,
)
except KeyboardInterrupt:
print(f"Keyboard interrupt while processing {project.name}")
logger.error("Keyboard interrupt while processing %s", project.name)
except GitError as e:
print("error.GitError: Cannot fetch %s" % str(e), file=sys.stderr)
logger.error("error.GitError: Cannot fetch %s", e)
errors.append(e)
except Exception as e:
print(
"error: Cannot fetch %s (%s: %s)"
% (project.name, type(e).__name__, str(e)),
file=sys.stderr,
logger.error(
"error: Cannot fetch %s (%s: %s)",
project.name,
type(e).__name__,
e,
)
del self._sync_dict[k]
errors.append(e)
@ -725,13 +809,12 @@ later is required to fix a server side protocol bug.
jobs = jobs_str(len(items))
return f"{jobs} | {elapsed_str(elapsed)} {earliest_proj}"
def _Fetch(self, projects, opt, err_event, ssh_proxy):
def _Fetch(self, projects, opt, err_event, ssh_proxy, errors):
ret = True
jobs = opt.jobs_network
fetched = set()
remote_fetched = set()
errors = []
pm = Progress(
"Fetching",
len(projects),
@ -846,10 +929,10 @@ later is required to fix a server side protocol bug.
if not self.outer_client.manifest.IsArchive:
self._GCProjects(projects, opt, err_event)
return _FetchResult(ret, fetched, errors)
return _FetchResult(ret, fetched)
def _FetchMain(
self, opt, args, all_projects, err_event, ssh_proxy, manifest
self, opt, args, all_projects, err_event, ssh_proxy, manifest, errors
):
"""The main network fetch loop.
@ -865,7 +948,6 @@ later is required to fix a server side protocol bug.
List of all projects that should be checked out.
"""
rp = manifest.repoProject
errors = []
to_fetch = []
now = time.time()
@ -874,11 +956,9 @@ later is required to fix a server side protocol bug.
to_fetch.extend(all_projects)
to_fetch.sort(key=self._fetch_times.Get, reverse=True)
result = self._Fetch(to_fetch, opt, err_event, ssh_proxy)
result = self._Fetch(to_fetch, opt, err_event, ssh_proxy, errors)
success = result.success
fetched = result.projects
if result.errors:
errors.extend(result.errors)
if not success:
err_event.set()
@ -887,15 +967,14 @@ later is required to fix a server side protocol bug.
if opt.network_only:
# Bail out now; the rest touches the working tree.
if err_event.is_set():
print(
"\nerror: Exited sync due to fetch errors.\n",
file=sys.stderr,
)
raise SyncError(
e = SyncError(
"error: Exited sync due to fetch errors.",
aggregate_errors=errors,
)
return _FetchMainResult([], errors)
logger.error(e)
raise e
return _FetchMainResult([])
# Iteratively fetch missing and/or nested unregistered submodules.
previously_missing_set = set()
@ -916,27 +995,31 @@ later is required to fix a server side protocol bug.
break
# Stop us from non-stopped fetching actually-missing repos: If set
# of missing repos has not been changed from last fetch, we break.
missing_set = set(p.name for p in missing)
missing_set = {p.name for p in missing}
if previously_missing_set == missing_set:
break
previously_missing_set = missing_set
result = self._Fetch(missing, opt, err_event, ssh_proxy)
result = self._Fetch(missing, opt, err_event, ssh_proxy, errors)
success = result.success
new_fetched = result.projects
if result.errors:
errors.extend(result.errors)
if not success:
err_event.set()
fetched.update(new_fetched)
return _FetchMainResult(all_projects, errors)
return _FetchMainResult(all_projects)
def _CheckoutOne(self, detach_head, force_sync, project):
def _CheckoutOne(
self, detach_head, force_sync, force_checkout, verbose, project
):
"""Checkout work tree for one project
Args:
detach_head: Whether to leave a detached HEAD.
force_sync: Force checking out of the repo.
force_sync: Force checking out of .git directory (e.g. overwrite
existing git directory that was previously linked to a different
object directory).
force_checkout: Force checking out of the repo content.
verbose: Whether to show verbose messages.
project: Project object for the project to checkout.
Returns:
@ -950,26 +1033,29 @@ later is required to fix a server side protocol bug.
errors = []
try:
project.Sync_LocalHalf(
syncbuf, force_sync=force_sync, errors=errors
syncbuf,
force_sync=force_sync,
force_checkout=force_checkout,
errors=errors,
verbose=verbose,
)
success = syncbuf.Finish()
except GitError as e:
print(
"error.GitError: Cannot checkout %s: %s"
% (project.name, str(e)),
file=sys.stderr,
logger.error(
"error.GitError: Cannot checkout %s: %s", project.name, e
)
errors.append(e)
except Exception as e:
print(
"error: Cannot checkout %s: %s: %s"
% (project.name, type(e).__name__, str(e)),
file=sys.stderr,
logger.error(
"error: Cannot checkout %s: %s: %s",
project.name,
type(e).__name__,
e,
)
raise
if not success:
print("error: Cannot checkout %s" % (project.name), file=sys.stderr)
logger.error("error: Cannot checkout %s", project.name)
finish = time.time()
return _CheckoutOneResult(success, errors, project, start, finish)
@ -1015,15 +1101,22 @@ later is required to fix a server side protocol bug.
pm.update(msg=project.name)
return ret
proc_res = self.ExecuteInParallel(
opt.jobs_checkout,
functools.partial(
self._CheckoutOne, opt.detach_head, opt.force_sync
),
all_projects,
callback=_ProcessResults,
output=Progress("Checking out", len(all_projects), quiet=opt.quiet),
)
for projects in _SafeCheckoutOrder(all_projects):
proc_res = self.ExecuteInParallel(
opt.jobs_checkout,
functools.partial(
self._CheckoutOne,
opt.detach_head,
opt.force_sync,
opt.force_checkout,
opt.verbose,
),
projects,
callback=_ProcessResults,
output=Progress(
"Checking out", len(all_projects), quiet=opt.quiet
),
)
self._local_sync_state.Save()
return proc_res and not err_results
@ -1092,21 +1185,20 @@ later is required to fix a server side protocol bug.
"\r%s: Shared project %s found, disabling pruning."
% (relpath, project.name)
)
if git_require((2, 7, 0)):
project.EnableRepositoryExtension("preciousObjects")
else:
# This isn't perfect, but it's the best we can do with old
# git.
print(
"\r%s: WARNING: shared projects are unreliable when "
logger.warning(
"%s: WARNING: shared projects are unreliable when "
"using old versions of git; please upgrade to "
"git-2.7.0+." % (relpath,),
file=sys.stderr,
"git-2.7.0+.",
relpath,
)
project.config.SetString("gc.pruneExpire", "never")
else:
if not opt.quiet:
print(f"\r{relpath}: not shared, disabling pruning.")
project.config.SetString("extensions.preciousObjects", None)
project.config.SetString("gc.pruneExpire", None)
@ -1240,7 +1332,7 @@ later is required to fix a server side protocol bug.
old_project_paths = []
if os.path.exists(file_path):
with open(file_path, "r") as fd:
with open(file_path) as fd:
old_project_paths = fd.read().split("\n")
# In reversed order, so subfolders are deleted before parent folder.
for path in sorted(old_project_paths, reverse=True):
@ -1265,7 +1357,7 @@ later is required to fix a server side protocol bug.
groups=None,
)
project.DeleteWorktree(
quiet=opt.quiet, force=opt.force_remove_dirty
verbose=opt.verbose, force=opt.force_remove_dirty
)
new_project_paths.sort()
@ -1303,10 +1395,9 @@ later is required to fix a server side protocol bug.
try:
old_copylinkfile_paths = json.load(fp)
except Exception:
print(
"error: %s is not a json formatted file."
% copylinkfile_path,
file=sys.stderr,
logger.error(
"error: %s is not a json formatted file.",
copylinkfile_path,
)
platform_utils.remove(copylinkfile_path)
raise
@ -1352,7 +1443,7 @@ later is required to fix a server side protocol bug.
else:
try:
info = netrc.netrc()
except IOError:
except OSError:
# .netrc file does not exist or could not be opened.
pass
else:
@ -1363,19 +1454,16 @@ later is required to fix a server side protocol bug.
if auth:
username, _account, password = auth
else:
print(
"No credentials found for %s in .netrc"
% parse_result.hostname,
file=sys.stderr,
logger.error(
"No credentials found for %s in .netrc",
parse_result.hostname,
)
except netrc.NetrcParseError as e:
print(
"Error parsing .netrc file: %s" % e, file=sys.stderr
)
logger.error("Error parsing .netrc file: %s", e)
if username and password:
manifest_server = manifest_server.replace(
"://", "://%s:%s@" % (username, password), 1
"://", f"://{username}:{password}@", 1
)
transport = PersistentTransport(manifest_server)
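
For clarity, the credential-injection replace above rewrites the manifest server URL roughly like this (host and credentials are made up):

manifest_server = "https://manifest.example.com/rpc"
username, password = "alice", "s3cret"
manifest_server = manifest_server.replace("://", f"://{username}:{password}@", 1)
# -> "https://alice:s3cret@manifest.example.com/rpc"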
@ -1414,7 +1502,7 @@ later is required to fix a server side protocol bug.
try:
with open(smart_sync_manifest_path, "w") as f:
f.write(manifest_str)
except IOError as e:
except OSError as e:
raise SmartSyncError(
"error: cannot write manifest to %s:\n%s"
% (smart_sync_manifest_path, e),
@ -1425,7 +1513,7 @@ later is required to fix a server side protocol bug.
raise SmartSyncError(
"error: manifest server RPC call failed: %s" % manifest_str
)
except (socket.error, IOError, xmlrpc.client.Fault) as e:
except (OSError, xmlrpc.client.Fault) as e:
raise SmartSyncError(
"error: cannot connect to manifest server %s:\n%s"
% (manifest.manifest_server, e),
@ -1440,7 +1528,7 @@ later is required to fix a server side protocol bug.
return manifest_name
def _UpdateAllManifestProjects(self, opt, mp, manifest_name):
def _UpdateAllManifestProjects(self, opt, mp, manifest_name, errors):
"""Fetch & update the local manifest project.
After syncing the manifest project, if the manifest has any sub
@ -1452,7 +1540,7 @@ later is required to fix a server side protocol bug.
manifest_name: Manifest file to be reloaded.
"""
if not mp.standalone_manifest_url:
self._UpdateManifestProject(opt, mp, manifest_name)
self._UpdateManifestProject(opt, mp, manifest_name, errors)
if mp.manifest.submanifests:
for submanifest in mp.manifest.submanifests.values():
@ -1465,10 +1553,10 @@ later is required to fix a server side protocol bug.
git_event_log=self.git_event_log,
)
self._UpdateAllManifestProjects(
opt, child.manifestProject, None
opt, child.manifestProject, None, errors
)
def _UpdateManifestProject(self, opt, mp, manifest_name):
def _UpdateManifestProject(self, opt, mp, manifest_name, errors):
"""Fetch & update the local manifest project.
Args:
@ -1478,21 +1566,32 @@ later is required to fix a server side protocol bug.
"""
if not opt.local_only:
start = time.time()
result = mp.Sync_NetworkHalf(
quiet=opt.quiet,
verbose=opt.verbose,
current_branch_only=self._GetCurrentBranchOnly(
opt, mp.manifest
),
force_sync=opt.force_sync,
tags=opt.tags,
optimized_fetch=opt.optimized_fetch,
retry_fetches=opt.retry_fetches,
submodules=mp.manifest.HasSubmodules,
clone_filter=mp.manifest.CloneFilter,
partial_clone_exclude=mp.manifest.PartialCloneExclude,
clone_filter_for_depth=mp.manifest.CloneFilterForDepth,
)
buf = TeeStringIO(sys.stdout)
try:
result = mp.Sync_NetworkHalf(
quiet=not opt.verbose,
output_redir=buf,
verbose=opt.verbose,
current_branch_only=self._GetCurrentBranchOnly(
opt, mp.manifest
),
force_sync=opt.force_sync,
tags=opt.tags,
optimized_fetch=opt.optimized_fetch,
retry_fetches=opt.retry_fetches,
submodules=mp.manifest.HasSubmodules,
clone_filter=mp.manifest.CloneFilter,
partial_clone_exclude=mp.manifest.PartialCloneExclude,
clone_filter_for_depth=mp.manifest.CloneFilterForDepth,
)
if result.error:
errors.append(result.error)
except KeyboardInterrupt:
errors.append(
ManifestInterruptError(buf.getvalue(), project=mp.name)
)
raise
finish = time.time()
self.event_log.AddSync(
mp, event_log.TASK_SYNC_NETWORK, start, finish, result.success
@ -1503,24 +1602,24 @@ later is required to fix a server side protocol bug.
syncbuf = SyncBuffer(mp.config)
start = time.time()
mp.Sync_LocalHalf(
syncbuf, submodules=mp.manifest.HasSubmodules, errors=errors
syncbuf,
submodules=mp.manifest.HasSubmodules,
errors=errors,
verbose=opt.verbose,
)
clean = syncbuf.Finish()
self.event_log.AddSync(
mp, event_log.TASK_SYNC_LOCAL, start, time.time(), clean
)
if not clean:
raise UpdateManifestError(
aggregate_errors=errors, project=mp.name
)
raise UpdateManifestError(aggregate_errors=errors)
self._ReloadManifest(manifest_name, mp.manifest)
def ValidateOptions(self, opt, args):
if opt.force_broken:
print(
logger.warning(
"warning: -f/--force-broken is now the default behavior, and "
"the options are deprecated",
file=sys.stderr,
"the options are deprecated"
)
if opt.network_only and opt.detach_head:
self.OptionParser.error("cannot combine -n and -d")
@ -1544,15 +1643,6 @@ later is required to fix a server side protocol bug.
if opt.prune is None:
opt.prune = True
if opt.auto_gc is None and _AUTO_GC:
print(
f"Will run `git gc --auto` because {_REPO_AUTO_GC} is set.",
f"{_REPO_AUTO_GC} is deprecated and will be removed in a ",
"future release. Use `--auto-gc` instead.",
file=sys.stderr,
)
opt.auto_gc = True
def _ValidateOptionsWithManifest(self, opt, mp):
"""Like ValidateOptions, but after we've updated the manifest.
@ -1596,7 +1686,7 @@ later is required to fix a server side protocol bug.
errors = []
try:
self._ExecuteHelper(opt, args, errors)
except RepoExitError:
except (RepoExitError, RepoChangedException):
raise
except (KeyboardInterrupt, Exception) as e:
raise RepoUnhandledExceptionError(e, aggregate_errors=errors)
@ -1626,10 +1716,10 @@ later is required to fix a server side protocol bug.
try:
platform_utils.remove(smart_sync_manifest_path)
except OSError as e:
print(
logger.error(
"error: failed to remove existing smart sync override "
"manifest: %s" % e,
file=sys.stderr,
"manifest: %s",
e,
)
err_event = multiprocessing.Event()
@ -1640,11 +1730,10 @@ later is required to fix a server side protocol bug.
if cb:
base = rp.GetBranch(cb).merge
if not base or not base.startswith("refs/heads/"):
print(
logger.warning(
"warning: repo is not tracking a remote branch, so it will "
"not receive updates; run `repo init --repo-rev=stable` to "
"fix.",
file=sys.stderr,
"fix."
)
for m in self.ManifestList(opt):
@ -1665,7 +1754,7 @@ later is required to fix a server side protocol bug.
mp.ConfigureCloneFilterForDepth("blob:none")
if opt.mp_update:
self._UpdateAllManifestProjects(opt, mp, manifest_name)
self._UpdateAllManifestProjects(opt, mp, manifest_name, errors)
else:
print("Skipping update of local manifest project.")
@ -1705,10 +1794,14 @@ later is required to fix a server side protocol bug.
# Initialize the socket dir once in the parent.
ssh_proxy.sock()
result = self._FetchMain(
opt, args, all_projects, err_event, ssh_proxy, manifest
opt,
args,
all_projects,
err_event,
ssh_proxy,
manifest,
errors,
)
if result.errors:
errors.extend(result.errors)
all_projects = result.all_projects
if opt.network_only:
@ -1719,12 +1812,11 @@ later is required to fix a server side protocol bug.
if err_event.is_set():
err_network_sync = True
if opt.fail_fast:
print(
"\nerror: Exited sync due to fetch errors.\n"
logger.error(
"error: Exited sync due to fetch errors.\n"
"Local checkouts *not* updated. Resolve network issues "
"& retry.\n"
"`repo sync -l` will update some local checkouts.",
file=sys.stderr,
"`repo sync -l` will update some local checkouts."
)
raise SyncFailFastError(aggregate_errors=errors)
@ -1742,13 +1834,9 @@ later is required to fix a server side protocol bug.
if isinstance(e, DeleteWorktreeError):
errors.extend(e.aggregate_errors)
if opt.fail_fast:
print(
"\nerror: Local checkouts *not* updated.",
file=sys.stderr,
)
logger.error("error: Local checkouts *not* updated.")
raise SyncFailFastError(aggregate_errors=errors)
err_update_linkfiles = False
try:
self.UpdateCopyLinkfileList(m)
except Exception as e:
@ -1756,9 +1844,8 @@ later is required to fix a server side protocol bug.
errors.append(e)
err_event.set()
if opt.fail_fast:
print(
"\nerror: Local update copyfile or linkfile failed.",
file=sys.stderr,
logger.error(
"error: Local update copyfile or linkfile failed."
)
raise SyncFailFastError(aggregate_errors=errors)
@ -1781,12 +1868,10 @@ later is required to fix a server side protocol bug.
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.is_set():
# Add a new line so it's easier to read.
print("\n", file=sys.stderr)
def print_and_log(err_msg):
self.git_event_log.ErrorEvent(err_msg)
print(err_msg, file=sys.stderr)
logger.error("%s", err_msg)
print_and_log("error: Unable to fully sync the tree")
if err_network_sync:
@ -1799,15 +1884,11 @@ later is required to fix a server side protocol bug.
print_and_log("error: Checking out local projects failed.")
if err_results:
# Don't log repositories, as it may contain sensitive info.
print(
"Failing repos:\n%s" % "\n".join(err_results),
file=sys.stderr,
)
logger.error("Failing repos:\n%s", "\n".join(err_results))
# Not useful to log.
print(
logger.error(
'Try re-running with "-j1 --fail-fast" to exit at the first '
"error.",
file=sys.stderr,
"error."
)
raise SyncError(aggregate_errors=errors)
@ -1824,10 +1905,9 @@ later is required to fix a server side protocol bug.
self._local_sync_state.PruneRemovedProjects()
if self._local_sync_state.IsPartiallySynced():
print(
logger.warning(
"warning: Partial syncs are not supported. For the best "
"experience, sync the entire tree.",
file=sys.stderr,
"experience, sync the entire tree."
)
if not opt.quiet:
@ -1854,7 +1934,7 @@ def _PostRepoUpgrade(manifest, quiet=False):
def _PostRepoFetch(rp, repo_verify=True, verbose=False):
if rp.HasChanges:
print("info: A new version of repo is available", file=sys.stderr)
logger.warning("info: A new version of repo is available")
wrapper = Wrapper()
try:
rev = rp.bare_git.describe(rp.GetRevisionId())
@ -1876,22 +1956,16 @@ def _PostRepoFetch(rp, repo_verify=True, verbose=False):
rp.work_git.reset("--keep", new_rev)
except GitError as e:
raise RepoUnhandledExceptionError(e)
print("info: Restarting repo with latest version", file=sys.stderr)
print("info: Restarting repo with latest version")
raise RepoChangedException(["--repo-upgraded"])
else:
print(
"warning: Skipped upgrade to unverified version",
file=sys.stderr,
)
logger.warning("warning: Skipped upgrade to unverified version")
else:
if verbose:
print(
"repo version %s is current" % rp.work_git.describe(HEAD),
file=sys.stderr,
)
print("repo version %s is current" % rp.work_git.describe(HEAD))
class _FetchTimes(object):
class _FetchTimes:
_ALPHA = 0.5
def __init__(self, manifest):
@ -1914,7 +1988,7 @@ class _FetchTimes(object):
try:
with open(self._path) as f:
self._saved = json.load(f)
except (IOError, ValueError):
except (OSError, ValueError):
platform_utils.remove(self._path, missing_ok=True)
self._saved = {}
@ -1930,11 +2004,11 @@ class _FetchTimes(object):
try:
with open(self._path, "w") as f:
json.dump(self._seen, f, indent=2)
except (IOError, TypeError):
except (OSError, TypeError):
platform_utils.remove(self._path, missing_ok=True)
class LocalSyncState(object):
class LocalSyncState:
_LAST_FETCH = "last_fetch"
_LAST_CHECKOUT = "last_checkout"
@ -1977,7 +2051,7 @@ class LocalSyncState(object):
try:
with open(self._path) as f:
self._state = json.load(f)
except (IOError, ValueError):
except (OSError, ValueError):
platform_utils.remove(self._path, missing_ok=True)
self._state = {}
@ -1987,7 +2061,7 @@ class LocalSyncState(object):
try:
with open(self._path, "w") as f:
json.dump(self._state, f, indent=2)
except (IOError, TypeError):
except (OSError, TypeError):
platform_utils.remove(self._path, missing_ok=True)
def PruneRemovedProjects(self):
@ -1997,7 +2071,7 @@ class LocalSyncState(object):
delete = set()
for path in self._state:
gitdir = os.path.join(self._manifest.topdir, path, ".git")
if not os.path.exists(gitdir):
if not os.path.exists(gitdir) or os.path.islink(gitdir):
delete.add(path)
if not delete:
return
@ -2029,6 +2103,7 @@ class LocalSyncState(object):
# is passed during initialization.
class PersistentTransport(xmlrpc.client.Transport):
def __init__(self, orig_host):
super().__init__()
self.orig_host = orig_host
def request(self, host, handler, request_body, verbose=False):
@ -2120,7 +2195,7 @@ class PersistentTransport(xmlrpc.client.Transport):
try:
p.feed(data)
except xml.parsers.expat.ExpatError as e:
raise IOError(
raise OSError(
f"Parsing the manifest failed: {e}\n"
f"Please report this to your manifest server admin.\n"
f'Here is the full response:\n{data.decode("utf-8")}'


@ -29,10 +29,12 @@ from git_command import GitCommand
from git_refs import R_HEADS
from hooks import RepoHook
from project import ReviewableBranch
from repo_logging import RepoLogger
from subcmds.sync import LocalSyncState
_DEFAULT_UNUSUAL_COMMIT_THRESHOLD = 5
logger = RepoLogger(__file__)
class UploadExitError(SilentRepoExitError):
@ -70,16 +72,16 @@ def _VerifyPendingCommits(branches: List[ReviewableBranch]) -> bool:
# If any branch has many commits, prompt the user.
if many_commits:
if len(branches) > 1:
print(
logger.warning(
"ATTENTION: One or more branches has an unusually high number "
"of commits."
)
else:
print(
logger.warning(
"ATTENTION: You are uploading an unusually high number of "
"commits."
)
print(
logger.warning(
"YOU PROBABLY DO NOT MEAN TO DO THIS. (Did you rebase across "
"branches?)"
)
@ -93,7 +95,7 @@ def _VerifyPendingCommits(branches: List[ReviewableBranch]) -> bool:
def _die(fmt, *args):
msg = fmt % args
print("error: %s" % msg, file=sys.stderr)
logger.error("error: %s", msg)
raise UploadExitError(msg)
@ -242,6 +244,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
default=[],
help="add a label when uploading",
)
p.add_option(
"--pd",
"--patchset-description",
dest="patchset_description",
help="description for patchset",
)
p.add_option(
"--re",
"--reviewers",
@ -653,6 +661,7 @@ Gerrit Code Review: https://www.gerritcodereview.com/
dest_branch=destination,
validate_certs=opt.validate_certs,
push_options=opt.push_options,
patchset_description=opt.patchset_description,
)
branch.uploaded = True
@ -748,16 +757,13 @@ Gerrit Code Review: https://www.gerritcodereview.com/
for result in results:
project, avail = result
if avail is None:
print(
logger.error(
'repo: error: %s: Unable to upload branch "%s". '
"You might be able to fix the branch by running:\n"
" git branch --set-upstream-to m/%s"
% (
project.RelPath(local=opt.this_manifest_only),
project.CurrentBranch,
project.manifest.branch,
),
file=sys.stderr,
" git branch --set-upstream-to m/%s",
project.RelPath(local=opt.this_manifest_only),
project.CurrentBranch,
project.manifest.branch,
)
elif avail:
pending.append(result)
@ -772,14 +778,11 @@ Gerrit Code Review: https://www.gerritcodereview.com/
if not pending:
if opt.branch is None:
print(
"repo: error: no branches ready for upload", file=sys.stderr
)
logger.error("repo: error: no branches ready for upload")
else:
print(
'repo: error: no branches named "%s" ready for upload'
% (opt.branch,),
file=sys.stderr,
logger.error(
'repo: error: no branches named "%s" ready for upload',
opt.branch,
)
return 1
@@ -809,10 +812,9 @@ Gerrit Code Review: https://www.gerritcodereview.com/
project_list=pending_proj_names, worktree_list=pending_worktrees
):
if LocalSyncState(manifest).IsPartiallySynced():
print(
logger.error(
"Partially synced tree detected. Syncing all projects "
"may resolve issues you're seeing.",
file=sys.stderr,
"may resolve issues you're seeing."
)
ret = 1
if ret:
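
Throughout this file, the diff swaps bare print(..., file=sys.stderr) calls for RepoLogger calls that take deferred %-style arguments instead of a pre-formatted string. A minimal sketch of the pattern, using a made-up branch name purely for illustration:

    from repo_logging import RepoLogger

    logger = RepoLogger(__file__)

    # Deferred %-style interpolation: the logger formats the message only when
    # the record is actually emitted, rather than building the string up front
    # as print() would require.
    branch_name = "example-branch"  # hypothetical value for illustration
    logger.error(
        'repo: error: no branches named "%s" ready for upload', branch_name
    )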

View File

@@ -42,35 +42,28 @@ class Version(Command, MirrorSafeCommand):
# These might not be the same. Report them both.
src_ver = RepoSourceVersion()
rp_ver = rp.bare_git.describe(HEAD)
print("repo version %s" % rp_ver)
print(" (from %s)" % rem.url)
print(" (tracking %s)" % branch.merge)
print(" (%s)" % rp.bare_git.log("-1", "--format=%cD", HEAD))
print(f"repo version {rp_ver}")
print(f" (from {rem.url})")
print(f" (tracking {branch.merge})")
print(f" ({rp.bare_git.log('-1', '--format=%cD', HEAD)})")
if self.wrapper_path is not None:
print("repo launcher version %s" % self.wrapper_version)
print(" (from %s)" % self.wrapper_path)
print(f"repo launcher version {self.wrapper_version}")
print(f" (from {self.wrapper_path})")
if src_ver != rp_ver:
print(" (currently at %s)" % src_ver)
print(f" (currently at {src_ver})")
print("repo User-Agent %s" % user_agent.repo)
print("git %s" % git.version_tuple().full)
print("git User-Agent %s" % user_agent.git)
print("Python %s" % sys.version)
print(f"repo User-Agent {user_agent.repo}")
print(f"git {git.version_tuple().full}")
print(f"git User-Agent {user_agent.git}")
print(f"Python {sys.version}")
uname = platform.uname()
if sys.version_info.major < 3:
# Python 3 returns a named tuple, but Python 2 is simpler.
print(uname)
else:
print(
"OS %s %s (%s)" % (uname.system, uname.release, uname.version)
)
print(
"CPU %s (%s)"
% (
uname.machine,
uname.processor if uname.processor else "unknown",
)
)
print(f"OS {uname.system} {uname.release} ({uname.version})")
processor = uname.processor if uname.processor else "unknown"
print(f"CPU {uname.machine} ({processor})")
print("Bug reports:", Wrapper().BUG_URL)

View File

@@ -14,8 +14,11 @@
"""Common fixtures for pytests."""
import pathlib
import pytest
import platform_utils
import repo_trace
@@ -23,3 +26,58 @@ import repo_trace
def disable_repo_trace(tmp_path):
"""Set an environment marker to relax certain strict checks for test code.""" # noqa: E501
repo_trace._TRACE_FILE = str(tmp_path / "TRACE_FILE_from_test")
# adapted from pytest-home 0.5.1
def _set_home(monkeypatch, path: pathlib.Path):
"""
Set the home dir using a pytest monkeypatch context.
"""
win = platform_utils.isWindows()
vars = ["HOME"] + win * ["USERPROFILE"]
for var in vars:
monkeypatch.setenv(var, str(path))
return path
# copied from
# https://github.com/pytest-dev/pytest/issues/363#issuecomment-1335631998
@pytest.fixture(scope="session")
def monkeysession():
with pytest.MonkeyPatch.context() as mp:
yield mp
@pytest.fixture(autouse=True, scope="session")
def session_tmp_home_dir(tmp_path_factory, monkeysession):
"""Set HOME to a temporary directory, avoiding user's .gitconfig.
b/302797407
Set home at session scope to take effect prior to
``test_wrapper.GitCheckoutTestCase.setUpClass``.
"""
return _set_home(monkeysession, tmp_path_factory.mktemp("home"))
# adapted from pytest-home 0.5.1
@pytest.fixture(autouse=True)
def tmp_home_dir(monkeypatch, tmp_path_factory):
"""Set HOME to a temporary directory.
Ensures that state doesn't accumulate in $HOME across tests.
Note that in conjunction with session_tmp_home_dir, the HOME
dir is patched twice, once at session scope, and then again at
the function scope.
"""
return _set_home(monkeypatch, tmp_path_factory.mktemp("home"))
@pytest.fixture(autouse=True)
def setup_user_identity(monkeysession, scope="session"):
"""Set env variables for author and committer name and email."""
monkeysession.setenv("GIT_AUTHOR_NAME", "Foo Bar")
monkeysession.setenv("GIT_COMMITTER_NAME", "Foo Bar")
monkeysession.setenv("GIT_AUTHOR_EMAIL", "foo@bar.baz")
monkeysession.setenv("GIT_COMMITTER_EMAIL", "foo@bar.baz")

View File

@@ -14,16 +14,12 @@
"""Unittests for the git_command.py module."""
import io
import os
import re
import subprocess
import unittest
try:
from unittest import mock
except ImportError:
import mock
from unittest import mock
import git_command
import wrapper
@@ -71,9 +67,13 @@ class GitCommandWaitTest(unittest.TestCase):
"""Tests the GitCommand class .Wait()"""
def setUp(self):
class MockPopen(object):
class MockPopen:
rc = 0
def __init__(self):
self.stdout = io.BufferedReader(io.BytesIO())
self.stderr = io.BufferedReader(io.BytesIO())
def communicate(
self, input: str = None, timeout: float = None
) -> [str, str]:
@@ -117,6 +117,115 @@ class GitCommandWaitTest(unittest.TestCase):
self.assertEqual(1, r.Wait())
class GitCommandStreamLogsTest(unittest.TestCase):
"""Tests the GitCommand class stderr log streaming cases."""
def setUp(self):
self.mock_process = mock.MagicMock()
self.mock_process.communicate.return_value = (None, None)
self.mock_process.wait.return_value = 0
self.mock_popen = mock.MagicMock()
self.mock_popen.return_value = self.mock_process
mock.patch("subprocess.Popen", self.mock_popen).start()
def tearDown(self):
mock.patch.stopall()
def test_does_not_stream_logs_when_input_is_set(self):
git_command.GitCommand(None, ["status"], input="foo")
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=subprocess.PIPE,
stdout=None,
stderr=None,
)
self.mock_process.communicate.assert_called_once_with(input="foo")
self.mock_process.stderr.read1.assert_not_called()
def test_does_not_stream_logs_when_stdout_is_set(self):
git_command.GitCommand(None, ["status"], capture_stdout=True)
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=None,
stdout=subprocess.PIPE,
stderr=None,
)
self.mock_process.communicate.assert_called_once_with(input=None)
self.mock_process.stderr.read1.assert_not_called()
def test_does_not_stream_logs_when_stderr_is_set(self):
git_command.GitCommand(None, ["status"], capture_stderr=True)
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=None,
stdout=None,
stderr=subprocess.PIPE,
)
self.mock_process.communicate.assert_called_once_with(input=None)
self.mock_process.stderr.read1.assert_not_called()
def test_does_not_stream_logs_when_merge_output_is_set(self):
git_command.GitCommand(None, ["status"], merge_output=True)
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=None,
stdout=None,
stderr=subprocess.STDOUT,
)
self.mock_process.communicate.assert_called_once_with(input=None)
self.mock_process.stderr.read1.assert_not_called()
@mock.patch("sys.stderr")
def test_streams_stderr_when_no_stream_is_set(self, mock_stderr):
logs = "\n".join(
[
"Enumerating objects: 5, done.",
"Counting objects: 100% (5/5), done.",
"Writing objects: 100% (3/3), 330 bytes | 330 KiB/s, done.",
"remote: Processing changes: refs: 1, new: 1, done ",
"remote: SUCCESS",
]
)
self.mock_process.stderr = io.BufferedReader(
io.BytesIO(bytes(logs, "utf-8"))
)
cmd = git_command.GitCommand(None, ["push"])
self.mock_popen.assert_called_once_with(
["git", "push"],
cwd=None,
env=mock.ANY,
stdin=None,
stdout=None,
stderr=subprocess.PIPE,
)
self.mock_process.communicate.assert_not_called()
mock_stderr.write.assert_called_once_with(logs)
self.assertEqual(cmd.stderr, logs)
class GitCallUnitTest(unittest.TestCase):
"""Tests the _GitCall class (via git_command.git)."""
@@ -214,3 +323,22 @@ class GitRequireTests(unittest.TestCase):
with self.assertRaises(git_command.GitRequireError) as e:
git_command.git_require((2,), fail=True, msg="so sad")
self.assertNotEqual(0, e.code)
class GitCommandErrorTest(unittest.TestCase):
"""Test for the GitCommandError class."""
def test_augument_stderr(self):
self.assertEqual(
git_command.GitCommandError(
git_stderr="couldn't find remote ref refs/heads/foo"
).suggestion,
"Check if the provided ref exists in the remote.",
)
self.assertEqual(
git_command.GitCommandError(
git_stderr="'foobar' does not appear to be a git repository"
).suggestion,
"Are you running this repo command outside of a repo workspace?",
)
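
The new test pins two stderr patterns to fixed hints. One plausible way to back such a suggestion property is a small pattern-to-hint table scanned in order; this is only a sketch, not necessarily how git_command.GitCommandError derives its suggestion:

    import re

    # Hypothetical lookup table: each entry maps a known git stderr pattern
    # to a human-readable hint.
    _ERROR_HINTS = [
        (
            re.compile(r"couldn't find remote ref"),
            "Check if the provided ref exists in the remote.",
        ),
        (
            re.compile(r"does not appear to be a git repository"),
            "Are you running this repo command outside of a repo workspace?",
        ),
    ]


    def suggestion_for(git_stderr):
        """Return the first matching hint for the given stderr, or None."""
        for pattern, hint in _ERROR_HINTS:
            if pattern.search(git_stderr):
                return hint
        return None


    assert (
        suggestion_for("couldn't find remote ref refs/heads/foo")
        == "Check if the provided ref exists in the remote."
    )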

View File

@@ -100,7 +100,7 @@ class GitConfigReadOnlyTests(unittest.TestCase):
("intg", 10737418240),
)
for key, value in TESTS:
self.assertEqual(value, self.config.GetInt("section.%s" % (key,)))
self.assertEqual(value, self.config.GetInt(f"section.{key}"))
class GitConfigReadWriteTests(unittest.TestCase):

View File

@@ -34,7 +34,7 @@ class SuperprojectTestCase(unittest.TestCase):
PARENT_SID_KEY = "GIT_TRACE2_PARENT_SID"
PARENT_SID_VALUE = "parent_sid"
SELF_SID_REGEX = r"repo-\d+T\d+Z-.*"
FULL_SID_REGEX = r"^%s/%s" % (PARENT_SID_VALUE, SELF_SID_REGEX)
FULL_SID_REGEX = rf"^{PARENT_SID_VALUE}/{SELF_SID_REGEX}"
def setUp(self):
"""Set up superproject every time."""
@@ -108,7 +108,9 @@ class SuperprojectTestCase(unittest.TestCase):
self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
else:
self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
self.assertRegex(log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$")
self.assertRegex(
log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+\+00:00$"
)
def readLog(self, log_path):
"""Helper function to read log data into a list."""
@@ -247,7 +249,7 @@ class SuperprojectTestCase(unittest.TestCase):
os.mkdir(self._superproject._superproject_path)
manifest_path = self._superproject._WriteManifestFile()
self.assertIsNotNone(manifest_path)
with open(manifest_path, "r") as fp:
with open(manifest_path) as fp:
manifest_xml_data = fp.read()
self.assertEqual(
sort_attributes(manifest_xml_data),
@@ -282,7 +284,7 @@ class SuperprojectTestCase(unittest.TestCase):
)
self.assertIsNotNone(update_result.manifest_path)
self.assertFalse(update_result.fatal)
with open(update_result.manifest_path, "r") as fp:
with open(update_result.manifest_path) as fp:
manifest_xml_data = fp.read()
self.assertEqual(
sort_attributes(manifest_xml_data),
@@ -369,7 +371,7 @@ class SuperprojectTestCase(unittest.TestCase):
)
self.assertIsNotNone(update_result.manifest_path)
self.assertFalse(update_result.fatal)
with open(update_result.manifest_path, "r") as fp:
with open(update_result.manifest_path) as fp:
manifest_xml_data = fp.read()
# Verify platform/vendor/x's project revision hasn't
# changed.
@@ -434,7 +436,7 @@ class SuperprojectTestCase(unittest.TestCase):
)
self.assertIsNotNone(update_result.manifest_path)
self.assertFalse(update_result.fatal)
with open(update_result.manifest_path, "r") as fp:
with open(update_result.manifest_path) as fp:
manifest_xml_data = fp.read()
# Verify platform/vendor/x's project revision hasn't
# changed.
@@ -490,7 +492,9 @@ class SuperprojectTestCase(unittest.TestCase):
self.assertTrue(self._superproject._Fetch())
self.assertEqual(
mock_git_command.call_args.args,
# TODO: Once we require Python 3.8+,
# use 'mock_git_command.call_args.args'.
mock_git_command.call_args[0],
(
None,
[
@@ -510,7 +514,9 @@ class SuperprojectTestCase(unittest.TestCase):
# If branch for revision exists, set as --negotiation-tip.
self.assertTrue(self._superproject._Fetch())
self.assertEqual(
mock_git_command.call_args.args,
# TODO: Once we require Python 3.8+,
# use 'mock_git_command.call_args.args'.
mock_git_command.call_args[0],
(
None,
[

View File

@@ -61,7 +61,7 @@ class EventLogTestCase(unittest.TestCase):
PARENT_SID_KEY = "GIT_TRACE2_PARENT_SID"
PARENT_SID_VALUE = "parent_sid"
SELF_SID_REGEX = r"repo-\d+T\d+Z-.*"
FULL_SID_REGEX = r"^%s/%s" % (PARENT_SID_VALUE, SELF_SID_REGEX)
FULL_SID_REGEX = rf"^{PARENT_SID_VALUE}/{SELF_SID_REGEX}"
def setUp(self):
"""Load the event_log module every time."""
@@ -90,7 +90,9 @@ class EventLogTestCase(unittest.TestCase):
self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
else:
self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
self.assertRegex(log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$")
self.assertRegex(
log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+\+00:00$"
)
def readLog(self, log_path):
"""Helper function to read log data into a list."""

View File

@@ -198,13 +198,13 @@ class ValueTests(unittest.TestCase):
def test_bool_true(self):
"""Check XmlBool true values."""
for value in ("yes", "true", "1"):
node = self._get_node('<node a="%s"/>' % (value,))
node = self._get_node(f'<node a="{value}"/>')
self.assertTrue(manifest_xml.XmlBool(node, "a"))
def test_bool_false(self):
"""Check XmlBool false values."""
for value in ("no", "false", "0"):
node = self._get_node('<node a="%s"/>' % (value,))
node = self._get_node(f'<node a="{value}"/>')
self.assertFalse(manifest_xml.XmlBool(node, "a"))
def test_int_default(self):
@@ -220,7 +220,7 @@ class ValueTests(unittest.TestCase):
def test_int_good(self):
"""Check XmlInt numeric handling."""
for value in (-1, 0, 1, 50000):
node = self._get_node('<node a="%s"/>' % (value,))
node = self._get_node(f'<node a="{value}"/>')
self.assertEqual(value, manifest_xml.XmlInt(node, "a"))
def test_int_invalid(self):
@@ -385,6 +385,21 @@ class XmlManifestTests(ManifestParseTestCase):
"</manifest>",
)
def test_parse_with_xml_doctype(self):
"""Check correct manifest parse with DOCTYPE node present."""
manifest = self.getXmlManifest(
"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE manifest []>
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="test-project" path="src/test-project"/>
</manifest>
"""
)
self.assertEqual(len(manifest.projects), 1)
self.assertEqual(manifest.projects[0].name, "test-project")
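
The new test feeds the parser a manifest preceded by a DOCTYPE declaration. One plausible way to tolerate that when walking the parsed DOM is to skip any non-element nodes before the root; this is only an illustrative helper, not the actual manifest_xml.py change:

    import xml.dom.minidom


    def root_element(data):
        """Return the root element, skipping DOCTYPE and other non-element nodes."""
        doc = xml.dom.minidom.parseString(data)
        for node in doc.childNodes:
            if node.nodeType == node.ELEMENT_NODE:
                return node
        raise ValueError("no root element found")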
class IncludeElementTests(ManifestParseTestCase):
"""Tests for <include>."""
@@ -1113,3 +1128,79 @@ class ExtendProjectElementTests(ManifestParseTestCase):
)
self.assertEqual(len(manifest.projects), 1)
self.assertEqual(manifest.projects[0].upstream, "bar")
class NormalizeUrlTests(ManifestParseTestCase):
"""Tests for normalize_url() in manifest_xml.py"""
def test_has_trailing_slash(self):
url = "http://foo.com/bar/baz/"
self.assertEqual(
"http://foo.com/bar/baz", manifest_xml.normalize_url(url)
)
url = "http://foo.com/bar/"
self.assertEqual("http://foo.com/bar", manifest_xml.normalize_url(url))
def test_has_leading_slash(self):
"""SCP-like syntax except a / comes before the : which git disallows."""
url = "/git@foo.com:bar/baf"
self.assertEqual(url, manifest_xml.normalize_url(url))
url = "gi/t@foo.com:bar/baf"
self.assertEqual(url, manifest_xml.normalize_url(url))
url = "git@fo/o.com:bar/baf"
self.assertEqual(url, manifest_xml.normalize_url(url))
def test_has_no_scheme(self):
"""Deal with cases where we have no scheme, but we also
aren't dealing with the git SCP-like syntax
"""
url = "foo.com/baf/bat"
self.assertEqual(url, manifest_xml.normalize_url(url))
url = "foo.com/baf"
self.assertEqual(url, manifest_xml.normalize_url(url))
url = "git@foo.com/baf/bat"
self.assertEqual(url, manifest_xml.normalize_url(url))
url = "git@foo.com/baf"
self.assertEqual(url, manifest_xml.normalize_url(url))
url = "/file/path/here"
self.assertEqual(url, manifest_xml.normalize_url(url))
def test_has_no_scheme_matches_scp_like_syntax(self):
url = "git@foo.com:bar/baf"
self.assertEqual(
"ssh://git@foo.com/bar/baf", manifest_xml.normalize_url(url)
)
url = "git@foo.com:bar/"
self.assertEqual(
"ssh://git@foo.com/bar", manifest_xml.normalize_url(url)
)
def test_remote_url_resolution(self):
remote = manifest_xml._XmlRemote(
name="foo",
fetch="git@github.com:org2/",
manifestUrl="git@github.com:org2/custom_manifest.git",
)
self.assertEqual("ssh://git@github.com/org2", remote.resolvedFetchUrl)
remote = manifest_xml._XmlRemote(
name="foo",
fetch="ssh://git@github.com/org2/",
manifestUrl="git@github.com:org2/custom_manifest.git",
)
self.assertEqual("ssh://git@github.com/org2", remote.resolvedFetchUrl)
remote = manifest_xml._XmlRemote(
name="foo",
fetch="git@github.com:org2/",
manifestUrl="ssh://git@github.com/org2/custom_manifest.git",
)
self.assertEqual("ssh://git@github.com/org2", remote.resolvedFetchUrl)

View File

@@ -48,7 +48,7 @@ def TempGitTree():
yield tempdir
class FakeProject(object):
class FakeProject:
"""A fake for Project for basic functionality."""
def __init__(self, worktree):
@@ -107,6 +107,16 @@ class ReviewableBranchTests(unittest.TestCase):
self.assertTrue(rb.date)
class ProjectTests(unittest.TestCase):
"""Check Project behavior."""
def test_encode_patchset_description(self):
self.assertEqual(
project.Project._encode_patchset_description("abcd00!! +"),
"abcd00%21%21_%2b",
)
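
The expected value "abcd00%21%21_%2b" implies an encoding where ASCII alphanumerics pass through, spaces become underscores, and everything else is percent-encoded with lowercase hex. A hypothetical re-implementation consistent with that expectation (the real Project._encode_patchset_description may differ in detail):

    def encode_patchset_description(original):
        """Sketch of the encoding behaviour pinned down by the test above."""

        def _encode_char(ch):
            if ch.isascii() and ch.isalnum():
                return ch
            if ch == " ":
                return "_"
            return "".join(f"%{b:02x}" for b in ch.encode("utf-8"))

        return "".join(_encode_char(ch) for ch in original)


    assert encode_patchset_description("abcd00!! +") == "abcd00%21%21_%2b"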
class CopyLinkTestCase(unittest.TestCase):
"""TestCase for stub repo client checkouts.
@@ -151,7 +161,7 @@ class CopyLinkTestCase(unittest.TestCase):
# "".
break
result = os.path.exists(path)
msg.append("\tos.path.exists(%s): %s" % (path, result))
msg.append(f"\tos.path.exists({path}): {result}")
if result:
msg.append("\tcontents: %r" % os.listdir(path))
break
@@ -507,7 +517,10 @@ class ManifestPropertiesFetchedCorrectly(unittest.TestCase):
self.assertFalse(fakeproj.partial_clone)
fakeproj.config.SetString("repo.depth", "48")
self.assertEqual(fakeproj.depth, "48")
self.assertEqual(fakeproj.depth, 48)
fakeproj.config.SetString("repo.depth", "invalid_depth")
self.assertEqual(fakeproj.depth, None)
fakeproj.config.SetString("repo.clonefilter", "blob:limit=10M")
self.assertEqual(fakeproj.clone_filter, "blob:limit=10M")

View File

@@ -16,100 +16,49 @@
import unittest
from unittest import mock
from error import RepoExitError
from repo_logging import RepoLogger
class TestRepoLogger(unittest.TestCase):
def test_error_logs_error(self):
"""Test if error fn outputs logs."""
@mock.patch.object(RepoLogger, "error")
def test_log_aggregated_errors_logs_aggregated_errors(self, mock_error):
"""Test if log_aggregated_errors logs a list of aggregated errors."""
logger = RepoLogger(__name__)
RepoLogger.errors[:] = []
result = None
def mock_handler(log):
nonlocal result
result = log.getMessage()
mock_out = mock.MagicMock()
mock_out.level = 0
mock_out.handle = mock_handler
logger.addHandler(mock_out)
logger.error("We're no strangers to love")
self.assertEqual(result, "We're no strangers to love")
def test_warning_logs_error(self):
"""Test if warning fn outputs logs."""
logger = RepoLogger(__name__)
RepoLogger.errors[:] = []
result = None
def mock_handler(log):
nonlocal result
result = log.getMessage()
mock_out = mock.MagicMock()
mock_out.level = 0
mock_out.handle = mock_handler
logger.addHandler(mock_out)
logger.warning("You know the rules and so do I (do I)")
self.assertEqual(result, "You know the rules and so do I (do I)")
def test_error_aggregates_error_msg(self):
"""Test if error fn aggregates error logs."""
logger = RepoLogger(__name__)
RepoLogger.errors[:] = []
logger.error("A full commitment's what I'm thinking of")
logger.error("You wouldn't get this from any other guy")
logger.error("I just wanna tell you how I'm feeling")
logger.error("Gotta make you understand")
self.assertEqual(
RepoLogger.errors[:],
[
"A full commitment's what I'm thinking of",
"You wouldn't get this from any other guy",
"I just wanna tell you how I'm feeling",
"Gotta make you understand",
],
logger.log_aggregated_errors(
RepoExitError(
aggregate_errors=[
Exception("foo"),
Exception("bar"),
Exception("baz"),
Exception("hello"),
Exception("world"),
Exception("test"),
]
)
)
def test_log_aggregated_errors_logs_aggregated_errors(self):
"""Test if log_aggregated_errors outputs aggregated errors."""
logger = RepoLogger(__name__)
RepoLogger.errors[:] = []
result = []
def mock_handler(log):
nonlocal result
result.append(log.getMessage())
mock_out = mock.MagicMock()
mock_out.level = 0
mock_out.handle = mock_handler
logger.addHandler(mock_out)
logger.error("Never gonna give you up")
logger.error("Never gonna let you down")
logger.error("Never gonna run around and desert you")
logger.log_aggregated_errors()
self.assertEqual(
result,
mock_error.assert_has_calls(
[
"Never gonna give you up",
"Never gonna let you down",
"Never gonna run around and desert you",
"=" * 80,
"Repo command failed due to following errors:",
(
"Never gonna give you up\n"
"Never gonna let you down\n"
"Never gonna run around and desert you"
mock.call("=" * 80),
mock.call(
"Repo command failed due to the following `%s` errors:",
"RepoExitError",
),
],
mock.call("foo\nbar\nbaz\nhello\nworld"),
mock.call("+%d additional errors...", 1),
]
)
@mock.patch.object(RepoLogger, "error")
def test_log_aggregated_errors_logs_single_error(self, mock_error):
"""Test if log_aggregated_errors logs empty aggregated_errors."""
logger = RepoLogger(__name__)
logger.log_aggregated_errors(RepoExitError())
mock_error.assert_has_calls(
[
mock.call("=" * 80),
mock.call("Repo command failed: %s", "RepoExitError"),
]
)

View File

@@ -117,8 +117,12 @@ class LocalSyncState(unittest.TestCase):
def setUp(self):
"""Common setup."""
self.repodir = tempfile.mkdtemp(".repo")
self.topdir = tempfile.mkdtemp("LocalSyncState")
self.repodir = os.path.join(self.topdir, ".repo")
os.makedirs(self.repodir)
self.manifest = mock.MagicMock(
topdir=self.topdir,
repodir=self.repodir,
repoProject=mock.MagicMock(relpath=".repo/repo"),
)
@@ -126,7 +130,7 @@ class LocalSyncState(unittest.TestCase):
def tearDown(self):
"""Common teardown."""
shutil.rmtree(self.repodir)
shutil.rmtree(self.topdir)
def _new_state(self, time=_TIME):
with mock.patch("time.time", return_value=time):
@@ -261,6 +265,95 @@ class LocalSyncState(unittest.TestCase):
self.assertIsNone(self.state.GetFetchTime(projA))
self.assertEqual(self.state.GetFetchTime(projB), 7)
def test_prune_removed_and_symlinked_projects(self):
"""Removed projects that still exists on disk as symlink are pruned."""
with open(self.state._path, "w") as f:
f.write(
"""
{
"projA": {
"last_fetch": 5
},
"projB": {
"last_fetch": 7
}
}
"""
)
def mock_exists(path):
return True
def mock_islink(path):
if "projB" in path:
return True
return False
projA = mock.MagicMock(relpath="projA")
projB = mock.MagicMock(relpath="projB")
self.state = self._new_state()
self.assertEqual(self.state.GetFetchTime(projA), 5)
self.assertEqual(self.state.GetFetchTime(projB), 7)
with mock.patch("os.path.exists", side_effect=mock_exists):
with mock.patch("os.path.islink", side_effect=mock_islink):
self.state.PruneRemovedProjects()
self.assertIsNone(self.state.GetFetchTime(projB))
self.state = self._new_state()
self.assertIsNone(self.state.GetFetchTime(projB))
self.assertEqual(self.state.GetFetchTime(projA), 5)
class FakeProject:
def __init__(self, relpath):
self.relpath = relpath
def __str__(self):
return f"project: {self.relpath}"
def __repr__(self):
return str(self)
class SafeCheckoutOrder(unittest.TestCase):
def test_no_nested(self):
p_f = FakeProject("f")
p_foo = FakeProject("foo")
out = sync._SafeCheckoutOrder([p_f, p_foo])
self.assertEqual(out, [[p_f, p_foo]])
def test_basic_nested(self):
p_foo = FakeProject("foo")
p_foo_bar = FakeProject("foo/bar")
out = sync._SafeCheckoutOrder([p_foo, p_foo_bar])
self.assertEqual(out, [[p_foo], [p_foo_bar]])
def test_complex_nested(self):
p_foo = FakeProject("foo")
p_foobar = FakeProject("foobar")
p_foo_dash_bar = FakeProject("foo-bar")
p_foo_bar = FakeProject("foo/bar")
p_foo_bar_baz_baq = FakeProject("foo/bar/baz/baq")
p_bar = FakeProject("bar")
out = sync._SafeCheckoutOrder(
[
p_foo_bar_baz_baq,
p_foo,
p_foobar,
p_foo_dash_bar,
p_foo_bar,
p_bar,
]
)
self.assertEqual(
out,
[
[p_bar, p_foo, p_foo_dash_bar, p_foobar],
[p_foo_bar],
[p_foo_bar_baz_baq],
],
)
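
These expectations can be met by sorting projects on their path components (so "foo/bar" groups with "foo" rather than landing after "foo-bar") and bumping a project to a later layer whenever one of its parent paths has already been placed. A sketch consistent with the tests above, assuming simple objects with a relpath attribute (not repo's actual _SafeCheckoutOrder):

    def safe_checkout_order_sketch(projects):
        """Group projects into layers so parents check out before children."""
        # Sort by path components, not raw strings: "foo/bar" must sort next
        # to "foo" even when "foo-bar" and "foobar" exist.
        projects = sorted(projects, key=lambda p: p.relpath.split("/"))

        layers = [[]]
        placed = [set()]  # relpaths already assigned, per layer
        for project in projects:
            parts = project.relpath.split("/")
            parents = {"/".join(parts[:i]) for i in range(1, len(parts))}
            # Place the project one layer below the deepest layer that
            # already contains one of its parent paths.
            level = 0
            for i, relpaths in enumerate(placed):
                if parents & relpaths:
                    level = i + 1
            if level == len(layers):
                layers.append([])
                placed.append(set())
            layers[level].append(project)
            placed[level].add(project.relpath)
        return layers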
class GetPreciousObjectsState(unittest.TestCase):
"""Tests for _GetPreciousObjectsState."""

View File

@@ -418,7 +418,7 @@ class SetupGnuPG(RepoWrapperTestCase):
self.wrapper.home_dot_repo, "gnupg"
)
self.assertTrue(self.wrapper.SetupGnuPG(True))
with open(os.path.join(tempdir, "keyring-version"), "r") as fp:
with open(os.path.join(tempdir, "keyring-version")) as fp:
data = fp.read()
self.assertEqual(
".".join(str(x) for x in self.wrapper.KEYRING_VERSION),

View File

@@ -15,7 +15,8 @@
# https://tox.readthedocs.io/
[tox]
envlist = lint, py36, py37, py38, py39, py310, py311
envlist = lint, py36, py37, py38, py39, py310, py311, py312
requires = virtualenv<20.22.0
[gh-actions]
python =
@@ -25,11 +26,13 @@ python =
3.9: py39
3.10: py310
3.11: py311
3.12: py312
[testenv]
deps =
black
flake8
isort
pytest
pytest-timeout
commands = {envpython} run_tests {posargs}
@@ -55,6 +58,3 @@ deps =
commands =
black {posargs:.}
flake8
[pytest]
timeout = 300