Compare commits

176 Commits

ab2d321104 sync: fix connection error on macOS
With a large number of sync workers, the sync process may fail on
macOS due to connection errors. The root cause is that multiple
workers may attempt to connect to the multiprocessing manager server
at the same time when handling the first job. This can lead to
connection failures if there are too many pending connections, exceeding
the socket listening backlog.

Bug: 377538810
Change-Id: I1924d318d076ca3be61d75daa37bfa8d7dc23ed7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/441541
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-11-06 16:33:17 +00:00
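A hedged sketch of the failure mode and a common mitigation: when many workers dial the manager server at once, connections past the socket listen backlog are refused, and a jittered retry spreads them out. The `connect_with_retry` helper below is illustrative, not repo's actual code:

```python
import random
import time

def connect_with_retry(connect, retries=5, base_delay=0.01):
    """Retry a connection refused by an overflowed listen backlog.

    `connect` stands in for a callable such as
    multiprocessing.managers.BaseManager.connect (hypothetical wiring).
    """
    for attempt in range(retries):
        try:
            return connect()
        except ConnectionRefusedError:
            if attempt == retries - 1:
                raise
            # Exponential backoff with jitter de-synchronizes workers.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))
```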
aada468916 upload: Return correct tuple values in _ProcessResults
Incorrect tuple values were returned with http://go/grev/440221 -
instead of returning (Project, ReviewableBranch), _ProcessResults was
returning (int, ReviewableBranch).

R=jojwang@google.com

Bug: 376731172
Change-Id: I75205f42fd23f5ee6bd8d0c15b18066189b42bd9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/441121
Reviewed-by: Sam Saccone <samccone@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-10-31 21:18:53 +00:00
1d5098617e worktree: Do not try to fix relative paths
--worktree was broken with incorrect paths in the .git files
whenever the local copy of git populated gitdir with relative paths
instead of absolute paths.

Bug: 376251410
Change-Id: Id32dc1576315218967de2a9bfe43bf7a5a0e7aa6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440801
Commit-Queue: Allen Webb <allenwebb@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Allen Webb <allenwebb@google.com>
2024-10-30 17:03:57 +00:00
e219c78fe5 forall: Fix returning results early
rc should be returned only after all results are processed.

R=jojwang@google.com

Bug: b/376454189
Change-Id: I8200b9954240dd3e8e9f2ab82494779a3cb38627
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440901
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
2024-10-30 16:11:04 +00:00
f9f4df62e0 Use full name of the revision when checking dest-branch
The manifest usually doesn't specify the revision with the full name
(e.g. refs/heads/REV).
However, when checking the merge branch, the full name is used.

This CL uses the full name of the revision when comparing it with the
merge branch.

Bug: b/370919047
Test: repo upload on a project with `dest-branch` set
Change-Id: Ib6fa2f7246beb5bae0a26a70048a7ac03b6c5a2f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438401
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Joe Hsu <joehsu@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-28 23:47:08 +00:00
ebdf0409d2 Add REPO_SKIP_SELF_UPDATE check in sync
The _PostRepoFetch step will try to self-update
during repo sync. That is beneficial but adds
version uncertainty, failure potential, and slowdowns
in non-interactive scenarios.

Conditionally skip the update if the environment
variable REPO_SKIP_SELF_UPDATE is defined.

A call to selfupdate works as before, meaning even
with the variable set, it will run the update.

Change-Id: Iab0ef55dc3d3db3cbf1ba1f506c57fbb58a504c3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/439967
Tested-by: Fredrik de Groot <fredrik.de.groot@haleytek.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-10-28 17:46:25 +00:00
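The described check reduces to a simple environment lookup; a minimal sketch (the function name is illustrative, not repo's actual code):

```python
import os

def should_skip_self_update():
    # Any defined value disables the automatic update during `repo sync`;
    # an explicit `repo selfupdate` bypasses this check and still runs.
    return "REPO_SKIP_SELF_UPDATE" in os.environ
```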
303bd963d5 manifest: add optional base check on remove and extend
This adds an optional, built-in checker for
guarding against patches hanging on wrong
base revisions, which is useful if a lower layer of
the manifest changes after a patch was done.

When adding a patch with a new revision using
extend-project or remove-project/project:

          C---D---E patches in project bla
         /
    A---B project bla in manifest state 1

<extend-project name="bla" revision="E" base-rev="B">

If project bla gets updated, in a new snap ID
or by a supplier or similar, to a new state:

          C---D---E patches in project bla
         /
    A---B---F---G project bla in manifest state 2

Parsing will fail because the revision of bla is now G,
giving the choice to create a new patch branch
from G and update base-rev, or keep the previous
branch for some reason and only update base-rev.

Intended for use in a layered manifest with
hashed revisions. Named refs like branches and tags
also work fine when comparing, but will be misleading
if a branch is used as base-rev.

Change-Id: Ic6211550a7d3cc9656057f6a2087c505b40cad2b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/436777
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Fredrik de Groot <fredrik.de.groot@haleytek.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-28 16:55:10 +00:00
ae384f8623 [event_log] Stop leaking semaphore resources
With the global state and fork, we are left with uncleaned resources.
Isolate multiprocessing.Value in a function so we stop the leak.

Bug: 353656374
Change-Id: If50bb544bda12b72f00c02bc1d2c0d19de000b88
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440261
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-10-24 16:58:17 +00:00
70a4e643e6 progress: always show done message
The done message was omitted if the task took less than 0.5s. This
might confuse users.

Bug: b/371638995
Change-Id: I3fdd2cd8daea16d34fba88457d09397fff71af15
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440222
Tested-by: Kuang-che Wu <kcwu@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-10-24 16:21:28 +00:00
8da4861b38 subcmds: reduce multiprocessing serialization overhead
Follow the same approach as 39ffd9977e to reduce serialization overhead.

Below benchmarks are tested with 2.7k projects on my workstation
(warm cache). git tracing is disabled for benchmark.

(seconds)              | v2.48 | v2.48 | this CL | this CL
                       |       |  -j32 |         |    -j32
-----------------------------------------------------------
with clean tree state:
branches (none)        |   5.6 |   5.9 |    1.0  |    0.9
status (clean)         |  21.3 |   9.4 |   19.4  |    4.7
diff (none)            |   7.6 |   7.2 |    5.7  |    2.2
prune (none)           |   5.7 |   6.1 |    1.3  |    1.2
abandon (none)         |  19.4 |  18.6 |    0.9  |    0.8
upload (none)          |  19.7 |  18.7 |    0.9  |    0.8
forall -c true         |   7.5 |   7.6 |    0.6  |    0.6
forall -c "git log -1" |  11.3 |  11.1 |    0.6  |    0.6

with branches:
start BRANCH --all     |  21.9 |  20.3 |   13.6  |    2.6
checkout BRANCH        |  29.1 |  27.8 |    1.1  |    1.0
branches (2)           |  28.0 |  28.6 |    1.5  |    1.3
abandon BRANCH         |  29.2 |  27.5 |    9.7  |    2.2

Bug: b/371638995
Change-Id: I53989a3d1e43063587b3f52f852b1c2c56b49412
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/440221
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Kuang-che Wu <kcwu@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
2024-10-23 23:34:34 +00:00
39ffd9977e sync: reduce multiprocessing serialization overhead
Background:
 - Manifest object is large (for projects like Android) in terms of
   serialization cost and size (more than 1mb).
 - Lots of Project objects usually share only a few manifest objects.

Before this CL, Project objects were passed to workers via function
parameters. Function parameters are pickled separately (in chunk). In
other words, manifests are serialized again and again. The major
serialization overhead of repo sync was
  O(manifest_size * projects / chunksize)

This CL uses the following tricks to reduce serialization overhead.
 - All projects are pickled in one invocation. Because Project objects
   share manifests, the pickle library remembers which objects it has
   already seen and avoids the repeated serialization cost.
 - Pass the Project objects to workers at worker initialization time,
   and pass a project index as the function parameter instead. The
   number of workers is much smaller than the number of projects.
 - Worker init state is shared on Linux (fork based), so it requires
   zero serialization for Project objects.

On Linux (fork based), the serialization overhead is
  O(projects)  --- one int per project
On Windows (spawn based), the serialization overhead is
  O(manifest_size * min(workers, projects))

Moreover, use chunksize=1 to avoid the chance that some workers are idle
while other workers still have more than one job in their chunk queue.

Using 2.7k projects as the baseline, originally "repo sync" no-op
sync takes 31s for fetch and 25s for checkout on my Linux workstation.
With this CL, it takes 12s for fetch and 1s for checkout.

Bug: b/371638995
Change-Id: Ifa22072ea54eacb4a5c525c050d84de371e87caa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/439921
Tested-by: Kuang-che Wu <kcwu@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
2024-10-23 02:58:45 +00:00
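The first trick relies on pickle's memo table: objects reachable more than once in a single dumps() call are serialized only once. A self-contained sketch with toy classes (not repo's real Manifest/Project):

```python
import pickle

class Manifest:
    def __init__(self):
        self.data = "x" * 100_000  # stand-in for a >1 MB manifest

class Project:
    def __init__(self, name, manifest):
        self.name = name
        self.manifest = manifest

manifest = Manifest()
projects = [Project(f"p{i}", manifest) for i in range(100)]

# Pickled one-by-one (as chunked function parameters), the shared
# manifest is re-serialized for every project.
separate = sum(len(pickle.dumps(p)) for p in projects)

# Pickled in one invocation (e.g. once at worker init), the pickle
# memo serializes the shared manifest a single time.
together = len(pickle.dumps(projects))
```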
584863fb5e Fix incremental syncs for prjs with submodules
When performing an incremental sync (re-running repo init with an
updated manifest revision) with --fetch-submodules or sync-s=true,
there is an attempt to get a list of all projects (including
submodules) before projects are actually fetched. However, we can
only list submodules of a project if we have already fetched its
revision. Instead of throwing an error when we don't have the
revision, assume there are no submodules for that project. In the
sync cmd, we already update the list of projects to include
submodules after fetching superprojects.

Change-Id: I48bc68c48b5b10117356b18f5375d17f9a89ec05
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/439761
Commit-Queue: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Tested-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@linaro.org>
2024-10-18 03:55:10 +00:00
454fdaf119 sync: Always use WORKER_BATCH_SIZE
With 551285fa35, the comment about the number
of workers no longer stands - the dict is shared among processes and
real-time information is available.

Using 2.7k projects as the baseline, a chunk size of 4 takes close
to 5 minutes. A chunk size of 32 takes this down to 40s - a reduction of
roughly 8 times, which matches the increase.

R=gavinmak@google.com

Bug: b/371638995
Change-Id: Ida5fd8f7abc44b3b82c02aa0f7f7ae01dff5eb07
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438523
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2024-10-07 18:44:19 +00:00
f7f9dd4deb project: Handle git sso auth failures as repo exit
If a user is not authenticated, repo continues execution and will
likely print more of the same errors. A user is also likely to SIGTERM
the process, resulting in even more errors.

This change stops repo sync if any of the repositories can't be fetched
due to a Git authentication failure when using the sso helper. We could
extend this to all Git authentication.

Change-Id: I9e471e063450c0a51f25a5e7f12a83064dfb170c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438522
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-03 20:47:50 +00:00
70ee4dd313 superproject: Remove notice about beta
It's been the default for Android for over a year now, and it's no
longer a useful notice.

Change-Id: I53c6f1e7cee8c1b2f408e67d3a6732db3b272bee
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438521
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-03 20:37:18 +00:00
cfe3095e50 project: run fetch --refetch on unable to parse commit
Similarly to e59e2ae757, handle missing
gc'ed commits by running `git fetch --refetch`.

R=jojwang@google.com

Bug: b/360889369
Bug: b/371000949
Change-Id: I108b870b855d3b9f23665afa134c6e35f7cd2830
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438461
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-10-03 17:40:37 +00:00
621de7ed12 Disable git terminal prompt during fetch/clone
A git fetch operation may prompt the user to enter a username and
password. This won't be visible to the user during a repo sync since
stdout and stderr are redirected. If that happens, the user may think
repo is doing work and likely won't realize it's stuck waiting on
input.

This patch disables prompts for clone and fetch operations, so repo
will fail fast.

R=gavinmak@google.com

Bug: b/368644181
Change-Id: I2efa88ae66067587a00678eda155d861034b9127
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/438001
Reviewed-by: Nasser Grainawi <nasser.grainawi@linaro.org>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2024-09-26 22:10:36 +00:00
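The fail-fast behavior comes from git's GIT_TERMINAL_PROMPT environment variable; a minimal sketch of building such an environment (the helper name is illustrative):

```python
import os

def noninteractive_git_env(base=None):
    # With GIT_TERMINAL_PROMPT=0, git errors out instead of prompting
    # for credentials on the terminal.
    env = dict(os.environ if base is None else base)
    env["GIT_TERMINAL_PROMPT"] = "0"
    return env

# Usage sketch: subprocess.run(["git", "fetch", "origin"],
#                              env=noninteractive_git_env())
```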
d7ebdf56be init: add --manifest-upstream-branch
When a sha1 is provided to '--manifest-branch', the ref which
is expected to contain that sha1 can be provided using the new
'--manifest-upstream-branch' option. This is useful with
'--current-branch' to avoid having to sync all heads and tags,
or with a commit that comes from a non-head/tag ref (like a
Gerrit change ref).

Change-Id: I46a3e255ca69ed9e809039e58b0c163e02af94ef
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/436717
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Tested-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
2024-09-26 00:52:28 +00:00
fabab4e245 man: regenerate man pages
Change-Id: Icf697eda7d5dcdc87854ad6adf607353c7ba5ac2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/437941
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@linaro.org>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-09-25 20:57:42 +00:00
b577444a90 project: Copy and link files even with local branches
In the winding maze that constitutes Sync_LocalHalf(), there are paths
in which we don't copy-and-link files. Examples include something like:

  cd some/project/
  repo start head .
  # do some work, make some commit, upload that commit to Gerrit

  [[ ... in the meantime, someone adds a <linkfile ...> for
     some/project/ in the manifest ... ]]

  cd some/project/
  git pull --rebase
  repo sync

In this case, we never hit a `repo rebase` case, which might have saved
us. Instead, the developer is left confused why some/project/ never had
its <linkfile>s created.

Notably, this opens up one more corner case in which <linkfile ... /> or
<copyfile ... /> could potentially clobber existing work in the
destination directory, but there are existing cases where that's true,
and frankly, those seem like bigger holes than this new one.

Change-Id: I394b0e4529023a8ee319dc25d03d513a19251a4a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/437421
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Brian Norris <briannorris@google.com>
Commit-Queue: Brian Norris <briannorris@google.com>
2024-09-19 00:11:52 +00:00
1e19f7dd61 sync: include TARGET_RELEASE when constructing smart sync target.
When using the smart sync option, we try to construct the target that
was "lunched" from the TARGET_PRODUCT and TARGET_BUILD_VARIANT envvars.

However, an Android target is now made of three parts,
{TARGET_PRODUCT}-{TARGET_RELEASE}-{TARGET_BUILD_VARIANT}.

I am leaving the option of creating a target if a TARGET_RELEASE is not
specified in case there are other consumers who depend on that option.

BUG=b:358101714
TEST=./run_tests
TEST=smart sync on android repo and manually inspecting
smart_sync_override.xml

Change-Id: I556137e33558783a86a0631f29756910b4a93d92
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/436977
Tested-by: Yiwei Zhang <yiwzhang@google.com>
Reviewed-by: Yiwei Zhang <yiwzhang@google.com>
Commit-Queue: Yiwei Zhang <yiwzhang@google.com>
2024-09-12 16:15:50 +00:00
d8b4101eae color: fix have_fg not being reassigned to true
In the _parse method, the variable 'have_fg' is always False; the
reassignment to True appears to have been lost.
The author's original intention seems to be: if some values are set in
the gitconfig file (for example: text = black red ul), the first is the
bg color, the second is the fg color, and the last one is the attr.

Change-Id: I372698fe625db4c1fdaa94ea7f193a80a850ecb9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/425997
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Bright Ma <mmh1989@foxmail.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-09-12 16:15:06 +00:00
1c53b0fa44 tox.ini: Make the lint and format environments run black for all code
This matches the extra files specified in run_tests.

Change-Id: Ic8999383a17b3ec7ae27322323ea44eeaa40c968
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/434998
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-09-12 16:09:24 +00:00
e5ae870a2f tox.ini, constraints.txt: Lock the version of black to <24
The formatting produced by black versions before 24 matches the current
formatting of the code.

Change-Id: I045f22d2f32a09d4683867293e81512f2abd1036
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/434997
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-09-12 16:05:35 +00:00
e59e2ae757 project: run fetch --refetch on "could not parse commit"
git may gc reachable objects in a partial clone repository due to a bug
(report:
https://lore.kernel.org/git/20240802073143.56731-1-hanyang.tony@bytedance.com/
). Until git is properly patched and released, force --refetch iff
"could not parse commit" is part of the git output. --refetch will
ensure that gc'ed objects are retrieved.

Bug: b/360889369
Change-Id: I0fc911c591060f859235dcd8d019881106f0858e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/437017
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Sam Saccone <samccone@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-09-12 15:27:12 +00:00
c44ad09309 Add a --rebase option to sync command
Previously repo would abort a sync if there were published changes not
merged upstream. The --rebase option allows the modification of
published commits.

This is a copy of http://go/grev/369694 with the merge conflicts
resolved.

Bug: 40014610
Change-Id: Idac8199400346327b530abea33f1ed794e5bb4c2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/435838
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Jeroen Dhollander <jeroendh@google.com>
Commit-Queue: Jeroen Dhollander <jeroendh@google.com>
2024-08-30 09:08:29 +00:00
4592a63de5 sync: Fix git command for aborting rebase being called incorrectly.
The argument list was incorrectly destructured so only the first
element of the list was considered a git-cmd, split up by each
character in the string.

Change-Id: Idee8a95a89a7da8b8addde07135354fc506c2758
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/435839
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Erik Elmeke <erik@haleytek.corp-partner.google.com>
Tested-by: Erik Elmeke <erik@haleytek.corp-partner.google.com>
2024-08-28 08:56:35 +00:00
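The failure mode above generalizes: when a helper expecting an argument list receives a single string, iterating the string yields one "argument" per character. A toy illustration (`to_argv` is hypothetical, not repo's code):

```python
def to_argv(cmd):
    # Expects a sequence of arguments, e.g. ["rebase", "--abort"].
    return [str(a) for a in cmd]

good = to_argv(["rebase", "--abort"])  # ['rebase', '--abort']
bad = to_argv("rebase --abort")        # one element per character
```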
0444ddf78e project: ignore more curl failure modes
Current clone bundle fetches from Google storage result in HTTP/404
and curl exiting 56.  This is basically WAI, so stop emitting
verbose error output whenever that happens.  Also add a few more
curl exit statuses based on chromite history, and document them.

Change-Id: I3109f8a8a19109ba9bbd62780b40bbcd4fce9b76
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/432197
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2024-07-02 19:03:54 +00:00
9bf8236c24 logging: Fix log formatting with colored output
The log message is already formatted before being passed to the colorer.
To avoid the exception "TypeError: not enough arguments for format
string", we should use the `nofmt_colorer` instead.

This bug occurs only when the formatted string still contains '%'
character. The following snippet can reproduce the bug:

```
from repo_logging import RepoLogger
RepoLogger(__name__).error("%s", "100% failed")
```

Change-Id: I4e3977b3d21aec4e0deb95fc1c6dd1e59272d695
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/432017
Tested-by: Shik Chen <shik@google.com>
Commit-Queue: Shik Chen <shik@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2024-07-02 06:24:31 +00:00
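The snippet in the commit above fails because the message is %-formatted twice; the "% f" inside the already-formatted string is parsed as a float conversion on the second pass. A standalone demonstration:

```python
# First (intended) formatting pass.
already_formatted = "%s" % "100% failed"   # -> "100% failed"

# Second pass, as the formatting colorer performed it: "% f" now looks
# like a conversion specifier with no argument to consume.
try:
    already_formatted % ()
    raised = False
except TypeError:
    raised = True
```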
87f52f308c upload: add a --topic option for setting topic explicitly
Let people specify the exact topic when uploading CLs.  The existing
-t option only supports setting the topic to the current local branch.

Add a --topic-branch long option to the existing -t to align it a bit
better with --hashtag & --hashtag-branch.

Change-Id: I010abc4a7f3c685021cae776dd1e597c22b79135
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/431997
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-07-01 17:54:19 +00:00
562cea7758 sync: Abort rebase in progress if force-checkout is set
This will make "repo sync -d --force-checkout" more reliable
in CI automation, as there are fewer things in the way that may
need manual intervention.

Bug: b/40015382
Change-Id: I8a79971724a3d9a8e2d682b7a0c04deda9e34177
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/423317
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Erik Elmeke <erik@haleytek.corp-partner.google.com>
Commit-Queue: Erik Elmeke <erik@haleytek.corp-partner.google.com>
2024-05-23 14:14:18 +00:00
eede374e3e ssh: Set git protocol version 2 on SSH ControlMaster
According to https://git-scm.com/docs/protocol-v2#_ssh_and_file_transport,
when using SSH, the environment variable GIT_PROTOCOL must be set
when establishing the connection to the git server.

Normally git does this by itself. But in repo-tool where the SSH
connection is managed by the repo-tool, it must be passed in
explicitly instead.

Under some circumstances of environment configuration, this
caused all repo sync commands over ssh to always use
git protocol version 1. Even when git was configured to use
version 2.

Using git protocol v2 can significantly improve fetch speeds,
since it uses server-side filtering of refs, reducing the
amount of unnecessary objects sent.

Change-Id: I6d4c3b7300a6090d707480b1a638ed03622fa71a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/411362
Tested-by: Erik Elmeke <erik@haleytek.corp-partner.google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Erik Elmeke <erik@haleytek.corp-partner.google.com>
2024-05-16 13:26:46 +00:00
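Per the linked protocol-v2 documentation, the client sets GIT_PROTOCOL and asks ssh to forward it. A sketch of that wiring (assumes the server's sshd lists the variable in AcceptEnv; the helper name is illustrative):

```python
import os

def ssh_command_with_protocol(host, version=2):
    # ssh does not forward arbitrary environment variables by default,
    # so GIT_PROTOCOL must be both set locally and listed in SendEnv.
    env = dict(os.environ)
    env["GIT_PROTOCOL"] = f"version={version}"
    argv = ["ssh", "-o", "SendEnv=GIT_PROTOCOL", host]
    return argv, env
```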
2c5fb84d35 upload: drop check for uncommitted local changes
git push, like most git commands, does not warn or otherwise prompt
users when there are local uncommitted changes.  To simplify the
upload logic, and to harmonize repo upload with git push as a more
git-esque flow, stop checking/warning/prompting the user here too.

Change-Id: Iee18132f0faad0881f1a796cb58821328e04b694
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/425337
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-05-14 02:32:27 +00:00
12f6dc49e9 git: raise hard version to 1.9.1
Debian 7 Wheezy went EOL in May 2018.  We don't need to carry support
for that anymore as there have been 5 major releases since.  Ubuntu
Precise went EOL in Apr 2019 (including the extended support phase).
That means we can bump the required git version from 1.7.9 to 1.9.1.

git-1.7.9 was released in 2012 while git-1.9.1 was released in 2014.
So that shouldn't be a problem either.  And we've been warning people
using git versions older than 1.9.1 for 3 years now that they need to
upgrade.

Change-Id: Ifbbf72f51010b0a944c2785895d1b605333f9146
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/415637
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-05-01 15:23:50 +00:00
5591d99ee2 release: update-hooks: helper for automatically syncing hooks
These hooks are maintained in other projects.  Add a script to automate
their import so people don't send us changes directly, and we can try to
steer them to the correct place.

Change-Id: Iac0bdb3aae84dda43a1600e73107555b513ce82b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/422177
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-04-23 18:31:51 +00:00
9d865454aa gitc: delete a few more dead references
Change-Id: I1da6f2ee799c735a63ac3ca6e5abd1211af10433
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/419217
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-04-18 02:30:06 +00:00
cbd78a9194 man: regenerate man pages
Change-Id: I8d9dcb37f315d4208b7c8005206ae939dad79a3e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/419197
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2024-04-18 02:28:33 +00:00
46819a78a1 Remove platform_utils.realpath
... since it's just a simple wrapper of os.path.realpath now.

Change-Id: I7433e5fe09c64b130f06e2541151dce1961772c9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/416637
Tested-by: Kaiyi Li <kaiyili@google.com>
Reviewed-by: Greg Edelston <gredelston@google.com>
Commit-Queue: Kaiyi Li <kaiyili@google.com>
2024-03-27 17:13:58 +00:00
159389f0da Fix drive mounted directory on Windows
On my Windows machine, I mount drive D: to the directory C:\src.

The old implementation returns the incorrect 'C:\\??\\Volume{ad2eb15e-f293-4d48-a448-54757d95a97c}' result, which breaks the repo init command.

With the use of os.path.realpath, it can return 'D:\\' correctly.

Change-Id: Ia5f53989055125cb282d4123cf55d060718aa1ff
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/416580
Reviewed-by: Greg Edelston <gredelston@google.com>
Tested-by: Kaiyi Li <kaiyili@google.com>
Commit-Queue: Kaiyi Li <kaiyili@google.com>
2024-03-27 14:00:47 +00:00
4406642e20 git_command: unify soft/hard versions with requirements.json
Use the requirements logic in the wrapper to load versions out of the
requirements.json file to avoid duplicating them in git_command.py.

Change-Id: Ib479049fc54ebc6f52c2c30d1315cf1734ff1990
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/415617
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2024-03-21 21:20:50 +00:00
73356f1d5c project: Check if dotgit exists w/out symlink check
os.path.exists returns false on a broken symlink. This is not what repo
needs when checking if a project is setup properly.

For example, if src/foo/.git can't be resolved, repo tries to create
symlink and that results in FileExistsError.

Use lexists, which returns True even if the symlink is broken. That
forces the path where repo checks where the symlink points and fixes
it to the correct location.

Bug: b/281746795
Change-Id: Id3f7dc3a3cb6499d02ce7335eca992ddc7deb645
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/415197
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: George Engelbrecht <engeg@google.com>
Reviewed-by: Greg Edelston <gredelston@google.com>
2024-03-20 22:09:14 +00:00
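The exists/lexists distinction is easy to demonstrate with a deliberately broken symlink (POSIX-only sketch):

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    link = os.path.join(d, "dotgit")
    os.symlink(os.path.join(d, "missing-target"), link)

    broken_exists = os.path.exists(link)    # follows the link: False
    broken_lexists = os.path.lexists(link)  # checks the link itself: True
```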
09fc214a79 git: raise soft version to 2.7.4
git-1.9.1 was released in 2014 while git-2.7.4 was released in 2016.
Debian Stretch was released in 2017 and Ubuntu Xenial was released in
2016 which are plenty old at this point.  Both of those include at
least git-2.7.4.

We will start warning users of Debian Jessie (released in 2015 & EOL
in 2020) and Ubuntu Trusty (released in 2014 & EOL Apr 2024) that
they will need to upgrade.

Change-Id: I6be3809bc45968fdcb02cad3f7daf75ded1bb5b1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/415137
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-03-20 21:11:26 +00:00
3762b17e98 git: raise hard version to 1.7.9
Debian 6 Squeeze went EOL in Feb 2016.  We don't need to carry support
for that anymore as there have been 6 major releases since.  That means
we can bump the required git version from 1.7.2 to 1.7.9.  Ubuntu Precise
shipped with the latter.

git-1.7.2 was released in 2010 while git-1.7.9 was released in 2012.
So that shouldn't be a problem either.  And we've been warning people
using git versions older than 1.9.1 for 3 years now that they need to
upgrade.

Change-Id: I7712f110ea158297b489b8379b112c6700b21a46
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/415097
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-03-20 19:49:44 +00:00
ae419e1e01 docs: release: add recent git/python/ssh/debian info
Change-Id: I744360b1bfc816e94b3511f0130abb2c08dedd42
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/415117
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-03-20 19:49:01 +00:00
a3a7372612 main: Stringify project name in error_info
If a project can't be removed from the checkout due to uncommitted
changes, error.project is of type Project and not a string (as it is in
some cases). Project is not JSON serializable, resulting in an
exception within the exception handler:
within exception handler:

TypeError: Object of type Project is not JSON serializable

This change casts project to string as a defensive mechanism. It also
passes project name instead of project object.

Change-Id: Ie7b782d73dc3647975755d5a3774d16ea6cd5348
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/413877
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-03-15 19:26:10 +00:00
fff1d2d74c ssh: Print details if running the command fails
Change-Id: I87adbdd5fe4eb2709c97ab4c21b414145acf788b
Signed-off-by: Sebastian Schuberth <sschuberth@gmail.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/392915
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Tuan Vo Hung <vohungtuan@gmail.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-03-11 16:40:55 +00:00
4b01a242d8 upload: Fix patchset description destination
Bug: 308467447
Change-Id: I8ad598d39f5fdb24d549d3277ad5fedac203581b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/412477
Reviewed-by: George Engelbrecht <engeg@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-03-08 18:05:36 +00:00
46790229fc sync: Fix sorting for nested projects
The current logic to create checkout layers doesn't work in all cases.
For example, assume there are three projects: "foo", "foo/bar" and
"foo-bar". Sorting in lexicographical order is incorrect, as "foo-bar"
would be placed between "foo" and "foo/bar", breaking the layering
logic.

Instead, we split filepaths using the path delimiter (always /) and
then sort lexicographically on the components.

BUG=b:325119758
TEST=./run_tests, manual sync on chromiumos repository

Change-Id: I76924c3cc6ba2bb860d7a3e48406a6bba8f58c10
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/412338
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: George Engelbrecht <engeg@google.com>
2024-03-08 17:58:24 +00:00
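The component-wise sort described above can be illustrated with a small
standalone sketch (not repo's actual sync code):

```python
paths = ["foo-bar", "foo/bar", "foo"]

# Naive lexicographical sort on the raw string: "-" (0x2D) sorts before
# "/" (0x2F), so "foo-bar" lands between "foo" and "foo/bar", splitting
# the "foo" layer.
naive = sorted(paths)  # ["foo", "foo-bar", "foo/bar"]

# Sorting on the path components keeps a parent directly adjacent to
# the projects nested under it.
layered = sorted(paths, key=lambda p: p.split("/"))
# ["foo", "foo/bar", "foo-bar"]
```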
edadb25c02 sync: introduce --force-checkout
In some cases (e.g. in a CI system), it's desirable to be able to
instruct repo to force checkout. This flag passes the --force flag to
`git checkout` operations.

Bug: b/327624021
Change-Id: I579edda546fb8147c4e1a267e2605fcf6e597421
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/411518
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: George Engelbrecht <engeg@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-03-07 17:21:51 +00:00
96edb9b573 upload: Add support for setting patchset description
Bug: 308467447
Change-Id: I7abcbc98131b826120fc9ab85d5b889f90db4b0a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/355968
Tested-by: Sergiy Belozorov <sergiyb@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Sergiy Belozorov <sergiyb@chromium.org>
2024-03-04 18:50:24 +00:00
5554572f02 sync: Introduce git checkout levels
If a repo manifest is updated so that project B is placed within
project A, and if project A had content in new B's location in the old
checkout, then repo sync could break depending on checkout order,
since B can't be checked out before A.

This change introduces checkout levels, which enforce the right
sequence of checkouts while still allowing for parallel checkout. In
the example above, A will always be checked out first, before B.

BUG=b:325119758
TEST=./run_tests, manual sync on ChromeOS repository

Change-Id: Ib3b5e4d2639ca56620a1e4c6bf76d7b1ab805250
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410421
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Greg Edelston <gredelston@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2024-02-27 17:28:33 +00:00
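The layering idea above can be sketched as follows (hypothetical helper
name and shape; repo's real implementation differs):

```python
def checkout_levels(paths):
    """Assign each checkout path a level such that a nested project
    always gets a higher level than any project containing its path.
    Level 0 projects can be checked out first (in parallel), then
    level 1, and so on."""
    levels = {}
    for path in sorted(paths, key=lambda p: p.split("/")):
        parts = path.split("/")
        level = 0
        # A path's level is one more than its deepest ancestor project.
        for i in range(1, len(parts)):
            ancestor = "/".join(parts[:i])
            if ancestor in levels:
                level = max(level, levels[ancestor] + 1)
        levels[path] = level
    return levels
```

With projects "a", "a/b" and "c", both "a" and "c" land on level 0 and
"a/b" on level 1, so the containing project is always checked out first.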
97ca50f5f9 git_command: Return None from GetEventTargetPath() if set to empty string
If trace2.eventTarget was set to the empty string,
match git behavior and don't write a trace.

Bug: 319673783
Change-Id: I02b3884ad97551f8a9d7363c2cbe6b0adee6f73e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410518
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Peter Collingbourne <pcc@google.com>
2024-02-26 17:51:11 +00:00
8896b68926 trace: Save trace2 sid in REPO_TRACE file
git-trace2 events contain additional information about what git is
doing under the hood, which repo doesn't have visibility into.

Instead of relying on timestamp information to match REPO_TRACE with
git-trace2 events, add SID information into REPO_TRACE.

Change-Id: I37672a3face81858072c7a3ce34ca3379199dab5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410280
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-02-22 20:55:09 +00:00
fec8cd6704 subcmds: sync: Remove deprecated _AUTO_GC
Opportunistic cleanup. It looks like this deprecated feature was slated
for deletion nearly a year ago.

Bug: None
Test: ./run_tests
Change-Id: I0bd0c0e6acbd1eaee1c0b4945c79038257d22f44
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/410198
Reviewed-by: Yiwei Zhang <yiwzhang@google.com>
Commit-Queue: Greg Edelston <gredelston@google.com>
Tested-by: Greg Edelston <gredelston@google.com>
2024-02-20 19:55:15 +00:00
b8139bdcf8 launcher: Set shebang to python3
Some (most?) Linux distros don't have /usr/bin/python unless
python-is-python3 is installed. While package owners can adjust the
shebang, we have seen an increase in the number of bugs filed as extra
steps are required.

Per PEP 394, the python3 name is acceptable and should be available
wherever Python 3 is supported. We no longer support python2, and repo
no longer works with python2, so this change makes that explicit.

Bug: 40014585
Change-Id: I9aed90fd470ef601bd33bd596af3df69da69ef5d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/407497
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Максим Паймушкин <maxim.paymushkin@gmail.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2024-02-07 20:44:32 +00:00
26fa3180fb sync: ensure RepoChangedException propagated
Prior to this change, RepoChangedException would be caught and
re-raised as a different exception. This would prevent the
RepoChangedException handler in main.py from running.

Bug: b/323232806
Change-Id: I9055ff95d439d6ff225206c5bf1755cc718bcfcc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/407144
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-02-06 18:46:19 +00:00
d379e77f44 stop passing project to UpdateManifestError
UpdateManifestError inherits from RepoExitError, which inherits
from BaseRepoError. None of them takes project as a keyword
argument, causing errors like "UpdateManifestError() takes no
keyword arguments" in b/317183321.

[1]: https://gerrit.googlesource.com/git-repo/+/449b23b698d7d4b13909667a49a0698eb495eeaa/error.py#144

Bug: b/317183321
Change-Id: I64c3dc502027f9dda56a0824f2712364b4502934
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398997
Commit-Queue: Yiwei Zhang <yiwzhang@google.com>
Tested-by: Yiwei Zhang <yiwzhang@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
2024-02-02 18:35:13 +00:00
4217a82bec project: Rename if deletion fails
If a project contains files not owned by the current user, removal
will fail. In order to ensure repo sync continues to work, rename the
affected project instead, and let the user know about it.

Bug: 321273512
Change-Id: I0779d61fc67042308a0226adea7d98167252a5d3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/404372
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-01-25 21:32:58 +00:00
208f344950 Clean up remaining repo sync log spam.
There are still some verbose messages (e.g. "remote: ...") when doing
repo sync after a couple of days. Let's hide them behind the verbose
flag.

Bug: N/A
Test: repo sync
Change-Id: I1408472c95ed80d9555adfe8f92211245c03cf41
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/400855
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Tomasz Wasilczyk <twasilczyk@google.com>
Commit-Queue: Tomasz Wasilczyk <twasilczyk@google.com>
2024-01-05 21:40:43 +00:00
138c8a9ff5 docs: fix some grammar typos
Change-Id: Ie1a32cda67f94b0a2b3329b1be9e03dcbedf39cc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/400917
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2024-01-04 17:19:33 +00:00
9b57aa00f6 project: Check references during sync
Symbolic references need to be checked each time sync is called, not
only for newly created repositories. For example, it is possible to
change a project name to an already existing name, and without this
patch that will result in a broken git setup: refs/ will still point
to the old repository, whereas all objects will point to the new
repository.

Bug: 40013418
Change-Id: I596d29d182986804989f0562fb45090224549b0f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/395798
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2024-01-03 22:26:07 +00:00
b1d1ece2fb tests: setup user identity for tests
After a6413f5d a GitCommandError is raised.

Since no user identity was set up, these tests fail:
 - ReviewableBranchTests from test_project.py
 - ResolveRepoRev and CheckRepoRev from test_wrapper.py

Test: ./run_tests
Change-Id: Id7f5772afe22c77fc4c8f8f0b8be1b627ed42187
Signed-off-by: Vitalii Dmitriev <vitalii.dmitriev@unikie.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398658
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Vitalii Dmitriev <dmit.vitalii@gmail.com>
Commit-Queue: Vitalii Dmitriev <dmit.vitalii@gmail.com>
2023-12-20 19:04:57 +00:00
449b23b698 manifest_xml: fix url normalization for inits and remotes
Before this change, repo normalizes urls
with the following format only:

    git@github.com:foo/bar

It doesn't cover the following case:

   <remote name="org" fetch="git@github.com:org/" />
   <project name="somerepo" remote="org" />

which results in:
   error: Cannot fetch somerepo
     from ssh://git@github.com/org/git@github.com:org/somerepo

This change fixes it by also normalizing this format:

    git@github.com:foo
Test: ./run_tests tests/test_manifest_xml.py
Change-Id: I1ad0f5df0d52c0b7229ba4c9a4db4eecb5c1a003
Signed-off-by: Vitalii Dmitriev <vitalii.dmitriev@unikie.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398337
Commit-Queue: Vitalii Dmitriev <dmit.vitalii@gmail.com>
Tested-by: Vitalii Dmitriev <dmit.vitalii@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-12-20 07:38:49 +00:00
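A minimal sketch of that normalization (hypothetical function name;
repo's actual manifest_xml code differs):

```python
import re

def normalize_url(url):
    """Rewrite scp-like syntax (user@host:path) into an ssh:// URL so
    that joining a project name onto a remote fetch URL works, covering
    both git@github.com:foo/bar and the bare git@github.com:org/ form."""
    m = re.match(r"^(?P<user>[^@/]+)@(?P<host>[^:/]+):(?P<path>.*)$", url)
    if m:
        return f"ssh://{m.group('user')}@{m.group('host')}/{m.group('path')}"
    return url
```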
e5fb6e585f git_trace2: Add socket timeout
repo blocks indefinitely until the trace collector receives trace
events, which is not desired. This change adds a fixed timeout to
connect and send operations. It is possible that some events will be
lost; repo logs any failed trace operation.

Bug: b/316227772
Change-Id: I017636421b8e22ae3fcbab9e4eb2bee1d4fbbff4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398717
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
2023-12-19 19:38:52 +00:00
48e4137eba manifest_xml: do not allow / before : in scp-like syntax
Since git doesn't treat these as ssh:// URIs, we shouldn't either.

Bug: https://g-issues.gerritcodereview.com/issues/40010331
Change-Id: I001f49be30395187cac447d09cb5a6c29e95768b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/398517
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-12-19 18:00:44 +00:00
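The rule can be expressed as a small predicate (hypothetical sketch of
the check; git's own URL parsing lives in C):

```python
def is_scp_like(url):
    """Treat user@host:path as scp-like only when no "/" appears before
    the first ":", matching git's behavior: "./repo:branch" or
    "dir/name:x" are local paths, not ssh URLs."""
    if "://" in url:
        return False  # explicit scheme, e.g. ssh:// or https://
    colon = url.find(":")
    slash = url.find("/")
    # scp-like requires a ":" with no "/" anywhere before it
    return colon > 0 and (slash == -1 or colon < slash)
```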
172c58398b repo: Drop reexec of python3 from check_python_version()
This simplifies check_python_version() since there is no point in trying
to fall back to python3, as we are already running using some Python 3
interpreter.

Change-Id: I9dfdd002b4ef5567e064d3d6ca98ee1f3410fd48
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397759
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2023-12-15 06:49:27 +00:00
aa506db8a7 repo: Remove Python 2 compatibility code
Change-Id: I1f5c691bf94f255442eea95e59ddd93db6213ad8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397758
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2023-12-15 06:48:48 +00:00
14c61d2c9d repo: Remove a Python 2 related comment
The EnvironmentError exception was changed to OSError in commit
ae824fb2fc.

Change-Id: I1b4ff742af409ec848131e82900e885c9f089f0c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397757
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2023-12-14 18:31:51 +00:00
4c80921d22 Don't log spam repo sync by default
Most of the time, a repo sync after a while (a week or more) results
in a bunch of messages that are not very useful for the average user:
- discarding 1 commits
- Deleting obsolete checkout.

Bug: N/A
Test: repo sync
Change-Id: I881eab61f9f261e98f3656c09e73ddd159ce288c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/397038
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Tomasz Wasilczyk <twasilczyk@google.com>
2023-12-08 23:08:46 +00:00
f56484c05b tox: Remove pylint timeout
It's not a valid pylint config

Change-Id: Ida480429a3a86637f26e9fc3a0d6fa2d225d952a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/396921
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
2023-12-08 22:55:22 +00:00
a50c4e3bc0 Update commit-msg hook
Modified in https://gerrit-review.googlesource.com/c/gerrit/+/394841.

Change-Id: I381e48fbdb92b33454219dd9d945a1756e551a77
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/395577
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Orgad Shaneh <orgads@gmail.com>
Commit-Queue: Orgad Shaneh <orgads@gmail.com>
Reviewed-by: Ernesto Rodriguez <guez30nesto@gmail.com>
2023-12-04 17:43:33 +00:00
0dd0a830b0 sync: Fix partial sync false positive
If a project is removed from the manifest and a symlink to another
project is placed at the path where the project used to exist, repo
will start to warn about partial syncs even though a partial sync did
not occur.

Repro steps:

1) Create a manifest with two projects. Project a -> a/ and project b -> b/
2) Run `repo sync`
3) Remove project b from the manifest.
4) Use `link` in the manifest to link all of Project a to b/

Bug: 314161804
Change-Id: I4a4ac4f70a7038bc7e0c4e0e51ae9fc942411a34
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/395640
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Matt Schulte <matsch@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-11-30 22:36:41 +00:00
9f0ef5d926 repo: add repo main script's directory to PYTHONPATH.
Python 3.11 introduces PYTHONSAFEPATH and the -P flag which, if enabled,
does not prepend the script's directory to sys.path by default.
This breaks repo because main.py expects its own directory to be part of
Python's import path.

This causes problems with tools that add PYTHONSAFEPATH to python
programs, most notably Bazel.

We will simply prepend main.py's path to PYTHONPATH instead.

Bug: 307767740
Change-Id: I94f3fda50213e450df0d1e2df6a0b8b597416973
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391236
Tested-by: Duy Truong <duytruong@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-11-29 11:50:53 +00:00
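The workaround amounts to prepending the script's directory to
PYTHONPATH before re-invoking Python (hypothetical helper name; a
sketch of the idea, not repo's exact code):

```python
import os

def env_with_script_dir(script_dir, env=None):
    """Return a copy of the environment with script_dir prepended to
    PYTHONPATH, so imports from the script's directory keep working
    even under Python 3.11's -P flag / PYTHONSAFEPATH."""
    env = dict(os.environ if env is None else env)
    existing = env.get("PYTHONPATH")
    env["PYTHONPATH"] = (
        script_dir if not existing else script_dir + os.pathsep + existing
    )
    return env
```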
c287428b37 info: Handle undefined mergeBranch
When a repo client is initialized with --standalone-manifest, it
doesn't have a merge branch defined. This results in mergeBranch being
None.

Bug: b/308025460
Change-Id: Iebceac0976e5d3adab5300bd8dfc76744a791234
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/393716
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2023-11-20 19:22:53 +00:00
c984e8d4f6 manifest_xml: support nested submanifests
Change-Id: I58f91c6b0db631bb7f55164f41d11d3a349ac94f
Signed-off-by: Guillaume Micouin-Jorda <gmicouin@netcourrier.com>
Signed-off-by: Hadamik Stephan <Stephan.Hadamik@continental-corporation.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/392020
Reviewed-by: Ben PUJOL <pujolbe@gmail.com>
Reviewed-by: LaMont Jones <lamontjones@google.com>
Tested-by: Roberto Prado <roberto.prado.c@gmail.com>
Commit-Queue: Roberto Prado <roberto.prado.c@gmail.com>
Tested-by: LaMont Jones <lamontjones@google.com>
Reviewed-by: Roberto Prado <roberto.prado.c@gmail.com>
2023-11-15 13:06:23 +00:00
6d821124e0 repo_logging: Ensure error details are printed
This updates RepoLogger.log_aggregated_errors to print out the error
message of the RepoExitError when there is no list of aggregated
errors.

Previously it would log out:
=======================================================================
Repo command failed: ManifestParseError

This told us what class of error occurred but missed the helpful error
message that developers put in the error. After this change it will now
print out the error message:

=======================================================================
Repo command failed: ManifestParseError
    error parsing manifest /path/to/manifest.xml: no element found:
    line 197, column 0

Change-Id: I4805540fddb5fa9171dbc8912becfa7fdfb1ba67
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/392614
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Joshua Bartel <josh.bartel@garmin.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-11-13 20:51:19 +00:00
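The fallback behavior can be sketched like this (function and attribute
names are hypothetical; repo's RepoLogger differs):

```python
def format_exit_error(err):
    """When an exit error carries no aggregated error list, still show
    str(err) instead of only the class name."""
    lines = [f"Repo command failed: {type(err).__name__}"]
    aggregated = getattr(err, "aggregate_errors", None)  # assumed attr
    if aggregated:
        lines += [f"    {e}" for e in aggregated]
    elif str(err):
        # Previously this branch was missing, so the message was lost.
        lines.append(f"    {err}")
    return "\n".join(lines)
```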
560a79727f repo: Use the worktree when checking the repo rev.
Avoids treating the operation as if it were acting on a bare repository, thereby triggering failures when the Git client is configured with `safe.bareRepository=explicit`. Repo doesn't actually use a bare repository, but pointing at the gitdir acts as if it did.

Bug: 307559774
Change-Id: I2c142275b2726a59526729c0b2c54faf728f125d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391554
Commit-Queue: Jason R. Coombs <jaraco@google.com>
Tested-by: Jason R. Coombs <jaraco@google.com>
Tested-by: Emily Shaffer <emilyshaffer@google.com>
Reviewed-by: Emily Shaffer <emilyshaffer@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-11-13 18:21:31 +00:00
8a6d1724d9 git_superproject: tell git that superproject is bare
The superproject is initialized as a bare repo in Superproject:_Init().
That means that later operations must treat it as a bare repository,
specifying the gitdir and setting 'bare' appropriately when launching
GitCommand()s. It's also OK not to specify cwd here because GitCommand()
will drop cwd if bare == True anyway.

With this change, it's possible to run `repo init` and `repo sync` with the
Git config 'safe.bareRepository' set to 'explicit'. This config strengthens
Git's security posture against embedded bare repository attacks like
https://github.com/justinsteven/advisories/blob/main/2022_git_buried_bare_repos_and_fsmonitor_various_abuses.md.

Bug: b/227257481
Change-Id: I954a64c6883d2ca2af9c603e7076fd83b52584e9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389794
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jason R. Coombs <jaraco@google.com>
Tested-by: Emily Shaffer <emilyshaffer@google.com>
Reviewed-by: Emily Shaffer <emilyshaffer@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-11-09 22:13:17 +00:00
3652b497bb Correctly handle schema-less URIs for remote fetch URL
Currently we don't deal with schema-less URIs like
`git@github.com:foo` at all, resulting in a scenario where we append
them to the manifest repo URL.

In order to deal with this, we munge both the manifest URL and the
fetch URL into a format we like and proceed with that.

Bug: https://g-issues.gerritcodereview.com/issues/40010331
Change-Id: I7b79fc4ed276630fdbeb235b94e327b172f0879b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386954
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Michael Kelly <mkelly@arista.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-11-08 05:03:20 +00:00
89f761cfef main: Log ManifestParseError exception messages
This lets us see manifest parsing error messages again.

Change-Id: I2d90b97cfb50e4520f79e75fa0d648c373b49e98
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391477
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Łukasz Patron <priv.luk@gmail.com>
Tested-by: Łukasz Patron <priv.luk@gmail.com>
2023-11-06 19:39:24 +00:00
d32b2dcd15 repo: Remove unreachable code.
Change-Id: I41371feb88c85e9da0656b9fab04057c22d1dcf4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/391514
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-11-01 17:02:34 +00:00
b32ccbb66b cleanup: Update codebase to expect Python 3.6
- Bump minimum version to Python 3.6.
- Use f-strings in a lot of places.

Change-Id: I2aa70197230fcec2eff8e7c8eb754f20c08075bb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389034
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-10-31 16:03:54 +00:00
b99272c601 sync: PersistentTransport call parent init
Found via pylint:
  W0231: __init__ method from base class 'Transport'
  is not called (super-init-not-called)

Just fixed for code correctness and to avoid potential future bugs.

Change-Id: Ie1e723c2afe65d026d70ac01a16ee7a40c149834
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390676
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-25 09:06:23 +00:00
b0430b5bc5 sync: TeeStringIO write should return int
Change-Id: I211776a493cad4b005c6e201833e9700def2feb9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390657
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-24 19:29:57 +00:00
1fd5c4bdf2 sync: Fix tracking of broken links
Change-Id: Ice4f4cc745cbac59f356bd4ce1124b6162894e61
Bug: b/113935847
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390434
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
2023-10-24 18:49:20 +00:00
9267d58727 project: Speculative fix for project corruption
When a new shared project is added to the manifest, there's a short
window where objects that are used by other projects can be deleted.

To close that window, set preciousObjects during git init. For
non-shared projects, repo should correct the state in the same execution
instance.

Bug: 288102993
Change-Id: I366f524535ac58c820d51a88599ae2108df9ab48
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390234
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-10-23 16:13:02 +00:00
ae824fb2fc cleanup: convert exceptions to OSError
In Python 3, these exceptions were merged into OSError, so switch
everything over to that.

Change-Id: If876a28b692de5aa5c62a3bdc8c000793ce52c63
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390376
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-21 00:56:10 +00:00
034950b9ee cleanup: delete redundant "r" open mode
Change-Id: I86ebb8c5a9dc3752e8a25f4b11b64c5be3a6429e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390375
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-21 00:55:33 +00:00
0bcffd8656 cleanup: use new dict & set generator styles
Change-Id: Ie34ac33ada7855945c77238da3ce644f8a9f8306
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390374
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-21 00:55:01 +00:00
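The style change in question, shown side by side (standalone
illustration):

```python
# Older style: generator expressions fed into the constructors.
squares_old = dict((n, n * n) for n in range(4))
evens_old = set(n for n in range(8) if n % 2 == 0)

# Newer dict/set comprehension literals: equivalent, but more direct
# and avoid building intermediate tuples.
squares = {n: n * n for n in range(4)}
evens = {n for n in range(8) if n % 2 == 0}
```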
7393f6bc41 manifest_xml: Fix empty project list when DOCTYPE is present
When parsing the manifest XML, the code looks for a top
level DOM node named "manifest". However, it doesn't check
that it's an element type node, so if an XML document type
declaration node is also present (which has the same name
as the root element), it selects the wrong node and you
end up with no projects defined at all.

Change-Id: I8d101caffbbc2a06e56136ff21302e3f09cfc96b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390357
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Chris Allen <chris.allen@arm.com>
Commit-Queue: Chris Allen <chris.allen@arm.com>
2023-10-20 18:22:59 +00:00
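The failure mode is easy to reproduce with minidom (standalone
illustration of the bug; not repo's actual parsing code):

```python
import xml.dom.minidom

doc = xml.dom.minidom.parseString(
    '<!DOCTYPE manifest><manifest><project name="p"/></manifest>'
)

# The DocumentType node is also named "manifest", so a lookup on
# nodeName alone picks the DOCTYPE first, which has no project children.
by_name = [n for n in doc.childNodes if n.nodeName == "manifest"]

# Fix: additionally require an element node.
root = [n for n in by_name if n.nodeType == n.ELEMENT_NODE][0]
```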
8dd8521854 cleanup: leverage yield from in more places
Change-Id: I4f9cb27d89241d3738486764817b51981444a903
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390274
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-20 17:33:03 +00:00
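`yield from` replaces the manual re-yield loop when delegating to a
nested generator (standalone illustration):

```python
def flatten_old(tree):
    for node in tree:
        if isinstance(node, list):
            for item in flatten_old(node):  # manual delegation loop
                yield item
        else:
            yield node

def flatten(tree):
    for node in tree:
        if isinstance(node, list):
            yield from flatten(node)  # same behavior, one line
        else:
            yield node
```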
49c9b06838 git_config: GetBoolean should return bool
Test: tox
Change-Id: Ifc0dc089deef5a3b396d889c9ebfcf8d4f007bf2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390360
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-20 16:41:58 +00:00
3d58d219cb project: using --depth results in error when including submanifests
Fix: https://issues.gerritcodereview.com/issues/40015442
Change-Id: I7fb6c50cf2e438b21181ce1a5893885f09b9ee2b
Signed-off-by: Roberto Vladimir Prado Carranza <roberto.prado.c@gmail.com>
Signed-off-by: Guillaume Micouin-Jorda <gmicouin@netcourrier.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385995
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jerome Couto <jerome.couto@renault.com>
2023-10-20 12:34:34 +00:00
c0aad7de18 repo: drop Python 2 compat logic
Bug: 302871152
Change-Id: Ie7a0219e7ac582cd25c2bc5fb530e2c03bcbcc6e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390034
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-20 05:19:40 +00:00
d4aee6570b delete Python 2 (object) compat
Bug: 302871152
Change-Id: I39636d73a6e1d69efa8ade74f75c5381651e6dc8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/390054
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-20 04:51:01 +00:00
024df06ec1 tests: Set HOME to a temporary directory when running tests.
When running the tests in my environment, tests derived from `test_wrapper.GitCheckoutTestCase` would fail on commit or tag due to incomplete or incorrect gpg config. Ideally, the tests should not depend on the user's git config. This change ensures $HOME (or the Windows equivalent) is replaced for the session.

Bug: 302797407

Change-Id: Ib42b712dd7b6602fee6e18329a8c6d52fb9458b9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388235
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-10-17 15:15:55 +00:00
45809e51ca tests: added python 3.12
Add the recently released Python 3.12 to our
list of test environments.

Change-Id: I05ec0129ad29c16fff65ddfb389f251571f811a2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389754
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-17 13:58:33 +00:00
331c5dd3e7 github: add python 3.11 to test-ci.yml
Add Python 3.11 to the test matrix.

Change-Id: I0533205b5a10105b3144f770aa08c4c649aaf6be
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389675
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-10-16 22:29:49 +00:00
e848e9f72c github: pin ubuntu to 20.04 to make py36 work
Ubuntu versions newer than 20.04 do not support Python 3.6, as per
https://raw.githubusercontent.com/actions/python-versions/main/versions-manifest.json

Change-Id: I92d8e762a7d05e4b0d6d6e90944ceedbbfa74e57
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389117
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-16 22:26:38 +00:00
1544afe460 python-support: update with current status & guidelines
This doc was written back in 2019 when we were planning on the Python 3
migration.  It isn't relevant anymore, and people are reading it thinking
we still support Python 2.  Rewrite it to match current requirements and
to make it clear there is no support for older versions.

Bug: 302871152
Change-Id: I2acf3aee1816a03ee0a70774db8bf4a23713a03f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389455
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-14 06:22:53 +00:00
3b8f9535c7 hooks: drop support for Python 2
Stop running old repohooks via python2.  Abort immediately with a
clear error for the user.

Bug: 302871152
Change-Id: I750c6cbbf3c7950e249512bb1bd023c32587eef5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389454
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-10-13 20:52:46 +00:00
8f4f98582e main: drop Python 2 check
Python 2 can't even parse this code anymore due to syntax changes,
so there's no point in checking for it explicitly.

Bug: 302871152
Change-Id: I9852ace5f5079d037c60fd3ac490d77e074e6875
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389434
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-10-13 20:08:33 +00:00
8bc5000423 Update logger.warn to logger.warning
Bug: 305035810
Change-Id: Ic2b35d5c3cbe92480c24da612f29382f5d26d4aa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/389414
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-13 19:34:26 +00:00
6a7f73bb9a git_command: read1 needs a size in py3.6
Not setting size causes "TypeError: read1() takes exactly one argument
(0 given)" in Python 3.6.
In Python 3.7 onwards size defaults to -1, which means an arbitrary
number of bytes will be returned.

Compare https://docs.python.org/3.6/library/io.html#io.BufferedReader.read1
and https://docs.python.org/3.7/library/io.html#io.BufferedIOBase.read1
for more details.
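A minimal illustration of the compatibility difference (a standalone sketch, not repo's actual code):

```python
import io

# Python 3.6's BufferedReader.read1() requires an explicit size;
# from 3.7 onward, size defaults to -1 (return an arbitrary amount).
buf = io.BufferedReader(io.BytesIO(b"hello world"))
data = buf.read1(4096)  # passing a size works on 3.6 and newer alike
print(data)
```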

Change-Id: Ia4aaf8140ead9493ec650fac167c641569e6a9d8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388718
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-09 17:04:38 +00:00
23d063bdcd git_command: lru_cache needs maxsize for py36 & 37
Python 3.6 and 3.7 do not have a default value for lru_cache maxsize.
Not setting it would cause:
  TypeError: Expected maxsize to be an integer or None
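A small sketch of the portable form (the decorated function here is hypothetical):

```python
import functools

# On Python 3.6/3.7, lru_cache must be called with maxsize; the bare
# @functools.lru_cache decorator form only works on Python 3.8+.
@functools.lru_cache(maxsize=None)
def describe(name):
    return f"git-repo/{name}"

print(describe("test"))
```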

Change-Id: I32d4fb6a0040a0c24da0b2f29f22f85a36c96531
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388737
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-10-09 14:08:29 +00:00
ce0ed799b6 sync: Fix print statement in _PostRepoFetch
R=jasonnc@google.com

Bug: b/303806829
Change-Id: I49075bfb55b842610786e61a0dedfe008cd1296a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388614
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-06 18:39:46 +00:00
2844a5f3cc git_command: Augment underlying git errors with suggestions
This change appends suggestions to the underlying git error to make the
error slightly more actionable.

DD: go/improve-repo-error-reporting & go/tee-repo-stderr

Bug: b/292704435
Change-Id: I2bf8bea5fca42c6a9acd2fadc70f58f22456e027
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387774
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-10-06 18:21:45 +00:00
47944bbe2e project: Invoke realpath on dotgit for symmetry with gitdir to ensure a short relpath.
Bug: 302680231

Change-Id: Icd01dd2ce62d737a4acb114e729189cd31f6bde9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/388234
Tested-by: Jason R. Coombs <jaraco@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason R. Coombs <jaraco@google.com>
2023-10-05 14:29:29 +00:00
83c66ec661 Reset info logs back to print in sync
Bug: b/292704435
Change-Id: Ib4b4873de726888fc68e476167ff2dcd74ec9045
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387974
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
2023-09-28 19:46:49 +00:00
87058c6ca5 Track expected git errors in logs
Sometimes it is expected that a GitCommand executed in repo fails. In
such cases indicate in trace logs that the error was expected.

Bug: b/293344017
Change-Id: If137fae9ef9769258246f5b4494e070345db4a71
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387714
Commit-Queue: Jason Chang <jasonnc@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
2023-09-27 19:05:16 +00:00
b5644160b7 tests: Fix tox error in py36 use virtualenv<20.22.0
tox uses virtualenv under the hood for managing virtual environments.
Virtualenv 20.22.0 dropped support for Python <= 3.6.

Since we want to test against Python 3.6 we need to make sure we use
a version of virtualenv earlier than 20.22.0.

This error was not stopping any tests from passing but was printed
multiple times to stderr when executing the py36 target:

  Error processing line 1 of [...]/.tox/py36/[...]/_virtualenv.pth:

    Traceback (most recent call last):
      File "/usr/lib/python3.6/site.py", line 168, in addpackage
        exec(line)
      File "<string>", line 1, in <module>
      File "[...]/.tox/py36/[...]/_virtualenv.py", line 3
        from __future__ import annotations
                                         ^
    SyntaxError: future feature annotations is not defined

Source: https://tox.wiki/en/latest/faq.html#testing-end-of-life-python-versions
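Per the tox FAQ linked above, a pin along these lines in tox.ini keeps the py36 environments on a compatible virtualenv (a sketch; the exact section layout in this repo may differ):

```
[tox]
requires =
    virtualenv<20.22.0
```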
Change-Id: I27bd8200987ecf745108ee8c7561a365f542102a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/387694
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-09-27 18:47:04 +00:00
aadd12cb08 Use non-deprecated API for obtaining UTC time
DeprecationWarning: datetime.datetime.utcnow() is deprecated and
scheduled for removal in a future version. Use timezone-aware objects to
represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
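A minimal before/after sketch (note that `datetime.UTC` is only available from Python 3.11; `datetime.timezone.utc` works on older versions too):

```python
import datetime

# Deprecated: naive UTC timestamp with no tzinfo attached.
# naive = datetime.datetime.utcnow()

# Preferred: timezone-aware UTC timestamp.
aware = datetime.datetime.now(datetime.timezone.utc)
print(aware.tzinfo)
```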

Change-Id: Ia2c46fb87c544d98cc2dd68a829f67d4770b479c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386615
Tested-by: Łukasz Patron <priv.luk@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Łukasz Patron <priv.luk@gmail.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-09-18 23:59:37 +00:00
b8fd19215f main: Use repo logger
Bug: b/292704435
Change-Id: Ica02e4c00994a2f64083bb36e8f4ee8aa45d76bd
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386454
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-18 20:06:30 +00:00
7a1f1f70f0 project: Use repo logger
Bug: b/292704435
Change-Id: I510fc911530db2c84a7ee099fa2905ceac35d0b7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386295
Reviewed-by: Jason Chang <jasonnc@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-14 17:14:40 +00:00
c993c5068e subcmds: Use repo logger
Bug: b/292704435
Change-Id: Ia3a45d87fc0bf0d4a1ba53050d9c3cd2dba20e55
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386236
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-14 17:13:37 +00:00
c3d7c8536c github: add PR closer
We don't accept PRs via GH, so add a job to automatically close them
with an explanation for how to submit.

Change-Id: I5cc3176549a04ff23b04dae1110cd27a58ba1fd3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/386134
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-09-13 18:42:18 +00:00
880c621dc6 tests: test_subcmds_sync.py: fix for py3.6 & 3.7
tests/test_subcmds_sync.py::LocalSyncState::test_prune_removed_projects
was failing in Python 3.6 and 3.7 because topdir was not set, with the
following error message:
    TypeError: expected str, bytes or os.PathLike object, not MagicMock

topdir is accessed from within PruneRemovedProjects().

Test: tox with Python 3.6 to 3.11
Change-Id: I7ba5144df0a0126c01776384e2178136c3510091
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382816
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-09-13 18:24:04 +00:00
da6ae1da8b tests: test_git_superproject.py: fix py3.6 & 3.7
tests/test_git_superproject.py::SuperprojectTestCase::test_Fetch was
failing in Python 3.6 and 3.7 because the args attribute of mock call
objects was only introduced in Python 3.8. Fall back on the old way of
accessing the arguments.
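The fallback can be sketched like this (a standalone example, not the repo test itself):

```python
from unittest import mock

m = mock.Mock()
m("fetch", "--depth=1")
call = m.call_args
# call.args was only added in Python 3.8; indexing works everywhere:
args = call[0]
print(args)
```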

Test: tox with Python 3.6 to 3.11
Change-Id: Iae1934a7bce8cbd6b4519e4dbc92d94e21b43435
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382818
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-09-13 18:23:40 +00:00
5771897459 start: Use repo logger
Bug: b/292704435
Change-Id: I7b8988207dfdcf0ffc283a48499611892ef5187d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385534
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-09-11 21:38:55 +00:00
56a5a01c65 project: Use IsId instead of ID_RE.match
Change-Id: I8ca83a034400da0cb97cba41415bfc50858a898b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385857
Tested-by: Sylvain Desodt <sylvain.desodt@gmail.com>
Commit-Queue: Sylvain Desodt <sylvain.desodt@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-09-11 12:35:19 +00:00
e9cb391117 project: Optimise GetCommitRevisionId when revisionId is set
When comparing 2 manifests, most of the time is
spent getting the relevant commit id as it relies
on _allrefs which ends up loading all git references.

However, the value from `revisionId` (when it is valid)
could be used directly, leading to a huge performance improvement
(from 180+ seconds to less than 0.01 sec, which is more
than 25000 times faster for manifests with 700+ projects).

Bug: 295282548

Change-Id: I5881aa4b2326cc17bbb4ee91d23293111f76ad7e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385834
Tested-by: Sylvain Desodt <sylvain.desodt@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Sylvain Desodt <sylvain.desodt@gmail.com>
2023-09-11 12:28:25 +00:00
25d6c7cc10 manifest_xml: use a set instead of (sorted) list in projectsDiff
The logic in projectsDiff performs various operations which
suggest that a set is more appropriate than a list:
 - membership lookup ("in")
 - removal

Also, sorting can be performed on the remaining elements at the
end (which will usually involve a much smaller number of elements).

(The performance gain is invisible in comparison to the time being
spent performing git operations).

Cosmetic changes:
 - the definition of 'fromProj' is moved to be used in more places
 - the values in diff["added"] are added with a single call to extend
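The idea can be sketched as follows (the project names here are hypothetical):

```python
# Membership tests and removals are O(1) on a set; sort only the
# survivors once at the end.
fromProjects = {"platform/a", "platform/b", "platform/c"}
toProjects = ["platform/b", "platform/d"]

for name in toProjects:
    if name in fromProjects:   # O(1) set lookup instead of a list scan
        fromProjects.remove(name)

removed = sorted(fromProjects)  # sort the much smaller remainder
print(removed)
```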

Change-Id: I5ed22ba73b50650ca2d3a49a1ae81f02be3b3055
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383434
Tested-by: Sylvain Desodt <sylvain.desodt@gmail.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Sylvain Desodt <sylvain.desodt@gmail.com>
2023-09-10 19:24:56 +00:00
f19b310f15 Log ErrorEvent for failing GitCommands
Change-Id: I270af7401cff310349e736bef87e9b381cc4d016
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385054
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
2023-09-06 18:22:33 +00:00
712e62b9b0 logging: Use log.formatter for coloring logs
Bug: b/292704435
Change-Id: Iebdf8fb7666592dc5df2b36aae3185d1fc71bd66
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385514
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-09-06 18:07:55 +00:00
daf2ad38eb sync: Preserve errors on KeyboardInterrupt
If a KeyboardInterrupt is encountered before an error is aggregated,
then the context surrounding the interrupt is lost. This change
aggregates errors as soon as possible for the sync command.

Bug: b/293344017
Change-Id: Iac14f9d59723cc9dedbb960f14fdc1fa5b348ea3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/384974
Tested-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-09-06 17:36:31 +00:00
b861511db9 fix black formatting of standalone programs
Black will only check .py files when given a dir and --check, so list
our few standalone programs explicitly.  This causes the repo launcher
to be reformatted since it was missed in the previous mass reformat.

Bug: b/267675342
Change-Id: Ic90a7f5d84fc02e9fccb05945310fd067e2ed764
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/385034
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-09-01 18:08:58 +00:00
e914ec293a sync: Use repo logger within sync
Bug: b/292704435
Change-Id: Iceb3ad5111e656a1ff9730ae5deb032a9b43b4a5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383454
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-08-31 22:29:51 +00:00
1e9f7b9e9e project: Preserve stderr on upload
A previous change captured stderr when uploading git projects. This
change ensures the captured output is still forwarded to stderr.

Bug: b/297097597
Change-Id: I8314e1017d2a42b7b655fe43ce3c312d397894ca
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/384134
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Sam Saccone <samccone@google.com>
2023-08-28 17:13:44 +00:00
1dbf8b4346 tox.ini: add isort as dependency
A previous change introduced isort, which caused tox runs to fail for
all Python versions. Adding isort as a dependency resolves these
issues.

Change-Id: If3faf78e6928e6e5111b2ef2359351459832431f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/384175
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-08-28 02:08:45 +00:00
6447733eb2 isort: format codebase
Change-Id: I6f11d123b68fd077f558d3c21349c55c5f251019
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383715
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-08-22 18:32:22 +00:00
06ddc8c50a tweak stdlib imports to follow Google style guide
Google Python style guide says to import modules.
Clean up all our stdlib imports.  Leave the repo ones alone
for now as that's a much bigger shave.
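For example (a generic illustration of the style rule, not a diff from this change):

```python
# Discouraged by the Google style guide:
#     from os.path import join
# Preferred: import the module and qualify names at the call site.
import os.path

print(os.path.join("repo", "manifests"))
```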

Change-Id: Ida42fc2ae78b86e6b7a6cbc98f94ca04b295f8cc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383714
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-08-22 18:22:49 +00:00
16109a66b7 upload: Suggest full sync if hooks fail with partially synced tree
Pre-upload hooks may fail because of partial syncs.

Bug: b/271507654
Change-Id: I124cd386c5af2c34e1dcaa3e86916624e235b1e3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383474
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2023-08-22 17:18:13 +00:00
321b7934b5 sync: Ignore repo project when checking partial syncs
The repo project is fetched at most once a day and should be ignored
when checking if the tree is partially synced.

Bug: b/286126621, b/271507654
Change-Id: I684ed1669c3b3b9605162f8cc9d57185bb3dfe8e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383494
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2023-08-22 17:13:43 +00:00
5a3a5f7cec upload: fix error handling
There was a bug in error handling code that caused an uncaught
exception to be raised.

Bug: b/296316540
Change-Id: I49c72f29c00f26ba60de552f958bc6eddf841162
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383254
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
2023-08-21 16:52:48 +00:00
11cb96030e docs: Document .repo_localsyncstate.json
Update docs to reflect the new internal filesystem layout.

Bug: b/286126621, b/271507654
Change-Id: I8a2f8f36dff75544f32356ac5e31668f32ddffb3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/383074
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2023-08-18 18:19:46 +00:00
8914b1f86d gitc: drop support
Bug: b/282775958
Change-Id: Ib6383d6fd82a017d0a6670d6558a905d41be321f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/375314
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
2023-08-15 22:14:52 +00:00
082487dcd1 tox: enable python 3.11 testing
Python 3.11 was released almost a year ago.

Test: tox -epy311
Change-Id: I447637a1e97038a596373d7612c9000c0c738ec9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382838
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
2023-08-15 15:47:28 +00:00
f767f7d5c4 flake8: exclude venv and .tox folder
Exclude these two folders to avoid countless lint warnings caused by
the dependencies installed in them.

Change-Id: I2403b23f88cebb5941a4f9b5ac6cc34d107fd2f1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382837
Commit-Queue: Daniel Kutik <daniel.kutik@lavawerk.com>
Tested-by: Daniel Kutik <daniel.kutik@lavawerk.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-08-15 15:46:52 +00:00
1a3612fe6d Raise RepoExitError in place of sys.exit
Bug: b/293344017
Change-Id: Icae4932b00e4068cba502a5ab4a0274fd7854d9d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382214
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
2023-08-10 23:46:31 +00:00
f0aeb220de sync: Warn if partial sync state is detected
Partial syncs are not supported and can lead to strange behavior like
deleting files. Explicitly warn users on partial sync.

Bug: b/286126621, b/271507654
Change-Id: I471f78ac5942eb855bc34c80af47aa561dfa61e8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/382154
Reviewed-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2023-08-10 18:13:14 +00:00
f1ddaaa553 main: Pass path to python binary as arg0 when restarting repo
Not including it causes flaky behavior in some Chromium builders
because Chromium's custom Python build used by vpython relies on
argv[0] to find its own internal files.

Bug: https://crbug.com/1468522
Change-Id: I5c32ebe71c9b684d6ee50dbd8c3d6fcd51ca309b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/381974
Reviewed-by: Chenlin Fan <fancl@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2023-08-08 05:50:07 +00:00
f9aacd4087 Raise repo exit errors in place of sys.exit
Bug: b/293344017
Change-Id: I92d81c78eba8ff31b5252415f4c9a515a6c76411
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/381774
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
2023-08-07 23:56:07 +00:00
b8a7b4a629 Prefix error events with RepoErrorEvent:
Prior to this change there is no way to distinguish between git
session logs generated from repo source vs. from git.

Bug: b/294446468
Change-Id: I309f59e146c30cb08a0637e8d0b9c5d9efd5cada
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/381794
Commit-Queue: Jason Chang <jasonnc@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
2023-08-07 18:14:40 +00:00
32b59565b7 Refactor errors for sync command
Per discussion in go/repo-error-update, updated aggregated and exit
errors for the sync command.

Aggregated errors are errors that result in eventual command failure.
Exit errors are errors that result in immediate command failure.

Also updated main.py to log aggregated and exit errors to the git
sessions log.

Bug: b/293344017
Change-Id: I77a21f14da32fe2e68c16841feb22de72e86a251
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/379614
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
Commit-Queue: Jason Chang <jasonnc@google.com>
2023-08-02 18:29:05 +00:00
a6413f5d88 Update errors to extend BaseRepoError
In order to better analyze and track repo errors, repo command failures
need to be tied to specific errors in repo source code.

Additionally, a new GitCommandError was added to differentiate between
general git-related errors and failed git commands. Git commands that
opt into verification will raise a GitCommandError if the command
failed.

The first step in this process is a general error refactoring

Bug: b/293344017
Change-Id: I46944b1825ce892757c8dd3f7e2fab7e460760c0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/380994
Commit-Queue: Jason Chang <jasonnc@google.com>
Reviewed-by: Aravind Vasudevan <aravindvasudev@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
2023-07-31 21:31:36 +00:00
8c35d948cf [repo logging] Add logging module
Bug: b/292704435
Change-Id: I8834591f661c75449f8be5de1c61ecd43669026d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/380714
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-07-31 16:49:57 +00:00
1d2e99d028 sync: Track last completed fetch/checkout
Save the latest time any project is fetched and checked out. This will
be used to detect partial checkouts.

Bug: b/286126621
Change-Id: I53b264dc70ba168d506076dbd693ef79a696b61d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/380514
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2023-07-28 18:55:04 +00:00
c657844efe main: Fix exitcode logging
Fixed a couple of bugs in ExitEvent logging:
- log exitcode 130 on KeyboardInterrupt
- log exitcode 1 on unhandled Exception
- log errorevent with specific reason for exit

Before this CL, an exitcode of 0 would be logged, and it would be
difficult to determine the cause of non-zero exit codes.

Bug: b/287105597
Change-Id: I2d34f180581f9fbd77a1c78c966ebed065223af6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/377834
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2023-06-26 15:38:30 +00:00
1d3b4fbeec sync: Track new/existing project count
New vs existing project may be a useful measure for analyzing
sync performance.

Bug: b/287105597
Change-Id: Ibea3e90c9fe3d16fd8b863bcae22b21963a6771a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/377574
Tested-by: Jason Chang <jasonnc@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
2023-06-23 20:08:58 +00:00
be71c2f80f manifest: enable remove-project using path
A something.xml included by two different files, both of which remove
and re-add the same shared project at two different locations, would
not work prior to this change.

The reason is that the removal killed all matching name keys, even
though reusing the same repo in different locations is allowed.

Solve this by adding an optional path attribute,
<remove-project name="foo" path="only_this_path" />,
and tweaking remove-project.

It behaves as before when path is omitted, and deletes
more selectively when a remove path is supplied.

As a secondary feature, a project can now also be removed
by path alone, assuming a matching project name
can be found.

Change-Id: I502d9f949f5d858ddc1503846b170473f76dc8e2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/375694
Tested-by: Fredrik de Groot <fredrik.de.groot@aptiv.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-06-21 14:50:16 +00:00
696e0c48a9 update links from monorail to issuetracker
Change-Id: Ie05373aa4becc0e4d0cab74e7ea0a61eb2cc2746
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/377014
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2023-06-14 21:19:58 +00:00
b2263ba124 sync: Handle case when output isn't connected to a terminal
Currently `repo sync | tee` exits with an OSError.

Bug: https://crbug.com/gerrit/17023
Change-Id: I91ae05f1c91d374b5d57721d45af74db1b2072a5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/376414
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-06-09 15:26:16 +00:00
945c006f40 sync: Update sync progress even when _sync_dict is empty
By chance, _sync_dict can be empty even though repo sync is still
working. In that case, the progress message shows incorrect info.
Handle this case and fix a bug where "0 jobs" can show.

Bug: http://b/284465096
Change-Id: If915d953ba60e7cf84a6fb2d137fd6ed82abd3cc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/375494
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-05-30 20:25:00 +00:00
71122f941f sync: Handle race condition when reading active jobs
It's possible that the number of jobs is more than 0 when we check
the length, but in the meantime the number of jobs drops to 0. In
that case, we end up working with float('inf'), which causes other
problems.
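One way to guard against that window, as a sketch with a hypothetical helper (repo's actual progress code differs):

```python
def shortest_start(sync_dict):
    # Snapshot first: a worker may delete its entry between any
    # emptiness check and the min() call below.
    items = dict(sync_dict)
    if not items:
        return None  # avoid min() on an empty sequence -> ValueError
    return min(items.values())

print(shortest_start({}))
print(shortest_start({"proj/a": 5.0, "proj/b": 2.0}))
```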

Bug: 284383869
Change-Id: I5d070d1be428f8395df7fde8ca84866db46f2100
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/375134
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-05-26 15:50:20 +00:00
07a4529278 pager: set $LESS only when missing
This matches the git behavior. From [1],

> When the `LESS` environment variable is unset, Git sets it to `FRX`
> (if `LESS` environment variable is set, Git does not change it at
> all).

The default $LESS was changed from FRSX to FRX in git 2.1.0 [2]. This
change also updates the default $LESS for repo.

[1] https://git-scm.com/docs/git-config#Documentation/git-config.txt-corepager
[2] b3275838d9
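The git-matching behavior amounts to setting a default only when the variable is absent, e.g.:

```python
import os

# Set $LESS to git's default only if the user hasn't defined it;
# an existing value is left untouched.
os.environ.setdefault("LESS", "FRX")
print(os.environ["LESS"])
```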

Bug: https://crbug.com/gerrit/16973
Change-Id: I64ccaa7b034fdb6a92c10025e47f5d07e85e6451
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374894
Reviewed-by: Chih-Hsuan Yen <x5f4qvj3w3ge2tiq@chyen.cc>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Chih-Hsuan Yen <x5f4qvj3w3ge2tiq@chyen.cc>
2023-05-26 14:39:09 +00:00
17833322d9 Add envar to replace shallow clones with partial
An investigation go/git-repo-shallow shows a number of problems
when doing a shallow git fetch/clone. This change introduces an
environment variable REPO_ALLOW_SHALLOW. When this environment variable
is set to 1 during a repo init or repo sync, all shallow git fetch
commands are replaced with partial fetch commands. Any shallow
repository needing update is unshallowed. This behavior continues until
a subsequent repo sync command is run without REPO_ALLOW_SHALLOW set to 1.
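The gating described above can be sketched as follows (hypothetical helper name; repo's internals are more involved):

```python
import os

def allow_shallow_replaced():
    # Replace shallow fetches with partial fetches only when the
    # environment variable is set to exactly "1".
    return os.environ.get("REPO_ALLOW_SHALLOW") == "1"

os.environ["REPO_ALLOW_SHALLOW"] = "1"
print(allow_shallow_replaced())
```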

Bug: b/274340522
Change-Id: I1c3188270629359e52449788897d9d4988ebf280
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374754
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Tested-by: Jason Chang <jasonnc@google.com>
2023-05-25 22:37:04 +00:00
04cba4add5 sync: Show number of running fetch jobs
Last of the recent `repo sync` UX changes. Show the number of fetch jobs, e.g.:
"Fetching:  3% (8/251) 0:03 | 8 jobs | 0:01 chromiumos/overlays/chrom.."

Bug: https://crbug.com/gerrit/11293
Change-Id: I1b3dcf3e56ae6731c6c6cb73cfce069b2f374b69
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374920
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
2023-05-25 17:26:22 +00:00
3eacfdf309 upload: use f-string
Change-Id: I91b99a7147c7c3cb5485d5406316c8ffd79f9272
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374914
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-05-25 17:12:18 +00:00
aafed29d34 project: Include tags option during fetch retry
If the original fetch attempt did not want tags, we should continue to
honor that when doing a retry fetch with depth set to None. This seems
to match the intent of the retry based on the inline comment and results
in a significant performance improvement when the original fetch-by-sha1
fails due to the server not allowing requests for unadvertised objects.

Change-Id: Ia26bb31ea9aecc4ba2d3e87fc0c5412472cd98c4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/374918
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
Tested-by: Kaushik Lingarkar <kaushik.lingarkar@linaro.org>
2023-05-25 12:16:06 +00:00
90f574f02e Parse OpenSSH versions with no SSH_EXTRAVERSION
If the Debian banner is not used, then there won't be a space after the
version number: it'll be followed directly by a comma.
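A sketch of a version pattern tolerant of both terminators (not repo's actual regex):

```python
import re

# With the Debian banner: "OpenSSH_8.9p1 Debian-3ubuntu0.1, ..."
# Without it:             "OpenSSH_8.9p1, OpenSSL ..."
version_re = re.compile(r"^OpenSSH_([0-9.]+p?\d*)[ ,]")

for banner in ("OpenSSH_8.9p1 Debian-3ubuntu0.1,", "OpenSSH_8.9p1,"):
    print(version_re.match(banner).group(1))
```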

Bug: https://crbug.com/gerrit/16903
Change-Id: I12b873f32afc9424f42b772399c346f96ca95a96
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/372875
Tested-by: Saagar Jha <saagarjha@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2023-05-24 17:33:08 +00:00
551285fa35 sync: Show elapsed time for the longest syncing project
"Last synced: X" is printed only after a project finishes syncing.
Replace that with a message that shows the longest actively syncing
project.

Bug: https://crbug.com/gerrit/11293
Change-Id: I84c7873539d84999772cd554f426b44921521e85
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/372674
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2023-05-18 18:10:24 +00:00
131fc96381 [git_trace2] Add logs for critical cmds
Trace logs emitted from repo are not useful on error for many critical
commands. This change adds errors for critical commands to trace logs.

Change-Id: Ideb9358bee31e540bd84a94327a09ff9b0246a77
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373814
Reviewed-by: Joanna Wang <jojwang@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-05-17 18:06:14 +00:00
2ad5d50874 [trace2] Add absolute time on trace2 exit events
Change-Id: I58aff46bd4ff4ba79286a7f1226e19eb568c34c5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373954
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2023-05-15 22:21:23 +00:00
acb9523eaa SUBMITTING_PATCHES: update with commit queue details
Change-Id: I59dffb8524cb95b3fd4196bcecd18426f09bf9c4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373694
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2023-05-11 19:27:57 +00:00
041f97725a sync: Fix how sync times for shared projects are recorded
https://gerrit.googlesource.com/git-repo/+/d947858325ae70ff9c0b2f463a9e8c4ffd00002a
introduced a moving average of fetch times in 2012.

The code does not handle shared projects, and averages times based on
project names, which is incorrect.

Change-Id: I9926122cdb1ecf201887a81e96f5f816d3c2f72a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/373574
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
2023-05-10 21:14:57 +00:00
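The failure mode this commit describes can be sketched in isolation: a moving average keyed only by project name lets two checkouts of one shared project pollute each other's timing. The helper and the 0.5 weighting below are illustrative stand-ins, not repo's actual sync code.

```python
# Hypothetical sketch of the bug: keying fetch times by project name alone
# conflates shared projects checked out at multiple paths.
fetch_times = {}


def record(name, path, secs, key_by_name_only=True):
    # Old behavior: both checkouts of "foo" share one bucket.
    key = name if key_by_name_only else (name, path)
    prev = fetch_times.get(key, secs)
    # Simple moving average (alpha=0.5) as a stand-in for repo's weighting.
    fetch_times[key] = 0.5 * prev + 0.5 * secs


record("foo", "a/foo", 10.0)
record("foo", "b/foo", 90.0)  # same name, different checkout
print(fetch_times["foo"])  # the two checkouts pollute one average: 50.0
```

Keying by `(name, path)` instead keeps each checkout's history separate.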
3e3340d94f manifest: add support for revision in include
A revision attribute can now be added to a manifest include, so all
projects in an included manifest file can easily change their default
branch without modifying every project in that manifest file.

For example,
the main manifest.xml has an include node containing a revision attribute,
```
<include name="include.xml" revision="r1" />
```
and the include.xml has some projects,
```
<project path="project1_path" name="project1_name" revision="r2" />
<project path="project2_path" name="project2_name" />
```
With this change, the final manifest will have revision="r1" for project2.
```
<project name="project1_name" path="project1_path" revision="r2" />
<project name="project2_name" path="project2_path" revision="r1" />
```

Test: added unit tests to cover the inheritance

Change-Id: I4b8547a7198610ec3a3c6aeb2136e0c0f3557df0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/369714
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Shuchuan Zeng <zengshuchuan@allwinnertech.com>
Tested-by: Shuchuan Zeng <zengshuchuan@allwinnertech.com>
2023-05-05 03:40:28 +00:00
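The inheritance shown above can be sketched with a few lines of ElementTree — a standalone illustration of the rule, not repo's actual manifest_xml code:

```python
import xml.etree.ElementTree as ET

# Projects that set their own revision keep it; projects without one
# inherit the include's revision.
include_revision = "r1"
included = ET.fromstring(
    "<manifest>"
    '<project path="project1_path" name="project1_name" revision="r2"/>'
    '<project path="project2_path" name="project2_name"/>'
    "</manifest>"
)
for project in included.iter("project"):
    if project.get("revision") is None:
        project.set("revision", include_revision)

print([p.get("revision") for p in included.iter("project")])  # ['r2', 'r1']
```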
edcaa94ca8 sync: Display total elapsed fetch time
Give users an indication that `repo sync` isn't stuck if taking a long
time to fetch.

Bug: https://crbug.com/gerrit/11293
Change-Id: Iccdaec918f86c9cc2db5dc12f9e3eef7ad0bcbda
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/371414
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-05-02 20:51:46 +00:00
7ef5b465cd [SyncAnalysisState] Preserve synctime µs
By default, datetime.isoformat() uses a different format depending on
microseconds: if the value is equal to 0, microseconds are omitted, but
otherwise they are included.

Setting timespec = 'microseconds' ensures the format is the same
regardless of current time.

Change-Id: Icb1be31eb681247c7e46923cdeabb8f5469c20f0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/371694
Tested-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Joanna Wang <jojwang@google.com>
Commit-Queue: Josip Sokcevic <sokcevic@google.com>
2023-04-27 20:19:34 +00:00
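The isoformat() behavior at issue is easy to reproduce:

```python
from datetime import datetime

# Without a timespec, microseconds equal to 0 are silently dropped...
print(datetime(2023, 4, 27, 20, 19, 34).isoformat())
# 2023-04-27T20:19:34

# ...but timespec="microseconds" pins the format regardless of the value.
print(datetime(2023, 4, 27, 20, 19, 34).isoformat(timespec="microseconds"))
# 2023-04-27T20:19:34.000000
```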
e7e20f4686 tests: do not allow underscores in cli options
We use dashes in --long-options, not underscores, so add a test to
make sure people don't accidentally add them.

Change-Id: Iffbce474d22cf1f6c2042f7882f215875c8df3cf
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/369734
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-04-19 02:53:37 +00:00
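A minimal version of such a check — hypothetical, not the test that landed — walks an optparse parser and flags any long option containing an underscore:

```python
import optparse

parser = optparse.OptionParser()
parser.add_option("--dry-run", action="store_true")


def underscore_offenders(p):
    # Collect every registered long option string that contains "_".
    return [
        opt_str
        for opt in p.option_list
        for opt_str in opt._long_opts
        if "_" in opt_str
    ]


print(underscore_offenders(parser))  # [] -- all options use dashes
```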
99ebf627db upload: Add --no-follow-tags by default to git push
Gerrit does not accept pushing git tags to CLs. Hence, this change disables push.followTags for repo upload.

Fixed: b/155095555
Change-Id: I8d99eac29c0b4b375bdb857ed063914441026fa1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/367736
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-04-05 19:05:45 +00:00
57cb42861d run_tests: Check flake8
This also gets enforced in CQ.

Bug: b/267675342
Change-Id: I8ffcc5d583275072fd61ae65ae4214b36bfa59f3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/366799
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-31 04:25:53 +00:00
e74d9046ee Update abandon to support multiple branches
This change updates the `repo abandon` command to take multiple space-separated branch names as parameters.

Bug: https://crbug.com/gerrit/13354
Change-Id: I00ad7a79872c0e4161f8183843835f25cd515605
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/365524
Tested-by: Aravind Vasudevan <aravindvasudev@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Aravind Vasudevan <aravindvasudev@google.com>
2023-03-24 07:39:28 +00:00
21cc3a9d53 run_tests: Always check black and check it last
https://gerrit-review.googlesource.com/c/git-repo/+/363474/24..25 meant
to improve run_tests UX by letting users rerun it quickly, but it also
removed CQ enforcement of formatting since CQ passes args to run_tests.

Run pytest first so devs don't have to format first, and always check
black formatting so it's enforced in CQ.

Bug: b/267675342
Change-Id: I09544f110a6eb71b0c6c640787e10b04991a804e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/365727
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-24 01:38:20 +00:00
ea2e330e43 Format codebase with black and check formatting in CQ
Apply rules set by https://gerrit-review.googlesource.com/c/git-repo/+/362954/ across the codebase and fix any lingering errors caught
by flake8. Also check black formatting in run_tests (and CQ).

Bug: b/267675342
Change-Id: I972d77649dac351150dcfeb1cd1ad0ea2efc1956
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/363474
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-22 17:46:28 +00:00
1604cf255f Make black with line length 80 repo's code style
Provide a consistent formatting style and tox commands to lint and
format.

Bug: b/267675342
Change-Id: I33ddfe07af8473f4334c347d156246bfb66d4cfe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/362954
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2023-03-20 20:37:24 +00:00
75eb8ea935 docs: update Focal Python version
It ships with Python 3.8 by default, not 3.7.

Change-Id: I11401d1098b60285cfdccadb6a06bb33a5f95369
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/361634
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2023-03-02 22:26:00 +00:00
109 changed files with 24128 additions and 18447 deletions

.flake8

@@ -1,15 +1,16 @@
 [flake8]
-max-line-length=100
-ignore=
-    # E111: Indentation is not a multiple of four
-    E111,
-    # E114: Indentation is not a multiple of four (comment)
-    E114,
-    # E402: Module level import not at top of file
-    E402,
-    # E731: do not assign a lambda expression, use a def
-    E731,
-    # W503: Line break before binary operator
-    W503,
-    # W504: Line break after binary operator
-    W504
+max-line-length = 80
+per-file-ignores =
+    # E501: line too long
+    tests/test_git_superproject.py: E501
+extend-ignore =
+    # E203: Whitespace before ':'
+    # See https://github.com/PyCQA/pycodestyle/issues/373
+    E203,
+    # E402: Module level import not at top of file
+    E402,
+    # E731: do not assign a lambda expression, use a def
+    E731,
+exclude =
+    venv,
+    .tox,

(new file)

# GitHub actions workflow.
# https://docs.github.com/en/actions/learn-github-actions/workflow-syntax-for-github-actions
# https://github.com/superbrothers/close-pull-request
name: Close Pull Request

on:
  pull_request_target:
    types: [opened]

jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - uses: superbrothers/close-pull-request@v3
        with:
          comment: >
            Thanks for your contribution!
            Unfortunately, we don't use GitHub pull requests to manage code
            contributions to this repository.
            Instead, please see [README.md](../blob/HEAD/SUBMITTING_PATCHES.md)
            which provides full instructions on how to get involved.

@@ -13,8 +13,9 @@ jobs:
   strategy:
     fail-fast: false
     matrix:
-      os: [ubuntu-latest, macos-latest, windows-latest]
-      python-version: ['3.6', '3.7', '3.8', '3.9', '3.10']
+      # ubuntu-20.04 is the last version that supports python 3.6
+      os: [ubuntu-20.04, macos-latest, windows-latest]
+      python-version: ['3.6', '3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
   runs-on: ${{ matrix.os }}
   steps:

.isort.cfg (new file)

@ -0,0 +1,41 @@
# Copyright 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Config file for the isort python module.
# This is used to enforce import sorting standards.
#
# https://pycqa.github.io/isort/docs/configuration/options.html
[settings]
# Be compatible with `black` since it also matches what we want.
profile = black
line_length = 80
length_sort = false
force_single_line = true
lines_after_imports = 2
from_first = false
case_sensitive = false
force_sort_within_sections = true
order_by_type = false
# Ignore generated files.
extend_skip_glob = *_pb2.py
# Allow importing multiple classes on a single line from these modules.
# https://google.github.io/styleguide/pyguide#s2.2-imports
single_line_exclusions =
abc,
collections.abc,
typing,

@@ -8,7 +8,7 @@ that you can put anywhere in your path.
 * Homepage: <https://gerrit.googlesource.com/git-repo/>
 * Mailing list: [repo-discuss on Google Groups][repo-discuss]
-* Bug reports: <https://bugs.chromium.org/p/gerrit/issues/list?q=component:Applications%3Erepo>
+* Bug reports: <https://issues.gerritcodereview.com/issues?q=is:open%20componentid:1370071>
 * Source: <https://gerrit.googlesource.com/git-repo/>
 * Overview: <https://source.android.com/source/developing.html>
 * Docs: <https://source.android.com/source/using-repo.html>
@@ -50,6 +50,6 @@ $ chmod a+rx ~/.bin/repo
 ```
-[new-bug]: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
+[new-bug]: https://issues.gerritcodereview.com/issues/new?component=1370071
-[issue tracker]: https://bugs.chromium.org/p/gerrit/issues/list?q=component:Applications%3Erepo
+[issue tracker]: https://issues.gerritcodereview.com/issues?q=is:open%20componentid:1370071
 [repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss

@@ -1,19 +1,19 @@
+# Submitting Changes
+
+Here's a short overview of the process.
+
+* Make small logical changes.
+* [Provide a meaningful commit message][commit-message-style].
+* Make sure all code is under the Apache License, 2.0.
+* Publish your changes for review.
+* `git push origin HEAD:refs/for/main`
+* Make corrections if requested.
+* [Verify your changes on Gerrit.](#verify)
+* [Send to the commit queue for testing & merging.](#cq)
+
 [TOC]

-# Short Version
+## Long Version

-- Make small logical changes.
-- [Provide a meaningful commit message][commit-message-style].
-- Check for coding errors and style nits with flake8.
-- Make sure all code is under the Apache License, 2.0.
-- Publish your changes for review.
-- Make corrections if requested.
-- Verify your changes on gerrit so they can be submitted.
-  `git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/main`
-
-# Long Version

 I wanted a file describing how to submit patches for repo,
 so I started with the one found in the core Git distribution
@@ -39,17 +39,26 @@ If your description starts to get too long, that's a sign that you
 probably need to split up your commit to finer grained pieces.

-## Check for coding errors and style violations with flake8
+## Linting and formatting code

-Run `flake8` on changed modules:
+Lint any changes by running:
+```sh
+$ tox -e lint -- file.py
+```

-    flake8 file.py
+And format with:
+```sh
+$ tox -e format -- file.py
+```

-Note that repo generally follows [Google's Python Style Guide] rather than
-[PEP 8], with a couple of notable exceptions:
+Or format everything:
+```sh
+$ tox -e format
+```

-* Indentation is at 2 columns rather than 4
-* The maximum line length is 100 columns rather than 80
+Repo uses [black](https://black.readthedocs.io/) with line length of 80 as its
+formatter and flake8 as its linter. Repo also follows
+[Google's Python Style Guide].

 There should be no new errors or warnings introduced.
@@ -166,12 +175,16 @@ commit. If you make the requested changes you will need to amend your commit
 and push it to the review server again.

-## Verify your changes on gerrit
+## Verify your changes on Gerrit {#verify}

 After you receive a Code-Review+2 from the maintainer, select the Verified
-button on the gerrit page for the change. This verifies that you have tested
+button on the Gerrit page for the change. This verifies that you have tested
 your changes and notifies the maintainer that they are ready to be submitted.
-The maintainer will then submit your changes to the repository.
+
+## Merge your changes via the commit queue {#cq}
+
+Once a change is ready to be merged, select the Commit-Queue+2 setting on the
+Gerrit page for it. This tells the CI system to test the change, and if it
+passes all the checks, automatically merges it.

 [commit-message-style]: https://chris.beams.io/posts/git-commit/

color.py

@@ -17,196 +17,202 @@ import sys

import pager


COLORS = {
    None: -1,
    "normal": -1,
    "black": 0,
    "red": 1,
    "green": 2,
    "yellow": 3,
    "blue": 4,
    "magenta": 5,
    "cyan": 6,
    "white": 7,
}

ATTRS = {None: -1, "bold": 1, "dim": 2, "ul": 4, "blink": 5, "reverse": 7}

RESET = "\033[m"


def is_color(s):
    return s in COLORS


def is_attr(s):
    return s in ATTRS


def _Color(fg=None, bg=None, attr=None):
    fg = COLORS[fg]
    bg = COLORS[bg]
    attr = ATTRS[attr]

    if attr >= 0 or fg >= 0 or bg >= 0:
        need_sep = False
        code = "\033["

        if attr >= 0:
            code += chr(ord("0") + attr)
            need_sep = True

        if fg >= 0:
            if need_sep:
                code += ";"
            need_sep = True

            if fg < 8:
                code += "3%c" % (ord("0") + fg)
            else:
                code += "38;5;%d" % fg

        if bg >= 0:
            if need_sep:
                code += ";"

            if bg < 8:
                code += "4%c" % (ord("0") + bg)
            else:
                code += "48;5;%d" % bg
        code += "m"
    else:
        code = ""
    return code


DEFAULT = None


def SetDefaultColoring(state):
    """Set coloring behavior to |state|.

    This is useful for overriding config options via the command line.
    """
    if state is None:
        # Leave it alone -- return quick!
        return

    global DEFAULT
    state = state.lower()
    if state in ("auto",):
        DEFAULT = state
    elif state in ("always", "yes", "true", True):
        DEFAULT = "always"
    elif state in ("never", "no", "false", False):
        DEFAULT = "never"


class Coloring:
    def __init__(self, config, section_type):
        self._section = "color.%s" % section_type
        self._config = config
        self._out = sys.stdout

        on = DEFAULT
        if on is None:
            on = self._config.GetString(self._section)
            if on is None:
                on = self._config.GetString("color.ui")

        if on == "auto":
            if pager.active or os.isatty(1):
                self._on = True
            else:
                self._on = False
        elif on in ("true", "always"):
            self._on = True
        else:
            self._on = False

    def redirect(self, out):
        self._out = out

    @property
    def is_on(self):
        return self._on

    def write(self, fmt, *args):
        self._out.write(fmt % args)

    def flush(self):
        self._out.flush()

    def nl(self):
        self._out.write("\n")

    def printer(self, opt=None, fg=None, bg=None, attr=None):
        s = self
        c = self.colorer(opt, fg, bg, attr)

        def f(fmt, *args):
            s._out.write(c(fmt, *args))

        return f

    def nofmt_printer(self, opt=None, fg=None, bg=None, attr=None):
        s = self
        c = self.nofmt_colorer(opt, fg, bg, attr)

        def f(fmt):
            s._out.write(c(fmt))

        return f

    def colorer(self, opt=None, fg=None, bg=None, attr=None):
        if self._on:
            c = self._parse(opt, fg, bg, attr)

            def f(fmt, *args):
                output = fmt % args
                return "".join([c, output, RESET])

            return f
        else:

            def f(fmt, *args):
                return fmt % args

            return f

    def nofmt_colorer(self, opt=None, fg=None, bg=None, attr=None):
        if self._on:
            c = self._parse(opt, fg, bg, attr)

            def f(fmt):
                return "".join([c, fmt, RESET])

            return f
        else:

            def f(fmt):
                return fmt

            return f

    def _parse(self, opt, fg, bg, attr):
        if not opt:
            return _Color(fg, bg, attr)

        v = self._config.GetString(f"{self._section}.{opt}")
        if v is None:
            return _Color(fg, bg, attr)

        v = v.strip().lower()
        if v == "reset":
            return RESET
        elif v == "":
            return _Color(fg, bg, attr)

        have_fg = False
        for a in v.split(" "):
            if is_color(a):
                if have_fg:
                    bg = a
                else:
                    have_fg = True
                    fg = a
            elif is_attr(a):
                attr = a

        return _Color(fg, bg, attr)
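As a spot check of the escape sequences `_Color` builds, a bold red foreground resolves to `ESC[1;31m`. This recomputes that sequence with a trimmed-down standalone helper, not the module above:

```python
# Minimal re-derivation of the ANSI code _Color would emit for
# fg="red", attr="bold": SGR parameters 1 (bold) and 31 (red fg).
COLORS = {None: -1, "red": 1}
ATTRS = {None: -1, "bold": 1}


def color_code(fg=None, attr=None):
    parts = []
    if ATTRS[attr] >= 0:
        parts.append(str(ATTRS[attr]))
    if COLORS[fg] >= 0:
        parts.append("3%d" % COLORS[fg])
    return "\033[%sm" % ";".join(parts) if parts else ""


print(repr(color_code(fg="red", attr="bold")))  # '\x1b[1;31m'
```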

@@ -12,20 +12,21 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import contextlib
 import multiprocessing
-import os
 import optparse
+import os
 import re
-import sys

-from event_log import EventLog
-from error import NoSuchProjectError
 from error import InvalidProjectGroupsError
+from error import NoSuchProjectError
+from error import RepoExitError
+from event_log import EventLog

 import progress


 # Are we generating man-pages?
-GENERATE_MANPAGES = os.environ.get('_REPO_GENERATE_MANPAGES_') == ' indeed! '
+GENERATE_MANPAGES = os.environ.get("_REPO_GENERATE_MANPAGES_") == " indeed! "

 # Number of projects to submit to a single worker process at a time.
@@ -42,404 +43,511 @@ WORKER_BATCH_SIZE = 32

DEFAULT_LOCAL_JOBS = min(os.cpu_count(), 8)


class UsageError(RepoExitError):
    """Exception thrown with invalid command usage."""


class Command:
    """Base class for any command line action in repo."""

    # Singleton for all commands to track overall repo command execution and
    # provide event summary to callers. Only used by sync subcommand currently.
    #
    # NB: This is being replaced by git trace2 events. See git_trace2_event_log.
    event_log = EventLog()

    # Whether this command is a "common" one, i.e. whether the user would
    # commonly use it or it's a more uncommon command. This is used by the help
    # command to show short-vs-full summaries.
    COMMON = False

    # Whether this command supports running in parallel. If greater than 0,
    # it is the number of parallel jobs to default to.
    PARALLEL_JOBS = None

    # Whether this command supports Multi-manifest. If False, then main.py will
    # iterate over the manifests and invoke the command once per (sub)manifest.
    # This is only checked after calling ValidateOptions, so that partially
    # migrated subcommands can set it to False.
    MULTI_MANIFEST_SUPPORT = True

    # Shared data across parallel execution workers.
    _parallel_context = None

    @classmethod
    def get_parallel_context(cls):
        assert cls._parallel_context is not None
        return cls._parallel_context

    def __init__(
        self,
        repodir=None,
        client=None,
        manifest=None,
        git_event_log=None,
        outer_client=None,
        outer_manifest=None,
    ):
        self.repodir = repodir
        self.client = client
        self.outer_client = outer_client or client
        self.manifest = manifest
        self.git_event_log = git_event_log
        self.outer_manifest = outer_manifest

        # Cache for the OptionParser property.
        self._optparse = None

    def WantPager(self, _opt):
        return False

    def ReadEnvironmentOptions(self, opts):
        """Set options from environment variables."""

        env_options = self._RegisteredEnvironmentOptions()

        for env_key, opt_key in env_options.items():
            # Get the user-set option value if any
            opt_value = getattr(opts, opt_key)

            # If the value is set, it means the user has passed it as a command
            # line option, and we should use that. Otherwise we can try to set
            # it with the value from the corresponding environment variable.
            if opt_value is not None:
                continue

            env_value = os.environ.get(env_key)
            if env_value is not None:
                setattr(opts, opt_key, env_value)

        return opts

    @property
    def OptionParser(self):
        if self._optparse is None:
            try:
                me = "repo %s" % self.NAME
                usage = self.helpUsage.strip().replace("%prog", me)
            except AttributeError:
                usage = "repo %s" % self.NAME
            epilog = (
                "Run `repo help %s` to view the detailed manual." % self.NAME
            )
            self._optparse = optparse.OptionParser(usage=usage, epilog=epilog)
            self._CommonOptions(self._optparse)
            self._Options(self._optparse)
        return self._optparse

    def _CommonOptions(self, p, opt_v=True):
        """Initialize the option parser with common options.

        These will show up for *all* subcommands, so use sparingly.
        NB: Keep in sync with repo:InitParser().
        """
        g = p.add_option_group("Logging options")
        opts = ["-v"] if opt_v else []
        g.add_option(
            *opts,
            "--verbose",
            dest="output_mode",
            action="store_true",
            help="show all output",
        )
        g.add_option(
            "-q",
            "--quiet",
            dest="output_mode",
            action="store_false",
            help="only show errors",
        )

        if self.PARALLEL_JOBS is not None:
            default = "based on number of CPU cores"
            if not GENERATE_MANPAGES:
                # Only include active cpu count if we aren't generating man
                # pages.
                default = f"%default; {default}"
            p.add_option(
                "-j",
                "--jobs",
                type=int,
                default=self.PARALLEL_JOBS,
                help=f"number of jobs to run in parallel (default: {default})",
            )

        m = p.add_option_group("Multi-manifest options")
        m.add_option(
            "--outer-manifest",
            action="store_true",
            default=None,
            help="operate starting at the outermost manifest",
        )
        m.add_option(
            "--no-outer-manifest",
            dest="outer_manifest",
            action="store_false",
            help="do not operate on outer manifests",
        )
        m.add_option(
            "--this-manifest-only",
            action="store_true",
            default=None,
            help="only operate on this (sub)manifest",
        )
        m.add_option(
            "--no-this-manifest-only",
            "--all-manifests",
            dest="this_manifest_only",
            action="store_false",
            help="operate on this manifest and its submanifests",
        )

    def _Options(self, p):
        """Initialize the option parser with subcommand-specific options."""

    def _RegisteredEnvironmentOptions(self):
        """Get options that can be set from environment variables.

        Return a dictionary mapping environment variable name
        to option key name that it can override.

        Example: {'REPO_MY_OPTION': 'my_option'}

        Will allow the option with key value 'my_option' to be set
        from the value in the environment variable named 'REPO_MY_OPTION'.

        Note: This does not work properly for options that are explicitly
        set to None by the user, or options that are defined with a
        default value other than None.
        """
        return {}

    def Usage(self):
        """Display usage and terminate."""
        self.OptionParser.print_usage()
        raise UsageError()

    def CommonValidateOptions(self, opt, args):
        """Validate common options."""
        opt.quiet = opt.output_mode is False
        opt.verbose = opt.output_mode is True
        if opt.outer_manifest is None:
            # By default, treat multi-manifest instances as a single manifest
            # from the user's perspective.
            opt.outer_manifest = True

    def ValidateOptions(self, opt, args):
        """Validate the user options & arguments before executing.

        This is meant to help break the code up into logical steps. Some tips:
        * Use self.OptionParser.error to display CLI related errors.
        * Adjust opt member defaults as makes sense.
        * Adjust the args list, but do so inplace so the caller sees updates.
        * Try to avoid updating self state. Leave that to Execute.
        """

    def Execute(self, opt, args):
        """Perform the action, after option parsing is complete."""
        raise NotImplementedError
try:
# NB: Multiprocessing is heavy, so don't spin it up for one job.
if len(inputs) == 1 or jobs == 1:
return callback(None, output, (func(x) for x in inputs))
else:
with multiprocessing.Pool(jobs) as pool:
submit = pool.imap if ordered else pool.imap_unordered
return callback(pool, output, submit(func, inputs, chunksize=WORKER_BATCH_SIZE))
finally:
if isinstance(output, progress.Progress):
output.end()
def _ResetPathToProjectMap(self, projects): @classmethod
self._by_path = dict((p.worktree, p) for p in projects) @contextlib.contextmanager
def ParallelContext(cls):
"""Obtains the context, which is shared to ExecuteInParallel workers.
def _UpdatePathToProjectMap(self, project): Callers can store data in the context dict before invocation of
self._by_path[project.worktree] = project ExecuteInParallel. The dict will then be shared to child workers of
ExecuteInParallel.
def _GetProjectByPath(self, manifest, path): """
project = None assert cls._parallel_context is None
if os.path.exists(path): cls._parallel_context = {}
oldpath = None
while (path and
path != oldpath and
path != manifest.topdir):
try: try:
project = self._by_path[path] yield
break finally:
except KeyError: cls._parallel_context = None
oldpath = path
path = os.path.dirname(path) @classmethod
if not project and path == manifest.topdir: def _InitParallelWorker(cls, context, initializer):
cls._parallel_context = context
if initializer:
initializer()
@classmethod
def ExecuteInParallel(
cls,
jobs,
func,
inputs,
callback,
output=None,
ordered=False,
chunksize=WORKER_BATCH_SIZE,
initializer=None,
):
"""Helper for managing parallel execution boiler plate.
For subcommands that can easily split their work up.
Args:
jobs: How many parallel processes to use.
func: The function to apply to each of the |inputs|. Usually a
functools.partial for wrapping additional arguments. It will be
run in a separate process, so it must be pickalable, so nested
functions won't work. Methods on the subcommand Command class
should work.
inputs: The list of items to process. Must be a list.
callback: The function to pass the results to for processing. It
will be executed in the main thread and process the results of
|func| as they become available. Thus it may be a local nested
function. Its return value is passed back directly. It takes
three arguments:
- The processing pool (or None with one job).
- The |output| argument.
- An iterator for the results.
output: An output manager. May be progress.Progess or
color.Coloring.
ordered: Whether the jobs should be processed in order.
chunksize: The number of jobs processed in batch by parallel
workers.
initializer: Worker initializer.
Returns:
The |callback| function's results are returned.
"""
try: try:
project = self._by_path[path] # NB: Multiprocessing is heavy, so don't spin it up for one job.
except KeyError: if len(inputs) == 1 or jobs == 1:
pass return callback(None, output, (func(x) for x in inputs))
else: else:
try: with multiprocessing.Pool(
project = self._by_path[path] jobs,
except KeyError: initializer=cls._InitParallelWorker,
pass initargs=(cls._parallel_context, initializer),
return project ) as pool:
submit = pool.imap if ordered else pool.imap_unordered
return callback(
pool,
output,
submit(func, inputs, chunksize=chunksize),
)
finally:
if isinstance(output, progress.Progress):
output.end()
def GetProjects(self, args, manifest=None, groups='', missing_ok=False, def _ResetPathToProjectMap(self, projects):
submodules_ok=False, all_manifests=False): self._by_path = {p.worktree: p for p in projects}
"""A list of projects that match the arguments.
Args: def _UpdatePathToProjectMap(self, project):
args: a list of (case-insensitive) strings, projects to search for. self._by_path[project.worktree] = project
manifest: an XmlManifest, the manifest to use, or None for default.
groups: a string, the manifest groups in use.
missing_ok: a boolean, whether to allow missing projects.
submodules_ok: a boolean, whether to allow submodules.
all_manifests: a boolean, if True then all manifests and submanifests are
used. If False, then only the local (sub)manifest is used.
Returns: def _GetProjectByPath(self, manifest, path):
A list of matching Project instances. project = None
""" if os.path.exists(path):
if all_manifests: oldpath = None
if not manifest: while path and path != oldpath and path != manifest.topdir:
manifest = self.manifest.outer_client try:
all_projects_list = manifest.all_projects project = self._by_path[path]
else: break
if not manifest: except KeyError:
manifest = self.manifest oldpath = path
all_projects_list = manifest.projects path = os.path.dirname(path)
result = [] if not project and path == manifest.topdir:
try:
project = self._by_path[path]
except KeyError:
pass
else:
try:
project = self._by_path[path]
except KeyError:
pass
return project
if not groups: def GetProjects(
groups = manifest.GetGroupsStr() self,
groups = [x for x in re.split(r'[,\s]+', groups) if x] args,
manifest=None,
groups="",
missing_ok=False,
submodules_ok=False,
all_manifests=False,
):
"""A list of projects that match the arguments.
if not args: Args:
derived_projects = {} args: a list of (case-insensitive) strings, projects to search for.
for project in all_projects_list: manifest: an XmlManifest, the manifest to use, or None for default.
if submodules_ok or project.sync_s: groups: a string, the manifest groups in use.
derived_projects.update((p.name, p) missing_ok: a boolean, whether to allow missing projects.
for p in project.GetDerivedSubprojects()) submodules_ok: a boolean, whether to allow submodules.
all_projects_list.extend(derived_projects.values()) all_manifests: a boolean, if True then all manifests and
for project in all_projects_list: submanifests are used. If False, then only the local
if (missing_ok or project.Exists) and project.MatchesGroups(groups): (sub)manifest is used.
result.append(project)
else:
self._ResetPathToProjectMap(all_projects_list)
for arg in args: Returns:
# We have to filter by manifest groups in case the requested project is A list of matching Project instances.
# checked out multiple times or differently based on them. """
projects = [project if all_manifests:
if not manifest:
manifest = self.manifest.outer_client
all_projects_list = manifest.all_projects
else:
if not manifest:
manifest = self.manifest
all_projects_list = manifest.projects
result = []
if not groups:
groups = manifest.GetGroupsStr()
groups = [x for x in re.split(r"[,\s]+", groups) if x]
if not args:
derived_projects = {}
for project in all_projects_list:
if submodules_ok or project.sync_s:
derived_projects.update(
(p.name, p) for p in project.GetDerivedSubprojects()
)
all_projects_list.extend(derived_projects.values())
for project in all_projects_list:
if (missing_ok or project.Exists) and project.MatchesGroups(
groups
):
result.append(project)
else:
self._ResetPathToProjectMap(all_projects_list)
for arg in args:
# We have to filter by manifest groups in case the requested
# project is checked out multiple times or differently based on
# them.
projects = [
project
for project in manifest.GetProjectsWithName( for project in manifest.GetProjectsWithName(
arg, all_manifests=all_manifests) arg, all_manifests=all_manifests
if project.MatchesGroups(groups)] )
if project.MatchesGroups(groups)
]
if not projects: if not projects:
path = os.path.abspath(arg).replace('\\', '/') path = os.path.abspath(arg).replace("\\", "/")
tree = manifest tree = manifest
if all_manifests: if all_manifests:
# Look for the deepest matching submanifest. # Look for the deepest matching submanifest.
for tree in reversed(list(manifest.all_manifests)): for tree in reversed(list(manifest.all_manifests)):
if path.startswith(tree.topdir): if path.startswith(tree.topdir):
break break
project = self._GetProjectByPath(tree, path) project = self._GetProjectByPath(tree, path)
# If it's not a derived project, update path->project mapping and # If it's not a derived project, update path->project
# search again, as arg might actually point to a derived subproject. # mapping and search again, as arg might actually point to
if (project and not project.Derived and (submodules_ok or # a derived subproject.
project.sync_s)): if (
search_again = False project
for subproject in project.GetDerivedSubprojects(): and not project.Derived
self._UpdatePathToProjectMap(subproject) and (submodules_ok or project.sync_s)
search_again = True ):
if search_again: search_again = False
project = self._GetProjectByPath(manifest, path) or project for subproject in project.GetDerivedSubprojects():
self._UpdatePathToProjectMap(subproject)
search_again = True
if search_again:
project = (
self._GetProjectByPath(manifest, path)
or project
)
if project: if project:
projects = [project] projects = [project]
if not projects: if not projects:
raise NoSuchProjectError(arg) raise NoSuchProjectError(arg)
for project in projects: for project in projects:
if not missing_ok and not project.Exists: if not missing_ok and not project.Exists:
raise NoSuchProjectError('%s (%s)' % ( raise NoSuchProjectError(
arg, project.RelPath(local=not all_manifests))) "%s (%s)"
if not project.MatchesGroups(groups): % (arg, project.RelPath(local=not all_manifests))
raise InvalidProjectGroupsError(arg) )
if not project.MatchesGroups(groups):
raise InvalidProjectGroupsError(arg)
result.extend(projects) result.extend(projects)
def _getpath(x): def _getpath(x):
return x.relpath return x.relpath
result.sort(key=_getpath)
return result
def FindProjects(self, args, inverse=False, all_manifests=False): result.sort(key=_getpath)
"""Find projects from command line arguments. return result
Args: def FindProjects(self, args, inverse=False, all_manifests=False):
args: a list of (case-insensitive) strings, projects to search for. """Find projects from command line arguments.
inverse: a boolean, if True, then projects not matching any |args| are
returned.
all_manifests: a boolean, if True then all manifests and submanifests are
used. If False, then only the local (sub)manifest is used.
"""
result = []
patterns = [re.compile(r'%s' % a, re.IGNORECASE) for a in args]
for project in self.GetProjects('', all_manifests=all_manifests):
paths = [project.name, project.RelPath(local=not all_manifests)]
for pattern in patterns:
match = any(pattern.search(x) for x in paths)
if not inverse and match:
result.append(project)
break
if inverse and match:
break
else:
if inverse:
result.append(project)
result.sort(key=lambda project: (project.manifest.path_prefix,
project.relpath))
return result
def ManifestList(self, opt): Args:
"""Yields all of the manifests to traverse. args: a list of (case-insensitive) strings, projects to search for.
inverse: a boolean, if True, then projects not matching any |args|
are returned.
all_manifests: a boolean, if True then all manifests and
submanifests are used. If False, then only the local
(sub)manifest is used.
"""
result = []
patterns = [re.compile(r"%s" % a, re.IGNORECASE) for a in args]
for project in self.GetProjects("", all_manifests=all_manifests):
paths = [project.name, project.RelPath(local=not all_manifests)]
for pattern in patterns:
match = any(pattern.search(x) for x in paths)
if not inverse and match:
result.append(project)
break
if inverse and match:
break
else:
if inverse:
result.append(project)
result.sort(
key=lambda project: (project.manifest.path_prefix, project.relpath)
)
return result
Args: def ManifestList(self, opt):
opt: The command options. """Yields all of the manifests to traverse.
"""
top = self.outer_manifest Args:
if not opt.outer_manifest or opt.this_manifest_only: opt: The command options.
top = self.manifest """
yield top top = self.outer_manifest
if not opt.this_manifest_only: if not opt.outer_manifest or opt.this_manifest_only:
for child in top.all_children: top = self.manifest
yield child yield top
if not opt.this_manifest_only:
yield from top.all_children
class InteractiveCommand(Command): class InteractiveCommand(Command):
"""Command which requires user interaction on the tty and """Command which requires user interaction on the tty and must not run
must not run within a pager, even if the user asks to. within a pager, even if the user asks to.
""" """
def WantPager(self, _opt): def WantPager(self, _opt):
return False return False
class PagedCommand(Command): class PagedCommand(Command):
"""Command which defaults to output in a pager, as its """Command which defaults to output in a pager, as its display tends to be
display tends to be larger than one screen full. larger than one screen full.
""" """
def WantPager(self, _opt): def WantPager(self, _opt):
return True return True
class MirrorSafeCommand(object): class MirrorSafeCommand:
"""Command permits itself to run within a mirror, """Command permits itself to run within a mirror, and does not require a
and does not require a working directory. working directory.
""" """
class GitcAvailableCommand(object): class GitcClientCommand:
"""Command that requires GITC to be available, but does """Command that requires the local client to be a GITC client."""
not require the local client to be a GITC client.
"""
class GitcClientCommand(object):
"""Command that requires the local client to be a GITC
client.
"""
constraints.txt (new file)

@ -0,0 +1 @@
black<24
@ -42,8 +42,12 @@ For example, if you want to change the manifest branch, you can simply run
change the git URL/branch that this tracks, re-run `repo init` with the new
settings.

*   `.repo_fetchtimes.json`: Used by `repo sync` to record fetch times when
    syncing the various projects.
*   `.repo_localsyncstate.json`: Used by `repo sync` to detect and warn on
    partial tree syncs. Partial syncs are allowed by `repo` itself, but are
    unsupported by many projects where `repo` is used.

### Manifests
@ -107,10 +107,13 @@ following DTD:
<!ATTLIST extend-project remote CDATA #IMPLIED>
<!ATTLIST extend-project dest-branch CDATA #IMPLIED>
<!ATTLIST extend-project upstream CDATA #IMPLIED>
<!ATTLIST extend-project base-rev CDATA #IMPLIED>

<!ELEMENT remove-project EMPTY>
<!ATTLIST remove-project name CDATA #IMPLIED>
<!ATTLIST remove-project path CDATA #IMPLIED>
<!ATTLIST remove-project optional CDATA #IMPLIED>
<!ATTLIST remove-project base-rev CDATA #IMPLIED>

<!ELEMENT repo-hooks EMPTY>
<!ATTLIST repo-hooks in-project CDATA #REQUIRED>
@ -125,8 +128,9 @@ following DTD:
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>

<!ELEMENT include EMPTY>
<!ATTLIST include name CDATA #REQUIRED>
<!ATTLIST include groups CDATA #IMPLIED>
<!ATTLIST include revision CDATA #IMPLIED>
]>
```
@ -431,6 +435,14 @@ project. Same syntax as the corresponding element of `project`.
Attribute `upstream`: If specified, overrides the upstream of the original
project. Same syntax as the corresponding element of `project`.

Attribute `base-rev`: If specified, adds a check against the revision
to be extended. Manifest parse will fail and give a list of mismatch extends
if the revisions being extended have changed since base-rev was set.
Intended for use with layered manifests using hash revisions to prevent
patch branches hiding newer upstream revisions. Also compares named refs
like branches or tags but is misleading if branches are used as base-rev.
Same syntax as the corresponding element of `project`.

### Element annotation

Zero or more annotation elements may be specified as children of a
@ -472,7 +484,7 @@ of the repo client.

### Element remove-project

Deletes a project from the internal manifest table, possibly
allowing a subsequent project element in the same manifest file to
replace the project with a different source.
@ -480,9 +492,28 @@ This element is mostly useful in a local manifest file, where
the user can remove a project, and possibly replace it with their
own definition.

The project `name` or project `path` can be used to specify the remove target,
meaning one of them is required. If only name is specified, all
projects with that name are removed.

If both name and path are specified, only projects with the same name and
path are removed, meaning projects with the same name but in other
locations are kept.

If only path is specified, a matching project is removed regardless of its
name. Logic otherwise behaves like both are specified.

Attribute `optional`: Set to true to ignore remove-project elements with no
matching `project` element.

Attribute `base-rev`: If specified, adds a check against the revision
to be removed. Manifest parse will fail and give a list of mismatch removes
if the revisions being removed have changed since base-rev was set.
Intended for use with layered manifests using hash revisions to prevent
patch branches hiding newer upstream revisions. Also compares named refs
like branches or tags but is misleading if branches are used as base-rev.
Same syntax as the corresponding element of `project`.

### Element repo-hooks

NB: See the [practical documentation](./repo-hooks.md) for using repo hooks.
@ -553,6 +584,9 @@ in the included manifest belong. This appends and recurses, meaning
all projects in included manifests carry all parent include groups.
Same syntax as the corresponding element of `project`.

Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`)
default to which all projects in the included manifest belong.

## Local Manifests {#local-manifests}

Additional remotes and projects may be added through local manifest
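The name/path matching rules for `remove-project` described above can be sketched as a small predicate (a hypothetical helper for illustration, not repo's implementation):

```python
def removes_project(project_name, project_path, name=None, path=None):
    """Sketch of remove-project matching: name and/or path is required."""
    if name is None and path is None:
        raise ValueError("remove-project needs a name and/or a path")
    if name is not None and project_name != name:
        return False
    if path is not None and project_path != path:
        return False
    return True


# Only name: every checkout of that project is removed.
assert removes_project("platform/art", "art", name="platform/art")
assert removes_project("platform/art", "art-copy", name="platform/art")
# Name and path: same-named projects at other paths are kept.
assert not removes_project(
    "platform/art", "art-copy", name="platform/art", path="art"
)
# Only path: the name is ignored.
assert removes_project("platform/art", "art", path="art")
```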
@ -1,47 +1,92 @@
# Supported Python Versions

This documents the current supported Python versions, and tries to provide
guidance for when we decide to drop support for older versions.

## Summary

* Python 3.6 (released Dec 2016) is required starting with repo-2.0.
* Older versions of Python (e.g. v2.7) may use old releases via the repo-1.x
  branch, but no support is provided.

## repo hooks

Projects that use [repo hooks] run on independent schedules.
Since it's not possible to detect what version of Python the hooks were written
or tested against, we always import & exec them with the active Python version.

If the user's Python is too new for the [repo hooks], then it is up to the hooks
maintainer to update.

## Repo launcher

The [repo launcher] is an independent script that can support older versions of
Python without holding back the rest of the codebase.

If it detects the current version of Python is too old, it will try to reexec
via a newer version of Python via standard `pythonX.Y` interpreter names.

However, this is provided as a nicety when it is not onerous, and there is no
official support for older versions of Python than the rest of the codebase.

If your default python interpreters are too old to run the launcher even though
you have newer versions installed, your choices are:

* Modify the [repo launcher]'s shebang to suit your environment.
* Download an older version of the [repo launcher] and don't upgrade it.
  Be aware that we do not guarantee old repo launchers will work with current
  versions of repo. Bug reports using old launchers will not be accepted.

## When to drop support

So far, Python 3.6 has provided most of the interesting features that we want
(e.g. typing & f-strings), and there haven't been features in newer versions
that are critical to us.

That said, let's assume we need functionality that only exists in Python 3.7.
How do we decide when it's acceptable to drop Python 3.6?

1. Review the [Project References](./release-process.md#project-references) to
   see what major distros are using the previous version of Python, and when
   they go EOL. Generally we care about Ubuntu LTS & current/previous Debian
   stable versions.
   * If they're all EOL already, then go for it, drop support.
   * If they aren't EOL, start a thread on [repo-discuss] to see how the user
     base feels about the proposal.
1. Update the "soft" versions in the codebase. This will start warning users
   that the older version is deprecated.
   * Update [repo](/repo) if the launcher needs updating.
     This only helps with people who download newer launchers.
   * Update [main.py](/main.py) for the main codebase.
     This warns for everyone regardless of [repo launcher] version.
   * Update [requirements.json](/requirements.json).
     This allows [repo launcher] to display warnings/errors without having
     to execute the new codebase. This helps in case of syntax or module
     changes where older versions won't even be able to import the new code.
1. After some grace period (ideally at least 2 quarters after the first release
   with the updated soft requirements), update the "hard" versions, and then
   start using the new functionality.

## Python 2.7 & 3.0-3.5

> **There is no support for these versions.**
> **Do not file bugs if you are using old Python versions.**
> **Any such reports will be marked invalid and ignored.**
> **Upgrade your distro and/or runtime instead.**

Fetch an old version of the [repo launcher]:

```sh
$ curl https://storage.googleapis.com/git-repo-downloads/repo-2.32 > ~/.bin/repo-2.32
$ chmod a+rx ~/.bin/repo-2.32
```

Then initialize an old version of repo:

```sh
$ repo-2.32 init --repo-rev=repo-1 ...
```

[repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss
[repo hooks]: ./repo-hooks.md
[repo launcher]: ../repo
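The soft/hard split in the steps above can be sketched as a startup check: a version below the hard minimum aborts, while one below the soft minimum only warns. The constants here are illustrative only; repo's real checks live in the launcher, main.py, and requirements.json.

```python
import sys

# Illustrative minimums, not repo's current values.
MIN_PYTHON_HARD = (3, 6)
MIN_PYTHON_SOFT = (3, 8)


def check_python_version():
    ver = sys.version_info[:2]
    if ver < MIN_PYTHON_HARD:
        # Hard requirement: refuse to run at all.
        raise SystemExit(
            "repo: error: Python %d.%d+ is required" % MIN_PYTHON_HARD
        )
    if ver < MIN_PYTHON_SOFT:
        # Soft requirement: warn now, break later.
        print(
            "repo: warning: Python %d.%d is deprecated; "
            "please upgrade to %d.%d+" % (ver + MIN_PYTHON_SOFT),
            file=sys.stderr,
        )


check_python_version()
```

Keeping the two thresholds separate is what makes the grace period possible: the soft bump ships quarters before the hard bump enforces it.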
@ -96,6 +96,9 @@ If that tag is valid, then repo will warn and use that commit instead.

If that tag cannot be verified, it gives up and forces the user to resolve.

If env variable `REPO_SKIP_SELF_UPDATE` is defined, this will
bypass the self update algorithm.

### Force an update

The `repo selfupdate` command can be used to force an immediate update.
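A minimal sketch of that environment gate, assuming (as the text says) that merely defining the variable is enough; this is an illustration and may differ from repo's exact handling:

```python
import os


def should_skip_self_update():
    # Presence of the variable is the signal; its value is not inspected here.
    return "REPO_SKIP_SELF_UPDATE" in os.environ


os.environ["REPO_SKIP_SELF_UPDATE"] = "1"
print(should_skip_self_update())  # True
```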
@ -202,7 +205,7 @@ still support them.
Things in italics are things we used to care about but probably don't anymore. Things in italics are things we used to care about but probably don't anymore.
| Date | EOL | [Git][rel-g] | [Python][rel-p] | [SSH][rel-o] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python | SSH | | Date | EOL | [Git][rel-g] | [Python][rel-p] | [SSH][rel-o] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python | SSH |
|:--------:|:------------:|:------------:|:---------------:|:------------:|-----------------------------------|-----|--------|-----| |:--------:|:------------:|:------------:|:---------------:|:------------:|-----------------------------------|:---:|:------:|:---:|
| Apr 2008 | | | | 5.0 | | Apr 2008 | | | | 5.0 |
| Jun 2008 | | | | 5.1 | | Jun 2008 | | | | 5.1 |
| Oct 2008 | *Oct 2013* | | 2.6.0 | | *10.04 Lucid* - 10.10 Maverick / *Squeeze* | | Oct 2008 | *Oct 2013* | | 2.6.0 | | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
@ -241,7 +244,7 @@ Things in italics are things we used to care about but probably don't anymore.
| Feb 2014 | *Dec 2014* | **1.9.0** | | | *14.04 Trusty* | | Feb 2014 | *Dec 2014* | **1.9.0** | | | *14.04 Trusty* |
| Mar 2014 | *Mar 2019* | | *3.4.0* | | *14.04 Trusty* - 15.10 Wily / *Jessie* | | Mar 2014 | *Mar 2019* | | *3.4.0* | | *14.04 Trusty* - 15.10 Wily / *Jessie* |
| Mar 2014 | | | | 6.6 | *14.04 Trusty* - 14.10 Utopic | | Mar 2014 | | | | 6.6 | *14.04 Trusty* - 14.10 Utopic |
| Apr 2014 | *Apr 2022* | | | | *14.04 Trusty* | 1.9.1 | 2.7.5 3.4.0 | 6.6 | | Apr 2014 | *Apr 2024* | | | | *14.04 Trusty* | 1.9.1 | 2.7.5 3.4.0 | 6.6 |
| May 2014 | *Dec 2014* | 2.0.0 | | May 2014 | *Dec 2014* | 2.0.0 |
| Aug 2014 | *Dec 2014* | *2.1.0* | | | 14.10 Utopic - 15.04 Vivid / *Jessie* | | Aug 2014 | *Dec 2014* | *2.1.0* | | | 14.10 Utopic - 15.04 Vivid / *Jessie* |
| Oct 2014 | | | | 6.7 | 15.04 Vivid | | Oct 2014 | | | | 6.7 | 15.04 Vivid |
@ -262,7 +265,7 @@ Things in italics are things we used to care about but probably don't anymore.
| Jan 2016 | *Jul 2017* | *2.7.0* | | | *16.04 Xenial* | | Jan 2016 | *Jul 2017* | *2.7.0* | | | *16.04 Xenial* |
| Feb 2016 | | | | 7.2 | *16.04 Xenial* | | Feb 2016 | | | | 7.2 | *16.04 Xenial* |
| Mar 2016 | *Jul 2017* | 2.8.0 | | Mar 2016 | *Jul 2017* | 2.8.0 |
| Apr 2016 | *Apr 2024* | | | | *16.04 Xenial* | 2.7.4 | 2.7.11 3.5.1 | 7.2 | | Apr 2016 | *Apr 2026* | | | | *16.04 Xenial* | 2.7.4 | 2.7.11 3.5.1 | 7.2 |
| Jun 2016 | *Jul 2017* | 2.9.0 | | | 16.10 Yakkety | | Jun 2016 | *Jul 2017* | 2.9.0 | | | 16.10 Yakkety |
| Jul 2016 | | | | 7.3 | 16.10 Yakkety | | Jul 2016 | | | | 7.3 | 16.10 Yakkety |
| Sep 2016 | *Sep 2017* | 2.10.0 | | Sep 2016 | *Sep 2017* | 2.10.0 |
@ -284,7 +287,7 @@ Things in italics are things we used to care about but probably don't anymore.
| Apr 2018 | | | | 7.7 | 18.10 Cosmic | | Apr 2018 | | | | 7.7 | 18.10 Cosmic |
| Apr 2018 | **Apr 2028** | | | | **18.04 Bionic** | 2.17.0 | 2.7.15 3.6.5 | 7.6 | | Apr 2018 | **Apr 2028** | | | | **18.04 Bionic** | 2.17.0 | 2.7.15 3.6.5 | 7.6 |
| Jun 2018 | *Mar 2021* | 2.18.0 | | Jun 2018 | *Mar 2021* | 2.18.0 |
| Jun 2018 | **Jun 2023** | | 3.7.0 | | 19.04 Disco - **20.04 Focal** / **Buster** | | Jun 2018 | **Jun 2023** | | 3.7.0 | | 19.04 Disco - **Buster** |
| Aug 2018 | | | | 7.8 | | Aug 2018 | | | | 7.8 |
| Sep 2018 | *Mar 2021* | 2.19.0 | | | 18.10 Cosmic | | Sep 2018 | *Mar 2021* | 2.19.0 | | | 18.10 Cosmic |
| Oct 2018 | | | | 7.9 | 19.04 Disco / **Buster** | | Oct 2018 | | | | 7.9 | 19.04 Disco / **Buster** |
@ -312,14 +315,33 @@ Things in italics are things we used to care about but probably don't anymore.
| Oct 2020 | | | | | 20.10 Groovy | 2.27.0 | 2.7.18 3.8.6 | 8.3 | | Oct 2020 | | | | | 20.10 Groovy | 2.27.0 | 2.7.18 3.8.6 | 8.3 |
| Oct 2020 | **Oct 2025** | | 3.9.0 | | 21.04 Hirsute / **Bullseye** | | Oct 2020 | **Oct 2025** | | 3.9.0 | | 21.04 Hirsute / **Bullseye** |
| Dec 2020 | *Mar 2021* | 2.30.0 | | | 21.04 Hirsute / **Bullseye** | | Dec 2020 | *Mar 2021* | 2.30.0 | | | 21.04 Hirsute / **Bullseye** |
| Mar 2021 | | 2.31.0 | | Mar 2021 | | 2.31.0 | | 8.5 |
| Mar 2021 | | | | 8.5 |
| Apr 2021 | | | | 8.6 | | Apr 2021 | | | | 8.6 |
| Apr 2021 | *Jan 2022* | | | | 21.04 Hirsute | 2.30.2 | 2.7.18 3.9.4 | 8.4 | | Apr 2021 | *Jan 2022* | | | | 21.04 Hirsute | 2.30.2 | 2.7.18 3.9.4 | 8.4 |
| Jun 2021 | | 2.32.0 | | Jun 2021 | | 2.32.0 |
| Aug 2021 | | 2.33.0 | | Aug 2021 | | 2.33.0 | | 8.7 |
| Aug 2021 | | | | 8.7 |
| Aug 2021 | **Aug 2026** | | | | **Debian 11 Bullseye** | 2.30.2 | 2.7.18 3.9.2 | 8.4 | | Aug 2021 | **Aug 2026** | | | | **Debian 11 Bullseye** | 2.30.2 | 2.7.18 3.9.2 | 8.4 |
| Sep 2021 | | | | 8.8 |
| Oct 2021 | | 2.34.0 | 3.10.0 | | **22.04 Jammy** |
| Jan 2022 | | 2.35.0 |
| Feb 2022 | | | | 8.9 | **22.04 Jammy** |
| Apr 2022 | | 2.36.0 | | 9.0 |
| Apr 2022 | **Apr 2032** | | | | **22.04 Jammy** | 2.34.1 | 2.7.18 3.10.6 | 8.9 |
| Jun 2022 | | 2.37.0 |
| Oct 2022 | | 2.38.0 | | 9.1 |
| Oct 2022 | | | 3.11.0 | | **Bookworm** |
| Dec 2022 | | 2.39.0 | | | **Bookworm** |
| Feb 2023 | | | | 9.2 | **Bookworm** |
| Mar 2023 | | 2.40.0 | | 9.3 |
| Jun 2023 | | 2.41.0 |
| Jun 2023 | **Jun 2028** | | | | **Debian 12 Bookworm** | 2.39.2 | 3.11.2 | 9.2 |
| Aug 2023 | | 2.42.0 | | 9.4 |
| Oct 2023 | | | 3.12.0 | 9.5 |
| Nov 2023 | | 2.43.0 |
| Dec 2023 | | | | 9.6 |
| Feb 2024 | | 2.44.0 |
| Mar 2024 | | | | 9.7 |
| Oct 2024 | | | 3.13.0 |
| **Date** | **EOL** | **[Git][rel-g]** | **[Python][rel-p]** | **[SSH][rel-o]** | **[Ubuntu][rel-u] / [Debian][rel-d]** | **Git** | **Python** | **SSH** |
@@ -328,7 +350,7 @@ Things in italics are things we used to care about but probably don't anymore.
[rel-g]: https://en.wikipedia.org/wiki/Git#Releases
[rel-o]: https://www.openssh.com/releasenotes.html
[rel-p]: https://en.wikipedia.org/wiki/History_of_Python#Table_of_versions
[rel-u]: https://wiki.ubuntu.com/Releases
[example announcement]: https://groups.google.com/d/topic/repo-discuss/UGBNismWo1M/discussion
[repo-discuss@googlegroups.com]: https://groups.google.com/forum/#!forum/repo-discuss
[go/repo-release]: https://goto.google.com/repo-release

editor.py

@@ -14,102 +14,106 @@
import os
import re
import subprocess
import sys
import tempfile

from error import EditorError
import platform_utils


class Editor:
    """Manages the user's preferred text editor."""

    _editor = None
    globalConfig = None

    @classmethod
    def _GetEditor(cls):
        if cls._editor is None:
            cls._editor = cls._SelectEditor()
        return cls._editor

    @classmethod
    def _SelectEditor(cls):
        e = os.getenv("GIT_EDITOR")
        if e:
            return e

        if cls.globalConfig:
            e = cls.globalConfig.GetString("core.editor")
            if e:
                return e

        e = os.getenv("VISUAL")
        if e:
            return e

        e = os.getenv("EDITOR")
        if e:
            return e

        if os.getenv("TERM") == "dumb":
            print(
                """No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
Tried to fall back to vi but terminal is dumb.  Please configure at
least one of these before using this command.""",  # noqa: E501
                file=sys.stderr,
            )
            sys.exit(1)

        return "vi"

    @classmethod
    def EditString(cls, data):
        """Opens an editor to edit the given content.

        Args:
            data: The text to edit.

        Returns:
            New value of edited text.

        Raises:
            EditorError: The editor failed to run.
        """
        editor = cls._GetEditor()
        if editor == ":":
            return data

        fd, path = tempfile.mkstemp()
        try:
            os.write(fd, data.encode("utf-8"))
            os.close(fd)
            fd = None

            if platform_utils.isWindows():
                # Split on spaces, respecting quoted strings
                import shlex

                args = shlex.split(editor)
                shell = False
            elif re.compile("^.*[$ \t'].*$").match(editor):
                args = [editor + ' "$@"', "sh"]
                shell = True
            else:
                args = [editor]
                shell = False
            args.append(path)

            try:
                rc = subprocess.Popen(args, shell=shell).wait()
            except OSError as e:
                raise EditorError(f"editor failed, {str(e)}: {editor} {path}")
            if rc != 0:
                raise EditorError(
                    "editor failed with exit status %d: %s %s"
                    % (rc, editor, path)
                )

            with open(path, mode="rb") as fd2:
                return fd2.read().decode("utf-8")
        finally:
            if fd:
                os.close(fd)
            platform_utils.remove(path)
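The editor lookup above checks GIT_EDITOR, then git's core.editor, then VISUAL and EDITOR, before falling back to vi. That precedence can be sketched standalone (`select_editor` is our illustrative name, not part of repo, and the environment is passed in explicitly so the sketch is testable):

```python
def select_editor(environ, core_editor=None):
    """Pick an editor the way Editor._SelectEditor does.

    Checks GIT_EDITOR, then the git config value core.editor, then
    VISUAL, then EDITOR, and finally falls back to vi.
    """
    candidates = (
        environ.get("GIT_EDITOR"),
        core_editor,
        environ.get("VISUAL"),
        environ.get("EDITOR"),
    )
    for candidate in candidates:
        if candidate:
            return candidate
    return "vi"
```

For example, `select_editor({"EDITOR": "nano"})` yields "nano", while an empty environment falls through to "vi".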

error.py

@@ -12,124 +12,182 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import List


class BaseRepoError(Exception):
    """All repo specific exceptions derive from BaseRepoError."""


class RepoError(BaseRepoError):
    """Exceptions thrown inside repo that can be handled."""

    def __init__(self, *args, project: str = None) -> None:
        super().__init__(*args)
        self.project = project


class RepoExitError(BaseRepoError):
    """Exception thrown that results in termination of the repo program.

    - Should only be handled in main.py
    """

    def __init__(
        self,
        *args,
        exit_code: int = 1,
        aggregate_errors: List[Exception] = None,
        **kwargs,
    ) -> None:
        super().__init__(*args, **kwargs)
        self.exit_code = exit_code
        self.aggregate_errors = aggregate_errors


class RepoUnhandledExceptionError(RepoExitError):
    """Exception that maintains error as reason for program exit."""

    def __init__(
        self,
        error: BaseException,
        **kwargs,
    ) -> None:
        super().__init__(error, **kwargs)
        self.error = error


class SilentRepoExitError(RepoExitError):
    """RepoExitError that should not include CLI logging of issue/issues."""


class ManifestParseError(RepoExitError):
    """Failed to parse the manifest file."""


class ManifestInvalidRevisionError(ManifestParseError):
    """The revision value in a project is incorrect."""


class ManifestInvalidPathError(ManifestParseError):
    """A path used in <copyfile> or <linkfile> is incorrect."""


class NoManifestException(RepoExitError):
    """The required manifest does not exist."""

    def __init__(self, path, reason, **kwargs):
        super().__init__(path, reason, **kwargs)
        self.path = path
        self.reason = reason

    def __str__(self):
        return self.reason


class EditorError(RepoError):
    """Unspecified error from the user's text editor."""

    def __init__(self, reason, **kwargs):
        super().__init__(reason, **kwargs)
        self.reason = reason

    def __str__(self):
        return self.reason


class GitError(RepoError):
    """Unspecified git related error."""

    def __init__(self, message, command_args=None, **kwargs):
        super().__init__(message, **kwargs)
        self.message = message
        self.command_args = command_args

    def __str__(self):
        return self.message


class GitAuthError(RepoExitError):
    """Cannot talk to remote due to auth issue."""


class GitcUnsupportedError(RepoExitError):
    """Gitc no longer supported."""


class UploadError(RepoError):
    """A bundle upload to Gerrit did not succeed."""

    def __init__(self, reason, **kwargs):
        super().__init__(reason, **kwargs)
        self.reason = reason

    def __str__(self):
        return self.reason


class DownloadError(RepoExitError):
    """Cannot download a repository."""

    def __init__(self, reason, **kwargs):
        super().__init__(reason, **kwargs)
        self.reason = reason

    def __str__(self):
        return self.reason


class InvalidArgumentsError(RepoExitError):
    """Invalid command arguments."""


class SyncError(RepoExitError):
    """Cannot sync repo."""


class UpdateManifestError(RepoExitError):
    """Cannot update manifest."""


class NoSuchProjectError(RepoExitError):
    """A specified project does not exist in the work tree."""

    def __init__(self, name=None, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def __str__(self):
        if self.name is None:
            return "in current directory"
        return self.name


class InvalidProjectGroupsError(RepoExitError):
    """A specified project is not suitable for the specified groups"""

    def __init__(self, name=None, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def __str__(self):
        if self.name is None:
            return "in current directory"
        return self.name


class RepoChangedException(BaseRepoError):
    """Thrown if 'repo sync' results in repo updating its internal
    repo or manifest repositories.  In this special case we must
    use exec to re-execute repo with the new code and manifest.
    """

    def __init__(self, extra_args=None):
        super().__init__(extra_args)
        self.extra_args = extra_args or []


class HookError(RepoError):
    """Thrown if a 'repo-hook' could not be run.

    The common case is that the file wasn't present when we tried to run it.
    """


@@ -15,161 +15,178 @@
import json
import multiprocessing


TASK_COMMAND = "command"
TASK_SYNC_NETWORK = "sync-network"
TASK_SYNC_LOCAL = "sync-local"


class EventLog:
    """Event log that records events that occurred during a repo invocation.

    Events are written to the log as consecutive JSON entries, one per line.
    Each entry contains the following keys:
    - id: A ('RepoOp', ID) tuple, suitable for storing in a datastore.
          The ID is only unique for the invocation of the repo command.
    - name: Name of the object being operated upon.
    - task_name: The task that was performed.
    - start: Timestamp of when the operation started.
    - finish: Timestamp of when the operation finished.
    - success: Boolean indicating if the operation was successful.
    - try_count: A counter indicating the try count of this task.

    Optionally:
    - parent: A ('RepoOp', ID) tuple indicating the parent event for nested
              events.

    Valid task_names include:
    - command: The invocation of a subcommand.
    - sync-network: The network component of a sync command.
    - sync-local: The local component of a sync command.

    Specific tasks may include additional informational properties.
    """

    def __init__(self):
        """Initializes the event log."""
        self._log = []
        self._parent = None

    def Add(
        self,
        name,
        task_name,
        start,
        finish=None,
        success=None,
        try_count=1,
        kind="RepoOp",
    ):
        """Add an event to the log.

        Args:
            name: Name of the object being operated upon.
            task_name: A sub-task that was performed for name.
            start: Timestamp of when the operation started.
            finish: Timestamp of when the operation finished.
            success: Boolean indicating if the operation was successful.
            try_count: A counter indicating the try count of this task.
            kind: The kind of the object for the unique identifier.

        Returns:
            A dictionary of the event added to the log.
        """
        event = {
            "id": (kind, _NextEventId()),
            "name": name,
            "task_name": task_name,
            "start_time": start,
            "try": try_count,
        }

        if self._parent:
            event["parent"] = self._parent["id"]

        if success is not None or finish is not None:
            self.FinishEvent(event, finish, success)

        self._log.append(event)
        return event

    def AddSync(self, project, task_name, start, finish, success):
        """Add an event to the log for a sync command.

        Args:
            project: Project being synced.
            task_name: A sub-task that was performed for name.
                One of (TASK_SYNC_NETWORK, TASK_SYNC_LOCAL)
            start: Timestamp of when the operation started.
            finish: Timestamp of when the operation finished.
            success: Boolean indicating if the operation was successful.

        Returns:
            A dictionary of the event added to the log.
        """
        event = self.Add(project.relpath, task_name, start, finish, success)
        if event is not None:
            event["project"] = project.name
            if project.revisionExpr:
                event["revision"] = project.revisionExpr
            if project.remote.url:
                event["project_url"] = project.remote.url
            if project.remote.fetchUrl:
                event["remote_url"] = project.remote.fetchUrl
            try:
                event["git_hash"] = project.GetCommitRevisionId()
            except Exception:
                pass
        return event

    def GetStatusString(self, success):
        """Converts a boolean success to a status string.

        Args:
            success: Boolean indicating if the operation was successful.

        Returns:
            status string.
        """
        return "pass" if success else "fail"

    def FinishEvent(self, event, finish, success):
        """Finishes an incomplete event.

        Args:
            event: An event that has been added to the log.
            finish: Timestamp of when the operation finished.
            success: Boolean indicating if the operation was successful.

        Returns:
            A dictionary of the event added to the log.
        """
        event["status"] = self.GetStatusString(success)
        event["finish_time"] = finish
        return event

    def SetParent(self, event):
        """Set a parent event for all new entities.

        Args:
            event: The event to use as a parent.
        """
        self._parent = event

    def Write(self, filename):
        """Writes the log out to a file.

        Args:
            filename: The file to write the log to.
        """
        with open(filename, "w+") as f:
            for e in self._log:
                json.dump(e, f, sort_keys=True)
                f.write("\n")


# An integer id that is unique across this invocation of the program, to be set
# by the first Add event. We can't set it here since it results in leaked
# resources (see: https://issues.gerritcodereview.com/353656374).
_EVENT_ID = None


def _NextEventId():
    """Helper function for grabbing the next unique id.

    Returns:
        A unique, to this invocation of the program, integer id.
    """
    global _EVENT_ID
    if _EVENT_ID is None:
        # There is a small chance of a race condition - two parallel processes
        # setting up _EVENT_ID. However, we expect TASK_COMMAND to happen
        # before mp kicks in.
        _EVENT_ID = multiprocessing.Value("i", 1)
    with _EVENT_ID.get_lock():
        val = _EVENT_ID.value
        _EVENT_ID.value += 1
        return val
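Per the docstring, the log is line-delimited JSON written with sort_keys=True. A toy writer mirroring EventLog.Write shows the resulting shape; note that json serializes the ('RepoOp', id) tuple as a list:

```python
import io
import json


def write_events(events, fh):
    """Serialize events one JSON object per line, as EventLog.Write does."""
    for event in events:
        json.dump(event, fh, sort_keys=True)
        fh.write("\n")


# One finished event, with the keys described in the EventLog docstring.
events = [
    {
        "id": ("RepoOp", 1),
        "name": "manifest",
        "task_name": "command",
        "start_time": 0.0,
        "try": 1,
        "status": "pass",
        "finish_time": 0.5,
    }
]
buf = io.StringIO()
write_events(events, buf)
lines = buf.getvalue().splitlines()
```

Each line round-trips through `json.loads` independently, which is what makes the log safe to append to and to parse with simple line-oriented tools.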


@@ -19,27 +19,39 @@ import sys
from urllib.parse import urlparse
from urllib.request import urlopen

from error import RepoExitError


class FetchFileError(RepoExitError):
    """Exit error when fetch_file fails."""


def fetch_file(url, verbose=False):
    """Fetch a file from the specified source using the appropriate protocol.

    Returns:
        The contents of the file as bytes.
    """
    scheme = urlparse(url).scheme
    if scheme == "gs":
        cmd = ["gsutil", "cat", url]
        errors = []
        try:
            result = subprocess.run(
                cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, check=True
            )
            if result.stderr and verbose:
                print(
                    'warning: non-fatal error running "gsutil": %s'
                    % result.stderr,
                    file=sys.stderr,
                )
            return result.stdout
        except subprocess.CalledProcessError as e:
            errors.append(e)
            print(
                'fatal: error running "gsutil": %s' % e.stderr, file=sys.stderr
            )
            raise FetchFileError(aggregate_errors=errors)
    with urlopen(url) as f:
        return f.read()
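fetch_file dispatches on the URL scheme: gs:// goes through gsutil, everything else through urlopen. The non-gsutil path can be exercised offline with a data: URL, which urllib's default opener handles; the sketch below is ours and omits the gsutil branch's subprocess call and error aggregation:

```python
from urllib.parse import urlparse
from urllib.request import urlopen


def fetch(url):
    """Return the contents of url as bytes, dispatching on the scheme."""
    scheme = urlparse(url).scheme
    if scheme == "gs":
        # The real code shells out here:
        # subprocess.run(["gsutil", "cat", url], ..., check=True)
        raise NotImplementedError("gsutil not available in this sketch")
    with urlopen(url) as f:
        return f.read()


# data: URLs need no network, so the urlopen branch is testable offline.
payload = fetch("data:text/plain;base64,aGVsbG8=")
```

Raising instead of silently falling through on gs:// keeps the sketch honest about which branch it actually implements.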


@@ -13,149 +13,202 @@
# limitations under the License.
import functools
import json
import os
import re
import subprocess
import sys
from typing import Any, Optional

from error import GitError
from error import RepoExitError
from git_refs import HEAD
from git_trace2_event_log_base import BaseEventLog
import platform_utils
from repo_logging import RepoLogger
from repo_trace import IsTrace
from repo_trace import REPO_TRACE
from repo_trace import Trace
from wrapper import Wrapper


GIT = "git"
GIT_DIR = "GIT_DIR"

LAST_GITDIR = None
LAST_CWD = None
DEFAULT_GIT_FAIL_MESSAGE = "git command failure"
ERROR_EVENT_LOGGING_PREFIX = "RepoGitCommandError"
# Common line length limit
GIT_ERROR_STDOUT_LINES = 1
GIT_ERROR_STDERR_LINES = 10
INVALID_GIT_EXIT_CODE = 126

logger = RepoLogger(__file__)


class _GitCall:
    @functools.lru_cache(maxsize=None)
    def version_tuple(self):
        ret = Wrapper().ParseGitVersion()
        if ret is None:
            msg = "fatal: unable to detect git version"
            logger.error(msg)
            raise GitRequireError(msg)
        return ret

    def __getattr__(self, name):
        name = name.replace("_", "-")

        def fun(*cmdv):
            command = [name]
            command.extend(cmdv)
            return GitCommand(None, command, add_event_log=False).Wait() == 0

        return fun


git = _GitCall()


def RepoSourceVersion():
    """Return the version of the repo.git tree."""
    ver = getattr(RepoSourceVersion, "version", None)

    # We avoid GitCommand so we don't run into circular deps -- GitCommand
    # needs to initialize version info we provide.
    if ver is None:
        env = GitCommand._GetBasicEnv()

        proj = os.path.dirname(os.path.abspath(__file__))
        env[GIT_DIR] = os.path.join(proj, ".git")
        result = subprocess.run(
            [GIT, "describe", HEAD],
            stdout=subprocess.PIPE,
            stderr=subprocess.DEVNULL,
            encoding="utf-8",
            env=env,
            check=False,
        )
        if result.returncode == 0:
            ver = result.stdout.strip()
            if ver.startswith("v"):
                ver = ver[1:]
        else:
            ver = "unknown"
        setattr(RepoSourceVersion, "version", ver)

    return ver


@functools.lru_cache(maxsize=None)
def GetEventTargetPath():
    """Get the 'trace2.eventtarget' path from git configuration.

    Returns:
        path: git config's 'trace2.eventtarget' path if it exists, or None
    """
    path = None
    cmd = ["config", "--get", "trace2.eventtarget"]
    # TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
    # system git config variables.
    p = GitCommand(
        None,
        cmd,
        capture_stdout=True,
        capture_stderr=True,
        bare=True,
        add_event_log=False,
    )
    retval = p.Wait()
    if retval == 0:
        # Strip trailing carriage-return in path.
        path = p.stdout.rstrip("\n")
        if path == "":
            return None
    elif retval != 1:
        # `git config --get` is documented to produce an exit status of `1`
        # if the requested variable is not present in the configuration.
        # Report any other return value as an error.
        logger.error(
            "repo: error: 'git config --get' call failed with return code: "
            "%r, stderr: %r",
            retval,
            p.stderr,
        )
    return path


class UserAgent:
    """Manage User-Agent settings when talking to external services

    We follow the style as documented here:
    https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/User-Agent
    """

    _os = None
    _repo_ua = None
    _git_ua = None

    @property
    def os(self):
        """The operating system name."""
        if self._os is None:
            os_name = sys.platform
            if os_name.lower().startswith("linux"):
                os_name = "Linux"
            elif os_name == "win32":
                os_name = "Win32"
            elif os_name == "cygwin":
                os_name = "Cygwin"
            elif os_name == "darwin":
                os_name = "Darwin"
            self._os = os_name

        return self._os

    @property
    def repo(self):
        """The UA when connecting directly from repo."""
        if self._repo_ua is None:
            py_version = sys.version_info
            self._repo_ua = "git-repo/%s (%s) git/%s Python/%d.%d.%d" % (
                RepoSourceVersion(),
                self.os,
                git.version_tuple().full,
                py_version.major,
                py_version.minor,
                py_version.micro,
            )

        return self._repo_ua

    @property
    def git(self):
        """The UA when running git."""
        if self._git_ua is None:
            self._git_ua = (
                f"git/{git.version_tuple().full} ({self.os}) "
                f"git-repo/{RepoSourceVersion()}"
            )

        return self._git_ua


user_agent = UserAgent()


def git_require(min_version, fail=False, msg=""):
    git_version = git.version_tuple()
    if min_version <= git_version:
        return True
    if fail:
        need = ".".join(map(str, min_version))
        if msg:
            msg = " for " + msg
        error_msg = f"fatal: git {need} or later required{msg}"
        logger.error(error_msg)
        raise GitRequireError(error_msg)
    return False


def _build_env(
@@ -164,175 +217,434 @@ def _build_env(
    disable_editor: Optional[bool] = False,
    ssh_proxy: Optional[Any] = None,
    gitdir: Optional[str] = None,
    objdir: Optional[str] = None,
):
    """Constructs an env dict for command execution."""

    assert _kwargs_only == (), "_build_env only accepts keyword arguments."

    env = GitCommand._GetBasicEnv()

    if disable_editor:
        env["GIT_EDITOR"] = ":"
    if ssh_proxy:
        env["REPO_SSH_SOCK"] = ssh_proxy.sock()
        env["GIT_SSH"] = ssh_proxy.proxy
        env["GIT_SSH_VARIANT"] = "ssh"
    if "http_proxy" in env and "darwin" == sys.platform:
        s = f"'http.proxy={env['http_proxy']}'"
        p = env.get("GIT_CONFIG_PARAMETERS")
        if p is not None:
            s = p + " " + s
        env["GIT_CONFIG_PARAMETERS"] = s
    if "GIT_ALLOW_PROTOCOL" not in env:
        env[
            "GIT_ALLOW_PROTOCOL"
        ] = "file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc"
    env["GIT_HTTP_USER_AGENT"] = user_agent.git

    if objdir:
        # Set to the place we want to save the objects.
        env["GIT_OBJECT_DIRECTORY"] = objdir

        alt_objects = os.path.join(gitdir, "objects") if gitdir else None
        if alt_objects and os.path.realpath(alt_objects) != os.path.realpath(
            objdir
        ):
            # Allow git to search the original place in case of local or
            # unique refs that git will attempt to resolve even if we aren't
            # fetching them.
            env["GIT_ALTERNATE_OBJECT_DIRECTORIES"] = alt_objects
    if bare and gitdir is not None:
        env[GIT_DIR] = gitdir

    return env
class GitCommand(object):
"""Wrapper around a single git invocation."""
def __init__(self,
project,
cmdv,
bare=False,
input=None,
capture_stdout=False,
capture_stderr=False,
merge_output=False,
disable_editor=False,
ssh_proxy=None,
cwd=None,
gitdir=None,
objdir=None):
if project:
if not cwd:
cwd = project.worktree
if not gitdir:
gitdir = project.gitdir
# Git on Windows wants its paths only using / for reliability.
if platform_utils.isWindows():
if objdir:
objdir = objdir.replace('\\', '/')
if gitdir:
gitdir = gitdir.replace('\\', '/')
env = _build_env(
disable_editor=disable_editor,
ssh_proxy=ssh_proxy,
objdir=objdir,
gitdir=gitdir,
bare=bare,
)
command = [GIT]
if bare:
cwd = None
command.append(cmdv[0])
# Need to use the --progress flag for fetch/clone so output will be
# displayed as by default git only does progress output if stderr is a TTY.
if sys.stderr.isatty() and cmdv[0] in ('fetch', 'clone'):
if '--progress' not in cmdv and '--quiet' not in cmdv:
command.append('--progress')
command.extend(cmdv[1:])
stdin = subprocess.PIPE if input else None
stdout = subprocess.PIPE if capture_stdout else None
stderr = (subprocess.STDOUT if merge_output else
(subprocess.PIPE if capture_stderr else None))
dbg = ''
if IsTrace():
global LAST_CWD
global LAST_GITDIR
if cwd and LAST_CWD != cwd:
if LAST_GITDIR or LAST_CWD:
dbg += '\n'
dbg += ': cd %s\n' % cwd
LAST_CWD = cwd
if GIT_DIR in env and LAST_GITDIR != env[GIT_DIR]:
if LAST_GITDIR or LAST_CWD:
dbg += '\n'
dbg += ': export GIT_DIR=%s\n' % env[GIT_DIR]
LAST_GITDIR = env[GIT_DIR]
if 'GIT_OBJECT_DIRECTORY' in env:
dbg += ': export GIT_OBJECT_DIRECTORY=%s\n' % env['GIT_OBJECT_DIRECTORY']
if 'GIT_ALTERNATE_OBJECT_DIRECTORIES' in env:
dbg += ': export GIT_ALTERNATE_OBJECT_DIRECTORIES=%s\n' % (
env['GIT_ALTERNATE_OBJECT_DIRECTORIES'])
dbg += ': '
dbg += ' '.join(command)
if stdin == subprocess.PIPE:
dbg += ' 0<|'
if stdout == subprocess.PIPE:
dbg += ' 1>|'
if stderr == subprocess.PIPE:
dbg += ' 2>|'
elif stderr == subprocess.STDOUT:
dbg += ' 2>&1'
with Trace('git command %s %s with debug: %s', LAST_GITDIR, command, dbg):
try:
p = subprocess.Popen(command,
cwd=cwd,
env=env,
encoding='utf-8',
errors='backslashreplace',
stdin=stdin,
stdout=stdout,
stderr=stderr)
except Exception as e:
raise GitError('%s: %s' % (command[1], e))
if ssh_proxy:
ssh_proxy.add_client(p)
self.process = p
try:
self.stdout, self.stderr = p.communicate(input=input)
finally:
if ssh_proxy:
ssh_proxy.remove_client(p)
self.rc = p.wait()
@staticmethod
def _GetBasicEnv():
"""Return a basic env for running git under.
This is guaranteed to be side-effect free.
"""
env = os.environ.copy()
for key in (REPO_TRACE,
GIT_DIR,
'GIT_ALTERNATE_OBJECT_DIRECTORIES',
'GIT_OBJECT_DIRECTORY',
'GIT_WORK_TREE',
'GIT_GRAFT_FILE',
'GIT_INDEX_FILE'):
env.pop(key, None)
return env return env
def Wait(self):
return self.rc class GitCommand:
"""Wrapper around a single git invocation."""
def __init__(
self,
project,
cmdv,
bare=False,
input=None,
capture_stdout=False,
capture_stderr=False,
merge_output=False,
disable_editor=False,
ssh_proxy=None,
cwd=None,
gitdir=None,
objdir=None,
verify_command=False,
add_event_log=True,
log_as_error=True,
):
if project:
if not cwd:
cwd = project.worktree
if not gitdir:
gitdir = project.gitdir
self.project = project
self.cmdv = cmdv
self.verify_command = verify_command
self.stdout, self.stderr = None, None
# Git on Windows wants its paths only using / for reliability.
if platform_utils.isWindows():
if objdir:
objdir = objdir.replace("\\", "/")
if gitdir:
gitdir = gitdir.replace("\\", "/")
env = _build_env(
disable_editor=disable_editor,
ssh_proxy=ssh_proxy,
objdir=objdir,
gitdir=gitdir,
bare=bare,
)
command = [GIT]
if bare:
cwd = None
command_name = cmdv[0]
command.append(command_name)
if command_name in ("fetch", "clone"):
env["GIT_TERMINAL_PROMPT"] = "0"
# Need to use the --progress flag for fetch/clone so output will be
# displayed as by default git only does progress output if stderr is
# a TTY.
if sys.stderr.isatty():
if "--progress" not in cmdv and "--quiet" not in cmdv:
command.append("--progress")
command.extend(cmdv[1:])
event_log = (
BaseEventLog(env=env, add_init_count=True)
if add_event_log
else None
)
try:
self._RunCommand(
command,
env,
capture_stdout=capture_stdout,
capture_stderr=capture_stderr,
merge_output=merge_output,
ssh_proxy=ssh_proxy,
cwd=cwd,
input=input,
)
self.VerifyCommand()
except GitCommandError as e:
if event_log is not None:
error_info = json.dumps(
{
"ErrorType": type(e).__name__,
"Project": e.project,
"CommandName": command_name,
"Message": str(e),
"ReturnCode": str(e.git_rc)
if e.git_rc is not None
else None,
"IsError": log_as_error,
}
)
event_log.ErrorEvent(
f"{ERROR_EVENT_LOGGING_PREFIX}:{error_info}"
)
event_log.Write(GetEventTargetPath())
if isinstance(e, GitPopenCommandError):
raise
def _RunCommand(
self,
command,
env,
capture_stdout=False,
capture_stderr=False,
merge_output=False,
ssh_proxy=None,
cwd=None,
input=None,
):
# Set subprocess.PIPE for streams that need to be captured.
stdin = subprocess.PIPE if input else None
stdout = subprocess.PIPE if capture_stdout else None
stderr = (
subprocess.STDOUT
if merge_output
else (subprocess.PIPE if capture_stderr else None)
)
# tee_stderr acts like a tee command for stderr, in that, it captures
# stderr from the subprocess and streams it back to sys.stderr, while
# keeping a copy in-memory.
# This allows us to store stderr logs from the subprocess into
# GitCommandError.
# Certain git operations, such as `git push`, writes diagnostic logs,
# such as, progress bar for pushing, into stderr. To ensure we don't
# break git's UX, we need to write to sys.stderr as we read from the
# subprocess. Setting encoding or errors makes subprocess return
# io.TextIOWrapper, which is line buffered. To avoid line-buffering
# while tee-ing stderr, we unset these kwargs. See GitCommand._Tee
# for tee-ing between the streams.
# We tee stderr iff the caller doesn't want to capture any stream to
# not disrupt the existing flow.
# See go/tee-repo-stderr for more context.
tee_stderr = False
kwargs = {"encoding": "utf-8", "errors": "backslashreplace"}
if not (stdin or stdout or stderr):
tee_stderr = True
# stderr will be written back to sys.stderr even though it is
# piped here.
stderr = subprocess.PIPE
kwargs = {}
dbg = ""
if IsTrace():
global LAST_CWD
global LAST_GITDIR
if cwd and LAST_CWD != cwd:
if LAST_GITDIR or LAST_CWD:
dbg += "\n"
dbg += ": cd %s\n" % cwd
LAST_CWD = cwd
if GIT_DIR in env and LAST_GITDIR != env[GIT_DIR]:
if LAST_GITDIR or LAST_CWD:
dbg += "\n"
dbg += ": export GIT_DIR=%s\n" % env[GIT_DIR]
LAST_GITDIR = env[GIT_DIR]
if "GIT_OBJECT_DIRECTORY" in env:
dbg += (
": export GIT_OBJECT_DIRECTORY=%s\n"
% env["GIT_OBJECT_DIRECTORY"]
)
if "GIT_ALTERNATE_OBJECT_DIRECTORIES" in env:
dbg += ": export GIT_ALTERNATE_OBJECT_DIRECTORIES=%s\n" % (
env["GIT_ALTERNATE_OBJECT_DIRECTORIES"]
)
dbg += ": "
dbg += " ".join(command)
if stdin == subprocess.PIPE:
dbg += " 0<|"
if stdout == subprocess.PIPE:
dbg += " 1>|"
if stderr == subprocess.PIPE:
dbg += " 2>|"
elif stderr == subprocess.STDOUT:
dbg += " 2>&1"
with Trace(
"git command %s %s with debug: %s", LAST_GITDIR, command, dbg
):
try:
p = subprocess.Popen(
command,
cwd=cwd,
env=env,
stdin=stdin,
stdout=stdout,
stderr=stderr,
**kwargs,
)
except Exception as e:
raise GitPopenCommandError(
message=f"{command[1]}: {e}",
project=self.project.name if self.project else None,
command_args=self.cmdv,
)
if ssh_proxy:
ssh_proxy.add_client(p)
self.process = p
try:
if tee_stderr:
# tee_stderr streams stderr to sys.stderr while capturing
# a copy within self.stderr. tee_stderr is only enabled
# when the caller wants to pipe no stream.
self.stderr = self._Tee(p.stderr, sys.stderr)
else:
self.stdout, self.stderr = p.communicate(input=input)
finally:
if ssh_proxy:
ssh_proxy.remove_client(p)
self.rc = p.wait()
@staticmethod
def _Tee(in_stream, out_stream):
"""Writes text from in_stream to out_stream while recording in buffer.
Args:
in_stream: I/O stream to be read from.
out_stream: I/O stream to write to.
Returns:
A str containing everything read from the in_stream.
"""
buffer = ""
read_size = 1024 if sys.version_info < (3, 7) else -1
chunk = in_stream.read1(read_size)
while chunk:
# Convert to str.
if not hasattr(chunk, "encode"):
chunk = chunk.decode("utf-8", "backslashreplace")
buffer += chunk
out_stream.write(chunk)
out_stream.flush()
chunk = in_stream.read1(read_size)
return buffer
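The tee behavior described above can be exercised in isolation. This is a minimal standalone sketch of the same pattern (the `tee` function here is illustrative, not part of repo): it drains a stream in chunks via `read1`, decodes byte chunks, mirrors each chunk to the output stream, and returns the accumulated text.

```python
import io


def tee(in_stream, out_stream):
    """Copy in_stream to out_stream while keeping the text in memory."""
    buffer = ""
    chunk = in_stream.read1(-1)  # read1 avoids blocking for a full buffer
    while chunk:
        # Decode bytes chunks; str chunks pass through unchanged.
        if not hasattr(chunk, "encode"):
            chunk = chunk.decode("utf-8", "backslashreplace")
        buffer += chunk
        out_stream.write(chunk)
        out_stream.flush()
        chunk = in_stream.read1(-1)
    return buffer


src = io.BytesIO(b"progress: 50%\nprogress: 100%\n")
dst = io.StringIO()
captured = tee(src, dst)
assert captured == dst.getvalue() == "progress: 50%\nprogress: 100%\n"
```

Using `read1` rather than `read` is the key design choice: it returns as soon as any data is available instead of waiting for EOF, so progress output reaches the destination stream promptly.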
    @staticmethod
    def _GetBasicEnv():
        """Return a basic env for running git under.

        This is guaranteed to be side-effect free.
        """
        env = os.environ.copy()
        for key in (
            REPO_TRACE,
            GIT_DIR,
            "GIT_ALTERNATE_OBJECT_DIRECTORIES",
            "GIT_OBJECT_DIRECTORY",
            "GIT_WORK_TREE",
            "GIT_GRAFT_FILE",
            "GIT_INDEX_FILE",
        ):
            env.pop(key, None)
        return env

    def VerifyCommand(self):
        if self.rc == 0:
            return None
        stdout = (
            "\n".join(self.stdout.split("\n")[:GIT_ERROR_STDOUT_LINES])
            if self.stdout
            else None
        )
        stderr = (
            "\n".join(self.stderr.split("\n")[:GIT_ERROR_STDERR_LINES])
            if self.stderr
            else None
        )
        project = self.project.name if self.project else None
        raise GitCommandError(
            project=project,
            command_args=self.cmdv,
            git_rc=self.rc,
            git_stdout=stdout,
            git_stderr=stderr,
        )

    def Wait(self):
        if self.verify_command:
            self.VerifyCommand()
        return self.rc


class GitRequireError(RepoExitError):
    """Error raised when git version is unavailable or invalid."""

    def __init__(self, message, exit_code: int = INVALID_GIT_EXIT_CODE):
        super().__init__(message, exit_code=exit_code)


class GitCommandError(GitError):
    """
    Error raised from a failed git command.

    Note that GitError can refer to any Git related error (e.g. branch not
    specified for project.py 'UploadForReview'), while GitCommandError is
    raised exclusively from non-zero exit codes returned from git commands.
    """

    # Tuples with error formats and suggestions for those errors.
    _ERROR_TO_SUGGESTION = [
        (
            re.compile("couldn't find remote ref .*"),
            "Check if the provided ref exists in the remote.",
        ),
        (
            re.compile("unable to access '.*': .*"),
            (
                "Please make sure you have the correct access rights and the "
                "repository exists."
            ),
        ),
        (
            re.compile("'.*' does not appear to be a git repository"),
            "Are you running this repo command outside of a repo workspace?",
        ),
        (
            re.compile("not a git repository"),
            "Are you running this repo command outside of a repo workspace?",
        ),
    ]

    def __init__(
        self,
        message: str = DEFAULT_GIT_FAIL_MESSAGE,
        git_rc: int = None,
        git_stdout: str = None,
        git_stderr: str = None,
        **kwargs,
    ):
        super().__init__(
            message,
            **kwargs,
        )
        self.git_rc = git_rc
        self.git_stdout = git_stdout
        self.git_stderr = git_stderr

    @property
    @functools.lru_cache(maxsize=None)
    def suggestion(self):
        """Returns helpful next steps for the given stderr."""
        if not self.git_stderr:
            return self.git_stderr

        for err, suggestion in self._ERROR_TO_SUGGESTION:
            if err.search(self.git_stderr):
                return suggestion

        return None

    def __str__(self):
        args = "[]" if not self.command_args else " ".join(self.command_args)
        error_type = type(self).__name__
        string = f"{error_type}: '{args}' on {self.project} failed"

        if self.message != DEFAULT_GIT_FAIL_MESSAGE:
            string += f": {self.message}"

        if self.git_stdout:
            string += f"\nstdout: {self.git_stdout}"

        if self.git_stderr:
            string += f"\nstderr: {self.git_stderr}"

        if self.suggestion:
            string += f"\nsuggestion: {self.suggestion}"

        return string


class GitPopenCommandError(GitError):
    """
    Error raised when subprocess.Popen fails for a GitCommand
    """

File diff suppressed because it is too large
@@ -13,152 +13,153 @@
# limitations under the License.

import os

import platform_utils
from repo_trace import Trace


HEAD = "HEAD"
R_CHANGES = "refs/changes/"
R_HEADS = "refs/heads/"
R_TAGS = "refs/tags/"
R_PUB = "refs/published/"
R_WORKTREE = "refs/worktree/"
R_WORKTREE_M = R_WORKTREE + "m/"
R_M = "refs/remotes/m/"


class GitRefs:
    def __init__(self, gitdir):
        self._gitdir = gitdir
        self._phyref = None
        self._symref = None
        self._mtime = {}

    @property
    def all(self):
        self._EnsureLoaded()
        return self._phyref

    def get(self, name):
        try:
            return self.all[name]
        except KeyError:
            return ""

    def deleted(self, name):
        if self._phyref is not None:
            if name in self._phyref:
                del self._phyref[name]
            if name in self._symref:
                del self._symref[name]
            if name in self._mtime:
                del self._mtime[name]

    def symref(self, name):
        try:
            self._EnsureLoaded()
            return self._symref[name]
        except KeyError:
            return ""

    def _EnsureLoaded(self):
        if self._phyref is None or self._NeedUpdate():
            self._LoadAll()

    def _NeedUpdate(self):
        with Trace(": scan refs %s", self._gitdir):
            for name, mtime in self._mtime.items():
                try:
                    if mtime != os.path.getmtime(
                        os.path.join(self._gitdir, name)
                    ):
                        return True
                except OSError:
                    return True
            return False

    def _LoadAll(self):
        with Trace(": load refs %s", self._gitdir):
            self._phyref = {}
            self._symref = {}
            self._mtime = {}

            self._ReadPackedRefs()
            self._ReadLoose("refs/")
            self._ReadLoose1(os.path.join(self._gitdir, HEAD), HEAD)

            scan = self._symref
            attempts = 0
            while scan and attempts < 5:
                scan_next = {}
                for name, dest in scan.items():
                    if dest in self._phyref:
                        self._phyref[name] = self._phyref[dest]
                    else:
                        scan_next[name] = dest
                scan = scan_next
                attempts += 1

    def _ReadPackedRefs(self):
        path = os.path.join(self._gitdir, "packed-refs")
        try:
            fd = open(path)
            mtime = os.path.getmtime(path)
        except OSError:
            return
        try:
            for line in fd:
                line = str(line)
                if line[0] == "#":
                    continue
                if line[0] == "^":
                    continue

                line = line[:-1]
                p = line.split(" ")
                ref_id = p[0]
                name = p[1]

                self._phyref[name] = ref_id
        finally:
            fd.close()
        self._mtime["packed-refs"] = mtime

    def _ReadLoose(self, prefix):
        base = os.path.join(self._gitdir, prefix)
        for name in platform_utils.listdir(base):
            p = os.path.join(base, name)
            # We don't implement the full ref validation algorithm, just the
            # simple rules that would show up in local filesystems.
            # https://git-scm.com/docs/git-check-ref-format
            if name.startswith(".") or name.endswith(".lock"):
                pass
            elif platform_utils.isdir(p):
                self._mtime[prefix] = os.path.getmtime(base)
                self._ReadLoose(prefix + name + "/")
            else:
                self._ReadLoose1(p, prefix + name)

    def _ReadLoose1(self, path, name):
        try:
            with open(path) as fd:
                mtime = os.path.getmtime(path)
                ref_id = fd.readline()
        except (OSError, UnicodeError):
            return

        try:
            ref_id = ref_id.decode()
        except AttributeError:
            pass
        if not ref_id:
            return

        ref_id = ref_id[:-1]

        if ref_id.startswith("ref: "):
            self._symref[name] = ref_id[5:]
        else:
            self._phyref[name] = ref_id
            self._mtime[name] = mtime
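`_ReadPackedRefs` above boils down to: skip `#` header lines and `^` peeled-tag annotations, then split each remaining line into `<id> <refname>`. A standalone sketch of that parse (the `parse_packed_refs` helper and the sample ids are illustrative):

```python
def parse_packed_refs(text):
    """Map ref names to ids from packed-refs file content."""
    phyref = {}
    for line in text.splitlines():
        if not line or line[0] == "#" or line[0] == "^":
            continue  # header line or peeled-tag annotation
        ref_id, name = line.split(" ", 1)
        phyref[name] = ref_id
    return phyref


sample = (
    "# pack-refs with: peeled fully-peeled sorted\n"
    "d3adbeefd3adbeefd3adbeefd3adbeefd3adbeef refs/heads/main\n"
    "^cafebabecafebabecafebabecafebabecafebabe\n"
)
assert parse_packed_refs(sample) == {
    "refs/heads/main": "d3adbeefd3adbeefd3adbeefd3adbeefd3adbeef"
}
```

The `^` lines record the commit a packed annotated tag peels to; since they carry no ref name of their own, the parser can skip them the same way it skips the header.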


@@ -12,7 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.

"""Provide functionality to get projects and their commit ids from Superproject.

For more information on superproject, check out:
https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects

@@ -22,445 +22,527 @@ Examples:
  UpdateProjectsResult = superproject.UpdateProjectsRevisionId(projects)
"""

import functools
import hashlib
import os
import sys
import time
from typing import NamedTuple

from git_command import git_require
from git_command import GitCommand
from git_config import RepoConfig
from git_refs import GitRefs


_SUPERPROJECT_GIT_NAME = "superproject.git"
_SUPERPROJECT_MANIFEST_NAME = "superproject_override.xml"


class SyncResult(NamedTuple):
    """Return the status of sync and whether caller should exit."""

    # Whether the superproject sync was successful.
    success: bool
    # Whether the caller should exit.
    fatal: bool


class CommitIdsResult(NamedTuple):
    """Return the commit ids and whether caller should exit."""

    # A dictionary with the projects/commit ids on success, otherwise None.
    commit_ids: dict
    # Whether the caller should exit.
    fatal: bool


class UpdateProjectsResult(NamedTuple):
    """Return the overriding manifest file and whether caller should exit."""

    # Path name of the overriding manifest file if successful, otherwise None.
    manifest_path: str
    # Whether the caller should exit.
    fatal: bool


class Superproject:
    """Get commit ids from superproject.

    Initializes a bare local copy of a superproject for the manifest. This
    allows lookup of commit ids for all projects. It contains
    _project_commit_ids which is a dictionary with project/commit id entries.
    """

    def __init__(
        self,
        manifest,
        name,
        remote,
        revision,
        superproject_dir="exp-superproject",
    ):
        """Initializes superproject.

        Args:
            manifest: A Manifest object that is to be written to a file.
            name: The unique name of the superproject
            remote: The RemoteSpec for the remote.
            revision: The name of the git branch to track.
            superproject_dir: Relative path under |manifest.subdir| to checkout
                superproject.
        """
        self._project_commit_ids = None
        self._manifest = manifest
        self.name = name
        self.remote = remote
        self.revision = self._branch = revision
        self._repodir = manifest.repodir
        self._superproject_dir = superproject_dir
        self._superproject_path = manifest.SubmanifestInfoDir(
            manifest.path_prefix, superproject_dir
        )
        self._manifest_path = os.path.join(
            self._superproject_path, _SUPERPROJECT_MANIFEST_NAME
        )
        git_name = hashlib.md5(remote.name.encode("utf8")).hexdigest() + "-"
        self._remote_url = remote.url
        self._work_git_name = git_name + _SUPERPROJECT_GIT_NAME
        self._work_git = os.path.join(
            self._superproject_path, self._work_git_name
        )

        # The following are command arguments, rather than superproject
        # attributes, and were included here originally. They should eventually
        # become arguments that are passed down from the public methods, instead
        # of being treated as attributes.
        self._git_event_log = None
        self._quiet = False
        self._print_messages = False

    def SetQuiet(self, value):
        """Set the _quiet attribute."""
        self._quiet = value

    def SetPrintMessages(self, value):
        """Set the _print_messages attribute."""
        self._print_messages = value

    @property
    def project_commit_ids(self):
        """Returns a dictionary of projects and their commit ids."""
        return self._project_commit_ids

    @property
    def manifest_path(self):
        """Returns the manifest path if the path exists or None."""
        return (
            self._manifest_path if os.path.exists(self._manifest_path) else None
        )

    def _LogMessage(self, fmt, *inputs):
        """Logs message to stderr and _git_event_log."""
        message = f"{self._LogMessagePrefix()} {fmt.format(*inputs)}"
        if self._print_messages:
            print(message, file=sys.stderr)
        self._git_event_log.ErrorEvent(message, fmt)

    def _LogMessagePrefix(self):
        """Returns the prefix string to be logged in each log message"""
        return (
            f"repo superproject branch: {self._branch} url: {self._remote_url}"
        )

    def _LogError(self, fmt, *inputs):
        """Logs error message to stderr and _git_event_log."""
        self._LogMessage(f"error: {fmt}", *inputs)

    def _LogWarning(self, fmt, *inputs):
        """Logs warning message to stderr and _git_event_log."""
        self._LogMessage(f"warning: {fmt}", *inputs)

    def _Init(self):
        """Sets up a local Git repository to get a copy of a superproject.

        Returns:
            True if initialization is successful, or False.
        """
        if not os.path.exists(self._superproject_path):
            os.mkdir(self._superproject_path)
        if not self._quiet and not os.path.exists(self._work_git):
            print(
                "%s: Performing initial setup for superproject; this might "
                "take several minutes." % self._work_git
            )
        cmd = ["init", "--bare", self._work_git_name]
        p = GitCommand(
            None,
            cmd,
            cwd=self._superproject_path,
            capture_stdout=True,
            capture_stderr=True,
        )
        retval = p.Wait()
        if retval:
            self._LogWarning(
                "git init call failed, command: git {}, "
                "return code: {}, stderr: {}",
                cmd,
                retval,
                p.stderr,
            )
            return False
        return True

    def _Fetch(self):
        """Fetches a superproject for the manifest based on |_remote_url|.

        This runs git fetch which stores a local copy of the superproject.

        Returns:
            True if fetch is successful, or False.
        """
        if not os.path.exists(self._work_git):
            self._LogWarning("git fetch missing directory: {}", self._work_git)
            return False
        if not git_require((2, 28, 0)):
            self._LogWarning(
                "superproject requires a git version 2.28 or later"
            )
            return False
        cmd = [
            "fetch",
            self._remote_url,
            "--depth",
            "1",
            "--force",
            "--no-tags",
            "--filter",
            "blob:none",
        ]

        # Check if there is a local ref that we can pass to --negotiation-tip.
        # If this is the first fetch, it does not exist yet.
        # We use --negotiation-tip to speed up the fetch. Superproject branches
        # do not share commits. So this lets git know it only needs to send
        # commits reachable from the specified local refs.
        rev_commit = GitRefs(self._work_git).get(f"refs/heads/{self.revision}")
        if rev_commit:
            cmd.extend(["--negotiation-tip", rev_commit])

        if self._branch:
            cmd += [self._branch + ":" + self._branch]
        p = GitCommand(
            None,
            cmd,
            gitdir=self._work_git,
            bare=True,
            capture_stdout=True,
            capture_stderr=True,
        )
        retval = p.Wait()
        if retval:
            self._LogWarning(
                "git fetch call failed, command: git {}, "
                "return code: {}, stderr: {}",
                cmd,
                retval,
                p.stderr,
            )
            return False
        return True
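The fetch argument construction in `_Fetch` can be sketched in isolation (the `build_fetch_cmd` helper and its sample values are illustrative): `--negotiation-tip` is appended only when a previously fetched local ref already exists, so the very first fetch omits it.

```python
def build_fetch_cmd(remote_url, branch, rev_commit=None):
    """Assemble a shallow, blob-less fetch command, as _Fetch does."""
    cmd = [
        "fetch", remote_url,
        "--depth", "1",
        "--force", "--no-tags",
        "--filter", "blob:none",
    ]
    if rev_commit:
        # Tell git it only needs to send commits reachable from this ref.
        cmd.extend(["--negotiation-tip", rev_commit])
    if branch:
        cmd += [branch + ":" + branch]
    return cmd


first = build_fetch_cmd("https://example.com/super", "main")
later = build_fetch_cmd("https://example.com/super", "main", "d3adbeef")
assert "--negotiation-tip" not in first
assert later[-3:] == ["--negotiation-tip", "d3adbeef", "main:main"]
```

The negotiation tip matters here because superproject branches do not share history: without it, git's default negotiation would advertise every local ref, while with it the server only needs to compute what is missing relative to the one relevant branch.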
    def _LsTree(self):
        """Gets the commit ids for all projects.

        Works only in git repositories.

        Returns:
            data: data returned from 'git ls-tree ...' instead of None.
        """
        if not os.path.exists(self._work_git):
            self._LogWarning(
                "git ls-tree missing directory: {}", self._work_git
            )
            return None
        data = None
        branch = "HEAD" if not self._branch else self._branch
        cmd = ["ls-tree", "-z", "-r", branch]

        p = GitCommand(
            None,
            cmd,
            gitdir=self._work_git,
            bare=True,
            capture_stdout=True,
            capture_stderr=True,
        )
        retval = p.Wait()
        if retval == 0:
            data = p.stdout
        else:
            self._LogWarning(
                "git ls-tree call failed, command: git {}, "
                "return code: {}, stderr: {}",
                cmd,
                retval,
                p.stderr,
            )
        return data

    def Sync(self, git_event_log):
        """Gets a local copy of a superproject for the manifest.

        Args:
            git_event_log: an EventLog, for git tracing.

        Returns:
            SyncResult
        """
        self._git_event_log = git_event_log
        if not self._manifest.superproject:
            self._LogWarning(
                "superproject tag is not defined in manifest: {}",
                self._manifest.manifestFile,
            )
            return SyncResult(False, False)

        should_exit = True
        if not self._remote_url:
            self._LogWarning(
                "superproject URL is not defined in manifest: {}",
                self._manifest.manifestFile,
            )
            return SyncResult(False, should_exit)

        if not self._Init():
            return SyncResult(False, should_exit)
        if not self._Fetch():
            return SyncResult(False, should_exit)
        if not self._quiet:
            print(
                "%s: Initial setup for superproject completed." % self._work_git
            )
        return SyncResult(True, False)

    def _GetAllProjectsCommitIds(self):
        """Get commit ids for all projects from superproject and save them.

        Commit ids are saved in _project_commit_ids.
# build a dictionary with project path (last element) and its commit id (3rd element).
#
# 160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00
# 120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00
commit_ids = {}
for line in data.split('\x00'):
ls_data = line.split(None, 3)
if not ls_data:
break
if ls_data[0] == '160000':
commit_ids[ls_data[3]] = ls_data[2]
self._project_commit_ids = commit_ids Returns:
return CommitIdsResult(commit_ids, False) CommitIdsResult
"""
sync_result = self.Sync(self._git_event_log)
if not sync_result.success:
return CommitIdsResult(None, sync_result.fatal)
def _WriteManifestFile(self): data = self._LsTree()
"""Writes manifest to a file. if not data:
self._LogWarning(
"git ls-tree failed to return data for manifest: {}",
self._manifest.manifestFile,
)
return CommitIdsResult(None, True)
Returns: # Parse lines like the following to select lines starting with '160000'
manifest_path: Path name of the file into which manifest is written instead of None. # and build a dictionary with project path (last element) and its commit
""" # id (3rd element).
if not os.path.exists(self._superproject_path): #
self._LogWarning('missing superproject directory: {}', self._superproject_path) # 160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00
return None # 120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00 # noqa: E501
manifest_str = self._manifest.ToXml(groups=self._manifest.GetGroupsStr(), commit_ids = {}
omit_local=True).toxml() for line in data.split("\x00"):
manifest_path = self._manifest_path ls_data = line.split(None, 3)
try: if not ls_data:
with open(manifest_path, 'w', encoding='utf-8') as fp: break
fp.write(manifest_str) if ls_data[0] == "160000":
except IOError as e: commit_ids[ls_data[3]] = ls_data[2]
self._LogError('cannot write manifest to : {} {}',
manifest_path, e)
return None
return manifest_path
def _SkipUpdatingProjectRevisionId(self, project): self._project_commit_ids = commit_ids
"""Checks if a project's revision id needs to be updated or not. return CommitIdsResult(commit_ids, False)
Revision id for projects from local manifest will not be updated. def _WriteManifestFile(self):
"""Writes manifest to a file.
Args: Returns:
project: project whose revision id is being updated. manifest_path: Path name of the file into which manifest is written
instead of None.
"""
if not os.path.exists(self._superproject_path):
self._LogWarning(
"missing superproject directory: {}", self._superproject_path
)
return None
manifest_str = self._manifest.ToXml(
groups=self._manifest.GetGroupsStr(), omit_local=True
).toxml()
manifest_path = self._manifest_path
try:
with open(manifest_path, "w", encoding="utf-8") as fp:
fp.write(manifest_str)
except OSError as e:
self._LogError("cannot write manifest to : {} {}", manifest_path, e)
return None
return manifest_path
Returns: def _SkipUpdatingProjectRevisionId(self, project):
True if a project's revision id should not be updated, or False, """Checks if a project's revision id needs to be updated or not.
"""
path = project.relpath
if not path:
return True
# Skip the project with revisionId.
if project.revisionId:
return True
# Skip the project if it comes from the local manifest.
return project.manifest.IsFromLocalManifest(project)
def UpdateProjectsRevisionId(self, projects, git_event_log): Revision id for projects from local manifest will not be updated.
"""Update revisionId of every project in projects with the commit id.
Args: Args:
projects: a list of projects whose revisionId needs to be updated. project: project whose revision id is being updated.
git_event_log: an EventLog, for git tracing.
Returns: Returns:
UpdateProjectsResult True if a project's revision id should not be updated, or False,
""" """
self._git_event_log = git_event_log path = project.relpath
commit_ids_result = self._GetAllProjectsCommitIds() if not path:
commit_ids = commit_ids_result.commit_ids return True
if not commit_ids: # Skip the project with revisionId.
return UpdateProjectsResult(None, commit_ids_result.fatal) if project.revisionId:
return True
# Skip the project if it comes from the local manifest.
return project.manifest.IsFromLocalManifest(project)
projects_missing_commit_ids = [] def UpdateProjectsRevisionId(self, projects, git_event_log):
for project in projects: """Update revisionId of every project in projects with the commit id.
if self._SkipUpdatingProjectRevisionId(project):
continue
path = project.relpath
commit_id = commit_ids.get(path)
if not commit_id:
projects_missing_commit_ids.append(path)
# If superproject doesn't have a commit id for a project, then report an Args:
# error event and continue as if do not use superproject is specified. projects: a list of projects whose revisionId needs to be updated.
if projects_missing_commit_ids: git_event_log: an EventLog, for git tracing.
self._LogWarning('please file a bug using {} to report missing '
'commit_ids for: {}', self._manifest.contactinfo.bugurl,
projects_missing_commit_ids)
return UpdateProjectsResult(None, False)
for project in projects: Returns:
if not self._SkipUpdatingProjectRevisionId(project): UpdateProjectsResult
project.SetRevisionId(commit_ids.get(project.relpath)) """
self._git_event_log = git_event_log
commit_ids_result = self._GetAllProjectsCommitIds()
commit_ids = commit_ids_result.commit_ids
if not commit_ids:
return UpdateProjectsResult(None, commit_ids_result.fatal)
manifest_path = self._WriteManifestFile() projects_missing_commit_ids = []
return UpdateProjectsResult(manifest_path, False) for project in projects:
if self._SkipUpdatingProjectRevisionId(project):
continue
path = project.relpath
commit_id = commit_ids.get(path)
if not commit_id:
projects_missing_commit_ids.append(path)
# If superproject doesn't have a commit id for a project, then report an
# error event and continue as if do not use superproject is specified.
if projects_missing_commit_ids:
self._LogWarning(
"please file a bug using {} to report missing "
"commit_ids for: {}",
self._manifest.contactinfo.bugurl,
projects_missing_commit_ids,
)
return UpdateProjectsResult(None, False)
@functools.lru_cache(maxsize=10) for project in projects:
def _PrintBetaNotice(): if not self._SkipUpdatingProjectRevisionId(project):
"""Print the notice of beta status.""" project.SetRevisionId(commit_ids.get(project.relpath))
print('NOTICE: --use-superproject is in beta; report any issues to the '
'address described in `repo version`', file=sys.stderr) manifest_path = self._WriteManifestFile()
return UpdateProjectsResult(manifest_path, False)
@functools.lru_cache(maxsize=None) @functools.lru_cache(maxsize=None)
def _UseSuperprojectFromConfiguration(): def _UseSuperprojectFromConfiguration():
"""Returns the user choice of whether to use superproject.""" """Returns the user choice of whether to use superproject."""
user_cfg = RepoConfig.ForUser() user_cfg = RepoConfig.ForUser()
time_now = int(time.time()) time_now = int(time.time())
user_value = user_cfg.GetBoolean('repo.superprojectChoice') user_value = user_cfg.GetBoolean("repo.superprojectChoice")
if user_value is not None: if user_value is not None:
user_expiration = user_cfg.GetInt('repo.superprojectChoiceExpire') user_expiration = user_cfg.GetInt("repo.superprojectChoiceExpire")
if user_expiration is None or user_expiration <= 0 or user_expiration >= time_now: if (
# TODO(b/190688390) - Remove prompt when we are comfortable with the new user_expiration is None
# default value. or user_expiration <= 0
if user_value: or user_expiration >= time_now
print(('You are currently enrolled in Git submodules experiment ' ):
'(go/android-submodules-quickstart). Use --no-use-superproject ' # TODO(b/190688390) - Remove prompt when we are comfortable with the
'to override.\n'), file=sys.stderr) # new default value.
else: if user_value:
print(('You are not currently enrolled in Git submodules experiment ' print(
'(go/android-submodules-quickstart). Use --use-superproject ' (
'to override.\n'), file=sys.stderr) "You are currently enrolled in Git submodules "
return user_value "experiment (go/android-submodules-quickstart). Use "
"--no-use-superproject to override.\n"
),
file=sys.stderr,
)
else:
print(
(
"You are not currently enrolled in Git submodules "
"experiment (go/android-submodules-quickstart). Use "
"--use-superproject to override.\n"
),
file=sys.stderr,
)
return user_value
# We don't have an unexpired choice, ask for one. # We don't have an unexpired choice, ask for one.
system_cfg = RepoConfig.ForSystem() system_cfg = RepoConfig.ForSystem()
system_value = system_cfg.GetBoolean('repo.superprojectChoice') system_value = system_cfg.GetBoolean("repo.superprojectChoice")
if system_value: if system_value:
# The system configuration is proposing that we should enable the # The system configuration is proposing that we should enable the
# use of superproject. Treat the user as enrolled for two weeks. # use of superproject. Treat the user as enrolled for two weeks.
# #
# TODO(b/190688390) - Remove prompt when we are comfortable with the new # TODO(b/190688390) - Remove prompt when we are comfortable with the new
# default value. # default value.
userchoice = True userchoice = True
time_choiceexpire = time_now + (86400 * 14) time_choiceexpire = time_now + (86400 * 14)
user_cfg.SetString('repo.superprojectChoiceExpire', str(time_choiceexpire)) user_cfg.SetString(
user_cfg.SetBoolean('repo.superprojectChoice', userchoice) "repo.superprojectChoiceExpire", str(time_choiceexpire)
print('You are automatically enrolled in Git submodules experiment ' )
'(go/android-submodules-quickstart) for another two weeks.\n', user_cfg.SetBoolean("repo.superprojectChoice", userchoice)
file=sys.stderr) print(
return True "You are automatically enrolled in Git submodules experiment "
"(go/android-submodules-quickstart) for another two weeks.\n",
file=sys.stderr,
)
return True
# For all other cases, we would not use superproject by default. # For all other cases, we would not use superproject by default.
return False return False
def PrintMessages(use_superproject, manifest): def PrintMessages(use_superproject, manifest):
"""Returns a boolean if error/warning messages are to be printed. """Returns a boolean if error/warning messages are to be printed.
Args: Args:
use_superproject: option value from optparse. use_superproject: option value from optparse.
manifest: manifest to use. manifest: manifest to use.
""" """
return use_superproject is not None or bool(manifest.superproject) return use_superproject is not None or bool(manifest.superproject)
def UseSuperproject(use_superproject, manifest): def UseSuperproject(use_superproject, manifest):
"""Returns a boolean if use-superproject option is enabled. """Returns a boolean if use-superproject option is enabled.
Args: Args:
use_superproject: option value from optparse. use_superproject: option value from optparse.
manifest: manifest to use. manifest: manifest to use.
Returns: Returns:
Whether the superproject should be used. Whether the superproject should be used.
""" """
if not manifest.superproject: if not manifest.superproject:
# This (sub) manifest does not have a superproject definition. # This (sub) manifest does not have a superproject definition.
return False return False
elif use_superproject is not None: elif use_superproject is not None:
return use_superproject return use_superproject
else:
client_value = manifest.manifestProject.use_superproject
if client_value is not None:
return client_value
elif manifest.superproject:
return _UseSuperprojectFromConfiguration()
else: else:
return False client_value = manifest.manifestProject.use_superproject
if client_value is not None:
return client_value
elif manifest.superproject:
return _UseSuperprojectFromConfiguration()
else:
return False
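The gitlink-parsing loop in `_GetAllProjectsCommitIds` is self-contained enough to sketch on its own. Below is a minimal illustration of that technique; the function name `parse_superproject_commit_ids` and the sample string are hypothetical, modeled on the commented example entries in the method:

```python
def parse_superproject_commit_ids(data):
    """Map project path -> commit id for gitlink (mode 160000) entries.

    `data` is the NUL-terminated output of `git ls-tree -z -r <branch>`,
    where each entry looks like "<mode> <type> <object>\t<path>".
    """
    commit_ids = {}
    for line in data.split("\x00"):
        # Split on whitespace (including the tab before the path), at most
        # 3 times, so the path survives intact as the 4th field.
        ls_data = line.split(None, 3)
        if not ls_data:
            break
        if ls_data[0] == "160000":
            commit_ids[ls_data[3]] = ls_data[2]
    return commit_ids


# Hypothetical sample mirroring the entries quoted in the source comment.
sample = (
    "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
    "120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00"
)
ids = parse_superproject_commit_ids(sample)
```

Only mode `160000` (gitlink) entries survive the filter; the symlink (`120000`) entry is skipped, so `ids` holds just the `art` project's pinned commit.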

@@ -1,331 +1,32 @@
from git_command import GetEventTargetPath
from git_command import RepoSourceVersion
from git_trace2_event_log_base import BaseEventLog


class EventLog(BaseEventLog):
    """Event log that records events that occurred during a repo invocation.

    Events are written to the log as consecutive JSON entries, one per line.
    Entries follow the git trace2 EVENT format.

    Each entry contains the following common keys:
    - event: The event name
    - sid: session-id - Unique string to allow process instance to be
          identified.
    - thread: The thread name.
    - time: is the UTC time of the event.

    Valid 'event' names and event specific fields are documented here:
    https://git-scm.com/docs/api-trace2#_event_format
    """

    def __init__(self, **kwargs):
        super().__init__(repo_source_version=RepoSourceVersion(), **kwargs)

    def Write(self, path=None, **kwargs):
        if path is None:
            path = self._GetEventTargetPath()
        return super().Write(path=path, **kwargs)

    def _GetEventTargetPath(self):
        return GetEventTargetPath()
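The `BaseEventLog` machinery this file delegates to emits one compact JSON object per event line. Here is a minimal sketch of that event shape and encoding; the helper names `make_event` and `encode_line` are hypothetical, while the common keys and the compact-separator encoding mirror what `_CreateEventDict` and `_WriteLog` produce:

```python
import datetime
import json
import threading


def make_event(event_name, sid):
    # The common keys every trace2-style event carries.
    return {
        "event": event_name,
        "sid": sid,
        "thread": threading.current_thread().name,
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }


def encode_line(event):
    # Compact encoding: no indent, minimal separators, newline-terminated,
    # matching json.dumps(..., indent=None, separators=(",", ":")).
    return (
        json.dumps(event, indent=None, separators=(",", ":")).encode("utf-8")
        + b"\n"
    )


line = encode_line(make_event("start", "repo-20240101T000000Z-P00001234"))
```

Each line can be parsed back independently with `json.loads`, which is what makes appending to a shared log file or socket safe one event at a time.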

@@ -0,0 +1,354 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Provide event logging in the git trace2 EVENT format.
The git trace2 EVENT format is defined at:
https://www.kernel.org/pub/software/scm/git/docs/technical/api-trace2.html#_event_format
https://git-scm.com/docs/api-trace2#_the_event_format_target
Usage:
git_trace_log = EventLog()
git_trace_log.StartEvent()
...
git_trace_log.ExitEvent()
git_trace_log.Write()
"""
import datetime
import errno
import json
import os
import socket
import sys
import tempfile
import threading
# Timeout when sending events via socket (applies to connect, send)
SOCK_TIMEOUT = 0.5 # in seconds
# Counter of BaseEventLog.__init__ calls; consistent within the same process.
p_init_count = 0
class BaseEventLog:
"""Event log that records events that occurred during a repo invocation.
    Events are written to the log as consecutive JSON entries, one per line.
Entries follow the git trace2 EVENT format.
Each entry contains the following common keys:
- event: The event name
- sid: session-id - Unique string to allow process instance to be
identified.
- thread: The thread name.
- time: is the UTC time of the event.
Valid 'event' names and event specific fields are documented here:
https://git-scm.com/docs/api-trace2#_event_format
"""
def __init__(
self, env=None, repo_source_version=None, add_init_count=False
):
"""Initializes the event log."""
global p_init_count
p_init_count += 1
self._log = []
# Try to get session-id (sid) from environment (setup in repo launcher).
KEY = "GIT_TRACE2_PARENT_SID"
if env is None:
env = os.environ
self.start = datetime.datetime.now(datetime.timezone.utc)
# Save both our sid component and the complete sid.
# We use our sid component (self._sid) as the unique filename prefix and
# the full sid (self._full_sid) in the log itself.
self._sid = (
f"repo-{self.start.strftime('%Y%m%dT%H%M%SZ')}-P{os.getpid():08x}"
)
if add_init_count:
self._sid = f"{self._sid}-{p_init_count}"
parent_sid = env.get(KEY)
# Append our sid component to the parent sid (if it exists).
if parent_sid is not None:
self._full_sid = parent_sid + "/" + self._sid
else:
self._full_sid = self._sid
# Set/update the environment variable.
# Environment handling across systems is messy.
try:
env[KEY] = self._full_sid
except UnicodeEncodeError:
env[KEY] = self._full_sid.encode()
if repo_source_version is not None:
# Add a version event to front of the log.
self._AddVersionEvent(repo_source_version)
@property
def full_sid(self):
return self._full_sid
def _AddVersionEvent(self, repo_source_version):
"""Adds a 'version' event at the beginning of current log."""
version_event = self._CreateEventDict("version")
version_event["evt"] = "2"
version_event["exe"] = repo_source_version
self._log.insert(0, version_event)
def _CreateEventDict(self, event_name):
"""Returns a dictionary with common keys/values for git trace2 events.
Args:
event_name: The event name.
Returns:
Dictionary with the common event fields populated.
"""
return {
"event": event_name,
"sid": self._full_sid,
"thread": threading.current_thread().name,
"time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
}
def StartEvent(self):
"""Append a 'start' event to the current log."""
start_event = self._CreateEventDict("start")
start_event["argv"] = sys.argv
self._log.append(start_event)
def ExitEvent(self, result):
"""Append an 'exit' event to the current log.
Args:
result: Exit code of the event
"""
exit_event = self._CreateEventDict("exit")
# Consider 'None' success (consistent with event_log result handling).
if result is None:
result = 0
exit_event["code"] = result
time_delta = datetime.datetime.now(datetime.timezone.utc) - self.start
exit_event["t_abs"] = time_delta.total_seconds()
self._log.append(exit_event)
def CommandEvent(self, name, subcommands):
"""Append a 'command' event to the current log.
Args:
name: Name of the primary command (ex: repo, git)
subcommands: List of the sub-commands (ex: version, init, sync)
"""
command_event = self._CreateEventDict("command")
command_event["name"] = name
command_event["subcommands"] = subcommands
self._log.append(command_event)
def LogConfigEvents(self, config, event_dict_name):
"""Append a |event_dict_name| event for each config key in |config|.
Args:
config: Configuration dictionary.
event_dict_name: Name of the event dictionary for items to be logged
under.
"""
for param, value in config.items():
event = self._CreateEventDict(event_dict_name)
event["param"] = param
event["value"] = value
self._log.append(event)
def DefParamRepoEvents(self, config):
"""Append 'def_param' events for repo config keys to the current log.
This appends one event for each repo.* config key.
Args:
config: Repo configuration dictionary
"""
# Only output the repo.* config parameters.
repo_config = {k: v for k, v in config.items() if k.startswith("repo.")}
self.LogConfigEvents(repo_config, "def_param")
def GetDataEventName(self, value):
"""Returns 'data-json' if the value is an array else returns 'data'."""
return "data-json" if value[0] == "[" and value[-1] == "]" else "data"
def LogDataConfigEvents(self, config, prefix):
"""Append a 'data' event for each entry in |config| to the current log.
For each keyX and valueX of the config, "key" field of the event is
'|prefix|/keyX' and the "value" of the "key" field is valueX.
Args:
config: Configuration dictionary.
prefix: Prefix for each key that is logged.
"""
for key, value in config.items():
event = self._CreateEventDict(self.GetDataEventName(value))
event["key"] = f"{prefix}/{key}"
event["value"] = value
self._log.append(event)
def ErrorEvent(self, msg, fmt=None):
"""Append a 'error' event to the current log."""
error_event = self._CreateEventDict("error")
if fmt is None:
fmt = msg
error_event["msg"] = f"RepoErrorEvent:{msg}"
error_event["fmt"] = f"RepoErrorEvent:{fmt}"
self._log.append(error_event)
def _WriteLog(self, write_fn):
"""Writes the log out using a provided writer function.
Generate compact JSON output for each item in the log, and write it
using write_fn.
Args:
            write_fn: A function that accepts bytes and writes them to a
destination.
"""
for e in self._log:
# Dump in compact encoding mode.
# See 'Compact encoding' in Python docs:
# https://docs.python.org/3/library/json.html#module-json
write_fn(
json.dumps(e, indent=None, separators=(",", ":")).encode(
"utf-8"
)
+ b"\n"
)
def Write(self, path=None):
"""Writes the log out to a file or socket.
Log is only written if 'path' or 'git config --get trace2.eventtarget'
provide a valid path (or socket) to write logs to.
Logging filename format follows the git trace2 style of being a unique
(exclusive writable) file.
Args:
path: Path to where logs should be written. The path may have a
prefix of the form "af_unix:[{stream|dgram}:]", in which case
the path is treated as a Unix domain socket. See
https://git-scm.com/docs/api-trace2#_enabling_a_target for
details.
Returns:
log_path: Path to the log file or socket if log is written,
otherwise None
"""
log_path = None
# If no logging path is specified, exit.
if path is None:
return None
path_is_socket = False
socket_type = None
if isinstance(path, str):
parts = path.split(":", 1)
if parts[0] == "af_unix" and len(parts) == 2:
path_is_socket = True
path = parts[1]
parts = path.split(":", 1)
if parts[0] == "stream" and len(parts) == 2:
socket_type = socket.SOCK_STREAM
path = parts[1]
elif parts[0] == "dgram" and len(parts) == 2:
socket_type = socket.SOCK_DGRAM
path = parts[1]
else:
# Get absolute path.
path = os.path.abspath(os.path.expanduser(path))
else:
raise TypeError("path: str required but got %s." % type(path))
# Git trace2 requires a directory to write log to.
# TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
if not (path_is_socket or os.path.isdir(path)):
return None
if path_is_socket:
if socket_type == socket.SOCK_STREAM or socket_type is None:
try:
with socket.socket(
socket.AF_UNIX, socket.SOCK_STREAM
) as sock:
sock.settimeout(SOCK_TIMEOUT)
sock.connect(path)
self._WriteLog(sock.sendall)
return f"af_unix:stream:{path}"
except OSError as err:
# If we tried to connect to a DGRAM socket using STREAM,
# ignore the attempt and continue to DGRAM below. Otherwise,
# issue a warning.
if err.errno != errno.EPROTOTYPE:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
if socket_type == socket.SOCK_DGRAM or socket_type is None:
try:
with socket.socket(
socket.AF_UNIX, socket.SOCK_DGRAM
) as sock:
self._WriteLog(lambda bs: sock.sendto(bs, path))
return f"af_unix:dgram:{path}"
except OSError as err:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
# Tried to open a socket but couldn't connect (SOCK_STREAM) or write
# (SOCK_DGRAM).
print(
"repo: warning: git trace2 logging failed: could not write to "
"socket",
file=sys.stderr,
)
return None
# Path is an absolute path
# Use NamedTemporaryFile to generate a unique filename as required by
# git trace2.
try:
with tempfile.NamedTemporaryFile(
mode="xb", prefix=self._sid, dir=path, delete=False
) as f:
# TODO(https://crbug.com/gerrit/13706): Support writing events
# as they occur.
self._WriteLog(f.write)
log_path = f.name
except FileExistsError as err:
print(
"repo: warning: git trace2 logging failed: %r" % err,
file=sys.stderr,
)
return None
return log_path
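The `af_unix:[{stream|dgram}:]` prefix handling in Write() above can be illustrated standalone. This is a sketch only; the helper name `parse_trace2_target` is ours, not repo's, and it mirrors just the prefix-splitting steps:

```python
import os
import socket


def parse_trace2_target(path):
    # Split off an optional "af_unix:[{stream|dgram}:]" prefix, mirroring
    # the logic in Write(). Returns (path_is_socket, socket_type, path).
    path_is_socket = False
    socket_type = None
    parts = path.split(":", 1)
    if parts[0] == "af_unix" and len(parts) == 2:
        path_is_socket = True
        path = parts[1]
        parts = path.split(":", 1)
        if parts[0] == "stream" and len(parts) == 2:
            socket_type = socket.SOCK_STREAM
            path = parts[1]
        elif parts[0] == "dgram" and len(parts) == 2:
            socket_type = socket.SOCK_DGRAM
            path = parts[1]
    else:
        # Plain filesystem path: normalize like Write() does.
        path = os.path.abspath(os.path.expanduser(path))
    return path_is_socket, socket_type, path
```

So `af_unix:stream:/tmp/t.sock` selects a stream socket, `af_unix:/tmp/t.sock` leaves the socket type unset (Write() then probes stream first, then dgram), and anything else is treated as a log directory.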


@@ -1,155 +0,0 @@
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import multiprocessing
import re
import sys
import time

import git_command
import git_config
import wrapper

from error import ManifestParseError

NUM_BATCH_RETRIEVE_REVISIONID = 32


def get_gitc_manifest_dir():
  return wrapper.Wrapper().get_gitc_manifest_dir()


def parse_clientdir(gitc_fs_path):
  return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path)


def _get_project_revision(args):
  """Worker for _set_project_revisions to lookup one project remote."""
  (i, url, expr) = args
  gitcmd = git_command.GitCommand(
      None, ['ls-remote', url, expr], capture_stdout=True, cwd='/tmp')
  rc = gitcmd.Wait()
  return (i, rc, gitcmd.stdout.split('\t', 1)[0])


def _set_project_revisions(projects):
  """Sets the revisionExpr for a list of projects.

  Because of the limit of open file descriptors allowed, length of projects
  should not be overly large. Recommend calling this function multiple times
  with each call not exceeding NUM_BATCH_RETRIEVE_REVISIONID projects.

  Args:
    projects: List of project objects to set the revisionExpr for.
  """
  # Retrieve the commit id for each project based off of its current
  # revisionExpr, if it is not already a commit id.
  with multiprocessing.Pool(NUM_BATCH_RETRIEVE_REVISIONID) as pool:
    results_iter = pool.imap_unordered(
        _get_project_revision,
        ((i, project.remote.url, project.revisionExpr)
         for i, project in enumerate(projects)
         if not git_config.IsId(project.revisionExpr)),
        chunksize=8)
    for (i, rc, revisionExpr) in results_iter:
      project = projects[i]
      if rc:
        print('FATAL: Failed to retrieve revisionExpr for %s' % project.name)
        pool.terminate()
        sys.exit(1)
      if not revisionExpr:
        pool.terminate()
        raise ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
                                 (project.remote.url, project.revisionExpr))
      project.revisionExpr = revisionExpr


def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
  """Generate a manifest for shafsd to use for this GITC client.

  Args:
    gitc_manifest: Current gitc manifest, or None if there isn't one yet.
    manifest: A GitcManifest object loaded with the current repo manifest.
    paths: List of project paths we want to update.
  """
  print('Generating GITC Manifest by fetching revision SHAs for each '
        'project.')
  if paths is None:
    paths = list(manifest.paths.keys())

  groups = [x for x in re.split(r'[,\s]+', manifest.GetGroupsStr()) if x]

  # Convert the paths to projects, and filter them to the matched groups.
  projects = [manifest.paths[p] for p in paths]
  projects = [p for p in projects if p.MatchesGroups(groups)]

  if gitc_manifest is not None:
    for path, proj in manifest.paths.items():
      if not proj.MatchesGroups(groups):
        continue

      if not proj.upstream and not git_config.IsId(proj.revisionExpr):
        proj.upstream = proj.revisionExpr

      if path not in gitc_manifest.paths:
        # Any new projects need their first revision, even if we weren't asked
        # for them.
        projects.append(proj)
      elif path not in paths:
        # And copy revisions from the previous manifest if we're not updating
        # them now.
        gitc_proj = gitc_manifest.paths[path]
        if gitc_proj.old_revision:
          proj.revisionExpr = None
          proj.old_revision = gitc_proj.old_revision
        else:
          proj.revisionExpr = gitc_proj.revisionExpr

  _set_project_revisions(projects)

  if gitc_manifest is not None:
    for path, proj in gitc_manifest.paths.items():
      if proj.old_revision and path in paths:
        # If we updated a project that has been started, keep the old-revision
        # updated.
        repo_proj = manifest.paths[path]
        repo_proj.old_revision = repo_proj.revisionExpr
        repo_proj.revisionExpr = None

  # Convert URLs from relative to absolute.
  for _name, remote in manifest.remotes.items():
    remote.fetchUrl = remote.resolvedFetchUrl

  # Save the manifest.
  save_manifest(manifest)


def save_manifest(manifest, client_dir=None):
  """Save the manifest file in the client_dir.

  Args:
    manifest: Manifest object to save.
    client_dir: Client directory to save the manifest in.
  """
  if not client_dir:
    manifest_file = manifest.manifestFile
  else:
    manifest_file = os.path.join(client_dir, '.manifest')
  with open(manifest_file, 'w') as f:
    manifest.Save(f, groups=manifest.GetGroupsStr())
  # TODO(sbasi/jorg): Come up with a solution to remove the sleep below.
  # Give the GITC filesystem time to register the manifest changes.
  time.sleep(3)
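The docstring of `_set_project_revisions` above recommends calling it in batches of at most NUM_BATCH_RETRIEVE_REVISIONID projects. A minimal batching sketch (the `chunked` helper is ours, for illustration only):

```python
NUM_BATCH_RETRIEVE_REVISIONID = 32  # same limit as in the module above


def chunked(seq, size):
    """Yield successive |size|-sized slices of |seq|."""
    for start in range(0, len(seq), size):
        yield seq[start:start + size]


# A caller would then drive the lookups batch by batch, e.g.:
#   for batch in chunked(projects, NUM_BATCH_RETRIEVE_REVISIONID):
#       _set_project_revisions(batch)
```

Keeping each batch within the limit bounds how many `git ls-remote` subprocesses (and thus open file descriptors) are in flight at once.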

hooks.py
@@ -12,11 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import re
import sys
import traceback
import urllib.parse

@@ -25,485 +22,482 @@ from error import HookError
from error import HookError
from git_refs import HEAD


class RepoHook:
    """A RepoHook contains information about a script to run as a hook.

    Hooks are used to run a python script before running an upload (for
    instance, to run presubmit checks). Eventually, we may have hooks for other
    actions.

    This shouldn't be confused with files in the 'repo/hooks' directory. Those
    files are copied into each '.git/hooks' folder for each project. Repo-level
    hooks are associated instead with repo actions.

    Hooks are always python. When a hook is run, we will load the hook into the
    interpreter and execute its main() function.

    Combinations of hook option flags:
    - no-verify=False, verify=False (DEFAULT):
      If stdout is a tty, can prompt about running hooks if needed.
      If user denies running hooks, the action is cancelled. If stdout is
      not a tty and we would need to prompt about hooks, action is
      cancelled.
    - no-verify=False, verify=True:
      Always run hooks with no prompt.
    - no-verify=True, verify=False:
      Never run hooks, but run action anyway (AKA bypass hooks).
    - no-verify=True, verify=True:
      Invalid
    """

    def __init__(
        self,
        hook_type,
        hooks_project,
        repo_topdir,
        manifest_url,
        bypass_hooks=False,
        allow_all_hooks=False,
        ignore_hooks=False,
        abort_if_user_denies=False,
    ):
        """RepoHook constructor.

        Params:
            hook_type: A string representing the type of hook. This is also
                used to figure out the name of the file containing the hook.
                For example: 'pre-upload'.
            hooks_project: The project containing the repo hooks.
                If you have a manifest, this is manifest.repo_hooks_project.
                OK if this is None, which will make the hook a no-op.
            repo_topdir: The top directory of the repo client checkout.
                This is the one containing the .repo directory. Scripts will
                run with CWD as this directory.
                If you have a manifest, this is manifest.topdir.
            manifest_url: The URL to the manifest git repo.
            bypass_hooks: If True, then 'Do not run the hook'.
            allow_all_hooks: If True, then 'Run the hook without prompting'.
            ignore_hooks: If True, then 'Do not abort action if hooks fail'.
            abort_if_user_denies: If True, we'll abort running the hook if the
                user doesn't allow us to run the hook.
        """
        self._hook_type = hook_type
        self._hooks_project = hooks_project
        self._repo_topdir = repo_topdir
        self._manifest_url = manifest_url
        self._bypass_hooks = bypass_hooks
        self._allow_all_hooks = allow_all_hooks
        self._ignore_hooks = ignore_hooks
        self._abort_if_user_denies = abort_if_user_denies

        # Store the full path to the script for convenience.
        if self._hooks_project:
            self._script_fullpath = os.path.join(
                self._hooks_project.worktree, self._hook_type + ".py"
            )
        else:
            self._script_fullpath = None

    def _GetHash(self):
        """Return a hash of the contents of the hooks directory.

        We'll just use git to do this. This hash has the property that if
        anything changes in the directory we will return a different hash.

        SECURITY CONSIDERATION:
            This hash only represents the contents of files in the hook
            directory, not any other files imported or called by hooks. Changes
            to imported files can change the script behavior without affecting
            the hash.

        Returns:
            A string representing the hash. This will always be ASCII so that
            it can be printed to the user easily.
        """
        assert self._hooks_project, "Must have hooks to calculate their hash."

        # We will use the work_git object rather than just calling
        # GetRevisionId(). That gives us a hash of the latest checked in version
        # of the files that the user will actually be executing. Specifically,
        # GetRevisionId() doesn't appear to change even if a user checks out a
        # different version of the hooks repo (via git checkout) nor if a user
        # commits their own revs.
        #
        # NOTE: Local (non-committed) changes will not be factored into this
        # hash. I think this is OK, since we're really only worried about
        # warning the user about upstream changes.
        return self._hooks_project.work_git.rev_parse(HEAD)

    def _GetMustVerb(self):
        """Return 'must' if the hook is required; 'should' if not."""
        if self._abort_if_user_denies:
            return "must"
        else:
            return "should"

    def _CheckForHookApproval(self):
        """Check to see whether this hook has been approved.

        We'll accept approval of manifest URLs if they're using secure
        transports. This way the user can say they trust the manifest hoster.
        For insecure hosts, we fall back to checking the hash of the hooks
        repo.

        Note that we ask permission for each individual hook even though we use
        the hash of all hooks when detecting changes. We'd like the user to be
        able to approve / deny each hook individually. We only use the hash of
        all hooks because there is no other easy way to detect changes to local
        imports.

        Returns:
            True if this hook is approved to run; False otherwise.

        Raises:
            HookError: Raised if the user doesn't approve and
                abort_if_user_denies was passed to the constructor.
        """
        if self._ManifestUrlHasSecureScheme():
            return self._CheckForHookApprovalManifest()
        else:
            return self._CheckForHookApprovalHash()

    def _CheckForHookApprovalHelper(
        self, subkey, new_val, main_prompt, changed_prompt
    ):
        """Check for approval for a particular attribute and hook.

        Args:
            subkey: The git config key under [repo.hooks.<hook_type>] to store
                the last approved string.
            new_val: The new value to compare against the last approved one.
            main_prompt: Message to display to the user to ask for approval.
            changed_prompt: Message explaining why we're re-asking for approval.

        Returns:
            True if this hook is approved to run; False otherwise.

        Raises:
            HookError: Raised if the user doesn't approve and
                abort_if_user_denies was passed to the constructor.
        """
        hooks_config = self._hooks_project.config
        git_approval_key = f"repo.hooks.{self._hook_type}.{subkey}"

        # Get the last value that the user approved for this hook; may be None.
        old_val = hooks_config.GetString(git_approval_key)

        if old_val is not None:
            # User previously approved hook and asked not to be prompted again.
            if new_val == old_val:
                # Approval matched. We're done.
                return True
            else:
                # Give the user a reason why we're prompting, since they last
                # told us to "never ask again".
                prompt = f"WARNING: {changed_prompt}\n\n"
        else:
            prompt = ""

        # Prompt the user if we're on a tty; otherwise, we'll assume "no".
        if sys.stdout.isatty():
            prompt += main_prompt + " (yes/always/NO)? "
            response = input(prompt).lower()
            print()

            # User is doing a one-time approval.
            if response in ("y", "yes"):
                return True
            elif response == "always":
                hooks_config.SetString(git_approval_key, new_val)
                return True

        # For anything else, we'll assume no approval.
        if self._abort_if_user_denies:
            raise HookError(
                "You must allow the %s hook or use --no-verify."
                % self._hook_type
            )

        return False

    def _ManifestUrlHasSecureScheme(self):
        """Check if the URI for the manifest is a secure transport."""
        secure_schemes = (
            "file",
            "https",
            "ssh",
            "persistent-https",
            "sso",
            "rpc",
        )
        parse_results = urllib.parse.urlparse(self._manifest_url)
        return parse_results.scheme in secure_schemes

    def _CheckForHookApprovalManifest(self):
        """Check whether the user has approved this manifest host.

        Returns:
            True if this hook is approved to run; False otherwise.
        """
        return self._CheckForHookApprovalHelper(
            "approvedmanifest",
            self._manifest_url,
            f"Run hook scripts from {self._manifest_url}",
            f"Manifest URL has changed since {self._hook_type} was allowed.",
        )

    def _CheckForHookApprovalHash(self):
        """Check whether the user has approved the hooks repo.

        Returns:
            True if this hook is approved to run; False otherwise.
        """
        prompt = (
            "Repo %s run the script:\n"
            "  %s\n"
            "\n"
            "Do you want to allow this script to run"
        )
        return self._CheckForHookApprovalHelper(
            "approvedhash",
            self._GetHash(),
            prompt % (self._GetMustVerb(), self._script_fullpath),
            f"Scripts have changed since {self._hook_type} was allowed.",
        )

    @staticmethod
    def _ExtractInterpFromShebang(data):
        """Extract the interpreter used in the shebang.

        Try to locate the interpreter the script is using (ignoring `env`).

        Args:
            data: The file content of the script.

        Returns:
            The basename of the main script interpreter, or None if a shebang
            is not used or could not be parsed out.
        """
        firstline = data.splitlines()[:1]
        if not firstline:
            return None

        # The format here can be tricky.
        shebang = firstline[0].strip()
        m = re.match(r"^#!\s*([^\s]+)(?:\s+([^\s]+))?", shebang)
        if not m:
            return None

        # If using `env`, find the target program.
        interp = m.group(1)
        if os.path.basename(interp) == "env":
            interp = m.group(2)

        return interp

    def _ExecuteHookViaImport(self, data, context, **kwargs):
        """Execute the hook code in |data| directly.

        Args:
            data: The code of the hook to execute.
            context: Basic Python context to execute the hook inside.
            kwargs: Arbitrary arguments to pass to the hook script.

        Raises:
            HookError: When the hooks failed for any reason.
        """
        # Exec, storing global context in the context dict. We catch exceptions
        # and convert to a HookError w/ just the failing traceback.
        try:
            exec(compile(data, self._script_fullpath, "exec"), context)
        except Exception:
            raise HookError(
                "%s\nFailed to import %s hook; see traceback above."
                % (traceback.format_exc(), self._hook_type)
            )

        # Running the script should have defined a main() function.
        if "main" not in context:
            raise HookError('Missing main() in: "%s"' % self._script_fullpath)

        # Call the main function in the hook. If the hook should cause the
        # build to fail, it will raise an Exception. We'll catch that and
        # convert to a HookError w/ just the failing traceback.
        try:
            context["main"](**kwargs)
        except Exception:
            raise HookError(
                "%s\nFailed to run main() for %s hook; see traceback "
                "above." % (traceback.format_exc(), self._hook_type)
            )

    def _ExecuteHook(self, **kwargs):
        """Actually execute the given hook.

        This will run the hook's 'main' function in our python interpreter.

        Args:
            kwargs: Keyword arguments to pass to the hook. These are often
                specific to the hook type. For instance, pre-upload hooks will
                contain a project_list.
        """
        # Keep sys.path and CWD stashed away so that we can always restore them
        # upon function exit.
        orig_path = os.getcwd()
        orig_syspath = sys.path

        try:
            # Always run hooks with CWD as topdir.
            os.chdir(self._repo_topdir)

            # Put the hook dir as the first item of sys.path so hooks can do
            # relative imports. We want to replace the repo dir as [0] so
            # hooks can't import repo files.
            sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]

            # Initial global context for the hook to run within.
            context = {"__file__": self._script_fullpath}

            # Add 'hook_should_take_kwargs' to the arguments to be passed to
            # main. We don't actually want hooks to define their main with this
            # argument--it's there to remind them that their hook should always
            # take **kwargs.
            # For instance, a pre-upload hook should be defined like:
            #   def main(project_list, **kwargs):
            #
            # This allows us to later expand the API without breaking old hooks.
            kwargs = kwargs.copy()
            kwargs["hook_should_take_kwargs"] = True

            # See what version of python the hook has been written against.
            data = open(self._script_fullpath).read()
            interp = self._ExtractInterpFromShebang(data)
            if interp:
                prog = os.path.basename(interp)
                if prog.startswith("python2"):
raise HookError("Python 2 is not supported")
@classmethod # Run the hook by importing directly.
def FromSubcmd(cls, manifest, opt, *args, **kwargs): self._ExecuteHookViaImport(data, context, **kwargs)
"""Method to construct the repo hook class finally:
# Restore sys.path and CWD.
sys.path = orig_syspath
os.chdir(orig_path)
Args: def _CheckHook(self):
manifest: The current active manifest for this command from which we # Bail with a nice error if we can't find the hook.
extract a couple of fields. if not os.path.isfile(self._script_fullpath):
opt: Contains the commandline options for the action of this hook. raise HookError(
It should contain the options added by AddHookOptionGroup() in which "Couldn't find repo hook: %s" % self._script_fullpath
we are interested in RepoHook execution. )
"""
for key in ('bypass_hooks', 'allow_all_hooks', 'ignore_hooks'):
kwargs.setdefault(key, getattr(opt, key))
kwargs.update({
'hooks_project': manifest.repo_hooks_project,
'repo_topdir': manifest.topdir,
'manifest_url': manifest.manifestProject.GetRemote('origin').url,
})
return cls(*args, **kwargs)
@staticmethod def Run(self, **kwargs):
def AddOptionGroup(parser, name): """Run the hook.
"""Help options relating to the various hooks."""
# Note that verify and no-verify are NOT opposites of each other, which If the hook doesn't exist (because there is no hooks project or because
# is why they store to different locations. We are using them to match this particular hook is not enabled), this is a no-op.
# 'git commit' syntax.
group = parser.add_option_group(name + ' hooks') Args:
group.add_option('--no-verify', user_allows_all_hooks: If True, we will never prompt about running
dest='bypass_hooks', action='store_true', the hook--we'll just assume it's OK to run it.
help='Do not run the %s hook.' % name) kwargs: Keyword arguments to pass to the hook. These are often
group.add_option('--verify', specific to the hook type. For instance, pre-upload hooks will
dest='allow_all_hooks', action='store_true', contain a project_list.
help='Run the %s hook without prompting.' % name)
group.add_option('--ignore-hooks', Returns:
action='store_true', True: On success or ignore hooks by user-request
help='Do not abort if %s hooks fail.' % name) False: The hook failed. The caller should respond with aborting the
action. Some examples in which False is returned:
* Finding the hook failed while it was enabled, or
* the user declined to run a required hook (from
_CheckForHookApproval)
In all these cases the user did not pass the proper arguments to
ignore the result through the option combinations as listed in
AddHookOptionGroup().
"""
# Do not do anything in case bypass_hooks is set, or
# no-op if there is no hooks project or if hook is disabled.
if (
self._bypass_hooks
or not self._hooks_project
or self._hook_type not in self._hooks_project.enabled_repo_hooks
):
return True
passed = True
try:
self._CheckHook()
# Make sure the user is OK with running the hook.
if self._allow_all_hooks or self._CheckForHookApproval():
# Run the hook with the same version of python we're using.
self._ExecuteHook(**kwargs)
except SystemExit as e:
passed = False
print(
"ERROR: %s hooks exited with exit code: %s"
% (self._hook_type, str(e)),
file=sys.stderr,
)
except HookError as e:
passed = False
print("ERROR: %s" % str(e), file=sys.stderr)
if not passed and self._ignore_hooks:
print(
"\nWARNING: %s hooks failed, but continuing anyways."
% self._hook_type,
file=sys.stderr,
)
passed = True
return passed
@classmethod
def FromSubcmd(cls, manifest, opt, *args, **kwargs):
"""Method to construct the repo hook class
Args:
manifest: The current active manifest for this command from which we
extract a couple of fields.
opt: Contains the commandline options for the action of this hook.
It should contain the options added by AddHookOptionGroup() in
which we are interested in RepoHook execution.
"""
for key in ("bypass_hooks", "allow_all_hooks", "ignore_hooks"):
kwargs.setdefault(key, getattr(opt, key))
kwargs.update(
{
"hooks_project": manifest.repo_hooks_project,
"repo_topdir": manifest.topdir,
"manifest_url": manifest.manifestProject.GetRemote(
"origin"
).url,
}
)
return cls(*args, **kwargs)
@staticmethod
def AddOptionGroup(parser, name):
"""Help options relating to the various hooks."""
# Note that verify and no-verify are NOT opposites of each other, which
# is why they store to different locations. We are using them to match
# 'git commit' syntax.
group = parser.add_option_group(name + " hooks")
group.add_option(
"--no-verify",
dest="bypass_hooks",
action="store_true",
help="Do not run the %s hook." % name,
)
group.add_option(
"--verify",
dest="allow_all_hooks",
action="store_true",
help="Run the %s hook without prompting." % name,
)
group.add_option(
"--ignore-hooks",
action="store_true",
help="Do not abort if %s hooks fail." % name,
)
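The `hook_should_take_kwargs` convention enforced above means a hook's `main()` must accept `**kwargs`, so the hook API can grow new arguments later without breaking existing hooks. A minimal sketch of that calling convention (the names here are illustrative, not repo's public API):

```python
def hook_main(project_list, **kwargs):
    """A well-formed pre-upload style hook: unknown keys land in **kwargs."""
    return list(project_list), kwargs.get("hook_should_take_kwargs")


# Mirrors how _ExecuteHookViaImport invokes the hook: context["main"](**kwargs).
context = {"main": hook_main}
kwargs = {"project_list": ["a", "b"], "hook_should_take_kwargs": True}
result = context["main"](**kwargs)
```

A hook that declared `def main(project_list):` without `**kwargs` would raise a `TypeError` here the moment a new keyword is added, which is exactly what the reminder argument guards against.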


@@ -1,5 +1,8 @@
 #!/bin/sh
-# From Gerrit Code Review 3.6.1 c67916dbdc07555c44e32a68f92ffc484b9b34f0
+# DO NOT EDIT THIS FILE
+# All updates should be sent upstream: https://gerrit.googlesource.com/gerrit/
+# This is synced from commit: 62f5bbea67f6dafa6e22a601a0c298214c510caf
+# DO NOT EDIT THIS FILE
 #
 # Part of Gerrit Code Review (https://www.gerritcodereview.com/)
 #
@@ -31,14 +34,20 @@ if test ! -f "$1" ; then
 fi

 # Do not create a change id if requested
-if test "false" = "$(git config --bool --get gerrit.createChangeId)" ; then
-	exit 0
-fi
-
-# Do not create a change id for squash commits.
-if head -n1 "$1" | grep -q '^squash! '; then
-	exit 0
-fi
+case "$(git config --get gerrit.createChangeId)" in
+  false)
+	exit 0
+	;;
+  always)
+	;;
+  *)
+	# Do not create a change id for squash/fixup commits.
+	if head -n1 "$1" | LC_ALL=C grep -q '^[a-z][a-z]*! '; then
+		exit 0
+	fi
+	;;
+esac

 if git rev-parse --verify HEAD >/dev/null 2>&1; then
 	refhash="$(git rev-parse HEAD)"
@@ -51,7 +60,7 @@ dest="$1.tmp.${random}"
 trap 'rm -f "$dest" "$dest-2"' EXIT

-if ! git stripspace --strip-comments < "$1" > "${dest}" ; then
+if ! cat "$1" | sed -e '/>8/q' | git stripspace --strip-comments > "${dest}" ; then
 	echo "cannot strip comments from $1"
 	exit 1
 fi
@@ -65,7 +74,7 @@ reviewurl="$(git config --get gerrit.reviewUrl)"
 if test -n "${reviewurl}" ; then
 	token="Link"
 	value="${reviewurl%/}/id/I$random"
-	pattern=".*/id/I[0-9a-f]\{40\}$"
+	pattern=".*/id/I[0-9a-f]\{40\}"
 else
 	token="Change-Id"
 	value="I$random"
@@ -92,7 +101,7 @@ fi
 # Avoid the --where option which only appeared in Git 2.15
 if ! git -c trailer.where=before interpret-trailers \
 	--trailer "Signed-off-by: $token: $value" < "$dest-2" |
-	sed -re "s/^Signed-off-by: ($token: )/\1/" \
+	sed -e "s/^Signed-off-by: \($token: \)/\1/" \
 	-e "/^Signed-off-by: SENTINEL/d" > "$dest" ; then
 	echo "cannot insert $token line in $1"
 	exit 1
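Note that the new `case` branch also widens the old `^squash! ` test: `^[a-z][a-z]*! ` skips any lowercase autosquash prefix (`squash!`, `fixup!`, `amend!`). A rough Python rendition of that first-line check, purely for illustration:

```python
import re

# Equivalent of: head -n1 "$1" | LC_ALL=C grep -q '^[a-z][a-z]*! '
autosquash = re.compile(r"^[a-z]+! ")

first_lines = ["squash! fix typo", "fixup! refactor", "Add feature"]
skipped = [line for line in first_lines if autosquash.match(line)]
```

Only commits whose subject starts with a lowercase word followed by `! ` are skipped; ordinary subjects still get a Change-Id.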


@@ -1,33 +1,25 @@
 #!/bin/sh
+# DO NOT EDIT THIS FILE
+# All updates should be sent upstream: https://github.com/git/git
+# This is synced from commit: 00e10ef10e161a913893b8cb33aa080d4ca5baa6
+# DO NOT EDIT THIS FILE
 #
 # An example hook script to verify if you are on battery, in case you
-# are running Windows, Linux or OS X. Called by git-gc --auto with no
-# arguments. The hook should exit with non-zero status after issuing an
-# appropriate message if it wants to stop the auto repacking.
-# This program is free software; you can redistribute it and/or modify
-# it under the terms of the GNU General Public License as published by
-# the Free Software Foundation; either version 2 of the License, or
-# (at your option) any later version.
+# are running Linux or OS X. Called by git-gc --auto with no arguments.
+# The hook should exit with non-zero status after issuing an appropriate
+# message if it wants to stop the auto repacking.
 #
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
+# This hook is stored in the contrib/hooks directory. Your distribution
+# may have put this somewhere else. If you want to use this hook, you
+# should make this script executable then link to it in the repository
+# you would like to use it in.
 #
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software
-# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
-#
-if uname -s | grep -q "_NT-"
-then
-	if test -x $SYSTEMROOT/System32/Wbem/wmic
-	then
-		STATUS=$(wmic path win32_battery get batterystatus /format:list | tr -d '\r\n')
-		[ "$STATUS" = "BatteryStatus=2" ] && exit 0 || exit 1
-	fi
-	exit 0
-fi
+# For example, if the hook is stored in
+# /usr/share/git-core/contrib/hooks/pre-auto-gc-battery:
+#
+# cd /path/to/your/repository.git
+# ln -sf /usr/share/git-core/contrib/hooks/pre-auto-gc-battery \
+#	hooks/pre-auto-gc

 if test -x /sbin/on_ac_power && (/sbin/on_ac_power;test $? -ne 1)
 then
@@ -48,11 +40,6 @@ elif test -x /usr/bin/pmset && /usr/bin/pmset -g batt |
 	grep -q "drawing from 'AC Power'"
 then
 	exit 0
-elif test -d /sys/bus/acpi/drivers/battery && test 0 = \
-	"$(find /sys/bus/acpi/drivers/battery/ -type l | wc -l)";
-then
-	# No battery exists.
-	exit 0
 fi

 echo "Auto packing deferred; not on AC"

main.py (1283 lines changed)

File diff suppressed because it is too large.


@@ -1,44 +0,0 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "July 2022" "repo gitc-delete" "Repo Manual"
.SH NAME
repo \- repo gitc-delete - manual page for repo gitc-delete
.SH SYNOPSIS
.B repo
\fI\,gitc-delete\/\fR
.SH DESCRIPTION
Summary
.PP
Delete a GITC Client.
.SH OPTIONS
.TP
\fB\-h\fR, \fB\-\-help\fR
show this help message and exit
.TP
\fB\-f\fR, \fB\-\-force\fR
force the deletion (no prompt)
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR
show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help gitc\-delete` to view the detailed manual.
.SH DETAILS
.PP
This subcommand deletes the current GITC client, deleting the GITC manifest and
all locally downloaded sources.


@@ -1,175 +0,0 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "October 2022" "repo gitc-init" "Repo Manual"
.SH NAME
repo \- repo gitc-init - manual page for repo gitc-init
.SH SYNOPSIS
.B repo
\fI\,gitc-init \/\fR[\fI\,options\/\fR] [\fI\,client name\/\fR]
.SH DESCRIPTION
Summary
.PP
Initialize a GITC Client.
.SH OPTIONS
.TP
\fB\-h\fR, \fB\-\-help\fR
show this help message and exit
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR
show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Manifest options:
.TP
\fB\-u\fR URL, \fB\-\-manifest\-url\fR=\fI\,URL\/\fR
manifest repository location
.TP
\fB\-b\fR REVISION, \fB\-\-manifest\-branch\fR=\fI\,REVISION\/\fR
manifest branch or revision (use HEAD for default)
.TP
\fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
initial manifest file
.TP
\fB\-g\fR GROUP, \fB\-\-groups\fR=\fI\,GROUP\/\fR
restrict manifest projects to ones with specified
group(s) [default|all|G1,G2,G3|G4,\-G5,\-G6]
.TP
\fB\-p\fR PLATFORM, \fB\-\-platform\fR=\fI\,PLATFORM\/\fR
restrict manifest projects to ones with a specified
platform group [auto|all|none|linux|darwin|...]
.TP
\fB\-\-submodules\fR
sync any submodules associated with the manifest repo
.TP
\fB\-\-standalone\-manifest\fR
download the manifest as a static file rather then
create a git checkout of the manifest repo
.TP
\fB\-\-manifest\-depth\fR=\fI\,DEPTH\/\fR
create a shallow clone of the manifest repo with given
depth (0 for full clone); see git clone (default: 0)
.SS Manifest (only) checkout options:
.TP
\fB\-\-current\-branch\fR
fetch only current manifest branch from server
(default)
.TP
\fB\-\-no\-current\-branch\fR
fetch all manifest branches from server
.TP
\fB\-\-tags\fR
fetch tags in the manifest
.TP
\fB\-\-no\-tags\fR
don't fetch tags in the manifest
.SS Checkout modes:
.TP
\fB\-\-mirror\fR
create a replica of the remote repositories rather
than a client working directory
.TP
\fB\-\-archive\fR
checkout an archive instead of a git repository for
each project. See git archive.
.TP
\fB\-\-worktree\fR
use git\-worktree to manage projects
.SS Project checkout optimizations:
.TP
\fB\-\-reference\fR=\fI\,DIR\/\fR
location of mirror directory
.TP
\fB\-\-dissociate\fR
dissociate from reference mirrors after clone
.TP
\fB\-\-depth\fR=\fI\,DEPTH\/\fR
create a shallow clone with given depth; see git clone
.TP
\fB\-\-partial\-clone\fR
perform partial clone (https://gitscm.com/docs/gitrepositorylayout#_code_partialclone_code)
.TP
\fB\-\-no\-partial\-clone\fR
disable use of partial clone (https://gitscm.com/docs/gitrepositorylayout#_code_partialclone_code)
.TP
\fB\-\-partial\-clone\-exclude\fR=\fI\,PARTIAL_CLONE_EXCLUDE\/\fR
exclude the specified projects (a comma\-delimited
project names) from partial clone (https://gitscm.com/docs/gitrepositorylayout#_code_partialclone_code)
.TP
\fB\-\-clone\-filter\fR=\fI\,CLONE_FILTER\/\fR
filter for use with \fB\-\-partial\-clone\fR [default:
blob:none]
.TP
\fB\-\-use\-superproject\fR
use the manifest superproject to sync projects;
implies \fB\-c\fR
.TP
\fB\-\-no\-use\-superproject\fR
disable use of manifest superprojects
.TP
\fB\-\-clone\-bundle\fR
enable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
not \fB\-\-partial\-clone\fR)
.TP
\fB\-\-no\-clone\-bundle\fR
disable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
\fB\-\-partial\-clone\fR)
.TP
\fB\-\-git\-lfs\fR
enable Git LFS support
.TP
\fB\-\-no\-git\-lfs\fR
disable Git LFS support
.SS repo Version options:
.TP
\fB\-\-repo\-url\fR=\fI\,URL\/\fR
repo repository location ($REPO_URL)
.TP
\fB\-\-repo\-rev\fR=\fI\,REV\/\fR
repo branch or revision ($REPO_REV)
.TP
\fB\-\-no\-repo\-verify\fR
do not verify repo source code
.SS Other options:
.TP
\fB\-\-config\-name\fR
Always prompt for name/e\-mail
.SS GITC options:
.TP
\fB\-f\fR MANIFEST_FILE, \fB\-\-manifest\-file\fR=\fI\,MANIFEST_FILE\/\fR
Optional manifest file to use for this GITC client.
.TP
\fB\-c\fR GITC_CLIENT, \fB\-\-gitc\-client\fR=\fI\,GITC_CLIENT\/\fR
Name of the gitc_client instance to create or modify.
.SS Multi\-manifest:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help gitc\-init` to view the detailed manual.
.SH DETAILS
.PP
The 'repo gitc\-init' command is ran to initialize a new GITC client for use with
the GITC file system.
.PP
This command will setup the client directory, initialize repo, just like repo
init does, and then downloads the manifest collection and installs it in the
\&.repo/directory of the GITC client.
.PP
Once this is done, a GITC manifest is generated by pulling the HEAD SHA for each
project and generates the properly formatted XML file and installs it as
\&.manifest in the GITC client directory.
.PP
The \fB\-c\fR argument is required to specify the GITC client name.
.PP
The optional \fB\-f\fR argument can be used to specify the manifest file to use for
this GITC client.


@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "October 2022" "repo init" "Repo Manual"
+.TH REPO "1" "September 2024" "repo init" "Repo Manual"
 .SH NAME
 repo \- repo init - manual page for repo init
 .SH SYNOPSIS
@@ -28,6 +28,11 @@ manifest repository location
 \fB\-b\fR REVISION, \fB\-\-manifest\-branch\fR=\fI\,REVISION\/\fR
 manifest branch or revision (use HEAD for default)
 .TP
+\fB\-\-manifest\-upstream\-branch\fR=\fI\,BRANCH\/\fR
+when a commit is provided to \fB\-\-manifest\-branch\fR, this
+is the name of the git ref in which the commit can be
+found
+.TP
 \fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
 initial manifest file
 .TP
@@ -163,6 +168,10 @@ The optional \fB\-b\fR argument can be used to select the manifest branch to che
 and use. If no branch is specified, the remote's default branch is used. This is
 equivalent to using \fB\-b\fR HEAD.
 .PP
+The optional \fB\-\-manifest\-upstream\-branch\fR argument can be used when a commit is
+provided to \fB\-\-manifest\-branch\fR (or \fB\-b\fR), to specify the name of the git ref in
+which the commit can be found.
+.PP
 The optional \fB\-m\fR argument can be used to specify an alternate manifest to be
 used. If no manifest is specified, the manifest default.xml will be used.
 .PP


@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "October 2022" "repo manifest" "Repo Manual"
+.TH REPO "1" "April 2024" "repo manifest" "Repo Manual"
 .SH NAME
 repo \- repo manifest - manual page for repo manifest
 .SH SYNOPSIS
@@ -194,8 +194,9 @@ CDATA #IMPLIED>
 <!ATTLIST extend\-project upstream CDATA #IMPLIED>
 .IP
 <!ELEMENT remove\-project EMPTY>
-<!ATTLIST remove\-project name CDATA #REQUIRED>
+<!ATTLIST remove\-project name CDATA #IMPLIED>
+<!ATTLIST remove\-project path CDATA #IMPLIED>
 <!ATTLIST remove\-project optional CDATA #IMPLIED>
 .IP
 <!ELEMENT repo\-hooks EMPTY>
 <!ATTLIST repo\-hooks in\-project CDATA #REQUIRED>
@@ -210,8 +211,9 @@ CDATA #IMPLIED>
 <!ATTLIST contactinfo bugurl CDATA #REQUIRED>
 .IP
 <!ELEMENT include EMPTY>
 <!ATTLIST include name CDATA #REQUIRED>
 <!ATTLIST include groups CDATA #IMPLIED>
+<!ATTLIST include revision CDATA #IMPLIED>
 .PP
 ]>
 ```
@@ -533,13 +535,24 @@ the repo client.
 .PP
 Element remove\-project
 .PP
-Deletes the named project from the internal manifest table, possibly allowing a
+Deletes a project from the internal manifest table, possibly allowing a
 subsequent project element in the same manifest file to replace the project with
 a different source.
 .PP
 This element is mostly useful in a local manifest file, where the user can
 remove a project, and possibly replace it with their own definition.
 .PP
+The project `name` or project `path` can be used to specify the remove target
+meaning one of them is required. If only name is specified, all projects with
+that name are removed.
+.PP
+If both name and path are specified, only projects with the same name and path
+are removed, meaning projects with the same name but in other locations are
+kept.
+.PP
+If only path is specified, a matching project is removed regardless of its name.
+Logic otherwise behaves like both are specified.
+.PP
 Attribute `optional`: Set to true to ignore remove\-project elements with no
 matching `project` element.
 .PP
@@ -608,7 +621,10 @@ included manifest belong. This appends and recurses, meaning all projects in
 included manifests carry all parent include groups. Same syntax as the
 corresponding element of `project`.
 .PP
+Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`)
+default to which all projects in the included manifest belong.
+.PP
 Local Manifests
 .PP
 Additional remotes and projects may be added through local manifest files stored
 in `$TOP_DIR/.repo/local_manifests/*.xml`.
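The `remove-project` name/path rules documented above reduce to a simple match: a missing attribute is a wildcard, and any attribute that is present must match exactly. A hedged sketch of those documented semantics (not repo's actual implementation):

```python
def remove_matches(project, name=None, path=None):
    """True if a <remove-project name=... path=...> element targets project."""
    if name is not None and project["name"] != name:
        return False
    if path is not None and project["path"] != path:
        return False
    return True


projects = [
    {"name": "platform/art", "path": "art"},
    {"name": "platform/art", "path": "mirrors/art"},
]
# name + path given: only the co-located checkout is removed; the project
# with the same name at another path is kept.
removed = [
    p for p in projects if remove_matches(p, name="platform/art", path="art")
]
```

With only `name="platform/art"`, both entries would match and be removed; with only `path`, the name is ignored entirely.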


@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "November 2022" "repo smartsync" "Repo Manual"
+.TH REPO "1" "September 2024" "repo smartsync" "Repo Manual"
 .SH NAME
 repo \- repo smartsync - manual page for repo smartsync
 .SH SYNOPSIS
@@ -37,11 +37,20 @@ overwrite an existing git directory if it needs to
 point to a different object directory. WARNING: this
 may cause loss of data
 .TP
+\fB\-\-force\-checkout\fR
+force checkout even if it results in throwing away
+uncommitted modifications. WARNING: this may cause
+loss of data
+.TP
 \fB\-\-force\-remove\-dirty\fR
 force remove projects with uncommitted modifications
 if projects no longer exist in the manifest. WARNING:
 this may cause loss of data
 .TP
+\fB\-\-rebase\fR
+rebase local commits regardless of whether they are
+published
+.TP
 \fB\-l\fR, \fB\-\-local\-only\fR
 only update working tree, don't fetch
 .TP


@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "November 2022" "repo sync" "Repo Manual"
+.TH REPO "1" "September 2024" "repo sync" "Repo Manual"
 .SH NAME
 repo \- repo sync - manual page for repo sync
 .SH SYNOPSIS
@@ -37,11 +37,20 @@ overwrite an existing git directory if it needs to
 point to a different object directory. WARNING: this
 may cause loss of data
 .TP
+\fB\-\-force\-checkout\fR
+force checkout even if it results in throwing away
+uncommitted modifications. WARNING: this may cause
+loss of data
+.TP
 \fB\-\-force\-remove\-dirty\fR
 force remove projects with uncommitted modifications
 if projects no longer exist in the manifest. WARNING:
 this may cause loss of data
 .TP
+\fB\-\-rebase\fR
+rebase local commits regardless of whether they are
+published
+.TP
 \fB\-l\fR, \fB\-\-local\-only\fR
 only update working tree, don't fetch
 .TP
@@ -185,6 +194,11 @@ The \fB\-\-force\-sync\fR option can be used to overwrite existing git directori
 they have previously been linked to a different object directory. WARNING: This
 may cause data to be lost since refs may be removed when overwriting.
 .PP
+The \fB\-\-force\-checkout\fR option can be used to force git to switch revs even if the
+index or the working tree differs from HEAD, and if there are untracked files.
+WARNING: This may cause data to be lost since uncommitted changes may be
+removed.
+.PP
 The \fB\-\-force\-remove\-dirty\fR option can be used to remove previously used projects
 with uncommitted changes. WARNING: This may cause data to be lost since
 uncommitted changes may be removed with projects that no longer exist in the


@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "August 2022" "repo upload" "Repo Manual"
+.TH REPO "1" "June 2024" "repo upload" "Repo Manual"
 .SH NAME
 repo \- repo upload - manual page for repo upload
 .SH SYNOPSIS
@@ -18,8 +18,11 @@ show this help message and exit
 number of jobs to run in parallel (default: based on
 number of CPU cores)
 .TP
-\fB\-t\fR
-send local branch name to Gerrit Code Review
+\fB\-t\fR, \fB\-\-topic\-branch\fR
+set the topic to the local branch name
+.TP
+\fB\-\-topic\fR=\fI\,TOPIC\/\fR
+set topic for the change
 .TP
 \fB\-\-hashtag\fR=\fI\,HASHTAGS\/\fR, \fB\-\-ht\fR=\fI\,HASHTAGS\/\fR
 add hashtags (comma delimited) to the review
@@ -30,6 +33,9 @@ add local branch name as a hashtag
 \fB\-l\fR LABELS, \fB\-\-label\fR=\fI\,LABELS\/\fR
 add a label when uploading
 .TP
+\fB\-\-pd\fR=\fI\,PATCHSET_DESCRIPTION\/\fR, \fB\-\-patchset\-description\fR=\fI\,PATCHSET_DESCRIPTION\/\fR
+description for patchset
+.TP
 \fB\-\-re\fR=\fI\,REVIEWERS\/\fR, \fB\-\-reviewers\fR=\fI\,REVIEWERS\/\fR
 request reviews from these people
 .TP
@@ -198,6 +204,12 @@ review.URL.uploadnotify:
 Control e\-mail notifications when uploading.
 https://gerrit\-review.googlesource.com/Documentation/user\-upload.html#notify
 .PP
+review.URL.uploadwarningthreshold:
+.PP
+Repo will warn you if you are attempting to upload a large number of commits in
+one or more branches. By default, the threshold is five commits. This option
+allows you to override the warning threshold to a different value.
+.PP
 References
 .PP
 Gerrit Code Review: https://www.gerritcodereview.com/
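The `uploadwarningthreshold` behaviour described above amounts to a simple count check before uploading. A sketch of the documented default of five commits, for illustration only (not repo's actual code):

```python
DEFAULT_UPLOAD_WARNING_THRESHOLD = 5


def should_warn(num_commits, threshold=DEFAULT_UPLOAD_WARNING_THRESHOLD):
    """Warn when a branch is about to upload more commits than expected."""
    return num_commits > threshold
```

Setting `review.URL.uploadwarningthreshold` to a larger value corresponds to raising `threshold`, which suppresses the prompt for bigger batches.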


@@ -1,5 +1,5 @@
 .\" DO NOT MODIFY THIS FILE! It was generated by help2man.
-.TH REPO "1" "November 2022" "repo" "Repo Manual"
+.TH REPO "1" "April 2024" "repo" "Repo Manual"
 .SH NAME
 repo \- repository management tool built on top of git
 .SH SYNOPSIS
@@ -79,12 +79,6 @@ Download and checkout a change
 forall
 Run a shell command in each project
 .TP
-gitc\-delete
-Delete a GITC Client.
-.TP
-gitc\-init
-Initialize a GITC Client.
-.TP
 grep
 Print lines matching a pattern
 .TP
@@ -137,4 +131,4 @@ version
 Display the version of repo
 .PP
 See 'repo help <command>' for more information on a specific command.
-Bug reports: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
+Bug reports: https://issues.gerritcodereview.com/issues/new?component=1370071

File diff suppressed because it is too large.

pager.py (155 lines changed)

@ -19,6 +19,7 @@ import sys

import platform_utils

active = False
pager_process = None
old_stdout = None

@ -26,102 +27,104 @@ old_stderr = None


def RunPager(globalConfig):
    if not os.isatty(0) or not os.isatty(1):
        return

    pager = _SelectPager(globalConfig)
    if pager == "" or pager == "cat":
        return

    if platform_utils.isWindows():
        _PipePager(pager)
    else:
        _ForkPager(pager)


def TerminatePager():
    global pager_process, old_stdout, old_stderr
    if pager_process:
        sys.stdout.flush()
        sys.stderr.flush()
        pager_process.stdin.close()
        pager_process.wait()
        pager_process = None
        # Restore initial stdout/err in case there is more output in this
        # process after shutting down the pager process.
        sys.stdout = old_stdout
        sys.stderr = old_stderr


def _PipePager(pager):
    global pager_process, old_stdout, old_stderr
    assert pager_process is None, "Only one active pager process at a time"
    # Create pager process, piping stdout/err into its stdin.
    try:
        pager_process = subprocess.Popen(
            [pager], stdin=subprocess.PIPE, stdout=sys.stdout, stderr=sys.stderr
        )
    except FileNotFoundError:
        sys.exit(f'fatal: cannot start pager "{pager}"')
    old_stdout = sys.stdout
    old_stderr = sys.stderr
    sys.stdout = pager_process.stdin
    sys.stderr = pager_process.stdin


def _ForkPager(pager):
    global active
    # This process turns into the pager; a child it forks will
    # do the real processing and output back to the pager. This
    # is necessary to keep the pager in control of the tty.
    try:
        r, w = os.pipe()
        pid = os.fork()
        if not pid:
            os.dup2(w, 1)
            os.dup2(w, 2)
            os.close(r)
            os.close(w)
            active = True
            return

        os.dup2(r, 0)
        os.close(r)
        os.close(w)
        _BecomePager(pager)
    except Exception:
        print("fatal: cannot start pager '%s'" % pager, file=sys.stderr)
        sys.exit(255)


def _SelectPager(globalConfig):
    try:
        return os.environ["GIT_PAGER"]
    except KeyError:
        pass

    pager = globalConfig.GetString("core.pager")
    if pager:
        return pager

    try:
        return os.environ["PAGER"]
    except KeyError:
        pass

    return "less"


def _BecomePager(pager):
    # Delaying execution of the pager until we have output
    # ready works around a long-standing bug in popularly
    # available versions of 'less', a better 'more'.
    _a, _b, _c = select.select([0], [], [0])

    # This matches the behavior of git, which sets $LESS to `FRX` if it is not
    # set. See:
    # https://git-scm.com/docs/git-config#Documentation/git-config.txt-corepager
    os.environ.setdefault("LESS", "FRX")

    try:
        os.execvp(pager, [pager])
    except OSError:
        os.execv("/bin/sh", ["sh", "-c", pager])
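The pager-selection precedence above ($GIT_PAGER first, then git's core.pager setting, then $PAGER, with a final fallback to less) can be sketched on its own. This is a minimal, hypothetical rework for illustration: `select_pager` and `config_pager` are invented names, and skipping empty strings is a simplification of the original's KeyError handling.

```python
import os


def select_pager(config_pager=None):
    """Resolve a pager the way repo's _SelectPager does: $GIT_PAGER first,
    then the git config value (core.pager), then $PAGER, else "less".

    Simplification: empty candidates are skipped here, while the original
    returns $GIT_PAGER verbatim even when it is set to an empty string.
    """
    for candidate in (
        os.environ.get("GIT_PAGER"),
        config_pager,
        os.environ.get("PAGER"),
    ):
        if candidate:
            return candidate
    return "less"


# With no environment overrides, the fallback is "less"; a config value
# beats $PAGER but loses to $GIT_PAGER.
os.environ.pop("GIT_PAGER", None)
os.environ.pop("PAGER", None)
print(select_pager())       # less
print(select_pager("cat"))  # cat
```

RunPager then treats an empty or "cat" result as "no pager", which is why the sketch's fallback matters: a user can disable paging entirely by exporting GIT_PAGER=cat.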

platform_utils.py
View File

@ -20,246 +20,234 @@ import stat


def isWindows():
    """Returns True when running with the native port of Python for Windows,
    False when running on any other platform (including the Cygwin port of
    Python).
    """
    # Note: The cygwin port of Python returns "CYGWIN_NT_xxx"
    return platform.system() == "Windows"


def symlink(source, link_name):
    """Creates a symbolic link pointing to source named link_name.

    Note: On Windows, source must exist on disk, as the implementation needs
    to know whether to create a "File" or a "Directory" symbolic link.
    """
    if isWindows():
        import platform_utils_win32

        source = _validate_winpath(source)
        link_name = _validate_winpath(link_name)
        target = os.path.join(os.path.dirname(link_name), source)
        if isdir(target):
            platform_utils_win32.create_dirsymlink(
                _makelongpath(source), link_name
            )
        else:
            platform_utils_win32.create_filesymlink(
                _makelongpath(source), link_name
            )
    else:
        return os.symlink(source, link_name)


def _validate_winpath(path):
    path = os.path.normpath(path)
    if _winpath_is_valid(path):
        return path
    raise ValueError(
        f'Path "{path}" must be a relative path or an absolute '
        "path starting with a drive letter"
    )


def _winpath_is_valid(path):
    """Windows only: returns True if path is relative (e.g. ".\\foo") or is
    absolute including a drive letter (e.g. "c:\\foo"). Returns False if path
    is ambiguous (e.g. "x:foo" or "\\foo").
    """
    assert isWindows()
    path = os.path.normpath(path)
    drive, tail = os.path.splitdrive(path)
    if tail:
        if not drive:
            return tail[0] != os.sep  # "\\foo" is invalid
        else:
            return tail[0] == os.sep  # "x:foo" is invalid
    else:
        return not drive  # "x:" is invalid


def _makelongpath(path):
    """Return the input path normalized to support the Windows long path syntax
    ("\\\\?\\" prefix) if needed, i.e. if the input path is longer than the
    MAX_PATH limit.
    """
    if isWindows():
        # Note: MAX_PATH is 260, but, for directories, the maximum value is
        # actually 246.
        if len(path) < 246:
            return path
        if path.startswith("\\\\?\\"):
            return path
        if not os.path.isabs(path):
            return path

        # Append prefix and ensure unicode so that the special longpath syntax
        # is supported by underlying Win32 API calls
        return "\\\\?\\" + os.path.normpath(path)
    else:
        return path


def rmtree(path, ignore_errors=False):
    """shutil.rmtree(path) wrapper with support for long paths on Windows.

    Availability: Unix, Windows.
    """
    onerror = None
    if isWindows():
        path = _makelongpath(path)
        onerror = handle_rmtree_error
    shutil.rmtree(path, ignore_errors=ignore_errors, onerror=onerror)


def handle_rmtree_error(function, path, excinfo):
    # Allow deleting read-only files.
    os.chmod(path, stat.S_IWRITE)
    function(path)


def rename(src, dst):
    """os.rename(src, dst) wrapper with support for long paths on Windows.

    Availability: Unix, Windows.
    """
    if isWindows():
        # On Windows, rename fails if destination exists, see
        # https://docs.python.org/2/library/os.html#os.rename
        try:
            os.rename(_makelongpath(src), _makelongpath(dst))
        except OSError as e:
            if e.errno == errno.EEXIST:
                os.remove(_makelongpath(dst))
                os.rename(_makelongpath(src), _makelongpath(dst))
            else:
                raise
    else:
        shutil.move(src, dst)


def remove(path, missing_ok=False):
    """Remove (delete) the file path. This is a replacement for os.remove that
    allows deleting read-only files on Windows, with support for long paths and
    for deleting directory symbolic links.

    Availability: Unix, Windows.
    """
    longpath = _makelongpath(path) if isWindows() else path
    try:
        os.remove(longpath)
    except OSError as e:
        if e.errno == errno.EACCES:
            os.chmod(longpath, stat.S_IWRITE)
            # Directory symbolic links must be deleted with 'rmdir'.
            if islink(longpath) and isdir(longpath):
                os.rmdir(longpath)
            else:
                os.remove(longpath)
        elif missing_ok and e.errno == errno.ENOENT:
            pass
        else:
            raise


def walk(top, topdown=True, onerror=None, followlinks=False):
    """os.walk(path) wrapper with support for long paths on Windows.

    Availability: Windows, Unix.
    """
    if isWindows():
        return _walk_windows_impl(top, topdown, onerror, followlinks)
    else:
        return os.walk(top, topdown, onerror, followlinks)


def _walk_windows_impl(top, topdown, onerror, followlinks):
    try:
        names = listdir(top)
    except Exception as err:
        if onerror is not None:
            onerror(err)
        return

    dirs, nondirs = [], []
    for name in names:
        if isdir(os.path.join(top, name)):
            dirs.append(name)
        else:
            nondirs.append(name)

    if topdown:
        yield top, dirs, nondirs
    for name in dirs:
        new_path = os.path.join(top, name)
        if followlinks or not islink(new_path):
            yield from _walk_windows_impl(
                new_path, topdown, onerror, followlinks
            )
    if not topdown:
        yield top, dirs, nondirs


def listdir(path):
    """os.listdir(path) wrapper with support for long paths on Windows.

    Availability: Windows, Unix.
    """
    return os.listdir(_makelongpath(path))


def rmdir(path):
    """os.rmdir(path) wrapper with support for long paths on Windows.

    Availability: Windows, Unix.
    """
    os.rmdir(_makelongpath(path))


def isdir(path):
    """os.path.isdir(path) wrapper with support for long paths on Windows.

    Availability: Windows, Unix.
    """
    return os.path.isdir(_makelongpath(path))


def islink(path):
    """os.path.islink(path) wrapper with support for long paths on Windows.

    Availability: Windows, Unix.
    """
    if isWindows():
        import platform_utils_win32

        return platform_utils_win32.islink(_makelongpath(path))
    else:
        return os.path.islink(path)


def readlink(path):
    """Return a string representing the path to which the symbolic link
    points. The result may be either an absolute or relative pathname;
    if it is relative, it may be converted to an absolute pathname using
    os.path.join(os.path.dirname(path), result).

    Availability: Windows, Unix.
    """
    if isWindows():
        import platform_utils_win32

        return platform_utils_win32.readlink(_makelongpath(path))
    else:
        return os.readlink(path)
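The core decision in _makelongpath (prefix with "\\?\" only when the path is long, absolute, and not already prefixed) can be isolated into a standalone sketch. `add_longpath_prefix` and its `is_windows` flag are hypothetical; the drive-letter test stands in for the original's os.path.isabs/normpath handling so the sketch runs on any platform.

```python
def add_longpath_prefix(path, is_windows=True):
    """Sketch of repo's _makelongpath decision: add the "\\\\?\\" long-path
    prefix only for absolute Windows paths at or beyond the ~246 character
    directory limit. Short, relative, and already-prefixed paths pass through.
    """
    if not is_windows:
        return path
    if len(path) < 246:
        return path
    if path.startswith("\\\\?\\"):
        return path
    # Simplified absoluteness check: require "X:\" (a drive letter root).
    if not (len(path) >= 3 and path[1] == ":" and path[2] == "\\"):
        return path  # relative paths are left untouched
    return "\\\\?\\" + path
```

The pass-through for short paths is why every wrapper above (listdir, rmdir, isdir, ...) can call the helper unconditionally: on ordinary paths it is a no-op, so the Win32 long-path syntax only appears where it is actually needed.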

platform_utils_win32.py
View File

@ -12,14 +12,30 @@
# See the License for the specific language governing permissions and
# limitations under the License.

from ctypes import addressof
from ctypes import byref
from ctypes import c_buffer
from ctypes import c_ubyte
from ctypes import FormatError
from ctypes import get_last_error
from ctypes import Structure
from ctypes import Union
from ctypes import WinDLL
from ctypes import WinError
from ctypes.wintypes import BOOL
from ctypes.wintypes import BOOLEAN
from ctypes.wintypes import DWORD
from ctypes.wintypes import HANDLE
from ctypes.wintypes import LPCWSTR
from ctypes.wintypes import LPDWORD
from ctypes.wintypes import LPVOID
from ctypes.wintypes import ULONG
from ctypes.wintypes import USHORT
from ctypes.wintypes import WCHAR
import errno

kernel32 = WinDLL("kernel32", use_last_error=True)

UCHAR = c_ubyte

@ -31,14 +47,17 @@ ERROR_PRIVILEGE_NOT_HELD = 1314

# Win32 API entry points
CreateSymbolicLinkW = kernel32.CreateSymbolicLinkW
CreateSymbolicLinkW.restype = BOOLEAN
CreateSymbolicLinkW.argtypes = (
    LPCWSTR,  # lpSymlinkFileName In
    LPCWSTR,  # lpTargetFileName In
    DWORD,  # dwFlags In
)

# Symbolic link creation flags
SYMBOLIC_LINK_FLAG_FILE = 0x00
SYMBOLIC_LINK_FLAG_DIRECTORY = 0x01
# symlink support for CreateSymbolicLink() starting with Windows 10 (1703,
# v10.0.14972)
SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE = 0x02

GetFileAttributesW = kernel32.GetFileAttributesW

@ -50,13 +69,15 @@ FILE_ATTRIBUTE_REPARSE_POINT = 0x00400

CreateFileW = kernel32.CreateFileW
CreateFileW.restype = HANDLE
CreateFileW.argtypes = (
    LPCWSTR,  # lpFileName In
    DWORD,  # dwDesiredAccess In
    DWORD,  # dwShareMode In
    LPVOID,  # lpSecurityAttributes In_opt
    DWORD,  # dwCreationDisposition In
    DWORD,  # dwFlagsAndAttributes In
    HANDLE,  # hTemplateFile In_opt
)

CloseHandle = kernel32.CloseHandle
CloseHandle.restype = BOOL

@ -69,14 +90,16 @@ FILE_FLAG_OPEN_REPARSE_POINT = 0x00200000

DeviceIoControl = kernel32.DeviceIoControl
DeviceIoControl.restype = BOOL
DeviceIoControl.argtypes = (
    HANDLE,  # hDevice In
    DWORD,  # dwIoControlCode In
    LPVOID,  # lpInBuffer In_opt
    DWORD,  # nInBufferSize In
    LPVOID,  # lpOutBuffer Out_opt
    DWORD,  # nOutBufferSize In
    LPDWORD,  # lpBytesReturned Out_opt
    LPVOID,  # lpOverlapped Inout_opt
)

# Device I/O control flags and options
FSCTL_GET_REPARSE_POINT = 0x000900A8

@ -86,124 +109,136 @@ MAXIMUM_REPARSE_DATA_BUFFER_SIZE = 0x4000


class GENERIC_REPARSE_BUFFER(Structure):
    _fields_ = (("DataBuffer", UCHAR * 1),)


class SYMBOLIC_LINK_REPARSE_BUFFER(Structure):
    _fields_ = (
        ("SubstituteNameOffset", USHORT),
        ("SubstituteNameLength", USHORT),
        ("PrintNameOffset", USHORT),
        ("PrintNameLength", USHORT),
        ("Flags", ULONG),
        ("PathBuffer", WCHAR * 1),
    )

    @property
    def PrintName(self):
        arrayt = WCHAR * (self.PrintNameLength // 2)
        offset = type(self).PathBuffer.offset + self.PrintNameOffset
        return arrayt.from_address(addressof(self) + offset).value


class MOUNT_POINT_REPARSE_BUFFER(Structure):
    _fields_ = (
        ("SubstituteNameOffset", USHORT),
        ("SubstituteNameLength", USHORT),
        ("PrintNameOffset", USHORT),
        ("PrintNameLength", USHORT),
        ("PathBuffer", WCHAR * 1),
    )

    @property
    def PrintName(self):
        arrayt = WCHAR * (self.PrintNameLength // 2)
        offset = type(self).PathBuffer.offset + self.PrintNameOffset
        return arrayt.from_address(addressof(self) + offset).value


class REPARSE_DATA_BUFFER(Structure):
    class REPARSE_BUFFER(Union):
        _fields_ = (
            ("SymbolicLinkReparseBuffer", SYMBOLIC_LINK_REPARSE_BUFFER),
            ("MountPointReparseBuffer", MOUNT_POINT_REPARSE_BUFFER),
            ("GenericReparseBuffer", GENERIC_REPARSE_BUFFER),
        )

    _fields_ = (
        ("ReparseTag", ULONG),
        ("ReparseDataLength", USHORT),
        ("Reserved", USHORT),
        ("ReparseBuffer", REPARSE_BUFFER),
    )
    _anonymous_ = ("ReparseBuffer",)


def create_filesymlink(source, link_name):
    """Creates a Windows file symbolic link source pointing to link_name."""
    _create_symlink(source, link_name, SYMBOLIC_LINK_FLAG_FILE)


def create_dirsymlink(source, link_name):
    """Creates a Windows directory symbolic link source pointing to link_name."""  # noqa: E501
    _create_symlink(source, link_name, SYMBOLIC_LINK_FLAG_DIRECTORY)


def _create_symlink(source, link_name, dwFlags):
    if not CreateSymbolicLinkW(
        link_name,
        source,
        dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE,
    ):
        # See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0  # noqa: E501
        # "the unprivileged create flag is unsupported below Windows 10 (1703,
        # v10.0.14972). retry without it."
        if not CreateSymbolicLinkW(link_name, source, dwFlags):
            code = get_last_error()
            error_desc = FormatError(code).strip()
            if code == ERROR_PRIVILEGE_NOT_HELD:
                raise OSError(errno.EPERM, error_desc, link_name)
            _raise_winerror(code, f'Error creating symbolic link "{link_name}"')


def islink(path):
    result = GetFileAttributesW(path)
    if result == INVALID_FILE_ATTRIBUTES:
        return False
    return bool(result & FILE_ATTRIBUTE_REPARSE_POINT)


def readlink(path):
    reparse_point_handle = CreateFileW(
        path,
        0,
        0,
        None,
        OPEN_EXISTING,
        FILE_FLAG_OPEN_REPARSE_POINT | FILE_FLAG_BACKUP_SEMANTICS,
        None,
    )
    if reparse_point_handle == INVALID_HANDLE_VALUE:
        _raise_winerror(
            get_last_error(), f'Error opening symbolic link "{path}"'
        )
    target_buffer = c_buffer(MAXIMUM_REPARSE_DATA_BUFFER_SIZE)
    n_bytes_returned = DWORD()
    io_result = DeviceIoControl(
        reparse_point_handle,
        FSCTL_GET_REPARSE_POINT,
        None,
        0,
        target_buffer,
        len(target_buffer),
        byref(n_bytes_returned),
        None,
    )
    CloseHandle(reparse_point_handle)
    if not io_result:
        _raise_winerror(
            get_last_error(), f'Error reading symbolic link "{path}"'
        )
    rdb = REPARSE_DATA_BUFFER.from_buffer(target_buffer)
    if rdb.ReparseTag == IO_REPARSE_TAG_SYMLINK:
        return rdb.SymbolicLinkReparseBuffer.PrintName
    elif rdb.ReparseTag == IO_REPARSE_TAG_MOUNT_POINT:
        return rdb.MountPointReparseBuffer.PrintName
    # Unsupported reparse point type.
    _raise_winerror(
        ERROR_NOT_SUPPORTED, f'Error reading symbolic link "{path}"'
    )


def _raise_winerror(code, error_desc):
    win_error_desc = FormatError(code).strip()
    error_desc = f"{error_desc}: {win_error_desc}"
    raise WinError(code, error_desc)
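The REPARSE_DATA_BUFFER layout above relies on ctypes' `_anonymous_` mechanism: the inner union is declared as a field, then marked anonymous so the active variant's members are reachable directly on the outer structure (rdb.SymbolicLinkReparseBuffer rather than rdb.ReparseBuffer.SymbolicLinkReparseBuffer). A minimal, platform-neutral sketch of the same pattern, with invented names and fixed-width types so it runs anywhere:

```python
from ctypes import Structure, Union, c_uint16, c_uint32


class VariantA(Structure):
    _fields_ = (("a_value", c_uint32),)


class VariantB(Structure):
    _fields_ = (("b_value", c_uint16),)


class Tagged(Structure):
    # Same shape as REPARSE_DATA_BUFFER: a tag, then a union of variants.
    class Payload(Union):
        _fields_ = (("A", VariantA), ("B", VariantB))

    _fields_ = (("Tag", c_uint32), ("Payload", Payload))
    # Marking the union anonymous exposes t.A / t.B on the outer struct.
    _anonymous_ = ("Payload",)


t = Tagged()
t.Tag = 1
t.A.a_value = 42
```

The Tag field plays the role of ReparseTag: the reader checks it first, then interprets the overlapping union bytes through the matching variant, exactly as readlink does with IO_REPARSE_TAG_SYMLINK versus IO_REPARSE_TAG_MOUNT_POINT.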

progress.py
View File

@ -14,123 +14,210 @@

import os
import sys
import time

try:
    import threading as _threading
except ImportError:
    import dummy_threading as _threading

from repo_trace import IsTraceToStderr


_TTY = sys.stderr.isatty()

# This will erase all content in the current line (wherever the cursor is).
# It does not move the cursor, so this is usually followed by \r to move to
# column 0.
CSI_ERASE_LINE = "\x1b[2K"

# This will erase all content in the current line after the cursor. This is
# useful for partial updates & progress messages as the terminal can display
# it better.
CSI_ERASE_LINE_AFTER = "\x1b[K"


def convert_to_hms(total):
    """Converts a period of seconds to hours, minutes, and seconds."""
    hours, rem = divmod(total, 3600)
    mins, secs = divmod(rem, 60)
    return int(hours), int(mins), secs


def duration_str(total):
    """A less noisy timedelta.__str__.

    The default timedelta stringification contains a lot of leading zeros and
    uses microsecond resolution. This makes for noisy output.
    """
    hours, mins, secs = convert_to_hms(total)
    ret = f"{secs:.3f}s"
    if mins:
        ret = f"{mins}m{ret}"
    if hours:
        ret = f"{hours}h{ret}"
    return ret


def elapsed_str(total):
    """Returns seconds in the format [H:]MM:SS.

    Does not display a leading zero for minutes if under 10 minutes. This should
    be used when displaying elapsed time in a progress indicator.
    """
    hours, mins, secs = convert_to_hms(total)
    ret = f"{int(secs):>02d}"
    if total >= 3600:
        # Show leading zeroes if over an hour.
        ret = f"{mins:>02d}:{ret}"
    else:
        ret = f"{mins}:{ret}"
    if hours:
        ret = f"{hours}:{ret}"
    return ret


def jobs_str(total):
    return f"{total} job{'s' if total > 1 else ''}"


class Progress:
    def __init__(
        self,
        title,
        total=0,
        units="",
        delay=True,
        quiet=False,
        show_elapsed=False,
        elide=False,
    ):
        self._title = title
        self._total = total
        self._done = 0
        self._start = time.time()
        self._show = not delay
        self._units = units
        self._elide = elide and _TTY
        self._quiet = quiet

        # Only show the active jobs section if we run more than one in parallel.
        self._show_jobs = False
        self._active = 0

        # Save the last message for displaying on refresh.
        self._last_msg = None
        self._show_elapsed = show_elapsed
        self._update_event = _threading.Event()
        self._update_thread = _threading.Thread(
            target=self._update_loop,
        )
        self._update_thread.daemon = True

        if not quiet and show_elapsed:
            self._update_thread.start()

    def _update_loop(self):
        while True:
            self.update(inc=0)
            if self._update_event.wait(timeout=1):
                return

    def _write(self, s):
        s = "\r" + s
        if self._elide:
            col = os.get_terminal_size(sys.stderr.fileno()).columns
            if len(s) > col:
                s = s[: col - 1] + ".."
        sys.stderr.write(s)
        sys.stderr.flush()

    def start(self, name):
        self._active += 1
        if not self._show_jobs:
            self._show_jobs = self._active > 1
        self.update(inc=0, msg="started " + name)

    def finish(self, name):
        self.update(msg="finished " + name)
        self._active -= 1

    def update(self, inc=1, msg=None):
        """Updates the progress indicator.

        Args:
            inc: The number of items completed.
            msg: The message to display. If None, use the last message.
        """
        self._done += inc
        if msg is None:
            msg = self._last_msg
        self._last_msg = msg

        if not _TTY or IsTraceToStderr() or self._quiet:
            return

        elapsed_sec = time.time() - self._start
        if not self._show:
            if 0.5 <= elapsed_sec:
                self._show = True
            else:
                return

        if self._total <= 0:
            self._write(
                "%s: %d,%s" % (self._title, self._done, CSI_ERASE_LINE_AFTER)
            )
        else:
            p = (100 * self._done) / self._total
            if self._show_jobs:
                jobs = f"[{jobs_str(self._active)}] "
            else:
                jobs = ""
            if self._show_elapsed:
                elapsed = f" {elapsed_str(elapsed_sec)} |"
            else:
                elapsed = ""
            self._write(
"%s: %2d%% %s(%d%s/%d%s)%s %s%s"
% (
self._title,
p,
jobs,
self._done,
self._units,
self._total,
self._units,
elapsed,
msg,
CSI_ERASE_LINE_AFTER,
)
)
def end(self):
self._update_event.set()
if not _TTY or IsTraceToStderr() or self._quiet:
return
duration = duration_str(time.time() - self._start)
if self._total <= 0:
self._write(
"%s: %d, done in %s%s\n"
% (self._title, self._done, duration, CSI_ERASE_LINE_AFTER)
)
else:
p = (100 * self._done) / self._total
self._write(
"%s: %3d%% (%d%s/%d%s), done in %s%s\n"
% (
self._title,
p,
self._done,
self._units,
self._total,
self._units,
duration,
CSI_ERASE_LINE_AFTER,
)
)
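The elapsed_str body above relies on a convert_to_hms helper defined elsewhere in progress.py; a minimal standalone sketch (the convert_to_hms body here is an assumption inferred from the call site) shows the formatting rules the docstring describes:

```python
def convert_to_hms(total):
    """Split |total| seconds into (hours, minutes, seconds).

    Assumed shape, based on how elapsed_str consumes it.
    """
    hours, rem = divmod(int(total), 3600)
    mins, secs = divmod(rem, 60)
    return hours, mins, secs


def elapsed_str(total):
    """Format |total| seconds the way the progress indicator does.

    Minutes under 10 get no leading zero; once past an hour, minutes are
    zero-padded so the H:MM:SS columns line up.
    """
    hours, mins, secs = convert_to_hms(total)
    ret = f"{int(secs):>02d}"
    if total >= 3600:
        # Show leading zeroes if over an hour.
        ret = f"{mins:>02d}:{ret}"
    else:
        ret = f"{mins}:{ret}"
    if hours:
        ret = f"{hours}:{ret}"
    return ret
```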

8134
project.py

File diff suppressed because it is too large

18
pyproject.toml Normal file

@@ -0,0 +1,18 @@
# Copyright 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
[tool.black]
line-length = 80
# NB: Keep in sync with tox.ini.
target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311'] #, 'py312'


@@ -28,43 +28,56 @@ import util

def sign(opts):
    """Sign the launcher!"""
    output = ""
    for key in opts.keys:
        # We use ! at the end of the key so that gpg uses this specific key.
        # Otherwise it uses the key as a lookup into the overall key and uses
        # the default signing key.  i.e. It will see that KEYID_RSA is a subkey
        # of another key, and use the primary key to sign instead of the subkey.
        cmd = [
            "gpg",
            "--homedir",
            opts.gpgdir,
            "-u",
            f"{key}!",
            "--batch",
            "--yes",
            "--armor",
            "--detach-sign",
            "--output",
            "-",
            opts.launcher,
        ]
        ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
        output += ret.stdout

    # Save the combined signatures into one file.
    with open(f"{opts.launcher}.asc", "w", encoding="utf-8") as fp:
        fp.write(output)


def check(opts):
    """Check the signature."""
    util.run(opts, ["gpg", "--verify", f"{opts.launcher}.asc"])


def get_version(opts):
    """Get the version from |launcher|."""
    # Make sure we don't search $PATH when signing the "repo" file in the cwd.
    launcher = os.path.join(".", opts.launcher)
    cmd = [launcher, "--version"]
    ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
    m = re.search(r"repo launcher version ([0-9.]+)", ret.stdout)
    if not m:
        sys.exit(f"{opts.launcher}: unable to detect repo version")
    return m.group(1)


def postmsg(opts, version):
    """Helpful info to show at the end for release manager."""
    print(
        f"""
Repo launcher bucket:
  gs://git-repo-downloads/
@@ -81,55 +94,72 @@ NB: If a rollback is necessary, the GS bucket archives old versions, and may be
  gsutil ls -la gs://git-repo-downloads/repo gs://git-repo-downloads/repo.asc
  gsutil cp -a public-read gs://git-repo-downloads/repo#<unique id> gs://git-repo-downloads/repo
  gsutil cp -a public-read gs://git-repo-downloads/repo.asc#<unique id> gs://git-repo-downloads/repo.asc
"""  # noqa: E501
    )
def get_parser():
    """Get a CLI parser."""
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        "-n",
        "--dry-run",
        dest="dryrun",
        action="store_true",
        help="show everything that would be done",
    )
    parser.add_argument(
        "--gpgdir",
        default=os.path.join(util.HOMEDIR, ".gnupg", "repo"),
        help="path to dedicated gpg dir with release keys "
        "(default: ~/.gnupg/repo/)",
    )
    parser.add_argument(
        "--keyid",
        dest="keys",
        default=[],
        action="append",
        help="alternative signing keys to use",
    )
    parser.add_argument(
        "launcher",
        default=os.path.join(util.TOPDIR, "repo"),
        nargs="?",
        help="the launcher script to sign",
    )
    return parser


def main(argv):
    """The main func!"""
    parser = get_parser()
    opts = parser.parse_args(argv)

    if not os.path.exists(opts.gpgdir):
        parser.error(f"--gpgdir does not exist: {opts.gpgdir}")
    if not os.path.exists(opts.launcher):
        parser.error(f"launcher does not exist: {opts.launcher}")

    opts.launcher = os.path.relpath(opts.launcher)
    print(
        f'Signing "{opts.launcher}" launcher script and saving to '
        f'"{opts.launcher}.asc"'
    )

    if opts.keys:
        print(f'Using custom keys to sign: {" ".join(opts.keys)}')
    else:
        print("Using official Repo release keys to sign")
        opts.keys = [util.KEYID_DSA, util.KEYID_RSA, util.KEYID_ECC]

    util.import_release_key(opts)
    version = get_version(opts)
    sign(opts)
    check(opts)
    postmsg(opts, version)
    return 0


if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))


@@ -35,46 +35,61 @@ import util

KEYID = util.KEYID_DSA

# Regular expression to validate tag names.
RE_VALID_TAG = r"^v([0-9]+[.])+[0-9]+$"


def sign(opts):
    """Tag the commit & sign it!"""
    # We use ! at the end of the key so that gpg uses this specific key.
    # Otherwise it uses the key as a lookup into the overall key and uses the
    # default signing key.  i.e. It will see that KEYID_RSA is a subkey of
    # another key, and use the primary key to sign instead of the subkey.
    cmd = [
        "git",
        "tag",
        "-s",
        opts.tag,
        "-u",
        f"{opts.key}!",
        "-m",
        f"repo {opts.tag}",
        opts.commit,
    ]

    key = "GNUPGHOME"
    print("+", f'export {key}="{opts.gpgdir}"')
    oldvalue = os.getenv(key)
    os.putenv(key, opts.gpgdir)
    util.run(opts, cmd)
    if oldvalue is None:
        os.unsetenv(key)
    else:
        os.putenv(key, oldvalue)


def check(opts):
    """Check the signature."""
    util.run(opts, ["git", "tag", "--verify", opts.tag])


def postmsg(opts):
    """Helpful info to show at the end for release manager."""
    cmd = ["git", "rev-parse", "remotes/origin/stable"]
    ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
    current_release = ret.stdout.strip()

    cmd = [
        "git",
        "log",
        "--format=%h (%aN) %s",
        "--no-merges",
        f"remotes/origin/stable..{opts.tag}",
    ]
    ret = util.run(opts, cmd, encoding="utf-8", stdout=subprocess.PIPE)
    shortlog = ret.stdout.strip()

    print(
        f"""
Here's the short log since the last release.
{shortlog}
@@ -84,57 +99,69 @@ NB: People will start upgrading to this version immediately.

To roll back a release:
  git push origin --force {current_release}:stable -n
"""
    )
def get_parser():
    """Get a CLI parser."""
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter,
    )
    parser.add_argument(
        "-n",
        "--dry-run",
        dest="dryrun",
        action="store_true",
        help="show everything that would be done",
    )
    parser.add_argument(
        "--gpgdir",
        default=os.path.join(util.HOMEDIR, ".gnupg", "repo"),
        help="path to dedicated gpg dir with release keys "
        "(default: ~/.gnupg/repo/)",
    )
    parser.add_argument(
        "-f", "--force", action="store_true", help="force signing of any tag"
    )
    parser.add_argument(
        "--keyid", dest="key", help="alternative signing key to use"
    )
    parser.add_argument("tag", help='the tag to create (e.g. "v2.0")')
    parser.add_argument(
        "commit", default="HEAD", nargs="?", help="the commit to tag"
    )
    return parser


def main(argv):
    """The main func!"""
    parser = get_parser()
    opts = parser.parse_args(argv)

    if not os.path.exists(opts.gpgdir):
        parser.error(f"--gpgdir does not exist: {opts.gpgdir}")
    if not opts.force and not re.match(RE_VALID_TAG, opts.tag):
        parser.error(
            f'tag "{opts.tag}" does not match regex "{RE_VALID_TAG}"; '
            "use --force to sign anyways"
        )

    if opts.key:
        print(f"Using custom key to sign: {opts.key}")
    else:
        print("Using official Repo release key to sign")
        opts.key = KEYID

    util.import_release_key(opts)
    sign(opts)
    check(opts)
    postmsg(opts)
    return 0


if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))

143
release/update-hooks Executable file

@@ -0,0 +1,143 @@
#!/usr/bin/env python3
# Copyright (C) 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for updating hooks from their various upstreams."""
import argparse
import base64
import json
from pathlib import Path
import sys
from typing import List, Optional
import urllib.request
assert sys.version_info >= (3, 8), "Python 3.8+ required"
TOPDIR = Path(__file__).resolve().parent.parent
HOOKS_DIR = TOPDIR / "hooks"
def update_hook_commit_msg() -> None:
"""Update commit-msg hook from Gerrit."""
hook = HOOKS_DIR / "commit-msg"
print(
f"{hook.name}: Updating from https://gerrit.googlesource.com/gerrit/"
"+/HEAD/resources/com/google/gerrit/server/tools/root/hooks/commit-msg"
)
# Get the current commit.
url = "https://gerrit.googlesource.com/gerrit/+/HEAD?format=JSON"
with urllib.request.urlopen(url) as fp:
data = fp.read()
# Discard the xss protection.
data = data.split(b"\n", 1)[1]
data = json.loads(data)
commit = data["commit"]
# Fetch the data for that commit.
url = (
f"https://gerrit.googlesource.com/gerrit/+/{commit}/"
"resources/com/google/gerrit/server/tools/root/hooks/commit-msg"
)
with urllib.request.urlopen(f"{url}?format=TEXT") as fp:
data = fp.read()
# gitiles base64 encodes text data.
data = base64.b64decode(data)
# Inject header into the hook.
lines = data.split(b"\n")
lines = (
lines[:1]
+ [
b"# DO NOT EDIT THIS FILE",
(
b"# All updates should be sent upstream: "
b"https://gerrit.googlesource.com/gerrit/"
),
f"# This is synced from commit: {commit}".encode("utf-8"),
b"# DO NOT EDIT THIS FILE",
]
+ lines[1:]
)
data = b"\n".join(lines)
# Update the hook.
hook.write_bytes(data)
hook.chmod(0o755)
def update_hook_pre_auto_gc() -> None:
"""Update pre-auto-gc hook from git."""
hook = HOOKS_DIR / "pre-auto-gc"
print(
f"{hook.name}: Updating from https://github.com/git/git/"
"HEAD/contrib/hooks/pre-auto-gc-battery"
)
# Get the current commit.
headers = {
"Accept": "application/vnd.github+json",
"X-GitHub-Api-Version": "2022-11-28",
}
url = "https://api.github.com/repos/git/git/git/refs/heads/master"
req = urllib.request.Request(url, headers=headers)
with urllib.request.urlopen(req) as fp:
data = fp.read()
data = json.loads(data)
# Fetch the data for that commit.
commit = data["object"]["sha"]
url = (
f"https://raw.githubusercontent.com/git/git/{commit}/"
"contrib/hooks/pre-auto-gc-battery"
)
with urllib.request.urlopen(url) as fp:
data = fp.read()
# Inject header into the hook.
lines = data.split(b"\n")
lines = (
lines[:1]
+ [
b"# DO NOT EDIT THIS FILE",
(
b"# All updates should be sent upstream: "
b"https://github.com/git/git/"
),
f"# This is synced from commit: {commit}".encode("utf-8"),
b"# DO NOT EDIT THIS FILE",
]
+ lines[1:]
)
data = b"\n".join(lines)
# Update the hook.
hook.write_bytes(data)
hook.chmod(0o755)
def main(argv: Optional[List[str]] = None) -> Optional[int]:
parser = argparse.ArgumentParser(description=__doc__)
parser.parse_args(argv)
update_hook_commit_msg()
update_hook_pre_auto_gc()
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))
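update_hook_commit_msg above has to work around two gitiles response quirks: `?format=JSON` bodies start with an anti-XSS prefix line that must be discarded before parsing, and `?format=TEXT` bodies are base64-encoded. A sketch of that decoding, with canned bytes standing in for the network responses:

```python
import base64
import json


def parse_gitiles_json(raw: bytes) -> dict:
    # Discard the anti-XSS prefix line, then parse the JSON payload.
    return json.loads(raw.split(b"\n", 1)[1])


def parse_gitiles_text(raw: bytes) -> bytes:
    # gitiles base64 encodes text data.
    return base64.b64decode(raw)


# Canned responses standing in for urllib.request.urlopen().read().
json_resp = b")]}'\n" + json.dumps({"commit": "abc123"}).encode("utf-8")
text_resp = base64.b64encode(b"#!/bin/sh\n")

commit = parse_gitiles_json(json_resp)["commit"]
hook = parse_gitiles_text(text_resp)
```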


@@ -22,4 +22,5 @@ import sys

import update_manpages

sys.exit(update_manpages.main(sys.argv[1:]))


@@ -17,103 +17,140 @@
Most code lives in this module so it can be unittested.
"""

import argparse
import functools
import multiprocessing
import os
from pathlib import Path
import re
import shutil
import subprocess
import sys
import tempfile

TOPDIR = Path(__file__).resolve().parent.parent
MANDIR = TOPDIR.joinpath("man")

# Load repo local modules.
sys.path.insert(0, str(TOPDIR))
from git_command import RepoSourceVersion
import subcmds


def worker(cmd, **kwargs):
    subprocess.run(cmd, **kwargs)


def main(argv):
    parser = argparse.ArgumentParser(description=__doc__)
    parser.parse_args(argv)

    if not shutil.which("help2man"):
        sys.exit("Please install help2man to continue.")

    # Let repo know we're generating man pages so it can avoid some dynamic
    # behavior (like probing active number of CPUs).  We use a weird name &
    # value to make it less likely for users to set this var themselves.
    os.environ["_REPO_GENERATE_MANPAGES_"] = " indeed! "

    # "repo branch" is an alias for "repo branches".
    del subcmds.all_commands["branch"]
    (MANDIR / "repo-branch.1").write_text(".so man1/repo-branches.1")

    version = RepoSourceVersion()
    cmdlist = [
        [
            "help2man",
            "-N",
            "-n",
            f"repo {cmd} - manual page for repo {cmd}",
            "-S",
            f"repo {cmd}",
            "-m",
            "Repo Manual",
            f"--version-string={version}",
            "-o",
            MANDIR.joinpath(f"repo-{cmd}.1.tmp"),
            "./repo",
            "-h",
            f"help {cmd}",
        ]
        for cmd in subcmds.all_commands
    ]
    cmdlist.append(
        [
            "help2man",
            "-N",
            "-n",
            "repository management tool built on top of git",
            "-S",
            "repo",
            "-m",
            "Repo Manual",
            f"--version-string={version}",
            "-o",
            MANDIR.joinpath("repo.1.tmp"),
            "./repo",
            "-h",
            "--help-all",
        ]
    )

    with tempfile.TemporaryDirectory() as tempdir:
        tempdir = Path(tempdir)
        repo_dir = tempdir / ".repo"
        repo_dir.mkdir()
        (repo_dir / "repo").symlink_to(TOPDIR)

        # Create a repo wrapper using the active Python executable.  We can't
        # pass this directly to help2man as it's too simple, so insert it via
        # shebang.
        data = (TOPDIR / "repo").read_text(encoding="utf-8")
        tempbin = tempdir / "repo"
        tempbin.write_text(f"#!{sys.executable}\n" + data, encoding="utf-8")
        tempbin.chmod(0o755)

        # Run all cmd in parallel, and wait for them to finish.
        with multiprocessing.Pool() as pool:
            pool.map(
                functools.partial(worker, cwd=tempdir, check=True), cmdlist
            )

    for tmp_path in MANDIR.glob("*.1.tmp"):
        path = tmp_path.parent / tmp_path.stem
        old_data = path.read_text() if path.exists() else ""
        data = tmp_path.read_text()
        tmp_path.unlink()

        data = replace_regex(data)

        # If the only thing that changed was the date, don't refresh.  This
        # avoids a lot of noise when only one file actually updates.
        old_data = re.sub(
            r'^(\.TH REPO "1" ")([^"]+)', r"\1", old_data, flags=re.M
        )
        new_data = re.sub(r'^(\.TH REPO "1" ")([^"]+)', r"\1", data, flags=re.M)
        if old_data != new_data:
            path.write_text(data)


def replace_regex(data):
    """Replace semantically null regexes in the data.

    Args:
        data: manpage text.

    Returns:
        Updated manpage text.
    """
    regex = (
        (r"(It was generated by help2man) [0-9.]+", r"\g<1>."),
        (r"^\033\[[0-9;]*m([^\033]*)\033\[m", r"\g<1>"),
        (r"^\.IP\n(.*:)\n", r".SS \g<1>\n"),
        (r"^\.PP\nDescription", r".SH DETAILS"),
    )
    for pattern, replacement in regex:
        data = re.sub(pattern, replacement, data, flags=re.M)
    return data
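The noise-suppression step in update-manpages works by blanking the `.TH` date field in both the old and new text before comparing, so a date-only change leaves the page on disk untouched. The same substitution in isolation (strip_th_date is an illustrative name):

```python
import re


def strip_th_date(data):
    """Blank the date field of a REPO man page .TH header for comparison."""
    return re.sub(r'^(\.TH REPO "1" ")([^"]+)', r"\1", data, flags=re.M)


old = '.TH REPO "1" "July 2023" "repo smth" "Repo Manual"'
new = '.TH REPO "1" "August 2023" "repo smth" "Repo Manual"'
```

With only the generation date differing, the two stripped headers compare equal, so no rewrite happens.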


@@ -20,54 +20,60 @@ import subprocess
import sys

assert sys.version_info >= (3, 6), "This module requires Python 3.6+"

TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
HOMEDIR = os.path.expanduser("~")

# These are the release keys we sign with.
KEYID_DSA = "8BB9AD793E8E6153AF0F9A4416530D5E920F5C65"
KEYID_RSA = "A34A13BE8E76BFF46A0C022DA2E75A824AAB9624"
KEYID_ECC = "E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39"


def cmdstr(cmd):
    """Get a nicely quoted shell command."""
    ret = []
    for arg in cmd:
        if not re.match(r"^[a-zA-Z0-9/_.=-]+$", arg):
            arg = f'"{arg}"'
        ret.append(arg)
    return " ".join(ret)


def run(opts, cmd, check=True, **kwargs):
    """Helper around subprocess.run to include logging."""
    print("+", cmdstr(cmd))
    if opts.dryrun:
        cmd = ["true", "--"] + cmd
    try:
        return subprocess.run(cmd, check=check, **kwargs)
    except subprocess.CalledProcessError as e:
        print(f"aborting: {e}", file=sys.stderr)
        sys.exit(1)


def import_release_key(opts):
    """Import the public key of the official release repo signing key."""
    # Extract the key from our repo launcher.
    launcher = getattr(opts, "launcher", os.path.join(TOPDIR, "repo"))
    print(f'Importing keys from "{launcher}" launcher script')
    with open(launcher, encoding="utf-8") as fp:
        data = fp.read()
    keys = re.findall(
        r"\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*"
        r"\n-----END PGP PUBLIC KEY BLOCK-----\n",
        data,
        flags=re.M,
    )
    run(opts, ["gpg", "--import"], input="\n".join(keys).encode("utf-8"))

    print("Marking keys as fully trusted")
    run(
        opts,
        ["gpg", "--import-ownertrust"],
        input=f"{KEYID_DSA}:6:\n".encode("utf-8"),
    )
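cmdstr above quotes only the arguments containing characters outside a conservative safe set, which keeps the logged command lines readable. The helper is self-contained enough to exercise directly:

```python
import re


def cmdstr(cmd):
    """Get a nicely quoted shell command."""
    ret = []
    for arg in cmd:
        # Quote anything outside a conservative shell-safe character set.
        if not re.match(r"^[a-zA-Z0-9/_.=-]+$", arg):
            arg = f'"{arg}"'
        ret.append(arg)
    return " ".join(ret)
```

This is display-only quoting for the `+ command` log lines, not an escaping routine safe to feed back to a shell.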

2047
repo

File diff suppressed because it is too large Load Diff

93
repo_logging.py Normal file

@ -0,0 +1,93 @@
# Copyright (C) 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Logic for printing user-friendly logs in repo."""
import logging
from color import Coloring
from error import RepoExitError
SEPARATOR = "=" * 80
MAX_PRINT_ERRORS = 5
class _ConfigMock:
"""Default coloring config to use when Logging.config is not set."""
def __init__(self):
self.default_values = {"color.ui": "auto"}
def GetString(self, x):
return self.default_values.get(x, None)
class _LogColoring(Coloring):
"""Coloring outstream for logging."""
def __init__(self, config):
super().__init__(config, "logs")
self.error = self.nofmt_colorer("error", fg="red")
self.warning = self.nofmt_colorer("warn", fg="yellow")
self.levelMap = {
"WARNING": self.warning,
"ERROR": self.error,
}
class _LogColoringFormatter(logging.Formatter):
"""Coloring formatter for logging."""
def __init__(self, config=None, *args, **kwargs):
self.config = config if config else _ConfigMock()
self.colorer = _LogColoring(self.config)
super().__init__(*args, **kwargs)
def format(self, record):
"""Formats |record| with color."""
msg = super().format(record)
colorer = self.colorer.levelMap.get(record.levelname)
return msg if not colorer else colorer(msg)
class RepoLogger(logging.Logger):
"""Repo Logging Module."""
def __init__(self, name: str, config=None, **kwargs):
super().__init__(name, **kwargs)
handler = logging.StreamHandler()
handler.setFormatter(_LogColoringFormatter(config))
self.addHandler(handler)
def log_aggregated_errors(self, err: RepoExitError):
"""Print all aggregated logs."""
self.error(SEPARATOR)
if not err.aggregate_errors:
self.error("Repo command failed: %s", type(err).__name__)
self.error("\t%s", str(err))
return
self.error(
"Repo command failed due to the following `%s` errors:",
type(err).__name__,
)
self.error(
"\n".join(str(e) for e in err.aggregate_errors[:MAX_PRINT_ERRORS])
)
diff = len(err.aggregate_errors) - MAX_PRINT_ERRORS
if diff > 0:
self.error("+%d additional errors...", diff)
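log_aggregated_errors prints at most MAX_PRINT_ERRORS errors and folds the rest into a count. The truncation logic, extracted into an illustrative helper:

```python
MAX_PRINT_ERRORS = 5


def summarize_errors(errors):
    """Return the lines log_aggregated_errors would emit for |errors|."""
    # Print the first MAX_PRINT_ERRORS errors verbatim.
    lines = [str(e) for e in errors[:MAX_PRINT_ERRORS]]
    # Summarize anything beyond the cap as a count.
    diff = len(errors) - MAX_PRINT_ERRORS
    if diff > 0:
        lines.append(f"+{diff} additional errors...")
    return lines
```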


@@ -20,147 +20,152 @@ Temporary: Tracing is always on. Set `REPO_TRACE=0` to turn off.
To also include trace outputs in stderr do `repo --trace_to_stderr ...`
"""

import contextlib
import os
import sys
import tempfile
import time

import platform_utils


# Env var to implicitly turn on tracing.
REPO_TRACE = "REPO_TRACE"

# Temporarily set tracing to always on unless user explicitly sets to 0.
_TRACE = os.environ.get(REPO_TRACE) != "0"
_TRACE_TO_STDERR = False
_TRACE_FILE = None
_TRACE_FILE_NAME = "TRACE_FILE"
_MAX_SIZE = 70  # in MiB
_NEW_COMMAND_SEP = "+++++++++++++++NEW COMMAND+++++++++++++++++++"


def IsTraceToStderr():
    """Whether traces are written to stderr."""
    return _TRACE_TO_STDERR


def IsTrace():
    """Whether tracing is enabled."""
    return _TRACE


def SetTraceToStderr():
    """Enables tracing logging to stderr."""
    global _TRACE_TO_STDERR
    _TRACE_TO_STDERR = True


def SetTrace():
    """Enables tracing."""
    global _TRACE
    _TRACE = True


def _SetTraceFile(quiet):
    """Sets the trace file location."""
    global _TRACE_FILE
    _TRACE_FILE = _GetTraceFile(quiet)


class Trace(contextlib.ContextDecorator):
    """Used to capture and save git traces."""

    def _time(self):
        """Generate nanoseconds of time in a py3.6 safe way"""
        return int(time.time() * 1e9)

    def __init__(self, fmt, *args, first_trace=False, quiet=True):
        """Initialize the object.

        Args:
            fmt: The format string for the trace.
            *args: Arguments to pass to formatting.
            first_trace: Whether this is the first trace of a `repo` invocation.
            quiet: Whether to suppress notification of trace file location.
        """
        if not IsTrace():
            return
        self._trace_msg = fmt % args

        if not _TRACE_FILE:
            _SetTraceFile(quiet)

        if first_trace:
            _ClearOldTraces()
            self._trace_msg = f"{_NEW_COMMAND_SEP} {self._trace_msg}"

    def __enter__(self):
        if not IsTrace():
            return self

        print_msg = (
            f"PID: {os.getpid()} START: {self._time()} :{self._trace_msg}\n"
        )

        with open(_TRACE_FILE, "a") as f:
            print(print_msg, file=f)

        if _TRACE_TO_STDERR:
            print(print_msg, file=sys.stderr)

        return self

    def __exit__(self, *exc):
        if not IsTrace():
            return False

        print_msg = (
            f"PID: {os.getpid()} END: {self._time()} :{self._trace_msg}\n"
        )

        with open(_TRACE_FILE, "a") as f:
            print(print_msg, file=f)

        if _TRACE_TO_STDERR:
            print(print_msg, file=sys.stderr)

        return False


def _GetTraceFile(quiet):
    """Get the trace file or create one."""
    # TODO: refactor to pass repodir to Trace.
    repo_dir = os.path.dirname(os.path.dirname(__file__))
    trace_file = os.path.join(repo_dir, _TRACE_FILE_NAME)
    if not quiet:
        print(f"Trace outputs in {trace_file}", file=sys.stderr)
    return trace_file


def _ClearOldTraces():
    """Clear the oldest commands if trace file is too big."""
    try:
        with open(_TRACE_FILE, errors="ignore") as f:
            if os.path.getsize(f.name) / (1024 * 1024) <= _MAX_SIZE:
                return
            trace_lines = f.readlines()
    except FileNotFoundError:
        return

    while sum(len(x) for x in trace_lines) / (1024 * 1024) > _MAX_SIZE:
        for i, line in enumerate(trace_lines):
            if "END:" in line and _NEW_COMMAND_SEP in line:
                trace_lines = trace_lines[i + 1 :]
                break
        else:
            # The last chunk is bigger than _MAX_SIZE, so just throw everything
            # away.
            trace_lines = []

    while trace_lines and trace_lines[-1] == "\n":
        trace_lines = trace_lines[:-1]

    # Write to a temporary file with a unique name in the same filesystem
    # before replacing the original trace file.
    temp_dir, temp_prefix = os.path.split(_TRACE_FILE)
    with tempfile.NamedTemporaryFile(
        "w", dir=temp_dir, prefix=temp_prefix, delete=False
    ) as f:
        f.writelines(trace_lines)
    platform_utils.rename(f.name, _TRACE_FILE)
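The rotation in `_ClearOldTraces` drops whole commands from the front of the log until it fits the size cap, keyed on END lines that carry the new-command separator. A minimal stand-alone sketch of that loop (function name and byte-based cap are illustrative, not the repo API):

```python
SEP = "+++++++++++++++NEW COMMAND+++++++++++++++++++"


def drop_oldest_commands(lines, max_bytes):
    """Drop whole commands from the front until the log fits in max_bytes."""
    while sum(len(x) for x in lines) > max_bytes:
        for i, line in enumerate(lines):
            # An END line carrying the separator closes the oldest command.
            if "END:" in line and SEP in line:
                lines = lines[i + 1 :]
                break
        else:
            # A single remaining chunk is still too big: discard everything,
            # mirroring the for/else fallback in _ClearOldTraces.
            return []
    return lines
```

The `for`/`else` is the key trick: if no command boundary is found, the loop falls through to the `else` and the whole (oversized) tail is thrown away instead of spinning forever.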
@@ -46,12 +46,14 @@
# Supported git versions.
#
# git-1.9.1 is in Ubuntu Trusty.
# git-2.1.4 is in Debian Jessie.
# git-2.7.4 is in Ubuntu Xenial.
# git-2.11.0 is in Debian Stretch.
# git-2.17.0 is in Ubuntu Bionic.
# git-2.20.1 is in Debian Buster.
"git": {
    "hard": [1, 9, 1],
    "soft": [2, 7, 4],
},
}
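The hard/soft pair above encodes two thresholds: below the hard floor the tool refuses to run, between hard and soft it only warns. A sketch of that check using tuple comparison (the function name is illustrative; repo's real check lives elsewhere):

```python
def check_git_version(actual, hard, soft):
    """Return 'fail' below the hard floor, 'warn' below soft, else 'ok'.

    Tuples compare element-wise, so (2, 1, 4) < (2, 7, 4) works directly.
    """
    if actual < tuple(hard):
        return "fail"
    if actual < tuple(soft):
        return "warn"
    return "ok"
```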
@@ -13,10 +13,58 @@
# See the License for the specific language governing permissions and
# limitations under the License.

"""Wrapper to run linters and pytest with the right settings."""

import os
import subprocess
import sys

import pytest


ROOT_DIR = os.path.dirname(os.path.realpath(__file__))


def run_black():
    """Returns the exit code from black."""
    # Black by default only matches .py files. We have to list standalone
    # scripts manually.
    extra_programs = [
        "repo",
        "run_tests",
        "release/update-hooks",
        "release/update-manpages",
    ]
    return subprocess.run(
        [sys.executable, "-m", "black", "--check", ROOT_DIR] + extra_programs,
        check=False,
    ).returncode


def run_flake8():
    """Returns the exit code from flake8."""
    return subprocess.run(
        [sys.executable, "-m", "flake8", ROOT_DIR], check=False
    ).returncode


def run_isort():
    """Returns the exit code from isort."""
    return subprocess.run(
        [sys.executable, "-m", "isort", "--check", ROOT_DIR], check=False
    ).returncode


def main(argv):
    """The main entry."""
    checks = (
        lambda: pytest.main(argv),
        run_black,
        run_flake8,
        run_isort,
    )
    return 0 if all(not c() for c in checks) else 1


if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
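One subtlety in `main()` above: `all()` consumes a lazy generator, so it short-circuits on the first non-zero exit code and the remaining linters never run. A small sketch isolating that behavior (names are illustrative):

```python
def aggregate(checks):
    """Return 0 iff every check returns a zero exit code.

    all() short-circuits: once one check fails, later checks are
    skipped entirely, matching the behavior of main() above.
    """
    return 0 if all(not c() for c in checks) else 1
```

If running every linter regardless of earlier failures were wanted, the codes would need to be materialized first, e.g. `results = [c() for c in checks]`.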
@@ -26,8 +26,8 @@ wheel: <
# Required by pytest==6.2.2
wheel: <
  name: "infra/python/wheels/packaging-py3"
  version: "version:23.0"
>

# Required by pytest==6.2.2
@@ -59,3 +59,72 @@ wheel: <
  name: "infra/python/wheels/six-py2_py3"
  version: "version:1.16.0"
>

wheel: <
  name: "infra/python/wheels/black-py3"
  version: "version:23.1.0"
>

# Required by black==23.1.0
wheel: <
  name: "infra/python/wheels/mypy-extensions-py3"
  version: "version:0.4.3"
>

# Required by black==23.1.0
wheel: <
  name: "infra/python/wheels/tomli-py3"
  version: "version:2.0.1"
>

# Required by black==23.1.0
wheel: <
  name: "infra/python/wheels/platformdirs-py3"
  version: "version:2.5.2"
>

# Required by black==23.1.0
wheel: <
  name: "infra/python/wheels/pathspec-py3"
  version: "version:0.9.0"
>

# Required by black==23.1.0
wheel: <
  name: "infra/python/wheels/typing-extensions-py3"
  version: "version:4.3.0"
>

# Required by black==23.1.0
wheel: <
  name: "infra/python/wheels/click-py3"
  version: "version:8.0.3"
>

wheel: <
  name: "infra/python/wheels/flake8-py2_py3"
  version: "version:6.0.0"
>

# Required by flake8==6.0.0
wheel: <
  name: "infra/python/wheels/mccabe-py2_py3"
  version: "version:0.7.0"
>

# Required by flake8==6.0.0
wheel: <
  name: "infra/python/wheels/pyflakes-py2_py3"
  version: "version:3.0.1"
>

# Required by flake8==6.0.0
wheel: <
  name: "infra/python/wheels/pycodestyle-py2_py3"
  version: "version:2.10.0"
>

wheel: <
  name: "infra/python/wheels/isort-py3"
  version: "version:5.10.1"
>
@@ -16,6 +16,7 @@
"""Python packaging for repo."""

import os

import setuptools

@@ -23,39 +24,39 @@ TOPDIR = os.path.dirname(os.path.abspath(__file__))
# Rip out the first intro paragraph.
with open(os.path.join(TOPDIR, "README.md")) as fp:
    lines = fp.read().splitlines()[2:]
    end = lines.index("")
    long_description = " ".join(lines[0:end])


# https://packaging.python.org/tutorials/packaging-projects/
setuptools.setup(
    name="repo",
    version="2",
    maintainer="Various",
    maintainer_email="repo-discuss@googlegroups.com",
    description="Repo helps manage many Git repositories",
    long_description=long_description,
    long_description_content_type="text/plain",
    url="https://gerrit.googlesource.com/git-repo/",
    project_urls={
        "Bug Tracker": "https://issues.gerritcodereview.com/issues?q=is:open%20componentid:1370071",  # noqa: E501
    },
    # https://pypi.org/classifiers/
    classifiers=[
        "Development Status :: 6 - Mature",
        "Environment :: Console",
        "Intended Audience :: Developers",
        "License :: OSI Approved :: Apache Software License",
        "Natural Language :: English",
        "Operating System :: MacOS :: MacOS X",
        "Operating System :: Microsoft :: Windows :: Windows 10",
        "Operating System :: POSIX :: Linux",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3 :: Only",
        "Topic :: Software Development :: Version Control :: Git",
    ],
    python_requires=">=3.6",
    packages=["subcmds"],
)
ssh.py
@@ -24,258 +24,328 @@ import sys
import tempfile
import time

from git_command import git
import platform_utils
from repo_trace import Trace


PROXY_PATH = os.path.join(os.path.dirname(__file__), "git_ssh")


def _run_ssh_version():
    """run ssh -V to display the version number"""
    return subprocess.check_output(
        ["ssh", "-V"], stderr=subprocess.STDOUT
    ).decode()


def _parse_ssh_version(ver_str=None):
    """parse a ssh version string into a tuple"""
    if ver_str is None:
        ver_str = _run_ssh_version()
    m = re.match(r"^OpenSSH_([0-9.]+)(p[0-9]+)?[\s,]", ver_str)
    if m:
        return tuple(int(x) for x in m.group(1).split("."))
    else:
        return ()


@functools.lru_cache(maxsize=None)
def version():
    """return ssh version as a tuple"""
    try:
        return _parse_ssh_version()
    except FileNotFoundError:
        print("fatal: ssh not installed", file=sys.stderr)
        sys.exit(1)
    except subprocess.CalledProcessError as e:
        print(
            "fatal: unable to detect ssh version"
            f" (code={e.returncode}, output={e.stdout})",
            file=sys.stderr,
        )
        sys.exit(1)


URI_SCP = re.compile(r"^([^@:]*@?[^:/]{1,}):")
URI_ALL = re.compile(r"^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/")


class ProxyManager:
    """Manage various ssh clients & masters that we spawn.

    This will take care of sharing state between multiprocessing children, and
    make sure that if we crash, we don't leak any of the ssh sessions.

    The code should work with a single-process scenario too, and not add too
    much overhead due to the manager.
    """

    # Path to the ssh program to run which will pass our master settings along.
    # Set here more as a convenience API.
    proxy = PROXY_PATH

    def __init__(self, manager):
        # Protect access to the list of active masters.
        self._lock = multiprocessing.Lock()
        # List of active masters (pid). These will be spawned on demand, and we
        # are responsible for shutting them all down at the end.
        self._masters = manager.list()
        # Set of active masters indexed by "host:port" information.
        # The value isn't used, but multiprocessing doesn't provide a set class.
        self._master_keys = manager.dict()
        # Whether ssh masters are known to be broken, so we give up entirely.
        self._master_broken = manager.Value("b", False)
        # List of active ssh sessions. Clients will be added & removed as
        # connections finish, so this list is just for safety & cleanup if we
        # crash.
        self._clients = manager.list()
        # Path to directory for holding master sockets.
        self._sock_path = None

    def __enter__(self):
        """Enter a new context."""
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        """Exit a context & clean up all resources."""
        self.close()

    def add_client(self, proc):
        """Track a new ssh session."""
        self._clients.append(proc.pid)

    def remove_client(self, proc):
        """Remove a completed ssh session."""
        try:
            self._clients.remove(proc.pid)
        except ValueError:
            pass

    def add_master(self, proc):
        """Track a new master connection."""
        self._masters.append(proc.pid)

    def _terminate(self, procs):
        """Kill all |procs|."""
        for pid in procs:
            try:
                os.kill(pid, signal.SIGTERM)
                os.waitpid(pid, 0)
            except OSError:
                pass

        # The multiprocessing.list() API doesn't provide many standard list()
        # methods, so we have to manually clear the list.
        while True:
            try:
                procs.pop(0)
            except:  # noqa: E722
                break

    def close(self):
        """Close this active ssh session.

        Kill all ssh clients & masters we created, and nuke the socket dir.
        """
        self._terminate(self._clients)
        self._terminate(self._masters)

        d = self.sock(create=False)
        if d:
            try:
                platform_utils.rmdir(os.path.dirname(d))
            except OSError:
                pass

    def _open_unlocked(self, host, port=None):
        """Make sure a ssh master session exists for |host| & |port|.

        If one doesn't exist already, we'll create it.

        We won't grab any locks, so the caller has to do that. This helps keep
        the business logic of actually creating the master separate from
        grabbing locks.
        """
        # Check to see whether we already think that the master is running; if
        # we think it's already running, return right away.
        if port is not None:
            key = f"{host}:{port}"
        else:
            key = host

        if key in self._master_keys:
            return True

        if self._master_broken.value or "GIT_SSH" in os.environ:
            # Failed earlier, so don't retry.
            return False

        # We will make two calls to ssh; this is the common part of both calls.
        command_base = ["ssh", "-o", "ControlPath %s" % self.sock(), host]
        if port is not None:
            command_base[1:1] = ["-p", str(port)]

        # Since the key wasn't in _master_keys, we think that master isn't
        # running... but before actually starting a master, we'll double-check.
        # This can be important because we can't tell that 'git@myhost.com'
        # is the same as 'myhost.com' where "User git" is setup in the user's
        # ~/.ssh/config file.
        check_command = command_base + ["-O", "check"]
        with Trace("Call to ssh (check call): %s", " ".join(check_command)):
            try:
                check_process = subprocess.Popen(
                    check_command,
                    stdout=subprocess.PIPE,
                    stderr=subprocess.PIPE,
                )
                check_process.communicate()  # read output, but ignore it...
                isnt_running = check_process.wait()

                if not isnt_running:
                    # Our double-check found that the master _was_ in fact
                    # running. Add to the list of keys.
                    self._master_keys[key] = True
                    return True
            except Exception:
                # Ignore exceptions. We will fall back to the normal command
                # and print to the log there.
                pass

        # Git protocol V2 is a new feature in git 2.18.0, made default in
        # git 2.26.0
        # It is faster and more efficient than V1.
        # To enable it when using SSH, the environment variable GIT_PROTOCOL
        # must be set in the SSH side channel when establishing the connection
        # to the git server.
        # See https://git-scm.com/docs/protocol-v2#_ssh_and_file_transport
        # Normally git does this by itself. But here, where the SSH connection
        # is established manually over ControlMaster via the repo-tool, it must
        # be passed in explicitly instead.
        # Based on https://git-scm.com/docs/gitprotocol-pack#_extra_parameters,
        # GIT_PROTOCOL is considered an "Extra Parameter" and must be ignored
        # by servers that do not understand it. This means that it is safe to
        # set it even when connecting to older servers.
        # It should also be safe to set the environment variable for older
        # local git versions, since it is only part of the ssh side channel.
        git_protocol_version = _get_git_protocol_version()
        ssh_git_protocol_args = [
            "-o",
            f"SetEnv GIT_PROTOCOL=version={git_protocol_version}",
        ]

        command = (
            command_base[:1]
            + ["-M", "-N", *ssh_git_protocol_args]
            + command_base[1:]
        )
        p = None
        try:
            with Trace("Call to ssh: %s", " ".join(command)):
                p = subprocess.Popen(command)
        except Exception as e:
            self._master_broken.value = True
            print(
                "\nwarn: cannot enable ssh control master for %s:%s\n%s"
                % (host, port, str(e)),
                file=sys.stderr,
            )
            return False

        time.sleep(1)
        ssh_died = p.poll() is not None
        if ssh_died:
            return False

        self.add_master(p)
        self._master_keys[key] = True
        return True

    def _open(self, host, port=None):
        """Make sure a ssh master session exists for |host| & |port|.

        If one doesn't exist already, we'll create it.

        This will obtain any necessary locks to avoid inter-process races.
        """
        # Bail before grabbing the lock if we already know that we aren't going
        # to try creating new masters below.
        if sys.platform in ("win32", "cygwin"):
            return False

        # Acquire the lock. This is needed to prevent opening multiple masters
        # for the same host when we're running "repo sync -jN" (for N > 1) _and_
        # the manifest <remote fetch="ssh://xyz"> specifies a different host
        # from the one that was passed to repo init.
        with self._lock:
            return self._open_unlocked(host, port)

    def preconnect(self, url):
        """If |url| will create a ssh connection, setup the ssh master for it."""  # noqa: E501
        m = URI_ALL.match(url)
        if m:
            scheme = m.group(1)
            host = m.group(2)
            if ":" in host:
                host, port = host.split(":")
            else:
                port = None
            if scheme in ("ssh", "git+ssh", "ssh+git"):
                return self._open(host, port)
            return False

        m = URI_SCP.match(url)
        if m:
            host = m.group(1)
            return self._open(host)

        return False

    def sock(self, create=True):
        """Return the path to the ssh socket dir.

        This has all the master sockets so clients can talk to them.
        """
        if self._sock_path is None:
            if not create:
                return None
            tmp_dir = "/tmp"
            if not os.path.exists(tmp_dir):
                tmp_dir = tempfile.gettempdir()
            if version() < (6, 7):
                tokens = "%r@%h:%p"
            else:
                tokens = "%C"  # hash of %l%h%p%r
            self._sock_path = os.path.join(
                tempfile.mkdtemp("", "ssh-", tmp_dir), "master-" + tokens
            )
        return self._sock_path


@functools.lru_cache(maxsize=1)
def _get_git_protocol_version() -> str:
    """Return the git protocol version.

    The version is found by first reading the global git config.
    If no git config for protocol version exists, try to deduce the default
    protocol version based on the git version.

    See https://git-scm.com/docs/gitprotocol-v2 for details.
    """
    try:
        return subprocess.check_output(
            ["git", "config", "--get", "--global", "protocol.version"],
            encoding="utf-8",
            stderr=subprocess.PIPE,
        ).strip()
    except subprocess.CalledProcessError as e:
        if e.returncode == 1:
            # Exit code 1 means that the git config key was not found.
            # Try to imitate the defaults that git would have used.
            git_version = git.version_tuple()
            if git_version >= (2, 26, 0):
                # Since git version 2.26, protocol v2 is the default.
                return "2"
            return "1"
        # Other exit codes indicate error with reading the config.
        raise
@@ -14,36 +14,35 @@
import os

# A mapping of the subcommand name to the class that implements it.
all_commands = {}

my_dir = os.path.dirname(__file__)
for py in os.listdir(my_dir):
    if py == "__init__.py":
        continue

    if py.endswith(".py"):
        name = py[:-3]

        clsn = name.capitalize()
        while clsn.find("_") > 0:
            h = clsn.index("_")
            clsn = clsn[0:h] + clsn[h + 1 :].capitalize()

        mod = __import__(__name__, globals(), locals(), ["%s" % name])
        mod = getattr(mod, name)
        try:
            cmd = getattr(mod, clsn)
        except AttributeError:
            raise SyntaxError(f"{__name__}/{py} does not define class {clsn}")

        name = name.replace("_", "-")
        cmd.NAME = name
        all_commands[name] = cmd

# Add 'branch' as an alias for 'branches'.
all_commands["branch"] = all_commands["branches"]
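The `clsn` loop above converts a snake_case module name into the CamelCase class name it expects to find inside the module. Extracted as a stand-alone helper (the function name is illustrative; the repo code inlines this):

```python
def command_class_name(module_name):
    """Map a module name like 'cherry_pick' to its class name 'CherryPick'."""
    clsn = module_name.capitalize()
    # Repeatedly remove the first underscore and capitalize what follows it.
    while clsn.find("_") > 0:
        h = clsn.index("_")
        clsn = clsn[0:h] + clsn[h + 1 :].capitalize()
    return clsn
```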
@ -12,20 +12,30 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from collections import defaultdict import collections
import functools import functools
import itertools import itertools
import sys
from command import Command, DEFAULT_LOCAL_JOBS from command import Command
from command import DEFAULT_LOCAL_JOBS
from error import RepoError
from error import RepoExitError
from git_command import git from git_command import git
from progress import Progress from progress import Progress
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class AbandonError(RepoExitError):
"""Exit error when abandon command fails."""
class Abandon(Command): class Abandon(Command):
COMMON = True COMMON = True
helpSummary = "Permanently abandon a development branch" helpSummary = "Permanently abandon a development branch"
helpUsage = """ helpUsage = """
%prog [--all | <branchname>] [<project>...] %prog [--all | <branchname>] [<project>...]
This subcommand permanently abandons a development branch by This subcommand permanently abandons a development branch by
@@ -33,83 +43,119 @@ deleting it (and all its history) from your local repository.
It is equivalent to "git branch -D <branchname>".
"""
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

    def _Options(self, p):
        p.add_option(
            "--all",
            dest="all",
            action="store_true",
            help="delete all branches in all projects",
        )

    def ValidateOptions(self, opt, args):
        if not opt.all and not args:
            self.Usage()

        if not opt.all:
            branches = args[0].split()
            invalid_branches = [
                x for x in branches if not git.check_ref_format(f"heads/{x}")
            ]

            if invalid_branches:
                self.OptionParser.error(
                    f"{invalid_branches} are not valid branch names"
                )
        else:
            args.insert(0, "'All local branches'")

    @classmethod
    def _ExecuteOne(cls, all_branches, nb, project_idx):
        """Abandon one project."""
        project = cls.get_parallel_context()["projects"][project_idx]
        if all_branches:
            branches = project.GetBranches()
        else:
            branches = nb

        ret = {}
        errors = []
        for name in branches:
            status = None
            try:
                status = project.AbandonBranch(name)
            except RepoError as e:
                status = False
                errors.append(e)
            if status is not None:
                ret[name] = status

        return (ret, project_idx, errors)

    def Execute(self, opt, args):
        nb = args[0].split()
        err = collections.defaultdict(list)
        success = collections.defaultdict(list)
        aggregate_errors = []
        all_projects = self.GetProjects(
            args[1:], all_manifests=not opt.this_manifest_only
        )
        _RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)

        def _ProcessResults(_pool, pm, states):
            for results, project_idx, errors in states:
                project = all_projects[project_idx]
                for branch, status in results.items():
                    if status:
                        success[branch].append(project)
                    else:
                        err[branch].append(project)
                aggregate_errors.extend(errors)
                pm.update(msg="")

        with self.ParallelContext():
            self.get_parallel_context()["projects"] = all_projects
            self.ExecuteInParallel(
                opt.jobs,
                functools.partial(self._ExecuteOne, opt.all, nb),
                range(len(all_projects)),
                callback=_ProcessResults,
                output=Progress(
                    f"Abandon {nb}", len(all_projects), quiet=opt.quiet
                ),
                chunksize=1,
            )

        width = max(
            itertools.chain(
                [25], (len(x) for x in itertools.chain(success, err))
            )
        )
        if err:
            for br in err.keys():
                err_msg = "error: cannot abandon %s" % br
                logger.error(err_msg)
                for proj in err[br]:
                    logger.error(" " * len(err_msg) + " | %s", _RelPath(proj))
            raise AbandonError(aggregate_errors=aggregate_errors)
        elif not success:
            logger.error("error: no project has local branch(es) : %s", nb)
            raise AbandonError(aggregate_errors=aggregate_errors)
        else:
            # Everything below here is displaying status.
            if opt.quiet:
                return
            print("Abandoned branches:")
            for br in success.keys():
                if len(all_projects) > 1 and len(all_projects) == len(
                    success[br]
                ):
                    result = "all project"
                else:
                    result = "%s" % (
                        ("\n" + " " * width + "| ").join(
                            _RelPath(p) for p in success[br]
                        )
                    )
                print(f"{br}{' ' * (width - len(br))}| {result}\n")
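The abandon command above aggregates per-branch outcomes into two `defaultdict(list)` maps before reporting. A minimal standalone sketch of that grouping pattern (the project names and result shapes here are illustrative, not taken from the repo):

```python
import collections

# Hypothetical per-project abandon results: branch name -> bool status,
# mirroring the shape that _ProcessResults consumes.
results_by_project = {
    "platform/a": {"feature-x": True, "feature-y": False},
    "platform/b": {"feature-x": True},
}

success = collections.defaultdict(list)
err = collections.defaultdict(list)

for project, branches in results_by_project.items():
    for branch, status in branches.items():
        # A truthy status means the branch was abandoned in this project.
        (success if status else err)[branch].append(project)
```

Keying both maps by branch lets the summary list, per branch, every project where the operation succeeded or failed.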


@@ -16,55 +16,56 @@ import itertools
import sys

from color import Coloring
from command import Command
from command import DEFAULT_LOCAL_JOBS


class BranchColoring(Coloring):
    def __init__(self, config):
        Coloring.__init__(self, config, "branch")
        self.current = self.printer("current", fg="green")
        self.local = self.printer("local")
        self.notinproject = self.printer("notinproject", fg="red")


class BranchInfo:
    def __init__(self, name):
        self.name = name
        self.current = 0
        self.published = 0
        self.published_equal = 0
        self.projects = []

    def add(self, b):
        if b.current:
            self.current += 1
        if b.published:
            self.published += 1
        if b.revision == b.published:
            self.published_equal += 1
        self.projects.append(b)

    @property
    def IsCurrent(self):
        return self.current > 0

    @property
    def IsSplitCurrent(self):
        return self.current != 0 and self.current != len(self.projects)

    @property
    def IsPublished(self):
        return self.published > 0

    @property
    def IsPublishedEqual(self):
        return self.published_equal == len(self.projects)


class Branches(Command):
    COMMON = True
    helpSummary = "View current topic branches"
    helpUsage = """
%prog [<project>...]

Summarizes the currently available topic branches.
@@ -95,111 +96,117 @@ the branch appears in, or does not appear in. If no project list
is shown, then the branch appears in all projects.
"""
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

    @classmethod
    def _ExpandProjectToBranches(cls, project_idx):
        """Expands a project into a list of branch names & associated info.

        Args:
            project_idx: project.Project index

        Returns:
            List[Tuple[str, git_config.Branch, int]]
        """
        branches = []
        project = cls.get_parallel_context()["projects"][project_idx]
        for name, b in project.GetBranches().items():
            branches.append((name, b, project_idx))
        return branches

    def Execute(self, opt, args):
        projects = self.GetProjects(
            args, all_manifests=not opt.this_manifest_only
        )
        out = BranchColoring(self.manifest.manifestProject.config)
        all_branches = {}
        project_cnt = len(projects)

        def _ProcessResults(_pool, _output, results):
            for name, b, project_idx in itertools.chain.from_iterable(results):
                b.project = projects[project_idx]
                if name not in all_branches:
                    all_branches[name] = BranchInfo(name)
                all_branches[name].add(b)

        with self.ParallelContext():
            self.get_parallel_context()["projects"] = projects
            self.ExecuteInParallel(
                opt.jobs,
                self._ExpandProjectToBranches,
                range(len(projects)),
                callback=_ProcessResults,
            )

        names = sorted(all_branches)

        if not names:
            print("   (no branches)", file=sys.stderr)
            return

        width = 25
        for name in names:
            if width < len(name):
                width = len(name)

        for name in names:
            i = all_branches[name]
            in_cnt = len(i.projects)

            if i.IsCurrent:
                current = "*"
                hdr = out.current
            else:
                current = " "
                hdr = out.local

            if i.IsPublishedEqual:
                published = "P"
            elif i.IsPublished:
                published = "p"
            else:
                published = " "

            hdr("%c%c %-*s" % (current, published, width, name))
            out.write(" |")

            _RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)
            if in_cnt < project_cnt:
                fmt = out.write
                paths = []
                non_cur_paths = []
                if i.IsSplitCurrent or (in_cnt <= project_cnt - in_cnt):
                    in_type = "in"
                    for b in i.projects:
                        relpath = _RelPath(b.project)
                        if not i.IsSplitCurrent or b.current:
                            paths.append(relpath)
                        else:
                            non_cur_paths.append(relpath)
                else:
                    fmt = out.notinproject
                    in_type = "not in"
                    have = set()
                    for b in i.projects:
                        have.add(_RelPath(b.project))
                    for p in projects:
                        if _RelPath(p) not in have:
                            paths.append(_RelPath(p))

                s = f" {in_type} {', '.join(paths)}"
                if not i.IsSplitCurrent and (width + 7 + len(s) < 80):
                    fmt = out.current if i.IsCurrent else fmt
                    fmt(s)
                else:
                    fmt(" %s:" % in_type)
                    fmt = out.current if i.IsCurrent else out.write
                    for p in paths:
                        out.nl()
                        fmt(width * " " + " %s" % p)
                    fmt = out.write
                    for p in non_cur_paths:
                        out.nl()
                        fmt(width * " " + " %s" % p)
            else:
                out.write(" in all projects")
            out.nl()


@@ -13,19 +13,41 @@
# limitations under the License.

import functools
from typing import NamedTuple

from command import Command
from command import DEFAULT_LOCAL_JOBS
from error import GitError
from error import RepoExitError
from progress import Progress
from repo_logging import RepoLogger

logger = RepoLogger(__file__)


class CheckoutBranchResult(NamedTuple):
    # Whether the Project is on the branch (i.e. branch exists and no errors)
    result: bool
    project_idx: int
    error: Exception


class CheckoutCommandError(RepoExitError):
    """Exception thrown when checkout command fails."""


class MissingBranchError(RepoExitError):
    """Exception thrown when no project has specified branch."""


class Checkout(Command):
    COMMON = True
    helpSummary = "Checkout a branch for development"
    helpUsage = """
%prog <branchname> [<project>...]
"""
    helpDescription = """
The '%prog' command checks out an existing branch that was previously
created by 'repo start'.
@@ -33,43 +55,60 @@ The command is equivalent to:

  repo forall [<project>...] -c git checkout <branchname>
"""
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

    def ValidateOptions(self, opt, args):
        if not args:
            self.Usage()

    @classmethod
    def _ExecuteOne(cls, nb, project_idx):
        """Checkout one project."""
        error = None
        result = None
        project = cls.get_parallel_context()["projects"][project_idx]
        try:
            result = project.CheckoutBranch(nb)
        except GitError as e:
            error = e
        return CheckoutBranchResult(result, project_idx, error)

    def Execute(self, opt, args):
        nb = args[0]
        err = []
        err_projects = []
        success = []
        all_projects = self.GetProjects(
            args[1:], all_manifests=not opt.this_manifest_only
        )

        def _ProcessResults(_pool, pm, results):
            for result in results:
                project = all_projects[result.project_idx]
                if result.error is not None:
                    err.append(result.error)
                    err_projects.append(project)
                elif result.result:
                    success.append(project)
                pm.update(msg="")

        with self.ParallelContext():
            self.get_parallel_context()["projects"] = all_projects
            self.ExecuteInParallel(
                opt.jobs,
                functools.partial(self._ExecuteOne, nb),
                range(len(all_projects)),
                callback=_ProcessResults,
                output=Progress(
                    f"Checkout {nb}", len(all_projects), quiet=opt.quiet
                ),
            )

        if err_projects:
            for p in err_projects:
                logger.error("error: %s/: cannot checkout %s", p.relpath, nb)
            raise CheckoutCommandError(aggregate_errors=err)
        elif not success:
            msg = f"error: no project has branch {nb}"
            logger.error(msg)
            raise MissingBranchError(msg)


@@ -14,99 +14,132 @@

import re
import sys

from command import Command
from error import GitError
from git_command import GitCommand
from repo_logging import RepoLogger

CHANGE_ID_RE = re.compile(r"^\s*Change-Id: I([0-9a-f]{40})\s*$")
logger = RepoLogger(__file__)


class CherryPick(Command):
    COMMON = True
    helpSummary = "Cherry-pick a change."
    helpUsage = """
%prog <sha1>
"""
    helpDescription = """
'%prog' cherry-picks a change from one branch to another.
The change id will be updated, and a reference to the old
change id will be added.
"""

    def ValidateOptions(self, opt, args):
        if len(args) != 1:
            self.Usage()

    def Execute(self, opt, args):
        reference = args[0]

        p = GitCommand(
            None,
            ["rev-parse", "--verify", reference],
            capture_stdout=True,
            capture_stderr=True,
            verify_command=True,
        )
        try:
            p.Wait()
        except GitError:
            logger.error(p.stderr)
            raise

        sha1 = p.stdout.strip()

        p = GitCommand(
            None,
            ["cat-file", "commit", sha1],
            capture_stdout=True,
            verify_command=True,
        )
        try:
            p.Wait()
        except GitError:
            logger.error("error: Failed to retrieve old commit message")
            raise

        old_msg = self._StripHeader(p.stdout)

        p = GitCommand(
            None,
            ["cherry-pick", sha1],
            capture_stdout=True,
            capture_stderr=True,
            verify_command=True,
        )
        try:
            p.Wait()
        except GitError as e:
            logger.error(e)
            logger.warning(
                "NOTE: When committing (please see above) and editing the "
                "commit message, please remove the old Change-Id-line and "
                "add:\n%s",
                self._GetReference(sha1),
            )
            raise

        if p.stdout:
            print(p.stdout.strip(), file=sys.stdout)
        if p.stderr:
            print(p.stderr.strip(), file=sys.stderr)

        # The cherry-pick was applied correctly. We just need to edit
        # the commit message.
        new_msg = self._Reformat(old_msg, sha1)
        p = GitCommand(
            None,
            ["commit", "--amend", "-F", "-"],
            input=new_msg,
            capture_stdout=True,
            capture_stderr=True,
            verify_command=True,
        )
        try:
            p.Wait()
        except GitError:
            logger.error("error: Failed to update commit message")
            raise

    def _IsChangeId(self, line):
        return CHANGE_ID_RE.match(line)

    def _GetReference(self, sha1):
        return "(cherry picked from commit %s)" % sha1

    def _StripHeader(self, commit_msg):
        lines = commit_msg.splitlines()
        return "\n".join(lines[lines.index("") + 1 :])

    def _Reformat(self, old_msg, sha1):
        new_msg = []

        for line in old_msg.splitlines():
            if not self._IsChangeId(line):
                new_msg.append(line)

        # Add a blank line between the message and the change id/reference.
        try:
            if new_msg[-1].strip() != "":
                new_msg.append("")
        except IndexError:
            pass

        new_msg.append(self._GetReference(sha1))
        return "\n".join(new_msg)


@@ -15,58 +15,73 @@

import functools
import io

from command import DEFAULT_LOCAL_JOBS
from command import PagedCommand


class Diff(PagedCommand):
    COMMON = True
    helpSummary = "Show changes between commit and working tree"
    helpUsage = """
%prog [<project>...]

The -u option causes '%prog' to generate diff output with file paths
relative to the repository root, so the output can be applied
to the Unix 'patch' command.
"""
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

    def _Options(self, p):
        p.add_option(
            "-u",
            "--absolute",
            dest="absolute",
            action="store_true",
            help="paths are relative to the repository root",
        )

    @classmethod
    def _ExecuteOne(cls, absolute, local, project_idx):
        """Obtains the diff for a specific project.

        Args:
            absolute: Paths are relative to the root.
            local: a boolean, if True, the path is relative to the local
                (sub)manifest.  If false, the path is relative to the outermost
                manifest.
            project_idx: Project index to get status of.

        Returns:
            The status of the project.
        """
        buf = io.StringIO()
        project = cls.get_parallel_context()["projects"][project_idx]
        ret = project.PrintWorkTreeDiff(absolute, output_redir=buf, local=local)
        return (ret, buf.getvalue())

    def Execute(self, opt, args):
        all_projects = self.GetProjects(
            args, all_manifests=not opt.this_manifest_only
        )

        def _ProcessResults(_pool, _output, results):
            ret = 0
            for state, output in results:
                if output:
                    print(output, end="")
                if not state:
                    ret = 1
            return ret

        with self.ParallelContext():
            self.get_parallel_context()["projects"] = all_projects
            return self.ExecuteInParallel(
                opt.jobs,
                functools.partial(
                    self._ExecuteOne, opt.absolute, opt.this_manifest_only
                ),
                range(len(all_projects)),
                callback=_ProcessResults,
                ordered=True,
                chunksize=1,
            )


@@ -18,24 +18,24 @@ from manifest_xml import RepoClient

class _Coloring(Coloring):
    def __init__(self, config):
        Coloring.__init__(self, config, "status")


class Diffmanifests(PagedCommand):
    """A command to see logs in projects represented by manifests

    This is used to see deeper differences between manifests. Where a simple
    diff would only show a diff of sha1s for example, this command will display
    the logs of the project between both sha1s, allowing user to see diff at a
    deeper level.
    """

    COMMON = True
    helpSummary = "Manifest diff utility"
    helpUsage = """%prog manifest1.xml [manifest2.xml] [options]"""

    helpDescription = """
The %prog command shows differences between project revisions of manifest1 and
manifest2. if manifest2 is not specified, current manifest.xml will be used
instead. Both absolute and relative paths may be used for manifests. Relative
@ -65,159 +65,197 @@ synced and their revisions won't be found.
""" """
def _Options(self, p): def _Options(self, p):
p.add_option('--raw', p.add_option(
dest='raw', action='store_true', "--raw", dest="raw", action="store_true", help="display raw diff"
help='display raw diff') )
p.add_option('--no-color', p.add_option(
dest='color', action='store_false', default=True, "--no-color",
help='does not display the diff in color') dest="color",
p.add_option('--pretty-format', action="store_false",
dest='pretty_format', action='store', default=True,
metavar='<FORMAT>', help="does not display the diff in color",
help='print the log using a custom git pretty format string') )
p.add_option(
"--pretty-format",
dest="pretty_format",
action="store",
metavar="<FORMAT>",
help="print the log using a custom git pretty format string",
)
def _printRawDiff(self, diff, pretty_format=None, local=False): def _printRawDiff(self, diff, pretty_format=None, local=False):
_RelPath = lambda p: p.RelPath(local=local) _RelPath = lambda p: p.RelPath(local=local)
for project in diff['added']: for project in diff["added"]:
self.printText("A %s %s" % (_RelPath(project), project.revisionExpr)) self.printText(f"A {_RelPath(project)} {project.revisionExpr}")
self.out.nl()
for project in diff['removed']:
self.printText("R %s %s" % (_RelPath(project), project.revisionExpr))
self.out.nl()
for project, otherProject in diff['changed']:
self.printText("C %s %s %s" % (_RelPath(project), project.revisionExpr,
otherProject.revisionExpr))
self.out.nl()
self._printLogs(project, otherProject, raw=True, color=False, pretty_format=pretty_format)
for project, otherProject in diff['unreachable']:
self.printText("U %s %s %s" % (_RelPath(project), project.revisionExpr,
otherProject.revisionExpr))
self.out.nl()
def _printDiff(self, diff, color=True, pretty_format=None, local=False):
_RelPath = lambda p: p.RelPath(local=local)
if diff['added']:
self.out.nl()
self.printText('added projects : \n')
self.out.nl()
for project in diff['added']:
self.printProject('\t%s' % (_RelPath(project)))
self.printText(' at revision ')
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['removed']:
self.out.nl()
self.printText('removed projects : \n')
self.out.nl()
for project in diff['removed']:
self.printProject('\t%s' % (_RelPath(project)))
self.printText(' at revision ')
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['missing']:
self.out.nl()
self.printText('missing projects : \n')
self.out.nl()
for project in diff['missing']:
self.printProject('\t%s' % (_RelPath(project)))
self.printText(' at revision ')
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['changed']:
self.out.nl()
self.printText('changed projects : \n')
self.out.nl()
for project, otherProject in diff['changed']:
self.printProject('\t%s' % (_RelPath(project)))
self.printText(' changed from ')
self.printRevision(project.revisionExpr)
self.printText(' to ')
self.printRevision(otherProject.revisionExpr)
self.out.nl()
self._printLogs(project, otherProject, raw=False, color=color,
pretty_format=pretty_format)
self.out.nl()
if diff['unreachable']:
self.out.nl()
self.printText('projects with unreachable revisions : \n')
self.out.nl()
for project, otherProject in diff['unreachable']:
self.printProject('\t%s ' % (_RelPath(project)))
self.printRevision(project.revisionExpr)
self.printText(' or ')
self.printRevision(otherProject.revisionExpr)
self.printText(' not found')
self.out.nl()
def _printLogs(self, project, otherProject, raw=False, color=True,
pretty_format=None):
logs = project.getAddedAndRemovedLogs(otherProject,
oneline=(pretty_format is None),
color=color,
pretty_format=pretty_format)
if logs['removed']:
removedLogs = logs['removed'].split('\n')
for log in removedLogs:
if log.strip():
if raw:
self.printText(' R ' + log)
self.out.nl()
else:
self.printRemoved('\t\t[-] ')
self.printText(log)
self.out.nl() self.out.nl()
if logs['added']: for project in diff["removed"]:
addedLogs = logs['added'].split('\n') self.printText(f"R {_RelPath(project)} {project.revisionExpr}")
for log in addedLogs:
if log.strip():
if raw:
self.printText(' A ' + log)
self.out.nl()
else:
self.printAdded('\t\t[+] ')
self.printText(log)
self.out.nl() self.out.nl()
def ValidateOptions(self, opt, args): for project, otherProject in diff["changed"]:
if not args or len(args) > 2: self.printText(
self.OptionParser.error('missing manifests to diff') f"C {_RelPath(project)} {project.revisionExpr} "
if opt.this_manifest_only is False: f"{otherProject.revisionExpr}"
raise self.OptionParser.error( )
'`diffmanifest` only supports the current tree') self.out.nl()
self._printLogs(
project,
otherProject,
raw=True,
color=False,
pretty_format=pretty_format,
)
def Execute(self, opt, args): for project, otherProject in diff["unreachable"]:
self.out = _Coloring(self.client.globalConfig) self.printText(
self.printText = self.out.nofmt_printer('text') f"U {_RelPath(project)} {project.revisionExpr} "
if opt.color: f"{otherProject.revisionExpr}"
self.printProject = self.out.nofmt_printer('project', attr='bold') )
self.printAdded = self.out.nofmt_printer('green', fg='green', attr='bold') self.out.nl()
self.printRemoved = self.out.nofmt_printer('red', fg='red', attr='bold')
self.printRevision = self.out.nofmt_printer('revision', fg='yellow')
else:
self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText
manifest1 = RepoClient(self.repodir) def _printDiff(self, diff, color=True, pretty_format=None, local=False):
manifest1.Override(args[0], load_local_manifests=False) _RelPath = lambda p: p.RelPath(local=local)
if len(args) == 1: if diff["added"]:
manifest2 = self.manifest self.out.nl()
else: self.printText("added projects : \n")
manifest2 = RepoClient(self.repodir) self.out.nl()
manifest2.Override(args[1], load_local_manifests=False) for project in diff["added"]:
self.printProject("\t%s" % (_RelPath(project)))
self.printText(" at revision ")
self.printRevision(project.revisionExpr)
self.out.nl()
diff = manifest1.projectsDiff(manifest2) if diff["removed"]:
if opt.raw: self.out.nl()
self._printRawDiff(diff, pretty_format=opt.pretty_format, self.printText("removed projects : \n")
local=opt.this_manifest_only) self.out.nl()
else: for project in diff["removed"]:
self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format, self.printProject("\t%s" % (_RelPath(project)))
local=opt.this_manifest_only) self.printText(" at revision ")
self.printRevision(project.revisionExpr)
self.out.nl()
if diff["missing"]:
self.out.nl()
self.printText("missing projects : \n")
self.out.nl()
for project in diff["missing"]:
self.printProject("\t%s" % (_RelPath(project)))
self.printText(" at revision ")
self.printRevision(project.revisionExpr)
self.out.nl()
if diff["changed"]:
self.out.nl()
self.printText("changed projects : \n")
self.out.nl()
for project, otherProject in diff["changed"]:
self.printProject("\t%s" % (_RelPath(project)))
self.printText(" changed from ")
self.printRevision(project.revisionExpr)
self.printText(" to ")
self.printRevision(otherProject.revisionExpr)
self.out.nl()
self._printLogs(
project,
otherProject,
raw=False,
color=color,
pretty_format=pretty_format,
)
self.out.nl()
if diff["unreachable"]:
self.out.nl()
self.printText("projects with unreachable revisions : \n")
self.out.nl()
for project, otherProject in diff["unreachable"]:
self.printProject("\t%s " % (_RelPath(project)))
self.printRevision(project.revisionExpr)
self.printText(" or ")
self.printRevision(otherProject.revisionExpr)
self.printText(" not found")
self.out.nl()
def _printLogs(
self, project, otherProject, raw=False, color=True, pretty_format=None
):
logs = project.getAddedAndRemovedLogs(
otherProject,
oneline=(pretty_format is None),
color=color,
pretty_format=pretty_format,
)
if logs["removed"]:
removedLogs = logs["removed"].split("\n")
for log in removedLogs:
if log.strip():
if raw:
self.printText(" R " + log)
self.out.nl()
else:
self.printRemoved("\t\t[-] ")
self.printText(log)
self.out.nl()
if logs["added"]:
addedLogs = logs["added"].split("\n")
for log in addedLogs:
if log.strip():
if raw:
self.printText(" A " + log)
self.out.nl()
else:
self.printAdded("\t\t[+] ")
self.printText(log)
self.out.nl()
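As a minimal illustration of what `_printLogs` does with the log text above (names and sample data here are hypothetical): the added/removed log blobs arrive as newline-joined strings, and the method iterates entry by entry, skipping blank lines.

```python
# Hypothetical sample of a newline-joined git log blob, as consumed by
# _printLogs-style code; blank lines are skipped via strip().
removed_logs = "abc123 Fix crash\n\ndef456 Drop dead code\n"
entries = [line for line in removed_logs.split("\n") if line.strip()]
```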
def ValidateOptions(self, opt, args):
if not args or len(args) > 2:
self.OptionParser.error("missing manifests to diff")
if opt.this_manifest_only is False:
raise self.OptionParser.error(
"`diffmanifest` only supports the current tree"
)
def Execute(self, opt, args):
self.out = _Coloring(self.client.globalConfig)
self.printText = self.out.nofmt_printer("text")
if opt.color:
self.printProject = self.out.nofmt_printer("project", attr="bold")
self.printAdded = self.out.nofmt_printer(
"green", fg="green", attr="bold"
)
self.printRemoved = self.out.nofmt_printer(
"red", fg="red", attr="bold"
)
self.printRevision = self.out.nofmt_printer("revision", fg="yellow")
else:
self.printProject = (
self.printAdded
) = self.printRemoved = self.printRevision = self.printText
manifest1 = RepoClient(self.repodir)
manifest1.Override(args[0], load_local_manifests=False)
if len(args) == 1:
manifest2 = self.manifest
else:
manifest2 = RepoClient(self.repodir)
manifest2.Override(args[1], load_local_manifests=False)
diff = manifest1.projectsDiff(manifest2)
if opt.raw:
self._printRawDiff(
diff,
pretty_format=opt.pretty_format,
local=opt.this_manifest_only,
)
else:
self._printDiff(
diff,
color=opt.color,
pretty_format=opt.pretty_format,
local=opt.this_manifest_only,
)


@@ -16,145 +16,198 @@ import re
import sys
from command import Command
from error import GitError
from error import NoSuchProjectError
from error import RepoExitError
from repo_logging import RepoLogger
CHANGE_RE = re.compile(r"^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$")
logger = RepoLogger(__file__)
class DownloadCommandError(RepoExitError):
"""Error raised when download command fails."""
class Download(Command):
COMMON = True
helpSummary = "Download and checkout a change"
helpUsage = """
%prog {[project] change[/patchset]}...
"""
helpDescription = """
The '%prog' command downloads a change from the review system and
makes it available in your project's local working directory.
If no project is specified try to use current directory as a project.
"""
def _Options(self, p):
p.add_option("-b", "--branch", help="create a new branch first")
p.add_option(
"-c",
"--cherry-pick",
dest="cherrypick",
action="store_true",
help="cherry-pick instead of checkout",
)
p.add_option(
"-x",
"--record-origin",
action="store_true",
help="pass -x when cherry-picking",
)
p.add_option(
"-r",
"--revert",
dest="revert",
action="store_true",
help="revert instead of checkout",
)
p.add_option(
"-f",
"--ff-only",
dest="ffonly",
action="store_true",
help="force fast-forward merge",
)
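The `CHANGE_RE` pattern defined near the top of this file accepts a change number with an optional patchset separated by `/`, `.`, or `-`. A small self-contained check of that behavior:

```python
import re

# Same pattern as CHANGE_RE above: change number, optional patchset.
CHANGE_RE = re.compile(r"^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$")

m = CHANGE_RE.match("12345/6")  # change 12345, patchset 6
no_match = CHANGE_RE.match("0123")  # leading zero is rejected
```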
def _ParseChangeIds(self, opt, args):
if not args:
self.Usage()
to_get = []
project = None
for a in args:
m = CHANGE_RE.match(a)
if m:
if not project:
project = self.GetProjects(".")[0]
print("Defaulting to cwd project", project.name)
chg_id = int(m.group(1))
if m.group(2):
ps_id = int(m.group(2))
else:
ps_id = 1
refs = "refs/changes/%2.2d/%d/" % (chg_id % 100, chg_id)
output = project._LsRemote(refs + "*")
if output:
regex = refs + r"(\d+)"
rcomp = re.compile(regex, re.I)
for line in output.splitlines():
match = rcomp.search(line)
if match:
ps_id = max(int(match.group(1)), ps_id)
to_get.append((project, chg_id, ps_id))
else:
projects = self.GetProjects(
[a], all_manifests=not opt.this_manifest_only
)
if len(projects) > 1:
# If the cwd is one of the projects, assume they want that.
try:
project = self.GetProjects(".")[0]
except NoSuchProjectError:
project = None
if project not in projects:
logger.error(
"error: %s matches too many projects; please "
"re-run inside the project checkout.",
a,
)
for project in projects:
logger.error(
" %s/ @ %s",
project.RelPath(local=opt.this_manifest_only),
project.revisionExpr,
)
raise NoSuchProjectError()
else:
project = projects[0]
print("Defaulting to cwd project", project.name)
return to_get
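The ref string built in `_ParseChangeIds` follows the Gerrit `refs/changes` sharding layout: the last two digits of the change number (zero-padded) form the shard directory. A quick check of the same expression:

```python
# Gerrit refs/changes layout: refs/changes/<last two digits>/<change>/
chg_id = 12345
refs = "refs/changes/%2.2d/%d/" % (chg_id % 100, chg_id)

# Small change numbers get zero-padded shards.
small = "refs/changes/%2.2d/%d/" % (7 % 100, 7)
```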
def ValidateOptions(self, opt, args):
if opt.record_origin:
if not opt.cherrypick:
self.OptionParser.error(
"-x only makes sense with --cherry-pick"
)
if opt.ffonly:
self.OptionParser.error(
"-x and --ff are mutually exclusive options"
)
def Execute(self, opt, args): def Execute(self, opt, args):
for project, change_id, ps_id in self._ParseChangeIds(opt, args): try:
dl = project.DownloadPatchSet(change_id, ps_id) self._ExecuteHelper(opt, args)
if not dl: except Exception as e:
print('[%s] change %d/%d not found' if isinstance(e, RepoExitError):
% (project.name, change_id, ps_id), raise e
file=sys.stderr) raise DownloadCommandError(aggregate_errors=[e])
sys.exit(1)
if not opt.revert and not dl.commits: def _ExecuteHelper(self, opt, args):
print('[%s] change %d/%d has already been merged' for project, change_id, ps_id in self._ParseChangeIds(opt, args):
% (project.name, change_id, ps_id), dl = project.DownloadPatchSet(change_id, ps_id)
file=sys.stderr)
continue
if len(dl.commits) > 1: if not opt.revert and not dl.commits:
print('[%s] %d/%d depends on %d unmerged changes:' logger.error(
% (project.name, change_id, ps_id, len(dl.commits)), "[%s] change %d/%d has already been merged",
file=sys.stderr) project.name,
for c in dl.commits: change_id,
print(' %s' % (c), file=sys.stderr) ps_id,
)
continue
if opt.cherrypick: if len(dl.commits) > 1:
mode = 'cherry-pick' logger.error(
elif opt.revert: "[%s] %d/%d depends on %d unmerged changes:",
mode = 'revert' project.name,
elif opt.ffonly: change_id,
mode = 'fast-forward merge' ps_id,
else: len(dl.commits),
mode = 'checkout' )
for c in dl.commits:
print(" %s" % (c), file=sys.stderr)
# We'll combine the branch+checkout operation, but all the rest need a if opt.cherrypick:
# dedicated branch start. mode = "cherry-pick"
if opt.branch and mode != 'checkout': elif opt.revert:
project.StartBranch(opt.branch) mode = "revert"
elif opt.ffonly:
mode = "fast-forward merge"
else:
mode = "checkout"
try: # We'll combine the branch+checkout operation, but all the rest need
if opt.cherrypick: # a dedicated branch start.
project._CherryPick(dl.commit, ffonly=opt.ffonly, if opt.branch and mode != "checkout":
record_origin=opt.record_origin) project.StartBranch(opt.branch)
elif opt.revert:
project._Revert(dl.commit)
elif opt.ffonly:
project._FastForward(dl.commit, ffonly=True)
else:
if opt.branch:
project.StartBranch(opt.branch, revision=dl.commit)
else:
project._Checkout(dl.commit)
except GitError: try:
print('[%s] Could not complete the %s of %s' if opt.cherrypick:
% (project.name, mode, dl.commit), file=sys.stderr) project._CherryPick(
sys.exit(1) dl.commit,
ffonly=opt.ffonly,
record_origin=opt.record_origin,
)
elif opt.revert:
project._Revert(dl.commit)
elif opt.ffonly:
project._FastForward(dl.commit, ffonly=True)
else:
if opt.branch:
project.StartBranch(opt.branch, revision=dl.commit)
else:
project._Checkout(dl.commit)
except GitError:
logger.error(
"[%s] Could not complete the %s of %s",
project.name,
mode,
dl.commit,
)
raise


@@ -15,39 +15,43 @@
import errno
import functools
import io
import os
import re
import signal
import subprocess
import sys
from color import Coloring
from command import Command
from command import DEFAULT_LOCAL_JOBS
from command import MirrorSafeCommand
from error import ManifestInvalidRevisionError
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
_CAN_COLOR = [
"branch",
"diff",
"grep",
"log",
]
class ForallColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, "forall")
self.project = self.printer("project", attr="bold")
class Forall(Command, MirrorSafeCommand):
COMMON = False
helpSummary = "Run a shell command in each project"
helpUsage = """
%prog [<project>...] -c <command> [<arg>...]
%prog -r str1 [str2] ... -c <command> [<arg>...]
"""
helpDescription = """
Executes the same shell command in each project.
The -r option allows running the command only on projects matching The -r option allows running the command only on projects matching
@@ -125,236 +129,293 @@ terminal and are not redirected.
If -e is used, when a command exits unsuccessfully, '%prog' will abort
without iterating through the remaining projects.
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
@staticmethod
def _cmd_option(option, _opt_str, _value, parser):
setattr(parser.values, option.dest, list(parser.rargs))
while parser.rargs:
del parser.rargs[0]
def _Options(self, p):
p.add_option(
"-r",
"--regex",
dest="regex",
action="store_true",
help="execute the command only on projects matching regex or "
"wildcard expression",
)
p.add_option(
"-i",
"--inverse-regex",
dest="inverse_regex",
action="store_true",
help="execute the command only on projects not matching regex or "
"wildcard expression",
)
p.add_option(
"-g",
"--groups",
dest="groups",
help="execute the command only on projects matching the specified "
"groups",
)
p.add_option(
"-c",
"--command",
help="command (and arguments) to execute",
dest="command",
action="callback",
callback=self._cmd_option,
)
p.add_option(
"-e",
"--abort-on-errors",
dest="abort_on_errors",
action="store_true",
help="abort if a command exits unsuccessfully",
)
p.add_option(
"--ignore-missing",
action="store_true",
help="silently skip & do not exit non-zero due missing "
"checkouts",
)
g = p.get_option_group("--quiet")
g.add_option(
"-p",
dest="project_header",
action="store_true",
help="show project headers before output",
)
p.add_option(
"--interactive", action="store_true", help="force interactive usage"
)
def WantPager(self, opt):
return opt.project_header and opt.jobs == 1
def ValidateOptions(self, opt, args):
if not opt.command:
self.Usage()
def Execute(self, opt, args):
cmd = [opt.command[0]]
all_trees = not opt.this_manifest_only
shell = True
if re.compile(r"^[a-z0-9A-Z_/\.-]+$").match(cmd[0]):
shell = False
if shell:
cmd.append(cmd[0])
cmd.extend(opt.command[1:])
# Historically, forall operated interactively, and in serial. If the
# user has selected 1 job, then default to interacive mode.
if opt.jobs == 1:
opt.interactive = True
if opt.project_header and not shell and cmd[0] == "git":
# If this is a direct git command that can enable colorized
# output and the user prefers coloring, add --color into the
# command line because we are going to wrap the command into
# a pipe and git won't know coloring should activate.
#
for cn in cmd[1:]:
if not cn.startswith("-"):
break
else:
cn = None
if cn and cn in _CAN_COLOR:
class ColorCmd(Coloring):
def __init__(self, config, cmd):
Coloring.__init__(self, config, cmd)
if ColorCmd(self.manifest.manifestProject.config, cn).is_on:
cmd.insert(cmd.index(cn) + 1, "--color")
mirror = self.manifest.IsMirror
smart_sync_manifest_name = "smart_sync_override.xml"
smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, smart_sync_manifest_name
)
if os.path.isfile(smart_sync_manifest_path):
self.manifest.Override(smart_sync_manifest_path)
if opt.regex:
projects = self.FindProjects(args, all_manifests=all_trees)
elif opt.inverse_regex:
projects = self.FindProjects(
args, inverse=True, all_manifests=all_trees
)
else:
projects = self.GetProjects(
args, groups=opt.groups, all_manifests=all_trees
)
os.environ["REPO_COUNT"] = str(len(projects))
def _ProcessResults(_pool, _output, results):
rc = 0
first = True
for r, output in results:
if output:
if first:
first = False
elif opt.project_header:
print()
# To simplify the DoWorkWrapper, take care of automatic
# newlines.
end = "\n"
if output[-1] == "\n":
end = ""
print(output, end=end)
rc = rc or r
if r != 0 and opt.abort_on_errors:
raise Exception("Aborting due to previous error")
return rc
try:
config = self.manifest.manifestProject.config
with self.ParallelContext():
self.get_parallel_context()["projects"] = projects
rc = self.ExecuteInParallel(
opt.jobs,
functools.partial(
self.DoWorkWrapper, mirror, opt, cmd, shell, config
),
range(len(projects)),
callback=_ProcessResults,
ordered=True,
initializer=self.InitWorker,
chunksize=1,
)
except (KeyboardInterrupt, WorkerKeyboardInterrupt):
# Catch KeyboardInterrupt raised inside and outside of workers
rc = errno.EINTR
except Exception as e:
# Catch any other exceptions raised
logger.error(
"forall: unhandled error, terminating the pool: %s: %s",
type(e).__name__,
e,
)
rc = getattr(e, "errno", 1)
if rc != 0:
sys.exit(rc)
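The `for`/`else` scan in `Execute` above is a compact way to find the first non-flag argument (the git subcommand): the `else` branch only runs if the loop never hits `break`. A self-contained sketch with a hypothetical command line:

```python
# Find the first argument after "git" that is not a flag; if every
# argument is a flag, the loop's else-branch leaves cn as None.
cmd = ["git", "--no-pager", "log", "--oneline"]
for cn in cmd[1:]:
    if not cn.startswith("-"):
        break
else:
    cn = None
```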
@classmethod
def InitWorker(cls):
signal.signal(signal.SIGINT, signal.SIG_IGN)
@classmethod
def DoWorkWrapper(cls, mirror, opt, cmd, shell, config, project_idx):
"""A wrapper around the DoWork() method.
Catch the KeyboardInterrupt exceptions here and re-raise them as a
different, ``Exception``-based exception to stop it flooding the console
with stacktraces and making the parent hang indefinitely.
"""
project = cls.get_parallel_context()["projects"][project_idx]
try:
return DoWork(project, mirror, opt, cmd, shell, project_idx, config)
except KeyboardInterrupt:
print("%s: Worker interrupted" % project.name)
raise WorkerKeyboardInterrupt()
class WorkerKeyboardInterrupt(Exception):
"""Keyboard interrupt exception for worker processes."""
def DoWork(project, mirror, opt, cmd, shell, cnt, config):
env = os.environ.copy()
def setenv(name, val):
if val is None:
val = ""
env[name] = val
setenv("REPO_PROJECT", project.name)
setenv("REPO_OUTERPATH", project.manifest.path_prefix)
setenv("REPO_INNERPATH", project.relpath)
setenv("REPO_PATH", project.RelPath(local=opt.this_manifest_only))
setenv("REPO_REMOTE", project.remote.name)
try:
# If we aren't in a fully synced state and we don't have the ref the
# manifest wants, then this will fail. Ignore it for the purposes of
# this code.
lrev = "" if mirror else project.GetRevisionId()
except ManifestInvalidRevisionError:
lrev = ""
setenv("REPO_LREV", lrev)
setenv("REPO_RREV", project.revisionExpr)
setenv("REPO_UPSTREAM", project.upstream)
setenv("REPO_DEST_BRANCH", project.dest_branch)
setenv("REPO_I", str(cnt + 1))
for annotation in project.annotations:
setenv("REPO__%s" % (annotation.name), annotation.value)
if mirror:
setenv("GIT_DIR", project.gitdir)
cwd = project.gitdir
else:
cwd = project.worktree
if not os.path.exists(cwd):
# Allow the user to silently ignore missing checkouts so they can run on
# partial checkouts (good for infra recovery tools).
if opt.ignore_missing:
return (0, "")
output = ""
if (opt.project_header and opt.verbose) or not opt.project_header:
output = "skipping %s/" % project.RelPath(
local=opt.this_manifest_only
)
return (1, output)
if opt.verbose:
stderr = subprocess.STDOUT
else:
stderr = subprocess.DEVNULL
stdin = None if opt.interactive else subprocess.DEVNULL
result = subprocess.run(
cmd,
cwd=cwd,
shell=shell,
env=env,
check=False,
encoding="utf-8",
errors="replace",
stdin=stdin,
stdout=subprocess.PIPE,
stderr=stderr,
)
output = result.stdout
if opt.project_header:
if output:
buf = io.StringIO()
out = ForallColoring(config)
out.redirect(buf)
if mirror:
project_header_path = project.name
else:
project_header_path = project.RelPath(
local=opt.this_manifest_only
)
out.project("project %s/" % project_header_path)
out.nl()
buf.write(output)
output = buf.getvalue()
return (result.returncode, output)
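`DoWork`'s `setenv` helper normalizes `None` to an empty string so child processes always see each `REPO_*` variable defined. A minimal sketch with hypothetical values:

```python
import os

# None values become empty strings, mirroring DoWork's setenv helper;
# the project name and upstream here are illustrative, not real data.
env = os.environ.copy()

def setenv(name, val):
    if val is None:
        val = ""
    env[name] = val

setenv("REPO_PROJECT", "platform/build")
setenv("REPO_UPSTREAM", None)
```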


@@ -1,46 +0,0 @@
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from command import Command, GitcClientCommand
import platform_utils
class GitcDelete(Command, GitcClientCommand):
COMMON = True
visible_everywhere = False
helpSummary = "Delete a GITC Client."
helpUsage = """
%prog
"""
helpDescription = """
This subcommand deletes the current GITC client, deleting the GITC manifest
and all locally downloaded sources.
"""
def _Options(self, p):
p.add_option('-f', '--force',
dest='force', action='store_true',
help='force the deletion (no prompt)')
def Execute(self, opt, args):
if not opt.force:
prompt = ('This will delete GITC client: %s\nAre you sure? (yes/no) ' %
self.gitc_manifest.gitc_client_name)
response = input(prompt).lower()
if not response == 'yes':
print('Response was not "yes"\n Exiting...')
sys.exit(1)
platform_utils.rmtree(self.gitc_manifest.gitc_client_dir)


@@ -1,76 +0,0 @@
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
import gitc_utils
from command import GitcAvailableCommand
from manifest_xml import GitcManifest
from subcmds import init
import wrapper
class GitcInit(init.Init, GitcAvailableCommand):
COMMON = True
MULTI_MANIFEST_SUPPORT = False
helpSummary = "Initialize a GITC Client."
helpUsage = """
%prog [options] [client name]
"""
helpDescription = """
The '%prog' command is ran to initialize a new GITC client for use
with the GITC file system.
This command will setup the client directory, initialize repo, just
like repo init does, and then downloads the manifest collection
and installs it in the .repo/directory of the GITC client.
Once this is done, a GITC manifest is generated by pulling the HEAD
SHA for each project and generates the properly formatted XML file
and installs it as .manifest in the GITC client directory.
The -c argument is required to specify the GITC client name.
The optional -f argument can be used to specify the manifest file to
use for this GITC client.
"""
def _Options(self, p):
super()._Options(p, gitc_init=True)
def Execute(self, opt, args):
gitc_client = gitc_utils.parse_clientdir(os.getcwd())
if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client):
print('fatal: Please update your repo command. See go/gitc for instructions.',
file=sys.stderr)
sys.exit(1)
self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client)
super().Execute(opt, args)
manifest_file = self.manifest.manifestFile
if opt.manifest_file:
if not os.path.exists(opt.manifest_file):
print('fatal: Specified manifest file %s does not exist.' %
opt.manifest_file)
sys.exit(1)
manifest_file = opt.manifest_file
manifest = GitcManifest(self.repodir, os.path.join(self.client_dir,
'.manifest'))
manifest.Override(manifest_file)
gitc_utils.generate_gitc_manifest(None, manifest)
print('Please run `cd %s` to view your GITC client.' %
os.path.join(wrapper.Wrapper().GITC_FS_ROOT_DIR, gitc_client))


@@ -14,27 +14,51 @@
import functools
import sys
from typing import NamedTuple
from color import Coloring
from command import DEFAULT_LOCAL_JOBS
from command import PagedCommand
from error import GitError
from error import InvalidArgumentsError
from error import SilentRepoExitError
from git_command import GitCommand
from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class GrepColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, "grep")
self.project = self.printer("project", attr="bold")
self.fail = self.printer("fail", fg="red")
class ExecuteOneResult(NamedTuple):
"""Result from an execute instance."""
project_idx: int
rc: int
stdout: str
stderr: str
error: GitError
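An illustrative re-creation of the `ExecuteOneResult` record above, showing how a per-project grep outcome would be bundled (the `error` field is typed as a plain `Exception` here instead of repo's `GitError`, and the sample values are made up):

```python
from typing import NamedTuple

# Sketch of ExecuteOneResult; field order matches the definition above.
class ExecuteOneResult(NamedTuple):
    project_idx: int
    rc: int
    stdout: str
    stderr: str
    error: Exception

# A hypothetical "no matches" outcome for project index 0.
result = ExecuteOneResult(0, 1, "", "no matches", None)
```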
class GrepCommandError(SilentRepoExitError):
"""Grep command failure. Since Grep command
output already outputs errors ensure that
aggregate errors exit silently."""
class Grep(PagedCommand): class Grep(PagedCommand):
COMMON = True COMMON = True
helpSummary = "Print lines matching a pattern" helpSummary = "Print lines matching a pattern"
helpUsage = """ helpUsage = """
%prog {pattern | -e pattern} [<project>...] %prog {pattern | -e pattern} [<project>...]
""" """
helpDescription = """ helpDescription = """
Search for the specified patterns in all project files. Search for the specified patterns in all project files.
# Boolean Options # Boolean Options
@ -62,215 +86,324 @@ contain a line that matches both expressions:
repo grep --all-match -e NODE -e Unexpected repo grep --all-match -e NODE -e Unexpected
""" """
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
@staticmethod @staticmethod
def _carry_option(_option, opt_str, value, parser): def _carry_option(_option, opt_str, value, parser):
pt = getattr(parser.values, 'cmd_argv', None) pt = getattr(parser.values, "cmd_argv", None)
if pt is None: if pt is None:
pt = [] pt = []
setattr(parser.values, 'cmd_argv', pt) setattr(parser.values, "cmd_argv", pt)
if opt_str == '-(': if opt_str == "-(":
pt.append('(') pt.append("(")
elif opt_str == '-)': elif opt_str == "-)":
pt.append(')') pt.append(")")
else: else:
pt.append(opt_str) pt.append(opt_str)
if value is not None: if value is not None:
pt.append(value) pt.append(value)
def _CommonOptions(self, p): def _CommonOptions(self, p):
"""Override common options slightly.""" """Override common options slightly."""
super()._CommonOptions(p, opt_v=False) super()._CommonOptions(p, opt_v=False)
def _Options(self, p): def _Options(self, p):
g = p.add_option_group('Sources') g = p.add_option_group("Sources")
g.add_option('--cached', g.add_option(
action='callback', callback=self._carry_option, "--cached",
help='Search the index, instead of the work tree') action="callback",
g.add_option('-r', '--revision', callback=self._carry_option,
dest='revision', action='append', metavar='TREEish', help="Search the index, instead of the work tree",
help='Search TREEish, instead of the work tree') )
g.add_option(
"-r",
"--revision",
dest="revision",
action="append",
metavar="TREEish",
help="Search TREEish, instead of the work tree",
)
g = p.add_option_group('Pattern') g = p.add_option_group("Pattern")
g.add_option('-e', g.add_option(
action='callback', callback=self._carry_option, "-e",
metavar='PATTERN', type='str', action="callback",
help='Pattern to search for') callback=self._carry_option,
g.add_option('-i', '--ignore-case', metavar="PATTERN",
action='callback', callback=self._carry_option, type="str",
help='Ignore case differences') help="Pattern to search for",
g.add_option('-a', '--text', )
action='callback', callback=self._carry_option, g.add_option(
help="Process binary files as if they were text") "-i",
g.add_option('-I', "--ignore-case",
action='callback', callback=self._carry_option, action="callback",
help="Don't match the pattern in binary files") callback=self._carry_option,
g.add_option('-w', '--word-regexp', help="Ignore case differences",
action='callback', callback=self._carry_option, )
help='Match the pattern only at word boundaries') g.add_option(
g.add_option('-v', '--invert-match', "-a",
action='callback', callback=self._carry_option, "--text",
help='Select non-matching lines') action="callback",
g.add_option('-G', '--basic-regexp', callback=self._carry_option,
action='callback', callback=self._carry_option, help="Process binary files as if they were text",
help='Use POSIX basic regexp for patterns (default)') )
g.add_option('-E', '--extended-regexp', g.add_option(
action='callback', callback=self._carry_option, "-I",
help='Use POSIX extended regexp for patterns') action="callback",
g.add_option('-F', '--fixed-strings', callback=self._carry_option,
action='callback', callback=self._carry_option, help="Don't match the pattern in binary files",
help='Use fixed strings (not regexp) for pattern') )
g.add_option(
"-w",
"--word-regexp",
action="callback",
callback=self._carry_option,
help="Match the pattern only at word boundaries",
)
g.add_option(
"-v",
"--invert-match",
action="callback",
callback=self._carry_option,
help="Select non-matching lines",
)
g.add_option(
"-G",
"--basic-regexp",
action="callback",
callback=self._carry_option,
help="Use POSIX basic regexp for patterns (default)",
)
g.add_option(
"-E",
"--extended-regexp",
action="callback",
callback=self._carry_option,
help="Use POSIX extended regexp for patterns",
)
g.add_option(
"-F",
"--fixed-strings",
action="callback",
callback=self._carry_option,
help="Use fixed strings (not regexp) for pattern",
)
g = p.add_option_group('Pattern Grouping') g = p.add_option_group("Pattern Grouping")
g.add_option('--all-match', g.add_option(
action='callback', callback=self._carry_option, "--all-match",
help='Limit match to lines that have all patterns') action="callback",
g.add_option('--and', '--or', '--not', callback=self._carry_option,
action='callback', callback=self._carry_option, help="Limit match to lines that have all patterns",
help='Boolean operators to combine patterns') )
g.add_option('-(', '-)', g.add_option(
action='callback', callback=self._carry_option, "--and",
help='Boolean operator grouping') "--or",
"--not",
action="callback",
callback=self._carry_option,
help="Boolean operators to combine patterns",
)
g.add_option(
"-(",
"-)",
action="callback",
callback=self._carry_option,
help="Boolean operator grouping",
)
g = p.add_option_group('Output') g = p.add_option_group("Output")
g.add_option('-n', g.add_option(
action='callback', callback=self._carry_option, "-n",
help='Prefix the line number to matching lines') action="callback",
g.add_option('-C', callback=self._carry_option,
action='callback', callback=self._carry_option, help="Prefix the line number to matching lines",
metavar='CONTEXT', type='str', )
help='Show CONTEXT lines around match') g.add_option(
g.add_option('-B', "-C",
action='callback', callback=self._carry_option, action="callback",
metavar='CONTEXT', type='str', callback=self._carry_option,
help='Show CONTEXT lines before match') metavar="CONTEXT",
g.add_option('-A', type="str",
action='callback', callback=self._carry_option, help="Show CONTEXT lines around match",
metavar='CONTEXT', type='str', )
help='Show CONTEXT lines after match') g.add_option(
g.add_option('-l', '--name-only', '--files-with-matches', "-B",
action='callback', callback=self._carry_option, action="callback",
help='Show only file names containing matching lines') callback=self._carry_option,
g.add_option('-L', '--files-without-match', metavar="CONTEXT",
action='callback', callback=self._carry_option, type="str",
help='Show only file names not containing matching lines') help="Show CONTEXT lines before match",
)
g.add_option(
"-A",
action="callback",
callback=self._carry_option,
metavar="CONTEXT",
type="str",
help="Show CONTEXT lines after match",
)
g.add_option(
"-l",
"--name-only",
"--files-with-matches",
action="callback",
callback=self._carry_option,
help="Show only file names containing matching lines",
)
g.add_option(
"-L",
"--files-without-match",
action="callback",
callback=self._carry_option,
help="Show only file names not containing matching lines",
)
def _ExecuteOne(self, cmd_argv, project): @classmethod
"""Process one project.""" def _ExecuteOne(cls, cmd_argv, project_idx):
try: """Process one project."""
p = GitCommand(project, project = cls.get_parallel_context()["projects"][project_idx]
cmd_argv, try:
bare=False, p = GitCommand(
capture_stdout=True, project,
capture_stderr=True) cmd_argv,
except GitError as e: bare=False,
return (project, -1, None, str(e)) capture_stdout=True,
capture_stderr=True,
verify_command=True,
)
except GitError as e:
return ExecuteOneResult(project_idx, -1, None, str(e), e)
return (project, p.Wait(), p.stdout, p.stderr) try:
error = None
rc = p.Wait()
except GitError as e:
rc = 1
error = e
return ExecuteOneResult(project_idx, rc, p.stdout, p.stderr, error)
@staticmethod @staticmethod
def _ProcessResults(full_name, have_rev, opt, _pool, out, results): def _ProcessResults(
git_failed = False full_name, have_rev, opt, projects, _pool, out, results
bad_rev = False ):
have_match = False git_failed = False
_RelPath = lambda p: p.RelPath(local=opt.this_manifest_only) bad_rev = False
have_match = False
_RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)
errors = []
for project, rc, stdout, stderr in results: for result in results:
if rc < 0: project = projects[result.project_idx]
git_failed = True if result.rc < 0:
out.project('--- project %s ---' % _RelPath(project)) git_failed = True
out.nl() out.project("--- project %s ---" % _RelPath(project))
out.fail('%s', stderr) out.nl()
out.nl() out.fail("%s", result.stderr)
continue out.nl()
errors.append(result.error)
continue
if rc: if result.rc:
# no results # no results
if stderr: if result.stderr:
if have_rev and 'fatal: ambiguous argument' in stderr: if (
bad_rev = True have_rev
else: and "fatal: ambiguous argument" in result.stderr
out.project('--- project %s ---' % _RelPath(project)) ):
out.nl() bad_rev = True
out.fail('%s', stderr.strip()) else:
out.nl() out.project("--- project %s ---" % _RelPath(project))
continue out.nl()
have_match = True out.fail("%s", result.stderr.strip())
out.nl()
if result.error is not None:
errors.append(result.error)
continue
have_match = True
# We cut the last element, to avoid a blank line. # We cut the last element, to avoid a blank line.
r = stdout.split('\n') r = result.stdout.split("\n")
r = r[0:-1] r = r[0:-1]
if have_rev and full_name: if have_rev and full_name:
for line in r: for line in r:
rev, line = line.split(':', 1) rev, line = line.split(":", 1)
out.write("%s", rev) out.write("%s", rev)
out.write(':') out.write(":")
out.project(_RelPath(project)) out.project(_RelPath(project))
out.write('/') out.write("/")
out.write("%s", line) out.write("%s", line)
out.nl() out.nl()
elif full_name: elif full_name:
for line in r: for line in r:
out.project(_RelPath(project)) out.project(_RelPath(project))
out.write('/') out.write("/")
out.write("%s", line) out.write("%s", line)
out.nl() out.nl()
else: else:
for line in r: for line in r:
print(line) print(line)
return (git_failed, bad_rev, have_match) return (git_failed, bad_rev, have_match, errors)
def Execute(self, opt, args): def Execute(self, opt, args):
out = GrepColoring(self.manifest.manifestProject.config) out = GrepColoring(self.manifest.manifestProject.config)
cmd_argv = ['grep'] cmd_argv = ["grep"]
if out.is_on: if out.is_on:
cmd_argv.append('--color') cmd_argv.append("--color")
cmd_argv.extend(getattr(opt, 'cmd_argv', [])) cmd_argv.extend(getattr(opt, "cmd_argv", []))
if '-e' not in cmd_argv: if "-e" not in cmd_argv:
if not args: if not args:
self.Usage() self.Usage()
cmd_argv.append('-e') cmd_argv.append("-e")
cmd_argv.append(args[0]) cmd_argv.append(args[0])
args = args[1:] args = args[1:]
projects = self.GetProjects(args, all_manifests=not opt.this_manifest_only) projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
full_name = False full_name = False
if len(projects) > 1: if len(projects) > 1:
cmd_argv.append('--full-name') cmd_argv.append("--full-name")
full_name = True full_name = True
have_rev = False have_rev = False
if opt.revision: if opt.revision:
if '--cached' in cmd_argv: if "--cached" in cmd_argv:
print('fatal: cannot combine --cached and --revision', file=sys.stderr) msg = "fatal: cannot combine --cached and --revision"
sys.exit(1) logger.error(msg)
have_rev = True raise InvalidArgumentsError(msg)
cmd_argv.extend(opt.revision) have_rev = True
cmd_argv.append('--') cmd_argv.extend(opt.revision)
cmd_argv.append("--")
git_failed, bad_rev, have_match = self.ExecuteInParallel( with self.ParallelContext():
opt.jobs, self.get_parallel_context()["projects"] = projects
functools.partial(self._ExecuteOne, cmd_argv), git_failed, bad_rev, have_match, errors = self.ExecuteInParallel(
projects, opt.jobs,
callback=functools.partial(self._ProcessResults, full_name, have_rev, opt), functools.partial(self._ExecuteOne, cmd_argv),
output=out, range(len(projects)),
ordered=True) callback=functools.partial(
self._ProcessResults, full_name, have_rev, opt, projects
),
output=out,
ordered=True,
chunksize=1,
)
if git_failed: if git_failed:
sys.exit(1) raise GrepCommandError(
elif have_match: "error: git failures", aggregate_errors=errors
sys.exit(0) )
elif have_rev and bad_rev: elif have_match:
for r in opt.revision: sys.exit(0)
print("error: can't search revision %s" % r, file=sys.stderr) elif have_rev and bad_rev:
sys.exit(1) for r in opt.revision:
else: logger.error("error: can't search revision %s", r)
sys.exit(1) raise GrepCommandError(aggregate_errors=errors)

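The grep rework above has the parallel workers receive an index into a shared project list instead of a pickled Project object, and each worker reports back through a NamedTuple. A minimal sketch of that pattern, with invented project names and a thread pool standing in for repo's parallel context (none of this is repo's actual API):

```python
from multiprocessing.dummy import Pool  # thread pool keeps the sketch portable
from typing import NamedTuple, Optional

# Hypothetical stand-in for the shared parallel context: workers receive
# only an index and look the project up, instead of pickling the object.
PROJECTS = ["platform/build", "platform/art", "tools/repo"]


class ExecuteOneResult(NamedTuple):
    """Per-project outcome, mirroring the NamedTuple the diff adds."""

    project_idx: int
    rc: int
    stdout: Optional[str]
    error: Optional[str]


def execute_one(project_idx: int) -> ExecuteOneResult:
    project = PROJECTS[project_idx]
    try:
        # Placeholder for running GitCommand(...) and waiting on it.
        return ExecuteOneResult(project_idx, 0, f"searched {project}", None)
    except Exception as e:
        return ExecuteOneResult(project_idx, -1, None, str(e))


with Pool(2) as pool:
    results = pool.map(execute_one, range(len(PROJECTS)))
failures = [r for r in results if r.rc < 0]
```

Passing only small integers to the pool keeps the per-task payload tiny, which matters when a large project list would otherwise be serialized once per job.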
View File

@@ -16,165 +16,178 @@ import re
import sys
import textwrap

from color import Coloring
from command import MirrorSafeCommand
from command import PagedCommand
from error import RepoExitError
from subcmds import all_commands
from wrapper import Wrapper


class InvalidHelpCommand(RepoExitError):
    """Invalid command passed into help."""


class Help(PagedCommand, MirrorSafeCommand):
    COMMON = False
    helpSummary = "Display detailed help on a command"
    helpUsage = """
%prog [--all|command]
"""
    helpDescription = """
Displays detailed usage information about a command.
"""

    def _PrintCommands(self, commandNames):
        """Helper to display |commandNames| summaries."""
        maxlen = 0
        for name in commandNames:
            maxlen = max(maxlen, len(name))
        fmt = "  %%-%ds  %%s" % maxlen

        for name in commandNames:
            command = all_commands[name]()
            try:
                summary = command.helpSummary.strip()
            except AttributeError:
                summary = ""
            print(fmt % (name, summary))

    def _PrintAllCommands(self):
        print("usage: repo COMMAND [ARGS]")
        self.PrintAllCommandsBody()

    def PrintAllCommandsBody(self):
        print("The complete list of recognized repo commands is:")
        commandNames = list(sorted(all_commands))
        self._PrintCommands(commandNames)
        print(
            "See 'repo help <command>' for more information on a "
            "specific command."
        )
        print("Bug reports:", Wrapper().BUG_URL)

    def _PrintCommonCommands(self):
        print("usage: repo COMMAND [ARGS]")
        self.PrintCommonCommandsBody()

    def PrintCommonCommandsBody(self):
        print("The most commonly used repo commands are:")

        commandNames = list(
            sorted(
                name for name, command in all_commands.items() if command.COMMON
            )
        )
        self._PrintCommands(commandNames)

        print(
            "See 'repo help <command>' for more information on a specific "
            "command.\nSee 'repo help --all' for a complete list of recognized "
            "commands."
        )
        print("Bug reports:", Wrapper().BUG_URL)

    def _PrintCommandHelp(self, cmd, header_prefix=""):
        class _Out(Coloring):
            def __init__(self, gc):
                Coloring.__init__(self, gc, "help")
                self.heading = self.printer("heading", attr="bold")
                self._first = True

            def _PrintSection(self, heading, bodyAttr):
                try:
                    body = getattr(cmd, bodyAttr)
                except AttributeError:
                    return
                if body == "" or body is None:
                    return

                if not self._first:
                    self.nl()
                self._first = False

                self.heading("%s%s", header_prefix, heading)
                self.nl()
                self.nl()

                me = "repo %s" % cmd.NAME
                body = body.strip()
                body = body.replace("%prog", me)

                # Extract the title, but skip any trailing {#anchors}.
                asciidoc_hdr = re.compile(r"^\n?#+ ([^{]+)(\{#.+\})?$")
                for para in body.split("\n\n"):
                    if para.startswith("  "):
                        self.write("%s", para)
                        self.nl()
                        self.nl()
                        continue

                    m = asciidoc_hdr.match(para)
                    if m:
                        self.heading("%s%s", header_prefix, m.group(1))
                        self.nl()
                        self.nl()
                        continue

                    lines = textwrap.wrap(
                        para.replace("  ", " "),
                        width=80,
                        break_long_words=False,
                        break_on_hyphens=False,
                    )
                    for line in lines:
                        self.write("%s", line)
                        self.nl()
                    self.nl()

        out = _Out(self.client.globalConfig)
        out._PrintSection("Summary", "helpSummary")
        cmd.OptionParser.print_help()
        out._PrintSection("Description", "helpDescription")

    def _PrintAllCommandHelp(self):
        for name in sorted(all_commands):
            cmd = all_commands[name](manifest=self.manifest)
            self._PrintCommandHelp(cmd, header_prefix=f"[{name}] ")

    def _Options(self, p):
        p.add_option(
            "-a",
            "--all",
            dest="show_all",
            action="store_true",
            help="show the complete list of commands",
        )
        p.add_option(
            "--help-all",
            dest="show_all_help",
            action="store_true",
            help="show the --help of all commands",
        )

    def Execute(self, opt, args):
        if len(args) == 0:
            if opt.show_all_help:
                self._PrintAllCommandHelp()
            elif opt.show_all:
                self._PrintAllCommands()
            else:
                self._PrintCommonCommands()
        elif len(args) == 1:
            name = args[0]

            try:
                cmd = all_commands[name](manifest=self.manifest)
            except KeyError:
                print(
                    "repo: '%s' is not a repo command." % name, file=sys.stderr
                )
                raise InvalidHelpCommand(name)

            self._PrintCommandHelp(cmd)
        else:
            self._PrintCommandHelp(self)

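The help formatter above reflows each paragraph with `textwrap.wrap`, disabling `break_long_words` and `break_on_hyphens` so long tokens such as URLs survive the 80-column wrap intact. A quick illustration of that behavior (the paragraph text is invented):

```python
import textwrap

para = (
    "Displays detailed usage information about a command,  wrapping each "
    "paragraph at 80 columns while keeping long tokens such as "
    "https://gerrit-review.googlesource.com/c/git-repo intact."
)
# Collapse doubled spaces first, as _PrintSection does, then wrap.
lines = textwrap.wrap(
    para.replace("  ", " "),
    width=80,
    break_long_words=False,
    break_on_hyphens=False,
)
```

Because whole-word chunks are never split, the URL always lands on a single output line; only a chunk longer than the width itself could exceed 80 columns.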
View File

@@ -14,209 +14,243 @@
import optparse

from color import Coloring
from command import PagedCommand
from git_refs import R_HEADS
from git_refs import R_M


class _Coloring(Coloring):
    def __init__(self, config):
        Coloring.__init__(self, config, "status")


class Info(PagedCommand):
    COMMON = True
    helpSummary = (
        "Get info on the manifest branch, current branch or unmerged branches"
    )
    helpUsage = "%prog [-dl] [-o [-c]] [<project>...]"

    def _Options(self, p):
        p.add_option(
            "-d",
            "--diff",
            dest="all",
            action="store_true",
            help="show full info and commit diff including remote branches",
        )
        p.add_option(
            "-o",
            "--overview",
            dest="overview",
            action="store_true",
            help="show overview of all local commits",
        )
        p.add_option(
            "-c",
            "--current-branch",
            dest="current_branch",
            action="store_true",
            help="consider only checked out branches",
        )
        p.add_option(
            "--no-current-branch",
            dest="current_branch",
            action="store_false",
            help="consider all local branches",
        )
        # Turn this into a warning & remove this someday.
        p.add_option(
            "-b",
            dest="current_branch",
            action="store_true",
            help=optparse.SUPPRESS_HELP,
        )
        p.add_option(
            "-l",
            "--local-only",
            dest="local",
            action="store_true",
            help="disable all remote operations",
        )

    def Execute(self, opt, args):
        self.out = _Coloring(self.client.globalConfig)
        self.heading = self.out.printer("heading", attr="bold")
        self.headtext = self.out.nofmt_printer("headtext", fg="yellow")
        self.redtext = self.out.printer("redtext", fg="red")
        self.sha = self.out.printer("sha", fg="yellow")
        self.text = self.out.nofmt_printer("text")
        self.dimtext = self.out.printer("dimtext", attr="dim")

        self.opt = opt

        if not opt.this_manifest_only:
            self.manifest = self.manifest.outer_client
        manifestConfig = self.manifest.manifestProject.config
        mergeBranch = manifestConfig.GetBranch("default").merge
        manifestGroups = self.manifest.GetGroupsStr()

        self.heading("Manifest branch: ")
        if self.manifest.default.revisionExpr:
            self.headtext(self.manifest.default.revisionExpr)
        self.out.nl()
        self.heading("Manifest merge branch: ")
        # The manifest might not have a merge branch if it isn't in a git repo,
        # e.g. if `repo init --standalone-manifest` is used.
        self.headtext(mergeBranch or "")
        self.out.nl()
        self.heading("Manifest groups: ")
        self.headtext(manifestGroups)
        self.out.nl()

        self.printSeparator()

        if not opt.overview:
            self._printDiffInfo(opt, args)
        else:
            self._printCommitOverview(opt, args)

    def printSeparator(self):
        self.text("----------------------------")
        self.out.nl()

    def _printDiffInfo(self, opt, args):
        # We let exceptions bubble up to main as they'll be well structured.
        projs = self.GetProjects(args, all_manifests=not opt.this_manifest_only)

        for p in projs:
            self.heading("Project: ")
            self.headtext(p.name)
            self.out.nl()

            self.heading("Mount path: ")
            self.headtext(p.worktree)
            self.out.nl()

            self.heading("Current revision: ")
            self.headtext(p.GetRevisionId())
            self.out.nl()

            currentBranch = p.CurrentBranch
            if currentBranch:
                self.heading("Current branch: ")
                self.headtext(currentBranch)
                self.out.nl()

            self.heading("Manifest revision: ")
            self.headtext(p.revisionExpr)
            self.out.nl()

            localBranches = list(p.GetBranches().keys())
            self.heading("Local Branches: ")
            self.redtext(str(len(localBranches)))
            if localBranches:
                self.text(" [")
                self.text(", ".join(localBranches))
                self.text("]")
            self.out.nl()

            if self.opt.all:
                self.findRemoteLocalDiff(p)

            self.printSeparator()

    def findRemoteLocalDiff(self, project):
        # Fetch all the latest commits.
        if not self.opt.local:
            project.Sync_NetworkHalf(quiet=True, current_branch_only=True)

        branch = self.manifest.manifestProject.config.GetBranch("default").merge
        if branch.startswith(R_HEADS):
            branch = branch[len(R_HEADS) :]
        logTarget = R_M + branch

        bareTmp = project.bare_git._bare
        project.bare_git._bare = False
        localCommits = project.bare_git.rev_list(
            "--abbrev=8",
            "--abbrev-commit",
            "--pretty=oneline",
            logTarget + "..",
            "--",
        )

        originCommits = project.bare_git.rev_list(
            "--abbrev=8",
            "--abbrev-commit",
            "--pretty=oneline",
            ".." + logTarget,
            "--",
        )
        project.bare_git._bare = bareTmp

        self.heading("Local Commits: ")
        self.redtext(str(len(localCommits)))
        self.dimtext(" (on current branch)")
        self.out.nl()

        for c in localCommits:
            split = c.split()
            self.sha(split[0] + " ")
            self.text(" ".join(split[1:]))
            self.out.nl()

        self.printSeparator()
        self.heading("Remote Commits: ")
        self.redtext(str(len(originCommits)))
        self.out.nl()

        for c in originCommits:
            split = c.split()
            self.sha(split[0] + " ")
            self.text(" ".join(split[1:]))
            self.out.nl()

    def _printCommitOverview(self, opt, args):
        all_branches = []
        for project in self.GetProjects(
            args, all_manifests=not opt.this_manifest_only
        ):
            br = [project.GetUploadableBranch(x) for x in project.GetBranches()]
            br = [x for x in br if x]
            if self.opt.current_branch:
                br = [x for x in br if x.name == project.CurrentBranch]
            all_branches.extend(br)

        if not all_branches:
            return

        self.out.nl()
        self.heading("Projects Overview")
        project = None

        for branch in all_branches:
            if project != branch.project:
                project = branch.project
                self.out.nl()
                self.headtext(project.RelPath(local=opt.this_manifest_only))
                self.out.nl()

            commits = branch.commits
            date = branch.date
            self.text(
                "%s %-33s (%2d commit%s, %s)"
                % (
                    branch.name == project.CurrentBranch and "*" or " ",
                    branch.name,
                    len(commits),
                    len(commits) != 1 and "s" or "",
                    date,
                )
            )
            self.out.nl()

            for commit in commits:
                split = commit.split()
                self.text(f"{'':38}{'-'} ")
                self.sha(split[0] + " ")
                self.text(" ".join(split[1:]))
                self.out.nl()

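The findRemoteLocalDiff rewrite above keeps the same ref arithmetic: strip a refs/heads/ prefix from the configured merge branch, then prepend the R_M namespace to build the rev-list log target. Sketched in isolation (the constant values mirror git_refs; the branch names are invented):

```python
# Mirrors git_refs.R_HEADS and git_refs.R_M from the code above.
R_HEADS = "refs/heads/"
R_M = "refs/remotes/m/"


def merge_log_target(branch: str) -> str:
    """Build the refs/remotes/m/<branch> target used for rev-list ranges."""
    if branch.startswith(R_HEADS):
        branch = branch[len(R_HEADS) :]
    return R_M + branch
```

The prefix check makes the helper safe to call whether the config stores a full ref or a bare branch name.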
View File

@@ -16,19 +16,29 @@ import os
import sys

from color import Coloring
from command import InteractiveCommand
from command import MirrorSafeCommand
from error import RepoUnhandledExceptionError
from error import UpdateManifestError
from git_command import git_require
from repo_logging import RepoLogger
from wrapper import Wrapper
from wrapper import WrapperDir


logger = RepoLogger(__file__)

_REPO_ALLOW_SHALLOW = os.environ.get("REPO_ALLOW_SHALLOW")

class Init(InteractiveCommand, MirrorSafeCommand):
    COMMON = True
    MULTI_MANIFEST_SUPPORT = True
    helpSummary = "Initialize a repo client checkout in the current directory"
    helpUsage = """
%prog [options] [manifest url]
"""
    helpDescription = """
The '%prog' command is run once to install and initialize repo.
The latest repo source code and manifest collection is downloaded
from the server and is installed in the .repo/ directory in the
@@ -42,6 +52,10 @@ The optional -b argument can be used to select the manifest branch
to checkout and use. If no branch is specified, the remote's default
branch is used. This is equivalent to using -b HEAD.

The optional --manifest-upstream-branch argument can be used when a commit is
provided to --manifest-branch (or -b), to specify the name of the git ref in
which the commit can be found.

The optional -m argument can be used to specify an alternate manifest
to be used. If no manifest is specified, the manifest default.xml
will be used.
@@ -77,243 +91,320 @@ manifest, a subsequent `repo sync` (or `repo sync -d`) is necessary
to update the working directory files.
"""
    def _CommonOptions(self, p):
        """Disable due to re-use of Wrapper()."""

    def _Options(self, p):
        Wrapper().InitParser(p)
        m = p.add_option_group("Multi-manifest")
        m.add_option(
            "--outer-manifest",
            action="store_true",
            default=True,
            help="operate starting at the outermost manifest",
        )
        m.add_option(
            "--no-outer-manifest",
            dest="outer_manifest",
            action="store_false",
            help="do not operate on outer manifests",
        )
        m.add_option(
            "--this-manifest-only",
            action="store_true",
            default=None,
            help="only operate on this (sub)manifest",
        )
        m.add_option(
            "--no-this-manifest-only",
            "--all-manifests",
            dest="this_manifest_only",
            action="store_false",
            help="operate on this manifest and its submanifests",
        )

    def _RegisteredEnvironmentOptions(self):
        return {
            "REPO_MANIFEST_URL": "manifest_url",
            "REPO_MIRROR_LOCATION": "reference",
        }
    def _SyncManifest(self, opt):
        """Call manifestProject.Sync with arguments from opt.

        Args:
            opt: options from optparse.
        """
        # Normally this value is set when instantiating the project, but the
        # manifest project is special and is created when instantiating the
        # manifest which happens before we parse options.
        self.manifest.manifestProject.clone_depth = opt.manifest_depth
        self.manifest.manifestProject.upstream = opt.manifest_upstream_branch
        clone_filter_for_depth = (
            "blob:none" if (_REPO_ALLOW_SHALLOW == "0") else None
        )
        if not self.manifest.manifestProject.Sync(
            manifest_url=opt.manifest_url,
            manifest_branch=opt.manifest_branch,
            standalone_manifest=opt.standalone_manifest,
            groups=opt.groups,
            platform=opt.platform,
            mirror=opt.mirror,
            dissociate=opt.dissociate,
            reference=opt.reference,
            worktree=opt.worktree,
            submodules=opt.submodules,
            archive=opt.archive,
            partial_clone=opt.partial_clone,
            clone_filter=opt.clone_filter,
            partial_clone_exclude=opt.partial_clone_exclude,
            clone_filter_for_depth=clone_filter_for_depth,
            clone_bundle=opt.clone_bundle,
            git_lfs=opt.git_lfs,
            use_superproject=opt.use_superproject,
            verbose=opt.verbose,
            current_branch_only=opt.current_branch_only,
            tags=opt.tags,
            depth=opt.depth,
            git_event_log=self.git_event_log,
            manifest_name=opt.manifest_name,
        ):
            manifest_name = opt.manifest_name
            raise UpdateManifestError(
                f"Unable to sync manifest {manifest_name}"
            )
    def _Prompt(self, prompt, value):
        print("%-10s [%s]: " % (prompt, value), end="", flush=True)
        a = sys.stdin.readline().strip()
        if a == "":
            return value
        return a
    def _ShouldConfigureUser(self, opt, existing_checkout):
        gc = self.client.globalConfig
        mp = self.manifest.manifestProject

        # If we don't have local settings, get from global.
        if not mp.config.Has("user.name") or not mp.config.Has("user.email"):
            if not gc.Has("user.name") or not gc.Has("user.email"):
                return True

            mp.config.SetString("user.name", gc.GetString("user.name"))
            mp.config.SetString("user.email", gc.GetString("user.email"))

        if not opt.quiet and not existing_checkout or opt.verbose:
            print()
            print(
                "Your identity is: %s <%s>"
                % (
                    mp.config.GetString("user.name"),
                    mp.config.GetString("user.email"),
                )
            )
            print(
                "If you want to change this, please re-run 'repo init' with "
                "--config-name"
            )
        return False
    def _ConfigureUser(self, opt):
        mp = self.manifest.manifestProject

        while True:
            if not opt.quiet:
                print()
            name = self._Prompt("Your Name", mp.UserName)
            email = self._Prompt("Your Email", mp.UserEmail)

            if not opt.quiet:
                print()
            print(f"Your identity is: {name} <{email}>")
            print("is this correct [y/N]? ", end="", flush=True)
            a = sys.stdin.readline().strip().lower()
            if a in ("yes", "y", "t", "true"):
                break

        if name != mp.UserName:
            mp.config.SetString("user.name", name)
        if email != mp.UserEmail:
            mp.config.SetString("user.email", email)

    def _HasColorSet(self, gc):
        for n in ["ui", "diff", "status"]:
            if gc.Has("color.%s" % n):
                return True
        return False

    def _ConfigureColor(self):
        gc = self.client.globalConfig
        if self._HasColorSet(gc):
            return

        class _Test(Coloring):
            def __init__(self):
                Coloring.__init__(self, gc, "test color display")
                self._on = True

        out = _Test()

        print()
        print("Testing colorized output (for 'repo diff', 'repo status'):")

        for c in ["black", "red", "green", "yellow", "blue", "magenta", "cyan"]:
            out.write(" ")
            out.printer(fg=c)(" %-6s ", c)
        out.write(" ")
        out.printer(fg="white", bg="black")(" %s " % "white")
        out.nl()

        for c in ["bold", "dim", "ul", "reverse"]:
            out.write(" ")
            out.printer(fg="black", attr=c)(" %-6s ", c)
        out.nl()

        print(
            "Enable color display in this user account (y/N)? ",
            end="",
            flush=True,
        )
        a = sys.stdin.readline().strip().lower()
        if a in ("y", "yes", "t", "true", "on"):
            gc.SetString("color.ui", "auto")

    def _DisplayResult(self):
        if self.manifest.IsMirror:
            init_type = "mirror "
        else:
            init_type = ""

        print()
        print(
            "repo %shas been initialized in %s"
            % (init_type, self.manifest.topdir)
        )

        current_dir = os.getcwd()
        if current_dir != self.manifest.topdir:
            print(
                "If this is not the directory in which you want to initialize "
                "repo, please run:"
            )
            print(" rm -r %s" % os.path.join(self.manifest.topdir, ".repo"))
            print("and try again.")
    def ValidateOptions(self, opt, args):
        if opt.reference:
            opt.reference = os.path.expanduser(opt.reference)

        # Check this here, else manifest will be tagged "not new" and init
        # won't be possible anymore without removing the .repo/manifests
        # directory.
        if opt.mirror:
            if opt.archive:
                self.OptionParser.error(
                    "--mirror and --archive cannot be used together."
                )
            if opt.use_superproject is not None:
                self.OptionParser.error(
                    "--mirror and --use-superproject cannot be "
                    "used together."
                )
        if opt.archive and opt.use_superproject is not None:
            self.OptionParser.error(
                "--archive and --use-superproject cannot be used together."
            )

        if opt.standalone_manifest and (
            opt.manifest_branch or opt.manifest_name != "default.xml"
        ):
            self.OptionParser.error(
                "--manifest-branch and --manifest-name cannot"
                " be used with --standalone-manifest."
            )

        if opt.manifest_upstream_branch and opt.manifest_branch is None:
            self.OptionParser.error(
                "--manifest-upstream-branch cannot be used without "
                "--manifest-branch."
            )

        if args:
            if opt.manifest_url:
                self.OptionParser.error(
                    "--manifest-url option and URL argument both specified: "
                    "only use one to select the manifest URL."
                )

            opt.manifest_url = args.pop(0)

            if args:
                self.OptionParser.error("too many arguments to init")
    def Execute(self, opt, args):
        wrapper = Wrapper()

        reqs = wrapper.Requirements.from_dir(WrapperDir())
        git_require(reqs.get_hard_ver("git"), fail=True)
        min_git_version_soft = reqs.get_soft_ver("git")
        if not git_require(min_git_version_soft):
            logger.warning(
                "repo: warning: git-%s+ will soon be required; "
                "please upgrade your version of git to maintain "
                "support.",
                ".".join(str(x) for x in min_git_version_soft),
            )

        rp = self.manifest.repoProject

        # Handle new --repo-url requests.
        if opt.repo_url:
            remote = rp.GetRemote("origin")
            remote.url = opt.repo_url
            remote.Save()

        # Handle new --repo-rev requests.
        if opt.repo_rev:
            try:
                remote_ref, rev = wrapper.check_repo_rev(
                    rp.worktree,
                    opt.repo_rev,
                    repo_verify=opt.repo_verify,
                    quiet=opt.quiet,
                )
            except wrapper.CloneFailure as e:
                err_msg = "fatal: double check your --repo-rev setting."
                logger.error(err_msg)
                self.git_event_log.ErrorEvent(err_msg)
                raise RepoUnhandledExceptionError(e)

            branch = rp.GetBranch("default")
            branch.merge = remote_ref
            rp.work_git.reset("--hard", rev)
            branch.Save()

        if opt.worktree:
            # Older versions of git supported worktree, but had dangerous gc
            # bugs.
            git_require((2, 15, 0), fail=True, msg="git gc worktree corruption")

        # Provide a short notice that we're reinitializing an existing checkout.
        # Sometimes developers might not realize that they're in one, or that
        # repo doesn't do nested checkouts.
        existing_checkout = self.manifest.manifestProject.Exists
        if not opt.quiet and existing_checkout:
            print(
                "repo: reusing existing repo client checkout in",
                self.manifest.topdir,
            )

        self._SyncManifest(opt)

        if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
            if opt.config_name or self._ShouldConfigureUser(
                opt, existing_checkout
            ):
                self._ConfigureUser(opt)
            self._ConfigureColor()

        if not opt.quiet:
            self._DisplayResult()


@@ -14,17 +14,18 @@
import os

from command import Command
from command import MirrorSafeCommand

class List(Command, MirrorSafeCommand):
    COMMON = True
    helpSummary = "List projects and their associated directories"
    helpUsage = """
%prog [-f] [<project>...]
%prog [-f] -r str1 [str2]...
"""
    helpDescription = """
List all projects; pass '.' to list the project for the cwd.

By default, only projects that currently exist in the checkout are shown. If
@@ -35,69 +36,103 @@ groups, then also pass --groups all.

This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
"""
    def _Options(self, p):
        p.add_option(
            "-r",
            "--regex",
            dest="regex",
            action="store_true",
            help="filter the project list based on regex or wildcard matching "
            "of strings",
        )
        p.add_option(
            "-g",
            "--groups",
            dest="groups",
            help="filter the project list based on the groups the project is "
            "in",
        )
        p.add_option(
            "-a",
            "--all",
            action="store_true",
            help="show projects regardless of checkout state",
        )
        p.add_option(
            "-n",
            "--name-only",
            dest="name_only",
            action="store_true",
            help="display only the name of the repository",
        )
        p.add_option(
            "-p",
            "--path-only",
            dest="path_only",
            action="store_true",
            help="display only the path of the repository",
        )
        p.add_option(
            "-f",
            "--fullpath",
            dest="fullpath",
            action="store_true",
            help="display the full work tree path instead of the relative path",
        )
        p.add_option(
            "--relative-to",
            metavar="PATH",
            help="display paths relative to this one (default: top of repo "
            "client checkout)",
        )
    def ValidateOptions(self, opt, args):
        if opt.fullpath and opt.name_only:
            self.OptionParser.error("cannot combine -f and -n")

        # Resolve any symlinks so the output is stable.
        if opt.relative_to:
            opt.relative_to = os.path.realpath(opt.relative_to)
    def Execute(self, opt, args):
        """List all projects and the associated directories.

        This may be possible to do with 'repo forall', but repo newbies have
        trouble figuring that out. The idea here is that it should be more
        discoverable.

        Args:
            opt: The options.
            args: Positional args. Can be a list of projects to list, or empty.
        """
        if not opt.regex:
            projects = self.GetProjects(
                args,
                groups=opt.groups,
                missing_ok=opt.all,
                all_manifests=not opt.this_manifest_only,
            )
        else:
            projects = self.FindProjects(
                args, all_manifests=not opt.this_manifest_only
            )

        def _getpath(x):
            if opt.fullpath:
                return x.worktree
            if opt.relative_to:
                return os.path.relpath(x.worktree, opt.relative_to)
            return x.RelPath(local=opt.this_manifest_only)

        lines = []
        for project in projects:
            if opt.name_only and not opt.path_only:
                lines.append("%s" % (project.name))
            elif opt.path_only and not opt.name_only:
                lines.append("%s" % (_getpath(project)))
            else:
                lines.append(f"{_getpath(project)} : {project.name}")

        if lines:
            lines.sort()
            print("\n".join(lines))


@@ -17,15 +17,19 @@ import os
import sys

from command import PagedCommand
from repo_logging import RepoLogger


logger = RepoLogger(__file__)

class Manifest(PagedCommand):
    COMMON = False
    helpSummary = "Manifest inspection utility"
    helpUsage = """
%prog [-o {-|NAME.xml}] [-m MANIFEST.xml] [-r]
"""
    _helpDescription = """

With the -o option, exports the current manifest for inspection.
The manifest and (if present) local_manifests/ are combined
@@ -40,92 +44,136 @@ when the manifest was generated. The 'dest-branch' attribute is set
to indicate the remote ref to push changes to via 'repo upload'.
"""
    @property
    def helpDescription(self):
        helptext = self._helpDescription + "\n"
        r = os.path.dirname(__file__)
        r = os.path.dirname(r)
        with open(os.path.join(r, "docs", "manifest-format.md")) as fd:
            for line in fd:
                helptext += line
        return helptext
    def _Options(self, p):
        p.add_option(
            "-r",
            "--revision-as-HEAD",
            dest="peg_rev",
            action="store_true",
            help="save revisions as current HEAD",
        )
        p.add_option(
            "-m",
            "--manifest-name",
            help="temporary manifest to use for this sync",
            metavar="NAME.xml",
        )
        p.add_option(
            "--suppress-upstream-revision",
            dest="peg_rev_upstream",
            default=True,
            action="store_false",
            help="if in -r mode, do not write the upstream field "
            "(only of use if the branch names for a sha1 manifest are "
            "sensitive)",
        )
        p.add_option(
            "--suppress-dest-branch",
            dest="peg_rev_dest_branch",
            default=True,
            action="store_false",
            help="if in -r mode, do not write the dest-branch field "
            "(only of use if the branch names for a sha1 manifest are "
            "sensitive)",
        )
        p.add_option(
            "--json",
            default=False,
            action="store_true",
            help="output manifest in JSON format (experimental)",
        )
        p.add_option(
            "--pretty",
            default=False,
            action="store_true",
            help="format output for humans to read",
        )
        p.add_option(
            "--no-local-manifests",
            default=False,
            action="store_true",
            dest="ignore_local_manifests",
            help="ignore local manifests",
        )
        p.add_option(
            "-o",
            "--output-file",
            dest="output_file",
            default="-",
            help="file to save the manifest to. (Filename prefix for "
            "multi-tree.)",
            metavar="-|NAME.xml",
        )
def _Output(self, opt): def _Output(self, opt):
# If alternate manifest is specified, override the manifest file that we're using. # If alternate manifest is specified, override the manifest file that
if opt.manifest_name: # we're using.
self.manifest.Override(opt.manifest_name, False) if opt.manifest_name:
self.manifest.Override(opt.manifest_name, False)
for manifest in self.ManifestList(opt): for manifest in self.ManifestList(opt):
output_file = opt.output_file output_file = opt.output_file
if output_file == '-': if output_file == "-":
fd = sys.stdout fd = sys.stdout
else: else:
if manifest.path_prefix: if manifest.path_prefix:
output_file = f'{opt.output_file}:{manifest.path_prefix.replace("/", "%2f")}' output_file = (
fd = open(output_file, 'w') f"{opt.output_file}:"
f'{manifest.path_prefix.replace("/", "%2f")}'
)
fd = open(output_file, "w")
manifest.SetUseLocalManifests(not opt.ignore_local_manifests) manifest.SetUseLocalManifests(not opt.ignore_local_manifests)
if opt.json: if opt.json:
print('warning: --json is experimental!', file=sys.stderr) logger.warning("warning: --json is experimental!")
doc = manifest.ToDict(peg_rev=opt.peg_rev, doc = manifest.ToDict(
peg_rev_upstream=opt.peg_rev_upstream, peg_rev=opt.peg_rev,
peg_rev_dest_branch=opt.peg_rev_dest_branch) peg_rev_upstream=opt.peg_rev_upstream,
peg_rev_dest_branch=opt.peg_rev_dest_branch,
)
json_settings = { json_settings = {
# JSON style guide says Uunicode characters are fully allowed. # JSON style guide says Unicode characters are fully
'ensure_ascii': False, # allowed.
# We use 2 space indent to match JSON style guide. "ensure_ascii": False,
'indent': 2 if opt.pretty else None, # We use 2 space indent to match JSON style guide.
'separators': (',', ': ') if opt.pretty else (',', ':'), "indent": 2 if opt.pretty else None,
'sort_keys': True, "separators": (",", ": ") if opt.pretty else (",", ":"),
} "sort_keys": True,
fd.write(json.dumps(doc, **json_settings)) }
else: fd.write(json.dumps(doc, **json_settings))
manifest.Save(fd, else:
peg_rev=opt.peg_rev, manifest.Save(
peg_rev_upstream=opt.peg_rev_upstream, fd,
peg_rev_dest_branch=opt.peg_rev_dest_branch) peg_rev=opt.peg_rev,
if output_file != '-': peg_rev_upstream=opt.peg_rev_upstream,
fd.close() peg_rev_dest_branch=opt.peg_rev_dest_branch,
if manifest.path_prefix: )
print(f'Saved {manifest.path_prefix} submanifest to {output_file}', if output_file != "-":
file=sys.stderr) fd.close()
else: if manifest.path_prefix:
print(f'Saved manifest to {output_file}', file=sys.stderr) logger.warning(
"Saved %s submanifest to %s",
manifest.path_prefix,
output_file,
)
else:
logger.warning("Saved manifest to %s", output_file)
def ValidateOptions(self, opt, args):
if args:
self.Usage()
def ValidateOptions(self, opt, args): def Execute(self, opt, args):
if args: self._Output(opt)
self.Usage()
def Execute(self, opt, args):
self._Output(opt)


@@ -19,12 +19,12 @@ from command import PagedCommand

class Overview(PagedCommand):
    COMMON = True
    helpSummary = "Display overview of unmerged project branches"
    helpUsage = """
%prog [--current-branch] [<project>...]
"""
    helpDescription = """
The '%prog' command is used to display an overview of the projects branches,
and list any local commits that have not yet been merged into the project.
@@ -33,59 +33,77 @@ branches currently checked out in each project. By default, all branches
are displayed.
"""
    def _Options(self, p):
        p.add_option(
            "-c",
            "--current-branch",
            dest="current_branch",
            action="store_true",
            help="consider only checked out branches",
        )
        p.add_option(
            "--no-current-branch",
            dest="current_branch",
            action="store_false",
            help="consider all local branches",
        )
        # Turn this into a warning & remove this someday.
        p.add_option(
            "-b",
            dest="current_branch",
            action="store_true",
            help=optparse.SUPPRESS_HELP,
        )
    def Execute(self, opt, args):
        all_branches = []
        for project in self.GetProjects(
            args, all_manifests=not opt.this_manifest_only
        ):
            br = [project.GetUploadableBranch(x) for x in project.GetBranches()]
            br = [x for x in br if x]
            if opt.current_branch:
                br = [x for x in br if x.name == project.CurrentBranch]
            all_branches.extend(br)

        if not all_branches:
            return

        class Report(Coloring):
            def __init__(self, config):
                Coloring.__init__(self, config, "status")
                self.project = self.printer("header", attr="bold")
                self.text = self.printer("text")

        out = Report(all_branches[0].project.config)
        out.text("Deprecated. See repo info -o.")
        out.nl()
        out.project("Projects Overview")
        out.nl()

        project = None

        for branch in all_branches:
            if project != branch.project:
                project = branch.project
                out.nl()
                out.project(
                    "project %s/"
                    % project.RelPath(local=opt.this_manifest_only)
                )
                out.nl()

            commits = branch.commits
            date = branch.date
            print(
                "%s %-33s (%2d commit%s, %s)"
                % (
                    branch.name == project.CurrentBranch and "*" or " ",
                    branch.name,
                    len(commits),
                    len(commits) != 1 and "s" or " ",
                    date,
                )
            )
            for commit in commits:
                print("%-35s - %s" % ("", commit))


@@ -15,67 +15,83 @@
import itertools

from color import Coloring
from command import DEFAULT_LOCAL_JOBS
from command import PagedCommand


class Prune(PagedCommand):
    COMMON = True
    helpSummary = "Prune (delete) already merged topics"
    helpUsage = """
%prog [<project>...]
"""
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

    @classmethod
    def _ExecuteOne(cls, project_idx):
        """Process one project."""
        project = cls.get_parallel_context()["projects"][project_idx]
        return project.PruneHeads()

    def Execute(self, opt, args):
        projects = self.GetProjects(
            args, all_manifests=not opt.this_manifest_only
        )

        # NB: Should be able to refactor this module to display summary as
        # results come back from children.
        def _ProcessResults(_pool, _output, results):
            return list(itertools.chain.from_iterable(results))

        with self.ParallelContext():
            self.get_parallel_context()["projects"] = projects
            all_branches = self.ExecuteInParallel(
                opt.jobs,
                self._ExecuteOne,
                range(len(projects)),
                callback=_ProcessResults,
                ordered=True,
            )

        if not all_branches:
            return

        class Report(Coloring):
            def __init__(self, config):
                Coloring.__init__(self, config, "status")
                self.project = self.printer("header", attr="bold")

        out = Report(all_branches[0].project.config)
        out.project("Pending Branches")
        out.nl()

        project = None

        for branch in all_branches:
            if project != branch.project:
                project = branch.project
                out.nl()
                out.project(
                    "project %s/"
                    % project.RelPath(local=opt.this_manifest_only)
                )
                out.nl()

            print(
                "%s %-33s "
                % (
                    branch.name == project.CurrentBranch and "*" or " ",
                    branch.name,
                ),
                end="",
            )

            if not branch.base_exists:
                print(f"(ignoring: tracking branch is gone: {branch.base})")
            else:
                commits = branch.commits
                date = branch.date
                print(
                    "(%2d commit%s, %s)"
                    % (len(commits), len(commits) != 1 and "s" or " ", date)
                )
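Several of the commands in this series move `ExecuteInParallel` from receiving `Project` objects to receiving `range(len(projects))`, with the project list stashed in a shared `ParallelContext`, so only small integer indices cross the process boundary. A minimal standalone sketch of that dispatch pattern, using plain `multiprocessing` with a hypothetical `_context` dict and `prune_all` helper rather than repo's actual API:

```python
import multiprocessing

# Hypothetical stand-in for repo's ParallelContext: a module-level dict that
# each worker process receives through the pool initializer, so jobs can be
# dispatched as small integer indices instead of pickling the heavyweight
# objects for every task.
_context = {}


def _init_worker(projects):
    # Runs once per worker; stashes the shared list in the worker's globals.
    _context["projects"] = projects


def _prune_one(project_idx):
    # Look the object up by index; only the int crosses the pipe per task.
    project = _context["projects"][project_idx]
    return f"pruned {project}"


def prune_all(projects, jobs=2):
    with multiprocessing.Pool(
        jobs, initializer=_init_worker, initargs=(projects,)
    ) as pool:
        # map() preserves input order, mirroring ordered=True above.
        return pool.map(_prune_one, range(len(projects)))


if __name__ == "__main__":
    print(prune_all(["foo", "bar", "baz"]))
```

The payoff is the same as in the diff: the list is transferred once per worker instead of once per job, which matters when each element is an expensive-to-pickle object.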


@@ -17,149 +17,198 @@ import sys
from color import Coloring
from command import Command
from git_command import GitCommand
from repo_logging import RepoLogger


logger = RepoLogger(__file__)


class RebaseColoring(Coloring):
    def __init__(self, config):
        Coloring.__init__(self, config, "rebase")
        self.project = self.printer("project", attr="bold")
        self.fail = self.printer("fail", fg="red")


class Rebase(Command):
    COMMON = True
    helpSummary = "Rebase local branches on upstream branch"
    helpUsage = """
%prog {[<project>...] | -i <project>...}
"""
    helpDescription = """
'%prog' uses git rebase to move local changes in the current topic branch to
the HEAD of the upstream history, useful when you have made commits in a topic
branch but need to incorporate new upstream changes "underneath" them.
"""

    def _Options(self, p):
        g = p.get_option_group("--quiet")
        g.add_option(
            "-i",
            "--interactive",
            dest="interactive",
            action="store_true",
            help="interactive rebase (single project only)",
        )

        p.add_option(
            "--fail-fast",
            dest="fail_fast",
            action="store_true",
            help="stop rebasing after first error is hit",
        )
        p.add_option(
            "-f",
            "--force-rebase",
            dest="force_rebase",
            action="store_true",
            help="pass --force-rebase to git rebase",
        )
        p.add_option(
            "--no-ff",
            dest="ff",
            default=True,
            action="store_false",
            help="pass --no-ff to git rebase",
        )
        p.add_option(
            "--autosquash",
            dest="autosquash",
            action="store_true",
            help="pass --autosquash to git rebase",
        )
        p.add_option(
            "--whitespace",
            dest="whitespace",
            action="store",
            metavar="WS",
            help="pass --whitespace to git rebase",
        )
        p.add_option(
            "--auto-stash",
            dest="auto_stash",
            action="store_true",
            help="stash local modifications before starting",
        )
        p.add_option(
            "-m",
            "--onto-manifest",
            dest="onto_manifest",
            action="store_true",
            help="rebase onto the manifest version instead of upstream "
            "HEAD (this helps to make sure the local tree stays "
            "consistent if you previously synced to a manifest)",
        )

    def Execute(self, opt, args):
        all_projects = self.GetProjects(
            args, all_manifests=not opt.this_manifest_only
        )
        one_project = len(all_projects) == 1

        if opt.interactive and not one_project:
            logger.error(
                "error: interactive rebase not supported with multiple projects"
            )

            if len(args) == 1:
                logger.warning(
                    "note: project %s is mapped to more than one path", args[0]
                )

            return 1

        # Setup the common git rebase args that we use for all projects.
        common_args = ["rebase"]
        if opt.whitespace:
            common_args.append("--whitespace=%s" % opt.whitespace)
        if opt.quiet:
            common_args.append("--quiet")
        if opt.force_rebase:
            common_args.append("--force-rebase")
        if not opt.ff:
            common_args.append("--no-ff")
        if opt.autosquash:
            common_args.append("--autosquash")
        if opt.interactive:
            common_args.append("-i")

        config = self.manifest.manifestProject.config
        out = RebaseColoring(config)
        out.redirect(sys.stdout)
        _RelPath = lambda p: p.RelPath(local=opt.this_manifest_only)

        ret = 0
        for project in all_projects:
            if ret and opt.fail_fast:
                break

            cb = project.CurrentBranch
            if not cb:
                if one_project:
                    logger.error(
                        "error: project %s has a detached HEAD",
                        _RelPath(project),
                    )
                    return 1
                # Ignore branches with detached HEADs.
                continue

            upbranch = project.GetBranch(cb)
            if not upbranch.LocalMerge:
                if one_project:
                    logger.error(
                        "error: project %s does not track any remote branches",
                        _RelPath(project),
                    )
                    return 1
                # Ignore branches without remotes.
                continue

            args = common_args[:]
            if opt.onto_manifest:
                args.append("--onto")
                args.append(project.revisionExpr)

            args.append(upbranch.LocalMerge)

            out.project(
                "project %s: rebasing %s -> %s",
                _RelPath(project),
                cb,
                upbranch.LocalMerge,
            )
            out.nl()
            out.flush()

            needs_stash = False
            if opt.auto_stash:
                stash_args = ["update-index", "--refresh", "-q"]

                if GitCommand(project, stash_args).Wait() != 0:
                    needs_stash = True
                    # Dirty index, requires stash...
                    stash_args = ["stash"]

                    if GitCommand(project, stash_args).Wait() != 0:
                        ret += 1
                        continue

            if GitCommand(project, args).Wait() != 0:
                ret += 1
                continue

            if needs_stash:
                stash_args.append("pop")
                stash_args.append("--quiet")
                if GitCommand(project, stash_args).Wait() != 0:
                    ret += 1

        if ret:
            msg_fmt = "%d projects had errors"
            self.git_event_log.ErrorEvent(msg_fmt % (ret), msg_fmt)
            out.fail(msg_fmt, ret)
            out.nl()

        return ret


@@ -12,21 +12,30 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import optparse

from command import Command
from command import MirrorSafeCommand
from error import RepoExitError
from repo_logging import RepoLogger
from subcmds.sync import _PostRepoFetch
from subcmds.sync import _PostRepoUpgrade


logger = RepoLogger(__file__)


class SelfupdateError(RepoExitError):
    """Exit error for failed selfupdate command."""


class Selfupdate(Command, MirrorSafeCommand):
    COMMON = False
    helpSummary = "Update repo to the latest version"
    helpUsage = """
%prog
"""
    helpDescription = """
The '%prog' command upgrades repo to the latest version, if a
newer version is available.
@@ -34,28 +43,34 @@ Normally this is done automatically by 'repo sync' and does not
need to be performed by an end-user.
"""

    def _Options(self, p):
        g = p.add_option_group("repo Version options")
        g.add_option(
            "--no-repo-verify",
            dest="repo_verify",
            default=True,
            action="store_false",
            help="do not verify repo source code",
        )
        g.add_option(
            "--repo-upgraded",
            dest="repo_upgraded",
            action="store_true",
            help=optparse.SUPPRESS_HELP,
        )

    def Execute(self, opt, args):
        rp = self.manifest.repoProject
        rp.PreSync()

        if opt.repo_upgraded:
            _PostRepoUpgrade(self.manifest)

        else:
            result = rp.Sync_NetworkHalf()
            if result.error:
                logger.error("error: can't update repo")
                raise SelfupdateError(aggregate_errors=[result.error])

            rp.bare_git.gc("--auto")
            _PostRepoFetch(rp, repo_verify=opt.repo_verify, verbose=True)


@@ -16,18 +16,18 @@ from subcmds.sync import Sync


class Smartsync(Sync):
    COMMON = True
    helpSummary = "Update working tree to the latest known good revision"
    helpUsage = """
%prog [<project>...]
"""
    helpDescription = """
The '%prog' command is a shortcut for sync -s.
"""

    def _Options(self, p):
        Sync._Options(self, p, show_smart=False)

    def Execute(self, opt, args):
        opt.smart_sync = True
        Sync.Execute(self, opt, args)


@@ -17,101 +17,118 @@ import sys
from color import Coloring
from command import InteractiveCommand
from git_command import GitCommand
from repo_logging import RepoLogger


logger = RepoLogger(__file__)


class _ProjectList(Coloring):
    def __init__(self, gc):
        Coloring.__init__(self, gc, "interactive")
        self.prompt = self.printer("prompt", fg="blue", attr="bold")
        self.header = self.printer("header", attr="bold")
        self.help = self.printer("help", fg="red", attr="bold")


class Stage(InteractiveCommand):
    COMMON = True
    helpSummary = "Stage file(s) for commit"
    helpUsage = """
%prog -i [<project>...]
"""
    helpDescription = """
The '%prog' command stages files to prepare the next commit.
"""

    def _Options(self, p):
        g = p.get_option_group("--quiet")
        g.add_option(
            "-i",
            "--interactive",
            dest="interactive",
            action="store_true",
            help="use interactive staging",
        )

    def Execute(self, opt, args):
        if opt.interactive:
            self._Interactive(opt, args)
        else:
            self.Usage()

    def _Interactive(self, opt, args):
        all_projects = [
            p
            for p in self.GetProjects(
                args, all_manifests=not opt.this_manifest_only
            )
            if p.IsDirty()
        ]
        if not all_projects:
            logger.error("no projects have uncommitted modifications")
            return

        out = _ProjectList(self.manifest.manifestProject.config)
        while True:
            out.header("        %s", "project")
            out.nl()

            for i in range(len(all_projects)):
                project = all_projects[i]
                out.write(
                    "%3d:    %s",
                    i + 1,
                    project.RelPath(local=opt.this_manifest_only) + "/",
                )
                out.nl()
            out.nl()

            out.write("%3d: (", 0)
            out.prompt("q")
            out.write("uit)")
            out.nl()

            out.prompt("project> ")
            out.flush()
            try:
                a = sys.stdin.readline()
            except KeyboardInterrupt:
                out.nl()
                break
            if a == "":
                out.nl()
                break

            a = a.strip()
            if a.lower() in ("q", "quit", "exit"):
                break
            if not a:
                continue

            try:
                a_index = int(a)
            except ValueError:
                a_index = None

            if a_index is not None:
                if a_index == 0:
                    break
                if 0 < a_index and a_index <= len(all_projects):
                    _AddI(all_projects[a_index - 1])
                    continue

            projects = [
                p
                for p in all_projects
                if a in [p.name, p.RelPath(local=opt.this_manifest_only)]
            ]
            if len(projects) == 1:
                _AddI(projects[0])
                continue
        print("Bye.")


def _AddI(project):
    p = GitCommand(project, ["add", "--interactive"], bare=False)
    p.Wait()


@@ -13,131 +13,146 @@
# limitations under the License.

import functools
from typing import NamedTuple

from command import Command
from command import DEFAULT_LOCAL_JOBS
from error import RepoExitError
from git_command import git
from git_config import IsImmutable
from progress import Progress
from repo_logging import RepoLogger


logger = RepoLogger(__file__)


class ExecuteOneResult(NamedTuple):
    project_idx: int
    error: Exception


class StartError(RepoExitError):
    """Exit error for failed start command."""


class Start(Command):
    COMMON = True
    helpSummary = "Start a new branch for development"
    helpUsage = """
%prog <newbranchname> [--all | <project>...]
"""
    helpDescription = """
'%prog' begins a new branch of development, starting from the
revision specified in the manifest.
"""
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

    def _Options(self, p):
        p.add_option(
            "--all",
            dest="all",
            action="store_true",
            help="begin branch in all projects",
        )
        p.add_option(
            "-r",
            "--rev",
            "--revision",
            dest="revision",
            help="point branch at this revision instead of upstream",
        )
        p.add_option(
            "--head",
            "--HEAD",
            dest="revision",
            action="store_const",
            const="HEAD",
            help="abbreviation for --rev HEAD",
        )

    def ValidateOptions(self, opt, args):
        if not args:
            self.Usage()

        nb = args[0]
        if not git.check_ref_format("heads/%s" % nb):
            self.OptionParser.error("'%s' is not a valid name" % nb)

    @classmethod
    def _ExecuteOne(cls, revision, nb, default_revisionExpr, project_idx):
        """Start one project."""
        # If the current revision is immutable, such as a SHA1, a tag or
        # a change, then we can't push back to it. Substitute with
        # dest_branch, if defined; or with manifest default revision instead.
        branch_merge = ""
        error = None
        project = cls.get_parallel_context()["projects"][project_idx]
        if IsImmutable(project.revisionExpr):
            if project.dest_branch:
                branch_merge = project.dest_branch
            else:
                branch_merge = default_revisionExpr

        try:
            project.StartBranch(
                nb, branch_merge=branch_merge, revision=revision
            )
        except Exception as e:
            logger.error("error: unable to checkout %s: %s", project.name, e)
            error = e
        return ExecuteOneResult(project_idx, error)

    def Execute(self, opt, args):
        nb = args[0]
        err_projects = []
        err = []
        projects = []
        if not opt.all:
            projects = args[1:]
            if len(projects) < 1:
                projects = ["."]  # start it in the local project by default

        all_projects = self.GetProjects(
            projects,
            all_manifests=not opt.this_manifest_only,
        )

        def _ProcessResults(_pool, pm, results):
            for result in results:
                if result.error:
                    project = all_projects[result.project_idx]
                    err_projects.append(project)
                    err.append(result.error)
                pm.update(msg="")

        with self.ParallelContext():
            self.get_parallel_context()["projects"] = all_projects
            self.ExecuteInParallel(
                opt.jobs,
                functools.partial(
                    self._ExecuteOne,
                    opt.revision,
                    nb,
                    self.manifest.default.revisionExpr,
                ),
                range(len(all_projects)),
                callback=_ProcessResults,
                output=Progress(
                    f"Starting {nb}", len(all_projects), quiet=opt.quiet
                ),
                chunksize=1,
            )

        if err_projects:
            for p in err_projects:
                logger.error(
                    "error: %s/: cannot start %s",
                    p.RelPath(local=opt.this_manifest_only),
                    nb,
                )
            msg_fmt = "cannot start %d project(s)"
            self.git_event_log.ErrorEvent(
                msg_fmt % (len(err_projects)), msg_fmt
            )
            raise StartError(aggregate_errors=err)
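The `ExecuteOneResult` NamedTuple in the change above carries the failing project's index plus the captured exception back to the parent, which maps indices to projects for reporting and raises one aggregated error at the end. A sketch of that aggregation pattern under stated assumptions: the `_start_one`/`start_all` names are hypothetical, and plain `multiprocessing` stands in for repo's `ExecuteInParallel`:

```python
import multiprocessing
from typing import NamedTuple, Optional


class ExecuteOneResult(NamedTuple):
    # Workers report an index plus any captured exception instead of raising
    # across the process boundary, so one failure doesn't kill the pool.
    project_idx: int
    error: Optional[Exception]


def _start_one(project_idx):
    # Hypothetical work function: fail for odd indices to show aggregation.
    if project_idx % 2:
        return ExecuteOneResult(project_idx, ValueError(f"bad {project_idx}"))
    return ExecuteOneResult(project_idx, None)


def start_all(projects, jobs=2):
    errors = {}
    with multiprocessing.Pool(jobs) as pool:
        for result in pool.imap_unordered(_start_one, range(len(projects))):
            if result.error:
                # The index survives pickling, so the parent can map it back
                # to the original object for reporting.
                errors[projects[result.project_idx]] = result.error
    return errors


if __name__ == "__main__":
    print(start_all(["p0", "p1", "p2"]))
```

Returning errors instead of raising keeps the remaining jobs running, mirroring how `start` now collects `err_projects` and raises a single `StartError` only after all projects have been attempted.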


@@ -17,19 +17,19 @@ import glob
import io
import os

from color import Coloring
from command import DEFAULT_LOCAL_JOBS
from command import PagedCommand
import platform_utils


class Status(PagedCommand):
    COMMON = True
    helpSummary = "Show the working tree status"
    helpUsage = """
%prog [<project>...]
"""
    helpDescription = """
'%prog' compares the working tree to the staging area (aka index),
and the most recent commit on this branch (HEAD), in each project
specified.  A summary is displayed, one line per file where there
@@ -76,109 +76,133 @@ the following meanings:
  d:  deleted    ( in index, not in work tree )

"""
    PARALLEL_JOBS = DEFAULT_LOCAL_JOBS

    def _Options(self, p):
        p.add_option(
            "-o",
            "--orphans",
            dest="orphans",
            action="store_true",
            help="include objects in working directory outside of repo "
            "projects",
        )

    @classmethod
    def _StatusHelper(cls, quiet, local, project_idx):
        """Obtains the status for a specific project.

        Obtains the status for a project, redirecting the output to
        the specified object.

        Args:
            quiet: Where to output the status.
            local: a boolean, if True, the path is relative to the local
                (sub)manifest.  If false, the path is relative to the outermost
                manifest.
            project_idx: Project index to get status of.

        Returns:
            The status of the project.
        """
        buf = io.StringIO()
        project = cls.get_parallel_context()["projects"][project_idx]
        ret = project.PrintWorkTreeStatus(
            quiet=quiet, output_redir=buf, local=local
        )
        return (ret, buf.getvalue())

    def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring):
        """find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'"""  # noqa: E501
        status_header = " --\t"
        for item in dirs:
            if not platform_utils.isdir(item):
                outstring.append("".join([status_header, item]))
                continue
            if item in proj_dirs:
                continue
            if item in proj_dirs_parents:
                self._FindOrphans(
                    glob.glob("%s/.*" % item) + glob.glob("%s/*" % item),
                    proj_dirs,
                    proj_dirs_parents,
                    outstring,
                )
                continue
            outstring.append("".join([status_header, item, "/"]))

    def Execute(self, opt, args):
        all_projects = self.GetProjects(
            args, all_manifests=not opt.this_manifest_only
        )

        def _ProcessResults(_pool, _output, results):
            ret = 0
            for state, output in results:
                if output:
                    print(output, end="")
                if state == "CLEAN":
                    ret += 1
            return ret

        with self.ParallelContext():
            self.get_parallel_context()["projects"] = all_projects
            counter = self.ExecuteInParallel(
                opt.jobs,
                functools.partial(
                    self._StatusHelper, opt.quiet, opt.this_manifest_only
                ),
                range(len(all_projects)),
                callback=_ProcessResults,
                ordered=True,
                chunksize=1,
            )

        if not opt.quiet and len(all_projects) == counter:
            print("nothing to commit (working directory clean)")

        if opt.orphans:
            proj_dirs = set()
            proj_dirs_parents = set()
            for project in self.GetProjects(
                None, missing_ok=True, all_manifests=not opt.this_manifest_only
            ):
                relpath = project.RelPath(local=opt.this_manifest_only)
                proj_dirs.add(relpath)
                (head, _tail) = os.path.split(relpath)
                while head != "":
                    proj_dirs_parents.add(head)
                    (head, _tail) = os.path.split(head)
            proj_dirs.add(".repo")

            class StatusColoring(Coloring):
                def __init__(self, config):
                    Coloring.__init__(self, config, "status")
                    self.project = self.printer("header", attr="bold")
self.untracked = self.printer('untracked', fg='red') self.untracked = self.printer("untracked", fg="red")
orig_path = os.getcwd() orig_path = os.getcwd()
try: try:
os.chdir(self.manifest.topdir) os.chdir(self.manifest.topdir)
outstring = [] outstring = []
self._FindOrphans(glob.glob('.*') + self._FindOrphans(
glob.glob('*'), glob.glob(".*") + glob.glob("*"),
proj_dirs, proj_dirs_parents, outstring) proj_dirs,
proj_dirs_parents,
outstring,
)
if outstring: if outstring:
output = StatusColoring(self.client.globalConfig) output = StatusColoring(self.client.globalConfig)
output.project('Objects not within a project (orphans)') output.project("Objects not within a project (orphans)")
output.nl() output.nl()
for entry in outstring: for entry in outstring:
output.untracked(entry) output.untracked(entry)
output.nl() output.nl()
else: else:
print('No orphan files or directories') print("No orphan files or directories")
finally: finally:
# Restore CWD. # Restore CWD.
os.chdir(orig_path) os.chdir(orig_path)
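The status change above shares the project list through a per-run parallel context and hands each worker only an integer index, so large Project objects are not pickled for every task. A minimal standalone sketch of that pattern (all names here are hypothetical, not repo's API):

```python
import multiprocessing

_context = {}  # stand-in for repo's per-worker parallel context


def _init_worker(projects):
    # Runs once per worker process: the heavy list crosses the process
    # boundary a single time instead of once per task.
    _context["projects"] = projects


def _status_helper(project_idx):
    # Each task ships only a small int; the worker looks the object up locally.
    return "status of " + _context["projects"][project_idx]


if __name__ == "__main__":
    projects = ["repo/a", "repo/b", "repo/c"]
    with multiprocessing.Pool(2, _init_worker, (projects,)) as pool:
        results = pool.map(_status_helper, range(len(projects)), chunksize=1)
```

`chunksize=1` mirrors the change above: once the per-task argument is a cheap index, fine-grained chunks keep one slow project from serializing the rest behind it.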

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -15,52 +15,55 @@
import platform
import sys

from command import Command
from command import MirrorSafeCommand
from git_command import git
from git_command import RepoSourceVersion
from git_command import user_agent
from git_refs import HEAD
from wrapper import Wrapper


class Version(Command, MirrorSafeCommand):
    wrapper_version = None
    wrapper_path = None

    COMMON = False
    helpSummary = "Display the version of repo"
    helpUsage = """
%prog
"""

    def Execute(self, opt, args):
        rp = self.manifest.repoProject
        rem = rp.GetRemote()
        branch = rp.GetBranch("default")

        # These might not be the same.  Report them both.
        src_ver = RepoSourceVersion()
        rp_ver = rp.bare_git.describe(HEAD)
        print(f"repo version {rp_ver}")
        print(f"       (from {rem.url})")
        print(f"       (tracking {branch.merge})")
        print(f"       ({rp.bare_git.log('-1', '--format=%cD', HEAD)})")

        if self.wrapper_path is not None:
            print(f"repo launcher version {self.wrapper_version}")
            print(f"       (from {self.wrapper_path})")

            if src_ver != rp_ver:
                print(f"       (currently at {src_ver})")

        print(f"repo User-Agent {user_agent.repo}")
        print(f"git {git.version_tuple().full}")
        print(f"git User-Agent {user_agent.git}")
        print(f"Python {sys.version}")
        uname = platform.uname()
        if sys.version_info.major < 3:
            # Python 3 returns a named tuple, but Python 2 is simpler.
            print(uname)
        else:
            print(f"OS {uname.system} {uname.release} ({uname.version})")
            processor = uname.processor if uname.processor else "unknown"
            print(f"CPU {uname.machine} ({processor})")
        print("Bug reports:", Wrapper().BUG_URL)


@@ -14,12 +14,70 @@
"""Common fixtures for pytests.""" """Common fixtures for pytests."""
import pathlib
import pytest import pytest
import platform_utils
import repo_trace import repo_trace
@pytest.fixture(autouse=True) @pytest.fixture(autouse=True)
def disable_repo_trace(tmp_path): def disable_repo_trace(tmp_path):
"""Set an environment marker to relax certain strict checks for test code.""" """Set an environment marker to relax certain strict checks for test code.""" # noqa: E501
repo_trace._TRACE_FILE = str(tmp_path / 'TRACE_FILE_from_test') repo_trace._TRACE_FILE = str(tmp_path / "TRACE_FILE_from_test")
# adapted from pytest-home 0.5.1
def _set_home(monkeypatch, path: pathlib.Path):
"""
Set the home dir using a pytest monkeypatch context.
"""
win = platform_utils.isWindows()
vars = ["HOME"] + win * ["USERPROFILE"]
for var in vars:
monkeypatch.setenv(var, str(path))
return path
# copied from
# https://github.com/pytest-dev/pytest/issues/363#issuecomment-1335631998
@pytest.fixture(scope="session")
def monkeysession():
with pytest.MonkeyPatch.context() as mp:
yield mp
@pytest.fixture(autouse=True, scope="session")
def session_tmp_home_dir(tmp_path_factory, monkeysession):
"""Set HOME to a temporary directory, avoiding user's .gitconfig.
b/302797407
Set home at session scope to take effect prior to
``test_wrapper.GitCheckoutTestCase.setUpClass``.
"""
return _set_home(monkeysession, tmp_path_factory.mktemp("home"))
# adapted from pytest-home 0.5.1
@pytest.fixture(autouse=True)
def tmp_home_dir(monkeypatch, tmp_path_factory):
"""Set HOME to a temporary directory.
Ensures that state doesn't accumulate in $HOME across tests.
Note that in conjunction with session_tmp_homedir, the HOME
dir is patched twice, once at session scope, and then again at
the function scope.
"""
return _set_home(monkeypatch, tmp_path_factory.mktemp("home"))
@pytest.fixture(autouse=True)
def setup_user_identity(monkeysession, scope="session"):
"""Set env variables for author and committer name and email."""
monkeysession.setenv("GIT_AUTHOR_NAME", "Foo Bar")
monkeysession.setenv("GIT_COMMITTER_NAME", "Foo Bar")
monkeysession.setenv("GIT_AUTHOR_EMAIL", "foo@bar.baz")
monkeysession.setenv("GIT_COMMITTER_EMAIL", "foo@bar.baz")


@@ -11,3 +11,11 @@
intk = 10k
intm = 10m
intg = 10g
[color "status"]
one = yellow
two = magenta cyan
three = black red ul
reset = reset
none
empty =

tests/test_color.py (new file, 74 lines)

@@ -0,0 +1,74 @@
# Copyright (C) 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the color.py module."""
import os
import unittest
import color
import git_config
def fixture(*paths):
"""Return a path relative to test/fixtures."""
return os.path.join(os.path.dirname(__file__), "fixtures", *paths)
class ColoringTests(unittest.TestCase):
"""tests of the Coloring class."""
def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture."""
config_fixture = fixture("test.gitconfig")
self.config = git_config.GitConfig(config_fixture)
color.SetDefaultColoring("true")
self.color = color.Coloring(self.config, "status")
def test_Color_Parse_all_params_none(self):
"""all params are None"""
val = self.color._parse(None, None, None, None)
self.assertEqual("", val)
def test_Color_Parse_first_parameter_none(self):
"""check fg & bg & attr"""
val = self.color._parse(None, "black", "red", "ul")
self.assertEqual("\x1b[4;30;41m", val)
def test_Color_Parse_one_entry(self):
"""check fg"""
val = self.color._parse("one", None, None, None)
self.assertEqual("\033[33m", val)
def test_Color_Parse_two_entry(self):
"""check fg & bg"""
val = self.color._parse("two", None, None, None)
self.assertEqual("\033[35;46m", val)
def test_Color_Parse_three_entry(self):
"""check fg & bg & attr"""
val = self.color._parse("three", None, None, None)
self.assertEqual("\033[4;30;41m", val)
def test_Color_Parse_reset_entry(self):
"""check reset entry"""
val = self.color._parse("reset", None, None, None)
self.assertEqual("\033[m", val)
def test_Color_Parse_empty_entry(self):
"""check empty entry"""
val = self.color._parse("none", "blue", "white", "dim")
self.assertEqual("\033[2;34;47m", val)
val = self.color._parse("empty", "green", "white", "bold")
self.assertEqual("\033[1;32;47m", val)


@@ -20,37 +20,37 @@ from editor import Editor
class EditorTestCase(unittest.TestCase):
    """Take care of resetting Editor state across tests."""

    def setUp(self):
        self.setEditor(None)

    def tearDown(self):
        self.setEditor(None)

    @staticmethod
    def setEditor(editor):
        Editor._editor = editor


class GetEditor(EditorTestCase):
    """Check GetEditor behavior."""

    def test_basic(self):
        """Basic checking of _GetEditor."""
        self.setEditor(":")
        self.assertEqual(":", Editor._GetEditor())


class EditString(EditorTestCase):
    """Check EditString behavior."""

    def test_no_editor(self):
        """Check behavior when no editor is available."""
        self.setEditor(":")
        self.assertEqual("foo", Editor.EditString("foo"))

    def test_cat_editor(self):
        """Check behavior when editor is `cat`."""
        self.setEditor("cat")
        self.assertEqual("foo", Editor.EditString("foo"))


@@ -18,36 +18,53 @@ import inspect
import pickle
import unittest

import command
import error
import fetch
import git_command
import project
from subcmds import all_modules


imports = all_modules + [
    error,
    project,
    git_command,
    fetch,
    command,
]


class PickleTests(unittest.TestCase):
    """Make sure all our custom exceptions can be pickled."""

    def getExceptions(self):
        """Return all our custom exceptions."""
        for entry in imports:
            for name in dir(entry):
                cls = getattr(entry, name)
                if isinstance(cls, type) and issubclass(cls, Exception):
                    yield cls

    def testExceptionLookup(self):
        """Make sure our introspection logic works."""
        classes = list(self.getExceptions())
        self.assertIn(error.HookError, classes)
        # Don't assert the exact number to avoid being a change-detector test.
        self.assertGreater(len(classes), 10)

    def testPickle(self):
        """Try to pickle all the exceptions."""
        for cls in self.getExceptions():
            args = inspect.getfullargspec(cls.__init__).args[1:]
            obj = cls(*args)
            p = pickle.dumps(obj)
            try:
                newobj = pickle.loads(p)
            except Exception as e:  # pylint: disable=broad-except
                self.fail(
                    "Class %s is unable to be pickled: %s\n"
                    "Incomplete super().__init__(...) call?" % (cls, e)
                )
            self.assertIsInstance(newobj, cls)
            self.assertEqual(str(obj), str(newobj))


@@ -14,143 +14,331 @@
"""Unittests for the git_command.py module.""" """Unittests for the git_command.py module."""
import re import io
import os import os
import re
import subprocess
import unittest import unittest
from unittest import mock
try:
from unittest import mock
except ImportError:
import mock
import git_command import git_command
import wrapper import wrapper
class GitCommandTest(unittest.TestCase): class GitCommandTest(unittest.TestCase):
"""Tests the GitCommand class (via git_command.git).""" """Tests the GitCommand class (via git_command.git)."""
def setUp(self): def setUp(self):
def realpath_mock(val):
return val
def realpath_mock(val): mock.patch.object(
return val os.path, "realpath", side_effect=realpath_mock
).start()
mock.patch.object(os.path, 'realpath', side_effect=realpath_mock).start() def tearDown(self):
mock.patch.stopall()
def tearDown(self): def test_alternative_setting_when_matching(self):
mock.patch.stopall() r = git_command._build_env(
objdir=os.path.join("zap", "objects"), gitdir="zap"
)
def test_alternative_setting_when_matching(self): self.assertIsNone(r.get("GIT_ALTERNATE_OBJECT_DIRECTORIES"))
r = git_command._build_env( self.assertEqual(
objdir = os.path.join('zap', 'objects'), r.get("GIT_OBJECT_DIRECTORY"), os.path.join("zap", "objects")
gitdir = 'zap' )
)
self.assertIsNone(r.get('GIT_ALTERNATE_OBJECT_DIRECTORIES')) def test_alternative_setting_when_different(self):
self.assertEqual(r.get('GIT_OBJECT_DIRECTORY'), os.path.join('zap', 'objects')) r = git_command._build_env(
objdir=os.path.join("wow", "objects"), gitdir="zap"
)
def test_alternative_setting_when_different(self): self.assertEqual(
r = git_command._build_env( r.get("GIT_ALTERNATE_OBJECT_DIRECTORIES"),
objdir = os.path.join('wow', 'objects'), os.path.join("zap", "objects"),
gitdir = 'zap' )
) self.assertEqual(
r.get("GIT_OBJECT_DIRECTORY"), os.path.join("wow", "objects")
)
self.assertEqual(r.get('GIT_ALTERNATE_OBJECT_DIRECTORIES'), os.path.join('zap', 'objects'))
self.assertEqual(r.get('GIT_OBJECT_DIRECTORY'), os.path.join('wow', 'objects')) class GitCommandWaitTest(unittest.TestCase):
"""Tests the GitCommand class .Wait()"""
def setUp(self):
class MockPopen:
rc = 0
def __init__(self):
self.stdout = io.BufferedReader(io.BytesIO())
self.stderr = io.BufferedReader(io.BytesIO())
def communicate(
self, input: str = None, timeout: float = None
) -> [str, str]:
"""Mock communicate fn."""
return ["", ""]
def wait(self, timeout=None):
return self.rc
self.popen = popen = MockPopen()
def popen_mock(*args, **kwargs):
return popen
def realpath_mock(val):
return val
mock.patch.object(subprocess, "Popen", side_effect=popen_mock).start()
mock.patch.object(
os.path, "realpath", side_effect=realpath_mock
).start()
def tearDown(self):
mock.patch.stopall()
def test_raises_when_verify_non_zero_result(self):
self.popen.rc = 1
r = git_command.GitCommand(None, ["status"], verify_command=True)
with self.assertRaises(git_command.GitCommandError):
r.Wait()
def test_returns_when_no_verify_non_zero_result(self):
self.popen.rc = 1
r = git_command.GitCommand(None, ["status"], verify_command=False)
self.assertEqual(1, r.Wait())
def test_default_returns_non_zero_result(self):
self.popen.rc = 1
r = git_command.GitCommand(None, ["status"])
self.assertEqual(1, r.Wait())
class GitCommandStreamLogsTest(unittest.TestCase):
"""Tests the GitCommand class stderr log streaming cases."""
def setUp(self):
self.mock_process = mock.MagicMock()
self.mock_process.communicate.return_value = (None, None)
self.mock_process.wait.return_value = 0
self.mock_popen = mock.MagicMock()
self.mock_popen.return_value = self.mock_process
mock.patch("subprocess.Popen", self.mock_popen).start()
def tearDown(self):
mock.patch.stopall()
def test_does_not_stream_logs_when_input_is_set(self):
git_command.GitCommand(None, ["status"], input="foo")
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=subprocess.PIPE,
stdout=None,
stderr=None,
)
self.mock_process.communicate.assert_called_once_with(input="foo")
self.mock_process.stderr.read1.assert_not_called()
def test_does_not_stream_logs_when_stdout_is_set(self):
git_command.GitCommand(None, ["status"], capture_stdout=True)
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=None,
stdout=subprocess.PIPE,
stderr=None,
)
self.mock_process.communicate.assert_called_once_with(input=None)
self.mock_process.stderr.read1.assert_not_called()
def test_does_not_stream_logs_when_stderr_is_set(self):
git_command.GitCommand(None, ["status"], capture_stderr=True)
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=None,
stdout=None,
stderr=subprocess.PIPE,
)
self.mock_process.communicate.assert_called_once_with(input=None)
self.mock_process.stderr.read1.assert_not_called()
def test_does_not_stream_logs_when_merge_output_is_set(self):
git_command.GitCommand(None, ["status"], merge_output=True)
self.mock_popen.assert_called_once_with(
["git", "status"],
cwd=None,
env=mock.ANY,
encoding="utf-8",
errors="backslashreplace",
stdin=None,
stdout=None,
stderr=subprocess.STDOUT,
)
self.mock_process.communicate.assert_called_once_with(input=None)
self.mock_process.stderr.read1.assert_not_called()
@mock.patch("sys.stderr")
def test_streams_stderr_when_no_stream_is_set(self, mock_stderr):
logs = "\n".join(
[
"Enumerating objects: 5, done.",
"Counting objects: 100% (5/5), done.",
"Writing objects: 100% (3/3), 330 bytes | 330 KiB/s, done.",
"remote: Processing changes: refs: 1, new: 1, done ",
"remote: SUCCESS",
]
)
self.mock_process.stderr = io.BufferedReader(
io.BytesIO(bytes(logs, "utf-8"))
)
cmd = git_command.GitCommand(None, ["push"])
self.mock_popen.assert_called_once_with(
["git", "push"],
cwd=None,
env=mock.ANY,
stdin=None,
stdout=None,
stderr=subprocess.PIPE,
)
self.mock_process.communicate.assert_not_called()
mock_stderr.write.assert_called_once_with(logs)
self.assertEqual(cmd.stderr, logs)
class GitCallUnitTest(unittest.TestCase): class GitCallUnitTest(unittest.TestCase):
"""Tests the _GitCall class (via git_command.git).""" """Tests the _GitCall class (via git_command.git)."""
def test_version_tuple(self): def test_version_tuple(self):
"""Check git.version_tuple() handling.""" """Check git.version_tuple() handling."""
ver = git_command.git.version_tuple() ver = git_command.git.version_tuple()
self.assertIsNotNone(ver) self.assertIsNotNone(ver)
# We don't dive too deep into the values here to avoid having to update # We don't dive too deep into the values here to avoid having to update
# whenever git versions change. We do check relative to this min version # whenever git versions change. We do check relative to this min
# as this is what `repo` itself requires via MIN_GIT_VERSION. # version as this is what `repo` itself requires via MIN_GIT_VERSION.
MIN_GIT_VERSION = (2, 10, 2) MIN_GIT_VERSION = (2, 10, 2)
self.assertTrue(isinstance(ver.major, int)) self.assertTrue(isinstance(ver.major, int))
self.assertTrue(isinstance(ver.minor, int)) self.assertTrue(isinstance(ver.minor, int))
self.assertTrue(isinstance(ver.micro, int)) self.assertTrue(isinstance(ver.micro, int))
self.assertGreater(ver.major, MIN_GIT_VERSION[0] - 1) self.assertGreater(ver.major, MIN_GIT_VERSION[0] - 1)
self.assertGreaterEqual(ver.micro, 0) self.assertGreaterEqual(ver.micro, 0)
self.assertGreaterEqual(ver.major, 0) self.assertGreaterEqual(ver.major, 0)
self.assertGreaterEqual(ver, MIN_GIT_VERSION) self.assertGreaterEqual(ver, MIN_GIT_VERSION)
self.assertLess(ver, (9999, 9999, 9999)) self.assertLess(ver, (9999, 9999, 9999))
self.assertNotEqual('', ver.full) self.assertNotEqual("", ver.full)
class UserAgentUnitTest(unittest.TestCase): class UserAgentUnitTest(unittest.TestCase):
"""Tests the UserAgent function.""" """Tests the UserAgent function."""
def test_smoke_os(self): def test_smoke_os(self):
"""Make sure UA OS setting returns something useful.""" """Make sure UA OS setting returns something useful."""
os_name = git_command.user_agent.os os_name = git_command.user_agent.os
# We can't dive too deep because of OS/tool differences, but we can check # We can't dive too deep because of OS/tool differences, but we can
# the general form. # check the general form.
m = re.match(r'^[^ ]+$', os_name) m = re.match(r"^[^ ]+$", os_name)
self.assertIsNotNone(m) self.assertIsNotNone(m)
def test_smoke_repo(self): def test_smoke_repo(self):
"""Make sure repo UA returns something useful.""" """Make sure repo UA returns something useful."""
ua = git_command.user_agent.repo ua = git_command.user_agent.repo
# We can't dive too deep because of OS/tool differences, but we can check # We can't dive too deep because of OS/tool differences, but we can
# the general form. # check the general form.
m = re.match(r'^git-repo/[^ ]+ ([^ ]+) git/[^ ]+ Python/[0-9.]+', ua) m = re.match(r"^git-repo/[^ ]+ ([^ ]+) git/[^ ]+ Python/[0-9.]+", ua)
self.assertIsNotNone(m) self.assertIsNotNone(m)
def test_smoke_git(self): def test_smoke_git(self):
"""Make sure git UA returns something useful.""" """Make sure git UA returns something useful."""
ua = git_command.user_agent.git ua = git_command.user_agent.git
# We can't dive too deep because of OS/tool differences, but we can check # We can't dive too deep because of OS/tool differences, but we can
# the general form. # check the general form.
m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua) m = re.match(r"^git/[^ ]+ ([^ ]+) git-repo/[^ ]+", ua)
self.assertIsNotNone(m) self.assertIsNotNone(m)
class GitRequireTests(unittest.TestCase): class GitRequireTests(unittest.TestCase):
"""Test the git_require helper.""" """Test the git_require helper."""
def setUp(self): def setUp(self):
self.wrapper = wrapper.Wrapper() self.wrapper = wrapper.Wrapper()
ver = self.wrapper.GitVersion(1, 2, 3, 4) ver = self.wrapper.GitVersion(1, 2, 3, 4)
mock.patch.object(git_command.git, 'version_tuple', return_value=ver).start() mock.patch.object(
git_command.git, "version_tuple", return_value=ver
).start()
def tearDown(self): def tearDown(self):
mock.patch.stopall() mock.patch.stopall()
def test_older_nonfatal(self): def test_older_nonfatal(self):
"""Test non-fatal require calls with old versions.""" """Test non-fatal require calls with old versions."""
self.assertFalse(git_command.git_require((2,))) self.assertFalse(git_command.git_require((2,)))
self.assertFalse(git_command.git_require((1, 3))) self.assertFalse(git_command.git_require((1, 3)))
self.assertFalse(git_command.git_require((1, 2, 4))) self.assertFalse(git_command.git_require((1, 2, 4)))
self.assertFalse(git_command.git_require((1, 2, 3, 5))) self.assertFalse(git_command.git_require((1, 2, 3, 5)))
def test_newer_nonfatal(self): def test_newer_nonfatal(self):
"""Test non-fatal require calls with newer versions.""" """Test non-fatal require calls with newer versions."""
self.assertTrue(git_command.git_require((0,))) self.assertTrue(git_command.git_require((0,)))
self.assertTrue(git_command.git_require((1, 0))) self.assertTrue(git_command.git_require((1, 0)))
self.assertTrue(git_command.git_require((1, 2, 0))) self.assertTrue(git_command.git_require((1, 2, 0)))
self.assertTrue(git_command.git_require((1, 2, 3, 0))) self.assertTrue(git_command.git_require((1, 2, 3, 0)))
def test_equal_nonfatal(self): def test_equal_nonfatal(self):
"""Test require calls with equal values.""" """Test require calls with equal values."""
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=False)) self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=False))
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=True)) self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=True))
def test_older_fatal(self): def test_older_fatal(self):
"""Test fatal require calls with old versions.""" """Test fatal require calls with old versions."""
with self.assertRaises(SystemExit) as e: with self.assertRaises(git_command.GitRequireError) as e:
git_command.git_require((2,), fail=True) git_command.git_require((2,), fail=True)
self.assertNotEqual(0, e.code) self.assertNotEqual(0, e.code)
def test_older_fatal_msg(self): def test_older_fatal_msg(self):
"""Test fatal require calls with old versions and message.""" """Test fatal require calls with old versions and message."""
with self.assertRaises(SystemExit) as e: with self.assertRaises(git_command.GitRequireError) as e:
git_command.git_require((2,), fail=True, msg='so sad') git_command.git_require((2,), fail=True, msg="so sad")
self.assertNotEqual(0, e.code) self.assertNotEqual(0, e.code)
class GitCommandErrorTest(unittest.TestCase):
"""Test for the GitCommandError class."""
def test_augument_stderr(self):
self.assertEqual(
git_command.GitCommandError(
git_stderr="couldn't find remote ref refs/heads/foo"
).suggestion,
"Check if the provided ref exists in the remote.",
)
self.assertEqual(
git_command.GitCommandError(
git_stderr="'foobar' does not appear to be a git repository"
).suggestion,
"Are you running this repo command outside of a repo workspace?",
)
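The streaming tests above stub `subprocess.Popen` with a `MagicMock`, then assert on exactly how it was invoked. The pattern in isolation, with a hypothetical `run_status` helper standing in for GitCommand:

```python
import subprocess
from unittest import mock


def run_status():
    # Looks up subprocess.Popen at call time, so a patch on the
    # subprocess module is what this function sees.
    proc = subprocess.Popen(["git", "status"], stdout=subprocess.PIPE)
    out, _ = proc.communicate()
    return out


with mock.patch("subprocess.Popen") as mock_popen:
    # The mock's return value plays the role of the Popen process object.
    mock_popen.return_value.communicate.return_value = (b"clean", None)
    assert run_status() == b"clean"
    mock_popen.assert_called_once_with(["git", "status"], stdout=subprocess.PIPE)
```

Because `assert_called_once_with` compares the full argument list, adding or reordering a keyword argument in the code under test fails the assertion, which is how the tests above pin down which stdio streams each GitCommand mode opens.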


@@ -22,167 +22,169 @@ import git_config
def fixture(*paths): def fixture(*paths):
"""Return a path relative to test/fixtures. """Return a path relative to test/fixtures."""
""" return os.path.join(os.path.dirname(__file__), "fixtures", *paths)
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
class GitConfigReadOnlyTests(unittest.TestCase): class GitConfigReadOnlyTests(unittest.TestCase):
"""Read-only tests of the GitConfig class.""" """Read-only tests of the GitConfig class."""
def setUp(self): def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture. """Create a GitConfig object using the test.gitconfig fixture."""
""" config_fixture = fixture("test.gitconfig")
config_fixture = fixture('test.gitconfig') self.config = git_config.GitConfig(config_fixture)
self.config = git_config.GitConfig(config_fixture)
def test_GetString_with_empty_config_values(self): def test_GetString_with_empty_config_values(self):
""" """
Test config entries with no value. Test config entries with no value.
[section] [section]
empty empty
""" """
val = self.config.GetString('section.empty') val = self.config.GetString("section.empty")
self.assertEqual(val, None) self.assertEqual(val, None)
def test_GetString_with_true_value(self): def test_GetString_with_true_value(self):
""" """
Test config entries with a string value. Test config entries with a string value.
[section] [section]
nonempty = true nonempty = true
""" """
val = self.config.GetString('section.nonempty') val = self.config.GetString("section.nonempty")
self.assertEqual(val, 'true') self.assertEqual(val, "true")
def test_GetString_from_missing_file(self): def test_GetString_from_missing_file(self):
""" """
Test missing config file Test missing config file
""" """
config_fixture = fixture('not.present.gitconfig') config_fixture = fixture("not.present.gitconfig")
config = git_config.GitConfig(config_fixture) config = git_config.GitConfig(config_fixture)
val = config.GetString('empty') val = config.GetString("empty")
self.assertEqual(val, None) self.assertEqual(val, None)
def test_GetBoolean_undefined(self): def test_GetBoolean_undefined(self):
"""Test GetBoolean on key that doesn't exist.""" """Test GetBoolean on key that doesn't exist."""
self.assertIsNone(self.config.GetBoolean('section.missing')) self.assertIsNone(self.config.GetBoolean("section.missing"))
def test_GetBoolean_invalid(self): def test_GetBoolean_invalid(self):
"""Test GetBoolean on invalid boolean value.""" """Test GetBoolean on invalid boolean value."""
self.assertIsNone(self.config.GetBoolean('section.boolinvalid')) self.assertIsNone(self.config.GetBoolean("section.boolinvalid"))
def test_GetBoolean_true(self): def test_GetBoolean_true(self):
"""Test GetBoolean on valid true boolean.""" """Test GetBoolean on valid true boolean."""
self.assertTrue(self.config.GetBoolean('section.booltrue')) self.assertTrue(self.config.GetBoolean("section.booltrue"))
def test_GetBoolean_false(self): def test_GetBoolean_false(self):
"""Test GetBoolean on valid false boolean.""" """Test GetBoolean on valid false boolean."""
self.assertFalse(self.config.GetBoolean('section.boolfalse')) self.assertFalse(self.config.GetBoolean("section.boolfalse"))
def test_GetInt_undefined(self): def test_GetInt_undefined(self):
"""Test GetInt on key that doesn't exist.""" """Test GetInt on key that doesn't exist."""
self.assertIsNone(self.config.GetInt('section.missing')) self.assertIsNone(self.config.GetInt("section.missing"))
def test_GetInt_invalid(self): def test_GetInt_invalid(self):
"""Test GetInt on invalid integer value.""" """Test GetInt on invalid integer value."""
self.assertIsNone(self.config.GetBoolean('section.intinvalid')) self.assertIsNone(self.config.GetBoolean("section.intinvalid"))

    def test_GetInt_valid(self):
        """Test GetInt on valid integers."""
        TESTS = (
            ("inthex", 16),
            ("inthexk", 16384),
            ("int", 10),
            ("intk", 10240),
            ("intm", 10485760),
            ("intg", 10737418240),
        )
        for key, value in TESTS:
            self.assertEqual(value, self.config.GetInt(f"section.{key}"))

class GitConfigReadWriteTests(unittest.TestCase):
    """Read/write tests of the GitConfig class."""

    def setUp(self):
        self.tmpfile = tempfile.NamedTemporaryFile()
        self.config = self.get_config()

    def get_config(self):
        """Get a new GitConfig instance."""
        return git_config.GitConfig(self.tmpfile.name)

    def test_SetString(self):
        """Test SetString behavior."""
        # Set a value.
        self.assertIsNone(self.config.GetString("foo.bar"))
        self.config.SetString("foo.bar", "val")
        self.assertEqual("val", self.config.GetString("foo.bar"))

        # Make sure the value was actually written out.
        config = self.get_config()
        self.assertEqual("val", config.GetString("foo.bar"))

        # Update the value.
        self.config.SetString("foo.bar", "valll")
        self.assertEqual("valll", self.config.GetString("foo.bar"))
        config = self.get_config()
        self.assertEqual("valll", config.GetString("foo.bar"))

        # Delete the value.
        self.config.SetString("foo.bar", None)
        self.assertIsNone(self.config.GetString("foo.bar"))
        config = self.get_config()
        self.assertIsNone(config.GetString("foo.bar"))

    def test_SetBoolean(self):
        """Test SetBoolean behavior."""
        # Set a true value.
        self.assertIsNone(self.config.GetBoolean("foo.bar"))
        for val in (True, 1):
            self.config.SetBoolean("foo.bar", val)
            self.assertTrue(self.config.GetBoolean("foo.bar"))

        # Make sure the value was actually written out.
        config = self.get_config()
        self.assertTrue(config.GetBoolean("foo.bar"))
        self.assertEqual("true", config.GetString("foo.bar"))

        # Set a false value.
        for val in (False, 0):
            self.config.SetBoolean("foo.bar", val)
            self.assertFalse(self.config.GetBoolean("foo.bar"))

        # Make sure the value was actually written out.
        config = self.get_config()
        self.assertFalse(config.GetBoolean("foo.bar"))
        self.assertEqual("false", config.GetString("foo.bar"))

        # Delete the value.
        self.config.SetBoolean("foo.bar", None)
        self.assertIsNone(self.config.GetBoolean("foo.bar"))
        config = self.get_config()
        self.assertIsNone(config.GetBoolean("foo.bar"))

    def test_GetSyncAnalysisStateData(self):
        """Test config entries with a sync state analysis data."""
        superproject_logging_data = {}
        superproject_logging_data["test"] = False
        options = type("options", (object,), {})()
        options.verbose = "true"
        options.mp_update = "false"
        TESTS = (
            ("superproject.test", "false"),
            ("options.verbose", "true"),
            ("options.mpupdate", "false"),
            ("main.version", "1"),
        )
        self.config.UpdateSyncAnalysisState(options, superproject_logging_data)
        sync_data = self.config.GetSyncAnalysisStateData()
        for key, value in TESTS:
            self.assertEqual(
                sync_data[f"{git_config.SYNC_STATE_PREFIX}{key}"], value
            )
        self.assertTrue(
            sync_data[f"{git_config.SYNC_STATE_PREFIX}main.synctime"]
        )
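The `test_GetInt_valid` table above exercises git-config's integer syntax: hexadecimal `0x` prefixes and the `k`/`m`/`g` size suffixes. As a minimal standalone sketch of those semantics (the function name `parse_git_int` is illustrative, not repo's actual `GetInt` implementation):

```python
def parse_git_int(value):
    """Parse a git-config style integer: optional 0x hex, optional k/m/g suffix."""
    mult = 1
    v = value.strip().lower()
    if v and v[-1] in "kmg":
        # Suffixes scale by powers of 1024, matching the test table
        # ("intk" -> 10240, "intm" -> 10485760, "intg" -> 10737418240).
        mult = {"k": 1024, "m": 1024**2, "g": 1024**3}[v[-1]]
        v = v[:-1]
    try:
        return int(v, 16 if v.startswith("0x") else 10) * mult
    except ValueError:
        # Mirrors GetInt-style behavior: None on unparseable input.
        return None
```

With this sketch, `parse_git_int("0x10")` yields 16 and `parse_git_int("0x10k")` yields 16384, the same values the `inthex`/`inthexk` fixtures assert.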


@@ -21,304 +21,379 @@ import tempfile
import unittest
from unittest import mock

import git_superproject
import git_trace2_event_log
import manifest_xml
from test_manifest_xml import sort_attributes

class SuperprojectTestCase(unittest.TestCase):
    """TestCase for the Superproject module."""

    PARENT_SID_KEY = "GIT_TRACE2_PARENT_SID"
    PARENT_SID_VALUE = "parent_sid"
    SELF_SID_REGEX = r"repo-\d+T\d+Z-.*"
    FULL_SID_REGEX = rf"^{PARENT_SID_VALUE}/{SELF_SID_REGEX}"

    def setUp(self):
        """Set up superproject every time."""
        self.tempdirobj = tempfile.TemporaryDirectory(prefix="repo_tests")
        self.tempdir = self.tempdirobj.name
        self.repodir = os.path.join(self.tempdir, ".repo")
        self.manifest_file = os.path.join(
            self.repodir, manifest_xml.MANIFEST_FILE_NAME
        )
        os.mkdir(self.repodir)
        self.platform = platform.system().lower()

        # By default we initialize with the expected case where
        # repo launches us (so GIT_TRACE2_PARENT_SID is set).
        env = {
            self.PARENT_SID_KEY: self.PARENT_SID_VALUE,
        }
        self.git_event_log = git_trace2_event_log.EventLog(env=env)

        # The manifest parsing really wants a git repo currently.
        gitdir = os.path.join(self.repodir, "manifests.git")
        os.mkdir(gitdir)
        with open(os.path.join(gitdir, "config"), "w") as fp:
            fp.write(
                """[remote "origin"]
        url = https://localhost:0/manifest
"""
            )

        manifest = self.getXmlManifest(
            """
<manifest>
  <remote name="default-remote" fetch="http://localhost" />
  <default remote="default-remote" revision="refs/heads/main" />
  <superproject name="superproject"/>
  <project path="art" name="platform/art" groups="notdefault,platform-"""
            + self.platform
            + """
  " /></manifest>
"""
        )
        self._superproject = git_superproject.Superproject(
            manifest,
            name="superproject",
            remote=manifest.remotes.get("default-remote").ToRemoteSpec(
                "superproject"
            ),
            revision="refs/heads/main",
        )

    def tearDown(self):
        """Tear down superproject every time."""
        self.tempdirobj.cleanup()

    def getXmlManifest(self, data):
        """Helper to initialize a manifest for testing."""
        with open(self.manifest_file, "w") as fp:
            fp.write(data)
        return manifest_xml.XmlManifest(self.repodir, self.manifest_file)

    def verifyCommonKeys(self, log_entry, expected_event_name, full_sid=True):
        """Helper function to verify common event log keys."""
        self.assertIn("event", log_entry)
        self.assertIn("sid", log_entry)
        self.assertIn("thread", log_entry)
        self.assertIn("time", log_entry)

        # Do basic data format validation.
        self.assertEqual(expected_event_name, log_entry["event"])
        if full_sid:
            self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
        else:
            self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
        self.assertRegex(
            log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+\+00:00$"
        )

    def readLog(self, log_path):
        """Helper function to read log data into a list."""
        log_data = []
        with open(log_path, mode="rb") as f:
            for line in f:
                log_data.append(json.loads(line))
        return log_data

    def verifyErrorEvent(self):
        """Helper to verify that error event is written."""
        with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
            log_path = self.git_event_log.Write(path=tempdir)
            self.log_data = self.readLog(log_path)

        self.assertEqual(len(self.log_data), 2)
        error_event = self.log_data[1]
        self.verifyCommonKeys(self.log_data[0], expected_event_name="version")
        self.verifyCommonKeys(error_event, expected_event_name="error")
        # Check for 'error' event specific fields.
        self.assertIn("msg", error_event)
        self.assertIn("fmt", error_event)

    def test_superproject_get_superproject_no_superproject(self):
        """Test with no url."""
        manifest = self.getXmlManifest(
            """
<manifest>
</manifest>
"""
        )
        self.assertIsNone(manifest.superproject)

    def test_superproject_get_superproject_invalid_url(self):
        """Test with an invalid url."""
        manifest = self.getXmlManifest(
            """
<manifest>
  <remote name="test-remote" fetch="localhost" />
  <default remote="test-remote" revision="refs/heads/main" />
  <superproject name="superproject"/>
</manifest>
"""
        )
        superproject = git_superproject.Superproject(
            manifest,
            name="superproject",
            remote=manifest.remotes.get("test-remote").ToRemoteSpec(
                "superproject"
            ),
            revision="refs/heads/main",
        )
        sync_result = superproject.Sync(self.git_event_log)
        self.assertFalse(sync_result.success)
        self.assertTrue(sync_result.fatal)

    def test_superproject_get_superproject_invalid_branch(self):
        """Test with an invalid branch."""
        manifest = self.getXmlManifest(
            """
<manifest>
  <remote name="test-remote" fetch="localhost" />
  <default remote="test-remote" revision="refs/heads/main" />
  <superproject name="superproject"/>
</manifest>
"""
        )
        self._superproject = git_superproject.Superproject(
            manifest,
            name="superproject",
            remote=manifest.remotes.get("test-remote").ToRemoteSpec(
                "superproject"
            ),
            revision="refs/heads/main",
        )
        with mock.patch.object(self._superproject, "_branch", "junk"):
            sync_result = self._superproject.Sync(self.git_event_log)
            self.assertFalse(sync_result.success)
            self.assertTrue(sync_result.fatal)
            self.verifyErrorEvent()

    def test_superproject_get_superproject_mock_init(self):
        """Test with _Init failing."""
        with mock.patch.object(self._superproject, "_Init", return_value=False):
            sync_result = self._superproject.Sync(self.git_event_log)
            self.assertFalse(sync_result.success)
            self.assertTrue(sync_result.fatal)

    def test_superproject_get_superproject_mock_fetch(self):
        """Test with _Fetch failing."""
        with mock.patch.object(self._superproject, "_Init", return_value=True):
            os.mkdir(self._superproject._superproject_path)
            with mock.patch.object(
                self._superproject, "_Fetch", return_value=False
            ):
                sync_result = self._superproject.Sync(self.git_event_log)
                self.assertFalse(sync_result.success)
                self.assertTrue(sync_result.fatal)

    def test_superproject_get_all_project_commit_ids_mock_ls_tree(self):
        """Test with LsTree being a mock."""
        data = (
            "120000 blob 158258bdf146f159218e2b90f8b699c4d85b5804\tAndroid.bp\x00"
            "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
            "160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00"
            "120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00"
            "160000 commit ade9b7a0d874e25fff4bf2552488825c6f111928\tbuild/bazel\x00"
        )
        with mock.patch.object(self._superproject, "_Init", return_value=True):
            with mock.patch.object(
                self._superproject, "_Fetch", return_value=True
            ):
                with mock.patch.object(
                    self._superproject, "_LsTree", return_value=data
                ):
                    commit_ids_result = (
                        self._superproject._GetAllProjectsCommitIds()
                    )
                    self.assertEqual(
                        commit_ids_result.commit_ids,
                        {
                            "art": "2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea",
                            "bootable/recovery": "e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06",
                            "build/bazel": "ade9b7a0d874e25fff4bf2552488825c6f111928",
                        },
                    )
                    self.assertFalse(commit_ids_result.fatal)
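The mocked `_LsTree` payload above mimics NUL-delimited `git ls-tree -z` output, where each record is `mode type sha<TAB>path` and only `160000 commit` (gitlink) entries name project commits. A small illustrative parser of that record format (the name `parse_ls_tree` is hypothetical, not repo's internal helper):

```python
def parse_ls_tree(data):
    """Split NUL-delimited `git ls-tree -z` output into {path: sha} for gitlinks."""
    commit_ids = {}
    for record in data.split("\x00"):
        if not record:
            continue
        # Record layout: "<mode> <type> <sha>\t<path>".
        meta, _, path = record.partition("\t")
        mode, obj_type, sha = meta.split()
        # 160000 "commit" entries are gitlinks pointing at project commits;
        # blob entries (e.g. the Android.bp symlink) are skipped.
        if obj_type == "commit":
            commit_ids[path] = sha
    return commit_ids

data = (
    "120000 blob 158258bdf146f159218e2b90f8b699c4d85b5804\tAndroid.bp\x00"
    "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
)
```

On this sample, the parser keeps only the `art` gitlink and drops the blob entry, the same filtering the commit-ids assertion above relies on.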

    def test_superproject_write_manifest_file(self):
        """Test with writing manifest to a file after setting revisionId."""
        self.assertEqual(len(self._superproject._manifest.projects), 1)
        project = self._superproject._manifest.projects[0]
        project.SetRevisionId("ABCDEF")
        # Create temporary directory so that it can write the file.
        os.mkdir(self._superproject._superproject_path)
        manifest_path = self._superproject._WriteManifestFile()
        self.assertIsNotNone(manifest_path)
        with open(manifest_path) as fp:
            manifest_xml_data = fp.read()
        self.assertEqual(
            sort_attributes(manifest_xml_data),
            '<?xml version="1.0" ?><manifest>'
            '<remote fetch="http://localhost" name="default-remote"/>'
            '<default remote="default-remote" revision="refs/heads/main"/>'
            '<project groups="notdefault,platform-' + self.platform + '" '
            'name="platform/art" path="art" revision="ABCDEF" upstream="refs/heads/main"/>'
            '<superproject name="superproject"/>'
            "</manifest>",
        )

    def test_superproject_update_project_revision_id(self):
        """Test with LsTree being a mock."""
        self.assertEqual(len(self._superproject._manifest.projects), 1)
        projects = self._superproject._manifest.projects
        data = (
            "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
            "160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00"
        )
        with mock.patch.object(self._superproject, "_Init", return_value=True):
            with mock.patch.object(
                self._superproject, "_Fetch", return_value=True
            ):
                with mock.patch.object(
                    self._superproject, "_LsTree", return_value=data
                ):
                    # Create temporary directory so that it can write the file.
                    os.mkdir(self._superproject._superproject_path)
                    update_result = self._superproject.UpdateProjectsRevisionId(
                        projects, self.git_event_log
                    )
                    self.assertIsNotNone(update_result.manifest_path)
                    self.assertFalse(update_result.fatal)
                    with open(update_result.manifest_path) as fp:
                        manifest_xml_data = fp.read()
                    self.assertEqual(
                        sort_attributes(manifest_xml_data),
                        '<?xml version="1.0" ?><manifest>'
                        '<remote fetch="http://localhost" name="default-remote"/>'
                        '<default remote="default-remote" revision="refs/heads/main"/>'
                        '<project groups="notdefault,platform-'
                        + self.platform
                        + '" '
                        'name="platform/art" path="art" '
                        'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
                        '<superproject name="superproject"/>'
                        "</manifest>",
                    )

    def test_superproject_update_project_revision_id_no_superproject_tag(self):
        """Test update of commit ids of a manifest without superproject tag."""
        manifest = self.getXmlManifest(
            """
<manifest>
  <remote name="default-remote" fetch="http://localhost" />
  <default remote="default-remote" revision="refs/heads/main" />
  <project name="test-name"/>
</manifest>
"""
        )
        self.maxDiff = None
        self.assertIsNone(manifest.superproject)
        self.assertEqual(
            sort_attributes(manifest.ToXml().toxml()),
            '<?xml version="1.0" ?><manifest>'
            '<remote fetch="http://localhost" name="default-remote"/>'
            '<default remote="default-remote" revision="refs/heads/main"/>'
            '<project name="test-name"/>'
            "</manifest>",
        )

    def test_superproject_update_project_revision_id_from_local_manifest_group(
        self,
    ):
        """Test update of commit ids of a manifest that have local manifest no superproject group."""
        local_group = manifest_xml.LOCAL_MANIFEST_GROUP_PREFIX + ":local"
        manifest = self.getXmlManifest(
            """
<manifest>
  <remote name="default-remote" fetch="http://localhost" />
  <remote name="goog" fetch="http://localhost2" />
  <default remote="default-remote" revision="refs/heads/main" />
  <superproject name="superproject"/>
  <project path="vendor/x" name="platform/vendor/x" remote="goog"
           groups=\""""
            + local_group
            + """
  " revision="master-with-vendor" clone-depth="1" />
  <project path="art" name="platform/art" groups="notdefault,platform-"""
            + self.platform
            + """
  " /></manifest>
"""
        )
        self.maxDiff = None
        self._superproject = git_superproject.Superproject(
            manifest,
            name="superproject",
            remote=manifest.remotes.get("default-remote").ToRemoteSpec(
                "superproject"
            ),
            revision="refs/heads/main",
        )
        self.assertEqual(len(self._superproject._manifest.projects), 2)
        projects = self._superproject._manifest.projects
        data = "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
        with mock.patch.object(self._superproject, "_Init", return_value=True):
            with mock.patch.object(
                self._superproject, "_Fetch", return_value=True
            ):
                with mock.patch.object(
                    self._superproject, "_LsTree", return_value=data
                ):
                    # Create temporary directory so that it can write the file.
                    os.mkdir(self._superproject._superproject_path)
                    update_result = self._superproject.UpdateProjectsRevisionId(
                        projects, self.git_event_log
                    )
                    self.assertIsNotNone(update_result.manifest_path)
                    self.assertFalse(update_result.fatal)
                    with open(update_result.manifest_path) as fp:
                        manifest_xml_data = fp.read()
                    # Verify platform/vendor/x's project revision hasn't
                    # changed.
                    self.assertEqual(
                        sort_attributes(manifest_xml_data),
                        '<?xml version="1.0" ?><manifest>'
                        '<remote fetch="http://localhost" name="default-remote"/>'
                        '<remote fetch="http://localhost2" name="goog"/>'
                        '<default remote="default-remote" revision="refs/heads/main"/>'
                        '<project groups="notdefault,platform-'
                        + self.platform
                        + '" '
                        'name="platform/art" path="art" '
                        'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
                        '<superproject name="superproject"/>'
                        "</manifest>",
                    )

    def test_superproject_update_project_revision_id_with_pinned_manifest(self):
        """Test update of commit ids of a pinned manifest."""
        manifest = self.getXmlManifest(
            """
<manifest>
  <remote name="default-remote" fetch="http://localhost" />
  <default remote="default-remote" revision="refs/heads/main" />

@@ -326,80 +401,136 @@ class SuperprojectTestCase(unittest.TestCase):

  <project path="vendor/x" name="platform/vendor/x" revision="" />
  <project path="vendor/y" name="platform/vendor/y"
           revision="52d3c9f7c107839ece2319d077de0cd922aa9d8f" />
  <project path="art" name="platform/art" groups="notdefault,platform-"""
            + self.platform
            + """
  " /></manifest>
"""
        )
        self.maxDiff = None
        self._superproject = git_superproject.Superproject(
            manifest,
            name="superproject",
            remote=manifest.remotes.get("default-remote").ToRemoteSpec(
                "superproject"
            ),
            revision="refs/heads/main",
        )
        self.assertEqual(len(self._superproject._manifest.projects), 3)
        projects = self._superproject._manifest.projects
        data = (
            "160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00"
            "160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tvendor/x\x00"
        )
        with mock.patch.object(self._superproject, "_Init", return_value=True):
            with mock.patch.object(
                self._superproject, "_Fetch", return_value=True
            ):
                with mock.patch.object(
                    self._superproject, "_LsTree", return_value=data
                ):
                    # Create temporary directory so that it can write the file.
                    os.mkdir(self._superproject._superproject_path)
                    update_result = self._superproject.UpdateProjectsRevisionId(
                        projects, self.git_event_log
                    )
                    self.assertIsNotNone(update_result.manifest_path)
                    self.assertFalse(update_result.fatal)
                    with open(update_result.manifest_path) as fp:
                        manifest_xml_data = fp.read()
                    # Verify platform/vendor/x's project revision hasn't
                    # changed.
                    self.assertEqual(
                        sort_attributes(manifest_xml_data),
                        '<?xml version="1.0" ?><manifest>'
                        '<remote fetch="http://localhost" name="default-remote"/>'
                        '<default remote="default-remote" revision="refs/heads/main"/>'
                        '<project groups="notdefault,platform-'
                        + self.platform
                        + '" '
                        'name="platform/art" path="art" '
'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
'<project name="platform/vendor/x" path="vendor/x" '
'revision="e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06" upstream="refs/heads/main"/>'
'<project name="platform/vendor/y" path="vendor/y" '
'revision="52d3c9f7c107839ece2319d077de0cd922aa9d8f"/>'
'<superproject name="superproject"/>'
"</manifest>",
)
    def test_Fetch(self):
        manifest = self.getXmlManifest(
            """
<manifest>
  <remote name="default-remote" fetch="http://localhost" />
  <default remote="default-remote" revision="refs/heads/main" />
  <superproject name="superproject"/>
" /></manifest>
"""
        )
        self.maxDiff = None
        self._superproject = git_superproject.Superproject(
            manifest,
            name="superproject",
            remote=manifest.remotes.get("default-remote").ToRemoteSpec(
                "superproject"
            ),
            revision="refs/heads/main",
        )
        os.mkdir(self._superproject._superproject_path)
        os.mkdir(self._superproject._work_git)
        with mock.patch.object(self._superproject, "_Init", return_value=True):
            with mock.patch(
                "git_superproject.GitCommand", autospec=True
            ) as mock_git_command:
                with mock.patch(
                    "git_superproject.GitRefs.get", autospec=True
                ) as mock_git_refs:
                    instance = mock_git_command.return_value
                    instance.Wait.return_value = 0
                    mock_git_refs.side_effect = ["", "1234"]

                    self.assertTrue(self._superproject._Fetch())
                    self.assertEqual(
                        # TODO: Once we require Python 3.8+,
                        # use 'mock_git_command.call_args.args'.
                        mock_git_command.call_args[0],
                        (
                            None,
                            [
                                "fetch",
                                "http://localhost/superproject",
                                "--depth",
                                "1",
                                "--force",
                                "--no-tags",
                                "--filter",
                                "blob:none",
                                "refs/heads/main:refs/heads/main",
                            ],
                        ),
                    )

                    # If branch for revision exists, set as --negotiation-tip.
                    self.assertTrue(self._superproject._Fetch())
                    self.assertEqual(
                        # TODO: Once we require Python 3.8+,
                        # use 'mock_git_command.call_args.args'.
                        mock_git_command.call_args[0],
                        (
                            None,
                            [
                                "fetch",
                                "http://localhost/superproject",
                                "--depth",
                                "1",
                                "--force",
                                "--no-tags",
                                "--filter",
                                "blob:none",
                                "--negotiation-tip",
                                "1234",
                                "refs/heads/main:refs/heads/main",
                            ],
                        ),
                    )


@@ -27,361 +27,384 @@ import platform_utils
def serverLoggingThread(socket_path, server_ready, received_traces):
    """Helper function to receive logs over a Unix domain socket.

    Appends received messages on the provided socket and appends to
    received_traces.

    Args:
        socket_path: path to a Unix domain socket on which to listen for traces
        server_ready: a threading.Condition used to signal to the caller that
            this thread is ready to accept connections
        received_traces: a list to which received traces will be appended (after
            decoding to a utf-8 string).
    """
    platform_utils.remove(socket_path, missing_ok=True)
    data = b""
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
        sock.bind(socket_path)
        sock.listen(0)
        with server_ready:
            server_ready.notify()
        with sock.accept()[0] as conn:
            while True:
                recved = conn.recv(4096)
                if not recved:
                    break
                data += recved
    received_traces.extend(data.decode("utf-8").splitlines())


class EventLogTestCase(unittest.TestCase):
    """TestCase for the EventLog module."""

    PARENT_SID_KEY = "GIT_TRACE2_PARENT_SID"
    PARENT_SID_VALUE = "parent_sid"
    SELF_SID_REGEX = r"repo-\d+T\d+Z-.*"
    FULL_SID_REGEX = rf"^{PARENT_SID_VALUE}/{SELF_SID_REGEX}"

    def setUp(self):
        """Load the event_log module every time."""
        self._event_log_module = None
        # By default we initialize with the expected case where
        # repo launches us (so GIT_TRACE2_PARENT_SID is set).
        env = {
            self.PARENT_SID_KEY: self.PARENT_SID_VALUE,
        }
        self._event_log_module = git_trace2_event_log.EventLog(env=env)
        self._log_data = None

    def verifyCommonKeys(
        self, log_entry, expected_event_name=None, full_sid=True
    ):
        """Helper function to verify common event log keys."""
        self.assertIn("event", log_entry)
        self.assertIn("sid", log_entry)
        self.assertIn("thread", log_entry)
        self.assertIn("time", log_entry)

        # Do basic data format validation.
        if expected_event_name:
            self.assertEqual(expected_event_name, log_entry["event"])
        if full_sid:
            self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
        else:
            self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
        self.assertRegex(
            log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+\+00:00$"
        )

    def readLog(self, log_path):
        """Helper function to read log data into a list."""
        log_data = []
        with open(log_path, mode="rb") as f:
            for line in f:
                log_data.append(json.loads(line))
        return log_data

    def remove_prefix(self, s, prefix):
        """Return a copy string after removing |prefix| from |s|, if present or
        the original string."""
        if s.startswith(prefix):
            return s[len(prefix) :]
        else:
            return s

    def test_initial_state_with_parent_sid(self):
        """Test initial state when 'GIT_TRACE2_PARENT_SID' is set by parent."""
        self.assertRegex(self._event_log_module.full_sid, self.FULL_SID_REGEX)

    def test_initial_state_no_parent_sid(self):
        """Test initial state when 'GIT_TRACE2_PARENT_SID' is not set."""
        # Setup an empty environment dict (no parent sid).
        self._event_log_module = git_trace2_event_log.EventLog(env={})
        self.assertRegex(self._event_log_module.full_sid, self.SELF_SID_REGEX)

    def test_version_event(self):
        """Test 'version' event data is valid.

        Verify that the 'version' event is written even when no other
        events are added.

        Expected event log:
        <version event>
        """
        with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
            log_path = self._event_log_module.Write(path=tempdir)
            self._log_data = self.readLog(log_path)

        # A log with no added events should only have the version entry.
        self.assertEqual(len(self._log_data), 1)
        version_event = self._log_data[0]
        self.verifyCommonKeys(version_event, expected_event_name="version")
        # Check for 'version' event specific fields.
        self.assertIn("evt", version_event)
        self.assertIn("exe", version_event)
        # Verify "evt" version field is a string.
        self.assertIsInstance(version_event["evt"], str)

    def test_start_event(self):
        """Test and validate 'start' event data is valid.
        Expected event log:
        <version event>
        <start event>
        """
        self._event_log_module.StartEvent()
        with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
            log_path = self._event_log_module.Write(path=tempdir)
            self._log_data = self.readLog(log_path)

        self.assertEqual(len(self._log_data), 2)
        start_event = self._log_data[1]
        self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
        self.verifyCommonKeys(start_event, expected_event_name="start")
        # Check for 'start' event specific fields.
        self.assertIn("argv", start_event)
        self.assertTrue(isinstance(start_event["argv"], list))

    def test_exit_event_result_none(self):
"""Test 'exit' event data is valid when result is None.
We expect None result to be converted to 0 in the exit event data.
Expected event log:
<version event>
<exit event>
"""
self._event_log_module.ExitEvent(None)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
exit_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(exit_event, expected_event_name="exit")
# Check for 'exit' event specific fields.
self.assertIn("code", exit_event)
# 'None' result should convert to 0 (successful) return code.
self.assertEqual(exit_event["code"], 0)
def test_exit_event_result_integer(self):
"""Test 'exit' event data is valid when result is an integer.
Expected event log:
<version event>
<exit event>
"""
self._event_log_module.ExitEvent(2)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
exit_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(exit_event, expected_event_name="exit")
# Check for 'exit' event specific fields.
self.assertIn("code", exit_event)
self.assertEqual(exit_event["code"], 2)
def test_command_event(self):
"""Test and validate 'command' event data is valid.
Expected event log:
<version event>
<command event>
"""
name = "repo"
subcommands = ["init" "this"]
self._event_log_module.CommandEvent(
name="repo", subcommands=subcommands
)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
command_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(command_event, expected_event_name="command")
# Check for 'command' event specific fields.
self.assertIn("name", command_event)
self.assertIn("subcommands", command_event)
self.assertEqual(command_event["name"], name)
self.assertEqual(command_event["subcommands"], subcommands)
def test_def_params_event_repo_config(self):
"""Test 'def_params' event data outputs only repo config keys.
Expected event log:
<version event>
<def_param event>
<def_param event>
"""
config = {
"git.foo": "bar",
"repo.partialclone": "true",
"repo.partialclonefilter": "blob:none",
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 3)
def_param_events = self._log_data[1:]
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
for event in def_param_events:
self.verifyCommonKeys(event, expected_event_name="def_param")
# Check for 'def_param' event specific fields.
self.assertIn("param", event)
self.assertIn("value", event)
self.assertTrue(event["param"].startswith("repo."))
def test_def_params_event_no_repo_config(self):
"""Test 'def_params' event data won't output non-repo config keys.
Expected event log:
<version event>
"""
config = {
"git.foo": "bar",
"git.core.foo2": "baz",
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 1)
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
def test_data_event_config(self):
"""Test 'data' event data outputs all config keys.
Expected event log:
<version event>
<data event>
<data event>
"""
config = {
"git.foo": "bar",
"repo.partialclone": "false",
"repo.syncstate.superproject.hassuperprojecttag": "true",
"repo.syncstate.superproject.sys.argv": ["--", "sync", "protobuf"],
}
prefix_value = "prefix"
self._event_log_module.LogDataConfigEvents(config, prefix_value)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 5)
data_events = self._log_data[1:]
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
for event in data_events:
self.verifyCommonKeys(event)
# Check for 'data' event specific fields.
self.assertIn("key", event)
self.assertIn("value", event)
key = event["key"]
key = self.remove_prefix(key, f"{prefix_value}/")
value = event["value"]
self.assertEqual(
self._event_log_module.GetDataEventName(value), event["event"]
)
self.assertTrue(key in config and value == config[key])
def test_error_event(self):
"""Test and validate 'error' event data is valid.
Expected event log:
<version event>
<error event>
"""
msg = "invalid option: --cahced"
fmt = "invalid option: %s"
self._event_log_module.ErrorEvent(msg, fmt)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
error_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name="version")
self.verifyCommonKeys(error_event, expected_event_name="error")
# Check for 'error' event specific fields.
self.assertIn("msg", error_event)
self.assertIn("fmt", error_event)
self.assertEqual(error_event["msg"], f"RepoErrorEvent:{msg}")
self.assertEqual(error_event["fmt"], f"RepoErrorEvent:{fmt}")
def test_write_with_filename(self):
"""Test Write() with a path to a file exits with None."""
self.assertIsNone(self._event_log_module.Write(path="path/to/file"))
def test_write_with_git_config(self):
"""Test Write() uses the git config path when 'git config' call
succeeds."""
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
with mock.patch.object(
self._event_log_module,
"_GetEventTargetPath",
return_value=tempdir,
):
self.assertEqual(
os.path.dirname(self._event_log_module.Write()), tempdir
)
def test_write_no_git_config(self):
"""Test Write() with no git config variable present exits with None."""
with mock.patch.object(
self._event_log_module, "_GetEventTargetPath", return_value=None
):
self.assertIsNone(self._event_log_module.Write())
def test_write_non_string(self):
"""Test Write() with non-string type for |path| throws TypeError."""
with self.assertRaises(TypeError):
self._event_log_module.Write(path=1234)
def test_write_socket(self):
"""Test Write() with Unix domain socket for |path| and validate received
traces."""
received_traces = []
with tempfile.TemporaryDirectory(
prefix="test_server_sockets"
) as tempdir:
socket_path = os.path.join(tempdir, "server.sock")
server_ready = threading.Condition()
# Start "server" listening on Unix domain socket at socket_path.
try:
server_thread = threading.Thread(
target=serverLoggingThread,
args=(socket_path, server_ready, received_traces),
)
server_thread.start()
with server_ready:
server_ready.wait(timeout=120)
self._event_log_module.StartEvent()
path = self._event_log_module.Write(
path=f"af_unix:{socket_path}"
)
finally:
server_thread.join(timeout=5)
self.assertEqual(path, f"af_unix:stream:{socket_path}")
self.assertEqual(len(received_traces), 2)
version_event = json.loads(received_traces[0])
start_event = json.loads(received_traces[1])
self.verifyCommonKeys(version_event, expected_event_name="version")
self.verifyCommonKeys(start_event, expected_event_name="start")
# Check for 'start' event specific fields.
self.assertIn("argv", start_event)
self.assertIsInstance(start_event["argv"], list)
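The `Write()`/`readLog()` pair exercised above round-trips events as JSON lines (one object per line), matching the shape of git trace2's event target output. A self-contained sketch of that round trip, using made-up event payloads rather than repo's real ones:

```python
import json
import os
import tempfile

# Hypothetical event payloads; the real EventLog adds sid/time/thread keys.
events = [
    {"event": "version", "sid": "parent_sid/repo-1T2Z-x", "evt": "2"},
    {"event": "start", "argv": ["repo", "sync"]},
]
with tempfile.TemporaryDirectory() as tmp:
    log_path = os.path.join(tmp, "log.jsonl")
    # One JSON object per line, as a trace2 event log is written.
    with open(log_path, "w") as f:
        for e in events:
            f.write(json.dumps(e) + "\n")
    # Reading mirrors readLog(): parse each line independently.
    with open(log_path) as f:
        parsed = [json.loads(line) for line in f]
assert parsed == events
```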


@@ -14,42 +14,42 @@
"""Unittests for the hooks.py module."""

import unittest

import hooks


class RepoHookShebang(unittest.TestCase):
    """Check shebang parsing in RepoHook."""

    def test_no_shebang(self):
        """Lines w/out shebangs should be rejected."""
        DATA = ("", "#\n# foo\n", "# Bad shebang in script\n#!/foo\n")
        for data in DATA:
            self.assertIsNone(hooks.RepoHook._ExtractInterpFromShebang(data))

    def test_direct_interp(self):
        """Lines whose shebang points directly to the interpreter."""
        DATA = (
            ("#!/foo", "/foo"),
            ("#! /foo", "/foo"),
            ("#!/bin/foo ", "/bin/foo"),
            ("#! /usr/foo ", "/usr/foo"),
            ("#! /usr/foo -args", "/usr/foo"),
        )
        for shebang, interp in DATA:
            self.assertEqual(
                hooks.RepoHook._ExtractInterpFromShebang(shebang), interp
            )

    def test_env_interp(self):
        """Lines whose shebang launches through `env`."""
        DATA = (
            ("#!/usr/bin/env foo", "foo"),
            ("#!/bin/env foo", "foo"),
            ("#! /bin/env /bin/foo ", "/bin/foo"),
        )
        for shebang, interp in DATA:
            self.assertEqual(
                hooks.RepoHook._ExtractInterpFromShebang(shebang), interp
            )

File diff suppressed because it is too large.

@@ -22,29 +22,31 @@ import platform_utils
class RemoveTests(unittest.TestCase):
    """Check remove() helper."""

    def testMissingOk(self):
        """Check missing_ok handling."""
        with tempfile.TemporaryDirectory() as tmpdir:
            path = os.path.join(tmpdir, "test")

            # Should not fail.
            platform_utils.remove(path, missing_ok=True)

            # Should fail.
            self.assertRaises(OSError, platform_utils.remove, path)
            self.assertRaises(
                OSError, platform_utils.remove, path, missing_ok=False
            )

            # Should not fail if it exists.
            open(path, "w").close()
            platform_utils.remove(path, missing_ok=True)
            self.assertFalse(os.path.exists(path))

            open(path, "w").close()
            platform_utils.remove(path)
            self.assertFalse(os.path.exists(path))

            open(path, "w").close()
            platform_utils.remove(path, missing_ok=False)
            self.assertFalse(os.path.exists(path))


@@ -22,461 +22,515 @@ import tempfile
import unittest

import error
import git_command
import git_config
import manifest_xml
import platform_utils
import project

@contextlib.contextmanager
def TempGitTree():
    """Create a new empty git checkout for testing."""
    with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
        # Tests need to assume, that main is default branch at init,
        # which is not supported in config until 2.28.
        cmd = ["git", "init"]
        if git_command.git_require((2, 28, 0)):
            cmd += ["--initial-branch=main"]
        else:
            # Use template dir for init.
            templatedir = tempfile.mkdtemp(prefix=".test-template")
            with open(os.path.join(templatedir, "HEAD"), "w") as fp:
                fp.write("ref: refs/heads/main\n")
            cmd += ["--template", templatedir]
        subprocess.check_call(cmd, cwd=tempdir)
        yield tempdir

class FakeProject:
    """A fake for Project for basic functionality."""

    def __init__(self, worktree):
        self.worktree = worktree
        self.gitdir = os.path.join(worktree, ".git")
        self.name = "fakeproject"
        self.work_git = project.Project._GitGetByExec(
            self, bare=False, gitdir=self.gitdir
        )
        self.bare_git = project.Project._GitGetByExec(
            self, bare=True, gitdir=self.gitdir
        )
        self.config = git_config.GitConfig.ForRepository(gitdir=self.gitdir)

class ReviewableBranchTests(unittest.TestCase):
    """Check ReviewableBranch behavior."""

    def test_smoke(self):
        """A quick run through everything."""
        with TempGitTree() as tempdir:
            fakeproj = FakeProject(tempdir)

            # Generate some commits.
            with open(os.path.join(tempdir, "readme"), "w") as fp:
                fp.write("txt")
            fakeproj.work_git.add("readme")
            fakeproj.work_git.commit("-mAdd file")
            fakeproj.work_git.checkout("-b", "work")
            fakeproj.work_git.rm("-f", "readme")
            fakeproj.work_git.commit("-mDel file")

            # Start off with the normal details.
            rb = project.ReviewableBranch(
                fakeproj, fakeproj.config.GetBranch("work"), "main"
            )
            self.assertEqual("work", rb.name)
            self.assertEqual(1, len(rb.commits))
            self.assertIn("Del file", rb.commits[0])
            d = rb.unabbrev_commits
            self.assertEqual(1, len(d))
            short, long = next(iter(d.items()))
            self.assertTrue(long.startswith(short))
            self.assertTrue(rb.base_exists)
            # Hard to assert anything useful about this.
            self.assertTrue(rb.date)

            # Now delete the tracking branch!
            fakeproj.work_git.branch("-D", "main")
            rb = project.ReviewableBranch(
                fakeproj, fakeproj.config.GetBranch("work"), "main"
            )
            self.assertEqual(0, len(rb.commits))
            self.assertFalse(rb.base_exists)
            # Hard to assert anything useful about this.
            self.assertTrue(rb.date)


class ProjectTests(unittest.TestCase):
    """Check Project behavior."""

    def test_encode_patchset_description(self):
        self.assertEqual(
            project.Project._encode_patchset_description("abcd00!! +"),
            "abcd00%21%21_%2b",
        )

class CopyLinkTestCase(unittest.TestCase):
    """TestCase for stub repo client checkouts.

    It'll have a layout like this:
        tempdir/              # self.tempdir
            checkout/         # self.topdir
                git-project/  # self.worktree

    Attributes:
        tempdir: A dedicated temporary directory.
        worktree: The top of the repo client checkout.
        topdir: The top of a project checkout.
    """

    def setUp(self):
        self.tempdirobj = tempfile.TemporaryDirectory(prefix="repo_tests")
        self.tempdir = self.tempdirobj.name
        self.topdir = os.path.join(self.tempdir, "checkout")
        self.worktree = os.path.join(self.topdir, "git-project")
        os.makedirs(self.topdir)
        os.makedirs(self.worktree)

    def tearDown(self):
        self.tempdirobj.cleanup()

    @staticmethod
    def touch(path):
        with open(path, "w"):
            pass

    def assertExists(self, path, msg=None):
        """Make sure |path| exists."""
        if os.path.exists(path):
            return

        if msg is None:
            msg = ["path is missing: %s" % path]
            while path != "/":
                path = os.path.dirname(path)
                if not path:
                    # If we're given something like "foo", abort once we get to
                    # "".
                    break
                result = os.path.exists(path)
                msg.append(f"\tos.path.exists({path}): {result}")
                if result:
                    msg.append("\tcontents: %r" % os.listdir(path))
                    break
            msg = "\n".join(msg)

        raise self.failureException(msg)

class CopyFile(CopyLinkTestCase):
    """Check _CopyFile handling."""

    def CopyFile(self, src, dest):
        return project._CopyFile(self.worktree, src, self.topdir, dest)

    def test_basic(self):
        """Basic test of copying a file from a project to the toplevel."""
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        cf = self.CopyFile("foo.txt", "foo")
        cf._Copy()
        self.assertExists(os.path.join(self.topdir, "foo"))

    def test_src_subdir(self):
        """Copy a file from a subdir of a project."""
        src = os.path.join(self.worktree, "bar", "foo.txt")
        os.makedirs(os.path.dirname(src))
        self.touch(src)
        cf = self.CopyFile("bar/foo.txt", "new.txt")
        cf._Copy()
        self.assertExists(os.path.join(self.topdir, "new.txt"))

    def test_dest_subdir(self):
        """Copy a file to a subdir of a checkout."""
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        cf = self.CopyFile("foo.txt", "sub/dir/new.txt")
        self.assertFalse(os.path.exists(os.path.join(self.topdir, "sub")))
        cf._Copy()
        self.assertExists(os.path.join(self.topdir, "sub", "dir", "new.txt"))

    def test_update(self):
        """Make sure changed files get copied again."""
        src = os.path.join(self.worktree, "foo.txt")
        dest = os.path.join(self.topdir, "bar")
        with open(src, "w") as f:
            f.write("1st")
        cf = self.CopyFile("foo.txt", "bar")
        cf._Copy()
        self.assertExists(dest)
        with open(dest) as f:
            self.assertEqual(f.read(), "1st")

        with open(src, "w") as f:
            f.write("2nd!")
        cf._Copy()
        with open(dest) as f:
            self.assertEqual(f.read(), "2nd!")

    def test_src_block_symlink(self):
        """Do not allow reading from a symlinked path."""
        src = os.path.join(self.worktree, "foo.txt")
        sym = os.path.join(self.worktree, "sym")
        self.touch(src)
        platform_utils.symlink("foo.txt", sym)
        self.assertExists(sym)
        cf = self.CopyFile("sym", "foo")
        self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

    def test_src_block_symlink_traversal(self):
        """Do not allow reading through a symlink dir."""
        realfile = os.path.join(self.tempdir, "file.txt")
        self.touch(realfile)
        src = os.path.join(self.worktree, "bar", "file.txt")
        platform_utils.symlink(self.tempdir, os.path.join(self.worktree, "bar"))
        self.assertExists(src)
        cf = self.CopyFile("bar/file.txt", "foo")
        self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

    def test_src_block_copy_from_dir(self):
        """Do not allow copying from a directory."""
        src = os.path.join(self.worktree, "dir")
        os.makedirs(src)
        cf = self.CopyFile("dir", "foo")
        self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

    def test_dest_block_symlink(self):
        """Do not allow writing to a symlink."""
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        platform_utils.symlink("dest", os.path.join(self.topdir, "sym"))
        cf = self.CopyFile("foo.txt", "sym")
        self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

    def test_dest_block_symlink_traversal(self):
        """Do not allow writing through a symlink dir."""
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        platform_utils.symlink(
            tempfile.gettempdir(), os.path.join(self.topdir, "sym")
        )
        cf = self.CopyFile("foo.txt", "sym/foo.txt")
        self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

    def test_src_block_copy_to_dir(self):
        """Do not allow copying to a directory."""
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        os.makedirs(os.path.join(self.topdir, "dir"))
        cf = self.CopyFile("foo.txt", "dir")
        self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

class LinkFile(CopyLinkTestCase):
    """Check _LinkFile handling."""

    def LinkFile(self, src, dest):
        return project._LinkFile(self.worktree, src, self.topdir, dest)

    def test_basic(self):
        """Basic test of linking a file from a project into the toplevel."""
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        lf = self.LinkFile("foo.txt", "foo")
        lf._Link()
        dest = os.path.join(self.topdir, "foo")
        self.assertExists(dest)
        self.assertTrue(os.path.islink(dest))
        self.assertEqual(
            os.path.join("git-project", "foo.txt"), os.readlink(dest)
        )

    def test_src_subdir(self):
        """Link to a file in a subdir of a project."""
        src = os.path.join(self.worktree, "bar", "foo.txt")
        os.makedirs(os.path.dirname(src))
        self.touch(src)
        lf = self.LinkFile("bar/foo.txt", "foo")
        lf._Link()
        self.assertExists(os.path.join(self.topdir, "foo"))

    def test_src_self(self):
        """Link to the project itself."""
        dest = os.path.join(self.topdir, "foo", "bar")
        lf = self.LinkFile(".", "foo/bar")
        lf._Link()
        self.assertExists(dest)
        self.assertEqual(os.path.join("..", "git-project"), os.readlink(dest))

    def test_dest_subdir(self):
        """Link a file to a subdir of a checkout."""
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        lf = self.LinkFile("foo.txt", "sub/dir/foo/bar")
        self.assertFalse(os.path.exists(os.path.join(self.topdir, "sub")))
        lf._Link()
        self.assertExists(os.path.join(self.topdir, "sub", "dir", "foo", "bar"))

    def test_src_block_relative(self):
        """Do not allow relative symlinks."""
        BAD_SOURCES = (
            "./",
            "..",
            "../",
            "foo/.",
            "foo/./bar",
            "foo/..",
            "foo/../foo",
        )
        for src in BAD_SOURCES:
            lf = self.LinkFile(src, "foo")
            self.assertRaises(error.ManifestInvalidPathError, lf._Link)

    def test_update(self):
        """Make sure changed targets get updated."""
        dest = os.path.join(self.topdir, "sym")
        src = os.path.join(self.worktree, "foo.txt")
        self.touch(src)
        lf = self.LinkFile("foo.txt", "sym")
        lf._Link()
        self.assertEqual(
            os.path.join("git-project", "foo.txt"), os.readlink(dest)
        )

        # Point the symlink somewhere else.
        os.unlink(dest)
        platform_utils.symlink(self.tempdir, dest)
        lf._Link()
        self.assertEqual(
            os.path.join("git-project", "foo.txt"), os.readlink(dest)
        )

class MigrateWorkTreeTests(unittest.TestCase):
    """Check _MigrateOldWorkTreeGitDir handling."""

    _SYMLINKS = {
        "config",
        "description",
        "hooks",
        "info",
        "logs",
        "objects",
        "packed-refs",
        "refs",
        "rr-cache",
        "shallow",
        "svn",
    }
    _FILES = {
        "COMMIT_EDITMSG",
        "FETCH_HEAD",
        "HEAD",
        "index",
        "ORIG_HEAD",
        "unknown-file-should-be-migrated",
    }
    _CLEAN_FILES = {
        "a-vim-temp-file~",
        "#an-emacs-temp-file#",
    }

    @classmethod
    @contextlib.contextmanager
    def _simple_layout(cls):
        """Create a simple repo client checkout to test against."""
        with tempfile.TemporaryDirectory() as tempdir:
            tempdir = Path(tempdir)

            gitdir = tempdir / ".repo/projects/src/test.git"
            gitdir.mkdir(parents=True)
            cmd = ["git", "init", "--bare", str(gitdir)]
            subprocess.check_call(cmd)

            dotgit = tempdir / "src/test/.git"
            dotgit.mkdir(parents=True)
            for name in cls._SYMLINKS:
                (dotgit / name).symlink_to(
                    f"../../../.repo/projects/src/test.git/{name}"
                )
            for name in cls._FILES | cls._CLEAN_FILES:
                (dotgit / name).write_text(name)

            yield tempdir

    def test_standard(self):
        """Migrate a standard checkout that we expect."""
        with self._simple_layout() as tempdir:
            dotgit = tempdir / "src/test/.git"
            project.Project._MigrateOldWorkTreeGitDir(str(dotgit))

            # Make sure the dir was transformed into a symlink.
            self.assertTrue(dotgit.is_symlink())
            self.assertEqual(
                os.readlink(dotgit),
                os.path.normpath("../../.repo/projects/src/test.git"),
            )

            # Make sure files were moved over.
            gitdir = tempdir / ".repo/projects/src/test.git"
            for name in self._FILES:
                self.assertEqual(name, (gitdir / name).read_text())
            # Make sure files were removed.
            for name in self._CLEAN_FILES:
                self.assertFalse((gitdir / name).exists())

    def test_unknown(self):
        """A checkout with unknown files should abort."""
        with self._simple_layout() as tempdir:
            dotgit = tempdir / "src/test/.git"
            (tempdir / ".repo/projects/src/test.git/random-file").write_text(
                "one"
            )
            (dotgit / "random-file").write_text("two")
            with self.assertRaises(error.GitError):
                project.Project._MigrateOldWorkTreeGitDir(str(dotgit))

            # Make sure no content was actually changed.
            self.assertTrue(dotgit.is_dir())
            for name in self._FILES:
                self.assertTrue((dotgit / name).is_file())
            for name in self._CLEAN_FILES:
                self.assertTrue((dotgit / name).is_file())
            for name in self._SYMLINKS:
                self.assertTrue((dotgit / name).is_symlink())

class ManifestPropertiesFetchedCorrectly(unittest.TestCase):
    """Ensure properties are fetched properly."""

    def setUpManifest(self, tempdir):
        repodir = os.path.join(tempdir, ".repo")
        manifest_dir = os.path.join(repodir, "manifests")
        manifest_file = os.path.join(repodir, manifest_xml.MANIFEST_FILE_NAME)
        os.mkdir(repodir)
        os.mkdir(manifest_dir)
        manifest = manifest_xml.XmlManifest(repodir, manifest_file)

        return project.ManifestProject(
            manifest, "test/manifest", os.path.join(tempdir, ".git"), tempdir
        )

    def test_manifest_config_properties(self):
        """Test we are fetching the manifest config properties correctly."""
        with TempGitTree() as tempdir:
            fakeproj = self.setUpManifest(tempdir)

            # Set property using the expected Set method, then ensure
            # the property functions are using the correct Get methods.
            fakeproj.config.SetString(
                "manifest.standalone", "https://chicken/manifest.git"
            )
            self.assertEqual(
                fakeproj.standalone_manifest_url, "https://chicken/manifest.git"
            )

            fakeproj.config.SetString(
                "manifest.groups", "test-group, admin-group"
            )
            self.assertEqual(
                fakeproj.manifest_groups, "test-group, admin-group"
            )

            fakeproj.config.SetString("repo.reference", "mirror/ref")
            self.assertEqual(fakeproj.reference, "mirror/ref")

            fakeproj.config.SetBoolean("repo.dissociate", False)
            self.assertFalse(fakeproj.dissociate)

            fakeproj.config.SetBoolean("repo.archive", False)
            self.assertFalse(fakeproj.archive)

            fakeproj.config.SetBoolean("repo.mirror", False)
            self.assertFalse(fakeproj.mirror)

            fakeproj.config.SetBoolean("repo.worktree", False)
            self.assertFalse(fakeproj.use_worktree)

            fakeproj.config.SetBoolean("repo.clonebundle", False)
            self.assertFalse(fakeproj.clone_bundle)

            fakeproj.config.SetBoolean("repo.submodules", False)
            self.assertFalse(fakeproj.submodules)

            fakeproj.config.SetBoolean("repo.git-lfs", False)
            self.assertFalse(fakeproj.git_lfs)

            fakeproj.config.SetBoolean("repo.superproject", False)
            self.assertFalse(fakeproj.use_superproject)

            fakeproj.config.SetBoolean("repo.partialclone", False)
            self.assertFalse(fakeproj.partial_clone)

            fakeproj.config.SetString("repo.depth", "48")
            self.assertEqual(fakeproj.depth, 48)

            fakeproj.config.SetString("repo.depth", "invalid_depth")
            self.assertEqual(fakeproj.depth, None)

            fakeproj.config.SetString("repo.clonefilter", "blob:limit=10M")
            self.assertEqual(fakeproj.clone_filter, "blob:limit=10M")

            fakeproj.config.SetString(
                "repo.partialcloneexclude", "third_party/big_repo"
            )
            self.assertEqual(
                fakeproj.partial_clone_exclude, "third_party/big_repo"
            )

            fakeproj.config.SetString("manifest.platform", "auto")
            self.assertEqual(fakeproj.manifest_platform, "auto")

tests/test_repo_logging.py (new file, 101 lines)

@@ -0,0 +1,101 @@
# Copyright (C) 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Unit test for repo_logging module."""

import contextlib
import io
import logging
import unittest
from unittest import mock

from color import SetDefaultColoring
from error import RepoExitError
from repo_logging import RepoLogger


class TestRepoLogger(unittest.TestCase):
    @mock.patch.object(RepoLogger, "error")
    def test_log_aggregated_errors_logs_aggregated_errors(self, mock_error):
        """Test if log_aggregated_errors logs a list of aggregated errors."""
        logger = RepoLogger(__name__)
        logger.log_aggregated_errors(
            RepoExitError(
                aggregate_errors=[
                    Exception("foo"),
                    Exception("bar"),
                    Exception("baz"),
                    Exception("hello"),
                    Exception("world"),
                    Exception("test"),
                ]
            )
        )

        mock_error.assert_has_calls(
            [
                mock.call("=" * 80),
                mock.call(
                    "Repo command failed due to the following `%s` errors:",
                    "RepoExitError",
                ),
                mock.call("foo\nbar\nbaz\nhello\nworld"),
                mock.call("+%d additional errors...", 1),
            ]
        )

    @mock.patch.object(RepoLogger, "error")
    def test_log_aggregated_errors_logs_single_error(self, mock_error):
        """Test if log_aggregated_errors logs empty aggregated_errors."""
        logger = RepoLogger(__name__)
        logger.log_aggregated_errors(RepoExitError())

        mock_error.assert_has_calls(
            [
                mock.call("=" * 80),
                mock.call("Repo command failed: %s", "RepoExitError"),
            ]
        )

    def test_log_with_format_string(self):
        """Test different log levels with format strings."""
        # Set color output to "always" for consistent test results.
        # This ensures the logger's behavior is uniform across different
        # environments and git configurations.
        SetDefaultColoring("always")

        # Regex pattern to match optional ANSI color codes.
        # \033 - Escape character
        # \[ - Opening square bracket
        # [0-9;]* - Zero or more digits or semicolons
        # m - Ending 'm' character
        # ? - Makes the entire group optional
        opt_color = r"(\033\[[0-9;]*m)?"

        for level in (logging.INFO, logging.WARN, logging.ERROR):
            name = logging.getLevelName(level)

            with self.subTest(level=level, name=name):
                output = io.StringIO()

                with contextlib.redirect_stderr(output):
                    logger = RepoLogger(__name__)
                    logger.log(level, "%s", "100% pass")

                self.assertRegex(
                    output.getvalue().strip(),
                    f"^{opt_color}100% pass{opt_color}$",
                    f"failed for level {name}",
                )
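The first test above pins down an output contract: a separator line, a summary naming the error type, at most five error messages, then a count of the remainder. A minimal sketch of logic satisfying that contract might look like the following; the names `format_aggregated` and `MAX_PRINTED` are illustrative only, not repo_logging's real API.

```python
# Hypothetical sketch, not repo_logging's implementation: the cap of five
# printed errors is inferred from the "+1 additional errors" assertion.
MAX_PRINTED = 5


def format_aggregated(error_name, errors):
    """Return the lines an aggregated-error report would emit."""
    lines = ["=" * 80]
    if not errors:
        lines.append(f"Repo command failed: {error_name}")
        return lines
    lines.append(
        f"Repo command failed due to the following `{error_name}` errors:"
    )
    # Print at most MAX_PRINTED messages, then summarize the rest.
    lines.append("\n".join(str(e) for e in errors[:MAX_PRINTED]))
    extra = len(errors) - MAX_PRINTED
    if extra > 0:
        lines.append(f"+{extra} additional errors...")
    return lines
```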


@ -22,35 +22,39 @@ import repo_trace
class TraceTests(unittest.TestCase):
    """Check Trace behavior."""

    def testTrace_MaxSizeEnforced(self):
        content = "git chicken"

        with repo_trace.Trace(content, first_trace=True):
            pass
        first_trace_size = os.path.getsize(repo_trace._TRACE_FILE)

        with repo_trace.Trace(content):
            pass
        self.assertGreater(
            os.path.getsize(repo_trace._TRACE_FILE), first_trace_size
        )

        # Check we clear everything if the last chunk is larger than _MAX_SIZE.
        with mock.patch("repo_trace._MAX_SIZE", 0):
            with repo_trace.Trace(content, first_trace=True):
                pass
            self.assertEqual(
                first_trace_size, os.path.getsize(repo_trace._TRACE_FILE)
            )

        # Check we only clear the chunks we need to.
        repo_trace._MAX_SIZE = (first_trace_size + 1) / (1024 * 1024)
        with repo_trace.Trace(content, first_trace=True):
            pass
        self.assertEqual(
            first_trace_size * 2, os.path.getsize(repo_trace._TRACE_FILE)
        )

        with repo_trace.Trace(content, first_trace=True):
            pass
        self.assertEqual(
            first_trace_size * 2, os.path.getsize(repo_trace._TRACE_FILE)
        )


@ -23,52 +23,58 @@ import ssh
class SshTests(unittest.TestCase):
    """Tests the ssh functions."""

    def test_parse_ssh_version(self):
        """Check _parse_ssh_version() handling."""
        ver = ssh._parse_ssh_version("Unknown\n")
        self.assertEqual(ver, ())
        ver = ssh._parse_ssh_version("OpenSSH_1.0\n")
        self.assertEqual(ver, (1, 0))
        ver = ssh._parse_ssh_version(
            "OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n"
        )
        self.assertEqual(ver, (6, 6, 1))
        ver = ssh._parse_ssh_version(
            "OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n"
        )
        self.assertEqual(ver, (7, 6))
        ver = ssh._parse_ssh_version("OpenSSH_9.0p1, LibreSSL 3.3.6\n")
        self.assertEqual(ver, (9, 0))

    def test_version(self):
        """Check version() handling."""
        with mock.patch("ssh._run_ssh_version", return_value="OpenSSH_1.2\n"):
            self.assertEqual(ssh.version(), (1, 2))

    def test_context_manager_empty(self):
        """Verify context manager with no clients works correctly."""
        with multiprocessing.Manager() as manager:
            with ssh.ProxyManager(manager):
                pass

    def test_context_manager_child_cleanup(self):
        """Verify orphaned clients & masters get cleaned up."""
        with multiprocessing.Manager() as manager:
            with ssh.ProxyManager(manager) as ssh_proxy:
                client = subprocess.Popen(["sleep", "964853320"])
                ssh_proxy.add_client(client)
                master = subprocess.Popen(["sleep", "964853321"])
                ssh_proxy.add_master(master)
        # If the process still exists, these will throw timeout errors.
        client.wait(0)
        master.wait(0)

    def test_ssh_sock(self):
        """Check sock() function."""
        manager = multiprocessing.Manager()
        proxy = ssh.ProxyManager(manager)
        with mock.patch("tempfile.mkdtemp", return_value="/tmp/foo"):
            # Old ssh version uses port.
            with mock.patch("ssh.version", return_value=(6, 6)):
                self.assertTrue(proxy.sock().endswith("%p"))

            proxy._sock_path = None
            # New ssh version uses hash.
            with mock.patch("ssh.version", return_value=(6, 7)):
                self.assertTrue(proxy.sock().endswith("%C"))
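The version strings exercised above (including the newly added `OpenSSH_9.0p1, LibreSSL 3.3.6` case) can all be handled by a small regex. This is a sketch of the parsing behavior the tests imply, not `ssh._parse_ssh_version` itself:

```python
import re


def parse_ssh_version(ver_str):
    """Extract a numeric version tuple from `ssh -V`-style output.

    Returns () for unrecognized strings, mirroring what the tests above
    expect. Sketch only; the real parser lives in ssh.py.
    """
    m = re.match(r"^OpenSSH_([0-9.]+)", ver_str)
    if not m:
        return ()
    # "6.6.1p1" stops matching at "p", leaving "6.6.1".
    return tuple(int(part) for part in m.group(1).split(".") if part)
```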


@ -21,53 +21,71 @@ import subcmds
class AllCommands(unittest.TestCase):
    """Check registered all_commands."""

    def test_required_basic(self):
        """Basic checking of registered commands."""
        # NB: We don't test all subcommands as we want to avoid "change
        # detection" tests, so we just look for the most common/important ones
        # here that are unlikely to ever change.
        for cmd in {"cherry-pick", "help", "init", "start", "sync", "upload"}:
            self.assertIn(cmd, subcmds.all_commands)

    def test_naming(self):
        """Verify we don't add things that we shouldn't."""
        for cmd in subcmds.all_commands:
            # Reject filename suffixes like "help.py".
            self.assertNotIn(".", cmd)

            # Make sure all '_' were converted to '-'.
            self.assertNotIn("_", cmd)

            # Reject internal python paths like "__init__".
            self.assertFalse(cmd.startswith("__"))

    def test_help_desc_style(self):
        """Force some consistency in option descriptions.

        Python's optparse & argparse has a few default options like --help.
        Their option description text uses lowercase sentence fragments, so
        enforce our options follow the same style so UI is consistent.

        We enforce:
        * Text starts with lowercase.
        * Text doesn't end with period.
        """
        for name, cls in subcmds.all_commands.items():
            cmd = cls()
            parser = cmd.OptionParser
            for option in parser.option_list:
                if option.help == optparse.SUPPRESS_HELP:
                    continue

                c = option.help[0]
                self.assertEqual(
                    c.lower(),
                    c,
                    msg=f"subcmds/{name}.py: {option.get_opt_string()}: "
                    f'help text should start with lowercase: "{option.help}"',
                )

                self.assertNotEqual(
                    option.help[-1],
                    ".",
                    msg=f"subcmds/{name}.py: {option.get_opt_string()}: "
                    f'help text should not end in a period: "{option.help}"',
                )

    def test_cli_option_style(self):
        """Force some consistency in option flags."""
        for name, cls in subcmds.all_commands.items():
            cmd = cls()
            parser = cmd.OptionParser
            for option in parser.option_list:
                for opt in option._long_opts:
                    self.assertNotIn(
                        "_",
                        opt,
                        msg=f"subcmds/{name}.py: {opt}: only use dashes in "
                        "options, not underscores",
                    )


@ -20,30 +20,27 @@ from subcmds import init
class InitCommand(unittest.TestCase):
    """Check registered all_commands."""

    def setUp(self):
        self.cmd = init.Init()

    def test_cli_parser_good(self):
        """Check valid command line options."""
        ARGV = ([],)
        for argv in ARGV:
            opts, args = self.cmd.OptionParser.parse_args(argv)
            self.cmd.ValidateOptions(opts, args)

    def test_cli_parser_bad(self):
        """Check invalid command line options."""
        ARGV = (
            # Too many arguments.
            ["url", "asdf"],
            # Conflicting options.
            ["--mirror", "--archive"],
        )
        for argv in ARGV:
            opts, args = self.cmd.OptionParser.parse_args(argv)
            with self.assertRaises(SystemExit):
                self.cmd.ValidateOptions(opts, args)


@ -14,120 +14,502 @@
"""Unittests for the subcmds/sync.py module.""" """Unittests for the subcmds/sync.py module."""
import os import os
import shutil
import tempfile
import time
import unittest import unittest
from unittest import mock from unittest import mock
import pytest import pytest
import command import command
from error import GitError
from error import RepoExitError
from project import SyncNetworkHalfResult
from subcmds import sync from subcmds import sync
@pytest.mark.parametrize('use_superproject, cli_args, result', [ @pytest.mark.parametrize(
(True, ['--current-branch'], True), "use_superproject, cli_args, result",
(True, ['--no-current-branch'], True), [
(True, [], True), (True, ["--current-branch"], True),
(False, ['--current-branch'], True), (True, ["--no-current-branch"], True),
(False, ['--no-current-branch'], False), (True, [], True),
(False, [], None), (False, ["--current-branch"], True),
]) (False, ["--no-current-branch"], False),
(False, [], None),
],
)
def test_get_current_branch_only(use_superproject, cli_args, result): def test_get_current_branch_only(use_superproject, cli_args, result):
"""Test Sync._GetCurrentBranchOnly logic. """Test Sync._GetCurrentBranchOnly logic.
Sync._GetCurrentBranchOnly should return True if a superproject is requested, Sync._GetCurrentBranchOnly should return True if a superproject is
and otherwise the value of the current_branch_only option. requested, and otherwise the value of the current_branch_only option.
""" """
cmd = sync.Sync() cmd = sync.Sync()
opts, _ = cmd.OptionParser.parse_args(cli_args) opts, _ = cmd.OptionParser.parse_args(cli_args)
with mock.patch('git_superproject.UseSuperproject', with mock.patch(
return_value=use_superproject): "git_superproject.UseSuperproject", return_value=use_superproject
assert cmd._GetCurrentBranchOnly(opts, cmd.manifest) == result ):
assert cmd._GetCurrentBranchOnly(opts, cmd.manifest) == result
# Used to patch os.cpu_count() for reliable results. # Used to patch os.cpu_count() for reliable results.
OS_CPU_COUNT = 24 OS_CPU_COUNT = 24
@pytest.mark.parametrize('argv, jobs_manifest, jobs, jobs_net, jobs_check', [
# No user or manifest settings. @pytest.mark.parametrize(
([], None, OS_CPU_COUNT, 1, command.DEFAULT_LOCAL_JOBS), "argv, jobs_manifest, jobs, jobs_net, jobs_check",
# No user settings, so manifest settings control. [
([], 3, 3, 3, 3), # No user or manifest settings.
# User settings, but no manifest. ([], None, OS_CPU_COUNT, 1, command.DEFAULT_LOCAL_JOBS),
(['--jobs=4'], None, 4, 4, 4), # No user settings, so manifest settings control.
(['--jobs=4', '--jobs-network=5'], None, 4, 5, 4), ([], 3, 3, 3, 3),
(['--jobs=4', '--jobs-checkout=6'], None, 4, 4, 6), # User settings, but no manifest.
(['--jobs=4', '--jobs-network=5', '--jobs-checkout=6'], None, 4, 5, 6), (["--jobs=4"], None, 4, 4, 4),
(['--jobs-network=5'], None, OS_CPU_COUNT, 5, command.DEFAULT_LOCAL_JOBS), (["--jobs=4", "--jobs-network=5"], None, 4, 5, 4),
(['--jobs-checkout=6'], None, OS_CPU_COUNT, 1, 6), (["--jobs=4", "--jobs-checkout=6"], None, 4, 4, 6),
(['--jobs-network=5', '--jobs-checkout=6'], None, OS_CPU_COUNT, 5, 6), (["--jobs=4", "--jobs-network=5", "--jobs-checkout=6"], None, 4, 5, 6),
# User settings with manifest settings. (
(['--jobs=4'], 3, 4, 4, 4), ["--jobs-network=5"],
(['--jobs=4', '--jobs-network=5'], 3, 4, 5, 4), None,
(['--jobs=4', '--jobs-checkout=6'], 3, 4, 4, 6), OS_CPU_COUNT,
(['--jobs=4', '--jobs-network=5', '--jobs-checkout=6'], 3, 4, 5, 6), 5,
(['--jobs-network=5'], 3, 3, 5, 3), command.DEFAULT_LOCAL_JOBS,
(['--jobs-checkout=6'], 3, 3, 3, 6), ),
(['--jobs-network=5', '--jobs-checkout=6'], 3, 3, 5, 6), (["--jobs-checkout=6"], None, OS_CPU_COUNT, 1, 6),
# Settings that exceed rlimits get capped. (["--jobs-network=5", "--jobs-checkout=6"], None, OS_CPU_COUNT, 5, 6),
(['--jobs=1000000'], None, 83, 83, 83), # User settings with manifest settings.
([], 1000000, 83, 83, 83), (["--jobs=4"], 3, 4, 4, 4),
]) (["--jobs=4", "--jobs-network=5"], 3, 4, 5, 4),
(["--jobs=4", "--jobs-checkout=6"], 3, 4, 4, 6),
(["--jobs=4", "--jobs-network=5", "--jobs-checkout=6"], 3, 4, 5, 6),
(["--jobs-network=5"], 3, 3, 5, 3),
(["--jobs-checkout=6"], 3, 3, 3, 6),
(["--jobs-network=5", "--jobs-checkout=6"], 3, 3, 5, 6),
# Settings that exceed rlimits get capped.
(["--jobs=1000000"], None, 83, 83, 83),
([], 1000000, 83, 83, 83),
],
)
def test_cli_jobs(argv, jobs_manifest, jobs, jobs_net, jobs_check): def test_cli_jobs(argv, jobs_manifest, jobs, jobs_net, jobs_check):
"""Tests --jobs option behavior.""" """Tests --jobs option behavior."""
mp = mock.MagicMock() mp = mock.MagicMock()
mp.manifest.default.sync_j = jobs_manifest mp.manifest.default.sync_j = jobs_manifest
cmd = sync.Sync() cmd = sync.Sync()
opts, args = cmd.OptionParser.parse_args(argv) opts, args = cmd.OptionParser.parse_args(argv)
cmd.ValidateOptions(opts, args) cmd.ValidateOptions(opts, args)
with mock.patch.object(sync, '_rlimit_nofile', return_value=(256, 256)): with mock.patch.object(sync, "_rlimit_nofile", return_value=(256, 256)):
with mock.patch.object(os, 'cpu_count', return_value=OS_CPU_COUNT): with mock.patch.object(os, "cpu_count", return_value=OS_CPU_COUNT):
cmd._ValidateOptionsWithManifest(opts, mp) cmd._ValidateOptionsWithManifest(opts, mp)
assert opts.jobs == jobs assert opts.jobs == jobs
assert opts.jobs_network == jobs_net assert opts.jobs_network == jobs_net
assert opts.jobs_checkout == jobs_check assert opts.jobs_checkout == jobs_check
class LocalSyncState(unittest.TestCase):
    """Tests for LocalSyncState."""

    _TIME = 10

    def setUp(self):
        """Common setup."""
        self.topdir = tempfile.mkdtemp("LocalSyncState")
        self.repodir = os.path.join(self.topdir, ".repo")
        os.makedirs(self.repodir)

        self.manifest = mock.MagicMock(
            topdir=self.topdir,
            repodir=self.repodir,
            repoProject=mock.MagicMock(relpath=".repo/repo"),
        )
        self.state = self._new_state()

    def tearDown(self):
        """Common teardown."""
        shutil.rmtree(self.topdir)

    def _new_state(self, time=_TIME):
        with mock.patch("time.time", return_value=time):
            return sync.LocalSyncState(self.manifest)

    def test_set(self):
        """Times are set."""
        p = mock.MagicMock(relpath="projA")
        self.state.SetFetchTime(p)
        self.state.SetCheckoutTime(p)
        self.assertEqual(self.state.GetFetchTime(p), self._TIME)
        self.assertEqual(self.state.GetCheckoutTime(p), self._TIME)

    def test_update(self):
        """Times are updated."""
        with open(self.state._path, "w") as f:
            f.write(
                """
                {
                  "projB": {
                    "last_fetch": 5,
                    "last_checkout": 7
                  }
                }
                """
            )

        # Initialize state to read from the new file.
        self.state = self._new_state()
        projA = mock.MagicMock(relpath="projA")
        projB = mock.MagicMock(relpath="projB")
        self.assertEqual(self.state.GetFetchTime(projA), None)
        self.assertEqual(self.state.GetFetchTime(projB), 5)
        self.assertEqual(self.state.GetCheckoutTime(projB), 7)

        self.state.SetFetchTime(projA)
        self.state.SetFetchTime(projB)
        self.assertEqual(self.state.GetFetchTime(projA), self._TIME)
        self.assertEqual(self.state.GetFetchTime(projB), self._TIME)
        self.assertEqual(self.state.GetCheckoutTime(projB), 7)

    def test_save_to_file(self):
        """Data is saved under repodir."""
        p = mock.MagicMock(relpath="projA")
        self.state.SetFetchTime(p)
        self.state.Save()
        self.assertEqual(
            os.listdir(self.repodir), [".repo_localsyncstate.json"]
        )

    def test_partial_sync(self):
        """Partial sync state is detected."""
        with open(self.state._path, "w") as f:
            f.write(
                """
                {
                  "projA": {
                    "last_fetch": 5,
                    "last_checkout": 5
                  },
                  "projB": {
                    "last_fetch": 5,
                    "last_checkout": 5
                  }
                }
                """
            )

        # Initialize state to read from the new file.
        self.state = self._new_state()
        projB = mock.MagicMock(relpath="projB")
        self.assertEqual(self.state.IsPartiallySynced(), False)

        self.state.SetFetchTime(projB)
        self.state.SetCheckoutTime(projB)
        self.assertEqual(self.state.IsPartiallySynced(), True)

    def test_ignore_repo_project(self):
        """Sync data for repo project is ignored when checking partial sync."""
        p = mock.MagicMock(relpath="projA")
        self.state.SetFetchTime(p)
        self.state.SetCheckoutTime(p)
        self.state.SetFetchTime(self.manifest.repoProject)
        self.state.Save()
        self.assertEqual(self.state.IsPartiallySynced(), False)

        self.state = self._new_state(self._TIME + 1)
        self.state.SetFetchTime(self.manifest.repoProject)
        self.assertEqual(
            self.state.GetFetchTime(self.manifest.repoProject), self._TIME + 1
        )
        self.assertEqual(self.state.GetFetchTime(p), self._TIME)
        self.assertEqual(self.state.IsPartiallySynced(), False)

    def test_nonexistent_project(self):
        """Unsaved projects don't have data."""
        p = mock.MagicMock(relpath="projC")
        self.assertEqual(self.state.GetFetchTime(p), None)
        self.assertEqual(self.state.GetCheckoutTime(p), None)

    def test_prune_removed_projects(self):
        """Removed projects are pruned."""
        with open(self.state._path, "w") as f:
            f.write(
                """
                {
                  "projA": {
                    "last_fetch": 5
                  },
                  "projB": {
                    "last_fetch": 7
                  }
                }
                """
            )

        def mock_exists(path):
            if "projA" in path:
                return False
            return True

        projA = mock.MagicMock(relpath="projA")
        projB = mock.MagicMock(relpath="projB")
        self.state = self._new_state()
        self.assertEqual(self.state.GetFetchTime(projA), 5)
        self.assertEqual(self.state.GetFetchTime(projB), 7)
        with mock.patch("os.path.exists", side_effect=mock_exists):
            self.state.PruneRemovedProjects()
        self.assertIsNone(self.state.GetFetchTime(projA))

        self.state = self._new_state()
        self.assertIsNone(self.state.GetFetchTime(projA))
        self.assertEqual(self.state.GetFetchTime(projB), 7)

    def test_prune_removed_and_symlinked_projects(self):
        """Removed projects that still exist on disk as symlinks are pruned."""
        with open(self.state._path, "w") as f:
            f.write(
                """
                {
                  "projA": {
                    "last_fetch": 5
                  },
                  "projB": {
                    "last_fetch": 7
                  }
                }
                """
            )

        def mock_exists(path):
            return True

        def mock_islink(path):
            if "projB" in path:
                return True
            return False

        projA = mock.MagicMock(relpath="projA")
        projB = mock.MagicMock(relpath="projB")
        self.state = self._new_state()
        self.assertEqual(self.state.GetFetchTime(projA), 5)
        self.assertEqual(self.state.GetFetchTime(projB), 7)
        with mock.patch("os.path.exists", side_effect=mock_exists):
            with mock.patch("os.path.islink", side_effect=mock_islink):
                self.state.PruneRemovedProjects()
        self.assertIsNone(self.state.GetFetchTime(projB))

        self.state = self._new_state()
        self.assertIsNone(self.state.GetFetchTime(projB))
        self.assertEqual(self.state.GetFetchTime(projA), 5)
class FakeProject:
    def __init__(self, relpath):
        self.relpath = relpath

    def __str__(self):
        return f"project: {self.relpath}"

    def __repr__(self):
        return str(self)


class SafeCheckoutOrder(unittest.TestCase):
    def test_no_nested(self):
        p_f = FakeProject("f")
        p_foo = FakeProject("foo")
        out = sync._SafeCheckoutOrder([p_f, p_foo])
        self.assertEqual(out, [[p_f, p_foo]])

    def test_basic_nested(self):
        p_foo = FakeProject("foo")
        p_foo_bar = FakeProject("foo/bar")
        out = sync._SafeCheckoutOrder([p_foo, p_foo_bar])
        self.assertEqual(out, [[p_foo], [p_foo_bar]])

    def test_complex_nested(self):
        p_foo = FakeProject("foo")
        p_foobar = FakeProject("foobar")
        p_foo_dash_bar = FakeProject("foo-bar")
        p_foo_bar = FakeProject("foo/bar")
        p_foo_bar_baz_baq = FakeProject("foo/bar/baz/baq")
        p_bar = FakeProject("bar")
        out = sync._SafeCheckoutOrder(
            [
                p_foo_bar_baz_baq,
                p_foo,
                p_foobar,
                p_foo_dash_bar,
                p_foo_bar,
                p_bar,
            ]
        )
        self.assertEqual(
            out,
            [
                [p_bar, p_foo, p_foo_dash_bar, p_foobar],
                [p_foo_bar],
                [p_foo_bar_baz_baq],
            ],
        )
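The tests above imply a layering rule: a project belongs to layer N when exactly N of its path ancestors are themselves projects in the set, so parent checkouts always complete before nested ones start. A sketch consistent with those assertions (illustrative only, not `sync._SafeCheckoutOrder` itself):

```python
from collections import defaultdict


class Proj:
    """Stand-in for a project with a manifest-relative path."""

    def __init__(self, relpath):
        self.relpath = relpath


def safe_checkout_order(projects):
    """Group projects into layers that are safe to check out concurrently.

    Layer N holds projects with N ancestors in the project set; note that
    "foo-bar" and "foobar" are NOT nested under "foo" (only a "/" separator
    counts), which the complex test above relies on.
    """
    paths = {p.relpath for p in projects}

    def depth(p):
        parts = p.relpath.split("/")
        # Count proper path prefixes that are also projects.
        return sum(
            1 for i in range(1, len(parts)) if "/".join(parts[:i]) in paths
        )

    layers = defaultdict(list)
    for p in sorted(projects, key=lambda p: p.relpath):
        layers[depth(p)].append(p)
    return [layers[d] for d in sorted(layers)]
```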
class Chunksize(unittest.TestCase):
    """Tests for _chunksize."""

    def test_single_project(self):
        """Single project."""
        self.assertEqual(sync._chunksize(1, 1), 1)

    def test_low_project_count(self):
        """Multiple projects, low number of projects to sync."""
        self.assertEqual(sync._chunksize(10, 1), 10)
        self.assertEqual(sync._chunksize(10, 2), 5)
        self.assertEqual(sync._chunksize(10, 4), 2)
        self.assertEqual(sync._chunksize(10, 8), 1)
        self.assertEqual(sync._chunksize(10, 16), 1)

    def test_high_project_count(self):
        """Multiple projects, high number of projects to sync."""
        self.assertEqual(sync._chunksize(2800, 1), 32)
        self.assertEqual(sync._chunksize(2800, 16), 32)
        self.assertEqual(sync._chunksize(2800, 32), 32)
        self.assertEqual(sync._chunksize(2800, 64), 32)
        self.assertEqual(sync._chunksize(2800, 128), 21)
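Every value asserted above fits a simple clamped division: divide the projects evenly among jobs, but keep chunks between 1 and 32 so no worker monopolizes a huge batch. This is a sketch consistent with those assertions, not necessarily upstream's exact formula:

```python
def chunksize(projects, jobs):
    """Per-worker chunk size: even division, clamped to the range [1, 32].

    The clamp keeps chunks small enough for responsive progress reporting
    while avoiding single-project dispatch overhead on big checkouts.
    """
    return max(1, min(32, projects // jobs))
```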
class GetPreciousObjectsState(unittest.TestCase):
    """Tests for _GetPreciousObjectsState."""

    def setUp(self):
        """Common setup."""
        self.cmd = sync.Sync()
        self.project = p = mock.MagicMock(
            use_git_worktrees=False, UseAlternates=False
        )
        p.manifest.GetProjectsWithName.return_value = [p]

        self.opt = mock.Mock(spec_set=["this_manifest_only"])
        self.opt.this_manifest_only = False

    def test_worktrees(self):
        """False for worktrees."""
        self.project.use_git_worktrees = True
        self.assertFalse(
            self.cmd._GetPreciousObjectsState(self.project, self.opt)
        )

    def test_not_shared(self):
        """Singleton project."""
        self.assertFalse(
            self.cmd._GetPreciousObjectsState(self.project, self.opt)
        )

    def test_shared(self):
        """Shared project."""
        self.project.manifest.GetProjectsWithName.return_value = [
            self.project,
            self.project,
        ]
        self.assertTrue(
            self.cmd._GetPreciousObjectsState(self.project, self.opt)
        )

    def test_shared_with_alternates(self):
        """Shared project, with alternates."""
        self.project.manifest.GetProjectsWithName.return_value = [
            self.project,
            self.project,
        ]
        self.project.UseAlternates = True
        self.assertFalse(
            self.cmd._GetPreciousObjectsState(self.project, self.opt)
        )

    def test_not_found(self):
        """Project not found in manifest."""
        self.project.manifest.GetProjectsWithName.return_value = []
        self.assertFalse(
            self.cmd._GetPreciousObjectsState(self.project, self.opt)
        )
class SyncCommand(unittest.TestCase):
    """Tests for cmd.Execute."""

    def setUp(self):
        """Common setup."""
        self.repodir = tempfile.mkdtemp(".repo")
        self.manifest = manifest = mock.MagicMock(
            repodir=self.repodir,
        )

        git_event_log = mock.MagicMock(ErrorEvent=mock.Mock(return_value=None))
        self.outer_client = outer_client = mock.MagicMock()
        outer_client.manifest.IsArchive = True
        manifest.manifestProject.worktree = "worktree_path/"
        manifest.repoProject.LastFetch = time.time()
        self.sync_network_half_error = None
        self.sync_local_half_error = None
        self.cmd = sync.Sync(
            manifest=manifest,
            outer_client=outer_client,
            git_event_log=git_event_log,
        )

        def Sync_NetworkHalf(*args, **kwargs):
            return SyncNetworkHalfResult(True, self.sync_network_half_error)

        def Sync_LocalHalf(*args, **kwargs):
            if self.sync_local_half_error:
                raise self.sync_local_half_error

        self.project = p = mock.MagicMock(
            use_git_worktrees=False,
            UseAlternates=False,
            name="project",
            Sync_NetworkHalf=Sync_NetworkHalf,
            Sync_LocalHalf=Sync_LocalHalf,
            RelPath=mock.Mock(return_value="rel_path"),
        )
        p.manifest.GetProjectsWithName.return_value = [p]

        mock.patch.object(
            sync,
            "_PostRepoFetch",
            return_value=None,
        ).start()

        mock.patch.object(
            self.cmd, "GetProjects", return_value=[self.project]
        ).start()

        opt, _ = self.cmd.OptionParser.parse_args([])
        opt.clone_bundle = False
        opt.jobs = 4
        opt.quiet = True
        opt.use_superproject = False
        opt.current_branch_only = True
        opt.optimized_fetch = True
        opt.retry_fetches = 1
        opt.prune = False
        opt.auto_gc = False
        opt.repo_verify = False
        self.opt = opt

    def tearDown(self):
        mock.patch.stopall()

    def test_command_exit_error(self):
        """Ensure unsuccessful commands raise expected errors."""
        self.sync_network_half_error = GitError(
            "sync_network_half_error error", project=self.project
        )
        self.sync_local_half_error = GitError(
            "sync_local_half_error", project=self.project
        )
        with self.assertRaises(RepoExitError) as e:
            self.cmd.Execute(self.opt, [])
        self.assertIn(self.sync_local_half_error, e.aggregate_errors)
        self.assertIn(self.sync_network_half_error, e.aggregate_errors)


@ -0,0 +1,70 @@
# Copyright (C) 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Unittests for the subcmds/upload.py module."""

import unittest
from unittest import mock

from error import GitError
from error import UploadError
from subcmds import upload


class UnexpectedError(Exception):
    """An exception not expected by upload command."""


class UploadCommand(unittest.TestCase):
    """Check registered all_commands."""

    def setUp(self):
        self.cmd = upload.Upload()
        self.branch = mock.MagicMock()
        self.people = mock.MagicMock()
        self.opt, _ = self.cmd.OptionParser.parse_args([])
        mock.patch.object(
            self.cmd, "_AppendAutoList", return_value=None
        ).start()
        mock.patch.object(self.cmd, "git_event_log").start()

    def tearDown(self):
        mock.patch.stopall()

    def test_UploadAndReport_UploadError(self):
        """Check UploadExitError raised when UploadError encountered."""
        side_effect = UploadError("upload error")
        with mock.patch.object(
            self.cmd, "_UploadBranch", side_effect=side_effect
        ):
            with self.assertRaises(upload.UploadExitError):
                self.cmd._UploadAndReport(self.opt, [self.branch], self.people)

    def test_UploadAndReport_GitError(self):
        """Check UploadExitError raised when GitError encountered."""
        side_effect = GitError("some git error")
        with mock.patch.object(
            self.cmd, "_UploadBranch", side_effect=side_effect
        ):
            with self.assertRaises(upload.UploadExitError):
                self.cmd._UploadAndReport(self.opt, [self.branch], self.people)

    def test_UploadAndReport_UnhandledError(self):
        """Check UnexpectedError passed through."""
        side_effect = UnexpectedError("some os error")
        with mock.patch.object(
            self.cmd, "_UploadBranch", side_effect=side_effect
        ):
            with self.assertRaises(type(side_effect)):
                self.cmd._UploadAndReport(self.opt, [self.branch], self.people)


@ -20,9 +20,9 @@ from release import update_manpages
class UpdateManpagesTest(unittest.TestCase):
    """Tests the update-manpages code."""

    def test_replace_regex(self):
        """Check that replace_regex works."""
        data = "\n\033[1mSummary\033[m\n"
        self.assertEqual(update_manpages.replace_regex(data), "\nSummary\n")

File diff suppressed because it is too large.

tox.ini (30 changed lines)

@ -15,7 +15,8 @@
# https://tox.readthedocs.io/

[tox]
envlist = lint, py36, py37, py38, py39, py310, py311, py312
requires = virtualenv<20.22.0

[gh-actions]
python =
    3.8: py38
    3.9: py39
    3.10: py310
    3.11: py311
    3.12: py312

[testenv]
deps =
    -c constraints.txt
    black
    flake8
    isort
    pytest
    pytest-timeout
commands = {envpython} run_tests {posargs}
setenv =
    GIT_COMMITTER_NAME = Repo test committer
    EMAIL = repo@gerrit.nodomain

[testenv:lint]
skip_install = true
deps =
    -c constraints.txt
    black
    flake8
commands =
    black --check {posargs:. repo run_tests release/update-hooks release/update-manpages}
    flake8

[testenv:format]
skip_install = true
deps =
    -c constraints.txt
    black
    flake8
commands =
    black {posargs:. repo run_tests release/update-hooks release/update-manpages}
    flake8


@ -18,15 +18,19 @@ import importlib.util
import os


def WrapperDir():
    return os.path.dirname(__file__)


def WrapperPath():
    return os.path.join(WrapperDir(), "repo")


@functools.lru_cache(maxsize=None)
def Wrapper():
    modname = "wrapper"
    loader = importlib.machinery.SourceFileLoader(modname, WrapperPath())
    spec = importlib.util.spec_from_loader(modname, loader)
    module = importlib.util.module_from_spec(spec)
    loader.exec_module(module)
    return module