Compare commits

..

32 Commits

Author SHA1 Message Date
993af5e136 superproject: Use bugurl from contactinfo in the missing commits error message.
+ In XmlManifest._Unload set 'bugurl' to Wrapper().BUG_URL.
+ contactinfo returns a namedtuple.
+ bug_url can be accessed as self._manifest.contactinfo.bugurl.

Tested the code with the following commands.

$ ./run_tests -v

Added contactinfo tag to default.xml and verified that bugurl is used.

Bug: [google internal] b/186220520.
Change-Id: Iaafd6465e072b2e47a0a0b548bf6cb608a0b0a04
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/306342
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-05-18 15:35:54 +00:00
339f2df1dd ssh: rewrite proxy management for multiprocessing usage
We changed sync to use multiprocessing for parallel work.  This broke
the ssh proxy code as it's all based on threads.  Rewrite the logic to
be multiprocessing safe.

Now instead of the module acting as a stateful object, callers have to
instantiate a new ProxyManager class that holds all the state, and pass
that down to any users.

Bug: https://crbug.com/gerrit/12389
Change-Id: I4b1af116f7306b91e825d3c56fb4274c9b033562
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305486
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
2021-05-10 21:16:06 +00:00
19e409c818 ssh: move proxy usage to the sync subcommand
The only time we really need ssh proxies is when we want to run many
connections and reuse them.  That only happens when running sync.
Every other command makes at most two connections, and usually only
one or none.  So the effort of setting up & tearing down ssh
proxies isn't worth it most of the time.

The big reason we want to move this logic to sync is that it's now
using multiprocessing for parallel work.  The current ssh proxy code
is all based on threads, which means none of the logic is working
correctly.  The current ssh design makes it hard to fix when all of
the state lives in the global/module scope.

So the first step to fixing this is to move the setup & teardown to
the one place that really needs it: sync.  No other commands will use
proxies anymore, just direct connections.

Bug: https://crbug.com/gerrit/12389
Change-Id: Ibd351acdec39a87562b3013637c5df4ea34e03c6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305485
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-10 21:10:29 +00:00
4a58100251 launcher: bump version for new release
Change-Id: I1f204bb1e5ce6b13c623215236deef01efbc0f6c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305822
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-10 17:10:04 +00:00
0e8828c47b Handle 400 error code when attempting to fetch clone bundle.
Gitlab returns a 400 error when trying to fetch clone.bundle
from a repository containing the git-repo tool. The repo
launcher doesn't then fall back to not using a clone.bundle
file and the repo init fails.

Change-Id: Ia3390d0638ef9a39fb2fab84625b269d28caf1cc
Signed-off-by: Craig Northway <cnorthway@codeaurora.org>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305382
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-05-10 16:34:34 +00:00
23ea754524 sync: added --no-use-superproject to disable superproject.
Tested the code with the following commands.

$ ./run_tests -v

$ repo_dev sync -c -j8 --no-use-superproject
Fetching: 100% (1041/1041), done in 1m22.743s

$ repo_dev sync -c -j8 --use-superproject
WARNING: --use-superproject is experimental and not for general use
..

Bug: [google internal] b/187459275
Change-Id: I3f4269df38cd24a21723e8b2be5a1f013e7b5a91
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305682
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-05-08 00:04:00 +00:00
f907ced0fe sync: Recommend using --no-use-superproject if sync fails.
If superproject was not available for a branch, then the next
repo sync would also fail because --use-superproject is
remembered across repo init. In such cases, hopefully the hint
to use --no-use-superproject will help.

Tested the code with the following commands and by forcing
a failure.

$ ./run_tests -v

Bug: [google internal] b/187459275
Change-Id: Ie250812b7ba83afc230b5b1d154ba11f245f8b8a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305622
Reviewed-by: Xin Li <delphij@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-05-07 20:15:50 +00:00
b44294395f sync: refactor main fetch loop
This is a large chunk of code that is largely isolated.  Move it into
a class method to make it easier to manage & reason about, and in a
follow up CL, easier to scope.

Bug: https://crbug.com/gerrit/12389
Change-Id: I0c69d95a9e03478d347b761580b2343bffa012d5
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305484
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
2021-05-06 19:46:09 +00:00
5291eafa41 ssh: move all ssh logic to a common place
We had ssh logic sprinkled between two git modules, and neither was
quite the right home for it.  This largely moves the logic as-is to
its new home.  We'll leave major refactoring to followup commits.

Bug: https://crbug.com/gerrit/12389
Change-Id: I300a8f7dba74f2bd132232a5eb1e856a8490e0e9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305483
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-06 19:09:16 +00:00
8e768eaaa7 git_command: switch version caches to functools
Simplifies the code a bit to use the stdlib cache helper.

Change-Id: I778e90100ce748a71cc3a5a5d67dda403334315e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305482
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-06 18:36:25 +00:00
2f8fdbecde manifest_xml: cleanup of contactinfo test for readability with f-strings.
Tested the code with the following commands.

$ ./run_tests -v

Bug: [google internal] b/186220520.
Change-Id: I1c0f8958ff4c615707eec218250e8de753ec6562
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305282
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-05-05 15:59:58 +00:00
219431e1c9 sync: fix recursive fetching
Commit b2fa30a2b8 ("sync: switch network
fetch to multiprocessing") accidentally changed the variable passed to
the 2nd fetch call from |missing| to |to_fetch| due to a copy & paste
of the earlier changed logic.  Undo that to fix git submodule fetching.

Bug: https://crbug.com/gerrit/14489
Change-Id: I627954f80fd2e80d9d5809b530aa6b0ef9260abb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305262
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-05 02:42:23 +00:00
5ba80d404c git_config: hoist Windows ssh check earlier
The ssh master logic has never worked under Windows which is why this
code always returned False when running there (including cygwin).  But
the OS check was still done while holding the threading lock.  While
it might be a little slower than necessary, it still worked.

The switch from the threading module to the multiprocessing module
changed global behavior subtly under Windows and broke things: the
globals previously would stay valid, but now they get cleared.  So
the lock is reset to None in children workers.

We could tweak the logic to pass the lock through, but there isn't
much point when the rest of the code is still disabled in Windows.
So perform the platform check before we grab the lock.  This fixes
the crash, and probably speeds things up a few nanoseconds.

This shouldn't be a problem on Linux systems as the platform fork
will duplicate the existing process memory (including globals).

Bug: https://crbug.com/gerrit/14480
Change-Id: I1d1da82c6d7bd6b8cdc1f03f640a520ecd047063
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305149
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 23:49:29 +00:00
1c3f57e8f1 manifest_xml: initial support for <contactinfo>
It will be used to let manifest authors self-register contact info.
This element can be repeated, and any later entries will clobber
earlier ones. This would allow manifest authors who extend
manifests to specify their own contact info.

It has one required attribute: bugurl.
"bugurl" specifies the URL to file a bug against the manifest owner.

<contactinfo bugurl="bug-url"/>

TODO: This CL only implements the parsing logic and further work
will be in followup CLs.

Tested the code with the following commands.

$ ./run_tests tests/test_manifest_xml.py
$ ./run_tests -v

Bug: [google internal] b/186220520.
Change-Id: I47e765ba2dab5cdf850191129f4d4cd6b803f451
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305203
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-05-04 22:36:01 +00:00
05638bf771 sync: use manifest_name passed in
Commit fb527e3f52 ("sync: create dedicated
manifest project update func") refactored code from the main body into a
dedicated method.  The manifest_name was passed as an argument, but never
used; the code instead reaches back out to the command line options.  This
ignores the logic in the main loop where manifest_name might have changed
(like when using smart sync).

Change-Id: I4b84638fbb10c2b6f8f4b555e1475b0669c2daf4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305148
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 20:00:50 +00:00
c99322a6a9 sync: switch to multiprocessing.Event
We've switched most of this command over to multiprocessing and off
of _threading, so do the Event object too.  The APIs are the same
between the modules, so we shouldn't need to update anything else.

Change-Id: I52d31f1c6ef2bcbe7bbc1dd1add79a8d5d08784a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305147
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 20:00:41 +00:00
14208f4c93 sync: fix logic error with linkfile errors
Make sure err_update_linkfiles is always initialized.

Bug: https://crbug.com/gerrit/11008
Change-Id: I7bdd91f82507608ef967daf0fa0f9c859454e19f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305146
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 20:00:20 +00:00
2ee0a62db0 release-process: document the rate limiting in automatic updates
We check for updates only once per day, so clarify the docs.

Change-Id: Ib669ca6ebc67bc13204996fa40e1a3a82012295e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305145
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 19:37:48 +00:00
c177f944d9 subcmds: force consistent help text format
We're inconsistent with help text as to whether it uses title case and
whether it ends in a period.  Add a test to enforce a standard, and use
the style that Python optparse & argparse use themselves (e.g. with the
--help option): always lowercase, and never trailing period.

Change-Id: Ic1defae23daeac0ac9116aaf487427f50b34050d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305144
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 16:40:53 +00:00
aedd1e5ef0 sync: fix print error when handling server error
When converting this logic from print() to the output buffer, this
error codepath should have dropped the use of the file= redirect.

Bug: https://crbug.com/gerrit/14482
Change-Id: Ib484924a2031ba3295c1c1a5b9a2d816b9912279
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305142
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 16:40:28 +00:00
5a41b0be01 superproject: skip updating commit ids if remote's fetchUrl don't match.
Tested the code with the following commands.

$ ./run_tests -v

+ Test with local.xml
  $ repo_dev init -u sso://android.git.corp.google.com/platform/manifest -b master --use-superproject --partial-clone --clone-filter=blob:limit=10M && mkdir -p .repo/local_manifests && (gcertstatus -quiet=true || gcert) && ln -s /google/src/head/depot/google3/wireless/android/build_tools/aosp/manifests/mirror-aosp-master-with-vendor/local.xml  .repo/local_manifests/local.xml

  $ repo_dev sync -c -j8

+ Test without local.xml
  $ repo_dev init -u sso://android.git.corp.google.com/platform/manifest -b master --partial-clone --clone-filter=blob:limit=10M --repo-rev=main --use-superproject
  $ repo_dev sync -c -j8

Bug: [google internal] b/186395810
Change-Id: Id618113a91c12bcb90a30a3c23d3d6842bcb49e1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304942
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Raman Tenneti <rtenneti@google.com>
2021-05-04 15:32:23 +00:00
d68ed63328 init/sync: add --no-tags to match --tags
While this provides a way to undo earlier command line options (e.g.
`repo sync --tags --no-tags`) which can be helpful for scripting &
automation, this more importantly allows the user to override the
manifest settings for syncing tags from a project.

Bug: https://crbug.com/gerrit/12401
Change-Id: Id4c36cd82e6ca7cb073b5d63a09f6c7ccdebba83
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304904
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 11:32:17 +00:00
7356114d90 add --no-current-branch option to invert --current-branch
For most commands, this is more about providing a way to undo earlier
command line options (e.g. `repo info -c --no-current-branch`) which
can be helpful for scripting & automation.  But for the sync command,
this is helpful to undo the setting that exists in the manifest itself.

With this in place, tweak the sync current_branch_only logic to only
apply the manifest settings when the user hasn't specified a command
line option.

Bug: https://crbug.com/gerrit/12401
Change-Id: I21e2384624680cc740d1b5d1e49c50589d2fe6a0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304903
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-04 11:31:48 +00:00
b8e09ea1d6 harmonize --current-branch short option across subcommands
We're inconsistent with the short option for this flag:
* gitc-init: <none as -c is already used>
* info:      -b
* init:      -c
* overview:  -b
* sync:      -c
* upload:   --cbr

Since info & overview are not as heavily used as the others, switch
them from -b to -c.  We leave -b in as a hidden alias for now.

Similarly, switch upload from --cbr to just -c.  A lot of people
use --cbr, so we leave this as a hidden alias for now too.

Ideally gitc-init wouldn't use -c, but that ship has sailed, and
we're more likely to deprecate gitc entirely at this point.

This provides a consistent set of options across subcommands.

Bug: https://crbug.com/gerrit/12401
Change-Id: Iec249729223866fe1ea0ebabed12ca851cc38b35
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304902
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-03 16:57:23 +00:00
feb28914bd superproject: Don't update the commit ids of projects if remote is different.
1) Skip setting the revision id (commit id) for the projects whose
   remote doesn't match superproject's remote.
2) exp-superproject/superproject_override.xml includes local_manifest's
   projects. When we load this XML, don't reload projects from local.xml
   (otherwise we will get duplicate projects errors).

Tested the code with the following commands.

$ ./run_tests -v

+ Test with local.xml
  $ repo_dev init -u sso://android.git.corp.google.com/platform/manifest -b master --use-superproject --partial-clone --clone-filter=blob:limit=10M && mkdir -p .repo/local_manifests && (gcertstatus -quiet=true || gcert) && ln -s /google/src/head/depot/google3/wireless/android/build_tools/aosp/manifests/mirror-aosp-master-with-vendor/local.xml  .repo/local_manifests/local.xml

  $ repo_dev sync -c -j8

+ Test without local.xml
  $ repo_dev init -u sso://android.git.corp.google.com/platform/manifest -b master --partial-clone --clone-filter=blob:limit=10M --repo-rev=main --use-superproject
  $ repo_dev sync -c -j8

Bug: [google internal] b/186395810
Change-Id: I4e9d4ac2d94a9fc0cef0ccd787b6310758009e86
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304882
Tested-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-05-03 05:13:23 +00:00
d1f3e149df upload: search local projects in parallel
Search for project branches to upload in parallel.  This can cut the
lookup time in half for large projects.  We still run the actual hooks
in serial once we have the list of projects to process, but we would
need to rethink things quite a bit before we could handle running them
in parallel too.

Change-Id: I8da0cbc5010566aa860e1a158f3dc07f0709dcff
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304842
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-02 00:06:32 +00:00
29626b4f46 project: fix m/ generation when switching manifest branches
We were updating the per-checkout m/ pseudo ref when syncing, but we
only created the common m/ redirect when initializing a project for
the first time.  This is fine unless the user switches the manifest
branch in an existing project, then we never create that redirect.

Bug: https://crbug.com/gerrit/14468
Change-Id: I5325e7e602dcb4ce150bef258901ba5e9fdea461
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304822
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-02 00:05:55 +00:00
3b038cecc4 upload: include the project in error messages
When running upload across multiple projects, include the project in
any error messages that come up.  This lets users figure out where
the problem might be.

Change-Id: I09470c9a1b512baf910d6d97b747816d1a6f3a87
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304783
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-05-02 00:05:49 +00:00
a590e640a6 Update copyfile and linkfile if manifest updated
Currently, the copyfiles and linkfiles marked by "<copyfile/>"
and "<linkfile/>" in the manifest are created by the first
'repo sync'. But if a "<copyfile/>" or "<linkfile/>" is later
removed from the manifest, the next 'repo sync' does not remove
its dest from the sourcecode workspace.

This patch fixes the issue by saving a 'copy-link-files.json'
file in .repo and comparing it against the new dest paths on the
next sync. If any "<copyfile/>" or "<linkfile/>" was removed, its
dest path is removed from the sourcecode at the same time.

Bug: https://crbug.com/gerrit/11008
Change-Id: I6b7b41e94df0f9e6e52801ec755951a4c572d05d
Signed-off-by: jiajia tang <tangjiajia@xiaomi.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304202
Reviewed-by: Mike Frysinger <vapier@google.com>
2021-05-01 13:26:08 +00:00
f69c7ee318 manifest_xml: ban use of newlines in paths
There should be no valid use of these anywhere, so just ban them
to make things easier for people.

Bug: https://crbug.com/gerrit/14156
Bug: https://crbug.com/gerrit/14200
Change-Id: I8d2cf988c510c98194c43a329a2b9bf313a3f0a8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304662
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2021-04-30 05:54:11 +00:00
aabf79d3f0 sync: Fix a corner case when both superproject and depth used.
When depth is used together with superproject, we would fetch only
the SHA1; as a result, only the manifest branch was being recorded,
and commands like repo start would fail.

Fix this by saving the upstream branch value in the overlay
manifest and add the upstream branch to fetch list.

Bug: [google internal] b/185951360
Change-Id: Ib36f56067723f2572ed817785b31cc928ddfec0a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/304562
Reviewed-by: Raman Tenneti <rtenneti@google.com>
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Xin Li <delphij@google.com>
2021-04-29 19:05:47 +00:00
a1cd770d56 help/version: sprinkle bug report URL around
Make it a bit easier for people to locate bug reporting info.

Change-Id: If9c8939c84ebd52eb96b353c1797afa25868bb85
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/303943
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Raman Tenneti <rtenneti@google.com>
2021-04-26 21:43:43 +00:00
30 changed files with 964 additions and 513 deletions


@@ -110,6 +110,8 @@ Instead, you should use standard Git workflows like [git worktree] or
 [gitsubmodules] with [superprojects].
 ***
+*   `copy-link-files.json`: Tracking file used by `repo sync` to determine when
+    copyfile or linkfile are added or removed and need corresponding updates.
 *   `project.list`: Tracking file used by `repo sync` to determine when projects
     are added or removed and need corresponding updates in the checkout.
 *   `projects/`: Bare checkouts of every project synced by the manifest.  The


@@ -31,6 +31,7 @@ following DTD:
           extend-project*,
           repo-hooks?,
           superproject?,
+          contactinfo?,
           include*)>
 <!ELEMENT notice (#PCDATA)>

@@ -100,10 +101,13 @@ following DTD:
 <!ATTLIST repo-hooks in-project CDATA #REQUIRED>
 <!ATTLIST repo-hooks enabled-list CDATA #REQUIRED>
-<!ELEMENT superproject (EMPTY)>
+<!ELEMENT superproject EMPTY>
 <!ATTLIST superproject name CDATA #REQUIRED>
 <!ATTLIST superproject remote IDREF #IMPLIED>
+<!ELEMENT contactinfo EMPTY>
+<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
 <!ELEMENT include EMPTY>
 <!ATTLIST include name CDATA #REQUIRED>
 <!ATTLIST include groups CDATA #IMPLIED>
@@ -405,7 +409,7 @@ Attribute `enabled-list`: List of hooks to use, whitespace or comma separated.
 ### Element superproject
 ***
 *Note*: This is currently a WIP.
 ***
 NB: See the [git superprojects documentation](

@@ -424,6 +428,19 @@ same meaning as project's name attribute. See the
 Attribute `remote`: Name of a previously defined remote element.
 If not supplied the remote given by the default element is used.
+### Element contactinfo
+***
+*Note*: This is currently a WIP.
+***
+This element is used to let manifest authors self-register contact info.
+It has "bugurl" as a required attribute. This element can be repeated,
+and any later entries will clobber earlier ones. This would allow manifest
+authors who extend manifests to specify their own contact info.
+Attribute `bugurl`: The URL to file a bug against the manifest owner.
 ### Element include
 This element provides the capability of including another manifest


@@ -83,7 +83,8 @@ control how repo finds updates:
 *   `--repo-rev`: This tells repo which branch to use for the full project.
     It defaults to the `stable` branch (`REPO_REV` in the launcher script).
-Whenever `repo sync` is run, repo will check to see if an update is available.
+Whenever `repo sync` is run, repo will, once every 24 hours, see if an update
+is available.
 It fetches the latest repo-rev from the repo-url.
 Then it verifies that the latest commit in the branch has a valid signed tag
 using `git tag -v` (which uses gpg).

@@ -95,6 +96,11 @@ If that tag is valid, then repo will warn and use that commit instead.
 If that tag cannot be verified, it gives up and forces the user to resolve.
+### Force an update
+The `repo selfupdate` command can be used to force an immediate update.
+It is not subject to the 24 hour limitation.
 ## Branch management
 All development happens on the `main` branch and should generally be stable.


@@ -13,10 +13,6 @@
 # limitations under the License.
-# URL to file bug reports for repo tool issues.
-BUG_REPORT_URL = 'https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue'
 class ManifestParseError(Exception):
   """Failed to parse the manifest file.
   """


@@ -12,12 +12,10 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import functools
 import os
-import re
 import sys
 import subprocess
-import tempfile
-from signal import SIGTERM
 from error import GitError
 from git_refs import HEAD
@@ -42,101 +40,15 @@ GIT_DIR = 'GIT_DIR'
 LAST_GITDIR = None
 LAST_CWD = None
-_ssh_proxy_path = None
-_ssh_sock_path = None
-_ssh_clients = []
-_ssh_version = None
-
-def _run_ssh_version():
-  """run ssh -V to display the version number"""
-  return subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
-
-def _parse_ssh_version(ver_str=None):
-  """parse a ssh version string into a tuple"""
-  if ver_str is None:
-    ver_str = _run_ssh_version()
-  m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
-  if m:
-    return tuple(int(x) for x in m.group(1).split('.'))
-  else:
-    return ()
-
-def ssh_version():
-  """return ssh version as a tuple"""
-  global _ssh_version
-  if _ssh_version is None:
-    try:
-      _ssh_version = _parse_ssh_version()
-    except subprocess.CalledProcessError:
-      print('fatal: unable to detect ssh version', file=sys.stderr)
-      sys.exit(1)
-  return _ssh_version
-
-def ssh_sock(create=True):
-  global _ssh_sock_path
-  if _ssh_sock_path is None:
-    if not create:
-      return None
-    tmp_dir = '/tmp'
-    if not os.path.exists(tmp_dir):
-      tmp_dir = tempfile.gettempdir()
-    if ssh_version() < (6, 7):
-      tokens = '%r@%h:%p'
-    else:
-      tokens = '%C'  # hash of %l%h%p%r
-    _ssh_sock_path = os.path.join(
-        tempfile.mkdtemp('', 'ssh-', tmp_dir),
-        'master-' + tokens)
-  return _ssh_sock_path
-
-def _ssh_proxy():
-  global _ssh_proxy_path
-  if _ssh_proxy_path is None:
-    _ssh_proxy_path = os.path.join(
-        os.path.dirname(__file__),
-        'git_ssh')
-  return _ssh_proxy_path
-
-def _add_ssh_client(p):
-  _ssh_clients.append(p)
-
-def _remove_ssh_client(p):
-  try:
-    _ssh_clients.remove(p)
-  except ValueError:
-    pass
-
-def terminate_ssh_clients():
-  global _ssh_clients
-  for p in _ssh_clients:
-    try:
-      os.kill(p.pid, SIGTERM)
-      p.wait()
-    except OSError:
-      pass
-  _ssh_clients = []
-
-_git_version = None
 class _GitCall(object):
+  @functools.lru_cache(maxsize=None)
   def version_tuple(self):
-    global _git_version
-    if _git_version is None:
-      _git_version = Wrapper().ParseGitVersion()
-      if _git_version is None:
-        print('fatal: unable to detect git version', file=sys.stderr)
-        sys.exit(1)
-    return _git_version
+    ret = Wrapper().ParseGitVersion()
+    if ret is None:
+      print('fatal: unable to detect git version', file=sys.stderr)
+      sys.exit(1)
+    return ret

   def __getattr__(self, name):
     name = name.replace('_', '-')

@@ -254,7 +166,7 @@ class GitCommand(object):
                capture_stderr=False,
                merge_output=False,
                disable_editor=False,
-               ssh_proxy=False,
+               ssh_proxy=None,
                cwd=None,
                gitdir=None):
     env = self._GetBasicEnv()

@@ -262,8 +174,8 @@ class GitCommand(object):
     if disable_editor:
       env['GIT_EDITOR'] = ':'
     if ssh_proxy:
-      env['REPO_SSH_SOCK'] = ssh_sock()
-      env['GIT_SSH'] = _ssh_proxy()
+      env['REPO_SSH_SOCK'] = ssh_proxy.sock()
+      env['GIT_SSH'] = ssh_proxy.proxy
       env['GIT_SSH_VARIANT'] = 'ssh'
     if 'http_proxy' in env and 'darwin' == sys.platform:
       s = "'http.proxy=%s'" % (env['http_proxy'],)

@@ -346,7 +258,7 @@ class GitCommand(object):
       raise GitError('%s: %s' % (command[1], e))
     if ssh_proxy:
-      _add_ssh_client(p)
+      ssh_proxy.add_client(p)
     self.process = p

     if input:

@@ -358,7 +270,8 @@ class GitCommand(object):
     try:
       self.stdout, self.stderr = p.communicate()
     finally:
-      _remove_ssh_client(p)
+      if ssh_proxy:
+        ssh_proxy.remove_client(p)
     self.rc = p.wait()

   @staticmethod


@@ -18,25 +18,16 @@ from http.client import HTTPException
 import json
 import os
 import re
-import signal
 import ssl
 import subprocess
 import sys
-try:
-  import threading as _threading
-except ImportError:
-  import dummy_threading as _threading
-import time
 import urllib.error
 import urllib.request
 from error import GitError, UploadError
 import platform_utils
 from repo_trace import Trace
 from git_command import GitCommand
-from git_command import ssh_sock
-from git_command import terminate_ssh_clients
 from git_refs import R_CHANGES, R_HEADS, R_TAGS

 ID_RE = re.compile(r'^[0-9a-f]{40}$')
@ -440,127 +431,6 @@ class RefSpec(object):
return s return s
_master_processes = []
_master_keys = set()
_ssh_master = True
_master_keys_lock = None
def init_ssh():
"""Should be called once at the start of repo to init ssh master handling.
At the moment, all we do is to create our lock.
"""
global _master_keys_lock
assert _master_keys_lock is None, "Should only call init_ssh once"
_master_keys_lock = _threading.Lock()
def _open_ssh(host, port=None):
global _ssh_master
# Acquire the lock. This is needed to prevent opening multiple masters for
# the same host when we're running "repo sync -jN" (for N > 1) _and_ the
# manifest <remote fetch="ssh://xyz"> specifies a different host from the
# one that was passed to repo init.
_master_keys_lock.acquire()
try:
# Check to see whether we already think that the master is running; if we
# think it's already running, return right away.
if port is not None:
key = '%s:%s' % (host, port)
else:
key = host
if key in _master_keys:
return True
if (not _ssh_master
or 'GIT_SSH' in os.environ
or sys.platform in ('win32', 'cygwin')):
# failed earlier, or cygwin ssh can't do this
#
return False
# We will make two calls to ssh; this is the common part of both calls.
command_base = ['ssh',
'-o', 'ControlPath %s' % ssh_sock(),
host]
if port is not None:
command_base[1:1] = ['-p', str(port)]
# Since the key wasn't in _master_keys, we think that master isn't running.
# ...but before actually starting a master, we'll double-check. This can
# be important because we can't tell that 'git@myhost.com' is the same
# as 'myhost.com' where "User git" is set up in the user's ~/.ssh/config file.
check_command = command_base + ['-O', 'check']
try:
Trace(': %s', ' '.join(check_command))
check_process = subprocess.Popen(check_command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
check_process.communicate() # read output, but ignore it...
isnt_running = check_process.wait()
if not isnt_running:
# Our double-check found that the master _was_ in fact running. Add to
# the list of keys.
_master_keys.add(key)
return True
except Exception:
# Ignore exceptions. We will fall back to the normal command and print
# to the log there.
pass
command = command_base[:1] + ['-M', '-N'] + command_base[1:]
try:
Trace(': %s', ' '.join(command))
p = subprocess.Popen(command)
except Exception as e:
_ssh_master = False
print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
% (host, port, str(e)), file=sys.stderr)
return False
time.sleep(1)
ssh_died = (p.poll() is not None)
if ssh_died:
return False
_master_processes.append(p)
_master_keys.add(key)
return True
finally:
_master_keys_lock.release()
def close_ssh():
global _master_keys_lock
terminate_ssh_clients()
for p in _master_processes:
try:
os.kill(p.pid, signal.SIGTERM)
p.wait()
except OSError:
pass
del _master_processes[:]
_master_keys.clear()
d = ssh_sock(create=False)
if d:
try:
platform_utils.rmdir(os.path.dirname(d))
except OSError:
pass
# We're done with the lock, so we can delete it.
_master_keys_lock = None
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/') URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
@@ -612,27 +482,6 @@ def GetUrlCookieFile(url, quiet):
yield cookiefile, None yield cookiefile, None
def _preconnect(url):
m = URI_ALL.match(url)
if m:
scheme = m.group(1)
host = m.group(2)
if ':' in host:
host, port = host.split(':')
else:
port = None
if scheme in ('ssh', 'git+ssh', 'ssh+git'):
return _open_ssh(host, port)
return False
m = URI_SCP.match(url)
if m:
host = m.group(1)
return _open_ssh(host)
return False
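The removed `_preconnect` helper (like its `preconnect` replacement in the new ssh.py) decides from the URL alone whether an ssh master session is worth opening. Here is a standalone sketch of that classification; the `classify` helper name is illustrative, not part of repo itself:

```python
import re

# Same URL patterns as the repo source above.
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')

def classify(url):
    """Return (kind, host, port) for ssh-style URLs, else None."""
    m = URI_ALL.match(url)
    if m:
        scheme, host = m.group(1), m.group(2)
        port = None
        if ':' in host:
            host, port = host.split(':')
        if scheme in ('ssh', 'git+ssh', 'ssh+git'):
            return (scheme, host, port)
        return None  # http(s) and friends never need an ssh master
    m = URI_SCP.match(url)
    if m:
        # scp-like syntax: user@host:path
        return ('scp', m.group(1), None)
    return None
```

Only the ssh-like cases would proceed to `_open_ssh`; everything else returns immediately, which is why non-ssh syncs pay no proxy cost.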
class Remote(object): class Remote(object):
"""Configuration options related to a remote. """Configuration options related to a remote.
""" """
@@ -669,9 +518,23 @@ class Remote(object):
return self.url.replace(longest, longestUrl, 1) return self.url.replace(longest, longestUrl, 1)
def PreConnectFetch(self): def PreConnectFetch(self, ssh_proxy):
"""Run any setup for this remote before we connect to it.
In practice, if the remote is using SSH, we'll attempt to create a new
SSH master session to it for reuse across projects.
Args:
ssh_proxy: The SSH settings for managing master sessions.
Returns:
Whether the preconnect phase for this remote was successful.
"""
if not ssh_proxy:
return True
connectionUrl = self._InsteadOf() connectionUrl = self._InsteadOf()
return _preconnect(connectionUrl) return ssh_proxy.preconnect(connectionUrl)
def ReviewUrl(self, userEmail, validate_certs): def ReviewUrl(self, userEmail, validate_certs):
if self._review_url is None: if self._review_url is None:


@@ -26,7 +26,6 @@ import hashlib
import os import os
import sys import sys
from error import BUG_REPORT_URL
from git_command import GitCommand from git_command import GitCommand
from git_refs import R_HEADS from git_refs import R_HEADS
@@ -262,10 +261,20 @@ class Superproject(object):
return None return None
projects_missing_commit_ids = [] projects_missing_commit_ids = []
superproject_fetchUrl = self._manifest.superproject['remote'].fetchUrl
for project in projects: for project in projects:
path = project.relpath path = project.relpath
if not path: if not path:
continue continue
# Some manifests pull projects from the "chromium" GoB
# (remote="chromium") and have a private manifest that pulls projects
# from both the chromium GoB and the "chrome-internal" GoB (remote="chrome").
# For such projects, one of the remotes will differ from the
# superproject's remote. Until superproject supports multiple remotes,
# don't update the commit ids of remotes that don't match superproject's
# remote.
if project.remote.fetchUrl != superproject_fetchUrl:
continue
commit_id = commit_ids.get(path) commit_id = commit_ids.get(path)
if commit_id: if commit_id:
project.SetRevisionId(commit_id) project.SetRevisionId(commit_id)
@@ -273,7 +282,7 @@ class Superproject(object):
projects_missing_commit_ids.append(path) projects_missing_commit_ids.append(path)
if projects_missing_commit_ids: if projects_missing_commit_ids:
print('error: please file a bug using %s to report missing commit_ids for: %s' % print('error: please file a bug using %s to report missing commit_ids for: %s' %
(BUG_REPORT_URL, projects_missing_commit_ids), file=sys.stderr) (self._manifest.contactinfo.bugurl, projects_missing_commit_ids), file=sys.stderr)
return None return None
manifest_path = self._WriteManfiestFile() manifest_path = self._WriteManfiestFile()


@@ -39,7 +39,7 @@ from color import SetDefaultColoring
import event_log import event_log
from repo_trace import SetTrace from repo_trace import SetTrace
from git_command import user_agent from git_command import user_agent
from git_config import init_ssh, close_ssh, RepoConfig from git_config import RepoConfig
from git_trace2_event_log import EventLog from git_trace2_event_log import EventLog
from command import InteractiveCommand from command import InteractiveCommand
from command import MirrorSafeCommand from command import MirrorSafeCommand
@@ -591,8 +591,6 @@ def _Main(argv):
repo = _Repo(opt.repodir) repo = _Repo(opt.repodir)
try: try:
try:
init_ssh()
init_http() init_http()
name, gopts, argv = repo._ParseArgs(argv) name, gopts, argv = repo._ParseArgs(argv)
run = lambda: repo._Run(name, gopts, argv) or 0 run = lambda: repo._Run(name, gopts, argv) or 0
@@ -603,8 +601,6 @@ def _Main(argv):
result = tracer.runfunc(run) result = tracer.runfunc(run)
else: else:
result = run() result = run()
finally:
close_ssh()
except KeyboardInterrupt: except KeyboardInterrupt:
print('aborted by user', file=sys.stderr) print('aborted by user', file=sys.stderr)
result = 1 result = 1


@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import collections
import itertools import itertools
import os import os
import platform import platform
@@ -27,11 +28,15 @@ import platform_utils
from project import RemoteSpec, Project, MetaProject from project import RemoteSpec, Project, MetaProject
from error import (ManifestParseError, ManifestInvalidPathError, from error import (ManifestParseError, ManifestInvalidPathError,
ManifestInvalidRevisionError) ManifestInvalidRevisionError)
from wrapper import Wrapper
MANIFEST_FILE_NAME = 'manifest.xml' MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml' LOCAL_MANIFEST_NAME = 'local_manifest.xml'
LOCAL_MANIFESTS_DIR_NAME = 'local_manifests' LOCAL_MANIFESTS_DIR_NAME = 'local_manifests'
# ContactInfo has the self-registered bug url, supplied by the manifest authors.
ContactInfo = collections.namedtuple('ContactInfo', 'bugurl')
# urljoin gets confused if the scheme is not known. # urljoin gets confused if the scheme is not known.
urllib.parse.uses_relative.extend([ urllib.parse.uses_relative.extend([
'ssh', 'ssh',
@@ -479,6 +484,12 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
e.setAttribute('remote', remoteName) e.setAttribute('remote', remoteName)
root.appendChild(e) root.appendChild(e)
if self._contactinfo.bugurl != Wrapper().BUG_URL:
root.appendChild(doc.createTextNode(''))
e = doc.createElement('contactinfo')
e.setAttribute('bugurl', self._contactinfo.bugurl)
root.appendChild(e)
return doc return doc
def ToDict(self, **kwargs): def ToDict(self, **kwargs):
@@ -490,6 +501,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
'manifest-server', 'manifest-server',
'repo-hooks', 'repo-hooks',
'superproject', 'superproject',
'contactinfo',
} }
# Elements that may be repeated. # Elements that may be repeated.
MULTI_ELEMENTS = { MULTI_ELEMENTS = {
@@ -565,6 +577,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self._Load() self._Load()
return self._superproject return self._superproject
@property
def contactinfo(self):
self._Load()
return self._contactinfo
@property @property
def notice(self): def notice(self):
self._Load() self._Load()
@@ -595,6 +612,10 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
'repo.partialcloneexclude') or '' 'repo.partialcloneexclude') or ''
return set(x.strip() for x in exclude.split(',')) return set(x.strip() for x in exclude.split(','))
@property
def HasLocalManifests(self):
return self._load_local_manifests and self.local_manifests
@property @property
def IsMirror(self): def IsMirror(self):
return self.manifestProject.config.GetBoolean('repo.mirror') return self.manifestProject.config.GetBoolean('repo.mirror')
@@ -630,6 +651,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self._default = None self._default = None
self._repo_hooks_project = None self._repo_hooks_project = None
self._superproject = {} self._superproject = {}
self._contactinfo = ContactInfo(Wrapper().BUG_URL)
self._notice = None self._notice = None
self.branch = None self.branch = None
self._manifest_server = None self._manifest_server = None
@@ -872,6 +894,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
raise ManifestParseError("no remote for superproject %s within %s" % raise ManifestParseError("no remote for superproject %s within %s" %
(name, self.manifestFile)) (name, self.manifestFile))
self._superproject['remote'] = remote.ToRemoteSpec(name) self._superproject['remote'] = remote.ToRemoteSpec(name)
if node.nodeName == 'contactinfo':
bugurl = self._reqatt(node, 'bugurl')
# This element can be repeated; later entries will clobber earlier ones.
self._contactinfo = ContactInfo(bugurl)
if node.nodeName == 'remove-project': if node.nodeName == 'remove-project':
name = self._reqatt(node, 'name') name = self._reqatt(node, 'name')
@@ -1199,6 +1226,8 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if '~' in path: if '~' in path:
return '~ not allowed (due to 8.3 filenames on Windows filesystems)' return '~ not allowed (due to 8.3 filenames on Windows filesystems)'
path_codepoints = set(path)
# Some filesystems (like Apple's HFS+) try to normalize Unicode codepoints # Some filesystems (like Apple's HFS+) try to normalize Unicode codepoints
# which means there are alternative names for ".git". Reject paths with # which means there are alternative names for ".git". Reject paths with
# these in it as there shouldn't be any reasonable need for them here. # these in it as there shouldn't be any reasonable need for them here.
@@ -1222,10 +1251,17 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
u'\u206F', # NOMINAL DIGIT SHAPES u'\u206F', # NOMINAL DIGIT SHAPES
u'\uFEFF', # ZERO WIDTH NO-BREAK SPACE u'\uFEFF', # ZERO WIDTH NO-BREAK SPACE
} }
if BAD_CODEPOINTS & set(path): if BAD_CODEPOINTS & path_codepoints:
# This message is more expansive than reality, but should be fine. # This message is more expansive than reality, but should be fine.
return 'Unicode combining characters not allowed' return 'Unicode combining characters not allowed'
# Reject newlines as there shouldn't be any legitimate use for them; they'll
# be confusing to users, and they can easily break tools that expect to be
# able to iterate over newline delimited lists. This even applies to our
# own code like .repo/project.list.
if {'\r', '\n'} & path_codepoints:
return 'Newlines not allowed'
# Assume paths might be used on case-insensitive filesystems. # Assume paths might be used on case-insensitive filesystems.
path = path.lower() path = path.lower()

View File

@@ -1041,15 +1041,16 @@ class Project(object):
verbose=False, verbose=False,
output_redir=None, output_redir=None,
is_new=None, is_new=None,
current_branch_only=False, current_branch_only=None,
force_sync=False, force_sync=False,
clone_bundle=True, clone_bundle=True,
tags=True, tags=None,
archive=False, archive=False,
optimized_fetch=False, optimized_fetch=False,
retry_fetches=0, retry_fetches=0,
prune=False, prune=False,
submodules=False, submodules=False,
ssh_proxy=None,
clone_filter=None, clone_filter=None,
partial_clone_exclude=set()): partial_clone_exclude=set()):
"""Perform only the network IO portion of the sync process. """Perform only the network IO portion of the sync process.
@@ -1116,7 +1117,7 @@ class Project(object):
and self._ApplyCloneBundle(initial=is_new, quiet=quiet, verbose=verbose)): and self._ApplyCloneBundle(initial=is_new, quiet=quiet, verbose=verbose)):
is_new = False is_new = False
if not current_branch_only: if current_branch_only is None:
if self.sync_c: if self.sync_c:
current_branch_only = True current_branch_only = True
elif not self.manifest._loaded: elif not self.manifest._loaded:
@@ -1125,8 +1126,8 @@ class Project(object):
elif self.manifest.default.sync_c: elif self.manifest.default.sync_c:
current_branch_only = True current_branch_only = True
if not self.sync_tags: if tags is None:
tags = False tags = self.sync_tags
if self.clone_depth: if self.clone_depth:
depth = self.clone_depth depth = self.clone_depth
@@ -1143,6 +1144,7 @@ class Project(object):
alt_dir=alt_dir, current_branch_only=current_branch_only, alt_dir=alt_dir, current_branch_only=current_branch_only,
tags=tags, prune=prune, depth=depth, tags=tags, prune=prune, depth=depth,
submodules=submodules, force_sync=force_sync, submodules=submodules, force_sync=force_sync,
ssh_proxy=ssh_proxy,
clone_filter=clone_filter, retry_fetches=retry_fetches): clone_filter=clone_filter, retry_fetches=retry_fetches):
return False return False
@@ -1214,6 +1216,9 @@ class Project(object):
(self.revisionExpr, self.name)) (self.revisionExpr, self.name))
def SetRevisionId(self, revisionId): def SetRevisionId(self, revisionId):
if self.clone_depth or self.manifest.manifestProject.config.GetString('repo.depth'):
self.upstream = self.revisionExpr
self.revisionId = revisionId self.revisionId = revisionId
def Sync_LocalHalf(self, syncbuf, force_sync=False, submodules=False): def Sync_LocalHalf(self, syncbuf, force_sync=False, submodules=False):
@@ -1991,6 +1996,7 @@ class Project(object):
prune=False, prune=False,
depth=None, depth=None,
submodules=False, submodules=False,
ssh_proxy=None,
force_sync=False, force_sync=False,
clone_filter=None, clone_filter=None,
retry_fetches=2, retry_fetches=2,
@@ -2038,16 +2044,14 @@ class Project(object):
if not name: if not name:
name = self.remote.name name = self.remote.name
ssh_proxy = False
remote = self.GetRemote(name) remote = self.GetRemote(name)
if remote.PreConnectFetch(): if not remote.PreConnectFetch(ssh_proxy):
ssh_proxy = True ssh_proxy = None
if initial: if initial:
if alt_dir and 'objects' == os.path.basename(alt_dir): if alt_dir and 'objects' == os.path.basename(alt_dir):
ref_dir = os.path.dirname(alt_dir) ref_dir = os.path.dirname(alt_dir)
packed_refs = os.path.join(self.gitdir, 'packed-refs') packed_refs = os.path.join(self.gitdir, 'packed-refs')
remote = self.GetRemote(name)
all_refs = self.bare_ref.all all_refs = self.bare_ref.all
ids = set(all_refs.values()) ids = set(all_refs.values())
@@ -2134,6 +2138,8 @@ class Project(object):
# Shallow checkout of a specific commit, fetch from that commit and not # Shallow checkout of a specific commit, fetch from that commit and not
# the heads only as the commit might be deeper in the history. # the heads only as the commit might be deeper in the history.
spec.append(branch) spec.append(branch)
if self.upstream:
spec.append(self.upstream)
else: else:
if is_sha1: if is_sha1:
branch = self.upstream branch = self.upstream
@@ -2205,7 +2211,7 @@ class Project(object):
# Figure out how long to sleep before the next attempt, if there is one. # Figure out how long to sleep before the next attempt, if there is one.
if not verbose: if not verbose:
output_redir.write('\n%s:\n%s' % (self.name, gitcmd.stdout), file=sys.stderr) output_redir.write('\n%s:\n%s' % (self.name, gitcmd.stdout))
if try_n < retry_fetches - 1: if try_n < retry_fetches - 1:
output_redir.write('sleeping %s seconds before retrying' % retry_cur_sleep) output_redir.write('sleeping %s seconds before retrying' % retry_cur_sleep)
time.sleep(retry_cur_sleep) time.sleep(retry_cur_sleep)
@@ -2233,7 +2239,7 @@ class Project(object):
name=name, quiet=quiet, verbose=verbose, output_redir=output_redir, name=name, quiet=quiet, verbose=verbose, output_redir=output_redir,
current_branch_only=current_branch_only and depth, current_branch_only=current_branch_only and depth,
initial=False, alt_dir=alt_dir, initial=False, alt_dir=alt_dir,
depth=None, clone_filter=clone_filter) depth=None, ssh_proxy=ssh_proxy, clone_filter=clone_filter)
return ok return ok
@@ -2438,14 +2444,6 @@ class Project(object):
self.bare_objdir.init() self.bare_objdir.init()
if self.use_git_worktrees: if self.use_git_worktrees:
# Set up the m/ space to point to the worktree-specific ref space.
# We'll update the worktree-specific ref space on each checkout.
if self.manifest.branch:
self.bare_git.symbolic_ref(
'-m', 'redirecting to worktree scope',
R_M + self.manifest.branch,
R_WORKTREE_M + self.manifest.branch)
# Enable per-worktree config file support if possible. This is more a # Enable per-worktree config file support if possible. This is more a
# nice-to-have feature for users rather than a hard requirement. # nice-to-have feature for users rather than a hard requirement.
if git_require((2, 20, 0)): if git_require((2, 20, 0)):
@@ -2582,6 +2580,14 @@ class Project(object):
def _InitMRef(self): def _InitMRef(self):
if self.manifest.branch: if self.manifest.branch:
if self.use_git_worktrees: if self.use_git_worktrees:
# Set up the m/ space to point to the worktree-specific ref space.
# We'll update the worktree-specific ref space on each checkout.
ref = R_M + self.manifest.branch
if not self.bare_ref.symref(ref):
self.bare_git.symbolic_ref(
'-m', 'redirecting to worktree scope',
ref, R_WORKTREE_M + self.manifest.branch)
# We can't update this ref with git worktrees until it exists. # We can't update this ref with git worktrees until it exists.
# We'll wait until the initial checkout to set it. # We'll wait until the initial checkout to set it.
if not os.path.exists(self.worktree): if not os.path.exists(self.worktree):

repo

@@ -145,9 +145,11 @@ if not REPO_URL:
REPO_REV = os.environ.get('REPO_REV') REPO_REV = os.environ.get('REPO_REV')
if not REPO_REV: if not REPO_REV:
REPO_REV = 'stable' REPO_REV = 'stable'
# URL to file bug reports for repo tool issues.
BUG_URL = 'https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue'
# increment this whenever we make important changes to this script # increment this whenever we make important changes to this script
VERSION = (2, 14) VERSION = (2, 15)
# increment this if the MAINTAINER_KEYS block is modified # increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (2, 3) KEYRING_VERSION = (2, 3)
@@ -322,8 +324,14 @@ def InitParser(parser, gitc_init=False):
group.add_option(*cbr_opts, group.add_option(*cbr_opts,
dest='current_branch_only', action='store_true', dest='current_branch_only', action='store_true',
help='fetch only current manifest branch from server') help='fetch only current manifest branch from server')
group.add_option('--no-current-branch',
dest='current_branch_only', action='store_false',
help='fetch all manifest branches from server')
group.add_option('--tags',
action='store_true',
help='fetch tags in the manifest')
group.add_option('--no-tags', group.add_option('--no-tags',
dest='tags', default=True, action='store_false', dest='tags', action='store_false',
help="don't fetch tags in the manifest") help="don't fetch tags in the manifest")
# These are fundamentally different ways of structuring the checkout. # These are fundamentally different ways of structuring the checkout.
@@ -851,11 +859,10 @@ def _DownloadBundle(url, cwd, quiet, verbose):
try: try:
r = urllib.request.urlopen(url) r = urllib.request.urlopen(url)
except urllib.error.HTTPError as e: except urllib.error.HTTPError as e:
if e.code in [401, 403, 404, 501]: if e.code not in [400, 401, 403, 404, 501]:
print('warning: Cannot get %s' % url, file=sys.stderr)
print('warning: HTTP error %s' % e.code, file=sys.stderr)
return False return False
print('fatal: Cannot get %s' % url, file=sys.stderr)
print('fatal: HTTP error %s' % e.code, file=sys.stderr)
raise CloneFailure()
except urllib.error.URLError as e: except urllib.error.URLError as e:
print('fatal: Cannot get %s' % url, file=sys.stderr) print('fatal: Cannot get %s' % url, file=sys.stderr)
print('fatal: error %s' % e.reason, file=sys.stderr) print('fatal: error %s' % e.reason, file=sys.stderr)
@@ -1171,6 +1178,7 @@ The most commonly used repo commands are:
For access to the full online help, install repo ("repo init"). For access to the full online help, install repo ("repo init").
""") """)
print('Bug reports:', BUG_URL)
sys.exit(0) sys.exit(0)
@@ -1204,6 +1212,7 @@ def _Version():
print('OS %s %s (%s)' % (uname.system, uname.release, uname.version)) print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
print('CPU %s (%s)' % print('CPU %s (%s)' %
(uname.machine, uname.processor if uname.processor else 'unknown')) (uname.machine, uname.processor if uname.processor else 'unknown'))
print('Bug reports:', BUG_URL)
sys.exit(0) sys.exit(0)

ssh.py Normal file

@@ -0,0 +1,277 @@
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Common SSH management logic."""
import functools
import multiprocessing
import os
import re
import signal
import subprocess
import sys
import tempfile
import time
import platform_utils
from repo_trace import Trace
PROXY_PATH = os.path.join(os.path.dirname(__file__), 'git_ssh')
def _run_ssh_version():
"""run ssh -V to display the version number"""
return subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
def _parse_ssh_version(ver_str=None):
"""parse a ssh version string into a tuple"""
if ver_str is None:
ver_str = _run_ssh_version()
m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
if m:
return tuple(int(x) for x in m.group(1).split('.'))
else:
return ()
@functools.lru_cache(maxsize=None)
def version():
"""return ssh version as a tuple"""
try:
return _parse_ssh_version()
except subprocess.CalledProcessError:
print('fatal: unable to detect ssh version', file=sys.stderr)
sys.exit(1)
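As a quick sanity check of the banner parsing above, the same regex can be exercised standalone (the `parse_ssh_version` spelling below is just for illustration; in ssh.py the helper is `_parse_ssh_version`):

```python
import re

def parse_ssh_version(ver_str):
    """Turn an `ssh -V` banner into a comparable tuple, as ssh.py does."""
    m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
    if m:
        # '8.9' -> (8, 9); '6.6.1' -> (6, 6, 1)
        return tuple(int(x) for x in m.group(1).split('.'))
    return ()  # non-OpenSSH or unrecognized banner
```

Tuples compare element-wise, which is what makes the `version() < (6, 7)` cutoff in `sock()` below work.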
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
class ProxyManager:
"""Manage various ssh clients & masters that we spawn.
This will take care of sharing state between multiprocessing children, and
make sure that if we crash, we don't leak any of the ssh sessions.
The code should work with a single-process scenario too, and not add too much
overhead due to the manager.
"""
# Path to the ssh program to run which will pass our master settings along.
# Set here more as a convenience API.
proxy = PROXY_PATH
def __init__(self, manager):
# Protect access to the list of active masters.
self._lock = multiprocessing.Lock()
# List of active masters (pid). These will be spawned on demand, and we are
# responsible for shutting them all down at the end.
self._masters = manager.list()
# Set of active masters indexed by "host:port" information.
# The value isn't used, but multiprocessing doesn't provide a set class.
self._master_keys = manager.dict()
# Whether ssh masters are known to be broken, so we give up entirely.
self._master_broken = manager.Value('b', False)
# List of active ssh sessions. Clients will be added & removed as
# connections finish, so this list is just for safety & cleanup if we crash.
self._clients = manager.list()
# Path to directory for holding master sockets.
self._sock_path = None
def __enter__(self):
"""Enter a new context."""
return self
def __exit__(self, exc_type, exc_value, traceback):
"""Exit a context & clean up all resources."""
self.close()
def add_client(self, proc):
"""Track a new ssh session."""
self._clients.append(proc.pid)
def remove_client(self, proc):
"""Remove a completed ssh session."""
try:
self._clients.remove(proc.pid)
except ValueError:
pass
def add_master(self, proc):
"""Track a new master connection."""
self._masters.append(proc.pid)
def _terminate(self, procs):
"""Kill all |procs|."""
for pid in procs:
try:
os.kill(pid, signal.SIGTERM)
os.waitpid(pid, 0)
except OSError:
pass
# The multiprocessing.list() API doesn't provide many standard list()
# methods, so we have to manually clear the list.
while True:
try:
procs.pop(0)
except IndexError:
break
def close(self):
"""Close this active ssh session.
Kill all ssh clients & masters we created, and nuke the socket dir.
"""
self._terminate(self._clients)
self._terminate(self._masters)
d = self.sock(create=False)
if d:
try:
platform_utils.rmdir(os.path.dirname(d))
except OSError:
pass
def _open_unlocked(self, host, port=None):
"""Make sure a ssh master session exists for |host| & |port|.
If one doesn't exist already, we'll create it.
We won't grab any locks, so the caller has to do that. This helps keep the
business logic of actually creating the master separate from grabbing locks.
"""
# Check to see whether we already think that the master is running; if we
# think it's already running, return right away.
if port is not None:
key = '%s:%s' % (host, port)
else:
key = host
if key in self._master_keys:
return True
if self._master_broken.value or 'GIT_SSH' in os.environ:
# Failed earlier, so don't retry.
return False
# We will make two calls to ssh; this is the common part of both calls.
command_base = ['ssh', '-o', 'ControlPath %s' % self.sock(), host]
if port is not None:
command_base[1:1] = ['-p', str(port)]
# Since the key wasn't in _master_keys, we think that master isn't running.
# ...but before actually starting a master, we'll double-check. This can
# be important because we can't tell that 'git@myhost.com' is the same
# as 'myhost.com' where "User git" is set up in the user's ~/.ssh/config file.
check_command = command_base + ['-O', 'check']
try:
Trace(': %s', ' '.join(check_command))
check_process = subprocess.Popen(check_command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
check_process.communicate() # read output, but ignore it...
isnt_running = check_process.wait()
if not isnt_running:
# Our double-check found that the master _was_ in fact running. Add to
# the list of keys.
self._master_keys[key] = True
return True
except Exception:
# Ignore exceptions. We will fall back to the normal command and print
# to the log there.
pass
command = command_base[:1] + ['-M', '-N'] + command_base[1:]
try:
Trace(': %s', ' '.join(command))
p = subprocess.Popen(command)
except Exception as e:
self._master_broken.value = True
print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
% (host, port, str(e)), file=sys.stderr)
return False
time.sleep(1)
ssh_died = (p.poll() is not None)
if ssh_died:
return False
self.add_master(p)
self._master_keys[key] = True
return True
def _open(self, host, port=None):
"""Make sure a ssh master session exists for |host| & |port|.
If one doesn't exist already, we'll create it.
This will obtain any necessary locks to avoid inter-process races.
"""
# Bail before grabbing the lock if we already know that we aren't going to
# try creating new masters below.
if sys.platform in ('win32', 'cygwin'):
return False
# Acquire the lock. This is needed to prevent opening multiple masters for
# the same host when we're running "repo sync -jN" (for N > 1) _and_ the
# manifest <remote fetch="ssh://xyz"> specifies a different host from the
# one that was passed to repo init.
with self._lock:
return self._open_unlocked(host, port)
def preconnect(self, url):
"""If |uri| will create a ssh connection, setup the ssh master for it."""
m = URI_ALL.match(url)
if m:
scheme = m.group(1)
host = m.group(2)
if ':' in host:
host, port = host.split(':')
else:
port = None
if scheme in ('ssh', 'git+ssh', 'ssh+git'):
return self._open(host, port)
return False
m = URI_SCP.match(url)
if m:
host = m.group(1)
return self._open(host)
return False
def sock(self, create=True):
"""Return the path to the ssh socket dir.
This has all the master sockets so clients can talk to them.
"""
if self._sock_path is None:
if not create:
return None
tmp_dir = '/tmp'
if not os.path.exists(tmp_dir):
tmp_dir = tempfile.gettempdir()
if version() < (6, 7):
tokens = '%r@%h:%p'
else:
tokens = '%C' # hash of %l%h%p%r
self._sock_path = os.path.join(
tempfile.mkdtemp('', 'ssh-', tmp_dir),
'master-' + tokens)
return self._sock_path
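The version-dependent ControlPath token choice can be sketched in isolation. `make_sock_path` below is a hypothetical helper, not part of repo, that mirrors what `sock()` does:

```python
import os
import tempfile

def make_sock_path(ssh_version):
    """Build a master-socket path template like ProxyManager.sock() (simplified)."""
    tmp_dir = '/tmp'
    if not os.path.exists(tmp_dir):
        tmp_dir = tempfile.gettempdir()
    # OpenSSH >= 6.7 supports %C (a hash of %l%h%p%r), which keeps the path
    # short enough for the Unix socket limit; older versions spell it out.
    tokens = '%r@%h:%p' if ssh_version < (6, 7) else '%C'
    return os.path.join(tempfile.mkdtemp('', 'ssh-', tmp_dir), 'master-' + tokens)
```

ssh itself expands the percent tokens per connection, so one template serves every master socket in the directory.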


@@ -33,7 +33,7 @@ to the Unix 'patch' command.
def _Options(self, p): def _Options(self, p):
p.add_option('-u', '--absolute', p.add_option('-u', '--absolute',
dest='absolute', action='store_true', dest='absolute', action='store_true',
help='Paths are relative to the repository root') help='paths are relative to the repository root')
def _ExecuteOne(self, absolute, project): def _ExecuteOne(self, absolute, project):
"""Obtains the diff for a specific project. """Obtains the diff for a specific project.


@@ -68,10 +68,10 @@ synced and their revisions won't be found.
def _Options(self, p): def _Options(self, p):
p.add_option('--raw', p.add_option('--raw',
dest='raw', action='store_true', dest='raw', action='store_true',
help='Display raw diff.') help='display raw diff')
p.add_option('--no-color', p.add_option('--no-color',
dest='color', action='store_false', default=True, dest='color', action='store_false', default=True,
help='does not display the diff in color.') help='does not display the diff in color')
p.add_option('--pretty-format', p.add_option('--pretty-format',
dest='pretty_format', action='store', dest='pretty_format', action='store',
metavar='<FORMAT>', metavar='<FORMAT>',


@@ -131,30 +131,30 @@ without iterating through the remaining projects.
def _Options(self, p): def _Options(self, p):
p.add_option('-r', '--regex', p.add_option('-r', '--regex',
dest='regex', action='store_true', dest='regex', action='store_true',
help="Execute the command only on projects matching regex or wildcard expression") help='execute the command only on projects matching regex or wildcard expression')
p.add_option('-i', '--inverse-regex', p.add_option('-i', '--inverse-regex',
dest='inverse_regex', action='store_true', dest='inverse_regex', action='store_true',
help="Execute the command only on projects not matching regex or " help='execute the command only on projects not matching regex or '
"wildcard expression") 'wildcard expression')
p.add_option('-g', '--groups', p.add_option('-g', '--groups',
dest='groups', dest='groups',
help="Execute the command only on projects matching the specified groups") help='execute the command only on projects matching the specified groups')
p.add_option('-c', '--command', p.add_option('-c', '--command',
help='Command (and arguments) to execute', help='command (and arguments) to execute',
dest='command', dest='command',
action='callback', action='callback',
callback=self._cmd_option) callback=self._cmd_option)
p.add_option('-e', '--abort-on-errors', p.add_option('-e', '--abort-on-errors',
dest='abort_on_errors', action='store_true', dest='abort_on_errors', action='store_true',
help='Abort if a command exits unsuccessfully') help='abort if a command exits unsuccessfully')
p.add_option('--ignore-missing', action='store_true', p.add_option('--ignore-missing', action='store_true',
help='Silently skip & do not exit non-zero due missing ' help='silently skip & do not exit non-zero due missing '
'checkouts') 'checkouts')
g = p.get_option_group('--quiet') g = p.get_option_group('--quiet')
g.add_option('-p', g.add_option('-p',
dest='project_header', action='store_true', dest='project_header', action='store_true',
help='Show project headers before output') help='show project headers before output')
p.add_option('--interactive', p.add_option('--interactive',
action='store_true', action='store_true',
help='force interactive usage') help='force interactive usage')


@@ -33,7 +33,7 @@ and all locally downloaded sources.
   def _Options(self, p):
     p.add_option('-f', '--force',
                  dest='force', action='store_true',
-                 help='Force the deletion (no prompt).')
+                 help='force the deletion (no prompt)')

   def Execute(self, opt, args):
     if not opt.force:


@@ -20,6 +20,7 @@ from subcmds import all_commands
 from color import Coloring
 from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand
 import gitc_utils
+from wrapper import Wrapper

 class Help(PagedCommand, MirrorSafeCommand):
@@ -78,6 +79,7 @@ Displays detailed usage information about a command.
     print(
         "See 'repo help <command>' for more information on a specific command.\n"
         "See 'repo help --all' for a complete list of recognized commands.")
+    print('Bug reports:', Wrapper().BUG_URL)

   def _PrintCommandHelp(self, cmd, header_prefix=''):
     class _Out(Coloring):


@@ -12,6 +12,8 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import optparse
+
 from command import PagedCommand
 from color import Coloring
 from git_refs import R_M, R_HEADS
@@ -25,7 +27,7 @@ class _Coloring(Coloring):
 class Info(PagedCommand):
   common = True
   helpSummary = "Get info on the manifest branch, current branch or unmerged branches"
-  helpUsage = "%prog [-dl] [-o [-b]] [<project>...]"
+  helpUsage = "%prog [-dl] [-o [-c]] [<project>...]"

   def _Options(self, p):
     p.add_option('-d', '--diff',
@@ -34,12 +36,19 @@ class Info(PagedCommand):
     p.add_option('-o', '--overview',
                  dest='overview', action='store_true',
                  help='show overview of all local commits')
-    p.add_option('-b', '--current-branch',
+    p.add_option('-c', '--current-branch',
                  dest="current_branch", action="store_true",
                  help="consider only checked out branches")
+    p.add_option('--no-current-branch',
+                 dest='current_branch', action='store_false',
+                 help='consider all local branches')
+    # Turn this into a warning & remove this someday.
+    p.add_option('-b',
+                 dest='current_branch', action='store_true',
+                 help=optparse.SUPPRESS_HELP)
     p.add_option('-l', '--local-only',
                  dest="local", action="store_true",
-                 help="Disable all remote operations")
+                 help="disable all remote operations")

   def Execute(self, opt, args):
     self.out = _Coloring(self.client.globalConfig)
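The hunk above renames `-b` to `-c` while keeping the old letter as a hidden, still-functional alias. A standalone sketch of that deprecation pattern, using a plain `optparse.OptionParser` rather than repo's actual option plumbing:

```python
import optparse

def make_parser():
    p = optparse.OptionParser()
    # New spelling: -c/--current-branch, plus an explicit negation flag.
    p.add_option('-c', '--current-branch',
                 dest='current_branch', action='store_true',
                 help='consider only checked out branches')
    p.add_option('--no-current-branch',
                 dest='current_branch', action='store_false',
                 help='consider all local branches')
    # Old spelling kept as a hidden alias: it still parses, but
    # SUPPRESS_HELP keeps it out of the --help listing.
    p.add_option('-b',
                 dest='current_branch', action='store_true',
                 help=optparse.SUPPRESS_HELP)
    return p
```

Both `-b` and `-c` set the same `dest`, so downstream code never has to know which spelling the user typed, and the alias can be dropped later without touching anything else.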


@@ -36,22 +36,22 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
   def _Options(self, p):
     p.add_option('-r', '--regex',
                  dest='regex', action='store_true',
-                 help="Filter the project list based on regex or wildcard matching of strings")
+                 help='filter the project list based on regex or wildcard matching of strings')
     p.add_option('-g', '--groups',
                  dest='groups',
-                 help="Filter the project list based on the groups the project is in")
+                 help='filter the project list based on the groups the project is in')
     p.add_option('-a', '--all',
                  action='store_true',
-                 help='Show projects regardless of checkout state')
+                 help='show projects regardless of checkout state')
     p.add_option('-f', '--fullpath',
                  dest='fullpath', action='store_true',
-                 help="Display the full work tree path instead of the relative path")
+                 help='display the full work tree path instead of the relative path')
     p.add_option('-n', '--name-only',
                  dest='name_only', action='store_true',
-                 help="Display only the name of the repository")
+                 help='display only the name of the repository')
     p.add_option('-p', '--path-only',
                  dest='path_only', action='store_true',
-                 help="Display only the path of the repository")
+                 help='display only the path of the repository')

   def ValidateOptions(self, opt, args):
     if opt.fullpath and opt.name_only:


@@ -53,27 +53,27 @@ to indicate the remote ref to push changes to via 'repo upload'.
   def _Options(self, p):
     p.add_option('-r', '--revision-as-HEAD',
                  dest='peg_rev', action='store_true',
-                 help='Save revisions as current HEAD')
+                 help='save revisions as current HEAD')
     p.add_option('-m', '--manifest-name',
                  help='temporary manifest to use for this sync', metavar='NAME.xml')
     p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream',
                  default=True, action='store_false',
-                 help='If in -r mode, do not write the upstream field. '
-                      'Only of use if the branch names for a sha1 manifest are '
-                      'sensitive.')
+                 help='if in -r mode, do not write the upstream field '
+                      '(only of use if the branch names for a sha1 manifest are '
+                      'sensitive)')
     p.add_option('--suppress-dest-branch', dest='peg_rev_dest_branch',
                  default=True, action='store_false',
-                 help='If in -r mode, do not write the dest-branch field. '
-                      'Only of use if the branch names for a sha1 manifest are '
-                      'sensitive.')
+                 help='if in -r mode, do not write the dest-branch field '
+                      '(only of use if the branch names for a sha1 manifest are '
+                      'sensitive)')
     p.add_option('--json', default=False, action='store_true',
-                 help='Output manifest in JSON format (experimental).')
+                 help='output manifest in JSON format (experimental)')
     p.add_option('--pretty', default=False, action='store_true',
-                 help='Format output for humans to read.')
+                 help='format output for humans to read')
     p.add_option('-o', '--output-file',
                  dest='output_file',
                  default='-',
-                 help='File to save the manifest to',
+                 help='file to save the manifest to',
                  metavar='-|NAME.xml')

   def _Output(self, opt):


@@ -12,6 +12,8 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import optparse
+
 from color import Coloring
 from command import PagedCommand
@@ -26,15 +28,22 @@ class Overview(PagedCommand):
 The '%prog' command is used to display an overview of the projects branches,
 and list any local commits that have not yet been merged into the project.

-The -b/--current-branch option can be used to restrict the output to only
+The -c/--current-branch option can be used to restrict the output to only
 branches currently checked out in each project. By default, all branches
 are displayed.
 """

   def _Options(self, p):
-    p.add_option('-b', '--current-branch',
+    p.add_option('-c', '--current-branch',
                  dest="current_branch", action="store_true",
-                 help="Consider only checked out branches")
+                 help="consider only checked out branches")
+    p.add_option('--no-current-branch',
+                 dest='current_branch', action='store_false',
+                 help='consider all local branches')
+    # Turn this into a warning & remove this someday.
+    p.add_option('-b',
+                 dest='current_branch', action='store_true',
+                 help=optparse.SUPPRESS_HELP)

   def Execute(self, opt, args):
     all_branches = []


@@ -46,27 +46,27 @@ branch but need to incorporate new upstream changes "underneath" them.
     p.add_option('--fail-fast',
                  dest='fail_fast', action='store_true',
-                 help='Stop rebasing after first error is hit')
+                 help='stop rebasing after first error is hit')
     p.add_option('-f', '--force-rebase',
                  dest='force_rebase', action='store_true',
-                 help='Pass --force-rebase to git rebase')
+                 help='pass --force-rebase to git rebase')
     p.add_option('--no-ff',
                  dest='ff', default=True, action='store_false',
-                 help='Pass --no-ff to git rebase')
+                 help='pass --no-ff to git rebase')
     p.add_option('--autosquash',
                  dest='autosquash', action='store_true',
-                 help='Pass --autosquash to git rebase')
+                 help='pass --autosquash to git rebase')
     p.add_option('--whitespace',
                  dest='whitespace', action='store', metavar='WS',
-                 help='Pass --whitespace to git rebase')
+                 help='pass --whitespace to git rebase')
     p.add_option('--auto-stash',
                  dest='auto_stash', action='store_true',
-                 help='Stash local modifications before starting')
+                 help='stash local modifications before starting')
     p.add_option('-m', '--onto-manifest',
                  dest='onto_manifest', action='store_true',
-                 help='Rebase onto the manifest version instead of upstream '
-                      'HEAD. This helps to make sure the local tree stays '
-                      'consistent if you previously synced to a manifest.')
+                 help='rebase onto the manifest version instead of upstream '
+                      'HEAD (this helps to make sure the local tree stays '
+                      'consistent if you previously synced to a manifest)')

   def Execute(self, opt, args):
     all_projects = self.GetProjects(args)


@@ -12,6 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import errno
 import functools
 import http.cookiejar as cookielib
 import io
@@ -56,6 +57,7 @@ from error import RepoChangedException, GitError, ManifestParseError
 import platform_utils
 from project import SyncBuffer
 from progress import Progress
+import ssh
 from wrapper import Wrapper
 from manifest_xml import GitcManifest
@@ -168,6 +170,7 @@ later is required to fix a server side protocol bug.
   PARALLEL_JOBS = 1

   def _CommonOptions(self, p):
+    if self.manifest:
       try:
         self.PARALLEL_JOBS = self.manifest.default.sync_j
       except ManifestParseError:
@@ -212,6 +215,9 @@ later is required to fix a server side protocol bug.
     p.add_option('-c', '--current-branch',
                  dest='current_branch_only', action='store_true',
                  help='fetch only current branch from server')
+    p.add_option('--no-current-branch',
+                 dest='current_branch_only', action='store_false',
+                 help='fetch all branches from server')
     p.add_option('-m', '--manifest-name',
                  dest='manifest_name',
                  help='temporary manifest to use for this sync', metavar='NAME.xml')
@@ -230,8 +236,14 @@ later is required to fix a server side protocol bug.
                  help='fetch submodules from server')
     p.add_option('--use-superproject', action='store_true',
                  help='use the manifest superproject to sync projects')
+    p.add_option('--no-use-superproject', action='store_false',
+                 dest='use_superproject',
+                 help='disable use of manifest superprojects')
+    p.add_option('--tags',
+                 action='store_false',
+                 help='fetch tags')
     p.add_option('--no-tags',
-                 dest='tags', default=True, action='store_false',
+                 dest='tags', action='store_false',
                  help="don't fetch tags")
     p.add_option('--optimized-fetch',
                  dest='optimized_fetch', action='store_true',
@@ -268,15 +280,16 @@ later is required to fix a server side protocol bug.
   def _UseSuperproject(self, opt):
     """Returns True if use-superproject option is enabled"""
-    return (opt.use_superproject or
-            self.manifest.manifestProject.config.GetBoolean(
-                'repo.superproject'))
+    if opt.use_superproject is not None:
+      return opt.use_superproject
+    else:
+      return self.manifest.manifestProject.config.GetBoolean('repo.superproject')

   def _GetCurrentBranchOnly(self, opt):
     """Returns True if current-branch or use-superproject options are enabled."""
     return opt.current_branch_only or self._UseSuperproject(opt)
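The rewritten `_UseSuperproject` above turns the option into a tri-state: with no `default=`, `opt.use_superproject` stays `None` unless a flag was passed, and only in that case does the git config value apply. A minimal sketch of that precedence rule, with the config lookup stubbed out as a plain argument:

```python
import optparse

def make_parser():
    p = optparse.OptionParser()
    # No default= given, so the value is None unless a flag is passed.
    p.add_option('--use-superproject', dest='use_superproject',
                 action='store_true',
                 help='use the manifest superproject to sync projects')
    p.add_option('--no-use-superproject', dest='use_superproject',
                 action='store_false',
                 help='disable use of manifest superprojects')
    return p

def use_superproject(opts, config_value):
    # An explicit command-line choice always wins; only when the user
    # said nothing does the (stubbed) config setting decide.
    if opts.use_superproject is not None:
        return opts.use_superproject
    return config_value
```

The same three-way shape (`--flag`, `--no-flag`, config fallback) is what makes `--no-use-superproject` usable as an override even when `repo.superproject` is set in git config.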
-  def _UpdateProjectsRevisionId(self, opt, args):
+  def _UpdateProjectsRevisionId(self, opt, args, load_local_manifests):
     """Update revisionId of every project with the SHA from superproject.

     This function updates each project's revisionId with SHA from superproject.
@@ -286,6 +299,7 @@ later is required to fix a server side protocol bug.
       opt: Program options returned from optparse. See _Options().
       args: Arguments to pass to GetProjects. See the GetProjects
           docstring for details.
+      load_local_manifests: Whether to load local manifests.

     Returns:
       Returns path to the overriding manifest file.
@@ -298,10 +312,11 @@ later is required to fix a server side protocol bug.
         submodules_ok=opt.fetch_submodules)
     manifest_path = superproject.UpdateProjectsRevisionId(all_projects)
     if not manifest_path:
-      print('error: Update of revsionId from superproject has failed',
+      print('error: Update of revsionId from superproject has failed. '
+            'Please resync with --no-use-superproject option',
             file=sys.stderr)
       sys.exit(1)
-    self._ReloadManifest(manifest_path)
+    self._ReloadManifest(manifest_path, load_local_manifests)
     return manifest_path

   def _FetchProjectList(self, opt, projects):
@@ -343,6 +358,7 @@ later is required to fix a server side protocol bug.
         optimized_fetch=opt.optimized_fetch,
         retry_fetches=opt.retry_fetches,
         prune=opt.prune,
+        ssh_proxy=self.ssh_proxy,
         clone_filter=self.manifest.CloneFilter,
         partial_clone_exclude=self.manifest.PartialCloneExclude)
@@ -364,7 +380,11 @@ later is required to fix a server side protocol bug.
     finish = time.time()
     return (success, project, start, finish)

-  def _Fetch(self, projects, opt, err_event):
+  @classmethod
+  def _FetchInitChild(cls, ssh_proxy):
+    cls.ssh_proxy = ssh_proxy
+
+  def _Fetch(self, projects, opt, err_event, ssh_proxy):
     ret = True

     jobs = opt.jobs_network if opt.jobs_network else self.jobs
@@ -394,8 +414,14 @@ later is required to fix a server side protocol bug.
         break
       return ret

+    # We pass the ssh proxy settings via the class.  This allows multiprocessing
+    # to pickle it up when spawning children.  We can't pass it as an argument
+    # to _FetchProjectList below as multiprocessing is unable to pickle those.
+    Sync.ssh_proxy = None
+
     # NB: Multiprocessing is heavy, so don't spin it up for one job.
     if len(projects_list) == 1 or jobs == 1:
+      self._FetchInitChild(ssh_proxy)
       if not _ProcessResults(self._FetchProjectList(opt, x) for x in projects_list):
         ret = False
     else:
@@ -413,7 +439,8 @@ later is required to fix a server side protocol bug.
       else:
         pm.update(inc=0, msg='warming up')
       chunksize = 4
-      with multiprocessing.Pool(jobs) as pool:
+      with multiprocessing.Pool(
+          jobs, initializer=self._FetchInitChild, initargs=(ssh_proxy,)) as pool:
         results = pool.imap_unordered(
             functools.partial(self._FetchProjectList, opt),
             projects_list,
@@ -422,6 +449,11 @@ later is required to fix a server side protocol bug.
         ret = False
       pool.close()

+    # Cleanup the reference now that we're done with it, and we're going to
+    # release any resources it points to.  If we don't, later multiprocessing
+    # usage (e.g. checkouts) will try to pickle and then crash.
+    del Sync.ssh_proxy
+
     pm.end()
     self._fetch_times.Save()
@@ -430,6 +462,64 @@ later is required to fix a server side protocol bug.
     return (ret, fetched)
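The comment in the hunk above spells out the trick: the ssh proxy cannot be pickled as a per-task argument, so a pool initializer stashes it on the class, and each worker process receives it exactly once. A stripped-down sketch of the same pattern with a toy worker (the names here are illustrative, not repo's real `_FetchProjectList`):

```python
import multiprocessing

class Fetcher:
    # Set by the pool initializer in each worker process; not passed per
    # task, because in the real code the object is not picklable per-call.
    proxy = None

    @classmethod
    def _init_child(cls, proxy):
        cls.proxy = proxy

    @classmethod
    def fetch(cls, name):
        # A real worker would run the network fetch through cls.proxy.
        return '%s via %s' % (name, cls.proxy)

def fetch_all(names, proxy, jobs=1):
    if len(names) == 1 or jobs == 1:
        # NB: multiprocessing is heavy, so don't spin it up for one job;
        # set the class state directly and run serially.
        Fetcher._init_child(proxy)
        return [Fetcher.fetch(n) for n in names]
    with multiprocessing.Pool(jobs, initializer=Fetcher._init_child,
                              initargs=(proxy,)) as pool:
        return list(pool.imap_unordered(Fetcher.fetch, names))
```

Mirroring the diff, the serial path calls the initializer by hand so both branches see the same class state, and the real code deletes the class attribute afterwards so later `multiprocessing` use never tries to pickle it.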
+  def _FetchMain(self, opt, args, all_projects, err_event, manifest_name,
+                 load_local_manifests, ssh_proxy):
+    """The main network fetch loop.
+
+    Args:
+      opt: Program options returned from optparse. See _Options().
+      args: Command line args used to filter out projects.
+      all_projects: List of all projects that should be checked out.
+      err_event: Whether an error was hit while processing.
+      manifest_name: Manifest file to be reloaded.
+      load_local_manifests: Whether to load local manifests.
+      ssh_proxy: SSH manager for clients & masters.
+    """
+    rp = self.manifest.repoProject
+
+    to_fetch = []
+    now = time.time()
+    if _ONE_DAY_S <= (now - rp.LastFetch):
+      to_fetch.append(rp)
+    to_fetch.extend(all_projects)
+    to_fetch.sort(key=self._fetch_times.Get, reverse=True)
+
+    success, fetched = self._Fetch(to_fetch, opt, err_event, ssh_proxy)
+    if not success:
+      err_event.set()
+
+    _PostRepoFetch(rp, opt.repo_verify)
+    if opt.network_only:
+      # bail out now; the rest touches the working tree
+      if err_event.is_set():
+        print('\nerror: Exited sync due to fetch errors.\n', file=sys.stderr)
+        sys.exit(1)
+      return
+
+    # Iteratively fetch missing and/or nested unregistered submodules
+    previously_missing_set = set()
+    while True:
+      self._ReloadManifest(manifest_name, load_local_manifests)
+      all_projects = self.GetProjects(args,
+                                      missing_ok=True,
+                                      submodules_ok=opt.fetch_submodules)
+      missing = []
+      for project in all_projects:
+        if project.gitdir not in fetched:
+          missing.append(project)
+      if not missing:
+        break
+      # Stop us from non-stopped fetching actually-missing repos: If set of
+      # missing repos has not been changed from last fetch, we break.
+      missing_set = set(p.name for p in missing)
+      if previously_missing_set == missing_set:
+        break
+      previously_missing_set = missing_set
+      success, new_fetched = self._Fetch(missing, opt, err_event, ssh_proxy)
+      if not success:
+        err_event.set()
+      fetched.update(new_fetched)
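The submodule loop in `_FetchMain` is a fixed-point iteration: fetch whatever is still missing, reload the project list (which can grow as new manifests register submodules), and stop once a pass makes no progress. Its control flow, reduced to a skeleton with hypothetical callbacks:

```python
def fetch_until_stable(list_projects, fetch, fetched):
    """Fetch projects missing from |fetched| until the missing set stabilizes.

    Args:
      list_projects: Callable returning the current full project list; it may
          grow between passes as fetched manifests register nested submodules.
      fetch: Callable taking a list of missing projects, returning the set
          it managed to fetch.
      fetched: Set of already-fetched projects, updated in place.
    """
    previously_missing = set()
    while True:
        missing = [p for p in list_projects() if p not in fetched]
        if not missing:
            break
        # No progress since the last pass: the remaining repos are genuinely
        # unfetchable, so stop instead of looping forever.
        if previously_missing == set(missing):
            break
        previously_missing = set(missing)
        fetched.update(fetch(missing))
    return fetched
```

The `previously_missing` comparison is the termination guard the diff's comment describes: without it, a repo that can never be fetched would keep the loop spinning indefinitely.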
   def _CheckoutOne(self, detach_head, force_sync, project):
     """Checkout work tree for one project
@@ -564,10 +654,18 @@ later is required to fix a server side protocol bug.
       t.join()

     pm.end()

-  def _ReloadManifest(self, manifest_name=None):
+  def _ReloadManifest(self, manifest_name=None, load_local_manifests=True):
+    """Reload the manfiest from the file specified by the |manifest_name|.
+
+    It unloads the manifest if |manifest_name| is None.
+
+    Args:
+      manifest_name: Manifest file to be reloaded.
+      load_local_manifests: Whether to load local manifests.
+    """
     if manifest_name:
       # Override calls _Unload already
-      self.manifest.Override(manifest_name)
+      self.manifest.Override(manifest_name, load_local_manifests=load_local_manifests)
     else:
       self.manifest._Unload()
@@ -614,6 +712,60 @@ later is required to fix a server side protocol bug.
       fd.write('\n')
     return 0

+  def UpdateCopyLinkfileList(self):
+    """Save all dests of copyfile and linkfile, and update them if needed.
+
+    Returns:
+      Whether update was successful.
+    """
+    new_paths = {}
+    new_linkfile_paths = []
+    new_copyfile_paths = []
+    for project in self.GetProjects(None, missing_ok=True):
+      new_linkfile_paths.extend(x.dest for x in project.linkfiles)
+      new_copyfile_paths.extend(x.dest for x in project.copyfiles)
+
+    new_paths = {
+        'linkfile': new_linkfile_paths,
+        'copyfile': new_copyfile_paths,
+    }
+
+    copylinkfile_name = 'copy-link-files.json'
+    copylinkfile_path = os.path.join(self.manifest.repodir, copylinkfile_name)
+    old_copylinkfile_paths = {}
+
+    if os.path.exists(copylinkfile_path):
+      with open(copylinkfile_path, 'rb') as fp:
+        try:
+          old_copylinkfile_paths = json.load(fp)
+        except:
+          print('error: %s is not a json formatted file.' %
+                copylinkfile_path, file=sys.stderr)
+          platform_utils.remove(copylinkfile_path)
+          return False
+
+    need_remove_files = []
+    need_remove_files.extend(
+        set(old_copylinkfile_paths.get('linkfile', [])) -
+        set(new_linkfile_paths))
+    need_remove_files.extend(
+        set(old_copylinkfile_paths.get('copyfile', [])) -
+        set(new_copyfile_paths))
+
+    for need_remove_file in need_remove_files:
+      try:
+        platform_utils.remove(need_remove_file)
+      except OSError as e:
+        if e.errno == errno.ENOENT:
+          # Try to remove the updated copyfile or linkfile.
+          # So, if the file is not exist, nothing need to do.
+          pass
+
+    # Create copy-link-files.json, save dest path of "copyfile" and "linkfile".
+    with open(copylinkfile_path, 'w', encoding='utf-8') as fp:
+      json.dump(new_paths, fp)
+    return True
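`UpdateCopyLinkfileList` above is a ledger pattern: persist the set of paths a sync generated in a JSON file, then diff the old set against the new one to find files that are no longer produced and should be deleted. The diff step itself is pure set arithmetic, sketched here without any file I/O:

```python
def stale_paths(old, new):
    """Return paths recorded in |old| that are absent from the |new| state.

    Both arguments are dicts shaped like the copy-link-files.json ledger:
    {'linkfile': [paths...], 'copyfile': [paths...]}.
    """
    stale = []
    for kind in ('linkfile', 'copyfile'):
        # Anything the previous sync created that this sync no longer
        # declares is a leftover and should be removed from the tree.
        stale.extend(set(old.get(kind, [])) - set(new.get(kind, [])))
    return sorted(stale)
```

Deleting only the set difference (rather than everything recorded) is what lets a re-sync converge: paths still declared by the manifest are simply rewritten, while removed `<copyfile>`/`<linkfile>` entries stop leaving orphans behind.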
   def _SmartSyncSetup(self, opt, smart_sync_manifest_path):
     if not self.manifest.manifest_server:
       print('error: cannot smart sync: no manifest server defined in '
@@ -730,7 +882,7 @@ later is required to fix a server side protocol bug.
                          start, time.time(), clean)
     if not clean:
       sys.exit(1)
-    self._ReloadManifest(opt.manifest_name)
+    self._ReloadManifest(manifest_name)
     if opt.jobs is None:
       self.jobs = self.manifest.default.sync_j
@@ -779,7 +931,7 @@ later is required to fix a server side protocol bug.
         print('error: failed to remove existing smart sync override manifest: %s' %
               e, file=sys.stderr)

-    err_event = _threading.Event()
+    err_event = multiprocessing.Event()

     rp = self.manifest.repoProject
     rp.PreSync()
@@ -802,8 +954,9 @@ later is required to fix a server side protocol bug.
     else:
       self._UpdateManifestProject(opt, mp, manifest_name)

+    load_local_manifests = not self.manifest.HasLocalManifests
     if self._UseSuperproject(opt):
-      manifest_name = self._UpdateProjectsRevisionId(opt, args)
+      manifest_name = self._UpdateProjectsRevisionId(opt, args, load_local_manifests)

     if self.gitc_manifest:
       gitc_manifest_projects = self.GetProjects(args,
@@ -849,49 +1002,16 @@ later is required to fix a server side protocol bug.
     self._fetch_times = _FetchTimes(self.manifest)
     if not opt.local_only:
-      to_fetch = []
-      now = time.time()
-      if _ONE_DAY_S <= (now - rp.LastFetch):
-        to_fetch.append(rp)
-      to_fetch.extend(all_projects)
-      to_fetch.sort(key=self._fetch_times.Get, reverse=True)
-
-      success, fetched = self._Fetch(to_fetch, opt, err_event)
-      if not success:
-        err_event.set()
-
-      _PostRepoFetch(rp, opt.repo_verify)
+      with multiprocessing.Manager() as manager:
+        with ssh.ProxyManager(manager) as ssh_proxy:
+          # Initialize the socket dir once in the parent.
+          ssh_proxy.sock()
+          self._FetchMain(opt, args, all_projects, err_event, manifest_name,
+                          load_local_manifests, ssh_proxy)

       if opt.network_only:
-        # bail out now; the rest touches the working tree
-        if err_event.is_set():
-          print('\nerror: Exited sync due to fetch errors.\n', file=sys.stderr)
-          sys.exit(1)
         return
-
-      # Iteratively fetch missing and/or nested unregistered submodules
-      previously_missing_set = set()
-      while True:
-        self._ReloadManifest(manifest_name)
-        all_projects = self.GetProjects(args,
-                                        missing_ok=True,
-                                        submodules_ok=opt.fetch_submodules)
-        missing = []
-        for project in all_projects:
-          if project.gitdir not in fetched:
-            missing.append(project)
-        if not missing:
-          break
-        # Stop us from non-stopped fetching actually-missing repos: If set of
-        # missing repos has not been changed from last fetch, we break.
-        missing_set = set(p.name for p in missing)
-        if previously_missing_set == missing_set:
-          break
-        previously_missing_set = missing_set
-        success, new_fetched = self._Fetch(to_fetch, opt, err_event)
-        if not success:
-          err_event.set()
-        fetched.update(new_fetched)
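The replacement above nests two context managers around the whole network phase: a `multiprocessing.Manager` that owns cross-process shared state, and an ssh proxy manager that records every master it starts so they can all be torn down when sync leaves the block. A toy version of that shape (this `ProxyManager` is a stand-in, not the real class from repo's `ssh` module):

```python
import multiprocessing

class ProxyManager:
    """Tracks started ssh masters in a Manager-backed, cross-process list."""

    def __init__(self, manager):
        # Manager list proxies are picklable, so worker processes spawned
        # later can append to the same list the parent reads.
        self.masters = manager.list()

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        # The real code would terminate each recorded ssh master process
        # here; this sketch just drains the bookkeeping list.
        while len(self.masters):
            self.masters.pop()
        return False

    def start(self, host):
        self.masters.append(host)
        return host

def run():
    with multiprocessing.Manager() as manager:
        with ProxyManager(manager) as proxy:
            proxy.start('example.org')
            return len(proxy.masters)
```

Scoping the managers with `with` is the point of the rewrite: whether sync returns early (`--network-only`) or raises, every ssh master started on its behalf is cleaned up before checkout begins.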
     # If we saw an error, exit with code 1 so that other scripts can check.
     if err_event.is_set():
       err_network_sync = True
@@ -914,6 +1034,13 @@ later is required to fix a server side protocol bug.
       print('\nerror: Local checkouts *not* updated.', file=sys.stderr)
       sys.exit(1)

+    err_update_linkfiles = not self.UpdateCopyLinkfileList()
+    if err_update_linkfiles:
+      err_event.set()
+      if opt.fail_fast:
+        print('\nerror: Local update copyfile or linkfile failed.', file=sys.stderr)
+        sys.exit(1)
+
     err_results = []
     # NB: We don't exit here because this is the last step.
     err_checkout = not self._Checkout(all_projects, opt, err_results)
@@ -932,6 +1059,8 @@ later is required to fix a server side protocol bug.
       print('error: Downloading network changes failed.', file=sys.stderr)
     if err_update_projects:
       print('error: Updating local project lists failed.', file=sys.stderr)
+    if err_update_linkfiles:
+      print('error: Updating copyfiles or linkfiles failed.', file=sys.stderr)
     if err_checkout:
       print('error: Checking out local projects failed.', file=sys.stderr)
     if err_results:


@ -13,10 +13,12 @@
# limitations under the License. # limitations under the License.
import copy import copy
import functools
import optparse
import re import re
import sys import sys
from command import InteractiveCommand from command import DEFAULT_LOCAL_JOBS, InteractiveCommand
from editor import Editor from editor import Editor
from error import UploadError from error import UploadError
from git_command import GitCommand from git_command import GitCommand
@@ -145,58 +147,66 @@ https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify
Gerrit Code Review: https://www.gerritcodereview.com/
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('-t',
dest='auto_topic', action='store_true',
help='send local branch name to Gerrit Code Review')
p.add_option('--hashtag', '--ht',
dest='hashtags', action='append', default=[],
help='add hashtags (comma delimited) to the review')
p.add_option('--hashtag-branch', '--htb',
action='store_true',
help='add local branch name as a hashtag')
p.add_option('-l', '--label',
dest='labels', action='append', default=[],
help='add a label when uploading')
p.add_option('--re', '--reviewers',
type='string', action='append', dest='reviewers',
help='request reviews from these people')
p.add_option('--cc',
type='string', action='append', dest='cc',
help='also send email to these email addresses')
p.add_option('--br', '--branch',
type='string', action='store', dest='branch',
help='(local) branch to upload')
p.add_option('-c', '--current-branch',
dest='current_branch', action='store_true',
help='upload current git branch')
p.add_option('--no-current-branch',
dest='current_branch', action='store_false',
help='upload all git branches')
# Turn this into a warning & remove this someday.
p.add_option('--cbr',
dest='current_branch', action='store_true',
help=optparse.SUPPRESS_HELP)
p.add_option('--ne', '--no-emails',
action='store_false', dest='notify', default=True,
help='do not send e-mails on upload')
p.add_option('-p', '--private',
action='store_true', dest='private', default=False,
help='upload as a private change (deprecated; use --wip)')
p.add_option('-w', '--wip',
action='store_true', dest='wip', default=False,
help='upload as a work-in-progress change')
p.add_option('-o', '--push-option',
type='string', action='append', dest='push_options',
default=[],
help='additional push options to transmit')
p.add_option('-D', '--destination', '--dest',
type='string', action='store', dest='dest_branch',
metavar='BRANCH',
help='submit for review on this target branch')
p.add_option('-n', '--dry-run',
dest='dryrun', default=False, action='store_true',
help='do everything except actually upload the CL')
p.add_option('-y', '--yes',
default=False, action='store_true',
help='answer yes to all safe prompts')
p.add_option('--no-cert-checks',
dest='validate_certs', action='store_false', default=True,
help='disable verifying ssl certs (unsafe)')
RepoHook.AddOptionGroup(p, 'pre-upload')
def _SingleBranch(self, opt, branch, people):
@@ -502,40 +512,46 @@ Gerrit Code Review: https://www.gerritcodereview.com/
merge_branch = p.stdout.strip()
return merge_branch
@staticmethod
def _GatherOne(opt, project):
"""Figure out the upload status for |project|."""
if opt.current_branch:
cbr = project.CurrentBranch
up_branch = project.GetUploadableBranch(cbr)
avail = [up_branch] if up_branch else None
else:
avail = project.GetUploadableBranches(opt.branch)
return (project, avail)
def Execute(self, opt, args):
projects = self.GetProjects(args)
def _ProcessResults(_pool, _out, results):
pending = []
for result in results:
project, avail = result
if avail is None:
print('repo: error: %s: Unable to upload branch "%s". '
'You might be able to fix the branch by running:\n'
' git branch --set-upstream-to m/%s' %
(project.relpath, project.CurrentBranch, self.manifest.branch),
file=sys.stderr)
elif avail:
pending.append(result)
return pending
pending = self.ExecuteInParallel(
opt.jobs,
functools.partial(self._GatherOne, opt),
projects,
callback=_ProcessResults)
if not pending:
if opt.branch is None:
print('repo: error: no branches ready for upload', file=sys.stderr)
else:
print('repo: error: no branches named "%s" ready for upload' %
(opt.branch,), file=sys.stderr)
return 1
pending_proj_names = [project.name for (project, available) in pending]
@@ -548,10 +564,8 @@ Gerrit Code Review: https://www.gerritcodereview.com/
worktree_list=pending_worktrees):
return 1
reviewers = _SplitEmails(opt.reviewers) if opt.reviewers else []
cc = _SplitEmails(opt.cc) if opt.cc else []
people = (reviewers, cc)
if len(pending) == 1 and len(pending[0][1]) == 1:

View File

@@ -18,6 +18,7 @@ import sys
from command import Command, MirrorSafeCommand
from git_command import git, RepoSourceVersion, user_agent
from git_refs import HEAD
from wrapper import Wrapper
class Version(Command, MirrorSafeCommand):
@@ -62,3 +63,4 @@ class Version(Command, MirrorSafeCommand):
print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
print('CPU %s (%s)' %
(uname.machine, uname.processor if uname.processor else 'unknown'))
print('Bug reports:', Wrapper().BUG_URL)

View File

@@ -26,33 +26,6 @@ import git_command
import wrapper
class SSHUnitTest(unittest.TestCase):
"""Tests the ssh functions."""
def test_ssh_version(self):
"""Check ssh_version() handling."""
ver = git_command._parse_ssh_version('Unknown\n')
self.assertEqual(ver, ())
ver = git_command._parse_ssh_version('OpenSSH_1.0\n')
self.assertEqual(ver, (1, 0))
ver = git_command._parse_ssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n')
self.assertEqual(ver, (6, 6, 1))
ver = git_command._parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n')
self.assertEqual(ver, (7, 6))
def test_ssh_sock(self):
"""Check ssh_sock() function."""
with mock.patch('tempfile.mkdtemp', return_value='/tmp/foo'):
# old ssh version uses port
with mock.patch('git_command.ssh_version', return_value=(6, 6)):
self.assertTrue(git_command.ssh_sock().endswith('%p'))
git_command._ssh_sock_path = None
# new ssh version uses hash
with mock.patch('git_command.ssh_version', return_value=(6, 7)):
self.assertTrue(git_command.ssh_sock().endswith('%C'))
git_command._ssh_sock_path = None
class GitCallUnitTest(unittest.TestCase):
"""Tests the _GitCall class (via git_command.git)."""

View File

@@ -141,12 +141,12 @@ class SuperprojectTestCase(unittest.TestCase):
manifest_xml = fp.read()
self.assertEqual(
manifest_xml,
'<?xml version="1.0" ?><manifest>'
'<remote name="default-remote" fetch="http://localhost"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="platform/art" path="art" revision="ABCDEF" '
'groups="notdefault,platform-' + self.platform + '"/>'
'<superproject name="superproject"/>'
'</manifest>')
def test_superproject_update_project_revision_id(self):
@@ -168,13 +168,57 @@ class SuperprojectTestCase(unittest.TestCase):
manifest_xml = fp.read()
self.assertEqual(
manifest_xml,
'<?xml version="1.0" ?><manifest>'
'<remote name="default-remote" fetch="http://localhost"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="platform/art" path="art" '
'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" '
'groups="notdefault,platform-' + self.platform + '"/>'
'<superproject name="superproject"/>'
'</manifest>')
def test_superproject_update_project_revision_id_with_different_remotes(self):
"""Test update of commit ids of a manifest with multiple remotes."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<remote name="goog" fetch="http://localhost2" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
<project path="vendor/x" name="platform/vendor/x" remote="goog" groups="vendor"
revision="master-with-vendor" clone-depth="1" />
<project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
" /></manifest>
""")
self.maxDiff = None
self._superproject = git_superproject.Superproject(manifest, self.repodir)
self.assertEqual(len(self._superproject._manifest.projects), 2)
projects = self._superproject._manifest.projects
data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
'160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00')
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, '_Fetch', return_value=True):
with mock.patch.object(self._superproject,
'_LsTree',
return_value=data):
# Create temporary directory so that it can write the file.
os.mkdir(self._superproject._superproject_path)
manifest_path = self._superproject.UpdateProjectsRevisionId(projects)
self.assertIsNotNone(manifest_path)
with open(manifest_path, 'r') as fp:
manifest_xml = fp.read()
self.assertEqual(
manifest_xml,
'<?xml version="1.0" ?><manifest>'
'<remote name="default-remote" fetch="http://localhost"/>'
'<remote name="goog" fetch="http://localhost2"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="platform/art" path="art" '
'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" '
'groups="notdefault,platform-' + self.platform + '"/>'
'<project name="platform/vendor/x" path="vendor/x" remote="goog" '
'revision="master-with-vendor" groups="vendor" clone-depth="1"/>'
'<superproject name="superproject"/>'
'</manifest>')

View File

@@ -52,6 +52,9 @@ INVALID_FS_PATHS = (
'blah/foo~',
# Block Unicode characters that get normalized out by filesystems.
u'foo\u200Cbar',
# Block newlines.
'f\n/bar',
'f\r/bar',
)
# Make sure platforms that use path separators (e.g. Windows) are also
@@ -91,6 +94,11 @@ class ManifestParseTestCase(unittest.TestCase):
fp.write(data)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
@staticmethod
def encodeXmlAttr(attr):
"""Encode |attr| using XML escape rules."""
return attr.replace('\r', '&#x000d;').replace('\n', '&#x000a;')
class ManifestValidateFilePaths(unittest.TestCase):
"""Check _ValidateFilePaths helper.
@@ -247,10 +255,10 @@ class XmlManifestTests(ManifestParseTestCase):
self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>'
'<remote name="test-remote" fetch="http://localhost"/>'
'<default remote="test-remote" revision="refs/heads/main"/>'
'<superproject name="superproject"/>'
'</manifest>')
@@ -303,6 +311,7 @@ class IncludeElementTests(ManifestParseTestCase):
def test_allow_bad_name_from_user(self):
"""Check handling of bad name attribute from the user's input."""
def parse(name):
name = self.encodeXmlAttr(name)
manifest = self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
@@ -327,6 +336,7 @@ class IncludeElementTests(ManifestParseTestCase):
def test_bad_name_checks(self):
"""Check handling of bad name attribute."""
def parse(name):
name = self.encodeXmlAttr(name)
# Setup target of the include.
with open(os.path.join(self.manifest_dir, 'target.xml'), 'w') as fp:
fp.write(f'<manifest><include name="{name}"/></manifest>')
@@ -399,15 +409,17 @@ class ProjectElementTests(ManifestParseTestCase):
project.SetRevisionId('ABCDEF')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>'
'<remote name="default-remote" fetch="http://localhost"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="test-name" revision="ABCDEF"/>'
'</manifest>')
def test_trailing_slash(self):
"""Check handling of trailing slashes in attributes."""
def parse(name, path):
name = self.encodeXmlAttr(name)
path = self.encodeXmlAttr(path)
return self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
@@ -437,6 +449,8 @@ class ProjectElementTests(ManifestParseTestCase):
def test_toplevel_path(self):
"""Check handling of path=. specially."""
def parse(name, path):
name = self.encodeXmlAttr(name)
path = self.encodeXmlAttr(path)
return self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
@@ -453,6 +467,8 @@ class ProjectElementTests(ManifestParseTestCase):
def test_bad_path_name_checks(self):
"""Check handling of bad path & name attributes."""
def parse(name, path):
name = self.encodeXmlAttr(name)
path = self.encodeXmlAttr(path)
manifest = self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
@@ -501,10 +517,10 @@ class SuperProjectElementTests(ManifestParseTestCase):
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>'
'<remote name="test-remote" fetch="http://localhost"/>'
'<default remote="test-remote" revision="refs/heads/main"/>'
'<superproject name="superproject"/>'
'</manifest>')
def test_remote(self):
@@ -522,11 +538,11 @@ class SuperProjectElementTests(ManifestParseTestCase):
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/platform/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>'
'<remote name="default-remote" fetch="http://localhost"/>'
'<remote name="superproject-remote" fetch="http://localhost"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<superproject name="platform/superproject" remote="superproject-remote"/>'
'</manifest>')
def test_defalut_remote(self):
@@ -542,8 +558,27 @@ class SuperProjectElementTests(ManifestParseTestCase):
self.assertEqual(manifest.superproject['remote'].name, 'default-remote')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>'
'<remote name="default-remote" fetch="http://localhost"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<superproject name="superproject"/>'
'</manifest>')
class ContactinfoElementTests(ManifestParseTestCase):
"""Tests for <contactinfo>."""
def test_contactinfo(self):
"""Check contactinfo settings."""
bugurl = 'http://localhost/contactinfo'
manifest = self.getXmlManifest(f"""
<manifest>
<contactinfo bugurl="{bugurl}"/>
</manifest>
""")
self.assertEqual(manifest.contactinfo.bugurl, bugurl)
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>'
f'<contactinfo bugurl="{bugurl}"/>'
'</manifest>')

74
tests/test_ssh.py Normal file
View File

@@ -0,0 +1,74 @@
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the ssh.py module."""
import multiprocessing
import subprocess
import unittest
from unittest import mock
import ssh
class SshTests(unittest.TestCase):
"""Tests the ssh functions."""
def test_parse_ssh_version(self):
"""Check _parse_ssh_version() handling."""
ver = ssh._parse_ssh_version('Unknown\n')
self.assertEqual(ver, ())
ver = ssh._parse_ssh_version('OpenSSH_1.0\n')
self.assertEqual(ver, (1, 0))
ver = ssh._parse_ssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n')
self.assertEqual(ver, (6, 6, 1))
ver = ssh._parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n')
self.assertEqual(ver, (7, 6))
def test_version(self):
"""Check version() handling."""
with mock.patch('ssh._run_ssh_version', return_value='OpenSSH_1.2\n'):
self.assertEqual(ssh.version(), (1, 2))
def test_context_manager_empty(self):
"""Verify context manager with no clients works correctly."""
with multiprocessing.Manager() as manager:
with ssh.ProxyManager(manager):
pass
def test_context_manager_child_cleanup(self):
"""Verify orphaned clients & masters get cleaned up."""
with multiprocessing.Manager() as manager:
with ssh.ProxyManager(manager) as ssh_proxy:
client = subprocess.Popen(['sleep', '964853320'])
ssh_proxy.add_client(client)
master = subprocess.Popen(['sleep', '964853321'])
ssh_proxy.add_master(master)
# If the process still exists, these will throw timeout errors.
client.wait(0)
master.wait(0)
def test_ssh_sock(self):
"""Check sock() function."""
manager = multiprocessing.Manager()
proxy = ssh.ProxyManager(manager)
with mock.patch('tempfile.mkdtemp', return_value='/tmp/foo'):
# old ssh version uses port
with mock.patch('ssh.version', return_value=(6, 6)):
self.assertTrue(proxy.sock().endswith('%p'))
proxy._sock_path = None
# new ssh version uses hash
with mock.patch('ssh.version', return_value=(6, 7)):
self.assertTrue(proxy.sock().endswith('%C'))

View File

@@ -14,6 +14,7 @@
"""Unittests for the subcmds module (mostly __init__.py than subcommands)."""
import optparse
import unittest
import subcmds
@@ -41,3 +42,32 @@ class AllCommands(unittest.TestCase):
# Reject internal python paths like "__init__".
self.assertFalse(cmd.startswith('__'))
def test_help_desc_style(self):
"""Force some consistency in option descriptions.
Python's optparse & argparse have a few default options like --help. Their
option description text uses lowercase sentence fragments, so enforce that our
options follow the same style so the UI is consistent.
We enforce:
* Text starts with lowercase.
* Text doesn't end with period.
"""
for name, cls in subcmds.all_commands.items():
cmd = cls()
parser = cmd.OptionParser
for option in parser.option_list:
if option.help == optparse.SUPPRESS_HELP:
continue
c = option.help[0]
self.assertEqual(
c.lower(), c,
msg=f'subcmds/{name}.py: {option.get_opt_string()}: help text '
f'should start with lowercase: "{option.help}"')
self.assertNotEqual(
option.help[-1], '.',
msg=f'subcmds/{name}.py: {option.get_opt_string()}: help text '
f'should not end in a period: "{option.help}"')