Compare commits

...

89 Commits

a9f11b3cb2 Support resolving relative fetch URLs on persistent-https://
Some versions of Python will only attempt to resolve a relative
URL if they understand the URL scheme. Convert persistent-http://
and persistent-https:// schemes to the more typical http:// and
https:// versions for the resolve call.

Change-Id: I99072d5a69be8cfaa429a3ab177ba644d928ffba
2013-01-02 15:41:57 -08:00
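For illustration, a minimal sketch of that conversion before resolving (the helper name and structure are assumptions, not repo's code):

  try:
    import urlparse                     # Python 2
  except ImportError:
    import urllib.parse as urlparse     # Python 3

  SCHEME_MAP = {'persistent-http': 'http', 'persistent-https': 'https'}

  def resolve_relative(base, relative):
    # Swap the persistent-* scheme for one urljoin understands, then resolve.
    parts = urlparse.urlsplit(base)
    scheme = SCHEME_MAP.get(parts.scheme, parts.scheme)
    resolvable = urlparse.urlunsplit((scheme,) + tuple(parts[1:]))
    return urlparse.urljoin(resolvable, relative)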
7bdbde7af8 Allow sync to run even when the manifest is broken.
If the current manifest is broken then "repo sync" fails because it
can't retrieve the default value for --jobs.  Use 1 in this case, so
that "repo sync" can still be run to get a fixed manifest (assuming
someone has fixed it upstream).

Change-Id: I4262abb59311f1e851ca2a663438a7e9f796b9f6
2012-12-05 11:01:36 +00:00
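A hedged sketch of the fallback (the attribute names on the manifest object are assumptions):

  def default_sync_jobs(manifest):
    # If the manifest cannot be parsed, --jobs has no default to read;
    # fall back to a single job so sync can still run.
    try:
      return manifest.default.sync_j or 1
    except Exception:
      return 1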
b2bd91c99b Represent git-submodule as nested projects, take 2
(Previous submission of this change broke Android buildbot due to
 incorrect regular expression for parsing git-config output.  During
 investigation, we also found that Android, which pulls Chromium, has a
 workaround for Chromium's submodules; its manifest includes Chromium's
 submodules.  This new change, in addition to fixing the regex, also
takes this type of workaround into consideration; it adds a new
 attribute that makes repo not fetch submodules unless submodules have a
 project element defined in the manifest, or this attribute is
 overridden by a parent project element or by the default element.)

We need a representation of git-submodule in repo; otherwise repo will
not sync submodules and will leave the workspace in a broken state.  Of course
this will not be a problem if all projects are owned by the owner of the
manifest file, who may simply choose not to use git-submodule in all
projects.  However, this is not possible in practice because the
manifest file owner is unlikely to own all upstream projects.

As git submodules are simply git repositories, it is natural to treat
them as plain repo projects that live inside a repo project.  That is,
we could use recursively declared projects to denote the is-submodule
relation of git repositories.

The behavior of repo remains the same to projects that do not have a
sub-project within.  As for parent projects, repo fetches them and their
sub-projects as normal projects, and then checks out subprojects at the
commit specified in the parent's commit object.  The sub-project is
fetched at a path relative to the parent project's working directory,
so the path specified in the manifest file should match the path in
the .gitmodules file.

If a submodule is not registered in repo manifest, repo will derive its
properties from itself and its parent project, which might not always be
correct.  In such cases, the subproject is called a derived subproject.

To a user, a sub-project is merely a git-submodule, so all the usual
tips for working with git submodules apply here, too.  For example, you
should not run `repo sync` in a parent repository if its submodule is
dirty.

Change-Id: I4b8344c1b9ccad2f58ad304573133e5d52e1faef
2012-11-19 10:45:21 -08:00
3f5ea0b182 Allow init command to set options from environment variables
The manifest URL and mirror location can be specified in environment
variables, which will be used if the options are not passed on the
command line.

Change-Id: Ida87968b4a91189822c3738f835e2631e10b847e
2012-11-17 12:40:42 +09:00
b148ac9d9a Allow command options to be set from environment variables
Extend the Command base class to allow options to be set from values
in environment variables, if the user has not given the option on the
command line and the environment variable is set.

Derived classes of Command can override the implementation of the method
_RegisteredEnvironmentOptions to configure which of its options may be set from
environment variables.

Change-Id: I7c780bcf9644d6567893d9930984c054bce7351e
2012-11-17 12:40:42 +09:00
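The mechanism mirrors the command.py diff shown further below; reduced to its core, the lookup looks roughly like this (the environment variable names are purely illustrative):

  import os

  def read_environment_options(opts, registered):
    # `registered` maps env var name -> option attribute name,
    # e.g. {'REPO_MY_OPTION': 'my_option'}.
    for env_key, opt_key in registered.items():
      if getattr(opts, opt_key) is not None:
        continue                        # already set on the command line
      env_value = os.environ.get(env_key)
      if env_value is not None:
        setattr(opts, opt_key, env_value)
    return opts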
a67df63ef1 Merge "Raise a NoManifestException when the manifest DNE" 2012-11-16 10:39:24 -08:00
f91074881f Better error message if 'remove-project' refers to non-existent project
If a local manifest includes a 'remove-project' element that refers to
a project that does not exist in the manifest, the error message is a
bit cryptic.

Change the error message to make it clearer what is wrong.

Change-Id: I0b1043aaec87893c3128211d3a9ab2db6d600755
2012-11-16 19:12:55 +09:00
75ee0570da Raise a NoManifestException when the manifest DNE
When a command (e.g., `repo forall`) expects the manifest project to
exist, but there is no manifest, an IOException gets raised.  This
change defines a new Exception type to be raised in these cases and
raises it when project.py fails to read the manifest.

Change-Id: Iac576c293a37f7d8f60cd4f6aa95b2c97f9e7957
2012-11-15 18:50:11 -08:00
88b86728a4 Add option to abort on error in forall
Add a new option (-e, --abort-on-errors) which will cause forall to
abort without iterating through remaining projects if a command
exits unsuccessfully.

Bug: Issue 17
Change-Id: Ibea405e0d98b575ad3bda719d511f6982511c19c
Signed-off-by: Victor Boivie <victor.boivie@sonyericsson.com>
2012-11-16 04:22:10 +09:00
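Conceptually the option short-circuits the per-project loop; a rough sketch, with function and option names that are illustrative rather than forall's actual implementation:

  import subprocess

  def run_in_projects(project_dirs, cmd, abort_on_errors=False):
    rc = 0
    for path in project_dirs:
      result = subprocess.call(cmd, cwd=path, shell=True)
      if result != 0:
        rc = result
        if abort_on_errors:             # -e / --abort-on-errors
          break
    return rc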
e66291f6d0 Merge "Simplify error handling in subcommand execution" 2012-11-14 16:22:41 -08:00
7ba25bedf9 Simplify error handling in subcommand execution
Instead of using a nested try (which repo is plagued with), use a single
try when executing the appropriate subcommand.

Change-Id: I32dbfc010c740c0cc42ef8fb6a83dfe87f87e54a
2012-11-14 14:18:06 -08:00
3794a78b80 Sync help text in repo from init.py
Change Ia6032865f9296b29524c2c25b72bd8e175b30489 improved the
help text for the init command, but the same improvement was not made
in repo.

Change-Id: Idc34e479b5237137b90e8b040824776e4f7883b0
2012-11-15 06:21:24 +09:00
33949c34d2 Add repo info command
The info command will print information regarding the current manifest
and local git branch. It will also show the difference of commits
between the local branch and the remote branch.

It also incorporates an overview command into info which shows commits
over all branches.

Change-Id: Iafedd978f44c84d240c010897eff58bbfbd7de71
2012-11-15 03:29:01 +09:00
8f62fb7bd3 Tidy up code formatting a bit more
Enable the following Pylint warnings:

  C0322: Operator not preceded by a space
  C0323: Operator not followed by a space
  C0324: Comma not followed by a space

And make the necessary fixes.

Change-Id: I74d74283ad5138cbaf28d492b18614eb355ff9fe
2012-11-14 12:09:38 +09:00
98ffba1401 Fix: "Statement seems to have no effect"
Pylint raises an error on the call:

  print

Change it to:

 print()

Change-Id: I507e1b3dd928fa6c32ea7e86260fb3d7b1428e6f
2012-11-14 11:38:57 +09:00
c1b86a2323 Fix inconsistent indentation
The repo coding style is to indent at 2 characters, but there are
many places where this is not followed.

Enable pylint warning "W0311: Bad indentation" and make sure all
indentation is at multiples of 2 characters.

Change-Id: I68f0f64470789ce2429ab11104d15d380a63e6a8
2012-11-14 11:38:57 +09:00
cecd1d864f Change print statements to work in python3
This is part of a series of changes to introduce Python3 support.

Change-Id: I373be5de7141aa127d7debdbce1df39148dbec32
2012-11-13 17:33:56 -08:00
fc241240d8 Convert prompt answers to lower case before checking
When prompting for yes/no answers, convert the answer to lower
case before comparing.  This makes it easier to catch answers
like "Yes", "yes", and "YES" with a comparison only for "yes".

Change-Id: I06da8281cec81a7438ebb46ddaf3344d12abe1eb
2012-11-14 09:19:39 +09:00
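A tiny sketch of the normalization (the helper name is illustrative):

  def answered_yes(answer):
    # "Yes", "YES" and "y" all count as confirmation.
    return answer.strip().lower() in ('y', 'yes')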
9f3406ea46 Minor documentation formatting and grammatical fixes
Change-Id: Iaac6377c787b3bb42242780e9d1116e718e0188d
2012-11-14 08:54:43 +09:00
b1525bffae Fix documentation reference to local_manifest.xml
Documentation of the remove-project element still refers explicitly
to local_manifest.xml.

Change it to the more generic "a local manifest".

Change-Id: I6278beab99a582fae26a4e053adc110362c714c2
2012-11-14 08:54:04 +09:00
685f080d62 More code style cleanup
Clean up a few more unnecessary usages of lambda in `repo` that were missed
in the previous sweep that only considered files ending in .py.

Remove a duplicate import.

Change-Id: I03cf467a5630cbe4eee6649520c52e94a7db76be
2012-11-14 08:34:39 +09:00
8898e2f26d Remove magic hack
It should be assumed that on modern development environments, python
is accessible via /usr/bin/env.

Change the shebang as necessary and remove the magic hack.

This also means losing the -E option on the call to python, so that
PYTHONPATH and PYTHONHOME will be respected and local configuration
problems in those variables will be noticed.

Change-Id: I6f0708ca7693f05a4c3621c338f03619563ba630
2012-11-14 08:17:11 +09:00
52f1e5d911 Make load order of local manifests deterministic
Local manifest files stored in the local_manifests folder are loaded
in alphabetical order, so it's easier to know in which order project
removals/additions/modifications will be applied.

If local_manifests.xml exists, it will be loaded before the files in
local_manifests.

Change-Id: Ia5c0349608f1823b4662cd6b340b99915bd973d5
2012-11-14 05:05:32 +09:00
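A sketch of the resulting load order, assuming the documented paths (see the manifest-format changes below); the helper name is illustrative:

  import glob
  import os

  def local_manifest_files(repodir):
    names = []
    legacy = os.path.join(repodir, 'local_manifest.xml')
    if os.path.exists(legacy):
      names.append(legacy)              # legacy file is loaded first
    names.extend(sorted(glob.glob(
        os.path.join(repodir, 'local_manifests', '*.xml'))))
    return names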
8e3d355d44 Merge "Print an error message when aborted by user" 2012-11-12 17:35:47 -08:00
4a4776e9ab Merge "Handle manifest parse errors in main" 2012-11-12 17:35:40 -08:00
2fa715f8b5 Merge "Better handling of duplicate remotes" 2012-11-12 17:35:30 -08:00
6287543e35 Merge "Change usages of xrange() to range()" 2012-11-12 17:30:55 -08:00
b0936b0e20 Print an error message when aborted by user
Change-Id: If7378c5deaace0ac6ab2be961e38644d9373557d
2012-11-13 09:56:16 +09:00
0b8df7be79 Handle manifest parse errors in main
Add handling of manifest parse errors in the main method, and
print an error.  This will prevent python tracebacks being
dumped in many cases.

Change-Id: I75e73539afa34049f73c993dbfda203f1ad33b45
2012-11-13 09:54:47 +09:00
717ece9d81 Better handling of duplicate remotes
In the current implementation, an error is raised if a remote with the
same name is defined more than once.  The check is only that the remote
has the same name as an existing remote.

With the support for multiple local manifests, it is more likely than
before that the same remote is defined in more than one manifest.

Change the check so that it only raises an error if a remote is defined
more than once with the same name, but different attributes.

Change-Id: Ic3608646cf9f40aa2bea7015d3ecd099c5f5f835
2012-11-13 09:35:37 +09:00
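The relaxed check can be sketched like this (representing a remote as a plain dictionary is an assumption made for illustration):

  def add_remote(remotes, name, fetch, review=None):
    new = {'fetch': fetch, 'review': review}
    existing = remotes.get(name)
    if existing is not None and existing != new:
      raise ValueError('remote %s already defined with different attributes'
                       % name)
    remotes[name] = new                 # identical redefinitions are ignored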
5566ae5dde Print deprecation warning when local_manifest.xml is used
The preferred way to specify local manifests is to drop the file(s)
in the local_manifests folder.  Print a deprecation warning when
the legacy local_manifest.xml file is used.

Change-Id: Ice85bd06fb612d6fcceeaa0755efd130556c4464
2012-11-13 08:19:51 +09:00
2d5a0df798 Add support for multiple local manifests
Add support for multiple local manifests stored in the local_manifests
folder under the .repo home directory.

Local manifests will be processed in addition to local_manifest.xml.

Change-Id: Ia0569cea7e9ae0fe3208a8ffef5d9679e14db03b
2012-11-13 08:19:51 +09:00
f7fc8a95be Handle XML errors when parsing the manifest
Catch ExpatError and exit gracefully with an error message, rather
than exiting with a python traceback.

Change-Id: Ifd0a7762aab4e8de63dab8a66117170a05586866
2012-11-13 05:53:41 +09:00
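For illustration, the shape of the handling, assuming the manifest is parsed with xml.dom.minidom:

  import sys
  import xml.dom.minidom
  from xml.parsers.expat import ExpatError

  def parse_manifest(path):
    try:
      return xml.dom.minidom.parse(path)
    except ExpatError as e:
      # Report a readable error instead of dumping a traceback.
      print('error: %s is not a valid XML file: %s' % (path, e))
      sys.exit(1)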
1ad7b555df Merge "Always show --manifest-server-* options" 2012-11-07 12:39:25 -08:00
7e6dd2dff0 Fix pylint warning W0108: Lambda may not be necessary
Remove unnecessary usage of lambda.

Change-Id: I06d41933057d60d15d307ee800cca052a44754c6
2012-11-07 08:39:57 +09:00
8d070cfb25 Always show --manifest-server-* options
The --manifest-server-* flags broke the smartsync subcmd since
the corresponding variables weren't getting set.  This change
ensures that they will always be set, regardless of whether we are
using sync -s or smartsync.

Change-Id: I1b642038787f2114fa812ecbc15c64e431bbb829
2012-11-06 13:14:31 -08:00
a6053d54f1 Change usages of xrange() to range()
In Python 3, range() returns a lazy sequence object rather than a list.

None of the parameters in the ranges changed looked large enough
to create an impact in memory in Python2.  Note: the only use of
range() was for iteration and did not need to be changed.

This is part of a series of changes to introduce Python3 support.

Change-Id: I50b665f9296ea160a5076c71f36a65f76e47029f
2012-11-01 13:36:50 -07:00
e072a92a9b Merge "Use python3 urllib when urllib2 not available" 2012-11-01 10:13:34 -07:00
7601ee2608 Merge "Use 'stat' package instead of literals for mkdir()" 2012-11-01 10:01:18 -07:00
1f7627fd3c Use python3 urllib when urllib2 not available
This is part of a series of changes to introduce Python3 support.

Change-Id: I605b145791053c1f2d7bf3c907c5a68649b21d12
2012-10-31 14:26:48 -07:00
b42b4746af project: Require git >= 1.7.2 for setting config on command line
This option causes the git call to fail, which probably indicates a
programming error; callers should check the git version and change the
call appropriately if -c is not available. Failing loudly is preferable
to failing silently in the general case.

For an example of correctly checking at the call site, see I8fd313dd.
If callers prefer to fail silently, they may set GIT_CONFIG_PARAMETERS
in the environment rather than using the config kwarg to pass
configuration.

Change-Id: I0de18153d44d3225cd3031e6ead54461430ed334
2012-10-31 12:27:27 -07:00
e21526754b sync: Only parallelize gc for git >= 1.7.2
This minimum version is required for the -c argument to set config on
the command line. Without this option, git by default uses as many
threads per invocation as there are CPUs, so we cannot safely
parallelize without hosing a system.

Change-Id: I8fd313dd84917658162b5134b2d9aa34a96f2772
2012-10-31 12:27:17 -07:00
60798a32f6 Use 'stat' package instead of literals for mkdir()
This is part of a series of changes to introduce Python3 support.

Change-Id: Ic988ad181d32357d82dfa554e70d8525118334c0
2012-10-31 09:11:16 -07:00
1d947b3034 Even more coding style cleanup
Fixing some more pylint warnings:

W1401: Anomalous backslash in string
W0623: Redefining name 'name' from outer scope
W0702: No exception type(s) specified
E0102: name: function already defined line n

Change-Id: I5afcdb4771ce210390a79981937806e30900a93c
2012-10-30 10:28:20 +09:00
2d113f3546 Merge "Update minimum git version to 1.7.2" 2012-10-26 16:10:21 -07:00
de7eae4826 Merge "Revert "Represent git-submodule as nested projects"" 2012-10-26 12:30:38 -07:00
2fe99e8820 Merge "repo selfupdate: Fix _PostRepoUpgrade takes 2 arguments" 2012-10-26 12:27:36 -07:00
cd81dd6403 Revert "Represent git-submodule as nested projects"
This reverts commit 69998b0c6f.
Broke Android's non-gitmodule use case.

Conflicts:
	project.py
	subcmds/sync.py

Change-Id: I68ceeb63d8ee3b939f85a64736bdc81dfa352aed
2012-10-26 12:24:57 -07:00
80d2ceb222 repo selfupdate: Fix _PostRepoUpgrade takes 2 arguments
Change-Id: I1cf9e0674ea366ddce96c949e0bc085e3452b25a
2012-10-26 12:24:57 -07:00
c5aa4d3528 Update minimum git version to 1.7.2
We now use the -c flag which was introduced in git 1.7.2.

Change-Id: I9195c0f6ac9fa63e783a03628049fe2c67d258ff
2012-10-26 11:34:11 -07:00
bed45f9400 Merge "Show user about not initializing repo in current directory" 2012-10-26 09:52:16 -07:00
55e4d464a7 Add a PGP key for cco3@android.com
This change adds a PGP key to allow cco3@android.com to sign releases.

Change-Id: I18a70c8b7d8f272dd1aad9d6b2e4a237ef35af33
2012-10-26 07:03:59 -07:00
75cc353380 Show user about not initializing repo in current directory
If a parent of the current directory already contains an initialized
repo, 'repo init' initializes there rather than in the current
directory.  For example, if the current directory is
'/home/users/harry/platform/ics' and there is an initialized repo in
harry's home directory at '/home/users/harry/.repo', then running
'repo init' always initializes into '/home/users/harry/.repo'.  Most of
the time, however, the user intends to initialize repo in the current
directory; this patch tells the user how to do that.

Change-Id: Id7a76fb18ec0af243432c29605140d60f3de85ca
2012-10-26 15:40:17 +08:00
c9129d90de Update PGP keys during _PostRepoUpgrade in sync
Previously, if a key was added, a client wouldn't add the key during
the sync step.  This would cause issues if a new key were added and a
subsequent release were signed by that key.

Change-Id: I4fac317573cd9d0e8da62aa42e00faf08bfeb26c
2012-10-25 17:48:35 -07:00
57365c98cc Merge "sync: Run gc --auto in parallel" 2012-10-25 17:38:05 -07:00
dc96476af3 Merge "project: Support config args in git command callables" 2012-10-25 17:36:04 -07:00
2577cec095 Merge "sync: Keep a moving average of last fetch times" 2012-10-25 17:35:15 -07:00
e48d34659e Merge "sync: Order projects according to last fetch time" 2012-10-25 17:33:36 -07:00
ab8f911a67 Fix pylint warnings introduced by the submodule patch
"69998b0 Represent git-submodule as nested projects" has introduced a
few pylint warnings.

W0612:1439,8:Project._GetSubmodules.get_submodules: Unused variable 'sub_gitdir'
W0613:1424,36:Project._GetSubmodules.get_submodules: Unused argument 'path'
W0612:1450,25:Project._GetSubmodules.parse_gitmodules: Unused variable 'e'
W0622:516,8:Sync.Execute: Redefining built-in 'all'

Change-Id: I84378e2832ed1b5ab023e394d53b22dcea799ba4
2012-10-25 13:55:49 -07:00
608aff7f62 Merge "Use modern Python exception syntax" 2012-10-25 10:03:37 -07:00
13657c407d Merge "Add regex matching to repo list command" 2012-10-25 10:00:42 -07:00
e4ed8f65f3 Merge "Add pylint configuration and instructions" 2012-10-25 09:51:07 -07:00
fdb44479f8 Merge "Change PyDev project version to "python 2.6"" 2012-10-25 09:46:38 -07:00
188572170e sync: Run gc --auto in parallel
We can't just let this run wild with a high (or even low) -j, since
that would hose a system. Instead, limit the total number of threads
across all git gc subprocesses to the number of CPUs reported by the
multiprocessing module (available in Python 2.6 and above).

Change-Id: Icca0161a1e6116ffa5f7cfc6f5faecda510a7fb9
2012-10-25 08:12:48 -07:00
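A sketch of the thread budgeting; pack.threads is standard git configuration, but the exact division policy here is an assumption:

  import multiprocessing

  def gc_command(jobs):
    cpu_count = multiprocessing.cpu_count()
    # Split the CPU budget across the `jobs` concurrent gc subprocesses.
    threads = max(1, cpu_count // max(1, jobs))
    return ['git', '-c', 'pack.threads=%d' % threads, 'gc', '--auto']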
d75c669fac Add regex matching to repo list command
The repo list -r command will execute a regex search for every
argument provided on both the project name and the project
worktree path.

Useful for finding rarely used gits.

Change-Id: Iaff90dd36c240b3d5d74817d11469be22d77ae03
2012-10-25 15:49:13 +09:00
091f893625 project: Support config args in git command callables
Change-Id: I9d4d0d2b1aeebe41a6b24a339a154d258af665eb
2012-10-24 14:52:08 -07:00
d947858325 sync: Keep a moving average of last fetch times
Try to more accurately estimate which projects take the longest to
sync by keeping an exponentially weighted moving average (a=0.5) of
fetch times, rather than just recording the last observation. This
should discount individual outliers (e.g. an unusually large project
update) and hopefully allow truly slow repos to bubble to the top.

Change-Id: I72b2508cb1266e8a19cf15b616d8a7fc08098cb3
2012-10-24 14:52:07 -07:00
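The update rule, written out for a = 0.5:

  def update_fetch_time(previous, observed, a=0.5):
    # Exponentially weighted moving average of per-project fetch times.
    if previous is None:
      return observed
    return a * observed + (1 - a) * previous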
67700e9b90 sync: Order projects according to last fetch time
Some projects may consistently take longer to fetch than others, for
example a more active project may have many more Gerrit changes than a
less active project, which take longer to transfer. Use a simple
heuristic based on the last fetch time to fetch slower projects first,
so we do not tend to spend the end of the sync fetching a small number
of outliers.

This algorithm is probably not optimal, and due to inter-run latency
variance and Python thread scheduling, we may not even have good
estimates of a project sync time.

Change-Id: I9a463f214b3ed742e4d807c42925b62cb8b1745b
2012-10-24 14:51:58 -07:00
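A sketch of the scheduling heuristic (treating projects with no recorded time as fast is a simplification):

  def order_for_fetch(project_names, fetch_times):
    # Fetch the historically slowest projects first.
    return sorted(project_names,
                  key=lambda name: fetch_times.get(name, 0),
                  reverse=True)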
a5be53f9c8 Use modern Python exception syntax
"except Exception as e" instead of "except Exception, e"

This is part of a transition to supporting Python 3.  Python >= 2.6
support "as" syntax.

Note: this removes Python 2.5 support.

Change-Id: I309599f3981bba2b46111c43102bee38ff132803
2012-10-23 21:35:59 -07:00
9ed12c5d9c Change PyDev project version to "python 2.6"
Repo is dropping support for Python < 2.6 soon, so this updates the
PyDev configuration appropriately.

Change-Id: If327951e3a9fd9ff7513b931bfcfe6172dc8e4c5
2012-10-23 21:35:46 -07:00
4f7bdea9d2 Add pylint configuration and instructions
A pylint configuration file (.pylintrc) is added, and the submission
instructions are updated to include pylint usage steps.

Deprecated pylint suppression (`disable-msg`) is updated in a few
modules to make it work properly with the latest version (0.26).

Change-Id: I4ec2ef318e23557a374ecdbf40fe12645766830c
2012-10-24 10:18:13 +09:00
69998b0c6f Represent git-submodule as nested projects
We need a representation of git-submodule in repo; otherwise repo will
not sync submodules and will leave the workspace in a broken state.  Of course
this will not be a problem if all projects are owned by the owner of the
manifest file, who may simply choose not to use git-submodule in all
projects.  However, this is not possible in practice because the
manifest file owner is unlikely to own all upstream projects.

As git submodules are simply git repositories, it is natural to treat
them as plain repo projects that live inside a repo project.  That is,
we could use recursively declared projects to denote the is-submodule
relation of git repositories.

The behavior of repo remains the same to projects that do not have a
sub-project within.  As for parent projects, repo fetches them and their
sub-projects as normal projects, and then checks out subprojects at the
commit specified in the parent's commit object.  The sub-project is
fetched at a path relative to the parent project's working directory,
so the path specified in the manifest file should match the path in
the .gitmodules file.

If a submodule is not registered in repo manifest, repo will derive its
properties from itself and its parent project, which might not always be
correct.  In such cases, the subproject is called a derived subproject.

To a user, a sub-project is merely a git-submodule, so all the usual
tips for working with git submodules apply here, too.  For example, you
should not run `repo sync` in a parent repository if its submodule is
dirty.

Change-Id: I541e9e2ac1a70304272dbe09724572aa1004eb5c
2012-10-23 16:08:58 -07:00
5c6eeac8f0 More coding style cleanup
Fixing more issues found with pylint.  Some that were supposed to
have been fixed in the previous sweep (Ie0db839e) but were missed:

C0321: More than one statement on a single line
W0622: Redefining built-in 'name'

And some more:

W0631: Using possibly undefined loop variable 'name'
W0223: Method 'name' is abstract in class 'name' but is not overridden
W0231: __init__ method from base class 'name' is not called

Change-Id: Ie119183708609d6279e973057a385fde864230c3
2012-10-22 12:30:14 +09:00
e98607248e Support HTTP authentication using user input as fallback
If repo cannot find authentication credentials in ~/.netrc, this
patch tries to get the user name and password from the user's console
input.  This can be a good choice if the user doesn't want to save a
plaintext password in ~/.netrc, or doesn't know about netrc usage.

The user will be prompted only if authentication information does not
exist in the password manager.  Since main.py first loads auth
information from ~/.netrc, this will be executed only as a fallback
mechanism.

Example:
$ repo upload .
Upload project xxx/ to remote branch master:
 branch yyy ( 1 commit, ...):
 to https://review.zzz.com/gerrit/ (y/N)? y

(repo may try to access https://review.zzz.com/gerrit/ssh_info and
get a 401 HTTP Basic Authentication response from the server.  If there
is no authentication info in ~/.netrc, this patch will ask for a
username and password.)

Authorization Required (Message from Web Server)
User: pororo
Password:
....
[OK ] xxx/

Change-Id: Ia348a4609ac40060d9093c7dc8d7c2560020455a
2012-10-12 06:02:35 +09:00
2f6ab7f5b8 Rename "dir" variables
The variable name "dir" conflicts with the name of a Python built-in
function: http://docs.python.org/library/functions.html#dir

Change-Id: I850f3ec8df7563dc85e21f2876fe5e6550ca2d8f
2012-10-10 08:30:15 +02:00
3a6cd4200e Merge "Coding style cleanup" 2012-10-09 14:29:46 -07:00
25f17682ca Merge "Expand ~ to user's home directory for --reference" 2012-10-09 13:46:10 -07:00
8a68ff9605 Coding style cleanup
Fix the following issues reported by pylint:

C0321: More than one statement on a single line
W0622: Redefining built-in 'name'
W0612: Unused variable 'name'
W0613: Unused argument 'name'
W0102: Dangerous default value 'value' as argument
W0105: String statement has no effect

Also fixed a few cases of inconsistent indentation.

Change-Id: Ie0db839e7c57d576cff12d8c055fe87030d00744
2012-10-09 12:45:30 +02:00
297e7c6ee6 Expand ~ to user's home directory for --reference
This allows a user to run 'repo init' as:
  $ repo init -u ... --reference=~/mirror

Change-Id: Ib85b7c8ffca9d732132c68fe9a8d7f0ab1fa9288
2012-10-08 15:03:20 +02:00
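The expansion presumably amounts to:

  import os

  reference = os.path.expanduser('~/mirror')   # e.g. /home/<user>/mirror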
e3b1c45aeb Remove unreachable code
Change 9bb1816b removed part of a block of code, but left the
remaining part unreachable.  Remove it.

Change-Id: Icdc6061d00e6027df32dee9a3bad3999fe7cdcbc
2012-10-05 10:34:19 +02:00
7119f94aba Update commit-msg hook to version from Gerrit v2.5-rc0
Change-Id: I0d11ac0c24cd53386e996b7dd9bd37c89c789f60
2012-10-04 10:31:09 +02:00
01f443d75a Correct call to sys.exit()
It should be `sys.exit()` not `os.exit()`.

Change-Id: Iaeeef456ddf2d17f5df2b712e50e3630bed856c3
2012-10-04 10:31:09 +02:00
b926116a14 Remove ImportError class
The definition of `ImportError` redefines the Python built-in
class of the same name.

It is not used anywhere, so remove it.

Change-Id: I557ce28c93a3306fff72873dc6f477330fc33128
2012-10-04 10:31:09 +02:00
3ff9decfd4 Merge "manifest: record the original revision when in -r mode." 2012-10-03 16:49:12 -07:00
14a6674e32 manifest: record the original revision when in -r mode.
Currently, when doing a sync against a revision-locked manifest,
sync has no option but to fall back to syncing the entire refs space;
it doesn't know which ref to ask for that contains the sha1 it wants.

This is painful if we're in -c mode; thus when we generate a
revision-locked manifest, record the originating branch, and try
syncing that branch first.  If the sha1 is found within that branch,
this saves us having to pull down the rest of the repo, a potentially
heavy saving.

If that branch doesn't have the desired sha1, we fall back to syncing
everything.

Change-Id: I99a5e44fa1d792dfcada76956a2363187df94cf1
2012-09-28 22:31:27 -07:00
9779565abf Fix incorrect default_groups when parsing projects from XML manifest
Change Details:
* Switch first default group to 'all' instead of 'default'

Change Benefits:
* More consistent with default_groups in the counterpart Save() function
* Fixes bug where command 'repo manifest' added an extra 'default'
  group to every output project element groups attribute. This bug was
  particularly confusing for projects which had 'groups="notdefault"'
  as they were output as 'groups="notdefault,default"' by 'repo manifest'

Change-Id: I5611c027a982d3394899466248b971910bec8c6b
2012-09-26 01:58:48 -04:00
cf76b1bcec sync: Support manual authentication to the manifest server
Add two new command line options, -u/--manifest-server-username and
-p/--manifest-server-password, which can be used to specify a username
and password to authenticate to the manifest server when using the
-s/--smart-sync or -t/--smart-tag option.

If -u and -p are not specified when using the -s or -t option, use
authentication credentials from the .netrc file (if there are any).

Authentication credentials from -u/-p or .netrc are not used if the
manifest server specified in the manifest file already includes
credentials.

Change-Id: I6cf9540d28f6cef64c5694e8928cfe367a71d28d
2012-09-21 11:20:59 -07:00
e00aa6b923 Clean up imports
manifest_xml: import `HEAD` and `R_HEADS` from correct module
version: import `HEAD` from correct module

`HEAD` and `R_HEADS` should be imported from the git_refs module,
where they are originally defined, rather than from the project
module.

repo: remove unused import of readline

cherry_pick: import standard modules on separate lines
smartsync: import subcmd modules explicitly from subcmd

Use:
  `import re
  import sys`
and
  `from subcmds.sync import Sync`

Instead of:
  `import sys, re`
and
  `from sync import Sync`

Change-Id: Ie10dd6832710939634c4f5c86b9ba5a9cd6fc92e
2012-09-18 09:54:57 +02:00
86d973d24e sync: Support authentication to manifest server with .netrc
When using the --smart-sync or --smart-tag option, and the specified
manifest server is hosted on a server that requires authentication,
repo sync fails with the error: HTTP 401 Unauthorized.

Add support for getting the credentials from the .netrc file.

If a .netrc file exists in the user's home directory, and it contains
credentials for the hostname of the manifest server specified in the
manifest, use the credentials to authenticate with the manifest server
using the URL syntax extension for Basic Authentication:

  http://user:password@host:port/path

Credentials from the .netrc file are only used if the manifest server
URL specified in the manifest does not already include credentials.

Change-Id: I06e6586e8849d0cd12fa9746789e8d45d5b1f848
2012-09-11 09:45:48 +02:00
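For illustration, the lookup and URL rewrite could look roughly like this (the function name and error handling are assumptions):

  import netrc
  try:
    import urlparse                     # Python 2
  except ImportError:
    import urllib.parse as urlparse     # Python 3

  def add_netrc_credentials(manifest_server_url):
    parts = urlparse.urlsplit(manifest_server_url)
    if parts.username:                  # URL already carries credentials
      return manifest_server_url
    try:
      auth = netrc.netrc().authenticators(parts.hostname)
    except (IOError, netrc.NetrcParseError):
      return manifest_server_url
    if not auth:
      return manifest_server_url
    login, _account, password = auth
    netloc = '%s:%s@%s' % (login, password, parts.netloc)
    return urlparse.urlunsplit((parts.scheme, netloc) + tuple(parts[2:]))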
43 changed files with 2409 additions and 1009 deletions


@@ -5,6 +5,6 @@
 <pydev_pathproperty name="org.python.pydev.PROJECT_SOURCE_PATH">
 <path>/repo</path>
 </pydev_pathproperty>
-<pydev_property name="org.python.pydev.PYTHON_PROJECT_VERSION">python 2.4</pydev_property>
+<pydev_property name="org.python.pydev.PYTHON_PROJECT_VERSION">python 2.6</pydev_property>
 <pydev_property name="org.python.pydev.PYTHON_PROJECT_INTERPRETER">Default</pydev_property>
 </pydev_project>

.pylintrc (new file, 301 lines)

@@ -0,0 +1,301 @@
# lint Python modules using external checkers.
#
# This is the main checker controling the other ones and the reports
# generation. It is itself both a raw checker and an astng checker in order
# to:
# * handle message activation / deactivation at the module level
# * handle some basic but necessary stats'data (number of classes, methods...)
#
[MASTER]
# Specify a configuration file.
#rcfile=
# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=
# Profiled execution.
profile=no
# Add <file or directory> to the black list. It should be a base name, not a
# path. You may set this option multiple times.
ignore=SVN
# Pickle collected data for later comparisons.
persistent=yes
# Set the cache size for astng objects.
cache-size=500
# List of plugins (as comma separated values of python modules names) to load,
# usually to register additional checkers.
load-plugins=
[MESSAGES CONTROL]
# Enable only checker(s) with the given id(s). This option conflicts with the
# disable-checker option
#enable-checker=
# Enable all checker(s) except those with the given id(s). This option
# conflicts with the enable-checker option
#disable-checker=
# Enable all messages in the listed categories.
#enable-msg-cat=
# Disable all messages in the listed categories.
#disable-msg-cat=
# Enable the message(s) with the given id(s).
enable=RP0004
# Disable the message(s) with the given id(s).
disable=R0903,R0912,R0913,R0914,R0915,W0141,C0111,C0103,W0603,W0703,R0911,C0301,C0302,R0902,R0904,W0142,W0212,E1101,E1103,R0201,W0201,W0122,W0232,RP0001,RP0003,RP0101,RP0002,RP0401,RP0701,RP0801
[REPORTS]
# set the output format. Available formats are text, parseable, colorized, msvs
# (visual studio) and html
output-format=text
# Include message's id in output
include-ids=yes
# Put messages in a separate file for each module / package specified on the
# command line instead of printing them on stdout. Reports (if any) will be
# written in a file name "pylint_global.[txt|html]".
files-output=no
# Tells whether to display a full report or only the messages
reports=yes
# Python expression which should return a note less than 10 (10 is the highest
# note).You have access to the variables errors warning, statement which
# respectivly contain the number of errors / warnings messages and the total
# number of statements analyzed. This is used by the global evaluation report
# (R0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
# Add a comment according to your evaluation note. This is used by the global
# evaluation report (R0004).
comment=no
# checks for
# * unused variables / imports
# * undefined variables
# * redefinition of variable from builtins or from an outer scope
# * use of variable before assigment
#
[VARIABLES]
# Tells whether we should check for unused import in __init__ files.
init-import=no
# A regular expression matching names used for dummy variables (i.e. not used).
dummy-variables-rgx=_|dummy
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid to define new builtins when possible.
additional-builtins=
# try to find bugs in the code using type inference
#
[TYPECHECK]
# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes
# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamicaly set).
ignored-classes=SQLObject
# When zope mode is activated, consider the acquired-members option to ignore
# access to some undefined attributes.
zope=no
# List of members which are usually get through zope's acquisition mecanism and
# so shouldn't trigger E0201 when accessed (need zope=yes to be considered).
acquired-members=REQUEST,acl_users,aq_parent
# checks for :
# * doc strings
# * modules / classes / functions / methods / arguments / variables name
# * number of arguments, local variables, branchs, returns and statements in
# functions, methods
# * required module attributes
# * dangerous default values as arguments
# * redefinition of function / method / class
# * uses of the global statement
#
[BASIC]
# Required attributes for module, separated by a comma
required-attributes=
# Regular expression which should only match functions or classes name which do
# not require a docstring
no-docstring-rgx=_main|__.*__
# Regular expression which should only match correct module names
module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
# Regular expression which should only match correct module level names
const-rgx=(([A-Z_][A-Z1-9_]*)|(__.*__))|(log)$
# Regular expression which should only match correct class names
class-rgx=[A-Z_][a-zA-Z0-9]+$
# Regular expression which should only match correct function names
function-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct method names
method-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct instance attribute names
attr-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct argument names
argument-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct variable names
variable-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct list comprehension /
# generator expression variable names
inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
# Good variable names which should always be accepted, separated by a comma
good-names=i,j,k,ex,Run,_,e,d1,d2,v,f,l,d
# Bad variable names which should always be refused, separated by a comma
bad-names=foo,bar,baz,toto,tutu,tata
# List of builtins function names that should not be used, separated by a comma
bad-functions=map,filter,apply,input
# checks for sign of poor/misdesign:
# * number of methods, attributes, local variables...
# * size, complexity of functions, methods
#
[DESIGN]
# Maximum number of arguments for function / method
max-args=5
# Maximum number of locals for function / method body
max-locals=15
# Maximum number of return / yield for function / method body
max-returns=6
# Maximum number of branch for function / method body
max-branchs=12
# Maximum number of statements in function / method body
max-statements=50
# Maximum number of parents for a class (see R0901).
max-parents=7
# Maximum number of attributes for a class (see R0902).
max-attributes=20
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
# Maximum number of public methods for a class (see R0904).
max-public-methods=30
# checks for
# * external modules dependencies
# * relative / wildcard imports
# * cyclic imports
# * uses of deprecated modules
#
[IMPORTS]
# Deprecated modules which should not be used, separated by a comma
deprecated-modules=regsub,string,TERMIOS,Bastion,rexec
# Create a graph of every (i.e. internal and external) dependencies in the
# given file (report R0402 must not be disabled)
import-graph=
# Create a graph of external dependencies in the given file (report R0402 must
# not be disabled)
ext-import-graph=
# Create a graph of internal dependencies in the given file (report R0402 must
# not be disabled)
int-import-graph=
# checks for :
# * methods without self as first argument
# * overridden methods signature
# * access only to existant members via self
# * attributes not defined in the __init__ method
# * supported interfaces implementation
# * unreachable code
#
[CLASSES]
# List of interface methods to ignore, separated by a comma. This is used for
# instance to not check methods defines in Zope's Interface base class.
ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by
# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,__new__,setUp
# checks for similarities and duplicated code. This computation may be
# memory / CPU intensive, so you should disable it if you experiments some
# problems.
#
[SIMILARITIES]
# Minimum lines number of a similarity.
min-similarity-lines=4
# Ignore comments when computing similarities.
ignore-comments=yes
# Ignore docstrings when computing similarities.
ignore-docstrings=yes
# checks for:
# * warning notes in the code like FIXME, XXX
# * PEP 263: source code with non ascii character but no encoding declaration
#
[MISCELLANEOUS]
# List of note tags to take in consideration, separated by a comma.
notes=FIXME,XXX,TODO
# checks for :
# * unauthorized constructions
# * strict indentation
# * line length
# * use of <> instead of !=
#
[FORMAT]
# Maximum number of characters on a single line.
max-line-length=80
# Maximum number of lines in a module
max-module-lines=1000
# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
# tab). In repo it is 2 spaces.
indent-string=' '


@@ -2,6 +2,7 @@ Short Version:
 - Make small logical changes.
 - Provide a meaningful commit message.
+- Check for coding errors with pylint
 - Make sure all code is under the Apache License, 2.0.
 - Publish your changes for review:
@@ -33,7 +34,14 @@ If your description starts to get too long, that's a sign that you
 probably need to split up your commit to finer grained pieces.
-(2) Check the license
+(2) Check for coding errors with pylint
+
+Run pylint on changed modules using the provided configuration:
+
+  pylint --rcfile=.pylintrc file.py
+
+(3) Check the license
 repo is licensed under the Apache License, 2.0.
@@ -49,7 +57,7 @@ your patch.  It is virtually impossible to remove a patch once it
 has been applied and pushed out.
-(3) Sending your patches.
+(4) Sending your patches.
 Do not email your patches to anyone.


@@ -36,52 +36,56 @@ ATTRS = {None :-1,
          'blink'  : 5,
          'reverse': 7}
 
-RESET = "\033[m"
+RESET = "\033[m"  # pylint: disable=W1401
+                  # backslash is not anomalous
 
-def is_color(s): return s in COLORS
-def is_attr(s):  return s in ATTRS
+def is_color(s):
+  return s in COLORS
+
+def is_attr(s):
+  return s in ATTRS
 
 def _Color(fg = None, bg = None, attr = None):
   fg = COLORS[fg]
   bg = COLORS[bg]
   attr = ATTRS[attr]
   if attr >= 0 or fg >= 0 or bg >= 0:
     need_sep = False
-    code = "\033["
+    code = "\033["  #pylint: disable=W1401
     if attr >= 0:
       code += chr(ord('0') + attr)
       need_sep = True
 
     if fg >= 0:
       if need_sep:
         code += ';'
       need_sep = True
 
       if fg < 8:
         code += '3%c' % (ord('0') + fg)
       else:
         code += '38;5;%d' % fg
 
     if bg >= 0:
       if need_sep:
         code += ';'
       need_sep = True
 
       if bg < 8:
         code += '4%c' % (ord('0') + bg)
       else:
         code += '48;5;%d' % bg
     code += 'm'
   else:
     code = ''
   return code
 
 class Coloring(object):
-  def __init__(self, config, type):
-    self._section = 'color.%s' % type
+  def __init__(self, config, section_type):
+    self._section = 'color.%s' % section_type
     self._config = config
     self._out = sys.stdout
@@ -126,8 +130,8 @@ class Coloring(object):
     if self._on:
       c = self._parse(opt, fg, bg, attr)
       def f(fmt, *args):
-        str = fmt % args
-        return ''.join([c, str, RESET])
+        output = fmt % args
+        return ''.join([c, output, RESET])
       return f
     else:
       def f(fmt, *args):
@@ -151,8 +155,10 @@ class Coloring(object):
     have_fg = False
     for a in v.split(' '):
       if is_color(a):
-        if have_fg: bg = a
-        else:       fg = a
+        if have_fg:
+          bg = a
+        else:
+          fg = a
       elif is_attr(a):
         attr = a


@@ -22,6 +22,7 @@ import sys
 from error import NoSuchProjectError
 from error import InvalidProjectGroupsError
 
 class Command(object):
   """Base class for any command line action in repo.
   """
@@ -33,6 +34,27 @@
   def WantPager(self, opt):
     return False
 
+  def ReadEnvironmentOptions(self, opts):
+    """ Set options from environment variables. """
+
+    env_options = self._RegisteredEnvironmentOptions()
+
+    for env_key, opt_key in env_options.items():
+      # Get the user-set option value if any
+      opt_value = getattr(opts, opt_key)
+
+      # If the value is set, it means the user has passed it as a command
+      # line option, and we should use that.  Otherwise we can try to set it
+      # with the value from the corresponding environment variable.
+      if opt_value is not None:
+        continue
+
+      env_value = os.environ.get(env_key)
+      if env_value is not None:
+        setattr(opts, opt_key, env_value)
+
+    return opts
+
   @property
   def OptionParser(self):
     if self._optparse is None:
@@ -49,6 +71,24 @@
     """Initialize the option parser.
     """
 
+  def _RegisteredEnvironmentOptions(self):
+    """Get options that can be set from environment variables.
+
+    Return a dictionary mapping environment variable name
+    to option key name that it can override.
+
+    Example: {'REPO_MY_OPTION': 'my_option'}
+    Will allow the option with key value 'my_option' to be set
+    from the value in the environment variable named 'REPO_MY_OPTION'.
+
+    Note: This does not work properly for options that are explicitly
+    set to None by the user, or options that are defined with a
+    default value other than None.
+    """
+    return {}
+
   def Usage(self):
     """Display usage and terminate.
     """
@@ -60,10 +100,36 @@
     """
     raise NotImplementedError
 
-  def GetProjects(self, args, missing_ok=False):
+  def _ResetPathToProjectMap(self, projects):
+    self._by_path = dict((p.worktree, p) for p in projects)
+
+  def _UpdatePathToProjectMap(self, project):
+    self._by_path[project.worktree] = project
+
+  def _GetProjectByPath(self, path):
+    project = None
+    if os.path.exists(path):
+      oldpath = None
+      while path \
+            and path != oldpath \
+            and path != self.manifest.topdir:
+        try:
+          project = self._by_path[path]
+          break
+        except KeyError:
+          oldpath = path
+          path = os.path.dirname(path)
+    else:
+      try:
+        project = self._by_path[path]
+      except KeyError:
+        pass
+    return project
+
+  def GetProjects(self, args, missing_ok=False, submodules_ok=False):
     """A list of projects that match the arguments.
     """
-    all = self.manifest.projects
+    all_projects = self.manifest.projects
     result = []
 
     mp = self.manifest.manifestProject
@@ -71,43 +137,40 @@
     groups = mp.config.GetString('manifest.groups')
     if not groups:
       groups = 'all,-notdefault,platform-' + platform.system().lower()
-    groups = [x for x in re.split('[,\s]+', groups) if x]
+    groups = [x for x in re.split(r'[,\s]+', groups) if x]
 
     if not args:
-      for project in all.values():
+      all_projects_list = all_projects.values()
+      derived_projects = {}
+      for project in all_projects_list:
+        if submodules_ok or project.sync_s:
+          derived_projects.update((p.name, p)
+                                  for p in project.GetDerivedSubprojects())
+      all_projects_list.extend(derived_projects.values())
+      for project in all_projects_list:
         if ((missing_ok or project.Exists) and
             project.MatchesGroups(groups)):
           result.append(project)
     else:
-      by_path = None
+      self._ResetPathToProjectMap(all_projects.values())
 
       for arg in args:
-        project = all.get(arg)
+        project = all_projects.get(arg)
 
         if not project:
           path = os.path.abspath(arg).replace('\\', '/')
+          project = self._GetProjectByPath(path)
 
-          if not by_path:
-            by_path = dict()
-            for p in all.values():
-              by_path[p.worktree] = p
-
-          if os.path.exists(path):
-            oldpath = None
-            while path \
-              and path != oldpath \
-              and path != self.manifest.topdir:
-              try:
-                project = by_path[path]
-                break
-              except KeyError:
-                oldpath = path
-                path = os.path.dirname(path)
-          else:
-            try:
-              project = by_path[path]
-            except KeyError:
-              pass
+          # If it's not a derived project, update path->project mapping and
+          # search again, as arg might actually point to a derived subproject.
+          if (project and not project.Derived and
+              (submodules_ok or project.sync_s)):
+            search_again = False
+            for subproject in project.GetDerivedSubprojects():
+              self._UpdatePathToProjectMap(subproject)
+              search_again = True
+            if search_again:
+              project = self._GetProjectByPath(path) or project
 
         if not project:
           raise NoSuchProjectError(arg)
@@ -123,6 +186,11 @@
     result.sort(key=_getpath)
     return result
 
+# pylint: disable=W0223
+# Pylint warns that the `InteractiveCommand` and `PagedCommand` classes do not
+# override method `Execute` which is abstract in `Command`.  Since that method
+# is always implemented in classes derived from `InteractiveCommand` and
+# `PagedCommand`, this warning can be suppressed.
 class InteractiveCommand(Command):
   """Command which requires user interaction on the tty and
   must not run within a pager, even if the user asks to.
@@ -137,6 +205,8 @@ class PagedCommand(Command):
   def WantPager(self, opt):
     return True
 
+# pylint: enable=W0223
+
 class MirrorSafeCommand(object):
   """Command permits itself to run within a mirror,
   and does not require a working directory.


@@ -41,17 +41,20 @@ following DTD:
     <!ATTLIST default revision CDATA #IMPLIED>
     <!ATTLIST default sync-j   CDATA #IMPLIED>
     <!ATTLIST default sync-c   CDATA #IMPLIED>
+    <!ATTLIST default sync-s   CDATA #IMPLIED>
 
     <!ELEMENT manifest-server (EMPTY)>
     <!ATTLIST url              CDATA #REQUIRED>
 
-    <!ELEMENT project (annotation?)>
+    <!ELEMENT project (annotation?,
+                       project*)>
     <!ATTLIST project name     CDATA #REQUIRED>
     <!ATTLIST project path     CDATA #IMPLIED>
     <!ATTLIST project remote   IDREF #IMPLIED>
     <!ATTLIST project revision CDATA #IMPLIED>
     <!ATTLIST project groups   CDATA #IMPLIED>
     <!ATTLIST project sync-c   CDATA #IMPLIED>
+    <!ATTLIST project sync-s   CDATA #IMPLIED>
 
     <!ELEMENT annotation (EMPTY)>
     <!ATTLIST annotation name  CDATA #REQUIRED>
@@ -152,7 +155,10 @@ Element project
 
 One or more project elements may be specified.  Each element
 describes a single Git repository to be cloned into the repo
-client workspace.
+client workspace.  You may specify Git-submodules by creating a
+nested project.  Git-submodules will be automatically
+recognized and inherit their parent's attributes, but those
+may be overridden by an explicitly specified project element.
 
 Attribute `name`: A unique name for this project.  The project's
 name is appended onto its remote's fetch URL to generate the actual
@@ -163,7 +169,8 @@ URL to configure the Git remote with.  The URL gets formed as:
 
 where ${remote_fetch} is the remote's fetch attribute and
 ${project_name} is the project's name attribute.  The suffix ".git"
 is always appended as repo assumes the upstream is a forest of
-bare Git repositories.
+bare Git repositories.  If the project has a parent element, its
+name will be prefixed by the parent's.
 
 The project name must match the name Gerrit knows, if Gerrit is
 being used for code reviews.
@@ -171,6 +178,8 @@ being used for code reviews.
 
 Attribute `path`: An optional path relative to the top directory
 of the repo client where the Git working directory for this project
 should be placed.  If not supplied the project name is used.
+If the project has a parent element, its path will be prefixed
+by the parent's.
 
 Attribute `remote`: Name of a previously defined remote element.
 If not supplied the remote given by the default element is used.
@@ -190,6 +199,8 @@ its name:`name` and path:`path`.  E.g. for
 definition is implicitly in the following manifest groups:
 default, name:monkeys, and path:barrel-of.  If you place a project in the
 group "notdefault", it will not be automatically downloaded by repo.
+If the project has a parent element, the `name` and `path` here
+are the prefixed ones.
 
 Element annotation
 ------------------
@@ -209,7 +220,7 @@ Deletes the named project from the internal manifest table, possibly
 allowing a subsequent project element in the same manifest file to
 replace the project with a different source.
 
-This element is mostly useful in the local_manifest.xml, where
+This element is mostly useful in a local manifest file, where
 the user can remove a project, and possibly replace it with their
 own definition.
@@ -218,21 +229,25 @@ Element include
 
 This element provides the capability of including another manifest
 file into the originating manifest.  Normal rules apply for the
-target manifest to include- it must be a usable manifest on it's own.
+target manifest to include - it must be a usable manifest on its own.
 
-Attribute `name`; the manifest to include, specified relative to
-the manifest repositories root.
+Attribute `name`: the manifest to include, specified relative to
+the manifest repository's root.
 
-Local Manifest
-==============
+Local Manifests
+===============
 
-Additional remotes and projects may be added through a local
-manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`.
+Additional remotes and projects may be added through local manifest
+files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.
 
 For example:
 
-    $ cat .repo/local_manifest.xml
+    $ ls .repo/local_manifests
+    local_manifest.xml
+    another_local_manifest.xml
+
+    $ cat .repo/local_manifests/local_manifest.xml
     <?xml version="1.0" encoding="UTF-8"?>
     <manifest>
       <project path="manifest"
@@ -241,6 +256,17 @@ For example:
         name="platform/manifest" />
     </manifest>
 
-Users may add projects to the local manifest prior to a `repo sync`
+Users may add projects to the local manifest(s) prior to a `repo sync`
 invocation, instructing repo to automatically download and manage
 these extra projects.
+
+Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will
+be loaded in alphabetical order.
+
+Additional remotes and projects may also be added through a local
+manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`.  This method
+is deprecated in favor of using multiple manifest files as mentioned
+above.
+
+If `$TOP_DIR/.repo/local_manifest.xml` exists, it will be loaded before
+any manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.


@@ -13,6 +13,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+from __future__ import print_function
 import os
 import re
 import sys
@@ -53,10 +54,10 @@ class Editor(object):
       return e
 
     if os.getenv('TERM') == 'dumb':
-      print >>sys.stderr,\
+      print(
 """No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
 Tried to fall back to vi but terminal is dumb.  Please configure at
-least one of these before using this command."""
+least one of these before using this command.""", file=sys.stderr)
       sys.exit(1)
 
     return 'vi'
@@ -67,7 +68,7 @@ least one of these before using this command."""
     Args:
       data : the text to edit
 
     Returns:
       new value of edited text; None if editing did not succeed
     """
@@ -91,7 +92,7 @@ least one of these before using this command."""
     try:
       rc = subprocess.Popen(args, shell=shell).wait()
-    except OSError, e:
+    except OSError as e:
       raise EditorError('editor failed, %s: %s %s'
                         % (str(e), editor, path))
     if rc != 0:
@@ -21,10 +21,15 @@ class ManifestInvalidRevisionError(Exception):
"""The revision value in a project is incorrect.
"""
+class NoManifestException(Exception):
+"""The required manifest does not exist.
+"""
class EditorError(Exception):
"""Unspecified error from the user's text editor.
"""
def __init__(self, reason):
+super(EditorError, self).__init__()
self.reason = reason
def __str__(self):
@@ -34,24 +39,17 @@ class GitError(Exception):
"""Unspecified internal error from git.
"""
def __init__(self, command):
+super(GitError, self).__init__()
self.command = command
def __str__(self):
return self.command
-class ImportError(Exception):
-"""An import from a non-Git format cannot be performed.
-"""
-def __init__(self, reason):
-self.reason = reason
-def __str__(self):
-return self.reason
class UploadError(Exception):
"""A bundle upload to Gerrit did not succeed.
"""
def __init__(self, reason):
+super(UploadError, self).__init__()
self.reason = reason
def __str__(self):
@@ -61,6 +59,7 @@ class DownloadError(Exception):
"""Cannot download a repository.
"""
def __init__(self, reason):
+super(DownloadError, self).__init__()
self.reason = reason
def __str__(self):
@@ -70,6 +69,7 @@ class NoSuchProjectError(Exception):
"""A specified project does not exist in the work tree.
"""
def __init__(self, name=None):
+super(NoSuchProjectError, self).__init__()
self.name = name
def __str__(self):
@@ -82,6 +82,7 @@ class InvalidProjectGroupsError(Exception):
"""A specified project is not suitable for the specified groups
"""
def __init__(self, name=None):
+super(InvalidProjectGroupsError, self).__init__()
self.name = name
def __str__(self):
@@ -94,12 +95,12 @@ class RepoChangedException(Exception):
repo or manifest repositories. In this special case we must
use exec to re-execute repo with the new code and manifest.
"""
-def __init__(self, extra_args=[]):
-self.extra_args = extra_args
+def __init__(self, extra_args=None):
+super(RepoChangedException, self).__init__()
+self.extra_args = extra_args or []
class HookError(Exception):
"""Thrown if a 'repo-hook' could not be run.
The common case is that the file wasn't present when we tried to run it.
"""
-pass
@@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+from __future__ import print_function
import os
import sys
import subprocess
@@ -37,11 +38,11 @@ def ssh_sock(create=True):
if _ssh_sock_path is None:
if not create:
return None
-dir = '/tmp'
+tmp_dir = '/tmp'
-if not os.path.exists(dir):
+if not os.path.exists(tmp_dir):
-dir = tempfile.gettempdir()
+tmp_dir = tempfile.gettempdir()
_ssh_sock_path = os.path.join(
-tempfile.mkdtemp('', 'ssh-', dir),
+tempfile.mkdtemp('', 'ssh-', tmp_dir),
'master-%r@%h:%p')
return _ssh_sock_path
@@ -88,11 +89,11 @@ class _GitCall(object):
ver_str = git.version()
if ver_str.startswith('git version '):
_git_version = tuple(
-map(lambda x: int(x),
+map(int,
ver_str[len('git version '):].strip().split('-')[0].split('.')[0:3]
))
else:
-print >>sys.stderr, 'fatal: "%s" unsupported' % ver_str
+print('fatal: "%s" unsupported' % ver_str, file=sys.stderr)
sys.exit(1)
return _git_version
@@ -110,8 +111,8 @@ def git_require(min_version, fail=False):
if min_version <= git_version:
return True
if fail:
-need = '.'.join(map(lambda x: str(x), min_version))
+need = '.'.join(map(str, min_version))
-print >>sys.stderr, 'fatal: git %s or later required' % need
+print('fatal: git %s or later required' % need, file=sys.stderr)
sys.exit(1)
return False
@@ -132,15 +133,15 @@ class GitCommand(object):
gitdir = None):
env = os.environ.copy()
-for e in [REPO_TRACE,
+for key in [REPO_TRACE,
GIT_DIR,
'GIT_ALTERNATE_OBJECT_DIRECTORIES',
'GIT_OBJECT_DIRECTORY',
'GIT_WORK_TREE',
'GIT_GRAFT_FILE',
'GIT_INDEX_FILE']:
-if e in env:
+if key in env:
-del env[e]
+del env[key]
if disable_editor:
_setenv(env, 'GIT_EDITOR', ':')
@@ -217,7 +218,7 @@ class GitCommand(object):
stdin = stdin,
stdout = stdout,
stderr = stderr)
-except Exception, e:
+except Exception as e:
raise GitError('%s: %s' % (command[1], e))
if ssh_proxy:
@@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+from __future__ import print_function
import cPickle
import os
import re
@@ -23,7 +24,18 @@ try:
except ImportError:
import dummy_threading as _threading
import time
-import urllib2
+try:
+import urllib2
+except ImportError:
+# For python3
+import urllib.request
+import urllib.error
+else:
+# For python2
+import imp
+urllib = imp.new_module('urllib')
+urllib.request = urllib2
+urllib.error = urllib2
from signal import SIGTERM
from error import GitError, UploadError
@@ -35,7 +47,7 @@ from git_command import terminate_ssh_clients
R_HEADS = 'refs/heads/'
R_TAGS = 'refs/tags/'
-ID_RE = re.compile('^[0-9a-f]{40}$')
+ID_RE = re.compile(r'^[0-9a-f]{40}$')
REVIEW_CACHE = dict()
@@ -56,16 +68,16 @@ class GitConfig(object):
@classmethod
def ForUser(cls):
if cls._ForUser is None:
-cls._ForUser = cls(file = os.path.expanduser('~/.gitconfig'))
+cls._ForUser = cls(configfile = os.path.expanduser('~/.gitconfig'))
return cls._ForUser
@classmethod
def ForRepository(cls, gitdir, defaults=None):
-return cls(file = os.path.join(gitdir, 'config'),
+return cls(configfile = os.path.join(gitdir, 'config'),
defaults = defaults)
-def __init__(self, file, defaults=None, pickleFile=None):
+def __init__(self, configfile, defaults=None, pickleFile=None):
-self.file = file
+self.file = configfile
self.defaults = defaults
self._cache_dict = None
self._section_dict = None
@@ -104,20 +116,20 @@ class GitConfig(object):
return False
return None
-def GetString(self, name, all=False):
+def GetString(self, name, all_keys=False):
"""Get the first value for a key, or None if it is not defined.
This configuration file is used first, if the key is not
-defined or all = True then the defaults are also searched.
+defined or all_keys = True then the defaults are also searched.
"""
try:
v = self._cache[_key(name)]
except KeyError:
if self.defaults:
-return self.defaults.GetString(name, all = all)
+return self.defaults.GetString(name, all_keys = all_keys)
v = []
-if not all:
+if not all_keys:
if v:
return v[0]
return None
@@ -125,7 +137,7 @@ class GitConfig(object):
r = []
r.extend(v)
if self.defaults:
-r.extend(self.defaults.GetString(name, all = True))
+r.extend(self.defaults.GetString(name, all_keys = True))
return r
def SetString(self, name, value):
@@ -157,7 +169,7 @@ class GitConfig(object):
elif old != value:
self._cache[key] = list(value)
self._do('--replace-all', name, value[0])
-for i in xrange(1, len(value)):
+for i in range(1, len(value)):
self._do('--add', name, value[i])
elif len(old) != 1 or old[0] != value:
@@ -288,12 +300,13 @@ class GitConfig(object):
d = self._do('--null', '--list')
if d is None:
return c
-for line in d.rstrip('\0').split('\0'):
+for line in d.rstrip('\0').split('\0'):  # pylint: disable=W1401
+# Backslash is not anomalous
if '\n' in line:
key, val = line.split('\n', 1)
else:
key = line
val = None
if key in c:
c[key].append(val)
@@ -418,7 +431,7 @@ def _open_ssh(host, port=None):
'-o','ControlPath %s' % ssh_sock(),
host]
if port is not None:
-command_base[1:1] = ['-p',str(port)]
+command_base[1:1] = ['-p', str(port)]
# Since the key wasn't in _master_keys, we think that master isn't running.
# ...but before actually starting a master, we'll double-check. This can
@@ -449,11 +462,10 @@ def _open_ssh(host, port=None):
try:
Trace(': %s', ' '.join(command))
p = subprocess.Popen(command)
-except Exception, e:
+except Exception as e:
_ssh_master = False
-print >>sys.stderr, \
-'\nwarn: cannot enable ssh control master for %s:%s\n%s' \
-% (host,port, str(e))
+print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
+% (host,port, str(e)), file=sys.stderr)
return False
_master_processes.append(p)
@@ -525,8 +537,8 @@ class Remote(object):
self.url = self._Get('url')
self.review = self._Get('review')
self.projectname = self._Get('projectname')
-self.fetch = map(lambda x: RefSpec.FromString(x),
+self.fetch = map(RefSpec.FromString,
-self._Get('fetch', all=True))
+self._Get('fetch', all_keys=True))
self._review_url = None
def _InsteadOf(self):
@@ -537,7 +549,7 @@ class Remote(object):
for url in urlList:
key = "url." + url + ".insteadOf"
-insteadOfList = globCfg.GetString(key, all=True)
+insteadOfList = globCfg.GetString(key, all_keys=True)
for insteadOf in insteadOfList:
if self.url.startswith(insteadOf) \
@@ -567,7 +579,7 @@ class Remote(object):
if u.endswith('/ssh_info'):
u = u[:len(u) - len('/ssh_info')]
if not u.endswith('/'):
u += '/'
http_url = u
if u in REVIEW_CACHE:
@@ -579,7 +591,7 @@ class Remote(object):
else:
try:
info_url = u + 'ssh_info'
-info = urllib2.urlopen(info_url).read()
+info = urllib.request.urlopen(info_url).read()
if '<' in info:
# Assume the server gave us some sort of HTML
# response back, like maybe a login page.
@@ -592,9 +604,9 @@ class Remote(object):
else:
host, port = info.split()
self._review_url = self._SshReviewUrl(userEmail, host, port)
-except urllib2.HTTPError, e:
+except urllib.error.HTTPError as e:
raise UploadError('%s: %s' % (self.review, str(e)))
-except urllib2.URLError, e:
+except urllib.error.URLError as e:
raise UploadError('%s: %s' % (self.review, str(e)))
REVIEW_CACHE[u] = self._review_url
@@ -645,15 +657,15 @@ class Remote(object):
self._Set('url', self.url)
self._Set('review', self.review)
self._Set('projectname', self.projectname)
-self._Set('fetch', map(lambda x: str(x), self.fetch))
+self._Set('fetch', map(str, self.fetch))
def _Set(self, key, value):
key = 'remote.%s.%s' % (self.name, key)
return self._config.SetString(key, value)
-def _Get(self, key, all=False):
+def _Get(self, key, all_keys=False):
key = 'remote.%s.%s' % (self.name, key)
-return self._config.GetString(key, all = all)
+return self._config.GetString(key, all_keys = all_keys)
class Branch(object):
@@ -703,6 +715,6 @@ class Branch(object):
key = 'branch.%s.%s' % (self.name, key)
return self._config.SetString(key, value)
-def _Get(self, key, all=False):
+def _Get(self, key, all_keys=False):
key = 'branch.%s.%s' % (self.name, key)
-return self._config.GetString(key, all = all)
+return self._config.GetString(key, all_keys = all_keys)
@@ -115,10 +115,10 @@ class GitRefs(object):
line = line[:-1]
p = line.split(' ')
-id = p[0]
+ref_id = p[0]
name = p[1]
-self._phyref[name] = id
+self._phyref[name] = ref_id
finally:
fd.close()
self._mtime['packed-refs'] = mtime
@@ -138,24 +138,24 @@
def _ReadLoose1(self, path, name):
try:
fd = open(path, 'rb')
-except:
+except IOError:
return
try:
try:
mtime = os.path.getmtime(path)
-id = fd.readline()
+ref_id = fd.readline()
-except:
+except (IOError, OSError):
return
finally:
fd.close()
-if not id:
+if not ref_id:
return
-id = id[:-1]
+ref_id = ref_id[:-1]
-if id.startswith('ref: '):
+if ref_id.startswith('ref: '):
-self._symref[name] = id[5:]
+self._symref[name] = ref_id[5:]
else:
-self._phyref[name] = id
+self._phyref[name] = ref_id
self._mtime[name] = mtime
@ -1,5 +1,5 @@
#!/bin/sh #!/bin/sh
# From Gerrit Code Review 2.1.2-rc2-33-g7e30c72 # From Gerrit Code Review 2.5-rc0
# #
# Part of Gerrit Code Review (http://code.google.com/p/gerrit/) # Part of Gerrit Code Review (http://code.google.com/p/gerrit/)
# #
@ -24,71 +24,144 @@ MSG="$1"
# Check for, and add if missing, a unique Change-Id # Check for, and add if missing, a unique Change-Id
# #
add_ChangeId() { add_ChangeId() {
clean_message=$(sed -e ' clean_message=`sed -e '
/^diff --git a\/.*/{ /^diff --git a\/.*/{
s/// s///
q q
} }
/^Signed-off-by:/d /^Signed-off-by:/d
/^#/d /^#/d
' "$MSG" | git stripspace) ' "$MSG" | git stripspace`
if test -z "$clean_message" if test -z "$clean_message"
then then
return return
fi fi
# Does Change-Id: already exist? if so, exit (no change).
if grep -i '^Change-Id:' "$MSG" >/dev/null if grep -i '^Change-Id:' "$MSG" >/dev/null
then then
return return
fi fi
id=$(_gen_ChangeId) id=`_gen_ChangeId`
perl -e ' T="$MSG.tmp.$$"
$MSG = shift; AWK=awk
$id = shift; if [ -x /usr/xpg4/bin/awk ]; then
$CHANGE_ID_AFTER = shift; # Solaris AWK is just too broken
AWK=/usr/xpg4/bin/awk
fi
undef $/; # How this works:
open(I, $MSG); $_ = <I>; close I; # - parse the commit message as (textLine+ blankLine*)*
s|^diff --git a/.*||ms; # - assume textLine+ to be a footer until proven otherwise
s|^#.*$||mg; # - exception: the first block is not footer (as it is the title)
exit unless $_; # - read textLine+ into a variable
# - then count blankLines
# - once the next textLine appears, print textLine+ blankLine* as these
# aren't footer
# - in END, the last textLine+ block is available for footer parsing
$AWK '
BEGIN {
# while we start with the assumption that textLine+
# is a footer, the first block is not.
isFooter = 0
footerComment = 0
blankLines = 0
}
@message = split /\n/; # Skip lines starting with "#" without any spaces before it.
$haveFooter = 0; /^#/ { next }
$startFooter = @message;
for($line = @message - 1; $line >= 0; $line--) {
$_ = $message[$line];
($haveFooter++, next) if /^[a-zA-Z0-9-]+:/; # Skip the line starting with the diff command and everything after it,
next if /^[ []/; # up to the end of the file, assuming it is only patch data.
$startFooter = $line if ($haveFooter && /^\r?$/); # If more than one line before the diff was empty, strip all but one.
last; /^diff --git a/ {
blankLines = 0
while (getline) { }
next
}
# Count blank lines outside footer comments
/^$/ && (footerComment == 0) {
blankLines++
next
}
# Catch footer comment
/^\[[a-zA-Z0-9-]+:/ && (isFooter == 1) {
footerComment = 1
}
/]$/ && (footerComment == 1) {
footerComment = 2
}
# We have a non-blank line after blank lines. Handle this.
(blankLines > 0) {
print lines
for (i = 0; i < blankLines; i++) {
print ""
} }
@footer = @message[$startFooter+1..@message]; lines = ""
@message = @message[0..$startFooter]; blankLines = 0
push(@footer, "") unless @footer; isFooter = 1
footerComment = 0
}
for ($line = 0; $line < @footer; $line++) { # Detect that the current block is not the footer
$_ = $footer[$line]; (footerComment == 0) && (!/^\[?[a-zA-Z0-9-]+:/ || /^[a-zA-Z0-9-]+:\/\//) {
next if /^($CHANGE_ID_AFTER):/i; isFooter = 0
last; }
{
# We need this information about the current last comment line
if (footerComment == 2) {
footerComment = 0
} }
splice(@footer, $line, 0, "Change-Id: I$id"); if (lines != "") {
lines = lines "\n";
}
lines = lines $0
}
$_ = join("\n", @message, @footer); # Footer handling:
open(O, ">$MSG"); print O; close O; # If the last block is considered a footer, splice in the Change-Id at the
' "$MSG" "$id" "$CHANGE_ID_AFTER" # right place.
# Look for the right place to inject Change-Id by considering
# CHANGE_ID_AFTER. Keys listed in it (case insensitive) come first,
# then Change-Id, then everything else (eg. Signed-off-by:).
#
# Otherwise just print the last block, a new line and the Change-Id as a
# block of its own.
END {
unprinted = 1
if (isFooter == 0) {
print lines "\n"
lines = ""
}
changeIdAfter = "^(" tolower("'"$CHANGE_ID_AFTER"'") "):"
numlines = split(lines, footer, "\n")
for (line = 1; line <= numlines; line++) {
if (unprinted && match(tolower(footer[line]), changeIdAfter) != 1) {
unprinted = 0
print "Change-Id: I'"$id"'"
}
print footer[line]
}
if (unprinted) {
print "Change-Id: I'"$id"'"
}
}' "$MSG" > $T && mv $T "$MSG" || rm -f $T
} }
_gen_ChangeIdInput() { _gen_ChangeIdInput() {
echo "tree $(git write-tree)" echo "tree `git write-tree`"
if parent=$(git rev-parse HEAD^0 2>/dev/null) if parent=`git rev-parse "HEAD^0" 2>/dev/null`
then then
echo "parent $parent" echo "parent $parent"
fi fi
echo "author $(git var GIT_AUTHOR_IDENT)" echo "author `git var GIT_AUTHOR_IDENT`"
echo "committer $(git var GIT_COMMITTER_IDENT)" echo "committer `git var GIT_COMMITTER_IDENT`"
echo echo
printf '%s' "$clean_message" printf '%s' "$clean_message"
} }
main.py
@ -1,4 +1,4 @@
#!/bin/sh #!/usr/bin/env python
# #
# Copyright (C) 2008 The Android Open Source Project # Copyright (C) 2008 The Android Open Source Project
# #
@ -14,21 +14,23 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
magic='--calling-python-from-/bin/sh--' from __future__ import print_function
"""exec" python -E "$0" "$@" """#$magic" import getpass
if __name__ == '__main__': import imp
import sys
if sys.argv[-1] == '#%s' % magic:
del sys.argv[-1]
del magic
import netrc import netrc
import optparse import optparse
import os import os
import re
import sys import sys
import time import time
import urllib2 try:
import urllib2
except ImportError:
# For python3
import urllib.request
else:
# For python2
urllib = imp.new_module('urllib')
urllib.request = urllib2
from trace import SetTrace from trace import SetTrace
from git_command import git, GitCommand from git_command import git, GitCommand
@ -39,12 +41,14 @@ from subcmds.version import Version
from editor import Editor from editor import Editor
from error import DownloadError from error import DownloadError
from error import ManifestInvalidRevisionError from error import ManifestInvalidRevisionError
from error import ManifestParseError
from error import NoManifestException
from error import NoSuchProjectError from error import NoSuchProjectError
from error import RepoChangedException from error import RepoChangedException
from manifest_xml import XmlManifest from manifest_xml import XmlManifest
from pager import RunPager from pager import RunPager
from subcmds import all as all_commands from subcmds import all_commands
global_options = optparse.OptionParser( global_options = optparse.OptionParser(
usage="repo [-p|--paginate|--no-pager] COMMAND [ARGS]" usage="repo [-p|--paginate|--no-pager] COMMAND [ARGS]"
@ -77,7 +81,7 @@ class _Repo(object):
name = None name = None
glob = [] glob = []
for i in xrange(0, len(argv)): for i in range(len(argv)):
if not argv[i].startswith('-'): if not argv[i].startswith('-'):
name = argv[i] name = argv[i]
if i > 0: if i > 0:
@ -88,7 +92,7 @@ class _Repo(object):
glob = argv glob = argv
name = 'help' name = 'help'
argv = [] argv = []
gopts, gargs = global_options.parse_args(glob) gopts, _gargs = global_options.parse_args(glob)
if gopts.trace: if gopts.trace:
SetTrace() SetTrace()
@ -96,15 +100,14 @@ class _Repo(object):
if name == 'help': if name == 'help':
name = 'version' name = 'version'
else: else:
print >>sys.stderr, 'fatal: invalid usage of --version' print('fatal: invalid usage of --version', file=sys.stderr)
return 1 return 1
try: try:
cmd = self.commands[name] cmd = self.commands[name]
except KeyError: except KeyError:
print >>sys.stderr,\ print("repo: '%s' is not a repo command. See 'repo help'." % name,
"repo: '%s' is not a repo command. See 'repo help'."\ file=sys.stderr)
% name
return 1 return 1
cmd.repodir = self.repodir cmd.repodir = self.repodir
@ -112,12 +115,12 @@ class _Repo(object):
Editor.globalConfig = cmd.manifest.globalConfig Editor.globalConfig = cmd.manifest.globalConfig
if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror: if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror:
print >>sys.stderr, \ print("fatal: '%s' requires a working directory" % name,
"fatal: '%s' requires a working directory"\ file=sys.stderr)
% name
return 1 return 1
copts, cargs = cmd.OptionParser.parse_args(argv) copts, cargs = cmd.OptionParser.parse_args(argv)
copts = cmd.ReadEnvironmentOptions(copts)
if not gopts.no_pager and not isinstance(cmd, InteractiveCommand): if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig config = cmd.manifest.globalConfig
@ -130,33 +133,35 @@ class _Repo(object):
if use_pager: if use_pager:
RunPager(config) RunPager(config)
start = time.time()
try: try:
start = time.time() result = cmd.Execute(copts, cargs)
try: except DownloadError as e:
result = cmd.Execute(copts, cargs) print('error: %s' % str(e), file=sys.stderr)
finally: result = 1
elapsed = time.time() - start except ManifestInvalidRevisionError as e:
hours, remainder = divmod(elapsed, 3600) print('error: %s' % str(e), file=sys.stderr)
minutes, seconds = divmod(remainder, 60) result = 1
if gopts.time: except NoManifestException as e:
if hours == 0: print('error: manifest required for this command -- please run init',
print >>sys.stderr, 'real\t%dm%.3fs' \ file=sys.stderr)
% (minutes, seconds) result = 1
else: except NoSuchProjectError as e:
print >>sys.stderr, 'real\t%dh%dm%.3fs' \
% (hours, minutes, seconds)
except DownloadError, e:
print >>sys.stderr, 'error: %s' % str(e)
return 1
except ManifestInvalidRevisionError, e:
print >>sys.stderr, 'error: %s' % str(e)
return 1
except NoSuchProjectError, e:
if e.name: if e.name:
print >>sys.stderr, 'error: project %s not found' % e.name print('error: project %s not found' % e.name, file=sys.stderr)
else: else:
print >>sys.stderr, 'error: no project in current directory' print('error: no project in current directory', file=sys.stderr)
return 1 result = 1
finally:
elapsed = time.time() - start
hours, remainder = divmod(elapsed, 3600)
minutes, seconds = divmod(remainder, 60)
if gopts.time:
if hours == 0:
print('real\t%dm%.3fs' % (minutes, seconds), file=sys.stderr)
else:
print('real\t%dh%dm%.3fs' % (hours, minutes, seconds),
file=sys.stderr)
return result return result
@ -166,53 +171,51 @@ def _MyRepoPath():
def _MyWrapperPath(): def _MyWrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo') return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
def WrapperModule():
global _wrapper_module
if not _wrapper_module:
_wrapper_module = imp.load_source('wrapper', _MyWrapperPath())
return _wrapper_module
def _CurrentWrapperVersion(): def _CurrentWrapperVersion():
VERSION = None return WrapperModule().VERSION
pat = re.compile(r'^VERSION *=')
fd = open(_MyWrapperPath())
for line in fd:
if pat.match(line):
fd.close()
exec line
return VERSION
raise NameError, 'No VERSION in repo script'
def _CheckWrapperVersion(ver, repo_path): def _CheckWrapperVersion(ver, repo_path):
if not repo_path: if not repo_path:
repo_path = '~/bin/repo' repo_path = '~/bin/repo'
if not ver: if not ver:
print >>sys.stderr, 'no --wrapper-version argument' print('no --wrapper-version argument', file=sys.stderr)
sys.exit(1) sys.exit(1)
exp = _CurrentWrapperVersion() exp = _CurrentWrapperVersion()
ver = tuple(map(lambda x: int(x), ver.split('.'))) ver = tuple(map(int, ver.split('.')))
if len(ver) == 1: if len(ver) == 1:
ver = (0, ver[0]) ver = (0, ver[0])
exp_str = '.'.join(map(str, exp))
if exp[0] > ver[0] or ver < (0, 4): if exp[0] > ver[0] or ver < (0, 4):
exp_str = '.'.join(map(lambda x: str(x), exp)) print("""
print >>sys.stderr, """
!!! A new repo command (%5s) is available. !!! !!! A new repo command (%5s) is available. !!!
!!! You must upgrade before you can continue: !!! !!! You must upgrade before you can continue: !!!
cp %s %s cp %s %s
""" % (exp_str, _MyWrapperPath(), repo_path) """ % (exp_str, _MyWrapperPath(), repo_path), file=sys.stderr)
sys.exit(1) sys.exit(1)
if exp > ver: if exp > ver:
exp_str = '.'.join(map(lambda x: str(x), exp)) print("""
print >>sys.stderr, """
... A new repo command (%5s) is available. ... A new repo command (%5s) is available.
... You should upgrade soon: ... You should upgrade soon:
cp %s %s cp %s %s
""" % (exp_str, _MyWrapperPath(), repo_path) """ % (exp_str, _MyWrapperPath(), repo_path), file=sys.stderr)
def _CheckRepoDir(dir): def _CheckRepoDir(repo_dir):
if not dir: if not repo_dir:
print >>sys.stderr, 'no --repo-dir argument' print('no --repo-dir argument', file=sys.stderr)
sys.exit(1) sys.exit(1)
def _PruneOptions(argv, opt): def _PruneOptions(argv, opt):
i = 0 i = 0
@ -263,11 +266,11 @@ def _UserAgent():
_user_agent = 'git-repo/%s (%s) git/%s Python/%d.%d.%d' % ( _user_agent = 'git-repo/%s (%s) git/%s Python/%d.%d.%d' % (
repo_version, repo_version,
os_name, os_name,
'.'.join(map(lambda d: str(d), git.version_tuple())), '.'.join(map(str, git.version_tuple())),
py_version[0], py_version[1], py_version[2]) py_version[0], py_version[1], py_version[2])
return _user_agent return _user_agent
class _UserAgentHandler(urllib2.BaseHandler): class _UserAgentHandler(urllib.request.BaseHandler):
def http_request(self, req): def http_request(self, req):
req.add_header('User-Agent', _UserAgent()) req.add_header('User-Agent', _UserAgent())
return req return req
@ -276,7 +279,25 @@ class _UserAgentHandler(urllib2.BaseHandler):
req.add_header('User-Agent', _UserAgent()) req.add_header('User-Agent', _UserAgent())
return req return req
class _BasicAuthHandler(urllib2.HTTPBasicAuthHandler): def _AddPasswordFromUserInput(handler, msg, req):
# If repo could not find auth info from netrc, try to get it from user input
url = req.get_full_url()
user, password = handler.passwd.find_user_password(None, url)
if user is None:
print(msg)
try:
user = raw_input('User: ')
password = getpass.getpass()
except KeyboardInterrupt:
return
handler.passwd.add_password(None, url, user, password)
class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req)
return urllib.request.HTTPBasicAuthHandler.http_error_401(
self, req, fp, code, msg, headers)
def http_error_auth_reqed(self, authreq, host, req, headers): def http_error_auth_reqed(self, authreq, host, req, headers):
try: try:
old_add_header = req.add_header old_add_header = req.add_header
@ -284,7 +305,7 @@ class _BasicAuthHandler(urllib2.HTTPBasicAuthHandler):
val = val.replace('\n', '') val = val.replace('\n', '')
old_add_header(name, val) old_add_header(name, val)
req.add_header = _add_header req.add_header = _add_header
return urllib2.AbstractBasicAuthHandler.http_error_auth_reqed( return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed(
self, authreq, host, req, headers) self, authreq, host, req, headers)
except: except:
reset = getattr(self, 'reset_retry_count', None) reset = getattr(self, 'reset_retry_count', None)
@ -294,7 +315,12 @@ class _BasicAuthHandler(urllib2.HTTPBasicAuthHandler):
self.retried = 0 self.retried = 0
raise raise
class _DigestAuthHandler(urllib2.HTTPDigestAuthHandler): class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req)
return urllib.request.HTTPDigestAuthHandler.http_error_401(
self, req, fp, code, msg, headers)
def http_error_auth_reqed(self, auth_header, host, req, headers): def http_error_auth_reqed(self, auth_header, host, req, headers):
try: try:
old_add_header = req.add_header old_add_header = req.add_header
@ -302,7 +328,7 @@ class _DigestAuthHandler(urllib2.HTTPDigestAuthHandler):
val = val.replace('\n', '') val = val.replace('\n', '')
old_add_header(name, val) old_add_header(name, val)
req.add_header = _add_header req.add_header = _add_header
return urllib2.AbstractDigestAuthHandler.http_error_auth_reqed( return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed(
self, auth_header, host, req, headers) self, auth_header, host, req, headers)
except: except:
reset = getattr(self, 'reset_retry_count', None) reset = getattr(self, 'reset_retry_count', None)
@ -315,7 +341,7 @@ class _DigestAuthHandler(urllib2.HTTPDigestAuthHandler):
def init_http(): def init_http():
handlers = [_UserAgentHandler()] handlers = [_UserAgentHandler()]
mgr = urllib2.HTTPPasswordMgrWithDefaultRealm() mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
try: try:
n = netrc.netrc() n = netrc.netrc()
for host in n.hosts: for host in n.hosts:
@ -331,11 +357,11 @@ def init_http():
if 'http_proxy' in os.environ: if 'http_proxy' in os.environ:
url = os.environ['http_proxy'] url = os.environ['http_proxy']
handlers.append(urllib2.ProxyHandler({'http': url, 'https': url})) handlers.append(urllib.request.ProxyHandler({'http': url, 'https': url}))
if 'REPO_CURL_VERBOSE' in os.environ: if 'REPO_CURL_VERBOSE' in os.environ:
handlers.append(urllib2.HTTPHandler(debuglevel=1)) handlers.append(urllib.request.HTTPHandler(debuglevel=1))
handlers.append(urllib2.HTTPSHandler(debuglevel=1)) handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
urllib2.install_opener(urllib2.build_opener(*handlers)) urllib.request.install_opener(urllib.request.build_opener(*handlers))
def _Main(argv): def _Main(argv):
result = 0 result = 0
@ -365,17 +391,21 @@ def _Main(argv):
finally: finally:
close_ssh() close_ssh()
except KeyboardInterrupt: except KeyboardInterrupt:
print('aborted by user', file=sys.stderr)
result = 1 result = 1
except RepoChangedException, rce: except ManifestParseError as mpe:
print('fatal: %s' % mpe, file=sys.stderr)
result = 1
except RepoChangedException as rce:
# If repo changed, re-exec ourselves. # If repo changed, re-exec ourselves.
# #
argv = list(sys.argv) argv = list(sys.argv)
argv.extend(rce.extra_args) argv.extend(rce.extra_args)
try: try:
os.execv(__file__, argv) os.execv(__file__, argv)
except OSError, e: except OSError as e:
print >>sys.stderr, 'fatal: cannot restart repo after upgrade' print('fatal: cannot restart repo after upgrade', file=sys.stderr)
print >>sys.stderr, 'fatal: %s' % e print('fatal: %s' % e, file=sys.stderr)
result = 128 result = 128
sys.exit(result) sys.exit(result)
@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import itertools import itertools
import os import os
import re import re
@ -21,11 +22,13 @@ import urlparse
import xml.dom.minidom import xml.dom.minidom
from git_config import GitConfig from git_config import GitConfig
from project import RemoteSpec, Project, MetaProject, R_HEADS, HEAD from git_refs import R_HEADS, HEAD
from project import RemoteSpec, Project, MetaProject
from error import ManifestParseError from error import ManifestParseError
MANIFEST_FILE_NAME = 'manifest.xml' MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml' LOCAL_MANIFEST_NAME = 'local_manifest.xml'
LOCAL_MANIFESTS_DIR_NAME = 'local_manifests'
urlparse.uses_relative.extend(['ssh', 'git']) urlparse.uses_relative.extend(['ssh', 'git'])
urlparse.uses_netloc.extend(['ssh', 'git']) urlparse.uses_netloc.extend(['ssh', 'git'])
@ -37,6 +40,7 @@ class _Default(object):
remote = None remote = None
sync_j = 1 sync_j = 1
sync_c = False sync_c = False
sync_s = False
class _XmlRemote(object): class _XmlRemote(object):
def __init__(self, def __init__(self,
@ -52,15 +56,28 @@ class _XmlRemote(object):
self.reviewUrl = review self.reviewUrl = review
self.resolvedFetchUrl = self._resolveFetchUrl() self.resolvedFetchUrl = self._resolveFetchUrl()
def __eq__(self, other):
return self.__dict__ == other.__dict__
def __ne__(self, other):
return self.__dict__ != other.__dict__
def _resolveFetchUrl(self): def _resolveFetchUrl(self):
url = self.fetchUrl.rstrip('/') url = self.fetchUrl.rstrip('/')
manifestUrl = self.manifestUrl.rstrip('/') manifestUrl = self.manifestUrl.rstrip('/')
p = manifestUrl.startswith('persistent-http')
if p:
manifestUrl = manifestUrl[len('persistent-'):]
# urljoin will get confused if there is no scheme in the base url # urljoin will get confused if there is no scheme in the base url
# ie, if manifestUrl is of the form <hostname:port> # ie, if manifestUrl is of the form <hostname:port>
if manifestUrl.find(':') != manifestUrl.find('/') - 1: if manifestUrl.find(':') != manifestUrl.find('/') - 1:
manifestUrl = 'gopher://' + manifestUrl manifestUrl = 'gopher://' + manifestUrl
url = urlparse.urljoin(manifestUrl, url) url = urlparse.urljoin(manifestUrl, url)
return re.sub(r'^gopher://', '', url) url = re.sub(r'^gopher://', '', url)
if p:
url = 'persistent-' + url
return url
def ToRemoteSpec(self, projectName): def ToRemoteSpec(self, projectName):
url = self.resolvedFetchUrl.rstrip('/') + '/' + projectName url = self.resolvedFetchUrl.rstrip('/') + '/' + projectName
@ -112,7 +129,7 @@ class XmlManifest(object):
if os.path.exists(self.manifestFile): if os.path.exists(self.manifestFile):
os.remove(self.manifestFile) os.remove(self.manifestFile)
os.symlink('manifests/%s' % name, self.manifestFile) os.symlink('manifests/%s' % name, self.manifestFile)
except OSError, e: except OSError:
raise ManifestParseError('cannot link manifest %s' % name) raise ManifestParseError('cannot link manifest %s' % name)
def _RemoteToXml(self, r, doc, root): def _RemoteToXml(self, r, doc, root):
@ -123,7 +140,7 @@ class XmlManifest(object):
if r.reviewUrl is not None: if r.reviewUrl is not None:
e.setAttribute('review', r.reviewUrl) e.setAttribute('review', r.reviewUrl)
def Save(self, fd, peg_rev=False): def Save(self, fd, peg_rev=False, peg_rev_upstream=True):
"""Write the current manifest out to the given file descriptor. """Write the current manifest out to the given file descriptor.
""" """
mp = self.manifestProject mp = self.manifestProject
@ -169,6 +186,9 @@ class XmlManifest(object):
if d.sync_c: if d.sync_c:
have_default = True have_default = True
e.setAttribute('sync-c', 'true') e.setAttribute('sync-c', 'true')
if d.sync_s:
have_default = True
e.setAttribute('sync-s', 'true')
if have_default: if have_default:
root.appendChild(e) root.appendChild(e)
root.appendChild(doc.createTextNode('')) root.appendChild(doc.createTextNode(''))
@ -179,29 +199,38 @@ class XmlManifest(object):
root.appendChild(e) root.appendChild(e)
root.appendChild(doc.createTextNode('')) root.appendChild(doc.createTextNode(''))
sort_projects = list(self.projects.keys()) def output_projects(parent, parent_node, projects):
sort_projects.sort() for p in projects:
output_project(parent, parent_node, self.projects[p])
for p in sort_projects:
p = self.projects[p]
def output_project(parent, parent_node, p):
if not p.MatchesGroups(groups): if not p.MatchesGroups(groups):
continue return
name = p.name
relpath = p.relpath
if parent:
name = self._UnjoinName(parent.name, name)
relpath = self._UnjoinRelpath(parent.relpath, relpath)
e = doc.createElement('project') e = doc.createElement('project')
root.appendChild(e) parent_node.appendChild(e)
e.setAttribute('name', p.name) e.setAttribute('name', name)
if p.relpath != p.name: if relpath != name:
e.setAttribute('path', p.relpath) e.setAttribute('path', relpath)
if not d.remote or p.remote.name != d.remote.name: if not d.remote or p.remote.name != d.remote.name:
e.setAttribute('remote', p.remote.name) e.setAttribute('remote', p.remote.name)
if peg_rev: if peg_rev:
if self.IsMirror: if self.IsMirror:
e.setAttribute('revision', value = p.bare_git.rev_parse(p.revisionExpr + '^0')
p.bare_git.rev_parse(p.revisionExpr + '^0'))
else: else:
e.setAttribute('revision', value = p.work_git.rev_parse(HEAD + '^0')
p.work_git.rev_parse(HEAD + '^0')) e.setAttribute('revision', value)
if peg_rev_upstream and value != p.revisionExpr:
# Only save the origin if the origin is not a sha1, and the default
# isn't our value, and the if the default doesn't already have that
# covered.
e.setAttribute('upstream', p.revisionExpr)
elif not d.revisionExpr or p.revisionExpr != d.revisionExpr: elif not d.revisionExpr or p.revisionExpr != d.revisionExpr:
e.setAttribute('revision', p.revisionExpr) e.setAttribute('revision', p.revisionExpr)
@ -226,6 +255,19 @@ class XmlManifest(object):
if p.sync_c: if p.sync_c:
e.setAttribute('sync-c', 'true') e.setAttribute('sync-c', 'true')
if p.sync_s:
e.setAttribute('sync-s', 'true')
if p.subprojects:
sort_projects = [subp.name for subp in p.subprojects]
sort_projects.sort()
output_projects(p, e, sort_projects)
sort_projects = [key for key in self.projects.keys()
if not self.projects[key].parent]
sort_projects.sort()
output_projects(None, root, sort_projects)
if self._repo_hooks_project: if self._repo_hooks_project:
root.appendChild(doc.createTextNode('')) root.appendChild(doc.createTextNode(''))
e = doc.createElement('repo-hooks') e = doc.createElement('repo-hooks')
@ -294,8 +336,22 @@ class XmlManifest(object):
local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME) local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME)
if os.path.exists(local): if os.path.exists(local):
print('warning: %s is deprecated; put local manifests in %s instead'
% (LOCAL_MANIFEST_NAME, LOCAL_MANIFESTS_DIR_NAME),
file=sys.stderr)
nodes.append(self._ParseManifestXml(local, self.repodir)) nodes.append(self._ParseManifestXml(local, self.repodir))
local_dir = os.path.abspath(os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME))
try:
for local_file in sorted(os.listdir(local_dir)):
if local_file.endswith('.xml'):
try:
nodes.append(self._ParseManifestXml(local_file, self.repodir))
except ManifestParseError as e:
print('%s' % str(e), file=sys.stderr)
except OSError:
pass
self._ParseManifest(nodes) self._ParseManifest(nodes)
if self.IsMirror: if self.IsMirror:
@ -305,7 +361,11 @@ class XmlManifest(object):
self._loaded = True self._loaded = True
def _ParseManifestXml(self, path, include_root): def _ParseManifestXml(self, path, include_root):
root = xml.dom.minidom.parse(path) try:
root = xml.dom.minidom.parse(path)
except (OSError, xml.parsers.expat.ExpatError) as e:
raise ManifestParseError("error parsing manifest %s: %s" % (path, e))
if not root or not root.childNodes: if not root or not root.childNodes:
raise ManifestParseError("no root node in %s" % (path,)) raise ManifestParseError("no root node in %s" % (path,))
@ -316,36 +376,40 @@ class XmlManifest(object):
raise ManifestParseError("no <manifest> in %s" % (path,)) raise ManifestParseError("no <manifest> in %s" % (path,))
nodes = [] nodes = []
for node in manifest.childNodes: for node in manifest.childNodes: # pylint:disable=W0631
if node.nodeName == 'include': # We only get here if manifest is initialised
name = self._reqatt(node, 'name') if node.nodeName == 'include':
fp = os.path.join(include_root, name) name = self._reqatt(node, 'name')
if not os.path.isfile(fp): fp = os.path.join(include_root, name)
raise ManifestParseError, \ if not os.path.isfile(fp):
"include %s doesn't exist or isn't a file" % \ raise ManifestParseError, \
(name,) "include %s doesn't exist or isn't a file" % \
try: (name,)
nodes.extend(self._ParseManifestXml(fp, include_root)) try:
# should isolate this to the exact exception, but that's nodes.extend(self._ParseManifestXml(fp, include_root))
# tricky. actual parsing implementation may vary. # should isolate this to the exact exception, but that's
except (KeyboardInterrupt, RuntimeError, SystemExit): # tricky. actual parsing implementation may vary.
raise except (KeyboardInterrupt, RuntimeError, SystemExit):
except Exception, e: raise
raise ManifestParseError( except Exception as e:
"failed parsing included manifest %s: %s", (name, e)) raise ManifestParseError(
else: "failed parsing included manifest %s: %s", (name, e))
nodes.append(node) else:
nodes.append(node)
return nodes return nodes
def _ParseManifest(self, node_list): def _ParseManifest(self, node_list):
for node in itertools.chain(*node_list): for node in itertools.chain(*node_list):
if node.nodeName == 'remote': if node.nodeName == 'remote':
remote = self._ParseRemote(node) remote = self._ParseRemote(node)
if self._remotes.get(remote.name): if remote:
raise ManifestParseError( if remote.name in self._remotes:
'duplicate remote %s in %s' % if remote != self._remotes[remote.name]:
(remote.name, self.manifestFile)) raise ManifestParseError(
self._remotes[remote.name] = remote 'remote %s already exists with different attributes' %
(remote.name))
else:
self._remotes[remote.name] = remote
for node in itertools.chain(*node_list): for node in itertools.chain(*node_list):
if node.nodeName == 'default': if node.nodeName == 'default':
@ -369,19 +433,24 @@ class XmlManifest(object):
if node.nodeName == 'manifest-server': if node.nodeName == 'manifest-server':
url = self._reqatt(node, 'url') url = self._reqatt(node, 'url')
if self._manifest_server is not None: if self._manifest_server is not None:
raise ManifestParseError( raise ManifestParseError(
'duplicate manifest-server in %s' % 'duplicate manifest-server in %s' %
(self.manifestFile)) (self.manifestFile))
self._manifest_server = url self._manifest_server = url
def recursively_add_projects(project):
if self._projects.get(project.name):
raise ManifestParseError(
'duplicate project %s in %s' %
(project.name, self.manifestFile))
self._projects[project.name] = project
for subproject in project.subprojects:
recursively_add_projects(subproject)
for node in itertools.chain(*node_list): for node in itertools.chain(*node_list):
if node.nodeName == 'project': if node.nodeName == 'project':
project = self._ParseProject(node) project = self._ParseProject(node)
if self._projects.get(project.name): recursively_add_projects(project)
raise ManifestParseError(
'duplicate project %s in %s' %
(project.name, self.manifestFile))
self._projects[project.name] = project
if node.nodeName == 'repo-hooks': if node.nodeName == 'repo-hooks':
# Get the name of the project and the (space-separated) list of enabled. # Get the name of the project and the (space-separated) list of enabled.
repo_hooks_project = self._reqatt(node, 'in-project') repo_hooks_project = self._reqatt(node, 'in-project')
@ -408,9 +477,8 @@ class XmlManifest(object):
try: try:
del self._projects[name] del self._projects[name]
except KeyError: except KeyError:
raise ManifestParseError( raise ManifestParseError('remove-project element specifies non-existent '
'project %s not found' % 'project: %s' % name)
(name))
# If the manifest removes the hooks project, treat it as if it deleted # If the manifest removes the hooks project, treat it as if it deleted
# the repo-hooks element too. # the repo-hooks element too.
@ -490,6 +558,12 @@ class XmlManifest(object):
d.sync_c = False d.sync_c = False
else: else:
d.sync_c = sync_c.lower() in ("yes", "true", "1") d.sync_c = sync_c.lower() in ("yes", "true", "1")
sync_s = node.getAttribute('sync-s')
if not sync_s:
d.sync_s = False
else:
d.sync_s = sync_s.lower() in ("yes", "true", "1")
return d return d
def _ParseNotice(self, node): def _ParseNotice(self, node):
@ -531,11 +605,19 @@ class XmlManifest(object):
return '\n'.join(cleanLines) return '\n'.join(cleanLines)
def _ParseProject(self, node): def _JoinName(self, parent_name, name):
return os.path.join(parent_name, name)
def _UnjoinName(self, parent_name, name):
return os.path.relpath(name, parent_name)
def _ParseProject(self, node, parent = None):
""" """
reads a <project> element from the manifest file reads a <project> element from the manifest file
""" """
name = self._reqatt(node, 'name') name = self._reqatt(node, 'name')
if parent:
name = self._JoinName(parent.name, name)
remote = self._get_remote(node) remote = self._get_remote(node)
if remote is None: if remote is None:
@ -573,42 +655,80 @@ class XmlManifest(object):
else: else:
sync_c = sync_c.lower() in ("yes", "true", "1") sync_c = sync_c.lower() in ("yes", "true", "1")
sync_s = node.getAttribute('sync-s')
if not sync_s:
sync_s = self._default.sync_s
else:
sync_s = sync_s.lower() in ("yes", "true", "1")
upstream = node.getAttribute('upstream')
groups = '' groups = ''
if node.hasAttribute('groups'): if node.hasAttribute('groups'):
groups = node.getAttribute('groups') groups = node.getAttribute('groups')
groups = [x for x in re.split('[,\s]+', groups) if x] groups = [x for x in re.split(r'[,\s]+', groups) if x]
default_groups = ['default', 'name:%s' % name, 'path:%s' % path] if parent is None:
groups.extend(set(default_groups).difference(groups)) relpath, worktree, gitdir = self.GetProjectPaths(name, path)
if self.IsMirror:
relpath = None
worktree = None
gitdir = os.path.join(self.topdir, '%s.git' % name)
else: else:
worktree = os.path.join(self.topdir, path).replace('\\', '/') relpath, worktree, gitdir = self.GetSubprojectPaths(parent, path)
gitdir = os.path.join(self.repodir, 'projects/%s.git' % path)
default_groups = ['all', 'name:%s' % name, 'path:%s' % relpath]
groups.extend(set(default_groups).difference(groups))
project = Project(manifest = self, project = Project(manifest = self,
name = name, name = name,
remote = remote.ToRemoteSpec(name), remote = remote.ToRemoteSpec(name),
gitdir = gitdir, gitdir = gitdir,
worktree = worktree, worktree = worktree,
relpath = path, relpath = relpath,
revisionExpr = revisionExpr, revisionExpr = revisionExpr,
revisionId = None, revisionId = None,
rebase = rebase, rebase = rebase,
groups = groups, groups = groups,
sync_c = sync_c) sync_c = sync_c,
sync_s = sync_s,
upstream = upstream,
parent = parent)
for n in node.childNodes: for n in node.childNodes:
if n.nodeName == 'copyfile': if n.nodeName == 'copyfile':
self._ParseCopyFile(project, n) self._ParseCopyFile(project, n)
if n.nodeName == 'annotation': if n.nodeName == 'annotation':
self._ParseAnnotation(project, n) self._ParseAnnotation(project, n)
if n.nodeName == 'project':
project.subprojects.append(self._ParseProject(n, parent = project))
return project return project
def GetProjectPaths(self, name, path):
relpath = path
if self.IsMirror:
worktree = None
gitdir = os.path.join(self.topdir, '%s.git' % name)
else:
worktree = os.path.join(self.topdir, path).replace('\\', '/')
gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path)
return relpath, worktree, gitdir
def GetSubprojectName(self, parent, submodule_path):
return os.path.join(parent.name, submodule_path)
def _JoinRelpath(self, parent_relpath, relpath):
return os.path.join(parent_relpath, relpath)
def _UnjoinRelpath(self, parent_relpath, relpath):
return os.path.relpath(relpath, parent_relpath)
def GetSubprojectPaths(self, parent, path):
relpath = self._JoinRelpath(parent.relpath, path)
gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path)
if self.IsMirror:
worktree = None
else:
worktree = os.path.join(parent.worktree, path).replace('\\', '/')
return relpath, worktree, gitdir
def _ParseCopyFile(self, project, node): def _ParseCopyFile(self, project, node):
src = self._reqatt(node, 'src') src = self._reqatt(node, 'src')
dest = self._reqatt(node, 'dest') dest = self._reqatt(node, 'dest')
@@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+from __future__ import print_function
import os
import select
import sys
@@ -49,8 +50,8 @@ def RunPager(globalConfig):
_BecomePager(pager)
except Exception:
-print >>sys.stderr, "fatal: cannot start pager '%s'" % pager
+print("fatal: cannot start pager '%s'" % pager, file=sys.stderr)
-os.exit(255)
+sys.exit(255)
def _SelectPager(globalConfig):
try:
@@ -74,11 +75,11 @@ def _BecomePager(pager):
# ready works around a long-standing bug in popularly
# available versions of 'less', a better 'more'.
#
-a, b, c = select.select([0], [], [0])
+_a, _b, _c = select.select([0], [], [0])
os.environ['LESS'] = 'FRSX'
try:
os.execvp(pager, [pager])
-except OSError, e:
+except OSError:
os.execv('/bin/sh', ['sh', '-c', pager])
@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import traceback import traceback
import errno import errno
import filecmp import filecmp
@ -22,13 +23,15 @@ import shutil
import stat import stat
import subprocess import subprocess
import sys import sys
import tempfile
import time import time
from color import Coloring from color import Coloring
from git_command import GitCommand from git_command import GitCommand, git_require
from git_config import GitConfig, IsId, GetSchemeFromUrl, ID_RE from git_config import GitConfig, IsId, GetSchemeFromUrl, ID_RE
from error import GitError, HookError, UploadError from error import GitError, HookError, UploadError
from error import ManifestInvalidRevisionError from error import ManifestInvalidRevisionError
from error import NoManifestException
from trace import IsTrace, Trace from trace import IsTrace, Trace
from git_refs import GitRefs, HEAD, R_HEADS, R_TAGS, R_PUB, R_M from git_refs import GitRefs, HEAD, R_HEADS, R_TAGS, R_PUB, R_M
@ -50,7 +53,7 @@ def _lwrite(path, content):
def _error(fmt, *args): def _error(fmt, *args):
msg = fmt % args msg = fmt % args
print >>sys.stderr, 'error: %s' % msg print('error: %s' % msg, file=sys.stderr)
def not_rev(r): def not_rev(r):
return '^' + r return '^' + r
@ -209,9 +212,9 @@ class _CopyFile:
if os.path.exists(dest): if os.path.exists(dest):
os.remove(dest) os.remove(dest)
else: else:
dir = os.path.dirname(dest) dest_dir = os.path.dirname(dest)
if not os.path.isdir(dir): if not os.path.isdir(dest_dir):
os.makedirs(dir) os.makedirs(dest_dir)
shutil.copy(src, dest) shutil.copy(src, dest)
# make the file read-only # make the file read-only
mode = os.stat(dest)[stat.ST_MODE] mode = os.stat(dest)[stat.ST_MODE]
@ -328,7 +331,6 @@ class RepoHook(object):
HookError: Raised if the user doesn't approve and abort_if_user_denies HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the consturctor. was passed to the consturctor.
""" """
hooks_dir = self._hooks_project.worktree
hooks_config = self._hooks_project.config hooks_config = self._hooks_project.config
git_approval_key = 'repo.hooks.%s.approvedhash' % self._hook_type git_approval_key = 'repo.hooks.%s.approvedhash' % self._hook_type
@ -360,7 +362,7 @@ class RepoHook(object):
'(yes/yes-never-ask-again/NO)? ') % ( '(yes/yes-never-ask-again/NO)? ') % (
self._GetMustVerb(), self._script_fullpath) self._GetMustVerb(), self._script_fullpath)
response = raw_input(prompt).lower() response = raw_input(prompt).lower()
print print()
# User is doing a one-time approval. # User is doing a one-time approval.
if response in ('y', 'yes'): if response in ('y', 'yes'):
@ -484,7 +486,31 @@ class Project(object):
revisionId, revisionId,
rebase = True, rebase = True,
groups = None, groups = None,
sync_c = False): sync_c = False,
sync_s = False,
upstream = None,
parent = None,
is_derived = False):
"""Init a Project object.
Args:
manifest: The XmlManifest object.
name: The `name` attribute of manifest.xml's project element.
remote: RemoteSpec object specifying its remote's properties.
gitdir: Absolute path of git directory.
worktree: Absolute path of git working tree.
relpath: Relative path of git working tree to repo's top directory.
revisionExpr: The `revision` attribute of manifest.xml's project element.
revisionId: git commit id for checking out.
rebase: The `rebase` attribute of manifest.xml's project element.
groups: The `groups` attribute of manifest.xml's project element.
sync_c: The `sync-c` attribute of manifest.xml's project element.
sync_s: The `sync-s` attribute of manifest.xml's project element.
upstream: The `upstream` attribute of manifest.xml's project element.
parent: The parent Project object.
is_derived: False if the project was explicitly defined in the manifest;
True if the project is a discovered submodule.
"""
self.manifest = manifest self.manifest = manifest
self.name = name self.name = name
self.remote = remote self.remote = remote
@ -506,6 +532,11 @@ class Project(object):
self.rebase = rebase self.rebase = rebase
self.groups = groups self.groups = groups
self.sync_c = sync_c self.sync_c = sync_c
self.sync_s = sync_s
self.upstream = upstream
self.parent = parent
self.is_derived = is_derived
self.subprojects = []
self.snapshots = {} self.snapshots = {}
self.copyfiles = [] self.copyfiles = []
@ -525,6 +556,10 @@ class Project(object):
# project containing repo hooks. # project containing repo hooks.
self.enabled_repo_hooks = [] self.enabled_repo_hooks = []
@property
def Derived(self):
return self.is_derived
@property @property
def Exists(self): def Exists(self):
return os.path.isdir(self.gitdir) return os.path.isdir(self.gitdir)
@ -554,7 +589,7 @@ class Project(object):
'--unmerged', '--unmerged',
'--ignore-missing', '--ignore-missing',
'--refresh') '--refresh')
if self.work_git.DiffZ('diff-index','-M','--cached',HEAD): if self.work_git.DiffZ('diff-index', '-M', '--cached', HEAD):
return True return True
if self.work_git.DiffZ('diff-files'): if self.work_git.DiffZ('diff-files'):
return True return True
@ -583,14 +618,14 @@ class Project(object):
return self._userident_email return self._userident_email
def _LoadUserIdentity(self): def _LoadUserIdentity(self):
u = self.bare_git.var('GIT_COMMITTER_IDENT') u = self.bare_git.var('GIT_COMMITTER_IDENT')
m = re.compile("^(.*) <([^>]*)> ").match(u) m = re.compile("^(.*) <([^>]*)> ").match(u)
if m: if m:
self._userident_name = m.group(1) self._userident_name = m.group(1)
self._userident_email = m.group(2) self._userident_email = m.group(2)
else: else:
self._userident_name = '' self._userident_name = ''
self._userident_email = '' self._userident_email = ''
def GetRemote(self, name): def GetRemote(self, name):
"""Get the configuration for a single remote. """Get the configuration for a single remote.
@ -606,25 +641,24 @@ class Project(object):
"""Get all existing local branches. """Get all existing local branches.
""" """
current = self.CurrentBranch current = self.CurrentBranch
all = self._allrefs all_refs = self._allrefs
heads = {} heads = {}
pubd = {}
for name, id in all.iteritems(): for name, ref_id in all_refs.iteritems():
if name.startswith(R_HEADS): if name.startswith(R_HEADS):
name = name[len(R_HEADS):] name = name[len(R_HEADS):]
b = self.GetBranch(name) b = self.GetBranch(name)
b.current = name == current b.current = name == current
b.published = None b.published = None
b.revision = id b.revision = ref_id
heads[name] = b heads[name] = b
for name, id in all.iteritems(): for name, ref_id in all_refs.iteritems():
if name.startswith(R_PUB): if name.startswith(R_PUB):
name = name[len(R_PUB):] name = name[len(R_PUB):]
b = heads.get(name) b = heads.get(name)
if b: if b:
b.published = id b.published = ref_id
return heads return heads
@ -683,9 +717,9 @@ class Project(object):
if not os.path.isdir(self.worktree): if not os.path.isdir(self.worktree):
if output_redir == None: if output_redir == None:
output_redir = sys.stdout output_redir = sys.stdout
print >>output_redir, '' print(file=output_redir)
print >>output_redir, 'project %s/' % self.relpath print('project %s/' % self.relpath, file=output_redir)
print >>output_redir, ' missing (run "repo sync")' print(' missing (run "repo sync")', file=output_redir)
return return
self.work_git.update_index('-q', self.work_git.update_index('-q',
@ -724,17 +758,25 @@ class Project(object):
paths.sort() paths.sort()
for p in paths: for p in paths:
-      try: i = di[p]
-      except KeyError: i = None
+      try:
+        i = di[p]
+      except KeyError:
+        i = None

-      try: f = df[p]
-      except KeyError: f = None
+      try:
+        f = df[p]
+      except KeyError:
+        f = None

-      if i: i_status = i.status.upper()
-      else: i_status = '-'
+      if i:
+        i_status = i.status.upper()
+      else:
+        i_status = '-'

-      if f: f_status = f.status.lower()
-      else: f_status = '-'
+      if f:
+        f_status = f.status.lower()
+      else:
+        f_status = '-'
if i and i.src_path: if i and i.src_path:
line = ' %s%s\t%s => %s (%s%%)' % (i_status, f_status, line = ' %s%s\t%s => %s (%s%%)' % (i_status, f_status,
@ -777,46 +819,46 @@ class Project(object):
out.project('project %s/' % self.relpath) out.project('project %s/' % self.relpath)
out.nl() out.nl()
has_diff = True has_diff = True
print line[:-1] print(line[:-1])
p.Wait() p.Wait()
## Publish / Upload ## ## Publish / Upload ##
def WasPublished(self, branch, all=None): def WasPublished(self, branch, all_refs=None):
"""Was the branch published (uploaded) for code review? """Was the branch published (uploaded) for code review?
If so, returns the SHA-1 hash of the last published If so, returns the SHA-1 hash of the last published
state for the branch. state for the branch.
""" """
key = R_PUB + branch key = R_PUB + branch
if all is None: if all_refs is None:
try: try:
return self.bare_git.rev_parse(key) return self.bare_git.rev_parse(key)
except GitError: except GitError:
return None return None
else: else:
try: try:
return all[key] return all_refs[key]
except KeyError: except KeyError:
return None return None
def CleanPublishedCache(self, all=None): def CleanPublishedCache(self, all_refs=None):
"""Prunes any stale published refs. """Prunes any stale published refs.
""" """
if all is None: if all_refs is None:
all = self._allrefs all_refs = self._allrefs
heads = set() heads = set()
canrm = {} canrm = {}
for name, id in all.iteritems(): for name, ref_id in all_refs.iteritems():
if name.startswith(R_HEADS): if name.startswith(R_HEADS):
heads.add(name) heads.add(name)
elif name.startswith(R_PUB): elif name.startswith(R_PUB):
canrm[name] = id canrm[name] = ref_id
for name, id in canrm.iteritems(): for name, ref_id in canrm.iteritems():
n = name[len(R_PUB):] n = name[len(R_PUB):]
if R_HEADS + n not in heads: if R_HEADS + n not in heads:
self.bare_git.DeleteRef(name, id) self.bare_git.DeleteRef(name, ref_id)
def GetUploadableBranches(self, selected_branch=None): def GetUploadableBranches(self, selected_branch=None):
"""List any branches which can be uploaded for review. """List any branches which can be uploaded for review.
@ -824,15 +866,15 @@ class Project(object):
heads = {} heads = {}
pubed = {} pubed = {}
for name, id in self._allrefs.iteritems(): for name, ref_id in self._allrefs.iteritems():
if name.startswith(R_HEADS): if name.startswith(R_HEADS):
heads[name[len(R_HEADS):]] = id heads[name[len(R_HEADS):]] = ref_id
elif name.startswith(R_PUB): elif name.startswith(R_PUB):
pubed[name[len(R_PUB):]] = id pubed[name[len(R_PUB):]] = ref_id
ready = [] ready = []
for branch, id in heads.iteritems(): for branch, ref_id in heads.iteritems():
if branch in pubed and pubed[branch] == id: if branch in pubed and pubed[branch] == ref_id:
continue continue
if selected_branch and branch != selected_branch: if selected_branch and branch != selected_branch:
continue continue
@ -976,18 +1018,18 @@ class Project(object):
self._InitHooks() self._InitHooks()
def _CopyFiles(self): def _CopyFiles(self):
for file in self.copyfiles: for copyfile in self.copyfiles:
file._Copy() copyfile._Copy()
def GetRevisionId(self, all=None): def GetRevisionId(self, all_refs=None):
if self.revisionId: if self.revisionId:
return self.revisionId return self.revisionId
rem = self.GetRemote(self.remote.name) rem = self.GetRemote(self.remote.name)
rev = rem.ToLocal(self.revisionExpr) rev = rem.ToLocal(self.revisionExpr)
if all is not None and rev in all: if all_refs is not None and rev in all_refs:
return all[rev] return all_refs[rev]
try: try:
return self.bare_git.rev_parse('--verify', '%s^0' % rev) return self.bare_git.rev_parse('--verify', '%s^0' % rev)
@ -1000,16 +1042,20 @@ class Project(object):
"""Perform only the local IO portion of the sync process. """Perform only the local IO portion of the sync process.
Network access is not required. Network access is not required.
""" """
all = self.bare_ref.all all_refs = self.bare_ref.all
self.CleanPublishedCache(all) self.CleanPublishedCache(all_refs)
revid = self.GetRevisionId(all) revid = self.GetRevisionId(all_refs)
def _doff():
self._FastForward(revid)
self._CopyFiles()
self._InitWorkTree() self._InitWorkTree()
head = self.work_git.GetHead() head = self.work_git.GetHead()
if head.startswith(R_HEADS): if head.startswith(R_HEADS):
branch = head[len(R_HEADS):] branch = head[len(R_HEADS):]
try: try:
head = all[head] head = all_refs[head]
except KeyError: except KeyError:
head = None head = None
else: else:
@ -1036,7 +1082,7 @@ class Project(object):
try: try:
self._Checkout(revid, quiet=True) self._Checkout(revid, quiet=True)
except GitError, e: except GitError as e:
syncbuf.fail(self, e) syncbuf.fail(self, e)
return return
self._CopyFiles() self._CopyFiles()
@ -1058,14 +1104,14 @@ class Project(object):
branch.name) branch.name)
try: try:
self._Checkout(revid, quiet=True) self._Checkout(revid, quiet=True)
except GitError, e: except GitError as e:
syncbuf.fail(self, e) syncbuf.fail(self, e)
return return
self._CopyFiles() self._CopyFiles()
return return
upstream_gain = self._revlist(not_rev(HEAD), revid) upstream_gain = self._revlist(not_rev(HEAD), revid)
pub = self.WasPublished(branch.name, all) pub = self.WasPublished(branch.name, all_refs)
if pub: if pub:
not_merged = self._revlist(not_rev(revid), pub) not_merged = self._revlist(not_rev(revid), pub)
if not_merged: if not_merged:
@ -1082,9 +1128,6 @@ class Project(object):
# All published commits are merged, and thus we are a # All published commits are merged, and thus we are a
# strict subset. We can fast-forward safely. # strict subset. We can fast-forward safely.
# #
def _doff():
self._FastForward(revid)
self._CopyFiles()
syncbuf.later1(self, _doff) syncbuf.later1(self, _doff)
return return
@ -1143,13 +1186,10 @@ class Project(object):
try: try:
self._ResetHard(revid) self._ResetHard(revid)
self._CopyFiles() self._CopyFiles()
except GitError, e: except GitError as e:
syncbuf.fail(self, e) syncbuf.fail(self, e)
return return
else: else:
def _doff():
self._FastForward(revid)
self._CopyFiles()
syncbuf.later1(self, _doff) syncbuf.later1(self, _doff)
def AddCopyFile(self, src, dest, absdest): def AddCopyFile(self, src, dest, absdest):
@ -1169,7 +1209,7 @@ class Project(object):
cmd = ['fetch', remote.name] cmd = ['fetch', remote.name]
cmd.append('refs/changes/%2.2d/%d/%d' \ cmd.append('refs/changes/%2.2d/%d/%d' \
% (change_id % 100, change_id, patch_id)) % (change_id % 100, change_id, patch_id))
cmd.extend(map(lambda x: str(x), remote.fetch)) cmd.extend(map(str, remote.fetch))
if GitCommand(self, cmd, bare=True).Wait() != 0: if GitCommand(self, cmd, bare=True).Wait() != 0:
return None return None
return DownloadedChange(self, return DownloadedChange(self,
@ -1188,8 +1228,8 @@ class Project(object):
if head == (R_HEADS + name): if head == (R_HEADS + name):
return True return True
all = self.bare_ref.all all_refs = self.bare_ref.all
if (R_HEADS + name) in all: if (R_HEADS + name) in all_refs:
return GitCommand(self, return GitCommand(self,
['checkout', name, '--'], ['checkout', name, '--'],
capture_stdout = True, capture_stdout = True,
@ -1198,11 +1238,11 @@ class Project(object):
branch = self.GetBranch(name) branch = self.GetBranch(name)
branch.remote = self.GetRemote(self.remote.name) branch.remote = self.GetRemote(self.remote.name)
branch.merge = self.revisionExpr branch.merge = self.revisionExpr
revid = self.GetRevisionId(all) revid = self.GetRevisionId(all_refs)
if head.startswith(R_HEADS): if head.startswith(R_HEADS):
try: try:
head = all[head] head = all_refs[head]
except KeyError: except KeyError:
head = None head = None
@ -1243,9 +1283,9 @@ class Project(object):
# #
return True return True
all = self.bare_ref.all all_refs = self.bare_ref.all
try: try:
revid = all[rev] revid = all_refs[rev]
except KeyError: except KeyError:
# Branch does not exist in this project # Branch does not exist in this project
# #
@ -1253,7 +1293,7 @@ class Project(object):
if head.startswith(R_HEADS): if head.startswith(R_HEADS):
try: try:
head = all[head] head = all_refs[head]
except KeyError: except KeyError:
head = None head = None
@ -1281,8 +1321,8 @@ class Project(object):
didn't exist. didn't exist.
""" """
rev = R_HEADS + name rev = R_HEADS + name
all = self.bare_ref.all all_refs = self.bare_ref.all
if rev not in all: if rev not in all_refs:
# Doesn't exist # Doesn't exist
return None return None
@ -1291,9 +1331,9 @@ class Project(object):
# We can't destroy the branch while we are sitting # We can't destroy the branch while we are sitting
# on it. Switch to a detached HEAD. # on it. Switch to a detached HEAD.
# #
head = all[head] head = all_refs[head]
revid = self.GetRevisionId(all) revid = self.GetRevisionId(all_refs)
if head == revid: if head == revid:
_lwrite(os.path.join(self.worktree, '.git', HEAD), _lwrite(os.path.join(self.worktree, '.git', HEAD),
'%s\n' % revid) '%s\n' % revid)
@ -1362,6 +1402,149 @@ class Project(object):
return kept return kept
## Submodule Management ##
def GetRegisteredSubprojects(self):
result = []
def rec(subprojects):
if not subprojects:
return
result.extend(subprojects)
for p in subprojects:
rec(p.subprojects)
rec(self.subprojects)
return result
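
An aside on GetRegisteredSubprojects above: it simply flattens the nested subprojects lists, one level of nesting after another. A minimal, self-contained sketch of the same recursion (the Node class and the sample names are made up for illustration only):

  class Node(object):
    """Stand-in for Project: only the nested `subprojects` list matters here."""
    def __init__(self, name, subprojects=None):
      self.name = name
      self.subprojects = subprojects or []

  def flatten(subprojects):
    result = []
    def rec(nodes):
      if not nodes:
        return
      result.extend(nodes)
      for n in nodes:
        rec(n.subprojects)
    rec(subprojects)
    return result

  top = Node('parent', [Node('sub1', [Node('sub1/nested')]), Node('sub2')])
  print([n.name for n in flatten(top.subprojects)])
  # ['sub1', 'sub2', 'sub1/nested']  (each sibling group, then its children)
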
def _GetSubmodules(self):
# Unfortunately we cannot call `git submodule status --recursive` here
# because the working tree might not exist yet, and it cannot be used
# without a working tree in its current implementation.
def get_submodules(gitdir, rev):
# Parse .gitmodules for submodule sub_paths and sub_urls
sub_paths, sub_urls = parse_gitmodules(gitdir, rev)
if not sub_paths:
return []
# Run `git ls-tree` to read the SHAs of the submodule objects, which
# happen to be the revisions of the submodule repositories
sub_revs = git_ls_tree(gitdir, rev, sub_paths)
submodules = []
for sub_path, sub_url in zip(sub_paths, sub_urls):
try:
sub_rev = sub_revs[sub_path]
except KeyError:
# Ignore non-existent submodules
continue
submodules.append((sub_rev, sub_path, sub_url))
return submodules
re_path = re.compile(r'^submodule\.([^.]+)\.path=(.*)$')
re_url = re.compile(r'^submodule\.([^.]+)\.url=(.*)$')
def parse_gitmodules(gitdir, rev):
cmd = ['cat-file', 'blob', '%s:.gitmodules' % rev]
try:
p = GitCommand(None, cmd, capture_stdout = True, capture_stderr = True,
bare = True, gitdir = gitdir)
except GitError:
return [], []
if p.Wait() != 0:
return [], []
gitmodules_lines = []
fd, temp_gitmodules_path = tempfile.mkstemp()
try:
os.write(fd, p.stdout)
os.close(fd)
cmd = ['config', '--file', temp_gitmodules_path, '--list']
p = GitCommand(None, cmd, capture_stdout = True, capture_stderr = True,
bare = True, gitdir = gitdir)
if p.Wait() != 0:
return [], []
gitmodules_lines = p.stdout.split('\n')
except GitError:
return [], []
finally:
os.remove(temp_gitmodules_path)
names = set()
paths = {}
urls = {}
for line in gitmodules_lines:
if not line:
continue
m = re_path.match(line)
if m:
names.add(m.group(1))
paths[m.group(1)] = m.group(2)
continue
m = re_url.match(line)
if m:
names.add(m.group(1))
urls[m.group(1)] = m.group(2)
continue
names = sorted(names)
return ([paths.get(name, '') for name in names],
[urls.get(name, '') for name in names])
def git_ls_tree(gitdir, rev, paths):
cmd = ['ls-tree', rev, '--']
cmd.extend(paths)
try:
p = GitCommand(None, cmd, capture_stdout = True, capture_stderr = True,
bare = True, gitdir = gitdir)
except GitError:
return []
if p.Wait() != 0:
return []
objects = {}
for line in p.stdout.split('\n'):
if not line.strip():
continue
object_rev, object_path = line.split()[2:4]
objects[object_path] = object_rev
return objects
try:
rev = self.GetRevisionId()
except GitError:
return []
return get_submodules(self.gitdir, rev)
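
The helpers above avoid needing a working tree: the .gitmodules blob is read with `git cat-file`, turned into key=value lines with `git config --file --list`, and the gitlink SHAs come from `git ls-tree`. A rough, self-contained sketch of just the parsing step, run on canned command output (the submodule name, URL and SHA below are made up):

  import re

  # `git config --file <tmp-copy-of-.gitmodules> --list` prints key=value lines;
  # `git ls-tree <rev> -- <paths>` prints "<mode> <type> <sha>\t<path>".
  gitmodules_lines = [
    'submodule.third_party/lib.path=third_party/lib',
    'submodule.third_party/lib.url=https://example.com/lib.git',
  ]
  ls_tree_lines = [
    '160000 commit 2c0f4a3b6f6f3d2ce0e81a9d7e6f0b1a2c3d4e5f\tthird_party/lib',
  ]

  re_path = re.compile(r'^submodule\.([^.]+)\.path=(.*)$')
  re_url = re.compile(r'^submodule\.([^.]+)\.url=(.*)$')

  paths = {}
  urls = {}
  for line in gitmodules_lines:
    m = re_path.match(line)
    if m:
      paths[m.group(1)] = m.group(2)
      continue
    m = re_url.match(line)
    if m:
      urls[m.group(1)] = m.group(2)

  revs = {}
  for line in ls_tree_lines:
    sha, path = line.split()[2:4]   # same slice the code above uses
    revs[path] = sha

  for name in sorted(paths):
    print((revs.get(paths[name]), paths[name], urls.get(name)))
  # ('2c0f4a3b6f6f3d2ce0e81a9d7e6f0b1a2c3d4e5f', 'third_party/lib',
  #  'https://example.com/lib.git')
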
def GetDerivedSubprojects(self):
result = []
if not self.Exists:
# If the git repo does not exist yet, querying its submodules will
# mess up its state, so return here.
return result
for rev, path, url in self._GetSubmodules():
name = self.manifest.GetSubprojectName(self, path)
project = self.manifest.projects.get(name)
if project:
result.extend(project.GetDerivedSubprojects())
continue
relpath, worktree, gitdir = self.manifest.GetSubprojectPaths(self, path)
remote = RemoteSpec(self.remote.name,
url = url,
review = self.remote.review)
subproject = Project(manifest = self.manifest,
name = name,
remote = remote,
gitdir = gitdir,
worktree = worktree,
relpath = relpath,
revisionExpr = self.revisionExpr,
revisionId = rev,
rebase = self.rebase,
groups = self.groups,
sync_c = self.sync_c,
sync_s = self.sync_s,
parent = self,
is_derived = True)
result.append(subproject)
result.extend(subproject.GetDerivedSubprojects())
return result
## Direct Git Commands ## ## Direct Git Commands ##
def _RemoteFetch(self, name=None, def _RemoteFetch(self, name=None,
@ -1373,6 +1556,16 @@ class Project(object):
is_sha1 = False is_sha1 = False
tag_name = None tag_name = None
def CheckForSha1():
try:
# If the revision (sha or tag) is not present, the following call
# throws an error.
self.bare_git.rev_parse('--verify', '%s^0' % self.revisionExpr)
return True
except GitError:
# There is no such persistent revision. We have to fetch it.
return False
if current_branch_only: if current_branch_only:
if ID_RE.match(self.revisionExpr) is not None: if ID_RE.match(self.revisionExpr) is not None:
is_sha1 = True is_sha1 = True
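
CheckForSha1 above leans on `git rev-parse --verify` exiting non-zero when the object is absent from the local object store. A standalone equivalent, shown only as an illustration (the gitdir argument and the example path and tag are assumptions, not repo API):

  import subprocess

  def have_revision(gitdir, rev):
    """Return True if `rev` resolves to a commit object in `gitdir` (sketch)."""
    cmd = ['git', '--git-dir', gitdir, 'rev-parse', '--verify', '--quiet',
           '%s^0' % rev]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    proc.communicate()
    return proc.returncode == 0

  # have_revision('.repo/projects/foo.git', 'v1.0') stays False until the tag
  # has actually been fetched into that bare repository.
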
@ -1381,14 +1574,10 @@ class Project(object):
tag_name = self.revisionExpr[len(R_TAGS):] tag_name = self.revisionExpr[len(R_TAGS):]
if is_sha1 or tag_name is not None: if is_sha1 or tag_name is not None:
try: if CheckForSha1():
# if revision (sha or tag) is not present then following function
# throws an error.
self.bare_git.rev_parse('--verify', '%s^0' % self.revisionExpr)
return True return True
except GitError: if is_sha1 and (not self.upstream or ID_RE.match(self.upstream)):
# There is no such persistent revision. We have to fetch it. current_branch_only = False
pass
if not name: if not name:
name = self.remote.name name = self.remote.name
@ -1404,33 +1593,33 @@ class Project(object):
packed_refs = os.path.join(self.gitdir, 'packed-refs') packed_refs = os.path.join(self.gitdir, 'packed-refs')
remote = self.GetRemote(name) remote = self.GetRemote(name)
all = self.bare_ref.all all_refs = self.bare_ref.all
ids = set(all.values()) ids = set(all_refs.values())
tmp = set() tmp = set()
for r, id in GitRefs(ref_dir).all.iteritems(): for r, ref_id in GitRefs(ref_dir).all.iteritems():
if r not in all: if r not in all_refs:
if r.startswith(R_TAGS) or remote.WritesTo(r): if r.startswith(R_TAGS) or remote.WritesTo(r):
all[r] = id all_refs[r] = ref_id
ids.add(id) ids.add(ref_id)
continue continue
if id in ids: if ref_id in ids:
continue continue
r = 'refs/_alt/%s' % id r = 'refs/_alt/%s' % ref_id
all[r] = id all_refs[r] = ref_id
ids.add(id) ids.add(ref_id)
tmp.add(r) tmp.add(r)
ref_names = list(all.keys()) ref_names = list(all_refs.keys())
ref_names.sort() ref_names.sort()
tmp_packed = '' tmp_packed = ''
old_packed = '' old_packed = ''
for r in ref_names: for r in ref_names:
line = '%s %s\n' % (all[r], r) line = '%s %s\n' % (all_refs[r], r)
tmp_packed += line tmp_packed += line
if r not in tmp: if r not in tmp:
old_packed += line old_packed += line
@ -1453,7 +1642,7 @@ class Project(object):
cmd.append('--update-head-ok') cmd.append('--update-head-ok')
cmd.append(name) cmd.append(name)
if not current_branch_only or is_sha1: if not current_branch_only:
# Fetch whole repo # Fetch whole repo
cmd.append('--tags') cmd.append('--tags')
cmd.append((u'+refs/heads/*:') + remote.ToLocal('refs/heads/*')) cmd.append((u'+refs/heads/*:') + remote.ToLocal('refs/heads/*'))
@ -1462,15 +1651,23 @@ class Project(object):
cmd.append(tag_name) cmd.append(tag_name)
else: else:
branch = self.revisionExpr branch = self.revisionExpr
if is_sha1:
branch = self.upstream
if branch.startswith(R_HEADS): if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):] branch = branch[len(R_HEADS):]
cmd.append((u'+refs/heads/%s:' % branch) + remote.ToLocal('refs/heads/%s' % branch)) cmd.append((u'+refs/heads/%s:' % branch) + remote.ToLocal('refs/heads/%s' % branch))
ok = False ok = False
for i in range(2): for _i in range(2):
if GitCommand(self, cmd, bare=True, ssh_proxy=ssh_proxy).Wait() == 0: ret = GitCommand(self, cmd, bare=True, ssh_proxy=ssh_proxy).Wait()
if ret == 0:
ok = True ok = True
break break
elif current_branch_only and is_sha1 and ret == 128:
# Exit code 128 means "couldn't find the ref you asked for"; if we're in
# sha1 mode, we just tried syncing from the upstream field, which doesn't
# exist, so abort the optimization attempt and do a full sync.
break
time.sleep(random.randint(30, 45)) time.sleep(random.randint(30, 45))
if initial: if initial:
@ -1480,6 +1677,15 @@ class Project(object):
else: else:
os.remove(packed_refs) os.remove(packed_refs)
self.bare_git.pack_refs('--all', '--prune') self.bare_git.pack_refs('--all', '--prune')
if is_sha1 and current_branch_only and self.upstream:
# We just synced the given upstream branch; verify that we
# got the commit we wanted, otherwise trigger a second fetch
# of all refs.
if not CheckForSha1():
return self._RemoteFetch(name=name, current_branch_only=False,
initial=False, quiet=quiet, alt_dir=alt_dir)
return ok return ok
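
Taken together, the changes in this hunk implement a narrow-then-wide fetch strategy for pinned sha1 revisions. A compact sketch of that control flow, with the git plumbing abstracted into callables (the names are illustrative only, not part of repo):

  def sync_revision(have_sha1, fetch_upstream_branch, fetch_all_refs):
    """have_sha1() -> bool; the fetch_* callables return True on success."""
    if have_sha1():
      return True                       # already present locally, nothing to fetch
    if fetch_upstream_branch() and have_sha1():
      return True                       # the narrow fetch brought in the commit
    # The narrow fetch missed (e.g. the upstream ref was rewound);
    # fall back to fetching every ref.
    return fetch_all_refs() and have_sha1()
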
def _ApplyCloneBundle(self, initial=False, quiet=False): def _ApplyCloneBundle(self, initial=False, quiet=False):
@ -1557,7 +1763,8 @@ class Project(object):
# returned another error with the HTTP error code being 400 or above. # returned another error with the HTTP error code being 400 or above.
# This return code only appears if -f, --fail is used. # This return code only appears if -f, --fail is used.
if not quiet: if not quiet:
print >> sys.stderr, "Server does not provide clone.bundle; ignoring." print("Server does not provide clone.bundle; ignoring.",
file=sys.stderr)
return False return False
if os.path.exists(tmpPath): if os.path.exists(tmpPath):
@ -1692,7 +1899,7 @@ class Project(object):
continue continue
try: try:
os.symlink(os.path.relpath(stock_hook, os.path.dirname(dst)), dst) os.symlink(os.path.relpath(stock_hook, os.path.dirname(dst)), dst)
except OSError, e: except OSError as e:
if e.errno == errno.EPERM: if e.errno == errno.EPERM:
raise GitError('filesystem must support symlinks') raise GitError('filesystem must support symlinks')
else: else:
@ -1755,7 +1962,7 @@ class Project(object):
os.symlink(os.path.relpath(src, os.path.dirname(dst)), dst) os.symlink(os.path.relpath(src, os.path.dirname(dst)), dst)
else: else:
raise GitError('cannot overwrite a local work tree') raise GitError('cannot overwrite a local work tree')
except OSError, e: except OSError as e:
if e.errno == errno.EPERM: if e.errno == errno.EPERM:
raise GitError('filesystem must support symlinks') raise GitError('filesystem must support symlinks')
else: else:
@ -1805,7 +2012,8 @@ class Project(object):
if p.Wait() == 0: if p.Wait() == 0:
out = p.stdout out = p.stdout
if out: if out:
return out[:-1].split("\0") return out[:-1].split('\0') # pylint: disable=W1401
# Backslash is not anomalous
return [] return []
def DiffZ(self, name, *args): def DiffZ(self, name, *args):
@ -1821,7 +2029,7 @@ class Project(object):
out = p.process.stdout.read() out = p.process.stdout.read()
r = {} r = {}
if out: if out:
out = iter(out[:-1].split('\0')) out = iter(out[:-1].split('\0')) # pylint: disable=W1401
while out: while out:
try: try:
info = out.next() info = out.next()
@ -1848,7 +2056,7 @@ class Project(object):
self.level = self.level[1:] self.level = self.level[1:]
info = info[1:].split(' ') info = info[1:].split(' ')
info =_Info(path, *info) info = _Info(path, *info)
if info.status in ('R', 'C'): if info.status in ('R', 'C'):
info.src_path = info.path info.src_path = info.path
info.path = out.next() info.path = out.next()
@ -1862,7 +2070,10 @@ class Project(object):
path = os.path.join(self._project.gitdir, HEAD) path = os.path.join(self._project.gitdir, HEAD)
else: else:
path = os.path.join(self._project.worktree, '.git', HEAD) path = os.path.join(self._project.worktree, '.git', HEAD)
fd = open(path, 'rb') try:
fd = open(path, 'rb')
except IOError:
raise NoManifestException(path)
try: try:
line = fd.read() line = fd.read()
finally: finally:
@ -1938,7 +2149,9 @@ class Project(object):
Since we don't have a 'rev_parse' method defined, the __getattr__ will Since we don't have a 'rev_parse' method defined, the __getattr__ will
run. We'll replace the '_' with a '-' and try to run a git command. run. We'll replace the '_' with a '-' and try to run a git command.
Any other arguments will be passed to the git command. Any other positional arguments will be passed to the git command, and the
following keyword arguments are supported:
config: An optional dict of git config options to be passed with '-c'.
Args: Args:
name: The name of the git command to call. Any '_' characters will name: The name of the git command to call. Any '_' characters will
@ -1948,8 +2161,20 @@ class Project(object):
A callable object that will try to call git with the named command. A callable object that will try to call git with the named command.
""" """
name = name.replace('_', '-') name = name.replace('_', '-')
def runner(*args): def runner(*args, **kwargs):
cmdv = [name] cmdv = []
config = kwargs.pop('config', None)
for k in kwargs:
raise TypeError('%s() got an unexpected keyword argument %r'
% (name, k))
if config is not None:
if not git_require((1, 7, 2)):
raise ValueError('cannot set config on command line for %s()'
% name)
for k, v in config.iteritems():
cmdv.append('-c')
cmdv.append('%s=%s' % (k, v))
cmdv.append(name)
cmdv.extend(args) cmdv.extend(args)
p = GitCommand(self._project, p = GitCommand(self._project,
cmdv, cmdv,
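
The new `config` keyword on the dynamic git runner expands a dict into `git -c key=value` options, guarded by a git 1.7.2 requirement since that is the release that introduced `-c`. A small standalone sketch of just the argument-building step (the function name and sample values are made up for illustration):

  def build_git_cmdv(name, args, config=None):
    """Mimic the runner above: prepend -c options, then the subcommand."""
    cmdv = []
    if config:
      for k, v in config.items():
        cmdv.append('-c')
        cmdv.append('%s=%s' % (k, v))
    cmdv.append(name.replace('_', '-'))
    cmdv.extend(args)
    return cmdv

  print(build_git_cmdv('rev_parse', ['--verify', 'HEAD^0'],
                       config={'core.abbrev': '12'}))
  # ['-c', 'core.abbrev=12', 'rev-parse', '--verify', 'HEAD^0']
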
@ -2009,7 +2234,7 @@ class _Later(object):
self.action() self.action()
out.nl() out.nl()
return True return True
except GitError, e: except GitError:
out.nl() out.nl()
return False return False
@ -2079,7 +2304,6 @@ class MetaProject(Project):
"""A special project housed under .repo. """A special project housed under .repo.
""" """
def __init__(self, manifest, name, gitdir, worktree): def __init__(self, manifest, name, gitdir, worktree):
repodir = manifest.repodir
Project.__init__(self, Project.__init__(self,
manifest = manifest, manifest = manifest,
name = name, name = name,
@ -2131,12 +2355,12 @@ class MetaProject(Project):
if not self.remote or not self.revisionExpr: if not self.remote or not self.revisionExpr:
return False return False
all = self.bare_ref.all all_refs = self.bare_ref.all
revid = self.GetRevisionId(all) revid = self.GetRevisionId(all_refs)
head = self.work_git.GetHead() head = self.work_git.GetHead()
if head.startswith(R_HEADS): if head.startswith(R_HEADS):
try: try:
head = all[head] head = all_refs[head]
except KeyError: except KeyError:
head = None head = None

repo (290 changed lines)

@ -1,9 +1,10 @@
-#!/bin/sh
+#!/usr/bin/env python
 ## repo default configuration
 ##
-REPO_URL='https://gerrit.googlesource.com/git-repo'
-REPO_REV='stable'
+from __future__ import print_function
+REPO_URL = 'https://gerrit.googlesource.com/git-repo'
+REPO_REV = 'stable'
 # Copyright (C) 2008 Google Inc.
 #
@ -19,19 +20,11 @@ REPO_REV='stable'
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
-magic='--calling-python-from-/bin/sh--'
-"""exec" python -E "$0" "$@" """#$magic"
-if __name__ == '__main__':
-  import sys
-  if sys.argv[-1] == '#%s' % magic:
-    del sys.argv[-1]
-  del magic
# increment this whenever we make important changes to this script # increment this whenever we make important changes to this script
VERSION = (1, 17) VERSION = (1, 19)
# increment this if the MAINTAINER_KEYS block is modified # increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1,0) KEYRING_VERSION = (1, 1)
MAINTAINER_KEYS = """ MAINTAINER_KEYS = """
Repo Maintainer <repo@android.kernel.org> Repo Maintainer <repo@android.kernel.org>
@ -74,13 +67,45 @@ HTHs37+/QLMomGEGKZMWi0dShU2J5mNRQu3Hhxl3hHDVbt5CeJBb26aQcQrFz69W
zE3GNvmJosh6leayjtI9P2A6iEkEGBECAAkFAkj3uiACGwwACgkQFlMNXpIPXGWp zE3GNvmJosh6leayjtI9P2A6iEkEGBECAAkFAkj3uiACGwwACgkQFlMNXpIPXGWp
TACbBS+Up3RpfYVfd63c1cDdlru13pQAn3NQy/SN858MkxN+zym86UBgOad2 TACbBS+Up3RpfYVfd63c1cDdlru13pQAn3NQy/SN858MkxN+zym86UBgOad2
=CMiZ =CMiZ
-----END PGP PUBLIC KEY BLOCK-----
Conley Owens <cco3@android.com>
-----BEGIN PGP PUBLIC KEY BLOCK-----
Version: GnuPG v1.4.11 (GNU/Linux)
mQENBFBiLPwBCACvISTASOgFXwADw2GYRH2I2z9RvYkYoZ6ThTTNlMXbbYYKO2Wo
a9LQDNW0TbCEekg5UKk0FD13XOdWaqUt4Gtuvq9c43GRSjMO6NXH+0BjcQ8vUtY2
/W4CYUevwdo4nQ1+1zsOCu1XYe/CReXq0fdugv3hgmRmh3sz1soo37Q44W2frxxg
U7Rz3Da4FjgAL0RQ8qndD+LwRHXTY7H7wYM8V/3cYFZV7pSodd75q3MAXYQLf0ZV
QR1XATu5l1QnXrxgHvz7MmDwb1D+jX3YPKnZveaukigQ6hDHdiVcePBiGXmk8LZC
2jQkdXeF7Su1ZYpr2nnEHLJ6vOLcCpPGb8gDABEBAAG0H0NvbmxleSBPd2VucyA8
Y2NvM0BhbmRyb2lkLmNvbT6JATgEEwECACIFAlBiLPwCGwMGCwkIBwMCBhUIAgkK
CwQWAgMBAh4BAheAAAoJEBkmlFUziHGkHVkH/2Hks2Cif5i2xPtv2IFZcjL42joU
T7lO5XFqUYS9ZNHpGa/V0eiPt7rHoO16glR83NZtwlrq2cSN89i9HfOhMYV/qLu8
fLCHcV2muw+yCB5s5bxnI5UkToiNZyBNqFkcOt/Kbj9Hpy68A1kmc6myVEaUYebq
2Chx/f3xuEthan099t746v1K+/6SvQGDNctHuaMr9cWdxZtHjdRf31SQRc99Phe5
w+ZGR/ebxNDKRK9mKgZT8wVFHlXerJsRqWIqtx1fsW1UgLgbpcpe2MChm6B5wTu0
s1ltzox3l4q71FyRRPUJxXyvGkDLZWpK7EpiHSCOYq/KP3HkKeXU3xqHpcG5AQ0E
UGIs/AEIAKzO/7lO9cB6dshmZYo8Vy/b7aGicThE+ChcDSfhvyOXVdEM2GKAjsR+
rlBWbTFX3It301p2HwZPFEi9nEvJxVlqqBiW0bPmNMkDRR55l2vbWg35wwkg6RyE
Bc5/TQjhXI2w8IvlimoGoUff4t3JmMOnWrnKSvL+5iuRj12p9WmanCHzw3Ee7ztf
/aU/q+FTpr3DLerb6S8xbv86ySgnJT6o5CyL2DCWRtnYQyGVi0ZmLzEouAYiO0hs
z0AAu28Mj+12g2WwePRz6gfM9rHtI37ylYW3oT/9M9mO9ei/Bc/1D7Dz6qNV+0vg
uSVJxM2Bl6GalHPZLhHntFEdIA6EdoUAEQEAAYkBHwQYAQIACQUCUGIs/AIbDAAK
CRAZJpRVM4hxpNfkB/0W/hP5WK/NETXBlWXXW7JPaWO2c5kGwD0lnj5RRmridyo1
vbm5PdM91jOsDQYqRu6YOoYBnDnEhB2wL2bPh34HWwwrA+LwB8hlcAV2z1bdwyfl
3R823fReKN3QcvLHzmvZPrF4Rk97M9UIyKS0RtnfTWykRgDWHIsrtQPoNwsXrWoT
9LrM2v+1+9mp3vuXnE473/NHxmiWEQH9Ez+O/mOxQ7rSOlqGRiKq/IBZCfioJOtV
fTQeIu/yASZnsLBqr6SJEGwYBoWcyjG++k4fyw8ocOAo4uGDYbxgN7yYfNQ0OH7o
V6pfUgqKLWa/aK7/N1ZHnPdFLD8Xt0Dmy4BPwrKC
=O7am
-----END PGP PUBLIC KEY BLOCK----- -----END PGP PUBLIC KEY BLOCK-----
""" """
GIT = 'git' # our git command GIT = 'git' # our git command
MIN_GIT_VERSION = (1, 5, 4) # minimum supported git version MIN_GIT_VERSION = (1, 7, 2) # minimum supported git version
repodir = '.repo' # name of repo's private directory repodir = '.repo' # name of repo's private directory
S_repo = 'repo' # special repo reposiory S_repo = 'repo' # special repo repository
S_manifests = 'manifests' # special manifest repository S_manifests = 'manifests' # special manifest repository
REPO_MAIN = S_repo + '/main.py' # main script REPO_MAIN = S_repo + '/main.py' # main script
@ -88,10 +113,21 @@ REPO_MAIN = S_repo + '/main.py' # main script
import optparse import optparse
import os import os
import re import re
import readline import stat
import subprocess import subprocess
import sys import sys
import urllib2 try:
import urllib2
except ImportError:
# For python3
import urllib.request
import urllib.error
else:
# For python2
import imp
urllib = imp.new_module('urllib')
urllib.request = urllib2
urllib.error = urllib2
home_dot_repo = os.path.expanduser('~/.repoconfig') home_dot_repo = os.path.expanduser('~/.repoconfig')
gpg_dir = os.path.join(home_dot_repo, 'gnupg') gpg_dir = os.path.join(home_dot_repo, 'gnupg')
@ -118,7 +154,8 @@ group.add_option('-m', '--manifest-name',
help='initial manifest file', metavar='NAME.xml') help='initial manifest file', metavar='NAME.xml')
group.add_option('--mirror', group.add_option('--mirror',
dest='mirror', action='store_true', dest='mirror', action='store_true',
help='mirror the forrest') help='create a replica of the remote repositories '
'rather than a client working directory')
group.add_option('--reference', group.add_option('--reference',
dest='reference', dest='reference',
help='location of mirror directory', metavar='DIR') help='location of mirror directory', metavar='DIR')
@ -131,7 +168,7 @@ group.add_option('-g', '--groups',
metavar='GROUP') metavar='GROUP')
group.add_option('-p', '--platform', group.add_option('-p', '--platform',
dest='platform', default="auto", dest='platform', default="auto",
help='restrict manifest projects to ones with a specified' help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]', 'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM') metavar='PLATFORM')
@ -180,25 +217,24 @@ def _Init(args):
if branch.startswith('refs/heads/'): if branch.startswith('refs/heads/'):
branch = branch[len('refs/heads/'):] branch = branch[len('refs/heads/'):]
if branch.startswith('refs/'): if branch.startswith('refs/'):
print >>sys.stderr, "fatal: invalid branch name '%s'" % branch print("fatal: invalid branch name '%s'" % branch, file=sys.stderr)
raise CloneFailure() raise CloneFailure()
if not os.path.isdir(repodir): if not os.path.isdir(repodir):
try: try:
os.mkdir(repodir) os.mkdir(repodir)
except OSError, e: except OSError as e:
print >>sys.stderr, \ print('fatal: cannot make %s directory: %s'
'fatal: cannot make %s directory: %s' % ( % (repodir, e.strerror), file=sys.stderr)
repodir, e.strerror) # Don't raise CloneFailure; that would delete the
# Don't faise CloneFailure; that would delete the
# name. Instead exit immediately. # name. Instead exit immediately.
# #
sys.exit(1) sys.exit(1)
_CheckGitVersion() _CheckGitVersion()
try: try:
if _NeedSetupGnuPG(): if NeedSetupGnuPG():
can_verify = _SetupGnuPG(opt.quiet) can_verify = SetupGnuPG(opt.quiet)
else: else:
can_verify = True can_verify = True
@ -213,8 +249,8 @@ def _Init(args):
_Checkout(dst, branch, rev, opt.quiet) _Checkout(dst, branch, rev, opt.quiet)
except CloneFailure: except CloneFailure:
if opt.quiet: if opt.quiet:
print >>sys.stderr, \ print('fatal: repo init failed; run without --quiet to see why',
'fatal: repo init failed; run without --quiet to see why' file=sys.stderr)
raise raise
@ -222,13 +258,13 @@ def _CheckGitVersion():
cmd = [GIT, '--version'] cmd = [GIT, '--version']
try: try:
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE) proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
except OSError, e: except OSError as e:
print >>sys.stderr print(file=sys.stderr)
print >>sys.stderr, "fatal: '%s' is not available" % GIT print("fatal: '%s' is not available" % GIT, file=sys.stderr)
print >>sys.stderr, 'fatal: %s' % e print('fatal: %s' % e, file=sys.stderr)
print >>sys.stderr print(file=sys.stderr)
print >>sys.stderr, 'Please make sure %s is installed'\ print('Please make sure %s is installed and in your path.' % GIT,
' and in your path.' % GIT file=sys.stderr)
raise CloneFailure() raise CloneFailure()
ver_str = proc.stdout.read().strip() ver_str = proc.stdout.read().strip()
@ -236,18 +272,18 @@ def _CheckGitVersion():
proc.wait() proc.wait()
if not ver_str.startswith('git version '): if not ver_str.startswith('git version '):
print >>sys.stderr, 'error: "%s" unsupported' % ver_str print('error: "%s" unsupported' % ver_str, file=sys.stderr)
raise CloneFailure() raise CloneFailure()
ver_str = ver_str[len('git version '):].strip() ver_str = ver_str[len('git version '):].strip()
ver_act = tuple(map(lambda x: int(x), ver_str.split('.')[0:3])) ver_act = tuple(map(int, ver_str.split('.')[0:3]))
if ver_act < MIN_GIT_VERSION: if ver_act < MIN_GIT_VERSION:
need = '.'.join(map(lambda x: str(x), MIN_GIT_VERSION)) need = '.'.join(map(str, MIN_GIT_VERSION))
print >>sys.stderr, 'fatal: git %s or later required' % need print('fatal: git %s or later required' % need, file=sys.stderr)
raise CloneFailure() raise CloneFailure()
def _NeedSetupGnuPG(): def NeedSetupGnuPG():
if not os.path.isdir(home_dot_repo): if not os.path.isdir(home_dot_repo):
return True return True
@ -259,29 +295,27 @@ def _NeedSetupGnuPG():
if not kv: if not kv:
return True return True
kv = tuple(map(lambda x: int(x), kv.split('.'))) kv = tuple(map(int, kv.split('.')))
if kv < KEYRING_VERSION: if kv < KEYRING_VERSION:
return True return True
return False return False
def _SetupGnuPG(quiet): def SetupGnuPG(quiet):
if not os.path.isdir(home_dot_repo): if not os.path.isdir(home_dot_repo):
try: try:
os.mkdir(home_dot_repo) os.mkdir(home_dot_repo)
except OSError, e: except OSError as e:
print >>sys.stderr, \ print('fatal: cannot make %s directory: %s'
'fatal: cannot make %s directory: %s' % ( % (home_dot_repo, e.strerror), file=sys.stderr)
home_dot_repo, e.strerror)
sys.exit(1) sys.exit(1)
if not os.path.isdir(gpg_dir): if not os.path.isdir(gpg_dir):
try: try:
os.mkdir(gpg_dir, 0700) os.mkdir(gpg_dir, stat.S_IRWXU)
except OSError, e: except OSError as e:
print >>sys.stderr, \ print('fatal: cannot make %s directory: %s' % (gpg_dir, e.strerror),
'fatal: cannot make %s directory: %s' % ( file=sys.stderr)
gpg_dir, e.strerror)
sys.exit(1) sys.exit(1)
env = os.environ.copy() env = os.environ.copy()
@ -292,23 +326,23 @@ def _SetupGnuPG(quiet):
proc = subprocess.Popen(cmd, proc = subprocess.Popen(cmd,
env = env, env = env,
stdin = subprocess.PIPE) stdin = subprocess.PIPE)
except OSError, e: except OSError as e:
if not quiet: if not quiet:
print >>sys.stderr, 'warning: gpg (GnuPG) is not available.' print('warning: gpg (GnuPG) is not available.', file=sys.stderr)
print >>sys.stderr, 'warning: Installing it is strongly encouraged.' print('warning: Installing it is strongly encouraged.', file=sys.stderr)
print >>sys.stderr print(file=sys.stderr)
return False return False
proc.stdin.write(MAINTAINER_KEYS) proc.stdin.write(MAINTAINER_KEYS)
proc.stdin.close() proc.stdin.close()
if proc.wait() != 0: if proc.wait() != 0:
print >>sys.stderr, 'fatal: registering repo maintainer keys failed' print('fatal: registering repo maintainer keys failed', file=sys.stderr)
sys.exit(1) sys.exit(1)
print print()
fd = open(os.path.join(home_dot_repo, 'keyring-version'), 'w') fd = open(os.path.join(home_dot_repo, 'keyring-version'), 'w')
fd.write('.'.join(map(lambda x: str(x), KEYRING_VERSION)) + '\n') fd.write('.'.join(map(str, KEYRING_VERSION)) + '\n')
fd.close() fd.close()
return True return True
@ -324,7 +358,7 @@ def _SetConfig(local, name, value):
def _InitHttp(): def _InitHttp():
handlers = [] handlers = []
mgr = urllib2.HTTPPasswordMgrWithDefaultRealm() mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
try: try:
import netrc import netrc
n = netrc.netrc() n = netrc.netrc()
@ -334,20 +368,20 @@ def _InitHttp():
mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2]) mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2])
except: except:
pass pass
handlers.append(urllib2.HTTPBasicAuthHandler(mgr)) handlers.append(urllib.request.HTTPBasicAuthHandler(mgr))
handlers.append(urllib2.HTTPDigestAuthHandler(mgr)) handlers.append(urllib.request.HTTPDigestAuthHandler(mgr))
if 'http_proxy' in os.environ: if 'http_proxy' in os.environ:
url = os.environ['http_proxy'] url = os.environ['http_proxy']
handlers.append(urllib2.ProxyHandler({'http': url, 'https': url})) handlers.append(urllib.request.ProxyHandler({'http': url, 'https': url}))
if 'REPO_CURL_VERBOSE' in os.environ: if 'REPO_CURL_VERBOSE' in os.environ:
handlers.append(urllib2.HTTPHandler(debuglevel=1)) handlers.append(urllib.request.HTTPHandler(debuglevel=1))
handlers.append(urllib2.HTTPSHandler(debuglevel=1)) handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
urllib2.install_opener(urllib2.build_opener(*handlers)) urllib.request.install_opener(urllib.request.build_opener(*handlers))
def _Fetch(url, local, src, quiet): def _Fetch(url, local, src, quiet):
if not quiet: if not quiet:
print >>sys.stderr, 'Get %s' % url print('Get %s' % url, file=sys.stderr)
cmd = [GIT, 'fetch'] cmd = [GIT, 'fetch']
if quiet: if quiet:
@ -392,20 +426,20 @@ def _DownloadBundle(url, local, quiet):
dest = open(os.path.join(local, '.git', 'clone.bundle'), 'w+b') dest = open(os.path.join(local, '.git', 'clone.bundle'), 'w+b')
try: try:
try: try:
r = urllib2.urlopen(url) r = urllib.request.urlopen(url)
except urllib2.HTTPError, e: except urllib.error.HTTPError as e:
if e.code == 404: if e.code == 404:
return False return False
print >>sys.stderr, 'fatal: Cannot get %s' % url print('fatal: Cannot get %s' % url, file=sys.stderr)
print >>sys.stderr, 'fatal: HTTP error %s' % e.code print('fatal: HTTP error %s' % e.code, file=sys.stderr)
raise CloneFailure() raise CloneFailure()
except urllib2.URLError, e: except urllib.error.URLError as e:
print >>sys.stderr, 'fatal: Cannot get %s' % url print('fatal: Cannot get %s' % url, file=sys.stderr)
print >>sys.stderr, 'fatal: error %s' % e.reason print('fatal: error %s' % e.reason, file=sys.stderr)
raise CloneFailure() raise CloneFailure()
try: try:
if not quiet: if not quiet:
print >>sys.stderr, 'Get %s' % url print('Get %s' % url, file=sys.stderr)
while True: while True:
buf = r.read(8192) buf = r.read(8192)
if buf == '': if buf == '':
@ -428,25 +462,24 @@ def _Clone(url, local, quiet):
""" """
try: try:
os.mkdir(local) os.mkdir(local)
except OSError, e: except OSError as e:
print >>sys.stderr, \ print('fatal: cannot make %s directory: %s' % (local, e.strerror),
'fatal: cannot make %s directory: %s' \ file=sys.stderr)
% (local, e.strerror)
raise CloneFailure() raise CloneFailure()
cmd = [GIT, 'init', '--quiet'] cmd = [GIT, 'init', '--quiet']
try: try:
proc = subprocess.Popen(cmd, cwd = local) proc = subprocess.Popen(cmd, cwd = local)
except OSError, e: except OSError as e:
print >>sys.stderr print(file=sys.stderr)
print >>sys.stderr, "fatal: '%s' is not available" % GIT print("fatal: '%s' is not available" % GIT, file=sys.stderr)
print >>sys.stderr, 'fatal: %s' % e print('fatal: %s' % e, file=sys.stderr)
print >>sys.stderr print(file=sys.stderr)
print >>sys.stderr, 'Please make sure %s is installed'\ print('Please make sure %s is installed and in your path.' % GIT,
' and in your path.' % GIT file=sys.stderr)
raise CloneFailure() raise CloneFailure()
if proc.wait() != 0: if proc.wait() != 0:
print >>sys.stderr, 'fatal: could not create %s' % local print('fatal: could not create %s' % local, file=sys.stderr)
raise CloneFailure() raise CloneFailure()
_InitHttp() _InitHttp()
@ -474,21 +507,18 @@ def _Verify(cwd, branch, quiet):
proc.stderr.close() proc.stderr.close()
if proc.wait() != 0 or not cur: if proc.wait() != 0 or not cur:
print >>sys.stderr print(file=sys.stderr)
print >>sys.stderr,\ print("fatal: branch '%s' has not been signed" % branch, file=sys.stderr)
"fatal: branch '%s' has not been signed" \
% branch
raise CloneFailure() raise CloneFailure()
m = re.compile(r'^(.*)-[0-9]{1,}-g[0-9a-f]{1,}$').match(cur) m = re.compile(r'^(.*)-[0-9]{1,}-g[0-9a-f]{1,}$').match(cur)
if m: if m:
cur = m.group(1) cur = m.group(1)
if not quiet: if not quiet:
print >>sys.stderr print(file=sys.stderr)
print >>sys.stderr, \ print("info: Ignoring branch '%s'; using tagged release '%s'"
"info: Ignoring branch '%s'; using tagged release '%s'" \ % (branch, cur), file=sys.stderr)
% (branch, cur) print(file=sys.stderr)
print >>sys.stderr
env = os.environ.copy() env = os.environ.copy()
env['GNUPGHOME'] = gpg_dir.encode() env['GNUPGHOME'] = gpg_dir.encode()
@ -506,10 +536,10 @@ def _Verify(cwd, branch, quiet):
proc.stderr.close() proc.stderr.close()
if proc.wait() != 0: if proc.wait() != 0:
print >>sys.stderr print(file=sys.stderr)
print >>sys.stderr, out print(out, file=sys.stderr)
print >>sys.stderr, err print(err, file=sys.stderr)
print >>sys.stderr print(file=sys.stderr)
raise CloneFailure() raise CloneFailure()
return '%s^0' % cur return '%s^0' % cur
@ -539,19 +569,19 @@ def _Checkout(cwd, branch, rev, quiet):
def _FindRepo(): def _FindRepo():
"""Look for a repo installation, starting at the current directory. """Look for a repo installation, starting at the current directory.
""" """
dir = os.getcwd() curdir = os.getcwd()
repo = None repo = None
olddir = None olddir = None
while dir != '/' \ while curdir != '/' \
and dir != olddir \ and curdir != olddir \
and not repo: and not repo:
repo = os.path.join(dir, repodir, REPO_MAIN) repo = os.path.join(curdir, repodir, REPO_MAIN)
if not os.path.isfile(repo): if not os.path.isfile(repo):
repo = None repo = None
olddir = dir olddir = curdir
dir = os.path.dirname(dir) curdir = os.path.dirname(curdir)
return (repo, os.path.join(dir, repodir)) return (repo, os.path.join(curdir, repodir))
class _Options: class _Options:
@ -563,7 +593,7 @@ def _ParseArguments(args):
opt = _Options() opt = _Options()
arg = [] arg = []
for i in xrange(0, len(args)): for i in range(len(args)):
a = args[i] a = args[i]
if a == '-h' or a == '--help': if a == '-h' or a == '--help':
opt.help = True opt.help = True
@ -576,7 +606,7 @@ def _ParseArguments(args):
def _Usage(): def _Usage():
print >>sys.stderr,\ print(
"""usage: repo COMMAND [ARGS] """usage: repo COMMAND [ARGS]
repo is not yet installed. Use "repo init" to install it here. repo is not yet installed. Use "repo init" to install it here.
@ -587,7 +617,7 @@ The most commonly used repo commands are:
help Display detailed help on a command help Display detailed help on a command
For access to the full online help, install repo ("repo init"). For access to the full online help, install repo ("repo init").
""" """, file=sys.stderr)
sys.exit(1) sys.exit(1)
@ -597,25 +627,23 @@ def _Help(args):
init_optparse.print_help() init_optparse.print_help()
sys.exit(0) sys.exit(0)
else: else:
print >>sys.stderr,\ print("error: '%s' is not a bootstrap command.\n"
"error: '%s' is not a bootstrap command.\n"\ ' For access to online help, install repo ("repo init").'
' For access to online help, install repo ("repo init").'\ % args[0], file=sys.stderr)
% args[0]
else: else:
_Usage() _Usage()
sys.exit(1) sys.exit(1)
def _NotInstalled(): def _NotInstalled():
print >>sys.stderr,\ print('error: repo is not installed. Use "repo init" to install it here.',
'error: repo is not installed. Use "repo init" to install it here.' file=sys.stderr)
sys.exit(1) sys.exit(1)
def _NoCommands(cmd): def _NoCommands(cmd):
print >>sys.stderr,\ print("""error: command '%s' requires repo to be installed first.
"""error: command '%s' requires repo to be installed first. Use "repo init" to install it here.""" % cmd, file=sys.stderr)
Use "repo init" to install it here.""" % cmd
sys.exit(1) sys.exit(1)
@ -652,18 +680,18 @@ def _SetDefaultsTo(gitdir):
proc.stderr.close() proc.stderr.close()
if proc.wait() != 0: if proc.wait() != 0:
print >>sys.stderr, 'fatal: %s has no current branch' % gitdir print('fatal: %s has no current branch' % gitdir, file=sys.stderr)
sys.exit(1) sys.exit(1)
def main(orig_args): def main(orig_args):
main, dir = _FindRepo() repo_main, rel_repo_dir = _FindRepo()
cmd, opt, args = _ParseArguments(orig_args) cmd, opt, args = _ParseArguments(orig_args)
wrapper_path = os.path.abspath(__file__) wrapper_path = os.path.abspath(__file__)
my_main, my_git = _RunSelf(wrapper_path) my_main, my_git = _RunSelf(wrapper_path)
if not main: if not repo_main:
if opt.help: if opt.help:
_Usage() _Usage()
if cmd == 'help': if cmd == 'help':
@ -683,26 +711,26 @@ def main(orig_args):
os.rmdir(os.path.join(root, name)) os.rmdir(os.path.join(root, name))
os.rmdir(repodir) os.rmdir(repodir)
sys.exit(1) sys.exit(1)
main, dir = _FindRepo() repo_main, rel_repo_dir = _FindRepo()
else: else:
_NoCommands(cmd) _NoCommands(cmd)
if my_main: if my_main:
main = my_main repo_main = my_main
ver_str = '.'.join(map(lambda x: str(x), VERSION)) ver_str = '.'.join(map(str, VERSION))
me = [main, me = [repo_main,
'--repo-dir=%s' % dir, '--repo-dir=%s' % rel_repo_dir,
'--wrapper-version=%s' % ver_str, '--wrapper-version=%s' % ver_str,
'--wrapper-path=%s' % wrapper_path, '--wrapper-path=%s' % wrapper_path,
'--'] '--']
me.extend(orig_args) me.extend(orig_args)
me.extend(extra_args) me.extend(extra_args)
try: try:
os.execv(main, me) os.execv(repo_main, me)
except OSError, e: except OSError as e:
print >>sys.stderr, "fatal: unable to start %s" % main print("fatal: unable to start %s" % repo_main, file=sys.stderr)
print >>sys.stderr, "fatal: %s" % e print("fatal: %s" % e, file=sys.stderr)
sys.exit(148) sys.exit(148)


@ -15,7 +15,7 @@
import os import os
all = {} all_commands = {}
my_dir = os.path.dirname(__file__) my_dir = os.path.dirname(__file__)
for py in os.listdir(my_dir): for py in os.listdir(my_dir):
@ -43,7 +43,7 @@ for py in os.listdir(my_dir):
name = name.replace('_', '-') name = name.replace('_', '-')
cmd.NAME = name cmd.NAME = name
all[name] = cmd all_commands[name] = cmd
if 'help' in all: if 'help' in all_commands:
all['help'].commands = all all_commands['help'].commands = all_commands


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import sys import sys
from command import Command from command import Command
from git_command import git from git_command import git
@ -36,16 +37,16 @@ It is equivalent to "git branch -D <branchname>".
nb = args[0] nb = args[0]
if not git.check_ref_format('heads/%s' % nb): if not git.check_ref_format('heads/%s' % nb):
print >>sys.stderr, "error: '%s' is not a valid name" % nb print("error: '%s' is not a valid name" % nb, file=sys.stderr)
sys.exit(1) sys.exit(1)
nb = args[0] nb = args[0]
err = [] err = []
success = [] success = []
all = self.GetProjects(args[1:]) all_projects = self.GetProjects(args[1:])
pm = Progress('Abandon %s' % nb, len(all)) pm = Progress('Abandon %s' % nb, len(all_projects))
for project in all: for project in all_projects:
pm.update() pm.update()
status = project.AbandonBranch(nb) status = project.AbandonBranch(nb)
@ -58,13 +59,13 @@ It is equivalent to "git branch -D <branchname>".
if err: if err:
for p in err: for p in err:
print >>sys.stderr,\ print("error: %s/: cannot abandon %s" % (p.relpath, nb),
"error: %s/: cannot abandon %s" \ file=sys.stderr)
% (p.relpath, nb)
sys.exit(1) sys.exit(1)
elif not success: elif not success:
print >>sys.stderr, 'error: no project has branch %s' % nb print('error: no project has branch %s' % nb, file=sys.stderr)
sys.exit(1) sys.exit(1)
else: else:
print >>sys.stderr, 'Abandoned in %d project(s):\n %s' % ( print('Abandoned in %d project(s):\n %s'
len(success), '\n '.join(p.relpath for p in success)) % (len(success), '\n '.join(p.relpath for p in success)),
file=sys.stderr)


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import sys import sys
from color import Coloring from color import Coloring
from command import Command from command import Command
@ -93,21 +94,21 @@ is shown, then the branch appears in all projects.
def Execute(self, opt, args): def Execute(self, opt, args):
projects = self.GetProjects(args) projects = self.GetProjects(args)
out = BranchColoring(self.manifest.manifestProject.config) out = BranchColoring(self.manifest.manifestProject.config)
all = {} all_branches = {}
project_cnt = len(projects) project_cnt = len(projects)
for project in projects: for project in projects:
for name, b in project.GetBranches().iteritems(): for name, b in project.GetBranches().iteritems():
b.project = project b.project = project
if name not in all: if name not in all_branches:
all[name] = BranchInfo(name) all_branches[name] = BranchInfo(name)
all[name].add(b) all_branches[name].add(b)
names = all.keys() names = all_branches.keys()
names.sort() names.sort()
if not names: if not names:
print >>sys.stderr, ' (no branches)' print(' (no branches)', file=sys.stderr)
return return
width = 25 width = 25
@ -116,7 +117,7 @@ is shown, then the branch appears in all projects.
width = len(name) width = len(name)
for name in names: for name in names:
i = all[name] i = all_branches[name]
in_cnt = len(i.projects) in_cnt = len(i.projects)
if i.IsCurrent: if i.IsCurrent:
@ -140,12 +141,12 @@ is shown, then the branch appears in all projects.
fmt = out.write fmt = out.write
paths = [] paths = []
if in_cnt < project_cnt - in_cnt: if in_cnt < project_cnt - in_cnt:
type = 'in' in_type = 'in'
for b in i.projects: for b in i.projects:
paths.append(b.project.relpath) paths.append(b.project.relpath)
else: else:
fmt = out.notinproject fmt = out.notinproject
type = 'not in' in_type = 'not in'
have = set() have = set()
for b in i.projects: for b in i.projects:
have.add(b.project) have.add(b.project)
@ -153,11 +154,11 @@ is shown, then the branch appears in all projects.
if not p in have: if not p in have:
paths.append(p.relpath) paths.append(p.relpath)
s = ' %s %s' % (type, ', '.join(paths)) s = ' %s %s' % (in_type, ', '.join(paths))
if width + 7 + len(s) < 80: if width + 7 + len(s) < 80:
fmt(s) fmt(s)
else: else:
fmt(' %s:' % type) fmt(' %s:' % in_type)
for p in paths: for p in paths:
out.nl() out.nl()
fmt(width*' ' + ' %s' % p) fmt(width*' ' + ' %s' % p)


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import sys import sys
from command import Command from command import Command
from progress import Progress from progress import Progress
@ -39,10 +40,10 @@ The command is equivalent to:
nb = args[0] nb = args[0]
err = [] err = []
success = [] success = []
all = self.GetProjects(args[1:]) all_projects = self.GetProjects(args[1:])
pm = Progress('Checkout %s' % nb, len(all)) pm = Progress('Checkout %s' % nb, len(all_projects))
for project in all: for project in all_projects:
pm.update() pm.update()
status = project.CheckoutBranch(nb) status = project.CheckoutBranch(nb)
@ -55,10 +56,9 @@ The command is equivalent to:
if err: if err:
for p in err: for p in err:
print >>sys.stderr,\ print("error: %s/: cannot checkout %s" % (p.relpath, nb),
"error: %s/: cannot checkout %s" \ file=sys.stderr)
% (p.relpath, nb)
sys.exit(1) sys.exit(1)
elif not success: elif not success:
print >>sys.stderr, 'error: no project has branch %s' % nb print('error: no project has branch %s' % nb, file=sys.stderr)
sys.exit(1) sys.exit(1)


@ -13,7 +13,9 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import sys, re from __future__ import print_function
import re
import sys
from command import Command from command import Command
from git_command import GitCommand from git_command import GitCommand
@ -45,13 +47,13 @@ change id will be added.
capture_stdout = True, capture_stdout = True,
capture_stderr = True) capture_stderr = True)
if p.Wait() != 0: if p.Wait() != 0:
print >>sys.stderr, p.stderr print(p.stderr, file=sys.stderr)
sys.exit(1) sys.exit(1)
sha1 = p.stdout.strip() sha1 = p.stdout.strip()
p = GitCommand(None, ['cat-file', 'commit', sha1], capture_stdout=True) p = GitCommand(None, ['cat-file', 'commit', sha1], capture_stdout=True)
if p.Wait() != 0: if p.Wait() != 0:
print >>sys.stderr, "error: Failed to retrieve old commit message" print("error: Failed to retrieve old commit message", file=sys.stderr)
sys.exit(1) sys.exit(1)
old_msg = self._StripHeader(p.stdout) old_msg = self._StripHeader(p.stdout)
@ -61,8 +63,8 @@ change id will be added.
capture_stderr = True) capture_stderr = True)
status = p.Wait() status = p.Wait()
print >>sys.stdout, p.stdout print(p.stdout, file=sys.stdout)
print >>sys.stderr, p.stderr print(p.stderr, file=sys.stderr)
if status == 0: if status == 0:
# The cherry-pick was applied correctly. We just need to edit the # The cherry-pick was applied correctly. We just need to edit the
@ -75,16 +77,14 @@ change id will be added.
capture_stderr = True) capture_stderr = True)
p.stdin.write(new_msg) p.stdin.write(new_msg)
if p.Wait() != 0: if p.Wait() != 0:
print >>sys.stderr, "error: Failed to update commit message" print("error: Failed to update commit message", file=sys.stderr)
sys.exit(1) sys.exit(1)
else: else:
print >>sys.stderr, """\ print('NOTE: When committing (please see above) and editing the commit'
NOTE: When committing (please see above) and editing the commit message, 'message, please remove the old Change-Id-line and add:')
please remove the old Change-Id-line and add: print(self._GetReference(sha1), file=stderr)
""" print(file=stderr)
print >>sys.stderr, self._GetReference(sha1)
print >>sys.stderr
def _IsChangeId(self, line): def _IsChangeId(self, line):
return CHANGE_ID_RE.match(line) return CHANGE_ID_RE.match(line)


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import re import re
import sys import sys
@ -32,13 +33,13 @@ makes it available in your project's local working directory.
""" """
def _Options(self, p): def _Options(self, p):
p.add_option('-c','--cherry-pick', p.add_option('-c', '--cherry-pick',
dest='cherrypick', action='store_true', dest='cherrypick', action='store_true',
help="cherry-pick instead of checkout") help="cherry-pick instead of checkout")
p.add_option('-r','--revert', p.add_option('-r', '--revert',
dest='revert', action='store_true', dest='revert', action='store_true',
help="revert instead of checkout") help="revert instead of checkout")
p.add_option('-f','--ff-only', p.add_option('-f', '--ff-only',
dest='ffonly', action='store_true', dest='ffonly', action='store_true',
help="force fast-forward merge") help="force fast-forward merge")
@ -68,23 +69,23 @@ makes it available in your project's local working directory.
for project, change_id, ps_id in self._ParseChangeIds(args): for project, change_id, ps_id in self._ParseChangeIds(args):
dl = project.DownloadPatchSet(change_id, ps_id) dl = project.DownloadPatchSet(change_id, ps_id)
if not dl: if not dl:
print >>sys.stderr, \ print('[%s] change %d/%d not found'
'[%s] change %d/%d not found' \ % (project.name, change_id, ps_id),
% (project.name, change_id, ps_id) file=sys.stderr)
sys.exit(1) sys.exit(1)
if not opt.revert and not dl.commits: if not opt.revert and not dl.commits:
print >>sys.stderr, \ print('[%s] change %d/%d has already been merged'
'[%s] change %d/%d has already been merged' \ % (project.name, change_id, ps_id),
% (project.name, change_id, ps_id) file=sys.stderr)
continue continue
if len(dl.commits) > 1: if len(dl.commits) > 1:
print >>sys.stderr, \ print('[%s] %d/%d depends on %d unmerged changes:' \
'[%s] %d/%d depends on %d unmerged changes:' \ % (project.name, change_id, ps_id, len(dl.commits)),
% (project.name, change_id, ps_id, len(dl.commits)) file=sys.stderr)
for c in dl.commits: for c in dl.commits:
print >>sys.stderr, ' %s' % (c) print(' %s' % (c), file=sys.stderr)
if opt.cherrypick: if opt.cherrypick:
project._CherryPick(dl.commit) project._CherryPick(dl.commit)
elif opt.revert: elif opt.revert:


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import fcntl import fcntl
import re import re
import os import os
@ -92,6 +93,9 @@ following <command>.
Unless -p is used, stdin, stdout, stderr are inherited from the Unless -p is used, stdin, stdout, stderr are inherited from the
terminal and are not redirected. terminal and are not redirected.
If -e is used, when a command exits unsuccessfully, '%prog' will abort
without iterating through the remaining projects.
""" """
def _Options(self, p): def _Options(self, p):
@ -104,6 +108,9 @@ terminal and are not redirected.
dest='command', dest='command',
action='callback', action='callback',
callback=cmd) callback=cmd)
p.add_option('-e', '--abort-on-errors',
dest='abort_on_errors', action='store_true',
help='Abort if a command exits unsuccessfully')
g = p.add_option_group('Output') g = p.add_option_group('Output')
g.add_option('-p', g.add_option('-p',
@ -141,12 +148,16 @@ terminal and are not redirected.
for cn in cmd[1:]: for cn in cmd[1:]:
if not cn.startswith('-'): if not cn.startswith('-'):
break break
if cn in _CAN_COLOR: else:
cn = None
# pylint: disable=W0631
if cn and cn in _CAN_COLOR:
class ColorCmd(Coloring): class ColorCmd(Coloring):
def __init__(self, config, cmd): def __init__(self, config, cmd):
Coloring.__init__(self, config, cmd) Coloring.__init__(self, config, cmd)
if ColorCmd(self.manifest.manifestProject.config, cn).is_on: if ColorCmd(self.manifest.manifestProject.config, cn).is_on:
cmd.insert(cmd.index(cn) + 1, '--color') cmd.insert(cmd.index(cn) + 1, '--color')
# pylint: enable=W0631
mirror = self.manifest.IsMirror mirror = self.manifest.IsMirror
out = ForallColoring(self.manifest.manifestProject.config) out = ForallColoring(self.manifest.manifestProject.config)
@ -179,7 +190,7 @@ terminal and are not redirected.
if not os.path.exists(cwd): if not os.path.exists(cwd):
if (opt.project_header and opt.verbose) \ if (opt.project_header and opt.verbose) \
or not opt.project_header: or not opt.project_header:
print >>sys.stderr, 'skipping %s/' % project.relpath print('skipping %s/' % project.relpath, file=sys.stderr)
continue continue
if opt.project_header: if opt.project_header:
@ -208,7 +219,6 @@ terminal and are not redirected.
return self.fd.fileno() return self.fd.fileno()
empty = True empty = True
didout = False
errbuf = '' errbuf = ''
p.stdin.close() p.stdin.close()
@ -220,7 +230,7 @@ terminal and are not redirected.
fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK) fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
while s_in: while s_in:
in_ready, out_ready, err_ready = select.select(s_in, [], []) in_ready, _out_ready, _err_ready = select.select(s_in, [], [])
for s in in_ready: for s in in_ready:
buf = s.fd.read(4096) buf = s.fd.read(4096)
if not buf: if not buf:
@ -229,9 +239,7 @@ terminal and are not redirected.
continue continue
if not opt.verbose: if not opt.verbose:
if s.fd == p.stdout: if s.fd != p.stdout:
didout = True
else:
errbuf += buf errbuf += buf
continue continue
@ -253,7 +261,12 @@ terminal and are not redirected.
s.dest.flush() s.dest.flush()
r = p.wait() r = p.wait()
if r != 0 and r != rc: if r != 0:
rc = r if r != rc:
rc = r
if opt.abort_on_errors:
print("error: %s: Aborting due to previous error" % project.relpath,
file=sys.stderr)
sys.exit(r)
if rc != 0: if rc != 0:
sys.exit(rc) sys.exit(rc)
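
The abort logic above keeps the first non-zero exit status and, with -e/--abort-on-errors, stops before visiting the remaining projects. A minimal standalone sketch of those semantics (not the repo implementation; the project directories and command are hypothetical):

from __future__ import print_function
import subprocess
import sys

project_dirs = ['project-a', 'project-b', 'project-c']  # hypothetical worktrees
cmd = ['make', '-j4']                                    # hypothetical command
abort_on_errors = True                                   # what -e turns on

rc = 0
for cwd in project_dirs:
    r = subprocess.call(cmd, cwd=cwd)
    if r != 0:
        if r != rc:
            rc = r  # remember the first failing status
        if abort_on_errors:
            print('error: %s/: Aborting due to previous error' % cwd,
                  file=sys.stderr)
            sys.exit(r)
if rc != 0:
    sys.exit(rc)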


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import sys import sys
from color import Coloring from color import Coloring
from command import PagedCommand from command import PagedCommand
@ -51,7 +52,7 @@ Examples
Look for a line that has '#define' and either 'MAX_PATH' or 'PATH_MAX': Look for a line that has '#define' and either 'MAX_PATH' or 'PATH_MAX':
repo grep -e '#define' --and -\( -e MAX_PATH -e PATH_MAX \) repo grep -e '#define' --and -\\( -e MAX_PATH -e PATH_MAX \\)
Look for a line that has 'NODE' or 'Unexpected' in files that Look for a line that has 'NODE' or 'Unexpected' in files that
contain a line that matches both expressions: contain a line that matches both expressions:
@ -84,7 +85,7 @@ contain a line that matches both expressions:
g.add_option('--cached', g.add_option('--cached',
action='callback', callback=carry, action='callback', callback=carry,
help='Search the index, instead of the work tree') help='Search the index, instead of the work tree')
g.add_option('-r','--revision', g.add_option('-r', '--revision',
dest='revision', action='append', metavar='TREEish', dest='revision', action='append', metavar='TREEish',
help='Search TREEish, instead of the work tree') help='Search TREEish, instead of the work tree')
@ -96,7 +97,7 @@ contain a line that matches both expressions:
g.add_option('-i', '--ignore-case', g.add_option('-i', '--ignore-case',
action='callback', callback=carry, action='callback', callback=carry,
help='Ignore case differences') help='Ignore case differences')
g.add_option('-a','--text', g.add_option('-a', '--text',
action='callback', callback=carry, action='callback', callback=carry,
help="Process binary files as if they were text") help="Process binary files as if they were text")
g.add_option('-I', g.add_option('-I',
@ -125,7 +126,7 @@ contain a line that matches both expressions:
g.add_option('--and', '--or', '--not', g.add_option('--and', '--or', '--not',
action='callback', callback=carry, action='callback', callback=carry,
help='Boolean operators to combine patterns') help='Boolean operators to combine patterns')
g.add_option('-(','-)', g.add_option('-(', '-)',
action='callback', callback=carry, action='callback', callback=carry,
help='Boolean operator grouping') help='Boolean operator grouping')
@ -145,10 +146,10 @@ contain a line that matches both expressions:
action='callback', callback=carry, action='callback', callback=carry,
metavar='CONTEXT', type='str', metavar='CONTEXT', type='str',
help='Show CONTEXT lines after match') help='Show CONTEXT lines after match')
g.add_option('-l','--name-only','--files-with-matches', g.add_option('-l', '--name-only', '--files-with-matches',
action='callback', callback=carry, action='callback', callback=carry,
help='Show only file names containing matching lines') help='Show only file names containing matching lines')
g.add_option('-L','--files-without-match', g.add_option('-L', '--files-without-match',
action='callback', callback=carry, action='callback', callback=carry,
help='Show only file names not containing matching lines') help='Show only file names not containing matching lines')
@ -157,9 +158,9 @@ contain a line that matches both expressions:
out = GrepColoring(self.manifest.manifestProject.config) out = GrepColoring(self.manifest.manifestProject.config)
cmd_argv = ['grep'] cmd_argv = ['grep']
if out.is_on and git_require((1,6,3)): if out.is_on and git_require((1, 6, 3)):
cmd_argv.append('--color') cmd_argv.append('--color')
cmd_argv.extend(getattr(opt,'cmd_argv',[])) cmd_argv.extend(getattr(opt, 'cmd_argv', []))
if '-e' not in cmd_argv: if '-e' not in cmd_argv:
if not args: if not args:
@ -178,8 +179,7 @@ contain a line that matches both expressions:
have_rev = False have_rev = False
if opt.revision: if opt.revision:
if '--cached' in cmd_argv: if '--cached' in cmd_argv:
print >>sys.stderr,\ print('fatal: cannot combine --cached and --revision', file=sys.stderr)
'fatal: cannot combine --cached and --revision'
sys.exit(1) sys.exit(1)
have_rev = True have_rev = True
cmd_argv.extend(opt.revision) cmd_argv.extend(opt.revision)
@ -230,13 +230,13 @@ contain a line that matches both expressions:
out.nl() out.nl()
else: else:
for line in r: for line in r:
print line print(line)
if have_match: if have_match:
sys.exit(0) sys.exit(0)
elif have_rev and bad_rev: elif have_rev and bad_rev:
for r in opt.revision: for r in opt.revision:
print >>sys.stderr, "error: can't search revision %s" % r print("error: can't search revision %s" % r, file=sys.stderr)
sys.exit(1) sys.exit(1)
else: else:
sys.exit(1) sys.exit(1)


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import re import re
import sys import sys
from formatter import AbstractFormatter, DumbWriter from formatter import AbstractFormatter, DumbWriter
@ -31,10 +32,8 @@ Displays detailed usage information about a command.
""" """
def _PrintAllCommands(self): def _PrintAllCommands(self):
print 'usage: repo COMMAND [ARGS]' print('usage: repo COMMAND [ARGS]')
print """ print('The complete list of recognized repo commands are:')
The complete list of recognized repo commands are:
"""
commandNames = self.commands.keys() commandNames = self.commands.keys()
commandNames.sort() commandNames.sort()
@ -49,17 +48,14 @@ The complete list of recognized repo commands are:
summary = command.helpSummary.strip() summary = command.helpSummary.strip()
except AttributeError: except AttributeError:
summary = '' summary = ''
print fmt % (name, summary) print(fmt % (name, summary))
print """ print("See 'repo help <command>' for more information on a"
See 'repo help <command>' for more information on a specific command. 'specific command.')
"""
def _PrintCommonCommands(self): def _PrintCommonCommands(self):
print 'usage: repo COMMAND [ARGS]' print('usage: repo COMMAND [ARGS]')
print """ print('The most commonly used repo commands are:')
The most commonly used repo commands are: commandNames = [name
"""
commandNames = [name
for name in self.commands.keys() for name in self.commands.keys()
if self.commands[name].common] if self.commands[name].common]
commandNames.sort() commandNames.sort()
@ -75,11 +71,10 @@ The most commonly used repo commands are:
summary = command.helpSummary.strip() summary = command.helpSummary.strip()
except AttributeError: except AttributeError:
summary = '' summary = ''
print fmt % (name, summary) print(fmt % (name, summary))
print """ print(
See 'repo help <command>' for more information on a specific command. "See 'repo help <command>' for more information on a specific command.\n"
See 'repo help --all' for a complete list of recognized commands. "See 'repo help --all' for a complete list of recognized commands.")
"""
def _PrintCommandHelp(self, cmd): def _PrintCommandHelp(self, cmd):
class _Out(Coloring): class _Out(Coloring):
@ -120,8 +115,8 @@ See 'repo help --all' for a complete list of recognized commands.
m = asciidoc_hdr.match(para) m = asciidoc_hdr.match(para)
if m: if m:
title = m.group(1) title = m.group(1)
type = m.group(2) section_type = m.group(2)
if type[0] in ('=', '-'): if section_type[0] in ('=', '-'):
p = self.heading p = self.heading
else: else:
def _p(fmt, *args): def _p(fmt, *args):
@ -131,7 +126,7 @@ See 'repo help --all' for a complete list of recognized commands.
p('%s', title) p('%s', title)
self.nl() self.nl()
p('%s', ''.ljust(len(title),type[0])) p('%s', ''.ljust(len(title), section_type[0]))
self.nl() self.nl()
continue continue
@ -162,7 +157,7 @@ See 'repo help --all' for a complete list of recognized commands.
try: try:
cmd = self.commands[name] cmd = self.commands[name]
except KeyError: except KeyError:
print >>sys.stderr, "repo: '%s' is not a repo command." % name print("repo: '%s' is not a repo command." % name, file=sys.stderr)
sys.exit(1) sys.exit(1)
cmd.manifest = self.manifest cmd.manifest = self.manifest

subcmds/info.py (new file, 195 lines)

@ -0,0 +1,195 @@
#
# Copyright (C) 2012 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from command import PagedCommand
from color import Coloring
from error import NoSuchProjectError
from git_refs import R_M
class _Coloring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, "status")
class Info(PagedCommand):
common = True
helpSummary = "Get info on the manifest branch, current branch or unmerged branches"
helpUsage = "%prog [-dl] [-o [-b]] [<project>...]"
def _Options(self, p, show_smart=True):
p.add_option('-d', '--diff',
dest='all', action='store_true',
help="show full info and commit diff including remote branches")
p.add_option('-o', '--overview',
dest='overview', action='store_true',
help='show overview of all local commits')
p.add_option('-b', '--current-branch',
dest="current_branch", action="store_true",
help="consider only checked out branches")
p.add_option('-l', '--local-only',
dest="local", action="store_true",
help="Disable all remote operations")
def Execute(self, opt, args):
self.out = _Coloring(self.manifest.globalConfig)
self.heading = self.out.printer('heading', attr = 'bold')
self.headtext = self.out.printer('headtext', fg = 'yellow')
self.redtext = self.out.printer('redtext', fg = 'red')
self.sha = self.out.printer("sha", fg = 'yellow')
self.text = self.out.printer('text')
self.dimtext = self.out.printer('dimtext', attr = 'dim')
self.opt = opt
mergeBranch = self.manifest.manifestProject.config.GetBranch("default").merge
self.heading("Manifest branch: ")
self.headtext(self.manifest.default.revisionExpr)
self.out.nl()
self.heading("Manifest merge branch: ")
self.headtext(mergeBranch)
self.out.nl()
self.printSeparator()
if not opt.overview:
self.printDiffInfo(args)
else:
self.printCommitOverview(args)
def printSeparator(self):
self.text("----------------------------")
self.out.nl()
def printDiffInfo(self, args):
try:
projs = self.GetProjects(args)
except NoSuchProjectError:
return
for p in projs:
self.heading("Project: ")
self.headtext(p.name)
self.out.nl()
self.heading("Mount path: ")
self.headtext(p.worktree)
self.out.nl()
self.heading("Current revision: ")
self.headtext(p.revisionExpr)
self.out.nl()
localBranches = p.GetBranches().keys()
self.heading("Local Branches: ")
self.redtext(str(len(localBranches)))
if len(localBranches) > 0:
self.text(" [")
self.text(", ".join(localBranches))
self.text("]")
self.out.nl()
if self.opt.all:
self.findRemoteLocalDiff(p)
self.printSeparator()
def findRemoteLocalDiff(self, project):
#Fetch all the latest commits
if not self.opt.local:
project.Sync_NetworkHalf(quiet=True, current_branch_only=True)
logTarget = R_M + self.manifest.default.revisionExpr
bareTmp = project.bare_git._bare
project.bare_git._bare = False
localCommits = project.bare_git.rev_list(
'--abbrev=8',
'--abbrev-commit',
'--pretty=oneline',
logTarget + "..",
'--')
originCommits = project.bare_git.rev_list(
'--abbrev=8',
'--abbrev-commit',
'--pretty=oneline',
".." + logTarget,
'--')
project.bare_git._bare = bareTmp
self.heading("Local Commits: ")
self.redtext(str(len(localCommits)))
self.dimtext(" (on current branch)")
self.out.nl()
for c in localCommits:
split = c.split()
self.sha(split[0] + " ")
self.text("".join(split[1:]))
self.out.nl()
self.printSeparator()
self.heading("Remote Commits: ")
self.redtext(str(len(originCommits)))
self.out.nl()
for c in originCommits:
split = c.split()
self.sha(split[0] + " ")
self.text("".join(split[1:]))
self.out.nl()
def printCommitOverview(self, args):
all_branches = []
for project in self.GetProjects(args):
br = [project.GetUploadableBranch(x)
for x in project.GetBranches().keys()]
br = [x for x in br if x]
if self.opt.current_branch:
br = [x for x in br if x.name == project.CurrentBranch]
all_branches.extend(br)
if not all_branches:
return
self.out.nl()
self.heading('Projects Overview')
project = None
for branch in all_branches:
if project != branch.project:
project = branch.project
self.out.nl()
self.headtext(project.relpath)
self.out.nl()
commits = branch.commits
date = branch.date
self.text('%s %-33s (%2d commit%s, %s)' % (
branch.name == project.CurrentBranch and '*' or ' ',
branch.name,
len(commits),
len(commits) != 1 and 's' or '',
date))
self.out.nl()
for commit in commits:
split = commit.split()
self.text('{0:38}{1} '.format('','-'))
self.sha(split[0] + " ")
self.text("".join(split[1:]))
self.out.nl()
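
The Local/Remote commit counts above rely on git's asymmetric range syntax: '<branch>..' selects commits reachable from HEAD but not from <branch>, and '..<branch>' selects the opposite. A small standalone sketch of the same idea using plain git (the merge-branch ref below is hypothetical, and this is not the repo implementation):

import subprocess

def rev_list(range_spec, cwd='.'):
    # One SHA-1 per line for the commits selected by range_spec.
    out = subprocess.check_output(['git', 'rev-list', range_spec, '--'], cwd=cwd)
    return out.decode().splitlines()

merge_branch = 'refs/remotes/m/master'       # hypothetical manifest merge branch
local_only = rev_list(merge_branch + '..')   # on HEAD, not on the merge branch
remote_only = rev_list('..' + merge_branch)  # on the merge branch, not on HEAD
print('%d local, %d remote' % (len(local_only), len(remote_only)))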


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import os import os
import platform import platform
import re import re
@ -117,18 +118,22 @@ to update the working directory files.
dest='config_name', action="store_true", default=False, dest='config_name', action="store_true", default=False,
help='Always prompt for name/e-mail') help='Always prompt for name/e-mail')
def _RegisteredEnvironmentOptions(self):
return {'REPO_MANIFEST_URL': 'manifest_url',
'REPO_MIRROR_LOCATION': 'reference'}
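
The mapping returned above associates environment variables with option destinations, so 'repo init' can fall back to REPO_MANIFEST_URL and REPO_MIRROR_LOCATION when -u or --reference are not given on the command line. A minimal sketch of how such a mapping could seed optparse defaults (a hypothetical helper, not the repo implementation; an explicit flag still wins):

import os
import optparse

# Mapping in the style of _RegisteredEnvironmentOptions(): env var -> option dest.
ENV_OPTIONS = {'REPO_MANIFEST_URL': 'manifest_url',
               'REPO_MIRROR_LOCATION': 'reference'}

def apply_env_defaults(parser, env_options, env=os.environ):
    for var, dest in env_options.items():
        value = env.get(var)
        if value:
            # Installing the value as the option default means a flag given
            # on the command line still overrides the environment.
            parser.set_default(dest, value)

parser = optparse.OptionParser()
parser.add_option('-u', '--manifest-url', dest='manifest_url')
parser.add_option('--reference', dest='reference')
apply_env_defaults(parser, ENV_OPTIONS)
opts, _args = parser.parse_args(['-u', 'https://example.com/manifest.git'])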
def _SyncManifest(self, opt): def _SyncManifest(self, opt):
m = self.manifest.manifestProject m = self.manifest.manifestProject
is_new = not m.Exists is_new = not m.Exists
if is_new: if is_new:
if not opt.manifest_url: if not opt.manifest_url:
print >>sys.stderr, 'fatal: manifest url (-u) is required.' print('fatal: manifest url (-u) is required.', file=sys.stderr)
sys.exit(1) sys.exit(1)
if not opt.quiet: if not opt.quiet:
print >>sys.stderr, 'Get %s' \ print('Get %s' % GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),
% GitConfig.ForUser().UrlInsteadOf(opt.manifest_url) file=sys.stderr)
m._InitGitDir() m._InitGitDir()
if opt.manifest_branch: if opt.manifest_branch:
@ -147,7 +152,7 @@ to update the working directory files.
r.ResetFetch() r.ResetFetch()
r.Save() r.Save()
groups = re.split('[,\s]+', opt.groups) groups = re.split(r'[,\s]+', opt.groups)
all_platforms = ['linux', 'darwin'] all_platforms = ['linux', 'darwin']
platformize = lambda x: 'platform-' + x platformize = lambda x: 'platform-' + x
if opt.platform == 'auto': if opt.platform == 'auto':
@ -159,7 +164,7 @@ to update the working directory files.
elif opt.platform in all_platforms: elif opt.platform in all_platforms:
groups.extend(platformize(opt.platform)) groups.extend(platformize(opt.platform))
elif opt.platform != 'none': elif opt.platform != 'none':
print >>sys.stderr, 'fatal: invalid platform flag' print('fatal: invalid platform flag', file=sys.stderr)
sys.exit(1) sys.exit(1)
groups = [x for x in groups if x] groups = [x for x in groups if x]
@ -175,12 +180,13 @@ to update the working directory files.
if is_new: if is_new:
m.config.SetString('repo.mirror', 'true') m.config.SetString('repo.mirror', 'true')
else: else:
print >>sys.stderr, 'fatal: --mirror not supported on existing client' print('fatal: --mirror not supported on existing client',
file=sys.stderr)
sys.exit(1) sys.exit(1)
if not m.Sync_NetworkHalf(is_new=is_new): if not m.Sync_NetworkHalf(is_new=is_new):
r = m.GetRemote(m.remote.name) r = m.GetRemote(m.remote.name)
print >>sys.stderr, 'fatal: cannot obtain manifest %s' % r.url print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
# Better delete the manifest git dir if we created it; otherwise next # Better delete the manifest git dir if we created it; otherwise next
# time (when user fixes problems) we won't go through the "is_new" logic. # time (when user fixes problems) we won't go through the "is_new" logic.
@ -197,24 +203,22 @@ to update the working directory files.
if is_new or m.CurrentBranch is None: if is_new or m.CurrentBranch is None:
if not m.StartBranch('default'): if not m.StartBranch('default'):
print >>sys.stderr, 'fatal: cannot create default in manifest' print('fatal: cannot create default in manifest', file=sys.stderr)
sys.exit(1) sys.exit(1)
def _LinkManifest(self, name): def _LinkManifest(self, name):
if not name: if not name:
print >>sys.stderr, 'fatal: manifest name (-m) is required.' print('fatal: manifest name (-m) is required.', file=sys.stderr)
sys.exit(1) sys.exit(1)
try: try:
self.manifest.Link(name) self.manifest.Link(name)
except ManifestParseError, e: except ManifestParseError as e:
print >>sys.stderr, "fatal: manifest '%s' not available" % name print("fatal: manifest '%s' not available" % name, file=sys.stderr)
print >>sys.stderr, 'fatal: %s' % str(e) print('fatal: %s' % str(e), file=sys.stderr)
sys.exit(1) sys.exit(1)
def _Prompt(self, prompt, value): def _Prompt(self, prompt, value):
mp = self.manifest.manifestProject
sys.stdout.write('%-10s [%s]: ' % (prompt, value)) sys.stdout.write('%-10s [%s]: ' % (prompt, value))
a = sys.stdin.readline().strip() a = sys.stdin.readline().strip()
if a == '': if a == '':
@ -233,24 +237,24 @@ to update the working directory files.
mp.config.SetString('user.name', gc.GetString('user.name')) mp.config.SetString('user.name', gc.GetString('user.name'))
mp.config.SetString('user.email', gc.GetString('user.email')) mp.config.SetString('user.email', gc.GetString('user.email'))
print '' print()
print 'Your identity is: %s <%s>' % (mp.config.GetString('user.name'), print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
mp.config.GetString('user.email')) mp.config.GetString('user.email')))
print 'If you want to change this, please re-run \'repo init\' with --config-name' print('If you want to change this, please re-run \'repo init\' with --config-name')
return False return False
def _ConfigureUser(self): def _ConfigureUser(self):
mp = self.manifest.manifestProject mp = self.manifest.manifestProject
while True: while True:
print '' print()
name = self._Prompt('Your Name', mp.UserName) name = self._Prompt('Your Name', mp.UserName)
email = self._Prompt('Your Email', mp.UserEmail) email = self._Prompt('Your Email', mp.UserEmail)
print '' print()
print 'Your identity is: %s <%s>' % (name, email) print('Your identity is: %s <%s>' % (name, email))
sys.stdout.write('is this correct [y/N]? ') sys.stdout.write('is this correct [y/N]? ')
a = sys.stdin.readline().strip() a = sys.stdin.readline().strip().lower()
if a in ('yes', 'y', 't', 'true'): if a in ('yes', 'y', 't', 'true'):
break break
@ -276,17 +280,17 @@ to update the working directory files.
self._on = True self._on = True
out = _Test() out = _Test()
print '' print()
print "Testing colorized output (for 'repo diff', 'repo status'):" print("Testing colorized output (for 'repo diff', 'repo status'):")
for c in ['black','red','green','yellow','blue','magenta','cyan']: for c in ['black', 'red', 'green', 'yellow', 'blue', 'magenta', 'cyan']:
out.write(' ') out.write(' ')
out.printer(fg=c)(' %-6s ', c) out.printer(fg=c)(' %-6s ', c)
out.write(' ') out.write(' ')
out.printer(fg='white', bg='black')(' %s ' % 'white') out.printer(fg='white', bg='black')(' %s ' % 'white')
out.nl() out.nl()
for c in ['bold','dim','ul','reverse']: for c in ['bold', 'dim', 'ul', 'reverse']:
out.write(' ') out.write(' ')
out.printer(fg='black', attr=c)(' %-6s ', c) out.printer(fg='black', attr=c)(' %-6s ', c)
out.nl() out.nl()
@ -315,8 +319,29 @@ to update the working directory files.
# We store the depth in the main manifest project. # We store the depth in the main manifest project.
self.manifest.manifestProject.config.SetString('repo.depth', depth) self.manifest.manifestProject.config.SetString('repo.depth', depth)
def _DisplayResult(self):
if self.manifest.IsMirror:
init_type = 'mirror '
else:
init_type = ''
print()
print('repo %shas been initialized in %s'
% (init_type, self.manifest.topdir))
current_dir = os.getcwd()
if current_dir != self.manifest.topdir:
print('If this is not the directory in which you want to initialize '
'repo, please run:')
print(' rm -r %s/.repo' % self.manifest.topdir)
print('and try again.')
def Execute(self, opt, args): def Execute(self, opt, args):
git_require(MIN_GIT_VERSION, fail=True) git_require(MIN_GIT_VERSION, fail=True)
if opt.reference:
opt.reference = os.path.expanduser(opt.reference)
self._SyncManifest(opt) self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name) self._LinkManifest(opt.manifest_name)
@ -327,10 +352,4 @@ to update the working directory files.
self._ConfigureDepth(opt) self._ConfigureDepth(opt)
if self.manifest.IsMirror: self._DisplayResult()
type = 'mirror '
else:
type = ''
print ''
print 'repo %sinitialized in %s' % (type, self.manifest.topdir)


@ -13,13 +13,17 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import re
from command import Command, MirrorSafeCommand from command import Command, MirrorSafeCommand
class List(Command, MirrorSafeCommand): class List(Command, MirrorSafeCommand):
common = True common = True
helpSummary = "List projects and their associated directories" helpSummary = "List projects and their associated directories"
helpUsage = """ helpUsage = """
%prog [<project>...] %prog [-f] [<project>...]
%prog [-f] -r str1 [str2]...
""" """
helpDescription = """ helpDescription = """
List all projects; pass '.' to list the project for the cwd. List all projects; pass '.' to list the project for the cwd.
@ -27,6 +31,14 @@ List all projects; pass '.' to list the project for the cwd.
This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'. This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
""" """
def _Options(self, p, show_smart=True):
p.add_option('-r', '--regex',
dest='regex', action='store_true',
help="Filter the project list based on regex or wildcard matching of strings")
p.add_option('-f', '--fullpath',
dest='fullpath', action='store_true',
help="Display the full work tree path instead of the relative path")
def Execute(self, opt, args): def Execute(self, opt, args):
"""List all projects and the associated directories. """List all projects and the associated directories.
@ -35,14 +47,33 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
discoverable. discoverable.
Args: Args:
opt: The options. We don't take any. opt: The options.
args: Positional args. Can be a list of projects to list, or empty. args: Positional args. Can be a list of projects to list, or empty.
""" """
projects = self.GetProjects(args) if not opt.regex:
projects = self.GetProjects(args)
else:
projects = self.FindProjects(args)
def _getpath(x):
if opt.fullpath:
return x.worktree
return x.relpath
lines = [] lines = []
for project in projects: for project in projects:
lines.append("%s : %s" % (project.relpath, project.name)) lines.append("%s : %s" % (_getpath(project), project.name))
lines.sort() lines.sort()
print '\n'.join(lines) print('\n'.join(lines))
def FindProjects(self, args):
result = []
for project in self.GetProjects(''):
for arg in args:
pattern = re.compile(r'%s' % arg, re.IGNORECASE)
if pattern.search(project.name) or pattern.search(project.relpath):
result.append(project)
break
result.sort(key=lambda project: project.relpath)
return result
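
FindProjects() above matches each argument as a case-insensitive regular expression against both the project name and its relative path. A small self-contained illustration of that filter (the project names below are hypothetical, not taken from any real manifest):

import re

projects = [('platform/build', 'build'),            # hypothetical (name, relpath) pairs
            ('platform/bionic', 'bionic'),
            ('tools/repohooks', 'tools/repohooks')]

def find_projects(patterns, projects):
    result = []
    for name, relpath in projects:
        for arg in patterns:
            if re.search(arg, name, re.IGNORECASE) or re.search(arg, relpath, re.IGNORECASE):
                result.append((name, relpath))
                break
    return sorted(result, key=lambda p: p[1])

print(find_projects(['BIONIC', r'^tools/'], projects))
# -> [('platform/bionic', 'bionic'), ('tools/repohooks', 'tools/repohooks')]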


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import os import os
import sys import sys
@ -35,19 +36,24 @@ in a Git repository for use during future 'repo init' invocations.
@property @property
def helpDescription(self): def helpDescription(self):
help = self._helpDescription + '\n' helptext = self._helpDescription + '\n'
r = os.path.dirname(__file__) r = os.path.dirname(__file__)
r = os.path.dirname(r) r = os.path.dirname(r)
fd = open(os.path.join(r, 'docs', 'manifest-format.txt')) fd = open(os.path.join(r, 'docs', 'manifest-format.txt'))
for line in fd: for line in fd:
help += line helptext += line
fd.close() fd.close()
return help return helptext
def _Options(self, p): def _Options(self, p):
p.add_option('-r', '--revision-as-HEAD', p.add_option('-r', '--revision-as-HEAD',
dest='peg_rev', action='store_true', dest='peg_rev', action='store_true',
help='Save revisions as current HEAD') help='Save revisions as current HEAD')
p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream',
default=True, action='store_false',
help='If in -r mode, do not write the upstream field. '
'Only of use if the branch names for a sha1 manifest are '
'sensitive.')
p.add_option('-o', '--output-file', p.add_option('-o', '--output-file',
dest='output_file', dest='output_file',
default='-', default='-',
@ -60,10 +66,11 @@ in a Git repository for use during future 'repo init' invocations.
else: else:
fd = open(opt.output_file, 'w') fd = open(opt.output_file, 'w')
self.manifest.Save(fd, self.manifest.Save(fd,
peg_rev = opt.peg_rev) peg_rev = opt.peg_rev,
peg_rev_upstream = opt.peg_rev_upstream)
fd.close() fd.close()
if opt.output_file != '-': if opt.output_file != '-':
print >>sys.stderr, 'Saved manifest to %s' % opt.output_file print('Saved manifest to %s' % opt.output_file, file=sys.stderr)
def Execute(self, opt, args): def Execute(self, opt, args):
if args: if args:
@ -73,6 +80,6 @@ in a Git repository for use during future 'repo init' invocations.
self._Output(opt) self._Output(opt)
return return
print >>sys.stderr, 'error: no operation to perform' print('error: no operation to perform', file=sys.stderr)
print >>sys.stderr, 'error: see repo help manifest' print('error: see repo help manifest', file=sys.stderr)
sys.exit(1) sys.exit(1)


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
from color import Coloring from color import Coloring
from command import PagedCommand from command import PagedCommand
@ -38,30 +39,33 @@ are displayed.
help="Consider only checked out branches") help="Consider only checked out branches")
def Execute(self, opt, args): def Execute(self, opt, args):
all = [] all_branches = []
for project in self.GetProjects(args): for project in self.GetProjects(args):
br = [project.GetUploadableBranch(x) br = [project.GetUploadableBranch(x)
for x in project.GetBranches().keys()] for x in project.GetBranches().keys()]
br = [x for x in br if x] br = [x for x in br if x]
if opt.current_branch: if opt.current_branch:
br = [x for x in br if x.name == project.CurrentBranch] br = [x for x in br if x.name == project.CurrentBranch]
all.extend(br) all_branches.extend(br)
if not all: if not all_branches:
return return
class Report(Coloring): class Report(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, 'status') Coloring.__init__(self, config, 'status')
self.project = self.printer('header', attr='bold') self.project = self.printer('header', attr='bold')
self.text = self.printer('text')
out = Report(all[0].project.config) out = Report(all_branches[0].project.config)
out.text("Deprecated. See repo info -o.")
out.nl()
out.project('Projects Overview') out.project('Projects Overview')
out.nl() out.nl()
project = None project = None
for branch in all: for branch in all_branches:
if project != branch.project: if project != branch.project:
project = branch.project project = branch.project
out.nl() out.nl()
@ -70,11 +74,11 @@ are displayed.
commits = branch.commits commits = branch.commits
date = branch.date date = branch.date
print '%s %-33s (%2d commit%s, %s)' % ( print('%s %-33s (%2d commit%s, %s)' % (
branch.name == project.CurrentBranch and '*' or ' ', branch.name == project.CurrentBranch and '*' or ' ',
branch.name, branch.name,
len(commits), len(commits),
len(commits) != 1 and 's' or ' ', len(commits) != 1 and 's' or ' ',
date) date))
for commit in commits: for commit in commits:
print '%-35s - %s' % ('', commit) print('%-35s - %s' % ('', commit))


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
from color import Coloring from color import Coloring
from command import PagedCommand from command import PagedCommand
@ -24,11 +25,11 @@ class Prune(PagedCommand):
""" """
def Execute(self, opt, args): def Execute(self, opt, args):
all = [] all_branches = []
for project in self.GetProjects(args): for project in self.GetProjects(args):
all.extend(project.PruneHeads()) all_branches.extend(project.PruneHeads())
if not all: if not all_branches:
return return
class Report(Coloring): class Report(Coloring):
@ -36,13 +37,13 @@ class Prune(PagedCommand):
Coloring.__init__(self, config, 'status') Coloring.__init__(self, config, 'status')
self.project = self.printer('header', attr='bold') self.project = self.printer('header', attr='bold')
out = Report(all[0].project.config) out = Report(all_branches[0].project.config)
out.project('Pending Branches') out.project('Pending Branches')
out.nl() out.nl()
project = None project = None
for branch in all: for branch in all_branches:
if project != branch.project: if project != branch.project:
project = branch.project project = branch.project
out.nl() out.nl()
@ -51,9 +52,9 @@ class Prune(PagedCommand):
commits = branch.commits commits = branch.commits
date = branch.date date = branch.date
print '%s %-33s (%2d commit%s, %s)' % ( print('%s %-33s (%2d commit%s, %s)' % (
branch.name == project.CurrentBranch and '*' or ' ', branch.name == project.CurrentBranch and '*' or ' ',
branch.name, branch.name,
len(commits), len(commits),
len(commits) != 1 and 's' or ' ', len(commits) != 1 and 's' or ' ',
date) date))


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import sys import sys
from command import Command from command import Command
@ -55,18 +56,20 @@ branch but need to incorporate new upstream changes "underneath" them.
help='Stash local modifications before starting') help='Stash local modifications before starting')
def Execute(self, opt, args): def Execute(self, opt, args):
all = self.GetProjects(args) all_projects = self.GetProjects(args)
one_project = len(all) == 1 one_project = len(all_projects) == 1
if opt.interactive and not one_project: if opt.interactive and not one_project:
print >>sys.stderr, 'error: interactive rebase not supported with multiple projects' print('error: interactive rebase not supported with multiple projects',
file=sys.stderr)
return -1 return -1
for project in all: for project in all_projects:
cb = project.CurrentBranch cb = project.CurrentBranch
if not cb: if not cb:
if one_project: if one_project:
print >>sys.stderr, "error: project %s has a detatched HEAD" % project.relpath print("error: project %s has a detatched HEAD" % project.relpath,
file=sys.stderr)
return -1 return -1
# ignore branches with detached HEADs # ignore branches with detached HEADs
continue continue
@ -74,7 +77,8 @@ branch but need to incorporate new upstream changes "underneath" them.
upbranch = project.GetBranch(cb) upbranch = project.GetBranch(cb)
if not upbranch.LocalMerge: if not upbranch.LocalMerge:
if one_project: if one_project:
print >>sys.stderr, "error: project %s does not track any remote branches" % project.relpath print("error: project %s does not track any remote branches"
% project.relpath, file=sys.stderr)
return -1 return -1
# ignore branches without remotes # ignore branches without remotes
continue continue
@ -101,8 +105,8 @@ branch but need to incorporate new upstream changes "underneath" them.
args.append(upbranch.LocalMerge) args.append(upbranch.LocalMerge)
print >>sys.stderr, '# %s: rebasing %s -> %s' % \ print('# %s: rebasing %s -> %s'
(project.relpath, cb, upbranch.LocalMerge) % (project.relpath, cb, upbranch.LocalMerge), file=sys.stderr)
needs_stash = False needs_stash = False
if opt.auto_stash: if opt.auto_stash:


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
from optparse import SUPPRESS_HELP from optparse import SUPPRESS_HELP
import sys import sys
@ -52,7 +53,7 @@ need to be performed by an end-user.
else: else:
if not rp.Sync_NetworkHalf(): if not rp.Sync_NetworkHalf():
print >>sys.stderr, "error: can't update repo" print("error: can't update repo", file=sys.stderr)
sys.exit(1) sys.exit(1)
rp.bare_git.gc('--auto') rp.bare_git.gc('--auto')


@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from sync import Sync from subcmds.sync import Sync
class Smartsync(Sync): class Smartsync(Sync):
common = True common = True


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import sys import sys
from color import Coloring from color import Coloring
@ -48,9 +49,9 @@ The '%prog' command stages files to prepare the next commit.
self.Usage() self.Usage()
def _Interactive(self, opt, args): def _Interactive(self, opt, args):
all = filter(lambda x: x.IsDirty(), self.GetProjects(args)) all_projects = filter(lambda x: x.IsDirty(), self.GetProjects(args))
if not all: if not all_projects:
print >>sys.stderr,'no projects have uncommitted modifications' print('no projects have uncommitted modifications', file=sys.stderr)
return return
out = _ProjectList(self.manifest.manifestProject.config) out = _ProjectList(self.manifest.manifestProject.config)
@ -58,8 +59,8 @@ The '%prog' command stages files to prepare the next commit.
out.header(' %s', 'project') out.header(' %s', 'project')
out.nl() out.nl()
for i in xrange(0, len(all)): for i in range(len(all_projects)):
p = all[i] p = all_projects[i]
out.write('%3d: %s', i + 1, p.relpath + '/') out.write('%3d: %s', i + 1, p.relpath + '/')
out.nl() out.nl()
out.nl() out.nl()
@ -93,15 +94,15 @@ The '%prog' command stages files to prepare the next commit.
if a_index is not None: if a_index is not None:
if a_index == 0: if a_index == 0:
break break
if 0 < a_index and a_index <= len(all): if 0 < a_index and a_index <= len(all_projects):
_AddI(all[a_index - 1]) _AddI(all_projects[a_index - 1])
continue continue
p = filter(lambda x: x.name == a or x.relpath == a, all) p = filter(lambda x: x.name == a or x.relpath == a, all_projects)
if len(p) == 1: if len(p) == 1:
_AddI(p[0]) _AddI(p[0])
continue continue
print 'Bye.' print('Bye.')
def _AddI(project): def _AddI(project):
p = GitCommand(project, ['add', '--interactive'], bare=False) p = GitCommand(project, ['add', '--interactive'], bare=False)


@ -13,6 +13,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import sys import sys
from command import Command from command import Command
from git_config import IsId from git_config import IsId
@ -41,7 +42,7 @@ revision specified in the manifest.
nb = args[0] nb = args[0]
if not git.check_ref_format('heads/%s' % nb): if not git.check_ref_format('heads/%s' % nb):
print >>sys.stderr, "error: '%s' is not a valid name" % nb print("error: '%s' is not a valid name" % nb, file=sys.stderr)
sys.exit(1) sys.exit(1)
err = [] err = []
@ -49,13 +50,13 @@ revision specified in the manifest.
if not opt.all: if not opt.all:
projects = args[1:] projects = args[1:]
if len(projects) < 1: if len(projects) < 1:
print >>sys.stderr, "error: at least one project must be specified" print("error: at least one project must be specified", file=sys.stderr)
sys.exit(1) sys.exit(1)
all = self.GetProjects(projects) all_projects = self.GetProjects(projects)
pm = Progress('Starting %s' % nb, len(all)) pm = Progress('Starting %s' % nb, len(all_projects))
for project in all: for project in all_projects:
pm.update() pm.update()
# If the current revision is a specific SHA1 then we can't push back # If the current revision is a specific SHA1 then we can't push back
# to it so substitute the manifest default revision instead. # to it so substitute the manifest default revision instead.
@ -67,7 +68,6 @@ revision specified in the manifest.
if err: if err:
for p in err: for p in err:
print >>sys.stderr,\ print("error: %s/: cannot start %s" % (p.relpath, nb),
"error: %s/: cannot start %s" \ file=sys.stderr)
% (p.relpath, nb)
sys.exit(1) sys.exit(1)


@ -98,18 +98,18 @@ the following meanings:
sem.release() sem.release()
def Execute(self, opt, args): def Execute(self, opt, args):
all = self.GetProjects(args) all_projects = self.GetProjects(args)
counter = itertools.count() counter = itertools.count()
if opt.jobs == 1: if opt.jobs == 1:
for project in all: for project in all_projects:
state = project.PrintWorkTreeStatus() state = project.PrintWorkTreeStatus()
if state == 'CLEAN': if state == 'CLEAN':
counter.next() counter.next()
else: else:
sem = _threading.Semaphore(opt.jobs) sem = _threading.Semaphore(opt.jobs)
threads_and_output = [] threads_and_output = []
for project in all: for project in all_projects:
sem.acquire() sem.acquire()
class BufList(StringIO.StringIO): class BufList(StringIO.StringIO):
@ -128,5 +128,5 @@ the following meanings:
t.join() t.join()
output.dump(sys.stdout) output.dump(sys.stdout)
output.close() output.close()
if len(all) == counter.next(): if len(all_projects) == counter.next():
print 'nothing to commit (working directory clean)' print('nothing to commit (working directory clean)')


@ -13,14 +13,18 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from __future__ import print_function
import netrc
from optparse import SUPPRESS_HELP from optparse import SUPPRESS_HELP
import os import os
import pickle
import re import re
import shutil import shutil
import socket import socket
import subprocess import subprocess
import sys import sys
import time import time
import urlparse
import xmlrpclib import xmlrpclib
try: try:
@ -36,15 +40,23 @@ except ImportError:
def _rlimit_nofile(): def _rlimit_nofile():
return (256, 256) return (256, 256)
from git_command import GIT try:
import multiprocessing
except ImportError:
multiprocessing = None
from git_command import GIT, git_require
from git_refs import R_HEADS, HEAD from git_refs import R_HEADS, HEAD
from main import WrapperModule
from project import Project from project import Project
from project import RemoteSpec from project import RemoteSpec
from command import Command, MirrorSafeCommand from command import Command, MirrorSafeCommand
from error import RepoChangedException, GitError from error import RepoChangedException, GitError, ManifestParseError
from project import SyncBuffer from project import SyncBuffer
from progress import Progress from progress import Progress
_ONE_DAY_S = 24 * 60 * 60
class _FetchError(Exception): class _FetchError(Exception):
"""Internal error thrown in _FetchHelper() when we don't want stack trace.""" """Internal error thrown in _FetchHelper() when we don't want stack trace."""
pass pass
@ -81,6 +93,18 @@ build as specified by the manifest-server element in the current
manifest. The -t/--smart-tag option is similar and allows you to manifest. The -t/--smart-tag option is similar and allows you to
specify a custom tag/label. specify a custom tag/label.
The -u/--manifest-server-username and -p/--manifest-server-password
options can be used to specify a username and password to authenticate
with the manifest server when using the -s or -t option.
If -u and -p are not specified when using the -s or -t option, '%prog'
will attempt to read authentication credentials for the manifest server
from the user's .netrc file.
'%prog' will not use authentication credentials from -u/-p or .netrc
if the manifest server specified in the manifest file already includes
credentials.
The -f/--force-broken option can be used to proceed with syncing The -f/--force-broken option can be used to proceed with syncing
other projects if a project sync fails. other projects if a project sync fails.
@ -90,6 +114,9 @@ resumeable bundle file on a content delivery network. This
may be necessary if there are problems with the local Python may be necessary if there are problems with the local Python
HTTP client or proxy configuration, but the Git binary works. HTTP client or proxy configuration, but the Git binary works.
The --fetch-submodules option enables fetching Git submodules
of a project from server.
SSH Connections SSH Connections
--------------- ---------------
@ -121,27 +148,30 @@ later is required to fix a server side protocol bug.
""" """
def _Options(self, p, show_smart=True): def _Options(self, p, show_smart=True):
self.jobs = self.manifest.default.sync_j try:
self.jobs = self.manifest.default.sync_j
except ManifestParseError:
self.jobs = 1
p.add_option('-f', '--force-broken', p.add_option('-f', '--force-broken',
dest='force_broken', action='store_true', dest='force_broken', action='store_true',
help="continue sync even if a project fails to sync") help="continue sync even if a project fails to sync")
p.add_option('-l','--local-only', p.add_option('-l', '--local-only',
dest='local_only', action='store_true', dest='local_only', action='store_true',
help="only update working tree, don't fetch") help="only update working tree, don't fetch")
p.add_option('-n','--network-only', p.add_option('-n', '--network-only',
dest='network_only', action='store_true', dest='network_only', action='store_true',
help="fetch only, don't update working tree") help="fetch only, don't update working tree")
p.add_option('-d','--detach', p.add_option('-d', '--detach',
dest='detach_head', action='store_true', dest='detach_head', action='store_true',
help='detach projects back to manifest revision') help='detach projects back to manifest revision')
p.add_option('-c','--current-branch', p.add_option('-c', '--current-branch',
dest='current_branch_only', action='store_true', dest='current_branch_only', action='store_true',
help='fetch only current branch from server') help='fetch only current branch from server')
p.add_option('-q','--quiet', p.add_option('-q', '--quiet',
dest='quiet', action='store_true', dest='quiet', action='store_true',
help='be more quiet') help='be more quiet')
p.add_option('-j','--jobs', p.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', dest='jobs', action='store', type='int',
help="projects to fetch simultaneously (default %d)" % self.jobs) help="projects to fetch simultaneously (default %d)" % self.jobs)
p.add_option('-m', '--manifest-name', p.add_option('-m', '--manifest-name',
@ -150,6 +180,15 @@ later is required to fix a server side protocol bug.
p.add_option('--no-clone-bundle', p.add_option('--no-clone-bundle',
dest='no_clone_bundle', action='store_true', dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS') help='disable use of /clone.bundle on HTTP/HTTPS')
p.add_option('-u', '--manifest-server-username', action='store',
dest='manifest_server_username',
help='username to authenticate with the manifest server')
p.add_option('-p', '--manifest-server-password', action='store',
dest='manifest_server_password',
help='password to authenticate with the manifest server')
p.add_option('--fetch-submodules',
dest='fetch_submodules', action='store_true',
help='fetch submodules from server')
if show_smart: if show_smart:
p.add_option('-s', '--smart-sync', p.add_option('-s', '--smart-sync',
dest='smart_sync', action='store_true', dest='smart_sync', action='store_true',
@ -167,59 +206,62 @@ later is required to fix a server side protocol bug.
help=SUPPRESS_HELP) help=SUPPRESS_HELP)
def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event): def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event):
"""Main function of the fetch threads when jobs are > 1. """Main function of the fetch threads when jobs are > 1.
Args: Args:
opt: Program options returned from optparse. See _Options(). opt: Program options returned from optparse. See _Options().
project: Project object for the project to fetch. project: Project object for the project to fetch.
lock: Lock for accessing objects that are shared amongst multiple lock: Lock for accessing objects that are shared amongst multiple
_FetchHelper() threads. _FetchHelper() threads.
fetched: set object that we will add project.gitdir to when we're done fetched: set object that we will add project.gitdir to when we're done
(with our lock held). (with our lock held).
pm: Instance of a Project object. We will call pm.update() (with our pm: Instance of a Project object. We will call pm.update() (with our
lock held). lock held).
sem: We'll release() this semaphore when we exit so that another thread sem: We'll release() this semaphore when we exit so that another thread
can be started up. can be started up.
err_event: We'll set this event in the case of an error (after printing err_event: We'll set this event in the case of an error (after printing
out info about the error). out info about the error).
""" """
# We'll set to true once we've locked the lock. # We'll set to true once we've locked the lock.
did_lock = False did_lock = False
# Encapsulate everything in a try/except/finally so that: # Encapsulate everything in a try/except/finally so that:
# - We always set err_event in the case of an exception. # - We always set err_event in the case of an exception.
# - We always make sure we call sem.release(). # - We always make sure we call sem.release().
# - We always make sure we unlock the lock if we locked it. # - We always make sure we unlock the lock if we locked it.
try:
try: try:
try: start = time.time()
success = project.Sync_NetworkHalf( success = project.Sync_NetworkHalf(
quiet=opt.quiet, quiet=opt.quiet,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
clone_bundle=not opt.no_clone_bundle) clone_bundle=not opt.no_clone_bundle)
self._fetch_times.Set(project, time.time() - start)
# Lock around all the rest of the code, since printing, updating a set # Lock around all the rest of the code, since printing, updating a set
# and Progress.update() are not thread safe. # and Progress.update() are not thread safe.
lock.acquire() lock.acquire()
did_lock = True did_lock = True
if not success: if not success:
print >>sys.stderr, 'error: Cannot fetch %s' % project.name print('error: Cannot fetch %s' % project.name, file=sys.stderr)
if opt.force_broken: if opt.force_broken:
print >>sys.stderr, 'warn: --force-broken, continuing to sync' print('warn: --force-broken, continuing to sync',
else: file=sys.stderr)
raise _FetchError() else:
raise _FetchError()
fetched.add(project.gitdir) fetched.add(project.gitdir)
pm.update() pm.update()
except _FetchError: except _FetchError:
err_event.set() err_event.set()
except: except:
err_event.set() err_event.set()
raise raise
finally: finally:
if did_lock: if did_lock:
lock.release() lock.release()
sem.release() sem.release()
def _Fetch(self, projects, opt): def _Fetch(self, projects, opt):
fetched = set() fetched = set()
@ -234,9 +276,9 @@ later is required to fix a server side protocol bug.
clone_bundle=not opt.no_clone_bundle): clone_bundle=not opt.no_clone_bundle):
fetched.add(project.gitdir) fetched.add(project.gitdir)
else: else:
print >>sys.stderr, 'error: Cannot fetch %s' % project.name print('error: Cannot fetch %s' % project.name, file=sys.stderr)
if opt.force_broken: if opt.force_broken:
print >>sys.stderr, 'warn: --force-broken, continuing to sync' print('warn: --force-broken, continuing to sync', file=sys.stderr)
else: else:
sys.exit(1) sys.exit(1)
else: else:
@ -269,14 +311,62 @@ later is required to fix a server side protocol bug.
# If we saw an error, exit with code 1 so that other scripts can check. # If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet(): if err_event.isSet():
print >>sys.stderr, '\nerror: Exited sync due to fetch errors' print('\nerror: Exited sync due to fetch errors', file=sys.stderr)
sys.exit(1) sys.exit(1)
pm.end() pm.end()
for project in projects: self._fetch_times.Save()
project.bare_git.gc('--auto')
self._GCProjects(projects)
return fetched return fetched
def _GCProjects(self, projects):
has_dash_c = git_require((1, 7, 2))
if multiprocessing and has_dash_c:
cpu_count = multiprocessing.cpu_count()
else:
cpu_count = 1
jobs = min(self.jobs, cpu_count)
if jobs < 2:
for project in projects:
project.bare_git.gc('--auto')
return
config = {'pack.threads': cpu_count / jobs if cpu_count > jobs else 1}
threads = set()
sem = _threading.Semaphore(jobs)
err_event = _threading.Event()
def GC(project):
try:
try:
project.bare_git.gc('--auto', config=config)
except GitError:
err_event.set()
except:
err_event.set()
raise
finally:
sem.release()
for project in projects:
if err_event.isSet():
break
sem.acquire()
t = _threading.Thread(target=GC, args=(project,))
t.daemon = True
threads.add(t)
t.start()
for t in threads:
t.join()
if err_event.isSet():
print('\nerror: Exited sync due to gc errors', file=sys.stderr)
sys.exit(1)
  def UpdateProjectList(self):
    new_project_paths = []
    for project in self.GetProjects(None, missing_ok=True):

@@ -296,37 +386,38 @@ later is required to fix a server side protocol bug.

      if not path:
        continue
      if path not in new_project_paths:
        # If the path has already been deleted, we don't need to do it
        if os.path.exists(self.manifest.topdir + '/' + path):
          project = Project(
                         manifest = self.manifest,
                         name = path,
                         remote = RemoteSpec('origin'),
                         gitdir = os.path.join(self.manifest.topdir,
                                               path, '.git'),
                         worktree = os.path.join(self.manifest.topdir, path),
                         relpath = path,
                         revisionExpr = 'HEAD',
                         revisionId = None,
                         groups = None)

          if project.IsDirty():
            print('error: Cannot remove project "%s": uncommitted changes '
                  'are present' % project.relpath, file=sys.stderr)
            print(' commit changes, then run sync again',
                  file=sys.stderr)
            return -1
          else:
            print('Deleting obsolete path %s' % project.worktree,
                  file=sys.stderr)
            shutil.rmtree(project.worktree)
            # Try deleting parent subdirs if they are empty
            project_dir = os.path.dirname(project.worktree)
            while project_dir != self.manifest.topdir:
              try:
                os.rmdir(project_dir)
              except OSError:
                break
              project_dir = os.path.dirname(project_dir)

    new_project_paths.sort()
    fd = open(file_path, 'w')
@@ -345,28 +436,70 @@ uncommitted changes are present' % project.relpath

      self.jobs = min(self.jobs, (soft_limit - 5) / 3)

    if opt.network_only and opt.detach_head:
      print('error: cannot combine -n and -d', file=sys.stderr)
      sys.exit(1)
    if opt.network_only and opt.local_only:
      print('error: cannot combine -n and -l', file=sys.stderr)
      sys.exit(1)
    if opt.manifest_name and opt.smart_sync:
      print('error: cannot combine -m and -s', file=sys.stderr)
      sys.exit(1)
    if opt.manifest_name and opt.smart_tag:
      print('error: cannot combine -m and -t', file=sys.stderr)
      sys.exit(1)
    if opt.manifest_server_username or opt.manifest_server_password:
      if not (opt.smart_sync or opt.smart_tag):
        print('error: -u and -p may only be combined with -s or -t',
              file=sys.stderr)
        sys.exit(1)
      if None in [opt.manifest_server_username, opt.manifest_server_password]:
        print('error: both -u and -p must be given', file=sys.stderr)
        sys.exit(1)

    if opt.manifest_name:
      self.manifest.Override(opt.manifest_name)

    if opt.smart_sync or opt.smart_tag:
      if not self.manifest.manifest_server:
        print('error: cannot smart sync: no manifest server defined in '
              'manifest', file=sys.stderr)
        sys.exit(1)
      manifest_server = self.manifest.manifest_server
      if not '@' in manifest_server:
        username = None
        password = None
        if opt.manifest_server_username and opt.manifest_server_password:
          username = opt.manifest_server_username
          password = opt.manifest_server_password
        else:
          try:
            info = netrc.netrc()
          except IOError:
            print('.netrc file does not exist or could not be opened',
                  file=sys.stderr)
          else:
            try:
              parse_result = urlparse.urlparse(manifest_server)
              if parse_result.hostname:
                username, _account, password = \
                  info.authenticators(parse_result.hostname)
            except TypeError:
              # TypeError is raised when the given hostname is not present
              # in the .netrc file.
              print('No credentials found for %s in .netrc'
                    % parse_result.hostname, file=sys.stderr)
            except netrc.NetrcParseError as e:
              print('Error parsing .netrc file: %s' % e, file=sys.stderr)

        if (username and password):
          manifest_server = manifest_server.replace('://', '://%s:%s@' %
                                                    (username, password),
                                                    1)
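For illustration only, with a made-up host and credentials, the URL rewrite performed above behaves like this:

>>> manifest_server = 'https://manifest.example.com/xmlrpc'   # hypothetical
>>> username, password = 'alice', 's3cr3t'
>>> manifest_server.replace('://', '://%s:%s@' % (username, password), 1)
'https://alice:s3cr3t@manifest.example.com/xmlrpc'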
      try:
        server = xmlrpclib.Server(manifest_server)
        if opt.smart_sync:
          p = self.manifest.manifestProject
          b = p.GetBranch(p.CurrentBranch)

@@ -397,20 +530,21 @@ uncommitted changes are present' % project.relpath

            finally:
              f.close()
          except IOError:
            print('error: cannot write manifest to %s' % manifest_path,
                  file=sys.stderr)
            sys.exit(1)

          self.manifest.Override(manifest_name)
        else:
          print('error: %s' % manifest_str, file=sys.stderr)
          sys.exit(1)
      except (socket.error, IOError, xmlrpclib.Fault) as e:
        print('error: cannot connect to manifest server %s:\n%s'
              % (self.manifest.manifest_server, e), file=sys.stderr)
        sys.exit(1)
      except xmlrpclib.ProtocolError as e:
        print('error: cannot connect to manifest server %s:\n%d %s'
              % (self.manifest.manifest_server, e.errcode, e.errmsg),
              file=sys.stderr)
        sys.exit(1)

    rp = self.manifest.repoProject

@@ -420,7 +554,7 @@ uncommitted changes are present' % project.relpath

    mp.PreSync()

    if opt.repo_upgraded:
      _PostRepoUpgrade(self.manifest, quiet=opt.quiet)

    if not opt.local_only:
      mp.Sync_NetworkHalf(quiet=opt.quiet,
@@ -434,14 +568,18 @@ uncommitted changes are present' % project.relpath

      self.manifest._Unload()
      if opt.jobs is None:
        self.jobs = self.manifest.default.sync_j
    all_projects = self.GetProjects(args,
                                    missing_ok=True,
                                    submodules_ok=opt.fetch_submodules)

    self._fetch_times = _FetchTimes(self.manifest)
    if not opt.local_only:
      to_fetch = []
      now = time.time()
      if _ONE_DAY_S <= (now - rp.LastFetch):
        to_fetch.append(rp)
      to_fetch.extend(all_projects)
      to_fetch.sort(key=self._fetch_times.Get, reverse=True)

      fetched = self._Fetch(to_fetch, opt)
      _PostRepoFetch(rp, opt.no_repo_verify)

@@ -449,13 +587,26 @@ uncommitted changes are present' % project.relpath

        # bail out now; the rest touches the working tree
        return

      # Iteratively fetch missing and/or nested unregistered submodules
      previously_missing_set = set()
      while True:
        self.manifest._Unload()
        all_projects = self.GetProjects(args,
                                        missing_ok=True,
                                        submodules_ok=opt.fetch_submodules)
        missing = []
        for project in all_projects:
          if project.gitdir not in fetched:
            missing.append(project)
        if not missing:
          break
        # Stop fetching repos that are actually missing: if the set of missing
        # repos has not changed since the last fetch, break out of the loop.
        missing_set = set(p.name for p in missing)
        if previously_missing_set == missing_set:
          break
        previously_missing_set = missing_set
        fetched.update(self._Fetch(missing, opt))

    if self.manifest.IsMirror:
      # bail out now, we have no working tree
@@ -466,49 +617,53 @@ uncommitted changes are present' % project.relpath

    syncbuf = SyncBuffer(mp.config,
                         detach_head = opt.detach_head)
    pm = Progress('Syncing work tree', len(all_projects))
    for project in all_projects:
      pm.update()
      if project.worktree:
        project.Sync_LocalHalf(syncbuf)
    pm.end()
    print(file=sys.stderr)
    if not syncbuf.Finish():
      sys.exit(1)

    # If there's a notice that's supposed to print at the end of the sync, print
    # it now...
    if self.manifest.notice:
      print(self.manifest.notice)
def _PostRepoUpgrade(manifest, quiet=False):
  wrapper = WrapperModule()
  if wrapper.NeedSetupGnuPG():
    wrapper.SetupGnuPG(quiet)
  for project in manifest.projects.values():
    if project.Exists:
      project.PostRepoUpgrade()

def _PostRepoFetch(rp, no_repo_verify=False, verbose=False):
  if rp.HasChanges:
    print('info: A new version of repo is available', file=sys.stderr)
    print(file=sys.stderr)
    if no_repo_verify or _VerifyTag(rp):
      syncbuf = SyncBuffer(rp.config)
      rp.Sync_LocalHalf(syncbuf)
      if not syncbuf.Finish():
        sys.exit(1)
      print('info: Restarting repo with latest version', file=sys.stderr)
      raise RepoChangedException(['--repo-upgraded'])
    else:
      print('warning: Skipped upgrade to unverified version', file=sys.stderr)
  else:
    if verbose:
      print('repo version %s is current' % rp.work_git.describe(HEAD),
            file=sys.stderr)

def _VerifyTag(project):
  gpg_dir = os.path.expanduser('~/.repoconfig/gnupg')
  if not os.path.exists(gpg_dir):
    print('warning: GnuPG was not available during last "repo init"\n'
          'warning: Cannot automatically authenticate repo.',
          file=sys.stderr)
    return True

  try:

@@ -522,10 +677,9 @@ warning: Cannot automatically authenticate repo."""

    if rev.startswith(R_HEADS):
      rev = rev[len(R_HEADS):]

    print(file=sys.stderr)
    print("warning: project '%s' branch '%s' is not signed"
          % (project.name, rev), file=sys.stderr)
    return False

  env = os.environ.copy()

@@ -544,9 +698,72 @@ warning: Cannot automatically authenticate repo."""

  proc.stderr.close()

  if proc.wait() != 0:
    print(file=sys.stderr)
    print(out, file=sys.stderr)
    print(err, file=sys.stderr)
    print(file=sys.stderr)
    return False
  return True

class _FetchTimes(object):
  _ALPHA = 0.5

  def __init__(self, manifest):
    self._path = os.path.join(manifest.repodir, '.repopickle_fetchtimes')
    self._times = None
    self._seen = set()

  def Get(self, project):
    self._Load()
    return self._times.get(project.name, _ONE_DAY_S)

  def Set(self, project, t):
    self._Load()
    name = project.name
    old = self._times.get(name, t)
    self._seen.add(name)
    a = self._ALPHA
    self._times[name] = (a*t) + ((1-a) * old)

  def _Load(self):
    if self._times is None:
      try:
        f = open(self._path)
      except IOError:
        self._times = {}
        return self._times
      try:
        try:
          self._times = pickle.load(f)
        except IOError:
          try:
            os.remove(self._path)
          except OSError:
            pass
          self._times = {}
      finally:
        f.close()
    return self._times

  def Save(self):
    if self._times is None:
      return

    to_delete = []
    for name in self._times:
      if name not in self._seen:
        to_delete.append(name)
    for name in to_delete:
      del self._times[name]

    try:
      f = open(self._path, 'wb')
      try:
        pickle.dump(self._times, f)
      except (IOError, OSError, pickle.PickleError):
        try:
          os.remove(self._path)
        except OSError:
          pass
    finally:
      f.close()
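A minimal, self-contained sketch of the fetch-time bookkeeping above, using plain strings as project names (the names below are hypothetical). It shows the exponential moving average kept by _FetchTimes and why sorting by the recorded time in descending order starts the slowest fetches first, while never-seen projects default to one day and are therefore scheduled early.

# Illustrative sketch of the _FetchTimes smoothing and scheduling behaviour.
from __future__ import print_function

_ALPHA = 0.5                 # same smoothing factor as _FetchTimes._ALPHA
_ONE_DAY_S = 24 * 60 * 60    # default for projects never fetched before

times = {}                   # project name -> smoothed fetch duration (seconds)

def record(name, duration):
  old = times.get(name, duration)
  times[name] = _ALPHA * duration + (1 - _ALPHA) * old

def fetch_order(names):
  # Unknown projects default to one day, so they are fetched early.
  return sorted(names, key=lambda n: times.get(n, _ONE_DAY_S), reverse=True)

record('platform/frameworks/base', 420.0)   # hypothetical project names
record('platform/build', 12.0)
print(fetch_order(['platform/build', 'platform/frameworks/base', 'new/project']))
# ['new/project', 'platform/frameworks/base', 'platform/build']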


@@ -13,6 +13,7 @@

# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import print_function
import copy
import re
import sys
@@ -26,28 +27,30 @@ UNUSUAL_COMMIT_THRESHOLD = 5

def _ConfirmManyUploads(multiple_branches=False):
  if multiple_branches:
    print('ATTENTION: One or more branches has an unusually high number '
          'of commits.')
  else:
    print('ATTENTION: You are uploading an unusually high number of commits.')
  print('YOU PROBABLY DO NOT MEAN TO DO THIS. (Did you rebase across '
        'branches?)')
  answer = raw_input("If you are sure you intend to do this, type 'yes': ").strip()
  return answer == "yes"

def _die(fmt, *args):
  msg = fmt % args
  print('error: %s' % msg, file=sys.stderr)
  sys.exit(1)

def _SplitEmails(values):
  result = []
  for value in values:
    result.extend([s.strip() for s in value.split(',')])
  return result
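For reference, a quick example of the _SplitEmails helper above (the addresses are made up): it accepts repeated option values as well as comma-separated lists and flattens them into one list of trimmed addresses.

>>> _SplitEmails(['alice@example.com, bob@example.com', 'carol@example.com'])
['alice@example.com', 'bob@example.com', 'carol@example.com']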
class Upload(InteractiveCommand):
  common = True
  helpSummary = "Upload changes for code review"
  helpUsage = """
%prog [--re --cc] [<project>]...
"""
  helpDescription = """

@@ -174,20 +177,20 @@ Gerrit Code Review: http://code.google.com/p/gerrit/

    if answer is None:
      date = branch.date
      commit_list = branch.commits

      print('Upload project %s/ to remote branch %s:' % (project.relpath, project.revisionExpr))
      print(' branch %s (%2d commit%s, %s):' % (
                    name,
                    len(commit_list),
                    len(commit_list) != 1 and 's' or '',
                    date))
      for commit in commit_list:
        print(' %s' % commit)

      sys.stdout.write('to %s (y/N)? ' % remote.review)
      answer = sys.stdin.readline().strip().lower()
      answer = answer in ('y', 'yes', '1', 'true', 't')

    if answer:
      if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD:

@@ -212,17 +215,17 @@ Gerrit Code Review: http://code.google.com/p/gerrit/

    for branch in avail:
      name = branch.name
      date = branch.date
      commit_list = branch.commits

      if b:
        script.append('#')
      script.append('# branch %s (%2d commit%s, %s) to remote branch %s:' % (
                    name,
                    len(commit_list),
                    len(commit_list) != 1 and 's' or '',
                    date,
                    project.revisionExpr))
      for commit in commit_list:
        script.append('# %s' % commit)
      b[name] = branch
@@ -297,7 +300,7 @@ Gerrit Code Review: http://code.google.com/p/gerrit/

    try:
      # refs/changes/XYZ/N --> XYZ
      return refs.get(last_pub).split('/')[-2]
    except (AttributeError, IndexError):
      return ""

  def _UploadAndReport(self, opt, todo, original_people):

@@ -309,33 +312,33 @@ Gerrit Code Review: http://code.google.com/p/gerrit/

        # Check if there are local changes that may have been forgotten
        if branch.project.HasChanges():
          key = 'review.%s.autoupload' % branch.project.remote.review
          answer = branch.project.config.GetBoolean(key)

          # if they want to auto upload, let's not ask because it could be automated
          if answer is None:
            sys.stdout.write('Uncommitted changes in ' + branch.project.name + ' (did you forget to amend?). Continue uploading? (y/N) ')
            a = sys.stdin.readline().strip().lower()
            if a not in ('y', 'yes', 't', 'true', 'on'):
              print("skipping upload", file=sys.stderr)
              branch.uploaded = False
              branch.error = 'User aborted'
              continue

        # Check if topic branches should be sent to the server during upload
        if opt.auto_topic is not True:
          key = 'review.%s.uploadtopic' % branch.project.remote.review
          opt.auto_topic = branch.project.config.GetBoolean(key)

        branch.UploadForReview(people, auto_topic=opt.auto_topic, draft=opt.draft)
        branch.uploaded = True
      except UploadError as e:
        branch.error = e
        branch.uploaded = False
        have_errors = True

    print(file=sys.stderr)
    print('----------------------------------------------------------------------', file=sys.stderr)

    if have_errors:
      for branch in todo:

@@ -344,17 +347,19 @@ Gerrit Code Review: http://code.google.com/p/gerrit/

          fmt = ' (%s)'
        else:
          fmt = '\n (%s)'
        print(('[FAILED] %-15s %-15s' + fmt) % (
               branch.project.relpath + '/', \
               branch.name, \
               str(branch.error)),
               file=sys.stderr)
      print()

    for branch in todo:
      if branch.uploaded:
        print('[OK ] %-15s %s' % (
               branch.project.relpath + '/',
               branch.name),
               file=sys.stderr)

    if have_errors:
      sys.exit(1)
@@ -384,18 +389,18 @@ Gerrit Code Review: http://code.google.com/p/gerrit/

    pending_proj_names = [project.name for (project, avail) in pending]
    try:
      hook.Run(opt.allow_all_hooks, project_list=pending_proj_names)
    except HookError as e:
      print("ERROR: %s" % str(e), file=sys.stderr)
      return

    if opt.reviewers:
      reviewers = _SplitEmails(opt.reviewers)
    if opt.cc:
      cc = _SplitEmails(opt.cc)
    people = (reviewers, cc)

    if not pending:
      print("no branches ready for upload", file=sys.stderr)
    elif len(pending) == 1 and len(pending[0][1]) == 1:
      self._SingleBranch(opt, pending[0][1][0], people)
    else:


@@ -13,10 +13,11 @@

# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import print_function
import sys

from command import Command, MirrorSafeCommand
from git_command import git
from git_refs import HEAD

class Version(Command, MirrorSafeCommand):
  wrapper_version = None

@@ -32,12 +33,12 @@ class Version(Command, MirrorSafeCommand):

    rp = self.manifest.repoProject
    rem = rp.GetRemote(rp.remote.name)

    print('repo version %s' % rp.work_git.describe(HEAD))
    print(' (from %s)' % rem.url)

    if Version.wrapper_path is not None:
      print('repo launcher version %s' % Version.wrapper_version)
      print(' (from %s)' % Version.wrapper_path)

    print(git.version().strip())
    print('Python %s' % sys.version)


@@ -4,49 +4,49 @@ import unittest

import git_config

def fixture(*paths):
  """Return a path relative to test/fixtures.
  """
  return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)

class GitConfigUnitTest(unittest.TestCase):
  """Tests the GitConfig class.
  """
  def setUp(self):
    """Create a GitConfig object using the test.gitconfig fixture.
    """
    config_fixture = fixture('test.gitconfig')
    self.config = git_config.GitConfig(config_fixture)

  def test_GetString_with_empty_config_values(self):
    """
    Test config entries with no value.
    [section]
        empty
    """
    val = self.config.GetString('section.empty')
    self.assertEqual(val, None)

  def test_GetString_with_true_value(self):
    """
    Test config entries with a string value.
    [section]
        nonempty = true
    """
    val = self.config.GetString('section.nonempty')
    self.assertEqual(val, 'true')

  def test_GetString_from_missing_file(self):
    """
    Test missing config file
    """
    config_fixture = fixture('not.present.gitconfig')
    config = git_config.GitConfig(config_fixture)
    val = config.GetString('empty')
    self.assertEqual(val, None)

if __name__ == '__main__':
  unittest.main()


@@ -13,6 +13,7 @@

# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import print_function
import sys
import os

REPO_TRACE = 'REPO_TRACE'

@@ -31,4 +32,4 @@ def SetTrace():

def Trace(fmt, *args):
  if IsTrace():
    print(fmt % args, file=sys.stderr)
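A minimal usage sketch of the tracing helper above. It assumes the module is importable as repo's own trace module (the import path and the REPO_TRACE=1 convention are assumptions based on the names shown, not confirmed by this hunk), and that SetTrace() forces tracing on.

# Illustrative only: emit trace output through this module.  Tracing is
# presumably enabled either by `export REPO_TRACE=1` before running repo,
# or programmatically via SetTrace().
from trace import SetTrace, Trace

SetTrace()                        # force tracing on for this example
Trace('git %s', 'fetch --all')    # written to stderr only when tracing is on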