Compare commits

...

299 Commits

Author SHA1 Message Date
16889ba43d Revert "Repo: fall back to http, if ssh connection fails for http repos"
This reverts commit 488bf092d5.

Issue 230

Change-Id: I3a5725301f576e1a2ac499cb6daa631895115640
2016-09-22 16:40:27 +00:00
40d3952270 Merge "On project cleanup, don't remove nested projects" 2016-09-21 22:23:44 +00:00
4350791e0d On project cleanup, don't remove nested projects
When there are nested projects in a manifest, like on AOSP right now:

<project path="build" name="platform/build" />
<project path="build/blueprint" name="platform/build/blueprint" />
<project path="build/kati" name="platform/build/kati" />
<project path="build/soong" name="platform/build/soong" />

and the top "build" project is removed (or renamed to remove the
nesting), repo just wipes away everything under build/ and re-creates
the projects that are still there. But it only checks whether the
build/ project itself is dirty, so if there are dirty files in a nested
project, they'll just be blown away and a fresh worktree checked out.

Instead, behave similarly to how `git clean -dxf` behaves and preserve
any subdirectories that have git repositories in them. This isn't as
strict as git -- it does not check to see if the '.git' entry is a
readable gitdir, just whether an entry named '.git' exists.

If any errors are encountered while removing files, we'll print them all
to stderr and tell the user that we were unable to clean up the obsolete
project, that they should clean it up manually, and then sync again.

Change-Id: I2f6a7dd205a8e0b7590ca5369e9b0ba21d5a6f77
2016-09-20 17:16:12 -07:00
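
A minimal sketch of the cleanup behaviour described above: walk the obsolete
project's tree, skip any subdirectory containing a '.git' entry, and collect
removal errors for reporting (function and variable names here are
illustrative, not repo's actual implementation):

  import os

  def clean_obsolete_path(path):
    failures = []
    for root, dirs, files in os.walk(path, topdown=True):
      # Prune descent into any subdirectory that has a '.git' entry; as in
      # the commit above, only the presence of the entry is checked.
      dirs[:] = [d for d in dirs
                 if not os.path.lexists(os.path.join(root, d, '.git'))]
      for name in files:
        try:
          os.remove(os.path.join(root, name))
        except OSError as e:
          failures.append(e)
    return failures
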
d648045366 implement optional 'pushurl' in the manifest file
Allow the 'remote' element in the manifest file to define an optional
'pushurl' attribute which is passed into the .git/config file.

Change-Id: If342d299d371374aedc4440645798888869c9714
Signed-off-by: Steve Rae <steve.rae@raedomain.com>
2016-09-20 15:31:20 +00:00
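
For example, a remote could fetch anonymously over https but push over ssh
(URLs and names below are placeholders, not taken from any real manifest):

  <remote name="origin"
          fetch="https://git.example.com/"
          pushurl="ssh://git@example.com/" />
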
628456833a Merge "Repo: improve error detection for new ssh connections" 2016-09-20 08:06:12 +00:00
2aa61d0bc8 Merge "Repo: fall back to http, if ssh connection fails for http repos" 2016-09-20 08:05:57 +00:00
4aed6f8c7d Merge "Replace pylint with pyflakes/flake8" 2016-09-20 06:57:16 +00:00
01b7d758d5 Merge "When syncing a project with a shared object store, disable automatic pruning." 2016-09-14 20:48:24 +00:00
267ac57361 Merge "repo: add comment for updating maintainer keys" 2016-09-14 14:06:42 +00:00
bb5b1a076b Replace pylint with pyflakes/flake8
pylint reports a lot of warnings, but many of them are false positives,
and it's difficult to configure. It also seems that, for some reason,
the included config file is not working well with the latest version.

Update the documentation to recommend using pyflakes and flake8 instead
of pylint. Remove the pylint config and add a basic flake8 config with
minimum settings:

- Maximum line length 80 columns
- Ignore warnings about indentation (repo uses 2 spaces rather than the expected 4)
- Ignore warnings about import placement

In this commit no code cleanup is done, and it's expected that most of
the files will throw up quite a few warnings, at least for flake8. These
can be cleaned up in follow-up commits.

The existing pylint suppression comments are left as-is. These will be
helpful when cleaning up pyflakes warnings later.

Change-Id: I2f7cb4340266ed07cc973ca6483b8f09d66a765b
2016-09-14 09:49:02 +02:00
e01ee026e6 Merge "Fix submodule checkout error when using sync-s option" 2016-09-14 07:45:50 +00:00
e4433653db repo: add comment for updating maintainer keys
Change-Id: Ic1e7557f9597234033561ab9fb3104b87e30015e
2016-09-14 01:28:30 -04:00
d9de945d8a Merge "upload: short circuit when nothing is pending" 2016-09-14 05:20:36 +00:00
2ff302929c When syncing a project with a shared object store, disable automatic pruning.
The shared object stores confuse git and make it throw away objects which are
still in use. We'll avoid that problem by disabling automatic pruning on those
projects, but there's nothing preventing a user from changing the config back
or pruning a repository manually.

BUG=chromium:375945
TEST=Ran repo sync on fresh ChromeOS checkout, starting with a branch of repo
with this change. Verified that the kernel projects and no others were
identified as having shared object stores, and that repo successfully disabled
automatic pruning in their configs. Re-enabled pruning and ran repo sync just
on one of the kernel directories. Verified that pruning was re-disabled as a
result.

Change-Id: I728ed5b06f0087aeb5a23ba8f5410a7cd10af5b0
2016-09-14 00:19:44 -04:00
e5c0ea0a95 Consider local project to be default for 'repo start'
The requirement to explicitly specify the local project when starting
a new repo branch is somewhat counterintuitive.

This patch uses the current directory's git tree as the default
project.

Tested by running

  'repo start <name>'

and observed that the result is the same as if running

  'repo start <name> .'

Change-Id: If106caa801b4cd5ba70dbe8354a227d59f100aa3
2016-09-14 00:17:45 -04:00
163a3be18b upload: short circuit when nothing is pending
When nothing is pending, most of this code is already short-circuited.
Hoist the single check up to make this more obvious/slightly faster.

Change-Id: Iec3a7e08eacd23a7c5f964900d5776bf5252c804
2016-09-14 00:16:37 -04:00
7a77c16d37 Update mailmap
Order the entries alphabetically and add a couple more.

Change-Id: I8d98e8a5a1dd6868b566a42428030d2040b8af7a
2016-09-02 11:12:28 +09:00
488bf092d5 Repo: fall back to http, if ssh connection fails for http repos
If a gerrit server has ssh and https access enabled, but access for some
users is limited to https, the 'repo upload' command will fail for them.

Gerrit returns an ssh configuration (gerrit/ssh_info) that does not work
for users limited to https.

With this patch repo tests whether the ssh configuration returned from
gerrit/ssh_info is working. If not, it falls back to https for upload.

Change-Id: If98f472e994f350bf71f35610cd649b163f1ab33
2016-08-30 07:35:03 +00:00
05dc46b0e3 Repo: improve error detection for new ssh connections
This check can only detect errors that happen within 1 second of launching
ssh, but that is typically enough to catch configuration issues like
'connection refused' or 'authentication failed'.

Change-Id: I00b6f62d4c2889b1faa6c820e49a198554c92795
2016-08-30 07:34:33 +00:00
39252ba028 gitc: Lower concurrent ls-projects requests
Too many requests at the same time were causing 502 errors.

Change-Id: Ic8fbb2fbb7fb6014733fa5be018d2dc02472f704
2016-08-23 14:19:00 -07:00
71e4cea6de RepoHook: do not list options twice during hash based approval
Instead of

  Do you want to allow this script to run (yes/yes-never-ask-again/NO)? (yes/always/NO)?

ask

  Do you want to allow this script to run (yes/always/NO)?

Change-Id: I5f5a2d0e88086a8d85e54fb8623a62d74a20956a
Signed-off-by: Jonathan Nieder <jrn@google.com>
2016-08-18 17:06:26 -07:00
c4c2b066d1 Increment the wrapper version
There have been a number of changes in the repo wrapper since the last
increment that was done in fee390ee:

- 9711a98 init: Add --no-clone-bundle option
- 631d0ec Support non-ASCII GNUPGHOME environment variable
- 4088eb4 repo: Cleaned up pylint/pep8 violations
- 5553628 repo: Add check of REPO_URL env variable
- 745b4ad Fix gitc-init behavior
- d3ddcdb Ignore clone.bundle on HTTP 501, i.e. Not Implemented

Change-Id: I3f763ef0ec2df2d726dff429021b48ad474148f1
2016-08-17 13:58:17 +09:00
6a0a3648f1 Merge "init: Add --no-clone-bundle option" 2016-08-17 04:57:28 +00:00
6118faa118 Merge "init: Respect --quiet option when synching manifest repository" 2016-08-17 04:04:14 +00:00
183c52ab02 Merge "project: Set config option to skip lfs smudge filter" 2016-08-17 01:00:12 +00:00
58f85f9a30 Merge "RepoHook: allow users to approve hooks via manifests" 2016-08-16 18:05:23 +00:00
40252c20f7 RepoHook: allow users to approve hooks via manifests
The constant prompting when registered hooks change can be tedious and
has a large multiplication factor when the project is large (e.g. the
AOSP).  It gets worse as people want to write more checks, hooks, docs,
and tests (or fix bugs), but every CL that goes in will trigger a new
prompt to approve.

Let's tweak our trust model when it comes to hooks.  Since people start
off by calling `repo init` with a URL to a manifest, and that manifest
defines all the hooks, anchor trust in that.  This requires that we get
the manifest over a trusted link (e.g. https or ssh) so that it can't
be MITM-ed.  If the user chooses to use an untrusted link (e.g. git or
http), then we'll fall back to the existing hash-based approval.

Bug: Issue 226
Change-Id: I77be9e4397383f264fcdaefb582e345ea4069a13
2016-08-16 13:02:52 -04:00
76a4a9df86 project: Set config option to skip lfs smudge filter
During sync, repo runs `git read-tree --reset -u -v HEAD` which causes
git-lfs's smudge filter to run. However this fails because git-lfs does
not work with bare repositories.

Add lfs.filter configuration to the project config as suggested in the
comments on the upstream git-lfs client issue [1]. This prevents the
smudge filter from running, and the sync completes successfully.

For any projects that have LFS objects, `git lfs pull` must be executed.

[1] https://github.com/github/git-lfs/issues/1422

Bug: Issue 224
Change-Id: I091ff37998131e2e6bbc59aa37ee352fe12d7fcd
2016-08-16 21:55:36 +09:00
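
The effect is roughly equivalent to setting the skip variant of the smudge
filter in the project's git config; the exact key and value below are an
assumption for illustration, not quoted from the change:

  git config filter.lfs.smudge "git-lfs smudge --skip -- %f"
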
befaec1e56 improve docs
Change-Id: Ide4008f09c2f17f8fb3d85dfffe94544abfdd6a6
2016-08-16 00:14:28 -04:00
9711a98d6c init: Add --no-clone-bundle option
Bug: Issue 218
Change-Id: I42ba1f5fb9168875da0df6bdf4fe44c8d6498d54
2016-08-15 09:51:48 +09:00
438eade413 init: Respect --quiet option when synching manifest repository
Change-Id: Ib58b7dd971670e0888e6428333050700e776b0de
2016-08-15 09:51:48 +09:00
69297c1b77 Merge "Support non-ASCII GNUPGHOME environment variable" 2016-08-15 00:51:32 +00:00
8016f60a46 Merge "repo: Repo does not always handle '.' parameter correctly" 2016-08-14 08:50:28 +00:00
631d0ec708 Support non-ASCII GNUPGHOME environment variable
Here we don't need to encode this gpg_dir string when using
Python 2.7 on Linux.

Change-Id: I56724e9511d3b1aea61535e654a45c212130630d
2016-07-16 22:10:06 +03:00
f97e72e5dd Bail out when manifest is referencing a bad SHA-1 revision.
BUG: Issue 222
Change-Id: Ie0a64b39922d6fdf1be2989eb514985be8490278
2016-06-29 11:01:43 -07:00
faaddc9b4e Merge "Adds additional crlf clobber avoidance." 2016-06-29 05:42:20 +00:00
a36af0767b Merge "Fix variable assignment" 2016-06-29 01:43:41 +00:00
037040f73e Fix variable assignment
If the upstream string is empty, the current_branch_only variable will be
assigned an empty string.

This is not what we expect here, as this variable is a boolean.

Change-Id: Ibba935e25a74c2be1e50c88b4b403cf394ba365e
2016-06-28 12:31:25 +02:00
2598ed06f1 Fix submodule checkout error when using sync-s option
When the sync-s="true" option is used, the checkout of a submodule will try
to use the revision attribute of the parent project.

If this revision is a named reference, the checkout will fail if there
is no reference with this name in the submodule.

The proposed solution is to use the git commit id as revisionExpr for
submodules.

Change-Id: Ie8390a11957fd6a9c61289c6861d13cb3fa11678
2016-06-27 20:16:45 +02:00
01952e6634 Adds additional crlf clobber avoidance.
Adds the hook scripts to .gitattributes because the shell scripts do not
tolerate CRLF line endings, which they will get if a user sets
'autocrlf = true' in their global gitconfig.

Further, since the Python interpreter can handle either CRLF or LF, the
Python-script-specific line-ending rules have been removed.

Change-Id: I2d6bfd491b2f626b9ca93c40a3a7f2cfba6c54f0
2016-06-22 08:36:45 +00:00
9d2b14d2ec pylint: Fix unused-{argument,variable} warning
This commit fixes 4 out of the remaining 5 pylint warnings:

$ pylint --rcfile=.pylintrc *.py
************* Module gitc_utils
W:146, 0: TODO(sbasi/jorg): Come up with a solution to remove the sleep below. (fixme)
W:130, 6: Unused variable 'name' (unused-variable)
************* Module main
W:382,32: Unused argument 'fp' (unused-argument)
W:382,36: Unused argument 'code' (unused-argument)
W:382,42: Unused argument 'msg' (unused-argument)

Change-Id: Ie3d77b9a65b7daefaa9aa4b80f4b682c1678fd58
Signed-off-by: Stefan Beller <sbeller@google.com>
2016-06-21 11:48:57 -07:00
6685106306 pylint: fix indentation in manifest_xml
This fixes pylint warning:

************* Module manifest_xml
W:975, 0: Bad indentation. Found 8 spaces, expected 6 (bad-indentation)

Change-Id: I967212f9439430351836ebdc27e442d7b77476e2
Signed-off-by: Stefan Beller <sbeller@google.com>
2016-06-17 16:45:48 -07:00
d64e8eee51 pylint: ignore bad-whitespace
This ignores whitespace errors, which we have quite a few of in argument
lists, for example:

************* Module git_config
C:209, 0: No space allowed around keyword argument assignment
  def HasSection(self, section, subsection = ''):
                                           ^ (bad-whitespace)
C:320, 0: No space allowed around keyword argument assignment
                   capture_stdout = True,
                                  ^ (bad-whitespace)
C:321, 0: No space allowed around keyword argument assignment
                   capture_stderr = True)
                                  ^ (bad-whitespace)
C:427, 0: Exactly one space required after comma
                     '-o','ControlPath %s' % ssh_sock(),
                         ^ (bad-whitespace)
C:436, 0: Exactly one space required after comma
    check_command = command_base + ['-O','check']
                                        ^ (bad-whitespace)
C:464, 0: Exactly one space required after comma
             % (host,port, str(e)), file=sys.stderr)
                    ^ (bad-whitespace)
C:707, 0: No space allowed around keyword argument assignment
    return self._config.GetString(key, all_keys = all_keys)
                                                ^ (bad-whitespace)
C:759, 0: No space allowed around keyword argument assignment
    return self._config.GetString(key, all_keys = all_keys)
                                                ^ (bad-whitespace)

Change-Id: Ia8f154f6741ce609787551f65877d7584c457903
Signed-off-by: Stefan Beller <sbeller@google.com>
2016-06-17 16:37:24 -07:00
8b39fb4bc0 Merge "Fix XmlManifest.Save with remotes that have 'alias' set" 2016-04-22 18:28:57 +00:00
96c2d65489 Fix XmlManifest.Save with remotes that have 'alias' set
When the alias attribute is set for a remote, the RemoteSpec attached to
a Project only contains the alias name used by git, not the original
name used in the manifest. But that's not enough information to
reconstruct the manifest, so save off the original manifest name as
another RemoteSpec parameter, only used to write the manifest out.

Bug: Issue 181
Bug: Issue 219
Change-Id: Id7417dfd6ce5572e4e5fe14f22924fdf088ca4f3
2016-04-22 10:32:06 +09:00
7ecccf6225 diffmanifests: support custom git pretty format strings
Change-Id: I29f4f1351c421f393328514d145df1a96aed9ee2
2016-04-21 18:36:11 +00:00
cee5c77166 Add a .mailmap file
Change-Id: I3c7e68fae0f8c082b2e0fbfc26cfb7dda31f1d34
2016-04-18 19:08:23 +09:00
79fba68e40 sync: Update help text for --smart-sync to be more specific
The --smart-sync option should return the manifest for *the latest*
known good build.

Change-Id: I2f3216b5b9e1af2ea5f9c3bf1c025813a3b77581
2016-04-13 18:03:00 +09:00
e868841782 Improve documentation of manifest server RPC methods
Mention that the RPC endpoints are used when running repo
sync with the --smart-sync and --smart-tag options.

Change-Id: I4b0b82e8b714fe923a5b325a6135f0128bf636ff
2016-04-13 17:55:36 +09:00
f9fe3e14d2 repo: Repo does not always handle '.' parameter correctly
The repo script allows a manifest to specify '.' as the path of the
top-level directory, which co-locates the .git and .repo directories,
and places files from the git repository at the top-level:

  <project name="proj_name" path="." />
  <project name="sierra.other.git" path="other" />

Most commands work correctly with this setup. Some commands, however,
fail to find the project. For instance, 'repo sync' works, and 'repo sync .'
works in a sub-project ('other' in this case) but 'repo sync .' in the
top-level directory fails with the error:

error: project . not found

There are two reasons for this:

1. The self.worktree attribute of the Project object is not normalized,
so with a '.' for path its value would be '/my/project/root/.'. This is
fine when used as a path, since it's the same path as '/my/project/root',
but when used in a string comparison it fails. This commit applies
os.path.normpath() to that value before storing it.

2. The _GetProjectByPath method in command.py was not checking the path
against manifest.topdir, so even once it was normalized the project was
not found. This commit adds a check against manifest.topdir if the
loop drops out without finding a project.

Change-Id: Ic84d053f1bbb5a357cad566805d5a326ae8246d2
2016-04-08 00:07:52 +00:00
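
A small illustration of point 1 (the path is a placeholder):

  import os.path
  print(os.path.normpath('/my/project/root/.'))   # prints '/my/project/root'
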
bdb866ea76 Fix symlinking of new projects
We weren't copying these lists, so the += was actually changing the
underlying lists.

When a new project was added to the manifest, we run _CheckDirReference
against the manifest project with share_refs=True, which added the
working_tree_* to the shareable_* lists. Then, when we load the new
manifest and create the new project, it uses the lists that already
contain the working_tree_* files, even though we passed
share_refs=False.

This happens reliably under the above conditions, but doesn't seem to
happen when syncing a fresh tree. So we've got a mixture of links that
may need to be cleaned up later. This patch will just stop it from
happening in the future.

Change-Id: Ib7935bfad78af1e494a75e55134ec829f13c2a41
2016-04-05 17:44:09 -07:00
e121ad558d Merge "Ignore clone.bundle on HTTP 501, i.e. Not Implemented" 2016-04-05 21:39:29 +00:00
1f0564406b Add --inverse-regex option to forall subcommand
Make it possible to exclude projects using regex/wildcard.

The syntax is similar to that of the -r option, e.g.:

  repo forall -i ^platform/ ^device/ -c 'echo $REPO_PROJECT'

Change-Id: Id250de5665152228c044c79337d3ac15b5696484
2016-04-05 07:28:27 +00:00
936d6185eb Merge changes from topic 'pylint-pep8-cleanup'
* changes:
  command.py: Cleaned up pylint/pep8 violations
  project.py: Cleaned up pylint/pep8 violations
2016-03-15 00:31:12 +00:00
9322964d14 Update commit-msg hook to version from Gerrit 2.12.1
Change-Id: I31b74aba998f8e83f370a759218777f2557a8872
2016-03-14 10:08:33 +09:00
4aa4b211c6 RepoHook: set __file__ when running the hook
A common design pattern is to use __file__ to find the location of the
active python module to assist in output or loading of related assets.
The current hook system runs the pre-upload.py hook in a context without
that set, leading to runtime errors:

$ repo upload --cbr .
ERROR: Traceback (most recent call last):
  File ".../repo/project.py", line 481, in _ExecuteHook
    self._script_fullpath, 'exec'), context)
  File ".../repohooks/pre-upload.py", line 32, in <module>
    path = os.path.dirname(os.path.realpath(__file__))
NameError: name '__file__' is not defined

Define this variable in this context so code can safely use it.

Change-Id: If6331312445fa61d9351b59f83abcc1c99ae6748
2016-03-05 21:52:31 +00:00
8ccfa74d12 command.py: Cleaned up pylint/pep8 violations
I noticed when running pylint (as the SUBMITTING_PATCHES file directs)
that there were a few violations reported. This makes it difficult
to see violations I might have introduced. This commit corrects all
pylint violations in the command.py script.

This script now has a pylint score of 10.0.

Change-Id: Ibb35fa9af0e0b9b40e02ae043682b3af23286748
2016-03-02 09:05:45 -07:00
30b0f4e022 project.py: Cleaned up pylint/pep8 violations
I noticed when running pylint (as the SUBMITTING_PATCHES file directs)
that there were a number of violations reported. This makes it difficult
to see violations I might have introduced. This commit corrects all
pylint violations in the project.py script.

This script now has a pylint score of 10.0, and no violations reported
by pep8.

Change-Id: I1462fd84f5b6b4c0dc893052671373e7ffd838f1
2016-03-02 09:05:40 -07:00
203153e7bb Add rpc: to default protocol whitelist
Change-Id: I57e1c3d93c0ce56da9c487df65eb3d258e0260e8
2016-02-26 18:53:54 -08:00
4cfb6d7167 Better error display on forall
It was only displaying 'Project list error: GitError()'
without any useful info about the project or the error.

Change-Id: Iad66cbaa03cad1053b5ae9ecc90d7772aa42ac13
2016-02-18 01:29:54 +00:00
b29e61133e SUBMITTING_PATCHES: Expand instructions
This commit adds additional instructions on getting patches submitted,
based on my recent experience doing so.

Change-Id: I8e0d37d316214cc9a39383414773aad181f83f18
2016-02-17 02:45:08 +00:00
4088eb434b repo: Cleaned up pylint/pep8 violations
I noticed when running pylint (as the SUBMITTING_PATCHES file directs)
that there were a number of violations reported. This makes it difficult
to see violations I might have introduced. This commit corrects all
pylint violations in the repo script.

First I ran this to clean up the formatting:

 autopep8 --max-line-length=80 --indent-size 2 repo

Following that the following violations remained:

% pylint --rcfile=.pylintrc repo
************* Module repo
W:220,21: Redefining name 'init_optparse' from outer scope (line 156)
(redefined-outer-name)
W:482, 2: No exception type(s) specified (bare-except)
C:704, 0: Old-style class defined. (old-style-class)

For line 220, the parameter to _GitcInitOptions was renamed so as not to
mask the init_optparse global.

For line 482, a pylint directive was added to disable the bare-except
violation for just that line.

For line 704, the _Options class was changed to subclass object.

Additionally, the comments at lines 107-113 were spaced out to line up
with the comment at line 112 that autopep8 moved.

This script now has a pylint score of 10.0

Change-Id: I779b66eb6b061a195d3c4372b99dec1b6d2a214f
2016-02-15 10:29:02 -07:00
5553628601 repo: Add check of REPO_URL env variable
We want to be able to run repo on a system that is not connected to
the Internet and cannot access https://gerrit.googlesource.com. We
can put a clone of that repo there, but would prefer to use the
stable version of the repo script instead of a locally modified
version.

This commit adds a check for the REPO_URL environment variable. If
that is set and not empty, its value will be used for the REPO_URL
global in repo.  Otherwise the standard URL will be used.

Change-Id: I0616f5f81ef75f3463b73623b892cb5eed6bb7ba
2016-02-09 17:27:29 -07:00
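
A minimal sketch of the described check; the fallback shown is the standard
URL the launcher otherwise uses:

  import os
  REPO_URL = os.environ.get('REPO_URL') or 'https://gerrit.googlesource.com/git-repo'
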
5ed805a98e Merge "Fix typos for manifest dtd" 2016-02-04 22:58:20 +00:00
985ac6b946 Merge "Fix prune when bare git has detached head" 2016-02-04 22:44:20 +00:00
ecf0a6c92b Merge "GITC: Fix 'repo start <branch> <repo>/<subdir>'" 2016-02-04 22:36:05 +00:00
04197a5144 GITC: Fix 'repo start <branch> <repo>/<subdir>'
As soon as we wrote the gitc manifest, the folder for that repo became
empty, causing the next GetProjects lookup to fail. Reorder the
GetProjects calls so that they all happen while we still have the
repository contents available.

If you were already in a subdir, for cases like 'repo start <branch> .',
this would still fail, since the working directory would disappear out
from under you. That's fine most of the time, since we shouldn't be
doing operations based on the local directory, but git has a realpath
function that tries to restore CWD by chdir'ing back to it. So if the
working directory no longer exists, chdir to the topdir before
continuing.

Change-Id: Ibdf6cd37ff6e5a5f8338347c3919175491f7166f
2016-02-04 14:31:55 -08:00
0b4cb325c6 Add option to rebase onto project's manifest version
Some teams have a continuous build server that marks certain manifests
as green and safe to sync to.  Team members can then repo sync to that
particular manifest file and make sure they always sync to a green
build.  But if they have local changes and want to rebase, finding the
correct version to rebase onto is currently a manual process.  This
patch helps with that use case by automating the process of rebasing
onto the currently synced manifest version.

Change-Id: I847c9eb6addf7f84fd3f5594fbf8c0bcc103f9a5
2016-01-28 10:20:03 -08:00
1a799d14b7 Fix prune when bare git has detached head
We don't really use HEAD much in the bare git repositories, but there
have been reports of errors in git-symbolic-ref:

  symbolic-ref: fatal: Refusing to point HEAD outside of refs/

That happens when the bare git repo is in the detached head state. It's
possible that previous operations were killed while we were pruning
branches.

Use DetachHead instead of SetHead if we're restoring the repo into a
detached head state.

Change-Id: I9062e8957bc70367d3ded399685ac026fbb421fc
2015-12-15 14:22:40 -08:00
827e547d9e Fix typos for manifest dtd
Change-Id: If53721544eca570e2bcce4598cdc2670a679c681
2015-12-11 00:05:54 +00:00
e9becc079c Sync: Fix error exit code when both -n and -f are used
When repo sync is used with -f (--force-error) and a project fails to
sync, the sync will continue but then exit with an error status.

However if -n (--network-only) is also used, the exit code is 0, even
when a project failed.

Modify the logic to make sure the sync exits with the correct status.

Bug: Issue 214
Change-Id: I0b5d97a34642c5aa3743750ef14a42c9d5743c1d
2015-11-26 02:25:43 +00:00
466b8c4ea2 Set GIT_ALLOW_PROTOCOL to limit dangerous protocols
See git commit 33cfccbbf35a -- some protocols allow arbitrary command
execution as part of the URL. Instead of blindly allowing those,
whitelist the allowed URL protocols unless the user has already done so.

Bug: Issue 210
Change-Id: I6bd8e721aa5e3dab53ef28cfdc8fde33eb74ef76
2015-11-26 11:03:19 +09:00
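
A sketch of the approach: only provide a default whitelist when the user has
not already set one (the protocol list shown is illustrative, not the exact
default):

  import os
  os.environ.setdefault('GIT_ALLOW_PROTOCOL',
                        'http:https:ssh:git:file')
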
e1e0bd1f75 Check for broken links when updating linkfiles
If a linkfile is a broken link (destination does not exist), and it
needs to be updated, we didn't notice that it needed to be removed
first. Use lexists instead of exists to check for this condition.

Change-Id: I1f6a1f0193d3fd2b9f7a647836044997f6ab32eb
2015-11-18 16:51:51 -08:00
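
A sketch of the check (names are illustrative):

  import os

  def remove_stale_link(dest):
    # A broken symlink fails os.path.exists() but passes os.path.lexists(),
    # so lexists is what notices that the old link must be removed first.
    if os.path.lexists(dest) and not os.path.exists(dest):
      os.remove(dest)
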
74cfd2709b Sync: Add option to prune refs during sync
By passing --prune to the sync command, the --prune option is
given to `git fetch`, causing refs that no longer exist on
the remote to be removed.

Change-Id: I3cedacce14276d96ac2d5aabf2d07fd05e92bc02
2015-10-27 03:04:17 +00:00
c2a64ddffd A couple of fixes to the init command's -p option.
Adds windows as one of the allowed platform flags.
Fixes -p foo to append 'platform-foo', instead of each letter (list.extend
expects a list and thus appends each char in the string, rather than the
string itself).

Change-Id: I73a92127ac29a32fc31b335cc54a246302904140
2015-10-22 13:28:20 -07:00
745b4ad660 Fix gitc-init behavior
With gitc-init, a gitc client may be specified using '-c'. If we're
not currently in that client, we need to change directories so that
we don't affect the local checkout, and to ensure that repo is
checked out in the new client.

This also makes '-c' optional if already in a gitc client, to match
the rest of the init options.

Change-Id: Ib514ad9fd101698060ae89bb035499800897e9bd
2015-10-07 15:43:22 -07:00
4c5f74e452 Sync: Add HTTP Cookie File header on temporary cookie file
The .gitcookies file generated by googlesource.com does not have
the header:

 # (Netscape) HTTP Cookie File

which causes python's MozillaCookieJar.load to fail with the
error:

 "does not look like a Netscape format cookies file"

Prepend the expected header onto the generated cookie file.

We don't bother to check if the header already exists on the
file; repeating it does not cause any problem.

Bug: Issue 207
Change-Id: I7d39720a1d36a6aae00f70691156514ebc04e579
2015-10-02 11:12:05 +09:00
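
A sketch of the workaround using the standard library (paths and error
handling are simplified; repo itself targeted Python 2, so the module names
differ there):

  import http.cookiejar
  import shutil
  import tempfile

  def load_gitcookies(path):
    # Prepend the magic header that MozillaCookieJar.load() insists on.
    with tempfile.NamedTemporaryFile('w', delete=False) as tmp:
      tmp.write('# HTTP Cookie File\n')
      with open(path) as src:
        shutil.copyfileobj(src, tmp)
    jar = http.cookiejar.MozillaCookieJar(tmp.name)
    jar.load()
    return jar
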
b1ad2190a2 Sync: Don't fail when git cookies can't be loaded
If the git cookies file fails to load, use a default
cookie jar instead.

Bug: Issue 207
Change-Id: I7cb326c204f2784ab4dbd13801b3186667af5b78
2015-10-02 11:04:01 +09:00
f231db11a2 GITC: Add repo gitc-delete command.
repo gitc-delete deletes a GITC client and all the locally
saved sources. Useful for removing unnecessary clients and
recovering disk space.

Change-Id: Idf23addcea52b8713d268c34a7b37da0c5e5cd26
2015-10-01 21:05:17 +00:00
79360640f4 Add GitcClientCommand class for GITC-specific commands
These won't show up as common commands in the help text unless in a GITC
client, and will refuse to execute.

Change-Id: Iffe82adcc9d6ddde9cb4b204f83ff018042bdab0
2015-09-29 13:46:34 -07:00
7b01b2fd01 Merge "launcher: Update repo after applying clone.bundle" 2015-09-14 23:40:05 +00:00
aad84232ca Merge "docs: add copyfile and linkfile elements description" 2015-09-14 10:39:21 +00:00
3c03580607 fixed typo in gitc_init.py help output
Change-Id: I86459bf63297487457d6c4c995dfd1e63133ec53
2015-09-11 14:11:30 -04:00
54527e7e30 docs: add copyfile and linkfile elements description
The "copyfile" element is available since 2009 and
have been used in every Android manifest; the "linkfile"
element is available since 2014.
Now it's a good time to add both to the documentation

Change-Id: Ia987edf5f69a006235fbd3f33b744e9794a6d964
Signed-off-by: Ruslan Bilovol <ruslan.bilovol@gmail.com>
2015-09-10 09:43:19 +00:00
5ea32d1359 GITC: Always update the gitc manifest from the repo manifest
This way any changes made to the main manifest are reflected in the gitc
manifest. It's also necessary to use both manifests to sync since the
information required to update the gitc manifest is actually in the repo
manifest.

This also fixes a few issues that came up when testing. notdefault
groups weren't being saved to the gitc manifest in a method that matched
'sync'. The merge branch wasn't always being set to the correct value
either.

Change-Id: I435235cb5622a048ffad0059affd32ecf71f1f5b
2015-09-09 20:50:40 -07:00
5cc384034d Merge "Revert "GITC: Always update the gitc manifest from the repo manifest"" 2015-09-09 21:44:11 +00:00
0375523331 Revert "GITC: Always update the gitc manifest from the repo manifest"
This reverts commit 250303b437.

Change-Id: I1fd8af20f802553151aacb953c913f3305ca6057
2015-09-09 21:43:32 +00:00
c32ba1961e Merge "_CopyAndLinkFiles even if the sources haven't changed" 2015-09-09 20:47:19 +00:00
250303b437 GITC: Always update the gitc manifest from the repo manifest
This way any changes made to the main manifest are reflected in the gitc
manifest. It's also necessary to use both manifests to sync since the
information required to update the gitc manifest is actually in the repo
manifest.

This also fixes a few issues that came up when testing. notdefault
groups weren't being saved to the gitc manifest in a method that matched
'sync'. The merge branch wasn't always being set to the correct value
either.

Change-Id: I5dbc850dd73a9fbd10ab2470ae4c40e46ff894de
2015-09-09 12:35:56 -07:00
029eaf3bac _CopyAndLinkFiles even if the sources haven't changed
The source or destination attributes may have changed even if the source
didn't, so we need to make sure that these are up to date.

Change-Id: I266ef3598ddda7e8c23bc9c6a049905ddc586348
2015-09-03 12:54:06 -07:00
ba72d8301e GITC: Fix repo sync.
Fixing http://b/23785024 by calling os.getcwd() because the variable
cwd no longer exists.

Change-Id: I21ff7d059e072f9f60726db76b67587a92c878ad
2015-09-03 10:47:44 -07:00
fee390eea2 launcher: Update repo after applying clone.bundle
If the clone.bundle is out of date, repo may be installed with an old
version. It will upgrade with the next sync a day later, or when "repo
selfupdate" is run.

This behavior was added to normal project downloads, but was never added
to the repo launcher.

Change-Id: Ib04bef3a658c98fe1b6c53b3e8d0067165a5e3f7
2015-09-02 12:45:19 -07:00
9ff2ece6ab gitc: Improve help visibility
This improves the visibility of gitc-init if we can get the gitc config,
and hides it otherwise.

Change-Id: I82830b0b07c311e8c74397ba79eb4c361f8b6fb5
2015-09-01 12:23:56 -07:00
2487cb7b2c Fix gitc check if gitc isn't installed
This was doing cwd.startswith(''), which is always true.

Change-Id: Icc059c09492b31e2d7651e4a595bda783c5abc47
2015-08-31 15:59:54 -07:00
8ce5041596 GITC: Pull GITC Manifest Dir from the config.
Updates the repo launcher and gitc_utils to pull the manifest
directory location out of the gitc config file.

Change-Id: Id08381b8a7d61962093d5cddcb3ff6afbb13004b
2015-08-31 21:39:17 +00:00
f7a51898d3 GITC: Expand relative remote URLs.
The GITC filesystem does not understand relative URLs for remotes,
so now if a remote uses a relative URL, it will be expanded to
be relative to the manifest URL.

Change-Id: Ie1210758560aeb1934da3f71496aaf19c2728214
2015-08-28 18:09:05 +00:00
b9a1b73425 GITC: Add repo start support.
Add repo start support for GITC checkouts. If the user is in
the GITC FS view, they can now run repo start to check out
the sources and create a new working branch.

When "repo start" is called on a GITC project, the revision
tag is set to an empty string and saved in a new tag:
old-revision. This tells the GITC filesystem to display the
local copy of the sources when being viewed. The local copy
is created by pulling the project sources and the new branch
is created based off the original project revision.

Updated main.py to set up each command's gitc_manifest when
appropriate.

Updated repo sync's logic to sync opened projects and update
the GITC manifest file for the rest.

Change-Id: I7e4809d1c4fc43c69b26f2f1bebe45aab0cae628
2015-08-28 10:53:05 -07:00
dc2545cad6 project.py: Improve message shown when hook is not replaced
If a hook file has been modified locally, it will not be replaced.

Improve the message to make this clearer.

Also change it from an error to a warning.

Change-Id: I62c635390f24d2868db17717c247861b0381c99f
2015-08-25 05:40:46 +00:00
f33929d014 project.py: Consistently use the _error method to print error messages
Use the _error method instead of directly calling `print`.

Also add a new _warn convenience method.

Change-Id: Ia332c14ef8d9d1fe2df128dbf36b5521802ccdf1
2015-08-25 14:39:06 +09:00
3010e5ba64 Smartsync: Don't fail if there isn't a cookiefile
Change-Id: I434a259f43ca9808e88051ac8ba865c519a24702
2015-08-20 10:29:37 -07:00
ba7bc738c1 Sync: Refactor netrc parsing
Don't emit a message when the netrc file doesn't exist or couldn't
be opened.

Instead of trying to unpack the result of info.authenticators() and
catching the resulting TypeError when it's None, first store it to
a local and only unpack it if it has a value.

Also remove an unused import.

Change-Id: I5c404d91e48c261c1ab850c3e5f040c4f4c235cb
2015-08-20 17:05:17 +09:00
f4599a2a3d gitc_init: Remove unused import
Also add missing newline at the end of the file.

Change-Id: I206e6c4b033d223eb0ff5824ecbf6fd98c39c918
2015-08-20 16:45:39 +09:00
022a1d4e6e gitc_utils: Fix incorrect string format argument
Change-Id: Ibbac6e111833c8f5d93cb6cb4a10f8f2c4fd8e11
2015-08-20 16:41:04 +09:00
41d1baac31 gitc_utils: Remove unused variable
Change-Id: I569819675a99ff6c01fb57b23fed033c39d14d1f
2015-08-20 16:40:44 +09:00
46496d8761 gitc_utils: Remove unused import; add missing import
shutils is not used. Remove it.

sys is used, but not imported.  Import it.

Change-Id: I4ee582f33bbd451601097ef2f20bee23b54764cd
2015-08-20 16:37:09 +09:00
7c9263bce0 Merge "Support smart-sync through persistent-http[s]" 2015-08-19 23:36:35 +00:00
dab9e99f0f Merge "Fix formatting of message when retrying clone" 2015-08-19 17:54:50 +00:00
c5f15bf7c0 Merge "Include project path in --force-sync error message" 2015-08-19 17:53:28 +00:00
6d35d676db Merge "Copy clone-depth in repo manifest" 2015-08-19 17:50:15 +00:00
0745bb2657 Support smart-sync through persistent-http[s]
Use the same cookies and proxy that git traffic goes through for
persistent-http[s] to support authentication for smart-sync.

Change-Id: I20f4a281c259053a5a4fdbc48b1bca48e781c692
2015-08-19 10:22:11 -07:00
25857b8988 Fix formatting of message when retrying clone
Passing the force_sync variable into the string formatting results in
the message:

  "Retrying clone after deleting None"

or

  "Retrying clone after deleting True".

Pass the name of the git directory instead.

Also, move the print inside the if-block so it's only displayed
when the retry is actually going to be attempted.

Change-Id: I76d9ecc176cecee4ad512d13e9d1f6bd36aacbbb
2015-08-19 13:03:13 +00:00
bdb5271de3 GITC: Add repo sync support.
Add repo sync support for GITC checkouts. If the user is in the
GITC client directory they can still pull the sources as normal
if they pass in the --force-gitc argument. Otherwise the user
should call repo sync in the GITC view to update the user's
remote view. (This works because .repo in the GITC view will
link to .repo in the client config directory.)

Part of the support for this change is the refactoring of GITC
related code into gitc_utils.py.

Change-Id: I2636aaa50b450b6f091309db8dd0e8f4dbdad579
2015-08-18 11:59:10 -07:00
884092225d Copy clone-depth in repo manifest
This argument wasn't being copied, which caused syncs from generated
manifests to pull down too much of the git history.

Change-Id: I269bab788d4557267c081628b3f8c6aec7744e81
2015-08-17 15:30:27 -07:00
5d0c3a614e Merge "GITC: Add gitc-init subcommand to repo." 2015-08-12 23:25:20 +00:00
1efc2b4a01 GITC: Add gitc-init subcommand to repo.
Adds the new gitc-init command to set up a GITC client. Gitc-init
sets up the client directory and calls repo init within it. Once
the repo is initialized, it generates a GITC manifest file
by using git ls-remote on each project and retrieving the HEAD SHA
to use as the revision attribute.

Gitc-init inherits from repo init and accepts all of its options.

Change-Id: Icd7e47e90eab752a77de7c80ebc98cfe16bf6de3
2015-08-12 16:22:14 -07:00
d3ddcdbd8a Ignore clone.bundle on HTTP 501, i.e. Not Implemented
Change-Id: I03ee003d3bd5d0684a31bdf7961a55a511dfa0e2
2015-08-12 20:12:51 +02:00
2635c0e3b6 Merge "Fix shallow clone behavior" 2015-08-05 01:02:41 +00:00
43322283dc Merge "Support filtering by group on forall and list subcmd" 2015-08-05 01:01:02 +00:00
f9b7683a3b Include project path in --force-sync error message
For projects that have been cloned outside of the repo command (or
cloned a long time ago), commit abaa7f312f
introduced an error message to invite the user to use --force-sync.
However, due to the risk of data loss, it's useful to know which
project's git directory is being replaced before deciding whether or not
to provide --force-sync.

This change updates the exception's associated value to include the
project's relative path and explain to the user how they can resolve the
issue. A previous version of this commit used the project name. However,
for projects that have multiple work trees, the name can be ambiguous,
while the path clearly identifies which git directory will be replaced.

Change-Id: If717e66fda4d19accc0a8e889a91f4cd4ff14dff
2015-08-04 18:41:20 -04:00
eeab6860f1 Fix shallow clone behavior
The existing code here makes sure that switching clone-depth from on to
off actually causes the history to be fully restored. Unfortunately, it
does this by fetching the full history every time the fetch spec
changes. Switching between two clone-depth="1" branches will fetch far
more than the top commit.

Instead, when not using clone-depth, pass --depth=2147483647 to git
fetch so that it ensures that we have the entire history. That is
slightly less efficient, so limit it to only when there are shallow
objects in the project by checking for the existance of the 'shallow'
file.

Change-Id: Iee0cfc9c6992c208344b1d9123769992412db67b
2015-08-03 16:54:16 -07:00
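
A sketch of the resulting fetch arguments (variable names are illustrative):

  import os

  def depth_args(gitdir, clone_depth):
    if clone_depth:
      return ['--depth=%d' % clone_depth]
    # Only force a full fetch when shallow objects are actually present,
    # which git records in the 'shallow' file inside the git directory.
    if os.path.exists(os.path.join(gitdir, 'shallow')):
      return ['--depth=2147483647']
    return []
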
7e59de2bcc Include dest-branch attribute in the 'manifest' subcommand's output
Change-Id: If4227d02005fddea82d9e698a373222100d8f710
2015-07-31 17:36:28 -04:00
163fdbf2fd Merge "Fix _ReferenceGitDir symlinking" 2015-07-31 18:01:32 +00:00
555be54790 Merge "Emit project info in case of sync exception." 2015-07-31 17:06:23 +00:00
c5cd433daf Emit project info in case of sync exception.
Previously repo would only print the failing project path if
Sync_NetworkHalf returned false/empty, but if it threw an
exception the print() was never called.

Change-Id: I58c41de43930df5e34b21561c205e062a72e290f
2015-07-31 14:03:50 +00:00
2a3e15217a Fix _ReferenceGitDir symlinking
This fixes these errors:

  ...
  File ".repo/repo/project.py", line 2371, in _ReferenceGitDir
    os.symlink(os.path.relpath(src, os.path.dirname(dst)), dst)
  OSError: [Errno 17] File exists

This was happening for checkouts that were created before v1.12.8, when
project-objects was created. Nothing had yet been forcing these
checkouts to use project-objects, until the recent verification changes.

In this OSError case, we already created the symlink, so src == dst, and
the directory did not exist. This caused us to run os.makedirs and
os.symlink on the same file.

dst really should be the file in gitdir, not the target of that symlink
if it exists. So just use realpath for the dotgit portion of the path.

Change-Id: Iff5396a2093de91029c42cf38aa57131fd22981c
2015-07-30 21:29:53 -07:00
0369a069ad Support filtering by group on forall and list subcmd
Enable operating against groups of repositories. As it stands, it isn't
compatible with `-r/--regex`.

`repo forall -g groupname -c pwd` will run `pwd` for all projects in
groupname.

`repo forall -g thisgroup,-butnotthisone -c pwd` will run `pwd` for all
projects in `thisgroup` but not `butnotthisone`.

`repo list -g groupname -n` will list all the names of repos in
`groupname`.

Change-Id: Ia75c50ce52541d1c8cea2874b20a4db2e0e54960
2015-07-30 12:59:35 -05:00
abaa7f312f Add option to correct gitdir when syncing
In some cases, a user may wish to continue with a sync even though
it would require overwriting an existing git directory. This behavior
is not safe as a default because it could result in the loss of some
user data, but as an optional flag it allows the user more flexibility.

To support this, add a --force-sync flag to the sync command that will
attempt to overwrite the existing git dir if it is specified and the
existing git dir points to the wrong obj dir.

Change-Id: Ieddda8ad54e264a1eb4a9d54881dd6ebc8a03833
2015-07-29 14:44:46 -06:00
7cccfb2cf0 Merge "InitGitDir: Clean up created directories" 2015-07-29 18:49:12 +00:00
57f43f4944 Merge "Prevent repo info from crashing when default element doesn't exist." 2015-07-29 02:11:55 +00:00
17af578d72 Prevent repo info from crashing when default element doesn't exist.
repo info will crash when using a manifest with no default element despite
default being an optional element. Output nothing for "Manifest Branch" if no
default element exists (or if no default revision exists).

Change-Id: I7ebffa2408863837ba980f0ab6e593134400aea9
2015-07-27 16:56:31 -07:00
b1a07b8276 InitGitDir: Clean up created directories
If _InitGitDir fails, it leaves any progress it had made on the file
system. This can cause subsequent calls to repo sync to behave
differently. This is especially evident when _CheckDirReference() fails,
since it will not be invoked when sync is retried because both the
source and destination directories already exist.

To address this, have _InitGitDir() clean up any directories it has created
if it catches an exception. Also behave the same way for _InitWorkTree().

Change-Id: Ic16bb3feea649e115b59bd44be294e89e3692aeb
2015-07-27 13:33:43 -06:00
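
A sketch of the cleanup-on-failure pattern described above (names are
illustrative, not repo's actual internals):

  import os
  import shutil

  def init_git_dir(gitdir):
    existed = os.path.exists(gitdir)
    try:
      os.makedirs(gitdir)
      # ... write config, hooks, alternates, etc. ...
    except Exception:
      if not existed:
        shutil.rmtree(gitdir, ignore_errors=True)
      raise
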
4e16c24981 Revert "Add --prune option to fetch when syncing a mirror repo"
For some users it is not desirable to remove refs that don't exist
on the remote server when syncing a mirror repo.

This reverts commit b4d43b9f66.

Change-Id: Ie849b66682138ef88da6cd1a5fbb27e993197dd7
2015-07-20 22:31:04 +09:00
b3d6e67196 Merge "Fail if gitdir does not point to objdir during sync" 2015-07-15 19:30:41 +00:00
503d66d8af Merge "project.RemoteFetch: Handle depth cases more robustly" 2015-07-15 19:29:14 +00:00
679bac4bf3 project.RemoteFetch: Handle depth cases more robustly
The fetch logic for the case where depth is set and revision is a
SHA1 has several failure modes that are not handled well by the
current logic.

1) 'git fetch <SHA1>' requires git version >= 1.8.3
2) 'git fetch <SHA1>' can be prevented by a configuration option on the server.
3) 'git fetch --depth=<N> <refspec>' can fail to contain a SHA1 specified by
   the manifest.

Each of these cases causes infinite recursion when _RemoteFetch() tries to call
itself with current_branch_only=False because current_branch_only is set to
True when depth != None.

To try to prevent the infinite recursion, we set self.clone_depth to None
before the first retry of _RemoteFetch(). This will allow the Fetch to
eventually succeed in the case where clone-depth is specified in the manifest.
A user-specified depth from the init command will still recurse infinitely.

In addition, never try to fetch a SHA1 directly if the git version being used
is not at least 1.8.3.

Change-Id: I802fc17878c0929cfd63fff611633c1d3b54ecd3
2015-07-15 15:53:14 +00:00
97836cf09f Merge "Always output upstream if specified" 2015-07-13 16:36:28 +00:00
80e3a37ab5 Merge changes Iaefcbe14,I697a0f64,I19bfe9fe,I06e942c4
* changes:
  forall: use smart sync override manifest if it exists
  sync: Remove smart sync override manifest when not in smart sync mode
  forall: Don't try to get lrev of projects in mirror workspace
  sync: Improve error message when writing smart sync manifest fails
2015-07-11 14:01:16 +00:00
bb4a1b5274 Merge "Improve error message when syncing a project with invalid groups." 2015-07-10 22:00:47 +00:00
551dfecea9 Always output upstream if specified
Previously, in running the `manifest` command, we wouldn't output the
upstream if the default upstream would include the pinned sha1.
However, now that fetching refs/heads/* doesn't guarantee that we will
have the sha1, we need to always output the specified upstream branch.

Change-Id: Ib8b409a8ecd439397b38ee9649c530407797f841
2015-07-10 14:59:10 -07:00
6944cdb8d1 forall: use smart sync override manifest if it exists
If a workspace is synced with the -s or -t option, the included projects
may be different from those in the original manifest. However, when using
the forall command, the list of projects from the original manifest
is used.

If the smart sync manifest file exists, use it to override the original
manifest.

Change-Id: Iaefcbe148d2158ac046f158d98bbd8b5a5378ce7
2015-07-06 16:18:06 +09:00
59b417493e sync: Remove smart sync override manifest when not in smart sync mode
When syncing with the -s or -t option, a smart_sync_override.xml file
is created. This file is left in the file system when syncing again
without the -s or -t option.

Remove the smart sync override manifest, if it exists, when not using
the -s or -t option.

Change-Id: I697a0f6405205ba5f84a4d470becf7cd23c07b4b
2015-07-06 16:18:06 +09:00
30d13eea86 forall: Don't try to get lrev of projects in mirror workspace
git rev-parse fails for projects that don't have an explicit revision
specified, and don't have a branch of the same name as the default
revision. This can be the case in a workspace synced with the smart
sync (-s) or smart tag (-t) option.

Change-Id: I19bfe9fe7396170379415d85f10f6440dc6ea08f
2015-07-06 16:18:06 +09:00
727cc3e324 sync: Improve error message when writing smart sync manifest fails
The error message only states that writing the manifest failed.

Include the exception message, so it's easier to track down the reason
that the write failed.

Change-Id: I06e942c48a19521ba45292199519dd0a8bdb1de7
2015-07-06 16:18:06 +09:00
c5ceeb1625 Merge "Fix 'repo cherry-pick' to avoid hanging on commit-msg update." 2015-06-25 14:53:46 +00:00
db75704bfc Fix 'repo cherry-pick' to avoid hanging on commit-msg update.
After performing the actual cherry-pick operation, the code
in cherry_pick.py opens a pipe to 'git commit -F' to rewrite the commit
message, emits the fixed-up commit msg to the pipe, then waits
for 'git commit' to complete. The child 'git' process winds up
hanging while reading from the pipe, however, since the parent
process still has it open. To fix the hang, change the parent process
to close its end of the pipe after it has emitted the message.

Change-Id: I5929371e69a5b076f09009d00d40a2c72ac8ac33
2015-06-22 08:00:20 -04:00
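
A sketch of the fix using subprocess (the command mirrors the description
above; error handling is omitted):

  import subprocess

  def rewrite_commit_message(new_msg):
    p = subprocess.Popen(['git', 'commit', '-F', '-'],
                         stdin=subprocess.PIPE)
    p.stdin.write(new_msg.encode())
    p.stdin.close()   # without this, git blocks forever reading the pipe
    return p.wait()
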
87ea5913f2 Improve error message when syncing a project with invalid groups.
Change-Id: Iaf5c2a0f00667dc09bcf455cfe2f39bfbaa2bfc0
2015-06-19 15:55:15 -07:00
185307d1dd Merge "Teach _LinkFile._Link to handle globs." 2015-06-09 00:14:13 +00:00
c116f94261 forall: setenv, only encode val if encode exists
Change-Id: I655e3043d0118c4e929897d3a51e5e013e5758dc
2015-06-04 00:34:19 +00:00
7993f3cdda init: don't call urllib.parse
it's actually urllib.parse.urlparse

Change-Id: Ie3532e54625e887c8682d92b932ea21a629e8d60
2015-06-04 00:33:33 +00:00
b1d1fd778d git_config: fix _SaveJson typo
Change-Id: I35ca2b3733e6d1508669f9a6690c6645c582912e
2015-06-04 00:22:23 +00:00
be4456cf24 error: fix typos
Change-Id: I09c47024ef54c360ea3c15c5d4f169e13444e412
2015-06-04 00:21:16 +00:00
cf738ed4a1 git_command: only decode when needed
strings no longer need decoding, since unicode is str

Change-Id: I9516d298fee7ddc058452394b7759327fe3aa7a8
2015-06-03 16:50:39 +01:00
6cfc68e1e6 decode the buffer before appending
Output from a process is in bytes in Python 3; we need
to decode it.

In Python 3, bytes don't have an encode attribute; use this
to identify it.

Change-Id: I152f2ec34614131027db680ead98b53f9b321ed5
2015-06-03 16:39:32 +01:00
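
A sketch of the check described above:

  def to_str(chunk):
    # bytes has no 'encode' attribute in Python 3, so its absence identifies
    # process output that still needs decoding.
    if not hasattr(chunk, 'encode'):
      chunk = chunk.decode('utf-8')
    return chunk
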
4c426ef1d4 Teach _LinkFile._Link to handle globs.
This allows a project to use globs in the linkfile src attribute. When
a glob is used in the src, the dest field must be a directory. Then
_LinkFile._Link(self) calls will create symbolic links in the dest
directory to all of the entries in the src as defined by the glob
specification.

Below, all of the entries in master-configs/ will have symbolic links
in the <root dir>/configs directory:

  <project name="helloworld.git" path="apps/helloworld">
      <linkfile src="master-configs/*" dest="configs"/>
  </project>

Change-Id: Idfed8fa47c83d2ca6e2b8e867731b8e2f9e2eb47
2015-06-03 08:05:17 -07:00
472ce9f5fa Merge changes I32da12c2,Ie4a65b3e
* changes:
  Skip sleep and retry if git remote update exits with a signal
  Catch exceptions in project list generator
2015-06-02 00:14:43 +00:00
0184dcc510 Make linkfile symlinks relative
The source (target) of the symlink is specified relative to a project
within a tree, and the destination is specified relative to the top
of the tree, so it should always be possible to create a relative symlink
to the target file.  Relative symlinks will allow moving an entire tree
without breaking the symlink, and copying a tree (with -p) without leaving
a symlink to the old tree.

Change-Id: I16492a8b59a137d2abe43ca78e3b212e2c835599
2015-06-01 01:24:38 +00:00
c4b301f988 Skip sleep and retry if git remote update exits with a signal
Pressing ctrl-c during repo sync often hangs for 30 to 45 seconds
due to the time.sleep and retry in _RemoteFetch.  If git exits with
a signal, for example -2 for SIGINT triggered by ctrl-c, skip the
sleep and retry.

Change-Id: I32da12c2dcc96d9cc0b12a066e824b12ebfb52a0
2015-05-13 18:11:34 +00:00
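
A sketch of the behaviour: subprocess reports death-by-signal as a negative
return code, which is used to skip the sleep-and-retry (command handling is
simplified):

  import subprocess
  import time

  def fetch_with_retry(cmd, attempts=2, delay=30.0):
    for _ in range(attempts):
      ret = subprocess.call(cmd)
      if ret == 0:
        return True
      if ret < 0:        # killed by a signal, e.g. -2 for SIGINT
        return False     # don't sleep and retry
      time.sleep(delay)
    return False
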
31a7be561e Catch exceptions in project list generator
If the generator that produces per-project worker arguments raises an
exception it triggers python bug http://bugs.python.org/issue8296.
Rewrite the generator expression as a generator function, and catch
Exceptions and KeyboardInterrupts to end the iteration.

Also add a pool worker initializer to disable SIGINT to prevent
KeyboardInterrupts inside multiprocessing.Pool in the worker threads
causing the same problem.

Fixes easy-to-reproduce hangs when hitting ctrl-c during
repo forall -c echo

Change-Id: Ie4a65b3e1e07a64ed6bb6ff20f3912c4326718ca
2015-05-13 11:09:38 -07:00
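
A sketch of the two pieces described above: a generator function that ends
cleanly on errors, and a pool initializer that ignores SIGINT in workers
(names are illustrative):

  import signal
  from multiprocessing import Pool

  def _worker_init():
    # Let only the parent process handle ctrl-c.
    signal.signal(signal.SIGINT, signal.SIG_IGN)

  def _project_args(projects):
    try:
      for project in projects:
        yield (project,)
    except (Exception, KeyboardInterrupt):
      return  # end the iteration instead of tripping python issue 8296

  if __name__ == '__main__':
    pool = Pool(initializer=_worker_init)
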
384b3c5948 Fail if gitdir does not point to objdir during sync
There are a set of cases that can cause the git directory in
.repo/projects to point to a directory in .repo/project-objects that
is not the one specified in the manifest. This results in a tree that
is not sane, and so should cause a failure.

In order to reproduce the failure case:
1) Sync to any manifest
2) Change the 'name' of a project to a different repository. Leave the
   'path' the same.
3) Resync the modified project. The project-objects directory will not
   be created, and the projects directory will remain pointed at the old
   project-objects.

Change-Id: Ie6711b1c773508850c5c9f748a27ff72d65e2bf2
2015-05-12 09:15:53 -06:00
35de228f33 Merge "Don't attempt to create "fully qualified names" for SHA1s" 2015-05-11 09:20:54 +00:00
ace097c36e Merge "Add option on sync to avoid fetching from remotes for existing sha1" 2015-05-01 07:51:52 +00:00
b155354034 Add option on sync to avoid fetching from remotes for existing sha1
In 2fb6466f79 an optimisation was
added to avoid fetching from remotes if the project is fixed to
a revision and the revision is already available locally.

This causes problems for users who expect all objects to be
fetched by default.

Change the logic so that the optimized behaviour is only enabled if
an option is explicitly given to repo sync.

Change-Id: I3b2794ddd8e0071b1787e166463cd8347ca9e24f
2015-04-30 14:29:02 +00:00
382582728e Don't attempt to create "fully qualified names" for SHA1s
Doing so breaks "repo init -b <SHA1>".

Change-Id: Ic071a1b099a9125db22ea446d7e92e7854d69b37
2015-04-30 14:54:47 +02:00
b4d43b9f66 Add --prune option to fetch when syncing a mirror repo
When syncing a mirror repo, add the --prune option to the fetch
command to force removal of stale refs from the mirror.

Change-Id: I4b43b2a5c86b9915627887c16f6569066f3ab978
2015-04-30 10:32:37 +09:00
4ccad7554b Fix substitution err for schemeless manifest urls
Previously, we used a regex that would only remove a phony string from
a url if it existed, but we recently replaced that with a slice.  This
change goes back to the previous behavior.

Change-Id: I8baf527be01c4b49d45b903b31a1cd6315563d5b
2015-04-29 10:45:37 -07:00
403b64edf4 Don't append branch to fetch spec when syncing to a mirror
Appending the branch to the fetch spec causes sync of a mirror to
fail for projects that don't have an explicit revision specified,
and don't have a branch of the same name as the default revision.

For example, a manifest defining a default revision:

 <default revision="master">

having a project without an explicit revision:

 <project name="path/to/project">

and not having a branch named "master", will cause repo sync to
fail for that project with the error:

 Couldn't find remote ref refs/heads/master

Modify the logic to not append the branch onto the fetch spec when
syncing to a mirror.

Change-Id: I5c4457bd125519abf27abe682dea62ad708978c9
2015-04-27 10:56:27 +09:00
a38769cda8 Merge "forall: use a generator to map the Pool" 2015-04-08 17:59:58 +00:00
44859d0267 Merge "status: lose dependence on StringIO" 2015-04-08 17:58:35 +00:00
6ad6dbefe7 forall: use a generator to map the Pool
Before, a list was generated, which is why there was a massive delay.

Using a generator will allow processes to start straight away

Change-Id: Ia325b0b340cc328c08c9bcc92a6709bbdaf6a664
2015-04-08 13:22:34 +01:00
33fe4e99f9 Remove deprecated include-ids setting from pylint config
Change-Id: Ie5ab21e434d24ff862bb5e0c263761370d71f56f
2015-04-07 11:10:17 +09:00
4214585073 Merge "Pylint and PEP8 fixes for color.py" 2015-04-07 02:06:49 +00:00
b51f07cd06 status: lose dependence on StringIO
buflist was being used, which isn't available in Python 3.

`Execute` was using StringIO to capture the output of `PrintWorkTreeStatus`,
only to redirect it straight to stdout.
Instead, just let `PrintWorkTreeStatus` do its own thing directly to stdout.

For handling `_FindOrphans`, we swap StringIO for a list. Nothing was done
that needed a file-like object.

Change-Id: Ibdaae137904de66a5ffb590d84203ef0fe782d8b
2015-04-04 21:21:49 +01:00
04f2f0e186 Maintain fully qualified tracking branches
When running repo branch, the git merge line (in many circumstances)
is set to the revision of the project specified in the manifest.  If
this is a branch name that is not fully-qualified, we will end up with
something like "merge = master" instead of "merge = refs/heads/master".
This change examines the revision if we are going to use that and
changes branch short names to fully qualified branch names.

Change-Id: Ie1be94fb8d45df8eeac44a47f729a3819a05fa81
2015-04-01 17:43:36 +00:00
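
A sketch of the name expansion (hypothetical helper, not repo's code):

  def fully_qualified(branch):
    # 'master' -> 'refs/heads/master'; already-qualified refs pass through.
    return branch if branch.startswith('refs/') else 'refs/heads/' + branch
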
cb07ba7e3d Resolve fetch urls more efficiently
Instead of using regexes, append the custom scheme to the netloc and
relative scheme lists.
The scheme will only be appended when needed, instead
of running a series of regex replaces.

see http://bugs.python.org/issue18828 for more details.

Change-Id: I10d26d5ddc32e7ed04c5a412bdd6e13ec59eb70f
2015-03-31 20:12:44 +00:00
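[Editor's illustration] A sketch of the approach using the registration lists in Python's urllib.parse (urlparse in Python 2); once a custom scheme is appended there, urljoin resolves relative URLs for it without any regex rewriting:

    from urllib.parse import urljoin, uses_netloc, uses_relative

    def register_scheme(scheme):
      # Teach the urllib machinery that this scheme has a network location
      # and supports relative resolution.
      for lst in (uses_netloc, uses_relative):
        if scheme not in lst:
          lst.append(scheme)

    register_scheme('persistent-https')
    print(urljoin('persistent-https://host/manifests/', 'platform/build'))
    # -> persistent-https://host/manifests/platform/build
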
23ff7df6a7 use the max depth instead of unshallow
This allows the use of older versions of git

Change-Id: I88ea685066603af19896a791829355ddbfa91ffe
2015-03-30 21:54:26 +00:00
cc1b1a703d Revert "Change the min git version from 1.7.2 to 1.8.2"
This reverts commit 52b99aa91d.

Change-Id: I01d93704c92f7af1ca2b36dbc9509ee1290e2d3c
2015-03-30 21:53:25 +00:00
bdf7ed2301 Pylint and PEP8 fixes for color.py
Change-Id: I1a676e25957a7b5dd800d2585a2ec7fe75295668
2015-03-28 21:12:27 +00:00
9c76f67f13 Always capture output for GitCommand
Switch the GitCommand program to always capture the output for stdout
and stderr, and by default print the output while running.

The options capture_stdout and capture_stderr have effectively become
options to suppress the printing of stdout and stderr.

Update the 'git fetch' to use '--progress' so that the progress messages
will be displayed.  git checks if the output location isatty() and if it
is not a TTY it will by default not print the progress messages.

Change-Id: Ifdae138e008f80a59195f9f43c911a1a5210ec60
2015-03-26 11:43:55 -07:00
52b99aa91d Change the min git version from 1.7.2 to 1.8.2
This is needed for the --unshallow option of git fetch.

Change-Id: Ifdc5cec6130315c643924328fea425f1b94cb04a
2015-03-18 21:43:39 +00:00
9371979628 Revert "Implementation of manifest defined githooks"
This reverts commit 38e4387f8e.

A "repo init" followed by "repo sync" is meant to be as safe as
"git clone".  In particular it should not run arbitrary code provided
by the manifest owner.

It would still be nice to have support for manifest-defined git hooks
--- they'd just need a prompt like the upload RepoHook has.  Hopefully
a later change can bring them back.

Change-Id: I5ecd90fb5c2ed64f103d856d1ffcba38a47b062d
Signed-off-by: Jonathan Nieder <jrn@google.com>
2015-03-17 11:29:58 -07:00
2086004261 Merge "Don't exit with error on HTTP 401 when downloading clone bundle" 2015-03-11 17:25:45 +00:00
2338788050 Don't exit with error on HTTP 401 when downloading clone bundle
If the server returns HTTP 401 (unauthorized) when attempting to
download clone bundle files, ignore it and continue, rather than
exiting with a fatal error.

Change-Id: I2c7ee03e149c354c7e4ad6ea1ebf266534778fe1
2015-03-11 07:43:40 +00:00
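[Editor's illustration] A hedged sketch of the tolerant download described above, using urllib; the helper name is hypothetical:

    import urllib.error
    import urllib.request

    def try_clone_bundle(url):
      # Treat 401 as "no bundle available here" and fall back to a normal
      # git fetch, rather than aborting the whole sync.
      try:
        with urllib.request.urlopen(url) as resp:
          return resp.read()
      except urllib.error.HTTPError as e:
        if e.code == 401:
          return None
        raise
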
0402cd882a Add space between project path and branch in repo status.
Currently, paths longer than 39 chars have no space after them so it looks
like this:

project path/branch master

Change-Id: I4c1bb13648ac099ade8a8d4ebafa04131571f842
2015-03-11 07:42:17 +00:00
936183a492 git_config: add support for remote '.'
As a fix for issue #149, this patch adds support for the remote '.'
(local).

As an alias for the local repository, remote '.' is lacking a fetch =
config in .git/config.

Without such refspec, repo info --overview is not able to process a
local tracking branch.

v2: Check for name == '.' before checking if merge starts with refs/,
    since the case where it's not is invalid.

Signed-off-by: Yann Droneaud <ydroneaud@opteya.com>
Signed-off-by: Filipe Brandenburger <filbranden@google.com>

Change-Id: I8c8fd8602cd68baecb530301ae41d37d751ec85d
2015-03-06 13:23:27 -08:00
85e8267031 Merge "Implementation of manifest defined githooks" 2015-03-05 20:52:30 +00:00
e30f46b957 Print stderr output from git command for RemoteFetch
The stderr output generated by git during a RemoteFetch was not being
printed.  This information is useful so print it.

Change-Id: I6e6ce12c4a57e5ca2359f76ce14f2fcbbc37a5ef
2015-02-25 14:29:28 -08:00
e4978cfbe3 Ensure the repo project is never fetched with partial depth
If the repo project is synced with partial depth, then the tags
won't be fetched and users will be told the newest sha1 in the
stable branch isn't signed.

Change-Id: I107df97b4836b928c76aa33a700fa35d1705ae09
2015-02-10 14:44:05 -08:00
126e298214 Handle case where 'git remote prune' needs to be run
Handle the case when this error occurs:
    error: some local refs could not be updated; try running
     'git remote prune origin' to remove any old, conflicting branches

This is usually caused by a reference getting changed from a file to a
directory.

For example:
  Initially someone creates a branch 'foo' and it is stored as:
    .git/refs/remotes/origin/foo

  Then later on it is decided to change the layout structure where 'foo'
  is a directory with branches below it:
    .git/refs/remotes/origin/foo/master

  The problem occurs when someone still has
  '.git/refs/remotes/origin/foo' on their system and does a repo sync.
  When this occurs the error message for needing to do a
  'git remote prune origin' occurs.

Now when doing a 'git fetch' if the error message from git says that a
'git remote prune' is needed, it will do the prune and then retry the
fetch.

Change-Id: I4c6f5aa6bd932f0ef7a39134400bedd52e82f633
Signed-off-by: John L. Villalovos <john.l.villalovos@intel.com>
2015-02-03 13:49:51 -08:00
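[Editor's illustration] A simplified sketch of the retry logic (subprocess-based; repo's own GitCommand plumbing differs):

    import subprocess

    def fetch_with_prune_retry(remote='origin'):
      # If git reports that stale remote refs block the update, prune the
      # remote and retry the fetch once.
      proc = subprocess.run(['git', 'fetch', remote],
                            capture_output=True, text=True)
      if proc.returncode != 0 and 'git remote prune' in proc.stderr:
        subprocess.run(['git', 'remote', 'prune', remote], check=True)
        proc = subprocess.run(['git', 'fetch', remote],
                              capture_output=True, text=True)
      return proc.returncode == 0
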
38e4387f8e Implementation of manifest defined githooks
When working within a team or corporation it is often
useful/required to use predefined git templates. This
change teaches repo to use a per-remote git hook template
structure.

The implementation is done as a continuation of the
existing projecthook functionality. The terminology is
therefore defined as projecthooks.

The downloaded projecthooks are stored in the .repo
directory as a metaproject separating them from the users
project forest.

The projecthooks are downloaded and set up when doing a
repo init and updated for each new repo init.

When downloading a mirror the projecthooks gits are
not added to the bare forest since the intention is to
ensure that the latest are used (allows for company policy
enforcement).

The projecthooks are defined in the manifest file in the
remote element as a subnode, the name refers to the
project name on the server referred to in the remote.
<remote name="myremote" ..>
   <projecthook name="myprojecthookgit" revision="myrevision"/>
</remote>

The hooks found in the projecthook revision supersede
the stock hooks found in repo. This removes the need for
updating the projecthook gits for repo stock hook changes.

Change-Id: I6796b7b0342c1f83c35f4b3e46782581b069a561
Signed-off-by: Patrik Ryd <patrik.ryd@stericsson.com>
Signed-off-by: Ian Kumlien <ian.kumlien@gmail.com>
2015-02-03 16:01:15 +09:00
24245e0094 Merge "Add missing documentation of --current-branch option on sync command" 2015-01-31 12:44:45 +00:00
db6f1b0884 Merge "Use depth flag when fetching" 2015-01-30 19:36:06 +00:00
f2fad61bde Add missing documentation of --current-branch option on sync command
Change-Id: I72d6e3d51241148c1df97bbad26338debb1fcb4e
2015-01-29 14:36:28 +09:00
ee69084421 Merge "Handle shallow checkout of SHA1 pinned repos" 2015-01-28 20:29:37 +00:00
d37d43f036 Merge "Don't delete hooks in .git/hooks" 2015-01-28 20:29:05 +00:00
7bdac71087 pylint fixes for project.py
Fix all the formatting warnings and unused variables

Change-Id: I17d88a23572303879530077f3a80451de5417fbb
2015-01-22 04:20:21 +00:00
f97e8383a3 Use depth flag when fetching
Currently, we only use the depth flag when cloning.  The result is that when
new project history has merges, the entire history of the merged branch is
brought in and the project becomes unshallow very quickly.  --depth and
clone-depth are often used to save on space, not just network load, so this
seems less than ideal.

This change uses --depth on every fetch (when the user has depth specified),
not just the initial clone.  The result is that the given project stays
consistently shallow as opposed to growing over time, especially when merges
are involved.

Change-Id: Iac706cfdad4a555c72f9d9f1119195d38d91df12
2015-01-22 01:20:22 +00:00
3000cdad22 Handle shallow checkout of SHA1 pinned repos
When doing a shallow checkout of SHA1-pinned repos with repo init --depth=1 and
repo sync -c, repo would try to fetch only certain references and fail if the
exact SHA1 was missing.
Instead, when depth is set, fetch only the specific commit.

Change-Id: If3f799d0e78c03faea47f796380bb5e367b11998
2015-01-21 14:14:23 -08:00
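[Editor's illustration] A sketch of what "fetch only the specific commit" can look like; fetching by SHA-1 needs server-side support, so this is illustrative rather than universal:

    import subprocess

    def shallow_fetch_sha1(remote, sha1, depth=1):
      # With a depth configured and an exact SHA-1 pinned in the manifest,
      # ask for just that commit instead of a whole branch.
      subprocess.run(['git', 'fetch', '--depth=%d' % depth, remote, sha1],
                     check=True)
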
b9d9efd394 Don't delete hooks in .git/hooks
We currently delete all hooks in .git/hooks for each project before
symlink'ing in the standard project hooks.  This can be annoying for
users who have installed custom git hooks.

There's no reason to delete all existing hooks.  Just rip out the
deletion code.

Change-Id: I5062a6cd20af700f6d6a17b11ad6c94853987c57
Signed-off-by: Mitchel Humpherys <mitchelh@codeaurora.org>
2015-01-15 22:49:08 -08:00
497bde4de5 Respect --quiet when looking up bundle cookie file
Change-Id: I02a244132c49e4bb50ecda978974d6d2b220f6d1
2015-01-02 13:58:05 -08:00
4abf8e6ef8 Save cookies back to jar when fetching clone.bundle
Change-Id: I3ef71b5e7f8ee1cda66057e46ae234866c7258c4
2015-01-02 13:57:14 -08:00
137d0131bf Hold persistent proxy connection open while fetching clone.bundle
The persistent proxy may choose to present a per-process cookie file
that gets cleaned up after the process exits, to help with the fact
that libcurl cannot save cookies atomically when a cookie file is
shared across processes. We were letting this cleanup happen
immediately by closing stdin as soon as we read the configuration
option, resulting in a nonexistent cookie file by the time we use the
config option.

Work around this by converting the cookie logic to a context manager
method, which closes the process only when we're done with the cookie
file.

Change-Id: I12a88b25cc19621ef8161337144c1b264264211a
2015-01-02 13:57:13 -08:00
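[Editor's illustration] A rough sketch of the context-manager shape described above; `proxy_cmd` is a hypothetical helper command, not a real repo option:

    import contextlib
    import subprocess

    @contextlib.contextmanager
    def proxy_cookie_file(proxy_cmd):
      # Keep the helper process alive for as long as the (possibly
      # per-process, temporary) cookie file it handed us is in use.
      p = subprocess.Popen(proxy_cmd, stdout=subprocess.PIPE, text=True)
      try:
        yield p.stdout.readline().strip()
      finally:
        p.stdout.close()
        p.terminate()
        p.wait()
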
42e679b9f6 Merge "add a global --color option" 2015-01-02 20:56:25 +00:00
902665bce6 add a global --color option
If you want to turn off colors for commands, you have to manually adjust
the git config settings (in various locations).  If you're writing scripts
though, you often don't want to modify those locations.  Add a commandline
option to explicitly control things.

The default behavior is unchanged -- we still scan the config files.

Change-Id: I54a3fd8e1918bac180aadd7c7d3004f069b02522
2014-12-30 18:50:05 -05:00
c8d882ae2a Silence warnings about invalid clone.bundle files when quieted
The invalid clone.bundle file warning is not typically user actionable,
and can be confusing. So don't show it when -q flag is in effect.

Change-Id: If9fef4085391acf54b63c75029ec0e161c38eb86
2014-12-24 10:23:24 +09:00
3eb87cec5c Revert "Check for existence of refs upon initial fetch"
This reverts commit 565480588d.

We are reverting this change for 2 reasons:

1) It introduced a bug for users using sync -c with a reference mirror.
2) The fetch specs have recently changed to cause git to properly fail
when we request a non-existent branch of a manifest, removing the need
for this change.

Change-Id: I0f63da9bfb40cf5ffafb7979f1b8c929a738fc7b
2014-11-10 23:49:32 +00:00
5fb8ed217c If revision is sha hash and dest-branch is defined, use it for starting branch
Change-Id: I538c7d216f72b87629b61aee547d374a398c95da
2014-10-27 12:25:05 +00:00
7e12e0a2fa Support persistent-http(s) review urls
Change-Id: I8e0065685c968dfa9dc26bcdb6ee2fa14019c509
2014-10-23 15:42:09 -07:00
7893b85509 Merge changes I1f71be22,I5b119f11
* changes:
  Always fetch the specific revision given
  Support specifying non-HEADS refs as upstream
2014-10-22 00:23:18 +00:00
b4e50e67e8 Merge "upload: report names of uncommitted files" 2014-10-21 18:03:55 +00:00
0936aeab2c Exit 1 if repo download -c fails
Change-Id: I6985548bf87032b121eeccf858c4eeca1a60598c
2014-10-17 15:45:57 -04:00
14e134da02 upload: report names of uncommitted files
When there are uncommitted files in the tree, 'repo upload' stops to
ask if it is OK to continue, but does not report the actual names of
uncommitted files.

This patch adds plumbing to have the outstanding file names reported
if desired.

BUG=None
TEST=verified that 'repo upload' properly operates with the following
    conditions present in the tree:
    . file(s) modified locally
    . file(s) added to index, but not committed
    . files not known to git
    . no modified files (the upload proceeds as expected)

Change-Id: If65d5f8e8bcb3300c16d85dc5d7017758545f80d
Signed-off-by: Vadim Bendebury <vbendeb@chromium.org>
Signed-off-by: Vadim Bendebury <vbendeb@google.com>
2014-10-14 11:20:05 -07:00
04e52d6166 Always fetch the specific revision given
Don't assume the revision is in refs/heads/.

Change-Id: I1f71be222ed3ed940d2265aad43d1f2d601fc03a
2014-10-09 13:41:56 -06:00
909d58b2e2 Support specifying non-HEADS refs as upstream
While not typical, some users might have an upstream that isn't in
the usual refs/heads/* namespace. There's no reason not to use
those refs as the value for the upstream attribute, so support
doing so.

Change-Id: I5b119f1135c3268c20e7c4084682e860d3ee1fb1
2014-10-09 13:41:51 -06:00
5cf16607d3 Allow selection of a target when using smart sync.
Change-Id: I02a24471b9b62dbba3773f22a289825bc566acd9
2014-10-02 10:17:44 -07:00
c190b98ed5 Merge "Add extend-project tag to support adding groups to an existing project" 2014-09-18 23:09:08 +00:00
4863307299 Add support for rpc:// protocol schemes.
Change-Id: I0e500e45cacc20ac04b43435c4bd189299e9e97b
2014-09-10 13:45:52 -07:00
f75870beac Change implementation of cleanup in case of clone failure during "repo init"
Fix includes:
1. It deletes only .repo/repo instead of the whole .repo repository.

Bug: Issue 161
Change-Id: I1ab8caa7538fec5e6206d1b029f63bd3f60dedcd
2014-09-03 13:56:04 +05:30
bf0b0cbc2f Merge "Provide detail print-out when not all projects of a branch are current." 2014-08-26 21:11:40 +00:00
3a10968a70 Merge "Enable transferring of attribute using command 'repo manifest -o -'" 2014-08-22 16:13:16 +00:00
c46de6932a Decode git version
Used by 'repo --version'
With Python 3,
* Before: b'git version 2.1.0'
* After: git version 2.1.0

Change-Id: I4321bb0f09e92cda1123c35910338b940e82a305
2014-08-20 11:47:10 +05:30
303a82f33a Don't open non-binary files as binary
* Don't open the git config file, and the git ".lock" file, as binary.

Change-Id: I7b3939658456f2fd0a0500443cdd8d1ee1a4459d
2014-08-19 23:05:44 +05:30
7a91d51dcf Enable transferring of attribute using command 'repo manifest -o -'
'upstream' attribute is now transferred to the new manifest xml
that is created when using command 'repo manifest -o -'.

Manifest help is updated for the attributes 'sync-c','sync-s' and
'sync-j'.

Bug: Issue 164
Change-Id: If63f781e91d25c5b5b5ea0696b0c04337b0a686a
2014-07-24 16:27:08 +05:30
a8d539189e Update the commit-msg hook to the version from Gerrit 2.8.2
Change-Id: Id911bc6841f488a42d08580de800c3afafa2937e
2014-07-15 11:30:06 -07:00
588142dfcb Provide detail print-out when not all projects of a branch are current.
When current is "split" (i.e. some projects are current while others are not):
- Disable 'not in' printout (i.e. will print out all projects)
- Disable printing of multiple projects on one line
- Print current projects in green, non-current in white

Since using color to differentiate current from non-current in "split" cases:
- In non-split cases also print out project names in color (green for current
  white for non-current)

Change-Id: Ia6b826612c708447cecfe5954dc767f7b2ea2ea7
2014-07-11 10:56:03 -07:00
a6d258b84d Merge "Fix UrlInsteadOf to handle multiple strings" 2014-06-30 22:21:58 +00:00
a769498568 Add --jobs option to forall subcommand
Enable '--jobs' ('-j') option in the forall subcommand. For -jn
where n > 1, the '-p' option can no longer guarantee the
continuity of console output between the project header and the
output from the worker process.

SIG_INT is sent to all worker processes upon keyboard interrupt
(Ctrl+C).

Bug: Issue 105
Change-Id: If09afa2ed639d481ede64f28b641dc80d0b89a5c
2014-06-24 01:02:54 +00:00
884a387eca Add extend-project tag to support adding groups to an existing project
Currently, if a local manifest wants to add groups to an existing
project, it must use remove-project and then re-add the project with
the new groups.  This makes the local manifest more fragile, requiring
updates to the local manifest if the original manifest changes.

Add a new extend-project tag, which supports adding groups to an
existing project.

Change-Id: Ib4d1352efd722a65dd263d02644b9ea5ab6ed400
2014-06-20 11:35:16 -07:00
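[Editor's illustration] A toy sketch of the merge semantics using plain Python data (not manifest_xml.py):

    def extend_project(projects, name, extra_groups):
      # <extend-project> adds groups to a project that is already defined,
      # instead of removing it and redefining it from scratch.
      entry = projects[name]
      entry['groups'] = sorted(set(entry.get('groups', [])) | set(extra_groups))

    projects = {'platform/build': {'path': 'build', 'groups': ['pdk']}}
    extend_project(projects, 'platform/build', ['tools'])
    print(projects['platform/build']['groups'])  # ['pdk', 'tools']
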
80b87fe6c1 Use fetch --unshallow when appropriate.
If a user reinits to a different manifest or the manifest updates so
that a project no longer has a fixed depth, we need to use --unshallow
when we fetch.

Change-Id: I6d3f15e5464b5eaad9205654bc24354947a78aea
2014-05-09 18:47:35 -07:00
e9f75b1782 Merge "Enable remotes to define their own revision" 2014-05-08 18:38:33 +00:00
a35e402161 Merge "Return a list rather than dict_values in XmlManifest.projects()" 2014-05-07 18:21:31 +00:00
dd7aea6c11 Merge "Define unicode as str if using Python 3" 2014-05-07 18:20:32 +00:00
5196805fa2 Merge "Use exec() rather than execfile()" 2014-05-07 18:18:56 +00:00
85b24acd6a Use JSON instead of pickle
Use JSON as it is shown to be much faster than pickle.
Also clean up the loading and saving functions.

Change-Id: I45b3dee7b4d59a1c0e0d38d4a83b543ac5839390
2014-05-07 10:46:24 +01:00
36ea2fb6ee Enable remotes to define their own revision
Some projects use multiple remotes.
In some cases these remotes have different naming conventions.
Add an option to define a revision in the remote configuration.

The `project` revision takes precedence over `remote` and `default`.
The `remote` revision takes precedence over `default`.
The `default` revision acts as a fall back as it originally did.

Change-Id: I2b376160d45d48b0bab840c02a3eef1a1e32cf6d
2014-05-07 08:29:30 +00:00
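[Editor's illustration] The precedence reads naturally as a small helper (illustrative only):

    def effective_revision(project_rev, remote_rev, default_rev):
      # project > remote > default, as described above.
      return project_rev or remote_rev or default_rev

    assert effective_revision(None, 'stable', 'master') == 'stable'
    assert effective_revision('refs/tags/v1.0', 'stable', 'master') == 'refs/tags/v1.0'
    assert effective_revision(None, None, 'master') == 'master'
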
2cd1f0452e Use next(iterator) rather than iterator.next()
iterator.next() was replaced with iterator.__next__() in Python 3.
Use next(iterator) instead which will select the correct method for
returning the next item.

Change-Id: I6d0c89c8b32e817e5897fe87332933dacf22027b
2014-05-07 08:44:20 +01:00
65e3a78a9e Merge "Prevent warning twice about Python 3 usage" 2014-05-07 06:30:30 +00:00
d792f7928d Define unicode as str if using Python 3
The unicode object was renamed to str in Python 3

Change-Id: I1e4972fb07b313d3462587b3059bb3638d779625
2014-05-06 20:38:51 +01:00
6efdde9f6e Prevent warning twice about Python 3 usage
Only warn about using Python 3 when running the repo script directly.
This prevents the user being warned twice.

Change-Id: I2ee51ea2fa0127ea310598320e460ec9f38c6488
2014-05-06 12:44:22 +00:00
7446c5954a Use sorted() rather than .sort()
dict.keys() produces a dict_keys object in Python 3, which does
not support .sort(). Use sorted() which will give the same outcome.

Change-Id: If6b33db07a31995b4e44959209d08d8fb74ae339
2014-05-06 12:42:35 +00:00
d58bfe5a58 Return a list rather than dict_values in XmlManifest.projects()
dict.values() produces dict_values objects rather than list objects.
Convert this to a list to maintain functionality with certain functions.

Change-Id: Ie76269e19f8d68479a1d7ae03aa965252d759a9e
2014-05-06 09:16:52 +01:00
70f6890352 Use exec() rather than execfile()
execfile() is not in Python 3.

Change-Id: I5af222340f13c1e8edaa820e7675d3e4d62a1689
2014-05-05 23:41:07 +01:00
666d534636 Ensure HEAD is correct when skipping remote fetch
A recent optimization (2fb6466f79) skips
performing a remote fetch if we already know we have the sha1 we want.
However, that optimization skipped initialization steps that ensure HEAD
points to the correct sha1.  This change makes sure not to skip those
steps.

Here is an example of how to test this change:

"""""""""
url=<manifest url>
branch1=<branch name>
branch2=<branch name>
project=<project with revision set to different sha1 in each branch>

repo init -u $url -b $branch1 --mirror
repo sync $project
first=$(cd $project.git; git rev-parse HEAD)

repo init -b $branch2
repo sync $project
second=$(cd $project.git; git rev-parse HEAD)

if [[ $first == $second ]]
then
    echo 'problem!'
else
    echo 'no problem!'
fi
"""""""""
2014-05-01 13:20:32 -07:00
f2af756425 Add 'shallow' gitfile to symlinks
This fixes the bug that kept clients from doing things like `git log`
in projects using the clone-depth feature.

Change-Id: Ib4024a7b82ceaa7eb7b8935b007b3e8225e0aea8
2014-04-30 11:34:00 -07:00
544e7b0a97 Merge "Ignore clone-depth attribute when fetching to a mirror" 2014-04-24 21:21:02 +00:00
e0df232da7 Add linkfile support.
It's just like copyfile and runs at the same time as copyfile but
instead of copying, it creates a symlink.  This is needed
because copyfile copies the target of the link as opposed to the
symlink itself.

Change-Id: I7bff2aa23f0d80d9d51061045bd9c86a9b741ac5
2014-04-22 14:35:47 -05:00
5a7c3afa73 Merge "Don't try to remove .repo if it doesn't exist" 2014-04-18 00:06:08 +00:00
9bc422f130 Ignore clone-depth attribute when fetching to a mirror
If a manifest includes projects with a clone-depth=1 attribute, and a
workspace is initialised from that manifest using the --mirror option,
any workspaces initialised and synced from the mirror will fail with:

  fatal: attempt to fetch/clone from a shallow repository

on the projects that had the clone-depth.

Ignore the clone-depth attribute when fetching from the remote to a
mirror workspace. Thus the mirror will be synched with a complete
clone of all the repositories.

Change-Id: I638b77e4894f5eda137d31fa6358eec53cf4654a
2014-04-16 11:00:40 +09:00
e81bc030bb Add total count and iteration count to forall environment
For long-running forall commands sometimes it's useful to know which
iteration is currently running. Add REPO_I and REPO_COUNT environment
variables to reflect the current iteration count as well as the total
number of iterations so that the user can build simple status
indicators.

Example:

    $ repo forall -c 'echo $REPO_I / $REPO_COUNT; git gc'
    1 / 579
    Counting objects: 41, done.
    Delta compression using up to 8 threads.
    Compressing objects: 100% (19/19), done.
    Writing objects: 100% (41/41), done.
    Total 41 (delta 21), reused 41 (delta 21)
    2 / 579
    Counting objects: 53410, done.
    Delta compression using up to 8 threads.
    Compressing objects: 100% (10423/10423), done.
    Writing objects: 100% (53410/53410), done.
    Total 53410 (delta 42513), reused 53410 (delta 42513)
    3 / 579
    ...

Change-Id: I9f28b0d8b7debe423eed3b4bc1198b23e40c0c50
Signed-off-by: Mitchel Humpherys <mitchelh@codeaurora.org>
2014-03-31 13:08:26 -07:00
eb5acc9ae9 Don't try to remove .repo if it doesn't exist
Part of the cleanup path for _Init is removing the .repo
directory. However, _Init can fail before creating the .repo directory,
so trying to remove it raises another exception:

    fatal: invalid branch name 'refs/changes/53/55053/4'
    Traceback (most recent call last):
      File "/home/mitchelh/bin/repo", line 775, in <module>
        main(sys.argv[1:])
      File "/home/mitchelh/bin/repo", line 749, in main
        os.rmdir(repodir)
    OSError: [Errno 2] No such file or directory: '.repo'

Fix this by only removing .repo if it actually exists.

Change-Id: Ia251d29e9c73e013eb296501d11c36263457e235
2014-03-12 15:11:27 -07:00
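[Editor's illustration] The fix boils down to a guard of this shape (a sketch; the real cleanup lives in the repo launcher):

    import os

    def cleanup_repodir(repodir='.repo'):
      # Don't let a missing .repo mask the original init failure with a
      # second OSError from os.rmdir.
      if os.path.exists(repodir):
        os.rmdir(repodir)
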
26c45a7958 Make --no-tags work with -c
Currently, the --no-tags option is ignored if the user asks to only
fetch the current branch. There is no reason for this restriction. Fix
it.

Change-Id: Ibaaeae85ebe9955ed49325940461d630d794b990
Signed-off-by: Mitchel Humpherys <mitchelh@codeaurora.org>
2014-03-12 16:34:53 +09:00
68425f4da8 Fix indentation in project.py
Change-Id: I81c630536eaa54d5a25b9cb339a96c28619815ea
2014-03-11 14:55:52 +09:00
53e902a19b More verbose errors for NoManifestExceptions.
The old "manifest required for this command -- please run
init" is replaced by a more helpful message that lists the
command repo was trying to execute (with arguments) as well
as the str() of the NoManifestException. For example:

> error: in `sync`: [Errno 2] No such file or directory:
> 	'path/to/.repo/manifests/.git/HEAD'
> error: manifest missing or unreadable -- please run init

Other failure points in basic command parsing and dispatch
are more clearly explained in the same fashion.

Change-Id: I6212e5c648bc5d57e27145d55a5391ca565e4149
2014-03-11 05:33:43 +00:00
4e4d40f7c0 Fix UrlInsteadOf to handle multiple strings
For complex .gitconfig url rewrites, multiple insteadOf lines may be
used for a url. Search all of them for the right rewrite.

Change-Id: If5e9ecd054e86226924b0baf513801cd57c389cd
2014-03-06 21:04:18 -08:00
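[Editor's illustration] A sketch of searching every configured insteadOf value; the data structure is hypothetical (git stores these as url.<base>.insteadOf entries):

    def url_instead_of(url, rewrites):
      # rewrites maps a replacement base URL to all of its insteadOf
      # prefixes; check each prefix, not just the first one found.
      for base, prefixes in rewrites.items():
        for prefix in prefixes:
          if url.startswith(prefix):
            return base + url[len(prefix):]
      return url

    rewrites = {'https://example.com/': ['git://example.com/',
                                         'ssh://git@example.com/']}
    print(url_instead_of('ssh://git@example.com/platform/build', rewrites))
    # -> https://example.com/platform/build
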
093fdb6587 Add reviewers automatically from project's git config
The `review.URL.autocopy` setting sends email notification to the
named reviewers, but does not add them as reviewer on the uploaded
change.

Add a new setting `review.URL.autoreviewer`.  The named reviewers
will be added as reviewer on the uploaded change.

Change-Id: I3fddfb49edf346f8724fe15b84be8c39d43e7e65
Signed-off-by: bijia <bijia@xiaomi.com>
2014-03-04 00:51:30 +00:00
2fb6466f79 Don't fetch from remotes if commit id exists locally
In existing workspaces where the manifest specifies a commit id in the
manifest, we can avoid doing a fetch from the remote if we have the
commit locally. This substantially improves sync times for fully
specified manifests.

Change-Id: Ide216f28a545e00e0b493ce90ed0019513c61613
2014-03-03 10:17:03 +00:00
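[Editor's illustration] One way to express the local-object check (a sketch using git cat-file, not necessarily how project.py does it):

    import subprocess

    def have_commit(sha1, cwd='.'):
      # True when the object store already contains the pinned commit,
      # so the network fetch can be skipped.
      proc = subprocess.run(['git', 'cat-file', '-e', '%s^{commit}' % sha1],
                            cwd=cwd, capture_output=True)
      return proc.returncode == 0
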
724aafb52d Merge "Clean up duplicate logic in subcmds/sync.py." 2014-02-28 21:16:32 +00:00
ccd218cd8f Fix to mirror manifest when --mirror is given
Commit 8d201 "repo: Support multiple branches for the same project."
(Change-Id I5e2f4e1a7abb56f9d3f310fa6fd0c17019330ecd) caused the manifest
repository to no longer be mirrored when running 'repo sync' after
'repo init --mirror'.

When the function _AddMetaProjectMirror() is called to add the two
meta projects - git-repo itself and the manifest repository - to the
mirror, it didn't add them to self._paths, which holds the list of
projects to be synced by 'repo sync'.

In addition, because the Project member variable 'relpath' is used as
the key into self._paths, it should be set to a proper value other than
None. Since this only applies to meta projects, which are not described
in the manifest XML, 'relpath' is set to the name of the project.

Change-Id: Icc3b9e6739a78114ec70bf54fe645f79df972686
Signed-off-by: Kwanhong Lee <kwanhong.lee@windriver.com>
2014-02-20 11:07:23 +09:00
dd6542268a Add the "diffmanifests" command
This command allows a deeper diff between two manifest projects.
In addition to changed projects, it displays the logs of the
commits between both revisions for each project.

Change-Id: I86d30602cfbc654f8c84db2be5d8a30cb90f1398
Signed-off-by: Julien Campergue <julien.campergue@parrot.com>
2014-02-17 11:20:11 +00:00
baca5f7e88 Merge "Add error message for download -c conflicts" 2014-02-17 07:57:00 +00:00
89ece429fb Clean up duplicate logic in subcmds/sync.py.
The fetch logic is now shared between the jobs == 1 and
jobs > 1 cases. This refactoring also fixes a bug where
opts.force_broken was not honored when jobs > 1.

Change-Id: Ic886f3c3c00f3d8fc73a65366328fed3c44dc3be
2014-02-14 16:14:32 +00:00
565480588d Check for existence of refs upon initial fetch
When we do an initial fetch and have not specified any branch etc,
the following fetch command will not error:
git fetch origin --tags +refs/heads/*:refs/remotes/origin/*

In this change we make sure something got fetched and if not we report
an error.

This fixes the bug that occurs when we init using a bad manifest url and
then are unable to init again (because a manifest project has been
inited with no manifest).

Change-Id: I6f8aaefc83a1837beb10b1ac90bea96dc8e61156
2014-02-12 09:11:00 -08:00
1829101e28 Add error message for download -c conflicts
Currently if you run repo download -c on a change and the cherry-pick
runs into a merge conflict a Traceback is produced:

rob@rob-i5-lm ~/Programming/repo_test/repo1 $ repo download -c repo1 3/1
From ssh://rob-i5-lm:29418/repo1
 * branch            refs/changes/03/3/1 -> FETCH_HEAD
error: could not apply 0c8b474... 2
hint: after resolving the conflicts, mark the corrected paths
hint: with 'git add <paths>' or 'git rm <paths>'
hint: and commit the result with 'git commit'
Traceback (most recent call last):
  File "/home/rob/Programming/git-repo/main.py", line 408, in <module>
    _Main(sys.argv[1:])
  File "/home/rob/Programming/git-repo/main.py", line 384, in _Main
    result = repo._Run(argv) or 0
  File "/home/rob/Programming/git-repo/main.py", line 143, in _Run
    result = cmd.Execute(copts, cargs)
  File "/home/rob/Programming/git-repo/subcmds/download.py", line 90, in Execute
    project._CherryPick(dl.commit)
  File "/home/rob/Programming/git-repo/project.py", line 1943, in _CherryPick
    raise GitError('%s cherry-pick %s ' % (self.name, rev))
error.GitError: repo1 cherry-pick 0c8b4740f876f8f8372bbaed430f02b6ba8b1898

This amount of error output is confusing to users, and it has the side
effect of burying the actual issue reported by the git message.

This change introduces a message stating that the cherry-pick couldn't
be completed, removing the Traceback.

To reproduce the issue create a change that causes a conflict with one currently
in review and use repo download -c to cherry-pick the conflicting change.

Change-Id: I8ddf4e0c8ad9bd04b1af5360313f67cc053f7d6a
2014-02-11 18:19:04 +00:00
1966133f8e Merge "Stop appending 'p/' to review urls" 2014-02-10 22:42:31 +00:00
f1027e23b4 Merge "Implement Kerberos HTTP authentication handler" 2014-02-05 00:58:53 +00:00
2cd38a0bf8 Stop appending 'p/' to review urls
Gerrit no longer requires 'p/', and this causes unexpected behavior.
In this change we stop appending 'p/' to the urls.

Change-Id: I72c13bf838f4112086141959fb1af249f9213ce6
2014-02-04 15:32:29 -08:00
1b46cc9b6d Merge "Changes to support sso: repositories for upload" 2014-02-04 21:19:07 +00:00
1242e60bdd Implement Kerberos HTTP authentication handler
This commit implements a Kerberos HTTP authentication handler. It
uses credentials from a local cache to perform an HTTP authentication
negotiation using the GSSAPI.

The purpose of this handler is to allow the use of Kerberos authentication
to access review endpoints without the need to transmit the user
password.

Change-Id: Id2c3fc91a58b15a3e83e4bd9ca87203fa3d647c8
2014-02-04 09:22:42 +01:00
2d0f508648 Fix persistent-https relative url resolving
Previously, we would remove 'persistent-' then tack it on at the end
if it had been previously found.  However, this would ignore urljoin's
decision on whether or not the second path was relative.  Instead, we
were always assuming it was relative and that we didn't want to use
a different absolute url with a different protocol.

This change handles persistent-https:// in the same way we handled the
absence of an explicit protocol.  The only difference is that this time
instead of temporarily replacing it with 'gopher://', we use 'wais://'.

Change-Id: I6e8ad1eb4b911931a991481717f1ade01315db2a
2014-01-31 16:06:31 -08:00
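[Editor's illustration] A sketch of the placeholder-scheme trick; the detail that matters is checking whether urljoin actually kept the base before restoring the 'persistent-' prefix:

    from urllib.parse import urljoin

    PREFIX = 'persistent-https://'
    PLACEHOLDER = 'wais://'  # a scheme urljoin already knows how to join

    def resolve(base, url):
      swapped = base.startswith(PREFIX)
      if swapped:
        base = PLACEHOLDER + base[len(PREFIX):]
      joined = urljoin(base, url)
      # Only restore the prefix if the result is still relative to the base;
      # an absolute url with its own scheme must win.
      if swapped and joined.startswith(PLACEHOLDER):
        joined = PREFIX + joined[len(PLACEHOLDER):]
      return joined

    print(resolve('persistent-https://host/a/manifest.xml', 'platform/build'))
    print(resolve('persistent-https://host/a/', 'https://other/x'))
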
143d8a7249 Changes to support sso: repositories for upload
Change-Id: Iddf90d52f700a1f6462abe76d4f4a367ebb6d603
2014-01-31 07:39:44 -08:00
5db69f3f66 Update the version number on the repo launcher
The repo launcher version needs to be updated so some users can take
advantage of the more robust version number parsing.

Change-Id: Ibcd8036363311528db82db2b252357ffd21eb59b
2014-01-30 16:00:35 -08:00
ff0a3c8f80 Share git version parsing code with wrapper module
'repo' and 'git_command.py' had their own git version parsing code.
This change shares that code between the modules.  DRY is good.

Change-Id: Ic896d2dc08353644bd4ced57e15a91284d97d54a
2014-01-30 15:18:56 -08:00
094cdbe090 Add wrapper module
This takes the wrapper importing code from main.py and moves it into
its own module so that other modules may import it without causing
circular imports with main.py.

Change-Id: I9402950573933ed6f14ce0bfb600f74f32727705
2014-01-30 15:17:09 -08:00
148a84de0c Respect version hyphenation
The last change regarding version parsing lost handling of version
hyphenation; this restores that.  In other words,
1.1.1-otherstuff is parsed as (1,1,1) instead of (1,1,0)

Change-Id: I3753944e92095606653835ed2bd090b9301c7194
2014-01-30 13:53:55 -08:00
1c5da49e6c Handle release candidates in git version parsing
Right now repo chokes on git versions like "1.9.rc1".  This change
treats 'rc*' as a '0'.

Change-Id: I612b7b431675ba7415bf70640a673e48dbb00a90
2014-01-30 13:26:50 -08:00
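[Editor's illustration] A sketch of parsing rules consistent with the last two changes above (the helper name is hypothetical):

    def parse_git_version(text):
      # 'rc*' counts as 0 and anything after a hyphen is dropped, so
      # 'git version 1.9.rc1' -> (1, 9, 0) and '1.1.1-foo' -> (1, 1, 1).
      text = text.replace('git version ', '').strip()
      parts = []
      for piece in text.split('.')[:3]:
        piece = piece.split('-')[0]
        if piece.startswith('rc'):
          parts.append(0)
        else:
          try:
            parts.append(int(piece))
          except ValueError:
            parts.append(0)
      while len(parts) < 3:
        parts.append(0)
      return tuple(parts)

    assert parse_git_version('git version 1.9.rc1') == (1, 9, 0)
    assert parse_git_version('1.1.1-otherstuff') == (1, 1, 1)
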
b8433dfd2f repo: Fix 'remove-project' regression with multiple projects.
In CL:50715, I updated repo to handle multiple projects, but the
remove-projects code path was not updated accordingly. Update it.

Change-Id: Icd681d45ce857467b584bca0d2fdcbf24ec6e8db
2014-01-30 10:14:54 -08:00
f2fe2d9b86 Properly iterate through values
The value of Manifest.projects has changed from being the dictionary
to the values of the dictionary.  Here we handle this change
correctly on a PostRepoUpgrade.

From a `git diff v1.12.7 -- manifest_xml.py`:
+  @property
   def projects(self):
     self._Load()
-    return self._projects
+    return self._paths.values()

self._paths does contain the projects according to this line of
manifest_xml.py:
484      self._paths[project.relpath] = project

Change-Id: I141f8d5468ee10dfb08f99ba434004a307fed810
2014-01-29 13:57:22 -08:00
c9877c7cf6 Merge "Only fetch current branch on shallow clients" 2014-01-29 21:12:34 +00:00
69e04d8953 Only fetch current branch on shallow clients
Fetching a new branch on a shallow client may download the entire
project history, as the depth parameter is not passed to git
fetch. Force the fetch to only download the current branch.

Change-Id: Ie17ce8eb5e3487c24d90b2cae8227319dea482c8
2014-01-29 12:48:54 -08:00
f1f1137d61 Merge "Don't backtrace when current branch is not uploadable." 2014-01-14 00:41:35 +00:00
f77ef2edb0 Merge "hooks/pre-auto-gc: fix AC detection on OSX Maverick" 2014-01-10 02:50:53 +00:00
e695338e21 Merge "repo: Support multiple branches for the same project." 2014-01-10 01:20:13 +00:00
bd80f7eedd Merge "Canonicalize project hooks path before use" 2014-01-09 02:11:10 +00:00
bf79c6618e Fix os.mkdir race condition.
This code checks whether a dir exists before creating it. In between the
check and the mkdir call, it is possible that another process will have
created the directory. We have seen this bug occur many times in
practice during our 'repo init' tests.

Change-Id: Ia47d39955739aa38fd303f4e90be7b4c50d9d4ba
2013-12-26 14:59:00 -08:00
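[Editor's illustration] The standard fix is to attempt the mkdir and tolerate EEXIST rather than checking first (a sketch):

    import errno
    import os

    def safe_mkdir(path):
      # Another process may create the directory between any exists() check
      # and the mkdir call, so just try it and ignore "already exists".
      try:
        os.mkdir(path)
      except OSError as e:
        if e.errno != errno.EEXIST:
          raise
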
f045d49a71 Merge "Add --archive option to init to sync using git archive" 2013-12-18 17:44:59 +00:00
719757d6a8 hooks/pre-auto-gc: fix AC detection on OSX Maverick
The output of pmset has been changed to "Now drawing from 'AC Power'"

Change-Id: Id425d3bcd6a28656736a6d2c3096623a3ec053cc
2013-12-17 09:48:20 +07:00
011d4f426c Don't backtrace when current branch is not uploadable.
The backtrace currently occurs when one uses the "--cbr" argument with
the repo upload subcommand if the current branch is not tracking an
upstream branch. There may be other cases that would backtrace as well,
but this is the only one I found so far.

Change-Id: Ie712fbb0ce3e7fe3b72769fca89cc4c0e3d2fce0
2013-12-11 23:24:01 -08:00
53d6a7b895 Fix error in xml manifest doc.
The docs on the annotations say that zero or more may exist as a child
of a project, so that means that a "*" instead of a "?" should be used.

Change-Id: Iff855d003dfb05cd980f285a237332914e1dad70
2013-12-10 15:30:03 -08:00
335f5ef4ad Add --archive option to init to sync using git archive
This significantly reduces sync time and bandwidth used, as only
a tar of each project's revision is checked out, but git is not
accessible from projects anymore.

This is relevant when git is not needed in projects but sync
speed/bandwidth matters, for example on CI servers that regularly
build several versions from scratch.

Archive is not supported over http/https.

Change-Id: I48c3c7de2cd5a1faec33e295fcdafbc7807d0e4d
Signed-off-by: Julien Campergue <julien.campergue@parrot.com>
2013-12-10 08:27:07 +00:00
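[Editor's illustration] Roughly, the archive path replaces a clone with something like the following (a sketch; `git archive --remote` needs server support and, as noted above, is not available over http/https):

    import subprocess

    def archive_checkout(remote_url, revision, dest):
      # Stream a tar of just this revision from the server and unpack it;
      # no .git directory or history ends up in the worktree.
      git = subprocess.Popen(
          ['git', 'archive', '--remote=%s' % remote_url, revision],
          stdout=subprocess.PIPE)
      subprocess.run(['tar', '-xf', '-', '-C', dest], stdin=git.stdout,
                     check=True)
      git.stdout.close()
      git.wait()
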
672cc499b9 Canonicalize project hooks path before use
If the top-level .repo directory is moved somewhere else (e.g. a
different drive) and replaced with a symlink, _InitHooks() will create
broken symlinks. Resolving symlinks before computing the relative path
for the symlink keeps the path within the repo tree, so the tree can
be moved anywhere.

Change-Id: Ifa5c07869e3477186ddd2c255c6c607f547bc1fe
2013-12-03 09:02:16 -08:00
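[Editor's illustration] The essential change is resolving symlinks on both ends before computing the relative link target (a sketch):

    import os

    def hook_link_target(stock_hook, project_hooks_dir):
      # Canonicalize both paths first so the relative symlink stays valid
      # even when .repo is itself a symlink to another location.
      return os.path.relpath(os.path.realpath(stock_hook),
                             os.path.realpath(project_hooks_dir))
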
61df418c59 Update the commit-msg hook to the version from Gerrit 2.6
Change-Id: Iaf21ba8d2ceea58973dbc56f0b4ece54500cd997
2013-11-29 19:17:23 +09:00
4534120628 Merge "Allow using repo with python3" 2013-11-22 10:25:35 +00:00
cbc0798f67 Fix print of git-remote-persistent-https error
If git-remote-persistent-https fails, we use an iter() and then
subsequently a .read() on stderr.  Python doesn't like this and
gives the following error message:
ValueError: Mixing iteration and read methods would lose data

This change removes the use of iter() to avoid the issue.

Change-Id: I980659b83229e2a559c20dcc7b116f8d2476abd5
2013-11-21 10:38:03 -08:00
d5a5b19efd Remove trailing whitespace
Change-Id: I56bcb559431277d40070fa33c580c6c3525ff9bc
2013-11-21 19:16:08 +05:30
5d6cb80b8f Allow using repo with python3
* Switching from python2 to python3 in the same workspace isn't
  currently supported, due to a change in the pickle version (which
  isn't supported by python2)
* Basic functionality does work with python3, however not everything
  is expected to

Change-Id: I4256b5a9861562d0260b503f972c1569190182aa
2013-11-21 18:44:52 +05:30
0eb35cbe50 Fix some python3 encoding issues
* Add .decode('utf-8') where needed
* Add 'b' to `open` where needed, and remove where unnecessary

Change-Id: I0f03ecf9ed1a78e3b2f15f9469deb9aaab698657
2013-11-21 06:03:22 +00:00
ce201a5311 Fix a small whitespace consistency issue
Change-Id: Ie98c79833ca5e7ef71666489135f7491223f779c
2013-10-16 14:42:42 -07:00
12fd10c201 Merge "Don't access attr of None (manifest subcmd)" 2013-10-16 21:41:33 +00:00
a17d7af4d9 Don't access attr of None (manifest subcmd)
If d.remote is None, this code failed for obvious reasons.  This is a
simple fix.

Change-Id: I413756121e444111f1e3c7dc8bc8032467946c13
2013-10-16 14:38:09 -07:00
8d20116038 repo: Support multiple branches for the same project.
It is often useful to be able to include the same project more than
once, but with different branches and placed in different paths in the
workspace. Add this feature.

This CL adds the concept of an object directory. The object directory
stores objects that can be shared amongst several working trees. For
newly synced repositories, we set up the git repo now to share its
objects with an object repo.

Each worktree for a given repo shares objects, but has an independent
set of references and branches. This ensures that repo only has to
update the objects once; however the references for each worktree are
updated separately. Storing the references separately is needed to
ensure that commits to a branch on one worktree will not change the
HEAD commits of the others.

One nice side effect of sharing objects between different worktrees is
that you can easily cherry-pick changes between the two worktrees
without needing to fetch them.

Bug: Issue 141
Change-Id: I5e2f4e1a7abb56f9d3f310fa6fd0c17019330ecd
2013-10-14 15:34:32 -07:00
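[Editor's illustration] Conceptually, the shared object store behaves like git's alternates mechanism: each worktree keeps its own refs while object lookups fall through to a common directory. A sketch of that idea (not repo's actual setup code):

    import os

    def share_objects(worktree_gitdir, shared_objdir):
      # Each worktree keeps its own refs/ and HEAD, but objects resolve
      # through the shared store listed in objects/info/alternates.
      info = os.path.join(worktree_gitdir, 'objects', 'info')
      os.makedirs(info, exist_ok=True)
      with open(os.path.join(info, 'alternates'), 'w') as f:
        f.write(os.path.join(shared_objdir, 'objects') + '\n')
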
40 changed files with 3435 additions and 1421 deletions

.flake8 (new file, 3 lines changed)

@ -0,0 +1,3 @@
[flake8]
max-line-length=80
ignore=E111,E114,E402

.gitattributes (vendored, 2 lines changed)

@ -1,4 +1,4 @@
# Prevent /bin/sh scripts from being clobbered by autocrlf=true
git_ssh text eol=lf
main.py text eol=lf
repo text eol=lf
hooks/* text eol=lf

.mailmap (new file, 11 lines changed)

@ -0,0 +1,11 @@
Anthony Newnam <anthony.newnam@garmin.com> Anthony <anthony@bnovc.com>
Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu xiuyun <xiuyun.hu@hisilicon.com>
Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu Xiuyun <clouds08@qq.com>
Jelly Chen <chenguodong@huawei.com> chenguodong <chenguodong@huawei.com>
Jia Bi <bijia@xiaomi.com> bijia <bijia@xiaomi.com>
JoonCheol Park <jooncheol@gmail.com> Jooncheol Park <jooncheol@gmail.com>
Sergii Pylypenko <x.pelya.x@gmail.com> pelya <x.pelya.x@gmail.com>
Shawn Pearce <sop@google.com> Shawn O. Pearce <sop@google.com>
Ulrik Sjölin <ulrik.sjolin@sonyericsson.com> Ulrik Sjolin <ulrik.sjolin@gmail.com>
Ulrik Sjölin <ulrik.sjolin@sonyericsson.com> Ulrik Sjolin <ulrik.sjolin@sonyericsson.com>
Ulrik Sjölin <ulrik.sjolin@sonyericsson.com> Ulrik Sjölin <ulrik.sjolin@sonyericsson.com>

.pylintrc (deleted, 301 lines changed)

@ -1,301 +0,0 @@
# lint Python modules using external checkers.
#
# This is the main checker controling the other ones and the reports
# generation. It is itself both a raw checker and an astng checker in order
# to:
# * handle message activation / deactivation at the module level
# * handle some basic but necessary stats'data (number of classes, methods...)
#
[MASTER]
# Specify a configuration file.
#rcfile=
# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=
# Profiled execution.
profile=no
# Add <file or directory> to the black list. It should be a base name, not a
# path. You may set this option multiple times.
ignore=SVN
# Pickle collected data for later comparisons.
persistent=yes
# Set the cache size for astng objects.
cache-size=500
# List of plugins (as comma separated values of python modules names) to load,
# usually to register additional checkers.
load-plugins=
[MESSAGES CONTROL]
# Enable only checker(s) with the given id(s). This option conflicts with the
# disable-checker option
#enable-checker=
# Enable all checker(s) except those with the given id(s). This option
# conflicts with the enable-checker option
#disable-checker=
# Enable all messages in the listed categories.
#enable-msg-cat=
# Disable all messages in the listed categories.
#disable-msg-cat=
# Enable the message(s) with the given id(s).
enable=RP0004
# Disable the message(s) with the given id(s).
disable=R0903,R0912,R0913,R0914,R0915,W0141,C0111,C0103,W0603,W0703,R0911,C0301,C0302,R0902,R0904,W0142,W0212,E1101,E1103,R0201,W0201,W0122,W0232,RP0001,RP0003,RP0101,RP0002,RP0401,RP0701,RP0801,F0401,E0611,R0801,I0011
[REPORTS]
# set the output format. Available formats are text, parseable, colorized, msvs
# (visual studio) and html
output-format=text
# Include message's id in output
include-ids=yes
# Put messages in a separate file for each module / package specified on the
# command line instead of printing them on stdout. Reports (if any) will be
# written in a file name "pylint_global.[txt|html]".
files-output=no
# Tells whether to display a full report or only the messages
reports=yes
# Python expression which should return a note less than 10 (10 is the highest
# note).You have access to the variables errors warning, statement which
# respectivly contain the number of errors / warnings messages and the total
# number of statements analyzed. This is used by the global evaluation report
# (R0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
# Add a comment according to your evaluation note. This is used by the global
# evaluation report (R0004).
comment=no
# checks for
# * unused variables / imports
# * undefined variables
# * redefinition of variable from builtins or from an outer scope
# * use of variable before assigment
#
[VARIABLES]
# Tells whether we should check for unused import in __init__ files.
init-import=no
# A regular expression matching names used for dummy variables (i.e. not used).
dummy-variables-rgx=_|dummy
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid to define new builtins when possible.
additional-builtins=
# try to find bugs in the code using type inference
#
[TYPECHECK]
# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes
# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamicaly set).
ignored-classes=SQLObject
# When zope mode is activated, consider the acquired-members option to ignore
# access to some undefined attributes.
zope=no
# List of members which are usually get through zope's acquisition mecanism and
# so shouldn't trigger E0201 when accessed (need zope=yes to be considered).
acquired-members=REQUEST,acl_users,aq_parent
# checks for :
# * doc strings
# * modules / classes / functions / methods / arguments / variables name
# * number of arguments, local variables, branchs, returns and statements in
# functions, methods
# * required module attributes
# * dangerous default values as arguments
# * redefinition of function / method / class
# * uses of the global statement
#
[BASIC]
# Required attributes for module, separated by a comma
required-attributes=
# Regular expression which should only match functions or classes name which do
# not require a docstring
no-docstring-rgx=_main|__.*__
# Regular expression which should only match correct module names
module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
# Regular expression which should only match correct module level names
const-rgx=(([A-Z_][A-Z1-9_]*)|(__.*__))|(log)$
# Regular expression which should only match correct class names
class-rgx=[A-Z_][a-zA-Z0-9]+$
# Regular expression which should only match correct function names
function-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct method names
method-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct instance attribute names
attr-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct argument names
argument-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct variable names
variable-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match correct list comprehension /
# generator expression variable names
inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
# Good variable names which should always be accepted, separated by a comma
good-names=i,j,k,ex,Run,_,e,d1,d2,v,f,l,d
# Bad variable names which should always be refused, separated by a comma
bad-names=foo,bar,baz,toto,tutu,tata
# List of builtins function names that should not be used, separated by a comma
bad-functions=map,filter,apply,input
# checks for sign of poor/misdesign:
# * number of methods, attributes, local variables...
# * size, complexity of functions, methods
#
[DESIGN]
# Maximum number of arguments for function / method
max-args=5
# Maximum number of locals for function / method body
max-locals=15
# Maximum number of return / yield for function / method body
max-returns=6
# Maximum number of branch for function / method body
max-branchs=12
# Maximum number of statements in function / method body
max-statements=50
# Maximum number of parents for a class (see R0901).
max-parents=7
# Maximum number of attributes for a class (see R0902).
max-attributes=20
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
# Maximum number of public methods for a class (see R0904).
max-public-methods=30
# checks for
# * external modules dependencies
# * relative / wildcard imports
# * cyclic imports
# * uses of deprecated modules
#
[IMPORTS]
# Deprecated modules which should not be used, separated by a comma
deprecated-modules=regsub,string,TERMIOS,Bastion,rexec
# Create a graph of every (i.e. internal and external) dependencies in the
# given file (report R0402 must not be disabled)
import-graph=
# Create a graph of external dependencies in the given file (report R0402 must
# not be disabled)
ext-import-graph=
# Create a graph of internal dependencies in the given file (report R0402 must
# not be disabled)
int-import-graph=
# checks for :
# * methods without self as first argument
# * overridden methods signature
# * access only to existant members via self
# * attributes not defined in the __init__ method
# * supported interfaces implementation
# * unreachable code
#
[CLASSES]
# List of interface methods to ignore, separated by a comma. This is used for
# instance to not check methods defines in Zope's Interface base class.
ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by
# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,__new__,setUp
# checks for similarities and duplicated code. This computation may be
# memory / CPU intensive, so you should disable it if you experiments some
# problems.
#
[SIMILARITIES]
# Minimum lines number of a similarity.
min-similarity-lines=4
# Ignore comments when computing similarities.
ignore-comments=yes
# Ignore docstrings when computing similarities.
ignore-docstrings=yes
# checks for:
# * warning notes in the code like FIXME, XXX
# * PEP 263: source code with non ascii character but no encoding declaration
#
[MISCELLANEOUS]
# List of note tags to take in consideration, separated by a comma.
notes=FIXME,XXX,TODO
# checks for :
# * unauthorized constructions
# * strict indentation
# * line length
# * use of <> instead of !=
#
[FORMAT]
# Maximum number of characters on a single line.
max-line-length=80
# Maximum number of lines in a module
max-module-lines=1000
# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
# tab). In repo it is 2 spaces.
indent-string=' '

README.md (new file, 14 lines changed)

@ -0,0 +1,14 @@
# repo
Repo is a tool built on top of Git. Repo helps manage many Git repositories,
does the uploads to revision control systems, and automates parts of the
development workflow. Repo is not meant to replace Git, only to make it
easier to work with Git. The repo command is an executable Python script
that you can put anywhere in your path.
* Homepage: https://code.google.com/p/git-repo/
* Bug reports: https://code.google.com/p/git-repo/issues/
* Source: https://code.google.com/p/git-repo/
* Overview: https://source.android.com/source/developing.html
* Docs: https://source.android.com/source/using-repo.html
* [Submitting patches](./SUBMITTING_PATCHES.md)

SUBMITTING_PATCHES (deleted, 87 lines changed)

@ -1,87 +0,0 @@
Short Version:
- Make small logical changes.
- Provide a meaningful commit message.
- Check for coding errors with pylint
- Make sure all code is under the Apache License, 2.0.
- Publish your changes for review:
git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/master
Long Version:
I wanted a file describing how to submit patches for repo,
so I started with the one found in the core Git distribution
(Documentation/SubmittingPatches), which itself was based on the
patch submission guidelines for the Linux kernel.
However there are some differences, so please review and familiarize
yourself with the following relevant bits:
(1) Make separate commits for logically separate changes.
Unless your patch is really trivial, you should not be sending
out a patch that was generated between your working tree and your
commit head. Instead, always make a commit with complete commit
message and generate a series of patches from your repository.
It is a good discipline.
Describe the technical detail of the change(s).
If your description starts to get too long, that's a sign that you
probably need to split up your commit to finer grained pieces.
(2) Check for coding errors with pylint
Run pylint on changed modules using the provided configuration:
pylint --rcfile=.pylintrc file.py
(3) Check the license
repo is licensed under the Apache License, 2.0.
Because of this licensing model *every* file within the project
*must* list the license that covers it in the header of the file.
Any new contributions to an existing file *must* be submitted under
the current license of that file. Any new files *must* clearly
indicate which license they are provided under in the file header.
Please verify that you are legally allowed and willing to submit your
changes under the license covering each file *prior* to submitting
your patch. It is virtually impossible to remove a patch once it
has been applied and pushed out.
(4) Sending your patches.
Do not email your patches to anyone.
Instead, login to the Gerrit Code Review tool at:
https://gerrit-review.googlesource.com/
Ensure you have completed one of the necessary contributor
agreements, providing documentation to the project maintainers that
they have right to redistribute your work under the Apache License:
https://gerrit-review.googlesource.com/#/settings/agreements
Ensure you have obtained an HTTP password to authenticate:
https://gerrit-review.googlesource.com/new-password
Push your patches over HTTPS to the review server, possibly through
a remembered remote to make this easier in the future:
git config remote.review.url https://gerrit-review.googlesource.com/git-repo
git config remote.review.push HEAD:refs/for/master
git push review
You will be automatically emailed a copy of your commits, and any
comments made by the project maintainers.

SUBMITTING_PATCHES.md (new file, 135 lines changed)

@ -0,0 +1,135 @@
# Short Version
- Make small logical changes.
- Provide a meaningful commit message.
- Check for coding errors and style nits with pyflakes and flake8
- Make sure all code is under the Apache License, 2.0.
- Publish your changes for review.
- Make corrections if requested.
- Verify your changes on gerrit so they can be submitted.
`git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/master`
# Long Version
I wanted a file describing how to submit patches for repo,
so I started with the one found in the core Git distribution
(Documentation/SubmittingPatches), which itself was based on the
patch submission guidelines for the Linux kernel.
However there are some differences, so please review and familiarize
yourself with the following relevant bits.
## Make separate commits for logically separate changes.
Unless your patch is really trivial, you should not be sending
out a patch that was generated between your working tree and your
commit head. Instead, always make a commit with complete commit
message and generate a series of patches from your repository.
It is a good discipline.
Describe the technical detail of the change(s).
If your description starts to get too long, that's a sign that you
probably need to split up your commit to finer grained pieces.
## Check for coding errors and style nits with pyflakes and flake8
### Coding errors
Run `pyflakes` on changed modules:
pyflakes file.py
Ideally there should be no new errors or warnings introduced.
### Style violations
Run `flake8` on changed modules:
flake8 file.py
Note that repo generally follows [Google's python style guide]
(https://google.github.io/styleguide/pyguide.html) rather than [PEP 8]
(https://www.python.org/dev/peps/pep-0008/), so it's possible that
the output of `flake8` will be quite noisy. It's not mandatory to
avoid all warnings, but at least the maximum line length should be
followed.
If there are many occurrences of the same warning that cannot be
avoided without going against the Google style guide, these may be
suppressed in the included `.flake8` file.
## Check the license
repo is licensed under the Apache License, 2.0.
Because of this licensing model *every* file within the project
*must* list the license that covers it in the header of the file.
Any new contributions to an existing file *must* be submitted under
the current license of that file. Any new files *must* clearly
indicate which license they are provided under in the file header.
Please verify that you are legally allowed and willing to submit your
changes under the license covering each file *prior* to submitting
your patch. It is virtually impossible to remove a patch once it
has been applied and pushed out.
## Sending your patches.
Do not email your patches to anyone.
Instead, login to the Gerrit Code Review tool at:
https://gerrit-review.googlesource.com/
Ensure you have completed one of the necessary contributor
agreements, providing documentation to the project maintainers that
they have right to redistribute your work under the Apache License:
https://gerrit-review.googlesource.com/#/settings/agreements
Ensure you have obtained an HTTP password to authenticate:
https://gerrit-review.googlesource.com/new-password
Ensure that you have the local commit hook installed to automatically
add a ChangeId to your commits:
curl -Lo `git rev-parse --git-dir`/hooks/commit-msg https://gerrit-review.googlesource.com/tools/hooks/commit-msg
chmod +x `git rev-parse --git-dir`/hooks/commit-msg
If you have already committed your changes you will need to amend the commit
to get the ChangeId added.
git commit --amend
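You can check that the hook ran by printing the amended commit message, which
should now end with a `Change-Id:` trailer:

    git log -1 --format=%B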
Push your patches over HTTPS to the review server, possibly through
a remembered remote to make this easier in the future:
git config remote.review.url https://gerrit-review.googlesource.com/git-repo
git config remote.review.push HEAD:refs/for/master
git push review
You will be automatically emailed a copy of your commits, and any
comments made by the project maintainers.
## Make changes if requested
The project maintainer who reviews your changes might request changes to your
commit. If you make the requested changes, you will need to amend your commit
and push it to the review server again.
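For example, reusing the `review` remote configured above, the typical cycle is:

    git commit --amend
    git push review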
## Verify your changes on gerrit
After you receive a Code-Review+2 from the maintainer, select the Verified
button on the gerrit page for the change. This verifies that you have tested
your changes and notifies the maintainer that they are ready to be submitted.
The maintainer will then submit your changes to the repository.


@ -18,41 +18,43 @@ import sys
import pager
COLORS = {None :-1,
'normal' :-1,
'black' : 0,
'red' : 1,
'green' : 2,
'yellow' : 3,
'blue' : 4,
COLORS = {None: -1,
'normal': -1,
'black': 0,
'red': 1,
'green': 2,
'yellow': 3,
'blue': 4,
'magenta': 5,
'cyan' : 6,
'white' : 7}
'cyan': 6,
'white': 7}
ATTRS = {None :-1,
'bold' : 1,
'dim' : 2,
'ul' : 4,
'blink' : 5,
ATTRS = {None: -1,
'bold': 1,
'dim': 2,
'ul': 4,
'blink': 5,
'reverse': 7}
RESET = "\033[m" # pylint: disable=W1401
# backslash is not anomalous
RESET = "\033[m"
def is_color(s):
return s in COLORS
def is_attr(s):
return s in ATTRS
def _Color(fg = None, bg = None, attr = None):
def _Color(fg=None, bg=None, attr=None):
fg = COLORS[fg]
bg = COLORS[bg]
attr = ATTRS[attr]
if attr >= 0 or fg >= 0 or bg >= 0:
need_sep = False
code = "\033[" #pylint: disable=W1401
code = "\033["
if attr >= 0:
code += chr(ord('0') + attr)
@ -71,7 +73,6 @@ def _Color(fg = None, bg = None, attr = None):
if bg >= 0:
if need_sep:
code += ';'
need_sep = True
if bg < 8:
code += '4%c' % (ord('0') + bg)
@ -82,6 +83,27 @@ def _Color(fg = None, bg = None, attr = None):
code = ''
return code
DEFAULT = None
def SetDefaultColoring(state):
"""Set coloring behavior to |state|.
This is useful for overriding config options via the command line.
"""
if state is None:
# Leave it alone -- return quick!
return
global DEFAULT
state = state.lower()
if state in ('auto',):
DEFAULT = state
elif state in ('always', 'yes', 'true', True):
DEFAULT = 'always'
elif state in ('never', 'no', 'false', False):
DEFAULT = 'never'
class Coloring(object):
def __init__(self, config, section_type):
@ -89,9 +111,11 @@ class Coloring(object):
self._config = config
self._out = sys.stdout
on = self._config.GetString(self._section)
on = DEFAULT
if on is None:
on = self._config.GetString('color.ui')
on = self._config.GetString(self._section)
if on is None:
on = self._config.GetString('color.ui')
if on == 'auto':
if pager.active or os.isatty(1):
@ -122,6 +146,7 @@ class Coloring(object):
def printer(self, opt=None, fg=None, bg=None, attr=None):
s = self
c = self.colorer(opt, fg, bg, attr)
def f(fmt, *args):
s._out.write(c(fmt, *args))
return f
@ -129,6 +154,7 @@ class Coloring(object):
def nofmt_printer(self, opt=None, fg=None, bg=None, attr=None):
s = self
c = self.nofmt_colorer(opt, fg, bg, attr)
def f(fmt):
s._out.write(c(fmt))
return f
@ -136,11 +162,13 @@ class Coloring(object):
def colorer(self, opt=None, fg=None, bg=None, attr=None):
if self._on:
c = self._parse(opt, fg, bg, attr)
def f(fmt, *args):
output = fmt % args
return ''.join([c, output, RESET])
return f
else:
def f(fmt, *args):
return fmt % args
return f
@ -148,6 +176,7 @@ class Coloring(object):
def nofmt_colorer(self, opt=None, fg=None, bg=None, attr=None):
if self._on:
c = self._parse(opt, fg, bg, attr)
def f(fmt):
return ''.join([c, fmt, RESET])
return f


@ -31,7 +31,7 @@ class Command(object):
manifest = None
_optparse = None
def WantPager(self, opt):
def WantPager(self, _opt):
return False
def ReadEnvironmentOptions(self, opts):
@ -63,7 +63,7 @@ class Command(object):
usage = self.helpUsage.strip().replace('%prog', me)
except AttributeError:
usage = 'repo %s' % self.NAME
self._optparse = optparse.OptionParser(usage = usage)
self._optparse = optparse.OptionParser(usage=usage)
self._Options(self._optparse)
return self._optparse
@ -106,19 +106,24 @@ class Command(object):
def _UpdatePathToProjectMap(self, project):
self._by_path[project.worktree] = project
def _GetProjectByPath(self, path):
def _GetProjectByPath(self, manifest, path):
project = None
if os.path.exists(path):
oldpath = None
while path \
and path != oldpath \
and path != self.manifest.topdir:
while path and \
path != oldpath and \
path != manifest.topdir:
try:
project = self._by_path[path]
break
except KeyError:
oldpath = path
path = os.path.dirname(path)
if not project and path == manifest.topdir:
try:
project = self._by_path[path]
except KeyError:
pass
else:
try:
project = self._by_path[path]
@ -126,21 +131,24 @@ class Command(object):
pass
return project
def GetProjects(self, args, missing_ok=False, submodules_ok=False):
def GetProjects(self, args, manifest=None, groups='', missing_ok=False,
submodules_ok=False):
"""A list of projects that match the arguments.
"""
all_projects = self.manifest.projects
if not manifest:
manifest = self.manifest
all_projects_list = manifest.projects
result = []
mp = self.manifest.manifestProject
mp = manifest.manifestProject
groups = mp.config.GetString('manifest.groups')
if not groups:
groups = mp.config.GetString('manifest.groups')
if not groups:
groups = 'default,platform-' + platform.system().lower()
groups = [x for x in re.split(r'[,\s]+', groups) if x]
if not args:
all_projects_list = list(all_projects.values())
derived_projects = {}
for project in all_projects_list:
if submodules_ok or project.sync_s:
@ -148,55 +156,66 @@ class Command(object):
for p in project.GetDerivedSubprojects())
all_projects_list.extend(derived_projects.values())
for project in all_projects_list:
if ((missing_ok or project.Exists) and
project.MatchesGroups(groups)):
if (missing_ok or project.Exists) and project.MatchesGroups(groups):
result.append(project)
else:
self._ResetPathToProjectMap(all_projects.values())
self._ResetPathToProjectMap(all_projects_list)
for arg in args:
project = all_projects.get(arg)
projects = manifest.GetProjectsWithName(arg)
if not project:
if not projects:
path = os.path.abspath(arg).replace('\\', '/')
project = self._GetProjectByPath(path)
project = self._GetProjectByPath(manifest, path)
# If it's not a derived project, update path->project mapping and
# search again, as arg might actually point to a derived subproject.
if (project and not project.Derived and
(submodules_ok or project.sync_s)):
if (project and not project.Derived and (submodules_ok or
project.sync_s)):
search_again = False
for subproject in project.GetDerivedSubprojects():
self._UpdatePathToProjectMap(subproject)
search_again = True
if search_again:
project = self._GetProjectByPath(path) or project
project = self._GetProjectByPath(manifest, path) or project
if not project:
raise NoSuchProjectError(arg)
if not missing_ok and not project.Exists:
raise NoSuchProjectError(arg)
if not project.MatchesGroups(groups):
raise InvalidProjectGroupsError(arg)
if project:
projects = [project]
result.append(project)
if not projects:
raise NoSuchProjectError(arg)
for project in projects:
if not missing_ok and not project.Exists:
raise NoSuchProjectError(arg)
if not project.MatchesGroups(groups):
raise InvalidProjectGroupsError(arg)
result.extend(projects)
def _getpath(x):
return x.relpath
result.sort(key=_getpath)
return result
def FindProjects(self, args):
def FindProjects(self, args, inverse=False):
result = []
patterns = [re.compile(r'%s' % a, re.IGNORECASE) for a in args]
for project in self.GetProjects(''):
for pattern in patterns:
if pattern.search(project.name) or pattern.search(project.relpath):
match = pattern.search(project.name) or pattern.search(project.relpath)
if not inverse and match:
result.append(project)
break
if inverse and match:
break
else:
if inverse:
result.append(project)
result.sort(key=lambda project: project.relpath)
return result
# pylint: disable=W0223
# Pylint warns that the `InteractiveCommand` and `PagedCommand` classes do not
# override method `Execute` which is abstract in `Command`. Since that method
@ -206,19 +225,33 @@ class InteractiveCommand(Command):
"""Command which requires user interaction on the tty and
must not run within a pager, even if the user asks to.
"""
def WantPager(self, opt):
def WantPager(self, _opt):
return False
class PagedCommand(Command):
"""Command which defaults to output in a pager, as its
display tends to be larger than one screen full.
"""
def WantPager(self, opt):
def WantPager(self, _opt):
return True
# pylint: enable=W0223
class MirrorSafeCommand(object):
"""Command permits itself to run within a mirror,
and does not require a working directory.
"""
class GitcAvailableCommand(object):
"""Command that requires GITC to be available, but does
not require the local client to be a GITC client.
"""
class GitcClientCommand(object):
"""Command that requires the local client to be a GITC
client.
"""


@ -26,16 +26,19 @@ following DTD:
manifest-server?,
remove-project*,
project*,
extend-project*,
repo-hooks?)>
<!ELEMENT notice (#PCDATA)>
<!ELEMENT remote (EMPTY)>
<!ATTLIST remote name ID #REQUIRED>
<!ATTLIST remote alias CDATA #IMPLIED>
<!ATTLIST remote fetch CDATA #REQUIRED>
<!ATTLIST remote pushurl CDATA #IMPLIED>
<!ATTLIST remote review CDATA #IMPLIED>
<!ATTLIST remote revision CDATA #IMPLIED>
<!ELEMENT default (EMPTY)>
<!ATTLIST default remote IDREF #IMPLIED>
<!ATTLIST default revision CDATA #IMPLIED>
@ -45,10 +48,12 @@ following DTD:
<!ATTLIST default sync-s CDATA #IMPLIED>
<!ELEMENT manifest-server (EMPTY)>
<!ATTLIST url CDATA #REQUIRED>
<!ELEMENT project (annotation?,
project*)>
<!ATTLIST manifest-server url CDATA #REQUIRED>
<!ELEMENT project (annotation*,
project*,
copyfile*,
linkfile*)>
<!ATTLIST project name CDATA #REQUIRED>
<!ATTLIST project path CDATA #IMPLIED>
<!ATTLIST project remote IDREF #IMPLIED>
@ -65,7 +70,20 @@ following DTD:
<!ATTLIST annotation name CDATA #REQUIRED>
<!ATTLIST annotation value CDATA #REQUIRED>
<!ATTLIST annotation keep CDATA "true">
<!ELEMENT copyfile (EMPTY)>
<!ATTLIST copyfile src CDATA #REQUIRED>
<!ATTLIST copyfile dest CDATA #REQUIRED>
<!ELEMENT linkfile (EMPTY)>
<!ATTLIST linkfile src CDATA #REQUIRED>
<!ATTLIST linkfile dest CDATA #REQUIRED>
<!ELEMENT extend-project (EMPTY)>
<!ATTLIST extend-project name CDATA #REQUIRED>
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ELEMENT remove-project (EMPTY)>
<!ATTLIST remove-project name CDATA #REQUIRED>
@ -108,10 +126,20 @@ Attribute `fetch`: The Git URL prefix for all projects which use
this remote. Each project's name is appended to this prefix to
form the actual URL used to clone the project.
Attribute `pushurl`: The Git "push" URL prefix for all projects
which use this remote. Each project's name is appended to this
prefix to form the actual URL used to "git push" the project.
This attribute is optional; if not specified then "git push"
will use the same URL as the `fetch` attribute.
Attribute `review`: Hostname of the Gerrit server where reviews
are uploaded to by `repo upload`. This attribute is optional;
if not specified then `repo upload` will not function.
Attribute `revision`: Name of a Git branch (e.g. `master` or
`refs/heads/master`). Remotes with their own revision will override
the default revision.
Element default
---------------
@ -132,14 +160,14 @@ Project elements not setting their own `dest-branch` will inherit
this value. If this value is not set, projects will use `revision`
by default instead.
Attribute `sync_j`: Number of parallel jobs to use when synching.
Attribute `sync-j`: Number of parallel jobs to use when synching.
Attribute `sync_c`: Set to true to only sync the given Git
Attribute `sync-c`: Set to true to only sync the given Git
branch (specified in the `revision` attribute) rather than the
whole ref space. Project elements lacking a sync_c element of
whole ref space. Project elements lacking a sync-c element of
their own will use this value.
Attribute `sync_s`: Set to true to also sync sub-projects.
Attribute `sync-s`: Set to true to also sync sub-projects.
Element manifest-server
@ -154,7 +182,8 @@ The manifest server should implement the following RPC methods:
GetApprovedManifest(branch, target)
Return a manifest in which each project is pegged to a known good revision
for the current branch and target.
for the current branch and target. This is used by repo sync when the
--smart-sync option is given.
The target to use is defined by environment variables TARGET_PRODUCT
and TARGET_BUILD_VARIANT. These variables are used to create a string
@ -166,7 +195,8 @@ should choose a reasonable default target.
GetManifest(tag)
Return a manifest in which each project is pegged to the revision at
the specified tag.
the specified tag. This is used by repo sync when the --smart-tag option
is given.
Element project
@ -208,7 +238,8 @@ to track for this project. Names can be relative to refs/heads
(e.g. just "master") or absolute (e.g. "refs/heads/master").
Tags and/or explicit SHA-1s should work in theory, but have not
been extensively tested. If not supplied the revision given by
the default element is used.
the remote element is used if applicable, else the default
element is used.
Attribute `dest-branch`: Name of a Git branch (e.g. `master`).
When using `repo upload`, changes will be submitted for code
@ -226,13 +257,13 @@ group "notdefault", it will not be automatically downloaded by repo.
If the project has a parent element, the `name` and `path` here
are the prefixed ones.
Attribute `sync_c`: Set to true to only sync the given Git
Attribute `sync-c`: Set to true to only sync the given Git
branch (specified in the `revision` attribute) rather than the
whole ref space.
Attribute `sync_s`: Set to true to also sync sub-projects.
Attribute `sync-s`: Set to true to also sync sub-projects.
Attribute `upstream`: Name of the Git branch in which a sha1
Attribute `upstream`: Name of the Git ref in which a sha1
can be found. Used when syncing a revision locked manifest in
-c mode to avoid having to sync the entire ref space.
@ -246,6 +277,22 @@ rather than the `name` attribute. This attribute only applies to the
local mirrors syncing, it will be ignored when syncing the projects in a
client working directory.
Element extend-project
----------------------
Modify the attributes of the named project.
This element is mostly useful in a local manifest file, to modify the
attributes of an existing project without completely replacing the
existing project definition. This makes the local manifest more robust
against changes to the original manifest.
Attribute `path`: If specified, limit the change to projects checked out
at the specified path, rather than all projects with the given name.
Attribute `groups`: List of additional groups to which this project
belongs. Same syntax as the corresponding element of `project`.
Element annotation
------------------
@ -257,6 +304,21 @@ prefixed with REPO__. In addition, there is an optional attribute
"false". This attribute determines whether or not the annotation will
be kept when exported with the manifest subcommand.
Element copyfile
----------------
Zero or more copyfile elements may be specified as children of a
project element. Each element describes a src-dest pair of files;
the "src" file will be copied to the "dest" place during 'repo sync'
command.
"src" is project relative, "dest" is relative to the top of the tree.
Element linkfile
----------------
It's just like copyfile and runs at the same time as copyfile but
instead of copying it creates a symlink.
Element remove-project
----------------------


@ -24,6 +24,13 @@ class ManifestInvalidRevisionError(Exception):
class NoManifestException(Exception):
"""The required manifest does not exist.
"""
def __init__(self, path, reason):
super(NoManifestException, self).__init__()
self.path = path
self.reason = reason
def __str__(self):
return self.reason
class EditorError(Exception):
"""Unspecified error from the user's text editor.
@ -73,7 +80,7 @@ class NoSuchProjectError(Exception):
self.name = name
def __str__(self):
if self.Name is None:
if self.name is None:
return 'in current directory'
return self.name
@ -86,7 +93,7 @@ class InvalidProjectGroupsError(Exception):
self.name = name
def __str__(self):
if self.Name is None:
if self.name is None:
return 'in current directory'
return self.name


@ -14,13 +14,16 @@
# limitations under the License.
from __future__ import print_function
import fcntl
import os
import select
import sys
import subprocess
import tempfile
from signal import SIGTERM
from error import GitError
from trace import REPO_TRACE, IsTrace, Trace
from wrapper import Wrapper
GIT = 'git'
MIN_GIT_VERSION = (1, 5, 4)
@ -75,24 +78,32 @@ def terminate_ssh_clients():
_git_version = None
class _sfd(object):
"""select file descriptor class"""
def __init__(self, fd, dest, std_name):
assert std_name in ('stdout', 'stderr')
self.fd = fd
self.dest = dest
self.std_name = std_name
def fileno(self):
return self.fd.fileno()
class _GitCall(object):
def version(self):
p = GitCommand(None, ['--version'], capture_stdout=True)
if p.Wait() == 0:
return p.stdout
if hasattr(p.stdout, 'decode'):
return p.stdout.decode('utf-8')
else:
return p.stdout
return None
def version_tuple(self):
global _git_version
if _git_version is None:
ver_str = git.version()
if ver_str.startswith('git version '):
_git_version = tuple(
map(int,
ver_str[len('git version '):].strip().split('-')[0].split('.')[0:3]
))
else:
_git_version = Wrapper().ParseGitVersion(ver_str)
if _git_version is None:
print('fatal: "%s" unsupported' % ver_str, file=sys.stderr)
sys.exit(1)
return _git_version
@ -143,6 +154,9 @@ class GitCommand(object):
if key in env:
del env[key]
# If we are not capturing std* then need to print it.
self.tee = {'stdout': not capture_stdout, 'stderr': not capture_stderr}
if disable_editor:
_setenv(env, 'GIT_EDITOR', ':')
if ssh_proxy:
@ -154,6 +168,9 @@ class GitCommand(object):
if p is not None:
s = p + ' ' + s
_setenv(env, 'GIT_CONFIG_PARAMETERS', s)
if 'GIT_ALLOW_PROTOCOL' not in env:
_setenv(env, 'GIT_ALLOW_PROTOCOL',
'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
if project:
if not cwd:
@ -166,22 +183,21 @@ class GitCommand(object):
if gitdir:
_setenv(env, GIT_DIR, gitdir)
cwd = None
command.extend(cmdv)
command.append(cmdv[0])
# Need to use the --progress flag for fetch/clone so output will be
# displayed as by default git only does progress output if stderr is a TTY.
if sys.stderr.isatty() and cmdv[0] in ('fetch', 'clone'):
if '--progress' not in cmdv and '--quiet' not in cmdv:
command.append('--progress')
command.extend(cmdv[1:])
if provide_stdin:
stdin = subprocess.PIPE
else:
stdin = None
if capture_stdout:
stdout = subprocess.PIPE
else:
stdout = None
if capture_stderr:
stderr = subprocess.PIPE
else:
stderr = None
stdout = subprocess.PIPE
stderr = subprocess.PIPE
if IsTrace():
global LAST_CWD
@ -230,8 +246,36 @@ class GitCommand(object):
def Wait(self):
try:
p = self.process
(self.stdout, self.stderr) = p.communicate()
rc = p.returncode
rc = self._CaptureOutput()
finally:
_remove_ssh_client(p)
return rc
def _CaptureOutput(self):
p = self.process
s_in = [_sfd(p.stdout, sys.stdout, 'stdout'),
_sfd(p.stderr, sys.stderr, 'stderr')]
self.stdout = ''
self.stderr = ''
for s in s_in:
flags = fcntl.fcntl(s.fd, fcntl.F_GETFL)
fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
while s_in:
in_ready, _, _ = select.select(s_in, [], [])
for s in in_ready:
buf = s.fd.read(4096)
if not buf:
s_in.remove(s)
continue
if not hasattr(buf, 'encode'):
buf = buf.decode()
if s.std_name == 'stdout':
self.stdout += buf
else:
self.stderr += buf
if self.tee[s.std_name]:
s.dest.write(buf)
s.dest.flush()
return p.wait()


@ -15,8 +15,10 @@
from __future__ import print_function
import contextlib
import errno
import json
import os
import pickle
import re
import subprocess
import sys
@ -80,7 +82,7 @@ class GitConfig(object):
return cls(configfile = os.path.join(gitdir, 'config'),
defaults = defaults)
def __init__(self, configfile, defaults=None, pickleFile=None):
def __init__(self, configfile, defaults=None, jsonFile=None):
self.file = configfile
self.defaults = defaults
self._cache_dict = None
@ -88,12 +90,11 @@ class GitConfig(object):
self._remotes = {}
self._branches = {}
if pickleFile is None:
self._pickle = os.path.join(
self._json = jsonFile
if self._json is None:
self._json = os.path.join(
os.path.dirname(self.file),
'.repopickle_' + os.path.basename(self.file))
else:
self._pickle = pickleFile
'.repo_' + os.path.basename(self.file) + '.json')
def Has(self, name, include_defaults = True):
"""Return true if this configuration file has the key.
@ -217,9 +218,9 @@ class GitConfig(object):
"""Resolve any url.*.insteadof references.
"""
for new_url in self.GetSubSections('url'):
old_url = self.GetString('url.%s.insteadof' % new_url)
if old_url is not None and url.startswith(old_url):
return new_url + url[len(old_url):]
for old_url in self.GetString('url.%s.insteadof' % new_url, True):
if old_url is not None and url.startswith(old_url):
return new_url + url[len(old_url):]
return url
@property
@ -248,50 +249,41 @@ class GitConfig(object):
return self._cache_dict
def _Read(self):
d = self._ReadPickle()
d = self._ReadJson()
if d is None:
d = self._ReadGit()
self._SavePickle(d)
self._SaveJson(d)
return d
def _ReadPickle(self):
def _ReadJson(self):
try:
if os.path.getmtime(self._pickle) \
if os.path.getmtime(self._json) \
<= os.path.getmtime(self.file):
os.remove(self._pickle)
os.remove(self._json)
return None
except OSError:
return None
try:
Trace(': unpickle %s', self.file)
fd = open(self._pickle, 'rb')
Trace(': parsing %s', self.file)
fd = open(self._json)
try:
return pickle.load(fd)
return json.load(fd)
finally:
fd.close()
except EOFError:
os.remove(self._pickle)
return None
except IOError:
os.remove(self._pickle)
return None
except pickle.PickleError:
os.remove(self._pickle)
except (IOError, ValueError):
os.remove(self._json)
return None
def _SavePickle(self, cache):
def _SaveJson(self, cache):
try:
fd = open(self._pickle, 'wb')
fd = open(self._json, 'w')
try:
pickle.dump(cache, fd, pickle.HIGHEST_PROTOCOL)
json.dump(cache, fd, indent=2)
finally:
fd.close()
except IOError:
if os.path.exists(self._pickle):
os.remove(self._pickle)
except pickle.PickleError:
if os.path.exists(self._pickle):
os.remove(self._pickle)
except (IOError, TypeError):
if os.path.exists(self._json):
os.remove(self._json)
def _ReadGit(self):
"""
@ -304,8 +296,8 @@ class GitConfig(object):
d = self._do('--null', '--list')
if d is None:
return c
for line in d.rstrip('\0').split('\0'): # pylint: disable=W1401
# Backslash is not anomalous
for line in d.decode('utf-8').rstrip('\0').split('\0'): # pylint: disable=W1401
# Backslash is not anomalous
if '\n' in line:
key, val = line.split('\n', 1)
else:
@ -472,9 +464,13 @@ def _open_ssh(host, port=None):
% (host,port, str(e)), file=sys.stderr)
return False
time.sleep(1)
ssh_died = (p.poll() is not None)
if ssh_died:
return False
_master_processes.append(p)
_master_keys.add(key)
time.sleep(1)
return True
finally:
_master_keys_lock.release()
@ -512,6 +508,43 @@ def GetSchemeFromUrl(url):
return m.group(1)
return None
@contextlib.contextmanager
def GetUrlCookieFile(url, quiet):
if url.startswith('persistent-'):
try:
p = subprocess.Popen(
['git-remote-persistent-https', '-print_config', url],
stdin=subprocess.PIPE, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
try:
cookieprefix = 'http.cookiefile='
proxyprefix = 'http.proxy='
cookiefile = None
proxy = None
for line in p.stdout:
line = line.strip()
if line.startswith(cookieprefix):
cookiefile = line[len(cookieprefix):]
if line.startswith(proxyprefix):
proxy = line[len(proxyprefix):]
# Leave subprocess open, as cookie file may be transient.
if cookiefile or proxy:
yield cookiefile, proxy
return
finally:
p.stdin.close()
if p.wait():
err_msg = p.stderr.read()
if ' -print_config' in err_msg:
pass # Persistent proxy doesn't support -print_config.
elif not quiet:
print(err_msg, file=sys.stderr)
except OSError as e:
if e.errno == errno.ENOENT:
pass # No persistent proxy.
raise
yield GitConfig.ForUser().GetString('http.cookiefile'), None
def _preconnect(url):
m = URI_ALL.match(url)
if m:
@ -539,6 +572,7 @@ class Remote(object):
self._config = config
self.name = name
self.url = self._Get('url')
self.pushUrl = self._Get('pushurl')
self.review = self._Get('review')
self.projectname = self._Get('projectname')
self.fetch = list(map(RefSpec.FromString,
@ -576,7 +610,9 @@ class Remote(object):
return None
u = self.review
if not u.startswith('http:') and not u.startswith('https:'):
if u.startswith('persistent-'):
u = u[len('persistent-'):]
if u.split(':')[0] not in ('http', 'https', 'sso'):
u = 'http://%s' % u
if u.endswith('/Gerrit'):
u = u[:len(u) - len('/Gerrit')]
@ -592,6 +628,9 @@ class Remote(object):
host, port = os.environ['REPO_HOST_PORT_INFO'].split()
self._review_url = self._SshReviewUrl(userEmail, host, port)
REVIEW_CACHE[u] = self._review_url
elif u.startswith('sso:'):
self._review_url = u # Assume it's right
REVIEW_CACHE[u] = self._review_url
else:
try:
info_url = u + 'ssh_info'
@ -601,7 +640,7 @@ class Remote(object):
# of HTML response back, like maybe a login page.
#
# Assume HTTP if SSH is not enabled or ssh_info doesn't look right.
self._review_url = http_url + 'p/'
self._review_url = http_url
else:
host, port = info.split()
self._review_url = self._SshReviewUrl(userEmail, host, port)
@ -624,9 +663,7 @@ class Remote(object):
def ToLocal(self, rev):
"""Convert a remote revision string to something we have locally.
"""
if IsId(rev):
return rev
if rev.startswith(R_TAGS):
if self.name == '.' or IsId(rev):
return rev
if not rev.startswith('refs/'):
@ -635,6 +672,10 @@ class Remote(object):
for spec in self.fetch:
if spec.SourceMatches(rev):
return spec.MapSource(rev)
if not rev.startswith(R_HEADS):
return rev
raise GitError('remote %s does not have %s' % (self.name, rev))
def WritesTo(self, ref):
@ -658,6 +699,10 @@ class Remote(object):
"""Save this remote to the configuration.
"""
self._Set('url', self.url)
if self.pushUrl is not None:
self._Set('pushurl', self.pushUrl + '/' + self.projectname)
else:
self._Set('pushurl', self.pushUrl)
self._Set('review', self.review)
self._Set('projectname', self.projectname)
self._Set('fetch', list(map(str, self.fetch)))
@ -704,7 +749,7 @@ class Branch(object):
self._Set('merge', self.merge)
else:
fd = open(self._config.file, 'ab')
fd = open(self._config.file, 'a')
try:
fd.write('[branch "%s"]\n' % self.name)
if self.remote:


@ -100,7 +100,7 @@ class GitRefs(object):
def _ReadPackedRefs(self):
path = os.path.join(self._gitdir, 'packed-refs')
try:
fd = open(path, 'rb')
fd = open(path, 'r')
mtime = os.path.getmtime(path)
except IOError:
return

gitc_utils.py (new file, 154 lines)

@ -0,0 +1,154 @@
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import platform
import re
import sys
import time
import git_command
import git_config
import wrapper
from error import ManifestParseError
NUM_BATCH_RETRIEVE_REVISIONID = 32
def get_gitc_manifest_dir():
return wrapper.Wrapper().get_gitc_manifest_dir()
def parse_clientdir(gitc_fs_path):
return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path)
def _set_project_revisions(projects):
"""Sets the revisionExpr for a list of projects.
Because of the limit of open file descriptors allowed, length of projects
should not be overly large. Recommend calling this function multiple times
with each call not exceeding NUM_BATCH_RETRIEVE_REVISIONID projects.
@param projects: List of project objects to set the revionExpr for.
"""
# Retrieve the commit id for each project based off of it's current
# revisionExpr and it is not already a commit id.
project_gitcmds = [(
project, git_command.GitCommand(None,
['ls-remote',
project.remote.url,
project.revisionExpr],
capture_stdout=True, cwd='/tmp'))
for project in projects if not git_config.IsId(project.revisionExpr)]
for proj, gitcmd in project_gitcmds:
if gitcmd.Wait():
print('FATAL: Failed to retrieve revisionExpr for %s' % proj)
sys.exit(1)
revisionExpr = gitcmd.stdout.split('\t')[0]
if not revisionExpr:
raise(ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
(proj.remote.url, proj.revisionExpr)))
proj.revisionExpr = revisionExpr
def _manifest_groups(manifest):
"""Returns the manifest group string that should be synced
This is the same logic used by Command.GetProjects(), which is used during
repo sync
@param manifest: The XmlManifest object
"""
mp = manifest.manifestProject
groups = mp.config.GetString('manifest.groups')
if not groups:
groups = 'default,platform-' + platform.system().lower()
return groups
def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
"""Generate a manifest for shafsd to use for this GITC client.
@param gitc_manifest: Current gitc manifest, or None if there isn't one yet.
@param manifest: A GitcManifest object loaded with the current repo manifest.
@param paths: List of project paths we want to update.
"""
print('Generating GITC Manifest by fetching revision SHAs for each '
'project.')
if paths is None:
paths = manifest.paths.keys()
groups = [x for x in re.split(r'[,\s]+', _manifest_groups(manifest)) if x]
# Convert the paths to projects, and filter them to the matched groups.
projects = [manifest.paths[p] for p in paths]
projects = [p for p in projects if p.MatchesGroups(groups)]
if gitc_manifest is not None:
for path, proj in manifest.paths.iteritems():
if not proj.MatchesGroups(groups):
continue
if not proj.upstream and not git_config.IsId(proj.revisionExpr):
proj.upstream = proj.revisionExpr
if not path in gitc_manifest.paths:
# Any new projects need their first revision, even if we weren't asked
# for them.
projects.append(proj)
elif not path in paths:
# And copy revisions from the previous manifest if we're not updating
# them now.
gitc_proj = gitc_manifest.paths[path]
if gitc_proj.old_revision:
proj.revisionExpr = None
proj.old_revision = gitc_proj.old_revision
else:
proj.revisionExpr = gitc_proj.revisionExpr
index = 0
while index < len(projects):
_set_project_revisions(
projects[index:(index+NUM_BATCH_RETRIEVE_REVISIONID)])
index += NUM_BATCH_RETRIEVE_REVISIONID
if gitc_manifest is not None:
for path, proj in gitc_manifest.paths.iteritems():
if proj.old_revision and path in paths:
# If we updated a project that has been started, keep the old-revision
# updated.
repo_proj = manifest.paths[path]
repo_proj.old_revision = repo_proj.revisionExpr
repo_proj.revisionExpr = None
# Convert URLs from relative to absolute.
for _name, remote in manifest.remotes.iteritems():
remote.fetchUrl = remote.resolvedFetchUrl
# Save the manifest.
save_manifest(manifest)
def save_manifest(manifest, client_dir=None):
"""Save the manifest file in the client_dir.
@param client_dir: Client directory to save the manifest in.
@param manifest: Manifest object to save.
"""
if not client_dir:
client_dir = manifest.gitc_client_dir
with open(os.path.join(client_dir, '.manifest'), 'w') as f:
manifest.Save(f, groups=_manifest_groups(manifest))
# TODO(sbasi/jorg): Come up with a solution to remove the sleep below.
# Give the GITC filesystem time to register the manifest changes.
time.sleep(3)


@ -1,7 +1,7 @@
#!/bin/sh
# From Gerrit Code Review 2.5.2
# From Gerrit Code Review 2.12.1
#
# Part of Gerrit Code Review (http://code.google.com/p/gerrit/)
# Part of Gerrit Code Review (https://www.gerritcodereview.com/)
#
# Copyright (C) 2009 The Android Open Source Project
#
@ -20,14 +20,14 @@
unset GREP_OPTIONS
CHANGE_ID_AFTER="Bug|Issue"
CHANGE_ID_AFTER="Bug|Issue|Test"
MSG="$1"
# Check for, and add if missing, a unique Change-Id
#
add_ChangeId() {
clean_message=`sed -e '
/^diff --git a\/.*/{
/^diff --git .*/{
s///
q
}
@ -39,6 +39,17 @@ add_ChangeId() {
return
fi
# Do not add Change-Id to temp commits
if echo "$clean_message" | head -1 | grep -q '^\(fixup\|squash\)!'
then
return
fi
if test "false" = "`git config --bool --get gerrit.createChangeId`"
then
return
fi
# Does Change-Id: already exist? if so, exit (no change).
if grep -i '^Change-Id:' "$MSG" >/dev/null
then
@ -53,6 +64,10 @@ add_ChangeId() {
AWK=/usr/xpg4/bin/awk
fi
# Get core.commentChar from git config or use default symbol
commentChar=`git config --get core.commentChar`
commentChar=${commentChar:-#}
# How this works:
# - parse the commit message as (textLine+ blankLine*)*
# - assume textLine+ to be a footer until proven otherwise
@ -71,13 +86,13 @@ add_ChangeId() {
blankLines = 0
}
# Skip lines starting with "#" without any spaces before it.
/^#/ { next }
# Skip lines starting with commentChar without any spaces before it.
/^'"$commentChar"'/ { next }
# Skip the line starting with the diff command and everything after it,
# up to the end of the file, assuming it is only patch data.
# If more than one line before the diff was empty, strip all but one.
/^diff --git a/ {
/^diff --git / {
blankLines = 0
while (getline) { }
next
@ -154,7 +169,7 @@ add_ChangeId() {
if (unprinted) {
print "Change-Id: I'"$id"'"
}
}' "$MSG" > $T && mv $T "$MSG" || rm -f $T
}' "$MSG" > "$T" && mv "$T" "$MSG" || rm -f "$T"
}
_gen_ChangeIdInput() {
echo "tree `git write-tree`"


@ -35,7 +35,7 @@ elif grep -q "AC Power \+: 1" /proc/pmu/info 2>/dev/null
then
exit 0
elif test -x /usr/bin/pmset && /usr/bin/pmset -g batt |
grep -q "Currently drawing from 'AC Power'"
grep -q "drawing from 'AC Power'"
then
exit 0
elif test -d /sys/bus/acpi/drivers/battery && test 0 = \

main.py (167 changed lines)

@ -31,21 +31,31 @@ else:
urllib = imp.new_module('urllib')
urllib.request = urllib2
try:
import kerberos
except ImportError:
kerberos = None
from color import SetDefaultColoring
from trace import SetTrace
from git_command import git, GitCommand
from git_config import init_ssh, close_ssh
from command import InteractiveCommand
from command import MirrorSafeCommand
from command import GitcAvailableCommand, GitcClientCommand
from subcmds.version import Version
from editor import Editor
from error import DownloadError
from error import InvalidProjectGroupsError
from error import ManifestInvalidRevisionError
from error import ManifestParseError
from error import NoManifestException
from error import NoSuchProjectError
from error import RepoChangedException
from manifest_xml import XmlManifest
import gitc_utils
from manifest_xml import GitcManifest, XmlManifest
from pager import RunPager
from wrapper import WrapperPath, Wrapper
from subcmds import all_commands
@ -63,6 +73,9 @@ global_options.add_option('-p', '--paginate',
global_options.add_option('--no-pager',
dest='no_pager', action='store_true',
help='disable the pager')
global_options.add_option('--color',
choices=('auto', 'always', 'never'), default=None,
help='control color usage: auto, always, never')
global_options.add_option('--trace',
dest='trace', action='store_true',
help='trace git command execution')
@ -107,6 +120,8 @@ class _Repo(object):
print('fatal: invalid usage of --version', file=sys.stderr)
return 1
SetDefaultColoring(gopts.color)
try:
cmd = self.commands[name]
except KeyError:
@ -116,6 +131,12 @@ class _Repo(object):
cmd.repodir = self.repodir
cmd.manifest = XmlManifest(cmd.repodir)
cmd.gitc_manifest = None
gitc_client_name = gitc_utils.parse_clientdir(os.getcwd())
if gitc_client_name:
cmd.gitc_manifest = GitcManifest(cmd.repodir, gitc_client_name)
cmd.manifest.isGitcClient = True
Editor.globalConfig = cmd.manifest.globalConfig
if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror:
@ -123,8 +144,25 @@ class _Repo(object):
file=sys.stderr)
return 1
copts, cargs = cmd.OptionParser.parse_args(argv)
copts = cmd.ReadEnvironmentOptions(copts)
if isinstance(cmd, GitcAvailableCommand) and not gitc_utils.get_gitc_manifest_dir():
print("fatal: '%s' requires GITC to be available" % name,
file=sys.stderr)
return 1
if isinstance(cmd, GitcClientCommand) and not gitc_client_name:
print("fatal: '%s' requires a GITC client" % name,
file=sys.stderr)
return 1
try:
copts, cargs = cmd.OptionParser.parse_args(argv)
copts = cmd.ReadEnvironmentOptions(copts)
except NoManifestException as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
return 1
if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig
@ -140,15 +178,13 @@ class _Repo(object):
start = time.time()
try:
result = cmd.Execute(copts, cargs)
except DownloadError as e:
print('error: %s' % str(e), file=sys.stderr)
result = 1
except ManifestInvalidRevisionError as e:
print('error: %s' % str(e), file=sys.stderr)
result = 1
except NoManifestException as e:
print('error: manifest required for this command -- please run init',
file=sys.stderr)
except (DownloadError, ManifestInvalidRevisionError,
NoManifestException) as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
if isinstance(e, NoManifestException):
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
result = 1
except NoSuchProjectError as e:
if e.name:
@ -156,6 +192,12 @@ class _Repo(object):
else:
print('error: no project in current directory', file=sys.stderr)
result = 1
except InvalidProjectGroupsError as e:
if e.name:
print('error: project group must be enabled for project %s' % e.name, file=sys.stderr)
else:
print('error: project group must be enabled for the project in the current directory', file=sys.stderr)
result = 1
finally:
elapsed = time.time() - start
hours, remainder = divmod(elapsed, 3600)
@ -169,21 +211,10 @@ class _Repo(object):
return result
def _MyRepoPath():
return os.path.dirname(__file__)
def _MyWrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
def WrapperModule():
global _wrapper_module
if not _wrapper_module:
_wrapper_module = imp.load_source('wrapper', _MyWrapperPath())
return _wrapper_module
def _CurrentWrapperVersion():
return WrapperModule().VERSION
def _CheckWrapperVersion(ver, repo_path):
if not repo_path:
@ -193,7 +224,7 @@ def _CheckWrapperVersion(ver, repo_path):
print('no --wrapper-version argument', file=sys.stderr)
sys.exit(1)
exp = _CurrentWrapperVersion()
exp = Wrapper().VERSION
ver = tuple(map(int, ver.split('.')))
if len(ver) == 1:
ver = (0, ver[0])
@ -205,7 +236,7 @@ def _CheckWrapperVersion(ver, repo_path):
!!! You must upgrade before you can continue: !!!
cp %s %s
""" % (exp_str, _MyWrapperPath(), repo_path), file=sys.stderr)
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
sys.exit(1)
if exp > ver:
@ -214,7 +245,7 @@ def _CheckWrapperVersion(ver, repo_path):
... You should upgrade soon:
cp %s %s
""" % (exp_str, _MyWrapperPath(), repo_path), file=sys.stderr)
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
def _CheckRepoDir(repo_dir):
if not repo_dir:
@ -342,6 +373,86 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
self.retried = 0
raise
class _KerberosAuthHandler(urllib.request.BaseHandler):
def __init__(self):
self.retried = 0
self.context = None
self.handler_order = urllib.request.BaseHandler.handler_order - 50
def http_error_401(self, req, fp, code, msg, headers): # pylint:disable=unused-argument
host = req.get_host()
retry = self.http_error_auth_reqed('www-authenticate', host, req, headers)
return retry
def http_error_auth_reqed(self, auth_header, host, req, headers):
try:
spn = "HTTP@%s" % host
authdata = self._negotiate_get_authdata(auth_header, headers)
if self.retried > 3:
raise urllib.request.HTTPError(req.get_full_url(), 401,
"Negotiate auth failed", headers, None)
else:
self.retried += 1
neghdr = self._negotiate_get_svctk(spn, authdata)
if neghdr is None:
return None
req.add_unredirected_header('Authorization', neghdr)
response = self.parent.open(req)
srvauth = self._negotiate_get_authdata(auth_header, response.info())
if self._validate_response(srvauth):
return response
except kerberos.GSSError:
return None
except:
self.reset_retry_count()
raise
finally:
self._clean_context()
def reset_retry_count(self):
self.retried = 0
def _negotiate_get_authdata(self, auth_header, headers):
authhdr = headers.get(auth_header, None)
if authhdr is not None:
for mech_tuple in authhdr.split(","):
mech, __, authdata = mech_tuple.strip().partition(" ")
if mech.lower() == "negotiate":
return authdata.strip()
return None
def _negotiate_get_svctk(self, spn, authdata):
if authdata is None:
return None
result, self.context = kerberos.authGSSClientInit(spn)
if result < kerberos.AUTH_GSS_COMPLETE:
return None
result = kerberos.authGSSClientStep(self.context, authdata)
if result < kerberos.AUTH_GSS_CONTINUE:
return None
response = kerberos.authGSSClientResponse(self.context)
return "Negotiate %s" % response
def _validate_response(self, authdata):
if authdata is None:
return None
result = kerberos.authGSSClientStep(self.context, authdata)
if result == kerberos.AUTH_GSS_COMPLETE:
return True
return None
def _clean_context(self):
if self.context is not None:
kerberos.authGSSClientClean(self.context)
self.context = None
def init_http():
handlers = [_UserAgentHandler()]
@ -358,6 +469,8 @@ def init_http():
pass
handlers.append(_BasicAuthHandler(mgr))
handlers.append(_DigestAuthHandler(mgr))
if kerberos:
handlers.append(_KerberosAuthHandler())
if 'http_proxy' in os.environ:
url = os.environ['http_proxy']


@ -29,17 +29,19 @@ else:
urllib = imp.new_module('urllib')
urllib.parse = urlparse
import gitc_utils
from git_config import GitConfig
from git_refs import R_HEADS, HEAD
from project import RemoteSpec, Project, MetaProject
from error import ManifestParseError
from error import ManifestParseError, ManifestInvalidRevisionError
MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml'
LOCAL_MANIFESTS_DIR_NAME = 'local_manifests'
urllib.parse.uses_relative.extend(['ssh', 'git'])
urllib.parse.uses_netloc.extend(['ssh', 'git'])
# urljoin gets confused if the scheme is not known.
urllib.parse.uses_relative.extend(['ssh', 'git', 'persistent-https', 'rpc'])
urllib.parse.uses_netloc.extend(['ssh', 'git', 'persistent-https', 'rpc'])
class _Default(object):
"""Project defaults within the manifest."""
@ -62,13 +64,17 @@ class _XmlRemote(object):
name,
alias=None,
fetch=None,
pushUrl=None,
manifestUrl=None,
review=None):
review=None,
revision=None):
self.name = name
self.fetchUrl = fetch
self.pushUrl = pushUrl
self.manifestUrl = manifestUrl
self.remoteAlias = alias
self.reviewUrl = review
self.revision = revision
self.resolvedFetchUrl = self._resolveFetchUrl()
def __eq__(self, other):
@ -80,18 +86,17 @@ class _XmlRemote(object):
def _resolveFetchUrl(self):
url = self.fetchUrl.rstrip('/')
manifestUrl = self.manifestUrl.rstrip('/')
p = manifestUrl.startswith('persistent-http')
if p:
manifestUrl = manifestUrl[len('persistent-'):]
# urljoin will gets confused over quite a few things. The ones we care
# about here are:
# * no scheme in the base url, like <hostname:port>
# We handle no scheme by replacing it with an obscure protocol, gopher
# and then replacing it with the original when we are done.
# urljoin will get confused if there is no scheme in the base url
# ie, if manifestUrl is of the form <hostname:port>
if manifestUrl.find(':') != manifestUrl.find('/') - 1:
manifestUrl = 'gopher://' + manifestUrl
url = urllib.parse.urljoin(manifestUrl, url)
url = re.sub(r'^gopher://', '', url)
if p:
url = 'persistent-' + url
url = urllib.parse.urljoin('gopher://' + manifestUrl, url)
url = re.sub(r'^gopher://', '', url)
else:
url = urllib.parse.urljoin(manifestUrl, url)
return url
def ToRemoteSpec(self, projectName):
@ -99,7 +104,11 @@ class _XmlRemote(object):
remoteName = self.name
if self.remoteAlias:
remoteName = self.remoteAlias
return RemoteSpec(remoteName, url, self.reviewUrl)
return RemoteSpec(remoteName,
url=url,
pushUrl=self.pushUrl,
review=self.reviewUrl,
orig_name=self.name)
class XmlManifest(object):
"""manages the repo configuration file"""
@ -110,6 +119,7 @@ class XmlManifest(object):
self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME)
self.globalConfig = GitConfig.ForUser()
self.localManifestWarning = False
self.isGitcClient = False
self.repoProject = MetaProject(self, 'repo',
gitdir = os.path.join(repodir, 'repo/.git'),
@ -153,19 +163,27 @@ class XmlManifest(object):
root.appendChild(e)
e.setAttribute('name', r.name)
e.setAttribute('fetch', r.fetchUrl)
if r.pushUrl is not None:
e.setAttribute('pushurl', r.pushUrl)
if r.remoteAlias is not None:
e.setAttribute('alias', r.remoteAlias)
if r.reviewUrl is not None:
e.setAttribute('review', r.reviewUrl)
if r.revision is not None:
e.setAttribute('revision', r.revision)
def Save(self, fd, peg_rev=False, peg_rev_upstream=True):
def _ParseGroups(self, groups):
return [x for x in re.split(r'[,\s]+', groups) if x]
def Save(self, fd, peg_rev=False, peg_rev_upstream=True, groups=None):
"""Write the current manifest out to the given file descriptor.
"""
mp = self.manifestProject
groups = mp.config.GetString('manifest.groups')
if groups is None:
groups = mp.config.GetString('manifest.groups')
if groups:
groups = [x for x in re.split(r'[,\s]+', groups) if x]
groups = self._ParseGroups(groups)
doc = xml.dom.minidom.Document()
root = doc.createElement('manifest')
@ -195,6 +213,9 @@ class XmlManifest(object):
if d.revisionExpr:
have_default = True
e.setAttribute('revision', d.revisionExpr)
if d.destBranchExpr:
have_default = True
e.setAttribute('dest-branch', d.destBranchExpr)
if d.sync_j > 1:
have_default = True
e.setAttribute('sync-j', '%d' % d.sync_j)
@ -215,8 +236,9 @@ class XmlManifest(object):
root.appendChild(doc.createTextNode(''))
def output_projects(parent, parent_node, projects):
for p in projects:
output_project(parent, parent_node, self.projects[p])
for project_name in projects:
for project in self._projects[project_name]:
output_project(parent, parent_node, project)
def output_project(parent, parent_node, p):
if not p.MatchesGroups(groups):
@ -233,22 +255,34 @@ class XmlManifest(object):
e.setAttribute('name', name)
if relpath != name:
e.setAttribute('path', relpath)
remoteName = d.remote.remoteAlias or d.remote.name
if not d.remote or p.remote.name != remoteName:
e.setAttribute('remote', p.remote.name)
remoteName = None
if d.remote:
remoteName = d.remote.name
if not d.remote or p.remote.orig_name != remoteName:
remoteName = p.remote.orig_name
e.setAttribute('remote', remoteName)
if peg_rev:
if self.IsMirror:
value = p.bare_git.rev_parse(p.revisionExpr + '^0')
else:
value = p.work_git.rev_parse(HEAD + '^0')
e.setAttribute('revision', value)
if peg_rev_upstream and value != p.revisionExpr:
# Only save the origin if the origin is not a sha1, and the default
# isn't our value, and the if the default doesn't already have that
# covered.
e.setAttribute('upstream', p.revisionExpr)
elif not d.revisionExpr or p.revisionExpr != d.revisionExpr:
e.setAttribute('revision', p.revisionExpr)
if peg_rev_upstream:
if p.upstream:
e.setAttribute('upstream', p.upstream)
elif value != p.revisionExpr:
# Only save the origin if the origin is not a sha1, and the default
# isn't our value
e.setAttribute('upstream', p.revisionExpr)
else:
revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr
if not revision or revision != p.revisionExpr:
e.setAttribute('revision', p.revisionExpr)
if p.upstream and p.upstream != p.revisionExpr:
e.setAttribute('upstream', p.upstream)
if p.dest_branch and p.dest_branch != d.destBranchExpr:
e.setAttribute('dest-branch', p.dest_branch)
for c in p.copyfiles:
ce = doc.createElement('copyfile')
@ -256,6 +290,12 @@ class XmlManifest(object):
ce.setAttribute('dest', c.dest)
e.appendChild(ce)
for l in p.linkfiles:
le = doc.createElement('linkfile')
le.setAttribute('src', l.src)
le.setAttribute('dest', l.dest)
e.appendChild(le)
default_groups = ['all', 'name:%s' % p.name, 'path:%s' % p.relpath]
egroups = [g for g in p.groups if g not in default_groups]
if egroups:
@ -274,14 +314,17 @@ class XmlManifest(object):
if p.sync_s:
e.setAttribute('sync-s', 'true')
if p.subprojects:
sort_projects = list(sorted([subp.name for subp in p.subprojects]))
output_projects(p, e, sort_projects)
if p.clone_depth:
e.setAttribute('clone-depth', str(p.clone_depth))
sort_projects = list(sorted([key for key, value in self.projects.items()
if not value.parent]))
sort_projects.sort()
output_projects(None, root, sort_projects)
self._output_manifest_project_extras(p, e)
if p.subprojects:
subprojects = set(subp.name for subp in p.subprojects)
output_projects(p, e, list(sorted(subprojects)))
projects = set(p.name for p in self._paths.values() if not p.parent)
output_projects(None, root, list(sorted(projects)))
if self._repo_hooks_project:
root.appendChild(doc.createTextNode(''))
@ -293,10 +336,19 @@ class XmlManifest(object):
doc.writexml(fd, '', ' ', '\n', 'UTF-8')
def _output_manifest_project_extras(self, p, e):
"""Manifests can modify e if they support extra project attributes."""
pass
@property
def paths(self):
self._Load()
return self._paths
@property
def projects(self):
self._Load()
return self._projects
return list(self._paths.values())
@property
def remotes(self):
@ -327,9 +379,14 @@ class XmlManifest(object):
def IsMirror(self):
return self.manifestProject.config.GetBoolean('repo.mirror')
@property
def IsArchive(self):
return self.manifestProject.config.GetBoolean('repo.archive')
def _Unload(self):
self._loaded = False
self._projects = {}
self._paths = {}
self._remotes = {}
self._default = None
self._repo_hooks_project = None
@ -461,11 +518,17 @@ class XmlManifest(object):
self._manifest_server = url
def recursively_add_projects(project):
if self._projects.get(project.name):
projects = self._projects.setdefault(project.name, [])
if project.relpath is None:
raise ManifestParseError(
'duplicate project %s in %s' %
'missing path for %s in %s' %
(project.name, self.manifestFile))
self._projects[project.name] = project
if project.relpath in self._paths:
raise ManifestParseError(
'duplicate path %s in %s' %
(project.relpath, self.manifestFile))
self._paths[project.relpath] = project
projects.append(project)
for subproject in project.subprojects:
recursively_add_projects(subproject)
@ -473,6 +536,23 @@ class XmlManifest(object):
if node.nodeName == 'project':
project = self._ParseProject(node)
recursively_add_projects(project)
if node.nodeName == 'extend-project':
name = self._reqatt(node, 'name')
if name not in self._projects:
raise ManifestParseError('extend-project element specifies non-existent '
'project: %s' % name)
path = node.getAttribute('path')
groups = node.getAttribute('groups')
if groups:
groups = self._ParseGroups(groups)
for p in self._projects[name]:
if path and p.relpath != path:
continue
if groups:
p.groups.extend(groups)
if node.nodeName == 'repo-hooks':
# Get the name of the project and the (space-separated) list of enabled.
repo_hooks_project = self._reqatt(node, 'in-project')
@ -486,22 +566,31 @@ class XmlManifest(object):
# Store a reference to the Project.
try:
self._repo_hooks_project = self._projects[repo_hooks_project]
repo_hooks_projects = self._projects[repo_hooks_project]
except KeyError:
raise ManifestParseError(
'project %s not found for repo-hooks' %
(repo_hooks_project))
if len(repo_hooks_projects) != 1:
raise ManifestParseError(
'internal error parsing repo-hooks in %s' %
(self.manifestFile))
self._repo_hooks_project = repo_hooks_projects[0]
# Store the enabled hooks in the Project object.
self._repo_hooks_project.enabled_repo_hooks = enabled_repo_hooks
if node.nodeName == 'remove-project':
name = self._reqatt(node, 'name')
try:
del self._projects[name]
except KeyError:
if name not in self._projects:
raise ManifestParseError('remove-project element specifies non-existent '
'project: %s' % name)
for p in self._projects[name]:
del self._paths[p.relpath]
del self._projects[name]
# If the manifest removes the hooks project, treat it as if it deleted
# the repo-hooks element too.
if self._repo_hooks_project and (self._repo_hooks_project.name == name):
@ -538,11 +627,13 @@ class XmlManifest(object):
name = name,
remote = remote.ToRemoteSpec(name),
gitdir = gitdir,
objdir = gitdir,
worktree = None,
relpath = None,
relpath = name or None,
revisionExpr = m.revisionExpr,
revisionId = None)
self._projects[project.name] = project
self._projects[project.name] = [project]
self._paths[project.relpath] = project
def _ParseRemote(self, node):
"""
@ -553,11 +644,17 @@ class XmlManifest(object):
if alias == '':
alias = None
fetch = self._reqatt(node, 'fetch')
pushUrl = node.getAttribute('pushurl')
if pushUrl == '':
pushUrl = None
review = node.getAttribute('review')
if review == '':
review = None
revision = node.getAttribute('revision')
if revision == '':
revision = None
manifestUrl = self.manifestProject.config.GetString('remote.origin.url')
return _XmlRemote(name, alias, fetch, manifestUrl, review)
return _XmlRemote(name, alias, fetch, pushUrl, manifestUrl, review, revision)
def _ParseDefault(self, node):
"""
@ -635,7 +732,7 @@ class XmlManifest(object):
def _UnjoinName(self, parent_name, name):
return os.path.relpath(name, parent_name)
def _ParseProject(self, node, parent = None):
def _ParseProject(self, node, parent = None, **extra_proj_attrs):
"""
reads a <project> element from the manifest file
"""
@ -650,7 +747,7 @@ class XmlManifest(object):
raise ManifestParseError("no remote for project %s within %s" %
(name, self.manifestFile))
revisionExpr = node.getAttribute('revision')
revisionExpr = node.getAttribute('revision') or remote.revision
if not revisionExpr:
revisionExpr = self._default.revisionExpr
if not revisionExpr:
@ -699,12 +796,13 @@ class XmlManifest(object):
groups = ''
if node.hasAttribute('groups'):
groups = node.getAttribute('groups')
groups = [x for x in re.split(r'[,\s]+', groups) if x]
groups = self._ParseGroups(groups)
if parent is None:
relpath, worktree, gitdir = self.GetProjectPaths(name, path)
relpath, worktree, gitdir, objdir = self.GetProjectPaths(name, path)
else:
relpath, worktree, gitdir = self.GetSubprojectPaths(parent, path)
relpath, worktree, gitdir, objdir = \
self.GetSubprojectPaths(parent, name, path)
default_groups = ['all', 'name:%s' % name, 'path:%s' % relpath]
groups.extend(set(default_groups).difference(groups))
@ -717,6 +815,7 @@ class XmlManifest(object):
name = name,
remote = remote.ToRemoteSpec(name),
gitdir = gitdir,
objdir = objdir,
worktree = worktree,
relpath = relpath,
revisionExpr = revisionExpr,
@ -728,11 +827,14 @@ class XmlManifest(object):
clone_depth = clone_depth,
upstream = upstream,
parent = parent,
dest_branch = dest_branch)
dest_branch = dest_branch,
**extra_proj_attrs)
for n in node.childNodes:
if n.nodeName == 'copyfile':
self._ParseCopyFile(project, n)
if n.nodeName == 'linkfile':
self._ParseLinkFile(project, n)
if n.nodeName == 'annotation':
self._ParseAnnotation(project, n)
if n.nodeName == 'project':
@ -745,10 +847,15 @@ class XmlManifest(object):
if self.IsMirror:
worktree = None
gitdir = os.path.join(self.topdir, '%s.git' % name)
objdir = gitdir
else:
worktree = os.path.join(self.topdir, path).replace('\\', '/')
gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path)
return relpath, worktree, gitdir
objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name)
return relpath, worktree, gitdir, objdir
def GetProjectsWithName(self, name):
return self._projects.get(name, [])
def GetSubprojectName(self, parent, submodule_path):
return os.path.join(parent.name, submodule_path)
@ -759,14 +866,15 @@ class XmlManifest(object):
def _UnjoinRelpath(self, parent_relpath, relpath):
return os.path.relpath(relpath, parent_relpath)
def GetSubprojectPaths(self, parent, path):
def GetSubprojectPaths(self, parent, name, path):
relpath = self._JoinRelpath(parent.relpath, path)
gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path)
objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name)
if self.IsMirror:
worktree = None
else:
worktree = os.path.join(parent.worktree, path).replace('\\', '/')
return relpath, worktree, gitdir
return relpath, worktree, gitdir, objdir
def _ParseCopyFile(self, project, node):
src = self._reqatt(node, 'src')
@ -776,6 +884,14 @@ class XmlManifest(object):
# dest is relative to the top of the tree
project.AddCopyFile(src, dest, os.path.join(self.topdir, dest))
def _ParseLinkFile(self, project, node):
src = self._reqatt(node, 'src')
dest = self._reqatt(node, 'dest')
if not self.IsMirror:
# src is project relative;
# dest is relative to the top of the tree
project.AddLinkFile(src, dest, os.path.join(self.topdir, dest))
def _ParseAnnotation(self, project, node):
name = self._reqatt(node, 'name')
value = self._reqatt(node, 'value')
@ -808,3 +924,61 @@ class XmlManifest(object):
raise ManifestParseError("no %s in <%s> within %s" %
(attname, node.nodeName, self.manifestFile))
return v
def projectsDiff(self, manifest):
"""return the projects differences between two manifests.
The diff will be from self to given manifest.
"""
fromProjects = self.paths
toProjects = manifest.paths
fromKeys = sorted(fromProjects.keys())
toKeys = sorted(toProjects.keys())
diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []}
for proj in fromKeys:
if not proj in toKeys:
diff['removed'].append(fromProjects[proj])
else:
fromProj = fromProjects[proj]
toProj = toProjects[proj]
try:
fromRevId = fromProj.GetCommitRevisionId()
toRevId = toProj.GetCommitRevisionId()
except ManifestInvalidRevisionError:
diff['unreachable'].append((fromProj, toProj))
else:
if fromRevId != toRevId:
diff['changed'].append((fromProj, toProj))
toKeys.remove(proj)
for proj in toKeys:
diff['added'].append(toProjects[proj])
return diff
class GitcManifest(XmlManifest):
def __init__(self, repodir, gitc_client_name):
"""Initialize the GitcManifest object."""
super(GitcManifest, self).__init__(repodir)
self.isGitcClient = True
self.gitc_client_name = gitc_client_name
self.gitc_client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client_name)
self.manifestFile = os.path.join(self.gitc_client_dir, '.manifest')
def _ParseProject(self, node, parent = None):
"""Override _ParseProject and add support for GITC specific attributes."""
return super(GitcManifest, self)._ParseProject(
node, parent=parent, old_revision=node.getAttribute('old-revision'))
def _output_manifest_project_extras(self, p, e):
"""Output GITC Specific Project attributes"""
if p.old_revision:
e.setAttribute('old-revision', str(p.old_revision))
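The new projectsDiff() helper above returns a dict with 'added', 'removed', 'changed' and 'unreachable' lists; the subcmds/diffmanifests.py command added later in this diff is its main consumer. A hedged usage sketch -- the paths and manifest file names are illustrative, and it assumes repo's own modules are importable:

from manifest_xml import XmlManifest

# Load two manifests against the same workspace; the '.repo' path is illustrative.
m_old = XmlManifest('/path/to/workspace/.repo')
m_old.Override('old.xml')      # resolved relative to .repo/manifests
m_new = XmlManifest('/path/to/workspace/.repo')
m_new.Override('new.xml')

diff = m_old.projectsDiff(m_new)
for p in diff['added']:
  print('A %s %s' % (p.relpath, p.revisionExpr))
for p in diff['removed']:
  print('R %s %s' % (p.relpath, p.revisionExpr))
for old, new in diff['changed']:
  print('C %s %s -> %s' % (old.relpath, old.revisionExpr, new.revisionExpr))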

project.py (1334 lines changed)
File diff suppressed because it is too large.

repo (286 lines changed)

@ -1,8 +1,11 @@
#!/usr/bin/env python
## repo default configuration
##
REPO_URL = 'https://gerrit.googlesource.com/git-repo'
# repo default configuration
#
import os
REPO_URL = os.environ.get('REPO_URL', None)
if not REPO_URL:
REPO_URL = 'https://gerrit.googlesource.com/git-repo'
REPO_REV = 'stable'
# Copyright (C) 2008 Google Inc.
@ -20,10 +23,13 @@ REPO_REV = 'stable'
# limitations under the License.
# increment this whenever we make important changes to this script
VERSION = (1, 20)
VERSION = (1, 23)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1, 2)
# Each individual key entry is created by using:
# gpg --armor --export keyid
MAINTAINER_KEYS = """
Repo Maintainer <repo@android.kernel.org>
@ -101,18 +107,21 @@ JuinEP+AwLAUZ1Bsx9ISC0Agpk2VeHXPL3FGhroEmoMvBzO0kTFGyoeT7PR/BfKv
-----END PGP PUBLIC KEY BLOCK-----
"""
GIT = 'git' # our git command
MIN_GIT_VERSION = (1, 7, 2) # minimum supported git version
repodir = '.repo' # name of repo's private directory
S_repo = 'repo' # special repo repository
S_manifests = 'manifests' # special manifest repository
REPO_MAIN = S_repo + '/main.py' # main script
MIN_PYTHON_VERSION = (2, 6) # minimum supported python version
GIT = 'git' # our git command
MIN_GIT_VERSION = (1, 7, 2) # minimum supported git version
repodir = '.repo' # name of repo's private directory
S_repo = 'repo' # special repo repository
S_manifests = 'manifests' # special manifest repository
REPO_MAIN = S_repo + '/main.py' # main script
MIN_PYTHON_VERSION = (2, 6) # minimum supported python version
GITC_CONFIG_FILE = '/gitc/.config'
GITC_FS_ROOT_DIR = '/gitc/manifest-rw/'
import errno
import optparse
import os
import re
import shutil
import stat
import subprocess
import sys
@ -137,11 +146,6 @@ def _print(*objects, **kwargs):
# Python version check
ver = sys.version_info
if ver[0] == 3:
_print('error: Python 3 support is not fully implemented in repo yet.\n'
'Please use Python 2.6 - 2.7 instead.',
file=sys.stderr)
sys.exit(1)
if (ver[0], ver[1]) < MIN_PYTHON_VERSION:
_print('error: Python version %s unsupported.\n'
'Please use Python 2.6 - 2.7 instead.'
@ -181,6 +185,10 @@ group.add_option('--reference',
group.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
group.add_option('--archive',
dest='archive', action='store_true',
help='checkout an archive instead of a git repository for '
'each project. See git archive.')
group.add_option('-g', '--groups',
dest='groups', default='default',
help='restrict manifest projects to ones with specified '
@ -191,6 +199,9 @@ group.add_option('-p', '--platform',
help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM')
group.add_option('--no-clone-bundle',
dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS')
# Tool
@ -211,14 +222,69 @@ group.add_option('--config-name',
dest='config_name', action="store_true", default=False,
help='Always prompt for name/e-mail')
def _GitcInitOptions(init_optparse_arg):
init_optparse_arg.set_usage("repo gitc-init -u url -c client [options]")
g = init_optparse_arg.add_option_group('GITC options')
g.add_option('-f', '--manifest-file',
dest='manifest_file',
help='Optional manifest file to use for this GITC client.')
g.add_option('-c', '--gitc-client',
dest='gitc_client',
help='The name of the gitc_client instance to create or modify.')
_gitc_manifest_dir = None
def get_gitc_manifest_dir():
global _gitc_manifest_dir
if _gitc_manifest_dir is None:
_gitc_manifest_dir = ''
try:
with open(GITC_CONFIG_FILE, 'r') as gitc_config:
for line in gitc_config:
match = re.match('gitc_dir=(?P<gitc_manifest_dir>.*)', line)
if match:
_gitc_manifest_dir = match.group('gitc_manifest_dir')
except IOError:
pass
return _gitc_manifest_dir
def gitc_parse_clientdir(gitc_fs_path):
"""Parse a path in the GITC FS and return its client name.
@param gitc_fs_path: A subdirectory path within the GITC_FS_ROOT_DIR.
@returns: The GITC client name
"""
if gitc_fs_path == GITC_FS_ROOT_DIR:
return None
if not gitc_fs_path.startswith(GITC_FS_ROOT_DIR):
manifest_dir = get_gitc_manifest_dir()
if manifest_dir == '':
return None
if manifest_dir[-1] != '/':
manifest_dir += '/'
if gitc_fs_path == manifest_dir:
return None
if not gitc_fs_path.startswith(manifest_dir):
return None
return gitc_fs_path.split(manifest_dir)[1].split('/')[0]
return gitc_fs_path.split(GITC_FS_ROOT_DIR)[1].split('/')[0]
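A standalone sketch of the client-name extraction above, with the constant inlined and the get_gitc_manifest_dir() fallback branch omitted (paths are illustrative):

GITC_FS_ROOT_DIR = '/gitc/manifest-rw/'

def parse_clientdir(path):
  # Return the first path component under the GITC FS root, else None.
  if path == GITC_FS_ROOT_DIR or not path.startswith(GITC_FS_ROOT_DIR):
    return None
  return path.split(GITC_FS_ROOT_DIR)[1].split('/')[0]

assert parse_clientdir('/gitc/manifest-rw/my_client/platform/build') == 'my_client'
assert parse_clientdir('/gitc/manifest-rw/') is None
assert parse_clientdir('/home/user/src') is None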
class CloneFailure(Exception):
"""Indicate the remote clone of repo itself failed.
"""
def _Init(args):
def _Init(args, gitc_init=False):
"""Installs repo by cloning it over the network.
"""
if gitc_init:
_GitcInitOptions(init_optparse)
opt, args = init_optparse.parse_args(args)
if args:
init_optparse.print_usage()
@ -240,10 +306,30 @@ def _Init(args):
_print("fatal: invalid branch name '%s'" % branch, file=sys.stderr)
raise CloneFailure()
if not os.path.isdir(repodir):
try:
os.mkdir(repodir)
except OSError as e:
try:
if gitc_init:
gitc_manifest_dir = get_gitc_manifest_dir()
if not gitc_manifest_dir:
_print('fatal: GITC filesystem is not available. Exiting...',
file=sys.stderr)
sys.exit(1)
gitc_client = opt.gitc_client
if not gitc_client:
gitc_client = gitc_parse_clientdir(os.getcwd())
if not gitc_client:
_print('fatal: GITC client (-c) is required.', file=sys.stderr)
sys.exit(1)
client_dir = os.path.join(gitc_manifest_dir, gitc_client)
if not os.path.exists(client_dir):
os.makedirs(client_dir)
os.chdir(client_dir)
if os.path.exists(repodir):
# This GITC Client has already initialized repo so continue.
return
os.mkdir(repodir)
except OSError as e:
if e.errno != errno.EEXIST:
_print('fatal: cannot make %s directory: %s'
% (repodir, e.strerror), file=sys.stderr)
# Don't raise CloneFailure; that would delete the
@ -259,7 +345,7 @@ def _Init(args):
can_verify = True
dst = os.path.abspath(os.path.join(repodir, S_repo))
_Clone(url, dst, opt.quiet)
_Clone(url, dst, opt.quiet, not opt.no_clone_bundle)
if can_verify and not opt.no_repo_verify:
rev = _Verify(dst, branch, opt.quiet)
@ -274,6 +360,20 @@ def _Init(args):
raise
def ParseGitVersion(ver_str):
if not ver_str.startswith('git version '):
return None
num_ver_str = ver_str[len('git version '):].strip().split('-')[0]
to_tuple = []
for num_str in num_ver_str.split('.')[:3]:
if num_str.isdigit():
to_tuple.append(int(num_str))
else:
to_tuple.append(0)
return tuple(to_tuple)
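For reference, a few illustrative inputs and the tuples the new ParseGitVersion() helper returns (this snippet is not part of the wrapper script):

examples = [
  ('git version 2.7.4', (2, 7, 4)),
  ('git version 1.9.rc1.236.g0aefa2e', (1, 9, 0)),      # non-numeric parts become 0
  ('git version 2.7.4.1234.g12ab34c-dirty', (2, 7, 4)),  # '-suffix' is stripped
  ('2.7.4', None),                                       # missing 'git version ' prefix
]
for ver_str, expected in examples:
  assert ParseGitVersion(ver_str) == expected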
def _CheckGitVersion():
cmd = [GIT, '--version']
try:
@ -291,12 +391,11 @@ def _CheckGitVersion():
proc.stdout.close()
proc.wait()
if not ver_str.startswith('git version '):
ver_act = ParseGitVersion(ver_str)
if ver_act is None:
_print('error: "%s" unsupported' % ver_str, file=sys.stderr)
raise CloneFailure()
ver_str = ver_str[len('git version '):].strip()
ver_act = tuple(map(int, ver_str.split('.')[0:3]))
if ver_act < MIN_GIT_VERSION:
need = '.'.join(map(str, MIN_GIT_VERSION))
_print('fatal: git %s or later required' % need, file=sys.stderr)
@ -322,30 +421,33 @@ def NeedSetupGnuPG():
def SetupGnuPG(quiet):
if not os.path.isdir(home_dot_repo):
try:
os.mkdir(home_dot_repo)
except OSError as e:
try:
os.mkdir(home_dot_repo)
except OSError as e:
if e.errno != errno.EEXIST:
_print('fatal: cannot make %s directory: %s'
% (home_dot_repo, e.strerror), file=sys.stderr)
sys.exit(1)
if not os.path.isdir(gpg_dir):
try:
os.mkdir(gpg_dir, stat.S_IRWXU)
except OSError as e:
try:
os.mkdir(gpg_dir, stat.S_IRWXU)
except OSError as e:
if e.errno != errno.EEXIST:
_print('fatal: cannot make %s directory: %s' % (gpg_dir, e.strerror),
file=sys.stderr)
sys.exit(1)
env = os.environ.copy()
env['GNUPGHOME'] = gpg_dir.encode()
try:
env['GNUPGHOME'] = gpg_dir
except UnicodeEncodeError:
env['GNUPGHOME'] = gpg_dir.encode()
cmd = ['gpg', '--import']
try:
proc = subprocess.Popen(cmd,
env = env,
stdin = subprocess.PIPE)
env=env,
stdin=subprocess.PIPE)
except OSError as e:
if not quiet:
_print('warning: gpg (GnuPG) is not available.', file=sys.stderr)
@ -371,7 +473,7 @@ def _SetConfig(local, name, value):
"""Set a git configuration option to the specified value.
"""
cmd = [GIT, 'config', name, value]
if subprocess.Popen(cmd, cwd = local).wait() != 0:
if subprocess.Popen(cmd, cwd=local).wait() != 0:
raise CloneFailure()
@ -384,9 +486,9 @@ def _InitHttp():
n = netrc.netrc()
for host in n.hosts:
p = n.hosts[host]
mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2])
mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2])
mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2])
except:
except: # pylint: disable=bare-except
pass
handlers.append(urllib.request.HTTPBasicAuthHandler(mgr))
handlers.append(urllib.request.HTTPDigestAuthHandler(mgr))
@ -399,6 +501,7 @@ def _InitHttp():
handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
urllib.request.install_opener(urllib.request.build_opener(*handlers))
def _Fetch(url, local, src, quiet):
if not quiet:
_print('Get %s' % url, file=sys.stderr)
@ -413,22 +516,23 @@ def _Fetch(url, local, src, quiet):
cmd.append('+refs/heads/*:refs/remotes/origin/*')
cmd.append('refs/tags/*:refs/tags/*')
proc = subprocess.Popen(cmd, cwd = local, stderr = err)
proc = subprocess.Popen(cmd, cwd=local, stderr=err)
if err:
proc.stderr.read()
proc.stderr.close()
if proc.wait() != 0:
raise CloneFailure()
def _DownloadBundle(url, local, quiet):
if not url.endswith('/'):
url += '/'
url += 'clone.bundle'
proc = subprocess.Popen(
[GIT, 'config', '--get-regexp', 'url.*.insteadof'],
cwd = local,
stdout = subprocess.PIPE)
[GIT, 'config', '--get-regexp', 'url.*.insteadof'],
cwd=local,
stdout=subprocess.PIPE)
for line in proc.stdout:
m = re.compile(r'^url\.(.*)\.insteadof (.*)$').match(line)
if m:
@ -448,7 +552,7 @@ def _DownloadBundle(url, local, quiet):
try:
r = urllib.request.urlopen(url)
except urllib.error.HTTPError as e:
if e.code in [403, 404]:
if e.code in [401, 403, 404, 501]:
return False
_print('fatal: Cannot get %s' % url, file=sys.stderr)
_print('fatal: HTTP error %s' % e.code, file=sys.stderr)
@ -470,6 +574,7 @@ def _DownloadBundle(url, local, quiet):
finally:
dest.close()
def _ImportBundle(local):
path = os.path.join(local, '.git', 'clone.bundle')
try:
@ -477,7 +582,8 @@ def _ImportBundle(local):
finally:
os.remove(path)
def _Clone(url, local, quiet):
def _Clone(url, local, quiet, clone_bundle):
"""Clones a git repository to a new subdirectory of repodir
"""
try:
@ -489,14 +595,14 @@ def _Clone(url, local, quiet):
cmd = [GIT, 'init', '--quiet']
try:
proc = subprocess.Popen(cmd, cwd = local)
proc = subprocess.Popen(cmd, cwd=local)
except OSError as e:
_print(file=sys.stderr)
_print("fatal: '%s' is not available" % GIT, file=sys.stderr)
_print('fatal: %s' % e, file=sys.stderr)
_print(file=sys.stderr)
_print('Please make sure %s is installed and in your path.' % GIT,
file=sys.stderr)
file=sys.stderr)
raise CloneFailure()
if proc.wait() != 0:
_print('fatal: could not create %s' % local, file=sys.stderr)
@ -504,12 +610,12 @@ def _Clone(url, local, quiet):
_InitHttp()
_SetConfig(local, 'remote.origin.url', url)
_SetConfig(local, 'remote.origin.fetch',
'+refs/heads/*:refs/remotes/origin/*')
if _DownloadBundle(url, local, quiet):
_SetConfig(local,
'remote.origin.fetch',
'+refs/heads/*:refs/remotes/origin/*')
if clone_bundle and _DownloadBundle(url, local, quiet):
_ImportBundle(local)
else:
_Fetch(url, local, 'origin', quiet)
_Fetch(url, local, 'origin', quiet)
def _Verify(cwd, branch, quiet):
@ -519,7 +625,7 @@ def _Verify(cwd, branch, quiet):
proc = subprocess.Popen(cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
cwd = cwd)
cwd=cwd)
cur = proc.stdout.read().strip()
proc.stdout.close()
@ -537,18 +643,21 @@ def _Verify(cwd, branch, quiet):
if not quiet:
_print(file=sys.stderr)
_print("info: Ignoring branch '%s'; using tagged release '%s'"
% (branch, cur), file=sys.stderr)
% (branch, cur), file=sys.stderr)
_print(file=sys.stderr)
env = os.environ.copy()
env['GNUPGHOME'] = gpg_dir.encode()
try:
env['GNUPGHOME'] = gpg_dir
except UnicodeEncodeError:
env['GNUPGHOME'] = gpg_dir.encode()
cmd = [GIT, 'tag', '-v', cur]
proc = subprocess.Popen(cmd,
stdout = subprocess.PIPE,
stderr = subprocess.PIPE,
cwd = cwd,
env = env)
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
cwd=cwd,
env=env)
out = proc.stdout.read()
proc.stdout.close()
@ -568,21 +677,21 @@ def _Checkout(cwd, branch, rev, quiet):
"""Checkout an upstream branch into the repository and track it.
"""
cmd = [GIT, 'update-ref', 'refs/heads/default', rev]
if subprocess.Popen(cmd, cwd = cwd).wait() != 0:
if subprocess.Popen(cmd, cwd=cwd).wait() != 0:
raise CloneFailure()
_SetConfig(cwd, 'branch.default.remote', 'origin')
_SetConfig(cwd, 'branch.default.merge', 'refs/heads/%s' % branch)
cmd = [GIT, 'symbolic-ref', 'HEAD', 'refs/heads/default']
if subprocess.Popen(cmd, cwd = cwd).wait() != 0:
if subprocess.Popen(cmd, cwd=cwd).wait() != 0:
raise CloneFailure()
cmd = [GIT, 'read-tree', '--reset', '-u']
if not quiet:
cmd.append('-v')
cmd.append('HEAD')
if subprocess.Popen(cmd, cwd = cwd).wait() != 0:
if subprocess.Popen(cmd, cwd=cwd).wait() != 0:
raise CloneFailure()
@ -594,8 +703,8 @@ def _FindRepo():
olddir = None
while curdir != '/' \
and curdir != olddir \
and not repo:
and curdir != olddir \
and not repo:
repo = os.path.join(curdir, repodir, REPO_MAIN)
if not os.path.isfile(repo):
repo = None
@ -604,7 +713,7 @@ def _FindRepo():
return (repo, os.path.join(curdir, repodir))
class _Options:
class _Options(object):
help = False
@ -626,15 +735,20 @@ def _ParseArguments(args):
def _Usage():
gitc_usage = ""
if get_gitc_manifest_dir():
gitc_usage = " gitc-init Initialize a GITC Client.\n"
_print(
"""usage: repo COMMAND [ARGS]
"""usage: repo COMMAND [ARGS]
repo is not yet installed. Use "repo init" to install it here.
The most commonly used repo commands are:
init Install repo in the current working directory
help Display detailed help on a command
""" + gitc_usage +
""" help Display detailed help on a command
For access to the full online help, install repo ("repo init").
""", file=sys.stderr)
@ -646,6 +760,10 @@ def _Help(args):
if args[0] == 'init':
init_optparse.print_help()
sys.exit(0)
elif args[0] == 'gitc-init':
_GitcInitOptions(init_optparse)
init_optparse.print_help()
sys.exit(0)
else:
_print("error: '%s' is not a bootstrap command.\n"
' For access to online help, install repo ("repo init").'
@ -691,8 +809,8 @@ def _SetDefaultsTo(gitdir):
'--git-dir=%s' % gitdir,
'symbolic-ref',
'HEAD'],
stdout = subprocess.PIPE,
stderr = subprocess.PIPE)
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
REPO_REV = proc.stdout.read().strip()
proc.stdout.close()
@ -705,12 +823,23 @@ def _SetDefaultsTo(gitdir):
def main(orig_args):
repo_main, rel_repo_dir = _FindRepo()
cmd, opt, args = _ParseArguments(orig_args)
repo_main, rel_repo_dir = None, None
# Don't use the local repo copy, make sure to switch to the gitc client first.
if cmd != 'gitc-init':
repo_main, rel_repo_dir = _FindRepo()
wrapper_path = os.path.abspath(__file__)
my_main, my_git = _RunSelf(wrapper_path)
cwd = os.getcwd()
if get_gitc_manifest_dir() and cwd.startswith(get_gitc_manifest_dir()):
_print('error: repo cannot be used in the GITC local manifest directory.'
'\nIf you want to work on this GITC client please rerun this '
'command from the corresponding client under /gitc/',
file=sys.stderr)
sys.exit(1)
if not repo_main:
if opt.help:
_Usage()
@ -718,18 +847,13 @@ def main(orig_args):
_Help(args)
if not cmd:
_NotInstalled()
if cmd == 'init':
if cmd == 'init' or cmd == 'gitc-init':
if my_git:
_SetDefaultsTo(my_git)
try:
_Init(args)
_Init(args, gitc_init=(cmd == 'gitc-init'))
except CloneFailure:
for root, dirs, files in os.walk(repodir, topdown=False):
for name in files:
os.remove(os.path.join(root, name))
for name in dirs:
os.rmdir(os.path.join(root, name))
os.rmdir(repodir)
shutil.rmtree(os.path.join(repodir, S_repo), ignore_errors=True)
sys.exit(1)
repo_main, rel_repo_dir = _FindRepo()
else:
@ -755,4 +879,8 @@ def main(orig_args):
if __name__ == '__main__':
if ver[0] == 3:
_print('warning: Python 3 support is currently experimental. YMMV.\n'
'Please use Python 2.6 - 2.7 instead.',
file=sys.stderr)
main(sys.argv[1:])

subcmds/branches.py

@ -46,6 +46,10 @@ class BranchInfo(object):
def IsCurrent(self):
return self.current > 0
@property
def IsSplitCurrent(self):
return self.current != 0 and self.current != len(self.projects)
@property
def IsPublished(self):
return self.published > 0
@ -139,10 +143,14 @@ is shown, then the branch appears in all projects.
if in_cnt < project_cnt:
fmt = out.write
paths = []
if in_cnt < project_cnt - in_cnt:
non_cur_paths = []
if i.IsSplitCurrent or (in_cnt < project_cnt - in_cnt):
in_type = 'in'
for b in i.projects:
paths.append(b.project.relpath)
if not i.IsSplitCurrent or b.current:
paths.append(b.project.relpath)
else:
non_cur_paths.append(b.project.relpath)
else:
fmt = out.notinproject
in_type = 'not in'
@ -154,13 +162,19 @@ is shown, then the branch appears in all projects.
paths.append(p.relpath)
s = ' %s %s' % (in_type, ', '.join(paths))
if width + 7 + len(s) < 80:
if not i.IsSplitCurrent and (width + 7 + len(s) < 80):
fmt = out.current if i.IsCurrent else fmt
fmt(s)
else:
fmt(' %s:' % in_type)
fmt = out.current if i.IsCurrent else out.write
for p in paths:
out.nl()
fmt(width*' ' + ' %s' % p)
fmt = out.write
for p in non_cur_paths:
out.nl()
fmt(width*' ' + ' %s' % p)
else:
out.write(' in all projects')
out.nl()

subcmds/cherry_pick.py

@ -76,6 +76,7 @@ change id will be added.
capture_stdout = True,
capture_stderr = True)
p.stdin.write(new_msg)
p.stdin.close()
if p.Wait() != 0:
print("error: Failed to update commit message", file=sys.stderr)
sys.exit(1)

subcmds/diffmanifests.py (new file, 204 lines)

@ -0,0 +1,204 @@
#
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from color import Coloring
from command import PagedCommand
from manifest_xml import XmlManifest
class _Coloring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, "status")
class Diffmanifests(PagedCommand):
""" A command to see logs in projects represented by manifests
This is used to see deeper differences between manifests. Where a simple
diff would only show a diff of sha1s, for example, this command displays
the logs of the project between both sha1s, allowing the user to inspect
the differences at a deeper level.
"""
common = True
helpSummary = "Manifest diff utility"
helpUsage = """%prog manifest1.xml [manifest2.xml] [options]"""
helpDescription = """
The %prog command shows differences between project revisions of manifest1 and
manifest2. If manifest2 is not specified, the current manifest.xml will be used
instead. Both absolute and relative paths may be used for manifests. Relative
paths start from project's ".repo/manifests" folder.
The --raw option displays the diff in a way that facilitates parsing: the
project pattern will be <status> <path> <revision from> [<revision to>] and the
commit pattern will be <status> <onelined log>, with the following status values:
A = Added project
R = Removed project
C = Changed project
U = Project with unreachable revision(s) (revision(s) not found)
for a project, and
A = Added commit
R = Removed commit
for a commit.
Only changed projects may contain commits; a commit status always starts with
a space and belongs to the last printed project.
Unreachable revisions may occur if a project is not up to date or if repo has not
been initialized with all the groups, in which case some projects won't be
synced and their revisions won't be found.
"""
def _Options(self, p):
p.add_option('--raw',
dest='raw', action='store_true',
help='Display raw diff.')
p.add_option('--no-color',
dest='color', action='store_false', default=True,
help='does not display the diff in color.')
p.add_option('--pretty-format',
dest='pretty_format', action='store',
metavar='<FORMAT>',
help='print the log using a custom git pretty format string')
def _printRawDiff(self, diff):
for project in diff['added']:
self.printText("A %s %s" % (project.relpath, project.revisionExpr))
self.out.nl()
for project in diff['removed']:
self.printText("R %s %s" % (project.relpath, project.revisionExpr))
self.out.nl()
for project, otherProject in diff['changed']:
self.printText("C %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr))
self.out.nl()
self._printLogs(project, otherProject, raw=True, color=False)
for project, otherProject in diff['unreachable']:
self.printText("U %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr))
self.out.nl()
def _printDiff(self, diff, color=True, pretty_format=None):
if diff['added']:
self.out.nl()
self.printText('added projects : \n')
self.out.nl()
for project in diff['added']:
self.printProject('\t%s' % (project.relpath))
self.printText(' at revision ')
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['removed']:
self.out.nl()
self.printText('removed projects : \n')
self.out.nl()
for project in diff['removed']:
self.printProject('\t%s' % (project.relpath))
self.printText(' at revision ')
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['changed']:
self.out.nl()
self.printText('changed projects : \n')
self.out.nl()
for project, otherProject in diff['changed']:
self.printProject('\t%s' % (project.relpath))
self.printText(' changed from ')
self.printRevision(project.revisionExpr)
self.printText(' to ')
self.printRevision(otherProject.revisionExpr)
self.out.nl()
self._printLogs(project, otherProject, raw=False, color=color,
pretty_format=pretty_format)
self.out.nl()
if diff['unreachable']:
self.out.nl()
self.printText('projects with unreachable revisions : \n')
self.out.nl()
for project, otherProject in diff['unreachable']:
self.printProject('\t%s ' % (project.relpath))
self.printRevision(project.revisionExpr)
self.printText(' or ')
self.printRevision(otherProject.revisionExpr)
self.printText(' not found')
self.out.nl()
def _printLogs(self, project, otherProject, raw=False, color=True,
pretty_format=None):
logs = project.getAddedAndRemovedLogs(otherProject,
oneline=(pretty_format is None),
color=color,
pretty_format=pretty_format)
if logs['removed']:
removedLogs = logs['removed'].split('\n')
for log in removedLogs:
if log.strip():
if raw:
self.printText(' R ' + log)
self.out.nl()
else:
self.printRemoved('\t\t[-] ')
self.printText(log)
self.out.nl()
if logs['added']:
addedLogs = logs['added'].split('\n')
for log in addedLogs:
if log.strip():
if raw:
self.printText(' A ' + log)
self.out.nl()
else:
self.printAdded('\t\t[+] ')
self.printText(log)
self.out.nl()
def Execute(self, opt, args):
if not args or len(args) > 2:
self.Usage()
self.out = _Coloring(self.manifest.globalConfig)
self.printText = self.out.nofmt_printer('text')
if opt.color:
self.printProject = self.out.nofmt_printer('project', attr = 'bold')
self.printAdded = self.out.nofmt_printer('green', fg = 'green', attr = 'bold')
self.printRemoved = self.out.nofmt_printer('red', fg = 'red', attr = 'bold')
self.printRevision = self.out.nofmt_printer('revision', fg = 'yellow')
else:
self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText
manifest1 = XmlManifest(self.manifest.repodir)
manifest1.Override(args[0])
if len(args) == 1:
manifest2 = self.manifest
else:
manifest2 = XmlManifest(self.manifest.repodir)
manifest2.Override(args[1])
diff = manifest1.projectsDiff(manifest2)
if opt.raw:
self._printRawDiff(diff)
else:
self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format)

subcmds/download.py

@ -18,6 +18,7 @@ import re
import sys
from command import Command
from error import GitError
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
@ -87,7 +88,13 @@ makes it available in your project's local working directory.
for c in dl.commits:
print(' %s' % (c), file=sys.stderr)
if opt.cherrypick:
project._CherryPick(dl.commit)
try:
project._CherryPick(dl.commit)
except GitError:
print('[%s] Could not complete the cherry-pick of %s' \
% (project.name, dl.commit), file=sys.stderr)
sys.exit(1)
elif opt.revert:
project._Revert(dl.commit)
elif opt.ffonly:

subcmds/forall.py

@ -14,10 +14,13 @@
# limitations under the License.
from __future__ import print_function
import errno
import fcntl
import multiprocessing
import re
import os
import select
import signal
import sys
import subprocess
@ -31,6 +34,7 @@ _CAN_COLOR = [
'log',
]
class ForallColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'forall')
@ -87,6 +91,12 @@ revision to a locally executed git command, use REPO_LREV.
REPO_RREV is the name of the revision from the manifest, exactly
as written in the manifest.
REPO_COUNT is the total number of projects being iterated.
REPO_I is the current (1-based) iteration count. Can be used in
conjunction with REPO_COUNT to add a simple progress indicator to your
command.
REPO__* are any extra environment variables, specified by the
"annotation" element under any project element. This can be useful
for differentiating trees based on user-specific criteria, or simply
@ -110,6 +120,12 @@ without iterating through the remaining projects.
p.add_option('-r', '--regex',
dest='regex', action='store_true',
help="Execute the command only on projects matching regex or wildcard expression")
p.add_option('-i', '--inverse-regex',
dest='inverse_regex', action='store_true',
help="Execute the command only on projects not matching regex or wildcard expression")
p.add_option('-g', '--groups',
dest='groups',
help="Execute the command only on projects matching the specified groups")
p.add_option('-c', '--command',
help='Command (and arguments) to execute',
dest='command',
@ -126,9 +142,35 @@ without iterating through the remaining projects.
g.add_option('-v', '--verbose',
dest='verbose', action='store_true',
help='Show command error messages')
g.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', default=1,
help='number of commands to execute simultaneously')
def WantPager(self, opt):
return opt.project_header
return opt.project_header and opt.jobs == 1
def _SerializeProject(self, project):
""" Serialize a project._GitGetByExec instance.
project._GitGetByExec is not pickle-able. Instead of trying to pass it
around between processes, make a dict ourselves containing only the
attributes that we need.
"""
if not self.manifest.IsMirror:
lrev = project.GetRevisionId()
else:
lrev = None
return {
'name': project.name,
'relpath': project.relpath,
'remote_name': project.remote.name,
'lrev': lrev,
'rrev': project.revisionExpr,
'annotations': dict((a.name, a.value) for a in project.annotations),
'gitdir': project.gitdir,
'worktree': project.worktree,
}
def Execute(self, opt, args):
if not opt.command:
@ -167,123 +209,192 @@ without iterating through the remaining projects.
# pylint: enable=W0631
mirror = self.manifest.IsMirror
out = ForallColoring(self.manifest.manifestProject.config)
out.redirect(sys.stdout)
rc = 0
first = True
if not opt.regex:
projects = self.GetProjects(args)
else:
smart_sync_manifest_name = "smart_sync_override.xml"
smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, smart_sync_manifest_name)
if os.path.isfile(smart_sync_manifest_path):
self.manifest.Override(smart_sync_manifest_path)
if opt.regex:
projects = self.FindProjects(args)
elif opt.inverse_regex:
projects = self.FindProjects(args, inverse=True)
else:
projects = self.GetProjects(args, groups=opt.groups)
for project in projects:
env = os.environ.copy()
def setenv(name, val):
if val is None:
val = ''
env[name] = val.encode()
os.environ['REPO_COUNT'] = str(len(projects))
setenv('REPO_PROJECT', project.name)
setenv('REPO_PATH', project.relpath)
setenv('REPO_REMOTE', project.remote.name)
setenv('REPO_LREV', project.GetRevisionId())
setenv('REPO_RREV', project.revisionExpr)
for a in project.annotations:
setenv("REPO__%s" % (a.name), a.value)
if mirror:
setenv('GIT_DIR', project.gitdir)
cwd = project.gitdir
else:
cwd = project.worktree
if not os.path.exists(cwd):
if (opt.project_header and opt.verbose) \
or not opt.project_header:
print('skipping %s/' % project.relpath, file=sys.stderr)
continue
if opt.project_header:
stdin = subprocess.PIPE
stdout = subprocess.PIPE
stderr = subprocess.PIPE
else:
stdin = None
stdout = None
stderr = None
p = subprocess.Popen(cmd,
cwd = cwd,
shell = shell,
env = env,
stdin = stdin,
stdout = stdout,
stderr = stderr)
if opt.project_header:
class sfd(object):
def __init__(self, fd, dest):
self.fd = fd
self.dest = dest
def fileno(self):
return self.fd.fileno()
empty = True
errbuf = ''
p.stdin.close()
s_in = [sfd(p.stdout, sys.stdout),
sfd(p.stderr, sys.stderr)]
for s in s_in:
flags = fcntl.fcntl(s.fd, fcntl.F_GETFL)
fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
while s_in:
in_ready, _out_ready, _err_ready = select.select(s_in, [], [])
for s in in_ready:
buf = s.fd.read(4096)
if not buf:
s.fd.close()
s_in.remove(s)
continue
if not opt.verbose:
if s.fd != p.stdout:
errbuf += buf
continue
if empty:
if first:
first = False
else:
out.nl()
if mirror:
project_header_path = project.name
else:
project_header_path = project.relpath
out.project('project %s/', project_header_path)
out.nl()
out.flush()
if errbuf:
sys.stderr.write(errbuf)
sys.stderr.flush()
errbuf = ''
empty = False
s.dest.write(buf)
s.dest.flush()
r = p.wait()
if r != 0:
if r != rc:
rc = r
if opt.abort_on_errors:
print("error: %s: Aborting due to previous error" % project.relpath,
file=sys.stderr)
sys.exit(r)
pool = multiprocessing.Pool(opt.jobs, InitWorker)
try:
config = self.manifest.manifestProject.config
results_it = pool.imap(
DoWorkWrapper,
self.ProjectArgs(projects, mirror, opt, cmd, shell, config))
pool.close()
for r in results_it:
rc = rc or r
if r != 0 and opt.abort_on_errors:
raise Exception('Aborting due to previous error')
except (KeyboardInterrupt, WorkerKeyboardInterrupt):
# Catch KeyboardInterrupt raised inside and outside of workers
print('Interrupted - terminating the pool')
pool.terminate()
rc = rc or errno.EINTR
except Exception as e:
# Catch any other exceptions raised
print('Got an error, terminating the pool: %s: %s' %
(type(e).__name__, e),
file=sys.stderr)
pool.terminate()
rc = rc or getattr(e, 'errno', 1)
finally:
pool.join()
if rc != 0:
sys.exit(rc)
def ProjectArgs(self, projects, mirror, opt, cmd, shell, config):
for cnt, p in enumerate(projects):
try:
project = self._SerializeProject(p)
except Exception as e:
print('Project list error on project %s: %s: %s' %
(p.name, type(e).__name__, e),
file=sys.stderr)
return
except KeyboardInterrupt:
print('Project list interrupted',
file=sys.stderr)
return
yield [mirror, opt, cmd, shell, cnt, config, project]
class WorkerKeyboardInterrupt(Exception):
""" Keyboard interrupt exception for worker processes. """
pass
def InitWorker():
signal.signal(signal.SIGINT, signal.SIG_IGN)
def DoWorkWrapper(args):
""" A wrapper around the DoWork() method.
Catch the KeyboardInterrupt exceptions here and re-raise them as a different,
``Exception``-based exception to stop it flooding the console with stacktraces
and making the parent hang indefinitely.
"""
project = args.pop()
try:
return DoWork(project, *args)
except KeyboardInterrupt:
print('%s: Worker interrupted' % project['name'])
raise WorkerKeyboardInterrupt()
def DoWork(project, mirror, opt, cmd, shell, cnt, config):
env = os.environ.copy()
def setenv(name, val):
if val is None:
val = ''
if hasattr(val, 'encode'):
val = val.encode()
env[name] = val
setenv('REPO_PROJECT', project['name'])
setenv('REPO_PATH', project['relpath'])
setenv('REPO_REMOTE', project['remote_name'])
setenv('REPO_LREV', project['lrev'])
setenv('REPO_RREV', project['rrev'])
setenv('REPO_I', str(cnt + 1))
for name in project['annotations']:
setenv("REPO__%s" % (name), project['annotations'][name])
if mirror:
setenv('GIT_DIR', project['gitdir'])
cwd = project['gitdir']
else:
cwd = project['worktree']
if not os.path.exists(cwd):
if (opt.project_header and opt.verbose) \
or not opt.project_header:
print('skipping %s/' % project['relpath'], file=sys.stderr)
return
if opt.project_header:
stdin = subprocess.PIPE
stdout = subprocess.PIPE
stderr = subprocess.PIPE
else:
stdin = None
stdout = None
stderr = None
p = subprocess.Popen(cmd,
cwd=cwd,
shell=shell,
env=env,
stdin=stdin,
stdout=stdout,
stderr=stderr)
if opt.project_header:
out = ForallColoring(config)
out.redirect(sys.stdout)
class sfd(object):
def __init__(self, fd, dest):
self.fd = fd
self.dest = dest
def fileno(self):
return self.fd.fileno()
empty = True
errbuf = ''
p.stdin.close()
s_in = [sfd(p.stdout, sys.stdout),
sfd(p.stderr, sys.stderr)]
for s in s_in:
flags = fcntl.fcntl(s.fd, fcntl.F_GETFL)
fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
while s_in:
in_ready, _out_ready, _err_ready = select.select(s_in, [], [])
for s in in_ready:
buf = s.fd.read(4096)
if not buf:
s.fd.close()
s_in.remove(s)
continue
if not opt.verbose:
if s.fd != p.stdout:
errbuf += buf
continue
if empty and out:
if not cnt == 0:
out.nl()
if mirror:
project_header_path = project['name']
else:
project_header_path = project['relpath']
out.project('project %s/', project_header_path)
out.nl()
out.flush()
if errbuf:
sys.stderr.write(errbuf)
sys.stderr.flush()
errbuf = ''
empty = False
s.dest.write(buf)
s.dest.flush()
r = p.wait()
return r
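The rewrite above moves the per-project work into DoWork() so that `repo forall -j N` can fan out over a multiprocessing pool, and documents the REPO_COUNT/REPO_I environment variables for simple progress output (e.g. repo forall -c 'echo "$REPO_I/$REPO_COUNT $REPO_PROJECT"'). A minimal standalone sketch of the pool/interrupt pattern used here, with a stand-in worker:

import errno
import multiprocessing
import signal
import sys

class WorkerKeyboardInterrupt(Exception):
  """KeyboardInterrupt re-raised as a plain Exception inside a worker."""

def InitWorker():
  # Workers ignore SIGINT; Ctrl-C is handled once, in the parent.
  signal.signal(signal.SIGINT, signal.SIG_IGN)

def DoWorkWrapper(item):
  try:
    return item * item  # stand-in for running the command in one project
  except KeyboardInterrupt:
    raise WorkerKeyboardInterrupt()

if __name__ == '__main__':
  rc = 0
  pool = multiprocessing.Pool(4, InitWorker)
  try:
    results = pool.imap(DoWorkWrapper, range(8))
    pool.close()
    for r in results:
      print(r)
  except (KeyboardInterrupt, WorkerKeyboardInterrupt):
    pool.terminate()
    rc = errno.EINTR
  finally:
    pool.join()
  sys.exit(rc)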

subcmds/gitc_delete.py (new file, 55 lines)

@ -0,0 +1,55 @@
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import shutil
import sys
from command import Command, GitcClientCommand
import gitc_utils
from pyversion import is_python3
if not is_python3():
# pylint:disable=W0622
input = raw_input
# pylint:enable=W0622
class GitcDelete(Command, GitcClientCommand):
common = True
visible_everywhere = False
helpSummary = "Delete a GITC Client."
helpUsage = """
%prog
"""
helpDescription = """
This subcommand deletes the current GITC client, deleting the GITC manifest
and all locally downloaded sources.
"""
def _Options(self, p):
p.add_option('-f', '--force',
dest='force', action='store_true',
help='Force the deletion (no prompt).')
def Execute(self, opt, args):
if not opt.force:
prompt = ('This will delete GITC client: %s\nAre you sure? (yes/no) ' %
self.gitc_manifest.gitc_client_name)
response = input(prompt).lower()
if not response == 'yes':
print('Response was not "yes"\n Exiting...')
sys.exit(1)
shutil.rmtree(self.gitc_manifest.gitc_client_dir)
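The raw_input/input shim near the top of this file is the usual Python 2/3 prompt-compatibility pattern; a standalone equivalent using only the standard library (prompt text is illustrative):

import sys

if sys.version_info[0] < 3:
  input = raw_input  # noqa: F821 -- raw_input is a Python 2 builtin

response = input('This will delete the client. Are you sure? (yes/no) ').lower()
if response != 'yes':
  sys.exit(1)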

subcmds/gitc_init.py (new file, 82 lines)

@ -0,0 +1,82 @@
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import sys
import gitc_utils
from command import GitcAvailableCommand
from manifest_xml import GitcManifest
from subcmds import init
import wrapper
class GitcInit(init.Init, GitcAvailableCommand):
common = True
helpSummary = "Initialize a GITC Client."
helpUsage = """
%prog [options] [client name]
"""
helpDescription = """
The '%prog' command is run to initialize a new GITC client for use
with the GITC file system.
This command sets up the client directory and initializes repo, just
like repo init does, then downloads the manifest collection
and installs it in the .repo/ directory of the GITC client.
Once this is done, a GITC manifest is generated by pulling the HEAD
SHA for each project; the properly formatted XML file is then
installed as .manifest in the GITC client directory.
The -c argument is required to specify the GITC client name.
The optional -f argument can be used to specify the manifest file to
use for this GITC client.
"""
def _Options(self, p):
super(GitcInit, self)._Options(p)
g = p.add_option_group('GITC options')
g.add_option('-f', '--manifest-file',
dest='manifest_file',
help='Optional manifest file to use for this GITC client.')
g.add_option('-c', '--gitc-client',
dest='gitc_client',
help='The name of the gitc_client instance to create or modify.')
def Execute(self, opt, args):
gitc_client = gitc_utils.parse_clientdir(os.getcwd())
if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client):
print('fatal: Please update your repo command. See go/gitc for instructions.', file=sys.stderr)
sys.exit(1)
self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client)
super(GitcInit, self).Execute(opt, args)
manifest_file = self.manifest.manifestFile
if opt.manifest_file:
if not os.path.exists(opt.manifest_file):
print('fatal: Specified manifest file %s does not exist.' %
opt.manifest_file)
sys.exit(1)
manifest_file = opt.manifest_file
manifest = GitcManifest(self.repodir, gitc_client)
manifest.Override(manifest_file)
gitc_utils.generate_gitc_manifest(None, manifest)
print('Please run `cd %s` to view your GITC client.' %
os.path.join(wrapper.Wrapper().GITC_FS_ROOT_DIR, gitc_client))
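A hedged sketch of the final steps performed by Execute() above: load the chosen manifest into a GitcManifest and emit the client's .manifest file (constructor and call signatures as shown in this diff; the paths and client name are illustrative):

from manifest_xml import GitcManifest
import gitc_utils

manifest = GitcManifest('/path/to/client/.repo', 'my_client')
manifest.Override('/path/to/manifest.xml')
gitc_utils.generate_gitc_manifest(None, manifest)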

subcmds/help.py

@ -19,7 +19,8 @@ import sys
from formatter import AbstractFormatter, DumbWriter
from color import Coloring
from command import PagedCommand, MirrorSafeCommand
from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand
import gitc_utils
class Help(PagedCommand, MirrorSafeCommand):
common = False
@ -54,9 +55,21 @@ Displays detailed usage information about a command.
def _PrintCommonCommands(self):
print('usage: repo COMMAND [ARGS]')
print('The most commonly used repo commands are:')
def gitc_supported(cmd):
if not isinstance(cmd, GitcAvailableCommand) and not isinstance(cmd, GitcClientCommand):
return True
if self.manifest.isGitcClient:
return True
if isinstance(cmd, GitcClientCommand):
return False
if gitc_utils.get_gitc_manifest_dir():
return True
return False
commandNames = list(sorted([name
for name, command in self.commands.items()
if command.common]))
if command.common and gitc_supported(command)]))
maxlen = 0
for name in commandNames:

subcmds/info.py

@ -59,7 +59,8 @@ class Info(PagedCommand):
or 'all,-notdefault')
self.heading("Manifest branch: ")
self.headtext(self.manifest.default.revisionExpr)
if self.manifest.default.revisionExpr:
self.headtext(self.manifest.default.revisionExpr)
self.out.nl()
self.heading("Manifest merge branch: ")
self.headtext(mergeBranch)

subcmds/init.py

@ -27,7 +27,7 @@ else:
import imp
import urlparse
urllib = imp.new_module('urllib')
urllib.parse = urlparse.urlparse
urllib.parse = urlparse
from color import Coloring
from command import InteractiveCommand, MirrorSafeCommand
@ -61,6 +61,11 @@ directory use as much data as possible from the local reference
directory when fetching from the server. This will make the sync
go a lot faster by reducing data traffic on the network.
The --no-clone-bundle option disables any attempt to use
$URL/clone.bundle to bootstrap a new Git repository from a
resumable bundle file on a content delivery network. This
may be necessary if there are problems with the local Python
HTTP client or proxy configuration, but the Git binary works.
Switching Manifest Branches
---------------------------
@ -99,6 +104,10 @@ to update the working directory files.
g.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
g.add_option('--archive',
dest='archive', action='store_true',
help='checkout an archive instead of a git repository for '
'each project. See git archive.')
g.add_option('-g', '--groups',
dest='groups', default='default',
help='restrict manifest projects to ones with specified '
@ -109,6 +118,9 @@ to update the working directory files.
help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM')
g.add_option('--no-clone-bundle',
dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS')
# Tool
g = p.add_option_group('repo Version options')
@ -149,7 +161,7 @@ to update the working directory files.
# server where this git is located, so let's save that here.
mirrored_manifest_git = None
if opt.reference:
manifest_git_path = urllib.parse(opt.manifest_url).path[1:]
manifest_git_path = urllib.parse.urlparse(opt.manifest_url).path[1:]
mirrored_manifest_git = os.path.join(opt.reference, manifest_git_path)
if not mirrored_manifest_git.endswith(".git"):
mirrored_manifest_git += ".git"
@ -175,7 +187,7 @@ to update the working directory files.
r.Save()
groups = re.split(r'[,\s]+', opt.groups)
all_platforms = ['linux', 'darwin']
all_platforms = ['linux', 'darwin', 'windows']
platformize = lambda x: 'platform-' + x
if opt.platform == 'auto':
if (not opt.mirror and
@ -184,7 +196,7 @@ to update the working directory files.
elif opt.platform == 'all':
groups.extend(map(platformize, all_platforms))
elif opt.platform in all_platforms:
groups.extend(platformize(opt.platform))
groups.append(platformize(opt.platform))
elif opt.platform != 'none':
print('fatal: invalid platform flag', file=sys.stderr)
sys.exit(1)
@ -198,6 +210,16 @@ to update the working directory files.
if opt.reference:
m.config.SetString('repo.reference', opt.reference)
if opt.archive:
if is_new:
m.config.SetString('repo.archive', 'true')
else:
print('fatal: --archive is only supported when initializing a new '
'workspace.', file=sys.stderr)
print('Either delete the .repo folder in this workspace, or initialize '
'in another location.', file=sys.stderr)
sys.exit(1)
if opt.mirror:
if is_new:
m.config.SetString('repo.mirror', 'true')
@ -208,7 +230,8 @@ to update the working directory files.
'in another location.', file=sys.stderr)
sys.exit(1)
if not m.Sync_NetworkHalf(is_new=is_new):
if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet,
clone_bundle=not opt.no_clone_bundle):
r = m.GetRemote(m.remote.name)
print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
@ -219,7 +242,7 @@ to update the working directory files.
sys.exit(1)
if opt.manifest_branch:
m.MetaBranchSwitch(opt.manifest_branch)
m.MetaBranchSwitch()
syncbuf = SyncBuffer(m.config)
m.Sync_LocalHalf(syncbuf)
@ -366,6 +389,13 @@ to update the working directory files.
if opt.reference:
opt.reference = os.path.expanduser(opt.reference)
# Check this here, else manifest will be tagged "not new" and init won't be
# possible anymore without removing the .repo/manifests directory.
if opt.archive and opt.mirror:
print('fatal: --mirror and --archive cannot be used together.',
file=sys.stderr)
sys.exit(1)
self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name)
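One hunk above replaces groups.extend(platformize(opt.platform)) with groups.append(...). The difference matters because extend() iterates its argument, and iterating a string yields individual characters:

groups = ['default']
groups.extend('platform-linux')
# -> ['default', 'p', 'l', 'a', 't', 'f', 'o', 'r', 'm', '-', 'l', 'i', 'n', 'u', 'x']

groups = ['default']
groups.append('platform-linux')
# -> ['default', 'platform-linux']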

subcmds/list.py

@ -35,6 +35,9 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
p.add_option('-r', '--regex',
dest='regex', action='store_true',
help="Filter the project list based on regex or wildcard matching of strings")
p.add_option('-g', '--groups',
dest='groups',
help="Filter the project list based on the groups the project is in")
p.add_option('-f', '--fullpath',
dest='fullpath', action='store_true',
help="Display the full work tree path instead of the relative path")
@ -62,7 +65,7 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
sys.exit(1)
if not opt.regex:
projects = self.GetProjects(args)
projects = self.GetProjects(args, groups=opt.groups)
else:
projects = self.FindProjects(args)

subcmds/rebase.py

@ -54,6 +54,11 @@ branch but need to incorporate new upstream changes "underneath" them.
p.add_option('--auto-stash',
dest='auto_stash', action='store_true',
help='Stash local modifications before starting')
p.add_option('-m', '--onto-manifest',
dest='onto_manifest', action='store_true',
help='Rebase onto the manifest version instead of upstream '
'HEAD. This helps to make sure the local tree stays '
'consistent if you previously synced to a manifest.')
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
@ -62,6 +67,9 @@ branch but need to incorporate new upstream changes "underneath" them.
if opt.interactive and not one_project:
print('error: interactive rebase not supported with multiple projects',
file=sys.stderr)
if len(args) == 1:
print('note: project %s is mapped to more than one path' % (args[0],),
file=sys.stderr)
return -1
for project in all_projects:
@ -103,6 +111,10 @@ branch but need to incorporate new upstream changes "underneath" them.
if opt.interactive:
args.append("-i")
if opt.onto_manifest:
args.append('--onto')
args.append(project.revisionExpr)
args.append(upbranch.LocalMerge)
print('# %s: rebasing %s -> %s'

subcmds/start.py

@ -14,11 +14,15 @@
# limitations under the License.
from __future__ import print_function
import os
import sys
from command import Command
from git_config import IsId
from git_command import git
import gitc_utils
from progress import Progress
from project import SyncBuffer
class Start(Command):
common = True
@ -50,19 +54,59 @@ revision specified in the manifest.
if not opt.all:
projects = args[1:]
if len(projects) < 1:
print("error: at least one project must be specified", file=sys.stderr)
sys.exit(1)
projects = ['.',] # start it in the local project by default
all_projects = self.GetProjects(projects)
all_projects = self.GetProjects(projects,
missing_ok=bool(self.gitc_manifest))
# This must happen after we find all_projects, since GetProjects may need
# the local directory, which will disappear once we save the GITC manifest.
if self.gitc_manifest:
gitc_projects = self.GetProjects(projects, manifest=self.gitc_manifest,
missing_ok=True)
for project in gitc_projects:
if project.old_revision:
project.already_synced = True
else:
project.already_synced = False
project.old_revision = project.revisionExpr
project.revisionExpr = None
# Save the GITC manifest.
gitc_utils.save_manifest(self.gitc_manifest)
# Make sure we have a valid CWD
if not os.path.exists(os.getcwd()):
os.chdir(self.manifest.topdir)
pm = Progress('Starting %s' % nb, len(all_projects))
for project in all_projects:
pm.update()
if self.gitc_manifest:
gitc_project = self.gitc_manifest.paths[project.relpath]
# Sync projects that have not been opened.
if not gitc_project.already_synced:
proj_localdir = os.path.join(self.gitc_manifest.gitc_client_dir,
project.relpath)
project.worktree = proj_localdir
if not os.path.exists(proj_localdir):
os.makedirs(proj_localdir)
project.Sync_NetworkHalf()
sync_buf = SyncBuffer(self.manifest.manifestProject.config)
project.Sync_LocalHalf(sync_buf)
project.revisionId = gitc_project.old_revision
# If the current revision is a specific SHA1 then we can't push back
# to it so substitute the manifest default revision instead.
# to it; so substitute with dest_branch if defined, or with manifest
# default revision instead.
branch_merge = ''
if IsId(project.revisionExpr):
project.revisionExpr = self.manifest.default.revisionExpr
if not project.StartBranch(nb):
if project.dest_branch:
branch_merge = project.dest_branch
else:
branch_merge = self.manifest.default.revisionExpr
if not project.StartBranch(nb, branch_merge=branch_merge):
err.append(project)
pm.end()

subcmds/status.py

@ -22,15 +22,8 @@ except ImportError:
import glob
from pyversion import is_python3
if is_python3():
import io
else:
import StringIO as io
import itertools
import os
import sys
from color import Coloring
@ -97,7 +90,7 @@ the following meanings:
dest='orphans', action='store_true',
help="include objects in working directory outside of repo projects")
def _StatusHelper(self, project, clean_counter, sem, output):
def _StatusHelper(self, project, clean_counter, sem):
"""Obtains the status for a specific project.
Obtains the status for a project, redirecting the output to
@ -111,9 +104,9 @@ the following meanings:
output: Where to output the status.
"""
try:
state = project.PrintWorkTreeStatus(output)
state = project.PrintWorkTreeStatus()
if state == 'CLEAN':
clean_counter.next()
next(clean_counter)
finally:
sem.release()
@ -122,16 +115,16 @@ the following meanings:
status_header = ' --\t'
for item in dirs:
if not os.path.isdir(item):
outstring.write(''.join([status_header, item]))
outstring.append(''.join([status_header, item]))
continue
if item in proj_dirs:
continue
if item in proj_dirs_parents:
self._FindOrphans(glob.glob('%s/.*' % item) + \
glob.glob('%s/*' % item), \
self._FindOrphans(glob.glob('%s/.*' % item) +
glob.glob('%s/*' % item),
proj_dirs, proj_dirs_parents, outstring)
continue
outstring.write(''.join([status_header, item, '/']))
outstring.append(''.join([status_header, item, '/']))
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
@ -141,30 +134,21 @@ the following meanings:
for project in all_projects:
state = project.PrintWorkTreeStatus()
if state == 'CLEAN':
counter.next()
next(counter)
else:
sem = _threading.Semaphore(opt.jobs)
threads_and_output = []
threads = []
for project in all_projects:
sem.acquire()
class BufList(io.StringIO):
def dump(self, ostream):
for entry in self.buflist:
ostream.write(entry)
output = BufList()
t = _threading.Thread(target=self._StatusHelper,
args=(project, counter, sem, output))
threads_and_output.append((t, output))
args=(project, counter, sem))
threads.append(t)
t.daemon = True
t.start()
for (t, output) in threads_and_output:
for t in threads:
t.join()
output.dump(sys.stdout)
output.close()
if len(all_projects) == counter.next():
if len(all_projects) == next(counter):
print('nothing to commit (working directory clean)')
if opt.orphans:
@ -188,23 +172,21 @@ the following meanings:
try:
os.chdir(self.manifest.topdir)
outstring = io.StringIO()
self._FindOrphans(glob.glob('.*') + \
glob.glob('*'), \
outstring = []
self._FindOrphans(glob.glob('.*') +
glob.glob('*'),
proj_dirs, proj_dirs_parents, outstring)
if outstring.buflist:
if outstring:
output = StatusColoring(self.manifest.globalConfig)
output.project('Objects not within a project (orphans)')
output.nl()
for entry in outstring.buflist:
for entry in outstring:
output.untracked(entry)
output.nl()
else:
print('No orphan files or directories')
outstring.close()
finally:
# Restore CWD.
os.chdir(orig_path)
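The status changes above drop the per-thread StringIO buffers and let each helper print directly; the fan-out itself is a plain thread-plus-semaphore pattern. A standalone sketch with a stand-in worker (the real one calls project.PrintWorkTreeStatus()):

from __future__ import print_function
import threading

def run_with_jobs(items, worker, jobs=2):
  """Run worker(item) for each item, with at most `jobs` running at once."""
  sem = threading.Semaphore(jobs)
  threads = []
  for item in items:
    sem.acquire()
    def _helper(item=item):
      try:
        worker(item)
      finally:
        sem.release()
    t = threading.Thread(target=_helper)
    t.daemon = True
    t.start()
    threads.append(t)
  for t in threads:
    t.join()

run_with_jobs(['project-a', 'project-b', 'project-c'],
              lambda name: print('%s: status checked' % name))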

subcmds/sync.py

@ -14,27 +14,35 @@
# limitations under the License.
from __future__ import print_function
import json
import netrc
from optparse import SUPPRESS_HELP
import os
import pickle
import re
import shutil
import socket
import subprocess
import sys
import tempfile
import time
from pyversion import is_python3
if is_python3():
import http.cookiejar as cookielib
import urllib.error
import urllib.parse
import urllib.request
import xmlrpc.client
else:
import cookielib
import imp
import urllib2
import urlparse
import xmlrpclib
urllib = imp.new_module('urllib')
urllib.error = urllib2
urllib.parse = urlparse
urllib.request = urllib2
xmlrpc = imp.new_module('xmlrpc')
xmlrpc.client = xmlrpclib
@ -57,14 +65,17 @@ except ImportError:
multiprocessing = None
from git_command import GIT, git_require
from git_config import GetUrlCookieFile
from git_refs import R_HEADS, HEAD
from main import WrapperModule
import gitc_utils
from project import Project
from project import RemoteSpec
from command import Command, MirrorSafeCommand
from error import RepoChangedException, GitError, ManifestParseError
from project import SyncBuffer
from progress import Progress
from wrapper import Wrapper
from manifest_xml import GitcManifest
_ONE_DAY_S = 24 * 60 * 60
@ -119,6 +130,11 @@ credentials.
The -f/--force-broken option can be used to proceed with syncing
other projects if a project sync fails.
The --force-sync option can be used to overwrite existing git
directories if they have previously been linked to a different
object directory. WARNING: This may cause data to be lost since
refs may be removed when overwriting.
The --no-clone-bundle option disables any attempt to use
$URL/clone.bundle to bootstrap a new Git repository from a
resumable bundle file on a content delivery network. This
@ -128,6 +144,16 @@ HTTP client or proxy configuration, but the Git binary works.
The --fetch-submodules option enables fetching Git submodules
of a project from server.
The -c/--current-branch option can be used to only fetch objects that
are on the branch specified by a project's revision.
The --optimized-fetch option can be used to only fetch projects that
are fixed to a sha1 revision if the sha1 revision does not already
exist locally.
The --prune option can be used to remove any refs that no longer
exist on the remote.
SSH Connections
---------------
@ -167,6 +193,11 @@ later is required to fix a server side protocol bug.
p.add_option('-f', '--force-broken',
dest='force_broken', action='store_true',
help="continue sync even if a project fails to sync")
p.add_option('--force-sync',
dest='force_sync', action='store_true',
help="overwrite an existing git directory if it needs to "
"point to a different object directory. WARNING: this "
"may cause loss of data")
p.add_option('-l', '--local-only',
dest='local_only', action='store_true',
help="only update working tree, don't fetch")
@ -203,10 +234,15 @@ later is required to fix a server side protocol bug.
p.add_option('--no-tags',
dest='no_tags', action='store_true',
help="don't fetch tags")
p.add_option('--optimized-fetch',
dest='optimized_fetch', action='store_true',
help='only fetch projects fixed to sha1 if revision does not exist locally')
p.add_option('--prune', dest='prune', action='store_true',
help='delete refs that no longer exist on the remote')
if show_smart:
p.add_option('-s', '--smart-sync',
dest='smart_sync', action='store_true',
help='smart sync using manifest from a known good build')
help='smart sync using manifest from the latest known good build')
p.add_option('-t', '--smart-tag',
dest='smart_tag', action='store',
help='smart sync using manifest from a known tag')
@ -219,9 +255,25 @@ later is required to fix a server side protocol bug.
dest='repo_upgraded', action='store_true',
help=SUPPRESS_HELP)
def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event):
def _FetchProjectList(self, opt, projects, *args, **kwargs):
"""Main function of the fetch threads when jobs are > 1.
Delegates most of the work to _FetchHelper.
Args:
opt: Program options returned from optparse. See _Options().
projects: Projects to fetch.
*args, **kwargs: Remaining arguments to pass to _FetchHelper. See the
_FetchHelper docstring for details.
"""
for project in projects:
success = self._FetchHelper(opt, project, *args, **kwargs)
if not success and not opt.force_broken:
break
def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event):
"""Fetch git objects for a single project.
Args:
opt: Program options returned from optparse. See _Options().
project: Project object for the project to fetch.
@ -235,6 +287,9 @@ later is required to fix a server side protocol bug.
can be started up.
err_event: We'll set this event in the case of an error (after printing
out info about the error).
Returns:
Whether the fetch was successful.
"""
# We'll set to true once we've locked the lock.
did_lock = False
@ -252,8 +307,11 @@ later is required to fix a server side protocol bug.
success = project.Sync_NetworkHalf(
quiet=opt.quiet,
current_branch_only=opt.current_branch_only,
force_sync=opt.force_sync,
clone_bundle=not opt.no_clone_bundle,
no_tags=opt.no_tags)
no_tags=opt.no_tags, archive=self.manifest.IsArchive,
optimized_fetch=opt.optimized_fetch,
prune=opt.prune)
self._fetch_times.Set(project, time.time() - start)
# Lock around all the rest of the code, since printing, updating a set
@ -262,6 +320,7 @@ later is required to fix a server side protocol bug.
did_lock = True
if not success:
err_event.set()
print('error: Cannot fetch %s' % project.name, file=sys.stderr)
if opt.force_broken:
print('warn: --force-broken, continuing to sync',
@ -272,8 +331,10 @@ later is required to fix a server side protocol bug.
fetched.add(project.gitdir)
pm.update()
except _FetchError:
err_event.set()
except:
pass
except Exception as e:
print('error: Cannot fetch %s (%s: %s)' \
% (project.name, type(e).__name__, str(e)), file=sys.stderr)
err_event.set()
raise
finally:
@ -281,67 +342,68 @@ later is required to fix a server side protocol bug.
lock.release()
sem.release()
return success
def _Fetch(self, projects, opt):
fetched = set()
lock = _threading.Lock()
pm = Progress('Fetching projects', len(projects))
if self.jobs == 1:
for project in projects:
pm.update()
if not opt.quiet:
print('Fetching project %s' % project.name)
if project.Sync_NetworkHalf(
quiet=opt.quiet,
current_branch_only=opt.current_branch_only,
clone_bundle=not opt.no_clone_bundle,
no_tags=opt.no_tags):
fetched.add(project.gitdir)
else:
print('error: Cannot fetch %s' % project.name, file=sys.stderr)
if opt.force_broken:
print('warn: --force-broken, continuing to sync', file=sys.stderr)
else:
sys.exit(1)
else:
threads = set()
lock = _threading.Lock()
sem = _threading.Semaphore(self.jobs)
err_event = _threading.Event()
for project in projects:
# Check for any errors before starting any new threads.
# ...we'll let existing threads finish, though.
if err_event.isSet():
break
objdir_project_map = dict()
for project in projects:
objdir_project_map.setdefault(project.objdir, []).append(project)
sem.acquire()
t = _threading.Thread(target = self._FetchHelper,
args = (opt,
project,
lock,
fetched,
pm,
sem,
err_event))
threads = set()
sem = _threading.Semaphore(self.jobs)
err_event = _threading.Event()
for project_list in objdir_project_map.values():
# Check for any errors before running any more tasks.
# ...we'll let existing threads finish, though.
if err_event.isSet() and not opt.force_broken:
break
sem.acquire()
kwargs = dict(opt=opt,
projects=project_list,
lock=lock,
fetched=fetched,
pm=pm,
sem=sem,
err_event=err_event)
if self.jobs > 1:
t = _threading.Thread(target = self._FetchProjectList,
kwargs = kwargs)
# Ensure that Ctrl-C will not freeze the repo process.
t.daemon = True
threads.add(t)
t.start()
else:
self._FetchProjectList(**kwargs)
for t in threads:
t.join()
for t in threads:
t.join()
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
print('\nerror: Exited sync due to fetch errors', file=sys.stderr)
sys.exit(1)
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
print('\nerror: Exited sync due to fetch errors', file=sys.stderr)
sys.exit(1)
pm.end()
self._fetch_times.Save()
self._GCProjects(projects)
if not self.manifest.IsArchive:
self._GCProjects(projects)
return fetched
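The reworked _Fetch groups projects by their shared object directory, so projects writing to the same objdir are fetched serially inside one worker while distinct objdirs still fetch in parallel, bounded by a semaphore. A stripped-down sketch of that pattern (names here are illustrative, not the repo code):

import threading

def fetch_grouped(projects, fetch_one, jobs=4):
  """Fetch projects in parallel, but serialize projects sharing an objdir."""
  groups = {}
  for p in projects:
    groups.setdefault(p['objdir'], []).append(p)

  sem = threading.Semaphore(jobs)
  err_event = threading.Event()

  def worker(batch):
    try:
      for p in batch:
        if not fetch_one(p):
          err_event.set()   # stop handing out new work on failure
          break
    finally:
      sem.release()

  threads = []
  for batch in groups.values():
    if err_event.is_set():
      break
    sem.acquire()
    t = threading.Thread(target=worker, args=(batch,))
    t.daemon = True          # Ctrl-C should not hang on workers
    threads.append(t)
    t.start()
  for t in threads:
    t.join()
  return not err_event.is_set()

# Example: two projects sharing one object store are fetched back to back.
ok = fetch_grouped(
    [{'name': 'a', 'objdir': 'o1'}, {'name': 'b', 'objdir': 'o1'}],
    fetch_one=lambda p: True)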
def _GCProjects(self, projects):
gc_gitdirs = {}
for project in projects:
if len(project.manifest.GetProjectsWithName(project.name)) > 1:
print('Shared project %s found, disabling pruning.' % project.name)
project.bare_git.config('--replace-all', 'gc.pruneExpire', 'never')
gc_gitdirs[project.gitdir] = project.bare_git
has_dash_c = git_require((1, 7, 2))
if multiprocessing and has_dash_c:
cpu_count = multiprocessing.cpu_count()
@ -350,8 +412,8 @@ later is required to fix a server side protocol bug.
jobs = min(self.jobs, cpu_count)
if jobs < 2:
for project in projects:
project.bare_git.gc('--auto')
for bare_git in gc_gitdirs.values():
bare_git.gc('--auto')
return
config = {'pack.threads': cpu_count / jobs if cpu_count > jobs else 1}
@ -360,10 +422,10 @@ later is required to fix a server side protocol bug.
sem = _threading.Semaphore(jobs)
err_event = _threading.Event()
def GC(project):
def GC(bare_git):
try:
try:
project.bare_git.gc('--auto', config=config)
bare_git.gc('--auto', config=config)
except GitError:
err_event.set()
except:
@ -372,11 +434,11 @@ later is required to fix a server side protocol bug.
finally:
sem.release()
for project in projects:
for bare_git in gc_gitdirs.values():
if err_event.isSet():
break
sem.acquire()
t = _threading.Thread(target=GC, args=(project,))
t = _threading.Thread(target=GC, args=(bare_git,))
t.daemon = True
threads.add(t)
t.start()
@ -395,6 +457,59 @@ later is required to fix a server side protocol bug.
else:
self.manifest._Unload()
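For projects that share an object store, _GCProjects above now pins gc.pruneExpire to never before running gc --auto, since pruning from one client could drop objects another checkout still references. The equivalent plain-git operation looks roughly like this (the repository path in the example is illustrative):

import subprocess

def disable_prune_and_gc(gitdir):
  """Illustration of the config tweak _GCProjects applies to shared gitdirs."""
  subprocess.check_call(
      ['git', '--git-dir', gitdir, 'config', '--replace-all',
       'gc.pruneExpire', 'never'])
  subprocess.check_call(['git', '--git-dir', gitdir, 'gc', '--auto'])

# e.g. disable_prune_and_gc('.repo/project-objects/platform/build.git')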
def _DeleteProject(self, path):
print('Deleting obsolete path %s' % path, file=sys.stderr)
# Delete the .git directory first, so we're less likely to have a partially
# working git repository around. There shouldn't be any git projects here,
# so rmtree works.
try:
shutil.rmtree(os.path.join(path, '.git'))
except OSError:
print('Failed to remove %s' % os.path.join(path, '.git'), file=sys.stderr)
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return -1
# Delete everything under the worktree, except for directories that contain
# another git project
dirs_to_remove = []
failed = False
for root, dirs, files in os.walk(path):
for f in files:
try:
os.remove(os.path.join(root, f))
except OSError:
print('Failed to remove %s' % os.path.join(root, f), file=sys.stderr)
failed = True
dirs[:] = [d for d in dirs
if not os.path.lexists(os.path.join(root, d, '.git'))]
dirs_to_remove += [os.path.join(root, d) for d in dirs
if os.path.join(root, d) not in dirs_to_remove]
for d in reversed(dirs_to_remove):
if len(os.listdir(d)) == 0:
try:
os.rmdir(d)
except OSError:
print('Failed to remove %s' % os.path.join(root, d), file=sys.stderr)
failed = True
continue
if failed:
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return -1
# Try deleting parent dirs if they are empty
project_dir = path
while project_dir != self.manifest.topdir:
if len(os.listdir(project_dir)) == 0:
os.rmdir(project_dir)
else:
break
project_dir = os.path.dirname(project_dir)
return 0
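The key trick in _DeleteProject is the in-place edit of dirs during os.walk, which stops the walk from descending into nested checkouts that have their own .git entry. A tiny self-contained illustration of that idiom (the paths in the comment are invented for the example):

import os

def files_outside_nested_git(path):
  """List files under path, skipping any subtree that has its own .git."""
  found = []
  for root, dirs, files in os.walk(path):
    # Editing dirs in place prunes the traversal: os.walk will not descend
    # into directories we drop here.
    dirs[:] = [d for d in dirs
               if not os.path.lexists(os.path.join(root, d, '.git'))]
    found.extend(os.path.join(root, f) for f in files)
  return found

# e.g. files_outside_nested_git('build') skips build/kati/... if build/kati
# is its own git checkout.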
def UpdateProjectList(self):
new_project_paths = []
for project in self.GetProjects(None, missing_ok=True):
@ -415,13 +530,14 @@ later is required to fix a server side protocol bug.
continue
if path not in new_project_paths:
# If the path has already been deleted, we don't need to do it
if os.path.exists(self.manifest.topdir + '/' + path):
gitdir = os.path.join(self.manifest.topdir, path, '.git')
if os.path.exists(gitdir):
project = Project(
manifest = self.manifest,
name = path,
remote = RemoteSpec('origin'),
gitdir = os.path.join(self.manifest.topdir,
path, '.git'),
gitdir = gitdir,
objdir = gitdir,
worktree = os.path.join(self.manifest.topdir, path),
relpath = path,
revisionExpr = 'HEAD',
@ -434,18 +550,8 @@ later is required to fix a server side protocol bug.
print(' commit changes, then run sync again',
file=sys.stderr)
return -1
else:
print('Deleting obsolete path %s' % project.worktree,
file=sys.stderr)
shutil.rmtree(project.worktree)
# Try deleting parent subdirs if they are empty
project_dir = os.path.dirname(project.worktree)
while project_dir != self.manifest.topdir:
try:
os.rmdir(project_dir)
except OSError:
break
project_dir = os.path.dirname(project_dir)
elif self._DeleteProject(project.worktree):
return -1
new_project_paths.sort()
fd = open(file_path, 'w')
@ -488,6 +594,9 @@ later is required to fix a server side protocol bug.
self.manifest.Override(opt.manifest_name)
manifest_name = opt.manifest_name
smart_sync_manifest_name = "smart_sync_override.xml"
smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, smart_sync_manifest_name)
if opt.smart_sync or opt.smart_tag:
if not self.manifest.manifest_server:
@ -509,19 +618,18 @@ later is required to fix a server side protocol bug.
try:
info = netrc.netrc()
except IOError:
print('.netrc file does not exist or could not be opened',
file=sys.stderr)
# .netrc file does not exist or could not be opened
pass
else:
try:
parse_result = urllib.parse.urlparse(manifest_server)
if parse_result.hostname:
username, _account, password = \
info.authenticators(parse_result.hostname)
except TypeError:
# TypeError is raised when the given hostname is not present
# in the .netrc file.
print('No credentials found for %s in .netrc'
% parse_result.hostname, file=sys.stderr)
auth = info.authenticators(parse_result.hostname)
if auth:
username, _account, password = auth
else:
print('No credentials found for %s in .netrc'
% parse_result.hostname, file=sys.stderr)
except netrc.NetrcParseError as e:
print('Error parsing .netrc file: %s' % e, file=sys.stderr)
@ -530,8 +638,12 @@ later is required to fix a server side protocol bug.
(username, password),
1)
transport = PersistentTransport(manifest_server)
if manifest_server.startswith('persistent-'):
manifest_server = manifest_server[len('persistent-'):]
try:
server = xmlrpc.client.Server(manifest_server)
server = xmlrpc.client.Server(manifest_server, transport=transport)
if opt.smart_sync:
p = self.manifest.manifestProject
b = p.GetBranch(p.CurrentBranch)
@ -540,7 +652,10 @@ later is required to fix a server side protocol bug.
branch = branch[len(R_HEADS):]
env = os.environ.copy()
if 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in env:
if 'SYNC_TARGET' in env:
target = env['SYNC_TARGET']
[success, manifest_str] = server.GetApprovedManifest(branch, target)
elif 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in env:
target = '%s-%s' % (env['TARGET_PRODUCT'],
env['TARGET_BUILD_VARIANT'])
[success, manifest_str] = server.GetApprovedManifest(branch, target)
@ -551,17 +666,16 @@ later is required to fix a server side protocol bug.
[success, manifest_str] = server.GetManifest(opt.smart_tag)
if success:
manifest_name = "smart_sync_override.xml"
manifest_path = os.path.join(self.manifest.manifestProject.worktree,
manifest_name)
manifest_name = smart_sync_manifest_name
try:
f = open(manifest_path, 'w')
f = open(smart_sync_manifest_path, 'w')
try:
f.write(manifest_str)
finally:
f.close()
except IOError:
print('error: cannot write manifest to %s' % manifest_path,
except IOError as e:
print('error: cannot write manifest to %s:\n%s'
% (smart_sync_manifest_path, e),
file=sys.stderr)
sys.exit(1)
self._ReloadManifest(manifest_name)
@ -578,6 +692,13 @@ later is required to fix a server side protocol bug.
% (self.manifest.manifest_server, e.errcode, e.errmsg),
file=sys.stderr)
sys.exit(1)
else: # Not smart sync or smart tag mode
if os.path.isfile(smart_sync_manifest_path):
try:
os.remove(smart_sync_manifest_path)
except OSError as e:
print('error: failed to remove existing smart sync override manifest: %s' %
e, file=sys.stderr)
rp = self.manifest.repoProject
rp.PreSync()
@ -591,7 +712,8 @@ later is required to fix a server side protocol bug.
if not opt.local_only:
mp.Sync_NetworkHalf(quiet=opt.quiet,
current_branch_only=opt.current_branch_only,
no_tags=opt.no_tags)
no_tags=opt.no_tags,
optimized_fetch=opt.optimized_fetch)
if mp.HasChanges:
syncbuf = SyncBuffer(mp.config)
@ -601,6 +723,42 @@ later is required to fix a server side protocol bug.
self._ReloadManifest(manifest_name)
if opt.jobs is None:
self.jobs = self.manifest.default.sync_j
if self.gitc_manifest:
gitc_manifest_projects = self.GetProjects(args,
missing_ok=True)
gitc_projects = []
opened_projects = []
for project in gitc_manifest_projects:
if project.relpath in self.gitc_manifest.paths and \
self.gitc_manifest.paths[project.relpath].old_revision:
opened_projects.append(project.relpath)
else:
gitc_projects.append(project.relpath)
if not args:
gitc_projects = None
if gitc_projects != [] and not opt.local_only:
print('Updating GITC client: %s' % self.gitc_manifest.gitc_client_name)
manifest = GitcManifest(self.repodir, self.gitc_manifest.gitc_client_name)
if manifest_name:
manifest.Override(manifest_name)
else:
manifest.Override(self.manifest.manifestFile)
gitc_utils.generate_gitc_manifest(self.gitc_manifest,
manifest,
gitc_projects)
print('GITC client successfully synced.')
# The opened projects need to be synced as normal, therefore we
# generate a new args list to represent the opened projects.
# TODO: make this more reliable -- if there's a project name/path overlap,
# this may choose the wrong project.
args = [os.path.relpath(self.manifest.paths[p].worktree, os.getcwd())
for p in opened_projects]
if not args:
return
all_projects = self.GetProjects(args,
missing_ok=True,
submodules_ok=opt.fetch_submodules)
@ -641,7 +799,7 @@ later is required to fix a server side protocol bug.
previously_missing_set = missing_set
fetched.update(self._Fetch(missing, opt))
if self.manifest.IsMirror:
if self.manifest.IsMirror or self.manifest.IsArchive:
# bail out now, we have no working tree
return
@ -654,7 +812,7 @@ later is required to fix a server side protocol bug.
for project in all_projects:
pm.update()
if project.worktree:
project.Sync_LocalHalf(syncbuf)
project.Sync_LocalHalf(syncbuf, force_sync=opt.force_sync)
pm.end()
print(file=sys.stderr)
if not syncbuf.Finish():
@ -666,10 +824,10 @@ later is required to fix a server side protocol bug.
print(self.manifest.notice)
def _PostRepoUpgrade(manifest, quiet=False):
wrapper = WrapperModule()
wrapper = Wrapper()
if wrapper.NeedSetupGnuPG():
wrapper.SetupGnuPG(quiet)
for project in manifest.projects.values():
for project in manifest.projects:
if project.Exists:
project.PostRepoUpgrade()
@ -742,7 +900,7 @@ class _FetchTimes(object):
_ALPHA = 0.5
def __init__(self, manifest):
self._path = os.path.join(manifest.repodir, '.repopickle_fetchtimes')
self._path = os.path.join(manifest.repodir, '.repo_fetchtimes.json')
self._times = None
self._seen = set()
@ -762,21 +920,16 @@ class _FetchTimes(object):
if self._times is None:
try:
f = open(self._path)
except IOError:
self._times = {}
return self._times
try:
try:
self._times = pickle.load(f)
except IOError:
try:
os.remove(self._path)
except OSError:
pass
self._times = {}
finally:
f.close()
return self._times
self._times = json.load(f)
finally:
f.close()
except (IOError, ValueError):
try:
os.remove(self._path)
except OSError:
pass
self._times = {}
def Save(self):
if self._times is None:
@ -790,13 +943,110 @@ class _FetchTimes(object):
del self._times[name]
try:
f = open(self._path, 'wb')
f = open(self._path, 'w')
try:
pickle.dump(self._times, f)
except (IOError, OSError, pickle.PickleError):
json.dump(self._times, f, indent=2)
finally:
f.close()
except (IOError, TypeError):
try:
os.remove(self._path)
except OSError:
pass
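With this change the fetch-time cache is plain JSON mapping project name to seconds, so a corrupt or leftover pickle file is simply discarded on load. A minimal sketch of reading and updating that format (the file name and the 0.5 weighting mirror the class above, but the helpers themselves are illustrative):

import json
import os

FETCH_TIMES = '.repo_fetchtimes.json'
ALPHA = 0.5  # exponential moving average weight, as in _FetchTimes._ALPHA

def load_times(path=FETCH_TIMES):
  try:
    with open(path) as f:
      return json.load(f)
  except (IOError, ValueError):
    return {}

def record_time(times, name, seconds):
  old = times.get(name)
  times[name] = seconds if old is None else ALPHA * seconds + (1 - ALPHA) * old

def save_times(times, path=FETCH_TIMES):
  try:
    with open(path, 'w') as f:
      json.dump(times, f, indent=2)
  except (IOError, TypeError):
    try:
      os.remove(path)
    except OSError:
      pass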
# This is a replacement for xmlrpc.client.Transport using urllib2
# and supporting persistent-http[s]. It cannot change hosts from
# request to request like the normal transport, the real url
# is passed during initialization.
class PersistentTransport(xmlrpc.client.Transport):
def __init__(self, orig_host):
self.orig_host = orig_host
def request(self, host, handler, request_body, verbose=False):
with GetUrlCookieFile(self.orig_host, not verbose) as (cookiefile, proxy):
# Python doesn't understand cookies with the #HttpOnly_ prefix
# Since we're only using them for HTTP, copy the file temporarily,
# stripping those prefixes away.
if cookiefile:
tmpcookiefile = tempfile.NamedTemporaryFile()
tmpcookiefile.write("# HTTP Cookie File")
try:
with open(cookiefile) as f:
for line in f:
if line.startswith("#HttpOnly_"):
line = line[len("#HttpOnly_"):]
tmpcookiefile.write(line)
tmpcookiefile.flush()
cookiejar = cookielib.MozillaCookieJar(tmpcookiefile.name)
try:
cookiejar.load()
except cookielib.LoadError:
cookiejar = cookielib.CookieJar()
finally:
tmpcookiefile.close()
else:
cookiejar = cookielib.CookieJar()
proxyhandler = urllib.request.ProxyHandler
if proxy:
proxyhandler = urllib.request.ProxyHandler({
"http": proxy,
"https": proxy })
opener = urllib.request.build_opener(
urllib.request.HTTPCookieProcessor(cookiejar),
proxyhandler)
url = urllib.parse.urljoin(self.orig_host, handler)
parse_results = urllib.parse.urlparse(url)
scheme = parse_results.scheme
if scheme == 'persistent-http':
scheme = 'http'
if scheme == 'persistent-https':
# If we're proxying through persistent-https, use http. The
# proxy itself will do the https.
if proxy:
scheme = 'http'
else:
scheme = 'https'
# Parse out any authentication information using the base class
host, extra_headers, _ = self.get_host_info(parse_results.netloc)
url = urllib.parse.urlunparse((
scheme,
host,
parse_results.path,
parse_results.params,
parse_results.query,
parse_results.fragment))
request = urllib.request.Request(url, request_body)
if extra_headers is not None:
for (name, header) in extra_headers:
request.add_header(name, header)
request.add_header('Content-Type', 'text/xml')
try:
response = opener.open(request)
except urllib.error.HTTPError as e:
if e.code == 501:
# We may have been redirected through a login process
# but our POST turned into a GET. Retry.
response = opener.open(request)
else:
raise
p, u = xmlrpc.client.getparser()
while 1:
data = response.read(1024)
if not data:
break
p.feed(data)
p.close()
return u.close()
def close(self):
pass
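Putting the pieces together: the smart-sync code constructs the transport from the original URL, strips the persistent- prefix before handing the URL to xmlrpc, and then calls the manifest server through it. Used on its own, that looks roughly like the following (the server URL is a placeholder and no request is actually issued here):

# Illustrative only; PersistentTransport is the class defined above.
manifest_server = 'persistent-https://example.com/manifestserver'

transport = PersistentTransport(manifest_server)
if manifest_server.startswith('persistent-'):
  manifest_server = manifest_server[len('persistent-'):]

server = xmlrpc.client.Server(manifest_server, transport=transport)
# e.g. success, manifest_str = server.GetManifest('some-tag')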

View File

@ -25,10 +25,12 @@ from git_command import GitCommand
from project import RepoHook
from pyversion import is_python3
# pylint:disable=W0622
if not is_python3():
# pylint:disable=W0622
input = raw_input
# pylint:enable=W0622
else:
unicode = str
# pylint:enable=W0622
UNUSUAL_COMMIT_THRESHOLD = 5
@ -89,6 +91,11 @@ to "true" then repo will assume you always answer "y" at the prompt,
and will not prompt you further. If it is set to "false" then repo
will assume you always answer "n", and will abort.
review.URL.autoreviewer:
To automatically append a user or mailing list to reviews, you can set
a per-project or global Git option to do so.
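Concretely, the option is a comma-separated list stored under a key derived from the branch's review URL, as the _AppendAutoList change below reads it. A hedged sketch of that lookup (config and remote_review stand in for the real branch.project.config object and review remote):

def auto_reviewers(config, remote_review):
  key = 'review.%s.autoreviewer' % remote_review  # review URL forms the key
  raw = config.get(key)                           # e.g. "alice@example.com,team@example.com"
  return [e.strip() for e in raw.split(',')] if raw else []

print(auto_reviewers(
    {'review.https://gerrit.example.com/.autoreviewer': 'alice@example.com'},
    'https://gerrit.example.com/'))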
review.URL.autocopy:
To automatically copy a user or mailing list to all uploaded reviews,
@ -293,14 +300,20 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
self._UploadAndReport(opt, todo, people)
def _AppendAutoCcList(self, branch, people):
def _AppendAutoList(self, branch, people):
"""
Appends the list of reviewers in the git project's config.
Appends the list of users in the CC list in the git project's config if a
non-empty reviewer list was found.
"""
name = branch.name
project = branch.project
key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if not raw_list is None:
people[0].extend([entry.strip() for entry in raw_list.split(',')])
key = 'review.%s.autocopy' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if not raw_list is None and len(people[0]) > 0:
@ -323,16 +336,20 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
for branch in todo:
try:
people = copy.deepcopy(original_people)
self._AppendAutoCcList(branch, people)
self._AppendAutoList(branch, people)
# Check if there are local changes that may have been forgotten
if branch.project.HasChanges():
changes = branch.project.UncommitedFiles()
if changes:
key = 'review.%s.autoupload' % branch.project.remote.review
answer = branch.project.config.GetBoolean(key)
# if they want to auto upload, let's not ask because it could be automated
if answer is None:
sys.stdout.write('Uncommitted changes in ' + branch.project.name + ' (did you forget to amend?). Continue uploading? (y/N) ')
sys.stdout.write('Uncommitted changes in ' + branch.project.name)
sys.stdout.write(' (did you forget to amend?):\n')
sys.stdout.write('\n'.join(changes) + '\n')
sys.stdout.write('Continue uploading? (y/N) ')
a = sys.stdin.readline().strip().lower()
if a not in ('y', 'yes', 't', 'true', 'on'):
print("skipping upload", file=sys.stderr)
@ -422,18 +439,35 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
for project in project_list:
if opt.current_branch:
cbr = project.CurrentBranch
avail = [project.GetUploadableBranch(cbr)] if cbr else None
up_branch = project.GetUploadableBranch(cbr)
if up_branch:
avail = [up_branch]
else:
avail = None
print('ERROR: Current branch (%s) not uploadable. '
'You may be able to type '
'"git branch --set-upstream-to m/master" to fix '
'your branch.' % str(cbr),
file=sys.stderr)
else:
avail = project.GetUploadableBranches(branch)
if avail:
pending.append((project, avail))
if pending and (not opt.bypass_hooks):
if not pending:
print("no branches ready for upload", file=sys.stderr)
return
if not opt.bypass_hooks:
hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
self.manifest.topdir, abort_if_user_denies=True)
self.manifest.topdir,
self.manifest.manifestProject.GetRemote('origin').url,
abort_if_user_denies=True)
pending_proj_names = [project.name for (project, avail) in pending]
pending_worktrees = [project.worktree for (project, avail) in pending]
try:
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names)
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names,
worktree_list=pending_worktrees)
except HookError as e:
print("ERROR: %s" % str(e), file=sys.stderr)
return
@ -444,9 +478,7 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
cc = _SplitEmails(opt.cc)
people = (reviewers, cc)
if not pending:
print("no branches ready for upload", file=sys.stderr)
elif len(pending) == 1 and len(pending[0][1]) == 1:
if len(pending) == 1 and len(pending[0][1]) == 1:
self._SingleBranch(opt, pending[0][1][0], people)
else:
self._MultipleBranches(opt, pending, people)

1
tests/fixtures/gitc_config vendored Normal file
View File

@ -0,0 +1 @@
gitc_dir=/test/usr/local/google/gitc

75
tests/test_wrapper.py Normal file
View File

@ -0,0 +1,75 @@
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import unittest
import wrapper
def fixture(*paths):
"""Return a path relative to tests/fixtures.
"""
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
class RepoWrapperUnitTest(unittest.TestCase):
"""Tests helper functions in the repo wrapper
"""
def setUp(self):
"""Load the wrapper module every time
"""
wrapper._wrapper_module = None
self.wrapper = wrapper.Wrapper()
def test_get_gitc_manifest_dir_no_gitc(self):
"""
Test reading a missing gitc config file
"""
self.wrapper.GITC_CONFIG_FILE = fixture('missing_gitc_config')
val = self.wrapper.get_gitc_manifest_dir()
self.assertEqual(val, '')
def test_get_gitc_manifest_dir(self):
"""
Test reading the gitc config file and parsing the directory
"""
self.wrapper.GITC_CONFIG_FILE = fixture('gitc_config')
val = self.wrapper.get_gitc_manifest_dir()
self.assertEqual(val, '/test/usr/local/google/gitc')
def test_gitc_parse_clientdir_no_gitc(self):
"""
Test parsing the gitc clientdir without gitc running
"""
self.wrapper.GITC_CONFIG_FILE = fixture('missing_gitc_config')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/something'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test'), 'test')
def test_gitc_parse_clientdir(self):
"""
Test parsing the gitc clientdir
"""
self.wrapper.GITC_CONFIG_FILE = fixture('gitc_config')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/something'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/extra'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/'), None)
if __name__ == '__main__':
unittest.main()

30
wrapper.py Normal file
View File

@ -0,0 +1,30 @@
#!/usr/bin/env python
#
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import imp
import os
def WrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
def Wrapper():
global _wrapper_module
if not _wrapper_module:
_wrapper_module = imp.load_source('wrapper', WrapperPath())
return _wrapper_module
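Downstream code (and the new unit tests above) loads the bundled repo launcher through this shim rather than importing it directly. A short usage sketch, assuming it runs from a repo source checkout so the 'repo' script sits next to wrapper.py:

import wrapper

w = wrapper.Wrapper()   # lazily imp.load_source()s the adjacent 'repo' script
# The tests above exercise these helpers from the loaded launcher:
print(w.get_gitc_manifest_dir())
print(w.gitc_parse_clientdir('/gitc/manifest-rw/test'))  # -> 'test'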