Compare commits

...

34 Commits

Author SHA1 Message Date
f322b9abb4 sync: Support downloading bundle to initialize repository
An HTTP (or HTTPS) based remote server may now offer a 'clone.bundle'
file in each repository's Git directory. Over an http:// or https://
remote, repo will first ask for '$URL/clone.bundle' and, if present,
download this to bootstrap the local client rather than relying
on the native Git transport to initialize the new repository.

Bundles may be hosted elsewhere. The client automatically follows an
HTTP 302 redirect to acquire the bundle file. This allows servers
to direct clients to cached copies residing on content delivery
networks, where the bundle may be closer to the end-user.

Bundle downloads are resumable from where they last left off,
allowing clients to initialize large repositories even when the
connection gets interrupted.

If a bundle does not exist for a repository (an HTTP 404 response
code is returned for '$URL/clone.bundle'), the native Git transport
is used instead. If the client is performing a shallow sync, the
bundle transport is not used, as there is no way to embed shallow
data into the bundle.

Change-Id: I05dad17792fd6fd20635a0f71589566e557cc743
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-28 10:07:36 -07:00
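A minimal sketch of the probe-and-resume behaviour described in the commit
above, written here in Python 3 with urllib.request for clarity (the actual
implementation targets Python 2's urllib2 inside project.py); names and the
error handling are illustrative, and the sketch assumes the server honors the
Range header with a 206 partial response.

  import os
  import urllib.error
  import urllib.request

  def fetch_clone_bundle(repo_url, dest_path):
      """Try to bootstrap from $URL/clone.bundle; return True on success."""
      bundle_url = repo_url.rstrip('/') + '/clone.bundle'
      tmp_path = dest_path + '.tmp'
      pos = os.path.getsize(tmp_path) if os.path.exists(tmp_path) else 0
      req = urllib.request.Request(bundle_url)
      if pos:
          # Resume a previously interrupted download.
          req.add_header('Range', 'bytes=%d-' % pos)
      try:
          # urlopen follows HTTP 302 redirects automatically, so the bundle
          # may come from a CDN copy rather than the origin server.
          resp = urllib.request.urlopen(req)
      except urllib.error.HTTPError as e:
          if e.code == 404:
              return False  # no bundle; fall back to the native Git transport
          raise
      with open(tmp_path, 'ab') as out:
          while True:
              chunk = resp.read(8192)
              if not chunk:
                  break
              out.write(chunk)
      os.rename(tmp_path, dest_path)
      return True
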
db728cd866 Allow remote url to be relative to manifest url 2011-09-28 10:07:01 -07:00
c4657969eb sync: Update default -j flag from manifest
If the manifest is updated and the default sync-j attribute
was modified, honor it during this sync session if the user
has not supplied a -j flag on the command line.

Change-Id: I127ee5c779e2bbbb40b30bddc10ec1fa704b3bf3
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-26 09:08:44 -07:00
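A rough illustration of the precedence this commit (together with 6392c87945
below) establishes: a -j given on the command line wins, otherwise the
manifest's default sync-j is honored. The function and parameter names here
are illustrative, not the actual repo internals.

  def resolve_jobs(cli_jobs, manifest_sync_j):
      """Pick the number of parallel fetch workers for this sync session."""
      if cli_jobs is not None:     # user passed -j on the command line
          return cli_jobs
      if manifest_sync_j:          # <default sync-j="N"/> in the manifest
          return manifest_sync_j
      return 1                     # historical default: serial fetch
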
7b947de1ee Ignore missing ~/.netrc
Change-Id: Ifa6065d57a6cb11ad57ddd44bc88d9690fe234ab
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-23 11:50:31 -07:00
6392c87945 sync: Allow -j to have a default in manifest
This permits manifest authors to suggest a number of parallel
fetch operations against a remote server. For example, Gerrit
Code Review servers support queuing of requests and process
them in first-in, first-out order. Running concurrent fetches
can utilize multiple CPUs on the Gerrit server and also reduces
overall operation latency, since each request is already sitting
in the queue, ready to execute as soon as a CPU is free.

Change-Id: I3d3904acb6f63516bae4b071c510ad57a2afab18
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-22 18:08:27 -07:00
97d2b2f7a0 sync: Limit -j to file descriptors
Each worker thread requires at least 3 file descriptors to run the
forked 'git fetch' child that operates against the local repository.
Mac OS X has the RLIMIT_NOFILE set to 256 by default, which means
a sync -j128 often fails when the workers run out of pipes within
the Python parent process.

Change-Id: I2cdb14621b899424b079daf7969bc8c16b85b903
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-22 18:08:26 -07:00
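A sketch of the kind of cap this implies, assuming roughly three file
descriptors per forked 'git fetch' worker plus some reserve for the Python
parent process; the helper name and the reserve value are illustrative, and
the resource module is Unix-only.

  import resource

  def cap_jobs_to_fd_limit(requested_jobs, fds_per_worker=3, reserved=32):
      """Clamp -jN so the worker pipes fit under RLIMIT_NOFILE."""
      soft, _hard = resource.getrlimit(resource.RLIMIT_NOFILE)
      max_workers = max(1, (soft - reserved) // fds_per_worker)
      return min(requested_jobs, max_workers)

  # With Mac OS X's default soft limit of 256, -j128 would be clamped to
  # roughly (256 - 32) // 3 = 74 workers instead of exhausting the pipes.
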
3a0e782790 Add global option --time to track execution
This prints a simple line after a command ends, providing
information about how long it executed for using real wall
clock time. It's mostly useful for looking at sync times.

Change-Id: Ie0997df0a0f90150270835d94b58a01a10bc3956
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-22 18:08:18 -07:00
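A tiny sketch of the timing line this adds, using the same hours/minutes/
seconds split as the main.py hunk further down; the wrapper function is
illustrative.

  import sys
  import time

  def timed(fn, *args):
      start = time.time()
      try:
          return fn(*args)
      finally:
          elapsed = time.time() - start
          hours, rem = divmod(elapsed, 3600)
          minutes, seconds = divmod(rem, 60)
          if hours:
              print('real\t%dh%dm%.3fs' % (hours, minutes, seconds),
                    file=sys.stderr)
          else:
              print('real\t%dm%.3fs' % (minutes, seconds), file=sys.stderr)
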
490d09a314 Support units in progress messages
This allows our progress meter to be used for bytes transferred, by
setting the units to KB or MB to let the user know the size.

Change-Id: Ie8653d4a40d79439026c18bd51915845b2c5bba9
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-19 14:52:57 -07:00
13111b4e97 Add support for url.*.insteadof
Teach repo how to resolve URLs using the url.insteadof feature
that C Git natively uses during clone, fetch or push. This will
later allow repo to resolve a URL before accessing it directly.
We do not want to pre-resolve things and store the resolved URL
into individual projects, as this makes it impossible for the
user to undo the insteadof mapping at a later date.

Change-Id: I0f62e811197c53fbc8a8be424e3cabf4ed07b4cb
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-19 14:52:57 -07:00
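A minimal sketch of the url.*.insteadof rewrite this adds; the real code is
GitConfig.UrlInsteadOf, visible in one of the diffs further down. The
dict-based configuration and the example URLs here are just stand-ins for the
parsed git configuration.

  def url_insteadof(url, insteadof_map):
      """Rewrite 'url' using git's url.<new>.insteadOf semantics.

      insteadof_map maps the replacement URL to the prefix it stands in for,
      e.g. {'https://mirror.example.com/': 'git://git.example.com/'}.
      """
      for new_prefix, old_prefix in insteadof_map.items():
          if url.startswith(old_prefix):
              return new_prefix + url[len(old_prefix):]
      return url

  # url_insteadof('git://git.example.com/tools/repo.git',
  #               {'https://mirror.example.com/': 'git://git.example.com/'})
  # -> 'https://mirror.example.com/tools/repo.git'
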
bd0312a484 Support ~/.netrc for HTTP Basic authentication
If repo tries to access a URL over HTTP and the user needs to
authenticate, offer a match from ~/.netrc. This matches the
behavior of the Git command line client.

Change-Id: I803f3c5d562177ea0330941350cff3cc1e1bef08
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-19 14:52:32 -07:00
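A Python 3 sketch of wiring ~/.netrc credentials into urllib's Basic-auth
handling, roughly mirroring what init_http() in the main.py hunk below does
with Python 2's urllib2; this is an approximation, not the actual code.

  import netrc
  import urllib.request

  def build_opener_with_netrc():
      mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
      try:
          n = netrc.netrc()  # reads ~/.netrc
          for host, (login, _account, password) in n.hosts.items():
              mgr.add_password(None, 'http://%s/' % host, login, password)
              mgr.add_password(None, 'https://%s/' % host, login, password)
      except (OSError, netrc.NetrcParseError):
          pass  # a missing or malformed ~/.netrc is not fatal
      auth = urllib.request.HTTPBasicAuthHandler(mgr)
      return urllib.request.build_opener(auth)
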
334851e4b6 Enhance HTTP support
Setting REPO_CURL_VERBOSE=1 in the environment will register a debug
level HTTPHandler on the urllib2 library, showing HTTP requests and
responses on the stderr channel of repo.

During any HTTP or HTTPS request created inside of the repo process,
a custom User-Agent header is now defined:

  User-Agent: git-repo/1.7.5 (Linux) git/1.7.7 Python/2.6.5

Change-Id: Ia5026fb1e1500659bd2af27416d85e205048bf26
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-19 14:51:47 -07:00
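A sketch of assembling a User-Agent string in the format shown above. The
default argument values are taken from the example header in the commit
message; the real code derives them from 'git describe', git.version_tuple()
and sys.version_info.

  import sys

  def user_agent(repo_version='1.7.5', os_name='Linux', git_version=(1, 7, 7)):
      py = sys.version_info
      return 'git-repo/%s (%s) git/%s Python/%d.%d.%d' % (
          repo_version,
          os_name,
          '.'.join(str(d) for d in git_version),
          py[0], py[1], py[2])

  # user_agent() -> e.g. 'git-repo/1.7.5 (Linux) git/1.7.7 Python/3.11.4'
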
014d060989 Honor http_proxy variable globally
If the http_proxy environment variable is set, honor it during
the entire repo session for any Python-created HTTP connections.

Change-Id: Ib4ae833cb2cdd47ab0126949f6b399d2c142887d
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-09-11 13:11:04 -07:00
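A small Python 3 sketch of honoring http_proxy for the whole process via a
urllib ProxyHandler, analogous to what the Python 2 urllib2 code in main.py
does; the helper name is illustrative.

  import os
  import urllib.request

  def install_proxy_from_env():
      proxy = os.environ.get('http_proxy')
      if not proxy:
          return
      handler = urllib.request.ProxyHandler({'http': proxy, 'https': proxy})
      # install_opener makes every later urlopen() call go through the proxy.
      urllib.request.install_opener(urllib.request.build_opener(handler))
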
44da16e8a0 Change default REPO_URL to code.google.com
Change-Id: If7700daf96fb8f3ee449e5774017272ef31b4b44
2011-09-05 14:16:49 -07:00
65e0f35fda Add commit-msg hook also for manifest project
The manifest project has, by design, no review URL associated
with it. It is actually not even a 'project' in repo's sense.

This prevents the commit-msg hook from being added, which is
not necessarily wanted, as the project is managed in Gerrit.

This commit will enable the commit-msg hook, which in turn will
add the Change-Id line to every new commit in it. This simplifies
replacing patch sets (by git push ... refs/for/...).

Change-Id: I42d0f6fd79e6282d9d47074a3819e68d968999a7
Signed-off-by: Victor Boivie <victor.boivie@sonyericsson.com>
2011-07-20 07:34:23 -07:00
08c880db18 Smart tag support
This is an evolution of 'smart-sync' that adds a new option, -t,
that allows you to specify a tag/label to use instead of the
"latest good build" on the current manifest branch that -s uses.

Signed-off-by: Victor Boivie <victor.boivie@sonyericsson.com>
Change-Id: I8c20fd91104a6aafa0271d4d33f6c4850aade17e
2011-07-20 07:13:48 -07:00
a101f1c167 Honor 'http_proxy' environment variable
'repo upload' makes HTTP requests using the urllib2 Python library.
Unfortunately this library does not work (by default) if the user
is behind a proxy.

This change adds a proxy handler when the 'http_proxy' environment
variable is set.

Change-Id: Ic4176ad733fc21bd5b59661b3eacc2f0a7c3c1ff
2011-07-20 07:11:00 -07:00
49cd59bc86 Add --depth option to main repo wrapper.
See related repo change:
  https://review.source.android.com/#change,22722

Change-Id: I9bdd86971c94604477b91cdf47d6fac2c0bc186e
2011-06-14 16:59:19 -07:00
30d452905f Add a --depth option to repo init.
Change-Id: Id30fb4a85f4f8a1847420b0b51a86060041eb5bf
2011-06-09 16:48:23 -07:00
d6c93a28ca Add branch support to repo upload
This commit adds a --br=<branch> option to repo upload.

repo currently examines every non-published branch. This is problematic
for my workflow. I have many branches in my kernel tree. Many of these
branches are based off of upstream remotes (I have many remotes) and
will never be uploaded (they'll get sent upstream as a patch).

Having repo scan these branches adds to my upload processing time
and clutters the branch selection buffer. I've also seen repo get
confused when one of my branches is 1000s of commits different from
m/master.

Change-Id: I68fa18951ea59ba373277b57ffcaf8cddd7e7a40
2011-05-26 10:49:39 -07:00
d572a13021 Added repo cherry-pick command
Having the same Change-Id: line on two separate commits is
undesirable, so when cherry-picking, the user must manually change it.

If this is not done, bad things may happen (for example, when the
user uploads the cherry-picked commit to Gerrit, it will be treated
as a new patch set for the original change, or worse).

repo cherry-pick works the same way as git cherry-pick, except that
it replaces the Change-Id with a new one and adds a reference
back to the commit from where it was picked.

On failure (when git cannot successfully apply the cherry-picked
commit), instructions are printed for the user.

Change-Id: I5a38b89839f91848fad43386d43cae2f6cdabf83
2011-04-07 17:19:06 -04:00
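A rough sketch of the message rewrite this describes: strip the old Change-Id,
generate a fresh one, and record where the commit was picked from. The
Change-Id here is derived from random data purely for illustration; repo's
actual implementation lives in subcmds/cherry_pick.py (see the diff below).

  import hashlib
  import os
  import re

  CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I[0-9a-f]{40}\s*$', re.MULTILINE)

  def rewrite_message(old_msg, picked_from_sha1):
      """Return old_msg with a new Change-Id and a cherry-pick reference."""
      new_id = 'I' + hashlib.sha1(os.urandom(20)).hexdigest()
      body = CHANGE_ID_RE.sub('', old_msg).rstrip()
      return '%s\n\n(cherry picked from commit %s)\n\nChange-Id: %s\n' % (
          body, picked_from_sha1, new_id)
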
3ba5f95b46 Fixed repo checkout error message when git reports errors.
In the current version of repo checkout, we often get the error:
  error: no project has branch xyzzy

...even when the actual error was something else.  This fixes it
to report the 'no project has branch' error only when that is
actually true.

This fix is very similar to one made for 'repo abandon':
  https://review.source.android.com/#change,22207

The repo checkout error is filed as: <http://crosbug.com/6514>

TEST=manual

A sample creating a case where 'git checkout' will fail:

  $ repo start branch1 .
  $ repo start branch2 .
  $ touch bogusfile
  $ git add bogusfile
  $ git commit -m "create bogus file"
  [branch2 f8b6b08] create bogus file
   0 files changed, 0 insertions(+), 0 deletions(-)
   create mode 100644 bogusfile
  $ echo "More" >> bogusfile
  $ repo checkout branch1 .
  error: chromite/: cannot checkout branch1

A sample case showing that we still fail if no project has a branch:

  $ repo checkout xyzzy .
  error: no project has branch xyzzy

Change-Id: I48a8e258fa7a9c1f2800dafc683787204bbfcc63
2011-04-07 16:55:35 -04:00
2630dd9787 Fixed problems w/ 2nd repo init if first repo init had bad URL.
This is the simplest fix: if we had problems syncing the
manifest.git directory and we were the ones that created it,
we should delete it.  This doesn't try to do anything complex,
like trying to recover from a .repo directory that got broken in
some other way.

This is filed as: <http://crosbug.com/13403>

TEST=manual

Init once with a bad URL:
  $ repo init -u http://foobar.example.com
  Getting manifest ...
     from http://foobar.example.com
  Connection closed by 172.22.121.77
  error: Couldn't resolve host 'foobar.example.com' while accessing http://foobar.example.com/info/refs

  fatal: HTTP request failed
  fatal: cannot obtain manifest http://foobar.example.com

Init again: identical to the first.  Good:
  $ repo init -u http://foobar.example.com
  Getting manifest ...
     from http://foobar.example.com
  Connection closed by 172.22.121.77
  error: Couldn't resolve host 'foobar.example.com' while accessing http://foobar.example.com/info/refs

  fatal: HTTP request failed
  fatal: cannot obtain manifest http://foobar.example.com

Init with correct URL:
  $ repo init -u http://git.chromium.org/git/manifest -m minilayout.xml
  Getting manifest ...
     from http://git.chromium.org/git/manifest
  [ ... cut ... ]

  repo initialized in /.../repoiniterr

Try a bad URL after a good one; it doesn't get saved (good):
  $ repo init -u http://foobar.example.com
  Connection closed by 172.22.121.77
  error: Couldn't resolve host 'foobar.example.com' while accessing http://foobar.example.com/info/refs

  fatal: HTTP request failed
  fatal: cannot obtain manifest http://foobar.example.com

Just to confirm, I can still do a good one after a bad...
  $ repo init -u http://git.chromium.org/git/manifest -m minilayout.xml

  Your Name  [George Washington]:
  Your Email [george@washington.example.com]:

  Your identity is: George Washington <george@washington.example.com>
  is this correct [y/n]? y

  repo initialized in /.../repoiniterr

Change-Id: I1692821a330d97b1d218b2e191a93245b33f2362
2011-04-07 16:51:50 -04:00
dafb1d68d3 Fixed repo abandon to give better messages.
The main fix is to give an error message if nothing was actually
abandoned.  See <http://crosbug.com/6041>.

The secondary fix is to list projects where the abandon happened.
This could be done in a separate CL or dropped altogether if requested.

TEST=manual

$ repo abandon dougabc; echo $?
Abandon dougabc: 100% (127/127), done.
Abandoned in 2 project(s):
  chromite
  src/platform/init
0

$ repo abandon dougabc; echo $?
Abandon dougabc: 100% (127/127), done.
error: no project has branch dougabc
1

$ repo abandon dougabc; echo $?
Abandon dougabc: 100% (127/127), done.
error: chromite/: cannot abandon dougabc
1

Change-Id: I79520cc3279291acadc1a24ca34a761e9de04ed4
2011-04-07 16:49:23 -04:00
4655e81a75 Add option to check status of projects in parallel.
Change-Id: I6ac653f88573def8bb3d96031d3570ff966251ad
2011-04-07 16:36:42 -04:00
723c5dc3d6 Fix parallel sync on python < 2.6.
Event.isSet was renamed to is_set in 2.6, but we should
use the earlier syntax to avoid breaking compatibility
with older Python installations.

Change-Id: I41888ed38df278191d7496c1a6eed15e881733f4
2011-04-04 11:34:47 -04:00
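The compatibility point in a nutshell: threading.Event gained is_set() in
Python 2.6 as the preferred spelling, while isSet() also works on older
interpreters, so the older name is the safe choice here.

  import threading

  err_event = threading.Event()
  err_event.set()
  # Works on Python 2.4/2.5 as well as later versions:
  assert err_event.isSet()
  # Only available from Python 2.6 onwards:
  # assert err_event.is_set()
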
e6a0eeb80d sync: Fix syntax error on Python 2.4
Change-Id: I371d032d5a1ddde137721cbe2b24bfa38f20aaaa
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-03-22 19:04:47 -07:00
0960b5b53d Creating rr-cache
If git-rerere is enabled, it uses the rr-cache directory that
repo currently creates a symlink for; however, repo doesn't create
the symlink's destination directory (inside the project's
directory). Git will then complain during merges and rebases.

This commit creates the rr-cache directory inside the project.

Change-Id: If8b57a04f022fc6ed6a7007d05aa2e876e6611ee
2011-03-17 09:19:51 -07:00
fc06ced9f9 Make 'repo sync -jN' exit with an error code in the case of sync errors.
The bug that this is fixing is described here:

http://code.google.com/p/chromium-os/issues/detail?id=6813

This fix allows the helper threads to signal the main thread that they
saw an error.  When the main thread sees the error, it will let all
existing threads finish, then exit with an error.

Change-Id: If3019bc6b0b3ab9304d49ed2eea53e9d57f3095a
2011-03-17 09:17:42 -07:00
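A condensed sketch of the signalling pattern described above: each worker sets
a shared threading.Event on failure, the main thread joins every worker and
then exits non-zero. Function and variable names are illustrative, and
sync_project() is only a stand-in for the real per-project fetch.

  import sys
  import threading

  err_event = threading.Event()

  def sync_project(project):
      # Stand-in for the real per-project network fetch.
      if project.endswith('broken'):
          raise RuntimeError('fetch failed')

  def worker(project):
      try:
          sync_project(project)
      except Exception:
          err_event.set()              # tell the main thread we failed

  def sync_all(projects):
      threads = [threading.Thread(target=worker, args=(p,)) for p in projects]
      for t in threads:
          t.start()
      for t in threads:
          t.join()                     # let every in-flight worker finish
      if err_event.is_set():
          print('error: exited sync due to fetch errors', file=sys.stderr)
          sys.exit(1)
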
fce89f218a Add 'list' command to repo.
This isn't a required command, but might be more discoverable for
repo newbies?

Change-Id: If357346f234774d42e04e024e65acdaf6dca6c62
2011-03-16 12:55:44 -07:00
37282b4b9c Support repo-level pre-upload hook and prep for future hooks.
All repo-level hooks are expected to live in a single project at the
top level of that project.  The name of the hooks project is provided
in the manifest.xml.  The manifest also lists which hooks are enabled
to make it obvious if a file somehow failed to sync down (or got
deleted).

Before running any hook, we will prompt the user to make sure that it
is OK.  A user can deny running the hook, allow once, or allow
"forever" (until hooks change).  This tries to keep with the git
spirit of not automatically running anything on the user's computer
that got synced down.  Note that individual repo commands can add
always options to avoid these prompts as they see fit (see below for
the 'upload' options).

When hooks are run, they are loaded into the current interpreter (the
one running repo) and their main() function is run.  This mechanism is
used (instead of using subprocess) to make it easier to expand to a
richer hook interface in the future.  During loading, the
interpreter's sys.path is updated to contain the directory containing
the hooks so that hooks can be split into multiple files.

The upload command has two options that control hook behavior:
  - no-verify=False, verify=False (DEFAULT):
    If stdout is a tty, can prompt about running upload hooks if needed.
    If user denies running hooks, the upload is cancelled.  If stdout is
    not a tty and we would need to prompt about upload hooks, upload is
    cancelled.
  - no-verify=False, verify=True:
    Always run upload hooks with no prompt.
  - no-verify=True, verify=False:
    Never run upload hooks, but upload anyway (AKA bypass hooks).
  - no-verify=True, verify=True:
    Invalid

Sample bit of manifest.xml code for enabling hooks (assumes you have a
project named 'hooks' where hooks are stored):
  <repo-hooks in-project="hooks" enabled-list="pre-upload" />

Sample main() function in pre-upload.py in hooks directory:
  def main(project_list, **kwargs):
    print ('These projects will be uploaded: %s' %
           ', '.join(project_list))
    print ('I am being a good boy and ignoring anything in kwargs\n'
           'that I don\'t understand.')
    print 'I fail 50% of the time.  How flaky.'
    if random.random() <= .5:
      raise Exception('Pre-upload hook failed.  Have a nice day.')

Change-Id: I5cefa2cd5865c72589263cf8e2f152a43c122f70
2011-03-11 11:53:23 -08:00
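A simplified sketch of the load-and-call mechanism described above (the real
RepoHook class appears in the project.py hunk further down): the hook file is
executed in the current interpreter and its main() is called with keyword
arguments. The path handling is simplified, and Python 3's exec() stands in
for Python 2's execfile().

  import os
  import sys

  def run_hook(script_path, **kwargs):
      """Execute a repo hook script and call its main(**kwargs)."""
      context = {}
      # Let the hook do relative imports from its own directory.
      sys.path.insert(0, os.path.dirname(script_path))
      with open(script_path) as f:
          exec(compile(f.read(), script_path, 'exec'), context)
      if 'main' not in context:
          raise RuntimeError('Missing main() in: "%s"' % script_path)
      # Passing **kwargs lets the hook API grow without breaking old hooks.
      context['main'](**kwargs)
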
835cd6888f Post-nonexistent-revision crash sidestepped
Fix for the bug that leaves a partial .git directory after attempting to
perform an initial sync to a nonexistent revision. Moved the initialization of
the working directory to after the revision ID has already been checked. Now,
no project/.git directory gets created at all if the revision ID is bad.

Change-Id: I0c9b2a59573410f1d11de7661591bf02e4ce326b
2011-03-08 13:48:24 -08:00
8ced8641c8 Renamed 'repo_hooks' function to '_ProjectHooks'.
This renaming was done for two reasons:
1. The hooks are actually project-level hooks, not repo-level
   hooks.  Since we are talking about adding repo-level hooks,
   it keeps things less confusing if we rename the existing hooks
   to "ProjectHooks".
2. The function is a private function in project.py and so
   should have capitalization to match.

I also added a docstring describing this function.

Change-Id: I1d30f5de08e8f9f99f78146e68c76f906782d97e
2011-02-01 09:57:29 -08:00
2536f80625 Fixed bug identifying 'commit-msg' files.
There was a minor typo that would cause repo to (I believe)
mistakenly identify any file whose name was a substring of the
word 'commit-msg' as a commit message hook.  For example, the file
'mit' or the file 'msg' would be treated as a commit message hook.
I believe that it was intended that repo only recognize files
named exactly 'commit-msg'.

Change-Id: I93edbddf3da3cf0935641e6efb19b0a8ee6e2308
2011-02-01 09:53:56 -08:00
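The typo in question is the classic string-membership pitfall: ('commit-msg')
is just a parenthesised string, so 'in' performs a substring test, whereas
('commit-msg',) is a one-element tuple (Python 3 print syntax used here).

  name = 'msg'
  print(name in ('commit-msg'))    # True  -- substring test against a string
  print(name in ('commit-msg',))   # False -- membership test against a tuple
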
0ce6ca9c7b Fix mirror clients with no worktree
Commit "Make path references OS independent" (df14a70c45)
broke mirror clients by trying to invoke replace() on None
when there is no worktree.

Change-Id: Ie0a187058358f7dcdf83119e45cc65409c980f11
2011-01-10 13:26:34 -08:00
17 changed files with 1272 additions and 161 deletions

View File

@ -25,7 +25,8 @@ following DTD:
default?,
manifest-server?,
remove-project*,
project*)>
project*,
repo-hooks?)>
<!ELEMENT notice (#PCDATA)>
@ -37,6 +38,7 @@ following DTD:
<!ELEMENT default (EMPTY)>
<!ATTLIST default remote IDREF #IMPLIED>
<!ATTLIST default revision CDATA #IMPLIED>
<!ATTLIST default sync-j CDATA #IMPLIED>
<!ELEMENT manifest-server (EMPTY)>
<!ATTLIST url CDATA #REQUIRED>
@ -49,6 +51,10 @@ following DTD:
<!ELEMENT remove-project (EMPTY)>
<!ATTLIST remove-project name CDATA #REQUIRED>
<!ELEMENT repo-hooks (EMPTY)>
<!ATTLIST repo-hooks in-project CDATA #REQUIRED>
<!ATTLIST repo-hooks enabled-list CDATA #REQUIRED>
]>
A description of the elements and their attributes follows.

View File

@ -57,6 +57,15 @@ class UploadError(Exception):
def __str__(self):
return self.reason
class DownloadError(Exception):
"""Cannot download a repository.
"""
def __init__(self, reason):
self.reason = reason
def __str__(self):
return self.reason
class NoSuchProjectError(Exception):
"""A specified project does not exist in the work tree.
"""
@ -75,3 +84,10 @@ class RepoChangedException(Exception):
"""
def __init__(self, extra_args=[]):
self.extra_args = extra_args
class HookError(Exception):
"""Thrown if a 'repo-hook' could not be run.
The common case is that the file wasn't present when we tried to run it.
"""
pass

View File

@ -72,6 +72,8 @@ def terminate_ssh_clients():
pass
_ssh_clients = []
_git_version = None
class _GitCall(object):
def version(self):
p = GitCommand(None, ['--version'], capture_stdout=True)
@ -79,6 +81,21 @@ class _GitCall(object):
return p.stdout
return None
def version_tuple(self):
global _git_version
if _git_version is None:
ver_str = git.version()
if ver_str.startswith('git version '):
_git_version = tuple(
map(lambda x: int(x),
ver_str[len('git version '):].strip().split('.')[0:3]
))
else:
print >>sys.stderr, 'fatal: "%s" unsupported' % ver_str
sys.exit(1)
return _git_version
def __getattr__(self, name):
name = name.replace('_','-')
def fun(*cmdv):
@ -88,23 +105,9 @@ class _GitCall(object):
return fun
git = _GitCall()
_git_version = None
def git_require(min_version, fail=False):
global _git_version
if _git_version is None:
ver_str = git.version()
if ver_str.startswith('git version '):
_git_version = tuple(
map(lambda x: int(x),
ver_str[len('git version '):].strip().split('.')[0:3]
))
else:
print >>sys.stderr, 'fatal: "%s" unsupported' % ver_str
sys.exit(1)
if min_version <= _git_version:
git_version = git.version_tuple()
if min_version <= git_version:
return True
if fail:
need = '.'.join(map(lambda x: str(x), min_version))

View File

@ -198,6 +198,15 @@ class GitConfig(object):
except KeyError:
return False
def UrlInsteadOf(self, url):
"""Resolve any url.*.insteadof references.
"""
for new_url in self.GetSubSections('url'):
old_url = self.GetString('url.%s.insteadof' % new_url)
if old_url is not None and url.startswith(old_url):
return new_url + url[len(old_url):]
return url
@property
def _sections(self):
d = self._section_dict
@ -482,6 +491,12 @@ def close_ssh():
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+]*)://([^@/]*@?[^/]*)/')
def GetSchemeFromUrl(url):
m = URI_ALL.match(url)
if m:
return m.group(1)
return None
def _preconnect(url):
m = URI_ALL.match(url)
if m:

101
main.py
View File

@ -22,17 +22,22 @@ if __name__ == '__main__':
del sys.argv[-1]
del magic
import netrc
import optparse
import os
import re
import sys
import time
import urllib2
from trace import SetTrace
from git_command import git, GitCommand
from git_config import init_ssh, close_ssh
from command import InteractiveCommand
from command import MirrorSafeCommand
from command import PagedCommand
from editor import Editor
from error import DownloadError
from error import ManifestInvalidRevisionError
from error import NoSuchProjectError
from error import RepoChangedException
@ -53,6 +58,9 @@ global_options.add_option('--no-pager',
global_options.add_option('--trace',
dest='trace', action='store_true',
help='trace git command execution')
global_options.add_option('--time',
dest='time', action='store_true',
help='time repo command execution')
global_options.add_option('--version',
dest='show_version', action='store_true',
help='display this version of repo')
@ -122,7 +130,23 @@ class _Repo(object):
RunPager(config)
try:
cmd.Execute(copts, cargs)
start = time.time()
try:
cmd.Execute(copts, cargs)
finally:
elapsed = time.time() - start
hours, remainder = divmod(elapsed, 3600)
minutes, seconds = divmod(remainder, 60)
if gopts.time:
if hours == 0:
print >>sys.stderr, 'real\t%dm%.3fs' \
% (minutes, seconds)
else:
print >>sys.stderr, 'real\t%dh%dm%.3fs' \
% (hours, minutes, seconds)
except DownloadError, e:
print >>sys.stderr, 'error: %s' % str(e)
sys.exit(1)
except ManifestInvalidRevisionError, e:
print >>sys.stderr, 'error: %s' % str(e)
sys.exit(1)
@ -133,6 +157,9 @@ class _Repo(object):
print >>sys.stderr, 'error: no project in current directory'
sys.exit(1)
def _MyRepoPath():
return os.path.dirname(__file__)
def _MyWrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
@ -199,6 +226,77 @@ def _PruneOptions(argv, opt):
continue
i += 1
_user_agent = None
def _UserAgent():
global _user_agent
if _user_agent is None:
py_version = sys.version_info
os_name = sys.platform
if os_name == 'linux2':
os_name = 'Linux'
elif os_name == 'win32':
os_name = 'Win32'
elif os_name == 'cygwin':
os_name = 'Cygwin'
elif os_name == 'darwin':
os_name = 'Darwin'
p = GitCommand(
None, ['describe', 'HEAD'],
cwd = _MyRepoPath(),
capture_stdout = True)
if p.Wait() == 0:
repo_version = p.stdout
if len(repo_version) > 0 and repo_version[-1] == '\n':
repo_version = repo_version[0:-1]
if len(repo_version) > 0 and repo_version[0] == 'v':
repo_version = repo_version[1:]
else:
repo_version = 'unknown'
_user_agent = 'git-repo/%s (%s) git/%s Python/%d.%d.%d' % (
repo_version,
os_name,
'.'.join(map(lambda d: str(d), git.version_tuple())),
py_version[0], py_version[1], py_version[2])
return _user_agent
class _UserAgentHandler(urllib2.BaseHandler):
def http_request(self, req):
req.add_header('User-Agent', _UserAgent())
return req
def https_request(self, req):
req.add_header('User-Agent', _UserAgent())
return req
def init_http():
handlers = [_UserAgentHandler()]
mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
try:
n = netrc.netrc()
for host in n.hosts:
p = n.hosts[host]
mgr.add_password(None, 'http://%s/' % host, p[0], p[2])
mgr.add_password(None, 'https://%s/' % host, p[0], p[2])
except netrc.NetrcParseError:
pass
except IOError:
pass
handlers.append(urllib2.HTTPBasicAuthHandler(mgr))
if 'http_proxy' in os.environ:
url = os.environ['http_proxy']
handlers.append(urllib2.ProxyHandler({'http': url, 'https': url}))
if 'REPO_CURL_VERBOSE' in os.environ:
handlers.append(urllib2.HTTPHandler(debuglevel=1))
handlers.append(urllib2.HTTPSHandler(debuglevel=1))
urllib2.install_opener(urllib2.build_opener(*handlers))
def _Main(argv):
opt = optparse.OptionParser(usage="repo wrapperinfo -- ...")
opt.add_option("--repo-dir", dest="repodir",
@ -217,6 +315,7 @@ def _Main(argv):
try:
try:
init_ssh()
init_http()
repo._Run(argv)
finally:
close_ssh()

View File

@ -14,7 +14,9 @@
# limitations under the License.
import os
import re
import sys
import urlparse
import xml.dom.minidom
from git_config import GitConfig, IsId
@ -24,26 +26,36 @@ from error import ManifestParseError
MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml'
urlparse.uses_relative.extend(['ssh', 'git'])
urlparse.uses_netloc.extend(['ssh', 'git'])
class _Default(object):
"""Project defaults within the manifest."""
revisionExpr = None
remote = None
sync_j = 1
class _XmlRemote(object):
def __init__(self,
name,
fetch=None,
manifestUrl=None,
review=None):
self.name = name
self.fetchUrl = fetch
self.manifestUrl = manifestUrl
self.reviewUrl = review
def ToRemoteSpec(self, projectName):
url = self.fetchUrl
while url.endswith('/'):
url = url[:-1]
url += '/%s.git' % projectName
url = self.fetchUrl.rstrip('/') + '/' + projectName + '.git'
manifestUrl = self.manifestUrl.rstrip('/')
# urljoin will get confused if there is no scheme in the base url
# ie, if manifestUrl is of the form <hostname:port>
if manifestUrl.find(':') != manifestUrl.find('/') - 1:
manifestUrl = 'gopher://' + manifestUrl
url = urlparse.urljoin(manifestUrl, url)
url = re.sub(r'^gopher://', '', url)
return RemoteSpec(self.name, url, self.reviewUrl)
class XmlManifest(object):
@ -133,6 +145,9 @@ class XmlManifest(object):
if d.revisionExpr:
have_default = True
e.setAttribute('revision', d.revisionExpr)
if d.sync_j > 1:
have_default = True
e.setAttribute('sync-j', '%d' % d.sync_j)
if have_default:
root.appendChild(e)
root.appendChild(doc.createTextNode(''))
@ -171,6 +186,14 @@ class XmlManifest(object):
ce.setAttribute('dest', c.dest)
e.appendChild(ce)
if self._repo_hooks_project:
root.appendChild(doc.createTextNode(''))
e = doc.createElement('repo-hooks')
e.setAttribute('in-project', self._repo_hooks_project.name)
e.setAttribute('enabled-list',
' '.join(self._repo_hooks_project.enabled_repo_hooks))
root.appendChild(e)
doc.writexml(fd, '', ' ', '\n', 'UTF-8')
@property
@ -188,6 +211,11 @@ class XmlManifest(object):
self._Load()
return self._default
@property
def repo_hooks_project(self):
self._Load()
return self._repo_hooks_project
@property
def notice(self):
self._Load()
@ -207,6 +235,7 @@ class XmlManifest(object):
self._projects = {}
self._remotes = {}
self._default = None
self._repo_hooks_project = None
self._notice = None
self.branch = None
self._manifest_server = None
@ -239,15 +268,15 @@ class XmlManifest(object):
def _ParseManifest(self, is_root_file):
root = xml.dom.minidom.parse(self.manifestFile)
if not root or not root.childNodes:
raise ManifestParseError, \
"no root node in %s" % \
self.manifestFile
raise ManifestParseError(
"no root node in %s" %
self.manifestFile)
config = root.childNodes[0]
if config.nodeName != 'manifest':
raise ManifestParseError, \
"no <manifest> in %s" % \
self.manifestFile
raise ManifestParseError(
"no <manifest> in %s" %
self.manifestFile)
for node in config.childNodes:
if node.nodeName == 'remove-project':
@ -255,25 +284,30 @@ class XmlManifest(object):
try:
del self._projects[name]
except KeyError:
raise ManifestParseError, \
'project %s not found' % \
(name)
raise ManifestParseError(
'project %s not found' %
(name))
# If the manifest removes the hooks project, treat it as if it deleted
# the repo-hooks element too.
if self._repo_hooks_project and (self._repo_hooks_project.name == name):
self._repo_hooks_project = None
for node in config.childNodes:
if node.nodeName == 'remote':
remote = self._ParseRemote(node)
if self._remotes.get(remote.name):
raise ManifestParseError, \
'duplicate remote %s in %s' % \
(remote.name, self.manifestFile)
raise ManifestParseError(
'duplicate remote %s in %s' %
(remote.name, self.manifestFile))
self._remotes[remote.name] = remote
for node in config.childNodes:
if node.nodeName == 'default':
if self._default is not None:
raise ManifestParseError, \
'duplicate default in %s' % \
(self.manifestFile)
raise ManifestParseError(
'duplicate default in %s' %
(self.manifestFile))
self._default = self._ParseDefault(node)
if self._default is None:
self._default = _Default()
@ -281,29 +315,52 @@ class XmlManifest(object):
for node in config.childNodes:
if node.nodeName == 'notice':
if self._notice is not None:
raise ManifestParseError, \
'duplicate notice in %s' % \
(self.manifestFile)
raise ManifestParseError(
'duplicate notice in %s' %
(self.manifestFile))
self._notice = self._ParseNotice(node)
for node in config.childNodes:
if node.nodeName == 'manifest-server':
url = self._reqatt(node, 'url')
if self._manifest_server is not None:
raise ManifestParseError, \
'duplicate manifest-server in %s' % \
(self.manifestFile)
raise ManifestParseError(
'duplicate manifest-server in %s' %
(self.manifestFile))
self._manifest_server = url
for node in config.childNodes:
if node.nodeName == 'project':
project = self._ParseProject(node)
if self._projects.get(project.name):
raise ManifestParseError, \
'duplicate project %s in %s' % \
(project.name, self.manifestFile)
raise ManifestParseError(
'duplicate project %s in %s' %
(project.name, self.manifestFile))
self._projects[project.name] = project
for node in config.childNodes:
if node.nodeName == 'repo-hooks':
# Get the name of the project and the (space-separated) list of enabled.
repo_hooks_project = self._reqatt(node, 'in-project')
enabled_repo_hooks = self._reqatt(node, 'enabled-list').split()
# Only one project can be the hooks project
if self._repo_hooks_project is not None:
raise ManifestParseError(
'duplicate repo-hooks in %s' %
(self.manifestFile))
# Store a reference to the Project.
try:
self._repo_hooks_project = self._projects[repo_hooks_project]
except KeyError:
raise ManifestParseError(
'project %s not found for repo-hooks' %
(repo_hooks_project))
# Store the enabled hooks in the Project object.
self._repo_hooks_project.enabled_repo_hooks = enabled_repo_hooks
def _AddMetaProjectMirror(self, m):
name = None
m_url = m.GetRemote(m.remote.name).url
@ -320,7 +377,8 @@ class XmlManifest(object):
if name is None:
s = m_url.rindex('/') + 1
remote = _XmlRemote('origin', m_url[:s])
manifestUrl = self.manifestProject.config.GetString('remote.origin.url')
remote = _XmlRemote('origin', m_url[:s], manifestUrl)
name = m_url[s:]
if name.endswith('.git'):
@ -348,7 +406,8 @@ class XmlManifest(object):
review = node.getAttribute('review')
if review == '':
review = None
return _XmlRemote(name, fetch, review)
manifestUrl = self.manifestProject.config.GetString('remote.origin.url')
return _XmlRemote(name, fetch, manifestUrl, review)
def _ParseDefault(self, node):
"""
@ -359,6 +418,11 @@ class XmlManifest(object):
d.revisionExpr = node.getAttribute('revision')
if d.revisionExpr == '':
d.revisionExpr = None
sync_j = node.getAttribute('sync-j')
if sync_j == '' or sync_j is None:
d.sync_j = 1
else:
d.sync_j = int(sync_j)
return d
def _ParseNotice(self, node):

View File

@ -21,13 +21,14 @@ from trace import IsTrace
_NOT_TTY = not os.isatty(2)
class Progress(object):
def __init__(self, title, total=0):
def __init__(self, title, total=0, units=''):
self._title = title
self._total = total
self._done = 0
self._lastp = -1
self._start = time()
self._show = False
self._units = units
def update(self, inc=1):
self._done += inc
@ -51,11 +52,11 @@ class Progress(object):
if self._lastp != p:
self._lastp = p
sys.stderr.write('\r%s: %3d%% (%d/%d) ' % (
sys.stderr.write('\r%s: %3d%% (%d%s/%d%s) ' % (
self._title,
p,
self._done,
self._total))
self._done, self._units,
self._total, self._units))
sys.stderr.flush()
def end(self):
@ -69,9 +70,9 @@ class Progress(object):
sys.stderr.flush()
else:
p = (100 * self._done) / self._total
sys.stderr.write('\r%s: %3d%% (%d/%d), done. \n' % (
sys.stderr.write('\r%s: %3d%% (%d%s/%d%s), done. \n' % (
self._title,
p,
self._done,
self._total))
self._done, self._units,
self._total, self._units))
sys.stderr.flush()

View File

@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import traceback
import errno
import filecmp
import os
@ -23,9 +24,11 @@ import urllib2
from color import Coloring
from git_command import GitCommand
from git_config import GitConfig, IsId
from error import GitError, ImportError, UploadError
from git_config import GitConfig, IsId, GetSchemeFromUrl
from error import DownloadError
from error import GitError, HookError, ImportError, UploadError
from error import ManifestInvalidRevisionError
from progress import Progress
from git_refs import GitRefs, HEAD, R_HEADS, R_TAGS, R_PUB, R_M
@ -54,14 +57,25 @@ def not_rev(r):
def sq(r):
return "'" + r.replace("'", "'\''") + "'"
hook_list = None
def repo_hooks():
global hook_list
if hook_list is None:
_project_hook_list = None
def _ProjectHooks():
"""List the hooks present in the 'hooks' directory.
These hooks are project hooks and are copied to the '.git/hooks' directory
of all subprojects.
This function caches the list of hooks (based on the contents of the
'repo/hooks' directory) on the first call.
Returns:
A list of absolute paths to all of the files in the hooks directory.
"""
global _project_hook_list
if _project_hook_list is None:
d = os.path.abspath(os.path.dirname(__file__))
d = os.path.join(d , 'hooks')
hook_list = map(lambda x: os.path.join(d, x), os.listdir(d))
return hook_list
_project_hook_list = map(lambda x: os.path.join(d, x), os.listdir(d))
return _project_hook_list
def relpath(dst, src):
src = os.path.dirname(src)
@ -223,6 +237,249 @@ class RemoteSpec(object):
self.url = url
self.review = review
class RepoHook(object):
"""A RepoHook contains information about a script to run as a hook.
Hooks are used to run a python script before running an upload (for instance,
to run presubmit checks). Eventually, we may have hooks for other actions.
This shouldn't be confused with files in the 'repo/hooks' directory. Those
files are copied into each '.git/hooks' folder for each project. Repo-level
hooks are associated instead with repo actions.
Hooks are always python. When a hook is run, we will load the hook into the
interpreter and execute its main() function.
"""
def __init__(self,
hook_type,
hooks_project,
topdir,
abort_if_user_denies=False):
"""RepoHook constructor.
Params:
hook_type: A string representing the type of hook. This is also used
to figure out the name of the file containing the hook. For
example: 'pre-upload'.
hooks_project: The project containing the repo hooks. If you have a
manifest, this is manifest.repo_hooks_project. OK if this is None,
which will make the hook a no-op.
topdir: Repo's top directory (the one containing the .repo directory).
Scripts will run with CWD as this directory. If you have a manifest,
this is manifest.topdir
abort_if_user_denies: If True, we'll throw a HookError() if the user
doesn't allow us to run the hook.
"""
self._hook_type = hook_type
self._hooks_project = hooks_project
self._topdir = topdir
self._abort_if_user_denies = abort_if_user_denies
# Store the full path to the script for convenience.
if self._hooks_project:
self._script_fullpath = os.path.join(self._hooks_project.worktree,
self._hook_type + '.py')
else:
self._script_fullpath = None
def _GetHash(self):
"""Return a hash of the contents of the hooks directory.
We'll just use git to do this. This hash has the property that if anything
changes in the directory we will return a different hash.
SECURITY CONSIDERATION:
This hash only represents the contents of files in the hook directory, not
any other files imported or called by hooks. Changes to imported files
can change the script behavior without affecting the hash.
Returns:
A string representing the hash. This will always be ASCII so that it can
be printed to the user easily.
"""
assert self._hooks_project, "Must have hooks to calculate their hash."
# We will use the work_git object rather than just calling GetRevisionId().
# That gives us a hash of the latest checked in version of the files that
# the user will actually be executing. Specifically, GetRevisionId()
# doesn't appear to change even if a user checks out a different version
# of the hooks repo (via git checkout) nor if a user commits their own revs.
#
# NOTE: Local (non-committed) changes will not be factored into this hash.
# I think this is OK, since we're really only worried about warning the user
# about upstream changes.
return self._hooks_project.work_git.rev_parse('HEAD')
def _GetMustVerb(self):
"""Return 'must' if the hook is required; 'should' if not."""
if self._abort_if_user_denies:
return 'must'
else:
return 'should'
def _CheckForHookApproval(self):
"""Check to see whether this hook has been approved.
We'll look at the hash of all of the hooks. If this matches the hash that
the user last approved, we're done. If it doesn't, we'll ask the user
about approval.
Note that we ask permission for each individual hook even though we use
the hash of all hooks when detecting changes. We'd like the user to be
able to approve / deny each hook individually. We only use the hash of all
hooks because there is no other easy way to detect changes to local imports.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
"""
hooks_dir = self._hooks_project.worktree
hooks_config = self._hooks_project.config
git_approval_key = 'repo.hooks.%s.approvedhash' % self._hook_type
# Get the last hash that the user approved for this hook; may be None.
old_hash = hooks_config.GetString(git_approval_key)
# Get the current hash so we can tell if scripts changed since approval.
new_hash = self._GetHash()
if old_hash is not None:
# User previously approved hook and asked not to be prompted again.
if new_hash == old_hash:
# Approval matched. We're done.
return True
else:
# Give the user a reason why we're prompting, since they last told
# us to "never ask again".
prompt = 'WARNING: Scripts have changed since %s was allowed.\n\n' % (
self._hook_type)
else:
prompt = ''
# Prompt the user only when stdout is a tty; otherwise assume "no".
if sys.stdout.isatty():
prompt += ('Repo %s run the script:\n'
' %s\n'
'\n'
'Do you want to allow this script to run '
'(yes/yes-never-ask-again/NO)? ') % (
self._GetMustVerb(), self._script_fullpath)
response = raw_input(prompt).lower()
print
# User is doing a one-time approval.
if response in ('y', 'yes'):
return True
elif response == 'yes-never-ask-again':
hooks_config.SetString(git_approval_key, new_hash)
return True
# For anything else, we'll assume no approval.
if self._abort_if_user_denies:
raise HookError('You must allow the %s hook or use --no-verify.' %
self._hook_type)
return False
def _ExecuteHook(self, **kwargs):
"""Actually execute the given hook.
This will run the hook's 'main' function in our python interpreter.
Args:
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
"""
# Keep sys.path and CWD stashed away so that we can always restore them
# upon function exit.
orig_path = os.getcwd()
orig_syspath = sys.path
try:
# Always run hooks with CWD as topdir.
os.chdir(self._topdir)
# Put the hook dir as the first item of sys.path so hooks can do
# relative imports. We want to replace the repo dir as [0] so
# hooks can't import repo files.
sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]
# Exec, storing global context in the context dict. We catch exceptions
# and convert to a HookError w/ just the failing traceback.
context = {}
try:
execfile(self._script_fullpath, context)
except Exception:
raise HookError('%s\nFailed to import %s hook; see traceback above.' % (
traceback.format_exc(), self._hook_type))
# Running the script should have defined a main() function.
if 'main' not in context:
raise HookError('Missing main() in: "%s"' % self._script_fullpath)
# Add 'hook_should_take_kwargs' to the arguments to be passed to main.
# We don't actually want hooks to define their main with this argument--
# it's there to remind them that their hook should always take **kwargs.
# For instance, a pre-upload hook should be defined like:
# def main(project_list, **kwargs):
#
# This allows us to later expand the API without breaking old hooks.
kwargs = kwargs.copy()
kwargs['hook_should_take_kwargs'] = True
# Call the main function in the hook. If the hook should cause the
# build to fail, it will raise an Exception. We'll catch that and
# convert it to a HookError w/ just the failing traceback.
try:
context['main'](**kwargs)
except Exception:
raise HookError('%s\nFailed to run main() for %s hook; see traceback '
'above.' % (
traceback.format_exc(), self._hook_type))
finally:
# Restore sys.path and CWD.
sys.path = orig_syspath
os.chdir(orig_path)
def Run(self, user_allows_all_hooks, **kwargs):
"""Run the hook.
If the hook doesn't exist (because there is no hooks project or because
this particular hook is not enabled), this is a no-op.
Args:
user_allows_all_hooks: If True, we will never prompt about running the
hook--we'll just assume it's OK to run it.
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
Raises:
HookError: If there was a problem finding the hook or the user declined
to run a required hook (from _CheckForHookApproval).
"""
# No-op if there is no hooks project or if hook is disabled.
if ((not self._hooks_project) or
(self._hook_type not in self._hooks_project.enabled_repo_hooks)):
return
# Bail with a nice error if we can't find the hook.
if not os.path.isfile(self._script_fullpath):
raise HookError('Couldn\'t find repo hook: "%s"' % self._script_fullpath)
# Make sure the user is OK with running the hook.
if (not user_allows_all_hooks) and (not self._CheckForHookApproval()):
return
# Run the hook with the same version of python we're using.
self._ExecuteHook(**kwargs)
class Project(object):
def __init__(self,
manifest,
@ -237,7 +494,10 @@ class Project(object):
self.name = name
self.remote = remote
self.gitdir = gitdir.replace('\\', '/')
self.worktree = worktree.replace('\\', '/')
if worktree:
self.worktree = worktree.replace('\\', '/')
else:
self.worktree = None
self.relpath = relpath
self.revisionExpr = revisionExpr
@ -261,6 +521,10 @@ class Project(object):
self.bare_git = self._GitGetByExec(self, bare=True)
self.bare_ref = GitRefs(gitdir)
# This will be filled in if a project is later identified to be the
# project containing repo hooks.
self.enabled_repo_hooks = []
@property
def Exists(self):
return os.path.isdir(self.gitdir)
@ -388,13 +652,18 @@ class Project(object):
return False
def PrintWorkTreeStatus(self):
def PrintWorkTreeStatus(self, output_redir=None):
"""Prints the status of the repository to stdout.
Args:
output: If specified, redirect the output to this object.
"""
if not os.path.isdir(self.worktree):
print ''
print 'project %s/' % self.relpath
print ' missing (run "repo sync")'
if output_redir == None:
output_redir = sys.stdout
print >>output_redir, ''
print >>output_redir, 'project %s/' % self.relpath
print >>output_redir, ' missing (run "repo sync")'
return
self.work_git.update_index('-q',
@ -409,6 +678,8 @@ class Project(object):
return 'CLEAN'
out = StatusColoring(self.config)
if not output_redir == None:
out.redirect(output_redir)
out.project('project %-40s', self.relpath + '/')
branch = self.CurrentBranch
@ -458,6 +729,7 @@ class Project(object):
else:
out.write('%s', line)
out.nl()
return 'DIRTY'
def PrintWorkTreeDiff(self):
@ -521,7 +793,7 @@ class Project(object):
if R_HEADS + n not in heads:
self.bare_git.DeleteRef(name, id)
def GetUploadableBranches(self):
def GetUploadableBranches(self, selected_branch=None):
"""List any branches which can be uploaded for review.
"""
heads = {}
@ -537,6 +809,8 @@ class Project(object):
for branch, id in heads.iteritems():
if branch in pubed and pubed[branch] == id:
continue
if selected_branch and branch != selected_branch:
continue
rb = self.GetUploadableBranch(branch)
if rb:
@ -612,15 +886,13 @@ class Project(object):
## Sync ##
def Sync_NetworkHalf(self, quiet=False):
def Sync_NetworkHalf(self, quiet=False, is_new=None):
"""Perform only the network IO portion of the sync process.
Local working directory/branch state is not affected.
"""
is_new = not self.Exists
if is_new is None:
is_new = not self.Exists
if is_new:
if not quiet:
print >>sys.stderr
print >>sys.stderr, 'Initializing project %s ...' % self.name
self._InitGitDir()
self._InitRemote()
@ -677,11 +949,11 @@ class Project(object):
"""Perform only the local IO portion of the sync process.
Network access is not required.
"""
self._InitWorkTree()
all = self.bare_ref.all
self.CleanPublishedCache(all)
revid = self.GetRevisionId(all)
self._InitWorkTree()
head = self.work_git.GetHead()
if head.startswith(R_HEADS):
branch = head[len(R_HEADS):]
@ -897,6 +1169,13 @@ class Project(object):
def CheckoutBranch(self, name):
"""Checkout a local topic branch.
Args:
name: The name of the branch to checkout.
Returns:
True if the checkout succeeded; False if it didn't; None if the branch
didn't exist.
"""
rev = R_HEADS + name
head = self.work_git.GetHead()
@ -911,7 +1190,7 @@ class Project(object):
except KeyError:
# Branch does not exist in this project
#
return False
return None
if head.startswith(R_HEADS):
try:
@ -934,13 +1213,19 @@ class Project(object):
def AbandonBranch(self, name):
"""Destroy a local topic branch.
Args:
name: The name of the branch to abandon.
Returns:
True if the abandon succeeded; False if it didn't; None if the branch
didn't exist.
"""
rev = R_HEADS + name
all = self.bare_ref.all
if rev not in all:
# Doesn't exist; assume already abandoned.
#
return True
# Doesn't exist
return None
head = self.work_git.GetHead()
if head == rev:
@ -1027,9 +1312,16 @@ class Project(object):
name = self.remote.name
ssh_proxy = False
if self.GetRemote(name).PreConnectFetch():
remote = self.GetRemote(name)
if remote.PreConnectFetch():
ssh_proxy = True
bundle_dst = os.path.join(self.gitdir, 'clone.bundle')
bundle_tmp = os.path.join(self.gitdir, 'clone.bundle.tmp')
use_bundle = False
if os.path.exists(bundle_dst) or os.path.exists(bundle_tmp):
use_bundle = True
if initial:
alt = os.path.join(self.gitdir, 'objects/info/alternates')
try:
@ -1044,6 +1336,8 @@ class Project(object):
ref_dir = None
if ref_dir and 'objects' == os.path.basename(ref_dir):
if use_bundle:
use_bundle = False
ref_dir = os.path.dirname(ref_dir)
packed_refs = os.path.join(self.gitdir, 'packed-refs')
remote = self.GetRemote(name)
@ -1083,16 +1377,46 @@ class Project(object):
else:
ref_dir = None
use_bundle = True
cmd = ['fetch']
# The --depth option only affects the initial fetch; after that we'll do
# full fetches of changes.
depth = self.manifest.manifestProject.config.GetString('repo.depth')
if depth and initial:
cmd.append('--depth=%s' % depth)
use_bundle = False
if quiet:
cmd.append('--quiet')
if not self.worktree:
cmd.append('--update-head-ok')
cmd.append(name)
if tag is not None:
cmd.append('tag')
cmd.append(tag)
if use_bundle and not os.path.exists(bundle_dst):
bundle_url = remote.url + '/clone.bundle'
bundle_url = GitConfig.ForUser().UrlInsteadOf(bundle_url)
if GetSchemeFromUrl(bundle_url) in ('http', 'https'):
use_bundle = self._FetchBundle(
bundle_url,
bundle_tmp,
bundle_dst,
quiet=quiet)
else:
use_bundle = False
if use_bundle:
if not quiet:
cmd.append('--quiet')
cmd.append(bundle_dst)
for f in remote.fetch:
cmd.append(str(f))
cmd.append('refs/tags/*:refs/tags/*')
else:
cmd.append(name)
if tag is not None:
cmd.append('tag')
cmd.append(tag)
ok = GitCommand(self,
cmd,
@ -1107,8 +1431,99 @@ class Project(object):
os.remove(packed_refs)
self.bare_git.pack_refs('--all', '--prune')
if os.path.exists(bundle_dst):
os.remove(bundle_dst)
if os.path.exists(bundle_tmp):
os.remove(bundle_tmp)
return ok
def _FetchBundle(self, srcUrl, tmpPath, dstPath, quiet=False):
keep = True
done = False
dest = open(tmpPath, 'a+b')
try:
dest.seek(0, os.SEEK_END)
pos = dest.tell()
req = urllib2.Request(srcUrl)
if pos > 0:
req.add_header('Range', 'bytes=%d-' % pos)
try:
r = urllib2.urlopen(req)
except urllib2.HTTPError, e:
if e.code == 404:
keep = False
return False
elif e.info()['content-type'] == 'text/plain':
try:
msg = e.read()
if len(msg) > 0 and msg[-1] == '\n':
msg = msg[0:-1]
msg = ' (%s)' % msg
except:
msg = ''
else:
try:
from BaseHTTPServer import BaseHTTPRequestHandler
res = BaseHTTPRequestHandler.responses[e.code]
msg = ' (%s: %s)' % (res[0], res[1])
except:
msg = ''
raise DownloadError('HTTP %s%s' % (e.code, msg))
except urllib2.URLError, e:
raise DownloadError('%s (%s)' % (e.reason, req.get_host()))
p = None
try:
size = r.headers['content-length']
unit = 1 << 10
if size and not quiet:
if size > 1024 * 1.3:
unit = 1 << 20
desc = 'MB'
else:
desc = 'KB'
p = Progress(
'Downloading %s' % self.relpath,
int(size) / unit,
units=desc)
if pos > 0:
p.update(pos / unit)
s = 0
while True:
d = r.read(8192)
if d == '':
done = True
return True
dest.write(d)
if p:
s += len(d)
if s >= unit:
p.update(s / unit)
s = s % unit
if p:
if s >= unit:
p.update(s / unit)
else:
p.update(1)
finally:
r.close()
if p:
p.end()
finally:
dest.close()
if os.path.exists(dstPath):
os.remove(dstPath)
if done:
os.rename(tmpPath, dstPath)
elif not keep:
os.remove(tmpPath)
def _Checkout(self, rev, quiet=False):
cmd = ['checkout']
if quiet:
@ -1189,13 +1604,16 @@ class Project(object):
hooks = self._gitdir_path('hooks')
if not os.path.exists(hooks):
os.makedirs(hooks)
for stock_hook in repo_hooks():
for stock_hook in _ProjectHooks():
name = os.path.basename(stock_hook)
if name in ('commit-msg') and not self.remote.review:
if name in ('commit-msg',) and not self.remote.review \
and not self is self.manifest.manifestProject:
# Don't install a Gerrit Code Review hook if this
# project does not appear to use it for reviews.
#
# Since the manifest project is one of those, but also
# managed through gerrit, it's excluded
continue
dst = os.path.join(hooks, name)
@ -1285,6 +1703,11 @@ class Project(object):
cmd.append(HEAD)
if GitCommand(self, cmd).Wait() != 0:
raise GitError("cannot initialize work tree")
rr_cache = os.path.join(self.gitdir, 'rr-cache')
if not os.path.exists(rr_cache):
os.makedirs(rr_cache)
self._CopyFiles()
def _gitdir_path(self, path):
@ -1443,6 +1866,22 @@ class Project(object):
return r
def __getattr__(self, name):
"""Allow arbitrary git commands using pythonic syntax.
This allows you to do things like:
git_obj.rev_parse('HEAD')
Since we don't have a 'rev_parse' method defined, the __getattr__ will
run. We'll replace the '_' with a '-' and try to run a git command.
Any other arguments will be passed to the git command.
Args:
name: The name of the git command to call. Any '_' characters will
be replaced with '-'.
Returns:
A callable object that will try to call git with the named command.
"""
name = name.replace('_', '-')
def runner(*args):
cmdv = [name]

110
repo
View File

@ -2,7 +2,7 @@
## repo default configuration
##
REPO_URL='git://android.git.kernel.org/tools/repo.git'
REPO_URL='https://code.google.com/p/git-repo/'
REPO_REV='stable'
# Copyright (C) 2008 Google Inc.
@ -28,7 +28,7 @@ if __name__ == '__main__':
del magic
# increment this whenever we make important changes to this script
VERSION = (1, 10)
VERSION = (1, 13)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1,0)
@ -91,6 +91,7 @@ import re
import readline
import subprocess
import sys
import urllib2
home_dot_repo = os.path.expanduser('~/.repoconfig')
gpg_dir = os.path.join(home_dot_repo, 'gnupg')
@ -121,6 +122,10 @@ group.add_option('--mirror',
group.add_option('--reference',
dest='reference',
help='location of mirror directory', metavar='DIR')
group.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
# Tool
group = init_optparse.add_option_group('repo Version options')
@ -183,10 +188,6 @@ def _Init(args):
else:
can_verify = True
if not opt.quiet:
print >>sys.stderr, 'Getting repo ...'
print >>sys.stderr, ' from %s' % url
dst = os.path.abspath(os.path.join(repodir, S_repo))
_Clone(url, dst, opt.quiet)
@ -296,15 +297,42 @@ def _SetConfig(local, name, value):
raise CloneFailure()
def _Fetch(local, quiet, *args):
def _InitHttp():
handlers = []
mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
try:
import netrc
n = netrc.netrc()
for host in n.hosts:
p = n.hosts[host]
mgr.add_password(None, 'http://%s/' % host, p[0], p[2])
mgr.add_password(None, 'https://%s/' % host, p[0], p[2])
except:
pass
handlers.append(urllib2.HTTPBasicAuthHandler(mgr))
if 'http_proxy' in os.environ:
url = os.environ['http_proxy']
handlers.append(urllib2.ProxyHandler({'http': url, 'https': url}))
if 'REPO_CURL_VERBOSE' in os.environ:
handlers.append(urllib2.HTTPHandler(debuglevel=1))
handlers.append(urllib2.HTTPSHandler(debuglevel=1))
urllib2.install_opener(urllib2.build_opener(*handlers))
def _Fetch(url, local, src, quiet):
if not quiet:
print >>sys.stderr, 'Get %s' % url
cmd = [GIT, 'fetch']
if quiet:
cmd.append('--quiet')
err = subprocess.PIPE
else:
err = None
cmd.extend(args)
cmd.append('origin')
cmd.append(src)
cmd.append('+refs/heads/*:refs/remotes/origin/*')
cmd.append('refs/tags/*:refs/tags/*')
proc = subprocess.Popen(cmd, cwd = local, stderr = err)
if err:
@ -313,6 +341,62 @@ def _Fetch(local, quiet, *args):
if proc.wait() != 0:
raise CloneFailure()
def _DownloadBundle(url, local, quiet):
if not url.endswith('/'):
url += '/'
url += 'clone.bundle'
proc = subprocess.Popen(
[GIT, 'config', '--get-regexp', 'url.*.insteadof'],
cwd = local,
stdout = subprocess.PIPE)
for line in proc.stdout:
m = re.compile(r'^url\.(.*)\.insteadof (.*)$').match(line)
if m:
new_url = m.group(1)
old_url = m.group(2)
if url.startswith(old_url):
url = new_url + url[len(old_url):]
break
proc.stdout.close()
proc.wait()
if not url.startswith('http:') and not url.startswith('https:'):
return False
dest = open(os.path.join(local, '.git', 'clone.bundle'), 'w+b')
try:
try:
r = urllib2.urlopen(url)
except urllib2.HTTPError, e:
if e.code == 404:
return False
print >>sys.stderr, 'fatal: Cannot get %s' % url
print >>sys.stderr, 'fatal: HTTP error %s' % e.code
raise CloneFailure()
except urllib2.URLError, e:
print >>sys.stderr, 'fatal: Cannot get %s' % url
print >>sys.stderr, 'fatal: error %s' % e.reason
raise CloneFailure()
try:
if not quiet:
print >>sys.stderr, 'Get %s' % url
while True:
buf = r.read(8192)
if buf == '':
return True
dest.write(buf)
finally:
r.close()
finally:
dest.close()
def _ImportBundle(local):
path = os.path.join(local, '.git', 'clone.bundle')
try:
_Fetch(local, local, path, True)
finally:
os.remove(path)
def _Clone(url, local, quiet):
"""Clones a git repository to a new subdirectory of repodir
@ -340,11 +424,14 @@ def _Clone(url, local, quiet):
print >>sys.stderr, 'fatal: could not create %s' % local
raise CloneFailure()
_InitHttp()
_SetConfig(local, 'remote.origin.url', url)
_SetConfig(local, 'remote.origin.fetch',
'+refs/heads/*:refs/remotes/origin/*')
_Fetch(local, quiet)
_Fetch(local, quiet, '--tags')
if _DownloadBundle(url, local, quiet):
_ImportBundle(local)
else:
_Fetch(url, local, 'origin', quiet)
def _Verify(cwd, branch, quiet):
@ -596,4 +683,3 @@ def main(orig_args):
if __name__ == '__main__':
main(sys.argv[1:])


@ -41,21 +41,30 @@ It is equivalent to "git branch -D <branchname>".
nb = args[0]
err = []
success = []
all = self.GetProjects(args[1:])
pm = Progress('Abandon %s' % nb, len(all))
for project in all:
pm.update()
if not project.AbandonBranch(nb):
err.append(project)
status = project.AbandonBranch(nb)
if status is not None:
if status:
success.append(project)
else:
err.append(project)
pm.end()
if err:
if len(err) == len(all):
print >>sys.stderr, 'error: no project has branch %s' % nb
else:
for p in err:
print >>sys.stderr,\
"error: %s/: cannot abandon %s" \
% (p.relpath, nb)
for p in err:
print >>sys.stderr,\
"error: %s/: cannot abandon %s" \
% (p.relpath, nb)
sys.exit(1)
elif not success:
print >>sys.stderr, 'error: no project has branch %s' % nb
sys.exit(1)
else:
print >>sys.stderr, 'Abandoned in %d project(s):\n %s' % (
len(success), '\n '.join(p.relpath for p in success))


@ -38,21 +38,27 @@ The command is equivalent to:
nb = args[0]
err = []
success = []
all = self.GetProjects(args[1:])
pm = Progress('Checkout %s' % nb, len(all))
for project in all:
pm.update()
if not project.CheckoutBranch(nb):
err.append(project)
status = project.CheckoutBranch(nb)
if status is not None:
if status:
success.append(project)
else:
err.append(project)
pm.end()
if err:
if len(err) == len(all):
print >>sys.stderr, 'error: no project has branch %s' % nb
else:
for p in err:
print >>sys.stderr,\
"error: %s/: cannot checkout %s" \
% (p.relpath, nb)
for p in err:
print >>sys.stderr,\
"error: %s/: cannot checkout %s" \
% (p.relpath, nb)
sys.exit(1)
elif not success:
print >>sys.stderr, 'error: no project has branch %s' % nb
sys.exit(1)

subcmds/cherry_pick.py (new file, 114 lines)

@ -0,0 +1,114 @@
#
# Copyright (C) 2010 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys, re, string, random, os
from command import Command
from git_command import GitCommand
CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$')
class CherryPick(Command):
common = True
helpSummary = "Cherry-pick a change."
helpUsage = """
%prog <sha1>
"""
helpDescription = """
'%prog' cherry-picks a change from one branch to another.
The change id will be updated, and a reference to the commit
the change was picked from will be added.
"""
def _Options(self, p):
pass
def Execute(self, opt, args):
if len(args) != 1:
self.Usage()
reference = args[0]
p = GitCommand(None,
['rev-parse', '--verify', reference],
capture_stdout = True,
capture_stderr = True)
if p.Wait() != 0:
print >>sys.stderr, p.stderr
sys.exit(1)
sha1 = p.stdout.strip()
p = GitCommand(None, ['cat-file', 'commit', sha1], capture_stdout=True)
if p.Wait() != 0:
print >>sys.stderr, "error: Failed to retrieve old commit message"
sys.exit(1)
old_msg = self._StripHeader(p.stdout)
p = GitCommand(None,
['cherry-pick', sha1],
capture_stdout = True,
capture_stderr = True)
status = p.Wait()
print >>sys.stdout, p.stdout
print >>sys.stderr, p.stderr
if status == 0:
# The cherry-pick was applied correctly. We just need to edit the
# commit message.
new_msg = self._Reformat(old_msg, sha1)
p = GitCommand(None, ['commit', '--amend', '-F', '-'],
provide_stdin = True,
capture_stdout = True,
capture_stderr = True)
p.stdin.write(new_msg)
if p.Wait() != 0:
print >>sys.stderr, "error: Failed to update commit message"
sys.exit(1)
else:
print >>sys.stderr, """\
NOTE: When committing (please see above) and editing the commit message,
please remove the old Change-Id line and add:
"""
print >>sys.stderr, self._GetReference(sha1)
print >>sys.stderr
def _IsChangeId(self, line):
return CHANGE_ID_RE.match(line)
def _GetReference(self, sha1):
return "(cherry picked from commit %s)" % sha1
def _StripHeader(self, commit_msg):
lines = commit_msg.splitlines()
return "\n".join(lines[lines.index("")+1:])
def _Reformat(self, old_msg, sha1):
new_msg = []
for line in old_msg.splitlines():
if not self._IsChangeId(line):
new_msg.append(line)
# Add a blank line between the message and the change id/reference
try:
if new_msg[-1].strip() != "":
new_msg.append("")
except IndexError:
pass
new_msg.append(self._GetReference(sha1))
return "\n".join(new_msg)


@ -14,12 +14,14 @@
# limitations under the License.
import os
import shutil
import sys
from color import Coloring
from command import InteractiveCommand, MirrorSafeCommand
from error import ManifestParseError
from project import SyncBuffer
from git_config import GitConfig
from git_command import git_require, MIN_GIT_VERSION
class Init(InteractiveCommand, MirrorSafeCommand):
@ -81,6 +83,9 @@ to update the working directory files.
g.add_option('--reference',
dest='reference',
help='location of mirror directory', metavar='DIR')
g.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
# Tool
g = p.add_option_group('repo Version options')
@ -104,8 +109,8 @@ to update the working directory files.
sys.exit(1)
if not opt.quiet:
print >>sys.stderr, 'Getting manifest ...'
print >>sys.stderr, ' from %s' % opt.manifest_url
print >>sys.stderr, 'Get %s' \
% GitConfig.ForUser().UrlInsteadOf(opt.manifest_url)
m._InitGitDir()
if opt.manifest_branch:
@ -134,9 +139,14 @@ to update the working directory files.
print >>sys.stderr, 'fatal: --mirror not supported on existing client'
sys.exit(1)
if not m.Sync_NetworkHalf():
if not m.Sync_NetworkHalf(is_new=is_new):
r = m.GetRemote(m.remote.name)
print >>sys.stderr, 'fatal: cannot obtain manifest %s' % r.url
# Better to delete the manifest git dir if we created it; otherwise the next
# run (once the user fixes the problem) would skip the "is_new" logic.
if is_new:
shutil.rmtree(m.gitdir)
sys.exit(1)
syncbuf = SyncBuffer(m.config)
@ -226,6 +236,25 @@ to update the working directory files.
if a in ('y', 'yes', 't', 'true', 'on'):
gc.SetString('color.ui', 'auto')
def _ConfigureDepth(self, opt):
"""Configure the depth we'll sync down.
Args:
opt: Options from optparse. We care about opt.depth.
"""
# Opt.depth will be non-None if user actually passed --depth to repo init.
if opt.depth is not None:
if opt.depth > 0:
# Positive values will set the depth.
depth = str(opt.depth)
else:
# Negative numbers will clear the depth; passing None to SetString
# will do that.
depth = None
# We store the depth in the main manifest project.
self.manifest.manifestProject.config.SetString('repo.depth', depth)
def Execute(self, opt, args):
git_require(MIN_GIT_VERSION, fail=True)
self._SyncManifest(opt)
@ -235,6 +264,8 @@ to update the working directory files.
self._ConfigureUser()
self._ConfigureColor()
self._ConfigureDepth(opt)
if self.manifest.IsMirror:
type = 'mirror '
else:

subcmds/list.py (new file, 48 lines)

@ -0,0 +1,48 @@
#
# Copyright (C) 2011 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from command import Command, MirrorSafeCommand
class List(Command, MirrorSafeCommand):
common = True
helpSummary = "List projects and their associated directories"
helpUsage = """
%prog [<project>...]
"""
helpDescription = """
List all projects; pass '.' to list the project for the cwd.
This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
"""
def Execute(self, opt, args):
"""List all projects and the associated directories.
This may be possible to do with 'repo forall', but repo newbies have
trouble figuring that out. The idea here is that it should be more
discoverable.
Args:
opt: The options. We don't take any.
args: Positional args. Can be a list of projects to list, or empty.
"""
projects = self.GetProjects(args)
lines = []
for project in projects:
lines.append("%s : %s" % (project.relpath, project.name))
lines.sort()
print '\n'.join(lines)
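Sample output, with illustrative project paths and names:

  $ repo list
  build : platform/build
  frameworks/base : platform/frameworks/base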


@ -15,6 +15,15 @@
from command import PagedCommand
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
import itertools
import sys
import StringIO
class Status(PagedCommand):
common = True
helpSummary = "Show the working tree status"
@ -27,6 +36,9 @@ and the most recent commit on this branch (HEAD), in each project
specified. A summary is displayed, one line per file where there
is a difference between these three states.
The -j/--jobs option can be used to run multiple status queries
in parallel.
Status Display
--------------
@ -60,9 +72,34 @@ the following meanings:
"""
def _Options(self, p):
p.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', default=2,
help="number of projects to check simultaneously")
def _StatusHelper(self, project, clean_counter, sem, output):
"""Obtains the status for a specific project.
Obtains the status for a project, redirecting the output to
the specified object. It will release the semaphore
when done.
Args:
project: Project to get status of.
clean_counter: Counter for clean projects.
sem: Semaphore, will call release() when complete.
output: Where to output the status.
"""
try:
state = project.PrintWorkTreeStatus(output)
if state == 'CLEAN':
clean_counter.next()
finally:
sem.release()
def Execute(self, opt, args):
all = self.GetProjects(args)
clean = 0
counter = itertools.count()
on = {}
for project in all:
@ -77,9 +114,24 @@ the following meanings:
for cb in branch_names:
print '# on branch %s' % cb
for project in all:
state = project.PrintWorkTreeStatus()
if state == 'CLEAN':
clean += 1
if len(all) == clean:
if opt.jobs == 1:
for project in all:
state = project.PrintWorkTreeStatus()
if state == 'CLEAN':
counter.next()
else:
sem = _threading.Semaphore(opt.jobs)
threads_and_output = []
for project in all:
sem.acquire()
output = StringIO.StringIO()
t = _threading.Thread(target=self._StatusHelper,
args=(project, counter, sem, output))
threads_and_output.append((t, output))
t.start()
for (t, output) in threads_and_output:
t.join()
sys.stdout.write(output.getvalue())
output.close()
if len(all) == counter.next():
print 'nothing to commit (working directory clean)'
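The threaded path above combines three ingredients: a Semaphore to cap how many status queries run at once, a private StringIO buffer per worker so concurrent writes cannot interleave, and a join-then-print loop that emits the buffers in submission order. A minimal sketch of just that pattern, with a stand-in work() function in place of project.PrintWorkTreeStatus():

  import sys
  import threading
  import StringIO

  def work(name, out):
    # Stand-in for project.PrintWorkTreeStatus(output).
    print >>out, 'status of %s' % name

  def parallel_report(names, jobs):
    sem = threading.Semaphore(jobs)
    pending = []
    def helper(name, out):
      try:
        work(name, out)
      finally:
        sem.release()
    for name in names:
      sem.acquire()                     # at most 'jobs' workers in flight
      out = StringIO.StringIO()
      t = threading.Thread(target=helper, args=(name, out))
      pending.append((t, out))
      t.start()
    for t, out in pending:              # print in submission order
      t.join()
      sys.stdout.write(out.getvalue())
      out.close()

  parallel_report(['project-a', 'project-b', 'project-c'], 2)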


@ -28,6 +28,14 @@ try:
except ImportError:
import dummy_threading as _threading
try:
import resource
def _rlimit_nofile():
return resource.getrlimit(resource.RLIMIT_NOFILE)
except ImportError:
def _rlimit_nofile():
return (256, 256)
from git_command import GIT
from git_refs import R_HEADS
from project import HEAD
@ -39,6 +47,10 @@ from project import R_HEADS
from project import SyncBuffer
from progress import Progress
class _FetchError(Exception):
"""Internal error thrown in _FetchHelper() when we don't want stack trace."""
pass
class Sync(Command, MirrorSafeCommand):
jobs = 1
common = True
@ -68,7 +80,8 @@ revision is temporarily needed.
The -s/--smart-sync option can be used to sync to a known good
build as specified by the manifest-server element in the current
manifest.
manifest. The -t/--smart-tag option is similar and allows you to
specify a custom tag/label.
The -f/--force-broken option can be used to proceed with syncing
other projects if a project sync fails.
@ -104,6 +117,8 @@ later is required to fix a server side protocol bug.
"""
def _Options(self, p, show_smart=True):
self.jobs = self.manifest.default.sync_j
p.add_option('-f', '--force-broken',
dest='force_broken', action='store_true',
help="continue sync even if a project fails to sync")
@ -121,11 +136,14 @@ later is required to fix a server side protocol bug.
help='be more quiet')
p.add_option('-j','--jobs',
dest='jobs', action='store', type='int',
help="number of projects to fetch simultaneously")
help="projects to fetch simultaneously (default %d)" % self.jobs)
if show_smart:
p.add_option('-s', '--smart-sync',
dest='smart_sync', action='store_true',
help='smart sync using manifest from a known good build')
p.add_option('-t', '--smart-tag',
dest='smart_tag', action='store',
help='smart sync using manifest from a known tag')
g = p.add_option_group('repo Version options')
g.add_option('--no-repo-verify',
@ -135,20 +153,61 @@ later is required to fix a server side protocol bug.
dest='repo_upgraded', action='store_true',
help=SUPPRESS_HELP)
def _FetchHelper(self, opt, project, lock, fetched, pm, sem):
if not project.Sync_NetworkHalf(quiet=opt.quiet):
print >>sys.stderr, 'error: Cannot fetch %s' % project.name
if opt.force_broken:
print >>sys.stderr, 'warn: --force-broken, continuing to sync'
else:
sem.release()
sys.exit(1)
def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event):
"""Main function of the fetch threads when jobs are > 1.
lock.acquire()
fetched.add(project.gitdir)
pm.update()
lock.release()
sem.release()
Args:
opt: Program options returned from optparse. See _Options().
project: Project object for the project to fetch.
lock: Lock for accessing objects that are shared amongst multiple
_FetchHelper() threads.
fetched: set object that we will add project.gitdir to when we're done
(with our lock held).
pm: Instance of a Progress object. We will call pm.update() (with our
lock held).
sem: We'll release() this semaphore when we exit so that another thread
can be started up.
err_event: We'll set this event in the case of an error (after printing
out info about the error).
"""
# We'll set this to True once we've acquired the lock.
did_lock = False
# Encapsulate everything in a try/except/finally so that:
# - We always set err_event in the case of an exception.
# - We always make sure we call sem.release().
# - We always make sure we unlock the lock if we locked it.
try:
try:
success = project.Sync_NetworkHalf(quiet=opt.quiet)
# Lock around all the rest of the code, since printing, updating a set
# and Progress.update() are not thread safe.
lock.acquire()
did_lock = True
if not success:
print >>sys.stderr, 'error: Cannot fetch %s' % project.name
if opt.force_broken:
print >>sys.stderr, 'warn: --force-broken, continuing to sync'
else:
raise _FetchError()
fetched.add(project.gitdir)
pm.update()
except BaseException, e:
# Notify the _Fetch() function about all errors.
err_event.set()
# If we got our own _FetchError, we don't want a stack trace.
# However, if we got something else (something in Sync_NetworkHalf?),
# we'd like one (so re-raise after we've set err_event).
if not isinstance(e, _FetchError):
raise
finally:
if did_lock:
lock.release()
sem.release()
def _Fetch(self, projects, opt):
fetched = set()
@ -169,7 +228,13 @@ later is required to fix a server side protocol bug.
threads = set()
lock = _threading.Lock()
sem = _threading.Semaphore(self.jobs)
err_event = _threading.Event()
for project in projects:
# Check for any errors before starting any new threads.
# ...we'll let existing threads finish, though.
if err_event.isSet():
break
sem.acquire()
t = _threading.Thread(target = self._FetchHelper,
args = (opt,
@ -177,13 +242,19 @@ later is required to fix a server side protocol bug.
lock,
fetched,
pm,
sem))
sem,
err_event))
threads.add(t)
t.start()
for t in threads:
t.join()
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
print >>sys.stderr, '\nerror: Exited sync due to fetch errors'
sys.exit(1)
pm.end()
for project in projects:
project.bare_git.gc('--auto')
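The error handling here is fail-fast-but-drain: any worker that hits a problem sets a shared threading.Event, the scheduler stops launching new workers once the event is set, the workers already running are still joined, and the parent then exits non-zero. A stripped-down sketch of the same pattern (do_fetch stands in for the per-project network sync; it is not repo code):

  import sys
  import threading

  def fetch_all(items, jobs, do_fetch):
    sem = threading.Semaphore(jobs)
    err_event = threading.Event()
    threads = set()

    def worker(item):
      try:
        if not do_fetch(item):
          err_event.set()              # expected failure: flag it, no traceback
      except Exception:
        err_event.set()                # unexpected failure: flag it, then re-raise
        raise
      finally:
        sem.release()

    for item in items:
      if err_event.isSet():            # stop scheduling new work after an error
        break
      sem.acquire()
      t = threading.Thread(target=worker, args=(item,))
      threads.add(t)
      t.start()
    for t in threads:                  # let in-flight work finish
      t.join()
    if err_event.isSet():
      print >>sys.stderr, 'error: exiting due to fetch errors'
      sys.exit(1)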
@ -251,6 +322,10 @@ uncommitted changes are present' % project.relpath
def Execute(self, opt, args):
if opt.jobs:
self.jobs = opt.jobs
if self.jobs > 1:
soft_limit, _ = _rlimit_nofile()
self.jobs = min(self.jobs, (soft_limit - 5) / 3)
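For scale, the fallback limit of 256 descriptors returned when the resource module is unavailable clamps the job count to (256 - 5) / 3 = 83: a few descriptors are held back for the parent process and roughly three are budgeted for each forked 'git fetch'.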
if opt.network_only and opt.detach_head:
print >>sys.stderr, 'error: cannot combine -n and -d'
sys.exit(1)
@ -258,27 +333,31 @@ uncommitted changes are present' % project.relpath
print >>sys.stderr, 'error: cannot combine -n and -l'
sys.exit(1)
if opt.smart_sync:
if opt.smart_sync or opt.smart_tag:
if not self.manifest.manifest_server:
print >>sys.stderr, \
'error: cannot smart sync: no manifest server defined in manifest'
sys.exit(1)
try:
server = xmlrpclib.Server(self.manifest.manifest_server)
p = self.manifest.manifestProject
b = p.GetBranch(p.CurrentBranch)
branch = b.merge
if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
if opt.smart_sync:
p = self.manifest.manifestProject
b = p.GetBranch(p.CurrentBranch)
branch = b.merge
if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
env = os.environ.copy()
if (env.has_key('TARGET_PRODUCT') and
env.has_key('TARGET_BUILD_VARIANT')):
target = '%s-%s' % (env['TARGET_PRODUCT'],
env['TARGET_BUILD_VARIANT'])
[success, manifest_str] = server.GetApprovedManifest(branch, target)
env = os.environ.copy()
if (env.has_key('TARGET_PRODUCT') and
env.has_key('TARGET_BUILD_VARIANT')):
target = '%s-%s' % (env['TARGET_PRODUCT'],
env['TARGET_BUILD_VARIANT'])
[success, manifest_str] = server.GetApprovedManifest(branch, target)
else:
[success, manifest_str] = server.GetApprovedManifest(branch)
else:
[success, manifest_str] = server.GetApprovedManifest(branch)
assert(opt.smart_tag)
[success, manifest_str] = server.GetManifest(opt.smart_tag)
if success:
manifest_name = "smart_sync_override.xml"
@ -321,6 +400,8 @@ uncommitted changes are present' % project.relpath
if not syncbuf.Finish():
sys.exit(1)
self.manifest._Unload()
if opt.jobs is None:
self.jobs = self.manifest.default.sync_j
all = self.GetProjects(args, missing_ok=True)
if not opt.local_only:


@ -19,7 +19,8 @@ import sys
from command import InteractiveCommand
from editor import Editor
from error import UploadError
from error import HookError, UploadError
from project import RepoHook
UNUSUAL_COMMIT_THRESHOLD = 5
@ -119,6 +120,32 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
p.add_option('--cc',
type='string', action='append', dest='cc',
help='Also send email to these email addresses.')
p.add_option('--br',
type='string', action='store', dest='branch',
help='Branch to upload.')
# Options relating to upload hook. Note that verify and no-verify are NOT
# opposites of each other, which is why they store to different locations.
# We are using them to match 'git commit' syntax.
#
# Combinations:
# - no-verify=False, verify=False (DEFAULT):
# If stdout is a tty, we may prompt about running the upload hooks if needed.
# If the user denies running the hooks, the upload is cancelled. If stdout
# is not a tty and we would need to prompt about the upload hooks, the
# upload is cancelled.
# - no-verify=False, verify=True:
# Always run upload hooks with no prompt.
# - no-verify=True, verify=False:
# Never run upload hooks, but upload anyway (AKA bypass hooks).
# - no-verify=True, verify=True:
# Invalid
p.add_option('--no-verify',
dest='bypass_hooks', action='store_true',
help='Do not run the upload hook.')
p.add_option('--verify',
dest='allow_all_hooks', action='store_true',
help='Run the upload hook without prompting.')
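The four combinations documented above reduce to a small decision. A sketch of that mapping (the function, its arguments, and the returned strings are illustrative, and the "prompt only if a hook actually needs approval" nuance is omitted):

  def _HookPolicy(bypass_hooks, allow_all_hooks, stdout_is_tty, user_agrees):
    # Mirrors the --no-verify / --verify combinations documented above.
    if bypass_hooks and allow_all_hooks:
      raise ValueError('--no-verify and --verify are mutually exclusive')
    if bypass_hooks:
      return 'upload without running hooks'
    if allow_all_hooks:
      return 'run hooks without prompting, then upload'
    if not stdout_is_tty:
      return 'cancel upload'           # would need to prompt, but cannot
    if user_agrees:
      return 'run hooks, then upload'
    return 'cancel upload'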
def _SingleBranch(self, opt, branch, people):
project = branch.project
@ -312,6 +339,25 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
pending = []
reviewers = []
cc = []
branch = None
if opt.branch:
branch = opt.branch
for project in project_list:
avail = project.GetUploadableBranches(branch)
if avail:
pending.append((project, avail))
if pending and (not opt.bypass_hooks):
hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
self.manifest.topdir, abort_if_user_denies=True)
pending_proj_names = [project.name for (project, avail) in pending]
try:
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names)
except HookError, e:
print >>sys.stderr, "ERROR: %s" % str(e)
return
if opt.reviewers:
reviewers = _SplitEmails(opt.reviewers)
@ -319,11 +365,6 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
cc = _SplitEmails(opt.cc)
people = (reviewers,cc)
for project in project_list:
avail = project.GetUploadableBranches()
if avail:
pending.append((project, avail))
if not pending:
print >>sys.stdout, "no branches ready for upload"
elif len(pending) == 1 and len(pending[0][1]) == 1: