Compare commits


23 Commits

Author SHA1 Message Date
0ce6ca9c7b Fix mirror clients with no worktree
Commit "Make path references OS independent" (df14a70c45)
broke mirror clients by trying to invoke replace() on None
when there is no worktree.

Change-Id: Ie0a187058358f7dcdf83119e45cc65409c980f11
2011-01-10 13:26:34 -08:00
0fc3a39829 Bump repo version to 1,10
Change-Id: Ifdc041e7152af31de413b9269f20000acd945b3b
2011-01-10 09:01:24 -08:00
c7c57e34db help: Don't show empty Summary or Description sections
Signed-off-by: Shawn O. Pearce <sop@google.com>
(cherry picked from commit 60e679209a)
2011-01-09 17:39:22 -08:00
0d2b61f11d sync: Run git gc --auto after fetch
Users may wind up with a lot of loose object content in projects they
don't frequently make changes in, but that are modified by others.

Since we bypass many git code paths that would have otherwise called
out to `git gc --auto`, it's possible for these projects to have
their loose object database grow out of control.  To help prevent
that, we now invoke it ourselves during the network half of sync.

Signed-off-by: Shawn O. Pearce <sop@google.com>
(cherry picked from commit 1875ddd47c)
2011-01-09 17:39:22 -08:00
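
A minimal sketch of the idea behind this change, assuming an iterable of objects that expose a gitdir attribute the way repo's Project objects do; repo itself calls project.bare_git.gc('--auto') after the fetch loop, as shown in the sync.py hunk further down:

import subprocess

def gc_after_fetch(projects):
    for project in projects:
        # --auto lets git decide whether a repack is actually needed, so this
        # stays cheap when a project's object database is already tidy.
        subprocess.check_call(['git', '--git-dir', project.gitdir,
                               'gc', '--auto'])
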
2bf9db0d3b Add "repo branch" as an alias for "repo branches"
For those of us that are used to typing "git branch".

Signed-off-by: Mike Lockwood <lockwood@android.com>
(cherry picked from commit 33f0e786bb)
2011-01-09 17:39:22 -08:00
f00e0ce556 upload: Catch and cleanly report connectivity errors
Instead of giving a Python backtrace when there is a connectivity
problem during repo upload, report that we cannot access the host,
and why, with a halfway decent error message.

Bug: REPO-45
Change-Id: I9a45b387e86e48073a2d99bd6d594c1a7d6d99d4
Signed-off-by: Shawn O. Pearce <sop@google.com>
(cherry picked from commit d2dfac81ad)
2011-01-09 17:39:22 -08:00
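
A minimal sketch of the reporting style this change aims for (probe_review_server is a hypothetical helper; the real handling is in the git_config.py hunk below, which catches urllib2.URLError around the review-server probe):

import sys
import urllib2

def probe_review_server(url):
    try:
        return urllib2.urlopen(url).read()
    except urllib2.URLError, e:
        # Report the host and the underlying reason instead of a traceback.
        print >>sys.stderr, 'error: cannot access %s: %s' % (url, e.reason)
        sys.exit(1)
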
1b5a4a0c5d forall: Silently skip missing projects
If a project is missing locally, it might be OK to skip over it
and continue running the same command in other projects.

Bug: REPO-43
Change-Id: I64f97eb315f379ab2c51fc53d24ed340b3d09250
Signed-off-by: Shawn O. Pearce <sop@google.com>
(cherry picked from commit d4cd69bdef)
2011-01-09 17:39:22 -08:00
de8b2c4276 Fix to display the usage message of the command download when the user
doesn't provide any arguments to 'repo download'.

Signed-off-by: Thiago Farina <thiago.farina@gmail.com>
(cherry picked from commit 840ed0fab7)
2011-01-09 17:39:22 -08:00
727ee98a40 Use os.environ.copy() instead of dict()
Signed-off-by: Shawn O. Pearce <sop@google.com>
(cherry picked from commit 3218c13205)
2011-01-09 17:39:22 -08:00
df14a70c45 Make path references OS independent
Change-Id: I5573995adfd52fd54bddc62d1d1ea78fb1328130
(cherry picked from commit b0f9a02394)

Conflicts:

	command.py
2011-01-09 17:39:19 -08:00
f18cb76173 Encode the environment variables passed to git
Windows allows the environment to have unicode values.
This will cause Python to fail to execute the command.

Change-Id: I37d922c3d7ced0d5b4883f0220346ac42defc5e9
Signed-off-by: Shawn O. Pearce <sop@google.com>
2011-01-09 16:13:56 -08:00
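
A minimal sketch of the workaround, mirroring the _setenv helper added to git_command.py in this change (run_git is a hypothetical wrapper):

import os
import subprocess

def _setenv(env, name, value):
    # Always store byte strings: Python 2 on Windows cannot spawn a child
    # process when any environment value is a unicode string.
    env[name] = value.encode()

def run_git(cmdv):
    env = os.environ.copy()
    _setenv(env, 'GIT_EDITOR', u':')
    return subprocess.call(['git'] + cmdv, env=env)
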
d3fd537ea5 Exit with statuscode 0 for repo help init
The complete help text is printed, so the program executed successfully.

Some tools (like OpenGrok) detect the availability of a program by
running it with a known set of options and checking the return code.
It is an easy and portable way of checking for the existence of a program
instead of searching the path (and handling extensions) ourselves.

Change-Id: Ic13428c77be4a36d599ccb8c86d893308818eae3
2011-01-09 16:10:04 -08:00
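
The probe such tools rely on looks roughly like this (a sketch, not OpenGrok's actual detection code): run the command with a known option and treat a zero exit status as proof that the program is installed.

import os
import subprocess

def repo_available():
    try:
        with open(os.devnull, 'w') as devnull:
            # After this change `repo help init` prints the help text and
            # exits 0, so the return code is a reliable availability signal.
            return subprocess.call(['repo', 'help', 'init'],
                                   stdout=devnull, stderr=devnull) == 0
    except OSError:
        return False  # repo is not on PATH at all
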
0048b69c03 Fixed race condition in 'repo sync -jN' that would open multiple masters.
This fixes the SSH Control Masters to be managed in a thread-safe
fashion.  This is important because "repo sync -jN" uses threads to
sync more than one repository at the same time.  The problem didn't
show up earlier because it was masked if all of the threads tried to
connect to the same host that was used on the "repo init" line.
2010-12-21 13:39:23 -08:00
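
The fix serializes control-master startup behind a module-level lock (see the git_config.py hunk below). A minimal sketch of the pattern, where start_master stands in for the expensive step that spawns `ssh -M -N host`:

import threading

_master_keys = set()
_master_keys_lock = threading.Lock()

def ensure_master(key, start_master):
    # The lock makes the check-then-start sequence atomic across the worker
    # threads used by `repo sync -jN`, so only one master is opened per host.
    _master_keys_lock.acquire()
    try:
        if key in _master_keys:
            return True
        start_master(key)
        _master_keys.add(key)
        return True
    finally:
        _master_keys_lock.release()
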
2b8db3ce3e Added feature to print a <notice> from manifest at the end of a sync.
This feature is used to convey information on when a branch has
ceased development or if it is an experimental branch with a few
gotchas, etc.

You add it to your manifest XML by doing something like this:
<manifest>
  <notice>
    NOTE TO DEVELOPERS:
      If you checkin code, you have to pinky-swear that it contains no bugs.
      Anyone who breaks their promise will have tomatoes thrown at them in the
      team meeting.  Be sure to bring an extra set of clothes.
  </notice>

  <remote ... />
  ...
</manifest>

Carriage returns and indentation are relevant for the text in this tag.

This feature was requested by Anush Elangovan on the ChromiumOS team.
2010-11-01 15:08:06 -07:00
5df6de075e sync: Use --force-broken to continue other projects
This adds a new flag -f/--force-broken that will allow the rest of
the sync process to continue instead of bailing when a particular
project fails to sync.

Change-Id: I23680f2ee7927410f7ed930b1d469424c9aa246e
Signed-off-by: Andrei Warkentin <andreiw@motorola.com>
Signed-off-by: Shawn O. Pearce <sop@google.com>
2010-10-29 12:20:01 -07:00
a0de6e8eab upload: Remove --replace option
It hasn't been necessary for a long time, and its
functionality can be accomplished with 'git push'.

Change-Id: Ic00d3adbe4cee7be3955117489c69d6e90106559
2010-10-29 12:12:56 -07:00
16614f86b3 sync --quiet: be more quiet
Change-Id: I5e8363c7b32e4546d1236cfc5a32e01c3e5ea8e6
Signed-off-by: Shawn O. Pearce <sop@google.com>
2010-10-29 12:08:57 -07:00
88443387b1 sync: Enable use of git clone --reference
Use git clone to initialize a new repository, and when possible
allow callers to use --reference to reuse an existing checkout as
the initial object storage area for the new checkout.

Change-Id: Ie27f760247f311ce484c6d3e85a90d94da2febfc
Signed-off-by: Shawn O. Pearce <sop@google.com>
2010-10-29 12:08:50 -07:00
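
The mechanism is git's alternates file: the new checkout records the reference checkout's object directory, so objects already present there are never fetched again (the project.py hunk below does this in _InitGitDir). A minimal sketch, where gitdir and ref_dir are assumed paths:

import os

def add_alternate(gitdir, ref_dir):
    alt = os.path.join(gitdir, 'objects', 'info', 'alternates')
    if not os.path.isdir(os.path.dirname(alt)):
        os.makedirs(os.path.dirname(alt))
    fd = open(alt, 'w')
    try:
        # git consults this path for objects before asking the remote.
        fd.write(os.path.join(ref_dir, 'objects') + '\n')
    finally:
        fd.close()
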
99482ae58a Only delete corrupt pickle config files if they exist
os.remove() raises OSError if the file being removed doesn't exist.
Check before calling to ensure we don't raise a useless exception
on an already deleted file.

Change-Id: I44c1c7dd97a47fcab8afb6c18fdf179158b6dab7
Signed-off-by: Shawn O. Pearce <sop@google.com>
2010-10-29 08:25:04 -07:00
ec1df9b7f6 Don't allow git fetch to start ControlMaster
To avoid connectivity problems, we don't want the ssh process
that is started by git fetch to become a ControlMaster for the
overall sync task.  If it did, we would lose connectivity when
git fetch was finished with the current project, causing later
projects to not fetch efficiently.

Change-Id: I8d0dcf9b361276ff8c8b5a6324cbd4a501e9c4dd
Signed-off-by: Shawn O. Pearce <sop@google.com>
2010-10-29 08:15:14 -07:00
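
repo achieves this by pointing GIT_SSH at a wrapper that passes -o "ControlMaster no" (the repo-ssh change below). A rough sketch of the same idea from Python, where proxy_script and sock_path are assumed values rather than repo's real paths:

import os
import subprocess

def fetch_with_proxy(gitdir, remote, proxy_script, sock_path):
    env = os.environ.copy()
    # The wrapper runs `ssh -o "ControlMaster no" -o "ControlPath $REPO_SSH_SOCK"`,
    # so git shares the master that repo started without ever becoming the master.
    env['GIT_SSH'] = proxy_script
    env['REPO_SSH_SOCK'] = sock_path
    return subprocess.call(['git', '--git-dir', gitdir, 'fetch', remote],
                           env=env)
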
06d029c1c8 Check for existing SSH ControlMaster
Be more thorough about checking for an existing ssh master by
running a test command first, and only opening up a new master
if the test fails to connect.

Change-Id: I56fe8e7b4dbc123675b7f259e81d359ed0cd55cf
Signed-off-by: Shawn O. Pearce <sop@google.com>
2010-10-29 08:14:56 -07:00
b715b14807 Fix for handling values of EDITOR which contain a space.
The shell swallows the 0th arg, which was the filename. Simple fix
is to pass in an extra arg for the shell to swallow.

Change-Id: Iad6304ba9ccea6e7262ee06ef87d3dac57dbde81
2010-08-06 17:05:04 -07:00
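
A minimal sketch of why the extra argument works (launch_editor is hypothetical; the real change is the editor.py hunk below). With shell=True and a sequence, Python runs /bin/sh -c ARGS[0] ARGS[1] ..., and ARGS[1] becomes $0 inside the -c script:

import subprocess

def launch_editor(editor, path):
    # Without the dummy 'sh', the file name would be consumed as $0 and
    # "$@" would expand to nothing; with it, the file survives as $1.
    args = [editor + ' "$@"', 'sh', path]
    return subprocess.call(args, shell=True)
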
60829ba72f upload: Fix --replace flag
--replace started to fail due to a Python error; I forgot to pass
the opt structure through to the replace function.

Change-Id: Ifcd7a0c715c3fd9070a4c58208612a626382de35
Signed-off-by: Shawn O. Pearce <sop@google.com>
2010-07-16 07:42:45 -07:00
17 changed files with 390 additions and 178 deletions

View File

@ -74,7 +74,7 @@ class Command(object):
project = all.get(arg)
if not project:
path = os.path.abspath(arg)
path = os.path.abspath(arg).replace('\\', '/')
if not by_path:
by_path = dict()
@ -82,13 +82,15 @@ class Command(object):
by_path[p.worktree] = p
if os.path.exists(path):
oldpath = None
while path \
and path != '/' \
and path != oldpath \
and path != self.manifest.topdir:
try:
project = by_path[path]
break
except KeyError:
oldpath = path
path = os.path.dirname(path)
else:
try:

View File

@ -20,12 +20,15 @@ A manifest XML file (e.g. 'default.xml') roughly conforms to the
following DTD:
<!DOCTYPE manifest [
<!ELEMENT manifest (remote*,
<!ELEMENT manifest (notice?,
remote*,
default?,
manifest-server?,
remove-project*,
project*)>
<!ELEMENT notice (#PCDATA)>
<!ELEMENT remote (EMPTY)>
<!ATTLIST remote name ID #REQUIRED>
<!ATTLIST remote fetch CDATA #REQUIRED>

View File

@ -82,7 +82,7 @@ least one of these before using this command."""
fd = None
if re.compile("^.*[$ \t'].*$").match(editor):
args = [editor + ' "$@"']
args = [editor + ' "$@"', 'sh']
shell = True
else:
args = [editor]

View File

@ -112,6 +112,9 @@ def git_require(min_version, fail=False):
sys.exit(1)
return False
def _setenv(env, name, value):
env[name] = value.encode()
class GitCommand(object):
def __init__(self,
project,
@ -124,7 +127,7 @@ class GitCommand(object):
ssh_proxy = False,
cwd = None,
gitdir = None):
env = dict(os.environ)
env = os.environ.copy()
for e in [REPO_TRACE,
GIT_DIR,
@ -137,10 +140,10 @@ class GitCommand(object):
del env[e]
if disable_editor:
env['GIT_EDITOR'] = ':'
_setenv(env, 'GIT_EDITOR', ':')
if ssh_proxy:
env['REPO_SSH_SOCK'] = ssh_sock()
env['GIT_SSH'] = _ssh_proxy()
_setenv(env, 'REPO_SSH_SOCK', ssh_sock())
_setenv(env, 'GIT_SSH', _ssh_proxy())
if project:
if not cwd:
@ -151,7 +154,7 @@ class GitCommand(object):
command = [GIT]
if bare:
if gitdir:
env[GIT_DIR] = gitdir
_setenv(env, GIT_DIR, gitdir)
cwd = None
command.extend(cmdv)

View File

@ -18,7 +18,13 @@ import os
import re
import subprocess
import sys
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
import time
import urllib2
from signal import SIGTERM
from urllib2 import urlopen, HTTPError
from error import GitError, UploadError
@ -257,9 +263,11 @@ class GitConfig(object):
finally:
fd.close()
except IOError:
os.remove(self._pickle)
if os.path.exists(self._pickle):
os.remove(self._pickle)
except cPickle.PickleError:
os.remove(self._pickle)
if os.path.exists(self._pickle):
os.remove(self._pickle)
def _ReadGit(self):
"""
@ -356,60 +364,110 @@ class RefSpec(object):
return s
_ssh_cache = {}
_master_processes = []
_master_keys = set()
_ssh_master = True
_master_keys_lock = None
def init_ssh():
"""Should be called once at the start of repo to init ssh master handling.
At the moment, all we do is to create our lock.
"""
global _master_keys_lock
assert _master_keys_lock is None, "Should only call init_ssh once"
_master_keys_lock = _threading.Lock()
def _open_ssh(host, port=None):
global _ssh_master
if port is not None:
key = '%s:%s' % (host, port)
else:
key = host
if key in _ssh_cache:
return True
if not _ssh_master \
or 'GIT_SSH' in os.environ \
or sys.platform in ('win32', 'cygwin'):
# failed earlier, or cygwin ssh can't do this
#
return False
command = ['ssh',
'-o','ControlPath %s' % ssh_sock(),
'-M',
'-N',
host]
if port is not None:
command[3:3] = ['-p',str(port)]
# Acquire the lock. This is needed to prevent opening multiple masters for
# the same host when we're running "repo sync -jN" (for N > 1) _and_ the
# manifest <remote fetch="ssh://xyz"> specifies a different host from the
# one that was passed to repo init.
_master_keys_lock.acquire()
try:
Trace(': %s', ' '.join(command))
p = subprocess.Popen(command)
except Exception, e:
_ssh_master = False
print >>sys.stderr, \
'\nwarn: cannot enable ssh control master for %s:%s\n%s' \
% (host,port, str(e))
return False
_ssh_cache[key] = p
time.sleep(1)
return True
# Check to see whether we already think that the master is running; if we
# think it's already running, return right away.
if port is not None:
key = '%s:%s' % (host, port)
else:
key = host
if key in _master_keys:
return True
if not _ssh_master \
or 'GIT_SSH' in os.environ \
or sys.platform in ('win32', 'cygwin'):
# failed earlier, or cygwin ssh can't do this
#
return False
# We will make two calls to ssh; this is the common part of both calls.
command_base = ['ssh',
'-o','ControlPath %s' % ssh_sock(),
host]
if port is not None:
command_base[1:1] = ['-p',str(port)]
# Since the key wasn't in _master_keys, we think that master isn't running.
# ...but before actually starting a master, we'll double-check. This can
# be important because we can't tell that 'git@myhost.com' is the same
# as 'myhost.com' where "User git" is setup in the user's ~/.ssh/config file.
check_command = command_base + ['-O','check']
try:
Trace(': %s', ' '.join(check_command))
check_process = subprocess.Popen(check_command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
check_process.communicate() # read output, but ignore it...
isnt_running = check_process.wait()
if not isnt_running:
# Our double-check found that the master _was_ in fact running. Add to
# the list of keys.
_master_keys.add(key)
return True
except Exception:
# Ignore exceptions. We will fall back to the normal command and print
# to the log there.
pass
command = command_base[:1] + \
['-M', '-N'] + \
command_base[1:]
try:
Trace(': %s', ' '.join(command))
p = subprocess.Popen(command)
except Exception, e:
_ssh_master = False
print >>sys.stderr, \
'\nwarn: cannot enable ssh control master for %s:%s\n%s' \
% (host,port, str(e))
return False
_master_processes.append(p)
_master_keys.add(key)
time.sleep(1)
return True
finally:
_master_keys_lock.release()
def close_ssh():
global _master_keys_lock
terminate_ssh_clients()
for key,p in _ssh_cache.iteritems():
for p in _master_processes:
try:
os.kill(p.pid, SIGTERM)
p.wait()
except OSError:
pass
_ssh_cache.clear()
del _master_processes[:]
_master_keys.clear()
d = ssh_sock(create=False)
if d:
@ -418,6 +476,9 @@ def close_ssh():
except OSError:
pass
# We're done with the lock, so we can delete it.
_master_keys_lock = None
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+]*)://([^@/]*@?[^/]*)/')
@ -504,23 +565,25 @@ class Remote(object):
try:
info = urlopen(u).read()
if info == 'NOT_AVAILABLE':
raise UploadError('Upload over ssh unavailable')
raise UploadError('%s: SSH disabled' % self.review)
if '<' in info:
# Assume the server gave us some sort of HTML
# response back, like maybe a login page.
#
raise UploadError('Cannot read %s:\n%s' % (u, info))
raise UploadError('%s: Cannot parse response' % u)
self._review_protocol = 'ssh'
self._review_host = info.split(" ")[0]
self._review_port = info.split(" ")[1]
except urllib2.URLError, e:
raise UploadError('%s: %s' % (self.review, e.reason[1]))
except HTTPError, e:
if e.code == 404:
self._review_protocol = 'http-post'
self._review_host = None
self._review_port = None
else:
raise UploadError('Cannot guess Gerrit version')
raise UploadError('Upload over ssh unavailable')
REVIEW_CACHE[u] = (
self._review_protocol,

View File

@ -1,2 +1,2 @@
#!/bin/sh
exec ssh -o "ControlPath $REPO_SSH_SOCK" "$@"
exec ssh -o "ControlMaster no" -o "ControlPath $REPO_SSH_SOCK" "$@"

View File

@ -28,7 +28,7 @@ import re
import sys
from trace import SetTrace
from git_config import close_ssh
from git_config import init_ssh, close_ssh
from command import InteractiveCommand
from command import MirrorSafeCommand
from command import PagedCommand
@ -61,6 +61,8 @@ class _Repo(object):
def __init__(self, repodir):
self.repodir = repodir
self.commands = all_commands
# add 'branch' as an alias for 'branches'
all_commands['branch'] = all_commands['branches']
def _Run(self, argv):
name = None
@ -214,6 +216,7 @@ def _Main(argv):
repo = _Repo(opt.repodir)
try:
try:
init_ssh()
repo._Run(argv)
finally:
close_ssh()

View File

@ -107,6 +107,15 @@ class XmlManifest(object):
root = doc.createElement('manifest')
doc.appendChild(root)
# Save out the notice. There's a little bit of work here to give it the
# right whitespace, which assumes that the notice is automatically indented
# by 4 by minidom.
if self.notice:
notice_element = root.appendChild(doc.createElement('notice'))
notice_lines = self.notice.splitlines()
indented_notice = ('\n'.join(" "*4 + line for line in notice_lines))[4:]
notice_element.appendChild(doc.createTextNode(indented_notice))
d = self.default
sort_remotes = list(self.remotes.keys())
sort_remotes.sort()
@ -179,6 +188,11 @@ class XmlManifest(object):
self._Load()
return self._default
@property
def notice(self):
self._Load()
return self._notice
@property
def manifest_server(self):
self._Load()
@ -193,6 +207,7 @@ class XmlManifest(object):
self._projects = {}
self._remotes = {}
self._default = None
self._notice = None
self.branch = None
self._manifest_server = None
@ -263,6 +278,14 @@ class XmlManifest(object):
if self._default is None:
self._default = _Default()
for node in config.childNodes:
if node.nodeName == 'notice':
if self._notice is not None:
raise ManifestParseError, \
'duplicate notice in %s' % \
(self.manifestFile)
self._notice = self._ParseNotice(node)
for node in config.childNodes:
if node.nodeName == 'manifest-server':
url = self._reqatt(node, 'url')
@ -338,6 +361,45 @@ class XmlManifest(object):
d.revisionExpr = None
return d
def _ParseNotice(self, node):
"""
reads a <notice> element from the manifest file
The <notice> element is distinct from other tags in the XML in that the
data is conveyed between the start and end tag (it's not an empty-element
tag).
The white space (carriage returns, indentation) for the notice element is
relevant and is parsed in a way that is based on how python docstrings work.
In fact, the code is remarkably similar to here:
http://www.python.org/dev/peps/pep-0257/
"""
# Get the data out of the node...
notice = node.childNodes[0].data
# Figure out minimum indentation, skipping the first line (the same line
# as the <notice> tag)...
minIndent = sys.maxint
lines = notice.splitlines()
for line in lines[1:]:
lstrippedLine = line.lstrip()
if lstrippedLine:
indent = len(line) - len(lstrippedLine)
minIndent = min(indent, minIndent)
# Strip leading / trailing blank lines and also indentation.
cleanLines = [lines[0].strip()]
for line in lines[1:]:
cleanLines.append(line[minIndent:].rstrip())
# Clear completely blank lines from front and back...
while cleanLines and not cleanLines[0]:
del cleanLines[0]
while cleanLines and not cleanLines[-1]:
del cleanLines[-1]
return '\n'.join(cleanLines)
def _ParseProject(self, node):
"""
reads a <project> element from the manifest file
@ -373,7 +435,7 @@ class XmlManifest(object):
worktree = None
gitdir = os.path.join(self.topdir, '%s.git' % name)
else:
worktree = os.path.join(self.topdir, path)
worktree = os.path.join(self.topdir, path).replace('\\', '/')
gitdir = os.path.join(self.repodir, 'projects/%s.git' % path)
project = Project(manifest = self,

View File

@ -111,7 +111,6 @@ class ReviewableBranch(object):
self.project = project
self.branch = branch
self.base = base
self.replace_changes = None
@property
def name(self):
@ -151,7 +150,6 @@ class ReviewableBranch(object):
def UploadForReview(self, people, auto_topic=False):
self.project.UploadForReview(self.name,
self.replace_changes,
people,
auto_topic=auto_topic)
@ -238,8 +236,11 @@ class Project(object):
self.manifest = manifest
self.name = name
self.remote = remote
self.gitdir = gitdir
self.worktree = worktree
self.gitdir = gitdir.replace('\\', '/')
if worktree:
self.worktree = worktree.replace('\\', '/')
else:
self.worktree = None
self.relpath = relpath
self.revisionExpr = revisionExpr
@ -557,7 +558,6 @@ class Project(object):
return None
def UploadForReview(self, branch=None,
replace_changes=None,
people=([],[]),
auto_topic=False):
"""Uploads the named branch for code review.
@ -600,9 +600,6 @@ class Project(object):
cmd.append(branch.remote.SshReviewUrl(self.UserEmail))
cmd.append(ref_spec)
if replace_changes:
for change_id,commit_id in replace_changes.iteritems():
cmd.append('%s:refs/changes/%s/new' % (commit_id, change_id))
if GitCommand(self, cmd, bare = True).Wait() != 0:
raise UploadError('Upload failed')
@ -618,17 +615,19 @@ class Project(object):
## Sync ##
def Sync_NetworkHalf(self):
def Sync_NetworkHalf(self, quiet=False):
"""Perform only the network IO portion of the sync process.
Local working directory/branch state is not affected.
"""
if not self.Exists:
print >>sys.stderr
print >>sys.stderr, 'Initializing project %s ...' % self.name
is_new = not self.Exists
if is_new:
if not quiet:
print >>sys.stderr
print >>sys.stderr, 'Initializing project %s ...' % self.name
self._InitGitDir()
self._InitRemote()
if not self._RemoteFetch():
if not self._RemoteFetch(initial=is_new, quiet=quiet):
return False
#Check that the requested ref was found after fetch
@ -641,7 +640,7 @@ class Project(object):
#
rev = self.revisionExpr
if rev.startswith(R_TAGS):
self._RemoteFetch(None, rev[len(R_TAGS):])
self._RemoteFetch(None, rev[len(R_TAGS):], quiet=quiet)
if self.worktree:
self._InitMRef()
@ -1024,7 +1023,9 @@ class Project(object):
## Direct Git Commands ##
def _RemoteFetch(self, name=None, tag=None):
def _RemoteFetch(self, name=None, tag=None,
initial=False,
quiet=False):
if not name:
name = self.remote.name
@ -1032,17 +1033,84 @@ class Project(object):
if self.GetRemote(name).PreConnectFetch():
ssh_proxy = True
if initial:
alt = os.path.join(self.gitdir, 'objects/info/alternates')
try:
fd = open(alt, 'rb')
try:
ref_dir = fd.readline()
if ref_dir and ref_dir.endswith('\n'):
ref_dir = ref_dir[:-1]
finally:
fd.close()
except IOError, e:
ref_dir = None
if ref_dir and 'objects' == os.path.basename(ref_dir):
ref_dir = os.path.dirname(ref_dir)
packed_refs = os.path.join(self.gitdir, 'packed-refs')
remote = self.GetRemote(name)
all = self.bare_ref.all
ids = set(all.values())
tmp = set()
for r, id in GitRefs(ref_dir).all.iteritems():
if r not in all:
if r.startswith(R_TAGS) or remote.WritesTo(r):
all[r] = id
ids.add(id)
continue
if id in ids:
continue
r = 'refs/_alt/%s' % id
all[r] = id
ids.add(id)
tmp.add(r)
ref_names = list(all.keys())
ref_names.sort()
tmp_packed = ''
old_packed = ''
for r in ref_names:
line = '%s %s\n' % (all[r], r)
tmp_packed += line
if r not in tmp:
old_packed += line
_lwrite(packed_refs, tmp_packed)
else:
ref_dir = None
cmd = ['fetch']
if quiet:
cmd.append('--quiet')
if not self.worktree:
cmd.append('--update-head-ok')
cmd.append(name)
if tag is not None:
cmd.append('tag')
cmd.append(tag)
return GitCommand(self,
cmd,
bare = True,
ssh_proxy = ssh_proxy).Wait() == 0
ok = GitCommand(self,
cmd,
bare = True,
ssh_proxy = ssh_proxy).Wait() == 0
if initial:
if ref_dir:
if old_packed != '':
_lwrite(packed_refs, old_packed)
else:
os.remove(packed_refs)
self.bare_git.pack_refs('--all', '--prune')
return ok
def _Checkout(self, rev, quiet=False):
cmd = ['checkout']
@ -1080,6 +1148,27 @@ class Project(object):
os.makedirs(self.gitdir)
self.bare_git.init()
mp = self.manifest.manifestProject
ref_dir = mp.config.GetString('repo.reference')
if ref_dir:
mirror_git = os.path.join(ref_dir, self.name + '.git')
repo_git = os.path.join(ref_dir, '.repo', 'projects',
self.relpath + '.git')
if os.path.exists(mirror_git):
ref_dir = mirror_git
elif os.path.exists(repo_git):
ref_dir = repo_git
else:
ref_dir = None
if ref_dir:
_lwrite(os.path.join(self.gitdir, 'objects/info/alternates'),
os.path.join(ref_dir, 'objects') + '\n')
if self.manifest.IsMirror:
self.config.SetString('core.bare', 'true')
else:

repo
View File

@ -28,7 +28,7 @@ if __name__ == '__main__':
del magic
# increment this whenever we make important changes to this script
VERSION = (1, 8)
VERSION = (1, 10)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1,0)
@ -118,6 +118,9 @@ group.add_option('-m', '--manifest-name',
group.add_option('--mirror',
dest='mirror', action='store_true',
help='mirror the forrest')
group.add_option('--reference',
dest='reference',
help='location of mirror directory', metavar='DIR')
# Tool
group = init_optparse.add_option_group('repo Version options')
@ -256,8 +259,8 @@ def _SetupGnuPG(quiet):
gpg_dir, e.strerror)
sys.exit(1)
env = dict(os.environ)
env['GNUPGHOME'] = gpg_dir
env = os.environ.copy()
env['GNUPGHOME'] = gpg_dir.encode()
cmd = ['gpg', '--import']
try:
@ -375,8 +378,8 @@ def _Verify(cwd, branch, quiet):
% (branch, cur)
print >>sys.stderr
env = dict(os.environ)
env['GNUPGHOME'] = gpg_dir
env = os.environ.copy()
env['GNUPGHOME'] = gpg_dir.encode()
cmd = [GIT, 'tag', '-v', cur]
proc = subprocess.Popen(cmd,
@ -427,10 +430,14 @@ def _FindRepo():
dir = os.getcwd()
repo = None
while dir != '/' and not repo:
olddir = None
while dir != '/' \
and dir != olddir \
and not repo:
repo = os.path.join(dir, repodir, REPO_MAIN)
if not os.path.isfile(repo):
repo = None
olddir = dir
dir = os.path.dirname(dir)
return (repo, os.path.join(dir, repodir))
@ -476,6 +483,7 @@ def _Help(args):
if args:
if args[0] == 'init':
init_optparse.print_help()
sys.exit(0)
else:
print >>sys.stderr,\
"error: '%s' is not a bootstrap command.\n"\

View File

@ -36,6 +36,9 @@ makes it available in your project's local working directory.
pass
def _ParseChangeIds(self, args):
if not args:
self.Usage()
to_get = []
project = None

View File

@ -151,11 +151,11 @@ terminal and are not redirected.
first = True
for project in self.GetProjects(args):
env = dict(os.environ.iteritems())
env = os.environ.copy()
def setenv(name, val):
if val is None:
val = ''
env[name] = val
env[name] = val.encode()
setenv('REPO_PROJECT', project.name)
setenv('REPO_PATH', project.relpath)
@ -169,6 +169,12 @@ terminal and are not redirected.
else:
cwd = project.worktree
if not os.path.exists(cwd):
if (opt.project_header and opt.verbose) \
or not opt.project_header:
print >>sys.stderr, 'skipping %s/' % project.relpath
continue
if opt.project_header:
stdin = subprocess.PIPE
stdout = subprocess.PIPE

View File

@ -94,6 +94,8 @@ See 'repo help --all' for a complete list of recognized commands.
body = getattr(cmd, bodyAttr)
except AttributeError:
return
if body == '' or body is None:
return
self.nl()

View File

@ -41,6 +41,13 @@ The optional -m argument can be used to specify an alternate manifest
to be used. If no manifest is specified, the manifest default.xml
will be used.
The --reference option can be used to point to a directory that
has the content of a --mirror sync. This will make the working
directory use as much data as possible from the local reference
directory when fetching from the server. This will make the sync
go a lot faster by reducing data traffic on the network.
Switching Manifest Branches
---------------------------
@ -71,7 +78,9 @@ to update the working directory files.
g.add_option('--mirror',
dest='mirror', action='store_true',
help='mirror the forrest')
g.add_option('--reference',
dest='reference',
help='location of mirror directory', metavar='DIR')
# Tool
g = p.add_option_group('repo Version options')
@ -115,6 +124,9 @@ to update the working directory files.
r.ResetFetch()
r.Save()
if opt.reference:
m.config.SetString('repo.reference', opt.reference)
if opt.mirror:
if is_new:
m.config.SetString('repo.mirror', 'true')

View File

@ -55,6 +55,7 @@ need to be performed by an end-user.
print >>sys.stderr, "error: can't update repo"
sys.exit(1)
rp.bare_git.gc('--auto')
_PostRepoFetch(rp,
no_repo_verify = opt.no_repo_verify,
verbose = True)

View File

@ -70,6 +70,9 @@ The -s/--smart-sync option can be used to sync to a known good
build as specified by the manifest-server element in the current
manifest.
The -f/--force-broken option can be used to proceed with syncing
other projects if a project sync fails.
SSH Connections
---------------
@ -101,6 +104,9 @@ later is required to fix a server side protocol bug.
"""
def _Options(self, p, show_smart=True):
p.add_option('-f', '--force-broken',
dest='force_broken', action='store_true',
help="continue sync even if a project fails to sync")
p.add_option('-l','--local-only',
dest='local_only', action='store_true',
help="only update working tree, don't fetch")
@ -110,6 +116,9 @@ later is required to fix a server side protocol bug.
p.add_option('-d','--detach',
dest='detach_head', action='store_true',
help='detach projects back to manifest revision')
p.add_option('-q','--quiet',
dest='quiet', action='store_true',
help='be more quiet')
p.add_option('-j','--jobs',
dest='jobs', action='store', type='int',
help="number of projects to fetch simultaneously")
@ -126,11 +135,14 @@ later is required to fix a server side protocol bug.
dest='repo_upgraded', action='store_true',
help=SUPPRESS_HELP)
def _FetchHelper(self, project, lock, fetched, pm, sem):
if not project.Sync_NetworkHalf():
def _FetchHelper(self, opt, project, lock, fetched, pm, sem):
if not project.Sync_NetworkHalf(quiet=opt.quiet):
print >>sys.stderr, 'error: Cannot fetch %s' % project.name
sem.release()
sys.exit(1)
if opt.force_broken:
print >>sys.stderr, 'warn: --force-broken, continuing to sync'
else:
sem.release()
sys.exit(1)
lock.acquire()
fetched.add(project.gitdir)
@ -138,18 +150,21 @@ later is required to fix a server side protocol bug.
lock.release()
sem.release()
def _Fetch(self, projects):
def _Fetch(self, projects, opt):
fetched = set()
pm = Progress('Fetching projects', len(projects))
if self.jobs == 1:
for project in projects:
pm.update()
if project.Sync_NetworkHalf():
if project.Sync_NetworkHalf(quiet=opt.quiet):
fetched.add(project.gitdir)
else:
print >>sys.stderr, 'error: Cannot fetch %s' % project.name
sys.exit(1)
if opt.force_broken:
print >>sys.stderr, 'warn: --force-broken, continuing to sync'
else:
sys.exit(1)
else:
threads = set()
lock = _threading.Lock()
@ -157,7 +172,12 @@ later is required to fix a server side protocol bug.
for project in projects:
sem.acquire()
t = _threading.Thread(target = self._FetchHelper,
args = (project, lock, fetched, pm, sem))
args = (opt,
project,
lock,
fetched,
pm,
sem))
threads.add(t)
t.start()
@ -165,6 +185,8 @@ later is required to fix a server side protocol bug.
t.join()
pm.end()
for project in projects:
project.bare_git.gc('--auto')
return fetched
def UpdateProjectList(self):
@ -249,7 +271,7 @@ uncommitted changes are present' % project.relpath
if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
env = dict(os.environ)
env = os.environ.copy()
if (env.has_key('TARGET_PRODUCT') and
env.has_key('TARGET_BUILD_VARIANT')):
target = '%s-%s' % (env['TARGET_PRODUCT'],
@ -291,7 +313,7 @@ uncommitted changes are present' % project.relpath
_PostRepoUpgrade(self.manifest)
if not opt.local_only:
mp.Sync_NetworkHalf()
mp.Sync_NetworkHalf(quiet=opt.quiet)
if mp.HasChanges:
syncbuf = SyncBuffer(mp.config)
@ -308,7 +330,7 @@ uncommitted changes are present' % project.relpath
to_fetch.append(rp)
to_fetch.extend(all)
fetched = self._Fetch(to_fetch)
fetched = self._Fetch(to_fetch, opt)
_PostRepoFetch(rp, opt.no_repo_verify)
if opt.network_only:
# bail out now; the rest touches the working tree
@ -320,7 +342,7 @@ uncommitted changes are present' % project.relpath
for project in all:
if project.gitdir not in fetched:
missing.append(project)
self._Fetch(missing)
self._Fetch(missing, opt)
if self.manifest.IsMirror:
# bail out now, we have no working tree
@ -341,6 +363,11 @@ uncommitted changes are present' % project.relpath
if not syncbuf.Finish():
sys.exit(1)
# If there's a notice that's supposed to print at the end of the sync, print
# it now...
if self.manifest.notice:
print self.manifest.notice
def _PostRepoUpgrade(manifest):
for project in manifest.projects.values():
if project.Exists:
@ -388,9 +415,9 @@ warning: Cannot automatically authenticate repo."""
% (project.name, rev)
return False
env = dict(os.environ)
env['GIT_DIR'] = project.gitdir
env['GNUPGHOME'] = gpg_dir
env = os.environ.copy()
env['GIT_DIR'] = project.gitdir.encode()
env['GNUPGHOME'] = gpg_dir.encode()
cmd = [GIT, 'tag', '-v', cur]
proc = subprocess.Popen(cmd,

View File

@ -47,7 +47,7 @@ class Upload(InteractiveCommand):
common = True
helpSummary = "Upload changes for code review"
helpUsage="""
%prog [--re --cc] {[<project>]... | --replace <project>}
%prog [--re --cc] [<project>]...
"""
helpDescription = """
The '%prog' command is used to send changes to the Gerrit Code
@ -67,12 +67,6 @@ added to the respective list of users, and emails are sent to any
new users. Users passed as --reviewers must already be registered
with the code review system, or the upload will fail.
If the --replace option is passed the user can designate which
existing change(s) in Gerrit match up to the commits in the branch
being uploaded. For each matched pair of change,commit the commit
will be added as a new patch set, completely replacing the set of
files and description associated with the change in Gerrit.
Configuration
-------------
@ -119,9 +113,6 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
p.add_option('-t',
dest='auto_topic', action='store_true',
help='Send local branch name to Gerrit Code Review')
p.add_option('--replace',
dest='replace', action='store_true',
help='Upload replacement patchesets from this branch')
p.add_option('--re', '--reviewers',
type='string', action='append', dest='reviewers',
help='Request reviews from these people.')
@ -262,65 +253,6 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
except:
return ""
def _ReplaceBranch(self, project, people):
branch = project.CurrentBranch
if not branch:
print >>sys.stdout, "no branches ready for upload"
return
branch = project.GetUploadableBranch(branch)
if not branch:
print >>sys.stdout, "no branches ready for upload"
return
script = []
script.append('# Replacing from branch %s' % branch.name)
if len(branch.commits) == 1:
change = self._FindGerritChange(branch)
script.append('[%-6s] %s' % (change, branch.commits[0]))
else:
for commit in branch.commits:
script.append('[ ] %s' % commit)
script.append('')
script.append('# Insert change numbers in the brackets to add a new patch set.')
script.append('# To create a new change record, leave the brackets empty.')
script = Editor.EditString("\n".join(script)).split("\n")
change_re = re.compile(r'^\[\s*(\d{1,})\s*\]\s*([0-9a-f]{1,}) .*$')
to_replace = dict()
full_hashes = branch.unabbrev_commits
for line in script:
m = change_re.match(line)
if m:
c = m.group(1)
f = m.group(2)
try:
f = full_hashes[f]
except KeyError:
print 'fh = %s' % full_hashes
print >>sys.stderr, "error: commit %s not found" % f
sys.exit(1)
if c in to_replace:
print >>sys.stderr,\
"error: change %s cannot accept multiple commits" % c
sys.exit(1)
to_replace[c] = f
if not to_replace:
print >>sys.stderr, "error: no replacements specified"
print >>sys.stderr, " use 'repo upload' without --replace"
sys.exit(1)
if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD:
if not _ConfirmManyUploads(multiple_branches=True):
_die("upload aborted by user")
branch.replace_changes = to_replace
self._UploadAndReport(opt, [branch], people)
def _UploadAndReport(self, opt, todo, original_people):
have_errors = False
for branch in todo:
@ -351,15 +283,19 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
have_errors = True
print >>sys.stderr, ''
print >>sys.stderr, '--------------------------------------------'
print >>sys.stderr, '----------------------------------------------------------------------'
if have_errors:
for branch in todo:
if not branch.uploaded:
print >>sys.stderr, '[FAILED] %-15s %-15s (%s)' % (
if len(str(branch.error)) <= 30:
fmt = ' (%s)'
else:
fmt = '\n (%s)'
print >>sys.stderr, ('[FAILED] %-15s %-15s' + fmt) % (
branch.project.relpath + '/', \
branch.name, \
branch.error)
str(branch.error))
print >>sys.stderr, ''
for branch in todo:
@ -383,14 +319,6 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
cc = _SplitEmails(opt.cc)
people = (reviewers,cc)
if opt.replace:
if len(project_list) != 1:
print >>sys.stderr, \
'error: --replace requires exactly one project'
sys.exit(1)
self._ReplaceBranch(project_list[0], people)
return
for project in project_list:
avail = project.GetUploadableBranches()
if avail: