Compare commits

...

19 Commits

SHA1 Message Date
9fa44db94b Introduce 'repo abandon <branchname>' as an alias for 'git branch -D'
This destroys a local development branch, removing all record
that the branch ever existed.  If the branch is currently
checked out, we move back to the upstream revision.
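
For example (the branch name here is illustrative):

  $ repo abandon my-topic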

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-03 11:24:59 -08:00
c9ef744c7b Install a default pre-auto-gc hook in all repositories
This hook is evaluated by `git gc --auto` to determine if it is a
good idea to execute a GC at this time, or defer it to some later
date.  When working on a laptop it's a good idea to avoid GC if you
are on battery power, as the extra CPU and disk IO would consume a
decent amount of the charge.

The hook is the standard sample hook from git.git contrib/hooks,
last modified in git.git by 84ed4c5d117d72f02cc918e413b9861a9d2846d7.
I added the GPLv2 header to the script to ensure the license notice
is clear, as it does not match repo's own APLv2 license.

We only update hooks during initial repository creation or on
a repo sync.  This way we don't incur huge overheads from the
hook stat operations during "repo status" or even the normal
"repo sync" cases.

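`git gc --auto` consults this hook before packing and skips the
GC when the hook exits non-zero, so on battery power you would
see something like:

  $ git gc --auto
  Auto packing deferred; not on AC
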
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-03 11:00:44 -08:00
438ee1cad9 Catch symlink creation failures and report a better error
Some users have noticed that repo doesn't work on VFAT, as we
require a POSIX filesystem with POSIX symlink support.  Catching the
OSError during our symlink creation and raising a GitError with a
more descriptive message will help users to troubleshoot and fix
their own installation problems.
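
The guard looks roughly like this (taken from the project.py
hunk below):

  try:
    os.symlink(relpath(src, dst), dst)
  except OSError, e:
    if e.errno == errno.EPERM:
      raise GitError('filesystem must support symlinks')
    else:
      raise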

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-03 09:59:36 -08:00
23d7781c0b Don't print "Already up-to-date" during repo sync
If we are already up-to-date, we just want to display no output.
This means we have to avoid calling "git merge" when there are no
commits to be merged into the working directory.
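
A sketch of the guard, using the helpers visible in this change
(upstream_gain holds the commits present only in the upstream):

  upstream_gain = self._revlist(not_rev('HEAD'), rev)
  if not upstream_gain:
    # Trivially no changes in the upstream; print nothing.
    return True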

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-30 11:06:57 -07:00
a54c527ae9 Fast-forward a fully merged topic branch during 'repo sync'
Instead of trying to rebase the changes on a topic branch that
has been fully merged into the upstream branch we track, we should
just fast-forward the topic branch to the new upstream revision.
This way the branch doesn't try to rewrite commits that are already
merged in the upstream.
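
At the git level this amounts to a fast-forward merge of the
topic branch, e.g. (upstream ref name illustrative):

  $ git merge --ff-only <upstream>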

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-30 11:03:00 -07:00
df830f1238 Remove import_tar, import_zip and the <snapshot> elements
Now that repo relies only on the git data stream (as it is much
faster to download) we don't really need to parse the <snapshot>
elements within the manifest.  It's a lot of complex code to
convert the tar (or zip) into a fast-import stream, and we just
aren't calling it anymore.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-30 09:21:43 -07:00
90be5c0839 Cache the per-user configuration to avoid duplicate instances
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-29 15:24:34 -07:00
7965f9fed0 Move the Editor configuration out of Manifest's constructor
This way we can build more than one Manifest instance in memory
and not muck around with the Editor configuration each time we
build a new instance.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-29 15:24:34 -07:00
de646819b8 Don't flip out if there are no template hooks
Git may have been installed without its hooks directory, which
means we won't have any hooks in a repo-created git repository.
Since we are just deleting the hooks anyway, it doesn't matter.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-29 14:38:12 -07:00
bd4edc9a69 Stop downloading snapshots as native git:// is faster
Downloading and streaming a tar into Git is slower than just
letting the native git:// protocol handle the data transfer,
especially when there are multiple revisions available and
Git can perform delta compression across revisions.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 16:14:05 -07:00
ce03a401c6 Stop hiding remote missing object errors
Hiding error messages from the remote peer is not a good idea,
as users should be made aware when the remote peer is not a
complete Git repository so they can alert the administrators
and have the repository corrected.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 16:12:03 -07:00
45476c40c7 wrapper 1.6
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 08:46:15 -07:00
1619134720 Added missing wait after git-version call in wrapper
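
Without the wait the child may be left as a zombie while the
wrapper keeps running.  The fixed sequence, roughly:

  proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
  ver_str = proc.stdout.read().strip()
  proc.stdout.close()
  proc.wait()
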
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 08:44:18 -07:00
7efd1a5b23 Remove unused import from gerrit_upload.py
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 08:44:18 -07:00
329c31da7d Repair any mis-directed android-1.0 annotated tags
The initial open source release of the Android 1.0 platform had
some problems with its Perforce->Git imports.  Google was forced
to rewrite some history to redirect users onto more stable upstream
sources and correct errors in the imports.

Not everyone has the correct android-1.0 tags, as some users did
manage to fetch the platform early, before the mirror sites crashed
and the history was rewritten.

This change is a band-aid to ensure any stale android-1.0 tags
get updated to the corrected version.  It should be backed out at
some point in the near future, when we can be fairly certain that
everyone has the correct android-1.0 tags.
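
The repair force-fetches the corrected tag, roughly:

  $ git fetch <remote> +refs/tags/android-1.0:refs/tags/android-1.0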

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-24 09:17:25 -07:00
5cc6679fb8 Support user supplied custom .repo/local_manifest.xml files
By creating a .repo/local_manifest.xml the user can add extra
projects into their client space, without touching the main
manifest script.

For example:

  $ cat .repo/local_manifest.xml
  <?xml version="1.0" encoding="UTF-8"?>
  <manifest>
   <project path="android-build"
            name="platform/build"
            remote="korg"
            revision="android-1.0" />
  </manifest>

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-23 16:20:14 -07:00
632768bc65 Teach repo how to download changes to the local checkout
Now `repo download . 1402` downloads the change numbered 1402
into the current project and checks it out for the user, using a
detached HEAD.  `repo sync .` backs out of the change and returns
to the upstream version.

Multiple projects can be fetched at once by listing them out on
the command line as different arguments.

Individual patch sets can be selected by adding a '/n' to indicate
the n-th patch set should be downloaded instead of the default of
patch set 1.
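
For example:

  $ repo download . 1402/5

fetches the fifth patch set of change 1402.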

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-23 14:43:28 -07:00
0758d2f1d6 Show which user account each change was uploaded under
This way users are well aware of which account we used when the
uploads are complete, so they can be certain to sign into the web
application with that user identity.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-22 13:13:40 -07:00
bb0ee80571 Change RPC client to only use Google Accounts for authentication
Hosted domain accounts (such as "@google.com" itself) don't work on the
Google App Engine service unless the user specifically creates their
own Google Account (https://www.google.com/accounts/NewAccount) with
the same email address.

When both such accounts exist we must *only* use the Google Account in
our auth request, as that is all Google App Engine will honor when we
send it the session cookie.

However, Google has internal servers that may also be running Gerrit
based applications.  In those cases we must use the hosted auth login
for @google.com user accounts, as the internal servers honor only the
hosted account and not the public Google Account database.

In the future we may need to add other domains to the "HOSTED" list
if other Gerrit instances are set up on hosted domains and locked to
only those domains' user accounts, similar to how a server that is
internal to Google would be set up.  Since this is currently not a
likely occurrence I'm not worrying about making it configurable at
this juncture.
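
The selection itself is small (see the hunk below):

  account_type = 'GOOGLE'
  if self.host.endswith('.google.com'):
    account_type = 'HOSTED'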

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-22 13:10:29 -07:00
18 changed files with 388 additions and 1317 deletions


@@ -1 +1 @@
__version__ = 'v1.0'
__version__ = 'v1.0-14-gc4f226bc'

codereview/proto_client.py

@@ -167,6 +167,10 @@ class HttpRpc(RpcChannel):
Returns:
The authentication token returned by ClientLogin.
"""
account_type = 'GOOGLE'
if self.host.endswith('.google.com'):
account_type = 'HOSTED'
req = self._CreateRequest(
url="https://www.google.com/accounts/ClientLogin",
data=urllib.urlencode({
@@ -174,7 +178,7 @@ class HttpRpc(RpcChannel):
"Passwd": password,
"service": "ah",
"source": "gerrit-codereview-client",
"accountType": "HOSTED_OR_GOOGLE",
"accountType": account_type,
})
)
try:
@@ -214,7 +218,6 @@ class HttpRpc(RpcChannel):
response.info()["location"] != continue_location):
raise urllib2.HTTPError(req.get_full_url(), response.code, response.msg,
response.headers, response.fp)
self.authenticated = True
def _GetXsrfToken(self):
"""Fetches /proto/_token for use in X-XSRF-Token HTTP header.
@@ -253,10 +256,18 @@
authentication cookie, it returns a 401 response and directs us to
authenticate ourselves with ClientLogin.
"""
for i in range(3):
credentials = self.auth_function()
auth_token = self._GetAuthToken(credentials[0], credentials[1])
attempts = 0
while True:
attempts += 1
try:
cred = self.auth_function()
auth_token = self._GetAuthToken(cred[0], cred[1])
except ClientLoginError:
if attempts < 3:
continue
raise
self._GetAuthCookie(auth_token)
self.authenticated = True
if self.cookie_file is not None:
self.cookie_jar.save()
return

error.py

@@ -64,3 +64,5 @@ class RepoChangedException(Exception):
repo or manifest repositories. In this special case we must
use exec to re-execute repo with the new code and manifest.
"""
def __init__(self, extra_args=[]):
self.extra_args = extra_args

gerrit_upload.py

@@ -15,7 +15,6 @@
import getpass
import os
import subprocess
import sys
from tempfile import mkstemp

git_config.py

@@ -28,9 +28,13 @@ def IsId(rev):
class GitConfig(object):
_ForUser = None
@classmethod
def ForUser(cls):
return cls(file = os.path.expanduser('~/.gitconfig'))
if cls._ForUser is None:
cls._ForUser = cls(file = os.path.expanduser('~/.gitconfig'))
return cls._ForUser
@classmethod
def ForRepository(cls, gitdir, defaults=None):

hooks/pre-auto-gc (new executable file, 44 lines)

@@ -0,0 +1,44 @@
#!/bin/sh
#
# An example hook script to verify if you are on battery, in case you
# are running Linux or OS X. Called by git-gc --auto with no arguments.
# The hook should exit with non-zero status after issuing an appropriate
# message if it wants to stop the auto repacking.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
if test -x /sbin/on_ac_power && /sbin/on_ac_power
then
exit 0
elif test "$(cat /sys/class/power_supply/AC/online 2>/dev/null)" = 1
then
exit 0
elif grep -q 'on-line' /proc/acpi/ac_adapter/AC/state 2>/dev/null
then
exit 0
elif grep -q '0x01$' /proc/apm 2>/dev/null
then
exit 0
elif grep -q "AC Power \+: 1" /proc/pmu/info 2>/dev/null
then
exit 0
elif test -x /usr/bin/pmset && /usr/bin/pmset -g batt |
grep -q "Currently drawing from 'AC Power'"
then
exit 0
fi
echo "Auto packing deferred; not on AC"
exit 1

import_ext.py

@@ -1,422 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import random
import stat
import sys
import urllib2
import StringIO
from error import GitError, ImportError
from git_command import GitCommand
class ImportExternal(object):
"""Imports a single revision from a non-git data source.
Suitable for use to import a tar or zip based snapshot.
"""
def __init__(self):
self._marks = 0
self._files = {}
self._tempref = 'refs/repo-external/import'
self._urls = []
self._remap = []
self.parent = None
self._user_name = 'Upstream'
self._user_email = 'upstream-import@none'
self._user_when = 1000000
self.commit = None
def Clone(self):
r = self.__class__()
r.project = self.project
for u in self._urls:
r._urls.append(u)
for p in self._remap:
r._remap.append(_PathMap(r, p._old, p._new))
return r
def SetProject(self, project):
self.project = project
def SetVersion(self, version):
self.version = version
def AddUrl(self, url):
self._urls.append(url)
def SetParent(self, commit_hash):
self.parent = commit_hash
def SetCommit(self, commit_hash):
self.commit = commit_hash
def RemapPath(self, old, new, replace_version=True):
self._remap.append(_PathMap(self, old, new))
@property
def TagName(self):
v = ''
for c in self.version:
if c >= '0' and c <= '9':
v += c
elif c >= 'A' and c <= 'Z':
v += c
elif c >= 'a' and c <= 'z':
v += c
elif c in ('-', '_', '.', '/', '+', '@'):
v += c
return 'upstream/%s' % v
@property
def PackageName(self):
n = self.project.name
if n.startswith('platform/'):
# This was not my finest moment...
#
n = n[len('platform/'):]
return n
def Import(self):
self._need_graft = False
if self.parent:
try:
self.project.bare_git.cat_file('-e', self.parent)
except GitError:
self._need_graft = True
gfi = GitCommand(self.project,
['fast-import', '--force', '--quiet'],
bare = True,
provide_stdin = True)
try:
self._out = gfi.stdin
try:
self._UnpackFiles()
self._MakeCommit()
self._out.flush()
finally:
rc = gfi.Wait()
if rc != 0:
raise ImportError('fast-import failed')
if self._need_graft:
id = self._GraftCommit()
else:
id = self.project.bare_git.rev_parse('%s^0' % self._tempref)
if self.commit and self.commit != id:
raise ImportError('checksum mismatch: %s expected,'
' %s imported' % (self.commit, id))
self._MakeTag(id)
return id
finally:
try:
self.project.bare_git.DeleteRef(self._tempref)
except GitError:
pass
def _PickUrl(self, failed):
u = map(lambda x: x.replace('%version%', self.version), self._urls)
for f in failed:
if f in u:
u.remove(f)
if len(u) == 0:
return None
return random.choice(u)
def _OpenUrl(self):
failed = {}
while True:
url = self._PickUrl(failed.keys())
if url is None:
why = 'Cannot download %s' % self.project.name
if failed:
why += ': one or more mirrors are down\n'
bad_urls = list(failed.keys())
bad_urls.sort()
for url in bad_urls:
why += ' %s: %s\n' % (url, failed[url])
else:
why += ': no mirror URLs'
raise ImportError(why)
print >>sys.stderr, "Getting %s ..." % url
try:
return urllib2.urlopen(url), url
except urllib2.HTTPError, e:
failed[url] = e.code
except urllib2.URLError, e:
failed[url] = e.reason[1]
except OSError, e:
failed[url] = e.strerror
def _UnpackFiles(self):
raise NotImplementedError
def _NextMark(self):
self._marks += 1
return self._marks
def _UnpackOneFile(self, mode, size, name, fd):
if stat.S_ISDIR(mode): # directory
return
else:
mode = self._CleanMode(mode, name)
old_name = name
name = self._CleanName(name)
if stat.S_ISLNK(mode) and self._remap:
# The link is relative to the old_name, and may need to
# be rewritten according to our remap rules if it goes
# up high enough in the tree structure.
#
dest = self._RewriteLink(fd.read(size), old_name, name)
fd = StringIO.StringIO(dest)
size = len(dest)
fi = _File(mode, name, self._NextMark())
self._out.write('blob\n')
self._out.write('mark :%d\n' % fi.mark)
self._out.write('data %d\n' % size)
while size > 0:
n = min(2048, size)
self._out.write(fd.read(n))
size -= n
self._out.write('\n')
self._files[fi.name] = fi
def _SetFileMode(self, name, mode):
if not stat.S_ISDIR(mode):
mode = self._CleanMode(mode, name)
name = self._CleanName(name)
try:
fi = self._files[name]
except KeyError:
raise ImportError('file %s was not unpacked' % name)
fi.mode = mode
def _RewriteLink(self, dest, relto_old, relto_new):
# Drop the last components of the symlink itself
# as the dest is relative to the directory its in.
#
relto_old = _TrimPath(relto_old)
relto_new = _TrimPath(relto_new)
# Resolve the link to be absolute from the top of
# the archive, so we can remap its destination.
#
while dest.find('/./') >= 0 or dest.find('//') >= 0:
dest = dest.replace('/./', '/')
dest = dest.replace('//', '/')
if dest.startswith('../') or dest.find('/../') > 0:
dest = _FoldPath('%s/%s' % (relto_old, dest))
for pm in self._remap:
if pm.Matches(dest):
dest = pm.Apply(dest)
break
dest, relto_new = _StripCommonPrefix(dest, relto_new)
while relto_new:
i = relto_new.find('/')
if i > 0:
relto_new = relto_new[i + 1:]
else:
relto_new = ''
dest = '../' + dest
return dest
def _CleanMode(self, mode, name):
if stat.S_ISREG(mode): # regular file
if (mode & 0111) == 0:
return 0644
else:
return 0755
elif stat.S_ISLNK(mode): # symlink
return stat.S_IFLNK
else:
raise ImportError('invalid mode %o in %s' % (mode, name))
def _CleanName(self, name):
old_name = name
for pm in self._remap:
if pm.Matches(name):
name = pm.Apply(name)
break
while name.startswith('/'):
name = name[1:]
if not name:
raise ImportError('path %s is empty after remap' % old_name)
if name.find('/./') >= 0 or name.find('/../') >= 0:
raise ImportError('path %s contains relative parts' % name)
return name
def _MakeCommit(self):
msg = '%s %s\n' % (self.PackageName, self.version)
self._out.write('commit %s\n' % self._tempref)
self._out.write('committer %s <%s> %d +0000\n' % (
self._user_name,
self._user_email,
self._user_when))
self._out.write('data %d\n' % len(msg))
self._out.write(msg)
self._out.write('\n')
if self.parent and not self._need_graft:
self._out.write('from %s^0\n' % self.parent)
self._out.write('deleteall\n')
for f in self._files.values():
self._out.write('M %o :%d %s\n' % (f.mode, f.mark, f.name))
self._out.write('\n')
def _GraftCommit(self):
raw = self.project.bare_git.cat_file('commit', self._tempref)
raw = raw.split("\n")
while raw[1].startswith('parent '):
del raw[1]
raw.insert(1, 'parent %s' % self.parent)
id = self._WriteObject('commit', "\n".join(raw))
graft_file = os.path.join(self.project.gitdir, 'info/grafts')
if os.path.exists(graft_file):
graft_list = open(graft_file, 'rb').read().split("\n")
if graft_list and graft_list[-1] == '':
del graft_list[-1]
else:
graft_list = []
exists = False
for line in graft_list:
if line == id:
exists = True
break
if not exists:
graft_list.append(id)
graft_list.append('')
fd = open(graft_file, 'wb')
fd.write("\n".join(graft_list))
fd.close()
return id
def _MakeTag(self, id):
name = self.TagName
raw = []
raw.append('object %s' % id)
raw.append('type commit')
raw.append('tag %s' % name)
raw.append('tagger %s <%s> %d +0000' % (
self._user_name,
self._user_email,
self._user_when))
raw.append('')
raw.append('%s %s\n' % (self.PackageName, self.version))
tagid = self._WriteObject('tag', "\n".join(raw))
self.project.bare_git.UpdateRef('refs/tags/%s' % name, tagid)
def _WriteObject(self, type, data):
wo = GitCommand(self.project,
['hash-object', '-t', type, '-w', '--stdin'],
bare = True,
provide_stdin = True,
capture_stdout = True,
capture_stderr = True)
wo.stdin.write(data)
if wo.Wait() != 0:
raise GitError('cannot create %s from (%s)' % (type, data))
return wo.stdout[:-1]
def _TrimPath(path):
i = path.rfind('/')
if i > 0:
path = path[0:i]
return ''
def _StripCommonPrefix(a, b):
while True:
ai = a.find('/')
bi = b.find('/')
if ai > 0 and bi > 0 and a[0:ai] == b[0:bi]:
a = a[ai + 1:]
b = b[bi + 1:]
else:
break
return a, b
def _FoldPath(path):
while True:
if path.startswith('../'):
return path
i = path.find('/../')
if i <= 0:
if path.startswith('/'):
return path[1:]
return path
lhs = path[0:i]
rhs = path[i + 4:]
i = lhs.rfind('/')
if i > 0:
path = lhs[0:i + 1] + rhs
else:
path = rhs
class _File(object):
def __init__(self, mode, name, mark):
self.mode = mode
self.name = name
self.mark = mark
class _PathMap(object):
def __init__(self, imp, old, new):
self._imp = imp
self._old = old
self._new = new
def _r(self, p):
return p.replace('%version%', self._imp.version)
@property
def old(self):
return self._r(self._old)
@property
def new(self):
return self._r(self._new)
def Matches(self, name):
return name.startswith(self.old)
def Apply(self, name):
return self.new + name[len(self.old):]

import_tar.py

@@ -1,206 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import bz2
import stat
import tarfile
import zlib
import StringIO
from import_ext import ImportExternal
from error import ImportError
class ImportTar(ImportExternal):
"""Streams a (optionally compressed) tar file from the network
directly into a Project's Git repository.
"""
@classmethod
def CanAccept(cls, url):
"""Can this importer read and unpack the data stored at url?
"""
if url.endswith('.tar.gz') or url.endswith('.tgz'):
return True
if url.endswith('.tar.bz2'):
return True
if url.endswith('.tar'):
return True
return False
def _UnpackFiles(self):
url_fd, url = self._OpenUrl()
try:
if url.endswith('.tar.gz') or url.endswith('.tgz'):
tar_fd = _Gzip(url_fd)
elif url.endswith('.tar.bz2'):
tar_fd = _Bzip2(url_fd)
elif url.endswith('.tar'):
tar_fd = _Raw(url_fd)
else:
raise ImportError('non-tar file extension: %s' % url)
try:
tar = tarfile.TarFile(name = url,
mode = 'r',
fileobj = tar_fd)
try:
for entry in tar:
mode = entry.mode
if (mode & 0170000) == 0:
if entry.isdir():
mode |= stat.S_IFDIR
elif entry.isfile() or entry.islnk(): # hard links as files
mode |= stat.S_IFREG
elif entry.issym():
mode |= stat.S_IFLNK
if stat.S_ISLNK(mode): # symlink
data_fd = StringIO.StringIO(entry.linkname)
data_sz = len(entry.linkname)
elif stat.S_ISDIR(mode): # directory
data_fd = StringIO.StringIO('')
data_sz = 0
else:
data_fd = tar.extractfile(entry)
data_sz = entry.size
self._UnpackOneFile(mode, data_sz, entry.name, data_fd)
finally:
tar.close()
finally:
tar_fd.close()
finally:
url_fd.close()
class _DecompressStream(object):
"""file like object to decompress a tar stream
"""
def __init__(self, fd):
self._fd = fd
self._pos = 0
self._buf = None
def tell(self):
return self._pos
def seek(self, offset):
d = offset - self._pos
if d > 0:
self.read(d)
elif d == 0:
pass
else:
raise NotImplementedError, 'seek backwards'
def close(self):
self._fd = None
def read(self, size = -1):
if not self._fd:
raise EOFError, 'Reached EOF'
r = []
try:
if size >= 0:
self._ReadChunk(r, size)
else:
while True:
self._ReadChunk(r, 2048)
except EOFError:
pass
if len(r) == 1:
r = r[0]
else:
r = ''.join(r)
self._pos += len(r)
return r
def _ReadChunk(self, r, size):
b = self._buf
try:
while size > 0:
if b is None or len(b) == 0:
b = self._Decompress(self._fd.read(2048))
continue
use = min(size, len(b))
r.append(b[:use])
b = b[use:]
size -= use
finally:
self._buf = b
def _Decompress(self, b):
raise NotImplementedError, '_Decompress'
class _Raw(_DecompressStream):
"""file like object for an uncompressed stream
"""
def __init__(self, fd):
_DecompressStream.__init__(self, fd)
def _Decompress(self, b):
return b
class _Bzip2(_DecompressStream):
"""file like object to decompress a .bz2 stream
"""
def __init__(self, fd):
_DecompressStream.__init__(self, fd)
self._bz = bz2.BZ2Decompressor()
def _Decompress(self, b):
return self._bz.decompress(b)
_FHCRC, _FEXTRA, _FNAME, _FCOMMENT = 2, 4, 8, 16
class _Gzip(_DecompressStream):
"""file like object to decompress a .gz stream
"""
def __init__(self, fd):
_DecompressStream.__init__(self, fd)
self._z = zlib.decompressobj(-zlib.MAX_WBITS)
magic = fd.read(2)
if magic != '\037\213':
raise IOError, 'Not a gzipped file'
method = ord(fd.read(1))
if method != 8:
raise IOError, 'Unknown compression method'
flag = ord(fd.read(1))
fd.read(6)
if flag & _FEXTRA:
xlen = ord(fd.read(1))
xlen += 256 * ord(fd.read(1))
fd.read(xlen)
if flag & _FNAME:
while fd.read(1) != '\0':
pass
if flag & _FCOMMENT:
while fd.read(1) != '\0':
pass
if flag & _FHCRC:
fd.read(2)
def _Decompress(self, b):
return self._z.decompress(b)

import_zip.py

@@ -1,345 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import stat
import struct
import zlib
import cStringIO
from import_ext import ImportExternal
from error import ImportError
class ImportZip(ImportExternal):
"""Streams a zip file from the network directly into a Project's
Git repository.
"""
@classmethod
def CanAccept(cls, url):
"""Can this importer read and unpack the data stored at url?
"""
if url.endswith('.zip') or url.endswith('.jar'):
return True
return False
def _UnpackFiles(self):
url_fd, url = self._OpenUrl()
try:
if not self.__class__.CanAccept(url):
raise ImportError('non-zip file extension: %s' % url)
zip = _ZipFile(url_fd)
for entry in zip.FileRecords():
data = zip.Open(entry).read()
sz = len(data)
if data and _SafeCRLF(data):
data = data.replace('\r\n', '\n')
sz = len(data)
fd = cStringIO.StringIO(data)
self._UnpackOneFile(entry.mode, sz, entry.name, fd)
zip.Close(entry)
for entry in zip.CentralDirectory():
self._SetFileMode(entry.name, entry.mode)
zip.CheckTail()
finally:
url_fd.close()
def _SafeCRLF(data):
"""Is it reasonably safe to perform a CRLF->LF conversion?
If the stream contains a NUL byte it is likely binary,
and thus a CRLF->LF conversion may damage the stream.
If the only NUL is in the last position of the stream,
but it otherwise can do a CRLF<->LF conversion we do
the CRLF conversion anyway. At least one source ZIP
file has this structure in its source code.
If every occurrance of a CR and LF is paired up as a
CRLF pair then the conversion is safely bi-directional.
s/\r\n/\n/g == s/\n/\r\\n/g can convert between them.
"""
nul = data.find('\0')
if 0 <= nul and nul < (len(data) - 1):
return False
n_lf = 0
last = 0
while True:
lf = data.find('\n', last)
if lf < 0:
break
if lf == 0 or data[lf - 1] != '\r':
return False
last = lf + 1
n_lf += 1
return n_lf > 0
class _ZipFile(object):
"""Streaming iterator to parse a zip file on the fly.
"""
def __init__(self, fd):
self._fd = _UngetStream(fd)
def FileRecords(self):
return _FileIter(self._fd)
def CentralDirectory(self):
return _CentIter(self._fd)
def CheckTail(self):
type_buf = self._fd.read(4)
type = struct.unpack('<I', type_buf)[0]
if type != 0x06054b50: # end of central directory
raise ImportError('zip record %x unsupported' % type)
def Open(self, entry):
if entry.is_compressed:
return _InflateStream(self._fd)
else:
if entry.has_trailer:
raise ImportError('unable to extract streamed zip')
return _FixedLengthStream(self._fd, entry.uncompressed_size)
def Close(self, entry):
if entry.has_trailer:
type = struct.unpack('<I', self._fd.read(4))[0]
if type == 0x08074b50:
# Not a formal type marker, but commonly seen in zips
# as the data descriptor signature.
#
struct.unpack('<3I', self._fd.read(12))
else:
# No signature for the data descriptor, so read the
# remaining fields out of the stream
#
self._fd.read(8)
class _FileIter(object):
def __init__(self, fd):
self._fd = fd
def __iter__(self):
return self
def next(self):
fd = self._fd
type_buf = fd.read(4)
type = struct.unpack('<I', type_buf)[0]
if type != 0x04034b50: # local file header
fd.unread(type_buf)
raise StopIteration()
rec = _FileHeader(fd.read(26))
rec.name = fd.read(rec.name_len)
fd.read(rec.extra_len)
if rec.name.endswith('/'):
rec.name = rec.name[:-1]
rec.mode = stat.S_IFDIR | 0777
return rec
class _FileHeader(object):
"""Information about a single file in the archive.
0 version needed to extract 2 bytes
1 general purpose bit flag 2 bytes
2 compression method 2 bytes
3 last mod file time 2 bytes
4 last mod file date 2 bytes
5 crc-32 4 bytes
6 compressed size 4 bytes
7 uncompressed size 4 bytes
8 file name length 2 bytes
9 extra field length 2 bytes
"""
def __init__(self, raw_bin):
rec = struct.unpack('<5H3I2H', raw_bin)
if rec[2] == 8:
self.is_compressed = True
elif rec[2] == 0:
self.is_compressed = False
else:
raise ImportError('unrecognized compression format')
if rec[1] & (1 << 3):
self.has_trailer = True
else:
self.has_trailer = False
self.compressed_size = rec[6]
self.uncompressed_size = rec[7]
self.name_len = rec[8]
self.extra_len = rec[9]
self.mode = stat.S_IFREG | 0644
class _CentIter(object):
def __init__(self, fd):
self._fd = fd
def __iter__(self):
return self
def next(self):
fd = self._fd
type_buf = fd.read(4)
type = struct.unpack('<I', type_buf)[0]
if type != 0x02014b50: # central directory
fd.unread(type_buf)
raise StopIteration()
rec = _CentHeader(fd.read(42))
rec.name = fd.read(rec.name_len)
fd.read(rec.extra_len)
fd.read(rec.comment_len)
if rec.name.endswith('/'):
rec.name = rec.name[:-1]
rec.mode = stat.S_IFDIR | 0777
return rec
class _CentHeader(object):
"""Information about a single file in the archive.
0 version made by 2 bytes
1 version needed to extract 2 bytes
2 general purpose bit flag 2 bytes
3 compression method 2 bytes
4 last mod file time 2 bytes
5 last mod file date 2 bytes
6 crc-32 4 bytes
7 compressed size 4 bytes
8 uncompressed size 4 bytes
9 file name length 2 bytes
10 extra field length 2 bytes
11 file comment length 2 bytes
12 disk number start 2 bytes
13 internal file attributes 2 bytes
14 external file attributes 4 bytes
15 relative offset of local header 4 bytes
"""
def __init__(self, raw_bin):
rec = struct.unpack('<6H3I5H2I', raw_bin)
self.name_len = rec[9]
self.extra_len = rec[10]
self.comment_len = rec[11]
if (rec[0] & 0xff00) == 0x0300: # UNIX
self.mode = rec[14] >> 16
else:
self.mode = stat.S_IFREG | 0644
class _UngetStream(object):
"""File like object to read and rewind a stream.
"""
def __init__(self, fd):
self._fd = fd
self._buf = None
def read(self, size = -1):
r = []
try:
if size >= 0:
self._ReadChunk(r, size)
else:
while True:
self._ReadChunk(r, 2048)
except EOFError:
pass
if len(r) == 1:
return r[0]
return ''.join(r)
def unread(self, buf):
b = self._buf
if b is None or len(b) == 0:
self._buf = buf
else:
self._buf = buf + b
def _ReadChunk(self, r, size):
b = self._buf
try:
while size > 0:
if b is None or len(b) == 0:
b = self._Inflate(self._fd.read(2048))
if not b:
raise EOFError()
continue
use = min(size, len(b))
r.append(b[:use])
b = b[use:]
size -= use
finally:
self._buf = b
def _Inflate(self, b):
return b
class _FixedLengthStream(_UngetStream):
"""File like object to read a fixed length stream.
"""
def __init__(self, fd, have):
_UngetStream.__init__(self, fd)
self._have = have
def _Inflate(self, b):
n = self._have
if n == 0:
self._fd.unread(b)
return None
if len(b) > n:
self._fd.unread(b[n:])
b = b[:n]
self._have -= len(b)
return b
class _InflateStream(_UngetStream):
"""Inflates the stream as it reads input.
"""
def __init__(self, fd):
_UngetStream.__init__(self, fd)
self._z = zlib.decompressobj(-zlib.MAX_WBITS)
def _Inflate(self, b):
z = self._z
if not z:
self._fd.unread(b)
return None
b = z.decompress(b)
if z.unconsumed_tail != '':
self._fd.unread(z.unconsumed_tail)
elif z.unused_data != '':
self._fd.unread(z.unused_data)
self._z = None
return b

main.py (10 lines changed)

@@ -28,6 +28,7 @@ import re
import sys
from command import InteractiveCommand, PagedCommand
from editor import Editor
from error import NoSuchProjectError
from error import RepoChangedException
from manifest import Manifest
@@ -77,6 +78,7 @@ class _Repo(object):
cmd.repodir = self.repodir
cmd.manifest = Manifest(cmd.repodir)
Editor.globalConfig = cmd.manifest.globalConfig
if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig
@@ -184,11 +186,13 @@ def _Main(argv):
repo._Run(argv)
except KeyboardInterrupt:
sys.exit(1)
except RepoChangedException:
# If the repo or manifest changed, re-exec ourselves.
except RepoChangedException, rce:
# If repo changed, re-exec ourselves.
#
argv = list(sys.argv)
argv.extend(rce.extra_args)
try:
os.execv(__file__, sys.argv)
os.execv(__file__, argv)
except OSError, e:
print >>sys.stderr, 'fatal: cannot restart repo after upgrade'
print >>sys.stderr, 'fatal: %s' % e

manifest.py

@@ -17,15 +17,13 @@ import os
import sys
import xml.dom.minidom
from editor import Editor
from git_config import GitConfig, IsId
from import_tar import ImportTar
from import_zip import ImportZip
from project import Project, MetaProject, R_TAGS
from remote import Remote
from error import ManifestParseError
MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml'
class _Default(object):
"""Project defaults within the manifest."""
@@ -41,9 +39,7 @@ class Manifest(object):
self.repodir = os.path.abspath(repodir)
self.topdir = os.path.dirname(self.repodir)
self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME)
self.globalConfig = GitConfig.ForUser()
Editor.globalConfig = self.globalConfig
self.repoProject = MetaProject(self, 'repo',
gitdir = os.path.join(repodir, 'repo/.git'),
@@ -108,10 +104,20 @@ class Manifest(object):
def _Load(self):
if not self._loaded:
self._ParseManifest()
self._ParseManifest(True)
local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME)
if os.path.exists(local):
try:
real = self.manifestFile
self.manifestFile = local
self._ParseManifest(False)
finally:
self.manifestFile = real
self._loaded = True
def _ParseManifest(self):
def _ParseManifest(self, is_root_file):
root = xml.dom.minidom.parse(self.manifestFile)
if not root or not root.childNodes:
raise ManifestParseError, \
@@ -124,9 +130,10 @@
"no <manifest> in %s" % \
self.manifestFile
self.branch = config.getAttribute('branch')
if not self.branch:
self.branch = 'default'
if is_root_file:
self.branch = config.getAttribute('branch')
if not self.branch:
self.branch = 'default'
for node in config.childNodes:
if node.nodeName == 'remote':
@@ -236,78 +243,8 @@ class Manifest(object):
elif n.nodeName == 'copyfile':
self._ParseCopyFile(project, n)
to_resolve = []
by_version = {}
for n in node.childNodes:
if n.nodeName == 'import':
self._ParseImport(project, n, to_resolve, by_version)
for pair in to_resolve:
sn, pr = pair
try:
sn.SetParent(by_version[pr].commit)
except KeyError:
raise ManifestParseError, \
'snapshot %s not in project %s in %s' % \
(pr, project.name, self.manifestFile)
return project
def _ParseImport(self, project, import_node, to_resolve, by_version):
first_url = None
for node in import_node.childNodes:
if node.nodeName == 'mirror':
first_url = self._reqatt(node, 'url')
break
if not first_url:
raise ManifestParseError, \
'mirror url required for project %s in %s' % \
(project.name, self.manifestFile)
imp = None
for cls in [ImportTar, ImportZip]:
if cls.CanAccept(first_url):
imp = cls()
break
if not imp:
raise ManifestParseError, \
'snapshot %s unsupported for project %s in %s' % \
(first_url, project.name, self.manifestFile)
imp.SetProject(project)
for node in import_node.childNodes:
if node.nodeName == 'remap':
old = node.getAttribute('strip')
new = node.getAttribute('insert')
imp.RemapPath(old, new)
elif node.nodeName == 'mirror':
imp.AddUrl(self._reqatt(node, 'url'))
for node in import_node.childNodes:
if node.nodeName == 'snapshot':
sn = imp.Clone()
sn.SetVersion(self._reqatt(node, 'version'))
sn.SetCommit(node.getAttribute('check'))
pr = node.getAttribute('prior')
if pr:
if IsId(pr):
sn.SetParent(pr)
else:
to_resolve.append((sn, pr))
rev = R_TAGS + sn.TagName
if rev in project.snapshots:
raise ManifestParseError, \
'duplicate snapshot %s for project %s in %s' % \
(sn.version, project.name, self.manifestFile)
project.snapshots[rev] = sn
by_version[sn.version] = sn
def _ParseCopyFile(self, project, node):
src = self._reqatt(node, 'src')
dest = self._reqatt(node, 'dest')

project.py

@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import errno
import filecmp
import os
import re
@@ -45,6 +46,57 @@ def _info(fmt, *args):
def not_rev(r):
return '^' + r
hook_list = None
def repo_hooks():
global hook_list
if hook_list is None:
d = os.path.abspath(os.path.dirname(__file__))
d = os.path.join(d , 'hooks')
hook_list = map(lambda x: os.path.join(d, x), os.listdir(d))
return hook_list
def relpath(dst, src):
src = os.path.dirname(src)
top = os.path.commonprefix([dst, src])
if top.endswith('/'):
top = top[:-1]
else:
top = os.path.dirname(top)
tmp = src
rel = ''
while top != tmp:
rel += '../'
tmp = os.path.dirname(tmp)
return rel + dst[len(top) + 1:]
class DownloadedChange(object):
_commit_cache = None
def __init__(self, project, base, change_id, ps_id, commit):
self.project = project
self.base = base
self.change_id = change_id
self.ps_id = ps_id
self.commit = commit
@property
def commits(self):
if self._commit_cache is None:
self._commit_cache = self.project.bare_git.rev_list(
'--abbrev=8',
'--abbrev-commit',
'--pretty=oneline',
'--reverse',
'--date-order',
not_rev(self.base),
self.commit,
'--')
return self._commit_cache
class ReviewableBranch(object):
_commit_cache = None
@@ -88,6 +140,10 @@ class ReviewableBranch(object):
commit = self.project.bare_git.rev_parse(R_HEADS + self.name)
return 'http://%s/r/%s' % (me.remote.review, commit[0:12])
@property
def owner_email(self):
return self.project.UserEmail
class StatusColoring(Coloring):
def __init__(self, config):
@@ -437,17 +493,43 @@ class Project(object):
for r in self.extraRemotes.values():
if not self._RemoteFetch(r.name):
return False
if not self._SnapshotDownload():
return False
if not self._RemoteFetch():
return False
self._RepairAndroidImportErrors()
self._InitMRef()
return True
def PostRepoUpgrade(self):
self._InitHooks()
def _CopyFiles(self):
for file in self.copyfiles:
file._Copy()
def _RepairAndroidImportErrors(self):
if self.name in ['platform/external/iptables',
'platform/external/libpcap',
'platform/external/tcpdump',
'platform/external/webkit',
'platform/system/wlan/ti']:
# I hate myself for doing this...
#
# In the initial Android 1.0 release these projects were
# shipped, some users got them, and then the history had
# to be rewritten to correct problems with their imports.
# The 'android-1.0' tag may still be pointing at the old
# history, so we need to drop the tag and fetch it again.
#
try:
remote = self.GetRemote(self.remote.name)
relname = remote.ToLocal(R_HEADS + 'release-1.0')
tagname = R_TAGS + 'android-1.0'
if self._revlist(not_rev(relname), tagname):
cmd = ['fetch', remote.name, '+%s:%s' % (tagname, tagname)]
GitCommand(self, cmd, bare = True).Wait()
except GitError:
pass
def Sync_LocalHalf(self):
"""Perform only the local IO portion of the sync process.
Network access is not required.
@@ -511,6 +593,19 @@ class Project(object):
_info("[%s] Consider merging or rebasing the"
" unpublished commits.", self.name)
return True
elif upstream_gain:
# We can fast-forward safely.
#
try:
self._FastForward(rev)
except GitError:
return False
self._CopyFiles()
return True
else:
# Trivially no changes in the upstream.
#
return True
if merge == rev:
try:
@@ -575,39 +670,29 @@
self._CopyFiles()
return True
def _SnapshotDownload(self):
if self.snapshots:
have = set(self._allrefs.keys())
need = []
for tag, sn in self.snapshots.iteritems():
if tag not in have:
need.append(sn)
if need:
print >>sys.stderr, """
*** Downloading source(s) from a mirror site. ***
*** If the network hangs, kill and restart repo. ***
"""
for sn in need:
try:
sn.Import()
except ImportError, e:
print >>sys.stderr, \
'error: Cannot import %s: %s' \
% (self.name, e)
return False
cmd = ['repack', '-a', '-d', '-f', '-l']
if GitCommand(self, cmd, bare = True).Wait() != 0:
return False
return True
def AddCopyFile(self, src, dest):
# dest should already be an absolute path, but src is project relative
# make src an absolute path
src = os.path.join(self.worktree, src)
self.copyfiles.append(_CopyFile(src, dest))
def DownloadPatchSet(self, change_id, patch_id):
"""Download a single patch set of a single change to FETCH_HEAD.
"""
remote = self.GetRemote(self.remote.name)
cmd = ['fetch', remote.name]
cmd.append('refs/changes/%2.2d/%d/%d' \
% (change_id % 100, change_id, patch_id))
cmd.extend(map(lambda x: str(x), remote.fetch))
if GitCommand(self, cmd, bare=True).Wait() != 0:
return None
return DownloadedChange(self,
remote.ToLocal(self.revision),
change_id,
patch_id,
self.bare_git.rev_parse('FETCH_HEAD'))
## Branch Management ##
@@ -625,6 +710,22 @@
else:
raise GitError('%s checkout %s ' % (self.name, rev))
def AbandonBranch(self, name):
"""Destroy a local topic branch.
"""
try:
tip_rev = self.bare_git.rev_parse(R_HEADS + name)
except GitError:
return
if self.CurrentBranch == name:
self._Checkout(
self.GetRemote(self.remote.name).ToLocal(self.revision),
quiet=True)
cmd = ['branch', '-D', name]
GitCommand(self, cmd, capture_stdout=True).Wait()
def PruneHeads(self):
"""Prune any topic branches already merged into upstream.
"""
@@ -691,41 +792,9 @@
def _RemoteFetch(self, name=None):
if not name:
name = self.remote.name
hide_errors = False
if self.extraRemotes or self.snapshots:
hide_errors = True
proc = GitCommand(self,
return GitCommand(self,
['fetch', name],
bare = True,
capture_stderr = hide_errors)
if hide_errors:
err = proc.process.stderr.fileno()
buf = ''
while True:
b = os.read(err, 256)
if b:
buf += b
while buf:
r = buf.find('remote: error: unable to find ')
if r >= 0:
lf = buf.find('\n')
if lf < 0:
break
buf = buf[lf + 1:]
continue
cr = buf.find('\r')
if cr < 0:
break
os.write(2, buf[0:cr + 1])
buf = buf[cr + 1:]
if not b:
if buf:
os.write(2, buf)
break
return proc.Wait() == 0
bare = True).Wait() == 0
def _Checkout(self, rev, quiet=False):
cmd = ['checkout']
@@ -765,16 +834,35 @@
self.config.SetString('core.bare', None)
hooks = self._gitdir_path('hooks')
for old_hook in os.listdir(hooks):
try:
to_rm = os.listdir(hooks)
except OSError:
to_rm = []
for old_hook in to_rm:
os.remove(os.path.join(hooks, old_hook))
# TODO(sop) install custom repo hooks
self._InitHooks()
m = self.manifest.manifestProject.config
for key in ['user.name', 'user.email']:
if m.Has(key, include_defaults = False):
self.config.SetString(key, m.GetString(key))
def _InitHooks(self):
hooks = self._gitdir_path('hooks')
if not os.path.exists(hooks):
os.makedirs(hooks)
for stock_hook in repo_hooks():
dst = os.path.join(hooks, os.path.basename(stock_hook))
try:
os.symlink(relpath(stock_hook, dst), dst)
except OSError, e:
if e.errno == errno.EEXIST:
pass
elif e.errno == errno.EPERM:
raise GitError('filesystem must support symlinks')
else:
raise
def _InitRemote(self):
if self.remote.fetchUrl:
remote = self.GetRemote(self.remote.name)
@@ -814,19 +902,6 @@
if not os.path.exists(dotgit):
os.makedirs(dotgit)
topdir = os.path.commonprefix([self.gitdir, dotgit])
if topdir.endswith('/'):
topdir = topdir[:-1]
else:
topdir = os.path.dirname(topdir)
tmpdir = dotgit
relgit = ''
while topdir != tmpdir:
relgit += '../'
tmpdir = os.path.dirname(tmpdir)
relgit += self.gitdir[len(topdir) + 1:]
for name in ['config',
'description',
'hooks',
@@ -837,8 +912,15 @@
'refs',
'rr-cache',
'svn']:
os.symlink(os.path.join(relgit, name),
os.path.join(dotgit, name))
try:
src = os.path.join(self.gitdir, name)
dst = os.path.join(dotgit, name)
os.symlink(relpath(src, dst), dst)
except OSError, e:
if e.errno == errno.EPERM:
raise GitError('filesystem must support symlinks')
else:
raise
rev = self.GetRemote(self.remote.name).ToLocal(self.revision)
rev = self.bare_git.rev_parse('%s^0' % rev)

repo (3 lines changed)

@@ -28,7 +28,7 @@ if __name__ == '__main__':
del magic
# increment this whenever we make important changes to this script
VERSION = (1, 5)
VERSION = (1, 6)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1,0)
@@ -202,6 +202,7 @@ def _CheckGitVersion():
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
ver_str = proc.stdout.read().strip()
proc.stdout.close()
proc.wait()
if not ver_str.startswith('git version '):
print >>sys.stderr, 'error: "%s" unsupported' % ver_str

subcmds/abandon.py (new file, 42 lines)

@@ -0,0 +1,42 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from command import Command
from git_command import git
class Abandon(Command):
common = True
helpSummary = "Permanently abandon a development branch"
helpUsage = """
%prog <branchname> [<project>...]
This subcommand permanently abandons a development branch by
deleting it (and all its history) from your local repository.
It is equivalent to "git branch -D <branchname>".
"""
def Execute(self, opt, args):
if not args:
self.Usage()
nb = args[0]
if not git.check_ref_format('heads/%s' % nb):
print >>sys.stderr, "error: '%s' is not a valid name" % nb
sys.exit(1)
for project in self.GetProjects(args[1:]):
project.AbandonBranch(nb)

subcmds/compute_snapshot_check.py

@@ -1,169 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
import tempfile
from command import Command
from error import GitError, NoSuchProjectError
from git_config import IsId
from import_tar import ImportTar
from import_zip import ImportZip
from project import Project
from remote import Remote
def _ToCommit(project, rev):
return project.bare_git.rev_parse('--verify', '%s^0' % rev)
def _Missing(project, rev):
return project._revlist('--objects', rev, '--not', '--all')
class ComputeSnapshotCheck(Command):
common = False
helpSummary = "Compute the check value for a new snapshot"
helpUsage = """
%prog -p NAME -v VERSION -s FILE [options]
"""
helpDescription = """
%prog computes and then displays the proper check value for a
snapshot, so it can be pasted into the manifest file for a project.
"""
def _Options(self, p):
g = p.add_option_group('Snapshot description options')
g.add_option('-p', '--project',
dest='project', metavar='NAME',
help='destination project name')
g.add_option('-v', '--version',
dest='version', metavar='VERSION',
help='upstream version/revision identifier')
g.add_option('-s', '--snapshot',
dest='snapshot', metavar='PATH',
help='local tarball path')
g.add_option('--new-project',
dest='new_project', action='store_true',
help='destinition is a new project')
g.add_option('--keep',
dest='keep_git', action='store_true',
help='keep the temporary git repository')
g = p.add_option_group('Base revision grafting options')
g.add_option('--prior',
dest='prior', metavar='COMMIT',
help='prior revision checksum')
g = p.add_option_group('Path mangling options')
g.add_option('--strip-prefix',
dest='strip_prefix', metavar='PREFIX',
help='remove prefix from all paths on import')
g.add_option('--insert-prefix',
dest='insert_prefix', metavar='PREFIX',
help='insert prefix before all paths on import')
def _Compute(self, opt):
try:
real_project = self.GetProjects([opt.project])[0]
except NoSuchProjectError:
if opt.new_project:
print >>sys.stderr, \
"warning: project '%s' does not exist" % opt.project
else:
raise NoSuchProjectError(opt.project)
self._tmpdir = tempfile.mkdtemp()
project = Project(manifest = self.manifest,
name = opt.project,
remote = Remote('origin'),
gitdir = os.path.join(self._tmpdir, '.git'),
worktree = self._tmpdir,
relpath = opt.project,
revision = 'refs/heads/master')
project._InitGitDir()
url = 'file://%s' % os.path.abspath(opt.snapshot)
imp = None
for cls in [ImportTar, ImportZip]:
if cls.CanAccept(url):
imp = cls()
break
if not imp:
print >>sys.stderr, 'error: %s unsupported' % opt.snapshot
sys.exit(1)
imp.SetProject(project)
imp.SetVersion(opt.version)
imp.AddUrl(url)
if opt.prior:
if opt.new_project:
if not IsId(opt.prior):
print >>sys.stderr, 'error: --prior=%s not valid' % opt.prior
sys.exit(1)
else:
try:
opt.prior = _ToCommit(real_project, opt.prior)
missing = _Missing(real_project, opt.prior)
except GitError, e:
print >>sys.stderr,\
'error: --prior=%s not valid\n%s' \
% (opt.prior, e)
sys.exit(1)
if missing:
print >>sys.stderr,\
'error: --prior=%s is valid, but is not reachable' \
% opt.prior
sys.exit(1)
imp.SetParent(opt.prior)
src = opt.strip_prefix
dst = opt.insert_prefix
if src or dst:
if src is None:
src = ''
if dst is None:
dst = ''
imp.RemapPath(src, dst)
commitId = imp.Import()
print >>sys.stderr,"%s\t%s" % (commitId, imp.version)
return project
def Execute(self, opt, args):
if args \
or not opt.project \
or not opt.version \
or not opt.snapshot:
self.Usage()
success = False
project = None
try:
self._tmpdir = None
project = self._Compute(opt)
finally:
if project and opt.keep_git:
print 'GIT_DIR = %s' % (project.gitdir)
elif self._tmpdir:
for root, dirs, files in os.walk(self._tmpdir, topdown=False):
for name in files:
os.remove(os.path.join(root, name))
for name in dirs:
os.rmdir(os.path.join(root, name))
os.rmdir(self._tmpdir)

subcmds/download.py (new file, 78 lines)

@@ -0,0 +1,78 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import re
import sys
from command import Command
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
class Download(Command):
common = True
helpSummary = "Download and checkout a change"
helpUsage = """
%prog {project change[/patchset]}...
"""
helpDescription = """
The '%prog' command downloads a change from the review system and
makes it available in your project's local working directory.
"""
def _Options(self, p):
pass
def _ParseChangeIds(self, args):
to_get = []
project = None
for a in args:
m = CHANGE_RE.match(a)
if m:
if not project:
self.Usage()
chg_id = int(m.group(1))
if m.group(2):
ps_id = int(m.group(2))
else:
ps_id = 1
to_get.append((project, chg_id, ps_id))
else:
project = self.GetProjects([a])[0]
return to_get
def Execute(self, opt, args):
for project, change_id, ps_id in self._ParseChangeIds(args):
dl = project.DownloadPatchSet(change_id, ps_id)
if not dl:
print >>sys.stderr, \
'[%s] change %d/%d not found' \
% (project.name, change_id, ps_id)
sys.exit(1)
if not dl.commits:
print >>sys.stderr, \
'[%s] change %d/%d has already been merged' \
% (project.name, change_id, ps_id)
continue
if len(dl.commits) > 1:
print >>sys.stderr, \
'[%s] %d/%d depends on %d unmerged changes:' \
% (project.name, change_id, ps_id, len(dl.commits))
for c in dl.commits:
print >>sys.stderr, ' %s' % (c)
project._Checkout(dl.commit)

subcmds/sync.py

@@ -49,6 +49,9 @@ the manifest.
p.add_option('--no-repo-verify',
dest='no_repo_verify', action='store_true',
help='do not verify repo source code')
p.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true',
help='perform additional actions after a repo upgrade')
def _Fetch(self, *projects):
fetched = set()
@@ -67,6 +70,11 @@
mp = self.manifest.manifestProject
mp.PreSync()
if opt.repo_upgraded:
for project in self.manifest.projects.values():
if project.Exists:
project.PostRepoUpgrade()
all = self.GetProjects(args, missing_ok=True)
fetched = self._Fetch(rp, mp, *all)
@@ -77,7 +85,7 @@ the manifest.
if not rp.Sync_LocalHalf():
sys.exit(1)
print >>sys.stderr, 'info: Restarting repo with latest version'
raise RepoChangedException()
raise RepoChangedException(['--repo-upgraded'])
else:
print >>sys.stderr, 'warning: Skipped upgrade to unverified version'

subcmds/upload.py

@@ -158,6 +158,7 @@ changes in all projects listed in the manifest.
branch.project.relpath + '/',
branch.name)
print >>sys.stderr, '%s' % branch.tip_url
print >>sys.stderr, '(as %s)' % branch.owner_email
print >>sys.stderr, ''
if have_errors: