Compare commits


34 Commits

Author SHA1 Message Date
a6df7d284c Describe upload --replace in upload's help text
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-12-12 08:04:07 -08:00
67092448c2 Don't accept multiple commits for the same change in upload --replace
Gerrit won't permit more than one commit using the same change
number during a replacement request, so we should error out if
the user has asked for this in their upload edit script.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-12-12 08:01:12 -08:00
e92ceebde0 Fix upload --replace after it was broken when --review,--cc was added
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-24 15:51:25 -08:00
03eaf07ec6 Support <remove-project name="X"> in manifest to remove/replace X
The manifest files now permit removing a project so the user can
either keep it out of their client, or replace it with a different
project using an entirely different configuration.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-20 11:54:46 -08:00
2896a79120 Add --review and --cc flags to repo upload, so you can
assign reviewers when you upload changes.
2008-11-19 11:55:06 -05:00
8c6eef4713 Make repo's editor work when the editor is a commandline with
multiple args.
2008-11-14 21:12:44 -05:00
34d237fbfb Paper bag fix repo 1.3's "repo upload" without --replace
If we aren't doing a replacement we do not have any
replace rules, so we cannot iterate over them.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-12 18:37:18 -08:00
c99883fee9 Teach 'repo upload --replace' how to add replacement patch sets
Users are prompted with the list of known changes we are about
to upload, and they can fill out the current change numbers for
any changes which already exist in the data store.  For each of
those changes the change number and commit id are sent as part of
the upload request, so Gerrit can insert the new commit as a new
patch set of the existing change, rather than make a new change.

This facility permits developers to replace a patch so they can
address comments made on a prior version of the same change.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-12 09:12:19 -08:00
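The change-number-to-commit mapping described above can be sketched in plain Python; the dict standing in for the protobuf request body, and the helper name, are assumptions for illustration:

```python
def build_replace_entries(replace_changes):
    """Turn a {change_number: commit_id} map into upload request entries.

    Gerrit uses each entry to attach the commit as a new patch set of
    the existing change instead of opening a new change.
    """
    entries = []
    for change_id, commit_id in sorted(replace_changes.items()):
        entries.append({'change_id': change_id, 'object_id': commit_id})
    return entries
```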
ec18b4bac4 Update proto client to support patch set replacements
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-12 09:12:19 -08:00
35f2596c27 Refactor part of GetUploadableBranches to lookup one specific branch
This way project.GetUploadableBranch(project.CurrentBranch) can tell
us how (if at all) to upload the currently checked out branch.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-12 09:12:17 -08:00
5d40e26201 Treat missing attributes as None when parsing the manifest
Some of our code assumes that a property is None.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-06 11:23:27 -08:00
70939e2f73 Add <add-remote to-project="..."> to inject additional remotes
This way users can add forks they know about to an existing project
that was already declared in the primary manifest.  This is mostly
useful with the Linux kernel project, where multiple forks are quite
common for the main upstream tree (e.g. Linus' tree), a platform
architecture tree (e.g. ARM) and a device specific tree (e.g. the
msm7k tree used by Android).

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-06 11:23:08 -08:00
ae6e0949d1 Add <remote project-name="..."> attribute within projects
By setting a project-name on a remote nested within a project, forks
of a project like the Linux kernel can be easily handled by fetching
all relevant forks into the same client side project under different
remote names.  Developers can create branches off different remotes
using `git checkout --track -b $myname $remote/$branch` and later
`repo upload` automatically redirects to the proper fork project
in the code review server.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-06 11:23:06 -08:00
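A minimal sketch of the nesting this commit describes (the project and remote names are hypothetical):

```xml
<project path="kernel" name="kernel/common">
  <!-- extra fork fetched into the same client-side project -->
  <remote name="arm"
          fetch="git://arm.example.com/"
          project-name="kernel/arm" />
</project>
```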
339ba9f6f7 Use remote.*.projectname to indicate the target project for upload
This way "forks" of a project, e.g. the Linux kernel, can be set up to
use different destination projects in the review server by creating
different remotes in the client side Git repository.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-06 09:52:51 -08:00
70cd4ab270 Add some short documentation about the local manifest
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-06 08:48:44 -08:00
e284ad1d1a Add 'repo init --mirror' to download a complete forest
The mirror option downloads a complete forest (as described by the
manifest) and creates a replica of the remote repositories rather
than a client working directory.  This permits other clients to
sync off the mirror site.

A mirror can be positioned in a "DMZ", where the mirror executes
"repo sync" to obtain changes from the external upstream and
clients inside the protected zone operate off the mirror only,
and therefore do not require direct git:// access to the external
upstream repositories.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-05 18:08:32 -08:00
3e5481999d Add a basic outline of the repo manifest file format
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-04 11:19:36 -08:00
d3c388391e Update proto_client to notify the user when auth cookies are accessed
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-04 10:49:04 -08:00
2450a2987a Assume the manifest branch matches the branch name in Git
Whatever branch name we checked the manifest out from is the name
we want to reflect throughout the rest of the projects, e.g. within
the special "m/" remote space.

This reduces the difference between different branches within the
manifest file.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-04 09:45:59 -08:00
f5c25a68d8 Cleanup stale manifest migration code from manifest.py
Prior to open-sourcing repo we had manifests in two different
layouts; one where the manifest was a straight-up git clone, and
one where the manifest was our bare repository with symlink work
tree format (identical to what our projects use).  Only the latter
form is created or used by repo at this point, so the transition
code to handle the straight-up git clone is not necessary.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-04 09:45:59 -08:00
9fa44db94b Introduce 'repo abandon <branchname>' as an alias for 'git branch -D'
This destroys a local development branch, removing all history
of that branch from ever existing.  If the branch is currently
checked out we move back to the upstream revision.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-03 11:24:59 -08:00
c9ef744c7b Install a default pre-auto-gc hook in all repositories
This hook is evaluated by `git gc --auto` to determine if it is a
good idea to execute a GC at this time, or defer it to some later
date.  When working on a laptop it's a good idea to avoid GC if you
are on battery power as the extra CPU and disk IO would consume a
decent amount of the charge.

The hook is the standard sample hook from git.git contrib/hooks,
last modified in git.git by 84ed4c5d117d72f02cc918e413b9861a9d2846d7.
I added the GPLv2 header to the script to ensure the license notice
is clear, as it does not match repo's own APLv2 license.

We only update hooks during initial repository creation or on
a repo sync.  This way we don't incur huge overheads from the
hook stat operations during "repo status" or even the normal
"repo sync" cases.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-03 11:00:44 -08:00
438ee1cad9 Catch symlink creation failures and report a better error
Some users have noticed that repo doesn't work on VFAT, as we
require a POSIX filesystem with POSIX symlink support.  Catching the
OSError during our symlink creation and raising a GitError with a
more descriptive message will help users to troubleshoot and fix
their own installation problems.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-11-03 09:59:36 -08:00
23d7781c0b Don't print "Already up-to-date" during repo sync
If we are already up-to-date we just want to display no output.
This means we have to avoid calling "git merge" if there aren't
commits to be merged into the working directory.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-30 11:06:57 -07:00
a54c527ae9 Fast-forward a fully merged topic branch during 'repo sync'
Instead of trying to rebase the changes on a topic branch that
has been fully merged into the upstream branch we track, we should
just fast-forward the topic branch to the new upstream revision.
This way the branch doesn't try to rewrite commits that are already
merged in the upstream.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-30 11:03:00 -07:00
df830f1238 Remove import_tar, import_zip and the <snapshot> elements
Now that repo relies only on the git data stream (as it is much
faster to download through) we don't really need to be parsing the
<snapshot> elements within the manifest.  It's a lot of complex code to
convert the tar (or zip) through to a fast import stream, and we
just aren't calling it anymore.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-30 09:21:43 -07:00
90be5c0839 Cache the per-user configuration to avoid duplicate instances
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-29 15:24:34 -07:00
7965f9fed0 Move the Editor configuration out of Manifest's constructor
This way we can build more than one Manifest instance in memory
and not muck around with the Editor configuration each time we
build a new instance.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-29 15:24:34 -07:00
de646819b8 Don't flip out if there are no template hooks
Git may have been installed without its hooks directory, which
means we won't have any hooks in a repo created git repository.
Since we are just deleting the hooks it doesn't matter.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-29 14:38:12 -07:00
bd4edc9a69 Stop downloading snapshots as native git:// is faster
Downloading and streaming a tar into Git is slower than just
letting the native git:// protocol handle the data transfer,
especially when there are multiple revisions available and
Git can perform delta compression across revisions.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 16:14:05 -07:00
ce03a401c6 Stop hiding remote missing object errors
Hiding error messages from the remote peer is not a good idea,
as users should be made aware when the remote peer is not a
complete Git repository so they can alert the administrators
and have the repository corrected.

Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 16:12:03 -07:00
45476c40c7 wrapper 1.6
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 08:46:15 -07:00
1619134720 Added missing wait after git-version call in wrapper
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 08:44:18 -07:00
7efd1a5b23 Remove unused import from gerrit_upload.py
Signed-off-by: Shawn O. Pearce <sop@google.com>
2008-10-28 08:44:18 -07:00
22 changed files with 856 additions and 1391 deletions


@@ -1 +1 @@
__version__ = 'v1.0-14-gc4f226bc'
__version__ = 'v1.0-112-gbcd4db5a'


@@ -20,6 +20,7 @@ import md5
import os
import random
import socket
import sys
import time
import urllib
import urllib2
@@ -29,6 +30,38 @@ from froofle.protobuf.service import RpcChannel
from froofle.protobuf.service import RpcController
from need_retry_pb2 import RetryRequestLaterResponse;
_cookie_jars = {}
def _open_jar(path):
auth = False
if path is None:
c = cookielib.CookieJar()
else:
c = _cookie_jars.get(path)
if c is None:
c = cookielib.MozillaCookieJar(path)
if os.path.exists(path):
try:
c.load()
auth = True
except (cookielib.LoadError, IOError):
pass
if auth:
print >>sys.stderr, \
'Loaded authentication cookies from %s' \
% path
else:
os.close(os.open(path, os.O_CREAT, 0600))
os.chmod(path, 0600)
_cookie_jars[path] = c
else:
auth = True
return c, auth
class ClientLoginError(urllib2.HTTPError):
"""Raised to indicate an error authenticating with ClientLogin."""
@@ -269,6 +302,9 @@ class HttpRpc(RpcChannel):
self._GetAuthCookie(auth_token)
self.authenticated = True
if self.cookie_file is not None:
print >>sys.stderr, \
'Saving authentication cookies to %s' \
% self.cookie_file
self.cookie_jar.save()
return
@@ -337,24 +373,8 @@ class HttpRpc(RpcChannel):
opener.add_handler(urllib2.HTTPDefaultErrorHandler())
opener.add_handler(urllib2.HTTPSHandler())
opener.add_handler(urllib2.HTTPErrorProcessor())
if self.cookie_file is not None:
self.cookie_jar = cookielib.MozillaCookieJar(self.cookie_file)
if os.path.exists(self.cookie_file):
try:
self.cookie_jar.load()
self.authenticated = True
except (cookielib.LoadError, IOError):
# Failed to load cookies - just ignore them.
pass
else:
# Create an empty cookie file with mode 600
fd = os.open(self.cookie_file, os.O_CREAT, 0600)
os.close(fd)
# Always chmod the cookie file
os.chmod(self.cookie_file, 0600)
else:
# Don't save cookies across runs of update.py.
self.cookie_jar = cookielib.CookieJar()
self.cookie_jar, \
self.authenticated = _open_jar(self.cookie_file)
opener.add_handler(urllib2.HTTPCookieProcessor(self.cookie_jar))
return opener


@@ -27,23 +27,35 @@ _UPLOADBUNDLERESPONSE_CODETYPE = descriptor.EnumDescriptor(
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='UNKNOWN_PROJECT', index=3, number=2,
name='UNKNOWN_CHANGE', index=3, number=9,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='UNKNOWN_BRANCH', index=4, number=3,
name='CHANGE_CLOSED', index=4, number=10,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='UNKNOWN_BUNDLE', index=5, number=5,
name='UNKNOWN_EMAIL', index=5, number=11,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='NOT_BUNDLE_OWNER', index=6, number=6,
name='UNKNOWN_PROJECT', index=6, number=2,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='BUNDLE_CLOSED', index=7, number=8,
name='UNKNOWN_BRANCH', index=7, number=3,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='UNKNOWN_BUNDLE', index=8, number=5,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='NOT_BUNDLE_OWNER', index=9, number=6,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='BUNDLE_CLOSED', index=10, number=8,
options=None,
type=None),
],
@@ -51,6 +63,35 @@ _UPLOADBUNDLERESPONSE_CODETYPE = descriptor.EnumDescriptor(
)
_REPLACEPATCHSET = descriptor.Descriptor(
name='ReplacePatchSet',
full_name='codereview.ReplacePatchSet',
filename='upload_bundle.proto',
containing_type=None,
fields=[
descriptor.FieldDescriptor(
name='change_id', full_name='codereview.ReplacePatchSet.change_id', index=0,
number=1, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='object_id', full_name='codereview.ReplacePatchSet.object_id', index=1,
number=2, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[], # TODO(robinson): Implement.
enum_types=[
],
options=None)
_UPLOADBUNDLEREQUEST = descriptor.Descriptor(
name='UploadBundleRequest',
full_name='codereview.UploadBundleRequest',
@@ -92,6 +133,27 @@ _UPLOADBUNDLEREQUEST = descriptor.Descriptor(
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='replace', full_name='codereview.UploadBundleRequest.replace', index=5,
number=2, type=11, cpp_type=10, label=3,
default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='reviewers', full_name='codereview.UploadBundleRequest.reviewers', index=6,
number=3, type=9, cpp_type=9, label=3,
default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='cc', full_name='codereview.UploadBundleRequest.cc', index=7,
number=4, type=9, cpp_type=9, label=3,
default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
@@ -121,6 +183,20 @@ _UPLOADBUNDLERESPONSE = descriptor.Descriptor(
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='invalid_reviewers', full_name='codereview.UploadBundleResponse.invalid_reviewers', index=2,
number=12, type=9, cpp_type=9, label=3,
default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='invalid_cc', full_name='codereview.UploadBundleResponse.invalid_cc', index=3,
number=13, type=9, cpp_type=9, label=3,
default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
@@ -174,8 +250,13 @@ _UPLOADBUNDLECONTINUE = descriptor.Descriptor(
options=None)
_UPLOADBUNDLEREQUEST.fields_by_name['replace'].message_type = _REPLACEPATCHSET
_UPLOADBUNDLERESPONSE.fields_by_name['status_code'].enum_type = _UPLOADBUNDLERESPONSE_CODETYPE
class ReplacePatchSet(message.Message):
__metaclass__ = reflection.GeneratedProtocolMessageType
DESCRIPTOR = _REPLACEPATCHSET
class UploadBundleRequest(message.Message):
__metaclass__ = reflection.GeneratedProtocolMessageType
DESCRIPTOR = _UPLOADBUNDLEREQUEST

docs/manifest-format.txt (new file, 200 lines)

@@ -0,0 +1,200 @@
repo Manifest Format
====================
A repo manifest describes the structure of a repo client; that is,
the directories that are visible and where they should be obtained
from with git.
The basic structure of a manifest is a bare Git repository holding
a single 'default.xml' XML file in the top level directory.
Manifests are inherently version controlled, since they are kept
within a Git repository. Updates to manifests are automatically
obtained by clients during `repo sync`.
XML File Format
---------------
A manifest XML file (e.g. 'default.xml') roughly conforms to the
following DTD:
<!DOCTYPE manifest [
<!ELEMENT manifest (remote*,
default?,
remove-project*,
project*,
add-remote*)>
<!ELEMENT remote (EMPTY)>
<!ATTLIST remote name ID #REQUIRED>
<!ATTLIST remote fetch CDATA #REQUIRED>
<!ATTLIST remote review CDATA #IMPLIED>
<!ATTLIST remote project-name CDATA #IMPLIED>
<!ELEMENT default (EMPTY)>
<!ATTLIST default remote IDREF #IMPLIED>
<!ATTLIST default revision CDATA #IMPLIED>
<!ELEMENT project (remote*)>
<!ATTLIST project name CDATA #REQUIRED>
<!ATTLIST project path CDATA #IMPLIED>
<!ATTLIST project remote IDREF #IMPLIED>
<!ATTLIST project revision CDATA #IMPLIED>
<!ELEMENT add-remote (EMPTY)>
<!ATTLIST add-remote to-project ID #REQUIRED>
<!ATTLIST add-remote name ID #REQUIRED>
<!ATTLIST add-remote fetch CDATA #REQUIRED>
<!ATTLIST add-remote review CDATA #IMPLIED>
<!ATTLIST add-remote project-name CDATA #IMPLIED>
<!ELEMENT remove-project (EMPTY)>
<!ATTLIST remove-project name CDATA #REQUIRED>
]>
A description of the elements and their attributes follows.
Element manifest
----------------
The root element of the file.
Element remote
--------------
One or more remote elements may be specified. Each remote element
specifies a Git URL shared by one or more projects and (optionally)
the Gerrit review server those projects upload changes through.
Attribute `name`: A short name unique to this manifest file. The
name specified here is used as the remote name in each project's
.git/config, and is therefore automatically available to commands
like `git fetch`, `git remote`, `git pull` and `git push`.
Attribute `fetch`: The Git URL prefix for all projects which use
this remote. Each project's name is appended to this prefix to
form the actual URL used to clone the project.
Attribute `review`: Hostname of the Gerrit server where reviews
are uploaded to by `repo upload`. This attribute is optional;
if not specified then `repo upload` will not function.
Attribute `project-name`: Specifies the name of this project used
by the review server given in the review attribute of this element.
Only permitted when the remote element is nested inside of a project
element (see below). If not given, defaults to the name supplied
in the project's name attribute.
Element add-remote
------------------
Adds a remote to an existing project, whose name is given by the
to-project attribute. This is functionally equivalent to nesting
a remote element under the project, but has the advantage that it
can be specified in the user's `local_manifest.xml` to add a remote
to a project declared by the normal manifest.
The element can be used to add a fork of an existing project that
the user needs to work with.
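For example, a user could attach a hypothetical fork to an already declared project (all names here are illustrative):

```xml
<add-remote to-project="kernel/common"
            name="myfork"
            fetch="git://fork.example.com/" />
```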
Element default
---------------
At most one default element may be specified. Its remote and
revision attributes are used when a project element does not
specify its own remote or revision attribute.
Attribute `remote`: Name of a previously defined remote element.
Project elements lacking a remote attribute of their own will use
this remote.
Attribute `revision`: Name of a Git branch (e.g. `master` or
`refs/heads/master`). Project elements lacking their own
revision attribute will use this revision.
Element project
---------------
One or more project elements may be specified. Each element
describes a single Git repository to be cloned into the repo
client workspace.
Attribute `name`: A unique name for this project. The project's
name is appended onto its remote's fetch URL to generate the actual
URL to configure the Git remote with. The URL gets formed as:
${remote_fetch}/${project_name}.git
where ${remote_fetch} is the remote's fetch attribute and
${project_name} is the project's name attribute. The suffix ".git"
is always appended as repo assumes the upstream is a forest of
bare Git repositories.
The project name must match the name Gerrit knows, if Gerrit is
being used for code reviews.
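For example (the remote name and hostname are hypothetical):

```xml
<remote name="korg" fetch="git://git.example.com" />
<project name="platform/build" remote="korg" />
<!-- resulting clone URL: git://git.example.com/platform/build.git -->
```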
Attribute `path`: An optional path relative to the top directory
of the repo client where the Git working directory for this project
should be placed. If not supplied the project name is used.
Attribute `remote`: Name of a previously defined remote element.
If not supplied the remote given by the default element is used.
Attribute `revision`: Name of the Git branch the manifest wants
to track for this project. Names can be relative to refs/heads
(e.g. just "master") or absolute (e.g. "refs/heads/master").
Tags and/or explicit SHA-1s should work in theory, but have not
been extensively tested. If not supplied the revision given by
the default element is used.
Child element `remote`: Described like the top-level remote element,
but adds an additional remote to only this project. These additional
remotes are fetched from first on the initial `repo sync`, causing
the majority of the project's object database to be obtained through
these additional remotes.
Element remove-project
----------------------
Deletes the named project from the internal manifest table, possibly
allowing a subsequent project element in the same manifest file to
replace the project with a different source.
This element is mostly useful in the local_manifest.xml, where
the user can remove a project, and possibly replace it with their
own definition.
Local Manifest
==============
Additional remotes and projects may be added through a local
manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`.
For example:
----
$ cat .repo/local_manifest.xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest>
<project path="manifest"
name="tools/manifest" />
<project path="platform-manifest"
name="platform/manifest" />
</manifest>
----
Users may add projects to the local manifest prior to a `repo sync`
invocation, instructing repo to automatically download and manage
these extra projects.
Currently the only supported feature of a local manifest is to
add new remotes and/or projects. In the future a local manifest
may support picking different revisions of a project, or deleting
projects specified in the default manifest.


@@ -69,14 +69,14 @@ least one of these before using this command."""
Returns:
new value of edited text; None if editing did not succeed
"""
editor = cls._GetEditor()
editor = cls._GetEditor().split()
fd, path = tempfile.mkstemp()
try:
os.write(fd, data)
os.close(fd)
fd = None
if subprocess.Popen([editor, path]).wait() != 0:
if subprocess.Popen(editor + [path]).wait() != 0:
raise EditorError()
return open(path).read()
finally:
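The fix above can be sketched outside repo; the helper name is hypothetical, but the splitting behavior is exactly what the diff introduces:

```python
def build_editor_command(editor, path):
    """Split a configured editor string into an argv list.

    An editor setting such as "emacs -nw" is a command line with
    arguments, so it must be split on whitespace before being handed
    to subprocess.Popen; passing the whole string as a single argv
    element would fail to execute.
    """
    return editor.split() + [path]
```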


@@ -64,3 +64,5 @@ class RepoChangedException(Exception):
repo or manifest repositories. In this special case we must
use exec to re-execute repo with the new code and manifest.
"""
def __init__(self, extra_args=[]):
self.extra_args = extra_args


@@ -15,7 +15,6 @@
import getpass
import os
import subprocess
import sys
from tempfile import mkstemp
@@ -76,6 +75,8 @@ def UploadBundle(project,
dest_branch,
src_branch,
bases,
people,
replace_changes = None,
save_cookies=True):
srv = _GetRpcServer(email, server, save_cookies)
@@ -112,8 +113,17 @@ def UploadBundle(project,
req = UploadBundleRequest()
req.dest_project = str(dest_project)
req.dest_branch = str(dest_branch)
for e in people[0]:
req.reviewers.append(e)
for e in people[1]:
req.cc.append(e)
for c in revlist:
req.contained_object.append(c)
if replace_changes:
for change_id,commit_id in replace_changes.iteritems():
r = req.replace.add()
r.change_id = change_id
r.object_id = commit_id
else:
req = UploadBundleContinue()
req.bundle_id = bundle_id
@@ -149,6 +159,14 @@ def UploadBundle(project,
elif rsp.status_code == UploadBundleResponse.UNAUTHORIZED_USER:
reason = ('Unauthorized user. Visit http://%s/hello to sign up.'
% server)
elif rsp.status_code == UploadBundleResponse.UNKNOWN_CHANGE:
reason = 'invalid change id'
elif rsp.status_code == UploadBundleResponse.CHANGE_CLOSED:
reason = 'one or more changes are closed'
elif rsp.status_code == UploadBundleResponse.UNKNOWN_EMAIL:
emails = [x for x in rsp.invalid_reviewers] + [
x for x in rsp.invalid_cc]
reason = 'invalid email addresses: %s' % ", ".join(emails)
else:
reason = 'unknown error ' + str(rsp.status_code)
raise UploadError(reason)


@@ -28,9 +28,13 @@ def IsId(rev):
class GitConfig(object):
_ForUser = None
@classmethod
def ForUser(cls):
return cls(file = os.path.expanduser('~/.gitconfig'))
if cls._ForUser is None:
cls._ForUser = cls(file = os.path.expanduser('~/.gitconfig'))
return cls._ForUser
@classmethod
def ForRepository(cls, gitdir, defaults=None):
@@ -254,6 +258,7 @@ class Remote(object):
self.name = name
self.url = self._Get('url')
self.review = self._Get('review')
self.projectname = self._Get('projectname')
self.fetch = map(lambda x: RefSpec.FromString(x),
self._Get('fetch', all=True))
@@ -281,18 +286,21 @@ class Remote(object):
return True
return False
def ResetFetch(self):
def ResetFetch(self, mirror=False):
"""Set the fetch refspec to its default value.
"""
self.fetch = [RefSpec(True,
'refs/heads/*',
'refs/remotes/%s/*' % self.name)]
if mirror:
dst = 'refs/heads/*'
else:
dst = 'refs/remotes/%s/*' % self.name
self.fetch = [RefSpec(True, 'refs/heads/*', dst)]
def Save(self):
"""Save this remote to the configuration.
"""
self._Set('url', self.url)
self._Set('review', self.review)
self._Set('projectname', self.projectname)
self._Set('fetch', map(lambda x: str(x), self.fetch))
def _Set(self, key, value):
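The refspec choice introduced in ResetFetch above can be sketched as a small standalone helper. The function name is an assumption, repo builds a RefSpec object rather than a string, and the leading '+' (forced update) is inferred from the `RefSpec(True, ...)` call in the diff:

```python
def default_fetch_refspec(remote_name, mirror=False):
    # A mirror replicates upstream branch names directly under
    # refs/heads/*, while a normal client maps them into the
    # per-remote refs/remotes/<name>/* namespace.
    src = 'refs/heads/*'
    if mirror:
        dst = src
    else:
        dst = 'refs/remotes/%s/*' % remote_name
    return '+%s:%s' % (src, dst)
```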

hooks/pre-auto-gc (new executable file, 44 lines)

@@ -0,0 +1,44 @@
#!/bin/sh
#
# An example hook script to verify if you are on battery, in case you
# are running Linux or OS X. Called by git-gc --auto with no arguments.
# The hook should exit with non-zero status after issuing an appropriate
# message if it wants to stop the auto repacking.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
if test -x /sbin/on_ac_power && /sbin/on_ac_power
then
exit 0
elif test "$(cat /sys/class/power_supply/AC/online 2>/dev/null)" = 1
then
exit 0
elif grep -q 'on-line' /proc/acpi/ac_adapter/AC/state 2>/dev/null
then
exit 0
elif grep -q '0x01$' /proc/apm 2>/dev/null
then
exit 0
elif grep -q "AC Power \+: 1" /proc/pmu/info 2>/dev/null
then
exit 0
elif test -x /usr/bin/pmset && /usr/bin/pmset -g batt |
grep -q "Currently drawing from 'AC Power'"
then
exit 0
fi
echo "Auto packing deferred; not on AC"
exit 1


@@ -1,422 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import random
import stat
import sys
import urllib2
import StringIO
from error import GitError, ImportError
from git_command import GitCommand
class ImportExternal(object):
"""Imports a single revision from a non-git data source.
Suitable for use to import a tar or zip based snapshot.
"""
def __init__(self):
self._marks = 0
self._files = {}
self._tempref = 'refs/repo-external/import'
self._urls = []
self._remap = []
self.parent = None
self._user_name = 'Upstream'
self._user_email = 'upstream-import@none'
self._user_when = 1000000
self.commit = None
def Clone(self):
r = self.__class__()
r.project = self.project
for u in self._urls:
r._urls.append(u)
for p in self._remap:
r._remap.append(_PathMap(r, p._old, p._new))
return r
def SetProject(self, project):
self.project = project
def SetVersion(self, version):
self.version = version
def AddUrl(self, url):
self._urls.append(url)
def SetParent(self, commit_hash):
self.parent = commit_hash
def SetCommit(self, commit_hash):
self.commit = commit_hash
def RemapPath(self, old, new, replace_version=True):
self._remap.append(_PathMap(self, old, new))
@property
def TagName(self):
v = ''
for c in self.version:
if c >= '0' and c <= '9':
v += c
elif c >= 'A' and c <= 'Z':
v += c
elif c >= 'a' and c <= 'z':
v += c
elif c in ('-', '_', '.', '/', '+', '@'):
v += c
return 'upstream/%s' % v
@property
def PackageName(self):
n = self.project.name
if n.startswith('platform/'):
# This was not my finest moment...
#
n = n[len('platform/'):]
return n
def Import(self):
self._need_graft = False
if self.parent:
try:
self.project.bare_git.cat_file('-e', self.parent)
except GitError:
self._need_graft = True
gfi = GitCommand(self.project,
['fast-import', '--force', '--quiet'],
bare = True,
provide_stdin = True)
try:
self._out = gfi.stdin
try:
self._UnpackFiles()
self._MakeCommit()
self._out.flush()
finally:
rc = gfi.Wait()
if rc != 0:
raise ImportError('fast-import failed')
if self._need_graft:
id = self._GraftCommit()
else:
id = self.project.bare_git.rev_parse('%s^0' % self._tempref)
if self.commit and self.commit != id:
raise ImportError('checksum mismatch: %s expected,'
' %s imported' % (self.commit, id))
self._MakeTag(id)
return id
finally:
try:
self.project.bare_git.DeleteRef(self._tempref)
except GitError:
pass
def _PickUrl(self, failed):
u = map(lambda x: x.replace('%version%', self.version), self._urls)
for f in failed:
if f in u:
u.remove(f)
if len(u) == 0:
return None
return random.choice(u)
def _OpenUrl(self):
failed = {}
while True:
url = self._PickUrl(failed.keys())
if url is None:
why = 'Cannot download %s' % self.project.name
if failed:
why += ': one or more mirrors are down\n'
bad_urls = list(failed.keys())
bad_urls.sort()
for url in bad_urls:
why += ' %s: %s\n' % (url, failed[url])
else:
why += ': no mirror URLs'
raise ImportError(why)
print >>sys.stderr, "Getting %s ..." % url
try:
return urllib2.urlopen(url), url
except urllib2.HTTPError, e:
failed[url] = e.code
except urllib2.URLError, e:
failed[url] = e.reason[1]
except OSError, e:
failed[url] = e.strerror
def _UnpackFiles(self):
raise NotImplementedError
def _NextMark(self):
self._marks += 1
return self._marks
def _UnpackOneFile(self, mode, size, name, fd):
if stat.S_ISDIR(mode): # directory
return
else:
mode = self._CleanMode(mode, name)
old_name = name
name = self._CleanName(name)
if stat.S_ISLNK(mode) and self._remap:
# The link is relative to the old_name, and may need to
# be rewritten according to our remap rules if it goes
# up high enough in the tree structure.
#
dest = self._RewriteLink(fd.read(size), old_name, name)
fd = StringIO.StringIO(dest)
size = len(dest)
fi = _File(mode, name, self._NextMark())
self._out.write('blob\n')
self._out.write('mark :%d\n' % fi.mark)
self._out.write('data %d\n' % size)
while size > 0:
n = min(2048, size)
self._out.write(fd.read(n))
size -= n
self._out.write('\n')
self._files[fi.name] = fi
def _SetFileMode(self, name, mode):
if not stat.S_ISDIR(mode):
mode = self._CleanMode(mode, name)
name = self._CleanName(name)
try:
fi = self._files[name]
except KeyError:
raise ImportError('file %s was not unpacked' % name)
fi.mode = mode
def _RewriteLink(self, dest, relto_old, relto_new):
# Drop the last component of the symlink path itself,
# as dest is relative to the directory it's in.
#
relto_old = _TrimPath(relto_old)
relto_new = _TrimPath(relto_new)
# Resolve the link to be absolute from the top of
# the archive, so we can remap its destination.
#
while dest.find('/./') >= 0 or dest.find('//') >= 0:
dest = dest.replace('/./', '/')
dest = dest.replace('//', '/')
if dest.startswith('../') or dest.find('/../') > 0:
dest = _FoldPath('%s/%s' % (relto_old, dest))
for pm in self._remap:
if pm.Matches(dest):
dest = pm.Apply(dest)
break
dest, relto_new = _StripCommonPrefix(dest, relto_new)
while relto_new:
i = relto_new.find('/')
if i > 0:
relto_new = relto_new[i + 1:]
else:
relto_new = ''
dest = '../' + dest
return dest
def _CleanMode(self, mode, name):
if stat.S_ISREG(mode): # regular file
if (mode & 0111) == 0:
return 0644
else:
return 0755
elif stat.S_ISLNK(mode): # symlink
return stat.S_IFLNK
else:
raise ImportError('invalid mode %o in %s' % (mode, name))
def _CleanName(self, name):
old_name = name
for pm in self._remap:
if pm.Matches(name):
name = pm.Apply(name)
break
while name.startswith('/'):
name = name[1:]
if not name:
raise ImportError('path %s is empty after remap' % old_name)
if name.find('/./') >= 0 or name.find('/../') >= 0:
raise ImportError('path %s contains relative parts' % name)
return name
def _MakeCommit(self):
msg = '%s %s\n' % (self.PackageName, self.version)
self._out.write('commit %s\n' % self._tempref)
self._out.write('committer %s <%s> %d +0000\n' % (
self._user_name,
self._user_email,
self._user_when))
self._out.write('data %d\n' % len(msg))
self._out.write(msg)
self._out.write('\n')
if self.parent and not self._need_graft:
self._out.write('from %s^0\n' % self.parent)
self._out.write('deleteall\n')
for f in self._files.values():
self._out.write('M %o :%d %s\n' % (f.mode, f.mark, f.name))
self._out.write('\n')
def _GraftCommit(self):
raw = self.project.bare_git.cat_file('commit', self._tempref)
raw = raw.split("\n")
while raw[1].startswith('parent '):
del raw[1]
raw.insert(1, 'parent %s' % self.parent)
id = self._WriteObject('commit', "\n".join(raw))
graft_file = os.path.join(self.project.gitdir, 'info/grafts')
if os.path.exists(graft_file):
graft_list = open(graft_file, 'rb').read().split("\n")
if graft_list and graft_list[-1] == '':
del graft_list[-1]
else:
graft_list = []
exists = False
for line in graft_list:
if line == id:
exists = True
break
if not exists:
graft_list.append(id)
graft_list.append('')
fd = open(graft_file, 'wb')
fd.write("\n".join(graft_list))
fd.close()
return id
def _MakeTag(self, id):
name = self.TagName
raw = []
raw.append('object %s' % id)
raw.append('type commit')
raw.append('tag %s' % name)
raw.append('tagger %s <%s> %d +0000' % (
self._user_name,
self._user_email,
self._user_when))
raw.append('')
raw.append('%s %s\n' % (self.PackageName, self.version))
tagid = self._WriteObject('tag', "\n".join(raw))
self.project.bare_git.UpdateRef('refs/tags/%s' % name, tagid)
def _WriteObject(self, type, data):
wo = GitCommand(self.project,
['hash-object', '-t', type, '-w', '--stdin'],
bare = True,
provide_stdin = True,
capture_stdout = True,
capture_stderr = True)
wo.stdin.write(data)
if wo.Wait() != 0:
raise GitError('cannot create %s from (%s)' % (type, data))
return wo.stdout[:-1]
def _TrimPath(path):
i = path.rfind('/')
if i > 0:
return path[0:i]
return ''
def _StripCommonPrefix(a, b):
while True:
ai = a.find('/')
bi = b.find('/')
if ai > 0 and bi > 0 and a[0:ai] == b[0:bi]:
a = a[ai + 1:]
b = b[bi + 1:]
else:
break
return a, b
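A minimal Python 3 restatement of `_StripCommonPrefix`, for illustration: it drops the leading slash-separated components that two paths share, which is the first step in relativizing a remapped symlink destination. The function name here is mine, not the module's.

```python
# Sketch of _StripCommonPrefix: peel off leading path components
# that both slash-separated paths have in common.
def strip_common_prefix(a, b):
    while True:
        ai = a.find('/')
        bi = b.find('/')
        if ai > 0 and bi > 0 and a[0:ai] == b[0:bi]:
            a = a[ai + 1:]
            b = b[bi + 1:]
        else:
            break
    return a, b

print(strip_common_prefix('src/main/a.c', 'src/main/b.c'))
```

After the shared `src/main/` prefix is removed, only the distinct tails remain.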
def _FoldPath(path):
while True:
if path.startswith('../'):
return path
i = path.find('/../')
if i <= 0:
if path.startswith('/'):
return path[1:]
return path
lhs = path[0:i]
rhs = path[i + 4:]
i = lhs.rfind('/')
if i > 0:
path = lhs[0:i + 1] + rhs
else:
path = rhs
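The path-folding logic above collapses `x/../` pairs without touching leading `../` segments. A self-contained Python 3 sketch (the name `fold_path` is mine) shows the behavior:

```python
# Sketch of _FoldPath: collapse 'x/../' pairs, but leave a leading
# '../' alone since it escapes the archive root.
def fold_path(path):
    while True:
        if path.startswith('../'):
            return path
        i = path.find('/../')
        if i <= 0:
            if path.startswith('/'):
                return path[1:]
            return path
        lhs = path[0:i]
        rhs = path[i + 4:]
        i = lhs.rfind('/')
        if i > 0:
            path = lhs[0:i + 1] + rhs
        else:
            path = rhs

print(fold_path('a/b/../c'))
```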
class _File(object):
def __init__(self, mode, name, mark):
self.mode = mode
self.name = name
self.mark = mark
class _PathMap(object):
def __init__(self, imp, old, new):
self._imp = imp
self._old = old
self._new = new
def _r(self, p):
return p.replace('%version%', self._imp.version)
@property
def old(self):
return self._r(self._old)
@property
def new(self):
return self._r(self._new)
def Matches(self, name):
return name.startswith(self.old)
def Apply(self, name):
return self.new + name[len(self.old):]
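The `_PathMap` rule above is a prefix swap with `%version%` substitution drawn from the importer. A standalone sketch (class and method names here are mine; the real class reads the version off its importer):

```python
# Sketch of the _PathMap remap rule: expand '%version%' in the
# old/new prefixes, then swap a matching prefix.
class PathMap:
    def __init__(self, version, old, new):
        self.version, self._old, self._new = version, old, new

    def _r(self, p):
        return p.replace('%version%', self.version)

    def matches(self, name):
        return name.startswith(self._r(self._old))

    def apply(self, name):
        old = self._r(self._old)
        return self._r(self._new) + name[len(old):]

pm = PathMap('1.2', 'pkg-%version%/', '')
print(pm.apply('pkg-1.2/src/a.c'))
```

This is how a versioned tarball prefix like `pkg-1.2/` gets stripped so every snapshot imports to the same tree layout.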


@ -1,206 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import bz2
import stat
import tarfile
import zlib
import StringIO
from import_ext import ImportExternal
from error import ImportError
class ImportTar(ImportExternal):
"""Streams an (optionally compressed) tar file from the network
directly into a Project's Git repository.
"""
@classmethod
def CanAccept(cls, url):
"""Can this importer read and unpack the data stored at url?
"""
if url.endswith('.tar.gz') or url.endswith('.tgz'):
return True
if url.endswith('.tar.bz2'):
return True
if url.endswith('.tar'):
return True
return False
def _UnpackFiles(self):
url_fd, url = self._OpenUrl()
try:
if url.endswith('.tar.gz') or url.endswith('.tgz'):
tar_fd = _Gzip(url_fd)
elif url.endswith('.tar.bz2'):
tar_fd = _Bzip2(url_fd)
elif url.endswith('.tar'):
tar_fd = _Raw(url_fd)
else:
raise ImportError('non-tar file extension: %s' % url)
try:
tar = tarfile.TarFile(name = url,
mode = 'r',
fileobj = tar_fd)
try:
for entry in tar:
mode = entry.mode
if (mode & 0170000) == 0:
if entry.isdir():
mode |= stat.S_IFDIR
elif entry.isfile() or entry.islnk(): # hard links as files
mode |= stat.S_IFREG
elif entry.issym():
mode |= stat.S_IFLNK
if stat.S_ISLNK(mode): # symlink
data_fd = StringIO.StringIO(entry.linkname)
data_sz = len(entry.linkname)
elif stat.S_ISDIR(mode): # directory
data_fd = StringIO.StringIO('')
data_sz = 0
else:
data_fd = tar.extractfile(entry)
data_sz = entry.size
self._UnpackOneFile(mode, data_sz, entry.name, data_fd)
finally:
tar.close()
finally:
tar_fd.close()
finally:
url_fd.close()
class _DecompressStream(object):
"""file like object to decompress a tar stream
"""
def __init__(self, fd):
self._fd = fd
self._pos = 0
self._buf = None
def tell(self):
return self._pos
def seek(self, offset):
d = offset - self._pos
if d > 0:
self.read(d)
elif d == 0:
pass
else:
raise NotImplementedError, 'seek backwards'
def close(self):
self._fd = None
def read(self, size = -1):
if not self._fd:
raise EOFError, 'Reached EOF'
r = []
try:
if size >= 0:
self._ReadChunk(r, size)
else:
while True:
self._ReadChunk(r, 2048)
except EOFError:
pass
if len(r) == 1:
r = r[0]
else:
r = ''.join(r)
self._pos += len(r)
return r
def _ReadChunk(self, r, size):
b = self._buf
try:
while size > 0:
if b is None or len(b) == 0:
b = self._Decompress(self._fd.read(2048))
continue
use = min(size, len(b))
r.append(b[:use])
b = b[use:]
size -= use
finally:
self._buf = b
def _Decompress(self, b):
raise NotImplementedError, '_Decompress'
class _Raw(_DecompressStream):
"""file like object for an uncompressed stream
"""
def __init__(self, fd):
_DecompressStream.__init__(self, fd)
def _Decompress(self, b):
return b
class _Bzip2(_DecompressStream):
"""file like object to decompress a .bz2 stream
"""
def __init__(self, fd):
_DecompressStream.__init__(self, fd)
self._bz = bz2.BZ2Decompressor()
def _Decompress(self, b):
return self._bz.decompress(b)
_FHCRC, _FEXTRA, _FNAME, _FCOMMENT = 2, 4, 8, 16
class _Gzip(_DecompressStream):
"""file like object to decompress a .gz stream
"""
def __init__(self, fd):
_DecompressStream.__init__(self, fd)
self._z = zlib.decompressobj(-zlib.MAX_WBITS)
magic = fd.read(2)
if magic != '\037\213':
raise IOError, 'Not a gzipped file'
method = ord(fd.read(1))
if method != 8:
raise IOError, 'Unknown compression method'
flag = ord(fd.read(1))
fd.read(6)
if flag & _FEXTRA:
xlen = ord(fd.read(1))
xlen += 256 * ord(fd.read(1))
fd.read(xlen)
if flag & _FNAME:
while fd.read(1) != '\0':
pass
if flag & _FCOMMENT:
while fd.read(1) != '\0':
pass
if flag & _FHCRC:
fd.read(2)
def _Decompress(self, b):
return self._z.decompress(b)
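The `_Gzip` class parses the gzip header by hand and hands raw deflate data to `zlib`. In modern Python the same effect can be had by asking `zlib` to consume the gzip header itself, using `wbits = 16 + MAX_WBITS`; a sketch (the function name is mine):

```python
import gzip
import zlib

# With wbits = 16 + MAX_WBITS, zlib parses the gzip header and
# trailer itself, so no manual FEXTRA/FNAME/FCOMMENT handling
# is needed.
def gunzip_stream(chunks):
    z = zlib.decompressobj(16 + zlib.MAX_WBITS)
    out = b''
    for chunk in chunks:
        out += z.decompress(chunk)
    return out + z.flush()

blob = gzip.compress(b'hello world')
print(gunzip_stream([blob[:5], blob[5:]]))
```

Feeding the stream in arbitrary chunks works because the decompress object keeps its own state between calls, which matches the 2 KiB read loop used by `_DecompressStream` above.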


@ -1,345 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import stat
import struct
import zlib
import cStringIO
from import_ext import ImportExternal
from error import ImportError
class ImportZip(ImportExternal):
"""Streams a zip file from the network directly into a Project's
Git repository.
"""
@classmethod
def CanAccept(cls, url):
"""Can this importer read and unpack the data stored at url?
"""
if url.endswith('.zip') or url.endswith('.jar'):
return True
return False
def _UnpackFiles(self):
url_fd, url = self._OpenUrl()
try:
if not self.__class__.CanAccept(url):
raise ImportError('non-zip file extension: %s' % url)
zip = _ZipFile(url_fd)
for entry in zip.FileRecords():
data = zip.Open(entry).read()
sz = len(data)
if data and _SafeCRLF(data):
data = data.replace('\r\n', '\n')
sz = len(data)
fd = cStringIO.StringIO(data)
self._UnpackOneFile(entry.mode, sz, entry.name, fd)
zip.Close(entry)
for entry in zip.CentralDirectory():
self._SetFileMode(entry.name, entry.mode)
zip.CheckTail()
finally:
url_fd.close()
def _SafeCRLF(data):
"""Is it reasonably safe to perform a CRLF->LF conversion?
If the stream contains a NUL byte it is likely binary,
and thus a CRLF->LF conversion may damage the stream.
If the only NUL is in the last position of the stream
and the data otherwise qualifies, we do the CRLF->LF
conversion anyway; at least one source ZIP file has
this structure in its source code.
If every occurrence of LF is part of a CRLF pair then
the conversion is safely bi-directional:
s/\r\n/\n/g and s/\n/\r\n/g convert between them.
"""
nul = data.find('\0')
if 0 <= nul and nul < (len(data) - 1):
return False
n_lf = 0
last = 0
while True:
lf = data.find('\n', last)
if lf < 0:
break
if lf == 0 or data[lf - 1] != '\r':
return False
last = lf + 1
n_lf += 1
return n_lf > 0
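The safety test can be restated as a self-contained Python 3 function (the name `safe_crlf` is mine; the original operates on byte strings, this sketch uses `str` for brevity):

```python
# Restatement of _SafeCRLF: conversion is safe only if every LF is
# preceded by CR, and any NUL appears only in the final position.
def safe_crlf(data):
    nul = data.find('\0')
    if 0 <= nul < (len(data) - 1):
        return False  # NUL mid-stream: likely binary data
    n_lf = 0
    last = 0
    while True:
        lf = data.find('\n', last)
        if lf < 0:
            break
        if lf == 0 or data[lf - 1] != '\r':
            return False  # bare LF: already LF-style, leave alone
        last = lf + 1
        n_lf += 1
    return n_lf > 0  # require at least one CRLF to bother
```

Note a stream with no line endings at all returns False, so untouched binary blobs never get rewritten.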
class _ZipFile(object):
"""Streaming iterator to parse a zip file on the fly.
"""
def __init__(self, fd):
self._fd = _UngetStream(fd)
def FileRecords(self):
return _FileIter(self._fd)
def CentralDirectory(self):
return _CentIter(self._fd)
def CheckTail(self):
type_buf = self._fd.read(4)
type = struct.unpack('<I', type_buf)[0]
if type != 0x06054b50: # end of central directory
raise ImportError('zip record %x unsupported' % type)
def Open(self, entry):
if entry.is_compressed:
return _InflateStream(self._fd)
else:
if entry.has_trailer:
raise ImportError('unable to extract streamed zip')
return _FixedLengthStream(self._fd, entry.uncompressed_size)
def Close(self, entry):
if entry.has_trailer:
type = struct.unpack('<I', self._fd.read(4))[0]
if type == 0x08074b50:
# Not a formal type marker, but commonly seen in zips
# as the data descriptor signature.
#
struct.unpack('<3I', self._fd.read(12))
else:
# No signature for the data descriptor, so read the
# remaining fields out of the stream
#
self._fd.read(8)
class _FileIter(object):
def __init__(self, fd):
self._fd = fd
def __iter__(self):
return self
def next(self):
fd = self._fd
type_buf = fd.read(4)
type = struct.unpack('<I', type_buf)[0]
if type != 0x04034b50: # local file header
fd.unread(type_buf)
raise StopIteration()
rec = _FileHeader(fd.read(26))
rec.name = fd.read(rec.name_len)
fd.read(rec.extra_len)
if rec.name.endswith('/'):
rec.name = rec.name[:-1]
rec.mode = stat.S_IFDIR | 0777
return rec
class _FileHeader(object):
"""Information about a single file in the archive.
0 version needed to extract 2 bytes
1 general purpose bit flag 2 bytes
2 compression method 2 bytes
3 last mod file time 2 bytes
4 last mod file date 2 bytes
5 crc-32 4 bytes
6 compressed size 4 bytes
7 uncompressed size 4 bytes
8 file name length 2 bytes
9 extra field length 2 bytes
"""
def __init__(self, raw_bin):
rec = struct.unpack('<5H3I2H', raw_bin)
if rec[2] == 8:
self.is_compressed = True
elif rec[2] == 0:
self.is_compressed = False
else:
raise ImportError('unrecognized compression format')
if rec[1] & (1 << 3):
self.has_trailer = True
else:
self.has_trailer = False
self.compressed_size = rec[6]
self.uncompressed_size = rec[7]
self.name_len = rec[8]
self.extra_len = rec[9]
self.mode = stat.S_IFREG | 0644
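The field layout in the docstring above maps onto the `'<5H3I2H'` format string: five 2-byte fields, three 4-byte fields, then two 2-byte lengths, for 26 bytes total. A sketch that round-trips a stored (method 0) entry through `struct` (the field values are illustrative, not from any real archive):

```python
import struct

# Build a 26-byte local-file-header body for a stored entry whose
# name is 5 bytes long, then unpack it with the same format.
name = b'a.txt'
raw = struct.pack('<5H3I2H',
                  20,          # version needed to extract
                  0,           # general purpose bit flag
                  0,           # compression method: 0 = stored
                  0, 0,        # last mod time / date
                  0,           # crc-32
                  5, 5,        # compressed / uncompressed size
                  len(name),   # file name length
                  0)           # extra field length
rec = struct.unpack('<5H3I2H', raw)
print(rec[2], rec[8])
```

Indices 2, 8, and 9 of the unpacked tuple are exactly what `_FileHeader.__init__` reads for the compression method and the name/extra lengths.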
class _CentIter(object):
def __init__(self, fd):
self._fd = fd
def __iter__(self):
return self
def next(self):
fd = self._fd
type_buf = fd.read(4)
type = struct.unpack('<I', type_buf)[0]
if type != 0x02014b50: # central directory
fd.unread(type_buf)
raise StopIteration()
rec = _CentHeader(fd.read(42))
rec.name = fd.read(rec.name_len)
fd.read(rec.extra_len)
fd.read(rec.comment_len)
if rec.name.endswith('/'):
rec.name = rec.name[:-1]
rec.mode = stat.S_IFDIR | 0777
return rec
class _CentHeader(object):
"""Information about a single file in the archive.
0 version made by 2 bytes
1 version needed to extract 2 bytes
2 general purpose bit flag 2 bytes
3 compression method 2 bytes
4 last mod file time 2 bytes
5 last mod file date 2 bytes
6 crc-32 4 bytes
7 compressed size 4 bytes
8 uncompressed size 4 bytes
9 file name length 2 bytes
10 extra field length 2 bytes
11 file comment length 2 bytes
12 disk number start 2 bytes
13 internal file attributes 2 bytes
14 external file attributes 4 bytes
15 relative offset of local header 4 bytes
"""
def __init__(self, raw_bin):
rec = struct.unpack('<6H3I5H2I', raw_bin)
self.name_len = rec[9]
self.extra_len = rec[10]
self.comment_len = rec[11]
if (rec[0] & 0xff00) == 0x0300: # UNIX
self.mode = rec[14] >> 16
else:
self.mode = stat.S_IFREG | 0644
class _UngetStream(object):
"""File like object to read and rewind a stream.
"""
def __init__(self, fd):
self._fd = fd
self._buf = None
def read(self, size = -1):
r = []
try:
if size >= 0:
self._ReadChunk(r, size)
else:
while True:
self._ReadChunk(r, 2048)
except EOFError:
pass
if len(r) == 1:
return r[0]
return ''.join(r)
def unread(self, buf):
b = self._buf
if b is None or len(b) == 0:
self._buf = buf
else:
self._buf = buf + b
def _ReadChunk(self, r, size):
b = self._buf
try:
while size > 0:
if b is None or len(b) == 0:
b = self._Inflate(self._fd.read(2048))
if not b:
raise EOFError()
continue
use = min(size, len(b))
r.append(b[:use])
b = b[use:]
size -= use
finally:
self._buf = b
def _Inflate(self, b):
return b
class _FixedLengthStream(_UngetStream):
"""File like object to read a fixed length stream.
"""
def __init__(self, fd, have):
_UngetStream.__init__(self, fd)
self._have = have
def _Inflate(self, b):
n = self._have
if n == 0:
self._fd.unread(b)
return None
if len(b) > n:
self._fd.unread(b[n:])
b = b[:n]
self._have -= len(b)
return b
class _InflateStream(_UngetStream):
"""Inflates the stream as it reads input.
"""
def __init__(self, fd):
_UngetStream.__init__(self, fd)
self._z = zlib.decompressobj(-zlib.MAX_WBITS)
def _Inflate(self, b):
z = self._z
if not z:
self._fd.unread(b)
return None
b = z.decompress(b)
if z.unconsumed_tail != '':
self._fd.unread(z.unconsumed_tail)
elif z.unused_data != '':
self._fd.unread(z.unused_data)
self._z = None
return b

main.py

@ -28,6 +28,7 @@ import re
import sys
from command import InteractiveCommand, PagedCommand
from editor import Editor
from error import NoSuchProjectError
from error import RepoChangedException
from manifest import Manifest
@ -77,6 +78,7 @@ class _Repo(object):
cmd.repodir = self.repodir
cmd.manifest = Manifest(cmd.repodir)
Editor.globalConfig = cmd.manifest.globalConfig
if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig
@ -184,11 +186,13 @@ def _Main(argv):
repo._Run(argv)
except KeyboardInterrupt:
sys.exit(1)
except RepoChangedException:
# If the repo or manifest changed, re-exec ourselves.
except RepoChangedException, rce:
# If repo changed, re-exec ourselves.
#
argv = list(sys.argv)
argv.extend(rce.extra_args)
try:
os.execv(__file__, sys.argv)
os.execv(__file__, argv)
except OSError, e:
print >>sys.stderr, 'fatal: cannot restart repo after upgrade'
print >>sys.stderr, 'fatal: %s' % e


@ -17,11 +17,8 @@ import os
import sys
import xml.dom.minidom
from editor import Editor
from git_config import GitConfig, IsId
from import_tar import ImportTar
from import_zip import ImportZip
from project import Project, MetaProject, R_TAGS
from project import Project, MetaProject, R_HEADS
from remote import Remote
from error import ManifestParseError
@ -42,24 +39,15 @@ class Manifest(object):
self.repodir = os.path.abspath(repodir)
self.topdir = os.path.dirname(self.repodir)
self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME)
self.globalConfig = GitConfig.ForUser()
Editor.globalConfig = self.globalConfig
self.repoProject = MetaProject(self, 'repo',
gitdir = os.path.join(repodir, 'repo/.git'),
worktree = os.path.join(repodir, 'repo'))
wt = os.path.join(repodir, 'manifests')
gd_new = os.path.join(repodir, 'manifests.git')
gd_old = os.path.join(wt, '.git')
if os.path.exists(gd_new) or not os.path.exists(gd_old):
gd = gd_new
else:
gd = gd_old
self.manifestProject = MetaProject(self, 'manifests',
gitdir = gd,
worktree = wt)
gitdir = os.path.join(repodir, 'manifests.git'),
worktree = os.path.join(repodir, 'manifests'))
self._Unload()
@ -100,6 +88,10 @@ class Manifest(object):
self._Load()
return self._default
@property
def IsMirror(self):
return self.manifestProject.config.GetBoolean('repo.mirror')
def _Unload(self):
self._loaded = False
self._projects = {}
@ -109,6 +101,12 @@ class Manifest(object):
def _Load(self):
if not self._loaded:
m = self.manifestProject
b = m.GetBranch(m.CurrentBranch).merge
if b.startswith(R_HEADS):
b = b[len(R_HEADS):]
self.branch = b
self._ParseManifest(True)
local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME)
@ -120,6 +118,10 @@ class Manifest(object):
finally:
self.manifestFile = real
if self.IsMirror:
self._AddMetaProjectMirror(self.repoProject)
self._AddMetaProjectMirror(self.manifestProject)
self._loaded = True
def _ParseManifest(self, is_root_file):
@ -135,10 +137,15 @@ class Manifest(object):
"no <manifest> in %s" % \
self.manifestFile
if is_root_file:
self.branch = config.getAttribute('branch')
if not self.branch:
self.branch = 'default'
for node in config.childNodes:
if node.nodeName == 'remove-project':
name = self._reqatt(node, 'name')
try:
del self._projects[name]
except KeyError:
raise ManifestParseError, \
'project %s not found' % \
(name)
for node in config.childNodes:
if node.nodeName == 'remote':
@ -168,6 +175,50 @@ class Manifest(object):
(project.name, self.manifestFile)
self._projects[project.name] = project
for node in config.childNodes:
if node.nodeName == 'add-remote':
pn = self._reqatt(node, 'to-project')
project = self._projects.get(pn)
if not project:
raise ManifestParseError, \
'project %s not defined in %s' % \
(pn, self.manifestFile)
self._ParseProjectExtraRemote(project, node)
def _AddMetaProjectMirror(self, m):
name = None
m_url = m.GetRemote(m.remote.name).url
if m_url.endswith('/.git'):
raise ManifestParseError, 'refusing to mirror %s' % m_url
if self._default and self._default.remote:
url = self._default.remote.fetchUrl
if not url.endswith('/'):
url += '/'
if m_url.startswith(url):
remote = self._default.remote
name = m_url[len(url):]
if name is None:
s = m_url.rindex('/') + 1
remote = Remote('origin', fetch = m_url[:s])
name = m_url[s:]
if name.endswith('.git'):
name = name[:-4]
if name not in self._projects:
m.PreSync()
gitdir = os.path.join(self.topdir, '%s.git' % name)
project = Project(manifest = self,
name = name,
remote = remote,
gitdir = gitdir,
worktree = None,
relpath = None,
revision = m.revision)
self._projects[project.name] = project
def _ParseRemote(self, node):
"""
reads a <remote> element from the manifest file
@ -175,10 +226,17 @@ class Manifest(object):
name = self._reqatt(node, 'name')
fetch = self._reqatt(node, 'fetch')
review = node.getAttribute('review')
if review == '':
review = None
projectName = node.getAttribute('project-name')
if projectName == '':
projectName = None
r = Remote(name=name,
fetch=fetch,
review=review)
review=review,
projectName=projectName)
for n in node.childNodes:
if n.nodeName == 'require':
@ -193,6 +251,8 @@ class Manifest(object):
d = _Default()
d.remote = self._get_remote(node)
d.revision = node.getAttribute('revision')
if d.revision == '':
d.revision = None
return d
def _ParseProject(self, node):
@ -225,8 +285,13 @@ class Manifest(object):
"project %s path cannot be absolute in %s" % \
(name, self.manifestFile)
worktree = os.path.join(self.topdir, path)
gitdir = os.path.join(self.repodir, 'projects/%s.git' % path)
if self.IsMirror:
relpath = None
worktree = None
gitdir = os.path.join(self.topdir, '%s.git' % name)
else:
worktree = os.path.join(self.topdir, path)
gitdir = os.path.join(self.repodir, 'projects/%s.git' % path)
project = Project(manifest = self,
name = name,
@ -238,93 +303,28 @@ class Manifest(object):
for n in node.childNodes:
if n.nodeName == 'remote':
r = self._ParseRemote(n)
if project.extraRemotes.get(r.name) \
or project.remote.name == r.name:
raise ManifestParseError, \
'duplicate remote %s in project %s in %s' % \
(r.name, project.name, self.manifestFile)
project.extraRemotes[r.name] = r
self._ParseProjectExtraRemote(project, n)
elif n.nodeName == 'copyfile':
self._ParseCopyFile(project, n)
to_resolve = []
by_version = {}
for n in node.childNodes:
if n.nodeName == 'import':
self._ParseImport(project, n, to_resolve, by_version)
for pair in to_resolve:
sn, pr = pair
try:
sn.SetParent(by_version[pr].commit)
except KeyError:
raise ManifestParseError, \
'snapshot %s not in project %s in %s' % \
(pr, project.name, self.manifestFile)
return project
def _ParseImport(self, project, import_node, to_resolve, by_version):
first_url = None
for node in import_node.childNodes:
if node.nodeName == 'mirror':
first_url = self._reqatt(node, 'url')
break
if not first_url:
def _ParseProjectExtraRemote(self, project, n):
r = self._ParseRemote(n)
if project.extraRemotes.get(r.name) \
or project.remote.name == r.name:
raise ManifestParseError, \
'mirror url required for project %s in %s' % \
(project.name, self.manifestFile)
imp = None
for cls in [ImportTar, ImportZip]:
if cls.CanAccept(first_url):
imp = cls()
break
if not imp:
raise ManifestParseError, \
'snapshot %s unsupported for project %s in %s' % \
(first_url, project.name, self.manifestFile)
imp.SetProject(project)
for node in import_node.childNodes:
if node.nodeName == 'remap':
old = node.getAttribute('strip')
new = node.getAttribute('insert')
imp.RemapPath(old, new)
elif node.nodeName == 'mirror':
imp.AddUrl(self._reqatt(node, 'url'))
for node in import_node.childNodes:
if node.nodeName == 'snapshot':
sn = imp.Clone()
sn.SetVersion(self._reqatt(node, 'version'))
sn.SetCommit(node.getAttribute('check'))
pr = node.getAttribute('prior')
if pr:
if IsId(pr):
sn.SetParent(pr)
else:
to_resolve.append((sn, pr))
rev = R_TAGS + sn.TagName
if rev in project.snapshots:
raise ManifestParseError, \
'duplicate snapshot %s for project %s in %s' % \
(sn.version, project.name, self.manifestFile)
project.snapshots[rev] = sn
by_version[sn.version] = sn
'duplicate remote %s in project %s in %s' % \
(r.name, project.name, self.manifestFile)
project.extraRemotes[r.name] = r
def _ParseCopyFile(self, project, node):
src = self._reqatt(node, 'src')
dest = self._reqatt(node, 'dest')
# src is project relative, and dest is relative to the top of the tree
project.AddCopyFile(src, os.path.join(self.topdir, dest))
if not self.IsMirror:
# src is project relative;
# dest is relative to the top of the tree
project.AddCopyFile(src, os.path.join(self.topdir, dest))
def _get_remote(self, node):
name = node.getAttribute('remote')


@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import errno
import filecmp
import os
import re
@ -45,6 +46,32 @@ def _info(fmt, *args):
def not_rev(r):
return '^' + r
hook_list = None
def repo_hooks():
global hook_list
if hook_list is None:
d = os.path.abspath(os.path.dirname(__file__))
d = os.path.join(d , 'hooks')
hook_list = map(lambda x: os.path.join(d, x), os.listdir(d))
return hook_list
def relpath(dst, src):
src = os.path.dirname(src)
top = os.path.commonprefix([dst, src])
if top.endswith('/'):
top = top[:-1]
else:
top = os.path.dirname(top)
tmp = src
rel = ''
while top != tmp:
rel += '../'
tmp = os.path.dirname(tmp)
return rel + dst[len(top) + 1:]
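The new `relpath` helper computes the path to a stock hook relative to the directory that will contain the symlink. A Python 3 restatement (the function name `rel` is mine; behavior assumes POSIX `/` separators, as the original does):

```python
import os.path

# Restatement of the relpath helper: path to dst, relative to the
# directory containing src (used when symlinking stock repo hooks).
def rel(dst, src):
    src = os.path.dirname(src)
    top = os.path.commonprefix([dst, src])
    if top.endswith('/'):
        top = top[:-1]
    else:
        top = os.path.dirname(top)
    up = ''
    tmp = src
    while top != tmp:
        up += '../'             # climb out of src's directory
        tmp = os.path.dirname(tmp)
    return up + dst[len(top) + 1:]

print(rel('/r/hooks/commit-msg', '/r/projects/p.git/hooks/commit-msg'))
```

Note `os.path.commonprefix` is character-wise, which is why the result is trimmed back to a `/` boundary before counting the `../` hops.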
class DownloadedChange(object):
_commit_cache = None
@ -77,6 +104,7 @@ class ReviewableBranch(object):
self.project = project
self.branch = branch
self.base = base
self.replace_changes = None
@property
def name(self):
@ -96,6 +124,16 @@ class ReviewableBranch(object):
'--')
return self._commit_cache
@property
def unabbrev_commits(self):
r = dict()
for commit in self.project.bare_git.rev_list(
not_rev(self.base),
R_HEADS + self.name,
'--'):
r[commit[0:8]] = commit
return r
@property
def date(self):
return self.project.bare_git.log(
@ -104,8 +142,10 @@ class ReviewableBranch(object):
R_HEADS + self.name,
'--')
def UploadForReview(self):
self.project.UploadForReview(self.name)
def UploadForReview(self, people):
self.project.UploadForReview(self.name,
self.replace_changes,
people)
@property
def tip_url(self):
@ -184,7 +224,10 @@ class Project(object):
gitdir = self.gitdir,
defaults = self.manifest.globalConfig)
self.work_git = self._GitGetByExec(self, bare=False)
if self.worktree:
self.work_git = self._GitGetByExec(self, bare=False)
else:
self.work_git = None
self.bare_git = self._GitGetByExec(self, bare=True)
@property
@ -398,15 +441,23 @@ class Project(object):
if branch in pubed and pubed[branch] == id:
continue
branch = self.GetBranch(branch)
base = branch.LocalMerge
if branch.LocalMerge:
rb = ReviewableBranch(self, branch, base)
if rb.commits:
ready.append(rb)
rb = self.GetUploadableBranch(branch)
if rb:
ready.append(rb)
return ready
def UploadForReview(self, branch=None):
def GetUploadableBranch(self, branch_name):
"""Get a single uploadable branch, or None.
"""
branch = self.GetBranch(branch_name)
base = branch.LocalMerge
if branch.LocalMerge:
rb = ReviewableBranch(self, branch, base)
if rb.commits:
return rb
return None
def UploadForReview(self, branch=None, replace_changes=None, people=([],[])):
"""Uploads the named branch for code review.
"""
if branch is None:
@ -431,16 +482,22 @@ class Project(object):
if not base_list:
raise GitError('no base refs, cannot upload %s' % branch.name)
if not branch.remote.projectname:
branch.remote.projectname = self.name
branch.remote.Save()
print >>sys.stderr, ''
_info("Uploading %s to %s:", branch.name, self.name)
try:
UploadBundle(project = self,
server = branch.remote.review,
email = self.UserEmail,
dest_project = self.name,
dest_project = branch.remote.projectname,
dest_branch = dest_branch,
src_branch = R_HEADS + branch.name,
bases = base_list)
bases = base_list,
people = people,
replace_changes = replace_changes)
except proto_client.ClientLoginError:
raise UploadError('Login failure')
except urllib2.HTTPError, e:
@ -462,18 +519,28 @@ class Project(object):
print >>sys.stderr
print >>sys.stderr, 'Initializing project %s ...' % self.name
self._InitGitDir()
self._InitRemote()
for r in self.extraRemotes.values():
if not self._RemoteFetch(r.name):
return False
if not self._SnapshotDownload():
return False
if not self._RemoteFetch():
return False
self._RepairAndroidImportErrors()
self._InitMRef()
if self.worktree:
self._RepairAndroidImportErrors()
self._InitMRef()
else:
self._InitMirrorHead()
try:
os.remove(os.path.join(self.gitdir, 'FETCH_HEAD'))
except OSError:
pass
return True
def PostRepoUpgrade(self):
self._InitHooks()
def _CopyFiles(self):
for file in self.copyfiles:
file._Copy()
@ -565,6 +632,19 @@ class Project(object):
_info("[%s] Consider merging or rebasing the"
" unpublished commits.", self.name)
return True
elif upstream_gain:
# We can fast-forward safely.
#
try:
self._FastForward(rev)
except GitError:
return False
self._CopyFiles()
return True
else:
# Trivially no changes in the upstream.
#
return True
if merge == rev:
try:
@ -629,33 +709,6 @@ class Project(object):
self._CopyFiles()
return True
def _SnapshotDownload(self):
if self.snapshots:
have = set(self._allrefs.keys())
need = []
for tag, sn in self.snapshots.iteritems():
if tag not in have:
need.append(sn)
if need:
print >>sys.stderr, """
*** Downloading source(s) from a mirror site. ***
*** If the network hangs, kill and restart repo. ***
"""
for sn in need:
try:
sn.Import()
except ImportError, e:
print >>sys.stderr, \
'error: Cannot import %s: %s' \
% (self.name, e)
return False
cmd = ['repack', '-a', '-d', '-f', '-l']
if GitCommand(self, cmd, bare = True).Wait() != 0:
return False
return True
def AddCopyFile(self, src, dest):
# dest should already be an absolute path, but src is project relative
# make src an absolute path
@ -696,6 +749,22 @@ class Project(object):
else:
raise GitError('%s checkout %s ' % (self.name, rev))
def AbandonBranch(self, name):
"""Destroy a local topic branch.
"""
try:
tip_rev = self.bare_git.rev_parse(R_HEADS + name)
except GitError:
return
if self.CurrentBranch == name:
self._Checkout(
self.GetRemote(self.remote.name).ToLocal(self.revision),
quiet=True)
cmd = ['branch', '-D', name]
GitCommand(self, cmd, capture_stdout=True).Wait()
def PruneHeads(self):
"""Prune any topic branches already merged into upstream.
"""
@@ -762,41 +831,11 @@ class Project(object):
def _RemoteFetch(self, name=None):
if not name:
name = self.remote.name
hide_errors = False
if self.extraRemotes or self.snapshots:
hide_errors = True
proc = GitCommand(self,
['fetch', name],
bare = True,
capture_stderr = hide_errors)
if hide_errors:
err = proc.process.stderr.fileno()
buf = ''
while True:
b = os.read(err, 256)
if b:
buf += b
while buf:
r = buf.find('remote: error: unable to find ')
if r >= 0:
lf = buf.find('\n')
if lf < 0:
break
buf = buf[lf + 1:]
continue
cr = buf.find('\r')
if cr < 0:
break
os.write(2, buf[0:cr + 1])
buf = buf[cr + 1:]
if not b:
if buf:
os.write(2, buf)
break
return proc.Wait() == 0
cmd = ['fetch']
if not self.worktree:
cmd.append('--update-head-ok')
cmd.append(name)
return GitCommand(self, cmd, bare = True).Wait() == 0
def _Checkout(self, rev, quiet=False):
cmd = ['checkout']
@@ -836,16 +875,35 @@ class Project(object):
self.config.SetString('core.bare', None)
hooks = self._gitdir_path('hooks')
for old_hook in os.listdir(hooks):
try:
to_rm = os.listdir(hooks)
except OSError:
to_rm = []
for old_hook in to_rm:
os.remove(os.path.join(hooks, old_hook))
# TODO(sop) install custom repo hooks
self._InitHooks()
m = self.manifest.manifestProject.config
for key in ['user.name', 'user.email']:
if m.Has(key, include_defaults = False):
self.config.SetString(key, m.GetString(key))
def _InitHooks(self):
hooks = self._gitdir_path('hooks')
if not os.path.exists(hooks):
os.makedirs(hooks)
for stock_hook in repo_hooks():
dst = os.path.join(hooks, os.path.basename(stock_hook))
try:
os.symlink(relpath(stock_hook, dst), dst)
except OSError, e:
if e.errno == errno.EEXIST:
pass
elif e.errno == errno.EPERM:
raise GitError('filesystem must support symlinks')
else:
raise
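The hook links created above point from the project's hooks directory back into repo's stock copy via a relative path, so the client tree stays relocatable as a whole. A minimal sketch of that computation, assuming repo's `relpath` helper behaves like `os.path.relpath` (the paths below are made up for illustration):

```python
import os

# Hypothetical layout: repo's stock hook, and the per-project link location.
stock_hook = '/client/.repo/repo/hooks/commit-msg'
dst = '/client/project/.git/hooks/commit-msg'

# The symlink target is the hook's path relative to the link's directory.
target = os.path.relpath(stock_hook, os.path.dirname(dst))
print(target)  # ../../../.repo/repo/hooks/commit-msg
```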
def _InitRemote(self):
if self.remote.fetchUrl:
remote = self.GetRemote(self.remote.name)
@@ -856,14 +914,23 @@ class Project(object):
url += '/%s.git' % self.name
remote.url = url
remote.review = self.remote.reviewUrl
if remote.projectname is None:
remote.projectname = self.name
remote.ResetFetch()
if self.worktree:
remote.ResetFetch(mirror=False)
else:
remote.ResetFetch(mirror=True)
remote.Save()
for r in self.extraRemotes.values():
remote = self.GetRemote(r.name)
remote.url = r.fetchUrl
remote.review = r.reviewUrl
if r.projectName:
remote.projectname = r.projectName
elif remote.projectname is None:
remote.projectname = self.name
remote.ResetFetch()
remote.Save()
@@ -880,24 +947,16 @@ class Project(object):
dst = remote.ToLocal(self.revision)
self.bare_git.symbolic_ref('-m', msg, ref, dst)
def _InitMirrorHead(self):
dst = self.GetRemote(self.remote.name).ToLocal(self.revision)
msg = 'manifest set to %s' % self.revision
self.bare_git.SetHead(dst, message=msg)
def _InitWorkTree(self):
dotgit = os.path.join(self.worktree, '.git')
if not os.path.exists(dotgit):
os.makedirs(dotgit)
topdir = os.path.commonprefix([self.gitdir, dotgit])
if topdir.endswith('/'):
topdir = topdir[:-1]
else:
topdir = os.path.dirname(topdir)
tmpdir = dotgit
relgit = ''
while topdir != tmpdir:
relgit += '../'
tmpdir = os.path.dirname(tmpdir)
relgit += self.gitdir[len(topdir) + 1:]
for name in ['config',
'description',
'hooks',
@@ -908,8 +967,15 @@ class Project(object):
'refs',
'rr-cache',
'svn']:
os.symlink(os.path.join(relgit, name),
os.path.join(dotgit, name))
try:
src = os.path.join(self.gitdir, name)
dst = os.path.join(dotgit, name)
os.symlink(relpath(src, dst), dst)
except OSError, e:
if e.errno == errno.EPERM:
raise GitError('filesystem must support symlinks')
else:
raise
rev = self.GetRemote(self.remote.name).ToLocal(self.revision)
rev = self.bare_git.rev_parse('%s^0' % rev)


@@ -14,8 +14,12 @@
# limitations under the License.
class Remote(object):
def __init__(self, name, fetch=None, review=None):
def __init__(self, name,
fetch=None,
review=None,
projectName=None):
self.name = name
self.fetchUrl = fetch
self.reviewUrl = review
self.projectName = projectName
self.requiredCommits = []

repo

@@ -28,7 +28,7 @@ if __name__ == '__main__':
del magic
# increment this whenever we make important changes to this script
VERSION = (1, 5)
VERSION = (1, 7)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1,0)
@@ -115,6 +115,9 @@ group.add_option('-b', '--manifest-branch',
group.add_option('-m', '--manifest-name',
dest='manifest_name',
help='initial manifest file', metavar='NAME.xml')
group.add_option('--mirror',
dest='mirror', action='store_true',
help='mirror the forest')
# Tool
group = init_optparse.add_option_group('Version options')
@@ -202,6 +205,7 @@ def _CheckGitVersion():
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
ver_str = proc.stdout.read().strip()
proc.stdout.close()
proc.wait()
if not ver_str.startswith('git version '):
print >>sys.stderr, 'error: "%s" unsupported' % ver_str

subcmds/abandon.py

@@ -0,0 +1,42 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from command import Command
from git_command import git
class Abandon(Command):
common = True
helpSummary = "Permanently abandon a development branch"
helpUsage = """
%prog <branchname> [<project>...]
This subcommand permanently abandons a development branch by
deleting it (and all its history) from your local repository.
It is equivalent to "git branch -D <branchname>".
"""
def Execute(self, opt, args):
if not args:
self.Usage()
nb = args[0]
if not git.check_ref_format('heads/%s' % nb):
print >>sys.stderr, "error: '%s' is not a valid name" % nb
sys.exit(1)
for project in self.GetProjects(args[1:]):
project.AbandonBranch(nb)


@@ -1,169 +0,0 @@
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
import tempfile
from command import Command
from error import GitError, NoSuchProjectError
from git_config import IsId
from import_tar import ImportTar
from import_zip import ImportZip
from project import Project
from remote import Remote
def _ToCommit(project, rev):
return project.bare_git.rev_parse('--verify', '%s^0' % rev)
def _Missing(project, rev):
return project._revlist('--objects', rev, '--not', '--all')
class ComputeSnapshotCheck(Command):
common = False
helpSummary = "Compute the check value for a new snapshot"
helpUsage = """
%prog -p NAME -v VERSION -s FILE [options]
"""
helpDescription = """
%prog computes and then displays the proper check value for a
snapshot, so it can be pasted into the manifest file for a project.
"""
def _Options(self, p):
g = p.add_option_group('Snapshot description options')
g.add_option('-p', '--project',
dest='project', metavar='NAME',
help='destination project name')
g.add_option('-v', '--version',
dest='version', metavar='VERSION',
help='upstream version/revision identifier')
g.add_option('-s', '--snapshot',
dest='snapshot', metavar='PATH',
help='local tarball path')
g.add_option('--new-project',
dest='new_project', action='store_true',
help='destination is a new project')
g.add_option('--keep',
dest='keep_git', action='store_true',
help='keep the temporary git repository')
g = p.add_option_group('Base revision grafting options')
g.add_option('--prior',
dest='prior', metavar='COMMIT',
help='prior revision checksum')
g = p.add_option_group('Path mangling options')
g.add_option('--strip-prefix',
dest='strip_prefix', metavar='PREFIX',
help='remove prefix from all paths on import')
g.add_option('--insert-prefix',
dest='insert_prefix', metavar='PREFIX',
help='insert prefix before all paths on import')
def _Compute(self, opt):
try:
real_project = self.GetProjects([opt.project])[0]
except NoSuchProjectError:
if opt.new_project:
print >>sys.stderr, \
"warning: project '%s' does not exist" % opt.project
else:
raise NoSuchProjectError(opt.project)
self._tmpdir = tempfile.mkdtemp()
project = Project(manifest = self.manifest,
name = opt.project,
remote = Remote('origin'),
gitdir = os.path.join(self._tmpdir, '.git'),
worktree = self._tmpdir,
relpath = opt.project,
revision = 'refs/heads/master')
project._InitGitDir()
url = 'file://%s' % os.path.abspath(opt.snapshot)
imp = None
for cls in [ImportTar, ImportZip]:
if cls.CanAccept(url):
imp = cls()
break
if not imp:
print >>sys.stderr, 'error: %s unsupported' % opt.snapshot
sys.exit(1)
imp.SetProject(project)
imp.SetVersion(opt.version)
imp.AddUrl(url)
if opt.prior:
if opt.new_project:
if not IsId(opt.prior):
print >>sys.stderr, 'error: --prior=%s not valid' % opt.prior
sys.exit(1)
else:
try:
opt.prior = _ToCommit(real_project, opt.prior)
missing = _Missing(real_project, opt.prior)
except GitError, e:
print >>sys.stderr,\
'error: --prior=%s not valid\n%s' \
% (opt.prior, e)
sys.exit(1)
if missing:
print >>sys.stderr,\
'error: --prior=%s is valid, but is not reachable' \
% opt.prior
sys.exit(1)
imp.SetParent(opt.prior)
src = opt.strip_prefix
dst = opt.insert_prefix
if src or dst:
if src is None:
src = ''
if dst is None:
dst = ''
imp.RemapPath(src, dst)
commitId = imp.Import()
print >>sys.stderr,"%s\t%s" % (commitId, imp.version)
return project
def Execute(self, opt, args):
if args \
or not opt.project \
or not opt.version \
or not opt.snapshot:
self.Usage()
success = False
project = None
try:
self._tmpdir = None
project = self._Compute(opt)
finally:
if project and opt.keep_git:
print 'GIT_DIR = %s' % (project.gitdir)
elif self._tmpdir:
for root, dirs, files in os.walk(self._tmpdir, topdown=False):
for name in files:
os.remove(os.path.join(root, name))
for name in dirs:
os.rmdir(os.path.join(root, name))
os.rmdir(self._tmpdir)


@@ -57,6 +57,10 @@ default.xml will be used.
g.add_option('-m', '--manifest-name',
dest='manifest_name', default='default.xml',
help='initial manifest file', metavar='NAME.xml')
g.add_option('--mirror',
dest='mirror', action='store_true',
help='mirror the forest')
# Tool
g = p.add_option_group('Version options')
@@ -112,6 +116,9 @@ default.xml will be used.
r.ResetFetch()
r.Save()
if opt.mirror:
m.config.SetString('repo.mirror', 'true')
m.Sync_NetworkHalf()
m.Sync_LocalHalf()
m.StartBranch('default')
@@ -185,9 +192,14 @@ default.xml will be used.
self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name)
if os.isatty(0) and os.isatty(1):
if os.isatty(0) and os.isatty(1) and not opt.mirror:
self._ConfigureUser()
self._ConfigureColor()
if opt.mirror:
type = 'mirror '
else:
type = ''
print ''
print 'repo initialized in %s' % self.manifest.topdir
print 'repo %sinitialized in %s' % (type, self.manifest.topdir)


@@ -49,6 +49,9 @@ the manifest.
p.add_option('--no-repo-verify',
dest='no_repo_verify', action='store_true',
help='do not verify repo source code')
p.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true',
help='perform additional actions after a repo upgrade')
def _Fetch(self, *projects):
fetched = set()
@@ -67,6 +70,11 @@ the manifest.
mp = self.manifest.manifestProject
mp.PreSync()
if opt.repo_upgraded:
for project in self.manifest.projects.values():
if project.Exists:
project.PostRepoUpgrade()
all = self.GetProjects(args, missing_ok=True)
fetched = self._Fetch(rp, mp, *all)
@@ -77,7 +85,7 @@ the manifest.
if not rp.Sync_LocalHalf():
sys.exit(1)
print >>sys.stderr, 'info: Restarting repo with latest version'
raise RepoChangedException()
raise RepoChangedException(['--repo-upgraded'])
else:
print >>sys.stderr, 'warning: Skipped upgrade to unverified version'
@@ -94,8 +102,9 @@ the manifest.
self._Fetch(*missing)
for project in all:
if not project.Sync_LocalHalf():
sys.exit(1)
if project.worktree:
if not project.Sync_LocalHalf():
sys.exit(1)
def _VerifyTag(project):


@@ -25,11 +25,17 @@ def _die(fmt, *args):
print >>sys.stderr, 'error: %s' % msg
sys.exit(1)
def _SplitEmails(values):
result = []
for str in values:
result.extend([s.strip() for s in str.split(',')])
return result
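The `_SplitEmails` helper above flattens repeated `--re`/`--cc` flags and comma-separated values into a single address list; a Python 3 rendering of the same logic (addresses below are made-up examples):

```python
def split_emails(values):
    # Each flag occurrence may itself carry a comma-separated list;
    # flatten everything into one list of stripped addresses.
    result = []
    for value in values:
        result.extend(s.strip() for s in value.split(','))
    return result

# e.g. repo upload --re a@example.com,b@example.com --re c@example.com
print(split_emails(['a@example.com, b@example.com', 'c@example.com']))
# ['a@example.com', 'b@example.com', 'c@example.com']
```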
class Upload(InteractiveCommand):
common = True
helpSummary = "Upload changes for code review"
helpUsage="""
%prog [<project>]...
%prog [--re --cc] {[<project>]... | --replace <project>}
"""
helpDescription = """
The '%prog' command is used to send changes to the Gerrit code
@@ -44,9 +50,31 @@ at the command line. Projects can be specified either by name, or
by a relative or absolute path to the project's local directory. If
no projects are specified, '%prog' will search for uploadable
changes in all projects listed in the manifest.
If the --reviewers or --cc options are passed, those emails are
added to the respective list of users, and emails are sent to any
new users. Users passed to --reviewers must already be registered
with the code review system, or the upload will fail.
If the --replace option is passed, the user can designate which
existing change(s) in Gerrit match up to the commits in the branch
being uploaded. For each matched (change, commit) pair, the commit
will be added as a new patch set, completely replacing the set of
files and description associated with the change in Gerrit.
"""
def _SingleBranch(self, branch):
def _Options(self, p):
p.add_option('--replace',
dest='replace', action='store_true',
help='Upload replacement patch sets from this branch')
p.add_option('--re', '--reviewers',
type='string', action='append', dest='reviewers',
help='Request reviews from these people.')
p.add_option('--cc',
type='string', action='append', dest='cc',
help='Also send email to these email addresses.')
def _SingleBranch(self, branch, people):
project = branch.project
name = branch.name
date = branch.date
@@ -64,11 +92,11 @@ changes in all projects listed in the manifest.
sys.stdout.write('(y/n)? ')
answer = sys.stdin.readline().strip()
if answer in ('y', 'Y', 'yes', '1', 'true', 't'):
self._UploadAndReport([branch])
self._UploadAndReport([branch], people)
else:
_die("upload aborted by user")
def _MultipleBranches(self, pending):
def _MultipleBranches(self, pending, people):
projects = {}
branches = {}
@@ -127,13 +155,62 @@ changes in all projects listed in the manifest.
todo.append(branch)
if not todo:
_die("nothing uncommented for upload")
self._UploadAndReport(todo)
self._UploadAndReport(todo, people)
def _UploadAndReport(self, todo):
def _ReplaceBranch(self, project, people):
branch = project.CurrentBranch
if not branch:
print >>sys.stdout, "no branches ready for upload"
return
branch = project.GetUploadableBranch(branch)
if not branch:
print >>sys.stdout, "no branches ready for upload"
return
script = []
script.append('# Replacing from branch %s' % branch.name)
for commit in branch.commits:
script.append('[ ] %s' % commit)
script.append('')
script.append('# Insert change numbers in the brackets to add a new patch set.')
script.append('# To create a new change record, leave the brackets empty.')
script = Editor.EditString("\n".join(script)).split("\n")
change_re = re.compile(r'^\[\s*(\d{1,})\s*\]\s*([0-9a-f]{1,}) .*$')
to_replace = dict()
full_hashes = branch.unabbrev_commits
for line in script:
m = change_re.match(line)
if m:
c = m.group(1)
f = m.group(2)
try:
f = full_hashes[f]
except KeyError:
print 'fh = %s' % full_hashes
print >>sys.stderr, "error: commit %s not found" % f
sys.exit(1)
if c in to_replace:
print >>sys.stderr,\
"error: change %s cannot accept multiple commits" % c
sys.exit(1)
to_replace[c] = f
if not to_replace:
print >>sys.stderr, "error: no replacements specified"
print >>sys.stderr, " use 'repo upload' without --replace"
sys.exit(1)
branch.replace_changes = to_replace
self._UploadAndReport([branch], people)
def _UploadAndReport(self, todo, people):
have_errors = False
for branch in todo:
try:
branch.UploadForReview()
branch.UploadForReview(people)
branch.uploaded = True
except UploadError, e:
branch.error = e
@@ -167,6 +244,22 @@ changes in all projects listed in the manifest.
def Execute(self, opt, args):
project_list = self.GetProjects(args)
pending = []
reviewers = []
cc = []
if opt.reviewers:
reviewers = _SplitEmails(opt.reviewers)
if opt.cc:
cc = _SplitEmails(opt.cc)
people = (reviewers,cc)
if opt.replace:
if len(project_list) != 1:
print >>sys.stderr, \
'error: --replace requires exactly one project'
sys.exit(1)
self._ReplaceBranch(project_list[0], people)
return
for project in project_list:
avail = project.GetUploadableBranches()
@@ -176,6 +269,6 @@ changes in all projects listed in the manifest.
if not pending:
print >>sys.stdout, "no branches ready for upload"
elif len(pending) == 1 and len(pending[0][1]) == 1:
self._SingleBranch(pending[0][1][0])
self._SingleBranch(pending[0][1][0], people)
else:
self._MultipleBranches(pending)
self._MultipleBranches(pending, people)