Compare commits

...

126 Commits

Author SHA1 Message Date
936183a492 git_config: add support for remote '.'
As a fix for issue #149, this patch adds support for the remote '.'
(local).

As an alias for the local repository, remote '.' lacks a fetch =
entry in .git/config.

Without such a refspec, repo info --overview is not able to process a
local tracking branch.

v2: Check for name == '.' before checking if merge starts with refs/,
    since the case where it's not is invalid.

Signed-off-by: Yann Droneaud <ydroneaud@opteya.com>
Signed-off-by: Filipe Brandenburger <filbranden@google.com>

Change-Id: I8c8fd8602cd68baecb530301ae41d37d751ec85d
2015-03-06 13:23:27 -08:00
85e8267031 Merge "Implementation of manifest defined githooks" 2015-03-05 20:52:30 +00:00
e30f46b957 Print stderr output from git command for RemoteFetch
The stderr output generated by git during a RemoteFetch was not being
printed.  This information is useful, so print it.

Change-Id: I6e6ce12c4a57e5ca2359f76ce14f2fcbbc37a5ef
2015-02-25 14:29:28 -08:00
e4978cfbe3 Ensure the repo project is never fetched with partial depth
If the repo project is synced with partial depth, then the tags
won't be fetched and users will be told the newest sha1 in the
stable branch isn't signed.

Change-Id: I107df97b4836b928c76aa33a700fa35d1705ae09
2015-02-10 14:44:05 -08:00
126e298214 Handle case where 'git remote prune' needs to be run
Handle the case when this error occurs:
    error: some local refs could not be updated; try running
     'git remote prune origin' to remove any old, conflicting branches

This is usually caused by a reference getting changed from a file to a
directory.

For example:
  Initially someone creates a branch 'foo' and it is stored as:
    .git/refs/remotes/origin/foo

  Then later on it is decided to change the layout structure where 'foo'
  is a directory with branches below it:
    .git/refs/remotes/origin/foo/master

  The problem occurs when someone still has
  '.git/refs/remotes/origin/foo' on their system and does a repo sync.
  When this occurs the error message for needing to do a
  'git remote prune origin' occurs.

Now, when doing a 'git fetch', if the error message from git says that a
'git remote prune' is needed, repo will run the prune and then retry the
fetch.
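
A minimal sketch of that retry, with hypothetical names (not the actual
project.py code):

    import subprocess

    def fetch_with_prune(remote):
        # Run the fetch; if git's stderr contains the prune hint, prune
        # the stale remote-tracking refs and retry the fetch once.
        proc = subprocess.run(['git', 'fetch', remote],
                              capture_output=True, text=True)
        if proc.returncode != 0 and 'git remote prune' in proc.stderr:
            subprocess.run(['git', 'remote', 'prune', remote], check=True)
            proc = subprocess.run(['git', 'fetch', remote],
                                  capture_output=True, text=True)
        return proc.returncode == 0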

Change-Id: I4c6f5aa6bd932f0ef7a39134400bedd52e82f633
Signed-off-by: John L. Villalovos <john.l.villalovos@intel.com>
2015-02-03 13:49:51 -08:00
38e4387f8e Implementation of manifest defined githooks
When working within a team or corporation it is often
useful/required to use predefined git templates. This
change teaches repo to use a per-remote git hook template
structure.

The implementation is done as a continuation of the
existing projecthook functionality. The terminology is
therefore defined as projecthooks.

The downloaded projecthooks are stored in the .repo
directory as a metaproject, separating them from the user's
project forest.

The projecthooks are downloaded and set up when doing a
repo init and updated for each new repo init.

When downloading a mirror, the projecthook gits are
not added to the bare forest, since the intention is to
ensure that the latest hooks are always used (this allows
for company policy enforcement).

The projecthooks are defined in the manifest file as a subnode
of the remote element; the name refers to the project name on
the server referred to by the remote.
<remote name="myremote" ...>
   <projecthook name="myprojecthookgit" revision="myrevision"/>
</remote>

The hooks found in the projecthook revision supersede
the stock hooks found in repo. This removes the need for
updating the projecthook gits for repo stock hook changes.

Change-Id: I6796b7b0342c1f83c35f4b3e46782581b069a561
Signed-off-by: Patrik Ryd <patrik.ryd@stericsson.com>
Signed-off-by: Ian Kumlien <ian.kumlien@gmail.com>
2015-02-03 16:01:15 +09:00
24245e0094 Merge "Add missing documentation of --current-branch option on sync command" 2015-01-31 12:44:45 +00:00
db6f1b0884 Merge "Use depth flag when fetching" 2015-01-30 19:36:06 +00:00
f2fad61bde Add missing documentation of --current-branch option on sync command
Change-Id: I72d6e3d51241148c1df97bbad26338debb1fcb4e
2015-01-29 14:36:28 +09:00
ee69084421 Merge "Handle shallow checkout of SHA1 pinned repos" 2015-01-28 20:29:37 +00:00
d37d43f036 Merge "Don't delete hooks in .git/hooks" 2015-01-28 20:29:05 +00:00
7bdac71087 pylint fixes for project.py
Fix all the formatting warnings and unused variables

Change-Id: I17d88a23572303879530077f3a80451de5417fbb
2015-01-22 04:20:21 +00:00
f97e8383a3 Use depth flag when fetching
Currently, we only use the depth flag when cloning.  The result is that when
new project history has merges, the entire history of the merged branch is
brought in and the project becomes unshallow very quickly.  --depth and
clone-depth are often used to save on space, not just network load, so this
seems less than ideal.

This change uses --depth on every fetch (when the user has depth specified),
not just the initial clone.  The result is that the given project stays
consistently shallow as opposed to growing over time, especially when merges
are involved.
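
A minimal sketch of the idea, with hypothetical names (not the actual
project.py code):

    def fetch_args(remote, depth=None):
        # Previously --depth was only added on the initial clone; now it
        # is added whenever the user has a depth configured.
        args = ['fetch', remote]
        if depth:
            args.append('--depth=%d' % depth)
        return args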

Change-Id: Iac706cfdad4a555c72f9d9f1119195d38d91df12
2015-01-22 01:20:22 +00:00
3000cdad22 Handle shallow checkout of SHA1 pinned repos
When doing a shallow checkout of SHA1-pinned repos with repo init --depth=1 and
repo sync -c, repo would try to fetch only some references and fail if the exact
SHA1 was missing.
Instead, when depth is set, fetch only the specific commit.

Change-Id: If3f799d0e78c03faea47f796380bb5e367b11998
2015-01-21 14:14:23 -08:00
b9d9efd394 Don't delete hooks in .git/hooks
We currently delete all hooks in .git/hooks for each project before
symlink'ing in the standard project hooks.  This can be annoying for
users who have installed custom git hooks.

There's no reason to delete all existing hooks.  Just rip out the
deletion code.

Change-Id: I5062a6cd20af700f6d6a17b11ad6c94853987c57
Signed-off-by: Mitchel Humpherys <mitchelh@codeaurora.org>
2015-01-15 22:49:08 -08:00
497bde4de5 Respect --quiet when looking up bundle cookie file
Change-Id: I02a244132c49e4bb50ecda978974d6d2b220f6d1
2015-01-02 13:58:05 -08:00
4abf8e6ef8 Save cookies back to jar when fetching clone.bundle
Change-Id: I3ef71b5e7f8ee1cda66057e46ae234866c7258c4
2015-01-02 13:57:14 -08:00
137d0131bf Hold persistent proxy connection open while fetching clone.bundle
The persistent proxy may choose to present a per-process cookie file
that gets cleaned up after the process exits, to help with the fact
that libcurl cannot save cookies atomically when a cookie file is
shared across processes. We were letting this cleanup happen
immediately by closing stdin as soon as we read the configuration
option, resulting in a nonexistent cookie file by the time we use the
config option.

Work around this by converting the cookie logic to a context manager
method, which closes the process only when we're done with the cookie
file.
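
A rough sketch of that context-manager shape; the helper name and the
proxy's command-line flag are hypothetical, not the real git_config.py
code:

    import contextlib
    import subprocess

    @contextlib.contextmanager
    def cookie_file_from_proxy(proxy_cmd):
        # Keep the proxy process (and thus its per-process cookie file)
        # alive until the caller leaves the 'with' block.
        proc = subprocess.Popen([proxy_cmd, '--print_config'],
                                stdin=subprocess.PIPE,
                                stdout=subprocess.PIPE,
                                text=True)
        try:
            yield proc.stdout.readline().strip()
        finally:
            proc.stdin.close()  # only now may the proxy clean up
            proc.wait()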

Change-Id: I12a88b25cc19621ef8161337144c1b264264211a
2015-01-02 13:57:13 -08:00
42e679b9f6 Merge "add a global --color option" 2015-01-02 20:56:25 +00:00
902665bce6 add a global --color option
If you want to turn off colors for commands, you have to manually adjust
the git config settings (in various locations).  If you're writing scripts
though, you often don't want to modify those locations.  Add a commandline
option to explicitly control things.

The default behavior is unchanged -- we still scan the config files.

Change-Id: I54a3fd8e1918bac180aadd7c7d3004f069b02522
2014-12-30 18:50:05 -05:00
c8d882ae2a Silence warnings about invalid clone.bundle files when quieted
The invalid clone.bundle file warning is not typically user actionable,
and can be confusing, so don't show it when the -q flag is in effect.

Change-Id: If9fef4085391acf54b63c75029ec0e161c38eb86
2014-12-24 10:23:24 +09:00
3eb87cec5c Revert "Check for existence of refs upon initial fetch"
This reverts commit 565480588d.

We are reverting this change for 2 reasons:

1) It introduced a bug for users using sync -c with a reference mirror.
2) The fetch specs have recently changed to cause git to properly fail
when we request a non-existent branch of a manifest, removing the need
for this change.

Change-Id: I0f63da9bfb40cf5ffafb7979f1b8c929a738fc7b
2014-11-10 23:49:32 +00:00
5fb8ed217c If revision is sha hash and dest-branch is defined, use it for starting branch
Change-Id: I538c7d216f72b87629b61aee547d374a398c95da
2014-10-27 12:25:05 +00:00
7e12e0a2fa Support persistent-http(s) review urls
Change-Id: I8e0065685c968dfa9dc26bcdb6ee2fa14019c509
2014-10-23 15:42:09 -07:00
7893b85509 Merge changes I1f71be22,I5b119f11
* changes:
  Always fetch the specific revision given
  Support specifying non-HEADS refs as upstream
2014-10-22 00:23:18 +00:00
b4e50e67e8 Merge "upload: report names of uncommitted files" 2014-10-21 18:03:55 +00:00
0936aeab2c Exit 1 if repo download -c fails
Change-Id: I6985548bf87032b121eeccf858c4eeca1a60598c
2014-10-17 15:45:57 -04:00
14e134da02 upload: report names of uncommitted files
When there are uncommitted files in the tree, 'repo upload' stops to
ask if it is OK to continue, but does not report the actual names of
uncommitted files.

This patch adds plumbing to have the outstanding file names reported
if desired.

BUG=None
TEST=verified that 'repo upload' properly operates with the following
    conditions present in the tree:
    . file(s) modified locally
    . file(s) added to index, but not committed
    . files not known to git
    . no modified files (the upload proceeds as expected)

Change-Id: If65d5f8e8bcb3300c16d85dc5d7017758545f80d
Signed-off-by: Vadim Bendebury <vbendeb@chromium.org>
Signed-off-by: Vadim Bendebury <vbendeb@google.com>
2014-10-14 11:20:05 -07:00
04e52d6166 Always fetch the specific revision given
Don't assume the revision is in refs/heads/.

Change-Id: I1f71be222ed3ed940d2265aad43d1f2d601fc03a
2014-10-09 13:41:56 -06:00
909d58b2e2 Support specifying non-HEADS refs as upstream
While not typical, some users might have an upstream that isn't in
the usual refs/heads/* namespace. There's no reason not to use
those refs as the value for the upstream attribute, so support
doing so.

Change-Id: I5b119f1135c3268c20e7c4084682e860d3ee1fb1
2014-10-09 13:41:51 -06:00
5cf16607d3 Allow selection of a target when using smart sync.
Change-Id: I02a24471b9b62dbba3773f22a289825bc566acd9
2014-10-02 10:17:44 -07:00
c190b98ed5 Merge "Add extend-project tag to support adding groups to an existing project" 2014-09-18 23:09:08 +00:00
4863307299 Add support for rpc:// protocol schemes.
Change-Id: I0e500e45cacc20ac04b43435c4bd189299e9e97b
2014-09-10 13:45:52 -07:00
f75870beac Change implementation of cleanup in case of clone failure during "repo init"
Fix includes:
1. It deletes only .repo/repo instead of the whole .repo directory.

Bug: Issue 161
Change-Id: I1ab8caa7538fec5e6206d1b029f63bd3f60dedcd
2014-09-03 13:56:04 +05:30
bf0b0cbc2f Merge "Provide detail print-out when not all projects of a branch are current." 2014-08-26 21:11:40 +00:00
3a10968a70 Merge "Enable transferring of attribute using command 'repo manifest -o -'" 2014-08-22 16:13:16 +00:00
c46de6932a Decode git version
Used by 'repo --version'
With Python 3,
* Before: b'git version 2.1.0'
* After: git version 2.1.0

Change-Id: I4321bb0f09e92cda1123c35910338b940e82a305
2014-08-20 11:47:10 +05:30
303a82f33a Don't open non-binary files as binary
* Don't open the git config file or the git ".lock" file as binary.

Change-Id: I7b3939658456f2fd0a0500443cdd8d1ee1a4459d
2014-08-19 23:05:44 +05:30
7a91d51dcf Enable transferring of attribute using command 'repo manifest -o -'
The 'upstream' attribute is now transferred to the new manifest xml
that is created when using the command 'repo manifest -o -'.

Manifest help is updated for the attributes 'sync-c', 'sync-s' and
'sync-j'.

Bug: Issue 164
Change-Id: If63f781e91d25c5b5b5ea0696b0c04337b0a686a
2014-07-24 16:27:08 +05:30
a8d539189e Update the commit-msg hook to the version from Gerrit 2.8.2
Change-Id: Id911bc6841f488a42d08580de800c3afafa2937e
2014-07-15 11:30:06 -07:00
588142dfcb Provide detail print-out when not all projects of a branch are current.
When current is "split" (i.e. some projects are current while others are not):
- Disable 'not in' printout (i.e. will print out all projects)
- Disable printing of multiple projects on one line
- Print current projects in green, non-current in white

Since color is used to differentiate current from non-current in "split" cases:
- In non-split cases also print out project names in color (green for current,
  white for non-current)

Change-Id: Ia6b826612c708447cecfe5954dc767f7b2ea2ea7
2014-07-11 10:56:03 -07:00
a6d258b84d Merge "Fix UrlInsteadOf to handle multiple strings" 2014-06-30 22:21:58 +00:00
a769498568 Add --jobs option to forall subcommand
Enable '--jobs' ('-j') option in the forall subcommand. For -jn
where n > 1, the '-p' option can no longer guarantee the
continuity of console output between the project header and the
output from the worker process.

SIGINT is sent to all worker processes upon keyboard interrupt
(Ctrl+C).

Bug: Issue 105
Change-Id: If09afa2ed639d481ede64f28b641dc80d0b89a5c
2014-06-24 01:02:54 +00:00
884a387eca Add extend-project tag to support adding groups to an existing project
Currently, if a local manifest wants to add groups to an existing
project, it must use remove-project and then re-add the project with
the new groups.  This makes the local manifest more fragile, requiring
updates to the local manifest if the original manifest changes.

Add a new extend-project tag, which supports adding groups to an
existing project.

Change-Id: Ib4d1352efd722a65dd263d02644b9ea5ab6ed400
2014-06-20 11:35:16 -07:00
80b87fe6c1 Use fetch --unshallow when appropriate.
If a user reinits to a different manifest or the manifest updates so
that a project no longer has a fixed depth, we need to use --unshallow
when we fetch.

Change-Id: I6d3f15e5464b5eaad9205654bc24354947a78aea
2014-05-09 18:47:35 -07:00
e9f75b1782 Merge "Enable remotes to define their own revision" 2014-05-08 18:38:33 +00:00
a35e402161 Merge "Return a list rather than dict_values in XmlManifest.projects()" 2014-05-07 18:21:31 +00:00
dd7aea6c11 Merge "Define unicode as str if using Python 3" 2014-05-07 18:20:32 +00:00
5196805fa2 Merge "Use exec() rather than execfile()" 2014-05-07 18:18:56 +00:00
85b24acd6a Use JSON instead of pickle
Use JSON as it is shown to be much faster than pickle.
Also clean up the loading and saving functions.

Change-Id: I45b3dee7b4d59a1c0e0d38d4a83b543ac5839390
2014-05-07 10:46:24 +01:00
36ea2fb6ee Enable remotes to define their own revision
Some projects use multiple remotes.
In some cases these remotes have different naming conventions.
Add an option to define a revision in the remote configuration.

The `project` revision takes precedence over `remote` and `default`.
The `remote` revision takes precedence over `default`.
The `default` revision acts as a fallback, as it originally did.
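
In effect (hypothetical names, not the actual manifest_xml.py code):

    def effective_revision(project_rev, remote_rev, default_rev):
        # project > remote > default
        return project_rev or remote_rev or default_rev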

Change-Id: I2b376160d45d48b0bab840c02a3eef1a1e32cf6d
2014-05-07 08:29:30 +00:00
2cd1f0452e Use next(iterator) rather than iterator.next()
iterator.next() was replaced with iterator.__next__() in Python 3.
Use next(iterator) instead, which will select the correct method for
returning the next item.
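
For example:

    it = iter([1, 2, 3])
    next(it)         # works on both Python 2 and 3 -> 1
    # it.next()      # Python 2 only
    # it.__next__()  # Python 3 only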

Change-Id: I6d0c89c8b32e817e5897fe87332933dacf22027b
2014-05-07 08:44:20 +01:00
65e3a78a9e Merge "Prevent warning twice about Python 3 usage" 2014-05-07 06:30:30 +00:00
d792f7928d Define unicode as str if using Python 3
The unicode object was renamed to str in Python 3

Change-Id: I1e4972fb07b313d3462587b3059bb3638d779625
2014-05-06 20:38:51 +01:00
6efdde9f6e Prevent warning twice about Python 3 usage
Only warn about using Python 3 when running the repo script directly.
This prevents the user being warned twice.

Change-Id: I2ee51ea2fa0127ea310598320e460ec9f38c6488
2014-05-06 12:44:22 +00:00
7446c5954a Use sorted() rather than .sort()
dict.keys() produces a dict_keys object in Python 3, which does
not support .sort(). Use sorted() which will give the same outcome.

Change-Id: If6b33db07a31995b4e44959209d08d8fb74ae339
2014-05-06 12:42:35 +00:00
d58bfe5a58 Return a list rather than dict_values in XmlManifest.projects()
dict.values() produces dict_values objects rather than list objects.
Convert this to a list to maintain functionality with certain functions.

Change-Id: Ie76269e19f8d68479a1d7ae03aa965252d759a9e
2014-05-06 09:16:52 +01:00
70f6890352 Use exec() rather than execfile()
execfile() is not in Python 3.

Change-Id: I5af222340f13c1e8edaa820e7675d3e4d62a1689
2014-05-05 23:41:07 +01:00
666d534636 Ensure HEAD is correct when skipping remote fetch
A recent optimization (2fb6466f79) skips
performing a remote fetch if we already know we have the sha1 we want.
However, that optimization skipped initialization steps that ensure HEAD
points to the correct sha1.  This change makes sure not to skip those
steps.

Here is an example of how to test this change:

"""""""""
url=<manifest url>
branch1=<branch name>
branch2=<branch name>
project=<project with revision set to different sha1 in each branch>

repo init -u $url -b $branch1 --mirror
repo sync $project
first=$(cd $project.git; git rev-parse HEAD)

repo init -b $branch2
repo sync $project
second=$(cd $project.git; git rev-parse HEAD)

if [[ $first == $second ]]
then
    echo 'problem!'
else
    echo 'no problem!'
fi
"""""""""
2014-05-01 13:20:32 -07:00
f2af756425 Add 'shallow' gitfile to symlinks
This fixes the bug that kept clients from doing things like `git log`
in projects using the clone-depth feature.

Change-Id: Ib4024a7b82ceaa7eb7b8935b007b3e8225e0aea8
2014-04-30 11:34:00 -07:00
544e7b0a97 Merge "Ignore clone-depth attribute when fetching to a mirror" 2014-04-24 21:21:02 +00:00
e0df232da7 Add linkfile support.
It's just like copyfile and runs at the same time as copyfile, but
instead of copying it creates a symlink.  This is needed
because copyfile copies the target of the link as opposed to the
symlink itself.

Change-Id: I7bff2aa23f0d80d9d51061045bd9c86a9b741ac5
2014-04-22 14:35:47 -05:00
5a7c3afa73 Merge "Don't try to remove .repo if it doesn't exist" 2014-04-18 00:06:08 +00:00
9bc422f130 Ignore clone-depth attribute when fetching to a mirror
If a manifest includes projects with a clone-depth=1 attribute, and a
workspace is initialised from that manifest using the --mirror option,
any workspaces initialised and synced from the mirror will fail with:

  fatal: attempt to fetch/clone from a shallow repository

on the projects that had the clone-depth.

Ignore the clone-depth attribute when fetching from the remote to a
mirror workspace. Thus the mirror will be synched with a complete
clone of all the repositories.

Change-Id: I638b77e4894f5eda137d31fa6358eec53cf4654a
2014-04-16 11:00:40 +09:00
e81bc030bb Add total count and iteration count to forall environment
For long-running forall commands sometimes it's useful to know which
iteration is currently running. Add REPO_I and REPO_COUNT environment
variables to reflect the current iteration count as well as the total
number of iterations so that the user can build simple status
indicators.

Example:

    $ repo forall -c 'echo $REPO_I / $REPO_COUNT; git gc'
    1 / 579
    Counting objects: 41, done.
    Delta compression using up to 8 threads.
    Compressing objects: 100% (19/19), done.
    Writing objects: 100% (41/41), done.
    Total 41 (delta 21), reused 41 (delta 21)
    2 / 579
    Counting objects: 53410, done.
    Delta compression using up to 8 threads.
    Compressing objects: 100% (10423/10423), done.
    Writing objects: 100% (53410/53410), done.
    Total 53410 (delta 42513), reused 53410 (delta 42513)
    3 / 579
    ...

Change-Id: I9f28b0d8b7debe423eed3b4bc1198b23e40c0c50
Signed-off-by: Mitchel Humpherys <mitchelh@codeaurora.org>
2014-03-31 13:08:26 -07:00
eb5acc9ae9 Don't try to remove .repo if it doesn't exist
Part of the cleanup path for _Init is removing the .repo
directory. However, _Init can fail before creating the .repo directory,
so trying to remove it raises another exception:

    fatal: invalid branch name 'refs/changes/53/55053/4'
    Traceback (most recent call last):
      File "/home/mitchelh/bin/repo", line 775, in <module>
        main(sys.argv[1:])
      File "/home/mitchelh/bin/repo", line 749, in main
        os.rmdir(repodir)
    OSError: [Errno 2] No such file or directory: '.repo'

Fix this by only removing .repo if it actually exists.

Change-Id: Ia251d29e9c73e013eb296501d11c36263457e235
2014-03-12 15:11:27 -07:00
26c45a7958 Make --no-tags work with -c
Currently, the --no-tags option is ignored if the user asks to only
fetch the current branch. There is no reason for this restriction. Fix
it.

Change-Id: Ibaaeae85ebe9955ed49325940461d630d794b990
Signed-off-by: Mitchel Humpherys <mitchelh@codeaurora.org>
2014-03-12 16:34:53 +09:00
68425f4da8 Fix indentation in project.py
Change-Id: I81c630536eaa54d5a25b9cb339a96c28619815ea
2014-03-11 14:55:52 +09:00
53e902a19b More verbose errors for NoManifestExceptions.
The old "manifest required for this command -- please run
init" is replaced by a more helpful message that lists the
command repo was trying to execute (with arguments) as well
as the str() of the NoManifestException. For example:

> error: in `sync`: [Errno 2] No such file or directory:
> 	'path/to/.repo/manifests/.git/HEAD'
> error: manifest missing or unreadable -- please run init

Other failure points in basic command parsing and dispatch
are more clearly explained in the same fashion.

Change-Id: I6212e5c648bc5d57e27145d55a5391ca565e4149
2014-03-11 05:33:43 +00:00
4e4d40f7c0 Fix UrlInsteadOf to handle multiple strings
For complex .gitconfig url rewrites, multiple insteadOf lines may be
used for a url. Search all of them for the right rewrite.

Change-Id: If5e9ecd054e86226924b0baf513801cd57c389cd
2014-03-06 21:04:18 -08:00
093fdb6587 Add reviewers automatically from project's git config
The `review.URL.autocopy` setting sends email notification to the
named reviewers, but does not add them as reviewer on the uploaded
change.

Add a new setting `review.URL.autoreviewer`.  The named reviewers
will be added as reviewer on the uploaded change.

Change-Id: I3fddfb49edf346f8724fe15b84be8c39d43e7e65
Signed-off-by: bijia <bijia@xiaomi.com>
2014-03-04 00:51:30 +00:00
2fb6466f79 Don't fetch from remotes if commit id exists locally
In existing workspaces where the manifest specifies a commit id, we can
avoid fetching from the remote if we already have the commit locally.
This substantially improves sync times for fully specified manifests.
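
A minimal sketch of the check, with hypothetical names (not the actual
project.py code):

    import re
    import subprocess

    SHA1_RE = re.compile(r'^[0-9a-f]{40}$')

    def need_fetch(gitdir, revision):
        # A pinned 40-hex revision that already exists locally needs no
        # network fetch; anything else still does.
        if not SHA1_RE.match(revision):
            return True
        rc = subprocess.call(['git', '--git-dir', gitdir, 'cat-file',
                              '-e', '%s^{commit}' % revision],
                             stderr=subprocess.DEVNULL)
        return rc != 0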

Change-Id: Ide216f28a545e00e0b493ce90ed0019513c61613
2014-03-03 10:17:03 +00:00
724aafb52d Merge "Clean up duplicate logic in subcmds/sync.py." 2014-02-28 21:16:32 +00:00
ccd218cd8f Fix to mirror manifest when --mirror is given
Commit 8d201 "repo: Support multiple branches for the same project."
(Change-Id I5e2f4e1a7abb56f9d3f310fa6fd0c17019330ecd) caused the manifest
repository to no longer be mirrored when running 'repo sync' after
'repo init --mirror'.

When the function _AddMetaProjectMirror() is called to add the two
meta projects - git-repo itself and the manifest repository - to the
mirror, it didn't add them to self._paths, which holds the list of
projects to be synced by 'repo sync'.

In addition, because the Project member 'relpath' is used as the key
into self._paths, it should be set to a proper value rather than None.
Since this only applies to meta projects, which are not described in the
manifest xml, 'relpath' is set to the project name.

Change-Id: Icc3b9e6739a78114ec70bf54fe645f79df972686
Signed-off-by: Kwanhong Lee <kwanhong.lee@windriver.com>
2014-02-20 11:07:23 +09:00
dd6542268a Add the "diffmanifests" command
This command allows a deeper diff between two manifest projects.
In addition to changed projects, it displays the logs of the
commits between both revisions for each project.

Change-Id: I86d30602cfbc654f8c84db2be5d8a30cb90f1398
Signed-off-by: Julien Campergue <julien.campergue@parrot.com>
2014-02-17 11:20:11 +00:00
baca5f7e88 Merge "Add error message for download -c conflicts" 2014-02-17 07:57:00 +00:00
89ece429fb Clean up duplicate logic in subcmds/sync.py.
The fetch logic is now shared between the jobs == 1 and
jobs > 1 cases. This refactoring also fixes a bug where
opts.force_broken was not honored when jobs > 1.

Change-Id: Ic886f3c3c00f3d8fc73a65366328fed3c44dc3be
2014-02-14 16:14:32 +00:00
565480588d Check for existence of refs upon initial fetch
When we do an initial fetch and have not specified any branch etc,
the following fetch command will not error:
git fetch origin --tags +refs/heads/*:refs/remotes/origin/*

In this change we make sure something got fetched and if not we report
an error.

This fixes the bug that occurs when we init using a bad manifest url and
then are unable to init again (because a manifest project has been
inited with no manifest).

Change-Id: I6f8aaefc83a1837beb10b1ac90bea96dc8e61156
2014-02-12 09:11:00 -08:00
1829101e28 Add error message for download -c conflicts
Currently, if you run repo download -c on a change and the cherry-pick
runs into a merge conflict, a Traceback is produced:

rob@rob-i5-lm ~/Programming/repo_test/repo1 $ repo download -c repo1 3/1
From ssh://rob-i5-lm:29418/repo1
 * branch            refs/changes/03/3/1 -> FETCH_HEAD
error: could not apply 0c8b474... 2
hint: after resolving the conflicts, mark the corrected paths
hint: with 'git add <paths>' or 'git rm <paths>'
hint: and commit the result with 'git commit'
Traceback (most recent call last):
  File "/home/rob/Programming/git-repo/main.py", line 408, in <module>
    _Main(sys.argv[1:])
  File "/home/rob/Programming/git-repo/main.py", line 384, in _Main
    result = repo._Run(argv) or 0
  File "/home/rob/Programming/git-repo/main.py", line 143, in _Run
    result = cmd.Execute(copts, cargs)
  File "/home/rob/Programming/git-repo/subcmds/download.py", line 90, in Execute
    project._CherryPick(dl.commit)
  File "/home/rob/Programming/git-repo/project.py", line 1943, in _CherryPick
    raise GitError('%s cherry-pick %s ' % (self.name, rev))
error.GitError: repo1 cherry-pick 0c8b4740f876f8f8372bbaed430f02b6ba8b1898

This much error output is confusing to users and has the side effect
that the git message describing the actual issue gets ignored.

This change introduces a message stating that the cherry-pick couldn't
be completed, removing the Traceback.

To reproduce the issue, create a change that conflicts with one currently
in review and use repo download -c to cherry-pick the conflicting change.

Change-Id: I8ddf4e0c8ad9bd04b1af5360313f67cc053f7d6a
2014-02-11 18:19:04 +00:00
1966133f8e Merge "Stop appending 'p/' to review urls" 2014-02-10 22:42:31 +00:00
f1027e23b4 Merge "Implement Kerberos HTTP authentication handler" 2014-02-05 00:58:53 +00:00
2cd38a0bf8 Stop appending 'p/' to review urls
Gerrit no longer requires 'p/', and appending it causes unexpected behavior.
In this change we stop appending 'p/' to the urls.

Change-Id: I72c13bf838f4112086141959fb1af249f9213ce6
2014-02-04 15:32:29 -08:00
1b46cc9b6d Merge "Changes to support sso: repositories for upload" 2014-02-04 21:19:07 +00:00
1242e60bdd Implement Kerberos HTTP authentication handler
This commit implements a Kerberos HTTP authentication handler. It
uses credentials from a local cache to perform an HTTP authentication
negotiation using the GSSAPI.

The purpose of this handler is to allow the use of Kerberos authentication
to access review endpoints without the need to transmit the user's
password.

Change-Id: Id2c3fc91a58b15a3e83e4bd9ca87203fa3d647c8
2014-02-04 09:22:42 +01:00
2d0f508648 Fix persistent-https relative url resolving
Previously, we would remove 'persistent-' then tack it on at the end
if it had been previously found.  However, this would ignore urljoin's
decision on whether or not the second path was relative.  Instead, we
were always assuming it was relative and that we didn't want to use
a different absolute url with a different protocol.

This change handles persistent-https:// in the same way we handled the
absence of an explicit protocol.  The only difference is that this time
instead of temporarily replacing it with 'gopher://', we use 'wais://'.
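
A rough illustration of the scheme-swap trick (hypothetical helper, not
the actual repo code):

    try:
        import urlparse                      # Python 2
    except ImportError:
        import urllib.parse as urlparse      # Python 3

    def resolve(base, url):
        # urljoin() does not know persistent-https://, so temporarily
        # disguise it as wais:// (a scheme urljoin() treats as capable
        # of relative resolution), join, then restore the prefix.
        prefix, marker = 'persistent-https://', 'wais://'
        joined = urlparse.urljoin(base.replace(prefix, marker),
                                  url.replace(prefix, marker))
        return joined.replace(marker, prefix)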

Change-Id: I6e8ad1eb4b911931a991481717f1ade01315db2a
2014-01-31 16:06:31 -08:00
143d8a7249 Changes to support sso: repositories for upload
Change-Id: Iddf90d52f700a1f6462abe76d4f4a367ebb6d603
2014-01-31 07:39:44 -08:00
5db69f3f66 Update the version number on the repo launcher
The repo launcher version needs to be updated so some users can take
advantage of the more robust version number parsing.

Change-Id: Ibcd8036363311528db82db2b252357ffd21eb59b
2014-01-30 16:00:35 -08:00
ff0a3c8f80 Share git version parsing code with wrapper module
'repo' and 'git_command.py' had their own git version parsing code.
This change shares that code between the modules.  DRY is good.

Change-Id: Ic896d2dc08353644bd4ced57e15a91284d97d54a
2014-01-30 15:18:56 -08:00
094cdbe090 Add wrapper module
This takes the wrapper importing code from main.py and moves it into
its own module so that other modules may import it without causing
circular imports with main.py.

Change-Id: I9402950573933ed6f14ce0bfb600f74f32727705
2014-01-30 15:17:09 -08:00
148a84de0c Respect version hyphenation
The last change regarding version parsing lost handling of version
hyphenation; this restores it.  In other words,
1.1.1-otherstuff is parsed as (1,1,1) instead of (1,1,0).

Change-Id: I3753944e92095606653835ed2bd090b9301c7194
2014-01-30 13:53:55 -08:00
1c5da49e6c Handle release candidates in git version parsing
Right now repo chokes on git versions like "1.9.rc1".  This change
treats 'rc*' as a '0'.
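
A sketch combining this rc handling with the hyphen handling from the
change above (hypothetical helper, not the wrapper's actual
ParseGitVersion):

    def parse_git_version(ver_str):
        if not ver_str.startswith('git version '):
            return None
        fields = ver_str[len('git version '):].strip()
        fields = fields.split('-')[0].split('.')[:3]
        nums = []
        for field in fields:
            # Treat release candidates such as 'rc1' as 0.
            nums.append(0 if field.startswith('rc') else int(field))
        while len(nums) < 3:
            nums.append(0)
        return tuple(nums)

    # parse_git_version('git version 1.1.1-otherstuff') -> (1, 1, 1)
    # parse_git_version('git version 1.9.rc1')          -> (1, 9, 0)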

Change-Id: I612b7b431675ba7415bf70640a673e48dbb00a90
2014-01-30 13:26:50 -08:00
b8433dfd2f repo: Fix 'remove-project' regression with multiple projects.
In CL:50715, I updated repo to handle multiple projects, but the
remove-projects code path was not updated accordingly. Update it.

Change-Id: Icd681d45ce857467b584bca0d2fdcbf24ec6e8db
2014-01-30 10:14:54 -08:00
f2fe2d9b86 Properly iterate through values
The value of Manifest.projects has changed from being the dictionary
itself to the values of the dictionary.  Here we handle this change
correctly on a PostRepoUpgrade.

From a `git diff v1.12.7 -- manifest_xml.py`:
+  @property
   def projects(self):
     self._Load()
-    return self._projects
+    return self._paths.values()

self._paths does contain the projects according to this line of
manifest_xml.py:
484      self._paths[project.relpath] = project

Change-Id: I141f8d5468ee10dfb08f99ba434004a307fed810
2014-01-29 13:57:22 -08:00
c9877c7cf6 Merge "Only fetch current branch on shallow clients" 2014-01-29 21:12:34 +00:00
69e04d8953 Only fetch current branch on shallow clients
Fetching a new branch on a shallow client may download the entire
project history, as the depth parameter is not passed to git
fetch. Force the fetch to only download the current branch.

Change-Id: Ie17ce8eb5e3487c24d90b2cae8227319dea482c8
2014-01-29 12:48:54 -08:00
f1f1137d61 Merge "Don't backtrace when current branch is not uploadable." 2014-01-14 00:41:35 +00:00
f77ef2edb0 Merge "hooks/pre-auto-gc: fix AC detection on OSX Maverick" 2014-01-10 02:50:53 +00:00
e695338e21 Merge "repo: Support multiple branches for the same project." 2014-01-10 01:20:13 +00:00
bd80f7eedd Merge "Canonicalize project hooks path before use" 2014-01-09 02:11:10 +00:00
bf79c6618e Fix os.mkdir race condition.
This code checks whether a dir exists before creating it. In between the
check and the mkdir call, it is possible that another process will have
created the directory. We have seen this bug occur many times in
practice during our 'repo init' tests.
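
A minimal race-free pattern for this (hypothetical helper name):

    import errno
    import os

    def ensure_dir(path):
        # Attempt the mkdir unconditionally and tolerate "already
        # exists", instead of checking first and then creating.
        try:
            os.makedirs(path)
        except OSError as e:
            if e.errno != errno.EEXIST:
                raise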

Change-Id: Ia47d39955739aa38fd303f4e90be7b4c50d9d4ba
2013-12-26 14:59:00 -08:00
f045d49a71 Merge "Add --archive option to init to sync using git archive" 2013-12-18 17:44:59 +00:00
719757d6a8 hooks/pre-auto-gc: fix AC detection on OSX Maverick
The output of pmset has been changed to "Now drawing from 'AC Power'"

Change-Id: Id425d3bcd6a28656736a6d2c3096623a3ec053cc
2013-12-17 09:48:20 +07:00
011d4f426c Don't backtrace when current branch is not uploadable.
The backtrace currently occurs when one uses the "--cbr" argument with
the repo upload subcommand if the current branch is not tracking an
upstream branch. There may be other cases that would backtrace as well,
but this is the only one I found so far.

Change-Id: Ie712fbb0ce3e7fe3b72769fca89cc4c0e3d2fce0
2013-12-11 23:24:01 -08:00
53d6a7b895 Fix error in xml manifest doc.
The docs on the annotations say that zero or more may exist as a child
of a project, so that means that a "*" instead of a "?" should be used.

Change-Id: Iff855d003dfb05cd980f285a237332914e1dad70
2013-12-10 15:30:03 -08:00
335f5ef4ad Add --archive option to init to sync using git archive
This significantly reduces sync time and bandwidth usage, as only
a tar of each project's revision is checked out, but git is not
accessible from the projects anymore.

This is relevant when git is not needed in the projects but sync
speed/bandwidth may be important, for example on CI servers that build
several versions from scratch regularly.

Archive is not supported over http/https.

Change-Id: I48c3c7de2cd5a1faec33e295fcdafbc7807d0e4d
Signed-off-by: Julien Campergue <julien.campergue@parrot.com>
2013-12-10 08:27:07 +00:00
672cc499b9 Canonicalize project hooks path before use
If the top-level .repo directory is moved somewhere else (e.g. a
different drive) and replaced with a symlink, _InitHooks() will create
broken symlinks. Resolving symlinks before computing the relative path
for the symlink keeps the path within the repo tree, so the tree can
be moved anywhere.
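
The idea, roughly (hypothetical helper, not the actual _InitHooks code):

    import os

    def hook_symlink_target(stock_hook, hooks_dir):
        # Resolve symlinks on both ends before computing the relative
        # link target, so the result stays valid if the whole tree
        # (including a symlinked .repo) is moved elsewhere.
        return os.path.relpath(os.path.realpath(stock_hook),
                               os.path.realpath(hooks_dir))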

Change-Id: Ifa5c07869e3477186ddd2c255c6c607f547bc1fe
2013-12-03 09:02:16 -08:00
61df418c59 Update the commit-msg hook to the version from Gerrit 2.6
Change-Id: Iaf21ba8d2ceea58973dbc56f0b4ece54500cd997
2013-11-29 19:17:23 +09:00
4534120628 Merge "Allow using repo with python3" 2013-11-22 10:25:35 +00:00
cbc0798f67 Fix print of git-remote-persistent-https error
If git-remote-persistent-https fails, we use an iter() and then
subsequently a .read() on stderr.  Python doesn't like this and
gives the following error message:
ValueError: Mixing iteration and read methods would lose data

This change removes the use of iter() to avoid the issue.

Change-Id: I980659b83229e2a559c20dcc7b116f8d2476abd5
2013-11-21 10:38:03 -08:00
d5a5b19efd Remove trailing whitespace
Change-Id: I56bcb559431277d40070fa33c580c6c3525ff9bc
2013-11-21 19:16:08 +05:30
5d6cb80b8f Allow using repo with python3
* Switching from python2 to python3 in the same workspace isn't
  currently supported, due to a change in the pickle version (which
  isn't supported by python2)
* Basic functionality does work with python3; however, not everything
  is expected to work

Change-Id: I4256b5a9861562d0260b503f972c1569190182aa
2013-11-21 18:44:52 +05:30
0eb35cbe50 Fix some python3 encoding issues
* Add .decode('utf-8') where needed
* Add 'b' to `open` where needed, and remove where unnecessary

Change-Id: I0f03ecf9ed1a78e3b2f15f9469deb9aaab698657
2013-11-21 06:03:22 +00:00
ce201a5311 Fix a small whitespace consistency issue
Change-Id: Ie98c79833ca5e7ef71666489135f7491223f779c
2013-10-16 14:42:42 -07:00
12fd10c201 Merge "Don't access attr of None (manifest subcmd)" 2013-10-16 21:41:33 +00:00
a17d7af4d9 Don't access attr of None (manifest subcmd)
If d.remote is None, this code failed for obvious reasons.  This is a
simple fix.

Change-Id: I413756121e444111f1e3c7dc8bc8032467946c13
2013-10-16 14:38:09 -07:00
fbd3f2a10b Only check merge destination if it isn't None
Change-Id: Ifb1dcd07142933489e93a1f4f03e38289087b609
2013-10-15 12:59:00 -07:00
37128b6f70 Fix indentation
git-repo uses 2 space indentation.  A couple of recent changes
introduced 4 space indentation in some modules.

Change-Id: Ia4250157c1824c1b5e7d555068c4608f995be9da
2013-10-15 10:48:40 +09:00
143b4cc992 Merge "Better handling of duplicate default" 2013-10-15 01:40:08 +00:00
8d20116038 repo: Support multiple branches for the same project.
It is often useful to be able to include the same project more than
once, but with different branches and placed in different paths in the
workspace. Add this feature.

This CL adds the concept of an object directory. The object directory
stores objects that can be shared amongst several working trees. For
newly synced repositories, we now set up the git repo to share its
objects with an object repo.

Each worktree for a given repo shares objects, but has an independent
set of references and branches. This ensures that repo only has to
update the objects once; however the references for each worktree are
updated separately. Storing the references separately is needed to
ensure that commits to a branch on one worktree will not change the
HEAD commits of the others.

One nice side effect of sharing objects between different worktrees is
that you can easily cherry-pick changes between the two worktrees
without needing to fetch them.

Bug: Issue 141
Change-Id: I5e2f4e1a7abb56f9d3f310fa6fd0c17019330ecd
2013-10-14 15:34:32 -07:00
53263d873d Merge "repo: use explicit Python executable to run main.py" 2013-10-10 18:42:59 +00:00
7487992bd3 Better handling of duplicate default
Currently, an error is raised if more than one default is defined.

When including another manifest, it is likely that a default has
been defined in both manifests.

Don't raise an error if all the defaults defined have the same
attributes.

Change-Id: I2603020687e2ba04c2c62c3268ee375279b34a08
Signed-off-by: Julien Campergue <julien.campergue@parrot.com>
2013-10-10 18:14:27 +02:00
b25ea555c3 Merge "Respect remote aliases" 2013-10-10 16:08:42 +00:00
3bfd72158c Don't upload when dest branch is not merge branch
Example:
- `repo init -b master` / sync a project
- In one project: `git checkout -b work origin/branch-thats-not-master`
- make some changes, `git commit`
- `repo upload .`
- Upload will now be skipped with a warning instead of being uploaded to
  master

Change-Id: I990b36217b75fe3c8b4d776e7fefa1c7d9ab7282
2013-10-10 09:06:38 -07:00
59b31cb6e0 don't pass project revision to UploadForReview
Passing a project revisionExpr to UploadForReview will cause it to
try to push to refs/for/<sha> if the revision points to a sha
instead of a branch.  Pass None for dest_branch if no destination
branch has been specified, which will cause UploadForReview to
upload to the merge branch.

There is room for further improvement, the user prompts will
still print "Upload project <project> to remote branch <sha>",
and then upload to the merge branch and not the sha, but that
is the same behavior that was in 1.12.2.

Change-Id: I06c510336ae67ff7e68b5b69e929693179d15c0b
2013-10-08 23:14:29 -07:00
1e7ab2a63f Respect remote aliases
Previously, change I7150e449341ed8655d398956a095261978d95870
had broken alias support in order to fix the manifest command, keeping
it from emitting projects that point to an alias that wasn't recorded.
This commit reverts that commit and instead solves the issue more
correctly, outputting the alias in the remote node of the manifest and
respecting that alias when outputting the list of projects.

Change-Id: I941fc4adb7121d2e61cedc5838e80d3918c977c3
2013-10-08 17:26:57 -07:00
3a2a59eb87 repo: use explicit Python executable to run main.py
Small step to support non-POSIX platforms.

Change-Id: I3bdb9c82c2dfbacb1da328caaa1a406ab91ad675
2013-09-21 20:03:57 +03:00
24 changed files with 1766 additions and 677 deletions

@ -83,15 +83,38 @@ def _Color(fg = None, bg = None, attr = None):
return code
DEFAULT = None
def SetDefaultColoring(state):
"""Set coloring behavior to |state|.
This is useful for overriding config options via the command line.
"""
if state is None:
# Leave it alone -- return quick!
return
global DEFAULT
state = state.lower()
if state in ('auto',):
DEFAULT = state
elif state in ('always', 'yes', 'true', True):
DEFAULT = 'always'
elif state in ('never', 'no', 'false', False):
DEFAULT = 'never'
class Coloring(object):
def __init__(self, config, section_type):
self._section = 'color.%s' % section_type
self._config = config
self._out = sys.stdout
on = self._config.GetString(self._section)
on = DEFAULT
if on is None:
on = self._config.GetString('color.ui')
on = self._config.GetString(self._section)
if on is None:
on = self._config.GetString('color.ui')
if on == 'auto':
if pager.active or os.isatty(1):

@ -129,7 +129,7 @@ class Command(object):
def GetProjects(self, args, missing_ok=False, submodules_ok=False):
"""A list of projects that match the arguments.
"""
all_projects = self.manifest.projects
all_projects_list = self.manifest.projects
result = []
mp = self.manifest.manifestProject
@ -140,7 +140,6 @@ class Command(object):
groups = [x for x in re.split(r'[,\s]+', groups) if x]
if not args:
all_projects_list = list(all_projects.values())
derived_projects = {}
for project in all_projects_list:
if submodules_ok or project.sync_s:
@ -152,12 +151,12 @@ class Command(object):
project.MatchesGroups(groups)):
result.append(project)
else:
self._ResetPathToProjectMap(all_projects.values())
self._ResetPathToProjectMap(all_projects_list)
for arg in args:
project = all_projects.get(arg)
projects = self.manifest.GetProjectsWithName(arg)
if not project:
if not projects:
path = os.path.abspath(arg).replace('\\', '/')
project = self._GetProjectByPath(path)
@ -172,14 +171,19 @@ class Command(object):
if search_again:
project = self._GetProjectByPath(path) or project
if not project:
raise NoSuchProjectError(arg)
if not missing_ok and not project.Exists:
raise NoSuchProjectError(arg)
if not project.MatchesGroups(groups):
raise InvalidProjectGroupsError(arg)
if project:
projects = [project]
result.append(project)
if not projects:
raise NoSuchProjectError(arg)
for project in projects:
if not missing_ok and not project.Exists:
raise NoSuchProjectError(arg)
if not project.MatchesGroups(groups):
raise InvalidProjectGroupsError(arg)
result.extend(projects)
def _getpath(x):
return x.relpath

@ -26,16 +26,18 @@ following DTD:
manifest-server?,
remove-project*,
project*,
extend-project*,
repo-hooks?)>
<!ELEMENT notice (#PCDATA)>
<!ELEMENT remote (EMPTY)>
<!ELEMENT remote (projecthook?)>
<!ATTLIST remote name ID #REQUIRED>
<!ATTLIST remote alias CDATA #IMPLIED>
<!ATTLIST remote fetch CDATA #REQUIRED>
<!ATTLIST remote review CDATA #IMPLIED>
<!ATTLIST remote revision CDATA #IMPLIED>
<!ELEMENT default (EMPTY)>
<!ATTLIST default remote IDREF #IMPLIED>
<!ATTLIST default revision CDATA #IMPLIED>
@ -46,8 +48,8 @@ following DTD:
<!ELEMENT manifest-server (EMPTY)>
<!ATTLIST url CDATA #REQUIRED>
<!ELEMENT project (annotation?,
<!ELEMENT project (annotation*,
project*)>
<!ATTLIST project name CDATA #REQUIRED>
<!ATTLIST project path CDATA #IMPLIED>
@ -65,7 +67,16 @@ following DTD:
<!ATTLIST annotation name CDATA #REQUIRED>
<!ATTLIST annotation value CDATA #REQUIRED>
<!ATTLIST annotation keep CDATA "true">
<!ELEMENT extend-project>
<!ATTLIST extend-project name CDATA #REQUIRED>
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ELEMENT projecthook (EMPTY)>
<!ATTLIST projecthook name CDATA #REQUIRED>
<!ATTLIST projecthook revision CDATA #REQUIRED>
<!ELEMENT remove-project (EMPTY)>
<!ATTLIST remove-project name CDATA #REQUIRED>
@ -112,6 +123,10 @@ Attribute `review`: Hostname of the Gerrit server where reviews
are uploaded to by `repo upload`. This attribute is optional;
if not specified then `repo upload` will not function.
Attribute `revision`: Name of a Git branch (e.g. `master` or
`refs/heads/master`). Remotes with their own revision will override
the default revision.
Element default
---------------
@ -132,14 +147,14 @@ Project elements not setting their own `dest-branch` will inherit
this value. If this value is not set, projects will use `revision`
by default instead.
Attribute `sync_j`: Number of parallel jobs to use when synching.
Attribute `sync-j`: Number of parallel jobs to use when synching.
Attribute `sync_c`: Set to true to only sync the given Git
Attribute `sync-c`: Set to true to only sync the given Git
branch (specified in the `revision` attribute) rather than the
whole ref space. Project elements lacking a sync_c element of
whole ref space. Project elements lacking a sync-c element of
their own will use this value.
Attribute `sync_s`: Set to true to also sync sub-projects.
Attribute `sync-s`: Set to true to also sync sub-projects.
Element manifest-server
@ -208,7 +223,8 @@ to track for this project. Names can be relative to refs/heads
(e.g. just "master") or absolute (e.g. "refs/heads/master").
Tags and/or explicit SHA-1s should work in theory, but have not
been extensively tested. If not supplied the revision given by
the default element is used.
the remote element is used if applicable, else the default
element is used.
Attribute `dest-branch`: Name of a Git branch (e.g. `master`).
When using `repo upload`, changes will be submitted for code
@ -226,13 +242,13 @@ group "notdefault", it will not be automatically downloaded by repo.
If the project has a parent element, the `name` and `path` here
are the prefixed ones.
Attribute `sync_c`: Set to true to only sync the given Git
Attribute `sync-c`: Set to true to only sync the given Git
branch (specified in the `revision` attribute) rather than the
whole ref space.
Attribute `sync_s`: Set to true to also sync sub-projects.
Attribute `sync-s`: Set to true to also sync sub-projects.
Attribute `upstream`: Name of the Git branch in which a sha1
Attribute `upstream`: Name of the Git ref in which a sha1
can be found. Used when syncing a revision locked manifest in
-c mode to avoid having to sync the entire ref space.
@ -246,6 +262,22 @@ rather than the `name` attribute. This attribute only applies to the
local mirrors syncing, it will be ignored when syncing the projects in a
client working directory.
Element extend-project
----------------------
Modify the attributes of the named project.
This element is mostly useful in a local manifest file, to modify the
attributes of an existing project without completely replacing the
existing project definition. This makes the local manifest more robust
against changes to the original manifest.
Attribute `path`: If specified, limit the change to projects checked out
at the specified path, rather than all projects with the given name.
Attribute `groups`: List of additional groups to which this project
belongs. Same syntax as the corresponding element of `project`.
Element annotation
------------------
@ -278,6 +310,15 @@ target manifest to include - it must be a usable manifest on its own.
Attribute `name`: the manifest to include, specified relative to
the manifest repository's root.
Element projecthook
-------------------
This element is used to define a per-remote hook git that is
fetched and applied to all projects using the remote. The project-
hook functionality allows for company/team .git/hooks to be used.
The hooks in the supplied project and revision are supplemented to
the current repo stock hooks for each project. Supplemented hooks
overrule any stock hooks.
Local Manifests
===============

@ -24,6 +24,13 @@ class ManifestInvalidRevisionError(Exception):
class NoManifestException(Exception):
"""The required manifest does not exist.
"""
def __init__(self, path, reason):
super(NoManifestException, self).__init__()
self.path = path
self.reason = reason
def __str__(self):
return self.reason
class EditorError(Exception):
"""Unspecified error from the user's text editor.

@ -21,6 +21,7 @@ import tempfile
from signal import SIGTERM
from error import GitError
from trace import REPO_TRACE, IsTrace, Trace
from wrapper import Wrapper
GIT = 'git'
MIN_GIT_VERSION = (1, 5, 4)
@ -79,20 +80,15 @@ class _GitCall(object):
def version(self):
p = GitCommand(None, ['--version'], capture_stdout=True)
if p.Wait() == 0:
return p.stdout
return p.stdout.decode('utf-8')
return None
def version_tuple(self):
global _git_version
if _git_version is None:
ver_str = git.version()
if ver_str.startswith('git version '):
_git_version = tuple(
map(int,
ver_str[len('git version '):].strip().split('-')[0].split('.')[0:3]
))
else:
_git_version = Wrapper().ParseGitVersion(ver_str)
if _git_version is None:
print('fatal: "%s" unsupported' % ver_str, file=sys.stderr)
sys.exit(1)
return _git_version

@ -15,8 +15,8 @@
from __future__ import print_function
import json
import os
import pickle
import re
import subprocess
import sys
@ -80,7 +80,7 @@ class GitConfig(object):
return cls(configfile = os.path.join(gitdir, 'config'),
defaults = defaults)
def __init__(self, configfile, defaults=None, pickleFile=None):
def __init__(self, configfile, defaults=None, jsonFile=None):
self.file = configfile
self.defaults = defaults
self._cache_dict = None
@ -88,12 +88,11 @@ class GitConfig(object):
self._remotes = {}
self._branches = {}
if pickleFile is None:
self._pickle = os.path.join(
self._json = jsonFile
if self._json is None:
self._json = os.path.join(
os.path.dirname(self.file),
'.repopickle_' + os.path.basename(self.file))
else:
self._pickle = pickleFile
'.repo_' + os.path.basename(self.file) + '.json')
def Has(self, name, include_defaults = True):
"""Return true if this configuration file has the key.
@ -217,9 +216,9 @@ class GitConfig(object):
"""Resolve any url.*.insteadof references.
"""
for new_url in self.GetSubSections('url'):
old_url = self.GetString('url.%s.insteadof' % new_url)
if old_url is not None and url.startswith(old_url):
return new_url + url[len(old_url):]
for old_url in self.GetString('url.%s.insteadof' % new_url, True):
if old_url is not None and url.startswith(old_url):
return new_url + url[len(old_url):]
return url
@property
@ -248,50 +247,41 @@ class GitConfig(object):
return self._cache_dict
def _Read(self):
d = self._ReadPickle()
d = self._ReadJson()
if d is None:
d = self._ReadGit()
self._SavePickle(d)
self._SaveJson(d)
return d
def _ReadPickle(self):
def _ReadJson(self):
try:
if os.path.getmtime(self._pickle) \
if os.path.getmtime(self._json) \
<= os.path.getmtime(self.file):
os.remove(self._pickle)
os.remove(self._json)
return None
except OSError:
return None
try:
Trace(': unpickle %s', self.file)
fd = open(self._pickle, 'rb')
Trace(': parsing %s', self.file)
fd = open(self._json)
try:
return pickle.load(fd)
return json.load(fd)
finally:
fd.close()
except EOFError:
os.remove(self._pickle)
return None
except IOError:
os.remove(self._pickle)
return None
except pickle.PickleError:
os.remove(self._pickle)
except (IOError, ValueError):
os.remove(self._json)
return None
def _SavePickle(self, cache):
def _SaveJson(self, cache):
try:
fd = open(self._pickle, 'wb')
fd = open(self._json, 'w')
try:
pickle.dump(cache, fd, pickle.HIGHEST_PROTOCOL)
json.dump(cache, fd, indent=2)
finally:
fd.close()
except IOError:
if os.path.exists(self._pickle):
os.remove(self._pickle)
except pickle.PickleError:
if os.path.exists(self._pickle):
os.remove(self._pickle)
except (IOError, TypeError):
if os.path.exists(self.json):
os.remove(self._json)
def _ReadGit(self):
"""
@ -304,8 +294,8 @@ class GitConfig(object):
d = self._do('--null', '--list')
if d is None:
return c
for line in d.rstrip('\0').split('\0'): # pylint: disable=W1401
# Backslash is not anomalous
for line in d.decode('utf-8').rstrip('\0').split('\0'): # pylint: disable=W1401
# Backslash is not anomalous
if '\n' in line:
key, val = line.split('\n', 1)
else:
@ -576,7 +566,9 @@ class Remote(object):
return None
u = self.review
if not u.startswith('http:') and not u.startswith('https:'):
if u.startswith('persistent-'):
u = u[len('persistent-'):]
if u.split(':')[0] not in ('http', 'https', 'sso'):
u = 'http://%s' % u
if u.endswith('/Gerrit'):
u = u[:len(u) - len('/Gerrit')]
@ -592,6 +584,9 @@ class Remote(object):
host, port = os.environ['REPO_HOST_PORT_INFO'].split()
self._review_url = self._SshReviewUrl(userEmail, host, port)
REVIEW_CACHE[u] = self._review_url
elif u.startswith('sso:'):
self._review_url = u # Assume it's right
REVIEW_CACHE[u] = self._review_url
else:
try:
info_url = u + 'ssh_info'
@ -601,7 +596,7 @@ class Remote(object):
# of HTML response back, like maybe a login page.
#
# Assume HTTP if SSH is not enabled or ssh_info doesn't look right.
self._review_url = http_url + 'p/'
self._review_url = http_url
else:
host, port = info.split()
self._review_url = self._SshReviewUrl(userEmail, host, port)
@ -624,9 +619,7 @@ class Remote(object):
def ToLocal(self, rev):
"""Convert a remote revision string to something we have locally.
"""
if IsId(rev):
return rev
if rev.startswith(R_TAGS):
if self.name == '.' or IsId(rev):
return rev
if not rev.startswith('refs/'):
@ -635,6 +628,10 @@ class Remote(object):
for spec in self.fetch:
if spec.SourceMatches(rev):
return spec.MapSource(rev)
if not rev.startswith(R_HEADS):
return rev
raise GitError('remote %s does not have %s' % (self.name, rev))
def WritesTo(self, ref):
@ -704,7 +701,7 @@ class Branch(object):
self._Set('merge', self.merge)
else:
fd = open(self._config.file, 'ab')
fd = open(self._config.file, 'a')
try:
fd.write('[branch "%s"]\n' % self.name)
if self.remote:

@ -100,7 +100,7 @@ class GitRefs(object):
def _ReadPackedRefs(self):
path = os.path.join(self._gitdir, 'packed-refs')
try:
fd = open(path, 'rb')
fd = open(path, 'r')
mtime = os.path.getmtime(path)
except IOError:
return

@ -1,5 +1,4 @@
#!/bin/sh
# From Gerrit Code Review 2.5.2
#
# Part of Gerrit Code Review (http://code.google.com/p/gerrit/)
#
@ -27,7 +26,7 @@ MSG="$1"
#
add_ChangeId() {
clean_message=`sed -e '
/^diff --git a\/.*/{
/^diff --git .*/{
s///
q
}
@ -39,6 +38,11 @@ add_ChangeId() {
return
fi
if test "false" = "`git config --bool --get gerrit.createChangeId`"
then
return
fi
# Does Change-Id: already exist? if so, exit (no change).
if grep -i '^Change-Id:' "$MSG" >/dev/null
then
@ -77,7 +81,7 @@ add_ChangeId() {
# Skip the line starting with the diff command and everything after it,
# up to the end of the file, assuming it is only patch data.
# If more than one line before the diff was empty, strip all but one.
/^diff --git a/ {
/^diff --git / {
blankLines = 0
while (getline) { }
next
@ -154,7 +158,7 @@ add_ChangeId() {
if (unprinted) {
print "Change-Id: I'"$id"'"
}
}' "$MSG" > $T && mv $T "$MSG" || rm -f $T
}' "$MSG" > "$T" && mv "$T" "$MSG" || rm -f "$T"
}
_gen_ChangeIdInput() {
echo "tree `git write-tree`"

@ -35,7 +35,7 @@ elif grep -q "AC Power \+: 1" /proc/pmu/info 2>/dev/null
then
exit 0
elif test -x /usr/bin/pmset && /usr/bin/pmset -g batt |
grep -q "Currently drawing from 'AC Power'"
grep -q "drawing from 'AC Power'"
then
exit 0
elif test -d /sys/bus/acpi/drivers/battery && test 0 = \

main.py

@ -31,6 +31,12 @@ else:
urllib = imp.new_module('urllib')
urllib.request = urllib2
try:
import kerberos
except ImportError:
kerberos = None
from color import SetDefaultColoring
from trace import SetTrace
from git_command import git, GitCommand
from git_config import init_ssh, close_ssh
@ -46,6 +52,7 @@ from error import NoSuchProjectError
from error import RepoChangedException
from manifest_xml import XmlManifest
from pager import RunPager
from wrapper import WrapperPath, Wrapper
from subcmds import all_commands
@ -63,6 +70,9 @@ global_options.add_option('-p', '--paginate',
global_options.add_option('--no-pager',
dest='no_pager', action='store_true',
help='disable the pager')
global_options.add_option('--color',
choices=('auto', 'always', 'never'), default=None,
help='control color usage: auto, always, never')
global_options.add_option('--trace',
dest='trace', action='store_true',
help='trace git command execution')
@ -107,6 +117,8 @@ class _Repo(object):
print('fatal: invalid usage of --version', file=sys.stderr)
return 1
SetDefaultColoring(gopts.color)
try:
cmd = self.commands[name]
except KeyError:
@ -123,8 +135,15 @@ class _Repo(object):
file=sys.stderr)
return 1
copts, cargs = cmd.OptionParser.parse_args(argv)
copts = cmd.ReadEnvironmentOptions(copts)
try:
copts, cargs = cmd.OptionParser.parse_args(argv)
copts = cmd.ReadEnvironmentOptions(copts)
except NoManifestException as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
return 1
if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig
@ -140,15 +159,13 @@ class _Repo(object):
start = time.time()
try:
result = cmd.Execute(copts, cargs)
except DownloadError as e:
print('error: %s' % str(e), file=sys.stderr)
result = 1
except ManifestInvalidRevisionError as e:
print('error: %s' % str(e), file=sys.stderr)
result = 1
except NoManifestException as e:
print('error: manifest required for this command -- please run init',
file=sys.stderr)
except (DownloadError, ManifestInvalidRevisionError,
NoManifestException) as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
if isinstance(e, NoManifestException):
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
result = 1
except NoSuchProjectError as e:
if e.name:
@ -169,21 +186,10 @@ class _Repo(object):
return result
def _MyRepoPath():
return os.path.dirname(__file__)
def _MyWrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
def WrapperModule():
global _wrapper_module
if not _wrapper_module:
_wrapper_module = imp.load_source('wrapper', _MyWrapperPath())
return _wrapper_module
def _CurrentWrapperVersion():
return WrapperModule().VERSION
def _CheckWrapperVersion(ver, repo_path):
if not repo_path:
@ -193,7 +199,7 @@ def _CheckWrapperVersion(ver, repo_path):
print('no --wrapper-version argument', file=sys.stderr)
sys.exit(1)
exp = _CurrentWrapperVersion()
exp = Wrapper().VERSION
ver = tuple(map(int, ver.split('.')))
if len(ver) == 1:
ver = (0, ver[0])
@ -205,7 +211,7 @@ def _CheckWrapperVersion(ver, repo_path):
!!! You must upgrade before you can continue: !!!
cp %s %s
""" % (exp_str, _MyWrapperPath(), repo_path), file=sys.stderr)
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
sys.exit(1)
if exp > ver:
@ -214,7 +220,7 @@ def _CheckWrapperVersion(ver, repo_path):
... You should upgrade soon:
cp %s %s
""" % (exp_str, _MyWrapperPath(), repo_path), file=sys.stderr)
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
def _CheckRepoDir(repo_dir):
if not repo_dir:
@ -342,6 +348,86 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
self.retried = 0
raise
class _KerberosAuthHandler(urllib.request.BaseHandler):
def __init__(self):
self.retried = 0
self.context = None
self.handler_order = urllib.request.BaseHandler.handler_order - 50
def http_error_401(self, req, fp, code, msg, headers):
host = req.get_host()
retry = self.http_error_auth_reqed('www-authenticate', host, req, headers)
return retry
def http_error_auth_reqed(self, auth_header, host, req, headers):
try:
spn = "HTTP@%s" % host
authdata = self._negotiate_get_authdata(auth_header, headers)
if self.retried > 3:
raise urllib.request.HTTPError(req.get_full_url(), 401,
"Negotiate auth failed", headers, None)
else:
self.retried += 1
neghdr = self._negotiate_get_svctk(spn, authdata)
if neghdr is None:
return None
req.add_unredirected_header('Authorization', neghdr)
response = self.parent.open(req)
srvauth = self._negotiate_get_authdata(auth_header, response.info())
if self._validate_response(srvauth):
return response
except kerberos.GSSError:
return None
except:
self.reset_retry_count()
raise
finally:
self._clean_context()
def reset_retry_count(self):
self.retried = 0
def _negotiate_get_authdata(self, auth_header, headers):
authhdr = headers.get(auth_header, None)
if authhdr is not None:
for mech_tuple in authhdr.split(","):
mech, __, authdata = mech_tuple.strip().partition(" ")
if mech.lower() == "negotiate":
return authdata.strip()
return None
def _negotiate_get_svctk(self, spn, authdata):
if authdata is None:
return None
result, self.context = kerberos.authGSSClientInit(spn)
if result < kerberos.AUTH_GSS_COMPLETE:
return None
result = kerberos.authGSSClientStep(self.context, authdata)
if result < kerberos.AUTH_GSS_CONTINUE:
return None
response = kerberos.authGSSClientResponse(self.context)
return "Negotiate %s" % response
def _validate_response(self, authdata):
if authdata is None:
return None
result = kerberos.authGSSClientStep(self.context, authdata)
if result == kerberos.AUTH_GSS_COMPLETE:
return True
return None
def _clean_context(self):
if self.context is not None:
kerberos.authGSSClientClean(self.context)
self.context = None
def init_http():
handlers = [_UserAgentHandler()]
@ -358,6 +444,8 @@ def init_http():
pass
handlers.append(_BasicAuthHandler(mgr))
handlers.append(_DigestAuthHandler(mgr))
if kerberos:
handlers.append(_KerberosAuthHandler())
if 'http_proxy' in os.environ:
url = os.environ['http_proxy']
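The Negotiate handshake above hinges on pulling the token that follows the "Negotiate" mechanism out of the WWW-Authenticate header, which is what _negotiate_get_authdata() does. A minimal, self-contained sketch of that parsing step (the header values are made up for illustration):

def negotiate_token(header_value):
  # Mirrors _negotiate_get_authdata(): split the header on commas, then take
  # the data that follows the "negotiate" mechanism, if present.
  for mech_tuple in header_value.split(","):
    mech, _, authdata = mech_tuple.strip().partition(" ")
    if mech.lower() == "negotiate":
      return authdata.strip()
  return None

print(negotiate_token("Negotiate YIIC..."))        # -> "YIIC..."
print(negotiate_token('Basic realm="example"'))    # -> None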


@ -32,7 +32,7 @@ else:
from git_config import GitConfig
from git_refs import R_HEADS, HEAD
from project import RemoteSpec, Project, MetaProject
from error import ManifestParseError
from error import ManifestParseError, ManifestInvalidRevisionError
MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml'
@ -51,19 +51,31 @@ class _Default(object):
sync_c = False
sync_s = False
def __eq__(self, other):
return self.__dict__ == other.__dict__
def __ne__(self, other):
return self.__dict__ != other.__dict__
class _XmlRemote(object):
def __init__(self,
name,
alias=None,
fetch=None,
manifestUrl=None,
review=None):
review=None,
revision=None,
projecthookName=None,
projecthookRevision=None):
self.name = name
self.fetchUrl = fetch
self.manifestUrl = manifestUrl
self.remoteAlias = alias
self.reviewUrl = review
self.revision = revision
self.resolvedFetchUrl = self._resolveFetchUrl()
self.projecthookName = projecthookName
self.projecthookRevision = projecthookRevision
def __eq__(self, other):
return self.__dict__ == other.__dict__
@ -74,23 +86,31 @@ class _XmlRemote(object):
def _resolveFetchUrl(self):
url = self.fetchUrl.rstrip('/')
manifestUrl = self.manifestUrl.rstrip('/')
p = manifestUrl.startswith('persistent-http')
if p:
manifestUrl = manifestUrl[len('persistent-'):]
# urljoin will get confused if there is no scheme in the base url
# ie, if manifestUrl is of the form <hostname:port>
# urljoin gets confused over quite a few things. The ones we care
# about here are:
# * no scheme in the base url, like <hostname:port>
# * persistent-https://
# * rpc://
# We handle this by replacing these with obscure protocols
# and then replacing them with the original when we are done.
# gopher -> <none>
# wais -> persistent-https
# nntp -> rpc
if manifestUrl.find(':') != manifestUrl.find('/') - 1:
manifestUrl = 'gopher://' + manifestUrl
manifestUrl = re.sub(r'^persistent-https://', 'wais://', manifestUrl)
manifestUrl = re.sub(r'^rpc://', 'nntp://', manifestUrl)
url = urllib.parse.urljoin(manifestUrl, url)
url = re.sub(r'^gopher://', '', url)
if p:
url = 'persistent-' + url
url = re.sub(r'^wais://', 'persistent-https://', url)
url = re.sub(r'^nntp://', 'rpc://', url)
return url
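A minimal sketch of the scheme-substitution trick used above, outside of _resolveFetchUrl (the URLs are made up): urljoin only performs relative resolution for schemes it knows, such as wais, gopher and nntp, so the unknown schemes are temporarily mapped onto them and restored afterwards.

import re
try:
  import urllib.parse as urlparse   # Python 3
except ImportError:
  import urlparse                   # Python 2

base = 'persistent-https://example.com/platform/manifest'
print(urlparse.urljoin(base, '..'))   # '..' -- unknown scheme, no resolution

base = re.sub(r'^persistent-https://', 'wais://', base)
url = urlparse.urljoin(base, '..')                      # 'wais://example.com/'
print(re.sub(r'^wais://', 'persistent-https://', url))  # original scheme restored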
def ToRemoteSpec(self, projectName):
url = self.resolvedFetchUrl.rstrip('/') + '/' + projectName
remoteName = self.name
if self.remoteAlias:
remoteName = self.remoteAlias
return RemoteSpec(remoteName, url, self.reviewUrl)
class XmlManifest(object):
@ -145,8 +165,20 @@ class XmlManifest(object):
root.appendChild(e)
e.setAttribute('name', r.name)
e.setAttribute('fetch', r.fetchUrl)
if r.remoteAlias is not None:
e.setAttribute('alias', r.remoteAlias)
if r.reviewUrl is not None:
e.setAttribute('review', r.reviewUrl)
if r.revision is not None:
e.setAttribute('revision', r.revision)
if r.projecthookName is not None:
ph = doc.createElement('projecthook')
ph.setAttribute('name', r.projecthookName)
ph.setAttribute('revision', r.projecthookRevision)
e.appendChild(ph)
def _ParseGroups(self, groups):
return [x for x in re.split(r'[,\s]+', groups) if x]
def Save(self, fd, peg_rev=False, peg_rev_upstream=True):
"""Write the current manifest out to the given file descriptor.
@ -155,7 +187,7 @@ class XmlManifest(object):
groups = mp.config.GetString('manifest.groups')
if groups:
groups = [x for x in re.split(r'[,\s]+', groups) if x]
groups = self._ParseGroups(groups)
doc = xml.dom.minidom.Document()
root = doc.createElement('manifest')
@ -205,8 +237,9 @@ class XmlManifest(object):
root.appendChild(doc.createTextNode(''))
def output_projects(parent, parent_node, projects):
for p in projects:
output_project(parent, parent_node, self.projects[p])
for project_name in projects:
for project in self._projects[project_name]:
output_project(parent, parent_node, project)
def output_project(parent, parent_node, p):
if not p.MatchesGroups(groups):
@ -223,8 +256,12 @@ class XmlManifest(object):
e.setAttribute('name', name)
if relpath != name:
e.setAttribute('path', relpath)
if not d.remote or p.remote.name != d.remote.name:
e.setAttribute('remote', p.remote.name)
remoteName = None
if d.remote:
remoteName = d.remote.remoteAlias or d.remote.name
if not d.remote or p.remote.name != remoteName:
remoteName = p.remote.name
e.setAttribute('remote', remoteName)
if peg_rev:
if self.IsMirror:
value = p.bare_git.rev_parse(p.revisionExpr + '^0')
@ -236,8 +273,12 @@ class XmlManifest(object):
# isn't our value, and the if the default doesn't already have that
# covered.
e.setAttribute('upstream', p.revisionExpr)
elif not d.revisionExpr or p.revisionExpr != d.revisionExpr:
e.setAttribute('revision', p.revisionExpr)
else:
revision = self.remotes[remoteName].revision or d.revisionExpr
if not revision or revision != p.revisionExpr:
e.setAttribute('revision', p.revisionExpr)
if p.upstream and p.upstream != p.revisionExpr:
e.setAttribute('upstream', p.upstream)
for c in p.copyfiles:
ce = doc.createElement('copyfile')
@ -245,6 +286,12 @@ class XmlManifest(object):
ce.setAttribute('dest', c.dest)
e.appendChild(ce)
for l in p.linkfiles:
le = doc.createElement('linkfile')
le.setAttribute('src', l.src)
le.setAttribute('dest', l.dest)
e.appendChild(le)
default_groups = ['all', 'name:%s' % p.name, 'path:%s' % p.relpath]
egroups = [g for g in p.groups if g not in default_groups]
if egroups:
@ -264,13 +311,11 @@ class XmlManifest(object):
e.setAttribute('sync-s', 'true')
if p.subprojects:
sort_projects = list(sorted([subp.name for subp in p.subprojects]))
output_projects(p, e, sort_projects)
subprojects = set(subp.name for subp in p.subprojects)
output_projects(p, e, list(sorted(subprojects)))
sort_projects = list(sorted([key for key, value in self.projects.items()
if not value.parent]))
sort_projects.sort()
output_projects(None, root, sort_projects)
projects = set(p.name for p in self._paths.values() if not p.parent)
output_projects(None, root, list(sorted(projects)))
if self._repo_hooks_project:
root.appendChild(doc.createTextNode(''))
@ -282,10 +327,15 @@ class XmlManifest(object):
doc.writexml(fd, '', ' ', '\n', 'UTF-8')
@property
def paths(self):
self._Load()
return self._paths
@property
def projects(self):
self._Load()
return self._projects
return list(self._paths.values())
@property
def remotes(self):
@ -316,9 +366,14 @@ class XmlManifest(object):
def IsMirror(self):
return self.manifestProject.config.GetBoolean('repo.mirror')
@property
def IsArchive(self):
return self.manifestProject.config.GetBoolean('repo.archive')
def _Unload(self):
self._loaded = False
self._projects = {}
self._paths = {}
self._remotes = {}
self._default = None
self._repo_hooks_project = None
@ -422,11 +477,13 @@ class XmlManifest(object):
for node in itertools.chain(*node_list):
if node.nodeName == 'default':
if self._default is not None:
raise ManifestParseError(
'duplicate default in %s' %
(self.manifestFile))
self._default = self._ParseDefault(node)
new_default = self._ParseDefault(node)
if self._default is None:
self._default = new_default
elif new_default != self._default:
raise ManifestParseError('duplicate default in %s' %
(self.manifestFile))
if self._default is None:
self._default = _Default()
@ -448,11 +505,17 @@ class XmlManifest(object):
self._manifest_server = url
def recursively_add_projects(project):
if self._projects.get(project.name):
projects = self._projects.setdefault(project.name, [])
if project.relpath is None:
raise ManifestParseError(
'duplicate project %s in %s' %
'missing path for %s in %s' %
(project.name, self.manifestFile))
self._projects[project.name] = project
if project.relpath in self._paths:
raise ManifestParseError(
'duplicate path %s in %s' %
(project.relpath, self.manifestFile))
self._paths[project.relpath] = project
projects.append(project)
for subproject in project.subprojects:
recursively_add_projects(subproject)
@ -460,6 +523,23 @@ class XmlManifest(object):
if node.nodeName == 'project':
project = self._ParseProject(node)
recursively_add_projects(project)
if node.nodeName == 'extend-project':
name = self._reqatt(node, 'name')
if name not in self._projects:
raise ManifestParseError('extend-project element specifies non-existent '
'project: %s' % name)
path = node.getAttribute('path')
groups = node.getAttribute('groups')
if groups:
groups = self._ParseGroups(groups)
for p in self._projects[name]:
if path and p.relpath != path:
continue
if groups:
p.groups.extend(groups)
if node.nodeName == 'repo-hooks':
# Get the name of the project and the (space-separated) list of enabled hooks.
repo_hooks_project = self._reqatt(node, 'in-project')
@ -473,22 +553,31 @@ class XmlManifest(object):
# Store a reference to the Project.
try:
self._repo_hooks_project = self._projects[repo_hooks_project]
repo_hooks_projects = self._projects[repo_hooks_project]
except KeyError:
raise ManifestParseError(
'project %s not found for repo-hooks' %
(repo_hooks_project))
if len(repo_hooks_projects) != 1:
raise ManifestParseError(
'internal error parsing repo-hooks in %s' %
(self.manifestFile))
self._repo_hooks_project = repo_hooks_projects[0]
# Store the enabled hooks in the Project object.
self._repo_hooks_project.enabled_repo_hooks = enabled_repo_hooks
if node.nodeName == 'remove-project':
name = self._reqatt(node, 'name')
try:
del self._projects[name]
except KeyError:
if name not in self._projects:
raise ManifestParseError('remove-project element specifies non-existent '
'project: %s' % name)
for p in self._projects[name]:
del self._paths[p.relpath]
del self._projects[name]
# If the manifest removes the hooks project, treat it as if it deleted
# the repo-hooks element too.
if self._repo_hooks_project and (self._repo_hooks_project.name == name):
@ -525,11 +614,13 @@ class XmlManifest(object):
name = name,
remote = remote.ToRemoteSpec(name),
gitdir = gitdir,
objdir = gitdir,
worktree = None,
relpath = None,
relpath = name or None,
revisionExpr = m.revisionExpr,
revisionId = None)
self._projects[project.name] = project
self._projects[project.name] = [project]
self._paths[project.relpath] = project
def _ParseRemote(self, node):
"""
@ -543,8 +634,17 @@ class XmlManifest(object):
review = node.getAttribute('review')
if review == '':
review = None
revision = node.getAttribute('revision')
if revision == '':
revision = None
manifestUrl = self.manifestProject.config.GetString('remote.origin.url')
return _XmlRemote(name, alias, fetch, manifestUrl, review)
projecthookName = None
projecthookRevision = None
for n in node.childNodes:
if n.nodeName == 'projecthook':
projecthookName, projecthookRevision = self._ParseProjectHooks(n)
break
return _XmlRemote(name, alias, fetch, manifestUrl, review, revision, projecthookName, projecthookRevision)
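A minimal sketch of the remote element shape that _ParseRemote() now understands, with a revision attribute and a projecthook child; the manifest text below is made up for illustration.

import xml.dom.minidom

doc = xml.dom.minidom.parseString(
  '<manifest>'
  '<remote name="myremote" fetch=".." review="https://review.example.com"'
  '        revision="refs/heads/stable">'
  '<projecthook name="tools/repohooks" revision="master"/>'
  '</remote>'
  '</manifest>')
remote = doc.getElementsByTagName('remote')[0]
print(remote.getAttribute('revision'))                # refs/heads/stable
hook = remote.getElementsByTagName('projecthook')[0]
print(hook.getAttribute('name'), hook.getAttribute('revision'))

A project under such a remote then falls back to the remote's revision before the manifest default, per the revisionExpr change further below.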
def _ParseDefault(self, node):
"""
@ -637,7 +737,7 @@ class XmlManifest(object):
raise ManifestParseError("no remote for project %s within %s" %
(name, self.manifestFile))
revisionExpr = node.getAttribute('revision')
revisionExpr = node.getAttribute('revision') or remote.revision
if not revisionExpr:
revisionExpr = self._default.revisionExpr
if not revisionExpr:
@ -686,12 +786,13 @@ class XmlManifest(object):
groups = ''
if node.hasAttribute('groups'):
groups = node.getAttribute('groups')
groups = [x for x in re.split(r'[,\s]+', groups) if x]
groups = self._ParseGroups(groups)
if parent is None:
relpath, worktree, gitdir = self.GetProjectPaths(name, path)
relpath, worktree, gitdir, objdir = self.GetProjectPaths(name, path)
else:
relpath, worktree, gitdir = self.GetSubprojectPaths(parent, path)
relpath, worktree, gitdir, objdir = \
self.GetSubprojectPaths(parent, name, path)
default_groups = ['all', 'name:%s' % name, 'path:%s' % relpath]
groups.extend(set(default_groups).difference(groups))
@ -704,6 +805,7 @@ class XmlManifest(object):
name = name,
remote = remote.ToRemoteSpec(name),
gitdir = gitdir,
objdir = objdir,
worktree = worktree,
relpath = relpath,
revisionExpr = revisionExpr,
@ -720,6 +822,8 @@ class XmlManifest(object):
for n in node.childNodes:
if n.nodeName == 'copyfile':
self._ParseCopyFile(project, n)
if n.nodeName == 'linkfile':
self._ParseLinkFile(project, n)
if n.nodeName == 'annotation':
self._ParseAnnotation(project, n)
if n.nodeName == 'project':
@ -732,10 +836,15 @@ class XmlManifest(object):
if self.IsMirror:
worktree = None
gitdir = os.path.join(self.topdir, '%s.git' % name)
objdir = gitdir
else:
worktree = os.path.join(self.topdir, path).replace('\\', '/')
gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path)
return relpath, worktree, gitdir
objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name)
return relpath, worktree, gitdir, objdir
def GetProjectsWithName(self, name):
return self._projects.get(name, [])
def GetSubprojectName(self, parent, submodule_path):
return os.path.join(parent.name, submodule_path)
@ -746,14 +855,15 @@ class XmlManifest(object):
def _UnjoinRelpath(self, parent_relpath, relpath):
return os.path.relpath(relpath, parent_relpath)
def GetSubprojectPaths(self, parent, path):
def GetSubprojectPaths(self, parent, name, path):
relpath = self._JoinRelpath(parent.relpath, path)
gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path)
objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name)
if self.IsMirror:
worktree = None
else:
worktree = os.path.join(parent.worktree, path).replace('\\', '/')
return relpath, worktree, gitdir
return relpath, worktree, gitdir, objdir
def _ParseCopyFile(self, project, node):
src = self._reqatt(node, 'src')
@ -763,6 +873,14 @@ class XmlManifest(object):
# dest is relative to the top of the tree
project.AddCopyFile(src, dest, os.path.join(self.topdir, dest))
def _ParseLinkFile(self, project, node):
src = self._reqatt(node, 'src')
dest = self._reqatt(node, 'dest')
if not self.IsMirror:
# src is project relative;
# dest is relative to the top of the tree
project.AddLinkFile(src, dest, os.path.join(self.topdir, dest))
def _ParseAnnotation(self, project, node):
name = self._reqatt(node, 'name')
value = self._reqatt(node, 'value')
@ -795,3 +913,43 @@ class XmlManifest(object):
raise ManifestParseError("no %s in <%s> within %s" %
(attname, node.nodeName, self.manifestFile))
return v
def projectsDiff(self, manifest):
"""return the projects differences between two manifests.
The diff will be from self to given manifest.
"""
fromProjects = self.paths
toProjects = manifest.paths
fromKeys = sorted(fromProjects.keys())
toKeys = sorted(toProjects.keys())
diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []}
for proj in fromKeys:
if not proj in toKeys:
diff['removed'].append(fromProjects[proj])
else:
fromProj = fromProjects[proj]
toProj = toProjects[proj]
try:
fromRevId = fromProj.GetCommitRevisionId()
toRevId = toProj.GetCommitRevisionId()
except ManifestInvalidRevisionError:
diff['unreachable'].append((fromProj, toProj))
else:
if fromRevId != toRevId:
diff['changed'].append((fromProj, toProj))
toKeys.remove(proj)
for proj in toKeys:
diff['added'].append(toProjects[proj])
return diff
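A minimal usage sketch of projectsDiff() as a library call; the repodir and manifest names below are hypothetical, and the new diffmanifests command further below wraps the same call.

from manifest_xml import XmlManifest

repodir = '/path/to/workspace/.repo'   # hypothetical
old = XmlManifest(repodir)
old.Override('previous.xml')           # relative to .repo/manifests
new = XmlManifest(repodir)             # current manifest.xml
diff = old.projectsDiff(new)
for p in diff['added']:
  print('A %s %s' % (p.relpath, p.revisionExpr))
for from_p, to_p in diff['changed']:
  print('C %s %s %s' % (from_p.relpath, from_p.revisionExpr, to_p.revisionExpr))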
def _ParseProjectHooks(self, node):
name = self._reqatt(node, 'name')
revision = self._reqatt(node, 'revision')
return name, revision

File diff suppressed because it is too large

repo

@ -20,7 +20,7 @@ REPO_REV = 'stable'
# limitations under the License.
# increment this whenever we make important changes to this script
VERSION = (1, 20)
VERSION = (1, 21)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1, 2)
@ -110,9 +110,11 @@ REPO_MAIN = S_repo + '/main.py' # main script
MIN_PYTHON_VERSION = (2, 6) # minimum supported python version
import errno
import optparse
import os
import re
import shutil
import stat
import subprocess
import sys
@ -137,11 +139,6 @@ def _print(*objects, **kwargs):
# Python version check
ver = sys.version_info
if ver[0] == 3:
_print('error: Python 3 support is not fully implemented in repo yet.\n'
'Please use Python 2.6 - 2.7 instead.',
file=sys.stderr)
sys.exit(1)
if (ver[0], ver[1]) < MIN_PYTHON_VERSION:
_print('error: Python version %s unsupported.\n'
'Please use Python 2.6 - 2.7 instead.'
@ -181,6 +178,10 @@ group.add_option('--reference',
group.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
group.add_option('--archive',
dest='archive', action='store_true',
help='checkout an archive instead of a git repository for '
'each project. See git archive.')
group.add_option('-g', '--groups',
dest='groups', default='default',
help='restrict manifest projects to ones with specified '
@ -240,10 +241,10 @@ def _Init(args):
_print("fatal: invalid branch name '%s'" % branch, file=sys.stderr)
raise CloneFailure()
if not os.path.isdir(repodir):
try:
os.mkdir(repodir)
except OSError as e:
try:
os.mkdir(repodir)
except OSError as e:
if e.errno != errno.EEXIST:
_print('fatal: cannot make %s directory: %s'
% (repodir, e.strerror), file=sys.stderr)
# Don't raise CloneFailure; that would delete the
@ -274,6 +275,20 @@ def _Init(args):
raise
def ParseGitVersion(ver_str):
if not ver_str.startswith('git version '):
return None
num_ver_str = ver_str[len('git version '):].strip().split('-')[0]
to_tuple = []
for num_str in num_ver_str.split('.')[:3]:
if num_str.isdigit():
to_tuple.append(int(num_str))
else:
to_tuple.append(0)
return tuple(to_tuple)
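A few illustrative inputs for ParseGitVersion() above (the version strings are made up; anything after '-' and past the third numeric field is dropped):

assert ParseGitVersion('git version 1.9.1') == (1, 9, 1)
assert ParseGitVersion('git version 2.1.0-rc1') == (2, 1, 0)
assert ParseGitVersion('git version 1.8.3.msysgit.0') == (1, 8, 3)
assert ParseGitVersion('not a git version string') is None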
def _CheckGitVersion():
cmd = [GIT, '--version']
try:
@ -291,12 +306,11 @@ def _CheckGitVersion():
proc.stdout.close()
proc.wait()
if not ver_str.startswith('git version '):
ver_act = ParseGitVersion(ver_str)
if ver_act is None:
_print('error: "%s" unsupported' % ver_str, file=sys.stderr)
raise CloneFailure()
ver_str = ver_str[len('git version '):].strip()
ver_act = tuple(map(int, ver_str.split('.')[0:3]))
if ver_act < MIN_GIT_VERSION:
need = '.'.join(map(str, MIN_GIT_VERSION))
_print('fatal: git %s or later required' % need, file=sys.stderr)
@ -322,18 +336,18 @@ def NeedSetupGnuPG():
def SetupGnuPG(quiet):
if not os.path.isdir(home_dot_repo):
try:
os.mkdir(home_dot_repo)
except OSError as e:
try:
os.mkdir(home_dot_repo)
except OSError as e:
if e.errno != errno.EEXIST:
_print('fatal: cannot make %s directory: %s'
% (home_dot_repo, e.strerror), file=sys.stderr)
sys.exit(1)
if not os.path.isdir(gpg_dir):
try:
os.mkdir(gpg_dir, stat.S_IRWXU)
except OSError as e:
try:
os.mkdir(gpg_dir, stat.S_IRWXU)
except OSError as e:
if e.errno != errno.EEXIST:
_print('fatal: cannot make %s directory: %s' % (gpg_dir, e.strerror),
file=sys.stderr)
sys.exit(1)
@ -724,12 +738,7 @@ def main(orig_args):
try:
_Init(args)
except CloneFailure:
for root, dirs, files in os.walk(repodir, topdown=False):
for name in files:
os.remove(os.path.join(root, name))
for name in dirs:
os.rmdir(os.path.join(root, name))
os.rmdir(repodir)
shutil.rmtree(os.path.join(repodir, S_repo), ignore_errors=True)
sys.exit(1)
repo_main, rel_repo_dir = _FindRepo()
else:
@ -739,7 +748,7 @@ def main(orig_args):
repo_main = my_main
ver_str = '.'.join(map(str, VERSION))
me = [repo_main,
me = [sys.executable, repo_main,
'--repo-dir=%s' % rel_repo_dir,
'--wrapper-version=%s' % ver_str,
'--wrapper-path=%s' % wrapper_path,
@ -747,7 +756,7 @@ def main(orig_args):
me.extend(orig_args)
me.extend(extra_args)
try:
os.execv(repo_main, me)
os.execv(sys.executable, me)
except OSError as e:
_print("fatal: unable to start %s" % repo_main, file=sys.stderr)
_print("fatal: %s" % e, file=sys.stderr)
@ -755,4 +764,8 @@ def main(orig_args):
if __name__ == '__main__':
if ver[0] == 3:
_print('warning: Python 3 support is currently experimental. YMMV.\n'
'Please use Python 2.6 - 2.7 instead.',
file=sys.stderr)
main(sys.argv[1:])


@ -46,6 +46,10 @@ class BranchInfo(object):
def IsCurrent(self):
return self.current > 0
@property
def IsSplitCurrent(self):
return self.current != 0 and self.current != len(self.projects)
@property
def IsPublished(self):
return self.published > 0
@ -139,10 +143,14 @@ is shown, then the branch appears in all projects.
if in_cnt < project_cnt:
fmt = out.write
paths = []
if in_cnt < project_cnt - in_cnt:
non_cur_paths = []
if i.IsSplitCurrent or (in_cnt < project_cnt - in_cnt):
in_type = 'in'
for b in i.projects:
paths.append(b.project.relpath)
if not i.IsSplitCurrent or b.current:
paths.append(b.project.relpath)
else:
non_cur_paths.append(b.project.relpath)
else:
fmt = out.notinproject
in_type = 'not in'
@ -154,13 +162,19 @@ is shown, then the branch appears in all projects.
paths.append(p.relpath)
s = ' %s %s' % (in_type, ', '.join(paths))
if width + 7 + len(s) < 80:
if not i.IsSplitCurrent and (width + 7 + len(s) < 80):
fmt = out.current if i.IsCurrent else fmt
fmt(s)
else:
fmt(' %s:' % in_type)
fmt = out.current if i.IsCurrent else out.write
for p in paths:
out.nl()
fmt(width*' ' + ' %s' % p)
fmt = out.write
for p in non_cur_paths:
out.nl()
fmt(width*' ' + ' %s' % p)
else:
out.write(' in all projects')
out.nl()

subcmds/diffmanifests.py (new file)

@ -0,0 +1,195 @@
#
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from color import Coloring
from command import PagedCommand
from manifest_xml import XmlManifest
class _Coloring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, "status")
class Diffmanifests(PagedCommand):
""" A command to see logs in projects represented by manifests
This is used to see deeper differences between manifests. Where a simple
diff would only show, for example, that a project's sha1 changed, this command
displays the logs of the project between both sha1s, allowing the user to see
the differences at a deeper level.
"""
common = True
helpSummary = "Manifest diff utility"
helpUsage = """%prog manifest1.xml [manifest2.xml] [options]"""
helpDescription = """
The %prog command shows differences between project revisions of manifest1 and
manifest2. If manifest2 is not specified, the current manifest.xml will be used
instead. Both absolute and relative paths may be used for manifests. Relative
paths start from the project's ".repo/manifests" folder.
The --raw option displays the diff in a way that facilitates parsing: the
project pattern will be <status> <path> <revision from> [<revision to>] and the
commit pattern will be <status> <onelined log>, with status values respectively:
A = Added project
R = Removed project
C = Changed project
U = Project with unreachable revision(s) (revision(s) not found)
for a project, and
A = Added commit
R = Removed commit
for a commit.
Only changed projects may contain commits; a commit status always starts with
a space and belongs to the last printed project.
Unreachable revisions may occur if a project is not up to date or if repo has
not been initialized with all the groups, in which case some projects won't be
synced and their revisions won't be found.
"""
def _Options(self, p):
p.add_option('--raw',
dest='raw', action='store_true',
help='Display raw diff.')
p.add_option('--no-color',
dest='color', action='store_false', default=True,
help='does not display the diff in color.')
def _printRawDiff(self, diff):
for project in diff['added']:
self.printText("A %s %s" % (project.relpath, project.revisionExpr))
self.out.nl()
for project in diff['removed']:
self.printText("R %s %s" % (project.relpath, project.revisionExpr))
self.out.nl()
for project, otherProject in diff['changed']:
self.printText("C %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr))
self.out.nl()
self._printLogs(project, otherProject, raw=True, color=False)
for project, otherProject in diff['unreachable']:
self.printText("U %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr))
self.out.nl()
def _printDiff(self, diff, color=True):
if diff['added']:
self.out.nl()
self.printText('added projects : \n')
self.out.nl()
for project in diff['added']:
self.printProject('\t%s' % (project.relpath))
self.printText(' at revision ')
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['removed']:
self.out.nl()
self.printText('removed projects : \n')
self.out.nl()
for project in diff['removed']:
self.printProject('\t%s' % (project.relpath))
self.printText(' at revision ')
self.printRevision(project.revisionExpr)
self.out.nl()
if diff['changed']:
self.out.nl()
self.printText('changed projects : \n')
self.out.nl()
for project, otherProject in diff['changed']:
self.printProject('\t%s' % (project.relpath))
self.printText(' changed from ')
self.printRevision(project.revisionExpr)
self.printText(' to ')
self.printRevision(otherProject.revisionExpr)
self.out.nl()
self._printLogs(project, otherProject, raw=False, color=color)
self.out.nl()
if diff['unreachable']:
self.out.nl()
self.printText('projects with unreachable revisions : \n')
self.out.nl()
for project, otherProject in diff['unreachable']:
self.printProject('\t%s ' % (project.relpath))
self.printRevision(project.revisionExpr)
self.printText(' or ')
self.printRevision(otherProject.revisionExpr)
self.printText(' not found')
self.out.nl()
def _printLogs(self, project, otherProject, raw=False, color=True):
logs = project.getAddedAndRemovedLogs(otherProject, oneline=True,
color=color)
if logs['removed']:
removedLogs = logs['removed'].split('\n')
for log in removedLogs:
if log.strip():
if raw:
self.printText(' R ' + log)
self.out.nl()
else:
self.printRemoved('\t\t[-] ')
self.printText(log)
self.out.nl()
if logs['added']:
addedLogs = logs['added'].split('\n')
for log in addedLogs:
if log.strip():
if raw:
self.printText(' A ' + log)
self.out.nl()
else:
self.printAdded('\t\t[+] ')
self.printText(log)
self.out.nl()
def Execute(self, opt, args):
if not args or len(args) > 2:
self.Usage()
self.out = _Coloring(self.manifest.globalConfig)
self.printText = self.out.nofmt_printer('text')
if opt.color:
self.printProject = self.out.nofmt_printer('project', attr = 'bold')
self.printAdded = self.out.nofmt_printer('green', fg = 'green', attr = 'bold')
self.printRemoved = self.out.nofmt_printer('red', fg = 'red', attr = 'bold')
self.printRevision = self.out.nofmt_printer('revision', fg = 'yellow')
else:
self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText
manifest1 = XmlManifest(self.manifest.repodir)
manifest1.Override(args[0])
if len(args) == 1:
manifest2 = self.manifest
else:
manifest2 = XmlManifest(self.manifest.repodir)
manifest2.Override(args[1])
diff = manifest1.projectsDiff(manifest2)
if opt.raw:
self._printRawDiff(diff)
else:
self._printDiff(diff, color=opt.color)


@ -18,6 +18,7 @@ import re
import sys
from command import Command
from error import GitError
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
@ -87,7 +88,13 @@ makes it available in your project's local working directory.
for c in dl.commits:
print(' %s' % (c), file=sys.stderr)
if opt.cherrypick:
project._CherryPick(dl.commit)
try:
project._CherryPick(dl.commit)
except GitError:
print('[%s] Could not complete the cherry-pick of %s' \
% (project.name, dl.commit), file=sys.stderr)
sys.exit(1)
elif opt.revert:
project._Revert(dl.commit)
elif opt.ffonly:


@ -14,7 +14,9 @@
# limitations under the License.
from __future__ import print_function
import errno
import fcntl
import multiprocessing
import re
import os
import select
@ -31,6 +33,7 @@ _CAN_COLOR = [
'log',
]
class ForallColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'forall')
@ -87,6 +90,12 @@ revision to a locally executed git command, use REPO_LREV.
REPO_RREV is the name of the revision from the manifest, exactly
as written in the manifest.
REPO_COUNT is the total number of projects being iterated.
REPO_I is the current (1-based) iteration count. Can be used in
conjunction with REPO_COUNT to add a simple progress indicator to your
command.
REPO__* are any extra environment variables, specified by the
"annotation" element under any project element. This can be useful
for differentiating trees based on user-specific criteria, or simply
@ -126,9 +135,31 @@ without iterating through the remaining projects.
g.add_option('-v', '--verbose',
dest='verbose', action='store_true',
help='Show command error messages')
g.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', default=1,
help='number of commands to execute simultaneously')
def WantPager(self, opt):
return opt.project_header
return opt.project_header and opt.jobs == 1
def _SerializeProject(self, project):
""" Serialize a project._GitGetByExec instance.
project._GitGetByExec is not pickle-able. Instead of trying to pass it
around between processes, make a dict ourselves containing only the
attributes that we need.
"""
return {
'name': project.name,
'relpath': project.relpath,
'remote_name': project.remote.name,
'lrev': project.GetRevisionId(),
'rrev': project.revisionExpr,
'annotations': dict((a.name, a.value) for a in project.annotations),
'gitdir': project.gitdir,
'worktree': project.worktree,
}
def Execute(self, opt, args):
if not opt.command:
@ -167,123 +198,165 @@ without iterating through the remaining projects.
# pylint: enable=W0631
mirror = self.manifest.IsMirror
out = ForallColoring(self.manifest.manifestProject.config)
out.redirect(sys.stdout)
rc = 0
first = True
if not opt.regex:
projects = self.GetProjects(args)
else:
projects = self.FindProjects(args)
for project in projects:
env = os.environ.copy()
def setenv(name, val):
if val is None:
val = ''
env[name] = val.encode()
os.environ['REPO_COUNT'] = str(len(projects))
setenv('REPO_PROJECT', project.name)
setenv('REPO_PATH', project.relpath)
setenv('REPO_REMOTE', project.remote.name)
setenv('REPO_LREV', project.GetRevisionId())
setenv('REPO_RREV', project.revisionExpr)
for a in project.annotations:
setenv("REPO__%s" % (a.name), a.value)
if mirror:
setenv('GIT_DIR', project.gitdir)
cwd = project.gitdir
else:
cwd = project.worktree
if not os.path.exists(cwd):
if (opt.project_header and opt.verbose) \
or not opt.project_header:
print('skipping %s/' % project.relpath, file=sys.stderr)
continue
if opt.project_header:
stdin = subprocess.PIPE
stdout = subprocess.PIPE
stderr = subprocess.PIPE
else:
stdin = None
stdout = None
stderr = None
p = subprocess.Popen(cmd,
cwd = cwd,
shell = shell,
env = env,
stdin = stdin,
stdout = stdout,
stderr = stderr)
if opt.project_header:
class sfd(object):
def __init__(self, fd, dest):
self.fd = fd
self.dest = dest
def fileno(self):
return self.fd.fileno()
empty = True
errbuf = ''
p.stdin.close()
s_in = [sfd(p.stdout, sys.stdout),
sfd(p.stderr, sys.stderr)]
for s in s_in:
flags = fcntl.fcntl(s.fd, fcntl.F_GETFL)
fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
while s_in:
in_ready, _out_ready, _err_ready = select.select(s_in, [], [])
for s in in_ready:
buf = s.fd.read(4096)
if not buf:
s.fd.close()
s_in.remove(s)
continue
if not opt.verbose:
if s.fd != p.stdout:
errbuf += buf
continue
if empty:
if first:
first = False
else:
out.nl()
if mirror:
project_header_path = project.name
else:
project_header_path = project.relpath
out.project('project %s/', project_header_path)
out.nl()
out.flush()
if errbuf:
sys.stderr.write(errbuf)
sys.stderr.flush()
errbuf = ''
empty = False
s.dest.write(buf)
s.dest.flush()
r = p.wait()
if r != 0:
if r != rc:
rc = r
if opt.abort_on_errors:
print("error: %s: Aborting due to previous error" % project.relpath,
file=sys.stderr)
sys.exit(r)
pool = multiprocessing.Pool(opt.jobs)
try:
config = self.manifest.manifestProject.config
results_it = pool.imap(
DoWorkWrapper,
[[mirror, opt, cmd, shell, cnt, config, self._SerializeProject(p)]
for cnt, p in enumerate(projects)]
)
pool.close()
for r in results_it:
rc = rc or r
if r != 0 and opt.abort_on_errors:
raise Exception('Aborting due to previous error')
except (KeyboardInterrupt, WorkerKeyboardInterrupt):
# Catch KeyboardInterrupt raised inside and outside of workers
print('Interrupted - terminating the pool')
pool.terminate()
rc = rc or errno.EINTR
except Exception as e:
# Catch any other exceptions raised
print('Got an error, terminating the pool: %r' % e,
file=sys.stderr)
pool.terminate()
rc = rc or getattr(e, 'errno', 1)
finally:
pool.join()
if rc != 0:
sys.exit(rc)
class WorkerKeyboardInterrupt(Exception):
""" Keyboard interrupt exception for worker processes. """
pass
def DoWorkWrapper(args):
""" A wrapper around the DoWork() method.
Catch the KeyboardInterrupt exceptions here and re-raise them as a different,
``Exception``-based exception to stop it flooding the console with stacktraces
and making the parent hang indefinitely.
"""
project = args.pop()
try:
return DoWork(project, *args)
except KeyboardInterrupt:
print('%s: Worker interrupted' % project['name'])
raise WorkerKeyboardInterrupt()
def DoWork(project, mirror, opt, cmd, shell, cnt, config):
env = os.environ.copy()
def setenv(name, val):
if val is None:
val = ''
env[name] = val.encode()
setenv('REPO_PROJECT', project['name'])
setenv('REPO_PATH', project['relpath'])
setenv('REPO_REMOTE', project['remote_name'])
setenv('REPO_LREV', project['lrev'])
setenv('REPO_RREV', project['rrev'])
setenv('REPO_I', str(cnt + 1))
for name in project['annotations']:
setenv("REPO__%s" % (name), project['annotations'][name])
if mirror:
setenv('GIT_DIR', project['gitdir'])
cwd = project['gitdir']
else:
cwd = project['worktree']
if not os.path.exists(cwd):
if (opt.project_header and opt.verbose) \
or not opt.project_header:
print('skipping %s/' % project['relpath'], file=sys.stderr)
return
if opt.project_header:
stdin = subprocess.PIPE
stdout = subprocess.PIPE
stderr = subprocess.PIPE
else:
stdin = None
stdout = None
stderr = None
p = subprocess.Popen(cmd,
cwd=cwd,
shell=shell,
env=env,
stdin=stdin,
stdout=stdout,
stderr=stderr)
if opt.project_header:
out = ForallColoring(config)
out.redirect(sys.stdout)
class sfd(object):
def __init__(self, fd, dest):
self.fd = fd
self.dest = dest
def fileno(self):
return self.fd.fileno()
empty = True
errbuf = ''
p.stdin.close()
s_in = [sfd(p.stdout, sys.stdout),
sfd(p.stderr, sys.stderr)]
for s in s_in:
flags = fcntl.fcntl(s.fd, fcntl.F_GETFL)
fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
while s_in:
in_ready, _out_ready, _err_ready = select.select(s_in, [], [])
for s in in_ready:
buf = s.fd.read(4096)
if not buf:
s.fd.close()
s_in.remove(s)
continue
if not opt.verbose:
if s.fd != p.stdout:
errbuf += buf
continue
if empty and out:
if not cnt == 0:
out.nl()
if mirror:
project_header_path = project['name']
else:
project_header_path = project['relpath']
out.project('project %s/', project_header_path)
out.nl()
out.flush()
if errbuf:
sys.stderr.write(errbuf)
sys.stderr.flush()
errbuf = ''
empty = False
s.dest.write(buf)
s.dest.flush()
r = p.wait()
return r
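The pool/worker split above reduces to a small pattern: push the per-project work through imap(), and convert a Ctrl-C inside a worker into an ordinary exception so the parent loop can terminate the pool instead of hanging. A minimal, self-contained sketch of that pattern (the work done per item is a placeholder, not forall's DoWork):

import multiprocessing

class WorkerInterrupt(Exception):
  pass

def work_wrapper(item):
  try:
    return len(item)          # placeholder for the real per-project work
  except KeyboardInterrupt:
    raise WorkerInterrupt()   # picklable, so the parent sees it from imap()

if __name__ == '__main__':
  pool = multiprocessing.Pool(4)
  try:
    results = pool.imap(work_wrapper, ['a', 'bb', 'ccc'])
    pool.close()
    for r in results:
      print(r)
  except (KeyboardInterrupt, WorkerInterrupt):
    pool.terminate()
  finally:
    pool.join()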


@ -32,7 +32,7 @@ else:
from color import Coloring
from command import InteractiveCommand, MirrorSafeCommand
from error import ManifestParseError
from project import SyncBuffer
from project import SyncBuffer, MetaProject
from git_config import GitConfig
from git_command import git_require, MIN_GIT_VERSION
@ -99,6 +99,10 @@ to update the working directory files.
g.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
g.add_option('--archive',
dest='archive', action='store_true',
help='checkout an archive instead of a git repository for '
'each project. See git archive.')
g.add_option('-g', '--groups',
dest='groups', default='default',
help='restrict manifest projects to ones with specified '
@ -198,6 +202,16 @@ to update the working directory files.
if opt.reference:
m.config.SetString('repo.reference', opt.reference)
if opt.archive:
if is_new:
m.config.SetString('repo.archive', 'true')
else:
print('fatal: --archive is only supported when initializing a new '
'workspace.', file=sys.stderr)
print('Either delete the .repo folder in this workspace, or initialize '
'in another location.', file=sys.stderr)
sys.exit(1)
if opt.mirror:
if is_new:
m.config.SetString('repo.mirror', 'true')
@ -219,7 +233,7 @@ to update the working directory files.
sys.exit(1)
if opt.manifest_branch:
m.MetaBranchSwitch(opt.manifest_branch)
m.MetaBranchSwitch()
syncbuf = SyncBuffer(m.config)
m.Sync_LocalHalf(syncbuf)
@ -360,14 +374,68 @@ to update the working directory files.
print(' rm -r %s/.repo' % self.manifest.topdir)
print('and try again.')
def _SyncProjectHooks(self, opt, repodir):
"""Downloads the defined hooks supplied in the projecthooks element
"""
# Always delete projecthooks and re-download for every new init.
projecthooksdir = os.path.join(repodir, 'projecthooks')
if os.path.exists(projecthooksdir):
shutil.rmtree(projecthooksdir)
for remotename in self.manifest.remotes:
r = self.manifest.remotes.get(remotename)
if r.projecthookName is not None and r.projecthookRevision is not None:
projecthookurl = r.resolvedFetchUrl.rstrip('/') + '/' + r.projecthookName
ph = MetaProject(manifest = self.manifest,
name = r.projecthookName,
gitdir = os.path.join(projecthooksdir,'%s/%s.git' % (remotename, r.projecthookName)),
worktree = os.path.join(projecthooksdir,'%s/%s' % (remotename, r.projecthookName)))
ph.revisionExpr = r.projecthookRevision
is_new = not ph.Exists
if is_new:
if not opt.quiet:
print('Get projecthook %s' % \
GitConfig.ForUser().UrlInsteadOf(projecthookurl), file=sys.stderr)
ph._InitGitDir(MirrorOverride=True)
phr = ph.GetRemote(remotename)
phr.name = 'origin'
phr.url = projecthookurl
phr.ResetFetch()
phr.Save()
if not ph.Sync_NetworkHalf(quiet=opt.quiet, is_new=is_new, clone_bundle=False):
print('fatal: cannot obtain projecthook %s' % phr.url, file=sys.stderr)
# Better delete the git dir if we created it; otherwise next
# time (when user fixes problems) we won't go through the "is_new" logic.
if is_new:
shutil.rmtree(ph.gitdir)
sys.exit(1)
syncbuf = SyncBuffer(ph.config)
ph.Sync_LocalHalf(syncbuf)
syncbuf.Finish()
def Execute(self, opt, args):
git_require(MIN_GIT_VERSION, fail=True)
if opt.reference:
opt.reference = os.path.expanduser(opt.reference)
# Check this here, else manifest will be tagged "not new" and init won't be
# possible anymore without removing the .repo/manifests directory.
if opt.archive and opt.mirror:
print('fatal: --mirror and --archive cannot be used together.',
file=sys.stderr)
sys.exit(1)
self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name)
self._SyncProjectHooks(opt, self.manifest.repodir)
if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
if opt.config_name or self._ShouldConfigureUser():


@ -62,6 +62,9 @@ branch but need to incorporate new upstream changes "underneath" them.
if opt.interactive and not one_project:
print('error: interactive rebase not supported with multiple projects',
file=sys.stderr)
if len(args) == 1:
print('note: project %s is mapped to more than one path' % (args[0],),
file=sys.stderr)
return -1
for project in all_projects:


@ -59,9 +59,13 @@ revision specified in the manifest.
for project in all_projects:
pm.update()
# If the current revision is a specific SHA1 then we can't push back
# to it so substitute the manifest default revision instead.
# to it; so substitute with dest_branch if defined, or with manifest
# default revision instead.
if IsId(project.revisionExpr):
project.revisionExpr = self.manifest.default.revisionExpr
if project.dest_branch:
project.revisionExpr = project.dest_branch
else:
project.revisionExpr = self.manifest.default.revisionExpr
if not project.StartBranch(nb):
err.append(project)
pm.end()


@ -113,7 +113,7 @@ the following meanings:
try:
state = project.PrintWorkTreeStatus(output)
if state == 'CLEAN':
clean_counter.next()
next(clean_counter)
finally:
sem.release()
@ -141,7 +141,7 @@ the following meanings:
for project in all_projects:
state = project.PrintWorkTreeStatus()
if state == 'CLEAN':
counter.next()
next(counter)
else:
sem = _threading.Semaphore(opt.jobs)
threads_and_output = []
@ -164,7 +164,7 @@ the following meanings:
t.join()
output.dump(sys.stdout)
output.close()
if len(all_projects) == counter.next():
if len(all_projects) == next(counter):
print('nothing to commit (working directory clean)')
if opt.orphans:


@ -14,10 +14,10 @@
# limitations under the License.
from __future__ import print_function
import json
import netrc
from optparse import SUPPRESS_HELP
import os
import pickle
import re
import shutil
import socket
@ -58,13 +58,13 @@ except ImportError:
from git_command import GIT, git_require
from git_refs import R_HEADS, HEAD
from main import WrapperModule
from project import Project
from project import RemoteSpec
from command import Command, MirrorSafeCommand
from error import RepoChangedException, GitError, ManifestParseError
from project import SyncBuffer
from progress import Progress
from wrapper import Wrapper
_ONE_DAY_S = 24 * 60 * 60
@ -128,6 +128,9 @@ HTTP client or proxy configuration, but the Git binary works.
The --fetch-submodules option enables fetching Git submodules
of a project from server.
The -c/--current-branch option can be used to only fetch objects that
are on the branch specified by a project's revision.
SSH Connections
---------------
@ -219,9 +222,25 @@ later is required to fix a server side protocol bug.
dest='repo_upgraded', action='store_true',
help=SUPPRESS_HELP)
def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event):
def _FetchProjectList(self, opt, projects, *args, **kwargs):
"""Main function of the fetch threads when jobs are > 1.
Delegates most of the work to _FetchHelper.
Args:
opt: Program options returned from optparse. See _Options().
projects: Projects to fetch.
*args, **kwargs: Remaining arguments to pass to _FetchHelper. See the
_FetchHelper docstring for details.
"""
for project in projects:
success = self._FetchHelper(opt, project, *args, **kwargs)
if not success and not opt.force_broken:
break
def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event):
"""Fetch git objects for a single project.
Args:
opt: Program options returned from optparse. See _Options().
project: Project object for the project to fetch.
@ -235,6 +254,9 @@ later is required to fix a server side protocol bug.
can be started up.
err_event: We'll set this event in the case of an error (after printing
out info about the error).
Returns:
Whether the fetch was successful.
"""
# We'll set to true once we've locked the lock.
did_lock = False
@ -253,7 +275,7 @@ later is required to fix a server side protocol bug.
quiet=opt.quiet,
current_branch_only=opt.current_branch_only,
clone_bundle=not opt.no_clone_bundle,
no_tags=opt.no_tags)
no_tags=opt.no_tags, archive=self.manifest.IsArchive)
self._fetch_times.Set(project, time.time() - start)
# Lock around all the rest of the code, since printing, updating a set
@ -281,67 +303,65 @@ later is required to fix a server side protocol bug.
lock.release()
sem.release()
return success
def _Fetch(self, projects, opt):
fetched = set()
lock = _threading.Lock()
pm = Progress('Fetching projects', len(projects))
if self.jobs == 1:
for project in projects:
pm.update()
if not opt.quiet:
print('Fetching project %s' % project.name)
if project.Sync_NetworkHalf(
quiet=opt.quiet,
current_branch_only=opt.current_branch_only,
clone_bundle=not opt.no_clone_bundle,
no_tags=opt.no_tags):
fetched.add(project.gitdir)
else:
print('error: Cannot fetch %s' % project.name, file=sys.stderr)
if opt.force_broken:
print('warn: --force-broken, continuing to sync', file=sys.stderr)
else:
sys.exit(1)
else:
threads = set()
lock = _threading.Lock()
sem = _threading.Semaphore(self.jobs)
err_event = _threading.Event()
for project in projects:
# Check for any errors before starting any new threads.
# ...we'll let existing threads finish, though.
if err_event.isSet():
break
objdir_project_map = dict()
for project in projects:
objdir_project_map.setdefault(project.objdir, []).append(project)
sem.acquire()
t = _threading.Thread(target = self._FetchHelper,
args = (opt,
project,
lock,
fetched,
pm,
sem,
err_event))
threads = set()
sem = _threading.Semaphore(self.jobs)
err_event = _threading.Event()
for project_list in objdir_project_map.values():
# Check for any errors before running any more tasks.
# ...we'll let existing threads finish, though.
if err_event.isSet() and not opt.force_broken:
break
sem.acquire()
kwargs = dict(opt=opt,
projects=project_list,
lock=lock,
fetched=fetched,
pm=pm,
sem=sem,
err_event=err_event)
if self.jobs > 1:
t = _threading.Thread(target = self._FetchProjectList,
kwargs = kwargs)
# Ensure that Ctrl-C will not freeze the repo process.
t.daemon = True
threads.add(t)
t.start()
else:
self._FetchProjectList(**kwargs)
for t in threads:
t.join()
for t in threads:
t.join()
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
print('\nerror: Exited sync due to fetch errors', file=sys.stderr)
sys.exit(1)
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
print('\nerror: Exited sync due to fetch errors', file=sys.stderr)
sys.exit(1)
pm.end()
self._fetch_times.Save()
self._GCProjects(projects)
if not self.manifest.IsArchive:
self._GCProjects(projects)
return fetched
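The objdir_project_map grouping in _Fetch() above puts projects that share an object directory into a single task, presumably so that two fetches never write into the same object store at the same time, while unrelated projects still fetch in parallel. A minimal sketch of the grouping step with made-up values (dict order may vary):

projects = [('platform/a', '/objs/a.git'),
            ('mirror/a',   '/objs/a.git'),   # shares the object dir
            ('platform/b', '/objs/b.git')]
objdir_project_map = {}
for name, objdir in projects:
  objdir_project_map.setdefault(objdir, []).append(name)
print(objdir_project_map)
# {'/objs/a.git': ['platform/a', 'mirror/a'], '/objs/b.git': ['platform/b']}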
def _GCProjects(self, projects):
gitdirs = {}
for project in projects:
gitdirs[project.gitdir] = project.bare_git
has_dash_c = git_require((1, 7, 2))
if multiprocessing and has_dash_c:
cpu_count = multiprocessing.cpu_count()
@ -350,8 +370,8 @@ later is required to fix a server side protocol bug.
jobs = min(self.jobs, cpu_count)
if jobs < 2:
for project in projects:
project.bare_git.gc('--auto')
for bare_git in gitdirs.values():
bare_git.gc('--auto')
return
config = {'pack.threads': cpu_count / jobs if cpu_count > jobs else 1}
@ -360,10 +380,10 @@ later is required to fix a server side protocol bug.
sem = _threading.Semaphore(jobs)
err_event = _threading.Event()
def GC(project):
def GC(bare_git):
try:
try:
project.bare_git.gc('--auto', config=config)
bare_git.gc('--auto', config=config)
except GitError:
err_event.set()
except:
@ -372,11 +392,11 @@ later is required to fix a server side protocol bug.
finally:
sem.release()
for project in projects:
for bare_git in gitdirs.values():
if err_event.isSet():
break
sem.acquire()
t = _threading.Thread(target=GC, args=(project,))
t = _threading.Thread(target=GC, args=(bare_git,))
t.daemon = True
threads.add(t)
t.start()
@ -416,12 +436,13 @@ later is required to fix a server side protocol bug.
if path not in new_project_paths:
# If the path has already been deleted, we don't need to do it
if os.path.exists(self.manifest.topdir + '/' + path):
gitdir = os.path.join(self.manifest.topdir, path, '.git')
project = Project(
manifest = self.manifest,
name = path,
remote = RemoteSpec('origin'),
gitdir = os.path.join(self.manifest.topdir,
path, '.git'),
gitdir = gitdir,
objdir = gitdir,
worktree = os.path.join(self.manifest.topdir, path),
relpath = path,
revisionExpr = 'HEAD',
@ -540,7 +561,10 @@ later is required to fix a server side protocol bug.
branch = branch[len(R_HEADS):]
env = os.environ.copy()
if 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in env:
if 'SYNC_TARGET' in env:
target = env['SYNC_TARGET']
[success, manifest_str] = server.GetApprovedManifest(branch, target)
elif 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in env:
target = '%s-%s' % (env['TARGET_PRODUCT'],
env['TARGET_BUILD_VARIANT'])
[success, manifest_str] = server.GetApprovedManifest(branch, target)
@ -641,7 +665,7 @@ later is required to fix a server side protocol bug.
previously_missing_set = missing_set
fetched.update(self._Fetch(missing, opt))
if self.manifest.IsMirror:
if self.manifest.IsMirror or self.manifest.IsArchive:
# bail out now, we have no working tree
return
@ -666,10 +690,10 @@ later is required to fix a server side protocol bug.
print(self.manifest.notice)
def _PostRepoUpgrade(manifest, quiet=False):
wrapper = WrapperModule()
wrapper = Wrapper()
if wrapper.NeedSetupGnuPG():
wrapper.SetupGnuPG(quiet)
for project in manifest.projects.values():
for project in manifest.projects:
if project.Exists:
project.PostRepoUpgrade()
@ -742,7 +766,7 @@ class _FetchTimes(object):
_ALPHA = 0.5
def __init__(self, manifest):
self._path = os.path.join(manifest.repodir, '.repopickle_fetchtimes')
self._path = os.path.join(manifest.repodir, '.repo_fetchtimes.json')
self._times = None
self._seen = set()
@ -762,21 +786,16 @@ class _FetchTimes(object):
if self._times is None:
try:
f = open(self._path)
except IOError:
self._times = {}
return self._times
try:
try:
self._times = pickle.load(f)
except IOError:
try:
os.remove(self._path)
except OSError:
pass
self._times = {}
finally:
f.close()
return self._times
self._times = json.load(f)
finally:
f.close()
except (IOError, ValueError):
try:
os.remove(self._path)
except OSError:
pass
self._times = {}
def Save(self):
if self._times is None:
@ -790,13 +809,13 @@ class _FetchTimes(object):
del self._times[name]
try:
f = open(self._path, 'wb')
f = open(self._path, 'w')
try:
pickle.dump(self._times, f)
except (IOError, OSError, pickle.PickleError):
try:
os.remove(self._path)
except OSError:
pass
finally:
f.close()
json.dump(self._times, f, indent=2)
finally:
f.close()
except (IOError, TypeError):
try:
os.remove(self._path)
except OSError:
pass
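With the switch from pickle to JSON above, the fetch-times cache becomes a plain JSON object mapping project names to per-project fetch durations in seconds, readable with any JSON tool. An illustrative sketch of what .repo/.repo_fetchtimes.json might contain (names and values are made up):

import json
print(json.dumps({'platform/build': 47.3, 'platform/manifest': 1.9}, indent=2))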


@ -21,13 +21,16 @@ import sys
from command import InteractiveCommand
from editor import Editor
from error import HookError, UploadError
from git_command import GitCommand
from project import RepoHook
from pyversion import is_python3
# pylint:disable=W0622
if not is_python3():
# pylint:disable=W0622
input = raw_input
# pylint:enable=W0622
else:
unicode = str
# pylint:enable=W0622
UNUSUAL_COMMIT_THRESHOLD = 5
@ -88,6 +91,11 @@ to "true" then repo will assume you always answer "y" at the prompt,
and will not prompt you further. If it is set to "false" then repo
will assume you always answer "n", and will abort.
review.URL.autoreviewer:
To automatically append a user or mailing list to reviews, you can set
a per-project or global Git option to do so.
review.URL.autocopy:
To automatically copy a user or mailing list to all uploaded reviews,
@ -292,14 +300,20 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
self._UploadAndReport(opt, todo, people)
def _AppendAutoCcList(self, branch, people):
def _AppendAutoList(self, branch, people):
"""
Appends the list of reviewers in the git project's config.
Appends the list of users in the CC list in the git project's config if a
non-empty reviewer list was found.
"""
name = branch.name
project = branch.project
key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if not raw_list is None:
people[0].extend([entry.strip() for entry in raw_list.split(',')])
key = 'review.%s.autocopy' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if not raw_list is None and len(people[0]) > 0:
@ -322,16 +336,20 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
for branch in todo:
try:
people = copy.deepcopy(original_people)
self._AppendAutoCcList(branch, people)
self._AppendAutoList(branch, people)
# Check if there are local changes that may have been forgotten
if branch.project.HasChanges():
changes = branch.project.UncommitedFiles()
if changes:
key = 'review.%s.autoupload' % branch.project.remote.review
answer = branch.project.config.GetBoolean(key)
# if they want to auto upload, let's not ask because it could be automated
if answer is None:
sys.stdout.write('Uncommitted changes in ' + branch.project.name + ' (did you forget to amend?). Continue uploading? (y/N) ')
sys.stdout.write('Uncommitted changes in ' + branch.project.name)
sys.stdout.write(' (did you forget to amend?):\n')
sys.stdout.write('\n'.join(changes) + '\n')
sys.stdout.write('Continue uploading? (y/N) ')
a = sys.stdin.readline().strip().lower()
if a not in ('y', 'yes', 't', 'true', 'on'):
print("skipping upload", file=sys.stderr)
@ -344,7 +362,21 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
key = 'review.%s.uploadtopic' % branch.project.remote.review
opt.auto_topic = branch.project.config.GetBoolean(key)
destination = opt.dest_branch or branch.project.dest_branch or branch.project.revisionExpr
destination = opt.dest_branch or branch.project.dest_branch
# Make sure our local branch is not setup to track a different remote branch
merge_branch = self._GetMergeBranch(branch.project)
if destination:
full_dest = 'refs/heads/%s' % destination
if not opt.dest_branch and merge_branch and merge_branch != full_dest:
print('merge branch %s does not match destination branch %s'
% (merge_branch, full_dest))
print('skipping upload.')
print('Please use `--destination %s` if this is intentional'
% destination)
branch.uploaded = False
continue
branch.UploadForReview(people, auto_topic=opt.auto_topic, draft=opt.draft, dest_branch=destination)
branch.uploaded = True
except UploadError as e:
@ -379,6 +411,21 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
if have_errors:
sys.exit(1)
def _GetMergeBranch(self, project):
p = GitCommand(project,
['rev-parse', '--abbrev-ref', 'HEAD'],
capture_stdout = True,
capture_stderr = True)
p.Wait()
local_branch = p.stdout.strip()
p = GitCommand(project,
['config', '--get', 'branch.%s.merge' % local_branch],
capture_stdout = True,
capture_stderr = True)
p.Wait()
merge_branch = p.stdout.strip()
return merge_branch
def Execute(self, opt, args):
project_list = self.GetProjects(args)
pending = []
@ -392,7 +439,16 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
for project in project_list:
if opt.current_branch:
cbr = project.CurrentBranch
avail = [project.GetUploadableBranch(cbr)] if cbr else None
up_branch = project.GetUploadableBranch(cbr)
if up_branch:
avail = [up_branch]
else:
avail = None
print('ERROR: Current branch (%s) not uploadable. '
'You may be able to type '
'"git branch --set-upstream-to m/master" to fix '
'your branch.' % str(cbr),
file=sys.stderr)
else:
avail = project.GetUploadableBranches(branch)
if avail:
@ -402,8 +458,10 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
self.manifest.topdir, abort_if_user_denies=True)
pending_proj_names = [project.name for (project, avail) in pending]
pending_worktrees = [project.worktree for (project, avail) in pending]
try:
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names)
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names,
worktree_list=pending_worktrees)
except HookError as e:
print("ERROR: %s" % str(e), file=sys.stderr)
return

wrapper.py (new file)

@ -0,0 +1,30 @@
#!/usr/bin/env python
#
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import imp
import os
def WrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
def Wrapper():
global _wrapper_module
if not _wrapper_module:
_wrapper_module = imp.load_source('wrapper', WrapperPath())
return _wrapper_module
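
A minimal usage sketch of the new module; the printed values depend on the local checkout and are only illustrative.

from wrapper import Wrapper, WrapperPath

print(WrapperPath())        # path of the 'repo' wrapper script next to wrapper.py
wrapper = Wrapper()         # loads the script as a module once, then caches it
print(wrapper.VERSION)      # e.g. (1, 21)
print(wrapper.ParseGitVersion('git version 1.9.1'))   # (1, 9, 1)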