Compare commits


8 Commits

Author SHA1 Message Date
68744dbc01 Fixing forall subcommand for Py3
Execution of 'repo forall -p -c' doesn't work with Py3 and ends up
with an error:

Got an error, terminating the pool: TypeError: can only concatenate
str (not "bytes") to str

That's fixed by using the decode() method.

Change-Id: Ice01aaa1822dde8d957b5bf096021dd5a2b7dd51
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253659
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Jiri Tyr <jiri.tyr@gmail.com>
(cherry picked from commit 83a3227b62)
2020-02-10 23:31:45 -05:00
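The class of fix described above can be sketched as follows (the helper name is hypothetical, not the actual patch): under Python 3, subprocess output arrives as bytes and must be decoded before being joined with str.

```python
import subprocess

def run_and_label(cmd):
    """Run |cmd| and return its output prefixed with a label, as str."""
    output = subprocess.check_output(cmd)  # bytes under Python 3
    # Concatenating bytes with str raises TypeError, so decode first.
    return 'output: ' + output.decode('utf-8', errors='replace')
```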
ef412624e9 remove spurious +x bits
These files are not directly executable, so drop the +x bits.

Change-Id: Iaf19a03a497686cc21103e7ddf08073173440dd1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/254076
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
(cherry picked from commit e7c91889a6)
2020-02-10 23:31:03 -05:00
a06ab7d28b find python via env
This allows these scripts to run through the active version of the
virtualenv python when invoked via tox.

Change-Id: Ib52f475b7b20c34d62cfd179a1341da1a08a8b5c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253974
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Mike Frysinger <vapier@google.com>
(cherry picked from commit 1b117db767)
2020-02-10 23:30:58 -05:00
471a7ed5f7 git_config: fix encoding handling in GetUrlCookieFile
Make sure we decode the bytes coming from the subprocess.Popen as
we're treating them as strings.

Change-Id: I44100ca5cd94f68a35d489936292eb641006edbe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253973
Reviewed-by: Jonathan Nieder <jrn@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
(cherry picked from commit ded477dbb9)
2020-02-10 23:28:40 -05:00
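A hedged illustration of the same kind of change (the helper below is illustrative, not the repo source): bytes read from a `subprocess.Popen` pipe are decoded before being handled as strings.

```python
import subprocess

def first_line(cmd):
    """Return the first line printed by |cmd| as a decoded str."""
    with subprocess.Popen(cmd, stdout=subprocess.PIPE) as proc:
        # readline() yields bytes under Python 3; decode before string handling.
        return proc.stdout.readline().decode('utf-8').rstrip('\n')
```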
619a2b5887 Fix inverted logic around [gitc-]init and -c
Instead of not using '-c' for '--current-branch' when using gitc, we
were only using '-c' when using gitc, so we still had the conflict with
the gitc option, and other users still couldn't use '-c'.

Test: repo init -u https://android.googlesource.com/platform/manifest; repo init -c
Test: repo gitc-init -u ... -b ... -c testing
Change-Id: I71e4950a49c281418249f0783c6a2ea34f0d3e2b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253795
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Dan Willemsen <dwillemsen@google.com>
(cherry picked from commit 93293ca47f)
2020-02-07 15:54:52 -05:00
ab15e42fa4 Do not try to fetch default revision for mirrors always
* Mirrors may contain multiple projects, some of which may not
  always contain the default revision.
* Only fetch the default revision explicitly if
  '--current-branch' is set.
* Fixes breakage caused by
  commit 6856f98467
  "Fix repo mirror with --current-branch"

Bug: https://crbug.com/gerrit/12274
Change-Id: Iaafabe2992f76f3644b841f24245d3e19c9515a9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253093
Reviewed-by: Kuang-che Wu <kcwu@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Chirayu Desai <chirayudesai1@gmail.com>
(cherry picked from commit f7b64e3350)
2020-02-06 09:19:35 -05:00
75c02fe4cb init: handle -c conflicts with gitc-init
We keep getting requests for init to support -c.  This conflicts with
gitc-init, which allocates -c for its own use.  Let's make this dynamic
so we keep it with "init" but omit it for "gitc-init".

Bug: https://crbug.com/gerrit/10200
Change-Id: Ibf69c2bbeff638e28e63cb08926fea0c622258db
(cherry picked from commit 66098f707a)
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253392
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2020-02-05 18:04:11 +00:00
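A rough sketch of the dynamic registration this commit describes; the function and flag below are illustrative, not the actual repo code. The idea is to attach the short `-c` alias to `--current-branch` only when it does not collide with an option the subcommand already claims.

```python
import optparse

def add_current_branch_option(parser, short_c_taken=False):
    """Register --current-branch, adding the -c alias only when it is free."""
    flags = ['--current-branch'] if short_c_taken else ['-c', '--current-branch']
    parser.add_option(*flags, dest='current_branch', action='store_true',
                      help='fetch only the current branch')

# Plain `repo init` can keep -c; a command that already uses -c would pass True.
parser = optparse.OptionParser()
add_current_branch_option(parser, short_c_taken=False)
```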
afd1b4023f repo: point default branch to repo-1
Since this will be feature-frozen for Python 2 users, let's point the
default update branch to "repo-1" rather than "stable", as the latter
will follow master development (and be Python 3-only).

Bug: https://crbug.com/gerrit/10418
Change-Id: Iceff0983684a580dc5c9ec1c60acfb5eda5ce2c4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/253172
Reviewed-by: David Pursehouse <dpursehouse@collab.net>
Tested-by: Mike Frysinger <vapier@google.com>
2020-02-05 02:56:08 +00:00
86 changed files with 3319 additions and 8372 deletions

.flake8

@ -1,15 +1,3 @@
[flake8]
max-line-length=100
ignore=
# E111: Indentation is not a multiple of four
E111,
# E114: Indentation is not a multiple of four (comment)
E114,
# E402: Module level import not at top of file
E402,
# E731: do not assign a lambda expression, use a def
E731,
# W503: Line break before binary operator
W503,
# W504: Line break after binary operator
W504
max-line-length=80
ignore=E111,E114,E402


@ -1,31 +0,0 @@
# GitHub actions workflow.
# https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions
name: Test CI
on:
push:
branches: [main, repo-1, stable, maint]
tags: [v*]
jobs:
test:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
python-version: [3.5, 3.6, 3.7, 3.8, 3.9]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh-actions
- name: Test with tox
run: tox

.gitignore

@ -1,4 +1,3 @@
*.asc
*.egg-info/
*.log
*.pyc
@ -7,7 +6,6 @@ __pycache__
.repopickle_*
/repoc
/.tox
/.venv
# PyCharm related
/.idea/


@ -4,7 +4,6 @@ Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu xiuyun <xiuyun.hu@hisilicon.com
Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu Xiuyun <clouds08@qq.com>
Jelly Chen <chenguodong@huawei.com> chenguodong <chenguodong@huawei.com>
Jia Bi <bijia@xiaomi.com> bijia <bijia@xiaomi.com>
Jiri Tyr <jiri.tyr@gmail.com> Jiri tyr <jiri.tyr@gmail.com>
JoonCheol Park <jooncheol@gmail.com> Jooncheol Park <jooncheol@gmail.com>
Sergii Pylypenko <x.pelya.x@gmail.com> pelya <x.pelya.x@gmail.com>
Shawn Pearce <sop@google.com> Shawn O. Pearce <sop@google.com>


@ -6,29 +6,15 @@ development workflow. Repo is not meant to replace Git, only to make it
easier to work with Git. The repo command is an executable Python script
that you can put anywhere in your path.
* Homepage: <https://gerrit.googlesource.com/git-repo/>
* Mailing list: [repo-discuss on Google Groups][repo-discuss]
* Bug reports: <https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo>
* Source: <https://gerrit.googlesource.com/git-repo/>
* Overview: <https://source.android.com/source/developing.html>
* Docs: <https://source.android.com/source/using-repo.html>
* Homepage: https://gerrit.googlesource.com/git-repo/
* Bug reports: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
* Source: https://gerrit.googlesource.com/git-repo/
* Overview: https://source.android.com/source/developing.html
* Docs: https://source.android.com/source/using-repo.html
* [repo Manifest Format](./docs/manifest-format.md)
* [repo Hooks](./docs/repo-hooks.md)
* [Submitting patches](./SUBMITTING_PATCHES.md)
* Running Repo in [Microsoft Windows](./docs/windows.md)
* GitHub mirror: <https://github.com/GerritCodeReview/git-repo>
* Postsubmit tests: <https://github.com/GerritCodeReview/git-repo/actions>
## Contact
Please use the [repo-discuss] mailing list or [issue tracker] for questions.
You can [file a new bug report][new-bug] under the "repo" component.
Please do not e-mail individual developers for support.
They do not have the bandwidth for it, and often the questions have already
been asked on [repo-discuss] or bugs posted to the [issue tracker].
So please search those sites first.
## Install
@ -48,8 +34,3 @@ $ PATH="${HOME}/.bin:${PATH}"
$ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
$ chmod a+rx ~/.bin/repo
```
[new-bug]: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
[issue tracker]: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
[repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss


@ -4,13 +4,13 @@
- Make small logical changes.
- Provide a meaningful commit message.
- Check for coding errors and style nits with flake8.
- Check for coding errors and style nits with pyflakes and flake8
- Make sure all code is under the Apache License, 2.0.
- Publish your changes for review.
- Make corrections if requested.
- Verify your changes on gerrit so they can be submitted.
`git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/main`
`git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/master`
# Long Version
@ -38,30 +38,34 @@ If your description starts to get too long, that's a sign that you
probably need to split up your commit to finer grained pieces.
## Check for coding errors and style violations with flake8
## Check for coding errors and style nits with pyflakes and flake8
Run `flake8` on changed modules:
### Coding errors
Run `pyflakes` on changed modules:
pyflakes file.py
Ideally there should be no new errors or warnings introduced.
### Style violations
Run `flake8` on changed modules:
flake8 file.py
Note that repo generally follows [Google's Python Style Guide] rather than
[PEP 8], with a couple of notable exceptions:
Note that repo generally follows [Google's python style guide] rather than
[PEP 8], so it's possible that the output of `flake8` will be quite noisy.
It's not mandatory to avoid all warnings, but at least the maximum line
length should be followed.
* Indentation is at 2 columns rather than 4
* The maximum line length is 100 columns rather than 80
If there are many occurrences of the same warning that cannot be
avoided without going against the Google style guide, these may be
suppressed in the included `.flake8` file.
There should be no new errors or warnings introduced.
Warnings that cannot be avoided without going against the Google Style Guide
may be suppressed inline individually using a `# noqa` comment as described
in the [flake8 documentation].
If there are many occurrences of the same warning, these may be suppressed for
the entire project in the included `.flake8` file.
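For instance, an inline suppression might look like this (an illustrative line, not taken from the repo source):

```python
import os, sys  # noqa: E401  (multiple imports on one line, kept deliberately)
```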
[Google's Python Style Guide]: https://google.github.io/styleguide/pyguide.html
[Google's python style guide]: https://google.github.io/styleguide/pyguide.html
[PEP 8]: https://www.python.org/dev/peps/pep-0008/
[flake8 documentation]: https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html#in-line-ignoring-errors
## Running tests
@ -150,7 +154,7 @@ Push your patches over HTTPS to the review server, possibly through
a remembered remote to make this easier in the future:
git config remote.review.url https://gerrit-review.googlesource.com/git-repo
git config remote.review.push HEAD:refs/for/main
git config remote.review.push HEAD:refs/for/master
git push review


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -82,7 +84,6 @@ def _Color(fg=None, bg=None, attr=None):
code = ''
return code
DEFAULT = None


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,7 +14,6 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import multiprocessing
import os
import optparse
import platform
@ -22,21 +23,6 @@ import sys
from event_log import EventLog
from error import NoSuchProjectError
from error import InvalidProjectGroupsError
import progress
# Number of projects to submit to a single worker process at a time.
# This number represents a tradeoff between the overhead of IPC and finer
# grained opportunity for parallelism. This particular value was chosen by
# iterating through powers of two until the overall performance no longer
# improved. The performance of this batch size is not a function of the
# number of cores on the system.
WORKER_BATCH_SIZE = 32
# How many jobs to run in parallel by default? This assumes the jobs are
# largely I/O bound and do not hit the network.
DEFAULT_LOCAL_JOBS = min(os.cpu_count(), 8)
class Command(object):
@ -48,10 +34,6 @@ class Command(object):
manifest = None
_optparse = None
# Whether this command supports running in parallel. If greater than 0,
# it is the number of parallel jobs to default to.
PARALLEL_JOBS = None
def WantPager(self, _opt):
return False
@ -84,35 +66,13 @@ class Command(object):
usage = self.helpUsage.strip().replace('%prog', me)
except AttributeError:
usage = 'repo %s' % self.NAME
epilog = 'Run `repo help %s` to view the detailed manual.' % self.NAME
self._optparse = optparse.OptionParser(usage=usage, epilog=epilog)
self._CommonOptions(self._optparse)
self._optparse = optparse.OptionParser(usage=usage)
self._Options(self._optparse)
return self._optparse
def _CommonOptions(self, p, opt_v=True):
"""Initialize the option parser with common options.
These will show up for *all* subcommands, so use sparingly.
NB: Keep in sync with repo:InitParser().
"""
g = p.add_option_group('Logging options')
opts = ['-v'] if opt_v else []
g.add_option(*opts, '--verbose',
dest='output_mode', action='store_true',
help='show all output')
g.add_option('-q', '--quiet',
dest='output_mode', action='store_false',
help='only show errors')
if self.PARALLEL_JOBS is not None:
p.add_option(
'-j', '--jobs',
type=int, default=self.PARALLEL_JOBS,
help='number of jobs to run in parallel (default: %s)' % self.PARALLEL_JOBS)
def _Options(self, p):
"""Initialize the option parser with subcommand-specific options."""
"""Initialize the option parser.
"""
def _RegisteredEnvironmentOptions(self):
"""Get options that can be set from environment variables.
@ -138,11 +98,6 @@ class Command(object):
self.OptionParser.print_usage()
sys.exit(1)
def CommonValidateOptions(self, opt, args):
"""Validate common options."""
opt.quiet = opt.output_mode is False
opt.verbose = opt.output_mode is True
def ValidateOptions(self, opt, args):
"""Validate the user options & arguments before executing.
@ -158,44 +113,6 @@ class Command(object):
"""
raise NotImplementedError
@staticmethod
def ExecuteInParallel(jobs, func, inputs, callback, output=None, ordered=False):
"""Helper for managing parallel execution boiler plate.
For subcommands that can easily split their work up.
Args:
jobs: How many parallel processes to use.
func: The function to apply to each of the |inputs|. Usually a
functools.partial for wrapping additional arguments. It will be run
in a separate process, so it must be picklable; nested functions
won't work. Methods on the subcommand Command class should work.
inputs: The list of items to process. Must be a list.
callback: The function to pass the results to for processing. It will be
executed in the main thread and process the results of |func| as they
become available. Thus it may be a local nested function. Its return
value is passed back directly. It takes three arguments:
- The processing pool (or None with one job).
- The |output| argument.
- An iterator for the results.
output: An output manager. May be progress.Progress or color.Coloring.
ordered: Whether the jobs should be processed in order.
Returns:
The |callback| function's results are returned.
"""
try:
# NB: Multiprocessing is heavy, so don't spin it up for one job.
if len(inputs) == 1 or jobs == 1:
return callback(None, output, (func(x) for x in inputs))
else:
with multiprocessing.Pool(jobs) as pool:
submit = pool.imap if ordered else pool.imap_unordered
return callback(pool, output, submit(func, inputs, chunksize=WORKER_BATCH_SIZE))
finally:
if isinstance(output, progress.Progress):
output.end()
def _ResetPathToProjectMap(self, projects):
self._by_path = dict((p.worktree, p) for p in projects)
@ -206,9 +123,9 @@ class Command(object):
project = None
if os.path.exists(path):
oldpath = None
while (path and
path != oldpath and
path != manifest.topdir):
while path and \
path != oldpath and \
path != manifest.topdir:
try:
project = self._by_path[path]
break
@ -239,7 +156,9 @@ class Command(object):
mp = manifest.manifestProject
if not groups:
groups = manifest.GetGroupsStr()
groups = mp.config.GetString('manifest.groups')
if not groups:
groups = 'default,platform-' + platform.system().lower()
groups = [x for x in re.split(r'[,\s]+', groups) if x]
if not args:
@ -317,7 +236,6 @@ class InteractiveCommand(Command):
"""Command which requires user interaction on the tty and
must not run within a pager, even if the user asks to.
"""
def WantPager(self, _opt):
return False
@ -326,7 +244,6 @@ class PagedCommand(Command):
"""Command which defaults to output in a pager, as its
display tends to be larger than one screen full.
"""
def WantPager(self, _opt):
return True


@ -1,121 +0,0 @@
# Copyright 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Programmable bash completion. https://github.com/scop/bash-completion
# Complete the list of repo subcommands.
__complete_repo_list_commands() {
local repo=${COMP_WORDS[0]}
(
# Handle completions if running outside of a checkout.
if ! "${repo}" help --all 2>/dev/null; then
repo help 2>/dev/null
fi
) | sed -n '/^ /{s/ \([^ ]\+\) .\+/\1/;p}'
}
# Complete list of all branches available in all projects in the repo client
# checkout.
__complete_repo_list_branches() {
local repo=${COMP_WORDS[0]}
"${repo}" branches 2>/dev/null | \
sed -n '/|/{s/[ *][Pp ] *\([^ ]\+\) .*/\1/;p}'
}
# Complete list of all projects available in the repo client checkout.
__complete_repo_list_projects() {
local repo=${COMP_WORDS[0]}
"${repo}" list -n 2>/dev/null
}
# Complete the repo <command> argument.
__complete_repo_command() {
if [[ ${COMP_CWORD} -ne 1 ]]; then
return 1
fi
local command=${COMP_WORDS[1]}
COMPREPLY=($(compgen -W "$(__complete_repo_list_commands)" -- "${command}"))
return 0
}
# Complete repo subcommands that take <branch> <projects>.
__complete_repo_command_branch_projects() {
local current=$1
if [[ ${COMP_CWORD} -eq 2 ]]; then
COMPREPLY=($(compgen -W "$(__complete_repo_list_branches)" -- "${current}"))
else
COMPREPLY=($(compgen -W "$(__complete_repo_list_projects)" -- "${current}"))
fi
}
# Complete repo subcommands that take only <projects>.
__complete_repo_command_projects() {
local current=$1
COMPREPLY=($(compgen -W "$(__complete_repo_list_projects)" -- "${current}"))
}
# Complete the repo subcommand arguments.
__complete_repo_arg() {
if [[ ${COMP_CWORD} -le 1 ]]; then
return 1
fi
local command=${COMP_WORDS[1]}
local current=${COMP_WORDS[COMP_CWORD]}
case ${command} in
abandon|checkout)
__complete_repo_command_branch_projects "${current}"
return 0
;;
branch|branches|diff|info|list|overview|prune|rebase|smartsync|stage|status|\
sync|upload)
__complete_repo_command_projects "${current}"
return 0
;;
help)
if [[ ${COMP_CWORD} -eq 2 ]]; then
COMPREPLY=(
$(compgen -W "$(__complete_repo_list_commands)" -- "${current}")
)
fi
return 0
;;
start)
if [[ ${COMP_CWORD} -gt 2 ]]; then
COMPREPLY=(
$(compgen -W "$(__complete_repo_list_projects)" -- "${current}")
)
fi
return 0
;;
*)
return 1
;;
esac
}
# Complete the repo arguments.
__complete_repo() {
COMPREPLY=()
__complete_repo_command && return 0
__complete_repo_arg && return 0
return 0
}
complete -F __complete_repo repo


@ -1,255 +0,0 @@
# Repo internal filesystem layout
A reference to the `.repo/` tree in repo client checkouts.
Hopefully it's complete & up-to-date, but who knows!
*** note
**Warning**:
This is meant for developers of the repo project itself as a quick reference.
**Nothing** in here should be construed as ABI, or as a promise that repo will never
change its internals in backwards-incompatible ways.
***
[TOC]
## .repo/ layout
All content under `.repo/` is managed by `repo` itself with few exceptions.
In general, you should not make manual changes in here.
If a setting was initialized using an option to `repo init`, you should use that
command to change the setting later on.
It is always safe to re-run `repo init` in existing repo client checkouts.
For example, if you want to change the manifest branch, you can simply run
`repo init --manifest-branch=<new name>` and repo will take care of the rest.
* `config`: Per-repo client checkout settings using [git-config] file format.
* `.repo_config.json`: JSON cache of the `config` file for repo to
read/process quickly.
### repo/ state
* `repo/`: A git checkout of the repo project. This is how `repo` re-execs
itself to get the latest released version.
It tracks the git repository at `REPO_URL` using the `REPO_REV` branch.
Those are specified at `repo init` time using the `--repo-url=<REPO_URL>`
and `--repo-rev=<REPO_REV>` options.
Any changes made to this directory will usually be automatically discarded
by repo itself when it checks for updates. If you want to update to the
latest version of repo, use `repo selfupdate` instead. If you want to
change the git URL/branch that this tracks, re-run `repo init` with the new
settings.
* `.repo_fetchtimes.json`: Used by `repo sync` to record stats when syncing
the various projects.
### Manifests
For more documentation on the manifest format, including the local_manifests
support, see the [manifest-format.md] file.
* `manifests/`: A git checkout of the manifest project. Its `.git/` state
points to the `manifest.git` bare checkout (see below). It tracks the git
branch specified at `repo init` time via `--manifest-branch`.
The local branch name is always `default` regardless of the remote tracking
branch. Do not get confused if the remote branch is not `default`, or if
there is a remote `default` that is completely different!
No manual changes should be made in here: they will just confuse repo, and
since it won't automatically recover, no new changes will be picked up.
* `manifests.git/`: A bare checkout of the manifest project. It tracks the
git repository specified at `repo init` time via `--manifest-url`.
No manual changes should be made in here as it will just confuse repo.
If you want to switch the tracking settings, re-run `repo init` with the
new settings.
* `manifest.xml`: The manifest that repo uses. It is generated at `repo init`
and uses the `--manifest-name` to determine what manifest file to load next
out of `manifests/`.
Do not try to modify this to load other manifests as it will confuse repo.
If you want to switch manifest files, re-run `repo init` with the new
setting.
Older versions of repo managed this with symlinks.
* `manifest.xml -> manifests/<manifest-name>.xml`: A symlink to the manifest
that the user wishes to sync. It is specified at `repo init` time via
`--manifest-name`.
* `manifests.git/.repo_config.json`: JSON cache of the `manifests.git/config`
file for repo to read/process quickly.
* `local_manifest.xml` (*Deprecated*): User-authored tweaks to the manifest
used to sync. See [local manifests] for more details.
* `local_manifests/`: Directory of user-authored manifest fragments to tweak
the manifest used to sync. See [local manifests] for more details.
### Project objects
*** note
**Warning**: Please do not use repo's approach to projects/ & project-objects/
layouts as a model for other tools to implement similar approaches.
It has a number of known downsides like:
* [Symlinks do not work well under Windows](./windows.md).
* Git sometimes replaces symlinks under .git/ with real files (under unknown
circumstances), and then the internal state gets out of sync, and data loss
may ensue.
* When sharing project-objects between multiple project checkouts, Git might
automatically run `gc` or `prune` which may lead to data loss or corruption
(since those operate on leaf projects and miss refs in other leaves). See
https://gerrit-review.googlesource.com/c/git-repo/+/254392 for more details.
Instead, you should use standard Git workflows like [git worktree] or
[gitsubmodules] with [superprojects].
***
* `project.list`: Tracking file used by `repo sync` to determine when projects
are added or removed and need corresponding updates in the checkout.
* `projects/`: Bare checkouts of every project synced by the manifest. The
filesystem layout matches the `<project path=...` setting in the manifest
(i.e. where it's checked out in the repo client source tree). Those
checkouts will symlink their `.git/` state to paths under here.
Some git state is further split out under `project-objects/`.
* `project-objects/`: Git objects that are safe to share across multiple
git checkouts. The filesystem layout matches the `<project name=...`
setting in the manifest (i.e. the path on the remote server) with a `.git`
suffix. This allows for multiple checkouts of the same remote git repo to
share their objects. For example, you could have different branches of
`foo/bar.git` checked out to `foo/bar-main`, `foo/bar-release`, etc...
There will be multiple trees under `projects/` for each one, but only one
under `project-objects/`.
This layout is designed to allow people to sync against different remotes
(e.g. a local mirror & a public review server) while avoiding duplicating
the content. However, this can run into problems if different remotes use
the same path on their respective servers. Best to avoid that.
* `subprojects/`: Like `projects/`, but for git submodules.
* `subproject-objects/`: Like `project-objects/`, but for git submodules.
* `worktrees/`: Bare checkouts of every project synced by the manifest. The
filesystem layout matches the `<project name=...` setting in the manifest
(i.e. the path on the remote server) with a `.git` suffix. This has the
same advantages as the `project-objects/` layout above.
This is used when [git worktree]'s are enabled.
### Global settings
The `.repo/manifests.git/config` file is used to track settings for the entire
repo client checkout.
Most settings use the `[repo]` section to avoid conflicts with git.
User controlled settings are initialized when running `repo init`.
| Setting | `repo init` Option | Use/Meaning |
|------------------- |---------------------------|-------------|
| manifest.groups | `--groups` & `--platform` | The manifest groups to sync |
| repo.archive | `--archive` | Use `git archive` for checkouts |
| repo.clonebundle | `--clone-bundle` | Whether the initial sync used clone.bundle explicitly |
| repo.clonefilter | `--clone-filter` | Filter setting when using [partial git clones] |
| repo.depth | `--depth` | Create shallow checkouts when cloning |
| repo.dissociate | `--dissociate` | Dissociate from any reference/mirrors after initial clone |
| repo.mirror | `--mirror` | Checkout is a repo mirror |
| repo.partialclone | `--partial-clone` | Create [partial git clones] |
| repo.partialcloneexclude | `--partial-clone-exclude` | Comma-delimited list of project names (not paths) to exclude while using [partial git clones] |
| repo.reference | `--reference` | Reference repo client checkout |
| repo.submodules | `--submodules` | Sync git submodules |
| repo.superproject | `--use-superproject` | Sync [superproject] |
| repo.worktree | `--worktree` | Use [git worktree] for checkouts |
| user.email | `--config-name` | User's e-mail address; Copied into `.git/config` when checking out a new project |
| user.name | `--config-name` | User's name; Copied into `.git/config` when checking out a new project |
[partial git clones]: https://git-scm.com/docs/gitrepository-layout#_code_partialclone_code
[superproject]: https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
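As a hedged example, these settings can be inspected with plain git-config commands pointed at that directory (the keys are taken from the table above; nothing repo-specific is required):

```
$ git --git-dir=.repo/manifests.git config manifest.groups
$ git --git-dir=.repo/manifests.git config repo.mirror
```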
### Repo hooks settings
For more details on this feature, see the [repo-hooks docs](./repo-hooks.md).
We'll just discuss the internal configuration settings.
These are stored in the registered `<repo-hooks>` project itself, so if the
manifest switches to a different project, the settings will not be copied.
| Setting | Use/Meaning |
|--------------------------------------|-------------|
| repo.hooks.\<hook\>.approvedmanifest | User approval for secure manifest sources (e.g. https://) |
| repo.hooks.\<hook\>.approvedhash | User approval for insecure manifest sources (e.g. http://) |
For example, if our manifest had the following entries, we would store settings
under `.repo/projects/src/repohooks.git/config` (which would be reachable via
`git --git-dir=src/repohooks/.git config`).
```xml
<project path="src/repohooks" name="chromiumos/repohooks" ... />
<repo-hooks in-project="chromiumos/repohooks" ... />
```
If `<hook>` is `pre-upload`, the `.git/config` setting might be:
```ini
[repo "hooks.pre-upload"]
approvedmanifest = https://chromium.googlesource.com/chromiumos/manifest
```
## Per-project settings
These settings are somewhat meant to be tweaked by the user on a per-project
basis (e.g. `git config` in a checked out source repo).
Where possible, we re-use standard git settings to avoid confusion, and we
refrain from documenting those, so see [git-config] documentation instead.
See `repo help upload` for documentation on `[review]` settings.
The `[remote]` settings are automatically populated/updated from the manifest.
The `[branch]` settings are updated by `repo start` and `git branch`.
| Setting | Subcommands | Use/Meaning |
|-------------------------------|---------------|-------------|
| review.\<url\>.autocopy | upload | Automatically add to `--cc=<value>` |
| review.\<url\>.autoreviewer | upload | Automatically add to `--reviewers=<value>` |
| review.\<url\>.autoupload | upload | Automatically answer "yes" or "no" to all prompts |
| review.\<url\>.uploadhashtags | upload | Automatically add to `--hashtag=<value>` |
| review.\<url\>.uploadlabels | upload | Automatically add to `--label=<value>` |
| review.\<url\>.uploadnotify | upload | [Notify setting][upload-notify] to use |
| review.\<url\>.uploadtopic | upload | Default [topic] to use |
| review.\<url\>.username | upload | Override username with `ssh://` review URIs |
| remote.\<remote\>.fetch | sync | Set of refs to fetch |
| remote.\<remote\>.projectname | \<network\> | The name of the project as it exists in Gerrit review |
| remote.\<remote\>.pushurl | upload | The base URI for pushing CLs |
| remote.\<remote\>.review | upload | The URI of the Gerrit review server |
| remote.\<remote\>.url | sync & upload | The URI of the git project to fetch |
| branch.\<branch\>.merge | sync & upload | The branch to merge & upload & track |
| branch.\<branch\>.remote | sync & upload | The remote to track |
## ~/ dotconfig layout
Repo will create & maintain a few files in the user's home directory.
* `.repoconfig/`: Repo's per-user directory for all random config files/state.
* `.repoconfig/config`: Per-user settings using [git-config] file format.
* `.repoconfig/keyring-version`: Cache file for checking if the gnupg subdir
has all the same keys as the repo launcher. Used to avoid running gpg
constantly as that can be quite slow.
* `.repoconfig/gnupg/`: GnuPG's internal state directory used when repo needs
to run `gpg`. This provides isolation from the user's normal `~/.gnupg/`.
* `.repoconfig/.repo_config.json`: JSON cache of the `.repoconfig/config`
file for repo to read/process quickly.
* `.repo_.gitconfig.json`: JSON cache of the `.gitconfig` file for repo to
read/process quickly.
[git-config]: https://git-scm.com/docs/git-config
[git worktree]: https://git-scm.com/docs/git-worktree
[gitsubmodules]: https://git-scm.com/docs/gitsubmodules
[manifest-format.md]: ./manifest-format.md
[local manifests]: ./manifest-format.md#Local-Manifests
[superprojects]: https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
[topic]: https://gerrit-review.googlesource.com/Documentation/intro-user.html#topics
[upload-notify]: https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify


@ -21,7 +21,6 @@ following DTD:
```xml
<!DOCTYPE manifest [
<!ELEMENT manifest (notice?,
remote*,
default?,
@ -30,7 +29,6 @@ following DTD:
project*,
extend-project*,
repo-hooks?,
superproject?,
include*)>
<!ELEMENT notice (#PCDATA)>
@ -91,7 +89,6 @@ following DTD:
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ATTLIST extend-project revision CDATA #IMPLIED>
<!ATTLIST extend-project remote CDATA #IMPLIED>
<!ELEMENT remove-project EMPTY>
<!ATTLIST remove-project name CDATA #REQUIRED>
@ -100,22 +97,11 @@ following DTD:
<!ATTLIST repo-hooks in-project CDATA #REQUIRED>
<!ATTLIST repo-hooks enabled-list CDATA #REQUIRED>
<!ELEMENT superproject (EMPTY)>
<!ATTLIST superproject name CDATA #REQUIRED>
<!ATTLIST superproject remote IDREF #IMPLIED>
<!ELEMENT include EMPTY>
<!ATTLIST include name CDATA #REQUIRED>
<!ATTLIST include groups CDATA #IMPLIED>
<!ATTLIST include name CDATA #REQUIRED>
]>
```
For compatibility purposes across repo releases, all unknown elements are
silently ignored. However, repo reserves all possible names for itself for
future use. If you want to use custom elements, the `x-*` namespace is
reserved for that purpose, and repo guarantees to never allocate any
corresponding names.
A description of the elements and their attributes follows.
@ -123,10 +109,6 @@ A description of the elements and their attributes follows.
The root element of the file.
### Element notice
Arbitrary text that is displayed to users whenever `repo sync` finishes.
The content is simply passed through as it exists in the manifest.
### Element remote
@ -159,8 +141,8 @@ Attribute `review`: Hostname of the Gerrit server where reviews
are uploaded to by `repo upload`. This attribute is optional;
if not specified then `repo upload` will not function.
Attribute `revision`: Name of a Git branch (e.g. `main` or
`refs/heads/main`). Remotes with their own revision will override
Attribute `revision`: Name of a Git branch (e.g. `master` or
`refs/heads/master`). Remotes with their own revision will override
the default revision.
### Element default
@ -173,11 +155,11 @@ Attribute `remote`: Name of a previously defined remote element.
Project elements lacking a remote attribute of their own will use
this remote.
Attribute `revision`: Name of a Git branch (e.g. `main` or
`refs/heads/main`). Project elements lacking their own
Attribute `revision`: Name of a Git branch (e.g. `master` or
`refs/heads/master`). Project elements lacking their own
revision attribute will use this revision.
Attribute `dest-branch`: Name of a Git branch (e.g. `main`).
Attribute `dest-branch`: Name of a Git branch (e.g. `master`).
Project elements not setting their own `dest-branch` will inherit
this value. If this value is not set, projects will use `revision`
by default instead.
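A minimal example of how these defaults flow into projects (all names and URLs below are placeholders, not from the original document):

```xml
<manifest>
  <remote name="origin" fetch="https://git.example.com/" />
  <default remote="origin" revision="main" />

  <!-- Inherits remote="origin" and revision="main" from <default>. -->
  <project name="tools/widget" path="widget" />

  <!-- Overrides only the revision; still fetched from "origin". -->
  <project name="tools/gadget" path="gadget" revision="refs/heads/stable" />
</manifest>
```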
@ -253,37 +235,24 @@ name will be prefixed by the parent's.
The project name must match the name Gerrit knows, if Gerrit is
being used for code reviews.
"name" must not be empty, and may not be an absolute path or use "." or ".."
path components. It is always interpreted relative to the remote's fetch
settings, so if a different base path is needed, declare a different remote
with the new settings needed.
These restrictions are not enforced for [Local Manifests].
Attribute `path`: An optional path relative to the top directory
of the repo client where the Git working directory for this project
should be placed. If not supplied the project "name" is used.
should be placed. If not supplied the project name is used.
If the project has a parent element, its path will be prefixed
by the parent's.
"path" may not be an absolute path or use "." or ".." path components.
These restrictions are not enforced for [Local Manifests].
If you want to place files into the root of the checkout (e.g. a README or
Makefile or another build script), use the [copyfile] or [linkfile] elements
instead.
Attribute `remote`: Name of a previously defined remote element.
If not supplied the remote given by the default element is used.
Attribute `revision`: Name of the Git branch the manifest wants
to track for this project. Names can be relative to refs/heads
(e.g. just "main") or absolute (e.g. "refs/heads/main").
(e.g. just "master") or absolute (e.g. "refs/heads/master").
Tags and/or explicit SHA-1s should work in theory, but have not
been extensively tested. If not supplied the revision given by
the remote element is used if applicable, else the default
element is used.
Attribute `dest-branch`: Name of a Git branch (e.g. `main`).
Attribute `dest-branch`: Name of a Git branch (e.g. `master`).
When using `repo upload`, changes will be submitted for code
review on this branch. If unspecified both here and in the
default element, `revision` is used instead.
@ -292,7 +261,7 @@ Attribute `groups`: List of groups to which this project belongs,
whitespace or comma separated. All projects belong to the group
"all", and each project automatically belongs to a group of
its name:`name` and path:`path`. E.g. for
`<project name="monkeys" path="barrel-of"/>`, that project
<project name="monkeys" path="barrel-of"/>, that project
definition is implicitly in the following manifest groups:
default, name:monkeys, and path:barrel-of. If you place a project in the
group "notdefault", it will not be automatically downloaded by repo.
@ -337,9 +306,6 @@ belongs. Same syntax as the corresponding element of `project`.
Attribute `revision`: If specified, overrides the revision of the original
project. Same syntax as the corresponding element of `project`.
Attribute `remote`: If specified, overrides the remote of the original
project. Same syntax as the corresponding element of `project`.
### Element annotation
Zero or more annotation elements may be specified as children of a
@ -372,7 +338,7 @@ It's just like copyfile and runs at the same time as copyfile but
instead of copying it creates a symlink.
The symlink is created at "dest" (relative to the top of the tree) and
points to the path specified by "src" which is a path in the project.
points to the path specified by "src".
Parent directories of "dest" will be automatically created if missing.
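For example, a hedged sketch of a linkfile child element (paths are placeholders):

```xml
<project name="platform/build" path="build/make">
  <linkfile src="root.mk" dest="Makefile" />
</project>
```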
@ -389,41 +355,6 @@ This element is mostly useful in a local manifest file, where
the user can remove a project, and possibly replace it with their
own definition.
### Element repo-hooks
NB: See the [practical documentation](./repo-hooks.md) for using repo hooks.
Only one repo-hooks element may be specified at a time.
Attempting to redefine it will fail to parse.
Attribute `in-project`: The project where the hooks are defined. The value
must match the `name` attribute (**not** the `path` attribute) of a previously
defined `project` element.
Attribute `enabled-list`: List of hooks to use, whitespace or comma separated.
### Element superproject
***
*Note*: This is currently a WIP.
***
NB: See the [git superprojects documentation](
https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects) for background
information.
This element is used to specify the URL of the superproject. It has "name" and
"remote" as atrributes. Only "name" is required while the others have
reasonable defaults. At most one superproject may be specified.
Attempting to redefine it will fail to parse.
Attribute `name`: A unique name for the superproject. This attribute has the
same meaning as project's name attribute. See the
[element project](#element-project) for more information.
Attribute `remote`: Name of a previously defined remote element.
If not supplied the remote given by the default element is used.
### Element include
This element provides the capability of including another manifest
@ -433,15 +364,8 @@ target manifest to include - it must be a usable manifest on its own.
Attribute `name`: the manifest to include, specified relative to
the manifest repository's root.
"name" may not be an absolute path or use "." or ".." path components.
These restrictions are not enforced for [Local Manifests].
Attribute `groups`: List of additional groups to which all projects
in the included manifest belong. This appends and recurses, meaning
all projects in sub-manifests carry all parent include groups.
Same syntax as the corresponding element of `project`.
## Local Manifests {#local-manifests}
## Local Manifests
Additional remotes and projects may be added through local manifest
files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.
@ -468,9 +392,10 @@ these extra projects.
Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will
be loaded in alphabetical order.
The legacy `$TOP_DIR/.repo/local_manifest.xml` path is no longer supported.
Additional remotes and projects may also be added through a local
manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`. This method
is deprecated in favor of using multiple manifest files as mentioned
above.
[copyfile]: #Element-copyfile
[linkfile]: #Element-linkfile
[Local Manifests]: #local-manifests
If `$TOP_DIR/.repo/local_manifest.xml` exists, it will be loaded before
any manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.
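A hedged example of such a fragment, e.g. saved as `.repo/local_manifests/private.xml` (names and URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest>
  <remote name="private" fetch="ssh://git@git.example.com/" />
  <project name="tools/internal" path="internal" remote="private" revision="main" />
</manifest>
```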


@ -18,13 +18,13 @@ Bugfixes may be added on a best-effort basis or from the community, but largely
no new features will be added, nor is support guaranteed.
Users can select this during `repo init` time via the [repo launcher].
Otherwise the default branches (e.g. stable & main) will be used which will
Otherwise the default branches (e.g. stable & master) will be used which will
require Python 3.
This means the [repo launcher] needs to support both Python 2 & Python 3, but
since it doesn't import any other repo code, this shouldn't be too problematic.
The main branch will require Python 3.6 at a minimum.
The master branch will require Python 3.6 at a minimum.
If the system has an older version of Python 3, then users will have to select
the legacy Python 2 branch instead.


@ -5,37 +5,6 @@ related topics and flows.
[TOC]
## Schedule
There is no specific schedule for when releases are made.
Usually it's more along the lines of "enough minor changes have been merged",
or "there's a known issue the maintainers know should get fixed".
If you find a fix has been merged for an issue important to you, but hasn't been
released after a week or so, feel free to [contact] us to request a new release.
### Release Freezes {#freeze}
We try to observe a regular schedule for when **not** to release.
If something goes wrong, staff need to be active in order to respond quickly &
effectively.
We also don't want to disrupt non-Google organizations if possible.
We generally follow the rules:
* Release during Mon - Thu, 9:00 - 14:00 [US PT]
* Avoid holidays
* All regular [US holidays]
* Large international ones if possible
* All the various [New Years]
* Jan 1 in Gregorian calendar is the most obvious
* Check for large Lunar New Years too
* Follow the normal [Google production freeze schedule]
[US holidays]: https://en.wikipedia.org/wiki/Federal_holidays_in_the_United_States
[US PT]: https://en.wikipedia.org/wiki/Pacific_Time_Zone
[New Years]: https://en.wikipedia.org/wiki/New_Year
[Google production freeze schedule]: http://goto.google.com/prod-freeze
## Launcher script
The main repo script serves as a standalone program and is often referred to as
@ -80,11 +49,11 @@ control how repo finds updates:
* `--repo-url`: This tells repo where to clone the full repo project itself.
It defaults to the official project (`REPO_URL` in the launcher script).
* `--repo-rev`: This tells repo which branch to use for the full project.
* `--repo-branch`: This tells repo which branch to use for the full project.
It defaults to the `stable` branch (`REPO_REV` in the launcher script).
Whenever `repo sync` is run, repo will check to see if an update is available.
It fetches the latest repo-rev from the repo-url.
It fetches the latest repo-branch from the repo-url.
Then it verifies that the latest commit in the branch has a valid signed tag
using `git tag -v` (which uses gpg).
If the tag is valid, then repo will update its internal checkout to it.
@ -97,7 +66,7 @@ If that tag cannot be verified, it gives up and forces the user to resolve.
## Branch management
All development happens on the `main` branch and should generally be stable.
All development happens on the `master` branch and should generally be stable.
Since the repo launcher defaults to tracking the `stable` branch, it is not
normally updated until a new release is available.
@ -112,7 +81,7 @@ For example, when `stable` moves from `v1.10.x` to `v1.11.x`, then the `maint`
branch will be updated from `v1.9.x` to `v1.10.x`.
We don't have parallel release branches/series.
Typically all tags are made against the `main` branch and then pushed to the
Typically all tags are made against the `master` branch and then pushed to the
`stable` branch to make it available to the rest of the world.
Since repo doesn't typically see a lot of changes, this tends to be OK.
@ -120,10 +89,10 @@ Since repo doesn't typically see a lot of changes, this tends to be OK.
When you want to create a new release, you'll need to select a good version and
create a signed tag using a key registered in repo itself.
Typically we just tag the latest version of the `main` branch.
Typically we just tag the latest version of the `master` branch.
The tag could be pushed now, but it won't be used by clients normally (since the
default `repo-rev` setting is `stable`).
This would allow some early testing on systems who explicitly select `main`.
default `repo-branch` setting is `stable`).
This would allow some early testing on systems who explicitly select `master`.
### Creating a signed tag
@ -144,7 +113,7 @@ $ export GNUPGHOME=~/.gnupg/repo/
$ gpg -K
# Pick whatever branch or commit you want to tag.
$ r=main
$ r=master
# Pick the new version.
$ t=1.12.10
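# Sketch of the step that follows (the exact command is elided by this hunk;
# it assumes a default signing key is configured in the GNUPGHOME above).
$ git tag -s "v$t" -m "repo $t" "$r"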
@ -192,92 +161,7 @@ You can create a short changelog using the command:
$ git log --format="%h (%aN) %s" --no-merges origin/stable..$r
```
## Project References
Here's a table showing the relationship of major tools, their EOL dates, and
their status in Ubuntu & Debian.
Those distros tend to be good indicators of how long we need to support things.
Things in bold indicate stuff to take note of, but being listed does not
guarantee that we still support them.
Things in italics are things we used to care about but probably don't anymore.
| Date | EOL | [Git][rel-g] | [Python][rel-p] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python |
|:--------:|:------------:|--------------|-----------------|-----------------------------------|-----|--------|
| Oct 2008 | *Oct 2013* | | 2.6.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Dec 2008 | *Feb 2009* | | 3.0.0 |
| Feb 2009 | *Mar 2012* | | | Debian 5 Lenny | 1.5.6.5 | 2.5.2 |
| Jun 2009 | *Jun 2016* | | 3.1.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Feb 2010 | *Oct 2012* | 1.7.0 | | *10.04 Lucid* - *12.04 Precise* - 12.10 Quantal |
| Apr 2010 | *Apr 2015* | | | *10.04 Lucid* | 1.7.0.4 | 2.6.5 3.1.2 |
| Jul 2010 | *Dec 2019* | | **2.7.0** | 11.04 Natty - **<current>** |
| Oct 2010 | | | | 10.10 Maverick | 1.7.1 | 2.6.6 3.1.3 |
| Feb 2011 | *Feb 2016* | | | Debian 6 Squeeze | 1.7.2.5 | 2.6.6 3.1.3 |
| Apr 2011 | | | | 11.04 Natty | 1.7.4 | 2.7.1 3.2.0 |
| Oct 2011 | *Feb 2016* | | 3.2.0 | 11.04 Natty - 12.10 Quantal |
| Oct 2011 | | | | 11.10 Ocelot | 1.7.5.4 | 2.7.2 3.2.2 |
| Apr 2012 | *Apr 2019* | | | *12.04 Precise* | 1.7.9.5 | 2.7.3 3.2.3 |
| Sep 2012 | *Sep 2017* | | 3.3.0 | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | *Dec 2014* | 1.8.0 | | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | | | | 12.10 Quantal | 1.7.10.4 | 2.7.3 3.2.3 |
| Apr 2013 | | | | 13.04 Raring | 1.8.1.2 | 2.7.4 3.3.1 |
| May 2013 | *May 2018* | | | Debian 7 Wheezy | 1.7.10.4 | 2.7.3 3.2.3 |
| Oct 2013 | | | | 13.10 Saucy | 1.8.3.2 | 2.7.5 3.3.2 |
| Feb 2014 | *Dec 2014* | **1.9.0** | | **14.04 Trusty** |
| Mar 2014 | *Mar 2019* | | **3.4.0** | **14.04 Trusty** - 15.10 Wily / **Jessie** |
| Apr 2014 | **Apr 2022** | | | **14.04 Trusty** | 1.9.1 | 2.7.5 3.4.0 |
| May 2014 | *Dec 2014* | 2.0.0 |
| Aug 2014 | *Dec 2014* | **2.1.0** | | 14.10 Utopic - 15.04 Vivid / **Jessie** |
| Oct 2014 | | | | 14.10 Utopic | 2.1.0 | 2.7.8 3.4.2 |
| Nov 2014 | *Sep 2015* | 2.2.0 |
| Feb 2015 | *Sep 2015* | 2.3.0 |
| Apr 2015 | *May 2017* | 2.4.0 |
| Apr 2015 | **Jun 2020** | | | **Debian 8 Jessie** | 2.1.4 | 2.7.9 3.4.2 |
| Apr 2015 | | | | 15.04 Vivid | 2.1.4 | 2.7.9 3.4.3 |
| Jul 2015 | *May 2017* | 2.5.0 | | 15.10 Wily |
| Sep 2015 | *May 2017* | 2.6.0 |
| Sep 2015 | **Sep 2020** | | **3.5.0** | **16.04 Xenial** - 17.04 Zesty / **Stretch** |
| Oct 2015 | | | | 15.10 Wily | 2.5.0 | 2.7.9 3.4.3 |
| Jan 2016 | *Jul 2017* | **2.7.0** | | **16.04 Xenial** |
| Mar 2016 | *Jul 2017* | 2.8.0 |
| Apr 2016 | **Apr 2024** | | | **16.04 Xenial** | 2.7.4 | 2.7.11 3.5.1 |
| Jun 2016 | *Jul 2017* | 2.9.0 | | 16.10 Yakkety |
| Sep 2016 | *Sep 2017* | 2.10.0 |
| Oct 2016 | | | | 16.10 Yakkety | 2.9.3 | 2.7.11 3.5.1 |
| Nov 2016 | *Sep 2017* | **2.11.0** | | 17.04 Zesty / **Stretch** |
| Dec 2016 | **Dec 2021** | | **3.6.0** | 17.10 Artful - **18.04 Bionic** - 18.10 Cosmic |
| Feb 2017 | *Sep 2017* | 2.12.0 |
| Apr 2017 | | | | 17.04 Zesty | 2.11.0 | 2.7.13 3.5.3 |
| May 2017 | *May 2018* | 2.13.0 |
| Jun 2017 | **Jun 2022** | | | **Debian 9 Stretch** | 2.11.0 | 2.7.13 3.5.3 |
| Aug 2017 | *Dec 2019* | 2.14.0 | | 17.10 Artful |
| Oct 2017 | *Dec 2019* | 2.15.0 |
| Oct 2017 | | | | 17.10 Artful | 2.14.1 | 2.7.14 3.6.3 |
| Jan 2018 | *Dec 2019* | 2.16.0 |
| Apr 2018 | *Dec 2019* | 2.17.0 | | **18.04 Bionic** |
| Apr 2018 | **Apr 2028** | | | **18.04 Bionic** | 2.17.0 | 2.7.15 3.6.5 |
| Jun 2018 | *Dec 2019* | 2.18.0 |
| Jun 2018 | **Jun 2023** | | 3.7.0 | 19.04 Disco - **20.04 Focal** / **Buster** |
| Sep 2018 | *Dec 2019* | 2.19.0 | | 18.10 Cosmic |
| Oct 2018 | | | | 18.10 Cosmic | 2.19.1 | 2.7.15 3.6.6 |
| Dec 2018 | *Dec 2019* | **2.20.0** | | 19.04 Disco / **Buster** |
| Feb 2019 | *Dec 2019* | 2.21.0 |
| Apr 2019 | | | | 19.04 Disco | 2.20.1 | 2.7.16 3.7.3 |
| Jun 2019 | | 2.22.0 |
| Jul 2019 | **Jul 2024** | | | **Debian 10 Buster** | 2.20.1 | 2.7.16 3.7.3 |
| Aug 2019 | | 2.23.0 |
| Oct 2019 | **Oct 2024** | | 3.8.0 |
| Oct 2019 | | | | 19.10 Eoan | 2.20.1 | 2.7.17 3.7.5 |
| Nov 2019 | | 2.24.0 |
| Jan 2020 | | 2.25.0 | | **20.04 Focal** |
| Apr 2020 | **Apr 2030** | | | **20.04 Focal** | 2.25.0 | 2.7.17 3.7.5 |
[contact]: ../README.md#contact
[rel-d]: https://en.wikipedia.org/wiki/Debian_version_history
[rel-g]: https://en.wikipedia.org/wiki/Git#Releases
[rel-p]: https://en.wikipedia.org/wiki/History_of_Python#Table_of_versions
[rel-u]: https://en.wikipedia.org/wiki/Ubuntu_version_history#Table_of_versions
[example announcement]: https://groups.google.com/d/topic/repo-discuss/UGBNismWo1M/discussion
[repo-discuss@googlegroups.com]: https://groups.google.com/forum/#!forum/repo-discuss
[go/repo-release]: https://goto.google.com/repo-release


@ -27,7 +27,7 @@ repohooks project is updated and a hook is triggered.
For the full syntax, see the [repo manifest format](./manifest-format.md).
Here's a short example from
[Android](https://android.googlesource.com/platform/manifest/+/HEAD/default.xml).
[Android](https://android.googlesource.com/platform/manifest/+/master/default.xml).
The `<project>` line checks out the repohooks git repo to the local
`tools/repohooks/` path. The `<repo-hooks>` line says to look in the project
with the name `platform/tools/repohooks` for hooks to run during the


@ -19,33 +19,7 @@ also due to most developers not using Windows.
We will never add code specific to older versions of Windows.
It might work, but it most likely won't, so please don't bother asking.
## Git worktrees
*** note
**Warning**: Repo's support for Git worktrees is new & experimental.
Please report any bugs and be sure to maintain backups!
***
The Repo 2.4 release introduced support for [Git worktrees][git-worktree].
You don't have to worry about or understand this particular feature, so don't
worry if this section of the Git manual is particularly impenetrable.
The salient point is that Git worktrees allow Repo to create repo client
checkouts that do not require symlinks at all under Windows.
This means users no longer need Administrator access to sync code.
Simply use `--worktree` when running `repo init` to opt in.
This does not affect specific Git repositories that use symlinks themselves.
[git-worktree]: https://git-scm.com/docs/git-worktree
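For example, a hedged invocation (the manifest URL is a placeholder):

```
$ repo init --worktree -u https://git.example.com/platform/manifest
$ repo sync
```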
## Symlinks by default
*** note
**NB**: This section applies to the default Repo behavior which does not use
Git worktrees (see the previous section for more info).
***
## Symlinks
Repo will use symlinks heavily internally.
On *NIX platforms, this isn't an issue, but Windows makes it a bit difficult.
@ -88,8 +62,9 @@ This also helps `tar` unpack symlinks, so that's nice.
## Python
Python 3.6 or newer is required.
Python 2 is known to be broken when running under Windows.
You should make sure to be running Python 3.6 or newer under Windows.
Python 2 might work, but due to already limited platform testing, you should
only run newer Python versions.
See our [Python Support](./python-support.md) document for more details.
You can grab the latest Windows installer here:<br>


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import re
import sys
@ -21,7 +24,6 @@ import tempfile
from error import EditorError
import platform_utils
class Editor(object):
"""Manages the user's preferred text editor."""
@ -55,7 +57,7 @@ class Editor(object):
if os.getenv('TERM') == 'dumb':
print(
"""No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
"""No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
Tried to fall back to vi but terminal is dumb. Please configure at
least one of these before using this command.""", file=sys.stderr)
sys.exit(1)
@ -102,10 +104,10 @@ least one of these before using this command.""", file=sys.stderr)
rc = subprocess.Popen(args, shell=shell).wait()
except OSError as e:
raise EditorError('editor failed, %s: %s %s'
% (str(e), editor, path))
% (str(e), editor, path))
if rc != 0:
raise EditorError('editor failed with exit status %d: %s %s'
% (rc, editor, path))
% (rc, editor, path))
with open(path, mode='rb') as fd2:
return fd2.read().decode('utf-8')


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,93 +14,70 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# URL to file bug reports for repo tool issues.
BUG_REPORT_URL = 'https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue'
class ManifestParseError(Exception):
"""Failed to parse the manifest file.
"""
class ManifestInvalidRevisionError(ManifestParseError):
class ManifestInvalidRevisionError(Exception):
"""The revision value in a project is incorrect.
"""
class ManifestInvalidPathError(ManifestParseError):
"""A path used in <copyfile> or <linkfile> is incorrect.
"""
class NoManifestException(Exception):
"""The required manifest does not exist.
"""
def __init__(self, path, reason):
super().__init__(path, reason)
super(NoManifestException, self).__init__()
self.path = path
self.reason = reason
def __str__(self):
return self.reason
class EditorError(Exception):
"""Unspecified error from the user's text editor.
"""
def __init__(self, reason):
super().__init__(reason)
super(EditorError, self).__init__()
self.reason = reason
def __str__(self):
return self.reason
class GitError(Exception):
"""Unspecified internal error from git.
"""
def __init__(self, command):
super().__init__(command)
super(GitError, self).__init__()
self.command = command
def __str__(self):
return self.command
class UploadError(Exception):
"""A bundle upload to Gerrit did not succeed.
"""
def __init__(self, reason):
super().__init__(reason)
super(UploadError, self).__init__()
self.reason = reason
def __str__(self):
return self.reason
class DownloadError(Exception):
"""Cannot download a repository.
"""
def __init__(self, reason):
super().__init__(reason)
super(DownloadError, self).__init__()
self.reason = reason
def __str__(self):
return self.reason
class NoSuchProjectError(Exception):
"""A specified project does not exist in the work tree.
"""
def __init__(self, name=None):
super().__init__(name)
super(NoSuchProjectError, self).__init__()
self.name = name
def __str__(self):
@ -110,9 +89,8 @@ class NoSuchProjectError(Exception):
class InvalidProjectGroupsError(Exception):
"""A specified project is not suitable for the specified groups
"""
def __init__(self, name=None):
super().__init__(name)
super(InvalidProjectGroupsError, self).__init__()
self.name = name
def __str__(self):
@ -120,18 +98,15 @@ class InvalidProjectGroupsError(Exception):
return 'in current directory'
return self.name
class RepoChangedException(Exception):
"""Thrown if 'repo sync' results in repo updating its internal
repo or manifest repositories. In this special case we must
use exec to re-execute repo with the new code and manifest.
"""
def __init__(self, extra_args=None):
super().__init__(extra_args)
super(RepoChangedException, self).__init__()
self.extra_args = extra_args or []
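# Editor's sketch (not part of this diff): a caller reacting to RepoChangedException
# by re-executing repo with the new code plus the extra arguments it carries.
# The sync entry point below is a hypothetical stand-in.
import os
import sys

def _run_sync_somehow():
    raise RepoChangedException(extra_args=['--repo-upgraded'])

try:
    _run_sync_somehow()
except RepoChangedException as e:
    os.execv(sys.executable, [sys.executable, sys.argv[0]] + e.extra_args)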
class HookError(Exception):
"""Thrown if a 'repo-hook' could not be run.


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2017 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import json
import multiprocessing
@ -19,7 +23,6 @@ TASK_COMMAND = 'command'
TASK_SYNC_NETWORK = 'sync-network'
TASK_SYNC_LOCAL = 'sync-local'
class EventLog(object):
"""Event log that records events that occurred during a repo invocation.
@ -135,7 +138,7 @@ class EventLog(object):
Returns:
A dictionary of the event added to the log.
"""
event['status'] = self.GetStatusString(success)
event['finish_time'] = finish
return event
@ -162,7 +165,6 @@ class EventLog(object):
# An integer id that is unique across this invocation of the program.
_EVENT_ID = multiprocessing.Value('i', 1)
def _NextEventId():
"""Helper function for grabbing the next unique id.


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,8 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import re
import sys
import subprocess
import tempfile
@ -26,17 +28,7 @@ from repo_trace import REPO_TRACE, IsTrace, Trace
from wrapper import Wrapper
GIT = 'git'
# NB: These do not need to be kept in sync with the repo launcher script.
# These may be much newer as it allows the repo launcher to roll between
# different repo releases while source versions might require a newer git.
#
# The soft version is when we start warning users that the version is old and
# we'll be dropping support for it. We'll refuse to work with versions older
# than the hard version.
#
# git-1.7 is in (EOL) Ubuntu Precise. git-1.9 is in Ubuntu Trusty.
MIN_GIT_VERSION_SOFT = (1, 9, 1)
MIN_GIT_VERSION_HARD = (1, 7, 2)
MIN_GIT_VERSION = (1, 5, 4)
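# Editor's sketch (not from this diff): how a caller might apply the soft/hard
# thresholds via git_require(), which is defined further down in this file.
# The warning wording is illustrative.
git_require(MIN_GIT_VERSION_HARD, fail=True)   # refuses to run with very old git
if not git_require(MIN_GIT_VERSION_SOFT):
    print('warning: git %s or newer will soon be required'
          % '.'.join(str(x) for x in MIN_GIT_VERSION_SOFT), file=sys.stderr)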
GIT_DIR = 'GIT_DIR'
LAST_GITDIR = None
@ -45,36 +37,6 @@ LAST_CWD = None
_ssh_proxy_path = None
_ssh_sock_path = None
_ssh_clients = []
_ssh_version = None
def _run_ssh_version():
"""run ssh -V to display the version number"""
return subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
def _parse_ssh_version(ver_str=None):
"""parse a ssh version string into a tuple"""
if ver_str is None:
ver_str = _run_ssh_version()
m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
if m:
return tuple(int(x) for x in m.group(1).split('.'))
else:
return ()
def ssh_version():
"""return ssh version as a tuple"""
global _ssh_version
if _ssh_version is None:
try:
_ssh_version = _parse_ssh_version()
except subprocess.CalledProcessError:
print('fatal: unable to detect ssh version', file=sys.stderr)
sys.exit(1)
return _ssh_version
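# Editor's examples of the parsing above (version strings are illustrative):
assert _parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n\n') == (7, 6)
assert _parse_ssh_version('Unknown_ssh 1.0\n') == ()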
def ssh_sock(create=True):
global _ssh_sock_path
@ -84,36 +46,28 @@ def ssh_sock(create=True):
tmp_dir = '/tmp'
if not os.path.exists(tmp_dir):
tmp_dir = tempfile.gettempdir()
if ssh_version() < (6, 7):
tokens = '%r@%h:%p'
else:
tokens = '%C' # hash of %l%h%p%r
_ssh_sock_path = os.path.join(
tempfile.mkdtemp('', 'ssh-', tmp_dir),
'master-' + tokens)
tempfile.mkdtemp('', 'ssh-', tmp_dir),
'master-%r@%h:%p')
return _ssh_sock_path
def _ssh_proxy():
global _ssh_proxy_path
if _ssh_proxy_path is None:
_ssh_proxy_path = os.path.join(
os.path.dirname(__file__),
'git_ssh')
os.path.dirname(__file__),
'git_ssh')
return _ssh_proxy_path
def _add_ssh_client(p):
_ssh_clients.append(p)
def _remove_ssh_client(p):
try:
_ssh_clients.remove(p)
except ValueError:
pass
def terminate_ssh_clients():
global _ssh_clients
for p in _ssh_clients:
@ -124,10 +78,8 @@ def terminate_ssh_clients():
pass
_ssh_clients = []
_git_version = None
class _GitCall(object):
def version_tuple(self):
global _git_version
@ -139,15 +91,12 @@ class _GitCall(object):
return _git_version
def __getattr__(self, name):
name = name.replace('_', '-')
name = name.replace('_','-')
def fun(*cmdv):
command = [name]
command.extend(cmdv)
return GitCommand(None, command).Wait() == 0
return fun
git = _GitCall()
@ -162,10 +111,11 @@ def RepoSourceVersion():
proj = os.path.dirname(os.path.abspath(__file__))
env[GIT_DIR] = os.path.join(proj, '.git')
result = subprocess.run([GIT, 'describe', HEAD], stdout=subprocess.PIPE,
encoding='utf-8', env=env, check=False)
if result.returncode == 0:
ver = result.stdout.strip()
p = subprocess.Popen([GIT, 'describe', HEAD], stdout=subprocess.PIPE,
env=env)
if p.wait() == 0:
ver = p.stdout.read().strip().decode('utf-8')
if ver.startswith('v'):
ver = ver[1:]
else:
@ -227,10 +177,8 @@ class UserAgent(object):
return self._git_ua
user_agent = UserAgent()
def git_require(min_version, fail=False, msg=''):
git_version = git.version_tuple()
if min_version <= git_version:
@ -243,38 +191,42 @@ def git_require(min_version, fail=False, msg=''):
sys.exit(1)
return False
def _setenv(env, name, value):
env[name] = value.encode()
class GitCommand(object):
def __init__(self,
project,
cmdv,
bare=False,
input=None,
capture_stdout=False,
capture_stderr=False,
merge_output=False,
disable_editor=False,
ssh_proxy=False,
cwd=None,
gitdir=None):
bare = False,
provide_stdin = False,
capture_stdout = False,
capture_stderr = False,
disable_editor = False,
ssh_proxy = False,
cwd = None,
gitdir = None):
env = self._GetBasicEnv()
# If we are not capturing std* then need to print it.
self.tee = {'stdout': not capture_stdout, 'stderr': not capture_stderr}
if disable_editor:
env['GIT_EDITOR'] = ':'
_setenv(env, 'GIT_EDITOR', ':')
if ssh_proxy:
env['REPO_SSH_SOCK'] = ssh_sock()
env['GIT_SSH'] = _ssh_proxy()
env['GIT_SSH_VARIANT'] = 'ssh'
_setenv(env, 'REPO_SSH_SOCK', ssh_sock())
_setenv(env, 'GIT_SSH', _ssh_proxy())
_setenv(env, 'GIT_SSH_VARIANT', 'ssh')
if 'http_proxy' in env and 'darwin' == sys.platform:
s = "'http.proxy=%s'" % (env['http_proxy'],)
p = env.get('GIT_CONFIG_PARAMETERS')
if p is not None:
s = p + ' ' + s
env['GIT_CONFIG_PARAMETERS'] = s
_setenv(env, 'GIT_CONFIG_PARAMETERS', s)
if 'GIT_ALLOW_PROTOCOL' not in env:
env['GIT_ALLOW_PROTOCOL'] = (
'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
env['GIT_HTTP_USER_AGENT'] = user_agent.git
_setenv(env, 'GIT_ALLOW_PROTOCOL',
'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
_setenv(env, 'GIT_HTTP_USER_AGENT', user_agent.git)
if project:
if not cwd:
@ -285,10 +237,7 @@ class GitCommand(object):
command = [GIT]
if bare:
if gitdir:
# Git on Windows wants its paths only using / for reliability.
if platform_utils.isWindows():
gitdir = gitdir.replace('\\', '/')
env[GIT_DIR] = gitdir
_setenv(env, GIT_DIR, gitdir)
cwd = None
command.append(cmdv[0])
# Need to use the --progress flag for fetch/clone so output will be
@ -298,10 +247,13 @@ class GitCommand(object):
command.append('--progress')
command.extend(cmdv[1:])
stdin = subprocess.PIPE if input else None
stdout = subprocess.PIPE if capture_stdout else None
stderr = (subprocess.STDOUT if merge_output else
(subprocess.PIPE if capture_stderr else None))
if provide_stdin:
stdin = subprocess.PIPE
else:
stdin = None
stdout = subprocess.PIPE
stderr = subprocess.PIPE
if IsTrace():
global LAST_CWD
@ -329,19 +281,15 @@ class GitCommand(object):
dbg += ' 1>|'
if stderr == subprocess.PIPE:
dbg += ' 2>|'
elif stderr == subprocess.STDOUT:
dbg += ' 2>&1'
Trace('%s', dbg)
try:
p = subprocess.Popen(command,
cwd=cwd,
env=env,
encoding='utf-8',
errors='backslashreplace',
stdin=stdin,
stdout=stdout,
stderr=stderr)
cwd = cwd,
env = env,
stdin = stdin,
stdout = stdout,
stderr = stderr)
except Exception as e:
raise GitError('%s: %s' % (command[1], e))
@ -349,17 +297,7 @@ class GitCommand(object):
_add_ssh_client(p)
self.process = p
if input:
if isinstance(input, str):
input = input.encode('utf-8')
p.stdin.write(input)
p.stdin.close()
try:
self.stdout, self.stderr = p.communicate()
finally:
_remove_ssh_client(p)
self.rc = p.wait()
self.stdin = p.stdin
@staticmethod
def _GetBasicEnv():
@ -379,4 +317,35 @@ class GitCommand(object):
return env
def Wait(self):
return self.rc
try:
p = self.process
rc = self._CaptureOutput()
finally:
_remove_ssh_client(p)
return rc
def _CaptureOutput(self):
p = self.process
s_in = platform_utils.FileDescriptorStreams.create()
s_in.add(p.stdout, sys.stdout, 'stdout')
s_in.add(p.stderr, sys.stderr, 'stderr')
self.stdout = ''
self.stderr = ''
while not s_in.is_done:
in_ready = s_in.select()
for s in in_ready:
buf = s.read()
if not buf:
s_in.remove(s)
continue
if not hasattr(buf, 'encode'):
buf = buf.decode()
if s.std_name == 'stdout':
self.stdout += buf
else:
self.stderr += buf
if self.tee[s.std_name]:
s.dest.write(buf)
s.dest.flush()
return p.wait()
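# Editor's usage sketch (not part of this diff): driving GitCommand from a caller,
# with the capture flags shown above. The command and handling are illustrative.
cmd = GitCommand(None, ['describe', '--always'],
                 capture_stdout=True, capture_stderr=True)
if cmd.Wait() == 0:
    print(cmd.stdout.strip())
else:
    print('git describe failed: %s' % cmd.stderr.strip(), file=sys.stderr)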


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,13 +14,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import contextlib
import errno
from http.client import HTTPException
import json
import os
import re
import signal
import ssl
import subprocess
import sys
@ -27,12 +29,26 @@ try:
except ImportError:
import dummy_threading as _threading
import time
import urllib.error
import urllib.request
from pyversion import is_python3
if is_python3():
import urllib.request
import urllib.error
else:
import urllib2
import imp
urllib = imp.new_module('urllib')
urllib.request = urllib2
urllib.error = urllib2
from signal import SIGTERM
from error import GitError, UploadError
import platform_utils
from repo_trace import Trace
if is_python3():
from http.client import HTTPException
else:
from httplib import HTTPException
from git_command import GitCommand
from git_command import ssh_sock
@ -43,47 +59,39 @@ ID_RE = re.compile(r'^[0-9a-f]{40}$')
REVIEW_CACHE = dict()
def IsChange(rev):
return rev.startswith(R_CHANGES)
def IsId(rev):
return ID_RE.match(rev)
def IsTag(rev):
return rev.startswith(R_TAGS)
def IsImmutable(rev):
return IsChange(rev) or IsId(rev) or IsTag(rev)
def _key(name):
parts = name.split('.')
if len(parts) < 2:
return name.lower()
parts[0] = parts[0].lower()
parts[ 0] = parts[ 0].lower()
parts[-1] = parts[-1].lower()
return '.'.join(parts)
class GitConfig(object):
_ForUser = None
_USER_CONFIG = '~/.gitconfig'
@classmethod
def ForUser(cls):
if cls._ForUser is None:
cls._ForUser = cls(configfile=os.path.expanduser(cls._USER_CONFIG))
cls._ForUser = cls(configfile = os.path.expanduser('~/.gitconfig'))
return cls._ForUser
@classmethod
def ForRepository(cls, gitdir, defaults=None):
return cls(configfile=os.path.join(gitdir, 'config'),
defaults=defaults)
return cls(configfile = os.path.join(gitdir, 'config'),
defaults = defaults)
def __init__(self, configfile, defaults=None, jsonFile=None):
self.file = configfile
@ -96,70 +104,18 @@ class GitConfig(object):
self._json = jsonFile
if self._json is None:
self._json = os.path.join(
os.path.dirname(self.file),
'.repo_' + os.path.basename(self.file) + '.json')
os.path.dirname(self.file),
'.repo_' + os.path.basename(self.file) + '.json')
def Has(self, name, include_defaults=True):
def Has(self, name, include_defaults = True):
"""Return true if this configuration file has the key.
"""
if _key(name) in self._cache:
return True
if include_defaults and self.defaults:
return self.defaults.Has(name, include_defaults=True)
return self.defaults.Has(name, include_defaults = True)
return False
def GetInt(self, name):
"""Returns an integer from the configuration file.
This follows the git config syntax.
Args:
name: The key to lookup.
Returns:
None if the value was not defined, or is not a boolean.
Otherwise, the number itself.
"""
v = self.GetString(name)
if v is None:
return None
v = v.strip()
mult = 1
if v.endswith('k'):
v = v[:-1]
mult = 1024
elif v.endswith('m'):
v = v[:-1]
mult = 1024 * 1024
elif v.endswith('g'):
v = v[:-1]
mult = 1024 * 1024 * 1024
base = 10
if v.startswith('0x'):
base = 16
try:
return int(v, base=base) * mult
except ValueError:
return None
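# Editor's illustrative use of GetInt from a caller (key and value are assumptions):
#   cfg = GitConfig.ForUser()
#   cfg.GetInt('core.bigFileThreshold')   # '512m' in the config parses to 536870912
#   cfg.GetInt('not.set.anywhere')        # -> None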
def DumpConfigDict(self):
"""Returns the current configuration dict.
Configuration data is information only (e.g. logging) and
should not be considered a stable data-source.
Returns:
dict of {<key>, <value>} for git configuration cache.
<value> are strings converted by GetString.
"""
config_dict = {}
for key in self._cache:
config_dict[key] = self.GetString(key)
return config_dict
def GetBoolean(self, name):
"""Returns a boolean from the configuration file.
None : The value was not defined, or is not a boolean.
@ -176,12 +132,6 @@ class GitConfig(object):
return False
return None
def SetBoolean(self, name, value):
"""Set the truthy value for a key."""
if value is not None:
value = 'true' if value else 'false'
self.SetString(name, value)
def GetString(self, name, all_keys=False):
"""Get the first value for a key, or None if it is not defined.
@ -192,7 +142,7 @@ class GitConfig(object):
v = self._cache[_key(name)]
except KeyError:
if self.defaults:
return self.defaults.GetString(name, all_keys=all_keys)
return self.defaults.GetString(name, all_keys = all_keys)
v = []
if not all_keys:
@ -203,7 +153,7 @@ class GitConfig(object):
r = []
r.extend(v)
if self.defaults:
r.extend(self.defaults.GetString(name, all_keys=True))
r.extend(self.defaults.GetString(name, all_keys = True))
return r
def SetString(self, name, value):
@ -267,7 +217,7 @@ class GitConfig(object):
"""
return self._sections.get(section, set())
def HasSection(self, section, subsection=''):
def HasSection(self, section, subsection = ''):
"""Does at least one key in section.subsection exist?
"""
try:
@ -318,7 +268,8 @@ class GitConfig(object):
def _ReadJson(self):
try:
if os.path.getmtime(self._json) <= os.path.getmtime(self.file):
if os.path.getmtime(self._json) \
<= os.path.getmtime(self.file):
platform_utils.remove(self._json)
return None
except OSError:
@ -350,6 +301,8 @@ class GitConfig(object):
d = self._do('--null', '--list')
if d is None:
return c
if not is_python3():
d = d.decode('utf-8')
for line in d.rstrip('\0').split('\0'):
if '\n' in line:
key, val = line.split('\n', 1)
@ -365,25 +318,19 @@ class GitConfig(object):
return c
def _do(self, *args):
command = ['config', '--file', self.file, '--includes']
command = ['config', '--file', self.file]
command.extend(args)
p = GitCommand(None,
command,
capture_stdout=True,
capture_stderr=True)
capture_stdout = True,
capture_stderr = True)
if p.Wait() == 0:
return p.stdout
else:
GitError('git config %s: %s' % (str(args), p.stderr))
class RepoConfig(GitConfig):
"""User settings for repo itself."""
_USER_CONFIG = '~/.repoconfig/config'
class RefSpec(object):
"""A Git refspec line, split into its components:
@ -445,7 +392,6 @@ _master_keys = set()
_ssh_master = True
_master_keys_lock = None
def init_ssh():
"""Should be called once at the start of repo to init ssh master handling.
@ -455,15 +401,9 @@ def init_ssh():
assert _master_keys_lock is None, "Should only call init_ssh once"
_master_keys_lock = _threading.Lock()
def _open_ssh(host, port=None):
global _ssh_master
# Bail before grabbing the lock if we already know that we aren't going to
# try creating new masters below.
if sys.platform in ('win32', 'cygwin'):
return False
# Acquire the lock. This is needed to prevent opening multiple masters for
# the same host when we're running "repo sync -jN" (for N > 1) _and_ the
# manifest <remote fetch="ssh://xyz"> specifies a different host from the
@ -481,14 +421,17 @@ def _open_ssh(host, port=None):
if key in _master_keys:
return True
if not _ssh_master or 'GIT_SSH' in os.environ:
# Failed earlier, so don't retry.
if not _ssh_master \
or 'GIT_SSH' in os.environ \
or sys.platform in ('win32', 'cygwin'):
# failed earlier, or cygwin ssh can't do this
#
return False
# We will make two calls to ssh; this is the common part of both calls.
command_base = ['ssh',
'-o', 'ControlPath %s' % ssh_sock(),
host]
'-o','ControlPath %s' % ssh_sock(),
host]
if port is not None:
command_base[1:1] = ['-p', str(port)]
@ -496,13 +439,13 @@ def _open_ssh(host, port=None):
# ...but before actually starting a master, we'll double-check. This can
# be important because we can't tell that that 'git@myhost.com' is the same
# as 'myhost.com' where "User git" is setup in the user's ~/.ssh/config file.
check_command = command_base + ['-O', 'check']
check_command = command_base + ['-O','check']
try:
Trace(': %s', ' '.join(check_command))
check_process = subprocess.Popen(check_command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
check_process.communicate() # read output, but ignore it...
isnt_running = check_process.wait()
if not isnt_running:
@ -515,14 +458,16 @@ def _open_ssh(host, port=None):
# to the log there.
pass
command = command_base[:1] + ['-M', '-N'] + command_base[1:]
command = command_base[:1] + \
['-M', '-N'] + \
command_base[1:]
try:
Trace(': %s', ' '.join(command))
p = subprocess.Popen(command)
except Exception as e:
_ssh_master = False
print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
% (host, port, str(e)), file=sys.stderr)
% (host,port, str(e)), file=sys.stderr)
return False
time.sleep(1)
@ -536,7 +481,6 @@ def _open_ssh(host, port=None):
finally:
_master_keys_lock.release()
def close_ssh():
global _master_keys_lock
@ -544,7 +488,7 @@ def close_ssh():
for p in _master_processes:
try:
os.kill(p.pid, signal.SIGTERM)
os.kill(p.pid, SIGTERM)
p.wait()
except OSError:
pass
@ -561,18 +505,15 @@ def close_ssh():
# We're done with the lock, so we can delete it.
_master_keys_lock = None
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
def GetSchemeFromUrl(url):
m = URI_ALL.match(url)
if m:
return m.group(1)
return None
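# Editor's examples for GetSchemeFromUrl (URLs are illustrative):
assert GetSchemeFromUrl('https://android.googlesource.com/platform/manifest') == 'https'
assert GetSchemeFromUrl('git@example.com:foo/bar.git') is None  # scp-style syntax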
@contextlib.contextmanager
def GetUrlCookieFile(url, quiet):
if url.startswith('persistent-'):
@ -613,7 +554,6 @@ def GetUrlCookieFile(url, quiet):
cookiefile = os.path.expanduser(cookiefile)
yield cookiefile, None
def _preconnect(url):
m = URI_ALL.match(url)
if m:
@ -634,11 +574,9 @@ def _preconnect(url):
return False
class Remote(object):
"""Configuration options related to a remote.
"""
def __init__(self, config, name):
self._config = config
self.name = name
@ -647,7 +585,7 @@ class Remote(object):
self.review = self._Get('review')
self.projectname = self._Get('projectname')
self.fetch = list(map(RefSpec.FromString,
self._Get('fetch', all_keys=True)))
self._Get('fetch', all_keys=True)))
self._review_url = None
def _InsteadOf(self):
@ -661,8 +599,8 @@ class Remote(object):
insteadOfList = globCfg.GetString(key, all_keys=True)
for insteadOf in insteadOfList:
if (self.url.startswith(insteadOf)
and len(insteadOf) > len(longest)):
if self.url.startswith(insteadOf) \
and len(insteadOf) > len(longest):
longest = insteadOf
longestUrl = url
@ -793,13 +731,12 @@ class Remote(object):
def _Get(self, key, all_keys=False):
key = 'remote.%s.%s' % (self.name, key)
return self._config.GetString(key, all_keys=all_keys)
return self._config.GetString(key, all_keys = all_keys)
class Branch(object):
"""Configuration options related to a single branch.
"""
def __init__(self, config, name):
self._config = config
self.name = name
@ -843,4 +780,4 @@ class Branch(object):
def _Get(self, key, all_keys=False):
key = 'branch.%s.%s' % (self.name, key)
return self._config.GetString(key, all_keys=all_keys)
return self._config.GetString(key, all_keys = all_keys)


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -16,14 +18,12 @@ import os
from repo_trace import Trace
import platform_utils
HEAD = 'HEAD'
R_CHANGES = 'refs/changes/'
R_HEADS = 'refs/heads/'
R_TAGS = 'refs/tags/'
R_PUB = 'refs/published/'
R_WORKTREE = 'refs/worktree/'
R_WORKTREE_M = R_WORKTREE + 'm/'
R_M = 'refs/remotes/m/'
R_HEADS = 'refs/heads/'
R_TAGS = 'refs/tags/'
R_PUB = 'refs/published/'
R_M = 'refs/remotes/m/'
class GitRefs(object):
@ -131,14 +131,11 @@ class GitRefs(object):
base = os.path.join(self._gitdir, prefix)
for name in platform_utils.listdir(base):
p = os.path.join(base, name)
# We don't implement the full ref validation algorithm, just the simple
# rules that would show up in local filesystems.
# https://git-scm.com/docs/git-check-ref-format
if name.startswith('.') or name.endswith('.lock'):
pass
elif platform_utils.isdir(p):
if platform_utils.isdir(p):
self._mtime[prefix] = os.path.getmtime(base)
self._ReadLoose(prefix + name + '/')
elif name.endswith('.lock'):
pass
else:
self._ReadLoose1(p, prefix + name)
@ -147,7 +144,7 @@ class GitRefs(object):
with open(path) as fd:
mtime = os.path.getmtime(path)
ref_id = fd.readline()
except (OSError, UnicodeError):
except (IOError, OSError):
return
try:


@ -1,280 +0,0 @@
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Provide functionality to get all projects and their commit ids from Superproject.
For more information on superproject, check out:
https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
Examples:
superproject = Superproject(manifest, repodir)
project_commit_ids = superproject.UpdateProjectsRevisionId(projects)
"""
import hashlib
import os
import sys
from error import BUG_REPORT_URL
from git_command import GitCommand
from git_refs import R_HEADS
_SUPERPROJECT_GIT_NAME = 'superproject.git'
_SUPERPROJECT_MANIFEST_NAME = 'superproject_override.xml'
class Superproject(object):
"""Get commit ids from superproject.
Initializes a local copy of a superproject for the manifest. This allows
lookup of commit ids for all projects. It contains _project_commit_ids which
is a dictionary with project/commit id entries.
"""
def __init__(self, manifest, repodir, superproject_dir='exp-superproject',
quiet=False):
"""Initializes superproject.
Args:
manifest: A Manifest object that is to be written to a file.
repodir: Path to the .repo/ dir for holding all internal checkout state.
It must be in the top directory of the repo client checkout.
superproject_dir: Relative path under |repodir| to checkout superproject.
quiet: If True, suppress the informational progress messages.
"""
self._project_commit_ids = None
self._manifest = manifest
self._quiet = quiet
self._branch = self._GetBranch()
self._repodir = os.path.abspath(repodir)
self._superproject_dir = superproject_dir
self._superproject_path = os.path.join(self._repodir, superproject_dir)
self._manifest_path = os.path.join(self._superproject_path,
_SUPERPROJECT_MANIFEST_NAME)
git_name = ''
if self._manifest.superproject:
remote_name = self._manifest.superproject['remote'].name
git_name = hashlib.md5(remote_name.encode('utf8')).hexdigest() + '-'
self._work_git_name = git_name + _SUPERPROJECT_GIT_NAME
self._work_git = os.path.join(self._superproject_path, self._work_git_name)
@property
def project_commit_ids(self):
"""Returns a dictionary of projects and their commit ids."""
return self._project_commit_ids
def _GetBranch(self):
"""Returns the branch name for getting the approved manifest."""
p = self._manifest.manifestProject
b = p.GetBranch(p.CurrentBranch)
if not b:
return None
branch = b.merge
if branch and branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
return branch
def _Init(self):
"""Sets up a local Git repository to get a copy of a superproject.
Returns:
True if initialization is successful, or False.
"""
if not os.path.exists(self._superproject_path):
os.mkdir(self._superproject_path)
if not self._quiet and not os.path.exists(self._work_git):
print('%s: Performing initial setup for superproject; this might take '
'several minutes.' % self._work_git)
cmd = ['init', '--bare', self._work_git_name]
p = GitCommand(None,
cmd,
cwd=self._superproject_path,
capture_stdout=True,
capture_stderr=True)
retval = p.Wait()
if retval:
print('repo: error: git init call failed with return code: %r, stderr: %r' %
(retval, p.stderr), file=sys.stderr)
return False
return True
def _Fetch(self, url):
"""Fetches a local copy of a superproject for the manifest based on url.
Args:
url: superproject's url.
Returns:
True if fetch is successful, or False.
"""
if not os.path.exists(self._work_git):
print('git fetch missing directory: %s' % self._work_git,
file=sys.stderr)
return False
cmd = ['fetch', url, '--depth', '1', '--force', '--no-tags', '--filter', 'blob:none']
if self._branch:
cmd += [self._branch + ':' + self._branch]
p = GitCommand(None,
cmd,
cwd=self._work_git,
capture_stdout=True,
capture_stderr=True)
retval = p.Wait()
if retval:
print('repo: error: git fetch call failed with return code: %r, stderr: %r' %
(retval, p.stderr), file=sys.stderr)
return False
return True
def _LsTree(self):
"""Gets the commit ids for all projects.
Works only in git repositories.
Returns:
data: Data returned from 'git ls-tree ...', or None on failure.
"""
if not os.path.exists(self._work_git):
print('git ls-tree missing directory: %s' % self._work_git,
file=sys.stderr)
return None
data = None
branch = 'HEAD' if not self._branch else self._branch
cmd = ['ls-tree', '-z', '-r', branch]
p = GitCommand(None,
cmd,
cwd=self._work_git,
capture_stdout=True,
capture_stderr=True)
retval = p.Wait()
if retval == 0:
data = p.stdout
else:
print('repo: error: git ls-tree call failed with return code: %r, stderr: %r' % (
retval, p.stderr), file=sys.stderr)
return data
def Sync(self):
"""Gets a local copy of a superproject for the manifest.
Returns:
True if sync of superproject is successful, or False.
"""
print('WARNING: --use-superproject is experimental and not '
'for general use', file=sys.stderr)
if not self._manifest.superproject:
print('error: superproject tag is not defined in manifest',
file=sys.stderr)
return False
url = self._manifest.superproject['remote'].url
if not url:
print('error: superproject URL is not defined in manifest',
file=sys.stderr)
return False
if not self._Init():
return False
if not self._Fetch(url):
return False
if not self._quiet:
print('%s: Initial setup for superproject completed.' % self._work_git)
return True
def _GetAllProjectsCommitIds(self):
"""Get commit ids for all projects from superproject and save them in _project_commit_ids.
Returns:
A dictionary with the projects/commit ids on success, otherwise None.
"""
if not self.Sync():
return None
data = self._LsTree()
if not data:
print('error: git ls-tree failed to return data for superproject',
file=sys.stderr)
return None
# Parse lines like the following to select lines starting with '160000' and
# build a dictionary with project path (last element) and its commit id (3rd element).
#
# 160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00
# 120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00
commit_ids = {}
for line in data.split('\x00'):
ls_data = line.split(None, 3)
if not ls_data:
break
if ls_data[0] == '160000':
commit_ids[ls_data[3]] = ls_data[2]
self._project_commit_ids = commit_ids
return commit_ids
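# Editor's worked example of the parsing above, using the sample ls-tree line
# quoted in the comment (the hash and 'art' path are the illustrative values
# from that comment):
#   '160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart'.split(None, 3)
#     -> ['160000', 'commit', '2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea', 'art']
#   => commit_ids['art'] == '2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea'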
def _WriteManfiestFile(self):
"""Writes manifest to a file.
Returns:
manifest_path: Path name of the file into which the manifest is written, or None on failure.
"""
if not os.path.exists(self._superproject_path):
print('error: missing superproject directory %s' %
self._superproject_path,
file=sys.stderr)
return None
manifest_str = self._manifest.ToXml(groups=self._manifest.GetGroupsStr()).toxml()
manifest_path = self._manifest_path
try:
with open(manifest_path, 'w', encoding='utf-8') as fp:
fp.write(manifest_str)
except IOError as e:
print('error: cannot write manifest to %s:\n%s'
% (manifest_path, e),
file=sys.stderr)
return None
return manifest_path
def UpdateProjectsRevisionId(self, projects):
"""Update revisionId of every project in projects with the commit id.
Args:
projects: List of projects whose revisionId needs to be updated.
Returns:
manifest_path: Path name of the overriding manifest file, or None on failure.
"""
commit_ids = self._GetAllProjectsCommitIds()
if not commit_ids:
print('error: Cannot get project commit ids from manifest', file=sys.stderr)
return None
projects_missing_commit_ids = []
for project in projects:
path = project.relpath
if not path:
continue
commit_id = commit_ids.get(path)
if commit_id:
project.SetRevisionId(commit_id)
else:
projects_missing_commit_ids.append(path)
if projects_missing_commit_ids:
print('error: please file a bug using %s to report missing commit_ids for: %s' %
(BUG_REPORT_URL, projects_missing_commit_ids), file=sys.stderr)
return None
manifest_path = self._WriteManfiestFile()
return manifest_path


@ -1,238 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Provide event logging in the git trace2 EVENT format.
The git trace2 EVENT format is defined at:
https://www.kernel.org/pub/software/scm/git/docs/technical/api-trace2.html#_event_format
https://git-scm.com/docs/api-trace2#_the_event_format_target
Usage:
git_trace_log = EventLog()
git_trace_log.StartEvent()
...
git_trace_log.ExitEvent()
git_trace_log.Write()
"""
import datetime
import json
import os
import sys
import tempfile
import threading
from git_command import GitCommand, RepoSourceVersion
class EventLog(object):
"""Event log that records events that occurred during a repo invocation.
Events are written to the log as a consecutive JSON entries, one per line.
Entries follow the git trace2 EVENT format.
Each entry contains the following common keys:
- event: The event name
- sid: session-id - Unique string to allow process instance to be identified.
- thread: The thread name.
- time: is the UTC time of the event.
Valid 'event' names and event specific fields are documented here:
https://git-scm.com/docs/api-trace2#_event_format
"""
def __init__(self, env=None):
"""Initializes the event log."""
self._log = []
# Try to get session-id (sid) from environment (setup in repo launcher).
KEY = 'GIT_TRACE2_PARENT_SID'
if env is None:
env = os.environ
now = datetime.datetime.utcnow()
# Save both our sid component and the complete sid.
# We use our sid component (self._sid) as the unique filename prefix and
# the full sid (self._full_sid) in the log itself.
self._sid = 'repo-%s-P%08x' % (now.strftime('%Y%m%dT%H%M%SZ'), os.getpid())
parent_sid = env.get(KEY)
# Append our sid component to the parent sid (if it exists).
if parent_sid is not None:
self._full_sid = parent_sid + '/' + self._sid
else:
self._full_sid = self._sid
# Set/update the environment variable.
# Environment handling across systems is messy.
try:
env[KEY] = self._full_sid
except UnicodeEncodeError:
env[KEY] = self._full_sid.encode()
# Add a version event to front of the log.
self._AddVersionEvent()
@property
def full_sid(self):
return self._full_sid
def _AddVersionEvent(self):
"""Adds a 'version' event at the beginning of current log."""
version_event = self._CreateEventDict('version')
version_event['evt'] = "2"
version_event['exe'] = RepoSourceVersion()
self._log.insert(0, version_event)
def _CreateEventDict(self, event_name):
"""Returns a dictionary with the common keys/values for git trace2 events.
Args:
event_name: The event name.
Returns:
Dictionary with the common event fields populated.
"""
return {
'event': event_name,
'sid': self._full_sid,
'thread': threading.currentThread().getName(),
'time': datetime.datetime.utcnow().isoformat() + 'Z',
}
def StartEvent(self):
"""Append a 'start' event to the current log."""
start_event = self._CreateEventDict('start')
start_event['argv'] = sys.argv
self._log.append(start_event)
def ExitEvent(self, result):
"""Append an 'exit' event to the current log.
Args:
result: Exit code of the event
"""
exit_event = self._CreateEventDict('exit')
# Consider 'None' success (consistent with event_log result handling).
if result is None:
result = 0
exit_event['code'] = result
self._log.append(exit_event)
def CommandEvent(self, name, subcommands):
"""Append a 'command' event to the current log.
Args:
name: Name of the primary command (ex: repo, git)
subcommands: List of the sub-commands (ex: version, init, sync)
"""
command_event = self._CreateEventDict('command')
command_event['name'] = name
command_event['subcommands'] = subcommands
self._log.append(command_event)
def DefParamRepoEvents(self, config):
"""Append a 'def_param' event for each repo.* config key to the current log.
Args:
config: Repo configuration dictionary
"""
# Only output the repo.* config parameters.
repo_config = {k: v for k, v in config.items() if k.startswith('repo.')}
for param, value in repo_config.items():
def_param_event = self._CreateEventDict('def_param')
def_param_event['param'] = param
def_param_event['value'] = value
self._log.append(def_param_event)
def _GetEventTargetPath(self):
"""Get the 'trace2.eventtarget' path from git configuration.
Returns:
path: git config's 'trace2.eventtarget' path if it exists, or None
"""
path = None
cmd = ['config', '--get', 'trace2.eventtarget']
# TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
# system git config variables.
p = GitCommand(None, cmd, capture_stdout=True, capture_stderr=True,
bare=True)
retval = p.Wait()
if retval == 0:
# Strip trailing carriage-return in path.
path = p.stdout.rstrip('\n')
elif retval != 1:
# `git config --get` is documented to produce an exit status of `1` if
# the requested variable is not present in the configuration. Report any
# other return value as an error.
print("repo: error: 'git config --get' call failed with return code: %r, stderr: %r" % (
retval, p.stderr), file=sys.stderr)
return path
def Write(self, path=None):
"""Writes the log out to a file.
Log is only written if 'path' or 'git config --get trace2.eventtarget'
provide a valid path to write logs to.
Logging filename format follows the git trace2 style of being a unique
(exclusive writable) file.
Args:
path: Path to where logs should be written.
Returns:
log_path: Path to the log file if log is written, otherwise None
"""
log_path = None
# If no logging path is specified, get the path from 'trace2.eventtarget'.
if path is None:
path = self._GetEventTargetPath()
# If no logging path is specified, exit.
if path is None:
return None
if isinstance(path, str):
# Get absolute path.
path = os.path.abspath(os.path.expanduser(path))
else:
raise TypeError('path: str required but got %s.' % type(path))
# Git trace2 requires a directory to write log to.
# TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
if not os.path.isdir(path):
return None
# Use NamedTemporaryFile to generate a unique filename as required by git trace2.
try:
with tempfile.NamedTemporaryFile(mode='x', prefix=self._sid, dir=path,
delete=False) as f:
# TODO(https://crbug.com/gerrit/13706): Support writing events as they
# occur.
for e in self._log:
# Dump in compact encoding mode.
# See 'Compact encoding' in Python docs:
# https://docs.python.org/3/library/json.html#module-json
json.dump(e, f, indent=None, separators=(',', ':'))
f.write('\n')
log_path = f.name
except FileExistsError as err:
print('repo: warning: git trace2 logging failed: %r' % err,
file=sys.stderr)
return None
return log_path


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,8 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import multiprocessing
import platform
import re
import sys
@ -27,24 +29,12 @@ from error import ManifestParseError
NUM_BATCH_RETRIEVE_REVISIONID = 32
def get_gitc_manifest_dir():
return wrapper.Wrapper().get_gitc_manifest_dir()
def parse_clientdir(gitc_fs_path):
return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path)
def _get_project_revision(args):
"""Worker for _set_project_revisions to lookup one project remote."""
(i, url, expr) = args
gitcmd = git_command.GitCommand(
None, ['ls-remote', url, expr], capture_stdout=True, cwd='/tmp')
rc = gitcmd.Wait()
return (i, rc, gitcmd.stdout.split('\t', 1)[0])
def _set_project_revisions(projects):
"""Sets the revisionExpr for a list of projects.
@ -52,38 +42,47 @@ def _set_project_revisions(projects):
should not be overly large. Recommend calling this function multiple times
with each call not exceeding NUM_BATCH_RETRIEVE_REVISIONID projects.
Args:
projects: List of project objects to set the revisionExpr for.
@param projects: List of project objects to set the revisionExpr for.
"""
# Retrieve the commit id for each project based off of it's current
# revisionExpr and it is not already a commit id.
with multiprocessing.Pool(NUM_BATCH_RETRIEVE_REVISIONID) as pool:
results_iter = pool.imap_unordered(
_get_project_revision,
((i, project.remote.url, project.revisionExpr)
for i, project in enumerate(projects)
if not git_config.IsId(project.revisionExpr)),
chunksize=8)
for (i, rc, revisionExpr) in results_iter:
project = projects[i]
if rc:
print('FATAL: Failed to retrieve revisionExpr for %s' % project.name)
pool.terminate()
sys.exit(1)
if not revisionExpr:
pool.terminate()
raise ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
(project.remote.url, project.revisionExpr))
project.revisionExpr = revisionExpr
project_gitcmds = [(
project, git_command.GitCommand(None,
['ls-remote',
project.remote.url,
project.revisionExpr],
capture_stdout=True, cwd='/tmp'))
for project in projects if not git_config.IsId(project.revisionExpr)]
for proj, gitcmd in project_gitcmds:
if gitcmd.Wait():
print('FATAL: Failed to retrieve revisionExpr for %s' % proj)
sys.exit(1)
revisionExpr = gitcmd.stdout.split('\t')[0]
if not revisionExpr:
raise ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
(proj.remote.url, proj.revisionExpr))
proj.revisionExpr = revisionExpr
def _manifest_groups(manifest):
"""Returns the manifest group string that should be synced
This is the same logic used by Command.GetProjects(), which is used during
repo sync
@param manifest: The XmlManifest object
"""
mp = manifest.manifestProject
groups = mp.config.GetString('manifest.groups')
if not groups:
groups = 'default,platform-' + platform.system().lower()
return groups
def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
"""Generate a manifest for shafsd to use for this GITC client.
Args:
gitc_manifest: Current gitc manifest, or None if there isn't one yet.
manifest: A GitcManifest object loaded with the current repo manifest.
paths: List of project paths we want to update.
@param gitc_manifest: Current gitc manifest, or None if there isn't one yet.
@param manifest: A GitcManifest object loaded with the current repo manifest.
@param paths: List of project paths we want to update.
"""
print('Generating GITC Manifest by fetching revision SHAs for each '
@ -91,7 +90,7 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
if paths is None:
paths = list(manifest.paths.keys())
groups = [x for x in re.split(r'[,\s]+', manifest.GetGroupsStr()) if x]
groups = [x for x in re.split(r'[,\s]+', _manifest_groups(manifest)) if x]
# Convert the paths to projects, and filter them to the matched groups.
projects = [manifest.paths[p] for p in paths]
@ -105,11 +104,11 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
if not proj.upstream and not git_config.IsId(proj.revisionExpr):
proj.upstream = proj.revisionExpr
if path not in gitc_manifest.paths:
if not path in gitc_manifest.paths:
# Any new projects need their first revision, even if we weren't asked
# for them.
projects.append(proj)
elif path not in paths:
elif not path in paths:
# And copy revisions from the previous manifest if we're not updating
# them now.
gitc_proj = gitc_manifest.paths[path]
@ -119,7 +118,11 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
else:
proj.revisionExpr = gitc_proj.revisionExpr
_set_project_revisions(projects)
index = 0
while index < len(projects):
_set_project_revisions(
projects[index:(index+NUM_BATCH_RETRIEVE_REVISIONID)])
index += NUM_BATCH_RETRIEVE_REVISIONID
if gitc_manifest is not None:
for path, proj in gitc_manifest.paths.items():
@ -137,20 +140,16 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
# Save the manifest.
save_manifest(manifest)
def save_manifest(manifest, client_dir=None):
"""Save the manifest file in the client_dir.
Args:
manifest: Manifest object to save.
client_dir: Client directory to save the manifest in.
@param client_dir: Client directory to save the manifest in.
@param manifest: Manifest object to save.
"""
if not client_dir:
manifest_file = manifest.manifestFile
else:
manifest_file = os.path.join(client_dir, '.manifest')
with open(manifest_file, 'w') as f:
manifest.Save(f, groups=manifest.GetGroupsStr())
client_dir = manifest.gitc_client_dir
with open(os.path.join(client_dir, '.manifest'), 'w') as f:
manifest.Save(f, groups=_manifest_groups(manifest))
# TODO(sbasi/jorg): Come up with a solution to remove the sleep below.
# Give the GITC filesystem time to register the manifest changes.
time.sleep(3)

hooks.py

@ -1,509 +0,0 @@
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import errno
import json
import os
import re
import subprocess
import sys
import traceback
import urllib.parse
from error import HookError
from git_refs import HEAD
class RepoHook(object):
"""A RepoHook contains information about a script to run as a hook.
Hooks are used to run a python script before running an upload (for instance,
to run presubmit checks). Eventually, we may have hooks for other actions.
This shouldn't be confused with files in the 'repo/hooks' directory. Those
files are copied into each '.git/hooks' folder for each project. Repo-level
hooks are associated instead with repo actions.
Hooks are always python. When a hook is run, we will load the hook into the
interpreter and execute its main() function.
Combinations of hook option flags:
- no-verify=False, verify=False (DEFAULT):
If stdout is a tty, can prompt about running hooks if needed.
If user denies running hooks, the action is cancelled. If stdout is
not a tty and we would need to prompt about hooks, action is
cancelled.
- no-verify=False, verify=True:
Always run hooks with no prompt.
- no-verify=True, verify=False:
Never run hooks, but run action anyway (AKA bypass hooks).
- no-verify=True, verify=True:
Invalid
"""
def __init__(self,
hook_type,
hooks_project,
repo_topdir,
manifest_url,
bypass_hooks=False,
allow_all_hooks=False,
ignore_hooks=False,
abort_if_user_denies=False):
"""RepoHook constructor.
Params:
hook_type: A string representing the type of hook. This is also used
to figure out the name of the file containing the hook. For
example: 'pre-upload'.
hooks_project: The project containing the repo hooks.
If you have a manifest, this is manifest.repo_hooks_project.
OK if this is None, which will make the hook a no-op.
repo_topdir: The top directory of the repo client checkout.
This is the one containing the .repo directory. Scripts will
run with CWD as this directory.
If you have a manifest, this is manifest.topdir.
manifest_url: The URL to the manifest git repo.
bypass_hooks: If True, then 'Do not run the hook'.
allow_all_hooks: If True, then 'Run the hook without prompting'.
ignore_hooks: If True, then 'Do not abort action if hooks fail'.
abort_if_user_denies: If True, we'll abort running the hook if the user
doesn't allow us to run the hook.
"""
self._hook_type = hook_type
self._hooks_project = hooks_project
self._repo_topdir = repo_topdir
self._manifest_url = manifest_url
self._bypass_hooks = bypass_hooks
self._allow_all_hooks = allow_all_hooks
self._ignore_hooks = ignore_hooks
self._abort_if_user_denies = abort_if_user_denies
# Store the full path to the script for convenience.
if self._hooks_project:
self._script_fullpath = os.path.join(self._hooks_project.worktree,
self._hook_type + '.py')
else:
self._script_fullpath = None
def _GetHash(self):
"""Return a hash of the contents of the hooks directory.
We'll just use git to do this. This hash has the property that if anything
changes in the directory we will return a different hash.
SECURITY CONSIDERATION:
This hash only represents the contents of files in the hook directory, not
any other files imported or called by hooks. Changes to imported files
can change the script behavior without affecting the hash.
Returns:
A string representing the hash. This will always be ASCII so that it can
be printed to the user easily.
"""
assert self._hooks_project, "Must have hooks to calculate their hash."
# We will use the work_git object rather than just calling GetRevisionId().
# That gives us a hash of the latest checked in version of the files that
# the user will actually be executing. Specifically, GetRevisionId()
# doesn't appear to change even if a user checks out a different version
# of the hooks repo (via git checkout) nor if a user commits their own revs.
#
# NOTE: Local (non-committed) changes will not be factored into this hash.
# I think this is OK, since we're really only worried about warning the user
# about upstream changes.
return self._hooks_project.work_git.rev_parse(HEAD)
def _GetMustVerb(self):
"""Return 'must' if the hook is required; 'should' if not."""
if self._abort_if_user_denies:
return 'must'
else:
return 'should'
def _CheckForHookApproval(self):
"""Check to see whether this hook has been approved.
We'll accept approval of manifest URLs if they're using secure transports.
This way the user can say they trust the manifest hoster. For insecure
hosts, we fall back to checking the hash of the hooks repo.
Note that we ask permission for each individual hook even though we use
the hash of all hooks when detecting changes. We'd like the user to be
able to approve / deny each hook individually. We only use the hash of all
hooks because there is no other easy way to detect changes to local imports.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
"""
if self._ManifestUrlHasSecureScheme():
return self._CheckForHookApprovalManifest()
else:
return self._CheckForHookApprovalHash()
def _CheckForHookApprovalHelper(self, subkey, new_val, main_prompt,
changed_prompt):
"""Check for approval for a particular attribute and hook.
Args:
subkey: The git config key under [repo.hooks.<hook_type>] to store the
last approved string.
new_val: The new value to compare against the last approved one.
main_prompt: Message to display to the user to ask for approval.
changed_prompt: Message explaining why we're re-asking for approval.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
"""
hooks_config = self._hooks_project.config
git_approval_key = 'repo.hooks.%s.%s' % (self._hook_type, subkey)
# Get the last value that the user approved for this hook; may be None.
old_val = hooks_config.GetString(git_approval_key)
if old_val is not None:
# User previously approved hook and asked not to be prompted again.
if new_val == old_val:
# Approval matched. We're done.
return True
else:
# Give the user a reason why we're prompting, since they last told
# us to "never ask again".
prompt = 'WARNING: %s\n\n' % (changed_prompt,)
else:
prompt = ''
# Prompt the user if we're not on a tty; on a tty we'll assume "no".
if sys.stdout.isatty():
prompt += main_prompt + ' (yes/always/NO)? '
response = input(prompt).lower()
print()
# User is doing a one-time approval.
if response in ('y', 'yes'):
return True
elif response == 'always':
hooks_config.SetString(git_approval_key, new_val)
return True
# For anything else, we'll assume no approval.
if self._abort_if_user_denies:
raise HookError('You must allow the %s hook or use --no-verify.' %
self._hook_type)
return False
def _ManifestUrlHasSecureScheme(self):
"""Check if the URI for the manifest is a secure transport."""
secure_schemes = ('file', 'https', 'ssh', 'persistent-https', 'sso', 'rpc')
parse_results = urllib.parse.urlparse(self._manifest_url)
return parse_results.scheme in secure_schemes
def _CheckForHookApprovalManifest(self):
"""Check whether the user has approved this manifest host.
Returns:
True if this hook is approved to run; False otherwise.
"""
return self._CheckForHookApprovalHelper(
'approvedmanifest',
self._manifest_url,
'Run hook scripts from %s' % (self._manifest_url,),
'Manifest URL has changed since %s was allowed.' % (self._hook_type,))
def _CheckForHookApprovalHash(self):
"""Check whether the user has approved the hooks repo.
Returns:
True if this hook is approved to run; False otherwise.
"""
prompt = ('Repo %s run the script:\n'
' %s\n'
'\n'
'Do you want to allow this script to run')
return self._CheckForHookApprovalHelper(
'approvedhash',
self._GetHash(),
prompt % (self._GetMustVerb(), self._script_fullpath),
'Scripts have changed since %s was allowed.' % (self._hook_type,))
@staticmethod
def _ExtractInterpFromShebang(data):
"""Extract the interpreter used in the shebang.
Try to locate the interpreter the script is using (ignoring `env`).
Args:
data: The file content of the script.
Returns:
The basename of the main script interpreter, or None if a shebang is not
used or could not be parsed out.
"""
firstline = data.splitlines()[:1]
if not firstline:
return None
# The format here can be tricky.
shebang = firstline[0].strip()
m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', shebang)
if not m:
return None
# If using `env`, find the target program.
interp = m.group(1)
if os.path.basename(interp) == 'env':
interp = m.group(2)
return interp
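# Editor's examples of the extraction above (script contents are illustrative):
#   '#!/usr/bin/env python3\n...'  -> 'python3'
#   '#!/usr/bin/python2.7\n...'    -> '/usr/bin/python2.7'
#   'print("no shebang")\n'        -> None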
def _ExecuteHookViaReexec(self, interp, context, **kwargs):
"""Execute the hook script through |interp|.
Note: Support for this feature should be dropped ~Jun 2021.
Args:
interp: The Python program to run.
context: Basic Python context to execute the hook inside.
kwargs: Arbitrary arguments to pass to the hook script.
Raises:
HookError: When the hooks failed for any reason.
"""
# This logic needs to be kept in sync with _ExecuteHookViaImport below.
script = """
import json, os, sys
path = '''%(path)s'''
kwargs = json.loads('''%(kwargs)s''')
context = json.loads('''%(context)s''')
sys.path.insert(0, os.path.dirname(path))
data = open(path).read()
exec(compile(data, path, 'exec'), context)
context['main'](**kwargs)
""" % {
'path': self._script_fullpath,
'kwargs': json.dumps(kwargs),
'context': json.dumps(context),
}
# We pass the script via stdin to avoid OS argv limits. It also makes
# unhandled exception tracebacks less verbose/confusing for users.
cmd = [interp, '-c', 'import sys; exec(sys.stdin.read())']
proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
proc.communicate(input=script.encode('utf-8'))
if proc.returncode:
raise HookError('Failed to run %s hook.' % (self._hook_type,))
def _ExecuteHookViaImport(self, data, context, **kwargs):
"""Execute the hook code in |data| directly.
Args:
data: The code of the hook to execute.
context: Basic Python context to execute the hook inside.
kwargs: Arbitrary arguments to pass to the hook script.
Raises:
HookError: When the hooks failed for any reason.
"""
# Exec, storing global context in the context dict. We catch exceptions
# and convert to a HookError w/ just the failing traceback.
try:
exec(compile(data, self._script_fullpath, 'exec'), context)
except Exception:
raise HookError('%s\nFailed to import %s hook; see traceback above.' %
(traceback.format_exc(), self._hook_type))
# Running the script should have defined a main() function.
if 'main' not in context:
raise HookError('Missing main() in: "%s"' % self._script_fullpath)
# Call the main function in the hook. If the hook should cause the
# build to fail, it will raise an Exception. We'll catch that convert
# to a HookError w/ just the failing traceback.
try:
context['main'](**kwargs)
except Exception:
raise HookError('%s\nFailed to run main() for %s hook; see traceback '
'above.' % (traceback.format_exc(), self._hook_type))
def _ExecuteHook(self, **kwargs):
"""Actually execute the given hook.
This will run the hook's 'main' function in our python interpreter.
Args:
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
"""
# Keep sys.path and CWD stashed away so that we can always restore them
# upon function exit.
orig_path = os.getcwd()
orig_syspath = sys.path
try:
# Always run hooks with CWD as topdir.
os.chdir(self._repo_topdir)
# Put the hook dir as the first item of sys.path so hooks can do
# relative imports. We want to replace the repo dir as [0] so
# hooks can't import repo files.
sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]
# Initial global context for the hook to run within.
context = {'__file__': self._script_fullpath}
# Add 'hook_should_take_kwargs' to the arguments to be passed to main.
# We don't actually want hooks to define their main with this argument--
# it's there to remind them that their hook should always take **kwargs.
# For instance, a pre-upload hook should be defined like:
# def main(project_list, **kwargs):
#
# This allows us to later expand the API without breaking old hooks.
kwargs = kwargs.copy()
kwargs['hook_should_take_kwargs'] = True
# See what version of python the hook has been written against.
data = open(self._script_fullpath).read()
interp = self._ExtractInterpFromShebang(data)
reexec = False
if interp:
prog = os.path.basename(interp)
if prog.startswith('python2') and sys.version_info.major != 2:
reexec = True
elif prog.startswith('python3') and sys.version_info.major == 2:
reexec = True
# Attempt to execute the hooks through the requested version of Python.
if reexec:
try:
self._ExecuteHookViaReexec(interp, context, **kwargs)
except OSError as e:
if e.errno == errno.ENOENT:
# We couldn't find the interpreter, so fallback to importing.
reexec = False
else:
raise
# Run the hook by importing directly.
if not reexec:
self._ExecuteHookViaImport(data, context, **kwargs)
finally:
# Restore sys.path and CWD.
sys.path = orig_syspath
os.chdir(orig_path)
def _CheckHook(self):
# Bail with a nice error if we can't find the hook.
if not os.path.isfile(self._script_fullpath):
raise HookError('Couldn\'t find repo hook: %s' % self._script_fullpath)
def Run(self, **kwargs):
"""Run the hook.
If the hook doesn't exist (because there is no hooks project or because
this particular hook is not enabled), this is a no-op.
Args:
user_allows_all_hooks: If True, we will never prompt about running the
hook--we'll just assume it's OK to run it.
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
Returns:
True: On success, or when the user asked for hooks to be ignored.
False: The hook failed; the caller should abort the action.
Some examples in which False is returned:
* The hook was enabled but could not be found, or
* the user declined to run a required hook (from _CheckForHookApproval).
In all these cases the user did not pass the option combinations
(as listed in AddHookOptionGroup()) needed to ignore the result.
"""
# No-op if hooks are bypassed, if there is no hooks project, or if this
# hook type is disabled.
if (self._bypass_hooks or
not self._hooks_project or
self._hook_type not in self._hooks_project.enabled_repo_hooks):
return True
passed = True
try:
self._CheckHook()
# Make sure the user is OK with running the hook.
if self._allow_all_hooks or self._CheckForHookApproval():
# Run the hook with the same version of python we're using.
self._ExecuteHook(**kwargs)
except SystemExit as e:
passed = False
print('ERROR: %s hooks exited with exit code: %s' % (self._hook_type, str(e)),
file=sys.stderr)
except HookError as e:
passed = False
print('ERROR: %s' % str(e), file=sys.stderr)
if not passed and self._ignore_hooks:
print('\nWARNING: %s hooks failed, but continuing anyways.' % self._hook_type,
file=sys.stderr)
passed = True
return passed
@classmethod
def FromSubcmd(cls, manifest, opt, *args, **kwargs):
"""Method to construct the repo hook class
Args:
manifest: The current active manifest for this command from which we
extract a couple of fields.
opt: Contains the commandline options for the action of this hook.
It should contain the options added by AddHookOptionGroup(), which
RepoHook consults when executing.
"""
for key in ('bypass_hooks', 'allow_all_hooks', 'ignore_hooks'):
kwargs.setdefault(key, getattr(opt, key))
kwargs.update({
'hooks_project': manifest.repo_hooks_project,
'repo_topdir': manifest.topdir,
'manifest_url': manifest.manifestProject.GetRemote('origin').url,
})
return cls(*args, **kwargs)
@staticmethod
def AddOptionGroup(parser, name):
"""Help options relating to the various hooks."""
# Note that verify and no-verify are NOT opposites of each other, which
# is why they store to different locations. We are using them to match
# 'git commit' syntax.
group = parser.add_option_group(name + ' hooks')
group.add_option('--no-verify',
dest='bypass_hooks', action='store_true',
help='Do not run the %s hook.' % name)
group.add_option('--verify',
dest='allow_all_hooks', action='store_true',
help='Run the %s hook without prompting.' % name)
group.add_option('--ignore-hooks',
action='store_true',
help='Do not abort if %s hooks fail.' % name)
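For illustration, a minimal pre-upload hook written against the contract described above (a main() that accepts **kwargs so the API can grow without breaking old hooks) might look like the following sketch; the check inside main() is hypothetical and not part of repo itself:

# pre-upload.py (hypothetical example hook script)
def main(project_list, **kwargs):
  # Returning normally lets the upload proceed; raising any exception fails
  # the hook, and _ExecuteHookViaImport() wraps the traceback in a HookError.
  for name in project_list:
    if not name:
      raise Exception('pre-upload hook received an empty project name')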


@ -1,5 +1,5 @@
#!/bin/sh
# From Gerrit Code Review 3.1.3
# From Gerrit Code Review 2.14.6
#
# Part of Gerrit Code Review (https://www.gerritcodereview.com/)
#
@ -16,48 +16,176 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# avoid [[ which is not POSIX sh.
if test "$#" != 1 ; then
echo "$0 requires an argument."
exit 1
fi
unset GREP_OPTIONS
if test ! -f "$1" ; then
echo "file does not exist: $1"
exit 1
fi
CHANGE_ID_AFTER="Bug|Depends-On|Issue|Test|Feature|Fixes|Fixed"
MSG="$1"
# Do not create a change id if requested
if test "false" = "`git config --bool --get gerrit.createChangeId`" ; then
exit 0
fi
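# For example (not part of the original hook), a user can opt out of Change-Id
# generation for a clone with:
#   git config --bool gerrit.createChangeId false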
# Check for, and add if missing, a unique Change-Id
#
add_ChangeId() {
clean_message=`sed -e '
/^diff --git .*/{
s///
q
}
/^Signed-off-by:/d
/^#/d
' "$MSG" | git stripspace`
if test -z "$clean_message"
then
return
fi
# $RANDOM will be undefined if not using bash, so don't use set -u
random=$( (whoami ; hostname ; date; cat $1 ; echo $RANDOM) | git hash-object --stdin)
dest="$1.tmp.${random}"
# Do not add Change-Id to temp commits
if echo "$clean_message" | head -1 | grep -q '^\(fixup\|squash\)!'
then
return
fi
trap 'rm -f "${dest}"' EXIT
if test "false" = "`git config --bool --get gerrit.createChangeId`"
then
return
fi
if ! git stripspace --strip-comments < "$1" > "${dest}" ; then
echo "cannot strip comments from $1"
exit 1
fi
# Does Change-Id: already exist? if so, exit (no change).
if grep -i '^Change-Id:' "$MSG" >/dev/null
then
return
fi
if test ! -s "${dest}" ; then
echo "file is empty: $1"
exit 1
fi
id=`_gen_ChangeId`
T="$MSG.tmp.$$"
AWK=awk
if [ -x /usr/xpg4/bin/awk ]; then
# Solaris AWK is just too broken
AWK=/usr/xpg4/bin/awk
fi
# Avoid the --in-place option which only appeared in Git 2.8
# Avoid the --if-exists option which only appeared in Git 2.15
if ! git -c trailer.ifexists=doNothing interpret-trailers \
--trailer "Change-Id: I${random}" < "$1" > "${dest}" ; then
echo "cannot insert change-id line in $1"
exit 1
fi
# Get core.commentChar from git config or use default symbol
commentChar=`git config --get core.commentChar`
commentChar=${commentChar:-#}
if ! mv "${dest}" "$1" ; then
echo "cannot mv ${dest} to $1"
exit 1
fi
# How this works:
# - parse the commit message as (textLine+ blankLine*)*
# - assume textLine+ to be a footer until proven otherwise
# - exception: the first block is not footer (as it is the title)
# - read textLine+ into a variable
# - then count blankLines
# - once the next textLine appears, print textLine+ blankLine* as these
# aren't footer
# - in END, the last textLine+ block is available for footer parsing
$AWK '
BEGIN {
# while we start with the assumption that textLine+
# is a footer, the first block is not.
isFooter = 0
footerComment = 0
blankLines = 0
}
# Skip lines starting with commentChar without any spaces before it.
/^'"$commentChar"'/ { next }
# Skip the line starting with the diff command and everything after it,
# up to the end of the file, assuming it is only patch data.
# If more than one line before the diff was empty, strip all but one.
/^diff --git / {
blankLines = 0
while (getline) { }
next
}
# Count blank lines outside footer comments
/^$/ && (footerComment == 0) {
blankLines++
next
}
# Catch footer comment
/^\[[a-zA-Z0-9-]+:/ && (isFooter == 1) {
footerComment = 1
}
/]$/ && (footerComment == 1) {
footerComment = 2
}
# We have a non-blank line after blank lines. Handle this.
(blankLines > 0) {
print lines
for (i = 0; i < blankLines; i++) {
print ""
}
lines = ""
blankLines = 0
isFooter = 1
footerComment = 0
}
# Detect that the current block is not the footer
(footerComment == 0) && (!/^\[?[a-zA-Z0-9-]+:/ || /^[a-zA-Z0-9-]+:\/\//) {
isFooter = 0
}
{
# We need this information about the current last comment line
if (footerComment == 2) {
footerComment = 0
}
if (lines != "") {
lines = lines "\n";
}
lines = lines $0
}
# Footer handling:
# If the last block is considered a footer, splice in the Change-Id at the
# right place.
# Look for the right place to inject Change-Id by considering
# CHANGE_ID_AFTER. Keys listed in it (case insensitive) come first,
# then Change-Id, then everything else (eg. Signed-off-by:).
#
# Otherwise just print the last block, a new line and the Change-Id as a
# block of its own.
END {
unprinted = 1
if (isFooter == 0) {
print lines "\n"
lines = ""
}
changeIdAfter = "^(" tolower("'"$CHANGE_ID_AFTER"'") "):"
numlines = split(lines, footer, "\n")
for (line = 1; line <= numlines; line++) {
if (unprinted && match(tolower(footer[line]), changeIdAfter) != 1) {
unprinted = 0
print "Change-Id: I'"$id"'"
}
print footer[line]
}
if (unprinted) {
print "Change-Id: I'"$id"'"
}
}' "$MSG" > "$T" && mv "$T" "$MSG" || rm -f "$T"
}
_gen_ChangeIdInput() {
echo "tree `git write-tree`"
if parent=`git rev-parse "HEAD^0" 2>/dev/null`
then
echo "parent $parent"
fi
echo "author `git var GIT_AUTHOR_IDENT`"
echo "committer `git var GIT_COMMITTER_IDENT`"
echo
printf '%s' "$clean_message"
}
_gen_ChangeId() {
_gen_ChangeIdInput |
git hash-object -t commit --stdin
}
add_ChangeId
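To illustrate the footer handling above with a hypothetical commit message: trailers whose keys appear in CHANGE_ID_AFTER stay ahead of the generated trailer, while everything else (e.g. Signed-off-by:) follows it, so the resulting footer is shaped like:

Bug: 12345
Change-Id: I0123456789abcdef0123456789abcdef01234567
Signed-off-by: A Developer <a@example.com>

(The Change-Id value here is a placeholder; the real one is the hash produced by git hash-object in _gen_ChangeId or the precomputed $random.)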

main.py

@ -1,4 +1,5 @@
#!/usr/bin/env python3
#!/usr/bin/env python
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
@ -20,15 +21,23 @@ People shouldn't run this directly; instead, they should use the `repo` wrapper
which takes care of execing this entry point.
"""
from __future__ import print_function
import getpass
import netrc
import optparse
import os
import shlex
import sys
import textwrap
import time
import urllib.request
from pyversion import is_python3
if is_python3():
import urllib.request
else:
import imp
import urllib2
urllib = imp.new_module('urllib')
urllib.request = urllib2
try:
import kerberos
@ -38,9 +47,8 @@ except ImportError:
from color import SetDefaultColoring
import event_log
from repo_trace import SetTrace
from git_command import user_agent
from git_config import init_ssh, close_ssh, RepoConfig
from git_trace2_event_log import EventLog
from git_command import git, GitCommand, user_agent
from git_config import init_ssh, close_ssh
from command import InteractiveCommand
from command import MirrorSafeCommand
from command import GitcAvailableCommand, GitcClientCommand
@ -54,41 +62,14 @@ from error import NoManifestException
from error import NoSuchProjectError
from error import RepoChangedException
import gitc_utils
from manifest_xml import GitcClient, RepoClient
from manifest_xml import GitcManifest, XmlManifest
from pager import RunPager, TerminatePager
from wrapper import WrapperPath, Wrapper
from subcmds import all_commands
# NB: These do not need to be kept in sync with the repo launcher script.
# These may be much newer as it allows the repo launcher to roll between
# different repo releases while source versions might require a newer python.
#
# The soft version is when we start warning users that the version is old and
# we'll be dropping support for it. We'll refuse to work with versions older
# than the hard version.
#
# python-3.6 is in Ubuntu Bionic.
MIN_PYTHON_VERSION_SOFT = (3, 6)
MIN_PYTHON_VERSION_HARD = (3, 5)
if sys.version_info.major < 3:
print('repo: error: Python 2 is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
sys.exit(1)
else:
if sys.version_info < MIN_PYTHON_VERSION_HARD:
print('repo: error: Python 3 version is too old; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
sys.exit(1)
elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
print('repo: warning: your Python 3 version is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
if not is_python3():
input = raw_input
global_options = optparse.OptionParser(
usage='repo [-p|--paginate|--no-pager] COMMAND [ARGS]',
@ -99,7 +80,7 @@ global_options.add_option('-p', '--paginate',
dest='pager', action='store_true',
help='display command output in the pager')
global_options.add_option('--no-pager',
dest='pager', action='store_false',
dest='no_pager', action='store_true',
help='disable the pager')
global_options.add_option('--color',
choices=('auto', 'always', 'never'), default=None,
@ -119,14 +100,13 @@ global_options.add_option('--version',
global_options.add_option('--event-log',
dest='event_log', action='store',
help='filename of event log to append timeline to')
global_options.add_option('--git-trace2-event-log', action='store',
help='directory to write git trace2 event log to')
class _Repo(object):
def __init__(self, repodir):
self.repodir = repodir
self.commands = all_commands
# add 'branch' as an alias for 'branches'
all_commands['branch'] = all_commands['branches']
def _ParseArgs(self, argv):
"""Parse the main `repo` command line options."""
@ -146,9 +126,6 @@ class _Repo(object):
argv = []
gopts, _gargs = global_options.parse_args(glob)
name, alias_args = self._ExpandAlias(name)
argv = alias_args + argv
if gopts.help:
global_options.print_help()
commands = ' '.join(sorted(self.commands))
@ -159,27 +136,6 @@ class _Repo(object):
return (name, gopts, argv)
def _ExpandAlias(self, name):
"""Look up user registered aliases."""
# We don't resolve aliases for existing subcommands. This matches git.
if name in self.commands:
return name, []
key = 'alias.%s' % (name,)
alias = RepoConfig.ForRepository(self.repodir).GetString(key)
if alias is None:
alias = RepoConfig.ForUser().GetString(key)
if alias is None:
return name, []
args = alias.strip().split(' ', 1)
name = args[0]
if len(args) == 2:
args = shlex.split(args[1])
else:
args = []
return name, args
def _Run(self, name, gopts, argv):
"""Execute the requested subcommand."""
result = 0
@ -196,23 +152,21 @@ class _Repo(object):
SetDefaultColoring(gopts.color)
try:
cmd = self.commands[name]()
cmd = self.commands[name]
except KeyError:
print("repo: '%s' is not a repo command. See 'repo help'." % name,
file=sys.stderr)
return 1
git_trace2_event_log = EventLog()
cmd.repodir = self.repodir
cmd.client = RepoClient(cmd.repodir)
cmd.manifest = cmd.client.manifest
cmd.manifest = XmlManifest(cmd.repodir)
cmd.gitc_manifest = None
gitc_client_name = gitc_utils.parse_clientdir(os.getcwd())
if gitc_client_name:
cmd.gitc_manifest = GitcClient(cmd.repodir, gitc_client_name)
cmd.client.isGitcClient = True
cmd.gitc_manifest = GitcManifest(cmd.repodir, gitc_client_name)
cmd.manifest.isGitcClient = True
Editor.globalConfig = cmd.client.globalConfig
Editor.globalConfig = cmd.manifest.globalConfig
if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror:
print("fatal: '%s' requires a working directory" % name,
@ -234,13 +188,13 @@ class _Repo(object):
copts = cmd.ReadEnvironmentOptions(copts)
except NoManifestException as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
file=sys.stderr)
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
return 1
if gopts.pager is not False and not isinstance(cmd, InteractiveCommand):
config = cmd.client.globalConfig
if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig
if gopts.pager:
use_pager = True
else:
@ -253,17 +207,13 @@ class _Repo(object):
start = time.time()
cmd_event = cmd.event_log.Add(name, event_log.TASK_COMMAND, start)
cmd.event_log.SetParent(cmd_event)
git_trace2_event_log.StartEvent()
git_trace2_event_log.CommandEvent(name='repo', subcommands=[name])
try:
cmd.CommonValidateOptions(copts, cargs)
cmd.ValidateOptions(copts, cargs)
result = cmd.Execute(copts, cargs)
except (DownloadError, ManifestInvalidRevisionError,
NoManifestException) as e:
NoManifestException) as e:
print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
file=sys.stderr)
file=sys.stderr)
if isinstance(e, NoManifestException):
print('error: manifest missing or unreadable -- please run init',
file=sys.stderr)
@ -278,8 +228,7 @@ class _Repo(object):
if e.name:
print('error: project group must be enabled for project %s' % e.name, file=sys.stderr)
else:
print('error: project group must be enabled for the project in the current directory',
file=sys.stderr)
print('error: project group must be enabled for the project in the current directory', file=sys.stderr)
result = 1
except SystemExit as e:
if e.code:
@ -299,78 +248,49 @@ class _Repo(object):
cmd.event_log.FinishEvent(cmd_event, finish,
result is None or result == 0)
git_trace2_event_log.DefParamRepoEvents(
cmd.manifest.manifestProject.config.DumpConfigDict())
git_trace2_event_log.ExitEvent(result)
if gopts.event_log:
cmd.event_log.Write(os.path.abspath(
os.path.expanduser(gopts.event_log)))
git_trace2_event_log.Write(gopts.git_trace2_event_log)
return result
def _CheckWrapperVersion(ver_str, repo_path):
"""Verify the repo launcher is new enough for this checkout.
Args:
ver_str: The version string passed from the repo launcher when it ran us.
repo_path: The path to the repo launcher that loaded us.
"""
# Refuse to work with really old wrapper versions. We don't test these,
# so might as well require a somewhat recent sane version.
# v1.15 of the repo launcher was released in ~Mar 2012.
MIN_REPO_VERSION = (1, 15)
min_str = '.'.join(str(x) for x in MIN_REPO_VERSION)
def _CheckWrapperVersion(ver, repo_path):
if not repo_path:
repo_path = '~/bin/repo'
if not ver_str:
if not ver:
print('no --wrapper-version argument', file=sys.stderr)
sys.exit(1)
# Pull out the version of the repo launcher we know about to compare.
exp = Wrapper().VERSION
ver = tuple(map(int, ver_str.split('.')))
ver = tuple(map(int, ver.split('.')))
if len(ver) == 1:
ver = (0, ver[0])
exp_str = '.'.join(map(str, exp))
if ver < MIN_REPO_VERSION:
if exp[0] > ver[0] or ver < (0, 4):
print("""
repo: error:
!!! Your version of repo %s is too old.
!!! We need at least version %s.
!!! A new version of repo (%s) is available.
!!! You must upgrade before you can continue:
!!! A new repo command (%5s) is available. !!!
!!! You must upgrade before you can continue: !!!
cp %s %s
""" % (ver_str, min_str, exp_str, WrapperPath(), repo_path), file=sys.stderr)
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
sys.exit(1)
if exp > ver:
print('\n... A new version of repo (%s) is available.' % (exp_str,),
file=sys.stderr)
if os.access(repo_path, os.W_OK):
print("""\
print("""
... A new repo command (%5s) is available.
... You should upgrade soon:
cp %s %s
""" % (WrapperPath(), repo_path), file=sys.stderr)
else:
print("""\
... New version is available at: %s
... The launcher is run from: %s
!!! The launcher is not writable. Please talk to your sysadmin or distro
!!! to get an update installed.
""" % (WrapperPath(), repo_path), file=sys.stderr)
cp %s %s
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
def _CheckRepoDir(repo_dir):
if not repo_dir:
print('no --repo-dir argument', file=sys.stderr)
sys.exit(1)
def _PruneOptions(argv, opt):
i = 0
while i < len(argv):
@ -386,7 +306,6 @@ def _PruneOptions(argv, opt):
continue
i += 1
class _UserAgentHandler(urllib.request.BaseHandler):
def http_request(self, req):
req.add_header('User-Agent', user_agent.repo)
@ -396,7 +315,6 @@ class _UserAgentHandler(urllib.request.BaseHandler):
req.add_header('User-Agent', user_agent.repo)
return req
def _AddPasswordFromUserInput(handler, msg, req):
# If repo could not find auth info from netrc, try to get it from user input
url = req.get_full_url()
@ -410,24 +328,22 @@ def _AddPasswordFromUserInput(handler, msg, req):
return
handler.passwd.add_password(None, url, user, password)
class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req)
return urllib.request.HTTPBasicAuthHandler.http_error_401(
self, req, fp, code, msg, headers)
self, req, fp, code, msg, headers)
def http_error_auth_reqed(self, authreq, host, req, headers):
try:
old_add_header = req.add_header
def _add_header(name, val):
val = val.replace('\n', '')
old_add_header(name, val)
req.add_header = _add_header
return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed(
self, authreq, host, req, headers)
except Exception:
self, authreq, host, req, headers)
except:
reset = getattr(self, 'reset_retry_count', None)
if reset is not None:
reset()
@ -435,24 +351,22 @@ class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
self.retried = 0
raise
class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req)
return urllib.request.HTTPDigestAuthHandler.http_error_401(
self, req, fp, code, msg, headers)
self, req, fp, code, msg, headers)
def http_error_auth_reqed(self, auth_header, host, req, headers):
try:
old_add_header = req.add_header
def _add_header(name, val):
val = val.replace('\n', '')
old_add_header(name, val)
req.add_header = _add_header
return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed(
self, auth_header, host, req, headers)
except Exception:
self, auth_header, host, req, headers)
except:
reset = getattr(self, 'reset_retry_count', None)
if reset is not None:
reset()
@ -460,7 +374,6 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
self.retried = 0
raise
class _KerberosAuthHandler(urllib.request.BaseHandler):
def __init__(self):
self.retried = 0
@ -479,7 +392,7 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
if self.retried > 3:
raise urllib.request.HTTPError(req.get_full_url(), 401,
"Negotiate auth failed", headers, None)
"Negotiate auth failed", headers, None)
else:
self.retried += 1
@ -495,7 +408,7 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
return response
except kerberos.GSSError:
return None
except Exception:
except:
self.reset_retry_count()
raise
finally:
@ -541,7 +454,6 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
kerberos.authGSSClientClean(self.context)
self.context = None
def init_http():
handlers = [_UserAgentHandler()]
@ -550,7 +462,7 @@ def init_http():
n = netrc.netrc()
for host in n.hosts:
p = n.hosts[host]
mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2])
mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2])
mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2])
except netrc.NetrcParseError:
pass
@ -569,7 +481,6 @@ def init_http():
handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
urllib.request.install_opener(urllib.request.build_opener(*handlers))
def _Main(argv):
result = 0
@ -617,7 +528,7 @@ def _Main(argv):
argv = list(sys.argv)
argv.extend(rce.extra_args)
try:
os.execv(sys.executable, [__file__] + argv)
os.execv(__file__, argv)
except OSError as e:
print('fatal: cannot restart repo after upgrade', file=sys.stderr)
print('fatal: %s' % e, file=sys.stderr)
@ -626,6 +537,5 @@ def _Main(argv):
TerminatePager()
sys.exit(result)
if __name__ == '__main__':
_Main(sys.argv[1:])

File diff suppressed because it is too large.


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import select
import subprocess
@ -24,7 +27,6 @@ pager_process = None
old_stdout = None
old_stderr = None
def RunPager(globalConfig):
if not os.isatty(0) or not os.isatty(1):
return
@ -33,37 +35,33 @@ def RunPager(globalConfig):
return
if platform_utils.isWindows():
_PipePager(pager)
_PipePager(pager);
else:
_ForkPager(pager)
def TerminatePager():
global pager_process, old_stdout, old_stderr
if pager_process:
sys.stdout.flush()
sys.stderr.flush()
pager_process.stdin.close()
pager_process.wait()
pager_process.wait();
pager_process = None
# Restore initial stdout/err in case there is more output in this process
# after shutting down the pager process
sys.stdout = old_stdout
sys.stderr = old_stderr
def _PipePager(pager):
global pager_process, old_stdout, old_stderr
assert pager_process is None, "Only one active pager process at a time"
# Create pager process, piping stdout/err into its stdin
pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout,
stderr=sys.stderr)
pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout, stderr=sys.stderr)
old_stdout = sys.stdout
old_stderr = sys.stderr
sys.stdout = pager_process.stdin
sys.stderr = pager_process.stdin
def _ForkPager(pager):
global active
# This process turns into the pager; a child it forks will
@ -90,7 +88,6 @@ def _ForkPager(pager):
print("fatal: cannot start pager '%s'" % pager, file=sys.stderr)
sys.exit(255)
def _SelectPager(globalConfig):
try:
return os.environ['GIT_PAGER']
@ -108,7 +105,6 @@ def _SelectPager(globalConfig):
return 'less'
def _BecomePager(pager):
# Delaying execution of the pager until we have output
# ready works around a long-standing bug in popularly


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2016 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -15,9 +17,18 @@
import errno
import os
import platform
import select
import shutil
import stat
from pyversion import is_python3
if is_python3():
from queue import Queue
else:
from Queue import Queue
from threading import Thread
def isWindows():
""" Returns True when running with the native port of Python for Windows,
@ -28,6 +39,145 @@ def isWindows():
return platform.system() == "Windows"
class FileDescriptorStreams(object):
""" Platform agnostic abstraction enabling non-blocking I/O over a
collection of file descriptors. This abstraction is required because
fcntl(os.O_NONBLOCK) is not supported on Windows.
"""
@classmethod
def create(cls):
""" Factory method: instantiates the concrete class according to the
current platform.
"""
if isWindows():
return _FileDescriptorStreamsThreads()
else:
return _FileDescriptorStreamsNonBlocking()
def __init__(self):
self.streams = []
def add(self, fd, dest, std_name):
""" Wraps an existing file descriptor as a stream.
"""
self.streams.append(self._create_stream(fd, dest, std_name))
def remove(self, stream):
""" Removes a stream, when done with it.
"""
self.streams.remove(stream)
@property
def is_done(self):
""" Returns True when all streams have been processed.
"""
return len(self.streams) == 0
def select(self):
""" Returns the set of streams that have data available to read.
The returned streams each expose a read() and a close() method.
When done with a stream, call the remove(stream) method.
"""
raise NotImplementedError
def _create_stream(self, fd, dest, std_name):
""" Creates a new stream wrapping an existing file descriptor.
"""
raise NotImplementedError
class _FileDescriptorStreamsNonBlocking(FileDescriptorStreams):
""" Implementation of FileDescriptorStreams for platforms that support
non blocking I/O.
"""
class Stream(object):
""" Encapsulates a file descriptor """
def __init__(self, fd, dest, std_name):
self.fd = fd
self.dest = dest
self.std_name = std_name
self.set_non_blocking()
def set_non_blocking(self):
import fcntl
flags = fcntl.fcntl(self.fd, fcntl.F_GETFL)
fcntl.fcntl(self.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
def fileno(self):
return self.fd.fileno()
def read(self):
return self.fd.read(4096)
def close(self):
self.fd.close()
def _create_stream(self, fd, dest, std_name):
return self.Stream(fd, dest, std_name)
def select(self):
ready_streams, _, _ = select.select(self.streams, [], [])
return ready_streams
class _FileDescriptorStreamsThreads(FileDescriptorStreams):
""" Implementation of FileDescriptorStreams for platforms that don't support
non blocking I/O. This implementation requires creating threads issuing
blocking read operations on file descriptors.
"""
def __init__(self):
super(_FileDescriptorStreamsThreads, self).__init__()
# The queue is shared across all threads so we can simulate the
# behavior of the select() function.
self.queue = Queue(10) # Limit incoming data from streams
def _create_stream(self, fd, dest, std_name):
return self.Stream(fd, dest, std_name, self.queue)
def select(self):
# Return only one stream at a time, as it is the most straightforward
# thing to do and it is compatible with the select() function.
item = self.queue.get()
stream = item.stream
stream.data = item.data
return [stream]
class QueueItem(object):
""" Item put in the shared queue """
def __init__(self, stream, data):
self.stream = stream
self.data = data
class Stream(object):
""" Encapsulates a file descriptor """
def __init__(self, fd, dest, std_name, queue):
self.fd = fd
self.dest = dest
self.std_name = std_name
self.queue = queue
self.data = None
self.thread = Thread(target=self.read_to_queue)
self.thread.daemon = True
self.thread.start()
def close(self):
self.fd.close()
def read(self):
data = self.data
self.data = None
return data
def read_to_queue(self):
""" The thread function: reads everything from the file descriptor into
the shared queue and terminates when reaching EOF.
"""
for line in iter(self.fd.readline, b''):
self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, line))
self.fd.close()
self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, None))
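# Hedged usage sketch (not part of the original module): per the docstrings on
# create()/add()/select() above, a consumer drives the streams roughly like
# this. The command, dest objects, and helper name are illustrative only.
def _example_stream_consumer(cmd=('ls', '-l')):
  import subprocess
  import sys
  proc = subprocess.Popen(list(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
  streams = FileDescriptorStreams.create()
  streams.add(proc.stdout, dest=sys.stdout, std_name='stdout')
  streams.add(proc.stderr, dest=sys.stderr, std_name='stderr')
  while not streams.is_done:
    for s in streams.select():
      buf = s.read()
      if not buf:
        # EOF on this stream: remove it and close the descriptor.
        streams.remove(s)
        s.close()
      else:
        # Consuming the bytes is up to the caller; here we simply echo them.
        s.dest.write(buf.decode('utf-8', 'replace'))
  return proc.wait()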
def symlink(source, link_name):
"""Creates a symbolic link pointing to source named link_name.
Note: On Windows, source must exist on disk, as the implementation needs


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2016 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,13 +16,16 @@
import errno
from pyversion import is_python3
from ctypes import WinDLL, get_last_error, FormatError, WinError, addressof
from ctypes import c_buffer, c_ubyte, Structure, Union, byref
from ctypes.wintypes import BOOL, BOOLEAN, LPCWSTR, DWORD, HANDLE
from ctypes.wintypes import WCHAR, USHORT, LPVOID, ULONG, LPDWORD
from ctypes import c_buffer
from ctypes.wintypes import BOOL, BOOLEAN, LPCWSTR, DWORD, HANDLE, POINTER, c_ubyte
from ctypes.wintypes import WCHAR, USHORT, LPVOID, Structure, Union, ULONG
from ctypes.wintypes import byref
kernel32 = WinDLL('kernel32', use_last_error=True)
LPDWORD = POINTER(DWORD)
UCHAR = c_ubyte
# Win32 error codes
@ -142,8 +147,7 @@ def create_dirsymlink(source, link_name):
def _create_symlink(source, link_name, dwFlags):
if not CreateSymbolicLinkW(link_name, source,
dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
if not CreateSymbolicLinkW(link_name, source, dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
# See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0
# "the unprivileged create flag is unsupported below Windows 10 (1703, v10.0.14972).
# retry without it."
@ -194,15 +198,26 @@ def readlink(path):
'Error reading symbolic link \"%s\"'.format(path))
rdb = REPARSE_DATA_BUFFER.from_buffer(target_buffer)
if rdb.ReparseTag == IO_REPARSE_TAG_SYMLINK:
return rdb.SymbolicLinkReparseBuffer.PrintName
return _preserve_encoding(path, rdb.SymbolicLinkReparseBuffer.PrintName)
elif rdb.ReparseTag == IO_REPARSE_TAG_MOUNT_POINT:
return rdb.MountPointReparseBuffer.PrintName
return _preserve_encoding(path, rdb.MountPointReparseBuffer.PrintName)
# Unsupported reparse point type
_raise_winerror(
ERROR_NOT_SUPPORTED,
'Error reading symbolic link \"%s\"'.format(path))
def _preserve_encoding(source, target):
"""Ensures target is the same string type (i.e. unicode or str) as source."""
if is_python3():
return target
if isinstance(source, unicode):
return unicode(target)
return str(target)
def _raise_winerror(code, error_desc):
win_error_desc = FormatError(code).strip()
error_desc = "%s: %s".format(error_desc, win_error_desc)


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -24,53 +26,18 @@ _NOT_TTY = not os.isatty(2)
# column 0.
CSI_ERASE_LINE = '\x1b[2K'
def duration_str(total):
"""A less noisy timedelta.__str__.
The default timedelta stringification contains a lot of leading zeros and
uses microsecond resolution. This makes for noisy output.
"""
hours, rem = divmod(total, 3600)
mins, secs = divmod(rem, 60)
ret = '%.3fs' % (secs,)
if mins:
ret = '%im%s' % (mins, ret)
if hours:
ret = '%ih%s' % (hours, ret)
return ret
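# Illustrative example (not from the original source): duration_str(3723.5)
# evaluates to '1h2m3.500s'; the hour and minute parts are added only when
# they are non-zero.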
class Progress(object):
def __init__(self, title, total=0, units='', print_newline=False, delay=True,
quiet=False):
def __init__(self, title, total=0, units='', print_newline=False,
always_print_percentage=False):
self._title = title
self._total = total
self._done = 0
self._lastp = -1
self._start = time()
self._show = not delay
self._show = False
self._units = units
self._print_newline = print_newline
# Only show the active jobs section if we run more than one in parallel.
self._show_jobs = False
self._active = 0
# When quiet, never show any output. It's a bit hacky, but reusing the
# existing logic that delays initial output keeps the rest of the class
# clean. Basically we set the start time to years in the future.
if quiet:
self._show = False
self._start += 2**32
def start(self, name):
self._active += 1
if not self._show_jobs:
self._show_jobs = self._active > 1
self.update(inc=0, msg='started ' + name)
def finish(self, name):
self.update(msg='finished ' + name)
self._active -= 1
self._always_print_percentage = always_print_percentage
def update(self, inc=1, msg=''):
self._done += inc
@ -86,46 +53,41 @@ class Progress(object):
if self._total <= 0:
sys.stderr.write('%s\r%s: %d,' % (
CSI_ERASE_LINE,
self._title,
self._done))
CSI_ERASE_LINE,
self._title,
self._done))
sys.stderr.flush()
else:
p = (100 * self._done) / self._total
if self._show_jobs:
jobs = '[%d job%s] ' % (self._active, 's' if self._active > 1 else '')
else:
jobs = ''
sys.stderr.write('%s\r%s: %2d%% %s(%d%s/%d%s)%s%s%s' % (
if self._lastp != p or self._always_print_percentage:
self._lastp = p
sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s)%s%s%s' % (
CSI_ERASE_LINE,
self._title,
p,
jobs,
self._done, self._units,
self._total, self._units,
' ' if msg else '', msg,
'\n' if self._print_newline else ''))
sys.stderr.flush()
"\n" if self._print_newline else ""))
sys.stderr.flush()
def end(self):
if _NOT_TTY or IsTrace() or not self._show:
return
duration = duration_str(time() - self._start)
if self._total <= 0:
sys.stderr.write('%s\r%s: %d, done in %s\n' % (
CSI_ERASE_LINE,
self._title,
self._done,
duration))
sys.stderr.write('%s\r%s: %d, done.\n' % (
CSI_ERASE_LINE,
self._title,
self._done))
sys.stderr.flush()
else:
p = (100 * self._done) / self._total
sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s), done in %s\n' % (
CSI_ERASE_LINE,
self._title,
p,
self._done, self._units,
self._total, self._units,
duration))
sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s), done.\n' % (
CSI_ERASE_LINE,
self._title,
p,
self._done, self._units,
self._total, self._units))
sys.stderr.flush()

project.py

File diff suppressed because it is too large.

pyversion.py (new file)

@ -0,0 +1,20 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2013 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
def is_python3():
return sys.version_info[0] == 3


@ -1,2 +0,0 @@
These are helper tools for managing official releases.
See the [release process](../docs/release-process.md) document for more details.


@ -1,114 +0,0 @@
#!/usr/bin/env python3
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for signing repo launcher scripts correctly.
This is intended to be run only by the official Repo release managers.
"""
import argparse
import os
import subprocess
import sys
import util
def sign(opts):
"""Sign the launcher!"""
output = ''
for key in opts.keys:
# We use ! at the end of the key so that gpg uses this specific key.
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['gpg', '--homedir', opts.gpgdir, '-u', f'{key}!', '--batch', '--yes',
'--armor', '--detach-sign', '--output', '-', opts.launcher]
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
output += ret.stdout
# Save the combined signatures into one file.
with open(f'{opts.launcher}.asc', 'w', encoding='utf-8') as fp:
fp.write(output)
def check(opts):
"""Check the signature."""
util.run(opts, ['gpg', '--verify', f'{opts.launcher}.asc'])
def postmsg(opts):
"""Helpful info to show at the end for release manager."""
print(f"""
Repo launcher bucket:
gs://git-repo-downloads/
To upload this launcher directly:
gsutil cp -a public-read {opts.launcher} {opts.launcher}.asc gs://git-repo-downloads/
NB: You probably want to upload it with a specific version first, e.g.:
gsutil cp -a public-read {opts.launcher} gs://git-repo-downloads/repo-3.0
gsutil cp -a public-read {opts.launcher}.asc gs://git-repo-downloads/repo-3.0.asc
""")
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('--keyid', dest='keys', default=[], action='append',
help='alternative signing keys to use')
parser.add_argument('launcher',
default=os.path.join(util.TOPDIR, 'repo'), nargs='?',
help='the launcher script to sign')
return parser
def main(argv):
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
if not os.path.exists(opts.launcher):
parser.error(f'launcher does not exist: {opts.launcher}')
opts.launcher = os.path.relpath(opts.launcher)
print(f'Signing "{opts.launcher}" launcher script and saving to '
f'"{opts.launcher}.asc"')
if opts.keys:
print(f'Using custom keys to sign: {" ".join(opts.keys)}')
else:
print('Using official Repo release keys to sign')
opts.keys = [util.KEYID_DSA, util.KEYID_RSA, util.KEYID_ECC]
util.import_release_key(opts)
sign(opts)
check(opts)
postmsg(opts)
return 0
if __name__ == '__main__':
sys.exit(main(sys.argv[1:]))


@ -1,140 +0,0 @@
#!/usr/bin/env python3
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for signing repo release tags correctly.
This is intended to be run only by the official Repo release managers, but it
could be run by people maintaining their own fork of the project.
NB: Check docs/release-process.md for production freeze information.
"""
import argparse
import os
import re
import subprocess
import sys
import util
# We currently sign with the old DSA key as it's been around the longest.
# We should transition to RSA by Jun 2020, and ECC by Jun 2021.
KEYID = util.KEYID_DSA
# Regular expression to validate tag names.
RE_VALID_TAG = r'^v([0-9]+[.])+[0-9]+$'
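# Illustrative matches (not in the original source): 'v2.0' and 'v2.10.3' are
# accepted, while '2.0' (no leading v) and 'v2' (no dot-separated component)
# are rejected unless --force is given.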
def sign(opts):
"""Tag the commit & sign it!"""
# We use ! at the end of the key so that gpg uses this specific key.
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['git', 'tag', '-s', opts.tag, '-u', f'{opts.key}!',
'-m', f'repo {opts.tag}', opts.commit]
key = 'GNUPGHOME'
print('+', f'export {key}="{opts.gpgdir}"')
oldvalue = os.getenv(key)
os.putenv(key, opts.gpgdir)
util.run(opts, cmd)
if oldvalue is None:
os.unsetenv(key)
else:
os.putenv(key, oldvalue)
def check(opts):
"""Check the signature."""
util.run(opts, ['git', 'tag', '--verify', opts.tag])
def postmsg(opts):
"""Helpful info to show at the end for release manager."""
cmd = ['git', 'rev-parse', 'remotes/origin/stable']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
current_release = ret.stdout.strip()
cmd = ['git', 'log', '--format=%h (%aN) %s', '--no-merges',
f'remotes/origin/stable..{opts.tag}']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
shortlog = ret.stdout.strip()
print(f"""
Here's the short log since the last release.
{shortlog}
To push release to the public:
git push origin {opts.commit}:stable {opts.tag} -n
NB: People will start upgrading to this version immediately.
To roll back a release:
git push origin --force {current_release}:stable -n
""")
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(
description=__doc__,
formatter_class=argparse.RawDescriptionHelpFormatter)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('-f', '--force', action='store_true',
help='force signing of any tag')
parser.add_argument('--keyid', dest='key',
help='alternative signing key to use')
parser.add_argument('tag',
help='the tag to create (e.g. "v2.0")')
parser.add_argument('commit', default='HEAD', nargs='?',
help='the commit to tag')
return parser
def main(argv):
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
if not opts.force and not re.match(RE_VALID_TAG, opts.tag):
parser.error(f'tag "{opts.tag}" does not match regex "{RE_VALID_TAG}"; '
'use --force to sign anyways')
if opts.key:
print(f'Using custom key to sign: {opts.key}')
else:
print('Using official Repo release key to sign')
opts.key = KEYID
util.import_release_key(opts)
sign(opts)
check(opts)
postmsg(opts)
return 0
if __name__ == '__main__':
sys.exit(main(sys.argv[1:]))


@ -1,73 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Random utility code for release tools."""
import os
import re
import subprocess
import sys
assert sys.version_info >= (3, 6), 'This module requires Python 3.6+'
TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
HOMEDIR = os.path.expanduser('~')
# These are the release keys we sign with.
KEYID_DSA = '8BB9AD793E8E6153AF0F9A4416530D5E920F5C65'
KEYID_RSA = 'A34A13BE8E76BFF46A0C022DA2E75A824AAB9624'
KEYID_ECC = 'E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39'
def cmdstr(cmd):
"""Get a nicely quoted shell command."""
ret = []
for arg in cmd:
if not re.match(r'^[a-zA-Z0-9/_.=-]+$', arg):
arg = f'"{arg}"'
ret.append(arg)
return ' '.join(ret)
def run(opts, cmd, check=True, **kwargs):
"""Helper around subprocess.run to include logging."""
print('+', cmdstr(cmd))
if opts.dryrun:
cmd = ['true', '--'] + cmd
try:
return subprocess.run(cmd, check=check, **kwargs)
except subprocess.CalledProcessError as e:
print(f'aborting: {e}', file=sys.stderr)
sys.exit(1)
def import_release_key(opts):
"""Import the public key of the official release repo signing key."""
# Extract the key from our repo launcher.
launcher = getattr(opts, 'launcher', os.path.join(TOPDIR, 'repo'))
print(f'Importing keys from "{launcher}" launcher script')
with open(launcher, encoding='utf-8') as fp:
data = fp.read()
keys = re.findall(
r'\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*'
r'\n-----END PGP PUBLIC KEY BLOCK-----\n', data, flags=re.M)
run(opts, ['gpg', '--import'], input='\n'.join(keys).encode('utf-8'))
print('Marking keys as fully trusted')
run(opts, ['gpg', '--import-ownertrust'],
input=f'{KEYID_DSA}:6:\n'.encode('utf-8'))
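As a hedged usage sketch of the helpers above (the Namespace and the gpg command line are illustrative; any object with a dryrun attribute works):

# Hypothetical driver exercising util.run() and util.cmdstr().
import argparse
import util

opts = argparse.Namespace(dryrun=True)
# cmdstr() quotes only the arguments that need it, so this logs
#   + gpg --verify "my launcher.asc"
# and, because dryrun is set, actually executes: true -- gpg --verify ...
util.run(opts, ['gpg', '--verify', 'my launcher.asc'])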

repo

File diff suppressed because it is too large.


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -17,6 +19,7 @@
Activated via `repo --trace ...` or `REPO_TRACE=1 repo ...`.
"""
from __future__ import print_function
import sys
import os
@ -25,16 +28,13 @@ REPO_TRACE = 'REPO_TRACE'
_TRACE = os.environ.get(REPO_TRACE) == '1'
def IsTrace():
return _TRACE
def SetTrace():
global _TRACE
_TRACE = True
def Trace(fmt, *args):
if IsTrace():
print(fmt % args, file=sys.stderr)


@ -1,57 +0,0 @@
# This file declares various requirements for this version of repo. The
# launcher script will load it and check the constraints before trying to run
# us. This avoids issues of the launcher using an old version of Python (e.g.
# 3.5) while the codebase has moved on to requiring something much newer (e.g.
# 3.8). If the launcher tried to import us, it would fail with syntax errors.
# This is a JSON file with line-level comments allowed.
# Always keep backwards compatibility in mind. The launcher script is robust
# against missing values, but when a field is renamed/removed, it means older
# versions of the launcher script won't be able to enforce the constraint.
# When requiring versions, always use lists as they are easy to parse & compare
# in Python. Strings would require further processing to turn into a list.
# Version constraints should be expressed in pairs: soft & hard. Soft versions
# are when we start warning users that their software is too old and we're planning
# on dropping support for it, so they need to start planning system upgrades.
# Hard versions are when we refuse to work with the tool. Users will be shown an
# error message before we abort entirely.
# When deciding whether to upgrade a version requirement, check out the distro
# lists to see who will be impacted:
# https://gerrit.googlesource.com/git-repo/+/HEAD/docs/release-process.md#Project-References
{
# The repo launcher itself. This allows us to force people to upgrade as some
# ignore the warnings about it being out of date, or install ancient versions
# to start with for whatever reason.
#
# NB: Repo launchers started checking this file with repo-2.12, so listing
# versions older than that won't make a difference.
"repo": {
"hard": [2, 11],
"soft": [2, 11]
},
# Supported Python versions.
#
# python-3.6 is in Ubuntu Bionic.
# python-3.5 is in Debian Stretch.
"python": {
"hard": [3, 5],
"soft": [3, 6]
},
# Supported git versions.
#
# git-1.7.2 is in Debian Squeeze.
# git-1.7.9 is in Ubuntu Precise.
# git-1.9.1 is in Ubuntu Trusty.
# git-1.7.10 is in Debian Wheezy.
"git": {
"hard": [1, 7, 2],
"soft": [1, 9, 1]
}
}
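The comments above suggest how a launcher can consume this file: strip the line-level comments, parse the JSON, and compare version tuples. A hedged sketch under those assumptions (the file name, helper name, and messages are illustrative):

# requirements_check.py (hypothetical reader for the file described above)
import json
import sys

def load_requirements(path='requirements.json'):
  # Drop the line-level comments before handing the text to the JSON parser.
  with open(path, encoding='utf-8') as fp:
    text = ''.join(line for line in fp if not line.lstrip().startswith('#'))
  return json.loads(text)

reqs = load_requirements()
soft = tuple(reqs['python']['soft'])
hard = tuple(reqs['python']['hard'])
if sys.version_info[:len(hard)] < hard:
  sys.exit('repo: error: this Python version is no longer supported')
elif sys.version_info[:len(soft)] < soft:
  print('repo: warning: this Python version will soon be unsupported')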


@ -1,4 +1,5 @@
#!/usr/bin/env python3
#!/usr/bin/env python
# -*- coding:utf-8 -*-
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -15,42 +16,37 @@
"""Wrapper to run pytest with the right settings."""
from __future__ import print_function
import errno
import os
import shutil
import subprocess
import sys
def find_pytest():
"""Try to locate a good version of pytest."""
# Use the Python 3 version if available.
ret = shutil.which('pytest-3')
if ret:
return ret
# Hopefully this is a Python 3 version.
ret = shutil.which('pytest')
if ret:
return ret
print('%s: unable to find pytest.' % (__file__,), file=sys.stderr)
print('%s: Try installing: sudo apt-get install python-pytest' % (__file__,),
file=sys.stderr)
def run_pytest(cmd, argv):
"""Run the unittests via |cmd|."""
try:
return subprocess.call([cmd] + argv)
except OSError as e:
if e.errno == errno.ENOENT:
print('%s: unable to run `%s`: %s' % (__file__, cmd, e), file=sys.stderr)
print('%s: Try installing pytest: sudo apt-get install python-pytest' %
(__file__,), file=sys.stderr)
return 127
else:
raise
def main(argv):
"""The main entry."""
# Add the repo tree to PYTHONPATH as the tests expect to be able to import
# modules directly.
pythonpath = os.path.dirname(os.path.realpath(__file__))
oldpythonpath = os.environ.get('PYTHONPATH', None)
if oldpythonpath is not None:
pythonpath += os.pathsep + oldpythonpath
os.environ['PYTHONPATH'] = pythonpath
topdir = os.path.dirname(os.path.realpath(__file__))
pythonpath = os.environ.get('PYTHONPATH', '')
os.environ['PYTHONPATH'] = '%s:%s' % (topdir, pythonpath)
pytest = find_pytest()
return subprocess.run([pytest] + argv, check=False).returncode
return run_pytest('pytest', argv)
if __name__ == '__main__':


@ -1,4 +1,5 @@
#!/usr/bin/env python3
#!/usr/bin/env python
# -*- coding:utf-8 -*-
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -15,6 +16,8 @@
"""Python packaging for repo."""
from __future__ import print_function
import os
import setuptools
@ -32,7 +35,7 @@ with open(os.path.join(TOPDIR, 'README.md')) as fp:
# https://packaging.python.org/tutorials/packaging-projects/
setuptools.setup(
name='repo',
version='2',
version='1.13.8',
maintainer='Various',
maintainer_email='repo-discuss@googlegroups.com',
description='Repo helps manage many Git repositories',
@ -52,10 +55,9 @@ setuptools.setup(
'Operating System :: MacOS :: MacOS X',
'Operating System :: Microsoft :: Windows :: Windows 10',
'Operating System :: POSIX :: Linux',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3 :: Only',
'Topic :: Software Development :: Version Control :: Git',
],
python_requires='>=3.5',
# We support Python 2.7 and Python 3.6+.
python_requires='>=2.7, ' + ', '.join('!=3.%i.*' % x for x in range(0, 6)),
packages=['subcmds'],
)


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,7 +16,6 @@
import os
# A mapping of the subcommand name to the class that implements it.
all_commands = {}
my_dir = os.path.dirname(__file__)
@ -36,14 +37,14 @@ for py in os.listdir(my_dir):
['%s' % name])
mod = getattr(mod, name)
try:
cmd = getattr(mod, clsn)
cmd = getattr(mod, clsn)()
except AttributeError:
raise SyntaxError('%s/%s does not define class %s' % (
__name__, py, clsn))
__name__, py, clsn))
name = name.replace('_', '-')
cmd.NAME = name
all_commands[name] = cmd
# Add 'branch' as an alias for 'branches'.
all_commands['branch'] = all_commands['branches']
if 'help' in all_commands:
all_commands['help'].commands = all_commands


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,16 +14,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from collections import defaultdict
import functools
import itertools
from __future__ import print_function
import sys
from command import Command, DEFAULT_LOCAL_JOBS
from command import Command
from collections import defaultdict
from git_command import git
from progress import Progress
class Abandon(Command):
common = True
helpSummary = "Permanently abandon a development branch"
@ -33,8 +32,6 @@ deleting it (and all its history) from your local repository.
It is equivalent to "git branch -D <branchname>".
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('--all',
dest='all', action='store_true',
@ -51,64 +48,52 @@ It is equivalent to "git branch -D <branchname>".
else:
args.insert(0, "'All local branches'")
def _ExecuteOne(self, all_branches, nb, project):
"""Abandon one project."""
if all_branches:
branches = project.GetBranches()
else:
branches = [nb]
ret = {}
for name in branches:
status = project.AbandonBranch(name)
if status is not None:
ret[name] = status
return (ret, project)
def Execute(self, opt, args):
nb = args[0]
err = defaultdict(list)
success = defaultdict(list)
all_projects = self.GetProjects(args[1:])
def _ProcessResults(_pool, pm, states):
for (results, project) in states:
for branch, status in results.items():
pm = Progress('Abandon %s' % nb, len(all_projects))
for project in all_projects:
pm.update()
if opt.all:
branches = list(project.GetBranches().keys())
else:
branches = [nb]
for name in branches:
status = project.AbandonBranch(name)
if status is not None:
if status:
success[branch].append(project)
success[name].append(project)
else:
err[branch].append(project)
pm.update()
err[name].append(project)
pm.end()
self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, opt.all, nb),
all_projects,
callback=_ProcessResults,
output=Progress('Abandon %s' % (nb,), len(all_projects), quiet=opt.quiet))
width = 25
for name in branches:
if width < len(name):
width = len(name)
width = max(itertools.chain(
[25], (len(x) for x in itertools.chain(success, err))))
if err:
for br in err.keys():
err_msg = "error: cannot abandon %s" % br
err_msg = "error: cannot abandon %s" %br
print(err_msg, file=sys.stderr)
for proj in err[br]:
print(' ' * len(err_msg) + " | %s" % proj.relpath, file=sys.stderr)
print(' '*len(err_msg) + " | %s" % proj.relpath, file=sys.stderr)
sys.exit(1)
elif not success:
print('error: no project has local branch(es) : %s' % nb,
file=sys.stderr)
sys.exit(1)
else:
# Everything below here is displaying status.
if opt.quiet:
return
print('Abandoned branches:')
print('Abandoned branches:', file=sys.stderr)
for br in success.keys():
if len(all_projects) > 1 and len(all_projects) == len(success[br]):
result = "all project"
else:
result = "%s" % (
('\n' + ' ' * width + '| ').join(p.relpath for p in success[br]))
print("%s%s| %s\n" % (br, ' ' * (width - len(br)), result))
('\n'+' '*width + '| ').join(p.relpath for p in success[br]))
print("%s%s| %s\n" % (br,' '*(width-len(br)), result),file=sys.stderr)


@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,21 +14,18 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import itertools
from __future__ import print_function
import sys
from color import Coloring
from command import Command, DEFAULT_LOCAL_JOBS
from command import Command
class BranchColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'branch')
self.current = self.printer('current', fg='green')
self.local = self.printer('local')
self.local = self.printer('local')
self.notinproject = self.printer('notinproject', fg='red')
class BranchInfo(object):
def __init__(self, name):
self.name = name
@ -95,7 +94,6 @@ the branch appears in, or does not appear in. If no project list
is shown, then the branch appears in all projects.
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def Execute(self, opt, args):
projects = self.GetProjects(args)
@ -103,19 +101,14 @@ is shown, then the branch appears in all projects.
all_branches = {}
project_cnt = len(projects)
def _ProcessResults(_pool, _output, results):
for name, b in itertools.chain.from_iterable(results):
for project in projects:
for name, b in project.GetBranches().items():
b.project = project
if name not in all_branches:
all_branches[name] = BranchInfo(name)
all_branches[name].add(b)
self.ExecuteInParallel(
opt.jobs,
expand_project_to_branches,
projects,
callback=_ProcessResults)
names = sorted(all_branches)
names = list(sorted(all_branches))
if not names:
print(' (no branches)', file=sys.stderr)
@ -165,7 +158,7 @@ is shown, then the branch appears in all projects.
for b in i.projects:
have.add(b.project)
for p in projects:
if p not in have:
if not p in have:
paths.append(p.relpath)
s = ' %s %s' % (in_type, ', '.join(paths))
@ -177,27 +170,11 @@ is shown, then the branch appears in all projects.
fmt = out.current if i.IsCurrent else out.write
for p in paths:
out.nl()
fmt(width * ' ' + ' %s' % p)
fmt(width*' ' + ' %s' % p)
fmt = out.write
for p in non_cur_paths:
out.nl()
fmt(width * ' ' + ' %s' % p)
fmt(width*' ' + ' %s' % p)
else:
out.write(' in all projects')
out.nl()
def expand_project_to_branches(project):
"""Expands a project into a list of branch names & associated information.
Args:
project: project.Project
Returns:
List[Tuple[str, git_config.Branch]]
"""
branches = []
for name, b in project.GetBranches().items():
b.project = project
branches.append((name, b))
return branches
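
In the Branches rewrite above, each worker expands a project into (name, branch) tuples and the callback merges them with itertools.chain.from_iterable. A small sketch of that merge step, with toy dict projects standing in for project.Project objects:

```python
import itertools

def expand_project_to_branches(project):
    """Return (branch_name, project_name) tuples for one project."""
    return [(name, project['name']) for name in project['branches']]

projects = [
    {'name': 'build/make', 'branches': ['main', 'topic']},
    {'name': 'system/core', 'branches': ['main']},
]

# repo hands expand_project_to_branches to ExecuteInParallel; plain map()
# keeps this sketch single-process.
results = map(expand_project_to_branches, projects)

all_branches = {}
for name, project_name in itertools.chain.from_iterable(results):
    all_branches.setdefault(name, []).append(project_name)

for name in sorted(all_branches):
    print('%s: %s' % (name, ', '.join(all_branches[name])))
```
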

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,13 +14,11 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import functools
from __future__ import print_function
import sys
from command import Command, DEFAULT_LOCAL_JOBS
from command import Command
from progress import Progress
class Checkout(Command):
common = True
helpSummary = "Checkout a branch for development"
@ -33,37 +33,28 @@ The command is equivalent to:
repo forall [<project>...] -c git checkout <branchname>
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def ValidateOptions(self, opt, args):
if not args:
self.Usage()
def _ExecuteOne(self, nb, project):
"""Checkout one project."""
return (project.CheckoutBranch(nb), project)
def Execute(self, opt, args):
nb = args[0]
err = []
success = []
all_projects = self.GetProjects(args[1:])
def _ProcessResults(_pool, pm, results):
for status, project in results:
if status is not None:
if status:
success.append(project)
else:
err.append(project)
pm.update()
pm = Progress('Checkout %s' % nb, len(all_projects))
for project in all_projects:
pm.update()
self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, nb),
all_projects,
callback=_ProcessResults,
output=Progress('Checkout %s' % (nb,), len(all_projects), quiet=opt.quiet))
status = project.CheckoutBranch(nb)
if status is not None:
if status:
success.append(project)
else:
err.append(project)
pm.end()
if err:
for p in err:
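
The Checkout worker above returns (status, project) pairs where status is None when a project simply lacks the branch, so only True/False results are counted as success or failure. A sketch of that three-way triage with toy data standing in for CheckoutBranch():

```python
def checkout_branch(project, branch):
    # Stand-in for Project.CheckoutBranch(): None means the branch does not
    # exist here, True means checked out, False means the checkout failed.
    return project.get(branch)

projects = [
    {'name': 'build/make', 'topic': True},
    {'name': 'system/core'},             # no 'topic' branch -> None
    {'name': 'external/zlib', 'topic': False},
]

success, err = [], []
for project in projects:
    status = checkout_branch(project, 'topic')
    if status is not None:
        (success if status else err).append(project['name'])

print('checked out:', success)
print('failed:', err)
```
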

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2010 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import re
import sys
from command import Command
@ -19,7 +22,6 @@ from git_command import GitCommand
CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$')
class CherryPick(Command):
common = True
helpSummary = "Cherry-pick a change."
@ -32,6 +34,9 @@ The change id will be updated, and a reference to the old
change id will be added.
"""
def _Options(self, p):
pass
def ValidateOptions(self, opt, args):
if len(args) != 1:
self.Usage()
@ -41,8 +46,8 @@ change id will be added.
p = GitCommand(None,
['rev-parse', '--verify', reference],
capture_stdout=True,
capture_stderr=True)
capture_stdout = True,
capture_stderr = True)
if p.Wait() != 0:
print(p.stderr, file=sys.stderr)
sys.exit(1)
@ -56,8 +61,8 @@ change id will be added.
p = GitCommand(None,
['cherry-pick', sha1],
capture_stdout=True,
capture_stderr=True)
capture_stdout = True,
capture_stderr = True)
status = p.Wait()
print(p.stdout, file=sys.stdout)
@ -69,9 +74,11 @@ change id will be added.
new_msg = self._Reformat(old_msg, sha1)
p = GitCommand(None, ['commit', '--amend', '-F', '-'],
input=new_msg,
capture_stdout=True,
capture_stderr=True)
provide_stdin = True,
capture_stdout = True,
capture_stderr = True)
p.stdin.write(new_msg)
p.stdin.close()
if p.Wait() != 0:
print("error: Failed to update commit message", file=sys.stderr)
sys.exit(1)
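
The newer CherryPick code above passes the rewritten message to `git commit --amend -F -` through GitCommand's input= argument instead of writing to p.stdin by hand. A standard-library sketch of the same idea using subprocess.run; it assumes it is run inside a git checkout with a commit to amend:

```python
import subprocess
import sys

# Hypothetical rewritten message for illustration only.
new_msg = 'subject line\n\n(cherry picked, Change-Id updated)\n'

result = subprocess.run(
    ['git', 'commit', '--amend', '-F', '-'],
    input=new_msg, text=True, check=False,
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
if result.returncode != 0:
    print('error: Failed to update commit message', file=sys.stderr)
    print(result.stderr, file=sys.stderr)
    sys.exit(1)
```
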
@ -90,7 +97,7 @@ change id will be added.
def _StripHeader(self, commit_msg):
lines = commit_msg.splitlines()
return "\n".join(lines[lines.index("") + 1:])
return "\n".join(lines[lines.index("")+1:])
def _Reformat(self, old_msg, sha1):
new_msg = []

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,11 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import functools
import io
from command import DEFAULT_LOCAL_JOBS, PagedCommand
from command import PagedCommand
class Diff(PagedCommand):
common = True
@ -28,42 +26,19 @@ The -u option causes '%prog' to generate diff output with file paths
relative to the repository root, so the output can be applied
to the Unix 'patch' command.
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
def cmd(option, opt_str, value, parser):
setattr(parser.values, option.dest, list(parser.rargs))
while parser.rargs:
del parser.rargs[0]
p.add_option('-u', '--absolute',
dest='absolute', action='store_true',
help='Paths are relative to the repository root')
def _ExecuteOne(self, absolute, project):
"""Obtains the diff for a specific project.
Args:
absolute: Paths are relative to the root.
project: Project to get status of.
Returns:
The status of the project.
"""
buf = io.StringIO()
ret = project.PrintWorkTreeDiff(absolute, output_redir=buf)
return (ret, buf.getvalue())
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
def _ProcessResults(_pool, _output, results):
ret = 0
for (state, output) in results:
if output:
print(output, end='')
if not state:
ret = 1
return ret
return self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, opt.absolute),
all_projects,
callback=_ProcessResults,
ordered=True)
ret = 0
for project in self.GetProjects(args):
if not project.PrintWorkTreeDiff(opt.absolute):
ret = 1
return ret
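
The parallel Diff above redirects each project's diff into an io.StringIO buffer so workers can run concurrently while output is still printed in project order (ordered=True). A sketch of that buffering, with a toy render_diff() in place of PrintWorkTreeDiff():

```python
import io

def render_diff(project):
    # Stand-in for Project.PrintWorkTreeDiff(absolute, output_redir=buf).
    buf = io.StringIO()
    buf.write('project %s/\n' % project)
    buf.write(' M src/main.c\n')
    return (True, buf.getvalue())        # (clean, captured output)

ret = 0
# repo runs the worker in a pool; plain map() is enough for the sketch.
for state, output in map(render_diff, ['build/make', 'system/core']):
    if output:
        print(output, end='')
    if not state:
        ret = 1
```
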

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,14 +16,12 @@
from color import Coloring
from command import PagedCommand
from manifest_xml import RepoClient
from manifest_xml import XmlManifest
class _Coloring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, "status")
class Diffmanifests(PagedCommand):
""" A command to see logs in projects represented by manifests
@ -77,7 +77,7 @@ synced and their revisions won't be found.
metavar='<FORMAT>',
help='print the log using a custom git pretty format string')
def _printRawDiff(self, diff, pretty_format=None):
def _printRawDiff(self, diff):
for project in diff['added']:
self.printText("A %s %s" % (project.relpath, project.revisionExpr))
self.out.nl()
@ -90,7 +90,7 @@ synced and their revisions won't be found.
self.printText("C %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr))
self.out.nl()
self._printLogs(project, otherProject, raw=True, color=False, pretty_format=pretty_format)
self._printLogs(project, otherProject, raw=True, color=False)
for project, otherProject in diff['unreachable']:
self.printText("U %s %s %s" % (project.relpath, project.revisionExpr,
@ -181,26 +181,26 @@ synced and their revisions won't be found.
self.OptionParser.error('missing manifests to diff')
def Execute(self, opt, args):
self.out = _Coloring(self.client.globalConfig)
self.out = _Coloring(self.manifest.globalConfig)
self.printText = self.out.nofmt_printer('text')
if opt.color:
self.printProject = self.out.nofmt_printer('project', attr='bold')
self.printAdded = self.out.nofmt_printer('green', fg='green', attr='bold')
self.printRemoved = self.out.nofmt_printer('red', fg='red', attr='bold')
self.printRevision = self.out.nofmt_printer('revision', fg='yellow')
self.printProject = self.out.nofmt_printer('project', attr = 'bold')
self.printAdded = self.out.nofmt_printer('green', fg = 'green', attr = 'bold')
self.printRemoved = self.out.nofmt_printer('red', fg = 'red', attr = 'bold')
self.printRevision = self.out.nofmt_printer('revision', fg = 'yellow')
else:
self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText
manifest1 = RepoClient(self.repodir)
manifest1 = XmlManifest(self.manifest.repodir)
manifest1.Override(args[0], load_local_manifests=False)
if len(args) == 1:
manifest2 = self.manifest
else:
manifest2 = RepoClient(self.repodir)
manifest2 = XmlManifest(self.manifest.repodir)
manifest2.Override(args[1], load_local_manifests=False)
diff = manifest1.projectsDiff(manifest2)
if opt.raw:
self._printRawDiff(diff, pretty_format=opt.pretty_format)
self._printRawDiff(diff)
else:
self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format)

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,15 +14,15 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import re
import sys
from command import Command
from error import GitError, NoSuchProjectError
from error import GitError
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
class Download(Command):
common = True
helpSummary = "Download and checkout a change"
@ -34,13 +36,9 @@ If no project is specified try to use current directory as a project.
"""
def _Options(self, p):
p.add_option('-b', '--branch',
help='create a new branch first')
p.add_option('-c', '--cherry-pick',
dest='cherrypick', action='store_true',
help="cherry-pick instead of checkout")
p.add_option('-x', '--record-origin', action='store_true',
help='pass -x when cherry-picking')
p.add_option('-r', '--revert',
dest='revert', action='store_true',
help="revert instead of checkout")
@ -60,7 +58,6 @@ If no project is specified try to use current directory as a project.
if m:
if not project:
project = self.GetProjects(".")[0]
print('Defaulting to cwd project', project.name)
chg_id = int(m.group(1))
if m.group(2):
ps_id = int(m.group(2))
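
CHANGE_RE above accepts a bare change number or a number plus patch set separated by '/', '.', or '-'. A quick sketch of applying it the way the Download argument parsing does; defaulting to patch set 1 when none is given is an assumption made for this sketch:

```python
import re

# Same pattern as CHANGE_RE above.
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')

for arg in ('12345', '12345/6', '12345.6', 'not-a-change'):
    m = CHANGE_RE.match(arg)
    if not m:
        print('%s: not a change id' % arg)
        continue
    chg_id = int(m.group(1))
    ps_id = int(m.group(2)) if m.group(2) else 1   # assume patch set 1
    print('%s -> change %d, patch set %d' % (arg, chg_id, ps_id))
```
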
@ -77,33 +74,9 @@ If no project is specified try to use current directory as a project.
ps_id = max(int(match.group(1)), ps_id)
to_get.append((project, chg_id, ps_id))
else:
projects = self.GetProjects([a])
if len(projects) > 1:
# If the cwd is one of the projects, assume they want that.
try:
project = self.GetProjects('.')[0]
except NoSuchProjectError:
project = None
if project not in projects:
print('error: %s matches too many projects; please re-run inside '
'the project checkout.' % (a,), file=sys.stderr)
for project in projects:
print(' %s/ @ %s' % (project.relpath, project.revisionExpr),
file=sys.stderr)
sys.exit(1)
else:
project = projects[0]
print('Defaulting to cwd project', project.name)
project = self.GetProjects([a])[0]
return to_get
def ValidateOptions(self, opt, args):
if opt.record_origin:
if not opt.cherrypick:
self.OptionParser.error('-x only makes sense with --cherry-pick')
if opt.ffonly:
self.OptionParser.error('-x and --ff are mutually exclusive options')
def Execute(self, opt, args):
for project, change_id, ps_id in self._ParseChangeIds(args):
dl = project.DownloadPatchSet(change_id, ps_id)
@ -120,41 +93,22 @@ If no project is specified try to use current directory as a project.
continue
if len(dl.commits) > 1:
print('[%s] %d/%d depends on %d unmerged changes:'
print('[%s] %d/%d depends on %d unmerged changes:' \
% (project.name, change_id, ps_id, len(dl.commits)),
file=sys.stderr)
for c in dl.commits:
print(' %s' % (c), file=sys.stderr)
if opt.cherrypick:
mode = 'cherry-pick'
try:
project._CherryPick(dl.commit)
except GitError:
print('[%s] Could not complete the cherry-pick of %s' \
% (project.name, dl.commit), file=sys.stderr)
sys.exit(1)
elif opt.revert:
mode = 'revert'
project._Revert(dl.commit)
elif opt.ffonly:
mode = 'fast-forward merge'
project._FastForward(dl.commit, ffonly=True)
else:
mode = 'checkout'
# We'll combine the branch+checkout operation, but all the rest need a
# dedicated branch start.
if opt.branch and mode != 'checkout':
project.StartBranch(opt.branch)
try:
if opt.cherrypick:
project._CherryPick(dl.commit, ffonly=opt.ffonly,
record_origin=opt.record_origin)
elif opt.revert:
project._Revert(dl.commit)
elif opt.ffonly:
project._FastForward(dl.commit, ffonly=True)
else:
if opt.branch:
project.StartBranch(opt.branch, revision=dl.commit)
else:
project._Checkout(dl.commit)
except GitError:
print('[%s] Could not complete the %s of %s'
% (project.name, mode, dl.commit), file=sys.stderr)
sys.exit(1)
project._Checkout(dl.commit)

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,9 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import errno
import functools
import io
import multiprocessing
import re
import os
@ -23,14 +24,14 @@ import sys
import subprocess
from color import Coloring
from command import DEFAULT_LOCAL_JOBS, Command, MirrorSafeCommand, WORKER_BATCH_SIZE
from error import ManifestInvalidRevisionError
from command import Command, MirrorSafeCommand
import platform_utils
_CAN_COLOR = [
'branch',
'diff',
'grep',
'log',
'branch',
'diff',
'grep',
'log',
]
@ -45,7 +46,7 @@ class Forall(Command, MirrorSafeCommand):
helpSummary = "Run a shell command in each project"
helpUsage = """
%prog [<project>...] -c <command> [<arg>...]
%prog -r str1 [str2] ... -c <command> [<arg>...]
%prog -r str1 [str2] ... -c <command> [<arg>...]"
"""
helpDescription = """
Executes the same shell command in each project.
@ -53,11 +54,6 @@ Executes the same shell command in each project.
The -r option allows running the command only on projects matching
regex or wildcard expression.
By default, projects are processed non-interactively in parallel. If you want
to run interactive commands, make sure to pass --interactive to force --jobs 1.
While the processing order of projects is not guaranteed, the order of project
output is stable.
# Output Formatting
The -p option causes '%prog' to bind pipes to the command's stdin,
@ -120,22 +116,18 @@ terminal and are not redirected.
If -e is used, when a command exits unsuccessfully, '%prog' will abort
without iterating through the remaining projects.
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
@staticmethod
def _cmd_option(option, _opt_str, _value, parser):
setattr(parser.values, option.dest, list(parser.rargs))
while parser.rargs:
del parser.rargs[0]
def _Options(self, p):
def cmd(option, opt_str, value, parser):
setattr(parser.values, option.dest, list(parser.rargs))
while parser.rargs:
del parser.rargs[0]
p.add_option('-r', '--regex',
dest='regex', action='store_true',
help="Execute the command only on projects matching regex or wildcard expression")
p.add_option('-i', '--inverse-regex',
dest='inverse_regex', action='store_true',
help="Execute the command only on projects not matching regex or "
"wildcard expression")
help="Execute the command only on projects not matching regex or wildcard expression")
p.add_option('-g', '--groups',
dest='groups',
help="Execute the command only on projects matching the specified groups")
@ -143,7 +135,7 @@ without iterating through the remaining projects.
help='Command (and arguments) to execute',
dest='command',
action='callback',
callback=self._cmd_option)
callback=cmd)
p.add_option('-e', '--abort-on-errors',
dest='abort_on_errors', action='store_true',
help='Abort if a command exits unsuccessfully')
@ -151,17 +143,43 @@ without iterating through the remaining projects.
help='Silently skip & do not exit non-zero due missing '
'checkouts')
g = p.get_option_group('--quiet')
g = p.add_option_group('Output')
g.add_option('-p',
dest='project_header', action='store_true',
help='Show project headers before output')
p.add_option('--interactive',
action='store_true',
help='force interactive usage')
g.add_option('-v', '--verbose',
dest='verbose', action='store_true',
help='Show command error messages')
g.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', default=1,
help='number of commands to execute simultaneously')
def WantPager(self, opt):
return opt.project_header and opt.jobs == 1
def _SerializeProject(self, project):
""" Serialize a project._GitGetByExec instance.
project._GitGetByExec is not pickle-able. Instead of trying to pass it
around between processes, make a dict ourselves containing only the
attributes that we need.
"""
if not self.manifest.IsMirror:
lrev = project.GetRevisionId()
else:
lrev = None
return {
'name': project.name,
'relpath': project.relpath,
'remote_name': project.remote.name,
'lrev': lrev,
'rrev': project.revisionExpr,
'annotations': dict((a.name, a.value) for a in project.annotations),
'gitdir': project.gitdir,
'worktree': project.worktree,
}
def ValidateOptions(self, opt, args):
if not opt.command:
self.Usage()
@ -177,14 +195,9 @@ without iterating through the remaining projects.
cmd.append(cmd[0])
cmd.extend(opt.command[1:])
# Historically, forall operated interactively, and in serial. If the user
# has selected 1 job, then default to interacive mode.
if opt.jobs == 1:
opt.interactive = True
if opt.project_header \
and not shell \
and cmd[0] == 'git':
if opt.project_header \
and not shell \
and cmd[0] == 'git':
# If this is a direct git command that can enable colorized
# output and the user prefers coloring, add --color into the
# command line because we are going to wrap the command into
@ -207,7 +220,7 @@ without iterating through the remaining projects.
smart_sync_manifest_name = "smart_sync_override.xml"
smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, smart_sync_manifest_name)
self.manifest.manifestProject.worktree, smart_sync_manifest_name)
if os.path.isfile(smart_sync_manifest_path):
self.manifest.Override(smart_sync_manifest_path)
@ -221,50 +234,58 @@ without iterating through the remaining projects.
os.environ['REPO_COUNT'] = str(len(projects))
pool = multiprocessing.Pool(opt.jobs, InitWorker)
try:
config = self.manifest.manifestProject.config
with multiprocessing.Pool(opt.jobs, InitWorker) as pool:
results_it = pool.imap(
functools.partial(DoWorkWrapper, mirror, opt, cmd, shell, config),
enumerate(projects),
chunksize=WORKER_BATCH_SIZE)
first = True
for (r, output) in results_it:
if output:
if first:
first = False
elif opt.project_header:
print()
# To simplify the DoWorkWrapper, take care of automatic newlines.
end = '\n'
if output[-1] == '\n':
end = ''
print(output, end=end)
rc = rc or r
if r != 0 and opt.abort_on_errors:
raise Exception('Aborting due to previous error')
results_it = pool.imap(
DoWorkWrapper,
self.ProjectArgs(projects, mirror, opt, cmd, shell, config))
pool.close()
for r in results_it:
rc = rc or r
if r != 0 and opt.abort_on_errors:
raise Exception('Aborting due to previous error')
except (KeyboardInterrupt, WorkerKeyboardInterrupt):
# Catch KeyboardInterrupt raised inside and outside of workers
print('Interrupted - terminating the pool')
pool.terminate()
rc = rc or errno.EINTR
except Exception as e:
# Catch any other exceptions raised
print('forall: unhandled error, terminating the pool: %s: %s' %
(type(e).__name__, e),
print('Got an error, terminating the pool: %s: %s' %
(type(e).__name__, e),
file=sys.stderr)
pool.terminate()
rc = rc or getattr(e, 'errno', 1)
finally:
pool.join()
if rc != 0:
sys.exit(rc)
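
The new Forall execution above pushes every project through a multiprocessing.Pool via imap() with a functools.partial() worker, collects (return code, output) pairs, and aborts early when -e is given. A minimal sketch of that shape, with a stand-in run_in_project() instead of DoWorkWrapper/DoWork:

```python
import functools
import multiprocessing

def run_in_project(cmd, project):
    # Stand-in for DoWork(): pretend the command succeeds everywhere.
    return (0, '%s: ran %r\n' % (project, cmd))

def main():
    projects = ['build/make', 'system/core']
    rc = 0
    with multiprocessing.Pool(2) as pool:
        worker = functools.partial(run_in_project, ['git', 'status'])
        for ret, output in pool.imap(worker, projects, chunksize=4):
            if output:
                print(output, end='')
            rc = rc or ret
            if ret != 0:
                break                # rough analogue of --abort-on-errors
    return rc

if __name__ == '__main__':
    raise SystemExit(main())
```
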
def ProjectArgs(self, projects, mirror, opt, cmd, shell, config):
for cnt, p in enumerate(projects):
try:
project = self._SerializeProject(p)
except Exception as e:
print('Project list error on project %s: %s: %s' %
(p.name, type(e).__name__, e),
file=sys.stderr)
return
except KeyboardInterrupt:
print('Project list interrupted',
file=sys.stderr)
return
yield [mirror, opt, cmd, shell, cnt, config, project]
class WorkerKeyboardInterrupt(Exception):
""" Keyboard interrupt exception for worker processes. """
pass
def InitWorker():
signal.signal(signal.SIGINT, signal.SIG_IGN)
def DoWorkWrapper(mirror, opt, cmd, shell, config, args):
def DoWorkWrapper(args):
""" A wrapper around the DoWork() method.
Catch the KeyboardInterrupt exceptions here and re-raise them as a different,
@ -272,81 +293,109 @@ def DoWorkWrapper(mirror, opt, cmd, shell, config, args):
and making the parent hang indefinitely.
"""
cnt, project = args
project = args.pop()
try:
return DoWork(project, mirror, opt, cmd, shell, cnt, config)
return DoWork(project, *args)
except KeyboardInterrupt:
print('%s: Worker interrupted' % project.name)
print('%s: Worker interrupted' % project['name'])
raise WorkerKeyboardInterrupt()
def DoWork(project, mirror, opt, cmd, shell, cnt, config):
env = os.environ.copy()
def setenv(name, val):
if val is None:
val = ''
if hasattr(val, 'encode'):
val = val.encode()
env[name] = val
setenv('REPO_PROJECT', project.name)
setenv('REPO_PATH', project.relpath)
setenv('REPO_REMOTE', project.remote.name)
try:
# If we aren't in a fully synced state and we don't have the ref the manifest
# wants, then this will fail. Ignore it for the purposes of this code.
lrev = '' if mirror else project.GetRevisionId()
except ManifestInvalidRevisionError:
lrev = ''
setenv('REPO_LREV', lrev)
setenv('REPO_RREV', project.revisionExpr)
setenv('REPO_UPSTREAM', project.upstream)
setenv('REPO_DEST_BRANCH', project.dest_branch)
setenv('REPO_PROJECT', project['name'])
setenv('REPO_PATH', project['relpath'])
setenv('REPO_REMOTE', project['remote_name'])
setenv('REPO_LREV', project['lrev'])
setenv('REPO_RREV', project['rrev'])
setenv('REPO_I', str(cnt + 1))
for annotation in project.annotations:
setenv("REPO__%s" % (annotation.name), annotation.value)
for name in project['annotations']:
setenv("REPO__%s" % (name), project['annotations'][name])
if mirror:
setenv('GIT_DIR', project.gitdir)
cwd = project.gitdir
setenv('GIT_DIR', project['gitdir'])
cwd = project['gitdir']
else:
cwd = project.worktree
cwd = project['worktree']
if not os.path.exists(cwd):
# Allow the user to silently ignore missing checkouts so they can run on
# partial checkouts (good for infra recovery tools).
if opt.ignore_missing:
return (0, '')
output = ''
return 0
if ((opt.project_header and opt.verbose)
or not opt.project_header):
output = 'skipping %s/' % project.relpath
return (1, output)
or not opt.project_header):
print('skipping %s/' % project['relpath'], file=sys.stderr)
return 1
if opt.verbose:
stderr = subprocess.STDOUT
else:
stderr = subprocess.DEVNULL
stdin = None if opt.interactive else subprocess.DEVNULL
result = subprocess.run(
cmd, cwd=cwd, shell=shell, env=env, check=False,
encoding='utf-8', errors='replace',
stdin=stdin, stdout=subprocess.PIPE, stderr=stderr)
output = result.stdout
if opt.project_header:
if output:
buf = io.StringIO()
out = ForallColoring(config)
out.redirect(buf)
if mirror:
project_header_path = project.name
else:
project_header_path = project.relpath
out.project('project %s/' % project_header_path)
out.nl()
buf.write(output)
output = buf.getvalue()
return (result.returncode, output)
stdin = subprocess.PIPE
stdout = subprocess.PIPE
stderr = subprocess.PIPE
else:
stdin = None
stdout = None
stderr = None
p = subprocess.Popen(cmd,
cwd=cwd,
shell=shell,
env=env,
stdin=stdin,
stdout=stdout,
stderr=stderr)
if opt.project_header:
out = ForallColoring(config)
out.redirect(sys.stdout)
empty = True
errbuf = ''
p.stdin.close()
s_in = platform_utils.FileDescriptorStreams.create()
s_in.add(p.stdout, sys.stdout, 'stdout')
s_in.add(p.stderr, sys.stderr, 'stderr')
while not s_in.is_done:
in_ready = s_in.select()
for s in in_ready:
buf = s.read().decode()
if not buf:
s.close()
s_in.remove(s)
continue
if not opt.verbose:
if s.std_name == 'stderr':
errbuf += buf
continue
if empty and out:
if not cnt == 0:
out.nl()
if mirror:
project_header_path = project['name']
else:
project_header_path = project['relpath']
out.project('project %s/', project_header_path)
out.nl()
out.flush()
if errbuf:
sys.stderr.write(errbuf)
sys.stderr.flush()
errbuf = ''
empty = False
s.dest.write(buf)
s.dest.flush()
r = p.wait()
return r
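
The rewritten DoWork shown earlier in this file replaces the Popen/FileDescriptorStreams select loop immediately above with a single subprocess.run() call that decodes output as UTF-8 with errors='replace'. A sketch of that capture style:

```python
import subprocess

result = subprocess.run(
    ['git', 'rev-parse', '--short', 'HEAD'],
    cwd='.', check=False,
    encoding='utf-8', errors='replace',     # decode for us, never crash on bytes
    stdin=subprocess.DEVNULL,
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print(result.stdout, end='')
print('exit code: %d' % result.returncode)
```
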

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,11 +14,15 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import sys
from command import Command, GitcClientCommand
import platform_utils
from pyversion import is_python3
if not is_python3():
input = raw_input
class GitcDelete(Command, GitcClientCommand):
common = True

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import os
import sys
@ -47,17 +50,23 @@ use for this GITC client.
"""
def _Options(self, p):
super()._Options(p, gitc_init=True)
super(GitcInit, self)._Options(p, gitc_init=True)
g = p.add_option_group('GITC options')
g.add_option('-f', '--manifest-file',
dest='manifest_file',
help='Optional manifest file to use for this GITC client.')
g.add_option('-c', '--gitc-client',
dest='gitc_client',
help='The name of the gitc_client instance to create or modify.')
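
The GitcInit hunks above also switch from the explicit super(GitcInit, self) spelling to the zero-argument super() form, which only works on Python 3. A toy illustration of the two spellings:

```python
class Base:
    def options(self, parser):
        print('base options for %s' % parser)

class Child(Base):
    def options(self, parser):
        super().options(parser)                # Python 3 only
        # super(Child, self).options(parser)   # works on Python 2 and 3

Child().options('parser')
```
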
def Execute(self, opt, args):
gitc_client = gitc_utils.parse_clientdir(os.getcwd())
if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client):
print('fatal: Please update your repo command. See go/gitc for instructions.',
file=sys.stderr)
print('fatal: Please update your repo command. See go/gitc for instructions.', file=sys.stderr)
sys.exit(1)
self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client)
super().Execute(opt, args)
super(GitcInit, self).Execute(opt, args)
manifest_file = self.manifest.manifestFile
if opt.manifest_file:

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,14 +14,14 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import functools
from __future__ import print_function
import sys
from color import Coloring
from command import DEFAULT_LOCAL_JOBS, PagedCommand
from command import PagedCommand
from error import GitError
from git_command import GitCommand
from git_command import git_require, GitCommand
class GrepColoring(Coloring):
def __init__(self, config):
@ -27,7 +29,6 @@ class GrepColoring(Coloring):
self.project = self.printer('project', attr='bold')
self.fail = self.printer('fail', fg='red')
class Grep(PagedCommand):
common = True
helpSummary = "Print lines matching a pattern"
@ -62,33 +63,30 @@ contain a line that matches both expressions:
repo grep --all-match -e NODE -e Unexpected
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
@staticmethod
def _carry_option(_option, opt_str, value, parser):
pt = getattr(parser.values, 'cmd_argv', None)
if pt is None:
pt = []
setattr(parser.values, 'cmd_argv', pt)
if opt_str == '-(':
pt.append('(')
elif opt_str == '-)':
pt.append(')')
else:
pt.append(opt_str)
if value is not None:
pt.append(value)
def _CommonOptions(self, p):
"""Override common options slightly."""
super()._CommonOptions(p, opt_v=False)
def _Options(self, p):
def carry(option,
opt_str,
value,
parser):
pt = getattr(parser.values, 'cmd_argv', None)
if pt is None:
pt = []
setattr(parser.values, 'cmd_argv', pt)
if opt_str == '-(':
pt.append('(')
elif opt_str == '-)':
pt.append(')')
else:
pt.append(opt_str)
if value is not None:
pt.append(value)
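
Both the new _carry_option staticmethod and the old nested carry() above rely on the same optparse callback trick: every recognized flag (and its value, if any) is copied verbatim into parser.values.cmd_argv so it can later be replayed to `git grep`. A self-contained sketch:

```python
import optparse

def carry(option, opt_str, value, parser):
    # Append the flag exactly as the user typed it, plus its value if any.
    argv = getattr(parser.values, 'cmd_argv', None)
    if argv is None:
        argv = []
        setattr(parser.values, 'cmd_argv', argv)
    argv.append(opt_str)
    if value is not None:
        argv.append(value)

p = optparse.OptionParser()
p.add_option('-i', '--ignore-case', action='callback', callback=carry)
p.add_option('-e', action='callback', callback=carry, type='str')
opts, _ = p.parse_args(['-i', '-e', 'TODO'])
print(opts.cmd_argv)    # ['-i', '-e', 'TODO']
```
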
g = p.add_option_group('Sources')
g.add_option('--cached',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Search the index, instead of the work tree')
g.add_option('-r', '--revision',
dest='revision', action='append', metavar='TREEish',
@ -96,139 +94,74 @@ contain a line that matches both expressions:
g = p.add_option_group('Pattern')
g.add_option('-e',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
metavar='PATTERN', type='str',
help='Pattern to search for')
g.add_option('-i', '--ignore-case',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Ignore case differences')
g.add_option('-a', '--text',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help="Process binary files as if they were text")
g.add_option('-I',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help="Don't match the pattern in binary files")
g.add_option('-w', '--word-regexp',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Match the pattern only at word boundaries')
g.add_option('-v', '--invert-match',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Select non-matching lines')
g.add_option('-G', '--basic-regexp',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Use POSIX basic regexp for patterns (default)')
g.add_option('-E', '--extended-regexp',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Use POSIX extended regexp for patterns')
g.add_option('-F', '--fixed-strings',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Use fixed strings (not regexp) for pattern')
g = p.add_option_group('Pattern Grouping')
g.add_option('--all-match',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Limit match to lines that have all patterns')
g.add_option('--and', '--or', '--not',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Boolean operators to combine patterns')
g.add_option('-(', '-)',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Boolean operator grouping')
g = p.add_option_group('Output')
g.add_option('-n',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Prefix the line number to matching lines')
g.add_option('-C',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
metavar='CONTEXT', type='str',
help='Show CONTEXT lines around match')
g.add_option('-B',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
metavar='CONTEXT', type='str',
help='Show CONTEXT lines before match')
g.add_option('-A',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
metavar='CONTEXT', type='str',
help='Show CONTEXT lines after match')
g.add_option('-l', '--name-only', '--files-with-matches',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Show only file names containing matching lines')
g.add_option('-L', '--files-without-match',
action='callback', callback=self._carry_option,
action='callback', callback=carry,
help='Show only file names not containing matching lines')
def _ExecuteOne(self, cmd_argv, project):
"""Process one project."""
try:
p = GitCommand(project,
cmd_argv,
bare=False,
capture_stdout=True,
capture_stderr=True)
except GitError as e:
return (project, -1, None, str(e))
return (project, p.Wait(), p.stdout, p.stderr)
@staticmethod
def _ProcessResults(full_name, have_rev, _pool, out, results):
git_failed = False
bad_rev = False
have_match = False
for project, rc, stdout, stderr in results:
if rc < 0:
git_failed = True
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', stderr)
out.nl()
continue
if rc:
# no results
if stderr:
if have_rev and 'fatal: ambiguous argument' in stderr:
bad_rev = True
else:
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', stderr.strip())
out.nl()
continue
have_match = True
# We cut the last element, to avoid a blank line.
r = stdout.split('\n')
r = r[0:-1]
if have_rev and full_name:
for line in r:
rev, line = line.split(':', 1)
out.write("%s", rev)
out.write(':')
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
elif full_name:
for line in r:
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
else:
for line in r:
print(line)
return (git_failed, bad_rev, have_match)
def Execute(self, opt, args):
out = GrepColoring(self.manifest.manifestProject.config)
cmd_argv = ['grep']
if out.is_on:
if out.is_on and git_require((1, 6, 3)):
cmd_argv.append('--color')
cmd_argv.extend(getattr(opt, 'cmd_argv', []))
@ -255,13 +188,62 @@ contain a line that matches both expressions:
cmd_argv.extend(opt.revision)
cmd_argv.append('--')
git_failed, bad_rev, have_match = self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, cmd_argv),
projects,
callback=functools.partial(self._ProcessResults, full_name, have_rev),
output=out,
ordered=True)
git_failed = False
bad_rev = False
have_match = False
for project in projects:
try:
p = GitCommand(project,
cmd_argv,
bare=False,
capture_stdout=True,
capture_stderr=True)
except GitError as e:
git_failed = True
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', str(e))
out.nl()
continue
if p.Wait() != 0:
# no results
#
if p.stderr:
if have_rev and 'fatal: ambiguous argument' in p.stderr:
bad_rev = True
else:
out.project('--- project %s ---' % project.relpath)
out.nl()
out.fail('%s', p.stderr.strip())
out.nl()
continue
have_match = True
# We cut the last element, to avoid a blank line.
#
r = p.stdout.split('\n')
r = r[0:-1]
if have_rev and full_name:
for line in r:
rev, line = line.split(':', 1)
out.write("%s", rev)
out.write(':')
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
elif full_name:
for line in r:
out.project(project.relpath)
out.write('/')
out.write("%s", line)
out.nl()
else:
for line in r:
print(line)
if git_failed:
sys.exit(1)

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,16 +14,15 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import re
import sys
import textwrap
from formatter import AbstractFormatter, DumbWriter
from subcmds import all_commands
from color import Coloring
from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand
import gitc_utils
class Help(PagedCommand, MirrorSafeCommand):
common = False
helpSummary = "Display detailed help on a command"
@ -40,7 +41,7 @@ Displays detailed usage information about a command.
fmt = ' %%-%ds %%s' % maxlen
for name in commandNames:
command = all_commands[name]()
command = self.commands[name]
try:
summary = command.helpSummary.strip()
except AttributeError:
@ -50,7 +51,7 @@ Displays detailed usage information about a command.
def _PrintAllCommands(self):
print('usage: repo COMMAND [ARGS]')
print('The complete list of recognized repo commands are:')
commandNames = list(sorted(all_commands))
commandNames = list(sorted(self.commands))
self._PrintCommands(commandNames)
print("See 'repo help <command>' for more information on a "
'specific command.')
@ -62,7 +63,7 @@ Displays detailed usage information about a command.
def gitc_supported(cmd):
if not isinstance(cmd, GitcAvailableCommand) and not isinstance(cmd, GitcClientCommand):
return True
if self.client.isGitcClient:
if self.manifest.isGitcClient:
return True
if isinstance(cmd, GitcClientCommand):
return False
@ -71,20 +72,21 @@ Displays detailed usage information about a command.
return False
commandNames = list(sorted([name
for name, command in all_commands.items()
if command.common and gitc_supported(command)]))
for name, command in self.commands.items()
if command.common and gitc_supported(command)]))
self._PrintCommands(commandNames)
print(
"See 'repo help <command>' for more information on a specific command.\n"
"See 'repo help --all' for a complete list of recognized commands.")
"See 'repo help <command>' for more information on a specific command.\n"
"See 'repo help --all' for a complete list of recognized commands.")
def _PrintCommandHelp(self, cmd, header_prefix=''):
class _Out(Coloring):
def __init__(self, gc):
Coloring.__init__(self, gc, 'help')
self.heading = self.printer('heading', attr='bold')
self._first = True
self.wrap = AbstractFormatter(DumbWriter())
def _PrintSection(self, heading, bodyAttr):
try:
@ -94,9 +96,7 @@ Displays detailed usage information about a command.
if body == '' or body is None:
return
if not self._first:
self.nl()
self._first = False
self.nl()
self.heading('%s%s', header_prefix, heading)
self.nl()
@ -106,8 +106,7 @@ Displays detailed usage information about a command.
body = body.strip()
body = body.replace('%prog', me)
# Extract the title, but skip any trailing {#anchors}.
asciidoc_hdr = re.compile(r'^\n?#+ ([^{]+)(\{#.+\})?$')
asciidoc_hdr = re.compile(r'^\n?#+ (.+)$')
for para in body.split("\n\n"):
if para.startswith(' '):
self.write('%s', para)
@ -122,21 +121,18 @@ Displays detailed usage information about a command.
self.nl()
continue
lines = textwrap.wrap(para.replace(' ', ' '), width=80,
break_long_words=False, break_on_hyphens=False)
for line in lines:
self.write('%s', line)
self.nl()
self.nl()
self.wrap.add_flowing_data(para)
self.wrap.end_paragraph(1)
self.wrap.end_paragraph(0)
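
The Help changes above drop the long-deprecated formatter.AbstractFormatter/DumbWriter pair (the formatter module was removed in Python 3.10) in favor of textwrap for flowing help paragraphs. A sketch of the replacement call:

```python
import textwrap

para = ('Displays detailed usage information about a command, wrapping '
        'prose to an 80 column terminal without breaking long words or '
        'splitting on hyphens.')
for line in textwrap.wrap(para, width=80,
                          break_long_words=False, break_on_hyphens=False):
    print(line)
```
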
out = _Out(self.client.globalConfig)
out = _Out(self.manifest.globalConfig)
out._PrintSection('Summary', 'helpSummary')
cmd.OptionParser.print_help()
out._PrintSection('Description', 'helpDescription')
def _PrintAllCommandHelp(self):
for name in sorted(all_commands):
cmd = all_commands[name]()
for name in sorted(self.commands):
cmd = self.commands[name]
cmd.manifest = self.manifest
self._PrintCommandHelp(cmd, header_prefix='[%s] ' % (name,))
@ -161,7 +157,7 @@ Displays detailed usage information about a command.
name = args[0]
try:
cmd = all_commands[name]()
cmd = self.commands[name]
except KeyError:
print("repo: '%s' is not a repo command." % name, file=sys.stderr)
sys.exit(1)

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2012 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,14 +16,12 @@
from command import PagedCommand
from color import Coloring
from git_refs import R_M, R_HEADS
from git_refs import R_M
class _Coloring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, "status")
class Info(PagedCommand):
common = True
helpSummary = "Get info on the manifest branch, current branch or unmerged branches"
@ -41,14 +41,15 @@ class Info(PagedCommand):
dest="local", action="store_true",
help="Disable all remote operations")
def Execute(self, opt, args):
self.out = _Coloring(self.client.globalConfig)
self.heading = self.out.printer('heading', attr='bold')
self.headtext = self.out.nofmt_printer('headtext', fg='yellow')
self.redtext = self.out.printer('redtext', fg='red')
self.sha = self.out.printer("sha", fg='yellow')
self.out = _Coloring(self.manifest.globalConfig)
self.heading = self.out.printer('heading', attr = 'bold')
self.headtext = self.out.nofmt_printer('headtext', fg = 'yellow')
self.redtext = self.out.printer('redtext', fg = 'red')
self.sha = self.out.printer("sha", fg = 'yellow')
self.text = self.out.nofmt_printer('text')
self.dimtext = self.out.printer('dimtext', attr='dim')
self.dimtext = self.out.printer('dimtext', attr = 'dim')
self.opt = opt
@ -121,14 +122,11 @@ class Info(PagedCommand):
self.printSeparator()
def findRemoteLocalDiff(self, project):
# Fetch all the latest commits.
#Fetch all the latest commits
if not self.opt.local:
project.Sync_NetworkHalf(quiet=True, current_branch_only=True)
branch = self.manifest.manifestProject.config.GetBranch('default').merge
if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
logTarget = R_M + branch
logTarget = R_M + self.manifest.manifestProject.config.GetBranch("default").merge
bareTmp = project.bare_git._bare
project.bare_git._bare = False
@ -197,16 +195,16 @@ class Info(PagedCommand):
commits = branch.commits
date = branch.date
self.text('%s %-33s (%2d commit%s, %s)' % (
branch.name == project.CurrentBranch and '*' or ' ',
branch.name,
len(commits),
len(commits) != 1 and 's' or '',
date))
branch.name == project.CurrentBranch and '*' or ' ',
branch.name,
len(commits),
len(commits) != 1 and 's' or '',
date))
self.out.nl()
for commit in commits:
split = commit.split()
self.text('{0:38}{1} '.format('', '-'))
self.text('{0:38}{1} '.format('','-'))
self.sha(split[0] + " ")
self.text(" ".join(split[1:]))
self.out.nl()
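
The findRemoteLocalDiff change above strips a leading refs/heads/ (R_HEADS) from the configured merge branch before building the refs/remotes/m/ (R_M) log target, rather than concatenating the raw config value. A sketch with the two constants inlined (their values as defined in git_refs):

```python
R_HEADS = 'refs/heads/'
R_M = 'refs/remotes/m/'

branch = 'refs/heads/main'       # e.g. branch.default.merge from the config
if branch.startswith(R_HEADS):
    branch = branch[len(R_HEADS):]
log_target = R_M + branch
print(log_target)                # refs/remotes/m/main
```
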

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,29 +14,34 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import optparse
from __future__ import print_function
import os
import platform
import re
import sys
import urllib.parse
from pyversion import is_python3
if is_python3():
import urllib.parse
else:
import imp
import urlparse
urllib = imp.new_module('urllib')
urllib.parse = urlparse
from color import Coloring
from command import InteractiveCommand, MirrorSafeCommand
from error import ManifestParseError
from project import SyncBuffer
from git_config import GitConfig
from git_command import git_require, MIN_GIT_VERSION_SOFT, MIN_GIT_VERSION_HARD
import git_superproject
from git_command import git_require, MIN_GIT_VERSION
import platform_utils
from wrapper import Wrapper
class Init(InteractiveCommand, MirrorSafeCommand):
common = True
helpSummary = "Initialize a repo client checkout in the current directory"
helpSummary = "Initialize repo in the current directory"
helpUsage = """
%prog [options] [manifest url]
%prog [options]
"""
helpDescription = """
The '%prog' command is run once to install and initialize repo.
@ -42,13 +49,8 @@ The latest repo source code and manifest collection is downloaded
from the server and is installed in the .repo/ directory in the
current working directory.
When creating a new checkout, the manifest URL is the only required setting.
It may be specified using the --manifest-url option, or as the first optional
argument.
The optional -b argument can be used to select the manifest branch
to checkout and use. If no branch is specified, the remote's default
branch is used. This is equivalent to using -b HEAD.
to checkout and use. If no branch is specified, master is assumed.
The optional -m argument can be used to specify an alternate manifest
to be used. If no manifest is specified, the manifest default.xml
@ -79,41 +81,109 @@ manifest, a subsequent `repo sync` (or `repo sync -d`) is necessary
to update the working directory files.
"""
def _CommonOptions(self, p):
"""Disable due to re-use of Wrapper()."""
def _Options(self, p, gitc_init=False):
Wrapper().InitParser(p, gitc_init=gitc_init)
# Logging
g = p.add_option_group('Logging options')
g.add_option('-q', '--quiet',
dest="quiet", action="store_true", default=False,
help="be quiet")
# Manifest
g = p.add_option_group('Manifest options')
g.add_option('-u', '--manifest-url',
dest='manifest_url',
help='manifest repository location', metavar='URL')
g.add_option('-b', '--manifest-branch',
dest='manifest_branch',
help='manifest branch or revision', metavar='REVISION')
cbr_opts = ['--current-branch']
# The gitc-init subcommand allocates -c itself, but a lot of init users
# want -c, so try to satisfy both as best we can.
if not gitc_init:
cbr_opts += ['-c']
g.add_option(*cbr_opts,
dest='current_branch_only', action='store_true',
help='fetch only current manifest branch from server')
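
The cbr_opts block above always registers --current-branch but only adds the -c shorthand when the parser is not being built for gitc-init, since gitc-init uses -c for its own --gitc-client option. A sketch of that conditional registration:

```python
import optparse

def build_parser(gitc_init=False):
    p = optparse.OptionParser()
    cbr_opts = ['--current-branch']
    if not gitc_init:
        cbr_opts += ['-c']          # gitc-init keeps -c for --gitc-client
    p.add_option(*cbr_opts,
                 dest='current_branch_only', action='store_true',
                 help='fetch only current manifest branch from server')
    return p

print(build_parser(gitc_init=False).parse_args(['-c'])[0].current_branch_only)
print(build_parser(gitc_init=True).parse_args(['--current-branch'])[0].current_branch_only)
```
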
g.add_option('-m', '--manifest-name',
dest='manifest_name', default='default.xml',
help='initial manifest file', metavar='NAME.xml')
g.add_option('--mirror',
dest='mirror', action='store_true',
help='create a replica of the remote repositories '
'rather than a client working directory')
g.add_option('--reference',
dest='reference',
help='location of mirror directory', metavar='DIR')
g.add_option('--dissociate',
dest='dissociate', action='store_true',
help='dissociate from reference mirrors after clone')
g.add_option('--depth', type='int', default=None,
dest='depth',
help='create a shallow clone with given depth; see git clone')
g.add_option('--partial-clone', action='store_true',
dest='partial_clone',
help='perform partial clone (https://git-scm.com/'
'docs/gitrepository-layout#_code_partialclone_code)')
g.add_option('--clone-filter', action='store', default='blob:none',
dest='clone_filter',
help='filter for use with --partial-clone [default: %default]')
g.add_option('--archive',
dest='archive', action='store_true',
help='checkout an archive instead of a git repository for '
'each project. See git archive.')
g.add_option('--submodules',
dest='submodules', action='store_true',
help='sync any submodules associated with the manifest repo')
g.add_option('-g', '--groups',
dest='groups', default='default',
help='restrict manifest projects to ones with specified '
'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]',
metavar='GROUP')
g.add_option('-p', '--platform',
dest='platform', default='auto',
help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM')
g.add_option('--no-clone-bundle',
dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS')
g.add_option('--no-tags',
dest='no_tags', action='store_true',
help="don't fetch tags in the manifest")
# Tool
g = p.add_option_group('repo Version options')
g.add_option('--repo-url',
dest='repo_url',
help='repo repository location', metavar='URL')
g.add_option('--repo-branch',
dest='repo_branch',
help='repo branch or revision', metavar='REVISION')
g.add_option('--no-repo-verify',
dest='no_repo_verify', action='store_true',
help='do not verify repo source code')
# Other
g = p.add_option_group('Other options')
g.add_option('--config-name',
dest='config_name', action="store_true", default=False,
help='Always prompt for name/e-mail')
def _RegisteredEnvironmentOptions(self):
return {'REPO_MANIFEST_URL': 'manifest_url',
'REPO_MIRROR_LOCATION': 'reference'}
def _CloneSuperproject(self, opt):
"""Clone the superproject based on the superproject's url and branch.
Args:
opt: Program options returned from optparse. See _Options().
"""
superproject = git_superproject.Superproject(self.manifest,
self.repodir,
quiet=opt.quiet)
if not superproject.Sync():
print('error: git update of superproject failed', file=sys.stderr)
sys.exit(1)
def _SyncManifest(self, opt):
m = self.manifest.manifestProject
is_new = not m.Exists
if is_new:
if not opt.manifest_url:
print('fatal: manifest url is required.', file=sys.stderr)
print('fatal: manifest url (-u) is required.', file=sys.stderr)
sys.exit(1)
if not opt.quiet:
print('Downloading manifest from %s' %
(GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),),
print('Get %s' % GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),
file=sys.stderr)
# The manifest project object doesn't keep track of the path on the
@ -130,38 +200,30 @@ to update the working directory files.
m._InitGitDir(mirror_git=mirrored_manifest_git)
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
m.revisionExpr = 'refs/heads/master'
else:
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
m.PreSync()
self._ConfigureDepth(opt)
# Set the remote URL before the remote branch as we might need it below.
if opt.manifest_url:
r = m.GetRemote(m.remote.name)
r.url = opt.manifest_url
r.ResetFetch()
r.Save()
if opt.manifest_branch:
if opt.manifest_branch == 'HEAD':
opt.manifest_branch = m.ResolveRemoteHead()
if opt.manifest_branch is None:
print('fatal: unable to resolve HEAD', file=sys.stderr)
sys.exit(1)
m.revisionExpr = opt.manifest_branch
else:
if is_new:
default_branch = m.ResolveRemoteHead()
if default_branch is None:
# If the remote doesn't have HEAD configured, default to master.
default_branch = 'refs/heads/master'
m.revisionExpr = default_branch
else:
m.PreSync()
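
The branch-selection block above resolves '-b HEAD' (and brand-new checkouts with no -b at all) through ResolveRemoteHead(), falling back to refs/heads/master when the remote does not advertise a HEAD, while existing checkouts keep whatever PreSync() finds. A compact sketch of that decision tree, with a stand-in resolve_remote_head() in place of repo's method:

```python
def pick_revision(manifest_branch, is_new, resolve_remote_head):
    if manifest_branch:
        if manifest_branch == 'HEAD':
            manifest_branch = resolve_remote_head()
            if manifest_branch is None:
                raise SystemExit('fatal: unable to resolve HEAD')
        return manifest_branch
    if is_new:
        # Remote has no HEAD configured: default to master.
        return resolve_remote_head() or 'refs/heads/master'
    return None   # existing checkout: keep the branch PreSync() selects

print(pick_revision('HEAD', True, lambda: 'refs/heads/main'))
print(pick_revision(None, True, lambda: None))
```
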
groups = re.split(r'[,\s]+', opt.groups)
all_platforms = ['linux', 'darwin', 'windows']
platformize = lambda x: 'platform-' + x
if opt.platform == 'auto':
if (not opt.mirror and
not m.config.GetString('repo.mirror') == 'true'):
not m.config.GetString('repo.mirror') == 'true'):
groups.append(platformize(platform.system().lower()))
elif opt.platform == 'all':
groups.extend(map(platformize, all_platforms))
@ -173,7 +235,7 @@ to update the working directory files.
groups = [x for x in groups if x]
groupstr = ','.join(groups)
if opt.platform == 'auto' and groupstr == self.manifest.GetDefaultGroupsStr():
if opt.platform == 'auto' and groupstr == 'default,platform-' + platform.system().lower():
groupstr = None
m.config.SetString('manifest.groups', groupstr)
@ -181,25 +243,11 @@ to update the working directory files.
m.config.SetString('repo.reference', opt.reference)
if opt.dissociate:
m.config.SetBoolean('repo.dissociate', opt.dissociate)
if opt.worktree:
if opt.mirror:
print('fatal: --mirror and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
if opt.submodules:
print('fatal: --submodules and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
m.config.SetBoolean('repo.worktree', opt.worktree)
if is_new:
m.use_git_worktrees = True
print('warning: --worktree is experimental!', file=sys.stderr)
m.config.SetString('repo.dissociate', 'true')
if opt.archive:
if is_new:
m.config.SetBoolean('repo.archive', opt.archive)
m.config.SetString('repo.archive', 'true')
else:
print('fatal: --archive is only supported when initializing a new '
'workspace.', file=sys.stderr)
@ -209,7 +257,7 @@ to update the working directory files.
if opt.mirror:
if is_new:
m.config.SetBoolean('repo.mirror', opt.mirror)
m.config.SetString('repo.mirror', 'true')
else:
print('fatal: --mirror is only supported when initializing a new '
'workspace.', file=sys.stderr)
@ -217,39 +265,25 @@ to update the working directory files.
'in another location.', file=sys.stderr)
sys.exit(1)
if opt.partial_clone is not None:
if opt.partial_clone:
if opt.mirror:
print('fatal: --mirror and --partial-clone are mutually exclusive',
file=sys.stderr)
sys.exit(1)
m.config.SetBoolean('repo.partialclone', opt.partial_clone)
m.config.SetString('repo.partialclone', 'true')
if opt.clone_filter:
m.config.SetString('repo.clonefilter', opt.clone_filter)
elif m.config.GetBoolean('repo.partialclone'):
opt.clone_filter = m.config.GetString('repo.clonefilter')
else:
opt.clone_filter = None
if opt.partial_clone_exclude is not None:
m.config.SetString('repo.partialcloneexclude', opt.partial_clone_exclude)
if opt.clone_bundle is None:
opt.clone_bundle = False if opt.partial_clone else True
else:
m.config.SetBoolean('repo.clonebundle', opt.clone_bundle)
if opt.submodules:
m.config.SetBoolean('repo.submodules', opt.submodules)
m.config.SetString('repo.submodules', 'true')
if opt.use_superproject is not None:
m.config.SetBoolean('repo.superproject', opt.use_superproject)
if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet, verbose=opt.verbose,
clone_bundle=opt.clone_bundle,
current_branch_only=opt.current_branch_only,
tags=opt.tags, submodules=opt.submodules,
clone_filter=opt.clone_filter,
partial_clone_exclude=self.manifest.PartialCloneExclude):
if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet,
clone_bundle=not opt.no_clone_bundle,
current_branch_only=opt.current_branch_only,
no_tags=opt.no_tags, submodules=opt.submodules,
clone_filter=opt.clone_filter):
r = m.GetRemote(m.remote.name)
print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
@ -292,8 +326,8 @@ to update the working directory files.
return value
return a
def _ShouldConfigureUser(self, opt):
gc = self.client.globalConfig
def _ShouldConfigureUser(self):
gc = self.manifest.globalConfig
mp = self.manifest.manifestProject
# If we don't have local settings, get from global.
@ -304,24 +338,21 @@ to update the working directory files.
mp.config.SetString('user.name', gc.GetString('user.name'))
mp.config.SetString('user.email', gc.GetString('user.email'))
if not opt.quiet:
print()
print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
mp.config.GetString('user.email')))
print("If you want to change this, please re-run 'repo init' with --config-name")
print()
print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
mp.config.GetString('user.email')))
print('If you want to change this, please re-run \'repo init\' with --config-name')
return False
def _ConfigureUser(self, opt):
def _ConfigureUser(self):
mp = self.manifest.manifestProject
while True:
if not opt.quiet:
print()
name = self._Prompt('Your Name', mp.UserName)
print()
name = self._Prompt('Your Name', mp.UserName)
email = self._Prompt('Your Email', mp.UserEmail)
if not opt.quiet:
print()
print()
print('Your identity is: %s <%s>' % (name, email))
print('is this correct [y/N]? ', end='')
# TODO: When we require Python 3, use flush=True w/print above.
@ -342,7 +373,7 @@ to update the working directory files.
return False
def _ConfigureColor(self):
gc = self.client.globalConfig
gc = self.manifest.globalConfig
if self._HasColorSet(gc):
return
@ -393,16 +424,15 @@ to update the working directory files.
# We store the depth in the main manifest project.
self.manifest.manifestProject.config.SetString('repo.depth', depth)
def _DisplayResult(self, opt):
def _DisplayResult(self):
if self.manifest.IsMirror:
init_type = 'mirror '
else:
init_type = ''
if not opt.quiet:
print()
print('repo %shas been initialized in %s' %
(init_type, self.manifest.topdir))
print()
print('repo %shas been initialized in %s'
% (init_type, self.manifest.topdir))
current_dir = os.getcwd()
if current_dir != self.manifest.topdir:
@ -420,56 +450,15 @@ to update the working directory files.
if opt.archive and opt.mirror:
self.OptionParser.error('--mirror and --archive cannot be used together.')
if args:
if opt.manifest_url:
self.OptionParser.error(
'--manifest-url option and URL argument both specified: only use '
'one to select the manifest URL.')
opt.manifest_url = args.pop(0)
if args:
self.OptionParser.error('too many arguments to init')
def Execute(self, opt, args):
git_require(MIN_GIT_VERSION_HARD, fail=True)
if not git_require(MIN_GIT_VERSION_SOFT):
print('repo: warning: git-%s+ will soon be required; please upgrade your '
'version of git to maintain support.'
% ('.'.join(str(x) for x in MIN_GIT_VERSION_SOFT),),
file=sys.stderr)
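
Execute() above now distinguishes a hard git version floor (fail immediately) from a soft one (warn that support will be dropped). A sketch of such a two-level gate; the version tuples below are illustrative, not repo's actual MIN_GIT_VERSION_SOFT/HARD values:

```python
MIN_GIT_VERSION_HARD = (1, 7, 2)   # illustrative
MIN_GIT_VERSION_SOFT = (1, 9, 1)   # illustrative

def check_git(current):
    if current < MIN_GIT_VERSION_HARD:
        raise SystemExit('fatal: git %s or later required' %
                         '.'.join(str(x) for x in MIN_GIT_VERSION_HARD))
    if current < MIN_GIT_VERSION_SOFT:
        print('repo: warning: git-%s+ will soon be required; please upgrade'
              % '.'.join(str(x) for x in MIN_GIT_VERSION_SOFT))

check_git((1, 8, 0))   # passes the hard floor, prints the soft warning
```
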
rp = self.manifest.repoProject
# Handle new --repo-url requests.
if opt.repo_url:
remote = rp.GetRemote('origin')
remote.url = opt.repo_url
remote.Save()
# Handle new --repo-rev requests.
if opt.repo_rev:
wrapper = Wrapper()
remote_ref, rev = wrapper.check_repo_rev(
rp.gitdir, opt.repo_rev, repo_verify=opt.repo_verify, quiet=opt.quiet)
branch = rp.GetBranch('default')
branch.merge = remote_ref
rp.work_git.reset('--hard', rev)
branch.Save()
if opt.worktree:
# Older versions of git supported worktree, but had dangerous gc bugs.
git_require((2, 15, 0), fail=True, msg='git gc worktree corruption')
git_require(MIN_GIT_VERSION, fail=True)
self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name)
if self.manifest.manifestProject.config.GetBoolean('repo.superproject'):
self._CloneSuperproject(opt)
if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
if opt.config_name or self._ShouldConfigureUser(opt):
self._ConfigureUser(opt)
if opt.config_name or self._ShouldConfigureUser():
self._ConfigureUser()
self._ConfigureColor()
self._DisplayResult(opt)
self._DisplayResult()

View File

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2011 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,24 +14,21 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from command import Command, MirrorSafeCommand
from __future__ import print_function
import sys
from command import Command, MirrorSafeCommand
class List(Command, MirrorSafeCommand):
common = True
helpSummary = "List projects and their associated directories"
helpUsage = """
%prog [-f] [<project>...]
%prog [-f] -r str1 [str2]...
%prog [-f] -r str1 [str2]..."
"""
helpDescription = """
List all projects; pass '.' to list the project for the cwd.
By default, only projects that currently exist in the checkout are shown. If
you want to list all projects (using the specified filter settings), use the
--all option. If you want to show all projects regardless of the manifest
groups, then also pass --groups all.
This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
"""
@ -40,9 +39,6 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
p.add_option('-g', '--groups',
dest='groups',
help="Filter the project list based on the groups the project is in")
p.add_option('-a', '--all',
action='store_true',
help='Show projects regardless of checkout state')
p.add_option('-f', '--fullpath',
dest='fullpath', action='store_true',
help="Display the full work tree path instead of the relative path")
@ -69,7 +65,7 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
args: Positional args. Can be a list of projects to list, or empty.
"""
if not opt.regex:
projects = self.GetProjects(args, groups=opt.groups, missing_ok=opt.all)
projects = self.GetProjects(args, groups=opt.groups)
else:
projects = self.FindProjects(args)
@ -81,12 +77,11 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
lines = []
for project in projects:
if opt.name_only and not opt.path_only:
lines.append("%s" % (project.name))
lines.append("%s" % ( project.name))
elif opt.path_only and not opt.name_only:
lines.append("%s" % (_getpath(project)))
else:
lines.append("%s : %s" % (_getpath(project), project.name))
if lines:
lines.sort()
print('\n'.join(lines))
lines.sort()
print('\n'.join(lines))

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,32 +14,25 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import json
from __future__ import print_function
import os
import sys
from command import PagedCommand
class Manifest(PagedCommand):
common = False
helpSummary = "Manifest inspection utility"
helpUsage = """
%prog [-o {-|NAME.xml}] [-m MANIFEST.xml] [-r]
%prog [-o {-|NAME.xml} [-r]]
"""
_helpDescription = """
With the -o option, exports the current manifest for inspection.
The manifest and (if present) local_manifests/ are combined
The manifest and (if present) local_manifest.xml are combined
together to produce a single manifest file. This file can be stored
in a Git repository for use during future 'repo init' invocations.
The -r option can be used to generate a manifest file with project
revisions set to the current commit hash. These are known as
"revision locked manifests", as they don't follow a particular branch.
In this case, the 'upstream' attribute is set to the ref we were on
when the manifest was generated. The 'dest-branch' attribute is set
to indicate the remote ref to push changes to via 'repo upload'.
"""
@property
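A rough sketch of the -o plumbing described above (a hypothetical helper, not the command's actual API): '-' selects stdout, anything else is opened as a file:

import sys

def open_manifest_output(path):
    """Return the stream the exported manifest should be written to."""
    if path == '-':
        return sys.stdout          # -o - : dump to stdout
    return open(path, 'w')         # -o NAME.xml : write to a file

# e.g. fd = open_manifest_output('locked.xml'); manifest.Save(fd, peg_rev=True)
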
@ -54,22 +49,11 @@ to indicate the remote ref to push changes to via 'repo upload'.
p.add_option('-r', '--revision-as-HEAD',
dest='peg_rev', action='store_true',
help='Save revisions as current HEAD')
p.add_option('-m', '--manifest-name',
help='temporary manifest to use for this sync', metavar='NAME.xml')
p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream',
default=True, action='store_false',
help='If in -r mode, do not write the upstream field. '
'Only of use if the branch names for a sha1 manifest are '
'sensitive.')
p.add_option('--suppress-dest-branch', dest='peg_rev_dest_branch',
default=True, action='store_false',
help='If in -r mode, do not write the dest-branch field. '
'Only of use if the branch names for a sha1 manifest are '
'sensitive.')
p.add_option('--json', default=False, action='store_true',
help='Output manifest in JSON format (experimental).')
p.add_option('--pretty', default=False, action='store_true',
help='Format output for humans to read.')
p.add_option('-o', '--output-file',
dest='output_file',
default='-',
@ -77,34 +61,13 @@ to indicate the remote ref to push changes to via 'repo upload'.
metavar='-|NAME.xml')
def _Output(self, opt):
# If alternate manifest is specified, override the manifest file that we're using.
if opt.manifest_name:
self.manifest.Override(opt.manifest_name, False)
if opt.output_file == '-':
fd = sys.stdout
else:
fd = open(opt.output_file, 'w')
if opt.json:
print('warning: --json is experimental!', file=sys.stderr)
doc = self.manifest.ToDict(peg_rev=opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream,
peg_rev_dest_branch=opt.peg_rev_dest_branch)
json_settings = {
# JSON style guide says Unicode characters are fully allowed.
'ensure_ascii': False,
# We use 2 space indent to match JSON style guide.
'indent': 2 if opt.pretty else None,
'separators': (',', ': ') if opt.pretty else (',', ':'),
'sort_keys': True,
}
fd.write(json.dumps(doc, **json_settings))
else:
self.manifest.Save(fd,
peg_rev=opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream,
peg_rev_dest_branch=opt.peg_rev_dest_branch)
self.manifest.Save(fd,
peg_rev = opt.peg_rev,
peg_rev_upstream = opt.peg_rev_upstream)
fd.close()
if opt.output_file != '-':
print('Saved manifest to %s' % opt.output_file, file=sys.stderr)

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2012 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
from color import Coloring
from command import PagedCommand

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,11 +14,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import itertools
from __future__ import print_function
from color import Coloring
from command import DEFAULT_LOCAL_JOBS, PagedCommand
from command import PagedCommand
class Prune(PagedCommand):
common = True
@ -24,26 +24,11 @@ class Prune(PagedCommand):
helpUsage = """
%prog [<project>...]
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _ExecuteOne(self, project):
"""Process one project."""
return project.PruneHeads()
def Execute(self, opt, args):
projects = self.GetProjects(args)
# NB: Should be able to refactor this module to display summary as results
# come back from children.
def _ProcessResults(_pool, _output, results):
return list(itertools.chain.from_iterable(results))
all_branches = self.ExecuteInParallel(
opt.jobs,
self._ExecuteOne,
projects,
callback=_ProcessResults,
ordered=True)
all_branches = []
for project in self.GetProjects(args):
all_branches.extend(project.PruneHeads())
if not all_branches:
return

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2010 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import sys
from color import Coloring
@ -39,10 +42,9 @@ branch but need to incorporate new upstream changes "underneath" them.
"""
def _Options(self, p):
g = p.get_option_group('--quiet')
g.add_option('-i', '--interactive',
dest="interactive", action="store_true",
help="interactive rebase (single project only)")
p.add_option('-i', '--interactive',
dest="interactive", action="store_true",
help="interactive rebase (single project only)")
p.add_option('--fail-fast',
dest='fail_fast', action='store_true',
@ -51,8 +53,11 @@ branch but need to incorporate new upstream changes "underneath" them.
dest='force_rebase', action='store_true',
help='Pass --force-rebase to git rebase')
p.add_option('--no-ff',
dest='ff', default=True, action='store_false',
dest='no_ff', action='store_true',
help='Pass --no-ff to git rebase')
p.add_option('-q', '--quiet',
dest='quiet', action='store_true',
help='Pass --quiet to git rebase')
p.add_option('--autosquash',
dest='autosquash', action='store_true',
help='Pass --autosquash to git rebase')
@ -77,7 +82,7 @@ branch but need to incorporate new upstream changes "underneath" them.
file=sys.stderr)
if len(args) == 1:
print('note: project %s is mapped to more than one path' % (args[0],),
file=sys.stderr)
file=sys.stderr)
return 1
# Setup the common git rebase args that we use for all projects.
@ -88,7 +93,7 @@ branch but need to incorporate new upstream changes "underneath" them.
common_args.append('--quiet')
if opt.force_rebase:
common_args.append('--force-rebase')
if not opt.ff:
if opt.no_ff:
common_args.append('--no-ff')
if opt.autosquash:
common_args.append('--autosquash')

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
from optparse import SUPPRESS_HELP
import sys
@ -19,7 +22,6 @@ from command import Command, MirrorSafeCommand
from subcmds.sync import _PostRepoUpgrade
from subcmds.sync import _PostRepoFetch
class Selfupdate(Command, MirrorSafeCommand):
common = False
helpSummary = "Update repo to the latest version"
@ -37,7 +39,7 @@ need to be performed by an end-user.
def _Options(self, p):
g = p.add_option_group('repo Version options')
g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false',
dest='no_repo_verify', action='store_true',
help='do not verify repo source code')
g.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true',
@ -57,5 +59,5 @@ need to be performed by an end-user.
rp.bare_git.gc('--auto')
_PostRepoFetch(rp,
repo_verify=opt.repo_verify,
verbose=True)
no_repo_verify = opt.no_repo_verify,
verbose = True)

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2010 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,7 +16,6 @@
from subcmds.sync import Sync
class Smartsync(Sync):
common = True
helpSummary = "Update working tree to the latest known good revision"

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,13 +14,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import sys
from color import Coloring
from command import InteractiveCommand
from git_command import GitCommand
class _ProjectList(Coloring):
def __init__(self, gc):
Coloring.__init__(self, gc, 'interactive')
@ -26,7 +28,6 @@ class _ProjectList(Coloring):
self.header = self.printer('header', attr='bold')
self.help = self.printer('help', fg='red', attr='bold')
class Stage(InteractiveCommand):
common = True
helpSummary = "Stage file(s) for commit"
@ -38,8 +39,7 @@ The '%prog' command stages files to prepare the next commit.
"""
def _Options(self, p):
g = p.get_option_group('--quiet')
g.add_option('-i', '--interactive',
p.add_option('-i', '--interactive',
dest='interactive', action='store_true',
help='use interactive staging')
@ -105,7 +105,6 @@ The '%prog' command stages files to prepare the next commit.
continue
print('Bye.')
def _AddI(project):
p = GitCommand(project, ['add', '--interactive'], bare=False)
p.Wait()

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,18 +14,17 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import functools
from __future__ import print_function
import os
import sys
from command import Command, DEFAULT_LOCAL_JOBS
from command import Command
from git_config import IsImmutable
from git_command import git
import gitc_utils
from progress import Progress
from project import SyncBuffer
class Start(Command):
common = True
helpSummary = "Start a new branch for development"
@ -34,7 +35,6 @@ class Start(Command):
'%prog' begins a new branch of development, starting from the
revision specified in the manifest.
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('--all',
@ -42,8 +42,7 @@ revision specified in the manifest.
help='begin branch in all projects')
p.add_option('-r', '--rev', '--revision', dest='revision',
help='point branch at this revision instead of upstream')
p.add_option('--head', '--HEAD',
dest='revision', action='store_const', const='HEAD',
p.add_option('--head', dest='revision', action='store_const', const='HEAD',
help='abbreviation for --rev HEAD')
def ValidateOptions(self, opt, args):
@ -54,26 +53,6 @@ revision specified in the manifest.
if not git.check_ref_format('heads/%s' % nb):
self.OptionParser.error("'%s' is not a valid name" % nb)
def _ExecuteOne(self, revision, nb, project):
"""Start one project."""
# If the current revision is immutable, such as a SHA1, a tag or
# a change, then we can't push back to it. Substitute with
# dest_branch, if defined; or with manifest default revision instead.
branch_merge = ''
if IsImmutable(project.revisionExpr):
if project.dest_branch:
branch_merge = project.dest_branch
else:
branch_merge = self.manifest.default.revisionExpr
try:
ret = project.StartBranch(
nb, branch_merge=branch_merge, revision=revision)
except Exception as e:
print('error: unable to checkout %s: %s' % (project.name, e), file=sys.stderr)
ret = False
return (ret, project)
def Execute(self, opt, args):
nb = args[0]
err = []
@ -81,7 +60,7 @@ revision specified in the manifest.
if not opt.all:
projects = args[1:]
if len(projects) < 1:
projects = ['.'] # start it in the local project by default
projects = ['.',] # start it in the local project by default
all_projects = self.GetProjects(projects,
missing_ok=bool(self.gitc_manifest))
@ -105,8 +84,11 @@ revision specified in the manifest.
if not os.path.exists(os.getcwd()):
os.chdir(self.manifest.topdir)
pm = Progress('Syncing %s' % nb, len(all_projects), quiet=opt.quiet)
for project in all_projects:
pm = Progress('Starting %s' % nb, len(all_projects))
for project in all_projects:
pm.update()
if self.gitc_manifest:
gitc_project = self.gitc_manifest.paths[project.relpath]
# Sync projects that have not been opened.
if not gitc_project.already_synced:
@ -119,21 +101,21 @@ revision specified in the manifest.
sync_buf = SyncBuffer(self.manifest.manifestProject.config)
project.Sync_LocalHalf(sync_buf)
project.revisionId = gitc_project.old_revision
pm.update()
pm.end()
def _ProcessResults(_pool, pm, results):
for (result, project) in results:
if not result:
err.append(project)
pm.update()
# If the current revision is immutable, such as a SHA1, a tag or
# a change, then we can't push back to it. Substitute with
# dest_branch, if defined; or with manifest default revision instead.
branch_merge = ''
if IsImmutable(project.revisionExpr):
if project.dest_branch:
branch_merge = project.dest_branch
else:
branch_merge = self.manifest.default.revisionExpr
self.ExecuteInParallel(
opt.jobs,
functools.partial(self._ExecuteOne, opt.revision, nb),
all_projects,
callback=_ProcessResults,
output=Progress('Starting %s' % (nb,), len(all_projects), quiet=opt.quiet))
if not project.StartBranch(
nb, branch_merge=branch_merge, revision=opt.revision):
err.append(project)
pm.end()
if err:
for p in err:

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,17 +14,23 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import functools
import glob
import io
import os
from __future__ import print_function
from command import DEFAULT_LOCAL_JOBS, PagedCommand
from command import PagedCommand
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
import glob
import itertools
import os
from color import Coloring
import platform_utils
class Status(PagedCommand):
common = True
helpSummary = "Show the working tree status"
@ -76,29 +84,36 @@ the following meanings:
d: deleted ( in index, not in work tree )
"""
PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
def _Options(self, p):
p.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', default=2,
help="number of projects to check simultaneously")
p.add_option('-o', '--orphans',
dest='orphans', action='store_true',
help="include objects in working directory outside of repo projects")
p.add_option('-q', '--quiet', action='store_true',
help="only print the name of modified projects")
def _StatusHelper(self, quiet, project):
def _StatusHelper(self, project, clean_counter, sem, quiet):
"""Obtains the status for a specific project.
Obtains the status for a project, redirecting the output to
the specified object.
the specified object. It will release the semaphore
when done.
Args:
quiet: Where to output the status.
project: Project to get status of.
Returns:
The status of the project.
clean_counter: Counter for clean projects.
sem: Semaphore, will call release() when complete.
output: Where to output the status.
"""
buf = io.StringIO()
ret = project.PrintWorkTreeStatus(quiet=quiet, output_redir=buf)
return (ret, buf.getvalue())
try:
state = project.PrintWorkTreeStatus(quiet=quiet)
if state == 'CLEAN':
next(clean_counter)
finally:
sem.release()
def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring):
"""find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'"""
@ -111,31 +126,34 @@ the following meanings:
continue
if item in proj_dirs_parents:
self._FindOrphans(glob.glob('%s/.*' % item) +
glob.glob('%s/*' % item),
proj_dirs, proj_dirs_parents, outstring)
glob.glob('%s/*' % item),
proj_dirs, proj_dirs_parents, outstring)
continue
outstring.append(''.join([status_header, item, '/']))
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
counter = itertools.count()
def _ProcessResults(_pool, _output, results):
ret = 0
for (state, output) in results:
if output:
print(output, end='')
if opt.jobs == 1:
for project in all_projects:
state = project.PrintWorkTreeStatus(quiet=opt.quiet)
if state == 'CLEAN':
ret += 1
return ret
next(counter)
else:
sem = _threading.Semaphore(opt.jobs)
threads = []
for project in all_projects:
sem.acquire()
counter = self.ExecuteInParallel(
opt.jobs,
functools.partial(self._StatusHelper, opt.quiet),
all_projects,
callback=_ProcessResults,
ordered=True)
if not opt.quiet and len(all_projects) == counter:
t = _threading.Thread(target=self._StatusHelper,
args=(project, counter, sem, opt.quiet))
threads.append(t)
t.daemon = True
t.start()
for t in threads:
t.join()
if not opt.quiet and len(all_projects) == next(counter):
print('nothing to commit (working directory clean)')
if opt.orphans:
@ -152,8 +170,8 @@ the following meanings:
class StatusColoring(Coloring):
def __init__(self, config):
Coloring.__init__(self, config, 'status')
self.project = self.printer('header', attr='bold')
self.untracked = self.printer('untracked', fg='red')
self.project = self.printer('header', attr = 'bold')
self.untracked = self.printer('untracked', fg = 'red')
orig_path = os.getcwd()
try:
@ -161,11 +179,11 @@ the following meanings:
outstring = []
self._FindOrphans(glob.glob('.*') +
glob.glob('*'),
proj_dirs, proj_dirs_parents, outstring)
glob.glob('*'),
proj_dirs, proj_dirs_parents, outstring)
if outstring:
output = StatusColoring(self.client.globalConfig)
output = StatusColoring(self.manifest.globalConfig)
output.project('Objects not within a project (orphans)')
output.nl()
for entry in outstring:

File diff suppressed because it is too large

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,21 +14,25 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import copy
import re
import sys
from command import InteractiveCommand
from editor import Editor
from error import UploadError
from error import HookError, UploadError
from git_command import GitCommand
from git_refs import R_HEADS
from hooks import RepoHook
from project import RepoHook
from pyversion import is_python3
if not is_python3():
input = raw_input
else:
unicode = str
UNUSUAL_COMMIT_THRESHOLD = 5
def _ConfirmManyUploads(multiple_branches=False):
if multiple_branches:
print('ATTENTION: One or more branches has an unusually high number '
@ -38,20 +44,17 @@ def _ConfirmManyUploads(multiple_branches=False):
answer = input("If you are sure you intend to do this, type 'yes': ").strip()
return answer == "yes"
def _die(fmt, *args):
msg = fmt % args
print('error: %s' % msg, file=sys.stderr)
sys.exit(1)
def _SplitEmails(values):
result = []
for value in values:
result.extend([s.strip() for s in value.split(',')])
return result
class Upload(InteractiveCommand):
common = True
helpSummary = "Upload changes for code review"
@ -123,23 +126,6 @@ is set to "true" then repo will assume you always want the equivalent
of the -t option to the repo command. If unset or set to "false" then
repo will make use of only the command line option.
review.URL.uploadhashtags:
To add hashtags whenever uploading a commit, you can set a per-project
or global Git option to do so. The value of review.URL.uploadhashtags
will be used as comma delimited hashtags like the --hashtag option.
review.URL.uploadlabels:
To add labels whenever uploading a commit, you can set a per-project
or global Git option to do so. The value of review.URL.uploadlabels
will be used as comma delimited labels like the --label option.
review.URL.uploadnotify:
Control e-mail notifications when uploading.
https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify
# References
Gerrit Code Review: https://www.gerritcodereview.com/
@ -150,27 +136,21 @@ Gerrit Code Review: https://www.gerritcodereview.com/
p.add_option('-t',
dest='auto_topic', action='store_true',
help='Send local branch name to Gerrit Code Review')
p.add_option('--hashtag', '--ht',
dest='hashtags', action='append', default=[],
help='Add hashtags (comma delimited) to the review.')
p.add_option('--hashtag-branch', '--htb',
action='store_true',
help='Add local branch name as a hashtag.')
p.add_option('-l', '--label',
dest='labels', action='append', default=[],
help='Add a label when uploading.')
p.add_option('--re', '--reviewers',
type='string', action='append', dest='reviewers',
type='string', action='append', dest='reviewers',
help='Request reviews from these people.')
p.add_option('--cc',
type='string', action='append', dest='cc',
type='string', action='append', dest='cc',
help='Also send email to these email addresses.')
p.add_option('--br',
type='string', action='store', dest='branch',
type='string', action='store', dest='branch',
help='Branch to upload.')
p.add_option('--cbr', '--current-branch',
dest='current_branch', action='store_true',
help='Upload current git branch.')
p.add_option('-d', '--draft',
action='store_true', dest='draft', default=False,
help='If specified, upload as a draft.')
p.add_option('--ne', '--no-emails',
action='store_false', dest='notify', default=True,
help='If specified, do not send emails on upload.')
@ -188,16 +168,32 @@ Gerrit Code Review: https://www.gerritcodereview.com/
type='string', action='store', dest='dest_branch',
metavar='BRANCH',
help='Submit for review on this target branch.')
p.add_option('-n', '--dry-run',
dest='dryrun', default=False, action='store_true',
help='Do everything except actually upload the CL.')
p.add_option('-y', '--yes',
default=False, action='store_true',
help='Answer yes to all safe prompts.')
# Options relating to upload hook. Note that verify and no-verify are NOT
# opposites of each other, which is why they store to different locations.
# We are using them to match 'git commit' syntax.
#
# Combinations:
# - no-verify=False, verify=False (DEFAULT):
# If stdout is a tty, can prompt about running upload hooks if needed.
# If user denies running hooks, the upload is cancelled. If stdout is
# not a tty and we would need to prompt about upload hooks, upload is
# cancelled.
# - no-verify=False, verify=True:
# Always run upload hooks with no prompt.
# - no-verify=True, verify=False:
# Never run upload hooks, but upload anyway (AKA bypass hooks).
# - no-verify=True, verify=True:
# Invalid
p.add_option('--no-cert-checks',
dest='validate_certs', action='store_false', default=True,
help='Disable verifying ssl certs (unsafe).')
RepoHook.AddOptionGroup(p, 'pre-upload')
p.add_option('--no-verify',
dest='bypass_hooks', action='store_true',
help='Do not run the upload hook.')
p.add_option('--verify',
dest='allow_all_hooks', action='store_true',
help='Run the upload hook without prompting.')
def _SingleBranch(self, opt, branch, people):
project = branch.project
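The verify/no-verify combinations documented in the option comments above boil down to a small decision table; a minimal sketch (hypothetical helper, not the actual RepoHook API):

def hook_policy(no_verify, verify, stdout_is_tty):
    """Map --no-verify/--verify to an upload-hook action."""
    if no_verify and verify:
        raise ValueError('--no-verify and --verify are mutually exclusive')
    if no_verify:
        return 'bypass'   # never run hooks, upload anyway
    if verify:
        return 'run'      # always run hooks without prompting
    # Default: prompt if we can; otherwise cancel the upload.
    return 'prompt' if stdout_is_tty else 'cancel'
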
@ -216,24 +212,20 @@ Gerrit Code Review: https://www.gerritcodereview.com/
destination = opt.dest_branch or project.dest_branch or project.revisionExpr
print('Upload project %s/ to remote branch %s%s:' %
(project.relpath, destination, ' (private)' if opt.private else ''))
(project.relpath, destination, ' (draft)' if opt.draft else ''))
print(' branch %s (%2d commit%s, %s):' % (
name,
len(commit_list),
len(commit_list) != 1 and 's' or '',
date))
name,
len(commit_list),
len(commit_list) != 1 and 's' or '',
date))
for commit in commit_list:
print(' %s' % commit)
print('to %s (y/N)? ' % remote.review, end='')
# TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush()
if opt.yes:
print('<--yes>')
answer = True
else:
answer = sys.stdin.readline().strip().lower()
answer = answer in ('y', 'yes', '1', 'true', 't')
answer = sys.stdin.readline().strip().lower()
answer = answer in ('y', 'yes', '1', 'true', 't')
if answer:
if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD:
@ -330,12 +322,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if raw_list is not None:
if not raw_list is None:
people[0].extend([entry.strip() for entry in raw_list.split(',')])
key = 'review.%s.autocopy' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key)
if raw_list is not None and len(people[0]) > 0:
if not raw_list is None and len(people[0]) > 0:
people[1].extend([entry.strip() for entry in raw_list.split(',')])
def _FindGerritChange(self, branch):
@ -372,11 +364,7 @@ Gerrit Code Review: https://www.gerritcodereview.com/
print('Continue uploading? (y/N) ', end='')
# TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush()
if opt.yes:
print('<--yes>')
a = 'yes'
else:
a = sys.stdin.readline().strip().lower()
a = sys.stdin.readline().strip().lower()
if a not in ('y', 'yes', 't', 'true', 'on'):
print("skipping upload", file=sys.stderr)
branch.uploaded = False
@ -388,51 +376,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
key = 'review.%s.uploadtopic' % branch.project.remote.review
opt.auto_topic = branch.project.config.GetBoolean(key)
def _ExpandCommaList(value):
"""Split |value| up into comma delimited entries."""
if not value:
return
for ret in value.split(','):
ret = ret.strip()
if ret:
yield ret
# Check if hashtags should be included.
key = 'review.%s.uploadhashtags' % branch.project.remote.review
hashtags = set(_ExpandCommaList(branch.project.config.GetString(key)))
for tag in opt.hashtags:
hashtags.update(_ExpandCommaList(tag))
if opt.hashtag_branch:
hashtags.add(branch.name)
# Check if labels should be included.
key = 'review.%s.uploadlabels' % branch.project.remote.review
labels = set(_ExpandCommaList(branch.project.config.GetString(key)))
for label in opt.labels:
labels.update(_ExpandCommaList(label))
# Basic sanity check on label syntax.
for label in labels:
if not re.match(r'^.+[+-][0-9]+$', label):
print('repo: error: invalid label syntax "%s": labels use forms '
'like CodeReview+1 or Verified-1' % (label,), file=sys.stderr)
sys.exit(1)
# Handle e-mail notifications.
if opt.notify is False:
notify = 'NONE'
else:
key = 'review.%s.uploadnotify' % branch.project.remote.review
notify = branch.project.config.GetString(key)
destination = opt.dest_branch or branch.project.dest_branch
# Make sure our local branch is not setup to track a different remote branch
merge_branch = self._GetMergeBranch(branch.project)
if destination:
full_dest = destination
if not full_dest.startswith(R_HEADS):
full_dest = R_HEADS + full_dest
full_dest = 'refs/heads/%s' % destination
if not opt.dest_branch and merge_branch and merge_branch != full_dest:
print('merge branch %s does not match destination branch %s'
% (merge_branch, full_dest))
@ -443,12 +392,10 @@ Gerrit Code Review: https://www.gerritcodereview.com/
continue
branch.UploadForReview(people,
dryrun=opt.dryrun,
auto_topic=opt.auto_topic,
hashtags=hashtags,
labels=labels,
draft=opt.draft,
private=opt.private,
notify=notify,
notify=None if opt.notify else 'NONE',
wip=opt.wip,
dest_branch=destination,
validate_certs=opt.validate_certs,
@ -471,18 +418,18 @@ Gerrit Code Review: https://www.gerritcodereview.com/
else:
fmt = '\n (%s)'
print(('[FAILED] %-15s %-15s' + fmt) % (
branch.project.relpath + '/',
branch.name,
str(branch.error)),
file=sys.stderr)
branch.project.relpath + '/', \
branch.name, \
str(branch.error)),
file=sys.stderr)
print()
for branch in todo:
if branch.uploaded:
print('[OK ] %-15s %s' % (
branch.project.relpath + '/',
branch.name),
file=sys.stderr)
branch.project.relpath + '/',
branch.name),
file=sys.stderr)
if have_errors:
sys.exit(1)
@ -490,14 +437,14 @@ Gerrit Code Review: https://www.gerritcodereview.com/
def _GetMergeBranch(self, project):
p = GitCommand(project,
['rev-parse', '--abbrev-ref', 'HEAD'],
capture_stdout=True,
capture_stderr=True)
capture_stdout = True,
capture_stderr = True)
p.Wait()
local_branch = p.stdout.strip()
p = GitCommand(project,
['config', '--get', 'branch.%s.merge' % local_branch],
capture_stdout=True,
capture_stderr=True)
capture_stdout = True,
capture_stderr = True)
p.Wait()
merge_branch = p.stdout.strip()
return merge_branch
@ -520,10 +467,10 @@ Gerrit Code Review: https://www.gerritcodereview.com/
avail = [up_branch]
else:
avail = None
print('repo: error: Unable to upload branch "%s". '
'You might be able to fix the branch by running:\n'
' git branch --set-upstream-to m/%s' %
(str(cbr), self.manifest.branch),
print('ERROR: Current branch (%s) not uploadable. '
'You may be able to type '
'"git branch --set-upstream-to m/master" to fix '
'your branch.' % str(cbr),
file=sys.stderr)
else:
avail = project.GetUploadableBranches(branch)
@ -531,22 +478,22 @@ Gerrit Code Review: https://www.gerritcodereview.com/
pending.append((project, avail))
if not pending:
if branch is None:
print('repo: error: no branches ready for upload', file=sys.stderr)
else:
print('repo: error: no branches named "%s" ready for upload' %
(branch,), file=sys.stderr)
return 1
print("no branches ready for upload", file=sys.stderr)
return
pending_proj_names = [project.name for (project, available) in pending]
pending_worktrees = [project.worktree for (project, available) in pending]
hook = RepoHook.FromSubcmd(
hook_type='pre-upload', manifest=self.manifest,
opt=opt, abort_if_user_denies=True)
if not hook.Run(
project_list=pending_proj_names,
worktree_list=pending_worktrees):
return 1
if not opt.bypass_hooks:
hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
self.manifest.topdir,
self.manifest.manifestProject.GetRemote('origin').url,
abort_if_user_denies=True)
pending_proj_names = [project.name for (project, available) in pending]
pending_worktrees = [project.worktree for (project, available) in pending]
try:
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names,
worktree_list=pending_worktrees)
except HookError as e:
print("ERROR: %s" % str(e), file=sys.stderr)
return
if opt.reviewers:
reviewers = _SplitEmails(opt.reviewers)

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -12,14 +14,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import platform
from __future__ import print_function
import sys
from command import Command, MirrorSafeCommand
from git_command import git, RepoSourceVersion, user_agent
from git_refs import HEAD
class Version(Command, MirrorSafeCommand):
wrapper_version = None
wrapper_path = None
@ -33,19 +33,16 @@ class Version(Command, MirrorSafeCommand):
def Execute(self, opt, args):
rp = self.manifest.repoProject
rem = rp.GetRemote(rp.remote.name)
branch = rp.GetBranch('default')
# These might not be the same. Report them both.
src_ver = RepoSourceVersion()
rp_ver = rp.bare_git.describe(HEAD)
print('repo version %s' % rp_ver)
print(' (from %s)' % rem.url)
print(' (tracking %s)' % branch.merge)
print(' (%s)' % rp.bare_git.log('-1', '--format=%cD', HEAD))
if self.wrapper_path is not None:
print('repo launcher version %s' % self.wrapper_version)
print(' (from %s)' % self.wrapper_path)
if Version.wrapper_path is not None:
print('repo launcher version %s' % Version.wrapper_version)
print(' (from %s)' % Version.wrapper_path)
if src_ver != rp_ver:
print(' (currently at %s)' % src_ver)
@ -54,11 +51,3 @@ class Version(Command, MirrorSafeCommand):
print('git %s' % git.version_tuple().full)
print('git User-Agent %s' % user_agent.git)
print('Python %s' % sys.version)
uname = platform.uname()
if sys.version_info.major < 3:
# Python 3 returns a named tuple, but Python 2 is simpler.
print(uname)
else:
print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
print('CPU %s (%s)' %
(uname.machine, uname.processor if uname.processor else 'unknown'))

@ -1,13 +1,3 @@
[section]
empty
nonempty = true
boolinvalid = oops
booltrue = true
boolfalse = false
intinvalid = oops
inthex = 0x10
inthexk = 0x10k
int = 10
intk = 10k
intm = 10m
intg = 10g

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,6 +16,8 @@
"""Unittests for the editor.py module."""
from __future__ import print_function
import unittest
from editor import Editor

@ -1,53 +0,0 @@
# Copyright 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the error.py module."""
import inspect
import pickle
import unittest
import error
class PickleTests(unittest.TestCase):
"""Make sure all our custom exceptions can be pickled."""
def getExceptions(self):
"""Return all our custom exceptions."""
for name in dir(error):
cls = getattr(error, name)
if isinstance(cls, type) and issubclass(cls, Exception):
yield cls
def testExceptionLookup(self):
"""Make sure our introspection logic works."""
classes = list(self.getExceptions())
self.assertIn(error.HookError, classes)
# Don't assert the exact number to avoid being a change-detector test.
self.assertGreater(len(classes), 10)
def testPickle(self):
"""Try to pickle all the exceptions."""
for cls in self.getExceptions():
args = inspect.getfullargspec(cls.__init__).args[1:]
obj = cls(*args)
p = pickle.dumps(obj)
try:
newobj = pickle.loads(p)
except Exception as e: # pylint: disable=broad-except
self.fail('Class %s is unable to be pickled: %s\n'
'Incomplete super().__init__(...) call?' % (cls, e))
self.assertIsInstance(newobj, cls)
self.assertEqual(str(obj), str(newobj))

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,43 +16,12 @@
"""Unittests for the git_command.py module."""
from __future__ import print_function
import re
import unittest
try:
from unittest import mock
except ImportError:
import mock
import git_command
import wrapper
class SSHUnitTest(unittest.TestCase):
"""Tests the ssh functions."""
def test_ssh_version(self):
"""Check ssh_version() handling."""
ver = git_command._parse_ssh_version('Unknown\n')
self.assertEqual(ver, ())
ver = git_command._parse_ssh_version('OpenSSH_1.0\n')
self.assertEqual(ver, (1, 0))
ver = git_command._parse_ssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n')
self.assertEqual(ver, (6, 6, 1))
ver = git_command._parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n')
self.assertEqual(ver, (7, 6))
def test_ssh_sock(self):
"""Check ssh_sock() function."""
with mock.patch('tempfile.mkdtemp', return_value='/tmp/foo'):
# old ssh version uses port
with mock.patch('git_command.ssh_version', return_value=(6, 6)):
self.assertTrue(git_command.ssh_sock().endswith('%p'))
git_command._ssh_sock_path = None
# new ssh version uses hash
with mock.patch('git_command.ssh_version', return_value=(6, 7)):
self.assertTrue(git_command.ssh_sock().endswith('%C'))
git_command._ssh_sock_path = None
class GitCallUnitTest(unittest.TestCase):
@ -64,7 +35,7 @@ class GitCallUnitTest(unittest.TestCase):
# We don't dive too deep into the values here to avoid having to update
# whenever git versions change. We do check relative to this min version
# as this is what `repo` itself requires via MIN_GIT_VERSION.
MIN_GIT_VERSION = (2, 10, 2)
MIN_GIT_VERSION = (1, 7, 2)
self.assertTrue(isinstance(ver.major, int))
self.assertTrue(isinstance(ver.minor, int))
self.assertTrue(isinstance(ver.micro, int))
@ -105,45 +76,3 @@ class UserAgentUnitTest(unittest.TestCase):
# the general form.
m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua)
self.assertIsNotNone(m)
class GitRequireTests(unittest.TestCase):
"""Test the git_require helper."""
def setUp(self):
ver = wrapper.GitVersion(1, 2, 3, 4)
mock.patch.object(git_command.git, 'version_tuple', return_value=ver).start()
def tearDown(self):
mock.patch.stopall()
def test_older_nonfatal(self):
"""Test non-fatal require calls with old versions."""
self.assertFalse(git_command.git_require((2,)))
self.assertFalse(git_command.git_require((1, 3)))
self.assertFalse(git_command.git_require((1, 2, 4)))
self.assertFalse(git_command.git_require((1, 2, 3, 5)))
def test_newer_nonfatal(self):
"""Test non-fatal require calls with newer versions."""
self.assertTrue(git_command.git_require((0,)))
self.assertTrue(git_command.git_require((1, 0)))
self.assertTrue(git_command.git_require((1, 2, 0)))
self.assertTrue(git_command.git_require((1, 2, 3, 0)))
def test_equal_nonfatal(self):
"""Test require calls with equal values."""
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=False))
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=True))
def test_older_fatal(self):
"""Test fatal require calls with old versions."""
with self.assertRaises(SystemExit) as e:
git_command.git_require((2,), fail=True)
self.assertNotEqual(0, e.code)
def test_older_fatal_msg(self):
"""Test fatal require calls with old versions and message."""
with self.assertRaises(SystemExit) as e:
git_command.git_require((2,), fail=True, msg='so sad')
self.assertNotEqual(0, e.code)
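These expectations follow from plain tuple comparison against the mocked git version (1, 2, 3, 4); a quick sketch of that assumption:

def version_ok(installed, needed):
    """True if the installed version tuple satisfies the requirement."""
    return installed >= needed

assert version_ok((1, 2, 3, 4), (1, 2, 0))       # newer than needed
assert not version_ok((1, 2, 3, 4), (2,))        # older than needed
assert version_ok((1, 2, 3, 4), (1, 2, 3, 4))    # exactly equal
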

@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@ -14,22 +16,21 @@
"""Unittests for the git_config.py module."""
from __future__ import print_function
import os
import tempfile
import unittest
import git_config
def fixture(*paths):
"""Return a path relative to test/fixtures.
"""
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
class GitConfigReadOnlyTests(unittest.TestCase):
"""Read-only tests of the GitConfig class."""
class GitConfigUnitTest(unittest.TestCase):
"""Tests the GitConfig class.
"""
def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture.
"""
@ -67,107 +68,5 @@ class GitConfigReadOnlyTests(unittest.TestCase):
val = config.GetString('empty')
self.assertEqual(val, None)
def test_GetBoolean_undefined(self):
"""Test GetBoolean on key that doesn't exist."""
self.assertIsNone(self.config.GetBoolean('section.missing'))
def test_GetBoolean_invalid(self):
"""Test GetBoolean on invalid boolean value."""
self.assertIsNone(self.config.GetBoolean('section.boolinvalid'))
def test_GetBoolean_true(self):
"""Test GetBoolean on valid true boolean."""
self.assertTrue(self.config.GetBoolean('section.booltrue'))
def test_GetBoolean_false(self):
"""Test GetBoolean on valid false boolean."""
self.assertFalse(self.config.GetBoolean('section.boolfalse'))
def test_GetInt_undefined(self):
"""Test GetInt on key that doesn't exist."""
self.assertIsNone(self.config.GetInt('section.missing'))
def test_GetInt_invalid(self):
"""Test GetInt on invalid integer value."""
self.assertIsNone(self.config.GetBoolean('section.intinvalid'))
def test_GetInt_valid(self):
"""Test GetInt on valid integers."""
TESTS = (
('inthex', 16),
('inthexk', 16384),
('int', 10),
('intk', 10240),
('intm', 10485760),
('intg', 10737418240),
)
for key, value in TESTS:
self.assertEqual(value, self.config.GetInt('section.%s' % (key,)))
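The expected values imply hex prefixes and binary suffixes (k=1024, m=1024**2, g=1024**3); a small parsing sketch under that assumption (not GitConfig's actual implementation):

def parse_git_int(value):
    """Parse '10', '0x10', '10k', '10m', '10g' into an integer."""
    suffixes = {'k': 1024, 'm': 1024 ** 2, 'g': 1024 ** 3}
    mult = 1
    if value and value[-1].lower() in suffixes:
        mult = suffixes[value[-1].lower()]
        value = value[:-1]
    return int(value, 0) * mult   # base 0 also accepts the 0x prefix

assert parse_git_int('0x10k') == 16384
assert parse_git_int('10g') == 10737418240
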
class GitConfigReadWriteTests(unittest.TestCase):
"""Read/write tests of the GitConfig class."""
def setUp(self):
self.tmpfile = tempfile.NamedTemporaryFile()
self.config = self.get_config()
def get_config(self):
"""Get a new GitConfig instance."""
return git_config.GitConfig(self.tmpfile.name)
def test_SetString(self):
"""Test SetString behavior."""
# Set a value.
self.assertIsNone(self.config.GetString('foo.bar'))
self.config.SetString('foo.bar', 'val')
self.assertEqual('val', self.config.GetString('foo.bar'))
# Make sure the value was actually written out.
config = self.get_config()
self.assertEqual('val', config.GetString('foo.bar'))
# Update the value.
self.config.SetString('foo.bar', 'valll')
self.assertEqual('valll', self.config.GetString('foo.bar'))
config = self.get_config()
self.assertEqual('valll', config.GetString('foo.bar'))
# Delete the value.
self.config.SetString('foo.bar', None)
self.assertIsNone(self.config.GetString('foo.bar'))
config = self.get_config()
self.assertIsNone(config.GetString('foo.bar'))
def test_SetBoolean(self):
"""Test SetBoolean behavior."""
# Set a true value.
self.assertIsNone(self.config.GetBoolean('foo.bar'))
for val in (True, 1):
self.config.SetBoolean('foo.bar', val)
self.assertTrue(self.config.GetBoolean('foo.bar'))
# Make sure the value was actually written out.
config = self.get_config()
self.assertTrue(config.GetBoolean('foo.bar'))
self.assertEqual('true', config.GetString('foo.bar'))
# Set a false value.
for val in (False, 0):
self.config.SetBoolean('foo.bar', val)
self.assertFalse(self.config.GetBoolean('foo.bar'))
# Make sure the value was actually written out.
config = self.get_config()
self.assertFalse(config.GetBoolean('foo.bar'))
self.assertEqual('false', config.GetString('foo.bar'))
# Delete the value.
self.config.SetBoolean('foo.bar', None)
self.assertIsNone(self.config.GetBoolean('foo.bar'))
config = self.get_config()
self.assertIsNone(config.GetBoolean('foo.bar'))
if __name__ == '__main__':
unittest.main()

@ -1,182 +0,0 @@
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the git_superproject.py module."""
import os
import platform
import tempfile
import unittest
from unittest import mock
import git_superproject
import manifest_xml
import platform_utils
class SuperprojectTestCase(unittest.TestCase):
"""TestCase for the Superproject module."""
def setUp(self):
"""Set up superproject every time."""
self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
self.repodir = os.path.join(self.tempdir, '.repo')
self.manifest_file = os.path.join(
self.repodir, manifest_xml.MANIFEST_FILE_NAME)
os.mkdir(self.repodir)
self.platform = platform.system().lower()
# The manifest parsing really wants a git repo currently.
gitdir = os.path.join(self.repodir, 'manifests.git')
os.mkdir(gitdir)
with open(os.path.join(gitdir, 'config'), 'w') as fp:
fp.write("""[remote "origin"]
url = https://localhost:0/manifest
""")
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
<project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
" /></manifest>
""")
self._superproject = git_superproject.Superproject(manifest, self.repodir)
def tearDown(self):
"""Tear down superproject every time."""
platform_utils.rmtree(self.tempdir)
def getXmlManifest(self, data):
"""Helper to initialize a manifest for testing."""
with open(self.manifest_file, 'w') as fp:
fp.write(data)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
def test_superproject_get_superproject_no_superproject(self):
"""Test with no url."""
manifest = self.getXmlManifest("""
<manifest>
</manifest>
""")
superproject = git_superproject.Superproject(manifest, self.repodir)
self.assertFalse(superproject.Sync())
def test_superproject_get_superproject_invalid_url(self):
"""Test with an invalid url."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
</manifest>
""")
superproject = git_superproject.Superproject(manifest, self.repodir)
self.assertFalse(superproject.Sync())
def test_superproject_get_superproject_invalid_branch(self):
"""Test with an invalid branch."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
</manifest>
""")
superproject = git_superproject.Superproject(manifest, self.repodir)
with mock.patch.object(self._superproject, '_GetBranch', return_value='junk'):
self.assertFalse(superproject.Sync())
def test_superproject_get_superproject_mock_init(self):
"""Test with _Init failing."""
with mock.patch.object(self._superproject, '_Init', return_value=False):
self.assertFalse(self._superproject.Sync())
def test_superproject_get_superproject_mock_fetch(self):
"""Test with _Fetch failing."""
with mock.patch.object(self._superproject, '_Init', return_value=True):
os.mkdir(self._superproject._superproject_path)
with mock.patch.object(self._superproject, '_Fetch', return_value=False):
self.assertFalse(self._superproject.Sync())
def test_superproject_get_all_project_commit_ids_mock_ls_tree(self):
"""Test with LsTree being a mock."""
data = ('120000 blob 158258bdf146f159218e2b90f8b699c4d85b5804\tAndroid.bp\x00'
'160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
'160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00'
'120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00'
'160000 commit ade9b7a0d874e25fff4bf2552488825c6f111928\tbuild/bazel\x00')
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, '_Fetch', return_value=True):
with mock.patch.object(self._superproject, '_LsTree', return_value=data):
commit_ids = self._superproject._GetAllProjectsCommitIds()
self.assertEqual(commit_ids, {
'art': '2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea',
'bootable/recovery': 'e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06',
'build/bazel': 'ade9b7a0d874e25fff4bf2552488825c6f111928'
})
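The mocked data above is NUL-delimited 'git ls-tree -z' output; an illustrative parser that reduces it to the {path: commit} mapping the test expects (not the module's actual code):

def parse_ls_tree(data):
    """Collect gitlink (project) entries from 'git ls-tree -z' output."""
    commit_ids = {}
    for entry in data.split('\x00'):
        if not entry:
            continue
        info, _, path = entry.partition('\t')
        mode, obj_type, sha = info.split()
        if obj_type == 'commit':   # blobs (regular files) are skipped
            commit_ids[path] = sha
    return commit_ids
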
def test_superproject_write_manifest_file(self):
"""Test with writing manifest to a file after setting revisionId."""
self.assertEqual(len(self._superproject._manifest.projects), 1)
project = self._superproject._manifest.projects[0]
project.SetRevisionId('ABCDEF')
# Create temporary directory so that it can write the file.
os.mkdir(self._superproject._superproject_path)
manifest_path = self._superproject._WriteManfiestFile()
self.assertIsNotNone(manifest_path)
with open(manifest_path, 'r') as fp:
manifest_xml = fp.read()
self.assertEqual(
manifest_xml,
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<project name="platform/art" path="art" revision="ABCDEF" ' +
'groups="notdefault,platform-' + self.platform + '"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
def test_superproject_update_project_revision_id(self):
"""Test with LsTree being a mock."""
self.assertEqual(len(self._superproject._manifest.projects), 1)
projects = self._superproject._manifest.projects
data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
'160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00')
with mock.patch.object(self._superproject, '_Init', return_value=True):
with mock.patch.object(self._superproject, '_Fetch', return_value=True):
with mock.patch.object(self._superproject,
'_LsTree',
return_value=data):
# Create temporary directory so that it can write the file.
os.mkdir(self._superproject._superproject_path)
manifest_path = self._superproject.UpdateProjectsRevisionId(projects)
self.assertIsNotNone(manifest_path)
with open(manifest_path, 'r') as fp:
manifest_xml = fp.read()
self.assertEqual(
manifest_xml,
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<project name="platform/art" path="art" ' +
'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" ' +
'groups="notdefault,platform-' + self.platform + '"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
if __name__ == '__main__':
unittest.main()

@ -1,261 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the git_trace2_event_log.py module."""
import json
import os
import tempfile
import unittest
from unittest import mock
import git_trace2_event_log
class EventLogTestCase(unittest.TestCase):
"""TestCase for the EventLog module."""
PARENT_SID_KEY = 'GIT_TRACE2_PARENT_SID'
PARENT_SID_VALUE = 'parent_sid'
SELF_SID_REGEX = r'repo-\d+T\d+Z-.*'
FULL_SID_REGEX = r'^%s/%s' % (PARENT_SID_VALUE, SELF_SID_REGEX)
def setUp(self):
"""Load the event_log module every time."""
self._event_log_module = None
# By default we initialize with the expected case where
# repo launches us (so GIT_TRACE2_PARENT_SID is set).
env = {
self.PARENT_SID_KEY: self.PARENT_SID_VALUE,
}
self._event_log_module = git_trace2_event_log.EventLog(env=env)
self._log_data = None
def verifyCommonKeys(self, log_entry, expected_event_name, full_sid=True):
"""Helper function to verify common event log keys."""
self.assertIn('event', log_entry)
self.assertIn('sid', log_entry)
self.assertIn('thread', log_entry)
self.assertIn('time', log_entry)
# Do basic data format validation.
self.assertEqual(expected_event_name, log_entry['event'])
if full_sid:
self.assertRegex(log_entry['sid'], self.FULL_SID_REGEX)
else:
self.assertRegex(log_entry['sid'], self.SELF_SID_REGEX)
self.assertRegex(log_entry['time'], r'^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$')
def readLog(self, log_path):
"""Helper function to read log data into a list."""
log_data = []
with open(log_path, mode='rb') as f:
for line in f:
log_data.append(json.loads(line))
return log_data
def test_initial_state_with_parent_sid(self):
"""Test initial state when 'GIT_TRACE2_PARENT_SID' is set by parent."""
self.assertRegex(self._event_log_module.full_sid, self.FULL_SID_REGEX)
def test_initial_state_no_parent_sid(self):
"""Test initial state when 'GIT_TRACE2_PARENT_SID' is not set."""
# Setup an empty environment dict (no parent sid).
self._event_log_module = git_trace2_event_log.EventLog(env={})
self.assertRegex(self._event_log_module.full_sid, self.SELF_SID_REGEX)
def test_version_event(self):
"""Test 'version' event data is valid.
Verify that the 'version' event is written even when no other
events are added.
Expected event log:
<version event>
"""
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
# A log with no added events should only have the version entry.
self.assertEqual(len(self._log_data), 1)
version_event = self._log_data[0]
self.verifyCommonKeys(version_event, expected_event_name='version')
# Check for 'version' event specific fields.
self.assertIn('evt', version_event)
self.assertIn('exe', version_event)
# Verify "evt" version field is a string.
self.assertIsInstance(version_event['evt'], str)
def test_start_event(self):
"""Test and validate 'start' event data is valid.
Expected event log:
<version event>
<start event>
"""
self._event_log_module.StartEvent()
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
start_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(start_event, expected_event_name='start')
# Check for 'start' event specific fields.
self.assertIn('argv', start_event)
self.assertTrue(isinstance(start_event['argv'], list))
def test_exit_event_result_none(self):
"""Test 'exit' event data is valid when result is None.
We expect None result to be converted to 0 in the exit event data.
Expected event log:
<version event>
<exit event>
"""
self._event_log_module.ExitEvent(None)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
exit_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(exit_event, expected_event_name='exit')
# Check for 'exit' event specific fields.
self.assertIn('code', exit_event)
# 'None' result should convert to 0 (successful) return code.
self.assertEqual(exit_event['code'], 0)
def test_exit_event_result_integer(self):
"""Test 'exit' event data is valid when result is an integer.
Expected event log:
<version event>
<exit event>
"""
self._event_log_module.ExitEvent(2)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
exit_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(exit_event, expected_event_name='exit')
# Check for 'exit' event specific fields.
self.assertIn('code', exit_event)
self.assertEqual(exit_event['code'], 2)
def test_command_event(self):
"""Test and validate 'command' event data is valid.
Expected event log:
<version event>
<command event>
"""
name = 'repo'
subcommands = ['init' 'this']
self._event_log_module.CommandEvent(name='repo', subcommands=subcommands)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
command_event = self._log_data[1]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
self.verifyCommonKeys(command_event, expected_event_name='command')
# Check for 'command' event specific fields.
self.assertIn('name', command_event)
self.assertIn('subcommands', command_event)
self.assertEqual(command_event['name'], name)
self.assertEqual(command_event['subcommands'], subcommands)
def test_def_params_event_repo_config(self):
"""Test 'def_params' event data outputs only repo config keys.
Expected event log:
<version event>
<def_param event>
<def_param event>
"""
config = {
'git.foo': 'bar',
'repo.partialclone': 'true',
'repo.partialclonefilter': 'blob:none',
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 3)
def_param_events = self._log_data[1:]
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
for event in def_param_events:
self.verifyCommonKeys(event, expected_event_name='def_param')
# Check for 'def_param' event specific fields.
self.assertIn('param', event)
self.assertIn('value', event)
self.assertTrue(event['param'].startswith('repo.'))
def test_def_params_event_no_repo_config(self):
"""Test 'def_params' event data won't output non-repo config keys.
Expected event log:
<version event>
"""
config = {
'git.foo': 'bar',
'git.core.foo2': 'baz',
}
self._event_log_module.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 1)
self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
def test_write_with_filename(self):
"""Test Write() with a path to a file exits with None."""
self.assertIsNone(self._event_log_module.Write(path='path/to/file'))
def test_write_with_git_config(self):
"""Test Write() uses the git config path when 'git config' call succeeds."""
with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
with mock.patch.object(self._event_log_module,
'_GetEventTargetPath', return_value=tempdir):
self.assertEqual(os.path.dirname(self._event_log_module.Write()), tempdir)
def test_write_no_git_config(self):
"""Test Write() with no git config variable present exits with None."""
with mock.patch.object(self._event_log_module,
'_GetEventTargetPath', return_value=None):
self.assertIsNone(self._event_log_module.Write())
def test_write_non_string(self):
"""Test Write() with non-string type for |path| throws TypeError."""
with self.assertRaises(TypeError):
self._event_log_module.Write(path=1234)
if __name__ == '__main__':
unittest.main()
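For orientation, the event-log API exercised above can also be driven directly; the following is a minimal sketch, assuming the module under test is event_log.py and that event_log.EventLog is the object these tests store as self._event_log_module (it provides StartEvent, ExitEvent, CommandEvent, DefParamRepoEvents, and Write, as used above):

import tempfile
import event_log

# Assumption: EventLog() is the instance the tests construct in setUp.
log = event_log.EventLog()
log.StartEvent()                                        # 'start' event
log.CommandEvent(name='repo', subcommands=['sync'])     # 'command' event
log.DefParamRepoEvents({'repo.partialclone': 'true'})   # only repo.* keys are emitted
log.ExitEvent(None)                                     # None is recorded as exit code 0
with tempfile.TemporaryDirectory(prefix='event_log_demo') as tempdir:
    log_path = log.Write(path=tempdir)                  # writes the JSON-lines log under tempdir
    print(log_path)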


@@ -1,55 +0,0 @@
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the hooks.py module."""
import hooks
import unittest
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = (
'',
'#\n# foo\n',
'# Bad shebang in script\n#!/foo\n'
)
for data in DATA:
self.assertIsNone(hooks.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
('#!/foo', '/foo'),
('#! /foo', '/foo'),
('#!/bin/foo ', '/bin/foo'),
('#! /usr/foo ', '/usr/foo'),
('#! /usr/foo -args', '/usr/foo'),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
('#!/usr/bin/env foo', 'foo'),
('#!/bin/env foo', 'foo'),
('#! /bin/env /bin/foo ', '/bin/foo'),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)
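To make the expected behaviour concrete, here is a tiny sketch using only inputs taken from the test data above; it assumes hooks.RepoHook._ExtractInterpFromShebang is importable exactly as in these tests:

import hooks

assert hooks.RepoHook._ExtractInterpFromShebang('#! /usr/foo -args') == '/usr/foo'
assert hooks.RepoHook._ExtractInterpFromShebang('#!/usr/bin/env foo') == 'foo'
assert hooks.RepoHook._ExtractInterpFromShebang('') is None   # no shebang -> None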


@@ -1,549 +0,0 @@
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the manifest_xml.py module."""
import os
import platform
import shutil
import tempfile
import unittest
import xml.dom.minidom
import error
import manifest_xml
# Invalid paths that we don't want in the filesystem.
INVALID_FS_PATHS = (
'',
'.',
'..',
'../',
'./',
'.//',
'foo/',
'./foo',
'../foo',
'foo/./bar',
'foo/../../bar',
'/foo',
'./../foo',
'.git/foo',
# Check case folding.
'.GIT/foo',
'blah/.git/foo',
'.repo/foo',
'.repoconfig',
# Block ~ due to 8.3 filenames on Windows filesystems.
'~',
'foo~',
'blah/foo~',
# Block Unicode characters that get normalized out by filesystems.
u'foo\u200Cbar',
)
# Make sure platforms that use path separators (e.g. Windows) are also
# rejected properly.
if os.path.sep != '/':
INVALID_FS_PATHS += tuple(x.replace('/', os.path.sep) for x in INVALID_FS_PATHS)
class ManifestParseTestCase(unittest.TestCase):
"""TestCase for parsing manifests."""
def setUp(self):
self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
self.repodir = os.path.join(self.tempdir, '.repo')
self.manifest_dir = os.path.join(self.repodir, 'manifests')
self.manifest_file = os.path.join(
self.repodir, manifest_xml.MANIFEST_FILE_NAME)
self.local_manifest_dir = os.path.join(
self.repodir, manifest_xml.LOCAL_MANIFESTS_DIR_NAME)
os.mkdir(self.repodir)
os.mkdir(self.manifest_dir)
# The manifest parsing really wants a git repo currently.
gitdir = os.path.join(self.repodir, 'manifests.git')
os.mkdir(gitdir)
with open(os.path.join(gitdir, 'config'), 'w') as fp:
fp.write("""[remote "origin"]
url = https://localhost:0/manifest
""")
def tearDown(self):
shutil.rmtree(self.tempdir, ignore_errors=True)
def getXmlManifest(self, data):
"""Helper to initialize a manifest for testing."""
with open(self.manifest_file, 'w') as fp:
fp.write(data)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
class ManifestValidateFilePaths(unittest.TestCase):
"""Check _ValidateFilePaths helper.
This doesn't access a real filesystem.
"""
def check_both(self, *args):
manifest_xml.XmlManifest._ValidateFilePaths('copyfile', *args)
manifest_xml.XmlManifest._ValidateFilePaths('linkfile', *args)
def test_normal_path(self):
"""Make sure good paths are accepted."""
self.check_both('foo', 'bar')
self.check_both('foo/bar', 'bar')
self.check_both('foo', 'bar/bar')
self.check_both('foo/bar', 'bar/bar')
def test_symlink_targets(self):
"""Some extra checks for symlinks."""
def check(*args):
manifest_xml.XmlManifest._ValidateFilePaths('linkfile', *args)
# We allow symlinks to end in a slash since we allow them to point to dirs
# in general. Technically the slash isn't necessary.
check('foo/', 'bar')
# We allow a single '.' to get a reference to the project itself.
check('.', 'bar')
def test_bad_paths(self):
"""Make sure bad paths (src & dest) are rejected."""
for path in INVALID_FS_PATHS:
self.assertRaises(
error.ManifestInvalidPathError, self.check_both, path, 'a')
self.assertRaises(
error.ManifestInvalidPathError, self.check_both, 'a', path)
class ValueTests(unittest.TestCase):
"""Check utility parsing code."""
def _get_node(self, text):
return xml.dom.minidom.parseString(text).firstChild
def test_bool_default(self):
"""Check XmlBool default handling."""
node = self._get_node('<node/>')
self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
self.assertIsNone(manifest_xml.XmlBool(node, 'a', None))
self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
node = self._get_node('<node a=""/>')
self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
def test_bool_invalid(self):
"""Check XmlBool invalid handling."""
node = self._get_node('<node a="moo"/>')
self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
def test_bool_true(self):
"""Check XmlBool true values."""
for value in ('yes', 'true', '1'):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertTrue(manifest_xml.XmlBool(node, 'a'))
def test_bool_false(self):
"""Check XmlBool false values."""
for value in ('no', 'false', '0'):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertFalse(manifest_xml.XmlBool(node, 'a'))
def test_int_default(self):
"""Check XmlInt default handling."""
node = self._get_node('<node/>')
self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
self.assertIsNone(manifest_xml.XmlInt(node, 'a', None))
self.assertEqual(123, manifest_xml.XmlInt(node, 'a', 123))
node = self._get_node('<node a=""/>')
self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
def test_int_good(self):
"""Check XmlInt numeric handling."""
for value in (-1, 0, 1, 50000):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertEqual(value, manifest_xml.XmlInt(node, 'a'))
def test_int_invalid(self):
"""Check XmlInt invalid handling."""
with self.assertRaises(error.ManifestParseError):
node = self._get_node('<node a="xx"/>')
manifest_xml.XmlInt(node, 'a')
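As a standalone illustration of the helpers covered by ValueTests, the sketch below parses a node and reads attributes with XmlBool and XmlInt; the attribute names here are arbitrary examples, not ones required by repo:

import xml.dom.minidom
import manifest_xml

node = xml.dom.minidom.parseString('<node sync-c="true" depth="42"/>').firstChild
print(manifest_xml.XmlBool(node, 'sync-c'))          # True
print(manifest_xml.XmlInt(node, 'depth'))            # 42
print(manifest_xml.XmlBool(node, 'missing', False))  # False (the supplied default)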
class XmlManifestTests(ManifestParseTestCase):
"""Check manifest processing."""
def test_empty(self):
"""Parse an 'empty' manifest file."""
manifest = self.getXmlManifest(
'<?xml version="1.0" encoding="UTF-8"?>'
'<manifest></manifest>')
self.assertEqual(manifest.remotes, {})
self.assertEqual(manifest.projects, [])
def test_link(self):
"""Verify Link handling with new names."""
manifest = manifest_xml.XmlManifest(self.repodir, self.manifest_file)
with open(os.path.join(self.manifest_dir, 'foo.xml'), 'w') as fp:
fp.write('<manifest></manifest>')
manifest.Link('foo.xml')
with open(self.manifest_file) as fp:
self.assertIn('<include name="foo.xml" />', fp.read())
def test_toxml_empty(self):
"""Verify the ToXml() helper."""
manifest = self.getXmlManifest(
'<?xml version="1.0" encoding="UTF-8"?>'
'<manifest></manifest>')
self.assertEqual(manifest.ToXml().toxml(), '<?xml version="1.0" ?><manifest/>')
def test_todict_empty(self):
"""Verify the ToDict() helper."""
manifest = self.getXmlManifest(
'<?xml version="1.0" encoding="UTF-8"?>'
'<manifest></manifest>')
self.assertEqual(manifest.ToDict(), {})
def test_repo_hooks(self):
"""Check repo-hooks settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="repohooks" path="src/repohooks"/>
<repo-hooks in-project="repohooks" enabled-list="a, b"/>
</manifest>
""")
self.assertEqual(manifest.repo_hooks_project.name, 'repohooks')
self.assertEqual(manifest.repo_hooks_project.enabled_repo_hooks, ['a', 'b'])
def test_unknown_tags(self):
"""Check superproject settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
<iankaz value="unknown (possible) future tags are ignored"/>
<x-custom-tag>X tags are always ignored</x-custom-tag>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'superproject')
self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="test-remote" fetch="http://localhost"/>' +
'<default remote="test-remote" revision="refs/heads/main"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
class IncludeElementTests(ManifestParseTestCase):
"""Tests for <include>."""
def test_group_levels(self):
root_m = os.path.join(self.manifest_dir, 'root.xml')
with open(root_m, 'w') as fp:
fp.write("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<include name="level1.xml" groups="level1-group" />
<project name="root-name1" path="root-path1" />
<project name="root-name2" path="root-path2" groups="r2g1,r2g2" />
</manifest>
""")
with open(os.path.join(self.manifest_dir, 'level1.xml'), 'w') as fp:
fp.write("""
<manifest>
<include name="level2.xml" groups="level2-group" />
<project name="level1-name1" path="level1-path1" />
</manifest>
""")
with open(os.path.join(self.manifest_dir, 'level2.xml'), 'w') as fp:
fp.write("""
<manifest>
<project name="level2-name1" path="level2-path1" groups="l2g1,l2g2" />
</manifest>
""")
include_m = manifest_xml.XmlManifest(self.repodir, root_m)
for proj in include_m.projects:
if proj.name == 'root-name1':
# Check include group not set on root level proj.
self.assertNotIn('level1-group', proj.groups)
if proj.name == 'root-name2':
# Check root proj group not removed.
self.assertIn('r2g1', proj.groups)
if proj.name == 'level1-name1':
# Check level1 proj has inherited group level 1.
self.assertIn('level1-group', proj.groups)
if proj.name == 'level2-name1':
# Check level2 proj has inherited group levels 1 and 2.
self.assertIn('level1-group', proj.groups)
self.assertIn('level2-group', proj.groups)
# Check level2 proj group not removed.
self.assertIn('l2g1', proj.groups)
def test_allow_bad_name_from_user(self):
"""Check handling of bad name attribute from the user's input."""
def parse(name):
manifest = self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<include name="{name}" />
</manifest>
""")
# Force the manifest to be parsed.
manifest.ToXml()
# Setup target of the include.
target = os.path.join(self.tempdir, 'target.xml')
with open(target, 'w') as fp:
fp.write('<manifest></manifest>')
# Include with absolute path.
parse(os.path.abspath(target))
# Include with relative path.
parse(os.path.relpath(target, self.manifest_dir))
def test_bad_name_checks(self):
"""Check handling of bad name attribute."""
def parse(name):
# Setup target of the include.
with open(os.path.join(self.manifest_dir, 'target.xml'), 'w') as fp:
fp.write(f'<manifest><include name="{name}"/></manifest>')
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<include name="target.xml" />
</manifest>
""")
# Force the manifest to be parsed.
manifest.ToXml()
# Handle empty name explicitly because a different codepath rejects it.
with self.assertRaises(error.ManifestParseError):
parse('')
for path in INVALID_FS_PATHS:
if not path:
continue
with self.assertRaises(error.ManifestInvalidPathError):
parse(path)
class ProjectElementTests(ManifestParseTestCase):
"""Tests for <project>."""
def test_group(self):
"""Check project group settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="test-name" path="test-path"/>
<project name="extras" path="path" groups="g1,g2,g1"/>
</manifest>
""")
self.assertEqual(len(manifest.projects), 2)
# Ordering isn't guaranteed.
result = {
manifest.projects[0].name: manifest.projects[0].groups,
manifest.projects[1].name: manifest.projects[1].groups,
}
project = manifest.projects[0]
self.assertCountEqual(
result['test-name'],
['name:test-name', 'all', 'path:test-path'])
self.assertCountEqual(
result['extras'],
['g1', 'g2', 'g1', 'name:extras', 'all', 'path:path'])
groupstr = 'default,platform-' + platform.system().lower()
self.assertEqual(groupstr, manifest.GetGroupsStr())
groupstr = 'g1,g2,g1'
manifest.manifestProject.config.SetString('manifest.groups', groupstr)
self.assertEqual(groupstr, manifest.GetGroupsStr())
def test_set_revision_id(self):
"""Check setting of project's revisionId."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="test-name"/>
</manifest>
""")
self.assertEqual(len(manifest.projects), 1)
project = manifest.projects[0]
project.SetRevisionId('ABCDEF')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<project name="test-name" revision="ABCDEF"/>' +
'</manifest>')
def test_trailing_slash(self):
"""Check handling of trailing slashes in attributes."""
def parse(name, path):
return self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="{name}" path="{path}" />
</manifest>
""")
manifest = parse('a/path/', 'foo')
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/foo.git'))
self.assertEqual(manifest.projects[0].objdir,
os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
manifest = parse('a/path', 'foo/')
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/foo.git'))
self.assertEqual(manifest.projects[0].objdir,
os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
manifest = parse('a/path', 'foo//////')
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/foo.git'))
self.assertEqual(manifest.projects[0].objdir,
os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
def test_toplevel_path(self):
"""Check handling of path=. specially."""
def parse(name, path):
return self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="{name}" path="{path}" />
</manifest>
""")
for path in ('.', './', './/', './//'):
manifest = parse('server/path', path)
self.assertEqual(manifest.projects[0].gitdir,
os.path.join(self.tempdir, '.repo/projects/..git'))
def test_bad_path_name_checks(self):
"""Check handling of bad path & name attributes."""
def parse(name, path):
manifest = self.getXmlManifest(f"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="{name}" path="{path}" />
</manifest>
""")
# Force the manifest to be parsed.
manifest.ToXml()
# Verify the parser is valid by default to avoid buggy tests below.
parse('ok', 'ok')
# Handle empty name explicitly because a different codepath rejects it.
# Empty path is OK because it defaults to the name field.
with self.assertRaises(error.ManifestParseError):
parse('', 'ok')
for path in INVALID_FS_PATHS:
if not path or path.endswith('/'):
continue
with self.assertRaises(error.ManifestInvalidPathError):
parse(path, 'ok')
# We have a dedicated test for path=".".
if path not in {'.'}:
with self.assertRaises(error.ManifestInvalidPathError):
parse('ok', path)
class SuperProjectElementTests(ManifestParseTestCase):
"""Tests for <superproject>."""
def test_superproject(self):
"""Check superproject settings."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<superproject name="superproject"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'superproject')
self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="test-remote" fetch="http://localhost"/>' +
'<default remote="test-remote" revision="refs/heads/main"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
def test_remote(self):
"""Check superproject settings with a remote."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<remote name="superproject-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="platform/superproject" remote="superproject-remote"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'platform/superproject')
self.assertEqual(manifest.superproject['remote'].name, 'superproject-remote')
self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/platform/superproject')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<remote name="superproject-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<superproject name="platform/superproject" remote="superproject-remote"/>' +
'</manifest>')
def test_default_remote(self):
"""Check superproject settings with a default remote."""
manifest = self.getXmlManifest("""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<superproject name="superproject" remote="default-remote"/>
</manifest>
""")
self.assertEqual(manifest.superproject['name'], 'superproject')
self.assertEqual(manifest.superproject['remote'].name, 'default-remote')
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?><manifest>' +
'<remote name="default-remote" fetch="http://localhost"/>' +
'<default remote="default-remote" revision="refs/heads/main"/>' +
'<superproject name="superproject"/>' +
'</manifest>')
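Outside the unittest harness, parsing a manifest with this module follows the same setup as ManifestParseTestCase.setUp above; a minimal sketch, where the paths and the stub git config are assumptions mirroring that setUp:

import os
import manifest_xml

repodir = '/tmp/demo/.repo'                      # hypothetical client dir
os.makedirs(os.path.join(repodir, 'manifests.git'), exist_ok=True)
with open(os.path.join(repodir, 'manifests.git', 'config'), 'w') as fp:
    fp.write('[remote "origin"]\n\turl = https://localhost:0/manifest\n')

manifest_file = os.path.join(repodir, manifest_xml.MANIFEST_FILE_NAME)
with open(manifest_file, 'w') as fp:
    fp.write('<?xml version="1.0" encoding="UTF-8"?><manifest></manifest>')

manifest = manifest_xml.XmlManifest(repodir, manifest_file)
print(manifest.projects)                         # [] for an empty manifest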


@@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,6 +16,8 @@
"""Unittests for the project.py module."""
from __future__ import print_function
import contextlib
import os
import shutil
@@ -21,10 +25,7 @@ import subprocess
import tempfile
import unittest
import error
import git_command
import git_config
import platform_utils
import project
@@ -35,22 +36,49 @@ def TempGitTree():
# Python 2 support entirely.
try:
tempdir = tempfile.mkdtemp(prefix='repo-tests')
# Tests need to assume that 'main' is the default branch at init,
# which git does not support via config until 2.28.
cmd = ['git', 'init']
if git_command.git_require((2, 28, 0)):
cmd += ['--initial-branch=main']
else:
# Use template dir for init.
templatedir = tempfile.mkdtemp(prefix='.test-template')
with open(os.path.join(templatedir, 'HEAD'), 'w') as fp:
fp.write('ref: refs/heads/main\n')
cmd += ['--template', templatedir]
subprocess.check_call(cmd, cwd=tempdir)
subprocess.check_call(['git', 'init'], cwd=tempdir)
yield tempdir
finally:
platform_utils.rmtree(tempdir)
shutil.rmtree(tempdir)
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = (
'',
'# -*- coding:utf-8 -*-\n',
'#\n# foo\n',
'# Bad shebang in script\n#!/foo\n'
)
for data in DATA:
self.assertIsNone(project.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
('#!/foo', '/foo'),
('#! /foo', '/foo'),
('#!/bin/foo ', '/bin/foo'),
('#! /usr/foo ', '/usr/foo'),
('#! /usr/foo -args', '/usr/foo'),
)
for shebang, interp in DATA:
self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
interp)
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
('#!/usr/bin/env foo', 'foo'),
('#!/bin/env foo', 'foo'),
('#! /bin/env /bin/foo ', '/bin/foo'),
)
for shebang, interp in DATA:
self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
interp)
class FakeProject(object):
@@ -86,7 +114,7 @@ class ReviewableBranchTests(unittest.TestCase):
# Start off with the normal details.
rb = project.ReviewableBranch(
fakeproj, fakeproj.config.GetBranch('work'), 'main')
fakeproj, fakeproj.config.GetBranch('work'), 'master')
self.assertEqual('work', rb.name)
self.assertEqual(1, len(rb.commits))
self.assertIn('Del file', rb.commits[0])
@@ -99,239 +127,10 @@ class ReviewableBranchTests(unittest.TestCase):
self.assertTrue(rb.date)
# Now delete the tracking branch!
fakeproj.work_git.branch('-D', 'main')
fakeproj.work_git.branch('-D', 'master')
rb = project.ReviewableBranch(
fakeproj, fakeproj.config.GetBranch('work'), 'main')
fakeproj, fakeproj.config.GetBranch('work'), 'master')
self.assertEqual(0, len(rb.commits))
self.assertFalse(rb.base_exists)
# Hard to assert anything useful about this.
self.assertTrue(rb.date)
class CopyLinkTestCase(unittest.TestCase):
"""TestCase for stub repo client checkouts.
It'll have a layout like:
tempdir/ # self.tempdir
checkout/ # self.topdir
git-project/ # self.worktree
Attributes:
tempdir: A dedicated temporary directory.
worktree: The top of the repo client checkout.
topdir: The top of a project checkout.
"""
def setUp(self):
self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
self.topdir = os.path.join(self.tempdir, 'checkout')
self.worktree = os.path.join(self.topdir, 'git-project')
os.makedirs(self.topdir)
os.makedirs(self.worktree)
def tearDown(self):
shutil.rmtree(self.tempdir, ignore_errors=True)
@staticmethod
def touch(path):
with open(path, 'w'):
pass
def assertExists(self, path, msg=None):
"""Make sure |path| exists."""
if os.path.exists(path):
return
if msg is None:
msg = ['path is missing: %s' % path]
while path != '/':
path = os.path.dirname(path)
if not path:
# If we're given something like "foo", abort once we get to "".
break
result = os.path.exists(path)
msg.append('\tos.path.exists(%s): %s' % (path, result))
if result:
msg.append('\tcontents: %r' % os.listdir(path))
break
msg = '\n'.join(msg)
raise self.failureException(msg)
class CopyFile(CopyLinkTestCase):
"""Check _CopyFile handling."""
def CopyFile(self, src, dest):
return project._CopyFile(self.worktree, src, self.topdir, dest)
def test_basic(self):
"""Basic test of copying a file from a project to the toplevel."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
cf = self.CopyFile('foo.txt', 'foo')
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'foo'))
def test_src_subdir(self):
"""Copy a file from a subdir of a project."""
src = os.path.join(self.worktree, 'bar', 'foo.txt')
os.makedirs(os.path.dirname(src))
self.touch(src)
cf = self.CopyFile('bar/foo.txt', 'new.txt')
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'new.txt'))
def test_dest_subdir(self):
"""Copy a file to a subdir of a checkout."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
cf = self.CopyFile('foo.txt', 'sub/dir/new.txt')
self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'new.txt'))
def test_update(self):
"""Make sure changed files get copied again."""
src = os.path.join(self.worktree, 'foo.txt')
dest = os.path.join(self.topdir, 'bar')
with open(src, 'w') as f:
f.write('1st')
cf = self.CopyFile('foo.txt', 'bar')
cf._Copy()
self.assertExists(dest)
with open(dest) as f:
self.assertEqual(f.read(), '1st')
with open(src, 'w') as f:
f.write('2nd!')
cf._Copy()
with open(dest) as f:
self.assertEqual(f.read(), '2nd!')
def test_src_block_symlink(self):
"""Do not allow reading from a symlinked path."""
src = os.path.join(self.worktree, 'foo.txt')
sym = os.path.join(self.worktree, 'sym')
self.touch(src)
platform_utils.symlink('foo.txt', sym)
self.assertExists(sym)
cf = self.CopyFile('sym', 'foo')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_symlink_traversal(self):
"""Do not allow reading through a symlink dir."""
realfile = os.path.join(self.tempdir, 'file.txt')
self.touch(realfile)
src = os.path.join(self.worktree, 'bar', 'file.txt')
platform_utils.symlink(self.tempdir, os.path.join(self.worktree, 'bar'))
self.assertExists(src)
cf = self.CopyFile('bar/file.txt', 'foo')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_copy_from_dir(self):
"""Do not allow copying from a directory."""
src = os.path.join(self.worktree, 'dir')
os.makedirs(src)
cf = self.CopyFile('dir', 'foo')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_dest_block_symlink(self):
"""Do not allow writing to a symlink."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
platform_utils.symlink('dest', os.path.join(self.topdir, 'sym'))
cf = self.CopyFile('foo.txt', 'sym')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_dest_block_symlink_traversal(self):
"""Do not allow writing through a symlink dir."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
platform_utils.symlink(tempfile.gettempdir(),
os.path.join(self.topdir, 'sym'))
cf = self.CopyFile('foo.txt', 'sym/foo.txt')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_copy_to_dir(self):
"""Do not allow copying to a directory."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
os.makedirs(os.path.join(self.topdir, 'dir'))
cf = self.CopyFile('foo.txt', 'dir')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
class LinkFile(CopyLinkTestCase):
"""Check _LinkFile handling."""
def LinkFile(self, src, dest):
return project._LinkFile(self.worktree, src, self.topdir, dest)
def test_basic(self):
"""Basic test of linking a file from a project into the toplevel."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
lf = self.LinkFile('foo.txt', 'foo')
lf._Link()
dest = os.path.join(self.topdir, 'foo')
self.assertExists(dest)
self.assertTrue(os.path.islink(dest))
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
def test_src_subdir(self):
"""Link to a file in a subdir of a project."""
src = os.path.join(self.worktree, 'bar', 'foo.txt')
os.makedirs(os.path.dirname(src))
self.touch(src)
lf = self.LinkFile('bar/foo.txt', 'foo')
lf._Link()
self.assertExists(os.path.join(self.topdir, 'foo'))
def test_src_self(self):
"""Link to the project itself."""
dest = os.path.join(self.topdir, 'foo', 'bar')
lf = self.LinkFile('.', 'foo/bar')
lf._Link()
self.assertExists(dest)
self.assertEqual(os.path.join('..', 'git-project'), os.readlink(dest))
def test_dest_subdir(self):
"""Link a file to a subdir of a checkout."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
lf = self.LinkFile('foo.txt', 'sub/dir/foo/bar')
self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
lf._Link()
self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'foo', 'bar'))
def test_src_block_relative(self):
"""Do not allow relative symlinks."""
BAD_SOURCES = (
'./',
'..',
'../',
'foo/.',
'foo/./bar',
'foo/..',
'foo/../foo',
)
for src in BAD_SOURCES:
lf = self.LinkFile(src, 'foo')
self.assertRaises(error.ManifestInvalidPathError, lf._Link)
def test_update(self):
"""Make sure changed targets get updated."""
dest = os.path.join(self.topdir, 'sym')
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
lf = self.LinkFile('foo.txt', 'sym')
lf._Link()
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
# Point the symlink somewhere else.
os.unlink(dest)
platform_utils.symlink(self.tempdir, dest)
lf._Link()
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
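The _CopyFile/_LinkFile helpers exercised above take a project worktree, a source path relative to it, the client topdir, and a destination relative to that topdir; a hedged usage sketch mirroring the test helpers (the directory layout is an assumption):

import os
import project

topdir = '/tmp/checkout'                          # hypothetical repo client root
worktree = os.path.join(topdir, 'git-project')    # hypothetical project checkout
os.makedirs(worktree, exist_ok=True)
open(os.path.join(worktree, 'foo.txt'), 'w').close()

cf = project._CopyFile(worktree, 'foo.txt', topdir, 'foo')
cf._Copy()       # copies git-project/foo.txt to checkout/foo, rejecting unsafe paths

lf = project._LinkFile(worktree, 'foo.txt', topdir, 'link-to-foo')
lf._Link()       # creates a relative symlink checkout/link-to-foo -> git-project/foo.txt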


@@ -1,43 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds module (mostly __init__.py than subcommands)."""
import unittest
import subcmds
class AllCommands(unittest.TestCase):
"""Check registered all_commands."""
def test_required_basic(self):
"""Basic checking of registered commands."""
# NB: We don't test all subcommands as we want to avoid "change detection"
# tests, so we just look for the most common/important ones here that are
# unlikely to ever change.
for cmd in {'cherry-pick', 'help', 'init', 'start', 'sync', 'upload'}:
self.assertIn(cmd, subcmds.all_commands)
def test_naming(self):
"""Verify we don't add things that we shouldn't."""
for cmd in subcmds.all_commands:
# Reject filename suffixes like "help.py".
self.assertNotIn('.', cmd)
# Make sure all '_' were converted to '-'.
self.assertNotIn('_', cmd)
# Reject internal python paths like "__init__".
self.assertFalse(cmd.startswith('__'))


@@ -1,49 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds/init.py module."""
import unittest
from subcmds import init
class InitCommand(unittest.TestCase):
"""Check registered all_commands."""
def setUp(self):
self.cmd = init.Init()
def test_cli_parser_good(self):
"""Check valid command line options."""
ARGV = (
[],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
self.cmd.ValidateOptions(opts, args)
def test_cli_parser_bad(self):
"""Check invalid command line options."""
ARGV = (
# Too many arguments.
['url', 'asdf'],
# Conflicting options.
['--mirror', '--archive'],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
with self.assertRaises(SystemExit):
self.cmd.ValidateOptions(opts, args)
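The same parse-then-validate pattern can be driven outside the test, as in this small sketch that uses only the calls shown above:

from subcmds import init

cmd = init.Init()
opts, args = cmd.OptionParser.parse_args([])   # an empty argv is accepted
cmd.ValidateOptions(opts, args)                # conflicting options would raise SystemExit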


@@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2015 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,86 +16,26 @@
"""Unittests for the wrapper.py module."""
import contextlib
from io import StringIO
from __future__ import print_function
import os
import re
import shutil
import sys
import tempfile
import unittest
from unittest import mock
import git_command
import main
import platform_utils
import wrapper
@contextlib.contextmanager
def TemporaryDirectory():
"""Create a new empty git checkout for testing."""
# TODO(vapier): Convert this to tempfile.TemporaryDirectory once we drop
# Python 2 support entirely.
try:
tempdir = tempfile.mkdtemp(prefix='repo-tests')
yield tempdir
finally:
platform_utils.rmtree(tempdir)
def fixture(*paths):
"""Return a path relative to tests/fixtures.
"""
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
class RepoWrapperTestCase(unittest.TestCase):
"""TestCase for the wrapper module."""
def setUp(self):
"""Load the wrapper module every time."""
wrapper._wrapper_module = None
self.wrapper = wrapper.Wrapper()
class RepoWrapperUnitTest(RepoWrapperTestCase):
class RepoWrapperUnitTest(unittest.TestCase):
"""Tests helper functions in the repo wrapper
"""
def test_version(self):
"""Make sure _Version works."""
with self.assertRaises(SystemExit) as e:
with mock.patch('sys.stdout', new_callable=StringIO) as stdout:
with mock.patch('sys.stderr', new_callable=StringIO) as stderr:
self.wrapper._Version()
self.assertEqual(0, e.exception.code)
self.assertEqual('', stderr.getvalue())
self.assertIn('repo launcher version', stdout.getvalue())
def test_python_constraints(self):
"""The launcher should never require newer than main.py."""
self.assertGreaterEqual(main.MIN_PYTHON_VERSION_HARD,
wrapper.MIN_PYTHON_VERSION_HARD)
self.assertGreaterEqual(main.MIN_PYTHON_VERSION_SOFT,
wrapper.MIN_PYTHON_VERSION_SOFT)
# Make sure the versions are themselves in sync.
self.assertGreaterEqual(wrapper.MIN_PYTHON_VERSION_SOFT,
wrapper.MIN_PYTHON_VERSION_HARD)
def test_init_parser(self):
"""Make sure 'init' GetParser works."""
parser = self.wrapper.GetParser(gitc_init=False)
opts, args = parser.parse_args([])
self.assertEqual([], args)
self.assertIsNone(opts.manifest_url)
def test_gitc_init_parser(self):
"""Make sure 'gitc-init' GetParser works."""
parser = self.wrapper.GetParser(gitc_init=True)
opts, args = parser.parse_args([])
self.assertEqual([], args)
self.assertIsNone(opts.manifest_file)
def setUp(self):
"""Load the wrapper module every time
"""
wrapper._wrapper_module = None
self.wrapper = wrapper.Wrapper()
def test_get_gitc_manifest_dir_no_gitc(self):
"""
@@ -130,442 +72,9 @@ class RepoWrapperUnitTest(RepoWrapperTestCase):
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/extra'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'),
'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/'), None)
class SetGitTrace2ParentSid(RepoWrapperTestCase):
"""Check SetGitTrace2ParentSid behavior."""
KEY = 'GIT_TRACE2_PARENT_SID'
VALID_FORMAT = re.compile(r'^repo-[0-9]{8}T[0-9]{6}Z-P[0-9a-f]{8}$')
def test_first_set(self):
"""Test env var not yet set."""
env = {}
self.wrapper.SetGitTrace2ParentSid(env)
self.assertIn(self.KEY, env)
value = env[self.KEY]
self.assertRegex(value, self.VALID_FORMAT)
def test_append(self):
"""Test env var is appended."""
env = {self.KEY: 'pfx'}
self.wrapper.SetGitTrace2ParentSid(env)
self.assertIn(self.KEY, env)
value = env[self.KEY]
self.assertTrue(value.startswith('pfx/'))
self.assertRegex(value[4:], self.VALID_FORMAT)
def test_global_context(self):
"""Check os.environ gets updated by default."""
os.environ.pop(self.KEY, None)
self.wrapper.SetGitTrace2ParentSid()
self.assertIn(self.KEY, os.environ)
value = os.environ[self.KEY]
self.assertRegex(value, self.VALID_FORMAT)
class RunCommand(RepoWrapperTestCase):
"""Check run_command behavior."""
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_command(['echo', 'hi'], capture_output=True)
self.assertEqual(ret.stdout, 'hi\n')
def test_check(self):
"""Check check handling."""
self.wrapper.run_command(['true'], check=False)
self.wrapper.run_command(['true'], check=True)
self.wrapper.run_command(['false'], check=False)
with self.assertRaises(self.wrapper.RunError):
self.wrapper.run_command(['false'], check=True)
class RunGit(RepoWrapperTestCase):
"""Check run_git behavior."""
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_git('--version')
self.assertIn('git', ret.stdout)
def test_check(self):
"""Check check handling."""
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper.run_git('--version-asdfasdf')
self.wrapper.run_git('--version-asdfasdf', check=False)
class ParseGitVersion(RepoWrapperTestCase):
"""Check ParseGitVersion behavior."""
def test_autoload(self):
"""Check we can load the version from the live git."""
ret = self.wrapper.ParseGitVersion()
self.assertIsNotNone(ret)
def test_bad_ver(self):
"""Check handling of bad git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='asdf')
self.assertIsNone(ret)
def test_normal_ver(self):
"""Check handling of normal git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='git version 2.25.1')
self.assertEqual(2, ret.major)
self.assertEqual(25, ret.minor)
self.assertEqual(1, ret.micro)
self.assertEqual('2.25.1', ret.full)
def test_extended_ver(self):
"""Check handling of extended distro git versions."""
ret = self.wrapper.ParseGitVersion(
ver_str='git version 1.30.50.696.g5e7596f4ac-goog')
self.assertEqual(1, ret.major)
self.assertEqual(30, ret.minor)
self.assertEqual(50, ret.micro)
self.assertEqual('1.30.50.696.g5e7596f4ac-goog', ret.full)
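ParseGitVersion, as pinned down above, returns a named tuple with major/minor/micro/full fields (or None for unparseable input); a quick sketch using the same launcher-module loader these tests use:

import wrapper

w = wrapper.Wrapper()
ver = w.ParseGitVersion(ver_str='git version 2.25.1')
print(ver.major, ver.minor, ver.micro, ver.full)   # 2 25 1 2.25.1
print(w.ParseGitVersion(ver_str='asdf'))           # None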
class CheckGitVersion(RepoWrapperTestCase):
"""Check _CheckGitVersion behavior."""
def test_unknown(self):
"""Unknown versions should abort."""
with mock.patch.object(self.wrapper, 'ParseGitVersion', return_value=None):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_old(self):
"""Old versions should abort."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(1, 0, 0, '1.0.0')):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_new(self):
"""Newer versions should run fine."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(100, 0, 0, '100.0.0')):
self.wrapper._CheckGitVersion()
class Requirements(RepoWrapperTestCase):
"""Check Requirements handling."""
def test_missing_file(self):
"""Don't crash if the file is missing (old version)."""
testdir = os.path.dirname(os.path.realpath(__file__))
self.assertIsNone(self.wrapper.Requirements.from_dir(testdir))
self.assertIsNone(self.wrapper.Requirements.from_file(
os.path.join(testdir, 'xxxxxxxxxxxxxxxxxxxxxxxx')))
def test_corrupt_data(self):
"""If the file can't be parsed, don't blow up."""
self.assertIsNone(self.wrapper.Requirements.from_file(__file__))
self.assertIsNone(self.wrapper.Requirements.from_data(b'x'))
def test_valid_data(self):
"""Make sure we can parse the file we ship."""
self.assertIsNotNone(self.wrapper.Requirements.from_data(b'{}'))
rootdir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
self.assertIsNotNone(self.wrapper.Requirements.from_dir(rootdir))
self.assertIsNotNone(self.wrapper.Requirements.from_file(os.path.join(
rootdir, 'requirements.json')))
def test_format_ver(self):
"""Check format_ver can format."""
self.assertEqual('1.2.3', self.wrapper.Requirements._format_ver((1, 2, 3)))
self.assertEqual('1', self.wrapper.Requirements._format_ver([1]))
def test_assert_all_unknown(self):
"""Check assert_all works with incompatible file."""
reqs = self.wrapper.Requirements({})
reqs.assert_all()
def test_assert_all_new_repo(self):
"""Check assert_all accepts new enough repo."""
reqs = self.wrapper.Requirements({'repo': {'hard': [1, 0]}})
reqs.assert_all()
def test_assert_all_old_repo(self):
"""Check assert_all rejects old repo."""
reqs = self.wrapper.Requirements({'repo': {'hard': [99999, 0]}})
with self.assertRaises(SystemExit):
reqs.assert_all()
def test_assert_all_new_python(self):
"""Check assert_all accepts new enough python."""
reqs = self.wrapper.Requirements({'python': {'hard': sys.version_info}})
reqs.assert_all()
def test_assert_all_old_python(self):
"""Check assert_all rejects old python."""
reqs = self.wrapper.Requirements({'python': {'hard': [99999, 0]}})
with self.assertRaises(SystemExit):
reqs.assert_all()
def test_assert_ver_unknown(self):
"""Check assert_ver works with incompatible file."""
reqs = self.wrapper.Requirements({})
reqs.assert_ver('xxx', (1, 0))
def test_assert_ver_new(self):
"""Check assert_ver allows new enough versions."""
reqs = self.wrapper.Requirements({'git': {'hard': [1, 0], 'soft': [2, 0]}})
reqs.assert_ver('git', (1, 0))
reqs.assert_ver('git', (1, 5))
reqs.assert_ver('git', (2, 0))
reqs.assert_ver('git', (2, 5))
def test_assert_ver_old(self):
"""Check assert_ver rejects old versions."""
reqs = self.wrapper.Requirements({'git': {'hard': [1, 0], 'soft': [2, 0]}})
with self.assertRaises(SystemExit):
reqs.assert_ver('git', (0, 5))
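Requirements can be constructed from a plain dict (as these tests do) or loaded from the shipped requirements.json; a brief sketch using only calls seen above:

import wrapper

w = wrapper.Wrapper()
reqs = w.Requirements({'git': {'hard': [1, 0], 'soft': [2, 0]}})
reqs.assert_ver('git', (2, 5))                       # ok: newer than both thresholds
print(w.Requirements._format_ver((1, 2, 3)))         # '1.2.3'
print(w.Requirements.from_data(b'{}') is not None)   # True: empty JSON parses fine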
class NeedSetupGnuPG(RepoWrapperTestCase):
"""Check NeedSetupGnuPG behavior."""
def test_missing_dir(self):
"""The ~/.repoconfig tree doesn't exist yet."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = os.path.join(tempdir, 'foo')
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_missing_keyring(self):
"""The keyring-version file doesn't exist yet."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_empty_keyring(self):
"""The keyring-version file exists, but is empty."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w'):
pass
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_old_keyring(self):
"""The keyring-version file exists, but it's old."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1.0\n')
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_new_keyring(self):
"""The keyring-version file exists, and is up-to-date."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1000.0\n')
self.assertFalse(self.wrapper.NeedSetupGnuPG())
class SetupGnuPG(RepoWrapperTestCase):
"""Check SetupGnuPG behavior."""
def test_full(self):
"""Make sure it works completely."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
self.wrapper.gpg_dir = os.path.join(self.wrapper.home_dot_repo, 'gnupg')
self.assertTrue(self.wrapper.SetupGnuPG(True))
with open(os.path.join(tempdir, 'keyring-version'), 'r') as fp:
data = fp.read()
self.assertEqual('.'.join(str(x) for x in self.wrapper.KEYRING_VERSION),
data.strip())
class VerifyRev(RepoWrapperTestCase):
"""Check verify_rev behavior."""
def test_verify_passes(self):
"""Check when we have a valid signed tag."""
desc_result = self.wrapper.RunResult(0, 'v1.0\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
def test_unsigned_commit(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
def test_verify_fails(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
gpg_result = Exception
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
with self.assertRaises(Exception):
self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
class GitCheckoutTestCase(RepoWrapperTestCase):
"""Tests that use a real/small git checkout."""
GIT_DIR = None
REV_LIST = None
@classmethod
def setUpClass(cls):
# Create a repo to operate on, but do it once per-class.
cls.GIT_DIR = tempfile.mkdtemp(prefix='repo-rev-tests')
run_git = wrapper.Wrapper().run_git
remote = os.path.join(cls.GIT_DIR, 'remote')
os.mkdir(remote)
# Tests need to assume that 'main' is the default branch at init,
# which git does not support via config until 2.28.
if git_command.git_require((2, 28, 0)):
initstr = '--initial-branch=main'
else:
# Use template dir for init.
templatedir = tempfile.mkdtemp(prefix='.test-template')
with open(os.path.join(templatedir, 'HEAD'), 'w') as fp:
fp.write('ref: refs/heads/main\n')
initstr = '--template=' + templatedir
run_git('init', initstr, cwd=remote)
run_git('commit', '--allow-empty', '-minit', cwd=remote)
run_git('branch', 'stable', cwd=remote)
run_git('tag', 'v1.0', cwd=remote)
run_git('commit', '--allow-empty', '-m2nd commit', cwd=remote)
cls.REV_LIST = run_git('rev-list', 'HEAD', cwd=remote).stdout.splitlines()
run_git('init', cwd=cls.GIT_DIR)
run_git('fetch', remote, '+refs/heads/*:refs/remotes/origin/*', cwd=cls.GIT_DIR)
@classmethod
def tearDownClass(cls):
if not cls.GIT_DIR:
return
shutil.rmtree(cls.GIT_DIR)
class ResolveRepoRev(GitCheckoutTestCase):
"""Check resolve_repo_rev behavior."""
def test_explicit_branch(self):
"""Check refs/heads/branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/unknown')
def test_explicit_tag(self):
"""Check refs/tags/tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/unknown')
def test_branch_name(self):
"""Check branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'main')
self.assertEqual('refs/heads/main', rrev)
self.assertEqual(self.REV_LIST[0], lrev)
def test_tag_name(self):
"""Check tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
def test_full_commit(self):
"""Check specific commit argument."""
commit = self.REV_LIST[0]
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
self.assertEqual(commit, rrev)
self.assertEqual(commit, lrev)
def test_partial_commit(self):
"""Check specific (partial) commit argument."""
commit = self.REV_LIST[0][0:20]
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
self.assertEqual(self.REV_LIST[0], rrev)
self.assertEqual(self.REV_LIST[0], lrev)
def test_unknown(self):
"""Check unknown ref/commit argument."""
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'boooooooya')
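resolve_repo_rev maps a user-supplied revision (branch, tag, or commit) to a (remote ref, local sha) pair against an existing git directory, raising CloneFailure for unknown input; a hedged sketch, with the path being an assumption:

import wrapper

w = wrapper.Wrapper()
git_dir = '/path/to/.repo/repo'                 # hypothetical existing repo checkout
rrev, lrev = w.resolve_repo_rev(git_dir, 'stable')
print(rrev)   # e.g. 'refs/heads/stable'
print(lrev)   # the resolved commit sha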
class CheckRepoVerify(RepoWrapperTestCase):
"""Check check_repo_verify behavior."""
def test_no_verify(self):
"""Always fail with --no-repo-verify."""
self.assertFalse(self.wrapper.check_repo_verify(False))
def test_gpg_initialized(self):
"""Should pass if gpg is setup already."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=False):
self.assertTrue(self.wrapper.check_repo_verify(True))
def test_need_gpg_setup(self):
"""Should pass/fail based on gpg setup."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=True):
with mock.patch.object(self.wrapper, 'SetupGnuPG') as m:
m.return_value = True
self.assertTrue(self.wrapper.check_repo_verify(True))
m.return_value = False
self.assertFalse(self.wrapper.check_repo_verify(True))
class CheckRepoRev(GitCheckoutTestCase):
"""Check check_repo_rev behavior."""
def test_verify_works(self):
"""Should pass when verification passes."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', return_value='12345'):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual('12345', lrev)
def test_verify_fails(self):
"""Should fail when verification fails."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
with self.assertRaises(Exception):
self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
def test_verify_ignore(self):
"""Should pass when verification is disabled."""
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable', repo_verify=False)
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
if __name__ == '__main__':
unittest.main()

tox.ini

@@ -15,20 +15,8 @@
# https://tox.readthedocs.io/
[tox]
envlist = py35, py36, py37, py38, py39
[gh-actions]
python =
3.5: py35
3.6: py36
3.7: py37
3.8: py38
3.9: py39
envlist = py27, py36, py37, py38
[testenv]
deps = pytest
commands = {envpython} run_tests
setenv =
GIT_AUTHOR_NAME = Repo test author
GIT_COMMITTER_NAME = Repo test committer
EMAIL = repo@gerrit.nodomain
commands = {toxinidir}/run_tests


@@ -1,3 +1,5 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -12,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
try:
from importlib.machinery import SourceFileLoader
_loader = lambda *args: SourceFileLoader(*args).load_module()
@@ -24,10 +27,7 @@ import os
def WrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
def Wrapper():
global _wrapper_module
if not _wrapper_module: