Compare commits


No commits in common. "main" and "v2.50.1" have entirely different histories.

23 changed files with 150 additions and 719 deletions

View File

@ -1,2 +1 @@
# NB: Keep in sync with run_tests.vpython3.
black<26
black<24

View File

@ -141,7 +141,7 @@ Instead, you should use standard Git workflows like [git worktree] or
(e.g. a local mirror & a public review server) while avoiding duplicating
the content. However, this can run into problems if different remotes use
the same path on their respective servers. Best to avoid that.
* `modules/`: Like `projects/`, but for git submodules.
* `subprojects/`: Like `projects/`, but for git submodules.
* `subproject-objects/`: Like `project-objects/`, but for git submodules.
* `worktrees/`: Bare checkouts of every project synced by the manifest. The
filesystem layout matches the `<project name=...` setting in the manifest

View File

@ -231,7 +231,26 @@ At most one manifest-server may be specified. The url attribute
is used to specify the URL of a manifest server, which is an
XML RPC service.
See the [smart sync documentation](./smart-sync.md) for more details.
The manifest server should implement the following RPC methods:
GetApprovedManifest(branch, target)
Return a manifest in which each project is pegged to a known good revision
for the current branch and target. This is used by repo sync when the
--smart-sync option is given.
The target to use is defined by environment variables TARGET_PRODUCT
and TARGET_BUILD_VARIANT. These variables are used to create a string
of the form $TARGET_PRODUCT-$TARGET_BUILD_VARIANT, e.g. passion-userdebug.
If one or both of those variables are not present, the program will call
GetApprovedManifest without the target parameter and the manifest server
should choose a reasonable default target.
GetManifest(tag)
Return a manifest in which each project is pegged to the revision at
the specified tag. This is used by repo sync when the --smart-tag option
is given.
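For illustration only, this is roughly what those calls look like from a Python
XML-RPC client; the server URL, branch, and tag below are hypothetical, and in
practice `repo sync --smart-sync` / `--smart-tag` issue these requests for you:

```python
import os
import xmlrpc.client

# Hypothetical URL; in practice it comes from <manifest-server url="..."/>.
server = xmlrpc.client.Server("https://example.com/your/manifest/server/url")

product = os.environ.get("TARGET_PRODUCT")
variant = os.environ.get("TARGET_BUILD_VARIANT")
if product and variant:
    # e.g. "passion-userdebug"
    manifest_xml = server.GetApprovedManifest("main", f"{product}-{variant}")
else:
    # Let the server choose a reasonable default target.
    manifest_xml = server.GetApprovedManifest("main")

# Pin every project to the revisions recorded for a known tag.
tagged_xml = server.GetManifest("android-1.0_r1")
```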
### Element submanifest

View File

@ -1,129 +0,0 @@
# repo Smart Syncing
Repo normally fetches & syncs manifests from the same URL specified during
`repo init`, and that often fetches the latest revisions of all projects in
the manifest. This flow works well for tracking and developing with the
latest code, but it's often desirable to sync to other points. For example,
to get a local build matching a specific release, or a build that reproduces
bugs reported by other people.
Repo's sync subcommand has support for fetching manifests from a server over
an XML-RPC connection. The local configuration and network API are defined by
repo, but individual projects have to host their own server for the client to
communicate with.
This process is called "smart syncing" -- instead of blindly fetching the latest
revision of all projects and getting an unknown state to develop against, the
client passes a request to the server and is given a matching manifest that
typically specifies specific commits for every project to fetch a known source
state.
[TOC]
## Manifest Configuration
The manifest specifies the server to communicate with via the
[`<manifest-server>` element](manifest-format.md#Element-manifest_server).
This is how the client knows what service to talk to.
```xml
<manifest-server url="https://example.com/your/manifest/server/url" />
```
If the URL starts with `persistent-`, then the
[`git-remote-persistent-https` helper](https://github.com/git/git/blob/HEAD/contrib/persistent-https/README)
is used to communicate with the server.
## Credentials
Credentials may be specified using standard `username:password`
[URI syntax](https://en.wikipedia.org/wiki/URI#Syntax) directly in the
`<manifest-server>` element in the manifest.
If they are not specified, `repo sync` has `--manifest-server-username=USERNAME`
and `--manifest-server-password=PASSWORD` options.
If those are not used, then repo will look up the host in your
[`~/.netrc`](https://docs.python.org/3/library/netrc.html) database.
When making the connection, cookies matching the host are automatically loaded
from the cookiejar specified in
[Git's `http.cookiefile` setting](https://git-scm.com/docs/git-config#Documentation/git-config.txt-httpcookieFile).
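The lookup order above can be sketched roughly as follows; the helper name is
ours, not repo's, and error handling is simplified:

```python
import netrc
import urllib.parse

def lookup_credentials(manifest_server_url, username=None, password=None):
    """Illustrative fallback: explicit credentials first, then ~/.netrc."""
    if username and password:
        # From the URL's userinfo or the --manifest-server-* options.
        return username, password
    host = urllib.parse.urlparse(manifest_server_url).hostname
    try:
        entry = netrc.netrc().authenticators(host)
    except (FileNotFoundError, netrc.NetrcParseError):
        entry = None
    if entry:
        login, _account, pw = entry
        return login, pw
    return None, None
```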
## Manifest Server
Unfortunately, there are no public reference implementations. Google has an
internal one for Android, but it is written using Google's internal systems,
so it wouldn't be that helpful as a reference.
That said, the XML-RPC API is pretty simple, so any standard XML-RPC server
example would do. Google's internal server uses Python's
[xmlrpc.server.SimpleXMLRPCDispatcher](https://docs.python.org/3/library/xmlrpc.server.html).
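As a minimal sketch, a toy server built on the standard library might look like
this; the lookup tables and manifest payloads are purely illustrative:

```python
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

# Placeholder stores; a real server would query its own release database.
APPROVED = {("main", "passion-userdebug"): "<manifest>...</manifest>"}
TAGGED = {"android-1.0_r1": "<manifest>...</manifest>"}

def GetApprovedManifest(branch, target=None):
    manifest_xml = APPROVED.get((branch, target))
    if manifest_xml is None:
        # The error text is shown directly to the user, so keep it actionable.
        raise xmlrpc.client.Fault(1, f"no approved manifest for {branch!r}/{target!r}")
    return manifest_xml

def GetManifest(tag):
    manifest_xml = TAGGED.get(tag)
    if manifest_xml is None:
        raise xmlrpc.client.Fault(1, f"unknown tag: {tag!r}")
    return manifest_xml

server = SimpleXMLRPCServer(("localhost", 8000))
server.register_function(GetApprovedManifest)
server.register_function(GetManifest)
server.serve_forever()
```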
## Network API
The manifest server should implement the following RPC methods.
### GetApprovedManifest
> `GetApprovedManifest(branch: str, target: Optional[str]) -> str`
The meaning of `branch` and `target` is not strictly defined. The server may
interpret them however it wants. The recommended interpretation is that the
`branch` matches the manifest branch, and `target` is an identifier for your
project that matches something users would build.
See the client section below for how repo typically generates these values.
The server will return a manifest or an error. If it's an error, repo will
show the output directly to the user to provide a limited feedback channel.
If the user's request is ambiguous and could match multiple manifests, the
server has to decide whether to pick one automatically (and silently such that
the user won't know there were multiple matches), or return an error and force
the user to be more specific.
### GetManifest
> `GetManifest(tag: str) -> str`
The meaning of `tag` is not strictly defined. Projects are encouraged to use
a system where the tag matches a unique source state.
See the client section below for how repo typically generates these values.
The server will return a manifest or an error. If it's an error, repo will
show the output directly to the user to provide a limited feedback channel.
If the user's request is ambiguous and could match multiple manifests, the
server has to decide whether to pick one automatically (and silently such that
the user won't know there were multiple matches), or return an error and force
the user to be more specific.
## Client Options
Once repo has successfully downloaded the manifest from the server, it saves a
copy into `.repo/manifests/smart_sync_override.xml` so users can examine it.
The next time `repo sync` is run, this file is automatically replaced or removed
based on the current set of options.
### --smart-sync
Repo will call `GetApprovedManifest(branch[, target])`.
The `branch` is determined by the current manifest branch as specified by
`--manifest-branch=BRANCH` when running `repo init`.
The `target` is defined by environment variables, checked in the order below. If none
of them match, then `target` is omitted. These variables were chosen because they
match the settings that Android build environments automatically set up (see the
sketch after this list).
1. `${SYNC_TARGET}`: If defined, the value is used directly.
2. `${TARGET_PRODUCT}-${TARGET_RELEASE}-${TARGET_BUILD_VARIANT}`: If these
variables are all defined, then they are merged with `-` and used.
3. `${TARGET_PRODUCT}-${TARGET_BUILD_VARIANT}`: If these variables are all
defined, then they are merged with `-` and used.
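A sketch of that selection logic (not repo's actual code) might read:

```python
import os

def smart_sync_target():
    """Pick the optional `target` argument per the priority above."""
    if os.environ.get("SYNC_TARGET"):
        return os.environ["SYNC_TARGET"]
    triple = [os.environ.get(v) for v in
              ("TARGET_PRODUCT", "TARGET_RELEASE", "TARGET_BUILD_VARIANT")]
    if all(triple):
        return "-".join(triple)
    pair = [os.environ.get(v) for v in
            ("TARGET_PRODUCT", "TARGET_BUILD_VARIANT")]
    if all(pair):
        return "-".join(pair)
    return None  # target is omitted from the GetApprovedManifest call
```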
### --smart-tag=TAG
Repo will call `GetManifest(TAG)`.

View File

@ -238,9 +238,9 @@ def _build_env(
s = p + " " + s
env["GIT_CONFIG_PARAMETERS"] = s
if "GIT_ALLOW_PROTOCOL" not in env:
env["GIT_ALLOW_PROTOCOL"] = (
"file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc"
)
env[
"GIT_ALLOW_PROTOCOL"
] = "file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc"
env["GIT_HTTP_USER_AGENT"] = user_agent.git
if objdir:
@ -350,9 +350,9 @@ class GitCommand:
"Project": e.project,
"CommandName": command_name,
"Message": str(e),
"ReturnCode": (
str(e.git_rc) if e.git_rc is not None else None
),
"ReturnCode": str(e.git_rc)
if e.git_rc is not None
else None,
"IsError": log_as_error,
}
)

View File

@ -90,20 +90,6 @@ class GitConfig:
@staticmethod
def _getUserConfig():
"""Get the user-specific config file.
Prefers the XDG config location if available, with fallback to
~/.gitconfig.
This matches git behavior:
https://git-scm.com/docs/git-config#FILES
"""
xdg_config_home = os.getenv(
"XDG_CONFIG_HOME", os.path.expanduser("~/.config")
)
xdg_config_file = os.path.join(xdg_config_home, "git", "config")
if os.path.exists(xdg_config_file):
return xdg_config_file
return os.path.expanduser("~/.gitconfig")
@classmethod

View File

@ -1014,9 +1014,9 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
def SetManifestOverride(self, path):
"""Override manifestFile. The caller must call Unload()"""
self._outer_client.manifest.manifestFileOverrides[self.path_prefix] = (
path
)
self._outer_client.manifest.manifestFileOverrides[
self.path_prefix
] = path
@property
def UseLocalManifests(self):
@ -2056,12 +2056,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
path = path.rstrip("/")
name = name.rstrip("/")
relpath = self._JoinRelpath(parent.relpath, path)
subprojects = os.path.join(parent.gitdir, "subprojects", f"{path}.git")
modules = os.path.join(parent.gitdir, "modules", path)
if platform_utils.isdir(subprojects):
gitdir = subprojects
else:
gitdir = modules
gitdir = os.path.join(parent.gitdir, "subprojects", "%s.git" % path)
objdir = os.path.join(
parent.gitdir, "subproject-objects", "%s.git" % name
)
@ -2112,22 +2107,22 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
# implementation:
# https://eclipse.googlesource.com/jgit/jgit/+/9110037e3e9461ff4dac22fee84ef3694ed57648/org.eclipse.jgit/src/org/eclipse/jgit/lib/ObjectChecker.java#884
BAD_CODEPOINTS = {
"\u200c", # ZERO WIDTH NON-JOINER
"\u200d", # ZERO WIDTH JOINER
"\u200e", # LEFT-TO-RIGHT MARK
"\u200f", # RIGHT-TO-LEFT MARK
"\u202a", # LEFT-TO-RIGHT EMBEDDING
"\u202b", # RIGHT-TO-LEFT EMBEDDING
"\u202c", # POP DIRECTIONAL FORMATTING
"\u202d", # LEFT-TO-RIGHT OVERRIDE
"\u202e", # RIGHT-TO-LEFT OVERRIDE
"\u206a", # INHIBIT SYMMETRIC SWAPPING
"\u206b", # ACTIVATE SYMMETRIC SWAPPING
"\u206c", # INHIBIT ARABIC FORM SHAPING
"\u206d", # ACTIVATE ARABIC FORM SHAPING
"\u206e", # NATIONAL DIGIT SHAPES
"\u206f", # NOMINAL DIGIT SHAPES
"\ufeff", # ZERO WIDTH NO-BREAK SPACE
"\u200C", # ZERO WIDTH NON-JOINER
"\u200D", # ZERO WIDTH JOINER
"\u200E", # LEFT-TO-RIGHT MARK
"\u200F", # RIGHT-TO-LEFT MARK
"\u202A", # LEFT-TO-RIGHT EMBEDDING
"\u202B", # RIGHT-TO-LEFT EMBEDDING
"\u202C", # POP DIRECTIONAL FORMATTING
"\u202D", # LEFT-TO-RIGHT OVERRIDE
"\u202E", # RIGHT-TO-LEFT OVERRIDE
"\u206A", # INHIBIT SYMMETRIC SWAPPING
"\u206B", # ACTIVATE SYMMETRIC SWAPPING
"\u206C", # INHIBIT ARABIC FORM SHAPING
"\u206D", # ACTIVATE ARABIC FORM SHAPING
"\u206E", # NATIONAL DIGIT SHAPES
"\u206F", # NOMINAL DIGIT SHAPES
"\uFEFF", # ZERO WIDTH NO-BREAK SPACE
}
if BAD_CODEPOINTS & path_codepoints:
# This message is more expansive than reality, but should be fine.

View File

@ -40,7 +40,7 @@ def RunPager(globalConfig):
def TerminatePager():
global pager_process
global pager_process, old_stdout, old_stderr
if pager_process:
sys.stdout.flush()
sys.stderr.flush()

View File

@ -156,12 +156,6 @@ def remove(path, missing_ok=False):
os.rmdir(longpath)
else:
os.remove(longpath)
elif (
e.errno == errno.EROFS
and missing_ok
and not os.path.exists(longpath)
):
pass
elif missing_ok and e.errno == errno.ENOENT:
pass
else:

View File

@ -642,10 +642,6 @@ class Project:
# project containing repo hooks.
self.enabled_repo_hooks = []
# This will be updated later if the project has submodules and
# if they will be synced.
self.has_subprojects = False
def RelPath(self, local=True):
"""Return the path for the project relative to a manifest.
@ -1564,11 +1560,6 @@ class Project:
return
self._InitWorkTree(force_sync=force_sync, submodules=submodules)
# TODO(https://git-scm.com/docs/git-worktree#_bugs): Re-evaluate if
# submodules can be init when using worktrees once its support is
# complete.
if self.has_subprojects and not self.use_git_worktrees:
self._InitSubmodules()
all_refs = self.bare_ref.all
self.CleanPublishedCache(all_refs)
revid = self.GetRevisionId(all_refs)
@ -2197,27 +2188,24 @@ class Project:
def get_submodules(gitdir, rev):
# Parse .gitmodules for submodule sub_paths and sub_urls.
sub_paths, sub_urls, sub_shallows = parse_gitmodules(gitdir, rev)
sub_paths, sub_urls = parse_gitmodules(gitdir, rev)
if not sub_paths:
return []
# Run `git ls-tree` to read the SHAs of the submodule objects, which
# happen to be the revisions of the submodule repositories.
sub_revs = git_ls_tree(gitdir, rev, sub_paths)
submodules = []
for sub_path, sub_url, sub_shallow in zip(
sub_paths, sub_urls, sub_shallows
):
for sub_path, sub_url in zip(sub_paths, sub_urls):
try:
sub_rev = sub_revs[sub_path]
except KeyError:
# Ignore submodules that do not exist.
continue
submodules.append((sub_rev, sub_path, sub_url, sub_shallow))
submodules.append((sub_rev, sub_path, sub_url))
return submodules
re_path = re.compile(r"^submodule\.(.+)\.path=(.*)$")
re_url = re.compile(r"^submodule\.(.+)\.url=(.*)$")
re_shallow = re.compile(r"^submodule\.(.+)\.shallow=(.*)$")
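# For reference, these patterns match the flattened key=value lines produced
# when .gitmodules is dumped via `git config --list`, e.g. for a hypothetical
# module named "foo":
#   submodule.foo.path=third_party/foo
#   submodule.foo.url=https://example.com/foo.git
#   submodule.foo.shallow=true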
def parse_gitmodules(gitdir, rev):
cmd = ["cat-file", "blob", "%s:.gitmodules" % rev]
@ -2231,9 +2219,9 @@ class Project:
gitdir=gitdir,
)
except GitError:
return [], [], []
return [], []
if p.Wait() != 0:
return [], [], []
return [], []
gitmodules_lines = []
fd, temp_gitmodules_path = tempfile.mkstemp()
@ -2250,17 +2238,16 @@ class Project:
gitdir=gitdir,
)
if p.Wait() != 0:
return [], [], []
return [], []
gitmodules_lines = p.stdout.split("\n")
except GitError:
return [], [], []
return [], []
finally:
platform_utils.remove(temp_gitmodules_path)
names = set()
paths = {}
urls = {}
shallows = {}
for line in gitmodules_lines:
if not line:
continue
@ -2274,16 +2261,10 @@ class Project:
names.add(m.group(1))
urls[m.group(1)] = m.group(2)
continue
m = re_shallow.match(line)
if m:
names.add(m.group(1))
shallows[m.group(1)] = m.group(2)
continue
names = sorted(names)
return (
[paths.get(name, "") for name in names],
[urls.get(name, "") for name in names],
[shallows.get(name, "") for name in names],
)
def git_ls_tree(gitdir, rev, paths):
@ -2324,7 +2305,7 @@ class Project:
# If git repo does not exist yet, querying its submodules will
# mess up its states; so return here.
return result
for rev, path, url, shallow in self._GetSubmodules():
for rev, path, url in self._GetSubmodules():
name = self.manifest.GetSubprojectName(self, path)
(
relpath,
@ -2346,7 +2327,6 @@ class Project:
review=self.remote.review,
revision=self.remote.revision,
)
clone_depth = 1 if shallow.lower() == "true" else None
subproject = Project(
manifest=self.manifest,
name=name,
@ -2363,13 +2343,10 @@ class Project:
sync_s=self.sync_s,
sync_tags=self.sync_tags,
parent=self,
clone_depth=clone_depth,
is_derived=True,
)
result.append(subproject)
result.extend(subproject.GetDerivedSubprojects())
if result:
self.has_subprojects = True
return result
def EnableRepositoryExtension(self, key, value="true", version=1):
@ -2755,14 +2732,6 @@ class Project:
# field; it doesn't exist, thus abort the optimization attempt
# and do a full sync.
break
elif depth and is_sha1 and ret == 1:
# In sha1 mode, when depth is enabled, syncing the revision
# from upstream may not work because some servers only allow
# fetching named refs. Fetching a specific sha1 may result
# in an error like 'server does not allow request for
# unadvertised object'. In this case, attempt a full sync
# without depth.
break
elif ret < 0:
# Git died with a signal, exit immediately.
break
@ -2883,14 +2852,7 @@ class Project:
# We do not use curl's --retry option since it generally doesn't
# actually retry anything; for example, it will not retry on exit code 18.
cmd = [
"curl",
"--fail",
"--output",
tmpPath,
"--netrc-optional",
"--location",
]
cmd = ["curl", "--fail", "--output", tmpPath, "--netrc", "--location"]
if quiet:
cmd += ["--silent", "--show-error"]
if os.path.exists(tmpPath):
@ -3035,17 +2997,6 @@ class Project:
project=self.name,
)
def _InitSubmodules(self, quiet=True):
"""Initialize the submodules for the project."""
cmd = ["submodule", "init"]
if quiet:
cmd.append("-q")
if GitCommand(self, cmd).Wait() != 0:
raise GitError(
f"{self.name} submodule init",
project=self.name,
)
def _Rebase(self, upstream, onto=None):
cmd = ["rebase"]
if onto is not None:
@ -3464,11 +3415,6 @@ class Project:
"""
dotgit = os.path.join(self.worktree, ".git")
# If bare checkout of the submodule is stored under the subproject dir,
# migrate it.
if self.parent:
self._MigrateOldSubmoduleDir()
# If using an old layout style (a directory), migrate it.
if not platform_utils.islink(dotgit) and platform_utils.isdir(dotgit):
self._MigrateOldWorkTreeGitDir(dotgit, project=self.name)
@ -3479,76 +3425,34 @@ class Project:
self._InitGitWorktree()
self._CopyAndLinkFiles()
else:
# Remove old directory symbolic links for submodules.
if self.parent and platform_utils.islink(dotgit):
platform_utils.remove(dotgit)
init_dotgit = True
if not init_dotgit:
# See if the project has changed.
self._removeBadGitDirLink(dotgit)
if os.path.realpath(self.gitdir) != os.path.realpath(dotgit):
platform_utils.remove(dotgit)
if init_dotgit or not os.path.exists(dotgit):
self._createDotGit(dotgit)
os.makedirs(self.worktree, exist_ok=True)
platform_utils.symlink(
os.path.relpath(self.gitdir, self.worktree), dotgit
)
if init_dotgit:
_lwrite(
os.path.join(self.gitdir, HEAD), f"{self.GetRevisionId()}\n"
os.path.join(dotgit, HEAD), "%s\n" % self.GetRevisionId()
)
# Finish checking out the worktree.
cmd = ["read-tree", "--reset", "-u", "-v", HEAD]
try:
if GitCommand(self, cmd).Wait() != 0:
raise GitError(
"Cannot initialize work tree for " + self.name,
project=self.name,
)
except Exception as e:
# Something went wrong with read-tree (perhaps fetching
# missing blobs), so remove .git to avoid half initialized
# workspace from which repo can't recover on its own.
platform_utils.remove(dotgit)
raise e
if GitCommand(self, cmd).Wait() != 0:
raise GitError(
"Cannot initialize work tree for " + self.name,
project=self.name,
)
if submodules:
self._SyncSubmodules(quiet=True)
self._CopyAndLinkFiles()
def _createDotGit(self, dotgit):
"""Initialize .git path.
For submodule projects, create a '.git' file using the gitfile
mechanism, and for the rest, create a symbolic link.
"""
os.makedirs(self.worktree, exist_ok=True)
if self.parent:
_lwrite(
dotgit,
f"gitdir: {os.path.relpath(self.gitdir, self.worktree)}\n",
)
else:
platform_utils.symlink(
os.path.relpath(self.gitdir, self.worktree), dotgit
)
def _removeBadGitDirLink(self, dotgit):
"""Verify .git is initialized correctly, otherwise delete it."""
if self.parent and os.path.isfile(dotgit):
with open(dotgit) as fp:
setting = fp.read()
if not setting.startswith("gitdir:"):
raise GitError(
f"'.git' in {self.worktree} must start with 'gitdir:'",
project=self.name,
)
gitdir = setting.split(":", 1)[1].strip()
dotgit_path = os.path.normpath(os.path.join(self.worktree, gitdir))
else:
dotgit_path = os.path.realpath(dotgit)
if os.path.realpath(self.gitdir) != dotgit_path:
platform_utils.remove(dotgit)
@classmethod
def _MigrateOldWorkTreeGitDir(cls, dotgit, project=None):
"""Migrate the old worktree .git/ dir style to a symlink.
@ -3637,28 +3541,6 @@ class Project:
dotgit,
)
def _MigrateOldSubmoduleDir(self):
"""Move the old bare checkout in 'subprojects' to 'modules'
as bare checkouts of submodules now live in the 'modules' dir.
"""
subprojects = os.path.join(self.parent.gitdir, "subprojects")
if not platform_utils.isdir(subprojects):
return
modules = os.path.join(self.parent.gitdir, "modules")
old = self.gitdir
new = os.path.splitext(self.gitdir.replace(subprojects, modules))[0]
if all(map(platform_utils.isdir, [old, new])):
platform_utils.rmtree(old, ignore_errors=True)
else:
os.makedirs(modules, exist_ok=True)
platform_utils.rename(old, new)
self.gitdir = new
self.UpdatePaths(self.relpath, self.worktree, self.gitdir, self.objdir)
if platform_utils.isdir(subprojects) and not os.listdir(subprojects):
platform_utils.rmtree(subprojects, ignore_errors=True)
def _get_symlink_error_message(self):
if platform_utils.isWindows():
return (

View File

@ -16,8 +16,3 @@
line-length = 80
# NB: Keep in sync with tox.ini.
target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311'] #, 'py312'
[tool.pytest.ini_options]
markers = """
skip_cq: Skip tests in the CQ. Should be rarely used!
"""

View File

@ -16,7 +16,6 @@
import os
import re
import shlex
import subprocess
import sys
@ -36,7 +35,12 @@ KEYID_ECC = "E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39"
def cmdstr(cmd):
"""Get a nicely quoted shell command."""
return " ".join(shlex.quote(x) for x in cmd)
ret = []
for arg in cmd:
if not re.match(r"^[a-zA-Z0-9/_.=-]+$", arg):
arg = f'"{arg}"'
ret.append(arg)
return " ".join(ret)
def run(opts, cmd, check=True, **kwargs):

repo
View File

@ -27,7 +27,6 @@ import platform
import shlex
import subprocess
import sys
from typing import NamedTuple
# These should never be newer than the main.py version since this needs to be a
@ -57,14 +56,9 @@ class Trace:
trace = Trace()
def cmdstr(cmd):
"""Get a nicely quoted shell command."""
return " ".join(shlex.quote(x) for x in cmd)
def exec_command(cmd):
"""Execute |cmd| or return None on failure."""
trace.print(":", cmdstr(cmd))
trace.print(":", " ".join(cmd))
try:
if platform.system() == "Windows":
ret = subprocess.call(cmd)
@ -130,7 +124,7 @@ if not REPO_REV:
BUG_URL = "https://issues.gerritcodereview.com/issues/new?component=1370071"
# increment this whenever we make important changes to this script
VERSION = (2, 54)
VERSION = (2, 50)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (2, 3)
@ -223,6 +217,7 @@ S_manifests = "manifests" # special manifest repository
REPO_MAIN = S_repo + "/main.py" # main script
import collections
import errno
import json
import optparse
@ -487,6 +482,16 @@ def InitParser(parser):
return parser
# This is a poor replacement for subprocess.run until we require Python 3.6+.
RunResult = collections.namedtuple(
"RunResult", ("returncode", "stdout", "stderr")
)
class RunError(Exception):
"""Error when running a command failed."""
def run_command(cmd, **kwargs):
"""Run |cmd| and return its output."""
check = kwargs.pop("check", False)
@ -511,7 +516,7 @@ def run_command(cmd, **kwargs):
# Run & package the results.
proc = subprocess.Popen(cmd, **kwargs)
(stdout, stderr) = proc.communicate(input=cmd_input)
dbg = ": " + cmdstr(cmd)
dbg = ": " + " ".join(cmd)
if cmd_input is not None:
dbg += " 0<|"
if stdout == subprocess.PIPE:
@ -521,9 +526,7 @@ def run_command(cmd, **kwargs):
elif stderr == subprocess.STDOUT:
dbg += " 2>&1"
trace.print(dbg)
ret = subprocess.CompletedProcess(
cmd, proc.returncode, decode(stdout), decode(stderr)
)
ret = RunResult(proc.returncode, decode(stdout), decode(stderr))
# If things failed, print useful debugging output.
if check and ret.returncode:
@ -544,13 +547,13 @@ def run_command(cmd, **kwargs):
_print_output("stdout", ret.stdout)
_print_output("stderr", ret.stderr)
# This will raise subprocess.CalledProcessError for us.
ret.check_returncode()
raise RunError(ret)
return ret
class CloneFailure(Exception):
"""Indicate the remote clone of repo itself failed."""
@ -669,20 +672,15 @@ def run_git(*args, **kwargs):
file=sys.stderr,
)
sys.exit(1)
except subprocess.CalledProcessError:
except RunError:
raise CloneFailure()
class GitVersion(NamedTuple):
"""The git version info broken down into components for easy analysis.
Similar to Python's sys.version_info.
"""
major: int
minor: int
micro: int
full: int
# The git version info broken down into components for easy analysis.
# Similar to Python's sys.version_info.
GitVersion = collections.namedtuple(
"GitVersion", ("major", "minor", "micro", "full")
)
def ParseGitVersion(ver_str=None):
@ -848,11 +846,10 @@ def _GetRepoConfig(name):
return None
else:
print(
f"repo: error: git {cmdstr(cmd)} failed:\n{ret.stderr}",
f"repo: error: git {' '.join(cmd)} failed:\n{ret.stderr}",
file=sys.stderr,
)
# This will raise subprocess.CalledProcessError for us.
ret.check_returncode()
raise RunError()
def _InitHttp():

View File

@ -15,57 +15,16 @@
"""Wrapper to run linters and pytest with the right settings."""
import functools
import os
import subprocess
import sys
from typing import List
import pytest
ROOT_DIR = os.path.dirname(os.path.realpath(__file__))
@functools.lru_cache()
def is_ci() -> bool:
"""Whether we're running in our CI system."""
return os.getenv("LUCI_CQ") == "yes"
def run_pytest(argv: List[str]) -> int:
"""Returns the exit code from pytest."""
if is_ci():
argv = ["-m", "not skip_cq"] + argv
return subprocess.run(
[sys.executable, "-m", "pytest"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
def run_pytest_py38(argv: List[str]) -> int:
"""Returns the exit code from pytest under Python 3.8."""
if is_ci():
argv = ["-m", "not skip_cq"] + argv
try:
return subprocess.run(
[
"vpython3",
"-vpython-spec",
"run_tests.vpython3.8",
"-m",
"pytest",
]
+ argv,
check=False,
cwd=ROOT_DIR,
).returncode
except FileNotFoundError:
# Skip if the user doesn't have vpython from depot_tools.
return 0
def run_black():
"""Returns the exit code from black."""
# Black by default only matches .py files. We have to list standalone
@ -79,40 +38,32 @@ def run_black():
return subprocess.run(
[sys.executable, "-m", "black", "--check", ROOT_DIR] + extra_programs,
check=False,
cwd=ROOT_DIR,
).returncode
def run_flake8():
"""Returns the exit code from flake8."""
return subprocess.run(
[sys.executable, "-m", "flake8", ROOT_DIR],
check=False,
cwd=ROOT_DIR,
[sys.executable, "-m", "flake8", ROOT_DIR], check=False
).returncode
def run_isort():
"""Returns the exit code from isort."""
return subprocess.run(
[sys.executable, "-m", "isort", "--check", ROOT_DIR],
check=False,
cwd=ROOT_DIR,
[sys.executable, "-m", "isort", "--check", ROOT_DIR], check=False
).returncode
def main(argv):
"""The main entry."""
checks = (
functools.partial(run_pytest, argv),
functools.partial(run_pytest_py38, argv),
lambda: pytest.main(argv),
run_black,
run_flake8,
run_isort,
)
# Run all the tests all the time to get full feedback. Don't exit on the
# first error as that makes it more difficult to iterate in the CQ.
return 1 if sum(c() for c in checks) else 0
return 0 if all(not c() for c in checks) else 1
if __name__ == "__main__":

View File

@ -5,92 +5,97 @@
# List of available wheels:
# https://chromium.googlesource.com/infra/infra/+/main/infra/tools/dockerbuild/wheels.md
python_version: "3.11"
python_version: "3.8"
wheel: <
name: "infra/python/wheels/pytest-py3"
version: "version:8.3.4"
version: "version:6.2.2"
>
# Required by pytest==8.3.4
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/py-py2_py3"
version: "version:1.11.0"
version: "version:1.10.0"
>
# Required by pytest==8.3.4
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/iniconfig-py3"
version: "version:1.1.1"
>
# Required by pytest==8.3.4
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/packaging-py3"
version: "version:23.0"
>
# Required by pytest==8.3.4
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/pluggy-py3"
version: "version:1.5.0"
version: "version:0.13.1"
>
# Required by pytest==8.3.4
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/toml-py3"
version: "version:0.10.1"
>
# Required by pytest==8.3.4
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/pyparsing-py3"
version: "version:3.0.7"
>
# Required by pytest==8.3.4
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/attrs-py2_py3"
version: "version:21.4.0"
>
# NB: Keep in sync with constraints.txt.
# Required by packaging==16.8
wheel: <
name: "infra/python/wheels/black-py3"
version: "version:25.1.0"
name: "infra/python/wheels/six-py2_py3"
version: "version:1.16.0"
>
# Required by black==25.1.0
wheel: <
name: "infra/python/wheels/black-py3"
version: "version:23.1.0"
>
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/mypy-extensions-py3"
version: "version:0.4.3"
>
# Required by black==25.1.0
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/tomli-py3"
version: "version:2.0.1"
>
# Required by black==25.1.0
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/platformdirs-py3"
version: "version:2.5.2"
>
# Required by black==25.1.0
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/pathspec-py3"
version: "version:0.9.0"
>
# Required by black==25.1.0
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/typing-extensions-py3"
version: "version:4.3.0"
>
# Required by black==25.1.0
# Required by black==23.1.0
wheel: <
name: "infra/python/wheels/click-py3"
version: "version:8.0.3"

View File

@ -1,67 +0,0 @@
# This is a vpython "spec" file.
#
# Read more about `vpython` and how to modify this file here:
# https://chromium.googlesource.com/infra/infra/+/main/doc/users/vpython.md
# List of available wheels:
# https://chromium.googlesource.com/infra/infra/+/main/infra/tools/dockerbuild/wheels.md
python_version: "3.8"
wheel: <
name: "infra/python/wheels/pytest-py3"
version: "version:8.3.4"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/py-py2_py3"
version: "version:1.11.0"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/iniconfig-py3"
version: "version:1.1.1"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/packaging-py3"
version: "version:23.0"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/pluggy-py3"
version: "version:1.5.0"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/toml-py3"
version: "version:0.10.1"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/tomli-py3"
version: "version:2.1.0"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/pyparsing-py3"
version: "version:3.0.7"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/attrs-py2_py3"
version: "version:21.4.0"
>
# Required by pytest==8.3.4
wheel: <
name: "infra/python/wheels/exceptiongroup-py3"
version: "version:1.1.2"
>

View File

@ -233,9 +233,9 @@ synced and their revisions won't be found.
)
self.printRevision = self.out.nofmt_printer("revision", fg="yellow")
else:
self.printProject = self.printAdded = self.printRemoved = (
self.printRevision
) = self.printText
self.printProject = (
self.printAdded
) = self.printRemoved = self.printRevision = self.printText
manifest1 = RepoClient(self.repodir)
manifest1.Override(args[0], load_local_manifests=False)

View File

@ -13,18 +13,16 @@
# limitations under the License.
import os
from typing import List, Set
from typing import Set
from command import Command
from git_command import GitCommand
import platform_utils
from progress import Progress
from project import Project
class Gc(Command):
COMMON = True
helpSummary = "Cleaning up internal repo and Git state."
helpSummary = "Cleaning up internal repo state."
helpUsage = """
%prog
"""
@ -45,13 +43,6 @@ class Gc(Command):
action="store_true",
help="answer yes to all safe prompts",
)
p.add_option(
"--repack",
default=False,
action="store_true",
help="repack all projects that use partial clone with "
"filter=blob:none",
)
def _find_git_to_delete(
self, to_keep: Set[str], start_dir: str
@ -73,7 +64,10 @@ class Gc(Command):
return to_delete
def delete_unused_projects(self, projects: List[Project], opt):
def Execute(self, opt, args):
projects = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
print(f"Scanning filesystem under {self.repodir}...")
project_paths = set()
@ -96,11 +90,11 @@ class Gc(Command):
if not to_delete:
print("Nothing to clean up.")
return 0
return
print("Identified the following projects are no longer used:")
print("\n".join(to_delete))
print("")
print("\n")
if not opt.yes:
print(
"If you proceed, any local commits in those projects will be "
@ -131,164 +125,3 @@ class Gc(Command):
platform_utils.rmtree(tmp_path)
pm.update(msg=path)
pm.end()
return 0
def _generate_promisor_files(self, pack_dir: str):
"""Generates promisor files for all pack files in the given directory.
Promisor files are empty files with the same name as the corresponding
pack file but with the ".promisor" extension. They are used by Git.
"""
for root, _, files in platform_utils.walk(pack_dir):
for file in files:
if not file.endswith(".pack"):
continue
with open(os.path.join(root, f"{file[:-4]}promisor"), "w"):
pass
def repack_projects(self, projects: List[Project], opt):
repack_projects = []
# Find all projects eligible for repacking:
# - can't be shared
# - have a specific fetch filter
for project in projects:
if project.config.GetBoolean("extensions.preciousObjects"):
continue
if not project.clone_depth:
continue
if project.manifest.CloneFilterForDepth != "blob:none":
continue
repack_projects.append(project)
if opt.dryrun:
print(f"Would have repacked {len(repack_projects)} projects.")
return 0
pm = Progress(
"Repacking (this will take a while)",
len(repack_projects),
delay=False,
quiet=opt.quiet,
show_elapsed=True,
elide=True,
)
for project in repack_projects:
pm.update(msg=f"{project.name}")
pack_dir = os.path.join(project.gitdir, "tmp_repo_repack")
if os.path.isdir(pack_dir):
platform_utils.rmtree(pack_dir)
os.mkdir(pack_dir)
# Prepare workspace for repacking - remove all unreachable refs and
# their objects.
GitCommand(
project,
["reflog", "expire", "--expire-unreachable=all"],
verify_command=True,
).Wait()
pm.update(msg=f"{project.name} | gc", inc=0)
GitCommand(
project,
["gc"],
verify_command=True,
).Wait()
# Get all objects that are reachable from the remote, and pack them.
pm.update(msg=f"{project.name} | generating list of objects", inc=0)
remote_objects_cmd = GitCommand(
project,
[
"rev-list",
"--objects",
f"--remotes={project.remote.name}",
"--filter=blob:none",
"--tags",
],
capture_stdout=True,
verify_command=True,
)
# Get all local objects and pack them.
local_head_objects_cmd = GitCommand(
project,
["rev-list", "--objects", "HEAD^{tree}"],
capture_stdout=True,
verify_command=True,
)
local_objects_cmd = GitCommand(
project,
[
"rev-list",
"--objects",
"--all",
"--reflog",
"--indexed-objects",
"--not",
f"--remotes={project.remote.name}",
"--tags",
],
capture_stdout=True,
verify_command=True,
)
remote_objects_cmd.Wait()
pm.update(msg=f"{project.name} | remote repack", inc=0)
GitCommand(
project,
["pack-objects", os.path.join(pack_dir, "pack")],
input=remote_objects_cmd.stdout,
capture_stderr=True,
capture_stdout=True,
verify_command=True,
).Wait()
# create promisor file for each pack file
self._generate_promisor_files(pack_dir)
local_head_objects_cmd.Wait()
local_objects_cmd.Wait()
pm.update(msg=f"{project.name} | local repack", inc=0)
GitCommand(
project,
["pack-objects", os.path.join(pack_dir, "pack")],
input=local_head_objects_cmd.stdout + local_objects_cmd.stdout,
capture_stderr=True,
capture_stdout=True,
verify_command=True,
).Wait()
# Swap the old pack directory with the new one.
platform_utils.rename(
os.path.join(project.objdir, "objects", "pack"),
os.path.join(project.objdir, "objects", "pack_old"),
)
platform_utils.rename(
pack_dir,
os.path.join(project.objdir, "objects", "pack"),
)
platform_utils.rmtree(
os.path.join(project.objdir, "objects", "pack_old")
)
pm.end()
return 0
def Execute(self, opt, args):
projects: List[Project] = self.GetProjects(
args, all_manifests=not opt.this_manifest_only
)
ret = self.delete_unused_projects(projects, opt)
if ret != 0:
return ret
if not opt.repack:
return
return self.repack_projects(projects, opt)

View File

@ -350,8 +350,6 @@ later is required to fix a server side protocol bug.
# value later on.
PARALLEL_JOBS = 0
_JOBS_WARN_THRESHOLD = 100
def _Options(self, p, show_smart=True):
p.add_option(
"--jobs-network",
@ -1060,8 +1058,6 @@ later is required to fix a server side protocol bug.
verbose=verbose,
)
success = syncbuf.Finish()
except KeyboardInterrupt:
logger.error("Keyboard interrupt while processing %s", project.name)
except GitError as e:
logger.error(
"error.GitError: Cannot checkout %s: %s", project.name, e
@ -1504,7 +1500,6 @@ later is required to fix a server side protocol bug.
if manifest_server.startswith("persistent-"):
manifest_server = manifest_server[len("persistent-") :]
# Changes in behavior should update docs/smart-sync.md accordingly.
try:
server = xmlrpc.client.Server(manifest_server, transport=transport)
if opt.smart_sync:
@ -1730,24 +1725,6 @@ later is required to fix a server side protocol bug.
opt.jobs_network = min(opt.jobs_network, jobs_soft_limit)
opt.jobs_checkout = min(opt.jobs_checkout, jobs_soft_limit)
# Warn once if effective job counts seem excessively high.
# Prioritize --jobs, then --jobs-network, then --jobs-checkout.
job_options_to_check = (
("--jobs", opt.jobs),
("--jobs-network", opt.jobs_network),
("--jobs-checkout", opt.jobs_checkout),
)
for name, value in job_options_to_check:
if value > self._JOBS_WARN_THRESHOLD:
logger.warning(
"High job count (%d > %d) specified for %s; this may "
"lead to excessive resource usage or diminishing returns.",
value,
self._JOBS_WARN_THRESHOLD,
name,
)
break
def Execute(self, opt, args):
errors = []
try:
@ -2019,8 +1996,6 @@ def _PostRepoFetch(rp, repo_verify=True, verbose=False):
# We also have to make sure this will switch to an older commit if
# that's the latest tag in order to support release rollback.
try:
# Refresh index since reset --keep won't do it.
rp.work_git.update_index("-q", "--refresh")
rp.work_git.reset("--keep", new_rev)
except GitError as e:
raise RepoUnhandledExceptionError(e)

View File

@ -21,8 +21,6 @@ import subprocess
import unittest
from unittest import mock
import pytest
import git_command
import wrapper
@ -265,7 +263,6 @@ class UserAgentUnitTest(unittest.TestCase):
m = re.match(r"^[^ ]+$", os_name)
self.assertIsNotNone(m)
@pytest.mark.skip_cq("TODO(b/266734831): Find out why this fails in CQ")
def test_smoke_repo(self):
"""Make sure repo UA returns something useful."""
ua = git_command.user_agent.repo
@ -274,7 +271,6 @@ class UserAgentUnitTest(unittest.TestCase):
m = re.match(r"^git-repo/[^ ]+ ([^ ]+) git/[^ ]+ Python/[0-9.]+", ua)
self.assertIsNotNone(m)
@pytest.mark.skip_cq("TODO(b/266734831): Find out why this fails in CQ")
def test_smoke_git(self):
"""Make sure git UA returns something useful."""
ua = git_command.user_agent.git

View File

@ -21,7 +21,6 @@ import tempfile
import unittest
from unittest import mock
import pytest
from test_manifest_xml import sort_attributes
import git_superproject
@ -146,7 +145,6 @@ class SuperprojectTestCase(unittest.TestCase):
)
self.assertIsNone(manifest.superproject)
@pytest.mark.skip_cq("TODO(b/266734831): Find out why this takes 8m+ in CQ")
def test_superproject_get_superproject_invalid_url(self):
"""Test with an invalid url."""
manifest = self.getXmlManifest(
@ -170,7 +168,6 @@ class SuperprojectTestCase(unittest.TestCase):
self.assertFalse(sync_result.success)
self.assertTrue(sync_result.fatal)
@pytest.mark.skip_cq("TODO(b/266734831): Find out why this takes 8m+ in CQ")
def test_superproject_get_superproject_invalid_branch(self):
"""Test with an invalid branch."""
manifest = self.getXmlManifest(

View File

@ -51,7 +51,7 @@ INVALID_FS_PATHS = (
"foo~",
"blah/foo~",
# Block Unicode characters that get normalized out by filesystems.
"foo\u200cbar",
"foo\u200Cbar",
# Block newlines.
"f\n/bar",
"f\r/bar",

View File

@ -17,7 +17,6 @@
import io
import os
import re
import subprocess
import sys
import tempfile
import unittest
@ -126,7 +125,7 @@ class RunCommand(RepoWrapperTestCase):
self.wrapper.run_command(["true"], check=False)
self.wrapper.run_command(["true"], check=True)
self.wrapper.run_command(["false"], check=False)
with self.assertRaises(subprocess.CalledProcessError):
with self.assertRaises(self.wrapper.RunError):
self.wrapper.run_command(["false"], check=True)
@ -359,8 +358,8 @@ class VerifyRev(RepoWrapperTestCase):
def test_verify_passes(self):
"""Check when we have a valid signed tag."""
desc_result = subprocess.CompletedProcess([], 0, "v1.0\n", "")
gpg_result = subprocess.CompletedProcess([], 0, "", "")
desc_result = self.wrapper.RunResult(0, "v1.0\n", "")
gpg_result = self.wrapper.RunResult(0, "", "")
with mock.patch.object(
self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
):
@ -371,8 +370,8 @@ class VerifyRev(RepoWrapperTestCase):
def test_unsigned_commit(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = subprocess.CompletedProcess([], 0, "v1.0-10-g1234\n", "")
gpg_result = subprocess.CompletedProcess([], 0, "", "")
desc_result = self.wrapper.RunResult(0, "v1.0-10-g1234\n", "")
gpg_result = self.wrapper.RunResult(0, "", "")
with mock.patch.object(
self.wrapper, "run_git", side_effect=(desc_result, gpg_result)
):
@ -383,7 +382,7 @@ class VerifyRev(RepoWrapperTestCase):
def test_verify_fails(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = subprocess.CompletedProcess([], 0, "v1.0-10-g1234\n", "")
desc_result = self.wrapper.RunResult(0, "v1.0-10-g1234\n", "")
gpg_result = Exception
with mock.patch.object(
self.wrapper, "run_git", side_effect=(desc_result, gpg_result)