mirror of
https://gerrit.googlesource.com/git-repo
synced 2025-06-26 20:17:52 +00:00
Compare commits

32 commits:

024df06ec1, 45809e51ca, 331c5dd3e7, e848e9f72c, 1544afe460, 3b8f9535c7,
8f4f98582e, 8bc5000423, 6a7f73bb9a, 23d063bdcd, ce0ed799b6, 2844a5f3cc,
47944bbe2e, 83c66ec661, 87058c6ca5, b5644160b7, aadd12cb08, b8fd19215f,
7a1f1f70f0, c993c5068e, c3d7c8536c, 880c621dc6, da6ae1da8b, 5771897459,
56a5a01c65, e9cb391117, 25d6c7cc10, f19b310f15, 712e62b9b0, daf2ad38eb,
b861511db9, e914ec293a
.github/workflows/close-pull-request.yml (vendored, new file, +22 lines)

@ -0,0 +1,22 @@

```yaml
# GitHub actions workflow.
# https://docs.github.com/en/actions/learn-github-actions/workflow-syntax-for-github-actions

# https://github.com/superbrothers/close-pull-request
name: Close Pull Request

on:
  pull_request_target:
    types: [opened]

jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - uses: superbrothers/close-pull-request@v3
        with:
          comment: >
            Thanks for your contribution!
            Unfortunately, we don't use GitHub pull requests to manage code
            contributions to this repository.
            Instead, please see [SUBMITTING_PATCHES.md](../blob/HEAD/SUBMITTING_PATCHES.md)
            which provides full instructions on how to get involved.
```
.github/workflows/test-ci.yml (vendored, 5 lines changed)

```diff
@ -13,8 +13,9 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        os: [ubuntu-latest, macos-latest, windows-latest]
-        python-version: ['3.6', '3.7', '3.8', '3.9', '3.10']
+        # ubuntu-20.04 is the last version that supports python 3.6
+        os: [ubuntu-20.04, macos-latest, windows-latest]
+        python-version: ['3.6', '3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
     runs-on: ${{ matrix.os }}

     steps:
```
````diff
@ -1,47 +1,93 @@
 # Supported Python Versions

-With Python 2.7 officially going EOL on [01 Jan 2020](https://pythonclock.org/),
-we need a support plan for the repo project itself.
-Inevitably, there will be a long tail of users who still want to use Python 2 on
-their old LTS/corp systems and have little power to change the system.
 This documents the current supported Python versions, and tries to provide
 guidance for when we decide to drop support for older versions.

 ## Summary

-* Python 3.6 (released Dec 2016) is required by default starting with repo-2.x.
-* Older versions of Python (e.g. v2.7) may use the legacy feature-frozen branch
-  based on repo-1.x.
+* Python 3.6 (released Dec 2016) is required starting with repo-2.0.
+* Older versions of Python (e.g. v2.7) may use old releases via the repo-1.x
+  branch, but no support is provided.

-## Overview
-
-We provide a branch for Python 2 users that is feature-frozen.
-Bugfixes may be added on a best-effort basis or from the community, but largely
-no new features will be added, nor is support guaranteed.
-
-Users can select this during `repo init` time via the [repo launcher].
-Otherwise the default branches (e.g. stable & main) will be used which will
-require Python 3.
-
-This means the [repo launcher] needs to support both Python 2 & Python 3, but
-since it doesn't import any other repo code, this shouldn't be too problematic.
-
-The main branch will require Python 3.6 at a minimum.
-If the system has an older version of Python 3, then users will have to select
-the legacy Python 2 branch instead.
-
-### repo hooks
+## repo hooks

 Projects that use [repo hooks] run on independent schedules.
-They might migrate to Python 3 earlier or later than us.
-To support them, we'll probe the shebang of the hook script and if we find an
-interpreter in there that indicates a different version than repo is currently
-running under, we'll attempt to reexec ourselves under that.
-
-For example, a hook with a header like `#!/usr/bin/python2` will have repo
-execute `/usr/bin/python2` to execute the hook code specifically if repo is
-currently running Python 3.
+Since it's not possible to detect what version of Python the hooks were written
+or tested against, we always import & exec them with the active Python version.
+
+If the user's Python is too new for the [repo hooks], then it is up to the hooks
+maintainer to update.

 For more details, consult the [repo hooks] documentation.

+## Repo launcher
+
+The [repo launcher] is an independent script that can support older versions of
+Python without holding back the rest of the codebase.
+If it detects the current version of Python is too old, it will try to reexec
+via a newer version of Python via standard `pythonX.Y` interpreter names.
+
+However, this is provided as a nicety when it is not onerous, and there is no
+official support for older versions of Python than the rest of the codebase.
+
+If your default python interpreters are too old to run the launcher even though
+you have newer versions installed, your choices are:
+
+* Modify the [repo launcher]'s shebang to suit your environment.
+* Download an older version of the [repo launcher] and don't upgrade it.
+  Be aware that there is no guarantee old repo launchers will work with
+  current versions of repo.  Bug reports using old launchers will not be
+  accepted.
+
+## When to drop support
+
+So far, Python 3.6 has provided most of the interesting features that we want
+(e.g. typing & f-strings), and there haven't been features in newer versions
+that are critical to us.
+
+That said, let's assume we need functionality that only exists in Python 3.7.
+How do we decide when it's acceptable to drop Python 3.6?
+
+1. Review the [Project References](./release-process.md#project-references) to
+   see what major distros are using the previous version of Python, and when
+   they go EOL.  Generally we care about Ubuntu LTS & current/previous Debian
+   stable versions.
+   * If they're all EOL already, then go for it, drop support.
+   * If they aren't EOL, start a thread on [repo-discuss] to see how the user
+     base feels about the proposal.
+1. Update the "soft" versions in the codebase.  This will start warning users
+   that the older version is deprecated.
+   * Update [repo](/repo) if the launcher needs updating.
+     This only helps with people who download newer launchers.
+   * Update [main.py](/main.py) for the main codebase.
+     This warns for everyone regardless of [repo launcher] version.
+   * Update [requirements.json](/requirements.json).
+     This allows [repo launcher] to display warnings/errors without having
+     to execute the new codebase.  This helps in case of syntax or module
+     changes where older versions won't even be able to import the new code.
+1. After some grace period (ideally at least 2 quarters after the first release
+   with the updated soft requirements), update the "hard" versions, and then
+   start using the new functionality.
+
+## Python 2.7 & 3.0-3.5
+
+> **There is no support for these versions.**
+> **Do not file bugs if you are using old Python versions.**
+> **Any such reports will be marked invalid and ignored.**
+> **Upgrade your distro and/or runtime instead.**
+
+Fetch an old version of the [repo launcher]:
+
+```sh
+$ curl https://storage.googleapis.com/git-repo-downloads/repo-2.32 > ~/.bin/repo-2.32
+$ chmod a+rx ~/.bin/repo-2.32
+```
+
+Then initialize an old version of repo:
+
+```sh
+$ repo-2.32 init --repo-rev=repo-1 ...
+```
+

 [repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss
 [repo hooks]: ./repo-hooks.md
 [repo launcher]: ../repo
````
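The launcher re-exec behavior described above (probe standard `pythonX.Y` interpreter names when the running interpreter is too old) can be sketched roughly as follows. This is an illustrative sketch, not repo's actual implementation; the helper names and the probed version range are assumptions:

```python
import shutil
import sys

MIN_VERSION = (3, 6)


def candidate_names(max_minor=12):
    """Interpreter names to probe, newest first (python3.12 ... python3.6)."""
    return [f"python3.{m}" for m in range(max_minor, MIN_VERSION[1] - 1, -1)]


def needs_reexec(version_info=sys.version_info):
    """True when the running interpreter is older than the minimum."""
    return tuple(version_info[:2]) < MIN_VERSION


def find_newer_python():
    """Return the path of the first newer interpreter on PATH, or None."""
    for name in candidate_names():
        path = shutil.which(name)
        if path:
            return path
    return None
```

A real launcher would then `os.execv()` the found interpreter with the original `sys.argv`.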
git_command.py (255 lines changed)

```diff
@ -13,7 +13,9 @@
 # limitations under the License.

 import functools
+import json
 import os
+import re
 import subprocess
 import sys
 from typing import Any, Optional
@ -21,7 +23,9 @@ from typing import Any, Optional
 from error import GitError
 from error import RepoExitError
 from git_refs import HEAD
+from git_trace2_event_log_base import BaseEventLog
 import platform_utils
+from repo_logging import RepoLogger
 from repo_trace import IsTrace
 from repo_trace import REPO_TRACE
 from repo_trace import Trace
@ -45,11 +49,14 @@ GIT_DIR = "GIT_DIR"
 LAST_GITDIR = None
 LAST_CWD = None
 DEFAULT_GIT_FAIL_MESSAGE = "git command failure"
+ERROR_EVENT_LOGGING_PREFIX = "RepoGitCommandError"
 # Common line length limit
 GIT_ERROR_STDOUT_LINES = 1
-GIT_ERROR_STDERR_LINES = 1
+GIT_ERROR_STDERR_LINES = 10
 INVALID_GIT_EXIT_CODE = 126

+logger = RepoLogger(__file__)
+

 class _GitCall(object):
     @functools.lru_cache(maxsize=None)
@ -57,7 +64,7 @@ class _GitCall(object):
         ret = Wrapper().ParseGitVersion()
         if ret is None:
             msg = "fatal: unable to detect git version"
-            print(msg, file=sys.stderr)
+            logger.error(msg)
             raise GitRequireError(msg)
         return ret

@ -67,7 +74,7 @@ class _GitCall(object):
         def fun(*cmdv):
             command = [name]
             command.extend(cmdv)
-            return GitCommand(None, command).Wait() == 0
+            return GitCommand(None, command, add_event_log=False).Wait() == 0

         return fun
```
```diff
@ -105,6 +112,42 @@ def RepoSourceVersion():
     return ver


+@functools.lru_cache(maxsize=None)
+def GetEventTargetPath():
+    """Get the 'trace2.eventtarget' path from git configuration.
+
+    Returns:
+        path: git config's 'trace2.eventtarget' path if it exists, or None
+    """
+    path = None
+    cmd = ["config", "--get", "trace2.eventtarget"]
+    # TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
+    # system git config variables.
+    p = GitCommand(
+        None,
+        cmd,
+        capture_stdout=True,
+        capture_stderr=True,
+        bare=True,
+        add_event_log=False,
+    )
+    retval = p.Wait()
+    if retval == 0:
+        # Strip trailing carriage-return in path.
+        path = p.stdout.rstrip("\n")
+    elif retval != 1:
+        # `git config --get` is documented to produce an exit status of `1`
+        # if the requested variable is not present in the configuration.
+        # Report any other return value as an error.
+        logger.error(
+            "repo: error: 'git config --get' call failed with return code: "
+            "%r, stderr: %r",
+            retval,
+            p.stderr,
+        )
+    return path
+
+
 class UserAgent(object):
     """Manage User-Agent settings when talking to external services
```
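The exit-status convention this new helper relies on (`git config --get` exits 0 with the value on stdout, 1 when the key is simply absent, and anything else on a real error) can be shown standalone. `parse_git_config_result` and `get_git_config` are hypothetical helper names for illustration, not part of git_command.py:

```python
import subprocess


def parse_git_config_result(returncode, stdout, stderr):
    """Interpret `git config --get` results: 0 -> value, 1 -> absent key
    (None, not an error), anything else -> a real failure."""
    if returncode == 0:
        return stdout.rstrip("\n")
    if returncode == 1:
        return None
    raise RuntimeError(
        f"git config --get failed ({returncode}): {stderr!r}"
    )


def get_git_config(key):
    """Run `git config --get <key>` and interpret its exit status."""
    p = subprocess.run(
        ["git", "config", "--get", key],
        capture_output=True,
        text=True,
    )
    return parse_git_config_result(p.returncode, p.stdout, p.stderr)
```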
```diff
@ -174,7 +217,7 @@ def git_require(min_version, fail=False, msg=""):
         if msg:
             msg = " for " + msg
         error_msg = "fatal: git %s or later required%s" % (need, msg)
-        print(error_msg, file=sys.stderr)
+        logger.error(error_msg)
         raise GitRequireError(error_msg)
     return False

@ -247,6 +290,8 @@ class GitCommand(object):
         gitdir=None,
         objdir=None,
         verify_command=False,
+        add_event_log=True,
+        log_as_error=True,
     ):
         if project:
             if not cwd:
@ -257,6 +302,7 @@ class GitCommand(object):
         self.project = project
         self.cmdv = cmdv
         self.verify_command = verify_command
+        self.stdout, self.stderr = None, None

         # Git on Windows wants its paths only using / for reliability.
         if platform_utils.isWindows():
```
```diff
@ -276,15 +322,67 @@ class GitCommand(object):
         command = [GIT]
         if bare:
             cwd = None
-        command.append(cmdv[0])
+        command_name = cmdv[0]
+        command.append(command_name)
         # Need to use the --progress flag for fetch/clone so output will be
         # displayed as by default git only does progress output if stderr is a
         # TTY.
-        if sys.stderr.isatty() and cmdv[0] in ("fetch", "clone"):
+        if sys.stderr.isatty() and command_name in ("fetch", "clone"):
             if "--progress" not in cmdv and "--quiet" not in cmdv:
                 command.append("--progress")
         command.extend(cmdv[1:])

+        event_log = (
+            BaseEventLog(env=env, add_init_count=True)
+            if add_event_log
+            else None
+        )
+
+        try:
+            self._RunCommand(
+                command,
+                env,
+                capture_stdout=capture_stdout,
+                capture_stderr=capture_stderr,
+                merge_output=merge_output,
+                ssh_proxy=ssh_proxy,
+                cwd=cwd,
+                input=input,
+            )
+            self.VerifyCommand()
+        except GitCommandError as e:
+            if event_log is not None:
+                error_info = json.dumps(
+                    {
+                        "ErrorType": type(e).__name__,
+                        "Project": e.project,
+                        "CommandName": command_name,
+                        "Message": str(e),
+                        "ReturnCode": str(e.git_rc)
+                        if e.git_rc is not None
+                        else None,
+                        "IsError": log_as_error,
+                    }
+                )
+                event_log.ErrorEvent(
+                    f"{ERROR_EVENT_LOGGING_PREFIX}:{error_info}"
+                )
+                event_log.Write(GetEventTargetPath())
+            if isinstance(e, GitPopenCommandError):
+                raise
+
+    def _RunCommand(
+        self,
+        command,
+        env,
+        capture_stdout=False,
+        capture_stderr=False,
+        merge_output=False,
+        ssh_proxy=None,
+        cwd=None,
+        input=None,
+    ):
         # Set subprocess.PIPE for streams that need to be captured.
         stdin = subprocess.PIPE if input else None
         stdout = subprocess.PIPE if capture_stdout else None
         stderr = (
```
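The `error_info` payload assembled in the new exception handler can be reproduced in isolation; `make_error_info` is a hypothetical helper mirroring the fields above, not part of git_command.py:

```python
import json


def make_error_info(error_type, project, command_name, message, git_rc,
                    is_error=True):
    """Build the JSON blob logged as a trace2 ErrorEvent for a failed
    git command; ReturnCode is stringified, or None when unavailable."""
    return json.dumps(
        {
            "ErrorType": error_type,
            "Project": project,
            "CommandName": command_name,
            "Message": message,
            "ReturnCode": str(git_rc) if git_rc is not None else None,
            "IsError": is_error,
        }
    )
```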
```diff
@ -293,6 +391,30 @@ class GitCommand(object):
             else (subprocess.PIPE if capture_stderr else None)
         )

+        # tee_stderr acts like a tee command for stderr, in that, it captures
+        # stderr from the subprocess and streams it back to sys.stderr, while
+        # keeping a copy in-memory.
+        # This allows us to store stderr logs from the subprocess into
+        # GitCommandError.
+        # Certain git operations, such as `git push`, writes diagnostic logs,
+        # such as, progress bar for pushing, into stderr. To ensure we don't
+        # break git's UX, we need to write to sys.stderr as we read from the
+        # subprocess. Setting encoding or errors makes subprocess return
+        # io.TextIOWrapper, which is line buffered. To avoid line-buffering
+        # while tee-ing stderr, we unset these kwargs. See GitCommand._Tee
+        # for tee-ing between the streams.
+        # We tee stderr iff the caller doesn't want to capture any stream to
+        # not disrupt the existing flow.
+        # See go/tee-repo-stderr for more context.
+        tee_stderr = False
+        kwargs = {"encoding": "utf-8", "errors": "backslashreplace"}
+        if not (stdin or stdout or stderr):
+            tee_stderr = True
+            # stderr will be written back to sys.stderr even though it is
+            # piped here.
+            stderr = subprocess.PIPE
+            kwargs = {}
+
         dbg = ""
         if IsTrace():
             global LAST_CWD
```
```diff
@ -339,17 +461,16 @@ class GitCommand(object):
                 command,
                 cwd=cwd,
                 env=env,
-                encoding="utf-8",
-                errors="backslashreplace",
                 stdin=stdin,
                 stdout=stdout,
                 stderr=stderr,
+                **kwargs,
             )
         except Exception as e:
-            raise GitCommandError(
+            raise GitPopenCommandError(
                 message="%s: %s" % (command[1], e),
-                project=project.name if project else None,
-                command_args=cmdv,
+                project=self.project.name if self.project else None,
+                command_args=self.cmdv,
             )

         if ssh_proxy:
```
```diff
@ -358,12 +479,45 @@ class GitCommand(object):
         self.process = p

         try:
-            self.stdout, self.stderr = p.communicate(input=input)
+            if tee_stderr:
+                # tee_stderr streams stderr to sys.stderr while capturing
+                # a copy within self.stderr. tee_stderr is only enabled
+                # when the caller wants to pipe no stream.
+                self.stderr = self._Tee(p.stderr, sys.stderr)
+            else:
+                self.stdout, self.stderr = p.communicate(input=input)
         finally:
             if ssh_proxy:
                 ssh_proxy.remove_client(p)
         self.rc = p.wait()

+    @staticmethod
+    def _Tee(in_stream, out_stream):
+        """Writes text from in_stream to out_stream while recording in buffer.
+
+        Args:
+            in_stream: I/O stream to be read from.
+            out_stream: I/O stream to write to.
+
+        Returns:
+            A str containing everything read from the in_stream.
+        """
+        buffer = ""
+        read_size = 1024 if sys.version_info < (3, 7) else -1
+        chunk = in_stream.read1(read_size)
+        while chunk:
+            # Convert to str.
+            if not hasattr(chunk, "encode"):
+                chunk = chunk.decode("utf-8", "backslashreplace")
+
+            buffer += chunk
+            out_stream.write(chunk)
+            out_stream.flush()
+
+            chunk = in_stream.read1(read_size)
+
+        return buffer
+
     @staticmethod
     def _GetBasicEnv():
         """Return a basic env for running git under.
```
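The tee technique used by `_Tee` can be demonstrated with ordinary text streams. This `tee` function is a simplified sketch: the original reads with `read1()` from a subprocess pipe and may receive bytes, hence the decode branch:

```python
import io


def tee(in_stream, out_stream, read_size=-1):
    """Stream everything from in_stream to out_stream while keeping an
    in-memory copy, mirroring the _Tee helper above."""
    buffer = ""
    chunk = in_stream.read(read_size)
    while chunk:
        # Decode bytes chunks; str chunks pass through unchanged.
        if not hasattr(chunk, "encode"):
            chunk = chunk.decode("utf-8", "backslashreplace")
        buffer += chunk
        out_stream.write(chunk)
        out_stream.flush()
        chunk = in_stream.read(read_size)
    return buffer
```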
```diff
@ -383,16 +537,14 @@ class GitCommand(object):
             env.pop(key, None)
         return env

-    def Wait(self):
-        if not self.verify_command or self.rc == 0:
-            return self.rc
-
+    def VerifyCommand(self):
+        if self.rc == 0:
+            return None
         stdout = (
             "\n".join(self.stdout.split("\n")[:GIT_ERROR_STDOUT_LINES])
             if self.stdout
             else None
         )

         stderr = (
             "\n".join(self.stderr.split("\n")[:GIT_ERROR_STDERR_LINES])
             if self.stderr
@ -407,6 +559,11 @@ class GitCommand(object):
             git_stderr=stderr,
         )

+    def Wait(self):
+        if self.verify_command:
+            self.VerifyCommand()
+        return self.rc
+

 class GitRequireError(RepoExitError):
     """Error raised when git version is unavailable or invalid."""
```
```diff
@ -423,6 +580,29 @@ class GitCommandError(GitError):
     raised exclusively from non-zero exit codes returned from git commands.
     """

+    # Tuples with error formats and suggestions for those errors.
+    _ERROR_TO_SUGGESTION = [
+        (
+            re.compile("couldn't find remote ref .*"),
+            "Check if the provided ref exists in the remote.",
+        ),
+        (
+            re.compile("unable to access '.*': .*"),
+            (
+                "Please make sure you have the correct access rights and the "
+                "repository exists."
+            ),
+        ),
+        (
+            re.compile("'.*' does not appear to be a git repository"),
+            "Are you running this repo command outside of a repo workspace?",
+        ),
+        (
+            re.compile("not a git repository"),
+            "Are you running this repo command outside of a repo workspace?",
+        ),
+    ]
+
     def __init__(
         self,
         message: str = DEFAULT_GIT_FAIL_MESSAGE,
@ -439,13 +619,40 @@ class GitCommandError(GitError):
         self.git_stdout = git_stdout
         self.git_stderr = git_stderr

+    @property
+    @functools.lru_cache(maxsize=None)
+    def suggestion(self):
+        """Returns helpful next steps for the given stderr."""
+        if not self.git_stderr:
+            return self.git_stderr
+
+        for err, suggestion in self._ERROR_TO_SUGGESTION:
+            if err.search(self.git_stderr):
+                return suggestion
+
+        return None
+
     def __str__(self):
         args = "[]" if not self.command_args else " ".join(self.command_args)
         error_type = type(self).__name__
-        return f"""{error_type}: {self.message}
-Project: {self.project}
-Args: {args}
-Stdout:
-{self.git_stdout}
-Stderr:
-{self.git_stderr}"""
+        string = f"{error_type}: '{args}' on {self.project} failed"
+
+        if self.message != DEFAULT_GIT_FAIL_MESSAGE:
+            string += f": {self.message}"
+
+        if self.git_stdout:
+            string += f"\nstdout: {self.git_stdout}"
+
+        if self.git_stderr:
+            string += f"\nstderr: {self.git_stderr}"
+
+        if self.suggestion:
+            string += f"\nsuggestion: {self.suggestion}"
+
+        return string
+
+
+class GitPopenCommandError(GitError):
+    """
+    Error raised when subprocess.Popen fails for a GitCommand
+    """
```
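The pattern-to-suggestion lookup added above can be exercised in isolation; `suggestion_for` is a hypothetical free-function version of the `suggestion` property, using a subset of the patterns:

```python
import re

# Standalone copy of the pattern -> suggestion table shown above.
ERROR_TO_SUGGESTION = [
    (
        re.compile("couldn't find remote ref .*"),
        "Check if the provided ref exists in the remote.",
    ),
    (
        re.compile("not a git repository"),
        "Are you running this repo command outside of a repo workspace?",
    ),
]


def suggestion_for(stderr):
    """Return the first matching suggestion for a git stderr blob, or None."""
    if not stderr:
        return None
    for pattern, suggestion in ERROR_TO_SUGGESTION:
        if pattern.search(stderr):
            return suggestion
    return None
```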
```diff
@ -795,8 +795,8 @@ class SyncAnalysisState:
             to be logged.
         """
         self._config = config
-        now = datetime.datetime.utcnow()
-        self._Set("main.synctime", now.isoformat(timespec="microseconds") + "Z")
+        now = datetime.datetime.now(datetime.timezone.utc)
+        self._Set("main.synctime", now.isoformat(timespec="microseconds"))
         self._Set("main.version", "1")
         self._Set("sys.argv", sys.argv)
         for key, value in superproject_logging_data.items():
```
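The reason the hand-appended `"Z"` disappears in the replacement above: a naive `utcnow()` timestamp carries no offset, while an aware UTC datetime emits an explicit `+00:00` in its `isoformat()` output:

```python
import datetime

# Aware UTC timestamp, as in the replacement line above; isoformat()
# includes the +00:00 offset, so no manual "Z" suffix is needed.
now = datetime.datetime.now(datetime.timezone.utc)
stamp = now.isoformat(timespec="microseconds")
```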
```diff
@ -1,47 +1,9 @@
 # Copyright (C) 2020 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
 #
 #     http://www.apache.org/licenses/LICENSE-2.0
 #
 # Unless required by applicable law or agreed to in writing, software
 # distributed under the License is distributed on an "AS IS" BASIS,
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.

 """Provide event logging in the git trace2 EVENT format.

 The git trace2 EVENT format is defined at:
-https://www.kernel.org/pub/software/scm/git/docs/technical/api-trace2.html#_event_format
+https://git-scm.com/docs/api-trace2#_the_event_format_target

 Usage:

     git_trace_log = EventLog()
     git_trace_log.StartEvent()
     ...
     git_trace_log.ExitEvent()
     git_trace_log.Write()
 """


-import datetime
-import errno
-import json
-import os
-import socket
-import sys
-import tempfile
-import threading
-
-from git_command import GitCommand
+from git_command import GetEventTargetPath
 from git_command import RepoSourceVersion
+from git_trace2_event_log_base import BaseEventLog


-class EventLog(object):
+class EventLog(BaseEventLog):
     """Event log that records events that occurred during a repo invocation.

     Events are written to the log as consecutive JSON entries, one per line.
```
```diff
@ -58,318 +20,13 @@ class EventLog(object):
     https://git-scm.com/docs/api-trace2#_event_format
     """

-    def __init__(self, env=None):
-        """Initializes the event log."""
-        self._log = []
-        # Try to get session-id (sid) from environment (setup in repo launcher).
-        KEY = "GIT_TRACE2_PARENT_SID"
-        if env is None:
-            env = os.environ
+    def __init__(self, **kwargs):
+        super().__init__(repo_source_version=RepoSourceVersion(), **kwargs)

-        self.start = datetime.datetime.utcnow()
-
-        # Save both our sid component and the complete sid.
-        # We use our sid component (self._sid) as the unique filename prefix and
-        # the full sid (self._full_sid) in the log itself.
-        self._sid = "repo-%s-P%08x" % (
-            self.start.strftime("%Y%m%dT%H%M%SZ"),
-            os.getpid(),
-        )
-        parent_sid = env.get(KEY)
-        # Append our sid component to the parent sid (if it exists).
-        if parent_sid is not None:
-            self._full_sid = parent_sid + "/" + self._sid
-        else:
-            self._full_sid = self._sid
-
-        # Set/update the environment variable.
-        # Environment handling across systems is messy.
-        try:
-            env[KEY] = self._full_sid
-        except UnicodeEncodeError:
-            env[KEY] = self._full_sid.encode()
-
-        # Add a version event to front of the log.
-        self._AddVersionEvent()
-
-    @property
-    def full_sid(self):
-        return self._full_sid
-
-    def _AddVersionEvent(self):
-        """Adds a 'version' event at the beginning of current log."""
-        version_event = self._CreateEventDict("version")
-        version_event["evt"] = "2"
-        version_event["exe"] = RepoSourceVersion()
-        self._log.insert(0, version_event)
-
-    def _CreateEventDict(self, event_name):
-        """Returns a dictionary with common keys/values for git trace2 events.
-
-        Args:
-            event_name: The event name.
-
-        Returns:
-            Dictionary with the common event fields populated.
-        """
-        return {
-            "event": event_name,
-            "sid": self._full_sid,
-            "thread": threading.current_thread().name,
-            "time": datetime.datetime.utcnow().isoformat() + "Z",
-        }
-
-    def StartEvent(self):
-        """Append a 'start' event to the current log."""
-        start_event = self._CreateEventDict("start")
-        start_event["argv"] = sys.argv
-        self._log.append(start_event)
-
-    def ExitEvent(self, result):
-        """Append an 'exit' event to the current log.
-
-        Args:
-            result: Exit code of the event
-        """
-        exit_event = self._CreateEventDict("exit")
-
-        # Consider 'None' success (consistent with event_log result handling).
-        if result is None:
-            result = 0
-        exit_event["code"] = result
-        time_delta = datetime.datetime.utcnow() - self.start
-        exit_event["t_abs"] = time_delta.total_seconds()
-        self._log.append(exit_event)
-
-    def CommandEvent(self, name, subcommands):
-        """Append a 'command' event to the current log.
-
-        Args:
-            name: Name of the primary command (ex: repo, git)
-            subcommands: List of the sub-commands (ex: version, init, sync)
-        """
-        command_event = self._CreateEventDict("command")
-        command_event["name"] = name
-        command_event["subcommands"] = subcommands
-        self._log.append(command_event)
-
-    def LogConfigEvents(self, config, event_dict_name):
-        """Append a |event_dict_name| event for each config key in |config|.
-
-        Args:
-            config: Configuration dictionary.
-            event_dict_name: Name of the event dictionary for items to be logged
-                under.
-        """
-        for param, value in config.items():
-            event = self._CreateEventDict(event_dict_name)
-            event["param"] = param
-            event["value"] = value
-            self._log.append(event)
-
-    def DefParamRepoEvents(self, config):
-        """Append 'def_param' events for repo config keys to the current log.
-
-        This appends one event for each repo.* config key.
-
-        Args:
-            config: Repo configuration dictionary
-        """
-        # Only output the repo.* config parameters.
-        repo_config = {k: v for k, v in config.items() if k.startswith("repo.")}
-        self.LogConfigEvents(repo_config, "def_param")
-
-    def GetDataEventName(self, value):
-        """Returns 'data-json' if the value is an array else returns 'data'."""
-        return "data-json" if value[0] == "[" and value[-1] == "]" else "data"
-
-    def LogDataConfigEvents(self, config, prefix):
-        """Append a 'data' event for each entry in |config| to the current log.
-
-        For each keyX and valueX of the config, "key" field of the event is
-        '|prefix|/keyX' and the "value" of the "key" field is valueX.
-
-        Args:
-            config: Configuration dictionary.
-            prefix: Prefix for each key that is logged.
-        """
-        for key, value in config.items():
-            event = self._CreateEventDict(self.GetDataEventName(value))
-            event["key"] = f"{prefix}/{key}"
-            event["value"] = value
-            self._log.append(event)
-
-    def ErrorEvent(self, msg, fmt=None):
-        """Append a 'error' event to the current log."""
-        error_event = self._CreateEventDict("error")
-        if fmt is None:
-            fmt = msg
-        error_event["msg"] = f"RepoErrorEvent:{msg}"
-        error_event["fmt"] = f"RepoErrorEvent:{fmt}"
-        self._log.append(error_event)
-
-    def _GetEventTargetPath(self):
-        """Get the 'trace2.eventtarget' path from git configuration.
-
-        Returns:
-            path: git config's 'trace2.eventtarget' path if it exists, or None
-        """
-        path = None
-        cmd = ["config", "--get", "trace2.eventtarget"]
-        # TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
-        # system git config variables.
-        p = GitCommand(
-            None, cmd, capture_stdout=True, capture_stderr=True, bare=True
-        )
-        retval = p.Wait()
-        if retval == 0:
-            # Strip trailing carriage-return in path.
-            path = p.stdout.rstrip("\n")
-        elif retval != 1:
-            # `git config --get` is documented to produce an exit status of `1`
-            # if the requested variable is not present in the configuration.
-            # Report any other return value as an error.
-            print(
-                "repo: error: 'git config --get' call failed with return code: "
-                "%r, stderr: %r" % (retval, p.stderr),
-                file=sys.stderr,
-            )
-        return path
-
-    def _WriteLog(self, write_fn):
-        """Writes the log out using a provided writer function.
-
-        Generate compact JSON output for each item in the log, and write it
-        using write_fn.
-
-        Args:
-            write_fn: A function that accepts bytes and writes them to a
-                destination.
-        """
-
-        for e in self._log:
-            # Dump in compact encoding mode.
-            # See 'Compact encoding' in Python docs:
-            # https://docs.python.org/3/library/json.html#module-json
-            write_fn(
-                json.dumps(e, indent=None, separators=(",", ":")).encode(
-                    "utf-8"
-                )
-                + b"\n"
-            )
```
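The compact JSON-lines encoding produced by `_WriteLog` above can be shown standalone; `write_event_log` is a hypothetical minimal version of that method:

```python
import json


def write_event_log(events, write_fn):
    """Write each event as one compact JSON line, mirroring _WriteLog:
    no whitespace after separators, utf-8 bytes, newline-terminated."""
    for event in events:
        write_fn(
            json.dumps(event, indent=None, separators=(",", ":")).encode("utf-8")
            + b"\n"
        )
```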
def Write(self, path=None):
|
||||
"""Writes the log out to a file or socket.
|
||||
|
||||
Log is only written if 'path' or 'git config --get trace2.eventtarget'
|
||||
provide a valid path (or socket) to write logs to.
|
||||
|
||||
Logging filename format follows the git trace2 style of being a unique
|
||||
(exclusive writable) file.
|
||||
|
||||
Args:
|
||||
path: Path to where logs should be written. The path may have a
|
||||
prefix of the form "af_unix:[{stream|dgram}:]", in which case
|
||||
the path is treated as a Unix domain socket. See
|
||||
https://git-scm.com/docs/api-trace2#_enabling_a_target for
|
||||
details.
|
||||
|
||||
Returns:
|
||||
log_path: Path to the log file or socket if log is written,
|
||||
otherwise None
|
||||
"""
|
||||
log_path = None
|
||||
# If no logging path is specified, get the path from
|
||||
# 'trace2.eventtarget'.
|
||||
def Write(self, path=None, **kwargs):
|
||||
if path is None:
|
||||
path = self._GetEventTargetPath()
|
||||
return super().Write(path=path, **kwargs)
|
||||
|
||||
# If no logging path is specified, exit.
|
||||
if path is None:
|
||||
return None
|
||||
|
||||
        path_is_socket = False
        socket_type = None
        if isinstance(path, str):
            parts = path.split(":", 1)
            if parts[0] == "af_unix" and len(parts) == 2:
                path_is_socket = True
                path = parts[1]
                parts = path.split(":", 1)
                if parts[0] == "stream" and len(parts) == 2:
                    socket_type = socket.SOCK_STREAM
                    path = parts[1]
                elif parts[0] == "dgram" and len(parts) == 2:
                    socket_type = socket.SOCK_DGRAM
                    path = parts[1]
            else:
                # Get absolute path.
                path = os.path.abspath(os.path.expanduser(path))
        else:
            raise TypeError("path: str required but got %s." % type(path))

        # Git trace2 requires a directory to write log to.
        # TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
        if not (path_is_socket or os.path.isdir(path)):
            return None

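The prefix handling above can be sketched as a standalone helper (the function name is ours, not repo's):

```python
import socket


def parse_trace2_target(path):
    """Split a trace2 target into (is_socket, socket_type, bare_path)."""
    socket_type = None
    parts = path.split(":", 1)
    if parts[0] == "af_unix" and len(parts) == 2:
        path = parts[1]
        parts = path.split(":", 1)
        if parts[0] == "stream" and len(parts) == 2:
            socket_type = socket.SOCK_STREAM
            path = parts[1]
        elif parts[0] == "dgram" and len(parts) == 2:
            socket_type = socket.SOCK_DGRAM
            path = parts[1]
        return True, socket_type, path
    # No "af_unix:" prefix: treat as a plain filesystem path.
    return False, None, path
```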
        if path_is_socket:
            if socket_type == socket.SOCK_STREAM or socket_type is None:
                try:
                    with socket.socket(
                        socket.AF_UNIX, socket.SOCK_STREAM
                    ) as sock:
                        sock.connect(path)
                        self._WriteLog(sock.sendall)
                        return f"af_unix:stream:{path}"
                except OSError as err:
                    # If we tried to connect to a DGRAM socket using STREAM,
                    # ignore the attempt and continue to DGRAM below. Otherwise,
                    # issue a warning.
                    if err.errno != errno.EPROTOTYPE:
                        print(
                            f"repo: warning: git trace2 logging failed: {err}",
                            file=sys.stderr,
                        )
                        return None
            if socket_type == socket.SOCK_DGRAM or socket_type is None:
                try:
                    with socket.socket(
                        socket.AF_UNIX, socket.SOCK_DGRAM
                    ) as sock:
                        self._WriteLog(lambda bs: sock.sendto(bs, path))
                        return f"af_unix:dgram:{path}"
                except OSError as err:
                    print(
                        f"repo: warning: git trace2 logging failed: {err}",
                        file=sys.stderr,
                    )
                    return None
            # Tried to open a socket but couldn't connect (SOCK_STREAM) or
            # write (SOCK_DGRAM).
            print(
                "repo: warning: git trace2 logging failed: could not write to "
                "socket",
                file=sys.stderr,
            )
            return None

        # Path is an absolute path.
        # Use NamedTemporaryFile to generate a unique filename as required by
        # git trace2.
        try:
            with tempfile.NamedTemporaryFile(
                mode="xb", prefix=self._sid, dir=path, delete=False
            ) as f:
                # TODO(https://crbug.com/gerrit/13706): Support writing events
                # as they occur.
                self._WriteLog(f.write)
                log_path = f.name
        except FileExistsError as err:
            print(
                "repo: warning: git trace2 logging failed: %r" % err,
                file=sys.stderr,
            )
            return None
        return log_path

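mode="xb" is what makes the filename exclusive: the file is created with O_EXCL semantics, so creation fails rather than clobbering an existing log, which is why FileExistsError is the only error handled. A minimal sketch, with an illustrative sid prefix:

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as log_dir:
    # "x" = exclusive create: raises FileExistsError instead of overwriting.
    with tempfile.NamedTemporaryFile(
        mode="xb", prefix="repo-20200101T000000Z-P0000abcd", dir=log_dir, delete=False
    ) as f:
        f.write(b'{"event":"version","evt":"2"}\n')
        log_path = f.name
    with open(log_path, "rb") as f:
        written = f.read()
```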
    def _GetEventTargetPath(self):
        return GetEventTargetPath()

git_trace2_event_log_base.py (Normal file, 352 lines)
@ -0,0 +1,352 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Provide event logging in the git trace2 EVENT format.

The git trace2 EVENT format is defined at:
https://www.kernel.org/pub/software/scm/git/docs/technical/api-trace2.html#_event_format
https://git-scm.com/docs/api-trace2#_the_event_format_target

Usage:

    git_trace_log = EventLog()
    git_trace_log.StartEvent()
    ...
    git_trace_log.ExitEvent()
    git_trace_log.Write()
"""

import datetime
import errno
import json
import os
import socket
import sys
import tempfile
import threading


# BaseEventLog __init__ Counter that is consistent within the same process
p_init_count = 0

class BaseEventLog(object):
    """Event log that records events that occurred during a repo invocation.

    Events are written to the log as consecutive JSON entries, one per line.
    Entries follow the git trace2 EVENT format.

    Each entry contains the following common keys:
    - event: The event name
    - sid: session-id - Unique string to allow process instance to be
        identified.
    - thread: The thread name.
    - time: The UTC time of the event.

    Valid 'event' names and event specific fields are documented here:
    https://git-scm.com/docs/api-trace2#_event_format
    """

    def __init__(
        self, env=None, repo_source_version=None, add_init_count=False
    ):
        """Initializes the event log."""
        global p_init_count
        p_init_count += 1
        self._log = []
        # Try to get session-id (sid) from environment (setup in repo launcher).
        KEY = "GIT_TRACE2_PARENT_SID"
        if env is None:
            env = os.environ

        self.start = datetime.datetime.now(datetime.timezone.utc)

        # Save both our sid component and the complete sid.
        # We use our sid component (self._sid) as the unique filename prefix and
        # the full sid (self._full_sid) in the log itself.
        self._sid = "repo-%s-P%08x" % (
            self.start.strftime("%Y%m%dT%H%M%SZ"),
            os.getpid(),
        )

        if add_init_count:
            self._sid = f"{self._sid}-{p_init_count}"

        parent_sid = env.get(KEY)
        # Append our sid component to the parent sid (if it exists).
        if parent_sid is not None:
            self._full_sid = parent_sid + "/" + self._sid
        else:
            self._full_sid = self._sid

        # Set/update the environment variable.
        # Environment handling across systems is messy.
        try:
            env[KEY] = self._full_sid
        except UnicodeEncodeError:
            env[KEY] = self._full_sid.encode()

        if repo_source_version is not None:
            # Add a version event to front of the log.
            self._AddVersionEvent(repo_source_version)

    @property
    def full_sid(self):
        return self._full_sid

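The session-id scheme above nests child invocations under their launcher: each component has the form repo-&lt;UTC timestamp&gt;-P&lt;hex pid&gt;, and is joined to any parent sid (from GIT_TRACE2_PARENT_SID) with a "/" separator. A sketch with fixed, illustrative values:

```python
import datetime

start = datetime.datetime(2020, 1, 2, 3, 4, 5, tzinfo=datetime.timezone.utc)
pid = 0x1234  # illustrative pid

# Our own sid component: timestamp plus zero-padded hex pid.
sid = "repo-%s-P%08x" % (start.strftime("%Y%m%dT%H%M%SZ"), pid)

# A parent sid, as the repo launcher would export it (made up here).
parent_sid = "repo-20200102T030400Z-P00000042"
full_sid = parent_sid + "/" + sid
```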
    def _AddVersionEvent(self, repo_source_version):
        """Adds a 'version' event at the beginning of current log."""
        version_event = self._CreateEventDict("version")
        version_event["evt"] = "2"
        version_event["exe"] = repo_source_version
        self._log.insert(0, version_event)

    def _CreateEventDict(self, event_name):
        """Returns a dictionary with common keys/values for git trace2 events.

        Args:
            event_name: The event name.

        Returns:
            Dictionary with the common event fields populated.
        """
        return {
            "event": event_name,
            "sid": self._full_sid,
            "thread": threading.current_thread().name,
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        }

    def StartEvent(self):
        """Append a 'start' event to the current log."""
        start_event = self._CreateEventDict("start")
        start_event["argv"] = sys.argv
        self._log.append(start_event)

    def ExitEvent(self, result):
        """Append an 'exit' event to the current log.

        Args:
            result: Exit code of the event
        """
        exit_event = self._CreateEventDict("exit")

        # Consider 'None' success (consistent with event_log result handling).
        if result is None:
            result = 0
        exit_event["code"] = result
        time_delta = datetime.datetime.now(datetime.timezone.utc) - self.start
        exit_event["t_abs"] = time_delta.total_seconds()
        self._log.append(exit_event)

    def CommandEvent(self, name, subcommands):
        """Append a 'command' event to the current log.

        Args:
            name: Name of the primary command (ex: repo, git)
            subcommands: List of the sub-commands (ex: version, init, sync)
        """
        command_event = self._CreateEventDict("command")
        command_event["name"] = name
        command_event["subcommands"] = subcommands
        self._log.append(command_event)

    def LogConfigEvents(self, config, event_dict_name):
        """Append a |event_dict_name| event for each config key in |config|.

        Args:
            config: Configuration dictionary.
            event_dict_name: Name of the event dictionary for items to be
                logged under.
        """
        for param, value in config.items():
            event = self._CreateEventDict(event_dict_name)
            event["param"] = param
            event["value"] = value
            self._log.append(event)

    def DefParamRepoEvents(self, config):
        """Append 'def_param' events for repo config keys to the current log.

        This appends one event for each repo.* config key.

        Args:
            config: Repo configuration dictionary
        """
        # Only output the repo.* config parameters.
        repo_config = {k: v for k, v in config.items() if k.startswith("repo.")}
        self.LogConfigEvents(repo_config, "def_param")

    def GetDataEventName(self, value):
        """Returns 'data-json' if the value is an array else returns 'data'."""
        return "data-json" if value[0] == "[" and value[-1] == "]" else "data"

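GetDataEventName's array check is purely textual: any value whose first and last characters are square brackets is treated as a JSON array. A standalone version (renamed here) behaves like this:

```python
def get_data_event_name(value):
    # Values that look like a JSON array get the "data-json" event type;
    # everything else is logged as a plain "data" event.
    return "data-json" if value[0] == "[" and value[-1] == "]" else "data"
```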
    def LogDataConfigEvents(self, config, prefix):
        """Append a 'data' event for each entry in |config| to the current log.

        For each keyX and valueX of the config, "key" field of the event is
        '|prefix|/keyX' and the "value" of the "key" field is valueX.

        Args:
            config: Configuration dictionary.
            prefix: Prefix for each key that is logged.
        """
        for key, value in config.items():
            event = self._CreateEventDict(self.GetDataEventName(value))
            event["key"] = f"{prefix}/{key}"
            event["value"] = value
            self._log.append(event)

    def ErrorEvent(self, msg, fmt=None):
        """Append an 'error' event to the current log."""
        error_event = self._CreateEventDict("error")
        if fmt is None:
            fmt = msg
        error_event["msg"] = f"RepoErrorEvent:{msg}"
        error_event["fmt"] = f"RepoErrorEvent:{fmt}"
        self._log.append(error_event)

    def _WriteLog(self, write_fn):
        """Writes the log out using a provided writer function.

        Generate compact JSON output for each item in the log, and write it
        using write_fn.

        Args:
            write_fn: A function that accepts bytes and writes them to a
                destination.
        """
        for e in self._log:
            # Dump in compact encoding mode.
            # See 'Compact encoding' in Python docs:
            # https://docs.python.org/3/library/json.html#module-json
            write_fn(
                json.dumps(e, indent=None, separators=(",", ":")).encode(
                    "utf-8"
                )
                + b"\n"
            )

    def Write(self, path=None):
        """Writes the log out to a file or socket.

        Log is only written if 'path' or 'git config --get trace2.eventtarget'
        provide a valid path (or socket) to write logs to.

        Logging filename format follows the git trace2 style of being a unique
        (exclusive writable) file.

        Args:
            path: Path to where logs should be written. The path may have a
                prefix of the form "af_unix:[{stream|dgram}:]", in which case
                the path is treated as a Unix domain socket. See
                https://git-scm.com/docs/api-trace2#_enabling_a_target for
                details.

        Returns:
            log_path: Path to the log file or socket if log is written,
                otherwise None.
        """
        log_path = None
        # If no logging path is specified, exit.
        if path is None:
            return None

        path_is_socket = False
        socket_type = None
        if isinstance(path, str):
            parts = path.split(":", 1)
            if parts[0] == "af_unix" and len(parts) == 2:
                path_is_socket = True
                path = parts[1]
                parts = path.split(":", 1)
                if parts[0] == "stream" and len(parts) == 2:
                    socket_type = socket.SOCK_STREAM
                    path = parts[1]
                elif parts[0] == "dgram" and len(parts) == 2:
                    socket_type = socket.SOCK_DGRAM
                    path = parts[1]
            else:
                # Get absolute path.
                path = os.path.abspath(os.path.expanduser(path))
        else:
            raise TypeError("path: str required but got %s." % type(path))

        # Git trace2 requires a directory to write log to.
        # TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
        if not (path_is_socket or os.path.isdir(path)):
            return None

        if path_is_socket:
            if socket_type == socket.SOCK_STREAM or socket_type is None:
                try:
                    with socket.socket(
                        socket.AF_UNIX, socket.SOCK_STREAM
                    ) as sock:
                        sock.connect(path)
                        self._WriteLog(sock.sendall)
                        return f"af_unix:stream:{path}"
                except OSError as err:
                    # If we tried to connect to a DGRAM socket using STREAM,
                    # ignore the attempt and continue to DGRAM below. Otherwise,
                    # issue a warning.
                    if err.errno != errno.EPROTOTYPE:
                        print(
                            f"repo: warning: git trace2 logging failed: {err}",
                            file=sys.stderr,
                        )
                        return None
            if socket_type == socket.SOCK_DGRAM or socket_type is None:
                try:
                    with socket.socket(
                        socket.AF_UNIX, socket.SOCK_DGRAM
                    ) as sock:
                        self._WriteLog(lambda bs: sock.sendto(bs, path))
                        return f"af_unix:dgram:{path}"
                except OSError as err:
                    print(
                        f"repo: warning: git trace2 logging failed: {err}",
                        file=sys.stderr,
                    )
                    return None
            # Tried to open a socket but couldn't connect (SOCK_STREAM) or
            # write (SOCK_DGRAM).
            print(
                "repo: warning: git trace2 logging failed: could not write to "
                "socket",
                file=sys.stderr,
            )
            return None

        # Path is an absolute path.
        # Use NamedTemporaryFile to generate a unique filename as required by
        # git trace2.
        try:
            with tempfile.NamedTemporaryFile(
                mode="xb", prefix=self._sid, dir=path, delete=False
            ) as f:
                # TODO(https://crbug.com/gerrit/13706): Support writing events
                # as they occur.
                self._WriteLog(f.write)
                log_path = f.name
        except FileExistsError as err:
            print(
                "repo: warning: git trace2 logging failed: %r" % err,
                file=sys.stderr,
            )
            return None
        return log_path
hooks.py (63 lines)
@ -12,11 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import errno
import json
import os
import re
import subprocess
import sys
import traceback
import urllib.parse
@ -298,43 +295,6 @@ class RepoHook(object):

        return interp

    def _ExecuteHookViaReexec(self, interp, context, **kwargs):
        """Execute the hook script through |interp|.

        Note: Support for this feature should be dropped ~Jun 2021.

        Args:
            interp: The Python program to run.
            context: Basic Python context to execute the hook inside.
            kwargs: Arbitrary arguments to pass to the hook script.

        Raises:
            HookError: When the hooks failed for any reason.
        """
        # This logic needs to be kept in sync with _ExecuteHookViaImport below.
        script = """
import json, os, sys
path = '''%(path)s'''
kwargs = json.loads('''%(kwargs)s''')
context = json.loads('''%(context)s''')
sys.path.insert(0, os.path.dirname(path))
data = open(path).read()
exec(compile(data, path, 'exec'), context)
context['main'](**kwargs)
""" % {
            "path": self._script_fullpath,
            "kwargs": json.dumps(kwargs),
            "context": json.dumps(context),
        }

        # We pass the script via stdin to avoid OS argv limits. It also makes
        # unhandled exception tracebacks less verbose/confusing for users.
        cmd = [interp, "-c", "import sys; exec(sys.stdin.read())"]
        proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
        proc.communicate(input=script.encode("utf-8"))
        if proc.returncode:
            raise HookError("Failed to run %s hook." % (self._hook_type,))

    def _ExecuteHookViaImport(self, data, context, **kwargs):
        """Execute the hook code in |data| directly.

@ -412,30 +372,13 @@ context['main'](**kwargs)

            # See what version of python the hook has been written against.
            data = open(self._script_fullpath).read()
            interp = self._ExtractInterpFromShebang(data)
            reexec = False
            if interp:
                prog = os.path.basename(interp)
                if prog.startswith("python2") and sys.version_info.major != 2:
                    reexec = True
                elif prog.startswith("python3") and sys.version_info.major == 2:
                    reexec = True

            # Attempt to execute the hooks through the requested version of
            # Python.
            if reexec:
                try:
                    self._ExecuteHookViaReexec(interp, context, **kwargs)
                except OSError as e:
                    if e.errno == errno.ENOENT:
                        # We couldn't find the interpreter, so fallback to
                        # importing.
                        reexec = False
                    else:
                        raise
                if prog.startswith("python2"):
                    raise HookError("Python 2 is not supported")

            # Run the hook by importing directly.
            if not reexec:
                self._ExecuteHookViaImport(data, context, **kwargs)
            self._ExecuteHookViaImport(data, context, **kwargs)
        finally:
            # Restore sys.path and CWD.
            sys.path = orig_syspath
main.py (132 lines)
@ -32,6 +32,8 @@ import textwrap
import time
import urllib.request

from repo_logging import RepoLogger


try:
    import kerberos
@ -69,6 +71,9 @@ from wrapper import Wrapper
from wrapper import WrapperPath


logger = RepoLogger(__file__)


# NB: These do not need to be kept in sync with the repo launcher script.
# These may be much newer as it allows the repo launcher to roll between
# different repo releases while source versions might require a newer python.
@ -81,27 +86,19 @@ from wrapper import WrapperPath
MIN_PYTHON_VERSION_SOFT = (3, 6)
MIN_PYTHON_VERSION_HARD = (3, 6)

if sys.version_info.major < 3:
    print(
        "repo: error: Python 2 is no longer supported; "
        "Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
        file=sys.stderr,
if sys.version_info < MIN_PYTHON_VERSION_HARD:
    logger.error(
        "repo: error: Python version is too old; "
        "Please upgrade to Python %d.%d+.",
        *MIN_PYTHON_VERSION_SOFT,
    )
    sys.exit(1)
else:
    if sys.version_info < MIN_PYTHON_VERSION_HARD:
        print(
            "repo: error: Python 3 version is too old; "
            "Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
            file=sys.stderr,
        )
        sys.exit(1)
    elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
        print(
            "repo: warning: your Python 3 version is no longer supported; "
            "Please upgrade to Python {}.{}+.".format(*MIN_PYTHON_VERSION_SOFT),
            file=sys.stderr,
        )
elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
    logger.error(
        "repo: warning: your Python version is no longer supported; "
        "Please upgrade to Python %d.%d+.",
        *MIN_PYTHON_VERSION_SOFT,
    )

KEYBOARD_INTERRUPT_EXIT = 128 + signal.SIGINT
MAX_PRINT_ERRORS = 5
@ -309,7 +306,7 @@ class _Repo(object):
            )

        if Wrapper().gitc_parse_clientdir(os.getcwd()):
            print("GITC is not supported.", file=sys.stderr)
            logger.error("GITC is not supported.")
            raise GitcUnsupportedError()

        try:
@ -322,32 +319,24 @@ class _Repo(object):
                git_event_log=git_trace2_event_log,
            )
        except KeyError:
            print(
                "repo: '%s' is not a repo command. See 'repo help'." % name,
                file=sys.stderr,
            logger.error(
                "repo: '%s' is not a repo command. See 'repo help'.", name
            )
            return 1

        Editor.globalConfig = cmd.client.globalConfig

        if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror:
            print(
                "fatal: '%s' requires a working directory" % name,
                file=sys.stderr,
            )
            logger.error("fatal: '%s' requires a working directory", name)
            return 1

        try:
            copts, cargs = cmd.OptionParser.parse_args(argv)
            copts = cmd.ReadEnvironmentOptions(copts)
        except NoManifestException as e:
            print(
                "error: in `%s`: %s" % (" ".join([name] + argv), str(e)),
                file=sys.stderr,
            )
            print(
                "error: manifest missing or unreadable -- please run init",
                file=sys.stderr,
            logger.error("error: in `%s`: %s", " ".join([name] + argv), e)
            logger.error(
                "error: manifest missing or unreadable -- please run init"
            )
            return 1

@ -453,34 +442,28 @@ class _Repo(object):
                ManifestInvalidRevisionError,
                NoManifestException,
            ) as e:
                print(
                    "error: in `%s`: %s" % (" ".join([name] + argv), str(e)),
                    file=sys.stderr,
                )
                logger.error("error: in `%s`: %s", " ".join([name] + argv), e)
                if isinstance(e, NoManifestException):
                    print(
                        "error: manifest missing or unreadable -- please run init",
                        file=sys.stderr,
                    logger.error(
                        "error: manifest missing or unreadable -- please run init"
                    )
                result = e.exit_code
            except NoSuchProjectError as e:
                if e.name:
                    print("error: project %s not found" % e.name, file=sys.stderr)
                    logger.error("error: project %s not found", e.name)
                else:
                    print("error: no project in current directory", file=sys.stderr)
                    logger.error("error: no project in current directory")
                result = e.exit_code
            except InvalidProjectGroupsError as e:
                if e.name:
                    print(
                        "error: project group must be enabled for project %s"
                        % e.name,
                        file=sys.stderr,
                    logger.error(
                        "error: project group must be enabled for project %s",
                        e.name,
                    )
                else:
                    print(
                    logger.error(
                        "error: project group must be enabled for the project in "
                        "the current directory",
                        file=sys.stderr,
                        "the current directory"
                    )
                result = e.exit_code
            except SystemExit as e:
@ -547,7 +530,7 @@ def _CheckWrapperVersion(ver_str, repo_path):
        repo_path = "~/bin/repo"

    if not ver_str:
        print("no --wrapper-version argument", file=sys.stderr)
        logger.error("no --wrapper-version argument")
        sys.exit(1)

    # Pull out the version of the repo launcher we know about to compare.
@ -556,7 +539,7 @@ def _CheckWrapperVersion(ver_str, repo_path):

    exp_str = ".".join(map(str, exp))
    if ver < MIN_REPO_VERSION:
        print(
        logger.error(
            """
repo: error:
!!! Your version of repo %s is too old.
@ -565,42 +548,44 @@ repo: error:
!!! You must upgrade before you can continue:

    cp %s %s
"""
            % (ver_str, min_str, exp_str, WrapperPath(), repo_path),
            file=sys.stderr,
""",
            ver_str,
            min_str,
            exp_str,
            WrapperPath(),
            repo_path,
        )
        sys.exit(1)

    if exp > ver:
        print(
            "\n... A new version of repo (%s) is available." % (exp_str,),
            file=sys.stderr,
        logger.warning(
            "\n... A new version of repo (%s) is available.", exp_str
        )
        if os.access(repo_path, os.W_OK):
            print(
            logger.warning(
                """\
... You should upgrade soon:
    cp %s %s
"""
                % (WrapperPath(), repo_path),
                file=sys.stderr,
""",
                WrapperPath(),
                repo_path,
            )
        else:
            print(
            logger.warning(
                """\
... New version is available at: %s
... The launcher is run from: %s
!!! The launcher is not writable. Please talk to your sysadmin or distro
!!! to get an update installed.
"""
                % (WrapperPath(), repo_path),
                file=sys.stderr,
""",
                WrapperPath(),
                repo_path,
            )


def _CheckRepoDir(repo_dir):
    if not repo_dir:
        print("no --repo-dir argument", file=sys.stderr)
        logger.error("no --repo-dir argument")
        sys.exit(1)


@ -861,18 +846,7 @@ def _Main(argv):
        result = repo._Run(name, gopts, argv) or 0
    except RepoExitError as e:
        if not isinstance(e, SilentRepoExitError):
            exception_name = type(e).__name__
            print("fatal: %s" % e, file=sys.stderr)
            if e.aggregate_errors:
                print(f"{exception_name} Aggregate Errors")
                for err in e.aggregate_errors[:MAX_PRINT_ERRORS]:
                    print(err)
                if (
                    e.aggregate_errors
                    and len(e.aggregate_errors) > MAX_PRINT_ERRORS
                ):
                    diff = len(e.aggregate_errors) - MAX_PRINT_ERRORS
                    print(f"+{diff} additional errors ...")
            logger.log_aggregated_errors(e)
        result = e.exit_code
    except KeyboardInterrupt:
        print("aborted by user", file=sys.stderr)

@ -2210,7 +2210,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
        toProjects = manifest.paths

        fromKeys = sorted(fromProjects.keys())
        toKeys = sorted(toProjects.keys())
        toKeys = set(toProjects.keys())

        diff = {
            "added": [],
@ -2221,13 +2221,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
        }

        for proj in fromKeys:
            fromProj = fromProjects[proj]
            if proj not in toKeys:
                diff["removed"].append(fromProjects[proj])
            elif not fromProjects[proj].Exists:
                diff["removed"].append(fromProj)
            elif not fromProj.Exists:
                diff["missing"].append(toProjects[proj])
                toKeys.remove(proj)
            else:
                fromProj = fromProjects[proj]
                toProj = toProjects[proj]
                try:
                    fromRevId = fromProj.GetCommitRevisionId()
@ -2239,8 +2239,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
                    diff["changed"].append((fromProj, toProj))
                    toKeys.remove(proj)

        for proj in toKeys:
            diff["added"].append(toProjects[proj])
        diff["added"].extend(toProjects[proj] for proj in sorted(toKeys))

        return diff

project.py (231 lines)
@ -44,7 +44,6 @@ from git_command import GitCommand
|
||||
from git_config import GetSchemeFromUrl
|
||||
from git_config import GetUrlCookieFile
|
||||
from git_config import GitConfig
|
||||
from git_config import ID_RE
|
||||
from git_config import IsId
|
||||
from git_refs import GitRefs
|
||||
from git_refs import HEAD
|
||||
@ -57,9 +56,13 @@ import git_superproject
|
||||
from git_trace2_event_log import EventLog
|
||||
import platform_utils
|
||||
import progress
|
||||
from repo_logging import RepoLogger
|
||||
from repo_trace import Trace
|
||||
|
||||
|
||||
logger = RepoLogger(__file__)
|
||||
|
||||
|
||||
class SyncNetworkHalfResult(NamedTuple):
|
||||
"""Sync_NetworkHalf return value."""
|
||||
|
||||
@ -116,16 +119,6 @@ def _lwrite(path, content):
|
||||
raise
|
||||
|
||||
|
||||
def _error(fmt, *args):
|
||||
msg = fmt % args
|
||||
print("error: %s" % msg, file=sys.stderr)
|
||||
|
||||
|
||||
def _warn(fmt, *args):
|
||||
msg = fmt % args
|
||||
print("warn: %s" % msg, file=sys.stderr)
|
||||
|
||||
|
||||
def not_rev(r):
|
||||
return "^" + r
|
||||
|
||||
@ -212,7 +205,9 @@ class ReviewableBranch(object):
|
||||
"--",
|
||||
)
|
||||
try:
|
||||
self._commit_cache = self.project.bare_git.rev_list(*args)
|
||||
self._commit_cache = self.project.bare_git.rev_list(
|
||||
*args, log_as_error=self.base_exists
|
||||
)
|
||||
except GitError:
|
||||
# We weren't able to probe the commits for this branch. Was it
|
||||
# tracking a branch that no longer exists? If so, return no
|
||||
@ -437,7 +432,7 @@ class _CopyFile(object):
|
||||
mode = mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH)
|
||||
os.chmod(dest, mode)
|
||||
except IOError:
|
||||
_error("Cannot copy file %s to %s", src, dest)
|
||||
logger.error("error: Cannot copy file %s to %s", src, dest)
|
||||
|
||||
|
||||
class _LinkFile(object):
|
||||
@ -472,7 +467,9 @@ class _LinkFile(object):
|
||||
os.makedirs(dest_dir)
|
||||
platform_utils.symlink(relSrc, absDest)
|
||||
except IOError:
|
||||
_error("Cannot link file %s to %s", relSrc, absDest)
|
||||
logger.error(
|
||||
"error: Cannot link file %s to %s", relSrc, absDest
|
||||
)
|
||||
|
||||
def _Link(self):
|
||||
"""Link the self.src & self.dest paths.
|
||||
@ -500,7 +497,7 @@ class _LinkFile(object):
|
||||
dest = _SafeExpandPath(self.topdir, self.dest)
|
||||
# Entity contains a wild card.
|
||||
if os.path.exists(dest) and not platform_utils.isdir(dest):
|
||||
_error(
|
||||
logger.error(
|
||||
"Link error: src with wildcard, %s must be a directory",
|
||||
dest,
|
||||
)
|
||||
@ -1138,7 +1135,7 @@ class Project(object):
|
||||
url = branch.remote.ReviewUrl(self.UserEmail, validate_certs)
|
||||
if url is None:
|
||||
raise UploadError("review not configured", project=self.name)
|
||||
cmd = ["push"]
|
||||
cmd = ["push", "--progress"]
|
||||
if dryrun:
|
||||
cmd.append("-n")
|
||||
|
||||
@ -1202,7 +1199,7 @@ class Project(object):
|
||||
tar.extractall(path=path)
|
||||
return True
|
||||
except (IOError, tarfile.TarError) as e:
|
||||
_error("Cannot extract archive %s: %s", tarpath, str(e))
|
||||
logger.error("error: Cannot extract archive %s: %s", tarpath, e)
|
||||
return False
|
||||
|
||||
def Sync_NetworkHalf(
|
||||
@ -1235,10 +1232,7 @@ class Project(object):
|
||||
)
|
||||
msg_args = self.name
|
||||
msg = msg_template % msg_args
|
||||
_error(
|
||||
msg_template,
|
||||
msg_args,
|
||||
)
|
||||
logger.error(msg_template, msg_args)
|
||||
return SyncNetworkHalfResult(
|
||||
False, SyncNetworkHalfError(msg, project=self.name)
|
||||
)
|
||||
@ -1251,7 +1245,7 @@ class Project(object):
|
||||
try:
|
||||
self._FetchArchive(tarpath, cwd=topdir)
|
||||
except GitError as e:
|
||||
_error("%s", e)
|
||||
logger.error("error: %s", e)
|
||||
return SyncNetworkHalfResult(False, e)
|
||||
|
||||
# From now on, we only need absolute tarpath.
|
||||
@ -1268,7 +1262,7 @@ class Project(object):
|
||||
try:
|
||||
platform_utils.remove(tarpath)
|
||||
except OSError as e:
|
||||
_warn("Cannot remove archive %s: %s", tarpath, str(e))
|
||||
logger.warning("warn: Cannot remove archive %s: %s", tarpath, e)
|
||||
self._CopyAndLinkFiles()
|
||||
return SyncNetworkHalfResult(True)
|
||||
|
||||
@@ -1354,10 +1348,8 @@ class Project(object):
     remote_fetched = False
     if not (
         optimized_fetch
-        and (
-            ID_RE.match(self.revisionExpr)
-            and self._CheckForImmutableRevision()
-        )
+        and IsId(self.revisionExpr)
+        and self._CheckForImmutableRevision()
     ):
         remote_fetched = True
         try:
@@ -1443,6 +1435,8 @@ class Project(object):
         rather than the id of the current git object (for example, a tag)

         """
+        if self.revisionId:
+            return self.revisionId
         if not self.revisionExpr.startswith(R_TAGS):
             return self.GetRevisionId(self._allrefs)

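Several hunks swap direct `ID_RE.match(...)` checks for an `IsId()` helper. A rough sketch of such a helper (the regex and name here are illustrative assumptions, not a copy of repo's implementation): it answers whether a revision expression is already a full hex object id, so no fetch is needed to resolve it.

```python
import re

# Hypothetical IsId()-style predicate: a full git object id is 40 hex
# chars (SHA-1), or 64 in SHA-256 repositories.
_ID_RE = re.compile(r"^[0-9a-f]{40}([0-9a-f]{24})?$")


def is_id(rev: str) -> bool:
    """Return True if `rev` looks like a full hex object id, not a ref."""
    return bool(_ID_RE.match(rev))


assert is_id("a" * 40)
assert is_id("0123456789abcdef" * 4)  # 64-char SHA-256 style id
assert not is_id("refs/heads/main")
assert not is_id("A" * 40)  # git object ids are lowercase hex
```

Centralizing the check like this is what lets the diff replace each scattered `ID_RE.match(...)` with one call.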
@@ -1601,7 +1595,9 @@ class Project(object):
     # See if we can perform a fast forward merge. This can happen if our
     # branch isn't in the exact same state as we last published.
     try:
-        self.work_git.merge_base("--is-ancestor", HEAD, revid)
+        self.work_git.merge_base(
+            "--is-ancestor", HEAD, revid, log_as_error=False
+        )
         # Skip the published logic.
         pub = False
     except GitError:
@@ -1672,7 +1668,7 @@ class Project(object):
     )

     branch.remote = self.GetRemote()
-    if not ID_RE.match(self.revisionExpr):
+    if not IsId(self.revisionExpr):
         # In case of manifest sync the revisionExpr might be a SHA1.
         branch.merge = self.revisionExpr
         if not branch.merge.startswith("refs/"):
@@ -1763,17 +1759,17 @@ class Project(object):
     """
     if self.IsDirty():
         if force:
-            print(
+            logger.warning(
                 "warning: %s: Removing dirty project: uncommitted changes "
-                "lost." % (self.RelPath(local=False),),
-                file=sys.stderr,
+                "lost.",
+                self.RelPath(local=False),
             )
         else:
             msg = (
                 "error: %s: Cannot remove project: uncommitted"
                 "changes are present.\n" % self.RelPath(local=False)
             )
-            print(msg, file=sys.stderr)
+            logger.error(msg)
             raise DeleteDirtyWorktreeError(msg, project=self)

     if not quiet:
@@ -1820,12 +1816,11 @@ class Project(object):
     platform_utils.rmtree(self.gitdir)
 except OSError as e:
     if e.errno != errno.ENOENT:
-        print("error: %s: %s" % (self.gitdir, e), file=sys.stderr)
-        print(
+        logger.error("error: %s: %s", self.gitdir, e)
+        logger.error(
             "error: %s: Failed to delete obsolete checkout; remove "
-            "manually, then run `repo sync -l`."
-            % (self.RelPath(local=False),),
-            file=sys.stderr,
+            "manually, then run `repo sync -l`.",
+            self.RelPath(local=False),
         )
     raise DeleteWorktreeError(aggregate_errors=[e])

@@ -1841,10 +1836,7 @@ class Project(object):
     platform_utils.remove(path)
 except OSError as e:
     if e.errno != errno.ENOENT:
-        print(
-            "error: %s: Failed to remove: %s" % (path, e),
-            file=sys.stderr,
-        )
+        logger.error("error: %s: Failed to remove: %s", path, e)
         failed = True
         errors.append(e)
 dirs[:] = [
@@ -1863,10 +1855,7 @@ class Project(object):
     platform_utils.remove(d)
 except OSError as e:
     if e.errno != errno.ENOENT:
-        print(
-            "error: %s: Failed to remove: %s" % (d, e),
-            file=sys.stderr,
-        )
+        logger.error("error: %s: Failed to remove: %s", d, e)
         failed = True
         errors.append(e)
 elif not platform_utils.listdir(d):
@@ -1874,21 +1863,16 @@ class Project(object):
     platform_utils.rmdir(d)
 except OSError as e:
     if e.errno != errno.ENOENT:
-        print(
-            "error: %s: Failed to remove: %s" % (d, e),
-            file=sys.stderr,
-        )
+        logger.error("error: %s: Failed to remove: %s", d, e)
         failed = True
         errors.append(e)
 if failed:
-    print(
-        "error: %s: Failed to delete obsolete checkout."
-        % (self.RelPath(local=False),),
-        file=sys.stderr,
+    logger.error(
+        "error: %s: Failed to delete obsolete checkout.",
+        self.RelPath(local=False),
     )
-    print(
+    logger.error(
         " Remove manually, then run `repo sync -l`.",
-        file=sys.stderr,
     )
     raise DeleteWorktreeError(aggregate_errors=errors)

@@ -1922,9 +1906,7 @@ class Project(object):
     branch = self.GetBranch(name)
     branch.remote = self.GetRemote()
     branch.merge = branch_merge
-    if not branch.merge.startswith("refs/") and not ID_RE.match(
-        branch_merge
-    ):
+    if not branch.merge.startswith("refs/") and not IsId(branch_merge):
         branch.merge = R_HEADS + branch_merge

     if revision is None:
@@ -2075,7 +2057,7 @@ class Project(object):
         )
         b.Wait()
     finally:
-        if ID_RE.match(old):
+        if IsId(old):
             self.bare_git.DetachHead(old)
         else:
             self.bare_git.SetHead(old)
@@ -2326,15 +2308,26 @@ class Project(object):
     # if revision (sha or tag) is not present then following function
     # throws an error.
     self.bare_git.rev_list(
-        "-1", "--missing=allow-any", "%s^0" % self.revisionExpr, "--"
+        "-1",
+        "--missing=allow-any",
+        "%s^0" % self.revisionExpr,
+        "--",
+        log_as_error=False,
     )
     if self.upstream:
         rev = self.GetRemote().ToLocal(self.upstream)
         self.bare_git.rev_list(
-            "-1", "--missing=allow-any", "%s^0" % rev, "--"
+            "-1",
+            "--missing=allow-any",
+            "%s^0" % rev,
+            "--",
+            log_as_error=False,
         )
         self.bare_git.merge_base(
-            "--is-ancestor", self.revisionExpr, rev
+            "--is-ancestor",
+            self.revisionExpr,
+            rev,
+            log_as_error=False,
         )
     return True
 except GitError:
@@ -2377,7 +2370,6 @@ class Project(object):
     retry_sleep_initial_sec=4.0,
     retry_exp_factor=2.0,
 ) -> bool:
-    is_sha1 = False
     tag_name = None
     # The depth should not be used when fetching to a mirror because
     # it will result in a shallow repository that cannot be cloned or
@@ -2389,8 +2381,7 @@ class Project(object):
     if depth:
         current_branch_only = True

-    if ID_RE.match(self.revisionExpr) is not None:
-        is_sha1 = True
+    is_sha1 = bool(IsId(self.revisionExpr))

     if current_branch_only:
         if self.revisionExpr.startswith(R_TAGS):
@@ -2417,7 +2408,7 @@ class Project(object):
     # * otherwise, fetch all branches to make sure we end up with
     # the specific commit.
     if self.upstream:
-        current_branch_only = not ID_RE.match(self.upstream)
+        current_branch_only = not IsId(self.upstream)
     else:
         current_branch_only = False

@@ -2786,7 +2777,7 @@ class Project(object):
         print("Curl output:\n%s" % output)
         return False
     elif curlret and not verbose and output:
-        print("%s" % output, file=sys.stderr)
+        logger.error("%s", output)

     if os.path.exists(tmpPath):
         if curlret == 0 and self._IsValidBundle(tmpPath, quiet):
@@ -2805,10 +2796,7 @@ class Project(object):
             return True
         else:
             if not quiet:
-                print(
-                    "Invalid clone.bundle file; ignoring.",
-                    file=sys.stderr,
-                )
+                logger.error("Invalid clone.bundle file; ignoring.")
             return False
     except OSError:
         return False
@@ -2928,9 +2916,8 @@ class Project(object):
     self._CheckDirReference(self.objdir, self.gitdir)
 except GitError as e:
     if force_sync:
-        print(
-            "Retrying clone after deleting %s" % self.gitdir,
-            file=sys.stderr,
+        logger.error(
+            "Retrying clone after deleting %s", self.gitdir
         )
         try:
             platform_utils.rmtree(
@@ -3051,8 +3038,8 @@ class Project(object):
     # hardlink below.
     if not filecmp.cmp(stock_hook, dst, shallow=False):
         if not quiet:
-            _warn(
-                "%s: Not replacing locally modified %s hook",
+            logger.warning(
+                "warn: %s: Not replacing locally modified %s hook",
                 self.RelPath(local=False),
                 name,
             )
@@ -3163,7 +3150,12 @@ class Project(object):
     src = platform_utils.realpath(src_path)
     # Fail if the links are pointing to the wrong place.
     if src != dst:
-        _error("%s is different in %s vs %s", name, destdir, srcdir)
+        logger.error(
+            "error: %s is different in %s vs %s",
+            name,
+            destdir,
+            srcdir,
+        )
         raise GitError(
             "--force-sync not enabled; cannot overwrite a local "
             "work tree. If you're comfortable with the "
@@ -3402,7 +3394,8 @@ class Project(object):
     # Now that the dir should be empty, clear it out, and symlink it over.
     platform_utils.rmdir(dotgit)
     platform_utils.symlink(
-        os.path.relpath(gitdir, os.path.dirname(dotgit)), dotgit
+        os.path.relpath(gitdir, os.path.dirname(os.path.realpath(dotgit))),
+        dotgit,
     )

 def _get_symlink_error_message(self):
@@ -3635,7 +3628,7 @@ class Project(object):
     self.update_ref("-d", name, old)
     self._project.bare_ref.deleted(name)

-def rev_list(self, *args, **kw):
+def rev_list(self, *args, log_as_error=True, **kw):
     if "format" in kw:
         cmdv = ["log", "--pretty=format:%s" % kw["format"]]
     else:
@@ -3649,6 +3642,7 @@ class Project(object):
     capture_stdout=True,
     capture_stderr=True,
     verify_command=True,
+    log_as_error=log_as_error,
 )
 p.Wait()
 return p.stdout.splitlines()
@@ -3676,7 +3670,7 @@ class Project(object):
     """
     name = name.replace("_", "-")

-    def runner(*args, **kwargs):
+    def runner(*args, log_as_error=True, **kwargs):
         cmdv = []
         config = kwargs.pop("config", None)
         for k in kwargs:
@@ -3697,6 +3691,7 @@ class Project(object):
     capture_stdout=True,
     capture_stderr=True,
     verify_command=True,
+    log_as_error=log_as_error,
 )
 p.Wait()
 r = p.stdout
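The `rev_list`/`runner` hunks thread a `log_as_error=True` keyword through the dynamic git wrapper down to the underlying command, without letting it leak into the constructed git argv. A generic sketch of that pass-through pattern (names here are illustrative stand-ins, not repo's actual `GitCommand` API):

```python
def run_command(cmdv, log_as_error=True):
    # Stand-in for the real command runner: just records how it was called.
    return {"cmd": cmdv, "log_as_error": log_as_error}


def make_runner(name):
    # Because `log_as_error` is named in the wrapper's signature, it is
    # captured there and never treated as a git option, while all other
    # kwargs still become `--key=value` arguments.
    def runner(*args, log_as_error=True, **kwargs):
        cmdv = [name]
        cmdv.extend(f"--{k.replace('_', '-')}={v}" for k, v in kwargs.items())
        cmdv.extend(args)
        return run_command(cmdv, log_as_error=log_as_error)

    return runner


rev_list = make_runner("rev-list")
result = rev_list("-1", "HEAD", log_as_error=False)
assert result["cmd"] == ["rev-list", "-1", "HEAD"]
assert result["log_as_error"] is False
```

Callers that probe for expected failures (like `_CheckForImmutableRevision` above) pass `log_as_error=False` so a failed probe is not reported as an error.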
@@ -4211,7 +4206,7 @@ class ManifestProject(MetaProject):
         "manifest.standalone"
     )
     if was_standalone_manifest and not manifest_url:
-        print(
+        logger.error(
             "fatal: repo was initialized with a standlone manifest, "
             "cannot be re-initialized without --manifest-url/-u"
         )
@@ -4229,7 +4224,7 @@ class ManifestProject(MetaProject):
     is_new = not self.Exists
     if is_new:
         if not manifest_url:
-            print("fatal: manifest url is required.", file=sys.stderr)
+            logger.error("fatal: manifest url is required.")
             return False

     if verbose:
@@ -4285,7 +4280,7 @@ class ManifestProject(MetaProject):
     if manifest_branch == "HEAD":
         manifest_branch = self.ResolveRemoteHead()
         if manifest_branch is None:
-            print("fatal: unable to resolve HEAD", file=sys.stderr)
+            logger.error("fatal: unable to resolve HEAD")
             return False
         self.revisionExpr = manifest_branch
     else:
@@ -4310,7 +4305,7 @@ class ManifestProject(MetaProject):
     elif platform in all_platforms:
         groups.append(platformize(platform))
     elif platform != "none":
-        print("fatal: invalid platform flag", file=sys.stderr)
+        logger.error("fatal: invalid platform flag", file=sys.stderr)
         return False
     self.config.SetString("manifest.platform", platform)

@@ -4331,35 +4326,29 @@ class ManifestProject(MetaProject):

 if worktree:
     if mirror:
-        print(
-            "fatal: --mirror and --worktree are incompatible",
-            file=sys.stderr,
-        )
+        logger.error("fatal: --mirror and --worktree are incompatible")
         return False
     if submodules:
-        print(
-            "fatal: --submodules and --worktree are incompatible",
-            file=sys.stderr,
+        logger.error(
+            "fatal: --submodules and --worktree are incompatible"
         )
         return False
     self.config.SetBoolean("repo.worktree", worktree)
     if is_new:
         self.use_git_worktrees = True
-    print("warning: --worktree is experimental!", file=sys.stderr)
+    logger.warning("warning: --worktree is experimental!")

 if archive:
     if is_new:
         self.config.SetBoolean("repo.archive", archive)
     else:
-        print(
+        logger.error(
             "fatal: --archive is only supported when initializing a "
-            "new workspace.",
-            file=sys.stderr,
+            "new workspace."
         )
-        print(
+        logger.error(
             "Either delete the .repo folder in this workspace, or "
-            "initialize in another location.",
-            file=sys.stderr,
+            "initialize in another location."
         )
         return False

@@ -4367,24 +4356,21 @@ class ManifestProject(MetaProject):
     if is_new:
         self.config.SetBoolean("repo.mirror", mirror)
     else:
-        print(
+        logger.error(
             "fatal: --mirror is only supported when initializing a new "
-            "workspace.",
-            file=sys.stderr,
+            "workspace."
         )
-        print(
+        logger.error(
             "Either delete the .repo folder in this workspace, or "
-            "initialize in another location.",
-            file=sys.stderr,
+            "initialize in another location."
         )
         return False

 if partial_clone is not None:
     if mirror:
-        print(
+        logger.error(
             "fatal: --mirror and --partial-clone are mutually "
-            "exclusive",
-            file=sys.stderr,
+            "exclusive"
         )
         return False
     self.config.SetBoolean("repo.partialclone", partial_clone)
@@ -4414,11 +4400,10 @@ class ManifestProject(MetaProject):

 self.config.SetBoolean("repo.git-lfs", git_lfs)
 if not is_new:
-    print(
+    logger.warning(
         "warning: Changing --git-lfs settings will only affect new "
         "project checkouts.\n"
-        " Existing projects will require manual updates.\n",
-        file=sys.stderr,
+        " Existing projects will require manual updates.\n"
     )

 if clone_filter_for_depth is not None:
@@ -4442,9 +4427,7 @@ class ManifestProject(MetaProject):
     ).success
     if not success:
         r = self.GetRemote()
-        print(
-            "fatal: cannot obtain manifest %s" % r.url, file=sys.stderr
-        )
+        logger.error("fatal: cannot obtain manifest %s", r.url)

         # Better delete the manifest git dir if we created it; otherwise
         # next time (when user fixes problems) we won't go through the
@@ -4465,14 +4448,13 @@ class ManifestProject(MetaProject):
         self.StartBranch("default")
     except GitError as e:
         msg = str(e)
-        print(
-            f"fatal: cannot create default in manifest {msg}",
-            file=sys.stderr,
+        logger.error(
+            "fatal: cannot create default in manifest %s", msg
         )
         return False

     if not manifest_name:
-        print("fatal: manifest name (-m) is required.", file=sys.stderr)
+        logger.error("fatal: manifest name (-m) is required.")
         return False

 elif is_new:
@@ -4487,11 +4469,8 @@ class ManifestProject(MetaProject):
     try:
         self.manifest.Link(manifest_name)
     except ManifestParseError as e:
-        print(
-            "fatal: manifest '%s' not available" % manifest_name,
-            file=sys.stderr,
-        )
-        print("fatal: %s" % str(e), file=sys.stderr)
+        logger.error("fatal: manifest '%s' not available", manifest_name)
+        logger.error("fatal: %s", e)
         return False

 if not this_manifest_only:
@@ -4533,13 +4512,13 @@ class ManifestProject(MetaProject):
     submanifest = ""
     if self.manifest.path_prefix:
         submanifest = f"for {self.manifest.path_prefix} "
-    print(
-        f"warning: git update of superproject {submanifest}failed, "
+    logger.warning(
+        "warning: git update of superproject %s failed, "
         "repo sync will not use superproject to fetch source; "
         "while this error is not fatal, and you can continue to "
         "run repo sync, please run repo init with the "
         "--no-use-superproject option to stop seeing this warning",
-        file=sys.stderr,
+        submanifest,
     )
     if sync_result.fatal and use_superproject is not None:
         return False
@@ -15,4 +15,4 @@
 [tool.black]
 line-length = 80
-target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311']
+# NB: Keep in sync with tox.ini.
+target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311'] #, 'py312'
@@ -15,24 +15,16 @@
 """Logic for printing user-friendly logs in repo."""

 import logging
-import multiprocessing

 from color import Coloring
+from error import RepoExitError


 SEPARATOR = "=" * 80
+MAX_PRINT_ERRORS = 5


-class LogColoring(Coloring):
-    """Coloring outstream for logging."""
-
-    def __init__(self, config):
-        super().__init__(config, "logs")
-        self.error = self.colorer("error", fg="red")
-        self.warning = self.colorer("warn", fg="yellow")
-
-
-class ConfigMock:
+class _ConfigMock:
     """Default coloring config to use when Logging.config is not set."""

     def __init__(self):
@@ -42,34 +34,59 @@ class ConfigMock:
     return self.default_values.get(x, None)


+class _LogColoring(Coloring):
+    """Coloring outstream for logging."""
+
+    def __init__(self, config):
+        super().__init__(config, "logs")
+        self.error = self.colorer("error", fg="red")
+        self.warning = self.colorer("warn", fg="yellow")
+        self.levelMap = {
+            "WARNING": self.warning,
+            "ERROR": self.error,
+        }
+
+
+class _LogColoringFormatter(logging.Formatter):
+    """Coloring formatter for logging."""
+
+    def __init__(self, config=None, *args, **kwargs):
+        self.config = config if config else _ConfigMock()
+        self.colorer = _LogColoring(self.config)
+        super().__init__(*args, **kwargs)
+
+    def format(self, record):
+        """Formats |record| with color."""
+        msg = super().format(record)
+        colorer = self.colorer.levelMap.get(record.levelname)
+        return msg if not colorer else colorer(msg)
+
+
 class RepoLogger(logging.Logger):
     """Repo Logging Module."""

-    # Aggregates error-level logs. This is used to generate an error summary
-    # section at the end of a command execution.
-    errors = multiprocessing.Manager().list()
-
-    def __init__(self, name, config=None, **kwargs):
+    def __init__(self, name: str, config=None, **kwargs):
         super().__init__(name, **kwargs)
-        self.config = config if config else ConfigMock()
-        self.colorer = LogColoring(self.config)
+        handler = logging.StreamHandler()
+        handler.setFormatter(_LogColoringFormatter(config))
+        self.addHandler(handler)

-    def error(self, msg, *args, **kwargs):
-        """Print and aggregate error-level logs."""
-        colored_error = self.colorer.error(msg, *args)
-        RepoLogger.errors.append(colored_error)
-
-        super().error(colored_error, **kwargs)
-
-    def warning(self, msg, *args, **kwargs):
-        """Print warning-level logs with coloring."""
-        colored_warning = self.colorer.warning(msg, *args)
-        super().warning(colored_warning, **kwargs)
-
-    def log_aggregated_errors(self):
+    def log_aggregated_errors(self, err: RepoExitError):
         """Print all aggregated logs."""
-        super().error(self.colorer.error(SEPARATOR))
-        super().error(
-            self.colorer.error("Repo command failed due to following errors:")
-        )
-        super().error("\n".join(RepoLogger.errors))
+        self.error(SEPARATOR)
+
+        if not err.aggregate_errors:
+            self.error("Repo command failed: %s", type(err).__name__)
+            return
+
+        self.error(
+            "Repo command failed due to the following `%s` errors:",
+            type(err).__name__,
+        )
+        self.error(
+            "\n".join(str(e) for e in err.aggregate_errors[:MAX_PRINT_ERRORS])
+        )
+
+        diff = len(err.aggregate_errors) - MAX_PRINT_ERRORS
+        if diff > 0:
+            self.error("+%d additional errors...", diff)
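After this change, `log_aggregated_errors` no longer reads a shared class-level list; it takes the failing `RepoExitError` and prints at most `MAX_PRINT_ERRORS` of its aggregated errors. A self-contained approximation of that summary shape (plain `logging`, no repo coloring; `summarize` is an illustrative helper, not repo's API):

```python
import logging

MAX_PRINT_ERRORS = 5
SEPARATOR = "=" * 80


def summarize(logger, err_name, aggregate_errors):
    # Mirrors the structure of log_aggregated_errors(): a separator, a
    # headline naming the error type, then a capped list of errors.
    lines = [SEPARATOR]
    if not aggregate_errors:
        lines.append(f"Repo command failed: {err_name}")
    else:
        lines.append(
            f"Repo command failed due to the following `{err_name}` errors:"
        )
        lines.extend(str(e) for e in aggregate_errors[:MAX_PRINT_ERRORS])
        diff = len(aggregate_errors) - MAX_PRINT_ERRORS
        if diff > 0:
            lines.append(f"+{diff} additional errors...")
    for line in lines:
        logger.error(line)
    return lines


log = logging.getLogger("summary-demo")
out = summarize(log, "SyncError", [f"error {i}" for i in range(8)])
assert out[-1] == "+3 additional errors..."
assert len(out) == 2 + 5 + 1  # separator, headline, 5 errors, overflow note
```

Passing the error object in, instead of aggregating in a `multiprocessing.Manager().list()`, is what lets the diff drop the `multiprocessing` import entirely.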
run_tests
@@ -27,8 +27,16 @@ ROOT_DIR = os.path.dirname(os.path.realpath(__file__))

 def run_black():
     """Returns the exit code from black."""
+    # Black by default only matches .py files. We have to list standalone
+    # scripts manually.
+    extra_programs = [
+        "repo",
+        "run_tests",
+        "release/update-manpages",
+    ]
     return subprocess.run(
-        [sys.executable, "-m", "black", "--check", ROOT_DIR], check=False
+        [sys.executable, "-m", "black", "--check", ROOT_DIR] + extra_programs,
+        check=False,
     ).returncode

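The `run_tests` hunk above extends the black check to extensionless scripts. A hedged sketch of building that invocation (paths are the examples from the diff; the helper name is illustrative, and the command is only constructed here, not executed, since black may not be installed):

```python
import sys


def black_check_argv(root, extra_programs):
    # black only auto-discovers *.py files, so standalone scripts such as
    # `repo` or `run_tests` must be appended to the command line explicitly.
    return (
        [sys.executable, "-m", "black", "--check", root]
        + list(extra_programs)
    )


argv = black_check_argv(".", ["repo", "run_tests"])
assert argv[-2:] == ["repo", "run_tests"]
assert "--check" in argv
```

In the real script this argv is handed to `subprocess.run(..., check=False)` so the exit code can be returned rather than raised.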
@@ -15,7 +15,6 @@
 import collections
 import functools
 import itertools
-import sys

 from command import Command
 from command import DEFAULT_LOCAL_JOBS
@@ -23,6 +22,10 @@ from error import RepoError
 from error import RepoExitError
 from git_command import git
 from progress import Progress
+from repo_logging import RepoLogger
+
+
+logger = RepoLogger(__file__)


 class AbandonError(RepoExitError):
@@ -126,18 +129,12 @@ It is equivalent to "git branch -D <branchname>".
 if err:
     for br in err.keys():
         err_msg = "error: cannot abandon %s" % br
-        print(err_msg, file=sys.stderr)
+        logger.error(err_msg)
         for proj in err[br]:
-            print(
-                " " * len(err_msg) + " | %s" % _RelPath(proj),
-                file=sys.stderr,
-            )
+            logger.error(" " * len(err_msg) + " | %s", _RelPath(proj))
     raise AbandonError(aggregate_errors=aggregate_errors)
 elif not success:
-    print(
-        "error: no project has local branch(es) : %s" % nb,
-        file=sys.stderr,
-    )
+    logger.error("error: no project has local branch(es) : %s", nb)
     raise AbandonError(aggregate_errors=aggregate_errors)
 else:
     # Everything below here is displaying status.
@@ -13,7 +13,6 @@
 # limitations under the License.

 import functools
-import sys
 from typing import NamedTuple

 from command import Command
@@ -22,6 +21,10 @@ from error import GitError
 from error import RepoExitError
 from progress import Progress
 from project import Project
+from repo_logging import RepoLogger
+
+
+logger = RepoLogger(__file__)


 class CheckoutBranchResult(NamedTuple):
@@ -99,12 +102,9 @@ The command is equivalent to:

 if err_projects:
     for p in err_projects:
-        print(
-            "error: %s/: cannot checkout %s" % (p.relpath, nb),
-            file=sys.stderr,
-        )
+        logger.error("error: %s/: cannot checkout %s", p.relpath, nb)
     raise CheckoutCommandError(aggregate_errors=err)
 elif not success:
     msg = f"error: no project has branch {nb}"
-    print(msg, file=sys.stderr)
+    logger.error(msg)
     raise MissingBranchError(msg)
@@ -18,9 +18,11 @@ import sys
 from command import Command
 from error import GitError
 from git_command import GitCommand
+from repo_logging import RepoLogger


 CHANGE_ID_RE = re.compile(r"^\s*Change-Id: I([0-9a-f]{40})\s*$")
+logger = RepoLogger(__file__)


 class CherryPick(Command):
@@ -52,7 +54,7 @@ change id will be added.
 try:
     p.Wait()
 except GitError:
-    print(p.stderr, file=sys.stderr)
+    logger.error(p.stderr)
     raise

 sha1 = p.stdout.strip()
@@ -67,9 +69,7 @@ change id will be added.
 try:
     p.Wait()
 except GitError:
-    print(
-        "error: Failed to retrieve old commit message", file=sys.stderr
-    )
+    logger.error("error: Failed to retrieve old commit message")
     raise

 old_msg = self._StripHeader(p.stdout)
@@ -85,14 +85,13 @@ change id will be added.
 try:
     p.Wait()
 except GitError as e:
-    print(str(e))
-    print(
+    logger.error(e)
+    logger.warning(
         "NOTE: When committing (please see above) and editing the "
         "commit message, please remove the old Change-Id-line and "
-        "add:"
+        "add:\n%s",
+        self._GetReference(sha1),
     )
-    print(self._GetReference(sha1), file=sys.stderr)
-    print(file=sys.stderr)
     raise

 if p.stdout:
@@ -115,10 +114,7 @@ change id will be added.
 try:
     p.Wait()
 except GitError:
-    print(
-        "error: Failed to update commit message",
-        file=sys.stderr,
-    )
+    logger.error("error: Failed to update commit message")
     raise

 def _IsChangeId(self, line):
@@ -19,9 +19,11 @@ from command import Command
 from error import GitError
 from error import NoSuchProjectError
 from error import RepoExitError
+from repo_logging import RepoLogger


 CHANGE_RE = re.compile(r"^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$")
+logger = RepoLogger(__file__)


 class DownloadCommandError(RepoExitError):
@@ -109,21 +111,16 @@ If no project is specified try to use current directory as a project.
 except NoSuchProjectError:
     project = None
 if project not in projects:
-    print(
+    logger.error(
         "error: %s matches too many projects; please "
-        "re-run inside the project checkout." % (a,),
-        file=sys.stderr,
+        "re-run inside the project checkout.",
+        a,
     )
     for project in projects:
-        print(
-            " %s/ @ %s"
-            % (
-                project.RelPath(
-                    local=opt.this_manifest_only
-                ),
-                project.revisionExpr,
-            ),
-            file=sys.stderr,
+        logger.error(
+            " %s/ @ %s",
+            project.RelPath(local=opt.this_manifest_only),
+            project.revisionExpr,
         )
     raise NoSuchProjectError()
 else:
@@ -156,18 +153,21 @@ If no project is specified try to use current directory as a project.
 dl = project.DownloadPatchSet(change_id, ps_id)

 if not opt.revert and not dl.commits:
-    print(
-        "[%s] change %d/%d has already been merged"
-        % (project.name, change_id, ps_id),
-        file=sys.stderr,
+    logger.error(
+        "[%s] change %d/%d has already been merged",
+        project.name,
+        change_id,
+        ps_id,
     )
     continue

 if len(dl.commits) > 1:
-    print(
-        "[%s] %d/%d depends on %d unmerged changes:"
-        % (project.name, change_id, ps_id, len(dl.commits)),
-        file=sys.stderr,
+    logger.error(
+        "[%s] %d/%d depends on %d unmerged changes:",
+        project.name,
+        change_id,
+        ps_id,
+        len(dl.commits),
     )
     for c in dl.commits:
         print(" %s" % (c), file=sys.stderr)
@@ -204,9 +204,10 @@ If no project is specified try to use current directory as a project.
     project._Checkout(dl.commit)

 except GitError:
-    print(
-        "[%s] Could not complete the %s of %s"
-        % (project.name, mode, dl.commit),
-        file=sys.stderr,
+    logger.error(
+        "[%s] Could not complete the %s of %s",
+        project.name,
+        mode,
+        dl.commit,
     )
     raise
@@ -28,8 +28,10 @@ from command import DEFAULT_LOCAL_JOBS
 from command import MirrorSafeCommand
 from command import WORKER_BATCH_SIZE
 from error import ManifestInvalidRevisionError
+from repo_logging import RepoLogger


+logger = RepoLogger(__file__)
 _CAN_COLOR = [
     "branch",
     "diff",
@@ -293,10 +295,10 @@ without iterating through the remaining projects.
     rc = rc or errno.EINTR
 except Exception as e:
     # Catch any other exceptions raised
-    print(
-        "forall: unhandled error, terminating the pool: %s: %s"
-        % (type(e).__name__, e),
-        file=sys.stderr,
+    logger.error(
+        "forall: unhandled error, terminating the pool: %s: %s",
+        type(e).__name__,
+        e,
     )
     rc = rc or getattr(e, "errno", 1)
 if rc != 0:
@@ -24,6 +24,10 @@ from error import InvalidArgumentsError
 from error import SilentRepoExitError
 from git_command import GitCommand
 from project import Project
+from repo_logging import RepoLogger
+
+
+logger = RepoLogger(__file__)


 class GrepColoring(Coloring):
@@ -371,7 +375,7 @@ contain a line that matches both expressions:
 if opt.revision:
     if "--cached" in cmd_argv:
         msg = "fatal: cannot combine --cached and --revision"
-        print(msg, file=sys.stderr)
+        logger.error(msg)
         raise InvalidArgumentsError(msg)
     have_rev = True
     cmd_argv.extend(opt.revision)
@@ -396,5 +400,5 @@ contain a line that matches both expressions:
     sys.exit(0)
 elif have_rev and bad_rev:
     for r in opt.revision:
-        print("error: can't search revision %s" % r, file=sys.stderr)
+        logger.error("error: can't search revision %s", r)
     raise GrepCommandError(aggregate_errors=errors)
@@ -23,9 +23,12 @@ from error import UpdateManifestError
 from git_command import git_require
 from git_command import MIN_GIT_VERSION_HARD
 from git_command import MIN_GIT_VERSION_SOFT
+from repo_logging import RepoLogger
 from wrapper import Wrapper


+logger = RepoLogger(__file__)
+
 _REPO_ALLOW_SHALLOW = os.environ.get("REPO_ALLOW_SHALLOW")


@@ -330,11 +333,11 @@ to update the working directory files.
 def Execute(self, opt, args):
     git_require(MIN_GIT_VERSION_HARD, fail=True)
     if not git_require(MIN_GIT_VERSION_SOFT):
-        print(
-            "repo: warning: git-%s+ will soon be required; please upgrade "
-            "your version of git to maintain support."
-            % (".".join(str(x) for x in MIN_GIT_VERSION_SOFT),),
-            file=sys.stderr,
+        logger.warning(
+            "repo: warning: git-%s+ will soon be required; "
+            "please upgrade your version of git to maintain "
+            "support.",
+            ".".join(str(x) for x in MIN_GIT_VERSION_SOFT),
         )

     rp = self.manifest.repoProject
@@ -357,10 +360,7 @@ to update the working directory files.
     )
 except wrapper.CloneFailure as e:
     err_msg = "fatal: double check your --repo-rev setting."
-    print(
-        err_msg,
-        file=sys.stderr,
-    )
+    logger.error(err_msg)
     self.git_event_log.ErrorEvent(err_msg)
     raise RepoUnhandledExceptionError(e)

@@ -17,6 +17,10 @@ import os
 import sys

 from command import PagedCommand
+from repo_logging import RepoLogger
+
+
+logger = RepoLogger(__file__)


 class Manifest(PagedCommand):
@@ -132,7 +136,7 @@ to indicate the remote ref to push changes to via 'repo upload'.
 manifest.SetUseLocalManifests(not opt.ignore_local_manifests)

 if opt.json:
-    print("warning: --json is experimental!", file=sys.stderr)
+    logger.warning("warning: --json is experimental!")
     doc = manifest.ToDict(
         peg_rev=opt.peg_rev,
         peg_rev_upstream=opt.peg_rev_upstream,
@@ -159,13 +163,13 @@ to indicate the remote ref to push changes to via 'repo upload'.
 if output_file != "-":
     fd.close()
     if manifest.path_prefix:
-        print(
-            f"Saved {manifest.path_prefix} submanifest to "
-            f"{output_file}",
-            file=sys.stderr,
+        logger.warning(
+            "Saved %s submanifest to %s",
+            manifest.path_prefix,
+            output_file,
         )
     else:
-        print(f"Saved manifest to {output_file}", file=sys.stderr)
+        logger.warning("Saved manifest to %s", output_file)

 def ValidateOptions(self, opt, args):
     if args:
subcmds/rebase.py

@@ -17,6 +17,10 @@ import sys
 from color import Coloring
 from command import Command
 from git_command import GitCommand
+from repo_logging import RepoLogger
+
+
+logger = RepoLogger(__file__)


 class RebaseColoring(Coloring):
@@ -104,17 +108,15 @@ branch but need to incorporate new upstream changes "underneath" them.
        one_project = len(all_projects) == 1

        if opt.interactive and not one_project:
-            print(
-                "error: interactive rebase not supported with multiple "
-                "projects",
-                file=sys.stderr,
+            logger.error(
+                "error: interactive rebase not supported with multiple projects"
             )

            if len(args) == 1:
-                print(
-                    "note: project %s is mapped to more than one path"
-                    % (args[0],),
-                    file=sys.stderr,
+                logger.warning(
+                    "note: project %s is mapped to more than one path", args[0]
                 )

            return 1

        # Setup the common git rebase args that we use for all projects.
@@ -145,10 +147,9 @@ branch but need to incorporate new upstream changes "underneath" them.
            cb = project.CurrentBranch
            if not cb:
                if one_project:
-                    print(
-                        "error: project %s has a detached HEAD"
-                        % _RelPath(project),
-                        file=sys.stderr,
+                    logger.error(
+                        "error: project %s has a detached HEAD",
+                        _RelPath(project),
                     )
                    return 1
                # Ignore branches with detached HEADs.
@@ -157,10 +158,9 @@ branch but need to incorporate new upstream changes "underneath" them.
            upbranch = project.GetBranch(cb)
            if not upbranch.LocalMerge:
                if one_project:
-                    print(
-                        "error: project %s does not track any remote branches"
-                        % _RelPath(project),
-                        file=sys.stderr,
+                    logger.error(
+                        "error: project %s does not track any remote branches",
+                        _RelPath(project),
                     )
                    return 1
                # Ignore branches without remotes.
subcmds/selfupdate.py

@@ -13,15 +13,18 @@
 # limitations under the License.

 import optparse
-import sys

 from command import Command
 from command import MirrorSafeCommand
 from error import RepoExitError
+from repo_logging import RepoLogger
 from subcmds.sync import _PostRepoFetch
 from subcmds.sync import _PostRepoUpgrade


+logger = RepoLogger(__file__)
+
+
 class SelfupdateError(RepoExitError):
     """Exit error for failed selfupdate command."""

@@ -66,7 +69,7 @@ need to be performed by an end-user.
        else:
            result = rp.Sync_NetworkHalf()
            if result.error:
-                print("error: can't update repo", file=sys.stderr)
+                logger.error("error: can't update repo")
                raise SelfupdateError(aggregate_errors=[result.error])

        rp.bare_git.gc("--auto")
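`SelfupdateError` follows repo's `RepoExitError` convention of carrying an `aggregate_errors` list, so the top-level handler can report every underlying failure instead of only the last one. A standalone sketch of that convention (class and function names here are illustrative, not repo's actual implementation):

```python
class ExitError(Exception):
    """Base error that aggregates the underlying failures."""

    def __init__(self, message="", aggregate_errors=None):
        super().__init__(message)
        self.aggregate_errors = aggregate_errors or []


def update_repo(fetch):
    """Run a fetch step and wrap any failure for the top-level handler."""
    try:
        fetch()
    except OSError as e:
        raise ExitError("error: can't update repo", aggregate_errors=[e])


def failing_fetch():
    raise OSError("network unreachable")


try:
    update_repo(failing_fetch)
except ExitError as e:
    # The caller can enumerate every underlying cause.
    causes = [str(err) for err in e.aggregate_errors]
    print(causes)
# -> ['network unreachable']
```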
subcmds/stage.py

@@ -17,6 +17,10 @@ import sys
 from color import Coloring
 from command import InteractiveCommand
 from git_command import GitCommand
+from repo_logging import RepoLogger
+
+
+logger = RepoLogger(__file__)


 class _ProjectList(Coloring):
@@ -62,7 +66,7 @@ The '%prog' command stages files to prepare the next commit.
            if p.IsDirty()
        ]
        if not all_projects:
-            print("no projects have uncommitted modifications", file=sys.stderr)
+            logger.error("no projects have uncommitted modifications")
            return

        out = _ProjectList(self.manifest.manifestProject.config)
subcmds/start.py

@@ -13,7 +13,6 @@
 # limitations under the License.

 import functools
-import sys
 from typing import NamedTuple

 from command import Command
@@ -23,6 +22,10 @@ from git_command import git
 from git_config import IsImmutable
 from progress import Progress
 from project import Project
+from repo_logging import RepoLogger
+
+
+logger = RepoLogger(__file__)


 class ExecuteOneResult(NamedTuple):
@@ -95,10 +98,7 @@ revision specified in the manifest.
                nb, branch_merge=branch_merge, revision=revision
             )
        except Exception as e:
-            print(
-                "error: unable to checkout %s: %s" % (project.name, e),
-                file=sys.stderr,
-            )
+            logger.error("error: unable to checkout %s: %s", project.name, e)
            error = e
        return ExecuteOneResult(project, error)
@@ -136,10 +136,10 @@ revision specified in the manifest.

        if err_projects:
            for p in err_projects:
-                print(
-                    "error: %s/: cannot start %s"
-                    % (p.RelPath(local=opt.this_manifest_only), nb),
-                    file=sys.stderr,
+                logger.error(
+                    "error: %s/: cannot start %s",
+                    p.RelPath(local=opt.this_manifest_only),
+                    nb,
                 )
            msg_fmt = "cannot start %d project(s)"
            self.git_event_log.ErrorEvent(
293 subcmds/sync.py
@@ -25,7 +25,7 @@ import socket
 import sys
 import tempfile
 import time
-from typing import List, NamedTuple, Set
+from typing import List, NamedTuple, Set, Union
 import urllib.error
 import urllib.parse
 import urllib.request
@@ -56,6 +57,7 @@ from command import MirrorSafeCommand
 from command import WORKER_BATCH_SIZE
 from error import GitError
 from error import RepoChangedException
+from error import RepoError
 from error import RepoExitError
 from error import RepoUnhandledExceptionError
 from error import SyncError
@@ -74,6 +75,7 @@ from project import DeleteWorktreeError
 from project import Project
 from project import RemoteSpec
 from project import SyncBuffer
+from repo_logging import RepoLogger
 from repo_trace import Trace
 import ssh
 from wrapper import Wrapper
@@ -88,6 +90,8 @@ _AUTO_GC = os.environ.get(_REPO_AUTO_GC) == "1"

 _REPO_ALLOW_SHALLOW = os.environ.get("REPO_ALLOW_SHALLOW")

+logger = RepoLogger(__file__)
+

 class _FetchOneResult(NamedTuple):
     """_FetchOne return value.
@@ -118,7 +122,6 @@ class _FetchResult(NamedTuple):

     success: bool
     projects: Set[str]
-    errors: List[Exception]


 class _FetchMainResult(NamedTuple):
@@ -129,7 +132,6 @@ class _FetchMainResult(NamedTuple):
     """

     all_projects: List[Project]
-    errors: List[Exception]


 class _CheckoutOneResult(NamedTuple):
@@ -161,6 +163,34 @@ class SmartSyncError(SyncError):
     """Smart sync exit error."""


+class ManifestInterruptError(RepoError):
+    """Aggregate Error to be logged when a user interrupts a manifest update."""
+
+    def __init__(self, output, **kwargs):
+        super().__init__(output, **kwargs)
+        self.output = output
+
+    def __str__(self):
+        error_type = type(self).__name__
+        return f"{error_type}:{self.output}"
+
+
+class TeeStringIO(io.StringIO):
+    """StringIO class that can write to an additional destination."""
+
+    def __init__(
+        self, io: Union[io.TextIOWrapper, None], *args, **kwargs
+    ) -> None:
+        super().__init__(*args, **kwargs)
+        self.io = io
+
+    def write(self, s: str) -> int:
+        """Write to additional destination."""
+        super().write(s)
+        if self.io is not None:
+            self.io.write(s)
+
+
 class Sync(Command, MirrorSafeCommand):
     COMMON = True
     MULTI_MANIFEST_SUPPORT = True
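The `TeeStringIO` introduced above always buffers what is written and, when constructed with a real stream, mirrors each write to it; that is how `repo sync` keeps a copy of fetch output for error reporting while optionally echoing it live. A quick standalone check of that behavior (note the diff's `write` drops the return value from `super().write`; this sketch returns it, matching the `io` contract):

```python
import io
from typing import Union


class TeeStringIO(io.StringIO):
    """StringIO that can mirror writes to an additional destination."""

    def __init__(self, io_dest: Union[io.TextIOBase, None], *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.io = io_dest

    def write(self, s: str) -> int:
        # Always buffer; optionally echo to the secondary stream.
        ret = super().write(s)
        if self.io is not None:
            self.io.write(s)
        return ret


mirror = io.StringIO()
buf = TeeStringIO(mirror)  # "verbose" mode: echo to a second stream
buf.write("fetching project\n")

quiet = TeeStringIO(None)  # "quiet" mode: buffer only
quiet.write("hidden output\n")

print(buf.getvalue() == mirror.getvalue())
# -> True
```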
@@ -580,9 +610,10 @@ later is required to fix a server side protocol bug.
                superproject_logging_data["superproject"] = False
                superproject_logging_data["noworktree"] = True
                if opt.use_superproject is not False:
-                    print(
-                        f"{m.path_prefix}: not using superproject because "
-                        "there is no working tree."
+                    logger.warning(
+                        "%s: not using superproject because there is no "
+                        "working tree.",
+                        m.path_prefix,
                     )

        if not use_super:
@@ -602,13 +633,13 @@ later is required to fix a server side protocol bug.
                need_unload = True
            else:
                if print_messages:
-                    print(
-                        f"{m.path_prefix}: warning: Update of revisionId from "
-                        "superproject has failed, repo sync will not use "
-                        "superproject to fetch the source. ",
-                        "Please resync with the --no-use-superproject option "
-                        "to avoid this repo warning.",
-                        file=sys.stderr,
+                    logger.warning(
+                        "%s: warning: Update of revisionId from superproject "
+                        "has failed, repo sync will not use superproject to "
+                        "fetch the source. Please resync with the "
+                        "--no-use-superproject option to avoid this repo "
+                        "warning.",
+                        m.path_prefix,
                     )
                if update_result.fatal and opt.use_superproject is not None:
                    raise SuperprojectError()
@@ -645,7 +676,7 @@ later is required to fix a server side protocol bug.
        success = False
        remote_fetched = False
        errors = []
-        buf = io.StringIO()
+        buf = TeeStringIO(sys.stdout if opt.verbose else None)
        try:
            sync_result = project.Sync_NetworkHalf(
                quiet=opt.quiet,
@@ -672,25 +703,26 @@ later is required to fix a server side protocol bug.
                errors.append(sync_result.error)

            output = buf.getvalue()
-            if (opt.verbose or not success) and output:
+            if output and buf.io is None and not success:
                print("\n" + output.rstrip())

            if not success:
-                print(
-                    "error: Cannot fetch %s from %s"
-                    % (project.name, project.remote.url),
-                    file=sys.stderr,
+                logger.error(
+                    "error: Cannot fetch %s from %s",
+                    project.name,
+                    project.remote.url,
                 )
        except KeyboardInterrupt:
-            print(f"Keyboard interrupt while processing {project.name}")
+            logger.error("Keyboard interrupt while processing %s", project.name)
        except GitError as e:
-            print("error.GitError: Cannot fetch %s" % str(e), file=sys.stderr)
+            logger.error("error.GitError: Cannot fetch %s", e)
            errors.append(e)
        except Exception as e:
-            print(
-                "error: Cannot fetch %s (%s: %s)"
-                % (project.name, type(e).__name__, str(e)),
-                file=sys.stderr,
+            logger.error(
+                "error: Cannot fetch %s (%s: %s)",
+                project.name,
+                type(e).__name__,
+                e,
             )
            del self._sync_dict[k]
            errors.append(e)
@@ -725,13 +757,12 @@ later is required to fix a server side protocol bug.
        jobs = jobs_str(len(items))
        return f"{jobs} | {elapsed_str(elapsed)} {earliest_proj}"

-    def _Fetch(self, projects, opt, err_event, ssh_proxy):
+    def _Fetch(self, projects, opt, err_event, ssh_proxy, errors):
        ret = True

        jobs = opt.jobs_network
        fetched = set()
        remote_fetched = set()
-        errors = []
        pm = Progress(
            "Fetching",
            len(projects),
@@ -846,10 +877,10 @@ later is required to fix a server side protocol bug.
        if not self.outer_client.manifest.IsArchive:
            self._GCProjects(projects, opt, err_event)

-        return _FetchResult(ret, fetched, errors)
+        return _FetchResult(ret, fetched)

    def _FetchMain(
-        self, opt, args, all_projects, err_event, ssh_proxy, manifest
+        self, opt, args, all_projects, err_event, ssh_proxy, manifest, errors
     ):
        """The main network fetch loop.

@@ -865,7 +896,6 @@ later is required to fix a server side protocol bug.
            List of all projects that should be checked out.
        """
        rp = manifest.repoProject
-        errors = []

        to_fetch = []
        now = time.time()
@@ -874,11 +904,9 @@ later is required to fix a server side protocol bug.
        to_fetch.extend(all_projects)
        to_fetch.sort(key=self._fetch_times.Get, reverse=True)

-        result = self._Fetch(to_fetch, opt, err_event, ssh_proxy)
+        result = self._Fetch(to_fetch, opt, err_event, ssh_proxy, errors)
        success = result.success
        fetched = result.projects
-        if result.errors:
-            errors.extend(result.errors)

        if not success:
            err_event.set()
@@ -887,15 +915,14 @@ later is required to fix a server side protocol bug.
        if opt.network_only:
            # Bail out now; the rest touches the working tree.
            if err_event.is_set():
-                print(
-                    "\nerror: Exited sync due to fetch errors.\n",
-                    file=sys.stderr,
-                )
-                raise SyncError(
+                e = SyncError(
                     "error: Exited sync due to fetch errors.",
                     aggregate_errors=errors,
                 )
-            return _FetchMainResult([], errors)
+
+                logger.error(e)
+                raise e
+            return _FetchMainResult([])

        # Iteratively fetch missing and/or nested unregistered submodules.
        previously_missing_set = set()
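The refactor above stops returning collected errors inside `_FetchResult`/`_FetchMainResult` and instead threads a caller-owned `errors` list through `_Fetch` and `_FetchMain`. Because the caller holds the list, errors gathered before an exception unwinds the helper are not lost. A minimal sketch of the pattern (function and project names are illustrative):

```python
def fetch_all(projects, errors):
    """Append each failure to the caller-owned list; may raise mid-way."""
    for name in projects:
        if name.startswith("bad"):
            errors.append(ValueError(f"cannot fetch {name}"))
        if name == "fatal":
            # Unwinding here does not discard what was already collected,
            # because the caller owns the errors list.
            raise RuntimeError("fatal fetch error")


errors = []
try:
    fetch_all(["ok", "bad1", "fatal", "bad2"], errors)
except RuntimeError:
    pass

# The error recorded before the exception is still visible to the caller.
print([str(e) for e in errors])
# -> ['cannot fetch bad1']
```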
@@ -920,16 +947,14 @@ later is required to fix a server side protocol bug.
            if previously_missing_set == missing_set:
                break
            previously_missing_set = missing_set
-            result = self._Fetch(missing, opt, err_event, ssh_proxy)
+            result = self._Fetch(missing, opt, err_event, ssh_proxy, errors)
            success = result.success
            new_fetched = result.projects
-            if result.errors:
-                errors.extend(result.errors)
            if not success:
                err_event.set()
            fetched.update(new_fetched)

-        return _FetchMainResult(all_projects, errors)
+        return _FetchMainResult(all_projects)

    def _CheckoutOne(self, detach_head, force_sync, project):
        """Checkout work tree for one project
@@ -954,22 +979,21 @@ later is required to fix a server side protocol bug.
             )
            success = syncbuf.Finish()
        except GitError as e:
-            print(
-                "error.GitError: Cannot checkout %s: %s"
-                % (project.name, str(e)),
-                file=sys.stderr,
+            logger.error(
+                "error.GitError: Cannot checkout %s: %s", project.name, e
             )
            errors.append(e)
        except Exception as e:
-            print(
-                "error: Cannot checkout %s: %s: %s"
-                % (project.name, type(e).__name__, str(e)),
-                file=sys.stderr,
+            logger.error(
+                "error: Cannot checkout %s: %s: %s",
+                project.name,
+                type(e).__name__,
+                e,
             )
            raise

        if not success:
-            print("error: Cannot checkout %s" % (project.name), file=sys.stderr)
+            logger.error("error: Cannot checkout %s", project.name)
        finish = time.time()
        return _CheckoutOneResult(success, errors, project, start, finish)
@@ -1092,16 +1116,17 @@ later is required to fix a server side protocol bug.
                        "\r%s: Shared project %s found, disabling pruning."
                        % (relpath, project.name)
                     )

                    if git_require((2, 7, 0)):
                        project.EnableRepositoryExtension("preciousObjects")
                    else:
                        # This isn't perfect, but it's the best we can do with old
                        # git.
-                        print(
-                            "\r%s: WARNING: shared projects are unreliable when "
+                        logger.warning(
+                            "%s: WARNING: shared projects are unreliable when "
                             "using old versions of git; please upgrade to "
-                            "git-2.7.0+." % (relpath,),
-                            file=sys.stderr,
+                            "git-2.7.0+.",
+                            relpath,
                         )
                        project.config.SetString("gc.pruneExpire", "never")
        else:
@@ -1303,10 +1328,9 @@ later is required to fix a server side protocol bug.
            try:
                old_copylinkfile_paths = json.load(fp)
            except Exception:
-                print(
-                    "error: %s is not a json formatted file."
-                    % copylinkfile_path,
-                    file=sys.stderr,
+                logger.error(
+                    "error: %s is not a json formatted file.",
+                    copylinkfile_path,
                 )
                platform_utils.remove(copylinkfile_path)
                raise
@@ -1363,15 +1387,12 @@ later is required to fix a server side protocol bug.
                    if auth:
                        username, _account, password = auth
                    else:
-                        print(
-                            "No credentials found for %s in .netrc"
-                            % parse_result.hostname,
-                            file=sys.stderr,
+                        logger.error(
+                            "No credentials found for %s in .netrc",
+                            parse_result.hostname,
                         )
                except netrc.NetrcParseError as e:
-                    print(
-                        "Error parsing .netrc file: %s" % e, file=sys.stderr
-                    )
+                    logger.error("Error parsing .netrc file: %s", e)

            if username and password:
                manifest_server = manifest_server.replace(
@@ -1440,7 +1461,7 @@ later is required to fix a server side protocol bug.

        return manifest_name

-    def _UpdateAllManifestProjects(self, opt, mp, manifest_name):
+    def _UpdateAllManifestProjects(self, opt, mp, manifest_name, errors):
        """Fetch & update the local manifest project.

        After syncing the manifest project, if the manifest has any sub
@@ -1452,7 +1473,7 @@ later is required to fix a server side protocol bug.
            manifest_name: Manifest file to be reloaded.
        """
        if not mp.standalone_manifest_url:
-            self._UpdateManifestProject(opt, mp, manifest_name)
+            self._UpdateManifestProject(opt, mp, manifest_name, errors)

        if mp.manifest.submanifests:
            for submanifest in mp.manifest.submanifests.values():
@@ -1465,10 +1486,10 @@ later is required to fix a server side protocol bug.
                    git_event_log=self.git_event_log,
                 )
                self._UpdateAllManifestProjects(
-                    opt, child.manifestProject, None
+                    opt, child.manifestProject, None, errors
                 )

-    def _UpdateManifestProject(self, opt, mp, manifest_name):
+    def _UpdateManifestProject(self, opt, mp, manifest_name, errors):
        """Fetch & update the local manifest project.

        Args:
@@ -1478,21 +1499,32 @@ later is required to fix a server side protocol bug.
        """
        if not opt.local_only:
            start = time.time()
-            result = mp.Sync_NetworkHalf(
-                quiet=opt.quiet,
-                verbose=opt.verbose,
-                current_branch_only=self._GetCurrentBranchOnly(
-                    opt, mp.manifest
-                ),
-                force_sync=opt.force_sync,
-                tags=opt.tags,
-                optimized_fetch=opt.optimized_fetch,
-                retry_fetches=opt.retry_fetches,
-                submodules=mp.manifest.HasSubmodules,
-                clone_filter=mp.manifest.CloneFilter,
-                partial_clone_exclude=mp.manifest.PartialCloneExclude,
-                clone_filter_for_depth=mp.manifest.CloneFilterForDepth,
-            )
+            buf = TeeStringIO(sys.stdout)
+            try:
+                result = mp.Sync_NetworkHalf(
+                    quiet=opt.quiet,
+                    output_redir=buf,
+                    verbose=opt.verbose,
+                    current_branch_only=self._GetCurrentBranchOnly(
+                        opt, mp.manifest
+                    ),
+                    force_sync=opt.force_sync,
+                    tags=opt.tags,
+                    optimized_fetch=opt.optimized_fetch,
+                    retry_fetches=opt.retry_fetches,
+                    submodules=mp.manifest.HasSubmodules,
+                    clone_filter=mp.manifest.CloneFilter,
+                    partial_clone_exclude=mp.manifest.PartialCloneExclude,
+                    clone_filter_for_depth=mp.manifest.CloneFilterForDepth,
+                )
+                if result.error:
+                    errors.append(result.error)
+            except KeyboardInterrupt:
+                errors.append(
+                    ManifestInterruptError(buf.getvalue(), project=mp.name)
+                )
+                raise

            finish = time.time()
            self.event_log.AddSync(
                mp, event_log.TASK_SYNC_NETWORK, start, finish, result.success
@@ -1517,10 +1549,9 @@ later is required to fix a server side protocol bug.

    def ValidateOptions(self, opt, args):
        if opt.force_broken:
-            print(
+            logger.warning(
                 "warning: -f/--force-broken is now the default behavior, and "
-                "the options are deprecated",
-                file=sys.stderr,
+                "the options are deprecated"
             )
        if opt.network_only and opt.detach_head:
            self.OptionParser.error("cannot combine -n and -d")
@@ -1545,11 +1576,12 @@ later is required to fix a server side protocol bug.
            opt.prune = True

        if opt.auto_gc is None and _AUTO_GC:
-            print(
-                f"Will run `git gc --auto` because {_REPO_AUTO_GC} is set.",
-                f"{_REPO_AUTO_GC} is deprecated and will be removed in a ",
-                "future release. Use `--auto-gc` instead.",
-                file=sys.stderr,
+            logger.error(
+                "Will run `git gc --auto` because %s is set. %s is deprecated "
+                "and will be removed in a future release. Use `--auto-gc` "
+                "instead.",
+                _REPO_AUTO_GC,
+                _REPO_AUTO_GC,
             )
            opt.auto_gc = True

@@ -1626,10 +1658,10 @@ later is required to fix a server side protocol bug.
            try:
                platform_utils.remove(smart_sync_manifest_path)
            except OSError as e:
-                print(
+                logger.error(
                     "error: failed to remove existing smart sync override "
-                    "manifest: %s" % e,
-                    file=sys.stderr,
+                    "manifest: %s",
+                    e,
                 )

        err_event = multiprocessing.Event()
@@ -1640,11 +1672,10 @@ later is required to fix a server side protocol bug.
        if cb:
            base = rp.GetBranch(cb).merge
            if not base or not base.startswith("refs/heads/"):
-                print(
+                logger.warning(
                     "warning: repo is not tracking a remote branch, so it will "
                     "not receive updates; run `repo init --repo-rev=stable` to "
-                    "fix.",
-                    file=sys.stderr,
+                    "fix."
                 )

        for m in self.ManifestList(opt):
@@ -1665,7 +1696,7 @@ later is required to fix a server side protocol bug.
            mp.ConfigureCloneFilterForDepth("blob:none")

        if opt.mp_update:
-            self._UpdateAllManifestProjects(opt, mp, manifest_name)
+            self._UpdateAllManifestProjects(opt, mp, manifest_name, errors)
        else:
            print("Skipping update of local manifest project.")

@@ -1705,10 +1736,14 @@ later is required to fix a server side protocol bug.
                # Initialize the socket dir once in the parent.
                ssh_proxy.sock()
                result = self._FetchMain(
-                    opt, args, all_projects, err_event, ssh_proxy, manifest
+                    opt,
+                    args,
+                    all_projects,
+                    err_event,
+                    ssh_proxy,
+                    manifest,
+                    errors,
                 )
-                if result.errors:
-                    errors.extend(result.errors)
                all_projects = result.all_projects

            if opt.network_only:
@@ -1719,12 +1754,11 @@ later is required to fix a server side protocol bug.
            if err_event.is_set():
                err_network_sync = True
                if opt.fail_fast:
-                    print(
-                        "\nerror: Exited sync due to fetch errors.\n"
+                    logger.error(
+                        "error: Exited sync due to fetch errors.\n"
                         "Local checkouts *not* updated. Resolve network issues "
                         "& retry.\n"
-                        "`repo sync -l` will update some local checkouts.",
-                        file=sys.stderr,
+                        "`repo sync -l` will update some local checkouts."
                     )
                    raise SyncFailFastError(aggregate_errors=errors)

@@ -1742,10 +1776,7 @@ later is required to fix a server side protocol bug.
                if isinstance(e, DeleteWorktreeError):
                    errors.extend(e.aggregate_errors)
                if opt.fail_fast:
-                    print(
-                        "\nerror: Local checkouts *not* updated.",
-                        file=sys.stderr,
-                    )
+                    logger.error("error: Local checkouts *not* updated.")
                    raise SyncFailFastError(aggregate_errors=errors)

            err_update_linkfiles = False
@@ -1756,9 +1787,8 @@ later is required to fix a server side protocol bug.
                errors.append(e)
                err_event.set()
                if opt.fail_fast:
-                    print(
-                        "\nerror: Local update copyfile or linkfile failed.",
-                        file=sys.stderr,
+                    logger.error(
+                        "error: Local update copyfile or linkfile failed."
                     )
                    raise SyncFailFastError(aggregate_errors=errors)
@@ -1781,12 +1811,10 @@ later is required to fix a server side protocol bug.

        # If we saw an error, exit with code 1 so that other scripts can check.
        if err_event.is_set():
-            # Add a new line so it's easier to read.
-            print("\n", file=sys.stderr)

            def print_and_log(err_msg):
                self.git_event_log.ErrorEvent(err_msg)
-                print(err_msg, file=sys.stderr)
+                logger.error("%s", err_msg)

            print_and_log("error: Unable to fully sync the tree")
            if err_network_sync:
@@ -1799,15 +1827,11 @@ later is required to fix a server side protocol bug.
                print_and_log("error: Checking out local projects failed.")
                if err_results:
                    # Don't log repositories, as it may contain sensitive info.
-                    print(
-                        "Failing repos:\n%s" % "\n".join(err_results),
-                        file=sys.stderr,
-                    )
+                    logger.error("Failing repos:\n%s", "\n".join(err_results))
                # Not useful to log.
-                print(
+                logger.error(
                     'Try re-running with "-j1 --fail-fast" to exit at the first '
-                    "error.",
-                    file=sys.stderr,
+                    "error."
                 )
                raise SyncError(aggregate_errors=errors)

@@ -1824,10 +1848,9 @@ later is required to fix a server side protocol bug.

        self._local_sync_state.PruneRemovedProjects()
        if self._local_sync_state.IsPartiallySynced():
-            print(
+            logger.warning(
                 "warning: Partial syncs are not supported. For the best "
-                "experience, sync the entire tree.",
-                file=sys.stderr,
+                "experience, sync the entire tree."
             )

        if not opt.quiet:
@@ -1854,7 +1877,7 @@ def _PostRepoUpgrade(manifest, quiet=False):

 def _PostRepoFetch(rp, repo_verify=True, verbose=False):
     if rp.HasChanges:
-        print("info: A new version of repo is available", file=sys.stderr)
+        logger.warning("info: A new version of repo is available")
         wrapper = Wrapper()
         try:
             rev = rp.bare_git.describe(rp.GetRevisionId())
@@ -1876,19 +1899,13 @@ def _PostRepoFetch(rp, repo_verify=True, verbose=False):
                rp.work_git.reset("--keep", new_rev)
            except GitError as e:
                raise RepoUnhandledExceptionError(e)
-            print("info: Restarting repo with latest version", file=sys.stderr)
+            print("info: Restarting repo with latest version")
            raise RepoChangedException(["--repo-upgraded"])
        else:
-            print(
-                "warning: Skipped upgrade to unverified version",
-                file=sys.stderr,
-            )
+            logger.warning("warning: Skipped upgrade to unverified version")
     else:
        if verbose:
-            print(
-                "repo version %s is current" % rp.work_git.describe(HEAD),
-                file=sys.stderr,
-            )
+            print("repo version %s is current" % rp.work_git.describe(HEAD))


 class _FetchTimes(object):
subcmds/upload.py

@@ -29,10 +29,12 @@ from git_command import GitCommand
 from git_refs import R_HEADS
 from hooks import RepoHook
 from project import ReviewableBranch
+from repo_logging import RepoLogger
 from subcmds.sync import LocalSyncState


 _DEFAULT_UNUSUAL_COMMIT_THRESHOLD = 5
+logger = RepoLogger(__file__)


 class UploadExitError(SilentRepoExitError):
@@ -70,16 +72,16 @@ def _VerifyPendingCommits(branches: List[ReviewableBranch]) -> bool:
     # If any branch has many commits, prompt the user.
     if many_commits:
         if len(branches) > 1:
-            print(
+            logger.warning(
                 "ATTENTION: One or more branches has an unusually high number "
                 "of commits."
             )
         else:
-            print(
+            logger.warning(
                 "ATTENTION: You are uploading an unusually high number of "
                 "commits."
             )
-        print(
+        logger.warning(
             "YOU PROBABLY DO NOT MEAN TO DO THIS. (Did you rebase across "
             "branches?)"
         )
@@ -93,7 +95,7 @@ def _VerifyPendingCommits(branches: List[ReviewableBranch]) -> bool:

 def _die(fmt, *args):
     msg = fmt % args
-    print("error: %s" % msg, file=sys.stderr)
+    logger.error("error: %s", msg)
     raise UploadExitError(msg)


@@ -748,16 +750,13 @@ Gerrit Code Review: https://www.gerritcodereview.com/
        for result in results:
            project, avail = result
            if avail is None:
-                print(
+                logger.error(
                     'repo: error: %s: Unable to upload branch "%s". '
                     "You might be able to fix the branch by running:\n"
-                    "  git branch --set-upstream-to m/%s"
-                    % (
-                        project.RelPath(local=opt.this_manifest_only),
-                        project.CurrentBranch,
-                        project.manifest.branch,
-                    ),
-                    file=sys.stderr,
+                    "  git branch --set-upstream-to m/%s",
+                    project.RelPath(local=opt.this_manifest_only),
+                    project.CurrentBranch,
+                    project.manifest.branch,
                 )
            elif avail:
                pending.append(result)
@@ -772,14 +771,11 @@ Gerrit Code Review: https://www.gerritcodereview.com/

        if not pending:
            if opt.branch is None:
-                print(
-                    "repo: error: no branches ready for upload", file=sys.stderr
-                )
+                logger.error("repo: error: no branches ready for upload")
            else:
-                print(
-                    'repo: error: no branches named "%s" ready for upload'
-                    % (opt.branch,),
-                    file=sys.stderr,
+                logger.error(
+                    'repo: error: no branches named "%s" ready for upload',
+                    opt.branch,
                 )
            return 1

@@ -809,10 +805,9 @@ Gerrit Code Review: https://www.gerritcodereview.com/
                project_list=pending_proj_names, worktree_list=pending_worktrees
             ):
                if LocalSyncState(manifest).IsPartiallySynced():
-                    print(
+                    logger.error(
                         "Partially synced tree detected. Syncing all projects "
-                        "may resolve issues you're seeing.",
-                        file=sys.stderr,
+                        "may resolve issues you're seeing."
                     )
                    ret = 1
                if ret:
@ -14,8 +14,11 @@

"""Common fixtures for pytests."""

import pathlib

import pytest

import platform_utils
import repo_trace


@ -23,3 +26,49 @@ import repo_trace
def disable_repo_trace(tmp_path):
    """Set an environment marker to relax certain strict checks for test code."""  # noqa: E501
    repo_trace._TRACE_FILE = str(tmp_path / "TRACE_FILE_from_test")


# adapted from pytest-home 0.5.1
def _set_home(monkeypatch, path: pathlib.Path):
    """
    Set the home dir using a pytest monkeypatch context.
    """
    win = platform_utils.isWindows()
    vars = ["HOME"] + win * ["USERPROFILE"]
    for var in vars:
        monkeypatch.setenv(var, str(path))
    return path


# copied from
# https://github.com/pytest-dev/pytest/issues/363#issuecomment-1335631998
@pytest.fixture(scope="session")
def monkeysession():
    with pytest.MonkeyPatch.context() as mp:
        yield mp


@pytest.fixture(autouse=True, scope="session")
def session_tmp_home_dir(tmp_path_factory, monkeysession):
    """Set HOME to a temporary directory, avoiding user's .gitconfig.

    b/302797407

    Set home at session scope to take effect prior to
    ``test_wrapper.GitCheckoutTestCase.setUpClass``.
    """
    return _set_home(monkeysession, tmp_path_factory.mktemp("home"))


# adapted from pytest-home 0.5.1
@pytest.fixture(autouse=True)
def tmp_home_dir(monkeypatch, tmp_path_factory):
    """Set HOME to a temporary directory.

    Ensures that state doesn't accumulate in $HOME across tests.

    Note that in conjunction with session_tmp_homedir, the HOME
    dir is patched twice, once at session scope, and then again at
    the function scope.
    """
    return _set_home(monkeypatch, tmp_path_factory.mktemp("home"))

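All of the new fixtures reduce to one idea: repoint `HOME` (plus `USERPROFILE` on Windows) at a throwaway directory so tests never read or write the user's real `~/.gitconfig`. A stdlib-only sketch of that pattern, without pytest's `MonkeyPatch` (which the fixtures above use; the `temporary_home` name is illustrative):

```python
import contextlib
import os
import tempfile


@contextlib.contextmanager
def temporary_home():
    """Yield a fresh directory exported as HOME (and USERPROFILE on Windows)."""
    saved = {var: os.environ.get(var) for var in ("HOME", "USERPROFILE")}
    with tempfile.TemporaryDirectory() as tmp:
        os.environ["HOME"] = tmp
        if os.name == "nt":
            os.environ["USERPROFILE"] = tmp
        try:
            yield tmp
        finally:
            # Restore (or remove) whatever was set before.
            for var, value in saved.items():
                if value is None:
                    os.environ.pop(var, None)
                else:
                    os.environ[var] = value


with temporary_home() as home:
    assert os.path.expanduser("~") == home  # tools now see the temp HOME
```

The session-scoped fixture exists because class-level setup (`setUpClass`) runs before any function-scoped fixture could patch the environment.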
@ -14,6 +14,7 @@

"""Unittests for the git_command.py module."""

import io
import os
import re
import subprocess
@ -74,6 +75,10 @@ class GitCommandWaitTest(unittest.TestCase):
        class MockPopen(object):
            rc = 0

            def __init__(self):
                self.stdout = io.BufferedReader(io.BytesIO())
                self.stderr = io.BufferedReader(io.BytesIO())

            def communicate(
                self, input: str = None, timeout: float = None
            ) -> [str, str]:
@ -117,6 +122,115 @@ class GitCommandWaitTest(unittest.TestCase):
        self.assertEqual(1, r.Wait())


class GitCommandStreamLogsTest(unittest.TestCase):
    """Tests the GitCommand class stderr log streaming cases."""

    def setUp(self):
        self.mock_process = mock.MagicMock()
        self.mock_process.communicate.return_value = (None, None)
        self.mock_process.wait.return_value = 0

        self.mock_popen = mock.MagicMock()
        self.mock_popen.return_value = self.mock_process
        mock.patch("subprocess.Popen", self.mock_popen).start()

    def tearDown(self):
        mock.patch.stopall()

    def test_does_not_stream_logs_when_input_is_set(self):
        git_command.GitCommand(None, ["status"], input="foo")

        self.mock_popen.assert_called_once_with(
            ["git", "status"],
            cwd=None,
            env=mock.ANY,
            encoding="utf-8",
            errors="backslashreplace",
            stdin=subprocess.PIPE,
            stdout=None,
            stderr=None,
        )
        self.mock_process.communicate.assert_called_once_with(input="foo")
        self.mock_process.stderr.read1.assert_not_called()

    def test_does_not_stream_logs_when_stdout_is_set(self):
        git_command.GitCommand(None, ["status"], capture_stdout=True)

        self.mock_popen.assert_called_once_with(
            ["git", "status"],
            cwd=None,
            env=mock.ANY,
            encoding="utf-8",
            errors="backslashreplace",
            stdin=None,
            stdout=subprocess.PIPE,
            stderr=None,
        )
        self.mock_process.communicate.assert_called_once_with(input=None)
        self.mock_process.stderr.read1.assert_not_called()

    def test_does_not_stream_logs_when_stderr_is_set(self):
        git_command.GitCommand(None, ["status"], capture_stderr=True)

        self.mock_popen.assert_called_once_with(
            ["git", "status"],
            cwd=None,
            env=mock.ANY,
            encoding="utf-8",
            errors="backslashreplace",
            stdin=None,
            stdout=None,
            stderr=subprocess.PIPE,
        )
        self.mock_process.communicate.assert_called_once_with(input=None)
        self.mock_process.stderr.read1.assert_not_called()

    def test_does_not_stream_logs_when_merge_output_is_set(self):
        git_command.GitCommand(None, ["status"], merge_output=True)

        self.mock_popen.assert_called_once_with(
            ["git", "status"],
            cwd=None,
            env=mock.ANY,
            encoding="utf-8",
            errors="backslashreplace",
            stdin=None,
            stdout=None,
            stderr=subprocess.STDOUT,
        )
        self.mock_process.communicate.assert_called_once_with(input=None)
        self.mock_process.stderr.read1.assert_not_called()

    @mock.patch("sys.stderr")
    def test_streams_stderr_when_no_stream_is_set(self, mock_stderr):
        logs = "\n".join(
            [
                "Enumerating objects: 5, done.",
                "Counting objects: 100% (5/5), done.",
                "Writing objects: 100% (3/3), 330 bytes | 330 KiB/s, done.",
                "remote: Processing changes: refs: 1, new: 1, done ",
                "remote: SUCCESS",
            ]
        )
        self.mock_process.stderr = io.BufferedReader(
            io.BytesIO(bytes(logs, "utf-8"))
        )

        cmd = git_command.GitCommand(None, ["push"])

        self.mock_popen.assert_called_once_with(
            ["git", "push"],
            cwd=None,
            env=mock.ANY,
            stdin=None,
            stdout=None,
            stderr=subprocess.PIPE,
        )
        self.mock_process.communicate.assert_not_called()
        mock_stderr.write.assert_called_once_with(logs)
        self.assertEqual(cmd.stderr, logs)


class GitCallUnitTest(unittest.TestCase):
    """Tests the _GitCall class (via git_command.git)."""

@ -214,3 +328,22 @@ class GitRequireTests(unittest.TestCase):
        with self.assertRaises(git_command.GitRequireError) as e:
            git_command.git_require((2,), fail=True, msg="so sad")
        self.assertNotEqual(0, e.code)


class GitCommandErrorTest(unittest.TestCase):
    """Test for the GitCommandError class."""

    def test_augument_stderr(self):
        self.assertEqual(
            git_command.GitCommandError(
                git_stderr="couldn't find remote ref refs/heads/foo"
            ).suggestion,
            "Check if the provided ref exists in the remote.",
        )

        self.assertEqual(
            git_command.GitCommandError(
                git_stderr="'foobar' does not appear to be a git repository"
            ).suggestion,
            "Are you running this repo command outside of a repo workspace?",
        )

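`GitCommandStreamLogsTest` above follows a common pattern: patch `subprocess.Popen` with a `MagicMock`, run the code under test, then assert on exactly how the process was spawned. A self-contained sketch of the same pattern against a toy function (`run_git` is illustrative, not repo's API):

```python
import subprocess
from unittest import mock


def run_git(args):
    """Toy stand-in for GitCommand: spawn git and wait for it."""
    proc = subprocess.Popen(["git"] + args, stderr=subprocess.PIPE)
    proc.communicate(input=None)
    return proc.wait()


# Fake process object standing in for the Popen return value.
mock_process = mock.MagicMock()
mock_process.communicate.return_value = (None, None)
mock_process.wait.return_value = 0

with mock.patch("subprocess.Popen", return_value=mock_process) as mock_popen:
    rc = run_git(["status"])

# No real git ran; we can still assert how it would have been spawned.
mock_popen.assert_called_once_with(["git", "status"], stderr=subprocess.PIPE)
assert rc == 0
```

The tests above start the patch in `setUp` and rely on `mock.patch.stopall()` in `tearDown`; the context-manager form here undoes the patch automatically instead.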
@ -108,7 +108,9 @@ class SuperprojectTestCase(unittest.TestCase):
            self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
        else:
            self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
        self.assertRegex(log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$")
        self.assertRegex(
            log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+\+00:00$"
        )

    def readLog(self, log_path):
        """Helper function to read log data into a list."""
@ -490,7 +492,9 @@ class SuperprojectTestCase(unittest.TestCase):

        self.assertTrue(self._superproject._Fetch())
        self.assertEqual(
            mock_git_command.call_args.args,
            # TODO: Once we require Python 3.8+,
            # use 'mock_git_command.call_args.args'.
            mock_git_command.call_args[0],
            (
                None,
                [
@ -510,7 +514,9 @@ class SuperprojectTestCase(unittest.TestCase):
        # If branch for revision exists, set as --negotiation-tip.
        self.assertTrue(self._superproject._Fetch())
        self.assertEqual(
            mock_git_command.call_args.args,
            # TODO: Once we require Python 3.8+,
            # use 'mock_git_command.call_args.args'.
            mock_git_command.call_args[0],
            (
                None,
                [

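The TODO comments above exist because `call_args.args` was only added to `unittest.mock` in Python 3.8, while this change keeps repo working on 3.6, where only index access is available. On 3.8+ the two spellings agree:

```python
from unittest import mock

m = mock.MagicMock()
m(None, ["fetch", "origin"])

# Index access works on every supported Python version (3.6+).
assert m.call_args[0] == (None, ["fetch", "origin"])

# The named attribute is the 3.8+ spelling the TODO wants to return to.
assert m.call_args.args == m.call_args[0]
```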
@ -90,7 +90,9 @@ class EventLogTestCase(unittest.TestCase):
            self.assertRegex(log_entry["sid"], self.FULL_SID_REGEX)
        else:
            self.assertRegex(log_entry["sid"], self.SELF_SID_REGEX)
        self.assertRegex(log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$")
        self.assertRegex(
            log_entry["time"], r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+\+00:00$"
        )

    def readLog(self, log_path):
        """Helper function to read log data into a list."""
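Both event-log hunks relax the same assertion: the expected timestamp now ends in `+00:00` rather than `Z`. That is the suffix `datetime.isoformat()` produces for a UTC-aware datetime, which is presumably the formatting the logger switched to; a quick check:

```python
import datetime
import re

ts = datetime.datetime(
    2023, 10, 6, 12, 0, 0, 123456, tzinfo=datetime.timezone.utc
).isoformat()

# isoformat() spells UTC as an explicit offset, not as "Z".
assert ts == "2023-10-06T12:00:00.123456+00:00"
assert re.match(r"^\d+-\d+-\d+T\d+:\d+:\d+\.\d+\+00:00$", ts)
```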
@ -16,100 +16,49 @@
import unittest
from unittest import mock

from error import RepoExitError
from repo_logging import RepoLogger


class TestRepoLogger(unittest.TestCase):
    def test_error_logs_error(self):
        """Test if error fn outputs logs."""
    @mock.patch.object(RepoLogger, "error")
    def test_log_aggregated_errors_logs_aggregated_errors(self, mock_error):
        """Test if log_aggregated_errors logs a list of aggregated errors."""
        logger = RepoLogger(__name__)
        RepoLogger.errors[:] = []
        result = None

        def mock_handler(log):
            nonlocal result
            result = log.getMessage()

        mock_out = mock.MagicMock()
        mock_out.level = 0
        mock_out.handle = mock_handler
        logger.addHandler(mock_out)

        logger.error("We're no strangers to love")

        self.assertEqual(result, "We're no strangers to love")

    def test_warning_logs_error(self):
        """Test if warning fn outputs logs."""
        logger = RepoLogger(__name__)
        RepoLogger.errors[:] = []
        result = None

        def mock_handler(log):
            nonlocal result
            result = log.getMessage()

        mock_out = mock.MagicMock()
        mock_out.level = 0
        mock_out.handle = mock_handler
        logger.addHandler(mock_out)

        logger.warning("You know the rules and so do I (do I)")

        self.assertEqual(result, "You know the rules and so do I (do I)")

    def test_error_aggregates_error_msg(self):
        """Test if error fn aggregates error logs."""
        logger = RepoLogger(__name__)
        RepoLogger.errors[:] = []

        logger.error("A full commitment's what I'm thinking of")
        logger.error("You wouldn't get this from any other guy")
        logger.error("I just wanna tell you how I'm feeling")
        logger.error("Gotta make you understand")

        self.assertEqual(
            RepoLogger.errors[:],
            [
                "A full commitment's what I'm thinking of",
                "You wouldn't get this from any other guy",
                "I just wanna tell you how I'm feeling",
                "Gotta make you understand",
            ],
        logger.log_aggregated_errors(
            RepoExitError(
                aggregate_errors=[
                    Exception("foo"),
                    Exception("bar"),
                    Exception("baz"),
                    Exception("hello"),
                    Exception("world"),
                    Exception("test"),
                ]
            )
        )

    def test_log_aggregated_errors_logs_aggregated_errors(self):
        """Test if log_aggregated_errors outputs aggregated errors."""
        logger = RepoLogger(__name__)
        RepoLogger.errors[:] = []
        result = []

        def mock_handler(log):
            nonlocal result
            result.append(log.getMessage())

        mock_out = mock.MagicMock()
        mock_out.level = 0
        mock_out.handle = mock_handler
        logger.addHandler(mock_out)

        logger.error("Never gonna give you up")
        logger.error("Never gonna let you down")
        logger.error("Never gonna run around and desert you")
        logger.log_aggregated_errors()

        self.assertEqual(
            result,
        mock_error.assert_has_calls(
            [
                "Never gonna give you up",
                "Never gonna let you down",
                "Never gonna run around and desert you",
                "=" * 80,
                "Repo command failed due to following errors:",
                (
                    "Never gonna give you up\n"
                    "Never gonna let you down\n"
                    "Never gonna run around and desert you"
                mock.call("=" * 80),
                mock.call(
                    "Repo command failed due to the following `%s` errors:",
                    "RepoExitError",
                ),
            ],
                mock.call("foo\nbar\nbaz\nhello\nworld"),
                mock.call("+%d additional errors...", 1),
            ]
        )

    @mock.patch.object(RepoLogger, "error")
    def test_log_aggregated_errors_logs_single_error(self, mock_error):
        """Test if log_aggregated_errors logs empty aggregated_errors."""
        logger = RepoLogger(__name__)
        logger.log_aggregated_errors(RepoExitError())

        mock_error.assert_has_calls(
            [
                mock.call("=" * 80),
                mock.call("Repo command failed: %s", "RepoExitError"),
            ]
        )

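The rewritten tests drive `log_aggregated_errors` with a `RepoExitError` carrying six exceptions and expect a banner, the first five messages joined, and a `+1 additional errors...` line. A stdlib-only sketch of that aggregation shape (`AggregatingLogger` and the cap of 5 are illustrative, inferred from the expected calls above, not repo's actual implementation):

```python
import logging


class AggregatingLogger:
    """Illustrative stand-in: remembers every error message it emits."""

    def __init__(self, name, max_printed=5):
        self._log = logging.getLogger(name)
        self._max = max_printed
        self.errors = []

    def error(self, msg):
        self.errors.append(msg)
        self._log.error(msg)

    def summarize(self):
        """Build the summary lines a caller would log on exit."""
        lines = ["=" * 80, "Repo command failed due to the following errors:"]
        lines.append("\n".join(self.errors[: self._max]))
        extra = len(self.errors) - self._max
        if extra > 0:
            lines.append("+%d additional errors..." % extra)
        return lines


log = AggregatingLogger("demo")
for msg in ["foo", "bar", "baz", "hello", "world", "test"]:
    log.error(msg)

assert log.summarize()[2] == "foo\nbar\nbaz\nhello\nworld"
assert log.summarize()[-1] == "+1 additional errors..."
```

Patching `RepoLogger.error` with `mock.patch.object`, as the new tests do, asserts on this replay without depending on handler wiring.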
@ -117,8 +117,12 @@ class LocalSyncState(unittest.TestCase):

    def setUp(self):
        """Common setup."""
        self.repodir = tempfile.mkdtemp(".repo")
        self.topdir = tempfile.mkdtemp("LocalSyncState")
        self.repodir = os.path.join(self.topdir, ".repo")
        os.makedirs(self.repodir)

        self.manifest = mock.MagicMock(
            topdir=self.topdir,
            repodir=self.repodir,
            repoProject=mock.MagicMock(relpath=".repo/repo"),
        )
@ -126,7 +130,7 @@ class LocalSyncState(unittest.TestCase):

    def tearDown(self):
        """Common teardown."""
        shutil.rmtree(self.repodir)
        shutil.rmtree(self.topdir)

    def _new_state(self, time=_TIME):
        with mock.patch("time.time", return_value=time):

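The new setUp nests `.repo` inside a single temporary workspace instead of creating two unrelated temp dirs, so paths like `.repo/repo` resolve under `topdir` and one `rmtree` in tearDown cleans everything. A stdlib-only sketch of that layout:

```python
import os
import shutil
import tempfile

# One workspace root, with the metadata dir nested inside it.
topdir = tempfile.mkdtemp("LocalSyncState")
repodir = os.path.join(topdir, ".repo")
os.makedirs(repodir)

# A project at relpath ".repo/repo" now lives inside the same tree.
repo_project = os.path.join(topdir, ".repo", "repo")
assert os.path.commonpath([topdir, repo_project]) == topdir

# tearDown only needs the single root now.
shutil.rmtree(topdir)
assert not os.path.exists(repodir)
```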
4 tox.ini
@ -15,7 +15,8 @@
# https://tox.readthedocs.io/

[tox]
envlist = lint, py36, py37, py38, py39, py310, py311
envlist = lint, py36, py37, py38, py39, py310, py311, py312
requires = virtualenv<20.22.0

[gh-actions]
python =
@ -25,6 +26,7 @@ python =
    3.9: py39
    3.10: py310
    3.11: py311
    3.12: py312

[testenv]
deps =