Compare commits


52 Commits

SHA1 Message Date
4e16c24981 Revert "Add --prune option to fetch when syncing a mirror repo"
For some users it is not desirable to remove refs that don't exist
on the remote server when syncing a mirror repo.

This reverts commit b4d43b9f66.

Change-Id: Ie849b66682138ef88da6cd1a5fbb27e993197dd7
2015-07-20 22:31:04 +09:00
b3d6e67196 Merge "Fail if gitdir does not point to objdir during sync" 2015-07-15 19:30:41 +00:00
503d66d8af Merge "project.RemoteFetch: Handle depth cases more robustly" 2015-07-15 19:29:14 +00:00
679bac4bf3 project.RemoteFetch: Handle depth cases more robustly
The fetch logic for the case where depth is set and revision is a
SHA1 has several failure modes that are not handled well by the
current logic.

1) 'git fetch <SHA1>' requires git version >= 1.8.3
2) 'git fetch <SHA1>' can be prevented by a configuration option on the server.
3) 'git fetch --depth=<N> <refspec>' can fail to contain a SHA1 specified by
   the manifest.

Each of these cases causes infinite recursion when _RemoteFetch() tries to call
itself with current_branch_only=False, because current_branch_only is set to
True when depth != None.

To try to prevent the infinite recursion, we set self.clone_depth to None
before the first retry of _RemoteFetch(). This will allow the Fetch to
eventually succeed in the case where clone-depth is specified in the manifest.
A user-specified depth from the init command will still recurse infinitely.

In addition, never try to fetch a SHA1 directly if the git version being used
is not at least 1.8.3.

Change-Id: I802fc17878c0929cfd63fff611633c1d3b54ecd3
2015-07-15 15:53:14 +00:00
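
A minimal, self-contained sketch of the two guards described above, with
illustrative names only (this is not the actual repo code): never ask for a
bare SHA1 on git older than 1.8.3, and clear the depth before the retry so
the fallback fetch cannot recurse.

  import re

  ID_RE = re.compile(r'^[0-9a-f]{40}$')

  def plan_first_fetch(revision, depth, git_version):
      """Pick the refspec and depth for the first fetch attempt."""
      if ID_RE.match(revision) and depth and git_version >= (1, 8, 3):
          # Fetch the commit itself; it may be deeper than the branch heads.
          return revision, depth
      return '+refs/heads/*:refs/remotes/origin/*', depth

  def fetch_with_retry(do_fetch, revision, depth, git_version):
      """do_fetch(refspec, depth) -> bool.  Retry once, with depth cleared."""
      refspec, d = plan_first_fetch(revision, depth, git_version)
      if do_fetch(refspec, d):
          return True
      # Dropping the depth here is what keeps the retry from looping forever.
      return do_fetch('+refs/heads/*:refs/remotes/origin/*', None)
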
97836cf09f Merge "Always output upstream if specified" 2015-07-13 16:36:28 +00:00
80e3a37ab5 Merge changes Iaefcbe14,I697a0f64,I19bfe9fe,I06e942c4
* changes:
  forall: use smart sync override manifest if it exists
  sync: Remove smart sync override manifest when not in smart sync mode
  forall: Don't try to get lrev of projects in mirror workspace
  sync: Improve error message when writing smart sync manifest fails
2015-07-11 14:01:16 +00:00
bb4a1b5274 Merge "Improve error message when syncing a project with invalid groups." 2015-07-10 22:00:47 +00:00
551dfecea9 Always output upstream if specified
Previously, when running the `manifest` command, we wouldn't output the
upstream if the default upstream would include the pinned sha1.
However, now that fetching refs/heads/* doesn't guarantee that we will
have the sha1, we need to always output the specified upstream branch.

Change-Id: Ib8b409a8ecd439397b38ee9649c530407797f841
2015-07-10 14:59:10 -07:00
6944cdb8d1 forall: use smart sync override manifest if it exists
If a workspace is synced with the -s or -t option, the included projects
may be different from those in the original manifest. However, the forall
command uses the list of projects from the original manifest.

If the smart sync manifest file exists, use it to override the original
manifest.

Change-Id: Iaefcbe148d2158ac046f158d98bbd8b5a5378ce7
2015-07-06 16:18:06 +09:00
59b417493e sync: Remove smart sync override manifest when not in smart sync mode
When syncing with the -s or -t option, a smart_sync_override.xml file
is created. This file is left in the file system when syncing again
without the -s or -t option.

Remove the smart sync override manifest, if it exists, when not using
the -s or -t option.

Change-Id: I697a0f6405205ba5f84a4d470becf7cd23c07b4b
2015-07-06 16:18:06 +09:00
30d13eea86 forall: Don't try to get lrev of projects in mirror workspace
git rev-parse fails for projects that don't have an explicit revision
specified, and don't have a branch of the same name as the default
revision. This can be the case in a workspace synced with the smart
sync (-s) or smart tag (-t) option.

Change-Id: I19bfe9fe7396170379415d85f10f6440dc6ea08f
2015-07-06 16:18:06 +09:00
727cc3e324 sync: Improve error message when writing smart sync manifest fails
The error message only states that writing the manifest failed.

Include the exception message, so it's easier to track down the reason
that the write failed.

Change-Id: I06e942c48a19521ba45292199519dd0a8bdb1de7
2015-07-06 16:18:06 +09:00
c5ceeb1625 Merge "Fix 'repo cherry-pick' to avoid hanging on commit-msg update." 2015-06-25 14:53:46 +00:00
db75704bfc Fix 'repo cherry-pick' to avoid hanging on commit-msg update.
After performing the actual cherry-pick operation, the code
in cherry_pick.py opens a pipe to 'git commit -F' to rewrite the commit
message, emits the fixed-up commit msg to the pipe, then waits
for 'git commit' to complete. The child 'git' process winds up
hanging while reading from the pipe, however, since the parent
process still has it open. To fix the hang, change the parent process
to close its end of the pipe after it has emitted the message.

Change-Id: I5929371e69a5b076f09009d00d40a2c72ac8ac33
2015-06-22 08:00:20 -04:00
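
A minimal, standalone example of the fixed pattern (not the cherry_pick.py
code itself): write the message, close the pipe so the child sees EOF, then
wait.

  import subprocess

  def rewrite_commit_message(new_msg):
      # 'git commit --amend -F -' reads the commit message from stdin.
      p = subprocess.Popen(['git', 'commit', '--amend', '-F', '-'],
                           stdin=subprocess.PIPE)
      p.stdin.write(new_msg.encode())
      p.stdin.close()   # without this, git waits for EOF and both sides hang
      return p.wait()
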
87ea5913f2 Improve error message when syncing a project with invalid groups.
Change-Id: Iaf5c2a0f00667dc09bcf455cfe2f39bfbaa2bfc0
2015-06-19 15:55:15 -07:00
185307d1dd Merge "Teach _LinkFile._Link to handle globs." 2015-06-09 00:14:13 +00:00
c116f94261 forall: setenv, only encode val if encode exists
Change-Id: I655e3043d0118c4e929897d3a51e5e013e5758dc
2015-06-04 00:34:19 +00:00
7993f3cdda init: don't call urllib.parse
it's actually urllib.parse.urlparse

Change-Id: Ie3532e54625e887c8682d92b932ea21a629e8d60
2015-06-04 00:33:33 +00:00
b1d1fd778d git_config: fix _SaveJson typo
Change-Id: I35ca2b3733e6d1508669f9a6690c6645c582912e
2015-06-04 00:22:23 +00:00
be4456cf24 error: fix typos
Change-Id: I09c47024ef54c360ea3c15c5d4f169e13444e412
2015-06-04 00:21:16 +00:00
cf738ed4a1 git_command: only decode when needed
strings no longer need decoding, since unicode is str

Change-Id: I9516d298fee7ddc058452394b7759327fe3aa7a8
2015-06-03 16:50:39 +01:00
6cfc68e1e6 decode the buffer before appending
output from a process is in bytes in python3. we need
to decode it.

in Python3, bytes don't have an encode attribute. use this
to identify it.

Change-Id: I152f2ec34614131027db680ead98b53f9b321ed5
2015-06-03 16:39:32 +01:00
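
In practice the check looks like this small helper (illustrative only), which
works under both Python 2 and Python 3:

  def to_text(buf):
      # Pipes yield bytes under Python 3; bytes objects have no .encode
      # attribute, so its absence tells us decoding is still needed.
      if not hasattr(buf, 'encode'):
          buf = buf.decode()
      return buf
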
4c426ef1d4 Teach _LinkFile._Link to handle globs.
This allows a project to use globs in the linkfile src attribute. When
a glob is used in the src, the dest field must be a directory. _LinkFile._Link()
will then create symbolic links in the dest directory to all of the entries
in the src, as defined by the glob specification.

In the example below, all of the entries in master-configs/ will have
symbolic links in the <root dir>/configs directory:

  <project name="helloworld.git" path="apps/helloworld">
      <linkfile src="master-configs/*" dest="configs"/>
  </project>

Change-Id: Idfed8fa47c83d2ca6e2b8e867731b8e2f9e2eb47
2015-06-03 08:05:17 -07:00
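
As a rough sketch of what the glob handling amounts to (illustrative, not the
actual _LinkFile implementation), each match under the project worktree gets
a link of the same name inside dest:

  import glob
  import os

  def link_glob(worktree, src_pattern, dest_dir):
      for src in glob.glob(os.path.join(worktree, src_pattern)):
          dest = os.path.join(dest_dir, os.path.basename(src))
          if not os.path.lexists(dest):
              os.symlink(src, dest)
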
472ce9f5fa Merge changes I32da12c2,Ie4a65b3e
* changes:
  Skip sleep and retry if git remote update exits with a signal
  Catch exceptions in project list generator
2015-06-02 00:14:43 +00:00
0184dcc510 Make linkfile symlinks relative
The source (target) of the symlink is specified relative to a project
within a tree, and the destination is specified relative to the top
of the tree, so it should always be possible to create a relative symlink
to the target file.  Relative symlinks will allow moving an entire tree
without breaking the symlink, and copying a tree (with -p) without leaving
a symlink to the old tree.

Change-Id: I16492a8b59a137d2abe43ca78e3b212e2c835599
2015-06-01 01:24:38 +00:00
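
A tiny illustration of the idea (not the repo code): compute the target
relative to the directory holding the link before creating it.

  import os

  def make_relative_symlink(target, link_path):
      # '/tree/apps/helloworld/cfg' linked from '/tree/configs/cfg' becomes
      # '../apps/helloworld/cfg', which survives moving or copying the tree.
      rel_target = os.path.relpath(target, os.path.dirname(link_path))
      os.symlink(rel_target, link_path)
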
c4b301f988 Skip sleep and retry if git remote update exits with a signal
Pressing ctrl-c during repo sync often hangs for 30 to 45 seconds
due to the time.sleep and retry in _RemoteFetch.  If git exits with
a signal, for example -2 for SIGINT triggered by ctrl-c, skip the
sleep and retry.

Change-Id: I32da12c2dcc96d9cc0b12a066e824b12ebfb52a0
2015-05-13 18:11:34 +00:00
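
A small illustration of the returncode check this relies on, using only the
standard library (not the repo code itself): subprocess reports
death-by-signal as a negative return code.

  import subprocess
  import time

  def fetch_once(cmd, retries=1):
      for attempt in range(retries + 1):
          ret = subprocess.call(cmd)
          if ret == 0:
              return True
          if ret < 0:
              # git was killed by a signal (e.g. -2 for SIGINT from ctrl-c);
              # don't sleep and retry, fail immediately.
              return False
          if attempt < retries:
              time.sleep(1)   # the real code sleeps 30-45 seconds
      return False
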
31a7be561e Catch exceptions in project list generator
If the generator that produces per-project worker arguments raises an
exception it triggers python bug http://bugs.python.org/issue8296.
Rewrite the generator expression as a generator function, and catch
Exceptions and KeyboardInterrupts to end the iteration.

Also add a pool worker initializer to disable SIGINT to prevent
KeyboardInterrupts inside multiprocessing.Pool in the worker threads
causing the same problem.

Fixes easy-to-reproduce hangs when hitting ctrl-c during
repo forall -c echo

Change-Id: Ie4a65b3e1e07a64ed6bb6ff20f3912c4326718ca
2015-05-13 11:09:38 -07:00
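
A compact sketch of the two measures described above (a generator function
plus a SIGINT-ignoring pool initializer); the names are illustrative, not
taken from the repo source.

  import multiprocessing
  import signal

  def init_worker():
      # Workers ignore SIGINT so ctrl-c is handled once, in the parent.
      signal.signal(signal.SIGINT, signal.SIG_IGN)

  def project_args(projects):
      # A generator *function*: exceptions raised while producing arguments
      # simply end the iteration instead of triggering the imap hang.
      for cnt, p in enumerate(projects):
          try:
              yield (cnt, p.upper())
          except (Exception, KeyboardInterrupt):
              return

  if __name__ == '__main__':
      pool = multiprocessing.Pool(2, init_worker)
      for _ in pool.imap(str, project_args(['a', 'b', 'c'])):
          pass
      pool.close()
      pool.join()
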
384b3c5948 Fail if gitdir does not point to objdir during sync
There are a set of cases that can cause the git directory in
.repo/projects to point to a directory in .repo/project-objects that
is not the one specified in the manifest. This results in a tree that
is not sane, and so should cause a failure.

In order to reproduce the failure case:
1) Sync to any manifest
2) Change the 'name' of a project to a different repository. Leave the
   'path' the same.
3) Resync the modified project. The project-objects directory will not
   be created, and the projects directory will remain pointed at the old
   project-objects.

Change-Id: Ie6711b1c773508850c5c9f748a27ff72d65e2bf2
2015-05-12 09:15:53 -06:00
35de228f33 Merge "Don't attempt to create "fully qualified names" for SHA1s" 2015-05-11 09:20:54 +00:00
ace097c36e Merge "Add option on sync to avoid fetching from remotes for existing sha1" 2015-05-01 07:51:52 +00:00
b155354034 Add option on sync to avoid fetching from remotes for existing sha1
In 2fb6466f79 an optimisation was
added to avoid fetching from remotes if the project is fixed to
a revision and the revision is already available locally.

This causes problems for users who expect all objects to be
fetched by default.

Change the logic so that the optimized behaviour is only enabled if
an option is explicitly given to repo sync.

Change-Id: I3b2794ddd8e0071b1787e166463cd8347ca9e24f
2015-04-30 14:29:02 +00:00
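
Conceptually the new option boils down to a local-object check like the
following sketch (hypothetical helper names; the actual check, _CheckForSha1,
appears in the diff below):

  import subprocess

  def have_commit(gitdir, sha1):
      # 'git cat-file -e <sha1>^{commit}' exits 0 iff the commit exists locally.
      return subprocess.call(['git', '--git-dir', gitdir,
                              'cat-file', '-e', sha1 + '^{commit}']) == 0

  def need_fetch(optimized_fetch, revision_is_sha1, gitdir, sha1):
      # Skip the network only when the user passed --optimized-fetch.
      return not (optimized_fetch and revision_is_sha1 and have_commit(gitdir, sha1))
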
382582728e Don't attempt to create "fully qualified names" for SHA1s
Doing so breaks "repo init -b <SHA1>".

Change-Id: Ic071a1b099a9125db22ea446d7e92e7854d69b37
2015-04-30 14:54:47 +02:00
b4d43b9f66 Add --prune option to fetch when syncing a mirror repo
When syncing a mirror repo, add the --prune option to the fetch
command to force removal of stale refs from the mirror.

Change-Id: I4b43b2a5c86b9915627887c16f6569066f3ab978
2015-04-30 10:32:37 +09:00
4ccad7554b Fix substitution err for schemeless manifest urls
Previously, we used a regex that would only remove a phony string from
a url if it existed, but we recently replaced that with a slice.  This
change goes back to the previous behavior.

Change-Id: I8baf527be01c4b49d45b903b31a1cd6315563d5b
2015-04-29 10:45:37 -07:00
403b64edf4 Don't append branch to fetch spec when syncing to a mirror
Appending the branch to the fetch spec causes sync of a mirror to
fail for projects that don't have an explicit revision specified,
and don't have a branch of the same name as the default revision.

For example, a manifest defining a default revision:

 <default revision="master">

having a project without an explicit revision:

 <project name="path/to/project">

and not having a branch named "master", will cause repo sync to
fail for that project with the error:

 Couldn't find remote ref refs/heads/master

Modify the logic to not append the branch onto the fetch spec when
syncing to a mirror.

Change-Id: I5c4457bd125519abf27abe682dea62ad708978c9
2015-04-27 10:56:27 +09:00
a38769cda8 Merge "forall: use a generator to map the Pool" 2015-04-08 17:59:58 +00:00
44859d0267 Merge "status: lose dependence on StringIO" 2015-04-08 17:58:35 +00:00
6ad6dbefe7 forall: use a generator to map the Pool
Before, a list was generated, which is why there was a massive delay.

Using a generator allows processes to start straight away.

Change-Id: Ia325b0b340cc328c08c9bcc92a6709bbdaf6a664
2015-04-08 13:22:34 +01:00
33fe4e99f9 Remove deprecated include-ids setting from pylint config
Change-Id: Ie5ab21e434d24ff862bb5e0c263761370d71f56f
2015-04-07 11:10:17 +09:00
4214585073 Merge "Pylint and PEP8 fixes for color.py" 2015-04-07 02:06:49 +00:00
b51f07cd06 status: lose dependence on StringIO
buflist was being used, which isn't available in Python 3.

`Execute` was using StringIO to capture the output of `PrintWorkTreeStatus`,
only to redirect it straight to stdout.
Instead, just let `PrintWorkTreeStatus` write its own output directly to stdout.

For handling `_FindOrphans`, we swap StringIO for a list. Nothing was done
that needed a file-like object.

Change-Id: Ibdaae137904de66a5ffb590d84203ef0fe782d8b
2015-04-04 21:21:49 +01:00
04f2f0e186 Maintain fully qualified tracking branches
When running repo branch, the git merge line (in many circumstances)
is set to the revision of the project specified in the manifest.  If
this is a branch name that is not fully-qualified, we will end up with
something like "merge = master" instead of "merge = refs/heads/master".
This change examines the revision when it is going to be used, and expands
branch short names into fully qualified branch names.

Change-Id: Ie1be94fb8d45df8eeac44a47f729a3819a05fa81
2015-04-01 17:43:36 +00:00
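
The qualification step itself is small; a rough sketch, mirroring the change
shown further down in the diff:

  R_HEADS = 'refs/heads/'

  def fully_qualify(merge_ref):
      # 'master' -> 'refs/heads/master'; already-qualified refs are untouched.
      if merge_ref and not merge_ref.startswith('refs/'):
          return R_HEADS + merge_ref
      return merge_ref
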
cb07ba7e3d Resolve fetch urls more efficiently
Instead of using regexes, add the custom schemes to urllib's netloc and
relative-scheme lists.
The schemes are only appended when needed, instead of running a series
of regex replacements on every URL.

see http://bugs.python.org/issue18828 for more details.

Change-Id: I10d26d5ddc32e7ed04c5a412bdd6e13ec59eb70f
2015-03-31 20:12:44 +00:00
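
For example, with Python 3's urllib.parse (the repo code goes through a small
compatibility shim), registering the schemes once is enough for urljoin to
handle them:

  import urllib.parse

  for scheme in ('ssh', 'git', 'persistent-https', 'rpc'):
      if scheme not in urllib.parse.uses_relative:
          urllib.parse.uses_relative.append(scheme)
      if scheme not in urllib.parse.uses_netloc:
          urllib.parse.uses_netloc.append(scheme)

  print(urllib.parse.urljoin('persistent-https://example.com/base/', 'project.git'))
  # -> persistent-https://example.com/base/project.git
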
23ff7df6a7 use the max depth instead of unshallow
This allows the use of older versions of git

Change-Id: I88ea685066603af19896a791829355ddbfa91ffe
2015-03-30 21:54:26 +00:00
cc1b1a703d Revert "Change the min git version from 1.7.2 to 1.8.2"
This reverts commit 52b99aa91d.

Change-Id: I01d93704c92f7af1ca2b36dbc9509ee1290e2d3c
2015-03-30 21:53:25 +00:00
bdf7ed2301 Pylint and PEP8 fixes for color.py
Change-Id: I1a676e25957a7b5dd800d2585a2ec7fe75295668
2015-03-28 21:12:27 +00:00
9c76f67f13 Always capture output for GitCommand
Switch the GitCommand class to always capture the output of stdout
and stderr, and by default print the output while running.

The options capture_stdout and capture_stderr have effectively become
options to suppress the printing of stdout and stderr.

Update 'git fetch' to use '--progress' so that the progress messages
will be displayed: git checks whether the output location isatty(), and if
it is not a TTY it will not print progress messages by default.

Change-Id: Ifdae138e008f80a59195f9f43c911a1a5210ec60
2015-03-26 11:43:55 -07:00
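
The --progress handling amounts to a small command-line tweak, roughly as
follows (paraphrasing the GitCommand change shown in the diff below):

  import sys

  def add_progress_flag(cmdv):
      # git only prints progress when stderr is a TTY; since output is now
      # captured through pipes, request it explicitly for fetch/clone.
      if sys.stderr.isatty() and cmdv and cmdv[0] in ('fetch', 'clone'):
          if '--progress' not in cmdv and '--quiet' not in cmdv:
              return [cmdv[0], '--progress'] + list(cmdv[1:])
      return list(cmdv)
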
52b99aa91d Change the min git version from 1.7.2 to 1.8.2
This is needed for the --unshallow option of git fetch.

Change-Id: Ifdc5cec6130315c643924328fea425f1b94cb04a
2015-03-18 21:43:39 +00:00
9371979628 Revert "Implementation of manifest defined githooks"
This reverts commit 38e4387f8e.

A "repo init" followed by "repo sync" is meant to be as safe as
"git clone".  In particular it should not run arbitrary code provided
by the manifest owner.

It would still be nice to have support for manifest-defined git hooks
--- they'd just need a prompt like the upload RepoHook has.  Hopefully
a later change can bring them back.

Change-Id: I5ecd90fb5c2ed64f103d856d1ffcba38a47b062d
Signed-off-by: Jonathan Nieder <jrn@google.com>
2015-03-17 11:29:58 -07:00
2086004261 Merge "Don't exit with error on HTTP 401 when downloading clone bundle" 2015-03-11 17:25:45 +00:00
2338788050 Don't exit with error on HTTP 401 when downloading clone bundle
If the server returns HTTP 401 (unauthorized) when attempting to
download clone bundle files, ignore it and continue, rather than
exiting with a fatal error.

Change-Id: I2c7ee03e149c354c7e4ad6ea1ebf266534778fe1
2015-03-11 07:43:40 +00:00
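
A standalone sketch of the resulting behaviour (the actual change to
_DownloadBundle appears in the diff below): treat 401 like the other
"no bundle available" responses and fall back to a normal fetch.

  import urllib.request
  import urllib.error

  def download_bundle(url):
      try:
          return urllib.request.urlopen(url).read()
      except urllib.error.HTTPError as e:
          if e.code in (401, 403, 404):
              return None   # no usable clone bundle; continue with plain git fetch
          raise
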
0402cd882a Add space between project path and branch in repo status.
Currently, paths longer than 39 chars have no space after them so it looks
like this:

project path/branch master

Change-Id: I4c1bb13648ac099ade8a8d4ebafa04131571f842
2015-03-11 07:42:17 +00:00
15 changed files with 356 additions and 285 deletions


@ -61,9 +61,6 @@ disable=R0903,R0912,R0913,R0914,R0915,W0141,C0111,C0103,W0603,W0703,R0911,C0301,
# (visual studio) and html
output-format=text
# Include message's id in output
include-ids=yes
# Put messages in a separate file for each module / package specified on the
# command line instead of printing them on stdout. Reports (if any) will be
# written in a file name "pylint_global.[txt|html]".


@ -18,41 +18,43 @@ import sys
import pager
COLORS = {None :-1,
'normal' :-1,
'black' : 0,
'red' : 1,
'green' : 2,
'yellow' : 3,
'blue' : 4,
COLORS = {None: -1,
'normal': -1,
'black': 0,
'red': 1,
'green': 2,
'yellow': 3,
'blue': 4,
'magenta': 5,
'cyan' : 6,
'white' : 7}
'cyan': 6,
'white': 7}
ATTRS = {None :-1,
'bold' : 1,
'dim' : 2,
'ul' : 4,
'blink' : 5,
ATTRS = {None: -1,
'bold': 1,
'dim': 2,
'ul': 4,
'blink': 5,
'reverse': 7}
RESET = "\033[m" # pylint: disable=W1401
# backslash is not anomalous
RESET = "\033[m"
def is_color(s):
return s in COLORS
def is_attr(s):
return s in ATTRS
def _Color(fg = None, bg = None, attr = None):
def _Color(fg=None, bg=None, attr=None):
fg = COLORS[fg]
bg = COLORS[bg]
attr = ATTRS[attr]
if attr >= 0 or fg >= 0 or bg >= 0:
need_sep = False
code = "\033[" #pylint: disable=W1401
code = "\033["
if attr >= 0:
code += chr(ord('0') + attr)
@ -71,7 +73,6 @@ def _Color(fg = None, bg = None, attr = None):
if bg >= 0:
if need_sep:
code += ';'
need_sep = True
if bg < 8:
code += '4%c' % (ord('0') + bg)
@ -82,9 +83,9 @@ def _Color(fg = None, bg = None, attr = None):
code = ''
return code
DEFAULT = None
def SetDefaultColoring(state):
"""Set coloring behavior to |state|.
@ -145,6 +146,7 @@ class Coloring(object):
def printer(self, opt=None, fg=None, bg=None, attr=None):
s = self
c = self.colorer(opt, fg, bg, attr)
def f(fmt, *args):
s._out.write(c(fmt, *args))
return f
@ -152,6 +154,7 @@ class Coloring(object):
def nofmt_printer(self, opt=None, fg=None, bg=None, attr=None):
s = self
c = self.nofmt_colorer(opt, fg, bg, attr)
def f(fmt):
s._out.write(c(fmt))
return f
@ -159,11 +162,13 @@ class Coloring(object):
def colorer(self, opt=None, fg=None, bg=None, attr=None):
if self._on:
c = self._parse(opt, fg, bg, attr)
def f(fmt, *args):
output = fmt % args
return ''.join([c, output, RESET])
return f
else:
def f(fmt, *args):
return fmt % args
return f
@ -171,6 +176,7 @@ class Coloring(object):
def nofmt_colorer(self, opt=None, fg=None, bg=None, attr=None):
if self._on:
c = self._parse(opt, fg, bg, attr)
def f(fmt):
return ''.join([c, fmt, RESET])
return f


@ -31,7 +31,7 @@ following DTD:
<!ELEMENT notice (#PCDATA)>
<!ELEMENT remote (projecthook?)>
<!ELEMENT remote (EMPTY)>
<!ATTLIST remote name ID #REQUIRED>
<!ATTLIST remote alias CDATA #IMPLIED>
<!ATTLIST remote fetch CDATA #REQUIRED>
@ -73,10 +73,6 @@ following DTD:
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ELEMENT projecthook (EMPTY)>
<!ATTLIST projecthook name CDATA #REQUIRED>
<!ATTLIST projecthook revision CDATA #REQUIRED>
<!ELEMENT remove-project (EMPTY)>
<!ATTLIST remove-project name CDATA #REQUIRED>
@ -310,15 +306,6 @@ target manifest to include - it must be a usable manifest on its own.
Attribute `name`: the manifest to include, specified relative to
the manifest repository's root.
Element projecthook
-------------------
This element is used to define a per-remote hook git that is
fetched and applied to all projects using the remote. The project-
hook functionality allows for company/team .git/hooks to be used.
The hooks in the supplied project and revision are supplemented to
the current repo stock hooks for each project. Supplemented hooks
overrule any stock hooks.
Local Manifests
===============


@ -80,7 +80,7 @@ class NoSuchProjectError(Exception):
self.name = name
def __str__(self):
if self.Name is None:
if self.name is None:
return 'in current directory'
return self.name
@ -93,7 +93,7 @@ class InvalidProjectGroupsError(Exception):
self.name = name
def __str__(self):
if self.Name is None:
if self.name is None:
return 'in current directory'
return self.name


@ -14,7 +14,9 @@
# limitations under the License.
from __future__ import print_function
import fcntl
import os
import select
import sys
import subprocess
import tempfile
@ -76,11 +78,24 @@ def terminate_ssh_clients():
_git_version = None
class _sfd(object):
"""select file descriptor class"""
def __init__(self, fd, dest, std_name):
assert std_name in ('stdout', 'stderr')
self.fd = fd
self.dest = dest
self.std_name = std_name
def fileno(self):
return self.fd.fileno()
class _GitCall(object):
def version(self):
p = GitCommand(None, ['--version'], capture_stdout=True)
if p.Wait() == 0:
return p.stdout.decode('utf-8')
if hasattr(p.stdout, 'decode'):
return p.stdout.decode('utf-8')
else:
return p.stdout
return None
def version_tuple(self):
@ -139,6 +154,9 @@ class GitCommand(object):
if key in env:
del env[key]
# If we are not capturing std* then need to print it.
self.tee = {'stdout': not capture_stdout, 'stderr': not capture_stderr}
if disable_editor:
_setenv(env, 'GIT_EDITOR', ':')
if ssh_proxy:
@ -162,22 +180,21 @@ class GitCommand(object):
if gitdir:
_setenv(env, GIT_DIR, gitdir)
cwd = None
command.extend(cmdv)
command.append(cmdv[0])
# Need to use the --progress flag for fetch/clone so output will be
# displayed as by default git only does progress output if stderr is a TTY.
if sys.stderr.isatty() and cmdv[0] in ('fetch', 'clone'):
if '--progress' not in cmdv and '--quiet' not in cmdv:
command.append('--progress')
command.extend(cmdv[1:])
if provide_stdin:
stdin = subprocess.PIPE
else:
stdin = None
if capture_stdout:
stdout = subprocess.PIPE
else:
stdout = None
if capture_stderr:
stderr = subprocess.PIPE
else:
stderr = None
stdout = subprocess.PIPE
stderr = subprocess.PIPE
if IsTrace():
global LAST_CWD
@ -226,8 +243,36 @@ class GitCommand(object):
def Wait(self):
try:
p = self.process
(self.stdout, self.stderr) = p.communicate()
rc = p.returncode
rc = self._CaptureOutput()
finally:
_remove_ssh_client(p)
return rc
def _CaptureOutput(self):
p = self.process
s_in = [_sfd(p.stdout, sys.stdout, 'stdout'),
_sfd(p.stderr, sys.stderr, 'stderr')]
self.stdout = ''
self.stderr = ''
for s in s_in:
flags = fcntl.fcntl(s.fd, fcntl.F_GETFL)
fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
while s_in:
in_ready, _, _ = select.select(s_in, [], [])
for s in in_ready:
buf = s.fd.read(4096)
if not buf:
s_in.remove(s)
continue
if not hasattr(buf, 'encode'):
buf = buf.decode()
if s.std_name == 'stdout':
self.stdout += buf
else:
self.stderr += buf
if self.tee[s.std_name]:
s.dest.write(buf)
s.dest.flush()
return p.wait()


@ -280,7 +280,7 @@ class GitConfig(object):
finally:
fd.close()
except (IOError, TypeError):
if os.path.exists(self.json):
if os.path.exists(self._json):
os.remove(self._json)
def _ReadGit(self):


@ -45,6 +45,7 @@ from command import MirrorSafeCommand
from subcmds.version import Version
from editor import Editor
from error import DownloadError
from error import InvalidProjectGroupsError
from error import ManifestInvalidRevisionError
from error import ManifestParseError
from error import NoManifestException
@ -173,6 +174,12 @@ class _Repo(object):
else:
print('error: no project in current directory', file=sys.stderr)
result = 1
except InvalidProjectGroupsError as e:
if e.name:
print('error: project group must be enabled for project %s' % e.name, file=sys.stderr)
else:
print('error: project group must be enabled for the project in the current directory', file=sys.stderr)
result = 1
finally:
elapsed = time.time() - start
hours, remainder = divmod(elapsed, 3600)


@ -38,8 +38,9 @@ MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml'
LOCAL_MANIFESTS_DIR_NAME = 'local_manifests'
urllib.parse.uses_relative.extend(['ssh', 'git'])
urllib.parse.uses_netloc.extend(['ssh', 'git'])
# urljoin gets confused if the scheme is not known.
urllib.parse.uses_relative.extend(['ssh', 'git', 'persistent-https', 'rpc'])
urllib.parse.uses_netloc.extend(['ssh', 'git', 'persistent-https', 'rpc'])
class _Default(object):
"""Project defaults within the manifest."""
@ -64,9 +65,7 @@ class _XmlRemote(object):
fetch=None,
manifestUrl=None,
review=None,
revision=None,
projecthookName=None,
projecthookRevision=None):
revision=None):
self.name = name
self.fetchUrl = fetch
self.manifestUrl = manifestUrl
@ -74,8 +73,6 @@ class _XmlRemote(object):
self.reviewUrl = review
self.revision = revision
self.resolvedFetchUrl = self._resolveFetchUrl()
self.projecthookName = projecthookName
self.projecthookRevision = projecthookRevision
def __eq__(self, other):
return self.__dict__ == other.__dict__
@ -89,21 +86,14 @@ class _XmlRemote(object):
# urljoin will gets confused over quite a few things. The ones we care
# about here are:
# * no scheme in the base url, like <hostname:port>
# * persistent-https://
# * rpc://
# We handle this by replacing these with obscure protocols
# and then replacing them with the original when we are done.
# gopher -> <none>
# wais -> persistent-https
# nntp -> rpc
# We handle no scheme by replacing it with an obscure protocol, gopher
# and then replacing it with the original when we are done.
if manifestUrl.find(':') != manifestUrl.find('/') - 1:
manifestUrl = 'gopher://' + manifestUrl
manifestUrl = re.sub(r'^persistent-https://', 'wais://', manifestUrl)
manifestUrl = re.sub(r'^rpc://', 'nntp://', manifestUrl)
url = urllib.parse.urljoin(manifestUrl, url)
url = re.sub(r'^gopher://', '', url)
url = re.sub(r'^wais://', 'persistent-https://', url)
url = re.sub(r'^nntp://', 'rpc://', url)
url = urllib.parse.urljoin('gopher://' + manifestUrl, url)
url = re.sub(r'^gopher://', '', url)
else:
url = urllib.parse.urljoin(manifestUrl, url)
return url
def ToRemoteSpec(self, projectName):
@ -171,11 +161,6 @@ class XmlManifest(object):
e.setAttribute('review', r.reviewUrl)
if r.revision is not None:
e.setAttribute('revision', r.revision)
if r.projecthookName is not None:
ph = doc.createElement('projecthook')
ph.setAttribute('name', r.projecthookName)
ph.setAttribute('revision', r.projecthookRevision)
e.appendChild(ph)
def _ParseGroups(self, groups):
return [x for x in re.split(r'[,\s]+', groups) if x]
@ -268,11 +253,13 @@ class XmlManifest(object):
else:
value = p.work_git.rev_parse(HEAD + '^0')
e.setAttribute('revision', value)
if peg_rev_upstream and value != p.revisionExpr:
# Only save the origin if the origin is not a sha1, and the default
# isn't our value, and the if the default doesn't already have that
# covered.
e.setAttribute('upstream', p.revisionExpr)
if peg_rev_upstream:
if p.upstream:
e.setAttribute('upstream', p.upstream)
elif value != p.revisionExpr:
# Only save the origin if the origin is not a sha1, and the default
# isn't our value
e.setAttribute('upstream', p.revisionExpr)
else:
revision = self.remotes[remoteName].revision or d.revisionExpr
if not revision or revision != p.revisionExpr:
@ -638,13 +625,7 @@ class XmlManifest(object):
if revision == '':
revision = None
manifestUrl = self.manifestProject.config.GetString('remote.origin.url')
projecthookName = None
projecthookRevision = None
for n in node.childNodes:
if n.nodeName == 'projecthook':
projecthookName, projecthookRevision = self._ParseProjectHooks(n)
break
return _XmlRemote(name, alias, fetch, manifestUrl, review, revision, projecthookName, projecthookRevision)
return _XmlRemote(name, alias, fetch, manifestUrl, review, revision)
def _ParseDefault(self, node):
"""
@ -948,8 +929,3 @@ class XmlManifest(object):
diff['added'].append(toProjects[proj])
return diff
def _ParseProjectHooks(self, node):
name = self._reqatt(node, 'name')
revision = self._reqatt(node, 'revision')
return name, revision


@ -16,6 +16,7 @@ from __future__ import print_function
import contextlib
import errno
import filecmp
import glob
import os
import random
import re
@ -69,6 +70,27 @@ def not_rev(r):
def sq(r):
return "'" + r.replace("'", "'\''") + "'"
_project_hook_list = None
def _ProjectHooks():
"""List the hooks present in the 'hooks' directory.
These hooks are project hooks and are copied to the '.git/hooks' directory
of all subprojects.
This function caches the list of hooks (based on the contents of the
'repo/hooks' directory) on the first call.
Returns:
A list of absolute paths to all of the files in the hooks directory.
"""
global _project_hook_list
if _project_hook_list is None:
d = os.path.realpath(os.path.abspath(os.path.dirname(__file__)))
d = os.path.join(d, 'hooks')
_project_hook_list = [os.path.join(d, x) for x in os.listdir(d)]
return _project_hook_list
class DownloadedChange(object):
_commit_cache = None
@ -212,28 +234,60 @@ class _CopyFile(object):
_error('Cannot copy file %s to %s', src, dest)
class _LinkFile(object):
def __init__(self, src, dest, abssrc, absdest):
def __init__(self, git_worktree, src, dest, relsrc, absdest):
self.git_worktree = git_worktree
self.src = src
self.dest = dest
self.abs_src = abssrc
self.src_rel_to_dest = relsrc
self.abs_dest = absdest
def _Link(self):
src = self.abs_src
dest = self.abs_dest
def __linkIt(self, relSrc, absDest):
# link file if it does not exist or is out of date
if not os.path.islink(dest) or os.readlink(dest) != src:
if not os.path.islink(absDest) or (os.readlink(absDest) != relSrc):
try:
# remove existing file first, since it might be read-only
if os.path.exists(dest):
os.remove(dest)
if os.path.exists(absDest):
os.remove(absDest)
else:
dest_dir = os.path.dirname(dest)
dest_dir = os.path.dirname(absDest)
if not os.path.isdir(dest_dir):
os.makedirs(dest_dir)
os.symlink(src, dest)
os.symlink(relSrc, absDest)
except IOError:
_error('Cannot link file %s to %s', src, dest)
_error('Cannot link file %s to %s', relSrc, absDest)
def _Link(self):
"""Link the self.rel_src_to_dest and self.abs_dest. Handles wild cards
on the src linking all of the files in the source in to the destination
directory.
"""
# We use the absSrc to handle the situation where the current directory
# is not the root of the repo
absSrc = os.path.join(self.git_worktree, self.src)
if os.path.exists(absSrc):
# Entity exists so just a simple one to one link operation
self.__linkIt(self.src_rel_to_dest, self.abs_dest)
else:
# Entity doesn't exist assume there is a wild card
absDestDir = self.abs_dest
if os.path.exists(absDestDir) and not os.path.isdir(absDestDir):
_error('Link error: src with wildcard, %s must be a directory',
absDestDir)
else:
absSrcFiles = glob.glob(absSrc)
for absSrcFile in absSrcFiles:
# Create a releative path from source dir to destination dir
absSrcDir = os.path.dirname(absSrcFile)
relSrcDir = os.path.relpath(absSrcDir, absDestDir)
# Get the source file name
srcFile = os.path.basename(absSrcFile)
# Now form the final full paths to srcFile. They will be
# absolute for the desintaiton and relative for the srouce.
absDest = os.path.join(absDestDir, srcFile)
relSrc = os.path.join(relSrcDir, srcFile)
self.__linkIt(relSrc, absDest)
class RemoteSpec(object):
def __init__(self,
@ -490,6 +544,12 @@ class RepoHook(object):
class Project(object):
# These objects can be shared between several working trees.
shareable_files = ['description', 'info']
shareable_dirs = ['hooks', 'objects', 'rr-cache', 'svn']
# These objects can only be used by a single working tree.
working_tree_files = ['config', 'packed-refs', 'shallow']
working_tree_dirs = ['logs', 'refs']
def __init__(self,
manifest,
name,
@ -508,7 +568,8 @@ class Project(object):
upstream=None,
parent=None,
is_derived=False,
dest_branch=None):
dest_branch=None,
optimized_fetch=False):
"""Init a Project object.
Args:
@ -530,6 +591,8 @@ class Project(object):
is_derived: False if the project was explicitly defined in the manifest;
True if the project is a discovered submodule.
dest_branch: The branch to which to push changes for review by default.
optimized_fetch: If True, when a project is set to a sha1 revision, only
fetch from the remote if the sha1 is not present locally.
"""
self.manifest = manifest
self.name = name
@ -558,6 +621,7 @@ class Project(object):
self.upstream = upstream
self.parent = parent
self.is_derived = is_derived
self.optimized_fetch = optimized_fetch
self.subprojects = []
self.snapshots = {}
@ -587,7 +651,7 @@ class Project(object):
@property
def Exists(self):
return os.path.isdir(self.gitdir)
return os.path.isdir(self.gitdir) and os.path.isdir(self.objdir)
@property
def CurrentBranch(self):
@ -788,7 +852,7 @@ class Project(object):
out = StatusColoring(self.config)
if not output_redir == None:
out.redirect(output_redir)
out.project('project %-40s', self.relpath + '/')
out.project('project %-40s', self.relpath + '/ ')
branch = self.CurrentBranch
if branch is None:
@ -1039,7 +1103,8 @@ class Project(object):
current_branch_only=False,
clone_bundle=True,
no_tags=False,
archive=False):
archive=False,
optimized_fetch=False):
"""Perform only the network IO portion of the sync process.
Local working directory/branch state is not affected.
"""
@ -1072,7 +1137,6 @@ class Project(object):
"%s" % (tarpath, str(e)), file=sys.stderr)
self._CopyAndLinkFiles()
return True
if is_new is None:
is_new = not self.Exists
if is_new:
@ -1108,8 +1172,9 @@ class Project(object):
elif self.manifest.default.sync_c:
current_branch_only = True
has_sha1 = ID_RE.match(self.revisionExpr) and self._CheckForSha1()
if (not has_sha1 #Need to fetch since we don't already have this revision
need_to_fetch = not (optimized_fetch and \
(ID_RE.match(self.revisionExpr) and self._CheckForSha1()))
if (need_to_fetch
and not self._RemoteFetch(initial=is_new, quiet=quiet, alt_dir=alt_dir,
current_branch_only=current_branch_only,
no_tags=no_tags)):
@ -1305,6 +1370,8 @@ class Project(object):
if not ID_RE.match(self.revisionExpr):
# in case of manifest sync the revisionExpr might be a SHA1
branch.merge = self.revisionExpr
if not branch.merge.startswith('refs/'):
branch.merge = R_HEADS + branch.merge
branch.Save()
if cnt_mine > 0 and self.rebase:
@ -1330,9 +1397,10 @@ class Project(object):
def AddLinkFile(self, src, dest, absdest):
# dest should already be an absolute path, but src is project relative
# make src an absolute path
abssrc = os.path.join(self.worktree, src)
self.linkfiles.append(_LinkFile(src, dest, abssrc, absdest))
# make src relative path to dest
absdestdir = os.path.dirname(absdest)
relsrc = os.path.relpath(os.path.join(self.worktree, src), absdestdir)
self.linkfiles.append(_LinkFile(self.worktree, src, dest, relsrc, absdest))
def AddAnnotation(self, name, value, keep):
self.annotations.append(_Annotation(name, value, keep))
@ -1373,6 +1441,8 @@ class Project(object):
branch = self.GetBranch(name)
branch.remote = self.GetRemote(self.remote.name)
branch.merge = self.revisionExpr
if not branch.merge.startswith('refs/') and not ID_RE.match(self.revisionExpr):
branch.merge = R_HEADS + self.revisionExpr
revid = self.GetRevisionId(all_refs)
if head.startswith(R_HEADS):
@ -1828,23 +1898,25 @@ class Project(object):
spec.append('tag')
spec.append(tag_name)
branch = self.revisionExpr
if is_sha1 and depth:
# Shallow checkout of a specific commit, fetch from that commit and not
# the heads only as the commit might be deeper in the history.
spec.append(branch)
else:
if is_sha1:
branch = self.upstream
if branch is not None and branch.strip():
if not branch.startswith('refs/'):
branch = R_HEADS + branch
spec.append(str((u'+%s:' % branch) + remote.ToLocal(branch)))
if not self.manifest.IsMirror:
branch = self.revisionExpr
if is_sha1 and depth and git_require((1, 8, 3)):
# Shallow checkout of a specific commit, fetch from that commit and not
# the heads only as the commit might be deeper in the history.
spec.append(branch)
else:
if is_sha1:
branch = self.upstream
if branch is not None and branch.strip():
if not branch.startswith('refs/'):
branch = R_HEADS + branch
spec.append(str((u'+%s:' % branch) + remote.ToLocal(branch)))
cmd.extend(spec)
shallowfetch = self.config.GetString('repo.shallowfetch')
if shallowfetch and shallowfetch != ' '.join(spec):
GitCommand(self, ['fetch', '--unshallow', name] + shallowfetch.split(),
GitCommand(self, ['fetch', '--depth=2147483647', name]
+ shallowfetch.split(),
bare=True, ssh_proxy=ssh_proxy).Wait()
if depth:
self.config.SetString('repo.shallowfetch', ' '.join(spec))
@ -1853,10 +1925,8 @@ class Project(object):
ok = False
for _i in range(2):
gitcmd = GitCommand(self, cmd, bare=True, capture_stderr=True,
ssh_proxy=ssh_proxy)
gitcmd = GitCommand(self, cmd, bare=True, ssh_proxy=ssh_proxy)
ret = gitcmd.Wait()
print(gitcmd.stderr, file=sys.stderr, end='')
if ret == 0:
ok = True
break
@ -1865,9 +1935,8 @@ class Project(object):
"error:" in gitcmd.stderr and
"git remote prune" in gitcmd.stderr):
prunecmd = GitCommand(self, ['remote', 'prune', name], bare=True,
capture_stderr=True, ssh_proxy=ssh_proxy)
ssh_proxy=ssh_proxy)
ret = prunecmd.Wait()
print(prunecmd.stderr, file=sys.stderr, end='')
if ret:
break
continue
@ -1876,6 +1945,9 @@ class Project(object):
# mode, we just tried sync'ing from the upstream field; it doesn't exist, thus
# abort the optimization attempt and do a full sync.
break
elif ret < 0:
# Git died with a signal, exit immediately
break
time.sleep(random.randint(30, 45))
if initial:
@ -1891,8 +1963,15 @@ class Project(object):
# got what we wanted, else trigger a second run of all
# refs.
if not self._CheckForSha1():
return self._RemoteFetch(name=name, current_branch_only=False,
initial=False, quiet=quiet, alt_dir=alt_dir)
if not depth:
# Avoid infinite recursion when depth is True (since depth implies
# current_branch_only)
return self._RemoteFetch(name=name, current_branch_only=False,
initial=False, quiet=quiet, alt_dir=alt_dir)
if self.clone_depth:
self.clone_depth = None
return self._RemoteFetch(name=name, current_branch_only=current_branch_only,
initial=False, quiet=quiet, alt_dir=alt_dir)
return ok
@ -2085,20 +2164,25 @@ class Project(object):
if GitCommand(self, cmd).Wait() != 0:
raise GitError('%s merge %s ' % (self.name, head))
def _InitGitDir(self, mirror_git=None, MirrorOverride=False):
if not os.path.exists(self.gitdir):
def _InitGitDir(self, mirror_git=None):
init_git_dir = not os.path.exists(self.gitdir)
init_obj_dir = not os.path.exists(self.objdir)
# Initialize the bare repository, which contains all of the objects.
if init_obj_dir:
os.makedirs(self.objdir)
self.bare_objdir.init()
# Initialize the bare repository, which contains all of the objects.
if not os.path.exists(self.objdir):
os.makedirs(self.objdir)
self.bare_objdir.init()
# If we have a separate directory to hold refs, initialize it as well.
if self.objdir != self.gitdir:
# If we have a separate directory to hold refs, initialize it as well.
if self.objdir != self.gitdir:
if init_git_dir:
os.makedirs(self.gitdir)
if init_obj_dir or init_git_dir:
self._ReferenceGitDir(self.objdir, self.gitdir, share_refs=False,
copy_all=True)
self._CheckDirReference(self.objdir, self.gitdir, share_refs=False)
if init_git_dir:
mp = self.manifest.manifestProject
ref_dir = mp.config.GetString('repo.reference') or ''
@ -2127,38 +2211,11 @@ class Project(object):
for key in ['user.name', 'user.email']:
if m.Has(key, include_defaults=False):
self.config.SetString(key, m.GetString(key))
if self.manifest.IsMirror and not MirrorOverride:
if self.manifest.IsMirror:
self.config.SetString('core.bare', 'true')
else:
self.config.SetString('core.bare', None)
def _ProjectHooks(self, remote, repodir):
"""List the hooks present in the 'hooks' directory.
These hooks are project hooks and are copied to the '.git/hooks' directory
of all subprojects.
The remote projecthooks supplement/overrule any stockhook making it possible to
have a combination of hooks both from the remote projecthook and
.repo/hooks directories.
Returns:
A list of absolute paths to all of the files in the hooks directory and
projecthooks files, excluding the .git folder.
"""
hooks = {}
d = os.path.join(os.path.abspath(os.path.dirname(__file__)), 'hooks')
hooks = dict([(x, os.path.join(d, x)) for x in os.listdir(d)])
if remote is not None:
if remote.projecthookName is not None:
d = os.path.abspath('%s/projecthooks/%s/%s' % (repodir, remote.name, remote.projecthookName))
if os.path.isdir(d):
hooks.update(dict([(x, os.path.join(d, x)) for x in os.listdir(d)]))
if hooks.has_key('.git'):
del hooks['.git']
return hooks.values()
def _UpdateHooks(self):
if os.path.exists(self.gitdir):
self._InitHooks()
@ -2167,10 +2224,7 @@ class Project(object):
hooks = os.path.realpath(self._gitdir_path('hooks'))
if not os.path.exists(hooks):
os.makedirs(hooks)
pr = None
if self is not self.manifest.manifestProject:
pr = self.manifest.remotes.get(self.remote.name)
for stock_hook in self._ProjectHooks(pr, self.manifest.repodir):
for stock_hook in _ProjectHooks():
name = os.path.basename(stock_hook)
if name in ('commit-msg',) and not self.remote.review \
@ -2234,6 +2288,21 @@ class Project(object):
msg = 'manifest set to %s' % self.revisionExpr
self.bare_git.symbolic_ref('-m', msg, ref, dst)
def _CheckDirReference(self, srcdir, destdir, share_refs):
symlink_files = self.shareable_files
symlink_dirs = self.shareable_dirs
if share_refs:
symlink_files += self.working_tree_files
symlink_dirs += self.working_tree_dirs
to_symlink = symlink_files + symlink_dirs
for name in set(to_symlink):
dst = os.path.realpath(os.path.join(destdir, name))
if os.path.lexists(dst):
src = os.path.realpath(os.path.join(srcdir, name))
# Fail if the links are pointing to the wrong place
if src != dst:
raise GitError('cannot overwrite a local work tree')
def _ReferenceGitDir(self, gitdir, dotgit, share_refs, copy_all):
"""Update |dotgit| to reference |gitdir|, using symlinks where possible.
@ -2245,13 +2314,11 @@ class Project(object):
copy_all: If true, copy all remaining files from |gitdir| -> |dotgit|.
This saves you the effort of initializing |dotgit| yourself.
"""
# These objects can be shared between several working trees.
symlink_files = ['description', 'info']
symlink_dirs = ['hooks', 'objects', 'rr-cache', 'svn']
symlink_files = self.shareable_files
symlink_dirs = self.shareable_dirs
if share_refs:
# These objects can only be used by a single working tree.
symlink_files += ['config', 'packed-refs', 'shallow']
symlink_dirs += ['logs', 'refs']
symlink_files += self.working_tree_files
symlink_dirs += self.working_tree_dirs
to_symlink = symlink_files + symlink_dirs
to_copy = []
@ -2263,8 +2330,8 @@ class Project(object):
src = os.path.realpath(os.path.join(gitdir, name))
dst = os.path.realpath(os.path.join(dotgit, name))
if os.path.lexists(dst) and not os.path.islink(dst):
raise GitError('cannot overwrite a local work tree')
if os.path.lexists(dst):
continue
# If the source dir doesn't exist, create an empty dir.
if name in symlink_dirs and not os.path.lexists(src):
@ -2293,11 +2360,15 @@ class Project(object):
def _InitWorkTree(self):
dotgit = os.path.join(self.worktree, '.git')
if not os.path.exists(dotgit):
init_dotgit = not os.path.exists(dotgit)
if init_dotgit:
os.makedirs(dotgit)
self._ReferenceGitDir(self.gitdir, dotgit, share_refs=True,
copy_all=False)
self._CheckDirReference(self.gitdir, dotgit, share_refs=True)
if init_dotgit:
_lwrite(os.path.join(dotgit, HEAD), '%s\n' % self.GetRevisionId())
cmd = ['read-tree', '--reset', '-u']


@ -462,7 +462,7 @@ def _DownloadBundle(url, local, quiet):
try:
r = urllib.request.urlopen(url)
except urllib.error.HTTPError as e:
if e.code in [403, 404]:
if e.code in [401, 403, 404]:
return False
_print('fatal: Cannot get %s' % url, file=sys.stderr)
_print('fatal: HTTP error %s' % e.code, file=sys.stderr)


@ -76,6 +76,7 @@ change id will be added.
capture_stdout = True,
capture_stderr = True)
p.stdin.write(new_msg)
p.stdin.close()
if p.Wait() != 0:
print("error: Failed to update commit message", file=sys.stderr)
sys.exit(1)


@ -20,6 +20,7 @@ import multiprocessing
import re
import os
import select
import signal
import sys
import subprocess
@ -150,11 +151,15 @@ without iterating through the remaining projects.
attributes that we need.
"""
if not self.manifest.IsMirror:
lrev = project.GetRevisionId()
else:
lrev = None
return {
'name': project.name,
'relpath': project.relpath,
'remote_name': project.remote.name,
'lrev': project.GetRevisionId(),
'lrev': lrev,
'rrev': project.revisionExpr,
'annotations': dict((a.name, a.value) for a in project.annotations),
'gitdir': project.gitdir,
@ -200,6 +205,13 @@ without iterating through the remaining projects.
mirror = self.manifest.IsMirror
rc = 0
smart_sync_manifest_name = "smart_sync_override.xml"
smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, smart_sync_manifest_name)
if os.path.isfile(smart_sync_manifest_path):
self.manifest.Override(smart_sync_manifest_path)
if not opt.regex:
projects = self.GetProjects(args)
else:
@ -207,14 +219,12 @@ without iterating through the remaining projects.
os.environ['REPO_COUNT'] = str(len(projects))
pool = multiprocessing.Pool(opt.jobs)
pool = multiprocessing.Pool(opt.jobs, InitWorker)
try:
config = self.manifest.manifestProject.config
results_it = pool.imap(
DoWorkWrapper,
[[mirror, opt, cmd, shell, cnt, config, self._SerializeProject(p)]
for cnt, p in enumerate(projects)]
)
self.ProjectArgs(projects, mirror, opt, cmd, shell, config))
pool.close()
for r in results_it:
rc = rc or r
@ -236,12 +246,28 @@ without iterating through the remaining projects.
if rc != 0:
sys.exit(rc)
def ProjectArgs(self, projects, mirror, opt, cmd, shell, config):
for cnt, p in enumerate(projects):
try:
project = self._SerializeProject(p)
except Exception as e:
print('Project list error: %r' % e,
file=sys.stderr)
return
except KeyboardInterrupt:
print('Project list interrupted',
file=sys.stderr)
return
yield [mirror, opt, cmd, shell, cnt, config, project]
class WorkerKeyboardInterrupt(Exception):
""" Keyboard interrupt exception for worker processes. """
pass
def InitWorker():
signal.signal(signal.SIGINT, signal.SIG_IGN)
def DoWorkWrapper(args):
""" A wrapper around the DoWork() method.
@ -263,7 +289,9 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
def setenv(name, val):
if val is None:
val = ''
env[name] = val.encode()
if hasattr(val, 'encode'):
val = val.encode()
env[name] = val
setenv('REPO_PROJECT', project['name'])
setenv('REPO_PATH', project['relpath'])


@ -27,12 +27,12 @@ else:
import imp
import urlparse
urllib = imp.new_module('urllib')
urllib.parse = urlparse.urlparse
urllib.parse = urlparse
from color import Coloring
from command import InteractiveCommand, MirrorSafeCommand
from error import ManifestParseError
from project import SyncBuffer, MetaProject
from project import SyncBuffer
from git_config import GitConfig
from git_command import git_require, MIN_GIT_VERSION
@ -153,7 +153,7 @@ to update the working directory files.
# server where this git is located, so let's save that here.
mirrored_manifest_git = None
if opt.reference:
manifest_git_path = urllib.parse(opt.manifest_url).path[1:]
manifest_git_path = urllib.parse.urlparse(opt.manifest_url).path[1:]
mirrored_manifest_git = os.path.join(opt.reference, manifest_git_path)
if not mirrored_manifest_git.endswith(".git"):
mirrored_manifest_git += ".git"
@ -374,52 +374,6 @@ to update the working directory files.
print(' rm -r %s/.repo' % self.manifest.topdir)
print('and try again.')
def _SyncProjectHooks(self, opt, repodir):
"""Downloads the defined hooks supplied in the projecthooks element
"""
# Always delete projecthooks and re-download for every new init.
projecthooksdir = os.path.join(repodir, 'projecthooks')
if os.path.exists(projecthooksdir):
shutil.rmtree(projecthooksdir)
for remotename in self.manifest.remotes:
r = self.manifest.remotes.get(remotename)
if r.projecthookName is not None and r.projecthookRevision is not None:
projecthookurl = r.resolvedFetchUrl.rstrip('/') + '/' + r.projecthookName
ph = MetaProject(manifest = self.manifest,
name = r.projecthookName,
gitdir = os.path.join(projecthooksdir,'%s/%s.git' % (remotename, r.projecthookName)),
worktree = os.path.join(projecthooksdir,'%s/%s' % (remotename, r.projecthookName)))
ph.revisionExpr = r.projecthookRevision
is_new = not ph.Exists
if is_new:
if not opt.quiet:
print('Get projecthook %s' % \
GitConfig.ForUser().UrlInsteadOf(projecthookurl), file=sys.stderr)
ph._InitGitDir(MirrorOverride=True)
phr = ph.GetRemote(remotename)
phr.name = 'origin'
phr.url = projecthookurl
phr.ResetFetch()
phr.Save()
if not ph.Sync_NetworkHalf(quiet=opt.quiet, is_new=is_new, clone_bundle=False):
print('fatal: cannot obtain projecthook %s' % phr.url, file=sys.stderr)
# Better delete the git dir if we created it; otherwise next
# time (when user fixes problems) we won't go through the "is_new" logic.
if is_new:
shutil.rmtree(ph.gitdir)
sys.exit(1)
syncbuf = SyncBuffer(ph.config)
ph.Sync_LocalHalf(syncbuf)
syncbuf.Finish()
def Execute(self, opt, args):
git_require(MIN_GIT_VERSION, fail=True)
@ -435,7 +389,6 @@ to update the working directory files.
self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name)
self._SyncProjectHooks(opt, self.manifest.repodir)
if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
if opt.config_name or self._ShouldConfigureUser():


@ -22,15 +22,8 @@ except ImportError:
import glob
from pyversion import is_python3
if is_python3():
import io
else:
import StringIO as io
import itertools
import os
import sys
from color import Coloring
@ -97,7 +90,7 @@ the following meanings:
dest='orphans', action='store_true',
help="include objects in working directory outside of repo projects")
def _StatusHelper(self, project, clean_counter, sem, output):
def _StatusHelper(self, project, clean_counter, sem):
"""Obtains the status for a specific project.
Obtains the status for a project, redirecting the output to
@ -111,7 +104,7 @@ the following meanings:
output: Where to output the status.
"""
try:
state = project.PrintWorkTreeStatus(output)
state = project.PrintWorkTreeStatus()
if state == 'CLEAN':
next(clean_counter)
finally:
@ -122,16 +115,16 @@ the following meanings:
status_header = ' --\t'
for item in dirs:
if not os.path.isdir(item):
outstring.write(''.join([status_header, item]))
outstring.append(''.join([status_header, item]))
continue
if item in proj_dirs:
continue
if item in proj_dirs_parents:
self._FindOrphans(glob.glob('%s/.*' % item) + \
glob.glob('%s/*' % item), \
self._FindOrphans(glob.glob('%s/.*' % item) +
glob.glob('%s/*' % item),
proj_dirs, proj_dirs_parents, outstring)
continue
outstring.write(''.join([status_header, item, '/']))
outstring.append(''.join([status_header, item, '/']))
def Execute(self, opt, args):
all_projects = self.GetProjects(args)
@ -144,26 +137,17 @@ the following meanings:
next(counter)
else:
sem = _threading.Semaphore(opt.jobs)
threads_and_output = []
threads = []
for project in all_projects:
sem.acquire()
class BufList(io.StringIO):
def dump(self, ostream):
for entry in self.buflist:
ostream.write(entry)
output = BufList()
t = _threading.Thread(target=self._StatusHelper,
args=(project, counter, sem, output))
threads_and_output.append((t, output))
args=(project, counter, sem))
threads.append(t)
t.daemon = True
t.start()
for (t, output) in threads_and_output:
for t in threads:
t.join()
output.dump(sys.stdout)
output.close()
if len(all_projects) == next(counter):
print('nothing to commit (working directory clean)')
@ -188,23 +172,21 @@ the following meanings:
try:
os.chdir(self.manifest.topdir)
outstring = io.StringIO()
self._FindOrphans(glob.glob('.*') + \
glob.glob('*'), \
outstring = []
self._FindOrphans(glob.glob('.*') +
glob.glob('*'),
proj_dirs, proj_dirs_parents, outstring)
if outstring.buflist:
if outstring:
output = StatusColoring(self.manifest.globalConfig)
output.project('Objects not within a project (orphans)')
output.nl()
for entry in outstring.buflist:
for entry in outstring:
output.untracked(entry)
output.nl()
else:
print('No orphan files or directories')
outstring.close()
finally:
# Restore CWD.
os.chdir(orig_path)


@ -131,6 +131,10 @@ of a project from server.
The -c/--current-branch option can be used to only fetch objects that
are on the branch specified by a project's revision.
The --optimized-fetch option can be used to only fetch projects that
are fixed to a sha1 revision if the sha1 revision does not already
exist locally.
SSH Connections
---------------
@ -206,6 +210,9 @@ later is required to fix a server side protocol bug.
p.add_option('--no-tags',
dest='no_tags', action='store_true',
help="don't fetch tags")
p.add_option('--optimized-fetch',
dest='optimized_fetch', action='store_true',
help='only fetch projects fixed to sha1 if revision does not exist locally')
if show_smart:
p.add_option('-s', '--smart-sync',
dest='smart_sync', action='store_true',
@ -275,7 +282,8 @@ later is required to fix a server side protocol bug.
quiet=opt.quiet,
current_branch_only=opt.current_branch_only,
clone_bundle=not opt.no_clone_bundle,
no_tags=opt.no_tags, archive=self.manifest.IsArchive)
no_tags=opt.no_tags, archive=self.manifest.IsArchive,
optimized_fetch=opt.optimized_fetch)
self._fetch_times.Set(project, time.time() - start)
# Lock around all the rest of the code, since printing, updating a set
@ -509,6 +517,9 @@ later is required to fix a server side protocol bug.
self.manifest.Override(opt.manifest_name)
manifest_name = opt.manifest_name
smart_sync_manifest_name = "smart_sync_override.xml"
smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, smart_sync_manifest_name)
if opt.smart_sync or opt.smart_tag:
if not self.manifest.manifest_server:
@ -575,17 +586,16 @@ later is required to fix a server side protocol bug.
[success, manifest_str] = server.GetManifest(opt.smart_tag)
if success:
manifest_name = "smart_sync_override.xml"
manifest_path = os.path.join(self.manifest.manifestProject.worktree,
manifest_name)
manifest_name = smart_sync_manifest_name
try:
f = open(manifest_path, 'w')
f = open(smart_sync_manifest_path, 'w')
try:
f.write(manifest_str)
finally:
f.close()
except IOError:
print('error: cannot write manifest to %s' % manifest_path,
except IOError as e:
print('error: cannot write manifest to %s:\n%s'
% (smart_sync_manifest_path, e),
file=sys.stderr)
sys.exit(1)
self._ReloadManifest(manifest_name)
@ -602,6 +612,13 @@ later is required to fix a server side protocol bug.
% (self.manifest.manifest_server, e.errcode, e.errmsg),
file=sys.stderr)
sys.exit(1)
else: # Not smart sync or smart tag mode
if os.path.isfile(smart_sync_manifest_path):
try:
os.remove(smart_sync_manifest_path)
except OSError as e:
print('error: failed to remove existing smart sync override manifest: %s' %
e, file=sys.stderr)
rp = self.manifest.repoProject
rp.PreSync()
@ -615,7 +632,8 @@ later is required to fix a server side protocol bug.
if not opt.local_only:
mp.Sync_NetworkHalf(quiet=opt.quiet,
current_branch_only=opt.current_branch_only,
no_tags=opt.no_tags)
no_tags=opt.no_tags,
optimized_fetch=opt.optimized_fetch)
if mp.HasChanges:
syncbuf = SyncBuffer(mp.config)