ssh: rewrite proxy management for multiprocessing usage

We changed sync to use multiprocessing for parallel work.  This broke
the ssh proxy code as it's all based on threads.  Rewrite the logic to
be multiprocessing-safe.

Now, instead of the module acting as a stateful object, callers have to
instantiate a new ProxyManager class that holds all the state, and pass
that down to any users.
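
A hedged sketch of the new calling convention (ProxyManager, preconnect(),
and the PreConnectFetch(ssh_proxy) signature come from this change; the
multiprocessing.Manager argument and the close() teardown are assumptions
about the new API, not verified details):

    # Sketch only: callers create one ProxyManager and pass it down
    # explicitly instead of relying on module-level state in ssh.py.
    import multiprocessing

    import ssh

    def sync_remotes(remotes):
      # The ProxyManager owns all SSH master-session state; handing it a
      # multiprocessing.Manager for shared state is an assumption here.
      manager = multiprocessing.Manager()
      ssh_proxy = ssh.ProxyManager(manager)
      try:
        for remote in remotes:
          # Remote.PreConnectFetch() now takes the proxy explicitly.
          remote.PreConnectFetch(ssh_proxy)
      finally:
        # Tearing down the master sessions at the end is assumed API.
        ssh_proxy.close()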

Bug: https://crbug.com/gerrit/12389
Change-Id: I4b1af116f7306b91e825d3c56fb4274c9b033562
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305486
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>

@@ -27,7 +27,6 @@ import urllib.request
 from error import GitError, UploadError
 import platform_utils
 from repo_trace import Trace
-import ssh
 from git_command import GitCommand
 from git_refs import R_CHANGES, R_HEADS, R_TAGS
@@ -519,17 +518,23 @@ class Remote(object):
     return self.url.replace(longest, longestUrl, 1)

-  def PreConnectFetch(self):
+  def PreConnectFetch(self, ssh_proxy):
     """Run any setup for this remote before we connect to it.

     In practice, if the remote is using SSH, we'll attempt to create a new
     SSH master session to it for reuse across projects.

+    Args:
+      ssh_proxy: The SSH settings for managing master sessions.
+
     Returns:
       Whether the preconnect phase for this remote was successful.
     """
+    if not ssh_proxy:
+      return True
+
     connectionUrl = self._InsteadOf()
-    return ssh.preconnect(connectionUrl)
+    return ssh_proxy.preconnect(connectionUrl)

   def ReviewUrl(self, userEmail, validate_certs):
     if self._review_url is None:
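
For reference, a hedged sketch of how a fetch worker might consume the
boolean contract shown in the hunk above (the helper name and warning
behavior are hypothetical, not part of this change):

    def _preconnect(remote, ssh_proxy):
      # PreConnectFetch() returns True when no proxy is configured or when
      # the SSH master session was started; False just means the fetch
      # proceeds without connection reuse.
      ok = remote.PreConnectFetch(ssh_proxy)
      if not ok:
        print('warning: ssh master session setup failed; '
              'fetching without connection reuse')
      return ok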