Replace CGetURLOpener with urlretrieve #105
Open
pcf000 wants to merge 4 commits into pfultz2:master from
Conversation
Hi @pfultz2, may I ask why this has not been merged yet? It certainly sorted out the issues I was facing with an internal proxy :)
Owner
There are failures in the Ubuntu CI; if you can, merge the latest from master, which updated it to 20.04.
For reasons I have not entirely traced, URLopener (and thus FancyURLopener) does not work with some proxies. I can see that instead of setting up a tunnel with a CONNECT request, it makes a GET request to the proxy, which in my case produces a 502 Bad Gateway response. Even though it recognises the proxy settings, it doesn't quite do the right thing.
However, urlopen uses ProxyHandler to handle this situation and does work through the troublesome proxy. urllib.request.urlretrieve has the same interface as URLopener.retrieve and looks like a drop-in replacement. I added a try/except around the urlretrieve call to raise BuildError for responses of 400 or greater, since there is no longer a custom http_error_default or a good place to hang one.
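A minimal sketch of the wrapper I have in mind follows; the function name is only illustrative, and the BuildError class below is a placeholder for the project's real exception:

```python
import urllib.error
import urllib.request


class BuildError(Exception):
    """Placeholder for the project's BuildError; the real class lives in cget."""


def retrieve_url(url, filename, reporthook=None):
    # urlretrieve goes through urlopen, which picks up proxy settings via
    # ProxyHandler and tunnels https requests with CONNECT rather than
    # issuing a plain GET to the proxy.
    try:
        return urllib.request.urlretrieve(url, filename, reporthook)
    except urllib.error.HTTPError as e:
        # urlopen raises HTTPError for status codes of 400 or greater, which
        # the old custom http_error_default used to turn into a BuildError.
        raise BuildError("download failed: {} {}: {}".format(e.code, e.reason, url))
```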
This change addresses my internal build problem and probably also issues #52 and #77. Plus, no more deprecation warnings.