Help Tracing urllib2 Error, Please?


Larry Hale

Jul 19, 2008, 7:08:27 PM
Since it seems I have a "unique" problem, I wonder if anyone could
point me in the general/right direction for tracking down the issue
and resolving it myself.

See my prior post @ http://groups.google.com/group/comp.lang.python/browse_thread/thread/44775994a6b55161?hl=en#
for more info. (Python 2.5.2 on Win XP 64 ==>> Squid Proxy requiring
Authentication ==>> Internet not working.)

I've looked the urllib2 source over, but am having trouble following
it. As previously mentioned, urllib2 initiates the request, Squid
replies with a 407 saying authentication is required, and then urllib2
just stops, raising the 407 error.
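
Here's a minimal sketch of what I could run to watch the exchange from
the urllib2 side (the proxy address is just a placeholder, and I've
left the auth handlers out on purpose so the 407 stays visible):

import urllib2

# Placeholder proxy address; substitute the real Squid host:port.
proxy_handler = urllib2.ProxyHandler({'http': '127.0.0.1:3128'})

# debuglevel=1 makes the underlying httplib connection print the raw
# request and response lines, so the 407 and its headers show up on stdout.
debug_handler = urllib2.HTTPHandler(debuglevel=1)

opener = urllib2.build_opener(proxy_handler, debug_handler)
try:
    opener.open('http://www.python.org')
except urllib2.HTTPError, e:
    print 'got HTTP error:', e.code, e.msg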

Any thought(s) on what to check out?

It's frustrating (to say the least) that it seems so many are
successfully accomplishing this task, and all's working perfectly for
them, but I'm failing miserably.

Would any quotes from the HTTP traffic help? (Wireshark shows all! :)
I don't even know what other info could help.
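
(If it would help, I can pull the Proxy-Authenticate header out of
Squid's 407 reply; as I understand it, that header names the scheme
(Basic, Digest, NTLM, ...) and the realm the proxy wants. A small
sketch, with placeholder host/port:)

import httplib

# Placeholder proxy address; replace with the real Squid host and port.
conn = httplib.HTTPConnection('127.0.0.1', 3128)
conn.set_debuglevel(1)                         # echo the raw exchange
conn.request('GET', 'http://www.python.org/')  # absolute URI, as sent to a proxy
resp = conn.getresponse()
print resp.status, resp.reason                 # expect: 407 Proxy Authentication Required
print resp.getheader('Proxy-Authenticate')     # names the required scheme and realm
conn.close()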

Is there any info about Squid's configuration that might make it "non-
standard" in a way that could cause my problem? Any question(s) I
should ask my Net Admin so I can relay the info to you all?


As always, any/all help greatly appreciated. Thanks! :)

-Larry

Rob Wolfe

Jul 20, 2008, 7:30:09 AM
Larry Hale <lar...@hotmail.com> writes:

Maybe Squid is configured not to allow sending authentication
credentials directly in the URI, or maybe only the digest scheme is
allowed. Try this:

import urllib2

def getopener(proxy=None, digest=False):
    opener = urllib2.build_opener(urllib2.HTTPHandler)
    if proxy:
        # Route requests through the proxy given as "host:port".
        opener.add_handler(urllib2.ProxyHandler({'http': proxy}))
        # Register the credentials against the same proxy address; the
        # default-realm manager answers for whatever realm Squid sends back.
        passwd_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
        passwd_mgr.add_password(None, proxy, 'user', 'password')
        if digest:
            proxy_support = urllib2.ProxyDigestAuthHandler(passwd_mgr)
        else:
            proxy_support = urllib2.ProxyBasicAuthHandler(passwd_mgr)
        opener.add_handler(proxy_support)
    return opener


def fetchurl(url, opener):
    f = opener.open(url)
    data = f.read()
    f.close()
    return data

print fetchurl('http://www.python.org', getopener('127.0.0.1:3128'))
print fetchurl('http://www.python.org', getopener('127.0.0.1:3128', digest=True))
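
If one of those works, the opener can also be installed globally so
that a plain urllib2.urlopen() call goes through the proxy as well
(same placeholder address as above):

import urllib2

urllib2.install_opener(getopener('127.0.0.1:3128'))   # or digest=True
print urllib2.urlopen('http://www.python.org').read()[:200]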

HTH,
Rob
