> 2011-08-25 19:41:44-0700 [default] DEBUG: Crawled (501) <GET https://www.paypal.com> (referer: None)
> 2011-08-25 19:41:44-0700 [default] INFO: Closing spider (finished)
> 2011-08-25 19:41:48-0700 [default] INFO: Spider closed (finished)
>
> $ wget https://www.paypal.com
> --2011-08-25 19:44:08-- https://www.paypal.com/
> Resolving us.proxymesh.com... 184.106.76.204
> Connecting to us.proxymesh.com|184.106.76.204|:31280... connected.
> Proxy request sent, awaiting response... 200 OK
> Length: unspecified [text/html]
> Saving to: `index.html'
>
> I have Scrapy 0.12.0.2545, Twisted 11.0.0 and Python 2.7.
>
> After some investigation, it appears that instead of issuing a CONNECT
> request to the proxy and then the GET through the resulting tunnel,
> Scrapy issues only the GET, which causes the fetch to fail.
>
> Do you have any idea why this happens and how it can be fixed?
>
> Thanks,
> Oana
>
> --
> You received this message because you are subscribed to the Google Groups "scrapy-users" group.
> To post to this group, send email to scrapy...@googlegroups.com.
> To unsubscribe from this group, send email to scrapy-users...@googlegroups.com.
> For more options, visit this group at http://groups.google.com/group/scrapy-users?hl=en.
>
HttpProxyMiddleware.