I have done web scraping with the Python Scrapy framework through a ProxyMesh IP. When the proxy requires authentication, I use the following middleware:
import base64

# Custom downloader middleware that routes requests through a proxy
class ProxyMiddleware(object):
    # override process_request
    def process_request(self, request, spider):
        # Set the location of the proxy
        request.meta['proxy'] = "http://....."

        # Use the following lines if your proxy requires authentication
        proxy_user_pass = "username:pwd"
        # set up basic authentication for the proxy
        encoded_user_pass = base64.b64encode(proxy_user_pass.encode()).decode()
        request.headers['Proxy-Authorization'] = 'Basic ' + encoded_user_pass
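For completeness, the middleware is enabled through the DOWNLOADER_MIDDLEWARES setting in settings.py; the module path below is just a placeholder for wherever the class actually lives, and the priority only needs to keep it ahead of Scrapy's built-in HttpProxyMiddleware:

# settings.py -- register the custom proxy middleware
# ('myproject.middlewares' is a placeholder path for this example)
DOWNLOADER_MIDDLEWARES = {
    'myproject.middlewares.ProxyMiddleware': 350,
}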
What is the appropriate technique to do the same when scraping with the Selenium Chrome driver? I can find examples for Firefox, but I have had no luck with the Chrome driver. Please share your ideas.
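For reference, the unauthenticated case is easy to set up in Chrome with a --proxy-server switch; it is the Proxy-Authorization step that I cannot reproduce. A minimal sketch of what I mean, with the proxy host and port as placeholders (credentials embedded in the URL do not seem to be honored by Chrome):

from selenium import webdriver

# Unauthenticated case: point Chrome at the proxy via a command-line switch.
# PROXY_HOST:PORT is a placeholder for the actual ProxyMesh endpoint.
options = webdriver.ChromeOptions()
options.add_argument('--proxy-server=http://PROXY_HOST:PORT')

driver = webdriver.Chrome(options=options)
driver.get('https://example.com')
driver.quit()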