What I've ended up doing is creating a method to get the headers and then making my own decisions based on what I find. As it might be generically useful, here's the code snippet for others who come to this group experiencing the same quirkiness.
require 'net/http'

def self.get_head_meta(url, port = 80)
  non_http_url = url.sub('http://', '')
  host = non_http_url.split('/')[0]
  # sub rather than gsub so the host is only stripped once, and fall back to '/'
  # when the URL has no path, since Net::HTTP#head needs a non-empty path
  path = non_http_url.sub(host, '')
  path = '/' if path.empty?

  proxy_host = [proxy NOT INCLUDING http://]
  proxy_port = [integer]

  response = nil
  Net::HTTP::Proxy(proxy_host, proxy_port).start(host, port) do |http|
    response = http.head(path)
  end
  response.to_hash
end
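For what it's worth, calling it looks something like this (FeedChecker and the feed URL are just placeholders; Net::HTTP downcases the header names in the hash from to_hash and wraps each value in an array):

  headers = FeedChecker.get_head_meta('http://example.com/feeds/huge.rss')
  headers['content-length']  # e.g. ["1048576"]
  headers['last-modified']   # e.g. ["Mon, 01 Jun 2009 12:00:00 GMT"]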
The RSS feeds are so huge that I can totally justify calling out to this and eating the extra request. I think the next step might be using Net::HTTP to retrieve just the first 5 KB of a 1 MB file (i.e. the top), which I've seen one example of out there.
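In case anyone wants to try the same thing, here's a rough sketch of that partial fetch using a Range header (get_partial_body and the URL handling are just my own names and assumptions, and not every server honours Range, so I check the response code and truncate anyway):

  require 'net/http'
  require 'uri'

  # Grab only the first `limit` bytes of a large feed via an HTTP Range request.
  # Swap Net::HTTP for the same Net::HTTP::Proxy(...) call as above if you're behind the proxy.
  def self.get_partial_body(url, limit = 5 * 1024)
    uri  = URI.parse(url)
    path = uri.path.empty? ? '/' : uri.path
    Net::HTTP.start(uri.host, uri.port) do |http|
      # Servers that support Range reply 206 Partial Content; others send the whole thing (200)
      response = http.get(path, 'Range' => "bytes=0-#{limit - 1}")
      return response.body[0, limit] if %w[200 206].include?(response.code)
    end
    nil
  end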
Thanks again for Feedzirra. It's making life easier for sure.