Glad to hear it all worked out, and thank you for your specific responses about your concerns.
I do run fog's tests in both mocked and real modes, so as you said, a bug in the mocking code is probably a bug in fog itself (fwiw).
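For reference, a minimal sketch of what that mocked-vs-real toggle can look like in a test helper. The `FOG_MOCK` environment variable and the placeholder credentials are assumptions for illustration, not fog's actual test setup:

```ruby
# spec_helper.rb (sketch) -- run the same tests against mocked or real providers.
# FOG_MOCK is a hypothetical env var; fog's own suite may wire this differently.
require 'fog'

Fog.mock! if ENV['FOG_MOCK'] == 'true'  # serve requests from fog's in-memory mocks

# The same test code then runs unchanged against either backend:
compute = Fog::Compute.new(
  provider:              'AWS',
  aws_access_key_id:     'key',     # placeholder credentials
  aws_secret_access_key: 'secret'
)
server = compute.servers.create
server.wait_for { ready? }
```

In mocked mode the create/wait cycle completes against fog's in-memory state, so the suite stays fast and credential-free; unsetting the flag exercises the real provider with the same assertions.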
There is definitely some overhead to switching between libraries that VCR might help accommodate (I've made this switch myself in other places, also without VCR, and I can see how it would benefit).
I wasn't aware that VCR took latency into account. Would using VCR have helped with the performance issues, compared to the mocks?
VCR support was added to excon as a separate concern from fog (and fairly recently); it exists mostly for people using excon directly for other things. That said, I'm not trying to say it shouldn't work with fog. It's just not something I've gone to any length to provide for, since the mocks seemed like a ready solution. So I'm glad you got the mocks working, and I think in most cases they will be easier to use, but it sounds like there are some reasons why you might want to opt for the more difficult (but more accurate) direct usage of VCR. I'll add it to the list of things to think about going forward.

So far you've been the first to really bring this up; everyone else seems to have been fine with the mocks (or perhaps they aren't testing well, or are doing something else I'm not aware of). Would you still want VCR support added to fog in the future, or do you find that in the end the mocks do a good enough job of supporting your use case?
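For anyone who does want to go the VCR route with excon, a sketch of what the wiring might look like. The cassette name and directory are assumptions, and you should check VCR's docs for whether the `:excon` hook is supported in your VCR version:

```ruby
# spec_helper.rb (sketch) -- record and replay excon HTTP traffic with VCR.
require 'excon'
require 'vcr'

VCR.configure do |c|
  c.cassette_library_dir = 'spec/cassettes'  # where recorded responses are stored
  c.hook_into :excon                         # VCR's excon adapter (version-dependent)
end

# Requests made inside the cassette block are recorded on the first run
# and replayed from disk on subsequent runs:
VCR.use_cassette('example') do
  Excon.get('http://example.com')
end
```

This is the trade-off mentioned above: the cassette replays real recorded responses (more accurate than hand-written mocks), at the cost of managing recordings and re-recording when the provider's API changes.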
Also, out of curiosity, is there anything in particular that led you toward trying to approach this via VCR instead of the fog mocks? I'd like to shore up the documentation to make sure the mocked path is more clearly the recommended one (at least for the time being).
Thanks again for all the feedback. It's always valuable to me when I can hear more specifically about use cases and concerns.