I have been doing Facebook/Twitter/Reddit analysis with a mix of Python and JavaScript. Here is some advice:
If you need to do really large-scale mining/spidering, Python is great for shuffling things around between databases and APIs. It's good for making those static JSON files dynamic, or for hitting an API on a schedule and storing the time series somewhere.
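To make that concrete, here's a minimal sketch of that poll-and-store loop, written in Node (18+, for the built-in fetch). The endpoint URL, output file, and interval are all placeholders:

    const fs = require('fs');

    const ENDPOINT = 'https://api.example.com/stats'; // hypothetical endpoint
    const OUT_FILE = 'timeseries.jsonl';              // one JSON object per line
    const INTERVAL_MS = 60 * 1000;                    // poll once a minute

    async function poll() {
      try {
        const res = await fetch(ENDPOINT);
        const data = await res.json();
        // Stamp each row with the fetch time so it reads back as a time series.
        fs.appendFileSync(OUT_FILE, JSON.stringify({ t: Date.now(), data }) + '\n');
      } catch (err) {
        // Log and keep going -- one bad fetch shouldn't kill the loop.
        console.error('poll failed:', err.message);
      }
    }

    poll();
    setInterval(poll, INTERVAL_MS);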
All of my analysis and visualization work is done in JS, though, and it's working out great. Pretty much every API hands you JSON, and JS's map/reduce is great for parsing it. If your data set is a static JSON dump, just serve it up and load it in the browser with d3.json().
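For example, assuming d3 v3-style callbacks and a dump called data.json sitting next to the page:

    // Load a static JSON dump and start working with it in the browser.
    d3.json('data.json', function(error, posts) {
      if (error) { console.error(error); return; }
      // posts is a plain array of objects, ready for map/reduce/nest.
      console.log(posts.length + ' records loaded');
    });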
Seriously, wrap your head around d3.nest() plus map and reduce. There's a learning curve, but they're very powerful and flexible. The actual math behind analyzing nodes and so on isn't overwhelming, and you may find that the cost of writing a distance-counting algorithm yourself is a lot less than building a dev environment that uses Java and whatever else.
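A toy example of the nest/reduce combo, with made-up tweet objects (this is the d3 v3 nest API, where the rollup result lands on .values):

    var tweets = [
      { user: 'a', retweets: 3 },
      { user: 'b', retweets: 10 },
      { user: 'a', retweets: 7 }
    ];

    // Group by user, then reduce each group to a total retweet count.
    var byUser = d3.nest()
      .key(function(d) { return d.user; })
      .rollup(function(leaves) {
        return leaves.reduce(function(sum, d) { return sum + d.retweets; }, 0);
      })
      .entries(tweets);
    // => [ { key: 'a', values: 10 }, { key: 'b', values: 10 } ]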
What it's going to come down to is how big your target data set is. If it's in the hundreds of MB, or even single-digit GB, you can put it in a browser and skip a lot of tedious back-end work.
If you're trying to work with larger data sets, honestly you probably need to recruit an engineer anyhow!
Japhy