We are in the process of transitioning to Spark 2.0.
Currently, all tests pass when run in Local TestCluster mode. DruidTest Cluster mode is broken because of dependency incompatibilities between Druid 0.9.x and Spark 2.0, which means that for now you need to start a local Druid cluster separately (just as you had to do prior to v0.4.0).
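For reference, here is a minimal ScalaTest sketch of how cluster-dependent tests can be guarded so the rest of the suite stays green when no local Druid cluster is running. The suite name, broker-port check, and helper below are illustrative assumptions, not the project's actual test code:

```scala
import java.net.Socket
import scala.util.Try
import org.scalatest.FunSuite

// Illustrative sketch only: guards cluster-backed tests behind a reachability
// check so they are canceled (not failed) when no local Druid broker is up.
class LocalDruidClusterCheck extends FunSuite {

  // Assumes the default Druid broker port (8082); adjust for your local setup.
  private def druidBrokerReachable: Boolean =
    Try { new Socket("localhost", 8082).close(); true }.getOrElse(false)

  test("cluster-backed query (canceled unless a local Druid broker is reachable)") {
    assume(druidBrokerReachable,
      "no Druid broker on localhost:8082; start a local cluster first")
    // ... cluster-backed assertions would go here
  }
}
```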
We will be doing more testing and cleanup over the next few days; this branch should be folded into the master branch in around a week's time.