I know this has been brought up a few times, and I've tried a number of things to no avail:
Spark 1.6.0
Alluxio 1.1.1 (recompiled for Hadoop 2.7.1)
I followed these tutorials:
I also tried recompiling Spark core with the alluxio-underfs-hdfs dependency added to core/pom.xml (snippet below).
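For reference, this is roughly the dependency block I added to core/pom.xml; the version is assumed to match my Alluxio build above:

    <!-- Added to core/pom.xml so the Spark assembly bundles the HDFS
         under-storage client. Version assumed to match Alluxio 1.1.1. -->
    <dependency>
      <groupId>org.alluxio</groupId>
      <artifactId>alluxio-underfs-hdfs</artifactId>
      <version>1.1.1</version>
    </dependency>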
The Alluxio tests run fine against the under file system, and I can see the resulting files in both Alluxio and HDFS.
However, when I write a DataFrame from Spark, it does not persist through to HDFS: CACHE_THROUGH is set, but I get the "no under file system factory found" error.
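Here is a minimal sketch of what I'm running (host names and paths are placeholders; the write type is passed as a JVM property on both the driver and executors, which is how I understand the Alluxio client picks it up):

    // spark-submit flags used to set the Alluxio client write type:
    //   --conf "spark.driver.extraJavaOptions=-Dalluxio.user.file.writetype.default=CACHE_THROUGH"
    //   --conf "spark.executor.extraJavaOptions=-Dalluxio.user.file.writetype.default=CACHE_THROUGH"

    // Spark 1.6 shell; sqlContext is the shell-provided SQLContext.
    val df = sqlContext.read.json("hdfs://namenode:9000/tmp/events.json") // placeholder input
    // Fails with "no under file system factory found" instead of persisting to HDFS:
    df.write.parquet("alluxio://alluxio-master:19998/tmp/events.parquet") // placeholder output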