import org.apache.spark.sql.hive.HiveContext
val hiveContext = new HiveContext(sparkContext)
java.lang.RuntimeException: java.lang.NullPointerException
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
  at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
  at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
  at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
  ... 33 elided
Caused by: java.lang.NullPointerException
  at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
  at org.apache.hadoop.util.Shell.runCommand(Shell.java:483)
  at org.apache.hadoop.util.Shell.run(Shell.java:456)
  at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
  at org.apache.hadoop.util.Shell.execCommand(Shell.java:815)
  at org.apache.hadoop.util.Shell.execCommand(Shell.java:798)
  at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
  at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:654)
  at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:629)
  at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
  at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
  ... 37 more
There is something else; I am not sure it applies in your case, but you might need winutils.exe:
Make sure Spark can find it. Do ONE of the following:
Define HADOOP_HOME to point to the parent of bin, e.g.:
set HADOOP_HOME=C:\hadoop
Or define JAVA_OPTS to point to the parent of bin, e.g.:
set JAVA_OPTS=-Dhadoop.home.dir=C:\hadoop
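To see which of the two settings (if either) Spark will actually pick up, you can check from the notebook itself. A small sketch — the object and method names here are mine, not part of Spark; the lookup order (system property first, then the environment variable) mirrors what Hadoop's shell utilities do on Windows:

```scala
// Illustrative helper (not part of Spark): resolve the Hadoop home directory
// the way Hadoop does on Windows -- the hadoop.home.dir system property
// (set via JAVA_OPTS) takes precedence over the HADOOP_HOME environment variable.
object HadoopHomeCheck {
  def resolveHadoopHome(): Option[String] =
    sys.props.get("hadoop.home.dir").orElse(sys.env.get("HADOOP_HOME"))

  def main(args: Array[String]): Unit =
    resolveHadoopHome() match {
      case Some(home) =>
        // winutils.exe is expected under <home>\bin on Windows.
        println(s"Hadoop home resolved to: $home")
      case None =>
        println("Neither hadoop.home.dir nor HADOOP_HOME is set")
    }
}
```

If this prints nothing useful, neither setting reached the JVM that runs the notebook, which would explain the NullPointerException from Shell.runCommand.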
--
You received this message because you are subscribed to the Google Groups "spark-notebook-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-notebook-...@googlegroups.com.
To post to this group, send email to spark-not...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/spark-notebook-user/4c85cb2c-d2f0-4c2c-89ca-6977931c554a%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
  at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
  at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
  at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
  ... 33 elided
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
  at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
  at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
  ... 37 more
I tried this:
import org.apache.hadoop.fs._
val path = new Path("file:/tmp/hive")
val lfs = FileSystem.get(path.toUri(), sc.hadoopConfiguration)
lfs.getFileStatus(path).getPermission()
which leads to:
import org.apache.hadoop.fs._

path: org.apache.hadoop.fs.Path = file:/tmp/hive

lfs: org.apache.hadoop.fs.FileSystem = org.apache.hadoop.fs.LocalFileSystem@e085523

res1: org.apache.hadoop.fs.permission.FsPermission = ---------
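So the directory exists but has no permission bits set at all. On Linux/macOS (where winutils.exe is not involved) you can widen the permissions straight from the JVM with plain java.nio, without going through Hadoop's FileSystem API. A minimal sketch, assuming a POSIX file system — the object name is mine:

```scala
import java.nio.file.{Files, Paths}
import java.nio.file.attribute.PosixFilePermissions

object FixScratchDirPermissions {
  // Make `dir` world-writable (rwxrwxrwx), creating it first if needed,
  // and return the resulting permission string for verification.
  def makeWorldWritable(dir: String): String = {
    val path = Paths.get(dir)
    if (!Files.exists(path)) Files.createDirectories(path)
    Files.setPosixFilePermissions(path, PosixFilePermissions.fromString("rwxrwxrwx"))
    PosixFilePermissions.toString(Files.getPosixFilePermissions(path))
  }
}
```

Calling `FixScratchDirPermissions.makeWorldWritable("/tmp/hive")` (as a user allowed to change the directory's mode) should make the scratch dir pass Hive's check; on Windows the equivalent is the winutils.exe chmod command shown later in this thread.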
FsPermission writableHDFSDirPermission = new FsPermission((short)00733);
FileSystem fs = rootHDFSDirPath.getFileSystem(conf);
if (!fs.exists(rootHDFSDirPath)) {
  Utilities.createDirsWithPermission(conf, rootHDFSDirPath, writableHDFSDirPermission, true);
}
FsPermission currentHDFSDirPermission = fs.getFileStatus(rootHDFSDirPath).getPermission();
LOG.debug("HDFS root scratch dir: " + rootHDFSDirPath + ", permission: "
    + currentHDFSDirPermission);
// If the root HDFS scratch dir already exists, make sure it is writeable.
if (!((currentHDFSDirPermission.toShort() & writableHDFSDirPermission.toShort())
    == writableHDFSDirPermission.toShort())) {
  throw new RuntimeException("The root scratch dir: " + rootHDFSDirPath
      + " on HDFS should be writable. Current permissions are: " + currentHDFSDirPermission);
}
<console>:1: error: ')' expected but integer literal found.
FsPermission writableHDFSDirPermission = new FsPermission((short)00733)
                                                                 ^
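That error is expected: the snippet above is Hive's Java source, and Scala does not accept leading-zero octal literals like 00733 (they were deprecated and then removed from the language). If you want to reproduce Hive's permission check in the Scala REPL, parse the octal digits explicitly. A sketch — the object and names are mine, only the bitmask test mirrors Hive's code:

```scala
object OctalPermission {
  // Java's 00733 is an octal literal; in Scala, parse the octal digits explicitly.
  val writableDirMode: Short = Integer.parseInt("733", 8).toShort

  // The same bitmask test Hive performs: every bit of `required`
  // must be present in `current`.
  def satisfies(current: Short, required: Short): Boolean =
    (current & required) == required
}
```

With the observed permissions `---------` (mode 0), `satisfies(0, OctalPermission.writableDirMode)` is false, which is exactly why SessionState throws "The root scratch dir ... should be writable".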
The following steps solved my problem:
1. Open Command Prompt in Admin Mode
2. winutils.exe chmod 777 /tmp/hive
Thank you a lot for the quick responses!