No FileSystem for scheme: hdfs

Showing 1-7 of 7 messages
No FileSystem for scheme: hdfs Thinus Prinsloo 8/8/12 2:28 AM
Hey all,

I'm trying to write a simple YARN-based application, but I'm stuck on writing files to HDFS.  I've managed to figure out the mechanics of running jar files in containers on the various machines, but when I try to create a FileSystem object, I get the following exception:

java.io.IOException: No FileSystem for scheme: hdfs
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2138)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2184)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2166)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:302)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:158)

The line causing the error is this one:

FileSystem hdfs = FileSystem.get(conf);

The conf object (an org.apache.hadoop.conf.Configuration) is built from the core-site.xml and hdfs-site.xml files.  I'm running the JAR as a user called "yarn", and the directory I'm trying to access does exist, though I'm not at that point yet; for now I'm just trying to create the FileSystem object.

I've searched Google, and the only help I found relates to a bug that existed in HDFS 2.0.0-alpha.  I checked the fs.default.name value in the configuration, and it is reported as "hdfs://namenode.local:8020", which seems right.
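For anyone following along, my understanding of the lookup that fails: FileSystem.get() maps the URI scheme ("hdfs") to an implementation class, either through a fs.&lt;scheme&gt;.impl entry in the Configuration or, in 2.x, through service files inside the Hadoop jars on the classpath.  A stdlib-only sketch of that resolution (not the real Hadoop code; the class name is made up):

```java
import java.io.IOException;
import java.net.URI;
import java.util.HashMap;
import java.util.Map;

// Simplified model of the lookup inside FileSystem.getFileSystemClass().
// The real 2.x code also consults ServiceLoader entries shipped in the
// hadoop-common and hadoop-hdfs jars; this sketch only models the
// fs.<scheme>.impl configuration path.
public class SchemeLookup {
    private final Map<String, String> conf = new HashMap<>();

    public void set(String key, String value) {
        conf.put(key, value);
    }

    public String getFileSystemClass(URI uri) throws IOException {
        String impl = conf.get("fs." + uri.getScheme() + ".impl");
        if (impl == null) {
            // This is the exception from the stack trace above.
            throw new IOException("No FileSystem for scheme: " + uri.getScheme());
        }
        return impl;
    }

    public static void main(String[] args) throws IOException {
        SchemeLookup lookup = new SchemeLookup();
        URI uri = URI.create("hdfs://namenode.local:8020/user/yarn");
        try {
            lookup.getFileSystemClass(uri);
        } catch (IOException e) {
            System.out.println(e.getMessage()); // No FileSystem for scheme: hdfs
        }
        // Once an implementation is registered (via config, or via a
        // service file on the classpath), the same lookup succeeds:
        lookup.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        System.out.println(lookup.getFileSystemClass(uri));
    }
}
```

So the error means the scheme-to-class mapping is empty, not that fs.default.name is wrong.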

Any help/ideas would be greatly appreciated.

Thanks,
Thinus
Re: No FileSystem for scheme: hdfs Thinus Prinsloo 8/8/12 7:56 AM
OK, I fixed it by running the distributed jar with "hadoop jar" instead of trying to execute a standalone "java -jar".
Re: No FileSystem for scheme: hdfs Sambit Tripathy 9/11/12 10:11 PM
Yes, it works. You have to build the jar with all the required dependencies included.
Re: No FileSystem for scheme: hdfs jghuang 9/14/12 11:21 AM
I have received a similar error. Since I am using C, I cannot create a jar and run it with "hadoop jar".
 
I have read somewhere that Phil Z. said the way to correct it is to add the following to hdfs-site.xml. Since that posting was from 2009, is it still valid?
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.dfs.DistributedFileSystem</value>
  <description>The FileSystem for hdfs: uris.</description>
</property>
The error:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
12/09/14 14:11:49 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException: No FileSystem for scheme: hdfs
Exception in thread "main" java.io.IOException: No FileSystem for scheme: hdfs

        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2138)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2184)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2166)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:302)
        at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:148)
        at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:146)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:146)
Call to org.apache.hadoop.fs.Filesystem::get(URI, Configuration) failed!
Re: No FileSystem for scheme: hdfs Harsh J 9/14/12 7:43 PM
Hi,

You do not need to do that if you have the proper hdfs jars on the
classpath of your application. Is there a way to know if that is
included properly?
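One quick check: in 2.x, each FileSystem implementation registers itself in a META-INF/services file, so a small program run with the same CLASSPATH as your process will show what is actually visible (this class is just an illustration; with no Hadoop jars present it reports nothing):

```java
import java.net.URL;
import java.util.Enumeration;

// Classpath check: Hadoop 2.x discovers FileSystem implementations through
// this service file. If no entry shows up here at runtime, FileSystem.get()
// cannot resolve the hdfs:// scheme, regardless of your XML configuration.
public class FsServiceCheck {
    public static void main(String[] args) throws Exception {
        Enumeration<URL> urls = FsServiceCheck.class.getClassLoader()
                .getResources("META-INF/services/org.apache.hadoop.fs.FileSystem");
        boolean found = false;
        while (urls.hasMoreElements()) {
            found = true;
            System.out.println("service file: " + urls.nextElement());
        }
        if (!found) {
            System.out.println("no FileSystem service registrations on classpath");
        }
    }
}
```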
--
Harsh J
Re: No FileSystem for scheme: hdfs jghuang 9/17/12 10:37 AM
Classpath:
/usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar:.:/usr/lib/hadoop/lib/slf4j-api-1.6.1.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-core.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/commons-lang-2.5.jar:/usr/lib/hadoop/lib/commons-logging-api-1.1.1.jar:/usr/lib/hadoop/lib/commons-logging-1.1.1.jar:/usr/lib/hadoop/etc/hadoop:/usr/lib/hadoop/core-3.1.1.jar:/usr/lib/hadoop/hadoop-client-3.1.1.jar
Re: No FileSystem for scheme: hdfs kunp...@gmail.com 9/4/13 1:41 AM
I've got a similar problem with "java -jar xx.jar" in hadoop-2.0.5-alpha:
java.io.IOException: No FileSystem for scheme: file

but it works well when running with "hadoop jar". 

When I add the following config into core-default.xml, it works with "java -jar":

<property>
  <name>fs.file.impl</name>
  <value>org.apache.hadoop.fs.LocalFileSystem</value>
  <description>The FileSystem for file: uris.</description>
</property>

<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
  <description>The FileSystem for hdfs: uris.</description>
</property>

So maybe it is not a problem of missing required dependencies. I don't know why, but it works!
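A likely explanation, in case it helps others: hadoop-common and hadoop-hdfs each ship a META-INF/services/org.apache.hadoop.fs.FileSystem file registering their implementations. When a fat jar is built naively, only one copy of that file survives, so the registrations for file: and/or hdfs: are lost; that is exactly what the fs.file.impl and fs.hdfs.impl overrides paper over. With the Maven shade plugin, the ServicesResourceTransformer concatenates the service files instead (plugin version below is illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <transformers>
          <!-- Merge META-INF/services files instead of letting one
               jar's copy overwrite another's -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>

With that in place, "java -jar" should resolve both schemes without editing core-default.xml (which is bundled with Hadoop and not meant to be changed; site overrides belong in core-site.xml or the job Configuration).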