java.lang.NoClassDefFoundError while running hadoop


Esash

Nov 4, 2013, 2:55:52 AM
to chenn...@googlegroups.com
Hi All,

I am pretty new to Hadoop. I am trying to get the DatanodeReport from the client using the following :

import java.net.InetSocketAddress;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.apache.hadoop.hdfs.protocol.FSConstants.DatanodeReportType;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class Three {
    public static void testDN() throws Exception {
        Configuration conf = new Configuration();
        MiniDFSCluster cluster = null;
        try {
            // Start a single-datanode mini cluster and ask it for a report.
            cluster = new MiniDFSCluster(conf, 1, true, null);
            InetSocketAddress addr = new InetSocketAddress("localhost", cluster.getNameNodePort());
            DFSClient client = new DFSClient(addr, conf);
            DatanodeInfo[] report = client.datanodeReport(DatanodeReportType.ALL);
            System.out.println("Datanodes reported: " + report.length);
        } finally {
            if (cluster != null) {
                cluster.shutdown();
            }
        }
    }

    public static void main(String[] args) throws Exception {
        testDN();
    }
}

I could compile it using:

javac -cp hadoop-core-1.2.1.jar:hadoop-test-1.2.1.jar Three.java

But if I try to run it with bin/hadoop Three, I get the following error:

hduser1@slave:/usr/local/hadoop$ bin/hadoop Three
Warning: $HADOOP_HOME is deprecated.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/MiniDFSCluster
    at Three.testDN(Three.java:13)
    at Three.main(Three.java:28)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.MiniDFSCluster
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 2 more
hduser1@slave:/usr/local/hadoop$

I understand that hadoop-test-1.2.1.jar is not getting included in the classpath. I tried adding it to HADOOP_CLASSPATH and also to the PATH variable, but the error persists. Please help. I use Ubuntu 12.10.

Thanks a lot, in advance.

- Esash

Ashwanth Kumar

Nov 4, 2013, 3:11:22 AM
to chenn...@googlegroups.com
What did you set your HADOOP_CLASSPATH to?



--

Ashwanth Kumar / ashwanthkumar.in

Esash

Nov 4, 2013, 3:39:34 AM
to chenn...@googlegroups.com, ashwan...@googlemail.com
Hi All,

I found the mistake I was making. I had added only the directory path to HADOOP_CLASSPATH, leaving out the jar filename. Once I added the full path to the jar file itself, it started working. Thanks a lot.
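For anyone hitting the same error, this is roughly what the fix looks like in the shell. The paths below are assumptions based on my install under /usr/local/hadoop; adjust them to wherever your jars actually live:

```shell
# Wrong: only the directory -- the hadoop script does not expand it to the jars inside.
# export HADOOP_CLASSPATH=/usr/local/hadoop

# Right: the full path to each jar file, colon-separated (paths assumed, adjust to your install).
export HADOOP_CLASSPATH=/usr/local/hadoop/hadoop-test-1.2.1.jar:/usr/local/hadoop/hadoop-core-1.2.1.jar

echo "$HADOOP_CLASSPATH"
```

After exporting this, bin/hadoop Three picks up MiniDFSCluster from hadoop-test-1.2.1.jar and the NoClassDefFoundError goes away.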

- Esash