ERROR: DFS Broker (hadoop) did not come up Exception in thread "main" java.lang.NoClassDefFoundError: org/hypertable/DfsBroker/hadoop/


Xy Zheng

2013/12/12 1:08:51
To: hyperta...@googlegroups.com
Hi Doug,

cap start fails with the following error:

 servers: ["dlxa101"]
    [dlxa101] executing command
 ** [out :: dlxa101] DFS broker: available file descriptors: 65536
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
 ** [out :: dlxa101] ERROR: DFS Broker (hadoop) did not come up

[cloudil@dlxa101 bin]$ ./set-hadoop-distro.sh cdh4
Hypertable successfully configured for Hadoop cdh4

DfsBroker.hadoop.log shows the following:

[cloudil@dlxa101 log]$ tail -f -n20 DfsBroker.hadoop.log 
No Hadoop distro is configured.  Run the following script to
configure:

/home/cloudil/zxy_hytcluster_test/hypertable_cluster/current/bin/set-hadoop-distro.sh
Hypertable successfully configured for Hadoop cdh4
Exception in thread "main" java.lang.NoClassDefFoundError: org/hypertable/DfsBroker/hadoop/main
Caused by: java.lang.ClassNotFoundException: org.hypertable.DfsBroker.hadoop.main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.hypertable.DfsBroker.hadoop.main.  Program will exit.
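The ClassNotFoundException above means the Hypertable java jar is not on the broker's classpath at all. One quick way to check which jar, if any, actually contains the missing class is to scan the lib/java directory; this is only a diagnostic sketch, and the install path used below is the one from this thread (adjust it for any other setup):

```shell
# find_class_jar: scan a directory of jars for a given class entry.
# Sketch only; requires the Info-ZIP `unzip` tool.
find_class_jar() {
  dir=$1
  entry=$2
  for jar in "$dir"/*.jar; do
    [ -f "$jar" ] || continue            # skip if the glob matched nothing
    if unzip -l "$jar" 2>/dev/null | grep -q "$entry"; then
      echo "found in: $jar"
    fi
  done
}

# Is the broker's main class anywhere under lib/java?
# (Install prefix is the one from this thread -- an assumption elsewhere.)
find_class_jar \
  /home/cloudil/zxy_hytcluster_test/hypertable_cluster/current/lib/java \
  'org/hypertable/DfsBroker/hadoop/main.class'
```

If the loop prints nothing, no jar in that directory provides the class, which matches the error in the log above.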

Can you help me? Thank you.

Xy Zheng

2013/12/12 22:13:54
To: hyperta...@googlegroups.com
I think this error arose because there is no jar file (hypertable-0.9.7.X(-examples).jar) in /current/lib and /current/lib/java/cdhX.

You must compile it and copy it to the paths mentioned above.
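A quick way to confirm this diagnosis before rebuilding anything is to check whether each expected directory actually contains a hypertable jar. The paths below are the ones named in this thread; treat them as assumptions for any other install:

```shell
# check_jar_dirs: report whether each directory contains at least one
# hypertable-*.jar. Sketch only; paths are from this thread.
check_jar_dirs() {
  for dir in "$@"; do
    if ls "$dir"/hypertable-*.jar >/dev/null 2>&1; then
      echo "ok: $dir"
    else
      echo "MISSING: $dir"
    fi
  done
}

check_jar_dirs \
  /home/cloudil/zxy_hytcluster_test/hypertable_cluster/current/lib \
  /home/cloudil/zxy_hytcluster_test/hypertable_cluster/current/lib/java/cdh4
```

Any directory reported MISSING needs the compiled jar copied in before the broker can start.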


wow!!!

On Thursday, December 12, 2013 at 2:08:51 PM UTC+8, Xy Zheng wrote:
(Message deleted)

Xy Zheng

2013/12/13 1:43:30
To: hyperta...@googlegroups.com
fixed error: /current/lib/Java

On Friday, December 13, 2013 at 11:13:54 AM UTC+8, Xy Zheng wrote:

Xy Zheng

2013/12/23 2:22:51
To: hyperta...@googlegroups.com
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NotReplicatedYetException
at org.hypertable.DfsBroker.hadoop.main.main(main.java:171)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.NotReplicatedYetException
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

What causes this error? I have put the *.jar files under the correct path. Any advice would be appreciated.
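NotReplicatedYetException lives in Hadoop's own HDFS jars, not in Hypertable's, so this stack trace points at the Hadoop distro jars being absent from the broker's classpath rather than at a misplaced Hypertable jar. A hedged sketch to locate which jar in a Hadoop install provides the class (the /usr/lib/hadoop default is an assumption; CDH4 layouts differ between packaged and tarball installs):

```shell
# find_hadoop_class: recursively search a Hadoop install tree for the jar
# that provides a given class entry. Diagnostic sketch only.
find_hadoop_class() {
  root=$1
  entry=$2
  find "$root" -name '*.jar' 2>/dev/null | while read -r jar; do
    if unzip -l "$jar" 2>/dev/null | grep -q "$entry"; then
      echo "provided by: $jar"
    fi
  done
}

# HADOOP_HOME default below is an assumption; adjust for your layout.
find_hadoop_class "${HADOOP_HOME:-/usr/lib/hadoop}" \
  'org/apache/hadoop/hdfs/server/namenode/NotReplicatedYetException.class'
```

If the class is found in a jar that the broker's classpath does not include, that is consistent with the distro jars not having been linked in yet.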

On Friday, December 13, 2013 at 2:43:30 PM UTC+8, Xy Zheng wrote:

Xy Zheng

2013/12/23 4:08:48
To: hyperta...@googlegroups.com
[cloudil@dlxa101 bin]$ ./set-hadoop-distro.sh cdh4 
When I ran the command above, the error was fixed, but I don't know why.

When I ran cap install_package, the following config items were already set in the Capfile:
set :default_dfs, "hadoop"
set :default_distro, "cdh4"

I want to know when, and by whom, ./set-hadoop-distro.sh is supposed to be called. Is it run automatically? Why did I have to run it manually?

The settings section of my Capfile is as follows:
[cloudil@dlxa101 zxy_hytcluster_test]$ head -n20 Capfile
set :source_machine, "dlxa101"
set :install_dir,  "/home/cloudil/zxy_hytcluster_test/hypertable_cluster"
set :hypertable_version, "0.9.7.8"
set :default_pkg, "/home/cloudil/zxy_hytcluster_test/hypertable-0.9.7.8-linux-x86_64-debug.tar.bz2"
set :default_dfs, "hadoop"
set :default_distro, "cdh4"
set :default_config, "/home/cloudil/zxy_hytcluster_test/hypertable.cfg"

role :source, "dlxa101"
role :master, "dlxa101"
role :hyperspace, "dlxa102"
role :slave,  "dlxa103", "dlxa105", "dlxa106", "dlxa107"
role :localhost, "dlxa101"

######################### END OF USER CONFIGURATION ############################
set :prompt_stop, 0
set :prompt_clean, 1

Any advice would be appreciated.

On Monday, December 23, 2013 at 3:22:51 PM UTC+8, Xy Zheng wrote:

Doug Judd

2013/12/23 11:31:02
To: hyperta...@googlegroups.com
Maybe there were some old CDH3 .jar files left around from the previous run.  Setting :default_distro to "cdh4" should have solved the problem.  I think when you change distros (e.g. change :default_distro from "cdh3" to "cdh4") you need to run the following command:

cap set_distro

- Doug
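For readers hitting the same issue, the distro-change workflow Doug describes can be sketched as follows. The stop/start task names are assumptions based on the cap start invocation earlier in this thread, not confirmed by Doug's reply:

```shell
# Sketch of the distro-change workflow described above.
# 1. Edit the Capfile first:
#      set :default_distro, "cdh4"
# 2. Push the new distro setting to all machines:
cap set_distro
# 3. Restart the cluster so the brokers pick up the new jars
#    (task names assumed from the `cap start` usage in this thread):
cap stop
cap start
```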





--
Doug Judd
CEO, Hypertable Inc.

Xy Zheng

2013/12/23 20:26:26
To: hyperta...@googlegroups.com, do...@hypertable.com
OK, I see. Thank you very much.

On Tuesday, December 24, 2013 at 12:31:02 AM UTC+8, Doug Judd wrote: