Hello,
I've been trying to run the wordcount exercise using the HDFS cluster. I
can complete the "configure hdfs in a cluster" and "configure mapreduce
daemon" steps successfully, and I can format the file system, but when I
run the start-dfs.sh script, I see the following error:
192.168.11.10: starting secondarynamenode, logging to /N/u/train010/
hadoop-0.20.2/bin/../logs/hadoop-train010-secondarynamenode-s10.out
192.168.11.10: Exception in thread "main" java.net.BindException:
Address already in use
192.168.11.10: at sun.nio.ch.Net.bind(Native Method)
I also looked at my log file, where I see a "Duplicate metricsName"
error for getProtocolVersion:
2010-07-27 19:11:41,116 INFO org.apache.hadoop.ipc.Server: Error
register getProtocolVersion
java.lang.IllegalArgumentException: Duplicate
metricsName:getProtocolVersion
at org.apache.hadoop.metrics.util.MetricsR
I've followed the instructions, removed all my /tmp files, and started
from scratch, but I'm still seeing the same error. Has anyone else seen
this? If so, could you let me know how you resolved it?
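For context, the BindException suggests something is already listening on the daemon's port, which would also explain the duplicate metrics registration. A minimal check for a stale daemon might look like this (a rough sketch, assuming a Linux node with net-tools; 50090 is the default SecondaryNameNode HTTP port in Hadoop 0.20, so adjust if yours is configured differently):

```shell
# Look for Hadoop daemons left over from a previous run; a stale
# SecondaryNameNode would keep its port bound and trigger the
# "Address already in use" BindException on the next start-dfs.sh.
ps aux | grep -iE 'secondarynamenode|namenode|datanode' \
  | grep -v grep || echo "no leftover hadoop daemons"

# See whether any socket is already bound to the SecondaryNameNode
# HTTP port (50090 is the Hadoop 0.20 default).
netstat -tln 2>/dev/null | grep ':50090' || echo "port 50090 not in use"
```

If a leftover daemon does show up, running bin/stop-all.sh (or killing the listed PID) before re-running start-dfs.sh should free the port.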
Thanks,
KJV