core-site.xml
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://155.69.140.200:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/home/crawler/crawlzilla/workspace/nutch-crawler</value>
</property>
</configuration>
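(For what it's worth, here is how I understand I can sanity-check that the NameNode above is reachable — this is just the stock hadoop CLI, assuming it is on the PATH of the crawler user:)

# List the HDFS root via the NameNode configured above
hadoop dfs -ls hdfs://155.69.140.200:9000/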
mapred-site.xml
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>155.69.140.200:9001</value>
</property>
</configuration>
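Similarly, to confirm the JobTracker address is right, I believe a generic check (not Crawlzilla-specific) would be:

# Should contact the JobTracker at 155.69.140.200:9001 and list running jobs
hadoop job -list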
And when I crawl, I get this error:
error: hadoop dfs -mkdir /user/crawler/admin/testnode broken
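Running the failing step by hand might narrow it down. This is just my guess at what Crawlzilla is doing internally, based on the error line:

# Try the mkdir manually as the crawler user, then check ownership/permissions
hadoop dfs -mkdir /user/crawler/admin/testnode
hadoop dfs -ls /user/crawler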
Could anyone guide me through this?
The reason I ask: I already have Hadoop running with some other applications, and I hope to integrate Crawlzilla into it as well.
Thanks so much =)