02-July-2013 - Execute Hadoop Commands in VMware Player (Linux environment)


Raj

Jul 3, 2013, 7:03:02 PM7/3/13
to hadooponli...@googlegroups.com

How to access Hadoop in VMware player

1.       Open VMware player (Double click on VMware icon)

2.       Select Cloudera_training_VM_1.7

3.       Right click on Cloudera_training_VM_1.7 and click on “Play Virtual Machine”. It takes some time for the virtual machine to boot.

4.       Double click on “Terminal” to open a command prompt

a.       cd ..  -> a space is required between cd and .., otherwise the command will not work

b.      cd ..

c.       ls

d.      cd usr

e.      ls

f.        cd lib

g.       ls

h.      cd hadoop-0.20

i.         ls  -> now you can see all Hadoop-related jars and files
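The steps above can also be done in a single command. Assuming the Cloudera training VM's standard install path (/usr/lib/hadoop-0.20), the session looks like this:

[training@localhost ~]$ cd /usr/lib/hadoop-0.20

[training@localhost hadoop-0.20]$ ls

The conf subdirectory under this path is where the configuration files listed below live.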

 

Below are the important configuration files:

1.       core-site.xml

2.       hdfs-site.xml

3.       mapred-site.xml

4.       masters

5.       slaves

6.       hadoop-env.sh

7.       hadoop-policy.xml

8.       log4j.properties

9.       mapred-queue-acls.xml
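As an illustration, a minimal core-site.xml for a pseudo-distributed setup might look like the following (the exact value on the training VM may differ; fs.default.name tells Hadoop clients where the NameNode is):

<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>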

 

To access the Hadoop NameNode web UI, open the URL below in a browser inside the VM, not on your local system.

http://localhost:50070

 

Commands:

1.       To move a file from one directory to another: hadoop fs -mv <src> <destination>. For example:

[training@localhost conf]$ hadoop fs -mv /mapreduce/input/stocks /mapreduce/input/yash
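To confirm the move, list the destination directory (the path here is the one from the example above):

[training@localhost conf]$ hadoop fs -ls /mapreduce/input/yash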

 

2.       To make a directory

[training@localhost conf]$ hadoop fs -mkdir /yash  -> creates the directory /yash

 

3.       To remove a directory

[training@localhost conf]$ hadoop fs -rmr /yash  -> removes the directory recursively (in newer Hadoop releases this is hadoop fs -rm -r)

 

4.       To copy

[training@localhost conf]$ hadoop fs -cp <src> <dst>
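Putting commands 2–4 together, a short worked session might look like this (the /yash directory and the /mapreduce/input/stocks path are the examples used above):

[training@localhost conf]$ hadoop fs -mkdir /yash

[training@localhost conf]$ hadoop fs -cp /mapreduce/input/stocks /yash

[training@localhost conf]$ hadoop fs -ls /yash

[training@localhost conf]$ hadoop fs -rmr /yash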

5.       put

Usage: hdfs dfs -put <localsrc> ... <dst>

Copy single src, or multiple srcs from local file system to the destination file system. Also reads input from stdin and writes to destination file system.

·         hdfs dfs -put localfile /user/hadoop/hadoopfile

·         hdfs dfs -put localfile1 localfile2 /user/hadoop/hadoopdir

·         hdfs dfs -put localfile hdfs://nn.example.com/hadoop/hadoopfile

·         hdfs dfs -put - hdfs://nn.example.com/hadoop/hadoopfile  (reads the input from stdin)

Exit Code: Returns 0 on success and -1 on error.

6.       get

Usage: hdfs dfs -get [-ignorecrc] [-crc] <src> <localdst>

Copy files to the local file system. Files that fail the CRC check may be copied with the -ignorecrc option. Files and CRCs may be copied using the -crc option.

Example:

·         hdfs dfs -get /user/hadoop/file localfile

·         hdfs dfs -get hdfs://nn.example.com/user/hadoop/file localfile

Exit Code: Returns 0 on success and -1 on error.

 

 

 
