Cannot run hadoop commands from manifest file


Shouvanik Haldar

Mar 18, 2014, 2:38:55 AM3/18/14
to puppet...@googlegroups.com
Hi,

I have a basic hadoop copy command, e.g.
hadoop fs -copyToLocal s3://xx-xx-xxxx/scripts/sqoop-1.4.3-hadoop200.jar sqoop-1.4.3-hadoop200.jar

How can I execute this command from Puppet? I looked at some sites but could not find any help.
Please help me.

Regards,
Shouvanik

jcbollinger

Mar 19, 2014, 9:45:53 AM3/19/14
to puppet...@googlegroups.com


On Tuesday, March 18, 2014 1:38:55 AM UTC-5, Shouvanik Haldar wrote:
Hi,

I have a basic hadoop copy command, e.g.
hadoop fs -copyToLocal s3://xx-xx-xxxx/scripts/sqoop-1.4.3-hadoop200.jar sqoop-1.4.3-hadoop200.jar

How can I execute this command from Puppet? I looked at some sites but could not find any help.



Declare an appropriate Exec resource for the target node(s), something like:

exec { 'hadoop-get-scoop':
  path    => '/usr/bin:/bin:/usr/sbin:/sbin',
  command => 'hadoop fs -copyToLocal s3://xx-xx-xxxx/scripts/sqoop-1.4.3-hadoop200.jar /path/to/sqoop-1.4.3-hadoop200.jar',
  creates => '/path/to/sqoop-1.4.3-hadoop200.jar',
}

Alternatively, put the file in question on the master, and sync it via a File resource instead of via a hadoop command.
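For illustration, the File-resource approach might look like the following sketch. The module name ('hadoop'), destination path, and ownership are assumptions here, not values from your setup:

exec... I mean:

file { '/path/to/sqoop-1.4.3-hadoop200.jar':
  ensure => file,
  # Assumes the jar has been placed in a (hypothetical) 'hadoop' module
  # on the master, under its files/ directory.
  source => 'puppet:///modules/hadoop/sqoop-1.4.3-hadoop200.jar',
  owner  => 'hadoop',
  group  => 'hadoop',
  mode   => '0644',
}

This way Puppet manages the file's content and permissions directly, and no hadoop command needs to run on the agent at all.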


John

Shouvanik Haldar

Mar 21, 2014, 2:57:45 AM3/21/14
to puppet...@googlegroups.com
Thanks for the reply.
Somehow the hadoop command is not getting recognised. I am still getting this error:

err: /Stage[main]//Exec[hadoop-get-scoop]/returns: change from notrun to 0 failed: hadoop fs -copyToLocal s3://xxx-xxx-xxxx/scripts/mysql-connector-java.jar /home/hadoop/mysql-connector-java.jar returned 1 instead of one of [0] at /root/examples/download-s3files.pp:46

jcbollinger

Mar 21, 2014, 9:18:07 AM3/21/14
to puppet...@googlegroups.com


On Friday, March 21, 2014 1:57:45 AM UTC-5, Shouvanik Haldar wrote:
Thanks for the reply.
Somehow the hadoop command is not getting recognised. I am still getting this error:

err: /Stage[main]//Exec[hadoop-get-scoop]/returns: change from notrun to 0 failed: hadoop fs -copyToLocal s3://xxx-xxx-xxxx/scripts/mysql-connector-java.jar /home/hadoop/mysql-connector-java.jar returned 1 instead of one of [0] at /root/examples/download-s3files.pp:46



There are three main possibilities:
  1. The command is running fine, but it exits with code 1 instead of the expected success code, 0.  Generally speaking, Unix commands exit with code 0 when they complete successfully; if hadoop is an oddball that returns code 1 on success, use the Exec's 'returns' property (http://docs.puppetlabs.com/references/3.stable/type.html#exec-attribute-returns) to tell Puppet that's OK.
  2. The command is not running at all because it is not found in the specified path.  In my example I guessed at an appropriate path, but if your hadoop is installed elsewhere, you will need either to use a more appropriate 'path' parameter (http://docs.puppetlabs.com/references/3.stable/type.html#exec-attribute-path) or to give the full path to hadoop in the command.
  3. The 'hadoop' command is found but does not run successfully.  If the same command works when run manually, then the problem is likely the environment in which it runs.  Puppet intentionally provides a very sparse environment to the commands it spawns; anything else you need, such as particular environment variables, you have to arrange for in the Exec.  The 'path' and 'environment' (http://docs.puppetlabs.com/references/3.stable/type.html#exec-attribute-environment) parameters exist for that, or you can put the needed provisions directly in the command.
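Concretely, the three adjustments above could be sketched in one Exec. The extra path entry, the HADOOP_HOME and JAVA_HOME values, and the destination path are all assumptions for illustration; substitute whatever matches your installation:

exec { 'hadoop-get-scoop':
  # Possibility 2: include the directory where the hadoop wrapper
  # actually lives (location assumed here).
  path        => '/usr/bin:/bin:/usr/sbin:/sbin:/home/hadoop/bin',
  # Possibility 3: supply environment variables the hadoop script may
  # expect; these values are guesses, not known settings.
  environment => ['HADOOP_HOME=/home/hadoop', 'JAVA_HOME=/usr/lib/jvm/java'],
  command     => 'hadoop fs -copyToLocal s3://xx-xx-xxxx/scripts/sqoop-1.4.3-hadoop200.jar /home/hadoop/sqoop-1.4.3-hadoop200.jar',
  creates     => '/home/hadoop/sqoop-1.4.3-hadoop200.jar',
  # Possibility 1: uncomment only if hadoop genuinely exits 1 on success.
  # returns   => [0, 1],
}

Running the failing command manually as root (the user Puppet runs as), with an emptied environment via 'env -i', is a quick way to tell which of the three cases you are in.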

Good luck,

John
