Infinite loop when using rhwatch::setup


nae...@ic.ufal.br

May 24, 2015, 7:10:54 PM
to rh...@googlegroups.com
I have a file "processed_input_rdt.Rdata" in the root folder on HDFS, and I want this file to be available in all map tasks.
I'm trying to do it with a global variable via setup, as seen below:

mr <- rhwatch(
  setup    = expression(rhload("processed_input_rdt.Rdata")),
  map      = map,
  reduce   = reduce,
  input    = rhfmt(processed_input_tbl, type = "text"),
  output   = rhfmt(output, type = "text"),
  readback = FALSE
)



But I'm getting this error:
> source("workbench.R")
Error: evaluation nested too deeply: infinite recursion / options(expressions=)?

The code runs OK when I remove the setup statement. What am I doing wrong?



Saptarshi Guha

May 26, 2015, 11:01:06 AM
to rh...@googlegroups.com

Instead, do this:

rhwatch(..., shared = "path-on-hdfs-to-processed_input_rdt.Rdata",
        setup = expression(map = { load("processed_input_rdt.Rdata") }))

(Assuming you need the .Rdata file in the map phase. If you need it in the reduce phase, remove the "map =" bit.)

Also, in the MapReduce job (that is, in the map and reduce code), do not use any RHIPE function other than rhcollect, rhcounter, and rhstatus. In fact, do not run library(Rhipe).

So I presume your .Rdata file is not an image saved after RHIPE had been loaded, because if you try to load such a file in the MapReduce job, R will try to load the RHIPE library, and that causes issues.
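Putting the advice above together, a minimal sketch of the corrected call might look like this. The HDFS path, input/output names, and the map and reduce functions are placeholders, and the sketch assumes a working Hadoop cluster with RHIPE installed, so treat it as an illustration rather than a drop-in script:

```r
library(Rhipe)   # load RHIPE on the client side only, never inside map/reduce
rhinit()

mr <- rhwatch(
  map      = map,
  reduce   = reduce,
  input    = rhfmt(processed_input_tbl, type = "text"),
  output   = rhfmt(output, type = "text"),
  # ship the file to every task's working directory via the distributed cache...
  shared   = "/path/on/hdfs/processed_input_rdt.Rdata",
  # ...and load it with base R's load(), not rhload()
  setup    = expression(map = { load("processed_input_rdt.Rdata") }),
  readback = FALSE
)
```

The key change from the original post is that setup uses base R's load() on the locally cached copy, instead of calling a RHIPE function inside the job.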

Feel free to ask questions for further clarification.

Cheers
Saptarshi


--

---
You received this message because you are subscribed to the Google Groups "rhipe" group.
To unsubscribe from this group and stop receiving emails from it, send an email to rhipe+un...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

nae...@ic.ufal.br

Jun 2, 2015, 1:15:25 PM
to rh...@googlegroups.com
Hey.
Thanks! That really helped; the code works here now.

yi-chia wang

Jul 5, 2015, 9:20:50 PM
to rh...@googlegroups.com
When I type this:

$ R CMD INSTALL Rhipe_0.73.1.tar.gz

the error message is:

** R
** inst
** preparing package for lazy loading
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded
Rhipe requires HADOOP_HOME or HADOOP or HADOOP_BIN environment variable to be present
 $HADOOP/bin/hadoop or $HADOOP_BIN/hadoop should exist
Rhipe: HADOOP_BIN is missing, using $HADOOP/bin
HADOOP_HOME missing
HADOOP_CONF_DIR missing, you are probably going to have a problem running RHIPE.
HADOOP_CONF_DIR should be the location of the directory that contains the configuration files

------------------------------------------------
| Please call rhinit() else RHIPE will not run |
------------------------------------------------

How can I set the Hadoop environment variables for RHIPE?

I can't solve this problem. Can anyone help me?

Thanks!
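The install output above names the variables RHIPE looks for. They can typically be exported before installing; the paths below are hypothetical examples, so substitute your cluster's actual locations:

```shell
# Hypothetical paths: replace with your cluster's actual install locations.
# Put these in ~/.bashrc (or export them in the shell before R CMD INSTALL).
export HADOOP_HOME=/usr/lib/hadoop          # root of the Hadoop installation
export HADOOP_BIN="$HADOOP_HOME/bin"        # must contain the `hadoop` script
export HADOOP_CONF_DIR=/etc/hadoop/conf     # holds core-site.xml, hdfs-site.xml, ...
```

After exporting these, rerun `R CMD INSTALL Rhipe_0.73.1.tar.gz` and call `rhinit()` in R.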


On Tuesday, May 26, 2015 at 11:01:06 AM UTC-4, Fishtank wrote:

