Error hb.get.data.frame from large table


Adamant

Jan 26, 2016, 9:12:24 AM1/26/16
to RHadoop

Hi there,
I'm quite new to Hadoop and R, and I hoped somebody could help me with a little problem I encountered.
When I try to use hb.get.data.frame (from rhbase) on a large table (more than 10,000,000 rows), I get this error message:


Error in scn$get(batchsize) :
  rhbase<hbScannerGetList>:: (IOError) Default TException.

I read that it must be a timeout problem, but I have no idea how to solve it or where to look.
By the way, it works fine with smaller tables.
My program for this looks like:

# Create an iterator over the table; each call returns the next batch of rows
datagetter <- hb.get.data.frame(tablename = 'tblname', start = '0', columns = c("column1", ...))

# Accumulate batches until the iterator is exhausted (returns NULL)
Rtablename <- data.frame()
while (!is.null(databuffer <- datagetter())) {
  Rtablename <- rbind(Rtablename, databuffer)
}

I hope someone has an idea what's going on.
Greetings and thanks!
