Read from HDFS


Venkatesh Sambandamoorthy

Feb 5, 2015, 6:49:05 PM2/5/15
to hipi-...@googlegroups.com
Hi,

I am trying to get started with HIPI. The CreateHipiImageBundle program works fine if I give it a path on my local filesystem, but it throws a NullPointerException if I give it an HDFS path. Can you help me with that? I am using Hadoop 2.6.0.

Thanks
Venkatesh

Chris Sweeney

Feb 6, 2015, 2:50:54 AM2/6/15
to hipi-...@googlegroups.com
Are you sure you are specifying the HDFS path correctly? There are some simple tests you can do to make sure it is correct independently of HIPI (e.g., try to read/write a file at that path).
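
For example, something along these lines (untested, and the path argument is just a placeholder for your actual HDFS URI) will tell you whether the path is readable and writable outside of HIPI:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsPathCheck {
      public static void main(String[] args) throws Exception {
        // Pass your HDFS URI as the first argument,
        // e.g. hdfs://namenode:port/user/you/test.txt (placeholder)
        Path path = new Path(args[0]);
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(args[0]), conf);

        // Try to write a small file at that path...
        FSDataOutputStream out = fs.create(path, true);
        out.writeUTF("hello hdfs");
        out.close();

        // ...and confirm it is visible again.
        System.out.println("exists: " + fs.exists(path));
      }
    }

If that fails too, the problem is with the HDFS setup or the URI rather than with HIPI.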

Venkatesh Sambandamoorthy

Feb 10, 2015, 12:37:56 AM2/10/15
to hipi-...@googlegroups.com, cmsw...@cs.ucsb.edu
Chris,

Yup, I am able to run the WordCount example without any issues. I am very new to Hadoop and HIPI. If I can ask a favour, can you explain the mapper and reducer on the getting-started page for computing the average image?

Thanks
Venkatesh

Chris Sweeney

Feb 10, 2015, 2:17:32 AM2/10/15
to hipi-...@googlegroups.com
Hi Venkatesh,

From the website: "In the map phase, we compute average color of an image, and in the reduce phase, we sum up the average color to compute the total average image color."
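
Roughly speaking, the reduce phase looks like the fragment below. This is only a sketch, not the exact source on the website: I am assuming the same FloatImage helpers the mapper uses (getData() and scale()) and a Text output value. It sums the per-image averages emitted under key 0 and divides by the number of images:

    public void reduce(IntWritable key, Iterable<FloatImage> values, Context context)
        throws IOException, InterruptedException {
      // Accumulate the per-image average colors emitted by the mappers
      FloatImage total = new FloatImage(1, 1, 3);
      float[] totalData = total.getData();
      int count = 0;
      for (FloatImage val : values) {
        float[] valData = val.getData();
        totalData[0] += valData[0];
        totalData[1] += valData[1];
        totalData[2] += valData[2];
        count++;
      }
      if (count > 0) {
        // Divide by the number of images to get the overall average color
        total.scale(1.0f / count);
        context.write(key, new Text(totalData[0] + " " + totalData[1] + " " + totalData[2]));
      }
    }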

Venkatesh Sambandamoorthy

Feb 10, 2015, 12:59:13 PM2/10/15
to hipi-...@googlegroups.com, cmsw...@cs.ucsb.edu

That's fine. Can you please briefly explain the code below?

 // Only process non-degenerate 3-band (RGB) images
 if (value != null && value.getWidth() > 1 && value.getHeight() > 1 && value.getBands() == 3) {
   // 1x1, 3-band image used as an accumulator for the pixel sum
   FloatImage avg = new FloatImage(1, 1, 3);
   float[] avgData = avg.getData();
   float[] valData = value.getData();
   for (int i = 0; i < value.getWidth(); i++) {
     for (int j = 0; j < value.getHeight(); j++) {
       // Pixel (i, j) is stored as 3 consecutive floats (R, G, B)
       avgData[0] += valData[i * value.getHeight() * 3 + j * 3];
       avgData[1] += valData[i * value.getHeight() * 3 + j * 3 + 1];
       avgData[2] += valData[i * value.getHeight() * 3 + j * 3 + 2];
     }
   }
   // Divide the sum by the pixel count to get this image's mean color
   avg.scale(1.0f / (value.getWidth() * value.getHeight()));
   // Emit every image's average under the same key so a single reducer sees them all
   context.write(new IntWritable(0), avg);
 }




Also, I am getting the exception below when I run the FirstProgram:

Class hipi.imagebundle.mapreduce.output.BinaryOutputFormat not found

I am sure I included the jar, because the tool example for creating the HIB file worked.

Thanks
Venkatesh

Chris Sweeney

Feb 10, 2015, 1:01:50 PM2/10/15
to hipi-...@googlegroups.com
This simply sums all the pixels in an image and then divides by the number of pixels to get the image's mean color.

Venkatesh Sambandamoorthy

Feb 10, 2015, 1:44:23 PM2/10/15
to hipi-...@googlegroups.com, cmsw...@cs.ucsb.edu
Thanks, that helps.

Do you have any idea why I am getting a ClassNotFoundException for BinaryOutputFormat? I am sure I have included the HIPI jar in the Hadoop classpath. I was able to successfully execute the tools example.

2015-02-09 13:25:22,056 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred newApiCommitter.
2015-02-09 13:25:22,878 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config null
2015-02-09 13:25:22,926 INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hipi.imagebundle.mapreduce.output.BinaryOutputFormat not found
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hipi.imagebundle.mapreduce.output.BinaryOutputFormat not found
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:472)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:452)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1541)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:452)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:371)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1499)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1496)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1429)

Chris Sweeney

Feb 10, 2015, 1:57:35 PM2/10/15
to hipi-...@googlegroups.com
Hi,

This error only occurs when your classpath is broken. I believe the examples work because the classpath is set in the included build. I am not a Java expert, but maybe this post will help?
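
One common cause (an assumption on my part, not something specific to HIPI) is that the jar is only on your client's classpath and never gets shipped to the cluster, so the MRAppMaster in your log cannot load BinaryOutputFormat. A driver-side fragment like the one below makes sure the HIPI jar travels with the job; the HDFS jar path is only a placeholder and the jar must already be uploaded there:

    // In the FirstProgram job driver, before submitting the job:
    job.setJarByClass(FirstProgram.class);
    // Ship the HIPI jar to the cluster and add it to the task classpath
    // (placeholder path; the jar must already exist at this HDFS location)
    job.addFileToClassPath(new Path("/user/venkatesh/lib/hipi.jar"));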
