MapReduce execution error in distributed mode


steph...@gmail.com

Aug 15, 2017, 2:59:47 AM
to CDAP User
After my testing, my workflow with two MapReduce jobs was running successfully in standalone mode.
When I deployed it to distributed mode, it ran into a ClassNotFoundException.

ClassNotFoundException: org.apache.hadoop.mapreduce.lib.output.TextOutputFormat

================code==================
// This is the service code: in the handler, I create the dataset and then send an HTTP request to start the workflow.

DatasetProperties datasetProperties = FileSetProperties.builder()
            .setInputFormat(TextInputFormat.class)
            .setOutputFormat(TextOutputFormat.class)
            .build();

// create the dataset at runtime
getContext().getAdmin().createDataset(datasetName,
                "co.cask.cdap.api.dataset.table.Table",
                datasetProperties);

sendHttpRequestToStartWorkflow();
================code over==============

OK, so I changed some pom scopes as below:

<dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>${hadoop.version}</version>
      <!-- changed from provided -->
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-common</artifactId>
      <version>${hadoop.version}</version>
      <!-- changed from provided -->
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <version>${hadoop.version}</version>
      <!-- changed from provided -->
      <scope>compile</scope>
    </dependency>

The service is OK, and my workflow shows up in the YARN manager too, but the MapReduce job cannot be started.

Below is the workflow log; the same error also occurs when I start a MapReduce program manually via REST:
java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_91]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_91]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1255) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284) ~[spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar:1.6.2.2.5.3.0-37]
at co.cask.cdap.internal.app.runtime.batch.MapReduceRuntimeService.startUp(MapReduceRuntimeService.java:339) ~[na:na]
at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:47) ~[com.google.guava.guava-13.0.1.jar:na]
at co.cask.cdap.internal.app.runtime.batch.MapReduceRuntimeService$1$1.run(MapReduceRuntimeService.java:449) [na:na]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]

I have no idea about this. I tried running a plain MapReduce job in our environment, and it works fine.

shankar

Aug 15, 2017, 2:23:17 PM
to CDAP User
Hi Stephen,

You shouldn't have to change the scope from provided to compile for the mapreduce-client libraries. 

Could you provide the complete logs and stack trace from when you were getting the ClassNotFoundException for the TextOutputFormat class? If you can attach the MapReduce container logs, that will help with debugging.

Thanks
Shankar

steph...@gmail.com

Aug 15, 2017, 9:15:20 PM
to CDAP User
Hi Shankar,
  The attachment is our master log. Here I use a classloader hack; the error still occurs without the hack code (see the comments below).
  
===========code================
private DatasetProperties createDatasetProperties(String basePath, String des,
                                                      boolean enableExploreOnCreate,
                                                      Map<String, String> properties) {
  
      ClassLoader classLoader = Thread.currentThread().getContextClassLoader();
      Thread.currentThread().setContextClassLoader(TextOutputFormat.class.getClassLoader()); // with the hack, the error points here
      try {
        FileSetProperties.Builder builder = FileSetProperties.builder()
            .setInputFormat(TextInputFormat.class)
            .setOutputFormat(TextOutputFormat.class) // without the classloader hack, the error points to this line
            .setEnableExploreOnCreate(enableExploreOnCreate);
        if (StringUtils.isNotBlank(basePath)) builder.setBasePath(String.format("temp/%s", basePath));
        if (StringUtils.isNotBlank(des)) builder.setDescription("temp dataset");
        if (properties != null) builder.addAll(properties);
        return builder.build();
      } finally {
        Thread.currentThread().setContextClassLoader(classLoader);
      }
    }
  
===========code over============
Thanks.
master-cdap.log

steph...@gmail.com

Aug 15, 2017, 9:29:40 PM
to CDAP User
Because the error occurs in the service, it has not started my workflow yet. Our design flow is:
   front end sends REST to the service -> the service creates the dataset -> the service sends REST to the CDAP app fabric to start the workflow
So there are no MapReduce container logs here.

shankar

Aug 15, 2017, 10:10:12 PM
to CDAP User
Hi,

You can try using the method setOutputFormat(String className) instead of setOutputFormat(Class<?> outputFormatClass) in your service.

E.g.: setOutputFormat("org.apache.hadoop.mapreduce.lib.output.TextOutputFormat")

You can keep the scope of the mapreduce-client libraries as provided for the MR job to run successfully.
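
Applied to the createDatasetProperties helper from the earlier message, that would look roughly like this (a sketch only: it assumes the matching String overload of setInputFormat is available as well, and it drops the classloader hack since the service no longer needs to load the Hadoop format classes):

private DatasetProperties createDatasetProperties(String basePath, String des,
                                                  boolean enableExploreOnCreate,
                                                  Map<String, String> properties) {
  // Pass class names instead of Class objects so the service classloader
  // never has to resolve the Hadoop input/output format classes.
  FileSetProperties.Builder builder = FileSetProperties.builder()
      .setInputFormat("org.apache.hadoop.mapreduce.lib.input.TextInputFormat")
      .setOutputFormat("org.apache.hadoop.mapreduce.lib.output.TextOutputFormat")
      .setEnableExploreOnCreate(enableExploreOnCreate);
  if (StringUtils.isNotBlank(basePath)) {
    builder.setBasePath(String.format("temp/%s", basePath));
  }
  if (StringUtils.isNotBlank(des)) {
    builder.setDescription("temp dataset");
  }
  if (properties != null) {
    builder.addAll(properties);
  }
  return builder.build();
}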

Reference: 


Thanks
Shankar

steph...@gmail.com

Aug 15, 2017, 11:15:03 PM
to CDAP User
Hi Shankar,
  Many thanks, the MapReduce job finally worked, but there are still some errors.
  Here is the MapReduce container log.

2017-08-16 11:10:02,572 - INFO  [main:c.c.c.i.a.r.b.MapReduceClassLoader@297] - Create ProgramClassLoader from /bigdata/hadoop/yarn/local/usercache/yarn/appcache/application_1501047275094_0247/container_e01_1501047275094_0247_01_000001/program.jar, expand to /bigdata/hadoop/yarn/local/usercache/yarn/appcache/application_1501047275094_0247/container_e01_1501047275094_0247_01_000001/1502853002570-0
2017-08-16 11:10:05,727 - INFO  [authorization-enforcement-service:c.c.c.s.a.AbstractAuthorizationService@102] - Started authorization enforcement service...
2017-08-16 11:10:05,729 - INFO  [DistributedMapReduceTaskContextProvider STARTING:c.c.c.l.a.LogAppenderInitializer@62] - Initializing log appender KafkaLogAppender
2017-08-16 11:10:05,807 - INFO  [main:c.c.c.i.a.r.b.d.MapReduceContainerLauncher@100] - Launch main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main([])
2017-08-16 11:10:08,153 - INFO  [main:o.m.log@67] - Logging to Logger[org.mortbay.log] via org.mortbay.log.Slf4jLog
2017-08-16 11:10:08,357 - INFO  [main:o.m.log@67] - jetty-6.1.26.hwx
2017-08-16 11:10:08,387 - INFO  [main:o.m.log@67] - Extract jar:file:/bigdata/hadoop/yarn/local/filecache/14/mapreduce.tar.gz/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.6.1.0-129.jar!/webapps/mapreduce to /bigdata/hadoop/yarn/local/usercache/yarn/appcache/application_1501047275094_0247/container_e01_1501047275094_0247_01_000001/tmp/Jetty_0_0_0_0_33260_mapreduce____.kbums3/webapp
2017-08-16 11:10:08,674 - WARN  [main:o.m.log@76] - failed guice: com.sun.jersey.spi.service.ServiceConfigurationError: com.sun.jersey.spi.HeaderDelegateProvider: The class com.sun.jersey.core.impl.provider.header.LocaleProvider implementing provider interface com.sun.jersey.spi.HeaderDelegateProvider could not be instantiated: Cannot cast com.sun.jersey.core.impl.provider.header.LocaleProvider to com.sun.jersey.spi.HeaderDelegateProvider
2017-08-16 11:10:08,675 - WARN  [main:o.m.log@76] - failed org.mortbay.jetty.webapp.WebAppContext@1104cf3a{/,jar:file:/bigdata/hadoop/yarn/local/filecache/14/mapreduce.tar.gz/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.6.1.0-129.jar!/webapps/mapreduce}: com.sun.jersey.spi.service.ServiceConfigurationError: com.sun.jersey.spi.HeaderDelegateProvider: The class com.sun.jersey.core.impl.provider.header.LocaleProvider implementing provider interface com.sun.jersey.spi.HeaderDelegateProvider could not be instantiated: Cannot cast com.sun.jersey.core.impl.provider.header.LocaleProvider to com.sun.jersey.spi.HeaderDelegateProvider
2017-08-16 11:10:08,680 - WARN  [main:o.m.log@76] - failed ContextHandlerCollection@6b8a9e1: com.sun.jersey.spi.service.ServiceConfigurationError: com.sun.jersey.spi.HeaderDelegateProvider: The class com.sun.jersey.core.impl.provider.header.LocaleProvider implementing provider interface com.sun.jersey.spi.HeaderDelegateProvider could not be instantiated: Cannot cast com.sun.jersey.core.impl.provider.header.LocaleProvider to com.sun.jersey.spi.HeaderDelegateProvider
2017-08-16 11:10:08,692 - ERROR [main:o.m.log@87] - Error starting handlers
com.sun.jersey.spi.service.ServiceConfigurationError: com.sun.jersey.spi.HeaderDelegateProvider: The class com.sun.jersey.core.impl.provider.header.LocaleProvider implementing provider interface com.sun.jersey.spi.HeaderDelegateProvider could not be instantiated: Cannot cast com.sun.jersey.core.impl.provider.header.LocaleProvider to com.sun.jersey.spi.HeaderDelegateProvider
	at com.sun.jersey.spi.service.ServiceFinder.fail(ServiceFinder.java:602) ~[jersey-core-1.9.jar:1.9]
	at com.sun.jersey.spi.service.ServiceFinder.access$800(ServiceFinder.java:159) ~[jersey-core-1.9.jar:1.9]
	at com.sun.jersey.spi.service.ServiceFinder$LazyObjectIterator.hasNext(ServiceFinder.java:892) ~[jersey-core-1.9.jar:1.9]
	at com.sun.jersey.core.spi.factory.AbstractRuntimeDelegate.<init>(AbstractRuntimeDelegate.java:76) ~[jersey-core-1.9.jar:1.9]
	at com.sun.jersey.server.impl.provider.RuntimeDelegateImpl.<init>(RuntimeDelegateImpl.java:54) ~[jersey-server-1.9.jar:1.9]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_131]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_131]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_131]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_131]
	at java.lang.Class.newInstance(Class.java:442) ~[na:1.8.0_131]
	at javax.ws.rs.ext.FactoryFinder.newInstance(FactoryFinder.java:117) ~[javax.ws.rs.javax.ws.rs-api-2.0.jar:2.0]
	at javax.ws.rs.ext.FactoryFinder.find(FactoryFinder.java:165) ~[javax.ws.rs.javax.ws.rs-api-2.0.jar:2.0]
	at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:135) ~[javax.ws.rs.javax.ws.rs-api-2.0.jar:2.0]
	at javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:120) ~[javax.ws.rs.javax.ws.rs-api-2.0.jar:2.0]
	at javax.ws.rs.core.EntityTag.<clinit>(EntityTag.java:56) ~[javax.ws.rs.javax.ws.rs-api-2.0.jar:2.0]
	at java.lang.Class.forName0(Native Method) ~[na:1.8.0_131]
	at java.lang.Class.forName(Class.java:264) ~[na:1.8.0_131]
	at com.sun.proxy.$Proxy52.<clinit>(Unknown Source) ~[na:na]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_131]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_131]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_131]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_131]
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:739) ~[na:1.8.0_131]
	at com.sun.jersey.server.impl.application.WebApplicationImpl.createProxy(WebApplicationImpl.java:1550) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.server.impl.application.WebApplicationImpl.<init>(WebApplicationImpl.java:321) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.server.impl.container.WebApplicationProviderImpl.createWebApplication(WebApplicationProviderImpl.java:55) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.spi.container.WebApplicationFactory.createWebApplication(WebApplicationFactory.java:66) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.spi.container.servlet.ServletContainer.create(ServletContainer.java:391) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.create(ServletContainer.java:306) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:607) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:210) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:373) ~[jersey-server-1.9.jar:1.9]
	at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:710) ~[jersey-server-1.9.jar:1.9]
	at com.google.inject.servlet.FilterDefinition.init(FilterDefinition.java:114) ~[guice-servlet-3.0.jar:na]
	at com.google.inject.servlet.ManagedFilterPipeline.initPipeline(ManagedFilterPipeline.java:98) ~[guice-servlet-3.0.jar:na]
	at com.google.inject.servlet.GuiceFilter.init(GuiceFilter.java:172) ~[guice-servlet-3.0.jar:na]
	at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50) [jetty-util-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:713) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.servlet.Context.startContext(Context.java:140) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1282) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:518) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:499) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50) [jetty-util-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50) [jetty-util-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.jetty.Server.doStart(Server.java:224) ~[jetty-6.1.26.hwx.jar:6.1.26.hwx]
	at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50) [jetty-util-6.1.26.hwx.jar:6.1.26.hwx]
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:921) [hadoop-common-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:398) [hadoop-yarn-common-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.mapreduce.v2.app.client.MRClientService.serviceStart(MRClientService.java:142) [hadoop-mapreduce-client-app-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) [hadoop-common-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1149) [hadoop-mapreduce-client-app-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) [hadoop-common-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$5.run(MRAppMaster.java:1560) [hadoop-mapreduce-client-app-2.7.3.2.6.1.0-129.jar:na]
	at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_131]
	at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_131]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) [hadoop-common-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1556) [hadoop-mapreduce-client-app-2.7.3.2.6.1.0-129.jar:na]
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1487) [hadoop-mapreduce-client-app-2.7.3.2.6.1.0-129.jar:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
	at co.cask.cdap.internal.app.runtime.batch.distributed.MapReduceContainerLauncher.launch(MapReduceContainerLauncher.java:101) [co.cask.cdap.cdap-app-fabric-4.1.0.jar:na]
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(Unknown Source) [hadoop-mapreduce-client-app-2.7.3.2.6.1.0-129.jar:na]
Caused by: java.lang.ClassCastException: Cannot cast com.sun.jersey.core.impl.provider.header.LocaleProvider to com.sun.jersey.spi.HeaderDelegateProvider
	at java.lang.Class.cast(Class.java:3369) ~[na:1.8.0_131]
	at com.sun.jersey.spi.service.ServiceFinder$LazyObjectIterator.hasNext(ServiceFinder.java:851) ~[jersey-core-1.9.jar:1.9]
	... 65 common frames omitted
2017-08-16 11:10:08,707 - INFO  [main:o.m.log@67] - Started HttpServer2$SelectChannelConne...@0.0.0.0:33260
2017-08-16 11:10:09,198 - WARN  [main:o.a.h.m.v.a.MRAppMaster@91] - log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
2017-08-16 11:10:09,199 - WARN  [main:o.a.h.m.v.a.MRAppMaster@91] - log4j:WARN Please initialize the log4j system properly.
2017-08-16 11:10:09,199 - WARN  [main:o.a.h.m.v.a.MRAppMaster@91] - log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
2017-08-16 11:10:09,209 - INFO  [DistributedMapReduceTaskContextProvider STOPPING:c.c.c.l.a.LogAppenderInitializer@92] - Stopping log appender KafkaLogAppender
2017-08-16 11:10:09,227 - INFO  [DistributedMapReduceTaskContextProvider STOPPING:c.c.c.l.a.LogAppenderInitializer@94] - Done stopping log appender KafkaLogAppender
2017-08-16 11:10:09,230 - INFO  [ZKKafkaClientService STOPPING:o.a.t.i.k.c.ZKKafkaClientService@106] - Stopping KafkaClientService
2017-08-16 11:10:09,231 - INFO  [ZKKafkaClientService STOPPING:o.a.t.i.k.c.ZKKafkaClientService@114] - KafkaClientService stopped
2017-08-16 11:10:09,240 - INFO  [authorization-enforcement-service:c.c.c.s.a.AbstractAuthorizationService@139] - Shutdown authorization enforcement service successfully.

steph...@gmail.com

Aug 15, 2017, 11:28:34 PM
to CDAP User
The attachments are the output of running "jar -tf" on our app jars. Does anything look strange there?


iua-datapreparation-4.1.0-jartf.txt
iua-common-4.1.0-jartf.txt

shankar

Aug 31, 2017, 8:01:54 PM
to CDAP User
Hi Stephen,

I noticed you have a dependency on jersey-core in your application. Can you remove the dependency on jersey-core-1.9? The jersey-core jar already comes from the Hadoop libraries, and the additional copy in your application (multiple jars on the classpath) is the reason behind the ClassCastException.

If you remove the jersey-core dependency from your application, it should proceed to run the MR program.
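
If jersey-core is being pulled in transitively rather than declared directly, one option is a Maven exclusion on whichever dependency brings it in, roughly along these lines (the outer groupId/artifactId/version here are placeholders for that dependency):

    <dependency>
      <groupId>some.group</groupId>          <!-- placeholder -->
      <artifactId>some-artifact</artifactId> <!-- placeholder -->
      <version>${some.version}</version>
      <exclusions>
        <exclusion>
          <groupId>com.sun.jersey</groupId>
          <artifactId>jersey-core</artifactId>
        </exclusion>
      </exclusions>
    </dependency>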

Thanks
Shankar