java.lang.AbstractMethodError: co.cask.cdap.etl.spark.Compat$1.call(Ljava/lang/Object;)Ljava/util/Iterator


arch...@gmail.com

May 14, 2018, 5:29:40 AM
to CDAP User
Hi, 

I'm getting this error:

java.lang.AbstractMethodError: co.cask.cdap.etl.spark.Compat$1.call(Ljava/lang/Object;)Ljava/util/Iterator
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125) ~[co.cask.cdap.spark-assembly-2.1.0.jar:na]
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125) ~[co.cask.cdap.spark-assembly-2.1.0.jar:na]
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434) ~[co.cask.cdap.spark-assembly-2.1.0.jar:na]

This happens when I submit a job to standalone CDAP using the Spark DecisionTreeTrainer plugin, with app.program.spark.compat set to spark2_2.11.

Any ideas?

Regards
Chetana


maev...@gmail.com

May 17, 2018, 9:24:03 AM
to CDAP User

arch...@gmail.com

May 17, 2018, 9:33:30 PM
to CDAP User
Please share the details; that link just takes me to the CDAP users forum.

Regards,
Chetana

maev...@gmail.com

May 18, 2018, 3:36:47 AM
to CDAP User
"Hi Salt,

The error is due to the wrong version of the system cdap-data-streams jar being used. The first time CDAP starts up, it loads a set of system artifacts, using the configured Spark version to pick Spark-specific builds of the cdap-data-pipeline and cdap-data-streams artifacts. If you have changed Spark versions since then, you will need to delete those artifacts first, then reload them in order to pick up the change:


curl -X DELETE <host>:<port>/v3/namespaces/system/artifacts/cdap-data-pipeline/versions/4.2.0
curl -X DELETE <host>:<port>/v3/namespaces/system/artifacts/cdap-data-streams/versions/4.2.0

After that is done, you can reload the system artifacts using https://docs.cask.co/cdap/4.2.0/en/reference-manual/http-restful-api/artifact.html#H2858:

curl -X POST <host>:<port>/v3/namespaces/system/artifacts
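
To verify the reload picked up the right Spark variant, you can list the system artifacts afterwards with a GET against the same artifacts endpoint (standard artifact-listing call in the API linked above):

curl <host>:<port>/v3/namespaces/system/artifacts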


As for your plugin, you shouldn't bundle hydrator-spark-core in it, as it contains internal code used to run the actual pipeline. That dependency should have test scope in your pom.
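
Roughly like this in the plugin's pom.xml (a sketch only; the version, and whether you need hydrator-spark-core2 for Spark 2, depend on your CDAP release):

<!-- test scope keeps hydrator-spark-core out of the bundled plugin jar;
     coordinates shown are illustrative, match them to your CDAP version -->
<dependency>
  <groupId>co.cask.cdap</groupId>
  <artifactId>hydrator-spark-core</artifactId>
  <version>4.2.0</version>
  <scope>test</scope>
</dependency>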

Regards,
Albert"