How does Azkaban run a Spark job? Is there any difference between calling a Spark task via the shell (type=command) and via type=spark?

orzso...@gmail.com
May 17, 2016, 11:55:14 PM
to azkaban
Azkaban is now at version 3.0.0, and the source supports a Spark jobtype, but there is no documentation describing how to use it. Does anyone know how to configure such a job? I would like to know whether running a job with type=command differs in any way from running it with type=spark.

type=spark
dependencies=wordcount1
job.class=com.test.spark.JavaWordCount

master=local[2]
class=com.mapbar.spark.streaming.JavaWordCount
executor-cores=2
num-executors=2
executor-memory=512M
name=wordcount
conf.spark.serializer=org.apache.spark.serializer.KryoSerializer

main.args=${param.inData} ${param.outData}

force.output.overwrite=true

input.path=${param.inData}
output.path=${param.outData}

==================================================
type=command
command=spark-submit \
  --master local[2] \
  --jars $LIBJARS \
  --class com.test.spark.JavaWordCount \
  --executor-cores 2 \
  --num-executors 2 \
  --executor-memory 512M \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --name $name \
  $JAR_PATH \
  $lyd $ldd $lyc $ldc $hour_topic $day_topic
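As far as I understand, the spark jobtype ultimately launches the same spark-submit process; it just assembles the command line from the .job properties for you (plus classpath/Hadoop setup from the plugin config). The sketch below is a hypothetical illustration of that mapping, using the property values from the question; JAR_PATH is a placeholder, and the exact property-to-flag translation depends on the plugin version.

```shell
#!/bin/sh
# Hypothetical sketch: build the spark-submit line that the spark jobtype
# would effectively run, from the same properties as the .job file above.
MASTER="local[2]"
CLASS="com.test.spark.JavaWordCount"
EXECUTOR_CORES=2
NUM_EXECUTORS=2
EXECUTOR_MEMORY=512M
NAME=wordcount
JAR_PATH=./wordcount.jar   # placeholder path

CMD="spark-submit --master $MASTER --class $CLASS \
--executor-cores $EXECUTOR_CORES --num-executors $NUM_EXECUTORS \
--executor-memory $EXECUTOR_MEMORY \
--conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
--name $NAME $JAR_PATH"

# Print instead of executing, so the mapping is visible without a cluster.
echo "$CMD"
```

If that assumption holds, the practical differences are mainly operational: type=spark keeps the Spark parameters as structured job properties (so they can be overridden per flow), while type=command gives you full control of the shell line but no help from the plugin.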