Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [-1]
I am trying to execute a Spark job through Oozie on a YARN cluster. Here is my workflow.xml:
<workflow-app xmlns="uri:oozie:workflow:0.5" name="samplewf1">
    <start to="myfirstsparkjob"/>
    <action name="myfirstsparkjob">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobtracker}</job-tracker>
            <name-node>${namenode}</name-node>
            <configuration>
                <property>
                    <name>mapreduce.job.queuename</name>
                    <value>${queuename}</value>
                </property>
            </configuration>
            <master>yarn-cluster</master>
            <mode>cluster</mode>
            <name>spark example</name>
            <class>class_path</class>
            <jar>hdfs_path_to_jar</jar>
            <spark-opts>--conf spark.hadoop.yarn.resourcemanager.address=your-rm:8050</spark-opts>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>workflow failed</message>
    </kill>
    <end name="end"/>
</workflow-app>
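Since the launcher dies inside SparkMain before the application starts, one common cause is a missing or stale Spark sharelib on the Oozie server. A quick sanity check, assuming the Oozie server listens on host port 11000 (substitute your own URL):

```shell
# Ask the Oozie server which sharelibs it has loaded; "spark" should
# appear in the output, with jars matching your Spark 1.6.0.2.4 install.
# The -oozie URL is an assumption; point it at your Oozie server.
oozie admin -oozie http://host:11000/oozie -shareliblist spark
```

If the list is empty or points at an old Spark version, refreshing it with `oozie admin -sharelibupdate` (after installing the sharelib in HDFS) is usually the next step.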
The jar mentioned above contains the required dependencies. I am using the following versions: Spark 1.6.0.2.4, Oozie 4.2.0.2.4, HDP 2.4.0.0-169.
I am submitting the Oozie job as the yarn user. Below is my job.properties configuration:
namenode=hdfs://host:8020
jobtracker=host:8050
oozie.use.system.libpath=true
userloc=/user/oozie
oozie.libpath=${namenode}${userloc}/share/lib
currtimescript=sample.sh
currtimescriptpath=/user/oozie/workflows/test_flow
queuename=aggregation
user.name=yarn
classpath=valid_class_path
oozieprojectroot=${namenode}${userloc}/workflows
workflowname=test_flow
workflowhome=${oozieprojectroot}/${workflowname}
oozie.wf.application.path=${workflowhome}/workflow.xml
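For reference, with a properties file like the one above saved as job.properties, the workflow is submitted and then diagnosed along these lines (the Oozie server URL and the job ID are assumptions):

```shell
# Submit and start the workflow; prints a job ID such as 0000001-...-oozie-W.
oozie job -oozie http://host:11000/oozie -config job.properties -run

# Inspect the failed action; the launcher's stdout/stderr (reachable from
# the YARN ResourceManager UI for the launcher job) usually contains the
# real Spark exception behind "exit code [-1]".
oozie job -oozie http://host:11000/oozie -info 0000001-oozie-W
```

The `-info` output and the launcher logs are where the underlying cause (missing sharelib jar, class not found, wrong master/mode, queue ACLs) actually shows up.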