How can we launch a Spark application on YARN?
- What are the ways to launch Apache Spark over YARN?
- Explain the technique to launch Apache Spark over Hadoop YARN.
One can launch a Spark job on YARN by passing the --master yarn option to spark-submit, as follows:
spark-submit --class <main-class> --master yarn <jar file> [arguments]
NOTE: Before running this job, you must point Spark to the Hadoop configuration files by setting HADOOP_CONF_DIR:
For example: If the Hadoop Configuration files are in /usr/lib/hadoop/conf:
export HADOOP_CONF_DIR=/usr/lib/hadoop/conf
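As a concrete illustration, here is a minimal sketch of submitting the SparkPi example (which ships with the Spark distribution) to YARN; the jar path and Spark version in the filename are assumptions and will vary with your installation:

```shell
# Point Spark at the Hadoop configuration (path is an assumed example)
export HADOOP_CONF_DIR=/usr/lib/hadoop/conf

# Submit the bundled SparkPi example to YARN in cluster mode;
# --deploy-mode cluster runs the driver inside a YARN container,
# while --deploy-mode client would keep the driver on the local machine.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode cluster \
  $SPARK_HOME/examples/jars/spark-examples_2.12-3.0.0.jar \
  10   # number of partitions used to estimate Pi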