Forums › Apache Spark › Different Running Modes of Apache Spark
This topic has 2 replies, 1 voice, and was last updated 5 years, 6 months ago by DataFlair Team.
September 20, 2018 at 10:23 pm #6461 by DataFlair Team (Spectator)
Different Running Modes of Apache Spark
September 20, 2018 at 10:23 pm #6462 by DataFlair Team (Spectator)
Apache Spark can be run in the following three modes:
(1) Local mode
(2) Standalone mode
(3) Cluster mode
To study Spark running modes in detail, follow http://data-flair.training/blogs/apache-spark-cluster-managers-tutorial/
September 20, 2018 at 10:23 pm #6463 by DataFlair Team (Spectator)
We can launch a Spark application in four modes:
1) Local mode (local[*], local, local[2], etc.)
-> When you launch spark-shell without a --master argument, it runs in local mode:
spark-shell --master local[1]
-> spark-submit --class com.df.SparkWordCount --master local[1] SparkWC.jar

2) Spark Standalone cluster manager:
-> spark-shell --master spark://hduser:7077
-> spark-submit --class com.df.SparkWordCount --master spark://hduser:7077 SparkWC.jar

3) YARN mode (client/cluster mode):
-> spark-shell --master yarn
(or)
-> spark-shell --master yarn --deploy-mode client
Both commands above are equivalent, since client is the default deploy mode.
To launch a Spark application in cluster mode, we have to use the spark-submit command. We cannot run yarn-cluster mode via spark-shell, because in cluster mode the driver program runs inside the ApplicationMaster container on the cluster, while spark-shell needs the driver to run locally to serve the interactive session. So it is not possible to run cluster mode via spark-shell.
-> spark-submit --class com.df.SparkWordCount --master yarn --deploy-mode client SparkWC.jar
-> spark-submit --class com.df.SparkWordCount --master yarn --deploy-mode cluster SparkWC.jar

4) Mesos mode:
-> spark-shell --master mesos://HOST:5050
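To tie the four modes together, here is a small illustrative sketch (plain Python, not part of Spark's API; the function name classify_master is made up for this example) that classifies a master URL string into the running mode it selects, mirroring the URL schemes used in the commands above:

```python
import re

def classify_master(master):
    """Map a Spark master URL string to its running mode.

    Illustrative helper only -- Spark does this parsing internally.
    Covers the schemes from this thread: local/local[*]/local[N],
    spark://host:port, yarn (and the old yarn-client/yarn-cluster
    values), and mesos://host:port.
    """
    if master == "local" or re.fullmatch(r"local\[(\*|\d+)\]", master):
        return "local"
    if master.startswith("spark://"):
        return "standalone"
    if master in ("yarn", "yarn-client", "yarn-cluster"):
        return "yarn"
    if master.startswith("mesos://"):
        return "mesos"
    raise ValueError("unrecognized master URL: " + master)

print(classify_master("local[2]"))             # -> local
print(classify_master("spark://hduser:7077"))  # -> standalone
print(classify_master("yarn"))                 # -> yarn
print(classify_master("mesos://HOST:5050"))    # -> mesos
```

Note that with the yarn scheme the master URL alone does not decide client vs cluster mode; that is controlled separately by --deploy-mode, which is why spark-shell can only ever use client mode.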