Forums › Apache Spark › What is the role of Spark Driver in spark applications
September 20, 2018 at 4:23 pm — #5835 — DataFlair Team (Spectator)
What is the use of the Spark driver, and where does it get executed on the cluster?
September 20, 2018 at 4:24 pm — #5836 — DataFlair Team (Spectator)
The Spark driver is the program that declares the transformations and actions on RDDs of data and submits those requests to the master. Depending on the deploy mode, the driver process runs either on the client machine that submits the application (client mode) or inside the cluster itself (cluster mode).
In simple terms, the driver in Spark creates the SparkContext, which connects to a given Spark master. It also delivers the RDD graphs to the master, where the standalone cluster manager runs.
Also see How Spark works
September 20, 2018 at 4:24 pm — #5837 — DataFlair Team (Spectator)
A Spark driver (aka an application’s driver process) is a JVM process that hosts the SparkContext for a Spark application. It acts as the master of that application.
It is the cockpit of job and task execution (using the DAGScheduler and the TaskScheduler).
It hosts the Web UI for the environment.
It splits a Spark application into tasks and schedules them to run on executors.
A driver is where the task scheduler lives and spawns tasks across workers.
A driver coordinates workers and the overall execution of tasks.