Forums › Apache Spark › Command to start and stop Spark in the interactive shell?

September 20, 2018 at 1:22 pm #4977
dfbdteam5 (Moderator)

What is the command to start and stop Spark in the interactive shell?

September 20, 2018 at 1:23 pm #4978
dfbdteam5 (Moderator)

Command to start the interactive shell in Scala:

First go to the Spark directory, then run bin/spark-shell:

hdadmin@ubuntu:~$ cd spark-1.6.1-bin-hadoop2.6/
hdadmin@ubuntu:~/spark-1.6.1-bin-hadoop2.6$ bin/spark-shell

——————————————————————————————————————————————

Command to stop the interactive shell in Scala:

At the scala> prompt, press Ctrl+D. You will see the following message:

scala> Stopping spark context.

For more details refer to http://data-flair.training/blogs/apache-spark-installation-on-multi-node-cluster-step-by-step-guide/
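A minimal session sketch expanding on the answer above. It assumes Spark 1.6.1 is unpacked in the home directory under the same path used in the thread; the install location is illustrative and will differ on other machines.

```shell
# Start the interactive Scala shell from the Spark install directory
# (path assumed from the thread; adjust to your own install location)
cd ~/spark-1.6.1-bin-hadoop2.6
bin/spark-shell

# Inside the shell, a SparkContext is already available as `sc`.
# To stop the shell cleanly, either press Ctrl+D at the scala> prompt,
# or type the built-in quit command:
#   scala> :quit
# You can also stop the context explicitly before exiting:
#   scala> sc.stop()

# The Python shell works the same way:
bin/pyspark        # start the interactive Python shell
#   >>> exit()     # or press Ctrl+D to stop it
```

Pressing Ctrl+D sends end-of-input to the shell, which shuts down the SparkContext before exiting; that is what produces the "Stopping spark context" message mentioned above.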