Forums › Apache Spark › Not able to execute Spark program
This topic has 2 replies, 1 voice, and was last updated 5 years, 6 months ago by DataFlair Team.
September 20, 2018 at 3:24 pm · #5495 · DataFlair Team (Spectator)
Hi,
I am getting the below exceptions while running a Spark with Scala application:
1. Right-click on the Scala program -> Run As -> Scala Application
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
2. If I run spark-submit:
[ubuntu@localhost spark-1.5.2]$ bin/spark-submit --class com.scala.spark.demo.Wordcount /home/Desktop/scalasparkWC.jar /home/Desktop/wordcount.parquet /home/Desktop/Apr_2017_003
Error occurred during initialization of VM
Could not reserve enough space for object heap
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
[ubuntu@localhost spark-1.5.2]$
September 20, 2018 at 3:24 pm · #5497 · DataFlair Team (Spectator)
When you are executing the program from Eclipse, change the following setting:
In Eclipse, go to Run Configurations > Arguments > VM arguments and set a larger max heap size, e.g. -Xmx512m.
The code provided is for Spark 2.0 and you are using Spark 1.5.2; use Spark 2.0 and the error will be resolved.
Refer to: Create Spark Project in Scala with Eclipse without Maven
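For reference, the driver-memory setting can also be declared in the application itself. Below is a minimal word-count sketch using the Spark 1.x SparkContext API; the app name, paths, and memory value are illustrative, not taken from the original code. Note that when running in local mode from Eclipse the driver JVM is already started, so the -Xmx VM argument above is what actually governs the heap; spark.driver.memory takes effect when the application is launched through spark-submit.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative word-count skeleton (class and path names are hypothetical).
object Wordcount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("Wordcount")
      .setMaster("local[*]")            // local mode, as when run from Eclipse
      .set("spark.driver.memory", "1g") // honoured when launched via spark-submit;
                                        // in local mode the JVM's -Xmx governs instead
    val sc = new SparkContext(conf)
    val counts = sc.textFile(args(0))
      .flatMap(_.split("\\s+"))
      .map((_, 1))
      .reduceByKey(_ + _)
    counts.saveAsTextFile(args(1))
    sc.stop()
  }
}
```

When submitting from the command line, the equivalent is the --driver-memory flag, e.g. `bin/spark-submit --driver-memory 1g --class com.scala.spark.demo.Wordcount ...`.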
September 20, 2018 at 3:24 pm · #5498 · DataFlair Team (Spectator)
Thanks for the quick response. Both issues are resolved.