Not able to execute Spark program

    • #5495
      DataFlair Team
      Spectator

      Hi,

      I am getting the exceptions below while running a Spark application written in Scala:

      1. Right-click on the Scala program -> Run As -> Scala Application

      Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.

      2. If I run spark-submit:

      [ubuntu@localhost spark-1.5.2]$ bin/spark-submit --class com.scala.spark.demo.Wordcount /home/Desktop/scalasparkWC.jar /home/Desktop/wordcount.parquet /home/Desktop/Apr_2017_003
      Error occurred during initialization of VM
      Could not reserve enough space for object heap
      Error: Could not create the Java Virtual Machine.
      Error: A fatal exception has occurred. Program will exit.
      [ubuntu@localhost spark-1.5.2]$

    • #5497
      DataFlair Team
      Spectator

      When you are executing the program from Eclipse, change the following settings:

      In Eclipse, go to Run Configurations > Arguments > VM arguments and set the maximum heap size, e.g. -Xmx512m.
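
      The 471859200 bytes in the exception is Spark's 450 MB minimum for the driver heap, so -Xmx512m is just enough to clear it. As a quick sanity check (this snippet is only an illustration, not part of your program), you can print the heap the driver JVM actually sees when launched with the same VM argument:

      object HeapCheck {
        def main(args: Array[String]): Unit = {
          // With -Xmx512m this prints a value above Spark's 471859200-byte minimum.
          println(s"Max heap: ${Runtime.getRuntime.maxMemory} bytes")
        }
      }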

      The code provided is for Spark 2.0, but you are using Spark 1.5. Use Spark 2.0 and the error will be resolved.
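
      For reference, a Spark 2.0-style program uses SparkSession as its entry point, which does not exist in Spark 1.5, so code written that way will only run against a 2.x installation. A minimal word count sketch (the class name, master and argument paths here are illustrative, not taken from your jar):

      import org.apache.spark.sql.SparkSession

      object Wordcount {
        def main(args: Array[String]): Unit = {
          // SparkSession is the Spark 2.0 entry point; it is not available in 1.5.x.
          val spark = SparkSession.builder()
            .appName("Wordcount")
            .master("local[*]")
            .getOrCreate()

          // Classic word count over plain text: args(0) = input path, args(1) = output directory.
          val counts = spark.sparkContext.textFile(args(0))
            .flatMap(_.split("\\s+"))
            .map(word => (word, 1))
            .reduceByKey(_ + _)

          counts.saveAsTextFile(args(1))
          spark.stop()
        }
      }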

      Refer to Create Spark Project in Scala with Eclipse without Maven.

    • #5498
      DataFlair Team
      Spectator

      Thanks for the quick response. Both issues are resolved.
