Apache Spark Map vs FlatMap Operation

1. Objective

In this Apache Spark tutorial, we will compare the Spark map and flatMap operations. map and flatMap are transformation operations in Spark. The map() operation applies a function to each element of an RDD and returns the result as a new RDD; in the map operation, the developer can define custom business logic. flatMap() is similar to map, but it allows returning 0, 1, or more elements from the mapping function.

In this blog, we will discuss how to perform the map operation on an RDD and how to process data using the flatMap operation. This tutorial also covers what a map operation is, what a flatMap operation is, and the difference between the map() and flatMap() transformations in Apache Spark, with examples. We will also see Spark map and flatMap examples in Scala and Java in this Spark tutorial.

So, let’s start Spark Map vs FlatMap function.


Do you know how to install and configure Apache Spark?

2. Difference between Spark Map vs FlatMap Operation

This section of the Spark tutorial provides the details of Map vs FlatMap operation in Apache Spark with examples in Scala and Java programming languages.

i. Spark Map Transformation

map is a transformation operation in Apache Spark. It applies a function to each element of an RDD and returns the result as a new RDD. In the map operation, the developer can define custom business logic; the same logic is applied to all the elements of the RDD.

The Spark map function takes one element as input, processes it according to custom code (specified by the developer), and returns one element at a time. map transforms an RDD of length N into another RDD of length N: the input and output RDDs will always have the same number of records.
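The one-to-one contract described above can be sketched outside Spark with plain Java streams (a hedged analogue using only the JDK; Spark's RDD map has the same shape, applied across a distributed dataset):

```java
import java.util.List;
import java.util.stream.Collectors;

public class MapDemo {
    public static void main(String[] args) {
        // Three input records produce exactly three output records: N in, N out.
        List<String> lines = List.of("spark", "map", "flatmap");
        List<String> upper = lines.stream()
                .map(String::toUpperCase)   // one output element per input element
                .collect(Collectors.toList());
        System.out.println(upper); // [SPARK, MAP, FLATMAP]
    }
}
```

The point of the sketch is only the shape of the transformation: no element is dropped and none is duplicated, which is exactly the guarantee map gives on an RDD.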

Figure: Apache Spark map transformation operation

a. Map Transformation Scala Example

Create RDD

val data = spark.read.textFile("INPUT-PATH").rdd

The above statement creates an RDD named data. Follow this guide to learn more ways to create RDDs in Apache Spark.

Map Transformation-1

val newData = data.map (line => line.toUpperCase() )

The above map transformation converts each and every record of the RDD to upper case.

Map Transformation-2

import scala.xml.XML

val tag = data.map { line =>
  val xml = XML.loadString(line)
  xml.attribute("Tags").get.toString()
}

The above map transformation parses each XML record and collects its Tags attribute. Overall, the map operation converts the XML into a structured format.
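The same attribute-extraction idea can be sketched in plain Java with the JDK's built-in DOM parser (a hedged analogue of the Scala snippet; the one-element-per-line XML shape, as in a Stack Overflow-style data dump, is an assumption):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class TagExtractor {
    // Parse one XML record and return its Tags attribute.
    static String extractTags(String line) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(line.getBytes(StandardCharsets.UTF_8)));
            return doc.getDocumentElement().getAttribute("Tags");
        } catch (Exception e) {
            return ""; // malformed record: return empty rather than fail the whole job
        }
    }

    public static void main(String[] args) {
        String line = "<row Id=\"1\" Tags=\"&lt;spark&gt;\"/>";
        System.out.println(extractTags(line)); // <spark>
    }
}
```

In a Spark Java job, a method like this would be the body of the map function: one XML record in, one extracted attribute out.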

Follow this link to learn about the Java programming language.

b. Map Transformation Java Example

Create RDD

JavaRDD<String> linesRDD = spark.read().textFile("INPUT-PATH").javaRDD();

The above statement creates an RDD named linesRDD.

Map Transformation

JavaRDD<String> newData = linesRDD.map(new Function<String, String>() {
  public String call(String s) {
    String result = s.trim().toUpperCase();
    return result;
  }
});

We recommend reading – Spark Shell Commands to Interact with Spark-Scala.

ii. Spark FlatMap Transformation Operation

Let’s now discuss flatMap() operation in Apache Spark-

Figure: Apache Spark flatMap transformation operation

flatMap is a transformation operation. It applies a function to each element of an RDD and returns the result as a new RDD. It is similar to map, but flatMap allows returning 0, 1, or more elements from the mapping function. In the flatMap operation, the developer can define custom business logic; the same logic is applied to all the elements of the RDD.

The flatMap function takes one element as input, processes it according to custom code (specified by the developer), and returns 0 or more elements at a time. flatMap() transforms an RDD of length N into another RDD of length M, where M may differ from N.

a. FlatMap Transformation Scala Example

val result = data.flatMap (line => line.split(" ") )

The above flatMap transformation converts each line into words; each word becomes an individual element of the newly created RDD.

Learn to Create Spark project in Scala with Eclipse

b. FlatMap Transformation Java Example

JavaRDD<String> result = data.flatMap(new FlatMapFunction<String, String>() {
  public Iterator<String> call(String s) {
    return Arrays.asList(s.split(" ")).iterator();
  }
});

The above flatMap transformation converts each line into words; each word becomes an individual element of the newly created RDD.
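To see why flatMap, and not map, is the right choice for splitting lines into words, it helps to run both on the same input. This sketch uses plain Java streams (a hedged JDK analogue of the RDD operations; the input lines are made up for illustration):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FlatMapDemo {
    public static void main(String[] args) {
        List<String> lines = List.of("hello spark", "map vs flatmap");

        // map: one array per line -> a nested structure, still only 2 elements
        List<String[]> nested = lines.stream()
                .map(line -> line.split(" "))
                .collect(Collectors.toList());
        System.out.println(nested.size()); // 2

        // flatMap: each line expands into 0..n words -> one flat list of 5 words
        List<String> words = lines.stream()
                .flatMap(line -> Arrays.stream(line.split(" ")))
                .collect(Collectors.toList());
        System.out.println(words); // [hello, spark, map, vs, flatmap]
    }
}
```

With map the result keeps one element per input line (each an array of words), while flatMap flattens those arrays so every word becomes its own element, which is the N-to-M behavior described above.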

3. Conclusion

Hence, from the comparison of Spark map() vs flatMap(), it is clear that the Spark map function expresses a one-to-one transformation: it transforms each element of a collection into exactly one element of the resulting collection. The Spark flatMap function expresses a one-to-many transformation: it transforms each element into 0 or more elements.

Please leave a comment if you liked this post or have any questions about the Apache Spark map vs flatMap functions.

