how to create custom key and custom value in MapReduce Job

This topic contains 1 reply, has 1 voice, and was last updated by dfbdteam3 1 year, 6 months ago.



    I want to send more than one value from the Mapper to the Reducer, which I can do by creating a custom key/value. How do I create a custom key and value?



    In Hadoop, a data type used as a key must implement the WritableComparable interface, and a data type used as a value must implement the Writable interface.
    If your custom key and value are of the same type, you can write a single custom data type that implements WritableComparable and use it for both. Otherwise you need two different data types: one for the key, which implements the WritableComparable interface, and another for the value, which implements the Writable interface.

    // Custom data type for the key
    public class MyCustomKey implements WritableComparable<MyCustomKey>

    // Mapper that emits the custom key (the other three type parameters are examples)
    public class MyMapper extends Mapper<LongWritable, Text, MyCustomKey, Text>
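    A minimal sketch of what such a custom key and value could look like. The class and field names (MyCustomKey, MyCustomValue, name, count, total) are illustrative, not from any real job. Note that the Writable and WritableComparable interfaces below are declared locally as stand-ins (with the same method signatures as org.apache.hadoop.io) only so the sketch compiles without Hadoop on the classpath; in a real job you would import the Hadoop interfaces instead.

    ```java
    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;

    // Stand-ins for org.apache.hadoop.io.Writable / WritableComparable, so this
    // sketch compiles standalone. In a real job, import the Hadoop interfaces.
    interface Writable {
        void write(DataOutput out) throws IOException;
        void readFields(DataInput in) throws IOException;
    }

    interface WritableComparable<T> extends Writable, Comparable<T> {}

    // Custom key with two fields; Hadoop sorts map output using compareTo().
    class MyCustomKey implements WritableComparable<MyCustomKey> {
        private String name = "";
        private long count;

        public MyCustomKey() {}  // Hadoop needs a no-arg constructor for deserialization

        public MyCustomKey(String name, long count) {
            this.name = name;
            this.count = count;
        }

        @Override public void write(DataOutput out) throws IOException {
            out.writeUTF(name);
            out.writeLong(count);
        }

        @Override public void readFields(DataInput in) throws IOException {
            // Fields must be read back in exactly the order they were written.
            name = in.readUTF();
            count = in.readLong();
        }

        @Override public int compareTo(MyCustomKey other) {
            int c = name.compareTo(other.name);
            return (c != 0) ? c : Long.compare(count, other.count);
        }

        @Override public boolean equals(Object o) {
            if (!(o instanceof MyCustomKey)) return false;
            MyCustomKey k = (MyCustomKey) o;
            return name.equals(k.name) && count == k.count;
        }

        @Override public int hashCode() {
            // Keys are partitioned by hashCode(), so it must be consistent with equals().
            return name.hashCode() * 31 + Long.hashCode(count);
        }
    }

    // Custom value with two fields; values are never sorted, so Writable is enough.
    class MyCustomValue implements Writable {
        private long total;
        private String label = "";

        public MyCustomValue() {}

        public MyCustomValue(long total, String label) {
            this.total = total;
            this.label = label;
        }

        @Override public void write(DataOutput out) throws IOException {
            out.writeLong(total);
            out.writeUTF(label);
        }

        @Override public void readFields(DataInput in) throws IOException {
            total = in.readLong();
            label = in.readUTF();
        }

        public long getTotal()  { return total; }
        public String getLabel() { return label; }
    }
    ```

    The key point is that write() and readFields() must serialize and deserialize the fields in the same order, and that the key's hashCode() and equals() must agree so the default hash partitioner sends equal keys to the same reducer.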


