Hadoop Combiner
When we run a MapReduce job on a large dataset, the Mapper generates large chunks of intermediate data. This intermediate data is passed to the Reducer for further processing, which leads to enormous network congestion. To reduce this network congestion, the MapReduce framework provides a function known as the Hadoop Combiner.
Well, to learn about the Hadoop Combiner in detail, follow the link: Hadoop Combiner – Best Explanation to MapReduce Combiner
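To see why a combiner cuts network traffic, here is a minimal sketch in plain Python (not actual Hadoop API code) that simulates the word-count pattern: the combiner sums counts locally on the mapper side, so far fewer key-value pairs have to cross the network to the reducers.

```python
from collections import Counter

def map_phase(lines):
    # Mapper: emit an intermediate (word, 1) pair for every word
    for line in lines:
        for word in line.split():
            yield (word, 1)

def combine(pairs):
    # Combiner: locally aggregate counts per key before the data leaves
    # the mapper node, shrinking what travels over the network
    local = Counter()
    for key, value in pairs:
        local[key] += value
    return sorted(local.items())

lines = ["big data big", "data big"]
raw = list(map_phase(lines))   # 5 intermediate pairs without a combiner
combined = combine(raw)        # only 2 pairs after local aggregation
print(len(raw), len(combined))
print(combined)
```

In a real Hadoop job the same effect is achieved by setting a combiner class on the job (typically the same class as the reducer when the operation is associative and commutative, as summation is).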
Hadoop Reducer
The Hadoop Reducer generates the output (zero or more key-value pairs) after processing the intermediate values for a particular key generated by the map function.
And, to learn about the Hadoop Reducer in detail, follow the link: Hadoop Reducer – 3 Steps learning for MapReduce Reducer
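The reducer's role described above can be sketched in plain Python (again a simulation, not the Hadoop API): after the shuffle and sort, the reducer sees all intermediate values grouped by key and emits zero or more output pairs per key.

```python
from itertools import groupby
from operator import itemgetter

def reduce_phase(pairs):
    # The framework delivers intermediate pairs grouped by key (sorted here
    # to mimic the shuffle/sort step); the reduce function then processes
    # all values for each key and emits output key-value pairs
    for key, group in groupby(sorted(pairs, key=itemgetter(0)),
                              key=itemgetter(0)):
        yield (key, sum(value for _, value in group))

intermediate = [("big", 1), ("data", 1), ("big", 1), ("hadoop", 1)]
print(list(reduce_phase(intermediate)))
# -> [('big', 2), ('data', 1), ('hadoop', 1)]
```

Note that a reducer need not emit exactly one pair per key: a filtering reducer, for example, may emit nothing for some keys, which is why the output is described as zero or more key-value pairs.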