Apache Flink program error while using lambda expression

This topic contains 1 reply, has 1 voice, and was last updated by  DataFlair Team 2 months, 1 week ago.

Viewing 2 posts - 1 through 2 (of 2 total)
  • Author
  • #6613

    DataFlair Team

    I am using the filter API in a basic word count program with a lambda expression, but it is not working.

    import org.apache.flink.api.common.functions.FlatMapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.util.Collector;

    public class Count {
        public static void main(String[] args) throws Exception {
            final ExecutionEnvironment environment = ExecutionEnvironment.getExecutionEnvironment();
            DataSet<String> text = environment.readTextFile("/home/Malini/input11.txt");

            DataSet<Tuple2<String, Integer>> wordCounts = text
                .filter(line -> line.contains("Malini"))
                .flatMap(new MySplitter())
                .groupBy(0)   // group by the word field
                .sum(1);      // sum the counts

            wordCounts.print();
        }

        public static class MySplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
            @Override
            public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                for (String word : line.split(" ")) {
                    out.collect(new Tuple2<String, Integer>(word, 1));
                }
            }
        }
    }
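    One Flink-specific caveat worth checking alongside the Java version (a hedged sketch, not from the original post): because of Java type erasure, Flink cannot always infer the output type of a lambda whose result type is generic. A filter lambda is safe since its output type equals its input type, but if MySplitter above were rewritten as a flatMap lambda emitting Tuple2, Flink would need an explicit type hint via returns(...). A fragment assuming Flink's Java DataSet API, with Types from org.apache.flink.api.common.typeinfo:

```java
// Fragment only; assumes flink-java on the classpath.
// filter as a lambda needs no hint: output type == input type.
// flatMap as a lambda erases Tuple2's type parameters, so Flink needs returns(...):
DataSet<Tuple2<String, Integer>> wordCounts = text
    .filter(line -> line.contains("Malini"))
    .flatMap((String line, Collector<Tuple2<String, Integer>> out) -> {
        for (String word : line.split(" ")) {
            out.collect(new Tuple2<>(word, 1));
        }
    })
    .returns(Types.TUPLE(Types.STRING, Types.INT))  // explicit type hint
    .groupBy(0)
    .sum(1);
```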



    DataFlair Team

    Please check your Java version. Lambda expressions are supported only from Java 8 (compiler 1.8) onward. If you are using Java 7 or an earlier version, the compiler rejects the lambda syntax entirely, so the program will not even compile. It is therefore recommended to use Java 8 or newer.
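    Since the fix hinges on lambda support, one quick way to confirm the toolchain is a tiny Java 8 program that uses the same predicate outside Flink. If this compiles and runs, the compiler handles lambdas; the class name and sample data below are illustrative only:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class LambdaCheck {
    // Same predicate as the Flink filter, written as a plain Java 8 lambda.
    static final Predicate<String> CONTAINS_MALINI = line -> line.contains("Malini");

    static List<String> keepMatching(List<String> lines) {
        return lines.stream().filter(CONTAINS_MALINI).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // On Java 7 or earlier this file will not even compile (lambda syntax).
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println(keepMatching(
            Arrays.asList("Malini went home", "nothing here", "hello Malini")));
    }
}
```

    If this file fails to compile with a syntax error at the `->`, the compiler predates Java 8 and the Flink program will fail for the same reason.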

