Apache Flink program error while using lambda expression


  • #6613

    DataFlair Team
    Participant

    I am using the filter API in a basic word count program with a lambda expression, but it is not working.

    import org.apache.flink.api.common.functions.FlatMapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.util.Collector;

    public class Count {
        public static void main(String[] args) throws Exception {

            final ExecutionEnvironment environment = ExecutionEnvironment.getExecutionEnvironment();
            DataSet<String> text = environment.readTextFile("/home/Malini/input11.txt");

            DataSet<Tuple2<String, Integer>> wordCounts = text
                    .filter(line -> line.contains("Malini"))
                    .flatMap(new MySplitter())
                    .groupBy(0)
                    .sum(1);

            wordCounts.print();

            wordCounts.writeAsCsv("/home/Malini/out000000334111.csv");
            environment.execute();
        }

        // Splits each line into words and emits (word, 1) pairs.
        public static class MySplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
            @Override
            public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                for (String word : line.split(" ")) {
                    out.collect(new Tuple2<String, Integer>(word, 1));
                }
            }
        }
    }

    #6614

    DataFlair Team
    Participant

    Please check your Java version. Lambda expressions are supported only from Java 8 (compiler level 1.8) onwards; if you are compiling with Java 7 or an earlier version, the compiler will not accept them. So it is recommended to use Java 1.8.
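    You can check which versions are installed with java -version and javac -version. If upgrading to Java 8 is not possible right away, the same filter step can be written with an anonymous FilterFunction, which compiles on Java 7 as well. The sketch below is only an illustration of that alternative; it reuses the input path and keyword from the question and is not a full replacement for the word count program.

    import org.apache.flink.api.common.functions.FilterFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;

    public class FilterWithoutLambda {
        public static void main(String[] args) throws Exception {
            final ExecutionEnvironment environment = ExecutionEnvironment.getExecutionEnvironment();
            DataSet<String> text = environment.readTextFile("/home/Malini/input11.txt");

            // Anonymous inner class: equivalent to line -> line.contains("Malini"),
            // but it does not require Java 8.
            DataSet<String> filtered = text.filter(new FilterFunction<String>() {
                @Override
                public boolean filter(String line) {
                    return line.contains("Malini");
                }
            });

            filtered.print();
        }
    }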

