Forums › Apache Flink › Apache Flink program error while using lambda expression
- This topic has 1 reply, 1 voice, and was last updated 5 years, 6 months ago by DataFlair Team.
October 10, 2018 at 4:15 pm #6613 · DataFlair Team (Spectator)
I am using the filter API with a lambda expression in a basic word-count program, but it is not working.
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class Count {

    public static void main(String[] args) throws Exception {
        final ExecutionEnvironment environment = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<String> text = environment.readTextFile("/home/Malini/input11.txt");

        DataSet<Tuple2<String, Integer>> wordCounts = text
                .filter(line -> line.contains("Malini"))
                .flatMap(new MySplitter())
                .groupBy(0)
                .sum(1);

        wordCounts.print();
        wordCounts.writeAsCsv("/home/Malini/out000000334111.csv");
        environment.execute();
    }

    public static class MySplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
        @Override
        public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
            for (String word : line.split(" ")) {
                out.collect(new Tuple2<String, Integer>(word, 1));
            }
        }
    }
}
October 10, 2018 at 4:15 pm #6614 · DataFlair Team (Spectator)
Please check your Java version. Lambda expressions are supported only from Java 8 (compiler level 1.8) onward; if you are compiling with Java 7 or an earlier version, the lambda in your filter() call will not compile. So it is recommended to use Java 8.
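As a quick sanity check that your toolchain accepts Java 8 lambda syntax, here is a minimal sketch in plain Java (no Flink dependency; the class name and sample lines are made up for illustration). It uses the same filter-with-lambda shape as the Flink snippet above and compiles only under Java 8 or later:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class LambdaCheck {
    public static void main(String[] args) {
        List<String> lines = Arrays.asList("Malini went home", "some other line", "Malini again");

        // Same lambda shape as the Flink filter() call above;
        // a Java 7 compiler will reject this with a syntax error.
        List<String> matched = lines.stream()
                .filter(line -> line.contains("Malini"))
                .collect(Collectors.toList());

        System.out.println(matched.size()); // prints 2
    }
}
```

If this class compiles and runs, the lambda syntax itself is fine on your machine, and the problem lies elsewhere (for example, the project's compiler level being set below 1.8).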