Spark Streaming with Socket
September 20, 2018 at 12:58 pm (#4959) — DataFlair Team, Spectator
I’m trying to get the Spark Streaming example to run, but instead of typing input into the console created with netcat (`nc -lk 9999`), I want to send messages to that socket programmatically and have Spark listen. How can I do this?
So the idea is: Spark listens on port 9999 (opened with `nc -l 9999`, for example), and we want to send a message to it from another socket. How can we do this? Is there a way to avoid writing a server ourselves, or can we make do with netcat?
Other system -> Socket message -> nc -> spark
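One way to sketch the "Other system" end of that pipeline: Spark's `socketTextStream` connects to port 9999 as a *client*, so anything that accepts a connection and writes newline-terminated lines can stand in for `nc -lk 9999`. Below is a minimal Python sketch of such a server; the host/port defaults and the `serve_lines` helper name are illustrative, not from the thread.

```python
import socket

def serve_lines(lines, host="localhost", port=9999):
    """Accept one client (e.g. Spark Streaming) and send it
    newline-terminated lines, exactly as netcat would relay them."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        # Allow quick restarts on the same port during testing.
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((host, port))
        server.listen(1)
        conn, _addr = server.accept()
        with conn:
            for line in lines:
                # Each line becomes one record in the DStream.
                conn.sendall((line + "\n").encode("utf-8"))

if __name__ == "__main__":
    serve_lines(["hello spark", "#streaming works"])
```

Run this instead of netcat, then start the Spark Streaming job pointed at the same host and port; every line sent becomes a record in the stream.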
September 20, 2018 at 12:58 pm (#4960) — DataFlair Team, Spectator
Hello,
You can use Apache Flume to read data from Twitter or any other live stream. In the Flume conf file, set the source to Twitter and the sink to an Avro sink on localhost, port 9999. Spark listens on localhost under port 9999, and whenever data arrives, Spark displays it on the console according to the logic you define.
Twitter/Live Stream data => Apache Flume => Localhost => Console.
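The pipeline above could be wired up with a Flume agent configuration along these lines. This is a hypothetical sketch, not a tested config: the agent/component names and the placeholder credentials are illustrative, and `TwitterSource` requires valid Twitter API keys.

```properties
# Hypothetical flume.conf: Twitter source -> memory channel -> Avro sink
# on localhost:9999 (the port the Spark job listens on).
agent.sources = twitter
agent.channels = mem
agent.sinks = avro

agent.sources.twitter.type = org.apache.flume.source.twitter.TwitterSource
agent.sources.twitter.consumerKey = YOUR_CONSUMER_KEY
agent.sources.twitter.consumerSecret = YOUR_CONSUMER_SECRET
agent.sources.twitter.accessToken = YOUR_ACCESS_TOKEN
agent.sources.twitter.accessTokenSecret = YOUR_ACCESS_TOKEN_SECRET
agent.sources.twitter.channels = mem

agent.channels.mem.type = memory
agent.channels.mem.capacity = 10000

agent.sinks.avro.type = avro
agent.sinks.avro.hostname = localhost
agent.sinks.avro.port = 9999
agent.sinks.avro.channel = mem
```

One caveat worth noting: an Avro sink emits Avro-framed events rather than plain text lines, so on the Spark side the Flume receiver (`FlumeUtils.createStream`) is typically used with this setup, whereas plain `socketTextStream` matches the netcat-style text source from the question.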
Apache Flume reads the live Twitter data from your account and sends it to localhost; Spark Streaming listens for the data on that port and displays only the words starting with # (if that business logic is defined in the Spark Streaming code).
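The "words starting with #" logic described above can be sketched in PySpark Streaming roughly as follows, assuming a plain-text source (netcat or a custom socket server) on localhost:9999. This is a sketch under those assumptions; the `is_hashtag` and `run_stream` names are illustrative, and the pyspark import is deferred so the filter predicate can be used without a Spark installation.

```python
def is_hashtag(word):
    """Business-logic predicate: keep only words that start with '#'."""
    return word.startswith("#")

def run_stream(host="localhost", port=9999, batch_seconds=5):
    # Requires a Spark installation with pyspark on the path.
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    sc = SparkContext(appName="SocketHashtags")
    ssc = StreamingContext(sc, batch_seconds)

    # Connect as a client to the text source (e.g. `nc -lk 9999`).
    lines = ssc.socketTextStream(host, port)

    # Split each line into words and keep only the hashtags.
    hashtags = lines.flatMap(lambda line: line.split()).filter(is_hashtag)
    hashtags.pprint()  # display matching words on the console each batch

    ssc.start()
    ssc.awaitTermination()
```

With Spark installed, calling `run_stream()` while the socket source is running prints the hashtag words of each micro-batch to the console.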
Does this answer your query, or are you looking for something different?