Spark Streaming with Socket

  • Author
    • #4959
      DataFlair Team

      I’m trying to get the Spark Streaming example to run, but instead of typing input into the console created with netcat (nc -lk 9999), I want to send messages to this console programmatically and have Spark listen. How can I do this?

      So the idea is: Spark listens on nc -l 9999 (as an example) and we want to send a message from a socket. How can we do this? Is there a way to avoid running a server here, and can we make do with netcat?

      Other system -> Socket message -> nc -> spark
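
      One way to avoid netcat entirely (a minimal sketch, assuming you control the sending side): Spark's socketTextStream is just a TCP client, so any small TCP server that writes newline-terminated lines will satisfy it. The snippet below stands in for both ends of the pipe: a tiny listener thread that pushes lines (playing the role of nc -lk), and a client that reads them the way Spark would. The host, port, and messages are arbitrary placeholders.

      ```python
      import socket
      import threading

      HOST = "127.0.0.1"  # port 0 below lets the OS pick a free port

      def serve_lines(server_sock, lines):
          """Accept one connection and push newline-terminated lines to it,
          which is exactly what socketTextStream expects from its peer."""
          conn, _ = server_sock.accept()
          with conn:
              for line in lines:
                  conn.sendall((line + "\n").encode("utf-8"))

      # Stand-in for `nc -lk 9999`: a plain TCP listener.
      server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      server.bind((HOST, 0))
      server.listen(1)
      port = server.getsockname()[1]

      t = threading.Thread(target=serve_lines,
                           args=(server, ["hello", "#spark streaming"]))
      t.start()

      # Stand-in for Spark: connect and read the stream line by line.
      client = socket.create_connection((HOST, port))
      received = client.makefile("r", encoding="utf-8").read().splitlines()
      client.close()
      t.join()
      server.close()

      print(received)
      ```

      In a real job you would keep only the server half, bind it to a fixed port such as 9999, and point ssc.socketTextStream("localhost", 9999) at it.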

    • #4960
      DataFlair Team


      You can use Apache Flume to read the data from Twitter or any live stream. In the Flume conf file, set the source to Twitter and the sink to an Avro sink (localhost, port 9999). Spark listens on localhost under port 9999, and whenever data arrives, Spark displays it on the console according to the logic defined.

      Twitter/Live Stream data => Apache Flume => Localhost => Console.
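
      A conf sketch of such a Flume agent might look like the following (the agent name, channel name, and credentials are placeholders; the Twitter source needs your own API keys):

      ```
      # Hypothetical agent named 'a1'; all names and keys are placeholders.
      a1.sources = twitter
      a1.sinks = avro
      a1.channels = mem

      a1.sources.twitter.type = org.apache.flume.source.twitter.TwitterSource
      a1.sources.twitter.consumerKey = YOUR_CONSUMER_KEY
      a1.sources.twitter.consumerSecret = YOUR_CONSUMER_SECRET
      a1.sources.twitter.accessToken = YOUR_ACCESS_TOKEN
      a1.sources.twitter.accessTokenSecret = YOUR_ACCESS_TOKEN_SECRET

      a1.sinks.avro.type = avro
      a1.sinks.avro.hostname = localhost
      a1.sinks.avro.port = 9999

      a1.channels.mem.type = memory
      a1.sources.twitter.channels = mem
      a1.sinks.avro.channel = mem
      ```

      One caveat: an Avro sink emits Flume's Avro wire protocol rather than plain text lines, so on the Spark side such a stream is normally consumed with the spark-streaming-flume integration (FlumeUtils.createStream) rather than a raw socketTextStream.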

      Apache Flume reads the live Twitter data from your account and sends it to localhost; Spark Streaming listens for data from localhost and displays only the words starting with # (if that business logic is defined in the Spark Streaming code).
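
      The "words starting with #" logic mentioned above can be sketched as a plain per-line function; the pyspark wiring is shown in comments rather than executed, since it needs a running Spark context (names like ssc are assumed from the usual Spark Streaming setup):

      ```python
      def hashtag_words(line):
          """Return only the words in a line that start with '#'."""
          return [w for w in line.split() if w.startswith("#")]

      # In a Spark Streaming job this would be applied to each line, e.g.:
      #   lines = ssc.socketTextStream("localhost", 9999)
      #   tags = lines.flatMap(hashtag_words)
      #   tags.pprint()

      print(hashtag_words("learn #spark and #flume at DataFlair"))
      ```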

      Does this answer your query, or are you looking for something different?
