Spark Streaming with Socket

This topic contains 1 reply, has 1 voice, and was last updated by  dfbdteam3 2 months, 3 weeks ago.

Viewing 2 posts - 1 through 2 (of 2 total)
  • #4959

    dfbdteam3
    Moderator

    I’m trying to get the Spark Streaming example to run, but instead of typing input into the console created with netcat (nc -lk 9999), I want to send messages to this socket programmatically and have Spark listen. How can I do this?

    So the idea is: Spark listens on nc -l 9999 (as an example), and we want to send a message from a socket. How can we do this? Is there a way to avoid running a full server here, so we can make do with netcat?

    Other system -> Socket message -> nc -> spark
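
    One simple option, assuming you are willing to replace netcat with a few lines of Python: run a tiny TCP server that Spark's socketTextStream can connect to, and have it write newline-terminated lines. This is a sketch, not a standard API; serve_lines and the port number are illustrative.

    ```python
    import socket

    def serve_lines(lines, host="localhost", port=9999):
        """Accept one client (e.g. Spark) and send it newline-terminated lines."""
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()  # Spark connects here, just as it would to nc
        for line in lines:
            conn.sendall((line + "\n").encode("utf-8"))
        conn.close()
        srv.close()
    ```

    On the Spark side, ssc.socketTextStream("localhost", 9999) would connect to this server exactly as it connects to nc -lk 9999. Alternatively, depending on your nc variant, you can keep netcat and pipe another program's output into it, e.g. my_program | nc -lk 9999.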

    #4960

    dfbdteam3
    Moderator

    Hello,

    You can use Apache Flume to read data from Twitter or any other live stream. In the Flume conf file, set the source to Twitter and the sink to an Avro sink (localhost, port 9999). Spark listens on localhost port 9999, and whenever data arrives, Spark displays it on the console according to the logic you define.

    Twitter/Live Stream data => Apache Flume => Localhost => Console.
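
    A sketch of what that Flume agent conf could look like (the agent name a1, the channel name, and the credential values are placeholders you must fill in with your own Twitter API keys; Flume's experimental TwitterSource class is assumed to be available in your Flume distribution):

    ```
    a1.sources = tw
    a1.sinks = avro
    a1.channels = mem

    # Twitter source (fill in your own API credentials)
    a1.sources.tw.type = org.apache.flume.source.twitter.TwitterSource
    a1.sources.tw.consumerKey = YOUR_KEY
    a1.sources.tw.consumerSecret = YOUR_SECRET
    a1.sources.tw.accessToken = YOUR_TOKEN
    a1.sources.tw.accessTokenSecret = YOUR_TOKEN_SECRET

    # Avro sink pointing at the host/port Spark listens on
    a1.sinks.avro.type = avro
    a1.sinks.avro.hostname = localhost
    a1.sinks.avro.port = 9999

    # In-memory channel wiring source to sink
    a1.channels.mem.type = memory
    a1.sources.tw.channels = mem
    a1.sinks.avro.channel = mem
    ```

    One caveat: Flume's Avro sink emits Avro-framed events, so on the Spark side this is usually consumed with the spark-streaming-flume connector (FlumeUtils) rather than a plain text socket.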

    Apache Flume reads the live Twitter data from your account and sends it to localhost; Spark Streaming listens for the data on localhost and, if that business logic is defined in the Spark Streaming code, displays only the words that start with #.
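
    The hashtag-filtering logic described above could be sketched like this in PySpark, assuming the classic DStream API and a plain text source on localhost:9999 (e.g. nc -lk 9999 for testing); the app name and batch interval are illustrative:

    ```python
    def hashtags(line):
        """Business logic: keep only the words that start with '#'."""
        return [w for w in line.split() if w.startswith("#")]

    if __name__ == "__main__":
        from pyspark import SparkContext
        from pyspark.streaming import StreamingContext

        sc = SparkContext(appName="HashtagFilter")
        ssc = StreamingContext(sc, batchDuration=5)  # 5-second micro-batches

        # Connect to the text socket and print matching words each batch
        lines = ssc.socketTextStream("localhost", 9999)
        lines.flatMap(hashtags).pprint()

        ssc.start()
        ssc.awaitTermination()
    ```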

    Does this answer your query, or are you looking for something different?


You must be logged in to reply to this topic.