
Ingesting Websocket Data for Live Stock Streams with Cloudera Flow Management Powered by Apache NiFi

The stocks I follow have a lot of trades and price changes throughout the day, and I would like to capture all of this data and make it available to my colleagues. I will push it to Kafka and make it available via a topic, and I may also push it to Slack, Discord, a web page, or a Cloudera Visual Apps dashboard. We'll see what people request.

We will read websocket messages from wss://ws.finnhub.io?token=YOURTOKEN. You will need to sign up for a finnhub.io account to get this data. The API is well documented and very easy to use with Apache NiFi.
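Outside of NiFi, you can sanity-check the feed with a few lines of Python. This is a minimal sketch assuming the third-party websocket-client package; the token is a placeholder, and the symbols are just examples.

```python
# Sketch: subscribe to Finnhub trade updates over a websocket.
# Assumes the third-party `websocket-client` package (pip install websocket-client);
# YOURTOKEN is a placeholder -- substitute your own API token.
import json

FINNHUB_URL = "wss://ws.finnhub.io?token=YOURTOKEN"

def subscribe_payload(symbol: str) -> str:
    """Build the subscription message Finnhub expects for one symbol."""
    return json.dumps({"type": "subscribe", "symbol": symbol})

def main():
    import websocket  # imported lazily so the helper above works without it

    def on_open(ws):
        for symbol in ("AAPL", "AMZN"):
            ws.send(subscribe_payload(symbol))

    def on_message(ws, message):
        print(message)  # raw trade JSON as it arrives

    websocket.WebSocketApp(
        FINNHUB_URL, on_open=on_open, on_message=on_message
    ).run_forever()

if __name__ == "__main__":
    main()
```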

As updates happen, we receive websocket messages and send them to Kafka for use in Flink SQL, Kafka Connect, Spark Streaming, Kafka Streams, Python, Java Spring Boot apps, .NET apps, and NiFi.

Definition of Fields

s - Symbol.

p - Last price.

t - UNIX milliseconds timestamp.

v - Volume.

c - List of trade conditions. A comprehensive list of trade condition codes can be found in the Finnhub API documentation.
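For orientation, a trade update from this feed arrives as a JSON envelope whose data array holds one or more trades using the short field names above. A small Python sketch of walking that envelope (the sample values are made up):

```python
import json

# A made-up sample in the shape the feed sends: a "type" field plus a
# "data" array of trades keyed by the short field names (s, p, t, v, c).
sample = json.dumps({
    "type": "trade",
    "data": [
        {"s": "AAPL", "p": 132.05, "t": 1608754790000, "v": 100, "c": None}
    ],
})

# Pull each trade out of the envelope and read its fields.
for trade in json.loads(sample)["data"]:
    print(trade["s"], trade["p"], trade["t"], trade["v"])
```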


Incoming Websocket Text Message Processing



We parse out the fields we want, then rename them to something readable. We then build a new JSON record that matches our trades schema and push it to Kafka.


As a first step, we need to set up a controller service to connect to finnhub's websocket API.
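A sketch of that configuration, assuming the stock JettyWebSocketClient controller service that the ConnectWebSocket processor uses (property names from memory; verify them against your NiFi version):

```properties
# Controller Service: JettyWebSocketClient
WebSocket URI = wss://ws.finnhub.io?token=YOURTOKEN
# The endpoint is wss, so an SSL Context Service is needed:
SSL Context Service = <your StandardSSLContextService>
Connection Timeout = 3 sec
```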


We can see data in flight via NiFi Provenance.




The detailed steps and settings for converting raw websocket text messages into the final messages sent to Kafka:

Raw Data from the Websocket Text Message

Formatted JSON Data Before Converting and Sending to Kafka Topic (trades)


We can view the final clean data in Kafka via Cloudera Streams Messaging Manager (SMM).
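Beyond SMM, you can also spot-check the topic with a small consumer. A sketch using the third-party kafka-python package; the broker address and topic name are assumptions for a local setup:

```python
import json

TOPIC = "trades"

def decode_trade(raw: bytes) -> dict:
    """Deserialize one Kafka message value holding a JSON trade record."""
    return json.loads(raw.decode("utf-8"))

def main():
    from kafka import KafkaConsumer  # third-party: pip install kafka-python

    # Assumed local broker; adjust bootstrap_servers for your cluster.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
    )
    for msg in consumer:
        print(decode_trade(msg.value))

if __name__ == "__main__":
    main()
```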


Schema

https://github.com/tspannhw/ApacheConAtHome2020/blob/main/schemas/trades.avsc
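For orientation, a trades record schema for fields like these could take the following shape; the field names and types here are illustrative, and the linked trades.avsc is the authoritative version.

```json
{
  "type": "record",
  "name": "trades",
  "fields": [
    {"name": "symbol", "type": "string"},
    {"name": "price",  "type": "double"},
    {"name": "ts",     "type": "long"},
    {"name": "volume", "type": "long"}
  ]
}
```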


Happy Holidays from Tim and the Streaming Felines!




