
[FLaNK] Smart Weather Websocket Application - Kafka Consumer

Part 2 of 2



This is based on Koji Kawamura's excellent GIST: 

https://gist.github.com/ijokarumawak/60b9ab2038ef906731ebf4c0eee97176

As part of my Smart Weather Application, I wanted to display weather information in a web page as it arrives, using websockets. Koji has an excellent NiFi flow that does exactly that. I tweaked it and added a few things since I am not using Zeppelin; I am hosting my web page with NiFi as well.

https://www.datainmotion.dev/2020/11/flank-smart-weather-applications-with.html

We simply supply a web page that makes a websocket connection to NiFi, and NiFi keeps a cache in HBase to track what each client is doing. That cache is updated by consuming from Kafka, so we can feed events to the page as they happen.





Here is the JavaScript for the web page interface to websockets:

<script>
  // Send a typed message over the websocket connection.
  function sendMessage(type, payload) {
    websocket.send(makeMessage(type, payload));
  }

  // Wrap the type and payload in the JSON envelope NiFi expects.
  function makeMessage(type, payload) {
    return JSON.stringify({
      'type': type,
      'payload': payload
    });
  }

  var wsUri = "ws://edge2ai-1.dim.local:9091/test";

  var websocket = new WebSocket(wsUri);

  // On connect, publish the contents of the message field to NiFi.
  websocket.onopen = function(evt) {
    sendMessage('publish', {
      "message": document.getElementById("kafkamessage").value
    });
  };

  websocket.onerror = function(evt) { console.log('ERR', evt); };

  // Each incoming message is a JSON array of weather observations;
  // render each one as a line with its station icon and readings.
  websocket.onmessage = function(evt) {
    var dataPoints = JSON.parse(evt.data);
    var output = document.getElementById("results");
    var dataBuffer = "<p>";
    for (var i = 0; i < dataPoints.length; i++) {
      dataBuffer += " <img src=\"" + dataPoints[i].icon_url_base + dataPoints[i].icon_url_name + "\"> &nbsp;" +
        dataPoints[i].location + " " +
        dataPoints[i].station_id + "@" + dataPoints[i].latitude + ":" +
        dataPoints[i].longitude + "@" + dataPoints[i].observation_time + " " +
        dataPoints[i].temperature_string + "," + dataPoints[i].relative_humidity + "," +
        dataPoints[i].wind_string + "<br>";
    }
    output.innerHTML = output.innerHTML + dataBuffer + "</p><br>";
  };
</script>
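
For reference, a minimal host page for that script might look like the sketch below. The kafkamessage input and results div are the two element ids the script above assumes exist; the rest of the markup is illustrative, not the page from the repo:

<!-- Minimal sketch of a hosting page; element ids match the script above. -->
<html>
  <body>
    <!-- Text the client publishes to NiFi when the websocket opens. -->
    <input type="text" id="kafkamessage" value="hello from the browser">
    <!-- Incoming weather observations get appended here. -->
    <div id="results"></div>
    <!-- The websocket <script> block from above goes here. -->
  </body>
</html>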




Video Walkthrough:   https://www.twitch.tv/videos/797412192?es_id=bbacb7cb39

Source Code:   https://github.com/tspannhw/SmartWeather/tree/main



Kafka Topic

weathernj Schema
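
The actual weathernj schema is defined in the registry; purely as an illustration, an Avro record covering the fields the web page renders might look like this (every name and type below is an assumption inferred from the JavaScript above, not the registry's definition):

{
  "type": "record",
  "name": "weathernj",
  "doc": "Hypothetical sketch inferred from the fields the page renders",
  "fields": [
    {"name": "location", "type": "string"},
    {"name": "station_id", "type": "string"},
    {"name": "latitude", "type": "string"},
    {"name": "longitude", "type": "string"},
    {"name": "observation_time", "type": "string"},
    {"name": "temperature_string", "type": "string"},
    {"name": "relative_humidity", "type": "string"},
    {"name": "wind_string", "type": "string"},
    {"name": "icon_url_base", "type": "string"},
    {"name": "icon_url_name", "type": "string"}
  ]
}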

The schema registry has a live Swagger interface to its REST API.
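
That also means the schema is retrievable programmatically. A minimal sketch, assuming Cloudera Schema Registry's standard REST path and its default port 7788 on the same host (both assumptions, not taken from the post), could fetch the latest weathernj schema like this:

// Sketch only, not the author's code: fetch the latest weathernj schema
// text from the Schema Registry REST API. Host, port, and path are
// assumptions based on Cloudera Schema Registry defaults.
fetch("http://edge2ai-1.dim.local:7788/api/v1/schemaregistry/schemas/weathernj/versions/latest")
  .then(function(response) { return response.json(); })
  .then(function(version) {
    // schemaText holds the Avro schema as a JSON string.
    console.log(version.schemaText);
  })
  .catch(function(err) { console.log('Schema fetch failed', err); });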




NiFi Flow Overview


Ingest All US Weather Data from Zipped XML via REST



As Data Streams In, We Can Govern It





Ingested Data Is Validated Against Its Schema, Then Pushed to Kafka as Avro


We consume that Kafka data and store it in Kudu for analytics.



We host the web page for our websockets application in NiFi with four simple processors.



Listen and Put Web Socket Messages Between NiFi Server and Web Application



Kafka Data is Cached for Websocket Applications


Set the Port for WebSockets via Jetty Web Server


Use HBase As Our Cache




We can monitor our Flink SQL application from the Global Flink Dashboard






We can query our weather data stored in Apache Kudu via Apache Impala through Apache Hue.




Kudu Visualizations of Our Weather Data in Cloudera Visual Applications
