
Analyzing Wood Burning Stoves with FLaNK Stack: MiNiFi, Flink, NiFi, Kafka, Kudu



Winter has finally arrived. The 50-70 F days are over and it dropped below 30 F in Princeton, so it was time to light up the wood burning stove and burn some seasoned cherry wood. (We get cherry wood from a local tree service that removes dead trees for people, and then we season the wood. Recycle!) It's great for campfires, smoking meats and heating the house, and if you have never smelled cherry wood smoke, it is amazing.

I wanted to see if a fire that raised my house's temperature from 67 F to 87 F would produce noticeable sensor readings. Fortunately, I have a thermal camera sensor (Pimoroni rocks! Add another thing to my list of things I love from Britain: Dr. Who, Jelly Babies, Pimoroni and my awesome boss Dan). I also have Raspberry Pi sensors for temperature, humidity, light and various gases. Let's see what the numbers look like.

The temperatures and images start green and yellow and, as they heat up, turn red, purple and then pure white. That's real hot. Fortunately the Raspberry Pis didn't overheat, though I had to open a window when we got close to 90 F. Yes, temperature regulation and maybe an automated wood feeder would be nice.
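The temperature, humidity and pressure values in the readings further down come from a BME280 breakout on one of the Pis. A minimal sketch of how such a reading could be taken, assuming the RPi.bme280 and smbus2 Python libraries with the sensor on I2C bus 1 at address 0x76 (adjust for your wiring):

# Minimal BME280 read sketch. Assumes the RPi.bme280 and smbus2 packages
# and a sensor on I2C bus 1 at address 0x76 -- adjust for your wiring.
import smbus2
import bme280

I2C_BUS = 1
I2C_ADDRESS = 0x76

bus = smbus2.SMBus(I2C_BUS)
calibration = bme280.load_calibration_params(bus, I2C_ADDRESS)
sample = bme280.sample(bus, I2C_ADDRESS, calibration)

temp_f = sample.temperature * 9.0 / 5.0 + 32.0
print("Temp: {0:.2f} F  Humidity: {1:.2f} %  Pressure: {2:.2f} hPa".format(
    temp_f, sample.humidity, sample.pressure))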
  




Inside the Stove


Cherry wood burning nicely in the stove. Notice the fire on the Cloudera T-shirt.

Four USB PS3 Eye cameras ($7!!!) attached to Raspberry Pi 3B+ and 4s.


A very organized professional assortment of Pis and sensors...





It's very easy to spot-check my sensor values as they stream through Apache Kafka with Cloudera SMM.


Some Sensor Readings:

{"bme280_tempf": "93.78", "uuid": "20200117195629_104c9f2a-b5a8-43d2-8386-57b7bd05f55a", "systemtime": "01/17/2020 14:56:29", "bme280_altitude": "-41.31", "memory": 92.1, "max30105_value": "84.00", "end": "1579290989.4628081", "imgnamep": "images/bog_image_p_20200117195629_104c9f2a-b5a8-43d2-8386-57b7bd05f55a.jpg", "max30105_temp": "34.56", "ipaddress": "192.168.1.251", "diskusage": "44726.6", "host": "garden2", "max30105timestamp": "20200117-145629-345697", "starttime": "01/17/2020 13:48:29", "bme280_altitude_feet": "-135.53", "max30105_delta": "0.00", "max30105_mean": "84.00", "max30105_detected": "False", "bme280_tempc": "34.32", "bme280_pressure": "1034.61", "cputemp": 59, "te": "4079.87322807312", "imgname": "images/bog_image_20200117195629_104c9f2a-b5a8-43d2-8386-57b7bd05f55a.jpg"}


[{"uuid":"sgp30_uuid_xyg_20200117185015","ipaddress":"192.168.1.221","runtime":"0","host":"garden3","host_name":"garden3","macaddress":"dc:a6:32:32:98:20","end":"1579287015.6653564","te":"0.025962352752685547","systemtime":"01/17/2020 13:50:15","cpu":55.0,"diskusage":"109290.8 MB","memory":29.7,"equivalentco2ppm":"  400","totalvocppb":"   37","id":"20200117185015_b8fbd9c1-fa30-4f70-b20d-e43a2c703b18"}]

{"uuid": "rpi4_uuid_kse_20200117222947", "ipaddress": "192.168.1.243", "host": "rp4", "host_name": "rp4", "macaddress": "dc:a6:32:03:a6:e9", "systemtime": "01/17/2020 17:29:47", "cpu": 50.8, "diskusage": "46208.0 MB", "memory": 18.2, "id": "20200117222947_e9299089-d56f-468b-8bac-897a2918307a", "temperature": "48.96355155197982", "pressure": "1035.4460084255888", "humidity": "0.0", "lux": "49.0753", "proximity": "0", "gas": "Oxidising: 30516.85 Ohms\nReducing: 194406.50 Ohms\nNH3: 104000.00 Ohms"}

{"host": "rp4", "cputemp": "72", "ipaddress": "192.168.1.243", "endtime": "1579293634.02", "runtime": "0.00", "systemtime": "01/17/2020 15:40:34", "starttime": "01/17/2020 15:40:34", "diskfree": "46322.7", "memory": "17.1", "uuid": "20200117204034_99f49e71-7444-4fd7-b82e-7e03720c4c39", "image_filename": "20200117204034_d9f811a3-8582-4b47-b4b4-cb6ec51cca04.jpg"}

The next step is to have NiFi load the data into Kudu, Hive, HBase or Phoenix tables for analysis with Cloudera Data Science Workbench (CDSW) and some machine learning analytics in Python 3, in either Zeppelin or Jupyter notebooks feeding CDSW. Then I can host my final model on Kubernetes within CDSW for a real edge-to-AI application and answer the question: how much fire in my house is too much?
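As a taste of what that notebook analysis could look like, here is a minimal sketch of an anomaly check in Python 3; the CSV file name and the choice of scikit-learn's IsolationForest are placeholders for illustration, not part of the flow above:

# Sketch of a simple anomaly check over the sensor readings in a notebook.
# Assumes the Kudu/Hive table was exported to sensor_readings.csv (placeholder
# name); the columns mirror the JSON fields shown earlier.
import pandas as pd
from sklearn.ensemble import IsolationForest

df = pd.read_csv("sensor_readings.csv")
features = df[["bme280_tempf", "cputemp", "memory"]].astype(float)

# IsolationForest flags outliers as -1; treat those rows as "too much fire".
model = IsolationForest(contamination=0.05, random_state=42)
df["anomaly"] = model.fit_predict(features)

print(df.loc[df["anomaly"] == -1, ["systemtime", "bme280_tempf", "cputemp"]])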

This article is part of the FLaNK Stack series, highlighting the use of the FL(ink), Apache NiFi, Kafka, Kudu stack for big data streaming development in IoT and AI applications.
