
EdgeAI: Jetson Nano with MiNiFi C++ Agent

Building and Utilizing the Apache NiFi - MiNiFi C++ Agent for the Jetson Nano

(EdgeAI: Jetson Nano with MiNiFi C++ Agent)


Attributes the agent adds to each flow file (via AppendHostInfo):

source.hostname: jetsonnano
source.ipv4: 192.168.1.217

GetUSBCamera processor property:

FPS: .5


Bootstrap and Build

/opt/demo/nifi-minifi-cpp-source/build

Run bootstrap.sh to select which extensions to build. The options of interest for this project: Kafka, OpenCV, TensorFlow, USB Camera.


****************************************
 Select MiNiFi C++ Features to toggle.
****************************************
A. Persistent Repositories .....Enabled
B. Lib Curl Features ...........Enabled
C. Lib Archive Features ........Enabled
D. Execute Script support ......Enabled
E. Expression Language support .Enabled
F. Kafka support ...............Enabled
G. PCAP support ................Disabled
H. USB Camera support ..........Enabled
I. GPS support .................Disabled
J. TensorFlow Support ..........Disabled
K. Bustache Support ............Disabled
L. MQTT Support ................Enabled
M. SQLite Support ..............Disabled
N. Python Support ..............Enabled
O. COAP Support ................Enabled
S. SFTP Support ................Enabled
V. AWS Support .................Disabled
T. OpenCV Support ..............Enabled
U. OPC-UA Support...............Enabled

****************************************

sudo apt-get install libcurl-dev libcurl4-openssl-dev -y
make


We can see when data arrives in NiFi from a MiNiFi Agent.



 We can publish to Kafka directly from our MiNiFi C++ agent.


If CEM/Edge Flow Manager is a mystery to you, check out the live Swagger REST Documentation.


With MiNiFi C++ I can add a USB Camera.




In NiFi we can see the host information that MiNiFi attached.



Example Data



{"uuid": "nano_uuid_crr_20200218002610", "ipaddress": "192.168.1.217", "top1pct": 54.833984375, "top1": "cab, hack, taxi, taxicab", "cputemp": "45.5", "gputemp": "43.5", "gputempf": "110", "cputempf": "114", "runtime": "4", "host": "jetsonnano", "filename": "/opt/demo/images/image_esq_20200218002610.jpg", "imageinput": "/opt/demo/images/2020-02-17_1926.jpg", "host_name": "jetsonnano", "macaddress": "ec:08:6b:18:0d:7f", "end": "1581985574.6246474", "te": "4.158604383468628", "systemtime": "02/17/2020 19:26:14", "cpu": 51.8, "diskusage": "5479.7 MB", "memory": 71.4, "id": "20200218002610_8a12dd65-1038-41ac-b923-98fc907f5be0"}

Example Config.yml Section


  name: AppendHostInfo
  class: org.apache.nifi.minifi.processors.AppendHostInfo
  max concurrent tasks: 1
  scheduling strategy: TIMER_DRIVEN
  scheduling period: 1000 ms
  penalization period: 30000 ms
  yield period: 1000 ms
  run duration nanos: 0
  auto-terminated relationships list: []
  Properties:
    Hostname Attribute: source.hostname
    IP Attribute: source.ipv4
    Network Interface Name: wlan0

Example Output


[2020-02-11 19:35:09.116] [org::apache::nifi::minifi::processors::ExecuteProcess] [info] Execute Command /opt/demo/rundemo.sh 
[2020-02-11 19:35:11.275] [org::apache::nifi::minifi::c2::C2Agent] [info] Checking 0 triggers
[2020-02-11 19:35:13.742] [org::apache::nifi::minifi::c2::C2Agent] [info] Checking 0 triggers
[2020-02-11 19:35:15.568] [org::apache::nifi::minifi::core::ProcessSession] [info] Transferring 899b5964-4d2f-11ea-8b9a-6e260e221e3d from ExecuteProcess - Python to relationship success
[2020-02-11 19:35:15.568] [org::apache::nifi::minifi::processors::ExecuteProcess] [info] Execute Command Complete /opt/demo/rundemo.sh status 0 pid 31004
[2020-02-11 19:35:15.569] [org::apache::nifi::minifi::core::ProcessSession] [info] Transferring 899b5964-4d2f-11ea-8b9a-6e260e221e3d from AppendHostInfo to relationship success
[2020-02-11 19:35:15.649] [org::apache::nifi::minifi::sitetosite::SiteToSiteClient] [info] Site to Site transaction 4d0b460e-e4f6-4ca1-8c56-30d310a0712b sent flow 1flow records, with total size 3581
[2020-02-11 19:35:15.785] [org::apache::nifi::minifi::sitetosite::HttpSiteToSiteClient] [info] Site to Site closed transaction 4d0b460e-e4f6-4ca1-8c56-30d310a0712b
[2020-02-11 19:35:15.841] [org::apache::nifi::minifi::sitetosite::SiteToSiteClient] [info] Site2Site transaction 4d0b460e-e4f6-4ca1-8c56-30d310a0712b peer finished transaction
[2020-02-11 19:35:15.841] [org::apache::nifi::minifi::io::HttpStream] [warning] Future status already cleared for http://ec2-35-171-154-174.compute-1.amazonaws.com:8080/nifi-api/data-transfer/input-ports/17979d5f-0170-1000-0000-000011f1cc00/transactions/4d0b460e-e4f6-4ca1-8c56-30d310a0712b/flow-files, continuing
[2020-02-11 19:35:16.236] [org::apache::nifi::minifi::c2::C2Agent] [info] Checking 0 triggers
[2020-02-11 19:35:16.263] [org::apache::nifi::minifi::core::ProcessSession] [info] Transferring 8a05413a-4d2f-11ea-8b9a-6e260e221e3d from TailFile to relationship success
[2020-02-11 19:35:16.264] [org::apache::nifi::minifi::processors::TailFile] [info] TailFile nano.log for 616 bytes
[2020-02-11 19:35:16.273] [org::apache::nifi::minifi::core::ProcessSession] [info] Transferring 8a05413a-4d2f-11ea-8b9a-6e260e221e3d from AppendHostInfo to relationship success
[2020-02-11 19:35:16.274] [org::apache::nifi::minifi::core::ProcessSession] [info] Transferring 8a05413a-4d2f-11ea-8b9a-6e260e221e3d from PublishKafka to relationship success
[2020-02-11 19:35:18.748] [org::apache::nifi::minifi::c2::C2Agent] [info] Checking 0 triggers
[2020-02-11 19:35:21.260] [org::apache::nifi::minifi::c2::C2Agent] [info] Checking 0 triggers

Using Apache NiFi - MiNiFi C++ Agent Elsewhere

I am working on a Jetbot robot powered by an NVIDIA Jetson Nano that will use the MiNiFi C++ agent.















Using GrovePi with Raspberry Pi and MiNiFi Agents for Data Ingest to Parquet, Kudu, ORC, Kafka, Hive and Impala

Using GrovePi with Raspberry Pi and MiNiFi Agents for Data Ingest


Source Code:  https://github.com/tspannhw/minifi-grove-sensors

Acquiring sensor data from Grove sensors is easy using a GrovePi Hat and some compatible sensors.


Just before my talk at the Future of Data Meetup @ Bell Works in Holmdel, NJ, I thought I should ingest some data from a Grove sensor interface.

It's so easy a sleeping cat could do it.




So what does this device look like?  



I have a temperature and humidity sensor on there.
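Reading that temperature and humidity sensor from Python takes only a few lines with the GrovePi library. A minimal sketch, assuming the DHT sensor is plugged into digital port D7 and is the blue DHT11 model (adjust the port and module type to match your wiring); the full script in the minifi-grove-sensors repo adds the host and system fields seen in the table DDL later in this post.

import json
import time

import grovepi  # Dexter Industries GrovePi Python library

DHT_PORT = 7   # digital port D7 (assumed; match your wiring)
DHT_TYPE = 0   # 0 = blue DHT11 sensor, 1 = white DHT22 sensor

while True:
    try:
        [temperature, humidity] = grovepi.dht(DHT_PORT, DHT_TYPE)
        print(json.dumps({"temperature": str(temperature),
                          "humidity": str(humidity)}))
    except (IOError, TypeError) as err:
        # transient read errors happen on the GrovePi bus; skip and retry
        print("sensor read failed: %s" % err)
    time.sleep(5)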




The ultrasonic distance sensor is on there too; that's for the next article.




Let's do this with minimal RAM.




That's a 64GB hard drive underneath in the white case with the RPI.





I need more data and BACON.



We design our MiNiFi agent flow in CEM/EFM: grab the JSON data stream and run the sensors.


Apache NiFi 1.9.2 / CFM 1.0 Received HTTPS S2S Events From MiNiFi Agent




A simple flow to query and convert our JSON data, then store it to Kudu and HDFS (ORC) as well as push it to Kafka with a schema.




Let's read that Kafka message and store it to Parquet; we will push to MQTT and JMS in the next article. This is our universal proxy/gateway.
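As a quick sanity check outside of NiFi, you can read the same topic and dump it to a Parquet file in a few lines of Python. This is only a hedged sketch with kafka-python and pyarrow; the broker address, topic name, and output path are placeholders for this setup, and the real pipeline uses the NiFi flow described here.

import json

import pyarrow as pa
import pyarrow.parquet as pq
from kafka import KafkaConsumer  # pip install kafka-python pyarrow

# placeholders: point at your broker and the sensor topic
consumer = KafkaConsumer("grovesensors",
                         bootstrap_servers="kafkabroker:9092",
                         auto_offset_reset="earliest",
                         consumer_timeout_ms=10000)

rows = [json.loads(msg.value.decode("utf-8")) for msg in consumer]
if rows:
    table = pa.Table.from_pylist(rows)  # columns inferred from the JSON keys
    pq.write_table(table, "/tmp/groveparquet/grove_check.parquet")
    print("wrote %d rows" % table.num_rows)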



We could infer a schema and not save it, but saving the schema to the Schema Registry makes SMM, Kafka, NiFi, and other tools schema-aware, and makes it easy to automagically query and convert between CSV, JSON, XML, Avro, Parquet, and more.

Let's store the data in Parquet files on HDFS with an Impala table on top. In Apache NiFi 1.10 there is a Parquet record writer we can use for this.



Before we push to Kafka, let's create a topic for it with Cloudera SMM.



Let's build an Impala table for that Kudu data.



We can query our tables with ease as data is rapidly added.
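If you would rather hit the same tables from code than from Hue, a short impyla sketch works. The host name is a placeholder, and this assumes an unsecured Impala daemon on the default port 21050; on a kerberized cluster you would add the appropriate auth options.

from impala.dbapi import connect  # pip install impyla

conn = connect(host="impala-daemon-host", port=21050)  # placeholder host
cursor = conn.cursor()
cursor.execute("SELECT temperature, humidity, systemtime "
               "FROM grovesensors ORDER BY `end` DESC LIMIT 10")
for row in cursor.fetchall():
    print(row)
cursor.close()
conn.close()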





Let's Examine the Parquet Files that NiFi Generated





Let's query that Parquet data with Impala in Hue.



Let's monitor that data in Kafka with Cloudera SMM.






That was easy: from device to enterprise cloud data store(s), with enterprise messaging, security, governance, lineage, data catalog, SDX, monitoring, and more. How easily can you ingest IoT data, query it mid-stream, and store it in multiple data stores? It took longer to write the article than to do the project and code. All graphical, single sign-on, multiple schemas/versions/data types/engines, multiple OSs, edge, cloud, and laptop. Easy.

Table DDL


CREATE EXTERNAL TABLE IF NOT EXISTS grovesensors2 
(humidity STRING, uuid STRING, systemtime STRING, runtime STRING, cpu DOUBLE, id STRING, te STRING, host STRING, `end` STRING, 
macaddress STRING, temperature STRING, diskusage STRING, memory DOUBLE, ipaddress STRING, host_name STRING) 
STORED AS ORC
LOCATION '/tmp/grovesensors'

CREATE TABLE grovesensors ( uuid STRING,  `end` STRING,humidity STRING, systemtime STRING, runtime STRING, cpu DOUBLE, id STRING, te STRING, 
host STRING,
macaddress STRING, temperature STRING, diskusage STRING, memory DOUBLE, ipaddress STRING, host_name STRING,
PRIMARY KEY (uuid, `end`)
)
PARTITION BY HASH PARTITIONS 16
STORED AS KUDU
TBLPROPERTIES ('kudu.num_tablet_replicas' = '1')

hdfs dfs -mkdir -p /tmp/grovesensors
hdfs dfs -mkdir -p /tmp/groveparquet

CREATE EXTERNAL TABLE grove_parquet 
(
  diskusage STRING,
  memory DOUBLE,
  host_name STRING,
  systemtime STRING,
  macaddress STRING,
  temperature STRING,
  humidity STRING,
  cpu DOUBLE,
  uuid STRING,
  ipaddress STRING,
  host STRING,
  `end` STRING,
  te STRING,
  runtime STRING,
  id STRING
)
STORED AS PARQUET
LOCATION '/tmp/groveparquet/'

Parquet Format



message org.apache.nifi.grove {
  optional binary diskusage (STRING);
  optional double memory;
  optional binary host_name (STRING);
  optional binary systemtime (STRING);
  optional binary macaddress (STRING);
  optional binary temperature (STRING);
  optional binary humidity (STRING);
  optional double cpu;
  optional binary uuid (STRING);
  optional binary ipaddress (STRING);
  optional binary host (STRING);
  optional binary end (STRING);
  optional binary te (STRING);
  optional binary runtime (STRING);
  optional binary id (STRING);
}








Google Coral TPU with Edge Devices and MiNiFi

Google Coral TPU with Edge Devices and MiNiFi 


Designing Our Edge AI Flow with Cloudera Edge Flow Manager.


Configure Your Remote Process Group to Send Data to Your NiFi Cluster


Monitor Your Agents From the Events Screen


Let's grab all the new images and then delete them on completion.



We have Input and Output Ports for bidirectional communication with 0-n MiNiFi agents.


Our NiFi flow to process calls from MiNiFi Agents running Coral TPUs


We run a query to check the TensorFlow Lite classification results and send out a Slack message.
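For context, the device-side classification that produces those results can be done with tflite_runtime and the Edge TPU delegate. This is a rough sketch, not the demo's exact script: the model, labels, and image paths are placeholders, it assumes a quantized *_edgetpu.tflite classification model, and the score scaling depends on the model's quantization parameters.

import json

import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter, load_delegate

MODEL = "/opt/demo/mobilenet_v2_1.0_224_quant_edgetpu.tflite"  # placeholder
LABELS = "/opt/demo/imagenet_labels.txt"                       # placeholder

# load the compiled model onto the Coral Edge TPU
interpreter = Interpreter(model_path=MODEL,
                          experimental_delegates=[load_delegate("libedgetpu.so.1")])
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
_, height, width, _ = inp["shape"]

image = Image.open("/opt/demo/images/latest.jpg").convert("RGB").resize((width, height))
interpreter.set_tensor(inp["index"], np.expand_dims(np.asarray(image), axis=0))
interpreter.invoke()

out = interpreter.get_output_details()[0]
scores = np.squeeze(interpreter.get_tensor(out["index"]))
top1 = int(np.argmax(scores))
labels = [line.strip() for line in open(LABELS)]  # one label per line assumed
# raw score; rescale per the model's quantization to get a percentage
print(json.dumps({"top1": labels[top1], "score": float(scores[top1])}))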



Let's push JSON data to a Kafka cluster in AWS.