
Migrating Apache Flume Flows to Apache NiFi: Any Relational Database To/From Anywhere




This is a simple use case: NiFi acting as a gateway between relational databases and other sources and sinks. We can do a lot more than that in NiFi. We can SELECT, UPDATE, INSERT, DELETE, and run any DML, all with no code. We can also read metadata from an RDBMS and build dynamic ELT systems from it.

It is extremely easy to do this in NiFi.


Instead of using Flume, Let's Use Apache NiFi to Move Any Tabular Data To and From Databases




From a Relational Database (via a JDBC Driver) to Anywhere. In our case, we will pull from an RDBMS and write to Kudu.


Step 1:  QueryDatabaseTableRecord (Create a Connection Pool, Pick the Database Type, Table Name, Record Writer)
Step 2:  PutKudu (Set Kudu Masters, Table Name)
Done!
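The two steps above are pure configuration, but it helps to see what QueryDatabaseTableRecord is doing conceptually: it remembers the maximum value it has seen for a column and fetches only newer rows on each run. Here is a minimal sketch of that incremental-fetch idea in plain Python with SQLite standing in for the JDBC source; the table and column names mirror the iot example later in this post, and NiFi actually persists this state internally.

```python
import sqlite3

# Illustrative sketch only: QueryDatabaseTableRecord tracks the maximum
# value seen for a "Maximum-value Column" and fetches only newer rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE iot (id INTEGER PRIMARY KEY, uuid TEXT, cpu INTEGER)")
conn.executemany("INSERT INTO iot (uuid, cpu) VALUES (?, ?)",
                 [("uuid-1", 10), ("uuid-2", 20)])

max_seen = 0  # NiFi persists this in processor state between runs

def fetch_new_rows(conn, max_seen):
    """Emulate one run of QueryDatabaseTableRecord with a max-value column."""
    rows = conn.execute(
        "SELECT id, uuid, cpu FROM iot WHERE id > ? ORDER BY id", (max_seen,)
    ).fetchall()
    new_max = rows[-1][0] if rows else max_seen
    return rows, new_max

rows, max_seen = fetch_new_rows(conn, max_seen)
print(len(rows))  # first run picks up both existing rows

conn.execute("INSERT INTO iot (uuid, cpu) VALUES ('uuid-3', 30)")
rows, max_seen = fetch_new_rows(conn, max_seen)
print(len(rows))  # second run picks up only the newly inserted row
```

Each run emits only the delta, which is why the processor is safe to schedule on a timer without re-reading the whole table.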

Query Database


Connect to Kudu




Let's Write JSON Records That Get Converted to Kudu Records or RDBMS/MySQL/JDBC Records 
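As a concrete example, here is one hypothetical IoT JSON record of the kind that flows through this pipeline. The field names match the columns of the iot table defined later in the post; all of the values are made up for illustration.

```python
import json

# Hypothetical sample record; field names mirror the iot table columns,
# the values themselves are invented for illustration.
record = {
    "uuid": "20191105-1234-abcd",
    "ipaddress": "192.168.1.23",
    "top1pct": 47,
    "top1": "keyboard",
    "cputemp": "48.0",
    "gputemp": "47.5",
    "gputempf": "117.5",
    "cputempf": "118.4",
    "runtime": "5",
    "host": "rainbow",
    "filename": "/opt/demo/images/image_1.jpg",
    "imageinput": "/opt/demo/images/input_1.jpg",
    "hostname": "rainbow",
    "macaddress": "de:ad:be:ef:00:01",
    "end": "1572968400.0",
    "te": "5.1",
    "systemtime": "11/05/2019 14:00:00",
    "cpu": 31,
    "diskusage": "9622.8 MB",
    "memory": 26,
    "id": "20191105140000_1234",
}
print(json.dumps(record))
```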



Schema For The Data
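NiFi's record readers and writers are typically driven by an Avro schema. The exact schema from the original flow is not shown here, but a sketch for these records might look like the following; the field names mirror the iot table columns, and the JSON is embedded in Python only so it can be checked for validity.

```python
import json

# Illustrative Avro-style schema for the iot records. This exact schema is
# an assumption, not the one from the original flow; field names mirror the
# table columns used throughout this post.
schema = json.loads("""
{
  "type": "record",
  "name": "iot",
  "fields": [
    {"name": "uuid", "type": "string"},
    {"name": "ipaddress", "type": "string"},
    {"name": "top1pct", "type": "long"},
    {"name": "top1", "type": "string"},
    {"name": "cputemp", "type": "string"},
    {"name": "gputemp", "type": "string"},
    {"name": "gputempf", "type": "string"},
    {"name": "cputempf", "type": "string"},
    {"name": "runtime", "type": "string"},
    {"name": "host", "type": "string"},
    {"name": "filename", "type": "string"},
    {"name": "imageinput", "type": "string"},
    {"name": "hostname", "type": "string"},
    {"name": "macaddress", "type": "string"},
    {"name": "end", "type": "string"},
    {"name": "te", "type": "string"},
    {"name": "systemtime", "type": "string"},
    {"name": "cpu", "type": "long"},
    {"name": "diskusage", "type": "string"},
    {"name": "memory", "type": "long"},
    {"name": "id", "type": "string"}
  ]
}
""")
print(len(schema["fields"]))  # 21 fields, matching the table definitions below
```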




Read All The Records From Our JDBC Database





Let's Create an Apache Kudu Table to Put Database Records To





Let's Examine the MySQL Table We Want to Read/Write To and From





Let's Check the MariaDB Table



MySQL Table Information






From Anywhere (Say a Device) to a Relational Database (via a JDBC Driver). In our case, we will insert into an RDBMS from Kafka.


Step 1:  Acquire or modify data, e.g. with ConsumeKafkaRecord_2
Step 2:  PutDatabaseRecord (Set a Record Reader, INSERT or UPDATE Statement Type, Connection Pool, Table Name)
Done!
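Conceptually, PutDatabaseRecord reads each record with the configured Record Reader and turns it into a parameterized statement against the target table. A minimal sketch of that idea in Python, with SQLite standing in for the JDBC target and the iot table trimmed to a few columns for brevity:

```python
import sqlite3

# Sketch of PutDatabaseRecord's job: read records (JSON here) and turn
# each one into a parameterized INSERT. Columns are trimmed for brevity.
records = [
    {"uuid": "a-1", "ipaddress": "10.0.0.1", "cpu": 12},
    {"uuid": "a-2", "ipaddress": "10.0.0.2", "cpu": 34},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE iot (uuid TEXT PRIMARY KEY, ipaddress TEXT, cpu INTEGER)")

cols = ["uuid", "ipaddress", "cpu"]
sql = "INSERT INTO iot ({}) VALUES ({})".format(
    ", ".join(cols), ", ".join("?" for _ in cols))
conn.executemany(sql, [tuple(r[c] for c in cols) for r in records])
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM iot").fetchone()[0])  # 2
```

Parameterized statements are also what keeps this safe: the record values never get spliced into the SQL text.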


Put Database Records in Any JDBC/RDBMS



Setup Your Connection Pool to SELECT, UPDATE, INSERT or DELETE
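NiFi's DBCPConnectionPool service (configured with a JDBC driver, connection URL, user, and password) hands out reusable database connections to any processor that needs one. As a toy illustration of the pooling idea only, here is a minimal pool in Python built from a queue of SQLite connections; real NiFi uses Apache Commons DBCP with your JDBC driver.

```python
import queue
import sqlite3

# Toy illustration of the idea behind DBCPConnectionPool: a fixed set of
# reusable connections handed out and returned via a queue. Real NiFi pools
# JDBC connections configured with a driver class, URL, user, and password.
class TinyPool:
    def __init__(self, size):
        self._pool = queue.Queue()
        for _ in range(size):
            # check_same_thread=False so any worker thread may use a connection
            self._pool.put(sqlite3.connect(":memory:", check_same_thread=False))

    def acquire(self):
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)

pool = TinyPool(size=2)
conn = pool.acquire()
print(conn.execute("SELECT 1").fetchone()[0])  # 1
pool.release(conn)
```

The point of pooling is that acquiring a connection is cheap after startup, so every flowfile does not pay the cost of a new database login.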





SQL DDL

Create MariaDB/MySQL Table


CREATE TABLE iot (
  uuid VARCHAR(255) NOT NULL PRIMARY KEY,
  ipaddress VARCHAR(255),
  top1pct BIGINT,
  top1 VARCHAR(255),
  cputemp VARCHAR(255),
  gputemp VARCHAR(255),
  gputempf VARCHAR(255),
  cputempf VARCHAR(255),
  runtime VARCHAR(255),
  host VARCHAR(255),
  filename VARCHAR(255),
  imageinput VARCHAR(255),
  hostname VARCHAR(255),
  macaddress VARCHAR(255),
  `end` VARCHAR(255),
  te VARCHAR(255),
  systemtime VARCHAR(255),
  cpu BIGINT,
  diskusage VARCHAR(255),
  memory BIGINT,
  id VARCHAR(255)
);

Create Kudu Table


CREATE TABLE iot (
  uuid STRING,
  ipaddress STRING,
  top1pct BIGINT,
  top1 STRING,
  cputemp STRING,
  gputemp STRING,
  gputempf STRING,
  cputempf STRING,
  runtime STRING,
  host STRING,
  filename STRING,
  imageinput STRING,
  hostname STRING,
  macaddress STRING,
  `end` STRING,
  te STRING,
  systemtime STRING,
  cpu BIGINT,
  diskusage STRING,
  memory BIGINT,
  id STRING,
  PRIMARY KEY (uuid)
)
PARTITION BY HASH PARTITIONS 16
STORED AS KUDU
TBLPROPERTIES ('kudu.num_tablet_replicas' = '1');

