Migrating Apache Flume Flows to Apache NiFi: Kafka Source to HTTP REST Sink and HTTP REST Source to Kafka Sink



This is a simple use case: acting as a gateway between a REST API and Kafka.  We can do a lot more than that in NiFi.  We can be a Kafka consumer and producer, as well as POST to REST endpoints and receive REST calls on configurable ports.  All with No Code.
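For example, once a flow has HandleHttpRequest listening and PublishKafka behind it, you can verify the round trip from Python. A minimal sketch, assuming NiFi listens on port 8081 at path /contentListener and publishes to a topic named events; all three names are illustrative, not part of any particular flow:

import json
import requests                          # pip install requests
from kafka import KafkaConsumer         # pip install kafka-python

# POST a JSON message to the (assumed) NiFi HTTP listener
resp = requests.post("http://localhost:8081/contentListener",
                     json={"id": 1, "msg": "hello nifi"})
print(resp.status_code)

# Read it back from the (assumed) Kafka topic NiFi publishes to
consumer = KafkaConsumer("events",
                         bootstrap_servers="localhost:9092",
                         auto_offset_reset="earliest",
                         consumer_timeout_ms=10000)
for record in consumer:
    print(json.loads(record.value))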

NiFi can act as a listener for HTTP requests and provide HTTP responses in a scriptable, full web-server mechanism with Jetty.  Or it can listen for HTTP REST calls on a port and route those files anywhere: https://community.cloudera.com/t5/Community-Articles/Parsing-Web-Pages-for-Images-with-Apache-NiFi/ta-p/248415.  We can also do WebSockets, both serving an application (https://community.cloudera.com/t5/Community-Articles/An-Example-WebSocket-Application-in-Apache-NiFi-1-1/ta-p/248598) and consuming external feeds (https://community.cloudera.com/t5/Community-Articles/Accessing-Feeds-from-EtherDelta-on-Trades-Funds-Buys-and/ta-p/248316).

It is extremely easy to do this in NiFi.




Kafka Consumer to REST POST



HTTP REST to Kafka Producer




Full Monitoring on Apache NiFi






A Very Common Use Case:  Ingesting Stock Feeds From REST to Kafka



Migrating Apache Flume Flows to Apache NiFi: SYSLOG to KAFKA




This is a simple use case of being a smart gateway/proxy between SYSLOG and Kafka.   We can do a lot more than that in NiFi.  We can be a Kafka consumer and producer, as well as read and parse all types of logs, including SYSLOG.  We have a GrokReader for converting semi-structured logs into manageable tabular-style data with schemas:   Log->JSON/CSV/AVRO/PARQUET.  All with No Code.

It is extremely easy to do this in NiFi.
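To test a ListenSyslog flow without a real syslog daemon, you can fire a sample RFC 3164-style line at it yourself. A minimal sketch, assuming ListenSyslog is configured for UDP on port 1514 (the port is an assumption):

import socket

# An RFC 3164-style syslog line: <priority>timestamp host tag: message
message = "<34>Oct 11 22:14:15 myhost su: 'su root' failed on /dev/pts/8"

# Send it over UDP to the (assumed) ListenSyslog port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(message.encode("utf-8"), ("localhost", 1514))
sock.close()

ParseSyslog, or a GrokReader with a syslog pattern, can then turn that line into structured fields.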

See Log Parsing: https://www.datainmotion.dev/2019/08/migrating-apache-flume-flows-to-apache.html

Detailed Tutorials

https://blog.davidvassallo.me/2018/09/19/apache-nifi-from-syslog-to-elasticsearch/
https://community.cloudera.com/t5/Community-Articles/NiFi-Send-to-syslog/ta-p/248638
http://www.youritgoeslinux.com/impl/bigdata/nifi/syslog

Processors

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.9.2/org.apache.nifi.processors.standard.ListenSyslog/

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.9.2/org.apache.nifi.processors.standard.ParseSyslog/index.html



Migrating Apache Flume Flows to Apache NiFi: Twitter Source to Kafka Sink


This is the fourth article in this series.

This is a simple use case of pushing Tweets to Kafka.   We can do a lot more than that in NiFi.   As shown in the referenced article, we can do deep learning on Tweet images, run sentiment analysis, query the Tweets in-stream, send messages to email/Slack based on certain criteria, and retweet automagically.  All with No Code.

It is extremely easy to do this in NiFi.





Create a Kafka Topic To Send Tweets To
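If you would rather script this step than click through a UI, a minimal sketch with kafka-python (the topic name, partition count and replication factor here are assumptions):

from kafka.admin import KafkaAdminClient, NewTopic   # pip install kafka-python

# Create a 'tweets' topic for the NiFi Kafka producer to write to
admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
admin.create_topics([NewTopic(name="tweets",
                              num_partitions=3,
                              replication_factor=1)])
admin.close()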


Example Tweet



Configure Your Connection to Twitter



Configure Kafka Producer



NiFi Flow to Send Two Twitter Sources To Kafka


Additional Apache NiFi Flow Details For Other Processing







Kafka Messages in Topic Sent From NiFi Producer 
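To double-check what the NiFi producer wrote, a minimal consumer sketch (the topic name 'tweets' is an assumption):

from kafka import KafkaConsumer

consumer = KafkaConsumer("tweets",
                         bootstrap_servers="localhost:9092",
                         auto_offset_reset="earliest",   # read from the beginning
                         consumer_timeout_ms=5000)       # stop once the topic is idle
for record in consumer:
    print(record.partition, record.offset, record.value[:120])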



Source Code


Migrating Apache Flume Flows to Apache NiFi: Kafka Source to Apache Parquet on HDFS


This is the third article in this series.

This is one possible simple, fast replacement for "Flafka".   I can read any/all Kafka topics, route and transform them with SQL, and store them in Apache ORC, Apache Avro, Apache Parquet, Apache Kudu, Apache HBase, JSON, CSV, XML or compressed files of many types in S3, Apache HDFS, file systems or anywhere you want to stream this data in real time.   All with a fast, easy-to-use web UI.   Everything you liked doing in Flume, but now easier and with more source and sink options.
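For contrast, here is roughly what just one hop of that - consume a Kafka topic, land it as Parquet - looks like hand-coded in Python. The topic name and libraries are my choices for illustration; in NiFi the same hop is a ConsumeKafka processor wired to a Parquet writer, with schemas, batching, retries and provenance handled for you:

import json
import pandas as pd                      # pip install pandas pyarrow kafka-python
import pyarrow as pa
import pyarrow.parquet as pq
from kafka import KafkaConsumer

# Pull a batch of JSON records from a (hypothetical) topic
consumer = KafkaConsumer("sensors",
                         bootstrap_servers="localhost:9092",
                         auto_offset_reset="earliest",
                         consumer_timeout_ms=5000)
records = [json.loads(r.value) for r in consumer]

# Land the batch as a single Parquet file
if records:
    pq.write_table(pa.Table.from_pandas(pd.DataFrame(records)), "sensors.parquet")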







Consume Kafka And Store to Apache Parquet


Kafka to Kudu, ORC, AVRO and Parquet 


With Apache NiFi 1.10 I can send those Parquet files anywhere, not only HDFS.


JSON (or CSV or AVRO or ...) and Parquet Out

In Apache NiFi 1.10, Parquet has a dedicated record reader and writer.


Or I can use PutParquet.



Create A Parquet Table and Query It
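A minimal sketch of that step from Python with PyHive; the host, table name, columns and location are all assumptions, and the same DDL works directly in a Hive or Impala shell:

from pyhive import hive    # pip install pyhive

conn = hive.Connection(host="localhost", port=10000)
cursor = conn.cursor()

# External table over the directory where NiFi lands the Parquet files
cursor.execute("""
CREATE EXTERNAL TABLE IF NOT EXISTS sensors_parquet
(sensor_id BIGINT, sensor_ts BIGINT, temperature DOUBLE)
STORED AS PARQUET
LOCATION '/tmp/sensors_parquet'
""")

cursor.execute("SELECT * FROM sensors_parquet LIMIT 10")
for row in cursor.fetchall():
    print(row)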









Migrating Apache Flume Flows to Apache NiFi: Kafka Source to HDFS / Kudu / File / Hive


This is one possible simple, fast replacement for "Flafka".




Consume / Publish Kafka And Store to Files, HDFS, Hive 3.1, Kudu



Consume Kafka Flow 



Merge Records And Store As AVRO or ORC


Consume Kafka, Update Records via Machine Learning Models In CDSW And Store to Kudu



Source:  Apache Kafka Topics


You enter a few parameters and start ingesting data, with or without schemas.   Apache Flume had no schema support, and it did not support transactions.
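To feed such a flow while testing, a minimal producer sketch (the topic name and payload are illustrative):

import json
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092",
                         value_serializer=lambda v: json.dumps(v).encode("utf-8"))

# Send one JSON record for the NiFi ConsumeKafka processor to pick up
producer.send("iot-events", {"sensor_id": 1, "sensor_ts": 1568745600, "is_healthy": "true"})
producer.flush()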



Sink:   Files




Storing to files in file systems, object stores, SFTP or elsewhere could not be easier.  Choose S3, a local file system, SFTP, HDFS or wherever.

Sink:   Apache Kudu / Apache Impala



Storing to Kudu/Impala (or Parquet, for that matter) could not be easier with Apache NiFi.


Sink:   HDFS for Apache ORC Files


When the flow completes, the ConvertAvroToORC and PutHDFS processors build the Hive DDL for you!  You can build the tables automagically with Apache NiFi if you wish.

CREATE EXTERNAL TABLE IF NOT EXISTS iotsensors
(sensor_id BIGINT, sensor_ts BIGINT, is_healthy STRING, response STRING, sensor_0 BIGINT, sensor_1 BIGINT,
sensor_2 BIGINT, sensor_3 BIGINT, sensor_4 BIGINT, sensor_5 BIGINT, sensor_6 BIGINT, sensor_7 BIGINT, sensor_8 BIGINT,
sensor_9 BIGINT, sensor_10 BIGINT, sensor_11 BIGINT)
STORED AS ORC
LOCATION '/tmp/iotsensors'





Sink: Kafka

Publishing to Kafka is just as easy!  Push records with schema references or raw data.  AVRO or JSON, whatever makes sense for your enterprise.

Write data easily with no coding and no changes or redeploys for schema or schema-version changes.
Pick a Topic and Stream Data While Converting Types


Clean UI and REST API to Manage, Monitor, Configure and Notify on Kafka




Other Reasons to Use Apache NiFi Over Apache Flume

DevOps with REST API, CLI, Python API
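For example, the monitoring data behind the UI is all available over the REST API. A minimal sketch against two real endpoints on an unsecured local instance (host and port are assumptions):

import requests

base = "http://localhost:8080/nifi-api"

# Overall flow status: component counts and queued flowfiles
print(requests.get(f"{base}/flow/status").json())

# JVM heap and repository usage for the node
print(requests.get(f"{base}/system-diagnostics").json())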

Schemas!   We work not only with semi-structured, structured and unstructured data; we are also schema and schema-version aware for CSV, JSON, AVRO, XML, grokked text files and more. https://community.cloudera.com/t5/Community-Articles/Big-Data-DevOps-Apache-NiFi-HWX-Schema-Registry-Schema/ta-p/247963

Flume Replacement Use Cases Implemented in Apache NiFi

Sink/Source:   JMS

Source:   Files/PDF/PowerPoint/Excel/Word  Sink:  Files

Source:  Files/CSV  Sink:   HDFS/Hive/Apache ORC

Source:  REST/Files/Simulator   Sink:  HBase, Files, HDFS.    ETL with Lookups.

Flume Replacement - Lightweight Open Source Agents


If you need to replace local log-to-Kafka agents, or anything-to-Kafka, or anything-to-anything with routing, transformation and manipulation, you can use Edge Flow Manager-deployed MiNiFi agents, available in Java and C++ versions.
