Migrating Apache Flume Flows to Apache NiFi: Kafka Source to Apache Parquet on HDFS



Article 3 in this series.

This is one possible simple, fast replacement for "Flafka". I can read any or all Kafka topics, route and transform the records with SQL, and store them as Apache ORC, Apache Avro, Apache Parquet, Apache Kudu, Apache HBase, JSON, CSV, XML, or compressed files of many types in Amazon S3, Apache HDFS, local file systems, or anywhere else you want to stream this data in real time. It also comes with a fast, easy-to-use web UI. Everything you liked doing in Flume, but now easier and with more source and sink options.
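The "route and transform with SQL" part is done by NiFi's QueryRecord processor, which runs Apache Calcite SQL against the records in each FlowFile. A sketch of such a query (the field names and threshold are illustrative assumptions, not from the original flow):

```sql
-- Hypothetical QueryRecord query (Apache Calcite SQL) run against
-- records consumed from Kafka; FLOWFILE is the built-in table name
-- QueryRecord exposes for the incoming FlowFile's records.
SELECT sensor_id,
       CAST(temperature AS DOUBLE) AS temperature,
       event_ts
FROM FLOWFILE
WHERE temperature > 75.0
```

Each query you add to QueryRecord becomes its own output relationship, so routing and transformation happen in one step.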







Consume Kafka And Store to Apache Parquet
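One minimal way to wire this up is a chain of record-oriented processors. This is a sketch, not the exact flow from the screenshots; processor names are stock NiFi, and the optional steps depend on your needs:

```
ConsumeKafkaRecord_2_0   (read records from the Kafka topic)
        |
        v
QueryRecord              (optional: route/transform with SQL)
        |
        v
MergeRecord              (batch small Kafka messages into larger files)
        |
        v
PutParquet               (write Parquet directly to HDFS)
```

Because every step is record-based, the same flow can switch from Parquet to ORC, Avro, JSON, or CSV output just by changing the record writer or sink processor.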


Kafka to Kudu, ORC, AVRO and Parquet 


With Apache NiFi 1.10, I can send those Parquet files anywhere, not only to HDFS.


JSON (or CSV or AVRO or ...) and Parquet Out

In Apache NiFi 1.10, Parquet has a dedicated record reader (ParquetReader) and record set writer (ParquetRecordSetWriter), so any record-oriented processor can read and write Parquet.
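This is what decouples Parquet from HDFS: convert to Parquet in the flow, then send the files to any sink. A sketch (the controller-service choices and directory are assumptions):

```
ConvertRecord
  Record Reader : JsonTreeReader
  Record Writer : ParquetRecordSetWriter
        |
        v
PutHDFS  (or PutS3Object, PutFile, PutSFTP, ...)
  Directory : /data/parquet        (assumption)
```

The ParquetRecordSetWriter produces a complete Parquet file as the FlowFile content, so the downstream processor only needs to move bytes.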


Or I can use the PutParquet processor to write Parquet files directly to HDFS.
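PutParquet reads incoming records and writes them straight to HDFS as Parquet. A sketch of the key properties (the paths and values are illustrative assumptions for a typical Hadoop cluster):

```
PutParquet
  Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
  Directory                      : /data/sensors       (assumption)
  Record Reader                  : JsonTreeReader
  Compression Type               : SNAPPY
```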



Create A Parquet Table and Query It
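Once the Parquet files land under a single HDFS directory, one way to expose them to Hive or Impala is an external table over that directory. The table name, columns, and LOCATION below are illustrative assumptions; adjust them to your schema and output path:

```sql
-- Hypothetical external table over the Parquet files NiFi wrote.
CREATE EXTERNAL TABLE sensors (
  sensor_id   STRING,
  temperature DOUBLE,
  event_ts    BIGINT
)
STORED AS PARQUET
LOCATION '/data/sensors';

-- Query it like any other table.
SELECT sensor_id, AVG(temperature) AS avg_temp
FROM sensors
GROUP BY sensor_id;
```

Because the table is external, dropping it leaves the Parquet files that NiFi continues to write untouched.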








