

Showing posts from April, 2019

Streams Messaging Manager (SMM) REST API

Call the SMM REST API from NiFi: the Swagger operation !/Topic_metrics_operations/getTopicMetrics

Labels: article, demo

curl -X GET "" -H "accept: application/json"

curl -X GET "" -H "accept: application/json"

curl -X GET "" -H "accept: application/json"

Kafka Cluster Details

curl -X GET "" -H "accept: application/json"

curl -X GET "http://magellan-5.f…
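The curl calls above lost their URLs in extraction, so as a rough sketch, here is the equivalent GET request in Python. The host, port, and endpoint path below are placeholders and assumptions, not values from the post; substitute the exact URL shown for getTopicMetrics in your SMM Swagger UI.

```python
import urllib.request

# Placeholder host/port and an ASSUMED endpoint path -- replace both with
# the values your SMM Swagger UI shows for getTopicMetrics.
SMM_BASE = "http://smm-host.example.com:9991"
TOPIC_METRICS_PATH = "/api/v1/admin/metrics/aggregated/topics"

def build_topic_metrics_request(base=SMM_BASE, path=TOPIC_METRICS_PATH):
    """Build the same GET request the curl commands above issue."""
    req = urllib.request.Request(base + path, method="GET")
    req.add_header("accept", "application/json")
    return req

# To actually call it:
#   with urllib.request.urlopen(build_topic_metrics_request()) as resp:
#       print(resp.read())
```

Because the request object is built separately from being sent, the same helper can back a NiFi InvokeHTTP-style flow or a quick command-line check.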

Energy Monitoring via Apache NiFi MiNiFi 0.6.0 C++ Agent with Cloudera Edge Manager

With the advent of version 0.6.0 of the C++ Agent, we can have fast native Python processors that can easily be added to your palette in Cloudera Edge Manager.
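As a rough illustration (not the post's actual processor), a native Python processor for the C++ Agent is a plain module exposing a couple of hook functions. The hook names and the session/flow-file methods below are assumptions based on the scripting interface; verify them against the MiNiFi C++ 0.6.0 documentation before deploying.

```python
# Sketch of a MiNiFi C++ native Python processor module. The hooks
# (describe, onTrigger) and the session methods used here are assumptions --
# check the MiNiFi C++ docs for your agent build.

def describe(processor):
    # Called once so the agent can register the processor's description.
    processor.setDescription("Tags incoming energy readings from a smart plug")

def onTrigger(context, session):
    # Called on each scheduling cycle with the framework's session object.
    flow_file = session.get()
    if flow_file is None:
        return  # nothing queued
    flow_file.addAttribute("sensor.type", "energy")
    session.transfer(flow_file, REL_SUCCESS)  # REL_SUCCESS injected by the agent
```

Dropping a module like this into the agent's configured Python directory is what makes it show up as a processor in the Cloudera Edge Manager palette.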


Publishing and Consuming JMS Messages from Tibco Enterprise Message Service (JMS) with Apache NiFi

TIBCO Enterprise Message Service. I tested this against the most recent release of TIBCO Enterprise Message Service and its JMS driver, available via trial download. I followed the very easy install directions: I downloaded it to a CentOS 7 server, expanded the download to TIB_ems-dev_8.4.0_linux_x86_64, then made TIBCOUniversalInstaller-lnx-x86-64.bin executable and ran it with --console. I used all the defaults (I picked server and client) and then quickly ran the finished install's server.

Running TIBCO on CentOS 7:
cd /opt/tibco/ems/8.4/bin/
./tibemsd64 -config ~/TIBCO_HOME/tibco/cfgmgmt/ems/data/tibemsd.conf

Example JMS queue settings:
URL: tcp://servername:7222
Class: com.tibco.tibjms.TibjmsQueueConnectionFactory
Directory: /opt/tibco/ems/8.4/lib/

I believe it just uses these files from that directory: tibjms.jar, jms-2.0.jar. Once I have my server and port shown, it's easy to add those settings to Apache NiFi. The settings I…

IoT Edge Use Cases with Apache Kafka and Apache NiFi - MiNiFi

Article. MiNiFi Java Agent 0.5. Copy over the necessary NARs from the Apache NiFi 1.7 lib directory:

nifi-ssl-context-service-nar-1.7.0.nar
nifi-standard-services-api-nar-1.7.0.nar
nifi-kafka-1-0-nar-1.7.0.nar

This will support PublishKafka_1_0 and ConsumeKafka_1_0. Then create a consume and/or publish flow; you can combine the two based on your needs. In my simple example I consume the Kafka messages in MiNiFi and write them to a file. I also write the metadata to a JSON file. Consume Kafka. Publish electric monitoring data to Kafka. Let's monitor the messages going through our topic, smartPlug. Publish messages to Kafka; consume any messages from the smartPlug topic.

Logs:
ProvenanceEvent file containing 377 records. In the past 5 minutes, 1,512 events have been written to the Provenance Repository, totaling 839.32 KB
2018-11-26 19:42:32,473 INFO [main] o.a.n.c.s.StandardProcessScheduler Starting PutFile[id=25a86505-031a-37d9-0000-000000000000]
2018-11-26 19:42:32,474 INFO [main