

Showing posts with the label edge

Energy Monitoring via Apache NiFi MiNiFi 0.6.0 C++ Agent with Cloudera Edge Manager

With the advent of version 0.6.0 of the C++ Agent, we can have native, fast Python processors that can easily be added to your palette in Cloudera Edge Manager.
Source: https://github.com/tspannhw/minificpp-python-EnergyMonitoring
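The repository above ships energy readings from a Python processor on the agent up to NiFi. As a rough, hardware-free sketch of the payload side (the `read_energy` sampler and the field names here are hypothetical illustrations, not taken from the repo):

```python
import json
import random
import time

def read_energy():
    # Hypothetical sampler: a real agent would query a power meter or a
    # sensor library here instead of generating random values.
    return {"voltage_v": round(random.uniform(118.0, 122.0), 2),
            "current_a": round(random.uniform(0.1, 5.0), 2)}

def build_payload(device_id):
    # Shape one reading into the JSON document the agent ships to NiFi.
    reading = read_energy()
    reading["watts"] = round(reading["voltage_v"] * reading["current_a"], 2)
    reading["device_id"] = device_id
    reading["ts"] = int(time.time())
    return json.dumps(reading)

print(build_payload("rpi-energy-1"))
```

In the actual flow, a native Python processor in the MiNiFi C++ agent would emit this JSON as FlowFile content for NiFi to route downstream.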

Apache NiFi Operations and Monitoring 101

NiFi Operations
https://community.hortonworks.com/articles/207858/more-devops-for-hdf-apache-nifi-and-friends.html
https://community.hortonworks.com/articles/92495/monitor-apache-nifi-with-apache-nifi.html
https://dzone.com/articles/building-a-custom-apache-nifi-operations-dashboard
https://dzone.com/articles/simple-apache-nifi-operations-dashboard-part-2-spr
https://www.slideshare.net/Hadoop_Summit/best-practices-and-lessons-learnt-from-running-apache-nifi-at-renault
https://community.hortonworks.com/articles/183217/devops-backing-up-apache-nifi-registry-flows.html
https://community.hortonworks.com/articles/177349/big-data-devops-apache-nifi-hwx-schema-registry-sc.html
https://community.hortonworks.com/articles/177301/big-data-devops-apache-nifi-flow-versioning-and-au.html
https://community.hortonworks.com/articles/161761/new-features-in-apache-nifi-15-apache-nifi-registr.html
https://community.hortonworks.com/articles/191658/devops-tips-using-the-apache-nifi-
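A common first step in monitoring NiFi with NiFi (or anything else) is polling its REST API for queue depth. A minimal polling sketch is below; the endpoint path and response field names match recent NiFi releases, but verify them against your version's API docs, and the base URL and threshold are assumptions:

```python
import json
import urllib.request

NIFI_API = "http://localhost:8080/nifi-api"  # adjust to your instance

def fetch_flow_status(base_url=NIFI_API):
    # GET /flow/status returns a controller-level summary for the whole flow.
    with urllib.request.urlopen(f"{base_url}/flow/status") as resp:
        return json.load(resp)

def queue_alert(status, max_flowfiles=10000):
    # Flag the flow when total queued FlowFiles cross a threshold; the
    # "controllerStatus"/"flowFilesQueued" keys follow the /flow/status
    # response shape.
    ctrl = status["controllerStatus"]
    return ctrl["flowFilesQueued"] > max_flowfiles
```

Usage would be `queue_alert(fetch_flow_status())` on a schedule, feeding whatever alerting channel your operations dashboard uses.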

Barcelona DataWorks Summit March 2019

I just returned from this awesome event. Not even a rough plane trip can dampen my spirits after seeing all the amazing things we got to do this year. It was nice to see familiar faces among attendees from 2017 and 2018, including my friends from Prague and Germany! Thanks to Andy LoPresto, George Vetticaden, Dinesh Chandrasekhar, Purnima, Nathan, and Dan Chaffelson for great pictures, talks, and support, and for being an amazing team for Data in Motionists.

Meetup

The meetup was great and in the same hall as some other amazing meetups at the same time. A great experience for those at Summit early (and open to all people for free).
https://www.slideshare.net/bunkertor/the-edge-to-ai-deep-dive-barcelona-meetup-march-2019
https://www.meetup.com/futureofdata-barcelona/events/259345951/
Highlight: Dan spinning up NiFi at scale in the audience on Google Cloud on Kubernetes with ease!
Highlight: Andy's crushing-it MiNiFi and NiFi presentation! I think he h

Using Raspberry Pi 3B+ with Apache NiFi MiNiFi and Google Coral Accelerator and Pimoroni Inky Phat

Architecture

Introduction

First we need to unbox our new goodies. The Inky pHAT is an awesome e-ink display with low power usage that keeps its image displayed after shutdown! Next I added a new Google Coral Edge TPU ML Accelerator USB coprocessor to a new Raspberry Pi 3B+. This was easy to integrate and get up and running. Let's unbox this beautiful device (but be careful: when it runs it can get really hot, and there is a warning in the instructions). So I run it on top of an aluminum case with a big fan on it.

Pimoroni Inky pHAT

It is pretty easy to set up, and it provides a robust Python library for writing to our e-ink display. You can see an example screen here.
https://github.com/pimoroni/inky
Pimoroni Inky pHAT ePaper eInk Display in Red
Pimoroni Inky pHAT (Red)
https://shop.pimoroni.com/products/inky-phat
https://github.com
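To give a feel for the Inky pHAT's Python library, here is a small sketch that pushes a sensor status line to the display. The `inky`/PIL calls run only on the Pi (and follow the library's documented `InkyPHAT` / `set_image` / `show` API, though you should confirm against the repo above), while `format_status` is pure Python; the 20-character line width is an assumption about the default font, not a library constraint:

```python
def format_status(temp_c, humidity, width=20):
    # Fit two readings onto one short e-ink line of `width` characters.
    line = f"T:{temp_c:.1f}C H:{humidity:.0f}%"
    return line[:width]

def show_on_inky(text):
    # Hardware path, guarded so the sketch also runs off-device.
    try:
        from inky import InkyPHAT              # Pimoroni library
        from PIL import Image, ImageDraw, ImageFont
    except ImportError:
        print(text)                            # fall back to stdout
        return
    display = InkyPHAT("red")                  # matches the red pHAT above
    img = Image.new("P", (display.WIDTH, display.HEIGHT))
    ImageDraw.Draw(img).text((4, 4), text, fill=display.BLACK,
                             font=ImageFont.load_default())
    display.set_image(img)
    display.show()

show_on_inky(format_status(21.4, 48))
```

Because e-ink keeps its image after power-off, writing a final status line on shutdown is a nice trick for headless Pi deployments like this one.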

Edge to AI: Apache Spark, Apache NiFi, Apache NiFi MiNiFi, Cloudera Data Science Workbench Example

Use Case

IoT devices with sensors and cameras.

Overview

In this, the third post of the CDSW series, we build on using CDSW to classify images with a Python Apache MXNet model. In this use case we receive edge data from devices running MiNiFi agents that collect sensor data and images and also run edge analytics with TensorFlow. An Apache NiFi server collects this data, with full data lineage, via HTTP calls from the device(s). We then filter, transform, merge, and route the sensor data, image data, deep learning analytics data, and metadata to different data stores. As part of the flow we upload our images to a cloud-hosted FTP server (this could be S3 or any media store anywhere) and call a CDSW model from Apache NiFi via REST, getting the model results back as JSON. We also store our sensor data in Parquet files in HDFS. We then trigger a PySpark job from CDSW via its API from Apache NiFi and check its status. We store the status result data in Parquet as well for PySpark SQL
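The REST call to the CDSW model can be sketched in a few lines. CDSW-deployed models generally accept a JSON body carrying the model's access key plus the caller's payload under "request" (check your CDSW version's model docs); the endpoint URL and sensor fields below are placeholders, and in the actual flow NiFi's InvokeHTTP performs the same POST:

```python
import json
import urllib.request

def build_model_request(access_key, sensor_row):
    # Wrap one sensor row in the body shape CDSW models expect.
    return json.dumps({"accessKey": access_key, "request": sensor_row})

def score(url, access_key, sensor_row):
    # POST the sensor row to the deployed model and return its JSON reply.
    req = urllib.request.Request(
        url,
        data=build_model_request(access_key, sensor_row).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The JSON that comes back is what the flow routes and stores alongside the raw sensor data in Parquet.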