Scanning Documents into Data Lakes via Tesseract, MQTT, Python, JSON, Records, TensorFlow, OpenCV and Apache NiFi

There are many awesome open source tools available to integrate with your Big Data Streaming flows.
Take a look at these articles for installation and why the new version of Tesseract is different.
I am officially recommending Python 3.6 or newer. Please don't use Python 2.7 if you don't have to. Friends don't let friends use old Python.
Tesseract 4 with Deep Learning
For installation on a Mac Laptop:
  1. brew install tesseract --HEAD
  2. pip3.6 install pytesseract
  3. brew install leptonica
Note: if you have tesseract already, you may need to uninstall and unlink it first with brew. If you don't use brew, you can install another way.
Summary
  1. Execute run.sh, which runs pytesstest.py (https://github.com/tspannhw/nifi-tesseract-python/blob/master/pytesstest.py).
  2. It sends an MQTT message containing the OCR text and some other attributes in JSON format to the tesseract topic on the specified MQTT broker.
  3. Apache NiFi reads from this topic via ConsumeMQTT.
  4. The flow checks that the payload is valid JSON via RouteOnContent.
  5. MergeRecord combines many JSON records into one big Apache Avro file.
  6. ConvertAvroToORC turns that Avro file into a superfast Apache ORC file for storage.
  7. PutHDFS stores the ORC file in HDFS.
Running The Python Script
You could hook this up to a scanner or point it at a directory, and schedule it to run every 30 seconds or so. I had it hooked up to a local Apache NiFi instance to schedule runs. It can also be run by the MiNiFi Java agent or MiNiFi C++ agent, or on demand if you wish.
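The message the script publishes can be reconstructed from the Avro schema and sample values shown later in this article. Here is a minimal, hypothetical sketch of assembling that JSON payload in Python; the build_payload helper and the sample values are illustrative, not the actual pytesstest.py code:

```python
# Hypothetical sketch: assemble the JSON payload whose fields match the
# inferred Avro schema (text, imgname, host, end, te, battery,
# systemtime, cpu, diskusage, memory, id).
import json
import socket
import time
import uuid

def build_payload(text, imgname, te, battery, cpu, diskusage, memory):
    # end and te are stored as strings, matching the inferred schema
    return {
        "text": text,
        "imgname": imgname,
        "host": socket.gethostname(),
        "end": str(time.time()),
        "te": str(te),
        "battery": battery,
        "systemtime": time.strftime("%m/%d/%Y %H:%M:%S"),
        "cpu": cpu,
        "diskusage": diskusage,
        "memory": memory,
        "id": time.strftime("%Y%m%d%H%M%S") + "_" + str(uuid.uuid4()),
    }

json_string = json.dumps(build_payload(
    "sample OCR text", "images/example.jpg",
    te=3.73, battery=100, cpu=22.8, diskusage="113759.7 MB", memory=69.4))
```

A json_string like this is what gets handed to the paho-mqtt publish call and lands on the tesseract topic.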
Sending MQTT Messages From Python
  # MQTT (requires the paho-mqtt package)
  import paho.mqtt.client as mqtt

  client = mqtt.Client()
  client.username_pw_set("user", "pass")
  client.connect("server.server.com", 17769, 60)
  client.publish("tesseract", payload=json_string, qos=0, retain=True)
You will need to run: pip3 install paho-mqtt
Create the HDFS Directory
  hdfs dfs -mkdir -p /tesseract

Create the External Hive Table (DDL Built by NiFi)
  CREATE EXTERNAL TABLE IF NOT EXISTS tesseract (`text` STRING, imgname STRING, host STRING, `end` STRING, te STRING, battery INT, systemtime STRING, cpu DOUBLE, diskusage STRING, memory DOUBLE, id STRING) STORED AS ORC
  LOCATION '/tesseract';

This DDL is a side effect: it's built by our ORC conversion and HDFS storage steps.
You could run that create script in Hive View 2, Beeline or another Apache Hive JDBC/ODBC tool. I used Apache Zeppelin since I am going to be doing queries there anyway.

Let's Ingest Our Captured Images, Process Them with Apache Tika and TensorFlow, and Grab the Metadata
Consume MQTT Records and Store in Apache Hive
Let's look at other fields in Zeppelin
Let's Look at Our Records in Apache Zeppelin via a SQL Query (SELECT * FROM tesseract)
ConsumeMQTT: Give me all the records from the tesseract topic on our MQTT broker. This isolates us from our ingest clients, which could number 100,000 devices.
MergeRecord: Merge all the JSON records sent via MQTT into one big Avro file
ConvertAvroToORC: Convert our merged Avro file into ORC
PutHDFS
Tesseract Example Schema in Hortonworks Schema Registry
TIP: You can generate your schema with InferAvroSchema. Do that once, copy it and paste into Schema Registry. Then you can remove that step from your flow.
The Schema Text
  {
    "type": "record",
    "name": "tesseract",
    "fields": [
      {
        "name": "text",
        "type": "string",
        "doc": "Type inferred from '\"cgi cctong aiternacrety, pou can acces the complete Pro\\nLance repesiiry from eh Provenance mens: The Provenance\\n‘emu inchades the Date/Time, Actontype, the Unsque Fowie\\nTD and other sata. Om the ar it is smal exci i oe:\\n‘ick chs icon, and you get the flowin On the right, war\\n‘cots like three inthe cic soemecaed gether Liege:\\n\\nLineage ts visualined as « lange direcnad sqycie graph (DAG) char\\nSrones the seeps 1m she Gow where modifications oF routing ‘oot\\nplace on the Aewiike. Righe-iieit « step lp the Lineage s view\\nSetusls aboot the fowtle at that step ar expand the ow to ander:\\nScand where & was potentially domed frum. Af the very bottom\\nleft of the Lineage Oi a slider wath a play button to play the pro\\n“sing flow (with scaled ame} and understand where tbe owtise\\nSpent the meat Game of at whch PORN get muted\\n\\naide the Bowtie dealin, you cam: finn deed analy of box\\n\\ntern\\n=\"'"
      },
      {
        "name": "imgname",
        "type": "string",
        "doc": "Type inferred from '\"images/tesseract_image_20180613205132_c14779b8-1546-433e-8976-ddb5bfc5f978.jpg\"'"
      },
      {
        "name": "host",
        "type": "string",
        "doc": "Type inferred from '\"HW13125.local\"'"
      },
      {
        "name": "end",
        "type": "string",
        "doc": "Type inferred from '\"1528923095.3205361\"'"
      },
      {
        "name": "te",
        "type": "string",
        "doc": "Type inferred from '\"3.7366552352905273\"'"
      },
      {
        "name": "battery",
        "type": "int",
        "doc": "Type inferred from '100'"
      },
      {
        "name": "systemtime",
        "type": "string",
        "doc": "Type inferred from '\"06/13/2018 16:51:35\"'"
      },
      {
        "name": "cpu",
        "type": "double",
        "doc": "Type inferred from '22.8'"
      },
      {
        "name": "diskusage",
        "type": "string",
        "doc": "Type inferred from '\"113759.7 MB\"'"
      },
      {
        "name": "memory",
        "type": "double",
        "doc": "Type inferred from '69.4'"
      },
      {
        "name": "id",
        "type": "string",
        "doc": "Type inferred from '\"20180613205132_c14779b8-1546-433e-8976-ddb5bfc5f978\"'"
      }
    ]
  }
The above schema was generated by Infer Avro Schema in Apache NiFi.
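Since the record's fields are known, you can sanity-check an outgoing JSON record against this schema with a few lines of standard-library Python. This simplified validator is just a sketch; a real pipeline would use the avro or fastavro packages, and NiFi's record readers do this for you:

```python
# Simplified sketch: verify a record has every schema field present
# with a compatible Python type (avro/fastavro would do this properly).
import json

schema = json.loads("""
{"type": "record", "name": "tesseract", "fields": [
  {"name": "text", "type": "string"}, {"name": "imgname", "type": "string"},
  {"name": "host", "type": "string"}, {"name": "end", "type": "string"},
  {"name": "te", "type": "string"}, {"name": "battery", "type": "int"},
  {"name": "systemtime", "type": "string"}, {"name": "cpu", "type": "double"},
  {"name": "diskusage", "type": "string"}, {"name": "memory", "type": "double"},
  {"name": "id", "type": "string"}]}
""")

PY_TYPES = {"string": str, "int": int, "double": float}

def validate(record, schema):
    # Every declared field must exist and map to the expected Python type.
    return all(isinstance(record.get(f["name"]), PY_TYPES[f["type"]])
               for f in schema["fields"])

record = {"text": "sample", "imgname": "images/x.jpg", "host": "HW13125.local",
          "end": "1528923095.32", "te": "3.73", "battery": 100,
          "systemtime": "06/13/2018 16:51:35", "cpu": 22.8,
          "diskusage": "113759.7 MB", "memory": 69.4, "id": "20180613_abc"}
```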
Image Analytics Results
  {
    "tiffImageWidth" : "1280",
    "ContentType" : "image/jpeg",
    "JPEGImageWidth" : "1280 pixels",
    "FileTypeDetectedFileTypeName" : "JPEG",
    "tiffBitsPerSample" : "8",
    "ThumbnailHeightPixels" : "0",
    "label4" : "book jacket",
    "YResolution" : "1 dot",
    "label5" : "pill bottle",
    "ImageWidth" : "1280 pixels",
    "JFIFYResolution" : "1 dot",
    "JPEGImageHeight" : "720 pixels",
    "filecreationTime" : "2018-06-13T17:24:07-0400",
    "JFIFThumbnailHeightPixels" : "0",
    "DataPrecision" : "8 bits",
    "XResolution" : "1 dot",
    "ImageHeight" : "720 pixels",
    "JPEGNumberofComponents" : "3",
    "JFIFXResolution" : "1 dot",
    "FileTypeExpectedFileNameExtension" : "jpg",
    "JPEGDataPrecision" : "8 bits",
    "FileSize" : "223716 bytes",
    "probability4" : "1.74%",
    "tiffImageLength" : "720",
    "probability3" : "3.29%",
    "probability2" : "6.13%",
    "probability1" : "81.23%",
    "FileName" : "apache-tika-2858986094088526803.tmp",
    "filelastAccessTime" : "2018-06-13T17:24:07-0400",
    "JFIFThumbnailWidthPixels" : "0",
    "JPEGCompressionType" : "Baseline",
    "JFIFVersion" : "1.1",
    "filesize" : "223716",
    "FileModifiedDate" : "Wed Jun 13 17:24:27 -04:00 2018",
    "Component3" : "Cr component: Quantization table 1, Sampling factors 1 horiz/1 vert",
    "Component1" : "Y component: Quantization table 0, Sampling factors 2 horiz/2 vert",
    "Component2" : "Cb component: Quantization table 1, Sampling factors 1 horiz/1 vert",
    "NumberofTables" : "4 Huffman tables",
    "FileTypeDetectedFileTypeLongName" : "Joint Photographic Experts Group",
    "fileowner" : "tspann",
    "filepermissions" : "rw-r--r--",
    "JPEGComponent3" : "Cr component: Quantization table 1, Sampling factors 1 horiz/1 vert",
    "JPEGComponent2" : "Cb component: Quantization table 1, Sampling factors 1 horiz/1 vert",
    "JPEGComponent1" : "Y component: Quantization table 0, Sampling factors 2 horiz/2 vert",
    "FileTypeDetectedMIMEType" : "image/jpeg",
    "NumberofComponents" : "3",
    "HuffmanNumberofTables" : "4 Huffman tables",
    "label1" : "menu",
    "XParsedBy" : "org.apache.tika.parser.DefaultParser, org.apache.tika.parser.ocr.TesseractOCRParser, org.apache.tika.parser.jpeg.JpegParser",
    "label2" : "web site",
    "label3" : "crossword puzzle",
    "absolutepath" : "/Volumes/seagate/opensourcecomputervision/images/",
    "filelastModifiedTime" : "2018-06-13T17:24:07-0400",
    "ThumbnailWidthPixels" : "0",
    "filegroup" : "staff",
    "ResolutionUnits" : "none",
    "JFIFResolutionUnits" : "none",
    "CompressionType" : "Baseline",
    "probability5" : "1.12%"
  }
This is built using a combination of Apache Tika, TensorFlow and other metadata analysis processors.

Creating An Email Bot in Apache NiFi (Consume and Send Email)


See:  https://community.cloudera.com/t5/Community-Articles/Creating-An-Email-Bot-in-Apache-NiFi/ta-p/249131


Some people say I must have a bot reading and replying to email at all crazy hours of the day, an awesome email assistant. Well, I decided to prototype one.

This is the first piece. After this I will add some Spark machine learning to intelligently reply to emails from a list of pretrained responses. With supervised learning it will learn which emails to send to whom, based on Subject, From, body content, attachments, time of day, sender domain and many other variables.

For now, it just reads some emails and checks for a hard-coded subject.

I could use this to trigger other processes, such as running a batch Spark job.
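The subject check itself is trivial. Here is a hedged Python sketch of the idea; TRIGGER_SUBJECT and should_trigger are hypothetical names, and in the actual flow this test lives in NiFi rather than a script:

```python
# Hypothetical sketch of the hard-coded subject check that decides
# whether to trigger a downstream process such as a batch Spark job.
import email

TRIGGER_SUBJECT = "run spark job"   # assumed hard-coded subject

def should_trigger(raw_message):
    msg = email.message_from_string(raw_message)
    subject = (msg.get("Subject") or "").strip().lower()
    return subject == TRIGGER_SUBJECT

raw = "From: x@example.com\r\nSubject: Run Spark Job\r\n\r\nhello"
```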

Since most people send and use HTML email (that's what Outlook, Outlook.com and Gmail do), I will send and receive HTML emails to make them look more legitimate.

I could also run my fortune script and return that as my email content, making me sound wise, or pull in a random selection of tweets about Hadoop or even recent news, keeping the email current and fresh.

Snippet Example of a Mixed Content Email Message (Attachments Removed to Save Space)

Return-Path: <x@example.com>
Delivered-To: nifi@example.com
Received: from x.x.net
    by x.x.net (Dovecot) with LMTP id +5RhOfCcB1jpZQAAf6S19A
    for <nifi@example.com>; Wed, 19 Oct 2016 12:19:13 -0400
Return-path: <x@example.com>
Envelope-to: nifi@example.com
Delivery-date: Wed, 19 Oct 2016 12:19:13 -0400
Received: from [x.x.x.x] (helo=smtp.example.com)
    by x.example.com with esmtp (Exim)
    id 1bwtaC-0006dd-VQ
    for nifi@example.com; Wed, 19 Oct 2016 12:19:12 -0400
Received: from x.x.net ([x.x.x.x])
    by x with bizsmtp
    id xUKB1t0063zlEh401UKCnK; Wed, 19 Oct 2016 12:19:12 -0400
X-EN-OrigIP: 64.78.52.185
X-EN-IMPSID: xUKB1t0063zlEh401UKCnK
Received: from x.x.net (localhost [127.0.0.1])
    (using TLSv1 with cipher AES256-SHA (256/256 bits))
    (No client certificate requested)
    by emg-ca-1-1.localdomain (Postfix) with ESMTPS id BEE9453F81
    for <nifi@example.com>; Wed, 19 Oct 2016 09:19:10 -0700 (PDT)
Subject: test
MIME-Version: 1.0
x-echoworx-msg-id: e50ca00a-edc5-4030-a127-f5474adf4802
x-echoworx-emg-received: Wed, 19 Oct 2016 09:19:10.713 -0700
x-echoworx-message-code-hashed: 5841d9083d16bded28a3c4d33bc505206b431f7f383f0eb3dbf1bd1917f763e8
x-echoworx-action: delivered
Received: from 10.254.155.15 ([10.254.155.15])
          by emg-ca-1-1 (JAMES SMTP Server 2.3.2) with SMTP ID 503
          for <nifi@example.com>;
          Wed, 19 Oct 2016 09:19:10 -0700 (PDT)
Received: from x.x.net (unknown [x.x.x.x])
    (using TLSv1 with cipher AES256-SHA (256/256 bits))
    (No client certificate requested)
    by emg-ca-1-1.localdomain (Postfix) with ESMTPS id 6693053F86
    for <nifi@example.com>; Wed, 19 Oct 2016 09:19:10 -0700 (PDT)
Received: from x.x.net (x.x.x.x) by
 x.x.net (x.x.x.x) with Microsoft SMTP
 Server (TLS) id 15.0.1178.4; Wed, 19 Oct 2016 09:19:09 -0700
Received: from x.x.x.net ([x.x.x.x]) by
 x.x.x.net ([x.x.x.x]) with mapi id
 15.00.1178.000; Wed, 19 Oct 2016 09:19:09 -0700
From: x x<x@example.com>
To: "nifi@example.com" <nifi@example.com>
Thread-Topic: test
Thread-Index: AQHSKiSFTVqN9ugyLEirSGxkMiBNFg==
Date: Wed, 19 Oct 2016 16:19:09 +0000
Message-ID: <D49AD137-3765-4F9A-BF98-C4E36D11FFD8@hortonworks.com>
Accept-Language: en-US
Content-Language: en-US
X-MS-Has-Attach: yes
X-MS-TNEF-Correlator:
x-ms-exchange-messagesentrepresentingtype: 1
x-ms-exchange-transport-fromentityheader: Hosted
x-originating-ip: [71.168.178.39]
x-source-routing-agent: Processed
Content-Type: multipart/related;
    boundary="_004_D49AD13737654F9ABF98C4E36D11FFD8hortonworkscom_";
    type="multipart/alternative"


--_004_D49AD13737654F9ABF98C4E36D11FFD8hortonworkscom_
Content-Type: multipart/alternative;
    boundary="_000_D49AD13737654F9ABF98C4E36D11FFD8hortonworkscom_"


--_000_D49AD13737654F9ABF98C4E36D11FFD8hortonworkscom_
Content-Type: text/plain; charset="utf-8"
Content-Transfer-Encoding: base64

Python Script to Parse Email Messages

#!/usr/bin/env python

"""Unpack a MIME message into a directory of files."""
import json
import os
import sys
import email
import errno
import mimetypes
from optparse import OptionParser
from email.parser import Parser

def main():
    parser = OptionParser(usage="""Unpack a MIME message into a directory of files.
Usage: %prog [options] msgfile
""")
    parser.add_option('-d', '--directory',
                      type='string', action='store',
                      help="""Unpack the MIME message into the named
                      directory, which will be created if it doesn't already
                      exist.""")
    opts, args = parser.parse_args()
    if not opts.directory:
        parser.error('a -d/--directory argument is required')
    try:
        os.mkdir(opts.directory)
    except OSError as e:
        # Ignore directory exists error
        if e.errno != errno.EEXIST:
            raise
    msgstring = ''.join(str(x) for x in sys.stdin.readlines())

    msg = email.message_from_string(msgstring)

    headers = Parser().parsestr(msgstring)
    response  = {'To': headers['to'], 'From': headers['from'], 'Subject': headers['subject'], 'Received': headers['Received']}
    print json.dumps(response)
    counter = 1
    for part in msg.walk():
        # multipart/* are just containers
        if part.get_content_maintype() == 'multipart':
            continue
        # Applications should really sanitize the given filename so that an
        # email message can't be used to overwrite important files
        filename = part.get_filename()
        if not filename:
            ext = mimetypes.guess_extension(part.get_content_type())
            if not ext:
                # Use a generic bag-of-bits extension
                ext = '.bin'
            filename = 'part-%03d%s' % (counter, ext)
        counter += 1
        fp = open(os.path.join(opts.directory, filename), 'wb')
        fp.write(part.get_payload(decode=True))
        fp.close()

if __name__ == '__main__':
    main()

mailnifi.sh

python mailnifi.py -d /opt/demo/email/"$@"

Python uses its email module to parse the message. It ships with the standard library in modern Pythons; if your environment somehow lacks it, you can install it via pip:

pip install email

I am using Python 2.7; you could use a newer Python 3.x.

Here is the flow:

11373-emailassistantflow.png

11375-consumepop3.png

11376-emailparseflow.png

11374-attributes.png

For the final part of the flow, I read the files created by the parsing, load them to HDFS and delete from the file system using the standard GetFile.

11371-readparsedemailsflow.png

 

Reference:

Files:

email-assistant-12-jan-2017.xml



Simple Leprechaun Detector.... And then how to make it more advanced

Okay, maybe it just detects anyone or anything moving. Let's call it a Leprechaun if you have a kid who builds a Leprechaun trap.

The easy one is to use a USB web camera, Raspberry Pi and Motion software.

In the second version we will add Apache NiFi MiNiFi, which will read the images and send them on.

Github:   https://github.com/tspannhw/leprechaun-detector/tree/master

This will send you an image when one is detected.

Install the Motion Detector


 apt-get install motion -y

Edit the Configuration


 /etc/motion/motion.conf 


Some needed configuration

 # Command to be executed when an event starts (default: none).
 # An event starts at the first motion detected after a period of
 # no motion defined by event_gap.
 on_event_start /opt/demo/runmotion.sh %f



Store your images somewhere MiNiFi can grab them


target_dir /opt/demo/images
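The %f passed by on_event_start is the path of the image Motion just wrote. Here is a hypothetical Python helper that a script like runmotion.sh could call to stage that file for MiNiFi; the directory constant and function name are assumptions, not the actual script:

```python
# Hypothetical helper: copy the image Motion just captured (%f) into
# the directory that MiNiFi watches.
import os
import shutil
import sys

MINIFI_DIR = "/opt/demo/images"  # assumed to match target_dir above

def stage_for_minifi(image_path, dest_dir=MINIFI_DIR):
    os.makedirs(dest_dir, exist_ok=True)  # Python 3
    dest = os.path.join(dest_dir, os.path.basename(image_path))
    shutil.copy2(image_path, dest)        # preserves timestamps
    return dest

if __name__ == "__main__" and len(sys.argv) > 1:
    print(stage_for_minifi(sys.argv[1]))
```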

Start the Motion Detector


/etc/init.d/motion start





Posting Images to Imgur via Apache NiFi Using Custom Processor


As part of a flow from a web camera, I decided that imgur would be a nice place to push images that I can reference publicly in Cloudera Data Science Workbench calls for processing with Apache MXNet GluonCV YOLOv3.

I updated my custom processor since I needed a header.

I should make this allow for multiple headers and more.

For now, I'll stick with this. It is built for Apache NiFi 1.9.0 with updated parameters.






PostImage Processor NAR Release
https://github.com/tspannhw/nifi-postimage-processor/releases/tag/1.1

Imgur

https://apidocs.imgur.com/

Sign up for the API to use this and heed their limits. This is for non-commercial purposes.
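Under the hood, the processor is doing an HTTP POST with an Authorization header. A rough standard-library sketch of the equivalent Imgur upload request follows; the helper name is a placeholder and the real work happens inside the custom processor:

```python
# Rough sketch of the upload the PostImage processor performs for Imgur:
# POST base64 image data with a Client-ID Authorization header.
import base64
import urllib.parse
import urllib.request

IMGUR_UPLOAD_URL = "https://api.imgur.com/3/image"

def build_upload_request(image_bytes, client_id):
    data = urllib.parse.urlencode({
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "type": "base64",
    }).encode("ascii")
    req = urllib.request.Request(IMGUR_UPLOAD_URL, data=data)
    req.add_header("Authorization", "Client-ID " + client_id)
    return req

# To actually send it (requires a registered client ID):
# response = urllib.request.urlopen(build_upload_request(img_bytes, client_id))
```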

Here is an example image uploaded to imgur


Results From HTTP Post


post.header
{Transfer-Encoding=[chunked], Server=[nginx/1.13.5], Access-Control-Allow-Methods=[POST, GET, OPTIONS, PATCH, PUT, DELETE], Connection=[close], X-Ratelimit-Userlimit=[2000], X-Post-Rate-Limit-Reset=[52], X-Ratelimit-Clientreset=[86400], Date=[Fri, 15 Mar 2019 20:32:46 GMT], Access-Control-Allow-Headers=[Content-Type, Content-Length, Accept-Encoding, X-CSRF-Token, Authorization], X-Ratelimit-Userreset=[3600], X-Ratelimit-Userremaining=[1999], Strict-Transport-Security=[max-age=15724800; includeSubDomains;], Cache-Control=[no-store, no-cache, must-revalidate, post-check=0, pre-check=0], Access-Control-Allow-Credentials=[true], X-Post-Rate-Limit-Remaining=[1244], X-Ratelimit-Clientlimit=[12500], X-Post-Rate-Limit-Limit=[1250], X-Ratelimit-Clientremaining=[12499], Content-Type=[application/json]}
post.results
{"data":{"in_most_viral":false,"ad_type":0,"link":"https://i.imgur.com/NEfUOaY.jpg","description":null,"section":null,"title":null,"type":"image/jpeg","deletehash":"oRHxGI63iyEligc","datetime":1552681953,"has_sound":false,"id":"NEfUOaY","in_gallery":false,"vote":null,"views":0,"height":480,"bandwidth":0,"nsfw":null,"is_ad":false,"edited":"0","ad_url":"","tags":[],"account_id":0,"size":368339,"width":640,"account_url":null,"name":"","animated":false,"favorite":false},"success":true,"status":200}
post.status
OK
post.statuscode
200
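The post.results attribute above is plain JSON, so a downstream processor or script can pull out the link and deletehash easily. For example, with a trimmed copy of the response shown above:

```python
# Parse a trimmed copy of the post.results JSON shown above.
import json

post_results = """{"data": {"link": "https://i.imgur.com/NEfUOaY.jpg",
  "deletehash": "oRHxGI63iyEligc", "type": "image/jpeg",
  "width": 640, "height": 480, "size": 368339},
  "success": true, "status": 200}"""

result = json.loads(post_results)
link = result["data"]["link"]   # public URL to hand to CDSW / GluonCV
```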


Posting Images to Slack from Apache NiFi Using Custom Processor


As part of one of my remote camera feed projects, I wanted to send the images to Slack.

So I used my PostImage processor to send them via REST API.


It's a very simple flow.



PostImage Processor NAR Release
https://github.com/tspannhw/nifi-postimage-processor/releases/tag/1.1

Example Results

post.header
{X-Cache=[Miss from cloudfront], X-Accepted-OAuth-Scopes=[files:write:user,post], Server=[Apache], Access-Control-Allow-Origin=[*], X-Content-Type-Options=[nosniff], Connection=[keep-alive], Pragma=[no-cache], Date=[Mon, 11 Mar 2019 20:14:33 GMT], Access-Control-Allow-Headers=[slack-route, x-slack-version-ts], Via=[1.1 d0c5747a41ab1b19c48bdc3c7feed516.cloudfront.net (CloudFront)], Referrer-Policy=[no-referrer], Access-Control-Expose-Headers=[x-slack-req-id], Strict-Transport-Security=[max-age=31536000; includeSubDomains; preload], Cache-Control=[private, no-cache, no-store, must-revalidate], X-Via=[haproxy-www-ozs9], X-Slack-Req-Id=[7a42ad8f-bfcf-4b30-a3a6-7d38fb2b1e4a], X-Amz-Cf-Id=[Gr2gyXOdTmRLpTXssuFruYmk_D-487WBNdMtPzjlVj7SrLgsjLYXqw==], Vary=[Accept-Encoding], Expires=[Mon, 26 Jul 1997 05:00:00 GMT], X-XSS-Protection=[0], X-OAuth-Scopes=[identify,bot:basic], Content-Type=[application/json; charset=utf-8]}
post.results
{"file":{"filetype":"jpg","thumb_360":"https://files.slack.com/files-tmb/T1SD6MZMF-FGV6N568J-f7d3118d9a/2019-03-11_1547_360.jpg","thumb_160":"https://files.slack.com/files-tmb/T1SD6MZMF-FGV6N568J-f7d3118d9a/2019-03-11_1547_160.jpg","thumb_480":"https://files.slack.com/files-tmb/T1SD6MZMF-FGV6N568J-f7d3118d9a/2019-03-11_1547_480.jpg","title":"2019-03-11 1547","original_h":480,"ims":[],"mode":"hosted","shares":{"public":{"CGU6WRSNL":[{"channel_name":"images","reply_users":[],"reply_users_count":0,"team_id":"T1SD6MZMF","reply_count":0,"ts":"1552335275.020900"}]}},"image_exif_rotation":1,"url_private":"https://files.slack.com/files-pri/T1SD6MZMF-FGV6N568J/2019-03-11_1547.jpg","id":"FGV6N568J","display_as_bot":false,"timestamp":1552335273,"thumb_64":"https://files.slack.com/files-tmb/T1SD6MZMF-FGV6N568J-f7d3118d9a/2019-03-11_1547_64.jpg","thumb_80":"https://files.slack.com/files-tmb/T1SD6MZMF-FGV6N568J-f7d3118d9a/2019-03-11_1547_80.jpg","created":1552335273,"editable":false,"thumb_480_w":480,"is_external":false,"thumb_360_h":270,"groups":[],"pretty_type":"JPEG","external_type":"","url_private_download":"https://files.slack.com/files-pri/T1SD6MZMF-FGV6N568J/download/2019-03-11_1547.jpg","permalink_public":"https://slack-files.com/T1SD6MZMF-FGV6N568J-b58ce07115","is_starred":false,"size":367476,"channels":["CGU6WRSNL"],"comments_count":0,"name":"2019-03-11_1547.jpg","is_public":true,"thumb_360_w":360,"mimetype":"image/jpeg","public_url_shared":false,"permalink":"https://nifi-se.slack.com/files/UG2L4DSM9/FGV6N568J/2019-03-11_1547.jpg","user":"UG2L4DSM9","original_w":640,"username":"","thumb_480_h":360},"ok":true}
post.status
OK
post.statuscode
200
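As with the Imgur flow, post.results is JSON, so grabbing the public permalink for the uploaded file takes only a couple of lines. The literal below is a trimmed copy of the response above:

```python
# Parse a trimmed copy of the Slack post.results JSON shown above.
import json

post_results = """{"file": {"name": "2019-03-11_1547.jpg",
  "mimetype": "image/jpeg", "size": 367476,
  "permalink_public": "https://slack-files.com/T1SD6MZMF-FGV6N568J-b58ce07115"},
  "ok": true}"""

result = json.loads(post_results)
permalink = result["file"]["permalink_public"]
```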