
Scanning Documents into Data Lakes via Tesseract, MQTT, Python, JSON, Records, TensorFlow, OpenCV and Apache NiFi

There are many awesome open source tools available to integrate with your Big Data Streaming flows.
Take a look at these articles for installation and why the new version of Tesseract is different.
I am officially recommending Python 3.6 or newer. Please don't use Python 2.7 if you don't have to. Friends don't let friends use old Python.
Tesseract 4 with Deep Learning
For installation on a Mac Laptop:
  1. brew install tesseract --HEAD
  2. pip3.6 install pytesseract
  3. brew install leptonica
Note: if you already have Tesseract installed, you may need to uninstall and unlink it with brew first. If you don't use brew, you can install it another way.
  1. Execute the Python script.
  2. It will send an MQTT message with the text and some other attributes in JSON format to the tesseract topic on the specified MQTT broker.
  3. Apache NiFi reads from this topic via ConsumeMQTT.
  4. The flow checks that it's valid JSON via RouteOnContent.
  5. We run MergeRecord to convert a bunch of JSON into one big Apache Avro file.
  6. Then we run ConvertAvroToORC to make a superfast Apache ORC file for storage.
  7. Then we store it in HDFS via PutHDFS.
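In NiFi, RouteOnContent matches flowfile content against a pattern; the sketch below approximates the intent of that valid-JSON check with a full parse, just to show the routing decision the flow makes:

```python
import json

def is_valid_json(payload: str) -> bool:
    """Return True if the payload parses as JSON, mirroring the
    valid/invalid routing decision made in the NiFi flow."""
    try:
        json.loads(payload)
        return True
    except ValueError:
        return False

# A well-formed tesseract message is routed onward; garbage is not.
print(is_valid_json('{"text": "scanned page", "battery": 100}'))  # True
print(is_valid_json('not json at all'))                           # False
```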
Running The Python Script
You could also hook this up to a scanner or point it at a directory, and you could schedule it to run every 30 seconds or so. I had it hooked up to a local Apache NiFi instance to schedule runs. It can also be run by the MiNiFi Java or C++ agents, or on demand if you wish.
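The script sends a flat JSON record whose field names match the schema shown later in this post. Here is a stdlib-only sketch of how that payload might be assembled; the image name is a made-up example, and the cpu, battery, memory and disk metrics are hard-coded stand-ins (the real script would pull them from something like psutil):

```python
import json
import socket
import time
import uuid
from datetime import datetime

def build_payload(ocr_text: str, imgname: str, start: float) -> str:
    """Assemble the flat JSON record sent to the tesseract MQTT topic.
    cpu/battery/memory/diskusage are hard-coded stand-ins here."""
    end = time.time()
    unique_id = "{0}_{1}".format(
        datetime.now().strftime("%Y%m%d%H%M%S"), uuid.uuid4())
    record = {
        "text": ocr_text,
        "imgname": imgname,
        "host": socket.gethostname(),
        "end": str(end),
        "te": str(end - start),      # elapsed OCR time in seconds
        "battery": 100,              # stand-in for battery percent
        "systemtime": datetime.now().strftime("%m/%d/%Y %H:%M:%S"),
        "cpu": 22.8,                 # stand-in for cpu percent
        "diskusage": "113759.7 MB",  # stand-in
        "memory": 69.4,              # stand-in
        "id": unique_id,
    }
    return json.dumps(record)

json_string = build_payload("scanned text", "images/tesseract_image.jpg", time.time())
```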
Sending MQTT Messages From Python
# MQTT
import paho.mqtt.client as mqtt  # pip3 install paho-mqtt

client = mqtt.Client()
client.username_pw_set("user", "pass")
client.connect("", 17769, 60)  # broker host left blank here; fill in your MQTT broker
client.publish("tesseract", payload=json_string, qos=0, retain=True)
You will need to run: pip3 install paho-mqtt
Create the HDFS Directory
  hdfs dfs -mkdir -p /tesseract

Create the External Hive Table (DDL Built by NiFi)
  CREATE EXTERNAL TABLE IF NOT EXISTS tesseract (`text` STRING, imgname STRING, host STRING, `end` STRING, te STRING, battery INT, systemtime STRING, cpu DOUBLE, diskusage STRING, memory DOUBLE, id STRING) STORED AS ORC
  LOCATION '/tesseract';

This DDL is a side effect: NiFi builds it for us during the ORC conversion and HDFS storage steps.
You could run that create script in Hive View 2, Beeline or another Apache Hive JDBC/ODBC tool. I used Apache Zeppelin since I am going to be doing queries there anyway.

Let's Ingest Our Captured Images, Process Them with Apache Tika and TensorFlow, and Grab the Metadata
Consume MQTT Records and Store in Apache Hive
Let's look at other fields in Zeppelin
Let's Look at Our Records in Apache Zeppelin via a SQL Query (SELECT * FROM tesseract)
ConsumeMQTT: Give me all the records from the tesseract topic on our MQTT broker. This isolates us from our ingest clients, which could be 100,000 devices.
MergeRecord: Merge all the JSON files sent via MQTT into one big Avro file.
ConvertAvroToORC: Converts our merged Avro file into an ORC file.
Tesseract Example Schema in Hortonworks Schema Registry
TIP: You can generate your schema with InferAvroSchema. Do that once, copy it and paste into Schema Registry. Then you can remove that step from your flow.
The Schema Text
{
  "type": "record",
  "name": "tesseract",
  "fields": [
    {
      "name": "text",
      "type": "string",
      "doc": "Type inferred from '\"cgi cctong aiternacrety, pou can acces the complete Pro\\nLance repesiiry from eh Provenance mens: The Provenance\\n‘emu inchades the Date/Time, Actontype, the Unsque Fowie\\nTD and other sata. Om the ar it is smal exci i oe:\\n‘ick chs icon, and you get the flowin On the right, war\\n‘cots like three inthe cic soemecaed gether Liege:\\n\\nLineage ts visualined as « lange direcnad sqycie graph (DAG) char\\nSrones the seeps 1m she Gow where modifications oF routing ‘oot\\nplace on the Aewiike. Righe-iieit « step lp the Lineage s view\\nSetusls aboot the fowtle at that step ar expand the ow to ander:\\nScand where & was potentially domed frum. Af the very bottom\\nleft of the Lineage Oi a slider wath a play button to play the pro\\n“sing flow (with scaled ame} and understand where tbe owtise\\nSpent the meat Game of at whch PORN get muted\\n\\naide the Bowtie dealin, you cam: finn deed analy of box\\n\\ntern\\n=\"'"
    },
    {
      "name": "imgname",
      "type": "string",
      "doc": "Type inferred from '\"images/tesseract_image_20180613205132_c14779b8-1546-433e-8976-ddb5bfc5f978.jpg\"'"
    },
    {
      "name": "host",
      "type": "string",
      "doc": "Type inferred from '\"HW13125.local\"'"
    },
    {
      "name": "end",
      "type": "string",
      "doc": "Type inferred from '\"1528923095.3205361\"'"
    },
    {
      "name": "te",
      "type": "string",
      "doc": "Type inferred from '\"3.7366552352905273\"'"
    },
    {
      "name": "battery",
      "type": "int",
      "doc": "Type inferred from '100'"
    },
    {
      "name": "systemtime",
      "type": "string",
      "doc": "Type inferred from '\"06/13/2018 16:51:35\"'"
    },
    {
      "name": "cpu",
      "type": "double",
      "doc": "Type inferred from '22.8'"
    },
    {
      "name": "diskusage",
      "type": "string",
      "doc": "Type inferred from '\"113759.7 MB\"'"
    },
    {
      "name": "memory",
      "type": "double",
      "doc": "Type inferred from '69.4'"
    },
    {
      "name": "id",
      "type": "string",
      "doc": "Type inferred from '\"20180613205132_c14779b8-1546-433e-8976-ddb5bfc5f978\"'"
    }
  ]
}
The above schema was generated by Infer Avro Schema in Apache NiFi.
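Since the schema lives in Schema Registry as plain JSON, you can sanity-check a record against its field list before publishing. A minimal stdlib-only check (no Avro library; type checking is left to Avro and NiFi) might look like:

```python
import json

# Field names taken from the tesseract schema above.
SCHEMA_FIELDS = {"text", "imgname", "host", "end", "te", "battery",
                 "systemtime", "cpu", "diskusage", "memory", "id"}

def matches_schema(payload: str) -> bool:
    """True if the JSON record carries exactly the schema's field names.
    Only guards the key set; value types are validated downstream."""
    try:
        record = json.loads(payload)
    except ValueError:
        return False
    return isinstance(record, dict) and set(record) == SCHEMA_FIELDS
```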
Image Analytics Results
{
  "tiffImageWidth" : "1280",
  "ContentType" : "image/jpeg",
  "JPEGImageWidth" : "1280 pixels",
  "FileTypeDetectedFileTypeName" : "JPEG",
  "tiffBitsPerSample" : "8",
  "ThumbnailHeightPixels" : "0",
  "label4" : "book jacket",
  "YResolution" : "1 dot",
  "label5" : "pill bottle",
  "ImageWidth" : "1280 pixels",
  "JFIFYResolution" : "1 dot",
  "JPEGImageHeight" : "720 pixels",
  "filecreationTime" : "2018-06-13T17:24:07-0400",
  "JFIFThumbnailHeightPixels" : "0",
  "DataPrecision" : "8 bits",
  "XResolution" : "1 dot",
  "ImageHeight" : "720 pixels",
  "JPEGNumberofComponents" : "3",
  "JFIFXResolution" : "1 dot",
  "FileTypeExpectedFileNameExtension" : "jpg",
  "JPEGDataPrecision" : "8 bits",
  "FileSize" : "223716 bytes",
  "probability4" : "1.74%",
  "tiffImageLength" : "720",
  "probability3" : "3.29%",
  "probability2" : "6.13%",
  "probability1" : "81.23%",
  "FileName" : "apache-tika-2858986094088526803.tmp",
  "filelastAccessTime" : "2018-06-13T17:24:07-0400",
  "JFIFThumbnailWidthPixels" : "0",
  "JPEGCompressionType" : "Baseline",
  "JFIFVersion" : "1.1",
  "filesize" : "223716",
  "FileModifiedDate" : "Wed Jun 13 17:24:27 -04:00 2018",
  "Component3" : "Cr component: Quantization table 1, Sampling factors 1 horiz/1 vert",
  "Component1" : "Y component: Quantization table 0, Sampling factors 2 horiz/2 vert",
  "Component2" : "Cb component: Quantization table 1, Sampling factors 1 horiz/1 vert",
  "NumberofTables" : "4 Huffman tables",
  "FileTypeDetectedFileTypeLongName" : "Joint Photographic Experts Group",
  "fileowner" : "tspann",
  "filepermissions" : "rw-r--r--",
  "JPEGComponent3" : "Cr component: Quantization table 1, Sampling factors 1 horiz/1 vert",
  "JPEGComponent2" : "Cb component: Quantization table 1, Sampling factors 1 horiz/1 vert",
  "JPEGComponent1" : "Y component: Quantization table 0, Sampling factors 2 horiz/2 vert",
  "FileTypeDetectedMIMEType" : "image/jpeg",
  "NumberofComponents" : "3",
  "HuffmanNumberofTables" : "4 Huffman tables",
  "label1" : "menu",
  "XParsedBy" : "org.apache.tika.parser.DefaultParser, org.apache.tika.parser.ocr.TesseractOCRParser, org.apache.tika.parser.jpeg.JpegParser",
  "label2" : "web site",
  "label3" : "crossword puzzle",
  "absolutepath" : "/Volumes/seagate/opensourcecomputervision/images/",
  "filelastModifiedTime" : "2018-06-13T17:24:07-0400",
  "ThumbnailWidthPixels" : "0",
  "filegroup" : "staff",
  "ResolutionUnits" : "none",
  "JFIFResolutionUnits" : "none",
  "CompressionType" : "Baseline",
  "probability5" : "1.12%"
}
This is built using a combination of Apache Tika, TensorFlow and other metadata analysis processors.
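The labelN/probabilityN pairs in that metadata are easy to pull out downstream. A small sketch (using a trimmed-down sample of the JSON above) that pairs each label with its probability and ranks them:

```python
import json

def top_labels(metadata_json: str):
    """Pair TensorFlow labelN fields with their probabilityN fields,
    sorted by probability descending."""
    meta = json.loads(metadata_json)
    pairs = []
    n = 1
    while "label%d" % n in meta:
        # Probabilities arrive as strings like "81.23%".
        prob = float(meta.get("probability%d" % n, "0%").rstrip("%"))
        pairs.append((meta["label%d" % n], prob))
        n += 1
    return sorted(pairs, key=lambda p: p[1], reverse=True)

sample = json.dumps({"label1": "menu", "probability1": "81.23%",
                     "label2": "web site", "probability2": "6.13%"})
# top_labels(sample) -> [("menu", 81.23), ("web site", 6.13)]
```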
