
Creating An Email Bot in Apache NiFi (Consume and Send Email)

Some people say I must have a bot reading and replying to email at all crazy hours of the day: an awesome email assistant. Well, I decided to prototype one.
This is the first piece. After this, I will add some Spark machine learning to intelligently reply to emails from a list of pretrained responses. With supervised learning, it will learn which emails to send to whom, based on Subject, From, body content, attachments, time of day, sender domain, and many other variables.
For now, it just reads some emails and checks for a hard coded subject.
I could use this to trigger other processes, such as running a batch Spark job.
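The hard-coded subject check itself can be a single line of NiFi Expression Language in a RouteOnAttribute processor. This sketch assumes an ExtractEmailHeaders processor upstream has populated the `email.headers.subject` attribute (verify the attribute name against your NiFi version's documentation):

```
${email.headers.subject:toLower():equals('test')}
```

FlowFiles matching this property route to the named relationship, which could then feed the processor that kicks off the batch Spark job.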
Since most people send and use HTML email (that's what Outlook and Gmail do), I will send and receive HTML emails to make the bot look more legit.
I could also run my fortune script and return that as the email content, making me sound wise, or pull in a random selection of tweets about Hadoop or even recent news, keeping the email current and fresh.
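As a minimal sketch of building such an HTML reply, here is one way to assemble a multipart/alternative message with the Python standard library (the addresses and body are made up for illustration):

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def build_html_email(sender, recipient, subject, text_body, html_body):
    # multipart/alternative: clients render the last part they support,
    # so the plain-text fallback goes first and the HTML version last
    msg = MIMEMultipart('alternative')
    msg['From'] = sender
    msg['To'] = recipient
    msg['Subject'] = subject
    msg.attach(MIMEText(text_body, 'plain'))
    msg.attach(MIMEText(html_body, 'html'))
    return msg.as_string()


raw = build_html_email('', '',
                       'test', 'Hello from the bot',
                       '<p><b>Hello</b> from the bot</p>')
print(raw)
```

The resulting string could be handed to a PutEmail processor or an SMTP client; the fortune output or tweet text would simply be substituted into the body arguments.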
Snippet Example of a Mixed Content Email Message (Attachments Removed to Save Space)
Return-Path: <>
Received: from
by (Dovecot) with LMTP id +5RhOfCcB1jpZQAAf6S19A
for <>; Wed, 19 Oct 2016 12:19:13 -0400
Return-path: <>
Delivery-date: Wed, 19 Oct 2016 12:19:13 -0400
Received: from [] (
by with esmtp (Exim)
id 1bwtaC-0006dd-VQ
for; Wed, 19 Oct 2016 12:19:12 -0400
Received: from ([])
by walimpinc11 with bizsmtp
id xUKB1t0063zlEh401UKCnK; Wed, 19 Oct 2016 12:19:12 -0400
X-EN-IMPSID: xUKB1t0063zlEh401UKCnK
Received: from (localhost [])
(using TLSv1 with cipher AES256-SHA (256/256 bits))
(No client certificate requested)
by emg-ca-1-1.localdomain (Postfix) with ESMTPS id BEE9453F81
for <>; Wed, 19 Oct 2016 09:19:10 -0700 (PDT)
Subject: test
MIME-Version: 1.0
x-echoworx-msg-id: e50ca00a-edc5-4030-a127-f5474adf4802
x-echoworx-emg-received: Wed, 19 Oct 2016 09:19:10.713 -0700
x-echoworx-message-code-hashed: 5841d9083d16bded28a3c4d33bc505206b431f7f383f0eb3dbf1bd1917f763e8
x-echoworx-action: delivered
Received: from ([])
by emg-ca-1-1 (JAMES SMTP Server 2.3.2) with SMTP ID 503
for <>;
Wed, 19 Oct 2016 09:19:10 -0700 (PDT)
Received: from (unknown [])
(using TLSv1 with cipher AES256-SHA (256/256 bits))
(No client certificate requested)
by emg-ca-1-1.localdomain (Postfix) with ESMTPS id 6693053F86
for <>; Wed, 19 Oct 2016 09:19:10 -0700 (PDT)
Received: from ( by
( with Microsoft SMTP
Server (TLS) id 15.0.1178.4; Wed, 19 Oct 2016 09:19:09 -0700
Received: from ([]) by
([]) with mapi id
15.00.1178.000; Wed, 19 Oct 2016 09:19:09 -0700
From: Timothy Spann <>
To: "" <>
Thread-Topic: test
Thread-Index: AQHSKiSFTVqN9ugyLEirSGxkMiBNFg==
Date: Wed, 19 Oct 2016 16:19:09 +0000
Message-ID: <>
Accept-Language: en-US
Content-Language: en-US
X-MS-Has-Attach: yes
x-ms-exchange-messagesentrepresentingtype: 1
x-ms-exchange-transport-fromentityheader: Hosted
x-originating-ip: []
x-source-routing-agent: Processed
Content-Type: multipart/related;
type="multipart/alternative"

Content-Type: multipart/alternative;

Content-Type: text/plain; charset="utf-8"
Content-Transfer-Encoding: base64
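Note that the text/plain body above is base64-encoded UTF-8, which is why the parsing script in the next section calls `get_payload(decode=True)`. Here is a minimal, self-contained sketch of that decoding step, using a made-up body since the real one was redacted:

```python
import base64
import email

# A small stand-in message mirroring the snippet's final part:
# text/plain, utf-8 charset, base64 transfer encoding
payload = base64.b64encode(u"test message body".encode("utf-8")).decode("ascii")
raw = (
    "MIME-Version: 1.0\n"
    'Content-Type: text/plain; charset="utf-8"\n'
    "Content-Transfer-Encoding: base64\n"
    "Subject: test\n"
    "\n"
    + payload + "\n"
)

msg = email.message_from_string(raw)
# get_payload(decode=True) undoes the base64 transfer encoding
body = msg.get_payload(decode=True).decode("utf-8")
print(body)
```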
Python Script to Parse Email Messages
#!/usr/bin/env python

"""Unpack a MIME message into a directory of files."""

import json
import os
import sys
import email
import errno
import mimetypes
from optparse import OptionParser
from email.parser import Parser


def main():
    parser = OptionParser(usage="""\
Unpack a MIME message into a directory of files.

Usage: %prog [options] msgfile
""")
    parser.add_option('-d', '--directory',
                      type='string', action='store',
                      help="""Unpack the MIME message into the named
directory, which will be created if it doesn't already
exist.""")
    opts, args = parser.parse_args()
    if not
        parser.print_help()
        sys.exit(1)
    try:
        os.makedirs(
    except OSError as e:
        # Ignore directory exists error
        if e.errno != errno.EEXIST:
            raise

    # NiFi pipes the raw message in on stdin
    msgstring = ''.join(str(x) for x in sys.stdin.readlines())
    msg = email.message_from_string(msgstring)

    # Emit the key headers as JSON so NiFi can route on them
    headers = Parser().parsestr(msgstring)
    response = {'To': headers['to'], 'From': headers['from'],
                'Subject': headers['subject'], 'Received': headers['Received']}
    print json.dumps(response)

    counter = 1
    for part in msg.walk():
        # multipart/* are just containers
        if part.get_content_maintype() == 'multipart':
            continue
        # Applications should really sanitize the given filename so that an
        # email message can't be used to overwrite important files
        filename = part.get_filename()
        if not filename:
            ext = mimetypes.guess_extension(part.get_content_type())
            if not ext:
                # Use a generic bag-of-bits extension
                ext = '.bin'
            filename = 'part-%03d%s' % (counter, ext)
        counter += 1
        fp = open(os.path.join(, filename), 'wb')
        fp.write(part.get_payload(decode=True))
        fp.close()


if __name__ == '__main__':
    main()
python -d /opt/demo/email/"$@"
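Called from NiFi's ExecuteStreamCommand, the script reads the raw message on stdin and prints the key headers as JSON on stdout for routing. Here is a stripped-down sketch of just that header step (the addresses are placeholders, not the redacted originals):

```python
import json
from email.parser import Parser

# A minimal placeholder message with the headers the script extracts
raw = ("From: Timothy Spann <>\n"
       "To: <>\n"
       "Subject: test\n"
       "\n"
       "hello\n")

headers = Parser().parsestr(raw)
# Same header-to-JSON shape the full script prints for NiFi to consume
response = {'To': headers['to'], 'From': headers['from'],
            'Subject': headers['subject']}
print(json.dumps(response))
```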
Python's email package handles the message parsing; it ships with the standard library, so there is nothing to install via pip.
I am using Python 2.7, but you could use a newer Python 3.x.
Here is the flow:
For the final part of the flow, I read the files created by the parsing step with the standard GetFile processor, load them into HDFS, and let GetFile remove them from the local file system.
