Trigger a shell script on mail arrival - shell

How can I trigger a shell script on an email arrival that extracts the mail in a text file? I want to extract the information in the mail, process it to determine the request and send an automated response to that request. The mail will basically consist of a data request and the response will have the requested data in a text file attached to it.

Look into the documentation of your MTA (mail transfer agent). Many of them allow you to run scripts or hooks when mail arrives and certain other conditions are met.
If you're using Linux and want a pure client-side solution (i.e. one independent of the mail server software), then you should look at procmail. Its documentation contains lots of useful tips and hints on how to set up the tool (such as performance considerations) and how to properly set up the environment so your script executes correctly.
It also contains examples, such as a service that responds to "ping" mails.
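To sketch the delivery side: procmail can pipe the raw message into a script via a `.procmailrc` recipe such as `:0` / `| /usr/local/bin/handle_request.py` (the path and any subject filter are your choice). The script below is a minimal, hypothetical sketch of the extraction step using Python's standard `email` module; the file path and field handling are assumptions, not a prescribed layout.

```python
from email import message_from_string
from email.message import Message

def extract_text(raw: str) -> tuple[str, str]:
    """Return (subject, plain-text body) of a raw RFC 2822 message."""
    msg: Message = message_from_string(raw)
    subject = msg.get("Subject", "")
    if msg.is_multipart():
        # Collect only the text/plain parts of a multipart message.
        parts = [p.get_payload(decode=True).decode(errors="replace")
                 for p in msg.walk()
                 if p.get_content_type() == "text/plain"]
        body = "\n".join(parts)
    else:
        body = msg.get_payload()
    return subject, body

# From procmail the message arrives on stdin, so the entry point
# would be something like:
#   subject, body = extract_text(sys.stdin.read())
# after which you can write the text to a file and decide on a response.
```

From there, determining the request and mailing back a data file is ordinary text processing plus whatever send mechanism your MTA provides.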

Related

Obiee agents sending delivery content in different formats for different uses

One interesting thing has occurred lately in our delivery files. We had an agent whose task was to send the content of an OBIEE analysis to several destinations (in this case email addresses) on a daily basis in .xls format.
However, when the emails were received, we noticed that the content was delivered to one recipient in .xls format, but to the others in .dat format.
We couldn't find any file or menu in the OBIEE settings where this parameter could be modified. Is there any solution to this?
There isn't one. OA sends out a payload and then your mail server does...whatever it does. The problem will be at the mail server stage.

Email batches via AWS SES, ideas?

I have a SaaS application where each paying customer may have thousands of members they may want to email every now and then.
So far, simple BCC sending via AWS SES has done the trick, but now I am looking at sending personalized emails, so I must be able to send the emails one by one.
As far as I know, SES does not have any queue system; you must make an API call per email. In short, it takes forever to send a batch (my limit is 14 per second), and the user cannot close the page while it is executing (even AJAX calls stop executing if you leave the page, am I right?).
I was thinking of building a system where I store the emails in a database table and then either:
1) Use a cron job that executes every 5 seconds or so, grabs a few emails and sends them.
2) Execute an AJAX script every 5 seconds that grabs the emails for the logged-in customer only and sends them out. But again, if the customer logs out while it executes, chances are that specific batch is interrupted (the remaining ones would still be sent the next time the customer logs in).
Does anyone have any better ideas? Or, which of the two above would be preferred?
You should use templates and the SendBulkTemplatedEmail endpoint that AWS introduced a few months ago: https://aws.amazon.com/blogs/ses/introducing-email-templates-and-bulk-sending/.
That way you can send up to 50 personalized emails with a single SES API call. So 700 with 14 calls.
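To make the 50-per-call limit concrete, here is a hedged sketch of how you might batch recipients and build the `Destinations` payload for `send_bulk_templated_email` with boto3. The sender address, template name, and template fields are illustrative assumptions; the actual API call is shown commented out since it needs AWS credentials and a verified identity.

```python
import json

def chunk(seq, size=50):
    """Split a list into batches of at most `size` (the SES bulk limit is 50)."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def build_destinations(recipients):
    """Build the Destinations payload for send_bulk_templated_email.

    `recipients` is a list of (email, template_data_dict) tuples.
    """
    return [
        {
            "Destination": {"ToAddresses": [email]},
            "ReplacementTemplateData": json.dumps(data),
        }
        for email, data in recipients
    ]

# Sending would then look roughly like this (boto3):
# import boto3
# ses = boto3.client("ses", region_name="us-east-1")
# for batch in chunk(recipients):
#     ses.send_bulk_templated_email(
#         Source="sender@example.com",            # hypothetical sender
#         Template="MyTemplate",                  # hypothetical template name
#         DefaultTemplateData=json.dumps({"name": "friend"}),
#         Destinations=build_destinations(batch),
#     )
```

With 700 recipients this yields 14 batches, matching the numbers above.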
You shouldn't queue them up in a user's browser and send them with a series of AJAX requests, though. Send a single AJAX request to start a job. In most server-side languages (any I can think of) you can respond to an HTTP request and still continue processing after responding. You can also implement a progress checker in a multitude of ways.
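The "respond immediately, keep working" pattern can be sketched with nothing more than the standard library; this is a simplified illustration (a real deployment would use a proper task queue, and the shared `progress` dict stands in for whatever progress store you choose):

```python
import threading
import time

progress = {"sent": 0, "total": 0, "done": False}  # shared job state

def send_all(emails):
    """Background worker: 'send' each email and record progress."""
    progress["total"] = len(emails)
    for addr in emails:
        # A real implementation would make the SES API call here.
        time.sleep(0.001)  # simulate per-email API latency
        progress["sent"] += 1
    progress["done"] = True

def start_job(emails):
    """Return to the caller immediately; work continues in a thread."""
    t = threading.Thread(target=send_all, args=(emails,), daemon=True)
    t.start()
    return t

# Usage: the HTTP handler calls start_job() and responds right away;
# a separate "progress" endpoint can read `progress` at any time.
```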
Use a cronjob that sends to the SES SMTP server. This way you can personalize the emails and also control how many emails to send. Your cronjob can sleep in between each batch of emails.
You can use Celery to run a background job. A user submits a request on a webpage, which starts a background job through Celery. The background job takes care of sending the emails. Once sending is complete, inform the user by email.
http://www.celeryproject.org/

Using `mail` to locally send messages to yourself

I have a daemon that executes some commands every two days. When it encounters an error, I want it to notify me. I know that bash can look at /var/mail/user and tell me if there are any new messages in that file. But I have never used local messages before. I think I have to use the mail command to do that.
However, when I try to look up information about the usage of the command, I only find material about sending actual emails, not local messages to users. So how would I send a message to myself, so that when I execute mail I'd get You have new mail?
If all you need is to be notified by your automated task, you could use something other than mail. Chatbots, for instance.
I personally use Nimrod: https://www.nimrod-messenger.io
It's messenger only but other chat systems are planned.
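Staying with the local-mail route from the question: the usual one-liner is `echo "body" | mail -s "subject" "$USER"`, which delivers to the local spool. From a Python daemon you can instead append directly to an mbox file with the standard `mailbox` module; this is a sketch, and the mbox path (typically /var/mail/$USER) plus the From/To addresses are assumptions:

```python
import mailbox
from email.message import EmailMessage

def notify(mbox_path, subject, body):
    """Append a message to a local mbox so `mail` reports new mail."""
    msg = EmailMessage()
    msg["From"] = "daemon@localhost"   # illustrative sender
    msg["To"] = "me@localhost"         # illustrative recipient
    msg["Subject"] = subject
    msg.set_content(body)
    mb = mailbox.mbox(mbox_path)       # e.g. /var/mail/$USER (assumption)
    try:
        mb.lock()                      # avoid racing other mail deliveries
        mb.add(msg)
        mb.flush()
    finally:
        mb.unlock()
        mb.close()
```

Note that writing to /var/mail may require appropriate group permissions; piping through the `mail` command sidesteps that.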

Running a service when email arrives

I'm creating an application that downloads an attachment from an email that's automatically generated at midnight every night. The attachment is downloaded into a directory and from there is then parsed into a database.
The main problem with this is that the email generation takes time, so the actual delivery time is unknown. Instead of having a program running the entire time, waiting for it to arrive, it would be far more elegant to automatically run the service that downloads the attachment when the email drops into the inbox, much in the same way a FileSystemWatcher works on local directories.
The email server runs Exchange 2003.
Is there a way, programmatically or otherwise, to cause a service to run on receiving a new email?
Exchange has the ability to run event sinks when certain things occur on the server.
http://support.microsoft.com/kb/324021 shows an example of doing this to create a catchall email address.
Details of the Exchange 2003 SDK are at: http://msdn.microsoft.com/en-us/library/ms986138(v=exchg.65).aspx

Ncqrs: How to store events as part of test set-up

How do I store events as part of setting up my tests?
Currently I'm initializing application state by sending commands like this:
Given some commands were sent
When sending another command
Then some events should have been published
I'm using ICommandService.Execute() to send the Commands in the Given and When parts.
Since commands can be rejected by the domain, I wouldn't want to rely on them. I'd rather set up my application state by simulating events like this:
Given _some events_ occurred
When sending a command
Then some events should have been published
How do I push the events from Given into the event store so that they can be replayed during handling the "When" part?
Thanks
Dennis
I have been given the answer on the mailing list and will add it here for further reference:
I was using an old version of Ncqrs. The current version exposes Ncqrs.Eventing.Storage.IEventStore.Store(), which takes an event stream and can be used during test set-up just as needed.