I have a flow with a step in between that waits for incoming mails.
I would like to stop this process after 30 minutes and, after stopping it, send a mail.
Do you have an idea? See also the attachment.
Regards
Fischer
I would like to implement a timeout event in Quarkus, and I am looking for the best way to do that.
Problem summary:
I have a process that waits for an answer from a REST service.
If the service is called, I go on to the next process.
If the service isn't called before the delay expires, I must not validate the process, and I go on to the next process.
So I'm thinking of using the Quarkus event bus with a delayed message. If the message is sent, I close the process and go on to the next one. If the client answers before the delay, the message must never be sent (how can I do that?). A sketch of what I'm considering is below.
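Roughly, instead of a delayed event-bus message, a plain Vert.x timer that can be cancelled, using the Vert.x instance Quarkus manages. This is only a sketch; the process-handling methods are placeholders, and it assumes the quarkus-vertx extension with Jakarta namespaces:

import io.vertx.core.Vertx;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

@ApplicationScoped
public class ProcessTimeout {

    @Inject
    Vertx vertx;

    // one pending timer per waiting process
    private final Map<String, Long> timers = new ConcurrentHashMap<>();

    // start the clock when the process begins waiting for the REST call
    public void startTimeout(String processId, long delayMs) {
        long timerId = vertx.setTimer(delayMs, id -> {
            timers.remove(processId);
            closeWithoutValidation(processId); // placeholder: timeout path
        });
        timers.put(processId, timerId);
    }

    // called when the REST service answers in time
    public void onAnswer(String processId) {
        Long timerId = timers.remove(processId);
        if (timerId != null) {
            vertx.cancelTimer(timerId); // the delayed "message" never fires
        }
        validate(processId); // placeholder: normal path
    }

    private void closeWithoutValidation(String processId) { /* close, do not validate */ }

    private void validate(String processId) { /* validate and continue */ }
}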
Thank you
I want to inform the seller via mail that the buyer is coming soon (about 2 hours before pickup time).
I would normally do it the hard way with CRON and a database table: checking hourly whether there is an order whose pickup time is two hours away, and only then sending the mail out.
Now, I would like to know whether you would recommend using queued jobs for sending the mails out.
With
$when = now()->addDays(10); // I would dynamically set the date
Mail::to($order->seller())
->later($when, new BuyerIsComing($order));
I can delay the delivery of a queued email message.
But how safe would this be? Especially if someone orders something but picks it up in, let us exaggerate, two months?
Is the Laravel queueing system robust enough to behave correctly after long delays (e.g. 2 months)?
Edit
I'm using Redis for queueing.
You actually have nothing to worry about. Sending mail usually increases the response time of your application, so it's a good thing you want to delay the sending.
Queues are the way to go, and they're pretty easy to set up in Laravel, which supports a couple of drivers out of the box. I would advise you to start with the database driver and then try Beanstalkd etc.
Lastly, and somewhat more importantly, use a process manager like Supervisor to monitor and maintain your queue workers; see the sketch below.
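For example, a minimal Supervisor program entry for a Laravel queue worker might look like this (the paths, connection, and process count here are placeholders):

[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/app/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
numprocs=2
redirect_stderr=true
stdout_logfile=/var/www/app/storage/logs/worker.log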
Take a look at https://laravel.com/docs/5.7/queues for more insight.
Cheers.
If by safe you mean reliable, then it would be little different from sending an email immediately. If there's ever a possibility that your server "hiccups" and doesn't send an email, that possibility is the same now as it is 10 minutes from now. Once the job is in the queue, it is persisted until completion (unless you use a memory-based driver like Redis, which could get reset if the server reboots).
If you are using a database queue driver, or a remote one, the log of queued jobs will remain even if the server is unavailable for a short period of time. Your queue will be honored even if the exact timestamp for when you wanted the job to run has passed. For instance, if you schedule an email for 1:00pm but your server is down at that exact moment, it will still see the job when it comes back online: the job is stored as incomplete and its time is in the past, which triggers its execution the next time your queue worker checks the job list.
Of course, this assumes that you have your queue worker set up to always check jobs and automatically restart, even after a server failure, but that's a different discussion with lots of solutions...such as those shown here.
If you're using the database driver with Laravel queues to process your email, then you don't need to worry about anything.
Jobs are only removed from the jobs table if they complete successfully; otherwise their next attempt time is set a few minutes in the future and they are executed again (provided your queue worker is online).
So it's completely safe to use Laravel queues. A minimal database-driver setup is sketched below.
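For reference, the commands for a stock Laravel app:

php artisan queue:table                      # generate the migration for the jobs table
php artisan migrate                          # create the jobs table
php artisan queue:work database --tries=3    # run a worker against it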
I have a SaaS application where each paying customer may have thousands of members they may want to send emails to every now and then.
For now, simple BCC sending via AWS SES has done the trick, but now I am looking at sending personalized emails, so I must be able to send the emails one by one.
SES does not have any queue system as far as I know; you must make one API call per email. In short, it takes forever to send a batch (my limit is 14 per second), and the user cannot close the page while it is executing (even AJAX calls stop executing if you leave the page, am I right?).
I was thinking of building a system where I store the emails in a database table and then either:
1) Use a CRON job that executes every 5 seconds or so, grabs a few emails, and sends them.
2) Execute an AJAX script every 5 seconds that grabs the emails for the logged-in customer ONLY, in a batch, and sends them out; but again, if the customer logs out while it executes, chances are that specific batch is interrupted (the remaining ones would still be sent the next time the customer logs in).
Does anyone have any better ideas? Or which of the two above would be preferred?
You should use templates and the SendBulkTemplatedEmail endpoint that AWS introduced a few months ago: https://aws.amazon.com/blogs/ses/introducing-email-templates-and-bulk-sending/.
That way you can send up to 50 personalized emails with a single SES API call, so 700 per second with your 14 calls.
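For illustration, a minimal sketch with the AWS SDK for Java v2; the template name, addresses, and template fields here are placeholders:

import software.amazon.awssdk.services.ses.SesClient;
import software.amazon.awssdk.services.ses.model.BulkEmailDestination;
import software.amazon.awssdk.services.ses.model.Destination;
import software.amazon.awssdk.services.ses.model.SendBulkTemplatedEmailRequest;
import software.amazon.awssdk.services.ses.model.SendBulkTemplatedEmailResponse;

public class BulkSend {
    public static void main(String[] args) {
        try (SesClient ses = SesClient.create()) {
            // up to 50 destinations per call, each with its own template data
            SendBulkTemplatedEmailRequest request = SendBulkTemplatedEmailRequest.builder()
                .source("noreply@example.com")
                .template("member-newsletter")                 // placeholder template name
                .defaultTemplateData("{\"name\":\"member\"}")
                .destinations(
                    BulkEmailDestination.builder()
                        .destination(Destination.builder().toAddresses("alice@example.com").build())
                        .replacementTemplateData("{\"name\":\"Alice\"}")
                        .build(),
                    BulkEmailDestination.builder()
                        .destination(Destination.builder().toAddresses("bob@example.com").build())
                        .replacementTemplateData("{\"name\":\"Bob\"}")
                        .build())
                .build();
            SendBulkTemplatedEmailResponse response = ses.sendBulkTemplatedEmail(request);
            System.out.println(response.status()); // per-destination send status
        }
    }
}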
You shouldn't queue them up in a user's browser and send them by making a series of AJAX requests, though. You should only send one AJAX request to start a job. In most server-side languages (any I can think of) you can respond to an HTTP request and still continue processing after responding. You can also implement a progress checker in a multitude of ways.
Use a cronjob that sends through the SES SMTP interface. This way you can personalize the emails and also control how many emails to send; your cronjob can sleep in between each batch of emails. A sketch is below.
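A rough sketch of what that job could do, here with Jakarta Mail against the SES SMTP interface; the host, credentials, and addresses are placeholders:

import jakarta.mail.Authenticator;
import jakarta.mail.Message;
import jakarta.mail.PasswordAuthentication;
import jakarta.mail.Session;
import jakarta.mail.Transport;
import jakarta.mail.internet.MimeMessage;
import java.util.Properties;

public class SmtpBatch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("mail.smtp.host", "email-smtp.us-east-1.amazonaws.com"); // placeholder region
        props.put("mail.smtp.port", "587");
        props.put("mail.smtp.auth", "true");
        props.put("mail.smtp.starttls.enable", "true");
        Session session = Session.getInstance(props, new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("SMTP_USER", "SMTP_PASS"); // placeholders
            }
        });
        String[] recipients = {"alice@example.com", "bob@example.com"}; // loaded from your table
        for (int i = 0; i < recipients.length; i++) {
            MimeMessage msg = new MimeMessage(session);
            msg.setFrom("noreply@example.com");
            msg.setRecipients(Message.RecipientType.TO, recipients[i]);
            msg.setSubject("Your personalized mail");
            msg.setText("Hello " + recipients[i]); // personalize per recipient
            Transport.send(msg);
            if ((i + 1) % 14 == 0) {
                Thread.sleep(1000); // stay under the 14-per-second limit
            }
        }
    }
}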
You can use Celery to run a background job: a user submits a request on a webpage, which starts a background job through Celery, and the background job takes care of sending the emails. Once sending is complete, inform the user by email.
http://www.celeryproject.org/
I have a program that shuts down another application when certain conditions are met. Most of the time everything works out fine, but sometimes the app is writing to a file, leaves it in a half-finished state, and has no way of recovering it on restart. I thought one could send soft close signals and escalate after certain timeouts to more aggressive ones, going through a list like this:
1. WM_CLOSE
2. WM_QUIT
3. WM_DESTROY
4. TerminateProcess().
Now, I know that the program has no code to handle any signal it receives. Is there a possibility that certain file handlers under Windows react gracefully to such soft signals, or is there no use in sending them if the app does not handle them explicitly?
This article says:
NOTE: A console application's response to WM_CLOSE depends on whether or not it has installed a control handler.
Does this mean that if no control handler is installed, sending 1-4 is just as good as sending 4 directly?
Please, a bit of advice on the following:
I am using a ThreadPoolTaskExecutor to execute slow external tasks like sending emails.
I need to improve this:
1) When a task is passed to the executor, it has to wait to execute until at least the transaction of the passing operation has finished.
Example: it makes no sense to email something when the order process fails and generates an exception on commit.
2) When the task fails, some retry mechanism should be used to try the task again.
Example: when sending the email fails, it is retried after 5 and 10 minutes, and then an exception is thrown.
How should I deal with these issues? Or should I just integrate some queue that offers this functionality? For (1), a sketch of what I have in mind is below.
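A minimal sketch for (1), assuming Spring's transaction synchronization API (Spring 5.3+ for the interface's default methods; the task and executor here are placeholders):

import java.util.concurrent.Executor;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;

public class AfterCommitDispatch {

    // hand the task to the executor only once the surrounding transaction commits
    public static void executeAfterCommit(Executor executor, Runnable task) {
        if (TransactionSynchronizationManager.isSynchronizationActive()) {
            TransactionSynchronizationManager.registerSynchronization(
                new TransactionSynchronization() {
                    @Override
                    public void afterCommit() {
                        executor.execute(task); // never runs if the transaction rolls back
                    }
                });
        } else {
            executor.execute(task); // no active transaction: run immediately
        }
    }
}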
Ed
I would say: yes, use a queue in a messaging infrastructure.
Personally, I would use Camel for this because I am completely smitten by Camel and would use it even if I were reprogramming my toaster to toast the slices golden brown at breakfast.
Since you are going to mail, it will be message-based anyway, so using a message-based system will already reduce the impedance mismatch.
Now, things such as transactions, retries, and parking the message on a dead letter queue come as standard with these tools. This is nice, because you can then script your way out of trouble when an email server disaster hits, by resubmitting the messages from the dead letter queue. A sketch of such a route is below.
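For illustration, a minimal Camel route with redelivery and a dead letter queue; the endpoint URIs and delays here are placeholders (assuming camel-jms/ActiveMQ and camel-mail are on the classpath):

import org.apache.camel.builder.RouteBuilder;

public class MailRoute extends RouteBuilder {
    @Override
    public void configure() {
        // failed exchanges are retried twice, then parked on a dead letter queue
        errorHandler(deadLetterChannel("activemq:queue:mail.dlq")
            .maximumRedeliveries(2)
            .redeliveryDelay(5 * 60 * 1000L)); // wait 5 minutes between attempts

        from("activemq:queue:outgoing.mail")
            .to("smtp://mail.example.com?username=app&password=secret");
    }
}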
Integrating ActiveMQ or Camel is just adding a couple of dependencies and 5-10 lines in your Spring configuration.
Once it is in there, it is beautiful for organizing background processing, notifying remote systems, automating email responses, notifying sysadmins of impending doom... You send a message, continue what you're doing, respond to the customer, while in the background the wheels are turning.
Ok, sorry: I got carried away and got way too lyrical.