Laravel bulk mail queue (divided by quantity and time)

I have a scheduling system that sends emails with the respective calendars of each system member.
My mailing list has grown significantly (more than 800 recipients), and my email provider is imposing some kind of restriction, something like SMTP tarpitting.
I think I could take all these recipients, split them up, and send them in small batches, i.e., I could use Mail::queue().
The point is:
Is there any way to add jobs to the queue at intervals (for example, every 10 minutes), so that new jobs are always appended to the end of the queue, even if a new mailing batch comes in?
The idea would be (I don't know if it's the best solution) to take the total amount, say 800, and divide it by 150, which would give about 5 iterations; in each iteration, send 25 emails every 10 minutes (25 × (60/10) × 5 == 750).
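For illustration, that chunk-and-delay arithmetic could look like the sketch below (Python just to show the shape of the idea; send_batch is a hypothetical stand-in for dispatching a delayed Mail::queue job):

    BATCH_SIZE = 25        # emails per batch
    INTERVAL_MIN = 10      # minutes between batches

    def schedule(recipients):
        # split the full list into batches of 25
        batches = [recipients[i:i + BATCH_SIZE]
                   for i in range(0, len(recipients), BATCH_SIZE)]
        for n, batch in enumerate(batches):
            # each batch goes out 10 minutes after the previous one;
            # send_batch() is a hypothetical delayed-dispatch helper
            send_batch(batch, delay_minutes=n * INTERVAL_MIN)

    # 800 recipients -> 32 batches of 25, spread over roughly 5.3 hours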

You can push all the mails onto your queue and then configure the queue to process a specific amount in a given time window (you need Redis for this): https://laravel.com/docs/master/queues#rate-limiting
That way, you can focus on what you are doing and less on how you are doing it 😉
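Conceptually, the rate limiting described in those docs boils down to a counter with an expiry. A minimal sketch of that idea using redis-py (the key name and limits here are arbitrary examples, not Laravel's actual implementation):

    import redis

    r = redis.Redis()
    LIMIT = 25      # jobs allowed per window
    WINDOW = 600    # window length in seconds (10 minutes)

    def try_acquire(key="mail-queue"):
        count = r.incr(key)          # atomic increment
        if count == 1:
            r.expire(key, WINDOW)    # first hit opens a fresh window
        # over the limit: the worker should release the job back onto
        # the queue and retry once the window expires
        return count <= LIMIT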

Related

Fetch third-party data at a periodic interval

I have an application with 10M users. The application has access to the users' Google Health data. I want to periodically read/refresh users' data using the Google APIs.
The challenge I'm facing is the memory-intensive nature of the task. Since Google does not provide any callback for new data, I'll be doing a background sync (every 30 mins). All users would be picked and added to a queue, which would then be drained sequentially (depending on the number of worker nodes).
Now, with 10M users being refreshed every 30 mins, I need a lot of worker nodes.
Each user request takes around 1 sec, including network calls.
In 30 mins (1800 s), one node can process 1800 users.
To process 10M users, I need 10M / 1800 ≈ 5.5K nodes.
Quite expensive, both monetarily and operationally.
Then I thought of using Lambdas, which are comparatively very cheap. However, Lambda requires a NAT with an internet gateway to access the public internet.
I want to understand: is there any other possible solution at this scale?
Without knowing more about your architecture and the Google APIs, it is difficult to make a recommendation.
Firstly, I would see if Google offers bulk export functionality, and then batch up the user requests. Instead of making one request per user, you could make, say, one request for 100k users (see the sketch below). This would reduce the overhead associated with connecting and with processing/parsing the message metadata.
Secondly, I'd look to see if I could reduce the processing time. For example, an interpreted language like Python is in many cases much slower than a compiled language like C# or Go. Or maybe a library or algorithm can be replaced with something more optimal.
Without more details of your specific setup, it's hard to offer more specific advice.
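To sketch that first point: assuming some bulk endpoint existed (bulk_fetch and store below are hypothetical; Google may not actually offer this), the batching itself is just chunking the ID list:

    BATCH = 100_000

    def refresh_all(user_ids):
        # one request per 100k users instead of one per user
        for i in range(0, len(user_ids), BATCH):
            chunk = user_ids[i:i + BATCH]
            results = bulk_fetch(chunk)   # hypothetical bulk API call
            store(results)                # hypothetical persistence step

    # 10M users -> 100 requests per sync instead of 10,000,000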

Discord.py - Send many changes to the API

Situation
Let's say there are 200 people in the voice channel "Lobby".
We want to start our game night, so we write !start.
The bot then has to randomly put those people into rooms with a max size of 10:
The bot has to create 20 channels.
The bot has to randomly move the 200 people into the 20 channels.
Easy so far.
Issue
If we just fire those 220 requests at the API at the same time, it will not respond to every request.
If we put a 1-second sleep between every request, it will take 220 seconds longer.
Idea
My idea would be to create 10 additional bots, each of which then executes 22 requests. That way everything should be done after 22+x seconds.
Would that be within discords terms of service?
Is there a better way?
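For reference, the flow described above maps onto a short discord.py sketch (channel names and token are placeholders). Note that discord.py already queues outgoing HTTP calls and waits out Discord's rate-limit headers itself, so sequential requests are paced automatically:

    import random
    import discord
    from discord.ext import commands

    intents = discord.Intents.default()
    intents.members = True  # privileged intent, needed to move members
    bot = commands.Bot(command_prefix="!", intents=intents)

    @bot.command()
    async def start(ctx, room_size: int = 10):
        lobby = discord.utils.get(ctx.guild.voice_channels, name="Lobby")
        members = list(lobby.members)
        random.shuffle(members)
        # split the shuffled members into rooms of room_size
        chunks = [members[i:i + room_size]
                  for i in range(0, len(members), room_size)]
        for n, chunk in enumerate(chunks, start=1):
            room = await ctx.guild.create_voice_channel(f"Room {n}")
            for member in chunk:
                await member.move_to(room)  # one PATCH request per member

    bot.run("YOUR_BOT_TOKEN")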

Performance concerns about monitoring incoming email for multiple accounts on the server

I want to write a server-side script/daemon that monitors multiple email accounts (the number might become quite large) and then sends push notifications. My conceptual idea so far: keep a database of accounts and passwords, iterate through it, check whether any new messages have arrived, and then react by doing something with the email and sending a push notification to the client's mobile device. My biggest concern is performance. Looping through thousands of accounts doesn't seem right to me, but I can't come up with a better solution. Registering an observer for each account doesn't sound any better.
Any ideas? I'm open to any language (scripting or programming). Not asking for code, just trying to wrap my head around the concept.
Thank you!
You could do it in blocks. Going one by one through all your database entries may take a long time if we are talking about thousands of accounts, so maybe you could divide the work across several scripts or script executions, taking, let's say, blocks of 100 accounts. You would have an environment like this: script/thread 1 checking accounts 1 to 100, script/thread 2 checking accounts 101 to 200, and so on.
This could be done with threads in the same script/program, by using different scripts, or by using a "wrapper" to call the script as many times as needed, depending on the number of entries/blocks. You may need to keep an eye on your server's resources, but the performance of the checks should improve.
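A minimal sketch of that block layout, assuming the accounts are plain IMAP mailboxes (notify_client is a hypothetical stand-in for the push-notification call):

    import imaplib
    from concurrent.futures import ThreadPoolExecutor

    BLOCK_SIZE = 100

    def check_account(account):
        """Return the IDs of unseen messages for one account."""
        conn = imaplib.IMAP4_SSL(account["host"])
        conn.login(account["user"], account["password"])
        conn.select("INBOX", readonly=True)
        _, data = conn.search(None, "UNSEEN")
        conn.logout()
        return data[0].split()

    def check_block(accounts):
        # "script/thread N checking accounts X to Y"
        for account in accounts:
            unseen = check_account(account)
            if unseen:
                notify_client(account["user"], len(unseen))  # hypothetical

    def run(all_accounts):
        blocks = [all_accounts[i:i + BLOCK_SIZE]
                  for i in range(0, len(all_accounts), BLOCK_SIZE)]
        with ThreadPoolExecutor(max_workers=len(blocks)) as pool:
            list(pool.map(check_block, blocks))  # one worker per block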
Hope this helps.

Pricing: Are push notifications really free?

According to the parse.com pricing page, push notifications are free up to 1 million unique recipients.
API calls are free up to 30 requests / second.
I want to make sure there is no catch here.
An example will clarify: I have 100K subscribed users. I will send weekly push notifications to them. In a month, that will be 4 push "blasts" with 100K recipients each. Is this covered by the free tier? Would this count as 4 API calls, 400K API calls, or some other amount?
100k users is 1/10 the advertised unique recipient limit, so that should be okay.
Remember that there's a 10-second timeout, too. So the only way to blast 100k pushes within the free-tier resource limits is to create a scheduled job that spends about two hours doing pushes (at a safe rate of 15 req/sec, 100k requests take ≈ 6,700 seconds) and writes state so you can pick up later where you left off.
Assuming there's no hidden gotcha (you'll probably need to discover those empirically), I think the only gotcha in plain sight is the fact that the free tier allows only one (1) scheduled job. Any other long-running processing -- and there is bound to be some with 100k users -- will have to share that job, making the what-should-this-single-job-work-on-now logic pretty complex.
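Parse scheduled jobs actually run as Cloud Code (JavaScript); the sketch below is only a language-neutral illustration of the pace-and-checkpoint idea, with send_push, load_checkpoint, and save_checkpoint as hypothetical stand-ins:

    import time

    RATE = 15            # req/sec, safely under the 30 req/sec cap
    TIME_BUDGET = 840    # seconds per run; an arbitrary placeholder

    def run_job(recipients):
        start = time.monotonic()
        i = load_checkpoint()        # hypothetical: where the last run stopped
        while i < len(recipients):
            if time.monotonic() - start > TIME_BUDGET:
                break                 # out of time; resume on the next run
            send_push(recipients[i])  # hypothetical single-recipient push
            i += 1
            time.sleep(1.0 / RATE)    # hold the pace at ~15 req/sec
        save_checkpoint(i)            # hypothetical: persist progress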
You should take a look at the FAQ for Parse.com:
https://www.parse.com/plans/faq
What is considered an API request?
Anytime you make a network call to Parse on behalf of your app using one of the Parse SDKs or REST API, it counts as an API request. This does include things like queries, saves, logins, amongst other kinds of requests. It also includes requests to send push notifications, although this is seen as a single request regardless of how many recipients are targeted. Serving Parse files counts as an API request, including static assets served from Parse Hosting. Analytics requests do have a special exemption: you can send us your analytics events any time without being limited by your app's request limit.

Using Twilio SMS API, can I specify multiple destination phones in one post?

Twilio limits long code SMS to 1 message per second. To improve my throughput, I split my batch across 5 phone numbers. I've found that each HTTP POST to the Twilio API takes about 0.5 seconds.
One would think using 5 Twilio phone numbers to send a message to 1000 cell phones would take 200 seconds, but it will take 500 seconds just to POST the requests. So two phone numbers will double my throughput, but more would not make a difference, because the POSTs themselves become the bottleneck.
Am I missing something? I was thinking it would be nice if the API took a list of phone numbers for the "To" parameter. I don't want to pay for a short code, but even if I did, it seems the maximum throughput is 2/sec unless you resort to the complexity of having multiple threads feeding Twilio.
I've noticed that TwiML during a call lets you include multiple <Sms> nodes when constructing a response, so it seems like there should be a way to do the same for outbound SMS.
Twilio Evangelist here. At the moment, we require that you submit each outgoing SMS message as its own API request.
The current rate limit on a long code is 1 message per second. If more messages per second are sent, Twilio queues them up and sends them out at a rate of 1 per second.
A potential workaround is to make asynchronous requests across multiple phone numbers (sketched below). This can be accomplished with the twilio node.js module, an evented framework such as EventMachine for Ruby, or a similar toolset for your language of choice.
Hope this helps!
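In Python terms, that asynchronous workaround might look like the thread-pool sketch below, using the Twilio Python helper library (SIDs and phone numbers are placeholders):

    from concurrent.futures import ThreadPoolExecutor
    from itertools import cycle
    from twilio.rest import Client

    client = Client("ACCOUNT_SID", "AUTH_TOKEN")
    senders = cycle(["+15550000001", "+15550000002"])  # your Twilio numbers

    def send(pair):
        to, from_ = pair
        return client.messages.create(to=to, from_=from_, body="Hello!")

    recipients = ["+15551230001", "+15551230002"]  # ... up to 1000
    pairs = [(to, next(senders)) for to in recipients]

    # overlapping the POSTs removes the 0.5 s-per-request serialization,
    # so throughput is no longer capped at ~2 messages per second
    with ThreadPoolExecutor(max_workers=10) as pool:
        results = list(pool.map(send, pairs))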
Here's a more modern answer: Twilio now supports Messaging Services. It basically lets you create a service that groups multiple outbound phone numbers together. So when you fire off requests for a text to be sent, it can use ALL the numbers in the pool to perform the sending. This effectively overcomes the 1-text-per-second limit.
Messaging Services also come with Copilot, which adds several features such as "sticky sender". This ensures the same end user always gets texts from the same number in the pool instead of getting texts from different numbers.
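With a Messaging Service, you pass the service's SID instead of a from_ number and Twilio picks a sender from the pool. A minimal example with the Python helper library (the SIDs are placeholders):

    from twilio.rest import Client

    client = Client("ACCOUNT_SID", "AUTH_TOKEN")

    # no from_ number: Twilio chooses one from the service's pool
    message = client.messages.create(
        messaging_service_sid="MGxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
        to="+15551230001",
        body="Hello from the number pool!",
    )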
If you are using a trial account, even looping with a 5-second timeout between each item in the array did not work for me, and that was for just two numbers. Once I upgraded the account, the code worked immediately, without needing a timeout.
You know it's the trial account if the SMS you receive (when sending to only one number) says "Sent from your Twilio trial account - ".
