Fetching a large webhook payload in an optimal way - Laravel

So I am receiving a webhook from SendGrid that comprises 10,000 records. At my endpoint, I am using a RabbitMQ queue to process the webhook further. The issue I am facing is in the MongoDB update process, which is experiencing too much load. I've searched around and didn't find anything on this issue. What I was thinking of doing is sending the data to MongoDB in chunks after consuming the message; or is there any way I can receive the webhook in chunks before pushing it to my queue?
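For the first idea (chunking the write after the message is consumed), here is a minimal sketch using the official mongodb/mongodb PHP library; the collection and field names are assumptions for illustration, not taken from the original setup:

    <?php
    // Process the consumed webhook payload in chunks instead of one
    // huge MongoDB write. Collection/field names are hypothetical.
    require 'vendor/autoload.php';

    function storeWebhookRecords(array $records): void
    {
        $collection = (new MongoDB\Client())->sendgrid->events;

        // 500 records per bulkWrite keeps each write small; tune to taste.
        foreach (array_chunk($records, 500) as $chunk) {
            $operations = [];
            foreach ($chunk as $record) {
                $operations[] = [
                    'updateOne' => [
                        ['sg_event_id' => $record['sg_event_id']], // filter
                        ['$set' => $record],                       // update
                        ['upsert' => true],
                    ],
                ];
            }
            $collection->bulkWrite($operations, ['ordered' => false]);
        }
    }

Unordered bulk writes also let MongoDB apply a chunk's operations without stopping at the first failure, which helps when a few records in a large batch are malformed.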

Related

Handle a large dataset in a RESTful API without pagination

I need to send more than 2 million records from a REST API connected to an Oracle database. The response size could be more than 3 GB. I know this will hurt the API's performance and may cause out-of-memory errors.
I'm researching HTTP chunking and WebFlux options to avoid performance issues for both the API and the consumer, but I'm not sure they will resolve my issue.
I would like to know whether there are any approaches to handle large datasets like this, on the order of 2 or 3 million records. I want to understand the best approach for my problem.
I tried StreamingResponseBody, but the backend connection got closed because the request didn't complete in 5 minutes.
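The question is Spring-specific, but the underlying idea (stream rows as they are fetched instead of materializing a 3 GB response in memory) looks the same in any stack. A minimal plain-PHP sketch, with a hypothetical DSN, table, and columns:

    <?php
    // Stream rows as newline-delimited JSON so memory stays flat and
    // the client receives data continuously instead of waiting for
    // the full body. DSN/table/columns are hypothetical.
    $pdo = new PDO('oci:dbname=//db-host/ORCL', 'user', 'pass');

    header('Content-Type: application/x-ndjson');

    $stmt = $pdo->query('SELECT id, payload FROM records ORDER BY id');

    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        echo json_encode($row), "\n";
        flush(); // push each row out immediately
    }

Continuous output can also help with the 5-minute timeout mentioned above, since many proxies time out idle connections rather than long ones; if the timeout is a hard per-request limit, keyset pagination (repeated requests for rows after the last seen ID) is the usual fallback.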

Waiting for a Kafka consumer to complete its operations

Use case: I have a use case where the client sends a request to a microservice endpoint (the producer), which does some operations and produces a message to Kafka, to be consumed by a consumer that stores some data in its own database. Immediately after this, the client sends another request to the consumer microservice to get the latest updates (which should include the data stored previously).
But the problem is that the client sends the second request without waiting for the consumer to finish storing the data (for the first request).
Question: how should I handle this? Can I somehow wait for the consumer to finish storing the data?
What I tried: I added Thread.sleep to the producer endpoint, but I don't like that solution.
Thanks.
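One common workaround, offered here only as a sketch and not something from the post itself, is to replace the fixed sleep with bounded polling: the caller repeatedly checks the consumer's read endpoint until the data appears or it gives up. The /updates endpoint and requestId parameter below are hypothetical:

    <?php
    // Poll the consumer service until the data for this request shows
    // up, instead of sleeping a fixed amount. Endpoint and parameter
    // names are hypothetical.
    function waitForConsumer(string $requestId, int $maxAttempts = 20): ?array
    {
        for ($i = 0; $i < $maxAttempts; $i++) {
            $body = file_get_contents(
                'http://consumer-service/updates?requestId=' . urlencode($requestId)
            );
            $data = json_decode($body, true);
            if (!empty($data)) {
                return $data; // the consumer has stored the data
            }
            usleep(250000);   // back off 250 ms between attempts
        }
        return null;          // consumer never caught up; give up
    }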

Performance improvement ideas for saving a large number of objects to the database

My web application has a feature that allows the user to send messages to all of their friends. The number of friends can be 100K to 200K. The application uses Spring and Hibernate.
Sending entails fetching the friends' info, building the message objects, and saving them to the database. After all the messages are sent (actually, saved to the DB), a notification pops up showing how many messages were sent successfully, such as 99/100 sent or 100/100 sent.
However, when I was load testing this feature, it took an extremely long time to finish. I am trying to improve the performance. One approach I tried was to divide the friends into small batches and fetch/save each batch concurrently, waiting on all of them to finish. But that still didn't yield much improvement.
I was wondering if there are any other ways I can try. Another approach I can think of is to use WebSockets: send each batch, update the notification after each batch, and start the next batch until all the batches are sent. But how can the user still get the notification after he navigates away from the message page? The WebSocket logic on the client side has to live somewhere global, correct?
Thank you in advance.
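As a sketch of the batching idea described above (the post uses Spring/Hibernate; Laravel's query builder is used below only to show the shape, and all table and column names are made up):

    <?php
    // Fetch friend IDs in pages of 1000 and write one multi-row
    // INSERT per page, instead of one INSERT per message.
    use Illuminate\Support\Facades\DB;

    function sendToAllFriends(int $userId, string $text): int
    {
        $sent = 0;

        DB::table('friends')
            ->where('user_id', $userId)
            ->chunkById(1000, function ($friends) use ($text, &$sent) {
                $rows = [];
                foreach ($friends as $friend) {
                    $rows[] = [
                        'recipient_id' => $friend->friend_id,
                        'body'         => $text,
                    ];
                }
                DB::table('messages')->insert($rows); // one bulk INSERT
                $sent += count($rows);
            }, 'friend_id');

        return $sent; // drives the "N/N sent" notification
    }

In Hibernate the equivalent levers are hibernate.jdbc.batch_size plus periodic flush()/clear() on the session, so each batch goes out as one round trip instead of 100K individual statements.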

RabbitMQ keep messages in queue

I am streaming a tty's stdout and stderr (logs, to be exact) to RabbitMQ. These logs can be viewed on a website: while the content is streamed to RabbitMQ, it is consumed by the webserver and forwarded to the client using WebSockets. Logs are persisted immediately after being sent to RabbitMQ.
When the user accesses the website, the persisted logs are rendered and the subsequent parts are streamed using WebSockets. The problem is that there is a race condition: the persisted logs might be missing chunks of the log that occurred between rendering the site and receiving the first chunk via WebSocket.
My idea was to keep all chunks in the queue and send them via the WebSocket after connecting. Additionally, I would add a worker that listens for some kind of "finished" event and then takes everything in the queue and persists it at once.
The problem is that I don't know if this is possible with RabbitMQ, or how. Any ideas or other solutions?
I don't think it really matters, but my stack is Ruby Sinatra with the Bunny RabbitMQ client.
While I agree with your general idea about picking up where you left off after loading the initial page, what you're trying to do isn't something that should be done with RabbitMQ.
There are a lot of potential problems this would cause, which I've outlined in a blog post previously.
Instead of trying to do this with RMQ, I would do it from the database layer.
As you push things into the database, you have an ID - hopefully a sequential one. If not, add a sequence to the entries.
When you load the page for the user, send the current ID that they are at down to the browser.
After the page finishes loading and you're setting up the WebSocket connection, send the user's current spot in the list of messages via the WebSocket. The WebSocket connection can then use that ID to say "give me all the messages after this ID, and start streaming them".
Again, this is not done via RabbitMQ (see my article on why this is a bad idea), but via your database and sequential IDs.
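To make the resume protocol concrete, here is a minimal sketch (the question's stack is Ruby/Sinatra; PHP is used only to show the shape, and the table and column names are illustrative):

    <?php
    // Step 1: when rendering the page, embed the newest persisted log
    // ID as a JavaScript variable so the browser knows where it left off.
    // Step 2: the browser opens the WebSocket and sends that ID; the
    // server backfills every chunk written after it, then relays live
    // chunks as they arrive.
    function backfill(PDO $pdo, int $lastSeenId): array
    {
        $stmt = $pdo->prepare(
            'SELECT id, chunk FROM log_chunks WHERE id > :last ORDER BY id'
        );
        $stmt->execute(['last' => $lastSeenId]);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }

Because the query is keyed on a sequential ID, any chunks persisted between page render and socket connect fall into the id > :last range, which closes the race window described above.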

Can I view raw messages on the iron.io webpage?

I'm learning to use iron.io MQ push queues. I'm pushing some messages with the Laravel PHP framework and everything works. However, just to round out my knowledge, I would like to see the raw contents of these messages. In my iron.io account I can see the total number of messages sent, but I can't find a place to inspect individual messages and their contents. I'm wondering whether Laravel is sending some IDs or anything like that.
Laravel is using IronMQ push queues, and since pushed messages are delivered immediately, they don't stick around after successful delivery. However, you can create an error queue to inspect messages that couldn't be delivered successfully.
