I have a problem with rate limiting in a Laravel project whose users are all in China. The scenario is that an authenticated user manages to submit 10 POST requests within a second. How can I prevent this from happening?
I have tried using Laravel's throttle feature, but I am not sure whether it will work if I set it to milliseconds. Are there any suggestions on how I could counter this problem?
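From what I understand, the built-in throttle middleware measures its decay in minutes, so I was considering something like the RateLimiter facade with a one-second decay instead. This is only a rough sketch; the key name, the limit of 3, and store_the_post() are placeholders, not real code from my project:

```php
<?php

use Illuminate\Http\Request;
use Illuminate\Support\Facades\RateLimiter;
use Illuminate\Support\Facades\Route;

Route::post('/posts', function (Request $request) {
    // One bucket per authenticated user.
    $key = 'submit-post:'.$request->user()->id;

    $executed = RateLimiter::attempt(
        $key,
        3,                                   // max attempts allowed...
        fn () => store_the_post($request),   // ...before the callback stops running
        1                                    // decay window in seconds
    );

    if (! $executed) {
        return response('Too many requests', 429);
    }

    return response('Post stored', 201);
});
```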
For a Laravel application that uses Stripe, the request https://r.stripe.com/0 gets fired multiple times. These requests were fired right after I refreshed the homepage.
The problem is that I recently got a 429 Too Many Requests error on my live server. This is pretty rare, but I suspect it has something to do with Stripe, since I read about rate limiting in Stripe.
Since I am new to Stripe and this project was handed over to me midway, I have some questions:
Is there any relation between the 429 error and these Stripe requests? Or does this have something to do with the way the system is built?
If it's the former, how would one resolve the issue? Maybe disable the security checks (because I read somewhere that these requests are Stripe security checks), or maybe increase the rate limit on Stripe's side in some way?
I searched extensively but couldn't find any relevant posts anywhere, maybe because it's a very trivial problem. I would be grateful if someone who has faced this or knows about it could enlighten me.
I have some simple websites (not Laravel applications) with forms where people can enter their postal code and house number, and the street and city fields are automatically filled in with the associated information. To accomplish this I make an AJAX request to my Laravel application, which returns the associated street and city. My Laravel application then calls a third-party API, which costs me around €0.01 per request.
Now I want to prevent unwanted and unauthorized access to my Laravel API calls, because each call costs me money. At the moment it is very easy to replicate such calls, and someone with bad intentions could write a script that performs thousands of calls per minute.
So my question is how I can prevent unwanted and unauthorized API calls. I have already read about Sanctum and Passport, but from what I read these only apply to authenticated users. And using a token in the request header seems pointless, because anybody with a little knowledge can trace the token and use it.
Note that the people who fill in the forms can be random people and don't have an account.
There are probably many approaches. A simple but effective one would be sessions. You can save the user in a session and count their API accesses there. As soon as the count exceeds the allowed limit, you can block their requests, and you record the block in the session as well. But pay attention to the session duration: it must be long enough.
A user with bad intentions can simply start a new session, though. To counter this, you can also put their IP on an internal blacklist for a day.
Note: an open API is always a point of attack.
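A minimal sketch of this idea as Laravel middleware; the limit of 50 lookups, the one-day blacklist, and the response wording are assumptions, not requirements from the question:

```php
<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Cache;

class ThrottleLookups
{
    public function handle(Request $request, Closure $next)
    {
        // Reject IPs that were previously flagged (blacklisted for one day).
        if (Cache::has('lookup-blacklist:'.$request->ip())) {
            return response()->json(['message' => 'Too many requests'], 429);
        }

        // Count this session's lookups.
        $count = $request->session()->increment('lookup_count');

        if ($count > 50) {
            // Over the allowed number of lookups: block the IP as well.
            Cache::put('lookup-blacklist:'.$request->ip(), true, now()->addDay());

            return response()->json(['message' => 'Too many requests'], 429);
        }

        return $next($request);
    }
}
```

Register it on the lookup route only, so the rest of the application is unaffected.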
I have a web application (Angular frontend, Laravel backend API). In it there is a section with a messaging system. Basically, four user roles can post and read messages.
When a user is logged in, I need to check every 10 seconds whether there are new messages for any user of any of the four roles.
This works fine, but from time to time I get an HTTP 429 Too Many Requests status code. I have no idea what the reason is. Does anyone have an idea, or can someone point me in the right direction to fix this?
Note: I have a custom field system built, and I use it to hold extra data for messages. I notice that when I fetch messages, a considerable number of models related to custom fields are also queried. Could this be a reason?
I found the solution and here are my steps.
I needed to find all the requests going to the API, so I installed an API logger package. Using it I examined the requests and found that some requests were being made to the API over and over.
I corrected those requests, and the issue seems to be fixed.
When using the LUIS programmatic API, we get frequent 429 errors ("too many requests") when doing a half-dozen GET and POST requests. We've inserted a pause in our code to deal with this.
We have a paid subscription key for LUIS, which indicates we should get 50 requests/second (see https://azure.microsoft.com/en-us/pricing/details/cognitive-services/language-understanding-intelligent-services/). However, it seems the paid subscription key can only be used for hitting the application endpoint. For the Ocp-Apim-Subscription-Key request header, we must use our "programmatic key", which is associated with the Starter_Key, which is (apparently) rate-limited.
Am I missing something here? How do we get more throughput on the LUIS programmatic API?
One of our engineers arrived at the following answer, so I'm posting it here.
The programmatic API is limited to 5 requests per second, 100K requests per month for everyone. Your paid subscription key is only for the endpoint API, not for the programmatic API.
If you need more throughput:
Put your API requests in a queue. It is unlikely that you need to update your LUIS model non-stop, 5 times per second -- you probably just have a burst of updates. Put them in a queue and drain it at a steady pace to stay within the limit (see the sketch after this list).
Don't try using the same user account to manage multiple LUIS models. Set up additional accounts for each model, which gives you additional programmatic keys. Each programmatic key gives you 5 requests per second.
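For what it's worth, here is a rough sketch of the queue idea in PHP. The callable-based queue and the hypothetical $client/$modelUrl are just for illustration; only the 5-per-second figure comes from the limits above:

```php
<?php

// Drain pending programmatic-API requests at a fixed pace instead of
// firing them all at once.
function drainQueue(array $pendingRequests, int $maxPerSecond = 5): void
{
    foreach ($pendingRequests as $i => $sendRequest) {
        $sendRequest(); // each entry is a callable performing one API call

        if (($i + 1) % $maxPerSecond === 0) {
            sleep(1);   // this second's budget is used up, wait before continuing
        }
    }
}

// Usage (with a hypothetical HTTP client $client):
// drainQueue([
//     fn () => $client->post($modelUrl.'/examples', $payload1),
//     fn () => $client->post($modelUrl.'/examples', $payload2),
// ]);
```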
I'm building a web application where users can sign up, add Twitter feeds that they want to follow, and their stream will update as those feeds receive new posts.
My go-to platform is Laravel. However, I can't think of the best way to implement the live-update aspect of the site.
I would use an AJAX function that is called periodically (every 30 seconds, for example), but as the number of users increases this method has its drawbacks.
I've looked into HTML5 Server-Sent Events, but unfortunately IE doesn't support them.
What would be the best way to implement this functionality within a Laravel App?
Thanks,
Nick
You have two options:
Streaming (WebSockets)
Long polling
You can read more about WebSockets here:
https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API
And you can read more about long polling here:
https://www.quora.com/Why-would-HTTP-long-polling-be-used-instead-of-HTTP-Streaming-to-achieve-real-time-push-notifications
In short:
WebSockets run on a different port than your usual app, so accessing all your assets can be a bit awkward (depending on your system architecture).
Long polling is a very long HTTP request that can last up to several minutes. Instead of sending a request every 30 seconds, you send one each time the server returns a response. This means that if the server takes 5 minutes to return a response, you're only sending a request once every 5 minutes. (There's no reason to alert the client that nothing has changed at all, so the server can sleep(30) and check again.)
As a side note, unless you need true real-time data, I think long polling is much easier to implement and use with a framework such as Laravel.
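A rough sketch of what a long-polling endpoint could look like in a Laravel controller; the Post model, the 60-second timeout, and the 5-second sleep are assumptions for illustration:

```php
<?php

namespace App\Http\Controllers;

use App\Models\Post;
use Illuminate\Http\Request;

class FeedController extends Controller
{
    public function poll(Request $request)
    {
        $since   = $request->input('since');  // timestamp of the newest post the client has
        $timeout = now()->addSeconds(60);     // give up after one minute

        while (now()->lessThan($timeout)) {
            $posts = Post::where('created_at', '>', $since)->latest()->get();

            if ($posts->isNotEmpty()) {
                return response()->json($posts); // answer as soon as there is something new
            }

            sleep(5); // nothing new yet, wait and check again
        }

        return response()->json([]); // timed out; the client simply re-polls
    }
}
```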
You can use Pusher or Node.js for real-time updates. On Laracasts you will find a video on how to do that: https://laracasts.com/series/intermediate-laravel/episodes/4
I use Pusher for real-time data; it's extremely easy to use and there is a Laravel package for it.
https://pusher.com/
https://github.com/vinkla/pusher
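A minimal sketch of how the Laravel side might look with broadcasting; the event name, channel, and payload are made up for illustration, and you would still need to set your Pusher credentials in config/broadcasting.php:

```php
<?php

namespace App\Events;

use Illuminate\Broadcasting\Channel;
use Illuminate\Broadcasting\InteractsWithSockets;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;
use Illuminate\Foundation\Events\Dispatchable;
use Illuminate\Queue\SerializesModels;

class FeedUpdated implements ShouldBroadcast
{
    use Dispatchable, InteractsWithSockets, SerializesModels;

    public $post;

    public function __construct(array $post)
    {
        $this->post = $post;
    }

    // Clients subscribed to this channel receive the event in real time.
    public function broadcastOn()
    {
        return new Channel('feed');
    }
}

// Fired from anywhere in the app, e.g. after storing a new post:
// broadcast(new \App\Events\FeedUpdated($post->toArray()));
```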