Laravel: how to deal with 120 calls to external APIs? [closed]

I am trying to figure out how to handle 120 calls to a paid API.
1 - Should I store all the responses in the database and serve them from there according to the connected user?
2 - Should I store all the JSON responses in a folder and serve them according to the connected user?
I am confused about the best way to handle this.
When a user has a valid subscription, the calls to the external APIs will be made as a scheduled job.

What you can do is cache the response you get from the paid API.
$value = Cache::remember('cache-key', $timeToLiveInSeconds, function () {
    // send the request to the paid API and return the data
});
Check out the official docs: https://laravel.com/docs/9.x/cache#retrieving-items-from-the-cache
By default the cache driver is file; you can switch to redis or memcached if need be.
What you need to understand are the cache key and the time to live in seconds.
Cache key: this is the key Laravel uses to associate the cached data, so if the request depends on, say, the logged-in user, you can use the user id as part of the key.
Time to live in seconds: this tells Laravel how long the data should be cached. You have to know how often the paid API data changes so that you do not keep stale data around for too long.
Now, when you try to send a request, Laravel first checks whether the data exists in the cache; if it does, it checks whether the data has expired based on the time to live. It returns the cached data if it is still valid, or sends the request to the paid API and caches the fresh response if it is not.
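For example, if the response depends on the logged-in user, the user id can be part of the cache key. A minimal sketch, assuming a 15-minute TTL and a made-up endpoint and parameters:
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

// Cache the paid API response per user for 15 minutes (900 seconds).
$data = Cache::remember('paid-api:user:' . $user->id, 900, function () use ($user) {
    // Placeholder endpoint and parameter; replace with the real paid API call.
    return Http::get('https://paid-api.example.com/reports', [
        'account' => $user->account_id,
    ])->json();
});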

Related

Locking a concert ticket to prevent other users from booking it in Laravel [closed]

I am working on a ticketing system using Laravel. Are there any known techniques to prevent double bookings in a ticketing system?
Scenario
Ticket A has only 6 tickets available. User A comes in and adds 4 to their basket. I intend to make an API call to the database to deduct 4 from the available ticket count and then start a 10-minute timer within which User A must complete payment, or else the tickets will be added back to the database.
However, a big flaw in my method is this: if the user simply closes the window, I have no way of checking the elapsed time to re-add the tickets. Any ideas or other known techniques I can make use of?
I already took a look at this question but still run into the same issue/flaw
Locking while accessing the model will solve most of your worries; don't let core business logic be enforced on the front end.
Use a database transaction to ensure that only one request modifies the row at a time, and check that the requested number of tickets is available or else fail. This can produce database locks, which should be handled for a better user experience. Nothing is written to the database until the transaction completes without errors.
Throwing the exception cancels the operation and keeps it atomic.
use Illuminate\Support\Facades\DB;

$ticketsToBeBought = 4;

DB::transaction(function () use ($ticketsToBeBought) {
    // Lock the row so concurrent buyers wait for this transaction to finish.
    $ticket = Ticket::where('id', 'ticket_a')->lockForUpdate()->firstOrFail();

    $afterBuy = $ticket->tickets_available - $ticketsToBeBought;

    if ($afterBuy < 0) {
        // Rolling back here keeps the whole operation atomic.
        throw new NoMoreTicketsException();
    }

    $ticket->tickets_available = $afterBuy;
    $ticket->save();

    // create ticket models or similar for the sale
});
This is a fairly simple approach to a very complex problem that big companies normally have to tackle, but I hope it gets you going in the right direction; this is my approach to never overselling tickets.
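For the flaw in the question, where the user simply closes the window, a scheduled job can put expired holds back on sale. The sketch below is only an illustration of that idea, not part of the code above; the Reservation model, its columns, and the command name are assumptions:
use Illuminate\Support\Facades\DB;

// Scheduled every minute, e.g. $schedule->command('reservations:release')->everyMinute();
public function handle()
{
    DB::transaction(function () {
        // Grab pending holds whose 10-minute window has passed.
        $expired = Reservation::where('status', 'pending')
            ->where('expires_at', '<', now())
            ->lockForUpdate()
            ->get();

        foreach ($expired as $reservation) {
            // Put the held quantity back on the ticket and mark the hold as released.
            Ticket::whereKey($reservation->ticket_id)
                ->increment('tickets_available', $reservation->quantity);

            $reservation->update(['status' => 'released']);
        }
    });
}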

Errors while getting many images from S3 Laravel Storage [closed]

One of my API routes is associated with a Laravel controller that returns the URL of an image stored on AWS S3.
I have a function that looks like this:
public function getImage($params) {
//... $image is fetched from database
return Storage::disk('s3')->response("some_path/".$image->filename);
}
This code works fine when I'm requesting a few images, but when I use it inside a list that can be scrolled very quickly, some of the requests fail. What am I doing wrong?
Because you are quickly scrolling and populating your list, a lot of requests are being made to your server.
Laravel has a throttle middleware applied by default to your API routes to mitigate abuse.
In your case you are hitting the limits of the throttle, resulting in 429 error codes.
Your PHP code is correct; your front-end code should be less greedy in fetching images.
.... Or you could raise the allowed throttle amount in Laravel, or remove it altogether, but I wouldn't recommend that.
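If raising the limit is the route you take, the "api" rate limiter can be tuned in App\Providers\RouteServiceProvider. This is just a sketch assuming the default Laravel 8/9 layout; 300 requests per minute is an arbitrary example:
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Support\Facades\RateLimiter;

protected function configureRateLimiting()
{
    RateLimiter::for('api', function ($request) {
        // The default is 60 per minute; raise it so fast scrolling through
        // the image list does not trip the throttle.
        return Limit::perMinute(300)->by(optional($request->user())->id ?: $request->ip());
    });
}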

Best model for processing RSS feeds [closed]

I am creating a podcast website and I was wondering what would be the best way to keep a database up to date with the podcast RSS feeds, within 15 minutes of a change.
Currently I parse a feed on request and store it in a Redis cache for 15 minutes, but I would prefer to keep a database with all the data (feeds and all episodes).
Would it be better to "bake" the data by constantly hitting all the feeds every 15 minutes on a processing server, or to process each feed when it is requested?
If I were to update an RSS feed when requested, I would have to:
check database -> check if 15 mins old -> done || parse feed -> check for new episodes -> done || add to database -> done
where done = send data to user.
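In code, that flow would look roughly like this; get_feed_row(), parse_rss(), save_feed() and save_episodes() are placeholders for your own database and parsing code:
function feed_for_request($url)
{
    $feed = get_feed_row($url); // row from the database, or null

    $isStale = !$feed || (time() - strtotime($feed['fetched_at'])) > 15 * 60;

    if ($isStale) {
        $parsed = parse_rss($url);                       // fetch and parse the RSS feed
        $feed   = save_feed($url, $parsed);              // upsert the feed row, set fetched_at = now
        save_episodes($feed['id'], $parsed['episodes']); // insert episodes not stored yet
    }

    return $feed; // done = send data to user
}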
Any thoughts?
That's a way to do it. There are protocols like PubSubHubbub which can help you avoid polling "dumbly" every 15 minutes... You could also use Superfeedr and just wait for us to send you the data we find in the feeds.

Restricting randomly generated emails from registering on a site [closed]

I want to prevent users from registering on my site with randomly generated disposable email addresses, for example from Mailinator.com.
How can I block registrations that use those emails?
Notice that Mailinator has many different domain names. You should check where the A or MX records of the domain part resolve to in order to filter Mailinator effectively. Notice that doing so will also cause me not to use your service:
% host mailinator.com
mailinator.com has address 207.198.106.56
mailinator.com mail is handled by 10 mailinator.com.
% host suremail.info
suremail.info has address 207.198.106.56
suremail.info mail is handled by 10 suremail.info.
So effectively you'd want your blacklist to block on all of these (a sketch follows the list):
- the domain part of the address
- the A record of the domain
- the A record of the highest priority MX record of the domain
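A rough PHP sketch of those three checks; the blacklist entries below are just the examples from the host output above:
// Returns true when the email's domain, its A record, or the A record of
// its highest-priority MX host appears in the blacklist.
function is_blacklisted_email($email, array $blacklist)
{
    $domain = strtolower(substr(strrchr($email, '@'), 1));
    $candidates = array($domain);

    // A record of the domain itself.
    $a = gethostbyname($domain);
    if ($a !== $domain) {
        $candidates[] = $a;
    }

    // A record of the highest-priority MX host.
    $mx = dns_get_record($domain, DNS_MX);
    if (!empty($mx)) {
        usort($mx, function ($x, $y) { return $x['pri'] - $y['pri']; });
        $mxHost = strtolower($mx[0]['target']);
        $candidates[] = $mxHost;
        $candidates[] = gethostbyname($mxHost);
    }

    return count(array_intersect($candidates, $blacklist)) > 0;
}

$blacklist = array('mailinator.com', '207.198.106.56');
var_dump(is_blacklisted_email('someone@suremail.info', $blacklist)); // bool(true)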
There is one more way, but I am not sure whether it will work. There is a link to the phpBB blacklisted email list; you can add those addresses to a database table named blacklists
(according to the CakePHP model-name convention).
Then, in the signup function, compare the two emails:
$mailchk = $this->request->data['User']['email'];
$mailexists = $this->request->data['Blacklist']['email'];
Compare these two emails and, if they match, kick that user out.
This is the idea, but I am not sure whether it will work, because programming functions have their own limits.
You can use preg_match or FILTER_VALIDATE_EMAIL to compare/validate the data.

Session in "SQL*Net message from client", but is ACTIVE [closed]

I've Googled around, and my impression is that
SQL*Net message from client
suggests the Oracle DBMS is waiting for the client to send new commands, and therefore any time spent in this event should be client-side time and should not consume DB server CPU. In other words, normally, if a session is in this event, it should be "INACTIVE" rather than "ACTIVE".
What's puzzling to us is that, starting from this week (after we started using connection pools [we use DBCP]), we occasionally see sessions in the
SQL*Net message from client
event while showing "ACTIVE" at the same time, for extended periods of time. During all this time, CPU usage on the DB is high.
Can anyone shed some light on what this means? If the DB session is waiting for the client to send a message, why would it be "ACTIVE" and consuming CPU cycles?
If you see this event in the V$SESSION view, you need to check the value of the STATE column as well to determine whether the session is idle or is in fact working.
This is based on the following Oracle Magazine article:
you cannot look at the EVENT column alone to find out what the session
is waiting for. You must look at the STATE column first to determine
whether the session is waiting or working and then inspect the EVENT
column.
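For example, a minimal query against V$SESSION along those lines (adjust the WHERE clause to the sessions you care about):
SELECT sid,
       state,
       event,
       CASE
         WHEN state = 'WAITING'
           THEN 'Waiting on: ' || event            -- genuinely idle, waiting on the client
         ELSE 'Working on CPU (last wait: ' || event || ')'
       END AS what_it_is_doing
  FROM v$session
 WHERE username IS NOT NULL;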
