How to handle a PHP call for rapid API data updates to the database? - heroku

I am creating a web application with Laravel and Vue.js, and my app works with a third-party API that provides live data on the related market. I.e.:
When a user adds an order with a price, it is matched against the live market price before the order is confirmed.
I constantly update the market price feed. A cron job is used to do this, but the response time is more than 1123 ms on Laravel.
When using a direct PHP API call without the framework, the response time only improves slightly, e.g. 995 ms, before comparing the data in the local database.
Please suggest a better way to retrieve this continuously updated data. The application is still in development; I want to know the correct type of service needed and would like suggestions on what kind of server I should use. The application makes ~2,678,400 API requests monthly.
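For reference, a minimal sketch of the polling flow described above, assuming a recent Laravel version with the built-in HTTP client and cache: a scheduled command fetches the feed and caches the latest price, so order confirmation compares against the cached value instead of calling the remote API on every request. The feed URL, config keys, and cache key below are placeholders, not part of the original setup.

```php
<?php

// app/Console/Commands/RefreshMarketPrice.php (hypothetical)
// Poll the third-party feed on a schedule and cache the latest price locally,
// so order confirmation reads the cached value instead of calling the remote
// API on every request. Endpoint, config keys and cache key are placeholders.

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

class RefreshMarketPrice extends Command
{
    protected $signature = 'market:refresh-price';
    protected $description = 'Fetch the live market price and cache it locally';

    public function handle(): int
    {
        // Hypothetical feed URL and key, mapped from .env via config/services.php
        $response = Http::timeout(5)->get(config('services.market.url'), [
            'apikey' => config('services.market.key'),
        ]);

        if ($response->failed()) {
            $this->error('Price feed request failed with status ' . $response->status());
            return 1;
        }

        // Keep the cached price a little longer than the polling interval.
        $price = $response->json()['price'] ?? null;
        Cache::put('market:last_price', $price, now()->addSeconds(90));

        return 0;
    }
}
```

The command could be scheduled in app/Console/Kernel.php with $schedule->command('market:refresh-price')->everyMinute();, and order confirmation would then read Cache::get('market:last_price'). Note that the ~2,678,400 monthly requests mentioned above work out to one per second (31 × 86,400 = 2,678,400), which is finer than the scheduler's default one-minute granularity, so at that rate a long-running worker or queued loop would be needed rather than a plain cron entry.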

Related

How to work in a Laravel app with an external API?

In my Laravel 5.7/MySQL app I need to work with an external API: read some data from the external
app with a GET request and write some data to my DB with a POST request.
Which tools/scripts are there for this, and how do I make these requests safe?
MODIFIED:
Thanks for the feedback, but it looks like I put my question badly.
The external app (I do not know what it is written with) needs to read data from my app
and write data to my Laravel 5 app.
And how do I test these requests while developing locally?
It looks like I have to use Guzzle, as in the provided link?
Which steps do I have to take for safety on my side?
Thanks!
These three libraries are popular for your use-case:
Guzzle
Curl
zttp
If the database is local you can use Eloquent; if not, a remote connection to that database may help. Otherwise, if you only have API access, you should use either of the above libraries (or any alternative) to make whatever HTTP requests your application requires.
Security-wise, as long as you are only making requests to a remote server, the suggested way is to store any key or secret string used to authorize your request (if applicable) in your .env file, to prevent it from being committed to your version control system. Needless to say, always handle any HTTP error the remote API might throw, to prevent unwanted errors on your application's side.
And as Abir Adak mentioned in the comment, check this thread for further details.
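For example, a minimal Guzzle sketch along those lines, with the key kept in .env and basic error handling; the base URI, the endpoint paths, and the services.market.key config entry are placeholders, not something from the original question:

```php
<?php

// A minimal Guzzle sketch: one GET to read data from the external app and one
// POST to write data back, with the API key kept in .env (exposed through
// config/services.php) and errors handled. Base URI and paths are placeholders.

use GuzzleHttp\Client;
use GuzzleHttp\Exception\GuzzleException;

$client = new Client([
    'base_uri' => 'https://api.example.com/',
    'timeout'  => 5.0,
    'headers'  => [
        // MARKET_API_KEY lives in .env and is mapped in config/services.php
        'Authorization' => 'Bearer ' . config('services.market.key'),
        'Accept'        => 'application/json',
    ],
]);

try {
    // Read some data with a GET request.
    $items = json_decode((string) $client->get('v1/items')->getBody(), true);

    // Write some data with a POST request.
    $client->post('v1/orders', [
        'json' => ['item_id' => 1, 'price' => 9.99],
    ]);
} catch (GuzzleException $e) {
    // Any HTTP or connection error the remote API might throw ends up here.
    report($e);
}
```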
Updated answer: for the MODIFIED part, you generally have 3 popular options:
REST API
This blog post is a detailed walkthrough written for Laravel.
This one from Stack Overflow can help you with designing your API.
This last one can help you develop widely accepted API responses and endpoints by following its specifications (see the route sketch after this list).
GraphQL
Can save some time when developing your API, but I suggest making sure that the consumers of your API are happy to use this option.
GraphQL
Laravel Package for GraphQL
If using Laravel isn't a must, and you are using PostgreSQL, you might want to look at Hasura as well.
SOAP
I have little knowledge of this option for Laravel; I just know that folks coding in C# and .NET are happier to expose their APIs with this protocol. Read more about it on Wikipedia.
Postman is a great tool for testing your API or any other API.
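To make the REST option above concrete, here is a minimal route sketch for the MODIFIED case (the external app reads with GET and writes with POST), roughly as it could look in a Laravel 5.7 routes/api.php. The Item model, the validation rules, and the auth.external token middleware are placeholders rather than a full auth setup:

```php
<?php

// routes/api.php -- sketch only; Item, the rules, and the auth.external
// middleware (e.g. a shared-token check against a value kept in .env) are
// placeholders. Laravel's api token guard or Passport are fuller options.

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;
use App\Item;

Route::middleware('auth.external')->group(function () {
    // The external app reads data from this app.
    Route::get('/items', function () {
        return response()->json(Item::latest()->take(50)->get());
    });

    // The external app writes data to this app's database.
    Route::post('/items', function (Request $request) {
        $data = $request->validate([
            'name'  => 'required|string|max:255',
            'price' => 'required|numeric',
        ]);

        return response()->json(Item::create($data), 201);
    });
});
```

These routes can then be exercised with Postman or any HTTP client, which also covers the "how do I test these requests while developing locally" part of the question.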

Mix Panel API web segmentation and personalisation

Hi, I am interested in using Mixpanel on a website to track customer events. I would like to know if there is any way to use the API to personalise the website per customer, similar to segmentation for emails.
I would like to query the API for a single customer, asking whether they have achieved several events.
For example, something like:
If the customer has clicked out and their last visit was more than a month ago, display a banner advert.
Mixpanel does not seem like the correct tool for the job you describe here.
While theoretically this might be possible (via Mixpanel's HTTP API), this will create unnecessary architectural complexity and add extra latency. If you need to customize your web site per user, store any user state in a database like MySQL or PostgreSQL. This will be both faster and easier.
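As a rough illustration of that database approach (the events table, its columns, and the connection details are all hypothetical), the rule from the question, "clicked out and last visit more than a month ago", becomes a single query against your own store:

```php
<?php

// Rough sketch: decide whether to show the banner from a local events table
// instead of querying Mixpanel per visitor. The `events` table, its columns,
// and the PDO connection details are hypothetical.

$pdo = new PDO('mysql:host=localhost;dbname=analytics', 'user', 'secret');

$customerId = 42; // however you identify the visitor (session, login, cookie)

$stmt = $pdo->prepare(
    'SELECT
         MAX(CASE WHEN name = :event THEN 1 ELSE 0 END) AS has_clicked_out,
         MAX(occurred_at)                                AS last_visit
     FROM events
     WHERE customer_id = :customer'
);
$stmt->execute(['event' => 'clicked_out', 'customer' => $customerId]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

$showBanner = $row
    && $row['has_clicked_out']
    && $row['last_visit'] !== null
    && strtotime($row['last_visit']) < strtotime('-1 month');

if ($showBanner) {
    echo '<div class="banner">Welcome back! Here is our latest offer.</div>';
}
```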

Syncing with an External API in Volt Ruby Framework

I'm looking into Volt as an option for building an Admin interface to our REST API. The API is a separate application. I would like the Admin application to persist data to the API but also store its own data that is irrelevant to the API (such as admin users and notes on the API data objects) locally.
Is there a way to sync each local change in the Admin with our remote API, like a callback, for example? Or do I need to wait until the Data Provider API is ready as mentioned in the most recent Volt blog post (as of writing)?
This is a fairly common thing, so I think the long-term solution will be to support multiple stores in an app and have a REST data provider that you can extend. However, it might be a while before that's ready. In the meantime, you can always load and save data via tasks. (I realize it's not ideal, but it will work right now.) Let me know if you need more info on using tasks to load and save. I'll add the REST data provider to the TODO list.

Should I do API requests server side or client side?

I am trying to make a web app using ExpressJS and CoffeeScript that pulls data from Amazon, LastFM, and Bing's web APIs.
Users can request data such as the prices for a specific album from a specific band, upcoming concert times and locations for a band, etc... stuff like that.
My question is: should I make these API calls client-side using jQuery and getJSON or should they be server-side? I've done client-side requests; how would I even make an API call from the server side?
I just want to know what the best practice is, and also if someone could point me in the right direction for making server-side API requests, that would be very helpful.
Thanks!
There are two key considerations for this question:
Do calls incur any data access? Are the results just going to be written to the screen?
How & where do you plan to handle errors? How do you handle throttling?
Item #2 is really important here because web services go down all of the time for a whole host of reasons. Your calls to Bing, Amazon & Last FM will fail probably 1% or 0.1% of the time (based on my experiences here).
To make requests using server-side JS you probably want to take a look at the Request package on NPM.
It's often good to abstract away your storage and dependent services to isolate changes and offer a consolidated and consistent web api for your application. But sometimes, if you have a good hypermedia web api (RESTful responses link to other resources), you could reference a resource link from another service in the response from your service (ex: SO request could reference gravatar image/resource of user). There's no one size fits all - it depends on whether you want to encapsulate the dependency or integrate with it.
It might be beneficial to make the web-api requests from your service exposed via expressjs as your own web-apis.
Making http web-api requests is easy from node. Here's another SO post covering that:
HTTP GET Request in Node.js Express
Well, the way you describe it, I think you want to fetch data from Amazon, Last.fm and so on, process it with Node, save it in your database, and provide your own API.
You can use Node's http.request() to fetch the data and build your own REST API with Express.js.

Recommended way to send details to the client for a web analytics service

When creating a service like Google Analytics or StatCounter, I want to do it a little bit differently in the data storage part:
A user visits my client's website.
JS code or 1 pixel image is downloaded from my server.
Request sent to my server, where the data is processed.
Things like country, returning customer, bounce rate, etc. are calculated.
Instead of storing this data in my server, I want to store it in the client's server.
The client is an individual or business who is using my "service" for web analytics of their website.
Assuming that they are prepared to create a db schema that I choose, what is the recommended way to send the data to them to store?
The only thing I can think of is, asking them to give me a URL in their server, to which I will POST a JSON string, which they can store or do whatever they want.
Apart from HTTP POST, are there any other choices I have for sending the data to them?
You could store the data on your own server then provide a mechanism for the client to download it. This would save you the burden of entering and testing a different URL for each customer.
It would also mean that you would only need one SSL URL and authentication method for security. Otherwise you would need to make sure each customer has a working SSL and get your script to log onto each of them when it deposits the data.
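As a sketch of that pull model (one HTTPS endpoint on your server, one API key per customer; the api_keys and hits tables below are hypothetical), each customer's server would periodically fetch something like this:

```php
<?php

// export.php -- sketch of the pull model: each customer downloads their own
// analytics rows over HTTPS using an API key, instead of you POSTing to a
// different URL per customer. Table and column names are hypothetical.

$pdo = new PDO('mysql:host=localhost;dbname=analytics', 'user', 'secret');

// Identify the customer from the key they send (query string or header).
$stmt = $pdo->prepare('SELECT customer_id FROM api_keys WHERE api_key = :key');
$stmt->execute(['key' => $_GET['key'] ?? '']);
$customerId = $stmt->fetchColumn();

if ($customerId === false) {
    http_response_code(401);
    exit;
}

// Return this customer's hits recorded since their last download.
$since = $_GET['since'] ?? '1970-01-01 00:00:00';
$stmt = $pdo->prepare(
    'SELECT visited_at, country, is_returning, page
     FROM hits
     WHERE customer_id = :customer AND visited_at > :since
     ORDER BY visited_at'
);
$stmt->execute(['customer' => $customerId, 'since' => $since]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```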
