How to avoid re-sending already client-cached data with GraphQL? - caching

Hi everybody, please pardon my english :-)
I have a JS client that requests data with GraphQL and stores it.
It happens that a GraphQL request asks for data that is already cached on the client, and I'm looking for a way to avoid resending it.
Example
Authors may have books
A first query asks for the 20 latest books (for the main page, for example)
A second query asks for the books of a specific author
This is probably an example of unnecessary cache usage given the small amount of data, but it illustrates the problem.
Potential ways
Use a server cache, as answered here: How does caching work in GraphQL?
Send the ids of already client-cached data with the request so the server can leave that data out of the answer if it is still valid - but how to do that?
Create a "request id" and let the server tell whether the data has changed since the last query - but that seems heavy for the server...
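The second idea can be sketched roughly like this: the client attaches the ids it already holds, the server replaces those records with id-only stubs, and the client re-joins them from its cache. All names below are hypothetical illustrations, not part of any GraphQL library, and this only works while the cached copies are still valid - you still need an invalidation strategy.

```javascript
// Client side: collect cached ids and attach them to the request payload.
function buildRequest(query, cache) {
  return {
    query,
    cachedIds: Object.keys(cache), // ids the server may omit
  };
}

// Server side: resolve the full result, then replace records the client
// already holds with a stub carrying only the id.
function stripCached(records, cachedIds) {
  const cached = new Set(cachedIds);
  return records.map((r) =>
    cached.has(r.id) ? { id: r.id, __cached: true } : r
  );
}

// Client side again: merge the stubs back from the local cache.
function mergeResponse(records, cache) {
  return records.map((r) => (r.__cached ? cache[r.id] : r));
}
```

The saving is only in payload size, not in server work: the server still resolves the full result before stripping it, which is why the question calls this unnecessary for small data sets.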
Thank you in advance for your answers!

Related

Process request and return response in chunks

I'm making a search aggregator and I've been wondering how I could improve the performance of the search.
Given that I'm getting results from different websites, currently I need to wait to receive the results from each provider, but this is done one after another, so the whole request takes a while to respond.
The easiest solution would be to just make a request from the client for each provider, but this would end up with a ton of requests per search (but if this is the proper way, I'll just do it).
What I've been wondering is whether there's a way to return results every time a provider responds, so if we have providers A, B and C and B has already returned results, send them back to the client. For this to work, all the searches would need to run in parallel, of course.
Do you know a way of doing this?
I'm trying to build a search experience similar to SkyScanner, that loads results but then you can see it still keeps getting more records and it sorts them on the fly (on client side as far as I can see).
Caching is the key here. Best practice for external APIs (or scraping) is to be as small a 'taker' as possible. So in your Laravel setup, get your results, but cache them for as long as makes sense for your app. Although the odds in a SkyScanner-like situation are low that two users will make the exact same request, the odds are much higher that a user will make the same request multiple times, or may share the link, etc.
https://laravel.com/docs/8.x/cache
// Store a value for 10 minutes under 'key' (e.g. results keyed by search params).
cache(['key' => 'value'], now()->addMinutes(10));
// Later requests read the cached copy instead of hitting the provider again.
$value = cache('key');
To actually scrape the content, you could use this:
https://github.com/softonic/laravel-intelligent-scraper
Or to use an API which is the nicer route:
https://docs.guzzlephp.org/en/stable/
On the client side, you could just make a few calls to your own service in separate requests and that would give you your asynchronous feel you're looking for.
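The "few calls in separate requests" idea can be sketched client-side like this: fire every provider at once and hand each result set to the UI as soon as that provider answers, instead of waiting for the slowest one. The provider functions here are stand-ins for your real HTTP calls; the names are illustrative.

```javascript
// Query all providers in parallel; invoke onResults per provider as each
// one resolves, so the UI can render and re-sort incrementally.
function searchAll(providers, onResults) {
  const inFlight = providers.map((p) =>
    p().then(
      (results) => onResults(results),
      () => onResults([]) // a failing provider just contributes nothing
    )
  );
  // Resolves once every provider has answered (or failed).
  return Promise.all(inFlight);
}
```

A typical use would be `searchAll([fetchA, fetchB, fetchC], (r) => { merged.push(...r); render(merged); })`, which gives the SkyScanner-style "results keep arriving and get re-sorted" feel.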

Delta handling in GoogleClassroom

We are integrating Google Classroom to sync data from Google Classroom into our application.
We have the following queries:
Let's say today we fetched data from Google Classroom and we got a course named "XYZ".
After one week we fetched the data again; if the course "XYZ" retrieved above has been deleted or made inactive, it will come back with type "Deleted" / "Inactive".
For how long will this deleted/inactive state for these courses keep coming, and how can we handle deltas after doing the first sync?
Thanks
Thanks
It's tough to understand what you're asking, partially due to language. Let me see if I can understand.
Are you asking how long a course will be marked as "archived" after it was changed to archived by the user? If so, the answer should be "indefinitely."
If you're asking a different question, perhaps showing us what code you're using or API endpoints you're hitting, along with its return information, will help us understand.
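If the question is about handling deltas locally, one common approach is to keep a snapshot of the course list from the previous sync and diff it against the current one: courses that have disappeared or flipped from ACTIVE to ARCHIVED are your deletions/archivals. This is a sketch of local bookkeeping, not an API call; the field names (id, courseState) mirror the Classroom course resource, but the function itself is hypothetical.

```javascript
// Diff two sync snapshots of the course list.
function courseDeltas(previous, current) {
  const prevById = new Map(previous.map((c) => [c.id, c]));
  const currById = new Map(current.map((c) => [c.id, c]));

  const added = current.filter((c) => !prevById.has(c.id));
  const removed = previous.filter((c) => !currById.has(c.id));
  const archived = current.filter((c) => {
    const old = prevById.get(c.id);
    return old && old.courseState === 'ACTIVE' && c.courseState === 'ARCHIVED';
  });
  return { added, removed, archived };
}
```

Note that archived courses may need to be requested explicitly when listing (the API filters by course state), otherwise an archived course can look like a removal.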

Queries on the Parse

I would like to ask the below queries. Apologies if it was asked before, I couldn't find them.
With respect to the new pricing, it is mentioned that "You can send us your analytics events any time without being limited by your app's request limit." - Does that mean interactions with Parse analytics do not count towards the overall API request limit set for the app?
From the answers to queries posted a while back in the forums, there was some distinction between normal and premium customers - is there any distinction now?
I am using the Android SDK - just out of curiosity, can two (or more) objects have the same object id by any chance?
Thanks.
Answers to all of your questions:
Correct. Analytics does not contribute to API request limits or burst limit.
On the new pricing, there is no longer a distinction. All previously "Pro-Only" features are available to everyone.
No, multiple objects cannot / will not have duplicate objectId values.

Send data to browser

An example:
Say, I have an AJAX chat on a page where people can talk to each other.
How is it possible to display (send) the message sent by person A to persons B, C and D while they have the chat open?
I understand that technically it works a bit differently: the chat (ajax) reads from the DB (or another source), say every second, to find out if there are new messages to display.
But I wonder if there is a method to send the new message to the rest of the people right when it is sent, rather than loading the DB with thousands of reads every second.
Please note that the AJAX chat example is just an example to explain what I want, and is not something I want to build. I just need to know if there is a method to let all the browsers open at a specific page (ajax) know that there is new content on the server that should be fetched.
{sorry for my English}
Since the server cannot respond to a client without a corresponding request, you need to keep state for each user's queued message. However, this is exactly what the database accomplishes. You cannot get around this by replacing the database with something that doesn't just accomplish the same thing in a different way. That said, there are surely optimizations you could do. Keep in mind, however, that you shouldn't prematurely optimize situations like this; databases are designed to handle extremely high traffic, and it's very possible (and in fact, likely), that the scenario described will be handled just fine by the database out of the box.
What you're describing is generally referred to as the 'Comet' concept. See the Wikipedia article for details, especially implementation options (long polling, etc.).
Another answer is to have the server push changes to connected clients, that way there is just one call to the database and then the server pushes the change to all the clients. This article indicates it is possible, however I have never tried this myself.
It's very basic, but if you want to stick with a standard AJAX solution, a simple means of reducing load on the server when polling would be to get the AJAX call to forward the last collected comment ID for that client - you then use that (with the appropriate escaping) in the lookup query on the server side to ensure you only return new comments.
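The "forward the last collected comment ID" idea can be sketched like this: the server keeps comments ordered by an increasing id and only returns the ones newer than the id the client sends. The names are illustrative, not from any framework, and the sketch assumes numeric, monotonically increasing ids.

```javascript
// Server side: return only comments newer than the client's last seen id.
function commentsSince(allComments, lastSeenId) {
  return allComments.filter((c) => c.id > lastSeenId);
}

// Client side: remember the highest id received and send it with every poll,
// so each response carries only the new comments.
function makePoller(fetchSince, onNew) {
  let lastSeenId = 0;
  return async function pollOnce() {
    const fresh = await fetchSince(lastSeenId);
    if (fresh.length > 0) {
      lastSeenId = fresh[fresh.length - 1].id;
      onNew(fresh);
    }
  };
}
```

Driving `pollOnce` from `setInterval` gives the standard polling loop; the payload stays small because repeat polls with no new comments return an empty list.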

ajax architecture question

I have a page with 3 layers: one for navigation, one for database records and one for results. When I click on a database record, the results are displayed in the results layer via ajax. For navigation, the links will simply be different queries. I am wondering whether it would make more sense to have each different query sent as ajax data and placed into the records layer, or rather to have the query appended to the php file each time. Which is the more efficient approach?
Well, sending a different AJAX request would be what I recommend, as:
Performance-wise, it will reduce response times, since only the POST data is sent and data bytes received. The page can then format the result once it receives the XMLHttpRequest response.
Security-wise: I prefer using POST to GET, as it gives at least some opaqueness as to what is being passed as a parameter, so not just anyone can edit the URL and play around. Plus, you don't have the URL length restriction when passing parameters in POST.
So, I'd say fire an XMLHttpRequest on each link and display the response in the Results layer (pane/div) on the page.
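A minimal sketch of that per-link POST approach, using the modern fetch API instead of raw XMLHttpRequest; the endpoint, parameter names, and element id are illustrative, not from the question:

```javascript
// URL-encode the query parameters for the POST body.
function buildQueryBody(params) {
  return new URLSearchParams(params).toString();
}

// Fire one POST per navigation link and drop the response into the
// results pane (browser-only; 'results' is a hypothetical element id).
async function loadResults(endpoint, params) {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: buildQueryBody(params),
  });
  document.getElementById('results').innerHTML = await res.text();
}
```

Each navigation link would then call something like `loadResults('records.php', { q: 'recent' })` from its click handler, keeping the query out of the URL.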
I think your question is quite unspecific and confusing.
What is "appended to the php file"?
Are you really concerned about efficiency? I mean, how fast should the results be displayed? Or are you concerned about the server workload?
Have you read this tutorial? Prototype introduction to Ajax
I think it should answer most of your questions and give enough example code to continue.

Resources