I have a basic "input search" using AJAX (AngularJS, Elasticsearch and Lumen) and I need to implement a per-user search history in my application. I have some ideas on this:
- Save the record on the backend every time the search URI is called (though maybe this would increase the response time too much).
- Save each search on the frontend and, on page change or after a short delay (a second, say), send the batch to the backend (sketched below).
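For what it's worth, the second idea could look something like this on the frontend; a minimal sketch, assuming AngularJS 1.x, an existing app module, and a hypothetical /api/search-history endpoint:

// Buffer searches locally and flush them in one batch after a short delay.
app.factory('searchHistory', function ($http, $timeout) {
  var buffer = [];
  var pending = null;

  function flush() {
    if (buffer.length === 0) return;
    // Send everything collected so far in a single request (hypothetical URI).
    $http.post('/api/search-history', { searches: buffer.splice(0) });
  }

  return {
    record: function (term) {
      buffer.push({ term: term, at: Date.now() });
      if (pending) $timeout.cancel(pending);
      pending = $timeout(flush, 1000); // flush one second after the last search
    },
    flush: flush // also call this on page change
  };
});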
What would be the best way to optimize this?
Since you are calling the search method anyway, you can store the search history right there; I'd say the first approach is good. Saving it on the frontend first and then making one extra call to save those search terms would not be a good idea. That's my thinking.
I'm making a search aggregator and I've been wondering how I could improve the performance of the search.
Given that I'm getting results from different websites, I currently need to wait for the results from each provider, but this is done one after another, so the whole request takes a while to respond.
The easiest solution would be to just make a request from the client for each provider, but this would end up with a ton of requests per search (though if that's the proper way, I'll just do it).
What I've been wondering is whether there's a way to return results every time a provider responds: if we have providers A, B and C, and B has already returned results, send those back to the client. For this to work, all the searches would need to run in parallel, of course.
Do you know a way of doing this?
I'm trying to build a search experience similar to SkyScanner, which loads results but then visibly keeps fetching more records and sorts them on the fly (on the client side, as far as I can see).
Caching is the key here. Best practice for external APIs (or scraping) is to be as little of a 'taker' as possible. So in your Laravel setup, get your results, but cache them for as long as makes sense for your app. Although in a SkyScanner-like situation the odds are low that two users will make the exact same request, the odds are much higher that a single user will make the same request multiple times, or share the link, etc.
https://laravel.com/docs/8.x/cache
cache(['key' => 'value'], now()->addMinutes(10));
$value = cache('key');
To actually scrape the content, you could use this:
https://github.com/softonic/laravel-intelligent-scraper
Or to use an API which is the nicer route:
https://docs.guzzlephp.org/en/stable/
On the client side, you could just make a few calls to your own service in separate requests, and that would give you the asynchronous feel you're looking for.
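A minimal sketch of that client side; the per-provider endpoints, the query variable and the renderResults function are assumptions, not part of any library mentioned above:

// Fire one request per provider; each one renders as soon as it responds,
// instead of waiting for the slowest provider.
const providers = ['a', 'b', 'c'];

providers.forEach(provider => {
  fetch('/search/' + provider + '?q=' + encodeURIComponent(query))
    .then(res => res.json())
    .then(results => renderResults(provider, results)) // append and re-sort on the fly
    .catch(err => console.error(provider, err));
});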
Instead of the traditional posting of forms (with a save button) to save data to a database using ColdFusion, is there a sensible way of having information saved as the user exits each field?
Is this even good practice?
All you need to do, via JavaScript, is assign a change event to every field, and have that event make an Ajax call to save the data in that particular field. You would need a single target URL that takes some primary key and the field name in question.
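Something along these lines; a minimal sketch, where the form selector, the recordId variable and the save URL are assumptions:

// Save a single field whenever the user leaves it with a changed value.
document.querySelectorAll('#myForm input, #myForm select').forEach(field => {
  field.addEventListener('change', () => {
    fetch('/saveField.cfm', {                    // one target URL on the server
      method: 'POST',
      headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
      body: new URLSearchParams({
        id: recordId,                            // primary key of the row being edited
        field: field.name,                       // which column to update
        value: field.value
      })
    });
  });
});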
What you really need to consider, though, is the bandwidth required to support such a process. What is your current load? Concurrent users? Concurrent form usage?
If you have 100 people filling out a 10 field form, you currently have 100 HTTP POST requests to deal with. Can you handle 1000 HTTP POST calls if every field saves on its own? What about 1000 people at a time? 10k? 100k? And larger forms, how many of those do you have?
The functionality is fairly trivial to implement, what is not trivial is the potential impact on your infrastructure.
How can I abort a store load while the ajax call is still executing? I have a simple store with proxy type of 'ajax' and 'json' reader.
The documentation does not indicate any way to abort this. I have noticed that jsonp does allow aborting a load in progress. Do I have to switch to jsonp?
The motivation here is that I have a search bar and a list object that gets populated with results. The actual search on the backend can take 5-10 seconds. So if a user starts a search and then quickly wants to do another one (if, for example, the first search was a typo), the new search needs to abort the first search's ajax call. Otherwise, I am seeing mixed results showing up in my search results.
As usual, any help is greatly appreciated!
Mohammad
The solution I have used in the past to solve this exact problem is to track each request with an incrementing counter; as requests complete, I check the counter, and if a request with a higher counter has since been made, I disregard the result.
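A minimal sketch of that counter idea, assuming Ext's Ext.Ajax, a store variable with a loadData method, and a hypothetical /search URL:

// Tag every search with an incrementing id; when a response arrives,
// ignore it if a newer search has been issued in the meantime.
var latestRequest = 0;

function runSearch(query) {
  var requestId = ++latestRequest;

  Ext.Ajax.request({
    url: '/search',                       // assumed endpoint
    params: { q: query },
    success: function (response) {
      if (requestId !== latestRequest) {
        return;                           // a newer search superseded this one
      }
      store.loadData(Ext.decode(response.responseText));
    }
  });
}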
I have a site developed in CodeIgniter.
On the search page I have a form; when I fill it in, a request is sent to a server with cURL, which returns XML.
This query plus rendering takes about 15 seconds, because I have to run several queries against many servers, and that time is unavoidable.
The problem is: I get a list of elements, and when I click on an element I run a query to retrieve that element's data.
But if I click back, to return to the full list of searched elements, I don't want to run another query that takes 15 seconds.
When I search, it's a GET request with a URL like this:
http://myurl/backend/hotel/hotel_list?nation=94&city=1007&check-in=12%2FApr%2F2013&check-out=13%2FApr%2F2013&n_single_rooms=1&n_double_rooms=0&n_triple_rooms=0&n_extra_beds=0
The page loads and I can have several elements. I click on one of them via a simple link like this:
http://myurl/backend/hotel/hotel_view?id_service=tra_0_YYW
From this page I then need to go back to the previous URL (the first one) without re-running the query that takes so long.
I can't cache the result permanently, because it's a realtime database that changes every minute or even second; but I thought of caching the search page when I enter it and, when I go back, reloading it from the cache if it is less than 2 minutes old, for example.
Is this a good approach, or is there a more performant way to do this in CodeIgniter?
I can't put it in the session because the data is large.
The other solutions are:
- cache the page (but I have to delete it every minute)
- cache the result (but I have to delete it every minute)
- create session flashdata (but I have a large amount of data)
Is there a way to make the browser not rebuild the page when I go back?
Thanks
cache the page (but I have to delete it every minute)
I think you can easily implement this with CodeIgniter's page caching function $this->output->cache(1);
cache the result (but I have to delete it every minute)
You would have to use CodeIgniter's object caching to implement this.
create session flashdata (but I have a large amount of data)
It's not a good idea to store huge data in the session. Use database sessions instead, which work in a similar way and which CodeIgniter supports out of the box.
Hope this helps. You can read more about the various kinds of CodeIgniter caching if you are just starting out with it.
I have a page with 3 layers: one for navigation, one for database records and one for results. When I click on a database record, the results are displayed in the results layer via Ajax. For navigation, the links will simply be different queries. I am wondering whether it would make more sense to have each different query sent as Ajax data and placed into the records layer, or rather to have the query appended to the php file each time. Which is the more efficient approach?
Well, sending a separate AJAX request is what I would recommend, because:
- Performance-wise, it will reduce response times, as only the POST data is sent and the response bytes received; the page can then format the data once it receives the XMLHttpRequest response.
- Security-wise, I prefer POST over GET, as it gives at least some opaqueness as to what is being passed as a parameter, so not just anyone can edit the URL and play around. Plus, you don't have the URL length restriction when passing parameters in POST.
So, I'd say fire an XMLHttpRequest on each link and display the response in the results layer (pane/div) on the page.
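A minimal sketch of that; the URL, the parameter name and the results element id are assumptions:

// POST the chosen query to the server and drop the response into the
// results layer, without reloading the page.
function loadResults(queryName) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/records.php', true);
  xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
  xhr.onload = function () {
    document.getElementById('results').innerHTML = xhr.responseText;
  };
  xhr.send('query=' + encodeURIComponent(queryName));
}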
I think your question is quite unspecific and confusing.
What is "appended to the php file"?
Are you really concerned about efficiency? I mean, how fast should the results be displayed? Or are you concerned about the server workload?
Have you read this tutorial? Prototype introduction to Ajax
I think it should answer most of your questions and give enough example code to continue.