CodeIgniter cache page

I have a site developed in CodeIgniter.
On the search page I have a form; when I submit it, I send a request to a server with cURL and get back an XML response.
Running this query and printing the results takes about 15 seconds, because I have to run queries against several servers, and that time is unavoidable.
The problem is this: I get a list of elements, and when I click on one of them I run a query to retrieve that element's data.
But if I then click back to return to the list of search results, I don't want to rerun the query that takes 15 seconds.
The search is a GET request, with a URL like this:
http://myurl/backend/hotel/hotel_list?nation=94&city=1007&check-in=12%2FApr%2F2013&check-out=13%2FApr%2F2013&n_single_rooms=1&n_double_rooms=0&n_triple_rooms=0&n_extra_beds=0
The page loads and can contain several elements. I click on one of them through a simple link like this:
http://myurl/backend/hotel/hotel_view?id_service=tra_0_YYW
From this page I need to be able to go back to the previous URL (the first one) without rerunning the query that takes all those seconds.
I can't cache the result permanently, because it comes from a realtime database that changes every minute or even every second. But I thought of caching the search page when I enter it, and, if I go back to it, reloading it from the cache if it is less than, say, 2 minutes old.
Is this a good approach, or is there a more performant way to do this in CodeIgniter?
I can't put the data in the session, because it is too large.
The options I see are:
- cache the page (but I have to delete it every minute)
- cache the result (but I have to delete it every minute)
- use session flashdata (but I have a large amount of data)
Is there a way to make the browser go back without rebuilding the page?
Thanks

cache the page (but I have to delete it every minute)
I think you can implement this easily with CodeIgniter's page caching method, "$this->output->cache($n);", where $n is the cache lifetime in minutes.
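A minimal sketch of what that looks like in your controller (the do_slow_curl_search() helper and the view name are placeholders; also check how your CodeIgniter version builds the cache key, so that searches with different query strings don't share one cache file):

<?php
class Hotel extends CI_Controller {

    public function hotel_list()
    {
        // Cache the rendered output for 2 minutes; within that window
        // CodeIgniter serves the cached file and skips the slow cURL search.
        $this->output->cache(2);

        $results = $this->do_slow_curl_search(); // placeholder for the 15-second search
        $this->load->view('hotel_list', array('results' => $results));
    }
}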
cache the result (but I have to delete it every minute)
You will have to use CodeIgniter's caching driver (object caching) to implement this.
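A minimal sketch with the Cache driver (the adapter choice, the key scheme and the 120-second lifetime are assumptions; do_slow_curl_search() is again a placeholder):

<?php
// Try APC first, fall back to file-based caching.
$this->load->driver('cache', array('adapter' => 'apc', 'backup' => 'file'));

// Key the cached result set on the search parameters.
$key = 'hotel_search_' . md5($this->input->server('QUERY_STRING'));

if ( ! $results = $this->cache->get($key))
{
    $results = $this->do_slow_curl_search();   // placeholder for the 15-second search
    $this->cache->save($key, $results, 120);   // keep it for 2 minutes
}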
use session flashdata (but I have a large amount of data)
It's not a good idea to store huge data in the session. Rather, use database sessions instead, which handle it in a similar way, and CodeIgniter has integrated support for them.
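Enabling database sessions is just configuration (table name as in the stock CodeIgniter 2 install; you still need to create the ci_sessions table from the session documentation):

// application/config/config.php
$config['sess_use_database'] = TRUE;
$config['sess_table_name']   = 'ci_sessions';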
Hope this helps. You can read more about all kinds of CodeIgniter caching if you are just getting started with it.

Related

Web forms: Go back in history without refreshing page

Is it possible to go back to a page without reloading it?
I am developing a Web Forms website, and every time I go back in history, the page reloads (and takes a long time).
Honestly, no.
The life cycle of a Web Form is very specific and the page goes through it every time it is run (that is every time you request it through your browser).
On the other hand, you can always optimize your page to make it load faster. How you do that depends on many things, one of which is what code runs on the server side when the page loads, and whether any portions of that code can be optimized for speed or moved into event handlers to be executed at a later point in time. For example, if you're fetching data from a database when your page loads, consider applying paging to narrow the number of selected rows.
Please, feel free to ask a new question if you decide to take that course of action.

how to implement search log?

I have a basic "input search" using AJAX (AngularJS, Elasticsearch and Lumen), and I need to implement a per-user search history in my application. I have some ideas on this:
Save the record on the backend every time the search URI is called (but maybe this would increase the response time too much).
Save each search on the frontend and, on page change or after a short delay (say, a second), send the batch to the backend.
What would be the best way to optimize this?
Since you are already calling a backend method for every search, that is the natural place to store the search history, so I think the first approach is good.
Saving it on the frontend first and then making one extra call to persist those search terms would not be a good idea.
This is my thinking.
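A minimal sketch of the first approach in a Lumen controller (the search_logs table, its columns and the controller name are assumptions, not something from the question):

<?php
namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\DB;

class SearchController extends Controller
{
    public function search(Request $request)
    {
        $term = $request->input('q', '');

        // One extra INSERT per search; cheap next to the Elasticsearch call itself.
        DB::table('search_logs')->insert(array(
            'user_id'    => $request->user() ? $request->user()->id : null, // assumes auth is set up
            'term'       => $term,
            'created_at' => date('Y-m-d H:i:s'),
        ));

        // ... run the Elasticsearch query and return the results ...
    }
}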

Perform sever-side caching of 3rd party images

I just added some functionality to my site so that when a user hovers their mouse over a link (to a third-party page), a preview of the link is built from the meta tags on the target page and displayed. I'm worried about the hot-linking implications of my current implementation.
I'm now thinking of implementing some kind of server-side caching, such that the first request for a preview fetches the info and image from the target page, but each subsequent request (up to some age limit) is served from a cache on my host. I'm relatively confident that I could roll my own, but is there an off-the-shelf solution for something like this? I'm self-taught, so I'm guessing my DIY solution would be less than optimal. Thanks.
Edit: I implemented a DIY solution (see below), but I'm still open to suggestions on how this could be done efficiently.
I couldn't find any off-the-shelf solutions so I wrote one in PHP.
It accepts a URL as an HTTP GET parameter and does some error checking. If the checks pass, it opens a JSON-encoded database from disk and parses the data into an array of Record objects containing the info I want, keyed by the supplied URL. If the key exists in the array, the cached info is returned. Otherwise the web page is fetched, the meta tags are parsed, the image is saved locally, and the cached data is returned. The new info is then inserted into the database. After the cached info is returned to the requesting page, each record is examined for its expiration date and expired records are removed. Each request for a cached record extends its expiration date. Lastly, the database is JSON-encoded and written back to disk.
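A condensed sketch of that flow (the file names, record fields and one-day age limit are assumptions; saving the image locally is omitted for brevity):

<?php
// preview.php - returns cached link-preview data as JSON
$url = filter_input(INPUT_GET, 'url', FILTER_VALIDATE_URL);
if (empty($url)) {
    http_response_code(400);
    exit('Invalid url');
}

$dbFile = __DIR__ . '/preview-cache.json';
$ttl    = 86400; // assumed age limit: one day
$cache  = is_file($dbFile) ? json_decode(file_get_contents($dbFile), true) : array();

if (isset($cache[$url]) && $cache[$url]['expires'] > time()) {
    $cache[$url]['expires'] = time() + $ttl; // extend the expiration on each hit
} else {
    // Fetch the target page and pull out the meta tags we care about.
    $doc = new DOMDocument();
    @$doc->loadHTML(file_get_contents($url));
    $record = array('title' => '', 'description' => '', 'expires' => time() + $ttl);
    foreach ($doc->getElementsByTagName('meta') as $meta) {
        if ($meta->getAttribute('name') === 'description') {
            $record['description'] = $meta->getAttribute('content');
        }
    }
    $titles = $doc->getElementsByTagName('title');
    if ($titles->length > 0) {
        $record['title'] = trim($titles->item(0)->textContent);
    }
    $cache[$url] = $record;
}

$response = $cache[$url];

// Drop expired records, then write the database back to disk.
$cache = array_filter($cache, function ($r) { return $r['expires'] > time(); });
file_put_contents($dbFile, json_encode($cache));

header('Content-Type: application/json');
echo json_encode($response);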

Load preliminary results, then reload when completed (Django or ajax)

I have a page that is taking far too long to load, because it requires 50+ objects to be fetched from the database.
I would like to load the page with only, say, the first 10 results, then let the server get on with loading the rest in the background, and then refresh the page.
Is there a way of doing something like:
def foo_view(request):
    # Render a quick response with just the first 10 objects...
    values = Foo.objects.all()[:10]
    render_to_response(template, {'values': values}, context_instance=...)
    # ...then somehow fetch everything and send the page again?
    values = Foo.objects.all()
    return render_to_response(template, {'values': values}, context_instance=...)
Or is this a job for ajax? (Reloading the data as soon as the page has loaded.)
Thanks!
Edit:
It turns out that I was mistaken about the cause of the long loading time: actually fetching 50-100 objects from the database barely causes a delay.
There was a method in my template that resulted in n^3 database hits for my n items, when I should have been calling it once in the view function, and passing the results to my template.
AJAX is your solution. Add the first 10 objects to your page, then fetch another 10 when the user scrolls down, and so on - like Twitter. Or use pagination? :)

Automatically rebuild cache

I run a Symfony 1.4 project with a very large amount of data. The main page and the category pages use pagers, which need to know how many rows are available. I'm passing a query containing joins to the pager, which leads to a loading time of 1 minute on these pages.
I configured cache.yml for the respective actions, but I think this workaround is insufficient. Here are my assumptions:
Symfony rebuilds the cache within a single request, which is made by a user. Let's call this user the "cache-victim" to simplify things.
In our case the data needs to be up to date - a lifetime of 10 minutes would be sufficient. Obviously the cache won't be rebuilt if no user is willing to be the "cache-victim" and simply cancels the request. Are these assumptions correct?
So, I came up with this idea:
Symfony should fake the HTTP request after rebuilding the cache. The new cache entries should be written to a temporary file/directory and swapped with the previous cache entries as soon as the rebuild has finished.
Is this possible?
In my opinion, this is similar to the concept of double buffering.
Wouldn't it be silly if there were a single "gpu-victim" in a multiplayer game who sees the screen building up line by line? (This is a lop-sided comparison, I know ... ;) )
Edit
There is no single "cache-victim" - every 10 minutes, the page load takes 1 minute for every user.
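(The closest workaround I can sketch is letting a scheduled task play the "cache-victim": a cron job requests the slow pages around the time the cache expires, so real users mostly hit a warm cache. A minimal sketch, with placeholder URLs:

<?php
// warm-cache.php - run from cron, e.g.: */10 * * * * php /path/to/warm-cache.php
$pages = array(
    'http://example.com/',           // main page (placeholder)
    'http://example.com/category/1', // a category page (placeholder)
);
foreach ($pages as $url) {
    file_get_contents($url); // response is discarded; the request rebuilds the cache
}

This doesn't give the atomic swap described above - a user who hits the page during the rebuild still pays - but in the common case the cron client is the only "cache-victim".)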
I think your problem is due to missing or wrong indexes. I have a sf1.4 project for a large soccer site (i.e. 2M pages/day) and the pagers aren't that slow, even though our database has more than 1M rows these days. Take a look at your query with EXPLAIN and check where it goes bad...
Sorry for necromancing (is there a badge for that?).
By configuring cache.yml you are only caching the view layer of your app (that is, the CSS, JS and HTML), and only for REQUESTS WITHOUT PARAMETERS. Navigating the pager obviously puts ?page=X on the GET request.
Taken from symfony 1.4 config.yml documentation:
An incoming request with GET parameters in the query string or submitted with the POST, PUT, or DELETE method will never be cached by symfony, regardless of the configuration.
http://www.symfony-project.org/reference/1_4/en/09-Cache
What might help you is caching the database results, but it's a painful process in symfony/doctrine. Refer to:
http://www.symfony-project.org/more-with-symfony/1_4/en/08-Advanced-Doctrine-Usage#chapter_08_using_doctrine_result_caching
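A minimal sketch of Doctrine result caching in symfony 1.4 (assumes the APC extension is available; the model and joins are placeholders - the point is useResultCache() on the slow, join-heavy query the pager runs):

<?php
// e.g. in ProjectConfiguration or an initializer: register a result-cache driver
$manager = Doctrine_Manager::getInstance();
$manager->setAttribute(Doctrine_Core::ATTR_RESULT_CACHE, new Doctrine_Cache_Apc());

// Then, on the expensive query:
$results = Doctrine_Query::create()
    ->from('Item i')              // placeholder model
    ->leftJoin('i.Category c')    // the joins that make the pager slow
    ->useResultCache(true, 600)   // serve the cached result set for 10 minutes
    ->execute();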
Edit:
This might help you as well:
http://www.zalas.eu/symfony-meets-apc-alternative-php-cache
