In my TYPO3 6.2.17 installation I use the tt_news extension 3.6.
My articles are located in a folder and are displayed in a default tt_news list element. Usually, when I save an article, I need to flush the frontend cache. So far so good. However, sometimes I need to publish news articles on a schedule, i.e. beginning from a certain date, which I do with the start value in the Access tab of the news article. The problem is that the news is not displayed on the required date until someone deletes the frontend cache after the set date.
What can I do so that the articles are displayed after the access start date without anyone having to delete the frontend cache manually?
Edit:
This problem cannot be solved with cron jobs alone, because it would be too difficult for the content editors to create a cron job for every single news article.
Disabling caching entirely on a given page isn't the best choice, especially if you have a large number of news items to render at once and/or a large number of visitors; in such cases even a relatively short cache lifetime is better than no caching at all. The easiest way is to shorten the cache period of the pages which display the list and single views by adding a condition like:
[globalVar = TSFE:id = 123|345]
config.cache_period = 60
[end]
(where 123 is the UID of your list page and 345 the UID of your single view page). Instead of using a condition you can also just create extension TypoScript templates on these pages.
Keep in mind that the cache period is counted from the time the cache entry was created, so it may happen that some posts need up to two periods to appear or disappear (the first period may already be almost over when the start date is reached). If it is absolutely important to you that items disappear right on time, just set the cache_period value to 29 seconds.
Finally, if the list/single pages contain elements that also require extensive rendering (like advanced TMENUs etc.), you can cache these additionally with the cache function; it prevents re-rendering the menu between the page's cache expirations, and you are still able to force clearing it from the BE with the yellow flash icon.
pseudo code:
lib.mainMenu = COA
lib.mainMenu {
  stdWrap.cache.key = lib_mainMenu_{page:uid}_{TSFE:sys_language_uid}
  stdWrap.cache.key.insertData = 1
  stdWrap.cache.lifetime = 3600

  10 = HMENU
  10 {
    # ... your menu code
  }
}
For clearing the cache through an external trigger see this question, which was asked yesterday: Refresh Typo3 by web server cron job
Alternatively you could exclude the page with the list plugin from caching. Check the behavior tab in the page properties.
Related
We are working on a site in AEM 6.1 which has news and events content; most pages show dynamic information on recent and related news/events based on tagging. We are using the dispatcher. Please suggest some caching techniques that could be implemented at the application level, apart from the dispatcher. Thanks.
The aim of implementing caching on the dispatcher is to reduce hits on your app server and serve as much as possible from the web server, in short improving the response time of your app server. But in some cases we can't cache too much on the web server, because the results on the app server change frequently.
On the app server we can implement the following solutions to get results quickly, on top of having the dispatcher in place.
Make sure the content hierarchy where you are ingesting news items has as few articles per node as possible. Divide your hierarchy based on the following structure: Year >> Month >> Day >> Hour (this level can be skipped if the content flow is low) >> news items.
With this structure in place, write path-based queries so that you don't have to traverse the whole content hierarchy.
There is the concept of a transient node in CQ: for each news item created in CQ, update the transient node with the newly created item. This means that for recent news you don't have to traverse the content structure; just refer to the transient node, which holds a reference to the newly created news item.
You could also write a cron job which runs in the background and takes care of collating views, such as the top recent news.
To complement Rupesh's answer: definitely use the dispatcher cache as much as you can, and for local caching strategies in AEM try using a Guava cache. It is a very good and easy-to-use tool, and there is a lot of information on how you can set it up and use it for your specific needs. Hope it helps.
I would suggest the following:
For recent news/events, write a scheduler (https://sling.apache.org/documentation/bundles/scheduler-service-commons-scheduler.html) that computes the list of recent news/events and writes it to a specific node as properties, for example:
/tmp/recent
news [/path/to/news1,/path/to/news2]
events [/path/to/event1,/path/to/event2]
Most recent always at the end of the array. Your code needs to limit it to the maximum number of recent items you want to keep.
Let's say you want to keep the last 5 changed pages and a 6th page is changed; then you just drop the oldest entry and push(new_page_path).
This could run once a day, or at whatever frequency you feel is best depending on your requirements.
If you need instant updates, you can additionally write a listener that fires when a page is changed/deleted and updates the recent list. In this case I would suggest putting the code that deals with updating the recent list into a service and using that service in both the scheduler and the listener.
The listener and scheduler need to run on both author and publisher, and on the publisher they should trigger dispatcher cache invalidation for /tmp/recent afterwards.
In order to render the recent list without having to invalidate whole pages, I would suggest you use SSI: have a component in your page that renders an SSI include of /tmp/recent.news.html or /tmp/recent.events.html, depending on whether you want to render recent news or events.
Give the node /tmp/recent a resourceType that handles the "news" and "events" selectors and implement that resourceType to render the content.
For the related content:
Use the TagManager (https://docs.adobe.com/docs/en/cq/5-6-1/javadoc/com/day/cq/tagging/TagManager.html) "find" method to look up all news/events that have the same tag as the current page. I assume your news and events pages have a dedicated template or resource type.
Also, I would suggest having a dedicated component that pulls in that content using an SSI include. Let's say your page has 2 tags, ns/tag1 and ns/tag2; then you could perform the SSI includes like this:
SSI include /etc/tags/ns/tag1.related_news.html
SSI include /etc/tags/ns/tag1.related_events.html
SSI include /etc/tags/ns/tag2.related_news.html
SSI include /etc/tags/ns/tag2.related_events.html
depending on what you want to include
Write a component under /apps/cq/tagging/components/tag (sling:resourceSuperType = /libs/cq/tagging/components/tag) that provides the rendering for the "related_news" and "related_events" selectors and lists all related pages.
The advantage of this approach is that the related list for each tag is shared across pages, and whenever a tag is changed/deleted the cache gets invalidated automatically.
In both cases (recent and related) configure the dispatcher to cache the output.
I have a site developed in CodeIgniter.
On the search page I have a form; when I fill it in, a request is sent to a server with cURL and an XML is returned to me.
This query and the rendering of the results take about 15 seconds, because I have to make several queries to many servers, and this time is unavoidable.
The problem is: I get a list of elements, and when I click on an element I make a query to retrieve the data of that element.
But if I click back, or click to go back to the full list of results, I don't want to make another query that takes 15 seconds.
When I search for the elements I have a GET request with a URL like this:
http://myurl/backend/hotel/hotel_list?nation=94&city=1007&check-in=12%2FApr%2F2013&check-out=13%2FApr%2F2013&n_single_rooms=1&n_double_rooms=0&n_triple_rooms=0&n_extra_beds=0
The page loads and I can have several elements. I click on one of them via a simple link like this:
http://myurl/backend/hotel/hotel_view?id_service=tra_0_YYW
From this page I need to be able to go back to the previous URL (the first one) without redoing the query that takes several seconds.
I can't cache the result permanently because it is a realtime database that changes every minute or second, but I thought about caching the search page when I enter it and, when I go back to it, reloading it from the cache if less than, for example, 2 minutes have passed.
Is this a good way, or is there a more performant way to do this in CodeIgniter?
I can't put it in the session because the data is large.
The other solutions are:
- cache the page (but I have to delete it every minute)
- cache the result (but I have to delete it every minute)
- create session flashdata (but I have a large amount of data)
Is there a way to make the browser not regenerate the page when I go back?
Thanks
cache the page (but I have to delete it every minute)
I think you can easily implement this with CodeIgniter's page caching function $this->output->cache(n); where n is the cache lifetime in minutes.
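As a rough sketch (the model and view names here are placeholders, only $this->output->cache() is the actual CodeIgniter call), the list action could look like this:

class Hotel extends CI_Controller
{
    public function hotel_list()
    {
        // Serve this page from CodeIgniter's file cache for the next 2 minutes;
        // the slow cURL/XML lookup only runs again on a cache miss.
        $this->output->cache(2);

        $results = $this->hotel_model->search($this->input->get()); // placeholder model
        $this->load->view('hotel_list', array('results' => $results));
    }
}

Note that, depending on your CodeIgniter version, you may need to enable the cache_query_string option so that searches with different GET parameters get separate cache files.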
cache the result (but I have to delete it every minute)
You will have to use CodeIgniter's object caching (the Cache driver) to implement this.
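A minimal sketch with the Cache driver (assuming the file adapter; hotel_model->search() again stands in for your cURL/XML request):

$this->load->driver('cache', array('adapter' => 'file'));

$cache_key = 'hotel_search_' . md5(http_build_query($this->input->get()));

if (($results = $this->cache->get($cache_key)) === FALSE) {
    // Cache miss: run the expensive cURL/XML request once...
    $results = $this->hotel_model->search($this->input->get());
    // ...and keep the parsed result for 120 seconds, so the "back"
    // navigation can be served without hitting the remote servers again.
    $this->cache->save($cache_key, $results, 120);
}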
create session flashdata (but I have a large amount of data)
It's not a good idea to save huge data in the session. Rather, use database sessions instead, which let you handle it in a similar way, and CodeIgniter has integrated support for them.
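In CodeIgniter 2.x this is essentially one config switch plus the ci_sessions table from the documentation:

// application/config/config.php
$config['sess_use_database'] = TRUE;
$config['sess_table_name']   = 'ci_sessions';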
Hope this helps. You can read more about all kinds of CodeIgniter caching if you are just starting with it.
My team and I are rapidly launching new stores and views on Magento Enterprise Edition, but we're running into an issue with caching. To be clear, the caching itself works great. We have several complex products that take about 17 seconds to build, but after the cache is populated the pages load in 300ms, which is awesome! Unfortunately, if we clear the cache under any serious load (high traffic) we seem to experience a cache-miss storm, where every page request tries to populate the cache, causing our web head to stall out with load averages above 50.
Do you have any suggestions for avoiding this? Are there documented best practices for pre-warming a cache for new code deployments or even just content and configuration changes?
This could be related, so I'll include it: After clicking the button to refresh the cache and before the refreshing process is complete most pages on the front end die with 500 error codes and seemingly random error messages. Any idea what might cause that?
I implemented a solution to warm up a cache after a CMS block is saved. You may take inspiration from this solution to do the same for different cases (product save, CMS block, CMS page, category save, etc.).
This piece of code can be triggered after a CMS block save by registering an observer for the cms_block_save_after event:
/**
 * Clean the targeted cache block and warm it up again if content is provided
 */
public function clearBlockHtmlCache(Varien_Event_Observer $observer)
{
    $block = $observer->getEvent()->getObject();
    $id = $block->getCacheKey();

    // Remove only this specific cache entry
    Mage::app()->getCacheInstance()->getFrontend()->remove(strtoupper($id));

    // The output is not printed; rendering the block is enough to warm up
    // the cache again, with all filters processed
    $block->toHtml();
}
I run a Symfony 1.4 project with a very large amount of data. The main page and the category pages use pagers which need to know how many rows are available. I'm passing a query containing joins to the pager, which leads to a loading time of 1 minute on these pages.
I configured cache.yml for the respective actions. But I think the workaround is insufficient and here are my assumptions:
Symfony rebuilds the cache within a single request made by a user. Let's call this user the "cache-victim" to simplify things.
In our case, the data needs to be up to date - a lifetime of 10 minutes would be sufficient. Obviously, the cache won't be rebuilt if no user is willing to be the "cache-victim" and simply cancels the request. Are these assumptions correct?
So, I came up with this idea:
Symfony should fake the HTTP request after rebuilding the cache. The new cache entries should be written to a temporary file/directory and swapped with the previous cache entries as soon as cache rebuilding has finished.
Is this possible?
In my opinion, this is similar to the concept of double buffering.
Wouldn't it be silly if there were a single "gpu-victim" in a multiplayer game who sees the screen building up line by line? (This is a lop-sided comparison, I know ... ;) )
Edit
There is no "cache-victim" - Every 10 minutes page reloading takes 1 minute for every user.
I think your problem is due to some missing or wrong indexes. I have a sf1.4 project for a large soccer site (i.e. 2M pages/day) and the pagers aren't that slow, even though our database has more than 1M rows these days. Take a look at your query with EXPLAIN and check where it goes bad...
Sorry for necromancing (is there a badge for that?).
By configuring cache.yml you are just caching the view layer of your app (that is, CSS, JS and HTML) for REQUESTS WITHOUT PARAMETERS. Navigating the pager obviously puts a ?page=X on the GET request.
Taken from symfony 1.4 config.yml documentation:
An incoming request with GET parameters in the query string or submitted with the POST, PUT, or DELETE method will never be cached by symfony, regardless of the configuration. http://www.symfony-project.org/reference/1_4/en/09-Cache
What might help you is caching the database results, but it's a painful process with symfony/Doctrine. Refer to:
http://www.symfony-project.org/more-with-symfony/1_4/en/08-Advanced-Doctrine-Usage#chapter_08_using_doctrine_result_caching
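For reference, with Doctrine 1.2 in symfony 1.4 it roughly boils down to this (the model names are placeholders; the attribute and the useResultCache() call are the ones described in the linked chapter):

// e.g. in ProjectConfiguration::configureDoctrine()
$manager->setAttribute(Doctrine_Core::ATTR_RESULT_CACHE, new Doctrine_Cache_Apc());

// ...and on the expensive query you pass to the pager:
$q = Doctrine_Query::create()
    ->from('Article a')          // placeholder model
    ->leftJoin('a.Category c')   // placeholder join
    ->useResultCache(true, 600); // cache the result set for 10 minutes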
Edit:
This might help you as well:
http://www.zalas.eu/symfony-meets-apc-alternative-php-cache
Let's imagine that we have a blog with category A. Category A currently has 1000 posts spread over 100 pages. All pages are cached in files (for example, by the Smarty template engine). I'm adding a post and want it to be displayed on the first page immediately. So I have to clear or invalidate the cache for all 100 pages of category A.
Deleting the cached pages is not a good idea, because there can be too many files (for example, thousands of pages). I think that invalidating the cache and regenerating each page on request is a much more efficient way.
My only idea is to add the number of posts in the category to the cache id. So, first we get the number of posts in the category (for example, from memcache) and then check whether the cached version is valid for that number.
Everything looks fine and simple. But let's imagine the situation where I add a new post and then, 1 minute later, remove another (older) post. The number of posts is still 1000, and some category pages will stay stale (if they were not viewed during that minute).
What is the solution?
PS: Sorry for my English, but I think my question will be clear to people who have already faced this problem.
Thank you
The number of posts is not a good solution, because when you edit a post you would want to refresh the cache as well.
A couple of strategies I can think of:
Use the time when a change was made as a reference.
When a post is added (removed, edited), store the current timestamp in the category; let's call it cache_threshold. When a page is requested, check when that page was cached. If it is older than our threshold, the page needs to be regenerated.
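A sketch of this with memcache for the threshold (the key names, $pageNumber and regenerate_category_page() are made up for the example; only the Memcached calls are real):

$memcache = new Memcached();
$memcache->addServer('127.0.0.1', 11211);

// On every add / remove / edit of a post in category A:
$memcache->set('category_A_cache_threshold', time());

// On a page request, before serving the cached page:
$threshold = (int) $memcache->get('category_A_cache_threshold');
$cacheFile = "cache/category_A_page_{$pageNumber}.html";
$cachedAt  = file_exists($cacheFile) ? filemtime($cacheFile) : 0;

if ($cachedAt < $threshold) {
    // The page was cached before the last change - regenerate and re-cache it.
    regenerate_category_page('A', $pageNumber); // placeholder
}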
Switch to object caching rather than page caching.
Instead of caching whole pages, you can cache each individual post. When a post is added (removed, edited), you just immediately regenerate its cache, as that is not time consuming. In order to display a page you then only need to grab the required number of cached posts and display them.
This solution requires more work, but it is more flexible and effective.
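A sketch of the per-post approach, again with Memcached (render_post() and fetch_post_ids_for_page() are placeholders for your own data access and Smarty rendering):

$memcache = new Memcached();
$memcache->addServer('127.0.0.1', 11211);

function rendered_post(Memcached $memcache, $postId)
{
    $html = $memcache->get("post_html_{$postId}");
    if ($html === false) {
        // Cache miss: render the single post and store the fragment.
        $html = render_post($postId); // placeholder
        $memcache->set("post_html_{$postId}", $html);
    }
    return $html;
}

// Building a category page is now just concatenating cached fragments:
$page = '';
foreach (fetch_post_ids_for_page('A', $pageNumber) as $postId) { // placeholder
    $page .= rendered_post($memcache, $postId);
}

// When a post is added or edited, overwrite its fragment immediately:
// $memcache->set("post_html_{$postId}", render_post($postId));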