Not sure if anyone is still using legacy Symfony 1.4, but I love it!
Background:
I have a user dashboard available at app.com/home/dashboard
To optimize DB hits, I cached the template. Since the URL doesn't have a user parameter, one user ended up seeing another user's data.
To fix this, I wrote a filter to include the user id in the URL, giving each user their own URL, like app.com/home/18/dashboard
While this prevents the data leak and keeps the benefit of caching, it messes up reporting in GA, since I cannot (directly) track the total visits to the Dashboard.
Has anyone worked around this?
I've had the same problem. To solve it I've decided to split the whole page into partials and components and cache those instead.
To make partials/components cached per user, just pass 'user_id' => $sf_user->getId() along with the other parameters. The user_id value will then be used as part of the cache entry key.
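For illustration, a minimal sketch of what that looks like in a Symfony 1.4 template (assuming the dashboard partial lives at home/_dashboard.php and caching is enabled for it in cache.yml):

<?php
// Passing user_id as a partial parameter makes it part of the cache entry
// key, so each user gets their own cached copy while the page URL stays
// user-agnostic.
include_partial('home/dashboard', array(
    'user_id' => $sf_user->getId(), // varies the cache entry per user
));
?>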
I'm using CakePHP with Backbone.js. I set up a controller just to give me JSON output for my data, e.g. client names etc., to pass into each Backbone model.
This was all working, or appeared to be; however, it now gives me some random 403 errors when the page / form is saved or reloaded, and I have no idea why. If it can access the data to start with, and does, then why would it not have access after a save or reload?
I have tried $this->Auth->allow and it does appear to fix the problem, but this data is (or could be) important, and I need it not to be accessible by everybody who might guess my access path.
Now, I have read a number of articles on here; most point to read/write access on the files you're accessing, but in my case it's just a path, /XXXX/XXXXX/myjson/clients for example.
I can post my code if needed, but I am not sure what the problem is: is this a CakePHP issue, or is Backbone not requesting the data correctly?
Please be aware that I am dyslexic, so please be kind if I have not explained myself well, and please give me some time to re-word / edit my post.
Thanks,
For anyone else looking at this: I had added autoRegenerate to the Session settings in Configure::write. For some reason it looks like CakePHP was taking too long to regenerate a new cookie while my information was being requested at the same time.
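For reference, a minimal sketch of the setting in question (assuming CakePHP 2.x, in app/Config/core.php):

<?php
// With autoRegenerate enabled, CakePHP periodically regenerates the session
// id. In my case that regeneration raced with the JSON request, so the
// request carried a stale session id and was rejected with a 403.
Configure::write('Session', array(
    'defaults'       => 'php',
    'autoRegenerate' => true, // the option that caused the intermittent 403s
));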
Will a single GSA search request that specifies two site collections get loaded into the suggestions database for both sites?
For example consider the following query,
http://gsa/search?q=hello&site=site1|site2&client=myfrontend
If results are returned from both collections, will the query "hello" get loaded into the suggestion database for both:
1. client=myfrontend&site=site1
2. client=myfrontend&site=site2
So far I have not seen the query get loaded into either suggestion database, but perhaps I have not waited long enough, which makes me wonder: is there any way to quickly reset/reload suggestions? In the past I have tried resetting them via the admin console's suggestions page, and then enabling/disabling suggestions via the frontend, but that does not always regenerate them. Is there any other procedure people follow to reload suggestions?
I contacted Google support: if multiple collections are defined in the search request, the suggestion gets logged to the default_collection.
They are working on a resolution :)
I've read all over the place and I'm trying to figure out if I am understanding the way caching happens in Drupal 6. We have a site with a real-time stock ticker in it. We have Drupal caching enabled, so the stock price ends up getting cached and frozen at a specific spot.
I figured one way to handle it would be to put the ticker in a block I make in a custom module and set BLOCK_NO_CACHE. But if I'm understanding this correctly, if you have site caching enabled, then the ENTIRE page gets cached, including any and all blocks on it, regardless of their individual cache settings. Is this correct?
So am I unable to take advantage of site caching if I have certain spots that should not be cached? Does anyone know of another solution that would give me the best of both worlds: site caching, but also a real-time stock ticker? By the way, the ticker makes a JSON request to the Yahoo Finance API to get the quote.
You are correct: BLOCK_NO_CACHE applies only at the block level. When page caching is enabled, Drupal caches the entire page (which includes the block as well), but only for anonymous users. Drupal's philosophy is that content for anonymous users is always the same, so they are served the cached page. This does not apply to authenticated users, since different users might have different access to certain parts of the page (e.g. the links block looks different for an admin than for a regular user).
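For reference, here's roughly what the block-level setting looks like in a Drupal 6 custom module (the module name stockticker is hypothetical); as described above, it only disables the block cache, not the full page cache served to anonymous users:

<?php
/**
 * Implementation of hook_block() for a hypothetical stockticker module.
 */
function stockticker_block($op = 'list', $delta = 0, $edit = array()) {
  switch ($op) {
    case 'list':
      return array(
        0 => array(
          'info'  => t('Real-time stock ticker'),
          'cache' => BLOCK_NO_CACHE, // block-level only; page cache still wins
        ),
      );
    case 'view':
      return array(
        'subject' => t('Stock ticker'),
        // Placeholder container; the quote is fetched client-side via JSON.
        'content' => '<div id="stock-ticker"></div>',
      );
  }
}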
You might want to have a look at this discussion: BLOCK_NO_CACHE not working for anonymous users
And there is a solution, which you'll stumble upon in that discussion: the Ajax Blocks module. An extract from the module description:
Permits to load some blocks by additional AJAX request after loading
the whole cached page when the page is viewed by anonymous user. It is
suitable for sites which are mostly static, and the page caching for
anonymous users is a great benefit, but there are some pieces of
information that have to be dynamic.
So I have a framework we've built on CodeIgniter. It uses regular CodeIgniter sessions by default, which allow up to 4KB of encrypted storage in a cookie.
It's for general apps that require a registration process, which can vary in size, as questions are generated dynamically through an admin panel. The registration process relies on session data as it redirects throughout the process.
I have used db_sessions in the past, when I knew this would be an issue for the framework; however, I'm now considering having the registration process always use db_session while the rest of the site uses the 4KB cookie session.
Is this possible? It seems like it could be a really bad idea, but I don't really want to rework the dynamic registration process, or use db_session for the whole site, as that will eventually make the site run very slowly if too many users are online at once.
So I'm thinking I can just set the variable in the config to be true only when the registration controller is loaded (by checking the URL via $_SERVER, or the URI helper if I can load it in the config, which I'm guessing I can't).
Does this seem plausible?
It seems like it could be a really bad idea
You answered your own question :) You'll have issues when the user switches from one page to another. What happens if they open multiple windows, press the 'back' button, etc.? You'll need to switch the cookie over when they start registration and switch it back at the end. It will be very, very messy for basically no gain.
but I don't really want to rework the dynamic registration process or
really use db_session for whole site as it will eventually make the
site run very slow if too many users are online at once.
The reality is: your website has to be huge to have ANY real performance issues from using a DB for your sessions. And if you are not using the DB, then you are relying on the cookie stored on the user's computer. Depending on your site, this means they might have the ability to edit that cookie and change "admin = true" or something.
Just use the DB session - I think you are overcomplicating the situation.
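For what it's worth, switching the whole site over is a small change; a minimal sketch, assuming CodeIgniter 2.x, in application/config/config.php:

<?php
// With sess_use_database enabled, session data lives in a DB table instead
// of the 4KB encrypted cookie, so size limits and client-side tampering
// stop being concerns.
$config['sess_use_database'] = TRUE;
$config['sess_table_name']   = 'ci_sessions'; // create this table per the docs
$config['sess_expiration']   = 7200;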
We have a simple CCTV system in our office that shows a live image from each of our security cameras. The CCTV system doesn't have an API or any method of extracting the live images. You can however view the image from another browser by creating a basic HTML page with the image link:
http://192.168.1.6/media/getimage_sid.php?sid=a09c4ecb72bade3802e7bf563b0d0bd6&card=1&camera=1&width=384&height=288
This works perfectly, until the session expires and/or times out. I don't know very much about cookies and sessions, but when I inspected the page in Google Chrome I noticed the following cookie:
Name:    PHPSESSID
Value:   a09c4ecb72bade3802e7bf563b0d0bd6
Domain:  192.168.1.6
Path:    /
Expires: Session
Size:    41
There are also HTTP and Secure columns, but both are empty.
What I am trying to figure out is: how do I keep that cookie alive, or trigger it to be recreated with the same value? I'm assuming that a rake task to log in to the system wouldn't work, because that would reset the session ID every time.
The intranet is a Rails application, so one way would be to create a script that logs in, stores the current session ID in the database, and then puts the last recorded session ID into the IMG links. It's a bit of a long way around, though; I'm hoping for a better solution.
I have read a few articles showing how to do this with AJAX but that would appear to rely on the intranet being viewed all the time. I need this to work if no-one has viewed the intranet for the weekend.
This project is so we can put a couple of live (when the page refreshes!) images on our intranet so we don't have to continuously go to the CCTV system, log in and find the right camera just to see who is at the garage door etc.
Any help would be appreciated.
It's a bit of a hack, but I've made a small script to pull in the latest session ID and then put it into the image links.
A random different approach: does the following URL get the right image, without having to worry about the session id?
http://192.168.1.6/media/getimage_sid.php?card=1&camera=1&width=384&height=288
The session ID used in the cookie seems to be the PHP-generated one.
I don't think the session ID should become stale if you 'notify' the server that you're still online. [1] You should try to specify the Cookie: header in your HTTP requests. Specifying the SID via the URL is probably not enough to indicate to the server that you're actually using it. [2]
If your web pages are fetching the images directly (i.e. you have an <img src="http://192.168.1.6/..."> in the HTML page), it might work like this:
make an AJAX request (XMLHttpRequest) to a URL which returns a session ID.
any subsequent request to the server on that page should automatically include the session in the headers. [3]
Otherwise, if you can't specify a Cookie: header, you may choose to lengthen the time before a session becomes stale. If you have access to the computer hosting the PHP interface (192.168.1.6), you can configure PHP to do so (via the php.ini configuration file, I believe). Information about session configuration is available here, and specifically the session.gc_maxlifetime option seems useful:
session.gc_maxlifetime specifies the number of seconds after which data will be seen as 'garbage' and potentially cleaned up. Garbage collection may occur during session start (depending on session.gc_probability and session.gc_divisor).
Alternatively, if none of the above appeal to you, your solution of fetching (GET) a page to obtain a valid, fresh session ID seems logical and good. You could optimize this by measuring how long it takes before session IDs become stale, and fetching new session IDs only at that interval.
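A minimal sketch of that fetch-a-fresh-session-ID approach in PHP with cURL; the login URL and POST fields are assumptions, so adjust them to whatever the CCTV system's login form actually submits:

<?php
// Log in to obtain a fresh session id (hypothetical login page and fields).
$ch = curl_init('http://192.168.1.6/login.php');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HEADER         => true, // we need the Set-Cookie response header
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => 'username=intranet&password=secret', // assumed
));
$response = curl_exec($ch);
curl_close($ch);

// Pull the fresh PHPSESSID out of the Set-Cookie header.
if (preg_match('/PHPSESSID=([0-9a-f]+)/', $response, $m)) {
    $sid = $m[1];
    $img = "http://192.168.1.6/media/getimage_sid.php?sid=$sid"
         . "&card=1&camera=1&width=384&height=288";

    // Send the session cookie as well as the sid= parameter, so the server
    // sees the session as still in use.
    $ch = curl_init($img);
    curl_setopt($ch, CURLOPT_COOKIE, "PHPSESSID=$sid");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    file_put_contents('camera1.jpg', curl_exec($ch));
    curl_close($ch);
}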
[1] I looked for a valid reference for this but couldn't find one.
[2] Specifically, PHP uses a PHPSESSID= token in the URL, whereas in your example it looks like sid=. It is also generally considered bad practice security-wise, I believe (this article explains how it might be used for XSS), since you're exposing user information in the URL, though I think this has little to no effect in this case.
[3] According to the XMLHttpRequest spec for the send() method:
If the user agent supports HTTP State Management it should persist, discard and send cookies