Unable to clear cookies using browser.cookies.clear - ruby

I am unable to clear cookies using watir-webdriver and browser.cookies.clear.
Are there any other alternatives?

This is, as far as I know, a browser-based limitation, due to security and privacy concerns.
WebDriver interacts with the browser via JavaScript, and JavaScript is not allowed to clear all cookies (think how nasty that could be on a malicious site). Outside of a testing environment, most JS that executes came from the server of the site you are accessing, so the most it is allowed to do is clear the cookies for the current domain, i.e. the cookies belonging to the current URL. If a web page wants to clear all its own cookies, that is OK, but it is not allowed to clear cookies belonging to other domains.
So if you want all your tests to start with fresh cookies, you will need something in the 'Before' section of env.rb that goes to the root of the site in question and then clears the cookies, as sketched below.
BTW, the same limitation applies to setting cookies: if you want to create cookies for a specific site, you need to navigate to that site before trying to create them, or nothing gets created.
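A minimal Cucumber hook along those lines (a sketch; it assumes env.rb exposes the Watir browser instance as $browser, and www.example.com stands in for your site):

Before do
  # Cookies can only be cleared for the domain the browser is currently on,
  # so visit the site root first
  $browser.goto 'http://www.example.com/'
  $browser.cookies.clear
end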

This is not an answer, just FYI.
Suppose Chrome is the chosen browser. While Cucumber is running, the output of ps -ef | grep chrome will show the customized user data directory, which looks something like this:
--user-data-dir=/tmp/.org.chromium.Chromium.G2DgPo
Inside that directory, the Cookies file is stored under the Default folder.
I'm not sure whether directly deleting that database file would fulfill the need, because in a normal browser session such runtime data is stored at
~/.config/google-chrome/Default/Cookies
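If you want to experiment with deleting it, a rough Ruby sketch (untested; assumes a single Chrome instance was launched by the test run):

# Pull the temporary user data directory out of the chrome process's arguments
user_data_dir = `ps -ef | grep chrome`[/--user-data-dir=(\S+)/, 1]
if user_data_dir
  cookies_file = File.join(user_data_dir, 'Default', 'Cookies')
  File.delete(cookies_file) if File.exist?(cookies_file)
end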

Related

Use both regular (4kb) session and db session in one CodeIgniter app?

So I have a framework we've built on CodeIgniter. It uses regular CodeIgniter sessions by default, which allows up to 4kb of encrypted storage in a cookie.
It's for general apps that require a registration process, which can vary in size, as questions are generated dynamically through an admin panel. The registration process relies on session data as it redirects throughout the process.
I have used db_sessions in the past when I knew this would be an issue on the framework. However, I'm now considering the possibility of always having the registration process use db_session and the rest of the site use the 4kb cookie session.
Is this possible? It seems like it could be a really bad idea, but I don't really want to rework the dynamic registration process, or use db_session for the whole site, as it will eventually make the site run very slowly if too many users are online at once.
So I'm thinking I can just set the variable in the config to be true only when the registration controller is loaded (by checking the URL via $_SERVER, or via the URI helper if I can load it in the config, which I'm guessing I can't).
Does this seem plausible?
It seems like it could be a really bad idea
You answered your own question :) You'll have issues when the user switches from one page to another. What happens if they open multiple windows, press the 'back' button, etc.? You'd need to switch the cookie over when they start registration and switch it back at the end. It would be very, very messy for basically no gain.
but I don't really want to rework the dynamic registration process or
really use db_session for whole site as it will eventually make the
site run very slow if too many users are online at once.
The reality is: your website has to be huge to have ANY real performance issues from using a DB for your sessions. And if you are not using the DB, then you are relying on the cookie stored on the user's computer. Depending on your site, this means they might have the ability to edit that cookie and change "admin = true" or something.
Just use the DB session - I think you are overcomplicating the situation.

Prevent a session from expiring?

We have a simple CCTV system in our office that shows a live image from each of our security cameras. The CCTV system doesn't have an API or any method of extracting the live images. You can, however, view the image from another browser by creating a basic HTML page with the image link:
http://192.168.1.6/media/getimage_sid.php?sid=a09c4ecb72bade3802e7bf563b0d0bd6&card=1&camera=1&width=384&height=288
This works perfectly until the session expires and/or times out. I don't know very much about cookies and sessions, but when I inspected the page in Google Chrome I noticed the following cookie:
Name:    PHPSESSID
Value:   a09c4ecb72bade3802e7bf563b0d0bd6
Domain:  192.168.1.6
Path:    /
Expires: Session
Size:    41
There are also HTTP and Secure columns, but both are empty.
What I am trying to figure out is: how do I keep that cookie alive, or trigger it to be recreated with the same value? I'm assuming that a rake task to log in to the system wouldn't work, because that would reset the session ID every time.
The intranet is a Rails application, so one way would be to create a script that logs in, stores the current session ID to the database, and then puts the last recorded session ID into the IMG links from the database. It's a bit of a long way around, though; I'm hoping for a better solution.
I have read a few articles showing how to do this with AJAX, but that appears to rely on the intranet being viewed all the time. I need this to work even if no one has viewed the intranet for the weekend.
This project is so we can put a couple of live (when the page refreshes!) images on our intranet, so we don't have to continuously go to the CCTV system, log in and find the right camera, just to see who is at the garage door etc.
Any help would be appreciated.
It's a bit of a hack, but I've made a small script to pull in the latest session ID and then put it into the image links.
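Roughly, the script does something like this (a sketch; the login path and form fields are hypothetical, so check what the CCTV login form actually posts):

require 'net/http'

# Hypothetical login endpoint and credentials
response = Net::HTTP.post_form(URI('http://192.168.1.6/login.php'),
                               'username' => 'admin', 'password' => 'secret')

# Extract the fresh PHPSESSID from the Set-Cookie header
sid = response['Set-Cookie'][/PHPSESSID=(\w+)/, 1]

image_url = "http://192.168.1.6/media/getimage_sid.php?sid=#{sid}&card=1&camera=1&width=384&height=288"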
A random different approach: does the following URL get the right image, without having to worry about the session id?
http://192.168.1.6/media/getimage_sid.php?card=1&camera=1&width=384&height=288
The session ID used in the cookie seems to be the PHP-generated one.
I don't think the session ID should become stale if you 'notify' the server that you're still online.[1] You should try to specify the Cookie: header in your HTTP request headers. Specifying the SID via the URL is probably not enough to indicate to the server that you're actually using it.[2]
If your web pages are fetching the images directly (i.e. you have an <img src="http://192.168.1.6/..."> in the HTML page) you might work like this:
make an AJAX request (XMLHttpRequest) to a URL which returns a session ID;
any subsequent request to the server on that page should automatically include the session in the headers.[3]
Otherwise, if you can't specify a Cookie: header, you may choose to lengthen the time before a session becomes stale. If you have access to the computer hosting the PHP interface (192.168.1.6), you can configure PHP to do so (via the php.ini configuration file, I believe). Information about session configuration is available here, and the session.gc_maxlifetime option specifically seems useful:
session.gc_maxlifetime specifies the number of seconds after which data will be seen as 'garbage' and potentially cleaned up. Garbage collection may occur during session start (depending on session.gc_probability and session.gc_divisor).
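For example, in php.ini (86400 seconds = 24 hours; the value here is illustrative):

session.gc_maxlifetime = 86400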
Alternatively, if none of the above appeal to you, your solution of fetching (GET) a page to obtain a valid, fresh session ID seems logical and good. You could optimize it by measuring how long it takes before session IDs become stale, and fetching new session IDs only at that interval.
[1] I looked for a valid reference for this but couldn't find one.
[2] Specifically, PHP uses a PHPSESSID= token in the URL, whereas in your example it looks like sid=. Passing the session ID in the URL is also generally considered bad practice security-wise, I believe (this article explains how it might be used for XSS), since you're exposing user information in the URL, though I think that has little to no effect in this case.
[3] According to the XMLHttpRequest spec for the send() method:
If the user agent supports HTTP State Management it should persist, discard and send cookies

Copy cookies to another domain. Firefox? Chromium?

Is there any possibility of copying a set of cookies from one domain to another? I badly need this for web development.
You cannot just copy a set of cookies, but you can write your own PHP/Python code to set several cookies for another domain, using values from the old set.
No, the same origin policy prohibits sites from setting or reading cookies on behalf of other sites outside of a few special cases. Some browser extensions will allow you to copy and paste cookies to sync them manually, though.
From your comments it sounds like you want to use the same cookie for your development and your production system. How about using something like 'local.example.com' instead of 'localhost' as your development domain and setting wildcard cookies for all subdomains?
We use this pattern so we don't need to register multiple API keys for web services, since most of them have wildcard support for subdomains.
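Setting such a wildcard cookie from a Rails controller might look like this (a sketch; '.example.com' and the cookie name are placeholders):

# The leading dot makes the cookie visible to local.example.com, www.example.com, etc.
cookies[:auth_token] = { :value => 'abc123', :domain => '.example.com' }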
I'm not sure I would recommend something like that just so you can use the same cookies in development and production, because it has other implications as well. For example, if you serve static assets from another subdomain, the browser will send cookie headers unnecessarily, and there may be more details that make debugging harder rather than easier this way.
If you could explain the problem at hand in a bit more detail there might be other solutions or best practices for staging and production environments that can help you.
You can do this manually, using Greasemonkey:
Go to Tools -> Page Info.
Select the Security tab.
Click the View Cookies button.
Type the domain you wish to read from.
Make a note of all the cookies and their content.
Now go to the domain you wish to copy to.
Install the Greasemonkey add-on for Firefox (or better yet, use Cookies Manager+ :: Add-ons for Firefox!).
Run some JavaScript code to re-create the noted cookies and their values.

Clearing session in Firefox for every request made (Watir issue)

I'm developing a screen-scraping robot that uses Watir (Ruby) to crawl specific web searches.
Watir is used because the search results are delivered in pages, only available via AJAX requests.
My issue now is that to perform a new search, the browser has to be shut down in order for the search session to be cleared; otherwise the initial search overrules the change in the GET parameters.
Is it somehow possible to force Firefox to clear the session on every request made?
Additionally, does anyone have experience solving this kind of issue with Watir?
If the session is maintained via cookies in your Firefox browser, then it's possible.
All you have to do is remove the cookies from your Firefox cookie store before the browser starts.
Firefox stores its cookies at (on my Ubuntu and Mac machines):
~/.mozilla/firefox/12wwonrk.default/cookies.sqlite [Ubuntu]
or
~/Library/Application Support/Firefox/Profiles/eox4ghka.default/cookies.sqlite [Mac]
(prior to Firefox 3 it was cookies.txt instead of SQLite)
If you can truncate the SQLite database (or the txt file), the cookies will no longer be there.
Since you are running Watir, you are most probably using Ruby. So if you can run these commands through system or %x[] (or equivalent commands through the sqlite3 gem) before the Watir::Browser.new statement, you should be done:
sqlite3 path/to/cookies.sqlite
DELETE FROM moz_cookies;
.quit
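Or, using the sqlite3 gem instead of shelling out (a sketch; the profile directory name varies per machine):

require 'sqlite3'
require 'watir-webdriver'

# Wipe all stored cookies before the browser starts
db = SQLite3::Database.new(File.expand_path('~/.mozilla/firefox/12wwonrk.default/cookies.sqlite'))
db.execute('DELETE FROM moz_cookies')
db.close

browser = Watir::Browser.new :firefox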
If you want to use Watir, you can mess with profiles as described at http://watirwebdriver.com/. Most browsers seem to get their own profile for each new instance by default.
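For example, handing Firefox a brand-new, empty profile for each instance with watir-webdriver:

require 'watir-webdriver'

# A fresh profile starts with no cookies or saved sessions
profile = Selenium::WebDriver::Firefox::Profile.new
browser = Watir::Browser.new :firefox, :profile => profile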

Clear all website cache?

Is it possible to clear all site cache? I would like to do this when the user logs out or the session expires, instead of instructing the browser not to cache on each request.
As far as I know, there is no way to instruct the browser to clear all the pages it has cached for your site. The only control that you, as a website author, have over caching of a page occurs when the browser tries to access that page. You can specify that cached versions of your pages should expire at a certain time using the Expires header, but even then the browser won't actually clear the page from its cache at that time.
I certainly hope not; that would give the web site destructive powers over the client machine!
If security is your main concern here, why not use HTTPS? Browsers don't cache content received via HTTPS (or cache it only in memory).
One tricky way to mimic this would be to include the session ID as a parameter when referencing any static piece of content on the site. When the user establishes the session, the browser will see all the pieces of content as new due to the inclusion of this parameter. For the duration of the session the browser will use the static content in its cache. After the user logs out and logs back in, the session ID parameter for the static content will be different, so the browser will treat it as completely new content and download everything again.
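In Rails, for instance, a hypothetical helper along these lines would do it (whether session.id is available depends on your session store):

# Append the session id so each new session sees 'new' URLs for the same assets
def session_busted_path(path)
  "#{path}?#{session.id}"
end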
That being said, this is a hack and I wouldn't recommend pursuing it. Why do you want the user's cache to be cleared after their session expires? There's probably a better solution that fits your situation, as opposed to what you are currently asking for.
If you are talking about ASP.NET cache objects, you can use this:
For Each elem As DictionaryEntry In Cache
    Cache.Remove(CStr(elem.Key))
Next
to remove items from the cache, but that may not be the full extent of what you are trying to accomplish.
