Is there any possibility of copying a set of cookies from one domain to another? I badly need this for web development.
You cannot just copy a set of cookies, but you can write your own PHP/Python code to set the cookies for the other domain using the values from the old set.
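As a rough sketch of that idea (shown here in TypeScript with Express rather than PHP/Python; the route, cookie names, and values are made-up placeholders you would swap for the ones exported from the old domain):

    import express from "express";

    const app = express();

    // Values you noted down / exported from the OLD domain (placeholder data).
    const oldCookies: Record<string, string> = {
      session_token: "abc123",
      theme: "dark",
    };

    // Visiting this route on the NEW domain re-creates those cookies there.
    app.get("/import-cookies", (req, res) => {
      for (const [name, value] of Object.entries(oldCookies)) {
        res.cookie(name, value, { path: "/" });
      }
      res.send("Cookies set for this domain.");
    });

    app.listen(3000);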
No, the same-origin policy prohibits sites from setting or reading cookies on behalf of other sites outside of a few special cases. Some browser extensions will allow you to copy and paste cookies to sync them manually, though.
From your comments it sounds like you want to use the same cookie for your development and your production system. How about using something like 'local.example.com' instead of 'localhost' as your development domain and setting wildcard cookies for all subdomains?
We use this pattern so we don't need to register multiple API keys for web services, since most of them have wildcard support for subdomains.
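In case it helps, the key piece is just the Domain attribute on the cookie; a response header along these lines (the domain name is an example) is sent back to the browser once and then accompanies requests to every subdomain, including local.example.com:

    Set-Cookie: session_token=abc123; Domain=.example.com; Path=/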
I'm not sure I would recommend doing that just so you can use the same cookies in development and production, because it has other implications as well. For example, if you serve static assets from another subdomain, the browser will send the cookie headers with those requests unnecessarily, and there may be other details that make debugging harder rather than easier this way.
If you could explain the problem at hand in a bit more detail, there might be other solutions or best practices for staging and production environments that could help you.
You can do this manually, using Greasemonkey:
1. Go to Tools -> Page Info.
2. Select the Security tab.
3. Click the "View Cookies" button.
4. Type the domain you wish to read from.
5. Make a note of all the cookies and their contents.
6. Now go to the domain you wish to copy to.
7. Install the Greasemonkey add-on for Firefox (or, better yet, use the Cookies Manager+ add-on).
8. Run some JavaScript code to re-create the noted cookies and their values (a sketch follows below).
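A minimal sketch of that last step, with placeholder names and values standing in for the cookies you noted down; it is written as TypeScript but runs as plain JavaScript in the console or a userscript if you drop the type annotation:

    // Run this on the DESTINATION domain (browser console or Greasemonkey userscript).
    const copied: Record<string, string> = {
      session_token: "abc123",   // placeholders: use the values you noted down
      preferences: "compact",
    };

    for (const [name, value] of Object.entries(copied)) {
      document.cookie = `${encodeURIComponent(name)}=${encodeURIComponent(value)}; path=/`;
    }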
So Amazon's CloudFront CDN is ubiquitous, and as a NoScript user, it can be a little frustrating having to allow every "########.cloudfront.net" on different sites. Does anyone know how to create an ABE rule in NoScript to allow any script coming from a *.cloudfront.net domain?
While the answer by #samadadi is correct and will allow you to universally accept content from CloudFront, you might want to consider why you're running NoScript in the first place.
CloudFront is a content delivery network that anyone can use, even "the bad guys". If you allow JavaScript downloaded from there to run, then you are circumventing the protection NoScript offers. If you want to remain safe, I would recommend sticking with allowing it on a site-by-site basis.
This might be an unpopular option because, yes, it is more work, but you will be safer.
Update (just to save people reading the comments below): CloudFront subdomains, while looking random, remain the same between visits. Permanently allowing CloudFront subdomains from sites you trust should be safe, and will only ask you once.
Add these addresses to the NoScript whitelist to allow CloudFront scripts globally:
cloudfront.net
amazonaws.com
To allow *.cloudfront.net, I enabled the Debug option and touched up the JSON as suggested here: https://www.dedoimedo.com/computers/firefox-noscript-10-guide-1.html. Let me know if you have trouble doing this and I will walk you through it.
I am unable to clear cookies using watir-webdriver and browser.cookies.clear
Are there any other alternatives?
This is, as far as I know, a browser-based limitation, due to security and privacy concerns.
WebDriver interacts with the browser via JavaScript, and JavaScript is not allowed to clear all cookies (think how nasty that could be on a malicious site). Outside of a testing environment, most JS that executes came from the server of the site you are accessing, so the most it is allowed to do is clear the cookies for the 'current domain', i.e. the cookies belonging to the current URL. If a web page wants to clear its own cookies, that is OK, but it is not allowed to clear cookies belonging to other domains.
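For example, about the most a page's own script can do is expire the cookies it can see for the current domain, roughly like this (HttpOnly cookies and cookies scoped to other paths or domains are out of reach):

    // Expire every cookie visible to the current document (current domain only).
    for (const entry of document.cookie.split(";")) {
      const name = entry.split("=")[0].trim();
      if (name) {
        document.cookie = `${name}=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/`;
      }
    }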
So if you want all your tests to start with fresh cookies, you will need something in the 'Before' section of env.rb that goes to the root of the site in question and then clears the cookies.
BTW, the same limitation applies to setting cookies: if you want to create cookies for a specific site, you need to navigate to that site before trying to create them, or nothing gets created.
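Since the question is about watir-webdriver (Ruby), take this only as an illustration of the pattern, navigate to the site first and then clear its cookies, written here with selenium-webdriver's TypeScript bindings and a placeholder URL:

    import { Builder } from "selenium-webdriver";

    // Pattern: visit the site whose cookies you care about, THEN clear (or set) them.
    async function resetCookies(baseUrl: string): Promise<void> {
      const driver = await new Builder().forBrowser("chrome").build();
      try {
        await driver.get(baseUrl);                // must be on the target domain first
        await driver.manage().deleteAllCookies(); // clears cookies for that domain only
      } finally {
        await driver.quit();
      }
    }

    resetCookies("https://staging.example.com/").catch(console.error);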
This is not an answer, but FYI only.
Suppose Chrome is the chosen browser. While Cucumber is running, the output of ps -ef | grep chrome shows the customized user data directory, which looks something like this:
--user-data-dir=/tmp/.org.chromium.Chromium.G2DgPo
And inside that directory, we'll be able to find the Cookies file stored under the Default folder.
I'm not sure whether directly deleting that database file would fulfill the need, because in a normal browser session such runtime data is stored at
~/.config/google-chrome/Default/Cookies
Our team has finished the development phase of a web application, and I want to check all of its pages for broken links.
I have tried a lot of tools, such as Xenu and LinkChecker, but they cannot navigate the pages behind the login page; only the home page gets checked, because authentication is required.
Is there a way to pass the authentication parameters ("userName" and "password") to the tool so that it can navigate the pages behind the login page and check them?
Edit your login module so that you can pass the username and password in the URL. Then start the tool from something like this: http://yourwebsite.com/login?username=...&password=.... You can then leave the work to the tool and your webapp, assuming your tool manages cookies correctly (Xenu has an option for that).
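A hedged sketch of what that login tweak might look like (Express/TypeScript; the route, parameter names, and the authenticate helper are assumptions, not your app's or Xenu's actual API):

    import express from "express";

    const app = express();

    // Placeholder check; a real app would look the user up in its user store.
    function authenticate(username?: string, password?: string): boolean {
      return username === "linkchecker" && password === "secret";
    }

    // Lets the link checker log in with a plain GET, e.g.
    //   http://yourwebsite.com/login?username=...&password=...
    app.get("/login", (req, res) => {
      const username = req.query.username as string | undefined;
      const password = req.query.password as string | undefined;
      if (!authenticate(username, password)) {
        res.status(401).send("Bad credentials");
        return;
      }
      // Issue the same session cookie your normal login flow would set.
      res.cookie("session_token", "generated-session-id", { httpOnly: true, path: "/" });
      res.redirect("/");
    });

    app.listen(3000);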
After reading the cookies section in the Xenu FAQ, I realized it can access IE's cookies, so you can try logging in with IE and then (while the browser is still running) run Xenu with its cookies option enabled.
The only enterprise-grade solution I have found so far is IBM Rational Policy Tester. It does a whole lot more than simply checking for broken links, but it does it well (spell check, grammar, regex, SEO, Section 508 accessibility, ...). The configuration of the tool is a pain and the UI is incredibly dated too... Having said that, authentication isn't an issue and, once configured, it does the job like a boss.
https://www.ibm.com/support/knowledgecenter/en/SSAUB3_8.5.0/com.ibm.pt.help.doc/helpindex_pt.html
So I have a framework we've built on CodeIgniter. It uses regular CodeIgniter sessions by default, which allow up to 4KB of encrypted storage in a cookie.
It's for general apps that require a registration process, which can vary in size since questions are generated dynamically through an admin panel. The registration process relies on session data as it redirects throughout the process.
I have used db_sessions in the past when I knew this would be an issue for the framework; however, I'm now considering having the registration process always use db_session and the rest of the site use the 4KB cookie session.
Is this possible? It seems like it could be a really bad idea, but I don't really want to rework the dynamic registration process, or use db_session for the whole site, as that will eventually make the site run very slowly if too many users are online at once.
So I'm thinking I can just set the variable in the config to be true only when the registration controller is loaded (by checking the URL via $_SERVER, or via the URI helper if I can load it in the config, which I'm guessing I can't).
Does this seem plausible?
It seems like it could be a really bad idea
You answered your own question :) You'll have issues when the user switches from one page to another. What happens if they open multiple windows, press the 'back' button, etc.? You'll need to switch the cookie over when they start registration and switch it back at the end. It will be very, very messy for basically no gain.
but I don't really want to rework the dynamic registration process or really use db_session for the whole site as it will eventually make the site run very slow if too many users are online at once.
The reality is: your website has to be huge to have ANY real performance issues from using a DB for your sessions. And if you are not using the DB, then you are relying on the cookie stored on the user's computer. Depending on your site, this means they might have the ability to edit that cookie and change "admin = true" or something.
Just use the DB session - I think you are overcomplicating the situation.
I'm trying to come up with ways to speed up my secure web site. Because there are a lot of CSS images that need to be loaded, it can slow down the site, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is moving style-based images and JavaScript libraries to a non-secure subdomain so that the browser could cache these resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always choose to show mixed content sites. As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL is the only thing left, buy a hardware accelerator. Hmm, if you load an image using HTTP, will it be cached when you load it with HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a third party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much that can be done to prevent a rogue script scraping your webpage at an inopportune time.
At best, you might be able to get away with iframing secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do that before due to a lack of other pragmatic options. Frankly, though, there are just as many defects with that approach, if not more: after all, you're hoping that nothing violates the integrity of the insecure page so that it hosts the secure content and not some alternate content.
It's just not a great idea from a security perspective.