Cookies Not Being Sent via Firefox's Downloads API in Extension - firefox

I am attempting to write a Firefox extension which downloads files.
The website that I'm trying to download the files from requires cookies to be passed with GET requests; otherwise a 403 is returned.
I can visit the URL that I'm attempting to download the file from in the browser, and the file will load correctly, indicating that my cookies are correct.
I can also, using my extension, issue GET requests to an authenticated-only API from the same domain and receive the correct response (indicating that cookies are passed correctly).
However, when I attempt to download a file from a URL using Firefox's downloads API (browser.downloads.download), the download fails with a 403 because the cookies aren't being passed. I have confirmed this with Charles Proxy.
The Mozilla documentation says "If the specified url uses the HTTP or HTTPS protocol, then the request will include all cookies currently set for its hostname"; my URL uses HTTP.
Why aren't the cookies being passed?
I'm using Firefox Developer Edition (68.0).

It was a Firefox bug in versions 67-69.
https://bugzilla.mozilla.org/show_bug.cgi?id=1555591
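Until you are on a fixed version, one possible workaround (a sketch only, untested here; it assumes a background script with the "downloads" permission and a host permission for the site) is to fetch the file from the extension itself, where cookies are sent as you observed with your API calls, and hand the result to the downloads API:
async function downloadWithCookies(url, filename) {
  // credentials: 'include' makes fetch send the cookies for the target origin
  const response = await fetch(url, { credentials: 'include' });
  if (!response.ok) {
    throw new Error('Request failed: ' + response.status);
  }
  // Turn the body into a blob: URL that the downloads API can save without
  // re-requesting the file (and therefore without needing the cookies).
  const blob = await response.blob();
  const objectUrl = URL.createObjectURL(blob);
  return browser.downloads.download({ url: objectUrl, filename: filename });
}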

Related

Safari and Firefox do not send the cookie when sending an HTTP request to a remote server under the same parent domain, but Chrome does

I have two servers, a.example.com and b.example.com.
The cookie with domain .example.com was set at a.example.com/admin.
I visit the a.example.com/admin page, and on this page an HTTP request is sent to b.example.com.
I did a packet capture and found that the cookie was not sent when I use the Safari and Firefox browsers, but in Chrome the cookie was sent.
So I was wondering why this happens, and is there any method by which Safari and Firefox can be made to send the cookie?
Check this link, it may help you figure this out: https://discourse.mozilla-community.org/t/webextension-xmlhttprequest-issues-no-cookies-or-referrer-solved/11224/15
It seems that either you need to enable 'third party cookies' or you need to wrap XMLHttpRequest. Also, make sure the website is listed in the permissions section of your manifest file: https://developer.mozilla.org/en-US/Add-ons/WebExtensions/manifest.json/permissions
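For the manifest side, a minimal sketch (assuming a manifest v2 WebExtension; the host patterns are placeholders for your own domains) might look like this:
{
  "manifest_version": 2,
  "name": "example-extension",
  "version": "1.0",
  "permissions": [
    "https://a.example.com/*",
    "https://b.example.com/*"
  ]
}
Listing the host patterns under permissions is what allows the extension's own requests to those origins to be made cross-domain, with the user's cookies attached, subject to the third-party-cookie setting mentioned above.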

FineUploader upload gets 401 error and hangs

I'm using FineUploader 5.11.9 "Traditional" to upload pictures to my web server, where a script written using the Perl CGI::Simple module is on the receiving end. Everything works fine on my internal development system: I use CGI::Simple::upload() to get the name of the file and then CGI::Simple::upload($file,$target) to magically copy the temp file to where I want it without having to mess around with filehandles (so much easier than with old CGI.pm, once you realise that you need to use CGI::Simple(-upload) because upload support is disabled by default).
But I digress. Everything is fine on my development system, but on my production system only the first upload works, and then only if I do it soon after loading the webapp page. All subsequent uploads hang, and when I use the Web Inspector in Safari I see that the calls to my CGI script have received a 401 Unauthorized response.
The production server is set up with Basic access authentication (using AuthType Basic etc. in a <Location "/"> section in the Apache httpd.conf file, actually default-ssl.conf, which gets included by it), and Safari is set to remember the username/password, so I'm guessing that my first upload works because the authentication from when the web page loaded is still valid, but by the time I do another request it has gone stale.
I've had a look on the FineUploader documentation site and can't find anything about Authorization for its AJAX requests. Google found FineUploader - add authentication in header which might be the answer, but if it is, how do I find the right value to put in the "Authorization:Basic" header?
To change headers sent with a traditional upload request, use the setCustomHeaders API method:
uploader.setCustomHeaders({
    // Basic auth expects the base64 encoding of "username:password"
    'Authorization': 'Basic ' + btoa(username + ':' + password)
}, fileId);

Selenium IDE: How to detect secure cookies on page loaded with http://?

I am using Firefox 22 and Selenium IDE 2.2.0.
I have loaded a page in Firefox using the HTTP protocol (not HTTPS). I know for sure that the page has set a secure cookie (as a result of an embedded AJAX request). I can verify this using the browser-internal URL chrome://web-developer/content/generated/view-cookie-information.html, because among other cookies that page shows a cookie like this:
Name WC_AUTHENTICATION_5122759
Value 5122759%2cDKppXa7BAqnZ0ERDLb0Wee%2bXqUk%3d
Host .testserver.dk
Path /
Expires At end of session
Secure Yes
HttpOnly No
However, when I run assertCookie in the Selenium IDE I can only see the non-secure cookies. I.e. all cookies - except the one above - are detected by Selenium IDE:
Executing: |assertCookie | glob:WC_AUTHENTICATION_* | | yields this set of visible cookies:
[error] Actual value 'JSESSIONID=0000uCQdh2FZ0ZA8z-O5zcGoUtD:-1;
WC_PERSISTENT=lT8Z5tbkQrvLhNm%2bGyCj%2bh4yPAU%3d%0d%0a%3b2013%2d07%2d05+13%3a18%3a18%2e807%5f1373023098807%2d3048%5f10201%5f5122827%2c%2d100%2cDKK%5f10201;
WC_SESSION_ESTABLISHED=true;
WC_ACTIVEPOINTER=%2d100%2c10201; WC_USERACTIVITY_5122827=5122827%2c10201%2cnull%2cnull%2cnull%2cnull%2cnull%2cnull%2cnull%2cnull%2cy6bjcrZgvCVe5c52BBKvcItxyF5lLravpDq9rd9I0ZmRfRNxcC2oG13Eyug3kKgbtLOHVLxm9T76%0d%0a%2fGJFLp5bOrkPoNqmc38TIr%2fO7eU%2fbd7Mfny2kQg7v6xGweYoRkXYgAEz91rH0QavFhlOjpd12A%3d%3d;'
did not match 'glob:WC_AUTHENTICATION_*'
So does anyone know how I can use the Selenium IDE to verify the presence of secure cookies on a page loaded with http:// (not https://)?
Sadly, what you are doing is breaking the specifications. A secure cookie is supposed to be available only if the connection is secure. Hence, if you are connecting over HTTP, you can't see it.
However, if this is just on your test machine (not your end users'), you can modify the response from the server using Fiddler. With Fiddler, you can script something like: if you see this cookie, add another cookie, or strip the secure flag.
EDIT:
Some background information about Selenium and cookies:
Selenium works through the browser with JavaScript as part of the page. Because it is essentially a part of the page, it has to follow the same rules as the page. This means that it still has to abide by the security rules on cookies: a secure-only cookie can only be read over a secure connection, so Selenium cannot read a secure cookie if the page was not loaded over one.
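To make that concrete, here is roughly what the IDE's cookie check boils down to (a sketch; the cookie name is taken from your example):
// What assertCookie effectively sees: the cookies exposed to page JavaScript.
// Over http://, document.cookie omits cookies flagged Secure (and all HttpOnly
// cookies), so WC_AUTHENTICATION_* can never appear here.
var visibleCookies = document.cookie;   // "JSESSIONID=...; WC_PERSISTENT=...; ..."
var hasAuthCookie = /(?:^|;\s*)WC_AUTHENTICATION_/.test(visibleCookies);
console.log(hasAuthCookie);             // false on the http:// page, even though the cookie was set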
The place where the HTTP request comes in is that cookies are part of the HTTP headers. Both the request (from the browser) and the response (from the server) have HTTP headers, and cookies appear in both.
You want to verify that the server has set the cookie, so you want to inspect the HTTP response from the server for the presence of the cookie. Because of the security restrictions, however, you cannot do that from Selenium. These restrictions are enforced by the browser; all reputable browsers enforce these policies, since without them the end user's credentials would be easily compromised.
This is where Fiddler comes in. Fiddler inspects the HTTP data at a lower level, before the browser gets to it. Thus, you can use Fiddler to manipulate the data before it gets to the browser to give some kind of indication that the cookie was present.

URL not allowed by Access-Control-Allow-Origin

I am trying to implement OAUTH for accessing Flickr APIs. My AJAX call to flickr.com keeps failing.
Sample Error Message:
XMLHttpRequest cannot load http://www.flickr.com/services/oauth/request_token?oauth_callback=oob&oauth…signature_method=HMAC-SHA1&oauth_timestamp=1368375405647&oauth_version=1.0. Origin http://localhost:8080 is not allowed by Access-Control-Allow-Origin.
Initially I used Chrome and loaded the HTML file as file://path. I used to get the error 'null not allowed by access-control-allow-origin'. I tried to solve this by copying the HTML file to a local IIS server, a local Python web server, and then a remote web server. I created the Python web server using python -m http.server 8080.
I realize my cross-domain call to flickr.com using XMLHttpRequest is failing. I tried the various solutions suggested in this forum:
Using newer Chrome 26.0.1410.64 m, which I guess supports CORS
I launched chrome with --disable-web-security
I created a web server using python -m http.server 8080 on local machine and then on a remote machine and copied the html file to the site
I copied file to a local MSFT IIS server
I defined URL in etc/hosts file to avoid numeric IP
I still get the same error (with relevant URL in the error message)
code clipping:
urlString="http://www.flickr.com/services/oauth/request_token?"+
"oauth_callback="+"oob"+'&'+
"oauth_consumer_key="+consumerKey+'&'+
"oauth_nonce="+nonce+'&'+
"oauth_signature="+esignature+'&'+
"oauth_signature_method="+macAlgorithm+'&'+
"oauth_timestamp="+timeStamp+'&'+
"oauth_version=1.0";
$.ajax({
url: urlString,
success:function(data){
alert(data);
}
});
In order for CORS to work, both ends must enable it.
The first end is the browser, and, as you are using Chrome 26.*, yours is OK.
The second end is the server:
Before making a GET request to a domain different than the one the page is on, the browser sends an OPTIONS request to that domain. In response to this request, the server should include some headers that tell if a cross-domain request (GET, POST or other) is allowed.
One of those headers is Access-Control-Allow-Origin.
So when you run your page from your file system (file:// "protocol"), the OPTIONS means something like "Flickr, can I make a cross-domain call to you? I'm calling from null". Flickr does not recognize that domain as allowed and returns the error you are getting.
In the same way, when you run your page from your local server, the OPTIONS says "(...) I'm calling from localhost:8080". Flickr does not recognize that domain as allowed either.
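For illustration only (you cannot do this for Flickr, since you don't control their server), here is a minimal Node.js sketch of the headers a CORS-enabled server would send back so the browser lets the call through:
var http = require('http');

http.createServer(function (req, res) {
  // Tell the browser which origin may make cross-domain calls here
  res.setHeader('Access-Control-Allow-Origin', 'http://localhost:8080');
  res.setHeader('Access-Control-Allow-Methods', 'GET, OPTIONS');
  if (req.method === 'OPTIONS') {
    // The preflight request is answered with the headers alone
    res.end();
    return;
  }
  res.end('actual response body');
}).listen(3000);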
The solution:
I don't know the Flickr OAuth service, but I know that, as with any other service, to make a CORS call to it the page must be in a domain allowed by it. From your tests, I'm guessing Flickr doesn't allow many other domains.
But... an alternative to CORS is JSONP. I did a little research, and Flickr's OAuth seems to support it.
Check this page for details: http://www.flickr.com/services/api/explore/flickr.auth.oauth.getAccessToken
There's another question talking about that specific subject:
Is JSONP supported in the new Flickr OAuth API?
About JSONP, this can get you started: How to make a JSONP request from Javascript without JQuery?
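As a rough illustration of the JSONP mechanism itself (the callback name and API key below are placeholders; whether a given Flickr endpoint returns a usable JSONP response is a separate question):
// JSONP works by injecting a <script> tag, so the same-origin policy that
// blocks XMLHttpRequest does not apply. The server must wrap its JSON in a
// call to the function named in the callback parameter.
function jsonp(url, callbackName) {
  window[callbackName] = function (data) {
    console.log(data);                       // handle the response here
    delete window[callbackName];
  };
  var script = document.createElement('script');
  script.src = url + '&jsoncallback=' + callbackName;  // Flickr's JSON format uses "jsoncallback"
  document.head.appendChild(script);
}

jsonp('https://api.flickr.com/services/rest/?method=flickr.test.echo&format=json&api_key=YOUR_KEY', 'handleFlickrResponse');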
It is not possible to implement OAuth 1.0 through just JavaScript without any server-side script. Since Flickr's new authentication process is based on OAuth 1.0a, you have to use a server-side script.
I tried to send the token request using JSONP in Firefox with CORS on (using a third-party add-on) and it worked fine. But without using any add-ons it's not possible, as the response from Flickr is in text format (not in a JSON format) and the request fails.
You can either use server-side code for the token request, or use the deprecated Flickr API for authentication.

AJAX request to https php server from Firefox and Chrome extensions

I'm working on extensions for Firefox and Chrome. The data used by my extensions is mostly generated from ajax requests. The type of data being returned is private, so it needs to be secure. My server supports https and the ajax calls are being sent to an https domain. Information is being sent back and forth, and the extensions are working correctly.
My questions are:
Do the extensions actually make secure connections with the server, or is this considered the same as cross-domain posting, sending a request from an HTTP page to an HTTPS page?
Am I putting my users' information at more risk during the transfers than if the user were to access the information directly from an https web page in the browser?
Thanks in advance!
The browser absolutely makes a secure connection when you use HTTPS. A browser would never downgrade the security of your connection without telling you: it will either complete the request as written or throw some sort of error if that is not possible.
Extensions for both Chrome and Firefox are permitted to make cross-domain AJAX requests. In Chrome, you simply need to supply the protocol/name of the host as a permission in your manifest.json. In Firefox, I think you may need to use Components.classes to get a cross-domain requester, as described in the MDN page for Using XMLHttpRequest, but I'm not 100% sure about that. Just try doing a normal request and see if it succeeds; if not, use the Components.classes solution.
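As a quick sketch of the Chrome side (the host and endpoint below are placeholders), the cross-domain request only succeeds because the host pattern appears under "permissions" in manifest.json, and it stays HTTPS end to end:
// manifest.json (excerpt): "permissions": ["https://api.example.com/*"]
var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://api.example.com/private-data', true);
xhr.onload = function () {
  console.log(xhr.status, xhr.responseText);   // delivered over TLS
};
xhr.onerror = function () {
  console.error('Request failed');             // e.g. certificate or network error
};
xhr.send();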
