Response Cookie not getting set by Chrome & IE - ajax

I'm trying to figure out why Chrome (26.0.1410.64) and IE10 don't seem to recognize the cookie I set in my response from an ASP.NET Web API controller. Here is the situation:
I have a drop-down login form on my page that makes an ajax call (an HTTP POST) to my Web API method, and that method returns some JSON data and also sets a cookie in the response headers. It works perfectly in Firefox and Safari, but not in Chrome or IE: both appear to completely ignore the cookie sent back in the response. I've verified with Fiddler that the cookie is sent back on the response, so I know it's there - I just can't figure out why IE10 and Chrome don't pick it up.
Any ideas? Does it have something to do with how Chrome and IE10 handle response cookies in ajax requests?
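For reference, the call looks roughly like this - a minimal sketch assuming jQuery; the endpoint route and payload are hypothetical. If the API lived on a different origin, the withCredentials flag shown here would also be required before the browser will store cookies from an ajax response:

$.ajax({
    url: '/api/login', // hypothetical Web API route
    type: 'POST',
    contentType: 'application/json',
    data: JSON.stringify({ user: 'alice', password: 'secret' }), // illustrative payload
    xhrFields: { withCredentials: true } // only needed for cross-origin calls
}).done(function (data) {
    console.log('logged in', data); // the response cookie should now be stored by the browser
});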

So I figured out the issue, although it's not something I really would like to accept as a solution. I guess I will just have to deal with it and always test the site on my local machine using Firefox.
So here's the issue:
When I run my site locally from Visual Studio and IIS on my local machine, it gets created at an address like http://localhost:1839/. For some reason, IE10 and Chrome ignore ajax cookies when the host is "localhost" - but not when it's a real-looking host name or IP address. So if I edit my hosts file and add an entry like localhost.com pointing at 127.0.0.1 (the hosts file only maps names to IP addresses, so the port stays in the URL: http://localhost.com:1839/), then everything works fine in IE and Chrome (and still in Firefox as well).
It's only when I use the localhost:1839 address that the ajax cookie works solely in Firefox.
So what I ended up doing was deploying my website to a different test IIS server (on another machine) that I have a test.mydomain.com entry for in my local hosts file - it points to the test IIS server's IP address. Now IE, Chrome, and Firefox all accept the ajax cookie from this faked "test.mydomain.com" domain.
So for those of you sending cookies back on an ajax request - beware of this "localhost" issue with Chrome and IE.

The Domain on the Set-Cookie is most likely conflicting with "localhost". If you edit your hosts file and add an alias, you can make test.mydomain.com point to your local machine:
Within c:\windows\System32\drivers\etc\hosts add the following:
127.0.0.1 test.mydomain.com
Start your webserver within Visual Studio
Close all browsers, then load test.mydomain.com
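With the alias in place, the Set-Cookie header coming back on the ajax response can carry a Domain the browser will accept (the cookie name and value here are illustrative):

Set-Cookie: session=abc123; Domain=test.mydomain.com; Path=/; HttpOnly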

Related

NS_ERROR_DOM_BAD_URI on Firefox only on localhost

I'm trying to switch to Firefox for development but I'm stuck on this error and have no idea what it could be.
The problem is specifically with our Login endpoint, which sets HttpOnly cookies on successful login. In development this works in both Safari and Chrome, but trying to log in on Firefox returns an NS_ERROR_DOM_BAD_URI error.
In development, the web app runs at http://localhost:3000 and the API at https://localhost:5001.
I assume Firefox is blocking the login because the API and the web app are technically different origins, but I don't understand why it would do that for localhost.
Is there a way to disable this error entirely?
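For context, a cross-origin login of this shape needs credentials enabled on both ends - a hedged sketch of the client side (the endpoint path and payload are hypothetical):

fetch('https://localhost:5001/api/login', {
    method: 'POST',
    credentials: 'include', // ask the browser to accept and send cookies cross-origin
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ user: 'alice', password: 'secret' }) // illustrative payload
});

On the server side, the CORS response must include Access-Control-Allow-Credentials: true and an explicit Access-Control-Allow-Origin (a wildcard * is rejected when credentials are involved).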

Cloudflare identifying CURL

So I'm trying to create some scripts that have to run against a particular site protected by Cloudflare. I'm running into one odd situation though:
Whenever I send a cURL request (just a GET) to that particular website from the command line, it reports a 503.
When I make the same request with the Firefox RESTED client, it reports a 200. Running it in my browser executes the JavaScript protection as expected (so a 200 as well).
What could possibly distinguish a cURL request from a Firefox RESTED request, when both seem to do the exact same thing?
I'm using:
same IP
same User-Agent (in fact I tried mimicking over 7 headers that my regular browser sends, including Accept-Language, Accept-Encoding, and more)
Apparently the RESTED Firefox add-on also sends along all cookies currently stored in your Firefox browser. One of those cookies identified my RESTED client as being valid.
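So the fix on the cURL side is to copy that cookie out of the browser and send it along - likely Cloudflare's clearance cookie, commonly named cf_clearance, though the answer doesn't name it. A hedged sketch with placeholder values:

curl 'https://www.protected-site.example/' \
  -A 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...' \
  -b 'cf_clearance=VALUE_COPIED_FROM_BROWSER'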

ASP.NET Unauthorized in Postman using IIS Express, but works in Chrome

I am trying to test an ASP.NET Web API locally using IIS Express. When I use Chrome and hit the url (localhost:5000/api/test, for example) the JSON displays fine, but when hitting the same url from Postman I keep getting a 401.2 Unauthorized. The API controller allows anonymous access on the route.
On the error message, one of the likely causes is:
Integrated authentication is enabled and the request was sent through a proxy that changed the authentication headers before they reach the Web server.
Is Postman somehow changing the headers?
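Note that a 401.2 is rejected at the IIS level, before the request ever reaches the controller, so it's worth confirming that anonymous authentication is actually enabled for the site in IIS Express's applicationhost.config regardless of what the controller allows. A hedged sketch of the relevant section (the file usually lives under the solution's .vs\config folder, depending on the Visual Studio version):

<security>
  <authentication>
    <anonymousAuthentication enabled="true" />
    <windowsAuthentication enabled="false" />
  </authentication>
</security>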
[2019 Update]
Got the same issue: I couldn't debug an ASP.NET Core 2.1 API from Postman when running it on the local machine under IIS Express. I kept getting "Could not get any response" even though it worked fine in a browser.
Following the troubleshooting steps explained on the PostmanLabs GitHub, I noticed in the Postman console that this came down to a certificate issue.
Disabling SSL certificate verification under Postman Settings > General allowed the request to go through.
Looks like it's your proxy.
I couldn't find the proxy setting in Postman, so I deleted Postman for Windows and installed Postman for Chrome - possibly Postman picks up its environment from Chrome.
Either way, the resolution was to use Postman for Chrome instead of Postman for Windows.
I have a localhost Web API site up with IIS Express (HTTPS). Postman started responding as expected to GETs and POSTs after I changed (in Postman)
File --> Settings --> Proxy
to "Use the system proxy"
and turned on "Respect HTTP_PROXY ...".
I had earlier set up a custom proxy that wasn't working with HTTPS.

Making requests to ws:// from a website loaded on https

I'm using sipml5 to connect to a SIP phone service, and one of the settings is the service's websocket server URL. The problem is that the server URL is not secured (e.g. ws://123.123.123.123:9999/ws) and it cannot be reached over wss://. Because of that, when my site is loaded over an HTTPS connection, the browser blocks the request outright - it doesn't behave like it does when loading, say, an image over http, where it only shows a warning.
Error is: [blocked] The page at 'X' was loaded over HTTPS, but ran insecure content from 'ws://....': this content should also be loaded over HTTPS.
I need to know if there is a way to make the browser connect to ws:// even though the page initiating the request is loaded over https.
Please help.
EDIT:
What I'm looking for is a flag or something similar, in Chrome or Firefox for example, that lets the user access insecure resources even though the page is loaded over https.
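For reference, this is the kind of call that gets blocked - the address is the one from the question, and the variable name is just for illustration:

// On a page served over https://, an insecure WebSocket is treated as mixed content:
var socket = new WebSocket('ws://123.123.123.123:9999/ws'); // typically throws a SecurityError instead of connecting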
Why are you using http? You can get an SSL certificate from https://letsencrypt.readthedocs.org/en/latest/intro.html
then add the following details to http.conf:
tlsenable=yes
tlsbindaddr=0.0.0.0:8089
tlscertfile=/path-to/cert.pem
tlsprivatekey=/path-to/privkey.pem
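These look like Asterisk http.conf options; once TLS is enabled, the client can point at the secure endpoint instead (the hostname below is hypothetical, and the port matches tlsbindaddr above):

var socket = new WebSocket('wss://sip.example.com:8089/ws');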

AJAX request to https php server from Firefox and Chrome extensions

I'm working on extensions for Firefox and Chrome. The data used by my extensions is mostly generated from ajax requests. The type of data being returned is private, so it needs to be secure. My server supports https and the ajax calls are being sent to an https domain. Information is being sent back and forth, and the extensions are working correctly.
My questions are:
Do the extensions actually make secure connections with the server, or is this considered the same as cross-domain posting, sending a request from an http page to an https page?
Am I putting my users' information at more risk during the transfers than if the user were to access the information directly from an https web page in the browser?
Thanks in advance!
The browser absolutely makes a secure connection when you use HTTPS. A browser would never silently downgrade the security of your connection: it will either complete the request as written or throw some sort of error if that is not possible.
Extensions for both Chrome and Firefox are permitted to make cross-domain AJAX requests. In Chrome, you simply need to list the protocol and host as a permission in your manifest.json. In Firefox, I think you may need to use Components.classes to get a cross-domain requester, as described in the MDN page for Using XMLHttpRequest, but I'm not 100% sure about that. Just try a normal request first and see if it succeeds; if not, use the Components.classes solution.
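For the Chrome side, the host permission mentioned above looks roughly like this in a manifest.json (manifest v2 era, matching this answer's vintage; the domain is hypothetical):

{
  "name": "My Extension",
  "manifest_version": 2,
  "permissions": ["https://api.example.com/*"]
}

With that in place, a plain XMLHttpRequest from the extension's background page can reach the host cross-domain:

var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://api.example.com/data', true); // allowed because of the permission above
xhr.send();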
