Can Firefox 76 be forced to consider .localhost subdomains a Secure Context without TLS?

We are using .localhost domains for development of our applications, and we have multiple applications living at different domains. We are at the point where we need to test features that require pages to execute in a Secure Context, such as Service Workers and the Push API.
For the past few versions, Google Chrome has been marking all sites served from a .localhost domain as a Secure Context, allowing local, hassle-free testing of Service Workers, the Push API, etc.
I cannot find a way to force Firefox 76 to consider the same pages to be in a Secure Context.
We have managed to resolve all .localhost addresses correctly to 127.0.0.1 in all browsers using local DNS-resolver settings or built-in browser behaviours.
The Firefox config entry network.dns.localDomains does not seem to affect whether a site is considered to be in a Secure Context.
There seems to be ongoing internal Firefox development to change that behaviour out of the box, but it's hard to say when it will be merged and released, or whether all pages on *.localhost will then be considered a Secure Context:
https://bugzilla.mozilla.org/show_bug.cgi?id=1220810

As of Firefox 84, localhost is considered a secure context. Before that, it wasn't, because it is not guaranteed that localhost will in fact resolve to a local (and therefore trusted) address.
However, the preference dom.securecontext.whitelist (renamed to dom.securecontext.allowlist in Firefox 97) was created specifically with this scenario in mind: it takes a comma-separated list of hostnames (for example, host1.example.com,host2.example.net) that will be considered secure.
This preference does not seem to be well-documented, but it can be seen in this changeset: https://hg.mozilla.org/mozilla-central/rev/cfb9de0c9f2a.
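For reference, the preference can be set persistently from a user.js file in your Firefox profile directory; the hostnames below are placeholders for your own development domains:

user_pref("dom.securecontext.whitelist", "app1.localhost,app2.localhost"); // "dom.securecontext.allowlist" from Firefox 97 on

A page can then confirm that Firefox treats it as secure via the standard window.isSecureContext property, for example before registering a service worker (the /sw.js path is an assumption):

// Run in the page or in the DevTools console.
if (window.isSecureContext && 'serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then((reg) => console.log('Service worker registered:', reg.scope))
    .catch((err) => console.error('Registration failed:', err));
} else {
  console.warn('Not a secure context; Service Workers are unavailable.');
}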

Related

AppAuth loopback authentication fails on macOS with Chrome

We're using AppAuth for a macOS application to authenticate Google accounts. This has been working for years, except that recently Chrome has started to block all HTTP connections by default. The loopback server in AppAuth is hard-coded to work with HTTP connections only. The following issue also seems to have gone unanswered: https://github.com/openid/AppAuth-iOS/issues/624
What other options do we have for using an HTTPS loopback server on macOS for OAuth2 authentication? We need the loopback server to be able to extract the parameters Google sends back after authentication. Asking users to switch away from Chrome is not desirable.
Interesting - with loopback desktop logins there are two URLs involved:
The URL in the desktop app, which is meant to be HTTP according to OAuth standards, since it runs on end users' PCs. Using HTTPS would require the entire user base to host SSL certificates, which is highly impractical. Typically a loopback URL is a value such as http://localhost:8000, where the port number is often calculated at runtime (see the sketch after this list).
The URL used to invoke the system browser, which is a value such as https://myauthserver/authorize?client_id=xxx&redirect_uri=http://localhost:8000..., and this should of course be HTTPS.
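For illustration, a loopback redirect handler only needs to parse the query string appended to the redirect URI. Below is a minimal sketch in Node.js; the port and messages are assumptions, not AppAuth's actual implementation:

import * as http from 'http';

const server = http.createServer((req, res) => {
  // Google redirects the browser to http://localhost:8000/?code=...&state=...
  const url = new URL(req.url ?? '/', 'http://localhost:8000');
  const code = url.searchParams.get('code');

  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('You may close this window.');

  if (code) {
    console.log('Authorization code:', code);
    server.close(); // one-shot server: stop listening once the code arrives
  }
});

server.listen(8000);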
PROBLEM DIAGNOSIS
I'd be very surprised if Google has blocked this for standard desktop logins, since it has been referenced on their Native Apps Page for years.
Are you sure something else is not the cause? One possibility might be the lack of a user gesture in the system browser. Is the problem consistent, and are there any differences between these cases:
Make Safari Browser the default before login
Make Chrome Browser the default before login
Make Chrome Browser the default before login and clear browser cache
Let me know and I may be able to suggest some next steps ...

Edge AJAX calls fail to a domain with SSL pointing to localhost

We have a product which relies on a thin client installed on the user's machine. We make an AJAX GET request to a domain pointing to localhost which has a real SSL certificate. This fails in Edge but works in every other browser, including IE11. Note that the same setup works if there is no SSL involved. It also works on Windows 10 Home edition.
Adding a datatype, content type or request method does not resolve this. The only way to fix this seems to be running the following command:
CheckNetIsolation LoopbackExempt -a -n="Microsoft.MicrosoftEdge_8wekyb3d8bbwe"
If this is expected behavior, can someone explain why Microsoft would block this on an Enterprise version while it works on the Home edition?
Microsoft Edge, and Windows 10 apps in general, use AppContainer Isolation:
Isolating the application from network resources beyond those specifically allocated, AppContainer prevents the application from 'escaping' its environment and maliciously exploiting network resources. Granular access can be granted for Internet access, Intranet access, and acting as a server.
Your thin client is running in Edge on Windows 10 Enterprise against an intranet SSL service (localhost), so access is restricted by default by this mechanism. With the command
CheckNetIsolation LoopbackExempt -a -n="Microsoft.MicrosoftEdge_8wekyb3d8bbwe"
you are exempting MS Edge from network isolation on the loopback network adapter (localhost), so your app's client (and any other locally sourced app) can run against any localhost service without restriction.
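If you need to audit or undo that change later, the same tool can list and remove exemptions (these are documented CheckNetIsolation flags):

CheckNetIsolation LoopbackExempt -s

shows the currently exempted packages, and

CheckNetIsolation LoopbackExempt -d -n="Microsoft.MicrosoftEdge_8wekyb3d8bbwe"

removes the exemption again.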
This fails in Edge, works in every other browser including IE11.
They clearly wanted to improve on the default security policy of previous versions. It's never too late, MS :) There is actually an Enhanced Protected Mode (EPM) that could prevent your app from running in IE too. Chrome has its Google Chrome Sandbox, which can also be tuned like this. Safari and Firefox also have sandboxing features, although I am not familiar with their particularities.
Note that the same setup works if there is no SSL involved.
Typically, if you are using SSL it is because you are dealing with sensitive data and/or a critical service. If you are not, it is OK to be more lax. Again, it's just a matter of security policy.
It also works on Windows 10 Home edition. If this is expected behavior, can someone explain why Microsoft would block this on an Enterprise version while it works on the Home edition?
Enterprise versions of any product are known to be more restrictive, since their target users are more security-conscious (IT people typically don't want to expose their company's intranet payroll DB service to external attackers, and the like). Also, in this case the default behavior can easily be defined or altered by experts in the IT department (check out domain security policies), so it's better to leave the default settings in "paranoid" mode and let the experts tweak them according to the company's needs.
Note that there are other mechanisms at work when you are running a thin client in the browser that make this kind of protection redundant (the same-origin policy, XSS protection and so on). Nevertheless, one can never be too safe: there are ways to work around those defenses, such as Self-XSS, which require isolation between the browser and the local network to avoid compromising the system. In the end, less exposed surface means fewer attack vectors, so isolation is good if you can afford it :)

How to disable CORS in Mozilla Firefox?

How do you disable web security in Firefox, or how do you solve a CORS issue in Firefox during development?
Things tried but did not work:
Filtering in "about:config" and setting security.fileuri.strict_origin_policy to false doesn't work.
Tried a few add-ons like "CORS Everywhere" (https://addons.mozilla.org/en-US/firefox/addon/cors-everywhere/). Doesn't work.
How to disable web security in Firefox
Don't. It gives unrealistic results for testing.
how to solve a CORS issue in Firefox during development
Ideally: Create a development environment that is just like the live environment.
The server-side code will, at some point, need development work performed on it. Your team will need the ability to create a development server with test data in it for that. Use the same development server for working on the client-side code.
That way you can do your development work:
without making test calls to the live server (so you never need fake test users performing fake actions on the live server, with the risk that test data will escape somewhere end users will see it)
without cross-origin issues (because the development server for your client-side code will be the same as the development server for the URL you are requesting)
able to use relative URLs
with a browser that acts like the browsers used by end users
As a quick and dirty hack, which lacks most of the benefits of a proper test environment: use a proxy server that maps requests from the same origin as your development environment to the live environment.
I used Charles proxy for that before I moved to having proper development environments.
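If you want to roll your own instead of using Charles, the idea fits in a few lines of Node.js. This sketch simply forwards every incoming request to the live server, so the browser sees a single origin; the host name and port are placeholders:

import * as http from 'http';
import * as https from 'https';

const LIVE_HOST = 'api.example.com'; // placeholder for the live server

http.createServer((clientReq, clientRes) => {
  // Forward the incoming request to the live server over HTTPS.
  const proxyReq = https.request(
    {
      host: LIVE_HOST,
      path: clientReq.url,
      method: clientReq.method,
      headers: { ...clientReq.headers, host: LIVE_HOST },
    },
    (proxyRes) => {
      clientRes.writeHead(proxyRes.statusCode ?? 502, proxyRes.headers);
      proxyRes.pipe(clientRes);
    },
  );
  clientReq.pipe(proxyReq);
}).listen(8080); // browse via http://localhost:8080 -- one origin, no CORS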

How Do Firefox Extensions Use IP Address With Anonymous Proxy? Original IP May Be Exposed?

Research On Firefox Extensions Connections
I have read the FAQs on Firefox extensions (https://addons.mozilla.org/en-us/faq) and have looked at their community forums, but wasn't able to find anything on how extensions actually connect and collect your data.
The closest I found was the Mozilla Wiki page on data collection (https://wiki.mozilla.org/Firefox/Data_Collection), but it only covers the basic opt-in/out data collection levels.
What I'm trying to understand:
If I'm using a manually configured anonymous proxy in Firefox, could an extension potentially send my actual IP address (not my proxy IP address) back to a third party?
Example: Translate Extension
For example, if I were to use Google Translate for Firefox, would Google be able to see my original IP?
What I was thinking
Since the proxy is the only way for the browser to connect to the internet, the extension would have to connect through the proxy, and thus a third party would only be able to see the proxy's IP address. However, I would love to be sure that there is no back door or other way for the extension to reveal my original IP.
Any insight is greatly appreciated. We are not doing anything unethical, we just have to maintain separate IP usage for various clients and do not want to risk mixing their information. Thanks again.
Firefox extensions are usually not limited in what they can do; only extensions based on the WebExtensions framework are sandboxed. Currently, the majority of Firefox extensions are still either classic XUL-based extensions or based on the Add-on SDK, and these have no inherent restrictions. So in theory an extension can do lots of things in order to deanonymize you, for example:
Use nsIDNSService in order to retrieve your local IP address (usually this address isn't valid outside your local network, however).
Change browser settings, in particular disable your configured proxy server.
Use external command line tools in order to read out system information or send a request bypassing the browser.
Read files on your hard drive in order to find your name.
Note that Chrome also offers an API that lets extensions modify the browser's proxy settings, and a similar API is planned for WebExtensions. So even sandboxing doesn't always protect against deanonymization, and you need to trust the extensions you install.
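To make that concrete, here is a sketch of how an extension holding the "proxy" permission could silently bypass a configured proxy using Chrome's documented chrome.proxy API:

// In the extension's background script; requires the "proxy" permission.
chrome.proxy.settings.set(
  { value: { mode: 'direct' }, scope: 'regular' }, // 'direct' = no proxy
  () => console.log('Proxy disabled; requests now expose the real IP.'),
);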
However, the extensions hosted on Addons.Mozilla.Org are usually reviewed by Mozilla (the ones that aren't reviewed yet have a yellow install button and a warning). One aspect the reviewers look into is: does this add-on do what it claims to do, or are there unexpected side effects? Any unexpected functionality has to be strictly opt-in, with a full explanation of the implications. This was introduced in 2009 as the No Surprises policy and works remarkably well. The Chrome Web Store doesn't have any comparable policy.

Is it possible to 'pretend' a site is a third party to test functionality if 3rd party cookies are blocked?

I do automation testing for a company that is trying to implement single sign-on via an iframe; a third-party site will include our page in an iframe and we will perform an authorization.
We had to rework the way this worked because Firefox now defaults to blocking third-party cookies. For manual testing, we have hosted the page on a different domain, but this domain requires certain usernames and passwords we cannot expose in code, so it is difficult to automate.
Is there a way I can trick Firefox into thinking that mydomain.com is not actually mydomain.com? This sounds impossible, because if I could trick Firefox into thinking I'm actually on mydomain2.com, then I could effectively just set a third-party cookie. But since I'm doing it on my own instance of Firefox, are there any settings I can change in my profile to confuse it?
Yes, this is incredibly easy and we do this all the time.
Log onto the test machine (the computer that will be running the browser) and edit the hosts file, located in C:\Windows\System32\drivers\etc
Add an entry for the site you wish to confuse, using a different domain name but the correct IP address. Because it's a different domain, it'll look like a third-party site, but because the IP address is the same, the requests will actually be sent to the same web server.
Example:
Assume your web server is running on the local host (which has address 127.0.0.1)
Add host entries for
127.0.0.1 FirstPartyDomain.com
127.0.0.1 ThirdPartyDomain.com
Access your site via http://FirstPartyDomain.com
Site contains an iFrame
<iframe src="http://ThirdPartyDomain.com/SetNastyCookies"></iframe>
The request in the iframe will go to the same server (the local host), but in the context of a third-party site.
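For completeness, here is a minimal sketch of the local server backing this example (Node.js; the port, path, and cookie are placeholders matching the snippet above):

import * as http from 'http';

http.createServer((req, res) => {
  if (req.url === '/SetNastyCookies') {
    // Served under ThirdPartyDomain.com inside the iframe, so the browser
    // treats this as a third-party cookie.
    res.writeHead(200, { 'Set-Cookie': 'test=1; Path=/' });
    res.end('third-party cookie set');
  } else {
    // Served under FirstPartyDomain.com as the top-level page.
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<iframe src="http://ThirdPartyDomain.com/SetNastyCookies"></iframe>');
  }
}).listen(80); // both hosts file entries point here (127.0.0.1)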
What about using Dnsmasq? Most open-source router firmwares, such as DD-WRT, support this option. If you need to test over HTTPS, you could also temporarily store security certificate exceptions during your testing.