SSL certificate - Use client certificate installed on server for local testing - Mutual authentication - Windows

I have been working with web services that connect to URLs provided by different clients, and so far it has all been done using one-way authentication. Now I'm asked to enable two-way (mutual) authentication for one of the clients. I have done a lot of research and reading, but I'm still confused about a lot of things.
I was able to test successfully on my local machine by following instructions from various articles. The problem now is deploying it to production.
Here's what I did for testing: I created a test web service host, assigned it a self-signed certificate, and created a client to test against it. I then created a client certificate using makecert and verified via MMC that it was installed. Next I modified the host app to only allow clients that present a certificate, and confirmed from the client that the connection was refused when no client certificate was provided. Finally, I modified the bindings in the client application to include the certificate name and was able to connect to the host successfully. So that completes local testing.
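For reference, the code-based way to attach a client certificate to a WCF client looks roughly like the sketch below (a minimal sketch only; the contract, endpoint URL and certificate subject name are placeholders, not the actual values from this setup):

using System;
using System.Security.Cryptography.X509Certificates;
using System.ServiceModel;

[ServiceContract]
public interface ITestService
{
    [OperationContract]
    string Ping();
}

public static class ClientDemo
{
    public static void Main()
    {
        // Transport security (HTTPS) with a client certificate.
        var binding = new WSHttpBinding(SecurityMode.Transport);
        binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Certificate;

        var factory = new ChannelFactory<ITestService>(
            binding, new EndpointAddress("https://localhost:8443/TestService"));

        // Pick the client certificate out of the current user's Personal store by subject name.
        factory.Credentials.ClientCertificate.SetCertificate(
            StoreLocation.CurrentUser, StoreName.My,
            X509FindType.FindBySubjectName, "TestClientCert");

        ITestService channel = factory.CreateChannel();
        Console.WriteLine(channel.Ping());
    }
}

The app.config equivalent is an endpoint behavior containing a <clientCredentials><clientCertificate findValue="..." storeLocation="..." storeName="..." x509FindType="..."/> element, which is essentially what "including the certificate name in the bindings" amounts to.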
Now the real problem. The tech team is going to create a certificate in the "cert store" on the server, and I need to test again to make sure everything works as expected. We have a few developers who all want to test their local code on their own machines. Can we all somehow use the same certificate? I don't think we would be allowed to import the certificate, but what could I suggest so that all of us can use the same certificate?
I'm also confused about things like the difference between a Windows certificate and an IIS certificate. What advantages would the IIS certificate provide?
Thanks for help!
Edit: Could one of the differences with installing on IIS be that the hosted sites can then be accessed over an SSL connection? That would mean we don't really need to install it in IIS if it's just a client certificate. Is this correct?

Related

Running ASP.Net Web Api inside intranet - cert authority invalid

I made a simple API which is running on my "server" inside the local network. When I call the API from a browser on another PC in my intranet over https://xxx.xxx.xxx.xxx:9100/ShowList, I get a warning from the browser that my connection is not private: NET::ERR_CERT_AUTHORITY_INVALID. I can proceed as unsafe, but I would like to avoid that. The question is: what do I have to do to have a safe connection inside my intranet? I'm totally new to this, so I'm not aware of the things I have to do.
You just need to add an SSL certificate to the hosting IIS server (assuming you are using IIS as the server). You can create a self-signed certificate, buy one from a vendor, or use a free service. Take a look at:
https://learn.microsoft.com/en-us/iis/manage/configuring-security/how-to-set-up-ssl-on-iis
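If you prefer to script the self-signed certificate instead of using the IIS Manager steps in the linked article, something like the following works on recent .NET (4.7.2+ or .NET Core); it's only a sketch, and the subject name, file name and password are placeholders. The resulting .pfx can then be imported in IIS, and its public part trusted on the client machines so the browser warning goes away.

using System;
using System.IO;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

public static class MakeDevCert
{
    public static void Main()
    {
        using (RSA rsa = RSA.Create(2048))
        {
            // The subject should match the name the clients use to reach the server.
            var request = new CertificateRequest(
                "CN=intranet-server", rsa, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);

            X509Certificate2 cert = request.CreateSelfSigned(
                DateTimeOffset.UtcNow, DateTimeOffset.UtcNow.AddYears(1));

            // Export with the private key so it can be imported into IIS.
            File.WriteAllBytes("intranet-dev.pfx", cert.Export(X509ContentType.Pfx, "changeit"));
        }
    }
}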

Accessing HTTPS content from out-of-browser Silverlight 4 applications

I am accessing some of the local machine's resources using the COM interop functionality provided in Silverlight 4.0, so naturally I need OOB (out-of-browser) with elevated permissions. However, in my case I am consuming WCF services hosted on an HTTPS channel, and this is where I am facing the problem. With OOB and elevated permissions applied, I cannot consume the HTTPS service hosted on either a different or the same domain; I get a NotFound exception. Please note that I have used a self-signed certificate for the development environment, and it is also installed in the Trusted Root folder of the client machine on which I am testing.
Interestingly, when I set the Fiddler options (in a Fiddler session, Tools -> Fiddler Options -> HTTPS tab) to intercept HTTPS traffic, with the Decrypt HTTPS traffic checkbox set, I am able to use the same HTTPS service without any exception. But for that, Fiddler told me to store a temporary certificate inside my user profile's Fiddler directory, and I must have at least one Fiddler session open at the time. Hence, it seems to be a certificate issue. But does it relate in any way to signing the XAP file with the required certificate? I am not sure. I tried with a self-signed certificate and bound my service URL to use that certificate, then installed the same certificate into the Trusted Root folder of the client, but I was not successful in signing the XAP with that certificate.
Please let me know if you have any work-around.
If the code is running in a different user's context, you need to put your self-signed certificate into the machine-wide Trusted Root store. Start mmc.exe, choose Add/Remove Snap-in from the File menu, add the Certificates snap-in, and pick Local Machine. Then import the self-signed root into the Trusted Root Certification Authorities store.
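The same import can be scripted if it needs to be repeated on several machines (a sketch; the .cer file name is a placeholder, and it must run elevated):

using System.Security.Cryptography.X509Certificates;

public static class TrustRootCert
{
    public static void Main()
    {
        var root = new X509Certificate2("selfsigned-root.cer");

        // LocalMachine\Root is the machine-wide Trusted Root Certification Authorities store.
        var store = new X509Store(StoreName.Root, StoreLocation.LocalMachine);
        store.Open(OpenFlags.ReadWrite);   // requires administrator rights
        store.Add(root);
        store.Close();
    }
}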
I had the same problem and found out that the SSL settings in IIS were wrong.
I had configured IIS 7.5 to SSL only and to accept client certificates. With these settings, I ended up with the service-not-found error in OOB. After setting IIS to ignore client certificates, the OOB application works fine.
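If you want to script that IIS change instead of clicking through IIS Manager, the Microsoft.Web.Administration API can set the same flags (a sketch; the site/application path is a placeholder, and it has to run elevated):

using Microsoft.Web.Administration;

public static class SetSslFlags
{
    public static void Main()
    {
        using (var serverManager = new ServerManager())
        {
            Configuration config = serverManager.GetApplicationHostConfiguration();

            // "Default Web Site/MyService" is a placeholder for the site/application path.
            ConfigurationSection access = config.GetSection(
                "system.webServer/security/access", "Default Web Site/MyService");

            // "Ssl" = require SSL but ignore client certificates;
            // "Ssl,SslNegotiateCert" accepts them; "Ssl,SslRequireCert" requires them.
            access["sslFlags"] = "Ssl";
            serverManager.CommitChanges();
        }
    }
}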

Creating a web service that requires client certificates

I am currently working on a project that has the following components (all .NET 2.0)
Client Application
Web Service Invocation API
Web Service
In summary, the Client Application creates an instance of the API and this calls the Web Service. Nice and simple, and this all works exactly as I want it to.
The next stage of the project was to secure the Web Service with SSL. So I have created a "Self Signed CA" and from this signed a server certificate for IIS. Again, nice and simple and this all works exactly as I want it to.
The next stage of the project is to secure the Web Service by requiring the invoker to supply a client certificate. So I have created a client certificate (via the Self Signed CA). I am then adding this to the Web Service invocation call in the API:
WSBridge.Processor processor = new WSBridge.Processor();        // generated web service proxy
processor.Url = this.endpoint;                                   // HTTPS endpoint of the service
processor.ClientCertificates.AddRange(this.clientCertificates);  // attach the client certificate(s)
processor.Timeout = (int)Settings.Default["DefaultTimeout"];     // timeout in milliseconds
In debug I can see that this.clientCertificates contains the certificate I created. So in theory it is being presented to the web server.
However, when I attempt to call the Web Service I get the following exception in the API:
The request failed with HTTP status 403: Forbidden.
Fairly self-explanatory, but I have no idea what is causing the problem.
Other relevant information:
In my dev environment, the Client, API & Web Service are all running on the same machine
If I attempt to access the Web Service Description in IIS I get the following error (I am not prompted to choose a client certificate):
HTTP Error 403.7 - Forbidden
The page you are attempting to access requires your browser to have a Secure Sockets Layer (SSL) client certificate that the Web server recognizes.
The client certificate is loaded into the Personal store for the current user, the CA root is in trusted root for the local machine and current user.
If I switch off "Require SSL" and set "Client Certificates" to Accept in IIS, I can make my request. However, when I look at HttpContext.Current.Request.ClientCertificate.Count in the Web Service, it comes back as 0.
I need to be able to run my development environment with client certificates, as portions of the service code use the CN of the client certificate to perform various actions (see the sketch after this list). I could hack it in, but it would be nice to be able to do a real end-to-end test.
All the certificates mentioned here were generated using OpenSSL. I am developing on Windows 7, so I do not have the facility to install a Microsoft CA.
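For context, reading that CN on the service side looks roughly like this (a sketch with a hypothetical helper method, not the actual service code):

using System.Security.Cryptography.X509Certificates;
using System.Web;

public static class ClientCertHelper
{
    // Returns the CN of the caller's client certificate, or null if none was presented.
    public static string GetCallerCn()
    {
        HttpClientCertificate clientCert = HttpContext.Current.Request.ClientCertificate;
        if (!clientCert.IsPresent)
            return null;

        var cert = new X509Certificate2(clientCert.Certificate);
        return cert.GetNameInfo(X509NameType.SimpleName, false);  // the subject CN
    }
}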
So, does anybody have any ideas as to the cause of this problem?
As an aside (not worth creating a new question for this): for some reason, when I enable SSL for the Web Service, Visual Studio is no longer able to debug the service.
EDIT : Some more information
The client certificate has an intended purpose of <All>
Although I am working on localhost, the server certificate for the web server was issued to devserver.xyz.com, so I have changed my hosts file to point that name to localhost. As such, I can now browse (with client certs switched off in IIS) to my service descriptor page without seeing any SSL certificate warnings.
Well, I have solved the problem. In summary, it was due to the format of the client certificate: it should have been PKCS12.
More Detail
Although the MMC Certificates snap-in was showing the client certificate in the Personal store for the current user, I noticed that when viewing the same store via Internet Explorer (Tools -> Internet Options -> Content -> Certificates) the certificate was not present.
After a little Googling it seems that IE will only accept the PKCS12 format for client certificates, so I converted the certificate with the following OpenSSL command:
openssl pkcs12 -export -in client_alpha.cer -inkey client_alpha.key -out client_alpha.p12
I then imported the .p12 file into IE, which allowed me to browse to the Web Service description page with full client/server certificate TLS.
Once I had made this change, I retried my client application and it now works as well. This is because, like IE, the Windows certificate store will only present a client certificate that was imported together with its private key, which is exactly what the PKCS12 file bundles.
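If you would rather load the converted file straight from disk instead of going through the certificate store, the invocation code from the question can take it like this (a sketch; the file name, password and URL are placeholders):

using System.Security.Cryptography.X509Certificates;

// Load the PKCS12 bundle (certificate + private key) produced by the openssl command above.
var clientCert = new X509Certificate2("client_alpha.p12", "password");

WSBridge.Processor processor = new WSBridge.Processor();
processor.Url = "https://devserver.xyz.com/Processor.asmx";   // placeholder endpoint
processor.ClientCertificates.Add(clientCert);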

SSL Certificate only works within Local Network

I am running Windows Server 2003 Standard and have installed the SSL cert as per GoDaddy's instructions. When attempting to access the website securely from outside our network, the page does not load. Let me know what information you need from me. Thanks in advance!
Although it would help if you provided more information (like what error the clients are getting), I’m going to guess that you are missing the intermediate certificates that GoDaddy uses. These need to be installed on the server where the SSL certificate is installed.
Follow the procedure here.
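One quick way to see what certificate chain the server is actually sending (a rough diagnostic sketch; the host name is a placeholder) is to open a TLS connection and dump the chain from the validation callback. If only the leaf certificate shows up and there are policy errors, the intermediates are missing:

using System;
using System.Net.Security;
using System.Net.Sockets;
using System.Security.Cryptography.X509Certificates;

public static class ChainCheck
{
    public static void Main()
    {
        const string host = "www.example.com";   // placeholder: your site's host name

        using (var tcp = new TcpClient(host, 443))
        using (var ssl = new SslStream(tcp.GetStream(), false,
            (sender, certificate, chain, errors) =>
            {
                Console.WriteLine("Policy errors: " + errors);
                foreach (X509ChainElement element in chain.ChainElements)
                    Console.WriteLine(element.Certificate.Subject);
                return true;   // diagnostic only: accept regardless of errors
            }))
        {
            ssl.AuthenticateAsClient(host);
        }
    }
}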

Google Chrome doesn't trust mitmproxy's certificates

I'm running mitmdump (from mitmproxy) on my Macbook Pro, and I'm connecting to the proxy through my Windows desktop PC.
However, Chrome (running on the PC) refuses to connect to so many sites because of the invalid certificates which mitmproxy provides.
Chrome throws the error: NET::ERR_CERT_AUTHORITY_INVALID
But why? What's wrong with mitmproxy's certificates? Why can't it just send back Google's, as if nothing happened?
I'd like to know how I can fix this and make (force) my desktop PC to connect to any website through my Macbook's mitmproxy.
Answering this question for people who may find it useful now. To get the proxy working, you have to add the certificate as trusted in your browser.
For windows follow this: https://www.nullalo.com/en/chrome-how-to-install-self-signed-ssl-certificates/2/
For linux follow this: https://dev.to/suntong/using-squid-to-proxy-ssl-sites-nj3
For Mac-os follow this: https://www.andrewconnell.com/blog/updated-creating-and-trusting-self-signed-certs-on-macos-and-chrome/#add-certificate-to-trusted-root-authority
There are some additional details in the links above; tl;dr: import the certificate via the chrome://settings page and mark it as trusted. That should do it.
This will make your browser trust your self-signed certificate (mitmproxy's auto-generated certificates too).
mitmproxy's default certificates are in the ~/.mitmproxy/ directory.
Per the Getting Started page of the docs, you add the CA by going to http://mitm.it while mitmproxy is running and selecting the operating system that you are using. This should solve your problem and will allow HTTPS sites to work with mitmproxy.
This is the expected behavior.
mitmproxy performs a man-in-the-middle attack on HTTPS connections by presenting on-the-fly generated fake certificates to the client, while it keeps communicating with the server over a fully encrypted connection using the real certificates.
This way the communication between client and proxy can be decrypted. But the client has to actively approve using those fake certificates.
If that wasn't the case then SSL would be broken - which it isn't.
The whole story is very well explained here:
http://docs.mitmproxy.org/en/stable/howmitmproxy.html
