I am writing a test, and Windows prompts me to choose which certificate to use when using X.509 client certificates.
I would like to have it automatically select one without user interaction (user interaction is bad when writing tests).
The problem is that there seems to be no documentation on MSDN about how this works; could someone at least point me towards an answer?
You can have the server request a client certificate signed by a specific CA.
If only one locally installed certificate is issued by that CA, this should suppress the prompt.
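If the test issues the HTTPS requests itself (rather than driving a browser), another way around the prompt is to pick the client certificate explicitly in code. A minimal sketch in PowerShell, assuming the certificate lives in the CurrentUser\My store; the issuer name and URL are placeholders:

# Select the client certificate explicitly so Windows never has to ask.
$clientCert = Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.Issuer -eq 'CN=Test Client CA' } |
    Select-Object -First 1

# Pass it with the request instead of relying on the certificate picker.
Invoke-WebRequest -Uri 'https://testserver.example.com/api' -Certificate $clientCert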
I have a Windows image that uses the PowerShell execution policy AllSigned
I also have a PowerShell script that is signed by a signing certificate issued by an internal CA
The certificate of the issuing CA is installed/trusted on the target machine
When I sign the PowerShell script, I am including the full certificate chain (IncludeChain = 'All')
The certificate chain looks like this:
|- Issuing CA Certificate
   |- Signing Certificate
The PowerShell script is signed by the Signing Certificate, but we are not installing that certificate on our target machine. We are only installing the Issuing CA Certificate into the Trusted Publishers and other certificate stores.
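For reference, the signing step is along these lines (a sketch; the certificate subject and script path shown here are placeholders):

# Find our code-signing certificate (the subject filter is a placeholder).
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert |
    Where-Object { $_.Subject -like '*Signing Certificate*' }

# Sign the script and embed the full chain, as described above.
Set-AuthenticodeSignature -FilePath .\Deploy.ps1 -Certificate $cert -IncludeChain All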
This method works when we sign our custom application binaries using the Signing Certificate (we use Windows Defender Application Control to ensure that any applications running on our target are signed by trusted publishers), but it does not work when running PowerShell scripts.
According to this MSDN community post, PowerShell uses Known Publisher rules, which state that the Signing Certificate itself has to be in the Trusted Publishers certificate store.
This is not ideal, as the signing certificate we use to sign the PowerShell scripts is not something we want to ship out, nor will it be valid anymore by the time our product ships.
I understand that if I use a timestamp server when signing the PowerShell scripts, the signature will still be considered valid as long as it was generated within the validity window of the signing certificate, but this is not our preferred solution.
Ideally I would like to know if it is possible, and how, to have PowerShell use the Issuing CA Certificate to validate the signed PowerShell scripts. (i.e. Known Issuer rules)
In the case that it's not possible, I would like to know why Microsoft departed from the practice of allowing you to validate signatures without explicitly trusting the signing certificate (i.e. using the issuing certificate to validate it).
Any help would be greatly appreciated.
There is a difference between deployment and the running of PowerShell scripts.
The confusion is that Windows Defender Application Control can use code signing in two different ways, for two different reasons, and PowerShell has only ever supported one of them. Windows Defender Application Control can use code signing:
With a trusted Issuing CA Certificate to authenticate applications. This is the situation when your company wants to share many internal applications among all employees easily. It is also used for "trusted" Microsoft Store applications.
Because "normal" people don't blindly trust all applications and generic certificate authorities, you can instead deploy based off of trusting just the Signing Certificate for verification. (see Catalog Files). This is so that you could deploy applications with certificates that may not have an accessible CA. e.g. if you singed the application with an internal CA and you sell it to another company, or if you are using a self signed certificate.
Windows Defender Application Control's primary purpose is application deployment/control, and a byproduct is that it can handle PowerShell scripts as well. Most "normal" applications can run with "invalid" or "broken" certificates with incomplete certificate chains. The certificate was only used to control the distribution of the code and to validate that the application was not tampered with/changed; it has nothing to do with the active "running" of the code.
PowerShell, on the other hand, when running with AllSigned, always validates the entire chain before running. It doesn't care about the distribution, it cares about what it runs. That means the entire certificate chain needs to be present and trusted on the running machine. Yes, that means that if you sign with an internal CA, both the Issuing CA Certificate and the Signing Certificate need to be distributed to and trusted by the running party.
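You can see this directly on the target machine with Get-AuthenticodeSignature; a quick sketch (the script path is a placeholder):

# Inspect the signature; Status is Valid only when the whole chain is
# present and trusted on this machine.
$sig = Get-AuthenticodeSignature -FilePath .\Deploy.ps1
$sig | Format-List Status, StatusMessage, SignerCertificate

# Check whether the signing certificate itself is in Trusted Publishers.
Get-ChildItem Cert:\LocalMachine\TrustedPublisher |
    Where-Object { $_.Thumbprint -eq $sig.SignerCertificate.Thumbprint }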
This leads you to 3 options:
Self-signed certificate - This is ok for personal/development projects, and marginally better than distributing unsigned code (a minimal sketch follows this list).
Internal CA certificate - This is ok for internal projects. In this case, yes, you would have to share the entire certificate chain if you wanted to distribute.
Global/Public CA Certificate - This is the recommended method if you are distributing publicly/commercially. The public CAs are trusted, and code signing certificates can be bought from places like DigiCert and can last 3 years. I know, for me, I would feel much more comfortable running code signed by DigiCert over having to mess with internal or self-signed certificates.
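For option 1, the usual development setup looks roughly like this (a sketch, run elevated; the subject name and file paths are placeholders, and the certificate is only trusted on machines where you import it):

# Create a self-signed code-signing certificate (development use only).
$cert = New-SelfSignedCertificate -Type CodeSigningCert `
    -Subject 'CN=Dev PowerShell Signing' `
    -CertStoreLocation Cert:\CurrentUser\My

# Export it and trust it locally (Trusted Root + Trusted Publishers),
# which is what AllSigned requires before a signed script will run.
Export-Certificate -Cert $cert -FilePath .\DevSigning.cer | Out-Null
Import-Certificate -FilePath .\DevSigning.cer -CertStoreLocation Cert:\LocalMachine\Root | Out-Null
Import-Certificate -FilePath .\DevSigning.cer -CertStoreLocation Cert:\LocalMachine\TrustedPublisher | Out-Null

# Sign a script with it.
Set-AuthenticodeSignature -FilePath .\MyScript.ps1 -Certificate $cert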
That doesn't make sense.
If the internal root CA certificate is in your trusted root CA store, the intermediate is in your trusted intermediate store, and a PS script is signed by a certificate with a trusted chain back to that root, then the signature on the script should be trusted.
There should be no difference between an internal trusted CA and a public trusted CA.
If anything, a code signing certificate issued by an internal CA is more trustworthy than a publicly signed one. Internal processes and controls mean that only genuinely trusted internal publishers can get one, unlike a public code signing cert bought for a small amount.
I'm fairly new to the whole certificate shebang and not a well-versed Linux admin.
In our company, we run a Windows domain, but we also have some CentOS servers for different services.
On one of said servers we have our ticket system, which is browser based. I want to secure it with a certificate signed by our Windows root CA, but no matter what I do, the certificate is shown as invalid in the browser.
Funnily enough, both certificates in the chain (CA -> server) are shown as valid.
I already did the following:
started the certificate process from scratch
tried different certificate formats (.cer, .pem)
verified server cert with root cert
checked validity with openssl (OK)
checked SSL connection with openssl, no issues
added root cert to Linux server trusted CA store
recreated cert chain (of 2)
restarted Apache over and over
reset browser cache
tried different browser
checked DNS entries
checked if the root CA is trusted in Windows (it is)
manually installed server cert in my browser
Both the server cert and the root cert show up as valid in the browser, with the correct relation.
I'm completely lost here. Is there some key step I'm missing that none of the ~30 guides I read mentioned?
Any help is greatly appreciated
Your question is missing some information:
Did you check the SSL connection from outside the server?
Did you verify the RootCA cert is inside the cert-store of the server (sometimes it is rejected without error messages)?
I would check the reason for rejecting the certificate in the browser (Firefox is usually more informative than Chrome), and look for the error code.
Reasons can be (some of which you have already verified):
Wrong certificate properties (missing the required values in the "usage" attribute)
Wrong domain name
Expired certificate
Certificate could not be verified on the client-side
See this image as an example of an error code:
https://user-images.githubusercontent.com/165314/71407838-14f55a00-2634-11ea-8a30-c119d2eb1eb1.png
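For the "check from outside the server" part, something along these lines usually shows the real reason quickly (a sketch; the host name and exported root CA file are placeholders, and it assumes openssl is available on the client you run it from):

# Run from a machine other than the web server itself.
& openssl s_client -connect tickets.example.local:443 -servername tickets.example.local -CAfile .\WindowsRootCA.pem -showcerts

# The "Verify return code" at the end should be "0 (ok)". A code such as
# 21 ("unable to verify the first certificate") usually means the server is
# not sending the full chain, even though each certificate looks valid on
# its own.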
I want my site to be secure using HTTPS. I managed to make a self-signed key a trustedCertEntry: I created my own CA certificate, with a different CN, which I used to sign my own private certificate.
It works smoothly when testing it with openssl, with something like:
openssl s_client -connect www.mydomain.com:80 -tls1 -state
Thus, the browser doesn't report a self-signed certificate error, as it sees a different CA.
But I get a SEC_ERROR_UNKNOWN_ISSUER error. Still, that seems logical to me, as nobody knows me as a CA. It is supposed to work if the user adds an exception for me.
I thought this trick was acceptable and that this is how many HTTPS-compliant sites work: you may visit an unknown site and want to encrypt communications from third-party watchers while still trusting that page.
After trying to get a clear answer for this (beyond the coding itself, for which I will find resources), my question is:
If I want a site for which users don't have to add an exception on their first visit, do I have to get a certificate from a "world-known" CA? Or am I missing a solution for signing my certificate with my own CA certificate?
Technically speaking, the answer is: yes, you will have to get a certificate from a CA that is trusted by your users' browsers via a chain of intermediate CAs that ends at an inherently trusted root CA. The accepted answer to this question explains how it works: SSL Certificate framework 101: How does the browser actually verify the validity of a given server certificate?
Having said that, if your "only" concern is to provide encrypted connections, you might be able to leverage the Let's Encrypt CA, which provides free certificates for that purpose. Those certificates are only domain-validated, which provides a weaker kind of identity assurance than, for example, an Extended Validation certificate.
Depending on the browser used, there will be minimal difference in user experience between DV and EV certificates. In Safari, for example, the user sees a grey padlock in the address bar for the lower-assurance DV-backed sites, and a green padlock when higher identity assurance is provided.
Whether the former is good enough for you (or your customers) depends on your situation.
In case you want to understand what "inherently trusted" actually means for web browsers, see this blog post: Who your browser trusts, and how to control it.
Goal: I would like to run my custom PowerShell scripts that are signed with a valid certificate against target machines with their PowerShell execution policy set to “AllSigned” without having to install another certificate on the target machine.
Problem: The PowerShell scripts will not run until I install the public key of the certificate I used to sign the scripts as a trusted publisher (let's call it MyCert.cer) on the target machine.
Possible Misunderstanding: I could be misunderstanding the way code signing works with my “problem” above. However, I was under the impression that since Windows comes with DigiCert certificates installed by default as “Trusted Root Certification Authorities” (see image below), all I would need for my signed scripts to work is to sign them with a certificate issued by a DigiCert authority.
My Certificate details:
I purchased a code signing certificate from DigiCert. The certificate is valid and has an “EKU” of “Code Signing (1.3.6.1.5.5.7.3.3)”.
Certificate chain:
Final Thoughts: I signed the powershell scripts using the cmdlet “Set-AuthenticodeSignature” and my issued code signing certificate. The scripts will run if I install the MyCert.cer public key as a “Trusted Publisher” on the target machine. However, I would like to not have to touch the target machine and be able to run my code signed scripts against said machine. Is this possible? Have I purchased the wrong certificate for my goal? Or is an entry to the Trusted Publishers certificate store required for running code signed scripts?
Thank you for your time.
Update: Here is the command I used to sign the PowerShell scripts.
Set-AuthenticodeSignature -Certificate $cert -FilePath $FileToSign -IncludeChain all
I wanted to let everyone know that I did include "All" for -IncludeChain. I have also tried using DigiCert's timestamp server for the -TimestampServer parameter; however, adding the timestamp does not make a difference for running the script. As I understand it, the -TimestampServer parameter is so a signature remains valid after the certificate expires. However, the certificate I am using is still current and not expired.
You are running into the intended behavior of the AllSigned Execution Policy. From Get-Help about_Execution_Policies you will see:
AllSigned
- Scripts can run.
- Requires that all scripts and configuration files
be signed by a trusted publisher, including scripts
that you write on the local computer.
The short answer is that you'll need to trust your cert on all your computers (the easy way to do this is with Group Policy). The Group Policy Object that you'll write will modify Computer Configuration\Windows Settings\Security Settings\Public Key Policies\Trusted Publishers and then you'll need to follow the instructions in the Certificate Import Wizard. The key thing here is that the certificate can be traced back to the Trusted Root Certification Authorities in your organization, so it's a very good thing that you bought a Digicert certificate and that your organization trusts Digicert certs.
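If you only need it on a handful of machines, or want to test before writing the Group Policy Object, the same import can be done from PowerShell; a minimal sketch, assuming MyCert.cer from the question is the exported public key:

# Run elevated on the target machine.
Import-Certificate -FilePath .\MyCert.cer -CertStoreLocation Cert:\LocalMachine\TrustedPublisher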
So why does DigiCert show up under Trusted Root Certification Authorities? The answer here is pretty simple. It means that your organization recognizes DigiCert certificates and allows them to be trusted. This doesn't mean that every single DigiCert certificate automatically gets a pass; it just means that they are allowed to be installed in your domain. I am gonna pick on Comodo, since they're close alphabetically and they don't show up in your screenshot. Because Comodo also gives out digital certificates, if I were to sign my scripts with my Comodo cert and try to install that certificate across your domain, it wouldn't stick, since Comodo is not a Trusted Root Certification Authority in your domain.
I hope that helps explain what's going on!
I'm in a situation where I need to deploy around 200 SSL certificates to various devices around our Agency (HP iLO - such joy they bring...). At present I have a PowerShell script that obtains a CSR from the iLO device, but I now need to be able to sign this with our CA in an automated manner so I can pass it back to the script and import it into the device.
We are using Microsoft Certificate Services and I can download the CA Certificate, Certificate Chain, or CRL if required.
I do not really care how I get the signed certificate - what I mean is, if PowerShell needs to call an external app to get it signed, then that's fine. I've had a look at makecert.exe in the .NET SDK, but it does not appear to do the job.
I have the feeling it may be possible using OpenSSL - If someone could enlighten me, that would be great.
Cheers
Having dealt with a Microsoft engineer this morning, the most graceful solution for doing this with existing infrastructure is certreq.exe. Once you have a CSR, you can run this command to obtain a certificate from the MS CA:
certreq.exe -attrib "CertificateTemplate:WebServer" infile.csr outfile.cer
From there, you can read the certificate back with Get-Content outfile.cer to feed into the script.
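Wrapped in PowerShell, the round trip for each device might look roughly like this (a sketch; the CA config string, template name, and file paths are placeholders for your environment):

# $csr holds the CSR text already obtained from the iLO device.
$csr | Set-Content -Path .\ilo.csr -Encoding Ascii

# Submit the request to the Microsoft CA; -config targets a specific CA
# so certreq does not prompt for one.
& certreq.exe -submit -attrib "CertificateTemplate:WebServer" `
    -config "caserver.example.local\Example Issuing CA" .\ilo.csr .\ilo.cer

# Read the issued certificate back to push it to the iLO device.
$signedCert = Get-Content -Path .\ilo.cer -Raw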
Cheers
This article contains the steps to create a CA and sign a CSR from IIS:
Creating a Self-Signed Certificate using OpenSSL for use with Microsoft Internet Information Services (IIS) 5
You can export the CA into a format OpenSSL can read, and follow the steps after "Sign the Certificate Request".
Here's a FAQ for OpenSSL on managing a CA with the tool.
http://www.gtlib.cc.gatech.edu/pub/linux/docs/HOWTO/other-formats/html_single/SSL-Certificates-HOWTO.html
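If you do go the OpenSSL route, the signing step itself is roughly this (a sketch; it assumes you have exported the CA certificate and its private key to PEM files, which requires the CA key to be exportable, and all file names are placeholders):

# Sign an incoming CSR with the exported CA certificate and key.
& openssl x509 -req -in device.csr -CA ca.pem -CAkey ca.key -CAcreateserial -out device.cer -days 365 -sha256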
I've used OpenSSL successfully to create and manage a root CA, sign CSRs, and generate certificates.
PowerShell cannot handle this natively, but it can definitely script and drive the whole process.
-Oisin