Windows / Perl / Net::SSLeay / OpenSSL: What locations are CA certificates loaded from?

Here's a program that does an HTTPS request, with some code at the start that I'm going to explain below:
use 5.012;
use LWP::UserAgent;
use HTTP::Request::Common;
use Net::SSLeay;

BEGIN {
    return unless $^O eq 'MSWin32';   # only needed on Windows
    print STDERR "attempting to set HTTPS_CA_FILE to PEM file path\n";
    require Mozilla::CA;              # load module to determine PEM file path
    my $pemfile = do {
        my $path = $INC{'Mozilla/CA.pm'};
        $path =~ s#\.pm$#/cacert.pem#;
        $path;
    };
    if ( -f $pemfile ) {
        $ENV{HTTPS_CA_FILE} = $pemfile;
        print STDERR "HTTPS_CA_FILE set to $pemfile\n";
    }
    else {
        warn "PEM file $pemfile missing";
    }
} # ==========================================================================
$Net::SSLeay::trace = 2;

my $ua  = LWP::UserAgent->new;
my $req = GET 'https://client.billsafe.de/';
my $rsp = $ua->request( $req );

say $rsp->is_success ? 'success' : 'failure';
say $rsp->status_line;
say '=================';
say substr $rsp->decoded_content, 0, 200;
say '=================';

# possibly relevant module versions
for ( qw/Net::SSLeay Crypt::SSLeay LWP::Protocol::https Mozilla::CA/ ) {
    no strict 'refs';
    say $_, "\t", ${"${_}::VERSION"};
}
The code at the beginning sets the environment variable HTTPS_CA_FILE to the path of the PEM file cacert.pem that ships with Mozilla::CA, which is the file that gets loaded by default anyway (I checked with procmon.exe; the file is fully read by default).
The reason for this apparently redundant setting is that we have some Windows machines (Windows Server 2008) where the SSL setup fails with certificate verify failed unless the environment variable is set. Why this happens is a mystery to us, and it works fine on other Windows machines with identical versions of Net::SSLeay, LWP::Protocol::https and Mozilla::CA.
Our module versions are:
Net::SSLeay 1.36
Crypt::SSLeay -/-
LWP::Protocol::https 6.02
Mozilla::CA 20110409
Now the question: are there other places, apart from cacert.pem, from which root certificates are loaded in this combination (Windows, Perl, Net::SSLeay)? If so, what are they? Where can I read up on this?
Update
The OpenSSL docs do not mention any certificate store other than a plain file and a plain directory:
SSL_CTX_set_cert_store(3)
SSL_CTX_load_verify_locations(3)
The Windows C API functions used to open the system certificate store are the following:
CertOpenStore
CertOpenSystemStore
I checked out the OpenSSL HEAD from CVS. The CertOpenStore function is indeed used in engines/e_capi.c. I haven't investigated further to find out how the OpenSSL versions on the servers in question access a store.
A web search shows that several people have wondered whether OpenSSL can access the Windows certificate store directly, or have proposed patching OpenSSL accordingly. There is also a recent issue on the TortoiseSVN list (Windows Certificate Store / OpenSSL CAPI). More research is needed to find out what is going on here.
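As an aside, you can inspect what the Windows system store (the one CertOpenSystemStore opens) actually contains on an affected machine with PowerShell's built-in Cert: drive. A minimal sketch, useful for comparing a working machine against a failing one:

# List the machine-wide trusted root CAs (the "ROOT" system store)
Get-ChildItem Cert:\LocalMachine\Root |
    Sort-Object Subject |
    Select-Object Subject, Thumbprint, NotAfter

# The current user's root store can differ from the machine's; compare the counts
(Get-ChildItem Cert:\CurrentUser\Root | Measure-Object).Count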

Since LWP 6.00 you can pass an ssl_opts hashref to new, specifying the certificate files among other options:
my $ua = LWP::UserAgent->new(
    ssl_opts => {
        verify_hostname => 1,
        SSL_cert_file   => $ssl_cert_file,
        SSL_key_file    => $ssl_key_file,
    },
);

Related

HTTPS communication in Tcl for Windows

I want to download a file that resides on a server. I set a token and download the data:
package require http
package require twapi_crypto

http::register https 443 [list ::twapi::tls_socket]
set token [http::geturl "https://$ipaddress/filename"]
set status [http::status $token]
set data [http::data $token]
puts $data
When I run this code, it gives the following error:
The certificate chain was issued by an authority that is not trusted
    while executing
"InitializeSecurityContext $Credentials $Handle $Target $Inattr 0 $Datarep $inbuflist 0"
    (procedure "sspi_step" line 16)
    invoked from within
"sspi_step $SpiContext $data"
    (procedure "_negotiate2" line 19)
    invoked from within
"_negotiate2 $chan"
    (procedure "rethrow" line 2)
    invoked from within
"rethrow"
    invoked from within
"trap {
    _negotiate2 $chan
} onerror {} {
    variable _channels
    if {[info exists _channels($chan)]} {
        dict set _chan..."
    (procedure "_negotiate" line 3)
    invoked from within
"_negotiate $chan"
    (procedure "::twapi::tls::_so_write_handler" line 12)
    invoked from within
"::twapi::tls::_so_write_handler rc0"
I don't know what I am doing wrong here. Similar code implemented on Linux using the tls package works fine. All of this is packaged as an exe, built with a starkit.
Is there any other way to use an HTTPS connection on Windows apart from twapi?
I have also tried the tls package on Windows, but http::ncode returns an empty value there. Could this be because SSL is not installed on the Windows machine?
PS: a similar error can be found in TWAPI Bug #154, but no resolution is given there.
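The trace shows the failure happening inside InitializeSecurityContext, i.e. twapi's tls_socket authenticates the server through SSPI/Schannel, which only trusts certificate authorities present in the Windows certificate store. As a first diagnostic you can check from PowerShell whether the issuing CA is installed; a sketch, where 'Example CA' is a placeholder for the issuer named in the server's certificate:

# Search the machine-wide trusted root store for the issuing CA
Get-ChildItem Cert:\LocalMachine\Root |
    Where-Object { $_.Subject -like '*Example CA*' }

If nothing comes back, importing the CA certificate into the Windows store (or fixing the server's certificate chain) is the likely fix.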

FTP: copy, check integrity and delete

I am looking for a way to connect to a remote server with ftp or lftp and perform the following steps:
Copy files from the FTP server to my local machine.
Check that the downloaded files are intact (e.g. with an MD5 checksum).
If the download was fine, delete the downloaded files from the FTP server.
This routine will be executed each day from my local machine. What would be the best option to do this? Is there a tool that abstracts all three steps?
I am running Linux on both client and server machines.
Update: Additionally, I also have a text file that contains the association between the files on the FTP server and their MD5 sums. They were computed on the FTP server side.
First, make sure your remote server supports checksum calculation at all. Many do not. I believe there is not even a standard FTP command to calculate the checksum of a remote file. There have been many proposals and there are many proprietary solutions.
The latest proposal is:
https://datatracker.ietf.org/doc/html/draft-bryan-ftpext-hash-02
So even if your server supports checksum calculation, you have to find a client that supports the same command.
Some of the commands that can be used to calculate a checksum are XSHA1, XSHA256, XSHA512, XMD5, MD5, XCRC and HASH.
You can test this with WinSCP, which supports all of the commands mentioned above. Test its checksum calculation function or the checksum scripting command. If they work, enable logging and check what command and what syntax WinSCP uses against your server.
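For example, a scripted probe with WinSCP's console interface might look like this (a sketch only; the URL and path are placeholders, and the server may reject the command entirely):

winscp.com /log=checksum.log /command ^
    "open ftp://username:password@example.com/" ^
    "checksum sha-1 /remote/path/file.dat" ^
    "exit"

The session log (checksum.log) should then show which raw FTP command (XSHA1, HASH, ...) was sent and how the server answered.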
Neither ftp (whether the Windows or the *nix version) nor lftp supports checksum calculation, let alone automatic verification of a downloaded file.
I'm not aware of any other client that can automatically verify a downloaded file either.
You can definitely script it with the help of some feature-rich client.
I wrote this answer before the OP specified that he/she is on Linux. I'm keeping the Windows solution below in case it helps someone else.
On Windows, you could script it with PowerShell using the WinSCP .NET assembly:
param (
    $sessionUrl = "ftp://username:password@example.com/",
    [Parameter(Mandatory)]
    $localPath,
    [Parameter(Mandatory)]
    $remotePath,
    [Switch]
    $pause = $False
)

try
{
    # Load WinSCP .NET assembly
    Add-Type -Path (Join-Path $PSScriptRoot "WinSCPnet.dll")

    # Set up session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.ParseUrl($sessionUrl)

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.Open($sessionOptions)

        Write-Host "Downloading $remotePath to $localPath..."
        $session.GetFiles($remotePath, $localPath).Check()

        # Calculate remote file checksum
        $buf = $session.CalculateFileChecksum("sha-1", $remotePath)
        $remoteChecksum = [BitConverter]::ToString($buf)
        Write-Host "Remote file checksum: $remoteChecksum"

        # Calculate local file checksum
        $sha1 = [System.Security.Cryptography.SHA1]::Create()
        $localStream = [System.IO.File]::OpenRead($localPath)
        $localChecksum = [BitConverter]::ToString($sha1.ComputeHash($localStream))
        $localStream.Close()
        Write-Host "Downloaded file checksum: $localChecksum"

        # Compare checksums
        if ($localChecksum -eq $remoteChecksum)
        {
            Write-Host "Match, deleting remote file"
            $session.RemoveFiles($remotePath).Check()
            $result = 0
        }
        else
        {
            Write-Host "Does NOT match"
            $result = 1
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }
}
catch [Exception]
{
    Write-Host "Error: $($_.Exception.Message)"
    $result = 1
}

# Pause if -pause switch was used
if ($pause)
{
    Write-Host "Press any key to exit..."
    [System.Console]::ReadKey() | Out-Null
}

exit $result
You can run it like:
powershell -file checksum.ps1 -remotePath ./file.dat -localPath C:\path\file.dat
This is partially based on the WinSCP example Verifying checksum of a remote file against a local file over SFTP/FTP protocol.
(I'm the author of WinSCP.)
The question was later edited to say that the OP has a text file with the checksums. That makes it a completely different question: just download the file, calculate the local checksum, and compare it to the checksum you have in the text file. If they match, delete the remote file.
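A minimal PowerShell sketch of that comparison, assuming the text file uses the usual md5sum layout of "<md5>  <filename>" per line (the parameter names and the file layout are assumptions; adjust the parsing to your actual format):

param (
    [Parameter(Mandatory)]
    $localPath,     # the downloaded file
    [Parameter(Mandatory)]
    $md5ListPath    # text file with "<md5>  <filename>" lines
)

# Find the expected checksum for this file name in the list
$name = Split-Path $localPath -Leaf
$line = Select-String -Path $md5ListPath -Pattern ([regex]::Escape($name)) |
    Select-Object -First 1
$expected = ($line.Line -split '\s+')[0]

# Compute the local MD5 (Get-FileHash requires PowerShell 4.0 or newer)
$actual = (Get-FileHash -Algorithm MD5 -Path $localPath).Hash

# -eq on strings is case-insensitive in PowerShell, so the hex case does not matter
if ($actual -eq $expected) {
    Write-Host "Match, safe to delete the remote file"
    exit 0
}
else {
    Write-Host "Does NOT match"
    exit 1
}

You can run it the same way as the script above, e.g.:
powershell -file compare-md5.ps1 -localPath C:\path\file.dat -md5ListPath C:\path\md5sums.txt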
That's a long shot, but if the server supports PHP, you can exploit that.
Save the following as a PHP file (say, check.php) in the same folder as your name_of_file.txt file:
<?php
echo md5_file('name_of_file.txt');
?>
Then visit the page check.php, and you should get the MD5 hash of your file.
Related questions:
Calculate file checksum in FTP server using Apache FtpClient
How to perform checksums during a SFTP file transfer for data integrity?
https://serverfault.com/q/98597/401691

Add Mozilla root certs to Windows without admin

I want to add Mozilla's root certs to Windows 7 without admin privileges.
Is there a straightforward way to add the root certificates into the current user's certificate store? I'd prefer to use Windows' native tools, without relying on something I'd have to download.
Some resources that looked promising.
Pre-converted PEM files by cURL - the Mozilla root certs converted to PEM and hosted by cURL. Here's a direct link to the PEM-encoded root certs.
Verified HTTPs in Ruby - A general overview of how to obtain the root certificates.
How to get root certs for cURL - explains how to generate the PEM file from the Mozilla certificates yourself.
How to Import Certificates using Powershell - a ranting overview of how to install certificates that seems more complex than it ought to be.
I ended up making a PowerShell script to do it.
VERIFY THIS CODE BEFORE RUNNING IT. It adds all of the certificate authorities from http://curl.haxx.se/ca/cacert.pem to the current user's TRUSTED ROOT certificate store.
To run it in a single command, paste the following into a command prompt:
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "iex ((new-object net.webclient).DownloadString('https://raw.github.com/jschaf/install-mozilla-certs/master/install-mozilla-cert.ps1'))"
Here's the Github link: https://github.com/jschaf/install-mozilla-certs
And the source:
# Variables
$url = "http://curl.haxx.se/ca/cacert.pem"

# Download the certificates
Write-Host "Downloading Mozilla certificates from $url."
$downloader = New-Object System.Net.WebClient
$rawcerts = $downloader.DownloadString($url)

# Remove headers and begin/end delimiters and convert into a byte
# stream
$header = "-----BEGIN CERTIFICATE-----`n"
$footer = "`n-----END CERTIFICATE-----"
$match_string = "(?s)$header(.*?)$footer"
$certs_matches = Select-String $match_string -input $rawcerts -AllMatches
$certs_base64 = $certs_matches.matches | %{ $_.Groups[1].Value }
$certs_bytes = $certs_base64 | %{ ,[System.Text.Encoding]::UTF8.GetBytes($_) }

# Install the certificates
$user_root_cert_store = Get-Item Cert:\CurrentUser\Root
$user_root_cert_store.Open("ReadWrite")
foreach ($c in $certs_bytes) {
    $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2(,$c)
    $user_root_cert_store.Add($cert)
}
$user_root_cert_store.Close()
Write-Host "Finished installing all certificates."
One annoying thing is that Windows will prompt for yes/no for every certificate. Since the script installs 158 certificates, this gets old quickly. If anyone knows how to prevent the confirmation, let me know or drop a pull request.
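For completeness: single certificates can also be imported with the built-in certutil tool, though as far as I know it raises the same per-certificate prompt when targeting the current user's Root store, so it does not avoid the confirmations (mycert.crt is a placeholder):

certutil -user -addstore Root mycert.crt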

Sending an email from R using the sendmailR package

I am trying to send an email from R, using the sendmailR package. The code below works fine when I run it on my PC, and I receive the email. However, when I run it on my MacBook Pro, it fails with the following error:
library(sendmailR)

from <- sprintf("<sendmailR@%s>", Sys.info()[4])
to <- "<myemail@gmail.com>"
subject <- "TEST"
body <- "TEST"
sendmail(from, to, subject, body,
         control = list(smtpServer = "ASPMX.L.GOOGLE.COM"))
Error in socketConnection(host = server, port = port, blocking = TRUE) :
cannot open the connection
In addition: Warning message:
In socketConnection(host = server, port = port, blocking = TRUE) :
ASPMX.L.GOOGLE.COM:25 cannot be opened
Any ideas as to why this would work on a PC but not on a Mac? I turned the firewall off on both machines.
Are you able to send email via the command line?
So, first of all, fire up a Terminal and then:
$ echo "Test 123" | mail -s "Test" user@domain.com
Look into /var/log/mail.log, or better use
$ tail -f /var/log/mail.log
in a different window while you send your email. If you see something like
... setting up TLS connection to smtp.gmail.com[xxx.xx.xxx.xxx]:587
... Trusted TLS connection established to smtp.gmail.com[xxx.xx.xxx.xxx]:587:\
TLSv1 with cipher RC4-MD5 (128/128 bits)
then you succeeded. Otherwise, it means you have to configure your mailing system. I have been using postfix with Gmail for two years now and have never had a problem with it. Basically, you need to grab the Equifax certificate, Equifax_Secure_CA.pem, from here: http://www.geotrust.com/resources/root-certificates/. (They were using Thawte certificates before, but they changed last year.) Then, assuming you use Gmail:
Create relay_password in /etc/postfix and put a single line like this in it (with your correct login and password):
smtp.gmail.com login@gmail.com:password
then in a Terminal,
$ sudo postmap /etc/postfix/relay_password
to update the Postfix lookup table.
Add the certificates to /etc/postfix/certs, or any folder you like, then
$ sudo c_rehash /etc/postfix/certs/
(i.e., rehash the certificates with OpenSSL).
Edit /etc/postfix/main.cf so that it includes the following lines (adjust the paths if needed):
relayhost = smtp.gmail.com:587
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = hash:/etc/postfix/relay_password
smtp_sasl_security_options = noanonymous
smtp_tls_security_level = may
smtp_tls_CApath = /etc/postfix/certs
smtp_tls_session_cache_database = btree:/etc/postfix/smtp_scache
smtp_tls_session_cache_timeout = 3600s
smtp_tls_loglevel = 1
tls_random_source = dev:/dev/urandom
Finally, just reload the Postfix process, with e.g.
$ sudo postfix reload
(a combination of start/stop works too).
You can choose a different port for SMTP, e.g. 465.
It's still possible to use SASL without TLS (the above steps are basically the same), but in both cases the main problem is that your login information is stored in a plain text file... Also, should you want to use your MobileMe account, just replace the Gmail SMTP server with smtp.me.com.

Windows file copy using http

Is there a Windows command to copy or download files from an HTTP URL to the filesystem? I've tried copy, xcopy and robocopy, and they don't seem to support HTTP URLs.
You can use a PowerShell script to accomplish this:
Get-Web http://www.msn.com/ -toFile www.msn.com.html
function Get-Web($url,
    [switch]$self,
    $credential,
    $toFile,
    [switch]$bytes)
{
    #.Synopsis
    #  Downloads a file from the web
    #.Description
    #  Uses System.Net.WebClient (not the browser) to download data
    #  from the web.
    #.Parameter self
    #  Uses the default credentials when downloading that page (for downloading intranet pages)
    #.Parameter credential
    #  The credentials to use to download the web data
    #.Parameter url
    #  The page to download (e.g. www.msn.com)
    #.Parameter toFile
    #  The file to save the web data to
    #.Parameter bytes
    #  Download the data as bytes
    #.Example
    #  # Downloads www.live.com and outputs it as a string
    #  Get-Web http://www.live.com/
    #.Example
    #  # Downloads www.msn.com and saves it to a file
    #  Get-Web http://www.msn.com/ -toFile www.msn.com.html
    $webClient = New-Object Net.WebClient
    if ($credential) {
        $webClient.Credentials = $credential
    }
    if ($self) {
        $webClient.UseDefaultCredentials = $true
    }
    if ($toFile) {
        if (-not "$toFile".Contains(":")) {
            $toFile = Join-Path $pwd $toFile
        }
        $webClient.DownloadFile($url, $toFile)
    } else {
        if ($bytes) {
            $webClient.DownloadData($url)
        } else {
            $webClient.DownloadString($url)
        }
    }
}
source http://blogs.msdn.com/mediaandmicrocode/archive/2008/12/01/microcode-powershell-scripting-tricks-scripting-the-web-part-1-get-web.aspx
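If you only need a one-off download, the same Net.WebClient class that Get-Web wraps can be called directly from a command prompt; a minimal sketch (URL and target path are placeholders):

powershell -Command "(New-Object Net.WebClient).DownloadFile('http://www.msn.com/', 'C:\temp\msn.html')"

Note that DownloadFile resolves relative paths against the process working directory, which is why an absolute target path is used here.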
I am not familiar with any commands on Windows that can do that, but I always download GNU wget on Windows for these and similar purposes.
cURL comes to mind.
curl -o homepage.html http://www.apptranslator.com/
This command downloads the page and stores it in the file homepage.html.
Thousands of options are available.
Use the BITSAdmin tool (bitsadmin is a command-line utility on Windows).
Example:
bitsadmin /transfer "Download_Job" /download /priority high "http://www.sourceWebSite.com/file.zip" "C:\destination\file.zip"
where Download_Job is any relevant job name you want.
I can't remember any command-line utility for this.
Maybe you can implement something similar using JavaScript (with WinHttpRequest) and run it like this:
wscript your_script.js
Or just install msys with wget.
Just use the Win32 API (one line of code in C...).
