Decode PKCS#7 Signature via Windows API?

I wish to parse and display the contents of an Authenticode PKCS#7 signature as extracted from a Windows PE binary's Security Directory.
I can use OpenSSL to do this on the command line with "openssl pkcs7 -text -in extracted_signature.pks -inform DER -print_certs", however I need to do this via C/C++ and the Windows API. I cannot use the OpenSSL library itself.
Using the CryptDecodeObjectEx API I can begin to decode the extracted signature:
CRYPT_CONTENT_INFO * content_info;
DWORD len;

CryptDecodeObjectEx(
    X509_ASN_ENCODING | PKCS_7_ASN_ENCODING,
    PKCS_CONTENT_INFO,
    pointer_to_extracted_signature,
    length_of_extracted_signature,
    CRYPT_DECODE_ALLOC_FLAG,
    NULL,
    &content_info,
    &len
);
The above call completes successfully and content_info->pszObjId contains the OID "1.2.840.113549.1.7.2" (szOID_RSA_signedData); however, I am unable to find the structures needed to continue decoding. The available OIDs for CryptDecodeObjectEx are listed here.
Can anybody please advise how to decode an Authenticode PKCS#7 signature via the Windows API?

I have found that the correct way to decode an Authenticode PKCS#7 signature is to use CryptQueryObject with the CERT_QUERY_OBJECT_BLOB and CERT_QUERY_CONTENT_FLAG_PKCS7_SIGNED flags set. Code snippet below for anybody who might need to do this.
CERT_BLOB cert_blob;
HCERTSTORE cert_store = NULL;
HCRYPTMSG cert_msg = NULL;

cert_blob.pbData = pointer_to_extracted_signature;
cert_blob.cbData = length_of_extracted_signature;

CryptQueryObject(
    CERT_QUERY_OBJECT_BLOB,
    &cert_blob,
    CERT_QUERY_CONTENT_FLAG_PKCS7_SIGNED,
    CERT_QUERY_FORMAT_FLAG_BINARY,
    0,
    NULL,
    NULL,
    NULL,
    &cert_store,
    &cert_msg,
    NULL
);

PCCERT_CONTEXT next_cert = NULL;
while( (next_cert = CertEnumCertificatesInStore( cert_store, next_cert )) != NULL )
{
    // process next_cert...
}
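Not part of the original answer, but a common follow-up: once CryptQueryObject has succeeded, the returned HCRYPTMSG can also be queried for the signer details. Below is a minimal sketch (following the usual CryptMsgGetParam / CMSG_SIGNER_INFO_PARAM pattern) that retrieves the first signer's CMSG_SIGNER_INFO and locates the matching signing certificate in the returned store:

DWORD signer_info_len = 0;

// Query the size of the first signer's CMSG_SIGNER_INFO, then fetch it.
if( CryptMsgGetParam( cert_msg, CMSG_SIGNER_INFO_PARAM, 0, NULL, &signer_info_len ) )
{
    PCMSG_SIGNER_INFO signer_info = (PCMSG_SIGNER_INFO)LocalAlloc( LPTR, signer_info_len );
    if( signer_info && CryptMsgGetParam( cert_msg, CMSG_SIGNER_INFO_PARAM, 0, signer_info, &signer_info_len ) )
    {
        // Issuer + SerialNumber identify the signing certificate;
        // signer_info->AuthAttrs holds the authenticated attributes.
        CERT_INFO cert_info = {};
        cert_info.Issuer = signer_info->Issuer;
        cert_info.SerialNumber = signer_info->SerialNumber;

        PCCERT_CONTEXT signer_cert = CertFindCertificateInStore(
            cert_store,
            X509_ASN_ENCODING | PKCS_7_ASN_ENCODING,
            0,
            CERT_FIND_SUBJECT_CERT,
            &cert_info,
            NULL );
        if( signer_cert )
        {
            // display signer_cert as needed...
            CertFreeCertificateContext( signer_cert );
        }
    }
    LocalFree( signer_info );
}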

Related

Issue with net-snmp v3 for authPriv for AES

I am creating a C++ project using the net-snmp libraries I built. I was able to interface with my hardware via SNMP v2c as well as SNMP v3 (authNoPriv). However, this was unsuccessful when I tried using authPriv. Is there any advice on this?
What I suspect is that net-snmp does not support AES.
When I ran net-snmp directly, I saw only the option for DES as the privacy protocol. So I would like to confirm: does net-snmp support both the AES128 and DES privacy protocols?
For authNoPriv, I got an authentication failure when I used the SHA-1 authentication protocol.
For authPriv, I couldn't establish any connection with the SNMP hardware.
I suspect there is something wrong in my code, as there was no issue with authNoPriv using the MD5 authentication protocol, but the above errors occur when I configure the respective security protocols.
// Definitions
const char * user = "snmpuser";
const char * our_v3_passphrase = "passphrase";
const char * our_v3_privphrase = "privphrase";
struct snmp_session session;

SOCK_STARTUP;

// Initialize the SNMP library
snmp_sess_init(&session);
session.peername = _strdup(argv[1]);

// set the SNMP version number
session.version = SNMP_VERSION_3;
session.securityNameLen = strlen(session.securityName);

// set the security level
session.securityLevel = SNMP_SEC_LEVEL_AUTHPRIV; // SNMP_SEC_LEVEL_AUTHNOPRIV (for authNoPriv)

// set the authentication protocol
session.securityAuthProto = usmHMACMD5AuthProtocol; // usmHMACSHA1AuthProtocol
session.securityAuthProtoLen = USM_AUTH_PROTO_MD5_LEN; // USM_AUTH_PROTO_SHA_LEN
session.securityAuthKeyLen = USM_AUTH_KU_LEN;

// set authentication key to a hashed version of passphrase
if (generate_Ku(session.securityAuthProto, session.securityAuthProtoLen,
                (u_char *)our_v3_passphrase, strlen(our_v3_passphrase),
                session.securityAuthKey, &session.securityAuthKeyLen) != SNMPERR_SUCCESS) {
    snmp_perror(argv[0]);
    snmp_log(LOG_ERR, "Error generating Ku from authentication passphrase.\n");
    SOCK_CLEANUP;
    exit(1);
}

// set the privacy protocol
session.securityPrivProto = usmAES128PrivProtocol; // usmDESPrivProtocol
session.securityAuthProtoLen = USM_PRIV_PROTO_AES128_LEN; // USM_PRIV_PROTO_DES_LEN
session.securityAuthKeyLen = USM_PRIV_KU_LEN;

// set privacy key to a hashed version of privphrase
if (generate_Ku(session.securityAuthProto, session.securityAuthProtoLen,
                (u_char *)our_v3_privphrase, strlen(our_v3_privphrase),
                session.securityPrivKey, &session.securityPrivKeyLen) != SNMPERR_SUCCESS) {
    snmp_perror(argv[0]);
    snmp_log(LOG_ERR, "Error generating Ku from privacy passphrase.\n");
    SOCK_CLEANUP;
    exit(1);
}
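For reference, here is a minimal sketch of how the v3 authPriv fields are typically populated (following the net-snmp demo-application pattern; the user name and passphrases are placeholders, and the constant names follow the snippet above). Note that it assigns the AES OID length and key length to the securityPrivProtoLen / securityPrivKeyLen members, whereas the snippet above writes them into securityAuthProtoLen / securityAuthKeyLen, and that it sets securityName before taking its length:

#include <net-snmp/net-snmp-config.h>
#include <net-snmp/net-snmp-includes.h>
#include <stdlib.h>
#include <string.h>

// Sketch: populate an SNMPv3 authPriv session (MD5 auth + AES-128 priv).
static void setup_v3_authpriv(struct snmp_session *session,
                              const char *user,
                              const char *authphrase,
                              const char *privphrase)
{
    session->version = SNMP_VERSION_3;

    // the security (user) name must be set before its length is taken
    session->securityName = strdup(user);
    session->securityNameLen = strlen(session->securityName);

    session->securityLevel = SNMP_SEC_LEVEL_AUTHPRIV;

    // authentication: MD5 (swap in usmHMACSHA1AuthProtocol / USM_AUTH_PROTO_SHA_LEN for SHA-1)
    session->securityAuthProto = usmHMACMD5AuthProtocol;
    session->securityAuthProtoLen = USM_AUTH_PROTO_MD5_LEN;
    session->securityAuthKeyLen = USM_AUTH_KU_LEN;
    if (generate_Ku(session->securityAuthProto, session->securityAuthProtoLen,
                    (u_char *)authphrase, strlen(authphrase),
                    session->securityAuthKey, &session->securityAuthKeyLen) != SNMPERR_SUCCESS) {
        snmp_log(LOG_ERR, "Error generating Ku from authentication passphrase.\n");
        exit(1);
    }

    // privacy: AES-128 -- note the securityPriv* members
    session->securityPrivProto = usmAES128PrivProtocol;
    session->securityPrivProtoLen = USM_PRIV_PROTO_AES128_LEN;
    session->securityPrivKeyLen = USM_PRIV_KU_LEN;
    if (generate_Ku(session->securityAuthProto, session->securityAuthProtoLen,
                    (u_char *)privphrase, strlen(privphrase),
                    session->securityPrivKey, &session->securityPrivKeyLen) != SNMPERR_SUCCESS) {
        snmp_log(LOG_ERR, "Error generating Ku from privacy passphrase.\n");
        exit(1);
    }
}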

Signing an appxbundle using CryptUIWizDigitalSign API

I'm facing a rather interesting issue with regard to Authenticode-signing a UWP appxbundle file.
Some background:
The client provided us with a SafeNet USB token containing the signing certificate. The private key is not exportable, of course. I want to be able to use this certificate for our automated release builds to sign the package. Unfortunately, the token requires a PIN to be entered once per session, so for example if the build agent reboots, the build will fail. We enabled single login on the token so it's enough to unlock it once a session.
Current state:
We can use signtool on the appxbundle without any problems, given the token has been unlocked. This works well enough but breaks as soon as the machine is rebooted or the workstation is locked.
After some searching I managed to find this piece of code. It takes the signing parameters (including the token PIN) and invokes the Windows API to sign the target file. I managed to compile this and it worked flawlessly for signing the installation wrapper (an EXE file) - the token did not ask for a PIN and was unlocked automatically by the API call.
However, when I invoked the same code on the appxbundle file, the call to CryptUIWizDigitalSign failed with error code 0x80080209 (APPX_E_INVALID_SIP_CLIENT_DATA). This is a mystery to me because invoking signtool on the same bundle, with the same parameters/certificate, works without problems, so the certificate should be fully compatible with the package.
Does anyone have experience with something like this? Is there a way to figure out the root cause of the error (i.e. what is incompatible between my certificate and the bundle)?
EDIT 1
In response to a comment:
The code I'm using to call the APIs (taken directly from the aforementioned SO question)
#include <windows.h>
#include <cryptuiapi.h>
#include <iostream>
#include <string>

#pragma comment (lib, "cryptui.lib")

const std::wstring ETOKEN_BASE_CRYPT_PROV_NAME = L"eToken Base Cryptographic Provider";

std::string utf16_to_utf8(const std::wstring& str)
{
    if (str.empty())
    {
        return "";
    }

    auto utf8len = ::WideCharToMultiByte(CP_UTF8, 0, str.data(), str.size(), NULL, 0, NULL, NULL);
    if (utf8len == 0)
    {
        return "";
    }

    std::string utf8Str;
    utf8Str.resize(utf8len);
    ::WideCharToMultiByte(CP_UTF8, 0, str.data(), str.size(), &utf8Str[0], utf8Str.size(), NULL, NULL);
    return utf8Str;
}

struct CryptProvHandle
{
    HCRYPTPROV Handle = NULL;
    CryptProvHandle(HCRYPTPROV handle = NULL) : Handle(handle) {}
    ~CryptProvHandle() { if (Handle) ::CryptReleaseContext(Handle, 0); }
};

HCRYPTPROV token_logon(const std::wstring& containerName, const std::string& tokenPin)
{
    CryptProvHandle cryptProv;
    if (!::CryptAcquireContext(&cryptProv.Handle, containerName.c_str(), ETOKEN_BASE_CRYPT_PROV_NAME.c_str(), PROV_RSA_FULL, CRYPT_SILENT))
    {
        std::wcerr << L"CryptAcquireContext failed, error " << std::hex << std::showbase << ::GetLastError() << L"\n";
        return NULL;
    }

    if (!::CryptSetProvParam(cryptProv.Handle, PP_SIGNATURE_PIN, reinterpret_cast<const BYTE*>(tokenPin.c_str()), 0))
    {
        std::wcerr << L"CryptSetProvParam failed, error " << std::hex << std::showbase << ::GetLastError() << L"\n";
        return NULL;
    }

    auto result = cryptProv.Handle;
    cryptProv.Handle = NULL;
    return result;
}

int wmain(int argc, wchar_t** argv)
{
    if (argc < 6)
    {
        std::wcerr << L"usage: etokensign.exe <certificate file path> <private key container name> <token PIN> <timestamp URL> <path to file to sign>\n";
        return 1;
    }

    const std::wstring certFile = argv[1];
    const std::wstring containerName = argv[2];
    const std::wstring tokenPin = argv[3];
    const std::wstring timestampUrl = argv[4];
    const std::wstring fileToSign = argv[5];

    CryptProvHandle cryptProv = token_logon(containerName, utf16_to_utf8(tokenPin));
    if (!cryptProv.Handle)
    {
        return 1;
    }

    CRYPTUI_WIZ_DIGITAL_SIGN_EXTENDED_INFO extInfo = {};
    extInfo.dwSize = sizeof(extInfo);
    extInfo.pszHashAlg = szOID_NIST_sha256; // Use SHA256 instead of default SHA1

    CRYPT_KEY_PROV_INFO keyProvInfo = {};
    keyProvInfo.pwszContainerName = const_cast<wchar_t*>(containerName.c_str());
    keyProvInfo.pwszProvName = const_cast<wchar_t*>(ETOKEN_BASE_CRYPT_PROV_NAME.c_str());
    keyProvInfo.dwProvType = PROV_RSA_FULL;

    CRYPTUI_WIZ_DIGITAL_SIGN_CERT_PVK_INFO pvkInfo = {};
    pvkInfo.dwSize = sizeof(pvkInfo);
    pvkInfo.pwszSigningCertFileName = const_cast<wchar_t*>(certFile.c_str());
    pvkInfo.dwPvkChoice = CRYPTUI_WIZ_DIGITAL_SIGN_PVK_PROV;
    pvkInfo.pPvkProvInfo = &keyProvInfo;

    CRYPTUI_WIZ_DIGITAL_SIGN_INFO signInfo = {};
    signInfo.dwSize = sizeof(signInfo);
    signInfo.dwSubjectChoice = CRYPTUI_WIZ_DIGITAL_SIGN_SUBJECT_FILE;
    signInfo.pwszFileName = fileToSign.c_str();
    signInfo.dwSigningCertChoice = CRYPTUI_WIZ_DIGITAL_SIGN_PVK;
    signInfo.pSigningCertPvkInfo = &pvkInfo;
    signInfo.pwszTimestampURL = timestampUrl.c_str();
    signInfo.pSignExtInfo = &extInfo;

    if (!::CryptUIWizDigitalSign(CRYPTUI_WIZ_NO_UI, NULL, NULL, &signInfo, NULL))
    {
        std::wcerr << L"CryptUIWizDigitalSign failed, error " << std::hex << std::showbase << ::GetLastError() << L"\n";
        return 1;
    }

    std::wcout << L"Successfully signed " << fileToSign << L"\n";
    return 0;
}
The certificate is a CER file (public portion only) exported from the token and the container name is taken from the token's info. As I mentioned, this works correctly for EXE files.
The signtool command
signtool sign /sha1 "cert thumbprint" /fd SHA256 /n "subject name" /t "http://timestamp.verisign.com/scripts/timestamp.dll" /debug "$path"
This also works, when I call it either manually or from the CI build when the token is unlocked. But the code above fails with the mentioned error.
EDIT 2
Thanks to all of you, I now have a working implementation! I ended up using the SignerSignEx2 API, as suggested by RbMm. This seems to work fine for both appx bundles and PE files (different parameters for each). Verified on Windows 10 with a TFS 2017 build agent - unlocks the token, finds a specified certificate in the cert store, and signs+timestamps the specified file.
I published the result on GitHub, if anyone is interested: https://github.com/mareklinka/SafeNetTokenSigner
First of all, I looked at where CryptUIWizDigitalSign fails:
CryptUIWizDigitalSign calls the SignerSignEx function with pSipData == 0. For signing a PE file (exe, dll, sys) this is fine and will work, but for an appxbundle (a ZIP-archive file type) this parameter is mandatory and must point to an APPX_SIP_CLIENT_DATA structure. For an appxbundle the call stack is:
CryptUIWizDigitalSign
SignerSignEx
HRESULT Appx::Packaging::AppxSipClientData::Initialize(SIP_SUBJECTINFO* subjectInfo)
At the very beginning of Appx::Packaging::AppxSipClientData::Initialize we can see the following code:
if (!subjectInfo->pClientData) return APPX_E_INVALID_SIP_CLIENT_DATA;
This is exactly where your code fails.
Instead of CryptUIWizDigitalSign you need to call SignerSignEx2 directly, and pSipData is a mandatory parameter in that case.
MSDN has a full worked example - How to programmatically sign an app package (C++).
The key point here:
APPX_SIP_CLIENT_DATA sipClientData = {};
sipClientData.pSignerParams = &signerParams;
signerParams.pSipData = &sipClientData;
The modern SignTool calls SignerSignEx2 directly, and the same check is again clearly visible:
if (!subjectInfo->pClientData) return APPX_E_INVALID_SIP_CLIENT_DATA;
After this it calls
HRESULT Appx::Packaging::Packaging::SignFile(
PCWSTR FileName, APPX_SIP_CLIENT_DATA* sipClientData)
which begins with the following check:
if (!sipClientData->pSignerParams) return APPX_E_INVALID_SIP_CLIENT_DATA;
This is clearly stated in MSDN:
You must provide a pointer to an APPX_SIP_CLIENT_DATA structure as
the pSipData parameter when you sign an app package. You must
populate the pSignerParams member of APPX_SIP_CLIENT_DATA with
the same parameters that you use to sign the app package. To do this,
define your desired parameters on the SIGNER_SIGN_EX2_PARAMS
structure, assign the address of this structure to pSignerParams,
and then directly reference the structure's members as well when you
call SignerSignEx2.
Question: why do you need to provide the same parameters again that are already used in the call to SignerSignEx2? Because an appxbundle is really an archive containing multiple files, and every file needs to be signed. To do this, Appx::Packaging::Packaging::SignFile recursively calls SignerSignEx2 again.
For these recursive calls pSignerParams is used, so that SignerSignEx2 is invoked with exactly the same parameters as the top-level call.
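To make the wiring concrete, here is a rough sketch, loosely following the MSDN sample. It is not the poster's final code: the helper name is illustrative, the subject/certificate/signature structures are assumed to be prepared exactly as the sample shows, timestamping is omitted for brevity, and the SIGNER_SIGN_EX2_PARAMS / APPX_SIP_CLIENT_DATA declarations (and their member names) come from the sample rather than from older SDK headers, so verify them against it.

#include <windows.h>
// SIGNER_* and APPX_SIP_CLIENT_DATA declarations copied from the MSDN sample
// "How to programmatically sign an app package (C++)".

typedef HRESULT (WINAPI *SignerSignEx2Fn)(
    DWORD, SIGNER_SUBJECT_INFO*, SIGNER_CERT*, SIGNER_SIGNATURE_INFO*,
    SIGNER_PROVIDER_INFO*, DWORD, PCSTR, PCWSTR, CRYPT_ATTRIBUTES*,
    PVOID, SIGNER_CONTEXT**, PVOID, PVOID);
typedef HRESULT (WINAPI *SignerFreeSignerContextFn)(SIGNER_CONTEXT*);

HRESULT SignAppxPackage(SIGNER_SUBJECT_INFO* subjectInfo,
                        SIGNER_CERT* signerCert,
                        SIGNER_SIGNATURE_INFO* signatureInfo)
{
    HMODULE mssign = ::LoadLibraryW(L"mssign32.dll");
    if (!mssign)
        return HRESULT_FROM_WIN32(::GetLastError());
    auto signerSignEx2 = reinterpret_cast<SignerSignEx2Fn>(::GetProcAddress(mssign, "SignerSignEx2"));
    auto signerFreeContext = reinterpret_cast<SignerFreeSignerContextFn>(::GetProcAddress(mssign, "SignerFreeSignerContext"));
    if (!signerSignEx2)
        return HRESULT_FROM_WIN32(::GetLastError());

    // The SIP data and the signing parameters must reference each other, so that
    // the recursive per-file signing inside the bundle re-uses these parameters.
    SIGNER_SIGN_EX2_PARAMS signerParams = {};
    APPX_SIP_CLIENT_DATA sipClientData = {};
    signerParams.pSubjectInfo = subjectInfo;    // member names as in the MSDN sample
    signerParams.pSigningCert = signerCert;
    signerParams.pSignatureInfo = signatureInfo;
    signerParams.pSipData = &sipClientData;
    sipClientData.pSignerParams = &signerParams;

    SIGNER_CONTEXT* signerContext = nullptr;
    HRESULT hr = signerSignEx2(
        0,                  // dwFlags
        subjectInfo,        // the same values that signerParams carries
        signerCert,
        signatureInfo,
        nullptr,            // provider info
        0,                  // timestamp flags (timestamping omitted in this sketch)
        nullptr,            // timestamp algorithm OID
        nullptr,            // timestamp URL
        nullptr,            // additional attributes
        &sipClientData,     // pSipData - mandatory for appx/appxbundle
        &signerContext,
        nullptr,            // crypto policy
        nullptr);           // reserved

    if (signerContext && signerFreeContext)
        signerFreeContext(signerContext);
    if (sipClientData.pAppxSipState)
        sipClientData.pAppxSipState->Release();

    ::FreeLibrary(mssign);
    return hr;
}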

How to use RegDeleteTree to delete KEY_WOW64_64KEY key?

I have tested RegDeleteTree() in a 32-bit program. It only deletes the KEY_WOW64_32KEY key. Is there a function like RegOpenKeyEx() where I can specify KEY_WOW64_32KEY or KEY_WOW64_64KEY?
I have tried this code:
HKEY key = NULL;
if (RegOpenKeyEx(HKEY_LOCAL_MACHINE, _T("SOFTWARE\\asd"), 0, KEY_READ | KEY_WOW64_64KEY, &key) == ERROR_SUCCESS)
{
    long d = RegDeleteTree(key, NULL);
    if (d != ERROR_SUCCESS)
    {
        cout << "Error" << d;
    }
    else
    {
        cout << "Success";
    }
}
The output is Success but the key still exists in the registry. I'm running it on Windows 7 with VS 2013, compiled as a Win32 program, not x64.
Use RegOpenKeyEx with KEY_WOW64_64KEY to open the 64 bit view of the key. Then call RegDeleteTree passing the returned key.
Regarding your update. You only ask for read access when opening the key. The documentation for RegDeleteTree states:
The key must have been opened with the following access rights: DELETE, KEY_ENUMERATE_SUB_KEYS, and KEY_QUERY_VALUE.
The fact that the call succeeds anyway makes me think that your process is subject to registry virtualization. You need a manifest for the process that forces UAC elevation; use the requireAdministrator option in the manifest.
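Putting both points together, a minimal sketch of the corrected snippet (same key path as in the question; the access rights are those required by the RegDeleteTree documentation, plus the 64-bit view flag):

HKEY key = NULL;
LONG rc = RegOpenKeyEx(
    HKEY_LOCAL_MACHINE,
    _T("SOFTWARE\\asd"),
    0,
    DELETE | KEY_ENUMERATE_SUB_KEYS | KEY_QUERY_VALUE | KEY_WOW64_64KEY,
    &key);
if (rc == ERROR_SUCCESS)
{
    // Deletes the subkeys and values of the opened key in the 64-bit registry view.
    rc = RegDeleteTree(key, NULL);
    if (rc != ERROR_SUCCESS)
    {
        cout << "Error " << rc;
    }
    else
    {
        cout << "Success";
    }
    RegCloseKey(key);
}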

How to do asymmetric encryption/decryption with OSX 10.7+ without openssl?

Since OpenSSL is deprecated in OS X 10.7+, I'd like to switch from OpenSSL to the built-in OS X keychain and crypto functions.
But now I am stuck on asymmetric encryption/decryption.
How can I do encryption/decryption of a randomly generated symmetric key with an asymmetric (RSA) key? With OpenSSL it's quite easy.
In the apple dev docs, they say that CommonCrypto supports asymmetric encryption, but while checking the headers, I can only see support for symmetric stuff.
Any hints?
Take a look at Cryptographic Message Syntax Services and see if that can do what you need.
Also, you're misreading the OpenSSL thing just a bit. The OpenSSL libraries that ship with the OS are deprecated. That doesn't mean you can't continue to use OpenSSL. OpenSSL is Open Source, and there's nothing stopping you from downloading it and using it freely in your application.
Apple's deprecation just means that if you use OpenSSL, you need to include your own copy of the OpenSSL libraries so that you are responsible for keeping your OpenSSL library up-to-date and for fixing any breakage that occurs whenever you do so. :-)
And if not, the iOS asymmetric encryption and decryption functions (SecKeyEncrypt and SecKeyDecrypt) do exist in OS X, and the iOS header even shows that they are available in OS X. I'm not sure why they aren't in the OS X SDK. I filed a bug, and it was marked as a dup.
It probably would not be possible for Apple to remove those functions in the future without breaking the Simulator, but if you're submitting to the app store and they give you grief about using them, here's a roughly compatible replacement for SecKeyEncrypt built using the Security Transforms API:
// Workaround for SecKeyEncrypt not really being public API in OS X
OSStatus OSXSecKeyEncrypt ( SecKeyRef key, SecPadding padding, const uint8_t *plainText, size_t plainTextLen, uint8_t *cipherText, size_t *cipherTextLen )
{
    CFMutableDictionaryRef parameters = CFDictionaryCreateMutable(
        kCFAllocatorDefault, 0, &kCFTypeDictionaryKeyCallBacks,
        &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(parameters, kSecAttrKeyType, kSecAttrKeyTypeAES);

    CFErrorRef error = NULL;
    SecTransformRef encrypt = SecEncryptTransformCreate(key, &error);
    if (error) {
        AFNSLog(@"Encryption failed: %@\n", (__bridge NSError *)error);
        return (OSStatus)[(__bridge NSError *)error code];
    }

    SecTransformSetAttribute(
        encrypt,
        kSecPaddingKey,
        NULL, // kSecPaddingPKCS1Key (rdar://13661366 : NULL means kSecPaddingPKCS1Key and
              // kSecPaddingPKCS1Key fails horribly)
        &error);

    CFDataRef sourceData = CFDataCreate(kCFAllocatorDefault, plainText, plainTextLen);
    SecTransformSetAttribute(encrypt, kSecTransformInputAttributeName,
                             sourceData, &error);

    CFDataRef encryptedData = SecTransformExecute(encrypt, &error);
    if (error) {
        AFNSLog(@"Encryption failed: %@\n", (__bridge NSError *)error);
        return (OSStatus)[(__bridge NSError *)error code];
    }

    if ((unsigned long)CFDataGetLength(encryptedData) > *cipherTextLen) {
        return errSecBufferTooSmall;
    }
    *cipherTextLen = CFDataGetLength(encryptedData);
    CFDataGetBytes(encryptedData, CFRangeMake(0, *cipherTextLen), cipherText);
    return noErr;
}
You should be able to adapt the code for decryption fairly easily; I didn't need it for my purposes, so I didn't write that function.
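For reference, a rough sketch of such a decryption counterpart (not from the original answer, and untested), following the same Security Transforms pattern; the function name is illustrative, and plain CoreFoundation logging is used instead of the logging macro above:

OSStatus OSXSecKeyDecrypt ( SecKeyRef key, SecPadding padding, const uint8_t *cipherText, size_t cipherTextLen, uint8_t *plainText, size_t *plainTextLen )
{
    CFErrorRef error = NULL;
    SecTransformRef decrypt = SecDecryptTransformCreate(key, &error);
    if (error) {
        CFShow(error);
        return (OSStatus)CFErrorGetCode(error);
    }

    // As above: a NULL padding attribute selects PKCS#1 padding (rdar://13661366).
    SecTransformSetAttribute(decrypt, kSecPaddingKey, NULL, &error);

    CFDataRef sourceData = CFDataCreate(kCFAllocatorDefault, cipherText, cipherTextLen);
    SecTransformSetAttribute(decrypt, kSecTransformInputAttributeName, sourceData, &error);

    CFDataRef decryptedData = SecTransformExecute(decrypt, &error);
    if (error) {
        CFShow(error);
        return (OSStatus)CFErrorGetCode(error);
    }

    if ((unsigned long)CFDataGetLength(decryptedData) > *plainTextLen) {
        return errSecBufferTooSmall;
    }
    *plainTextLen = CFDataGetLength(decryptedData);
    CFDataGetBytes(decryptedData, CFRangeMake(0, *plainTextLen), plainText);
    return noErr;
}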

Force GetKeyNameText to English

The Win32 function GetKeyNameText will provide the name of keyboard keys in the current input locale.
From MSDN:
The key name is translated according to the layout of the currently
installed keyboard, thus the function may give different results for
different input locales.
Is it possible to force the input locale for a short amount of time? Or is there another alternative to GetKeyNameText that will always return the name in English?
Update: This answer does not work. It actually modifies the keyboard settings of the user. This appears to be a behavior change between Windows versions.
CString csLangId;
csLangId.Format( L"%08X", MAKELANGID( LANG_INVARIANT, SUBLANG_NEUTRAL ) );
HKL hLocale = LoadKeyboardLayout( (LPCTSTR)csLangId, KLF_ACTIVATE );
HKL hPrevious = ActivateKeyboardLayout( hLocale, KLF_SETFORPROCESS );
// Call GetKeyNameText
ActivateKeyboardLayout( hPrevious, KLF_SETFORPROCESS );
UnloadKeyboardLayout( hLocale );
WARNING: GetKeyNameText is broken (it returns wrong A-Z key names for non-English keyboard layouts, since it uses MapVirtualKey with MAPVK_VK_TO_CHAR, which is broken), and the keyboard layout DLLs' pKeyNames and pKeyNamesExt text is buggy and outdated. I cannot recommend dealing with this stuff at all. :)
If you really, really want to get this info - then you can load and parse it manually from the keyboard layout DLL file (kbdus.dll, kbdger.dll etc.).
There is a bunch of undocumented stuff involved:
In order to get the proper keyboard layout DLL file name, first you need to convert the HKL to a KLID string. You can do this via code like this:
// Returns KLID string of size KL_NAMELENGTH
// Same as GetKeyboardLayoutName but for any HKL
// https://docs.microsoft.com/en-us/windows-hardware/manufacture/desktop/windows-language-pack-default-values
BOOL GetKLIDFromHKL(HKL hkl, _Out_writes_(KL_NAMELENGTH) LPWSTR pwszKLID)
{
    bool succeded = false;
    if ((HIWORD(hkl) & 0xf000) == 0xf000) // deviceId contains layoutId
    {
        WORD layoutId = HIWORD(hkl) & 0x0fff;

        HKEY key;
        CHECK_EQ(::RegOpenKeyW(HKEY_LOCAL_MACHINE, L"SYSTEM\\CurrentControlSet\\Control\\Keyboard Layouts", &key), ERROR_SUCCESS);

        DWORD index = 0;
        while (::RegEnumKeyW(key, index, pwszKLID, KL_NAMELENGTH) == ERROR_SUCCESS)
        {
            WCHAR layoutIdBuffer[MAX_PATH] = {};
            DWORD layoutIdBufferSize = sizeof(layoutIdBuffer);
            if (::RegGetValueW(key, pwszKLID, L"Layout Id", RRF_RT_REG_SZ, nullptr, layoutIdBuffer, &layoutIdBufferSize) == ERROR_SUCCESS)
            {
                if (layoutId == std::stoul(layoutIdBuffer, nullptr, 16))
                {
                    succeded = true;
                    DBGPRINT("Found KLID 0x%ls by layoutId=0x%04x", pwszKLID, layoutId);
                    break;
                }
            }
            ++index;
        }

        CHECK_EQ(::RegCloseKey(key), ERROR_SUCCESS);
    }
    else
    {
        WORD langId = LOWORD(hkl);

        // deviceId overrides langId if set
        if (HIWORD(hkl) != 0)
            langId = HIWORD(hkl);

        std::swprintf(pwszKLID, KL_NAMELENGTH, L"%08X", langId);
        succeded = true;

        DBGPRINT("Found KLID 0x%ls by langId=0x%04x", pwszKLID, langId);
    }
    return succeded;
}
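For example (hypothetical usage, printing the KLID of the active layout):

WCHAR klid[KL_NAMELENGTH] = {};
if (GetKLIDFromHKL(::GetKeyboardLayout(0), klid))
{
    wprintf(L"Active layout KLID: %ls\n", klid);
}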
Then with the KLID string you need to go to the HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Keyboard Layouts\%KLID% registry path and read the Layout File string from it.
Load this DLL file from SHGetKnownFolderPath(FOLDERID_System, ...) (usually C:\Windows\System32) with a LoadLibrary() call.
Next, call GetProcAddress(KbdDllHandle, "KbdLayerDescriptor") - you receive a pointer that can be cast to PKBDTABLES.
There is a kbd.h header in the Windows SDK that has the KBDTABLES struct definition (there is some extra work involved to use the proper KBD_LONG_POINTER size for 32-bit code running on x64 Windows; see my link to the GTK source at the end).
You have to look at pKeyNames and pKeyNamesExt in it to get the scan code -> key name mapping; a rough sketch of these steps follows below.
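The following sketch covers the remaining steps, under several assumptions: a 64-bit build (so the KBD_LONG_POINTER complication above does not apply), a KLID string already produced by GetKLIDFromHKL, kbd.h available on the include path, LOAD_LIBRARY_SEARCH_SYSTEM32 used as a shortcut instead of building the full path via SHGetKnownFolderPath, and the VSC_LPWSTR field names (vsc, pwsz) as in the kbd.h I am familiar with - verify them against your header:

#include <windows.h>
#include <kbd.h>
#include <cstdio>
#include <string>

typedef PKBDTABLES (WINAPI *PFN_KBDLAYERDESCRIPTOR)(VOID);

// Prints the scan code -> key name mapping for the layout identified by pwszKLID.
void DumpKeyNames(LPCWSTR pwszKLID)
{
    // Read the "Layout File" value for this KLID.
    WCHAR layoutFile[MAX_PATH] = {};
    DWORD size = sizeof(layoutFile);
    std::wstring path = L"SYSTEM\\CurrentControlSet\\Control\\Keyboard Layouts\\";
    path += pwszKLID;
    if (::RegGetValueW(HKEY_LOCAL_MACHINE, path.c_str(), L"Layout File",
                       RRF_RT_REG_SZ, nullptr, layoutFile, &size) != ERROR_SUCCESS)
        return;

    // The layout DLL lives in the system directory (e.g. C:\Windows\System32).
    HMODULE kbdDll = ::LoadLibraryExW(layoutFile, nullptr, LOAD_LIBRARY_SEARCH_SYSTEM32);
    if (!kbdDll)
        return;

    auto kbdLayerDescriptor = reinterpret_cast<PFN_KBDLAYERDESCRIPTOR>(
        ::GetProcAddress(kbdDll, "KbdLayerDescriptor"));
    if (kbdLayerDescriptor)
    {
        PKBDTABLES tables = kbdLayerDescriptor();
        if (tables)
        {
            // pKeyNames / pKeyNamesExt are arrays of { scan code, name } pairs,
            // terminated by an entry with a zero scan code.
            for (VSC_LPWSTR* p = tables->pKeyNames; p && p->vsc; ++p)
                wprintf(L"%02X -> %ls\n", p->vsc, p->pwsz);
            for (VSC_LPWSTR* p = tables->pKeyNamesExt; p && p->vsc; ++p)
                wprintf(L"E0 %02X -> %ls\n", p->vsc, p->pwsz);
        }
    }
    ::FreeLibrary(kbdDll);
}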
Long story short: the GTK toolkit has code that does all of this (see here and here). They actually build scan code -> printed character tables from the Windows keyboard layout DLLs.
