AES-GCM decrypt error in Firefox only: "DOMException: The operation failed for an operation-specific reason"; Chromium is OK

I followed earlier answers from Webcrypto AES-CBC Decrypt: Operation Error - The operation failed for an operation-specific reason and JavaScript AES encryption and decryption (Advanced Encryption Standard), and used:
const iv = crypto.getRandomValues(new Uint8Array(16));
const key = await window.crypto.subtle.generateKey(
  {
    name: "AES-GCM",
    length: 256,
  },
  false,
  ["encrypt", "decrypt"]
);
to generate the key, and
const Uint8ArrayEncrypted = await window.crypto.subtle.encrypt(
  { name: "AES-GCM", iv: iv, tagLength: 128 },
  key,
  Uint8ArrayVar
);
to encrypt and
const Uint8ArrayDecrypted = await window.crypto.subtle.decrypt(
  { name: "AES-GCM", iv: iv, tagLength: 128 },
  key,
  Uint8ArrayEncrypted
);
to decrypt
On Chromium 83 (Ubuntu) and Firefox 88, I successfully generate the key and the IV, and encrypt.
On Chromium, decryption also simply works: Uint8ArrayDecrypted is the correct ArrayBuffer.
But Firefox throws the error "The operation failed for an operation-specific reason" and stops there. No Uint8ArrayDecrypted is returned.
I didn't pass a separate tag, unlike in WebCrypto API: DOMException: The provided data is too small.
Reading https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto/encrypt,
I don't see that a separate tag is used anywhere.
Does Firefox need something else specific?
Why is the error message so generic? Which operation, and which specific reason?
With an error this generic, I don't know where to look.
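For reference, WebCrypto appends the GCM authentication tag (tagLength bits) to the end of the ciphertext, so there is no separate tag input. Below is a minimal Go sketch of the same AES-256-GCM round trip, purely to illustrate that convention, not as an answer to the Firefox question:

package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

func main() {
	key := make([]byte, 32) // AES-256
	iv := make([]byte, 16)  // 16-byte IV to match the question; GCM's default nonce size is 12
	rand.Read(key)
	rand.Read(iv)

	block, _ := aes.NewCipher(key)
	gcm, _ := cipher.NewGCMWithNonceSize(block, len(iv))

	// Seal returns the ciphertext with the 16-byte (128-bit) tag appended,
	// just like WebCrypto's encrypt output.
	ciphertext := gcm.Seal(nil, iv, []byte("hello"), nil)

	// Open verifies that appended tag; it fails if ciphertext or tag was altered.
	plaintext, err := gcm.Open(nil, iv, ciphertext, nil)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s\n", plaintext)
}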

Related

Creating JWT signing method for AWS key in Go

I generated an ECC_NIST_P521 spec key, which uses the ECDSA_SHA_512 signing algorithm. I'm trying to create a jwt.SigningMethod with this in mind, but I'm not sure which values to use for the fields. This is what I have so far:
signingMethod := jwt.SigningMethodECDSA{
	Name: "ECC_NIST_P521",
	Hash: crypto.SHA512,
}
Specifically, I'm not sure if the name is correct and I don't know what to use for the KeySize and CurveBits fields. Any help would be appreciated.
You need to specify Hash, CurveBits and KeySize. The value of Name is ignored:
signingMethod := jwt.SigningMethodECDSA{
	Name:      "ECC_NIST_P521",
	Hash:      crypto.SHA512,
	CurveBits: 521,
	KeySize:   66,
}
521 bits is the size of the curve's field.
66 is the number of bytes that fit a compact representation of a point on the curve (ceil(521 / 8) = 66).
Full example to sign and verify signature: https://go.dev/play/p/bEnLN2PJv4a
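For completeness, here is a minimal local sketch along the same lines. It assumes the github.com/golang-jwt/jwt/v4 import path and stands in a locally generated P-521 key for the KMS one; wiring the signer to AWS KMS itself is out of scope here:

package main

import (
	"crypto"
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"fmt"

	"github.com/golang-jwt/jwt/v4"
)

func main() {
	signingMethod := &jwt.SigningMethodECDSA{
		Name:      "ECC_NIST_P521", // informational; the signing math uses Hash, CurveBits and KeySize
		Hash:      crypto.SHA512,
		CurveBits: 521,
		KeySize:   66,
	}

	// Local P-521 key standing in for the KMS key in this sketch.
	key, err := ecdsa.GenerateKey(elliptic.P521(), rand.Reader)
	if err != nil {
		panic(err)
	}

	token := jwt.NewWithClaims(signingMethod, jwt.MapClaims{"sub": "demo"})
	signed, err := token.SignedString(key)
	if err != nil {
		panic(err)
	}
	fmt.Println(signed)
}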

How to sign cert with an arbitrary or deprecated extension

For example, say I want to sign a cert with an arbitrary or deprecated extension (nsCertType, for example): https://www.openssl.org/docs/manmaster/man5/x509v3_config.html
I believe I'm supposed to add the arbitrary extension as part of the certificate, as per below, but how / where do you discover the ASN.1 object identifier? I've read more documentation than I care to admit today and am still stumped.
tmpl := &x509.Certificate{
	SerialNumber: big.NewInt(time.Now().Unix() * 1000),
	Subject:      pkix.Name{CommonName: "edgeproxy", Organization: []string{"edgeproxy"}},
	NotBefore:    now,
	NotAfter:     now.Add(caMaxAge),
	ExtraExtensions: []pkix.Extension{
		{
			Id:       asn1.ObjectIdentifier{}, // what goes here?
			Critical: false,
			Value:    []byte("sslCA"),
		},
	},
	ExtKeyUsage: []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth, x509.ExtKeyUsageClientAuth, x509.ExtKeyUsageEmailProtection, x509.ExtKeyUsageTimeStamping, x509.ExtKeyUsageMicrosoftCommercialCodeSigning, x509.ExtKeyUsageMicrosoftServerGatedCrypto, x509.ExtKeyUsageNetscapeServerGatedCrypto},
	KeyUsage:              x509.KeyUsageCRLSign | x509.KeyUsageCertSign,
	IsCA:                  true,
	BasicConstraintsValid: true,
}
In Python I would do the following, but I don't know how to port it to Go (which is what I'm doing at the end of the day):
OpenSSL.crypto.X509Extension(
    b"nsCertType",
    False,
    b"sslCA"
),
The Go sources at https://golang.org/src/encoding/asn1/asn1.go define:
// An ObjectIdentifier represents an ASN.1 OBJECT IDENTIFIER.
type ObjectIdentifier []int
So an object identifier (OID for short) is a slice of integers, and the asn1 package has functions to parse one, like the unexported parseObjectIdentifier.
This is the structure you need to put after the Id: attribute.
But now you need to find out the OID you want.
While difficult to read, the OpenSSL source code can show you the OIDs of many things in the X.400/X.500/X.509 world, or at least those known to OpenSSL.
If you go to https://github.com/openssl/openssl/blob/1aec7716c1c5fccf605a46252a46ea468e684454/crypto/objects/obj_dat.h
and search for nsCertType, you get:
{"nsCertType", "Netscape Cert Type", NID_netscape_cert_type, 9, &so[407]},
so is defined earlier in the file, and if you jump to its 407th item you see:
0x60,0x86,0x48,0x01,0x86,0xF8,0x42,0x01,0x01, /* [ 407] OBJ_netscape_cert_type */
and a final search for OBJ_netscape_cert_type in the same file gives:
71, /* OBJ_netscape_cert_type 2 16 840 1 113730 1 1 */
which means the corresponding OID is 2.16.840.1.113730.1.1
Or you can decode yourself the above list of bytes that describes this OID (see How does ASN.1 encode an object identifier? for details):
the first byte, 0x60, is 96 in decimal, that is 2×40 + 16, which means the OID starts with 2.16;
each following component is in "base128" form: if the most significant bit is 1, combine the 7 least significant bits of all following bytes until one has 0 as its most significant bit;
0x86 is 10000110 in binary, so it goes with 0x48 (01001000 in binary), and the payload is in fact 00001101001000 in binary, or 840 in decimal;
0x01 is less than 128, so it is itself: 1;
0x86 is again 10000110 in binary, but here it pairs with both 0xF8 (11111000 in binary) and 0x42 (01000010 in binary, where we stop since the first bit is 0), so 000011011110001000010 altogether, or 113730 in decimal;
and the two last 0x01 bytes are themselves: 1 and 1.
So we again get 2.16.840.1.113730.1.1.
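You can also check this decoding programmatically; here is a small sketch using Go's encoding/asn1, prepending the OBJECT IDENTIFIER tag (0x06) and length to the nine content bytes above:

package main

import (
	"encoding/asn1"
	"fmt"
)

func main() {
	// The nine content octets from OpenSSL's obj_dat.h, wrapped in a full
	// DER TLV: tag 0x06 (OBJECT IDENTIFIER), length 0x09.
	der := []byte{0x06, 0x09, 0x60, 0x86, 0x48, 0x01, 0x86, 0xF8, 0x42, 0x01, 0x01}

	var oid asn1.ObjectIdentifier
	if _, err := asn1.Unmarshal(der, &oid); err != nil {
		panic(err)
	}
	fmt.Println(oid) // prints 2.16.840.1.113730.1.1
}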
You can also double-check it at an online OID browser, like here:
http://oid-info.com/cgi-bin/display?oid=2.16.840.1.113730.1.1&action=display
which gives the following description for it:
Netscape certificate type (a Rec. ITU-T X.509 v3 certificate extension
used to identify whether the certificate subject is a Secure Sockets
Layer (SSL) client, an SSL server or a Certificate Authority (CA))
You can then even browse various arcs, like the netscape one, or others, to find out other OIDs.
You also get the full ASN.1 notation:
{joint-iso-itu-t(2) country(16) us(840) organization(1) netscape(113730) cert-ext(1) cert-type(1)}
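Putting it together, here is a sketch of a Go port of the Python snippet above. One caveat that this port adds: nsCertType is a BIT STRING, and OpenSSL maps "sslCA" to bit 5 (mask 0x04), so the extension value has to be the DER encoding of that bit string rather than the raw bytes "sslCA" (the bit assignment is taken from OpenSSL's nsCertType handling; double-check it for your use case):

package main

import (
	"crypto/x509/pkix"
	"encoding/asn1"
)

// nsCertTypeExtension builds the Netscape Cert Type extension with only the
// sslCA bit set, mirroring OpenSSL.crypto.X509Extension(b"nsCertType", False,
// b"sslCA") from the Python snippet.
func nsCertTypeExtension() (pkix.Extension, error) {
	// DER-encode a BIT STRING where bit 5 (sslCA, mask 0x04) is set.
	value, err := asn1.Marshal(asn1.BitString{
		Bytes:     []byte{0x04},
		BitLength: 6, // bits 0..5 are meaningful
	})
	if err != nil {
		return pkix.Extension{}, err
	}
	return pkix.Extension{
		Id:       asn1.ObjectIdentifier{2, 16, 840, 1, 113730, 1, 1}, // nsCertType
		Critical: false,
		Value:    value,
	}, nil
}

The returned pkix.Extension can then go into the ExtraExtensions slice of the certificate template, in place of the placeholder in the question.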

Ruby SHA1 RSA signing different from command line OpenSSL?

I'm trying to implement RSA SHA1 signature verification.
To my surprise, the command line OpenSSL tool doesn't generate the same signature as the Ruby OpenSSL.
If I run these commands:
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ cat data.txt
000
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ openssl dgst -sha1 -binary -sign prvkey.pem -out sig.bin data.txt
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ openssl base64 -in sig.bin -out sig64.txt
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ cat sig64.txt
AJEh2kA7O3j624Kdl7UCGN1HiEk/v2LQudB+cjxw1CfmRTjcSPBjUE/EAwy8NEut
K4zYgfRwwTs7NY3AwYiUEtAe5yohUM0Qv17qSDW+G4IWjwe9PKE7Sl00umiMdszA
q/1hqeQlHKgjme7YO7H6i1UcAXmriOOjn+ySRaovsHw=
So the final base64 result on the command line is:
AJEh2kA7O3j624Kdl7UCGN1HiEk/v2LQudB+cjxw1CfmRTjcSPBjUE/EAwy8NEut
K4zYgfRwwTs7NY3AwYiUEtAe5yohUM0Qv17qSDW+G4IWjwe9PKE7Sl00umiMdszA
q/1hqeQlHKgjme7YO7H6i1UcAXmriOOjn+ySRaovsHw=
Now, if I try signing it through my Ruby script:
def sign_message(message)
  privkey = OpenSSL::PKey::RSA.new(File.read(Rails.root.join('lib', 'payment', 'prvkey.pem')))
  digest = OpenSSL::Digest.new('sha1')
  expected_sign = privkey.sign(digest, message)
  base_64_expected_sign = [expected_sign].pack('m')
  puts "Expected Signature"
  puts expected_sign
  puts "Base 64 Expected Signature"
  puts base_64_expected_sign
  return base_64_expected_sign
end
And calling the function like this:
def test_sign
  message = "000"
  message_signature = sign_message(message)
  puts "Message Signature : #{message_signature}"
  puts "Valid : #{verify_signature(message_signature, message)}"
end
I get the output:
Expected Signature
??|??n?^~?T_1Y#??BR??u???k x?*????S?L?:.7
t??tc?)崪? ?}DMp?p2??4?D-f??jT;!e
?k??5??
Base 64 Expected Signature
QjhL1zQoUdGFLVCMg06/CKeE/HdhRTOhJ/p09wkWeK0qD/afsxfcU7tMtDou
Nw3rwXw/5W68XhZ+BK1UXwIxWUDbFYlCUpu6HnWTmI5rC3QP+f50Y8kp5bSq
gQkekH1ETXDmcDKvExeSNKVELWYe3uwTalQ7IWUMyWvnF541rvo=
Message Signature : QjhL1zQoUdGFLVCMg06/CKeE/HdhRTOhJ/p09wkWeK0qD/afsxfcU7tMtDou
Nw3rwXw/5W68XhZ+BK1UXwIxWUDbFYlCUpu6HnWTmI5rC3QP+f50Y8kp5bSq
gQkekH1ETXDmcDKvExeSNKVELWYe3uwTalQ7IWUMyWvnF541rvo=
So the final Ruby OpenSSL signature is:
QjhL1zQoUdGFLVCMg06/CKeE/HdhRTOhJ/p09wkWeK0qD/afsxfcU7tMtDou
Nw3rwXw/5W68XhZ+BK1UXwIxWUDbFYlCUpu6HnWTmI5rC3QP+f50Y8kp5bSq
gQkekH1ETXDmcDKvExeSNKVELWYe3uwTalQ7IWUMyWvnF541rvo=
Versus the command line:
AJEh2kA7O3j624Kdl7UCGN1HiEk/v2LQudB+cjxw1CfmRTjcSPBjUE/EAwy8NEut
K4zYgfRwwTs7NY3AwYiUEtAe5yohUM0Qv17qSDW+G4IWjwe9PKE7Sl00umiMdszA
q/1hqeQlHKgjme7YO7H6i1UcAXmriOOjn+ySRaovsHw=
I've been struggling with this for some time now and I don't understand what could be making the difference!
UPDATE:
Well, apparently the results match if I replace my message variable with File.read(Rails.root.join('lib', 'payment', 'data.txt')).
So basically, using a string with the same value as what's in the text file doesn't give the same result.
This means it's encoding related, right?
UPDATE 2:
The file is encoded in us-ascii according to file -I data.txt.
However, message.encoding.name says it is loaded as UTF-8.
Also, message.encode('ascii') does not alter the generated signature; it still corresponds with the command line openssl.
As soon as I switch to a string "000".encode('utf-8') or "000".encode('ascii'), the signatures don't match anymore.
So encoding doesn't seem to play a role at all.
How come there's a difference between the exact same content depending on whether it comes from reading a file or is written as a string?
The file data.txt has a trailing newline that you are not taking into account in your code. Using
message = "000\n"
should work.
You could also do
message = File.binread("data.txt")
to make sure you get exactly the same data as the command line.
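To make the pitfall concrete, here is a self-contained sketch (in Go rather than Ruby, purely for illustration, since the point is language independent): signing "000" and "000\n" with the same key yields different signatures.

package main

import (
	"crypto"
	"crypto/rand"
	"crypto/rsa"
	"crypto/sha1"
	"encoding/base64"
	"fmt"
)

func sign(key *rsa.PrivateKey, msg string) string {
	// SHA1 digest of the exact bytes of msg; one extra byte changes it completely.
	digest := sha1.Sum([]byte(msg))
	sig, err := rsa.SignPKCS1v15(rand.Reader, key, crypto.SHA1, digest[:])
	if err != nil {
		panic(err)
	}
	return base64.StdEncoding.EncodeToString(sig)
}

func main() {
	key, err := rsa.GenerateKey(rand.Reader, 2048)
	if err != nil {
		panic(err)
	}
	fmt.Println(sign(key, "000"))   // without the trailing newline
	fmt.Println(sign(key, "000\n")) // with it: a completely different signature
}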

aws-sdk-ruby Aws::ACM::Client#import_certificate File paths or contents of files

I'm trying to use aws-sdk-ruby to import certificates to ACM. However, when I try to use Aws::ACM::Client#import_certificate with either of the following methods, the stack trace tells me my private key is not 1024- or 2048-bit. If that were the case, Entrust wouldn't have signed my certificate. I also told the openssl program to generate a 2048-bit key.
The Error Message
The private key is not supported. Only RSA 1024-bit and 2048-bit private keys are allowed.
First code example
def acm_upload(options)
  require 'aws-sdk'
  @aws_region = ENV['AWS_REGION'] || ENV['AWS_DEFAULT_REGION'] || 'us-west-2'
  @aws_profile = ENV['AWS_PROFILE'] || ENV['AWS_DEFAULT_PROFILE'] || 'default'
  acm = Aws::ACM::Client.new(region: @aws_region, profile: @aws_profile)
  begin
    puts '=> Uploading Key, Cert, and Chain to ACM.'
    aws_response = acm.import_certificate({
      certificate: options[:cert_name],
      private_key: options[:key_name],
      certificate_chain: options[:chain_name],
    })
  rescue Aws::ACM::Errors::ServiceError => e
    puts 'An AWS ACM Service Error has occurred.'
    raise e.message
  rescue Aws::Errors::ServiceError => e
    puts 'An AWS Error has occurred.'
    raise e.message
  end
  puts aws_response
end

acm_upload({
  cert_name: './ssl/certificate/signed_cert.crt',
  key_name: './ssl/key/private_key.pem',
  chain_name: './ssl/chains/cert_chain.crt'
})
The first method call says my key is not 2048-bit. Then the second method does as well:
acm_upload({
  cert_name: File.read('./ssl/certificate/signed_cert.crt'),
  key_name: File.read('./ssl/key/private_key.pem'),
  chain_name: File.read('./ssl/chains/cert_chain.crt')
})
Same error as above. The documentation isn't very clear to me on what it's expecting. It says data, and I figured that meant the contents of the certificate file. Has anyone else had this issue before?
I was able to upload the key, certificate, and chain to ACM using the AWS CLI (Python) that they provide, passing file:// paths.
Try the AWS CLI and see if that works for you:
aws acm import-certificate --certificate file://certificate.crt --private-key file://private_key.key --certificate-chain file://certificate_chain.crt
aws --version
note: compatible with version: aws-cli/1.14.18 Python/2.7.9 Windows/8 botocore/1.8.22
note: NOT compatible with version: aws-cli/1.10.21 Python/2.7.9 Windows/8 botocore/1.4.12

Parsing a certificate string in go

I'm using ssldump to extract the certificate in a communication. When I parse the result, I obtain a string in Go defined as:
var certStr string
certStr = "30 82 06 9f...."
How can I parse it into an X.509 certificate?
UPDATED
I have tried to parse it directly:
certSlc := []byte(certStr)
cert, err := x509.ParseCertificates(certSlc)
But the result was:
Error: asn1: structure error: tags don't match (16 vs {class:0 tag:19 length:48 isCompound:true}) {optional:false explicit:false application:false defaultValue:<nil> tag:<nil> stringType:0 timeType:0 set:false omitEmpty:false}
Should I do another kind of conversion? Or is the string maybe incomplete, or the wrong type of cert?
I found the error. The problem was in the source.
As I was explaining, my cert string was "30 82 06 09...". This source must be decoded with:
hex.DecodeString(certStr)
The problem is that hex decoding doesn't work with this format; the error I obtained was:
encoding/hex: invalid byte: U+0020 ' '
So removing the whitespace and carriage returns from the original string is what makes it work.
After decoding into a byte slice, the X509 certificate can be parsed with no problem.
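Here is a minimal sketch of that fix, from stripping the separators through parsing (reading the ssldump output from stdin is my choice for the example):

package main

import (
	"crypto/x509"
	"encoding/hex"
	"fmt"
	"io"
	"os"
	"strings"
)

func main() {
	// Read the ssldump output ("30 82 06 9f ...") from stdin.
	raw, err := io.ReadAll(os.Stdin)
	if err != nil {
		panic(err)
	}

	// Drop the spaces and newlines between the hex bytes.
	compact := strings.Join(strings.Fields(string(raw)), "")

	der, err := hex.DecodeString(compact)
	if err != nil {
		panic(err)
	}

	certs, err := x509.ParseCertificates(der)
	if err != nil {
		panic(err)
	}
	fmt.Println(certs[0].Subject)
}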