I'm using ssldump to extract the certificate from a communication. When I parse the result, I obtain a string in Go defined as:
var certStr string
certStr = "30 82 06 9f...."
How can I parse it into an X509 certificate?
UPDATED
I have tried to parse it directly:
certSlc := []byte(certStr)
cert, err := x509.ParseCertificates(certSlc)
But the result was:
Error:asn1: structure error: tags don't match (16 vs {class:0 tag:19 length:48 isCompound:true}) {optional:false explicit:false application:false defaultValue:<nil> tag:<nil> stringType:0 timeType:0 set:false omitEmpty:false}
Should I do another kind of conversion? Maybe the string is incomplete, or it's the wrong type of cert?
I found the error. The problem was in the source string.
As I was explaining, my cert string was "30 82 06 9f...". This string must be decoded with:
hex.DecodeString(certStr)
The catch is that hex decoding doesn't work with this format as-is. The error I got was:
encoding/hex: invalid byte: U+0020 ' '
So removing the whitespace and carriage returns from the original string is what makes it work.
After decoding into a byte slice, the X509 certificate can be parsed with no problem.
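Putting the pieces together, here is a minimal sketch of the full round trip (the helper name and the strings.NewReplacer cleanup are mine, not from the original code):

package certparse

import (
	"crypto/x509"
	"encoding/hex"
	"strings"
)

// parseDumpedCert takes the spaced hex string produced by ssldump,
// strips the separators, hex-decodes it, and parses the DER bytes.
func parseDumpedCert(certStr string) (*x509.Certificate, error) {
	// hex.DecodeString only accepts contiguous hex digits, so drop
	// spaces, newlines, carriage returns, and tabs first.
	clean := strings.NewReplacer(" ", "", "\n", "", "\r", "", "\t", "").Replace(certStr)
	der, err := hex.DecodeString(clean)
	if err != nil {
		return nil, err
	}
	return x509.ParseCertificate(der)
}

x509.ParseCertificates works the same way if the dump contains more than one concatenated certificate.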
I followed earlier answers from "Webcrypto AES-CBC Decrypt: Operation Error - The operation failed for an operation-specific reason" and "JavaScript AES encryption and decryption (Advanced Encryption Standard)", and used:
iv = crypto.getRandomValues(new Uint8Array(16))
key = await window.crypto.subtle.generateKey(
  {
    name: "AES-GCM",
    length: 256,
  },
  false,
  ["encrypt", "decrypt"]
)
to generate the key
and
Uint8ArrayEncrypted = await window.crypto.subtle.encrypt(
  {name: "AES-GCM", iv: iv, tagLength: 128},
  key,
  Uint8ArrayVar)
to encrypt and
Uint8ArrayDecrypted = await window.crypto.subtle.decrypt(
  {name: "AES-GCM", iv: iv, tagLength: 128},
  key,
  Uint8ArrayEncrypted)
to decrypt
On Chromium 83 (Ubuntu) and Firefox 88, I successfully generate the key and the iv, and encrypt.
On Chromium, it also decrypts without problem: Uint8ArrayDecrypted is a correct ArrayBuffer.
But Firefox throws the error "The operation failed for an operation-specific reason" and stops there; no Uint8ArrayDecrypted is returned.
I didn't use a separate tag, as discussed in "WebCrypto API: DOMException: The provided data is too small".
Reading https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto/encrypt, I don't see that it uses a tag.
Does Firefox need something else specific?
Why is the error message so "generic"? Which operation, and which operation-specific reason?
With an error so generic, I don't know where to look.
For example, say I want to sign a cert with an arbitrary or deprecated extension (nsCertType, for example): https://www.openssl.org/docs/manmaster/man5/x509v3_config.html
I believe I'm supposed to add the arbitrary extension as part of the certificate, as below, but how / where do you discover the ASN.1 object identifier? I've read more documentation than I care to admit today and am still stumped.
tmpl := &x509.Certificate{
	SerialNumber: big.NewInt(time.Now().Unix() * 1000),
	Subject:      pkix.Name{CommonName: "edgeproxy", Organization: []string{"edgeproxy"}},
	NotBefore:    now,
	NotAfter:     now.Add(caMaxAge),
	ExtraExtensions: []pkix.Extension{
		{
			Id:       asn1.ObjectIdentifier{}, // what goes here?
			Critical: false,
			Value:    []byte("sslCA"),
		},
	},
	ExtKeyUsage: []x509.ExtKeyUsage{
		x509.ExtKeyUsageServerAuth, x509.ExtKeyUsageClientAuth,
		x509.ExtKeyUsageEmailProtection, x509.ExtKeyUsageTimeStamping,
		x509.ExtKeyUsageMicrosoftCommercialCodeSigning,
		x509.ExtKeyUsageMicrosoftServerGatedCrypto,
		x509.ExtKeyUsageNetscapeServerGatedCrypto,
	},
	KeyUsage:              x509.KeyUsageCRLSign | x509.KeyUsageCertSign,
	IsCA:                  true,
	BasicConstraintsValid: true,
}
In Python I would do this, but I don't know how to port it to Go (which is what I'm using at the end of the day):
OpenSSL.crypto.X509Extension(
    b"nsCertType",
    False,
    b"sslCA"
),
Go sources at https://golang.org/src/encoding/asn1/asn1.go define:
// An ObjectIdentifier represents an ASN.1 OBJECT IDENTIFIER.
type ObjectIdentifier []int
So an object identifier (OID for short) is a slice of integers, and the asn1 package has functions to parse them, like parseObjectIdentifier.
This is the type you need for the Id: field.
But now you need to find out which OID you want.
While difficult to read, the OpenSSL source code can show you the OIDs of many things in the X.400/X.500/X.509 world, or at least those known to OpenSSL.
If you go to https://github.com/openssl/openssl/blob/1aec7716c1c5fccf605a46252a46ea468e684454/crypto/objects/obj_dat.h
and search for nsCertType, you get:
{"nsCertType", "Netscape Cert Type", NID_netscape_cert_type, 9, &so[407]},
so is defined earlier in the same file, and if you jump to its 407th item you see:
0x60,0x86,0x48,0x01,0x86,0xF8,0x42,0x01,0x01, /* [ 407] OBJ_netscape_cert_type */
and a final search for OBJ_netscape_cert_type in the same file gives:
71, /* OBJ_netscape_cert_type 2 16 840 1 113730 1 1 */
which means the corresponding OID is 2.16.840.1.113730.1.1
Or you can decode that list of bytes yourself (see "How does ASN.1 encode an object identifier?" for details):
First, 0x60 is 96 in decimal, i.e. 2*40 + 16, which means the OID starts with 2.16.
Then every other component is in "base 128" form: if the most significant bit is 1, combine the 7 least significant bits of all following bytes until you reach one whose most significant bit is 0.
0x86 is 10000110 in binary, so it goes with 0x48 (01001000 in binary); combining the two 7-bit groups gives 00001101001000 in binary, which is 840 in decimal.
0x01 is less than 128, so it is itself: 1.
0x86 is again 10000110 in binary, but has to be paired with both 0xF8 (11111000) and 0x42 (01000010, where we stop since the first bit is 0), giving 000011011110001000010 in binary altogether, which is 113730 in decimal.
And the two last 0x01 bytes are themselves: 1 and 1.
So we do get 2.16.840.1.113730.1.1 again.
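For reference, the same walk in a few lines of Go; this is only an illustrative sketch (the /40 and %40 split of the first byte works for this OID, but joint-iso-itu-t arcs with a second component of 40 or more would need extra care):

// decodeOID expands the content bytes of a DER-encoded OBJECT IDENTIFIER
// into its dotted-integer components, exactly as done by hand above.
func decodeOID(der []byte) []int {
	// the first byte packs the first two components as 40*c1 + c2
	oid := []int{int(der[0]) / 40, int(der[0]) % 40}
	v := 0
	for _, b := range der[1:] {
		v = v<<7 | int(b&0x7f) // accumulate 7 bits per byte
		if b&0x80 == 0 {       // high bit clear: the component is complete
			oid = append(oid, v)
			v = 0
		}
	}
	return oid
}

Feeding it the nine bytes from obj_dat.h, decodeOID([]byte{0x60, 0x86, 0x48, 0x01, 0x86, 0xF8, 0x42, 0x01, 0x01}), returns [2 16 840 1 113730 1 1].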
You can double-check it in an online OID browser, for example here:
http://oid-info.com/cgi-bin/display?oid=2.16.840.1.113730.1.1&action=display
that gives the following description for it:
Netscape certificate type (a Rec. ITU-T X.509 v3 certificate extension
used to identify whether the certificate subject is a Secure Sockets
Layer (SSL) client, an SSL server or a Certificate Authority (CA))
You can then even browse the various arcs, like the Netscape one, to find other OIDs.
You also get the full ASN.1 notation:
{joint-iso-itu-t(2) country(16) us(840) organization(1) netscape(113730) cert-ext(1) cert-type(1)}
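Back in Go, the OID is then just a literal slice of ints for the Id: field. One caveat the Python snippet hides: pyOpenSSL translates the "sslCA" shorthand into DER for you, whereas pkix.Extension.Value must already contain DER. nsCertType is a BIT STRING and sslCA is bit 5 in Netscape's definition (the bit mapping comes from the old Netscape extension documentation, not from the question), so a sketch could look like:

import (
	"crypto/x509/pkix"
	"encoding/asn1"
)

// OID of the Netscape Cert Type extension: 2.16.840.1.113730.1.1.
var oidNetscapeCertType = asn1.ObjectIdentifier{2, 16, 840, 1, 113730, 1, 1}

// nsCertTypeSSLCA builds the extension with only the "SSL CA" bit set.
// Bits are counted from the most significant bit, so bit 5 is 0x04;
// BitLength 6 marks bits 0 through 5 as significant.
func nsCertTypeSSLCA() (pkix.Extension, error) {
	der, err := asn1.Marshal(asn1.BitString{Bytes: []byte{0x04}, BitLength: 6})
	if err != nil {
		return pkix.Extension{}, err
	}
	return pkix.Extension{Id: oidNetscapeCertType, Critical: false, Value: der}, nil
}

The returned pkix.Extension can go straight into the template's ExtraExtensions in place of the placeholder from the question.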
I've been trying to encode strings using the protoc CLI utility.
I noticed that the output still contains plain text.
What am I doing wrong?
osboxes@osboxes:~/proto/bin$ cat ./teststring.proto
syntax = "proto2";
message Test2 {
optional string b = 2;
}
echo b:\"my_testing_string\" | ./protoc --encode Test2 teststring.proto > result.out
result.out contains:
^R^Qmy_testing_string
protoc versions: libprotoc 3.6.0 and libprotoc 2.5.0.
Just to formalize in an answer:
The command as written should be fine; the output is protobuf binary - it just resembles text because protobuf uses UTF-8 to encode strings, and your content is dominated by a single string field. Despite appearances, the file isn't actually text, and you should usually use a hex viewer or similar if you need to inspect it.
If you want to understand the internals of a file, https://protogen.marcgravell.com/decode is a good resource - it rips an input file or hex string following the protocol rules, and tells you what each byte means (field headers, length prefixes, payloads, etc).
I'm guessing your file is actually:
(hex) 12 11 6D 79 5F etc
i.e. 0x12 = "field 2, length prefixed" (the ^R in your output), 0x11 = 17 (the payload length as a varint, your ^Q), then "my_testing_string" encoded as 17 bytes of UTF-8.
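If you want to check the round trip rather than eyeballing bytes, protoc can also decode the binary back to text with the same .proto (assuming the files from the question):

./protoc --decode=Test2 teststring.proto < result.out
# prints: b: "my_testing_string"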
protoc --proto_path=${protobuf_path} --encode=${protobuf_message} ${protobuf_file} < ${source_file} > ${output_file}
and in this case:
protoc --proto_path=~/proto/bin --encode="Test2" ~/proto/bin/teststring.proto < source.txt > ./output.bin
or:
echo b:\"my_testing_string\" | protoc --proto_path=~/proto/bin --encode="Test2" ~/proto/bin/teststring.proto > ./output.bin
I'm trying to implement RSA-SHA1 signature verification.
To my surprise, the command line OpenSSL tool doesn't generate the same signature as Ruby's OpenSSL.
If I run these commands:
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ cat data.txt
000
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ openssl dgst -sha1 -binary -sign prvkey.pem -out sig.bin data.txt
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ openssl base64 -in sig.bin -out sig64.txt
MacBook-Pro-de-Geoffrey:ssl_tests Escaflowne$ cat sig64.txt
AJEh2kA7O3j624Kdl7UCGN1HiEk/v2LQudB+cjxw1CfmRTjcSPBjUE/EAwy8NEut
K4zYgfRwwTs7NY3AwYiUEtAe5yohUM0Qv17qSDW+G4IWjwe9PKE7Sl00umiMdszA
q/1hqeQlHKgjme7YO7H6i1UcAXmriOOjn+ySRaovsHw=
So the final base64 result on the command line is:
AJEh2kA7O3j624Kdl7UCGN1HiEk/v2LQudB+cjxw1CfmRTjcSPBjUE/EAwy8NEut
K4zYgfRwwTs7NY3AwYiUEtAe5yohUM0Qv17qSDW+G4IWjwe9PKE7Sl00umiMdszA
q/1hqeQlHKgjme7YO7H6i1UcAXmriOOjn+ySRaovsHw=
Now, if I try signing it through my Ruby script:
def sign_message(message)
  privkey = OpenSSL::PKey::RSA.new(File.read(Rails.root.join('lib', 'payment', 'prvkey.pem')))
  digest = OpenSSL::Digest.new('sha1')
  expected_sign = privkey.sign(digest, message)
  base_64_expected_sign = [expected_sign].pack('m')
  puts "Expected Signature"
  puts expected_sign
  puts "Base 64 Expected Signature"
  puts base_64_expected_sign
  return base_64_expected_sign
end
And calling the function like this:
def test_sign
  message = "000"
  message_signature = sign_message(message)
  puts "Message Signature : #{message_signature}"
  puts "Valid : #{verify_signature(message_signature, message)}"
end
I get the output:
Expected Signature
??|??n?^~?T_1Y#??BR??u???k x?*????S?L?:.7
t??tc?)崪? ?}DMp?p2??4?D-f??jT;!e
?k??5??
Base 64 Expected Signature
QjhL1zQoUdGFLVCMg06/CKeE/HdhRTOhJ/p09wkWeK0qD/afsxfcU7tMtDou
Nw3rwXw/5W68XhZ+BK1UXwIxWUDbFYlCUpu6HnWTmI5rC3QP+f50Y8kp5bSq
gQkekH1ETXDmcDKvExeSNKVELWYe3uwTalQ7IWUMyWvnF541rvo=
Message Signature : QjhL1zQoUdGFLVCMg06/CKeE/HdhRTOhJ/p09wkWeK0qD/afsxfcU7tMtDou
Nw3rwXw/5W68XhZ+BK1UXwIxWUDbFYlCUpu6HnWTmI5rC3QP+f50Y8kp5bSq
gQkekH1ETXDmcDKvExeSNKVELWYe3uwTalQ7IWUMyWvnF541rvo=
So the final Ruby OpenSSL signature is:
QjhL1zQoUdGFLVCMg06/CKeE/HdhRTOhJ/p09wkWeK0qD/afsxfcU7tMtDou
Nw3rwXw/5W68XhZ+BK1UXwIxWUDbFYlCUpu6HnWTmI5rC3QP+f50Y8kp5bSq
gQkekH1ETXDmcDKvExeSNKVELWYe3uwTalQ7IWUMyWvnF541rvo=
Versus the command line:
AJEh2kA7O3j624Kdl7UCGN1HiEk/v2LQudB+cjxw1CfmRTjcSPBjUE/EAwy8NEut
K4zYgfRwwTs7NY3AwYiUEtAe5yohUM0Qv17qSDW+G4IWjwe9PKE7Sl00umiMdszA
q/1hqeQlHKgjme7YO7H6i1UcAXmriOOjn+ySRaovsHw=
I've been struggling with this for some time now and I don't understand what could be making a difference!
UPDATE:
Apparently the results match if I replace my message variable with File.read(Rails.root.join('lib', 'payment', 'data.txt')).
So basically, using a string with the same value as what's in the text file doesn't give the same result.
That means it's encoding-related, right?
UPDATE 2:
The file is encoded as us-ascii according to file -I data.txt.
However, message.encoding.name says it's loaded as UTF-8.
Also, message.encode('ascii') does not alter the generated signature; it still matches the command line OpenSSL one.
As soon as I switch to a string, "000".encode('utf-8') or "000".encode('ascii'), the signatures don't match anymore.
So encoding doesn't seem to play a role at all.
How come there's a difference between the exact same content depending on whether it comes from a file or is written as a string literal?
The file data.txt has a trailing newline that you are not taking into account in your code. Using
message = "000\n"
should work.
You could also do
message = File.binread("data.txt")
to make sure you get the exact data as the command line.
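A quick irb check makes the difference visible (assuming data.txt as created above; echo and most editors append that trailing newline):

File.read("data.txt")          # => "000\n", not "000"
File.read("data.txt") == "000" # => false
File.binread("data.txt")       # => "000\n" in binary (ASCII-8BIT) encoding, byte-for-byte what openssl signed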
I'm using this library in Go (on OS X) to interact with a Windows DNS server.
When running the snippet below, I get an error about a null terminator.
$ ~/go/bin/winrm-dns-client create -d domain.com -n node-0 -t A -v 10.90.61.30
2018/06/03 12:40:22 Error creating DNS record: Reading record: Reading record: Unmarshalling response: Unmarshalling json: invalid character '\x00' after array element
My suspicion is that a null terminator is added here, where the helper method calls Sprintf to concatenate JSON responses into an array.
However, even after adding a bytes.Trim as shown below, I still get the null terminator error, and the null byte still seems to be there...
func unmarshalResponse(resp string) ([]interface{}, error) {
	var data interface{}
	byteRespTrim := []byte(resp)

	fmt.Print("found a null terminator at -- ")
	fmt.Println(bytes.Index(byteRespTrim, []byte("\x00")))
	fmt.Print("total length = ")
	fmt.Println(len(byteRespTrim))

	byteRespTrim = bytes.Trim(byteRespTrim, "\x00")

	fmt.Print("after trim found a null terminator at -- ")
	loc := bytes.Index(byteRespTrim, []byte("\x00"))
	fmt.Println(loc)
	// ... rest of the function unchanged
When calling it, I get the output below:
(master)⚡ % ./windows-dns-test create -d domain.com -n openshift-node-0 -t A -v 10.90.61.30
found a null terminator at -- 2102
total length = 2615
after trim found a null terminator at -- 2102
From your logs, the offending character is at position 2102, while the whole payload is 2615 bytes long.
So it looks like Trim won't solve it: Trim only strips leading and trailing bytes, and the problem character is in the middle, not at either end.
Did you try removing all occurrences, using Replace for example?
byteRespTrim = bytes.Replace(byteRespTrim, []byte("\x00"), []byte{}, -1)
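Putting it together, a minimal sketch of unmarshalResponse with the embedded nulls stripped before decoding (names follow the snippet in the question; the []interface{} target type is assumed from the signature):

import (
	"bytes"
	"encoding/json"
	"fmt"
)

func unmarshalResponse(resp string) ([]interface{}, error) {
	// strip every null byte, not just leading/trailing ones
	cleaned := bytes.Replace([]byte(resp), []byte("\x00"), nil, -1)
	var data []interface{}
	if err := json.Unmarshal(cleaned, &data); err != nil {
		return nil, fmt.Errorf("unmarshalling json: %v", err)
	}
	return data, nil
}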