Storing crypto/rand generated string issues - go

So I have the following Go file(s) as part of my project, used for hashing passwords, and I also wrote some tests that, to my knowledge, have yet to fail.
Currently the issue is that I am trying to store the password hash and salt in a database as strings, and every time I retrieve them to be compared against another string I keep getting the error shown in the picture from Go's bcrypt package. The tests I wrote run fine and produce the expected results. I would have supplied a Go Playground link, but the bcrypt package is not part of the standard library.
I know the gibberish from crypto/rand looks pretty much the same at first glance, but I am not sure whether anything is being changed in the database. I am using Redis, FYI.
Edit: at the request of #3of3, I am including the DAO code from my project. Also, the bcrypt-only solution worked with this code, but as I stated in the comments, I am aiming to stick to Mozilla's guide.

The salt does not round-trip through the JSON encode/decode because the salt is not valid UTF-8.
There are a few ways to fix the problem:
Hex- or base64-encode/decode the salt in the hasher.
Use the []byte type for the salt throughout the code. The JSON encoder encodes []byte values using base64.
Use the gob encoder instead of the JSON encoder.
Mozilla recommends storing the extra salt separately from the bcrypted password. By storing the extra salt together with the bcrypted password, the system is no more secure than using bcrypt alone.
To hex encode the salt, change
return string(p), string(salt), nil
to
return string(p), hex.EncodeToString(salt), nil
and change
s := []byte(salt)
to
s, err := hex.DecodeString(salt)
if err != nil {
    return err
}
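Putting it together, here is a minimal sketch of a hasher along those lines (the original hasher isn't shown, so the function names here are illustrative, not the poster's): the salt is generated with crypto/rand, the salted password is hashed with bcrypt, and the salt is hex-encoded so it survives the JSON/Redis round trip.

package hasher

import (
    "crypto/rand"
    "encoding/hex"

    "golang.org/x/crypto/bcrypt"
)

// Hash returns the bcrypt hash of password+salt and the hex-encoded salt.
func Hash(password string) (hash string, saltHex string, err error) {
    salt := make([]byte, 16)
    if _, err = rand.Read(salt); err != nil {
        return "", "", err
    }
    p, err := bcrypt.GenerateFromPassword(append([]byte(password), salt...), bcrypt.DefaultCost)
    if err != nil {
        return "", "", err
    }
    return string(p), hex.EncodeToString(salt), nil
}

// Compare checks a password against a stored bcrypt hash and hex-encoded salt.
func Compare(password, hash, saltHex string) error {
    s, err := hex.DecodeString(saltHex)
    if err != nil {
        return err
    }
    return bcrypt.CompareHashAndPassword([]byte(hash), append([]byte(password), s...))
}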

It seems you forgot to hex-encode the generated bytes, so when you cast the []byte variable to a string you get something weird. Using the hex package you can create the actual string you want:
hex.EncodeToString(hash)

Related

Is there any way to check and change a file's encoding at the same time in Go?

I use the golang.org/x/text/encoding/charmap package and the code below to change a file's encoding, and it works well.
data, err := ioutil.ReadFile(*file)
checkError(err)
encode := charmap.Windows1256.NewDecoder()
transform, err := encode.Bytes(data)
checkError(err)
err = ioutil.WriteFile(string(*output), transform, 0644) // file permissions must be octal: 0644, not 644
checkError(err)
Now I want to work with some other encoding types too, but the problem is that there seems to be no way to detect a file's encoding or to pick the matching charmap encoding for the conversion, short of a switch/case over every encoding type, and there are a lot of them. This feels like the wrong way to do it, so is there any way to make this process more productive?
Thank you for your help.
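One way to avoid the big switch (a sketch, not from the original thread) is the golang.org/x/text/encoding/ianaindex package, which looks up an encoding implementation by its IANA/MIME name at run time. Actual detection of an unknown file's encoding is a separate problem and is not covered here.

package main

import (
    "fmt"
    "io/ioutil"
    "log"

    "golang.org/x/text/encoding/ianaindex"
)

// decodeFile decodes the named file from the given encoding (e.g. "windows-1256")
// into UTF-8, resolving the encoding name via the IANA index instead of a switch.
func decodeFile(path, encodingName string) ([]byte, error) {
    enc, err := ianaindex.IANA.Encoding(encodingName)
    if err != nil || enc == nil {
        return nil, fmt.Errorf("unknown or unsupported encoding %q: %v", encodingName, err)
    }
    data, err := ioutil.ReadFile(path)
    if err != nil {
        return nil, err
    }
    return enc.NewDecoder().Bytes(data)
}

func main() {
    out, err := decodeFile("input.txt", "windows-1256")
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(string(out))
}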

Ruby RSA public key encryption to Golang

I'm currently working on a project where I have to "convert" some code from Ruby (version 1.9.3p194) to Go (version 1.7). There is a part where Ruby uses RSA public-key encryption, and I always get a consistent result every time it is executed. This is the code used:
Edit: I overlooked that after the public-key encryption there is a base64 encoding as well.
public_key = OpenSSL::PKey::RSA.new(public_encryption_key)
public_encrypted_text = public_key.public_encrypt(text, OpenSSL::PKey::RSA::NO_PADDING)
base64_encrypted_text = Base64.encode64(public_encrypted_text).gsub("\n", "")
escaped_encrypted_text = URI.escape(base64_encrypted_text, "/+=")
However, in Go, because of the rsa package, I can't get a consistent result: the encryption function takes a random source as a parameter and generates a different result each time. I understand why it needs to be different every time, but I can't get anything remotely similar to what Ruby generates. These are the functions used in Go:
//keyBytes is the public key as []byte
block, _ := pem.Decode(keyBytes)
key, err := x509.ParsePKIXPublicKey(block.Bytes)
if err != nil {
    return nil, err
}
pubKey, ok := key.(*rsa.PublicKey)
if !ok {
    return nil, errors.New("Cannot convert to rsa.PublicKey")
}
result, err := rsa.EncryptPKCS1v15(cryptorand.Reader, pubKey, text)
encryptedText := base64.URLEncoding.EncodeToString(result)
encryptedText = strings.TrimRight(encryptedText, "=")
One of the problems is that Ruby can encrypt the text with no problem, while in Go I get an error that the key is too short to encrypt everything.
If I encrypt something shorter, like "Hello", then when decrypting in Ruby I get the error "padding check failed". The decryption is handled as follows:
private_key.private_decrypt(Base64.decode64(text))
EDIT: Thanks to gusto2's answer I now have a better idea of what is going on, since I didn't have much understanding of RSA.
In Go I was then able to encrypt the text using PKCS1 v1.5, and just to be sure I decrypted it as well, also in Go, with no problem.
However, in Ruby I still wasn't able to decrypt with the private key, so I figured the base64 encoding used on the Go side was the issue. I changed that part to this:
encryptedText := base64.StdEncoding.EncodeToString(result)
I also removed the last line, where the '=' padding was being trimmed.
With that done it worked like a charm.
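For reference, a consolidated sketch of the working Go side as described above (assuming the same keyBytes, text, and cryptorand import as in the earlier snippet, and a function that returns a string and an error):

block, _ := pem.Decode(keyBytes)
key, err := x509.ParsePKIXPublicKey(block.Bytes)
if err != nil {
    return "", err
}
pubKey, ok := key.(*rsa.PublicKey)
if !ok {
    return "", errors.New("not an RSA public key")
}
// PKCS#1 v1.5 is also Ruby's default padding for public_encrypt.
result, err := rsa.EncryptPKCS1v15(cryptorand.Reader, pubKey, text)
if err != nil {
    return "", err
}
// Standard base64, keeping the trailing '=' padding so Ruby's
// Base64.decode64 can decode it.
return base64.StdEncoding.EncodeToString(result), nil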
I am not knowledgeable about Go; however, I may know something about RSA.
The difference seems to be in the padding.
For Ruby - no padding is used.
For Go - PKCS1 v1.5 padding is used.
In the Ruby example OpenSSL::PKey::RSA::NO_PADDING is used, which is very, very unsafe. This is called textbook RSA and is not intended for real-life use, as it has many weaknesses and dangerous traps. So the Ruby example is dangerously unsafe because it uses textbook RSA. It is also limited to encrypting small messages (much smaller than the key size).
There are two padding schemes commonly used with RSA encryption:
PKCS1 v1.5 (commonly referred to simply as PKCS1) - many cryptographers consider this option obsolete, as weaknesses have been found when it is not used properly, but it is still in use and not considered broken. Note that for encryption it also mixes in random bytes, so the ciphertext differs on each run.
PKCS1 v2 (commonly referred to as OAEP for encryption, or PSS for signatures) - a randomized padding scheme with stronger security properties. Since both schemes randomize the ciphertext, randomized output alone does not tell you which one is in use.
One of the problems is that ruby can encrypt the text with no problem, and in golang I'm getting an error that the key is too short to encrypt everything
You've provided only a small part of the Go example, so I can only assume many things here.
Since you say the Go example produces randomized output and, according to the parameters, PKCS1 v1.5 is used, I'd assume the implementation is doing hybrid encryption, which is a good and much safer way to encrypt data with RSA (encrypt the data with a symmetric cipher under a random key, then wrap/encrypt that key with RSA).
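For completeness, here is a minimal sketch of the hybrid pattern described above (not the poster's code; all names are made up for illustration): encrypt the payload with AES-256-GCM under a fresh random key, then wrap that key with RSA-OAEP.

import (
    "crypto/aes"
    "crypto/cipher"
    "crypto/rand"
    "crypto/rsa"
    "crypto/sha256"
)

// hybridEncrypt encrypts plaintext with a fresh AES-256-GCM key and wraps
// that key with RSA-OAEP. It returns the wrapped key, the GCM nonce, and
// the ciphertext.
func hybridEncrypt(pub *rsa.PublicKey, plaintext []byte) (wrappedKey, nonce, ciphertext []byte, err error) {
    aesKey := make([]byte, 32)
    if _, err = rand.Read(aesKey); err != nil {
        return nil, nil, nil, err
    }
    block, err := aes.NewCipher(aesKey)
    if err != nil {
        return nil, nil, nil, err
    }
    gcm, err := cipher.NewGCM(block)
    if err != nil {
        return nil, nil, nil, err
    }
    nonce = make([]byte, gcm.NonceSize())
    if _, err = rand.Read(nonce); err != nil {
        return nil, nil, nil, err
    }
    ciphertext = gcm.Seal(nil, nonce, plaintext, nil)
    wrappedKey, err = rsa.EncryptOAEP(sha256.New(), rand.Reader, pub, aesKey, nil)
    if err != nil {
        return nil, nil, nil, err
    }
    return wrappedKey, nonce, ciphertext, nil
}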

Golang OpenSSL bad iv size

I'm trying to create the following:
cipher, err := openssl.GetCipherByName("aes-128-ecb")
decryptionTool, err := openssl.NewDecryptionCipherCtx(cipher, nil, byteKey, iv)
The byteKey and iv are both 16 bytes long.
When I run my code, I get the following error:
panic: bad IV size (16 bytes instead of 0)
I read the documentation and checked the source code, but I still can't find a way to add the IV without getting an error. I am using spacemonkeygo's OpenSSL bindings.
Does anyone know what's wrong?
Thanks in advance!
EDIT: I added the cipher type above, and a bit more detail about byteKey and iv.
EDIT 2: ECB has no IV! I totally forgot about that. Well, I guess that solves it.
I had used ECB the first time and then wanted to change to CBC, but I forgot to change the cipher name to the CBC variant (aes-128-cbc), so I got an error when I tried adding an IV.
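A sketch of the fix described above, reusing the same spacemonkeygo calls from the question (the surrounding function and error handling are assumed): request a CBC cipher, which actually takes a 16-byte IV, instead of ECB, which takes none.

cipher, err := openssl.GetCipherByName("aes-128-cbc") // CBC mode expects a 16-byte IV; ECB takes no IV at all
if err != nil {
    return nil, err
}
decryptionTool, err := openssl.NewDecryptionCipherCtx(cipher, nil, byteKey, iv)
if err != nil {
    return nil, err
}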

Is there a way to ensure that all data in a yaml string was parsed?

For testing, I often see Go code that reads byte slices and parses them into structs using YAML, for example here:
https://github.com/kubernetes/kubernetes/blob/master/pkg/util/strategicpatch/patch_test.go#L74m
I just got bitten by not exporting my field names, resulting in an empty list which I iterated over in my test cases, thus assuming that all tests were passing (in hindsight, that should have been a red flag :)). There are other errors which are silently ignored by yaml unmarshaling, such as a key being misspelled and not matching a struct field exactly.
Is there a way to ensure that all the data in the byte slice was actually parsed into the struct returned by yaml.Unmarshal? If not, how do others handle this situation?
go-yaml/yaml
For anyone searching for a solution to this problem: the yaml.v2 library has an UnmarshalStrict function that returns an error if there are keys in the YAML document that have no corresponding fields in the Go struct.
import yaml "gopkg.in/yaml.v2"
err := yaml.UnmarshalStrict(data, destinationStruct)
BurntSushi/toml
It's not part of the question, but I'd just like to document how to achieve something similar with TOML:
You can find out whether there were any keys in the TOML file that could not be decoded by using the metadata returned by the toml.Decode function.
import "github.com/BurntSushi/toml"
metadata, err := toml.Decode(data, destinationStruct)
undecodedKeys := metadata.Undecoded()
Note that metadata.Undecoded() also returns keys that have not been decoded because of a toml.Primitive value. You can read more about that in the package documentation.
JSON
The standard Go encoding/json package does not support this currently, but there is a proposal ready to be merged. It seems it will be part of Go 1.10.
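For reference, this shipped in Go 1.10 as the Decoder's DisallowUnknownFields method; a small sketch, assuming the same data and destinationStruct variables as in the snippets above:

import (
    "bytes"
    "encoding/json"
)

dec := json.NewDecoder(bytes.NewReader(data))
dec.DisallowUnknownFields() // unknown keys in the JSON now cause Decode to return an error
err := dec.Decode(destinationStruct)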

"Recalculate" SHA512+salt string to BLOWFISH+salt - is this possible?

Maybe this is a stupid question, but I wouldn't be shocked if some excellent brains came up with a proper solution or an idea: is it possible to recalculate/transcode a salted SHA-512 string into a salted Blowfish string?
The (imo quite interesting) background is: I have a big database of SHA512+salt strings like that $6$rounds=5000$usesomesillystri$D4IrlXatmP7rx3P3InaxBeoomnAihCKREY4... (118 chars) and want to move to another hash/salt algorithm, generating strings like $2a$07$usesomesillystringfore2uDLvp1Ii2e./U9C8sBjqp8I90dH6hi (60 chars).
I'm intentionally NOT asking this on security.stackexchange.com as this is not a security question. It's about transcoding/recalculation.
Is it possible to recalculate/transcode a salted sha512 string into a salted blowfish string ?
Nope.
SHA2-512 is a cryptographic hash. Data goes in, but there's no way to get it back out. Do note that the thing you're using is a proposed but not standardized form of crypt that uses SHA2, and is not a raw SHA2 hash.
bcrypt (which is derived from, but is not, Blowfish) is a key derivation function which, while a different thing from a cryptographic hash, still has the same property: data goes in, but there's no way to get it back out.
There is no way to simply convert one of these password hash types to another. This is true of almost every hash type. If you need to change the hash type, do so when the user next logs in.
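A sketch of that migrate-on-login approach in Go (verifySHA512Crypt is a hypothetical helper standing in for whatever currently checks the existing $6$ hashes, e.g. a crypt(3)-compatible library; it is not a real function from any package named here):

import (
    "errors"

    "golang.org/x/crypto/bcrypt"
)

// upgradeHashOnLogin verifies the password against the stored SHA-512-crypt
// hash and, if it matches, returns a new bcrypt hash to store in its place.
func upgradeHashOnLogin(password, storedSHA512Crypt string) (newBcryptHash string, err error) {
    // verifySHA512Crypt is hypothetical: any function that checks a
    // "$6$rounds=...$salt$hash" string against the plaintext will do.
    if !verifySHA512Crypt(password, storedSHA512Crypt) {
        return "", errors.New("wrong password")
    }
    h, err := bcrypt.GenerateFromPassword([]byte(password), bcrypt.DefaultCost)
    if err != nil {
        return "", err
    }
    return string(h), nil
}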
