Getting as3crypto to work with ruby (Gibberish/EzCrypto) - ruby

I'm trying to get as3crypto to play nice with either Gibberish or EzCrypto in AES-128 mode.
No matter what combination of settings I use, I simply cannot get one to decrypt the other; I usually get a "bad decrypt" message in Ruby. Each environment can decrypt data it encrypted itself, but neither can decrypt the other's output.
Has anyone been able to get the two to work together?
Here's one of the variations I tried:
On the Actionscript side, using as3crypto:
//define the encryption key
var key:ByteArray = Hex.toArray("password");
//put plaintext into a bytearray
var plainText:ByteArray = Hex.toArray(Hex.fromString("this is a secret!"));
//set the encryption key
var aes:AESKey = new AESKey(key);
//encrypt the text
aes.encrypt( plainText );
trace(Base64.encode(Hex.fromArray(plainText)));
//encrypted value is N2QwZmI0YWQ4NzhmNDNhYjYzM2QxMTAwNGYzNDI1ZGUyMQ==
And on the ruby side, using gibberish:
require 'gibberish'
# also tried the default size (256)
cipher = Gibberish::AES.new("password", 128)
# raises the following exception: OpenSSL::Cipher::CipherError: wrong final block length
cipher.dec("N2QwZmI0YWQ4NzhmNDNhYjYzM2QxMTAwNGYzNDI1ZGUyMQ==")
I've tried all sorts of different approaches, all yielding either the above exception or "bad decrypt".

Finally figured it out myself. The thing is that neither Gibberish nor EzCrypto seems to provide a way to specify an IV, which is needed when using AES in CBC mode. The trick is to extract the IV from the first 16 bytes of the encrypted data as3crypto produces.
Here's the as3 code, which also changed a little:
// there are other ways to create the key, but this works well
var key:ByteArray = new ByteArray();
key.writeUTFBytes(MD5.encrypt("password"));
// encrypt the data. simple-aes-cbc is equivalent to aes-256-cbc in openssl/ruby,
// provided the key is long enough (an MD5 hex digest is 32 characters, i.e. a 32-byte key here)
var data:ByteArray = Hex.toArray(Hex.fromString("secret"));
var mode:ICipher= Crypto.getCipher("simple-aes-cbc", key) ;
mode.encrypt(data);
// the traced value is Base64; it decodes to 32 bytes, the first 16 of which are the IV
// needed to decrypt the data in ruby
// e.g.: sEFOIF57LVGC+HMEI9EMTpcJdcu4J3qJm0PDdHE/OSY=
trace(Base64.encodeByteArray(data));
The Ruby part uses a gem called encryptor to supply the IV. You could also use OpenSSL directly; it's pretty straightforward (a sketch of that follows the snippet below):
require 'base64'
require 'digest/md5'
require 'encryptor'

key = Digest::MD5.hexdigest("password")
# decode the base64 encoded data back to binary:
encrypted_data = Base64.decode64("sEFOIF57LVGC+HMEI9EMTpcJdcu4J3qJm0PDdHE/OSY=")
# the tricky part: extract the IV from the decoded data
iv = encrypted_data.slice!(0, 16)
# decrypt!
Encryptor.decrypt(encrypted_data, :key => key, :iv => iv)
# should output "secret"
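For reference, a minimal sketch of the same decryption using OpenSSL::Cipher directly, without the encryptor gem, assuming the as3crypto output and MD5-derived key shown above:
require 'openssl'
require 'base64'
require 'digest/md5'

# decode, then split off the 16-byte IV that as3crypto prepends
encrypted_data = Base64.decode64("sEFOIF57LVGC+HMEI9EMTpcJdcu4J3qJm0PDdHE/OSY=")
iv = encrypted_data.slice!(0, 16)

# simple-aes-cbc with a 32-byte key corresponds to aes-256-cbc
cipher = OpenSSL::Cipher.new("aes-256-cbc")
cipher.decrypt
cipher.key = Digest::MD5.hexdigest("password")
cipher.iv = iv
puts cipher.update(encrypted_data) + cipher.final
# => "secret"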

Related

Why does OpenSSL not return the same values using public_key.verify and private_key.private_encrypt(digest)

As far as I understand, signing data with an asymmetric key pair is essentially the same as using the private key to encrypt that data. It says as much here: http://ruby-doc.org/stdlib-2.0.0/libdoc/openssl/rdoc/OpenSSL.html. Ruby's openssl seems to hash the data first and then sign the hash. So I am confused about why the signature value is not the same at the end of the following operations:
1.
data = 'Sign me!'
digest = OpenSSL::Digest::SHA256.new
pkey = OpenSSL::PKey::RSA.new(2048)
signature = pkey.sign(digest, data)
puts signature
2.
data = 'Sign me!'
data_hash = OpenSSL::Digest::digest("SHA256", data)
signature = pkey.private_encrypt(data_hash.to_s)
puts signature
(or if the digest is not working as I thought)
3.
data = 'Sign me!'
signature = pkey.private_encrypt(data)
puts signature
I am guessing that there is some formatting going on in the background. Note that this is for academic understanding; I will use verify for actual signature verification.
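That guess about formatting is essentially right: sign wraps the hash in a DER-encoded DigestInfo structure before the PKCS#1 v1.5 private-key operation, while private_encrypt pads whatever raw bytes it is given. A minimal sketch (assuming SHA-256 and the default PKCS#1 v1.5 padding) of reproducing sign by hand:
require 'openssl'

data = 'Sign me!'
pkey = OpenSSL::PKey::RSA.new(2048)
digest = OpenSSL::Digest::SHA256.new

# high-level API: hash the data, wrap the hash in a DigestInfo structure,
# apply PKCS#1 v1.5 padding, then perform the RSA private-key operation
signature = pkey.sign(digest, data)

# by hand: prepend the DER prefix identifying SHA-256, then let
# private_encrypt apply the PKCS#1 v1.5 padding around it
digest_info_prefix = ["3031300d060960864801650304020105000420"].pack("H*")
data_hash = OpenSSL::Digest.digest("SHA256", data)
manual_signature = pkey.private_encrypt(digest_info_prefix + data_hash)

puts signature == manual_signature # => true, the two should match byte for byte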

C# and ColdFusion AES Encryption not matching

I have to encrypt a URL query string in C# and pass it to a ColdFusion page. Can someone help me write encryption code using the AES algorithm in C#.NET that is equivalent to the ColdFusion function below? Thanks in advance.
<cfset strLink = Encrypt("top secret", "WTq8zYcZfaWVvMncigHqwQ==", "AES","Hex")>
CF Result:
strLink = 91E72250B8A7EDBC4E5AF37F04E6AB5B
I tried the code below in C#, but the results do not match.
byte[] plainText = Encoding.Unicode.GetBytes("top secret");
byte[] key = Convert.FromBase64String("WTq8zYcZfaWVvMncigHqwQ==");
RijndaelManaged algorithm = new RijndaelManaged();
algorithm.Mode = CipherMode.ECB;
algorithm.Padding = PaddingMode.PKCS7;
algorithm.BlockSize = 128;
algorithm.KeySize = 128;
algorithm.Key = key;
string result;
using (ICryptoTransform encryptor = algorithm.CreateEncryptor())
{
    using (MemoryStream memoryStream = new MemoryStream())
    {
        using (CryptoStream cryptoStream = new CryptoStream(memoryStream, encryptor, CryptoStreamMode.Write))
        {
            cryptoStream.Write(plainText, 0, plainText.Length);
            cryptoStream.FlushFinalBlock();
            result = Convert.ToBase64String(memoryStream.ToArray());
        }
    }
}
return result;
C# Result:
HEX = 89F9F3C55CD232362FE1E14240C479BE5B56210FF3913E7B6BA4BCD3C87F9AA7
Base64 = ifnzxVzSMjYv4eFCQMR5vltWIQ/zkT57a6S808h/mqc=
(From comments...)
This is a perfect example of how character encoding makes a big difference.
Believe it or not, it is simply due to using the wrong encoding in the C# code. Encoding.Unicode uses UTF-16, whereas CF's Encrypt function always uses UTF-8 (very different). Consequently, the C# code is encrypting a totally different value than CF. Hence the different results, and why the length of the C# string (hex) is longer than the one returned from CF.
Use Encoding.UTF8.GetBytes() instead of Encoding.Unicode.GetBytes() and the results will match:
byte[] plainText = Encoding.UTF8.GetBytes("top secret");
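As a cross-check, a short Ruby/OpenSSL sketch of the same operation (assuming the ColdFusion defaults of AES in ECB mode with PKCS5/PKCS7 padding and a UTF-8 plaintext, which the fixed C# code mirrors) should reproduce the CF hex result:
require 'openssl'
require 'base64'

key = Base64.decode64("WTq8zYcZfaWVvMncigHqwQ==")
cipher = OpenSSL::Cipher.new("aes-128-ecb")
cipher.encrypt
cipher.key = key
ciphertext = cipher.update("top secret") + cipher.final
puts ciphertext.unpack("H*").first.upcase
# should match the CF result above: 91E72250B8A7EDBC4E5AF37F04E6AB5B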

What is the role of SHA hash in signing a document in OpenSSL library?

I am following the OpenSSL documentation to generate signatures. I am using Ruby 2.1.0 and generate signatures like this:
document = "This is a simple string document to be signed"
key = OpenSSL::PKey::RSA.new([private_key])
digest = OpenSSL::Digest::SHA256.new
signature = key.sign digest, document
The signature is transmitted and reaches the destination where it is to be verified. To verify, I do like this:
key = OpenSSL::PKey::RSA.new([pubkey])
digest = OpenSSL::Digest::SHA256.new
key.verify digest, signature, document # => valid
This is working: if we change just one letter of the document or the signature, verification returns an invalid result:
key.verify digest, signature, changed_document # => Invalid
But with a different SHA digest object, the verification command still returns a valid result:
digest = OpenSSL::Digest::SHA256.new('this will generate different SHA')
key.verify digest, signature, document # => valid
This confused me. Shouldn't a different SHA hash result in an invalid result? What is the role of the digest here?
Passing an argument to OpenSSL::Digest::SHA256.new causes that data to be added to the digest.
However, the OpenSSL signing and verification functions reset the digest before it is used, so that extra data has no effect in this particular case.
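A minimal sketch illustrating this: the two digest objects hold different internal state, yet signing with one and verifying with the other still succeeds, because sign and verify reset the digest first:
require 'openssl'

key = OpenSSL::PKey::RSA.new(2048)
document = "This is a simple string document to be signed"

clean_digest  = OpenSSL::Digest::SHA256.new
seeded_digest = OpenSSL::Digest::SHA256.new('this will generate different SHA')

# the two digest objects really do hold different state at this point
puts clean_digest.hexdigest == seeded_digest.hexdigest # => false

# but signing with one and verifying with the other still works,
# because sign/verify only use the digest's algorithm, not its accumulated data
signature = key.sign(clean_digest, document)
puts key.verify(seeded_digest, signature, document) # => true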

Not able to Decrypt string using RSA.rb: RSA Encryption for Ruby

I have to encrypt a particular field value and store it in the DB. I have used RSA.rb: RSA Encryption for Ruby. I was able to encrypt and save it, but while decrypting it back I am facing a problem. What I have done is as follows:
key_pair = RSA::KeyPair.generate(512)
Stored key_pair in a separate column.
ciphertext = key_pair.encrypt("Hello, world!")
Stored ciphertext in another column of the same table.
While decrypting, I fetched the key_pair value from the database and applied the decrypt function:
plaintext = key_pair.decrypt(ciphertext)
This step throws an error:
NoMethodError: undefined method `decrypt' for <String:0xa431b88>
because "key_pair" is not an instance of "RSA::KeyPair". When I try to decrypt the stored value, I fetch the key_pair value from the database and then call the decrypt method on it, so the key_pair value is a String. I need a way to solve this. Please guide me.
Before decrypt, try:
# get persisted value from DB; then
key_pair = RSA::KeyPair.new(your private key, your public key)
# and then decrypt
plaintext = key_pair.decrypt(ciphertext)
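If switching libraries is an option, a similarly shaped sketch using Ruby's built-in OpenSSL bindings (not the RSA.rb gem's API) shows the same persist-and-rebuild pattern, serializing the private key as a PEM string:
require 'openssl'

key = OpenSSL::PKey::RSA.new(2048)
ciphertext = key.public_key.public_encrypt("Hello, world!")

# persist key.to_pem and ciphertext, e.g. in two DB columns ...

# later: rebuild the key object from the stored PEM string, then decrypt
restored = OpenSSL::PKey::RSA.new(key.to_pem)
plaintext = restored.private_decrypt(ciphertext) # => "Hello, world!"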

File to Byte Array in WinJS

I'm tinkering with some Windows Store development in JavaScript and I seem to be stuck on how to get a byte array from a binary file. I've found a couple of examples online, but they all seem to only read in text whereas my file is an image. I'm opening the file like this:
Windows.Storage.FileIO.readBufferAsync(photos[currentIndex]).done(function (buffer) {
    var dataReader = Windows.Storage.Streams.DataReader.fromBuffer(buffer);
    var fileContent = dataReader.readString(buffer.length);
    dataReader.close();
    // do something with fileContent
});
Where photos[currentIndex] is a file (loaded from getFilesAsync()). The error in this case, of course, is that readString fails on binary data. It can't map the "characters" into a string. I also tried this:
Windows.Storage.FileIO.readBufferAsync(photos[currentIndex]).done(function (buffer) {
    var bytes = [];
    var dataReader = Windows.Storage.Streams.DataReader.fromBuffer(buffer);
    dataReader.readBytes(bytes);
    dataReader.close();
    // do something with bytes
});
But bytes is empty, so I think I'm using this incorrectly. I imagine I'm just overlooking something simple here, but for some reason I just can't seem to find the right way to read a binary file into a byte array. Can somebody offer a second set of eyes to help?
Figured it out almost immediately after posting the question, but I figure I'll leave the answer here for posterity...
I needed to declare the array in the second example differently:
Windows.Storage.FileIO.readBufferAsync(photos[currentIndex]).done(function (buffer) {
    var bytes = new Uint8Array(buffer.length);
    var dataReader = Windows.Storage.Streams.DataReader.fromBuffer(buffer);
    dataReader.readBytes(bytes);
    dataReader.close();
    // do something with bytes
});
My JavaScript isn't quite up to par, so I guess I didn't understand how the array declaration was supposed to work. (When I do vanilla JavaScript in a browser, I always just declare empty arrays like I originally did and append to them.) But this does the trick.