Using an AES library, I am trying to send encrypted data from the Arduino side to the Raspberry Pi side. The encrypted data printed on the Arduino serial monitor is not the same as what is printed on the Raspberry Pi side.
Maybe it is a decoding problem.
Also, while decrypting on the Raspberry Pi side, it gives an error saying "the input text must be multiple of 16 in length"; when I pad the input (temperature data) with zeroes, it still gives the same error message.
I have tried using 'utf-8' and 'iso-8859-1' for decoding, but it still does not show the same decrypted data.
PYTHON CODE:
import serial
from Crypto.Cipher import AES
ser=serial.Serial('/dev/ttyS0',9600)
st=ser.readline()
st1=st.decode('utf-8')
obj = AES.new('This is a key123', AES.MODE_CBC, 'This is an IV456')
ciphertext = obj.encrypt(message)
obj2 = AES.new('This is a key123', AES.MODE_CBC, 'This is an IV456')
obj2.decrypt(ciphertext)
ARDUINO CODE:
void aesTest (int bits)
{
  aes.iv_inc();
  byte iv [N_BLOCK];
  int plainPaddedLength = sizeof(chartemp) + (N_BLOCK - ((sizeof(chartemp)-1) % 16));
  byte cipher [plainPaddedLength];
  byte check [plainPaddedLength];
  aes.set_IV(myIv);
  aes.get_IV(iv);
  aes.do_aes_encrypt(chartemp, sizeof(chartemp), cipher, key, bits, iv);
  aes.set_IV(myIv);
  aes.get_IV(iv);
  aes.printArray(cipher, (bool)false); // print cipher with padding
  String cipher1=String((char*)cipher);
  myserial.println(cipher1);
}
Here chartemp is the temperature reading from the LM35 IC, converted to a character array.
I expect the output on the Raspberry Pi side to be decrypted properly.
Encrypted data is a sequence of pseudo-random bytes. It is not a valid UTF-8 string.
This line is a bit dodgy, but probably technically "works":
String cipher1=String((char*)cipher);
But this line is incorrect:
st1=st.decode('utf-8')
You can't take random binary data and decode it as UTF-8. You either need to send and receive the data as just a string of raw bytes, or encode the data into a text-safe string, such as with Base64. I suspect you'll be more comfortable with the latter, so look at a Base64 library for the Arduino side and the base64 module in Python.
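As a rough illustration of the Base64 route, the receiving side could look something like the sketch below. It assumes pyserial and pycryptodome are installed, that the Arduino Base64-encodes the cipher bytes and sends one line per reading, and that the plaintext was zero-padded as described in the question; none of this is shown in the original code.
import base64
import serial
from Crypto.Cipher import AES

ser = serial.Serial('/dev/ttyS0', 9600)

key = b'This is a key123'   # same 16-byte key as in the question
iv  = b'This is an IV456'   # same 16-byte IV as in the question

line = ser.readline().strip()        # Base64 is plain ASCII, so readline() is safe
ciphertext = base64.b64decode(line)  # back to raw bytes, a multiple of 16 in length
plaintext = AES.new(key, AES.MODE_CBC, iv).decrypt(ciphertext)
print(plaintext.rstrip(b'\x00'))     # strip the zero padding assumed on the Arduino side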
I'm working on a Ruby project that interacts with a web service with which I'm exchanging some encrypted data.
I am having a very hard time decrypting something I get back from the web service in Ruby, although on the .NET side it works fine, and a number of other web-based or desktop-based tools can handle it.
The encryption method was 3DES with ECB and no padding.
Below is a test script I have been working on. I've tried everything I can think of to get these strings unpacked correctly, but to no avail.
require 'openssl'
require 'base64'

def cipher(key, encrypted)
  key = key.unpack('a2'*32).map{|x| x.hex}.pack('c'*32)
  encrypted = encrypted.unpack('a2'*32).map{|x| x.hex}.pack('c'*32)
  OpenSSL::Cipher::ciphers.select{|c| c.include? 'des3' }.map do |cipher_name|
    begin
      cipher = OpenSSL::Cipher.new(cipher_name)
      cipher.padding = 0
      cipher.decrypt
      cipher.key = key
      plain = cipher.update(encrypted) + cipher.final
      p "Cipher #{cipher_name} success: #{plain} #{plain.class} #{plain.length} #{plain.encoding.to_s}"
      plain
    rescue => e
      p "Cipher #{cipher_name} failed #{e}"
      nil
    end
  end
end

key = '202FA9B21843D7022B6466DB68327E1F'
encrypted = 'ff6f07e270ebd5c0878c67c999d87ebf'
res1 = cipher key, encrypted

key = '49CE85147B24123718AB3F4539AB1A21'
encrypted = '995604ed8016da8897f1875ebd725529'
res2 = cipher key, encrypted

p res1 == res2 ? "SUCCESS" : "FAIL"
# In both cases, the correct output should be '25588015543912470222703296730936'
A 3DES key is 24 bytes; use a full-length key.
3DES uses triple encryption with essentially a 24-byte key. 202FA9B21843D7022B6466DB68327E1F is a hex-encoded 16-byte key.
Try repeating the first 8 bytes of the key:
202FA9B21843D7022B6466DB68327E1F202FA9B21843D702
Some 3DES implementations will repeat the first 8 bytes of a 16-byte key themselves, but relying on such implementation details is not a good idea.
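For illustration only, here is that key expansion sketched in Python with pycryptodome (the question is in Ruby, but the idea is the same; padding stays off because the ciphertext is already a whole number of blocks):
from Crypto.Cipher import DES3

key16      = bytes.fromhex('202FA9B21843D7022B6466DB68327E1F')
ciphertext = bytes.fromhex('ff6f07e270ebd5c0878c67c999d87ebf')

key24 = key16 + key16[:8]               # two-key 3DES: K1, K2, K1
des3  = DES3.new(key24, DES3.MODE_ECB)  # ECB with no padding, as in the question
print(des3.decrypt(ciphertext).hex())   # 16 plaintext bytes, shown as hex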
Note: 3DES effectively uses a 168-bit key because the least significant bit of each key byte is a parity bit and is not used. Further, even though there are three DES calls, the practical security is only about 112 bits because of meet-in-the-middle attacks. Additionally, DES has some weak keys. There are also two common constructions, EDE and EEE, introduced to ease the move from DES to 3DES, which adds more confusion.
Finally: Move from 3DES to AES in CBC mode with a random IV. Please don't continue poor security practices.
I'm a Node developer, but every once in a while I get to play around with ERB templates. I really love pulling as much Ruby as I can into these templates, and this idea caught my eye especially.
I have this configuration value, which should be encrypted, but is coming in plain text. The program would decrypt it like so:
var crypto = require('crypto');
var decipher = crypto.createDecipher('aes256', 'e20jhciwjf90u2r9u9ujj');
var decrypted = decipher.update('4ufujj90u19ru90u109u') + decipher.final();
I was wondering how I might go about creating an encrypted string for the above to decrypt using ruby?
So far I have:
require 'openssl'
cipher = OpenSSL::Cipher::Cipher.new('aes256');
cipher.key= 'e20jhciwjf90u2r9u9ujj'
encrypted = cipher.update('my cat is yellow and very pretty.') + cipher.final
Two problems I have:
I often get a "key length not long enough" error on the Ruby side.
Ruby outputs a bunch of crazy hex, whereas Node seems to always take/want UTF-8.
Am I encrypting/decrypting safely?
Is there a way to universally translate/work laterally with these two APIs?
AES-256 uses a 256-bit key, and by default Ruby uses UTF-8 encoding, in which each ASCII character is 8 bits long, so the key string must be 32 characters (32 bytes).
Explicitly specify an AES mode (e.g. aes-256-cbc).
Set the same IV (initialization vector) on both sides.
I finally succeeded with the above changes.
Here is my code:
Ruby side:
require 'openssl'
require 'base64'
cipher = OpenSSL::Cipher.new('aes-256-cbc')
cipher.encrypt
cipher.iv = 'a'*16;
cipher.key = '01234567890123456789012345678901' # should be 32 characters, 32*8=256 bits
enc = Base64.strict_encode64(cipher.update('01234567890123456789012345678901') + cipher.final)
puts enc
Javascript side:
var encrypted = new Buffer(base64Data, 'base64');
var crypto = require('crypto');
var decipher = crypto.createDecipheriv('aes-256-cbc', '01234567890123456789012345678901', 'aaaaaaaaaaaaaaaa');
var dec = decipher.update(encrypted);
console.log(Buffer.concat([dec, decipher.final()]));
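If you want a third implementation to cross-check the parameters, the same decryption can be sketched in Python with pycryptodome (this is only an illustration, not part of the original Ruby/Node exchange; the decrypt_from_ruby helper name is mine, and any Base64 line printed by the Ruby script above can be pasted in):
import base64
from Crypto.Cipher import AES
from Crypto.Util.Padding import unpad

def decrypt_from_ruby(b64_line):
    key = b'01234567890123456789012345678901'  # 32 bytes -> AES-256, same as both sides above
    iv  = b'a' * 16                             # 'aaaaaaaaaaaaaaaa', same as both sides above
    data = base64.b64decode(b64_line)
    return unpad(AES.new(key, AES.MODE_CBC, iv).decrypt(data), AES.block_size)

# usage: paste the Base64 line printed by the Ruby script
# print(decrypt_from_ruby('...Base64 from Ruby...'))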
I need to decrypt text encrypted using the AES/CBC/PKCS5Padding scheme. The encrypted text I got was generated using ColdFusion.
CFML example below:
<table border="1" cellpadding="5" cellspacing="0">
<tr bgcolor="c0c0c0">
<th>Decrypted string</th>
<th>3DESKey</th>
</tr>
<cfset variables.algorithm ="AES/CBC/PKCS5Padding">
<cfset variables.seed ="C610297CE8570750">
<cfset variables.password = "Vza0O49SHpIe/mR4+4jHXhApmKhEyl5O2nzzDxVNQbo=">
<cfset variables.decryptedString = Decrypt(variables.password, generate3DesKey("#variables.seed#"), "#variables.algorithm#", "Base64")>
<cfoutput>
<tr>
<td>#variables.decryptedString#</td>
<td><cfoutput>#generate3DesKey("variables.seed")#</cfoutput></td>
</tr>
</cfoutput>
</table>
Output is:
Decrypted String: Name322big563
3DESKey: QzYxMDI5N0NFODU3MDc1MA==
I tried with ruby:
require 'openssl'
require 'base64'

string = "Vza0O49SHpIe/mR4+4jHXhApmKhEyl5O2nzzDxVNQbo="

def decrypt(cpass)
  des = OpenSSL::Cipher::Cipher.new('AES-256-CBC')
  des.decrypt
  des.key = 'C610297CE8570750'
  return des.update(Base64.decode64(cpass)) + des.final
end

decrypted = decrypt(string)
puts "decrypted string: #{decrypted}"
I get key length too short (OpenSSL::Cipher::CipherError)
The problem is that I don't know the key, only the seed used (C610297CE8570750), because the key returned by the CFML script is Base64, but I need a hex key.
I also tried with OpenSSL::Cipher::AES256.new(:CBC), same error:
require 'openssl'
require 'base64'
# decryption
aes = OpenSSL::Cipher::AES256.new(:CBC)
aes.decrypt
aes.padding = 1 # actually it's on by default
aes.key = "QzYxMDI5N0NFODU3MDc1MA=="
aes.iv = "C610297CE8570750"
aes.update(Base64::decode64("Vza0O49SHpIe/mR4+4jHXhApmKhEyl5O2nzzDxVNQbo="))+aes.final
Any idea?
EDIT:
As hinted by @Leigh, I need to use AES-128-CBC, so I did this:
require 'openssl'
require 'base64'

string = "Vza0O49SHpIe/mR4+4jHXhApmKhEyl5O2nzzDxVNQbo="

def decrypt(cpass)
  des = OpenSSL::Cipher::Cipher.new('AES-128-CBC')
  des.decrypt
  des.key = 'C610297CE8570750'
  return des.update(Base64.decode64(cpass)) + des.final
end

decrypted = decrypt(string)
puts "decrypted string: #{decrypted}"
actually seems to kinda work (...ish).
decrypted string: ▒▒.ϥD▒▒ ▒▒▒▒▒Name322big563
any idea what's still wrong?
(Expanded from comments)
but I need a hex key
Then convert it from Base64 to hex. In CF, you can use the BinaryEncode() and BinaryDecode() functions:
binaryEncode(binaryDecode("QzYxMDI5N0NFODU3MDc1MA==", "base64"), "hex")
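The same conversion is just as short outside CF; for example, in Python (shown purely for illustration):
import base64
print(base64.b64decode("QzYxMDI5N0NFODU3MDc1MA==").hex())  # the 16 key bytes as 32 hex digits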
Looks like there are a few other problems:
The CF code generates a 128-bit key, but the Ruby code is using AES-256. It needs to use AES-128.
The CF code is generating a random IV. The Ruby code is using a totally different IV. With CBC mode, both sides must use the same IV to get the expected results. "Decrypting with the incorrect IV causes the first block of plaintext to be corrupt ...", which is why your decrypted value is off. To resolve it, the Ruby code should use the same IV that was used to encrypt.
Update:
When CF generates the IV automatically (as it does here), it prepends that IV to the encrypted value:
When ColdFusion creates an IV automatically, it generates a secure, random IV and prepends this to the encrypted data. When ColdFusion decrypts the data, this IV is recovered and used. It is cryptologically important that the IV varies between encryptions. This is why the encrypted value changes when you repeatedly encrypt the same string with an algorithm that uses an IV, like DES/CBC/PKCS5Padding. Unlike the encryption key, it is not necessary for the IV to be kept secret.
So the IV value can be extracted by removing the first "block" of the encrypted binary. The block size depends on the algorithm; for AES, it is 16 bytes. I do not know the exact Ruby code, but in CF you could extract the IV like so:
blockSize = 16;
rawBinary = binaryDecode(encryptedString, "base64");
// IV is always the first block
ivBytes = arraySlice(rawBinary, 1, blockSize);
// Remaining bytes are the encrypted value
dataBytes = arraySlice(rawBinary, blockSize+1, arrayLen(rawBinary)-blockSize);
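Putting the pieces together outside ColdFusion, here is a hedged sketch of the full decryption in Python with pycryptodome (Ruby would follow the same steps). It assumes, as the CF output above suggests, that the key is simply the 16 ASCII bytes of the seed:
import base64
from Crypto.Cipher import AES
from Crypto.Util.Padding import unpad

key  = b'C610297CE8570750'   # 16 bytes -> AES-128; generate3DesKey() appears to return these bytes Base64-encoded
blob = base64.b64decode('Vza0O49SHpIe/mR4+4jHXhApmKhEyl5O2nzzDxVNQbo=')

iv, ciphertext = blob[:16], blob[16:]   # CF prepends the random IV as the first 16-byte block
plain = unpad(AES.new(key, AES.MODE_CBC, iv).decrypt(ciphertext), AES.block_size)
print(plain)                            # should print b'Name322big563', per the CF output above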
Unless I'm very much mistaken, this is a problem I encountered years ago.
PHP Encryption Code Converted to ColdFusion
I need to send a cipher encrypted string in JSON through a socket.
JSON.generate doesn't like this and throws the error:
`encode': "\xF7" from ASCII-8BIT to UTF-8 (Encoding::UndefinedConversionError)
To work around this I just encoded it with Base64.encode64 and on the other side decoded it with Base64.decode64.
But when I go to decrypt it with the cipher I get the error:
`final': wrong final block length (OpenSSL::Cipher::CipherError)
A quick little test I wrote to prove it was the Base64 messing things up:
decipher = OpenSSL::Cipher.new("des-ede3")
decipher.decrypt
decipher.key = symkey
test = Base64.encode64(encrypted_json)
test2 = Base64.decode64(test)
puts test2 == encrypted_json #Prints true
decrypted_json = decipher.update(test) + decipher.final
puts
puts "JSON: #{decrypted_json}"
This code throws the same error at decrypted_json = decipher.update(test) + decipher.final
The encrypted_json was encrypted by a cipher using the same key.
Anyone have any ideas on what is going wrong? Or how to generate the JSON without having to encode it to Base64?
Edit: Here is my JSON generator
ready_to_send = JSON.generate({
  enc_json: Base64.encode64(encrypted_json),
  symkey: Base64.encode64(encrypted_sym_key)
})
I have an application that was originally written in Borland C++ and used a Blowfish algorithm implemented in the TurboPower LockBox component.
This application has now been ported to C#. Currently I call a Borland C++ DLL that uses this algorithm. However, when running the application on a 64-bit OS, I get errors whenever attempting to use this DLL. If I compile the application as 32-bit, everything works, but we want this application to work as a 64-bit app. As far as I can tell, that means I need a .NET Blowfish implementation that works like the C++ one.
I found Blowfish.NET and it looks promising. However, when I use the same key and text, the encrypted results do not match. I did find out that the C++ DLL uses the BlowfishECB algorithm. It also converts the result to Base64, which I have also done.
Any help with this would be appreciated. Here is some test code in C#.
//Convert the key to a byte array. In C++ the key was 16 bytes long
byte[] _key = new byte[16];
Array.Clear(_key, 0, _key.Length);
var pwdBytes = System.Text.Encoding.Default.GetBytes(LicEncryptKey);
int max = Math.Min(16, pwdBytes.Length);
Array.Copy(pwdBytes, _key, max);
//Convert the string to a byte[] and pad it to to the 8 byte block size
var decrypted = System.Text.Encoding.ASCII.GetBytes(originalString);
var blowfish = new BlowfishECB();
blowfish.Initialize(_key, 0, _key.Length);
int arraySize = decrypted.Length;
int diff = arraySize%BlowfishECB.BLOCK_SIZE;
if (diff != 0)
{
arraySize += (BlowfishECB.BLOCK_SIZE - diff);
}
var decryptedBytes = new Byte[arraySize];
Array.Clear(decryptedBytes, 0, decryptedBytes.Length);
Array.Copy(decrypted, decryptedBytes, decrypted.Length);
//Prepare the byte array for the encrypted string
var encryptedBytes = new Byte[decryptedBytes.Length];
Array.Clear(encryptedBytes, 0, encryptedBytes.Length);
blowfish.Encrypt(decryptedBytes, 0, encryptedBytes, 0, decryptedBytes.Length);
//Convert to Base64
string result = Convert.ToBase64String(encryptedBytes);
It won't be compatible with your TurboPower LockBox data.
I'd suggest that you provide a utility to do the data migration by decrypting with LockBox in C++ (32-bit), outputting to temp files/tables, and re-encrypting with Blowfish.NET in C# (64-bit).
This data migration is done once, before any upgrade to the .NET version; after that, everything is compatible with it.
Since you're changing the format anyway, you could also omit the Base64 conversion by storing binary files/BLOBs; other ideas may be useful too, such as applying multiple encryptions or replacing Blowfish with something else.