How do I convert a hex string to its 32-bit signed int equivalent in Ruby?
For example:
a = "fb6d8cf1" #hex string
[a].pack('H*').unpack('l') # per the documentation, this unpacks to a 32-bit signed int
It converts to
-242455045
But the expected answer is
-76706575
Could you point out what I am doing wrong?
Seems like you had an endian problem. This gives the desired result:
[a].pack("H*").unpack("l>")
# => [-76706575]
["038a67f90"].pack("H*").unpack("l>")
#=> [59402233]
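For comparison, plain 'l' uses the machine's native byte order, so on a little-endian host (most desktops) it behaves like the explicit little-endian directive, which is where the value in the question comes from:
["fb6d8cf1"].pack("H*").unpack("l<")
# => [-242455045]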
You could flip the bytes yourself to get around the endian and sign issues:
>> ['fb6d8cf1'.scan(/[0-9a-f]{2}/i).reverse.join].pack('H*').unpack('l')
=> [-76706575]
Use:
class String
  def to_si(base, length = 32)
    mid = 2**(length - 1)       # smallest value that wraps around to negative
    max_unsigned = 2**length
    n = self.to_i(base)
    (n >= mid) ? n - max_unsigned : n
  end
end
"fb6d8cf1".to_si 16, 32
This is mostly a Ruby question.
I'm stuck trying to parse some bytes from an i2c device into a float value in Ruby.
Long story:
I'm trying to read a float value from an i2c device with a Raspberry Pi and an ATtiny85 (the device). I'm able to read its value from the console through i2c-tools.
Example:
i2cset -y 0 0x25 0x00; sleep 1; i2cget -y 0 0x25 0x00; i2cget -y 0 0x25 0x00; i2cget -y 0 0x25 0x00; i2cget -y 0 0x25 0x00
Gives me:
0x3e
0x00
0x80
0x92
That means 0.12549046, a value in volts that I'm able to verify with my multimeter. (The byte order is 0x3e008092.)
Now I need to get this float value from a Ruby script; I'm using the i2c gem.
A comment on this site suggests the following conversion method:
hex_string = '42880000'
float = [hex_string.to_i(16)].pack('L').unpack('F')[0]
# => 68.0
float = 66.2
hex_string = [float].pack('F').unpack('L')[0].to_s(16)
# => 42846666
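Applying that same round trip to the reading above gives the expected volts (a quick irb sketch; '3e008092' is just the four i2cget bytes concatenated):
['3e008092'.to_i(16)].pack('L').unpack('F')[0]
# => 0.1254904...  (the ~0.12549046 V reading from i2ctools)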
But I haven't been able to get this string of hex values. This fragment of code:
require "i2c/i2c"
require "i2c/backends/i2c-dev"
@i2c = ::I2C.create("/dev/i2c-0")
sharp = 0x25
@i2c.write(sharp, 0)
sleep 1
puts @i2c.read(sharp, 4).inspect
This prints on screen:
">\x00\x00P"
The characters '>' and 'P' are the ASCII representations of the bytes in those positions, but I don't know where/how to split the string and clean it up to at least try the method shown above.
I could write a C program to read the value and printf it to the console or something and run it from Ruby, but I think that would be an awful solution.
Any ideas on how this can be done would be very helpful!
Greetings.
I came up with something:
bytes = []
for i in (0..3) do
  bytes << @i2c.read_byte(sharp).unpack('C')[0].to_s(16)
  bytes[i] = bytes[i].rjust(2, "0")  # zero-pad single hex digits so the join stays aligned
end
bytes = bytes.join
float = [bytes.to_i(16)].pack('L').unpack('F')[0]
puts float.to_s
Not sure about the unpack('C') call though, but it works. If there's a better way to do it I'd be glad to see another answer.
Greetings!
You probably just need to use unpack with a format of g, or possibly e, depending on the endianness.
@i2c.read(sharp, 4).unpack('g')[0]
The example you are referring to takes a string of hex digits, converts it to an integer, and packs that into a binary string (that's the [hex_string.to_i(16)].pack('L') part; the L directive is for 32-bit integers) before unpacking it as a float. The data you have is already a binary string, so you just need to convert it directly with the appropriate directive for unpack.
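For example, rebuilding the four bytes from your question and unpacking them directly (a quick irb sketch):
["3e008092"].pack("H*").unpack("g")
# => [0.1254904...]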
Have a read of the documentation for unpack and pack.
I have a checksum that I need to add to a Ruby string in hex. I have been unable to convert the checksum successfully. I am relatively new to Ruby, so I'm not sure if I'm missing something. Here is what I am doing:
def get_checksum message
  # get the checksum
  cnt = 0
  lrc = 0
  while (cnt < message.length - 1)
    lrc = lrc ^ message[cnt].to_i
    cnt += 1
  end
  # return as hex
  lrc.to_s.each_byte.map { |b| b.to_s(16) + " " }.join
end
I have some C# reference code as well, but have never used C#, being a long-time Mac C/C++/Obj-C coder. Here is the C# code I am trying to convert:
// calculate LRC
private string GetChecksum(string inputstring)
{
    int checksum = 0;
    foreach (char c in inputstring)
    {
        checksum ^= Convert.ToByte(c);
    }
    return checksum.ToString("X2");
}
Any help would be appreciated.
.to_i will return 0 when called on a (non-digit) one-character string, which is what message[cnt] gives you in Ruby 1.9+.
def get_checksum message
  # get the checksum
  lrc = 0
  message.each_byte do |b|
    lrc = lrc ^ b
  end
  # return as hex
  lrc.to_s(16)
end
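A quick check (the test strings here are arbitrary). One difference from the C# version: lrc.to_s(16) gives lowercase hex with no zero padding, so if you need the two-digit uppercase form of ToString("X2"), a format string does it:
get_checksum("hello")             # => "62"
get_checksum("AB")                # => "3"
"%02X" % "AB".bytes.reduce(:^)    # => "03"  (matches C#'s "X2" formatting)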
I'm trying to send an integer over the serial port to my Arduino. The chip is then going to display the number in binary on the LEDs. However, I'm having lots of trouble trying to send the data as a byte over the serial port; as far as I can debug, the following code sends it as ASCII character values.
Can anyone point me in the right direction or spot the mistake? I'd really appreciate it. I've been pulling my hair out over this for a long time.
Ruby
require 'rubygems'
require 'serialport' # use Kernel::require on windows, works better.
#params for serial port
port_str = "/dev/tty.usbserial-A700dZt3" #may be different for you
baud_rate = 9600
data_bits = 8
stop_bits = 1
parity = SerialPort::NONE
sp = SerialPort.new(port_str, baud_rate, data_bits, stop_bits, parity)
i = 15
#just write forever
while true do
sp.write(i.to_s(2))
sleep 10
end
Arduino
int ledPin = 10;
int ledPin1 = 11;
int ledPin2 = 12;
int ledPin3 = 13;
byte incomingByte; // for incoming serial data
void setup() {
  pinMode(ledPin, OUTPUT);  // initialize the LED pin as an output:
  pinMode(ledPin1, OUTPUT); // initialize the LED pin as an output:
  pinMode(ledPin2, OUTPUT); // initialize the LED pin as an output:
  pinMode(ledPin3, OUTPUT); // initialize the LED pin as an output:
  Serial.begin(9600);
  Serial.println("I am online");
}
void loop() {
  // send data only when you receive data:
  if (Serial.available() > 0) {
    incomingByte = Serial.read();
    Serial.println(incomingByte, DEC);
    int value = (incomingByte, DEC) % 16;
    digitalWrite(ledPin, (value >> 0) % 2);
    digitalWrite(ledPin1, (value >> 1) % 2);
    digitalWrite(ledPin2, (value >> 2) % 2);
    digitalWrite(ledPin3, (value >> 3) % 2); // MSB
  }
}
I'm guessing you are trying to write the value 15 in order to light all the LEDs at once. However, 15.to_s(2) is "1111". The ASCII value of the character '1' is 49, so instead of writing 15 once you are writing 49 four times in rapid succession.
The write command you are looking for is therefore probably sp.putc(i). This writes only one character with the given binary value (= machine-readable for Arduino) instead of an ASCII string representation of the value expressed in binary (= human-readable for you).
So keeping everything else the same, replace the while loop in your Ruby code with:
loop do
sp.putc(i)
puts 'Wrote: %d = %bb' % [ i, i ]
i = (i == 15) ? 0 : (i + 1)
sleep(10)
end
If you wish to read the responses from Arduino, you can use e.g. sp.gets to get one line of text, e.g. try placing puts 'Arduino replied: ' + sp.gets in the loop before sleep (and one puts sp.gets before the loop to read the "I am online" sent when the connection is first established).
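Putting that together with the loop above (a sketch; it assumes the Arduino keeps echoing one line per received byte via Serial.println, and that sp and i are set up as in the question):
puts 'Arduino says: ' + sp.gets        # the initial "I am online" line
loop do
  sp.putc(i)
  puts 'Arduino replied: ' + sp.gets   # the Serial.println(incomingByte, DEC) echo
  i = (i == 15) ? 0 : (i + 1)
  sleep(10)
end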
Edit: I just spotted another problem in your code, on the Arduino side: value = (incomingByte, DEC) % 16; always results in the value 10 because (incomingByte, DEC) has the value DEC (which is 10). You should use value = incomingByte % 16; instead. Or do away with value altogether and modify incomingByte itself, e.g. incomingByte %= 16;.
Your problems may be caused by buffering. To disable buffering, you can do one of the following (a short sketch follows the list):
Set sp to unbuffered after creating it (before writing): sp.sync = true
Call flush after the write
Use the unbuffered syswrite instead of write
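A sketch of all three (you would use just one of them; sp and i as set up in the question):
sp.sync = true              # 1: every subsequent write is flushed immediately
sp.write(i.chr); sp.flush   # 2: flush explicitly after this write
sp.syswrite(i.chr)          # 3: bypass Ruby's buffering for this write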
It's been so long since I did anything with serial ports that I can't help there, but I do see one thing.
>> 15.to_s #=> "15"
and
>> 15.to_s(2) #=> "1111"
I think if you want the binary value to be sent you'll want "\xf" or "\u000F".
Change your code from:
while true do
sp.write(i.to_s(2)) # <-- this sends a multi-character ASCII representation of the "i" value, NOT the binary.
sleep 10
end
to:
while true do
sp.write(i.chr) # <-- this sends a single byte binary representation of the "i" value, NOT the ASCII.
sleep 10
end
To show the difference, here's the length of the strings being output:
>> 15.to_s(2).size #=> 4
>> 15.chr.size #=> 1
And the decimal values of the bytes comprising the strings:
>> 15.to_s(2).bytes.to_a #=> [49, 49, 49, 49]
>> 15.chr.bytes.to_a #=> [15]
I've had this Ruby code work before
while true do
  printf("%c", sp.getc)
end
rather than using sp.write(i.to_s). It looks like you are explicitly converting it to a string, which may be the cause of your problems.
I found the original blog post I used:
http://www.arduino.cc/playground/Interfacing/Ruby
I would like to convert a raw string to an array of big-endian words.
As an example, here is a JavaScript function that does it well (by Paul Johnston):
/*
 * Convert a raw string to an array of big-endian words
 * Characters >255 have their high-byte silently ignored.
 */
function rstr2binb(input)
{
  var output = Array(input.length >> 2);
  for(var i = 0; i < output.length; i++)
    output[i] = 0;
  for(var i = 0; i < input.length * 8; i += 8)
    output[i>>5] |= (input.charCodeAt(i / 8) & 0xFF) << (24 - i % 32);
  return output;
}
I believe the Ruby equivalent can be String#unpack(format).
However, I don't know what the correct format parameter should be.
Thank you for any help.
Regards
I think you should have posted a few examples of input/output pairs. Here's code that gives me the same output as your JS code in Chrome:
/* JS in Chrome: */
rstr2binb('hello world!')
[1751477356, 1864398703, 1919706145]
# irb, Ruby 1.9.1:
'hello world!'.unpack('N*')
#=> [1751477356, 1864398703, 1919706145]
However, I am not sure it will give the same results if you try it on multibyte characters; unpack shouldn't be ignoring anything.
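One caveat: 'N*' silently drops a trailing 1-3 bytes, whereas the JS version effectively zero-pads the final word. If your input length may not be a multiple of 4, a small sketch that mirrors the JS behavior:
def rstr2binb(input)
  # zero-pad to a 4-byte boundary so the last partial word isn't dropped
  padded = input + "\0" * (-input.bytesize % 4)
  padded.unpack('N*')
end

rstr2binb('hello world!')  # => [1751477356, 1864398703, 1919706145]
rstr2binb('abcde')         # => [1633837924, 1694498816]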
I am trying to convert a decimal number to its character equivalent. For example:
int j = 65; // The character equivalent would be 'A'.
Sorry, I forgot to specify the language; I thought I did. I am using Cocoa/Objective-C. It is really frustrating. I have tried the following but it is still not converting correctly.
char_num1 = [working_text characterAtIndex:i];   // value = 65
char_num2 = [working_text characterAtIndex:i+1]; // value = 75
char_num3 = char_num1 + char_num2;               // value = 140
char_str1 = [NSString stringWithFormat:@"%c", char_num3]; // mapped value = 229
char_str2 = [char_str2 stringByAppendingString:char_str1];
When char_num1 and char_num2 are added, I get the new ASCII decimal value. However, when I try to convert the new decimal value to a character, I do not get the character that is mapped to char_num3.
Convert a character to a number in C:
int j = 'A';
Convert a number to a character in C:
char ch = 65;
Convert a character to a number in Python:
j = ord('A')
Convert a number to a character in Python:
ch = chr(65)
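And since most of this page is Ruby, the Ruby equivalents for reference:
j  = 'A'.ord   # => 65
ch = 65.chr    # => "A"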
Most languages have a 'char' function, so it would be Char(j)
I'm not sure what language you're asking about. In Java, this works:
int a = 'a';
It's quite often done with "chr" or "char", but some indication of the language / platform would be useful :-)
string k = Chr(j);