How to convert a HEX value to decimal in Hive HQL

I've got Hive 1.1, and I'm having trouble converting a HEX value into a DECIMAL or BIGINT.
Using UNHEX('2BD1BCAE0501250E') should return 3157512269357720846,
but instead I'm getting something that looks like Wingdings: +Ѽ�%
I've tried DECODE, ENCODE, and BASE64, but nothing seems to work. Has anyone else tried doing this? Thanks

conv(STRING num, INT from_base, INT to_base) converts a number from one base to another:
conv('2BD1BCAE0501250E', 16, 10)
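Note that conv() returns its result as a STRING; if you need a numeric type for arithmetic, cast it. A minimal sketch (assuming Hive's built-in conv(), and a value that fits in a signed 64-bit BIGINT):
SELECT conv('2BD1BCAE0501250E', 16, 10);                  -- '3157512269357720846' (a string)
SELECT CAST(conv('2BD1BCAE0501250E', 16, 10) AS BIGINT);  -- 3157512269357720846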

Related

How do I reinterpret a uint64_t variable's bits as a double, as opposed to static_cast?

uint64_t value = 0x543;
double dval = *reinterpret_cast<double*>(&value);
I want dval to hold the value whose bits, written in hexadecimal, look like this: 0x543. Is there any danger in doing this? Do you see a more efficient way of doing it?
Thanks
Use std::bit_cast (C++20). In fact, the canonical example for it is exactly what you are trying to do.
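A minimal sketch (assumes a C++20 compiler; std::bit_cast requires the source and destination types to have the same size):

#include <bit>       // std::bit_cast (C++20)
#include <cstdint>   // std::uint64_t
#include <iostream>

int main() {
    std::uint64_t value = 0x543;
    // Reinterpret the 64 bits of value as an IEEE-754 double,
    // without the undefined behavior of the pointer cast.
    double dval = std::bit_cast<double>(value);
    std::cout << dval << '\n';  // a tiny subnormal, about 6.66e-321
}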

how to convert mm:ss.000 to s.ms using POSIXct and strptime

One of my first posts, so I'll do my best. I've tried searching, which is how I got this far, but I could use some help converting time data in the form mm:ss.000 (milliseconds at the end) to seconds with a fractional part. For example, 2:15.45 should come out to 135.45.
This works:
t <- "02:15.45" (as.numeric(as.POSIXct(strptime(t, format = "%M:%OS"))) - as.numeric(as.POSIXct(strptime("0", format = "%S"))))
But this one, where I'm trying to use a column of my dataframe (originally in character form) does not work:
starttimesFPsnapjumps <- FPsnapjumps$start
(as.numeric(as.POSIXct(strptime(starttimesFPsnapjumps, format = "%M:%OS"))) - as.numeric(as.POSIXct(strptime("0", format = "%S"))))
Perhaps it's because my numbers in the column have an extra 0 - they are mm:ss.000. Any thoughts? Thanks in advance.

How to convert 64 bit binary string to a double float in ruby?

I am wondering how to convert a 64 bit binary string to a double float in ruby. The string that I have is as follows:
binaryString = "0011111111110000000000000000000000000000000000000000000000000000"
Using an online converter (http://www.binaryconvert.com/convert_double.html?) I know that the value should be 1.0. However, when I attempt to use Ruby's unpack to convert it to a double, I don't get the correct result.
double_value = binaryString.unpack("G")
This gives me double_value = 1.3983819593719592e-76.
I've tried other directives like "F" and "D", but none yield correct results.
Any ideas what I am doing wrong? Thank you for the help!
unpack expects binary data, not a textual bit string; your call read the ASCII characters '0' and '1' as raw bytes, which is why you got a nonsense value. Pack the bit string into actual bytes first using B:
b = '0011111111110000000000000000000000000000000000000000000000000000'
[b].pack('B*').unpack1('G')
#=> 1.0
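For reference, the reverse direction works the same way; a minimal sketch (unpack1('B*') renders the 8 big-endian bytes produced by pack('G') back as a bit string):
# Round-trip: double -> 64-bit binary string -> double
bits = [1.0].pack('G').unpack1('B*')
#=> "0011111111110000000000000000000000000000000000000000000000000000"
[bits].pack('B*').unpack1('G')
#=> 1.0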

RCFile - emitting GZip compressed int columns

For some reason, Hive is not recognizing columns emitted as integers, but does recognize columns emitted as strings.
Is there something about Hive or RCFile or GZ that is preventing proper rendering of int?
My Hive DDL looks like:
create external table if not exists db.table (intField int, strField string) stored as rcfile location '/path/to/my/data';
And the relevant portion of my Java looks like:
BytesRefArrayWritable dataWrite = new BytesRefArrayWritable(2);
byte[] byteArray;
BytesRefWritable bytesRefWritable = new BytesRefWritable();
intWritable.set(myObj.getIntField());
byteArray = WritableUtils.toByteArray(intWritable); // serialize the IntWritable itself
bytesRefWritable.set(byteArray, 0, byteArray.length);
dataWrite.set(0, bytesRefWritable); // sets int field as column 0
bytesRefWritable = new BytesRefWritable();
textWritable.set(myObj.getStrField());
bytesRefWritable.set(textWritable.getBytes(), 0, textWritable.getLength());
dataWrite.set(1, bytesRefWritable); // sets str field as column 1
The code runs fine, and through logging I can see the various Writables have bytes within them.
Hive can read the external table as well, but the int field shows up as NULL, indicating some error.
SELECT * from db.table;
OK
NULL my string field
Time taken: 0.647 seconds
Any idea what might be going on here?
So, I'm not sure exactly why this is the case, but I got it working with the following method:
In the code that writes the byte array for the integer value, instead of using WritableUtils.toByteArray(), I use Text.set(Integer.toString(intVal)) and take getBytes() from that.
In other words, I convert the integer to its String representation and use the Text writable to get the byte array, as if the field were a string.
Then, in my Hive DDL, I can declare the column as an int and it is interpreted correctly.
I'm not sure what was initially causing the problem: a bug in WritableUtils, some incompatibility with compressed integer byte arrays, or a faulty understanding of how this stuff works on my part. In any event, the solution described above meets the task's needs.
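A minimal sketch of that workaround, reusing the writables from the snippet above (a likely explanation, though not confirmed in the thread, is that RCFile's default ColumnarSerDe deserializes each column from its text form, so a decimal string parses as an int while raw IntWritable bytes do not):
// Column 0: write the int as its decimal string representation
bytesRefWritable = new BytesRefWritable();
textWritable.set(Integer.toString(myObj.getIntField()));
bytesRefWritable.set(textWritable.getBytes(), 0, textWritable.getLength());
dataWrite.set(0, bytesRefWritable); // Hive parses "123" back into an int per the DDL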

Get int from boost::gregorian::date_duration

I have a date_duration variable and I want to convert it to an int. I found a topic here about converting it for output via <<, but that's not what I need: I have to use the integer value in arithmetic. How do I convert it to a string or an integer? I could write it to a file and read it back with fscanf, but that is a clumsy method.
You can use the days() method to get the number of days (as a value, not an object).
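A minimal sketch (assumes Boost.Date_Time; days() on a date_duration returns a plain integral count usable in arithmetic):

#include <boost/date_time/gregorian/gregorian.hpp>
#include <iostream>

int main() {
    using namespace boost::gregorian;
    // Subtracting two dates yields a date_duration.
    date_duration dur = date(2024, 2, 1) - date(2024, 1, 1);
    long n = dur.days();         // plain integer day count
    std::cout << n * 2 << '\n';  // arithmetic works: prints 62
}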
