convert TO_CHAR, IS_DATE to hive query - hadoop

I want to convert a specific Oracle query to Hive.
However, functions available in Oracle cannot be used in Hive. How can I solve this?
The applied conversion rule is as follows.
DECODE(TRUE, IS_DATE(TO_CHAR(columnname , 'YYYYMMDD')), 'YYYYMMDD',NULL)
In the case of DECODE, I confirmed that it can be handled with IF.
But I couldn't find a way to convert the IS_DATE and TO_CHAR functions.

Oracle does not have an IS_DATE function. Are you sure this is not a user-defined function? If so, you will need to look at its source code, check what it does, and duplicate that in Hive.
DECODE(a, b, c, d) can be rewritten as a CASE expression:
CASE WHEN a = b THEN c ELSE d END
So your code (assuming that columnname is a DATE, that TO_CHAR converts it to a string, and that IS_DATE then checks whether it is a valid date, which seems pointless since it can only be invalid when columnname is NULL) would convert to:
CASE
WHEN CAST(columnname AS STRING FORMAT 'YYYYMMDD') IS NOT NULL
THEN 'YYYYMMDD'
ELSE NULL
END
or, more simply:
CASE
WHEN columnname IS NOT NULL
THEN 'YYYYMMDD'
ELSE NULL
END
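Since you mentioned handling DECODE with IF, a minimal sketch of the same logic using Hive's IF() might look like this (tablename and the date_mask alias are assumed names):
-- sketch using Hive's IF(); tablename and date_mask are assumed names
SELECT IF(columnname IS NOT NULL, 'YYYYMMDD', NULL) AS date_mask
FROM tablename;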

Related

How can I use a CASE statement in BIP after WHERE as a parameter?

How can I use a CASE statement in a BIP app after WHERE and compare it to a date?
If the date is null, show the full data; if a specific date is asked for, display the data for that date.
I didn't fully understand the question, but if you are thinking of using CASE in a WHERE clause, it could be something like this:
-- ...
WHERE
something = CASE
WHEN Nvl(date_column, To_Date('01.01.2099', 'dd.mm.yyyy')) <= other_date_column -- or a parameter or date calculation ...
THEN something
ELSE
something_else
END
-- ...
It depends on what you want to do when one of the compared dates is null, but if it is null, you could use the Nvl() function to substitute a date that is definitely out of scope and compare it with '=' or '<=' or '>=' or ... to the other value. This way you can manage the expression and do the job. This is definitely a SQL question. Regards...
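If the goal is simply "show everything when the date parameter is null, otherwise filter on it", a common alternative is an optional-parameter predicate; a minimal sketch, assuming the BIP parameter is :p_date and the column is date_column:
-- sketch of an optional date parameter; :p_date and date_column are assumed names
WHERE (:p_date IS NULL OR date_column = :p_date)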

Range of values in set statement in Hive

Is it possible to enter a range of values in the SET statement in Hive?
For example:
PROC FORMAT;
VALUE $ABCD
'3000',
'3001',
'8816' - '8820',
'1517' - '1599' = 'Y'
OTHER = 'N';
I need this SAS format statement to be converted. I have entered the values in the SET statement of Hive so I can use the 'ABCD' value in a CASE statement later, but I am unable to find a way to write out the ranges of values.
I cannot just list them out, since they are not necessarily whole numbers.
It is not just a syntax translation.
If you are working with SQL, you have to think SQL.
It seems you are looking for something similar to -
case
when mycol in (3000, 3001)
or mycol between 8816 and 8820
or mycol between 1517 and 1599
then 'Y'
else 'N'
end
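As a sketch of how this could be used in place of the SAS format, the expression can be embedded directly in a query (mytable and mycol are assumed names):
-- sketch; mytable and mycol are assumed names
SELECT mycol,
       CASE
         WHEN mycol IN (3000, 3001)
           OR mycol BETWEEN 8816 AND 8820
           OR mycol BETWEEN 1517 AND 1599
         THEN 'Y'
         ELSE 'N'
       END AS abcd
FROM mytable;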

How to perform case in an oracle concat function?

How to perform a CASE in an Oracle CONCAT function?
I want to concat a number and a letter based on the number, i.e. if the number is more than 1 then I have to append 's', otherwise not.
I tried the query below but it's not working.
Select concat(count(*) , if count(*) > 1 then 's' else '')
from tablename
group by columnname;
In Oracle it is better to use || for concatenation, and there is no IF, but there are CASE expressions. I will show the solution with || since it is best practice, but if you must use CONCAT you can do that as well. Also, Oracle will convert a number to a VARCHAR2 for you, but it is best to write your conversions explicitly.
select to_char(count(*)) || case when count(*) > 1 then 's' end from ....
NOTE - the default value of a CASE expression is NULL (which Oracle treats the same as ''), so I didn't need to write anything more.
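If CONCAT must be used, a sketch of the same query (tablename and columnname as in the question; note that Oracle's CONCAT takes exactly two arguments):
-- sketch with CONCAT; Oracle's CONCAT takes exactly two arguments
SELECT CONCAT(TO_CHAR(COUNT(*)), CASE WHEN COUNT(*) > 1 THEN 's' END)
FROM tablename
GROUP BY columnname;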

T-SQL isnumeric() replacement in hive

What is the replacement for the T-SQL isnumeric() function in hive 0.10?
http://technet.microsoft.com/en-us/library/ms186272.aspx
There isn't a direct equivalent in Hive, but you can use the cast function.
Casting anything that isn't "numeric" to double will return null, and you could use it like this:
select x from table where cast(field as double) is not null
You can check whether a number is a decimal or not with the check below. It can be used in a WHERE clause as well.
select case when abs(x % 1) > 0 then x else cast(x as bigint) end;
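In a WHERE clause, the same fractional-part check could look like this sketch (mytable and x are assumed names):
-- sketch: keep only values that have a fractional part; mytable and x are assumed names
SELECT x FROM mytable WHERE abs(x % 1) > 0;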

Dynamic order by date data type in Oracle using CASE

My code in the stored procedure:
SELECT * FROM
my_table ir
WHERE
--where clause goes here
ORDER BY
CASE WHEN p_order_by_field='Id' AND p_sort_order='ASC' THEN IR.ID end,
CASE WHEN p_order_by_field='Id' AND p_sort_order='DESC' THEN IR.ID end DESC,
CASE WHEN p_order_by_field='Date' AND p_sort_order='ASC' THEN TO_CHAR(IR.IDATE, 'MM/dd/yyyy') end,
CASE WHEN p_order_by_field='Date' AND p_sort_order='DESC' THEN TO_CHAR(IR.IDATE, 'MM/dd/yyyy') end DESC;
The problem is that sorting is done on the char value, which comes out wrong for the date case. The CASE expression, however, won't allow any datatype other than char here. So what is the solution in this case? I need to be able to pass p_order_by_field into the stored procedure.
Thanks
Should be simple - just use ISO date format in your case:
TO_CHAR(IR.IDATE, 'yyyy-mm-dd')
and you should be fine.
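Applied to the procedure above, only the date branches change; a sketch:
-- sketch: the date branches use the ISO format so the string sort matches date order
ORDER BY
  CASE WHEN p_order_by_field = 'Id'   AND p_sort_order = 'ASC'  THEN IR.ID END,
  CASE WHEN p_order_by_field = 'Id'   AND p_sort_order = 'DESC' THEN IR.ID END DESC,
  CASE WHEN p_order_by_field = 'Date' AND p_sort_order = 'ASC'  THEN TO_CHAR(IR.IDATE, 'yyyy-mm-dd') END,
  CASE WHEN p_order_by_field = 'Date' AND p_sort_order = 'DESC' THEN TO_CHAR(IR.IDATE, 'yyyy-mm-dd') END DESC;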
Another problem could occur when you want to sort on a date difference (say, the number of days between two dates).
For example, such a sort would return 13 (days) before 9 (days).
The solution is to concatenate the length of the date difference with the difference itself:
length(trunc(date2) - trunc(date1)) || to_char(date2 - date1)
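For example, differences of 9 and 13 days would otherwise sort as the strings '13' < '9'; with the length prefix they become '19' and '213', which sort in the right order (assuming non-negative differences). As an ORDER BY it might look like this sketch (date1 and date2 as above):
-- sketch; date1/date2 as above, non-negative differences assumed
ORDER BY length(trunc(date2) - trunc(date1)) || to_char(date2 - date1)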
