I have a string like so: "2014-09-02T03:01:09.8093664Z", and I'm trying to convert it into the local timezone. I tried from_utc_timestamp(eventTime, 'GMT') and from_utc_timestamp(eventTime, 'PDT'), but Hive just returns an error:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":"2014-09-02T03:01:09.8093664Z",
.
.
.
... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating Converting field _col0 from UTC to timezone: 'PDT'
Am I doing something wrong here?
I searched Stack Overflow and did not find a solution to this problem (Local Time Convert To UTC Time In Hive is related but doesn't solve it)
from_unixtime(UNIX_TIMESTAMP("2014-09-02T03:01:09Z", "yyyy-MM-dd'T'HH:mm:ss'Z'"), "yyyy-MM-dd HH:mm:ss")
This converts it to 2014-09-02 03:01:09
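Building on that, a minimal end-to-end sketch of the conversion to a local zone (an untested sketch: it assumes the fractional seconds can simply be stripped, uses the region ID America/Los_Angeles rather than an abbreviation like PDT, and my_table is a placeholder name; eventTime is the question's column):
-- Strip fractional seconds, parse the ISO string as UTC, then shift zones.
select from_utc_timestamp(
         from_unixtime(UNIX_TIMESTAMP(
           regexp_replace(eventTime, '\\.\\d+Z$', 'Z'),
           "yyyy-MM-dd'T'HH:mm:ss'Z'")),
         'America/Los_Angeles')
from my_table;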
A useful way to solve this problem is to create a UDF to perform the operation. The new function could be specific to this case or generalized to handle more datetime format conversions. Some benefits:
It makes your Hive query more readable.
It avoids duplicated code if you need this operation in other queries.
It makes your system more scalable, because you can update the method whenever you want.
It delegates the complex operations to Java code, so you will be able to test those complex parts.
You can read more about how to create a custom UDF here.
If you need to know how to implement this method in Java, I've found a Stack Overflow post that explains one way to do it; here is the entry.
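For illustration, registering and using such a UDF from Hive might look like this (the jar path, class name, and function name are all hypothetical):
ADD JAR /path/to/udfs.jar;  -- hypothetical jar containing the compiled UDF
CREATE TEMPORARY FUNCTION to_local_time AS 'com.example.ToLocalTimeUDF';  -- hypothetical class
SELECT to_local_time(eventTime, 'America/Los_Angeles') FROM my_table;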
You must first extract the date and time string in the proper format before you convert it to GMT; from_utc_timestamp requires the format 'yyyy-MM-dd HH:mm:ss'.
Use regexp_replace to extract the string and then pass the result to the from_utc_timestamp function, like this:
select from_utc_timestamp(regexp_replace(event_time, '^(\\d{4}-\\d{2}-\\d{2})T(\\d{2}:\\d{2}:\\d{2}).*', '$1 $2'), 'GMT') from my_table;
Your output is then: 2014-09-02 03:01:09
Good luck!
I build a dynamic dictionary in ClickHouse. The dictionary is created before the SQL commands execute, and the dictionary name is generated at the same time.
The dictionary name is a string combining a date and a table name. Because I don't want to keep so much data in a long-term dictionary, a short-term dictionary is a great help for releasing memory after, say, an hour when no one is using it.
And then the error happened.
When I use the dictionary in my SQL commands, it works fine with a constant name.
e.g.
dictGet(CONCAT('2022-04-06', 'MyTableName'), 'GetBackColumnName',myKeyColumn) AS col_name
But when I switch to using the column from the table, it breaks.
e.g.
dictGet(CONCAT(DATE_COL, 'MyTableName'), 'GetBackColumnName',myKeyColumn) AS col_name
And the error message shows up.
Illegal type String of the first argument of function dictGet,
expected a const string.
Does anyone know how to fix the issue?
My CH version is: 20.8.7.15
I tried to find a resolution in the ClickHouse official repo, but nothing fixed the issue. I also tried lots of String functions to figure out what was happening.
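For what it's worth, since dictGet insists on a constant name, one possible direction is to branch over constant-name calls. This is only a sketch, assuming the set of dictionary names is known in advance and every listed dictionary exists:
-- Each dictGet call keeps a constant first argument.
-- Note: multiIf evaluates all branches, so every listed dictionary must exist.
SELECT multiIf(
    DATE_COL = '2022-04-06', dictGet('2022-04-06MyTableName', 'GetBackColumnName', myKeyColumn),
    DATE_COL = '2022-04-07', dictGet('2022-04-07MyTableName', 'GetBackColumnName', myKeyColumn),
    NULL) AS col_name
FROM my_table;  -- my_table is a placeholder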
Hello everyone and happy new year,
I'm converting my project to use SQL Server instead of MySQL and I'm struggling with the problem of managing timestamps.
In the project, I have this code:
Customers::whereBetween('created_at', [Carbon::now()->subDays(7), Carbon::now()])->count();
which gives me back the number of new customers registered in the last 7 days.
With MySQL, no problem whatsoever, while with SQL Server I get this error:
Converting an nvarchar data type to datetime resulted in a value
outside of the allowable range.
despite having set this in my model:
public function getDateFormat()
{
return 'Y-m-d H:i:s.v';
}
to get the values in milliseconds.
What did I forget to set up?
The error tells you that a conversion failed. Possible reasons:
The date does not exist
If you get February 30th as the input, it will fail.
The input does not comply with the format in use
To detect whether this is the problem, you will need to find out what the generated SQL is and which value caused the failure. After carefully studying the conversion, you should be able to determine what the problem and solution are.
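As a sketch of that debugging step: once you have the offending literal from the generated SQL, SQL Server's TRY_CONVERT returns NULL instead of raising the error, so you can test candidate values directly (the literals below are placeholders):
SELECT TRY_CONVERT(datetime, N'2019-01-05 13:07:09.123');  -- succeeds
SELECT TRY_CONVERT(datetime, N'1200-01-01 00:00:00');      -- NULL: datetime only covers 1753-01-01 onwards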
I am doing functionality testing (writing Hive code while referring to Scala code) in my project, and I am having an issue with the date functions in my code. In Scala we cast our date data type to string and changed its structure to 'YYYYMM', so the value inside my date column looks like 201706 (YYYYMM), which is not accepted in Hive (I read that it accepts only YYYY-MM-DD).
My questions are:
1) How do I change YYYYMM to YYYY-MM-DD? I have tried casting to date and also UNIX_TIMESTAMP, but neither of them works; the query fails at the end.
2) We are also using filter.to_date(colm1, "YYYYMM").between(add_months(to_date(colm2, "YYYYMM"), -27), add_months(to_date(colm2, "YYYYMM"), -2)) in our Scala code. How can I change that to Hive? I'm unable to get any ideas.
Thanks in advance.
Regards,
M Sontosh Aditya
use
unix_timestamp(DATE_COLUMN, string pattern)
For further understanding, please refer to the Hive DateFunctions documentation.
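As a sketch of both parts in Hive (my_table is a placeholder; colm1 and colm2 are the question's columns, assumed to hold strings like 201706):
-- 1) YYYYMM -> YYYY-MM-DD (resolves to the first day of the month)
select from_unixtime(unix_timestamp(colm1, 'yyyyMM'), 'yyyy-MM-dd') from my_table;
-- 2) One possible Hive translation of the Scala between-filter
select *
from my_table
where from_unixtime(unix_timestamp(colm1, 'yyyyMM'), 'yyyy-MM-dd')
  between add_months(from_unixtime(unix_timestamp(colm2, 'yyyyMM'), 'yyyy-MM-dd'), -27)
      and add_months(from_unixtime(unix_timestamp(colm2, 'yyyyMM'), 'yyyy-MM-dd'), -2);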
FILTER("source"."recordCount" USING "source"."snapshot_date" =
EVALUATE('TO_CHAR(%1, ''YYYYMMDD'')', TIMESTAMPADD(SQL_TSI_DAY, -7, EVALUATE('TO_DATE(%1, %2)', "source"."snapshot_date" , 'YYYYMMDD'))))
So I have this piece of code here. I know some will say "just use the AGO function", but somehow that causes problems because of its connections with other tables, so what I'm trying to achieve here is something like a remake. The process goes this way:
The snapshot_date there is actually in varchar format, not date. So it's like "20131016", and I'm trying to change it to a date, then subtract 7 days from it using the TIMESTAMPADD function, and finally return it to varchar to use it with FILTER.
This snippet works when testing the FILTER with hardcoded values like "20131016", but when tested with the code above, all the rows are blank. On paper, the process I assumed would happen goes like this: "20131016" turns into a date with a format of 20131016 (yyyymmdd), then minus 7 days gives 20131009, which is then turned into char again, "20131009", to be used in the filter.
But somehow that doesn't happen. I think the date format is not being applied to either the string->date or the date->string conversion, which results in the values not matching at all.
Anyone have any idea what's wrong with my code?
By the way, I've already tried using CAST instead of EVALUATE, and also TO_TIMEDATE, with the same result. Oh, and this goes in the formula of the column in the BMM.
Thanks
You might get some clues by looking at the SQL generated by the BI Server. I can't see any issues with your column expression, so I wouldn't limit your debugging to that alone.
A query returning nulls is often caused by incorrect levels being set (especially on logical table sources, but potentially on a measure column too). This will often result in some form of SELECT NULL FROM ... in the physical SQL.
Try this:
FILTER("source"."recordCount" USING "source"."snapshot_date" =
EVALUATE('TO_CHAR(%1, %2)', TIMESTAMPADD(SQL_TSI_DAY, -7, EVALUATE('TO_DATE(%1, %2)', TO_CHAR("source"."snapshot_date" , 'YYYYMMDD') , 'YYYYMMDD')) , 'YYYYMMDD'))
I'm storing a simple java.util.Date in an Oracle XE database via Hibernate.
When testing with JUnit whether I can retrieve the correct value, I get an error like this:
junit.framework.AssertionFailedError:
expected:<Sun Dec 28 11:20:27 CET 2008>
but was:<2008-12-28 11:20:27.0>
The value is stored in an Oracle DATE column (which should have second precision), and it looks okay to me. Also, I'm surprised that 11:20:27 is not equal to 11:20:27.0. Or does this have to do with timezones?
Any help is welcome.
Thorsten
Okay, worked some more on it ...
Oracle Date columns only store values with an accuracy of a second.
Java Dates do contain milliseconds, but they are typically not printed. So
the expected value <Sun Dec 28 11:20:27 CET 2008> was actually created by a Date like 11:20:27.345, which is of course not equal to 11:20:27.0.
Solution:
either only use full-second dates to store and retrieve,
or
get Hibernate to create the correct Oracle datatype (TIMESTAMP); this is very dependent on the dialect specified in the Hibernate config (OracleDialect and Oracle10gDialect create different types).
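To see the precision difference directly, a minimal sketch in Oracle SQL (the table name is hypothetical):
-- DATE truncates to whole seconds; TIMESTAMP keeps fractional seconds.
create table date_demo (d date, ts timestamp);
insert into date_demo values (systimestamp, systimestamp);
select to_char(d,  'YYYY-MM-DD HH24:MI:SS')     as date_val,
       to_char(ts, 'YYYY-MM-DD HH24:MI:SS.FF3') as ts_val
from date_demo;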
If you compare a java.util.Date to a java.sql.Date that both represent the same instant in time, equals(Object) will return false (it considers two objects of different classes to never be equal).
Your tests need to account for that. The easiest way to do this is to convert the dates to UNIX time (e.g. java.util.Date.getTime()) and compare those values.