Oracle: found a date out of range: -5000 BC?! (inserted by Hibernate)

As you may know, Oracle's limit for the year in DATE fields is -4713 to +9999.
I'm working on an 11g database and I've found several records with a year out of range (-5386, -5459, -5592, etc.).
In SQL Developer I tried:
alter session set nls_date_format='DD/MM/YYYY HH:MI AD';
select the_date_field, to_char(the_date_field, 'DD/MM/YYYY HH:MI AD') FROM the_table WHERE ...
It gives me the result:
26/09/5386 05:23 AV.J.-C. | 00/00/0000 00:00 00000000
(French display; "AV. J.-C." means BC)
I got an impossible date, and the TO_CHAR function can't handle it and returns "00/00/0000" (yeah, another impossible date!); yet SQL Developer displays this field just fine.
I also tried:
select the_date_field + 1 FROM the_table WHERE ...
And I get: ORA-01841: (full) year must be between -4713 and +9999, and not be 0
So it really does look like a real DATE.
"describe the_table" gives: "the_date_field DATE"
This data was inserted by a J2EE application using Hibernate v3.0.5, with oracle.jdbc.driver.OracleDriver v12.1.0.2.0.
The Hibernate mapping for "the_date_field" is set to type "timestamp" (that shouldn't be a problem, should it?).
The issue keeps recurring over time, but I still haven't figured out how to reproduce it.
The other billions of records in this old application have no problem.
Does anyone have an idea of what's going on?
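To confirm that the stored value itself is out of range (and not just a display quirk), one more diagnostic worth running, sketched here with the same placeholder table, column and WHERE clause as above, is to look at the raw internal bytes:
-- DUMP() shows the 7 bytes Oracle stores for a DATE (century+100, year+100,
-- month, day, hour+1, minute+1, second+1), so an impossible year shows up
-- directly in the first two bytes
select the_date_field,
       dump(the_date_field) as internal_bytes
  from the_table
 where ...;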

Related

ORA-01841: (full) year must ... not be 0. How to reproduce and to fix?

Sorry for asking a question that already has a lot of answers on Stack Overflow, but allow me to pose it in my context, which may differ from the previous questions.
I'm on a production database, where I CANNOT CHANGE data. The data going into this database is highly dynamic; it changes all the time, which makes it hard to reproduce the error. I'm accessing Oracle 11g via JDBC (Java).
OK, for my DELETE I get the following error:
ORA-01841: (full) year must be between -4713 and +9999, and not be 0.
This is my table (simplified):
MY_TABLE
Name                         Null?    Type
---------------------------- -------- ---------------------------
MYTIMESTAMP                  NOT NULL TIMESTAMP(6) WITH TIME ZONE
From time to time I get the ORA-01841 for this DELETE:
delete from MY_TABLE where MYTIMESTAMP < sysdate - 30
When I look up the data, all seems fine. So here is where I need ideas:
1) How can I insert an invalid timestamp into MY_TABLE, so that I can reproduce that error? (*)
2) How can I rewrite the DELETE statement so that it won't fail? Please note, I cannot change the existing data on Oracle.
Thank you
(*) I tried to insert these "invalid" dates, but alas, they were not invalid enough:
insert into MY_TABLE (MYTIMESTAMP) values ('31-DEC-9999 11:00:00 PM +2:00')
insert into MY_TABLE (MYTIMESTAMP) values ('31-DEC-0 11:00:00 PM +2:00')
I see that error a lot when there are implicit type conversions going on. Can you maybe try:
delete from MY_TABLE where MYTIMESTAMP < systimestamp - interval '30' day
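For the reproduction attempts in (*), it may also be worth ruling out the implicit string-to-timestamp conversion in the INSERT itself. A sketch, using the same simplified table and the boundary value from the question:
insert into MY_TABLE (MYTIMESTAMP)
values (to_timestamp_tz('31-12-9999 23:00:00 +02:00',
                        'DD-MM-YYYY HH24:MI:SS TZH:TZM'));
-- the explicit format mask means the result does not depend on session NLS settings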

How to find the issue with dual in Oracle?

I am using Oracle SQL Developer. I am using the following query to get the current timestamp.
select to_char(CURRENT_TIMESTAMP,'DDMMYYYY/HHMMSS') from dual;
In this, the minutes are constantly shown as 10. But when I don't use to_char, it works fine. How can I find what went wrong? Is there any way to correct this?
You should use MI instead of MM in HHMMSS. MI stands for minutes, MM is for months, and it is currently October, hence the 10.
You can find the available formatting options at Oracle's site.
select to_char(CURRENT_TIMESTAMP,'DDMMYYYY/HHMMSS') from dual;
The issue is with the HHMMSS - MM is used to represent the month number. MI is what is used to represent minutes.
So what you're really after is:
select to_char(CURRENT_TIMESTAMP,'DDMMYYYY/HHMISS') from dual;
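If you want to see the difference side by side (purely illustrative):
select to_char(CURRENT_TIMESTAMP, 'DDMMYYYY/HHMMSS') as month_in_time_part,
       to_char(CURRENT_TIMESTAMP, 'DDMMYYYY/HHMISS') as minutes_in_time_part
  from dual;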

Oracle date corruption during update

I'm migrating some data from one Oracle schema/table to a new schema/table on the same database.
The migration script does the following:
create table newtable as select
...
cast(ACTIVITYDATE as date) as ACTIVITY_DATE,
...
FROM oldtable where ACTIVITYDATE > sysdate - 1000;
If I look at the original data, it looks fine - here's one record:
select
activitydate,
to_char(activitydate, 'MON DD,YYYY'),
to_char(activitydate, 'DD-MON-YYYY HH24:MI:SS'),
dump(activitydate),
length(activitydate)
from orginaltable where oldpk = 1067514
Result:
18-NOV-10 NOV 18,2010 18-NOV-2010 12:59:15 Typ=12 Len=7: 120,110,11,18,13,60,16
The migrated data, showing that the data is corrupt:
select
activity_date,
to_char(activity_date, 'MON DD,YYYY'),
to_char(activity_date, 'DD-MON-YYYY HH24:MI:SS'),
dump(activity_date),
length(activity_date)
from newtable
where id = 1067514
Result:
18-NOV-10 000 00,0000 00-000-0000 00:00:00 Typ=12 Len=7: 120,110,11,18,13,0,16
Around 5000 out of 350k records show this problem.
Can anyone explain how this happened?
UPDATE:
I don't find any published reference to this specific type of DATE corruption on the Oracle support site. (It may be there, my quick searches just didn't turn it up.)
Baddate Script To Check Database For Corrupt dates [ID 95402.1]
Bug 2790435 - Serial INSERT with parallel SELECT and type conversion can insert corrupt data [ID 2790435.8]
The output from the DUMP() function is showing the date value is indeed invalid:
Typ=12 Len=7: 120,110,11,18,13,0,16
We expect that the minutes byte should be a value between one and sixty, not zero.
The 7 bytes of a DATE value represent, in order, century(+100), year(+100), month, day, hour(+1), minutes(+1), seconds(+1).
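As a worked check on the healthy row above (Typ=12 Len=7: 120,110,11,18,13,60,16): century 120-100 = 20 and year 110-100 = 10 give 2010; month 11 and day 18 are stored as-is; hour 13-1 = 12, minutes 60-1 = 59, seconds 16-1 = 15, matching 18-NOV-2010 12:59:15. In the corrupt row the minutes byte is 0, which would have to decode to -1 minutes, hence the garbage TO_CHAR output.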
The only time I have seen invalid DATE values like this is when a DATE value was being supplied as a bind variable from a Pro*C program (where the bind value is supplied in the internal 7-byte representation, entirely bypassing the normal validation routines that catch invalid dates, e.g. Feb 30).
There is no reason to expect the behavior you're seeing, given the Oracle syntax you posted.
This is either a spurious anomaly (memory corruption?) or if this is repeatable, then it's a flaw (bug) in the Oracle code. If it's a flaw in the Oracle code, the most likely suspects would be "newish" features in an un-patched release.
(I know CAST is a standard SQL function that's been around for ages in other databases. I guess I'm old school, and have never introduced it into my Oracle-syntax repertoire. I don't know what version of Oracle it was that introduced the CAST, but I would have stayed away from it in the first release it appeared in.)
The big 'red flag' (that another commenter noted) is that CAST( datecol AS DATE).
You would expect the optimizer to treat that as equivalent to date_col ... but past experience shows us that TO_NUMBER( number_col ) is actually interpreted by the optimizer as TO_NUMBER( TO_CHAR ( number_col ) ).
I suspect something similar might be going on with that unneeded CAST.
Based on that one record you showed, I suspect that values with a "59" for minutes or seconds, and possibly a "23" for hours, would be the ones that show the error.
I would try checking for places where the minutes, hour or seconds are stored as 0:
SELECT id, DUMP(activity_date)
  FROM newtable
 WHERE DUMP(activity_date) LIKE '%,0,%'
    OR DUMP(activity_date) LIKE '%,0'
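If that unneeded CAST really is the trigger, a workaround sketch (untested, and keeping the same elisions as the original script) would be to simply drop it, since ACTIVITYDATE is already a DATE:
create table newtable as select
...
ACTIVITYDATE as ACTIVITY_DATE,
...
FROM oldtable where ACTIVITYDATE > sysdate - 1000;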
I've seen things similar to what spence7593 describes, again with Pro*C.
It is possible to create invalid dates programmatically using the DBMS_STATS package.
I'm not sure if there is a similar mechanism to reverse that.
create or replace function stats_raw_to_date (p_in raw) return date is
  v_date date;
begin
  -- CONVERT_RAW_VALUE writes the raw bytes straight into the DATE variable,
  -- bypassing the validation that would normally reject an impossible date
  dbms_stats.convert_raw_value(p_in, v_date);
  return v_date;
exception
  when others then return null;
end;
/
-- the argument below is the corrupt 7-byte pattern from the DUMP() output above
select stats_raw_to_date(utl_raw.cast_to_raw(
         chr(120)||chr(110)||chr(11)||chr(18)||chr(13)||chr(0)||chr(16)))
  from dual;

Oracle Interval Bug?

I'm using this sql query:
select sysdate, sysdate - INTERVAL '6' month from dual;
But it returns: ORA-01839: date not valid for month specified.
Which is weird, because if I change the number to 9, it returns the date (sysdate = 31/05/11 and the subtracted date is 31/08/10). I also tried different values: 1, 3, 6, 8, 11 do not work, but 2, 4, 5, 7, 9, 12 do.
From the numbers, I think it is because the resulting month doesn't have 31 days. Is this the expected behavior? In MySQL I can use the query (select now() - interval 6 month;) to get the correct value. Is there any other way?
I am using Oracle 11.1.0.6
It is the expected behaviour; see the sixth bullet in the datetime/interval arithmetic section of the documentation.
As Lisa says, you can use add_months, which has the opposite behaviour (which can also cause confusion sometimes). You need to decide which is most suitable for you.
select sysdate,add_months(sysdate,-6) from dual;
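To see the difference on a month-end date (illustrative, using the 31 May sysdate from the question):
select date '2011-05-31' - interval '6' month from dual;
-- raises ORA-01839, because 31-NOV-2010 does not exist
select add_months(date '2011-05-31', -6) from dual;
-- returns 30-NOV-2010, the last day of the shorter month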

Sybase equivalent in Oracle

I am converting code from Sybase to Oracle.
I am having trouble converting the query below to Oracle.
Select Custodian_addr, convert(datetime, dateadd(ss, CreateDT, "01/01/1970"))
Here CreateDT is the column whose value, for instance, is 1015015173.
The result of the date conversion (for this example) is March 1 2002 8:39 PM GMT.
I researched and found an Oracle alternative, which results in an error:
Select Custodian_addr,to_char(CreateDT,"SS")
I am getting a query error in Oracle. I am not able to identify what's wrong. Since I am executing this in Perl, does ["] have to be escaped, or what might be the issue? Please suggest a solution.
SELECT 'some address' as custodian_addr,
date '1970-01-01' + 1015015173/86400 as create_dt
from dual
/
CUSTODIAN_AD CREATE_DT
------------ -------------------
some address 2002-03-01 20:39:33
Oracle date arithmetic is pretty simple -- adding 1 to a date increments it by 1 day. So since that number is seconds, dividing it by 86400 (60*60*24) casts that number as a number of days (and fractions thereof).
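Applied to your actual column (the source table name is not given in the question, so your_table here is a placeholder):
select Custodian_addr,
       date '1970-01-01' + CreateDT / 86400 as create_dt
  from your_table;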
What is the error you get?
Your TO_CHAR function should not use double quotes. It needs to be
TO_CHAR(CreateDT,'SS')
If you put something in double quotes in Oracle, it's interpreted as an identifier, not a string constant.
