Insert a String into a TIMESTAMP(6) column - Oracle

I have timestamps looking like this: 2019-06-13 13:22:30.521000000
I am using Spark/Scala scripts to insert them into an Oracle table. The column in Oracle is TIMESTAMP(6) and should stay that way.
This is what I do. In Spark I have a df containing a column with my timestamps:
+-----------------------------+
| time |
+-----------------------------+
|2019-06-13 13:22:30.521000000|
+-----------------------------+
I do the following:
df.withColumn("time", (unix_timestamp(substring(col("time"), 1, 23), "yyyy-MM-dd HH:mm:ss.SSS") + substring(col("time"), -6, 6).cast("float") / 1000000).cast(TimestampType))
and I insert using a connection to Oracle (the insert script was tested and works fine).
But in Oracle I only see the following in my table:
+--------------------------+
| time |
+--------------------------+
|2019-06-13 13:22:30.000000|
+--------------------------+
The milliseconds aren't included. Any help please? Thank you!

If your time column is a timestamp type, you can try date_format:
https://sparkbyexamples.com/spark/spark-sql-how-to-convert-date-to-string-format/
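For what it's worth, a minimal sketch of that idea (the column name comes from the question; whether the intermediate cast keeps the milliseconds depends on the Spark version, so treat this as an assumption to verify):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, date_format}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// Mirrors the question's single-column DataFrame of timestamp strings
val df = Seq("2019-06-13 13:22:30.521000000").toDF("time")

// date_format needs a real timestamp, so the string is cast first;
// older Spark versions may drop the fractional part during this cast.
df.withColumn("time_str",
    date_format(col("time").cast("timestamp"), "yyyy-MM-dd HH:mm:ss.SSS"))
  .show(false)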

Thanks to everyone who tried to help.
This is what I did to get the desired output:
df.withColumn("time", (unix_timestamp(substring(col("time"), 1, 23), "yyyy-MM-dd HH:mm:ss.SSS") + substring(col("time"), -9, 9).cast("float") / 1000000000).cast(TimestampType))
All the other solutions kept returning null or timestamps without the milliseconds.
Hope it helps someone.
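For reference, a self-contained version of this accepted approach (Spark 2.x assumed; the one deliberate change is casting the fractional part to double instead of float, to avoid single-precision rounding of the 9-digit nanosecond field):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, substring, unix_timestamp}
import org.apache.spark.sql.types.TimestampType

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq("2019-06-13 13:22:30.521000000").toDF("time")

// unix_timestamp yields whole seconds only; the last 9 characters are the
// fractional seconds in nanoseconds, added back before the final cast.
df.withColumn("time",
    (unix_timestamp(substring(col("time"), 1, 23), "yyyy-MM-dd HH:mm:ss.SSS")
      + substring(col("time"), -9, 9).cast("double") / 1000000000)
      .cast(TimestampType))
  .show(false)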

I don't know the tools you use, but if it were only Oracle, then TO_TIMESTAMP with an appropriate format mask would do the job. See if it helps.
SQL> create table test (col timestamp(6));
Table created.
SQL> insert into test (col) values
2 (to_timestamp('2019-06-13 13:22:30.521000000', 'yyyy-mm-dd hh24:mi:ss.ff'));
1 row created.
SQL> select * From test;
COL
---------------------------------------------------------------------------
13.06.19 13:22:30,521000
SQL>
[EDIT, as you can't read my mind (at least, I hope so)]
As you (AbderrahmenM) said, you have a string but still want to insert a timestamp, so perhaps you could use a stored procedure. Here's an example:
SQL> create or replace procedure p_test (par_time in varchar2)
2 is
3 begin
4 insert into test (col) values
5 (to_timestamp(par_time, 'yyyy-mm-dd hh24:mi:ss.ff'));
6 end;
7 /
Procedure created.
SQL> exec p_test('2019-06-13 13:22:30.521000000');
PL/SQL procedure successfully completed.
SQL> select * from test;
COL
-------------------------------------------------------------------
13.06.19 13:22:30,521000
SQL>
Now, the only thing I can't help with is how to call a procedure from Spark. If you know how, simply pass the string you have and it should be properly inserted into the database; just pay attention to the correct format mask!
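In case it helps, here is a hedged sketch of calling that procedure from the Spark driver over plain JDBC (the URL and credentials are placeholders, not anything from the question; requires the Oracle JDBC driver on the classpath):

import java.sql.DriverManager

val conn = DriverManager.getConnection(
  "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1", "scott", "tiger")
try {
  conn.setAutoCommit(false)
  val stmt = conn.prepareCall("{ call p_test(?) }")
  // Pass the raw string; TO_TIMESTAMP inside the procedure does the parsing
  stmt.setString(1, "2019-06-13 13:22:30.521000000")
  stmt.execute()
  stmt.close()
  conn.commit()
} finally {
  conn.close()
}

For a whole DataFrame, the same calls would go inside df.foreachPartition, with one connection per partition.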

Related

Oracle vs HANA char data type handling

We have Oracle as the source and HANA 1.0 SPS12 as the target. We are mirroring Oracle to HANA with Informatica CDC through real-time replication. In Oracle, many columns have the CHAR datatype, i.e. fixed length. As HANA doesn't officially support the CHAR datatype, we are using NVARCHAR instead. The problem we are facing: since Oracle's CHAR datatype is fixed-length and appends spaces whenever the actual string is shorter than the declared length, we end up with a lot of extra trailing spaces for such columns in the target HANA db.
For example, if column col1 has datatype
CHAR(5)
and the value 'A', it is replicated in HANA as 'A    ', i.e. 'A' followed by four extra spaces, causing a lot of problems in queries and data interpretation.
Is it possible to implement a CHAR-like datatype in HANA?
You can use the RPAD function in Informatica while transferring data to HANA. Just make sure HANA doesn't trim the padding automatically.
So, for the CHAR(5) source column you should use:
out_Column = RPAD(input_Column, 5)
Pretty much exactly as the documentation describes.
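Not Informatica, but if the transfer ever ran through Spark instead, the equivalent padding might look like this (width 5 assumed from the question's CHAR(5) example):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, rpad}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq("A").toDF("col1")
// Same idea as the Informatica expression RPAD(input_Column, 5)
df.withColumn("col1", rpad(col("col1"), 5, " ")).show(false)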
I don't know HANA and this is more a comment than an answer, but I chose to put it here as there's some code I'd like you to see.
Here's a table whose column is of a CHAR datatype:
SQL> create table test (col char(10));
Table created.
SQL> insert into test values ('abc');
1 row created.
Column's length is 10 (which you already know):
SQL> select length(col) from test;
LENGTH(COL)
-----------
10
But, if you TRIM it, you get a better result, the one you're looking for:
SQL> select length( TRIM (col)) from test;
LENGTH(TRIM(COL))
-----------------
3
SQL>
So: if you can persuade the mirroring process to apply TRIM function to those columns, you might get what you want.
[EDIT, after seeing Lars' comment and re-reading the question]
Right; the problem seems to be just the opposite of what I initially understood. If that's the point, maybe RPAD would help. Here's an example:
SQL> create table test (col varchar2(10));
Table created.
SQL> insert into test values ('abc');
1 row created.
SQL> select length(col) from test;
LENGTH(COL)
-----------
3
SQL> insert into test values (rpad('def', 10, ' '));
1 row created.
SQL> select col, length(col) len from test;
COL               LEN
---------- ----------
abc                 3
def                10
SQL>

What is the best data type for a field of format "YYYY-MM-DD" in Oracle 11g?

I'm creating tables in Oracle 11g and came across a date field of format "YYYY-MM-DD".
I don't want to use VARCHAR2 for this, and when I use NUMBER(5), it still accepts the input. What's the meaning of the limit 5 here, then?
Please suggest the best datatype I can use here.
This is, obviously, a date format mask. If you're about to store dates into that column, you should use the DATE datatype, such as
SQL> create table test
2 (datum date);
Table created.
Don't use the VARCHAR2 (put strings into it, not dates) or NUMBER (put numbers into it, not dates) datatypes for that. You'll regret it sooner than you think. (As for NUMBER(5): the 5 is precision, i.e. the maximum number of significant digits, not a character length, which is why your input was still accepted.)
I'm going to enter some values into the table, showing different ways you could do that - it is important that you insert dates, not strings. Never rely on Oracle implicitly converting strings you provide into dates; sooner or later, it'll produce an error.
SQL> insert into test values (date '2018-12-25');
1 row created.
SQL> insert into test values (to_date('09.05.2018', 'dd.mm.yyyy'));
1 row created.
SQL> insert into test values (sysdate);
1 row created.
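As an aside, the same "insert dates, not strings" rule applies from client code; a hedged JDBC sketch, with placeholder connection details:

import java.sql.{Date, DriverManager}
import java.time.LocalDate

val conn = DriverManager.getConnection(
  "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1", "scott", "tiger")
try {
  val ps = conn.prepareStatement("insert into test (datum) values (?)")
  // Binding a java.sql.Date means no NLS-dependent string parsing on the way in
  ps.setDate(1, Date.valueOf(LocalDate.of(2018, 12, 25)))
  ps.executeUpdate()
  ps.close()
} finally {
  conn.close()
}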
Now, several ways of selecting that value:
This one returns date in a format currently set by my database's NLS settings:
SQL> select * from test;
DATUM
--------
25.12.18
09.05.18
09.05.18
I'm forcing it to return values in desired format, using ALTER SESSION:
SQL> alter session set nls_date_format = 'yyyy-mm-dd';
Session altered.
SQL> select * from test;
DATUM
----------
2018-12-25
2018-05-09
2018-05-09
Yet another format; note that the value inserted via the SYSDATE function (which returns a DATE) contains both date and time components. The time was "invisible" in the previous examples:
SQL> alter session set nls_date_format = 'dd.mm.yyyy hh24:mi:ss';
Session altered.
SQL> select * from test;
DATUM
-------------------
25.12.2018 00:00:00
09.05.2018 00:00:00
09.05.2018 08:03:50
Using the TO_CHAR function with some format (such as dd-mon-yyyy); I'm also asking Oracle to "translate" the month name into English (as my database works in Croatian):
SQL> select to_char(datum, 'dd-mon-yyyy', 'nls_date_language = english') datum from test;
DATUM
-----------
25-dec-2018
09-may-2018
09-may-2018
SQL>
[EDIT]
Oracle doesn't store DATE values in any "human-readable" format (there's more to read about this on the Internet; Google for it). A format mask is only how the value is presented to you.
I strongly suggest you NOT store dates in any datatype but DATE. Anything else is a time bomb waiting to explode (and it'll hurt when it does). Nothing stops someone from entering a value like '1234-99-66' or '12-345-678'; what will you do with it then?
Consider creating a view on top of the table which uses the TO_CHAR function and returns the value in the format you want ('yyyy-mm-dd'). The DATE column in the table makes sure the values are valid, and the view presents them in the format the third-party application expects.
For example:
SQL> create view v_test as
2 select to_char(datum, 'yyyy-mm-dd') datum
3 from test;
View created.
SQL> select * from v_test;
DATUM
----------
2018-12-25
2018-05-09
2018-05-09
SQL>
So: you wouldn't let the third-party application access the table; it would access the view instead.
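And if the "third-party application" happens to be a Spark job, it can read the view the same way it would read a table. A sketch, assuming an existing SparkSession named spark, placeholder connection details, and the Oracle JDBC driver on the classpath:

val fromView = spark.read.format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1")
  .option("user", "scott")
  .option("password", "tiger")
  .option("dbtable", "v_test") // the view, not the underlying table
  .load()
fromView.show(false)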

Inserting a record with a TIMESTAMP(6) field from Oracle into PostgreSQL via dblink loses timestamp precision - a bug?

I am having trouble inserting a new record into PostgreSQL from an Oracle database server. After inserting a new record into PostgreSQL, I lose the precision of timestamp fields; all the microsecond digits are lost. Here is my sample code:
declare
v_date timestamp(6):=to_timestamp('2013-06-04 12:03:01.123456','YYYY-MM-DD HH24:MI:SS.FF6');
begin
dbms_output.put_line (v_date);
insert into "public"."DAS_ITEM"#PG_LINK
("DOCKID","CANDY_ITM_NBR","MODIFIED_ON") VALUES (1,3, v_date);
commit;
end;
After running the PL/SQL block, I query the data from Postgres directly:
select "DOCKID","CANDY_ITM_NBR", to_char("MODIFIED_ON", 'YYYY-MM-DD HH24:MI:SS.US') from "DAS_ITEM";
and here is the result:
 DOCKID | CANDY_ITM_NBR |        MODIFIED_ON
--------+---------------+----------------------------
      1 |             3 | 2013-06-04 12:03:01.000000
Currently the value of the MODIFIED_ON field is '2013-06-04 12:03:01.000000'; I expected it to be '2013-06-04 12:03:01.123456'.
Please help me; I have been stuck on this for 36 hours.

Default milliseconds to 0 in Oracle

I have a timestamp column in Oracle that has format 'MM/DD/YYYY HH24:MI:SSxFF6'.
The data looks like below:
11/09/1917 10:45:28.230000
10/19/2014 18:09:28.410000
12/19/2011 11:06:28.340000
I need the timestamp to retain its value, except that the milliseconds need to be defaulted to 000000.
I tried:
cast(to_char(Local_time, 'MM/DD/YYYY HH24:MI:SS') as timestamp(6))
But it throws the error "ORA-01843: not a valid month".
Does anyone have ideas on what I can try to set the milliseconds to 0? I use Toad to query the table.
Your TIMESTAMP value does not have any format. All you have is a default display format, defined by the current user's NLS_TIMESTAMP_FORMAT setting.
Try this one:
CAST(Local_time AS TIMESTAMP(0))
If you'd like to truncate the milliseconds but still have the TIMESTAMP(6) precision available, use
CAST(CAST(Local_time AS TIMESTAMP(0)) AS TIMESTAMP(6))
Something like this, perhaps?
SQL> create table test (col timestamp, result timestamp);
Table created.
SQL> insert into test (col) values (to_timestamp('11/09/1917 15:45:28.230000', 'MM/DD/YYYY HH24:MI:SS.FF6'));
1 row created.
SQL> update test set result = cast(col as date);
1 row updated.
SQL> select * From test;
COL                       RESULT
------------------------- -------------------------
09.11.17 15:45:28,230000  09.11.17 15:45:28,000000
SQL>
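If the truncation can happen before the data ever reaches Oracle, the Spark-side equivalent is date_trunc (Spark 2.3+); a sketch, assuming a DataFrame df with a timestamp column named as in the question:

import org.apache.spark.sql.functions.{col, date_trunc}

// Zeroes everything below whole seconds while keeping the timestamp type
val truncated = df.withColumn("Local_time", date_trunc("second", col("Local_time")))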

Trying to export an Oracle table via PL/SQL gives a date of 0000-00-00

I have inherited an Oracle .dmp file which I'm trying to get into CSV so that I can load it into MySQL.
The general approach I'm using is described here. I'm having a problem with one row though. It contains a date of 5544-09-14 like so:
alter session set nls_date_format = 'dd-MON-yyyy';
select OID, REF, TRADING_DATE From LOAN WHERE REF = 'XXXX';
       OID REF                  TRADING_DATE
---------- -------------------- ------------
      1523 XXXX                 14-SEP-5544
This is garbage data from the legacy system, which didn't validate the input dates. I'm wondering why my PL/SQL function to export the data chokes on this value, though.
It exports that row with a TRADING_DATE value of '0000-00-00T00:00:00', and I'm not sure why.
SELECT dump(TRADING_DATE) FROM LOAN WHERE REF = 'XXXX';
DUMP(TRADING_DATE)
--------------------------------------------------------------------------------
Typ=12 Len=7: 44,156,9,14,1,1,1
and
SELECT to_char(trading_date, 'YYYYMMDDHH24MISS') FROM LOAN WHERE REF = 'XXXX';
TO_CHAR(TRADIN
--------------
00000000000000
The value stored in that column is not a valid date. The first byte of the dump should be the century, which according to Oracle support note 69028.1 is stored in 'excess-100' notation, which means it should have a value of 100 + the actual century; so 1900 would be 119, 2000 would be 120, and 5500 would be 155. So 44 would represent -5600; the date you have stored appears to actually represent 5544-09-14 BC. As Oracle only supports dates with years between -4713 and +9999, this isn't recognised.
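To make that arithmetic concrete, here is a small Scala sketch that decodes the 7-byte internal DATE representation under the excess-100 scheme, using the byte values from your DUMP output:

// Bytes from DUMP: century+100, year-of-century+100, month, day,
// hour+1, minute+1, second+1
val Seq(cc, yy, month, day, h, mi, s) = Seq(44, 156, 9, 14, 1, 1, 1)
val year = (cc - 100) * 100 + (yy - 100) // (44-100)*100 + (156-100) = -5544
println(f"$year%d-$month%02d-$day%02d ${h - 1}%02d:${mi - 1}%02d:${s - 1}%02d")
// Prints -5544-09-14 00:00:00, i.e. 5544 BC -- outside Oracle's supported range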
You can recreate this fairly easily; the trickiest bit is getting the invalid date into the database in the first place:
create table t42(dt date);
Table created.
declare
d date;
begin
dbms_stats.convert_raw_value('2c9c090e010101', d);
insert into t42 (dt) values (d);
end;
/
PL/SQL procedure successfully completed.
select dump(dt), dump(dt, 1016) from t42;
DUMP(DT)
--------------------------------------------------------------------------------
DUMP(DT,1016)
--------------------------------------------------------------------------------
Typ=12 Len=7: 44,156,9,14,1,1,1
Typ=12 Len=7: 2c,9c,9,e,1,1,1
So this has a single row with the same data you do. Using alter session I can see what looks like a valid date:
alter session set nls_date_format = 'DD-Mon-YYYY';
select dt from t42;
DT
-----------
14-Sep-5544
alter session set nls_date_format = 'YYYYMMDDHH24MISS';
select dt from t42;
DT
--------------
55440914000000
But if I use an explicit date mask it just gets zeros:
select to_char(dt, 'DD-Mon-YYYY'), to_char(dt, 'YYYYMMDDHH24MISS') from t42;
TO_CHAR(DT,'DD-MON-Y TO_CHAR(DT,'YY
-------------------- --------------
00-000-0000          00000000000000
And if I run your procedure:
exec dump_table_to_csv('T42');
The resultant CSV has:
"DT"
"0000-00-00T00:00:00"
I think the difference is that those that attempt to show the date are sticking with internal date data type 12, while those that show zeros are using external data type 13, as mentioned in note 69028.1.
So in short, your procedure isn't doing anything wrong, the date it's trying to export is invalid internally. Unless you know what date it was supposed to be, which seems unlikely given your starting point, I don't think there's much you can do about it other than guess or ignore it. Unless, perhaps, you know how the data was inserted and can work out how it got corrupted.
I think it's more likely to be from an OCI program than what I did here; this 'raw' trick was originally from here. You might also want to look at note 331831.1. And this previous question is somewhat related.
