The company I work for is in the process of switching from Oracle to EnterpriseDB, and I'm trying to update a query that uses a timestamp from a table. Whenever I try to pull that timestamp, I get:
[Devart][ODBC][PostgreSQL]Invalid TIMESTAMP string {HY000}
I've tried casting it as VARCHAR2, DATE, and TIMESTAMP, and using TO_DATE, but nothing has worked.
The query is:
select "ship_date" from "promotion"#pgProd
In Postgres, ship_date is just a timestamp.
Any information about how this can be accomplished would be appreciated.
EDIT: To clarify, this query is being run in Oracle, pulling data from Postgres.
The Oracle version is 11g.
The relevant line of the creation script is:
ship_date timestamp without time zone NOT NULL
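One possible workaround, if you can create objects on the Postgres side, is to expose the timestamp as text there and convert it back in Oracle. This is only a sketch (the view name is hypothetical, and it assumes the cast renders the timestamp as YYYY-MM-DD HH24:MI:SS.FF6):
-- Postgres: a view that casts the timestamp to text (hypothetical name)
CREATE VIEW promotion_link AS
SELECT ship_date::varchar AS ship_date_str FROM promotion;
-- Oracle: pull the string across the link and convert it back
SELECT TO_TIMESTAMP("ship_date_str", 'YYYY-MM-DD HH24:MI:SS.FF6')
FROM "promotion_link"@pgProd;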
Related
Looking into a query in WebIntelligence: after running, the prompts are replaced by values provided by the user (for instance, dates).
When I run the same query on Oracle (because that is the database I use for my universe), I get errors about the dates. Dates in the query (in BO) are just strings,
like StartDate = '30-06-2020 00:00:00'. When I run the query generated by WebIntelligence on Oracle, I get the error:
ORA-01843: not a valid month
01843. 00000 - "not a valid month"
And to fix this I need to use, for instance, the TO_DATE function, and then it works fine. My question is: how are dates parsed in WebIntelligence while running a query, so that the mentioned error does not occur?
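(For reference, the TO_DATE fix mentioned above looks like this, using the StartDate example from the query:)
WHERE StartDate = TO_DATE('30-06-2020 00:00:00', 'DD-MM-YYYY HH24:MI:SS')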
I get the same error as you when I try a query directly against Oracle in SQL Developer that works in Web Intelligence. According to this, BusinessObjects makes a call to set the date format.
So you can do that either in the preferences of SQL Developer (or presumably whatever database query tool you are using), or by explicitly setting it with the ALTER SESSION command:
alter session set nls_date_format = 'DD-MM-YYYY HH24:MI:SS';
select...[the rest of your query]
Both options are shown in the answer to How can I set a custom date time format in Oracle SQL Developer?.
I'm using Oracle's SQL Developer to export data into CSVs. I found that Oracle spits out the dates as dd-MMM-yy. When I bulk insert these files into SQL Server it's interpreting some of the dates incorrectly. How do I change that?
I'm an Oracle neophyte, so I might be approaching this whole thing incorrectly. I need to transfer a lot of tables/rows from Oracle to SQL Server. I have a linked server set up in SQL Server pointing to Oracle, but that takes a really long time to transfer the data (about 18 hours, even though both databases are on the same server), although it does get the dates correct.
I didn't find any good way to accomplish this other than a couple of PL/SQL scripts I couldn't get working. Is it really that rare for data to be migrated from Oracle to MS SQL?
Well, dates across database vendors are just hell.
The default date format can be set in SQL Developer Preferences > Database > NLS > Date Format. You can also set it in the session as #belayer has commented.
For writing CSV files or for migration projects, I would always try to control the format directly, like
SELECT id, TO_CHAR(my_date,'YYYY-MM-DD') AS my_date, my_column
FROM my_table;
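If the export is scripted rather than run through the GUI, SQLcl can combine that explicit TO_CHAR with its CSV formatter. A sketch, assuming SQLcl and a hypothetical output file name:
set sqlformat csv
spool my_table.csv
SELECT id, TO_CHAR(my_date,'YYYY-MM-DD') AS my_date, my_column FROM my_table;
spool off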
Having said that, there should be a better way to move the data out of Oracle into SQL Server...
I have an Oracle database, and I have to load data from this database into Azure SQL DWH. This is done once every day. At the beginning of the pipeline I first do a lookup on SQL DWH to find the latest date. The result of that is something like '2015-10-25'.
I want to use this date to query the Oracle database, but I already found out, by trying the query on Oracle, that the following code does not work:
Select * from Table 1 where day = '2015-10-25'
The date in the day column looks like 25-OCT-15 (DD-MON-YY).
I tried the following WHERE clause:
where day = TO_DATE('2015-10-25','DD-MON-YY')
But then I get the error: "literal does not match format string"
I really don't know how to make Oracle understand this T-SQL date format.
Your Oracle column is of the DATE datatype. When you connect to an Oracle database and write a query against that DATE column, you will see its default format DD-MON-YY, as per this reference.
You can override this setting by running an ALTER SESSION command, e.g.
ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY MM DD';
but this applies only to your local session. The data is still stored in Oracle in the same manner; it's simply the way you view it that changes.
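To see that it is display-only, the same value can be rendered two ways within one session (illustrative):
ALTER SESSION SET NLS_DATE_FORMAT = 'DD-MON-YY';
SELECT sysdate FROM dual;   -- e.g. 25-OCT-15
ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD';
SELECT sysdate FROM dual;   -- e.g. 2015-10-25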
In Azure Data Factory (ADF v2), as in your example, you are dealing with strings, so you must make sure any parameters you pass in are in the correct format or cast to the correct datatype. The Oracle function TO_DATE converts strings to the DATE datatype, and its format mask must describe the string being passed in. Therefore, when passing in a string of format YYYY-MM-DD, that is the mask you must use, to let TO_DATE know what it is receiving:
TO_DATE('2015-10-25','YYYY-MM-DD')
The function then successfully converts your parameter to a DATE datatype for correct comparison against the main date column.
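Applied to the query from the question (keeping its illustrative table and column names), that gives:
Select * from Table 1 where day = TO_DATE('2015-10-25','YYYY-MM-DD')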
You can try this query:
Select * from Table 1 where day = to_char(to_date('2015-10-25','YYYY-MM-DD'), 'DD-Mon-YY')
Reference this blog: how to convert YYYYMMDD to DD-Mon-YYYY in oracle?
Hope this helps.
I'm using Pentaho to data-mask some of the information in an Oracle DB.
I have several transformations of the form:
SELECT -> data mask -> UPDATE rows based on primary key
I have tables where a timestamp is part of the primary key in the update step. Even though I am not masking or updating this field in any way, I get the error ORA-01843: not a valid month when performing the update.
I believe this is because when Pentaho takes in the timestamp from step 1, it doesn't actually keep it as a timestamp until I try the update, and hence the primary-key check fails. Outputting to Excel, I see Pentaho giving timestamps in the format
2014-07-30 15:44:31.869033 Europe/London (Pentaho)
But in the DB the format is
30-JAN-15 09.21.38.109145000 AM (Oracle - TIMESTAMP(6) WITH LOCAL TIME ZONE)
I have tried to convert the Pentaho field to a Timestamp (format: yyyy-MM-dd HH:mm:ss.SSSSSS) before the update step, but receive errors if I try to use milliseconds.
2017/03/14 13:19:25 - Select values.0 - AUDIT_CREATE_TS Timestamp : couldn't convert string [2015-01-30 09:21:38.109145 Europe/London] to a timestamp, expecting format [yyyy-mm-dd hh:mm:ss.ffffff]
2017/03/14 13:19:25 - Select values.0 - Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
If I replace my format with the one suggested by Pentaho, I get "Illegal character 'f'", and then I am stuck in a loop.
Ignoring milliseconds seems to succeed, but it won't give me any matches because it isn't precise enough, and the update returns no results from the DB.
Any help would be appreciated!
Not sure about Pentaho, but if you're looking for a conversion from this string:
'2015-01-30 09:21:38.109145 Europe/London'
to a timestamp with timezone in Oracle, it would be:
select to_timestamp_tz('2015-01-30 09:21:38.109145 Europe/London', 'YYYY-MM-DD HH24:MI:SS.FF6 TZR') from dual;
See Oracle Datetime Format Models document for more.
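In the update step itself, the same conversion can then be applied to the key comparison. A minimal sketch, assuming a parameterized update and hypothetical table/column names:
-- Hypothetical names; the string parameter is the Pentaho field shown above
UPDATE my_table
SET masked_col = ?
WHERE id = ?
AND audit_create_ts = TO_TIMESTAMP_TZ(?, 'YYYY-MM-DD HH24:MI:SS.FF6 TZR');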
Currently, I am working on migrating an Oracle database schema to Postgres. As part of this I have to convert a custom Oracle utility function which generates timestamps based on the time zone. But the values obtained from Oracle and Postgres differ by one hour.
Oracle:
SELECT (to_timestamp_tz('20300714 235959 CET','YYYYMMDD HH24MISS TZR') - to_timestamp_tz('19700101 000000 GMT','YYYYMMDD HH24MISS TZR'))
as foo FROM dual;
yields +22109 21:59:59.000000
Postgres:
select ('20300714 235959'::timestamp AT TIME ZONE 'CET') - ('19700101 000000'::timestamp AT TIME ZONE 'GMT') as foo;
yields 22109 days 22:59:59
I guess the reason for this difference is daylight saving time, but I am not sure. Can anyone help me out with this problem?
I am using Postgres v9.6 and Oracle 12c.
Well, I found the mistake I was making. Postgres handles abbreviated timezone names and full timezone names differently: if you want daylight-saving rules to be applied, you have to use the full timezone name. In my case, using 'Europe/Amsterdam' instead of 'CET' yielded the same result as Oracle.
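In other words, the corrected Postgres query that matches Oracle's result is:
select ('20300714 235959'::timestamp AT TIME ZONE 'Europe/Amsterdam') - ('19700101 000000'::timestamp AT TIME ZONE 'GMT') as foo;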
To find the complete list of full timezone names supported by Postgres, I used the query:
select * from pg_timezone_names where abbrev='CET';