I am new to SAS Data Integration Studio 4.9005 and I am using an Oracle 18 database.
The scenario: I move data from table_A, where the column's data type is VARCHAR(100), to table_B, where the column's data type is DATE.
The format in table_A is YYYY/MM/DD,
but the result gives me a nonsensical value (01-01-1960).
The conversion code is "INPUT(TANGGAL_LAHIR, yymmdd10.)".
Is there something I am missing? Thank you very much.
I have an Oracle database and I have to load data from it into Azure SQL DWH. This is done once every day. At the beginning of the pipeline I first do a lookup on SQL DWH to find the latest date. The result of that is something like '2015-10-25'.
I want to use this date to query the Oracle database, but I already found out, by trying the query on Oracle, that the following code does not work:
Select * from Table 1 where day = '2015-10-25'
The date in the day column looks like 25-OCT-15 (DD-MON-YY).
I tried the following where clause:
where day = TO_DATE('2015-10-25','DD-MON-YY')
But then I get the error: "literal does not match format string"
I really don't know how to make Oracle understand this T-SQL date format.
Your Oracle column is of date datatype. When you connect to an Oracle database and write a query against that date column, you will see its default format DD-MON-YY as per this reference.
You can override this setting by running an ALTER SESSION command, e.g.
ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY MM DD';
but this is just in that local session. The data is still stored in Oracle in the same manner and it's simply the way you view it that is changing.
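If you just want a particular display for one query, you can also format explicitly with TO_CHAR instead of changing the session; a sketch, where table1 stands in for the question's Table 1:
-- display only: the stored DATE value is unchanged
SELECT TO_CHAR(day, 'YYYY-MM-DD') AS day_text FROM table1;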
In Azure Data Factory (ADF v2) and in your example, you are dealing with strings, so you must make sure any parameters you pass in are in the correct format or set to the correct datatype. The Oracle function TO_DATE converts strings to the DATE datatype. Therefore, when passing in a string in the format YYYY-MM-DD, that is the format mask you must give, so the TO_DATE function knows what it is receiving:
TO_DATE('2015-10-25','YYYY-MM-DD')
The function then successfully converts your parameter to a DATE datatype for correct comparison against the main date column.
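In terms of the question's original query, the comparison would look like this (a sketch; table1 again stands in for Table 1, and it assumes the day column holds midnight-only dates so a plain equality works):
-- convert the string parameter, not the date column
SELECT * FROM table1 WHERE day = TO_DATE('2015-10-25', 'YYYY-MM-DD');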
You can try this query:
Select * from Table 1 where day = to_char(to_date('2015-10-25','YYYY-MM-DD'), 'DD-Mon-YY')
Reference this blog: how to convert YYYYMMDD to DD-Mon-YYYY in oracle?
Hope this helps.
I'm currently working on importing a CSV (with thousands of rows) into SQL.
So far, I have created a table in SQL and populated it with column names, as well as data types corresponding to the column names (all of these correspond to the columns in the CSV).
In this CSV file, one of the columns gives a timestamp in the form of:
2/3/2019 12:00:00 AM (MM/DD/YY HH:MM:SS)
The corresponding column I created in SQL has a datatype of TIMESTAMP(6).
My next step is to right-click on the table I just created in SQL and hit "Import Data". I then go through the steps of importing the CSV to SQL; however, the timestamps are not correctly imported. I know this because all of the timestamps in the CSV are from February 3, 2019, and in SQL it ranges from
20-MAR-02 7:00:05 PM (DD-MON-YY HH:MM:SS) to 20-APR-02 7:00:00 PM (DD-MON-YY HH:MM:SS)
I don't know a way to get the format to match up between the CSV (MM/DD/YY HH:MM:SS) and SQL (DD-MON-YY HH:MM:SS).
Any help is appreciated! Thank you!!
Edit: @Sean Lange: Thanks! When I try to use datetime(6) as a data type I get an error that says 00907: missing right parenthesis. If I try just datetime I get error 00902: invalid datatype. Any thoughts?
Edit #2: Yes, I am using Oracle; sorry, the language on these websites confuses me sometimes because I'm still learning. Any help would still be wonderful, going through it step by step specifically.
Edit #3: Thank you for your help. Sorry once again for my lack of knowledge, I'm using oracle sql developer.
After creating the table incorrectly, I then typed these lines of code:
alter table table_name
alter column column_name to_timestamp(yourFieldName, 'mm/dd/yyyy hh:mi:ss am');
However, I was given this error.
Error report -
ORA-01735: invalid ALTER TABLE option
01735. 00000 - "invalid ALTER TABLE option"
I truly do appreciate all of your help, thank you all again!
I also tried to update the format in the wizard and was given an error. Attached are screenshots explaining what I did.
I don't think the pictures are working, but the error said that the date format wasn't recognized.
You haven't specified what client you're using to connect to Oracle Database - TOAD? Oracle SQL Developer? Something else? ...So I can't give you specific step-by-step instructions.
However, your basic problem is that you need to convert from a text string to a timestamp, and for that you need to enter your date format string. Somewhere your "Import Data" wizard should have a step for either (a) entering an expression or (b) entering a date format.
The date format string you want is 'mm/dd/yyyy hh:mi:ss am' (not case sensitive, you can use caps if you prefer, or pm instead of am).
The expression would be
to_timestamp(yourFieldName, 'mm/dd/yyyy hh:mi:ss am')
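If the wizard won't apply an expression for you, another option is to load the raw text into a staging column first and convert it afterwards. This is only a sketch; the table and column names (stg_rows, ts_text, target_table, event_ts) are made up for illustration:
-- staging table: keep the CSV value as plain text
CREATE TABLE stg_rows (ts_text VARCHAR2(30));
-- convert while copying into the real TIMESTAMP(6) column
INSERT INTO target_table (event_ts)
SELECT TO_TIMESTAMP(ts_text, 'mm/dd/yyyy hh:mi:ss am') FROM stg_rows;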
I'm using Pentaho to data-mask some of the information in an Oracle DB.
I have several transformations of the form:
SELECT -> data mask -> UPDATE rows based on primary key
I have tables where a timestamp is part of the primary key in the update step. Even though I am not masking or updating this field in any way, I get the error ORA-01843: not a valid month when performing the update.
I believe this is because, when Pentaho takes in the timestamp from step 1, it doesn't actually keep it as a timestamp until I try the update, and hence the primary-key check fails. Outputting to Excel, I see Pentaho giving timestamps in the format
2014-07-30 15:44:31.869033 Europe/London (Pentaho)
But in DB the format is
30-JAN-15 09.21.38.109145000 AM (Oracle - TIMESTAMP(6) WITH LOCAL TIME ZONE)
I have tried to convert the pentaho field to a Timestamp (format: yyyy-MM-dd HH:mm:ss.SSSSSS) before the update step but receive errors if I try and use milliseconds.
2017/03/14 13:19:25 - Select values.0 - AUDIT_CREATE_TS Timestamp : couldn't convert string [2015-01-30 09:21:38.109145 Europe/London] to a timestamp, expecting format [yyyy-mm-dd hh:mm:ss.ffffff]
2017/03/14 13:19:25 - Select values.0 - Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
If I replace my formatting to the one suggested by Pentaho I get "Illegal character 'f'" and then I am stuck in a loop.
Ignoring milliseconds seems to succeed but won't give me any matches, because it isn't precise enough and returns no results from the DB.
Any help would be appreciated!
Not sure about Pentaho, but if you're looking for a conversion from this string:
'2015-01-30 09:21:38.109145 Europe/London'
to a timestamp with timezone in Oracle, it would be:
select to_timestamp_tz('2015-01-30 09:21:38.109145 Europe/London', 'YYYY-MM-DD HH24:MI:SS.FF6 TZR') from dual;
See Oracle Datetime Format Models document for more.
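To check the key against the column type mentioned above (TIMESTAMP(6) WITH LOCAL TIME ZONE), you could also cast the converted value; a sketch, where some_table is hypothetical and audit_create_ts is taken from the log line in the question:
-- convert the string, then cast to the column's type for the comparison
SELECT COUNT(*) FROM some_table
WHERE audit_create_ts = CAST(TO_TIMESTAMP_TZ('2015-01-30 09:21:38.109145 Europe/London', 'YYYY-MM-DD HH24:MI:SS.FF6 TZR') AS TIMESTAMP WITH LOCAL TIME ZONE);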
I have the following date, which is in a VARCHAR2(11) column in the database:
select valid_untill from SALES_ORDERS_V where header_id = 7999410;
30-May-2016
Using an RTF template and XML source, the report output (PDF) is:
4950-11-19 04:45:49:0
I don't know how that corresponds to "30-May-2016".
Why is it showing this, since I did not do any formatting in the RTF?
Not familiar with either RTF or XML-Publisher, but whenever you retrieve a date saved in string format, IF you use it as a date in your code and not as a string, you must make sure you retrieve it correctly.
In this case, with your select statement: it shouldn't be select valid_untill from ... (or is the name really misspelled, with two l's at the end: valid_untill?). If it is meant to be used as a date, it should be
select to_date(valid_until, 'dd-Mon-yyyy') from ...
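Applied to the full query from the question, that would be (a sketch; it assumes the stored string really is in DD-Mon-YYYY form and keeps the column name exactly as typed in the question):
select to_date(valid_untill, 'dd-Mon-yyyy') as valid_untill from SALES_ORDERS_V where header_id = 7999410;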
Really the problem here is that the date is stored as a string and not in the date datatype. Good luck!
I'm a Business Intelligence intern and am trying to write a simple ETL batch job to bring one table into our warehouse using SAP Data Services Designer. The source table has a timestamp column, which halts the job's execution, saying:
You cannot select directly from timestamp column . Using
a timestamp column in SQL transform may cause this error. See
documentation or notify Customer Support.
From the technical manual, this limitation is confirmed in the timestamp section, which reads:
You cannot use timestamp columns in the SQL transform or in an Oracle
stored procedure. To use a timestamp column in the SQL transform,
convert the timestamp column in the select list of the SQL transform
to a character format using the to_char function and convert it back
to timestamp using the to_date function.
I've tried remedying the problem by changing the output schema's column to a datetime type, and converting the timestamp in the SQL transform with
TO_DATE(TO_CHAR(SQL.DATETIME_STAMP, 'YYYY-MON-DD HH24:MI:SS'), 'YYYY-MON-DD HH24:MI:SS')
I must be missing a key concept, as it still fails with error 54003 no matter what I try. Thoughts, anyone?