Time value as output - informatica-powercenter

For a few columns from the source (a .csv file), we have values like 1:52:00 and 14:45:00.
I am supposed to load them into an Oracle table.
Which data type should I choose in the target as well as the source?
Should I be doing anything in the Expression transformation?

Use SQL*Loader (sqlldr) to load the data into the database, with the format described in the link below, i.e. 'HH24:MI:SS':
http://docs.oracle.com/cd/B19306_01/server.102/b14200/sql_elements004.htm
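A minimal SQL*Loader control file along those lines might look like this (the table, column, and file names are assumptions, not from the question):

-- Sketch only: table, column, and file names are placeholders
LOAD DATA
INFILE 'times.csv'
APPEND
INTO TABLE time_target
FIELDS TERMINATED BY ','
(
  event_time DATE "HH24:MI:SS"
)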

Oracle does not support time-only values; it supports dates (with a time component).
You have a few options:
Store the value as a string, perhaps providing a leading zero for the hour.
Store the value as the number of seconds (or minutes) past midnight.
Store the value as the time component of some arbitrarily defined date, for example 0001-JAN-01 01:52:00 and 0001-JAN-01 14:45:00, and tell your report writers to ignore the date portion of the value.
Your source data type will be string(8). Use LPAD to add leading zeroes.
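As a rough SQL sketch of those options (table and column names are hypothetical; the same LPAD idea applies in an Informatica Expression transformation):

-- Sketch only: table and column names are placeholders
CREATE TABLE time_demo (
  time_as_string  VARCHAR2(8),  -- option 1: fixed-width 'HH24:MI:SS' string
  time_as_seconds NUMBER,       -- option 2: seconds past midnight
  time_as_date    DATE          -- option 3: DATE with an arbitrary date portion
);

INSERT INTO time_demo VALUES (
  LPAD('1:52:00', 8, '0'),            -- '01:52:00'
  1*3600 + 52*60,                     -- 6720
  TO_DATE('01:52:00', 'HH24:MI:SS')   -- date part defaults to the first of the current month
);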

Related

DECIMAL value out of range

I am trying to publish data from our SAS environment into a remote Hadoop/Hive database (as sequence files). I'm performing basic tests by taking some source data from our business users and using a data step to write out to the Hadoop library.
I'm getting errors indicating that a value at row X is out of range.
For example:
ERROR: Value out of range for column BUY_RT1, type DECIMAL(5, 5). Disallowed value is: 0.
The source data has a numeric format of 6.5, and the actual value is .00000.
Why is .00000 out of range? Would the format for Hadoop need to be DECIMAL(6, 5)?
I get the same error when the value is 0.09:
ERROR: Value out of range for column INT_RT, type DECIMAL(5, 5). Disallowed value is: 0.09
You may need to check the actual values in SAS. If a numeric value in SAS has a format applied, you will see the formatted (possibly rounded) version wherever you output the value, but the underlying number may still have more significant digits that the format hides.
For example, you say your source data has a format of 6.5 and the 'actual value' is 0.00000; are you sure that's the actual value? To check, you could compare the value to a literal 0, or put the value to the SAS log with a different format such as BEST32. (e.g. put BUY_RT1 best32.;).
If this is the problem, the solution is to properly round the source values, rather than just applying a format.
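A quick SAS check and fix along those lines might look like this (data set names are assumptions; the rounding unit of 0.00001 matches DECIMAL(5, 5)):

/* Show the full underlying value, not the formatted one */
data _null_;
  set work.have;   /* hypothetical source data set */
  put BUY_RT1 best32.;
run;

/* Round to 5 decimals before writing to the Hadoop library */
data work.want;
  set work.have;
  BUY_RT1 = round(BUY_RT1, 0.00001);
run;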

SORT in JCL based on Current Date

Requirement: I need to sort an input file based on Date.
The date is in YYYYMMDD format starting at 56th Position in the flat file.
Now I am trying to write a sort card which writes all the records that have a date (YYYYMMDD) in the past 7 days.
Example: if my job runs on 20181007, it should fetch all the records with dates between 20181001 and 20181007.
Thanks in advance.
In terms of DFSORT, you can use the following filter, which compares the field against the current date as a relative constant. For instance:
OUTFIL INCLUDE=(56,8,CH,GE,DATE1-7)
There are several representations for dates in various formats. I assume that, since you are referring to a flat file, the date is in character format and not zoned decimal or some other representation.
For DFSORT, here is a reference to the INCLUDE statement.
Similar constructs exist for other sort products. Without specifics about the product you're using, this is unfortunately a generic answer.
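A fuller DFSORT sketch, assuming the dataset names below are placeholders, could look like:

//* Sketch only - dataset names and dispositions are placeholders
//SORTSTEP EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=YOUR.INPUT.FILE,DISP=SHR
//SORTOUT  DD DSN=YOUR.OUTPUT.FILE,DISP=(NEW,CATLG,DELETE),
//            LIKE=YOUR.INPUT.FILE
//SYSIN    DD *
  SORT FIELDS=(56,8,CH,A)
  OUTFIL INCLUDE=(56,8,CH,GE,DATE1-7,AND,56,8,CH,LE,DATE1)
/*

The second comparison (LE,DATE1) keeps the window at the past 7 days through today, matching the 20181001 to 20181007 example.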

How can you add FLOAT measures in Tableau formatted as a time stamp (hh:mm:ss)?

The fields look as described above. They are time fields from SQL imported as a varchar. I had to format them as dates in Tableau. There can be NULL values, so I am having a tough time getting over that. The Tableau statement I have is only ([time spent]) + ([time waited]) + ([time solved]).
Thank you!
If you only want to use the result for a graphical visualization of what took the longest, you can split each value, convert everything into seconds, and add them up for use in your view. E.g.:
In this case the HH:MM:SS fields are strings in Tableau.
The formula used to sum the three fields is:
// transforms everything into seconds for each variable
ZN(INT(SPLIT([Time Spent],  ":", 1)) * 3600)
+ ZN(INT(SPLIT([Time Spent],  ":", 2)) * 60)
+ ZN(INT(SPLIT([Time Spent],  ":", 3)))
+ ZN(INT(SPLIT([Time Waited], ":", 1)) * 3600)
+ ZN(INT(SPLIT([Time Waited], ":", 2)) * 60)
+ ZN(INT(SPLIT([Time Waited], ":", 3)))
+ ZN(INT(SPLIT([Time Solved], ":", 1)) * 3600)
+ ZN(INT(SPLIT([Time Solved], ":", 2)) * 60)
+ ZN(INT(SPLIT([Time Solved], ":", 3)))
Quick explanation of the formula:
I SPLIT every field three times, once each for the hours, minutes, and seconds, and add all the values.
INT converts the resulting strings into integers.
ZN around every term turns NULLs into zeros.
You can also use the value as an integer if you want; e.g. Case A has a total time of 5310 seconds.
The best approach is usually to store dates in the database in a date field instead of in a string. That might mean a data prep/cleanup step before you get to Tableau, but it will help with efficiency, simplicity and robustness ever after.
You can present dates in many formats, including hh:mm, when the underlying representation is a date datatype. See the custom date options on the format pane in Tableau for example. But storing dates as formatted strings and converting them to something else for calculations is really doing things the hard way.
If you have no choice but to read in strings and convert them to dates, then you should look at the DateParse function.
Either way, decide what a null date means and make sure your calculations behave well in that case -- unless you can enforce that the date field not contain nulls in the database.
One example would be a field called Completed_Date in a table of Work_Orders. You could determine that a null Completed_Date meant the work order had not been fulfilled yet, and thus allow nulls for that field. But you could also have the database enforce that another field, say Submitted_Date, could never be null.
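If you do end up parsing the strings in Tableau, a calculated field along these lines is one option (the field name, format string, and NULL handling are assumptions, and DATEPARSE support depends on the data source):

// parse the hh:mm:ss string into a datetime; NULLs are treated as midnight here
DATEPARSE("HH:mm:ss", IFNULL([time spent], "00:00:00"))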

Hibernate mapping of two properties to one column

I have an object that is generated from XSDs, so I can't change it. In it I have a String DATE and a String TIME (representing the time of day without the date).
DATE = yyyy-mm-dd
TIME = hh:MM:ss:mmmm
In the OracleDB, I don't want to represent these as VARCHAR. I'd like to use DATE or DATETIME. Therefore, I'd need to map both DATE + TIME to one single column, DATETIME.
This is not possible. You can map two columns to a single property (using composites or user types) but not the other way around.
Using the same column name twice in the mapping file usually results in strange exceptions (index out of bounds).
I would use two columns in the database and convert them to a DATE-type value using a user type.
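The conversion such a user type would wrap is straightforward; a minimal Java sketch (class and parameter names are hypothetical, and the handling of the trailing fraction is an assumption) is:

import java.text.SimpleDateFormat;
import java.util.Date;

public class DateTimeJoiner {
    // Combine the XSD-generated DATE (yyyy-MM-dd) and TIME (hh:MM:ss:mmmm) strings
    // into a single java.util.Date; the trailing fraction is dropped here.
    public static Date join(String date, String time) throws java.text.ParseException {
        String hms = time.length() > 8 ? time.substring(0, 8) : time;  // keep hh:MM:ss only
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        return fmt.parse(date + " " + hms);
    }
}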

Oracle - Fetch date/time in milliseconds from DATE datatype field

I have a last_update_date column defined as a DATE field.
I want to get the time in milliseconds.
Currently I have:
TO_CHAR(last_update_date,'YYYY-DD-MM hh:mi:ss am')
But I want to get milliseconds as well.
I googled a bit and think DATE fields will not have milliseconds; only TIMESTAMP fields will.
Is there any way to get milliseconds? I do not have option to change data type for the field.
DATE fields in Oracle only store the time down to the second, so there is no way to provide anything more precise than that. If you want more precision, you must use another type such as TIMESTAMP.
Here is a link to another SO question regarding Oracle date and time precision.
As RC says, the DATE type only supports a granularity down to the second.
If converting to TIMESTAMP is truly not an option, then how about adding another numeric column that just holds the milliseconds?
This option would be more cumbersome to deal with than a TIMESTAMP column but it could be workable if converting the type is not possible.
In a similar situation where I couldn't change the fields in a table (couldn't afford to 'break' third-party software) but needed sub-second precision, I added a 1:1 supplemental table and an after-insert trigger on the original table to post the timestamp into the supplemental table.
If you only need to know the ORDER of records being added within the same second, you could do the same thing, only using a sequence as a data source for the supplemental field.
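A sketch of that supplemental-table approach (all object names here are placeholders):

-- Sketch only: table, column, and trigger names are placeholders
CREATE TABLE my_table_ts (
  my_table_id  NUMBER PRIMARY KEY,
  inserted_ts  TIMESTAMP(3)
);

CREATE OR REPLACE TRIGGER my_table_ai
AFTER INSERT ON my_table
FOR EACH ROW
BEGIN
  INSERT INTO my_table_ts (my_table_id, inserted_ts)
  VALUES (:NEW.id, SYSTIMESTAMP);
END;
/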
