NUMBER equivalent of Oracle in Snowflake

So I have this pipeline that migrates data from Oracle to Snowflake. There is one column in Oracle with datatype NUMBER, and I used NUMBER(38, 18) for it in Snowflake.
My pipeline started failing yesterday because that column in Oracle now contains 21-digit numbers, which NUMBER(38, 18) cannot hold in Snowflake (a scale of 18 leaves only 38 - 18 = 20 integer digits). I switched to NUMBER(38, 17) and it works for now.
Is there any NUMBER equivalent in Snowflake that can handle any value coming from Oracle? I cannot possibly change the datatype of the Oracle table.

A column defined as NUMBER in Oracle (without precision or scale) unfortunately has no exact equivalent in Snowflake. Oracle stores it in a variable-length decimal format that can hold up to 38 significant digits with a scale anywhere from -84 to 127, which is not the same as a float.
If you are not 100% sure what the precision and scale should be, you can use FLOAT in Snowflake instead, but since FLOAT is approximate it can introduce rounding errors in large aggregations, so you'll need to decide whether that trade-off is worth it.
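For illustration, a hedged sketch of the two Snowflake options discussed above; the table and column names are made up, and the split between integer and fractional digits should be chosen from the data you actually see:

-- Hypothetical Snowflake DDL; "src_amount" is an illustrative column name.

-- Option 1: keep a fixed-point NUMBER and pick the scale so enough integer digits remain.
-- NUMBER(38, 17) leaves 38 - 17 = 21 integer digits, which covers the 21-digit values
-- mentioned above, but any future growth beyond that will break the load again.
CREATE OR REPLACE TABLE target_fixed (
    src_amount NUMBER(38, 17)
);

-- Option 2: use FLOAT, which accepts any magnitude Oracle's NUMBER can produce,
-- but is a 64-bit IEEE double, so it keeps only about 15-17 significant digits
-- and can introduce rounding differences in large aggregations.
CREATE OR REPLACE TABLE target_float (
    src_amount FLOAT
);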

Related

Accessing PostgreSQL's timestamp field from Oracle via ODBC causes a loss of microsecond precision

I am trying to access PostgreSQL records from Oracle via ODBC, and there is a severe problem when I read a timestamp field: the microsecond precision is lost. For example, 2018-01-25 12:40:20.123456 in PostgreSQL becomes 2018-01-25 12:40:20.000000 in Oracle. To make sure, I wrote a PL/SQL block to check the data; somehow all the microsecond digits have been lost.
There is documentation describing the connection string parameter BTD (Bind TIMESTAMP as DATE) (link). By default, this parameter should be FALSE:
Bind TIMESTAMP as DATE (BTD Connect String)
Added the new connection option, Bind TIMESTAMP as DATE, that allows you to bind the ODBC driver SQL_TIMESTAMP data type to the Oracle DATE data type instead of to the Oracle TIMESTAMP data type (which is the default).
Here is my PL/SQL:
declare
  v_timestamp timestamp(6);
begin
  select max("MODIFIED_ON") into v_timestamp from "public"."DAS_ITEM"@PG_LINK;
  dbms_output.put_line(v_timestamp);
end;
The result is 19/JAN/18 08:59:42.000000 AM; the microseconds are missing, and the entire 6-digit fractional second has been replaced with zeros.
On the other hand, in PostgreSQL the result is "2018-01-19 08:59:42.695166".
I have also tested with isql, which returns the timestamp value with full precision; as a consequence, I believe the root cause is on the Oracle side.
The Oracle DATE datatype does not support fractions of seconds. You need to use TIMESTAMP for that. This also applies to any table columns or PL/SQL datatypes PostgreSQL timestamps go into; if the timestamps are passed into a DATE somewhere, fractions of seconds will be truncated.
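A quick way to see this in Oracle itself, independent of the ODBC link, is to cast a TIMESTAMP to DATE and back:

-- Casting a TIMESTAMP to DATE silently drops the fractional seconds.
SELECT TIMESTAMP '2018-01-19 08:59:42.695166'                 AS as_timestamp,
       CAST(TIMESTAMP '2018-01-19 08:59:42.695166' AS DATE)   AS as_date,
       CAST(CAST(TIMESTAMP '2018-01-19 08:59:42.695166' AS DATE)
            AS TIMESTAMP(6))                                  AS back_to_timestamp
FROM dual;
-- as_timestamp keeps .695166; as_date has no fractional part at all,
-- and back_to_timestamp comes back as .000000.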

Oracle timestamp column dropping fractions of seconds

We are testing one of our test Oracle databases upgraded from 12.1 to 12.2 and have come across a fairly serious problem.
Some of our tables include columns of type timestamp with time zone. At the time of the upgrade these contain values that can and do include fractions of seconds.
What we have noticed is that any routine that now (under 12.2) populates data in such columns ends up with the fractional seconds dropped and the value rounded to the nearest second. It is almost as if we were attempting to insert a timestamp into a date field.
E.g. if we try to insert the value '01-JAN-2017 12:34:56.789' into a TIMESTAMP WITH TIME ZONE column in the 12.2 instance, the value that actually gets inserted is '01-JAN-2017 12:34:57.000'.
What's worse is that this data loss is occurring silently, with no warning given.
Anyone else had this?
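For anyone wanting to compare instances, a minimal repro sketch of the scenario described above; the table and column names are made up, and the expected/affected outputs are taken from the question rather than verified here:

-- Illustrative repro; run against both the 12.1 and 12.2 instances and compare.
CREATE TABLE tstz_test (ts TIMESTAMP(3) WITH TIME ZONE);

INSERT INTO tstz_test (ts)
VALUES (TO_TIMESTAMP_TZ('01-JAN-2017 12:34:56.789 +00:00',
                        'DD-MON-YYYY HH24:MI:SS.FF3 TZH:TZM'));

-- On an unaffected instance this returns 01-JAN-17 12:34:56.789 +00:00;
-- the behaviour described above would show 01-JAN-17 12:34:57.000 +00:00 instead.
SELECT ts FROM tstz_test;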

ORA-01438 value doesn't fit into defined Number(11,7) data type

I understand the idea of the NUMBER datatype and I am familiar with the information on this page: http://docs.oracle.com/cd/B28359_01/server.111/b28318/datatype.htm#i22289
However, it seems I am still missing something, because I don't really understand why I am getting this ORA-01438 error:
select cast (18000.0 as number(11,7)) from dual;
Results in
ORA-01438: value larger than specified precision allowed for this column
01438. 00000 - "value larger than specified precision allowed for this column"
*Cause: When inserting or updating records, a numeric value was entered
that exceeded the precision defined for the column.
*Action: Enter a value that complies with the numeric column's precision,
or use the MODIFY option with the ALTER TABLE command to expand
the precision.
At the same time, reducing the scale from 7 to 6 works like a charm:
select cast (18000.0 as number(11,6)) from dual;
This happens under 'Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production'.
Can someone enlighten me as to why this is happening?
Thank you, I appreciate any help.
NUMBER(11,7) allows numbers with a total of 11 significant digits, 7 of which are fractional. That in turn means you have 11 - 7 = 4 non-fractional (integer) digits.
18000 has five non-fractional digits, which is one too many.
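In other words, the largest value NUMBER(11,7) can hold is 9999.9999999; you either need a smaller scale or a larger precision. A quick illustration:

-- NUMBER(11,7): 11 - 7 = 4 integer digits, so 9999.9999999 is the maximum.
SELECT CAST(9999.9999999 AS NUMBER(11,7)) FROM dual;   -- works

-- 18000 needs 5 integer digits, so this raises ORA-01438:
-- SELECT CAST(18000.0 AS NUMBER(11,7)) FROM dual;

-- Reduce the scale (11 - 6 = 5 integer digits) ...
SELECT CAST(18000.0 AS NUMBER(11,6)) FROM dual;
-- ... or increase the precision (12 - 7 = 5 integer digits).
SELECT CAST(18000.0 AS NUMBER(12,7)) FROM dual;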

DB2 Format decimals with custom precision

I can write the following query for Microsoft SQL Server and it works fine:
SELECT
    STR(Facts.decimal_value, FormattingSettings.length, FormattingSettings.precision)
FROM
    Facts, FormattingSettings
As a result, I get values formatted with a custom precision (FormattingSettings is a one-row table) and can change the formatting dynamically.
Is it possible to do the same thing in DB2 (9.5)?
The answers to similar questions (like "decimal formatting in sql query") used only an integer constant as the precision.
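No direct answer is given above, but one hedged workaround, assuming the set of precisions stored in FormattingSettings is small and known, is to map each precision to a constant-scale cast with a CASE expression. The table and column names below come from the question; the fixed list of precisions is an assumption, and this is a sketch rather than a verified solution:

-- Sketch only: DB2's DECIMAL/CAST need constant precision and scale, so each
-- precision you expect has to be enumerated explicitly.
-- (If PRECISION is a reserved word in your DB2 release, quote the column name.)
SELECT
    CASE FormattingSettings.precision
        WHEN 0 THEN CHAR(CAST(Facts.decimal_value AS DECIMAL(31, 0)))
        WHEN 2 THEN CHAR(CAST(Facts.decimal_value AS DECIMAL(31, 2)))
        WHEN 4 THEN CHAR(CAST(Facts.decimal_value AS DECIMAL(31, 4)))
        ELSE        CHAR(Facts.decimal_value)
    END AS formatted_value
FROM Facts, FormattingSettings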

Oracle - Cast Varchar to Float and specify the precision

I need to cast a varchar to a float. (The varchar is guaranteed to be a number)
I'm trying to create a materialized view on top of a pre-built table. Because of this, all the data types must match exactly, including the precision and size of the data types. The original column (before the NVL was added) was pulling from a FLOAT data type with a precision of 126. When I try casting a varchar to a float with a precision of 126, it doesn't seem to keep the 126 precision.
(I verified that it wasn't keeping the 126 size by creating a standard view with the query below, which casts to FLOAT(126). Through my Toad IDE I could see that the precision of the "ThisorThat" float column was 38.)
I have simply updated my materialized view with an NVL statement like so...
Select Cast(NVL(thisFloat, thatFloat) as Float(126)) as ThisorThat
....
From tables;
I get the error "ORA-12060: shape of prebuilt table does not match definition query" because the sizes are different from those of the original table that I am "pre-building" upon. I need to somehow cast the varchar to a float with an explicit size of 126. Any ideas?
Version: Oracle 10g
Edit:
Here's a link which is basically the same thing I'm encountering.
Use
TO_BINARY_FLOAT(mynumberstring)
or
TO_BINARY_DOUBLE(mynumberstring)
What you're trying to do actually isn't a cast, it's a conversion.
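For what it's worth, a hedged usage sketch reusing the names from the question (note that TO_BINARY_FLOAT and TO_BINARY_DOUBLE return the IEEE BINARY_FLOAT/BINARY_DOUBLE types rather than FLOAT(126), so whether the resulting shape matches the prebuilt table still needs checking):

-- Converting a numeric string to an IEEE floating-point value:
SELECT TO_BINARY_DOUBLE('12345.6789') FROM dual;

-- Applied to the query from the question (thisFloat, thatFloat and "tables" come from there):
SELECT TO_BINARY_DOUBLE(NVL(thisFloat, thatFloat)) AS ThisorThat
FROM tables;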
