Default precision/scale data type in Snowflake - JDBC

I want to create a table with a decimal column, and I don't want to specify any precision/scale. It seems that using NUMBER as the data type creates the column as NUMBER(38,0) by default.
How can I get some default value for the scale as well?
In other databases like Oracle, NUMERIC takes a default scale and we are able to insert decimal values, but the same is not happening in Snowflake.

You are right that it is defined as NUMBER(38,0) when you don't specify a scale. It's a popular request from Snowflake users, and there is already a project to improve the NUMBER data type. Unfortunately, there is no ETA yet.
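A minimal sketch to see the default behaviour (the table and column names are illustrative); if you need decimal places with NUMBER today, the scale has to be specified explicitly:
CREATE OR REPLACE TABLE t_amounts (amount NUMBER);          -- resolves to NUMBER(38,0)
DESC TABLE t_amounts;                                       -- shows precision 38, scale 0
CREATE OR REPLACE TABLE t_amounts2 (amount NUMBER(38,10));  -- explicit scale keeps decimal places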

You can create the table with a floating-point data type such as FLOAT, REAL, or DOUBLE; all of these use an underlying precision/scale of roughly 15/9, since they are stored as double-precision values internally.
You can try something like:
CREATE OR REPLACE TABLE table_double (
  Scale REAL
);
INSERT INTO table_double VALUES (103269015259.46179);
INSERT INTO table_double VALUES (10326901.461);
It will truncate anything beyond that 15/9 precision/scale.
Please refer to the documentation.
https://docs.snowflake.com/en/sql-reference/data-types-numeric.html#data-types-for-floating-point-numbers
Regards,
Sujan

Related

What is the apt Oracle data type to store ISOYearMonth?

I searched for a suitable Oracle data type to store ISOYearMonth, but I only came across the DATE and TIMESTAMP data types, which store a date with time at varying precision. Please guide me in selecting an apt Oracle data type to store ISOYearMonth.
I think using the DATE type will be most suitable.
You will need to truncate dates using TRUNC(my_date, 'MONTH') when you insert data into the table.
In this case you don't need any additional conversion when you select the data for comparisons or arithmetic operations.
Of course, if you want to see this date on screen in ISOYearMonth format, you should convert it to a string using TO_CHAR(my_date, 'YYYY-MM') in your query.
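A minimal sketch of this approach (the table and column names are illustrative):
CREATE TABLE month_data (
  iso_month DATE);  -- always holds the first day of the month after TRUNC
INSERT INTO month_data (iso_month) VALUES (TRUNC(SYSDATE, 'MONTH'));
SELECT TO_CHAR(iso_month, 'YYYY-MM') AS iso_year_month FROM month_data;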

Formatting a column using Expression Transformation in Informatica Powercenter

I have a column (SALARY) in a source table from a relational DB;
for example, 15000 is a record in the SALARY column,
and I want to format it as $15,000.00 in the target table, which is also a relational DB,
using an Expression Transformation.
Thank you,
Ajay
This can be done using the steps below:
1. Pull the required column from the source into an Expression transformation.
2. Create a derived column as the concatenation of '$' and the input column, using the expression CONCAT('$', Source_Column).
3. Load this new column to the target instead of the column from the source.
I hope this is just to learn the functionality. In real-life scenarios this is bad practice; we should not keep these symbols in the tables themselves. This can be handled directly at the reporting level.
There may be a scenario where the salary column holds salaries in different currencies, say dollars, euros, pounds, etc., in the same table.
Still, I agree with Thomas Cherian that it is not good practice to store the symbol with the data itself.
If this is the scenario, you can do two things:
1.) Add another column to the table that stores the currency type, or a column that stores the conversion rate.
2.) Convert all the values to one currency and store them without the symbol.
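As noted above, currency formatting is usually better handled at the query/reporting layer. A minimal Oracle-flavoured sketch (the table and column names are illustrative):
SELECT TO_CHAR(salary, 'FM$999,999,990.00') AS salary_formatted
FROM   employee_salaries;
-- e.g. 15000 is rendered as $15,000.00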

What Oracle data type is easily converted to BIT in MSSQL via SSIS?

I have a Data Flow from an Oracle table to an MSSQL table with one field of data type BIT. The Oracle table is using the characters Y and N at the moment (I'm unsure of the data type and have no way of checking), but the MSSQL table needs to be data type BIT. What type of cast can I use on the Oracle query so that the data is pulled smoothly over?
Use CHAR(1), and then use a Derived Column transformation like this:
(DT_BOOL)(OracleField == "Y"?1:0)
Give this column a name like OracleFieldAsBool
and then use it instead of the original column in the rest of your data flow.
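On the Oracle side, a minimal sketch of the cast in the source query (the table and column names here are assumptions):
SELECT CAST(is_active_yn AS CHAR(1)) AS OracleField
FROM   source_table;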

Export table from Access to Oracle converts number to Varchar how to make it stop doing that?

I have a table in Access 2007 with various number columns, but when I export the table from Access, all the number fields are converted to VARCHAR2. I am using an Access ODBC driver for Oracle.
In the Access table design, set the Data Type to Number and the Field Size to Decimal.

How to store unlimited characters in Oracle 11g?

We have a table in Oracle 11g with a VARCHAR2 column. We use a proprietary programming language where this column is defined as a string. At most we can store 2000 characters (4000 bytes) in this column. Now the requirement is that the column needs to store more than 2000 characters (in fact, unlimited characters). The DBAs don't like the BLOB or LONG data types for maintenance reasons.
The solution I can think of is to remove this column from the original table, create a separate table for it, and store the text across multiple rows in order to get unlimited characters. This table would be joined with the original table for queries.
Is there any better solution to this problem?
UPDATE: The proprietary programming language allows defining variables of type string and blob; there is no option for CLOB. I understand the responses given, but I cannot take on the DBAs. I understand that deviating from BLOB or LONG will be a developers' nightmare, but I still cannot help it.
UPDATE 2: If the maximum I need is 8000 characters, can I just add 3 more columns so that I have 4 columns of 2000 characters each, giving 8000 characters in total? When the first column is full, values would spill over to the next column, and so on. Will this design have any bad side effects? Please suggest.
If a BLOB is what you need, convince your DBA that it's what you need. Those data types are there for a reason, and any roll-your-own implementation will be worse than the built-in type.
You might also want to look at the CLOB type, as it will meet your needs quite well.
You could follow the way Oracle stores its stored-procedure source in the data dictionary (see the ALL_SOURCE view). Define a table to hold the text lines:
CREATE TABLE MY_TEXT (
  IDENTIFIER INT,
  LINE INT,
  TEXT VARCHAR2(4000),
  PRIMARY KEY (IDENTIFIER, LINE));
The IDENTIFIER column is the foreign key to the original table. LINE is a simple integer (not a sequence) that keeps the text fragments in order. This allows keeping larger chunks of data.
Yes, this is not as efficient as a BLOB, CLOB, or LONG (I would avoid LONG fields if at all possible). Yes, this requires more maintenance, but if your DBAs are dead set against managing CLOB fields in the database, this is option two.
EDIT:
MY_TABLE below is where you currently have the VARCHAR2 column you are looking to expand. I would keep it in the table for the short text values.
CREATE TABLE MY_TABLE (
  IDENTIFIER INT,
  OTHER_FIELD VARCHAR2(10),
  REQUIRED_TEXT VARCHAR2(4000),
  PRIMARY KEY (IDENTIFIER));
Then write a query that joins the two tables to pull the data back, ordering by LINE from MY_TEXT. Your application will need to split the string into 2000-character chunks and insert them in line order.
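A minimal sketch of that retrieval query, assuming the table definitions above:
SELECT t.IDENTIFIER, t.OTHER_FIELD, x.LINE, x.TEXT
FROM   MY_TABLE t
JOIN   MY_TEXT  x ON x.IDENTIFIER = t.IDENTIFIER
ORDER BY t.IDENTIFIER, x.LINE;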
I would do this in a PL/SQL procedure, for both the insert and the select. PL/SQL VARCHAR2 strings can be up to 32K characters, which may or may not be large enough for your needs.
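A minimal sketch of the insert side (the procedure name is an illustration; it assumes the input fits in a 32K PL/SQL VARCHAR2):
CREATE OR REPLACE PROCEDURE insert_my_text (p_id IN INT, p_text IN VARCHAR2) AS
  v_line INT := 1;  -- line counter, keeps the chunks in order
  v_pos  INT := 1;  -- current position within the input string
BEGIN
  WHILE v_pos <= LENGTH(p_text) LOOP
    INSERT INTO MY_TEXT (IDENTIFIER, LINE, TEXT)
    VALUES (p_id, v_line, SUBSTR(p_text, v_pos, 2000));
    v_line := v_line + 1;
    v_pos  := v_pos + 2000;
  END LOOP;
END;
/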
But like every other person answering this question, I would strongly suggest making a case to the DBA to make the column a CLOB. From the program perspective this will be a BLOB and therefore simple to manage.
You said no BLOB or LONG... but what about CLOB? 4GB character data.
BLOB is the best solution. Anything else will be less convenient and a bigger maintenance annoyance.
Is BFILE a viable alternative datatype for your DBAs?
I don't get it. A CLOB is the appropriate database data type. If your weird programming language will deal with strings of 8000 (or whatever) characters, what stops it from writing those to a CLOB?
More specifically, what error do you get (from Oracle or from your programming language) when you try to insert an 8000-character string into a column defined as a CLOB?
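For reference, a minimal sketch of the CLOB approach suggested throughout this thread (the table and column names are illustrative):
CREATE TABLE my_table_text (
  id        NUMBER PRIMARY KEY,
  long_text CLOB);  -- holds far more than 2000 characters
-- Short literals can be inserted directly; longer values are typically bound from the client as a string parameter.
INSERT INTO my_table_text (id, long_text)
VALUES (1, 'text that can grow well past the old 2000-character limit');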
