I used
column sal format A10;
set linesize 1500;
to set the width for the sal and mgr columns. After that, when I view the table, the values in the mgr and sal columns appear as ######.
Now how do I get the actual values?
You are using an invalid format model for a number column. The valid elements are listed in the documentation. SQL*Plus doesn't know what you mean, so it seems to be defaulting to the same behaviour it has if a number won't fit in a valid number format: "If a value cannot fit in the column, SQL*Plus displays pound signs (#) instead of the number."
You would only use the character column formatting like A10 if the result returned by the query was already a string, e.g. if you did to_char(sal, '999G999G999') as sal; but even then the maximum length is already set by the format model (to 12 in that case, to allow the +/- sign).
You probably don't need to format those columns at all, but if you are retrieving them as numbers and want to add formatting you need to use an appropriate model, e.g.
column salary format 999G999G999
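For the sal and mgr columns from your question, that might look something like this (just a sketch; the widths and the other column/table names are guesses based on the standard EMP schema):
column sal format 99G999G990
column mgr format 99990
select empno, ename, sal, mgr from emp;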
I'm using Oracle APEX 20.1 and have a report with a column that displays currency values.
Since I'm from Europe I need the currency symbol to be behind the value (unlike the typical $amount format you see often in America).
I achieved this by changing the column's format mask to 'FM999G999G999G999G990D00L'.
This works well so far, but visually I would prefer if there was a whitespace between the end of the number (D00) and the currency symbol (L). Is there a way to insert a static whitespace in this format string?
I already looked through Oracle's documentation on currency format strings, but it does not seem to mention an option to include an always-present whitespace or an arbitrary static character.
Thank you in advance.
You can't include punctuation or character literals in a number mask as you can for dates, unfortunately.
You can include a space as part of the currency symbol itself - that is a string, not a character, and can be up to 10 bytes:
to_char(<number>, 'FM999G999G999G999G990D00L', 'nls_currency='' $''')
though that then uses a fixed currency symbol, not the session value from the L format element; you can get it dynamically from the session parameters:
to_char(
  <number>,
  'FM999G999G999G999G990D00L',
  (
    select 'nls_currency='' ' || value || ''''
    from   nls_session_parameters
    where  parameter = 'NLS_CURRENCY'
  )
)
which is a bit ugly. And you might not actually want the session's currency symbol; it might be more appropriate to always use the symbol that's relevant to that data.
And I imagine neither of those will fit in to Apex's numeric column formatting, so you would probably need to do that to_char() call explicitly in your query and have Apex just treat it as a pre-formatted string. (I have no idea how formatting works in Apex - from your description I'm assuming there is somewhere you define a format mask for a column in an interactive grid or whatever; but maybe you are already calling to_char().)
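If you do end up calling to_char() in the query itself, it might look something like this (order_id, amount and orders are made-up names, and the euro symbol is hard-coded here):
select order_id,
       to_char(amount, 'FM999G999G999G999G990D00L',
               'nls_currency='' €''') as display_amount
from   orders;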
You can also change the currency symbol for the session:
alter session set nls_currency = ' €';
select to_char(123.45, 'FM999G999G999G999G990D00L') from dual;
123.45 €
which might be an option but would affect all currency fields - where the L format element is used - though maybe that's a good thing.
db<>fiddle.
I have an Interactive Report with floats and I want to sort the values in the correct numeric order, like:
8.00
9.00
80.00
90.00
In addition I want to show only two decimal places.
By default Apex displays this format:
How to achieve the correct formatting?
Edit:
The data type of the column was not numeric but VARCHAR. After changing the type to NUMBER, all formatting was as expected.
Let's say your IR has the following columns:
ID
Amount
Order by
Set the order by statement:
Click on the IR
Find the Source attribute on the right
Click on Order By and set the value to (choose one):
AMOUNT DESC
AMOUNT ASC
Format
The format should be set at column level. In the left pane, find the AMOUNT column under the IR. On the right you should find an attribute called Format Mask (under Appearance).
Set the value to FM9999999.90 (or choose one of the predefined masks).
SQL
Or just use an SQL query:
select ID,
       TO_CHAR(AMOUNT, 'FM9999999.90') AMOUNT
from TEST_TABLE
-- qualify the column so the sort uses the numeric value, not the formatted string
order by TEST_TABLE.AMOUNT DESC
You can supply a format mask on the column definition.
The LOV next to the setting gives some samples
999G999G999G999G990D00
Sorting will be as expected if this table's column is numeric.
In Interactive report: Actions > Data > Sort, select column and set Ascending Direction;
In Page Designer: Choose your column and set Format Mask in Appearance section, for example: 99.99.
In Oracle, while trying to concatenate two columns that are both of NUMBER type and then take the MAX of the result, I have a question.
i.e. columns A and B are of NUMBER data type:
Select MAX(A||B) from table
Table data
A B
20150501 95906
20150501 161938
when I’m running the query Select MAX(A||B) from table
O/P - 2015050195906
Ideally 20150501161938 should be the output, shouldn't it?
When I format column B with TO_CHAR(B,'FM000000') and execute, I get the expected output.
Select MAX(A || TO_CHAR(B,'FM000000')) FROM table
O/P - 20150501161938
Why is 2015050195906 considered the MAX in the first case?
Presumably, column A is a date and column B is a time.
If that's true, treat them as such:
select max(to_date(to_char(a)||to_char(b,'FM000000'),'YYYYMMDDHH24MISS')) from your_table;
That will add leading zeros to the time component (if necessary), then concatenate the columns into a string, which is passed to the to_date function; the max function will then treat the result as a DATE datatype, which is presumably what you want.
PS: The real solution here is to fix your data model. Don't store dates and times as numbers. In addition to sorting issues like this, the optimizer can get confused. (If you store a date as a number, how can the optimizer know that '20141231' will immediately be followed by '20150101'?)
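A rough sketch of that kind of fix (your_table follows the query above; the event_ts column name is made up):
alter table your_table add (event_ts date);
update your_table
set    event_ts = to_date(to_char(a) || to_char(b, 'FM000000'), 'YYYYMMDDHH24MISS');
select max(event_ts) from your_table;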
You should convert to a number:
select MAX(TO_NUMBER(A||B)) from table
Concatenation results in a character/text output. As such, it sorts alphabetically, so '95906' sorts after '161938' (a string starting with 9 is greater than one starting with 1).
In the second case, you are specifying a format to pad the number to six digits. That works well, because 095906 will now appear before 161938.
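You can see what is happening by comparing the two values both as strings and as numbers (a quick illustration using the values from the question):
select greatest('2015050195906', '20150501161938') as string_max,
       greatest(2015050195906, 20150501161938) as number_max
from   dual;
-- STRING_MAX: 2015050195906  (character comparison; '9' > '1' at the ninth character)
-- NUMBER_MAX: 20150501161938 (numeric comparison; the 14-digit value is larger)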
I am getting data from ERP systems in the form of feeds; in particular, one column's length in the feed is only 15.
The corresponding column in the target table is also varchar2(15), but when I try to load the data into the DB it shows an error like:
ORA-12899: value too large for column emp_name (actual: 16, maximum: 15)
I can't increase the column length since it is a base table in production.
Have a look at this blog; the problem was resolved for me by changing the column datatype from varchar(100) to varchar(100 char). In my case the data contained some umlaut characters.
http://gerardnico.com/wiki/database/oracle/byte_or_character
The usual reason for problems like this are non-ASCII characters that can be represented with one byte in the original database but require two (or more) bytes in the target database (due to different NLS settings).
To ensure your target column is large enough for 15 characters, you can modify it:
ALTER TABLE table_name MODIFY column_name VARCHAR2(15 CHAR)
(note the 15 CHAR - you can also use BYTE; if neither is present, the database uses the NLS_LENGTH_SEMANTICS setting as a default).
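To see which default your own session would use, you can check the session parameters (just a diagnostic query, not part of the fix):
select value
from   nls_session_parameters
where  parameter = 'NLS_LENGTH_SEMANTICS';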
To check which values are larger than 15 bytes, you can
create a staging table in the target database with the column length set to 15 CHAR
insert the data from the source table into the staging table
find the offending rows with
SELECT * FROM staging WHERE lengthb(mycol) > 15
(note the use of LENGTHB as opposed to LENGTH - the former returns the length in bytes, whereas the latter returns the length in characters)
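Putting that together, a sketch might look like this (staging and source_table are placeholder names; emp_name is taken from the error message):
create table staging (emp_name varchar2(15 char));

insert into staging (emp_name)
select emp_name from source_table;

select emp_name, lengthb(emp_name) as byte_len, length(emp_name) as char_len
from   staging
where  lengthb(emp_name) > 15;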
I found AL32UTF8 to be the only valid setting. It varies from standard UTF8 in that a few characters have supplementary bytes, i.e. the characters are about 99% the same. I am guessing you have character conversion problems going on. In other words, the data in table1 was written using one charset, and the new table has a slightly different charset.
If this is true, you have to find the source of the oddball charset. Because this will continue to happen.
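One way to confirm which character set each database uses (and therefore whether such a conversion is happening) is to query the database parameters:
select parameter, value
from   nls_database_parameters
where  parameter in ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');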
Solution to:
ORA-12899: VALUE TOO LARGE FOR COLUMN(ACTUAL,MAXIMUM)
If you are facing a problem while changing the size of a column in a table that already has data longer than the new length, below is a simple script that will work.
ALTER TABLE TABLE_NAME ADD (NEW_COLUMN_NAME DATATYPE(DATASIZE));
UPDATE TABLE_NAME SET NEW_COLUMN_NAME = SUBSTR(OLD_COLUMN_NAME , 1, NEW_LENGTH);
ALTER TABLE TABLE_NAME DROP COLUMN OLD_COLUMN_NAME ;
ALTER TABLE TABLE_NAME RENAME COLUMN NEW_COLUMN_NAME TO OLD_COLUMN_NAME;
Meaning of the query:
ALTER TABLE TABLE_NAME ADD (NEW_COLUMN_NAME DATATYPE(DATASIZE));
It would just create a new column of the required new length in your existing table.
UPDATE TABLE_NAME SET NEW_COLUMN_NAME = SUBSTR(OLD_COLUMN_NAME , 1, NEW_LENGTH);
It discards everything beyond the new length from the old column's values and stores the trimmed values in the new column.
ALTER TABLE TABLE_NAME DROP COLUMN OLD_COLUMN_NAME ;
It removes the old column, which is now redundant because all the information has been copied into the new column.
ALTER TABLE TABLE_NAME RENAME COLUMN NEW_COLUMN_NAME TO OLD_COLUMN_NAME;
Renaming the new column to the old column name restores the original table structure, except for the new column size, as you wished.
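For example, if the column from the error message were being shrunk to 15 characters, the sequence would look roughly like this (employees and emp_name are assumed names; note that SUBSTR silently truncates anything beyond 15 characters):
ALTER TABLE employees ADD (emp_name_new VARCHAR2(15 CHAR));
UPDATE employees SET emp_name_new = SUBSTR(emp_name, 1, 15);
ALTER TABLE employees DROP COLUMN emp_name;
ALTER TABLE employees RENAME COLUMN emp_name_new TO emp_name;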
Certainly the cause of this error is that the value is too large for the column data type. However, sometimes it is not visible at first sight. Besides the "byte versus char" differences mentioned in other answers, there can also be a problem with line terminators.
I was trying to load a CSV file using SQL*Loader in dockerized Oracle. The foo column of type char(1) was the last column. I got the ORA-12899: value too large for column foo (actual: 2, maximum: 1) error even though all values in the foo column were of length 1. Later I noticed the CSV file had been edited in a Windows editor and accidentally saved with CRLF line terminators. Since Linux in the Docker container expects just LF, the CR was treated as part of the column data.
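If you suspect something like this, loading the file into a staging table with a wider column and inspecting the values with DUMP makes the stray carriage return visible (a sketch; the staging table is a made-up name):
select foo, dump(foo) as raw_bytes
from   staging
where  instr(foo, chr(13)) > 0;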
This error confused me a little bit.
VARCHAR2(x CHAR) means that the column will hold x characters but not more than can fit into 4000 bytes. Internally, Oracle will set the byte length of the column (DBA_TAB_COLUMNS.DATA_LENGTH) to MIN(x * mchw, 4000), where mchw is the maximum byte width of a character in the database character set. This is 1 for US7ASCII or WE8MSWIN1252, 2 for JA16SJIS, 3 for UTF8, and 4 for AL32UTF8.
For example, a VARCHAR2(3000 CHAR) column in an AL32UTF8 database will be internally defined as having the width of 4000 bytes. It will hold up to 3000 characters from the ASCII range (the character limit), but only 1333 Chinese characters (the byte limit, 1333 * 3 bytes = 3999 bytes). A VARCHAR2(100 CHAR) column in an AL32UTF8 database will be internally defined as having the width of 400 bytes. It will hold up to any 100 Unicode characters.
Reference: https://community.oracle.com/tech/developers/discussion/421117/difference-between-varchar2-4000-byte-varchar2-4000-char
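A small demonstration of the difference, assuming an AL32UTF8 database (the table name is made up):
create table char_vs_byte (
  c_char varchar2(3 char),
  c_byte varchar2(3 byte)
);

insert into char_vs_byte (c_char) values ('äöü');  -- 3 characters, 6 bytes: fits
insert into char_vs_byte (c_byte) values ('äöü');  -- fails with ORA-12899 (actual: 6, maximum: 3)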
I'm trying to parse numbers using the following code
TO_NUMBER('1,234.56', '9999999D99')
For some reason the comma is ignored and the value is parsed correctly, even though the format doesn't include it. Is there any way to restrict the use of the thousands group separator?
So far I have only come up with setting a bogus symbol as the separator, in the hope that the user will not use it:
TO_NUMBER('1,234.56', '9999999D99', 'NLS_NUMERIC_CHARACTERS=''.ã''');
This is a very strange request - if Oracle is correctly converting the string to a number then it seems to be doing its job correctly. However, if you really need to do this for whatever reason then simply remove the format mask.
SQL> select to_number('1,234.56') from dual;
select to_number('1,234.56') from dual
*
ERROR at line 1:
ORA-01722: invalid number
SQL> select to_number('1234.56') from dual;
TO_NUMBER('1234.56')
--------------------
1234.56
SQL>
Though the SQL Language Reference doesn't mention any default values, I believe that the default format mask described in the OLAP DML Reference for TO_NUMBER() applies:
The default number format identifies a period (.) as the decimal marker and does not recognize any other symbol.
This, in turn, means that a comma is an invalid value for the conversion and thus the conversion will fail.