I am creating a cursor on a query that selects all the columns from a table, and in a loop I use each column and print it out. I wanted to know if there is an easy way to check whether a column has a value or not: if a column in a row is null, print "NULL" instead. I know I can use an IF condition to check each and every column of a row from the cursor inside the loop, but I have a lot of columns, so is there an easier way of doing that?
Rather than handling this requirement in PL/SQL, you should consider doing it in the cursor query itself. In general, the more you can do in your SQL the better (the optimizer will handle the performance). Lots of PL/SQL logic, especially unnecessary IF/ELSE conditions inside a loop, will not do you any favours when you are processing a lot of data.
In your cursor you can simply:
CURSOR c1 IS
  SELECT NVL(your_column, 'NULL')
  FROM your_table;
There are plenty of in-built SQL functions that will handle this for you. Above, I used NVL, but you can use COALESCE, CASE, etc.
CASE example (note: if the column is not of datatype VARCHAR2, you will need to apply a datatype conversion to avoid errors):
CURSOR c1 IS
  SELECT (CASE
            WHEN your_column IS NULL THEN
              'NULL'
            ELSE
              your_column
          END)
  FROM your_table;
COALESCE example:
CURSOR c1 IS
  SELECT COALESCE(your_column, 'NULL')
  FROM your_table;
It's worth noting that NVL takes exactly two arguments, so only one column/value can be tested against a single fallback, whereas COALESCE can test multiple columns/values at the same time, like below:
SELECT COALESCE(column1, column2, column3, 'NULL')
FROM YOUR_TABLE;
If column1 is NULL and column2 is NOT NULL, then the value of column2 is returned.
If column1 and column2 are both NULL, then the value of column3 is returned.
If column1, column2 and column3 are all NULL, then the literal 'NULL' is returned.
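To tie this back to the original loop, here is a minimal sketch assuming a hypothetical table your_table with a VARCHAR2 column col1 and a NUMBER column col2 (non-character columns need a conversion such as TO_CHAR first):
DECLARE
  CURSOR c1 IS
    SELECT NVL(col1, 'NULL')          AS col1,
           NVL(TO_CHAR(col2), 'NULL') AS col2  -- convert non-VARCHAR2 columns before NVL
    FROM   your_table;
BEGIN
  FOR r IN c1 LOOP
    DBMS_OUTPUT.PUT_LINE(r.col1 || ' | ' || r.col2);
  END LOOP;
END;
/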
I'm facing a problem when I try to use the LAG function on a CLOB column.
So let's assume we have a table
create table test (
id number primary key,
not_clob varchar2(255),
this_is_clob clob
);
insert into test values (1, 'test1', to_clob('clob1'));
insert into test values (2, 'test2', to_clob('clob2'));
DECLARE
x CLOB := 'C';
BEGIN
FOR i in 1..32767
LOOP
x := x||'C';
END LOOP;
INSERT INTO test(id,not_clob,this_is_clob) values(3,'test3',x);
END;
/
commit;
Now let's do a select using non-clob columns
select id, lag(not_clob) over (order by id) from test;
It works as expected, but when I try the same with the CLOB column
select id, lag(this_is_clob) over (order by id) from test;
I get
ORA-00932: inconsistent datatypes: expected - got CLOB
00932. 00000 - "inconsistent datatypes: expected %s got %s"
*Cause:
*Action:
Error at Line: 1 Column: 16
Can you tell me what the solution to this problem is? I couldn't find anything on it.
The documentation says the argument of an analytic function can be any datatype, but it seems CLOB is not supported in practice.
However, there is a workaround:
select id, lag(dbms_lob.substr(this_is_clob, 4000, 1)) over (order by id)
from test;
This is not the whole CLOB, but 4K should be good enough in many cases.
I'm still wondering what the proper way to overcome the problem is.
Is upgrading to 12c an option? The problem has nothing to do with CLOB as such; it's the fact that Oracle has a hard limit of 4000 characters for strings in SQL. In 12c we have the option to use extended data types (provided we can persuade our DBAs to turn it on!).
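As a sketch of how that could look once extended data types are enabled (this assumes MAX_STRING_SIZE = EXTENDED, which raises the SQL VARCHAR2 limit to 32767 bytes; how much of the CLOB you can pull back still depends on your data):
-- assumes MAX_STRING_SIZE = EXTENDED; with the standard setting this stays limited to 4000 bytes
select id,
       lag(dbms_lob.substr(this_is_clob, 32767, 1)) over (order by id) as prev_chunk
from   test;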
Some SQL features do not work properly with CLOBs (like DISTINCT, ORDER BY, GROUP BY, etc.). It looks like LAG is also one of them, but I couldn't find that anywhere in the docs.
If the values in the CLOB column are always less than 4000 characters, you may use TO_CHAR:
select id, lag( TO_CHAR(this_is_clob)) over (order by id) from test;
OR
convert it into an equivalent self join (which may not be as efficient as LAG):
SELECT a.id,
       b.this_is_clob AS lagging
FROM   test a
LEFT JOIN test b
       ON b.id = (SELECT MAX(c.id) FROM test c WHERE c.id < a.id);
I know this is an old question, but I think I found an answer that eliminates the need to restrict the CLOB length, and I wanted to share it. Using a CTE and a recursive subquery, we can replicate the LAG functionality with CLOB columns.
First, let's take a look at my "original" query:
WITH TEST_TABLE AS
(
SELECT LEVEL ORDER_BY_COL,
TO_CLOB(LEVEL) AS CLOB_COL
FROM DUAL
CONNECT BY LEVEL <= 10
)
SELECT tt.order_by_col,
tt.clob_col,
LAG(tt.clob_col) OVER (ORDER BY tt.order_by_col)
FROM test_table tt;
As expected, I get the following error:
ORA-00932: inconsistent datatypes: expected - got CLOB
Now, let's look at the modified query:
WITH TEST_TABLE AS
(
SELECT LEVEL ORDER_BY_COL,
TO_CLOB(LEVEL) AS CLOB_COL
FROM DUAL
CONNECT BY LEVEL <= 10
),
initial_pull AS
(
SELECT tt.order_by_col,
LAG(tt.order_by_col) OVER (ORDER BY tt.order_by_col) AS PREV_ROW,
tt.clob_col
FROM test_table tt
),
recursive_subquery (order_by_col, prev_row, clob_col, prev_clob_col) AS
(
SELECT ip.order_by_col, ip.prev_row, ip.clob_col, NULL
FROM initial_pull ip
WHERE ip.prev_row IS NULL
UNION ALL
SELECT ip.order_by_col, ip.prev_row, ip.clob_col, rs.clob_col
FROM initial_pull ip
INNER JOIN recursive_subquery rs ON ip.prev_row = rs.order_by_col
)
SELECT rs.order_by_col, rs.clob_col, rs.prev_clob_col
FROM recursive_subquery rs;
So here is how it works.
1. I create TEST_TABLE; this is really only for the example, since you should already have this table somewhere in your schema.
2. I create a CTE of the data I want to pull, plus a LAG on the primary key (or a unique column) of the table, partitioned and ordered the same way as in my original query.
3. I create a recursive subquery that uses the initial row as the root and descends row by row, joining on the lagged column, and returns both the CLOB column from the current row and the CLOB column from its parent row.
I have a requirement to find all NOT NULL columns in a table. For example, say my table has columns Column1 through Column6.
Let's say Column1, Column2 and Column3 have NOT NULL constraints, and Column4, Column5 and Column6 are nullable. Is there any query in Oracle that lists the columns that are NOT NULL, i.e. I need to get the column names Column1, Column2 and Column3?
DESIRED OUTPUT
Column1
Column2
Column3
I know there should be a simple way to achieve this, but I am new to Oracle. Any help would be highly appreciated.
You can query the ALL_TAB_COLUMNS view:
select column_name
from all_tab_columns
where table_name = 'TABLE1'
and nullable = 'N';
I know there should be a simple way to achieve this, but I am new to Oracle.
Well, online documentation is exactly what you need to look into.
Depending on the privilege, you need to look into [DBA|USER|ALL]_TAB_COLUMNS.
ALL_TAB_COLUMNS
Column: NULLABLE
Datatype: VARCHAR2(1)
Description: Indicates whether a column allows NULLs. The value is N if there is a NOT NULL constraint on the column or if the column is part of a PRIMARY KEY. The constraint should be in an ENABLE VALIDATE state.
So, per the documentation, you need to use the filter:
NULLABLE = 'N'
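For example, a minimal sketch against USER_TAB_COLUMNS (this assumes the table belongs to the current schema; note that unquoted table names are stored in upper case):
select column_name
from   user_tab_columns
where  table_name = 'TABLE1'
and    nullable   = 'N'
order  by column_id;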
I'm trying to use a nested table inside an IN clause in a PL/SQL block.
First, I have defined a TYPE:
CREATE OR REPLACE TYPE VARCHAR_ARRAY AS TABLE OF VARCHAR2(32767);
Here is my PL/SQL block using BULK COLLECT INTO:
DECLARE
COL1 VARCHAR2(50) := '123456789';
N_TBL VARCHAR_ARRAY := VARCHAR_ARRAY();
C NUMBER;
BEGIN
-- Print timestamp
DBMS_OUTPUT.PUT_LINE('START: ' || TO_CHAR(SYSTIMESTAMP ,'dd-mm-yyyy hh24:mi:ss.FF'));
SELECT COLUMN1
BULK COLLECT INTO N_TBL
FROM MY_TABLE
WHERE COLUMN1 = COL1;
SELECT COUNT(COLUMN1)
INTO C
FROM MY_OTHER_TABLE
WHERE COLUMN1 IN (SELECT column_value FROM TABLE(N_TBL));
-- Print timestamp
DBMS_OUTPUT.PUT_LINE('ENDED: ' || TO_CHAR(SYSTIMESTAMP ,'dd-mm-yyyy hh24:mi:ss.FF'));
END;
And the output is:
START: 01-08-2014 12:36:14.997
ENDED: 01-08-2014 12:36:17.554
It takes more than 2.5 seconds (2.557 seconds exactly)
Now, if I replace the nested table with a subquery, like this:
DECLARE
COL1 VARCHAR2(50) := '123456789';
N_TBL VARCHAR_ARRAY := VARCHAR_ARRAY();
C NUMBER;
BEGIN
-- Print timestamp
DBMS_OUTPUT.PUT_LINE('START: ' || TO_CHAR(SYSTIMESTAMP ,'dd-mm-yyyy hh24:mi:ss.FF'));
SELECT COUNT(COLUMN1)
INTO C
FROM MY_OTHER_TABLE
WHERE COLUMN1 IN (
-- Nested table replaced by a subquery
SELECT COLUMN1
FROM MY_TABLE
WHERE COLUMN1 = COL1
);
-- Print timestamp
DBMS_OUTPUT.PUT_LINE('ENDED: ' || TO_CHAR(SYSTIMESTAMP ,'dd-mm-yyyy hh24:mi:ss.FF'));
END;
The output is:
START: 01-08-2014 12:36:08.889
ENDED: 01-08-2014 12:36:08.903
It takes only 14 milliseconds...!!!
What could I do to improve this PL/SQL block?
Is there any database configuration needed?
Are the two query plans different?
Assuming that they are, the difference is likely that the optimizer has reasonable estimates about the number of rows the subquery will return and, thus, is able to choose the most efficient plan. When your data is in a nested table (I'd hate to use the word array in the type declaration here since that implies that you're using a varray when you're not), Oracle doesn't have information about how many elements are going to be in the collection. By default, it's going to guess that the collection has as many elements as your data blocks have bytes. So if you have 8k blocks, Oracle will guess that your collection has 8192 elements.
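A quick sketch of how to see that default estimate for yourself (this reuses the VARCHAR_ARRAY type from above; the exact figure reported depends on block size and version):
EXPLAIN PLAN FOR
SELECT column_value FROM TABLE(VARCHAR_ARRAY('a', 'b')) t;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
-- the COLLECTION ITERATOR step typically shows a Rows estimate in the thousands
-- for an 8k block size, even though the collection holds only two elements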
Assuming that your actual query doesn't return anywhere close to 8192 rows and that it actually returns many more or many fewer rows, you can potentially use the cardinality hint to let the optimizer make a more accurate guess. For example, if your query generally returns a few dozen rows, you probably want something like
SELECT COUNT(COLUMN1)
INTO C
FROM MY_OTHER_TABLE
WHERE COLUMN1 IN (SELECT /*+ cardinality(t 50) */ column_value
FROM TABLE(N_TBL) t);
The literal you put in the cardinality hint doesn't need to be particularly accurate, just close to general reality. If the number of rows is completely unknown, the dynamic_sampling hint can help.
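A sketch of that variant, using the same hypothetical table and collection as above (whether the optimizer actually samples the collection contents depends on your version):
SELECT COUNT(COLUMN1)
INTO C
FROM MY_OTHER_TABLE
WHERE COLUMN1 IN (SELECT /*+ dynamic_sampling(t 2) */ column_value
                  FROM TABLE(N_TBL) t);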
If you are using Oracle 11g, you may also benefit from cardinality feedback helping the optimizer learn to better estimate the number of elements in a collection.
I have 3 tables, TBL_A, TBL_B and TBL_C.
TBL_A has an ID (let's call it ID_A) that is the PK of A and an FK in B and C. ID_A is populated by a before-insert trigger that auto-increments it from a sequence.
The problem:
I need to make an INSERT ALL that looks something like this:
INSERT ALL
  INTO TBL_A(FIELD1, FIELD2) VALUES('VALUE1', 'VALUE2') -- as said before, the trigger inserts the id using MY_SEQ.NEXTVAL
  INTO TBL_B(ID_A, FIELD) VALUES (MY_SEQ.CURRVAL, 'VALUE')
  INTO TBL_C(ID_A, FIELD) VALUES (MY_SEQ.CURRVAL, 'VALUE')
SELECT * FROM DUAL;
But for some reason it says that sequence MY_SEQ.CURRVAL is not yet defined in this session. What can I do to solve it? I can't find a way to do it.
I need to keep the trigger because that insert can be huge.
Please help me, and sorry about my English, by the way.
This has worked fine for me:
INSERT ALL INTO TBL_A(FIELD1, FIELD2) VALUES('VALUE1', 'VALUE2')
INTO TBL_B(ID_A, FIELD) VALUES (MY_SEQ.CURRVAL, 'VALUE')
INTO TBL_C(ID_A, FIELD) VALUES (MY_SEQ.CURRVAL, 'VALUE')
SELECT * FROM DUAL;
Indeed, this only works once MY_SEQ.NEXTVAL has been called at least once in the session; after that, subsequent runs work fine without calling NEXTVAL. The sequence must be properly initialized in the session before CURRVAL can be referenced.
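For instance, a minimal way to initialize it in the session (a sketch; any statement that references MY_SEQ.NEXTVAL first will do):
SELECT MY_SEQ.NEXTVAL FROM DUAL;
A different approach, shown next, moves NEXTVAL into the SELECT part of the multi-table insert, though see the caveat that follows it.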
INSERT ALL
INTO TBL_A(ID, FIELD1, FIELD2) VALUES(id, val1, val2)
INTO TBL_B(ID_A, FIELD) VALUES(id, val)
INTO TBL_C(ID_A, FIELD) VALUES(id, val)
SELECT
MY_SEQ.NEXTVAL as id, 'VALUE' as val, 'VALUE1' val1, 'VALUE2' val2
FROM DUAL
You can't use a sequence in the subquery of a multi-table insert and you shouldn't use it in the values clause like that because it can give unpredictable results.
You might consider populating the data into a global temporary table with the sequence and then doing your multitable insert using the global temporary table. Of course this would require modifying the trigger.
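A rough sketch of that idea (the staging table, column names and values here are hypothetical, and the trigger on TBL_A would need to be changed to keep an explicitly supplied ID):
CREATE GLOBAL TEMPORARY TABLE STG_A (
  ID     NUMBER,
  FIELD1 VARCHAR2(50),
  FIELD2 VARCHAR2(50)
) ON COMMIT DELETE ROWS;

INSERT INTO STG_A (ID, FIELD1, FIELD2)
VALUES (MY_SEQ.NEXTVAL, 'VALUE1', 'VALUE2');

INSERT ALL
  INTO TBL_A(ID_A, FIELD1, FIELD2) VALUES (ID, FIELD1, FIELD2)
  INTO TBL_B(ID_A, FIELD) VALUES (ID, 'VALUE')
  INTO TBL_C(ID_A, FIELD) VALUES (ID, 'VALUE')
SELECT ID, FIELD1, FIELD2 FROM STG_A;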
Oracle 12c introduced a nice feature (which should have been there long ago, by the way): identity columns. So here's a script:
CREATE TABLE test (
a INTEGER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
b VARCHAR2(10)
);
-- Ok
INSERT INTO test (b) VALUES ('x');
-- Ok
INSERT INTO test (b)
SELECT 'y' FROM dual;
-- Fails
INSERT INTO test (b)
SELECT 'z' FROM dual UNION ALL SELECT 'zz' FROM DUAL;
The first two inserts run without issues, providing values of 1 and 2 for "a". But the third one fails with ORA-01400: cannot insert NULL into ("DEV"."TEST"."A"). Why did this happen? A bug? Nothing like this is mentioned in the documentation section on identity column restrictions. Or am I just doing something wrong?
I believe the query below works; I haven't tested it!
INSERT INTO Test (b)
SELECT * FROM
(
SELECT 'z' FROM dual
UNION ALL
SELECT 'zz' FROM dual
);
Not sure if it helps you in any way.
For GENERATED ALWAYS AS IDENTITY, Oracle internally just uses a sequence, and the rules that apply to sequences in general apply here as well.
NEXTVAL is used to fetch the next available sequence value, and it is a pseudocolumn.
The following is from the Oracle documentation:
You cannot use CURRVAL and NEXTVAL in the following constructs:
A subquery in a DELETE, SELECT, or UPDATE statement
A query of a view or of a materialized view
A SELECT statement with the DISTINCT operator
A SELECT statement with a GROUP BY clause or ORDER BY clause
A SELECT statement that is combined with another SELECT statement with the UNION, INTERSECT, or MINUS set operator
The WHERE clause of a SELECT statement
DEFAULT value of a column in a CREATE TABLE or ALTER TABLE statement
The condition of a CHECK constraint
The subquery and set-operation rules above should answer your question.
As for the reason for the NULL: when a pseudocolumn (e.g. NEXTVAL) is used with a set operation or in any of the other constructs mentioned above, the output is NULL, as Oracle cannot evaluate it while combining multiple SELECTs.
Consider the query below:
select rownum from dual
union all
select rownum from dual
the result is
ROWNUM
1
1