I am trying to loop through multiple tables in a data mart and see how my TABLE_A joins to TABLE_B based on a specific field (which changes with every loop). The variables for fields and tables resolve correctly in the DBMS_OUTPUT, but the query itself does not resolve to its full length. Instead, parts of the query are cut off with each loop. How can I fix this?
My loop looks like this:
DECLARE
CURSOR c_tables is
SELECT
COLUMN_NAME
,JOIN_TABLE_NAME
FROM meself.test_loop ;
v_string1 varchar2(32767) := '';
BEGIN
FOR i IN c_tables LOOP
v_string1 := '
INSERT INTO myself.FIN_HASHKEY_MATCH
SELECT
x.TABLE_NAME
,x.JOIN_TABLE_NAME
,x.HASH_KEY_NAME
,x.MATCH_TYPE
,count(*) AS TOTALS
FROM
(
SELECT
DISTINCT a.'||i.COLUMN_NAME||'
,''' || i.JOIN_TABLE_NAME || ''' AS JOIN_TABLE_NAME
,''' || i.COLUMN_NAME || ''' AS HASH_KEY_NAME
, ''TABLE_A'' AS Table_Name
,(CASE
WHEN
a.'|| i.COLUMN_NAME || ' = z.' || i.COLUMN_NAME || ' THEN "MATCH"
ELSE "FAIL"
END) AS MATCH_TYPE
FROM DATAMART.TABLE_A a
LEFT JOIN (SELECT
DISTINCT '|| i.COLUMN_NAME ||'
FROM DATAMART.'|| i.JOIN_TABLE_NAME ||') z
ON a.'|| i.COLUMN_NAME ||' = z.'|| i.COLUMN_NAME ||'
) x
GROUP BY x.TABLE_NAME, x.JOIN_TABLE_NAME, x.HASH_KEY_NAME, x.MATCH_TYPE'
;
dbms_output.put_line( v_string1 );
--execute immediate v_string1;
END LOOP;
END;
**Which resolves to this (just showing 2 loops for simplicity). You will see that the JOIN ... ON section is missing or incomplete before moving on to the next loop.**
INSERT INTO myself.FIN_HASHKEY_MATCH
SELECT
x.TABLE_NAME
,x.JOIN_TABLE_NAME
,x.HASH_KEY_NAME
,x.MATCH_TYPE
,count(*) AS TOTALS
FROM
(
SELECT
DISTINCT a.LINE_SCTGRY_D_SK
,'TABLE_B' AS JOIN_TABLE_NAME
,'LINE_SCTGRY_D_SK' AS HASH_KEY_NAME
,'TABLE_A' AS Table_Name
,(CASE
WHEN
a.LINE_SCTGRY_D_SK = z.LINE_SCTGRY_D_SK THEN "MATCH"
ELSE "FAIL"
END) AS MATCH_TYPE
FROM DATAMART.TABLE_A a
LEFT JOIN (SELECT
DISTINCT LINE_SCTGRY_D_SK
FROM DATAMART.TABLE_B
INSERT INTO myself.FIN_HASHKEY_MATCH
SELECT
x.TABLE_NAME
,x.JOIN_TABLE_NAME
,x.HASH_KEY_NAME
,x.MATCH_TYPE
,count(*) AS TOTALS
FROM
(
SELECT
DISTINCT a.MEMBER_D_SK
,'TABLE_B' AS JOIN_TABLE_NAME
,'MEMBER_D_SK' AS HASH_KEY_NAME
, 'TABLE_A' AS Table_Name
,(CASE
WHEN
a.MEMBER_D_SK = z.MEMBER_D_SK THEN "MATCH"
ELSE "FAIL"
END) AS MATCH_TYPE
FROM DATAMART.TABLE_A a
LEFT JOIN (SELECT
DISTINCT MEMBER_D_SK
FROM DATAMART.TABLE_B) z
ON a.MEMBER_D_SK = z.ME
(I think this could be a simple question for most users here.)
Short description:
I need a way (maybe with PL/SQL, which I don't know) to "select defined data from all tables which contain this type of data".
Long description (example):
I have a varying number of different tables. An often-changing subset of them - I don't know how many or which names - contains the column "FID". Now I need two steps:
a) Select all tables which contain the column "FID". (I know how to do this as a single step.)
b) Select the value of FID from all the found tables and show it.
For me the problem is the step from a) to b). With known tables I would use UNION, but with a dynamic set of tables I have no idea how to proceed.
You could use a variation on an XML magic trick, by using dbms_xmlgen to get all the values into XML documents based on a query against user_tab_columns:
select dbms_xmlgen.getxmltype(
'select "' || column_name || '" from "' || table_name || '"')
from user_tab_columns
where upper(column_name) = 'FID'
and data_type = 'NUMBER';
... where I'm assuming FID is expected to be a numeric ID, so limiting this to numeric columns only (and also allowing for mixed-case/quoted identifiers for table and column names, just in case). That gives one row per table, with an XML document listing the FID values in that table.
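For illustration, the XML document generated for one table has this general shape (the values here are made up; the ROWSET/ROW/FID element names are what the paths below rely on):
<ROWSET>
 <ROW>
  <FID>1</FID>
 </ROW>
 <ROW>
  <FID>2</FID>
 </ROW>
</ROWSET>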
Then from that XML you can extract the individual values, again as numbers:
with cte (xml) as (
select dbms_xmlgen.getxmltype(
'select "' || column_name || '" as fid from "' || table_name || '"')
from user_tab_columns
where upper(column_name) = 'FID'
and data_type = 'NUMBER'
)
select x.fid
from cte
cross apply xmltable(
'/ROWSET/ROW'
passing cte.xml
columns fid number path 'FID'
) x;
Or if you want to see the table/column each value came from, just include those in the CTE and select list:
with cte (table_name, column_name, xml) as (
select table_name, column_name, dbms_xmlgen.getxmltype(
'select "' || column_name || '" as fid from "' || table_name || '"')
from user_tab_columns
where upper(column_name) = 'FID'
and data_type = 'NUMBER'
)
select cte.table_name, cte.column_name, x.fid
from cte
cross apply xmltable(
'/ROWSET/ROW'
passing cte.xml
columns fid number path 'FID'
) x;
If you want to search other schemas, then use all_tab_columns instead, and optionally include each table's owner:
with cte (owner, table_name, column_name, xml) as (
select owner, table_name, column_name, dbms_xmlgen.getxmltype(
'select "' || column_name || '" as fid from "' || owner || '"."' || table_name || '"')
from all_tab_columns
where upper(column_name) = 'FID'
and data_type = 'NUMBER'
)
select cte.owner, cte.table_name, cte.column_name, x.fid
from cte
cross apply xmltable(
'/ROWSET/ROW'
passing cte.xml
columns fid number path 'FID'
) x;
db<>fiddle
The basis for this trick goes back to at least 2007 but may be even older, from before getxmltype() existed (it seems to have been added in 10g); I'd originally used xmltype(getxml()):
select xmltype(dbms_xmlgen.getxml(
'select "' || column_name || '" from "' || table_name || '"'))
from user_tab_columns
where upper(column_name) = 'FID'
and data_type = 'NUMBER';
which works most of the time, but throws "ORA-06502: PL/SQL: numeric or value error" if any of the tables are empty.
If you want to use PL/SQL, I really love pipelined functions:
create type result_type as Object ( text varchar2(2000) );
create type result_type_table as table of result_type;
create or replace function select_all( p_column_name in varchar2 )
return result_type_table
deterministic
pipelined
as
v_table_name varchar2(40);
v_result result_type := result_type('');
v_table_name_cursor sys_refcursor;
v_inner_cursor sys_refcursor;
begin
open v_table_name_cursor
for 'select a.table_name
from user_tab_cols a
, user_tables b
where a.column_name = :1
and a.table_name = b.table_name'
using upper(p_column_name);
loop
fetch v_table_name_cursor into v_table_name;
exit when v_table_name_cursor%notfound;
open v_inner_cursor
for 'select '||p_column_name||' from '||v_table_name;
loop
fetch v_inner_cursor into v_result.text;
exit when v_inner_cursor%notfound;
pipe row (v_result );
end loop;
close v_inner_cursor;
end loop;
close v_table_name_cursor;
return;
end;
/
Using this function is simple:
select * from table( select_all('your_column_name') );
db<>fiddle
I have 20 tables (each table has a PK and data), and I want to find out the current MAX(PK) value for each table.
I want the result as follows:
TABLE_NAME           MAX_VAL
-------------------- ----------
TABLE_A              114
TABLE_B              55
TABLE_C              14
TABLE_D              866
TABLE_E              4552
Is there any way to accomplish this, or do I have to write SELECT MAX(PK_COL) FROM <table> 20 times?
Assuming your currently connected schema is composed of those twenty tables, and each has an identical primary key column name (pk_col), consider the following code block containing an implicit cursor:
declare
v_max pls_integer;
begin
dbms_output.put_line('table_name max_val');
for c in ( select * from user_tables )
loop
execute immediate 'select max(pk_col) from '||c.table_name into v_max;
dbms_output.put_line(c.table_name||' '||v_max);
end loop;
end;
/
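If the primary key column name is not identical across the tables, a variation of the same idea (just a sketch, assuming single-column numeric primary keys) looks each table's PK column up in the data dictionary instead:
declare
  v_max pls_integer;
begin
  dbms_output.put_line('table_name max_val');
  for c in ( select cc.table_name, cc.column_name
             from user_constraints uc
             join user_cons_columns cc
               on cc.constraint_name = uc.constraint_name
             where uc.constraint_type = 'P' )
  loop
    -- one dynamic query per table, using that table's own PK column
    execute immediate 'select max(' || c.column_name || ') from ' || c.table_name into v_max;
    dbms_output.put_line(c.table_name || ' ' || v_max);
  end loop;
end;
/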
I have found another method which will bring back TABLE_NAME, PK_COLUMN and MAX(PK_COLUMN).
SELECT CASE
WHEN RN = 1 THEN
FORMATTED_QUERY_SET
ELSE
FORMATTED_QUERY_SET || ' UNION ALL '
END AS FORMATTED_QUERY_SET
FROM (SELECT ' SELECT NVL(MAX( ' || COL.COLUMN_NAME ||
' ),0) CURR_MAX_VAL, ''' || TAB.TABLE_NAME ||
''' TABLE_NAME,''' || COL.COLUMN_NAME ||
''' COLUMN_NAME FROM ' || TAB.TABLE_NAME AS FORMATTED_QUERY_SET,
TAB.TABLE_NAME,
ROW_NUMBER() OVER(ORDER BY TAB.TABLE_NAME DESC) AS RN
FROM USER_CONSTRAINTS TAB
JOIN USER_CONS_COLUMNS COL
ON TAB.TABLE_NAME = COL.TABLE_NAME
JOIN USER_TAB_COLUMNS COL2
ON COL.COLUMN_NAME = COL2.COLUMN_NAME
AND COL.TABLE_NAME = COL2.TABLE_NAME
WHERE TAB.CONSTRAINT_TYPE = 'P'
AND COL.CONSTRAINT_NAME LIKE '%_PK'
AND REGEXP_LIKE(COL2.DATA_TYPE, ('NUMB|INTE')))
ORDER BY TABLE_NAME;
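Each row of the output is one ready-to-run SELECT; for example, a generated row looks something like this (table and column names here are hypothetical):
SELECT NVL(MAX( EMP_ID ),0) CURR_MAX_VAL, 'EMPLOYEES' TABLE_NAME,'EMP_ID' COLUMN_NAME FROM EMPLOYEES UNION ALL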
Copy the output returned by the above query and execute it.
Note: Remove the last ' UNION ALL ' operator from the query string.
Note: Please correct me if I am doing anything wrong.
... pivot (sum(A) for B in (X))
Now B is of datatype VARCHAR2 and X is a string of VARCHAR2 values separated by commas.
The values for X are the distinct values from a column (say CL) of the same table. This way the pivot query was working.
But the problem is that whenever there is a new value in column CL, I have to manually add it to the string X.
I tried replacing X with a SELECT DISTINCT of the values from CL, but the query does not run.
The reason, I felt, was that to replace X we need the values separated by commas.
Then I created a function to return output exactly matching the string X, but the query still doesn't run.
The error messages shown are like "missing right parenthesis", "end-of-file on communication channel", etc.
I tried PIVOT XML instead of plain PIVOT; the query runs but gives values like oraxxx, which are no values at all.
Maybe I am not using it properly.
Can you tell me some method to create a pivot with dynamic values?
You cannot put a dynamic statement in the PIVOT's IN statement without using PIVOT XML, which outputs some less than desirable output. However, you can create an IN string and input it into your statement.
First, here is my sample table:
myNumber myValue myLetter
---------- ---------- --------
1 2 A
1 4 B
2 6 C
2 8 A
2 10 B
3 12 C
3 14 A
First, set up the string to use in your IN statement. Here you are putting the string into "str_in_statement", using COLUMN ... NEW_VALUE and LISTAGG to build it.
clear columns
COLUMN temp_in_statement new_value str_in_statement
SELECT DISTINCT
LISTAGG('''' || myLetter || ''' AS ' || myLetter,',')
WITHIN GROUP (ORDER BY myLetter) AS temp_in_statement
FROM (SELECT DISTINCT myLetter FROM myTable);
Your string will look like:
'A' AS A,'B' AS B,'C' AS C
Now use the String statement in your PIVOT query.
SELECT * FROM
(SELECT myNumber, myLetter, myValue FROM myTable)
PIVOT (Sum(myValue) AS val FOR myLetter IN (&str_in_statement));
Here is the Output:
MYNUMBER A_VAL B_VAL C_VAL
---------- ---------- ---------- ----------
1 2 4
2 8 10 6
3 14 12
There are limitations though. You can only concatenate a string up to 4000 bytes.
You can't put a non-constant string in the IN clause of the pivot clause.
You can use Pivot XML for that.
From documentation:
subquery A subquery is used only in conjunction with the XML keyword.
When you specify a subquery, all values found by the subquery are used
for pivoting
It should look like this:
select xmlserialize(content t.B_XML) from t_aa
pivot xml(
sum(A) for B in(any)
) t;
You can also have a subquery instead of the ANY keyword:
select xmlserialize(content t.B_XML) from t_aa
pivot xml(
sum(A) for B in (select cl from t_bb)
) t;
Here is a sqlfiddle demo
For later readers, here is another solution
https://technology.amis.nl/2006/05/24/dynamic-sql-pivoting-stealing-antons-thunder/
allowing a query like
select * from table( pivot( 'select deptno, job, count(*) c from scott.emp group by deptno,job' ) )
I am not going to answer the exact question the OP asked; instead I will just describe how a dynamic pivot can be done.
Here we have to use dynamic SQL: first retrieve the column values into a variable, then pass that variable into the dynamic SQL.
EXAMPLE
Consider we have a table EMPLOYEE with columns YR and QTY.
If we need to show the values in the column YR as column names, with the values in those columns taken from QTY, then we can use the code below.
declare
sqlqry clob;
cols clob;
begin
select listagg('''' || YR || ''' as "' || YR || '"', ',') within group (order by YR)
into cols
from (select distinct YR from EMPLOYEE);
sqlqry :=
'
select * from
(
select *
from EMPLOYEE
)
pivot
(
MIN(QTY) for YR in (' || cols || ')
)';
execute immediate sqlqry;
end;
/
RESULT
If required, you can also create a temp table and run a SELECT against that temp table to see the results. It's simple: just add CREATE TABLE TABLENAME AS to the above code.
sqlqry :=
'
CREATE TABLE TABLENAME AS
select * from
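Spelled out in full, that variant would look roughly like this (a sketch reusing the cols variable built in the block above; TABLENAME is just a placeholder):
sqlqry :=
  '
  CREATE TABLE TABLENAME AS
  select * from
  (
    select *
    from EMPLOYEE
  )
  pivot
  (
    MIN(QTY) for YR in (' || cols || ')
  )';
execute immediate sqlqry;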
Use a dynamic query. Test code is below:
-- DDL for Table TMP_TEST
--------------------------------------------------------
CREATE TABLE "TMP_TEST"
( "NAME" VARCHAR2(20),
"APP" VARCHAR2(20)
);
/
SET DEFINE OFF;
Insert into TMP_TEST (NAME,APP) values ('suhaib','2');
Insert into TMP_TEST (NAME,APP) values ('suhaib','1');
Insert into TMP_TEST (NAME,APP) values ('shahzad','3');
Insert into TMP_TEST (NAME,APP) values ('shahzad','2');
Insert into TMP_TEST (NAME,APP) values ('shahzad','5');
Insert into TMP_TEST (NAME,APP) values ('tariq','1');
Insert into TMP_TEST (NAME,APP) values ('tariq','2');
Insert into TMP_TEST (NAME,APP) values ('tariq','6');
Insert into TMP_TEST (NAME,APP) values ('tariq','4');
/
CREATE TABLE "TMP_TESTAPP"
( "APP" VARCHAR2(20)
);
SET DEFINE OFF;
Insert into TMP_TESTAPP (APP) values ('1');
Insert into TMP_TESTAPP (APP) values ('2');
Insert into TMP_TESTAPP (APP) values ('3');
Insert into TMP_TESTAPP (APP) values ('4');
Insert into TMP_TESTAPP (APP) values ('5');
Insert into TMP_TESTAPP (APP) values ('6');
/
create or replace PROCEDURE temp_test(
pcursor out sys_refcursor,
PRESULT OUT VARCHAR2
)
AS
V_VALUES VARCHAR2(4000);
V_QUERY VARCHAR2(4000);
BEGIN
PRESULT := 'Nothing';
-- concating activities name using comma, replace "'" with "''" because we will use it in dynamic query so "'" can effect query.
SELECT DISTINCT
LISTAGG('''' || REPLACE(APP,'''','''''') || '''',',')
WITHIN GROUP (ORDER BY APP) AS temp_in_statement
INTO V_VALUES
FROM (SELECT DISTINCT APP
FROM TMP_TESTAPP);
-- designing dynamic query
V_QUERY := 'select *
from ( select NAME,APP
from TMP_TEST )
pivot (count(*) for APP in
(' ||V_VALUES|| '))
order by NAME' ;
OPEN PCURSOR
FOR V_QUERY;
PRESULT := 'Success';
Exception
WHEN OTHERS THEN
PRESULT := SQLcode || ' - ' || SQLERRM;
END temp_test;
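A minimal way to call the procedure and view the results (a sketch using SQL*Plus / SQL Developer bind variables):
VARIABLE rc REFCURSOR
VARIABLE res VARCHAR2(200)
EXEC temp_test(:rc, :res);
PRINT res
PRINT rc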
I used the above method (Anton's PL/SQL custom pivot() function) and it did the job! As I am not a professional Oracle developer, these are the simple steps I followed:
1) Download the zip package and find pivotFun.sql in it.
2) Run pivotFun.sql once to create the new function.
3) Use the function in normal SQL.
Just be careful with dynamic column names. In my environment I found that a column name is limited to 30 characters and cannot contain a single quote. So my query now looks something like this:
SELECT
*
FROM
table(
pivot('
SELECT DISTINCT
P.proj_id,
REPLACE(substr(T.UDF_TYPE_LABEL, 1, 30), '''''''','','') as Attribute,
CASE
WHEN V.udf_text is null and V.udf_date is null and V.udf_number is NOT null THEN to_char(V.udf_number)
WHEN V.udf_text is null and V.udf_date is NOT null and V.udf_number is null THEN to_char(V.udf_date)
WHEN V.udf_text is NOT null and V.udf_date is null and V.udf_number is null THEN V.udf_text
ELSE NULL END
AS VALUE
FROM
project P
LEFT JOIN UDFVALUE V ON P.proj_id = V.proj_id
LEFT JOIN UDFTYPE T ON V.UDF_TYPE_ID = T.UDF_TYPE_ID
WHERE
P.delete_session_id IS NULL AND
T.TABLE_NAME = ''PROJECT''
')
)
Works well with up to 1m records.
It looks like this became possible without extra development effort in Oracle 19c, with the introduction of SQL_MACRO (and possibly Polymorphic Table Functions, which I haven't used yet).
create table t as
select
trunc(level/5) as id
, chr(65+mod(level, 5)) as code
, level as val
from dual
connect by level < 10
create function f_pivot
return varchar2 SQL_MACRO(TABLE)
is
l_codes varchar2(1000);
begin
select listagg(
distinct '''' || code
|| ''' as ' || code, ',')
into l_codes
from t;
return
'select *
from t
pivot (
max(val) for code in (
' || l_codes || '))';
end;
/
select *
from f_pivot()
ID | B | C | D | E | A
-: | -: | -: | -: | -: | ---:
0 | 1 | 2 | 3 | 4 | null
1 | 6 | 7 | 8 | 9 | 5
The only issue (in the case of the SQL_MACRO approach) is that the result set doesn't change its structure during one session:
insert into t
values(1, 'Q', 100);
commit;
select *
from f_pivot()
ID | B | C | D | E | A
-: | -: | -: | -: | -: | ---:
0 | 1 | 2 | 3 | 4 | null
1 | 6 | 7 | 8 | 9 | 5
But in a separate session it works fine:
select dbms_xmlgen.getxml('select * from f_pivot()') as v
from dual
V
<?xml version="1.0"?><ROWSET> <ROW> <ID>0</ID> <B>1</B> <C>2</C> <D>3</D> <E>4</E> </ROW> <ROW> <ID>1</ID> <B>6</B> <C>7</C> <D>8</D> <E>9</E> <A>5</A> <Q>100</Q> </ROW></ROWSET>
Using the WITH FUNCTION feature, a dynamic pivot may be used in place without a predefined function:
with function f_pivot1
return varchar2 SQL_MACRO(TABLE)
is
l_codes varchar2(1000);
begin
select listagg(distinct '''' || code || ''' as ' || code, ',')
into l_codes
from t;
return
'select *
from t
pivot (
max(val) for code in (
' || l_codes || '))';
end;
select *
from f_pivot1()
ID | B | C | D | E | A | Q
-: | -: | -: | -: | -: | ---: | ---:
0 | 1 | 2 | 3 | 4 | null | null
1 | 6 | 7 | 8 | 9 | 5 | 100
db<>fiddle here
You cannot put a dynamic statement in the PIVOT's IN clause without using PIVOT XML, but you can use a small technique to build the dynamic statement for PIVOT. In PL/SQL, within a string literal, two apostrophes equal one apostrophe.
declare
sqlqry clob;
search_ids varchar(256) := '''2016'',''2017'',''2018'',''2019''';
begin
search_ids := concat( search_ids, ',''2020''' ); -- you can append a new search id dynamically as needed
sqlqry :=
'
select * from
(
select *
from EMPLOYEE
)
pivot
(
MIN(QTY) for YR in (' || search_ids || ')
)';
execute immediate sqlqry;
end;
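After the concat, the search_ids string will look like:
'2016','2017','2018','2019','2020'
which is exactly the comma-separated form the PIVOT IN list expects.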
There's no straightforward method for dynamic pivoting in Oracle's SQL unless it returns XML-type results.
For non-XML results, PL/SQL can be used by creating functions with a SYS_REFCURSOR return type.
With Conditional Aggregation
CREATE OR REPLACE FUNCTION Get_Jobs_ByYear RETURN SYS_REFCURSOR IS
v_recordset SYS_REFCURSOR;
v_sql VARCHAR2(32767);
v_cols VARCHAR2(32767);
BEGIN
SELECT LISTAGG( 'SUM( CASE WHEN job_title = '''||job_title||''' THEN 1 ELSE 0 END ) AS "'||job_title||'"' , ',' )
WITHIN GROUP ( ORDER BY job_title )
INTO v_cols
FROM ( SELECT DISTINCT job_title
FROM jobs j );
v_sql :=
'SELECT "HIRE YEAR",'|| v_cols ||
' FROM
(
SELECT TO_NUMBER(TO_CHAR(hire_date,''YYYY'')) AS "HIRE YEAR", job_title
FROM employees e
JOIN jobs j
ON j.job_id = e.job_id
)
GROUP BY "HIRE YEAR"
ORDER BY "HIRE YEAR"';
OPEN v_recordset FOR v_sql;
DBMS_OUTPUT.PUT_LINE(v_sql);
RETURN v_recordset;
END;
/
With PIVOT Clause
CREATE OR REPLACE FUNCTION Get_Jobs_ByYear RETURN SYS_REFCURSOR IS
v_recordset SYS_REFCURSOR;
v_sql VARCHAR2(32767);
v_cols VARCHAR2(32767);
BEGIN
SELECT LISTAGG( ''''||job_title||''' AS "'||job_title||'"' , ',' )
WITHIN GROUP ( ORDER BY job_title )
INTO v_cols
FROM ( SELECT DISTINCT job_title
FROM jobs j );
v_sql :=
'SELECT *
FROM
(
SELECT TO_NUMBER(TO_CHAR(hire_date,''YYYY'')) AS "HIRE YEAR", job_title
FROM employees e
JOIN jobs j
ON j.job_id = e.job_id
)
PIVOT
(
COUNT(*) FOR job_title IN ( '|| v_cols ||' )
)
ORDER BY "HIRE YEAR"';
OPEN v_recordset FOR v_sql;
DBMS_OUTPUT.PUT_LINE(v_sql);
RETURN v_recordset;
END;
/
But there's a drawback with LISTAGG(): ORA-01489: result of string concatenation is too long is raised whenever the concatenated string in the first argument exceeds 4000 characters. In this case, the query returning the value of the v_cols variable might be replaced with the XMLELEMENT() function nested within XMLAGG(), such as
CREATE OR REPLACE FUNCTION Get_Jobs_ByYear RETURN SYS_REFCURSOR IS
v_recordset SYS_REFCURSOR;
v_sql VARCHAR2(32767);
v_cols VARCHAR2(32767);
BEGIN
SELECT RTRIM(DBMS_XMLGEN.CONVERT(
XMLAGG(
XMLELEMENT(e, 'SUM( CASE WHEN job_title = '''||job_title||
''' THEN 1 ELSE 0 END ) AS "'||job_title||'",')
).EXTRACT('//text()').GETCLOBVAL() ,1),',') AS "v_cols"
INTO v_cols
FROM ( SELECT DISTINCT job_title
FROM jobs j);
v_sql :=
'SELECT "HIRE YEAR",'|| v_cols ||
' FROM
(
SELECT TO_NUMBER(TO_CHAR(hire_date,''YYYY'')) AS "HIRE YEAR", job_title
FROM employees e
JOIN jobs j
ON j.job_id = e.job_id
)
GROUP BY "HIRE YEAR"
ORDER BY "HIRE YEAR"';
DBMS_OUTPUT.put_line(LENGTH(v_sql));
OPEN v_recordset FOR v_sql;
RETURN v_recordset;
END;
/
unless the upper limit of 32767 for the VARCHAR2 type is exceeded. This last method can also be applied to databases with versions prior to Oracle 11g Release 2, as they don't have the LISTAGG() function.
By the way, if the database version is 12.2+, the LISTAGG() function can still be used to build v_cols even for very long concatenated strings without getting an ORA-01489 error, by truncating the trailing part of the string with the ON OVERFLOW TRUNCATE clause, such as
LISTAGG( <concatenated string>,',' ON OVERFLOW TRUNCATE 'THE REST IS TRUNCATED' WITHOUT COUNT )
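Applied to the v_cols query in the PIVOT-clause version above, that would look roughly like this (a sketch for 12.2+):
SELECT LISTAGG( ''''||job_title||''' AS "'||job_title||'"' , ','
                ON OVERFLOW TRUNCATE 'THE REST IS TRUNCATED' WITHOUT COUNT )
       WITHIN GROUP ( ORDER BY job_title )
  INTO v_cols
  FROM ( SELECT DISTINCT job_title
         FROM jobs j );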
The function can be invoked as
VAR rc REFCURSOR
EXEC :rc := Get_Jobs_ByYear;
PRINT rc
from SQL Developer's command line
or
BEGIN
:result := Get_Jobs_ByYear;
END;
from the Test window of PL/SQL Developer in order to get the result set.
Demo for generated queries
You can dynamically pivot data in a single SQL statement with the open source program Method4.Pivot.
After installing the package, call the function and pass in a SQL statement as a string. The last column of your SQL statement defines the values, and the second-to-last column defines the column names. The default aggregation function is MAX, which works well for common entity-attribute-value queries like this one:
select * from table(method4.pivot(
q'[
select 'A' name, 1 value from dual union all
select 'B' name, 2 value from dual union all
select 'C' name, 3 value from dual
]'
));
A B C
- - -
1 2 3
The program also supports different aggregation functions through the parameter P_AGGREGATE_FUNCTION, and allows for a custom column name order if you add a column named PIVOT_COLUMN_ID.
The package uses an Oracle Data Cartridge approach similar to Anton's pivot, but Method4.Pivot has several important advantages:
Regular open source program with a repo, installation instructions, license, unit tests, documentation, and comments - not just a Zip file on a blog.
Handles unusual column names.
Handles unusual data types, like floats.
Handles up to 1000 columns.
Provides meaningful error messages for common mistakes.
Handles NULL column names.
Handles 128-character column names.
Prevents misleading implicit conversion.
Hard-parses statements each time to catch underlying table changes.
But most users are still better off creating a dynamic pivot at the application layer or with the pivot XML option.
Say I have several tables that all start with 'PLAYER_' and I am trying to loop through all of those tables to get the table names, and then loop again to get the value of a column in all of these tables.
This column exists in all tables so I want to use nested FOR loops to achieve that.
Here is what I have so far but it does not seem to work:
DECLARE
LOG_ID NUMBER;
TBL_NME VARCHAR2(30);
V_STRNG VARCHAR2(4000);
BEGIN
FOR i IN (SELECT TABLE_NAME FROM USER_TABLES WHERE TABLE_NAME LIKE 'PLAYER_%') LOOP
TBL_NME := i.TABLE_NAME;
DBMS_OUTPUT.PUT_LINE('TABLE EXTRACTED IS ' || TBL_NME);
FOR j IN(SELECT LOG_ID FROM i.TABLE_NAME) LOOP
V_EXEC_OBJ_STRNG := 'SELECT LOG_ID FROM ' || i.TABLE_NAME;
EXECUTE IMMEDIATE V_STRNG INTO LOG_ID;
DBMS_OUTPUT.PUT_LINE('LOG_ID IS ' || LOG_ID || ' FOR TABLE ' || i.TABLE_NAME);
END LOOP;
END LOOP;
END;
/
You can probably get away with just one loop ...
Example
create table player_01 ( id, name )
as
select level, dbms_random.string( 'x', 25 )
from dual
connect by level <= 10 ;
create table player_02 ( id, name )
as
select level, dbms_random.string( 'x', 25 )
from dual
connect by level <= 11 ;
create table player_03 ( id, name )
as
select level, dbms_random.string( 'x', 25 )
from dual
connect by level <= 12 ;
Anonymous block:
-- find all relevant tables and retrieve the highest id values
declare
logid number := 0 ;
tablename varchar2( 30 ) := '' ;
v_string varchar2( 4000 ) := '' ;
begin
for r in (
select table_name from user_tables
where table_name like 'PLAYER%'
order by table_name
) loop
-- dbms_output.put_line( ' current table -> ' || r.table_name ) ;
v_string := 'select max( id ) as logid from ' || r.table_name;
execute immediate v_string into logid ;
dbms_output.put_line( 'log id is ' || logid || ' for table ' || r.table_name ) ;
end loop ;
end ;
/
-- result
log id is 10 for table PLAYER_01
log id is 11 for table PLAYER_02
log id is 12 for table PLAYER_03
Dbfiddle here.
According to your comment, there are several LOGIDs in each PLAYER_ table. Maybe the following example is closer to the "real thing". (And: the anonymous block has nested loops; tested with Oracle 12c and 11g, dbfiddle here.)
Tables
create table player_01 ( id, details, logid )
as
select level, dbms_random.string( 'x', 25 ), abs( dbms_random.random() )
from dual
connect by level <= 3 ;
create table player_02 ( id, details, logid )
as
select level, dbms_random.string( 'x', 25 ), abs( dbms_random.random() )
from dual
connect by level <= 4 ;
create table player_03 ( id, details, logid )
as
select level, dbms_random.string( 'x', 25 ), abs( dbms_random.random() )
from dual
connect by level <= 4 ;
Sample data in PLAYER_01 / PLAYER_02 / PLAYER_03
select * from player_01 ;
ID DETAILS LOGID
1 VZAQXPFCQK3U2F0RL32I31N40 699945134
2 32QWFFMUCF1DL6E3Z5QM4DSWY 1635628934
3 48GWBETOLUSDEFA3SMY061NUO 1237793316
select * from player_02;
ID DETAILS LOGID
1 HS827U4VCY853N8DKTI98J82D 1993524164
2 XLYS0XPJG0IQP4BNKDQ0ZITPA 1665941353
3 DWVVR5O6N5T1HP5MDYHVH3NZJ 1129581845
4 L7N8HCPVTHP466WJ5TCQ04YHE 794237444
select * from player_03;
ID DETAILS LOGID
1 SYVX5G2FE5IC1MI6TCSAHNOUU 720476135
2 4IQZIG6DAUCWW3APJY5OZ63TF 287457960
3 525NMZFVGLWKIT7EIFA41C8MB 784891618
4 0XHJXV2O4TCQQSITOTIQCO3AA 1578737054
Anonymous block
declare
logid number := 0 ;
tablename varchar2( 30 ) := '' ;
v_string1 varchar2( 4000 ) := '' ;
v_string2 varchar2( 4000 ) := '' ;
rowcount number := 0 ;
begin
for r in (
select table_name from user_tables
where table_name like 'PLAYER%'
order by table_name
) loop
v_string1 := 'select count(*) from ' || r.table_name ;
execute immediate v_string1 into rowcount ;
dbms_output.put_line( rowcount ) ;
for rn in 1 .. rowcount
loop
-- dbms_output.put_line( rn ) ;
v_string2 := 'select logid from ( '
|| 'select logid, row_number() over ( order by id ) rn '
|| ' from ' || r.table_name || ' )'
|| ' where rn = ' || rn;
-- dbms_output.put_line( v_string2 ) ;
execute immediate v_string2 into logid ;
dbms_output.put_line( 'log id is ' || logid || ' for table ' || r.table_name ) ;
end loop ;
end loop ;
end ;
/
dbms_output:
3
log id is 699945134 for table PLAYER_01
log id is 1635628934 for table PLAYER_01
log id is 1237793316 for table PLAYER_01
4
log id is 1993524164 for table PLAYER_02
log id is 1665941353 for table PLAYER_02
log id is 1129581845 for table PLAYER_02
log id is 794237444 for table PLAYER_02
4
log id is 720476135 for table PLAYER_03
log id is 287457960 for table PLAYER_03
log id is 784891618 for table PLAYER_03
log id is 1578737054 for table PLAYER_03
The second query string (v_string2) looks a bit like this (maybe a bit easier to read than all the string parts and ||):
select logid
from (
select
logid
, row_number() over ( order by id ) rn
from player_01
) where rn = 1
;
-- query result
LOGID
1338793259
Query in the inner loop
(answering the question in your comment)
The subquery uses row_number() - see documentation:
"ROW_NUMBER is an analytic function. It assigns a unique number to
each row to which it is applied (either each row in the partition or
each row returned by the query), in the ordered sequence of rows
specified in the order_by_clause, beginning with 1."
We are using this to get consecutive numbers, numbering the LOGIDs as it were. Then, we use the RN values in the WHERE clause (of the outer select), and compare them to the inner FOR loop's "rn" value.
select
logid
, row_number() over ( order by id ) rn
from player_01 ;
-- result
LOGID RN
1775991812 1
262095022 2
2090118607 3