PL/SQL query IN comma-delimited string - Oracle

I am developing an application in Oracle APEX. I have a comma-delimited string of user IDs that looks like this:
45,4932,20,19
This string is stored as
:P5_USER_ID_LIST
I want a query that will find all users within this list. My query looks like this:
SELECT * FROM users u WHERE u.user_id IN (:P5_USER_ID_LIST);
I keep getting an Oracle error: "invalid number". However, if I hard-code the string into the query, it works, like this:
SELECT * FROM users u WHERE u.user_id IN (45,4932,20,19);
Anyone know why this might be an issue?

A bind variable binds a value, in this case the string '45,4932,20,19'. You could use dynamic SQL and concatenation as suggested by Randy, but you would need to be very careful that the user is not able to modify this value, otherwise you have a SQL Injection issue.
A safer route would be to put the IDs into an Apex collection in a PL/SQL process:
declare
  array apex_application_global.vc_arr2;
begin
  -- split the comma-delimited item value into a PL/SQL array
  array := apex_util.string_to_table (:P5_USER_ID_LIST, ',');
  -- load the array into a named APEX collection
  apex_collection.create_or_truncate_collection ('P5_ID_COLL');
  apex_collection.add_members ('P5_ID_COLL', array);
end;
Then change your query to:
SELECT * FROM users u WHERE u.user_id IN
(SELECT c001 FROM apex_collections
WHERE collection_name = 'P5_ID_COLL')

An easier solution is to use instr:
SELECT * FROM users u
WHERE instr(',' || :P5_USER_ID_LIST || ',', ',' || u.user_id || ',', 1) != 0;
The tricks:
',' || :P5_USER_ID_LIST || ','
turns your string into ,45,4932,20,19,
',' || u.user_id || ','
wraps each ID, e.g. ,32,, so that 32 does not match the 32 inside ,4932,

I have faced this situation several times and here is what I've used:
SELECT *
FROM users u
WHERE ','||to_char(:P5_USER_ID_LIST)||',' like '%,'||to_char(u.user_id)||',%'
I've used the LIKE operator, but you must be a little careful of one aspect here: the value being compared must end up as ",45,4932,20,19," so that LIKE matches an exact number such as ",45,".
When used like this, the select will not mistake, say, 5 for 15, 155 or 55.
Try it out and let me know how it goes;)
Cheers,
Alex

Create a native query rather than using "createQuery/createNamedQuery"

The reason this is an issue is that you cannot just bind an IN list the way you want, and just about everyone makes this mistake at least once as they are learning Oracle (and probably SQL!).
When you bind the string '32,64,128', it effectively becomes a query like:
select ...
from t
where t.c1 in ('32,64,128')
To Oracle this is totally different to:
select ...
from t
where t.c1 in (32,64,128)
The first example has a single string value in the IN list and the second has three numbers in the IN list. The reason you get an invalid number error is because Oracle attempts to cast the string '32,64,128' into a number, which it cannot do due to the commas in the string.
A variation of this "how do I bind an IN list" question has come up on here quite a few times recently.
Generically, and without resorting to any PL/SQL, worrying about SQL injection, or failing to bind the query correctly, you can use this trick:
with bound_inlist
as
(
select
substr(txt,
instr (txt, ',', 1, level ) + 1,
instr (txt, ',', 1, level+1) - instr (txt, ',', 1, level) -1 )
as token
from (select ','||:txt||',' txt from dual)
connect by level <= length(:txt)-length(replace(:txt,',',''))+1
)
select *
from bound_inlist a, users u
where a.token = u.id;

If possible, the best idea may be not to store your user IDs as CSV at all! Put them in a table or, failing that, an array etc. You cannot bind a CSV field as a number.
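A sketch of that idea, assuming you are on APEX 5.0 or later where apex_string.split is available (check your version):
SELECT *
FROM users u
WHERE u.user_id IN (SELECT to_number(column_value)
                    FROM table(apex_string.split(:P5_USER_ID_LIST, ',')));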

Please don't use WHERE ','||to_char(:P5_USER_ID_LIST)||',' like '%,'||to_char(u.user_id)||',%', because it forces a full table scan. With the users table you may not have that many rows, so the impact will be low, but against other tables in an enterprise environment this is a problem.
EDIT: I have put together a script to demonstrate the differences between the regex method and the wildcard LIKE method. Not only is regex faster, it's also a lot more robust.
-- Create table
create table CSV_TEST
(
NUM NUMBER not null,
STR VARCHAR2(20)
);
create sequence csv_test_seq;
begin
for j in 1..10 loop
for i in 1..500000 loop
insert into csv_test( num, str ) values ( csv_test_seq.nextval, to_char( csv_test_seq.nextval ));
end loop;
commit;
end loop;
end;
/
-- Create/Recreate primary, unique and foreign key constraints
alter table CSV_TEST
add constraint CSV_TEST_PK primary key (NUM)
using index ;
alter table CSV_TEST
add constraint CSV_TEST_FK unique (STR)
using index;
select sysdate from dual;
select *
from csv_test t
where t.num in ( Select Regexp_Substr('100001, 100002, 100003 , 100004, 100005','[^,]+', 1, Level) From Dual
Connect By Regexp_Substr('100001, 100002,100003, 100004, 100005', '[^,]+', 1, Level) Is Not Null);
select sysdate from dual;
select *
from csv_test t
where ('%,' || '100001,100002, 100003, 100004 ,100005' || ',%') like '%,' || num || ',%';
select sysdate from dual;
select *
from csv_test t
where t.num in ( Select Regexp_Substr('100001, 100002, 100003 , 100004, 100005','[^,]+', 1, Level) From Dual
Connect By Regexp_Substr('100001, 100002,100003, 100004, 100005', '[^,]+', 1, Level) Is Not Null);
select sysdate from dual;
select *
from csv_test t
where ('%,' || '100001,100002, 100003, 100004 ,100005' || ',%') like '%,' || num || ',%';
select sysdate from dual;
drop table csv_test;
drop sequence csv_test_seq;

The solution from Tony Andrews works for me. The process should be added to "Page Processing" >> "After Submit" >> "Processes".

As you are storing the user IDs as a string, you can match against that string with LIKE, as below:
SELECT * FROM users u WHERE :P5_USER_ID_LIST LIKE '%'||u.user_id||'%'
For example, with
:P5_USER_ID_LIST = 45,4932,20,19
the query will return every user whose ID appears in the list.
Note that without comma delimiters this also matches partial IDs (5 would match 45), so the comma-wrapped versions in the other answers are safer.

You will need to run this as dynamic SQL.
Build the entire statement as a string, then run it dynamically.
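A minimal sketch of that approach (only safe if you have first validated that the list contains nothing but digits and commas, otherwise you are open to SQL injection):
declare
  l_cursor sys_refcursor;
  l_sql    varchar2(4000);
begin
  -- concatenate the validated list straight into the statement text
  l_sql := 'select * from users u where u.user_id in (' || :P5_USER_ID_LIST || ')';
  open l_cursor for l_sql;
  -- fetch from l_cursor as needed
end;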

Related

Find a value by any column in a table

I have columns in a table, something like:
answeremail,
answertime,
ata,
atacomment,
ataid,
atainternalnumber,
atanumber,
atatype,
author,
becomeatafromdeviation,
becomeexternalatafrominternal,
becomefastfromothertype,
briefdescription,
city,
client_answer_attachment,
clientcomment,
confirmstatus,
created_at,
deviation,
deviationnumber,
deviationtype,
duedate,
emailsent,
financeid,
forfortnox,
fromfortnox,
is_deleted,
locked,
name,
parentata,
paymenttype,
pdfurl,
projectid,
quantity,
reason,
revisiondate,
startdate,
status,
street,
suggestion,
token,
type,
unit,
userid,
zip
I want to create a SELECT statement that retrieves some of these columns without specifying column names, something like this:
SELECT * FROM ata WHERE 'field' = 'argument'
Is there any solution for this, or do I need to specify all those columns in the SELECT statement?
Not (just) in SELECT, but in WHERE. Otherwise, how will the query know which columns to check?
But - beware of datatypes. Oracle will try to implicitly convert one datatype to another. Sometimes, it'll succeed (e.g. to convert number 1 to string '1'), sometimes it'll fail (e.g. convert string 'A' to number or string 'AB23F' to date value).
Therefore, although you can try to write a query which will write the query for you, it might take some time to actually make it work properly. PL/SQL is probably what you'll end up with.
An example which checks all tables in my schema that contain a column named PHONE_NUMBER and searches for a row whose phone number contains 654. This script returns tables that contain such a value; you'd return something else - list of columns? All columns? Can't tell.
That's just a starting point. Good luck!
DECLARE
l_str VARCHAR2(500);
l_cnt NUMBER := 0;
BEGIN
FOR cur_r IN (SELECT u.table_name, u.column_name
FROM user_tab_columns u, user_tables t
WHERE u.table_name = t.table_name
AND u.column_name = 'PHONE_NUMBER'
)
LOOP
l_str := 'SELECT COUNT(*) FROM ' || cur_r.table_name ||
' WHERE ' || cur_r.column_name || ' like (''%654%'')';
EXECUTE IMMEDIATE (l_str) INTO l_cnt;
IF l_cnt > 0 THEN
dbms_output.put_line(l_cnt ||' : ' || cur_r.table_name);
END IF;
END LOOP;
END;
There is one short solution using XML:
https://dbfiddle.uk/?rdbms=oracle_18&fiddle=e5efc164b77a55b2ecf5d4bf85d589a5
select/*+ no_xml_query_rewrite */ *
from
xmltable(
'for $row in ora:view("SAMPLE_DATA")
let $col := $row/ROW/*[text() eq $search]
where $col ne ""
return element R {
attribute X {$col},
$row
}'
passing
'BB' as "search"
columns
found_col varchar2(30) path '#X'
,row_data xmltype path '.'
);
Results:
FOUND_COL ROW_DATA
------------------------------ ----------------------------------------------------------------------------------------------------
BB <R X="BB"><ROW><COLA>10A</COLA><COLB>BB</COLB><COLC>00</COLC></ROW></R>
BB <R X="BB"><ROW><COLA>10A</COLA><COLB>BB</COLB><COLC>10</COLC></ROW></R>
BB <R X="BB"><ROW><COLA>10A</COLA><COLB>BB</COLB><COLC>20</COLC></ROW></R>
BB <R X="BB"><ROW><COLA>10A</COLA><COLB>BB</COLB><COLC>30</COLC></ROW></R>
BB <R X="BB"><ROW><COLA>10A</COLA><COLB>BB</COLB><COLC>40</COLC></ROW></R>
From your description, I think you want to retrieve only the columns specified by the WHERE clause and omit the other columns.
I don't think this is possible with a plain SQL statement. In the Oracle documentation, in the select_list section, I can't see any way to do it.
See https://docs.oracle.com/en/database/oracle/oracle-database/19/sqlrf/SELECT.html#GUID-CFA006CA-6FF1-4972-821E-6996142A51C6
If you really want this, you can use PL/SQL.
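A hypothetical sketch of that idea, assuming the table is ATA and you want to know which VARCHAR2 columns hold a given value (the value 'argument' and the datatype filter are assumptions for illustration):
declare
  l_cnt number;
begin
  for c in (select column_name
              from user_tab_columns
             where table_name = 'ATA'
               and data_type = 'VARCHAR2')
  loop
    -- build and run one count query per column
    execute immediate
      'select count(*) from ata where ' || c.column_name || ' = :val'
      into l_cnt using 'argument';
    if l_cnt > 0 then
      dbms_output.put_line(c.column_name || ' contains the value');
    end if;
  end loop;
end;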

Oracle subquery not working in IN condition

I need help.
I have records 123,456,789 stored in rows. When I execute the following, it works:
select * from table1 where num1 in('123','456')
But when I execute
select * from table1 where num1 in(select value from table2)
no result set is found. Why?
Check the data type: VARCHAR2 or NUMBER. Try
select * from table1 where num1 in(select to_char(value) from table2)
Storing comma-separated values could be the cause of the problem.
You can try using regexp_substr to split on the commas.
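A sketch of that approach, assuming table2 holds a single comma-separated string in value:
select *
from table1
where trim(num1) in (select trim(regexp_substr(t2.value, '[^,]+', 1, level))
                     from table2 t2
                     connect by regexp_substr(t2.value, '[^,]+', 1, level) is not null);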
First and foremost, an important thing to remember: Do not store numbers in character datatypes. Use NUMBER or INTEGER. Secondly, always prefer VARCHAR2 datatype over CHAR if you wish to store characters > 1.
You said in one of your comments that the num1 column is of type char(4). The problem with the CHAR datatype is that if your string is 3 characters wide, it is stored with one extra space character to make it 4 characters. VARCHAR2 stores only as many characters as you pass while inserting/updating and is not blank-padded.
To verify that you may run select length(any_char_col) from t;
Coming to your problem, the IN condition is never satisfied because what's actually being compared is
WHERE 'abc ' = 'abc' - note the extra space in the left-hand operand.
To fix this, one good option is to pad the right-hand expression with as many spaces as required to make the comparison work. The function RPAD( string1, padded_length [, pad_string] ) can be used for this purpose. So your query should look something like this:
select * from table1 where num1 IN (select rpad(value,4) from table2);
This will likely utilise an index on the column num1 if it exists.
The other option is to use RTRIM on the LHS, which is only useful if there's a function-based index on RTRIM(num1):
select * from table1 where RTRIM(num1) in(select value from table2);
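Such an index would look something like this (the index name is just an illustration):
create index table1_rtrim_num1_idx on table1 (rtrim(num1));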
So, the takeaway from all these examples is always use NUMBER types to store numbers and prefer VARCHAR2 over CHAR for strings.
See Demo to fully understand what's happening.
EDIT: It seems you are storing comma-separated numbers. You could do something like this:
SELECT *
FROM table1 t1
WHERE EXISTS (
SELECT 1
FROM table2 t2
WHERE ',' ||t2.value|| ',' LIKE '%,' || rtrim(t1.num1) || ',%'
);
See Demo2
Storing comma-separated values is bound to cause problems; better to change it.
Let me tell you first: you have stored values in table2 that are comma-separated.
So how could you match your data between table1 and table2? It is not possible directly.
That's why you did not get any values in the result set.
I found a solution using a string array:
SELECT T.* FROM TABLE1 T,
(SELECT TRIM(VALUE)AS VAL FROM TABLE2)TABLE2
WHERE
TRIM(NUM1) IN (SELECT COLUMN_VALUE FROM TABLE(FUNC_GETSTRING_ARRAY(TABLE2.VAL)))
thanks

How do I display a field's hidden characters in the result of a query in Oracle?

I have two rows that have a varchar column that are different according to a Java .equals(). I can't easily change or debug the Java code that's running against this particular database but I do have access to do queries directly against the database using SQLDeveloper. The fields look the same to me (they are street addresses with two lines separated by some new line or carriage feed/new line combo).
Is there a way to see all of the hidden characters as the result of a query? I'd like to avoid having to use the ascii() function with substr() on each of the rows to figure out which hidden character is different.
I'd also accept some query that shows me which character is the first difference between the two fields.
Try
select dump(column_name) from table
More information is in the documentation.
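For example, to make an embedded CR/LF visible (a small illustration, not from the original data):
select dump('a' || chr(13) || chr(10) || 'b') from dual;
-- output looks something like: Typ=1 Len=4: 97,13,10,98 (the 13,10 is the CR/LF pair)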
As for finding the position where the character differs, this might give you an idea:
create table tq84_compare (
id number,
col varchar2(20)
);
insert into tq84_compare values (1, 'hello world');
insert into tq84_compare values (2, 'hello' || chr(9) || 'world');
with c as (
select
(select col from tq84_compare where id = 1) col1,
(select col from tq84_compare where id = 2) col2
from
dual
),
l as (
select
level l from dual
start with 1=1
connect by level < (select length(c.col1) from c)
)
select
max(l.l) + 1 position
from c,l
where substr(c.col1,1,l.l) = substr(c.col2,1,l.l);
SELECT DUMP('€ÁÑ', 1016)
FROM DUAL
... will print something like:
Typ=96 Len=3 CharacterSet=WE8MSWIN1252: 80,c1,d1

Oracle: Using Pseudo column value in the same Select statement

I have a scenario in Oracle where I need to reuse the value of a pseudo column that was calculated earlier within the same select statement, something like:
select 'output1' process, process || '-Output2' from Table1
I don't want to repeat the first column's logic in the second column, for maintenance purposes. Currently it is done as
select 'output1' process, 'output1' || '-Output2' name from Table1
Since I have 4 such columns which depend on the previous column's output, repeating the logic would be a maintenance nightmare.
Edit: I included a table name and removed dual so that no one assumes this is a trivial query; my actual statement has 2 to 3 joins on different tables.
You could calculate the value in a sub-query:
select calculated_output process, calculated_output || '-Output2' name from
(
select 'output1' calculated_output from dual
)
You can also use the subquery factoring syntax (the WITH clause) - I think it is more readable myself:
with tmp as
(
select 'output' process from dual
)
select
process, process || '-Output2' from tmp;
Very similar to the previous poster, but I prefer the method shown below; it is more readable and thus more supportable, in my opinion:
select val1 || val2 || val3 as val4, val1 from (
select 'output1' as val1, '-output2' as val2, '-output3' as val3 from dual
)

How to put more than 1000 values into an Oracle IN clause [duplicate]

This question already has answers here:
SQL IN Clause 1000 item limit
(5 answers)
Closed 8 years ago.
Is there any way to get around the Oracle 10g limitation of 1000 items in a static IN clause? I have a comma-delimited list of many IDs that I want to use in an IN clause. Sometimes this list can exceed 1000 items, at which point Oracle throws an error. The query is similar to this...
select * from table1 where ID in (1,2,3,4,...,1001,1002,...)
Put the values in a temporary table and then do a select where id in (select id from temptable)
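A sketch of that idea, assuming you create the global temporary table once (all names are illustrative):
create global temporary table temp_ids (id number) on commit delete rows;

-- per request: load the IDs, then filter with them
insert into temp_ids (id) values (1);
insert into temp_ids (id) values (2);
select * from table1 where id in (select id from temp_ids);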
The 1000-expression limit only applies to a list of single values; a list of (value, value) pairs is not limited the same way, so you can rewrite the IN as a multi-column comparison against a dummy constant:
select column_X, ... from my_table
where ('magic', column_X ) in (
('magic', 1),
('magic', 2),
('magic', 3),
('magic', 4),
...
('magic', 99999)
) ...
I am almost sure you can split values across multiple INs using OR:
select * from table1 where ID in (1,2,3,4,...,1000) or
ID in (1001,1002,...,2000)
You may try to use the following form:
select * from table1 where ID in (1,2,3,4,...,1000)
union all
select * from table1 where ID in (1001,1002,...)
Where do you get the list of ids from in the first place? Since they are IDs in your database, did they come from some previous query?
When I have seen this in the past it has been because:
a reference table is missing and the correct way would be to add the new table, put an attribute on that table and join to it
a list of ids is extracted from the database, and then used in a subsequent SQL statement (perhaps later or on another server or whatever). In this case, the answer is to never extract it from the database. Either store in a temporary table or just write one query.
I think there may be better ways to rework this code that just getting this SQL statement to work. If you provide more details you might get some ideas.
Use ...from table(... :
create or replace type numbertype
as object
(nr number(20,10) )
/
create or replace type number_table
as table of numbertype
/
create or replace procedure tableselect
( p_numbers in number_table
, p_ref_result out sys_refcursor)
is
begin
open p_ref_result for
select *
from employees , (select /*+ cardinality(tab 10) */ tab.nr from table(p_numbers) tab) tbnrs
where id = tbnrs.nr;
end;
/
This is one of the rare cases where you need a hint, else Oracle will not use the index on column id. One of the advantages of this approach is that Oracle doesn't need to hard-parse the query again and again. Using a temporary table is usually slower.
Edit 1: simplified the procedure (thanks to jimmyorr) and added an example.
create or replace procedure tableselect
( p_numbers in number_table
, p_ref_result out sys_refcursor)
is
begin
open p_ref_result for
select /*+ cardinality(tab 10) */ emp.*
from employees emp
, table(p_numbers) tab
where tab.nr = id;
end;
/
Example:
set serveroutput on
create table employees ( id number(10),name varchar2(100));
insert into employees values (3,'Raymond');
insert into employees values (4,'Hans');
commit;
declare
l_number number_table := number_table();
l_sys_refcursor sys_refcursor;
l_employee employees%rowtype;
begin
l_number.extend;
l_number(1) := numbertype(3);
l_number.extend;
l_number(2) := numbertype(4);
tableselect(l_number, l_sys_refcursor);
loop
fetch l_sys_refcursor into l_employee;
exit when l_sys_refcursor%notfound;
dbms_output.put_line(l_employee.name);
end loop;
close l_sys_refcursor;
end;
/
This will output:
Raymond
Hans
I wound up here looking for a solution as well.
Depending on the high-end number of items you need to query against, and assuming your items are unique, you could split your query into batches of 1000 items and combine the results on your end instead (pseudocode here):
//remove dupes
items = items.RemoveDuplicates();
//break the items into batches of 1000 items
batches = new batch list;
batch = new batch;
for (int i = 0; i < items.Count; i++)
{
    batch.Add(items[i]);
    if (batch.Count == 1000)
    {
        batches.Add(batch);
        batch = new batch; //start a fresh batch; clearing the one just added would empty it inside batches too
    }
}
if (batch.Count > 0)
{
    //add the final batch (it has < 1000 items)
    batches.Add(batch);
}
// now go query the db for each batch
results = new results;
foreach(batch in batches)
{
    results.Add(query(batch));
}
This may be a good trade-off in the scenario where you don't typically have over 1000 items - as having over 1000 items would be your "high end" edge-case scenario. For example, in the event that you have 1500 items, two queries of (1000, 500) wouldn't be so bad. This also assumes that each query isn't particularly expensive in its own right.
This wouldn't be appropriate if your typical number of expected items got to be much larger - say, in the 100000 range - requiring 100 queries. If so, then you should probably look more seriously into using the global temporary tables solution provided above as the most "correct" solution. Furthermore, if your items are not unique, you would need to resolve duplicate results in your batches as well.
Yes, a very weird situation for Oracle.
If you specify 2000 IDs inside the IN clause, it will fail.
This fails:
select ...
where id in (1,2,....2000)
But if you simply put the 2000 IDs in another table (a temp table, for example), the query below will work:
select ...
where id in (select userId
from temptable_with_2000_ids )
Alternatively, you could split the IDs into groups of 1000 and execute them group by group.
Here is some Perl code that tries to work around the limit by creating an inline view and then selecting from it. The statement text is compressed by using rows of twelve items each instead of selecting each item from DUAL individually, then uncompressed by unioning together all columns. UNION or UNION ALL in decompression should make no difference here as it all goes inside an IN which will impose uniqueness before joining against it anyway, but in the compression, UNION ALL is used to prevent a lot of unnecessary comparing. As the data I'm filtering on are all whole numbers, quoting is not an issue.
#
# generate the innards of an IN expression with more than a thousand items
#
use English '-no_match_vars';
sub big_IN_list{
@_ < 13 and return join ', ',@_;
my $padding_required = (12 - (@_ % 12)) % 12;
# get first dozen and make length of @_ an even multiple of 12
my ($a,$b,$c,$d,$e,$f,$g,$h,$i,$j,$k,$l) = splice @_,0,12, ( ('NULL') x $padding_required );
my @dozens;
local $LIST_SEPARATOR = ', '; # how to join elements within each dozen
while(@_){
push @dozens, "SELECT @{[ splice @_,0,12 ]} FROM DUAL"
};
$LIST_SEPARATOR = "\n union all\n "; # how to join @dozens
return <<"EXP";
WITH t AS (
select $a A, $b B, $c C, $d D, $e E, $f F, $g G, $h H, $i I, $j J, $k K, $l L FROM DUAL
union all
@dozens
)
select A from t union select B from t union select C from t union
select D from t union select E from t union select F from t union
select G from t union select H from t union select I from t union
select J from t union select K from t union select L from t
EXP
}
One would use that like so:
my $bases_list_expr = big_IN_list(list_your_bases());
$dbh->do(<<"UPDATE");
update bases_table set belong_to = 'us'
where id in ($bases_list_expr)
UPDATE
Instead of using an IN clause, can you try using a JOIN with the other table that supplies the IDs? That way we don't need to worry about the limit. Just a thought from my side.
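A sketch of that suggestion (table and column names are illustrative):
select t1.*
from table1 t1
join temptable_with_ids s on s.id = t1.id;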
Instead of SELECT * FROM table1 WHERE ID IN (1,2,3,4,...,1000);
Use this:
SELECT * FROM table1 WHERE ID IN (SELECT rownum AS ID FROM dual CONNECT BY level <= 1000);
*Note that you need to be sure the IDs do not refer to any foreign IDs if there is such a dependency. To ensure that only existing IDs are used:
SELECT * FROM table1 WHERE ID IN (SELECT distinct(ID) FROM tablewhereidsareavailable);
Cheers
