Is there a way to search for a pattern string in all columns of all tables in an Oracle database?
The search should be case-insensitive.
For example, I'd like to find in which tables/columns the string 'alpha' appears.
Values such as 'alphabeth', '001alpha' and 'alpha001' should also be returned.
Does anyone have an idea how to write a query like this?
Thanks a lot in advance.
Luis
A query? Surely not.
A PL/SQL procedure can do it, using the metadata found in USER_TAB_COLUMNS/ALL_TAB_COLUMNS and EXECUTE IMMEDIATE to run dynamic queries with INSTR or LIKE (or COLLATE BINARY_AI / BINARY_CI for case- or accent-insensitive matching).
("All columns": no, only the ones of string types: CLOB, CHAR, VARCHAR2, NVARCHAR2, ... BLOB may be more annoying because some applications store JSON there, but there is no way to know without knowledge of the application, or maybe by checking for an IS JSON constraint on the column.)
But seriously, what could be the business need for such a requirement? Forensics?
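A minimal sketch of that approach, assuming the search is limited to the current schema and hard-coding the pattern 'alpha' (UPPER/LIKE is used for case-insensitivity instead of COLLATE so it also works before 12c):
DECLARE
  l_hits PLS_INTEGER;
BEGIN
  FOR c IN (SELECT table_name, column_name
              FROM user_tab_columns
             WHERE data_type IN ('CHAR', 'VARCHAR2', 'NVARCHAR2', 'CLOB')) LOOP
    -- One dynamic query per string column; UPPER() makes the match case-insensitive
    EXECUTE IMMEDIATE
      'SELECT COUNT(*) FROM "' || c.table_name || '" WHERE UPPER("' || c.column_name || '") LIKE ''%ALPHA%'''
      INTO l_hits;
    IF l_hits > 0 THEN
      DBMS_OUTPUT.PUT_LINE(c.table_name || '.' || c.column_name || ': ' || l_hits || ' row(s)');
    END IF;
  END LOOP;
END;
/
Expect it to be slow: it full-scans every string column of every table in the schema.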
I need a way to temporarily store and use multiple values returned from an Oracle query. In SQL Server, I stored my values in a temp table, did my work, then dropped the table. I'm discovering the Oracle equivalent isn't as clear cut.
Here's a SQL Server example of what I'm trying to do:
select id into #temp from SomeTable where SomeColumn = 'Some Value'
:
(do whatever I need to do with #temp data)
:
drop table #temp
I can code my way around SQL Server pretty well, but am almost clueless when it comes to Oracle syntax. I've been reading various Oracle references, and they haven't been very helpful. I did read that Oracle temp tables work differently than SQL Server's, and are often not recommended.
I'm looking into the temp table route, but if there's a better way to do this that doesn't use temp tables, I'm all ears. Anyone know a better way to do this in Oracle?
Thanks in advance.
As with many things, that depends. It depends on how much data you'll be retrieving and how you want to use it. If you don't have too much data to work with ("too much" meaning, oh, say, more than a couple thousand rows), you want to manipulate the data procedurally (i.e. in a PL/SQL procedure or script), AND you don't want to access it using DML (i.e. you don't want to say something like SELECT * FROM your_temp_data...), then loading the data into a PL/SQL collection, as @EgorSkriptunoff mentions above, might be a workable solution.
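A minimal sketch of the collection route, with placeholder names (some_table, some_column) standing in for your actual objects:
DECLARE
  TYPE t_id_tab IS TABLE OF some_table.id%TYPE;
  l_ids t_id_tab;
BEGIN
  -- Pull the ids into memory in one round trip
  SELECT id BULK COLLECT INTO l_ids
    FROM some_table
   WHERE some_column = 'Some Value';
  -- Do whatever you need to do with the collection
  FOR i IN 1 .. l_ids.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE(l_ids(i));
  END LOOP;
END;
/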
However, if the temp data is large (more than a couple thousand rows) and/or you need to be able to do something like SELECT * FROM your_temp_data..., then your best bet is to use Oracle's global temporary tables. A GTT is a table whose data only lasts as long as either a single transaction or a complete session (i.e. as long as you're attached to the database). Documentation here and here, and another write-up on them here.
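A sketch of how the SQL Server snippet above maps onto a GTT (the names come from your example; adjust types as needed):
-- One-time DDL: a GTT is a permanent object whose rows are temporary and private to each session
CREATE GLOBAL TEMPORARY TABLE temp_ids (
  id NUMBER   -- adjust the type to match SomeTable.id
) ON COMMIT PRESERVE ROWS;  -- or ON COMMIT DELETE ROWS to empty it at every commit

-- Per session, the equivalent of "select id into #temp ...":
INSERT INTO temp_ids (id)
  SELECT id FROM SomeTable WHERE SomeColumn = 'Some Value';

-- ... do whatever you need to do with temp_ids ...

-- No DROP TABLE needed: clear the rows (or just end the session); the definition stays for reuse
DELETE FROM temp_ids;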
Share and enjoy.
I want the first column of a table in PL/SQL to auto-increment. That column will be the primary key for that table. I heard about something called serialize but couldn't find a proper description of it. I was working in SQL Server before and am new to Oracle (PL/SQL). Please help me find a proper solution.
Create a sequence
CREATE SEQUENCE name_of_sequence
START WITH 1
INCREMENT BY 1
CACHE 100;
Create a trigger
CREATE OR REPLACE TRIGGER trigger_name
BEFORE INSERT ON table_name
FOR EACH ROW
BEGIN
SELECT name_of_sequence.nextval
INTO :new.name_of_primary_key_column
FROM dual;
END;
The syntax of the trigger gets a little simpler in 11g since you can do a direct assignment to the :new.name_of_primary_key_column rather than selecting from dual. And I understand that there is some additional syntactic sugar in 12c that makes this even easier though I haven't played around with that.
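For reference, a sketch of the simpler 11g trigger body, plus the 12c identity-column syntax alluded to above (same placeholder names as before):
-- 11g and later: assign the sequence value directly, no SELECT ... FROM dual needed
CREATE OR REPLACE TRIGGER trigger_name
BEFORE INSERT ON table_name
FOR EACH ROW
BEGIN
  :new.name_of_primary_key_column := name_of_sequence.nextval;
END;
/
-- 12c and later: an identity column removes the need for both the sequence and the trigger
-- CREATE TABLE table_name (
--   name_of_primary_key_column NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
--   ...
-- );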
I just have a quick question about how I can get two different results for the same thing.
We have two databases which are built up exactly the same in terms of structure.
In both, there is a view which does a comparison between a VARCHAR2(10) and a CHAR(10), where the fields are only filled to a length of 7 (plus 3 trailing spaces for the CHAR, of course).
Of course there is something wrong in our structure, but that's a separate issue from my question.
How is it possible that one database is able to do the comparison (VARCHAR2 = CHAR) and the other one is not?
Is there some Oracle setting which can allow this?
Thanks for the help,
Grts,
Maarten
It's probably bug 11726301, "Wrong Result with query_rewrite_enabled=false and Joins of CHAR with Other CHAR and VARCHAR2 Columns".
It was fixed in 11.2.0.3.
The workaround is to set query_rewrite_enabled=true.
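A sketch of checking and applying that workaround at session level (SHOW PARAMETER is a SQL*Plus/SQLcl command; the parameter can also be changed instance-wide):
-- See the current setting
SHOW PARAMETER query_rewrite_enabled
-- Apply the workaround for the current session
ALTER SESSION SET query_rewrite_enabled = TRUE;
-- Or for the whole instance
-- ALTER SYSTEM SET query_rewrite_enabled = TRUE;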
I have a LONG RAW column in an Oracle table. INSERT with SELECT is not working because the LONG RAW column is part of my SELECT statement. Basically I am trying to insert a history row with a couple of parameters changed, hence I was thinking of using PL/SQL in Oracle. I have no experience in PL/SQL, nor did I find anything after googling for a couple of days. Can anyone help me with a sample PL/SQL block for my problem? Thanks in advance!
LONG and LONG RAW datatypes are deprecated, and have been for many years now. You really are much better off getting away from them.
Having said that, if you're using PL/SQL, you will be limited to 32,760 bytes of data, which is the max that the LONG RAW PL/SQL datatype will hold. However, the LONG RAW database datatype can hold up to 2GB of data. So, if any rows in your table contain data longer than 32,760 bytes, you will not be able to retrieve them using PL/SQL. This is a fundamental limitation of the LONG and LONG RAW datatypes, and one of the reasons Oracle has deprecated their use.
In that case, the only options are Pro*C or OCI.
More information can be found here:
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14261/datatypes.htm#CJAEGDEB
Hope that helps.
You can work with a LONG RAW column directly in PL/SQL if your data is limited to 32kB:
BEGIN
  FOR cc IN (SELECT col1, col2... col_raw FROM your_table) LOOP
    INSERT INTO your_other_table (col1, col2... col_raw)
      VALUES (cc.col1, cc.col2... cc.col_raw);
  END LOOP;
END;
This will fail if any LONG RAW is larger than 32k.
In that case you will have to use another language. You could use Java, since it is included in the DB. I already answered a couple of questions on SO about LONG RAW and Java:
Copying data from LOB Column to Long Raw Column (will work with LONG RAW to LONG RAW too, just replace the UPDATE with an INSERT)
Get the LENGTH of a LONG RAW
In any case, as you have noticed, it is a pain to work with this datatype. If converting to LOB is not possible, you will have to use a workaround.
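If converting is an option, a sketch of the usual migration path with TO_LOB (the table and column names are placeholders; TO_LOB on a LONG RAW column is only allowed in the select list of an INSERT ... SELECT or CREATE TABLE AS SELECT):
-- New table with a BLOB instead of the LONG RAW
CREATE TABLE your_table_new (
  id       NUMBER,
  col_blob BLOB
);
-- Copy the data, converting the LONG RAW values to BLOBs
INSERT INTO your_table_new (id, col_blob)
  SELECT id, TO_LOB(col_raw) FROM your_table;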
As a follow-up to this question, I need help with the following scenario:
In Oracle, given a simple data table:
create table data (
id VARCHAR2(255),
key VARCHAR2(255),
value CLOB);
I am using the following merge command:
merge into data
using (
select
? id,
? key,
? value
from
dual
) val on (
data.id=val.id
and data.key=val.key
)
when matched then
update set data.value = val.value
when not matched then
insert (id, key, value) values (val.id, val.key, val.value);
I am invoking the query via JDBC from a Java application.
When the "value" string is large, the above query results in the following Oracle error:
ORA-01461: cannot bind a LONG value for insert into a long column
I even set the "SetBigStringTryClob" property as documented here with the same result.
Is it possible to achieve the behavior I want given that "value" is a CLOB?
EDIT: Client environment is Java
You haven't mentioned specifically in your post, but judging by the tags for the question, I'm assuming you're doing this from Java.
I've had success with code like this in a project I just finished. This application used Unicode, so there may be simpler solutions if your problem domain is limited to a standard ASCII character set.
Are you currently using the OraclePreparedStatement.setCLOB() method? It's a terribly awkward thing to have to do, but we couldn't get around it any other way. You have to actually create a temporary CLOB, and then use that temporary CLOB in the setCLOB() method call.
Now, I've ripped this from a working system, and had to make a few ad-hoc adjustments, so if this doesn't appear to work in your situation, let me know and I'll go back to see if I can get a smaller working example.
This of course assumes you're using the Oracle Corp. JDBC drivers (ojdbc14.jar or ojdbc5.jar) which are found in $ORACLE_HOME/jdbc/lib
// Assumes: import java.io.Writer; import oracle.sql.CLOB; (oracle.sql.CLOB comes from the Oracle JDBC driver)
CLOB tempClob = CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
// Open the temporary CLOB in readwrite mode to enable writing
tempClob.open(CLOB.MODE_READWRITE);
// Get a character stream to write into the CLOB
Writer tempClobWriter = tempClob.getCharacterOutputStream();
// Write the data into the temporary CLOB
tempClobWriter.write(stringData);
// Flush and close the stream
tempClobWriter.flush();
tempClobWriter.close();
// Close the temporary CLOB
tempClob.close();
myStatement.setCLOB(column.order, tempClob);
Regards,
Dwayne King