Poor performance with ODP.net parameter array inserts when record contains BLOBs - oracle

I use ODP.net parameter arrays to achieve batch inserts of records. This approach performs very well when the records don't contain a BLOB column - typically about 10,000 records can be inserted in one second.
If a record contains a BLOB column, performance is poor - about 1,000 records take 8 seconds.
Is there any method to batch insert records with a BLOB column efficiently?

I found that I was using ODP.net incorrectly to insert records with a BLOB column.
I was using a byte array to hold the BLOB value, which performs poorly.
When I switched to the OracleBlob type to hold the BLOB value instead, batch inserts performed well.
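For illustration, here is a rough sketch of what the faster approach can look like: an ODP.NET array-bind insert where the BLOB parameter is bound as an OracleBlob[] rather than a byte[][]. The table and column names are made up, and this is a sketch of the technique, not the original poster's exact code.

```csharp
// Sketch: ODP.NET array-bind (batch) insert with the BLOB column bound as
// OracleBlob[] instead of byte[][]. Table/column names are illustrative.
using Oracle.DataAccess.Client;
using Oracle.DataAccess.Types;

public static class BlobBatchInsert
{
    public static void Insert(OracleConnection conn, byte[][] payloads)
    {
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "INSERT INTO docs (id, payload) VALUES (:id, :payload)";
            cmd.ArrayBindCount = payloads.Length;   // one execution, many rows

            var ids = new int[payloads.Length];
            var blobs = new OracleBlob[payloads.Length];
            for (int i = 0; i < payloads.Length; i++)
            {
                ids[i] = i;
                blobs[i] = new OracleBlob(conn);    // temporary LOB on this connection
                blobs[i].Write(payloads[i], 0, payloads[i].Length);
            }

            cmd.Parameters.Add(":id", OracleDbType.Int32, ids,
                               System.Data.ParameterDirection.Input);
            cmd.Parameters.Add(":payload", OracleDbType.Blob, blobs,
                               System.Data.ParameterDirection.Input);
            cmd.ExecuteNonQuery();

            foreach (var b in blobs) b.Dispose();   // free the temporary LOBs
        }
    }
}
```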

Related

How to improve single insert performance in oracle

In my business case, I need to insert one row at a time and can't use batch inserts. So I want to know what throughput Oracle can achieve. I tried these approaches:
Effective ways
Using multiple threads, each owning one connection to insert data
Using an SSD to store the Oracle data files
Ineffective ways
Using multiple tables in one schema to store the data
Using table partitioning
Using multiple schemas to store the data
Increasing the data file block size
Using the APPEND hint in the INSERT SQL
In the end, the best TPS was over 10,000 per second.
Other details:
Oracle 11g
Single insert data size: 1 KB
CPU i7, 64 GB memory
Oracle is highly optimized for anything from one-row inserts to batches of hundreds of rows. You do not mention whether you are having performance problems with this one-row insert, or how long the insert takes. For such a simple operation, you don't need to worry about any of those details. If you have thousands of web-based users inserting one row into a table every minute, no problem. If you are committing your work at the appropriate time, and you don't have a huge number of indexes, a single-row insert should not take more than a few milliseconds.
In SQL*Plus try the commands
set autotrace on explain statistics
set timing on
and run your insert statement.
Edit your question to include the results of the explain plan. And be sure to indent the results 4 spaces.

Storing table data as blob in column in different table(Oracle)

Requirement: We have around 500 tables, of which around 10k rows in each table are of interest. We want to store this data as a BLOB in a table. All the data, when exported to a file, is about 250 MB. One option is to store this 250 MB file in a single BLOB (Oracle allows 4 GB); the other is to store each table's data as a BLOB in its own row, i.e. we will have one row per table and the BLOB column will hold that table's data.
Which option is better in terms of performance? This data also needs to be fetched and inserted into the database.
Basically, this will be delivered to customer and our utility will read the data from blob and will insert into database.
Questions:
1) How do we insert table data as a BLOB into a BLOB column?
2) How do we read from that BLOB column and then prepare insert statements?
3) Is there any benefit from compressing the table that contains the BLOB data? If yes, how do we uncompress it when reading?
4) Will this approach also work on MSSQL and DB2?
What are the other considerations when designing tables that hold BLOBs?
Please suggest.
I have the impression that you want to go from structured content to unstructured content.
I hope you know what you are trading off, but I do not get that impression from reading your question.
Going with BLOBs, you lose the relationships and constraints between values.
It could be faster to read one block of data, but when you need to write a minor change, you may have to rewrite a bigger chunk in the case of large BLOBs.
To insert a BLOB into the database you can use any available API (OCI, JDBC, or even PL/SQL if you access it only on the server side).
For compression, you can use Oracle's LOB compression options. Alternatively, you can do it yourself using a library (if you need to support other RDBMS types).
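As a sketch of the do-it-yourself compression route on the server side, the standard UTL_COMPRESS package can compress and uncompress a BLOB in PL/SQL. The table and column names below are assumptions for illustration:

```sql
-- Sketch: compress/uncompress a BLOB server-side with UTL_COMPRESS.
-- TABLE_DUMPS and PAYLOAD are illustrative names.
DECLARE
  l_raw        BLOB;
  l_compressed BLOB;
  l_restored   BLOB;
BEGIN
  SELECT payload INTO l_raw FROM table_dumps WHERE id = 1;

  -- Compress before storing
  l_compressed := UTL_COMPRESS.LZ_COMPRESS(l_raw);
  UPDATE table_dumps SET payload = l_compressed WHERE id = 1;

  -- Later, when reading it back
  l_restored := UTL_COMPRESS.LZ_UNCOMPRESS(l_compressed);
END;
/
```

Note that UTL_COMPRESS is Oracle-specific; for the MSSQL/DB2 requirement an application-side library would be needed instead.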
Why do you want to store a table in a BLOB? For archive or transfer you could export the tables using exp or, preferably, expdp. You can compress these files and then transfer them or store them as a BLOB inside another Oracle database.
The maximum size of a LOB was 4 GB up to Oracle release 9, as far as I remember. Today the limit is 8 TB to 128 TB, depending on your DB block size.

Is it ok to define multiple NCLOB columns in an oracle table?

I've got to store multiple text fields of variable length in an Oracle database. I need to define them as columns of the same table to be able to order the results when I query it.
I can't know the maximum size of the field contents; most will be less than 100 characters, but some could be thousands of characters. Furthermore, the number of fields changes dynamically.
I was thinking of defining a table with multiple NCLOB columns that would allow me to store anything in them (very short and very long texts), but I wonder whether this is the right design.
Summary:
Variable number of fields (metadata of the same object)
Variable length of the content
I need to order the results
Thanks
KL
When you need a variable number of fields, it's better to split the table into a parent and a child. Then you can effectively have any number of fields. You can add an order column to store ordering information, query by joining the two tables, and use an ORDER BY clause to order the results. You can also add a foreign key constraint to enforce the relationship and data integrity.
For the variable-length content, you can use VARCHAR2 (or NVARCHAR2) to store the text data. VARCHAR2 can hold up to 4000 bytes. If you know that the content can be longer than 4000 bytes, you should use CLOB (or NCLOB).
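A minimal sketch of that parent/child layout (all table and column names are illustrative, not from the question):

```sql
-- Sketch: parent/child design with an ordering column and a foreign key.
CREATE TABLE parent_obj (
  id   NUMBER PRIMARY KEY,
  name VARCHAR2(100)
);

CREATE TABLE obj_field (
  parent_id NUMBER NOT NULL REFERENCES parent_obj (id),
  field_pos NUMBER NOT NULL,           -- ordering column
  content   NVARCHAR2(2000),           -- or NCLOB if content can exceed 4000 bytes
  PRIMARY KEY (parent_id, field_pos)
);

-- Query the fields back, joined and in order:
SELECT p.name, f.field_pos, f.content
FROM   parent_obj p
JOIN   obj_field f ON f.parent_id = p.id
ORDER BY p.id, f.field_pos;
```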

Best way to bulk insert data into Oracle database

I am going to create a lot of data scripts, such as INSERT INTO and UPDATE statements.
There will be 100,000-plus records, if not 1,000,000.
What is the best way to get this data into Oracle quickly? I have already found that SQL*Loader is not good for this, as it does not update individual rows.
Thanks
UPDATE: I will be writing an application to do this in C#
Load the records into a staging table via SQL*Loader. Then use bulk operations:
INSERT INTO ... SELECT (for example "Bulk Insert into Oracle database")
a mass UPDATE ("Oracle - Update statement with inner join")
or a single MERGE statement
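A sketch of the single-MERGE option, assuming the file has already been loaded into a staging table with SQL*Loader (all table and column names here are illustrative):

```sql
-- Sketch: one MERGE applies both inserts and updates from the staging table.
MERGE INTO target_t t
USING stage_t s
ON (t.id = s.id)
WHEN MATCHED THEN
  UPDATE SET t.col_a = s.col_a,
             t.col_b = s.col_b
WHEN NOT MATCHED THEN
  INSERT (id, col_a, col_b)
  VALUES (s.id, s.col_a, s.col_b);
```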
To keep it as fast as possible, I would keep it all in the database:
use external tables (to allow Oracle to read the file contents),
and create a stored procedure to do the processing.
The update could be slow. If possible, it may be a good idea to create a new table based on all the records in the old one (with the updates applied), then switch the new and old tables around.
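A sketch of the external-table route; the directory path, file name, and columns are assumptions for illustration:

```sql
-- Sketch: expose the flat file to Oracle as an external table, then use
-- plain SQL to load it. All names are illustrative.
CREATE DIRECTORY load_dir AS '/data/loads';

CREATE TABLE ext_records (
  id    NUMBER,
  col_a VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('records.csv')
);

-- Then load (or MERGE) entirely inside the database:
INSERT /*+ APPEND */ INTO target_t (id, col_a)
SELECT id, col_a FROM ext_records;
```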
How about using a spreadsheet program like MS Excel or LibreOffice Calc? This is how I perform bulk inserts.
Prepare your data in a tabular format.
Let's say you have three columns, A (text), B (number) & C (date). In the D column, enter the following formula. Adjust accordingly.
="INSERT INTO YOUR_TABLE (COL_A, COL_B, COL_C) VALUES ('"&A1&"', "&B1&", to_date ('"&C1&"', 'mm/dd/yy'));"

How to store unlimited characters in Oracle 11g?

We have a table in Oracle 11g with a VARCHAR2 column. We use a proprietary programming language where this column is defined as a string. At most we can store 2000 characters (4000 bytes) in this column. Now the requirement is that the column needs to store more than 2000 characters (in fact, unlimited characters). The DBAs don't like the BLOB or LONG datatypes for maintenance reasons.
The solution I can think of is to remove this column from the original table and have a separate table for it, and then store each character in a row, in order to get unlimited characters. This table will be joined with the original table for queries.
Is there any better solution to this problem?
UPDATE: The proprietary programming language allows defining variables of type string and blob; there is no CLOB option. I understand the responses given, but I cannot take on the DBAs. I understand that deviating from BLOB or LONG will be a developers' nightmare, but I still cannot help it.
UPDATE 2: If the maximum I need is 8000 characters, can I just add 3 more columns, so that I have 4 columns with 2000 characters each to get 8000 characters? When the first column is full, values would spill over to the next column, and so on. Will this design have any bad side effects? Please suggest.
If a BLOB is what you need, convince your DBA it's what you need. Those data types are there for a reason, and any roll-your-own implementation will be worse than the built-in type.
You might also want to look at the CLOB type, as it will meet your needs quite well.
You could follow the way Oracle stores its stored procedure source in the data dictionary. Define a table to hold the text in lines:
CREATE TABLE MY_TEXT (
IDENTIFIER INT,
LINE INT,
TEXT VARCHAR2(4000),
PRIMARY KEY (IDENTIFIER, LINE));
The IDENTIFIER column is the foreign key to the original table. LINE is a simple integer (not a sequence) to keep the text fields in order. This allows storing larger chunks of data.
Yes, this is not as efficient as a BLOB, CLOB, or LONG (I would avoid LONG fields if at all possible). Yes, this requires more maintenance, but if your DBAs are dead set against managing CLOB fields in the database, this is option two.
EDIT:
MY_TABLE below is where you currently have the VARCHAR2 column you are looking to expand. I would keep it in the table for the short text fields.
CREATE TABLE MY_TABLE (
IDENTIFIER INT,
OTHER_FIELD VARCHAR2(10),
REQUIRED_TEXT VARCHAR2(4000),
PRIMARY KEY (IDENTIFIER));
Then write the query to pull the data, joining the two tables and ordering by LINE in MY_TEXT. Your application will need to split the string into 2000-character chunks and insert them in LINE order.
I would do this in a PL/SQL procedure, for both the insert and the select. PL/SQL VARCHAR2 strings can be up to 32K characters, which may or may not be large enough for your needs.
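The chunk-splitting step on the application side might look like this in C# (a sketch; the helper name is made up):

```csharp
// Sketch: split a long string into 2000-character chunks; each chunk would
// then be inserted as one MY_TEXT row with LINE = its index in this list.
using System;
using System.Collections.Generic;

public static class TextChunker
{
    public static List<string> Split(string text, int chunkSize = 2000)
    {
        var chunks = new List<string>();
        for (int i = 0; i < text.Length; i += chunkSize)
            chunks.Add(text.Substring(i, Math.Min(chunkSize, text.Length - i)));
        return chunks;
    }
}
```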
But like every other person answering this question, I would strongly suggest making a case to the DBA to make the column a CLOB. From the program perspective this will be a BLOB and therefore simple to manage.
You said no BLOB or LONG... but what about CLOB? It holds 4 GB of character data.
BLOB is the best solution. Anything else will be less convenient and a bigger maintenance annoyance.
Is BFILE a viable alternative datatype for your DBAs?
I don't get it. A CLOB is the appropriate database datatype. If your weird programming language can deal with strings of 8000 (or whatever) characters, what stops it from writing those to a CLOB?
More specifically, what error do you get (from Oracle or your programming language) when you try to insert an 8000-character string into a column defined as a CLOB?
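A quick way to run that experiment, sketched with ODP.NET (the table name is an assumption, reusing the example above; not the asker's code):

```csharp
// Sketch: try inserting an 8000-character string into a CLOB column and
// see whether Oracle or the client language complains.
using Oracle.DataAccess.Client;

public static class ClobTest
{
    public static void Run(OracleConnection conn)
    {
        var longText = new string('x', 8000);   // 8000-character test string
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                "INSERT INTO my_table (identifier, required_text) VALUES (1, :txt)";
            cmd.Parameters.Add(":txt", OracleDbType.Clob).Value = longText;
            cmd.ExecuteNonQuery();
        }
    }
}
```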
