Spring Data JPA: how to write a custom query for inserting multiple records into the same table?

Following is my table schema:
CREATE TABLE Animals (
id MEDIUMINT NOT NULL AUTO_INCREMENT,
name CHAR(30) NOT NULL,
PRIMARY KEY (id)
) ENGINE=MyISAM;
In MySQL I can insert multiple records directly with a single INSERT statement, like the following:
INSERT INTO animals (name) VALUES('dog'),('cat'),('penguin'),('lax'),('whale'),('ostrich');
However, how can I achieve the same thing in Spring Data JPA?
Right now I am using CrudRepository's Iterable save(Iterable entities), and I end up with six insert statements:
insert into Animals (name) values (?)
insert into Animals (name) values (?)
insert into Animals (name) values (?)
insert into Animals (name) values (?)
insert into Animals (name) values (?)
insert into Animals (name) values (?)
How do I restrict this to one insert query? Any answer will be helpful, whether it concerns Spring Data JPA, HQL, or JPQL.

Assuming that you are using Hibernate, you need to tweak a couple of settings. You need to enable batching by setting the hibernate.jdbc.batch_size property. However, that alone might not cut it: depending on the mix of statements, you may also need to have Hibernate order them so they can be batched together. For this, set hibernate.order_inserts and hibernate.order_updates to true.
If you are using versions or timestamps to limit concurrent modification, you might also need to enable hibernate.jdbc.batch_versioned_data.
So, all in all, you probably need to add something like this to your configuration or persistence.xml:
properties.put("hibernate.jdbc.batch_size", "25");
properties.put("hibernate.order_inserts", "true");
properties.put("hibernate.order_updates", "true");
properties.put("hibernate.jdbc.batch_versioned_data", "true");
A blog post explaining a bit more can be found here.
However, if you are using MySQL it might not even work (yet), as MySQL requires an additional setting on the datasource before the JDBC addBatch mechanism pays off. You need to set rewriteBatchedStatements to true to enable this feature for MySQL. Depending on your datasource, you can add this to the connection string or as a separate property.
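For illustration, here is a minimal plain-JDBC sketch of what Hibernate does under the hood once batching is enabled. The connection URL and credentials are placeholders, not part of the question; only the Animals table comes from the schema above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsertSketch {
    public static void main(String[] args) throws Exception {
        // rewriteBatchedStatements=true lets the MySQL driver collapse the batch below
        // into a single multi-row INSERT ... VALUES (...),(...),(...) statement.
        String url = "jdbc:mysql://localhost:3306/test?rewriteBatchedStatements=true";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = con.prepareStatement(
                     "insert into Animals (name) values (?)")) {
            for (String name : new String[] {"dog", "cat", "penguin"}) {
                ps.setString(1, name);
                ps.addBatch();     // queue the row instead of executing it immediately
            }
            ps.executeBatch();     // one round trip; the driver rewrites it into one INSERT
        }
    }
}

When Hibernate batching kicks in, it issues the same addBatch/executeBatch calls on the entity manager's connection.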

Related

Multiple ids as input for a select query with the Oracle BPEL DbAdapter

In Oracle SOA you can insert multiple records into the database in a single transaction. This is supported out of the box and you do not need to do anything special to achieve it. If you create a database adapter in your BPEL process with an INSERT operation, its input is exposed as a collection of objects. You can use XSLT to populate that collection, and all records will be inserted in one atomic transaction.
Is there an equivalent feature for a pure SQL query?
I have a complicated query that requires only a single id for its input.
But I would like this query to be repeated for multiple ids. Rather than defining a for loop and whatnot, is there a flag/switch/way, when creating the BPEL process, to allow multiple ids as the input?
Maybe this helps:
create table test (id number(3), name varchar2(20));
insert into test
select t.column_value, dbms_random.string('A', 20)
from table(sys.odcinumberlist(4, 17, 105, 91, 212)) t;
Pure SQL, one insert, five rows with defined ids. If the id is a varchar, use sys.odcivarchar2list or define a custom collection type first.

Populating Tables into Oracle in-memory segments

I am trying to load tables into the Oracle In-Memory column store. I have enabled the tables for INMEMORY by using the SQL*Plus command ALTER TABLE table_name INMEMORY. The tables also contain data, i.e. they are populated. But when I run SELECT v.owner, v.segment_name name, v.populate_status status FROM v$im_segments v;, it shows no rows selected.
What could be the problem?
Have you considered this?
https://docs.oracle.com/database/121/CNCPT/memory.htm#GUID-DF723C06-62FE-4E5A-8BE0-0703695A7886
Population of the IM Column Store in Response to Queries
Setting the INMEMORY attribute on an object means that this object is a candidate for population in the IM column store, not that the database immediately populates the object in memory.
By default (INMEMORY PRIORITY is set to NONE), the database delays population of a table in the IM column store until the database considers it useful. When the INMEMORY attribute is set for an object, the database may choose not to materialize all columns when the database determines that the memory is better used elsewhere. Also, the IM column store may populate a subset of columns from a table.
You probably need to run a select against the data first.

create a backup table with all parameters

I'm trying to move the data from one table, TABLE5, to another one, TABLE5_BKP.
CREATE TABLE TABLE5_BKP AS SELECT * FROM TABLE5;
The table was created and the data moved. When I checked the constraints, the primary key, foreign keys, etc. were not generated, but other constraints like
SYS_C2211111 Check "COLUMN1" IS NOT NULL
were created. What should I do in this case? Do I need to create the primary key, foreign keys, etc. separately? And what about indexes and other parameters, which I was not able to check?
You can't implicitly create PK, FK, Indexes, etc. just using
CREATE TABLE tablename AS SELECT *...
You have to specify them after creating the table. Also, I suggest you use Oracle tools like exp/imp or Data Pump if you want to move the database structure from one database to another.

MSSQL - fetch auto-increment column value and save it in another column

This is regarding the MSSQL auto-increment feature. I have the following table:
CREATE TABLE Order(
[order_id] [int] IDENTITY(1,1) NOT NULL,
[name] [varchar](50) NULL,
[original_order_id] [int] NOT NULL
)
Here I have a situation where I need to insert the auto-generated value of order_id into original_order_id.
After googling for a few minutes I found the following, and it works fine for me:
insert into Order values('Vegitable Order', IDENT_CURRENT('Order'))
I am using a Java application with Spring JDBC templates to execute the queries. Can there be any issues, especially in a multi-threaded environment?
Using IDENT_CURRENT is not a good idea. If there are concurrent transactions, the returned value might not be your last inserted id, but the last inserted id of another transaction!
Use the JDBC getGeneratedKeys facility to retrieve the generated id from the first INSERT and use the retrieved value in the second statement, or use SCOPE_IDENTITY() in the second statement. With SCOPE_IDENTITY(), make sure both statements run in the same scope and session.
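For example, a minimal sketch of the getGeneratedKeys route with Spring's JdbcTemplate (the DAO class, the placeholder value 0, and the [Order] bracketing are assumptions added for illustration, not part of the question):

import java.sql.PreparedStatement;
import java.sql.Statement;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.support.GeneratedKeyHolder;
import org.springframework.jdbc.support.KeyHolder;
import org.springframework.transaction.annotation.Transactional;

public class OrderDao {

    private final JdbcTemplate jdbcTemplate;

    public OrderDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Transactional
    public long insertOrder(String name) {
        // Insert with a placeholder value and ask the driver for the generated identity.
        // "Order" is a reserved word in T-SQL, hence the brackets.
        KeyHolder keyHolder = new GeneratedKeyHolder();
        jdbcTemplate.update(con -> {
            PreparedStatement ps = con.prepareStatement(
                    "insert into [Order] (name, original_order_id) values (?, 0)",
                    Statement.RETURN_GENERATED_KEYS);
            ps.setString(1, name);
            return ps;
        }, keyHolder);
        long orderId = keyHolder.getKey().longValue();
        // Copy the generated id into original_order_id in a second statement.
        jdbcTemplate.update(
                "update [Order] set original_order_id = ? where order_id = ?",
                orderId, orderId);
        return orderId;
    }
}

Because the generated key is read back from the very statement that performed the insert, a concurrent transaction cannot hand you someone else's id.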
See also: How to get the insert ID in JDBC?

Linq insert with no primary key

I need to insert records into a table that has no primary key using LINQ to SQL. The table is poorly designed; I have NO control over the table structure. The table is comprised of a few varchar fields, a text field, and a timestamp. It is used as an audit trail for other entities.
What is the best way to accomplish the inserts? Could I extend the Linq partial class for this table and add a "fake" key? I'm open to any hack, however kludgey.
LINQ to SQL isn't meant for this task, so don't use it. Just wrap the insert into a stored procedure and add the procedure to your data model. If you can't do that, write a normal function with a bit of in-line SQL.
Open your DBML file in the designer, and give the mapping a key, whether your database has one or not. This will solve your problem. Just beware, however, that you can't count on the column being used for identity or anything else if there isn't a genuine key in the database.
I was able to work around this using a composite key.
I had a similar problem with a table containing only two columns: username, role.
This table obviously does not require an identity column. So, I created a composite key with username and role. This enabled me to use LINQ for adding and deleting entries.
You might use the DataContext.ExecuteCommand method to run your own custom insert statement.
Or, you might add a primary key to a column; this will allow the objects to be tracked for inserts/updates/deletes by the DataContext. This will work even if the column isn't really an enforced primary key in the database (how would LINQ know?). If you're only doing inserts and never re-use a primary key value in the same DataContext, you'll be fine.
