Spring Boot - JPA Repository - Batch Insert not working

I'm trying to achieve Batch Insert using JpaRepository, but it seems that it doesn't work even though I'm using the recommended properties. This is my code:
Entity - Book.java:
@Entity(name = "books")
@Table(name = "books")
public class Book {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;

    private String title;
    private String author;
    private String edition;
    private String status;

    @Column(unique = true)
    private String isbn;

    @JsonIgnore
    @OneToMany(cascade = CascadeType.ALL, mappedBy = "book", fetch = FetchType.LAZY)
    private List<Image> images = new ArrayList<>();

    // Getters and Setters omitted
}
Service - BookServiceImpl
@Service
public class BookServiceImpl implements BookService {

    @Autowired
    private BookRepository bookRepository;

    @Override
    public List<Book> storeBooks(List<Book> books) {
        return bookRepository.saveAll(books);
    }
}
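The repository itself is not shown in the post; presumably it is a plain Spring Data interface along these lines (a minimal sketch, assuming no custom query methods):
@Repository
public interface BookRepository extends JpaRepository<Book, Long> {
    // saveAll(Iterable<Book>) is inherited from JpaRepository; it is what the service calls
}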
Properties - application.properties:
spring.datasource.driver-class-name=org.postgresql.Driver
spring.datasource.url=jdbc:postgresql://localhost/bookdb?reWriteBatchedInserts=true
spring.datasource.username=**
spring.datasource.password=**
spring.jpa.hibernate.ddl-auto=create
spring.jpa.show-sql=true
spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.PostgreSQLDialect
spring.jpa.properties.hibernate.jdbc.batch_size=100
spring.jpa.properties.hibernate.order_inserts=true
spring.jpa.properties.hibernate.order_updates=true
spring.jpa.properties.hibernate.generate_statistics=true
SQL Log after inserting:
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select nextval ('hibernate_sequence')
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
Hibernate: insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)
2021-03-07 09:57:50.163 INFO 7800 --- [nio-8080-exec-1] i.StatisticalLoggingSessionEventListener : Session Metrics {
2883700 nanoseconds spent acquiring 1 JDBC connections;
0 nanoseconds spent releasing 0 JDBC connections;
9612998 nanoseconds spent preparing 10 JDBC statements;
23803401 nanoseconds spent executing 9 JDBC statements;
23764601 nanoseconds spent executing 1 JDBC batches;
0 nanoseconds spent performing 0 L2C puts;
0 nanoseconds spent performing 0 L2C hits;
0 nanoseconds spent performing 0 L2C misses;
275826200 nanoseconds spent executing 1 flushes (flushing a total of 9 entities and 9 collections);
0 nanoseconds spent executing 0 partial-flushes (flushing a total of 0 entities and 0 collections)
I don't know if the problem is with the logs or something, but I implemented everything as recommended...

I realized that the batch insert was already working; the problem is that Hibernate's default logging doesn't show whether the SQL inserts are batched or not. The solution was to implement a BeanPostProcessor and add two dependencies, SLF4J and datasource-proxy.
@Component
public class DatasourceProxyBeanPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(final Object bean, final String beanName) throws BeansException {
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(final Object bean, final String beanName) throws BeansException {
        if (bean instanceof DataSource) {
            ProxyFactory factory = new ProxyFactory(bean);
            factory.setProxyTargetClass(true);
            factory.addAdvice(new ProxyDataSourceInterceptor((DataSource) bean));
            return factory.getProxy();
        }
        return bean;
    }

    private static class ProxyDataSourceInterceptor implements MethodInterceptor {

        private final DataSource dataSource;

        public ProxyDataSourceInterceptor(final DataSource dataSource) {
            super();
            this.dataSource = ProxyDataSourceBuilder.create(dataSource)
                    .countQuery()
                    .logQueryBySlf4j(SLF4JLogLevel.INFO)
                    .build();
        }

        @Override
        public Object invoke(final MethodInvocation invocation) throws Throwable {
            Method proxyMethod = ReflectionUtils.findMethod(dataSource.getClass(), invocation.getMethod().getName());
            if (proxyMethod != null) {
                return proxyMethod.invoke(dataSource, invocation.getArguments());
            }
            return invocation.proceed();
        }
    }
}
So I updated my pom.xml:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/net.ttddyy/datasource-proxy -->
<dependency>
<groupId>net.ttddyy</groupId>
<artifactId>datasource-proxy</artifactId>
<version>1.7</version>
</dependency>
Then I tested it again and got this from SLF4J:
2021-03-07 10:48:46.075 INFO 14044 --- [nio-8080-exec-5] n.t.d.l.l.SLF4JQueryLoggingListener : Name:, Connection:6, Time:4, Success:True, Type:Prepared, Batch:True, QuerySize:1, BatchSize:16, Query:["insert into books (author, edition, isbn, status, title, id) values (?, ?, ?, ?, ?, ?)"], Params:[(Author 2,Edition 2,978-472-592-193-7,Owned,Book 2,33),(Author 2,Edition 2,978-412-592-193-7,Owned,Book 2,34),(Author 2,Edition 2,978-473-592-193-7,Owned,Book 2,35),(Author 2,Edition 2,978-472-552-193-7,Owned,Book 2,36),(Author 2,Edition 2,978-472-092-193-7,Owned,Book 2,37),(Author 2,Edition 2,978-402-592-193-7,Owned,Book 2,38),(Author 2,Edition 2,178-472-592-193-7,Owned,Book 2,39),(Author 2,Edition 2,278-472-592-193-7,Owned,Book 2,40),(Author 2,Edition 2,978-472-592-472-7,Owned,Book 2,41),(Author 2,Edition 2,592-472-592-123-7,Owned,Book 2,42),(Author 2,Edition 2,562-472-592-123-9,Owned,Book 2,43),(Author 2,Edition 2,978-123-562-123-9,Owned,Book 2,44),(Author 2,Edition 2,472-472-582-123-9,Owned,Book 2,45),(Author 2,Edition 2,222-472-592-123-9,Owned,Book 2,46),(Author 2,Edition 2,978-222-123-123-9,Owned,Book 2,47),(Author 2,Edition 2,978-433-502-123-9,Owned,Book 2,48)]
Sources:
https://www.baeldung.com/jpa-hibernate-batch-insert-update
https://arnoldgalovics.com/configuring-a-datasource-proxy-in-spring-boot/

Related

How does insert ignore for batches work in Spring Boot repository

I am working in the Spring Boot framework.
I have a working way to run batches of "insert ignore", but I don't fully understand how/why it works.
I am using the Persistable interface so that inserts are done in batches.
In addition, I'm using @SQLInsert to run insert ignore instead of a plain insert.
The code below demonstrates it:
@Entity
@SQLInsert(sql = "insert ignore into my_table (created_at, data, id) VALUES (?, ?, ?)")
public class MyTable implements Persistable<String> {

    @Id
    @Column
    private String id;

    @Column
    private String data;

    @Column
    private Timestamp createdAt;

    public String getId() {
        calculateId();   // computes the id from the entity's data (implementation omitted)
        return id;
    }

    @Override
    public boolean isNew() {
        return true;
    }
}
===============
@Repository
public interface MyTableRepository extends JpaRepository<MyTable, String> {

    @Transactional
    @Modifying(clearAutomatically = true, flushAutomatically = true)
    <S extends MyTable> List<S> saveAll(Iterable<S> entities);
}
In debug, and on the DB side, I see that the save is indeed done in batches.
The query shown in debug mode looks like this:
["insert ignore into my_table (created_at, data, id) VALUES (?, ?, ?)"], Params:[(2022-06-08 17:44:35.041,data1,id1),(2022-06-08 17:44:35.042,data2,id2),(2022-06-08 17:44:35.042,data3,id3)]
I can see that three new records are created in the DB.
The question is: how does this work if the number of "?" placeholders is smaller than the number of parameters passed?
If you tried that statement directly against the DB, you would get an error.
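For context, a single query with many parameter tuples is how datasource-proxy-style loggers render a JDBC batch: the statement is prepared once and one set of bindings is queued per row, then the whole batch is executed in one go. A rough, hypothetical sketch of what happens under the hood (not the poster's actual code; the MyTable getters are assumed):
// requires java.sql.Connection, java.sql.PreparedStatement, java.sql.SQLException, java.util.List
static void insertBatch(Connection connection, List<MyTable> rows) throws SQLException {
    // Hypothetical illustration only: one PreparedStatement, one parameter set per row.
    try (PreparedStatement ps = connection.prepareStatement(
            "insert ignore into my_table (created_at, data, id) values (?, ?, ?)")) {
        for (MyTable row : rows) {
            ps.setTimestamp(1, row.getCreatedAt());
            ps.setString(2, row.getData());
            ps.setString(3, row.getId());
            ps.addBatch();      // queue this row's bindings
        }
        ps.executeBatch();      // execute all queued rows as one batch
    }
}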

Hibernate Composite key problem with partial joining with other entity

I have a Hibernate composite-key problem with a partial join to another entity.
The code below was working with javax.persistence_1.0.0.0_2.0 (TopLink); however, the same code is not working under Spring Boot (Spring Boot JPA starter, jakarta.persistence-api-2.2.3).
@Entity
@Table(name = "EMPLOYEE")
@IdClass(EmployeePK.class)
public class Employee implements Serializable {

    @Id
    @Column(name = "EMP_NUMBER", nullable = false, length = 4000, insertable = false, updatable = false)
    private String empNo;

    @Id
    @Column(name = "RGSN_ID", nullable = false, insertable = false, updatable = false)
    private Long registId;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumns({ @JoinColumn(name = "PRJ_ID", referencedColumnName = "PRJ_ID"),
                   @JoinColumn(name = "EMP_NUMBER", referencedColumnName = "EMP_NUMBER") })
    private Projects projects;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "RGSN_ID")
    private Organization organization;

    // other fields:
    // created_by, creation_date, last_update_date, last_updated_by, status
}
Composite Key of Employee Table
public class EmployeePK implements Serializable {
    private String empNo;
    private Long registId;
    // getters, setters, equals, hashCode
}
Project Table
@Entity
@Table(name = "PROJECTS")
@IdClass(ProjectsPK.class)
public class Projects implements Serializable {

    @Id
    @Column(name = "PRJ_ID", nullable = false, insertable = false, updatable = false)
    private Long prjId;

    @Id
    @Column(name = "EMP_NUMBER", nullable = false, length = 4000)
    private String empNo;

    @OneToMany(mappedBy = "projects")
    private List<Employee> empList;

    // other fields:
    // created_by, creation_date, last_update_date, last_updated_by, status
}
Composite keys for Project table
public class ProjectsPK implements Serializable {
    private Long prjId;
    private String empNo;
    // getters, setters, equals, hashCode
}
Console Exception:
insert
into
EMPLOYEE
(created_by, creation_date, last_update_date, last_updated_by, status, prj_id, emp_number, rgsn_id)
values
(?, ?, ?, ?, ?, ?, ?, ?)
o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [VARCHAR] - [abc@c.com]
o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [TIMESTAMP] - [2022-03-16 18:52:37.587915]
o.h.type.descriptor.sql.BasicBinder : binding parameter [3] as [TIMESTAMP] - [2022-03-16 18:52:37.587915]
o.h.type.descriptor.sql.BasicBinder : binding parameter [4] as [VARCHAR] - [abc@c.com]
o.h.type.descriptor.sql.BasicBinder : binding parameter [5] as [VARCHAR] - [A]
o.h.type.descriptor.sql.BasicBinder : binding parameter [6] as [BIGINT] - [435]
o.h.type.descriptor.sql.BasicBinder : binding parameter [7] as [VARCHAR] - [123]
o.h.type.descriptor.sql.BasicBinder : binding parameter [8] as [VARCHAR] - [123]
o.h.type.descriptor.sql.BasicBinder : binding parameter [9] as [BIGINT] - [null]
o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 0, SQLState: S1093
o.h.engine.jdbc.spi.SqlExceptionHelper : The index 9 is out of range.
com.microsoft.sqlserver.jdbc.SQLServerException: The index 9 is out of range.
In the above exception you can see two bindings for emp_number (123), and because of that the parameter index reaches 9 while the insert only has 8 placeholders.
I can't figure out what the problem could be; I have already added insertable = false, updatable = false on those mappings.

Getting JPA error integrity constraint (FK_XXXXX) violated - parent key not found

I have two tables
CREATE TABLE stripe_product (
id NUMBER NOT NULL PRIMARY KEY,
product_id VARCHAR2(256) NOT NULL,
name VARCHAR2(256) NOT NULL,
description VARCHAR2(256),
active NUMBER(1,0),
deleted NUMBER(1,0),
created_at TIMESTAMP,
created_by NUMBER,
updated_at TIMESTAMP,
updated_by NUMBER,
deleted_at TIMESTAMP,
CONSTRAINT UC_stripe_product_id_product_id UNIQUE (id, product_id),
CONSTRAINT UC_stripe_product_product_id UNIQUE (product_id)
);
And
CREATE TABLE stripe_price (
id NUMBER NOT NULL PRIMARY KEY,
price_id VARCHAR2(256) NOT NULL,
stripe_product_product_id VARCHAR2(256) NOT NULL,
active NUMBER(1,0),
deleted NUMBER(1,0),
currency VARCHAR2(10) NOT NULL,
billing_scheme VARCHAR2(50) NOT NULL,
unit_amount NUMBER NOT NULL,
type VARCHAR2(50) NOT NULL,
recurring_aggregate_usage VARCHAR2(50),
recurring_interval VARCHAR2(50),
recurring_interval_count NUMBER,
recurring_usage_type VARCHAR2(50),
created_at TIMESTAMP,
created_by NUMBER,
updated_at TIMESTAMP,
updated_by NUMBER,
deleted_at TIMESTAMP,
CONSTRAINT UC_stripe_price_id_price_id_stripe_product_product_id UNIQUE (id, price_id, stripe_product_product_id),
CONSTRAINT UC_stripe_price_price_id UNIQUE (price_id),
CONSTRAINT FK_stripe_price_stripe_product_product_id FOREIGN KEY (stripe_product_product_id)
REFERENCES stripe_product(product_id) ON DELETE CASCADE
);
I mapped these tables using the following classes
@Entity
@EntityListeners(AuditingEntityListener.class)
@Table(name = "stripe_product")
public class StripeProduct {

    @Id
    @SequenceGenerator(name = "stripe_product_seq", sequenceName = "stripe_product_seq", allocationSize = 1)
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "stripe_product_seq")
    private Long id;

    @Column(name = "product_id", unique = true)
    private String productId;

    @Column(nullable = false)
    private String name;

    private String description;
    private Boolean active;
    private Boolean deleted;

    @Embedded
    private Audit audit = new Audit();

    @Column(name = "deleted_at")
    private Instant deletedAt;

    public StripeProduct() {
    }

    public StripeProduct(Product product) {
        this.productId = product.getId();
        this.name = product.getName();
        this.description = product.getDescription();
        this.active = product.getActive();
        this.deleted = product.getDeleted();
    }

    // getters and setters
}
Other one
@Entity
@EntityListeners(AuditingEntityListener.class)
@Table(name = "stripe_price")
public class StripePrice {

    @Id
    @SequenceGenerator(name = "stripe_price_seq", sequenceName = "stripe_price_seq", allocationSize = 1)
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "stripe_price_seq")
    private Long id;

    @Column(name = "price_id", unique = true)
    private String priceId;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "stripe_product_product_id")
    private StripeProduct stripeProduct;

    private Boolean active;
    private Boolean deleted;
    private String currency;
    // ....

    @Embedded
    private Audit audit = new Audit();

    @Column(name = "deleted_at")
    private Instant deletedAt;

    public StripePrice() {
    }

    public StripePrice(Price price, StripeProduct stripeProduct) {
        Assert.notNull(price, "price cannot be null");
        this.priceId = price.getId();
        this.stripeProduct = stripeProduct;
        this.active = price.getActive();
        this.currency = price.getCurrency();
        this.billingScheme = price.getBillingScheme();
        this.unitAmount = price.getUnitAmount();
        this.type = price.getType();
        Recurring recurring = price.getRecurring();
        if (recurring != null) {
            this.recurringAggregateUsage = recurring.getAggregateUsage();
            this.recurringInterval = recurring.getInterval();
            this.recurringIntervalCount = recurring.getIntervalCount();
            this.recurringUsageType = recurring.getUsageType();
        }
        this.deleted = price.getDeleted();
    }

    // getters and setters
}
In the database I already have the corresponding stripe_product records.
Now, if I insert the records directly into the database using the following SQL, it works:
insert into stripe_price (active, created_by, created_at, updated_by, updated_at, billing_scheme,
currency, deleted, deleted_at, price_id, recurring_aggregate_usage, recurring_interval,
recurring_interval_count, recurring_usage_type, stripe_product_product_id, type, unit_amount, id)
values (1, 0, SYSDATE, 0, SYSDATE, 'Billing scheme', 'usd', 0, null, 'adsad', 'hjgjh', 'sfsad', 1,
'asdsad', 'prod_Io2qV0NPORZhnX', 'adsad', 100, 33);
insert into stripe_price (active, created_by, created_at, updated_by, updated_at, billing_scheme,
currency, deleted, deleted_at, price_id, recurring_aggregate_usage, recurring_interval,
recurring_interval_count, recurring_usage_type, stripe_product_product_id, type, unit_amount, id)
values (1, 0, SYSDATE, 0, SYSDATE, 'Billing scheme', 'usd', 0, null, 'price_id-2', 'hjgjh', 'sfsad',
1, 'asdsadxzcxzc', 'prod_Io2qV0NPORZhnX', 'asd1234', 100, 34)
But when I use JPA I get this error:
Caused by: java.sql.BatchUpdateException: ORA-02291: integrity constraint (BUILDADMIN.FK_STRIPE_PRICE_STRIPE_PRODUCT_PRODUCT_ID) violated - parent key not found
Here is my code
List<Price> prices = priceCollection.getData();
if (!CollectionUtils.isEmpty(prices)) {
    prices.forEach(price -> {
        String productId = price.getProduct();
        StripeProduct managedStripeProduct = stripeProductRepository.findByProductId(productId).orElse(null);
        if (managedStripeProduct != null) {
            StripePrice newStripePrice = new StripePrice(price, managedStripeProduct);
            StripePrice managedStripePrice = stripePriceRepository.save(newStripePrice);
        }
    });
}
In debug I found that the following SQL is generated:
Hibernate: select stripe_price_seq.nextval from dual
Hibernate: insert into stripe_price (active, created_by, created_at, updated_by, updated_at, billing_scheme, currency, deleted, deleted_at, price_id, recurring_aggregate_usage, recurring_interval, recurring_interval_count, recurring_usage_type, stripe_product_product_id, type, unit_amount, id) values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
binding parameter [1] as [BIT] - [true]
binding parameter [2] as [BIGINT] - [0]
binding parameter [3] as [TIMESTAMP] - [2021-01-25T23:18:11.104Z]
binding parameter [4] as [BIGINT] - [0]
binding parameter [5] as [TIMESTAMP] - [2021-01-25T23:18:11.104Z]
binding parameter [6] as [VARCHAR] - [per_unit]
binding parameter [7] as [VARCHAR] - [usd]
binding parameter [8] as [BIT] - [null]
binding parameter [9] as [TIMESTAMP] - [null]
binding parameter [10] as [VARCHAR] - [price_1ICQl8JOji9YLkEKmju4jUmu]
binding parameter [11] as [VARCHAR] - [null]
binding parameter [12] as [VARCHAR] - [month]
binding parameter [13] as [BIGINT] - [1]
binding parameter [14] as [VARCHAR] - [licensed]
binding parameter [15] as [BIGINT] - [30]
binding parameter [16] as [VARCHAR] - [recurring]
binding parameter [17] as [BIGINT] - [100000]
binding parameter [18] as [BIGINT] - [80]
As you can see, there is no stripe_product_product_id value when Hibernate builds the SQL; I think that's why it is generating the error.
I am setting it on StripePrice, but I can't figure out why I am getting the error. Can anyone explain what I am doing wrong, and how I can resolve this issue?
Thanks
I have solved the problem. The problem was that the column names differ: in the stripe_product table the column is called product_id, while in the stripe_price table it is stripe_product_product_id. So I had to use the following in my mapping:
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "stripe_product_product_id", referencedColumnName = "product_id", nullable = false)
private StripeProduct stripeProduct;
So basically referencedColumnName = "product_id" was missing; that's why JPA was unable to find the product_id value in the stripe_product table. Hopefully this will be useful for others too.

Spring Boot/Spring Data. How to fill DB with encoded data using data.sql containing not encoded data

I'm writing some kind of "hello-world" REST service with Spring Data Rest.
I have a simple @Entity class:
@Entity
public class User {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String firstName;
    private String lastName;

    @Temporal(DATE)
    private Date birthDate;

    private String email;

    @Column(length = 60)
    @JsonDeserialize(using = BCryptPasswordDeserializer.class)
    private String password;

    // getters and setters
}
As you can see, there's a @JsonDeserialize above the password field, so any password coming from the client is encoded with BCryptPasswordEncoder before going to the database. BTW, I'm using an H2 in-memory DB.
Implementation of BCryptPasswordDeserializer:
public class BCryptPasswordDeserializer extends JsonDeserializer<String> {

    @Override
    public String deserialize(JsonParser jp, DeserializationContext dc) throws IOException, JsonProcessingException {
        ObjectCodec oc = jp.getCodec();
        JsonNode node = oc.readTree(jp);
        BCryptPasswordEncoder encoder = new BCryptPasswordEncoder();
        String encodedPassword = encoder.encode(node.asText());
        return encodedPassword;
    }
}
Also I have data.sql in my resources:
insert into user (id, first_name, last_name, birth_date, email, password)
values (1, 'Ivan', 'Ivanov', '1990-01-01', 'i.ivanov@test.test', '123567');
insert into user (id, first_name, last_name, birth_date, email, password)
values (2, 'Petr', 'Petrov', '1990-01-01', 'p.petrov@test.test', '123567');
insert into user (id, first_name, last_name, birth_date, email, password)
values (3, 'John', 'Cena', '1990-01-01', 'j.cena@test.test', '123567');
insert into user (id, first_name, last_name, birth_date, email, password)
values (4, 'Michael', 'Moore', '1990-01-01', 'm.moore@test.test', '123567');
insert into user (id, first_name, last_name, birth_date, email, password)
values (5, 'Trevor', 'Fassbender', '1990-01-01', 't.fassbender@test.test', '123567');
And there's my "problem": I need something that will also encode the passwords from this file before inserting them into the table. Is there any way to do that? Or maybe there's a different way in Spring to populate the database that allows applying encryption logic before inserting the data?
I would create a Spring component that uses the repository to insert the data after startup.
Something like this:
@Component
class UserCreator {

    @Autowired
    private UserRepository userRepository;

    @PostConstruct
    public void init() {
        userRepository.save(new User("Ivan", "Ivanov", "1990-01-01", "i.ivanov@test.test", "123567"));
        // etc.
    }
}
To control whether the UserCreator is used or not (you probably don't want to do this in your production environment), you may add a @Profile annotation to the class. Or you could insert the data only if the User database table is empty... (or whatever condition you want).
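For example (a minimal sketch; the profile name "dev" is only an illustration):
@Component
@Profile("dev")   // bean is only created when the "dev" profile is active
class UserCreator {
    // ... same fields and init() method as above
}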

Save LocalDateTime with namedParameterJdbcTemplate

I have a list of 4 million generated entities that I want to move into a table. The entity has fields of type LocalDateTime:
@Data
@AllArgsConstructor
@Entity
@Table(name = "invoices")
public class Invoice {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    Long id;

    @Column(name = "exact_iss_time")
    private LocalDateTime exactIssueTime;

    @Column(name = "final_iss_time")
    private LocalDateTime finalIssueTime;

    @Column(name = "issuer")
    private String issuer;

    @Column(name = "groupid")
    private Integer groupID;

    protected Invoice() {
    }
}
As it is a big number of entities, I want to do it optimally, which I guess means NamedParameterJdbcTemplate.batchUpdate(), like this:
public int[] bulkSaveInvoices(List<Invoice> invoices) {
    String insertSQL = "INSERT INTO invoices VALUES (:id, :exactIssueTime, :finalIssueTime, :issuer, :groupID)";
    SqlParameterSource[] sqlParams = SqlParameterSourceUtils.createBatch(invoices.toArray());
    int[] insertCounts = namedParameterJdbcTemplate.batchUpdate(insertSQL, sqlParams);
    return insertCounts;
}
However, I keep getting this error:
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [INSERT INTO invoices VALUES (?, ?, ?, ?, ?)]; nested exception is org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of java.time.LocalDateTime. Use setObject() with an explicit Types value to specify the type to use.
Am I doing this right? If so, how do I fix this LocalDateTime issue?
If this is the wrong way, then what is the most optimal way to INSERT the 4 million generated test entities into the table with Spring Boot?
JPA 2.1 was released before Java 8 and therefore doesn't support the new Date and Time API.
You can try adding a converter; for more, check "How to persist LocalDate and LocalDateTime with JPA":
@Converter(autoApply = true)
public class LocalDateTimeAttributeConverter implements AttributeConverter<LocalDateTime, Timestamp> {

    @Override
    public Timestamp convertToDatabaseColumn(LocalDateTime locDateTime) {
        return (locDateTime == null ? null : Timestamp.valueOf(locDateTime));
    }

    @Override
    public LocalDateTime convertToEntityAttribute(Timestamp sqlTimestamp) {
        return (sqlTimestamp == null ? null : sqlTimestamp.toLocalDateTime());
    }
}
Or you can try a non-JDBC approach and use a custom JPA batch insert instead.
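A minimal sketch of that idea, assuming an injected EntityManager and hibernate.jdbc.batch_size configured; the batch size of 50 is only an illustration:
@PersistenceContext
private EntityManager entityManager;

@Transactional
public void saveInvoices(List<Invoice> invoices) {
    final int batchSize = 50;               // should match hibernate.jdbc.batch_size
    for (int i = 0; i < invoices.size(); i++) {
        entityManager.persist(invoices.get(i));
        if ((i + 1) % batchSize == 0) {
            entityManager.flush();          // push the current chunk to the database
            entityManager.clear();          // detach entities so the persistence context stays small
        }
    }
    entityManager.flush();
    entityManager.clear();
}
Note that with GenerationType.IDENTITY (as in the Invoice entity above) Hibernate cannot batch inserts, so a sequence-based id would be needed for this to actually batch.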
