I am using Spring and my class is annotated with @Transactional.
I am using SimpleJdbcInsert, but I am getting the following warning:
TableMetaDataProvider: - Unable to locate table meta data for
'data.data_insert' -- column names must be provided
I have three tables, related as follows:
the primary key of table 1 is a foreign key in table 2, and the primary key of table 2 is a foreign key in table 3.
Here is the insert code for table 1:
java.sql.Timestamp timestamp = getCurrentJavaSqlTimestamp();
Map<String, Object> params = new HashMap<String, Object>();
params.put("notes", task.getNotes());
params.put("recording_time", timestamp);
params.put("end_user_id", 805);
SimpleJdbcInsert insertData = new SimpleJdbcInsert(dataSource)
    .withTableName("data.data_insert")
    .usingColumns("notes", "recording_time", "end_user_id")
    .usingGeneratedKeyColumns("data_id");
long dataId = insertData.executeAndReturnKey(params).longValue();
The error logs:
2015-09-29 14:10:27,133 WARN [http-8080-2] LegacyFlexJsonExceptionMessageConverter: - Generated Key Name(s) not specificed. Using the generated keys features requires specifying the name(s) of the generated column(s) for User ID: 805, Request ID: f8da3bb5-0613-4a74-9ca8-95a6ab4f1692, clientIP: 127.0.0.1 uri: /admin/dataInsert, Request Parameters:
org.springframework.dao.InvalidDataAccessApiUsageException: Generated Key Name(s) not specificed. Using the generated keys features requires specifying the name(s) of the generated column(s)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.prepareStatementForGeneratedKeys(AbstractJdbcInsert.java:530)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.access$0(AbstractJdbcInsert.java:528)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert$1.createPreparedStatement(AbstractJdbcInsert.java:448)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:581)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:843)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.executeInsertAndReturnKeyHolderInternal(AbstractJdbcInsert.java:445)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.executeInsertAndReturnKeyInternal(AbstractJdbcInsert.java:426)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.doExecuteAndReturnKey(AbstractJdbcInsert.java:380)
at org.springframework.jdbc.core.simple.SimpleJdbcInsert.executeAndReturnKey(SimpleJdbcInsert.java:122)
at com.gridpoint.energy.datamodel.impl.PGDataFixBackUpManagerBean.backupDataInRange(PGDataFixBackUpManagerBean.java:79)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:319)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
The corrected code:
java.sql.Timestamp timestamp = getCurrentJavaSqlTimestamp();
Map<String, Object> params = new HashMap<String, Object>();
params.put("notes", task.getNotes());
params.put("recording_time", timestamp);
params.put("end_user_id", 805);
SimpleJdbcInsert insertData = new SimpleJdbcInsert(dataSource)
    .withSchemaName("data")
    .withTableName("data_insert")
    .usingColumns("notes", "recording_time", "end_user_id")
    .usingGeneratedKeyColumns("data_id");
long dataId = insertData.executeAndReturnKey(params).longValue();
So I just needed to specify the schema separately with withSchemaName instead of embedding it in the table name.
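For reference, here is a minimal sketch of the same insert written with MapSqlParameterSource instead of a raw HashMap; the schema, table, and column names are taken from the snippet above, while the method signature is only illustrative:

import javax.sql.DataSource;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.simple.SimpleJdbcInsert;

public long insertData(DataSource dataSource, String notes, java.sql.Timestamp recordingTime, int endUserId) {
    // Column names must match the data.data_insert table definition
    SimpleJdbcInsert insert = new SimpleJdbcInsert(dataSource)
            .withSchemaName("data")
            .withTableName("data_insert")
            .usingColumns("notes", "recording_time", "end_user_id")
            .usingGeneratedKeyColumns("data_id");

    MapSqlParameterSource params = new MapSqlParameterSource()
            .addValue("notes", notes)
            .addValue("recording_time", recordingTime)
            .addValue("end_user_id", endUserId);

    // executeAndReturnKey returns the generated data_id
    return insert.executeAndReturnKey(params).longValue();
}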
Related
I'm trying to insert some data into an H2 in-memory database to run some tests and simulate my real Oracle 11g, but I'm facing the error NULL not allowed for column "ID". I've tried many solutions, like putting the ID with a default value in the insert query and changing to MODE=LEGACY in the JDBC URL, but it didn't work.
data.sql
INSERT INTO CSM_SECURITY.csm_person(EMAIL, PERSON_NAME, SYS_ADMIN, DEFAULT_LANGUAGE, WSO2_ID, SYS_ADM_COUNTRY_ID, SYS_ADMIN_COUNTRY)
VALUES ('admin@wso2.com','admin','Y','es-ar',NULL,NULL,NULL);
Person Entity
@Entity
@Table(name = "csm_person", uniqueConstraints = @UniqueConstraint(columnNames = "email"))
public class Person implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "PERSON_SEQ")
@SequenceGenerator(sequenceName = "CSM_PERSON_ID_SEQ", allocationSize = 1, name = "PERSON_SEQ")
private Long id;
@Column(name = "email")
private String email;
@Column(name = "person_name")
private String personName;
...
src/test/resources/application.properties
# SQL properties
spring.jpa.defer-datasource-initialization=true
spring.jpa.database-platform = org.hibernate.dialect.H2Dialect
spring.jpa.show-sql = true
spring.jpa.properties.hibernate.format_sql = true
spring.jpa.properties.hibernate.default_schema = CSM_SECURITY
spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.H2Dialect
spring.jpa.properties.hibernate.event.merge.entity_copy_observer = allow
spring.jpa.open-in-view = false
spring.datasource.driver-class-name = org.h2.Driver
spring.sql.init.platform = h2
spring.datasource.name = CSM_SECURITY
spring.datasource.url = jdbc:h2:mem:CSM_SECURITY;MODE=Oracle;DB_CLOSE_DELAY=-1;INIT=CREATE SCHEMA IF NOT EXISTS CSM_SECURITY
spring.sql.init.mode = embedded
spring.datasource.username = sa
pom.xml
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.7.0</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
Error
Caused by: org.springframework.jdbc.datasource.init.ScriptStatementFailedException:
Failed to execute SQL script statement #1 of
URL [file:/home/ainacio/Development/csm-security/target/test-classes/data.sql]:
INSERT INTO CSM_SECURITY.csm_person(EMAIL, PERSON_NAME, SYS_ADMIN, DEFAULT_LANGUAGE, WSO2_ID, SYS_ADM_COUNTRY_ID, SYS_ADMIN_COUNTRY)
VALUES ('admin@wso2.com','admin','Y','es-ar',NULL,NULL,NULL);
nested exception is org.h2.jdbc.JdbcSQLIntegrityConstraintViolationException: NULL not allowed for column "ID"; SQL statement:
For such insertions you would normally use identity columns (GenerationType.IDENTITY), but they aren't available in Oracle 11g; only 12c and newer versions support them.
You need to provide a value for the ID column, but this value must not conflict with the values produced by the sequence. For example, you can set the start value of the sequence generator to 100 and use smaller values in your initialization scripts.
Alternatively, you can fetch the value of the sequence yourself, but Oracle 11g doesn't support sequence value expressions inside insert values or inside subqueries. So if you need to execute the same SQL in both H2 and that historic version of Oracle, you need something like this:
INSERT INTO CSM_SECURITY.CSM_PERSON
(ID, EMAIL, PERSON_NAME, SYS_ADMIN, DEFAULT_LANGUAGE,
WSO2_ID, SYS_ADM_COUNTRY_ID, SYS_ADMIN_COUNTRY)
SELECT CSM_SECURITY.CSM_PERSON_ID_SEQ.NEXTVAL,
'admin@wso2.com', 'admin', 'Y', 'es-ar', NULL, NULL, NULL FROM DUAL;
Make sure you're using the Oracle compatibility mode of H2, because new versions of H2 don't accept NEXTVAL in Regular mode.
If you use this initialization script only with H2, the trick with inserting from a query isn't required:
INSERT INTO CSM_SECURITY.CSM_PERSON
(ID, EMAIL, PERSON_NAME, SYS_ADMIN, DEFAULT_LANGUAGE,
WSO2_ID, SYS_ADM_COUNTRY_ID, SYS_ADMIN_COUNTRY)
VALUES (NEXT VALUE FOR CSM_SECURITY.CSM_PERSON_ID_SEQ,
'admin@wso2.com', 'admin', 'Y', 'es-ar', NULL, NULL, NULL);
I am trying to add new columns dynamically from a Spring Boot application. Let us say that each time an event e occurs, I want to add a column to a Cassandra table with a well-defined column name and type. I have tried this code:
#Query("alter table attributes.attributedata add ?0 ?1")
public void addColumn(String columnName, String dataType);
Error Log:
org.springframework.cassandra.support.exception.CassandraQuerySyntaxException: line 1:41 no viable alternative at input 'gac5' (alter table attributes.attributedata add ['gac]...); nested exception is com.datastax.driver.core.exceptions.SyntaxError: line 1:41 no viable alternative at input 'gac5' (alter table attributes.attributedata add ['gac]...)
at org.springframework.cassandra.support.CassandraExceptionTranslator.translateExceptionIfPossible(CassandraExceptionTranslator.java:132)
at org.springframework.cassandra.core.CqlTemplate.potentiallyConvertRuntimeException(CqlTemplate.java:946)
at org.springframework.cassandra.core.CqlTemplate.translateExceptionIfPossible(CqlTemplate.java:930)
at org.springframework.cassandra.core.CqlTemplate.translateExceptionIfPossible(CqlTemplate.java:912)
at org.springframework.cassandra.core.CqlTemplate.doExecute(CqlTemplate.java:278)
at org.springframework.cassandra.core.CqlTemplate.doExecute(CqlTemplate.java:559)
at org.springframework.cassandra.core.CqlTemplate.doExecute(CqlTemplate.java:549)
at org.springframework.cassandra.core.CqlTemplate.query(CqlTemplate.java:485)
at org.springframework.cassandra.core.CqlTemplate.query(CqlTemplate.java:510)
at org.springframework.cassandra.core.CqlTemplate.query(CqlTemplate.java:505)
at org.springframework.data.cassandra.core.CassandraTemplate.selectOne(CassandraTemplate.java:638)
at org.springframework.data.cassandra.core.CassandraTemplate.selectOne(CassandraTemplate.java:509)
at org.springframework.data.cassandra.repository.query.CassandraQueryExecution$SingleEntityExecution.execute(CassandraQueryExecution.java:104)
at org.springframework.data.cassandra.repository.query.CassandraQueryExecution$ResultProcessingExecution.execute(CassandraQueryExecution.java:143)
at org.springframework.data.cassandra.repository.query.AbstractCassandraQuery.execute(AbstractCassandraQuery.java:113)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:483)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:461)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke(DefaultMethodInvokingMethodInterceptor.java:56)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke(SurroundingTransactionDetectorMethodInterceptor.java:57)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
I have also tried storing the entire query in a string and placing the string directly into value, such that
@Query(value = "?0")
but that does not work either.
The following hard-coded query works perfectly fine; the parameterized version would also work if the values bound from the method arguments were not enclosed in single quotes.
#Query("alter table attributes.attributedata add dummy text")
public void addColumn(String columnName, String dataType);
Is there any way this could work? Please suggest the possible alternatives.
A Cluster object can be created as shown in the code below, followed by the Session object on which the query is executed. The name and type of the column to be added are passed as parameters.
public void addColumn(String columnName, String columnType){
//Query
String query = "ALTER TABLE keyspace.tablename ADD " + columnName + " " + columnType;
//Creating Cluster object
Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
//Creating Session object
Session session = cluster.connect();
//Executing the query
session.execute(query);
System.out.println("Column added");
}
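For example, the method above could then be invoked with the column name and type seen in the failing query (the values here are only illustrative):

// Adds a text column named "gac5" to the table targeted by the query string
addColumn("gac5", "text");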
To optimize the code, we can create the Session object only once during application startup. The implementation for such a scenario is given below:
private Session session;
@EventListener(ApplicationReadyEvent.class)
private void createSessionObject(){
//Creating Cluster object
Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
//Creating Session object (assign to the field, not a new local variable)
this.session = cluster.connect();
}
public void addColumn(String columnName, String columnType){
//Query
String query = "ALTER TABLE keyspace.tablename ADD " + columnName + " " + columnType;
if(session == null){
createSessionObject();
}
//Executing the query
session.execute(query);
System.out.println("Column added");
}
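Alternatively, since the stack trace shows that Spring Data Cassandra's CqlTemplate is already configured, the ALTER TABLE statement could be executed through that existing bean instead of opening a separate Cluster. A rough sketch, assuming a CqlOperations bean is available for injection:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cassandra.core.CqlOperations;
import org.springframework.stereotype.Service;

@Service
public class AttributeColumnService {

    @Autowired
    private CqlOperations cqlOperations; // backed by the CqlTemplate from the Spring Data Cassandra configuration

    public void addColumn(String columnName, String dataType) {
        // DDL statements cannot be parameterized, so the CQL is built as a plain string;
        // validate or whitelist columnName and dataType before concatenating them.
        String cql = "ALTER TABLE attributes.attributedata ADD " + columnName + " " + dataType;
        cqlOperations.execute(cql);
    }
}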
You can try something like this using CONCAT:
@Query("alter table attributes.attributedata add CONCAT('%',:columnNameDataType,'%')")
public void addColumn(String columnNameDataType);
Check this link
I am trying to use Spring Data JDBC for my PostgreSQL database. I defined the following beans
@Data
class Report {
@Id
private Long id;
private String name;
private Set<Dimension> dimensions;
}
@Data
class Dimension {
private String name;
private Long[] filterIds;
}
and the corresponding DDL
CREATE TABLE report (
id bigserial PRIMARY KEY,
name text NOT NULL
);
CREATE TABLE dimension (
id bigserial PRIMARY KEY ,
report bigint,
name text,
filter_ids bigint[],
FOREIGN KEY (report) REFERENCES report(id) ON DELETE CASCADE ON UPDATE CASCADE
);
Then I tried to insert a report
final Dimension dimension = new Dimension();
dimension.setName("xyz");
dimension.setFilterIds(new Long[]{ 1L, 2L, 3L });
final Report report = new Report();
report.setName("xyz");
report.setDimensions(Collections.singleton(dimension));
repository.save(report);
where repository is simply a CrudRepository<Report, Long>.
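The repository itself is just a plain interface, roughly like this hypothetical declaration:

import org.springframework.data.repository.CrudRepository;

interface ReportRepository extends CrudRepository<Report, Long> {
}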
This gave me the following error
org.postgresql.util.PSQLException: ERROR: column "filter_ids" is of type bigint[] but expression is of type bigint
Hint: You will need to rewrite or cast the expression.
Position: 116
Can I somehow tell Spring Data JDBC how to map the array types?
With the release of Spring Data JDBC 1.1.0, this became possible. See the documentation here:
The properties of the following types are currently supported:
All primitive types and their boxed types (int, float, Integer, Float, and so on)
Enums get mapped to their name.
String
java.util.Date, java.time.LocalDate, java.time.LocalDateTime, and java.time.LocalTime
Arrays and Collections of the types mentioned above can be mapped to columns of array type if your database supports that.
...
As P44T answered, this works from version 1.1 of Spring Data JDBC onwards, just as you used it.
Original answer
It is currently not possible. There are issues for this. A starting point is this one: https://jira.spring.io/browse/DATAJDBC-259
I'm trying the following unit test:
@Test
@Transactional
public void thatFolderLocationAssociationTableIsWorking() {
Location l1 = new DomesticLocation();
l1.setDeptName("Test name 1");
Location l2 = new DomesticLocation();
l2.setDeptName("Test name 2");
KMLFolder k1 = new KMLFolder();
k1.setName("Test name 1");
KMLFolder k2 = new KMLFolder();
k1.setName("Test name 2");
List<Location> locations = new ArrayList<Location>();
locations.add(l1);
locations.add(l2);
k1.setLocations(locations);
kmlFolderServiceImpl.save(k1);
assertEquals("Test name 1", kmlFolderServiceImpl.find(1L).getLocations().get(0).getDeptName());
assertEquals("Test name 2", kmlFolderServiceImpl.find(1L).getLocations().get(1).getDeptName());
//The following line gets the NPE
assertEquals("Test name 1", locationServiceImpl.find(1L).getKmlFolderList().get(0).getName());
}
I'm getting an NPE on the last assertion, where I'm trying to retrieve KMLFolder.getName() from the Location. The other assertions, where I get the Location name from the KMLFolder, are working.
Here are my JPA definitions:
@ManyToMany(mappedBy="kmlFolderList", cascade=CascadeType.ALL)
private List<Location> locations = new ArrayList<Location>();
@ManyToMany
@JoinTable(name="LOCATION_KMLFOLDER",
joinColumns={@JoinColumn(name="KMLFOLDER_ID", referencedColumnName="ID")},
inverseJoinColumns={@JoinColumn(name="LOCATION_ID", referencedColumnName="ID")}
)
The appropriate table is being created when I run the test. Here's the console output:
Hibernate:
create table project.LOCATION_KMLFOLDER (
KMLFOLDER_ID bigint not null,
LOCATION_ID bigint not null
) ENGINE=InnoDB
...
Hibernate:
alter table project.LOCATION_KMLFOLDER
add index FK_lqllrwb2t5cn0cbxxx3ms26ku (LOCATION_ID),
add constraint FK_lqllrwb2t5cn0cbxxx3ms26ku
foreign key (LOCATION_ID)
references project.KMLFolder (id)
Hibernate:
alter table project.LOCATION_KMLFOLDER
add index FK_ckj00nos13yojmcyvtefgk9pl (KMLFOLDER_ID),
add constraint FK_ckj00nos13yojmcyvtefgk9pl
foreign key (KMLFOLDER_ID)
references project.Locations (id)
The console does not show inserts into the LOCATION_KMLFOLDER table as I expect. Any thoughts on why this may be happening?
You're initializing the inverse side of the association, which Hibernate ignores, instead of (or in addition to) the owner side of the association, which Hibernate does not ignore.
The owner side is the side without the mappedBy attribute.
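In the test above, only KMLFolder.locations (the mappedBy side) is populated, so no rows are written to LOCATION_KMLFOLDER. A sketch of keeping both sides in sync before saving, assuming Location exposes an initialized kmlFolderList via the getter shown in the last assertion:

List<Location> locations = new ArrayList<Location>();
locations.add(l1);
locations.add(l2);
k1.setLocations(locations);        // inverse side; Hibernate ignores it when writing the join table

l1.getKmlFolderList().add(k1);     // owner side; this is what produces the LOCATION_KMLFOLDER inserts
l2.getKmlFolderList().add(k1);

kmlFolderServiceImpl.save(k1);     // cascade on KMLFolder.locations still persists the Location entities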
Below is the code I am using to save a record in the database and then get the generated primary key.
public void save(User user) {
// TODO Auto-generated method stub
Object[] args = { user.getFirstname(), user.getLastname(),
user.getEmail() };
int[] types = { Types.VARCHAR, Types.VARCHAR, Types.VARCHAR };
SqlUpdate su = new SqlUpdate();
su.setJdbcTemplate(getJdbcTemplate());
su.setSql(QUERY_SAVE);
setSqlTypes(su, types);
su.setReturnGeneratedKeys(true);
su.compile();
KeyHolder keyHolder = new GeneratedKeyHolder();
su.update(args, keyHolder);
int id = keyHolder.getKey().intValue();
if (su.isReturnGeneratedKeys()) {
user.setId(id);
} else {
throw new RuntimeException("No key generated for insert statement");
}
}
But it's not working. It gives me the following error:
The generated key is not of a supported numeric type. Unable to cast [oracle.sql.ROWID] to [java.lang.Number]
The row is being inserted into the database properly. I could also get the generated primary key when using an MS SQL database, but the same code is not working with Oracle 11g.
Please help.
As noted in the comments, Oracle ROWIDs are alphanumeric, so they can't be cast to an int.
Besides that, you should not use the generated ROWID anywhere in your code; it is not the primary key that you defined on the table.
MS SQL has the option to declare a column as an auto-incrementing primary key. That functionality is not available in Oracle 11g.
What I always do (regardless of whether the DB supports auto-increment) is the following:
select sequenceName.nextval from dual
The value returned by the previous statement is used as the primary key for the insert statement.
insert into something (pk, ...) values (:pk,:.....)
That way we always have the pk after the insert.
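A minimal sketch of that pattern using JdbcTemplate; the sequence, table, and column names below are placeholders and have to match your actual schema:

import java.sql.Types;
import org.springframework.jdbc.core.JdbcTemplate;

public void save(User user) {
    JdbcTemplate jdbcTemplate = getJdbcTemplate();

    // 1. Fetch the next primary key value from the sequence (sequence name is a placeholder)
    Long id = jdbcTemplate.queryForObject("select user_seq.nextval from dual", Long.class);

    // 2. Insert the row using the key fetched above
    jdbcTemplate.update(
            "insert into users (id, firstname, lastname, email) values (?, ?, ?, ?)",
            new Object[] { id, user.getFirstname(), user.getLastname(), user.getEmail() },
            new int[] { Types.NUMERIC, Types.VARCHAR, Types.VARCHAR, Types.VARCHAR });

    // 3. The primary key is known without relying on getGeneratedKeys() or ROWID
    user.setId(id.intValue());
}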