Spring SimpleJdbcInsert fails with 'INSERT has more target columns than expressions'

I am using a SimpleJdbcInsert to insert rows into a PostgreSQL database. However, I get the following error:
Caused by: org.postgresql.util.PSQLException: ERROR: INSERT has more target columns than expressions.

org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; uncategorized SQLException for SQL [INSERT INTO product (product_id,product_name,product_code,in_stock,product_category) VALUES(?)]; SQL state [25P02]; error code [0]; ERROR: current transaction is aborted, commands ignored until end of transaction block; nested exception is org.postgresql.util.PSQLException: ERROR: current transaction is aborted, commands ignored until end of transaction block
The number of columns is exactly the same as the number of values I am trying to insert, as you can see when I print out the MapSqlParameterSource object below:
Parameters Names ::
[
product_id,
product_name,
product_code,
in_stock,
product_category
]
Parameters Values :: [{
product_id=1518,
product_name=Sofa,
product_code=150,
in_stock=true,
product_category=null,
}]
The product_id is the primary key and it is not null. Could the problem be because I am not using an auto-generated primary key? I still do not understand why that would be a problem.
The columns shown in the error message are precisely the same as the columns in the parameter list I'm printing, and the values tally with the number of columns, so I'm really baffled as to why PostgreSQL is giving this error. Please help!

I was able to solve it with a different approach rather than Spring JDBC.
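The answer doesn't say which alternative was used. The generated SQL in the error has five column names but only a single ? placeholder, which points at SimpleJdbcInsert not binding one value per column from its metadata lookup. If you want to stay with SimpleJdbcInsert, one commonly suggested workaround is to turn off column metadata access and list the columns explicitly. A minimal, hypothetical sketch (the original insert code isn't shown; table and column names are taken from the question):

import javax.sql.DataSource;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.simple.SimpleJdbcInsert;

public class ProductDao {

    private final SimpleJdbcInsert insertProduct;

    public ProductDao(DataSource dataSource) {
        this.insertProduct = new SimpleJdbcInsert(dataSource)
                .withTableName("product")
                .withoutTableColumnMetaDataAccess()          // don't rely on JDBC metadata lookup
                .usingColumns("product_id", "product_name",  // declare the columns explicitly
                        "product_code", "in_stock", "product_category");
    }

    public void insert(long id, String name, String code, boolean inStock, String category) {
        MapSqlParameterSource params = new MapSqlParameterSource()
                .addValue("product_id", id)
                .addValue("product_name", name)
                .addValue("product_code", code)
                .addValue("in_stock", inStock)
                .addValue("product_category", category);
        insertProduct.execute(params);
    }
}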

Related

Nifi throwing None of the fields in the record map to the columns defined by [table name]

I am trying to execute a SQL query on an Oracle database and insert the result into another table. For my trial I am just performing a simple query:
SELECT 1 AS count
FROM dual
and trying to insert that into a single-column table whose column is named COUNT.
The content of the record in NiFi seems to be as follows:
[
{
"COUNT" : "1"
}
]
but the logs keep throwing the error:
due to java.sql.SQLDataException:
None of the fields in the record map to the columns defined by
the schema_name.table_name table:
Any ideas?
I believe you get that same error message if your table name doesn't match. The Translate Field Names property only translates the fields (columns), not the table name. Try specifying the schema/table in uppercase to match what Oracle is expecting.

Spark SQL throwing error "java.lang.UnsupportedOperationException: Unknown field type: void"

I am getting the below error in Spark (1.6) SQL while creating a table with a column that defaults to NULL. Ex: create table test as select column_a, NULL as column_b from test_temp;
The same thing works in Hive and creates the column with data type "void".
For now I am using an empty string instead of NULL to avoid the exception, which gives the new column a string data type.
Is there a better way to insert null values into a Hive table using Spark SQL?
2017-12-26 07:27:59 ERROR StandardImsLogger$:177 - org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.UnsupportedOperationException: Unknown field type: void
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:789)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:746)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply$mcV$sp(ClientWrapper.scala:428)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply(ClientWrapper.scala:426)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply(ClientWrapper.scala:426)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:293)
at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:239)
at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:238)
at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:281)
at org.apache.spark.sql.hive.client.ClientWrapper.createTable(ClientWrapper.scala:426)
at org.apache.spark.sql.hive.execution.CreateTableAsSelect.metastoreRelation$lzycompute$1(CreateTableAsSelect.scala:72)
at org.apache.spark.sql.hive.execution.CreateTableAsSelect.metastoreRelation$1(CreateTableAsSelect.scala:47)
at org.apache.spark.sql.hive.execution.CreateTableAsSelect.run(CreateTableAsSelect.scala:89)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:56)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:56)
at org.apache.spark.sql.DataFrame.withCallback(DataFrame.scala:153)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:829)
I couldn't find much information regarding the datatype void but it looks like it is somewhat equivalent to the Any datatype we have in Scala.
The table at the end of this page explains that a void can be cast to any other data type.
Here are some JIRA issues that are similar to the problem you are facing:
HIVE-2901
HIVE-747
So, as mentioned in the comment, instead of a bare NULL you can cast it to an explicit data type:
select cast(NULL as string) as column_b
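Applied to the original CREATE TABLE from the question, that would look something like:
create table test as select column_a, cast(NULL as string) as column_b from test_temp;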
I started to get a similar issue. I boiled the code down to this example:
WITH DATA
AS (
SELECT 1 ISSUE_ID,
DATE(NULL) DueDate,
MAKE_DATE(2000,01,01) DDate
UNION ALL
SELECT 1 ISSUE_ID,
MAKE_DATE(2000,01,01),
MAKE_DATE(2000,01,02)
)
SELECT ISNOTNULL(lag(IT.DueDate, 1) OVER (PARTITION by IT.ISSUE_ID ORDER BY IT.DDate ))
AND ISNULL(IT.DueDate)
FROM DATA IT

JHipster Post REST API Displays SQL Error

I am creating a project using JHipster.
I have 2 entities (Course, Student) and they have a many-to-many relation.
So there is another entity called Course_Student and it has 2 columns (student_id, course_id).
Students (users) need to register for a course by clicking the Register Course button on their own panel.
So I wrote a query and a REST API function.
Here is my query function:
@Modifying
@Query(value="insert into COURSE_STUDENT (STUDENTS_ID, COURSES_ID) VALUES (:studentId,:courseId)",nativeQuery = true)
@Transactional
void registerCourse(@Param("studentId") Long studentId, @Param("courseId") Long courseId);
And the REST API function:
@PostMapping("/registercourse/{studentId}/{courseId}")
@Timed
public ResponseEntity<Void> registerCourse(@PathVariable Long studentId,@PathVariable Long courseId) {
log.debug("REST request to register {} Course : {} Student : {}", courseId,studentId);
courseRepository.registerCourse(studentId,studentId);
return ResponseEntity.ok().headers(HeaderUtil.createEntityCreationAlert(ENTITY_NAME, courseId.toString())).build();
}
But when I try to run the registerCourse API using Swagger, the console displays these errors (I entered student id: 33, course id: 1):
c.m.m.w.rest.errors.ExceptionTranslator : An unexpected error occurred: could not execute statement; SQL [n/a]; constraint ["FK_COURSE_STUDENT_COURSES_ID: PUBLIC.COURSE_STUDENT FOREIGN KEY(COURSES_ID) REFERENCES PUBLIC.COURSE(ID) (33)"; SQL statement:
insert into COURSE_STUDENT (STUDENTS_ID, COURSES_ID) VALUES (?,?) [23506-196]]; nested exception is org.hibernate.exception.ConstraintViolationException: could not execute statement
org.springframework.dao.DataIntegrityViolationException: could not execute statement; SQL [n/a]; constraint ["FK_COURSE_STUDENT_COURSES_ID: PUBLIC.COURSE_STUDENT FOREIGN KEY(COURSES_ID) REFERENCES PUBLIC.COURSE(ID) (33)"; SQL statement:
insert into COURSE_STUDENT (STUDENTS_ID, COURSES_ID) VALUES (?,?) [23506-196]]; nested exception is org.hibernate.exception.ConstraintViolationException: could not execute statement
at org.springframework.orm.jpa.vendor.HibernateJpaDialect.convertHibernateAccessException(HibernateJpaDialect.java:278)
Caused by: org.hibernate.exception.ConstraintViolationException: could not execute statement
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:112)
... 145 common frames omitted
Caused by: org.h2.jdbc.JdbcSQLException: Referential integrity constraint violation: "FK_COURSE_STUDENT_COURSES_ID: PUBLIC.COURSE_STUDENT FOREIGN KEY(COURSES_ID) REFERENCES PUBLIC.COURSE(ID) (33)"; SQL statement:
insert into COURSE_STUDENT (STUDENTS_ID, COURSES_ID) VALUES (?,?) [23506-196]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
... 163 common frames omitted
2017-09-08 08:58:57.099 WARN 7528 --- [ XNIO-2 task-8] .m.m.a.ExceptionHandlerExceptionResolver : Resolved exception caused by Handler execution: org.springframework.
You are passing the student_id as the course_id, and no course exists with the id of 33. See the third line of your REST API's registerCourse:
courseRepository.registerCourse(studentId,studentId);
change to:
courseRepository.registerCourse(studentId,courseId);

Advantage Database Server 8.1 UNIQUE CONSTRAINT multiple columns

I am working on an Advantage Database Server 8.1 and I have created a new table. I want to add a unique constraint for the combination of 2 columns.
I tried
ALTER TABLE TableName
ADD CONSTRAINT ConstraintName
UNIQUE (ColumnName1, ColumnName2)
but I get the error
"ERROR IN SCRIPT: poQuery: Error 7200: AQE Error: State = 42000; NativeError = 2115; [Extended Systems][Advantage SQL Engine]Expected lexical element not found: You are missing the column names. -- Location of error in the SQL
statement is: 33 (line: 2 column: 5)"
OK, the solution I found is:
CREATE UNIQUE INDEX ConstraintName ON TableName (ColumnName1, ColumnName2);

SQL MINUS and lower/upper don't work together in JDBC

I have an HSQLDB 2.2.9 database and the following statement:
(SELECT lower(MyCol) FROM MyTable WHERE ID = ?)
MINUS
(SELECT lower(MyCol) FROM MyTable WHERE ID = ?)
It works in SQuirreL, but when I execute it in my program, which uses JDBC, I get the following exception:
Exception in thread "main" org.springframework.dao.TransientDataAccessResourceException: PreparedStatementCallback; SQL [(SELECT lower(MyCol) FROM MyTable WHERE ID = ? ) MINUS (SELECT lower(MyCol) FROM MyTable WHERE ID_CENTER = ?)]; Column not found: MyCol; nested exception is java.sql.SQLException: Column not found: MyCol
If I delete the lower(), the statement works, but then it is case sensitive, which I want to eliminate here.
Can someone please tell me why I get this error and how to fix it?
This exception is not thrown by HSQLDB 2.2.9. If the column could not be found, the exception message would be in this form:
user lacks privilege or object not found: MYCOL
Please check your Spring data source settings.
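The question doesn't show its JDBC code, so here is a hypothetical sketch of running the same statement through Spring's JdbcTemplate. Since the answer suggests the exception comes from the data source configuration rather than from HSQLDB 2.2.9 itself, the first thing to verify is that the JDBC URL points at the same database SQuirreL connects to (the connection details and the two ID values below are assumptions):

import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

public class MinusQueryExample {
    public static void main(String[] args) {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("org.hsqldb.jdbc.JDBCDriver");
        ds.setUrl("jdbc:hsqldb:hsql://localhost/mydb"); // must match the database SQuirreL uses
        ds.setUsername("SA");
        ds.setPassword("");

        JdbcTemplate jdbc = new JdbcTemplate(ds);
        // MINUS returns the lower-cased values for the first ID that are absent for the second ID
        List<String> diff = jdbc.queryForList(
                "(SELECT lower(MyCol) FROM MyTable WHERE ID = ?) "
              + "MINUS "
              + "(SELECT lower(MyCol) FROM MyTable WHERE ID = ?)",
                String.class, 1, 2);
        System.out.println(diff);
    }
}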
