Spring Boot: Invalid bean definition with name 'WDSDataSource' defined in URL

While executing the application I get "Invalid bean definition with name 'WDSDataSource'" from persistence.xml, with the root cause "could not resolve placeholder 'AppEncryptionKey'".
Below is a snippet of the WDS configuration from persistence.xml:
<bean id="WDSDataSource" class="com.deere.dsfj.utility.datasource.DSFJTomcatDataSource" init-method="initialize"
      p:username="${ApplicatinID}"
      p:encryptionKey="${AppEncryptionKey}"
      p:encryptionPassword="${AppEncryptedPassword}"
      p:driverClassName="com.ibm.db2.jcc.DB2Driver"
      p:url="${WDSJdbcUrl}"
      p:testWhileIdle="false"
      p:testOnBorrow="true"
      p:testOnReturn="false"
      p:validationInterval="300000"
      p:timeBetweenEvictionRunsMillis="300000"
      p:maxActive="80"
      p:initialSize="0"
      p:maxWait="300000"
      p:removeAbandonedTimeout="300"
      p:minEvictableIdleTimeMillis="300000"
      p:minIdle="1"
      p:maxIdle="10"
      p:logAbandoned="true"
      p:removeAbandoned="true"/>

It looks like Spring cannot find a value for the key AppEncryptionKey in the properties file these placeholders are resolved from.
So ensure that the key AppEncryptionKey is present in that properties file.
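For instance, the properties file these placeholders are resolved from (file name and values are assumed here, not taken from the question) would need entries such as:

```properties
# Placeholder values only; each key must match the placeholder name in the
# bean definition exactly, including the "ApplicatinID" spelling used there.
ApplicatinID=APPUSER01
AppEncryptionKey=changeit
AppEncryptedPassword=ENC(changeit)
WDSJdbcUrl=jdbc:db2://dbhost:50000/WDSDB
```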

Related

How to set args using Spring Camel Lib in application.properties

Working with Spring Boot and Java 11.
In application.properties I want to configure camel.component.rabbitmq.args.
Camel's RabbitMQ documentation: https://camel.apache.org/components/3.18.x/rabbitmq-component.html#_message_body
Note: I'm not asking about RabbitMQ itself, just the configuration in application.properties.
This is the error I received with my attempt:
application.properties
camel.component.rabbitmq.args={arg.queue.x-message-ttl=3600000}
Error:
Failed to bind properties under 'camel.component.rabbitmq.args' to java.util.Map<java.lang.String, java.lang.Object>:
Reason: No converter found capable of converting from type [java.lang.String] to type [java.util.Map<java.lang.String, java.lang.Object>]
What is the correct way to do this, by example? Thanks!
In application.properties, the proper syntax for a component option of type Map is componentOptionName[mapKey]=mapValue, so in your case it would be:
camel.component.rabbitmq.args[queue.x-message-ttl]=3600000
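Multiple arguments follow the same map syntax, one entry per key. For example (x-max-length is a standard RabbitMQ queue argument, added here purely as an illustration):

```properties
camel.component.rabbitmq.args[queue.x-message-ttl]=3600000
camel.component.rabbitmq.args[queue.x-max-length]=10000
```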

Cannot bind environment variable to application.properties

I'm working with Spring Boot and PostgreSQL and failed to bind the database password in application.properties. I have already set DATABASE_PASSWORD in the environment, but it still fails to bind:
spring.datasource.url=jdbc:postgresql://${DATABASE_HOST}:${DATABASE_PORT}/${DATABASE_NAME}?reWriteBatchedInserts=true
spring.datasource.username=${DATABASE_USER}
spring.datasource.password=${DATABASE_PASSWORD}
Description:
Failed to bind properties under 'spring.datasource.password' to
java.lang.String:
Property: spring.datasource.password
Value: ${DATABASE_PASSWORD}
Origin: class path resource [application.properties]:16:28
Reason: Could not resolve placeholder 'DATABASE_PASSWORD' in value "${DATABASE_PASSWORD}"
If you have set DATABASE_PASSWORD as a system environment variable as you say, then Spring should use it, as the documentation states:
"The values in application.properties are filtered through the existing Environment when they are used, so you can refer back to previously defined values (for example, from System properties)."
Did you try restarting?
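A quick way to verify, sketched below under the assumption that the app is launched as a jar (app.jar and the password value are placeholders): export the variable in the same shell session that starts the application, and echo it first, since Spring can only see the environment of its own process.

```shell
# Export in the SAME shell session that will launch the application.
export DATABASE_PASSWORD=changeit
# Confirm the variable is visible before starting the app.
echo "DATABASE_PASSWORD=${DATABASE_PASSWORD}"   # prints DATABASE_PASSWORD=changeit
# java -jar app.jar                             # then start the app from this shell
```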

Spring Boot Actuator Liquibase endpoint fail

I'm trying to use Liquibase with Spring Boot.
Here is my application.properties file:
# ----------------------------------------
# DATA PROPERTIES
# ----------------------------------------
spring.datasource.url=jdbc:postgresql://xxxxxx:5432/dev
spring.datasource.schema=my_schema
spring.datasource.username=my_username
spring.datasource.password=my_password
# LIQUIBASE (LiquibaseProperties)
liquibase.default-schema=${spring.datasource.schema}
liquibase.user=${spring.datasource.username}
liquibase.password=${spring.datasource.password}
Change sets are well applied (table creation is ok).
The problem comes when I access the /liquibase actuator endpoint: I get a 500 error:
Unable to get Liquibase changelog
I also get the following log:
org.postgresql.util.PSQLException: ERROR: relation "public.databasechangelog" does not exist
I think the problem is the schema prefix used to access the changelog table: "public" versus "my_schema".
I thought spring.datasource.schema was the right parameter to set?
Here is a working solution (taken from this answer):
# ----------------------------------------
# DATA PROPERTIES
# ----------------------------------------
spring.datasource.url=jdbc:postgresql://xxxxxx:5432/dev?currentSchema=my_schema
Just a guess here: I think the issue is that your 'real' schema is being set by spring.datasource.schema, but the Liquibase tables are being stored in public, and the Spring Boot actuator may not know that those can be separate.

Spring Boot JMS & Batch

Previously everything worked properly. Today I configured Spring Batch alongside my Spring Boot application and ran into an issue with application.properties.
I have following properties encrypted with Jasypt:
spring.profiles.active=https
ENVIRONMENT=h2
#aws sqs
aws.sqs.account.access.key=ENC(kjsdh456fgkjhdfsgkjhdfg)
#queue message listener
queue.message.listener.task.executor.threads.number=1
queue.message.listener.task.executor.max.concurrent.consumers=1
Now, in order to configure Spring Batch, I added
ENVIRONMENT=h2
to the application.properties file.
I also added a batch-h2.properties file:
# Placeholders batch.* for H2 database:
batch.jdbc.driver=org.h2.Driver
batch.jdbc.url=jdbc:h2:~/testdb;CIPHER=AES;AUTO_SERVER=TRUE;DB_CLOSE_ON_EXIT=FALSE
batch.jdbc.user=sa
batch.jdbc.password="sa sa"
batch.jdbc.testWhileIdle=false
batch.jdbc.validationQuery=
batch.drop.script=classpath:/org/springframework/batch/core/schema-drop-h2.sql
batch.schema.script=classpath:/org/springframework/batch/core/schema-h2.sql
batch.business.schema.script=classpath:/business-schema-h2.sql
batch.database.incrementer.class=org.springframework.jdbc.support.incrementer.H2SequenceMaxValueIncrementer
batch.database.incrementer.parent=sequenceIncrementerParent
batch.lob.handler.class=org.springframework.jdbc.support.lob.DefaultLobHandler
batch.grid.size=2
batch.jdbc.pool.size=6
batch.verify.cursor.position=true
batch.isolationlevel=ISOLATION_SERIALIZABLE
batch.table.prefix=BATCH_
and after that I continuously receive the following exception:
Caused by: java.lang.IllegalArgumentException: Could not resolve placeholder 'aws.sqs.account.access.key' in string value "${aws.sqs.account.access.key}"
The aws.sqs.account.access.key property can no longer be resolved.
I'm injecting this property into my configuration:
@Configuration
public class SQSConfig {

    @Value("${aws.sqs.account.access.key}")
    private String accessKey;
}
How can I fix it?

A non-read-only mapping must be defined for the sequence number field

I am getting the following error from Toplink when I start my application. I am trying to add two new tables to our application.
EXCEPTION [TOPLINK-41] (TopLink - 9.0.3.7 (Build 440)): oracle.toplink.exceptions.DescriptorException
EXCEPTION DESCRIPTION: A non-read-only mapping must be defined for the sequence number field.
DESCRIPTOR: Descriptor(icis.cr.common.db.entities.ClerkReviewTask --> [DatabaseTable(CREV_TASK)])
I have compared the mappings to one that works and haven't noticed any differences. I also compared the new class in the TopLink Mapping Workbench and don't see any missing mapping. My sequence appears to be mapped correctly. Does anyone have any suggestions?
The descriptor has the following for the TASK_ID field:
<primaryKeyFieldHandles>
<FieldHandle>
<table>CREV_TASK</table>
<fieldName>TASK_ID</fieldName>
</FieldHandle>
</primaryKeyFieldHandles>
<sequenceNumberName>SEQ_CREV_TASK_ID</sequenceNumberName>
<sequenceNumberFieldHandle>
<FieldHandle>
<table>CREV_TASK</table>
<fieldName>TASK_ID</fieldName>
</FieldHandle>
</sequenceNumberFieldHandle>
<Mapping>
<descriptor>icis.cr.common.db.entities.ClerkReviewTask.ClassDescriptor</descriptor>
<usesMethodAccessing>false</usesMethodAccessing>
<inherited>false</inherited>
<readOnly>false</readOnly>
<getMethodHandle>
<MethodHandle emptyAggregate="true">
</MethodHandle>
</getMethodHandle>
<setMethodHandle>
<MethodHandle emptyAggregate="true">
</MethodHandle>
</setMethodHandle>
<instanceVariableName>id</instanceVariableName>
<defaultFieldNames>
<defaultFieldName>direct field=</defaultFieldName>
</defaultFieldNames>
<fieldHandle>
<FieldHandle>
<table>CREV_TASK</table>
<fieldName>TASK_ID</fieldName>
</FieldHandle>
</fieldHandle>
<classIndicator>BldrDirectToFieldMapping</classIndicator>
</Mapping>
I was able to fix this by right-clicking my project in the TopLink Mapping Workbench and selecting Export Project to Java Source. My generated file was out of date, which caused this error as well as the following:
EXCEPTION [TOPLINK-110] (TopLink - 9.0.3.7 (Build 440)): oracle.toplink.exceptions.DescriptorException
EXCEPTION DESCRIPTION: Descriptor is missing for class [icis.cr.common.db.entities.ClerkReviewCaseTask].
MAPPING: oracle.toplink.mappings.OneToManyMapping[caseTasks]
DESCRIPTOR: Descriptor(icis.cr.common.db.entities.ClerkReviewTask --> [DatabaseTable(CREV_TASK)])
