How to create a Batch project that doesn't need a data source?

I am creating a batch project that has no need to hit a database, and when I run a context load I get an exception:
"Factory method 'dataSource' threw exception; nested exception is
org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException:
Failed to determine a suitable driver class".
I believe a few dependencies, such as spring-data-jpa, may require a data source to be supplied within my project; when I add a throwaway data source, my context loads just fine.
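If nothing in the project actually talks to a database, the cleanest fix is to drop spring-data-jpa from the POM. Failing that, the JDBC auto-configuration that builds the DataSource can be switched off. A minimal sketch, assuming plain Spring Boot auto-configuration is what pulls the bean in:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
import org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration;
import org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration;

// Keep Boot from trying to create (and then fail to configure) a DataSource.
@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class
})
public class BatchApplication {
    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}

One caveat: up to Spring Batch 4.x the job repository itself wants a DataSource, so if the batch infrastructure still complains, the commonly cited workaround is to extend DefaultBatchConfigurer and override setDataSource(DataSource) with a no-op, which makes it fall back to the in-memory map-based job repository.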

Related

Unable to retrieve the path of the H2 DB when the Spring Boot application is deployed as a jar

I am deploying a Spring Boot application as a jar built by Maven, using an H2 database stored as a file in the resources folder. I create the Spring JdbcTemplate connection through a Hikari configuration, which needs the location of the H2 database file as a URL, and I am providing it as per the H2 documentation. But while creating the connection the application fails with the following exception:
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'h2DBConnection' defined in com.xx.Application: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.jdbc.core.JdbcTemplate]: Factory method 'h2DBConnection' threw exception; nested exception is java.io.FileNotFoundException: class path resource [h2DB.mv.db] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/home/app/app.jar!/BOOT-INF/classes!/h2DB.mv.db
But when I check the jar file, the file is there at the mentioned path.
I saw other posts saying to use a ResourceLoader with an InputStream to load files from the jar, but here I need to provide a path to the DB file for the Hikari config.
It works fine in the IDE; the only issue is when I start the application from the jar.
Can someone help me with how this can be done?
I tried ClassPathResource and ResourceLoader, but JdbcTemplate couldn't get the file.
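The error itself points at the answer: a resource packed inside a jar has no absolute filesystem path, so Hikari can never be handed a classpath location directly. A minimal sketch of one workaround, assuming the file ships as h2DB.mv.db on the classpath (the name in the trace above): copy it out to a temp file and build the JDBC URL from the copy.

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.core.io.ClassPathResource;

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class H2FromJar {
    public static HikariDataSource dataSource() throws Exception {
        // Extract the database file from the jar to a real filesystem path.
        Path dbFile = Files.createTempFile("h2DB", ".mv.db");
        try (InputStream in = new ClassPathResource("h2DB.mv.db").getInputStream()) {
            Files.copy(in, dbFile, StandardCopyOption.REPLACE_EXISTING);
        }
        // H2 appends ".mv.db" itself, so the URL must omit the suffix.
        String path = dbFile.toAbsolutePath().toString().replace(".mv.db", "");
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:h2:file:" + path);
        return new HikariDataSource(config);
    }
}

Bear in mind that writes then go to the temp copy and are lost on redeploy, which is only acceptable if the database is effectively read-only reference data.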

Spring Boot: How to define new object classes in UnboundID LDAP?

In a Spring Boot 2.1.6 project I'm using the embedded LDAP service (UnboundID LDAP). I need to define new attribute types and object classes. My LDIF file contains something like this:
dn: cn=schema
changetype: modify
add: attributetypes
attributetypes: ( 2.25.128424792425578037463837247958458780603.1 NAME 'customAttribute' DESC 'a custom attribute' EQUALITY caseIgnoreMatch SYNTAX '1.3.6.1.4.1.1466.115.121.1.15' )
However, when I deploy my WAR on tomcat I get this exception:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'directoryServer' defined in class path resource [org/springframework/boot/autoconfigure/ldap/embedded/EmbeddedLdapAutoConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.unboundid.ldap.listener.InMemoryDirectoryServer]: Factory method 'directoryServer' threw exception; nested exception is java.lang.IllegalStateException: Unable to load LDIF classpath:test.ldif
...
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.unboundid.ldap.listener.InMemoryDirectoryServer]: Factory method 'directoryServer' threw exception; nested exception is java.lang.IllegalStateException: Unable to load LDIF classpath:test.ldif
...
Caused by: LDAPException(resultCode=68 (entry already exists), errorMessage='Unable to add an entry with a DN that is the same as or subordinate to the subschema subentry DN 'cn=schema'.', ldapSDKVersion=4.0.11, revision=34e39aab27ea4fb92659a6888933db08099c7e41)
at com.unboundid.ldap.listener.InMemoryRequestHandler.addEntry(InMemoryRequestHandler.java:4916)
at com.unboundid.ldap.listener.InMemoryRequestHandler.importFromLDIF(InMemoryRequestHandler.java:4624)
at com.unboundid.ldap.listener.InMemoryDirectoryServer.importFromLDIF(InMemoryDirectoryServer.java:1255)
at org.springframework.boot.autoconfigure.ldap.embedded.EmbeddedLdapAutoConfiguration.importLdif(EmbeddedLdapAutoConfiguration.java:164)
... 76 more
What evidently happens is that, for some reason, the LDAP service treats my record as an attempt to add the entry "cn=schema" and refuses the operation because the entry already exists, even though I'm trying to modify it, hence the "changetype: modify" statement in the LDIF file.
Could someone please advise me on what I'm doing wrong?
Many thanks in advance.
Kind regards,
Nicolas
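The diagnosis above is right: InMemoryDirectoryServer.importFromLDIF reads plain LDIF entries, not LDIF change records, so the "changetype: modify" line is not applied and the record becomes an add of cn=schema, which already exists. A hedged sketch of one way around it: give the server its schema up front by merging the custom definitions into the standard schema before any import runs (custom-schema.ldif and dc=example,dc=org are placeholder names, and this assumes you manage the InMemoryDirectoryServer yourself, e.g. after excluding EmbeddedLdapAutoConfiguration).

import com.unboundid.ldap.listener.InMemoryDirectoryServer;
import com.unboundid.ldap.listener.InMemoryDirectoryServerConfig;
import com.unboundid.ldap.sdk.schema.Schema;

import java.io.File;

public class EmbeddedLdap {
    public static InMemoryDirectoryServer start() throws Exception {
        InMemoryDirectoryServerConfig config =
                new InMemoryDirectoryServerConfig("dc=example,dc=org");
        // Merge the custom attribute types / object classes into the default
        // schema so the LDIF data needs no "changetype: modify" record.
        Schema custom = Schema.getSchema(new File("custom-schema.ldif"));
        config.setSchema(Schema.mergeSchemas(Schema.getDefaultStandardSchema(), custom));
        InMemoryDirectoryServer server = new InMemoryDirectoryServer(config);
        server.importFromLDIF(true, "test.ldif"); // entries only, no change records
        server.startListening();
        return server;
    }
}

Note that the file handed to Schema.getSchema must itself be a subschema entry (a cn=schema entry whose attributeTypes/objectClasses attributes carry the definitions), not a change record.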

JHipster Elasticsearch issue after adding any custom entity

org.springframework.beans.factory.UnsatisfiedDependencyException:
Error creating bean with name 'countryServiceImpl' defined in file [E:\tms-ws\TransportManagement\target\classes\com\baltransport\tms\app\v1\service\impl\CountryServiceImpl.class]:
Unsatisfied dependency expressed through constructor parameter 1; nested exception is org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'countrySearchRepository': Invocation of init method failed;
nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate
[org.springframework.data.elasticsearch.repository.support.NumberKeyedRepository]:
Constructor threw exception; nested exception is org.springframework.data.elasticsearch.ElasticsearchException: Failed to build mapping for country:country
at org.springframework. ...
I am getting this exception in JHipster after adding any custom entity. It works perfectly fine the first time (with the default JDL).
You can try:
Delete the DB and create it again.
Import your JDL with the new changes, maybe the new entity.
Run Elasticsearch in Docker (find the .yml file in your project) and check that it is running correctly (localhost:9200).
Run your app (monolith or microservices).
Regards
You can also try to delete the indexes in Elasticsearch directly from the URL.
Supposing your index is called "user", you would issue:
DELETE /user
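For instance, a throwaway sketch using the JDK 11 HTTP client, assuming Elasticsearch listens on localhost:9200 and the broken index is the "country" one named in the stack trace above; the index can then be recreated on the next startup.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DeleteIndex {
    public static void main(String[] args) throws Exception {
        // Equivalent to issuing DELETE /country against the Elasticsearch REST API.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/country"))
                .DELETE()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}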

Issue with Spring Cloud Task multiple datasources

I configured two data sources in my app following the Spring sample: https://github.com/spring-cloud/spring-cloud-task/blob/master/spring-cloud-task-samples/multiple-datasources.
The Spring Boot version I use is 2.0.0.RELEASE.
The spring.cloud.task version I use is 1.2.2.RELEASE.
The application works fine on my local computer, but when deployed to AWS I get the following error from the class CustomTaskConfigurer.java,
which is defined the same as here: https://github.com/spring-cloud/spring-cloud-task/blob/master/spring-cloud-task-samples/multiple-datasources/src/main/java/io/spring/configuration/CustomTaskConfigurer.java
The error message is below:
exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.cloud.task.configuration.SimpleTaskConfiguration': Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'customTaskConfigurer' defined in file [/home/vcap/app/BOOT-INF/classes/com/xxx/configuration/CustomTaskConfigurer.class]: Bean instantiation via constructor failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.xxx.configuration.CustomTaskConfigurer$$EnhancerBySpringCGLIB$$bc80cd46]: Constructor threw exception; nested exception is java.lang.IllegalStateException: Unable to create a TaskExecutionDao.
The root cause of this error: when I developed the application locally, I configured a local data source bean for PostgreSQL like below:
@Bean
@Primary
@ConfigurationProperties("spring.datasource")
public HikariDataSource sourceDataSource() {
    return DataSourceBuilder.create().type(HikariDataSource.class).build();
}
This bean reads the properties in the application.properties file, which identify the username, password, and URL for the local Postgres.
When the application is deployed to the cloud, it must connect to the cloud database instead of the local one, which means the URL, username, and password are no longer correct.
After adding the configuration for the cloud, the error disappeared.
But the exception stack trace only tells you that it is unable to create a TaskExecutionDao; it is really hard to fix the issue from such an error message.
If it is a multiple-datasource issue, you can try marking one DataSource as @Primary. Agreed that a better stack trace would be helpful.
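A minimal sketch of that suggestion, assuming two Hikari pools where only the primary one backs Spring Cloud Task's metadata tables (the app.second-datasource prefix is a made-up name for the second pool's properties):

import com.zaxxer.hikari.HikariDataSource;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class DataSourceConfig {

    // The sample's CustomTaskConfigurer hands this DataSource to Spring
    // Cloud Task, so it must point at a database that is actually
    // reachable in the target environment (local vs. cloud).
    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public HikariDataSource taskDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    @Bean
    @ConfigurationProperties("app.second-datasource")
    public HikariDataSource secondDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}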

SRVE0283E: Exception caught while initializing context: org.springframework.beans.factory.BeanCreationException

The project works fine on Windows 7 but shows this error when deployed on a Mac (Liberty server, Eclipse Mars):
[ERROR ] SRVE0283E: Exception caught while initializing context: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaMapppingContext': Invocation of init method failed; nested exception is javax.persistence.PersistenceException: java.lang.LinkageError: loader constraint violation in interface itable initialization: when resolving method "com.ibm.db2.jcc.am.Connection.prepareSQLJCall(Ljava/lang/String;ILcom/ibm/db2/jcc/SQLJSection;Lcom/ibm/db2/jcc/SQLJColumnMetaData;Lcom/ibm/db2/jcc/SQLJColumnMetaData;ZZIIIILjava/lang/String;[Ljava/lang/Object;)Lcom/ibm/db2/jcc/SQLJCallableStatement;" the class loader (instance of com/ibm/ws/classloading/internal/AppClassLoader) of the current class, com/ibm/db2/jcc/am/Connection, and the class loader (instance of sun/misc/Launcher$ExtClassLoader) for interface com/ibm/db2/jcc/SQLJConnection have different Class objects for the type c/SQLJSection;Lcom/ibm/db2/jcc/SQLJColumnMetaData;Lcom/ibm/db2/jcc/SQLJColumnMetaData;ZZIIIILjava/lang/String;[Ljava/lang/Object;)Lcom/ibm/db2/jcc/SQLJCallableStatement; used in the signature
at
Based on the existence of Launcher$ExtClassLoader in the error message, it looks like you've put a copy of the JDBC driver into your Java Extension Class Loader path (usually JAVA_HOME/jre/lib/ext). Because of that, the environment has visibility to classes from both that location and the application class loader, and that causes a duplicate visibility that leads to the LinkageError.
There are extremely rare cases that require the use of the Java Extension loader, but I don't believe JDBC drivers are typically among them, so simply removing it from jre/lib/ext is probably the most straightforward solution.
