Use Ebean ORM With Elasticsearch but failed - elasticsearch

Ebean ORM version is 12.16.1.
Ebean wants me to configure a DataSource.
Do I need a DataSource to connect to Elasticsearch?
What should I do?
This is my config:
package cn.com.deos.config;

import io.ebean.config.DatabaseConfig;
import io.ebean.event.ServerConfigStartup;

public class EsConfig implements ServerConfigStartup {

    @Override
    public void onStart(DatabaseConfig config) {
        config.getDocStoreConfig().setUrl("http://xxx.xxx.xx.xx:9200");
        config.getDocStoreConfig().setActive(true);
        config.getDocStoreConfig().setGenerateMapping(false);
        config.getDocStoreConfig().setDropCreate(false);
        config.getDocStoreConfig().setCreate(false);
        config.getDocStoreConfig().setUsername("xxxx");
        config.getDocStoreConfig().setPassword("xxxx");
    }
}
Exception
Caused by: io.ebean.datasource.DataSourceConfigurationException: Configuration error creating DataSource for the default Database. This typically means a missing application-test.yaml or missing ebean-test dependency. See https://ebean.io/docs/trouble-shooting#datasource
    at io.ebean.DbContext.(DbContext.java:52)
    at io.ebean.DbContext.(DbContext.java:23)
    ... 2 more
Caused by: io.ebean.datasource.DataSourceConfigurationException: DataSource user is null?
    at io.ebean.datasource.pool.ConnectionPool.(ConnectionPool.java:131)
    at io.ebean.datasource.pool.ConnectionPoolFactory.createPool(ConnectionPoolFactory.java:14)
    at io.ebean.datasource.DataSourceFactory.create(DataSourceFactory.java:26)
    at io.ebeaninternal.server.core.InitDataSource.create(InitDataSource.java:119)
    at io.ebeaninternal.server.core.InitDataSource.createFromConfig(InitDataSource.java:114)
    at io.ebeaninternal.server.core.InitDataSource.initDataSource(InitDataSource.java:48)
    at io.ebeaninternal.server.core.InitDataSource.initialise(InitDataSource.java:33)
    at io.ebeaninternal.server.core.InitDataSource.init(InitDataSource.java:24)
    at io.ebeaninternal.server.core.DefaultContainer.setDataSource(DefaultContainer.java:222)
    at io.ebeaninternal.server.core.DefaultContainer.createServer(DefaultContainer.java:93)
    at io.ebeaninternal.server.core.DefaultContainer.createServer(DefaultContainer.java:64)
    at io.ebeaninternal.server.core.DefaultContainer.createServer(DefaultContainer.java:36)
    at io.ebean.DatabaseFactory.create(DatabaseFactory.java:60)
    at io.ebean.DbContext.getWithCreate(DbContext.java:103)
    at io.ebean.DbContext.(DbContext.java:42)
    ... 3 more
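As far as I can tell from the Ebean docs, the ElasticSearch integration is a doc store layered on top of the ORM's relational database, so the error above suggests the default DataSource still has to be configured alongside the doc store settings (or provided via application.yaml / the ebean-test dependency for tests, as the message hints). A minimal sketch of supplying one programmatically in the same startup hook, with placeholder JDBC URL, driver and credentials, might look like this:

package cn.com.deos.config;

import io.ebean.config.DatabaseConfig;
import io.ebean.datasource.DataSourceConfig;
import io.ebean.event.ServerConfigStartup;

public class EsConfig implements ServerConfigStartup {

    @Override
    public void onStart(DatabaseConfig config) {
        // Default relational DataSource that Ebean still expects.
        // URL, driver and credentials below are placeholders.
        DataSourceConfig dataSource = new DataSourceConfig();
        dataSource.setUrl("jdbc:postgresql://localhost:5432/mydb");
        dataSource.setUsername("dbuser");
        dataSource.setPassword("dbpass");
        dataSource.setDriver("org.postgresql.Driver");
        config.setDataSourceConfig(dataSource);

        // Existing doc store (ElasticSearch) settings stay as before.
        config.getDocStoreConfig().setUrl("http://xxx.xxx.xx.xx:9200");
        config.getDocStoreConfig().setActive(true);
    }
}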

Related

Wildfly / Infinispan HTTP session replication hits ClassNotFoundException when unmarshalling CGLIB Session Bean

I'm running Wildfly 20.0.1.Final in a standalone, two-node cluster. I'm trying to implement HTTP session sharing between the nodes.
In my Spring web application I have <distributable/> in my web.xml.
My session object is this:
package my.package;

@Component
@Scope(value = WebApplicationContext.SCOPE_SESSION, proxyMode = ScopedProxyMode.TARGET_CLASS)
public class MySessionBean implements Serializable {
    // omitted for brevity
}
As you can see, I have ScopedProxyMode.TARGET_CLASS.
When I perform a failover in Wildfly, however, my HTTP session can't be restored, as I hit this warning:
2021-02-22 13:24:18,651 WARN [org.wildfly.clustering.web.infinispan] (default task-1) WFLYCLWEBINF0007:
Failed to activate attributes of session Pd9oI0OBiZSC9we0uXsZdBwkLnadO1l4TUfvoJZf:
org.wildfly.clustering.marshalling.spi.InvalidSerializedFormException:
java.lang.ClassNotFoundException: my.package.MySessionBean$$EnhancerBySpringCGLIB$$9c0fa1df
from [Module "deployment.myDeployment.war" from Service Module Loader]
...
Caused by: java.lang.ClassNotFoundException: my.package.MySessionBean$$EnhancerBySpringCGLIB$$9c0fa1df from [Module "deployment.myDeployment.war" from Service Module Loader]
at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:255)
at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:410)
at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:398)
at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:116)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:398)
at org.jboss.marshalling@2.0.9.Final//org.jboss.marshalling.ModularClassResolver.resolveClass(ModularClassResolver.java:133)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadClassDescriptor(RiverUnmarshaller.java:1033)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadNewObject(RiverUnmarshaller.java:1366)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadObject(RiverUnmarshaller.java:283)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadObject(RiverUnmarshaller.java:216)
at org.jboss.marshalling@2.0.9.Final//org.jboss.marshalling.AbstractObjectInput.readObject(AbstractObjectInput.java:41)
at org.wildfly.clustering.marshalling.spi@20.0.1.Final//org.wildfly.clustering.marshalling.spi.util.MapExternalizer.readObject(MapExternalizer.java:65)
...
Note that the ClassNotFoundException complains about the lack of my.package.MySessionBean$$EnhancerBySpringCGLIB$$9c0fa1df, which is the Spring-enhanced (CGLIB) proxy class of my MySessionBean bean.
Changing to ScopedProxyMode.INTERFACES is not an option.
Can you please point me in the right direction with this?
I managed to fix this by creating a simple POJO, called MySessionDTO, and using that in my session.
So initially I had this (which threw the exception in the question):
request.getSession().setAttribute("mySession", mySessionBean);
...and after I created MySessionDTO (see below), I refactored it into this:
request.getSession().setAttribute("mySession", mySessionBean.getMySessionDTO());
MySessionDTO is a simple POJO:
package my.package;

import java.io.Serializable;

public class MySessionDTO extends MySessionBean implements Serializable {

    public MySessionDTO(MySessionBean mySessionBean) {
        this.setAttributeX(mySessionBean.getAttributeX());
        this.setAttributeY(mySessionBean.getAttributeY());
    }
}

conditional class not detected on mvn clean package

I have the following conditional class that should not be picked up unless some other class is annotated with a custom annotation:
@ConditionalOnBean(MyConf.class)
@Service
@RequiredArgsConstructor
public class CondClass {

    public void method() throws MessagingException {
        // ...
    }
}
Then I have a Spring Boot test:
@SpringBootTest
public class MyTest {
    // test methods omitted
}
It works fine if I manually execute it, but if I do:
mvn clean package
I get:
java.lang.IllegalStateException: Failed to load ApplicationContext
Caused by: java.lang.IllegalStateException: Failed to introspect Class
[it.bmed.medmad.arch.common.websocket.service.impl.CondClass] from ClassLoader
[sun.misc.Launcher$AppClassLoader@18b4aac2]
Caused by: java.lang.NoClassDefFoundError: org/springframework/messaging/MessagingException
Caused by: java.lang.ClassNotFoundException:
org.springframework.messaging.MessagingException
Indeed, MessagingException comes from a dependency that is not included, but I would expect the whole class not to be introspected because of the annotation.
Add @ConditionalOnClass(MyConf.class), too:
@ConditionalOnClass(MyConf.class)
@ConditionalOnBean(MyConf.class)
@Service
@RequiredArgsConstructor
public class CondClass {
// ...
}
For a sample of the official usage, see the Spring Boot reference documentation on @ConditionalOnClass.

How to ignore the spring-boot-cassandra default config to load the cassandra connection instance

I have added the Cassandra starter dependency:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-cassandra</artifactId>
    <version>2.0.0.RELEASE</version>
</dependency>
but the default config does not suit my setup:
org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'cassandraSession'
defined in class path resource
[org/springframework/boot/autoconfigure/data/cassandra/
CassandraDataAutoConfiguration.class]:
Invocation of init method failed; nested exception is
com.datastax.driver.core.exceptions.NoHostAvailableException:
All host(s) tried for query failed (tried: localhost/0:0:0:0:0:0:0:1:9042
(com.datastax.driver.core.exceptions.TransportException:
[localhost/0:0:0:0:0:0:0:1:9042] Cannot connect), localhost/127.0.0.1:9042
(com.datastax.driver.core.exceptions.TransportException:
[localhost/127.0.0.1:9042] Cannot connect))
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException:
All host(s) tried for query failed (tried: localhost/0:0:0:0:0:0:0:1:9042
(com.datastax.driver.core.exceptions.TransportException:
[localhost/0:0:0:0:0:0:0:1:9042]
Cannot connect), localhost/127.0.0.1:9042
(com.datastax.driver.core.exceptions.TransportException:
[localhost/127.0.0.1:9042] Cannot connect))
I want the Spring application not to load the Cassandra connection instance (like cassandraSession) when I haven't configured the 'spring.data.cassandra.*' properties.
How can I do this?
You need to exclude CassandraDataAutoConfiguration to disable Spring Boot's Cassandra auto-configuration, e.g.
@SpringBootApplication
@EnableAutoConfiguration(exclude = { CassandraDataAutoConfiguration.class })
public class Application {
}
Then define your own Cassandra configuration e.g.
@Configuration
@EnableReactiveCassandraRepositories
public class CassandraConfig extends AbstractReactiveCassandraConfiguration {
}
Ended up defining my own custom cluster bean:
@Configuration
@EnableReactiveCassandraRepositories
public class CassandraConfig extends AbstractReactiveCassandraConfiguration {

    // read contact points from config
    @Value("${spring.data.cassandra.contact-points}")
    private String contactPoints;

    @Override
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean bean = super.cluster();
        bean.setContactPoints(contactPoints);
        return bean;
    }
}
https://github.com/spring-projects/spring-data-cassandra/blob/f3115017d4a04e105d4046f6fd716ac308ecd7aa/spring-data-cassandra/src/main/java/org/springframework/data/cassandra/config/AbstractClusterConfiguration.java#L88

Spring Boot: How to make a single externalized JDBC DataSource configuration work in different DAOImpl classes

I have a requirement to fetch the DB username and password from Vault. So I have removed the default configuration (spring.datasource.url, spring.datasource.username, spring.datasource.password)
and added the following code in a DAOImpl class.
Code
@Autowired
private JdbcTemplate jdbcTemplate;

@Bean
@Primary
public DataSource dataSource() {
    return DataSourceBuilder.create()
            .username("someusername")
            .password("somepassword")
            .url("someurl")
            .driverClassName("oracle.jdbc.driver.OracleDriver")
            .build();
}
It was working perfectly, but when I added a new DAOImpl class I got the following exception. Is it necessary to add the above code snippet to all the DAOImpl classes? Is there a way to configure the DataSource in a single class and use it in all the DAOImpl classes?
Exception
Caused by: org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name 'dataSource': Requested bean is currently in creation: Is there an unresolvable circular reference?
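A minimal sketch of one way to do this, defining the DataSource bean once in a dedicated @Configuration class so each DAOImpl only injects JdbcTemplate, could look like the following (class name and connection values are placeholders, and the DataSourceBuilder import assumes Spring Boot 2.x):

import javax.sql.DataSource;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class DataSourceConfiguration {

    // Single place that builds the DataSource; in practice the
    // username/password would be fetched from Vault here.
    @Bean
    @Primary
    public DataSource dataSource() {
        return DataSourceBuilder.create()
                .username("someusername")
                .password("somepassword")
                .url("someurl")
                .driverClassName("oracle.jdbc.driver.OracleDriver")
                .build();
    }
}

The DAOImpl classes then keep only the @Autowired JdbcTemplate field.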

Spring Boot 2.0.0.M2 and Spring Data Elasticsearch configuration

I'm trying to move my project to Spring Boot 2.0.0.M2.
This is my old Spring Data Elasticsearch configuration:
import java.net.InetAddress;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;
#Profile("production")
#Configuration
#EnableElasticsearchRepositories(basePackages = "com.example.domain.repository.elasticsearch")
public class ElasticsearchConfig {
#Value("${elasticsearch.host}")
private String host;
#Value("${elasticsearch.port}")
private int port;
#Bean
public Client client() throws Exception {
return TransportClient.builder().build().addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName(host), port));
}
#Bean
public ElasticsearchOperations elasticsearchTemplate() throws Exception {
return new ElasticsearchTemplate(client());
}
}
Right now I face the following issues:
1. The method builder() is undefined for the type TransportClient
2. InetSocketTransportAddress cannot be resolved to a type
On the Maven classpath I have Spring Data Elasticsearch 3.0.0.M4.
How do I properly configure the current version of Spring Data Elasticsearch?
UPDATED
For my tests I use embedded Elasticsearch with the following application.properties:
#Elasticsearch
spring.data.elasticsearch.properties.http.enabled=true
spring.data.elasticsearch.properties.http.port=9250
spring.data.elasticsearch.properties.path.home=target/test-elasticsearch-db
spring.data.elasticsearch.properties.transport.tcp.connect_timeout=60s
This is my ES test config:
#Profile("test")
#Configuration
#EnableElasticsearchRepositories(basePackages = "com.example.domain.repository.elasticsearch")
public class ElasticsearchTestConfig {
}
Right now the test fails with the following error:
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'elasticsearchTemplate' defined in class path resource [org/springframework/boot/autoconfigure/data/elasticsearch/ElasticsearchDataAutoConfiguration.class]: Unsatisfied dependency expressed through method 'elasticsearchTemplate' parameter 0; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.elasticsearch.client.Client' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:726) ~[spring-beans-5.0.0.RC2.jar:5.0.0.RC2]
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:458) ~[spring-beans-5.0.0.RC2.jar:5.0.0.RC2]
and
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.elasticsearch.client.Client' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1478) ~[spring-beans-5.0.0.RC2.jar:5.0.0.RC2]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1089) ~[spring-beans-5.0.0.RC2.jar:5.0.0.RC2]
What is wrong here and how do I fix it?
Spring Boot 2.0 uses Elasticsearch 5 which includes some breaking API changes. You would be shielded from those changes if you used Spring Boot's auto-configuration rather than trying to write your own.
All that's needed is a value for the spring.data.elasticsearch.cluster-nodes property. With that in place, Spring Boot will auto-configure both a TransportClient and an ElasticsearchTemplate.
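For example, in application.properties (host and port are placeholders for your own cluster; 9300 is the default transport port):

spring.data.elasticsearch.cluster-nodes=localhost:9300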
I had a similar issue when migrating from Spring Boot 1.5.6 to 2.0.0. The reason seems to be that embedded Elasticsearch is no longer supported, and that is reflected in Spring Boot.
Previously, leaving the following property empty
spring.data.elasticsearch.cluster-nodes
put the following into use
spring.data.elasticsearch.properties.path.home
and Spring Boot created the given directory in the target folder for embedded mode to run. With Spring Boot 2.0.0 (spring-boot-autoconfigure-2.0.0.RELEASE.jar, to be precise) cluster-nodes is now asserted to be non-null, causing the elasticsearchTemplate bean not to be created under the hood.
That is the reason why your app stopped working so suddenly.
