Apache Ignite repository save method is doing only UPDATE instead of INSERT - spring

I'm developing a Spring Boot + Impala app using Apache Ignite as a cache store.
The problem is that IgniteRepository.save(key, entity) only runs an UPDATE query instead of an INSERT.
pom.xml
<ignite.version>2.14.0</ignite.version>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-spring-data-ext</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-core</artifactId>
<version>${ignite.version}</version>
</dependency>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-indexing</artifactId>
<version>${ignite.version}</version>
</dependency>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-spring</artifactId>
<version>${ignite.version}</version>
</dependency>
Ignite Configuration:
IgniteConfiguration cfg = new IgniteConfiguration();
cfg.setIgniteInstanceName("springDataNode");
cfg.setPeerClassLoadingEnabled(true);
CacheConfiguration ccfg = new CacheConfiguration("XYZCache");
ccfg.setIndexedTypes(Long.class, XYZ.class);
ccfg.setCacheMode(CacheMode.PARTITIONED);
ccfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);
ccfg.setReadThrough(true);
ccfg.setWriteThrough(true);
ccfg.setWriteBehindEnabled(true);
CacheJdbcPojoStoreFactory<Long, XYZ> factory = new CacheJdbcPojoStoreFactory<>();
factory.setDataSourceBean("ImpalaDataSource");
JdbcType jdbcType = new JdbcType();
jdbcType.setCacheName("XYZCache");
jdbcType.setKeyType(Long.class);
jdbcType.setValueType(XYZ.class);
jdbcType.setDatabaseTable("schema.table");
jdbcType.setKeyFields(new JdbcTypeField(Types.BIGINT, "id", Long.class, "id"));
jdbcType.setValueFields(
new JdbcTypeField(Types.VARCHAR, "comments", String.class, "comments"),
new JdbcTypeField(Types.BIGINT, "id", Long.class, "id")
);
factory.setTypes(jdbcType);
ccfg.setCacheStoreFactory(factory);
cfg.setCacheConfiguration(ccfg);
return IgniteSpring.start(cfg, applicationContext);
Ignite Repository:
@RepositoryConfig(cacheName = "XYZCache")
public interface XYZRepository extends IgniteRepository<XYZ, Long> {
@Query("select * FROM XYZ WHERE comments=?")
List<XYZ> test(String comments);
@Query("insert into XYZ (id,comments) values (?,?)")
List<XYZ> customSave(Long id,String comments);
}
POJO:
@Data
public class XYZ implements Serializable {
private static final long serialVersionUID = -2677636393779376050L;
@QuerySqlField
private Long id;
@QuerySqlField
private String comments;
}
Calling code:
xyzRepository.save(id, xyz);
xyzRepository.customSave(id, comments);
Both methods throw an error by running an UPDATE query (instead of an INSERT), which is not supported in Impala and is also not what I intend to do:
Caused by:
org.apache.ignite.internal.processors.cache.CachePartialUpdateCheckedException:
Failed to update keys (retry update if possible).: [1671548234688] at
org.apache.ignite.internal.processors.cache.GridCacheUtils.convertToCacheException(GridCacheUtils.java:1251)
~[ignite-core-2.14.0.jar:2.14.0]
Caused by: org.apache.ignite.IgniteCheckedException: Failed update
entry in database [table=schema.table, entry=Entry [key=1671548234688,
val=pkg.XYZ [idHash=1354181174, hash=991365654, id=1671548234688,
comments=test]]] at
org.apache.ignite.internal.processors.cache.store.GridCacheStoreManagerAdapter.put(GridCacheStoreManagerAdapter.java:593)
at org.apache.ignite.internal.processors.cache.GridCacheMapEntry$AtomicCacheUpdateClosure.update(GridCacheMapEntry.java:6154)
at org.apache.ignite.internal.processors.cache.GridCacheMapEntry$AtomicCacheUpdateClosure.call(GridCacheMapEntry.java:5918)
at org.apache.ignite.internal.processors.cache.GridCacheMapEntry$AtomicCacheUpdateClosure.call(GridCacheMapEntry.java:5603)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree$Invoke.invokeClosure(BPlusTree.java:4254)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree$Invoke.access$5700(BPlusTree.java:4148)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree.invokeDown(BPlusTree.java:2226)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree.invoke(BPlusTree.java:2116)
... 146 common frames omitted
Caused by: javax.cache.integration.CacheWriterException: Failed update entry in database [table=schema.table, entry=Entry
[key=1671548234688, val=pkg.XYZ [idHash=1354181174, hash=991365654,
id=1671548234688, comments=test]]]
at org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.writeUpsert(CacheAbstractJdbcStore.java:978)
at org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.write(CacheAbstractJdbcStore.java:1029)
at org.apache.ignite.internal.processors.cache.store.GridCacheStoreManagerAdapter.put(GridCacheStoreManagerAdapter.java:585)
... 153 common frames omitted
Caused by: com.cloudera.impala.support.exceptions.GeneralException:
[Cloudera]ImpalaJDBCDriver ERROR processing query/statement.
Error Code: 0, SQL state: TStatus(statusCode:ERROR_STATUS,
sqlState:HY000, errorMessage:AnalysisException: Impala does not
support modifying a non-Kudu table: schema.table ), Query: UPDATE
schema.table SET table.comments = 'test' WHERE (table.id =
1671548234688). ... 163 common frames omitted
What is the issue here? Why is Apache Ignite forcing an UPDATE? How can I change this behavior?
I also implemented the Persistable interface and overrode isNew() to return true, but it didn't work.
PS: Select queries are working fine (findAll, findById, etc.), including the custom test() method. So there is no datasource configuration issue, and I am able to connect to Impala.

This is likely because the dialect you are using does not have a MERGE (upsert) query set up. In that case CacheAbstractJdbcStore.writeUpsert falls back to issuing an UPDATE first and only INSERTs when no row was updated, and Impala rejects the UPDATE outright. This is per the stack trace you posted:
org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.writeUpsert(CacheAbstractJdbcStore.java:978) at
org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.write(CacheAbstractJdbcStore.java:1029) at
org.apache.ignite.internal.processors.cache.store.GridCacheStoreManagerAdapter.put(GridCacheStoreManagerAdapter.java:585) ...
Alternatively, you can write your own data store factory.
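For illustration, here is a minimal sketch of that route, built on Ignite's CacheStoreAdapter so the store only ever issues INSERT statements. The class name, the hard-coded column list, and the DataSource wiring are assumptions for the sketch, not anything from your configuration:

import java.sql.Connection;
import java.sql.PreparedStatement;

import javax.cache.Cache;
import javax.cache.integration.CacheWriterException;
import javax.sql.DataSource;

import org.apache.ignite.cache.store.CacheStoreAdapter;

// Sketch: a store that only ever INSERTs, sidestepping Impala's
// "cannot modify a non-Kudu table" restriction on UPDATE.
public class ImpalaInsertOnlyStore extends CacheStoreAdapter<Long, XYZ> {

    private final DataSource dataSource; // hypothetical: supply your Impala DataSource here

    public ImpalaInsertOnlyStore(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public void write(Cache.Entry<? extends Long, ? extends XYZ> entry) {
        try (Connection conn = dataSource.getConnection();
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO schema.table (id, comments) VALUES (?, ?)")) {
            ps.setLong(1, entry.getKey());
            ps.setString(2, entry.getValue().getComments());
            ps.executeUpdate();
        }
        catch (Exception e) {
            throw new CacheWriterException(e);
        }
    }

    @Override
    public XYZ load(Long key) {
        return null; // implement with a SELECT if read-through is needed
    }

    @Override
    public void delete(Object key) {
        // Impala cannot DELETE from non-Kudu tables either; leave as a no-op or throw.
    }
}

You would then plug it in via ccfg.setCacheStoreFactory(...) (for example, a serializable Factory that constructs the store) in place of the CacheJdbcPojoStoreFactory.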

Related

Spring Boot WebClient XML

My Spring Boot application wants to use WebClient to make an HTTP request (XML request body) and receive an XML response. Hence, I created another Spring Boot application with jackson-dataformat-xml and created an endpoint that receives and returns XML, as below.
spring-boot-version=2.2.5
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-xml</artifactId>
</dependency>
@PostMapping(value = "/api",
consumes = MediaType.APPLICATION_XML_VALUE,
produces = MediaType.APPLICATION_XML_VALUE)
public ResponseEntity<MyXmlResponse> trip(@RequestBody MyXmlRequest request) throws Exception {
MyXmlResponse response = new MyXmlResponse();
response.setStatus("SUCCESS");
response.setTripID(request.getTripID());
return ResponseEntity.ok().body(response);
}
It works perfectly, and obviously no JAXB annotations are required since I use jackson-dataformat-xml. Also, the request XML can be case-insensitive.
Now, in my first application I want to consume this API via WebClient. I read that Spring WebFlux does not support jackson-dataformat-xml yet, hence I have to annotate my classes with JAXB annotations.
spring-boot-version=2.2.5
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
webClient.post()
.uri(URI.create("url-to-api-endpoint"))
.body(Mono.just(myXmlRequest), MyXmlRequest.class)
.exchange()
.doOnSuccess(response -> {
HttpStatus statusCode = response.statusCode();
log.info("Status code of external system request {}", statusCode);
})
.doOnError(onError -> {
log.error("Error on connecting to external system {}", onError.getMessage());
})
.flatMap(response -> response.bodyToMono(MyXmlResponse.class))
.subscribe(this::handleResponse);
The above code throws an exception as follows:
org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlRequest
at org.springframework.web.reactive.function.BodyInserters.unsupportedError(BodyInserters.java:391)
I fixed this problem by annotating the class with @XmlRootElement as follows:
@Getter @Setter @NoArgsConstructor @ToString
@XmlRootElement()
public class MyXmlRequest {
private String attribute1;
}
On the next attempt I got another error, as follows:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlResponse
Caused by: org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlResponse
This could be solved by annotating MyXmlResponse with @XmlRootElement as follows:
@Getter @Setter @NoArgsConstructor @ToString
@XmlRootElement()
public class MyXmlResponse {
private String attr1;
private String attr2;
}
This time I got an UnmarshalException, as follows:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.core.codec.DecodingException: Could not unmarshal XML to class com.example.MyXmlResponse; nested exception is javax.xml.bind.UnmarshalException
- with linked exception:
[com.sun.istack.internal.SAXParseException2; lineNumber: 1; columnNumber: 15; unexpected element (uri:"", local:"MyXmlResponse"). Expected elements are <{}myXmlResponse>]
Caused by: org.springframework.core.codec.DecodingException: Could not unmarshal XML to class com.example.MyXmlResponse; nested exception is javax.xml.bind.UnmarshalException
- with linked exception:
I fixed it by passing additional attributes to the annotation, as follows:
@XmlRootElement(name = "MyXmlResponse", namespace = "")
public class MyXmlResponse {
In the future, my XML structures are going to be tremendously complex. I want to know if I am doing this the right way.
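Not part of the original question, but one alternative to JAXB annotations worth noting: Spring Framework 5.1+ (which Boot 2.2.x uses) ships Jackson2XmlEncoder/Jackson2XmlDecoder. Recent versions can pick them up automatically when jackson-dataformat-xml is on the client's classpath; if yours does not, you can register them explicitly. A sketch, with the factory class name being an assumption:

import org.springframework.http.codec.xml.Jackson2XmlDecoder;
import org.springframework.http.codec.xml.Jackson2XmlEncoder;
import org.springframework.web.reactive.function.client.ExchangeStrategies;
import org.springframework.web.reactive.function.client.WebClient;

// Register Jackson's XML codecs so plain POJOs (no JAXB annotations)
// can be written and read as application/xml.
public class XmlWebClientFactory {

    public static WebClient xmlWebClient() {
        ExchangeStrategies strategies = ExchangeStrategies.builder()
                .codecs(configurer -> {
                    configurer.customCodecs().decoder(new Jackson2XmlDecoder());
                    configurer.customCodecs().encoder(new Jackson2XmlEncoder());
                })
                .build();
        return WebClient.builder()
                .exchangeStrategies(strategies)
                .build();
    }
}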

Spring Boot test tries to initialize cache2k for the 2nd time and fails

After adding cache2k to my project, some @SpringBootTest tests stopped working with an error:
java.lang.IllegalStateException: Cache already created: 'cache'
Below I provide the minimal example to reproduce:
Go to start.spring.io and create the simplest Maven project with the Cache starter, then add the cache2k dependencies:
<properties>
<java.version>1.8</java.version>
<cache2k-version>1.2.2.Final</cache2k-version>
</properties>
<dependencies>
<dependency>
<groupId>org.cache2k</groupId>
<artifactId>cache2k-api</artifactId>
<version>${cache2k-version}</version>
</dependency>
<dependency>
<groupId>org.cache2k</groupId>
<artifactId>cache2k-core</artifactId>
<version>${cache2k-version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.cache2k</groupId>
<artifactId>cache2k-spring</artifactId>
<version>${cache2k-version}</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-cache</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
Now configure the simplest cache:
@SpringBootApplication
@EnableCaching
public class CachingDemoApplication {
public static void main(String[] args) {
SpringApplication.run(CachingDemoApplication.class, args);
}
@Bean
public CacheManager springCacheManager() {
SpringCache2kCacheManager cacheManager = new SpringCache2kCacheManager();
cacheManager.addCaches(b -> b.name("cache"));
return cacheManager;
}
}
And add any service (which we will @MockBean in one of our tests):
@Service
public class SomeService {
public String getString() {
System.out.println("Executing service method");
return "foo";
}
}
Now two @SpringBootTest tests are required to reproduce the issue:
@SpringBootTest
@RunWith(SpringRunner.class)
public class SpringBootAppTest {
@Test
public void getString() {
System.out.println("Empty test");
}
}
@RunWith(SpringRunner.class)
@SpringBootTest
public class WithMockedBeanTest {
@MockBean
SomeService service;
@Test
public void contextLoads() {
}
}
Notice that the 2nd test mocks a bean with @MockBean. This causes an error (stack trace below).
Caused by: java.lang.IllegalStateException: Cache already created: 'cache'
at org.cache2k.core.CacheManagerImpl.newCache(CacheManagerImpl.java:174)
at org.cache2k.core.InternalCache2kBuilder.buildAsIs(InternalCache2kBuilder.java:239)
at org.cache2k.core.InternalCache2kBuilder.build(InternalCache2kBuilder.java:182)
at org.cache2k.core.Cache2kCoreProviderImpl.createCache(Cache2kCoreProviderImpl.java:215)
at org.cache2k.Cache2kBuilder.build(Cache2kBuilder.java:837)
at org.cache2k.extra.spring.SpringCache2kCacheManager.buildAndWrap(SpringCache2kCacheManager.java:205)
at org.cache2k.extra.spring.SpringCache2kCacheManager.lambda$addCache$2(SpringCache2kCacheManager.java:143)
at java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1853)
at org.cache2k.extra.spring.SpringCache2kCacheManager.addCache(SpringCache2kCacheManager.java:141)
at org.cache2k.extra.spring.SpringCache2kCacheManager.addCaches(SpringCache2kCacheManager.java:132)
at com.example.cachingdemo.CachingDemoApplication.springCacheManager(CachingDemoApplication.java:23)
at com.example.cachingdemo.CachingDemoApplication$$EnhancerBySpringCGLIB$$2dce99ca.CGLIB$springCacheManager$0(<generated>)
at com.example.cachingdemo.CachingDemoApplication$$EnhancerBySpringCGLIB$$2dce99ca$$FastClassBySpringCGLIB$$bbd240c0.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:244)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:363)
at com.example.cachingdemo.CachingDemoApplication$$EnhancerBySpringCGLIB$$2dce99ca.springCacheManager(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
... 52 more
If you remove @MockBean, both tests will pass.
How can I avoid this error in my test suite?
Your second test represents a different ApplicationContext altogether, so the test framework will create a dedicated one for it. If cache2k is stateful (for instance, sharing the CacheManager for a given classloader if it already exists), the second context will attempt to create a new CacheManager while the first one is still active.
You either need to flag one of the tests as dirty (see @DirtiesContext), which will close the context and shut down the CacheManager, or you can replace the cache infrastructure with an option that does not require all that; see @AutoConfigureCache.
If cache2k works in such a way that it requires you to dirty the context, I'd highly recommend swapping it out using the latter option.
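For reference, the first option is a single annotation, shown here on the first test class from the question (a sketch; by default @DirtiesContext closes the context after the class runs):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.junit4.SpringRunner;

// Marking the class dirty closes its ApplicationContext after the tests run,
// shutting down the cache2k CacheManager so the next context can recreate 'cache'.
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
public class SpringBootAppTest {

    @Test
    public void getString() {
        System.out.println("Empty test");
    }
}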
Since I do not want any custom behavior in tests, but just want to get rid of this error, the solution is to create the CacheManager with a unique name, like this:
@Bean
public CacheManager springCacheManager() {
SpringCache2kCacheManager cacheManager = new SpringCache2kCacheManager("spring-" + hashCode());
cacheManager.addCaches(b -> b.name("cache"));
return cacheManager;
}
I encountered the same error when using cache2k with Spring Dev Tools, and ended up with the following code as the solution:
@Bean
public CacheManager cacheManager() {
SpringCache2kCacheManager cacheManager = new SpringCache2kCacheManager();
// To avoid the "Caused by: java.lang.IllegalStateException: Cache already created:"
// error when Spring DevTools is enabled and code reloaded
if (cacheManager.getCacheNames().stream()
.filter(name -> name.equals("cache"))
.count() == 0) {
cacheManager.addCaches(
b -> b.name("cache")
);
}
return cacheManager;
}

Adding a dependency to a working Spring Boot project invalidates all JUnit

I have 2 Eclipse projects, and each one has services managed by Spring. I use Spring Boot starter dependencies for each of them. Each one works properly and can be tested with JUnit launched via SpringRunner.class and @SpringBootTest.
Now, I want to call some services from project 1 in project 2, so I add a dependency in project 2's pom.xml and I add
@ComponentScan(basePackages="com.project1")
From then on, I can't launch any JUnit test; it complains about dataSource not being set, as if configs were mixing randomly.
My question is: what are the recommended practices when you create a Spring Boot app and want to isolate some features in a separate project (here, XML features)? If you can't have 2 Spring Boot apps with one dependent on the other, what Spring dependencies do you need so the Spring Boot project can deal with the non-Spring-Boot dependency, and so that you can still launch JUnit tests using the Spring runner locally?
Do I need to pick Spring dependencies one by one (core, bean, context, test, log4j, slf4j, junit, hamcrest, ...) like before Spring Boot existed?
See my comment on why the possible duplicate is different.
After removing all Spring Boot dependencies from my module project, I still have the error as soon as I add the @ComponentScan to scan the module services.
Here is my DB config (main project depending on an XML module), to be clear on the package config. This config WORKS perfectly until I add the @ComponentScan on a package from the module project:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages="fr.my.project.repository")
class PersistenceContext {
private static final String[] ENTITY_PACKAGES = { "fr.my.project.model" };
private static final String PROP_DB_DRIVER_CLASS = "db.driver";
private static final String PROP_DB_PASSWORD = "db.password";
private static final String PROP_DB_URL = "db.url";
private static final String PROP_DB_USER = "db.username";
private static final String PROP_HIBERNATE_DIALECT = "hibernate.dialect";
private static final String PROP_HIBERNATE_FORMAT_SQL = "hibernate.format_sql";
private static final String PROP_HIBERNATE_HBM2DDL_AUTO = "hibernate.hbm2ddl.auto";
private static final String PROP_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
/**
* Creates and configures the HikariCP datasource bean.
*
* @param env
* The runtime environment of our application.
* @return
*/
@Bean(destroyMethod = "close")
DataSource dataSource(Environment env) {
HikariConfig dataSourceConfig = new HikariConfig();
dataSourceConfig.setDriverClassName(env.getRequiredProperty(PROP_DB_DRIVER_CLASS));
dataSourceConfig.setJdbcUrl(env.getRequiredProperty(PROP_DB_URL));
dataSourceConfig.setUsername(env.getRequiredProperty(PROP_DB_USER));
dataSourceConfig.setPassword(env.getRequiredProperty(PROP_DB_PASSWORD));
return new HikariDataSource(dataSourceConfig);
}
/**
* Creates the bean that creates the JPA entity manager factory.
*
* @param dataSource
* The datasource that provides the database connections.
* @param env
* The runtime environment of our application.
* @return
*/
@Bean
LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource, Environment env) {
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource(dataSource);
entityManagerFactoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
entityManagerFactoryBean.setPackagesToScan(ENTITY_PACKAGES);
Properties jpaProperties = new Properties();
// Configures the used database dialect. This allows Hibernate to create SQL
// that is optimized for the used database.
jpaProperties.put(PROP_HIBERNATE_DIALECT, env.getRequiredProperty(PROP_HIBERNATE_DIALECT));
// Specifies the action that is invoked to the database when the Hibernate
// SessionFactory is created or closed.
jpaProperties.put(PROP_HIBERNATE_HBM2DDL_AUTO, env.getRequiredProperty(PROP_HIBERNATE_HBM2DDL_AUTO));
// If the value of this property is true, Hibernate writes all SQL
// statements to the console.
jpaProperties.put(PROP_HIBERNATE_SHOW_SQL, env.getRequiredProperty(PROP_HIBERNATE_SHOW_SQL));
// If the value of this property is true, Hibernate will use prettyprint
// when it writes SQL to the console.
jpaProperties.put(PROP_HIBERNATE_FORMAT_SQL, env.getRequiredProperty(PROP_HIBERNATE_FORMAT_SQL));
entityManagerFactoryBean.setJpaProperties(jpaProperties);
return entityManagerFactoryBean;
}
/**
* Creates the transaction manager bean that integrates the used JPA provider with the Spring transaction mechanism.
*
* @param entityManagerFactory
* The used JPA entity manager factory.
* @return
*/
@Bean
JpaTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory);
return transactionManager;
}
}
and after adding:
@ComponentScan(basePackages="fr.my.module.xml.service")
I get this error when launching any JUnit test:
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.tomcat.jdbc.pool.DataSource]: Factory method 'dataSource' threw exception; nested exception is org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
Here is a temporary answer on how to configure the dependency project, but I hope an easier way exists that benefits from the Spring Boot shortcuts for all app modules.
pom.xml with minimal manual dependencies:
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>4.3.14.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>4.3.14.RELEASE</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.1.11</version>
</dependency>
Manual test config:
@RunWith(SpringRunner.class)
@ContextConfiguration(loader=AnnotationConfigContextLoader.class, classes=AppConfig.class)
public class XmlTest {
Manual app config:
@Configuration
@ComponentScan(basePackages="my.package.xml")
public class AppConfig {
}
So, after all these tries, Spring Boot may not be the cause of this problem at all.
The thing is, I was adding @ComponentScan(basePackages="fr.package.xml") hoping to extend the default package scanning, but it was overriding it.
The proper way to add a package is to explicitly redeclare the default package before adding the new one:
@ComponentScan(basePackages={"fr.package.xml", "fr.package.persistence"})
My other answer was about setting up minimal manual dependencies for a module in a Spring Boot app. But here is an example of using the Spring Boot dependencies in a module which is not the main app:
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.0.1.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
Then, you don't declare @SpringBootApplication in a main class in src/main/java, where it may break the global packaging, but instead set it up inside your test class:
@RunWith(SpringRunner.class)
@SpringBootTest("service.message=Hello")
public class MyServiceTest {
@Autowired
private MyService myService;
@Test
public void contextLoads() {
assertThat(myService.message()).isNotNull();
}
@SpringBootApplication
static class TestConfiguration {
}
}
source : https://github.com/spring-guides/gs-multi-module/tree/master/complete

How to access an in-memory database in Spring

I am developing a standalone Java application using an in-memory embedded database. I referred to a few documents and wrote the following code. I am using Spring Boot.
These are the steps I did so far:
In the pom file, I added these dependencies.
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.36</version>
</dependency>
In the application.properties file:
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.url=jdbc:mysql://localhost:3306/test
spring.datasource.username=root
spring.datasource.password=admin
I created two scripts, schema.sql and data.sql.
schema.sql
CREATE TABLE users
(
id int(11) NOT NULL AUTO_INCREMENT,
name varchar(100) NOT NULL,
email varchar(100) DEFAULT NULL,
PRIMARY KEY (id)
);
data.sql
insert into users(id, name, email) values(1,'demouser','user@test.com');
Here is my repository class.
@Repository
public class UserRepository {
@Autowired
private JdbcTemplate jdbcTemplate;
@Transactional(readOnly = true)
public List<User> findAll() {
return jdbcTemplate.query("select * from users", new UserRowMapper());
}
class UserRowMapper implements RowMapper<User> {
@Override
public User mapRow(ResultSet rs, int rowNum) throws SQLException {
User user = new User();
user.setId(rs.getInt("id"));
user.setName(rs.getString("name"));
user.setEmail(rs.getString("email"));
return user;
}
}
}
Here is my JUnit class:
@Autowired
private UserRepository userRepository;
private EmbeddedDatabase db;
@Before
public void setUp() {
// db = new EmbeddedDatabaseBuilder().addDefaultScripts().build();
Object db = new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).addScript("schema.sql")
.addScript("data.sql").build();
}
@Test
public void findAllUsers() {
List<User> users = userRepository.findAll();
System.out.println(users.get(0).getName());
assertNotNull(users);
assertTrue(!users.isEmpty());
}
When I run my JUnit test, I get the following error:
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'userRepository': Unsatisfied dependency expressed through field 'jdbcTemplate': Error creating bean with name 'org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration': Unsatisfied dependency expressed through constructor parameter 0: Error creating bean with name 'dataSource' defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfiguration$Tomcat.class]: Initialization of bean failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSourceInitializer': Invocation of init method failed; nested exception is org.springframework.jdbc.datasource.init.UncategorizedScriptException: Failed to execute database script; nested exception is org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
Do I have to add any other dependency or download a database?
Since you want to use an in-memory database (H2), you shouldn't have the MySQL dependency in pom.xml or the MySQL-related properties in application.properties.
Remove the MySQL connector dependency from pom.xml:
<!--Delete This-->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.36</version>
</dependency>
And delete all the properties in the application.properties file.
Keep it blank; Spring Boot will auto-configure these properties for the H2 database.
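Once the MySQL bits are gone, a minimal sketch of the test (assuming schema.sql and data.sql sit on the classpath root, where Spring Boot executes them automatically against the auto-configured in-memory H2 datasource, so no manual EmbeddedDatabaseBuilder is needed):

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotNull;

import java.util.List;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

// The context supplies both the JdbcTemplate-backed repository and the
// embedded H2 database initialized from schema.sql/data.sql.
@RunWith(SpringRunner.class)
@SpringBootTest
public class UserRepositoryTest {

    @Autowired
    private UserRepository userRepository;

    @Test
    public void findAllUsers() {
        List<User> users = userRepository.findAll();
        assertNotNull(users);
        assertFalse(users.isEmpty());
    }
}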
Check out the complete Spring Boot JDBC demo project in the GitHub repository.

Elasticsearch-Spark serialization not working with inner classes

Elasticsearch/Spark serialization does not appear to play well with nested types.
For example:
public class Foo implements Serializable {
private List<Bar> bars = new ArrayList<Bar>();
// getters and setters
public static class Bar implements Serializable {
}
}
List<Foo> foos = new ArrayList<Foo>();
foos.add( new Foo());
// Note: Foo object does not contain nested Bar instances
SparkConf sc = new SparkConf(); //
sc.setMaster("local");
sc.setAppName("spark.app.name");
sc.set("spark.serializer", KryoSerializer.class.getName());
JavaSparkContext jsc = new JavaSparkContext(sc);
JavaRDD javaRDD = jsc.parallelize(ImmutableList.copyOf(foos));
JavaEsSpark.saveToEs(javaRDD, INDEX_NAME+"/"+TYPE_NAME);
The above code works, and documents of type Foo will be indexed within Elasticsearch.
The issue arises when the bars list in a Foo object is not empty, for instance:
Foo foo = new Foo();
Foo.Bar bar = new Foo.Bar();
foo.getBars().add(bar);
Then, when indexing to Elasticsearch, the following exception is thrown:
org.elasticsearch.hadoop.serialization.EsHadoopSerializationException:
Cannot handle type [Bar] within type [class Foo], instance [Bar ...]]
within instance [Foo#1cf628a]
using writer [org.elasticsearch.spark.serialization.ScalaValueWriter#4e635d]
at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:63)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:71)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:58)
at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:148)
at org.elasticsearch.spark.rdd.EsRDDWriter.write(EsRDDWriter.scala:47)
at org.elasticsearch.spark.rdd.EsSpark$$anonfun$saveToEs$1.apply(EsSpark.scala:68)
at org.elasticsearch.spark.rdd.EsSpark$$anonfun$saveToEs$1.apply(EsSpark.scala:68)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
These are the relevant Maven dependencies:
<dependency>
<groupId>com.sksamuel.elastic4s</groupId>
<artifactId>elastic4s_2.11</artifactId>
<version>1.5.5</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-hadoop-cascading</artifactId>
<version>2.1.0.Beta4</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.1.3</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-spark_2.10</artifactId>
<version>2.1.0.Beta4</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-xml</artifactId>
<version>2.11.0-M4</version>
</dependency>
What is the correct way to index when using nested types with ElasticSearch and Spark?
Thanks
A solution could be to build a JSON string from the object you're trying to save, using for example Json4s.
In this case your "JavaEsSpark" RDD would be an RDD of strings.
Then you simply have to call
JavaEsSpark.saveJsonToEs...
instead of
JavaEsSpark.saveToEs...
This workaround saved me countless hours trying to figure out a way to serialize nested maps.
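A sketch of that approach, using Gson purely for illustration (any JSON mapper works) and reusing the foos, jsc, INDEX_NAME, and TYPE_NAME names from the question's snippet:

import java.util.ArrayList;
import java.util.List;

import com.google.gson.Gson;

import org.apache.spark.api.java.JavaRDD;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

// Serialize each Foo (nested Bars included) to a JSON string up front,
// then hand es-hadoop raw JSON documents to index as-is.
Gson gson = new Gson();
List<String> jsonDocs = new ArrayList<>();
for (Foo foo : foos) {
    jsonDocs.add(gson.toJson(foo));
}
JavaRDD<String> jsonRDD = jsc.parallelize(jsonDocs);
JavaEsSpark.saveJsonToEs(jsonRDD, INDEX_NAME + "/" + TYPE_NAME);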
Looking at the ScalaValueWriter & JdkValueWriter code, we can see that only certain types are directly supported. Most likely the inner class is not a JavaBean or another supported type.
One day ScalaValueWriter & JdkValueWriter may support user-defined types (like Bar in our example) beyond just Java types like String, int, etc.
In the meantime, there is the following workaround. Instead of having Foo expose a List of Bar objects, internally transform the List into a Map<String, Object> and expose that.
Something like this:
private List<Map<String, Object>> bars= new ArrayList<Map<String, Object>>();
public List<Map<String, Object>> getBars() {
return bars;
}
public void setBars(List<Bar> bars) {
for (Bar bar: bars){
this.bars.add(bar.getAsMap());
}
}
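The getAsMap() helper is referenced above but not shown; a hypothetical minimal version inside Bar might look like this (the field name is a placeholder):

import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

public static class Bar implements Serializable {
    private String name; // placeholder field

    // Hypothetical helper: flatten Bar into the Map shape es-hadoop can serialize directly.
    public Map<String, Object> getAsMap() {
        Map<String, Object> map = new HashMap<>();
        map.put("name", name); // one put(...) per Bar field
        return map;
    }
}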
I suggest working with com.google.gson.Gson:
String foosJson = new Gson().toJson(foos);
Then:
Map map = new HashMap<>();
...
...
JavaRDD<Map<String,?>> javaRDD = sc.parallelize(ImmutableList.of(map));
JavaEsSpark.saveToEs(javaRDD, INDEX_NAME + "/" + TYPE_NAME);
