Why does 'ConversionException' occur in 'TrackingEventProcessor' of Axon framework? - spring-boot

with Axon 4.5
with Spring Boot 2.5.2
with Spring JPA
I wrote the saga management as below:
@Saga
class OrderSagaManagement {

    @Autowired
    private lateinit var eventGateway: EventGateway

    @StartSaga
    @SagaEventHandler(associationProperty = "orderId")
    fun handle(orderCreatedEvent: OrderCreatedEvent) {
        val paymentId = Random().nextInt(10000)
        SagaLifecycle.associateWith("paymentId", paymentId)
        eventGateway.publish(PaymentEvent(paymentId, orderCreatedEvent.orderId))
    }
    ...
}
When I dispatch an OrderCreatedEvent, a ConversionException occurs in TrackingEventProcessor#processingLoop() as below.
My application.yml is as below.
spring:
  application:
    name: order-service
  datasource:
    url: mysql
    username: mysql
    password:
    driver-class-name: com.mysql.jdbc.Driver
  jpa:
    hibernate:
      ddl-auto: update
axon:
  serializer:
    general: xstream
  axonserver:
    servers: localhost:8124
build.gradle:
implementation("org.axonframework:axon-spring-boot-starter:4.5")

Related

Spring Boot Quartz: init schema only on first startup

This is my config:
@Bean
@QuartzDataSource
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource quartzDataSource() {
    return DataSourceBuilder.create().build();
}
And this is my app.yml:
datasource:
  url: my-url
  jdbcUrl: ${spring.datasource.url}
  username: 'root'
  password: 'root'
  ...
quartz:
  job-store-type: jdbc
  jdbc:
    initialize-schema: always
  wait-for-jobs-to-complete-on-shutdown: true
  properties:
    org:
      quartz:
        dataSource:
          quartz-data-source:
            provider: hikaricp
            driver: com.mysql.cj.jdbc.Driver
            URL: ${spring.datasource.url}
            user: ${spring.datasource.username}
            password: ${spring.datasource.password}
            maximumPoolSize: 5
            connectionTestQuery: SELECT 1
            validationTimeout: 5000
            idleTimeout: 1
        scheduler:
          instanceId: AUTO
          instanceName: my-project-scheduler
        jobStore:
          class: org.quartz.impl.jdbcjobstore.JobStoreTX
          driverDelegateClass: org.quartz.impl.jdbcjobstore.StdJDBCDelegate
          useProperties: false
          misfireThreshold: 60000
          clusterCheckinInterval: 30000
          isClustered: true
          dataSource: quartz-data-source
        threadPool:
          class: org.quartz.simpl.SimpleThreadPool
          threadCount: 1
          threadPriority: 5
          threadsInheritContextClassLoaderOfInitializingThread: true
My question:
If I set initialize-schema: always, then the qrtz tables are created on each application startup.
On the other hand, if I set initialize-schema: never, then I get an error on the first startup that the qrtz tables are missing.
Is there a way to configure it to initialize the qrtz tables only if they do not exist?
You are going to need a migration tool to handle the database creation.
Spring Boot provides two options: Flyway and Liquibase.
Choose one, create migration scripts, and you are up and running.
I personally like the Flyway approach.
You just add implementation 'org.flywaydb:flyway-core' to your build.gradle file (or the Maven alternative).
Then add this to your application.yml
spring:
  flyway:
    enabled: true
    baseline-on-migrate: true
Then create a db/migration folder in the resources folder and put your migration scripts in it, e.g. V1_0_0__db_init.sql (Flyway has its own naming convention).
To get the CREATE SQL scripts, I recommend exporting them from a running database.
Also, do not forget to change spring.jpa.hibernate.ddl-auto to validate.
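For reference, a minimal sketch of that setup using the Gradle Kotlin DSL (V1_0_0__db_init.sql is just an example name following Flyway's V<version>__<description>.sql convention; the actual DDL should come from your exported Quartz tables):

// build.gradle.kts
dependencies {
    implementation("org.flywaydb:flyway-core")
}
// Migration scripts live under src/main/resources/db/migration by default,
// e.g. src/main/resources/db/migration/V1_0_0__db_init.sql containing the
// exported QRTZ_* table DDL. Flyway records applied versions in its schema
// history table, so the script runs only once even across restarts.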

R2DBC and Spring Data integration issues

I have a Spring Boot project (version 2.4.6) with the Spring Data dependency (spring-boot-starter-data-jpa) and the PostgreSQL driver.
In the project we are using Hibernate and data repositories, which are configured via:
@EnableSpringDataCommons(basePackages = ["..."]) // path to folder with repositories
@EnableJpaAuditing(auditorAwareRef = "auditorAware")
@EnableTransactionManagement
@Configuration
class PersistenceConfig
I also want to add reactive R2DBC. My plan is to use it in one specific place where we integrate with another system; that communication happens via reactive data streaming. Accordingly, I need to update some state in the database reactively. That's why I added the following dependencies:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-r2dbc</artifactId>
    <version>2.4.6</version>
</dependency>
<dependency>
    <groupId>io.r2dbc</groupId>
    <artifactId>r2dbc-postgresql</artifactId>
    <scope>runtime</scope>
</dependency>
And also the following properties configuration:
spring:
  name: configuration
  url: jdbc:postgresql://${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DBNAME}
  username: ${POSTGRES_USERNAME}
  password: ${POSTGRES_PASSWORD}
  driverClassName: org.postgresql.Driver
  hikari:
    maximum-pool-size: 2
  jpa:
    database: POSTGRESQL
    database-platform: org.hibernate.dialect.PostgreSQLDialect
    generate-ddl: false
    open-in-view: false
    properties:
      javax:
        persistence:
          validation:
            mode: auto
      hibernate:
        temp:
          use_jdbc_metadata_defaults: false
        jdbc:
          time_zone: UTC
          lob.non_contextual_creation: true
    hibernate:
      ddl-auto: none
  r2dbc:
    url: r2dbc:postgresql://${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DBNAME}
    username: ${POSTGRES_USERNAME}
    password: ${POSTGRES_PASSWORD}
  liquibase.change-log: classpath:/db/changelog-master.xml
And finally, I have this data-layer service:
@Service
class MyDataReactiveService(
    val operator: TransactionalOperator,
    val template: R2dbcEntityTemplate
) {

    fun updateObjectStatus(state: String, objectId: UUID): Mono<Int> =
        template
            .update(ObjectEntity::class.java)
            .matching(query(Criteria.where("id").`is`(objectId)))
            .apply(update("state", state))
            .`as` { operator.transactional(it) }
}
Where ObjectEntity is a regular Spring Data entity.
But, unfortunately, I get the following error during application startup (inside tests):
Field objectRepository in com.slandshow.TestManager required a bean named 'entityManagerFactory' that could not be found.
The injection point has the following annotations:
- @org.springframework.beans.factory.annotation.Autowired(required=true)
Action:
Consider defining a bean named 'entityManagerFactory' in your configuration.
TestManager is a test wrapper with injected beans:
@Service
class TestManager {

    @Autowired
    lateinit var objectRepository: ObjectRepository
    ...
}
And ObjectRepository:
interface ObjectRepository : JpaRepository<ObjectEntity, UUID> { // ID type (UUID) assumed from the service above
    ...
}
As far as I understand, this issue is related to an R2DBC and Spring Data misconfiguration.
But how can I fix it?
Since you did not post the code of ObjectRepository, it's hard to say what is wrong. However, I do not recommend using JPA and R2DBC in the same project for the same database; it's a hassle and, furthermore, it may not give you any advantage. Instead, I would recommend using WebClient to make HTTP calls and using Kotlin coroutines to fire the query on a dedicated thread (since you are using Kotlin already). In my opinion this would be the better approach. However, all of this depends on your application, i.e. how many queries you are firing after the calls and so forth.
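For illustration only, a rough Kotlin sketch of that suggestion: keep the existing blocking JPA repository and just move the call onto a dedicated dispatcher (ObjectRepository is the question's repository; the updateState method is assumed for the example):

import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.withContext
import org.springframework.stereotype.Service
import java.util.UUID

@Service
class ObjectStateService(private val objectRepository: ObjectRepository) {

    // Runs the blocking JPA update on the IO dispatcher so it does not block
    // the caller's (possibly reactive) thread. updateState(...) is a
    // hypothetical repository method used only for illustration.
    suspend fun updateObjectStatus(state: String, objectId: UUID): Int =
        withContext(Dispatchers.IO) {
            objectRepository.updateState(state, objectId)
        }
}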

Why isn't the KafkaTransactionManager being applied to this Spring Cloud Stream Kafka Producer?

I have configured a Spring Cloud Stream Kafka application to use transactions (full source code available on Github):
spring:
  application:
    name: message-relay-service
  cloud:
    stream:
      kafka:
        binder:
          transaction:
            transaction-id-prefix: message-relay-tx-
            producer:
              configuration:
                retries: 1
                acks: all
                key:
                  serializer: org.apache.kafka.common.serialization.StringSerializer
      bindings:
        output:
          destination: transfer
          contentType: application/*+avro
      schema-registry-client:
        endpoint: http://localhost:8081
      schema:
        avro:
          subjectNamingStrategy: org.springframework.cloud.stream.schema.avro.QualifiedSubjectNamingStrategy
  datasource:
    url: jdbc:h2:tcp://localhost:9090/mem:mydb
    driver-class-name: org.h2.Driver
    username: sa
    password:
  jpa:
    hibernate:
      ddl-auto: create
    database-platform: org.hibernate.dialect.H2Dialect
server:
  port: 8085
This app has a scheduled task that periodically checks for records to send in a database, using a @Scheduled method. This method is annotated with @Transactional and the main class defines @EnableTransactionManagement.
However, when debugging the code I've realized that the KafkaTransactionManager isn't being used, that is to say, there are no Kafka transactions in place. What's the problem?
@EnableTransactionManagement
@EnableBinding(Source::class)
@EnableScheduling
@SpringBootApplication
class MessageRelayServiceApplication

fun main(args: Array<String>) {
    runApplication<MessageRelayServiceApplication>(*args)
}
---
@Component
class MessageRelay(private val outboxService: OutboxService,
                   private val source: Source) {

    @Transactional
    @Scheduled(fixedDelay = 10000)
    fun checkOutbox() {
        val pending = outboxService.getPending()
        pending.forEach {
            val message = MessageBuilder.withPayload(it.payload)
                .setHeader(KafkaHeaders.MESSAGE_KEY, it.messageKey)
                .setHeader(MessageHeaders.CONTENT_TYPE, it.contentType)
                .build()
            source.output().send(message)
            outboxService.markAsProcessed(it.id)
        }
    }
}
I don't see #EnableTransactionManagement in account-service, only in message-relay-service.
In any case, your scenario is not currently supported; the transactional binder was designed for processors where the consumer starts the transaction, any records sent on the consumer thread participate in that transaction, the consumer sends the offset to the transaction and then commits the transaction.
It was not designed for producer-only bindings; please open a GitHub issue against the binder because it should be supported.
I am not sure why you are not seeing a transaction starting but, even if one does, the problem is that @Transactional will use Boot's auto-configured KTM (and producer factory) while the binding uses a different producer factory (the one from your configuration).
Even if a transaction is in process, the producer won't participate in it.
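For illustration, a @Transactional method only joins a Kafka transaction if the transaction manager is built on the same producer factory that actually sends the records. A minimal Kotlin sketch (the binderProducerFactory parameter is a placeholder for however you would obtain the binder's factory, not a real binder bean name):

import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.kafka.core.ProducerFactory
import org.springframework.kafka.transaction.KafkaTransactionManager

@Configuration
class KafkaTxConfig {

    // Sketch only: builds a KafkaTransactionManager on a given producer factory.
    // Unless this is the *same* factory the binding uses, the producer will not
    // participate in the transaction started by @Transactional.
    @Bean
    fun kafkaTransactionManager(
        binderProducerFactory: ProducerFactory<ByteArray, ByteArray> // placeholder
    ): KafkaTransactionManager<ByteArray, ByteArray> =
        KafkaTransactionManager(binderProducerFactory)
}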

Error resolving template on deploy jar or war on spring boot

When I run my application on the embedded Tomcat server, everything is good.
But when I deploy the jar or war on another server or container, the server gives me an error on some URLs:
Error resolving template "/shop/index", template might not exist or might not be accessible by any of the configured Template Resolvers
This is my application.yml:
server:
  port: 8080
logging:
  level:
    com.mousavi007.shop: debug
spring:
  datasource:
    url: jdbc:mysql://localhost:3306/shop2
    username: *******
    password: *******
    platform: mysql
  jpa:
    hibernate:
      ddl-auto: update
    database-platform: org.hibernate.dialect.MySQL5InnoDBDialect
    database: mysql
    show-sql: true
  thymeleaf:
    mode: LEGACYHTML5
Shop controller:
@Slf4j
@Controller
@RequestMapping("/shop")
public class ShopController {

    @GetMapping({"", "/"})
    public String Shop(Model model, @RequestParam("pageSize") Optional<Integer> pageSize,
                       @RequestParam("page") Optional<Integer> page) {
        ****
        ****
        ****
        ****
        return "/shop/index";
    }
}
Change
return "/shop/index";
to
return "shop/index";
View names are resolved against the configured template prefix, and the leading slash can break template resolution once the application is packaged as a jar or war, even though it happens to work on the embedded Tomcat.

Spring Boot with in memory database fails

I'm trying to test my Spring Boot application with an embedded H2 database. For dev and prod, I will be using a MySQL database.
I would like to have a different application.yml and schema.sql file for each mode.
The project structure is:
src
--main
----resources
------application.yml
------schema.sql
--test
----resources
------application-test.yml
------schema-test.sql
This is my RepositoryTest:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest
@DataJpaTest
public class IUserRepositoryTest {

    @Autowired
    private TestEntityManager entityManager;

    @Autowired
    private IUserRepository userRepository;

    @Test
    public void Should_ReturnEmployee_ForExistingEmail() {
        User oneUser = new User("john", "doe", "example@email.com");
        entityManager.persist(oneUser);
        List<User> userList = userRepository.findUserByEmail("example@email.com");
        assertThat(userList).isNotEmpty();
        assertThat(userList.get(0)).isNotNull();
        assertThat(userList.get(0).getEmail()).isEqualTo("example@email.com");
    }
}
This is my test/resources/application-test.yml:
spring:
  profiles: test
  datasource:
    url: jdbc:h2:mem:test;INIT=create schema IF NOT EXISTS mydb;DB_CLOSE_DELAY=-1
    platform: h2
    username: sa
    password:
    driverClassName: org.h2.Driver
  jpa:
    hibernate:
      ddl-auto: create-drop
    properties:
      hibernate:
        default-schema: mydb
        dialect: org.hibernate.dialect.H2Dialect
This is my test/resources/schema-test.sql:
CREATE SCHEMA IF NOT EXISTS MYDB
As for my main/resources/application.yml:
logging:
  level:
    org.springframework.web: DEBUG
    org:
      hibernate:
        SQL: DEBUG
spring:
  jpa:
    hibernate:
      ddl-auto: update
    properties:
      hibernate:
        dialect: org.hibernate.dialect.MySQL5Dialect
  datasource:
    url: jdbc:mysql://localhost:3306/mydb
    username: root
    password: toor
  database:
    driverClassName: com.mysql.jdbc.Driver
When I run my app as a Spring Boot application, the main application.yml is used and all is good, but when I run my tests, I get this error:
org.hibernate.tool.schema.spi.CommandAcceptanceException: Error executing DDL via JDBC Statement
Caused by: org.h2.jdbc.JdbcSQLException: Schema "MYDB" not found; SQL statement
Which causes all my tests to fail.
When I try to use this project structure:
src
--main
----resources
------application.yml
------schema.sql
--test
----resources
------application.yml
------schema.sql
The tests succeed, but when I run my app as a Spring Boot application, the test/resources/application.yml is the one being used instead of the main one.
Your tests are running with the "default" profile, so they only load the default configuration files with no suffix (i.e. not the -test ones).
Try adding @ActiveProfiles("test") to your test class to enable the test profile.
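For example, a Kotlin sketch of the same annotations (the Java version is analogous; other annotations on the original class stay unchanged):

import org.junit.runner.RunWith
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest
import org.springframework.test.context.ActiveProfiles
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner

@RunWith(SpringJUnit4ClassRunner::class)
@DataJpaTest
@ActiveProfiles("test") // activates application-test.yml for this test class
class IUserRepositoryTest {
    // existing test methods unchanged
}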
