Store SseEmitter on Redis - spring-boot

I need to store SseEmitter objects in external storage so they can be shared across multiple application instances, but I am getting a NotSerializableException because SseEmitter does not implement Serializable.
Please share your thoughts.
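An SseEmitter wraps a live HTTP response stream, so it can never be serialized and parked in Redis. The usual workaround is to keep each emitter in a local in-memory registry on the instance that accepted the connection, and share only the events across instances (e.g. via Redis pub/sub): every instance subscribes to the channel and forwards incoming messages to whatever emitters it holds locally. A minimal pure-JDK sketch of that registry, with the SseEmitter type and the Redis listener left out and all names illustrative:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.BiConsumer;

// Hypothetical sketch: emitters never leave the instance that created them.
// Redis pub/sub (not shown) only carries (userId, payload); every instance
// receives the message and forwards it to the emitters it holds locally.
class LocalEmitterRegistry<E> {

    private final Map<String, List<E>> emitters = new ConcurrentHashMap<>();

    // Called when a client opens the SSE connection on this instance.
    void register(String userId, E emitter) {
        emitters.computeIfAbsent(userId, k -> new CopyOnWriteArrayList<>()).add(emitter);
    }

    // Called from the Redis message listener on every instance; instances
    // holding no emitter for this user simply do nothing.
    void dispatch(String userId, String payload, BiConsumer<E, String> send) {
        emitters.getOrDefault(userId, List.of()).forEach(e -> send.accept(e, payload));
    }
}
```

In a real setup, dispatch would be invoked from a Redis MessageListener, send would call SseEmitter::send, and emitters would be removed from the registry in onCompletion/onTimeout callbacks.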

Related

How to minimize interaction with Redis when using it as a Spring Session cache?

We are using Spring Cloud Gateway for OAuth2 authentication, after which it stores user session information in Redis with the default settings set by @EnableRedisWebSession and:

@Bean
fun redisConnectionFactory(): LettuceConnectionFactory {
    return LettuceConnectionFactory("redis-cache", 6379)
}

@Bean
fun authorizedClientRepository(): ServerOAuth2AuthorizedClientRepository {
    return WebSessionServerOAuth2AuthorizedClientRepository()
}
application.yml cache settings:

spring:
  session:
    store-type: redis
    redis:
      save-mode: on_set_attribute
      flush-mode: on_save
It works fine, though I can see it makes a request to Redis on every user request, as if there were no in-memory cache at all. Is there any option to change this behaviour, i.e. go over the network to Redis only if the current user session is not found in a local in-memory cache? Maybe I can reimplement some classes, or is there no way to do it except rewriting all the cache logic? Sorry for the quite broad question, but I didn't find any information on this topic in the documentation. Alternatively, could you point me at the classes in the Spring Session source code where this logic is implemented, so I can figure out what my options are?
I'm using spring-cloud-starter-gateway 2.2.5.RELEASE, spring-session-core 2.3.1.RELEASE, spring-boot-starter-data-redis-reactive and spring-session-data-redis.
From reading the documentation, I don't believe this is possible out of the box, as a local cache could leave the connecting SCG instances with an inconsistent view of that Redis instance.
You would need to define your own implementation of SessionRepository that tries a local Caffeine cache first and falls back to Redis on a miss. As a starting point, you could duplicate or extend RedisSessionRepository.
The one thing to be careful of is invalidation: with multiple SCG instances running, if one instance updates a session in Redis, the others need a way to notice that their locally cached copy is stale.
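As a rough illustration of the repository described above, here is a pure-JDK sketch of the read-through idea. All names are made up; a real implementation would implement Spring Session's SessionRepository, delegate to RedisSessionRepository, and use a Caffeine cache with a TTL rather than a plain map:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Function;

// Illustrative read-through wrapper: only a local cache miss reaches Redis.
class CachingSessionStore<K, V> {

    private final Map<K, V> localCache = new ConcurrentHashMap<>();
    private final Function<K, V> redisLookup;          // stands in for the Redis round trip
    private final AtomicInteger redisHits = new AtomicInteger();

    CachingSessionStore(Function<K, V> redisLookup) {
        this.redisLookup = redisLookup;
    }

    V findById(K id) {
        // Repeated requests for the same session are served from local memory.
        return localCache.computeIfAbsent(id, k -> {
            redisHits.incrementAndGet();
            return redisLookup.apply(k);
        });
    }

    // Must be called when another instance changes the session in Redis,
    // e.g. driven by Redis keyspace notifications.
    void invalidate(K id) {
        localCache.remove(id);
    }

    int redisHitCount() {
        return redisHits.get();
    }
}
```

The invalidate hook is the hard part the answer warns about: without something like Redis keyspace notifications or a pub/sub channel, a locally cached session can silently go stale.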

Read datasource from a database or a file and based on API identifier, save data in the database? Which tool to use?

In a Spring Boot application, there is a method mapped to a POST API for saving certain data in a database. The issue is that the data source changes based on an API URL parameter.
Like the API is: {baseURL}/api/{someIdentifier}/addUser
Now, there is another file (or a database) which maps database connection settings (datasource URL, username, password, driver) to this {someIdentifier}. There could be many such identifiers, each corresponding to a different database and its parameters.
When this API is hit, a method should fetch the connection settings for the identifier, make the connection, and save the data in that database. Creating a new connection on every API call is not feasible.
Can anyone please suggest which tool or technology could help solve this problem, ideally with Spring Boot?
Thanks in advance!
You are looking for the AbstractRoutingDataSource.
From its documentation:
Abstract DataSource implementation that routes getConnection() calls to one of various target DataSources based on a lookup key. The latter is usually (but not necessarily) determined through some thread-bound transaction context.
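The mechanism behind AbstractRoutingDataSource can be sketched without Spring: a ThreadLocal holds the lookup key (set, for example, in a filter or interceptor from {someIdentifier}), and the router picks one of several pre-built targets by that key. All names below are illustrative:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Thread-bound lookup key, the "transaction context" the javadoc mentions.
class TenantContext {
    private static final ThreadLocal<String> KEY = new ThreadLocal<>();

    static void set(String identifier) { KEY.set(identifier); }
    static String get() { return KEY.get(); }
    static void clear() { KEY.remove(); }
}

class RoutingRegistry<T> {

    private final Map<String, T> targets = new ConcurrentHashMap<>();

    void register(String identifier, T target) {
        targets.put(identifier, target);
    }

    // Mirrors what AbstractRoutingDataSource does on getConnection(): the
    // thread-bound key selects which pre-built target serves this request.
    T current() {
        return targets.get(TenantContext.get());
    }
}
```

With Spring, you would instead subclass AbstractRoutingDataSource, pass the pre-built DataSources to setTargetDataSources(...), and override determineCurrentLookupKey() to return TenantContext.get(). Pre-building the pooled DataSources up front is what avoids opening a fresh connection on every API call.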

A good solution to Spring MVC "failed to lazily initialize a collection of role no session or session was closed"

This is the well-known issue with Spring MVC and Hibernate/JPA: "failed to lazily initialize a collection of role: no session or session was closed". See Hibernate: failed to lazily initialize a collection of role, no session or session was closed
Many posts suggest replacing LAZY with EAGER, which can work but costs performance. Is there a good solution for this issue?
Thanks.
This generally happens when your transaction management lives inside your DAO. The best practice is to write a service layer on top of the DAO, do all the fetching/population in your services, and apply transactions at the service level.
This way you shield your domain models from being accessed outside a database session, and your services decide whether to load the children or not based on what they are supposed to do.
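As a sketch of that layering (the entity, DAO, and DTO names here are hypothetical), the service method is transactional, so the lazy collection is initialized before the Hibernate session closes, and only a detached DTO leaves the service:

```java
import java.util.List;
import java.util.stream.Collectors;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// User, Role, UserDao, and UserDto are assumed to exist elsewhere in the project.
@Service
public class UserService {

    private final UserDao userDao;

    public UserService(UserDao userDao) {
        this.userDao = userDao;
    }

    @Transactional(readOnly = true)
    public UserDto loadUserWithRoles(long id) {
        User user = userDao.findById(id);
        // Touch (and map) the lazy collection while the session is still open.
        List<String> roleNames = user.getRoles().stream()
                .map(Role::getName)
                .collect(Collectors.toList());
        return new UserDto(user.getName(), roleNames);
    }
}
```

The controller then works with UserDto only, so nothing lazy is ever dereferenced after the transaction ends, and the exception cannot occur.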

Cache solution jaxb

I have a requirement in which I need to cache data coming as response from a soap ws. I am using Spring with JAXB and JAX-WS to call the web service. I am using ehcache for caching.
Ideally, the user data returned from the service would be cached as the JAXB-generated Java bean, using the bean's id as the cache key. Whenever data is requested, the cache should be checked first; only if the data is not there should the SOAP web service be called and the response stored in the cache.
I am not aware if there is already a solution to handle this in spring or ehcache or maybe in JAXB. Can someone please help me out.
In my view, the more important question is how you would keep the data cached on the client side in sync with the data on the server. If you really want to do this, you would need some mechanism to notify clients that the original data has been updated.
For example, one solution could be to create a JMS topic (pub/sub) and have the server publish an event to it whenever the data changes. All clients listen on this topic and, on receiving an event, reload the cache by making the web service call again.
In terms of the cache itself at the client end, why not just choose a ConcurrentHashMap to begin with?
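Since Spring and ehcache are already in place, Spring's cache abstraction gives you the check-then-call behaviour declaratively. A hedged sketch, where the service, port, bean, and cache names are all assumptions, and a CacheManager backed by ehcache must be configured separately (e.g. via spring-boot-starter-cache):

```java
import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

// UserWebServicePort is a stand-in for the generated JAX-WS port;
// UserBean is a stand-in for the JAXB-generated response bean.
@Service
public class UserLookupService {

    private final UserWebServicePort port;

    public UserLookupService(UserWebServicePort port) {
        this.port = port;
    }

    // First call for a given id hits the SOAP service; repeats for the same
    // id are served from the "users" cache without calling the port at all.
    @Cacheable(cacheNames = "users", key = "#id")
    public UserBean findUser(String id) {
        return port.getUser(id);
    }

    // Evict the stale entry, e.g. from the JMS topic listener described above.
    @CacheEvict(cacheNames = "users", key = "#id")
    public void invalidate(String id) {
    }
}
```

This keeps the cache-miss logic out of your calling code entirely, and the JMS-driven invalidation reduces to calling invalidate(id) from the topic listener.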

Using Spring Cloud Connector for Heroku in order to connect to multiple RedisLabs databases

I have a requirement for multiple RedisLabs databases for my application as described in their home page:
multiple dedicated databases in a plan
We enable multiple DBs in a single plan, each running in a dedicated process and in a non-blocking manner.
I rely on Spring Cloud Connectors to connect on Heroku (or Foreman locally), and it seems the RedisServiceInfoCreator class allows for a single RedisLabs URL, i.e. REDISCLOUD_URL.
Here is how I have configured my first redis connection factory:
@Configuration
@Profile({Profiles.CLOUD, Profiles.DEFAULT})
public class RedisCloudConfiguration extends AbstractCloudConfig {

    @Bean
    public RedisConnectionFactory redisConnectionFactory() {
        PoolConfig poolConfig = ...
        return connectionFactory().redisConnectionFactory("REDISCLOUD",
                new PooledServiceConnectorConfig(poolConfig));
    }
    ...
How I am supposed to configure a second connection factory if I intend to use several redis labs databases?
Redis Cloud sets an env var for you only for the first resource in each add-on you create.
If you create multiple resources in an add-on, you should either set an env var yourself or use the new endpoint directly in your code.
In short, the answer is yes: RedisConnectionFactory should be using Jedis to connect to your Redis DB, and it uses a Jedis pool that can only work with a single Redis endpoint. In this regard there is no difference between RedisLabs and a plain Redis.
You should create several connection pools to work with several Redis DBs/endpoints.
Just to extend on that: if you are using multiple DBs to scale, there is no need to with RedisLabs, as they support clustering behind a single endpoint. You can simply create a single DB with as much memory as needed; RedisLabs will create a cluster for you and scale your Redis automatically.
If your app does require logical separation, then creating multiple DBs is the right way to go.
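Building on that, a second connection factory can reuse the same AbstractCloudConfig plumbing, pointed at a service id you expose yourself for the second database. A sketch under those assumptions ("REDISCLOUD_2" is an assumed name, and the pool sizing is purely illustrative):

```java
import org.springframework.cloud.config.java.AbstractCloudConfig;
import org.springframework.cloud.service.PooledServiceConnectorConfig;
import org.springframework.cloud.service.PooledServiceConnectorConfig.PoolConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisConnectionFactory;

@Configuration
public class SecondRedisCloudConfiguration extends AbstractCloudConfig {

    // "REDISCLOUD_2" must match the service id you set up yourself for the
    // second RedisLabs database; only the first one gets REDISCLOUD_URL for free.
    @Bean
    public RedisConnectionFactory secondRedisConnectionFactory() {
        PoolConfig poolConfig = new PoolConfig(4, 200); // illustrative sizing
        return connectionFactory().redisConnectionFactory("REDISCLOUD_2",
                new PooledServiceConnectorConfig(poolConfig));
    }
}
```

With two RedisConnectionFactory beans in the context, inject the second one by bean name (e.g. @Qualifier("secondRedisConnectionFactory")) wherever it is needed.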
