How can I connect Redis and Couchbase to my Spring application?
I get this error:

Parameter 0 of method couchbaseMappingContext in org.springframework.data.couchbase.config.AbstractCouchbaseConfiguration required a single bean, but 2 were found:
- couchbaseCustomConversions: defined by method 'customConversions' in class path resource [{classPath}/chat/config/CouchbaseConfiguration.class]
- redisCustomConversions: defined in null
I only need Redis to 'look' at one package; the other packages should be connected to Couchbase only.
Redis config
@Bean
public MappingJackson2HttpMessageConverter mappingJackson2HttpMessageConverter() {
    ObjectMapper mapper = new ObjectMapper();
    mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
    MappingJackson2HttpMessageConverter converter =
            new MappingJackson2HttpMessageConverter(mapper);
    return converter;
}

@Bean
public RedisTemplate<Long, ?> redisTemplate(RedisConnectionFactory connectionFactory) {
    RedisTemplate<Long, ?> template = new RedisTemplate<>();
    template.setConnectionFactory(connectionFactory);
    // Add some specific configuration here. Key serializers, etc.
    return template;
}
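For illustration, the "specific configuration" placeholder above might be filled in along these lines; the serializer choices here are assumptions, not something from the original post:

@Bean
public RedisTemplate<Long, Object> redisTemplate(RedisConnectionFactory connectionFactory) {
    RedisTemplate<Long, Object> template = new RedisTemplate<>();
    template.setConnectionFactory(connectionFactory);
    // Keys are Longs, so serialize them as plain strings for readability in Redis.
    template.setKeySerializer(new GenericToStringSerializer<>(Long.class));
    // Values as JSON; swap in whatever serializer matches your model.
    template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
    return template;
}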
Couchbase config
@EnableCouchbaseRepositories
@Configuration
public class CouchbaseConfiguration extends AbstractCouchbaseConfiguration {

    @Value("${spring.data.couchbase.bucket-name}")
    private String bucketName;

    @Value("${spring.couchbase.username}")
    private String username;

    @Value("${spring.couchbase.password}")
    private String password;

    @Value("${spring.couchbase.connection-string}")
    private String connectionString;

    @Override
    public String getConnectionString() {
        return this.connectionString;
    }

    @Override
    public String getUserName() {
        return this.username;
    }

    @Override
    public String getPassword() {
        return this.password;
    }

    @Override
    public String getBucketName() {
        return this.bucketName;
    }
}
And when I first start my app, the terminal shows this info message: Spring Data Redis - Could not safely identify store assignment for repository candidate interface.
To resolve the ambiguity in picking the customConversions bean, we can tell the Couchbase configuration class how to create that bean itself. Adding the code below to the class that extends AbstractCouchbaseConfiguration should solve the issue:
@Bean
public CustomConversions customConversions() {
    return super.customConversions();
}
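As for making Redis 'look' at only one package: repository scanning can be scoped per store, which also clears up the "Could not safely identify store assignment" startup warning, since each repository interface is then claimed by exactly one store. A minimal sketch, where the package names are placeholders rather than anything from the original post:

@Configuration
@EnableRedisRepositories(basePackages = "com.example.chat.redis") // hypothetical package
public class RedisRepositoriesConfiguration {
}

// and correspondingly on the Couchbase configuration class:
@EnableCouchbaseRepositories(basePackages = "com.example.chat.couchbase") // hypothetical package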
Related
I face a little problem with Jackson on Spring Boot web. I'm expecting this kind of response:

{
    "date": "string of date"
}

but I get back this:

[
    0: java.sql.Date,
    1: long number of timestamp
]

How do I configure Spring Boot to avoid this?
@Configuration
public class BootConfiguration {

    @Bean
    @Primary
    public PasswordEncoder passwordEncoder() {
        return PasswordEncoderFactories.createDelegatingPasswordEncoder();
    }

    @Bean
    public JavaTimeModule javaTimeModule() {
        return new JavaTimeModule();
    }

    @Bean
    public Jdk8Module jdk8TimeModule() {
        return new Jdk8Module();
    }

    @Bean
    public CoreJackson2Module coreJackson2Module() {
        return new CoreJackson2Module();
    }

    @Bean
    public OAuth2AuthorizationServerJackson2Module authorizationServerJackson2Module() {
        return new OAuth2AuthorizationServerJackson2Module();
    }

    public List<Module> securityModules() {
        ClassLoader classLoader = JdbcOAuth2AuthorizationService.class.getClassLoader();
        return SecurityJackson2Modules.getModules(classLoader);
    }

    @Bean
    @Primary
    @Order(Ordered.HIGHEST_PRECEDENCE)
    public ObjectMapper objectMapper() {
        ObjectMapper mapper = new ObjectMapper().findAndRegisterModules();
        mapper.registerModule(coreJackson2Module());
        mapper.registerModule(javaTimeModule());
        mapper.registerModule(jdk8TimeModule());
        mapper.registerModules(securityModules());
        mapper.registerModule(authorizationServerJackson2Module());
        mapper.addMixIn(UserPrincipal.class, Object.class);
        return mapper;
    }
}
This is the controller method where the response is produced:
@GetMapping(value = "/authentications", params = {"pge", "lmt", "slug"})
public ResponseEntity<?> getLastLogins(
        Authentication authentication,
        @RequestParam("lmt") Integer limit,
        @RequestParam("pge") Integer page,
        @RequestParam("slug") String slug) {
    return new ResponseEntity<>(loginManager.findAllAuthentications(authentication.getName(), slug, limit, page), HttpStatus.OK);
}
And this is the model containing the properties I try to get back:
@Data
public class User implements Serializable {
    private String firstName;
    private String lastName;
    private Date date;
    private Role role;
}
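For reference, the usual way to get java.util.Date rendered as a string instead of a numeric timestamp is to disable WRITE_DATES_AS_TIMESTAMPS on the ObjectMapper, or to pin a format on the field itself. A sketch under that assumption (the date pattern is illustrative):

// On the ObjectMapper bean above:
mapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);

// Or per field, on the model:
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd")
private Date date;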
I have multiple questions about Hazelcast core as well as the Spring Boot cache API.
Let me lay out the scenario first.
We have a monitoring system for monitoring multiple network infrastructures.
I have a Spring Boot app which could be deployed as multiple nodes behind a load-balancer.
In addition to that, this same app can serve multiple infrastructures simply by running it with a different profile, such as infra-1-prod, infra-2-prod, etc.
It's horizontally scalable as well as versatile; this nature is achieved by running the application with different profiles.
Along with other things, the profile switch changes the underlying DB connection to a relational database that contains the configuration data for that particular infrastructure.
Have a look at the relevant architecture for the application
The same Spring Boot application can run as a node for different infrastructures, each spawning its own Hazelcast instance node. If we have 6 nodes for the application, there will be 6 nodes in the Hazelcast cluster, all of them in sync.
Now I have a Repository named RuleRepository which returns the Rule data for a particular Rule Alias.
@Repository
public interface RuleRepository extends JpaRepository<Rule, Long> {

    @Cacheable(value = Constants.CACHE_ALIAS)
    Optional<Rule> findByAlias(String ruleAlias);

    //some other functions
}
Now the problem is that the rule aliases are auto-generated by DB sequences, so an alias R_123 points to different data for the Infra-1 and Infra-2 nodes, but because all the Hazelcast nodes are in sync, the wrong data gets overwritten.
To deal with this, I thought of giving the cache a different name for every infrastructure so that the cached data doesn't get jumbled.
Doing this is not straightforward, because we can't inject properties into cache names. For this we need to implement our own custom CacheResolver and CacheManager.
I will lay out my understanding of Hazelcast before I ask the first question.
Every Hazelcast instance can have multiple map configurations, which are basically just different caches. Every CacheManager can be linked to a single Hazelcast instance, which internally contains multiple caches.
Question 1: If the relationship between CacheManager and HazelcastInstance is one-to-one, how do I determine which method's data will be cached into which cache (map config)?
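(For reference, a single Hazelcast instance declaring several named map configs, each addressable as its own cache through one CacheManager, might look like the sketch below; the cache names and TTLs are assumptions:)

Config config = new Config();
// Each named MapConfig becomes a separate cache behind the same CacheManager.
config.addMapConfig(new MapConfig("ruleCache").setTimeToLiveSeconds(300));
config.addMapConfig(new MapConfig("deviceCache").setTimeToLiveSeconds(60));
HazelcastInstance instance = Hazelcast.newHazelcastInstance(config);
CacheManager cacheManager = new HazelcastCacheManager(instance);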
Here is the incomplete implementation I currently have:
public class CacheableOperations {

    private final CacheManager cacheManager;
    private final CacheManager noOpCacheManager;
    private Map<String, CacheableOperation<?>> opMap;

    public CacheableOperations(CacheManager cacheManager, CacheManager noOpCacheManager) {
        this.cacheManager = cacheManager;
        this.noOpCacheManager = noOpCacheManager;
    }

    public void init() {
        List<CacheableOperation<? extends Class>> ops = new ArrayList<>();
        ops.add(new CacheableOperation.Builder(RuleRepository.class)
                .method("findByAlias")
                .cacheManager(cacheManager)
                .build());
        postProcessOperations(ops);
    }

    public CacheableOperation<?> get(CacheOperationInvocationContext<?> context) {
        final String queryKey = getOperationKey(context.getTarget().getClass().getName(),
                context.getMethod().getName());
        return opMap.get(queryKey);
    }

    private void postProcessOperations(List<CacheableOperation<? extends Class>> ops) {
        Map<String, CacheableOperation<?>> tempMap = new HashMap<>();
        for (CacheableOperation<?> op : ops) {
            for (String methodName : op.getMethodNames()) {
                tempMap.put(getOperationKey(op.getTargetClass().getName(), methodName), op);
            }
        }
        opMap = ImmutableMap.copyOf(tempMap);
    }

    private String getOperationKey(String first, String second) {
        return String.format("%s-%s", first, second);
    }
}
Here is the CacheConfiguration class:
@Configuration
@AllArgsConstructor
public class CacheConfiguration extends CachingConfigurerSupport {

    private final CacheProperties cacheProperties;
    private SysdiagProperties sysdiagProperties;

    @Bean
    @Override
    public CacheManager cacheManager() {
        return new HazelcastCacheManager(hazelcastInstance());
    }

    @Bean
    @Profile("client")
    HazelcastInstance hazelcastInstance() {
        Config config = new Config();
        config.getNetworkConfig().getJoin().getTcpIpConfig().addMember(sysdiagProperties.getCache().getMemberIps()).setEnabled(true);
        config.getNetworkConfig().getJoin().getMulticastConfig().setEnabled(false);
        config.setInstanceName("restapi-master-cache-" + sysdiagProperties.getServiceName());
        return Hazelcast.newHazelcastInstance(config);
    }

    @Bean
    @Override
    public CacheResolver cacheResolver() {
        return new CustomCacheResolver(cacheProperties, operations(), noOpCacheManager());
    }

    @Bean
    public CacheManager noOpCacheManager() {
        return new NoOpCacheManager();
    }

    @Bean
    public CacheableOperations operations() {
        CacheableOperations operations = new CacheableOperations(cacheManager(), noOpCacheManager());
        operations.init();
        return operations;
    }
}
And here is the CacheableOperation class
public class CacheableOperation<T> {

    private final Class<T> targetClass;
    private final String[] methodNames;
    private final CacheManager cacheManager;

    private CacheableOperation(Class<T> targetClass, String[] methodNames, CacheManager cacheManager) {
        this.targetClass = targetClass;
        this.methodNames = methodNames;
        this.cacheManager = cacheManager;
    }

    public Class<T> getTargetClass() {
        return targetClass;
    }

    public String[] getMethodNames() {
        return methodNames;
    }

    public CacheManager getCacheManager() {
        return cacheManager;
    }

    public static class Builder<T> {

        private final Class<T> targetClass;
        private String[] methodNames;
        private CacheManager cacheManager;
        private Map<String, Method> methods = new HashMap<>();

        public Builder(Class<T> targetClass) {
            this.targetClass = targetClass;
            Arrays.stream(targetClass.getDeclaredMethods())
                    .forEachOrdered(method -> methods.put(method.getName(), method));
        }

        public Builder<T> method(String... methodNames) {
            this.methodNames = methodNames;
            return this;
        }

        public Builder<T> cacheManager(CacheManager cacheManager) {
            this.cacheManager = cacheManager;
            return this;
        }

        public CacheableOperation<T> build() {
            checkArgument(targetClass != null);
            checkArgument(ArrayUtils.isNotEmpty(methodNames));
            checkArgument(Arrays.stream(methodNames).allMatch(name -> methods.get(name) != null));
            return new CacheableOperation<T>(targetClass, methodNames, cacheManager);
        }
    }
}
And finally the CacheResolver
public class CustomCacheResolver implements CacheResolver {

    private final CacheableOperations operations;
    private final CacheProperties cacheProperties;
    private final CacheManager noOpCacheManager;

    public CustomCacheResolver(CacheProperties cacheProperties, CacheableOperations operations, CacheManager noOpCacheManager) {
        this.cacheProperties = cacheProperties;
        this.operations = operations;
        this.noOpCacheManager = noOpCacheManager;
    }

    @Override
    public Collection<? extends Cache> resolveCaches(CacheOperationInvocationContext<?> context) {
        if (!cacheProperties.isEnabled()) {
            return getCaches(noOpCacheManager, context);
        }
        Collection<Cache> caches = new ArrayList<>();
        CacheableOperation operation = operations.get(context);
        if (operation != null) {
            CacheManager cacheManager = operation.getCacheManager();
            if (cacheManager != null) {
                caches = getCaches(cacheManager, context);
            }
        }
        return caches;
    }

    private Collection<Cache> getCaches(CacheManager cacheManager, CacheOperationInvocationContext<?> context) {
        return context.getOperation().getCacheNames().stream()
                .map(cacheName -> cacheManager.getCache(cacheName))
                .filter(cache -> cache != null)
                .collect(Collectors.toList());
    }
}
Question 2: In this whole code base I cannot find the linkage between a cache name and a method name, which I did make in the first snippet. All I can see is a link between the method name and the CacheManager instance. Where do I define that?
All the questions and documentation I have read about Spring Boot and Hazelcast do not go into great depth on this case.
Question 3: Can someone define the roles of a CacheResolver and a CacheManager in a straightforward manner for me?
Thanks for the patience. An answer to even one of the questions might help me a lot. :)
You can specify the cache name as a parameter of the @Cacheable annotation. For example:
@Cacheable("books")
public String getBookNameByIsbn(String isbn) {
    return findBookInSlowSource(isbn);
}
That decides the name of the internal map/cache that is used.
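For the infrastructure-separation part of the question, one option is a resolver that prefixes every declared cache name with the infra identifier, so Infra-1 and Infra-2 never share a map. This is a sketch under the assumption that the profile or service name is injectable as a property; it would stand in for the CustomCacheResolver above:

import org.springframework.cache.CacheManager;
import org.springframework.cache.interceptor.CacheOperationInvocationContext;
import org.springframework.cache.interceptor.SimpleCacheResolver;
import java.util.Collection;
import java.util.stream.Collectors;

public class InfraAwareCacheResolver extends SimpleCacheResolver {

    private final String infraPrefix;

    public InfraAwareCacheResolver(CacheManager cacheManager, String infraPrefix) {
        super(cacheManager);
        this.infraPrefix = infraPrefix;
    }

    @Override
    protected Collection<String> getCacheNames(CacheOperationInvocationContext<?> context) {
        // Scope the annotated cache name per infrastructure, e.g. "infra-1-prod-ruleCache".
        return context.getOperation().getCacheNames().stream()
                .map(name -> infraPrefix + "-" + name)
                .collect(Collectors.toList());
    }
}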
I am trying to use a CrudRepository together with spring-data-redis and Lettuce. Following all the advice I can find, I have configured my Spring Boot 2.1.8 application with @ReadingConverter and @WritingConverter converters, but when I try to use the repository I get "Path to property must not be null or empty."
Doing some debugging, this seems to be caused by org.springframework.data.redis.core.convert.MappingRedisConverter:393
writeInternal(entity.getKeySpace(), "", source, entity.getTypeInformation(), sink);
The second parameter is the path. This ends up at line 747 of MappingRedisConverter, running this code:
} else if (targetType.filter(it -> ClassUtils.isAssignable(byte[].class, it)).isPresent()) {
sink.getBucket().put(path, toBytes(value));
}
Ultimately, the put with an empty path ends up in org.springframework.data.redis.core.convert.Bucket:77 and fails the Assert.hasText(path, "Path to property must not be null or empty."); even though the data has been serialized.
Is this a bug in spring-data-redis, or do I have to configure something else?
RedisConfiguration.java
@Configuration
@EnableConfigurationProperties({RedisProperties.class})
@RequiredArgsConstructor
@EnableRedisRepositories
public class RedisConfiguration {

    private final RedisConnectionFactory redisConnectionFactory;

    @Bean
    public RedisTemplate<?, ?> redisTemplate() {
        RedisTemplate<byte[], byte[]> template = new RedisTemplate<byte[], byte[]>();
        template.setConnectionFactory(redisConnectionFactory);
        template.afterPropertiesSet();
        return template;
    }

    @Bean
    public ObjectMapper objectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        objectMapper.findAndRegisterModules();
        return objectMapper;
    }

    @Bean
    public RedisCustomConversions redisCustomConversions(List<Converter<?, ?>> converters) {
        return new RedisCustomConversions(converters);
    }
}
I've just included one writing converter here but have several reading and writing ones...
@Component
@WritingConverter
@RequiredArgsConstructor
@Slf4j
public class CategoryWritingConverter implements Converter<Category, byte[]> {

    private final ObjectMapper objectMapper;

    @Setter
    private Jackson2JsonRedisSerializer<Category> serializer;

    @Override
    public byte[] convert(Category category) {
        return getSerializer().serialize(category);
    }

    private Jackson2JsonRedisSerializer<Category> getSerializer() {
        if (serializer == null) {
            serializer = new Jackson2JsonRedisSerializer<>(Category.class);
            serializer.setObjectMapper(objectMapper);
        }
        return serializer;
    }
}
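(The reading converters are omitted above; a matching reader, sketched under the assumption that the same ObjectMapper is reused, might look like this:)

@Component
@ReadingConverter
@RequiredArgsConstructor
public class CategoryReadingConverter implements Converter<byte[], Category> {

    private final ObjectMapper objectMapper;

    @Override
    public Category convert(byte[] source) {
        try {
            // Mirror of the writer: JSON bytes back into a Category.
            return objectMapper.readValue(source, Category.class);
        } catch (IOException e) {
            throw new IllegalStateException("Could not deserialize Category", e);
        }
    }
}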
The object to write:
@Data
@JsonInclude(JsonInclude.Include.NON_NULL)
@EqualsAndHashCode(onlyExplicitlyIncluded = true)
@AllArgsConstructor
@NoArgsConstructor
@RedisHash("category")
@TypeAlias("category")
public class Category {

    @Id
    @EqualsAndHashCode.Include
    private String categoryCode;

    private String categoryText;
}
And the repo:
public interface CategoryRepository extends CrudRepository<Category, String> {
    Page<Category> findAll(Pageable pageable);
}
Can anybody advise what I have missed, or whether this is a bug I should raise on spring-data-redis?
How can I register a custom converter in my MongoTemplate with Spring Boot? I would like to do this using only annotations, if possible.
I just register the bean:
@Bean
public MongoCustomConversions mongoCustomConversions() {
    List<Converter<?, ?>> list = new ArrayList<>();
    list.add(myNewConverter());
    return new MongoCustomConversions(list);
}
Here is the place in the source code where I found it.
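Since myNewConverter() is left undefined in the answer, here is a hypothetical example of what such a converter could be (the String-to-Instant conversion is purely illustrative):

@ReadingConverter
public class StringToInstantConverter implements Converter<String, Instant> {
    @Override
    public Instant convert(String source) {
        // Parse ISO-8601 strings stored in Mongo into java.time.Instant.
        return Instant.parse(source);
    }
}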
If you only want to override the custom-conversions portion of the Spring Boot configuration, you only need to create a configuration class that provides a @Bean for the custom conversions. This is handy if you don't want to override all of the other Mongo settings (URI, database name, host, port, etc.) that Spring Boot has wired up for you from your application.properties file.
@Configuration
public class MongoConfig
{
    @Bean
    public CustomConversions customConversions()
    {
        List<Converter<?, ?>> converterList = new ArrayList<Converter<?, ?>>();
        converterList.add(new MyCustomWriterConverter());
        return new CustomConversions(converterList);
    }
}
This will also only work if you've enabled auto-configuration and excluded DataSourceAutoConfiguration:
@SpringBootApplication(scanBasePackages = {"com.mypackage"})
@EnableMongoRepositories(basePackages = {"com.mypackage.repository"})
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class MyApplication
{
    public static void main(String[] args)
    {
        SpringApplication.run(MyApplication.class, args);
    }
}
In this case, I'm setting a URI in the application.properties file and using Spring data repositories:
#mongodb settings
spring.data.mongodb.uri=mongodb://localhost:27017/mydatabase
spring.data.mongodb.repositories.enabled=true
You need to create a configuration class for converter config.
@Configuration
@EnableAutoConfiguration(exclude = { EmbeddedMongoAutoConfiguration.class })
@Profile("!testing")
public class MongoConfig extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.host}") // if it is stored in application.yml, else hard-code it here
    private String host;

    @Value("${spring.data.mongodb.port}")
    private Integer port;

    @Override
    protected String getDatabaseName() {
        return "test";
    }

    @Bean
    public Mongo mongo() throws Exception {
        return new MongoClient(host, port);
    }

    @Override
    public String getMappingBasePackage() {
        return "com.base.package";
    }

    @Override
    public CustomConversions customConversions() {
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new LongToDateTimeConverter());
        return new CustomConversions(converters);
    }
    @ReadingConverter
    static class LongToDateTimeConverter implements Converter<Long, Date> {
        @Override
        public Date convert(Long source) {
            if (source == null) {
                return null;
            }
            return new Date(source);
        }
    }
}
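(Note: the snippet above targets an older Spring Data MongoDB; on 2.x and later the override would, to my understanding, return MongoCustomConversions instead, along the lines of this sketch:)

@Override
public MongoCustomConversions customConversions() {
    // Same converter list, wrapped in the newer Mongo-specific conversions type.
    return new MongoCustomConversions(Collections.singletonList(new LongToDateTimeConverter()));
}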
I have the following SFTP Camel component configuration:
@Configuration
public class FtpCamelComponent {

    @Value("${SFTP_HOST}")
    private String sftpHost;

    @Value("${SFTP_KNOWNHOST}")
    private String sftpKnownHost;

    @Value("${SFTP_KEY}")
    private String sftpKey;

    @Value("${SFTP_USER}")
    private String sftpUser;

    @Value("${SFTP_DIRECTORY}")
    private String sftpFileDirectory;

    @Bean
    public SftpConfiguration sftpConfiguration() {
        SftpConfiguration sftpConfiguration = new SftpConfiguration();
        sftpConfiguration.setUsername(sftpUser);
        sftpConfiguration.setHost(sftpHost);
        sftpConfiguration.setKnownHostsFile(sftpKnownHost);
        sftpConfiguration.setPrivateKeyFile(sftpKey);
        sftpConfiguration.setDirectory(sftpFileDirectory);
        return sftpConfiguration;
    }

    @Bean
    public SftpEndpoint sftpEndpoint(SftpConfiguration sftpConfiguration) {
        SftpEndpoint sftpEndpoint = new SftpEndpoint();
        sftpEndpoint.setConfiguration(sftpConfiguration);
        sftpEndpoint.setEndpointUriIfNotSpecified("sftp");
        return sftpEndpoint;
    }

    @Bean
    public SftpComponent sftpComponent(SftpEndpoint sftpEndpoint) {
        SftpComponent sftpComponent = new SftpComponent();
        sftpComponent.setEndpointClass(sftpEndpoint.getClass());
        return sftpComponent;
    }
}
I added the component to my Camel context:
@Configuration
@Import({FtpCamelComponent.class,
        SftpCamelRoute.class})
public class SftpCamelContext extends CamelConfiguration {

    @Autowired
    SftpComponent sftpComponent;

    @Bean(name = "sftpCamelContext")
    protected CamelContext createCamelContext() throws Exception {
        SpringCamelContext camelContext = new SpringCamelContext();
        camelContext.setApplicationContext(getApplicationContext());
        camelContext.addComponent("sftp", sftpComponent);
        return camelContext;
    }
}
Why can't I just use sftp: in my Camel route, given that I have already configured it and added it to my Camel context?
#Bean(name = "FileToSftp")
public RouteBuilder fileToSFTP(){
return new RouteBuilder() {
public void configure() {
from("direct:fileToSftp")
.to("file:export/b1?fileName=export.csv")
.setHeader("CamelFileName", constant("export.csv"))
.to("sftp://dev#localhost:/export/in/?knownHostsFile=key/knownhost&privateKeyFile=key/id_rsa.pub&localWorkDirectory=export/b1&download=false");
}
};
}
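If the intent is to reuse the SftpEndpoint bean that is already configured, one option (a sketch relying on Camel's ref: component, which looks endpoints up in the registry by bean name) would be:

@Bean(name = "FileToSftp")
public RouteBuilder fileToSFTP() {
    return new RouteBuilder() {
        public void configure() {
            from("direct:fileToSftp")
                    .to("file:export/b1?fileName=export.csv")
                    .setHeader("CamelFileName", constant("export.csv"))
                    // "sftpEndpoint" is the bean name of the pre-configured SftpEndpoint above.
                    .to("ref:sftpEndpoint");
        }
    };
}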