awspring SQS Configuration not working in Spring Boot Project

We are attempting to move away from Spring Cloud AWS to the new io.awspring.cloud project. The documentation states:
The AmazonSQSAsync client is automatically created and passed to the template’s constructor based on the provided credentials.
I am attempting to run this locally. In my pom.xml I have added the dependency:
<dependency>
    <groupId>io.awspring.cloud</groupId>
    <artifactId>spring-cloud-aws-messaging</artifactId>
    <version>${spring.cloud.aws}</version>
</dependency>
And I removed the code to create an AmazonSQSAsync client. As per the documentation, I still define a QueueMessagingTemplate:
import com.amazonaws.services.sqs.AmazonSQSAsync;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.awspring.cloud.messaging.core.QueueMessagingTemplate;
import org.springframework.messaging.converter.MappingJackson2MessageConverter;

@Bean
@Qualifier
public QueueMessagingTemplate queueMessagingTemplateFifoSupport(AmazonSQSAsync amazonSQSAsync,
        ObjectMapper objectMapper) {
    MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
    converter.setSerializedPayloadClass(String.class);
    converter.setObjectMapper(objectMapper);
    QueueMessagingTemplate messagingTemplate = new QueueMessagingTemplate(amazonSQSAsync);
    messagingTemplate.setMessageConverter(converter);
    return messagingTemplate;
}
The package namespace was updated to use io.awspring. When running the app I get the error:
No qualifying bean of type 'com.amazonaws.services.sqs.AmazonSQSAsync' available
My belief is that Spring should supply this client automatically. Am I missing something?
In my application properties I have defined:
cloud.aws.credentials.access-key
cloud.aws.credentials.secret-key
cloud.aws.region.static
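For example (the values below are placeholders):
cloud.aws.credentials.access-key=<ACCESS_KEY>
cloud.aws.credentials.secret-key=<SECRET_KEY>
cloud.aws.region.static=us-east-1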

I was missing a dependency:
<dependency>
    <groupId>io.awspring.cloud</groupId>
    <artifactId>spring-cloud-aws-autoconfigure</artifactId>
    <version>${spring.cloud.aws}</version>
</dependency>
which is necessary for automatic creation of default clients.
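Once the autoconfigure module is on the classpath, the AmazonSQSAsync bean is created from the credentials/region properties and injected into the template bean above. A minimal usage sketch (the queue name and payload class here are hypothetical):

import io.awspring.cloud.messaging.core.QueueMessagingTemplate;
import org.springframework.stereotype.Service;

@Service
public class EventPublisher {

    private final QueueMessagingTemplate queueMessagingTemplate;

    public EventPublisher(QueueMessagingTemplate queueMessagingTemplate) {
        this.queueMessagingTemplate = queueMessagingTemplate;
    }

    public void publish(MyEvent event) {
        // Resolves the queue by name and serializes the payload via the configured converter
        queueMessagingTemplate.convertAndSend("my-queue", event);
    }
}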
On a side note: it's best to avoid this migration for now, as you will be locked into v1 of the AWS SDK for Java until you can upgrade io.awspring to a version compatible with both a newer Spring Boot and a newer AWS SDK.

Related

I am not able to connect to Google Cloud Memory Store from Spring Boot

I am developing a module with Spring Boot in my backend where I need to use Redis through GCP Memorystore. I have been searching forums and even the official documentation about Memorystore, but I cannot understand how to connect to Memorystore from my Spring Boot app.
I found a Google codelab, but they use a Compute Engine VM to install Spring Boot and then save and retrieve information from Memorystore. So I tried to do the same with my local Spring Boot app, but it didn't work; it throws an error saying:
Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to 10.1.3.4
The codelab I mentioned earlier says that you only have to add this line to your application.properties:
spring.redis.host=10.1.3.4
as well as the @EnableCaching annotation in the main class and the @Cacheable annotation in the controller method where you try to do something with Redis.
The method looks like this:
@RequestMapping("/hello/{name}")
@Cacheable("hello")
public String hello(@PathVariable String name) throws InterruptedException {
    Thread.sleep(5000);
    return "Hello " + name;
}
I don't know what else to do. Note that I am new to Redis and Memorystore.
Can anyone give me some guidance on this, please?
Thanks in advance.
codelab url: https://codelabs.developers.google.com/codelabs/cloud-spring-cache-memorystore#0
See this documentation on how to set up a Memorystore Redis instance.
Included in the documentation is how you can connect and test your Memorystore instance from different computing environments.
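Note that Memorystore instances have private IPs and are only reachable from clients in the same VPC network, which is why a local machine typically cannot connect. A quick connectivity check from a Compute Engine VM in that network (using the IP from the question and the default Redis port) might look like:
redis-cli -h 10.1.3.4 -p 6379 ping
A PONG reply confirms the instance is reachable.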
There's also a step-by-step guide on how Spring Boot can use Redis to cache with annotations.
Add the cache and Spring Data Redis starters to your pom.xml if you're using Maven for your project setup.
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-cache</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
Add this configuration in your application.properties file:
spring.redis.host=<MEMORYSTORE_REDIS_IP>
# Configure default TTL, e.g., 10 minutes
spring.cache.redis.time-to-live=600000
Turn on caching capability explicitly with the @EnableCaching annotation:
@SpringBootApplication
@EnableCaching
public class DemoApplication {
    ...
}
Once you have configured Spring Boot with Redis and enabled caching, you can use the @Cacheable annotation to cache return values.
@Service
class OrderService {

    private final OrderRepository orderRepository;

    public OrderService(OrderRepository orderRepository) {
        this.orderRepository = orderRepository;
    }

    @Cacheable("order")
    public Order getOrder(Long id) {
        // findById returns an Optional, so unwrap it before returning
        return orderRepository.findById(id).orElse(null);
    }
}

Use Micrometer with OpenFeign in spring-boot application

The OpenFeign documentation says that it supports Micrometer. How does the integration work? I could not find anything except this little documentation.
I have a FeignClient in a Spring Boot application:
@FeignClient(name = "SomeService", url = "xxx", configuration = FeignConfiguration.class)
public interface SomeService {

    @GET
    @Path("/something")
    Something getSomething();
}
with the configuration
public class FeignConfiguration {

    @Bean
    public Capability capability() {
        return new MicrometerCapability();
    }
}
and the micrometer integration as a dependency
<dependency>
    <groupId>io.github.openfeign</groupId>
    <artifactId>feign-micrometer</artifactId>
    <version>10.12</version>
</dependency>
The code makes a call, but I could not find any new metrics via the actuator overview; I expected some general information about my HTTP requests. What part is missing?
Update
I added support for this to spring-cloud-openfeign. After the next release (2020.0.2), if Micrometer is set up, the only thing you need to do is put feign-micrometer onto your classpath.
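So after upgrading, a dependency like this (version omitted, since it is managed for you, e.g. via the feign-bom) should be all that's needed:
<dependency>
    <groupId>io.github.openfeign</groupId>
    <artifactId>feign-micrometer</artifactId>
</dependency>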
Old answer
I'm not sure if you already do, but I recommend using spring-cloud-openfeign, which auto-configures Feign components for you. Unfortunately, it seems it does not auto-configure Capability (that's one reason why your solution does not work), so you need to register it manually; please see the docs for how to do it.
I was able to make this work combining the examples in the OpenFeign and Spring Cloud OpenFeign docs:
@Import(FeignClientsConfiguration.class)
class FooController {

    private final FooClient fooClient;

    public FooController(Decoder decoder, Encoder encoder, Contract contract, MeterRegistry meterRegistry) {
        this.fooClient = Feign.builder()
                .encoder(encoder)
                .decoder(decoder)
                .contract(contract)
                .addCapability(new MicrometerCapability(meterRegistry))
                .target(FooClient.class, "https://PROD-SVC");
    }
}
What I did:
Used spring-cloud-openfeign
Added feign-micrometer (see feign-bom)
Created the client in the way you can see above
Importing FeignClientsConfiguration and passing MeterRegistry to MicrometerCapability are vital
After these, and calling the client, I had new metrics:
feign.Client
feign.Feign
feign.codec.Decoder
feign.codec.Decoder.response_size
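Assuming the actuator metrics endpoint is exposed over the web, each of these can then be drilled into at a URL such as:
/actuator/metrics/feign.Client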

nodeBuilder() was removed by Elasticsearch, but the spring-data-elasticsearch documentation still contains configuration which uses nodeBuilder()

I was following the Spring Data Elasticsearch documentation and used the configuration mentioned in the link above.
@Configuration
@EnableElasticsearchRepositories(basePackages = "org/springframework/data/elasticsearch/repositories")
static class Config {

    @Bean
    public ElasticsearchOperations elasticsearchTemplate() {
        return new ElasticsearchTemplate(nodeBuilder().local(true).node().client());
    }
}
Since the import for nodeBuilder() is not mentioned in the documentation, I assumed it comes from org.elasticsearch.node.NodeBuilder.*, as mentioned in the Elasticsearch Java API. But in later releases the API changed and NodeBuilder no longer exists. So why/how is the Spring documentation still using NodeBuilder?
If this is an issue with the documentation, what's the right configuration?
The dependencies I am using:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
with Spring Boot version 2.1.1.RELEASE.
This looks like a documentation issue. I've raised DATAES-574 to have that fixed.
With Spring Boot 2.1, the usual way to create a Client bean is to set the spring.data.elasticsearch.cluster-nodes property. Behind the scenes this will create the Client as an org.elasticsearch.client.transport.TransportClient instance.
You can also define that bean yourself if you so wish.
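For example, with the property approach (host and port are placeholders; 9300 is the default transport port):
spring.data.elasticsearch.cluster-nodes=localhost:9300
Or, a minimal sketch of defining the bean yourself, assuming the Elasticsearch 6.x transport classes that Boot 2.1 ships with (cluster name, host, and port are placeholders):

import java.net.InetAddress;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.TransportAddress;
import org.elasticsearch.transport.client.PreBuiltTransportClient;

@Bean
public Client elasticsearchClient() throws Exception {
    Settings settings = Settings.builder()
            .put("cluster.name", "my-cluster") // placeholder cluster name
            .build();
    // The transport client connects on the transport port (9300), not the HTTP port
    return new PreBuiltTransportClient(settings)
            .addTransportAddress(new TransportAddress(InetAddress.getByName("localhost"), 9300));
}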
In the future, TransportClient is also going to be deprecated by Elasticsearch. At that point you'll need to use the higher-level REST API. See https://jira.spring.io/browse/DATAES-407 for the details.
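For reference, a rough sketch of what the REST-based replacement looks like (host and port are placeholders; this assumes the org.elasticsearch.client:elasticsearch-rest-high-level-client dependency is present):

import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

@Bean
public RestHighLevelClient restHighLevelClient() {
    // The REST client talks to the HTTP port (9200 by default)
    return new RestHighLevelClient(
            RestClient.builder(new HttpHost("localhost", 9200, "http")));
}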

java.lang.ClassCastException: DTOObject cannot be cast to DTOObject

I am facing a weird issue in my application, which runs on Spring Boot 1.4.0.M3 and uses the Spring cache abstraction with Redis as the provider: I receive a ClassCastException saying that an object cannot be cast to its own class.
I am using MongoDB as the database. I have a User object containing a lazily loaded list of Role objects, and each Role internally contains Permission objects, like below:
@Document
@Data
public class User implements Serializable {

    private String passwordResetToken;
    private boolean enabled = false;

    @DBRef(lazy = true)
    private List<Role> roleList;
}
My Role DTO is as follows:
@Data
@Document
public class Role implements Serializable {

    private String roleName;
    private String description;

    @DBRef(lazy = true)
    private List<Permission> permissions;
}
Now, in my Spring MVC app, loading all the roles also loads all the permissions, and since this is a repetitive operation I thought of caching the result in Redis. While loading the roles I receive the exception below:
raised java.lang.ClassCastException: com.learning.securedapp.domain.Permission cannot be cast to com.learning.securedapp.domain.Permission
Help me overcome this error.
I am attaching the source code of my project; I receive the error at line 91 of RoleController.java.
To replicate it in your local environment, log in to the application, click on the Permissions menu and then the Roles menu. In the Roles menu, click on any edit icon; you will receive the above error.
When you use DevTools with caching, you need to be aware of this limitation.
When the object is serialized into the cache, the application class loader is C1. Then, after you change some code/configuration, devtools automatically restarts the context and creates a new classloader (C2). When you hit that cached method, the cache abstraction finds an entry in the cache and deserializes it from the store. If the cache library doesn't take the context classloader into account, that object will have the wrong classloader attached to it (which explains that weird exception: A cannot be cast to A).
TL;DR do not serialize classes with devtools if the cache library doesn't use the context classloader. Or put your cache library in the application classloader:
restart.include.yourcache=/my-cache-lib-[\\w-]+\.jar
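Entries like that belong in a META-INF/spring-devtools.properties file on the application's classpath, e.g.:
# src/main/resources/META-INF/spring-devtools.properties
restart.include.yourcache=/my-cache-lib-[\\w-]+\.jar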
This worked for me; DevTools and Redis are both working. We need to pass a ClassLoader when creating the JdkSerializationRedisSerializer and it should work:
JdkSerializationRedisSerializer redisSerializer = new JdkSerializationRedisSerializer(getClass().getClassLoader());
So my RedisCacheConfig is:
@Configuration
@EnableCaching
public class RedisCacheConfig extends CachingConfigurerSupport implements CachingConfigurer {

    // ... (fields such as redisDataTTL elided)

    @Bean
    public RedisCacheManager redisCacheManager(LettuceConnectionFactory lettuceConnectionFactory) {
        JdkSerializationRedisSerializer redisSerializer = new JdkSerializationRedisSerializer(getClass().getClassLoader());
        RedisCacheConfiguration redisCacheConfiguration = RedisCacheConfiguration.defaultCacheConfig()
                .disableCachingNullValues()
                .entryTtl(Duration.ofHours(redisDataTTL))
                .serializeValuesWith(RedisSerializationContext.SerializationPair.fromSerializer(redisSerializer));
        redisCacheConfiguration.usePrefix();

        RedisCacheManager redisCacheManager = RedisCacheManager.RedisCacheManagerBuilder.fromConnectionFactory(lettuceConnectionFactory)
                .cacheDefaults(redisCacheConfiguration)
                .build();
        redisCacheManager.setTransactionAware(true);
        return redisCacheManager;
    }

    // ...
}
Check this spring boot issue: https://github.com/spring-projects/spring-boot/issues/9444
I actually tried the proposed solution (and many variations thereof) with no luck. E.g., this didn't stop the problem from occurring:
restart.include.cache=/spring-data-redis-.*.jar
I updated the above to call out the specific version I was using and it still didn't work.
What I ended up doing, which did work, was to exclude spring-boot-devtools from my project. I'm using Maven, so the dependency declaration was this:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <version>[1.5.9,)</version>
    <scope>provided</scope>
</dependency>
This will prevent any version equal to or greater than 1.5.9 from loading up. After I included the above, everything worked as expected. I know this isn't an ideal solution for all, but I made little use of the restart functions of devtools so this was actually a good approach for me.
I am using Spring Boot 2.0.5, and I ended up removing devtools altogether from pom.xml. Thanks to the answer above from @Always Learning.
As much as I hate to do this, I can't find another way for now!
<!--
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <scope>runtime</scope>
</dependency>
-->

Spring Integration with Jackson ObjectMapper and Java 8 Time (JSR-310)

I am struggling with configuring a "custom" ObjectMapper to be used by the Spring Integration DSL transformers.
I receive java.time.Instant JSON representations that I would like to parse into object properties, i.e.:
{"type": "TEST", "source":"TEST", "timestamp":{"epochSecond": 1454503381, "nano": 335000000}}
The message is a Kafka message, which raises a question: should I write a custom serializer implementing the Kafka encoders/decoders in order to transform the Kafka message into the right object, or does Spring Integration have to do this automatically?
Framework dependencies and versions:
Spring Boot - 1.3.2.RELEASE
Spring Integration Java Dsl - 1.1.1.RELEASE
FasterXml Jackson - 2.6.5
I've added this Java Configuration to the project following the Jackson documentation:
https://github.com/FasterXML/jackson-datatype-jsr310
@Configuration
public class IntegrationConfiguration {

    @Bean
    public JsonObjectMapper<JsonNode, JsonParser> jsonObjectMapper() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(new JavaTimeModule());
        return new Jackson2JsonObjectMapper(mapper);
    }
}
and the following Jackson JSR-310 artefact as well:
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
    <version>2.6.5</version>
</dependency>
Based on this post on the Spring blog, I shouldn't even have to register the new Java 8 time module:
https://spring.io/blog/2014/12/02/latest-jackson-integration-improvements-in-spring#jackson-modules
This is the exception I got:
Caused by: com.fasterxml.jackson.databind.JsonMappingException: No suitable constructor found for type [simple type, class java.time.Instant]: can not instantiate from JSON object (missing default constructor or creator, or perhaps need to add/enable type information?)
at [Source: {"type":"TEST","source":"TEST","timestamp":{"epochSecond":1454503381,"nano":335000000}}; line: 1, column: 71] (through reference chain: my.app.MyDto["timestamp"])
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1106)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:133)
at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:95)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:258)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2764)
at org.springframework.integration.support.json.Jackson2JsonObjectMapper.fromJson(Jackson2JsonObjectMapper.java:75)
at org.springframework.integration.support.json.Jackson2JsonObjectMapper.fromJson(Jackson2JsonObjectMapper.java:44)
at org.springframework.integration.support.json.AbstractJacksonJsonObjectMapper.fromJson(AbstractJacksonJsonObjectMapper.java:56)
at org.springframework.integration.json.JsonToObjectTransformer.doTransform(JsonToObjectTransformer.java:78)
at org.springframework.integration.transformer.AbstractTransformer.transform(AbstractTransformer.java:33)
... 74 more
RESOLUTION:
The problem was that I expected Spring to detect the jackson-datatype-jsr310 artifact and register the JavaTimeModule, but it doesn't, which is totally fine.
There are two ways we can fix this:
1. The accepted answer, if we use Spring Boot with Spring Integration as is.
2. If using the Spring Integration DSL, just keep the IntegrationConfiguration class with the jsonObjectMapper() bean and use it like that:
@Autowired
private JsonObjectMapper jsonObjectMapper;

return IntegrationFlows
        .from(inboundChannel())
        .transform(Transformers.fromJson(MyDto.class, jsonObjectMapper))
        ...
Spring Boot has nothing to do with the matter of forcing Spring Integration to use that.
You just need to configure a JsonToObjectTransformer with your jsonObjectMapper():
@Bean
@Transformer(inputChannel = "input", outputChannel = "output")
JsonToObjectTransformer jsonToObjectTransformer() {
    return new JsonToObjectTransformer(jsonObjectMapper());
}
Although there is no reason to register JsonObjectMapper as a bean.
Have you defined an encoder for your channel adapter?
You should always use an encoder for whichever adapter you're using, whether an inbound channel adapter or a message-driven channel adapter.
In your case, a StringEncoder should solve the problem.
<bean id="myEncoder" class="org.springframework.integration.kafka.serializer.common.StringEncoder"/>
