Spring Integration Test Binder doesn't mock dependency

I have a project (https://github.com/serjteplov/demo-kafka-mock.git) where there are a couple of dummy functions that read messages from Kafka. In one of these functions a CrudRepository is used to perform some operations against the DB:
@EnableAutoConfiguration
public class SampleFunctionConfiguration {

    @Bean
    public Function<String, String> uppercase(UserRepository userRepository) {
        String id = userRepository.findById("abc").map(User::getId).orElse(null);
        return value -> value.toUpperCase() + " id=" + id;
    }

    @Bean
    public Function<String, String> reverse() {
        return value -> new StringBuilder(value).reverse().toString();
    }
}
So the problem is to write a test for the uppercase binder function. To make this test work correctly I have to mock the call
userRepository.findById("abc")
so I've created a mock and added it to the context:
@Primary
@Bean("test")
UserRepository userRepository() {
    return new DetachedMockFactory().Mock(UserRepository)
}
and stubbed the call in the test:
userRepository.findById("abc") >> Optional.of(new User("abc", "Bob"))
After test execution the mock is successfully created, but userRepository.findById("abc") still returns null.
Can anyone tell me how to fix this problem? Any alternative implementations or workarounds would also be welcome.

You are invoking the stubbed method in the bean definition instead of in the function.
@Bean
public Function<String, String> uppercase(UserRepository userRepository) {
    String id = userRepository.findById("abc").map(User::getId).orElse(null);
    return value -> value.toUpperCase() + " id=" + id;
}
Vs.
@Bean
public Function<String, String> uppercase(UserRepository userRepository) {
    return value -> {
        String id = userRepository.findById("abc").map(User::getId).orElse(null);
        return value.toUpperCase() + " id=" + id;
    };
}
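As an alternative to the Spock detached mock, a plain Java/Mockito test can replace the repository with @MockBean and call the function bean directly. This is only a minimal sketch (the test class name and expected string are illustrative, and it bypasses the test binder); note it only works once findById() has been moved inside the lambda as shown above, so the stub is consulted at invocation time rather than at bean creation time:

import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.Optional;
import java.util.function.Function;

import org.junit.jupiter.api.Test;
import org.mockito.Mockito;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.mock.mockito.MockBean;

@SpringBootTest(classes = SampleFunctionConfiguration.class)
class UppercaseFunctionTest {

    // Replaces the real UserRepository bean with a Mockito mock in the test context
    @MockBean
    private UserRepository userRepository;

    @Autowired
    @Qualifier("uppercase")
    private Function<String, String> uppercase;

    @Test
    void uppercaseAppendsIdFromRepository() {
        // Stubbing happens here, before the function is invoked,
        // which is why findById() must be called inside the lambda
        Mockito.when(userRepository.findById("abc"))
                .thenReturn(Optional.of(new User("abc", "Bob")));

        assertEquals("HELLO id=abc", uppercase.apply("hello"));
    }
}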

Related

Is Spring @Component annotation used correctly?

The purpose of this question is to find out whether the code is written with the right approach. Let's do CRUD operations on categories and posts in a blog website project. To keep the question short, I share just the create and update side.
(Technologies used in the project: spring-boot, mongodb)
Let's start with the Category model:
#Document("category")
public class Category{
#Id
private String id;
#Indexed(unique = true, background = true)
private String name;
#Indexed(unique = true, background = true)
private String slug;
// getter and setter
An abstract BaseController class and an IController interface are created for the fundamental save, delete and update operations. The controller side is shared below:
public interface IController<T> {

    @PostMapping("/save")
    ResponseEntity<BlogResponse> save(T object);

    @GetMapping(value = "/find-all")
    ResponseEntity<BlogResponse> findAll();

    @GetMapping(value = "/delete-all")
    ResponseEntity<BlogResponse> deleteAll();
}
public abstract class BaseController<T extends MongoRepository<S, String>, S> implements IController<S> {

    @Autowired
    private T repository;

    @Autowired
    private BlogResponse blogResponse;

    @PostMapping(value = "/save", consumes = MediaType.APPLICATION_FORM_URLENCODED_VALUE)
    public @ResponseBody ResponseEntity<BlogResponse> save(S object) {
        try {
            S model = (S) repository.save(object);
            String modelName = object.getClass().getSimpleName().toLowerCase();
            blogResponse.setMessage(modelName + " is saved successfully").putData(modelName, object);
        } catch (DuplicateKeyException dke) {
            return new ResponseEntity<BlogResponse>(blogResponse.setMessage("This data is already existing!!!"), HttpStatus.BAD_REQUEST);
        } catch (Exception e) {
            return new ResponseEntity<BlogResponse>(blogResponse.setMessage(e.getMessage()), HttpStatus.INTERNAL_SERVER_ERROR);
        }
        return new ResponseEntity<BlogResponse>(blogResponse, HttpStatus.OK);
    }

    // delete, findAll and other controllers
}

@RestController
@RequestMapping(value = "category")
@RequestScope
public class CategoryController extends BaseController<ICategoryRepository, Category> {
    // More specific operations like findSlug() can be written here.
}
And finally, the BlogResponse component is shared below:
@Component
@Scope("prototype")
public class BlogResponse {

    private String message;
    private Map<String, Object> data;

    public String getMessage() {
        return message;
    }

    public BlogResponse setMessage(String message) {
        this.message = message;
        return this;
    }

    public BlogResponse putData(String key, Object object) {
        if (data == null)
            data = new HashMap<String, Object>();
        data.put(key, object);
        return this;
    }

    public Map<String, Object> getData() {
        return data;
    }

    @Override
    public String toString() {
        return "BlogResponse{" +
                "message='" + message + '\'' +
                ", data=" + data +
                '}';
    }
}
Question: I am new to Spring Boot and I want to move forward by doing it right. BlogResponse is made a bean by using the @Component annotation. The docs say that other annotations like @Controller and @Service are specializations of @Component for more specific use cases, so I think I can't use them here. BlogResponse is given prototype scope to create a new object at each injection. Also, its life ends after the response because of @RequestScope. Are these annotations used correctly? Maybe there is a more effective way or approach. Please remark on any other roughness you see as well.

Spring IoC: identifier per request

I've created this bean in order to get a Supplier<String>:
@Bean
public Supplier<String> auditIdSupplier() {
    return () -> String.join(
            "-",
            "KEY",
            UUID.randomUUID().toString()
    );
}
As you can see, it's intended to only generate a straightforward identifier string.
Each time it's called, a new identifier is supplied.
I'd like to change this behavior in order to get the same generated identifier within the request scope. I mean: the first time a request arrives, a new identifier is generated. From then on, subsequent calls to this Supplier have to return that first generated identifier for the rest of the request.
Any ideas?
As written in the comments, maybe something like the following will work:
@Bean
@RequestScope
public Supplier<String> auditIdSupplier() {
    String val = String.join("-", "KEY", UUID.randomUUID().toString());
    return () -> val;
}
This is my version:
@Component
@Scope(WebApplicationContext.SCOPE_REQUEST)
public class AuditIdPerRequest {

    private String key;

    @PostConstruct
    public void calculateKey() {
        this.key = String.join(
                "-",
                "KEY",
                UUID.randomUUID().toString()
        );
    }

    public String getAuditId() {
        return this.key;
    }
}
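If that component is consumed from a singleton bean, the request-scoped instance can be resolved lazily, for example through an ObjectProvider. A minimal usage sketch (the AuditService name and its audit() method are only illustrative):

@Service
public class AuditService {

    private final ObjectProvider<AuditIdPerRequest> auditIdProvider;

    public AuditService(ObjectProvider<AuditIdPerRequest> auditIdProvider) {
        this.auditIdProvider = auditIdProvider;
    }

    public void audit(String message) {
        // Resolves the instance bound to the current request; every call within
        // the same request sees the same id
        String auditId = auditIdProvider.getObject().getAuditId();
        // ... attach auditId to the log entry / outgoing message ...
    }
}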
You need to configure a request-scoped bean:
@Configuration
public class MyConfig {

    @Bean
    @RequestScope
    public String myRequestScopedIdentifyer(NativeWebRequest httpRequest) {
        // You don't need the request as a parameter here, but you can inject it this way if you need request context
        return String.join(
                "-",
                "KEY",
                UUID.randomUUID().toString());
    }
}
And then inject it where appropriate, either with field injection:
@Component
public class MyClass {

    @Autowired
    @Qualifier("myRequestScopedIdentifyer")
    private String identifier;
}
or with an ObjectFactory:
@Component
public class MyClass {

    private final ObjectFactory<String> identifyerProvider;

    public MyClass(@Qualifier("myRequestScopedIdentifyer") ObjectFactory<String> identifyerProvider) {
        this.identifyerProvider = identifyerProvider;
    }

    public void someMethod() {
        String requestScopedId = identifyerProvider.getObject();
    }
}

Cannot Write Data to ElasticSearch with AbstractReactiveElasticsearchConfiguration

I am trying to write data to my local Elasticsearch Docker container (7.4.2). For simplicity I used the AbstractReactiveElasticsearchConfiguration given by Spring, also overriding the entityMapper function. Then I constructed my repository extending ReactiveElasticsearchRepository.
In the end I used my autowired repository to saveAll() my collection of elements containing the data. However, Elasticsearch doesn't write any data. Also, I have a REST controller which starts my whole process and basically returns nothing, a DeferredResult<ResponseEntity<Void>>.
The REST method coming from my ApiDelegateImpl:
@Override
public DeferredResult<ResponseEntity<Void>> openUsageExporterStartPost() {
    final DeferredResult<ResponseEntity<Void>> deferredResult = new DeferredResult<>();
    ForkJoinPool.commonPool().execute(() -> {
        try {
            openUsageExporterAdapter.startExport();
            deferredResult.setResult(ResponseEntity.accepted().build());
        } catch (Exception e) {
            deferredResult.setErrorResult(e);
        }
    });
    return deferredResult;
}
My Elasticsearch Configuration
@Configuration
public class ElasticSearchConfig extends AbstractReactiveElasticsearchConfiguration {

    @Value("${spring.data.elasticsearch.client.reactive.endpoints}")
    private String elasticSearchEndpoint;

    @Bean
    @Override
    public EntityMapper entityMapper() {
        final ElasticsearchEntityMapper entityMapper = new ElasticsearchEntityMapper(elasticsearchMappingContext(), new DefaultConversionService());
        entityMapper.setConversions(elasticsearchCustomConversions());
        return entityMapper;
    }

    @Override
    public ReactiveElasticsearchClient reactiveElasticsearchClient() {
        ClientConfiguration clientConfiguration = ClientConfiguration.builder()
                .connectedTo(elasticSearchEndpoint)
                .build();
        return ReactiveRestClients.create(clientConfiguration);
    }
}
My Repository
public interface OpenUsageRepository extends ReactiveElasticsearchRepository<OpenUsage, Long> {
}
My DTO
@Data
@Document(indexName = "open_usages", type = "open_usages")
@TypeAlias("OpenUsage")
public class OpenUsage {

    @Field(name = "id")
    @Id
    private Long id;

    ......
}
My Adapter Implementation
@Autowired
private final OpenUsageRepository openUsageRepository;

...transform entity into OpenUsage...

public void doSomething(final List<OpenUsage> openUsages){
    openUsageRepository.saveAll(openUsages)
}
And finally my IT test
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
@Testcontainers
@TestPropertySource(locations = {"classpath:application-it.properties"})
@ContextConfiguration(initializers = OpenUsageExporterApplicationIT.Initializer.class)
class OpenUsageExporterApplicationIT {

    @LocalServerPort
    private int port;

    private final static String STARTCALL = "http://localhost:%s/open-usage-exporter/start/";

    @Container
    private static ElasticsearchContainer container = new ElasticsearchContainer("docker.elastic.co/elasticsearch/elasticsearch:6.8.4").withExposedPorts(9200);

    static class Initializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        public void initialize(final ConfigurableApplicationContext configurableApplicationContext) {
            final List<String> pairs = new ArrayList<>();
            pairs.add("spring.data.elasticsearch.client.reactive.endpoints=" + container.getContainerIpAddress() + ":" + container.getFirstMappedPort());
            pairs.add("spring.elasticsearch.rest.uris=http://" + container.getContainerIpAddress() + ":" + container.getFirstMappedPort());
            TestPropertyValues.of(pairs).applyTo(configurableApplicationContext);
        }
    }

    @Test
    void testExportToES() throws IOException, InterruptedException {
        final List<OpenUsageEntity> openUsageEntities = dbPreparator.insertTestData();
        assertTrue(openUsageEntities.size() > 0);
        final String result = executeRestCall(STARTCALL);
        // Awaitility here tells me nothing is in ElasticSearch :(
    }

    private String executeRestCall(final String urlTemplate) throws IOException {
        final String url = String.format(urlTemplate, port);
        final HttpUriRequest request = new HttpPost(url);
        final HttpResponse response = HttpClientBuilder.create().build().execute(request);
        // Get the result.
        return EntityUtils.toString(response.getEntity());
    }
}
public void doSomething(final List<OpenUsage> openUsages){
    openUsageRepository.saveAll(openUsages)
}
This lacks a semicolon at the end, so it should not compile.
But I assume this is just a typo, and there is a semicolon in reality.
Anyway, saveAll() returns a Flux. This Flux is just a recipe for saving your data, and it is not 'executed' until subscribe() is called by someone (or something like blockLast()). You just throw that Flux away, so the saving never gets executed.
How to fix this? One option is to add a .blockLast() call:
openUsageRepository.saveAll(openUsages).blockLast();
But this will save the data in a blocking way, effectively defeating the reactivity.
Another option, if the code you are calling saveAll() from supports reactivity, is just to return the Flux returned by saveAll(); but as your doSomething() has a void return type, this is doubtful.
It is not clear how your startExport() connects to doSomething() anyway. But it looks like your calling code does not use any notion of reactivity, so a real solution would be to either rewrite the calling code to be reactive (obtain a Publisher and subscribe() to it, then wait until the data arrives), or revert to the blocking API (ElasticsearchRepository instead of ReactiveElasticsearchRepository).
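For illustration, a minimal sketch of the reactive variant (the Mono-returning signature is an assumption about how doSomething() could be reshaped, not code from the question):

public Mono<Void> doSomething(final List<OpenUsage> openUsages) {
    // saveAll() only describes the work; returning the completion signal
    // lets the caller decide where to subscribe or block
    return openUsageRepository.saveAll(openUsages).then();
}

// ... and in a non-reactive caller such as startExport():
// doSomething(openUsages).block();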

Spring Boot MeterRegistryCustomizer with NewRelicRegistry not working as I expect.

I have a bean set up in a configuration class. My goal is to transform, deny, apply common tags to, and otherwise modify the metrics that are sent to New Relic.
Here is my configuration class:
@Configuration
@Log4j2
public class MetricsConfig {

    private static final Duration HISTOGRAM_EXPIRY = Duration.ofMinutes(10);
    private static final Duration STEP = Duration.ofSeconds(5);

    private final transient String profile;

    @Autowired
    public MetricsConfig(@Value("${spring.profiles.active}") final String profile) {
        this.profile = profile;
    }

    @Bean
    public MeterRegistryCustomizer<NewRelicMeterRegistry> metricsCommonTags() {
        log.info("Configuring Registry");
        return registry -> registry.config()
                .commonTags(Arrays.asList(Tag.of("appId", "1111111"), Tag.of("environment", profile),
                        Tag.of("app", "aws-app-name")))
                .meterFilter(new MeterFilter() {
                    @Override
                    public Meter.Id map(Meter.Id id) {
                        if (id.getName().startsWith("http")) {
                            return id.withName("app-name." + profile + "." + id.getName());
                        }
                        return id;
                    }

                    @Override
                    public DistributionStatisticConfig configure(Meter.Id id, DistributionStatisticConfig config) {
                        return config.merge(DistributionStatisticConfig.builder()
                                .percentilesHistogram(true)
                                .percentiles(0.5, 0.75, 0.95)
                                .expiry(HISTOGRAM_EXPIRY)
                                .bufferLength((int) (HISTOGRAM_EXPIRY.toMillis() / STEP.toMillis()))
                                .build());
                    }
                }).meterFilter(MeterFilter.deny(id -> {
                    String uri = id.getTag("uri");
                    log.info("id: [{}]", id);
                    return (uri != null && uri.startsWith("/swagger") && uri.startsWith("/manage")) || !id.getName().toLowerCase().startsWith("app-name");
                }));
    }
}
Then, I also inject MeterRegistry into some of my classes to capture custom events (Timer, Counter).
Everything works as far as capturing the events goes, except that the data in New Relic is missing the common tags, transformations, and anything else that I apply in the MetricsConfig class.
Am I missing something in making sure my app wires up the MeterRegistryCustomizer correctly?
Argh... I had implemented a HandlerInterceptorAdapter to try to record a Counter for all requests with additional tags, which it did not like.
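For reference, a custom counter can also be recorded directly through the injected MeterRegistry, without a HandlerInterceptorAdapter. A hedged sketch (the class name, meter name and tag are illustrative; meters registered this way still pass through the filters configured in MetricsConfig):

@Component
public class RequestCounter {

    private final MeterRegistry meterRegistry;

    public RequestCounter(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    public void countRequest(String uri) {
        // The counter is created on first use and reused afterwards for the same tags
        meterRegistry.counter("app-name.requests", "uri", uri).increment();
    }
}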

Map parameter as GET param in Spring REST controller

How can I pass a Map parameter as a GET param in the URL to a Spring REST controller?
It's possible to bind all request parameters into a Map just by adding a Map argument annotated with @RequestParam:
#RequestMapping("/demo")
public String example(#RequestParam Map<String, String> map){
String apple = map.get("APPLE");//apple
String banana = map.get("BANANA");//banana
return apple + banana;
}
Request
/demo?APPLE=apple&BANANA=banana
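For that request, the handler above returns the two bound values concatenated:
applebanana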
Source -- https://reversecoding.net/spring-mvc-requestparam-binding-request-parameters/
There are different ways (but a simple @RequestParam("myMap") Map<String, String> does not work - maybe that is no longer true!)
The (IMHO) easiest solution is to use a command object; then you can use [key] in the URL to specify the map key:
@Controller
@RequestMapping("/demo")
public class DemoController {

    public static class Command {
        private Map<String, String> myMap;

        public Map<String, String> getMyMap() { return myMap; }
        public void setMyMap(Map<String, String> myMap) { this.myMap = myMap; }

        @Override
        public String toString() {
            return "Command [myMap=" + myMap + "]";
        }
    }

    @RequestMapping(method = RequestMethod.GET)
    public ModelAndView helloWorld(Command command) {
        System.out.println(command);
        return null;
    }
}
Request: http://localhost:8080/demo?myMap[line1]=hello&myMap[line2]=world
Output: Command [myMap={line1=hello, line2=world}]
Tested with Spring Boot 1.2.7
