We construct a large object in Java and send it as text/xml, and the client seems to time out. Is it possible to convert this object to a byte array or String and stream the data?
Would that help? We are using Spring Boot.
You could use TEXT_EVENT_STREAM_VALUE with Spring's reactive stack, but be aware that it is expensive (CPU/memory).
Ideally the Spring controller should return something like:
return quasarService.findByReceivedStatus(organizationId, receiverId)
        .delayElements(Duration.ofSeconds(delay))
        .cache()
        .take(50)
        .onBackpressureDrop()
        .repeat(3)     // if the repeat count is not specified, the connection stays open indefinitely
        .parallel(50); // returns a ParallelFlux
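To see why streaming can help at all, here is a framework-free sketch (plain java.io, nothing Spring- or Reactor-specific) contrasting building the whole payload in memory with writing it to the output stream in chunks. With chunked writes the client starts receiving bytes immediately instead of waiting for the full document, which is what tends to avoid the timeout:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class StreamingSketch {

    // Buffered: the whole XML document is assembled in memory first,
    // and the client receives nothing until it is complete.
    static void writeBuffered(OutputStream out, int items) throws IOException {
        StringBuilder xml = new StringBuilder("<items>");
        for (int i = 0; i < items; i++) {
            xml.append("<item>").append(i).append("</item>");
        }
        xml.append("</items>");
        out.write(xml.toString().getBytes(StandardCharsets.UTF_8));
    }

    // Streamed: each element is written (and can be flushed to the client)
    // as soon as it is produced.
    static void writeStreamed(OutputStream out, int items) throws IOException {
        out.write("<items>".getBytes(StandardCharsets.UTF_8));
        for (int i = 0; i < items; i++) {
            out.write(("<item>" + i + "</item>").getBytes(StandardCharsets.UTF_8));
            out.flush(); // bytes reach the client incrementally
        }
        out.write("</items>".getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream a = new ByteArrayOutputStream();
        ByteArrayOutputStream b = new ByteArrayOutputStream();
        writeBuffered(a, 3);
        writeStreamed(b, 3);
        // Both produce the same bytes; only the delivery pattern differs.
        System.out.println(a.toString(StandardCharsets.UTF_8)
                .equals(b.toString(StandardCharsets.UTF_8)));
    }
}
```

In a real controller the streamed variant corresponds to returning a Flux of elements with a streaming media type, rather than one fully materialized object.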
So the question: I have the following JSON:
{
  "type": "ipv4",
  "value": "1.2.3.4",
  "firstSeen": "2020-07-10 15:00:00.000",
  "totalCount": 8
}
I need to create a Spring Boot microservice from it, with the following restrictions:
totalCount cannot be less than 0 and cannot be more than 100.
firstSeen should ALWAYS be converted to ISO 8601 format. The user can enter the date
in any string format; return an error if it is not well formed.
Expose the following RESTful APIs:
Create a new record (as shown above, with an auto-generated id)
Get a record by value
As this is my first time working with microservices, I cannot quite get my head around this problem. Can anyone help me with it, please?
You will need to create a basic Spring Boot project using Spring Initializr. If you are using IntelliJ, you can use this link as a reference: https://www.jetbrains.com/help/idea/your-first-spring-application.html#create-new-spring-boot-project.
Then add a new controller method which accepts a JSON request. Since you are trying to create a new record, I suggest you use the POST method. The JSON request will accept the 4 input parameters you mentioned. This is very basic and you should be able to find it in pretty much any Spring Boot tutorial online; see for example https://dzone.com/articles/simple-spring-boot-post.
The JSON request object can carry validator annotations which check for the criteria you gave. For a numeric field like totalCount, use @Min(0) and @Max(100) with a message such as "totalCount cannot be less than 0 and cannot be more than 100" (note that @Size applies to string/collection lengths, not to numeric ranges). https://www.baeldung.com/jpa-size-length-column-differences
For the date you might need to write a custom validator to check the specific format you want. As for creating a record, I assume you mean adding it to a database; you can configure your database in the yaml file, and again there are lots of online resources on how to configure a database in a Spring Boot project. https://blog.tericcabrel.com/write-custom-validator-for-body-request-in-spring-boot/
Since it's your first time, it might take a while to figure out the various details, but I assure you that once you get the hang of it, it's going to be easy.
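As a concrete illustration of the firstSeen requirement, here is a minimal, framework-free sketch (plain java.time, no Spring) of what such a custom validator might do: try a few accepted input patterns and normalize the date to ISO 8601, or reject it. The set of accepted patterns is my assumption for this sketch, not something stated in the question:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.util.List;

public class FirstSeenNormalizer {

    // Input patterns we choose to accept -- an assumption for this sketch.
    private static final List<DateTimeFormatter> ACCEPTED = List.of(
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS"),
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"),
            DateTimeFormatter.ISO_LOCAL_DATE_TIME);

    // ISO 8601 output shape, fixed so the result is deterministic.
    private static final DateTimeFormatter ISO_OUT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss");

    /** Returns the date in ISO 8601 form, or throws if no accepted pattern matches. */
    public static String toIso8601(String raw) {
        for (DateTimeFormatter f : ACCEPTED) {
            try {
                return LocalDateTime.parse(raw, f).format(ISO_OUT);
            } catch (DateTimeParseException ignored) {
                // try the next pattern
            }
        }
        throw new IllegalArgumentException("firstSeen is not well formed: " + raw);
    }

    public static void main(String[] args) {
        System.out.println(toIso8601("2020-07-10 15:00:00.000")); // prints 2020-07-10T15:00:00
    }
}
```

In a Spring Boot service this logic would typically live inside a custom ConstraintValidator, as the linked tutorial describes.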
I'm using Spring Cloud Stream 3.x in a Spring Boot 2.x application to consume messages from a Kafka topic.
I want to have a listener that consumes messages conditionally based on a custom header value, as per the docs:
@StreamListener(value = "someTopic", condition = "headers['SomeHeader']=='SomeHeaderValue'")
public void onMessage(Message<?> message) {
    LOGGER.info("Received: {}", message);
}
However, the listener never gets notified, and if the condition is removed I see the following in the log:
Received: ... SomeHeader: [B@1055e4af ...
It turns out that custom headers are left in Kafka's raw byte-array format, making them ineligible for condition evaluation.
Is some additional configuration needed or am I missing something?
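The [B@… in the log is the clue: it is the default toString() of a Java byte[], which is what the SpEL condition ends up comparing against a String. A minimal plain-Java illustration, with no Kafka involved (the header name and value are just the ones from the question):

```java
import java.nio.charset.StandardCharsets;

public class ByteArrayHeaderDemo {

    // Default toString() of a byte[]: type descriptor plus identity hash, e.g. "[B@1055e4af".
    static String rawToString(byte[] header) {
        return header.toString();
    }

    // What a header mapper must do for the SpEL condition to match: decode to a String.
    static String decode(byte[] header) {
        return new String(header, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] header = "SomeHeaderValue".getBytes(StandardCharsets.UTF_8);
        System.out.println(rawToString(header)); // prints something like [B@1055e4af
        System.out.println(decode(header));      // prints SomeHeaderValue
    }
}
```

So headers['SomeHeader']=='SomeHeaderValue' can never be true while the header value is still a raw byte[].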
After some digging in the sources and on Stack Overflow I found the following:
Spring Cloud Stream delegates message and header conversion to Spring Kafka (KafkaMessageChannelBinder ~ getHeaderMapper).
Headers are left in raw format by the default header-conversion implementation (BinderHeaderMapper).
Spring Cloud Stream allows customization of header mapping, and in particular conversion of headers from byte array to String (see: How can I map incoming headers as String instead of byte[] in my Spring Cloud Stream project?).
So I added my custom header mapper bean (the bean name is important; it lets you omit an additional configuration property), which maps my custom header to String:
@Bean
public KafkaHeaderMapper kafkaBinderHeaderMapper() {
    SimpleKafkaHeaderMapper headerMapper = new SimpleKafkaHeaderMapper();
    headerMapper.setRawMappedHeaders(Map.of(
            "SomeHeader", true // true = map this inbound header as a String
    ));
    return headerMapper;
}
That fixed the problem:
Received: ... SomeHeader: SomeHeaderValue ...
P.S. This looks like a bug in Spring Cloud Stream:
It introduces its own header-mapper implementation (BinderHeaderMapper), but that implementation doesn't respect the conditional-routing feature.
The header mapper is subclassed in KafkaMessageChannelBinder; this added behaviour is non-obvious and is lost if a custom header mapper is provided.
Let's say I have a very simple HTTP endpoint using Spring Webflux:
@GetMapping
fun greeting(@RequestParam("msg") message: String) = Mono.just(Greeting(message))
where Greeting is a simple DTO serialized as JSON. How can I instruct Spring Webflux to return the response compressed as GZIP? I'm using the Netty implementation if that matters.
What you are looking for are the server compression properties:
server.compression.enabled=true
server.compression.min-response-size=1024
Adding to the accepted answer above: it is always better to set min-response-size as well, to avoid the server overhead of compressing every response, and to restrict the mime-types:
server.compression.enabled=true
server.compression.mime-types=text/html,text/xml,text/plain,text/css,text/javascript,application/javascript,application/json,application/xml
server.compression.min-response-size=2KB
I have a Spring Boot based REST API application with the following endpoint (Written in Kotlin)
@RequestMapping(value = ["/search"], method = [RequestMethod.GET])
@ApiOperation("Check whether any of the map values exists. Returns string 'true' if stamp exists, else 'false'")
fun checkExists(
    @ApiParam("information about the stamp as key-value pairs (example: ds=2017-11-34&hh=05)", required = true)
    @RequestParam searchValues: Map<String, String>
): Boolean {
    return service.checkExists(searchValues)
}
And I know Spring supports sending a dynamic map of key value pairs as documented here.
I am also using Swagger to document the API definitions, and furthermore I am using swagger-codegen-cli to generate the client library with which someone can connect to this REST API.
Now the issue is that I am not able to send a map of values from the Swagger-generated client to the Spring REST API (even though Spring supports it). Starting with Swagger OpenAPI 3, they've added support for Object types in the specification, but this works differently from what I need. For example, with just Spring and a RequestParam of type Map,
http://localhost:8080/search?foo=A&bar=B
is parsed as a map of key-value pairs:
key="foo",value="A"
key="bar",value="B"
But when I send a Map object from the Swagger client with the same key-value pairs:
Map<String, String> values = new HashMap<>();
values.put("foo", "A");
values.put("bar", "B");
return clientApi.checkExistsUsingGET(values);
this sends a request to the REST API in the form
http://localhost:8080/search?searchValues={foo=A,bar=B}
and the map in Spring side ends up as
key="searchValues",value="{foo=A,bar=B}"
I've been struggling to get the Swagger client API to send the request in the way the Spring API expects for a Map of values, but I am not able to figure out a solution.
Am I using the client API in the wrong way, or can this just not be done with Swagger?
Any suggestions/opinions appreciated!
This is not yet supported by swagger-ui. See this issue https://github.com/swagger-api/swagger-ui/issues/2241
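Until swagger-ui and the generated client support this, one workaround (my suggestion, not part of the generated client) is to bypass the generated method and build the query string yourself, so that each map entry becomes its own parameter, which is the shape Spring's Map binding expects:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class QueryStringBuilder {

    /** Flattens a map into foo=A&bar=B form, URL-encoding keys and values. */
    public static String toQueryString(Map<String, String> params) {
        return params.entrySet().stream()
                .map(e -> URLEncoder.encode(e.getKey(), StandardCharsets.UTF_8)
                        + "=" + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
                .collect(Collectors.joining("&"));
    }

    public static void main(String[] args) {
        // LinkedHashMap keeps insertion order, so the output is deterministic.
        Map<String, String> values = new LinkedHashMap<>();
        values.put("foo", "A");
        values.put("bar", "B");
        // -> http://localhost:8080/search?foo=A&bar=B
        System.out.println("http://localhost:8080/search?" + toQueryString(values));
    }
}
```

The resulting URL is then sent with any plain HTTP client instead of the generated checkExistsUsingGET method.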
I am looking at this example: https://github.com/spring-cloud/spring-cloud-stream-samples/blob/master/kafka-streams-samples/kafka-streams-product-tracker/src/main/java/kafka/streams/product/tracker/KafkaStreamsProductTrackerApplication.java
I am trying to do something similar, but for me it is not working. How is the product JSON string received as a Product object?
By default, deserialization on the inbound KStream is done by Spring Cloud Stream. The default content type used is application/json (equivalent to providing the property spring.cloud.stream.bindings.input.contentType: application/json). This is why the product JSON string is properly converted.
You can disable the framework-level conversion and let Kafka do it instead, in which case you need to provide the Serdes through properties. To enable native deserialization, set the property spring.cloud.stream.bindings.input.consumer.useNativeDecoding: true, and then provide the appropriate Serdes. More on all of this here: https://docs.spring.io/spring-cloud-stream/docs/Elmhurst.BUILD-SNAPSHOT/reference/htmlsingle/#_message_conversion
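Put together, the native-decoding route described above might look roughly like this in application.properties. The binding name input is taken from the sample; the Serde class choices are illustrative assumptions, not something the sample mandates:

```properties
# Let Kafka deserialize natively instead of using framework-level conversion
spring.cloud.stream.bindings.input.consumer.useNativeDecoding=true
# Provide the Serdes for the binding (class names are illustrative)
spring.cloud.stream.kafka.streams.bindings.input.consumer.keySerde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.cloud.stream.kafka.streams.bindings.input.consumer.valueSerde=org.springframework.kafka.support.serializer.JsonSerde
```

With framework-level conversion (the default), none of this is needed and the JSON-to-Product mapping happens automatically.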