I am trying to process a zip file with Apache Camel.
After making a call, I get a zip file and try to prepare the next call with this zip file as body.
The call requires form data with one name and the zip file as the value.
I handle it this way:
process(e -> {
    Object zip = e.getIn().getBody();
    MultiValueMap<String, Object> body = new LinkedMultiValueMap<>();
    body.add("file", zip);
    e.getIn().setBody(body);
})
But I receive the exception:
org.apache.camel.NoTypeConversionAvailableException: No type converter available to convert from type: org.springframework.util.LinkedMultiValueMap to the required type: java.io.InputStream with value {file=[[B#2b02c691]}
Any Ideas?
Cheers!
I tried to get the response as a byte[] but it still does not work.
As Jeremy said, the error says Camel is expecting (further in the process) a body of type InputStream, whilst you are obviously preparing a body of type MultiValueMap (BTW: why use a map if you have a single object to handle?).
I do not know what the concrete type of your 'zip' object is, but (if needed) you may have to replace the current body with its InputStream equivalent:
process(e -> {
    // Print concrete type
    Object zip = e.getMessage().getBody();
    System.out.println("Type is " + zip.getClass());
    // Convert body
    InputStream is = e.getMessage().getBody(InputStream.class);
    // Replace body
    e.getMessage().setBody(is);
})
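If the downstream call really does need a multipart form with a single named part (as the question states), one possible approach is sketched below. It is not part of the answer above, and it assumes the camel-http producer will send an org.apache.http.HttpEntity body as-is (worth verifying for your component and version); the part name "file" comes from the question, while the file name "file.zip" is just an illustrative placeholder.
process(e -> {
    // Read the zip returned by the previous call as raw bytes
    byte[] zip = e.getMessage().getBody(byte[].class);
    // Build a multipart entity with a single part named "file"
    // (assumption: the downstream endpoint expects exactly this part name)
    org.apache.http.HttpEntity multipart = org.apache.http.entity.mime.MultipartEntityBuilder.create()
            .addBinaryBody("file", zip, org.apache.http.entity.ContentType.create("application/zip"), "file.zip")
            .build();
    e.getMessage().setBody(multipart);
})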
I’m having an issue with the Spring 5 reactive WebClient, when I request an endpoint that returns a correctly formatted JSON response with content type "text/plain;charset=UTF-8".
The exception is
org.springframework.web.reactive.function.UnsupportedMediaTypeException:
Content type 'text/plain;charset=UTF-8' not supported for bodyType=MyDTOClass
Here is how I made the request:
webClient.get().uri(endpoint).retrieve().bodyToFlux(MyDTOClass.class)
EDIT: Headers are "correctly" set (Accept, Content-Type). I have tried different content-type combinations (json, json + UTF8, text plain, text plain + UTF8) without success. I think the issue is that .bodyToFlux(MyDTOClass.class) doesn't know how to translate "text" into MyDTOClass objects.
If I change the request to:
webClient.get().uri(endpoint).retrieve().bodyToFlux(String.class)
I can read the String.
EDIT 2: The next quote is extracted from the Spring documentation
(https://docs.spring.io/spring/docs/current/spring-framework-reference/web-reactive.html#webflux-codecs-jackson)
By default both Jackson2Encoder and Jackson2Decoder do not support elements of type String. Instead the default assumption is that a string or a sequence of strings represent serialized JSON content, to be rendered by the CharSequenceEncoder. If what you need is to render a JSON array from Flux<String>, use Flux#collectToList() and encode a Mono<List<String>>.
I think the solution is to define a new Decoder/Reader in order to transform the String into MyDTOClass, but I don't know how to do it.
In case someone needs it, here is the solution:
This answer (https://stackoverflow.com/a/57046640/13333357) is the key. We have to add a custom decoder in order to specify what to deserialize the response into and how.
But we have to keep this in mind: the class-level annotation @JsonIgnoreProperties is set by default on the JSON mapper and has no effect on other mappers. So if your DTO doesn't match all the response "json" properties, the deserialization will fail.
Here is how to configure the ObjectMapper and the WebClient to deserialize json objects from text responses:
...
WebClient.builder()
    .baseUrl(url)
    .exchangeStrategies(ExchangeStrategies.builder().codecs(configurer -> {
        ObjectMapper mapper = new ObjectMapper();
        mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        configurer.customCodecs().decoder(
                new Jackson2JsonDecoder(mapper, MimeTypeUtils.parseMimeType(MediaType.TEXT_PLAIN_VALUE)));
    }).build())
    .build();
...
Cheers!
Following up on this answer from Charlie above, you can now add an extra "codec" without replacing the existing ones.
You can also easily build Spring's default configured ObjectMapper via Jackson2ObjectMapperBuilder.json().build().
Here's an example that reuses the ObjectMapper from the built-in Jackson2JsonDecoder:
var webClient = webClientBuilder
        .baseUrl(properties.getBaseUrl())
        .codecs(configurer -> {
            // This API returns JSON with content type text/plain, so we need to register
            // a custom decoder to deserialize this response via Jackson.
            // Reuse the existing Jackson decoder's ObjectMapper if available, or create a new one.
            // Note: getReaders() returns HttpMessageReaders, so the Jackson decoder has to be
            // unwrapped from its DecoderHttpMessageReader first.
            ObjectMapper objectMapper = configurer.getReaders().stream()
                    .filter(reader -> reader instanceof DecoderHttpMessageReader)
                    .map(reader -> ((DecoderHttpMessageReader<?>) reader).getDecoder())
                    .filter(decoder -> decoder instanceof Jackson2JsonDecoder)
                    .map(decoder -> ((Jackson2JsonDecoder) decoder).getObjectMapper())
                    .findFirst()
                    .orElseGet(() -> Jackson2ObjectMapperBuilder.json().build());
            Jackson2JsonDecoder decoder = new Jackson2JsonDecoder(objectMapper, MediaType.TEXT_PLAIN);
            configurer.customCodecs().registerWithDefaultConfig(decoder);
        })
        .build();
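With the decoder registered, the original call from the question should then deserialize the text/plain body. A hedged usage sketch, reusing the endpoint and MyDTOClass names from the question:
// The text/plain response is now decoded by Jackson into the DTO
Flux<MyDTOClass> result = webClient.get()
        .uri(endpoint)
        .retrieve()
        .bodyToFlux(MyDTOClass.class);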
Set the content type header on the WebClient request:
webClient.get()
        .uri(endpoint)
        .header(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE)
I'm trying to call an endpoint that accepts PUT requests and expects to be passed 3 different MultipartFile parameters. Let's call them A, B and C.
When I make a request to the same endpoint from Postman it works as intended. When I do it via the reactor-netty lib I get back Error 400 Bad Request:
"Required request part 'A' is not present"
HttpClient
    .create()
    // skipping baseUrl and headers
    .put()
    .uri(ENDPOINT_URI)
    .sendForm((req, form) -> form
        .multipart(true)
        .file("A", FILE_A, "application/json")
        .file("B", FILE_B, "application/json")
        .file("C", FILE_C, "application/json"))
    .response()
I could not find much info online to establish if this is the best way to achieve what I need. Can you please point me to where I'm going wrong or perhaps towards an alternative solution?
Thanks
After looking through the source of HttpClientForm (the class that declares .file) I found this:
default HttpClientForm file(String name, InputStream stream, @Nullable String contentType) {
    return file(name, "", stream, contentType);
}
as well as this:
default HttpClientForm file(String name, File file, @Nullable String contentType) {
    return file(name, file.getName(), file, contentType);
}
Somehow I thought that the first parameter, 'name', is the one that is matched with the @RequestParam value. By the looks of it, it's actually the second.
Also, if using an InputStream instead of a File, I had to call the file method with 4 parameters and pass the name explicitly as the second parameter, like so:
file(name, "A", stream, contentType)
I have the following spring mvc method that returns a file:
@RequestMapping(value = "/files/{fileName}", method = RequestMethod.GET)
public FileSystemResource getFiles(@PathVariable String fileName) {
    String path = "/home/marios/Desktop/";
    return new FileSystemResource(path + fileName);
}
I expect a ResourceHttpMessageConverter to create the appropriate response with an octet-stream type according to its documentation:
If JAF is not available, application/octet-stream is used.
However, although I correctly get the file without a problem, the result has Content-Type: application/json;charset=UTF-8
Can you tell me why this happens?
(I use Spring version 4.1.4. I have not explicitly set any message converters, and I know that Spring loads by default, among others, the ResourceHttpMessageConverter and also the MappingJackson2HttpMessageConverter, because I have Jackson 2 on my classpath due to other MVC methods that return JSON.
Also, if I use HttpEntity<FileSystemResource> and set the content type manually, or specify it with produces = MediaType.APPLICATION_OCTET_STREAM, it works fine.
Note also that in my request I do not specify any accepted content types, and I prefer not to rely on my clients to do that.)
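For reference, a hedged sketch of the HttpEntity workaround mentioned above (same handler as in the question, with the content type set manually on the response):
@RequestMapping(value = "/files/{fileName}", method = RequestMethod.GET)
public HttpEntity<FileSystemResource> getFiles(@PathVariable String fileName) {
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_OCTET_STREAM);
    return new HttpEntity<>(new FileSystemResource("/home/marios/Desktop/" + fileName), headers);
}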
I ended up debugging the whole thing, and I found that AbstractJackson2HttpMessageConverter has a canWrite implementation that returns true for FileSystemResource: it only checks whether the class is serializable, and the requested media type, which is null since I do not specify any, is in that case treated as supported.
As a result, it ends up adding the JSON content types to the list of producible media types. Of course ResourceHttpMessageConverter.canWrite also naturally returns true, but the ResourceHttpMessageConverter does not contribute any producible media types.
When the time comes to write the actual response, the ResourceHttpMessageConverter runs first because it is first in the list of available converters (if MappingJackson2HttpMessageConverter were first, it would try to write, since its canWrite returns true, and throw an exception). And since a producible content type was already set, it does not fall back to ResourceHttpMessageConverter.getDefaultContentType, which would have set the correct content type.
If I removed the JSON converter everything would work fine, but then none of my JSON methods would work. Therefore specifying the content type explicitly is the only way to get rid of the returned JSON content type.
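A hedged sketch of that fix, applied to the mapping from the question (declaring the producible media type so the JSON converter is never considered):
@RequestMapping(value = "/files/{fileName}", method = RequestMethod.GET,
        produces = MediaType.APPLICATION_OCTET_STREAM_VALUE)
public FileSystemResource getFiles(@PathVariable String fileName) {
    String path = "/home/marios/Desktop/";
    return new FileSystemResource(path + fileName);
}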
For anyone still looking for a piece of code:
You should wrap your FileSystemResource into a ResponseEntity<>
Then determine your image's content type and append it to ResponseEntity as a header.
Here is an example:
@GetMapping("/image")
public @ResponseBody ResponseEntity<FileSystemResource> getImage() throws IOException {
    File file = /* load your image file from anywhere */;
    if (!file.exists()) {
        //TODO: throw 404
    }
    FileSystemResource resource = new FileSystemResource(file);
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(/* determine your image's media type or just set it as a constant using MediaType.<value> */);
    headers.setContentLength(resource.contentLength());
    return new ResponseEntity<>(resource, headers, HttpStatus.OK);
}
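One hedged way to fill in the media-type placeholder above (not from the original answer) is to probe it from the file, falling back to octet-stream when it cannot be determined:
// Assumes java.nio.file.Files is available; probeContentType may return null.
private MediaType resolveMediaType(File file) throws IOException {
    String probed = java.nio.file.Files.probeContentType(file.toPath());
    return probed != null ? MediaType.parseMediaType(probed) : MediaType.APPLICATION_OCTET_STREAM;
}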
The JAX-RS web service I'm calling is returning XML content with a text/html content type. On my side, I need to read the XML and convert it to a Java object.
The problem is: the response XML isn't formatted right and has newline characters in the wrong places; for example, there are several newline characters before <?xml version="1.0" encoding="UTF-8"?>. This is causing problems when trying to unmarshal it.
Is there a way I can unmarshal the response XML string even though it has formatting problems?
Thanks in advance.
HttpGet httpGet = new HttpGet(uri);
HttpResponse response = client.execute(httpGet);
InputStream inputStream = response.getEntity().getContent();
JAXBContext context = JAXBContext.newInstance(MyClass.class);
MyClass myObj = (MyClass) context.createUnmarshaller().unmarshal(inputStream);
How about trimming the content before unmarshalling:
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang.CharEncoding;

MyClass myObj = MyClass.class.cast(
        context.createUnmarshaller().unmarshal(
                IOUtils.toInputStream(
                        IOUtils.toString(
                                InputStream.class.cast(response.getEntity().getContent()),
                                CharEncoding.UTF_8
                        ).trim(),
                        CharEncoding.UTF_8
                )
        )
);
I finally got the provider to send the response as application/xml. However, as a workaround, an option I found was to save the content to a file and then read the content back from the file.
I want to make a request to the Google API and pass the resulting XML to a SAX parser. Here are both pieces of code...
First the request:
HttpClient hclient = new DefaultHttpClient();
HttpGet get = new HttpGet("http://www.google.com/ig/api?weather=Cardiff");
HttpResponse hrep = hclient.execute(get);
HttpEntity httpEntity = hrep.getEntity();
Then the parser:
SAXParserFactory saxpf = SAXParserFactory.newInstance();
SAXParser saxp = saxpf.newSAXParser();
XMLReader xr = saxp.getXMLReader();
ExHandler myHandler = new ExHandler();
xr.setContentHandler(myHandler);
xr.parse();
Is this the right way to do this, and how do I connect the two pieces of code?
Thanks in advance
The SAXParser object can take in an input stream and the handler. So something like:
SAXParserFactory factory = SAXParserFactory.newInstance();
SAXParser saxParser = factory.newSAXParser();
XMLParser parser = new XMLParser();
saxParser.parse(httpEntity.getContent(), parser);
The getContent() method returns an InputStream from the HTTP response entity, and the XMLParser object is just a class I created (supposedly) that contains the definition of how to parse the XML.
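For illustration, a hedged sketch of what such a handler class might look like; the XMLParser name comes from the answer above, while the element and attribute names are placeholders rather than the actual feed structure:
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class XMLParser extends DefaultHandler {
    @Override
    public void startElement(String uri, String localName, String qName, Attributes attributes) {
        // React to each opening tag; "condition" and "data" are placeholder names
        if ("condition".equals(qName)) {
            System.out.println("condition data = " + attributes.getValue("data"));
        }
    }
}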
EDIT:
You really should read the entire API for SAXParser, it has several overloaded methods:
void parse(InputSource is, DefaultHandler dh)
    Parse the content given InputSource as XML using the specified DefaultHandler.
void parse(InputSource is, HandlerBase hb)
    Parse the content given InputSource as XML using the specified HandlerBase.
void parse(InputStream is, DefaultHandler dh)
    Parse the content of the given InputStream instance as XML using the specified DefaultHandler.
void parse(InputStream is, DefaultHandler dh, String systemId)
    Parse the content of the given InputStream instance as XML using the specified DefaultHandler.
void parse(InputStream is, HandlerBase hb)
    Parse the content of the given InputStream instance as XML using the specified HandlerBase.
void parse(InputStream is, HandlerBase hb, String systemId)
    Parse the content of the given InputStream instance as XML using the specified HandlerBase.
Some of the methods take an InputSource, some take an InputStream, as I stated earlier.
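For example, a hedged sketch of the InputSource variant, which also lets you set the character encoding explicitly (reusing httpEntity, saxParser and parser from the snippets above):
// Wrap the response stream in an org.xml.sax.InputSource before parsing
InputSource source = new InputSource(httpEntity.getContent());
source.setEncoding("UTF-8");
saxParser.parse(source, parser);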