The scenario: I need to consume a REST web service that provides a huge file as a streamed output, and on the consumer side I need to handle that stream and write it directly to a file rather than into memory.
Service:
#RequestMapping(value = "downloadFile", method = RequestMethod.GET)
public StreamingResponseBody getSteamingFile(HttpServletResponse response) throws IOException {
response.setContentType("application/octet-stream");
InputStream inputStream = new FileInputStream(new File("data\\test_big.txt"));
return outputStream -> {
int nRead;
byte[] data = new byte[1024];
System.out.println("Writing some bytes..");
while ((nRead = inputStream.read(data, 0, data.length)) != -1) {
outputStream.write(data, 0, nRead);
}
System.out.println("Completed #####");
outputStream.flush();
outputStream.close();
response.flushBuffer();
};
}
Consumer Route:
.to("http4://localhost:8080/downloadFile")
.process(new Processor() {
public void process(Exchange exchange) throws Exception {
InputStream is = exchange.getIn().getBody(InputStream.class);
File ret = File.createTempFile("loadTest", "tmp");
FileOutputStream fos = new FileOutputStream(ret);
StreamUtils.copy(is, fos);
System.out.println("File Name "+ ret.getName());
is.close();
fos.flush();
fos.close();
}
});
A JVM with 256 MB of heap goes out of memory when processing a 300 MB file, which suggests my route is not actually streaming to the file.
.to("http4://localhost:8080/downloadFile?disableStreamCache=true")
disableStreamCache:
Determines whether or not the raw input stream from the Servlet is cached (Camel will read the stream into an in-memory/overflow-to-file stream cache). By default Camel will cache the Servlet input stream to support reading it multiple times, to ensure Camel can retrieve all data from the stream. However, you can set this option to true when you, for example, need to access the raw stream, such as streaming it directly to a file or other persistent store.
In other words, you have to disable stream caching on the endpoint; see the Camel HTTP component documentation on stream caching for details.
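As a rough sketch (not from the original answer), the consumer route with stream caching disabled and the body handed straight to Camel's file component could look like this; the trigger endpoint and target directory are illustrative:
import org.apache.camel.builder.RouteBuilder;

public class DownloadRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("timer:download?repeatCount=1") // hypothetical one-shot trigger
            // Pass the raw HTTP input stream through instead of caching it.
            .to("http4://localhost:8080/downloadFile?disableStreamCache=true")
            // The file component streams the InputStream body to disk in chunks,
            // so the 300 MB payload never has to fit in memory.
            .to("file:data/downloads?fileName=loadTest.tmp");
    }
}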
Related
I want to receive an HTTP stream in Spring Boot, but the InputStream of HttpServletRequest does not seem to be an endless HTTP stream and only contains the content of the first HTTP body.
I want to process a chunked HTTP stream in Spring Boot onto which some value string is pushed from time to time.
Currently I have tried something like this in a controller:
@Override
public void test(HttpServletRequest request, HttpServletResponse response) {
    System.out.println("StreamStart");
    try {
        byte[] buffer = new byte[1024];
        while (true) {
            int len = request.getInputStream().read(buffer);
            if (len != -1) {
                System.out.println("Len: " + len);
                System.out.println(new String(buffer, 0, len));
            }
            Thread.sleep(500);
        }
    } catch (Exception x) {
        x.printStackTrace();
    }
    System.out.println("StreamEnd");
}
However, only the first request body after the header arrives; the second one never appears in my controller.
Does Spring Boot cancel the connection or the stream?
Can I get access to the complete HTTP input stream so I can read my values from it?
Maybe a multipart request would be useful for you? That way you can receive multiple parts of data:
https://www.w3.org/Protocols/rfc1341/7_2_Multipart.html
Example:
#PostMapping("/upload")
public void uploadStream(#RequestParam MultipartFile[] multipartFiles){
for(MultipartFile multipartFile:multipartFiles){
try {
InputStream inputStream = multipartFile.getInputStream();
} catch (IOException e) {
e.printStackTrace();
}
}
}
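Building on that example, a minimal sketch (the file handling here is my own addition, not part of the original answer) that streams each part straight to a temporary file instead of holding it in memory:
@PostMapping("/upload")
public void uploadToDisk(@RequestParam MultipartFile[] multipartFiles) throws IOException {
    for (MultipartFile multipartFile : multipartFiles) {
        // Copy each part to its own temp file in chunks using org.springframework.util.StreamUtils.
        File target = File.createTempFile("upload-", ".part");
        try (InputStream in = multipartFile.getInputStream();
             OutputStream out = new FileOutputStream(target)) {
            StreamUtils.copy(in, out);
        }
    }
}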
I'm getting gzipped content from the client and I need to decompress it before it reaches the controller, otherwise I get a Jackson parsing exception.
I created a WebFilter that wraps the request and maps the body into a decompressed byte array like this:
@Override
public Flux<DataBuffer> getBody() {
    return request.getBody().map(requestDataBuffer -> {
        try {
            GZIPInputStream gzipInputStream = new GZIPInputStream(requestDataBuffer.asInputStream());
            StringWriter writer = new StringWriter();
            IOUtils.copy(gzipInputStream, writer, UTF_8);
            byte[] targetArray = writer.toString().getBytes();
            return new DefaultDataBufferFactory().wrap(targetArray);
        } catch (IOException e) {
            LOG.error("failed to create gzip input stream. content-encoding is {}", request.getHeaders().getFirst(CONTENT_ENCODING));
            return requestDataBuffer;
        }
    });
}
However, when the request body is too large, the data buffer doesn't contain all the data, and therefore I get stream exceptions.
Any ideas how to configure the data buffer or how to accept gzipped content?
I think the best way is to rely on Netty's implementation for this and configure the server to use that support from Netty.
You can create a component (or return a new instance directly from a @Bean method) that customizes the Reactor Netty server:
@Component
public class RequestInflateCustomizer implements NettyServerCustomizer {

    @Override
    public HttpServer apply(HttpServer httpServer) {
        return httpServer.tcpConfiguration(
                tcp -> tcp.doOnConnection(conn -> conn.addHandlerFirst(new HttpContentDecompressor())));
    }
}
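For the @Bean variant mentioned above, a minimal sketch (the configuration class and method names are illustrative) could be:
import io.netty.handler.codec.http.HttpContentDecompressor;
import org.springframework.boot.web.embedded.netty.NettyServerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RequestInflateConfig {

    @Bean
    public NettyServerCustomizer requestInflateCustomizer() {
        // Add Netty's HttpContentDecompressor as the first handler on every connection
        // so gzipped request bodies are inflated before WebFlux sees them.
        return httpServer -> httpServer.tcpConfiguration(
                tcp -> tcp.doOnConnection(conn -> conn.addHandlerFirst(new HttpContentDecompressor())));
    }
}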
I am creating a POC for a RESTful web service using Spring 4.0. The requirement is to receive a MultipartFile as the response from the REST web service.
REST service controller:
#RequestMapping(value="/getcontent/file", method=RequestMapping.post)
public MultipartFile getMultipartAsFileAsObject() {
File file = new File("src/test/resources/input.docx");
FileInputStream input = new FileInputStream(file);
MultipartFile multipartFile = new MockMultipartFile("file",file.getName(),
"application/docx", IOUtils.toByteArray(input));
return multipartFile
}
I call this service using a third-party client as well as the Apache HTTP client. Kindly have a look at the output.
Using a third-party REST client, i.e. Postman,
the output looks like JSON:
{
    "name" : "file",
    "originalfilename" : "sample.docx",
    "contentType" : "application/docx",
    "content" : [
        82,
        101,
        97,
        100,
        101,
        32,
        32,
        ...
    ]
}
Apache HTTP client sample code:
private static void executeClient() {
    HttpClient client = new DefaultHttpClient();
    HttpPost postRequest = new HttpPost(SERVER_URI);
    try {
        // Set various attributes
        HttpResponse response = client.execute(postRequest);
        // Verify response if any
        if (response != null) {
            InputStream inputStream = response.getEntity().getContent();
            byte[] buffer = new byte[inputStream.available()];
            inputStream.read(buffer);
            OutputStream outputStream = new FileOutputStream(
                    new File("src/main/resource/sample.docx"));
            outputStream.write(buffer);
            outputStream.flush();
            outputStream.close();
        }
    } catch (Exception ex) {
        ex.printStackTrace();
    }
}
Output of the Apache HTTP client:
The file is getting created, but it is empty (0 bytes).
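As a side note (my own observation, not part of the original question or answer): inputStream.available() may legitimately return 0 for a network stream, which would explain the empty file. A sketch of a client copy loop that does not depend on it:
try (InputStream inputStream = response.getEntity().getContent();
     OutputStream outputStream = new FileOutputStream("src/main/resource/sample.docx")) {
    byte[] buffer = new byte[8192];
    int read;
    // Copy until EOF instead of sizing the buffer with available().
    while ((read = inputStream.read(buffer)) != -1) {
        outputStream.write(buffer, 0, read);
    }
}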
I found some interesting answers in several Stack Overflow questions. The links are given below:
file downloading in restful web services
what's the correct way to send a file from REST web service to client?
For sending a single file (copied from the above sources):
@GET
@Produces(MediaType.APPLICATION_OCTET_STREAM)
public Response getFile() {
    File file = ... // Initialize this to the File path you want to serve.
    return Response.ok(file, MediaType.APPLICATION_OCTET_STREAM)
            .header("Content-Disposition", "attachment; filename=\"" + file.getName() + "\"") // optional
            .build();
}
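Since the question itself uses Spring MVC rather than JAX-RS, a rough Spring equivalent (my own sketch, assuming Spring 4.1+ for the ResponseEntity builder) would serve the file as a Resource:
@RequestMapping(value = "/getcontent/file", method = RequestMethod.GET)
public ResponseEntity<Resource> downloadFile() {
    // org.springframework.core.io.FileSystemResource wraps the file on disk.
    FileSystemResource resource = new FileSystemResource("src/test/resources/input.docx");
    return ResponseEntity.ok()
            .contentType(MediaType.APPLICATION_OCTET_STREAM)
            // Optional: suggest a filename to the client.
            .header(HttpHeaders.CONTENT_DISPOSITION,
                    "attachment; filename=\"" + resource.getFilename() + "\"")
            .body(resource);
}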
For sending a zip file (copied from the above sources):
1) First approach:
You can use the above method to send any file or zip.
private static final String FILE_PATH = "d:\\Test2.zip";

@GET
@Path("/get")
@Produces(MediaType.APPLICATION_OCTET_STREAM)
public Response getFile() {
    File file = new File(FILE_PATH);
    ResponseBuilder response = Response.ok((Object) file);
    response.header("Content-Disposition", "attachment; filename=newfile.zip");
    return response.build();
}
2) Second approach:
@GET
@Path("/get")
@Produces(MediaType.APPLICATION_OCTET_STREAM)
public StreamingOutput helloWorldZip() throws Exception {
    return new StreamingOutput() {
        @Override
        public void write(OutputStream arg0) throws IOException, WebApplicationException {
            BufferedOutputStream bus = new BufferedOutputStream(arg0);
            try {
                File file = new File("d:\\Test1.zip");
                FileInputStream fizip = new FileInputStream(file);
                // Note: this reads the whole zip into memory before writing it out.
                byte[] buffer2 = IOUtils.toByteArray(fizip);
                bus.write(buffer2);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    };
}
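As a follow-up (my own sketch, not from the cited answers): because IOUtils.toByteArray loads the whole zip into memory, a fully streaming variant could copy the file in chunks instead, assuming commons-io is on the classpath; the path segment is illustrative:
@GET
@Path("/get-streamed")
@Produces(MediaType.APPLICATION_OCTET_STREAM)
public StreamingOutput streamZip() {
    return output -> {
        // Copy the zip to the response in fixed-size chunks so it is never fully buffered.
        try (InputStream in = new FileInputStream("d:\\Test1.zip")) {
            IOUtils.copy(in, output);
        }
    };
}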
I have a multipart/form-data request from the client side that uploads multiple files. Each file is large (more than 10 MB). I want to receive and process each file individually while the remaining files are still being received.
This is my request in the REST console:
URL : http://localhost:8080/multiUpload
Content-Type : multipart/form-data;boundary=myboundary
form-data : two attachments
This is my snippet of code to handle it:
#RequestMapping(value = "/multiUpload", method = RequestMethod.POST)
public #ResponseBody String handleMultiFileUpload(HttpServletRequest request)
throws IOException {
System.out.println("======Inside handleFileUpload======");
try {
MultipartStream multipartStream = new MultipartStream(
request.getInputStream(), "myboundary".getBytes(),
1024 * 1024, null);
boolean nextPart = multipartStream.skipPreamble();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
while (nextPart) {
System.out.println("inside while");
multipartStream.readBodyData(outputStream);
InputStream inputStream = new ByteArrayInputStream(
outputStream.toByteArray());
nextPart = multipartStream.readBoundary();
}
} catch (MultipartStream.MalformedStreamException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
return "You successfully uploaded !";
}
The code is not entering the while loop, i.e. skipPreamble is returning false. Am I doing anything wrong?
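One thing worth double-checking (my own note, not an answer from the original thread) is that the boundary bytes passed to MultipartStream exactly match what the client sent; skipPreamble returns false when the boundary is not found in the stream. A sketch that derives the boundary from the Content-Type header instead of hard-coding it:
// e.g. "multipart/form-data;boundary=myboundary"
String contentType = request.getContentType();
String boundary = contentType.substring(contentType.indexOf("boundary=") + "boundary=".length());
MultipartStream multipartStream = new MultipartStream(
        request.getInputStream(), boundary.getBytes(), 1024 * 1024, null);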
I'm trying to serve images from MongoDB GridFS. My controller:
#RequestMapping(value = "{id}", method = RequestMethod.GET)
public void getPhoto (#PathVariable String id, HttpServletResponse response, HttpServletRequest request) {
log.info("#getPhoto > ip of request: " + request.getRemoteAddr() + ", id: " + id);
final InputStream inputStream = resourceService.getMediaResourceById(id);
try {
IOUtils.copy(inputStream, response.getOutputStream());
response.flushBuffer();
} catch (IOException | NullPointerException e) {
log.error("#getPhoto > error with request for objectId: " + id, e);
response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
}
}
The result: (image omitted)
This only happens when using Spring Boot; as a test, when using plain Spring and running the exact same code, I get: (image omitted)
Writing directly to a response is discouraged in controller methods for various reasons. You are essentially responsible for almost everything yourself. The preferred way is to return something that gets converted as needed.
You already use ResponseEntity<byte[]> now, but your source is a stream, so you have to create an unnecessary byte array. You can use Resource instead, which wraps all sorts of input streams, whether from files or already opened streams.
InputStreamResource inputStream = new InputStreamResource(resourceService.getMediaResourceById(id));
return new ResponseEntity<>(inputStream, HttpStatus.OK);
or as of Spring 4.1
return ResponseEntity.ok(inputStream);
Please note that produces = MediaType.IMAGE_JPEG_VALUE doesn't actually set a content type. It's used for content negotiation.
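Putting these pieces together, a minimal sketch of the controller (my own consolidation of the above, with the content type set explicitly on the response) could look like:
@RequestMapping(value = "{id}", method = RequestMethod.GET)
public ResponseEntity<Resource> getPhoto(@PathVariable String id) {
    // Wrap the GridFS stream in a Resource so Spring handles the copying and headers.
    InputStreamResource body = new InputStreamResource(resourceService.getMediaResourceById(id));
    return ResponseEntity.ok()
            .contentType(MediaType.IMAGE_JPEG)
            .body(body);
}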