Video is not fetching completely - Spring

I want to fetch a video from MongoDB and play it in full, but only a few seconds play. I set maxUploadSize to 20 MB and maxInMemorySize to 20 MB as well, yet on the JSP page only about 1 MB of the video is fetched, even when the video is larger than that.
I don't know what to do; the video should play in full, according to its size.
Here is the controller:
@RequestMapping(value = "/welcome-video-controller/{videoObj}", produces = "video/webm")
@ResponseBody
public ResponseEntity<byte[]> getVideoForLoginPage(@PathVariable String videoObj, HttpServletResponse response)
        throws IOException {
    LOG.info("Entry :: getVideoForLoginPage");
    LOG.info("videoObj-->" + videoObj);
    GridFSDBFile videoFile = MongoUtility.getVideoFileFromMongo(videoObj);
    File videoFromMongo = new File(VIDEO_FROM_PATH + videoObj);
    videoFile.writeTo(videoFromMongo); // copy to disk (not used for the response)
    byte[] videoArray = new byte[(int) videoFile.getLength()];
    videoFile.getInputStream().read(videoArray);
    LOG.info("videoArray size-->" + videoArray.length);
    HttpHeaders headers = new HttpHeaders();
    headers.setContentLength(videoArray.length);
    return new ResponseEntity<byte[]>(videoArray, headers, HttpStatus.OK);
}
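A separate pitfall worth noting in the controller above: a single InputStream.read(byte[]) call is not guaranteed to fill the array; it may return after reading only part of the stream. A defensive read loop, sketched in plain Java with a ByteArrayInputStream standing in for the GridFS stream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFully {
    // Keep reading until the buffer is full or the stream ends.
    static int readFully(InputStream in, byte[] buffer) throws IOException {
        int offset = 0;
        while (offset < buffer.length) {
            int read = in.read(buffer, offset, buffer.length - offset);
            if (read == -1) {
                break; // end of stream
            }
            offset += read;
        }
        return offset; // total bytes actually read
    }

    public static void main(String[] args) throws IOException {
        byte[] source = new byte[1_000_000];
        byte[] target = new byte[source.length];
        int total = readFully(new ByteArrayInputStream(source), target);
        System.out.println(total); // 1000000
    }
}
```

The same loop works for any InputStream, including the one returned by GridFSDBFile.getInputStream().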
And this is the JSP markup:
<c:set value="${videoPostDetail.videoNames}" var="videoObj" />
<c:if test="${videoObj ne ''}">
    <video width="96%" height="220" controls id="sideVideo">
        <source src='/SocialNetworkingApp/welcome-video-controller/${videoObj}.do' type='video/webm'>
    </video>
</c:if>
It should play the complete video. I have tried a lot, but nothing works; please tell me what the problem is.

I found the solution. The problem was the chunk size: I had not set a chunk size for the file, and the default chunkSize was less than 1 MB.
I added this line to the code: gfsFile.setChunkSize(uploadVideoFile.length()); and now it works fine.
public static void saveVideoIntoMongo(File uploadVideoFile, String videoFilePath, String newVideoFileName)
        throws IOException {
    LOG.info("Entry :: saveVideoIntoMongo");
    LOG.info("videoFilePath-->" + videoFilePath);
    LOG.info("newVideoFileName-->" + newVideoFileName);
    LOG.info("uploadVideoFile-->" + uploadVideoFile);
    DB db = getMongoDBInstance("videoDb"); // later on, take this from a properties file instead of hardcoding
    GridFS gfsPhoto = getGridFSForFiles(db, "video");
    if (!("").equals(newVideoFileName)) {
        GridFSInputFile gfsFile = gfsPhoto.createFile(uploadVideoFile);
        gfsFile.setChunkSize(uploadVideoFile.length()); // setting chunkSize
        gfsFile.setFilename(newVideoFileName);
        gfsFile.save();
    }
    LOG.info("Exit :: saveVideoIntoMongo");
}
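To see why only about 1 MB came back: with the default chunk size (commonly 255 kB in recent Java drivers; older versions used 256 kB, and the exact value depends on the driver), a 20 MB file is split into dozens of chunks, so a naive single read returns far less than the whole file. A rough arithmetic sketch, assuming a 255 kB default:

```java
public class ChunkMath {
    public static void main(String[] args) {
        long fileLength = 20L * 1024 * 1024;   // a 20 MB upload
        long defaultChunk = 255L * 1024;       // assumed default GridFS chunk size
        // ceiling division: number of chunks the file is split into
        long chunks = (fileLength + defaultChunk - 1) / defaultChunk;
        System.out.println(chunks);
    }
}
```

Setting the chunk size to the full file length collapses that into a single chunk, which is why the workaround above changes the behavior.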

Related

Upload large files to Azure Media Services in Xamarin

I am trying to upload a .mp4 file, selected from the user's iOS or Android device, to my Azure Media Services account.
This code works for small files (less than ~95 MB):
public static async Task<string> UploadBlob(string blobContainerSasUri, string blobName, byte[] blobContent, string path)
{
    string responseString = null;
    int contentLength = blobContent.Length;
    string queryString = (new Uri(blobContainerSasUri)).Query;
    string blobContainerUri = blobContainerSasUri.Split('?')[0];
    string requestUri = string.Format(System.Globalization.CultureInfo.InvariantCulture, "{0}/{1}{2}", blobContainerUri, blobName, queryString);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
    request.Method = "PUT";
    request.AllowWriteStreamBuffering = false;
    request.Headers.Add("x-ms-blob-type", "BlockBlob");
    request.ContentLength = contentLength;
    request.Timeout = Int32.MaxValue;
    request.KeepAlive = true;
    int bufferLength = 1048576; // upload 1 MB at a time; useful for a simple progress bar
    Stream requestStream = request.GetRequestStream();
    requestStream.WriteTimeout = Int32.MaxValue;
    ProgressViewModel progressViewModel = App.Locator.GetProgressBar(App.Locator.MainViewModel.currentModuleItemId);
    MyVideosPage myVideosPage = App.Locator.GetVideosPage(App.Locator.MainViewModel.currentModuleItemId);
    FileStream fileStream = new FileStream(path, FileMode.Open, FileAccess.Read);
    int nRead = 0;
    int currentPos = 0;
    while ((nRead = fileStream.Read(blobContent, currentPos, bufferLength)) > 0)
    {
        await requestStream.WriteAsync(blobContent, currentPos, nRead);
        currentPos += nRead;
    }
    fileStream.Close();
    requestStream.Close();
    HttpWebResponse objHttpWebResponse = null;
    try
    {
        // this is where it fails for large files
        objHttpWebResponse = (HttpWebResponse)request.GetResponse();
        Stream responseStream = objHttpWebResponse.GetResponseStream();
        StreamReader stream = new StreamReader(responseStream);
        responseString = stream.ReadToEnd();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
    finally
    {
        if (objHttpWebResponse != null)
            objHttpWebResponse.Close();
    }
    return responseString;
}
An exception is thrown after this line is called:
(HttpWebResponse)request.GetResponse();
The exception message is "The request body is too large and exceeds the maximum permissible limit."
The exception StatusCode is "RequestEntityTooLarge".
How can I upload large files? Is this a problem with HttpWebRequest, or Azure Media Services?
Azure Storage supports single-shot upload (the Put Blob API) of up to 256 MB if you use the newer REST API versions. Since you are not specifying a REST API version, you are defaulting to a very old version where the maximum supported size for a single-shot upload is 100 MB.
Use the x-ms-version: 2018-03-28 header to be able to upload up to 256 MB in one HTTP request.
If you have to deal with larger files, you need block-and-commit upload: use the Put Block API to stage blocks from the source file (blocks can be up to 100 MB each), then commit all the blocks with the Put Block List API. If you would rather not implement this logic yourself, simply use the Azure Storage SDK for .NET (which supports Xamarin) and its UploadFromFile method. It is simple and resilient.
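The block-and-commit bookkeeping itself is small enough to sketch. Here it is in Java (the other language on this page) purely for illustration, with a hypothetical block-naming scheme; the one real constraint from the REST API is that block IDs must be base64-encoded and all the same length before encoding:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;

public class BlockPlan {
    // Sketch: how many Put Block calls a file needs, and what the block IDs
    // could look like. "block-%06d" is an arbitrary choice; any scheme works
    // as long as every ID has the same length.
    static List<String> blockIds(long fileSize, long blockSize) {
        int count = (int) ((fileSize + blockSize - 1) / blockSize);
        List<String> ids = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            String raw = String.format("block-%06d", i);
            ids.add(Base64.getEncoder().encodeToString(raw.getBytes(StandardCharsets.UTF_8)));
        }
        return ids;
    }

    public static void main(String[] args) {
        // a 250 MB file with 100 MB blocks needs 3 Put Block calls
        System.out.println(blockIds(250L * 1024 * 1024, 100L * 1024 * 1024).size());
    }
}
```

After staging every block, the ordered ID list is what gets sent to Put Block List to assemble the blob.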

Convert HTTPResponse to image

For one implementation, my Java code calls the Labelary REST service to convert ZPL-formatted code to an image.
I can fetch the response successfully, but I am not able to convert the HttpResponse to an image file.
HttpClient client = new DefaultHttpClient();
HttpPost post = new HttpPost("http://api.labelary.com/v1/printers/8dpmm/labels/4x6/0/");
byte[] byteArray = Base64.decodeBase64(base64Val.getBytes());
String decodedString = new String(byteArray);
StringEntity requestEntity;
try {
    requestEntity = new StringEntity(decodedString);
    requestEntity.setContentType("application/x-www-form-urlencoded");
    post.setEntity(requestEntity);
    HttpResponse response = client.execute(post);
    BufferedReader bufferReader = new BufferedReader(new InputStreamReader(response.getEntity().getContent()));
    // Need suggestions to convert the BufferedReader to an image
} catch (IOException e) {
    e.printStackTrace();
}
After following the suggested answer, the code looks like this:
HttpResponse response = client.execute(post);
InputStream inStream = response.getEntity().getContent();
String dataString = convertStreamToString(inStream);
byte[] imageBytes = javax.xml.bind.DatatypeConverter.parseBase64Binary(dataString);
BufferedImage image = ImageIO.read(new ByteArrayInputStream(imageBytes));
File outputfile = new File("myImage.png");
ImageIO.write(image, "png", outputfile);
Use this byte-to-image converter: paste the value into dataString and check whether you get the image.
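The decode-and-write step from the updated snippet can be exercised on its own. A self-contained sketch, assuming java.util.Base64 in place of DatatypeConverter, with a generated 1x1 PNG standing in for the service response (the real call needs the network):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Base64;

public class Base64ToImage {
    public static void main(String[] args) throws IOException {
        // Stand-in for the service response: a 1x1 PNG, base64-encoded.
        BufferedImage source = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream pngBytes = new ByteArrayOutputStream();
        ImageIO.write(source, "png", pngBytes);
        String dataString = Base64.getEncoder().encodeToString(pngBytes.toByteArray());

        // The decode step from the question, then parse the bytes as an image.
        byte[] imageBytes = Base64.getDecoder().decode(dataString);
        BufferedImage image = ImageIO.read(new ByteArrayInputStream(imageBytes));
        System.out.println(image.getWidth() + "x" + image.getHeight());
    }
}
```

Note that this only works if the response body really is base64 text; if the service returns raw image bytes, feed them to ImageIO.read directly and skip the decode.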

ImageProcessorCore: Attempt to resample image results in zero-length response

I am trying to resample a JPG image from 300dpi to 150dpi and am getting back a zero-length file.
Controller's ActionResult:
public ActionResult ViewImage(string file, int dpi = 300, bool log = true)
{
    FileExtensions fileExtensions = new FileExtensions();
    ImageExtensions imageExtensions = new ImageExtensions();
    FileModel fileModel = fileExtensions.GetFileModel(file);
    string contentType = fileModel.FileType;
    byte[] fileData = fileModel.FileData;
    string fileName = Path.GetFileNameWithoutExtension(fileModel.FileName) + "_" + dpi + "DPI" + Path.GetExtension(fileModel.FileName);
    FileStreamResult resampledImage = imageExtensions.ResampleImage(fileData, contentType, dpi);
    resampledImage.FileDownloadName = fileName;
    return resampledImage;
}
ResampleImage method:
public FileStreamResult ResampleImage(byte[] fileData, string contentType, int targetDPI)
{
    MemoryStream outputStream = new MemoryStream();
    using (Stream sourceStream = new MemoryStream(fileData))
    {
        Image image = new Image(sourceStream);
        image.HorizontalResolution = targetDPI;
        image.VerticalResolution = targetDPI;
        JpegEncoder jpegEncoder = new JpegEncoder();
        jpegEncoder.Quality = 100;
        image.Save(outputStream, jpegEncoder);
    }
    FileStreamResult file = new FileStreamResult(outputStream, contentType);
    return file;
}
I thought I'd best answer here, since we've already dealt with this on the issue tracker.
ImageProcessorCore is, at present (2016-08-03), alpha software and as such unfinished. When you were having the issue, horizontal and vertical resolution was not settable for JPEG images; that has since been fixed.
Incidentally, there are overloads that allow saving as JPEG without creating your own JpegEncoder instance:
image.SaveAsJpeg(outputStream);

java.awt.image.RasterFormatException: Data array too small (should be > 388799 )

I am trying to do some image manipulation using the IConverter class, which is included in the Xuggle library, to convert images from IVideoPicture to BufferedImage, but I am encountering the error in the title.
Here is my code:
BufferedImage orgnlimage = new BufferedImage(Picture.getWidth(), Picture.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
IConverter converter = ConverterFactory.createConverter(orgnlimage, IPixelFormat.Type.BGR24);
orgnlimage = converter.toImage(Picture); // Exception on this line
The image in question is 360x360.
This is the exception I'm getting:
Exception in thread "main" java.awt.image.RasterFormatException: Data array too small (should be > 388799 )
at sun.awt.image.ByteComponentRaster.verify(ByteComponentRaster.java:947)
at sun.awt.image.ByteComponentRaster.<init>(ByteComponentRaster.java:201)
at sun.awt.image.ByteInterleavedRaster.<init>(ByteInterleavedRaster.java:191)
at sun.awt.image.ByteInterleavedRaster.<init>(ByteInterleavedRaster.java:113)
at java.awt.image.Raster.createWritableRaster(Raster.java:980)
at com.xuggle.xuggler.video.BgrConverter.toImage(BgrConverter.java:195)
at xuggler.Encrypt.main(Encrypt.java:53)
at xuggler.DecodeAndSaveAudioVideo.main(DecodeAndSaveAudioVideo.java:141)
My second attempt:
public IVideoPicture main(IVideoPicture Picture) throws NoSuchPaddingException, IllegalBlockSizeException, BadPaddingException, IOException
{
    int width = Picture.getWidth();
    int height = Picture.getHeight();
    long timestamp = Picture.getTimeStamp();
    BufferedImage orgnlimage = videoPictureToImage(Picture);
    byte[] orgnlimagebytes = toByte(orgnlimage);
    byte[] encryptedbytes = encrypt(orgnlimagebytes, "abc");
    //System.out.println(encryptedbytes.length);
    BufferedImage encryptedimage = toImage(encryptedbytes, width, height);
    String desc = ConverterFactory.findDescriptor(encryptedimage);
    IConverter converter = ConverterFactory.createConverter(desc, Picture);
    IVideoPicture Pic = converter.toPicture(encryptedimage, timestamp);
    return Pic;
}
And the stack trace:
Exception in thread "main" java.nio.BufferOverflowException
at java.nio.DirectByteBuffer.put(DirectByteBuffer.java:363)
at java.nio.ByteBuffer.put(ByteBuffer.java:859)
at com.xuggle.xuggler.video.BgrConverter.toPicture(BgrConverter.java:132)
at xuggler.Encrypt.main(Encrypt.java:62)
at xuggler.DecodeAndSaveAudioVideo.main(DecodeAndSaveAudioVideo.java:141)
The problem is that, for some reason, Xuggler's BgrConverter.toImage() method tries to create a raster around a byte array of 388799 bytes, which is one byte short: it should be 388800 bytes (360 * 360 * 3) for your image in BGR format.
I'd say file a bug report.
Or try Humble Video instead, which seems to be a successor of sorts to Xuggler.
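To see where the expected size comes from: a BGR24 frame needs width * height * 3 bytes, and building a raster over a correctly sized array succeeds. A minimal sketch in plain Java, no Xuggler involved:

```java
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.awt.image.WritableRaster;

public class BgrRasterSize {
    public static void main(String[] args) {
        int width = 360, height = 360;
        int expected = width * height * 3; // 3 bytes per pixel for BGR24
        byte[] data = new byte[expected];

        // An interleaved BGR raster over the array; band offsets {2, 1, 0}
        // map the B, G, R bytes. This throws RasterFormatException if the
        // array is even one byte too small.
        DataBufferByte buffer = new DataBufferByte(data, data.length);
        WritableRaster raster = Raster.createInterleavedRaster(
                buffer, width, height, width * 3, 3, new int[] {2, 1, 0}, null);

        System.out.println(expected);
        System.out.println(raster.getWidth());
    }
}
```

Shrinking the array to 388799 bytes reproduces the exact "Data array too small" exception from the question.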

Display static Google Map image in BlackBerry 5.0

I'm having a really interesting problem to solve:
I'm getting a static Google Maps image, with a URL like this.
I've tried several methods to get this information:
Fetching the "remote resource" as a ByteArrayOutputStream, storing the image on the simulator's SD card, and so on, but every time I get an IllegalArgumentException.
I always get a 200 HTTP response and the correct MIME type ("image/png"), but either way (fetching the image and converting it to a Bitmap, or storing the image on the SD card and reading it later) I get the same result: the file is always corrupt.
I really believe it's an encoding problem, or the reading method (similar to this one):
public static Bitmap downloadImage(InputStream inStream){
    byte[] buffer = new byte[256];
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    while (inStream.read(buffer) != -1){
        baos.write(buffer);
    }
    baos.flush();
    baos.close();
    byte[] imageData = baos.toByteArray();
    Bitmap bi = Bitmap.createBitmapFromBytes(imageData, 0, imageData.length, 1);
    //Bitmap bi = Bitmap.createBitmapFromBytes(imageData, 0, -1, 1);
    return bi;
}
The only thing that comes to mind is imageData.length (in the response, the content length is 6005), but I really can't figure this one out.
Any help is more than welcome...
Try this:
InputStream input = httpConn.openInputStream();
byte[] xmlBytes = new byte[256];
int len = 0;
int size = 0;
StringBuffer raw = new StringBuffer();
while (-1 != (len = input.read(xmlBytes)))
{
    raw.append(new String(xmlBytes, 0, len));
    size += len;
}
value = raw.toString();
byte[] dataArray = value.getBytes();
EncodedImage bitmap;
bitmap = EncodedImage.createEncodedImage(dataArray, 0, dataArray.length);
final Bitmap googleImage = bitmap.getBitmap();
Swati's answer is good. The same thing can be accomplished with far fewer lines of code:
InputStream input = httpConn.openInputStream();
byte[] dataArray = net.rim.device.api.io.IOUtilities.streamToBytes(input);
Bitmap googleImage = Bitmap.createBitmapFromBytes(dataArray, 0, -1, 1);
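For what it's worth, the corruption in the original downloadImage comes from baos.write(buffer) writing the entire 256-byte buffer even when read() returned fewer bytes, padding the image data with garbage. A length-aware loop avoids that; sketched in plain Java SE (the BlackBerry APIs are not needed to show the difference):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class LenAwareCopy {
    static byte[] copy(InputStream in) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        byte[] buffer = new byte[256];
        int len;
        while ((len = in.read(buffer)) != -1) {
            baos.write(buffer, 0, len); // write only the bytes actually read
        }
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] source = new byte[6005]; // same length as the PNG in the question
        byte[] copied = copy(new ByteArrayInputStream(source));
        // A full-buffer write loop would have produced 6144 bytes (24 * 256);
        // the length-aware loop preserves the exact content length.
        System.out.println(copied.length);
    }
}
```

Note this copies bytes directly and never round-trips them through a String, which can also corrupt binary data depending on the platform encoding.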