How can I get a frame sample (jpeg) from a video (mp4) using javaCV

I'm trying to create a Mendix Java Action which generates a thumbnail jpeg from a movie.
I'm using JavaCV 5.4, but I'm struggling to generate a Frame from the frameGrabber; for some reason it's always null.
this.InputFile = __InputFile == null ? null : system.proxies.FileDocument.initialize(getContext(), __InputFile);
this.TargetFileImage = __TargetFileImage == null ? null : system.proxies.Image.initialize(getContext(), __TargetFileImage);
// BEGIN USER CODE
ILogNode logger = Core.getLogger("GeneratePosterImage");
try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
    InputStream inputStream = Core.getFileDocumentContent(getContext(), __InputFile);
    FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(inputStream, 0);
    frameGrabber.start();
    logger.info("Frame Count: " + frameGrabber.getLengthInFrames());
    logger.info("Format: " + frameGrabber.getFormat() + " width: " + frameGrabber.getImageWidth() + " height: " + frameGrabber.getImageHeight());
    Java2DFrameConverter c = new Java2DFrameConverter();
    //Frame frame = frameGrabber.grabImage();
    //Frame frame = frameGrabber.grab();
    Frame frame = frameGrabber.grabKeyFrame();
    BufferedImage bufferedImage = c.convert(frame);
    logger.info("Height: " + bufferedImage.getHeight() + "Width: " + bufferedImage.getWidth());
    ImageIO.write(bufferedImage, "jpeg", outputStream);
    Core.storeFileDocumentContent(getContext(), __TargetFileImage, new ByteArrayInputStream(outputStream.toByteArray()));
    frameGrabber.stop();
} catch (IOException e) {
    logger.error(e.getMessage());
}
I'm very new to Java; can anyone advise where I'm going wrong?
The output from the first two log messages are:
Frame Count: 96,
Format: mov,mp4,m4a,3gp,3g2,mj2 width: 1312 height: 756
Thanks
Adrian

I'm posting this in case it helps anyone else.
My colleague has figured it out: I didn't need the second parameter to FFmpegFrameGrabber(); I should have used:
FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(inputStream);
Thanks
Adrian
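For anyone adapting this, here is a minimal sketch of the corrected flow, keeping the Mendix calls from the question; using grabImage() instead of grabKeyFrame() is my own assumption, since it returns the next decoded video frame and is usually simpler for a thumbnail:
InputStream inputStream = Core.getFileDocumentContent(getContext(), __InputFile);
try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
    FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(inputStream);
    frameGrabber.start();
    // grabImage() skips audio/data packets and returns the next decoded video frame
    Frame frame = frameGrabber.grabImage();
    if (frame != null) {
        BufferedImage bufferedImage = new Java2DFrameConverter().convert(frame);
        ImageIO.write(bufferedImage, "jpeg", outputStream);
        Core.storeFileDocumentContent(getContext(), __TargetFileImage,
                new ByteArrayInputStream(outputStream.toByteArray()));
    } else {
        logger.error("No video frame could be decoded from the input file.");
    }
    frameGrabber.stop();
} catch (IOException e) {
    logger.error(e.getMessage());
}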

Related

An exception of type 'System.Runtime.InteropServices.COMException' occurred in Microsoft.Phone.ni.dll but was not handled in user code

This code is in ScheduledTaskAgent.cs and of course runs in the background.
I always get this exception at the statement : stream.Seek(0, SeekOrigin.Begin);
The image is in my project files under the root. I tried "Do Not Copy" and "Always Copy", but the same error still occurs.
Deployment.Current.Dispatcher.BeginInvoke(() =>
{
    BitmapImage img = new BitmapImage(new Uri(@"\DefaultImage.jpg", UriKind.Relative));
    img.CreateOptions = BitmapCreateOptions.None;
    img.ImageOpened += (s, e) =>
    {
        WriteableBitmap wbitmap = new WriteableBitmap((BitmapImage)s);
        TextBlock textBlock = new TextBlock();
        textBlock.Text = "Sample Text";
        textBlock.TextWrapping = TextWrapping.Wrap;
        wbitmap.Render(textBlock, new TranslateTransform() { X = 25, Y = 10 });
        wbitmap.Invalidate();
        using (MemoryStream stream = new MemoryStream())
        {
            wbitmap.SaveJpeg(stream, wbitmap.PixelWidth, wbitmap.PixelHeight, 0, 100);
            stream.Seek(0, SeekOrigin.Begin);
            SaveToIsolatedStorage(stream);
            stream.Close();
        }
    };
});
I think that the SaveJpeg method closes the stream and you cannot seek in it afterwards. Try this:
using (IsolatedStorageFileStream stream = IsolatedStorageFile.GetUserStoreForApplication().CreateFile(fileName))
{
    wbitmap.SaveJpeg(stream, wbitmap.PixelWidth, wbitmap.PixelHeight, 0, 100);
}

javacv: grabber.getFrameRate() returns 0

I have used JavaCV in my project to work with AVI files.
The video plays faster than normal, so I want to get the fps to set the playback speed. But grabber.getFrameRate() returns 0, and so do grabber.getLengthInFrames() and grabber.getSampleRate(). Can anyone tell me why?
Code snippet below:
FrameGrabber grabber = new OpenCVFrameGrabber("sample.avi");
double fps = grabber.getFrameRate();
System.out.println(fps);
//int n = grabber.getLengthInFrames();
//int f = grabber.getSampleRate();
CvMemStorage storage = CvMemStorage.create();
grabber.start();
grabbedImage = grabber.grab();
while (frame.isVisible() && (grabbedImage = grabber.grab()) != null)
{
    BufferedImage bfimg = grabbedImage.getBufferedImage();
    frame.showImage(bfimg);
    frame.waitKey((int) (1000 / fps));
    cvClearMemStorage(storage);
}
grabber.stop();
You must call getFrameRate() after start(). Maybe you can do:
FrameGrabber grabber = new OpenCVFrameGrabber("sample.avi");
grabber.start();
double fps=grabber.getFrameRate();
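Building on that, here is a minimal sketch (assuming sample.avi opens correctly) that also guards against a frame rate of 0 before it is used to pace the display loop:
FrameGrabber grabber = new OpenCVFrameGrabber("sample.avi");
grabber.start();                    // metadata is only available after start()
double fps = grabber.getFrameRate();
if (fps <= 0) {
    fps = 25.0;                     // fall back to a default so 1000 / fps never divides by zero
}
System.out.println("fps = " + fps + ", frames = " + grabber.getLengthInFrames());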
I use this code to record mp4 video:
public static void main(String[] args) {
    IplImage image;
    CanvasFrame canvas = new CanvasFrame("Web Cam");
    try {
        OpenCVFrameGrabber grabber = new OpenCVFrameGrabber(0);
        grabber.start();
        IplImage grabbedImage = grabber.grab();
        canvas.setCanvasSize(grabbedImage.width(), grabbedImage.height());
        System.out.println("framerate = " + grabber.getFrameRate());
        grabber.setFrameRate(grabber.getFrameRate());
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("c:/demo.mp4", 320, 240);
        recorder.setVideoCodec(13);
        recorder.setFormat("mp4");
        recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
        recorder.setFrameRate(10);
        recorder.setVideoBitrate(5 * 1024);
        recorder.start();
        System.out.println("framerate = " + grabber.getFrameRate());
        while (canvas.isVisible() && (grabbedImage = grabber.grab()) != null) {
            canvas.showImage(grabbedImage);
            recorder.record(grabbedImage);
        }
        recorder.stop();
        grabber.stop();
        canvas.dispose();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I tried this in a webcam example and it worked:
double time1, time2;
// your loop
while (....) {
    time1 = System.currentTimeMillis(); // add this first in the loop
    // code here
    .
    .
    .
    .
    time2 = System.currentTimeMillis(); // add this at the end of the loop
    System.out.println("framerate = " + 1 / (((time2 - time1) / 1000) % 60));
} // end of loop
In the case of the webcam example, with the code found here:
http://opencvlover.blogspot.com/2012/05/accessing-webcam-using-javacv.html
it looks like this:
IplImage img;
while (...) {
    time1 = System.currentTimeMillis(); // add this first in the loop
    // insert the grabbed video frame into IplImage img
    img = grabber.grab();
    .
    .
    .
    .
    // show the video frame in the canvas
    canvas.showImage(img);
    time2 = System.currentTimeMillis(); // add this at the end of the loop
    System.out.println("framerate = " + 1 / (((time2 - time1) / 1000) % 60));
} // end of loop

Post Image from Blackberry WebWorks SDK Camera to PHP File Server

I am using BlackBerry WebWorks and I want to upload an image from the camera to a PHP web server. I get the image as a URI and it displays correctly, but I don't know how to upload the image to the server. Please help.
The file URI looks like:
file://dir/image name
I am using this code from the BlackBerry WebWorks camera API:
function takePicture() {
    try {
        blackberry.media.camera.takePicture(photoTaken, closedCB, errorCB);
    } catch (e) {
        //alert("Error in supported: " + e);
    }
}
function successCB(filePath) {
    var file = "file://" + filePath;
    $("#myfileCam").val(file);
    $("#imgPic").show();
    $("#imgPic").attr("src", file);
}
function photoTaken(filePath) {
    var img = new Image();
    img.src = "file://" + filePath;
    img.width = Math.round(screen.width / 2);
    document.getElementById("path").appendChild(img);
    var html = "<input type='file' name='myfile' id='myfile' value='file://" + filePath + "' />";
    document.getElementById("fileup").innerHTML = html;
    //$("#fileup").html();
    $("#myfileCam").val("file://" + filePath);
}
function closedCB() {
    // alert("Camera closed event");
}
function errorCB(e) {
    alert("Error occurred: " + e);
}
FileConnection fc = (FileConnection) Connector.open(filePath);
InputStream is = fc.openInputStream();
byte[] ReimgData = IOUtilities.streamToBytes(is);
EncodedImage encode_image = EncodedImage.createEncodedImage(ReimgData, 0, (int) fc.fileSize());
JPEGEncodedImage encoder = JPEGEncodedImage.encode(encode_image.getBitmap(), 50);
byte[] array = encoder.getData();
// Decodes the image represented by this EncodedImage and returns a Bitmap
int length = array.length;
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream(length);
Base64OutputStream base64OutputStream = new Base64OutputStream(byteArrayOutputStream);
try {
    base64OutputStream.write(array, 0, length);
    base64OutputStream.flush();
    base64OutputStream.close();
} catch (IOException ioe) {
    //Dialog.alert("Error in encodeBase64() : " + ioe.toString());
    System.out.println("Error in encodeBase64() : " + ioe.toString());
}
data = Base64InputStream.decode(byteArrayOutputStream.toString());
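If the goal of the snippet above is to push those bytes to the PHP script from the Java side, a rough sketch of an HTTP POST using javax.microedition.io.HttpConnection is shown below; the URL is hypothetical and not from the original post:
// Hypothetical endpoint, for illustration only; a transport suffix
// (e.g. ";deviceside=true") may be required on a real device.
HttpConnection conn = (HttpConnection) Connector.open("http://example.com/upload.php");
conn.setRequestMethod(HttpConnection.POST);
conn.setRequestProperty("Content-Type", "application/octet-stream");
OutputStream out = conn.openOutputStream();
out.write(array);   // 'array' holds the JPEG bytes produced above
out.flush();
out.close();
System.out.println("Upload response code: " + conn.getResponseCode());
conn.close();
The PHP side would then read the raw request body (for example via php://input) and write it to a file.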
Use PhoneGap for BlackBerry WebWorks. I had the same problem and I resolved it using this API.
http://docs.phonegap.com/en/1.5.0/phonegap_file_file.md.html#FileTransfer has everything you need.
The only thing you need to do is put the PhoneGap jar file in the ext folder and ask for permissions in config.xml by adding this feature (phonegap), and that's it.
I hope this works for you; it worked for me.

Image overlay with camera captured image in android

I need to take a picture with the camera and at the same time show an overlay image on top of the camera view. After the picture is taken, I need to save what the user saw while taking the picture. Can anyone advise me, please?
public void onPictureTaken(byte[] data, Camera camera) {
    Bitmap cameraBitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
    wid = cameraBitmap.getWidth();
    hgt = cameraBitmap.getHeight();
    Bitmap newImage = Bitmap.createBitmap(wid, hgt, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(newImage);
    canvas.drawBitmap(cameraBitmap, 0f, 0f, null);
    Drawable drawable = getResources().getDrawable(R.drawable.d);
    drawable.setBounds(20, 20, 260, 160);
    drawable.draw(canvas);
    File storagePath = new File(Environment.getExternalStorageDirectory() + "/Vampire Photos/");
    storagePath.mkdirs();
    File myImage = new File(storagePath, Long.toString(System.currentTimeMillis()) + ".jpg");
    try {
        FileOutputStream out = new FileOutputStream(myImage);
        newImage.compress(Bitmap.CompressFormat.JPEG, 90, out);
        out.flush();
        out.close();
    } catch (FileNotFoundException e) {
        Log.d("In Saving File", e + "");
    } catch (IOException e) {
        Log.d("In Saving File", e + "");
    }
    camera.startPreview();
    drawable = null;
    newImage.recycle();
    newImage = null;
    cameraBitmap.recycle();
    cameraBitmap = null;
} };
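For reference, a callback like the one above is normally registered when the shot is triggered; here is a small sketch using the pre-API-21 android.hardware.Camera API (the jpegCallback name is mine, not from the original post):
// Only the JPEG callback is supplied; shutter and raw callbacks are left as null.
Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        // overlay-and-save logic as shown above
    }
};
camera.takePicture(null, null, jpegCallback);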

Why is part of the image cut out when taking a picture with this code on Blackberry?

When I take a picture on the simulator (I haven't tried a device yet), the result is less than half of the image and the rest is gray. Does anyone know why?
Thanks
listener = new FileSystemJournalListener()
{
    private long _lastUSN;
    public void fileJournalChanged()
    {
        long nextUSN = FileSystemJournal.getNextUSN();
        FileSystemJournalEntry entry = FileSystemJournal.getEntry(nextUSN - 1);
        nextUSN++;
        switch (entry.getEvent()) {
        case FileSystemJournalEntry.FILE_ADDED:
            try
            {
                FileConnection fconn = (FileConnection) Connector.open("file://" + entry.getPath());
                if (fconn.exists())
                {
                    InputStream input = null;
                    input = fconn.openInputStream();
                    byte[] data = new byte[(int) fconn.fileSize() + 1000];
                    input.read(data);
                    rawImage = data;
                    pic = Bitmap.createBitmapFromBytes(data, 0, -1, 1);
                    if (input != null)
                    {
                        input.close();
                    }
                    Bitmap[] images = new Bitmap[1];
                    images[0] = pic;
                    //labels[1] = "Label for image 2";
                    //tooltips[1] = "Tooltip for image 2";
                    //labels[2] = "Label for image 2";
                    //tooltips[2] = "Tooltip for image 2";
                    ScrollEntry[] entries = new ScrollEntry[images.length];
                    entries[0] = new ScrollEntry(images[0], "", "");
                    PictureScrollField pictureScrollField = new PictureScrollField(175, 131);
                    pictureScrollField.setData(entries, 0);
                    pictureScrollField.setHighlightStyle(HighlightStyle.ILLUMINATE_WITH_SHRINK_LENS);
                    //pictureScrollField.setHighlightBorderColor(Color.BLUE);
                    pictureScrollField.setBackground(BackgroundFactory.createSolidTransparentBackground(Color.BLACK, 150));
                    insert(pictureScrollField, 1);
                    picTaken = true;
                    EventInjector.KeyEvent inject = new EventInjector.KeyEvent(EventInjector.KeyEvent.KEY_DOWN, Characters.ESCAPE, 0, 50);
                    inject.post();
                    inject.post();
                }
                break;
            }
            catch (Exception e)
            {
                // TODO Auto-generated catch block
                Dialog.alert(e.toString());
            }
            //either a picture was taken or a picture was added to the BlackBerry device
        case FileSystemJournalEntry.FILE_DELETED:
            //a picture was removed from the BlackBerry device;
            break;
        }
input.read(data) only reads some amount of data, not all of it. If you want to read the whole file, use IOUtilities.streamToBytes(input); instead, like this:
byte[] data = IOUtilities.streamToBytes(input);
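If IOUtilities is not an option for some reason, the same idea can be written as a plain read loop; a minimal sketch:
// Keep reading until read() returns -1, however many bytes each call delivers.
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
byte[] chunk = new byte[4096];
int read;
while ((read = input.read(chunk)) != -1) {
    buffer.write(chunk, 0, read);
}
byte[] data = buffer.toByteArray();
Either way, the buffer ends up exactly the size of the file, so Bitmap.createBitmapFromBytes() sees the complete image.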
byte[] data = new byte[(int) fconn.fileSize() + 1000];
...
pic = Bitmap.createBitmapFromBytes(data, 0, -1, 1);
I think data now contains 1000 wrong bytes at the end; try changing it to:
byte[] data = new byte[(int) fconn.fileSize()];
I faced the same problem. Just use:
synchronized(UiApplication.getEventLock()) {
//your code here
}
