Kinect for Windows SDK vs. SimpleOpenNI RGB image quality - windows

I'm comparing these Kinect libraries on Windows:
a) Kinect for Windows SDK for Processing (http://www.magicandlove.com/blog/2012/09/05/kinect-for-processing-library-page/)
b) SimpleOpenNI (http://code.google.com/p/simple-openni/)
Link to full image: http://i.imgur.com/mghSM.jpg
Has anyone noticed the difference in the RGB image coming out of these wrappers?
The SimpleOpenNI image is more pixelated and has more noise. Is it possible to get a better image in SimpleOpenNI?
Thanks for any tips!
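For reference, this is the basic Processing (Java) sketch I'm using to grab the RGB image in SimpleOpenNI; maybe someone can spot a setting I'm missing (whether a higher-resolution or less compressed mode can be requested seems to depend on the OpenNI driver rather than the wrapper, so treat this only as the baseline):

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableRGB();  // default RGB stream, 640x480 @ 30 fps on my setup
}

void draw() {
  context.update();
  image(context.rgbImage(), 0, 0);  // this is the image that looks more pixelated/noisy
}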

Related

How to lock Windows 10 rotation in JavaFX 8? (Microsoft Surface Pro)

I am using a Surface Pro and I want to lock the Windows 10 rotation to landscape (horizontal) from code.
Please help me.
Something like this Android code:
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
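As far as I can tell, JavaFX itself has no rotation-lock API; the closest Windows-side equivalent seems to be the Win32 call SetDisplayAutoRotationPreferences (Windows 8 and later). Here is a rough, untested sketch of calling it from Java via JNA, assuming the JNA 5.x library is on the classpath:

import com.sun.jna.Native;
import com.sun.jna.win32.StdCallLibrary;

public class RotationLock {

    // Minimal JNA binding for user32.dll with only the one function we need.
    public interface User32Ex extends StdCallLibrary {
        User32Ex INSTANCE = Native.load("user32", User32Ex.class);

        // ORIENTATION_PREFERENCE values from winuser.h
        int ORIENTATION_PREFERENCE_NONE = 0x0;
        int ORIENTATION_PREFERENCE_LANDSCAPE = 0x1;
        int ORIENTATION_PREFERENCE_PORTRAIT = 0x2;

        boolean SetDisplayAutoRotationPreferences(int orientation);
    }

    public static void lockLandscape() {
        // Asks Windows to keep the display in landscape for this process.
        User32Ex.INSTANCE.SetDisplayAutoRotationPreferences(
                User32Ex.ORIENTATION_PREFERENCE_LANDSCAPE);
    }
}

You would call RotationLock.lockLandscape() once at application start, e.g. from the JavaFX Application.start() method.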

Xcode 8 import pdf (vector) into image assets not working well

I drag a PDF into the image assets (universal) and build in Xcode.
No @2x or @3x PNGs are generated (only the PDF).
I followed this tutorial:
https://icons8.com/articles/how-to-use-vectors-in-xcode-7/
When I start the app on the device the images are very bad quality...
Xcode 6 and 7 worked...
Maybe the PDF file needs optimisation?!
In my research I found that "the size of the SVG is not important", but in this case it is (I tested it).
Not sure if this helps, but from what I was reading about this, Xcode doesn't fully support vector graphics. Instead of letting you load a vector image (.pdf) that scales however necessary, it takes whatever the default size of your .pdf is as the 1x size, then scales the PDF at build time to automatically create the 2x and 3x images. So it's not actually scaling the original on demand, just creating 1x, 2x, and 3x PNGs from the original size of your PDF (for example, a 30×30 pt PDF becomes 30 px, 60 px, and 90 px PNGs). From what I'm reading, people think this was done to maintain backwards compatibility. I couldn't find anything that says it has changed in Xcode 8, so I'm assuming it still works the same way.
This question seems to answer it well: How do vector images work in Xcode (i.e. pdf files)?
Hope I was able to help.
Possibly helpful support links (I would suggest updating to the latest): Xcode 13, Xcode 8.1, Xcode 8.

Can't get Java3D going on ImageJ 3DViewer

OS: Mac OS 10.11.1
I installed ImageJ and was going to display the example image "flybrain.tif" using "3D Viewer" under Plugins -> 3D. However, the 3D viewer doesn't launch.
I searched around and realised that it could be a problem with Java3D, so I downloaded Java3D from https://java3d.java.net/binary-builds.html. I placed the three *.jar files from Java3D under /Library/Java/JavaVirtualMachines/jdk1.8.0_65.jdk/Contents/Home/jre/lib
and added the paths to my ~/.bash_profile.
The 3D viewer still doesn't start. Anything else I can try? Many thanks!
I have a similar, not identical setup (Mac OS 10.11.2, jdk 1.8.0_66) and I am able to open the Fly Brain in 3D viewer using Fiji.
In Fiji, I activated the update sites for ImageJ, Fiji, and 3D.
You might also try posting the question at the ImageJ forum.
Hope this helps.
Problem solved by downloading the fiji-jogl-java3d-20151006220121.zip file from http://forum.imagej.net/t/java-3d-progress-and-next-steps/135. The 3D viewer works and I am able to record the 3D video again. Thank you everyone, #TSwayne #Jan-Eglinger #gouessej
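In case it helps anyone debugging the same thing: a quick way to check whether the Java 3D classes are visible to the JVM at all is a small reflection probe like this (plain Java, run with the same Java installation that launches ImageJ; the class names are the standard Java 3D entry points for the old and new packaging):

public class Java3DCheck {
    public static void main(String[] args) {
        String[] candidates = {
            "javax.media.j3d.VirtualUniverse",   // Java 3D up to 1.5.x
            "org.jogamp.java3d.VirtualUniverse"  // Java 3D 1.6.x (JogAmp builds)
        };
        for (String name : candidates) {
            try {
                Class.forName(name);
                System.out.println("Found " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("Missing " + name);
            }
        }
    }
}

If both come back "Missing", the jars are not on the classpath the 3D Viewer actually runs with, regardless of what ~/.bash_profile says.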

windows app live video

What's the best and easiest way to play an incoming live video stream in a C++ Windows application (Visual Studio 2010) and write some notes (e.g. "this is a blue ball") on the stream display? ActiveX? DirectX? Flash?
I have Windows SDK 7.1 installed. Do I need to install any other software?
Appreciate any pointers.
At its simplest, you can do everything you ask with just DirectShow. There is the DirectShow.NET managed library that wraps it for you.
So, try to find an example that just gets video from a capture device to the renderer. Then insert a SampleGrabber filter in between those and modify the frame data accordingly. I use this technique to draw a timestamp on the recorded video in my recorder; I even draw it with simple GDI+ calls.
One thing to consider: you'll have to watch out for the picture format; some webcams have YUY2 as their default or ONLY format. You'll want the RGB24 format to be able to wrap a Bitmap and then Graphics around it.

How to use textures in the Android NDK with OpenGL ES 2.0?

I tried the code from "Fastest 2D frame rate possible with android NDK, my try included, better options available?",
but the textures didn't show; they were just filled with black.
Can anyone show me an example of how to use textures in the Android NDK with OpenGL ES 2.0?
Thanks!
My phone is a Moto Milestone with Android 2.1.
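Not an NDK answer as such, but the GL calls are identical whether you issue them from C (<GLES2/gl2.h>) or from Java's android.opengl.GLES20, so here is a minimal texture-creation sketch in Java that also shows the two settings that most often cause all-black textures on 2.1-era GPUs: no minification filter set (the default expects mipmaps, so the texture is "incomplete" and samples as black) and a non-power-of-two texture without CLAMP_TO_EDGE wrapping.

import android.graphics.Bitmap;
import android.opengl.GLES20;
import android.opengl.GLUtils;

public final class TextureLoader {

    // Creates a GLES 2.0 texture from a Bitmap. Must be called on the GL thread,
    // e.g. inside GLSurfaceView.Renderer.onSurfaceCreated().
    public static int createTexture(Bitmap bitmap) {
        int[] ids = new int[1];
        GLES20.glGenTextures(1, ids, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, ids[0]);

        // Without an explicit MIN filter the default uses mipmaps; if none are
        // uploaded the texture is incomplete and renders black.
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // ES 2.0 only guarantees non-power-of-two textures with CLAMP_TO_EDGE and no mipmaps.
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
        return ids[0];
    }
}

On the C side the same sequence is glGenTextures / glBindTexture / glTexParameteri / glTexImage2D, with the pixel data passed as a raw RGBA buffer instead of a Bitmap; the fragment shader also needs a sampler2D uniform bound to texture unit 0 and valid texture coordinates, otherwise you will still see black.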
