I simulate a high-FPS camera and was asked to build a desktop app that supports screen drawing at 100 Hz.
It is currently implemented in .NET (WinForms), and I want to learn a new technology, so I need to show that there are no performance issues using the Electron framework with a JavaScript UI.
I searched quite a bit, and the maximum FPS mentioned anywhere was 60.
Is it possible?
Since Electron uses Chromium for rendering, it should be possible.
You can check this by enabling the FPS meter flag and running a short test:
Show FPS meter Chrome 33
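As a quick sanity check, something like the sketch below should work. It assumes Chromium's show-fps-counter switch (which may change between versions) and simply counts requestAnimationFrame callbacks per second in the renderer; since requestAnimationFrame is driven by the display's vsync, on a 100 Hz monitor it should settle near 100 fps, and the 60 fps figure you keep seeing usually just reflects a 60 Hz screen.

```js
// main.js — minimal sketch; 'show-fps-counter' is a Chromium command-line
// switch (an assumption here, it is not part of Electron's own API).
const { app, BrowserWindow } = require('electron');

app.commandLine.appendSwitch('show-fps-counter'); // draw Chromium's FPS overlay

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile('index.html');
});
```

```js
// renderer.js — count requestAnimationFrame callbacks per second.
let frames = 0;
let last = performance.now();

function tick(now) {
  frames += 1;
  if (now - last >= 1000) {
    console.log(`${frames} fps`);   // expect ~100 on a 100 Hz display
    frames = 0;
    last = now;
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```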
I am working on a tablet (HP) with Windows 8.1. We developed a web application, accessed from the tablet with the Chrome browser, which accesses the tablet's webcam using the getUserMedia API (the implementation is simple, based on JavaScript, similar to the one here for example: https://davidwalsh.name/demo/camera.php).
Our application will be used to take photos of identity cards, and then submit them to a servlet.
The quality of the picture taken inside the browser, using the getUserMedia API, is quite poor, and the letters on the identity cards are sometimes not easily readable in the image.
If I use the "Camera" application from Windows 8.1 on the same tablet, and take pictures of the same identity cards, in the same light conditions and from the same distance, the resulting images (JPEG) are very clear.
Why is this difference in quality? I read all about the getUserMedia API, and I tried all the available parameters (constraints, width, height, jpeg quality), but I cannot obtain a good quality image.
Why does the same camera on the same tablet produce such different quality in the browser compared to the Windows camera application, and is there a way to obtain better quality in the browser (short of developing a custom plugin)?
To answer your question "Why is this difference in quality?": in short, it is because the browser emulates the camera feed and does image transformation under the hood so that it can send different streams to different clients. WebRTC's main focus is P2P media streaming, not taking high-quality photos.
You can use ImageCapture to access more camera properties (and to get the frame as an ImageBitmap), but support right now is still very weak.
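A rough sketch of what that could look like (not production code, and Chrome-only at the time of writing): request the highest resolution the camera offers via constraints, then ask ImageCapture.takePhoto() for a full-resolution still instead of grabbing a frame from the downscaled video stream.

```js
// Sketch: capture a still via the ImageCapture API where it is available.
async function takeIdCardPhoto() {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: { ideal: 4096 }, height: { ideal: 2160 } } // ask for the best the camera offers
  });
  const [track] = stream.getVideoTracks();

  const imageCapture = new ImageCapture(track);
  const blob = await imageCapture.takePhoto(); // full photo resolution, not the video frame size

  track.stop();  // release the camera
  return blob;   // e.g. append to a FormData and POST it to the servlet
}
```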
Please read my answers below, which go into more depth on MediaCapture and ImageCapture and their use cases:
Why the difference in native camera resolution -vs- getUserMedia on iPad / iOS?
Take photo when the camera is automatically focused
I am developing a PhoneGap application using nice UI tools like ReactJS, with requestAnimationFrame and things like that.
I am trying to achieve rendering performance of 60 fps.
According to this article:
http://calendar.perfplanet.com/2013/the-runtime-performance-checklist/
To reach 60 fps, the browser has to complete all of the following within 16 ms:
JavaScript
Style calculations
Layout
Paint Setup & Paint
Composite
I can see the cost of each step in Chrome DevTools by looking at the frames. My JS code is pretty fast thanks to how React optimises DOM manipulation, and the whole thing fits under 16 ms most of the time.
However, these figures are for my computer, my CPU and my GPU...
I'm interested in having the same figures for different mobile devices, to see how they will behave and to know what the minimum device level is to reach 60 fps.
Are there any tools to inspect browser rendering frames for most popular mobile devices?
Is there an easy way to slow down my computer so that it behaves like a mobile phone in terms of performance?
What about the impact of other apps running on the phone that could use some of the phone's resources?
You could hook up a remote web inspector to the phone and profile the rendering performance like you would on your computer. This is supported on iOS 6+ and on Android 4.4+, assuming the app is in debug mode. However, hooking up the remote web inspector distorts the results, as it imposes a serious performance hit itself.
If you're just interested in measuring raw FPS, my suggestion is to inject stats.js, a little widget that displays an FPS graph:
https://github.com/mrdoob/stats.js/
I used this to benchmark different game engines and got decent results.
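For reference, wiring it up is roughly this (a sketch; the begin()/end() calls wrap whatever work you render each frame):

```js
// Sketch: inject the stats.js widget and wrap each frame with begin()/end().
import Stats from 'stats.js'; // or include the script from the repo above

const stats = new Stats();
stats.showPanel(0);                   // 0: fps, 1: ms per frame, 2: MB
document.body.appendChild(stats.dom); // small overlay in the top-left corner

function animate() {
  stats.begin();
  // ... update/render your UI here ...
  stats.end();
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);
```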
I have created a Flash game which plays at 60 frames per second. It plays fine in all browsers except Internet Explorer 8 and 9; in these cases, it seems to be half (or less than half) the intended frame rate.
I have tried embedding it using the Flash authoring tool's own HTML code as well as the swfobject method, but to no avail. In all cases, GPU acceleration is enabled.
It's also not a matter of CPU performance, since it is tested on the same computer with all other applications closed. The problem only arises with IE.
One final peculiarity to consider: I have run FRAPS to count the actual rendering frame rate, and it reports 60 fps (inside IE), although it is crystal clear that this is not the speed actually achieved.
It sounds like Flash isn't properly installed in your IE browser.
Try the following (assuming you have Adobe Flash Player v10.2 installed):
• Restart your computer.
• Open Internet Options (located in the Control Panel).
• Click on the Advanced tab.
• Click on the Reset button, place a check next to Delete Personal Settings and then click Reset.
• Launch IE9 and Adobe Flash Player will be properly enabled.
Then try your tests again.
GPU acceleration is not always the best solution. It depends on the way you have coded the app.
Try another wmode value.
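As a sketch, with swfobject the wmode goes into the params object; values like "direct", "gpu", "opaque" and "window" behave quite differently across browsers, so it's worth testing each one (the file name and element id below are placeholders):

```js
// Sketch: embed the SWF with swfobject and experiment with wmode values.
var flashvars = {};
var params = { wmode: "direct" };   // also try "gpu", "opaque", "window"
var attributes = {};

swfobject.embedSWF("game.swf", "flashContent",
                   "800", "600", "10.2.0",
                   null, flashvars, params, attributes);
```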
I'm building a phone app with some animations, and they look really clunky in the emulator. I don't have a phone yet, so this is the only way I can test my app.
Sometimes the animations start late (up to a second after user input), and they are almost always very jagged. Far from the smooth fades and transitions that I've seen on the interwebs. I'm not using anything hairy - just basic rotations and opacity fades on one or two elements.
Does anyone else see this in the emulator? If not, I guess I have a bug somewhere. If so, is there a workaround? Should I bump the priority of xde.exe in Process Explorer? Something else?
Thanks!
This may be a consequence of GPU detection not working on your system.
You can verify this by checking if you can see the frame rate counters.
Jeff Wilcox – Frame rate counters in Windows Phone
Note the emulator system requirements here also.
Setup and System Requirements for Windows Phone Emulator
I was playing with the new Silverlight 4 and, to my surprise, when I run my sample application out-of-browser (OOB), all animations become very jerky when I move the mouse around during the animation; but when I run my app in the browser, the animations are smooth even while moving the mouse around.
I tried my app on two different computers and turned on GPU acceleration in the OOB settings - and got the same jerky result.
Is this a known problem with Silverlight?
I'm running WinXP SP3
UPDATE: Tested on 3 Windows 7 machines - no issues at all (running in OOB and in the browser). Tested 5 additional WinXP SP3 machines - the problem is 100% reproducible with any Silverlight 4 app running in OOB.
Turning on "Enable GPU acceleration in Out of Browser" isn't enough. That setting works in tandem with .CacheMode property, which must be set on all elements (or a top-level element) whose rendered bitmap will be sent to the GPU.
From there, the GPU can hardware accelerate rotations, scalings, opacities, clipping. If your animation does any of that, you'll need to set element.CacheMode = "BitmapCache" on the top level element you are animating.
Again, you'll need to turn on the "enable GPU acceleration" for this to work.
If either step is missing, you won't get GPU accelerated.
A couple of caveats for hardware acceleration:
Pixel shaders and perspective transforms are not HW accelerated last I checked.
HW acceleration works on XP, but requires that you have a video card from NVidia, ATI, or Intel, and the driver date must be post Nov 2004. Anything less and nothing will be accelerated.
I recommend reading MSDN's article on hardware acceleration.