Flash (AS3) runs slower in browser than in standalone player - performance

I have made a custom video player in Flash built on the AS3 NetStream. In development it never caused any significant CPU usage: YouTube and Vimeo sit at about 10 to 15% CPU and my own player at 20 to 25%.
Now it's running on our development webserver and it is hogging the CPU.
I have tried setting the framerate unreasonably low (1fps) and it doesn't seem to make any significant impact.
We have experimented with WMODE in the HTML page that runs the player. In wmode: "direct" it is a little better, but still nowhere close to the CPU amount in FlashDevelop.
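For reference, the embed experiment looks roughly like this. This is only a sketch, assuming SWFObject is used for embedding; the file names, dimensions, and element ids are placeholders, not values from the project:

```javascript
// Sketch of the embed parameters used to test wmode values.
// "direct" and "gpu" hand compositing to the GPU; "window" is the default.
// File names and the element id below are placeholders.
var params = { wmode: 'direct', allowFullScreen: 'true' };
var flashvars = {};
var attributes = { id: 'player' };

// With SWFObject loaded in the page, the player would be embedded like:
// swfobject.embedSWF('player.swf', 'playerContainer', '640', '360',
//                    '10.0.0', false, flashvars, params, attributes);
```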
I will gladly post all the code you think is relevant but at the moment I am at a loss for what could be causing this.
UPDATE:
Could it be related to the video file format?
UPDATE:
I have tried Chrome and Firefox on multiple computers. CPU usage varies according to the speed of the computer, as expected, but is always about 4 or 5 times as much as any other video player. So far we have found out that the high CPU compared to other players is caused by decompressing. If a smaller video format is used it works better. However, this doesn't answer the main question: why is the CPU usage within browser(s) so much higher than in standalone Flash?

Performance can differ between environments, so please check the following things:
is FlashDevelop using a debug or a release player?
is your browser using a debug or a release player?
does it matter whether you make a release or a debug build (in the Flash IDE this setting is called 'permit debugging')? Test on both the debug player AND the release player.
are you using Chrome's built-in Pepper player?
is your code valid and double-checked, with no runtime errors?
did you profile the Flash app for memory leaks?
are you using StageVideo? This renders the video on the GPU, which should give better performance (by the way, YouTube and Vimeo do).
did you test with other videos, bitrates and encodings?

I disabled the plugin-container in Firefox (in about:config, set dom.ipc.plugins.enabled to false) and my Flex app now seems to run as fast as in the standalone player.

Related

Performance of key up/down handling in the Cobalt browser

I tried the Cobalt browser on our platform (ARM-v8 Linux), and I found the key response a little slow. When pressing the up/down keys on the YouTube home page, it takes 200ms-400ms from InjectKeyboardEvent to DoLayoutAndProduceRenderTree, and DoLayoutAndProduceRenderTree takes another 100ms-150ms before rendering starts. I saw the movie rows start drawing after 500ms. Any suggestions?
I tried changing javascript_engine from mozjs to javascriptcore, but the performance was similar. (I found that JavaScriptCore does not enable JIT by default?) We also ran the PC version of Cobalt, and the key response was similar: delayed for some time before scrolling starts.
Does Cobalt have any performance measurements for developers to check this? How do I enable them?
Thanks a lot.
Non-gold builds of Cobalt have a "debug console" HUD that can be used to display live-updating debug values, including performance measurements. qa is the fastest build type that still has the debug console.
The debug console HUD can be enabled with the --debug_console=hud command-line flag, or toggled at runtime by pressing Ctrl-O.
The HUD is an overlay that displays a bunch of "CVals" (console values). These can be shown or hidden in the debug console itself (type help). In particular, you probably want to enable Event.Duration.MainWebModule which will be updated every time you hit a key. Take a look at src/cobalt/browser/web_module_stat_tracker.cc for a description of the event-timing CVals.
There is also a build target layout_benchmarks that tests the non-Javascript portions of a full layout. This is not quite the same as an incremental layout, and clearly Javascript (InjectKeyboardEvent) is a majority of your key-handling cost, but it may be a decent proxy for overall performance, and is reasonably comparable between platforms.
JavaScriptCore will soon be deprecated in favor of SpiderMonkey (a.k.a. mozjs), and removed from the Cobalt tree, so it's not an avenue of exploration with much future.
Note that Linux X11 Cobalt is not necessarily as fast as device platforms, as the X11 implementation is not particularly optimized. MesaGL is software-rendered, for example. Video composition is done fairly crudely, and YUV conversion is also done on the CPU.

Will the GPU affect video performance on devices?

We are facing a weird problem on a Surface device running the Windows RT operating system.
When we play video from the CloudFront CDN through JW Player, the video takes a long time to load and buffers much more often than on other devices. Sometimes it stops playing altogether. We see the same problem when using an HTML5 video player.
When we tried to play the video on a Surface Pro 2, it worked fine.
What might the problem be here? Is it the CPU, GPU, RAM, or some other browser issue on that specific device?
Simple Jwplayer Example here : http://jsfiddle.net/hiteshbhilai2010/Ga55z/1/
Simple HTML5 Player Example here : http://jsfiddle.net/hiteshbhilai2010/dU6TF/
The GPU of the device is only used for decoding the video. If you can watch a video locally with no problems then you can almost certainly rule out the GPU as a bottleneck.
It doesn't seem very likely that the CPU is slowing you down but you should be able to check how much of the CPUs time the video player is taking through Task Manager or something similar.
The only two remaining factors I can think of that would affect you are RAM and your network hardware (assuming you've already tried watching it on another device on the same connection).
I'd get some info on what network speeds your device + connection is capable of and if you can rule that out, investigate how much RAM the browser is using.
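One cheap way to rule the network in or out is to watch how far ahead the player has buffered. Below is a minimal sketch; the helper name is mine, and `ranges` stands in for the HTML5 video element's `buffered` TimeRanges, passed as [start, end] pairs so the logic also runs outside a browser:

```javascript
// Sketch: estimate what fraction of the video is buffered.
// In a real page, ranges would be built from video.buffered (a TimeRanges
// object) and duration would be video.duration; plain arrays are used here
// so the function is testable outside a browser.
function bufferedFraction(ranges, duration) {
  if (!duration) return 0;
  var buffered = 0;
  for (var i = 0; i < ranges.length; i++) {
    buffered += ranges[i][1] - ranges[i][0];
  }
  return buffered / duration;
}
```

In the page you might poll this once a second and log it; if the fraction grows slowly on the Surface but quickly on other devices on the same connection, the device's network path is the likelier bottleneck than its GPU.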

WebRTC and low performance machines

What happens if we run WebRTC on a device with a processor that is too weak to handle the video. Is WebRTC smart enough to drop down to a lower resolution on its own? Or do we have to manually detect this situation and resize the video depending on the device capability?
Thanks
There are methods in place for Firefox and Chrome, and they should work for simple applications, but if you are going to have numerous things running on the machine, you may want to cap/control it yourself.
Improvements are currently in the works for Chrome.
For my weaker machines I have had to decrease the video quality (through media constraints in getUserMedia) and put a cap on the bandwidth in Chrome. This has given me the control over CPU utilization I need where the in-browser solution has not.
Firefox does not support bandwidth caps yet (via SDP or media constraints), so you will have to rely on media constraints only.
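As a sketch of the two levers mentioned above: Chrome honors a `b=AS:` line (in kbps) inserted into the SDP before the description is applied, and legacy-style getUserMedia constraints can request lower-quality capture. The helper name, the constraint values, and the 500 kbps figure are illustrative assumptions, not values from this answer:

```javascript
// Legacy-style getUserMedia constraints requesting lower-quality video;
// the exact limits here are illustrative.
var constraints = {
  audio: true,
  video: { mandatory: { maxWidth: 640, maxHeight: 360, maxFrameRate: 15 } }
};

// Sketch: cap video bandwidth by munging the SDP before it is applied.
// Inserts "b=AS:<kbps>" after the c= line of the m=video section, which
// Chrome interprets as a bandwidth cap in kbps.
function capVideoBandwidth(sdp, kbps) {
  var lines = sdp.split('\r\n');
  var out = [];
  var inVideo = false;
  for (var i = 0; i < lines.length; i++) {
    var line = lines[i];
    if (line.indexOf('m=') === 0) inVideo = (line.indexOf('m=video') === 0);
    out.push(line);
    if (inVideo && line.indexOf('c=') === 0) {
      out.push('b=AS:' + kbps);
      inVideo = false; // insert only once per video section
    }
  }
  return out.join('\r\n');
}
```

In a real app you would run the offer/answer SDP through a helper like this before calling setLocalDescription/setRemoteDescription.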

What could cause an Android app to run slow on an identical device to one which it runs fast on?

I, and a few other of my Android app's users, run a Galaxy Nexus. Most of us find the app to be blazing fast, but a couple are reporting that it is unusably slow, also on a Galaxy Nexus. I'm shocked to hear them tell me that the buttons, scrolling, etc. are all slow. The main view of the app is a ListView containing many images, textviews, etc. In fact, you can check out the app for free on Google Play if you feel like digging deeper. I'm trying to compile a checklist of what might cause this issue.
Here's what I have so far:
Low memory
Low disk space
Uncaught errors
Rooted device (?)
Any other ideas?
More importantly, is there any way to detect (or even adjust for!) potential problems?
Some other things:
CPU usage (monitor via an app like WatchDog, which is free); the problem might not be your app
Android version.
Connection speed (Wi-Fi vs 4G vs 3G vs 1x)
Carrier (since they like to flash their own custom ROMs)
AFAIK Android version and connection speed are exposed by the SDK, not sure about CPU usage or carrier.
Of the errors you listed, I think that low memory would be the most likely factor.
If I were you, I'd create a function that would collect as much of this information as possible and send it to your email (or someplace). Then, have some way for the user to call this function (e.g. in the settings menu or someplace like that).
Granted, this is all just intuition as an end user; I have little experience debugging deployed apps from the coding side.

Periodic GPU performance problem

I have a WinForms application that uses XNA to animate 3D models in a control. The app has been doing just fine for months, but recently I've started to experience periodic pauses in the animation. Setting out to investigate what is going on, I have established these facts:
It happens on my machine only; other machines work fine
Removing everything from my render loop does not improve the problem
In the second point I didn't actually remove everything: I limited my loop to setting the viewport on my GraphicsDevice and then doing a GraphicsDevice.Present.
Trying to dig further I fired up PIX to capture some statistics. Screenshots of two PIX runs can be viewed here (Run6) and here (Run14). Run6 is using my original render loop and Run14 is using the bare-bones Present loop.
PIX tells me that the GPU is periodically doing something, and I assume this is causing the pauses. What could be the cause of this? Or how do I go about finding out what the GPU is actually doing?
Update: since I usually trust my code to be perfect (who's laughing?), I started a new XNA project from scratch to see if it exhibits the same behavior. So, starting a new XNA 3.1 Windows Game project and running PIX, I get this timeline. The same periodic pauses. So the problem must be lower in the stack, in XNA or Direct3D.
So PIX shows that the GPU is working on something, I can see the list of DX calls made within each frame and the timing calculations shows that the pause occurs during (or after) the IDirect3DDevice9::Present call.
Update 2: I had previously installed and uninstalled XNA 4.0 CTP on the problematic machine. I cannot be certain that this is related but I thought that perhaps a reinstall of the XNA Game Studio 3.1 bits could make a difference. Turns out it did.
The underlying question remains the same (and the bounty is still up): what could affect XNA 3.1 (or DirectX) to make it behave like this and is there any logging/tracing power tool for the DirectX and/or GPU level out there that could shed some light on what is going on?
Note: I'm using XNA 3.1 on a Windows 7 x64 dual-core machine with 8GB RAM.
Note2: also posted this question on the XNA Creators forums here.
You could try to see if you can find something with Xperf that correlates with your periodic problem. Do not run your application, but keep open the programs that would normally run alongside it. You could also try again with the application running, but that could give a cluttered view.
Start the tracing, do this in an elevated prompt.
xperf -on BASE+LATENCY -stackWalk Profile
Wait for a fair amount of time to be sure that the problem is traced.
Stop the tracing and open it like this.
xperf -d trace.etl
xperfview trace.etl
Analyze by looking at the graphs and consulting the tables for specific intervals, and see if you can find something related to the problem. The highest chance of finding it would be in the DPC and Interrupts sections, but it might just as well be something odd in the CPU or I/O sections. Good luck!
There is also more information available on Xperf and how to obtain it; hopefully this delivers results.
If not, you can alternatively try GPUView, which has been used for improvements in DWM.
It is also included next to Xperf in the Windows Performance Toolkit, so you can easily try both!
log v
... wait for a fair amount of time to be sure that the problem is traced ...
log
gpuview merged.etl
In case GPUView runs out of memory you can try adding "/limit 3" or removing the v.
Read the documentation of the tools if you are stuck somewhere.
Hmm... this seems to be occurring on the GPU; however, it sounds like a CPU garbage collection issue. Can you run the CLR Profiler and see whether there are any spikes in GC activity that correlate with the slowdowns?
I agree that it sounds unlikely since you can clearly see it in PIX, but it might offer a clue as to the cause.
If it's only happening on your own machine, then could it be drivers? Forgive me for being skeptical, but it's a 64 bit machine after all :D
This looks like either a vsync issue or a GPU in its last throes. Since going back to a different version fixed it, and the "bottleneck" is in IDirect3DDevice9::Present, let's go with the former option.
I'm not familiar with XNA so I don't know how much of the workings of D3D are exposed, but do you know what your PresentationParameters are set to?
Specifically, try setting the swap effect to Discard.
