Will the GPU affect video performance on devices? - performance

We are facing a weird problem on a Surface device running the Windows RT operating system.
When we play video from the CloudFront CDN through JW Player, the video takes a long time to load and buffers far more often than on other devices. Sometimes it stops playing entirely. We see the same problem when using a plain HTML5 video player.
When we try to play the same video on a Surface Pro 2, it works fine.
What might be the problem here? Is it because of the CPU, GPU, RAM, or some browser issue specific to that device?
Simple JW Player example here: http://jsfiddle.net/hiteshbhilai2010/Ga55z/1/
Simple HTML5 player example here: http://jsfiddle.net/hiteshbhilai2010/dU6TF/

The GPU of the device is only used for decoding the video. If you can watch a video locally with no problems then you can almost certainly rule out the GPU as a bottleneck.
It doesn't seem very likely that the CPU is slowing you down, but you should be able to check how much of the CPU's time the video player is taking through Task Manager or something similar.
The only two remaining factors I can think of that would affect you are RAM and your network hardware (I'm assuming you've already tried watching the video on another device on the same connection).
I'd get some info on what network speeds your device and connection are capable of, and if you can rule that out, investigate how much RAM the browser is using.
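If it helps narrow things down, here is a minimal sketch (assuming a standard HTML5 <video> element; the "#player" selector is a placeholder for your own element) that logs the element's buffering events and how much data is buffered ahead, to separate network stalls from decode stalls:

    // Minimal buffering diagnostics for an HTML5 <video> element.
    // If "waiting" fires while plenty of data is buffered ahead of
    // currentTime, the stall is likely decode-bound (CPU/GPU); if the
    // buffer ahead is near zero, it's the network.
    const video = document.querySelector("#player") as HTMLVideoElement;

    function bufferedAhead(): number {
      for (let i = 0; i < video.buffered.length; i++) {
        if (video.buffered.start(i) <= video.currentTime &&
            video.currentTime <= video.buffered.end(i)) {
          return video.buffered.end(i) - video.currentTime; // seconds of data ahead
        }
      }
      return 0;
    }

    for (const name of ["waiting", "stalled", "playing"]) {
      video.addEventListener(name, () =>
        console.log(`${name} at ${video.currentTime.toFixed(1)}s, ` +
                    `${bufferedAhead().toFixed(1)}s buffered ahead`));
    }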

Related

Can NFC alone be used to transfer audio?

Can NFC be used to transfer audio like a headphone cable does (i.e. audio in both directions, plus play/pause commands to an Android phone) without Bluetooth or other protocols?
Nope. That won't work with today's audio standards.
In theory you can establish a connection of about 800 kilobits/second. In practice that rarely happens outside the lab: with error correction, retransmission, and so on, you can expect an effective data-transfer rate of around 2 kb/second.
That is enough to transmit recognizable speech using specialized low-bandwidth codecs, but not much more; for comparison, even a heavily compressed music stream needs on the order of 100 kbit/s. Think of the audio quality of cell phones 20 years ago and you'll get the idea.

WebRTC and low performance machines

What happens if we run WebRTC on a device whose processor is too weak to handle the video? Is WebRTC smart enough to drop down to a lower resolution on its own, or do we have to detect this situation manually and resize the video depending on the device's capability?
Thanks
There are adaptation mechanisms in place in Firefox and Chrome, and they should work for simple applications, but if you are going to have numerous things running on the machine, you may want to handle it and cap/control it yourself.
Improvements are currently in the works for Chrome.
For my weaker machines I have had to decrease the video quality (through media constraints in getUserMedia) and put a cap on the bandwidth in Chrome. This has given me the control over CPU utilization that I need, where the in-browser solution has not.
Firefox does not support bandwidth caps yet (in SDP or MediaConstraints), so you will have to rely on media constraints only.
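For reference, here is a minimal sketch of both workarounds using today's APIs (the 640x360 resolution, 15 fps, and 500 kbps figures are illustrative choices, not recommendations):

    // 1) Reduce capture quality via media constraints in getUserMedia.
    async function getConstrainedStream(): Promise<MediaStream> {
      return navigator.mediaDevices.getUserMedia({
        audio: true,
        video: { width: { ideal: 640 }, height: { ideal: 360 }, frameRate: { max: 15 } },
      });
    }

    // 2) Cap video bandwidth in Chrome by inserting a "b=AS:<kbps>" line
    //    into the SDP before applying it (per SDP ordering, b= follows the
    //    c= line of the media section, if one is present).
    function capVideoBandwidth(sdp: string, kbps: number): string {
      const lines = sdp.split("\r\n");
      const m = lines.findIndex((l) => l.startsWith("m=video"));
      if (m === -1) return sdp;
      let insertAt = m + 1;
      while (insertAt < lines.length &&
             (lines[insertAt].startsWith("i=") || lines[insertAt].startsWith("c="))) {
        insertAt++;
      }
      lines.splice(insertAt, 0, `b=AS:${kbps}`);
      return lines.join("\r\n");
    }

    // Usage during offer creation:
    // const offer = await pc.createOffer();
    // await pc.setLocalDescription({ type: "offer", sdp: capVideoBandwidth(offer.sdp!, 500) });

As the answer notes, Firefox ignores the b=AS line, so only the media constraints help there.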

Flash (as3) runs slower in browser than in standalone

I have made a custom video player in Flash built on the AS3 NetStream. In development it never caused any significant CPU usage: YouTube/Vimeo sit at about 10 to 15% CPU and my own player at 20 to 25%.
Now it's running on our development webserver and it is hogging the CPU.
I have tried setting the framerate unreasonably low (1 fps) and it doesn't seem to make any significant impact.
We have experimented with WMODE in the HTML page that runs the player. With wmode: "direct" it is a little better, but still nowhere close to the CPU usage in FlashDevelop.
I will gladly post all the code you think is relevant but at the moment I am at a loss for what could be causing this.
UPDATE:
Could it be related to the video file format?
UPDATE:
I have tried Chrome and Firefox on multiple computers. CPU usage varies according to the speed of the computer, as expected, but is always about 4 or 5 times as much as any other video player. So far we have found out that the high CPU compared to other players is caused by decompressing. If a smaller video format is used it works better. However, this doesn't answer the main question: why is the CPU usage within browser(s) so much higher than in standalone Flash?
There could be a difference in performance between environments, so please check the following things:
is FlashDevelop using a debug or release player?
is your browser using a debug or release player?
does it matter if you make a release or debug build (in the Flash IDE this setting is called 'permit debugging')? Test on the debug player AND the release player.
are you using the built-in Chrome Pepper player?
is your code valid, double-checked, with no runtime errors?
did you profile the Flash app for memory leaks?
are you using StageVideo? This renders video on the GPU, which should give better performance (by the way, YouTube and Vimeo use it).
did you test with other videos, bitrates, encodings?
I disabled the plugin-container in Firefox (in about:config, set dom.ipc.plugins.enabled to false) and my Flex app seems to run as fast as in the standalone player now.

How is TeamViewer so fast?

Sorry about the length, it's kinda necessary.
Introduction
I'm developing a remote desktop software (just for fun) in C# 4.0 for Windows Vista/7. I've gotten through basic obstacles: I have a robust UDP messaging system, relatively clean program design, I've got a mirror driver (the free DFMirage mirror driver from DemoForge) up and running, and I've implemented NAT traversal for all NAT types except Symmetric NATs (present in corporate firewall situations).
Regarding screen transfer/sharing, thanks to the mirror driver, I'm automatically notified of changed screen regions and I can simply marshal the mirror driver's ever-changing screen bitmap to my own bitmap. Then I compress the screen region as a PNG and send it off from the server to my client. Things are looking pretty good, but it's not fast enough. It's just as slow as VNC (btw, I don't use the VNC protocol, just a custom amateur protocol).
From the slowest remote desktop software to the fastest, the list usually begins at all VNC-like implementations, then climbs up to Microsoft Windows Remote Desktop...and then...TeamViewer. Not quite sure about CrossLoop, LogMeIn - I haven't used them, but TeamViewer is insanely fast. It's quite literally live. I ran a tree command on Command Prompt and it updated with 20 ms delay. I can browse the web just a few milliseconds slower than on my laptop. Scrolling code vertically in Visual Studio has 50 ms lag time. Think about how robust TeamViewer's screen-transfer solution must be to accomplish all this.
VNC implementations use poll-based hooks for detecting screen changes and, at their worst, brute-force screen capturing/comparing. At their best, they use a mirror driver like DFMirage. I'm at this level. And they use something called the RFB protocol.
Microsoft Windows Remote Desktop apparently goes one step higher than VNC. I heard, from somewhere on StackOverflow, that Windows Remote Desktop doesn't send screen bitmaps, but actual drawing commands. That's quite brilliant, because it can just send simple text (draw this rectangle at this coordinate and color it with this gradient)! Remote Desktop really is pretty fast - and it's the standard way of working from home. And it uses something called the RDP protocol.
Now TeamViewer is a complete mystery to me. Apparently, they released their source code for Version 2 (TeamViewer is Version 7 as of February 2012). People have read it and said that Version 2 is useless - that it's just a few improvements over VNC with automatic NAT traversal.
But Version 7...it's ridiculously fast now. I mean, it's actually faster than Windows Remote Desktop. I've streamed DirectX 3D games with TeamViewer (at 1 fps, but Windows Remote Desktop doesn't even allow DirectX to run).
By the way, TeamViewer does all this without a mirror driver. There is an option to install one, and it gets just a bit faster.
The Question
My question is: how is TeamViewer so fast? It seems like it shouldn't be possible. At 1920 by 1080 resolution and even 24-bit depth (16-bit depth would be noticeably ugly), that's still 6,220,800 bytes per raw frame. Even using libjpeg-turbo (one of the fastest JPEG compression libraries, used by large corporations), compressing each frame down to 30 KB (let's be extremely generous) would still take time to route through TeamViewer's servers (TeamViewer bypasses corporate Symmetric NATs by simply proxying traffic through their servers). And the libjpeg-turbo compression itself takes time: high-quality JPEG compression of a full 1920 by 1080 screenshot takes 175 milliseconds for me, and that number goes up if the host's computer runs an Atom processor. I simply don't understand how TeamViewer has optimized their screen transfer so well. Again, small images might be highly compressed, but take at least tens of milliseconds to compress; large images take no time to compress, but take a long time to transmit. Somehow, TeamViewer completes this entire process at roughly 20-25 frames per second. I've used a network monitor, and TeamViewer is still lagless at speeds of 500 Kbps and 1 Mbps (VNC programs lag for a few seconds at that transfer rate). During my tree Command Prompt test, TeamViewer was receiving inbound data at a rate of 1 Mbps and still running at 5-6 fps. VNC and Remote Desktop don't do that. So, how?
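To put numbers on that tradeoff, here is a quick back-of-the-envelope sketch using the figures from the paragraph above (the 30 KB per frame and 25 fps are the generous assumptions stated there):

    // Back-of-the-envelope check using the question's own figures.
    const rawFrameBytes = 1920 * 1080 * 3;          // 6,220,800 bytes per raw 24-bit frame
    const jpegFrameBytes = 30 * 1024;               // a generous 30 KB per JPEG frame
    const fps = 25;
    const mbps = (jpegFrameBytes * fps * 8) / 1e6;  // ~6.1 Mbit/s just for whole-frame JPEGs
    console.log({ rawFrameBytes, mbps });
    // Observed TeamViewer traffic is ~1 Mbit/s, so sending whole compressed
    // frames can't be the mechanism; it has to be deltas / a real video stream,
    // as the answers below suggest.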
The answers will be somewhat complicated and intricate, so please don't post your $0.02 if you're only going to say it's because they use UDP instead of TCP (would you believe they actually do use TCP just as successfully though).
I'm hoping there's a TeamViewer developer somewhere here on StackOverflow.
Potential Answers
Will update this once people reply.
My thoughts are, first of all, that TeamViewer has very fine network control. For example, they split large packets to just under the MTU size and never waste a round trip. They probably have all sorts of fancy hooks to detect screen changes, along with extremely fast XOR image comparisons.
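To illustrate the kind of change detection I mean, here is a minimal sketch of tile-based frame differencing (the 64-pixel tile size and raw RGBA frame format are assumptions for illustration, not anything TeamViewer is known to use):

    // Tile-based change detection, as used by VNC-style systems: split each
    // frame into tiles, compare against the previous frame, and send only
    // the tiles that changed. Frames here are raw RGBA buffers (4 bytes/pixel).
    const TILE = 64; // tile edge in pixels; an arbitrary illustrative choice

    function tileChanged(prev: Uint8Array, curr: Uint8Array, width: number,
                         height: number, tx: number, ty: number): boolean {
      const xEnd = Math.min(tx + TILE, width);
      const yEnd = Math.min(ty + TILE, height);
      for (let y = ty; y < yEnd; y++) {
        const rowStart = (y * width + tx) * 4;
        const rowEnd = (y * width + xEnd) * 4;
        for (let i = rowStart; i < rowEnd; i++) {
          if (prev[i] !== curr[i]) return true; // bail out on the first difference
        }
      }
      return false;
    }

    function dirtyTiles(prev: Uint8Array, curr: Uint8Array,
                        width: number, height: number): Array<{ x: number; y: number }> {
      const dirty: Array<{ x: number; y: number }> = [];
      for (let ty = 0; ty < height; ty += TILE) {
        for (let tx = 0; tx < width; tx += TILE) {
          if (tileChanged(prev, curr, width, height, tx, ty)) dirty.push({ x: tx, y: ty });
        }
      }
      return dirty; // encode and transmit only these regions
    }

In practice you would compare whole machine words (or XOR entire rows) rather than single bytes, but the shape of the algorithm is the same.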
The most fundamental thing here probably is that you don't want to transmit static images but only changes to the images, which is essentially analogous to a video stream.
My best guess is some very efficient (and heavily specialized and optimized) motion-compensation algorithm, because most of the actual change in generic desktop usage is linear movement of elements (scrolling text, moving windows, etc., as opposed to transformation of elements).
The DirectX 3D performance of 1 FPS seems to confirm my guess to some extent.
"...would take time to route through TeamViewer's servers (TeamViewer bypasses corporate Symmetric NATs by simply proxying traffic through their servers)"
You'll find that TeamViewer rarely needs to relay traffic through their own servers. TeamViewer gets through NATs, even networks complicated by layered NAT, using NAT traversal (I think it's UDP hole-punching, like Google's libjingle).
They do use their own servers as a middle-man for the handshake and connection set-up, but most of the time the relationship between client and server will be P2P (in the best case, when the handshake is successful). If NAT traversal fails, then TeamViewer will indeed relay traffic through its own servers.
I've only ever seen it do this when a client has been behind double-NAT, though.
It does indeed sound like video streaming rather than image streaming, as someone suggested.
JPEG/PNG compression isn't targeted for these types of speeds, so forget them.
Imagine having a recording codec on your system that can realtime record an incoming video stream (your screen). A bit like Fraps perhaps. Then imagine a video playback codec on the other side (the remote client).
As hard-disk recorders can do it (record live and even play back live from the same disk), so should you, in the end. The disk surely can't deliver images quicker than you can read your display, so that isn't the bottleneck. The bottleneck is the video codecs. You'll find the encoder much more of a problem than the decoder, as decoders are mostly free.
I'm not saying it's simple; I myself have used DirectShow to encode a video file, and that's nowhere near realtime. But given the right codec, I'm convinced it can work.
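For a modern, browser-side illustration of that "realtime recording codec" idea (this is not how TeamViewer works internally; getDisplayMedia and MediaRecorder are simply the most accessible real-time screen encoders available today):

    // Capture the screen and encode it in real time; encoded chunks stream
    // out as they are produced, like a live hard-disk recorder for the display.
    async function streamScreen(send: (chunk: Blob) => void): Promise<void> {
      const screen = await navigator.mediaDevices.getDisplayMedia({
        video: { frameRate: { ideal: 25 } },
      });
      const recorder = new MediaRecorder(screen, {
        mimeType: "video/webm;codecs=vp8", // a codec most browsers can encode live
        videoBitsPerSecond: 1_000_000,     // ~1 Mbit/s, comparable to the observations above
      });
      recorder.ondataavailable = (e) => { if (e.data.size > 0) send(e.data); };
      recorder.start(100); // emit a chunk every 100 ms to keep latency low
    }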
My random guess is: TeamViewer uses the x264 codec, which has a commercial license (otherwise TeamViewer would have to release their source code). At some point (more than 5 years ago), I recall the main developer of x264 writing an article about the improvements he made for low-delay encoding (if you delay by a few frames, encoders can compress better), plus he mentioned some other improvements that were relevant for TeamViewer-like use. In that post he mentioned playing Quake over a video stream with no noticeable issues. Back then I was fairly sure who the sponsor of those improvements was, as TeamViewer was pretty much the only option at that time.
x264 is an open-source implementation of the H.264 video codec, and it's an insanely good implementation; it's the best one. At the same time it's extremely well optimized. Most likely it's the extremely good implementation of x264 that gets you much better results with TeamViewer at lower CPU load. AnyDesk and Chrome Remote Desktop use libvpx, which isn't as good as x264 (optimization- and video-quality-wise).
However, I don't think TeamViewer can beat Microsoft's RDP. To me RDP is the best, though it works only between Windows PCs or from Mac to Windows. TeamViewer works even from mobiles.
Update: the article was written in January 2010, so that work was done roughly 10 years ago. Also, I made a mistake: he played Call of Duty, not Quake. When you posted your question, if my guess is correct, TeamViewer had already been using that work for 3 years.
You can read that blog post via the Web Archive: "x264: the best low-latency video streaming platform in the world". When I read the article back in 2010, I was sure that the "startup which has requested not to be named" that the author mentions was TeamViewer.
Odd, but in my experience TeamViewer is not faster/more responsive than VNC, only easier to set up. I have a couple of win-boxen that I VNC into over OpenVPN (so there is another overhead layer), and that's on cheap cable (512 up), and I find a properly set up TightVNC to be much more responsive than TeamViewer to the same boxen. RDP (naturally) even more so, since in large part it sends GUI draw commands instead of bitmap tiles.
Which brings us to:
Why are you not using VNC? There is a plethora of open-source solutions, and Tight is probably at the top of its game right now. Advanced VNC implementations use lossy compression, and that seems to achieve better results than your choice of PNG. Also, IIRC, the rest of the payload is squashed using zlib. Both Tight and UltraVNC have very optimized algorithms, especially for Windows. On top of that, Tight is open-source.
If Windows boxen are your primary target, RDP may be a better option, and it has an open-source implementation (rdesktop).
If *nix boxen are your primary target, NX may be a better option, and it has an open-source implementation (FreeNX, albeit not as optimised as NoMachine's proprietary product).
If JPEG compression is a performance issue for your algo, I'm pretty sure image comparison would still take away some performance. I'd bet they use best-case compression for every specific situation, i.e. lossy for large frames, some quick-and-dirty internal lossless for smaller ones, comparing bits of images and sending only diffs of a sort, and a bunch of other optimisation tricks.
And a lot of those tricks must be present in Tight > 2.0, since, again, in my experience it beats the hell out of TeamViewer performance-wise. YMMV.
Also, the choice of a JIT-compiled runtime over something like C++ might take a slice off your performance edge, especially on memory-constrained machines (a lot of performance tuning goes down the toilet when Windows starts using the pagefile intensively). And you will need memory to keep previous image states for internal comparison, on top of what DFMirage gives you.

OnLive: How does it work? [closed]

OnLive is a cloud computing solution for gaming. It offers streaming of high-end games to any PC, regardless of its hardware. I wonder how it works: sending raw HD-resolution image and audio data seems unlikely. Would relatively simple compression, like JPEG and MP3/Ogg, do the trick?
Have you read this article? Excerpts thereof:
It's essentially the gaming version of cloud computing - everything is computed, rendered and housed online. In its simplest description, your controller inputs are uploaded, a high-end server takes your inputs and plays the game, and then a video stream of the output is sent back to your computer. Think of it as something like Youtube or Hulu for games.
The service works with pretty much any Windows or Mac machine as a small browser plug-in. Optionally, you will also be able to purchase a small device, called the OnLive MicroConsole, that you can hook directly into your TV via HDMI, though if your computer supports video output to your TV, you can just do it that way instead. Of course, you can also just play on your computer's display if you don't want to pipe it out to your living room set.
[...]
OnLive has worked diligently to overcome lag issues. The first step in this was creating a video compression algorithm that was as quick as possible.
It's basically games-over-VNC. Obviously they use video compression; of what sort I'm not sure. The two obvious alternatives would seem to be something fairly computationally lightweight, such as motion JPEG or even MPEG-2, running on the same server that's running the game, or something more computationally intensive but compact, such as H.264, running on dedicated hardware.
Personally, if I were designing the service, I'd go for the latter: it allows you to have better compression without massively upgrading all your servers, for the cost of a relatively inexpensive codec chip. Because the video stream is smaller, you can attract people whose connections would have been marginal or too slow with a poorer codec.
This is what I understood: it is a thin-client-based gaming solution. Unlike gaming consoles such as the Wii, Xbox, or PlayStation, no CPU/GPU or any real processing is needed on the player's side. The game is streamed from a monster server via the internet, much like a terminal session (RDP/Remote Desktop) but with HD graphics. Controls (inputs) are sent to the server and graphics are sent back. It can be played on a Mac or PC via a web-browser add-in, or on a TV with a small unit that connects to the server. It requires a 5 Mbps connection for HD and 1.5 Mbps for SD. Almost all game titles will be available or ported to this platform. No need to buy a console or a game, no need for high-end gaming PCs... just a broadband connection (which, of course, should be high-end).
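A toy sketch of that thin-client loop, to make the architecture concrete (the WebSocket endpoint, message shape, "#game" selector, and use of Media Source Extensions are all invented for illustration; a real service would use a heavily tuned low-latency transport):

    // Thin-client gaming: inputs go up, an encoded video stream comes down.
    const ws = new WebSocket("wss://example.invalid/game-session"); // hypothetical endpoint
    ws.binaryType = "arraybuffer";

    // Upstream: forward keyboard input to the server, which plays the game.
    window.addEventListener("keydown", (e) =>
      ws.send(JSON.stringify({ type: "key", code: e.code, down: true })));
    window.addEventListener("keyup", (e) =>
      ws.send(JSON.stringify({ type: "key", code: e.code, down: false })));

    // Downstream: feed the server's encoded video into a <video> element.
    const video = document.querySelector("#game") as HTMLVideoElement;
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener("sourceopen", () => {
      const buf = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
      ws.onmessage = (e) => {
        // A real client would queue appends while buf.updating is true.
        if (e.data instanceof ArrayBuffer && !buf.updating) buf.appendBuffer(e.data);
      };
    });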
I think they are using something like an HDMI H.264 video encoder to stream video directly from an HDMI audio/video output.
Something like this HDMI encoder or this H.264 realtime encoder.
You can also use a frame-grabber card like this: http://www.epiphan.com/products/frame-grabbers/vga2ethernet/
There is also one more solution now. If you have a recent Nvidia graphics card, you can get the benefits of hardware-accelerated capture without the extra hardware. It's called "GameStream". You can buy one of the Nvidia devices supporting the protocol, or you can download an open-source app called "Moonlight": http://moonlight-stream.com
