Excanvas: Can I use Processing.js with it at near-normal (for <canvas>) speeds?

I've recently come across Google's ExCanvas stuff... It's all very well and good having <canvas> for the turdier browsers out there, but I was wondering how optimised it is; can I use Processing.js on it (probably) and get framerates close to those I'd get on a native canvas in something like Chrome/Firefox?
Thanks,
James

This will mostly depend on the sturdiness of the machine your code gets executed on, and on the speed of the JavaScript interpreter that runs it.
ExCanvas means extra overhead, and some versions of Chrome and Firefox support hardware acceleration of their canvas rendering.
That makes it unlikely that using ExCanvas will yield the same results (as far as speed is concerned) as a native canvas tag.
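If you want to quantify the gap yourself, here is a minimal sketch (assuming excanvas.js is loaded in IE via a conditional comment; drawOneFrame is a hypothetical stand-in for one frame of your sketch). ExCanvas exposes a global G_vmlCanvasManager, which makes the emulation easy to detect:
var usingExCanvas = (typeof window.G_vmlCanvasManager !== 'undefined');
var frames = 0;
var start = new Date().getTime();
function tick() {
  drawOneFrame(); // hypothetical: render one frame of the Processing.js sketch
  frames++;
  var elapsed = new Date().getTime() - start;
  if (elapsed >= 5000) {
    // report frames per second after five seconds of rendering
    alert((frames / (elapsed / 1000)).toFixed(1) + ' fps' +
          (usingExCanvas ? ' (ExCanvas)' : ' (native canvas)'));
  } else {
    setTimeout(tick, 16); // aim for roughly 60fps
  }
}
tick();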

Related

WebGL and three.js running great on Chrome but HORRIBLE on Firefox

Basically I am downloading a zip file and extracting a collada file to load in the browser. This works freaking awesome in Chrome but is REALLY slow with model movement from the mouse in Firefox. I can't seem to figure this out, or whether there's a setting I'm missing to speed up Firefox. The file is loaded up here:
http://moneybagsnetwork.com/3d/test.htm
It's using jsunzip and three.js to load everything. I've bypassed jsunzip and that's not the issue. I've also dumbed down the model to not use any event listeners and no lights, and that didn't help one bit. Completely stumped here, and the model really isn't that big :/
Here is a link to a zip of the files I'm using
http://moneybagsnetwork.com/3d/good.zip
Sorry about the multiple commented lines. I might turn things back on if this gets fixed.
I have noticed that Chrome is usually way faster and more responsive with three.js applications than Firefox. The difference is not so much on the WebGL side, but in the plain JavaScript supporting code.
Looking at your code, it seems you do some very heavy JavaScript work in your onmousemove function. This could very well cause much of the performance gap between the browsers. Mousemove fires many, many times during each and every mouse movement, so it quickly adds up to slow performance. It could also be that Firefox actually creates more mousemove events for similar cursor movements (not sure).
You could move most of the raycasting and other work from mousemove to mouseclick. Alternatively, you could implement a delayed mousemove so that the function is called only a maximum of X times per second, or only when the mouse has stopped. Something like this (untested but should work):
var mousemovetimer = null;

function onmousemove(event) {
  // restart the timer on every move; delayedmousemove only fires
  // once the mouse has been still for 100ms
  if (mousemovetimer) {
    window.clearTimeout(mousemovetimer);
  }
  mousemovetimer = window.setTimeout(function() {
    delayedmousemove(event); // pass the last event through to the handler
  }, 100);
}

function delayedmousemove(event) {
  // your original mousemove code here
}
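For the "maximum of X times per second" variant mentioned above, a simple throttle would look something like this (equally untested; the 100ms interval is just an illustrative value):
var lastmove = 0;
function onmousemove(event) {
  var now = new Date().getTime();
  if (now - lastmove < 100) {
    return; // drop this event: at most ~10 calls per second get through
  }
  lastmove = now;
  // your original mousemove code here
}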
Maybe your graphics card is in our blacklist. There is usually a note about this towards the bottom of about:support.
Cards can be blacklisted for various reasons: missing drivers/features, occasional crashes... see:
http://www.sitepoint.com/firefox-enable-webgl-blacklisted-graphics-card/
To enable WebGL, set webgl.force-enabled to true.
To enable Layers Acceleration, set layers.acceleration.force-enabled to true.
To enable Direct2D on Windows Vista/7, set gfx.direct2d.force-enabled to true.
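If you would rather check from script whether WebGL actually came up (instead of reading about:support), a small detection sketch along these lines should work; the 'experimental-webgl' name covers older implementations:
function webglSupported() {
  try {
    var canvas = document.createElement('canvas');
    return !!(window.WebGLRenderingContext &&
              (canvas.getContext('webgl') ||
               canvas.getContext('experimental-webgl')));
  } catch (e) {
    return false; // getContext can throw on blacklisted configurations
  }
}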

Poor Canvas2D performance with Firefox on Linux

I just hit something particularly hard to debug when doing some pretty intensive rendering with Canvas2D. I use all kinds of things, from globalCompositeOperation to multiple off-screen canvases, with some drawImage magic in between.
It works perfectly fine and smoothly on:
Chrome (26) [OSX 10.7.5]
Safari (6.0.2) [OSX 10.7.5]
Firefox (Both 18 and 20 Aurora) [OSX 10.7.5]
Chrome (24) [Windows 7]
Firefox (12) [Windows 7]
Chromium (24) [Archlinux, Gnome 3]
EDIT: Added tests for Windows 7. Strangely enough, it works for FF12 (I had an old version on my dual boot) but there's a definite performance hit after upgrading to FF18. It's not as bad on Windows as it is on Linux though, and the same version on OSX works flawlessly. Regression maybe?
For some reason, with Firefox on Linux (I tried both 18 and 20 Aurora), I get bad rendering performance when dragging and rendering at the same time.
If I fire-and-forget an animation, it is on par with Chrome/Safari, but if I drag and render, I often end up only seeing the end frame after I release the drag.
Neither requestAnimationFrame nor direct rendering in the mouse event handler works.
After profiling, the reported timings for the rendering parts are well within an acceptable range (up to 100ms at the absolute worst), and definitely do not correspond to what I see on the screen.
I tried reducing the load by removing some stuff, ending up with reported render times under 15ms, but what I saw didn't change.
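A script-side measurement of the kind described above only times the JavaScript draw calls, not the compositing the browser does afterwards, which may explain the mismatch; a sketch (render() is a stand-in for the actual drawing code):
var t0 = new Date().getTime();
render(); // stand-in for the Canvas2D drawing code being profiled
var elapsed = new Date().getTime() - t0;
// 'elapsed' covers only the script work; the browser may composite the
// result to the screen much later, so small numbers here do not rule out
// the on-screen stalls described above.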
What baffles me is that it works almost everywhere else except with Firefox on Linux. Any idea as to where I should look, a bug report or a solution to my problem?
I have fully switched to Chrome on Linux because of this issue. It stems from the outdated 2D rendering engine Firefox uses, called Cairo. Azure was meant to replace this engine, and they have it done on basically all platforms except Linux.
http://blog.mozilla.org/joe/2011/04/26/introducing-the-azure-project/
https://bugzilla.mozilla.org/show_bug.cgi?id=781731
I think I know where you should look based on this:
If I fire-and-forget an animation, it is on par with Chrome/Safari, but if I drag and render, I often end up only seeing the end frame after I release the drag.
This is probably a double-buffering bug in Firefox on Linux.
Canvas implementations have double-buffering built in. You can see it in action in any browser with a simple example like this one: http://jsfiddle.net/simonsarris/XzAjv/ (which uses setTimeout plus extra work to illustrate that clearing does not happen right away).
The implementations try to delay all rendering by drawing to an internal bitmap and then, at the next pause, rendering it to the canvas all at once. This stops the "flickering" effect of clearing a canvas before redrawing a scene, which is good.
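A minimal sketch of that effect (the canvas id is illustrative): intermediate states are never painted, because the internal buffer is only flushed to the screen once the script yields.
var ctx = document.getElementById('theCanvas').getContext('2d');
ctx.fillRect(0, 0, 100, 100);   // draw something
ctx.clearRect(0, 0, 100, 100);  // clear it again...
var end = new Date().getTime() + 1000;
while (new Date().getTime() < end) {} // busy-wait: no pause, so no flush
ctx.fillRect(0, 0, 50, 50);     // only this final state ever reaches the screen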
But it seems there's a plain old bug in Firefox on Linux. During your drag-and-render it appears not to update the canvas, probably in an attempt to buffer, but it is doing so when it should not. This would explain why it works in fire-and-forget scenarios.
So I think a bug report is in order. I haven't got any linux machines lying around so I can't reproduce it and submit something myself to be certain though, sorry.
This is in reply to a comment: there isn't such a method, it's totally hidden. What you could do instead is, during the mouse move, dispatch the drawing portion to a tiny timer.
For instance:
// The old way
theCanvas.addEventListener('mousemove', function() {
  // if we're dragging and are redrawing
  drawingCode();
}, false);

// The new way
theCanvas.addEventListener('mousemove', function() {
  // if we're dragging and are redrawing
  // in 1 millisecond, fire off drawing code
  setTimeout(function() { drawingCode(); }, 1);
}, false);
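One caveat with that sketch (my addition, untested): every mousemove queues its own timer, so fast movement can still pile up redraws. A small flag coalesces them into one pending draw:
var drawPending = false;
theCanvas.addEventListener('mousemove', function() {
  if (drawPending) return; // a redraw is already queued
  drawPending = true;
  setTimeout(function() {
    drawPending = false;
    drawingCode();
  }, 1);
}, false);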

In IE6+, would Transitions or Flash be better for small drop in/out animations?

IE6+ has some problems with transitions as usual, but I know ways to get through those. The thing I'm wondering is: if I use Flash instead of getting nerfed by the lack of transitions, would it make my website better or lamer? Because if I use .swf files even for the smallest animations, I guess it would hurt performance really badly, but it would look better, especially while I'm using opacity transitions.
So which would be better to use on IE6+, or what would you expect from a website while browsing with IE6+? A good-looking .swf with a performance hit, or a crappy-looking transition with good performance?
Get IE6 out of your mind. Take a look at the percentage share of people using IE6:
http://www.w3schools.com/browsers/browsers_explorer.asp
1.2% in November and December 2011!
Quit supporting it!

Is it possible for CSS3 transitions or high memory objects to affect scrolling smoothness in Chrome?

I'm working on a site with lots of CSS3 transitions (which are hardware accelerated) and high-memory objects (for example, an array of 39 objects, each containing the full HTML source of a typical online news article), and I'm noticing some very choppy/jittery scrolling, which I've been unable to debug.
I've kept these high-memory objects out of the DOM, which should prevent them from affecting DOM performance; however, I can't help but think that they are still having a negative effect. I don't have code samples to post because I'm unsure whether this is even an issue.
Please go to this site (Orange) and click on an article tile. In the reader div that pops up over the page, try scrolling as you normally would. Does it feel choppy/jittery? Do you have any suggestions on how to improve this?
CSS3 transitions, opacity, text- and box-shadows, and the like are certainly known to impact rendering speed. In fact, even sites with heavy use of text-shadow alone can cause choppy scrolling on an average computer. Combining this with heavy use of JavaScript seems like a recipe for choppy web browsing.
edit: The loading animation on the o in orange is pretty awesome!
Yes, that's jittery. A page with a lot of JavaScript will do that, and frameworks like jQuery won't help at all. I'd recommend recoding as much as you can without jQuery and passing it through JSLint (http://www.jslint.com/).
Try using Chrome's developer tools, too, to get an idea of what the bottleneck is.
Try disabling JavaScript as well and see if it's any better. If it isn't, then you know where your problem lies.
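If you want a number rather than a feeling, here is a rough jank probe (my sketch, not from the site): it logs any frame that blows far past the ~16ms budget while you scroll, using requestAnimationFrame (or the webkit-prefixed variant) with a setTimeout fallback:
function raf(cb) {
  var fn = window.requestAnimationFrame || window.webkitRequestAnimationFrame;
  if (fn) { fn.call(window, cb); } else { setTimeout(cb, 16); }
}
var last = new Date().getTime();
function probe() {
  var now = new Date().getTime();
  if (now - last > 50) {
    console.log('slow frame: ' + (now - last) + 'ms');
  }
  last = now;
  raf(probe);
}
raf(probe);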

How do you feel about including ie7.js or ie8.js in your page?

See here: http://code.google.com/p/ie7-js/
Does anyone have any experience or remarks about this javascript? Is it worth including? Do you recommend it?
I know many people, myself included, that are using various IE hacks to get transparent PNG support. This looks like a bit more help, and as long as it works and the size is fairly small, I wouldn't see much against using it.
I've used it before, and my results are mixed. These scripts cause IE to churn for a bit on page load. Basically, you have to think of it as iterating through elements and stylesheet rules to apply "fixes" for areas where that particular rendering engine is deficient. In some cases, depending on how complicated your markup or stylesheets are, that can take a while and you will see the browser hang.
That said, if you can trade off that performance cost, you will save development time as you'll spend less time hacking around IE6 quirks; IE7/IE8 will provide enough missing functionality that you can avoid certain edge cases, can develop using standards better (min-width/min-height, multiple className selectors, etc.), and certain rendering issues will disappear.
However, if you just need 24-bit transparent PNG support, use a tool built for that. Including IE7/IE8.js for PNG support alone is like pounding in a nail with a tank. Use DD_belatedPNG for that.
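For reference, DD_belatedPNG's documented entry point is a single fix() call that takes a CSS selector (the selector below is just an example), usually wrapped in an IE6-only conditional comment:
DD_belatedPNG.fix('.png24, img.logo'); // run once the DOM is ready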
It works, but it's worth keeping in mind that ie7.js and ie8.js do much more than provide transparent PNG support. Also note that with the emulated PNG support, transparent background images cannot be tiled (repeated) using background-repeat or positioned using background-position, which rules out CSS rollovers based on background-position. I've only used it on one site, and now that I'm updating the site I can't remove ie8.js, because if I do, the entire layout breaks in IE. I don't believe I'll be using it in the future; instead I'll rely on simple CSS hacks or simply allow my sites to "degrade gracefully" in IE6.
I know there are some tools for fixing the transparent PNG problem that are more flexible than this. For instance, the jQuery plugin ifixpng2 supports background-position, which ie7-js doesn't.
As long as you are aware of exactly what it fixes, I would say go for it. I'm not sure about this lib exactly, but some libs get very expensive if you have a large DOM, as they tend to hook HTC file-based behaviors onto EVERY DOM element. This causes the dreaded "Loading x of y" status bar message to flash constantly on the initial load and on any newly generated DOM content.
Well, it's beautiful and works great; you can use CSS features like li:hover. We did lots of projects recently using ie8.js and it works great.
