How can I stop Dashing from updating widgets outside of viewport - ruby

I have set up a home-automation dashboard using Shopify/Dashing as a base.
Dashing uses Gridster.js to lay out the large number of widgets I have in columns and pages. The pages are switched with Dashing.cycleDashboards(). The widgets are bound to an EventSource socket with Batman.js (installed as a Ruby gem, which gets its data from OpenHAB through MQTT).
Because of the outdated browser the dashboard runs on, I've chosen not to create multiple dashboards, and have instead split the widgets across Gridster divs as described. This is what my question is about:
Although only one Gridster div is displayed at a time, all of them are being updated in the DOM. Because the browser is so unstable, this takes up a lot of memory and the browser crashes.
I'd like to stop Dashing from updating the hidden Gridster divs. Of course the div should get updated after switching to it. (After the switch I would like to execute state updates per widget with jQuery XHR calls.)
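Roughly, I imagine gating the event dispatch on visibility, something like this untested sketch (the data-id attribute is how Dashing tags widget divs; the actual dispatch code in the stock application.coffee may differ):

// Untested sketch: only forward events to widgets whose Gridster container
// is currently visible; hidden ones get refreshed via XHR after switching.
var source = new EventSource('events');
source.addEventListener('message', function (e) {
    var data = JSON.parse(e.data);
    var $widget = $('[data-id="' + data.id + '"]');
    if ($widget.closest('.gridster').is(':visible')) {
        // ... hand `data` to the widget as Dashing normally would ...
    }
});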
Is the above possible? And if so, please point me in the right direction. What file(s) do I need to change? Sample code would be great.
Thanks in advance.
Michael

I did not fix this issue. I have rewritten the complete HTML structure to accomplish what I needed. Closing this thread!

Related

How do big sites prevent the loading circle on tabs from showing?

Okay, I do not know how to explain this well. It may be just my internet, or maybe my site is slower, or maybe they really do have a technique for this.
If you visit Facebook, Reddit, YouTube, or Twitter and click on links or perform any actions on those websites, the URL changes but the browser tab doesn't show any loading circle.
How do they do that?
I am pretty sure my website is fast enough, and at times it loads even faster than the bigger sites, but mine shows the loading circle on the browser tab.
Okay, so I found the answer. Here is the technique for changing the URL without reloading the page.
Updating address bar with new URL without hash or reloading the page
How do I modify the URL without reloading the page?
I am still trying to figure out, though, how to navigate to the actual page without reloading the entire page. I am guessing they are loading it via AJAX or something similar upon URL change. I'll update this once I figure it out.
Edit: I am currently working on this feature for my site. The technique is to use AJAX to load the content based on the URL. I'll update this thread more as I update my site with this feature.
Edit 2: Damn, you will probably face the same problem I had trying to detect the URL change without using onhashchange. If so, here you go:
How to detect URL change in JavaScript
This literally took me 4 hours just to figure that one out.....
Edit 3: I have now integrated this feature on my site. You can check it at
Grandweb
It is quite simple, but there is a lot of work in appending the content once it is retrieved via AJAX. So here is the process:
I am using pushState() to change the URL without reloading the page.
var url = $(this).attr('href');
// Strip the domain so only the path gets pushed into the address bar.
var new_url = url.replace('https://grandweb.net/', '');
// e.g. this ends up pushing "/write" without a reload.
window.history.pushState("object or string", "Title", '/' + new_url);
I then had to trigger the first function using 'mouseup' to retrieve the content via AJAX, and then listen for subsequent onpopstate() events for the next ones, because some mouse buttons such as Mouse 4 or Mouse 5 are bound to the browser's Back and Forward buttons and are not caught via 'mouseup'.
$(window).on('mouseup', function (evt) {
    get_content();
});

window.onpopstate = function (event) {
    get_content();
};
The first one is responsible for triggering the function on the first try, because onpopstate only fires once the browser's history has been populated via the History API.
Using mouseup was a bad idea; basically, don't use it unless you really want to detect mouse actions from anywhere on the document.
I instead use the anchor tags/links to trigger the first function for retrieving content.
example:
<a class="dynamic_btn" href="website.com/post">Home</a>
then
$(document).on('click', '.dynamic_btn', function (e) {
    e.preventDefault();
    get_content();
});
Using onhashchange is possible IF you have hashes in your URL. I do not use hashes in my URLs, so onhashchange is useless in my use case, unless there is something I am missing.
After retrieving the contents, I append them by creating DOM elements inside existing containers on the page.
This is much easier to do if you are only planning to change a few elements or containers in your pages. If you plan on doing this to change a full page layout, good luck. It's doable, but it's a real pain in the *ss.
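For reference, get_content() boils down to something like this (a simplified sketch; it assumes the server returns just the content fragment when asked with an ajax=1 parameter, and that #content is the container being swapped):

// Simplified sketch of get_content(): fetch the fragment for the current
// path and swap it into the container. The ajax=1 parameter and #content
// are assumptions about the server/markup, not universal names.
function get_content() {
    var path = window.location.pathname; // e.g. "/write" after pushState
    $.get(path, { ajax: 1 }, function (data) {
        $('#content').empty().append(data);
    });
}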
Upon observing Facebook, I learned that they do not implement this technique on all of their links/features. That makes sense, because it is harder to maintain, especially since most of the work is being done client side. It is very nice though, because the page doesn't visibly reload.
I have implemented it on a few 'essential' functions of my website, such as viewing posts and returning to the homepage. I could implement it on the whole site, but I am still deciding on that. That is all, thank you very much for reading, internet stranger.

jQuery AJAX Load Method - Delay

I'll admit that I'm pretty new to web development (I've only been coding for about a year) and especially green when it comes to JS / jQuery.
A specific web page I've built loads different data based on hovering over certain categories: country clubs, resorts, hotels, etc. When I built the site on my local machine, the javascript function was super quick. However, on the live site, it has a long delay before the data swap happens.
The URL is: http://preferredparkingsolutions.com/client_list.html
Which links to a javascript function at: http://preferredparkingsolutions.com/scripts/clientHover.js
Which replaces the display div (#client_list) by pulling data from a text file.
Is there a better / faster way of doing this?
Yes, this could be optimised by loading the content up-front and caching it. Currently you are making an HTTP request for each and every hover - even if the user has hovered over that element before, since the AJAX responses aren't being cached. Fixing this would be your quickest win.
However, I can't see any case at all for having the content live externally. Is there any reason you're against having the content physically in the page and just using show/hide methods? There are various benefits to this - SEO, for one thing, since Google will find the content.
This is the external page you are loading: http://preferredparkingsolutions.com/client_list.inc.html. The content is small and looks like a static page, so why not just load everything up-front and then hide and show the divs? As Utkanos suggested, you would also get the SEO benefit, and you would avoid an HTTP request for each and every hover. If you still want to load it externally, at least load it once, cache it, and use the cached version to hide and show the divs - a sketch of that follows.
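For illustration, a hedged sketch of that load-once-and-cache approach (the #client_list container and the .inc.html URL come from the question; the data-file hookup on .category elements is an assumption):

// Cache each AJAX response by URL so repeated hovers never re-fetch.
var cache = {};

function loadInto(url, $target) {
    if (cache[url]) {
        $target.html(cache[url]); // served from cache, no HTTP request
        return;
    }
    $.get(url, function (data) {
        cache[url] = data;
        $target.html(data);
    });
}

// Hypothetical hookup: each category element names its content file.
$('.category').on('mouseenter', function () {
    loadInto($(this).data('file'), $('#client_list'));
});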

Should I load an entire html page with AJAX?

My designer thought it was a good idea to create a transition between different pages. Essentially only the content part will reload (header and footer stay intact), and only the content div should have a transitional effect (a fade or some such). Creating this sort of effect isn't really the problem; making Google (and Analytics) happy is...
Solutions I didn't like, and why:
Load only the content div with AJAX: Google won't see any content, meaning the site will never be found - or only the parts which are retrieved by AJAX will be, and those aren't full pages at all.
Show the transitional effect, then 'redirect' the user to the designated page (capturing the click event of <a> elements): the effect is pretty much the same as just linking to another page, i.e. the user will still see a page being reloaded.
I thought of one possible solution:
When a visitor clicks a link, capture the event, load the target with ajax, show the transitional effect in the meantime, then just rewrite the entire document with the content fetched with the ajax request.
At least this will work and has some advantages: the page reload will look seamless no matter how slow your internet connection is; Google won't really mind, because the AJAX content is a full HTML page itself and can be crawled as-is; and even non-JavaScript browsers (mobile phones et al.) won't mind - they just reload the page.
My hesitation about implementing this method is that I would be reloading an entire page using AJAX. I'm wondering if this is what AJAX is meant to do, and whether it would slow things down. Most of all, is there a better solution, e.g. my first 'bad' solution but slightly different so that Google (and Analytics) would like it?
Thanks for your thoughts on this!
Short answer: I would not recommend loading an entire page in this manner.
Long answer: not recommended. Whilst possible, this is not really the intent of XHR/AJAX. Essentially what you're doing is replicating the native behaviour of the browser. Some of the problems you'll encounter:
Support for the Back/Forward buttons. You'll need a URI # scheme to solve this.
The browser must parse the entire page through AJAX. This'll slow things down. E.g. if you load a block of HTML into the browser, then replace the DOM with it, only then will any scripts, CSS or images contained therein begin downloading.
Memory - the browser's not changing pages. Over time (depending on the browser), I'd expect the memory usage to increase.
Accessibility. Screen readers will need to be notified whenever the page content is updated. Might not be a concern for you, but worth mentioning.
Caching. The browser would not know which page to cache (beyond the initial load).
Separation of concerns - your View is essentially broken into server-side pieces that render the page's content, the static HTML for the page framework, and lastly the JS to combine the server piece with the browser piece. This'll make maintenance over time problematic and complex.
Integration with other components - you're already seeing problems with Google Analytics. You may encounter issues with other components related to the timing of when the DOM is constructed.
Whether it's worth it for the page transition effect is your call but I hope I've answered your question.
You can have AJAX and SEO: see Google's proposal.
I think you can learn something from Gmail's design.
This may be a bit strange, but I have an idea for this.
Prepare your pages to load with an 'iframe' GET parameter.
When the 'iframe' parameter is present, load the page with some JavaScript that triggers the parent's show_iframe_content().
When it is absent, just load the page normally, with a hidden iframe element called 'preloader'.
Your goal is to make sure every one of your links opens in the 'preloader' with the additional 'iframe' GET parameter; when the iframe finishes loading and calls show_iframe_content(), you copy the content into the parent page.
Like this: Link clicked -> transition to loading phase -> iframe loaded -> show_iframe_content() called -> iframe content copied back to parent -> transition back to normal phase
The whole thing works because if a crawler visits any of your pages, it does so without the 'iframe' GET parameter and can go through all your pages as normal, but in a browser your links do the magic above.
This is just a sketch of it, but I'm sure it can be made right.
EDIT: Actually you can do it with plain AJAX, without the iframe; the point is that you modify the page after it has loaded in the browser so that the linked content is loaded with AJAX. Crawlers still see the plain links.
Example script:
$.fn.initLinks = function () {
    $("a", this).click(function () {
        var url = $(this).attr("href");
        // transition to loading phase ...
        // The ajax=1 POST parameter tells the site to send only the content,
        // without the header/footer.
        $.post(url, "ajax=1", function (data) {
            // Re-run initLinks on the new content so its links work too.
            $("#content").html(data).initLinks();
            // transition back to normal phase ...
        });
        return false;
    });
};

$(function () {
    $("body").initLinks();
});
Google Analytics can track JavaScript events as if they were pageviews - check here for the implementation:
http://www.google.com/support/googleanalytics/bin/answer.py?hl=en-GB&answer=55521
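For example, with the classic ga.js async snippet that article describes, a virtual pageview can be reported right after the AJAX content swap (a sketch; the path argument is whatever was pushed with pushState):

// After swapping in the AJAX content, report a virtual pageview
// (classic ga.js async API).
_gaq.push(['_trackPageview', window.location.pathname]);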

AJAX load time - Host and Server issue?

I'm having an issue with slow AJAX calls. This is a common question, but I've done everything suggested in all the research I can find. I'm hoping to get a consensus from people who read this.
Basically, I make an ajax request to a php page, which gets info from a database.
Here is the page.
I've timed all of my JavaScript, MySQL, and PHP scripts, requests, and pages.
(If you run Firebug you can see my time markers in the console, as well as in the XML.)
As an example -
The MySQL request takes 20ms.
The PHP page takes 50ms.
The AJAX success script, which processes the small amount of XML (less than 1K) and generates the markers, takes 8ms to run.
Yet, loading the page takes nearly 4 seconds.
So, assuming none of my scripts are lagging, this has to be a problem with the response time from the server, or my own internet connection, right?
I'd appreciate any theories or thoughts.
Thank you
OK, I looked at your page and here are some of the issues I saw that would affect speed:
It takes 4ms to get your data in your getMarkers function, but it takes 892ms to read the XML file. I would recommend falling back to vanilla JavaScript to read your XML file, as the number of .find() calls you are performing is really hurting performance here (see the first sketch after this list).
Minify and combine all of the scripts that are local to your server. I was getting some really high response times. You can eliminate 4 HTTP requests by doing this, which, given the response times on your server, will help a bit. (Note: don't combine jQuery or jQuery UI into this.)
Since your server is a bit slow (not your fault - this will vary, as you are probably on shared hosting), I would recommend linking jQuery and jQuery UI to the Google CDN hosted versions. Here is a post on that: Jquery CDN
You have 24 images on your page, 23 of which are under 4KB. Combine those 23 into one CSS sprite image, assign a 1px x 1px blank GIF as the inline HTML image, and use the CSS sprite image instead. Here is a good article on the technique if you are unfamiliar: CSS Sprites explained. Also, here is a good online generator: CSS Sprite generator
Make sure you actually need jQuery UI for this page. I didn't see anything that would have required it. If you can remove it, you save yourself 206K. Remember to remove the associated CSS file if it isn't needed; that would save you another 2 calls.
I didn't dig too deep, but if you are not doing so already, kick off the call that sets up the Google map inside $(document).ready(), so the rest of your page can load while you display a loading animation in that area (see the second sketch after this list). That way users know something is happening and your page will appear to load a lot quicker.
So you can greatly speed things up by doing the above. You would go from 82 components down to 51 local plus 2 more on the Google CDN. If you can improve that XML read time, you can shave nearly another second off the load time as well.
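On the first point above, a hedged sketch of what reading the XML with vanilla JavaScript could look like (the marker/lat/lng element names and the markers.php URL are hypothetical; substitute your real ones):

// Parse the XHR's responseXML with plain DOM calls instead of running a
// jQuery .find() per node. Element names here are hypothetical.
function parseMarkers(xml) {
    var markers = [];
    var nodes = xml.getElementsByTagName('marker');
    for (var i = 0; i < nodes.length; i++) {
        markers.push({
            lat: nodes[i].getElementsByTagName('lat')[0].firstChild.nodeValue,
            lng: nodes[i].getElementsByTagName('lng')[0].firstChild.nodeValue
        });
    }
    return markers;
}

$.ajax({
    url: 'markers.php', // hypothetical endpoint returning XML
    dataType: 'xml',
    success: function (xml) {
        var markers = parseMarkers(xml);
        // ... create the map markers from the parsed data ...
    }
});

And on the last point, a sketch of deferring the map setup (initializeMap() and #map are hypothetical stand-ins for your own names):

// Build the map only once the DOM is ready, showing a placeholder in the
// meantime. The map library replaces the container's contents when it renders.
$(document).ready(function () {
    $('#map').html('<p class="loading">Loading map...</p>');
    initializeMap(); // stand-in for whatever call builds your map
});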

Why does my website need so much time to render?

When cached, my starting page only needs to load one element (the "root document") - but then it needs some time until it's rendered completely:
[Screenshot of the Firebug Net panel: http://www.walkner.biz/_temp/firebug_net.png]
The elements that follow are loaded asynchronously via JavaScript.
Two questions:
Why does it take so "long" from loading the root document until the DOMContentLoaded event?
Does it make sense to load some not-so-important things asynchronously? Is it important to have the DOMContentLoaded event fire as early as possible? Unfortunately there's not much documentation about that event, but I don't think it's the moment when the page is displayed, is it?
I'm not sure YSlow is gonna help him as that will download all elements for a page and run performance tests on them, whereas swalkner's problem is how long it is taking to render the HTML page itself when all other elements (images, CSS, etc) are cached.
At least that's what I think he's saying.
In the original question you said, "The elements that follow are loaded asynchronously via JavaScript," but then listed nothing. What is loaded?
I would suggest checking for JavaScript errors in the first instance. Then try removing some of your asynchronous loading calls one by one until you hit the bottleneck. In fact, remove them all: how long does the downloaded HTML take to render? Take that time and work from there.
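One quick way to put a number on that gap is a sketch like this, using the Navigation Timing API in browsers that expose it:

// Measure the stretch between the end of the HTML response and the
// DOMContentLoaded event - the "long" part the question asks about.
document.addEventListener('DOMContentLoaded', function () {
    var t = window.performance && performance.timing;
    if (t) {
        console.log('response end -> DOMContentLoaded: ' +
            (t.domContentLoadedEventStart - t.responseEnd) + ' ms');
    }
});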
Is your HTML document very big? Does it use lots of inline styles that could be in the CSS file?
Perhaps if you posted a link to the site then people would have a look at it.
