Tealium integration

I am looking at improving my project's page load and I see that both utag.js and utag.sync.js are loaded.
Are both JavaScript files, utag.js and utag.sync.js, needed for Tealium integration?
If both are needed, then why? What is the purpose of, and the difference between, these two JS files?
Thanks,
Sri

utag.js and utag.sync.js are there for slightly different reasons. Essentially, utag.js will be needed as this is the main JavaScript file that Tealium generates. It's generally responsible for looking after your Data Layer, evaluating load rules, running extensions, loading/executing your tags, etc. All that sweet, sweet Tealium iQ functionality comes from there.
With all this in mind, the utag.js file is pretty big. So, in order to not slow down the page load, Tealium recommends that you load utag.js asynchronously in the <body> (see here).
However, there are a few situations where your vendor code needs to run at the very beginning of the page load. Examples of this include A/B testing tags (Optimizely, Adobe Target, etc.), and occasionally some content-changing tags need to be loaded early in order to avoid flickering.
For these situations, Tealium provides utag.sync.js: a separate file from utag.js that you can load synchronously in the <head>. You can then paste in the code for any tags that require very early loading, to be sure that they will be loaded in time. You can read more about how to use utag.sync.js here.
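For reference, the two includes typically end up looking something like this (the ACCOUNT/PROFILE/ENV parts of the URLs are placeholders for your own Tealium profile, and the async snippet below is only a sketch of the standard pattern - always copy the exact install code that Tealium iQ generates for you):

    <!-- In the <head>: utag.sync.js, loaded synchronously so it runs before rendering starts -->
    <script src="//tags.tiqcdn.com/utag/ACCOUNT/PROFILE/ENV/utag.sync.js"></script>

    <!-- Near the bottom of the <body>: utag.js, injected asynchronously so it never blocks rendering -->
    <script type="text/javascript">
      (function (a, b, c, d) {
        a = '//tags.tiqcdn.com/utag/ACCOUNT/PROFILE/ENV/utag.js';
        b = document; c = 'script'; d = b.createElement(c);
        d.src = a; d.type = 'text/java' + c; d.async = true;
        a = b.getElementsByTagName(c)[0]; a.parentNode.insertBefore(d, a);
      })();
    </script>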
But yes, long story short: utag.js is the main Tealium JS file, responsible for the main functionality. utag.sync.js is then an additional file used for certain situations/bits of code that wouldn't work as required if implemented in utag.js.

Related

Use Google-hosted jQuery UI or self-host a custom download of jQuery UI?

I'm working on a site where we are using the slide function from jquery-ui.
The Google-hosted minified version of jquery-ui weighs 63KB - this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously, if a user has cached the Google-hosted version it's a no-brainer, but if they haven't, it will take longer to load than if I just lump the custom jQuery UI slide function into my main.js file.
I guess it comes down to how many other sites use jQuery UI (if this were just for normal jQuery, the above would be a no-brainer, as loads of sites use jQuery, but I'm a bit unsure about how widely jQuery UI is used)...
I can't work out what the best thing to do is in the above scenario.
I'd say that if the custom selective build is that small, both absolutely and relatively, there are good reasons to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: Request/response communication or, in the case of a cache hit, fetching from cache. Keep in mind that, CDN or not, the communication only affects the first page. If your site is built in a traditional "full page request" style (as opposed to SPAs and the like), this literally becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: The memory usage depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, having a smaller resource is advantageous in ways beyond simple loading. More so, a request for such a small resource is negligible in terms of communication. You wouldn't even think twice about it had it been a mini version of the company logo somewhere on the bottom of the screen where nobody even notices.
As a side note and potential optimization, if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, and your users will only have a single request, again making this advantageous.
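As a sketch of that idea, a naive build step could simply concatenate the jQuery UI subset with your own scripts so they ship as one request (file names here are made up for illustration; in practice you would also minify the result):

    // build.js - naive bundling sketch: concatenate the jQuery UI subset
    // and the proprietary scripts into a single file served to the page.
    const fs = require('fs');

    const files = [
      'vendor/jquery-ui.slide.min.js',   // the 14KB custom build
      'src/proprietary-lib.js',
      'src/main.js'
    ];

    const bundle = files.map(f => fs.readFileSync(f, 'utf8')).join(';\n');
    fs.writeFileSync('dist/bundle.js', bundle);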
Go with the Google hosted version
It is likely that the user will have recently visited a website that loads jQuery UI from Google's servers.
It will take load off your server and make other elements load faster.
Browsers only download a limited number of resources from one domain concurrently. Loading jQuery UI from Google's servers makes sure it is downloaded in parallel with the other resources that reside on your servers. A typical include, with a self-hosted fallback, is sketched below.
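For example, an include from the Google Hosted Libraries CDN with a self-hosted fallback might look like this (the version number and the local fallback path are only illustrative, and jQuery itself is assumed to be loaded already):

    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.10.4/jquery-ui.min.js"></script>
    <script>
      // If the CDN copy failed to load, fall back to the self-hosted custom build.
      if (!window.jQuery || !jQuery.ui) {
        document.write('<script src="/js/jquery-ui.custom.min.js"><\/script>');
      }
    </script>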
The Yahoo developer network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can be sure of reduced latency, plus, as mentioned, the user has most likely already picked it up from some other website that uses the Google-hosted copy. Also, the thing I always care about: it saves bandwidth.

Meteor: Eliminate render-blocking JavaScript and CSS in above-the-fold content

How to "Eliminate render-blocking JavaScript and CSS in above-the-fold content" in Meteor?
The Truth™
After implementing a working solution for this problem, I'd say that the right™ answer to your question is: "No, that's just what you get for using such a complex javascript framework."
But it's still a fact that loading Meteor can take over a minute on slow networks. It is huge. That makes for an awful UX. So I think it would improve a Meteor app overall to have something like a loading screen.
I'm writing a package kriegslustig:altboiler (I'll update this as soon as I do the first "major" release).
Solution
I documented the solution that I'm using in altboiler in this repo. It got pretty long, so here's a summary:
Use WebApp.connectHandlers
Loop through WebApp.clientPrograms[WebApp.defaultArch].manifest
Get all the URLs inside via AJAX
Buffer and then compile them into one single script tag
Insert that script tag into the head
And then finally self destruct the loader script
That way you won't get that error on Google PageSpeed. A rough sketch of the client-side part of these steps is shown below.
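The sketch is not the actual altboiler code; the function name and the loader element id are made up for illustration:

    // Fetch every client-side JS file listed in the manifest via AJAX,
    // buffer the responses in order, inject them as one <script> element,
    // then remove the loader script itself.
    function loadBundledScripts(urls) {
      var buffered = new Array(urls.length);
      var remaining = urls.length;
      urls.forEach(function (url, i) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url);
        xhr.onload = function () {
          buffered[i] = xhr.responseText;           // keep the original order
          if (--remaining === 0) {
            var script = document.createElement('script');
            script.text = buffered.join(';\n');     // compile into a single script tag
            document.head.appendChild(script);      // insert it into the <head>
            var loader = document.getElementById('altboiler-loader');
            if (loader) loader.parentNode.removeChild(loader);  // self-destruct
          }
        };
        xhr.send();
      });
    }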
Potential
You could serve a loading screen first or you could also render the whole site without the parts that need a connection to the server.
Performance
I expected this to load Meteor a lot slower, but in my initial test Meteor actually loaded faster. My test wasn't exactly scientific though. I simply loaded it in the Chrome emulator and throttled the connection to 50kbps. Also, I did this on a dev instance, so it was uncompressed. The results are still somewhat relevant though:
Without altboiler: 1.7min
With altboiler: 2.8min
The ajax requests only perform better when there are a lot of requests made. So presumably the impact on a bundled instance could range from slightly worse to slightly better.
Downsides
It might interfere with the spiderable package, but I don't think so. I'll test it once I've written some tests for the package.

Optimizing load time while supporting JavaScript disabled browsers

I have a large background image that is a major part of my design, but is not essential, that I am planning to load via JavaScript after the rest of my page has loaded. I would like my users to be able to interact with my more important content while this large file loads.
I would like to support JavaScript disabled users as much as possible, while maintaining optimal load times for everyone else. Is there a way to force an element to load last when javascript is disabled? If not, what's your opinion - is it better to fully support non-javascript users or optimize my site for everyone else?
If you include a style block at the very bottom of your page (just inside the closing body tag) and set the background image from there, then it should load last, even for users with JavaScript disabled. Mike and claustrofob both make valid points.
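Something along these lines (the image path is just an example):

    <!-- ...all of the important page content... -->

      <style>
        /* Declared last, so the browser only discovers (and downloads) the
           big background image after the rest of the document is parsed. */
        body {
          background-image: url('/images/large-background.jpg');
        }
      </style>
    </body>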

AngularJS - does everything need to loaded on initial page load?

I am diving into JavaScript MVC with Angular and as I understand it, along with the initial shell page, all your scripts must be loaded on the initial page load. However, and correct me if I'm wrong, that would mean that a majority of your scripts being loaded could be entirely useless (i.e. you have view #1 showing and scripts for views #2 - #10 aren't needed yet)?
In my case, I have a fairly large web app, with a feed page, results page, product page, profile page, among others. It amounts to 10+ pages, and my current (the traditional) approach is loading scripts specific to each page on load. Now each page is a partial and I don't believe it's possible to load specific scripts with partials?
So, part of my question is whether my statements are accurate. The other part is whether or not my fears of suffering on initial page load are justified (especially for mobile devices, for instance).
I really got into Angular in hopes of cleaning up my JavaScript with the MVC approach and did not plan on taking advantage of it as a single-page application (I can forgo the use of routing different partials into my view, right?). But now I'm not sure. I just want to get a better understanding of how it works before making the leap.
Any help appreciated. Thanks!
Take a look into the AMD pattern with RequireJS (it works with any type of JS framework). There is a seed project with AngularJS + RequireJS.
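A minimal sketch of that setup (the paths, module names, and app name here are illustrative, not taken from the seed project):

    // main.js - the entry point referenced by <script data-main="main" src="require.js">
    require.config({
      paths: {
        angular: 'lib/angular.min'
      },
      shim: {
        // AngularJS 1.x is not an AMD module, so expose the global it creates
        angular: { exports: 'angular' }
      }
    });

    require(['angular', 'app'], function (angular) {
      // Bootstrap manually instead of using ng-app, once the modules have loaded;
      // view-specific modules can then be require()'d lazily as routes need them.
      angular.bootstrap(document, ['myApp']);
    });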

Website optimization / decrease loading time in asp.net mvc 3

I have built a test website using the open-source nopCommerce. Everything is working fine, but I need to know why my website's loading time is greater than 6 seconds. The homepage works fine, but the category pages take around 6-10 seconds when clicked. How can I check the HTTP requests and the calls to the DB so that I can track which function is taking a long time?
Test website is test website
Thanks
Things I would try, in this order:
MvcMiniProfiler.
Analyze my code for possible performance bottlenecks using a .NET profiler.
Finally, submit bugs to nopCommerce support if the previous approaches didn't yield anything fruitful that would implicate my own code.
In between, I might also check with my hosting provider whether they are the cause of the slowness.
As a quick and dirty check, you can add the time it takes to generate the response as a column in the IIS logs - that will give you some idea as to whether the server is being slow to serve the pages or you need to do some front-end optimisation work.
On the front-end side, the first thing you need to do is merge all the CSS files for a theme into one to save on round trips - the browser can't render the page until it has the CSS.
All the .js files you have in the head will also block the page; can you merge them and load them later?
The performance of imagegen.ashx looks on the slow side - do you need to generate the banners on the fly or could they be pre-generated?
If the back-end side of generating the page is slow, there are some scripts around the web to show which queries are using the most CPU, making the most IO ops etc.
Below is a list of things you can improve:
1. Combine your JS.
There are a few things you can use, for example JSMin; you can read this post: http://encosia.com/automatically-minify-and-combine-javascript-in-visual-studio/. However, JSMin doesn't seem to compress the combined JS.
Another option is jMerge: http://demo.lateralcode.com/jmerge/. It kind of does it after the fact, in the sense that you need to have the site up and running before combining the files with jMerge, since it only takes an HTTP link.
The best one I've seen so far is the bundling and minification feature of MVC 4. It's part of MVC 4; however, you can get a NuGet package for your MVC 3 app.
Word of advice: bundling every one of your JS files is not necessarily a good idea; it even backfires sometimes, since you end up with one big JS file that the browser has to download sequentially, instead of downloading several smaller ones in parallel (you might want to look into head.js to make JS downloads parallel - see the sketch after this list). So the trick here is to keep the balance. I ended up loading jQuery from the Google CDN and bundling the rest of my JS into one file.
2. Put JS at the bottom of the page so the browser doesn't have to load the JS before it starts to render the page. You need to be careful with this one, though: normally you will have jQuery functions doing stuff in document.ready() in the header of the page, so I advise moving those to the bottom of the page as well, if possible.
If you move the JS references and script blocks in your layout page to the bottom, then you will most likely run into problems with nested JS references and script blocks in your individual views. No worries: you then need to look into using @section (probably suitable for a discussion in another thread) in your views and rendering it in your layout page, so that the references and script blocks inside your views get rendered at the bottom of the page at run time.
3. Use a CDN.
Pretty straightforward.
4. Combine your CSS.
Combine the CSS files into one with the same kind of tool you use for combining JS, but reference it in the page header instead of at the bottom.
5. Enable static content caching, with something like the clientCache settings in your web.config file.
It won't help with the first-time load, but it will definitely make things a lot faster for returning users.
6. Enable URL compression.
Time to first load
This is one of the metrics used by webpagetest.org. But don't bang your head against this one too much, as it basically says how fast your web server can serve the content, so there's probably not much you can do here from the software end.
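As an illustration of the head.js idea mentioned in point 1 (the file names are placeholders; check the head.js documentation for the exact API before relying on this sketch):

    // head.js downloads the listed files in parallel but executes them in order,
    // so splitting into a few smaller bundles doesn't force sequential downloads.
    head.load('/Scripts/jquery.min.js',
              '/Scripts/site-bundle.js',
              '/Scripts/page-specific.js',
              function () {
                // all scripts have been loaded and executed
              });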
Hope that helps!
nopCommerce is deadly slow, and the developers don't seem to take the performance issues seriously. I have seen a lot of performance-related forum threads left unanswered. So, best of luck.
