I'm using Vue CLI PWA workbox plugin mode.
My app has more than 1,000 assets, and precaching them all results in very bad performance. Please check it out:
https://nikudu.com/
Is there a way to precache files more specifically?
For example, precache files by URL:
On URL x/y, precache only files 1, 5, 6; on URL x/v, precache files 7, 8, 2.
I have worked with PWAs and Angular, so no experience with Vue. That said, I would suggest precaching only your app shell (that is, the minimal static assets/JS that you want to present to the user when the app is accessed offline).
For the other images you can take a lazy approach and cache them only after they have been requested once (though I don't know whether this goes against your requirements).
From this point you could even expand your logic to determine which routes are used most, and preload only those images in the background once the user lands on the app.
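To make this concrete, here is a minimal sketch of that idea for Vue CLI's PWA plugin in GenerateSW mode (the exclude patterns, folder names, and cache names below are made up; adjust them to your asset layout): precache only the shell, and hand the images to a lazy runtime cache.

// vue.config.js
module.exports = {
  pwa: {
    workboxPluginMode: 'GenerateSW',
    workboxOptions: {
      // Keep heavy image folders out of the precache manifest entirely.
      exclude: [/\.map$/, /^img\//],
      runtimeCaching: [
        {
          // Cache images lazily, only after each one is first requested.
          urlPattern: /\.(?:png|jpe?g|webp|svg)$/,
          handler: 'CacheFirst',
          options: {
            cacheName: 'images',
            expiration: { maxEntries: 200, maxAgeSeconds: 30 * 24 * 60 * 60 },
          },
        },
      ],
    },
  },
};

With this, only the handful of shell files are downloaded up front; everything else enters the cache on first use.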
If you are interested, I wrote an article about Service Workers and caching strategies that is independent of Angular, so you can use the same concepts in your app too.
Related
Next.js documentation says:
"Only assets that are in the public directory at build time will be served by Next.js. Files added at runtime won't be available. We recommend using a third party service like AWS S3 for persistent file storage."
I have trouble understanding how you would work with a site that has dynamic content. Say I operate a merchant site: do I have to rebuild the site each time I add a product, or must I use a CDN for my images?
It sounds like there should be an easier way, no?
I have a page which I would like to use as an offline fallback.
The problem is that this page has ~10 JS/CSS resources and ~20 images.
It's silly to list each one in the precache array by hand.
It's also hard to maintain: any resource change in the HTML template will affect sw.js.
Is there an option to precache an .html page together with all of its resources (JS/CSS/PNG)?
No.
You would most likely want your build scripts to include the static assets in the precache array in your SW, rather than adding them by hand.
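For example (this tooling postdates the question's sw-toolbox era, but the idea is the same), workbox-build can generate the precache list from globs at build time; the paths here are placeholders:

// build-sw.js
const { generateSW } = require('workbox-build');

generateSW({
  globDirectory: 'dist/',
  // Pick up the fallback page and every asset it references:
  globPatterns: ['offline.html', 'js/*.js', 'css/*.css', 'img/*.png'],
  swDest: 'dist/sw.js',
}).then(({ count, size }) => {
  console.log(`Precached ${count} files, ${size} bytes.`);
});

Re-running this on every build keeps sw.js in sync with whatever the HTML template references.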
I have been using Google's method for some time now and it has never failed me. sw-toolbox didn't give me the results I wanted.
I'm working on a site where we are using the slide function from jquery-ui.
The Google-hosted minified version of jquery-ui weighs 63KB - this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously, if a user has already cached the Google-hosted version it's a no-brainer; but if they haven't, it will take longer to load than if I just lumped the custom jQuery UI slide function into my main.js file.
I guess it comes down to how many other sites use jQuery UI (if this were plain jQuery, the above would be a no-brainer, since loads of sites use jQuery, but I'm unsure how widely jQuery UI is used)...
I can't work out what the best thing to do is in the above scenario.
I'd say if the custom selective build is that small, both absolutely and relatively, there's good reason to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: request/response communication or, in the case of a cache hit, fetching from cache. Keep in mind that, CDN or not, the communication only affects the first page load. If your site is built in the traditional "full page request" style (as opposed to SPAs and the like), this becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: The memory usage depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, having a smaller resource is advantageous in ways beyond simple loading. Moreover, a request for such a small resource is negligible in terms of communication. You wouldn't think twice about it had it been a mini version of the company logo somewhere at the bottom of the screen where nobody even notices.
As a side note and potential optimization, if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, so your users only make a single request, again making this advantageous.
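As a rough sketch of that bundling step (the file names are invented), even a plain Node build script will do:

// concat.js - run at build time with Node
const fs = require('fs');

const parts = [
  'vendor/jquery-ui.custom.min.js', // the 14KB selective build
  'js/our-widgets.js',              // proprietary code
  'js/main.js',
];

// Join with ';' so minified files can't run into each other.
const bundle = parts.map((f) => fs.readFileSync(f, 'utf8')).join(';\n');
fs.writeFileSync('dist/bundle.js', bundle);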
Go with the Google hosted version
It is likely that the user has recently visited another website that loads jQuery UI from Google's servers, so it may already be cached.
It will take load off your server and make other elements load faster.
Browsers only load a fixed number of resources concurrently from one domain. Loading jQuery UI from Google's servers means it is downloaded in parallel with the resources that reside on your own servers.
The Yahoo developer network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can be sure of reduced latency; plus, as mentioned, the user has most likely already picked it up from some other website that loads it from Google. Also, the thing I always care about: it saves bandwidth.
I am developing a site that can be broken down into a handful of main pages. These pages can be thought of as isolated from each other, except that they share session data (i.e. session ID and logged-in username).
Initially, I was going to build the site as an SPA using ng-view (i.e., make the pages into AngularJS views). But I don't see any benefit in implementing my site that way, and it would require extra time and effort to make it support SEO (Making AJAX Applications Crawlable).
Going with an approach that provides no benefits and even creates extra work doesn't seem too smart. So I thought to myself: why don't I make the main pages of my site into individual AngularJS apps? The parts of the site that need to be indexed by search engines are simply the initial screens of some of those apps, so I wouldn't need to do extra work for SEO. (Note: the initial screens are rendered by the Django server with data for search engines to crawl, so they are non-blank.)
Each app may or may not have its own set of partials, depending on its requirements.
Example:
mydomain.com/item_page/1234 (load "item" app)
mydomain.com/dashboard (load "dashboard" app)
mydomain.com/account (load "account" app and default to "tab_1" view)
mydomain.com/account#tab_1 (load "tab_1" view of "account" app)
mydomain.com/account#tab_2 (load "tab_2" view of "account" app)
mydomain.com/post_item (load "post" app)
This is just my own idea, and I haven't seen any examples of sites composed of multiple AngularJS apps. I would like to know:
Is the multiple-AngularJS-apps-for-one-site approach feasible? What caveats should I be aware of? Are there any example sites out in the wild taking this approach?
If feasible, how do I share the session data between the apps?
Note this post is about multiple AngularJS apps for one site, not multiple AngularJS apps on the same page.
There is nothing wrong with such an approach, as long as you keep the downloaded JS small enough and ensure good caching. One example of such an application is GitHub. When you go to the Issues page on GitHub, it loads an HTML page, the common GitHub JS libraries, and page-specific JS code. Navigation and actions inside the page are handled by that single page-specific script. If you go to another section (like Code), a new page with new page-specific JS code is loaded. Another example is the Amazon AWS console; they even use different frameworks for different pages. (Neither GitHub nor Amazon uses Angular, but this approach works for any JS-based framework, even GWT.)
As for sharing session data between pages, you can embed that info directly in the page itself, using inline scripts or hidden elements; i.e. when your server generates the page, it also renders some session information into it. Another approach is to download the session data once and store it in local storage/session storage.
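As a sketch of the first approach (Django template syntax, since the question mentions Django; window.SESSION and the app name are made-up examples):

<!-- rendered by the server into every page -->
<script>
  window.SESSION = {
    id: "{{ request.session.session_key }}",
    username: "{{ user.username }}"
  };
</script>

// inside each AngularJS app, after the module is defined,
// expose the embedded data as an injectable:
angular.module('accountApp').constant('session', window.SESSION);

Each app then injects 'session' wherever it needs the shared data, with no cross-app plumbing.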
I am creating a mobile singlepage web app using jquery mobile. The webapp includes a number of javascript files and a number of css files. I have written a deploy script that concatenates and minifies js and css files, and now I am wondering whether I should inline the concatenated js and css directly in the HTML file - please note that I am talking about a singlepage app here (I know that this would be a bad idea in a traditional web 1.0 app with dynamically generated HTML). I am also using appcache/manifest file to cache the singlepage app so that subsequent access to the web app will be served from the cache, so it is the initial load time that is my primary concern.
When I inline everything (jquery, jquery mobile etc.), my 7kb HTML file increases to 350kb (100kb zipped) but now everything can be loaded in a single request.
But am I missing some other benefits such as parallel downloading of js files - and would it therefore be better to not inline the css and js, but instead just concatenate all js and css to a single js file and a single css file and then fetch each of them in separate requests?
Are there any limits regarding file size that I should be aware of? Maybe caching in network routers works better with smaller file sizes or whatever?
So my question boils down to whether it is a good idea to inline everything when making singlepage mobile web apps?
How much should be concatenated and how much should be inlined varies depending on a number of conditions; the final answer is that you should do A/B testing and find what works best for you. From what you describe, I recommend you definitely NOT inline 350KB of CSS & JS. If you do, then any change to the HTML, JS, or CSS requires downloading the entire payload again.
Instead, compartmentalize those changes and forced updates by keeping HTML, JS, & CSS as separate requests. You could do dynamic inlining to make the first response fast but leverage (app or localStorage) cache for subsequent requests, though that gets complicated when coupled with app cache (because the HTML doc is saved to app cache).
Otherwise, just keep them separate, save each resource to app cache, and update individual resources as needed.
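A minimal sketch of that dynamic-inlining idea (the pattern basket.js later popularized; the function and key names are invented):

function loadScript(url, key) {
  function inject(js) {
    var s = document.createElement('script');
    s.text = js;
    document.head.appendChild(s);
  }
  var cached = localStorage.getItem(key);
  if (cached) {
    inject(cached); // cache hit: no network request at all
  } else {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.onload = function () {
      localStorage.setItem(key, xhr.responseText);
      inject(xhr.responseText);
    };
    xhr.send();
  }
}

loadScript('/js/app.min.js', 'app-js-v1'); // bump the key to invalidate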
I would not recommend inlining everything into the HTML if your web app can be accessed from different URLs with different query strings.
Example :
http://webapp.com/?fb_token=fdsf
http://webapp.com/?referer=bla
http://webapp.com/?tracker=toto
Each of those will add a master copy to the appcache (you can verify this by looking at chrome://appcache-internals/ in Chrome). You then risk reaching the appcache quota limit. Furthermore, on an appcache update, the browser goes through its appcache entry list and requests a fresh copy of each entry.
A good compromise that I am following for mobile devices & appcache is to keep the HTML tiny and then have one big CSS file (containing base64-inlined images) and one JS file.
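For illustration, the manifest for that layout stays tiny (file names are hypothetical):

CACHE MANIFEST
# v1: bump this line to force an update
css/app.css
js/app.js

NETWORK:
*

The single HTML page then references it with <html manifest="app.appcache">, and only the two bundled files are stored per master entry.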
FYI, there is currently a quota limit on the total size of all the resources listed in your appcache, on the order of 5 MB to 25 MB (25 MB being the new iOS 6 limit).
-seb