Performance issues with jQuery Mobile app

I have a quite large app built using PhoneGap and jQuery Mobile. I have about 5 pages within the one HTML page (as per jQuery Mobile's navigation system), all sharing one main JS file, which has 3400 lines of JS code.
The app has run more slowly over the development period, and I was wondering if it would make a difference to split the app up into separate HTML files and have them loaded into the DOM?
And secondly, would it also be worth splitting the JavaScript up so that only the JavaScript needed by the current page is loaded with it?
That is, should I have the whole 3400 lines of code loaded on deviceready, or inject the needed JS along with each page?

Yep, as Nathan said, I think you already know the answer.
I'd definitely recommend splitting your application into separate HTML pages and JavaScript files, and loading the JavaScript only as and when you need it. I've heard people recommend a single-page architecture approach, although the difference there is that the HTML pages and JavaScript are injected and loaded as and when they are needed.
See part 4 of this PhoneGap tutorial for more information on properly implementing a single-page architecture.
It's not a good idea to load everything into memory at the start... it's an inefficient use of resources, as you can see from the slow performance.
And having separate HTML and JavaScript files should also be a lot easier to manage and understand as well.
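For what it's worth, here's a minimal sketch of that lazy-loading idea, using jQuery Mobile's pagecreate event. The js/<page id>.js naming convention is purely illustrative:

    // Load a page's script the first time that page is created.
    $(document).on('pagecreate', function (e) {
        var pageId = e.target.id;                  // e.g. "settings" -> js/settings.js
        if (!pageId) { return; }
        $.getScript('js/' + pageId + '.js')        // fetch only this page's code
            .fail(function () {
                // no page-specific script for this page; that's fine
            });
    });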

After loading too much data (~250 records in a listview) into the jQuery Mobile page DOM, the navigation system transitioned very slowly. I tried various tricks without success. Finally, I got acceptable navigation performance by applying the tricks below:
Remove/hide the populated data from the DOM when changing page.
When you come back to the page, repopulate it.
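A minimal sketch of that trick, assuming a page with id listPage holding the big listview with id bigList (both names illustrative), and a hypothetical populateList() that rebuilds the rows:

    // Drop the heavy rows before the transition animation runs.
    $(document).on('pagebeforehide', '#listPage', function () {
        $('#bigList').empty();
    });

    // Rebuild them once the user navigates back.
    $(document).on('pageshow', '#listPage', function () {
        populateList($('#bigList'));          // hypothetical: re-adds the <li> rows
        $('#bigList').listview('refresh');    // let jQuery Mobile re-style the new rows
    });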

SonarQube page extension with HTML possible?

I just read about extending SonarQube with custom pages. All the examples I have found consist only of JavaScript files.
Is it possible to have HTML files as well, or is it JavaScript only? If it is JavaScript only, then the whole feature is completely useless, as nobody wants to create all the HTML elements with JavaScript.
It's JavaScript only.
However, there's a guide on how to use React to create the HTML elements, which makes it a lot more bearable. Try it!
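For reference, a custom page is a JS file that calls window.registerExtension, as in the SonarQube docs. A minimal sketch with React, where the 'myplugin/my_page' key is illustrative and React/ReactDOM are assumed to be bundled with your plugin:

    window.registerExtension('myplugin/my_page', function (options) {
        // options.el is the DOM element SonarQube hands you to render into
        ReactDOM.render(
            React.createElement('h1', null, 'Hello from my custom page'),
            options.el
        );
        // return a cleanup callback for when the user leaves the page
        return function () {
            ReactDOM.unmountComponentAtNode(options.el);
        };
    });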

Microsoft Web Matrix

Pretty easy question, I hope: does anyone know of a tool that will effectively scrape sites built with Microsoft Web Matrix? I could write the code in Python, but it would take me longer than I want to dedicate to the task, mainly because of the really bad and ugly HTML generated by Matrix.
I have tried WebHarvey, Helium Scraper, and the Web Scraper plugin for Chrome. WebHarvey choked on the HTML and couldn't load subsequent pages. Helium Scraper was able to move from one details page to another (the Next links were followed), but content from within the details pages was not lifted out. The Chrome Web Scraper plugin was not able to navigate links, with the popup window displaying an error page. My gut tells me this has to do with ASP.NET-specific quirks, but I could be wrong.
Any pointers or suggestions appreciated.
You know there are two completely different versions of Microsoft Web Matrix, right? There's the one from 2003; I have no idea what its HTML looks like. Then there's the one from 2011 to current, which uses Razor .cshtml source files to produce its HTML. In the 2011+ one you write the HTML by hand; there's no drag and drop, so it's unlikely you'll get consistent HTML from site to site.

Web Scraping an Image

I was thinking about the applications of web scraping (I'm still quite new to it) and came up with a question. Can you get an image from a page that also contains advertisements, i.e. can you skip the advertisements and only pick out the correct image content on the page? Also, if the image is also a link to another page, can you then go to that page and get its image (and continue from there until you either reach a certain number of images or get all of them)? This would mean avoiding going to the advertisement pages.
Absolutely. If you use a tool like kimonolabs.com this can be relatively easy. You click the data that you want on the page, so instead of getting all images including advertisements, Kimono uses the CSS selectors of the data you clicked to know which data to scrape.
You can use Kimono to scrape data within links as well. It's actually a very common use. Here's a break-down of that strategy: https://help.kimonolabs.com/hc/en-us/articles/203438300-Source-URLs-to-crawl-from-another-kimono-API
This might be a helpful solution for you, especially if you're not a programmer because it doesn't require coding experience. It's a pretty powerful tool.
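If you do want to code it yourself, the underlying idea Kimono automates is just scoping the scrape to CSS selectors. A rough sketch in Node.js with the cheerio parser, where #content img and a.next are assumed selectors and the links are assumed to be absolute URLs:

    var request = require('request');
    var cheerio = require('cheerio');

    function scrapeImages(url) {
        request(url, function (err, res, html) {
            if (err) { return console.error(err); }
            var $ = cheerio.load(html);
            // Only look inside the main content area, so ad blocks are never touched.
            $('#content img').each(function () {
                console.log($(this).attr('src'));
            });
            // Follow the "next" link, if any, and repeat.
            var next = $('#content a.next').attr('href');
            if (next) { scrapeImages(next); }
        });
    }

    scrapeImages('http://example.com/gallery');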
If you are OK with PHP programming, then take a look at PHP Simple HTML DOM Parser. I have used it a lot and scraped a number of websites with it.

Moving from Flash to HTML5/CSS/etc.

Some years ago I decided I could bypass all the browser inconsistencies by producing sites entirely in Flash. That doesn't look like such a good decision now, so I'm re-writing my semi-CMS framework in PHP, JavaScript/jQuery and HTML. One aspect of my Flash sites that I am very pleased with is the ability to load all pages or states in the background, so the user rarely requests a page that isn't already loaded. When that does happen, I can display a progress bar.
With AJAX I can't display progress, but I also found a significant difference I hadn't anticipated. In Flash I load the .swf for page 1 completely, meaning everything including images etc., before starting to load page 2. With AJAX I can't see a way to do that: I can check that the HTML file itself has completed loading, but not that all its images have loaded, before loading the next HTML file. Is it possible?
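One possible approach (just a sketch, not taken from the answer below): fetch the next page's HTML with jQuery, park it hidden in the DOM, and wait for every <img> in it to finish before fetching the following page:

    function preloadPage(url, done) {
        $.get(url, function (html) {
            var $page = $('<div>').hide().append(html).appendTo('body');
            var imgs = $page.find('img').toArray();
            var remaining = imgs.length;
            if (remaining === 0) { return done($page); }
            imgs.forEach(function (img) {
                if (img.complete) {                        // already cached
                    if (--remaining === 0) { done($page); }
                } else {
                    img.onload = img.onerror = function () {
                        if (--remaining === 0) { done($page); }
                    };
                }
            });
        });
    }

Chaining the calls then gives the "page 1 fully loaded before page 2 starts" behaviour: preloadPage('page2.html', function () { preloadPage('page3.html', function () {}); });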
If you're looking for an HTML5 solution rather than just an AJAX solution, you might want to investigate the Application Cache. There is a progress event which you could hook into, though it possibly doesn't get into the level of detail you need. As far as I'm aware, resources will start downloading in the order they're listed in the manifest file.
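A minimal sketch of hooking that event, where showProgress is a hypothetical progress-bar helper and whether e.loaded/e.total are populated varies by browser:

    var cache = window.applicationCache;
    if (cache) {
        cache.addEventListener('progress', function (e) {
            if (e.lengthComputable) {
                showProgress(e.loaded / e.total);   // hypothetical UI helper
            }
        }, false);
        cache.addEventListener('cached', function () {
            showProgress(1);    // every resource in the manifest is downloaded
        }, false);
    }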

Alternative for Downloading Several Images to a Web Page

We all know that pre-fetching images can run slowly because browsers limit the number of concurrent HTTP connections per host, right? So, I have XHTML, jQuery, Apache httpd, and PHP at my disposal. What's an easy solution to pre-fetch a lot of images without using sprites or multiple hosts?
See, I have these themes one selects with a SELECT box. Selecting one changes the 200x200 theme image to the right of the box. Unfortunately there are about 150 of these, so when I load the page, the progress bar keeps running while they all download.
How can I get these images pre-fetching faster without using sprites or multiple hosts?
If it's just a theme change, which probably happens rarely (right?), then why wouldn't you just load the image for a theme when the select is changed and a new theme is chosen? It seems "strange" to load 150 images of which 149 may never be seen.
Correct me if I'm missing the point, and if so, can you provide a screenshot so I can get an idea of what you're really trying to show?
Hindsight is 20/20. I probably should have implemented it with sprites, as well as many of the buttons I used on the site. It's just that I lack a good sprite editor tool to speed that process up.
Anyway, the strategy I went with was to use JavaScript prefetch via jQuery. But even that wasn't enough. I had to wrap that function in a setTimeout(), but that only helped a little. I then had to fire that setTimeout() during the login form submit. It made the login form submission slightly longer, but it made the website appear snappy on load.
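In case it helps anyone, here's roughly what that looked like; the form id and URL array are illustrative:

    function prefetchThemeImages(urls) {
        $.each(urls, function (i, url) {
            $('<img/>').attr('src', url);   // creating the element is enough to warm the cache
        });
    }

    $('#loginForm').on('submit', function () {
        // defer so the prefetch doesn't block the submit handling itself
        setTimeout(function () {
            prefetchThemeImages(themeImageUrls);  // assumed array of the ~150 theme image URLs
        }, 0);
    });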
