Installing an Electron app is too slow because of native dependencies that need installing on the end user's PC - Windows

I have an electron app with 2 package.json files.
The root/package.json has all the devDependencies, and root/app/package.json has all the dependencies necessary for the app to run.
So I package the app folder using electron-packager, then build a Windows installer using Inno Setup.
But when I install the app, the installer is very slow, because the node_modules folder in app contains so many dependencies to extract.
Other apps take 3-10s to install, but mine takes 25-35s.
So what should I do about this? Maybe I can bundle the JS using webpack before packaging?
Thanks.

You should absolutely use something like webpack (or an equivalent bundler) to bundle your application. Webpack does an excellent job of tree-shaking your dependencies and keeping only the modules that are actually needed; a minimal hedged sketch follows.
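For reference, here is a minimal sketch of a webpack config for bundling an Electron main process, so the full node_modules tree no longer has to ship inside the installer. The entry point, output path, and module names are assumptions, not your actual layout:

    // webpack.config.js - hedged sketch; paths and module names are placeholders
    const path = require('path');

    module.exports = {
      mode: 'production',          // enables tree-shaking and minification
      target: 'electron-main',     // leave Electron/Node built-ins unbundled
      entry: './app/main.js',      // hypothetical main-process entry point
      output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'main.bundle.js',
      },
      externals: {
        // native modules cannot be bundled; keep them in app/node_modules
        'some-native-module': 'commonjs some-native-module',
      },
      node: {
        __dirname: false,          // preserve Node path semantics at runtime
        __filename: false,
      },
    };

After bundling, electron-packager only needs to pick up the bundle plus the few genuinely native modules, which is what shrinks the install-time extraction.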
I have already posted a possible solution for Electron projects, including a build-process approach that leads up to building the installer. My particular recommendation leaned on using WiX for MSI deployment, but the build-process steps (1-6) are still applicable for anyone wanting to understand a possible process (even if you use another installer). Hope this helps:
https://stackoverflow.com/a/46474978/3946706

Are you packaging a web app into Electron? The slow install time is probably because you are bundling the web app's node modules into the Electron app, which is not necessary.
https://medium.com/@hellobharadwaj/electron-plus-angular-react-why-use-2-different-package-json-files-361ae47d07f3
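To illustrate the two-package.json split that the article describes, here is a hedged sketch; all package names and versions are placeholders:

root/package.json (build tooling only, never shipped to users):

    {
      "name": "my-app-build",
      "devDependencies": {
        "electron": "^1.8.0",
        "electron-packager": "^12.0.0",
        "webpack": "^4.0.0"
      }
    }

root/app/package.json (only what must exist at runtime):

    {
      "name": "my-app",
      "main": "main.bundle.js",
      "dependencies": {
        "some-native-module": "^2.0.0"
      }
    }

Anything webpack can bundle moves up into the root devDependencies, so only genuinely native modules remain in app/node_modules for the installer to extract.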

Related

JavaFX Native Packaging EXE on a Mac

I am using Eclipse on Mac to develop my JavaFX application. I have packaged it as a dmg very nicely with the ant build and e(fx)clipse plugin.
However, I now need to make this application an exe. Every tutorial and guide I have found so far shows that you need Inno Setup; however, that program is only available for Windows and I am on a Mac.
How should I go about this?
Any help is appreciated!
It is not possible, as documented on the official Oracle website:
https://docs.oracle.com/javase/8/docs/technotes/guides/deploy/self-contained-packaging.html
Self-contained application packages have the following drawbacks:
Package per target platform: Self-contained application packages are platform-specific and can only be produced for the same system on which you build. To deliver self-contained application packages on Windows, Linux, and OS X, you must build your project on all three platforms.
Creating native bundles/launchers is tied to internal tools that call locally installed toolsets, so producing any EXE file from a Mac will never work. An option would be to install a Windows system inside a virtual machine.
Some notes about creating 32-bit packages on 64-bit systems and vice versa: it is tricky and generally not possible, at least on Windows systems. I encountered this while debugging an issue in the javafx-maven-plugin (disclaimer: I am the maintainer of that Maven plugin).

Deploying a pre-built package to Appharbor

I couldn't find any information on AppHarbor's website about the possibility of deploying pre-built ASP.NET (MVC) applications. Does anyone know if that's doable?
Another question I have is whether AppHarbor's build process supports projects that launch an executable (node.exe in this case) that's included in a solution folder as part of a custom build step.
If you're worried about precompilation, that's something AppHarbor does out of the box. If you push a repository without a solution file, we won't build it, but will just deploy the contents (see the part about pushing with no solution file).
You should also be able to run node.exe as part of the build, as long as all dependencies (incl. node.exe) are in the repository.
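As a hedged illustration of such a custom build step (the target name, paths, and script are assumptions), an MSBuild target in the project file could invoke a node.exe that is committed to the repository:

    <!-- Hypothetical MSBuild target: runs the node.exe checked into the solution folder -->
    <Target Name="RunNodeBuildStep" BeforeTargets="Build">
      <Exec Command="&quot;$(SolutionDir)tools\node.exe&quot; &quot;$(SolutionDir)tools\build.js&quot;" />
    </Target>

Because the path is relative to the solution, the same step works locally and on the build server, as long as node.exe is in the repository.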

Jenkins + Cmake + JIRA = CI of multiple interdependent projects?

We have a number of small projects within our system running on Linux (Slackware 7-11, slowly migrating to RHEL 6.0). Around 50-100 applications and 15-20 libraries. Almost all our applications use one or more of our libraries. Our source tree looks something like this:
/app1
/app2
/app3
/include
/foo/app4
/foo/app5
/foo/app6
/foo/lib1
/foo/lib2
/lib/lib3
/lib/lib4
/lib/include
Now, I've done some work creating some CMakeLists.txt files and have built most of the libs and some of the apps. I'm fairly comfortable with using CMake to build. I did this with v2.6, and I recently (an hour ago) upgraded to 2.8. Each of the above projects has its own CMakeLists.txt file specific to the project to do building and installation (no packaging, yet).
I have a requirement to make use of and enforce continuous integration. I've installed and played around with Jenkins, and from what I've seen I'm very impressed. I'm also evaluating JIRA to do our issue tracking.
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do? Would it be better to tell CMake to look in each library's source directory, use the export interface, or the recently introduced ExternalProject_Add?
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Thank you in advance
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do?
No, it is not a bad thing to do, but ideally your build should reproduce its resources from scratch. Portability and fixing build bugs become an issue if things need to be pre-installed on the system outside of the build process. If you are able to do it one of the other ways you mentioned, I would suggest that; but if it makes your build that much longer, it is something you need to feel out. My ideology is that everything should be movable to a new Jenkins machine with a fresh install at the drop of a hat. Again, this isn't always achievable, but it is something to strive for. A sketch of the ExternalProject_Add route is below.
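For the ExternalProject_Add option mentioned in the question, here is a minimal hedged sketch for one app; the relative path and library name are placeholders, not the actual tree:

    # Hedged sketch: build lib3 from its source tree as part of app1's build,
    # instead of relying on a copy pre-installed under /usr/local.
    include(ExternalProject)

    ExternalProject_Add(lib3_external
      SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../lib/lib3      # path is an assumption
      CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps
    )

    include_directories(${CMAKE_BINARY_DIR}/deps/include)

    add_executable(app1 main.cpp)
    add_dependencies(app1 lib3_external)                      # build the library first
    target_link_libraries(app1 ${CMAKE_BINARY_DIR}/deps/lib/liblib3.a)

This keeps each Jenkins job self-contained: the job's workspace produces everything it links against, at the cost of rebuilding the library per job.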
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Well, one of the things I do with interdependent jobs is have the successful build of one job trigger the jobs that depend on it. So, for example, if B depends on A and A fails, B will never run, and whoever caused the breakage in build A is responsible for it, and so on. This prevents a cascading effect of broken builds that were all caused by a single broken dependency. I would suggest that you keep the files produced by a particular build in its job folder and point each dependent job at the location of the files it requires. Again, keep your builds separate and clean. A sketch of such a trigger is below.
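As a hedged illustration (the job names are assumptions, and classic freestyle jobs achieve the same effect with the "Build other projects" post-build action), a modern Jenkinsfile for a library job might trigger its dependents only on success:

    // Hypothetical Jenkinsfile for a lib3 job
    pipeline {
      agent any
      stages {
        stage('Build') {
          steps {
            sh 'mkdir -p build && cd build && cmake .. && make'
          }
        }
      }
      post {
        success {
          // dependents run only if the library built cleanly
          build job: 'app1', wait: false
          build job: 'app2', wait: false
        }
      }
    }

If the library job fails, the downstream app jobs never start, so a single dependency break does not show up as dozens of red builds.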
I'm also evaluating JIRA to do our issue tracking.
I highly recommend JIRA as an issue-tracking system for a company; you might want to look at this Jenkins plugin for integration. If you're using git and you don't mind hosting your code off-site, I would give GitHub Issues a shot as well.
Good luck; you seem to be on the right track.

How to deploy multiple projects in a single MSI?

I have 3 projects in my solution that I want to deploy. Is there a nice and quick way of using Visual Studio's setup projects to deploy all three apps using one MSI and letting the user decide which apps he wants to install during the install process?
I have setup projects for the 3 individual apps, I also have an overarching setup project that has the output of those other three projects. Am I on the right track or is there a better way?
I think you probably want merge modules. According to MSDN:
A merge module is a standard feature of Microsoft Windows Installer that packages components together with any related files, resources, registry entries, and setup logic. You can use merge modules to install components that multiple applications share. You cannot install merge modules directly. You must merge them into deployment projects.
http://support.microsoft.com/kb/827025
In your case, each application would be a merge module and you would need to provide some UI to select which applications you would like to install. You could modify one of the default page templates to do that.
If you use WiX (which I suggest), you break each project down into its components; each project is then represented as a Feature in WiX/MSI, which you can install conditionally. The standard tree dialog that installers show for selecting features is based on this, and the WiX examples include a ready-made UI that uses it (a sketch follows the links below).
As for merge modules, the lead developer of WiX was involved in the early creation of the merge module specs, and he now recommends using .wixlibs instead. See here:
WiX v3 Docs
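As a hedged sketch of the Feature approach (the feature names, component IDs, and per-project grouping are placeholders), each app becomes a user-selectable feature in the .wxs:

    <!-- Hypothetical fragment: one selectable Feature per project -->
    <Feature Id="App1Feature" Title="Application 1" Level="1">
      <ComponentRef Id="App1Exe" />
    </Feature>
    <Feature Id="App2Feature" Title="Application 2" Level="1">
      <ComponentRef Id="App2Exe" />
    </Feature>
    <Feature Id="App3Feature" Title="Application 3" Level="1">
      <ComponentRef Id="App3Exe" />
    </Feature>

    <!-- Ready-made feature-selection tree dialog shipped with WiX -->
    <UIRef Id="WixUI_FeatureTree" />

Setting Level="1" makes a feature installed by default; the WixUI_FeatureTree dialog then lets the user toggle each one at install time.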
I also have a similar requirement; however, I used merge modules and can't seem to find a way of selecting which specific MSMs to install and which not to. As I understand it, there is no condition property that can be set on MSMs while integrating them with MSIs. Please let me know if there is some alternate way of doing so.
Thanks,
Apn
You can use WiX, as I've posted here:
VS 2005 Setup Projects: Deploy Many Projects With One MSI

In continuous integration what is the best way to deal with external application dependencies

While using our TeamCity continuous integration server, we have uncovered some issues that we are unsure how best to handle, namely: how to provide the external applications that our application requires on the CI server.
This was initially uncovered with a dependency on Crystal Reports, so we installed Crystal Reports on the server, fixing the immediate problem. However, as we move more applications over to the CI server, we are finding more dependencies.
What is the best strategy here? Is it to continue installing the required applications on the server?
Thanks
Where possible make the external dependencies part of your build system.
For instance, check the installer into your version control system and have a build step that checks it out and runs it in silent mode (many installers support a mode with no user interaction, often via a command-line switch such as /s).
This way, if you need to set up another build machine for a branch or just for new hardware, everything is repeatable. A hedged sketch is below.
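As a hedged sketch (the path and package name are placeholders, and the right silent switch depends on the installer technology), a Windows build step for an MSI-based dependency could be:

    rem %CHECKOUT_DIR% stands in for the agent's checkout directory
    rem /qn = fully silent (no UI), /norestart = suppress any reboot
    msiexec /i "%CHECKOUT_DIR%\deps\CrystalReportsRuntime.msi" /qn /norestart

Non-MSI installers usually offer an equivalent switch (/s, /S, /quiet, or similar); the installer's documentation is the authority here.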
If your builds require the actual application to complete the build, then you should probably continue to install the application on your build server.
If you just need references to DLLs or assemblies from the application, then what we've done at my company is to create installable 'SDKs' of the references required for a particular application, and install them on our development and build machines in well-known library directories that our solutions reference.
On the build machine, our pre-build steps install the correct version of the dependencies and then clean them up when we are finished.
Recently, we've moved to using virtual machines for our build machines, which our build process activates. These VMs get the SDKs installed on them as a pre-build step and are then restored to their snapshot state after the build. We had some dependencies that were almost impossible to uninstall, so this made for a clean starting point each time.
If you use Maven to build, you can define your dependencies in the pom.xml file. They will then be automatically downloaded if necessary.
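For example (the groupId, artifactId, and version here are placeholders), a dependency declared in pom.xml is resolved and cached on the build agent automatically:

    <!-- Hypothetical dependency entry in pom.xml -->
    <dependencies>
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>external-lib</artifactId>
        <version>1.2.3</version>
      </dependency>
    </dependencies>

This works for library-style dependencies; applications that must be installed on the machine (like Crystal Reports) still need one of the approaches above.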
I am not sure if I followed correctly...
I am assuming your application depends on this external app while building? In that case, it should be on the machine doing the CI...
