Alternate libraries folder locations for Processing (in installation folder?)

I'm using Processing 2.1.1 on Linux. I am aware that the docs say:
From "Processing 1.0 - Processing Discourse - where's the libraries folder?":
Contributed libraries must be [...] placed within the "libraries" folder of your Processing sketchbook
And from "How to Install a Contributed Library - Processing", under "Manual Install":
Contributed libraries may be downloaded separately and manually placed within the libraries folder of your Processing sketchbook.
The thing is - I enjoy running Processing directly from its unzipped folder (without any installation as such), and I actually keep it on a different partition from my "main" partition (which holds my home folder, ~, where ~/sketchbook is created by default on Linux). Thus, when I install a new library, it ends up on a different partition from the main program - and I'd like to keep the libraries and the Processing program together.
Is there any way I could achieve this?

Well, it turns out there is: there is a folder, processing-2.1.1/modes/java/libraries/ - and all of the libraries in ~/sketchbook/libraries can be moved there!
Just note that those libraries, once moved, will no longer show as "contributed" (neither in "Sketch/Import library..." nor in "File/Examples").
Also, if installing from the web via "Sketch/Import library.../Add library...", the installation is likely going to complete in ~/sketchbook/libraries - so afterwards one would have to close Processing, move the library folders from ~/sketchbook/libraries to processing-2.1.1/modes/java/libraries/, and then start Processing again.
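For reference, the layout after the move looks roughly like this (the library name below is just a placeholder for whatever contributed libraries you have; the inner folders follow the usual Processing library convention):

processing-2.1.1/
  modes/
    java/
      libraries/
        someLibrary/
          library/
            someLibrary.jar
          examples/
          reference/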
Hope this helps someone,
Cheers!

Related

Why doesn't the PDW copy some files when updating an existing installation?

I have a fairly large application (~750k LOC) that I distribute using the Package and Deployment Wizard. I fully understand that it would be nice to migrate to .NET (that ain't happening - see the code size above), and that the PDW is deeply flawed. However, for the most part I've made it work well for my end users, by customizing the Setup1 application, writing a menu-driven wrapper for the Setup application, and by running it in silent mode. (Note that the problem I'm about to describe occurred even before I started using silent mode.)
The issue I'm having is that my application requires quite a few auxiliary files, which I've added to the PDW project in the "Included files" section. When a user does a clean installation (either from scratch, or after un-installing a previous installation), everything works fine. However, if they simply run the installer to update the existing installation, the executable file and any OCXs I've updated get copied over the previous versions just fine, but my auxiliary files don't - I have to have the user manually delete them, and then the Setup1 program will re-install them as it should.
I've checked the Setup.lst file, and all of the files are listed there, with their current date stamps. In fact, in my "BuildAll.bat" file, I do the Windows equivalent of a "touch" (copy /b "TheFile.dat" +,,) to force the date stamp to be current. However, if the file exists on the target machine, it won't be overwritten even though it's older. There are no errors reported, either visibly or in the .LOG file (which is required when using the silent option).
A couple of additional points: Some of the auxiliary files are themselves VB6 applications - just the .exe files. Those do get copied correctly if they're newer than the existing files. Other than being files with internal versioning information, there's no difference between them and the other auxiliary files (which are things like media files, or text-based .txt or .dat files).
So, what's going on, and how do I fix it (besides moving to Inno or some other solution that won't work for me...)? Thanks in advance for any help!
~~
Mark Moulding

What does yarn --pnp do?

There is this new shining Yarn feature called Plug'n'Play.
I would like to know what exactly it does.
I know it's creating a .pnp folder and a .pnp.js file, but does it change anything else on the machine, like a config file somewhere?
Thank you.
I designed and implemented PnP, so I can talk for hours about it 🙂
tl;dr: We only write the .pnp.js file and the .pnp folder (on top of the regular Yarn cache). We don't store configuration anywhere else.
Without Plug'n'Play
When you run yarn install (even without PnP), a few things happen:
If you use the offline mirror feature, we download the tarballs from the registry and store them within the offline mirror folder
Regardless of whether or not you use the offline mirror, we unpack all the tarballs downloaded and store their files in the Yarn cache
We then figure out which files from the cache should be copied into which location inside node_modules
We apply the computed changes (a bunch of rsync operations, basically - see the sketch below)
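As a mental model only (a simplified sketch, not Yarn's actual code; the PlannedCopy shape is invented for illustration), the last two steps amount to something like:

import * as fs from "fs";
import * as path from "path";

// Hypothetical shape: for each resolved package, where its unpacked copy lives in the
// cache and where the hoisting logic decided it should land inside node_modules.
interface PlannedCopy {
  cacheLocation: string;    // e.g. "~/.cache/yarn/v6/npm-lodash-1.0.0"
  installLocation: string;  // e.g. "<project>/node_modules/lodash"
}

// Apply the computed plan by copying each package folder into place.
function applyPlan(plan: PlannedCopy[]): void {
  for (const step of plan) {
    fs.mkdirSync(path.dirname(step.installLocation), { recursive: true });
    fs.cpSync(step.cacheLocation, step.installLocation, { recursive: true });
  }
}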
With Plug'n'Play
With PnP, the workflow becomes like this:
No change: we still download the tarballs from the registry into the offline mirror (if enabled)
No change: we still unpack them into the Yarn cache
We generate a .pnp.js file¹
And that's it. There is no other generated file than the .pnp.js file (and the cache, but it already was there before).
¹ As you mentioned, we also generate a .pnp folder (.yarn as of Yarn 2) in the project. This folder is meant to contain two types of data:
Unplugged packages are packages that must be local to the project. Typically, those are the packages with postinstall scripts (we cannot store them in the cache, as the generated artifacts might differ from one project to another).
Virtual packages, which are symlinks created for each package in your dependency tree that lists peer dependencies. Without going into the details, they are a necessary part of the design, and are required to make require.resolve work as before. Those files don't exist anymore as of Yarn 2 🎉
How does it work?
The .pnp.js file contains information similar to the following:
webpack#1.0.0 -> /cache/webpack-1.0.0/
-> it depends on lodash#1.0.0
lodash#1.0.0 -> /cache/lodash-1.0.0/
-> no dependencies
With this information, the resolution can correctly infer that when a file within /cache/webpack-1.0.0 makes a require call to lodash, then the required files must be loaded from /cache/lodash-1.0.0. It's a bit more complex in practice (we keep an inverse map for improved performance, we use relative paths to ensure portability, etc.), but the general concept is there.
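To make the idea concrete, here is a minimal sketch of that lookup (this is not Yarn's actual data structure or API - the names and shapes are invented for illustration):

// A locator is a package name plus its exact resolved version, e.g. "webpack#1.0.0".
type Locator = string;

interface PackageInfo {
  packageLocation: string;             // where the package's files live (here, inside the cache)
  dependencies: Map<string, Locator>;  // dependency name -> exact locator it resolves to
}

const packageRegistry = new Map<Locator, PackageInfo>([
  ["webpack#1.0.0", {
    packageLocation: "/cache/webpack-1.0.0/",
    dependencies: new Map([["lodash", "lodash#1.0.0"]]),
  }],
  ["lodash#1.0.0", {
    packageLocation: "/cache/lodash-1.0.0/",
    dependencies: new Map(),
  }],
]);

// Resolve a require("lodash") issued from a file that belongs to webpack#1.0.0.
function resolveRequest(issuerLocator: Locator, request: string): string {
  const issuer = packageRegistry.get(issuerLocator);
  if (!issuer) throw new Error(`Unknown package ${issuerLocator}`);
  const target = issuer.dependencies.get(request);
  if (!target) throw new Error(`${issuerLocator} does not list ${request} as a dependency`);
  return packageRegistry.get(target)!.packageLocation;
}

console.log(resolveRequest("webpack#1.0.0", "lodash")); // -> "/cache/lodash-1.0.0/"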
Bonus round: With Plug'n'Play+Zip loading (Yarn 2)
Bonus: With Yarn 2, we're about to improve this workflow even more. This is what it will look like:
We download the tarballs from the registry and store them in the cache (no more distinction between offline mirror and cache - they are the same)
We generate the same .pnp.js file as before
And that's it! As you can see we don't unpack the packages anymore (instead, we use a Node loader to read them from the package archives at runtime).
Doing this has a very interesting property: if both your cache and .pnp.js files are there, you don't need to run yarn install for your application to work! And to ensure you have those files, you just have to add them to your repository and version them as you would with everything else.²
It's very useful, as you don't need to remember to run yarn install after git rebase, git pull, or git checkout, and your CI systems become faster and more stable as they don't need special setup - just clone your application and it'll just work.
² Before someone mentions it - checking in binary files within a repository is perfectly fine. The reason why node_modules was a very bad thing to check into your repository was the sheer number of text files, which put a huge strain on Git - technically, but also philosophically, as code reviews were made impossible.
In the case I described we don't suffer from the same problem, because the number of files is constrained (exactly one file for each package), and reviewing them is very easy - in fact, it's better in that you can clearly see how many new packages are added to your project by a PR!
It imports only the parts of a package you are going to use, making the bloated node_modules folder much, much leaner.
Think, for example, about having relatively big libraries like lodash or ramda when you use only 4-5 functions from them - how much you could save by pulling in only the minimum that is actually used.
I believe it is not yet 100% fully stable, but still a nice option to keep on your radar :)

VB6 Registration - DEP file

I have an app that I am moving to another server. It is complaining that it is missing TABCTL32.OCX. I have located this file on another server and I want to copy and paste it across.
I have discovered that there is also a file called TABCTL32.DEP on the server I am moving from. Do I have to copy both files across or is the dependency file optional?
I have tried it with and without the DEP. The app works in both cases. It is a production server, so I want to be sure.
Those .DEP (dependency) files are instructions about a library meant to be used by packaging tools. These files have no run-time significance and contain only development metadata. They are plain text files.
They contain the preferred ("designed") location to install the library, sub-dependencies of the library including optional localization "satellite" resource DLLs, version information, etc.
See articles such as INFO: How Setup Wizard and PDW Use Dependency Files.
This is information a packager should use along with other "rules databases" such as VB6DEP.ini. Programmers are also supposed to create them if they expect other developers to use their libraries.
If you are using an "impaired" 3rd party packaging technology that is ignorant of .DEP files it is up to you to read them and incorporate the information they contain in your build process. You are also responsible as a developer to keep your dev machine's .DEP files and VB6DEP.ini file up to date, since they often are not updated by Microsoft anymore.
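For a rough idea of what such a file contains, a .DEP file is an INI-style text file along these lines (an illustrative fragment only - the actual keys and values must come from the real TABCTL32.DEP on your machine):

[TabCtl32.ocx]
Dest=$(WinSysPath)
Register=$(DLLSelfRegister)
Uses1=MSVBVM60.DLL
Uses2=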
You can't just copy files willy-nilly from one machine to another. Go find this program's installer and run it on the new machine.
A .dep file is a file used by the Visual Basic Setup Wizard to determine what dependencies your OCX file has. You can open the file with Windows Notepad to view the contents.
Unless you are using the Visual Basic Package and Deploy Wizard, you can ignore this file.
For more info, see INFO: How Setup Wizard and PDW Use Dependency Files

How to prevent Installshield from removing files?

I am developing a package using InstallShield 2008 Premier Edition; the project type is an InstallScript MSI project.
The problem I am facing: during installation I install some files to C:\Program Files\Company\SystemFiles, and from that location I copy a set of DLLs and OCX files into the System32 folder. The copy into System32 is done using InstallScript.
Because of this, during uninstallation the copied files get removed from System32, and other applications that depend on the same set of DLLs stop working.
I resorted to InstallScript for copying the files from Program Files to System32, rather than using the built-in options, because the built-in approach caused an issue during upgrades; using InstallScript avoids that.
I have also tried several workarounds, such as setting file attributes (e.g. FILE_ATTR_SYSTEM, which sets the system attribute) after copying to System32 using InstallScript, but the files still get removed during uninstallation.
Is there a way to mark the files as PERMANENT or SHARED? Will this help, and if so, how can I set it using InstallScript?
I have two ideas:
1) I think you can use the SHARED option, as this won't remove the files during uninstallation.
2) When I faced a similar issue, I put all the required files in the installation directory itself, so that on uninstall only the installed files get removed. (I know this is not the best solution.)
(Note: I worked on InstallShield some 6 years back, so I only remember certain things.)
You can also disable logging from InstallScript. This will make the installer "forget" that it installed specific files, file groups, or features.
You should make sure to enable logging once again after you have copied the files that you want to permanently leave on the system.
If you don't remember to enable logging after you have disabled it, your uninstall process may not work correctly.
Syntax is as follows:
Disable(LOGGING);
//Add code to copy your permanent files here
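// For example (illustrative sketch only - adjust file names and folders to your setup):
//   SRCDIR = TARGETDIR ^ "SystemFiles";
//   XCopyFile("*.dll", WINSYSDIR, COMP_NORMAL);
//   XCopyFile("*.ocx", WINSYSDIR, COMP_NORMAL);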
Enable(LOGGING);
For InstallScript projects:
To prevent the files in a particular Component from being removed during uninstall:
1-Select the Components view from within the Organization folder.
2-Select the component that contains the files you do not wish to remove during uninstall.
3-Change the "Uninstall" property in the right pane to a value of "No."
For MSI Projects:
To prevent the files in a particular Component from being removed during uninstall:
1-Select the Components view from within the Organization folder.
2-Select the component that contains the files you do not wish to remove during uninstall.
3-Change the "Permanent" property in the right pane to a value of "Yes".
I see this is an old question, but I just came across it. This seems to be a common problem. One good solution is to stage the files to a private directory (usually under Program Files) and then have a custom action do the copy and registration (OCX, etc.). InstallShield remembers what it copied, so it tends to remove those files; do not disturb anything else like logging (my recommendation). Set conditions on the custom action so that it doesn't run during uninstall.
Some time back I also did another, weirder implementation that only programmers tend to attempt: I packed the files as resources and wrote my own code to extract and deploy them (the kind of thing a tool like Process Explorer does). There were certain use cases that warranted this kind of implementation. But again, it is complicated and obviously reinvents the wheel. Unless you are good with C/C++ and the Windows API, this would be difficult. I would still suggest you stay away from this kind of implementation, because it is also considered "virus-like" behavior. That said, so far I have never gotten warnings from anti-malware products.

Which Qt DLLs should I copy to make my program stand-alone?

I'm trying to make a distribution directory with my application. I've copied several Qt DLLs to that directory, and the program seems to be working, with one exception: it doesn't seem to find the SQL plugin for SQLite. Copying qtsqlite.dll to the directory doesn't allow my application to open or create SQLite files. What must the directory structure be, or which additional files need to be copied, so that the program can read the database?
You can use depends.exe to see exactly what the dependencies of your exe are and make sure they're all included.
Also, read this page about Qt plugins. They are supposed to be in a specific directory called "plugins" and not in the main directory with all the other DLLs.
Most probably, qtsqlite.dll itself depends on the original SQLite DLLs, which you probably need to copy as well.
Don't forget to include a copy of the LGPL license in your distribution, as well as pointers to the original download resources of the libs you include and their sources - to stay within the law :-)
Thanks to the link #shoosh provided, I was able to fix the problem. I needed to create a sqldrivers subdirectory in the distribution dir, with the qsqlite.dll library inside. But that was just step one. Do you have any tips and resources on creating a full-blown Windows installer? I'm mainly a Linux programmer, so this area is unknown to me.
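For anyone hitting the same issue, the resulting layout looks roughly like this (assuming Qt 4 style DLL names; the exact file names depend on your Qt version and build, and only the Qt modules your application actually uses are needed):

myapp/
  myapp.exe
  QtCore4.dll
  QtGui4.dll
  QtSql4.dll
  sqldrivers/
    qsqlite.dll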
