Setting environment variable through NSIS Envar-plugin - windows

I have a custom installer created through NSIS.
I build it with the following Ant task:
<target name="buildNSIS">
<exec executable="D:\NSIS\nsis-binary\makensis.exe" failonerror="true" >
<!-- providing some nsis definitions -->
<arg value="/DPROJECT_NAME=${ant.project.name}"/>
<!-- passing the script -->
<arg value=".\installer\MySetup.nsi"/>
</exec>
</target>
where MySetup.nsi is the script to run through NSIS for the installer.
I want to set an environment variable as part of the install process.
I read that it's best to do this using the EnVar plug-in: https://nsis.sourceforge.io/EnVar_plug-in
However, the instructions there are confusing. They just say: "Just extract the contents to your nsis directory (usually '$PROGRAMFILES\NSIS')".
What does that mean?
My D:\NSIS\nsis-binary directory has the standard NSIS layout, including a Plugins subdirectory.
So do I unzip Envar_plugin.zip into that Plugins directory and then start using the EnVar::AddValue or EnVar::AddValueEx functions inside my MySetup.nsi, as shown in the EnVar plug-in examples?
How do I use the EnVar plug-in so that the installer built from MySetup.nsi sets environment variables while installing my software?

Plug-ins have to be installed in the correct plug-in subdirectory inside the NSIS folder. Some plug-ins only have a .DLL file in the root of the .ZIP file and some already have the correct directory tree in the .ZIP. This specific plug-in has the latter and you can just extract the contents to your main NSIS folder.
If you try to execute a plug-in command (name::function) and NSIS cannot find the plug-in then you most likely put the .DLL file in the wrong folder. Recent versions of NSIS will print a list of directories it tried to search when this happens.
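Once the DLLs sit in the right Plugins subfolders, using the plug-in from MySetup.nsi is just a matter of calling its functions inside a section. A minimal sketch, assuming $INSTDIR\bin is what you want on PATH (the variable name and value are illustrative, not taken from your script):

Section "Install"
  ; Target the machine-wide environment (HKLM) instead of the current user's (HKCU)
  EnVar::SetHKLM
  ; Append $INSTDIR\bin to PATH; the plug-in skips the append if the value is already present
  EnVar::AddValue "PATH" "$INSTDIR\bin"
  Pop $0 ; error code, 0 means success
  DetailPrint "EnVar::AddValue returned error code $0"
SectionEnd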

Related

Executing Sass with an Ant task in Eclipse: why do I have to set 'executable' to 'sass.bat' instead of 'sass'?

I have this Ant task to execute Sass in an Eclipse project:
<project basedir="." default="sass">
<target name="sass">
<apply dest="www/styles" executable="sass">
<srcfile/>
<targetfile/>
<fileset dir="styles" includes="*.scss"/>
<mapper from="*.scss" to="*.css" type="glob"/>
</apply>
</target>
</project>
It works fine in Ubuntu. In Windows 7 I have to set the executable as sass.bat.
This is the error:
Buildfile: D:\my_workspace\my_project\build.xml
sass:
BUILD FAILED
D:\my_workspace\my_project\build.xml:3: Execute failed: java.io.IOException:
Cannot run program "sass" (in directory "D:\my_workspace\my_project"):
CreateProcess error=2, The system cannot find the file specified
Total time: 326 milliseconds
Both sass and sass.bat can be invoked from the command line, since the Ruby/bin folder is on the system PATH.
I don't want to maintain two versions of this file for different operating systems.
How can I solve this?
[Not an answer, but too big for a comment.]
The Windows shell understands how to expand PATHEXT; Ant does not interpret it and only tries .exe, nothing else.
See the "Windows Users" note in the Ant exec task documentation:
The task delegates to Runtime.exec which in turn apparently
calls ::CreateProcess. It is the latter Win32 function that defines
the exact semantics of the call. In particular, if you do not put a
file extension on the executable, only ".EXE" files are looked for,
not ".COM", ".CMD" or other file types listed in the environment
variable PATHEXT. That is only used by the shell.
Note that .bat files cannot in general be executed directly. One
normally needs to execute the command shell executable cmd using the
/c switch.
A common problem is not having the executable on the PATH. In case you get an error message
Cannot run program "...":CreateProcess error=2. The system cannot find
the path specified. have a look at your PATH variable. Just type the
command directly on the command line and if Windows finds it, Ant
should do it too. (Otherwise ask on the user mailinglist for help.) If
Windows can not execute the program add the directory of the program
to the PATH (set PATH=%PATH%;dirOfProgram) or specify the absolute
path in the executable attribute in your buildfile.
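That is also why routing the call through cmd works without renaming anything. A hedged sketch for this particular apply task (untested; it assumes the osfamily attribute supported by exec/apply in recent Ant versions, so this variant only runs on Windows and is skipped elsewhere):

<apply dest="www/styles" executable="cmd" osfamily="windows">
    <arg value="/c"/>
    <arg value="sass.bat"/>
    <srcfile/>
    <targetfile/>
    <fileset dir="styles" includes="*.scss"/>
    <mapper from="*.scss" to="*.css" type="glob"/>
</apply>

Because the task is skipped on other systems, the conditional-property approach below is usually the cleaner cross-platform fix.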
I solved it by adding a conditional property, exec_file, whose value is sass.bat for the Windows OS family and sass otherwise.
<project basedir="." default="sass">
<condition property="exec_file" value="sass.bat" else="sass" >
<os family="windows" />
</condition>
<target name="sass">
<apply dest="www/styles" executable="${exec_file}">
<srcfile/>
<targetfile/>
<fileset dir="styles" includes="*.scss"/>
<mapper from="*.scss" to="*.css" type="glob"/>
</apply>
</target>
</project>

How to preserve an exec file in an Ant copy task

I have created a JavaFX bundle for Mac OS X using Ant. It creates a bundle with two files:
1. MyApplication.app
2. MyApplication.dmg
I wish to copy both files to another folder, so I wrote this command in my build.xml:
<copy todir="my_new_folder">
<fileset dir="old_folder\bundles"/>
</copy>
It copies both files successfully to "my_new_folder". But running the .app from "my_new_folder" does not launch my application, although it launches correctly from "old_folder".
On comparing the copied app, I found that the exec file (a Unix executable) inside the MacOS folder ("Show Package Contents/Contents/MacOS") is not preserved; its kind changes to a plain document file.
How can I keep its kind as a Unix executable, given that I am simply copying a directory?
Thanks,
Neelam Sharma
As noted in the ant copy task guide:
Unix Note: File permissions are not retained when files are copied; they end up with the default UMASK permissions instead. This is caused by the lack of any means to query or set file permissions in the current Java runtimes. If you need a permission-preserving copy function, use this instead:
<exec executable="cp" ... >
So, in your case, replace <copy> with:
<exec executable="cp">
<arg line="-R old_folder/bundles my_new_folder"/>
</exec>
(note that you should use forward slashes, even if this ant script is being used under Windows).
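If you want the copied files to end up directly inside my_new_folder, exactly as the original <copy> with a nested <fileset> did, one hedged variant is to pass each bundle path as a separate argument (file names taken from the question):

<exec executable="cp" failonerror="true">
    <arg value="-R"/>
    <arg value="old_folder/bundles/MyApplication.app"/>
    <arg value="old_folder/bundles/MyApplication.dmg"/>
    <arg value="my_new_folder"/>
</exec>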

How to configure where the NuGet.exe command line tool looks for packages

We have successfully set up a couple of local package repositories using the NuGet.Server package and hosted them on a local IIS web server. We can connect from the Package Manager and install packages without a problem, so these are working fine.
In order not to have to check in our packages folder, we have included the following command line in each project file that references NuGet packages. This works if NuGet.exe is on the path of the CI build agent.
However, I would like to move the source configuration from the command line in every project file and put it in just one place, preferably where other pesky developers can't change it ;)
<Target Name="BeforeBuild">
<Exec Command="nuget install $(ProjectDir)packages.config -s
http://domain:80/DataServices/Packages.svc/;
http://domain:81/DataServices/Packages.svc/
-o $(SolutionDir)packages" />
</Target>
Is there a better way?
Yes there is ;-)
Take a look at NuGetPowerTools. After running Install-Package NuGetPowerTools, it adds a .nuget folder to your $(SolutionDir) containing nuget.exe, nuget msbuild targets and settings (which you will need to check in).
After that, you simply run Enable-PackageRestore and it sets up msbuild targets into your visual studio project files which will make sure that packages will be fetched in a prebuild step, even on your build server, without checking in any packages. (don't forget to check in the .nuget folder though!).
This way, you simply manage the nuget package sources in a nuget msbuild settings file (in the .nuget folder) central to your solution, instead of in each project.
Cheers,
Xavier
I finally got NuGetPowerTools to install after the advice from digitaltrust on http://blog.davidebbo.com
Although NuGetPowerTools solved my problem, it was overkill for what I wanted. It requires that you check in to version control a .nuget folder that it creates in your solution root. The folder contains NuGet.exe, and a couple of target files. I don't like this as I think version control is for source code, not tools.
I came up with the following solution.
Save NuGet.exe to a folder on your local drive, both on dev and continuous integration machines. I chose C:\tools\nuget\
Add that filepath to the Path Environment Variable in all environments
On continuous integration machines, find %APPDATA%\NuGet\NuGet.Config and enter the following
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<packageSources>
<add key="LocalRepositoryName" value="http://Domain/DataServices/Packages.svc/" />
</packageSources>
You can add more than one entry to packageSources; NuGet will search them in the order in which they appear.
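For example, a config listing both repositories from my question might look like this (the key names are placeholders):

<packageSources>
    <add key="LocalRepositoryName" value="http://domain:80/DataServices/Packages.svc/" />
    <add key="SecondRepositoryName" value="http://domain:81/DataServices/Packages.svc/" />
</packageSources>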
The before-build code from my question can now be trimmed to the following:
<Target Name="BeforeBuild">
<Exec Command="nuget install $(ProjectDir)packages.config
-o $(SolutionDir)packages" />
</Target>
The end result of this is that whenever the approved repository location is changed, the config has to be changed in only one place rather than in every csproj file. Also, it is the continuous integration server administrators who determine that location, not the developers in their command line calls.

Checking a file out (TFS) for a pre-build action

I've added a pre-build action for an ASP.NET web control (server control) project, that runs jsmin.exe on a set of Javascript files. These output files are part of the source control tree and are embedded into the assembly.
The problem is that when the pre-build step runs, jsmin can't write the file because it is read-only. Is it possible to check the file out beforehand? Or am I forced to reset the file's attributes from the command line?
Any improved solution to the problem is welcome.
Update
One small issue with Mehmet's answer: you need to prepend the VS directory:
"$(DevEnvDir)tf" checkout /lock:none "$(ProjectDir)myfile"
If you're using Team Foundation Server, you can use team foundation command line utility (tf.exe) to check out the file(s) during pre-build and then check them back in during post-build. If you're using something else for source control, you can check if they have a command line tool like tf.exe.
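A hedged sketch of what the two build events might look like (the file path is an assumption, and tf.exe is reached via the $(DevEnvDir) prefix mentioned in the update above):

rem Pre-build event: check the minified file out so jsmin can overwrite it
"$(DevEnvDir)tf" checkout /lock:none "$(ProjectDir)Scripts\myfile.js"

rem Post-build event: check it back in without prompting
"$(DevEnvDir)tf" checkin /comment:"Minified by build" /noprompt "$(ProjectDir)Scripts\myfile.js"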
If you do not want to check the files in as part of the build (which you normally wouldn't for this sort of thing), then I would simply set the attributes of the .js files before running jsmin on them. The easiest way of setting the files read-writable is to use the Attrib task provided by the MSBuild community extensions. The same community extensions also provide a JSCompress task for easily calling JSMin from MSBuild.
Therefore you'd have something like the following (not tested):
<Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />
<!-- rest of TFSBuild.proj file -->
<Target Name="AfterGet">
<Message Text="Compressing Javascript files under "$(SolutionRoot)"." />
<CreateItem Include="$(SolutionRoot)\**\*.js">
<Output TaskParameter="Include" ItemName="JsFiles"/>
</CreateItem>
<Attrib Files="#(JsFiles)" ReadOnly="false"/>
<JSCompress Files="#(JsFiles)" />
</Target>
Note that modifying the files after getting them may well cause issues if you try to move to an incremental build.

Calling batch/script files from VC6/VC2005/VC2008 project files

Is there a way to invoke an external script or batch file from VC6 (and later) project files?
I have a background process that I need to kill before attempting to build certain projects (DLLS, executables) and haven't found a way to successfully do so from the project itself. I'd like simply to call a batch file with a taskkill command in it.
(Yes, I could run the batch file from a command line before building the projects, but I don't always remember to do so and having it done automatically would be more convenient and less irritating for the whole development team.)
You can create a utility project (configuration type: Utility in the project property pages) that has a post build event. You then call the batch file from that Post-Build event. If I remember correctly, utility configuration appeared in VS2005. But I believe the same can be achieved with another type of configuration on VC6.
Here is an example of a setup (this is the text of the Command Line property of the Post-Build Event):
set solutionDir=$(SolutionDir)
set platformName=$(PlatformName)
set configurationName=$(ConfigurationName)
call $(SolutionDir)PostBuild.bat
As you can see, you have all the flexibility of customizing the batch environment based on VisualStudio macros.
If you want to have this batch file called every time you build, add a dependency to the requiring project (your main executable or dll project for example). You can add your batch file to the solution items for convenient access (right-click on the solution and select Add -> Existing Item...).
You can even invoke the build command on this utility project to force the execution of the batch file.
At work we have a similar setup to start our unit tests each time a build is triggered.
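For the scenario in the question, the batch file called from that event could be as simple as the taskkill you mention; a hypothetical sketch (the process name is an assumption):

@echo off
rem Stop the background process before the build touches its binaries (name is illustrative)
taskkill /F /IM MyBackgroundProcess.exe
rem taskkill exits non-zero if the process was not running; don't let that fail the build
exit /B 0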
You could invoke it from a custom build step or a build event.
At least for C# in Visual Studio 2008, you can open the project file and find within the file the following comment:
<!-- To modify your build process, add your task inside one of the targets below and uncomment it.
Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target>
-->
Uncomment the one that works best for you, in this case the "BeforeBuild" item. Then substitute your batch file for the one I have here:
<Target Name="BeforeBuild">
<Exec Command="MyBatchFile.bat" />
</Target>
That's all there is to it; whenever you build that project, this will take place each and every time.
That said, I do not know if this works the same for VS 2005 or, especially, VC6. YMMV!
