So I have been trying to build an installer for my game with NSIS. For the most part it works fine, but I just noticed that it seems to be skipping certain files for no reason, or at least no reason I can figure out.
At first I was using this line to gather up all the files in the source folder:
File /r "${NSISDIR}\game\source\*.*"
However, I noticed that this didn't get everything. Granted, it found all the sub-folders and kept the hierarchy correct, but there didn't seem to be any rhyme or reason to what it skipped. Then I tried listing all the files and directories separately and found the problem. Example:
File "${NSISDIR}\OWTD-DE\source\pygame.math.pyd"
This produces the following error:
File: "C:\Program Files (x86)\NSIS\game\source\pygame.math.pyd" -> no files found.
But that file exists; I can see it in the source folder. This was the case for all of the missing files. At first I thought it might be the two periods in the name, but various files use that naming convention and they are added fine. I cannot figure out how to get it to recognize these files. Any ideas?
${NSISDIR} is a define used to access the UI resources in the Contrib subfolder; you are not supposed to put your own files there. Your source files should not be in Program Files either, only installed files should be located there. Also, on 64-bit systems there are two Program Files folders, and there are some compatibility hacks in Windows related to %ProgramFiles%, so putting your source files there is not optimal. Just because you see the file there does not mean it is actually in Program Files, it could be UAC Virtualization/VirtualStore tricking you...
Normally you would keep your .nsi somewhere in the same directory tree as the rest of your files so you can use relative paths, but you can also use a define if you really want to:
!define MYSOURCE "c:\foo\bar"
...
Section
  File /r "${MYSOURCE}\*.*"
SectionEnd
If it still misses some files I would suggest trying Process Monitor so you can see the low-level details...
Weirdly enough, this process did not work very well on Windows Home 64-bit but did on Windows Professional 64-bit. I'm not sure if this is an issue with NSIS itself or what, but nothing was different between the two machines except the OS, and there really isn't much difference between those two editions. Perhaps some configuration difference between the two was the real issue.
While this is marked solved, I'm not really sure what the actual issue and solution were.
Related
I have a fairly large application (~750k LOC) that I distribute using the Package and Deployment Wizard. I fully understand that it would be nice to migrate to .NET (that ain't happening - see the code size above), and that the PDW is deeply flawed. However, for the most part I've made it work well for my end users, by customizing the Setup1 application, writing a menu-driven wrapper for the Setup application, and running it in silent mode. (Note that the problem I'm about to describe occurred even before I started using silent mode.)
The issue I'm having is that my application requires quite a few auxiliary files, which I've added to the PDW project in the "Included files" section. When a user does a clean installation (either from scratch, or after un-installing a previous installation), everything works fine. However, if they simply run the installer to update the existing installation, the executable file and any OCXs I've updated get copied over the previous versions just fine, but my auxiliary files don't - I have to have the user manually delete them, and then the Setup1 program will re-install them as it should.
I've checked the Setup.lst file, and all of the files are listed there with their current date stamps. In fact, in my "BuildAll.bat" file, I do the Windows equivalent of a "touch" (copy /b "TheFile.dat" +,,) to force the date stamp to be current. However, if a file already exists on the target machine, it won't be overwritten even though it's older. There are no errors reported, either visibly or in the .LOG file (which is required when using the silent option).
A couple of additional points: Some of the auxiliary files are themselves VB6 applications - just the .exe files. Those do get copied correctly if they're newer than the existing files. Other than being files with internal versioning information, there's no difference between them and the other auxiliary files (which are things like media files, or text-based .txt or .dat files).
So, what's going on, and how do I fix it (besides moving to Inno or some other solution that won't work for me...)? Thanks in advance for any help!
~~
Mark Moulding
I have an old program that saves its files to Program Files. We are updating it to run properly on Windows 7. The problem is that we now can't find our saved files. Windows 7 allowed our program to save to Program Files, but obviously put the files somewhere else, and we can't find that 'somewhere else'. Does anybody know where Windows 7 places the files when we save to Program Files?
Update:
We've looked in Program Files and in Program Files (x86), and we've used the Windows Explorer search function to try to find the directory name. Nothing works, but when our application checks whether the directory it is about to create already exists, it finds it and puts up our error dialog box.
Look in C:\Users\[USERNAME]\AppData\Local\VirtualStore\Program Files\[APPNAME]
Possibly Program Files (x86) if your application is a 32-bit app. The place they should really be saved is under the user's profile directory, though (or, even better, give the user the choice of where to save).
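If it helps to confirm this programmatically, here is a minimal C++ sketch that probes the VirtualStore copy of a file the program believes it wrote to Program Files. It only uses the Win32 shell API; "MyApp\settings.dat" is a hypothetical example name, not taken from the question.

#include <windows.h>
#include <shlobj.h>    // SHGetFolderPathW, CSIDL_LOCAL_APPDATA
#include <shlwapi.h>   // PathFileExistsW
#include <cstdio>
#pragma comment(lib, "shell32.lib")
#pragma comment(lib, "shlwapi.lib")

int wmain()
{
    wchar_t localAppData[MAX_PATH];
    if (FAILED(SHGetFolderPathW(NULL, CSIDL_LOCAL_APPDATA, NULL,
                                SHGFP_TYPE_CURRENT, localAppData)))
        return 1;

    // Where the program thought it saved the file (hypothetical path)...
    const wchar_t* believed = L"C:\\Program Files (x86)\\MyApp\\settings.dat";

    // ...and where the Windows 7 virtualization layer actually put it.
    wchar_t virtualized[MAX_PATH];
    swprintf(virtualized, MAX_PATH,
             L"%ls\\VirtualStore\\Program Files (x86)\\MyApp\\settings.dat",
             localAppData);

    wprintf(L"Expected location: %ls (exists: %d)\n",
            believed, PathFileExistsW(believed));
    wprintf(L"VirtualStore copy: %ls (exists: %d)\n",
            virtualized, PathFileExistsW(virtualized));
    return 0;
}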
I have an old project: C++, 64-bit, compiled with VS2008. The project is built using some Python scripts (SCONS). I now have to compile it with VS2010.
Everything works fine except for one small detail: in VS2008 all output goes to Debug\Win64 or Release\Win64, where the scripts are looking for it, while in VS2010 it goes to Debug\x64 or Release\x64.
I know that there are PLATFORM/PLATFORMNAME macros used by VS. Anything I did to try to change these values is simply ignored by VS, or, if I change them manually in the .vcxproj files, VS refuses to compile at all.
For some company-related reasons the scripts cannot be changed. So for now I just added some xcopy commands to the batch file that runs the script, to copy all the files from \x64 to \Win64 before the script starts. It's kind of working, but I would like to know about a more elegant solution.
Thanks,
fLot
Another solution that might work is to create a file system junction so that \Win64 and \x64 become two different names for the same physical folder. You have to create a junction for each configuration instead of copying the files, but once created it should stick between builds and ensure the two folders have the same content. See Wikipedia: http://en.wikipedia.org/wiki/NTFS_junction_point.
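For what it's worth, here is a minimal C++ sketch of the idea. Junctions themselves are easiest to create from an elevated prompt with mklink /J; the sketch below uses a directory symbolic link via CreateSymbolicLinkW instead, which serves the same purpose here but requires elevation (or Developer Mode on newer Windows). The paths are hypothetical placeholders.

#define _WIN32_WINNT 0x0600   // CreateSymbolicLinkW needs Vista or later
#include <windows.h>
#include <cstdio>

int wmain()
{
    // Make Debug\Win64 an alias of Debug\x64 so the existing scripts
    // keep finding the VS2010 output where they expect it.
    const wchar_t* alias  = L"C:\\proj\\Debug\\Win64";  // hypothetical
    const wchar_t* target = L"C:\\proj\\Debug\\x64";    // hypothetical

    if (CreateSymbolicLinkW(alias, target, SYMBOLIC_LINK_FLAG_DIRECTORY))
        wprintf(L"Created %ls -> %ls\n", alias, target);
    else
        wprintf(L"CreateSymbolicLinkW failed, error %lu\n", GetLastError());
    return 0;
}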
We're finally getting around to moving our software's documents out of the program's own directory and into "My Documents". We're also adding a "requestedPrivileges" line to the manifest to prevent further trouble with virtualization.
However, if we only did that, anyone who has been running the old versions on Vista/7 is likely to lose their work somewhere within the hidden VirtualStore directory after updating. So what's the preferred way of migrating into the 21st century?
Frankly I'm a little wary of copying files around, especially as I can't seem to find a programmatic way of getting at the shadow directory, but presumably plenty of other people must have had the same problem before us.
Don't add requestedPrivileges unless you legitimately need administrative rights for your program to work - nothing in your description suggests that you do. That should also let you simply copy the files on first run as if they were still in your program directory, because any virtualization would still be in effect.
However, if you absolutely must do the migration without UAC enabled, you can find your files in %LOCALAPPDATA%\VirtualStore\path\to\file. For example, if your file would have been stored in C:\Program Files\OurApp\, you'll find it in %LOCALAPPDATA%\VirtualStore\Program Files\OurApp\.
To get the path to %LOCALAPPDATA%, you can use SHGetSpecialFolderPath with CSIDL_LOCAL_APPDATA as the CSIDL parameter.
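For example, here is a minimal C++ sketch of that migration, assuming a single hypothetical file notes.dat that older versions wrote to C:\Program Files\OurApp\ and that Vista/7 redirected into the VirtualStore; the file name and the My Documents destination are assumptions for illustration, not from the question.

#include <windows.h>
#include <shlobj.h>    // SHGetSpecialFolderPathW, CSIDL_* constants
#include <cstdio>
#pragma comment(lib, "shell32.lib")

int wmain()
{
    wchar_t localAppData[MAX_PATH], documents[MAX_PATH];
    if (!SHGetSpecialFolderPathW(NULL, localAppData, CSIDL_LOCAL_APPDATA, FALSE) ||
        !SHGetSpecialFolderPathW(NULL, documents, CSIDL_PERSONAL, FALSE))
        return 1;

    // %LOCALAPPDATA%\VirtualStore\Program Files\OurApp\notes.dat
    wchar_t oldPath[MAX_PATH], newDir[MAX_PATH], newPath[MAX_PATH];
    swprintf(oldPath, MAX_PATH,
             L"%ls\\VirtualStore\\Program Files\\OurApp\\notes.dat", localAppData);
    swprintf(newDir,  MAX_PATH, L"%ls\\OurApp", documents);
    swprintf(newPath, MAX_PATH, L"%ls\\OurApp\\notes.dat", documents);

    CreateDirectoryW(newDir, NULL);      // fine if it already exists

    // TRUE = don't overwrite anything the user has already migrated.
    if (CopyFileW(oldPath, newPath, TRUE))
        wprintf(L"Migrated %ls to %ls\n", oldPath, newPath);
    else
        wprintf(L"Nothing migrated (error %lu)\n", GetLastError());
    return 0;
}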
I am building a cross-platform product, and one of the requirements is to build across Windows (Win32, AMD64 and IA64). The product is a relatively simple CLI, but we have a separate build team who check out the code from CVS and build it in separate build environments. I am able to build successfully (using Visual C++ 2005) on one platform (an AMD machine), but once I check in the code and it is checked out again, the build fails.
The cause of the build failure is that the include/library paths are wrongly specified in the property sheets. Specifically, the output file folder under Linker in the property pages is specified wrongly, so these libraries get built in a different folder from the one where the other projects expect them.
However, along with the source I also check in the .sln files (and later the .vcproj files) every time. Moreover, if I open the .sln file in the folder where the build is not succeeding, there is no difference between it and the one where I could successfully build (pre check-in). In fact, using windiff I could not see any difference between the two build folders (except for some .ncb and CVS log files).
So, any idea what is going on? Where does VC++ 2005 take the include directories and the output folder path from, if not from the .sln? Is CVS somehow interfering with the process? Anything else I could try?
Thanks in advance.
Just to update: the problem was resolved. The root cause was that the .vcproj files were not getting checked into CVS! This is where the individual project settings are stored (I was under the impression that this was done in the .sln files).
I think the problem may be that you changed the settings in one build configuration (for example x86-Release) but forgot to change them for another configuration (for example ia64-Debug), so when the configuration changes you run into this problem.
Another thing I would check in your place is the project dependencies. If those are set up in the right way, VS will look for each project's output exactly where it is produced, even when you change the output folder.
Do you have any binary files checked in as ASCII?
The round trip to and from CVS can corrupt binary files that are incorrectly marked as ASCII because CVS performs character processing on ASCII files (e.g. to give you the correct end of line codes for your OS). Corruption can occur even in an all Windows environment.
See the Binary section in the CVS FAQ for more information.