Including MS C++ runtime in VS2005 generated MSI - visual-studio

I've got a project that depends on a particular version of MSVCR80.dll (the MS Visual C Runtime) and I'm running into problems where, depending on the particular system configuration, my app doesn't always get the right version of that file. It's been a bit of a crap shoot as to what path it takes to find a file with that name, and it's not always right...
Is there a way, when creating a Deployment Project in VS2005, to ensure that my app will always use the runtime that I provided? When I add the runtime file to the project, it asks about creating a merge module... but I'm not really sure what that does. And regardless of whether I create one, the issue remains.

Martin Richter wrote an article about that on CodeProject:
Create projects easily with private MFC, ATL and CRT assemblies
This solution does not rely on your MSI packages but on the application that uses the CRT files.

I am not sure whether it is your application that doesn't work after installation, or a DLL used as part of the installation itself that doesn't work.
To make a very long story very short: new versions of the C / C++ runtimes are installed as Win32 assemblies, also called side-by-side installations. This means the files go into folders under C:\Windows\winsxs - the Win32 equivalent of the GAC - and several versions of the same file can co-exist there.
Applications compiled with Visual Studio 2005 / 2008 get a manifest embedded in the binary, and this manifest specifies which side-by-side runtime version to bind to. It doesn't matter if you put MSVCR80.dll next to your EXE or even in system32 - the manifest embedded in the EXE will cause the file to be loaded from C:\Windows\winsxs.
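For illustration, this is roughly how that dependency gets into the binary: the VC8 CRT headers emit a linker directive that the linker turns into a dependentAssembly entry in the EXE's embedded manifest. The version and publicKeyToken below are the VS2005 SP1 x86 values and are only an example - the ones on your machine may differ.

    // Sketch only: approximately the directive the VC8 headers inject; the linker
    // merges it into the EXE's manifest as a side-by-side dependency on the CRT.
    // Version and publicKeyToken shown are example (VS2005 SP1 x86) values.
    #pragma comment(linker, "/manifestdependency:\"type='win32' name='Microsoft.VC80.CRT' version='8.0.50727.762' processorArchitecture='x86' publicKeyToken='1fc8b3b9a1e18e3b'\"")

You can open the compiled EXE in a resource editor and look at the RT_MANIFEST resource to see the dependency that actually got embedded.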
This is all "full circle". In the old days runtimes went to System32. This caused the original dll-hell: applications overwriting each other's global runtime files. To remedy all this the idea was to "isolate changes" to each application. Hence the new approach was to isolate a local copy of the runtime file next to the EXE. Now this caused an entirely new problem: how do you make sure security updates for the isolated dll was deployed? In most cases this never happened, and you had lots of applications running with local, unsafe dll's. So what to do? The decision was to introduce the second coming of dll-hell: the side-by-side assembly approach. In this approach runtimes are not local, but global - with the critical difference of supporting side-by-side installations. This way, in theory, applications can function without overwriting each other's runtime dlls.
So that was the quick summary of "how to make runtime deployment complicated". I am not positive it is still possible, but have you checked whether you can statically link to the runtime? Sometimes old-school really is easier...
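For what it's worth, static linking is controlled by the compiler's runtime-library switch; a minimal illustration (standard cl.exe switches, the file name is an example):

    // Project Properties -> C/C++ -> Code Generation -> Runtime Library
    //   /MT  (Multi-threaded)      - CRT linked into the EXE, no MSVCR80.dll to deploy
    //   /MD  (Multi-threaded DLL)  - CRT resolved to MSVCR80.dll through winsxs
    // Command-line equivalent:
    //   cl /MT /EHsc myapp.cpp

Note that every module in the process (the EXE and any DLLs you build) should use the same runtime-library setting, otherwise you can end up with multiple CRT copies and mismatched heaps.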

Related

"Declare" statement as yet an other way to circumvent the .dll hell?

To my astonishment I found VB6 code that uses Declare statements to define functions in a .dll that lives in the program folder without it being registered on Windows. This seems like a super-simple way to avoid DLL hell without having to resort to side-by-side manifests. Can I read some more about this somewhere? Are there snags?
The Declare statement is used to do "just in time" binding to non-ActiveX DLLs. Until your program "touches" a Declared entrypoint no attempt is made to load the library.
It basically has nothing at all to do with the topic of DLL Hell.
Muddled thinking can even lead people to plop ActiveX DLLs "next to" the EXE, which actually can result in DLL Hell, because people who tend to do this also use poor techniques for installing and uninstalling applications.
1. Poorly designed application A's deployment plops a commonly shared DLL or OCX next to the EXE.
2. Poorly designed application A is run; the VB6 runtime can't find the classes in the registry, does a DLL search using Windows heuristics, immediately locates the DLL next to the EXE, and calls its self-registration entrypoint.
3. Innocent, properly designed applications B, C, and D that use the same DLL/OCX are later installed, and their installers find the library already registered.
4. Poorly designed application A is uninstalled, typically by simply deleting its folder in Program Files.
5. Applications B, C, and D (and any future applications using the library) are now broken - due to orphaned component registration pointing to a non-existent library.
Moral of the story:
Never, never, never put DLLs "next to" your VB6 application on installation. Even if you have private DLLs that are not shared with other applications, put them into a subfolder (libs, etc.) under the application folder. A possible exception might be non-COM DLLs, such as those you expect to use Declare with.
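In native code, one way to make such a subfolder work is to add it to the DLL search path before anything is loaded from it. A minimal sketch, assuming a subfolder actually named "libs" and a made-up DLL name (SetDllDirectory requires XP SP1 or later):

    #include <windows.h>
    #include <wchar.h>

    int main()
    {
        // Build "<exe folder>\libs" and add it to the DLL search path so that
        // later LoadLibrary calls (or Declare-style late binding) can find
        // private DLLs kept out of the application root.
        wchar_t path[MAX_PATH];
        GetModuleFileNameW(NULL, path, MAX_PATH);   // full path of the EXE
        *wcsrchr(path, L'\\') = L'\0';              // strip the file name
        wcscat_s(path, L"\\libs");                  // hypothetical subfolder
        SetDllDirectoryW(path);

        HMODULE hLib = LoadLibraryW(L"SomePrivate.dll");  // hypothetical DLL
        // ... resolve entry points with GetProcAddress and use them ...
        if (hLib) FreeLibrary(hLib);
        return 0;
    }

A VB6 app can get the same effect by putting a full path in the Lib clause of the Declare statement, or by calling SetDllDirectory itself via Declare.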
There is also a great deal of misunderstanding about manifests, of which there are multiple kinds. The ones you are probably thinking of are application and assembly manifests.
These can be used for selecting among different versions of a library installed side by side, or they can be used to isolate applications and assemblies, which is the part that has a bearing on DLL Hell.
Of course application manifests can be used to specify quite a few other things about how Windows should run the application.
Windows searches in a well-documented sequence of folders for LoadLibrary (which VB6 uses behind the scenes to resolve Declare declarations). Since the first location on the list of search folders is the app's own folder, your discovery makes perfect sense.
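To make that concrete, a Declare call is roughly the native pattern below: LoadLibrary walks the documented search order (application folder first), and GetProcAddress binds the entry point on demand. The DLL and function names here are made up for illustration.

    #include <windows.h>

    // Roughly what VB6 does behind a statement like
    //   Declare Function Foo Lib "mylib.dll" (ByVal n As Long) As Long
    // the first time the entry point is touched.
    typedef long (WINAPI *FooProc)(long);

    long CallFoo(long n)
    {
        HMODULE hLib = LoadLibraryW(L"mylib.dll");          // searched, app folder first
        if (!hLib) return -1;                               // library not found
        FooProc foo = (FooProc)GetProcAddress(hLib, "Foo"); // hypothetical export
        long result = foo ? foo(n) : -1;
        FreeLibrary(hLib);
        return result;
    }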
It doesn't resolve the "DLL hell" issue for the most part, though. It can't work for system DLLs, for instance, because Windows preloads most of them. Also, if a DLL is already loaded into memory, Windows may use that copy of the DLL (not sharing data, but code can be reused).
That's part of the reason that manifests were created; they allow an application to strictly define required versions of system DLLs in order to provide certain functionality. VB6's technique is old fashioned (just like VB6).

Where to install shared DLLs on Windows

I have a driver which can be installed on Windows (XP/Vista/7). It's accessed via a native C++ DLL that 3rd-party applications link to, and which is also a Winsock Provider (WSP). It used to be installed under System32, but having seen advice not to, I changed it to install under ProgramFiles instead.
Now, the problem is that people are having to either copy it back into System32 or copy it into the application directory whenever they want to use it in their own applications, because Windows won't search the install directory under ProgramFiles when the application tries to load the DLL.
I've been unable to find any Microsoft documentation discussing this issue, so if System32 shouldn't be used then where should shared DLLs be installed?
The Windows side-by-side cache (WinSxS). MSDN has both backgrounder info and a technical reference on it.
I haven't seen anybody actually do this yet, other than Microsoft. Quite notably, MSFT gave up on winsxs deployment for the C/C++ CRT and MFC runtime DLLs for VS2010 because it was causing too many problems; they're back in c:\windows\system32. The managed equivalent of this (the GAC) is going strong, though. Better tool support, probably.
I'm fairly sure that by a large margin everybody chooses app-local deployment.
Since it's a DLL linked to your driver maybe it's less of an issue, but I'd be wary of trying to share the DLL and would instead try to get all developers of client apps to keep their own version of the DLL in their application folders.
I've had too much fun in DLL Hell to want any more weird bugs because AppX overwrote the DLL with an old version that breaks AppY etc.
Anywhere on the path would work.
If this is only to be used by your own set of apps (e.g. Qt4.dll), then the root of "c:\program files\my company" would be good, or a 'shared' folder below that.
If it's to be used by other apps that you don't know about (e.g. a video codec), then system32 makes sense (you will need admin rights when you install).
Fixed filesystem location
One possibility would be to install them in a sub-directory of Program Files, as already suggested by Martin Beckett.
Variable filesystem location, fixed entry in Registry
If you don't want to install them at a fixed location, you could also store their location in the Windows Registry. You'd install some keys under HKEY_LOCAL_MACHINE\SOFTWARE that your applications could read to find out where the DLLs are located. Then the DLLs can be loaded at run-time.
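A minimal sketch of that pattern in native code; the key path, value name, and DLL name below are made up for illustration:

    #include <windows.h>
    #include <wchar.h>

    // Read the shared DLLs' folder from a vendor key under HKLM\SOFTWARE, then
    // load the DLL by full path so the normal search order is bypassed entirely.
    HMODULE LoadSharedDll()
    {
        HKEY hKey;
        if (RegOpenKeyExW(HKEY_LOCAL_MACHINE, L"SOFTWARE\\MyCompany\\SharedLibs",
                          0, KEY_QUERY_VALUE, &hKey) != ERROR_SUCCESS)
            return NULL;

        wchar_t dir[MAX_PATH] = L"";
        DWORD size = sizeof(dir);
        LONG rc = RegQueryValueExW(hKey, L"InstallDir", NULL, NULL,
                                   (LPBYTE)dir, &size);
        RegCloseKey(hKey);
        if (rc != ERROR_SUCCESS) return NULL;

        wchar_t path[MAX_PATH];
        wcscpy_s(path, dir);
        wcscat_s(path, L"\\MyShared.dll");
        return LoadLibraryW(path);
    }

On 64-bit Windows, keep in mind that 32-bit applications see HKLM\SOFTWARE through the WOW6432Node view, so the installer and the client applications need to agree on which view the key lives in.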
P.S.: Preventing orphaned DLLs
You could go one step further and use the Registry, or some other storage (a file, say), to store information about which of your applications uses which of your DLLs. For example:
FooCommon.dll <- FooApp1, FooApp2, FooApp3
FooBar.dll <- FooApp1, FooApp3
FooBaz.dll <- FooApp2, FooApp3
Whenever one of your applications is uninstalled, it removes itself from this map. An uninstaller can then safely delete a DLL that is no longer in use by any application. This method is akin to reference counting, and the idea is to prevent orphaned DLLs from accumulating on users' filesystems.
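Windows already has a long-standing variant of this idea: installers maintain a per-file reference count under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\SharedDLLs, which legacy installers (and, for components marked as shared, Windows Installer) honour. A rough sketch of incrementing such a count by hand; the error handling a real installer would want is omitted:

    #include <windows.h>

    // Bump the classic shared-DLL reference count for a given file.  The value
    // name is the DLL's full path; an uninstaller would decrement the count and
    // only delete the file once it reaches zero.
    void AddSharedDllRef(const wchar_t* dllPath)
    {
        HKEY hKey;
        if (RegCreateKeyExW(HKEY_LOCAL_MACHINE,
                L"SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\SharedDLLs",
                0, NULL, 0, KEY_READ | KEY_WRITE, NULL, &hKey, NULL) != ERROR_SUCCESS)
            return;

        DWORD count = 0, size = sizeof(count);
        RegQueryValueExW(hKey, dllPath, NULL, NULL, (LPBYTE)&count, &size); // stays 0 if absent
        ++count;
        RegSetValueExW(hKey, dllPath, 0, REG_DWORD, (const BYTE*)&count, sizeof(count));
        RegCloseKey(hKey);
    }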
Native DLLs can be stored as side-by-side assemblies - see the MSDN reference on isolated applications and side-by-side assemblies for more information. This is separate from the .NET Global Assembly Cache, but it uses similar concepts, and there are blog posts with quite a few useful details about how DLLs get resolved as side-by-side assemblies.

Should you build components every time you build a main app

We have started using Final Builder to create builds for our vb6 and .net projects. We are also using Visual Source Safe to manage our source. Some of our vb6 exe's are dependent on certain ocx's, such that a particular vb6 exe may require a particular version of an ocx.
The question is: should the Final Builder script for our EXE project also re-build the OCX project, or is it better to simply pull a particular version of the already-compiled OCX? My concern is that other developers could have broken the build (or created a bug) for the OCX, which could then break the EXE we are trying to build. Moreover, re-building the OCX project would result in the same version of the OCX but with a different date, resulting in confusion if DLL hell (OCX hell) issues arise.
There is no difference in terms of building and maintaining your app between an OCX and an ActiveX DLL. The OCX should use binary compatibility and be part of your compile process.
This is, however, a general rule. You may have some components that rarely change, if ever. In my own VB6 application I have a handful of components that reside at the bottommost level of my reference hierarchy and rarely get updated. They maybe get updated once or twice a year at best; some haven't been updated for several years now.
However based on your description it sounds like the controls are still being modified. So I doubt the second case applies.
In the end use your best judgment.
There are two ways to use OCXs/DLLs: for code reusability, or to fragment an over-large project.
Those meant for re-use would be absurd to build, build, and rebuild, and almost never should be customized to fit a new application. These are your crown jewels, and most people should have no ability to modify the source. They are the domain of your organization's "library writers" because that's what they are: libraries.
If you simply have large, monolithic, unwieldy applications you may have to go the other route. Then OCXs and DLLs simply become an awkward extension of the "module" concept. This is why we have Project Groups.
Your library users should not be fiddling with libraries though. I'm sure they all fancy themselves able to "ensure they are up to date and performant" but that's a different debate entirely.

Developing in VB 6.0

We have multiple projects in VB 6.0. Most of these projects are ActiveX DLLs. When developing, projects take a '.dll' reference to other projects, but this does not allow us to debug. So, for this we have to take a reference to the '.vbp' project. However, taking a project reference means we are asked about binary compatibility.
During development, should we use project compatibility and build projects into DLL's for deployment?
It's fine to reference the vbp during development, just make sure you keep binary compatibility on. If you do not, you'll make the registry into a nice mess and deployment will be a disaster. Keep in mind, however, that even with binary compatibility on, every time you change the public interface of the DLL you're creating a forward reference in the OLE entry in the registry.
We have four levels of DLLs in the CAD/CAM software we use for my company's cutting machines. We handled this by making a compatibility directory that has the PREVIOUS version's DLLs in it. With this we can continue to use binary compatibility.
The process looks like this.
1. The compatibility directory has the Revision 119 DLLs in it.
2. We compile down Revision 120 and release it.
3. Copy the Revision 120 DLLs to the compatibility directory.
4. Develop.
5. Test.
6. We compile down Revision 121 and release it.
7. Copy the Revision 121 DLLs to the compatibility directory.
8. [repeat]
The main problem you need to watch out for is changes to the lowest level of DLLs you use. Visual Basic 6 uses the equivalent of a #include statement when generating its internal type libraries, and changing a low-level DLL can get it confused over whether it is still binary compatible or not. Note that you can see this by using the OLE View tool that comes with Visual Studio 6.
The solution to this problem is to compile the low-level DLL and immediately put it into the compatibility directory. The resulting internal type libraries for the higher-level DLLs will then properly detect whether you are binary compatible or not.
Remember that binary compatible means all you can do is add a method or property. You can't change an existing method's name or argument list (its signature, in COM terms).
You should be able to debug referencing a DLL. Did you start the projects in the right order? Alternatively, you can add all or some of the DLLs into the same "Project Group" (*.vbg).

What is the difference between dependencies and manually adding a DLL/OCX in VS Installer 6?

I'm using VS Installer to build a setup package for my VB6 app.
The problem is that under the project explorer I can see a list of dependencies attached to my EXE file.
Under the "file system on target machine" treeview, I can actually store the DLL/OCX in a folder or in the Windows system folder itself (the left window).
So what I don't understand is: is there actually a difference?
If I just set the dependencies and don't add the DLL or OCX to the folder or Windows system folder, does the DLL automatically get copied over too?
It is not guaranteed that all those dlls will be present on the system that the software is being installed on. So they need to be included in your installer. From there you have two choices.
You can install them in your Windows system folders or in your application folder. The difference is that if you install them in your application folder, you can set things up on XP and Vista so that different versions of the software with different versions of the components can be fired up and run side by side. Installing them into the system folder will break any older version that depends on older versions of the components.
Installing in the application folder rarely fails; when it doesn't work, it is usually because a component depends on other components that can't be updated, and it is usually Microsoft libraries that are involved. They have gotten better over the years on this issue.
You can read more about the issues involving side-by-side execution on MSDN.
Finally, the dependencies need to be in your installer so that they are registered in the Windows registry. Unlike most .NET assemblies, an ActiveX/COM component needs to be registered before an application can use it, even if you access it via CreateObject and Variant types.
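For reference, "registering" an ActiveX/COM DLL or OCX just means invoking its DllRegisterServer export, which writes the CLSID/ProgID entries into the registry; that is essentially all regsvr32 does, and most installers do the equivalent. A minimal native sketch (the file name is an example):

    #include <windows.h>

    typedef HRESULT (WINAPI *RegisterProc)(void);

    // Load the component and call its self-registration entry point,
    // e.g. RegisterComServer(L"C:\\Program Files\\MyApp\\MyControl.ocx").
    bool RegisterComServer(const wchar_t* path)
    {
        HMODULE hMod = LoadLibraryW(path);
        if (!hMod) return false;
        RegisterProc reg = (RegisterProc)GetProcAddress(hMod, "DllRegisterServer");
        bool ok = (reg != NULL) && SUCCEEDED(reg());
        FreeLibrary(hMod);
        return ok;
    }

MSI-based installers generally prefer to author those registry entries directly in the package rather than run self-registration at install time, but the end result in the registry is much the same.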
I will admit the whole process is idiosyncratic and is one of the sources for the stories about DLL Hell. Start with the MSDN article, use Wikipedia, and of course ask further questions here.
You should usually not have a "dlls" folder under the app folder for a normal Installer package but there are many factors involved (private standard DLLs, Reg-Free COM, etc.). Yes, the dependencies get included (unless you exclude them). They should each have a property that determines where they install on the target systems.
You also have a number of components in that list that are not redistributable this way, either because they are OS-dependent system components or MDAC components, or because they are not licensed for redistribution (fm20.dll, for example).
Sadly this is an example of the type of package that can lead directly to DLL Hell for your users' systems. Fixing this can mean researching every MS component in MS KB articles to determine what can or should be redistributed and how.
Deployment can be a messy business to get right.
