Microsoft Visual C++ backwards compatibility

Consider a C++ API defined as a series of __declspec(dllexport/dllimport) classes.
Further, assume that the caller is never permitted to call the ordinary operator new(size_t) on these classes. Either a static factory method does the new-ing or there is a class-specific operator new. The same applies on the delete side as needed (frequently just a virtual destructor).
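For illustration, a minimal sketch of the kind of class described (the names and the WIDGET_EXPORTS macro are hypothetical):

// widget.h - shipped with the DLL and the implib
#include <cstddef>

#ifdef WIDGET_EXPORTS
#define WIDGET_API __declspec(dllexport)
#else
#define WIDGET_API __declspec(dllimport)
#endif

class WIDGET_API Widget {
public:
    // Static factory: the new-ing happens inside the DLL.
    static Widget* Create();

    // Virtual destructor: the deleting destructor runs through the vtable,
    // so the matching operator delete is the one inside the DLL.
    virtual ~Widget();

    // Class-specific allocation keeps new/delete on the DLL's heap.
    static void* operator new(std::size_t size);
    static void  operator delete(void* p);

    virtual int Value() const = 0;
};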
Now, if you compile and link a DLL and an import library (IMPLIB) with the tools from VS2010, can you hand that implib and DLL to a user of VS2005 and expect it to work?
MFC is not involved here at all.
I'd be particularly grateful to any reference to any relatively formal Microsoft statement on the subject.

So long as the name mangling on the C++ API is identical (it is), and the API does not take STL types such as std::basic_string or std::map as parameters, whose implementations may have changed between releases of the compiler (and they have), then it should just work.
Of course, you'll want to make sure you either compiled your DLL with /MT (statically linked runtime), or include the VS2010 runtime redistributables with your supplied libraries and link targets.
EDIT: Expanding on "don't pass in types that have version-specific implementations". A partial list is most easily found by looking at the exports of MSVCP100.DLL.
cd %VS100COMNTOOLS%\..\VC\redist\x86\Microsoft.VC100.CRT
DUMPBIN /exports MSVCP100.DLL
The next issue will be header-only implementations of things like map or set which have changed under the hood between versions of the compiler.
This is why it's highly recommended that only scalar types be passed across boundaries between memory arenas. That way, simple tests will pass and remain reliable.
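As a sketch of that recommendation (all names here are invented for illustration), a boundary that trades only in scalars and opaque handles avoids the version-specific layouts entirely:

// flat_api.h - only scalar types and opaque handles cross the DLL boundary
#ifdef FLATAPI_EXPORTS
#define FLATAPI __declspec(dllexport)
#else
#define FLATAPI __declspec(dllimport)
#endif

extern "C" {
    typedef void* ParserHandle;                   // opaque handle, no layout exposed

    FLATAPI ParserHandle Parser_Create(void);
    FLATAPI int  Parser_Feed(ParserHandle h, const char* bytes, unsigned length);
    FLATAPI void Parser_Destroy(ParserHandle h);  // freed on the same side that allocated it
}

// What to avoid: passing std::string, std::map or other STL types by value or
// reference across the boundary, since their layout can change per compiler version.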

You have not mentioned whether you used MFC to create the DLLs. If you have, regular MFC DLLs should work, but I don't think extension DLLs will, as the latter link to the MFC DLLs. I am including links for your reference.
http://www.codeguru.com/cpp/cpp/cpp_mfc/tutorials/article.php/c4017
http://www.experts-exchange.com/Programming/System/Windows__Programming/MFC/Q_20385543.html
http://msdn.microsoft.com/en-us/library/26h8x9sy%28v=VS.100%29.aspx
EDIT
If it's a normal DLL, there should not be any problem. It also depends on the linkage type.

Related

Delphi link to Windows DLL statically or dynamically

I am aware that implicitly linking to libraries at load time can lead to performance increases, so I was wondering whether it is good practice to link in this way at compile time, thus increasing executable size (admittedly this is only marginal), compared to linking explicitly at runtime. My question is: when linking against Microsoft Windows DLL files located in System32, is it 'better' to link at load time, since you can be mostly certain that the libraries will be present, or to follow the explicit approach?
The language used is Delphi (Pascal) and the library in question is WTsAPI32.dll - Terminal Services.
EDIT: As pointed out, my choice of language was incorrect and has been amended. Also, since I have only ever extensively linked to libraries on Unix, my comments about executable size can be omitted; I believed at the time I was in fact referring to static linking, which bundles the library code into the executable, and I now realise this is impossible when using DLL files (DUH!). Thanks all.
The two forms of DLL linking are perhaps better named implicit and explicit. Implicit linking is what you refer to as static linking, and explicit linking is what you refer to as runtime linking.
For implicit linking the linker writes entries into the import table of the executable file. This import table is metadata that is used by the loader to resolve DLL imports at module load time. A stub function is included for each implicit import that is only a few bytes in size. The executable size implications of implicit linking are negligible.
With explicit linking the imported function's address is resolved by a call to GetProcAddress. This call is made when the programmer chooses. If the DLL or the function cannot be resolved, the programmer can code fall back behaviour. There are size implications to explicit linking that I estimate to be similar to implicit linking. If the function address is evaluated once and remembered between calls then the performance characteristics are similar to implicit linking.
My advice is as follows:
Prefer implicit linking. It is more convenient to code.
If the DLL may not be present, use explicit linking.
If the DLL must be loaded using a full path, use explicit linking.
If you want to unload the DLL during program execution, use explicit linking.
You specifically mention Windows DLLs. You can safely assume that they will be present. Don't try to code to allow your program to run in case user32.dll is missing. Some functions may not be present in older versions of Windows. If you support those older versions you'll need to use explicit linking and provide a fallback. Decide which version you support and use MSDN to be sure that a function is available on your minimum supported platform.
If your only two options are static linking and run-time dynamic linking, then the latter is the best choice for linking with Windows DLLs because it's your only choice. You cannot link statically to a DLL because DLLs are exclusively for dynamic linking; that's what the D stands for. Microsoft does not provide static libraries for the OS modules, so you cannot link to them statically.
But those typically aren't your only two options. There's a third, namely load-time dynamic linking.
In Delphi, you use load-time dynamic linking by marking a function declaration external and specifying the name of the DLL where the function resides. If you use the function, then an entry is created in your module's import table, and when the OS loads your module, it reads the table, loads the referenced DLL, looks up the address of the function, and stores the address in your program's memory image so that your program can call it directly.
You use run-time dynamic linking by declaring a function pointer, and then using LoadLibrary and GetProcAddress to look up the function's address prior to calling it. In newer Delphi versions, you can also declare a function in the same style that load-time dynamic linking uses, but then mark it with the delayed directive. In that case, the Delphi run-time library will call LoadLibrary and GetProcAddress on your behalf the first time you call the function.
The size differences are negligible. Run-time dynamic linking requires your program to contain code to load and link to libraries, but load-time dynamic linking stores more function references in the import table.
Run-time dynamic linking offers more flexibility in the face of uncertain DLL availability. With load-time dynamic linking, if a DLL is missing, or if it doesn't have all the functions mentioned in your import table, then the OS will fail to load your program — none of your code will run. With run-time dynamic linking, however, you have the opportunity to recover from the problem. You can disable certain parts of your program that the missing DLL depends on, or you can search for DLLs in non-standard places, or you can provide alternative implementations of missing functions.
If the functions you're calling are integral to your program's ability to operate, and there's ample reason to expect the functions to be present wherever your program is installed, then you should choose to link at load time. It allows you to write simpler code. You can be confident that you'll have the required functions if they are available on a certain version of Windows that you check for in your installer, or if they're provided by DLLs that you distribute with your program.
On the other hand, if the functions you're calling are optional, then you should prefer to link at run time. Use that for loading plug-ins, or for taking advantage of advanced OS features while maintaining backward compatibility. (For example, you might want to take advantage of Windows Vista theme support when it's present, but still allow your program to run on Windows XP.)
Why do you think that compile-time linking to dynamic libraries would increase EXE size? I believe you are misled by the somewhat poor choice of terms used in Windows programming long ago. Let us instead use the relative terms "early binding" and "late binding" for the choice of who searches for procedure names: the compiler/loader or the programmer's custom code.
Using early binding (aka static linking against a dynamic library), your EXE contains entries like these (in special tables):
DLL1 Name:
  procedure "aaaaa" into the variable $1234
  procedure "bbbbb" into the variable $5678
  ...
DLL2 Name:
  procedure "ccccc" into the variable $4567
...et cetera.
Now, when you turn this into runtime loading (dynamic linking against dynamic libraries) it would look like:
VarH1 := SafeLoadLibrary(DLL1 Name);
if Error-Loading-DLL then do-error-handling;
Var1234 := GetProcAddress(VarH1, "aaaaa");
if Error-Searching-For-Function then do-error-handling;
Var5678 := GetProcAddress(VarH1, "bbbbb");
if Error-Searching-For-Function then do-error-handling;
et cetera.
Obviously, in the latter case your EXE contains all the same values as in the first case, but on top of that it contains a lot of code to deal with those values that was simply absent before.
So, while the EXE size difference is not really large by today's memory standards, it is still in favor of early binding (static linking against a dynamic library).
Then what are the benefits of late binding? For example, you can load different DLLs from different paths, determined at runtime by configuration - flexibility and avoidance of DLL Hell (funnily enough, the concept of avoiding DLL Hell works against the concept of saving space). You can make your application work with limited functionality if a DLL fails to load, whereas a statically bound EXE would simply not load at all - the graceful degradation concept. And at the very least you can give the user much better, more meaningful error messages than Windows ever could.
And a last word on where you got that notion of EXE size from. I believe you picked it up from discussions of - attention! - static linking against static libraries. That is when OBJ/LIB/DCU files are not part of the distribution but are just temporary code containers that ultimately take their place inside the monolithic EXE. Then yes, your EXE has all those libraries inside itself and thus grows larger. However, that case has nothing to do with dynamic libraries - DLLs.
The wording chosen long ago overuses the static/dynamic terms for two closely related topics: how the library is loaded (at compile time vs. at runtime) and how functions inside the library are located, or bound (by the developer's custom code, or by an OS-provided or compiler-provided mechanism that runs before the first line of your source code starts executing).
Due to that ambiguity these close but different concepts overlap, and sometimes this leads to total confusion.
Now, what else static linking can give you on modern Windows versions: the WinSxS folder. Nowadays Windows tends to keep multiple versions of each system DLL, and your program may ask for a specific version of one (while the System32 folder holds the most recent version, which your program may not be prepared for). You can then create a special MANIFEST resource and compile it into the EXE, asking Windows to load DLLs not by name alone but by name plus version. You can replicate that functionality with dynamic loading as well, but it is much easier using the Windows-provided toolset.
Now you can decide which of those options matter for your particular case and make a somewhat better informed choice.
HTH.

Implicit vs. Explicit linking to a DLL

When one should implicitly or explicitly link to a DLL and what are common practices or pitfalls?
It is fairly rare to explicitly link a DLL. Mostly because it is painful and error prone. You need to write a function pointer declaration for the exported function and get the LoadLibrary + GetProcAddress + FreeLibrary code right. You'd do so only if you need a runtime dependency on a plug-in style DLL or want to select from a set of DLLs based on configuration. Or to deal with versioning, an API function that's only available on later versions of Windows for example. Explicit linking is the default for COM and .NET DLLs.
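For reference, a minimal sketch of that explicit-linking pattern (plugin.dll and DoWork are made-up names; the WINAPI calling convention is an assumption about the export):

#include <windows.h>
#include <cstdio>

// Function pointer type matching the exported function's signature.
typedef int (WINAPI *PFN_DOWORK)(int);

int main()
{
    HMODULE mod = LoadLibraryW(L"plugin.dll");   // hypothetical plug-in DLL
    if (!mod) {
        std::printf("plugin.dll not found, falling back\n");
        return 1;
    }

    PFN_DOWORK doWork =
        reinterpret_cast<PFN_DOWORK>(GetProcAddress(mod, "DoWork"));
    if (doWork) {
        std::printf("DoWork returned %d\n", doWork(42));
    }

    FreeLibrary(mod);
    return 0;
}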
More background info in this MSDN Library article.
I'm assuming you refer to linking using a .lib vs loading a DLL dynamically using LoadLibrary().
Loading a DLL statically by linking to its .lib is generally safer. The linking stage checks at build time that all the entry points exist, so there is no chance you'll load a DLL that doesn't have the function you're expecting. It is also easier not to have to use GetProcAddress().
So generally you should use dynamic loading only when it is absolutely required.
I agree with the others who have already answered (Hans Passant and shoosh). I want to add only two things:
1) One common scenario where you have to use LoadLibrary and GetProcAddress is the following: you want to use some new API that exists only in newer versions of Windows, but the API is not critical to your application. So you test with LoadLibrary and GetProcAddress whether the function you need exists, and use it if it does. What your program does if the function does not exist depends entirely on your implementation.
2) There is one important option which you did not include in your question: delayed loading of DLLs. In this case the operating system loads the DLL when one of its functions is first called, not at application start. It allows you to use import libraries (.lib files) in some scenarios where, at first glance, explicit linking would seem necessary. Moreover, it improves application startup time and is widely used by Windows itself. So this way is also recommended.
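As a minimal sketch of that delay-loading setup with MSVC (mylib.dll and DoWork are hypothetical names; the mechanism itself is the /DELAYLOAD linker option together with delayimp.lib):

// app.cpp - built against mylib's import library, but mylib.dll is only
// loaded on the first call to DoWork because of /DELAYLOAD.
__declspec(dllimport) int DoWork(int value);

int main()
{
    return DoWork(42);   // the DLL is loaded here, not at process start
}

// Assumed build commands:
//   cl /c app.cpp
//   link app.obj mylib.lib delayimp.lib /DELAYLOAD:mylib.dll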

Which headers should I not use if I don't want my program to be linked with any of msvc*.dll's?

Which headers should I not use if I don't want my program to be linked with any of msvc*.dll ?
At the moment my application uses:
kernel32
user32
shell32
msvcp90
msvcr90
I want to get rid of the bottom two files. I don't mind if I will have to rewrite certain aspects of the program.
I ask because I know that if you code in C and then link it, it won't link against any of the msvc* DLLs.
I believe you have to change the way the CRT is linked into your program. I think for that you have to change the C++->Code Generation->Runtime-Library to the static version. This is for Visual Studio 2005, don't know about newer versions.
Those libraries contain the C++ runtime - heap management and other stuff that is hard to get rid of.
You could link the C++ runtime statically instead - use the "C++ -> Code Generation -> Runtime Library" setting. Then you will not need those .dll files. However, this is not the recommended way - if a vulnerability is found in the C++ runtime you'll have to recompile and reship your program.
Static linking is the right answer. A related bit of advice is to use depends.exe to see what functions your exe is actually hitting in the dependent DLLs. Those dependencies might be due to explicit use on your part or due to CRT implementation details that you don't explicitly invoke.
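If you don't have depends.exe handy, DUMPBIN from the Visual Studio tools gives similar information (app.exe is a placeholder name):

DUMPBIN /dependents app.exe
DUMPBIN /imports app.exe

The first command lists the DLLs the EXE links against (msvcr90.dll and msvcp90.dll should disappear once you switch to the static runtime); the second lists the individual functions imported from each of them.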

Can I use a Visual Studio 6 compiled C++ static library in Visual Studio 2008?

Is it possible to use a C++ static library (.lib) compiled using Visual Studio 6 in Visual Studio 2008?
It really depends. Does the lib expose only 'extern "C"' functions where memory is either managed by straight Win32 methods (CoTaskMemAlloc, etc) or the caller never frees memory allocated by the callee or vice-versa? Do you only rely on basic libraries that haven't changed much since VS 6? If so, you should be fine.
There are 2 basic things to watch for. Changes to global variables used by 3rd-party libraries, and changes to the structure of structs, classes, etc defined by those 3rd-party libraries. For example, the CRT memory allocator has probably changed its hidden allocation management structures between the 2 versions, so having one version of the library allocate a piece of memory and having another free it will probably cause a crash.
As another example, if you expose C++ classes through the interface and they rely on MS runtime libraries like MFC, there's a chance that the class layout has changed between VS 6 and VS 2008. That means that accessing a member/field on the class could go to the wrong thing and cause unpredictable results. You're probably hosed if the .lib uses MFC in any capacity. MFC defines and internally uses tons of globals, and any access to MFC globals by the operations in the .lib could cause failures if the MFC infrastructure has changed in the hosting environment (it has changed a lot since VS 6, BTW).
I haven't explored exactly what changes were made in the MFC headers, but I've seen unpredictable behavior between MFC/ATL-based class binaries compiled in different VS versions.
On top of those issues, there's a risk for functions like strtok() that rely on static global variables defined in the run-time libraries. I'm not sure, but I'm concerned those static variables may not get initialized properly if you use a client expecting the single-threaded CRT on a thread created on the multi-threaded CRT. Look at the documentation for _beginthread() for more info.
I don't see why not - as long as you keep to the usual CRT memory boundaries (i.e. if you allocate memory inside a library function, always free it from inside the library, by calling a function in the lib to do the freeing).
This approach works fine for DLLs compiled with all kinds of compilers; statically linked libs should be OK too.
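A small sketch of that "free it where you allocated it" convention (the Buffer_* names are invented for illustration):

// Inside the library: allocation and deallocation stay on the same side,
// so both always go through the same CRT heap.
#include <cstdlib>

extern "C" char* Buffer_Alloc(unsigned size)
{
    return static_cast<char*>(std::malloc(size));  // allocated by the library's CRT
}

extern "C" void Buffer_Free(char* p)
{
    std::free(p);                                  // freed by that same CRT
}

// Caller side: never call free()/delete on the pointer yourself; call Buffer_Free.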
Yes. There should be no issues with this at all. As gbjbaanb mentioned, you need to mind your memory, but VS2008 will still work with it. As long as you are not trying to mix CLR, (managed) code with it. I'd recommend against that if at all possible. But, if you are talking about raw C or C++ code, sure, it'll work.
What exactly are you planning on using? (What is in this library?) Have you tried it already but are having issues, or are you just checking before you waste a bunch of time trying to get something to work that just won't?
Sure it'll work.
Are you asking where in VS2008 to code the references?
If so, go to proj props -> Linker -> Input under Configuration Properties on the property pages. Look for "Additional Dependencies" and enter the .LIB there.
Then go to proj props -> Linker -> General and enter the lib's path under "Additional Library Directories".
That should do it!!
There are cases where the answer is no. When we moved from VS6 to VS2005 we had to rebuild all our libraries, as the memory model had changed and the CRT functions were different.
There were a handful of breaking changes between VC6, VS2003, VS2005 and VS2008. Visual C++ (in VS2005) dropped support for the single-threaded, statically linked CRT library. Some breaking changes are enumerated here and here. Those changes will impact your use of VC6-built libs in later versions.

Find Programming Language Used

What's the easiest way to find out what programming language an application was written in?
I would like to know whether it's VB, C++, Delphi, .NET, etc. from the program's exe file.
Try PEiD
of course if they used a packer, some unpacking will need to be done first :)
Start it up and check what run-time DLLs it uses with Process Explorer.
If that doesn't make it immediately obvious, search the web for references to those DLLs.
Most disassemblers (including Olly I think) can easily show you the text contained in an EXE or DLL, and that can also sometimes give a clue. Delphi types are often prefixed with T as in TMyClass.
If it's a small executable with no DLL references and no text you might be SOL. At that point you'd need to look for idioms of particular compilers, and it would be mostly guesswork.
There is an art to detecting what language a program was written in. It is possible, but there are no hard and fast rules. It takes a lot of experience (and it also leads to the question "Why would you want to...?"), but here are a few ideas on how to go about it.
What you're looking for is a "signature". The signature could be a certain string that is included by the compiler, a reference to an API that is common in the programming tool being used, or even a style of programming common to that tool, visible in the strings contained in the application.
In addition, there are styles to how an application is deployed: various configuration files found in the deployment directory, dlls and assemblies and even images, directories or icons.
Java applications wrapped in a self-launching executable will contain references to java libs, and will likely have certain libraries or files included in the same directory that indicate that it's java.
As indicated in other answers, a managed assembly will show certain signs as well: you can open it in Reflector etc. While it is correct that C# and VB are "interchangeable" once compiled, it is not true that they are identical. If you use Reflector to disassemble VB code, you will quite often see that the assembly references the Microsoft.VisualBasic.dll assembly. You'll be able to tell Mono applications apart because they will most likely contain references to the Mono assemblies.
Many compilers assemble and link code in certain ways and leave footprints behind. For example, examining a Windows executable using the "Strings" tab in Process Explorer, you'll see a lot of strings. Using these you may be able to determine programming styles, methods called, and error or trace messages within the exe.
An example is that compilers use different mechanisms for localization: Microsoft stores localized strings in XML files or resource files. Other compilers will use a different tactic.
Another example is C++ name mangling. The CodeWarrior compiler uses a different algorithm to mangle the names of the member variables and functions of a class than Visual Studio does.
I suppose you could write a book on the subject of accurately determining the lineage of any executable. This subject would probably be called "programming archeology".
You could try using Depends to see what runtime dependencies it has, which might give some clues.
The easiest way is to ask the developer of the program. It does not require any knowledge or utility programs.
Determine Delphi Application
Use eda_preview270.exe (from here) or some other spy tool and check the window class names. If they read like TButton or TfrmBlubb, it's a VCL app. If there is an "Afx" in them, it's probably MFC.
Compiled languages (by this I mean no scripting languages, or Java, .NET, etc.) are compiled into CPU assembly instructions, which is essentially a one-way conversion. It is not usually possible to determine which language a program was written in. However, using a dependency walker, you could potentially determine which runtime library the program was loading (if any) and therefore determine which language it used (e.g. MS Visual C++ 9 uses msvcr90.dll).
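For example, a quick way to see those runtime dependencies without a GUI tool (some_program.exe is a placeholder name):

DUMPBIN /dependents some_program.exe

If the output lists msvcr90.dll, the program was most likely built with Visual C++ 2008; other msvcr*.dll versions map to other Visual C++ releases, and the absence of any msvcr*.dll usually means a statically linked runtime or a different compiler.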
You can check whether it is a .NET assembly or not by trying to open it with the ildasm.exe tool.
PE Detective works best for me.
In general, you can't.
If you can load it into Reflector, you know it is a managed assembly.
That's a good question. There isn't any general way to tell, but I bet most compilers and libraries leave a mark in the resulting EXE file. If you wanted to spend a lot of time on it, you could gather a bunch of EXEs written in known languages and scan for common strings. I would imagine you'd find some.
Dependency Walker, which someone else mentioned, would be a good way to look for telltale dependencies, like versions of MSVCRT, etc.
I'd try running the .exe through a 'strings' program to get assorted hints.
If I remember correctly, PE Explorer Disassembler gives some information about the compiler that created a given non-.NET, non-Java binary; for .NET, use Reflector or the ILDAsm tool.
The easiest way that I found (at least in computer games) was to look in the "redist" folder nested within the game's main folder. It might be obvious to some of you who are more experienced in programming, but the specific purpose of the installer in this folder is to allow the setup.exe file to automatically install the prerequisites for the game itself.
For example:
In Empire: Total War, there is an installer called "vcredist_x86-sp1.exe". This indicates that the game/program was written in Microsoft's Visual C++ 2005, in the .NET Framework (usually).
In fact, if you open the MSI/EXE, the installer should immediately indicate the language it's written in and which version.
The reason I'm familiar is because I code in C# and VB in the .NET Framework and we auto-install the prerequisites for our business app.
Hope this helps!
