I have a Visual C++ project I'd like to debug. However, several of its functions are generated by macro expansion (e.g., a set##Name function for a particular property), so while debugging I can't follow the execution flow inside these generated functions.
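For illustration, such a macro might look something like this (all names here are hypothetical):

#define DECLARE_SETTER(Type, Name) \
    void set##Name(Type value) { m_##Name = value; }

class Widget {
public:
    DECLARE_SETTER(int, Width)  // expands to: void setWidth(int value) { m_Width = value; }
private:
    int m_Width;
};

Since the expanded setWidth has no source lines of its own, the debugger has nothing to step through.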
Do I have to use the /P flag and then debug the preprocessed code?
You would have to preprocess the code using the /P flag in some other project (or on the command line, if you fancy spelling out all the include and library folders), and then compile this preprocessed code instead of the source file in your real project. Then you can debug through it.
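For example, assuming the include paths are spelled out, something like

cl /nologo /P /I ..\include foo.cpp

writes the preprocessed output to foo.i in the current directory, and that .i file is what you would compile and step through in the real project.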
That said, while you're at it, can't you eliminate the macros? With const, inline, and templates, I rarely feel the need to resort to macros, and when I do, it's usually for very small, isolated pieces of code. These are either too trivial to need debugging, or I manually replace one instance of the macro with the code it generates and debug that. (However, this might have happened to me thrice in the last decade.)
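For instance (a generic example, not from this project), a function-like macro such as

#define SQUARE(x) ((x) * (x))

can usually be replaced by an inline template that the debugger can step into:

template <typename T>
inline T square(T x) { return x * x; }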
I have a 20+ year old .dll, written in C, that none of my colleagues want to touch. With good reason: it uses macros, macro constants, and casts everywhere, which leaves the symbol table quite lean.
Unfortunately, I sometimes have to debug this code, and it drives me crazy that it doesn't use something as simple as enums, which would put symbols in the .pdb file and make debugging just that little bit easier.
I would love to convert some of the #defines to enums, even without changing the variable types yet, but there is a genuine fear that doing so could change the generated code and cause performance problems.
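For instance, the kind of conversion in question (constant names are hypothetical):

#define STATE_IDLE 0
#define STATE_BUSY 1
#define STATE_DONE 2

would become

enum State { STATE_IDLE = 0, STATE_BUSY = 1, STATE_DONE = 2 };

The values stay identical, but the names now end up in the .pdb.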
I need to show definitively that no compiled code changes will occur. However, the .dll appears to change significantly in a 64-bit build. I looked at the disassembly of one function and it appears unaffected, but I need to show exactly what is and is not changing in the binary, both to alleviate my colleagues' fears as well as some of my own trepidation, and to explain the bewildering question of why any changes would propagate into the .dll at all, given that the .dlls are the same size.
Does anyone have any idea how I could do this? I've tried using dumpbin, but I'm not that familiar with it and am getting mixed results, probably because I don't understand the output as well as I'd like.
The way I did this was as follows:
Turn on the /FAs switch (assembly listing with source code) for the project.
Compile the project.
Move the object file directory aside (Release => Release-without-enums).
Change the #defines to enums.
Compile the project again.
Move the object file directory aside again (Release => Release-with-enums).
From a bash command line, run the following from the parent of the Release directory:
for a in Release-without-enums/*.asm; do
    git diff --no-index --word-diff --color -U10000 "$a" "Release-with-enums/$(basename "$a")";
done | less -R
The -U10000 is just there so that the entire contents of each file are shown; remove it if you only want to see the changes.
This will list all of the modifications in the generated assembly code.
The changes found were as follows:
1. Symbol addresses moved around for no apparent reason.
2. References to __FILE__ no longer expanded to a full path when using enums. Why switching to enums would drop the full path is a mystery, as the compiler flags had not changed.
3. Some symbols were renamed, for no apparent reason.
Edit
Items 2 and 3 turned out to be caused by a corrupted .pdb, possibly because the files are used in multiple projects in the same solution. Rebuilding the entire solution fixed both problems.
Let's say I have a file with a lot of preprocessor macros that generate loads of code. Normally, when debugging such a file, I can't step into the macros as if they were functions, because the debugger has no line-number information for them. On the other hand, it is possible to generate a preprocessed file by passing the /P flag to the compiler, which results in a file that contains all the generated code and no macros.
Is it possible to make Visual Studio use preprocessed file for debugging?
One solution (not very convenient, though) is to copy the preprocessed file back over the source file and compile it again. One must remember to generate the preprocessed file without #line directives (pass /EP together with /P) and to keep the original source code somewhere.
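A sketch of that round trip from the command line (foo.cpp is a placeholder name; include paths may be needed):

cl /P /EP foo.cpp
copy foo.cpp foo.cpp.bak
copy /Y foo.i foo.cpp

Here /EP suppresses the #line directives that /P alone would emit; rebuild, debug, and restore foo.cpp from the backup when done.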
I've inherited some C99 code that I'm planning to reuse in a C++-centric solution. Unfortunately, even Microsoft's latest compiler has virtually no support for non-trivial C99 features.
The code in question has been tested to death and I'd rather not go through the trouble of rewriting it in C++. This means that in order for me to reuse the code verbatim I'll have to rely on a conforming 3rd-party compiler.
After looking around, it appears that the nicest way for me to integrate this code is by adding a "Makefile Project" to my solution. There is only one problem: it seems it is now my responsibility to keep the "Build Command Line" property synchronized with the files I add to the project through Visual Studio.
At first glance, I couldn't find a way to get a list of files in my project through the usual Visual Studio $()-style macros. I could always write a shell script that would enumerate *.c files in my source tree and pass their paths to the 3rd-party compiler. However, I kind of expected that Visual Studio would do at least that part of the work for me since it already has this information in the relevant *.vc[x]proj file.
It is very unlikely that I'll need to add any new source files to this project, but still, manual synchronization (i.e., without a script) of this sort seems rather fragile to me.
What are my options besides writing a helper script?
From Hans Passant:

"Makefile project" means what it says: there needs to be another 'agent' that's responsible for the dependencies, like a make file. Rule files can help you select another build tool, but that's kinda broken right now in VS2010. Leverage the original tool that built this C99 code and run it from the makefile project.
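In concrete terms, that means pointing the Makefile project's build properties at whatever drives the original toolchain, for example (nmake and the legacy.mak name are assumptions; the clean target must exist in that makefile):

Build Command Line: nmake /f legacy.mak
Rebuild All Command Line: nmake /f legacy.mak /A
Clean Command Line: nmake /f legacy.mak clean

The makefile, not the .vc[x]proj, then owns the file list, so nothing needs to be kept synchronized by hand.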
I'm conceptually designing a plug-in I'd love to have here. I'd want to be able to tag lines in my code (much the way breakpoints are added) and then get a trace log of when execution runs through them. Rather than set breakpoints (because they don't work outside the debugger), I'd rather the compiler insert the extra logging into the AST.
The main point would be to compare different runs of a program: it crashes if I do A but not if I do B, and since most of the code path should be the same, where does it diverge?
Right now I'm doing this with file I/O and a diff tool; it works, but it's a bit clumsy.
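A minimal sketch of that manual approach, using a hypothetical file/line trace macro:

#include <cstdio>

// Log the source location each time execution passes a tagged line.
#define TRACE_TAG() std::fprintf(stderr, "%s:%d\n", __FILE__, __LINE__)

void suspect(bool flag) {
    TRACE_TAG();        // tagged: hit on every run
    if (flag) {
        TRACE_TAG();    // tagged: hit only on the A path
    }
}

Redirecting stderr to a different file per run and diffing the two logs gives the divergence point.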
I guess the question is: Could this be done and has something like this been done?
I don't know of anything that exactly fits your description. However...
For debugging-only use, Visual Studio 2010 has "tracepoints". These are added in the same way as breakpoints, but rather than stopping the program, they output some text to the debug output. Because they're set in the debugger, they don't affect your source code at all.
If you want to trace activity in a release build, just add System.Diagnostics.Trace.WriteLine() calls to your code. These can be controlled with TraceSwitch objects, so they can be disabled by default and only turned on when you need extra information to diagnose a problem. Unlike Debug.WriteLine() calls, they are included (by default) in release builds as well as debug builds. Note that these trace calls incur a small overhead even when the switch is disabled, so avoid using them in performance-critical regions of your code.
I would like to use Visual Studio 2008 to the greatest extent possible while effectively compiling/linking/building code as if all these build processes were being done by the tools provided with MASM 6.11. The exact version of MASM does not matter, so long as it's within the 6.x range, as that is what my college uses to teach 16-bit assembly.
I have done some research on the subject and have come to the conclusion that there are several options:
1. Reconfigure VS to call the MASM 6.11 executables with the same flags they would natively use.
2. Create intermediary batch file(s) for VS to call, which in turn invoke the proper commands for MASM's assembler, linker, etc.
3. Reconfigure VS's built-in build tools/rules (assembler, linker, etc.) to provide an environment identical to the one used by MASM 6.11.
Option (2) came up when I realized that the options available in VS's "External Tools" interface may be insufficient to invoke MASM's build tools correctly; a batch file to translate VS's strict method of passing arguments could help, since much of what I learned about getting this working came from manually calling ML.exe, LINK.exe, etc. from the command prompt.
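For what it's worth, such an intermediary batch file might look roughly like this (the installation path, flags, and the 16-bit LINK invocation are assumptions to adapt to the actual setup):

@echo off
rem %1 is the .asm file passed in by Visual Studio
set MASMDIR=C:\MASM611
"%MASMDIR%\BIN\ML.EXE" /c /Zi "%~1"
"%MASMDIR%\BIN\LINK.EXE" /CO "%~n1.obj";

ML /c assembles without linking and /Zi adds CodeView debug information; the trailing semicolon tells the old 16-bit LINK to accept defaults for its remaining prompts.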
Below are several links that may prove useful in answering my question. Please keep in mind that I have read them all and none are the actual solution. I can only hope my specifying MASM 6.11 doesn't prevent anyone from contributing a perhaps more generalized answer.
A method similar to Option (2), though the users on the thread are not contactable:
http://www.codeguru.com/forum/archive/index.php/t-284051.html
(also, I have my doubts about the necessity of an intermediary batch file)
An out-of-date explanation related to my question:
http://www.cs.fiu.edu/~downeyt/cop3402/masmaul.html
Probably the closest thing I've found to a definitive solution, but it refers to a suite of tools other than MASM, and also uses a batch file:
http://www.kipirvine.com/asm/gettingStarted/index.htm#16-bit
I apologize if my terminology for the tools used at each step of the code-to-exe process is off, but since I'm trying to reproduce the entire sequence between finishing the code and generating an executable, I don't think it matters much.
There is a MASM rules file located at the following path (on a 32-bit system, remove " (x86)"):
C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\VCProjectDefaults\masm.rules
Copy that file to your project directory, and add it to the Custom Build Rules for your project. Then "Modify Rule File...", select the MASM build rule and "Modify Build Rule...".
Add a property:
User property type: String
Default value: *.inc
Description: Add additional MASM file dependencies.
Display name: Additional Dependencies
Is read only: False
Name: AdditionalDependencies
Property page name: General
Switch: [value]
Set the Additional Dependencies value to [AdditionalDependencies]. The build should now automatically detect changes to *.inc, and you can edit the properties for an individual asm file to specify others.
You can create a makefile project. In Visual Studio, under File / New / Project, choose Visual C++ / Makefile project.
This allows you to run an arbitrary command to build your project. It doesn't have to be C/C++. It doesn't even have to be a traditional NMake makefile. I've used it to compile a driver using a batch file, and using a NAnt script.
It should be fairly easy to get it to run the MASM 6.x toolchain.
I would suggest defining Custom Build Rules depending on the file extension.
(Visual Studio 2008, at least in the Professional Edition, can generate .rules files, which can be distributed.) There you can define custom build tools for .asm files. With this approach, you should be able to leave the linker step as is.
Way back, we used MASM32 as the IDE to help students learn assembly. You could check its batch files to see what they do to assemble and link.
Instead of batch files, why not use a custom build step defined on the file?
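For example, a per-file custom build step along these lines would work (the MASM path is an assumption; $(InputPath), $(InputName), and $(IntDir) are standard VS2008 project macros):

Command Line: C:\MASM611\BIN\ML.EXE /c /Fo"$(IntDir)\$(InputName).obj" "$(InputPath)"
Outputs: $(IntDir)\$(InputName).obj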
If you are going to use Visual Studio, couldn't you give them a skeleton project in C/C++, with the entry point of a console app calling a function that contains an empty inline assembly block, and let them fill in their work there?
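A minimal sketch of such a skeleton (32-bit builds only, since MSVC's __asm inline assembly is x86-only):

#include <stdio.h>

int compute(int a, int b)
{
    int result = 0;
    __asm {
        // students replace the body of this block with their own code
        mov eax, a
        add eax, b
        mov result, eax
    }
    return result;
}

int main(void)
{
    printf("compute(2, 3) = %d\n", compute(2, 3));
    return 0;
}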
Why don't you use Irvine's guide? Irvine's library is nice, and if you want, you can ignore it and work with the Windows procs directly. I've been searching for a guide like this, and Irvine's was the best solution.