Can I have VS 2013 ignore certain syntax errors until I build?

For example, I could pretty much always do without the "not all code paths return a value" error squiggly. This generally means I'm in the process of writing the method. I don't need to be told it doesn't return a value. If I forget to add a return value, VS can tell me that when it builds. But I never want that while I'm working; it's annoying and usually tells me something I already know.
For another example, I could do without the errors that show up while I'm in the process of writing a switch statement, such as the one that tells me I don't have a break statement yet, as I'm still working on whatever case.
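To make the two cases concrete, here's a contrived sketch of code in exactly that half-written state, with the errors VS flags while I'm still typing:

    using System;

    class Scratch
    {
        // Mid-edit: no return statement yet, so the editor immediately
        // squiggles CS0161 ("not all code paths return a value").
        int Calculate(int input)
        {
            var result = input * 2;
        }

        // Mid-edit: the first case has no break yet, so CS0163
        // ("control cannot fall through from one case label to another")
        // shows up while I'm still working on it.
        void Describe(int value)
        {
            switch (value)
            {
                case 0:
                    Console.WriteLine("zero");
                case 1:
                    Console.WriteLine("one");
                    break;
            }
        }
    }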

You can try disabling "Show live semantic errors" (in VS 2013 this should live under Tools > Options > Text Editor > C# > Advanced):
http://www.devcurry.com/2011/01/disable-squiggly-or-wavy-lines-in.html

Related

How to view the result of an expression in MSVS2013?

I remember seeing somewhere that you can tell the debugger which DLL to resolve a symbol's address from, so that the symbol can be used in the watch window. I can't for the life of me remember where I saw this. The best I can come up with is Format Specifiers in C++.
The reason I want this is so that I can see the visibility status of a window, and MSVS keeps saying that identifier "IsWindowVisible" is undefined.
I was trying to use something like the following in the watch window:
::IsWindowVisible(m_hWnd),user32.dll
Using:
this->IsWindowVisible()
results in "Function CWnd::IsWindowVisible has no address, possibly due to compiler optimizations.", which is why I'm trying to use the Win32 call. Ideas?
http://msdn.microsoft.com/en-nz/library/y2t7ahxk.aspx
Haven't tried it, but it seems to me that IsWindowVisible(m_hWnd) should work, or maybe IsWindowVisible(this->m_hWnd).
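If that still comes back as undefined, the debugger's context operator might help (equally untested): you can qualify the symbol with the module that exports it, user32.dll in this case.

    IsWindowVisible(m_hWnd)
    {,,user32.dll}IsWindowVisible(m_hWnd)

The braces take {function,source,module}; leaving the first two slots empty and naming only the module tells the expression evaluator where to resolve IsWindowVisible.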

Can I debug dynamically added Ruby method?

I want to store brief snippets of code in the database (following a standard signature) and "inject" them at runtime. One way would be using eval(my_code). Is there some way to debug the injected code using breakpoints, etc? (I'm using Rubymine)
I'm aware I can just log to console, etc, but I'd prefer IDE-style debugging if possible.
Hm. Let's analyze your question. Firstly, it does not seem to have anything to do with databases: you simply store a code block in source form somewhere. It can be a file, or a database. Secondly, you don't want IDE-style "debugging", but TDD-style. (But don't concentrate on that question now.)
What you need is assertions about your code. That is, you need to describe what output your code should produce for some example inputs. Then run the code and see whether its behaviour matches the expectations. Furthermore, if you are not sure about the source of your snippets, run them in a sandbox (with $SAFE = 4). If your code fails the expectations, raise nice errors (TypeError, or even better, your own custom exception), which you can then rescue and, for example, fall back to some default code snippet...
... but maybe I'm not actually answering the question you are asking. If that's the case, then let me share a link to the sourcify gem, which lets you recover the source, so that you can insert a breakpoint by saying e.g. require 'rdebug' in the middle of the code, or even convert the code to s-expressions. That's all I know.
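As a rough Ruby sketch of the assertion idea (the snippet string and the expectations are made up for illustration):

    # A snippet as it might come out of the database (made-up example).
    snippet = "->(x) { x * 2 }"

    # eval turns the stored source into a callable object.
    doubler = eval(snippet)

    # TDD-style expectations instead of breakpoints: describe the outputs
    # the snippet must produce for sample inputs, and raise a nice error
    # when it misbehaves.
    raise TypeError, "snippet must return an Integer" unless doubler.call(3).is_a?(Integer)
    raise "expected 6, got #{doubler.call(3)}" unless doubler.call(3) == 6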

How stringent should I be with Code Analysis compliance in Visual Studio?

After playing with Code Analysis on a small project I am working on, I am wondering just how strict I should be when resolving the warnings it raises.
I know I can suppress warnings, but to me, suppressing a warning is to some extent a cop-out (no pun intended... "FxCop").
Example warning:
Do not raise exceptions in unexpected locations: 'CustomObject.Equals(object)' creates an exception of type 'ArgumentException'. Exceptions should not be raised in this type of method. If this exception instance might be raised, change this method's logic so it no longer raises an exception.
Reason for throwing this...
CustomObject.Equals(object) might try to compare a CustomObject to a FooBarObject, which aren't even the same type. In this instance, should I throw an exception, or just return false?
In general, should I be really anal (for want of a better word) in making my code absolutely compliant, or will I come across situations where warning suppression will become necessary?
FxCop warnings are just warnings, they don't flag invalid code. That's the job of the compiler. The rules FxCop uses were collected from years of experience writing .NET code. They represent "best practices" and in general are there to remind you of unintended consequences and the more obscure parts of .NET programming, like CAS.
Always refer back to the documentation to see why the rule exists. For CA1065 you'll see:
An Equals method should return true or false instead of throwing an exception. For example, if Equals is passed two mismatched types it should just return false instead of throwing an ArgumentException.
Which exactly matches your usage, so you'll have no trouble adopting the advice. Unfortunately the documentation is a bit short on the exact reason the rule was created; it really doesn't go beyond the "don't throw in unexpected places" guidance. The unintended consequence here is that another programmer using your class won't realize that a try/catch would be needed if he doesn't want the code to fail. Feel free to put a Debug.Assert() in your Equals method. There are plenty of cases where you'll want to ignore the advice; CA2000, for example, is particularly prone to false warnings. Apply the [SuppressMessage] attribute where necessary so you don't have to look at the warning again.
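To make that concrete, here is a minimal sketch of a compliant Equals with the Debug.Assert idea folded in (CustomObject and its Id property are just stand-ins):

    using System.Diagnostics;

    public class CustomObject
    {
        public int Id { get; set; }

        public override bool Equals(object obj)
        {
            var other = obj as CustomObject;
            if (other == null)
            {
                // Flag the suspicious comparison in Debug builds, but
                // return false instead of throwing, as CA1065 asks.
                Debug.Assert(obj == null, "Equals called with a mismatched type");
                return false;
            }
            return Id == other.Id;
        }

        public override int GetHashCode()
        {
            return Id.GetHashCode();
        }
    }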

Find diff in properties set when debugging

I was just wondering if there is a tool like this, or something within Visual Studio 2010 that I just haven't come across before. I have a situation that I'm sure many other people have run into: I debug into a method one time and it works; another time it fails. I know (on the front end) what needs to happen for it to fail and what needs to happen for it to pass, but I can't seem to find anything on the back end that would show me the differences in all the properties that get passed through that method for each use case.
Is there a tool that can analyze the objects I pass through this method on each run and then show me a diff of their properties? Which ones are set/aren't set, which ones are different, etc.?
I normally wouldn't mind just blowing up the watches on each monitor and cruising through them, but we have a LOT of properties on these specific objects.
Thanks guys.
Would something like Mole 2010 work? I know you can basically do a diff on objects to compare their properties, but I'm not sure if this would work in your situation with method parameters.
Maybe Pex will help you? It analyzes your code and creates input values based on identified code paths and boundary cases.
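If neither tool pans out, a low-tech fallback is to dump each object's public properties to a file during the passing run and the failing run, then compare the two files with any text diff. A rough sketch (PropertyDumper and DumpProperties are made-up names):

    using System;
    using System.IO;
    using System.Linq;

    static class PropertyDumper
    {
        // Writes "Name = value" lines, sorted by name, so two runs can
        // be compared with an ordinary diff tool.
        public static void DumpProperties(object obj, string path)
        {
            var lines = obj.GetType()
                           .GetProperties()
                           .Where(p => p.GetIndexParameters().Length == 0)
                           .OrderBy(p => p.Name)
                           .Select(p => p.Name + " = " + (p.GetValue(obj, null) ?? "<null>"));
            File.WriteAllLines(path, lines);
        }
    }

Call it at the top of the method in both runs (with different file paths) and diff the output.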

Coverity Prevent: how to handle "checked return" warning when I deliberately don't check the return?

As the title suggests: in, say, 85% of situations I'd like to check the return code of foo(), but sometimes I really don't care about it, and that raises a Coverity warning.
What's the best way to deal with this problem?
Changing Coverity settings doesn't count. :)
The correct way to suppress a CHECKED_RETURN defect is to cast the return value you don't care about to void. This has the additional advantage of making it clear to anyone reading the code that you don't care about the return value, rather than that you forgot to check it.
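A minimal C sketch of the idiom (foo is a placeholder for whatever function you're calling):

    int foo(void);   /* placeholder: a function whose result is usually checked */

    void sometimes_ignore(void)
    {
        /* The explicit cast documents that dropping the return value is
           deliberate, and silences Coverity's CHECKED_RETURN defect. */
        (void) foo();
    }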
