Why is LogSeverity #defined as DWORD in the Windows SetupAPI?

I see this #define in the Windows Kits 10 SetupAPI.h at line 4871:
#define LogSeverity DWORD
So I can't do something like:
typedef int LogSeverity;
This effectively makes LogSeverity behave like a reserved keyword, unless I want to redefine DWORD. Am I missing something here?

The Windows API has a huge number of such identifiers; what I actually find surprising about this case is that it's a #define and not a typedef. Because it is a define, you do at least have the option of #undefing it after the include in question, although I would probably just accept that the identifier is already taken and choose another name for whatever I was creating.
'Why' questions of this form generally don't have answers, only workarounds.
#include <setupapi.h>
#undef LogSeverity
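As a minimal sketch of that workaround, assuming you want to reclaim the name for your own type (windows.h comes first because setupapi.h depends on it):
#include <windows.h>
#include <setupapi.h>
#undef LogSeverity        // the SetupAPI declarations have already been expanded to DWORD by this point

typedef int LogSeverity;  // no longer expands to "typedef int DWORD;"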

Related

Error when including winuser.h. It defines ChangeMenu to ChangeMenuW or ChangeMenuA

I'm working on a Qt app on Windows. I include QVBoxLayout in my source file only, and this causes errors because a macro from winuser.h overwrites my method name.
foo.hpp
class foo
{
public:
    void ChangeMenu();
};
foo.cpp
#include "foo.hpp"
#include "QVBoxLayout" // <--- this indirectly includes winuser.h
void foo::ChangeMenu() {}
Now what happens is that winuser.h has this macro:
#ifdef UNICODE
#define ChangeMenu ChangeMenuW
#else
#define ChangeMenu ChangeMenuA
#endif // !UNICODE
This changes my function definition to ChangeMenuW but my declaration is still ChangeMenu.
How should I solve this? How can winuser.h define such a "normal" name as a macro?
Version of winuser.h is "windows kits\10\include\10.0.16299.0"
Pretty much any Windows API function that deals with strings is actually a macro that resolves to an A or W version. There's no real way around it; you can either:
avoid including windows.h, but as you noticed, it creeps through;
brutally #undef the macro before defining/using your function; this is a fit punishment for hoarding such normal, non-macro-looking identifiers, but it is tedious, and some other code may actually need the Win32 function;
just accept it as a sad fact of life and avoid all the relevant Win32 API names; if you use Qt and follow its naming convention, this should be easy, as Qt functions use lowerCamelCase (as opposed to Win32 UpperCamelCase);
include windows.h explicitly straight in your header (possibly under an #ifdef _WIN32), as sketched below; this makes sure that your identifier gets replaced by the macro in all instances, so everything will work fine even though the compiler will actually compile a function with a different name; suitable for standalone projects, not suitable for libraries. (Thanks @Jonathan Potter for suggesting this.)
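For illustration, a minimal sketch of that last option, assuming a Windows-only build apart from the #ifdef guard (the class and method names are taken from the question):
// foo.hpp
#ifdef _WIN32
#include <windows.h>      // defines ChangeMenu -> ChangeMenuW/ChangeMenuA before the class is seen
#endif

class foo
{
public:
    void ChangeMenu();    // the macro rewrites this declaration too, so it stays consistent
};

// foo.cpp
#include "foo.hpp"
#include <QVBoxLayout>

void foo::ChangeMenu() {} // expands to the same ChangeMenuW/ChangeMenuA as the declaration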
You could also simply not worry about the issue. Even though your method name is the same as the Windows API name, the two will not get mixed up, as long as the UNICODE setting is the same where you define the method and where you call it (so the macro expands identically in both places). If you call ChangeMenu() directly you will call the Win32 API, whereas
foo f;
f.ChangeMenu();
or, if the method is static,
foo::ChangeMenu();
will call your own method.
And if you want to temporarily disable the Win32 macro:
#ifdef ChangeMenu
#undef ChangeMenu
// place the code that defines/calls your own ChangeMenu() here, then restore the macro:
#ifdef UNICODE
#define ChangeMenu ChangeMenuW
#else
#define ChangeMenu ChangeMenuA
#endif // !UNICODE
#endif
(It looks very tedious.)

Standard sizeof macro for primitive types

Are there any standard macros that can be used to identify the size of a primitive type at compile time? Similar to the ones in GCC:
__SIZEOF_INT__
__SIZEOF_LONG__
__SIZEOF_LONG_LONG__
__SIZEOF_SHORT__
__SIZEOF_POINTER__
__SIZEOF_FLOAT__
__SIZEOF_DOUBLE__
__SIZEOF_LONG_DOUBLE__
__SIZEOF_SIZE_T__
I remember seeing something similar somewhere, but for the life of me I can't find or remember their names anymore. The one I'm mostly interested in is the long type.
There are no standard macro definitions for sizes of primitive types.
In boost/atomic there are macros giving you the sizes of primitive types; they are computed using boost/cstdint.hpp among other sources. An example would look like this:
#include <iostream>
#include <boost/atomic.hpp>

int main() {
    std::cout << BOOST_ATOMIC_DETAIL_SIZEOF_LONG;
}
reference:
http://www.boost.org/doc/libs/1_60_0/boost/atomic/detail/int_sizes.hpp
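If the goal is just to branch on the size of long at preprocessing time, a portable sketch using only the standard <climits> macros is shown below; MY_SIZEOF_LONG is a made-up name, and mapping the value range to a byte size assumes the usual representation without padding bits:
#include <climits>

#if LONG_MAX == 2147483647L
#define MY_SIZEOF_LONG 4   /* hypothetical helper macro, not a standard one */
#elif LONG_MAX == 9223372036854775807L
#define MY_SIZEOF_LONG 8
#else
#error "unexpected range of long"
#endif
In C++11 and later you can also check sizeof(long) directly at compile time with static_assert, just not inside an #if.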

Differentiate between TCHAR and _TCHAR

What are the differences between the two symbols TCHAR and _TCHAR defined in the Windows header tchar.h? Explain with examples. Briefly describe scenarios where you would use TCHAR as opposed to _TCHAR in your code. (10 marks)
In addition to what @RussC said, TCHAR is used by the Win32 API and is based on the UNICODE define, whereas _TCHAR is used by the C runtime and is based on the _UNICODE define instead. UNICODE and _UNICODE are usually defined/omitted together, making TCHAR and _TCHAR interchangeable, but that is not a requirement. They are semantically separated for use by different frameworks.
Found your answer over here:
MSDN Forums >> Visual Studio Developer Center >> TCHAR vs _TCHAR
TCHAR and _TCHAR are identical, although since TCHAR doesn't have a leading underscore, Microsoft aren't allowed to reserve it as a keyword (imagine if you had a variable called TCHAR. Think what would happen). Hence TCHAR will not be #defined when Language Extensions are disabled (/Za).
TCHAR is defined in winnt.h (which you'll get when you #include <windows.h>), and also in tchar.h under /Ze.
_TCHAR is available only in tchar.h (which also #defines _TSCHAR and _TUCHAR). Those are unsigned/signed variants of the normal TCHAR data type.
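A simplified sketch of how the two mappings work (the real headers are more involved, and this sketch uses plain char/wchar_t where the SDK uses its own CHAR/WCHAR typedefs; the point is the split between UNICODE and _UNICODE):
/* winnt.h (Win32 API) -- controlled by UNICODE */
#ifdef UNICODE
typedef wchar_t TCHAR;
#else
typedef char TCHAR;
#endif

/* tchar.h (C runtime) -- controlled by _UNICODE */
#ifdef _UNICODE
typedef wchar_t _TCHAR;
#else
typedef char _TCHAR;
#endif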

Is timespec not defined in Windows?

It seems weird to me that this answer is hard to find. I've included time.h and ctime, but vc9 is still complaining about an undefined type 'timespec'. I've searched here, MSDN, and the web (even with the exact compiler error), but I can't find the answer... maybe it's just lost in the noise.
Here's the exact error:
error C2027: use of undefined type 'timespec'
Thanks
struct timespec comes from POSIX and is typically found on Unix-like systems, but not on Windows.
If you are trying to compile code with a *nix-y provenance under Windows, then you might be better off with something like Cygwin and GCC, which gives you a *nix-like environment.
Try including pthread.h. That is where timespec is defined in the MinGW32 compiler that comes packaged with Code::Blocks.
Why? I don't know. Just...
#include <time.h>
#ifndef _TIMESPEC_DEFINED   /* guard macro name varies by toolchain; adjust to yours */
#include <pthread.h>
#endif
Solution 2...
You don't need a predefined timespec! You can define it yourself, which is what MinGW does!
struct timespec { /* details */ };
You don't even have to call it timespec. You can declare a struct Fun, or a Jav::timespec in your own namespace, with the same layout and cast a pointer to it to timespec*. Doing this makes your code more portable for the cases where timespec is in fact already defined.
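For reference, a minimal sketch of the layout you would reproduce; the two fields below are the ones POSIX specifies for struct timespec (only define this yourself if your toolchain really does not provide it, or you will get a redefinition error):
#include <time.h>     /* for time_t */

struct timespec {
    time_t tv_sec;    /* whole seconds */
    long   tv_nsec;   /* nanoseconds, 0 to 999999999 */
};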

macro "max" requires 2 arguments, but only 1 given

#include <cmath>
#include <limits>

template <class T>
struct scalar_log_minimum {
public:
    typedef T value_type;
    typedef T result_type;

    static result_type initial_value() {
        return std::log(std::numeric_limits<result_type>::max());
    }

    static void update(result_type& t, const value_type& x) {
        if ( (x > 0) && (std::log(x) < t) ) t = std::log(x);
    }
};
I got the following error while trying to compile the above:
functional_ext.hpp:55:59: macro "max" requires 2 arguments, but only 1 given
max is not a macro, right? Then what is this error? BTW, I am using Visual Studio 2005.
Also, what does 55:59 mean --- 55 is the line number, but what is the 59?
I find the many #defines that you encounter once you include windows.h very disturbing (not only max and min; I also had problems with other generic words, like Rectangle if I'm not mistaken). Therefore, I have developed the habit of including windows.h only when absolutely necessary, and never in header files. This reduces the pain to a small number of C++ files that are platform-specific.
Unfortunately some boost libraries (I believe thread and asio) do include windows.h in their headers, and I still run into this kind of silly problems from time to time.
My solution for the remainder of the situations where this causes problems is to #undef the problematic symbols after the inclusion of the header files.
You're including a header file somewhere that #defines max as a macro. The best solution would be to figure out where it's being defined, and inhibit it from being defined if possible. Alternatively, you could just #undef it:
#include <evil_header_which_defines_max.h>
#undef max
As others have noted, including windows.h is probably your problem. Microsoft provides a means to "turn off" parts of windows.h with preprocessor symbols. You can define these symbols as part of your build or directly in code.
Using preprocessor symbols to conditionally skip sections of windows.h may or may not be considered elegant but in the general case it is an easier, more general and more scalable solution than #undef.
Here's how to skip defining min or max as macros:
#define NOMINMAX
#include <windows.h>
Note that many include files will, at some point, include windows.h. In such cases setting up your defines at a more global level may be more convenient.
If you search through windows.h, you can find a bunch of other preprocessor symbols (e.g., NOOPENFILE, NOKANJI, NOKERNEL and many others) that can often be useful.
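As a sketch of how this typically looks in practice (NOMINMAX is the symbol that matters for min/max; WIN32_LEAN_AND_MEAN is another real SDK symbol, included here only as a further example of the same mechanism):
#define WIN32_LEAN_AND_MEAN   // skip rarely used parts of windows.h
#define NOMINMAX              // do not define the min/max macros
#include <windows.h>

#include <algorithm>
#include <limits>

int main()
{
    // Neither of these is mangled by a macro now.
    int    m = std::max(1, 2);
    double d = std::numeric_limits<double>::max();
    return (m == 2 && d > 0.0) ? 0 : 1;
}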
It's a macro called max that gets in the way, as Adam explained. Another solution (more of a "hotfix") is to put parentheses around the function name, which prevents it from being seen as a function-like macro invocation:
return std::log((std::numeric_limits<result_type>::max)());
