On Win7 with a localized UI, error_code::message() returns a non-English message. As far as I can see (in Boost 1.54, for system_error_category), the function boils down to the following WinAPI call:
DWORD retval = ::FormatMessageA(
    FORMAT_MESSAGE_ALLOCATE_BUFFER |
        FORMAT_MESSAGE_FROM_SYSTEM |
        FORMAT_MESSAGE_IGNORE_INSERTS,
    NULL,
    ev,
    MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT), // Default language
    (LPSTR)&lpMsgBuf,
    0,
    NULL);
How can I get the above FormatMessage call to return an English message? I tried setting the locale, both with the standard library functions and with SetThreadLocale, but it didn't help.
Update: just to clarify: essentially, my question is how to programmatically "override" the user's default language, and why setting the locale is not enough.
I searched all over the internet for a solution and finally found this.
Basically, you should call SetThreadUILanguage in your main/WinMain.
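For illustration, a minimal sketch of that call, assuming US English is the language you want (SetThreadUILanguage takes a LANGID and is available from Vista onwards):

#include <windows.h>

int main()
{
    // Make this thread prefer US English when message resources are looked up.
    SetThreadUILanguage(MAKELANGID(LANG_ENGLISH, SUBLANG_ENGLISH_US));

    // From here on, FormatMessage (and therefore error_code::message())
    // should return English text wherever an English message table exists.
    return 0;
}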
At a guess, you'll need to specify English for dwLanguageId instead of the default language. E.g.:
MAKELANGID(LANG_ENGLISH, SUBLANG_DEFAULT)
or, if you want specifically US English:
MAKELANGID(LANG_ENGLISH, SUBLANG_ENGLISH_US)
Note that this will fail if a message in the specified language is not present. In that case you may want to handle ERROR_RESOURCE_LANG_NOT_FOUND and call it again with dwLanguageId = 0.
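A sketch of that fallback, reusing ev and lpMsgBuf from the Boost snippet above:

LPSTR lpMsgBuf = NULL;
DWORD retval = ::FormatMessageA(
    FORMAT_MESSAGE_ALLOCATE_BUFFER |
        FORMAT_MESSAGE_FROM_SYSTEM |
        FORMAT_MESSAGE_IGNORE_INSERTS,
    NULL,
    ev,
    MAKELANGID(LANG_ENGLISH, SUBLANG_ENGLISH_US), // try English first
    (LPSTR)&lpMsgBuf,
    0,
    NULL);
if (retval == 0 && ::GetLastError() == ERROR_RESOURCE_LANG_NOT_FOUND)
{
    // No English message table for this error: let the system choose.
    retval = ::FormatMessageA(
        FORMAT_MESSAGE_ALLOCATE_BUFFER |
            FORMAT_MESSAGE_FROM_SYSTEM |
            FORMAT_MESSAGE_IGNORE_INSERTS,
        NULL,
        ev,
        0, // dwLanguageId = 0: default search order
        (LPSTR)&lpMsgBuf,
        0,
        NULL);
}
// ... use lpMsgBuf, then release it with ::LocalFree(lpMsgBuf);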
For more info, see MSDN.
Related
I am assigning to the lpszIcon member of the MSGBOXPARAMSW structure (notice the W). I want to use one of the predefined icons like IDI_APPLICATION or IDI_WARNING, but they are all ANSI (defined via MAKEINTRESOURCE). I tried doing this:
MSGBOXPARAMSW mbp = { 0 };
mbp.lpszIcon = (LPCWSTR) IDI_ERROR;
but then no icon displayed at all. So how can I use the unicode versions of the IDI_ icons?
There is no ANSI or Unicode variant of a numeric resource ID. The code you use to set lpszIcon is correct; it is idiomatic to use the MAKEINTRESOURCE macro rather than a cast, but the cast has identical meaning. Your problem lies in the other code, the code that we cannot see.
Reading between the lines, I think you are targeting ANSI or MBCS. You tried to use MAKEINTRESOURCE, but that expands to MAKEINTRESOURCEA, which is what led you to cast. You should have used MAKEINTRESOURCEW to match MSGBOXPARAMSW; that would have resolved the compilation error you encountered. You could equally have changed the project to target UNICODE.
None of that explains why the icon does not appear in the dialog, though; there has to be a problem elsewhere. If the dialog appears, the most likely explanation is that you have set hInstance to a value other than NULL.
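For reference, a minimal sketch of the fields involved, assuming the stock error icon is what you want. One fact worth stressing: MessageBoxIndirect ignores lpszIcon entirely unless dwStyle includes MB_USERICON.

#include <windows.h>

void ShowErrorBox(HWND owner)
{
    MSGBOXPARAMSW mbp = { 0 };
    mbp.cbSize      = sizeof(mbp);
    mbp.hwndOwner   = owner;
    mbp.hInstance   = NULL;                 // NULL => predefined system icon
    mbp.lpszText    = L"Something went wrong.";
    mbp.lpszCaption = L"Error";
    mbp.dwStyle     = MB_OK | MB_USERICON;  // lpszIcon is ignored without MB_USERICON
    mbp.lpszIcon    = (LPCWSTR)IDI_ERROR;   // the cast is fine; MAKEINTRESOURCEW works too
    MessageBoxIndirectW(&mbp);
}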
I'm trying to hide some values in the registry (such as serial numbers) with C++/Windows, so I've been looking at this article: http://www.codeproject.com/KB/system/NtRegistry.aspx
which says:
How is this possible? The answer is that a name which is counted as a Unicode string can explicitly include NULL characters (0) as part of the name. For example, "Key\0". To include the NULL at the end, the length of the Unicode string is specified as 4. There is absolutely no way to specify this name using the Win32 API, since if "Key\0" is passed as a name, the API will determine that the name is "Key" (3 characters in length) because the "\0" indicates the end of the name. When a key (or any other object with a name, such as a named Event, Semaphore, or Mutex) is created with such a name, any application using the Win32 API will be unable to open the name, even though they might seem to see it.
So I tried doing something similar:
HKEY keyHandle;
unsigned long status = 0;
wchar_t *wKeyName = new wchar_t[m_keyLength];
MultiByteToWideChar(CP_ACP, 0, m_keyName, m_keyLength, wKeyName, m_keyLength);
wKeyName[18] = L'\0'; // embed a NUL in the middle of the name
long result = RegCreateKeyExW(HKEY_LOCAL_MACHINE,
                              wKeyName,
                              0,
                              NULL,
                              0,
                              KEY_ALL_ACCESS,
                              NULL,
                              &keyHandle,
                              &status);
where m_keyName is the ASCII text and wKeyName is the wide-character text. But in regedit I see that it is treated just the same: the key name is simply truncated where I put the '\0'.
What is wrong with it?
The problem is that you are using the Win32 API and not the NT Native API. There is a table about halfway through the article you referenced that lists the Native APIs. For example, you would use NtCreateKey or ZwCreateKey instead of RegCreateKeyExW. The Win32 API assumes that all strings are terminated by a NUL character, whereas the Native API counterparts take the name as a counted UNICODE_STRING structure.
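As an untested sketch of what that looks like (the key path is a placeholder, and NtCreateKey has to be resolved from ntdll.dll at run time, since it is not exposed through the usual import libraries):

#include <windows.h>
#include <winternl.h>

#ifndef OBJ_CASE_INSENSITIVE
#define OBJ_CASE_INSENSITIVE 0x00000040L
#endif

typedef NTSTATUS (NTAPI *NtCreateKey_t)(
    PHANDLE KeyHandle, ACCESS_MASK DesiredAccess,
    POBJECT_ATTRIBUTES ObjectAttributes, ULONG TitleIndex,
    PUNICODE_STRING Class, ULONG CreateOptions, PULONG Disposition);

HANDLE CreateKeyWithEmbeddedNul()
{
    NtCreateKey_t pNtCreateKey = (NtCreateKey_t)
        GetProcAddress(GetModuleHandleW(L"ntdll.dll"), "NtCreateKey");
    if (!pNtCreateKey)
        return NULL;

    // "Key\0": the literal carries an explicit embedded NUL. A counted
    // UNICODE_STRING stores the length separately, so the NUL becomes
    // part of the name instead of terminating it.
    wchar_t name[] = L"\\Registry\\Machine\\SOFTWARE\\Test\\Key\0";
    UNICODE_STRING uName;
    uName.Buffer = name;
    uName.Length = sizeof(name) - sizeof(wchar_t); // in bytes, embedded NUL included
    uName.MaximumLength = sizeof(name);

    OBJECT_ATTRIBUTES oa;
    InitializeObjectAttributes(&oa, &uName, OBJ_CASE_INSENSITIVE, NULL, NULL);

    HANDLE hKey = NULL;
    ULONG disposition = 0;
    NTSTATUS st = pNtCreateKey(&hKey, KEY_ALL_ACCESS, &oa, 0,
                               NULL, REG_OPTION_NON_VOLATILE, &disposition);
    return (st >= 0) ? hKey : NULL; // NT_SUCCESS(st)
}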
I'll take a stab in the dark, as I have never tried to do this.
It appears that you are using the wrong function to create your registry key. You should be using NtCreateKey, because RegCreateKeyEx[AW] stops at your '\0' and truncates everything past it.
Why not use the class provided in the example? It provides a method called CreateHiddenKey; to use it, simply call SetKey first. That would be much cleaner.
The docs for MAKELANGID specify that MAKELANGID(LANG_NEUTRAL, SUBLANG_NEUTRAL) means 'language neutral'.
This seems to be English on my machine (I tried it with FormatMessage), but what does it mean in general? Is it guaranteed to be English?
Thanks!
I would expect that this means that the strings associated with the lang id are not specific to any language - which could be useful to know for a localisation team. "%1 + %2 = %3" would be an example of one such string.
With sublanguage = SUBLANG_DEFAULT, this would be the user's default language.
https://web.archive.org/web/20100704043524/http://msdn.microsoft.com/en-us/library/ms534732(VS.85).aspx
Here's a note on the sublanguage identifier - https://web.archive.org/web/20100728153356/http://wiki.winehq.org/SublangNeutral.
Note that MAKELANGID simply constructs a language identifier from the primary-language and sublanguage identifiers you give it - it does not look up the default language, or anything like that.
No, it is not guaranteed to be English. It is whatever you place into it at that point (English, in your case). But it does mean the resource should not serve as a language-specific satellite assembly (except maybe as a fallback).
I'm trying to create a GL context, and the call fails, returning a null pointer. According to MSDN, when wglCreateContext fails, you get the reason why from GetLastError. Except that GetLastError gives me a number, which isn't all that informative.
Again according to MSDN, you can get a descriptive string out of the GetLastError code with FormatMessage. But when I try the following, I get a blank string:
FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM, 0, GetLastError(), 0, errorStr, 0, NULL);
I checked the MSDN documentation, and apparently it only has a lookup table for Windows system errors. So that's no help to me after all. Does anyone know how to figure out programmatically why my wglCreateContext call is failing?
I've found this MSDN page and this OpenGL page, but they only have the lists of error code names, not their numerical values.
This page in the OpenGL reference mentions a function gluErrorString with this profile:
const GLubyte * gluErrorString(GLenum error)
Is this available in the library you're using? This should be defined in a file called "glu.h". For Visual Studio 6 it was in:
C:\Program Files\Microsoft Visual Studio\VC98\Include\GL
If you don't have that version it should be in:
C:\Program Files\Microsoft SDKs\Windows\v6.0A\Include\gl
This folder should also contain "gl.h" which contains the definitions for the error codes.
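If you do have it, usage is a one-liner. Note, though, that gluErrorString decodes OpenGL error codes from glGetError, not Win32 codes from GetLastError - a sketch:

#include <GL/glu.h> // link against glu32.lib on Windows; glu.h pulls in gl.h

GLenum err = glGetError();
const GLubyte* msg = gluErrorString(err); // e.g. "invalid operation"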
EDIT: If you're getting a null string, that might mean it's a non-standard error code. Take the value coming out of GetLastError() and search for its hex representation in "gl.h". If you find it, the define will tell you what the error is.
I found the problem. FormatMessage works just fine; I just had the wrong parameters. This worked:
FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS, NULL, GetLastError(), 0, errorStr, 255, NULL);
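The difference from the failing call: without FORMAT_MESSAGE_ALLOCATE_BUFFER, the nSize argument must be the size of a caller-supplied buffer, and passing 0 gave FormatMessage nowhere to write. A self-contained sketch, assuming errorStr is a plain char array:

char errorStr[256] = { 0 };
FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS,
               NULL,
               GetLastError(),
               0,                // default language
               errorStr,
               sizeof(errorStr), // real buffer size, not 0
               NULL);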
I have to convert the encoding of a string output of a VB6 application to a specific encoding.
The problem is that I don't know the encoding of the string, because of this:
According to the VB6 documentation, when accessing certain API functions, the internal Unicode strings are converted to ANSI strings using the default code page of Windows.
Because of that, the encoding of the string output can differ from system to system, but I have to know it to perform the conversion.
How can I read the default codepage using the Win32 API or - if there's no other way - by reading the registry?
This could be even more succinct by using GetACP - the Win32 API call that returns the default code page! (The default code page is often called the "ANSI" code page.)
UINT nCodePage = GetACP();
Also, many API calls (such as MultiByteToWideChar) accept the constant value CP_ACP (zero), which always means "use the system code page". So you may not actually need to know the current code page, depending on what you want to do with it.
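As a small sketch of that shortcut (the helper name is mine, not part of any API), converting the VB6 output without ever asking for the code page number:

#include <windows.h>

// Convert an ANSI string (in the system default code page) to UTF-16.
// CP_ACP selects the same code page VB6 used for its Unicode-to-ANSI step.
int AnsiToWide(const char* ansi, wchar_t* out, int outChars)
{
    return MultiByteToWideChar(CP_ACP, 0, ansi, -1, out, outChars);
}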
GetSystemDefaultLCID() gives you the system locale.
If the LCID is not enough and you truly need the codepage, use this code:
TCHAR szCodePage[10];
int cch = GetLocaleInfo(
    GetSystemDefaultLCID(),      // or any LCID you may be interested in
    LOCALE_IDEFAULTANSICODEPAGE,
    szCodePage,
    _countof(szCodePage));       // buffer size in TCHARs (_countof is MSVC's array-count macro)
int nCodePage = cch > 0 ? _ttoi(szCodePage) : 0;
That worked for me, thanks, but it can be written more succinctly as:
UINT nCodePage = CP_ACP;
const int cch = ::GetLocaleInfo(LOCALE_SYSTEM_DEFAULT,
                                LOCALE_RETURN_NUMBER | LOCALE_IDEFAULTANSICODEPAGE,
                                (LPTSTR)&nCodePage,   // LOCALE_RETURN_NUMBER writes a DWORD here
                                sizeof(nCodePage) / sizeof(_TCHAR));