I'm reading a book called "Introduction to 3D Game Programming with DirectX 9.0c: A Shader Approach" and I was following the code in it, but the sample application uses the Multi-Byte Character Set, and I've read that using MBCS is not good practice. Now I'm getting an error when creating a window. Here is the code that produces the error:
mhMainWnd = CreateWindow(L"D3DWndClassName", mMainWndCaption.c_str(), WS_OVERLAPPEDWINDOW,
GetSystemMetrics(SM_CXSCREEN)/2 - width/2,
GetSystemMetrics(SM_CYSCREEN)/2 - height/2,
R.right, R.bottom, 0, 0, mhAppInst, 0);
The error is:
error C2664: 'CreateWindowExW' : cannot convert parameter 2 from 'const char [16]' to 'LPCWSTR'
I hope someone can help me.
What you heard about the preferability of Unicode over ANSI/MBCS is entirely correct. All new Windows code should be written to work with Unicode. To make this happen, you have to ensure two things:
Both the UNICODE and _UNICODE symbols need to be defined globally to ensure that the Unicode versions of the API functions are called, even if you forget the W suffix.
You can do this either at the top of your precompiled header:
#define UNICODE
#define _UNICODE
or in your project's Properties window within Visual Studio: simply add both symbols to the Preprocessor Definitions list.
All of your strings (both literals and otherwise) need to be Unicode strings.
With literals, you accomplish this by prefixing them with L, just as you've done in the example: L"D3DWndClassName"
With strings that are allocated at runtime, you need to use the wchar_t type. Since you're using C++, you should obviously be using a string class rather than raw character arrays like you would in C. So you need to use a string class that treats the characters in the string as wchar_t. This would either be std::wstring or MFC/ATL/WTL's CStringW class.
It looks like you've got most of this down already. The culprit is mMainWndCaption.c_str(). You are using std::string (which returns a nul-terminated array of chars) instead of std::wstring (which returns a nul-terminated array of wchar_ts).
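For example, assuming mMainWndCaption is a member of your application class (the declaration below is a sketch, not the book's exact code), the fix is a one-line change plus wide literals:

#include <string>

std::wstring mMainWndCaption;    // was: std::string mMainWndCaption;
mMainWndCaption = L"My D3D App"; // note the L prefix on the literal

// c_str() now returns const wchar_t *, so the call from the question compiles as-is:
mhMainWnd = CreateWindow(L"D3DWndClassName", mMainWndCaption.c_str(), WS_OVERLAPPEDWINDOW,
                         GetSystemMetrics(SM_CXSCREEN)/2 - width/2,
                         GetSystemMetrics(SM_CYSCREEN)/2 - height/2,
                         R.right, R.bottom, 0, 0, mhAppInst, 0);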
Either change your project to ANSI or MBCS rather than Unicode, and change
L"D3DWndClassName"
to
"D3DWndClassName"
or leave your project set to Unicode but use a Unicode string for your window caption, like so:
CString szCaption(mMainWndCaption.c_str()); // CString is actually CStringW in UNICODE build
mhMainWnd = CreateWindow(L"D3DWndClassName", szCaption, WS_OVERLAPPEDWINDOW,
GetSystemMetrics(SM_CXSCREEN)/2 - width/2,
GetSystemMetrics(SM_CYSCREEN)/2 - height/2,
R.right, R.bottom, 0, 0, mhAppInst, 0);
I am assigning to the lpszIcon member of the MSGBOXPARAMSW structure (note the W). I want to use one of the predefined icons like IDI_APPLICATION or IDI_WARNING, but they are all ANSI (defined via MAKEINTRESOURCE). I tried doing this:
MSGBOXPARAMSW mbp = { 0 };
mbp.lpszIcon = (LPCWSTR) IDI_ERROR;
but then no icon is displayed at all. So how can I use the Unicode versions of the IDI_ icons?
There is no ANSI or Unicode variant of a numeric resource ID. The code that you use to set lpszIcon is correct. It is idiomatic to use the MAKEINTRESOURCE macro rather than a cast, but the cast has identical meaning. Your problem lies in the other code, the code that we cannot see.
Reading between the lines, I think that you are targeting ANSI or MBCS. You tried to use MAKEINTRESOURCE but that expands to MAKEINTRESOURCEA. That's what led you to cast. You should have used MAKEINTRESOURCEW to match MSGBOXPARAMSW. That would have resolved the compilation error you encountered. You could equally have changed the project to target UNICODE.
But none of that explains why the icon does not appear in the dialog. There has to be a problem elsewhere. If the dialog appears then the most likely explanation is that you have set hInstance to a value other than NULL. But the code to set lpszIcon is correct, albeit not idiomatic.
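For reference, a minimal sketch that shows the predefined icon working (the text and caption strings are illustrative; the essential parts, per the above, are MAKEINTRESOURCEW and hInstance left as NULL so the IDI_* IDs refer to the system icons):

#include <windows.h>

MSGBOXPARAMSW mbp = { 0 };
mbp.cbSize = sizeof(mbp);
mbp.hInstance = NULL;                        // NULL, so IDI_* resolves to the system icons
mbp.lpszText = L"Something went wrong.";
mbp.lpszCaption = L"Error";
mbp.dwStyle = MB_OK | MB_USERICON;           // MB_USERICON makes lpszIcon take effect
mbp.lpszIcon = MAKEINTRESOURCEW(IDI_ERROR);  // idiomatic form of the cast
MessageBoxIndirectW(&mbp);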
I'm writing a small program with Qt Creator under Windows 7 (32-bit). My goal is to create a Windows registry key.
I'm using
QSettings settings("HKEY_CURRENT_USER\\Software\\Company", QSettings::NativeFormat);
settings.setValue("C:\\path\\prog.exe", "Value");
but in the Windows registry the generated value name is C:/path/prog.exe
I've tried to convert it with
qDebug() << QDir::toNativeSeparators("C:\\path\\prog.exe");
The output of qDebug() is correct: c:\path\prog.exe
but doing
settings.setValue(QDir::toNativeSeparators("C:\\path\\prog.exe"), "Value");
results again in a path with the wrong slashes.
Is there a way to write the path to the Windows registry correctly without using the Windows API?
Thanks
Francesco
You can't do this through QSettings, because you are specifying a key that is invalid for it: QSettings treats both '/' and '\' as special characters. The QSettings class uses a platform-specific backend, so it is worth reading the documentation when something does not work as expected. Start here.
QSettings performs a custom transformation on keys and values so that it can store any QVariant value there, even arrays. Characters that are invalid for a given platform are escaped. You can look at the exact transformation rules in the Qt sources.
Note: the value transformation depends on the type of settings storage; for example, it is different for .ini files.
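To see the separator handling in action (a sketch based on my reading of the Qt documentation; the exact escaping rules are in the Qt sources, as noted above):

#include <QSettings>

QSettings settings("HKEY_CURRENT_USER\\Software\\Company", QSettings::NativeFormat);

// '/' is the QSettings group separator: this creates a subkey "path"
// containing a value named "prog.exe"
settings.setValue("path/prog.exe", "Value");

// '\' is reserved as well; QSettings stores it as '/', which is exactly
// the behaviour observed in the question:
settings.setValue("C:\\path\\prog.exe", "Value");   // shows up as "C:/path/prog.exe"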
OK, I managed to achieve my goal by using
RegSetValueEx(hkey, TEXT("C:\\path\\prog.exe"), 0, REG_SZ, (LPBYTE)TEXT("WIN98"), 6 * sizeof(WCHAR));
for a constant string.
In case the program path is stored in a char * (as in my case), it works with:
char* exe_name = /*something*/;
wchar_t* wString = new wchar_t[4096];
// convert the ANSI path to UTF-16 for the wide registry API
MultiByteToWideChar(CP_ACP, 0, exe_name, -1, wString, 4096);
RegCreateKeyEx( .......... );
RegSetValueEx(hkey, wString, 0, REG_SZ, (LPBYTE)TEXT("WIN98"), 6 * sizeof(WCHAR));
delete[] wString;
Francesco
I have a UTF-8 string which I want to display in an HITextView (MLTE) control. Theoretically, HITextView requires either "Text" or UTF-16, so I'm converting:
UniChar uniput[STRSIZE];
ByteCount converted,unilen;
err = ConvertFromTextToUnicode(C2UInfo, len, output,
kUnicodeUseFallbacksMask, 0, NULL, 0, NULL,
sizeof(UniChar)*STRSIZE,
&converted, &unilen, uniput);
err=TXNSetData(MessageObject, kTXNUnicodeTextData, uniput, unilen, kTXNEndOffset,
kTXNEndOffset);
I have defined the converter C2UInfo as follows:
UnicodeMapping uMapping;
uMapping.unicodeEncoding = CreateTextEncoding(kTextEncodingUnicodeV2_0,
kUnicodeCanonicalDecompVariant,
kUnicode16BitFormat);
uMapping.otherEncoding = GetTextEncodingBase(kUnicodeUTF8Format);
uMapping.mappingVersion = kUnicodeUseLatestMapping;
err = CreateTextToUnicodeInfo(&uMapping, &C2UInfo);
It works fine for plain old ASCII characters, but multi-byte UTF-8 is being mapped to the wrong characters. For example, æ (LATIN SMALL LETTER AE) is being mapped to 疆 (CJK UNIFIED IDEOGRAPH-7586).
I've tried checking and unchecking "Output Text in Unicode" in Interface Builder, and I've tried varying some of the conversion constants, with no effect.
This is being built with Xcode 3.2.6 using the MacOSX10.5.sdk and tested under 10.6.
The “Text” that ConvertFromTextToUnicode expects is probably the same “Text” that is one of your two options for MLTE. If you had the sort of “Text” that ConvertFromTextToUnicode converts from, you could just pass it to MLTE directly.
(For the record, “Text” is almost certainly either MacRoman or whatever is dictated by the user's locale-determined current script.)
Instead, you should use a Text Encoding Converter. Create one, use it, finish using it, and dispose of it when you're done.
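Sketched out, that would look something like this (error handling elided; output, len, uniput, and MessageObject are the names from the question, and the rest follows the Text Encoding Converter API):

#include <CoreServices/CoreServices.h>

TECObjectRef tec;
TextEncoding utf8  = CreateTextEncoding(kTextEncodingUnicodeDefault,
                                        kTextEncodingDefaultVariant,
                                        kUnicodeUTF8Format);
TextEncoding utf16 = CreateTextEncoding(kTextEncodingUnicodeDefault,
                                        kTextEncodingDefaultVariant,
                                        kUnicode16BitFormat);
err = TECCreateConverter(&tec, utf8, utf16);

ByteCount inUsed, outUsed;
err = TECConvertText(tec,
                     (ConstTextPtr)output, len, &inUsed,         // UTF-8 in
                     (TextPtr)uniput, sizeof(uniput), &outUsed); // UTF-16 out, byte counts

err = TXNSetData(MessageObject, kTXNUnicodeTextData, uniput, outUsed,
                 kTXNEndOffset, kTXNEndOffset);

TECDisposeConverter(tec);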
There are two other ways.
One is to create a CFString from the UTF-8, then Get its characters. You would do this instead of using a TEC. It's functionally equivalent and possibly a little bit easier. On the other hand, you don't get to reuse the converter, for whatever that's worth.
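In code, roughly (again using the names from the question; note that the CFString length is in UTF-16 code units, which TXNSetData wants multiplied out to bytes):

CFStringRef str = CFStringCreateWithBytes(kCFAllocatorDefault,
                                          (const UInt8 *)output, len,
                                          kCFStringEncodingUTF8, false);
CFIndex n = CFStringGetLength(str);   // length in UTF-16 code units
// (a real version should check n <= STRSIZE before copying)
CFStringGetCharacters(str, CFRangeMake(0, n), uniput);
err = TXNSetData(MessageObject, kTXNUnicodeTextData, uniput,
                 n * sizeof(UniChar), kTXNEndOffset, kTXNEndOffset);
CFRelease(str);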
The other, since you have an HITextView, would be to create a CFString from the UTF-8 and just use that. Like Cocoa objects, HIToolbox objects have an inheritance hierarchy; since an HITextView is a kind of HIView, HIViewSetText should just work. (And if not, try HIViewSetValue.)
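That is, something like this (inTextView stands for your HITextView's HIViewRef, whatever you've named it):

CFStringRef str = CFStringCreateWithBytes(kCFAllocatorDefault,
                                          (const UInt8 *)output, len,
                                          kCFStringEncodingUTF8, false);
OSStatus err = HIViewSetText(inTextView, str);   // works because HITextView is a kind of HIView
CFRelease(str);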
The last method also gets you that much closer to your eventual move away from MLTE/HITextView, since it's essentially what you'll do with an NSTextView. (HITextView and MLTE are deprecated.)
I'm trying to hide some values in the registry (such as serial numbers) with C++ on Windows,
so I've been looking at this article http://www.codeproject.com/KB/system/NtRegistry.aspx
which says:
How is this possible? The answer is that a name which is counted as a Unicode string can explicitly include NULL characters (0) as part of the name. For example, "Key\0". To include the NULL at the end, the length of the Unicode string is specified as 4. There is absolutely no way to specify this name using the Win32 API since if "Key\0" is passed as a name, the API will determine that the name is "Key" (3 characters in length) because the "\0" indicates the end of the name. When a key (or any other object with a name such as a named Event, Semaphore, or Mutex) is created with such a name, any application using the Win32 API will be unable to open the name, even though they might seem to see it.
so I tried doing something similar:
HKEY keyHandle;
PHKEY key;
unsigned long status = 0;
wchar_t *wKeyName = new wchar_t[m_keyLength];
MultiByteToWideChar(CP_ACP, 0, m_keyName, m_keyLength, wKeyName, m_keyLength);
wKeyName[18] = '\0'; // try to embed a NUL inside the name
long result = RegCreateKeyExW(HKEY_LOCAL_MACHINE,
wKeyName,
0,
NULL,
0,
KEY_ALL_ACCESS,
NULL,
&keyHandle,
&status);
where m_keyName is the ASCII text and wKeyName is the wide-char text, but in regedit I see that it is treated the same way and the key is simply truncated at the '\0'.
What is wrong with it?
The problem is that you are using the Win32 API and not the NT Native API. There is a table about half way through the article that you referenced that contains the list of Native APIs. For example, you would use NtCreateKey or ZwCreateKey instead of RegCreateKeyExW. The Win32 API assumes that all strings are terminated by a NUL character, whereas the Native API counterparts use a UNICODE_STRING structure for the name.
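A sketch of the Native API route (assumptions flagged: NtCreateKey is resolved from ntdll.dll by hand since the SDK ships no import library declaration for it, and the registry path is illustrative; note that native paths start at \Registry, not HKEY_*):

#include <windows.h>
#include <winternl.h>

#ifndef OBJ_CASE_INSENSITIVE
#define OBJ_CASE_INSENSITIVE 0x00000040L
#endif

typedef NTSTATUS (NTAPI *NtCreateKey_t)(
    PHANDLE KeyHandle, ACCESS_MASK DesiredAccess,
    POBJECT_ATTRIBUTES ObjectAttributes, ULONG TitleIndex,
    PUNICODE_STRING Class, ULONG CreateOptions, PULONG Disposition);

wchar_t name[] = L"\\Registry\\Machine\\SOFTWARE\\Key\0"; // embedded NUL at the end

UNICODE_STRING us;
us.Buffer = name;
us.Length = sizeof(name) - sizeof(wchar_t); // counted length INCLUDES the embedded NUL
us.MaximumLength = sizeof(name);

OBJECT_ATTRIBUTES oa;
InitializeObjectAttributes(&oa, &us, OBJ_CASE_INSENSITIVE, NULL, NULL);

NtCreateKey_t pNtCreateKey = (NtCreateKey_t)
    GetProcAddress(GetModuleHandleW(L"ntdll.dll"), "NtCreateKey");

HANDLE hKey;
ULONG disposition;
NTSTATUS st = pNtCreateKey(&hKey, KEY_ALL_ACCESS, &oa, 0, NULL,
                           REG_OPTION_NON_VOLATILE, &disposition);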
I'll take a stab in the dark, as I have never tried to do this.
It appears that you are using the wrong function to create your registry key. You should be using the NtCreateKey method, because RegCreateKeyEx[AW] will stop at your '\0' and ignore everything past it.
Why not use the class provided in the example? It provides a method called CreateHiddenKey. To use it, simply call SetKey before it. It would be much cleaner.
There is a function that copies a value to the registry using
RegSetValueEx(hKey, theName, 0, REG_DWORD, (unsigned char *)&value, sizeof(value));
theName passed by the caller is a char *
I get a compile error:
Argument of type char * is incompatible with LPCWSTR
Why do I get this error?
I have copied some code that uses it (and I know it builds successfully) and built it myself.
Has the function changed, or are my project settings messed up? I do not know which version of VS the code was created in.
It is because Windows has been a Unicode operating system for the past 18 years. Its default string type is UTF-16 encoded: wchar_t* in your code, or std::wstring, or LPCWSTR, the typedef used in the Windows headers. Note the prevalence of 'w'; it means Wide.
It still supports char* strings, but then you have to use RegSetValueExA(). Note the added "A". There is also a project setting that makes your program use the old multi-byte API: Project + Properties, General, Character Set. Be aware that you write off the other five billion potential customers when you do.
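For completeness, both variants side by side (hKey stands for a key you have already opened or created):

#include <windows.h>

DWORD value = 42;

// ANSI variant: takes char* names and converts them to UTF-16 internally
RegSetValueExA(hKey, "MyValue", 0, REG_DWORD,
               (const BYTE *)&value, sizeof(value));

// Wide variant: what RegSetValueEx expands to in a Unicode build
RegSetValueExW(hKey, L"MyValue", 0, REG_DWORD,
               (const BYTE *)&value, sizeof(value));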