How do I convert a WCHAR * to a regular string? - windows

So in Win32 API, I have my main function defined thus:
wmain(int argc, WCHAR* argv[])
I'm passing some arguments to it, and I'd like to execute a switch case based on the value of the argument, something like this.
wmain(int argc, WCHAR* argv[])
{
    char* temp = argv[];
    switch (temp) {
    case "one": blah blah;
    ...
    }
}
Of course, the temp = argv[] line doesn't work; I'm looking for a suggestion on how to do the conversion. Right now I have an if/else-if chain going on, and it's VERY inefficient!
The reason I need to convert it is because I cannot execute a switch case on a WCHAR*.
Thanks for looking.

You can't execute a switch on a char* either. (But when you actually need to convert WCHAR* to char*, use WideCharToMultiByte)
You need to use if/else if with lstrcmpi, CompareString or some other string compare function.
Alternatively, use one of the parameter parser libraries like argtable or getopt
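For reference, a minimal sketch of that advice, assuming a hypothetical ToNarrow helper built on WideCharToMultiByte for the cases where you really need a char*, and lstrcmpi for the actual comparisons:
#include <windows.h>
#include <string>

// Hypothetical helper: convert a wide string to a narrow std::string (UTF-8 here).
std::string ToNarrow(const WCHAR* wide)
{
    int len = WideCharToMultiByte(CP_UTF8, 0, wide, -1, NULL, 0, NULL, NULL);
    if (len <= 1) return std::string();
    std::string out(len - 1, '\0');                  // len includes the terminating NUL
    WideCharToMultiByte(CP_UTF8, 0, wide, -1, &out[0], len, NULL, NULL);
    return out;
}

int wmain(int argc, WCHAR* argv[])
{
    if (argc > 1)
    {
        // Compare the wide argument directly; no conversion needed for this part.
        if (lstrcmpiW(argv[1], L"one") == 0)
        {
            // handle "one"
        }
        else if (lstrcmpiW(argv[1], L"two") == 0)
        {
            // handle "two"
        }
    }
    return 0;
}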

I am not sure this is a good idea. A WCHAR* could contain Unicode characters which cannot be mapped to a char* in a meaningful way. In case you want to ignore this, there is a forum post at http://www.codeguru.com/forum/showthread.php?t=336106 which has some suggestions for converting from WCHAR* to char*.
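If you do go down that road anyway, here is a small sketch of how you could detect the lossy cases, assuming conversion to the ANSI code page with WideCharToMultiByte and the lpUsedDefaultChar flag:
#include <windows.h>
#include <stdio.h>

int wmain()
{
    const wchar_t* wide = L"caf\u00E9 \u4E2D";   // contains characters outside plain ASCII
    char narrow[64];
    BOOL usedDefault = FALSE;

    // WC_NO_BEST_FIT_CHARS plus lpUsedDefaultChar reports whether any character
    // could not be represented in the target (ANSI) code page.
    WideCharToMultiByte(CP_ACP, WC_NO_BEST_FIT_CHARS, wide, -1,
                        narrow, sizeof(narrow), NULL, &usedDefault);
    if (usedDefault)
        wprintf(L"Some characters could not be mapped to char meaningfully.\n");
    return 0;
}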

Try converting it via std::wstring and std::string; it's easy, though there may be a shorter way.
Convert the WCHAR* to a std::wstring using the std::wstring constructor, and then convert that to a std::string (std::wstring has no built-in method for this, so you have to narrow the characters yourself or use a conversion function).
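A minimal sketch of that route; note the naive per-character narrowing only works for characters that fit in a single char, anything else gets mangled:
#include <string>

std::string Narrow(const wchar_t* wide)
{
    std::wstring ws(wide);                        // construct from the wide C string
    return std::string(ws.begin(), ws.end());     // naive per-character narrowing
}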

Here's a quick example I wrote some time ago.
Create a new win32 console application and select ATL support. Add this and compile/run...
#include "stdafx.h"
#include <iostream>
using namespace std;
int _tmain(int argc, _TCHAR* argv[])
{
// A _TCHAR is a typedef'd, depending on whether you've got a unicode or MBCS build
// ATL Conversion macros are documented here
// http://msdn.microsoft.com/en-us/library/87zae4a3(VS.80).aspx
// Declare USES_CONVERSION in your function before using the ATL conversion macros
// e.g. T2A(), A2T()
USES_CONVERSION;
TCHAR* pwHelloWorld = _T("hello world!");
wcout << pwHelloWorld << endl;
// convert to char
char* pcHelloWorld = T2A(pwHelloWorld);
cout << pcHelloWorld << endl;
cin.get();
return 0;
}
Of course, you can't switch on a string, but this should give you the info you need in order to get a WCHAR string into a char string. From there, you can convert to an int easily enough.
Hope this helps ;)

Related

Command-Line arguments not working (char, TCHAR) VS2010

I have following code:
int _tmain(int argc, char** argv) {
    bool g_graphics = true;
    palPhysics* pp = 0;
#ifndef PAL_STATIC
    PF->LoadPALfromDLL();
#endif
    char a[] = "Bullet";
    std::string aa;
    aa = std::string(argv[1]);
    //PF->SelectEngine("Bullet");
    DebugBreak();
    PF->SelectEngine(argv[1]);
    //PF->SelectEngine(aa);
    //debug
    // assert(false);
    pp = PF->CreatePhysics();
}
I am trying to read in the command line argument no. 1 in this line:
PF->SelectEngine(argv[1]);
However, I only get the first letter of the argument. I have also tried changing
int _tmain(int argc, char** argv)
to
int _tmain(int argc, TCHAR** argv), but then I get
error:
error C2664: 'palFactory::SelectEngine' : cannot convert parameter 1 from 'TCHAR *' to 'const PAL_STRING &'
PAL_STRING is just a std::string.
This might be a simple one, but I am not sure how to convert TCHAR to std::string, especially since TCHAR is something else depending on compiler/environment settings. Is anyone aware of an easy way to get the command-line arguments to work, such that I don't need to convert anything myself, i.e. maybe by changing the _tmain function?
Thanks!
C
Update: example of invoking on command line:
The way I invoke this on the command line is:
progname.exe arg1 arg2,
where arg1 is a physics engine I am trying to load, and arg2 is a .dae file (a physics file with physics info), so specifically I run:
progname.exe Bullet E:/a.dae
Stepping into the line "PF->SelectEngine(argv[1]);" gives the following code:
bool palFactory::SelectEngine(const PAL_STRING& name) {
#ifdef INTERNAL_DEBUG
    printf("palFactory::SelectEngine: this = %p\n", this);
#endif
    SetActiveGroup(name); // also calls RebuildRegistry
    return isClassRegistered("palPhysics");
}
In this case, when debugging, I can see that the const PAL_STRING& name parameter, i.e. the string, is just "B" instead of what I would expect, which is "Bullet", the command-line argument I passed.
I've been plagued by this problem for years. The only solution I've been able to find is to NOT use Visual Studio. I've had to fall back to other compilers when I must be able to process command-line args. Specifically, I've been using the Digital Mars compiler successfully; it handles the command-line args correctly. I use the VS environment for IntelliSense and debugging, then compile with DMC to deploy.
---edit below---
Turns out, I just wasn't asking the right question. I finally asked the right question, and got the right answer! See link below.
What is the difference between _tmain() and main() in C++?
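For completeness, the gist of that linked answer: in a Unicode build _tmain actually receives wide (wchar_t*) arguments, so declaring the parameter as char** makes "Bullet" get read as raw UTF-16 bytes and the zero byte right after 'B' terminates the narrow string. A minimal sketch of one common fix, switching to wmain and converting the arguments up front (naive narrowing is fine for ASCII engine names; this is an illustration, not PAL's API):
#include <string>
#include <vector>

int wmain(int argc, wchar_t* argv[])
{
    std::vector<std::string> args;
    args.reserve(argc);
    for (int i = 0; i < argc; ++i)
    {
        std::wstring w(argv[i]);
        args.push_back(std::string(w.begin(), w.end()));  // naive narrowing, OK for ASCII args
    }
    // args[1] now holds "Bullet" as a std::string and can be passed to
    // functions taking const PAL_STRING& (i.e. const std::string&).
    return 0;
}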

Using strcmp to compare argv item with string literal isn't working as I was expecting

I'm quite new to Visual C++ so this might be a 'schoolboy' error, but the following code is not executing as I'd expected:
#include "stdafx.h"
#include <string.h>
int _tmain(int argc, _TCHAR* argv[])
{
if (strcmp((char*)argv[1], "--help") == 0)
{
printf("This is the help message."); //Won't execute
}
return 0;
}
The executable, named Test.exe is launched as follows
Test.exe --help
I was expecting the message This is the help message. but I'm not seeing it - debugging reveals that the strcmp call returns -1 and not 0 as I'd expect. What am I doing wrong?
OK, I've figured out what's going on. The argv[] array is declared as _TCHAR*, a typedef that changes based on whether or not Unicode is enabled for the project (wchar_t if it is, or char if it is not). The strcmp function, which I was trying to use, is the non-Unicode string comparison, while wcscmp is the Unicode equivalent. The _tcscmp function maps to the appropriate string comparison function depending on the Unicode setting. If I replace strcmp with _tcscmp, problem solved!
#include "stdafx.h"
#include <string.h>
int _tmain(int argc, _TCHAR* argv[])
{
if (_tcscmp(argv[1], _T("--help")) == 0)
{
printf("This is the help message."); //Will execute :)
}
return 0;
}
The _T macro makes the string literal wide when Unicode is enabled.
See also: Is it advisable to use strcmp or _tcscmp for comparing strings in Unicode versions?
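For reference, a rough sketch of what the TCHAR machinery boils down to (conceptual only; the real definitions live in <tchar.h> and are more involved):
// Conceptual sketch, not the actual <tchar.h> contents.
#ifdef _UNICODE
    typedef wchar_t _TCHAR;
    #define _T(x)    L##x       // _T("--help") -> L"--help"
    #define _tcscmp  wcscmp     // wide comparison
#else
    typedef char _TCHAR;
    #define _T(x)    x          // _T("--help") -> "--help"
    #define _tcscmp  strcmp     // narrow comparison
#endif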

Outputting UTF-8 with qInstallMsgHandler

I would like to make my debug handler (installed with qInstallMsgHandler) handle UTF-8; however, it seems it can only be defined as void myMessageOutput(QtMsgType type, const char *msg), and the const char* doesn't carry the UTF-8 through (once displayed, it's just random characters).
Is there some way to define this function as void myMessageOutput(QtMsgType type, QString msg), or maybe some other way to make it work?
This is my current code:
void myMessageOutput(QtMsgType type, const char *msg) {
    QString message = "";
    QString test = QString::fromUtf8(msg);
    // If I break into the debugger here, both "test" and "msg" contain a question mark.
    switch (type) {
    case QtDebugMsg:
        message = QString("[Debug] %1").arg(msg);
        break;
    case QtWarningMsg:
        message = QString("[Warning] %1").arg(msg);
        break;
    case QtCriticalMsg:
        message = QString("[Critical] %1").arg(msg);
        break;
    case QtFatalMsg:
        message = QString("[Fatal] %1").arg(msg);
        abort();
    }
    Application::instance()->debugDialog()->displayMessage(message);
}

Application::Application(int argc, char *argv[]) : QApplication(argc, argv) {
    debugDialog_ = new DebugDialog();
    debugDialog_->show();
    qInstallMsgHandler(myMessageOutput);
    qDebug() << QString::fromUtf8("我");
}
If you step through the code in the debugger, you will find that QDebug and qt_message first construct a QString from the const char* and then call toLocal8Bit on this string.
The only way I can think of to circumvent this: use your own encoding (something like "[E68891]"), or some other encoding such as uuencode or base64 that uses only ASCII characters, and decode the string in your message handler.
You should also consider using the qDebug("%s", "string") form to avoid quotes and additional whitespace (see this question).
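A minimal sketch of the base64 variant of that workaround, assuming the Qt 4 message-handler API and a hypothetical logUtf8() helper on the sending side:
#include <QByteArray>
#include <QString>
#include <QtDebug>

// Handler side: the bytes that arrive are plain ASCII base64, so nothing was mangled.
void myMessageOutput(QtMsgType type, const char *msg)
{
    QString text = QString::fromUtf8(QByteArray::fromBase64(msg).constData());
    // ... display `text` as before ...
}

// Sending side: encode the UTF-8 bytes so only ASCII goes through qDebug.
void logUtf8(const QString &s)
{
    qDebug() << s.toUtf8().toBase64().constData();
}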
Edit: the toLocal8Bit happens in the destructor of QDebug, which is called at the end of a qDebug statement (qdebug.h line 85). At least on the Windows platform this calls toLatin1, thus misinterpreting the string. You can prevent this by calling the following lines at the start of your program:
QTextCodec *codec = QTextCodec::codecForName("UTF-8");
QTextCodec::setCodecForLocale(codec);
On some platforms UTF-8 seems to be the default text codec.
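A minimal sketch of where those two lines would go, assuming a Qt 4 style main() (the codec must be set before the first qDebug output):
#include <QApplication>
#include <QTextCodec>
#include <QString>
#include <QtDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // Make toLocal8Bit() produce UTF-8, so the message handler receives
    // UTF-8 bytes instead of a lossy Latin-1 conversion.
    QTextCodec *codec = QTextCodec::codecForName("UTF-8");
    QTextCodec::setCodecForLocale(codec);

    qDebug() << QString::fromUtf8("我");
    return 0;
}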
Try passing the data as UTF-8 and extracting it in your function with something like QString::fromUtf8, which takes a const char* as input.
The problem is that the operator<<(const char *) method expects a Latin1-encoded string, so you should pass a proper UTF-8 QString to QDebug like this:
qDebug() << QString::fromUtf8("我");
... and from inside the message handler expect a UTF-8 string:
QString message = QString::fromUtf8(msg);
And that should work like a charm. ;)
For more information please read the QDebug reference manual.
You could also do the wrong thing: keep passing UTF-8 encoded strings via << and convert the strings with the horrible QString::fromUtf8(QString::fromUtf8(msg).toAscii().constData()) call.
Edit: This is the final example that works:
#include <QString>
#include <QDebug>
#include <QMessageBox>
#include <QApplication>

void myMessageOutput(QtMsgType type, const char *msg)
{
    QMessageBox::information(NULL, NULL, QString::fromUtf8(msg), QMessageBox::Ok);
}

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    qInstallMsgHandler(myMessageOutput);
    qDebug() << QString::fromUtf8("我");
    return 0;
}
Please note that QDebug doesn't do any charset conversion if you don't instantiate QApplication. In that case you wouldn't need to do anything special to msg inside the message handler, but I STRONGLY recommend you instantiate it.
One thing you must be sure of is that your source code file is encoded in UTF-8. To check that, you might use a proper tool (the file command, for example, if you use Linux) or just call QMessageBox::information(NULL, NULL, QString::fromUtf8("我"), QMessageBox::Ok) and see if the proper character appears.
#include <QtCore/QCoreApplication>
#include <stdio.h>
#include <QDebug>

void myMessageOutput(QtMsgType type, const char *msg)
{
    fprintf(stderr, "Msg: %s\n", msg);
}

int main(int argc, char *argv[])
{
    qInstallMsgHandler(myMessageOutput);
    QCoreApplication a(argc, argv);
    qDebug() << QString::fromUtf8("我");
}
The code above works perfectly here, but I must stress that my console supports UTF-8; if it did not, it would show a different character at that location.

CryptStringToBinary not working with a NULL terminated string. Why?

Does anyone know why this code is not working?
#include "stdafx.h"
#include <windows.h>
#include <WinCrypt.h>
int _tmain(int argc, _TCHAR* argv[])
{
wchar_t *bin = TEXT("ProductID:1233===>55555");
BYTE out2[1000];
DWORD olen;
olen = 1000;
if (CryptStringToBinary(bin, 0, 1, out2, &olen, 0, 0) == 0)
{
wprintf(TEXT("Failure\n"));
}
else
{
//wprintf(TEXT("rn%s\n"),out2);
wprintf(TEXT("Success\n"));
}
system("pause");
return 0;
}
Thank you very much in advance!
Tom
Because you specified a length (parameter 2) of 0?
Edit: Just to clarify our eventual solution in the comments below, the code in the original question (since edited) contained two errors:
It was calling CryptBinaryToString instead of CryptStringToBinary. Since it's invalid to pass a 0 in the second parameter to CryptBinaryToString, the function was failing.
It was passing 1 in the third parameter (dwFlags), which is interpreted as CRYPT_STRING_BASE64. Since the string to encrypt wasn't in base 64 (it contained invalid characters such as ':'), the function was failing. In general, passing a raw value instead of using an existing definition (e.g., CRYPT_STRING_BASE64) is not a good idea.
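To illustrate, here is a minimal sketch of the round trip that follows from the answer: encode the raw bytes to base64 with CryptBinaryToString, then decode that base64 text back with CryptStringToBinary (link against crypt32.lib; error handling trimmed):
#include <windows.h>
#include <wincrypt.h>
#include <stdio.h>
#pragma comment(lib, "crypt32.lib")

int wmain()
{
    const char data[] = "ProductID:1233===>55555";

    // Encode the raw bytes as base64 text.
    wchar_t base64[1000];
    DWORD cchBase64 = 1000;
    if (!CryptBinaryToStringW(reinterpret_cast<const BYTE*>(data), sizeof(data),
                              CRYPT_STRING_BASE64, base64, &cchBase64))
    {
        wprintf(L"Encode failed\n");
        return 1;
    }

    // Decode the base64 text back into bytes; cchString == 0 means NUL-terminated input.
    BYTE decoded[1000];
    DWORD cbDecoded = 1000;
    if (!CryptStringToBinaryW(base64, 0, CRYPT_STRING_BASE64,
                              decoded, &cbDecoded, NULL, NULL))
    {
        wprintf(L"Decode failed\n");
        return 1;
    }

    wprintf(L"Round trip OK, %lu bytes recovered\n", cbDecoded);
    return 0;
}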

GetUserDefaultLocaleName() API is crashing

I have an application which reads the user default locale on Windows Vista and above. When I tried calling the API to get the user default locale, the API crashed. Below is the code; it would be helpful if anyone can point out the reason.
#include <iostream>
#include <Windows.h>
#include <WinNls.h>
using namespace std;

int main()
{
    LPWSTR lpLocaleName = NULL;
    cout << "Calling GetUserDefaultLocaleName";
    int ret = GetUserDefaultLocaleName(lpLocaleName, LOCALE_NAME_MAX_LENGTH);
    cout << lpLocaleName << endl;
}
You need to have lpLocaleName pointing to a buffer prior to calling the API. As a general rule, if an API takes an LPWSTR parameter, allocate the buffer first (with malloc or new) to the required length, in this case LOCALE_NAME_MAX_LENGTH. Setting it to NULL and passing it to the API function is a guaranteed way to crash!
Hope this helps,
Best regards,
Tom.
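A minimal sketch of that suggestion, allocating the buffer with new before the call (a stack array of LOCALE_NAME_MAX_LENGTH wide characters would work just as well):
#include <Windows.h>
#include <iostream>

int main()
{
    // Allocate the buffer the API writes into, instead of passing NULL.
    wchar_t* localeName = new wchar_t[LOCALE_NAME_MAX_LENGTH]();
    int ret = GetUserDefaultLocaleName(localeName, LOCALE_NAME_MAX_LENGTH);
    if (ret != 0)
        std::wcout << localeName << std::endl;
    delete[] localeName;
    return 0;
}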
In addition to the previous answers, you should also be aware that you can't print a wide string with cout; instead, you should use wcout.
So:
#include <iostream>
#include <Windows.h>
#include <WinNls.h>

#define ARRSIZE(arr) (sizeof(arr)/sizeof(*(arr)))

using namespace std;

int main()
{
    WCHAR localeName[LOCALE_NAME_MAX_LENGTH] = {0};
    cout << "Calling GetUserDefaultLocaleName";
    int ret = GetUserDefaultLocaleName(localeName, ARRSIZE(localeName));
    if (ret == 0)
        cout << "Cannot retrieve the default locale name." << endl;
    else
        wcout << localeName << endl;
    return 0;
}
I believe you need to initialise lpLocaleName to an empty string of 256 chars (for example) then pass the length (256) where you have LOCALE_NAME_MAX_LENGTH
