Using strcmp to compare an argv item with a string literal isn't working as I expected - visual-studio-2010

I'm quite new to Visual C++ so this might be a 'schoolboy' error, but the following code is not executing as I'd expected:
#include "stdafx.h"
#include <string.h>

int _tmain(int argc, _TCHAR* argv[])
{
    if (strcmp((char*)argv[1], "--help") == 0)
    {
        printf("This is the help message."); // Won't execute
    }
    return 0;
}
The executable, named Test.exe, is launched as follows:
Test.exe --help
I was expecting to see the message This is the help message., but I'm not seeing it. Debugging reveals that the strcmp call returns -1 rather than the 0 I'd expect. What am I doing wrong?

OK, I've figured out what's going on. The argv[] array is declared as _TCHAR*, a macro that adjusts the type based on whether Unicode is enabled for the project (wchar_t if it is, char if it is not). The strcmp function I was trying to use is the narrow (non-Unicode) string comparison, while wcscmp is the wide (Unicode) equivalent. _tcscmp maps to the appropriate comparison function depending on the Unicode setting. If I replace strcmp with _tcscmp, problem solved!
#include "stdafx.h"
#include <string.h>

int _tmain(int argc, _TCHAR* argv[])
{
    if (_tcscmp(argv[1], _T("--help")) == 0)
    {
        printf("This is the help message."); // Will execute :)
    }
    return 0;
}
The _T macro turns the string literal into a wide (Unicode) literal when Unicode is enabled, and leaves it as a narrow literal otherwise.
See also: Is it advisable to use strcmp or _tcscmp for comparing strings in Unicode versions?

Related

How do I handle errors in Lua when executing arbitrary strings?

I'm going for absolute minimalism here. (It's been a while since I've worked with the Lua C API.)
#include <lua.hpp>
#include <iostream>
#include <string>

using namespace std;

int main(int argc, char** argv)
{
    lua_State* state = luaL_newstate();
    luaL_openlibs(state);

    string input;
    while (getline(cin, input))
    {
        auto error = luaL_dostring(state, input.c_str());
        if (error)
        {
            cerr << "Lua Error: " << lua_tostring(state, -1) << '\n';
            lua_pop(state, 1);
        }
    }

    lua_close(state);
    return 0;
}
This program works fine as long as I feed it perfect Lua. However, if I enter something bad (such as asdf()), the program crashes! Why is it not handling my error gracefully?
I've tried breaking out the calls before. It crashes on the call to lua_pcall itself. I never make it past that line.
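For reference, luaL_dostring is just luaL_loadstring followed by lua_pcall, so the broken-out version of the loop body looks like this minimal sketch (same state and input as in the loop above):

// Split luaL_dostring into its two steps to see which one fails.
int loadError = luaL_loadstring(state, input.c_str());
if (loadError)
{
    // Compile/syntax error: the message is on top of the stack.
    cerr << "Lua load error: " << lua_tostring(state, -1) << '\n';
    lua_pop(state, 1);
}
else if (lua_pcall(state, 0, LUA_MULTRET, 0))
{
    // Runtime error from the protected call, e.g. calling the nil asdf().
    cerr << "Lua runtime error: " << lua_tostring(state, -1) << '\n';
    lua_pop(state, 1);
}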
The binary download (5.2.1 I believe) has a bug that was corrected in 5.2.3. I rebuilt the library from source, and now my program works fine.

Incorrect argc in Win32 (and the arguments are ignored)

I was coding in Win32, and my program works in Debug mode inside Visual Studio, but not in Release mode and not outside Visual Studio.
int _tmain(int argc, _TCHAR* argv[])
{
    // Assert that there are 3 parameters (plus the program name).
    assert(argc == 4);
    LPCTSTR inputPath = argv[1];
    LPCTSTR sharedName = argv[2];
    LPCTSTR logPath = argv[3];
Sometimes argc is not correct (over 300000 when it should be 4), and sometimes the line
LPCTSTR sharedName = argv[2];
is simply ignored!
When debugging this program in Release mode, the debugger jumps over that line, and hovering over the variable name shows nothing.
When right-clicking a variable and choosing Add Watch, I get the error logPath CXX0017: Error: symbol "logPath" not found
Of course, I have set the command arguments in Visual Studio to "a b c" (without the quotes).
What could it be?
Running the simplified program:
// test.cpp : Defines the entry point for the console application.
//
#include "stdafx.h"
#include <windows.h>
#include <assert.h>
#include <conio.h>

int _tmain(int argc, _TCHAR* argv[])
{
    assert(argc == 4);
    LPCTSTR inputPath = argv[1];
    LPCTSTR sharedName = argv[2];
    LPCTSTR logPath = argv[3];
    _getch();
}
yields the same result. The debugger just jumps to the _getch() line, and if I try to add a watch, I get:
logPath CXX0017: Error: symbol "logPath" not found
inputPath CXX0017: Error: symbol "inputPath" not found
sharedName CXX0017: Error: symbol "sharedName" not found
When debugging this program in Release mode, it jumps over it, and when hovering over the variable name nothing happens. When right-clicking a variable and choosing Add Watch, I get the error logPath CXX0017: Error: symbol "logPath" not found
These symptoms make sense. "Release" mode tells the compiler to turn optimizations on, and since you never use the variables that you declare, the compiler helpfully optimizes them out altogether. There's no point in going through the motions of creating and assigning something if you'll never use it again.
That's why it's telling you that the symbol is not found, because its definition was optimized out.
On the other hand, "Debug" mode disables optimizations. Thus, it goes through the motions of creating these variables and assigning values to them, even though you may never use them. That's the whole point of debug mode: you can debug your application without interference from the optimizing behavior of the compiler, even when it's not completely written yet.
If you're desperate to make it work like you expect with optimizations enabled (i.e., in "Release" mode), then you can simply use the values of the variables you assign. That will prevent the compiler from optimizing them out. For example, you can simply output the strings to the debugger:
#include "stdafx.h"
#include <windows.h>
#include <assert.h>
#include <conio.h>

int _tmain(int argc, _TCHAR* argv[])
{
    assert(argc == 4);
    LPCTSTR inputPath = argv[1];
    LPCTSTR sharedName = argv[2];
    LPCTSTR logPath = argv[3];

    OutputDebugString(inputPath);
    OutputDebugString(sharedName);
    OutputDebugString(logPath);

    _getch();
}

Outputting UTF-8 with qInstallMsgHandler

I would like my debug handler (installed with qInstallMsgHandler) to handle UTF-8, but it seems it can only be defined as void myMessageOutput(QtMsgType type, const char *msg), and the const char* doesn't carry the UTF-8 through correctly (once displayed, it's just garbage characters).
Is there some way to define this function as void myMessageOutput(QtMsgType type, QString msg), or maybe some other way to make it work?
This is my current code:
void myMessageOutput(QtMsgType type, const char *msg) {
    QString message = "";
    QString test = QString::fromUtf8(msg);
    // If I break into the debugger here, both "test" and "msg" contain a question mark.
    switch (type) {
    case QtDebugMsg:
        message = QString("[Debug] %1").arg(msg);
        break;
    case QtWarningMsg:
        message = QString("[Warning] %1").arg(msg);
        break;
    case QtCriticalMsg:
        message = QString("[Critical] %1").arg(msg);
        break;
    case QtFatalMsg:
        message = QString("[Fatal] %1").arg(msg);
        abort();
    }
    Application::instance()->debugDialog()->displayMessage(message);
}

Application::Application(int argc, char *argv[]) : QApplication(argc, argv) {
    debugDialog_ = new DebugDialog();
    debugDialog_->show();
    qInstallMsgHandler(myMessageOutput);
    qDebug() << QString::fromUtf8("我");
}
If you step through the code in the debugger you will find out that QDebug and qt_message first construct a QString from the const char* and then use toLocal8Bit on this string.
The only way I can think of to circumvent this: use your own encoding (something like "[E68891]") or some other ASCII-only encoding such as uuencode or base64, and decode the string in your message handler.
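A minimal sketch of the base64 variant (my own illustration, not code from the original answer; it assumes Qt 4's const char* handler signature and that the caller encodes before logging):

#include <QByteArray>
#include <QString>
#include <QDebug>

void myMessageOutput(QtMsgType type, const char *msg)
{
    Q_UNUSED(type);
    // msg is plain ASCII base64 here, so it survives the Latin-1 round trip.
    const QByteArray utf8 = QByteArray::fromBase64(QByteArray(msg));
    const QString message = QString::fromUtf8(utf8.constData(), utf8.size());
    // ... hand `message` to the debug dialog as before ...
}

// At the call site, encode before logging (the printf-style qDebug avoids the
// stream operator's extra quoting):
// qDebug("%s", QString::fromUtf8("我").toUtf8().toBase64().constData());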
You should also consider using the qDebug("%s", "string") form to avoid the quotes and extra whitespace the stream version adds (see this question).
Edit: the toLocal8Bit call happens in the destructor of QDebug, which is called at the end of a qDebug statement (qdebug.h line 85). At least on Windows this falls back to toLatin1, thus misinterpreting the string. You can prevent this by running the following lines at the start of your program:
QTextCodec *codec = QTextCodec::codecForName("UTF-8");
QTextCodec::setCodecForLocale(codec);
On some platforms UTF-8 seems to be the default text codec.
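For example, a minimal placement sketch (assuming Qt 4, where the handler still takes a const char*; the codec must be set before any qDebug output is produced):

#include <QApplication>
#include <QTextCodec>
#include <QDebug>
#include <cstdio>

void myMessageOutput(QtMsgType type, const char *msg)
{
    Q_UNUSED(type);
    // With the locale codec set to UTF-8, msg arrives UTF-8 encoded.
    std::fprintf(stderr, "%s\n", msg);
}

int main(int argc, char *argv[])
{
    // Make toLocal8Bit use UTF-8 instead of the Latin-1 fallback.
    QTextCodec::setCodecForLocale(QTextCodec::codecForName("UTF-8"));
    QApplication app(argc, argv);
    qInstallMsgHandler(myMessageOutput);
    qDebug() << QString::fromUtf8("我");
    return 0;
}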
Try to pass the data as UTF-8 and extract it in your function with something like QString::fromUtf8, which takes a const char* as input.
The problem is that the operator<<(const char *) overload expects a Latin-1-encoded string, so you should pass a proper UTF-8 QString to QDebug like this:
qDebug() << QString::fromUtf8("我");
... and from inside the message handler expect a UTF-8 string:
QString message = QString::fromUtf8(msg);
And that should work like a charm. ;)
For more information please read the QDebug reference manual.
You could also do the wrong thing: keep passing UTF-8 encoded strings via << and convert the strings with the horrible QString::fromUtf8(QString::fromUtf8(msg).toAscii().constData()) call.
Edit: This is the final example that works:
#include <QString>
#include <QDebug>
#include <QMessageBox>
#include <QApplication>

void myMessageOutput(QtMsgType type, const char *msg)
{
    QMessageBox::information(NULL, NULL, QString::fromUtf8(msg), QMessageBox::Ok);
}

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    qInstallMsgHandler(myMessageOutput);
    qDebug() << QString::fromUtf8("我");
    return 0;
}
Please note that QDebug doesn't do any charset conversion if you don't instantiate QApplication. In that case you wouldn't need to do anything special to msg inside the message handler, but I STRONGLY recommend that you instantiate it.
One thing you must be sure of is that your source file is encoded in UTF-8. To check that, you might use a proper tool (file on Linux, for example) or just call QMessageBox::information(NULL, NULL, QString::fromUtf8("我"), QMessageBox::Ok) and see whether the right character appears.
#include <QtCore/QCoreApplication>
#include <stdio.h>
#include <QDebug>

void myMessageOutput(QtMsgType type, const char *msg)
{
    fprintf(stderr, "Msg: %s\n", msg);
}

int main(int argc, char *argv[])
{
    qInstallMsgHandler(myMessageOutput);
    QCoreApplication a(argc, argv);
    qDebug() << QString::fromUtf8("我");
}
The code above works perfectly here, but I must stress that my console supports UTF-8; if it did not, a different character would show up in that position.

How do I convert a WCHAR * to a regular string?

So in the Win32 API, I have my main function defined thus:
wmain(int argc, WCHAR* argv[])
I'm passing some arguments to it, and I'd like to execute a switch based on the value of an argument, something like this:
wmain(int argc, WCHAR* argv[])
{
    char* temp = argv[];
    switch (temp) {
    case "one": blah blah;
    ...
}
Of course, the temp = argv[] doesn't work; I'm looking for a suggestion for converting it. Right now I have an if-else-if chain going on, and it's VERY inefficient!
The reason I need to convert it is because I cannot execute a switch case on a WCHAR*.
Thanks for looking.
You can't execute a switch on a char* either. (But when you actually need to convert WCHAR* to char*, use WideCharToMultiByte)
You need to use if/else if with lstrcmpi, CompareString or some other string compare function.
Alternatively, use one of the parameter parser libraries like argtable or getopt
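For example, a minimal sketch of the if/else-if approach (my own illustration; lstrcmpiW compares the wide arguments case-insensitively, so no conversion is needed):

#include <windows.h>

int wmain(int argc, WCHAR* argv[])
{
    if (argc > 1)
    {
        if (lstrcmpiW(argv[1], L"one") == 0)
        {
            // handle "one"
        }
        else if (lstrcmpiW(argv[1], L"two") == 0)
        {
            // handle "two"
        }
    }
    return 0;
}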
I am not sure if this is a good idea to do. A WCHAR* could contain unicode characters which cannot be mapped to a char* in a meaningful way. In case you want to ignore this, there is a forum post at http://www.codeguru.com/forum/showthread.php?t=336106 which has some suggestions for converting from WCHAR* to char*.
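If you do decide to convert, here is a minimal sketch using WideCharToMultiByte (my own illustration, not code from the linked post; converting to UTF-8, which can represent any Unicode argument):

#include <windows.h>
#include <string>

// Convert a null-terminated wide string to a UTF-8 std::string.
std::string ToUtf8(const wchar_t* wide)
{
    // First call asks for the required buffer size (includes the terminating '\0').
    int len = WideCharToMultiByte(CP_UTF8, 0, wide, -1, NULL, 0, NULL, NULL);
    std::string out;
    if (len > 0)
    {
        out.resize(len);
        WideCharToMultiByte(CP_UTF8, 0, wide, -1, &out[0], len, NULL, NULL);
        out.resize(len - 1);  // drop the terminating '\0' copied by the API
    }
    return out;
}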
Try converting it from std::wstring to std::string; it's easy, though maybe there is a shorter way.
Construct a std::wstring from the WCHAR* using the std::wstring constructor, and then build a std::string from its characters.
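A minimal (and lossy) sketch of that idea; note that element-by-element narrowing only round-trips plain ASCII:

#include <string>

// Naive narrowing: anything outside ASCII will be mangled.
std::string Narrow(const wchar_t* wide)
{
    std::wstring ws(wide);                      // std::wstring constructor from WCHAR*
    return std::string(ws.begin(), ws.end());   // per-character narrowing
}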
Here's a quick example I wrote some time ago.
Create a new Win32 console application and select ATL support. Add this and compile/run...
#include "stdafx.h"
#include <iostream>
using namespace std;

int _tmain(int argc, _TCHAR* argv[])
{
    // _TCHAR is a typedef whose type depends on whether you have a Unicode or MBCS build.
    // The ATL conversion macros are documented here:
    // http://msdn.microsoft.com/en-us/library/87zae4a3(VS.80).aspx
    // Declare USES_CONVERSION in your function before using the ATL conversion macros,
    // e.g. T2A(), A2T().
    USES_CONVERSION;

    TCHAR* pwHelloWorld = _T("hello world!");
    wcout << pwHelloWorld << endl;

    // Convert to char.
    char* pcHelloWorld = T2A(pwHelloWorld);
    cout << pcHelloWorld << endl;

    cin.get();
    return 0;
}
Of course, you can't switch on a string, but this should give you the info you need in order to read a WCHAR string into a char string. From there, you can convert to an int easily enough.
Hope this helps ;)

CryptStringToBinary not working with a NULL terminated string. Why?

Does anyone know why this code is not working?
#include "stdafx.h"
#include <windows.h>
#include <WinCrypt.h>

int _tmain(int argc, _TCHAR* argv[])
{
    wchar_t *bin = TEXT("ProductID:1233===>55555");
    BYTE out2[1000];
    DWORD olen;

    olen = 1000;
    if (CryptStringToBinary(bin, 0, 1, out2, &olen, 0, 0) == 0)
    {
        wprintf(TEXT("Failure\n"));
    }
    else
    {
        //wprintf(TEXT("rn%s\n"), out2);
        wprintf(TEXT("Success\n"));
    }

    system("pause");
    return 0;
}
Thank you very much in advance!
Tom
Because you specified a length (parameter 2) of 0?
Edit: Just to clarify our eventual solution in the comments below, the code in the original question (since edited) contained two errors:
It was calling CryptBinaryToString instead of CryptStringToBinary. Since it's invalid to pass a 0 in the second parameter to CryptBinaryToString, the function was failing.
It was passing 1 in the third parameter (dwFlags), which is interpreted as CRYPT_STRING_BASE64. Since the string to convert wasn't base64 (it contained invalid characters such as ':'), the function was failing. In general, passing a raw value instead of using an existing definition (e.g., CRYPT_STRING_BASE64) is not a good idea.
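For illustration, a minimal sketch of a call that succeeds (my own example, not from the original answer): pass text that really is base64 and use the named CRYPT_STRING_BASE64 flag rather than a raw 1. Link against Crypt32.lib.

#include <windows.h>
#include <wincrypt.h>
#include <stdio.h>

int main()
{
    // "UHJvZHVjdElEOjEyMzM=" is the base64 encoding of "ProductID:1233".
    const wchar_t *b64 = L"UHJvZHVjdElEOjEyMzM=";
    BYTE out[1000];
    DWORD olen = sizeof(out);   // in: buffer size, out: decoded byte count

    if (CryptStringToBinaryW(b64, 0, CRYPT_STRING_BASE64, out, &olen, NULL, NULL))
    {
        wprintf(L"Decoded %lu bytes\n", olen);
    }
    return 0;
}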
