CryptStringToBinary not working with a NULL terminated string. Why? - windows

Does anyone know why this code is not working?
#include "stdafx.h"
#include <windows.h>
#include <WinCrypt.h>
int _tmain(int argc, _TCHAR* argv[])
{
wchar_t *bin = TEXT("ProductID:1233===>55555");
BYTE out2[1000];
DWORD olen;
olen = 1000;
if (CryptStringToBinary(bin, 0, 1, out2, &olen, 0, 0) == 0)
{
wprintf(TEXT("Failure\n"));
}
else
{
//wprintf(TEXT("\r\n%s\n"), out2);
wprintf(TEXT("Success\n"));
}
system("pause");
return 0;
}
Thank you very much in advance!
Tom

Because you specified a length (parameter 2) of 0?
Edit: Just to clarify our eventual solution in the comments below, the code in the original question (since edited) contained two errors:
It was calling CryptBinaryToString instead of CryptStringToBinary. Since it's invalid to pass a 0 in the second parameter to CryptBinaryToString, the function was failing.
It was passing 1 in the third parameter (dwFlags), which is interpreted as CRYPT_STRING_BASE64. Since the string being converted wasn't valid base64 (it contained invalid characters such as ':'), the function was failing. In general, passing a raw numeric value instead of the named constant (e.g., CRYPT_STRING_BASE64) is not a good idea.
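For reference, here is a minimal sketch of a call that does succeed. This is not the original poster's code: the base64 string below is illustrative, the named CRYPT_STRING_BASE64 constant is used instead of a raw 1, and the program must link against Crypt32.lib.
#include <windows.h>
#include <wincrypt.h>
#include <stdio.h>
int wmain()
{
    // "UHJvZHVjdElEOjEyMzM=" is base64 for "ProductID:1233"
    const wchar_t *b64 = L"UHJvZHVjdElEOjEyMzM=";
    BYTE out[1000];
    DWORD olen = sizeof(out);
    // Passing 0 as cchString means "treat the input as null-terminated",
    // which is valid for CryptStringToBinary (unlike CryptBinaryToString).
    if (!CryptStringToBinaryW(b64, 0, CRYPT_STRING_BASE64, out, &olen, NULL, NULL))
    {
        wprintf(L"Failure, error %lu\n", GetLastError());
        return 1;
    }
    wprintf(L"Decoded %lu bytes\n", olen);
    return 0;
}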

Related

MFC - Display message

I'm trying to display a simple message within my first MFC application.
Strangely, the first sample doesn't work, while the second one works correctly.
auto text = std::to_wstring(1).c_str();
MessageBox(text, NULL, 0); // Not ok, the message is empty
auto temp = std::to_wstring(1);
MessageBox(temp.c_str(), NULL, 0); // Ok, display 1
Can you explain this behavior?
Yes, in the first example, the wstring created by the call to std::to_wstring only has the scope of the line. After the line executes, it is out of scope and its value is dubious.
In the second example, the wstring is still in scope and valid and so the call to .c_str() works.
No, the other answer is wrong. Look at the implementation of c_str(). c_str() basically returns an LPCWSTR... call it a const WCHAR* or const wchar_t* or whatever. However, the return of c_str() is a pointer into the wstring's internal buffer. The problem is that after the line of code executes, the temporary wstring returned from to_wstring() is no longer valid, so the pointer returned by c_str() is garbage. For fun, try the following code:
//cstr.cpp
#include <iostream>
#include <string>
using namespace std;
int main(int argc, char* argv[])
{
auto temp = to_wstring(1).c_str();
wprintf(L"%s\n", temp);
auto temp2 = to_wstring(1);
wprintf(L"%s\n", temp2.c_str());
wstring ws = to_wstring(1);
auto temp3 = ws.c_str();
wprintf(L"%s\n", temp3);
}
I compiled the above from a VC++ shell prompt with: cl.exe cstr.cpp
If the other answer is correct, then the last line should have garbage or nothing output because according to the other answer, c_str() is a temp. But, if my answer is correct, then it should output 1 (which it does). If all else fails, look at the implementation source code.

GtkMM Pass Char as Argument in Button Callback

There is a question similar to this, but it does not answer my question. I am working on a GUI using GTKMM. I am trying to pass a single char in the callback of my button.
This letter is then assigned to the member variable letter. However, I don't understand pointers and have been trying to get this to work for quite a while without success.
main.cpp
#include <gtkmm.h>
#include "window.h"
int main(int argc, char *argv[])
{
Glib::RefPtr<Gtk::Application> app = Gtk::Application::create(argc, argv, "com.gtkmm.tutorial3.base");
mywindow window;
return app->run(window);
}
window.cpp
#include "window.h"
mywindow::mywindow()
{
set_default_size(480, 320);
set_title("Transfer");
Gtk::Box *vbox = Gtk::manage(new Gtk::Box(Gtk::ORIENTATION_VERTICAL, 0));
add(*vbox);
Gtk::Grid *grid = Gtk::manage(new Gtk::Grid);
grid->set_border_width(10);
vbox->add(*grid);
Gtk::Button *a = Gtk::manage(new Gtk::Button("A"));
//a->set_hexpand(true);
//a->set_vexpand(true);//%%%%%%%% next line is the issue%%%%%%%%%
a->signal_clicked().connect(sigc::mem_fun(*this, &mywindow::on_click('a')));
grid->attach(*a, 0, 0, 1, 1);//x=0,y=0, span 1 wide, and 1 tall
Gtk::Button *b = Gtk::manage(new Gtk::Button("B"));
//b->set_hexpand(true);
//b->set_vexpand(true);
b->signal_clicked().connect(sigc::mem_fun(*this, &mywindow::on_click('b')));
grid->attach(*b, 1, 0, 1, 1);
Gtk::Button *c = Gtk::manage(new Gtk::Button("C"));
//c->set_hexpand(true);
//c->set_vexpand(true);
c->signal_clicked().connect(sigc::mem_fun(*this, &mywindow::on_click('c')));
grid->attach(*c, 2, 0, 1, 1);
vbox->show_all();
}
mywindow::~mywindow()
{
//dtor
}
void mywindow::on_click(char l)
{
letter = l;
}
window.h
#ifndef MYWINDOW_H
#define MYWINDOW_H
#include <gtkmm.h>
class mywindow : public Gtk::Window
{
public:
mywindow();
virtual ~mywindow();
protected:
char letter; // member variable where the letter is stored
void on_click(char l);
private:
};
#endif // MYWINDOW_H
I tried replacing the * pointer with & and vice-versa for this and mywindow but I haven't gotten it to work and have no idea how to proceed.
First of all, there is the gtkmm3 tutorial.
From there:
you can't hook a function with two arguments to a signal expecting none (unless you use an adapter, such as sigc::bind(), of course).
So you need something like this:
c->signal_clicked().connect(sigc::bind<char>(sigc::mem_fun(*this, &mywindow::on_click), 'c'));
On a side note:
If you have problems with pointers you could try smart pointers, but to be honest I think it would be better for you to understand them.
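Putting it together, a minimal sketch of the three connections in the mywindow constructor (assuming gtkmm 3 and the on_click(char) member shown above) would be:
a->signal_clicked().connect(sigc::bind<char>(sigc::mem_fun(*this, &mywindow::on_click), 'a'));
b->signal_clicked().connect(sigc::bind<char>(sigc::mem_fun(*this, &mywindow::on_click), 'b'));
c->signal_clicked().connect(sigc::bind<char>(sigc::mem_fun(*this, &mywindow::on_click), 'c'));
Note that the value is passed as a char literal ('a'), not a string literal ("a"), because the bound slot argument is a char.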

Using strcmp to compare argv item with string literal isn't working as I was expecting

I'm quite new to Visual C++ so this might be a 'schoolboy' error, but the following code is not executing as I'd expected:
#include "stdafx.h"
#include <string.h>
int _tmain(int argc, _TCHAR* argv[])
{
if (strcmp((char*)argv[1], "--help") == 0)
{
printf("This is the help message."); //Won't execute
}
return 0;
}
The executable, named Test.exe is launched as follows
Test.exe --help
I was expecting the message This is the help message. but I'm not seeing it - debugging reveals that strcmp returns -1 and not the 0 I'd expect. What am I doing wrong?
OK, I've figured out what's going on. The argv[] array is declared as _TCHAR*, a macro that adjusts the type based on whether or not Unicode has been enabled for the project (wchar_t if it is, char if it is not). The strcmp function, which I was trying to use, is the non-Unicode string comparison, while wcscmp is the Unicode equivalent. The _tcscmp function maps to the appropriate comparison function depending on the Unicode setting. If I replace strcmp with _tcscmp, problem solved!
#include "stdafx.h"
#include <string.h>
int _tmain(int argc, _TCHAR* argv[])
{
if (argc > 1 && _tcscmp(argv[1], _T("--help")) == 0)
{
printf("This is the help message."); //Will execute :)
}
return 0;
}
The _T macro makes the string literal a wide-character literal when Unicode is enabled.
See also: Is it advisable to use strcmp or _tcscmp for comparing strings in Unicode versions?

How do I convert a WCHAR * to a regular string?

So in Win32 API, I have my main function defined thus:
wmain(int argc, WCHAR* argv[])
I'm passing some arguments to it, and I'd like to execute a switch case based on the value of the argument, something like this.
wmain(int argc, WCHAR* argv[])
{
char* temp = argv[];
switch (temp) {
case "one": blah blah;
...
}
Of course, the temp=argv[] doesn't work, I'm looking for a suggestion to convert it. Right now I have an if-else-if thing going on, and it's VERY inefficient!
The reason I need to convert it is because I cannot execute a switch case on a WCHAR*.
Thanks for looking.
You can't execute a switch on a char* either. (But when you actually need to convert WCHAR* to char*, use WideCharToMultiByte)
You need to use if/else if with lstrcmpi, CompareString or some other string compare function.
Alternatively, use one of the parameter parser libraries like argtable or getopt
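For example, a minimal sketch of that if/else-if dispatch inside wmain (the option names here are illustrative):
if (argc < 2)
{
    // no argument given
}
else if (lstrcmpiW(argv[1], L"one") == 0)
{
    // handle "one"
}
else if (lstrcmpiW(argv[1], L"two") == 0)
{
    // handle "two"
}
else
{
    // unknown argument
}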
I am not sure this is a good idea. A WCHAR* could contain Unicode characters which cannot be mapped to a char* in a meaningful way. In case you want to ignore this, there is a forum post at http://www.codeguru.com/forum/showthread.php?t=336106 which has some suggestions for converting from WCHAR* to char*.
Try converting it from std::wstring to std::string; it's easy, though maybe there is a shorter way.
Convert the WCHAR* to a std::wstring using the std::wstring constructor, and then convert that std::wstring to a std::string.
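std::wstring has no built-in conversion to std::string, so the usual last step on Windows is WideCharToMultiByte. A minimal sketch, assuming UTF-8 output is acceptable:
#include <windows.h>
#include <string>
// Convert a null-terminated wide string to a UTF-8 std::string.
std::string narrow(const wchar_t *wide)
{
    if (wide == NULL || *wide == L'\0')
        return std::string();
    // First call: ask how many bytes the UTF-8 result needs (including the terminator).
    int needed = WideCharToMultiByte(CP_UTF8, 0, wide, -1, NULL, 0, NULL, NULL);
    if (needed <= 0)
        return std::string();
    std::string out(needed, '\0');
    WideCharToMultiByte(CP_UTF8, 0, wide, -1, &out[0], needed, NULL, NULL);
    out.resize(needed - 1);   // drop the terminating null the API wrote
    return out;
}
With that in place, narrow(argv[1]) gives a std::string you can compare (though, as noted above, you still can't switch on it).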
Here's a quick example I wrote some time ago.
Create a new win32 console application and select ATL support. Add this and compile/run...
#include "stdafx.h"
#include <iostream>
using namespace std;
int _tmain(int argc, _TCHAR* argv[])
{
// _TCHAR is typedef'd to wchar_t or char, depending on whether you have a Unicode or MBCS build
// ATL Conversion macros are documented here
// http://msdn.microsoft.com/en-us/library/87zae4a3(VS.80).aspx
// Declare USES_CONVERSION in your function before using the ATL conversion macros
// e.g. T2A(), A2T()
USES_CONVERSION;
TCHAR* pwHelloWorld = _T("hello world!");
wcout << pwHelloWorld << endl;
// convert to char
char* pcHelloWorld = T2A(pwHelloWorld);
cout << pcHelloWorld << endl;
cin.get();
return 0;
}
Of course, you can't switch on a string, but this should give you the info you need in order to read a WCHAR into a char. From there, you can convert to int easily enough..
Hope this helps ;)

GetUserDefaultLocaleName() API is crashing

I have an application which reads the user default locale on Windows Vista and above. When I tried calling the API to get the user default locale, it crashed. Below is the code; it would be helpful if anyone could point out the reason.
#include <Windows.h>
#include <WinNls.h>
#include <iostream>
using namespace std;
int main()
{
LPWSTR lpLocaleName=NULL;
cout << "Calling GetUserDefaultLocaleName";
int ret = GetUserDefaultLocaleName(lpLocaleName, LOCALE_NAME_MAX_LENGTH);
cout << lpLocaleName<<endl;
}
You need to point lpLocaleName at a buffer before calling the API. As a general rule, if an API takes an LPWSTR output parameter, you have to supply storage of the required length yourself (on the stack, or with malloc/new), in this case LOCALE_NAME_MAX_LENGTH characters. Setting it to NULL and passing it to the API function is a guaranteed way to crash!
Hope this helps,
Best regards,
Tom.
In addition to the previous answers, you should also be aware that you can't print a wide string with cout; instead, you should use wcout.
So:
#include <Windows.h>
#include <WinNls.h>
#include <iostream>
#define ARRSIZE(arr) (sizeof(arr)/sizeof(*(arr)))
using namespace std;
int main()
{
WCHAR localeName[LOCALE_NAME_MAX_LENGTH] = {0};
cout<<"Calling GetUserDefaultLocaleName";
int ret = GetUserDefaultLocaleName(localeName,ARRSIZE(localeName));
if(ret==0)
cout<<"Cannot retrieve the default locale name."<<endl;
else
wcout<<localeName<<endl;
return 0;
}
I believe you need to initialise lpLocaleName to point at a buffer of 256 wide characters (for example) and then pass that length (256) where you currently have LOCALE_NAME_MAX_LENGTH.
