How do you convert a 'System::String ^' to 'TCHAR'? - visual-studio-2010

I asked a question here involving C++ and C# communicating. The problem got solved, but led to a new problem.
This returns a String (C#):
return Marshal.PtrToStringAnsi(decryptsn(InpData));
This expects a TCHAR* (C++):
lpAlpha2[0] = Company::Pins::Bank::Decryption::Decrypt::Decryption("123456");
I've googled how to solve this problem, but I am not sure why the String has a caret (^) on it. Would it be best to change the return type from String to something else that C++ would accept, or would I need to do a conversion before assigning the value?

String has a ^ because that's the marker for a managed reference. Basically, it's used the same way as * in unmanaged land, except it can only point to an object type, not to other pointer types, or to void.
TCHAR is #defined (or perhaps typedefed, I can't remember) to either char or wchar_t, based on the _UNICODE preprocessor definition. Therefore, I would use that and write the code twice.
Either inline:
TCHAR* str;
String^ managedString;
#ifdef _UNICODE
str = (TCHAR*) Marshal::StringToHGlobalUni(managedString).ToPointer();
#else
str = (TCHAR*) Marshal::StringToHGlobalAnsi(managedString).ToPointer();
#endif
// use str.
Marshal::FreeHGlobal(IntPtr(str));
or as a pair of conversion methods, both of which assume that the output buffer has already been allocated and is large enough. Method overloading should make it pick the correct one, based on what TCHAR is defined as.
void ConvertManagedString(String^ managedString, char* outString)
{
char* str;
str = (char*) Marshal::StringToHGlobalAnsi(managedString).ToPointer();
strcpy(outString, str);
Marshal::FreeHGlobal(IntPtr(str));
}
void ConvertManagedString(String^ managedString, wchar_t* outString)
{
wchar_t* str;
str = (wchar_t*) Marshal::StringToHGlobalUni(managedString).ToPointer();
wcscpy(outString, str);
Marshal::FreeHGlobal(IntPtr(str));
}
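For illustration, a call site might look like this (the 256-element buffer is just an assumed size; make sure it is large enough for your data):
TCHAR buffer[256];
ConvertManagedString(managedString, buffer); // overload resolution picks the char or wchar_t version to match TCHAR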

The syntax String^ is C++/CLI talk for "(garbage collected) reference to a System.String".
You have a couple of options for the conversion of a String into a C string, which is another way to express the TCHAR*. My preferred way in C++ would be to store the converted string into a C++ string type, either std::wstring or std::string, depending on whether you're building the project as a Unicode or MBCS project.
In either case you can use something like this:
std::wstring tmp = msclr::interop::marshal_as<std::wstring>( /* Your .NET String */ );
or
std::string tmp = msclr::interop::marshal_as<std::string>(...);
Once you've converted the string into the correct wide or narrow string format, you can then access its C string representation using the c_str() function, like so:
callCFunction(tmp.c_str());
This assumes that callCFunction expects you to pass it a C-style char* or wchar_t* (which is what TCHAR* resolves to, depending on your compilation settings).
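Putting it together, a minimal sketch of the marshal_as route (the marshal_as overloads for std::string/std::wstring live in <msclr/marshal_cppstd.h>; PassToNative and callCFunction are hypothetical names, and a Unicode build is assumed):
#include <msclr/marshal_cppstd.h>
#include <string>

void callCFunction(const wchar_t* text); // hypothetical native function

void PassToNative(System::String^ managedString)
{
    // marshal_as copies the managed string into a std::wstring that we own
    std::wstring tmp = msclr::interop::marshal_as<std::wstring>(managedString);
    callCFunction(tmp.c_str());
}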

That is a really rambling way to ask the question, but if you mean how to convert a String ^ to a char *, then you use the same marshaller you used before, only backwards:
char* unmanagedstring = (char *) Marshal::StringToHGlobalAnsi(managedstring).ToPointer();
Edit: don't forget to release the allocated memory with Marshal::FreeHGlobal when you're done.
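For example, a sketch of the allocate/use/free pairing:
IntPtr ptr = Marshal::StringToHGlobalAnsi(managedstring);
char* unmanagedstring = static_cast<char*>(ptr.ToPointer());
// ... use unmanagedstring ...
Marshal::FreeHGlobal(ptr); // release the unmanaged copy when done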

Related

Make an exported function ANSI and Unicode aware in a Win32 DLL with the use of TCHAR

I'm trying to make a Win32 DLL that is able to handle ANSI and Unicode, depending on what is specified in the Character Set project property (Unicode, or Not Set for ANSI) when building in Visual Studio.
The DLL has this definition:
extern "C" int __stdcall calc(TCHAR *foo)
The typedef is as follows:
typedef int (CALLBACK* LPFNDLLCALC)(TCHAR *foo);
Inside the MFC calling app I load the DLL like this:
HINSTANCE DllFoo = LoadLibrary(L"foo.dll");
LPFNDLLCALC lpfnDllcalc = (LPFNDLLCALC)GetProcAddress(DllFoo ,"calc");
CString C_SerialNumber;
mvSerialNumber.GetWindowText(C_SerialNumber);
TCHAR* SerialNumber = C_SerialNumber.GetBuffer(0);
lpfnDllcalc(SerialNumber);
I understand that I'm doing something wrong in the C_SerialNumber.GetBuffer(0) to TCHAR* conversion, because the debugger in the DLL shows that only the first char is passed, not the complete string.
How do I get a CString to a pointer that works in both ANSI and Unicode?
If I change all my code to wchar_t or char instead of TCHAR I get it to work, but not with this native TCHAR macro.
As I see it you have two options:
Write the code entirely using TCHAR. Then compile the code into two separate DLLs, one narrow and one wide.
Have a single DLL that exports two variants of each function that operates on text. This is how the Windows API is implemented.
If you choose the second option, you don't need to implement each function twice. The primary function is the wide variant. For the narrow variant you convert the input from narrow to wide and then call the wide version. Vice versa for output text. In other words, you use the adapter pattern.
I suppose that you are imagining a third option where you have a single function that can operate on either form of text. Don't go this way. It abandons type safety and will give you no end of pain. It will also be counter to users' expectations.
As David said, you need to export two separate functions, one for Ansi and one for Unicode, just like the Win32 API does, eg:
#ifdef __cplusplus
extern "C" {
#endif
int WINAPI calcA(LPCSTR foo);
int WINAPI calcW(LPCWSTR foo);
#ifdef __cplusplus
}
#endif
typedef int (WINAPI *LPFNDLLCALC)(LPCTSTR foo);
Then you can do the following:
int WINAPI calcA(LPCSTR foo)
{
return calcW(CStringW(foo));
}
int WINAPI calcW(LPCWSTR foo)
{
//...
}
HINSTANCE DllFoo = LoadLibrary(L"foo.dll");
LPFNDLLCALC lpfnDllcalc = (LPFNDLLCALC) GetProcAddress(DllFoo,
#ifdef UNICODE
"calcW"
#else
"calcA"
#endif
);
CString C_SerialNumber;
mvSerialNumber.GetWindowText(C_SerialNumber);
lpfnDllcalc(C_SerialNumber);

generic cout that can be wcout depending upon typedef

I have a typedef char char_t; which can also be typedef wchar_t char_t;, and what I want is a generic cout.
I have a util namespace, and I want a util::cout that would be std::cout if char_t is char and std::wcout if char_t is wchar_t.
Yes, no problem; you can do this with a template specialisation holding a static reference to the appropriate object.
template<typename T> struct select_cout;
template<> struct select_cout<char> { static std::ostream &cout; };
std::ostream &select_cout<char>::cout = std::cout;
template<> struct select_cout<wchar_t> { static std::wostream &cout; };
std::wostream &select_cout<wchar_t>::cout = std::wcout;
std::basic_ostream<char_t> &cout = select_cout<char_t>::cout;
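To get the util::cout the question asks for, you can put that last declaration inside the namespace instead (a sketch, assuming char_t is already typedef'd and <iostream> is included):
namespace util {
    std::basic_ostream<char_t> &cout = select_cout<char_t>::cout;
}

int main()
{
    util::cout << 42 << std::endl; // resolves to std::cout or std::wcout depending on char_t
}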
You're reinventing a terrible (at least in the here and now, perhaps it was a good decision at one point) design by MS.
Note that every other platform most likely uses UTF-8 for output, so a UTF-8 string sent through std::cout outputs just fine. On Windows, Unicode output on the console is impossible to get right anyway (due to fonts and broken console codepages).
In short, there is no reason to want such a thing, and you're better off using one or the other, not both.
If you are reading files in wide format and using a multibyte program (which I picked up from the comments), a solution could be:
You can read the file content into your program's memory as a std::wstring and then use wcstombs_s() to convert the string read from the file into a multi-byte character string.
Essentially, it doesn't matter which format the string is in; you can always convert it when and where needed.
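A sketch of that wcstombs_s() conversion (wcstombs_s and _TRUNCATE are MSVC-specific; ToNarrow is a hypothetical name and the buffer sizing is a worst-case assumption):
#include <cstdlib>
#include <string>

std::string ToNarrow(const std::wstring& wide)
{
    std::string narrow(wide.size() * 4 + 1, '\0'); // worst-case multibyte buffer
    size_t converted = 0;
    wcstombs_s(&converted, &narrow[0], narrow.size(), wide.c_str(), _TRUNCATE);
    narrow.resize(converted ? converted - 1 : 0);  // converted includes the terminating null
    return narrow;
}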

WM_GETTEXT usage

I'm trying to get the contents of a text field in my application, but I can't get it to work. I'm using SendMessage with WM_GETTEXT and saving the content to a char*.
I output the char* to a file, but I only get "D" back. This is what I have now:
LRESULT result;
char * output = (char*)malloc(1024);
result = SendMessage(hwnd,WM_GETTEXT,1024,(LPARAM)output);
ofstream file("test.txt");
file << *output;
file.close();
delete [] output;
Pointer concepts:
file << *output; will print only the first element of the string array.
file << output; prints the entire string.
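A corrected version of that snippet might look like this (assuming an MBCS build, as discussed further below):
char* output = (char*)malloc(1024);
SendMessage(hwnd, WM_GETTEXT, 1024, (LPARAM)output);
ofstream file("test.txt");
file << output;   // no dereference: streams the whole null-terminated string
file.close();
free(output);     // pair malloc with free, not delete[]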
C# code:
public const uint WM_GETTEXT = 0xD;
const int bufferSize = 10000;
StringBuilder sb = new StringBuilder(bufferSize);
SendMessageGetText(handle, WM_GETTEXT, new UIntPtr(bufferSize), sb);
Console.WriteLine(sb.ToString());
Works properly for me!
Sophia's answer is correct. However, the default now for a Visual Studio project is to create a Unicode project. You will only get the first letter if your project is Unicode and not MBCS.
Have you examined the buffer returned from WM_GETTEXT to verify it has the entire string?
If not, try declaring your output variable as TCHAR* (to be generic) or as a wchar_t* and see what results you get in the buffer.
p.s. It is bad form to allocate memory with malloc and release it with delete. You should either use malloc/free pairs or new/delete pairs. An even safer way to allocate a char buffer is to use std::string, or std::wstring for a wide string.
p.p.s. Try making sure your project settings are for a Multibyte project and not a Unicode project. Then everything in Sophia's answer will work.
One more thing... Just use the GetWindowText() API instead of the SendMessage stuff. That's why it exists, so you don't have to go through the rigmarole of casting a pointer to a LPARAM or WPARAM. It's more typesafe and will give you a compile-time error (better than a runtime error) if your types don't match up, especially with Unicode/MBCS and wchar_t/char.
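For example (a sketch using the TCHAR-generic form, assuming hwnd is the control's handle and a fixed-size buffer is acceptable):
TCHAR buffer[1024];
int copied = GetWindowText(hwnd, buffer, 1024); // returns the number of TCHARs copied, excluding the null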

How do I convert const wchar_t* to wchar_t or a multibyte char?

I need to send a multibyte char string to a socket after appending '\n', but what I have is a const wchar_t*. How can I convert it?
If your question is how to actually manipulate the contents of a constant, then consider const_cast.
Appending a '\n' to a const wchar_t* string means you have to make a copy of the original string. Read the MS docs on how to use swprintf for that:
http://msdn.microsoft.com/en-us/library/ybk95axf%28VS.71%29.aspx
If your problem is the conversion, the WideCharToMultiByte function is your friend:
http://msdn.microsoft.com/en-us/library/dd374130%28VS.85%29.aspx
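A typical two-call pattern with WideCharToMultiByte (a sketch; ToUtf8 is a hypothetical name, and CP_UTF8 is an assumption about the target codepage, swap in CP_ACP if you need the ANSI codepage):
#include <windows.h>
#include <string>

std::string ToUtf8(const wchar_t* wide)
{
    // First call: query the required buffer size in bytes, including the terminator.
    int bytes = WideCharToMultiByte(CP_UTF8, 0, wide, -1, nullptr, 0, nullptr, nullptr);
    if (bytes <= 0) return std::string();
    std::string result(bytes, '\0');
    WideCharToMultiByte(CP_UTF8, 0, wide, -1, &result[0], bytes, nullptr, nullptr);
    result.resize(bytes - 1); // drop the terminating null the API wrote
    return result;
}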
You don't need to use direct newline characters to do this:
const wchar_t* original(L"original value");
std::wostringstream streamVal;
streamVal << original << std::endl;
const std::wstring modified(streamVal.str());
Going through a _bstr_t seems a pain but allows you to do the wide char to multibyte conversion pretty easily (small code). Include comsuppw.lib in the list of libraries in your project.
#include "comutil.h"
_bstr_t bstrVal(modified.c_str());
const char* multibytes((const char*)bstrVal);
std::cout << multibytes; // includes newline
How can I convert const wchar_t* to wchar_t*?
const_cast<wchar_t*>(L"Test");
should work for stripping the const, but note that writing through the result is undefined behavior when the underlying string is a literal.

convert BSTR to wstring

How do I convert a char[256] to a wstring?
Update: here is my current code:
char testDest[256];
char *p= _com_util::ConvertBSTRToString(url->bstrVal);
for (int i = 0; i <= strlen(p); i++)
{
testDest[i] = p[i];
}
// need to convert testDest to wstring to I can pass it to this below function...
writeToFile(testDestwstring);
If your input is a BSTR (as it seems to be), the data is already Unicode and you can just convert it directly to a wstring as follows. _bstr_t has implicit conversions to both char* and wchar_t*, which avoids the need for manual Win32 code conversion.
if (url->bstrVal)
{
// true => make a new copy - can avoid this if source
// no longer needed, by using false here and avoiding SysFreeString on source
const _bstr_t wrapper(url->bstrVal, true);
std::wstring wstrVal((const wchar_t*)wrapper);
}
See here for more details on this area of Windows usage. It's easy to mess up the use of the Win32 API in this area - using the BSTR wrapper to do this avoids both data copy (if used judiciously) and code complexity.
MultiByteToWideChar will return a UTF-16 string. You need to specify the source codepage.
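For the char[256]-to-wstring case in the question, the same two-call pattern applies (a sketch; ToWide is a hypothetical name and CP_ACP is an assumption about the source codepage):
#include <windows.h>
#include <string>

std::wstring ToWide(const char* narrow)
{
    // First call: query the required length in wide characters, including the terminator.
    int len = MultiByteToWideChar(CP_ACP, 0, narrow, -1, nullptr, 0);
    if (len <= 0) return std::wstring();
    std::wstring result(len, L'\0');
    MultiByteToWideChar(CP_ACP, 0, narrow, -1, &result[0], len);
    result.resize(len - 1); // drop the terminating null
    return result;
}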
