Why does message box in C++ show chinese message [duplicate]
This question already has answers here:
Why does my simple C++ GUI application show a message box in Chinese? (3 answers)
Closed 6 years ago.
ConsoleApplication.cpp

// ConsoleApplication2.cpp : Defines the entry point for the console application.
//
#include "stdafx.h"

int main()
{
    std::cout << "Hello World" << std::endl;
    MessageBox(0, LPTSTR("Text Here"), LPTSTR("Text Here"), MB_OK);
    std::cin.clear();
    std::cout << "Press Enter to continue" << std::endl;
    std::cin.ignore(32767, '\n');
    std::cin.get();
    return 0;
}
stdafx.h
// stdafx.h : include file for standard system include files,
// or project specific include files that are used frequently, but
// are changed infrequently
//
#pragma once
#include "targetver.h"
#include <stdio.h>
#include <tchar.h>
#include <iostream>
#include <Windows.h>
#include <sstream>
Whenever I run the MessageBox code, the text comes out in Chinese characters. The code is above.
Your LPTSTR type-cast is wrong. When compiling with UNICODE defined, MessageBox() resolves to MessageBoxW(), and LPTSTR resolves to wchar_t*, so you are really doing this:
MessageBoxW(0, (wchar_t*)"Text Here", (wchar_t*)"Text Here", MB_OK);
You are telling the compiler to type-cast narrow strings as if they were wide strings. The narrow string "Text Here" consists of bytes 54 65 78 74 20 48 65 72 65. When those same bytes are interpreted as a wide string, they produce the Chinese string "敔瑸䠠牥e".
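For illustration, here is a minimal sketch that performs the same reinterpretation and dumps the resulting code units. Strictly speaking the cast is undefined behaviour; it is shown only to demonstrate the effect, assuming a little-endian system and Windows's 2-byte wchar_t:

#include <iostream>

int main()
{
    const char narrow[] = "Text Here"; // 9 chars + null terminator = 10 bytes
    // Reinterpret the same bytes as wide characters, exactly like the bad cast:
    const wchar_t* wide = reinterpret_cast<const wchar_t*>(narrow);
    for (size_t i = 0; i < sizeof(narrow) / sizeof(wchar_t); ++i)
        std::wcout << std::hex << unsigned(wide[i]) << L' ';
    // On Windows this prints: 6554 7478 4820 7265 65
    // which are the code points of "敔瑸䠠牥e"
}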
If you are going to use TCHAR-based APIs, like MessageBox(), you need to use the TEXT() macro when passing char/string literals to them at compile-time, eg:
MessageBox(0, TEXT("Text Here"), TEXT("Text Here"), MB_OK);
When UNICODE is defined, TEXT() compiles a literal as wide. When UNICODE is not defined, TEXT() compiles a literal as narrow.
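Conceptually, TEXT() behaves like this simplified model (the real macro in <winnt.h> goes through an internal __TEXT() helper, so don't define this yourself):

// Simplified model of the Win32 TEXT() macro:
#ifdef UNICODE
    #define TEXT(quote) L##quote   // wide literal, e.g. L"Text Here"
#else
    #define TEXT(quote) quote      // narrow literal, e.g. "Text Here"
#endif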
Otherwise, do not rely on TCHAR-based APIs at all. Use the ANSI/Unicode versions explicitly, eg:
MessageBoxA(0, "Text Here", "Text Here", MB_OK);
MessageBoxW(0, L"Text Here", L"Text Here", MB_OK);
On a side note: your call to cin.ignore() should use std::numeric_limits instead of hard-coding the length:
#include <limits>
std::cin.ignore(std::numeric_limits<std::streamsize>::max(), '\n');
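Putting both fixes together, the original program could look like this (assuming a Unicode build and the same stdafx.h):

// ConsoleApplication2.cpp : corrected version
#include "stdafx.h"
#include <limits>

int main()
{
    std::cout << "Hello World" << std::endl;
    // TEXT() selects the correct literal type for both ANSI and Unicode builds:
    MessageBox(0, TEXT("Text Here"), TEXT("Text Here"), MB_OK);
    std::cin.clear();
    std::cout << "Press Enter to continue" << std::endl;
    std::cin.ignore(std::numeric_limits<std::streamsize>::max(), '\n');
    std::cin.get();
    return 0;
}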
Related
I am trying to learn more about cyber security, in this case about buffer overflows. I have a simple program whose control flow I want to change:
#include <stdlib.h>
#include <unistd.h>
#include <stdio.h>
#include <string.h>

void win()
{
    printf("code flow successfully changed\n");
}

int main(int argc, char **argv)
{
    volatile int (*fp)();
    char buffer[64];

    fp = 0;
    gets(buffer);

    if(fp) {
        printf("calling function pointer, jumping to 0x%08x\n", fp);
        fp();
    }
}
By using some tools I have determined that the function pointer (fp) gets its value overwritten after 72 characters have been written into the buffer. The function win() is located at value 0xe5894855, so after those 72 characters I need to provide that value to the buffer for it to jump to the desired function.
However I am facing this issue:
By putting Python 3's print("A"*18*4 + "UH" + "\x89" + "\xe5") into the input of the given C program, I should be getting the desired value 0xe5894855 in the section marked in red. But instead, I am getting the highlighted malformed hex from somewhere: 89 is getting an extra C2, and the incorrect e5 value is overflowing into the next part of the stack. (The values in those parts of the stack are zero initially, but change into that once the overflow is attempted.)
Why is this happening? Am I putting hex values into the C program incorrectly?
Edit: I still have not figured out why passing hex through Python did not work, but I found a different method using Perl: perl -e 'print "A"x4x18 . "\x55\x48\x89\xe5"', which did work. The address I needed to jump to was also incorrect (which I have also fixed).
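For comparison, a tiny C program that emits the same raw payload bytes as that Perl one-liner (the 72-byte offset and the target value are taken from the question, so treat them as assumptions):

#include <stdio.h>

int main(void)
{
    // 72 filler bytes to reach the function pointer...
    for (int i = 0; i < 72; i++)
        putchar('A');
    // ...then the 4-byte target value, written little-endian as raw bytes:
    fwrite("\x55\x48\x89\xe5", 1, 4, stdout);
    return 0;
}

Compiled and piped into the vulnerable program (./emit | ./vuln), this avoids any text-encoding layer between the payload and stdin.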
Here's my basic program; it should compile fairly easily with Visual Studio (even Express).
// ConsoleApplication1.cpp : This file contains the 'main' function. Program execution begins and ends there.
//
#include "stdafx.h"
#include <iostream>
#include <windows.h>
#include <mmsystem.h>
#include <stdint.h>
#pragma comment(lib, "winmm.lib")

HWAVEIN hWaveIn;
WAVEFORMATEX WaveFormat;
WAVEHDR WaveHeader;

typedef union
{
    uint32_t u32;
    struct
    {
        int16_t iLeft;
        int16_t iRight;
    };
} audiosample16_t;

#define AUDIORATE (44100*4)
#define SECONDS (13)
audiosample16_t MyBuffer[AUDIORATE*SECONDS];

int _tmain(int argc, _TCHAR* argv[])
{
    std::cout << "Hello World!\n";

    UINT WaveId = 0;
    WaveFormat.wFormatTag = WAVE_FORMAT_PCM;  // simple, uncompressed format
    WaveFormat.nChannels = 2;                 // 1=mono, 2=stereo
    WaveFormat.nSamplesPerSec = 44100;
    WaveFormat.wBitsPerSample = 16;           // 16 for high quality, 8 for telephone-grade
    WaveFormat.nBlockAlign = WaveFormat.nChannels*WaveFormat.wBitsPerSample/8;
    WaveFormat.nAvgBytesPerSec = (WaveFormat.nSamplesPerSec)*(WaveFormat.nChannels)*(WaveFormat.wBitsPerSample)/8;
    WaveFormat.cbSize = 0;

    WaveHeader.lpData = (LPSTR)MyBuffer;
    WaveHeader.dwBufferLength = sizeof(MyBuffer);
    WaveHeader.dwFlags = 0;

    std::cout << "Hello World!\n";
    //std::cout << std::flush;

    HRESULT hr;
    if(argc > 1)
        hr = waveInOpen(&hWaveIn, WaveId, &WaveFormat, 0, 0, CALLBACK_NULL);

    std::cout << "Hello World!\n";
    std::cout << "Hello World!\n";
    //std::cout << std::flush;
    return 0;
}
If you call it from the command line with no arguments, everything prints out fine (several 'Hello World!'s). If you redirect the output to a file (myprog.exe > blah.txt), again, everything works fine and several lines of 'Hello World!' end up in the file as expected.
HOWEVER, if you pass an argument (so that waveInOpen() is called), it will not redirect anything to the file; the file stays empty. If you don't redirect the output, it prints to the command prompt just fine.
UNLESS you uncomment the std::flush lines; then the file isn't empty and everything works fine.
What the heck is going on under the hood that's causing that? Shouldn't stdout be flushed on exit and piped to the file no matter what? What is the waveInOpen() call doing that screws up the stdio buffering like that?
FWIW, this came to light because we're calling this program from TCL and Python to do audio quality measurements on an attached product and nothing was being read back, even though it would print out fine when run from the command line (and not redirected).
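For reference, a workaround sketch based on the observation above: force the buffered output out before waveInOpen() runs, or disable stdio buffering entirely (the setvbuf variant is a hypothetical extra mitigation, not something we have tested):

std::cout << std::flush;              // uncommenting the flushes fixes it
setvbuf(stdout, NULL, _IONBF, 0);     // blunt alternative: unbuffered stdout
hr = waveInOpen(&hWaveIn, WaveId, &WaveFormat, 0, 0, CALLBACK_NULL);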
How can I make Russian letters visible in a dialog loaded from a LED file?
When the LED file is Unicode, IupLoad() returns an error.
When the LED file is UTF-8, IUP believes it has loaded and shown the dialog, but there is only vacuum.
When the LED file is ANSI, we get the predictable result:
(Ignore the red box, I’ve placed it there for another question.)
C file:

#include <stdlib.h>
#include <iup.h>

int main(int argc, char **argv)
{
    IupSetGlobal("UTF8MODE", "YES");
    // IupSetGlobal("UTF8MODE_FILE", "YES");
    IupOpen(&argc, &argv);
    if (IupLoad("dropdown.led")) IupMessage("Error", "Failed to load LED.");
    else {
        Ihandle *dropdown = IupGetHandle("dropdown");
        IupShow(dropdown);
        IupMainLoop();
    }
    IupClose();
    return EXIT_SUCCESS;
}
Accompanying dropdown.led file:
dropdown = DIALOG[TITLE=dropdown.led](
HBOX[CMARGIN=10x10,CGAP=10](
LIST[VALUE=3, 1=я, 2=ты, 3=оно, 4=мы, 5=вы, 6=они, DROPDOWN=YES](do_nothing),
LIST[VALUE=3, 1=ik, 2=je, 3=hij, 4=we, DROPDOWN=YES](do_nothing)
)
)
Update: an experiment with manual LED file loading
I have attempted a workaround in the form of loading the LED file manually (my function LoadLED() below) and replacing IupLoad() with IupLoadBuffer(). However, this failed too, albeit (oddly enough) in reverse:
When the LED file is Unicode, IUP believes it has loaded and shown the dialog, but there is only vacuum.
When the LED file is UTF-8, IupLoadBuffer() returns an error.
So IupLoadBuffer() reverses the undesirable behaviour of IupLoad() with respect to UTF-8 and Unicode files, but the outcome is still faulty, not the desired one.
IupMessage() confirms that UTF-8 mode is in force: it displays the Russian letters from the (UTF-8) LED file correctly. This demonstrates that the problem is localised in the IupLoad() and IupLoadBuffer() functions rather than being caused by my incompetence. (In the end, it was kind of neither: the functions work as intended, but I had no way of knowing the specific conditions necessary to make them work.)
Modified C file:
#include <stdio.h>
#include <stdlib.h>
#include <iup.h>

char *LoadLED(char *buffer, size_t size, char *ledFileName) {
    FILE *led;
    if ((led = fopen(ledFileName, "rb"))) { /* Binary mode for UTF-8! */
        /* Null-terminate: the buffer is used as a C string below. */
        size_t n = fread(buffer, 1, size - 1, led);
        buffer[n] = '\0';
        fclose(led);
        IupMessage("Loaded LED file", buffer);
        return buffer;
    }
    else return IupMessage("Error", "Failed to load LED."), NULL;
}
int main(int argc, char **argv) {
    IupSetGlobal("UTF8MODE", "YES");
    IupSetGlobal("UTF8MODE_FILE", "YES");
    IupOpen(&argc, &argv);

    char buffer[20000L], ledFileName[] = "dropdown.led";
    if (!LoadLED(buffer, sizeof(buffer), ledFileName)) return EXIT_FAILURE;

    if (IupLoadBuffer(buffer))
        return IupMessage("Error", "Failed to load buffer."), EXIT_FAILURE;
    else {
        Ihandle *dropdown = IupGetHandle("dropdown");
        IupShow(dropdown);
        IupMessage("Success", "IUP thinks it has loaded buffer and displayed dialog.");
        IupMainLoop();
    }
    return IupClose(), EXIT_SUCCESS;
}
All questions that pertain to this particular example:
How do I get access to GUI elements in an IUP dialog loaded from a LED file?
How can I make Russian letters visible in an IUP dialog loaded from a LED file? (current)
A gap in IUP dropdown lists
First, IUP does NOT support Unicode, so testing with Unicode (UTF-16) files is useless.
UTF8MODE_FILE applies to file names, so it does not affect this case.
The UTF-8 strings may be affecting the LED parser, although they shouldn't. Make sure the LED file does NOT have a UTF-8 BOM. I tested your LED file here and it works with both IupLoad and IupLoadBuffer, but in both cases there are problems with the strings.
The solution is actually simple: just wrap your strings in quotes "", for instance:
LIST[VALUE=3, 1="я", 2="ты", 3="оно", 4="мы", 5="вы", 6="они", DROPDOWN=YES](do_nothing),
It works.
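For completeness, here is the question's dropdown.led with that fix applied; only the Russian list needs it, but quoting the Dutch values too should be harmless (an assumption, not something the answer states):

dropdown = DIALOG[TITLE=dropdown.led](
  HBOX[CMARGIN=10x10,CGAP=10](
    LIST[VALUE=3, 1="я", 2="ты", 3="оно", 4="мы", 5="вы", 6="они", DROPDOWN=YES](do_nothing),
    LIST[VALUE=3, 1="ik", 2="je", 3="hij", 4="we", DROPDOWN=YES](do_nothing)
  )
)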
I am trying to do some simple box drawing in the terminal using Unicode characters. However, I noticed that wcout wouldn't output anything for the box-drawing characters, not even a placeholder. So I wrote the program below to find out which Unicode characters were supported, and found that wcout refused to output anything above 255. Is there something I have to do to make wcout work properly? Why can't I access any of the extended Unicode characters?
#include <wchar.h>
#include <locale>
#include <iostream>
using namespace std;

int main()
{
    for (wchar_t c = 0; c < 0xFFFF; c++)
    {
        cout << "Iteration " << (int)c << endl;
        wcout << c << endl << endl;
    }
    return 0;
}
I don't recommend using wcout because it is non-portable, inefficient (always performs transcoding) and doesn't support all of Unicode (e.g. surrogate pairs).
Instead you can use the open-source {fmt} library to portably print Unicode text including box drawing characters, for example:
#include <fmt/core.h>

int main() {
  fmt::print("┌────────────────────┐\n"
             "│ Hello, world!      │\n"
             "└────────────────────┘\n");
}
prints (https://godbolt.org/z/4EP6Yo):
┌────────────────────┐
│ Hello, world!      │
└────────────────────┘
Disclaimer: I'm the author of {fmt}.
I have an application which reads the user default locale on Windows Vista and above. When I tried calling the API to get the user default locale, it crashed. The code is below; it would be helpful if anyone could point out the reason.
#include <iostream>
#include <WinNls.h>
#include <Windows.h>

int main()
{
    LPWSTR lpLocaleName = NULL;
    cout << "Calling GetUserDefaultLocaleName";
    int ret = GetUserDefaultLocaleName(lpLocaleName, LOCALE_NAME_MAX_LENGTH);
    cout << lpLocaleName << endl;
}
You need to have lpLocaleName pointing at a buffer prior to calling the API. As a general rule, if an API has an LPWSTR parameter, allocate a buffer for it first (with malloc or new) of the desired length, in this case LOCALE_NAME_MAX_LENGTH. Setting it to NULL and passing it to the API function is a guaranteed way to crash!
Hope this helps,
Best regards,
Tom.
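A minimal sketch of that suggestion, heap-allocating the buffer (a stack array, as in the next answer, works just as well):

#include <Windows.h>
#include <iostream>

int main()
{
    // Allocate the buffer before calling the API, as described above:
    LPWSTR lpLocaleName = new WCHAR[LOCALE_NAME_MAX_LENGTH];
    int ret = GetUserDefaultLocaleName(lpLocaleName, LOCALE_NAME_MAX_LENGTH);
    if (ret > 0)
        std::wcout << lpLocaleName << std::endl;  // e.g. prints "en-US"
    delete[] lpLocaleName;
    return 0;
}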
In addition to the previous answers, you should also be aware that you can't print a wide string with cout; instead, you should use wcout.
So:
#include <Windows.h>   // must come before WinNls.h, which relies on its types
#include <WinNls.h>
#include <iostream>
#define ARRSIZE(arr) (sizeof(arr)/sizeof(*(arr)))
using namespace std;

int main()
{
    WCHAR localeName[LOCALE_NAME_MAX_LENGTH] = {0}; // WCHAR, not WCHAR_T
    cout << "Calling GetUserDefaultLocaleName";
    int ret = GetUserDefaultLocaleName(localeName, ARRSIZE(localeName));
    if (ret == 0)
        cout << "Cannot retrieve the default locale name." << endl;
    else
        wcout << localeName << endl;
    return 0;
}
I believe you need to initialise lpLocaleName to an empty buffer of, for example, 256 wide characters, and then pass that length (256) where you currently have LOCALE_NAME_MAX_LENGTH.