Visual Studio 2017 C++: How can I programmatically select code page 1252 (English), 932 (Shift JIS, Japanese) and 65001 (UTF-8, multilingual)?

I want to support code page 1252 (English), 932 (Shift JIS, Japanese), and code page 65001 (UTF-8), but SetThreadUILanguage(65001) fails and returns 1033 (0x409). If I compile with a manifest, I can use code page 65001.
My example tries to display the Euro sign with code page 65001. It works with a manifest but does not work without one. Our customers have lots of code page 932 files, so I'd like to be able to switch on the fly.
Example code here:
https://github.com/markmenge/UTF8.git
I tried calling SetThreadUILanguage(65001) in CWinApp constructor, but it didn't work.
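For illustration, here is a minimal sketch (not taken from the linked repository) of one way to handle mixed encodings without switching the thread or process code page at all: convert each file's bytes to UTF-16 with MultiByteToWideChar, passing the code page the file was saved in (1252, 932, or CP_UTF8 = 65001).
#include <windows.h>
#include <string>

// Sketch only: convert a buffer read from disk to UTF-16, given the code
// page it was saved in (1252, 932, or CP_UTF8 = 65001).
std::wstring BytesToWide(const std::string& bytes, UINT codePage)
{
    if (bytes.empty())
        return std::wstring();
    int len = MultiByteToWideChar(codePage, 0, bytes.data(),
                                  static_cast<int>(bytes.size()), nullptr, 0);
    std::wstring wide(len, L'\0');
    MultiByteToWideChar(codePage, 0, bytes.data(),
                        static_cast<int>(bytes.size()), &wide[0], len);
    return wide;
}

// e.g. std::wstring text = BytesToWide(fileBytes, 932);   // Shift JIS file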

Related

VS Code can't read unicode language written in other editors

I'm trying to use VS Code, but I'm having a problem opening code written with other editors: VS Code can't read Korean text (Unicode? UTF-8? whatever we call other languages in code).
I wrote the code in the vim editor with some comments written in Korean. Every other editor can read the Korean, but VS Code shows it like the following.
ret.insert(ret.end(), bottom.begin(), bottom.end()); // written in vim
// 는 다음과 같음
ret.insert(ret.end(), bottom.begin(), bottom.end()); // opened in VS Code
// �� ������ ����
How can I fix this problem?
Make sure VS Code is opening the file with UTF-8 encoding.
Change the encoding of a file in Visual Studio Code

Why does my DLL not seem to be called?

I have a COM DLL, coded in Delphi. It should be invoked via an ActiveX control when a web page loads in MS IE (via some JavaScript on the page).
Btw, this all works fine with an existing serial port interface, but I am recoding the DLL to read from USB; all else is unchanged.
It works fine in the Delphi IDE, but not "in the field". The ActiveX control should request it to read some input from a USB port and should then send that to the web page.
Reading from the USB device works, as I can open Notepad and see the value being written there.
The DLL will display a form and a dialog box, and will write to the system debug trace. Since I am seeing none of these when loading the web page in MS IE, I think we can assume that the ActiveX control is not calling into the DLL.
In MS IE I have enabled all Active X options.
In c:\Windows\System32 (which is equivalent to c:\Windows\SysWOW64), I have run regsvr32.exe -u my_dll.dll and then regsvr32.exe my_dll.dll, both of which the system reported as successful.
I searched, and there is only one copy of my_dll.dll under c:\Windows,
and it has the correct size and date/time.
My %path% is %SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem; for the system and empty for the user.
Any idea what I am doing wrong? Or how I can go about tracking it down?
If you are loading the ActiveX control in a web page through JavaScript, you will have to package the control for web deployment. See this example for how to do this in your JavaScript and check whether you have done it properly:
Calling Activex Control 's Functions from javascript
Once you do the above correctly and open your website in IE, the web page will at least "load" the ActiveX control. Beyond that, you can display message boxes or write logs in your Delphi code to track down the actual coding issues.

ActiveX Control - MFC Locale

I have an MFC application developed with Visual Studio 2008 in which I use the Adobe ActiveX control (I have Adobe Reader X installed). I set the zooming rectangle using the setViewRect function and it works perfectly.
The problem appears when, in my Windows Regional Settings, the decimal symbol is set to a comma instead of a dot (as in the German regional settings). The parameters of the zooming rectangle then seem to be interpreted incorrectly.
I used Process Monitor and discovered that when the Adobe ActiveX control is created and its DLL is loaded into my process, it calls setlocale, and hence the application uses the current Windows Regional Settings instead of the default "C" locale. As a result, the application interprets the numbers in the wrong way.
I tried to reset the locale to "C" right after loading Adobe and this workaround fixed the problem.
The problem appeared again when I migrated my application to Visual Studio 2010. Apparently the Adobe DLL ("c:\Program Files (x86)\Common Files\Adobe\Acrobat\ActiveX\AcroPDF.dll") was built with Visual Studio 2008, so when it sets the locale it does so through MSVCR90.dll, and when I reset the locale I did so through the same DLL.
Now that my application is built with VS2010, calling setlocale goes through MSVCR100.dll, so it does not affect the locale already set in MSVCR90.dll.
Is there a way to set the locale of the COM object that I am hosting inside my application?
Thank you so much in advance :)
This is just a shot in the dark, but you could try loading MSVCR90.dll with LoadLibrary (since the DLL is already loaded, this will just give you a handle to it rather than loading it a second time), then finding the pointer to its setlocale function with GetProcAddress. You can then call that DLL's setlocale. It's an ugly hack, but it might work.
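Roughly, as a sketch of that hack (the helper name is made up, and MSVCR90.dll is assumed to be the CRT that AcroPDF.dll links against):
#include <windows.h>
#include <clocale>

// Sketch of the hack described above: call setlocale inside the VC9 CRT
// (MSVCR90.dll) that AcroPDF.dll uses, resetting its locale to "C".
typedef char* (__cdecl* SetLocaleFn)(int category, const char* locale);

void ResetAcroPdfCrtLocale()        // hypothetical helper name
{
    // The CRT is already loaded by AcroPDF.dll, so this only adds a reference.
    HMODULE hCrt = LoadLibraryW(L"MSVCR90.dll");
    if (!hCrt)
        return;
    SetLocaleFn pSetLocale =
        reinterpret_cast<SetLocaleFn>(GetProcAddress(hCrt, "setlocale"));
    if (pSetLocale)
        pSetLocale(LC_ALL, "C");    // reset the locale seen by MSVCR90.dll
    FreeLibrary(hCrt);              // drop the extra reference
}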

'Locale' configuration and its relationship with Windows API

Can the locale configuration of a system, or the keyboard type configuration of that system, in any way affect which API is called at the kernel level? To be specific, if a program invokes the CreateFile() API, the Windows API documentation says that the call gets delegated to either CreateFileA or CreateFileW. If that program is run on a system in China with a Chinese keyboard, which of the two functions will be called?
Unicode and Locale are two completely orthogonal concepts.
With regard to CreateFileA vs CreateFileW, the setting that controls this is a compile-time setting: if the application is compiled to use the Unicode character set, then CreateFileW will be called; if the compile settings indicate that the application should be compiled as a multi-byte application, then CreateFileA will be called.
If you are using Visual Studio C++, then examine the Project settings of the application. Under Configuration Properties, on the page called "General" will be a setting "Character Set" - This can be set to "Use Unicode Character Set", or "Use Multi-Byte Character Set". The effect of this setting is to automatically add another property sheet to the solution - visible under the Property Manager: The "Unicode Support" property sheet adds the preprocessor definitions "_UNICODE,UNICODE" - the "Multi-Byte Character Support" property sheet instead adds "_MBCS".
Now, if you look at the definition for CreateFile you will see it is a macro itself, and when you build your application, all CreateFile calls in the code will resolve to CreateFileW calls if UNICODE is defined, and CreateFileA calls otherwise.
UNICODE is used by windows.h to switch between the wide and ANSI versions of the Windows API calls.
_MBCS and _UNICODE are used by the Microsoft C Runtime headers - principally tchar.h - to switch the C library between single-byte, multi-byte, and Unicode characters in the functions that support them.
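As a concrete illustration (a minimal sketch; the file name is made up), the same source line compiles against either entry point depending on those defines:
#include <windows.h>
#include <tchar.h>

// With UNICODE/_UNICODE defined, CreateFile and _T() expand to CreateFileW
// and a wide string literal; without them you get CreateFileA and a narrow
// literal. The same source compiles either way.
int main()
{
    HANDLE h = CreateFile(_T("example.txt"),      // made-up file name
                          GENERIC_READ, FILE_SHARE_READ, nullptr,
                          OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (h != INVALID_HANDLE_VALUE)
        CloseHandle(h);
    return 0;
}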

QT and Win32 Console Applications

I have a Win32 console app that is showing this behavior.
1) Using VC 2005 cl to compile and link, the application works fine. By "working fine" I mean that characters above 128 display correctly according to code page 437.
2) When I use Qt's qmake to construct a project (QT += console and SOURCES = main.c), the build goes fine and my main.exe is created, but characters above 128 written with the WriteConsoleOutput function display differently (some weird characters). I have the feeling this has to do with the code page not being set up correctly. I did not call any Qt functions, nor did I create a QApplication or QCoreApplication object. When I did create a QApplication or QCoreApplication object, the results were the same (the correct characters were not displayed).
Is there any way to display characters above 128 correctly using the Win32 console and Qt?
I certainly wouldn't recommend using WriteConsoleOutput, as that's a Windows-specific API. Qt provides an easy way to write out strings using QTextStream:
#include <QFile>
#include <QTextStream>
#include <QObject>
#include <cstdio>

int main() {
    QFile f;
    f.open(stdout, QIODevice::WriteOnly);   // setup: wrap stdout for QTextStream
    QTextStream qout(&f);
    qout << QObject::tr("translate this text") << "\n";   // usage
    return 0;
}
I would recommend you use UTF-8 for everything, if possible. Then you don't have to worry about the different encodings, etc. If you are required to output in your local encoding for some reason, then consider QString::fromLocal8Bit().
I solved the problem by using the WriteConsoleA function.
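For completeness, a minimal sketch of that approach, assuming console output in code page 437 (the byte values are arbitrary CP437 box-drawing characters chosen for illustration):
#include <windows.h>

// Sketch: make sure the console output code page is 437, then write raw
// bytes with WriteConsoleA so values above 127 map to the OEM glyphs.
int main()
{
    SetConsoleOutputCP(437);                              // OEM United States
    const char box[] = "\xC9\xCD\xBB\n\xC8\xCD\xBC\n";    // CP437 box-drawing bytes
    DWORD written = 0;
    WriteConsoleA(GetStdHandle(STD_OUTPUT_HANDLE),
                  box, sizeof(box) - 1, &written, nullptr);
    return 0;
}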

Resources