WinAPI CIniFile::ReadFile() can't read ini file

IDE: VS2005
Is there any way to know why ReadFile() failed? I can't find the reason some of the INI files can't be read. Thanks.
EDIT:
CIniFile iniFile;
iniFile.SetPath( "C:\\Services\\Server\\Server.INI" );
if( iniFile.ReadFile())
my code...
The program never gets in the if block.
And sorry for the confusion: I use this library for the CIniFile class. Hope this information helps pinpoint the problem.
http://www.codeproject.com/kb/cpp/cinifileByCabadam.aspx
EDIT2: I found the reason: some of the INI files are saved as Unicode, and that's why ReadFile() fails. But now the question is how to read Unicode INI files.

Normally GetLastError() should give you an error number to look up.
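For Win32 calls that do set the last-error value, a minimal sketch of turning it into readable text looks like this (note that CIniFile is built on fstream, so it may not set the last error at all):
#include <windows.h>
#include <cstdio>

DWORD err = GetLastError();
char msg[256] = { 0 };
// translate the numeric code into the system's message text
FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS,
               NULL, err, 0, msg, sizeof(msg), NULL);
std::printf("last error %lu: %s\n", err, msg);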
EDIT: In the CIniFile project there seems to be no default constructor; try the CIniFile( string const iniPath ) constructor instead, i.e.
CIniFile iniFile( "C:\\Services\\Server\\Server.INI" );
if( iniFile.ReadFile())
EDIT2: OK, you would need to modify the code to use wfstream instead of fstream; see the sketch below.
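A minimal sketch of reading a UTF-16 file through a wide stream, assuming the INI was saved as UTF-16 with a BOM. std::codecvt_utf16 is C++11; on VS2005 you would instead read the raw bytes, skip the 0xFF 0xFE BOM, and reinterpret the rest as wchar_t:
#include <fstream>
#include <string>
#include <locale>
#include <codecvt>

std::wifstream in("C:\\Services\\Server\\Server.INI", std::ios::binary);
// let the stream decode UTF-16 (consume_header also swallows the BOM)
in.imbue(std::locale(in.getloc(),
    new std::codecvt_utf16<wchar_t, 0x10ffff, std::consume_header>));
std::wstring line;
while (std::getline(in, line))
{
    // parse the key=value pairs here instead of via CIniFile
}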

Related

Loading a PE File and Launching it, how to capture the exit/return code

I'm manually (in code, that is) loading a Windows PE file and executing it successfully with a call to its entry point as defined in the "IMAGE_NT_HEADERS32" structure. However, as this entry point is a void-returning function, how do we 'read' the exit/return code that it provides?
I.e. when I call this (which is the PE file's entry point):
((void(*)(void))EntryAddr)();
Where does it place/put the exit code from its 'int main(...)' or 'int WinMain(...)' call?
Thanks in advance.
Edit: For clarification (see comments below). My application loads an exe directly into memory using VirtualAllocEx, not any NtCreateProcess methods.
With thanks to "Eryk Sun" above, his solution was correct and this code does in fact work:
// call the entry point :: here we assume that everything is ok.
((void(*)(void))EntryAddr)();
DWORD exit;
// GetExitCodeProcess expects a process handle, so pass the
// GetCurrentProcess() pseudo-handle (an HMODULE is not a process handle)
GetExitCodeProcess(GetCurrentProcess(), &exit);

How to port stlport ifstream(HANDLE) to MS stl ifstream?

I'm currently porting code from stlport 5.1.3 to the STL shipped with MSVS 2010. I'm facing a problem and I hope someone can help me.
Somewhere in the code is:
HANDLE lHandle = CreateFileW(...);
ifstream lStream( lHandle );
// more job here...
This builds with stlport because its basic_ifstream has a constructor that takes a void*. But the standard STL doesn't. I should write something like:
ifstream lStream( /*FileName*/ );
...but my file name is a wchar_t*, and the standard ifstream constructor only takes a char*...
Do you know of a workaround?
Thanks in advance,
Dominique
Well, it seems the STL included in MSVC 2010 provides everything I needed, but it was not in the doc.
CreateFile was used because the former std::fstream could not handle wide-char file names. Now it has a constructor and an open() member for that.
Also, the new flavor of ifstream lets the programmer set a sharing protection mode. I needed that too, and it used to be done by CreateFile...
Thus, the "new" STL gives me all the power I needed. Just a little flaw in the doc.
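A minimal sketch of what this looks like with the Microsoft extensions (the wide-filename constructor and the third sharing parameter are MSVC-specific, not standard C++; the path here is a made-up example):
#include <fstream>
#include <share.h>   // _SH_DENYWR

// MSVC extension: wide-char filename plus a share flag, replacing the
// old CreateFileW(...) + stlport ifstream(HANDLE) combination
std::ifstream lStream(L"C:\\data\\input.bin",
                      std::ios::in | std::ios::binary,
                      _SH_DENYWR); // others may read the file, but not write it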

OutputDebugString capture

In a previous question I asked whether any real-time enhanced versions of DbgView exist, and ended up trying to write my own, which worked out well except for one small snag:
OpenMutex(MUTEX_ALL_ACCESS, FALSE, "DBWinMutex")
returns a handle to this mutex, except that it returns NULL on Windows 2003. Anyone know why this might be the case?
The mutex doesn't necessarily exist; OutputDebugString, for example, attempts to create it rather than open it.
Details here: http://www.unixwiz.net/techtips/outputdebugstring.html
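A minimal sketch of working around that, assuming the capture tool should create the mutex itself (as OutputDebugString does) rather than only try to open it:
#include <windows.h>

// create the mutex if it doesn't exist yet, mirroring OutputDebugString;
// if another process beat us to it, CreateMutex just opens the existing one
HANDLE hMutex = CreateMutexA(NULL, FALSE, "DBWinMutex");
if (hMutex == NULL)
{
    // creation/open failed outright; inspect GetLastError()
}
else if (GetLastError() == ERROR_ALREADY_EXISTS)
{
    // the mutex already existed and we received a handle to it
}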
Just in case: I faced a similar permission issue when trying to open the mutex in a MEX file.
This worked for me:
auto str = TEXT("MutexTest");
// requesting only SYNCHRONIZE access (instead of MUTEX_ALL_ACCESS) avoided the issue
HANDLE h1 = OpenMutex(SYNCHRONIZE, FALSE, str);

C++/CLI Changing encoding

Good day,
I've been writing a simple program using the Windows API; it's written in C++/CLI.
The problem I've encountered is this: I'm loading a library (.dll) and then calling its functions. One of the functions returns a char*, so I add the returned value to my textbox:
output->Text = System::Runtime::InteropServices::Marshal::PtrToStringAnsi
(IntPtr(Function()));
Now, as you can see, this is decoded as ANSI; the returned char* is, I presume, also ANSI (or Windows-1252, whatever you guys call it :>). The original data, which the function in the library gets, is encoded in UTF-8: a variable-length byte field terminated by 0x00. There are a lot of non-Latin characters in my program, so this is troubling. I've also tried this:
USES_CONVERSION;
wchar_t* pUnicodeString = 0;
pUnicodeString = A2W( Function());
output->Text = System::Runtime::InteropServices::Marshal::PtrToStringUni
(IntPtr(pUnicodeString));
using atlconv.h. It still prints malformed/wrong characters. So my question would be: can I convert it to something like UTF-8 so I would be able to see correct output, or does the char* lose the necessary information required to do so? Maybe changing the .dll source code would help, but it's quite old and written in C, so I don't want to mess with it :/
I hope the information I provided is sufficient; if you need anything more, just ask.
As far as I know, there is no standard way to handle UTF-8. Try googling for appropriate converters, e.g. http://www.nuclex.org/articles/cxx/10-marshaling-strings-in-cxx-cli , or "Convert from C++/CLI pointer to native C++ pointer".
Also, your second code snippet converts with A2W, which assumes the input is ANSI in the current code page rather than UTF-8, so it doesn't look right.
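A minimal sketch of decoding the returned bytes as UTF-8 on the managed side (Function() is the hypothetical native export from the question, and output is the question's textbox):
#include <cstring>
using namespace System;
using namespace System::Runtime::InteropServices;

// treat the returned char* as raw UTF-8 bytes and decode them explicitly
const char* raw = Function();                 // hypothetical native export
int len = static_cast<int>(std::strlen(raw)); // the field is 0x00-terminated
array<Byte>^ bytes = gcnew array<Byte>(len);
Marshal::Copy(IntPtr(const_cast<char*>(raw)), bytes, 0, len);
output->Text = Text::Encoding::UTF8->GetString(bytes);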

glGenBuffers is NULL giving a 0x0000000 access violation when using glew

I have Visual Studio C++ Express and an NVIDIA GeForce 7900 GS. I'm using GLEW to get at the OpenGL extensions. Calling glGenBuffers crashes, though, as it's a NULL pointer. I have an OpenGL context before I make the call ( wglGetCurrentContext() != NULL ). I'm calling glewInit() before the call. glewGetString( GLEW_VERSION ) is returning GLEW_VERSION_1_5. What am I doing wrong? Is the card too old? Is it the driver?
Remember to make a glewInit() call in your code so that you get valid pointers to GL functions.
Hope it helps.
Without seeing your code it would be difficult to tell, but what you are attempting to do seems like it could be helped a lot by using GLee. It is designed to load all current extensions, and you have the ability to check what is supported, e.g.:
#include <gl\GLee.h> // (no need to link to gl.h)
...
if (GLEE_ARB_multitexture) // is multitexture support available?
{
    glMultiTexCoord2fARB(...); // safe to use multitexture
}
else
{
    // fallback
}
The above was shamelessly copy/pasted from the GLee site, but it displays the functionality I'm trying to showcase.
You need to call glewInit() after you have a valid context. If you're using GLFW, that would be after you've called glfwMakeContextCurrent(myWindow);
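For the WGL setup from the question, a minimal sketch of the ordering looks like this (window and context creation omitted; GLEW_OK, glewExperimental, and glewGetErrorString are all part of the GLEW API):
#include <GL/glew.h>   // must be included before gl.h
#include <cstdio>

// ... create the window and the GL context, then make it current ...

// only after a context is current can GLEW resolve entry points
glewExperimental = GL_TRUE;   // ask GLEW to fetch all entry points
GLenum err = glewInit();
if (err != GLEW_OK)
{
    std::fprintf(stderr, "glewInit failed: %s\n",
                 (const char*)glewGetErrorString(err));
}
else if (glGenBuffers == NULL) // the symptom from the question
{
    std::fprintf(stderr, "glGenBuffers was not resolved by this driver\n");
}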
I have actually run into this problem with GLEW. For me, it was nullifying the function pointer for glGenerateMipmap. I fixed it by simply restoring the pointer to the appropriate function. This is my example in Linux:
glGenerateMipmap = (void(*)(GLenum))
glXGetProcAddressARB((GLubyte*)"glGenerateMipmap");
There is a WGL equivalent for glXGetProcAddress; I just don't remember the name off the top of my head. Try manually restoring the functions using this method. If you come across many functions that are null, something is definitely wrong in your setup process. The only other functions I recall having to restore were glGenVertexArrays, glBindVertexArray, and glDeleteVertexArrays. If your glGenBuffers is null, odds are that glBindBuffer and glDeleteBuffers are null as well. :(
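For reference, the WGL counterpart is wglGetProcAddress; a sketch of the same repair on Windows (PFNGLGENBUFFERSPROC comes from glew.h/glext.h):
#include <windows.h>   // wglGetProcAddress
#include <GL/glew.h>   // declares the glGenBuffers pointer

if (glGenBuffers == NULL)
{
    // manually restore the nullified entry point, mirroring the glX example above
    glGenBuffers = (PFNGLGENBUFFERSPROC)wglGetProcAddress("glGenBuffers");
}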
Test if the desired extension is actually supported by checking the string returned by glGetString(GL_EXTENSIONS); if it's not there you know what's causing your problems.
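A quick sketch of that check; a naive substring match is good enough for a debug print, and GL_ARB_vertex_buffer_object is the extension behind glGenBuffers:
#include <cstring>

const char* exts = (const char*)glGetString(GL_EXTENSIONS);
if (exts == NULL || std::strstr(exts, "GL_ARB_vertex_buffer_object") == NULL)
{
    // the VBO extension is missing, which would explain the NULL glGenBuffers
}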
