Referencing GUIDs - winapi

I'm trying to capture an event and reference a GUID to see which event it is. The code is below:
DWORD WINAPI AvisionEventProc(LPVOID lpParam){
    //HANDLE hEvent = * (HANDLE *) lpParam; // This thread's read event
    STINOTIFY pStiNotify;
    if (debug){
        wprintf(L"Avision Event\n");
    }
    while(true){
        WaitForSingleObject(hAvisionEvent, INFINITE);
        wprintf(L"Event");
        pStiDevice->GetLastNotificationData(&pStiNotify);
        if (pStiNotify.guidNotificationCode == GUID_STIUserDefined1){
            wprintf(L"User defined 1");
        }else if (pStiNotify.guidNotificationCode == GUID_STIUserDefined2){
            wprintf(L"User defined 2");
        }else if (pStiNotify.guidNotificationCode == GUID_STIUserDefined3){
            wprintf(L"User defined 3");
        }
        ResetEvent(hAvisionEvent);
    }
    return 1;
}
This compiles just fine but I get the errors below when linking:
1>sti.obj : error LNK2001: unresolved external symbol _GUID_STIUserDefined3
1>sti.obj : error LNK2001: unresolved external symbol _GUID_STIUserDefined2
1>sti.obj : error LNK2001: unresolved external symbol _GUID_STIUserDefined1
The strange thing is that sti.h is included, as I am pulling other constants from it. I did notice the following next to the GUID declarations:
#if defined( _WIN32 ) && !defined( _NO_COM)
/*
* Class IID's
*/
// B323F8E0-2E68-11D0-90EA-00AA0060F86C
DEFINE_GUID(CLSID_Sti, 0xB323F8E0L, 0x2E68, 0x11D0, 0x90, 0xEA, 0x00, 0xAA, 0x00, 0x60, 0xF8, 0x6C);
/*
* Interface IID's
*/
// {641BD880-2DC8-11D0-90EA-00AA0060F86C}
DEFINE_GUID(IID_IStillImageW, 0x641BD880L, 0x2DC8, 0x11D0, 0x90, 0xEA, 0x00, 0xAA, 0x00, 0x60, 0xF8, 0x6C);
<snip>
/*
* Standard event GUIDs
*/
// {740D9EE6-70F1-11d1-AD10-00A02438AD48}
DEFINE_GUID(GUID_DeviceArrivedLaunch, 0x740d9ee6, 0x70f1, 0x11d1, 0xad, 0x10, 0x0, 0xa0, 0x24, 0x38, 0xad, 0x48);
<snip>
#endif
Does the "if defined" line stop the GUIDs being referenced (I am writing a win32 console app) or is there something more fundamental I have wrong here to do with a lack of understanding on GUIDs?
Thanks, in advance for your help.
Cheers,
Neil

#include <initguid.h> should be added. That will help.

The DEFINE_GUID macro either defines a named GUID constant or merely declares it as an external symbol to be defined somewhere else, depending on whether <initguid.h> has been included beforehand. Your code probably only has the latter, so the symbols are never actually defined anywhere in the project.
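For example (assuming sti.h is the header carrying those DEFINE_GUID lines), add exactly one source file to the project that includes <initguid.h> first:

// StiGuids.cpp (hypothetical file name) -- the one translation unit that
// actually instantiates the GUIDs. <initguid.h> must come first so that the
// DEFINE_GUID invocations in sti.h expand to definitions rather than
// extern declarations.
#include <initguid.h>
#include <sti.h>

Every other file keeps including sti.h as before; they pick up the extern declarations, and the linker resolves them against this one set of definitions.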
See:
How to avoid error "LNK2001 unresolved external" by using DEFINE_GUID
INITGUID: An Explanation
How does the DEFINE_GUID macro work? - at the bottom of the page

Related

How can I run LoadEnclaveData function in memory?

I read about memory enclaves and found them an interesting feature for hiding some data, so here I am.
I wasn't able to find much on the required API, and the MSDN documentation has no source code on usage, but I know that I have to call the following:
IsEnclaveTypeSupported : to make sure I can continue.
CreateEnclave : to return the base address of the enclave created, although I struggled with this one too, but this question helped me.
LoadEnclaveData : to add the data to our created enclave.
InitializeEnclave : to activate the enclave.
Based on the Windows Internals book (Part 1), to execute I have to run the EENTER assembly instruction, which I also couldn't find information on, but I think CallEnclave with the base address of the enclave can do the job.
Anyways, I'm stuck at step 3: my LoadEnclaveData call is returning error code 87, which is ERROR_INVALID_PARAMETER.
I'm only copying NOPs (0x90) to the address, just so I can see through the debugger that it is running.
Here is the code:
LPVOID lpAddress;
ENCLAVE_CREATE_INFO_VBS VBS = { 0 };
VBS.Flags = 0;
HANDLE hProcess = GetCurrentProcess();

lpAddress = CreateEnclave(
    hProcess,
    NULL,
    2097152,
    NULL,
    ENCLAVE_TYPE_VBS,
    &VBS,
    sizeof(ENCLAVE_CREATE_INFO_VBS),
    NULL
);
printf("[-] GetLastError : %d \n", GetLastError());
printf("[+] %-20s : 0x%-016p\n", "lpAddress addr", (void*) lpAddress);

unsigned char buffer[] = { 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90,
                           0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90, 0x90 };

LoadEnclaveData(
    hProcess,
    lpAddress,
    &buffer,
    sizeof(buffer),
    PAGE_READWRITE,
    NULL,
    0,
    0,
    0
);
printf("[-] GetLastError : %d \n", GetLastError());
Based on MSDN's LoadEnclaveData documentation, they didn't specify what to do with lpPageInformation, so I thought that was the problem, but they say "The lpPageInformation parameter is not used." So I rechecked a couple of parameters and found out that nSize must be a whole-number multiple of the page size. So now I'm confused: what do I do?
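For reference, here is roughly what I imagine a page-sized call would look like; this is untested, it only addresses the size requirement quoted above, and it reuses hProcess and lpAddress from the code above (malloc/memset also need <stdlib.h> and <string.h>):

SYSTEM_INFO si;
GetSystemInfo(&si);                                   // query the actual page size

SIZE_T nSize = si.dwPageSize;                         // one page (typically 4096)
unsigned char *page = (unsigned char *)malloc(nSize);
memset(page, 0x90, nSize);                            // fill the whole page with NOPs

SIZE_T bytesWritten = 0;
DWORD enclaveError = 0;

BOOL ok = LoadEnclaveData(
    hProcess,
    lpAddress,          // base address returned by CreateEnclave above
    page,
    nSize,              // whole-number multiple of the page size, as documented
    PAGE_READWRITE,     // unchanged from my original attempt
    NULL,               // lpPageInformation: "not used" per the documentation
    0,
    &bytesWritten,      // real out-parameters instead of literal zeros
    &enclaveError
);
printf("[-] ok=%d GetLastError : %lu \n", ok, (unsigned long)GetLastError());
free(page);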
In case anyone is sure about executing a buffer in an enclave, please let me know.
And BTW, choosing the title is killing me; I wasted more time on that than on writing this.

Manual OpenGL Extension Loading LNK2005 and LNK1169

I'm using Visual C++ and manually attempting to load OpenGL extensions. However, for some reason, defining the pointers from the Khronos Group's provided headers leads to linker errors, and as such I never get the chance to even define these functions within my OpenGL context. Below I have included a simplified version of my code and its structure that causes the same issue.
//Test.cpp
#include "MyGL.h"
#include <iostream>

int main()
{
    std::cout << "Hello World!\n";
}

//MyGL.h
#pragma once
#include "MyGLOpenGL.h"

//MyGL.cpp
#include "MyGL.h"

//MyGLOpenGL.h
#pragma once
#include <windows.h>   // Windows functions
#include <GL/gl.h>     // Provided w/ Compiler
#include "GL/glext.h"  // Put out by Khronos Group
#include "GL/wglext.h" // Put out by Khronos Group
#pragma comment(lib, "opengl32.lib") // Provided w/ Compiler

#ifndef GL_OPENGL
#define GL_OPENGL

void glInitPointers();                       // Defines pointers to opengl functions
void* glGetAnyProcAddress(const char* name); // Gets a pointer to any OpenGL function

extern PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB;
extern PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB;

#endif

//MyGLOpenGL.cpp
#include "MyGLOpenGL.h"

void glInitPointers() {
    wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)glGetAnyProcAddress("wglChoosePixelFormatARB"); //load function
    wglCreateContextAttribsARB = (PFNWGLCREATECONTEXTATTRIBSARBPROC)glGetAnyProcAddress("wglCreateContextAttribsARB"); //load function
    return;
}

void* glGetAnyProcAddress(const char* name) {
    void *pointer = (void *)wglGetProcAddress(name);
    if (pointer == 0 || (pointer == (void*)0x1) || (pointer == (void*)0x2) || (pointer == (void*)0x3) || (pointer == (void*)-1)) {
        HMODULE module = LoadLibraryW(L"opengl32.dll");
        pointer = (void *)GetProcAddress(module, name);
    }
    return pointer;
};
Compiling this gives me the following linker errors:
1>MyGLOpenGL.obj : error LNK2001: unresolved external symbol "int (__stdcall* wglChoosePixelFormatARB)(struct HDC__ *,int const *,float const *,unsigned int,int *,unsigned int *)" (?wglChoosePixelFormatARB@@3P6GHPAUHDC__@@PBHPBMIPAHPAI@ZA)
1>MyGLOpenGL.obj : error LNK2001: unresolved external symbol "struct HGLRC__ * (__stdcall* wglCreateContextAttribsARB)(struct HDC__ *,struct HGLRC__ *,int const *)" (?wglCreateContextAttribsARB@@3P6GPAUHGLRC__@@PAUHDC__@@PAU1@PBH@ZA)
1>G:\Development\Test\Debug\Test.exe : fatal error LNK1120: 2 unresolved externals
As the issue was unresolved external symbols, I tried removing the extern keyword, which gave me these errors instead:
1>MyGLOpenGL.obj : error LNK2005: "int (__stdcall* wglChoosePixelFormatARB)(struct HDC__ *,int const *,float const *,unsigned int,int *,unsigned int *)" (?wglChoosePixelFormatARB@@3P6GHPAUHDC__@@PBHPBMIPAHPAI@ZA) already defined in MyGL.obj
1>MyGLOpenGL.obj : error LNK2005: "struct HGLRC__ * (__stdcall* wglCreateContextAttribsARB)(struct HDC__ *,struct HGLRC__ *,int const *)" (?wglCreateContextAttribsARB@@3P6GPAUHGLRC__@@PAUHDC__@@PAU1@PBH@ZA) already defined in MyGL.obj
1>Test.obj : error LNK2005: "int (__stdcall* wglChoosePixelFormatARB)(struct HDC__ *,int const *,float const *,unsigned int,int *,unsigned int *)" (?wglChoosePixelFormatARB@@3P6GHPAUHDC__@@PBHPBMIPAHPAI@ZA) already defined in MyGL.obj
1>Test.obj : error LNK2005: "struct HGLRC__ * (__stdcall* wglCreateContextAttribsARB)(struct HDC__ *,struct HGLRC__ *,int const *)" (?wglCreateContextAttribsARB@@3P6GPAUHGLRC__@@PAUHDC__@@PAU1@PBH@ZA) already defined in MyGL.obj
1>G:\Development\Test\Debug\Test.exe : fatal error LNK1169: one or more multiply defined symbols found
I've also already made sure that "#pragma once" was in every file, tried header guards, added "OpenGL32.lib" to the additional dependencies in Visual Studio, added a pragma comment for the lib, and set the pointers equal to null in the declaration, and I'm just at a complete loss for anything else to try, even after googling the issue. And in my case, using GLEW or any other extension-loading library isn't an option, because that is exactly what I'm trying to create.
In C and C++ there's an important difference between declaration and definition. A statement of the form
extern <type> <symbol>;
declares that there is some symbol somewhere, but it doesn't actually bring it into existence. It's more like a promise to the compiler that the symbol will be defined somewhere else. When you wrote
extern PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB;
extern PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB;
you didn't actually create the variables for those function pointers; you just "listed them in the table of contents" of your program. You actually have to define them somewhere, i.e. in some compilation unit you have to write (without the extern):
PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB;
PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB;
This has absolutely nothing to do with initialization, as hinted by @EricLopushansky; it's all about them actually getting defined. Since these are at global scope, they'll be zero-initialized anyway, even if you don't explicitly write = 0.
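Applied to the files from the question, the split could look like this (sketch):

// MyGLOpenGL.h -- declarations only: a promise that the symbols exist somewhere
extern PFNWGLCHOOSEPIXELFORMATARBPROC    wglChoosePixelFormatARB;
extern PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB;

// MyGLOpenGL.cpp -- exactly one definition per pointer
PFNWGLCHOOSEPIXELFORMATARBPROC    wglChoosePixelFormatARB    = NULL;
PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB = NULL;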
This isn't very well explained by the OpenGL wiki in the article Load OpenGL Functions, but simply doing the following fixes the issue: define the pointer (set to null) within the cpp file.
For example:
PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB = NULL;
For anyone interested, here is the relevant code from GLEW that serves the same purpose:
//glew.h
#ifdef GLEW_STATIC
# define GLEWAPI extern
#else
# ifdef GLEW_BUILD
# define GLEWAPI extern __declspec(dllexport)
# else
# define GLEWAPI extern __declspec(dllimport)
# endif
#endif
#define GLEW_FUN_EXPORT GLEWAPI
//wglew.h
#define WGLEW_GET_FUN(x) x
#define wglChoosePixelFormatARB WGLEW_GET_FUN(__wglewChoosePixelFormatARB)
typedef BOOL (WINAPI * PFNWGLCHOOSEPIXELFORMATARBPROC) (HDC hdc, const int* piAttribIList, const FLOAT *pfAttribFList, UINT nMaxFormats, int *piFormats, UINT *nNumFormats);
#define WGLEW_FUN_EXPORT GLEW_FUN_EXPORT
WGLEW_FUN_EXPORT PFNWGLCHOOSEPIXELFORMATARBPROC __wglewChoosePixelFormatARB;
//glew.c
# define glewGetProcAddress(name) wglGetProcAddress((LPCSTR)name)
PFNWGLCHOOSEPIXELFORMATARBPROC __wglewChoosePixelFormatARB = NULL;
r = ((wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)glewGetProcAddress((const GLubyte*)"wglChoosePixelFormatARB")) == NULL) || r;
// I would be using this statically (GLEW_STATIC), so in my code it would evaluate to:
//MyGLOpenGL.h
#define wglChoosePixelFormatARB __wglewChoosePixelFormatARB
typedef BOOL (WINAPI * PFNWGLCHOOSEPIXELFORMATARBPROC) (HDC hdc, const int* piAttribIList, const FLOAT *pfAttribFList, UINT nMaxFormats, int *piFormats, UINT *nNumFormats);
extern PFNWGLCHOOSEPIXELFORMATARBPROC __wglewChoosePixelFormatARB;
//MyGLOpenGL.cpp
PFNWGLCHOOSEPIXELFORMATARBPROC __wglewChoosePixelFormatARB = NULL;
wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress((LPCSTR)"wglChoosePixelFormatARB");

How do I scan memory for an array of hex values in LLDB?

I want to find N (for example, 16) bytes of hex in memory. In gdb, I simply do:
find /b memrange_start, memrange_end, 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09, 0x0a, 0x0b, 0x0c, 0x0d, 0x0e, 0x0f
And it works flawlessly. What's the equivalent in lldb? I tried:
(lldb) me find -e "0x01 0x02 0x03" -- 0x00000001050e8000 0x00000001050e9000
But it just looks for the last byte (0x03). String search using the -s parameter and "\x00\x01\x02\x03(...)\x0f" does not work either. Furthermore, when I tried various combinations, I stumbled upon this error:
(lldb) me find -e "{0x01, 0x02}" 0x7fffbe6abfc6 0x7fffbe6abfcf
error: result size larger than 8 bytes. pass a string instead
Please don't tell me I'm limited to 8 bytes with "-e" :( Did I miss some easy way to scan for the virtual address of 16 bytes of hex in memory?

Send data to a PS3 DualShock3 controller from a Mac (IOHIDDeviceSetReport)

I've been playing around with the HID part of IOKit lately on my Mac with a PS3 controller. I've managed to look through the sample code and connect to my controller, receive a stream of data and parse it (everything, including the accelerometer and gyro).
However, today I decided I wanted to start setting the LEDs on the back of the device and triggering the rumble motors, but I just can't get it to work!
Looking through the sample code Apple provides for IOHID, there isn't much I can see on setting things on the HID device, only receiving data. From looking on the web (for pretty much half a day), I've got what I think is a working send method which uses IOHIDDeviceSetReport(). However, I can't work out what data I should be sending.
I've found several sites that list data examples:
http://www.circuitsathome.com/mcu/ps3-and-wiimote-game-controllers-on-the-arduino-host-shield-part-2
https://github.com/ribbotson/USB-Host/blob/master/ps3/PS3USB/ps3_usb.h
http://wiibrew.org/wiki/Sixaxis
(I know not all of these examples are for communication between a Mac and a PS3 controller)
A lot of people seem to be talking about this and even doing it (I refuse to believe no one has got this working) but I can't seem to find anything about actually how to do it that works!
I feel like I am missing a simple step here, so if anyone has any ideas, help or a solution please let me know.
Thanks.
Example code of how I'm trying to send a report (it is returning success):
CFIndex len = 64;
uint8_t report[64] = {0x0};

IOReturn tIOReturn = IOHIDDeviceSetReport(deviceRef,
                                          kIOHIDReportTypeOutput,
                                          reportID,
                                          report,
                                          len);
This is just sending a lot of nothing (literally), but it's an example of what I'm using, just in case it's not correct.
Extra: I've also just noticed that Apple's definition of IOHIDDeviceSetReport differs from the example they give.
https://developer.apple.com/library/mac/documentation/DeviceDrivers/Conceptual/HID/new_api_10_5/tn2187.html#//apple_ref/doc/uid/TP40000970-CH214-SW81
There it says report should be "address of report buffer". But...
https://developer.apple.com/library/mac/documentation/IOKit/Reference/IOHIDDevice_iokit_header_reference/Reference/reference.html#//apple_ref/doc/uid/TP40012408-CHIOHIDDevicehFunctions-DontLinkElementID_23
There it says *report (being a pointer) is "The report bytes to be sent to the device.".
There's an example at: http://www.circuitsathome.com/mcu/ps3-and-wiimote-game-controllers-on-the-arduino-host-shield-part-2
with the code describing the LED and Rumble control at:
https://github.com/ribbotson/USB-Host/blob/master/ps3/PS3USB/ps3_usb.cpp#L187
It seems that the bytes that you sent as report need to have a certain format:
prog_char output_01_report[] PROGMEM = {0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x02, 0xff, 0x27, 0x10, 0x00, 0x32, 0xff,
0x27, 0x10, 0x00, 0x32, 0xff, 0x27, 0x10, 0x00,
0x32, 0xff, 0x27, 0x10, 0x00, 0x32, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
};
In the LEDRumble function, these bytes are copied into buf; then buf[9] is overwritten to set the LED state, and the bytes from buf[1] through buf[4] are used to configure the rumble. Then the bytes are all sent to the controller.
There are some constants defined here: https://github.com/ribbotson/USB-Host/blob/master/ps3/PS3USB/ps3_usb.h#L100
#define psLED1 0x01
#define psLED2 0x02
#define psLED3 0x04
#define psLED4 0x08
#define psRumbleHigh 0x10
#define psRumbleLow 0x20
These constants are passed to the LEDRumble function as parameters.
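Untested, but translated to the Mac side I would expect the call to look roughly like this, reusing that 48-byte layout (the report ID 0x01, and the assumption that byte 0 of the array is payload rather than the report ID, both need verifying):

#include <IOKit/hid/IOHIDDevice.h>
#include <string.h>

// The 48-byte default output report quoted above (from the Arduino example).
static const uint8_t output_01_report[48] = {
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x02, 0xff, 0x27, 0x10, 0x00, 0x32, 0xff,
    0x27, 0x10, 0x00, 0x32, 0xff, 0x27, 0x10, 0x00,
    0x32, 0xff, 0x27, 0x10, 0x00, 0x32, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
};

// ledBits would be a combination of the psLEDx constants; check the linked
// LEDRumble source for the exact bit encoding (e.g. whether a shift is applied).
IOReturn SendLedReport(IOHIDDeviceRef deviceRef, uint8_t ledBits)
{
    uint8_t buf[sizeof(output_01_report)];
    memcpy(buf, output_01_report, sizeof(buf));
    buf[9] = ledBits;               // LED state lives at offset 9
    // buf[1]..buf[4] configure the rumble motors; left at their defaults here.
    return IOHIDDeviceSetReport(deviceRef,
                                kIOHIDReportTypeOutput,
                                0x01,      // assumed report ID
                                buf,
                                sizeof(buf));
}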
Their example seems fine, as far as I've read it. Under these circumstances, either a uint8_t[64] variable or a uint8_t* variable will be interpreted as a pointer to uint8_t when passed to IOHIDDeviceSetReport.
I'm on a restricted network right now, so I won't be able to help any further. But I'll try to read a little bit more about this later at home, as I also have a related project in mind. If I find out anything that can help us on this, I'll get back here to talk about it.

What are the meanings of the hash keys when calling ObjectSpace.count_objects?

In Ruby 1.9 (YARV) you can get a count of all currently allocated objects like so:
ObjectSpace.count_objects
which returns a hash like
{:TOTAL=>1226560, :FREE=>244204, :T_OBJECT=>26141, :T_CLASS=>9819, :T_MODULE=>1420, :T_FLOAT=>287,
:T_STRING=>260476, :T_REGEXP=>4081, :T_ARRAY=>72269, :T_HASH=>14923, :T_STRUCT=>4601, :T_BIGNUM=>7,
:T_FILE=>16, :T_DATA=>54553, :T_MATCH=>5, :T_COMPLEX=>1, :T_RATIONAL=>15, :T_NODE=>524818,
:T_ICLASS=>8924}
What is the meaning of these hash keys? Some like T_STRING and T_FILE are obvious. I'm particularly curious about :FREE, :T_ICLASS, :T_DATA, and :T_NODE.
Just a guess: I assume :T_ICLASS counts included classes and :T_NODE could maybe stand for AST nodes.
Here's a full list (unfortunately without comments):
#define T_NONE RUBY_T_NONE
#define T_NIL RUBY_T_NIL
#define T_OBJECT RUBY_T_OBJECT
#define T_CLASS RUBY_T_CLASS
#define T_ICLASS RUBY_T_ICLASS
#define T_MODULE RUBY_T_MODULE
#define T_FLOAT RUBY_T_FLOAT
#define T_STRING RUBY_T_STRING
#define T_REGEXP RUBY_T_REGEXP
#define T_ARRAY RUBY_T_ARRAY
#define T_HASH RUBY_T_HASH
#define T_STRUCT RUBY_T_STRUCT
#define T_BIGNUM RUBY_T_BIGNUM
#define T_FILE RUBY_T_FILE
#define T_FIXNUM RUBY_T_FIXNUM
#define T_TRUE RUBY_T_TRUE
#define T_FALSE RUBY_T_FALSE
#define T_DATA RUBY_T_DATA
#define T_MATCH RUBY_T_MATCH
#define T_SYMBOL RUBY_T_SYMBOL
#define T_RATIONAL RUBY_T_RATIONAL
#define T_COMPLEX RUBY_T_COMPLEX
#define T_UNDEF RUBY_T_UNDEF
#define T_NODE RUBY_T_NODE
#define T_ZOMBIE RUBY_T_ZOMBIE
#define T_MASK RUBY_T_MASK
The RUBY_T_xyz enum is defined like this:
enum ruby_value_type {
    RUBY_T_NONE     = 0x00,
    RUBY_T_OBJECT   = 0x01,
    RUBY_T_CLASS    = 0x02,
    RUBY_T_MODULE   = 0x03,
    RUBY_T_FLOAT    = 0x04,
    RUBY_T_STRING   = 0x05,
    RUBY_T_REGEXP   = 0x06,
    RUBY_T_ARRAY    = 0x07,
    RUBY_T_HASH     = 0x08,
    RUBY_T_STRUCT   = 0x09,
    RUBY_T_BIGNUM   = 0x0a,
    RUBY_T_FILE     = 0x0b,
    RUBY_T_DATA     = 0x0c,
    RUBY_T_MATCH    = 0x0d,
    RUBY_T_COMPLEX  = 0x0e,
    RUBY_T_RATIONAL = 0x0f,
    RUBY_T_NIL      = 0x11,
    RUBY_T_TRUE     = 0x12,
    RUBY_T_FALSE    = 0x13,
    RUBY_T_SYMBOL   = 0x14,
    RUBY_T_FIXNUM   = 0x15,
    RUBY_T_UNDEF    = 0x1b,
    RUBY_T_NODE     = 0x1c,
    RUBY_T_ICLASS   = 0x1d,
    RUBY_T_ZOMBIE   = 0x1e,
    RUBY_T_MASK     = 0x1f
};
I think most of those are rather obvious. The only ones I can't figure out are T_DATA (see @banister's comment), T_ZOMBIE and T_MASK.
BTW: Note that these are not part of Ruby 1.9. They are part of YARV. They might be totally different on other implementations of Ruby 1.9 or even not exist at all. The documentation clearly states:
The contents of the returned hash is implementation defined. It may be changed in future.
In fact, it isn't even guaranteed that the method itself exists:
This method is not expected to work except C Ruby.
(By which the author presumably means that the method is only guaranteed to work on MRI and YARV.)
You can get more information about the T_DATA category by calling ObjectSpace.count_tdata_objects (described here).
I believe that these are native objects controlled by the VM. Sometimes native extensions can allocate them, as well.
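For example, a native extension that wraps a C struct produces T_DATA values. A minimal (hypothetical) sketch of such an extension:

#include <ruby.h>

typedef struct { double x, y; } point_t;

static void point_free(void *p) { xfree(p); }

/* Every value returned by make_point is counted under :T_DATA
 * by ObjectSpace.count_objects. */
static VALUE make_point(VALUE self)
{
    point_t *p = ALLOC(point_t);   /* native memory owned by the Ruby object */
    p->x = 0.0;
    p->y = 0.0;
    return Data_Wrap_Struct(rb_cObject, NULL, point_free, p);
}

void Init_point(void)
{
    rb_define_global_function("make_point", make_point, 0);
}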
The types are described in the file doc/extension.doc in the Ruby source code:
T_NIL :: nil
T_OBJECT :: ordinary object
T_CLASS :: class
T_MODULE :: module
T_FLOAT :: floating point number
T_STRING :: string
T_REGEXP :: regular expression
T_ARRAY :: array
T_HASH :: associative array
T_STRUCT :: (Ruby) structure
T_BIGNUM :: multi precision integer
T_FIXNUM :: Fixnum(31bit or 63bit integer)
T_COMPLEX :: complex number
T_RATIONAL :: rational number
T_FILE :: IO
T_TRUE :: true
T_FALSE :: false
T_DATA :: data
T_SYMBOL :: symbol
In addition, there are several other types used internally:
T_ICLASS :: included module
T_MATCH :: MatchData object
T_UNDEF :: undefined
T_NODE :: syntax tree node
T_ZOMBIE :: object awaiting finalization
