Calling a DLL from Go - what am I doing wrong?

I'm trying to convert a program from Python (2.7, 32-bit) to Go (1.12.5, 32-bit), and failing miserably with an access violation. The program calls a function in a 32-bit DLL (ehlapi32.dll).
The following clips of the Python code appear to work perfectly (I'm not affirming that they are correct!):
class ehlapi32:
    hllDll = "C:\Program Files (x86)\IBM\Personal Communications\EHLAPI32.dll"
    hllDll = ctypes.WinDLL(hllDll)
    hllApiProto = ctypes.WINFUNCTYPE(ctypes.c_int, ctypes.c_void_p, ctypes.c_void_p,
                                     ctypes.c_void_p, ctypes.c_void_p)
    hllApiParams = (1, "p1", 0), (1, "p2", 0), (1, "p3", 0), (1, "p4", 0),
    hllApi = hllApiProto(("HLLAPI", hllDll), hllApiParams)
    pFunConnect = ctypes.c_int(1)

...

def connect(self):
    shlSession = self.session + b'\0\0\0'
    pText = ctypes.c_char_p(shlSession)
    pLength = ctypes.c_int(len(shlSession))
    pReturn = ctypes.c_int(13)
    ehlapi32.hllApi(ctypes.byref(ehlapi32.pFunConnect), pText,
                    ctypes.byref(pLength), ctypes.byref(pReturn))
Now here's what I'm trying as the Go equivalent. This fails:
// ElFunc: EHLLAPI Functions
type ElFunc uint32

const (
    eConnect ElFunc = 1 // Connect to a terminal
)

...

// Session: a 3270 session handle
type Session struct {
    SessId  string
    SysId   string
    Entry   string
    Result  []string
    Headers int
    Footers int
    Rows    int
    Cols    int
    Scroll  int
    Proc    *syscall.LazyProc
}
...
func (s Session) Connect() uint32 {
    // Load EHLLAPI DLL
    dll := syscall.NewLazyDLL(dllName)
    s.Proc = dll.NewProc(dllProc)
    // Connect to session
    rc := s.ecall(eConnect, s.SessId, 1, 0)
    if rc == 0 {
        defer s.Disconnect()
    }
...
// ecall: provides simplified EHLLAPI DLL calls
func (s Session) ecall(fun ElFunc, buffer string, count uint32, ps uint32) uint32 {
    rc := ps
    _, _, _ = s.Proc.Call(
        uintptr(unsafe.Pointer(uintptr(uint32(fun)))),
        uintptr(unsafe.Pointer(&buffer)),
        uintptr(unsafe.Pointer(uintptr(count))),
        uintptr(unsafe.Pointer(uintptr(rc))),
    )
    return rc
}
The failure looks like an access violation at s.Proc.Call in ecall().
I realise that syscall is deprecated; I have also tried with golang.org/x/sys/windows, without success. I have no constraints as to which I use.
I'll confess I'm way out of my depth here.
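For reference, here is how I read the ctypes byref pattern translating more literally into Go: pass the addresses of real, addressable variables (as ctypes.byref does) and the address of the first byte of a NUL-terminated copy of the string, rather than pointers built from the integer values or the address of the Go string header. This is only a sketch of the calling pattern I'm aiming for, reusing the Session and ElFunc types above; I haven't verified it:

// Sketch only: mirror ctypes.byref by passing addresses of real variables.
func (s Session) ecall(fun ElFunc, buffer string, count uint32, ps uint32) uint32 {
    fn := uint32(fun)
    buf := append([]byte(buffer), 0, 0, 0) // NUL padding, as the Python code does
    length := count
    rc := ps
    _, _, _ = s.Proc.Call(
        uintptr(unsafe.Pointer(&fn)),     // pointer to the function number
        uintptr(unsafe.Pointer(&buf[0])), // pointer to the data bytes, not the string header
        uintptr(unsafe.Pointer(&length)), // pointer to the length
        uintptr(unsafe.Pointer(&rc)),     // pointer to the return code / PS position
    )
    return rc
}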

Related

F# System.Runtime.InteropServices native library call to SendInput works in .net framework but not in .net core

I'm porting a small application I wrote for keybindings to .net core and I've run across an instance where the same code behaves differently. I'm calling the SendInput function in F# with this declaration
open System.Runtime.InteropServices

[<StructLayout(LayoutKind.Sequential)>]
type private MOUSEINPUT = struct
    val dx: int32
    val dy: int32
    val mouseData: uint32
    val dwFlags: uint32
    val time: uint32
    val dwExtraInfo: int
    new(_dx, _dy, _mouseData, _dwFlags, _time, _dwExtraInfo) = {dx=_dx; dy=_dy; mouseData=_mouseData; dwFlags=_dwFlags; time=_time; dwExtraInfo=_dwExtraInfo}
end

[<StructLayout(LayoutKind.Sequential)>]
type private KEYBDINPUT = struct
    val wVk: uint16
    val wScan: uint16
    val dwFlags: uint32
    val time: uint32
    val dwExtraInfo: int
    new(_wVk, _wScan, _dwFlags, _time, _dwExtraInfo) = {wVk=_wVk; wScan=_wScan; dwFlags=_dwFlags; time=_time; dwExtraInfo=_dwExtraInfo}
end

[<StructLayout(LayoutKind.Sequential)>]
type private HARDWAREINPUT = struct
    val uMsg: uint32
    val wParamL: uint16
    val wParamH: uint16
    new(_uMsg, _wParamL, _wParamH) = {uMsg=_uMsg; wParamL=_wParamL; wParamH=_wParamH}
end

[<StructLayout(LayoutKind.Explicit)>]
type private LPINPUT = struct
    [<FieldOffset(0)>]
    val mutable ``type``: int // 1 is keyboard
    [<FieldOffset(4)>]
    val mutable mi: MOUSEINPUT
    [<FieldOffset(4)>]
    val mutable ki: KEYBDINPUT
    [<FieldOffset(4)>]
    val mutable hi: HARDWAREINPUT
end

module private NativeMethods =
    [<DllImport("user32.dll", SetLastError=true)>]
    extern uint32 SendInput(uint32 nInputs, LPINPUT* pInputs, int cbSize)

let appSignature = 0xA8969

let private createPressInput (code: int) =
    let mutable input = LPINPUT()
    input.``type`` <- InputModes.INPUT_KEYBOARD
    input.ki <- KEYBDINPUT(uint16 code, uint16 0, Dwords.KEYEVENTF_KEYDOWN, uint32 0, appSignature)
    input

let pressKey (code: int) =
    let input = createPressInput code
    NativeMethods.SendInput(uint32 1, &&input, Marshal.SizeOf(input)) |> ignore
The same code works in a .NET Framework application that I created in Visual Studio. Now, the output of Marshal.GetLastWin32ErrorCode() is 87, which apparently means ERROR_INVALID_PARAMETER -- not very helpful. I'm new to .NET and F#, so I'm not sure what could be different in this context. I admit, even getting this binding code together was mostly trial and error as well.
I'd appreciate any info that could help me debug this.
UPDATE: I have a workaround that I'm not satisfied with. I can't explain why this works just yet -- I need to read more about how marshaling works. With this, Marshal.GetLastWin32ErrorCode() is 5, access denied. It still sends the key, so I'm not sure what that error is supposed to mean. That said, here it is. Splitting the union out of the struct I was using into a dedicated union type, making that union type LayoutKind.Explicit, and making at least one of the fields FieldOffset(1) (but not the field I care about) gets key presses working. Other combinations of field offsets produce something that runs but doesn't actually press keys, which I assume means it's marshaled in a way that results in no visible key presses.
[<StructLayout(LayoutKind.Explicit)>]
type private InputUnion = struct
    [<FieldOffset(0)>]
    val mutable ki: KEYBDINPUT
    [<FieldOffset(1)>]
    val mutable mi: MOUSEINPUT
    [<FieldOffset(1)>]
    val mutable hi: HARDWAREINPUT
end

[<StructLayout(LayoutKind.Sequential)>]
type private LPINPUT = struct
    val ``type``: int // 1 is keyboard
    val u: InputUnion
    new(_type, _u) = {``type`` = _type; u = _u}
end
I ended up opening a bug on this.
https://github.com/dotnet/runtime/issues/1515
It wasn't an issue in .NET itself. Apparently, my .NET Framework app was running as 32-bit. I had declared the dwExtraInfo field as an int, which was fine for 32-bit. My .NET Core app was running as 64-bit, and I should have used a UIntPtr so that the field's length follows the platform. Updated to this code and it works:
[<StructLayout(LayoutKind.Sequential)>]
type private MOUSEINPUT = struct
    val dx: int32
    val dy: int32
    val mouseData: uint32
    val dwFlags: uint32
    val time: uint32
    val dwExtraInfo: UIntPtr
    new(_dx, _dy, _mouseData, _dwFlags, _time, _dwExtraInfo) = {dx=_dx; dy=_dy; mouseData=_mouseData; dwFlags=_dwFlags; time=_time; dwExtraInfo=_dwExtraInfo}
end

[<StructLayout(LayoutKind.Sequential)>]
type private KEYBDINPUT = struct
    val wVk: uint16
    val wScan: uint16
    val dwFlags: uint32
    val time: uint32
    val dwExtraInfo: UIntPtr
    new(_wVk, _wScan, _dwFlags, _time, _dwExtraInfo) = {wVk=_wVk; wScan=_wScan; dwFlags=_dwFlags; time=_time; dwExtraInfo=_dwExtraInfo}
end

[<StructLayout(LayoutKind.Sequential)>]
type private HARDWAREINPUT = struct
    val uMsg: uint32
    val wParamL: uint16
    val wParamH: uint16
    new(_uMsg, _wParamL, _wParamH) = {uMsg=_uMsg; wParamL=_wParamL; wParamH=_wParamH}
end

[<StructLayout(LayoutKind.Explicit)>]
type private InputUnion = struct
    [<FieldOffset(0)>]
    val mutable mi: MOUSEINPUT
    [<FieldOffset(0)>]
    val mutable ki: KEYBDINPUT
    [<FieldOffset(0)>]
    val mutable hi: HARDWAREINPUT
end

[<StructLayout(LayoutKind.Sequential)>]
type private LPINPUT = struct
    val mutable ``type``: int // 1 is keyboard
    val mutable u: InputUnion
end
The main differences are the dwExtraInfo field definitions. Also, there is now a dedicated type for the union instead of having it all rolled into a single LPINPUT. That lets me avoid having to specify the field offset for the three union members, which also differs by platform.
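The same pointer-size rule carries over to Go, the language of the question at the top of this page: dwExtraInfo is a ULONG_PTR, so it wants a uintptr rather than a fixed 32-bit integer. A minimal, hypothetical sketch; the struct simply mirrors the Win32 KEYBDINPUT layout and is not taken from any existing package:

package main

import (
    "fmt"
    "unsafe"
)

// keybdInput mirrors the Win32 KEYBDINPUT layout. dwExtraInfo is ULONG_PTR,
// i.e. pointer-sized, so uintptr is used instead of a fixed-width integer.
type keybdInput struct {
    wVk         uint16
    wScan       uint16
    dwFlags     uint32
    time        uint32
    dwExtraInfo uintptr
}

func main() {
    // 16 bytes when built for 386, 24 bytes for amd64: the size follows the
    // platform's pointer width, which is what the Win32 headers declare.
    fmt.Println(unsafe.Sizeof(keybdInput{}))
}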

Pass a complex struct to the Windows API

I am trying to use the GetConsoleScreenBufferInfo(HANDLE, PCONSOLE_SCREEN_BUFFER_INFO) function from the Windows API using Perl 6 and (of course) NativeCall.
I think I have set up the CONSOLE_SCREEN_BUFFER_INFO struct the function needs correctly, but the code crashes after the call when I try to dump its content.
This is the shortest (not quite but close) way to demonstrate the problem:
use NativeCall;

constant \HANDLE := Pointer[void];
constant \SHORT := int16;
constant \USHORT := uint16;
constant \WORD := uint16;
constant \DWORD := uint32;
constant \BOOL := int32;

constant \STD_OUTPUT_HANDLE := -11;
constant \STD_INPUT_HANDLE := -10;

class COORD is repr('CStruct') {
    has SHORT $.X;
    has SHORT $.Y;
}

class SMALL_RECT is repr("CStruct") {
    has SHORT $.Left;
    has SHORT $.Top;
    has SHORT $.Right;
    has SHORT $.Bottom;
};

class CONSOLE_SCREEN_BUFFER_INFO is repr("CStruct") {
    has COORD $.dwSize;
    has COORD $.dwCursorPosition;
    has WORD $.wAttributes;
    has SMALL_RECT $.srWindow;
    has COORD $.dwMaximumWindowSize;
    submethod TWEAK {
        $!dwSize := COORD.new;
        $!dwCursorPosition := COORD.new;
        $!srWindow := SMALL_RECT.new;
        $!dwMaximumWindowSize := COORD.new;
    }
}

# C: BOOL WINAPI GetConsoleScreenBufferInfo(_In_ HANDLE hConsoleOutput, _Out_ PCONSOLE_SCREEN_BUFFER_INFO lpConsoleScreenBufferInfo);
sub GetConsoleScreenBufferInfo(HANDLE, CONSOLE_SCREEN_BUFFER_INFO is rw) is native("Kernel32.dll") returns BOOL { * };
sub GetStdHandle(DWORD) is native('Kernel32') returns Pointer[void] { * };

my CONSOLE_SCREEN_BUFFER_INFO $info = CONSOLE_SCREEN_BUFFER_INFO.new;
my HANDLE $handle-o = GetStdHandle( STD_OUTPUT_HANDLE );

dd $info;
say "GetConsoleScreenBufferInfo ", GetConsoleScreenBufferInfo( $handle-o, $info );
say "Will I live?";
dd $info; # crashes without notice
Any hints as to why the crash occurs and how to fix it are very welcome.
You need to use HAS instead of has for the members of CONSOLE_SCREEN_BUFFER_INFO that are structures, as these are embedded rather than referenced by pointer (which is the Perl 6 default).
Once you do that, you can drop the TWEAK as well, so the code will read:
class CONSOLE_SCREEN_BUFFER_INFO is repr("CStruct") {
    HAS COORD $.dwSize;
    HAS COORD $.dwCursorPosition;
    has WORD $.wAttributes;
    HAS SMALL_RECT $.srWindow;
    HAS COORD $.dwMaximumWindowSize;
}
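For comparison with the Go question at the top of this page: Go embeds nested struct fields by value by default, so the equivalent layout needs no special marker. A hypothetical sketch using the raw syscall package (the type and variable names are mine, not from any library):

package main

import (
    "fmt"
    "syscall"
    "unsafe"
)

// coord, smallRect and screenBufferInfo mirror the Win32 COORD, SMALL_RECT and
// CONSOLE_SCREEN_BUFFER_INFO layouts. The nested structs are plain value
// fields, so they are laid out in-line, just as the C definition (and the
// Perl 6 HAS declaration) requires.
type coord struct{ x, y int16 }

type smallRect struct{ left, top, right, bottom int16 }

type screenBufferInfo struct {
    size              coord
    cursorPosition    coord
    attributes        uint16
    window            smallRect
    maximumWindowSize coord
}

func main() {
    kernel32 := syscall.NewLazyDLL("kernel32.dll")
    getStdHandle := kernel32.NewProc("GetStdHandle")
    getInfo := kernel32.NewProc("GetConsoleScreenBufferInfo")

    stdOutputHandle := uintptr(^uint32(10)) // (DWORD)-11, i.e. STD_OUTPUT_HANDLE
    h, _, _ := getStdHandle.Call(stdOutputHandle)

    var info screenBufferInfo
    ret, _, err := getInfo.Call(h, uintptr(unsafe.Pointer(&info)))
    if ret == 0 {
        fmt.Println("GetConsoleScreenBufferInfo failed:", err)
        return
    }
    fmt.Printf("buffer %dx%d, cursor at (%d,%d)\n",
        info.size.x, info.size.y, info.cursorPosition.x, info.cursorPosition.y)
}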

How do I call the default windows screen to choose a certificate?

In the THTTPRIO component, under the HTTPWebNode property, clicking ClientCertificate makes Delphi open a form to choose the certificate and load its information into the component's properties. Is this a Windows dialog? If it is, how can I use it? Today I'm using SecureBlackBox to load the certificates into a combo box, but I'd like to know whether it is possible to use this dialog.
Thanks
UPDATE
I was able to show the dialog using the ms function CryptUIDlgSelectCertificateFromStore, using JWAPI. Now I'm having problems with the result of the function, the PCCERT_CONTEXT structure.
var
  P: Pointer;
  Context: PCCERT_CONTEXT;
  Issuer: DATA_BLOB;

function GetDataBlobText(Data: DATA_BLOB): string;
begin
  SetString(Result, PAnsiChar(Data.pbData), Data.cbData div SizeOf(AnsiChar));
end;

begin
  P := CertOpenSystemStore(0, 'MY');
  Context := CryptUIDlgSelectCertificateFromStore(P, 0, PChar('test'), nil, CRYPTUI_SELECT_ISSUEDTO_COLUMN, 0, nil);
  if Context <> nil then
  begin
    Issuer := Context.pCertInfo.Issuer;
    ShowMessage(GetDataBlobText(Issuer));
  end;
end;
The result in ShowMessage is:
UPDATE2
Thanks @RbMm.
To get a string from the ASN.1-encoded fields (Issuer and Subject):
var
  P: Pointer;
  Context: PCCERT_CONTEXT;
  Subject: DATA_BLOB;
  SubjectStr: string;
  size: Cardinal;
begin
  P := CertOpenSystemStore(0, PAnsiChar('MY'));
  Context := CryptUIDlgSelectCertificateFromStore(P, 0, 'test', 'select certificate',
    CRYPTUI_SELECT_ISSUEDTO_COLUMN, 0, nil);
  if Context <> nil then
  begin
    Subject := Context.pCertInfo.Subject;
    size := CertNameToStr(X509_ASN_ENCODING or PKCS_7_ASN_ENCODING, @Subject, CERT_X500_NAME_STR, 0, 0);
    SetString(SubjectStr, PAnsiChar(Subject.pbData), size);
    CertNameToStr(X509_ASN_ENCODING or PKCS_7_ASN_ENCODING, @Subject, CERT_X500_NAME_STR, PAnsiChar(SubjectStr), size);
    Result := SubjectStr;
  end;
To get the string of a raw data blob (SerialNumber):
var
  SerialNumber: CRYPT_INTEGER_BLOB;
  size: Cardinal;
  s: PWideChar;
  ss: string;
begin
  SerialNumber := Context.pCertInfo.SerialNumber;
  CryptBinaryToStringW(SerialNumber.pbData, SerialNumber.cbData, CRYPT_STRING_HEX, nil, size);
  s := AllocMem(SizeOf(Char) * size);
  CryptBinaryToStringW(SerialNumber.pbData, SerialNumber.cbData, CRYPT_STRING_HEX, s, size);
  ss := s;
  ShowMessage(ss);
  FreeMem(s, SizeOf(Char) * size);
All data blobs in a certificate are encoded, so you need to decode them, in general by using the CryptDecodeObjectEx API. However, for the Issuer (i.e. a CERT_NAME_BLOB) you can also use CertNameToStrW. Only after it converts the encoded name in the CERT_NAME_BLOB structure to a null-terminated character string can you print it. Code example in C/C++:
void PrintIssuer(PCCERT_CONTEXT Context)
{
    CERT_NAME_BLOB Issuer = Context->pCertInfo->Issuer;

    // option #1
    if (ULONG len = CertNameToStrW(X509_ASN_ENCODING | PKCS_7_ASN_ENCODING, &Issuer, CERT_X500_NAME_STR, 0, 0))
    {
        PWSTR sz = (PWSTR)alloca(len * sizeof(WCHAR));
        if (CertNameToStrW(X509_ASN_ENCODING | PKCS_7_ASN_ENCODING, &Issuer, CERT_X500_NAME_STR, sz, len))
        {
            DbgPrint("%S\n", sz);
        }
    }

    // option #2
    PCERT_NAME_INFO pcni;
    ULONG size;
    if (CryptDecodeObjectEx(X509_ASN_ENCODING | PKCS_7_ASN_ENCODING, X509_NAME, Issuer.pbData, Issuer.cbData,
        CRYPT_DECODE_ALLOC_FLAG, 0, &pcni, &size))
    {
        if (DWORD cRDN = pcni->cRDN)
        {
            PCERT_RDN rgRDN = pcni->rgRDN;
            do
            {
                if (DWORD cRDNAttr = rgRDN->cRDNAttr)
                {
                    PCERT_RDN_ATTR rgRDNAttr = rgRDN->rgRDNAttr;
                    do
                    {
                        DbgPrint("ObjId = %s\n", rgRDNAttr->pszObjId);
                        switch (rgRDNAttr->dwValueType)
                        {
                        case CERT_RDN_PRINTABLE_STRING:
                            DbgPrint("Value = %s\n", rgRDNAttr->Value.pbData);
                            break;
                        }
                    } while (rgRDNAttr++, --cRDNAttr);
                }
            } while (rgRDN++, --cRDN);
        }
        LocalFree(pcni);
    }
}
and the output:
CN=***
ObjId = 2.5.4.3
Value = ***
(the string after CN= and after Value = is the same)
Note that "2.5.4.3" is szOID_COMMON_NAME, i.e. "CN". So the first API prepends CN= to the issuer name, while the second variant returns the name as-is, along with the additional ObjId = 2.5.4.3.
The SerialNumber can be converted to a printable string like this:
CRYPT_INTEGER_BLOB SerialNumber = Context->pCertInfo->SerialNumber;
DWORD cb = 0;
if (CryptBinaryToStringW(SerialNumber.pbData, SerialNumber.cbData, CRYPT_STRING_HEX, 0, &cb))
{
    PWSTR sz = (PWSTR)alloca(cb * sizeof(WCHAR));
    if (CryptBinaryToStringW(SerialNumber.pbData, SerialNumber.cbData, CRYPT_STRING_HEX, sz, &cb))
    {
        DbgPrint("%S\n", sz);
    }
}

PonyLang Windows CreateProcess FFI

I've been trying to call Window's CreateProcessA from Pony Language's FFI.
I created both a C and a PonyLang example. The C example works great:
#include <windows.h>
#include <stdio.h>
#include <tchar.h>

void wmain(void) {
    STARTUPINFO info = {0};
    PROCESS_INFORMATION processInfo = {0};
    BOOL status = CreateProcessA("calc.exe", 0, 0, 0, 0, 0, 0, 0, &info, &processInfo);
    if (status == 0)
        printf("%d", GetLastError()); // never hits
}
I put calc.exe in the current directory. This works flawlessly on Windows.
However, my PonyLang implementation keeps returning a non-zero GetLastError:
use "lib:kernel32"
primitive _ProcessAttributes
primitive _ThreadAttributes
primitive _Inherit
primitive _Creation
primitive _Environment
primitive _CurrentDir
primitive _StartupInfo
primitive _ProcessInfo
primitive _HandleIn
primitive _HandleOut
primitive _HandleErr
primitive _Thread
primitive _Process
struct StartupInfo
var cb:I32 = 0
var lpReserved:Pointer[U8] tag= "".cstring()
var lpDesktop:Pointer[U8] tag= "".cstring()
var lpTitle:Pointer[U8] tag= "".cstring()
var dwX:I32 = 0
var dwY:I32 = 0
var dwXSize:I32=0
var dwYSize:I32=0
var dwXCountChars:I32=0
var dwYCountChars:I32=0
var dwFillAttribute:I32=0
var dwFlags:I32=0
var wShowWindow:I16=0
var cbReserved2:I16=0
var lpReserved2:Pointer[U8] tag="".cstring()
var hStdInput:Pointer[_HandleIn] = Pointer[_HandleIn]
var hStdOutput:Pointer[_HandleOut]= Pointer[_HandleOut]
var hStdError:Pointer[_HandleErr]= Pointer[_HandleErr]
struct ProcessInfo
var hProcess:Pointer[_Process] = Pointer[_Process]
var hThread:Pointer[_Thread] = Pointer[_Thread]
var dwProcessId:I32 = 0
var dwThreadId:I32 = 0
//var si:StartupInfo = StartupInfo
actor Main
new create(env: Env) =>
var si: StartupInfo = StartupInfo
var pi: ProcessInfo = ProcessInfo
var inherit:I8 = 0
var creation:I32 = 0
var one:I32 = 0
var two:I32 = 0
var three:I32 = 0
var four:I32 = 0
var z:I32 = 0
var p = #CreateProcessA[I8]("calc.exe",
z,
one,
two,
inherit,
creation,
three,
four,
addressof si,
addressof pi)
if p == 0 then
var err = #GetLastError[I32]() // hits this every time.
env.out.print("Last Error: " + err.string())
end
So the above code compiles for PonyLang, but GetLastError usually returns 2; sometimes it returns 123, and other times 998.
It seems odd that the error code differs between runs. Do those codes all mean that there is some issue with file access?
calc.exe is in the current directory (the same directory as for the C example).
Also, not only is the error code different, but calc.exe is actually executed (runs fine) in the C version and not in the PonyLang version. This leads me to believe something is off with my PonyLang FFI setup.
Does anyone know what may be wrong?
The problem is with your use of addressof. When you create a struct object, e.g. with var si = StartupInfo, the underlying type is a pointer to the structure (i.e. structs in Pony don't have value semantics). Then, when you call CreateProcessA with addressof, you're actually passing a pointer to a pointer to the function.
If your C function expects a pointer to a structure, you can simply pass the Pony object without addressof when doing the FFI call.
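The same one-level-of-indirection mistake is easy to reproduce from Go, where a value created with new is already the pointer the API expects. A small hypothetical sketch against a simple kernel32 call (the Go type name is mine, the layout mirrors SYSTEMTIME):

package main

import (
    "fmt"
    "syscall"
    "unsafe"
)

// systemTime mirrors the Win32 SYSTEMTIME struct (eight WORD fields).
type systemTime struct {
    year, month, dayOfWeek, day, hour, minute, second, milliseconds uint16
}

func main() {
    kernel32 := syscall.NewLazyDLL("kernel32.dll")
    getSystemTime := kernel32.NewProc("GetSystemTime")

    st := new(systemTime) // st is already a pointer to the struct
    // Correct: pass the pointer itself, i.e. one level of indirection.
    getSystemTime.Call(uintptr(unsafe.Pointer(st)))
    // Wrong (the addressof-style mistake): unsafe.Pointer(&st) would hand the
    // API a pointer to the pointer, and it would scribble over the pointer value.
    fmt.Printf("%04d-%02d-%02d %02d:%02d UTC\n", st.year, st.month, st.day, st.hour, st.minute)
}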

GetOpenFileNameW results in FNERR_INVALIDFILENAME, or CDERR_INITIALIZATION if I call GetOpenFileNameA

Here's the code using GetOpenFileNameW:
import core.sys.windows.windows;
import std.stdio, std.string, std.utf;

pragma(lib, "comdlg32");

// Fill in some missing holes in core.sys.windows.windows.
extern (Windows) DWORD CommDlgExtendedError();
enum OFN_FILEMUSTEXIST = 0x001000;

void main()
{
    auto buf = new wchar[1024];

    OPENFILENAMEW ofn;
    ofn.lStructSize = ofn.sizeof;
    ofn.lpstrFile = buf.ptr;
    ofn.nMaxFile = buf.length;
    ofn.lpstrInitialDir = null;
    ofn.Flags = OFN_FILEMUSTEXIST;

    BOOL retval = GetOpenFileNameW(&ofn);
    if (retval == 0) {
        // Get 0x3002 for W and 0x0002 for A. ( http://msdn.microsoft.com/en-us/library/windows/desktop/ms646916(v=vs.85).aspx )
        throw new Exception(format("GetOpenFileName failure: 0x%04X.", CommDlgExtendedError()));
    }
    writeln(buf);
}
This results in FNERR_INVALIDFILENAME, but I don't see any non-optional strings that I haven't filled in. And here's the code (only differences shown) for GetOpenFileNameA:
auto buf = new char[1024];
OPENFILENAMEA ofn;
// ...
BOOL retval = GetOpenFileNameA(&ofn);
This results in CDERR_INITIALIZATION, and the only elaboration MSDN gives me is
The common dialog box function failed during initialization.
This error often occurs when sufficient memory is not available.
This is on Windows 7 64 bit, DMD v2.059.
buf has to be zeroed completely. The problem here is that wchar.init == wchar.max (for error detection reasons), so your array is essentially 1024 instances of wchar.max. A simple buf[] = 0; should fix that.
