Segfault when calling GetBinaryTypeA - winapi

I tried to import the GetBinaryTypeA function:
use std::ffi::CString;
use ::std::os::raw::{c_char, c_ulong};

extern { fn GetBinaryTypeA(s: *const c_char, out: *mut c_ulong) -> i32; }

fn main() {
    let path = "absolute/path/to/bin.exe";
    let cpath = CString::new(path).unwrap();
    let mut out: c_ulong = 0;
    println!("{:?}", cpath);
    unsafe { GetBinaryTypeA(cpath.as_ptr(), out as *mut c_ulong); }
    println!("{:?}", cpath);
}
Output:
error: process didn't exit successfully: `target\debug\bin_deploy.exe` (exit code: 3221225477)
Process finished with exit code -1073741819 (0xC0000005)
If I set an invalid path then it executes successfully and GetLastError() returns 2 ("The system cannot find the file specified"), so it looks like the imported function works.
I received the same error using the kernel32-sys crate. What else could be causing it?

You are casting the value 0 to a pointer. On the vast majority of computers in use today, the pointer with the value 0 is known as NULL. Thus, you are trying to write to the NULL pointer, which causes a crash.
You want to write to the address of the value:
&mut out as *mut c_ulong
Which doesn't even need the cast:
unsafe {
    GetBinaryTypeA(cpath.as_ptr(), &mut out);
}
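For reference, here is a minimal sketch of the corrected program (the path is still the question's placeholder, and the extern block is declared with the "system" ABI, which matches the Win32 calling convention):

use std::ffi::CString;
use std::os::raw::{c_char, c_ulong};

// kernel32 exports GetBinaryTypeA and is already linked into Rust programs on Windows.
extern "system" {
    fn GetBinaryTypeA(lpApplicationName: *const c_char, lpBinaryType: *mut c_ulong) -> i32;
}

fn main() {
    let cpath = CString::new("absolute/path/to/bin.exe").unwrap();
    let mut out: c_ulong = 0;
    // Pass the address of `out`; the &mut reference coerces to *mut c_ulong.
    let ok = unsafe { GetBinaryTypeA(cpath.as_ptr(), &mut out) };
    println!("success = {}, binary type = {}", ok, out);
}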

Related

Error while trying to get PID with FindWindowA and GetWindowThreadProcessId

I'm trying to get the PID of a program in Rust using the windows crate, with the FindWindowA and GetWindowThreadProcessId functions. My problem is that GetWindowThreadProcessId fails with error 1400.
fn main() {
    unsafe {
        let process_name: PCSTR = windows::s!("ac_client.exe");
        let window = windows::Win32::UI::WindowsAndMessaging::FindWindowA(None, process_name);
        let error = windows::Win32::Foundation::GetLastError();
        println!(" {:?}", error); // 0
        let mut pId = 0;
        windows::Win32::UI::WindowsAndMessaging::GetWindowThreadProcessId(window, Some(&mut pId));
        let error = windows::Win32::Foundation::GetLastError();
        println!("{:?}", error); // 1400
    }
}
Error 1400 is ERROR_INVALID_WINDOW_HANDLE ("Invalid window handle"), so the handle passed to GetWindowThreadProcessId is not a valid window. Check the handle returned by FindWindowA before using it: FindWindowA matches the window class and window title, not the executable name, so searching for "ac_client.exe" most likely returns a null HWND. Make sure the target window exists, pass its actual title (or class name) to FindWindowA, and verify the handle is valid before calling GetWindowThreadProcessId.
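A minimal sketch of that check, assuming the same windows crate version as the question (where FindWindowA returns an HWND wrapping an integer directly) and an assumed window title of "AssaultCube" that you would replace with the real one:

use windows::Win32::Foundation::HWND;
use windows::Win32::UI::WindowsAndMessaging::{FindWindowA, GetWindowThreadProcessId};

fn main() {
    unsafe {
        // FindWindowA matches the window title (or class name), not the executable name.
        let window: HWND = FindWindowA(None, windows::s!("AssaultCube"));
        if window.0 == 0 {
            eprintln!("window not found; check the title passed to FindWindowA");
            return;
        }
        let mut pid: u32 = 0;
        let thread_id = GetWindowThreadProcessId(window, Some(&mut pid));
        println!("thread id: {}, pid: {}", thread_id, pid);
    }
}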

Trying to Read Process Memory with windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory in rust

I'm trying to read the health value in Rust from a game called AssaultCube.
However, I always have 0 stored in the buffer. Can anyone explain what I'm doing wrong?
my code looks like this:
fn main() {
    unsafe {
        use std::ffi::c_void;
        use windows_sys::Win32::Foundation::HANDLE;
        let process_id: HANDLE = 13488;
        let health_adress = 0x005954FC as *const c_void;
        let buffer: *mut c_void = std::ptr::null_mut();
        let mut number_read: usize = 0;
        windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory(process_id, health_adress, buffer, 4, &mut number_read);
        println!("{:?}", buffer as i32);
    }
}
The picture below shows how I got the address with Cheat Engine.
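For comparison, ReadProcessMemory normally needs a process handle obtained from OpenProcess (a raw process ID is not a HANDLE) and a pointer to a real buffer to write into; the code above passes a null buffer and then prints the pointer value itself. A minimal sketch with windows-sys, reusing the question's PID and address purely as example values:

use std::ffi::c_void;
use windows_sys::Win32::Foundation::CloseHandle;
use windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory;
use windows_sys::Win32::System::Threading::{OpenProcess, PROCESS_VM_READ};

fn main() {
    unsafe {
        // Turn the process ID into a process HANDLE with read permission.
        let process = OpenProcess(PROCESS_VM_READ, 0, 13488);
        if process == 0 {
            eprintln!("OpenProcess failed");
            return;
        }
        // Read 4 bytes into a real i32 buffer instead of a null pointer.
        let mut health: i32 = 0;
        let mut number_read: usize = 0;
        let ok = ReadProcessMemory(
            process,
            0x005954FC as *const c_void,
            &mut health as *mut i32 as *mut c_void,
            std::mem::size_of::<i32>(),
            &mut number_read,
        );
        println!("ok = {}, bytes read = {}, health = {}", ok, number_read, health);
        CloseHandle(process);
    }
}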

Rust Win32 FFI: User-mode data execution prevention (DEP) violation

I'm trying to pass an ID3D11Device instance from Rust to a C FFI library (FFmpeg).
I made this sample code:
pub fn create_d3d11_device(
    &mut self,
    device: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11Device>,
    context: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11DeviceContext>,
) {
    let av_device: Box<AVBufferRef> = self.alloc(HwDeviceType::D3d11va);
    unsafe {
        let device_context = Box::from_raw(av_device.data as *mut AVHWDeviceContext);
        let mut d3d11_device_context = Box::from_raw(device_context.hwctx as *mut AVD3D11VADeviceContext);
        d3d11_device_context.device = device.as_mut() as *mut _;
        d3d11_device_context.device_context = context.as_mut() as *mut _;
        let avp = Box::into_raw(av_device);
        av_hwdevice_ctx_init(avp);
        self.av_hwdevice = Some(Box::from_raw(avp));
    }
}
On the Rust side the device works, but on the C side, when FFmpeg calls ID3D11DeviceContext_QueryInterface, the app crashes with the following error: Exception 0xc0000005 encountered at address 0x7ff9fb99ad38: User-mode data execution prevention (DEP) violation at location 0x7ff9fb99ad38
The address is actually the pointer to the lpVtbl entry for QueryInterface, as seen in the debugger.
The disassembly of the address also looks correct (this is from another debugging session):
(lldb) disassemble --start-address 0x00007ffffdf3ad38
0x7ffffdf3ad38: addb %ah, 0x7ffffd(%rdi,%riz,8)
0x7ffffdf3ad3f: addb %al, (%rax)
0x7ffffdf3ad41: movabsl -0x591fffff80000219, %eax
0x7ffffdf3ad4a: outl %eax, $0xfd
Do you have any pointers on how to debug this further?
EDIT: I made a minimal reproduction sample. Interestingly, this does not cause a DEP violation, but simply a segfault.
On the C side:
int test_ffi(ID3D11Device *device) {
    ID3D11DeviceContext *context;
    device->lpVtbl->GetImmediateContext(device, &context);
    if (!context) return 1;
    return 0;
}
On the Rust side:
unsafe fn main_rust() {
    let mut device = None;
    let mut device_context = None;
    let _ = match windows::Win32::Graphics::Direct3D11::D3D11CreateDevice(
        None,
        D3D_DRIVER_TYPE_HARDWARE,
        OtherHinstance::default(),
        D3D11_CREATE_DEVICE_DEBUG,
        &[],
        D3D11_SDK_VERSION,
        &mut device,
        std::ptr::null_mut(),
        &mut device_context,
    ) {
        Ok(e) => e,
        Err(e) => panic!("Creation Failed: {}", e),
    };
    let mut device = match device {
        Some(e) => e,
        None => panic!("Creation Failed2"),
    };
    let mut f2: ID3D11Device = transmute_copy(&device); // Transmuting the WinAPI into a bindgen ID3D11Device
    test_ffi(&mut f2);
}
The bindgen build.rs:
extern crate bindgen;

use std::env;
use std::path::PathBuf;

fn main() {
    // Tell cargo to tell rustc to link the FFI demo library and d3d11.
    println!("cargo:rustc-link-lib=ffi_demoLIB");
    println!("cargo:rustc-link-lib=d3d11");
    // Tell cargo to invalidate the built crate whenever the wrapper changes.
    println!("cargo:rerun-if-changed=library.h");
    // The bindgen::Builder is the main entry point to bindgen,
    // and lets you build up options for the resulting bindings.
    let bindings = bindgen::Builder::default()
        // The input header we would like to generate bindings for.
        .header("library.h")
        // Tell cargo to invalidate the built crate whenever any of the
        // included header files changed.
        .parse_callbacks(Box::new(bindgen::CargoCallbacks))
        .blacklist_type("_IMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY64")
        .blacklist_type("PIMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY")
        .blacklist_type("PIMAGE_TLS_DIRECTORY")
        // Finish the builder and generate the bindings.
        .generate()
        // Unwrap the Result and panic on failure.
        .expect("Unable to generate bindings");
    // Write the bindings to the $OUT_DIR/bindings.rs file.
    let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out_path.join("bindings.rs"))
        .expect("Couldn't write bindings!");
}
The Complete Repo can be found over here: https://github.com/TheElixZammuto/demo-ffi
According to https://github.com/microsoft/windows-rs/issues/1710#issuecomment-1111522946, my error was that I was transmuting the structs, while what I should have done is to cast the references:
let f2: &mut ID3D11Device = transmute_copy(&mut device); // Reinterpreting the windows-rs device as a &mut bindgen ID3D11Device
test_ffi(f2);
The windows-rs ID3D11Device is essentially a wrapper around the raw COM interface pointer, so transmuting it by value into the bindgen struct put that interface pointer where C expects the lpVtbl field; reinterpreting it as a reference instead hands C the actual COM object.

sendinput string in rust [duplicate]

I am trying to convert this example to Rust 1.3 with winapi-rs 0.2.4.
I have:
fn send_key_event(vk: u16, flags: u32) {
    let mut input = winapi::INPUT {
        type_: winapi::INPUT_KEYBOARD,
        union_: winapi::KEYBDINPUT {
            wVk: vk,
            wScan: 0,
            dwFlags: flags,
            time: 0,
            dwExtraInfo: 0,
        },
    };
    unsafe {
        user32::SendInput(1, &mut input, mem::size_of::<winapi::INPUT>() as i32);
    }
}
which does not compile with:
error: mismatched types:
expected `winapi::winuser::MOUSEINPUT`,
found `winapi::winuser::KEYBDINPUT`
(expected struct `winapi::winuser::MOUSEINPUT`,
found struct `winapi::winuser::KEYBDINPUT`) [E0308]
How do I send keystrokes to the active window?
The definition of winapi::INPUT in the version of winapi-rs you use is incorrect. It appears to have been fixed today (or yesterday, depending on where you are).
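In winapi 0.3 (with the winuser feature enabled), INPUT exposes the union through accessor methods instead of a union_ field; a minimal sketch of the same helper under that assumption:

use std::mem;
use winapi::um::winuser::{SendInput, INPUT, INPUT_KEYBOARD, KEYBDINPUT};

fn send_key_event(vk: u16, flags: u32) {
    unsafe {
        // Zero-initialize INPUT, then fill the keyboard variant through ki_mut().
        let mut input: INPUT = mem::zeroed();
        input.type_ = INPUT_KEYBOARD;
        *input.u.ki_mut() = KEYBDINPUT {
            wVk: vk,
            wScan: 0,
            dwFlags: flags,
            time: 0,
            dwExtraInfo: 0,
        };
        SendInput(1, &mut input, mem::size_of::<INPUT>() as i32);
    }
}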

Issue with using winapi to get char from scancode

I've been trying to convert between a scancode and a character. This code has worked before, but as of now, for no reason that I can tell, it has stopped working.
static mut SCANCODE_BUFFER: winapi::shared::minwindef::PBYTE = std::ptr::null_mut();
static mut layout: winapi::shared::minwindef::HKL = std::ptr::null_mut();

pub fn SCANCODE_TO_CHAR(scancode: u32) -> char {
    unsafe {
        let mut result = [0 as u16; 2];
        if GetKeyboardState(SCANCODE_BUFFER) == winapi::shared::minwindef::FALSE {
            return 0 as char;
        }
        let vk = MapVirtualKeyExA(scancode, 1, layout);
        ToAsciiEx(vk, scancode, SCANCODE_BUFFER, result.as_mut_ptr(), 0, layout);
        result[0] as u8 as char
    }
}

pub fn initialize() {
    unsafe {
        SCANCODE_BUFFER = [0 as u8; 256].as_mut_ptr();
        layout = GetKeyboardLayout(0);
    }
}
I've done some debugging, and it seems that the function call:
GetKeyboardState(SCANCODE_BUFFER)
is causing the program to end with this:
(exit code: 0xc0000005, STATUS_ACCESS_VIOLATION)
Does anyone know how this might be fixed?
Extra info:
SCANCODE_BUFFER is definitely not a null pointer.
Sorry for posting this. SCANCODE_BUFFER was pointing to dropped memory. I must have been extremely lucky in the past.
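For reference, the dangling pointer can be avoided by giving the buffer static storage instead of storing a pointer to a temporary array; a minimal sketch under winapi 0.3 with the winuser feature:

use winapi::shared::minwindef::{FALSE, HKL};
use winapi::um::winuser::{GetKeyboardLayout, GetKeyboardState, MapVirtualKeyExA, ToAsciiEx};

// The buffer itself has static storage, so it can never point to dropped memory.
static mut SCANCODE_BUFFER: [u8; 256] = [0; 256];
static mut LAYOUT: HKL = std::ptr::null_mut();

pub fn initialize() {
    unsafe {
        LAYOUT = GetKeyboardLayout(0);
    }
}

#[allow(non_snake_case)]
pub fn SCANCODE_TO_CHAR(scancode: u32) -> char {
    unsafe {
        let mut result = [0u16; 2];
        if GetKeyboardState(SCANCODE_BUFFER.as_mut_ptr()) == FALSE {
            return 0 as char;
        }
        let vk = MapVirtualKeyExA(scancode, 1, LAYOUT);
        ToAsciiEx(vk, scancode, SCANCODE_BUFFER.as_ptr(), result.as_mut_ptr(), 0, LAYOUT);
        result[0] as u8 as char
    }
}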
