Rust Win32 FFI: User-mode data execution prevention (DEP) violation - winapi

I'm trying to pass an ID3D11Device instance from Rust to a C FFI library (FFmpeg).
I made this sample code:
pub fn create_d3d11_device(&mut self, device: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11Device>, context: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11DeviceContext>) {
    let av_device: Box<AVBufferRef> = self.alloc(HwDeviceType::D3d11va);
    unsafe {
        let device_context = Box::from_raw(av_device.data as *mut AVHWDeviceContext);
        let mut d3d11_device_context = Box::from_raw(device_context.hwctx as *mut AVD3D11VADeviceContext);
        d3d11_device_context.device = device.as_mut() as *mut _;
        d3d11_device_context.device_context = context.as_mut() as *mut _;
        let avp = Box::into_raw(av_device);
        av_hwdevice_ctx_init(avp);
        self.av_hwdevice = Some(Box::from_raw(avp));
    }
}
On the Rust side the device works, but on the C side, when FFmpeg calls ID3D11DeviceContext_QueryInterface, the app crashes with the following error: Exception 0xc0000005 encountered at address 0x7ff9fb99ad38: User-mode data execution prevention (DEP) violation at location 0x7ff9fb99ad38
The crash address is actually the pointer stored in the lpVtbl slot for QueryInterface.
The disassembly of the address also looks correct (this is from another debugging session):
(lldb) disassemble --start-address 0x00007ffffdf3ad38
0x7ffffdf3ad38: addb %ah, 0x7ffffd(%rdi,%riz,8)
0x7ffffdf3ad3f: addb %al, (%rax)
0x7ffffdf3ad41: movabsl -0x591fffff80000219, %eax
0x7ffffdf3ad4a: outl %eax, $0xfd
Do you have any pointers on how to debug this further?
EDIT: I made a minimal reproduction sample. Interestingly, this does not cause a DEP violation, but simply a segfault.
On the C side:
int test_ffi(ID3D11Device *device){
    ID3D11DeviceContext *context;
    device->lpVtbl->GetImmediateContext(device, &context);
    if (!context) return 1;
    return 0;
}
On the Rust side:
unsafe fn main_rust(){
    let mut device = None;
    let mut device_context = None;
    let _ = match windows::Win32::Graphics::Direct3D11::D3D11CreateDevice(None, D3D_DRIVER_TYPE_HARDWARE, OtherHinstance::default(), D3D11_CREATE_DEVICE_DEBUG, &[], D3D11_SDK_VERSION, &mut device, std::ptr::null_mut(), &mut device_context) {
        Ok(e) => e,
        Err(e) => panic!("Creation Failed: {}", e)
    };
    let mut device = match device {
        Some(e) => e,
        None => panic!("Creation Failed2")
    };
    let mut f2: ID3D11Device = transmute_copy(&device); // Transmuting the WinAPI struct into a bindgen ID3D11Device
    test_ffi(&mut f2);
}
The bindgen build.rs:
extern crate bindgen;
use std::env;
use std::path::PathBuf;
fn main() {
    // Tell cargo to tell rustc to link our demo library and d3d11.
    println!("cargo:rustc-link-lib=ffi_demoLIB");
    println!("cargo:rustc-link-lib=d3d11");
    // Tell cargo to invalidate the built crate whenever the wrapper changes.
    println!("cargo:rerun-if-changed=library.h");
    // The bindgen::Builder is the main entry point
    // to bindgen, and lets you build up options for
    // the resulting bindings.
    let bindings = bindgen::Builder::default()
        // The input header we would like to generate
        // bindings for.
        .header("library.h")
        // Tell cargo to invalidate the built crate whenever any of the
        // included header files changed.
        .parse_callbacks(Box::new(bindgen::CargoCallbacks))
        .blacklist_type("_IMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY64")
        .blacklist_type("PIMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY")
        .blacklist_type("PIMAGE_TLS_DIRECTORY")
        // Finish the builder and generate the bindings.
        .generate()
        // Unwrap the Result and panic on failure.
        .expect("Unable to generate bindings");
    // Write the bindings to the $OUT_DIR/bindings.rs file.
    let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out_path.join("bindings.rs"))
        .expect("Couldn't write bindings!");
}
The complete repo can be found here: https://github.com/TheElixZammuto/demo-ffi

According to https://github.com/microsoft/windows-rs/issues/1710#issuecomment-1111522946, my error was that I was transmuting the structs, while what I should have done is cast the references:
let f2: &mut ID3D11Device = transmute_copy(&mut device); // Transmuting the WinAPI wrapper into a reference to a bindgen ID3D11Device
test_ffi(f2);
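Why this matters: the windows-rs ID3D11Device wrapper is pointer-sized and holds the COM interface pointer itself, while bindgen's ID3D11Device is the C object layout whose first field is lpVtbl. Transmuting the wrapper by value therefore stores the object pointer into the lpVtbl field, losing one level of indirection, so the first virtual call jumps through object data instead of the vtable, which is exactly the kind of address DEP refuses to execute. Casting the reference keeps the indirection intact. As a minimal alternative sketch, assuming a windows-rs version that exposes Interface::as_raw, the raw COM pointer can be cast directly:

use windows::core::Interface;

// The raw COM pointer already points at the C-layout object (whose first
// field is lpVtbl), so casting it to bindgen's ID3D11Device is equivalent
// to the reference transmute above.
let f2 = device.as_raw() as *mut ID3D11Device;
unsafe { test_ffi(f2) };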

Related

How to use SetDisplayConfig (windows-rs) to force screen re-detection?

I am using windows-rs (the latest version from GitHub, because it contains some fixes the stable version on crates.io doesn't have).
My goal is to develop a small program that automatically forces the screen to be re-detected and set to the highest resolution. (It's for a school with a weird setup where teachers have to turn the projectors on before the PC for resolutions to be detected, but they often forget, leaving the PCs at a very low resolution with the higher resolutions undetected.)
For re-initializing the screen, I have the following function:
// Some imports may be unused here, I haven't checked them yet; the full file has more functions
use windows::Win32::Graphics::Gdi::{ChangeDisplaySettingsA, EnumDisplaySettingsA, DEVMODEA, SDC_FORCE_MODE_ENUMERATION, SDC_APPLY, SDC_SAVE_TO_DATABASE, SDC_USE_SUPPLIED_DISPLAY_CONFIG, QDC_ALL_PATHS};
use windows::Win32::Media::Audio::Endpoints::IAudioEndpointVolume;
use windows::Win32::Media::Audio::{IMMDeviceEnumerator, MMDeviceEnumerator};
use windows::Win32::Devices::Display::{GetDisplayConfigBufferSizes, QueryDisplayConfig, SetDisplayConfig, DISPLAYCONFIG_TOPOLOGY_ID};
use windows::core::GUID;
use windows::Win32::System::Com::{CoInitialize, CoCreateInstance, CLSCTX_ALL};

// Forces Windows to reinit display settings
pub fn force_reinit_screen() -> i32 {
    let mut path_count = 0;
    let mut mode_count = 0;
    let result = unsafe { GetDisplayConfigBufferSizes(QDC_ALL_PATHS, &mut path_count, &mut mode_count) };
    println!("GetDisplayConfigBufferSizes returned {}", result);
    let mut path_array = Vec::with_capacity(path_count as usize);
    let mut mode_array = Vec::with_capacity(mode_count as usize);
    let result = unsafe {
        QueryDisplayConfig(
            QDC_ALL_PATHS,
            &mut path_count,
            path_array.as_mut_ptr(),
            &mut mode_count,
            mode_array.as_mut_ptr(),
            ::core::mem::transmute(::core::ptr::null::<DISPLAYCONFIG_TOPOLOGY_ID>()),
        )
    };
    println!("QueryDisplayConfig returned {}", result);
    let flags = SDC_FORCE_MODE_ENUMERATION | SDC_APPLY | SDC_USE_SUPPLIED_DISPLAY_CONFIG | SDC_SAVE_TO_DATABASE;
    let result = unsafe { SetDisplayConfig(Some(&path_array), Some(&mode_array), flags) };
    result
}
However, it does not work on any computer I've tried it on (it returns code 87, which seems to mean bad parameters). What am I doing wrong?
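One thing worth checking, as a hedged guess rather than a confirmed answer: Vec::with_capacity allocates storage but leaves the length at zero, and the windows crate derives the element counts it passes to SetDisplayConfig from the slice lengths, so Some(&path_array) hands the API zero elements, which would plausibly produce ERROR_INVALID_PARAMETER (87). Fixing up the lengths after QueryDisplayConfig returns might look like this:

// Hedged guess, not a verified fix: set the Vec lengths to the element
// counts QueryDisplayConfig wrote back, so the slices below are non-empty.
unsafe {
    path_array.set_len(path_count as usize);
    mode_array.set_len(mode_count as usize);
}
let flags = SDC_FORCE_MODE_ENUMERATION | SDC_APPLY | SDC_USE_SUPPLIED_DISPLAY_CONFIG | SDC_SAVE_TO_DATABASE;
let result = unsafe { SetDisplayConfig(Some(&path_array), Some(&mode_array), flags) };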

Trying to Read Process Memory with windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory in rust

I'm trying to read the health value in Rust from a game called AssaultCube.
However, I always have 0 stored in the buffer. Can anyone explain what I'm doing wrong?
My code looks like this:
fn main(){
    unsafe{
        use std::ffi::c_void;
        use windows_sys::Win32::Foundation::HANDLE;
        let process_id: HANDLE = 13488;
        let health_adress = 0x005954FC as *const c_void;
        let buffer: *mut c_void = std::ptr::null_mut();
        let mut number_read: usize = 0;
        windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory(process_id, health_adress, buffer, 4, &mut number_read);
        println!("{:?}", buffer as i32);
    }
}
The picture below shows how I got the address with Cheat Engine.
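Two problems stand out in this snippet, sketched below as a hedged guess rather than tested code: a raw process id is not a HANDLE (it must be turned into one via OpenProcess), and buffer is a null pointer, so ReadProcessMemory has nowhere to write the four bytes. The sketch assumes a windows-sys version where HANDLE is an isize, as the original code implies:

use std::ffi::c_void;
use windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory;
use windows_sys::Win32::System::Threading::{OpenProcess, PROCESS_VM_READ};

fn main() {
    unsafe {
        let pid: u32 = 13488; // the game's process id
        let handle = OpenProcess(PROCESS_VM_READ, 0, pid); // a PID is not a HANDLE
        assert!(handle != 0, "OpenProcess failed");

        let health_address = 0x005954FC as *const c_void;
        let mut health: i32 = 0; // real storage for the 4 bytes being read
        let mut number_read: usize = 0;
        let ok = ReadProcessMemory(
            handle,
            health_address,
            &mut health as *mut i32 as *mut c_void, // point at the storage, not at null
            4,
            &mut number_read,
        );
        assert!(ok != 0, "ReadProcessMemory failed");
        println!("health = {health}");
    }
}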

How do I allocate space to call GetInterfaceInfo using the windows crate?

I'm trying to fetch information regarding the network interfaces available on the system via GetInterfaceInfo using Microsoft's windows crate. This requires me to do some unsafe operations, and I get it to work for one interface, but not two:
#[cfg(test)]
mod tests {
    use super::*;
    use windows::{
        core::*, Data::Xml::Dom::*, Win32::Foundation::*, Win32::NetworkManagement::IpHelper::*,
        Win32::System::Threading::*, Win32::UI::WindowsAndMessaging::*,
    };

    #[test]
    fn main() {
        unsafe {
            let mut dw_out_buf_len: u32 = 0;
            let mut dw_ret_val =
                GetInterfaceInfo(std::ptr::null_mut(), &mut dw_out_buf_len as *mut u32);
            if dw_ret_val != ERROR_INSUFFICIENT_BUFFER.0 {
                panic!();
            }
            println!("Size: {}", dw_out_buf_len);
            // allocate that amount of memory, which will be used as a buffer
            let mut ip_interface_info = Vec::with_capacity(dw_out_buf_len as usize);
            let mut ptr = ip_interface_info.as_mut_ptr() as *mut IP_INTERFACE_INFO;
            dw_ret_val = GetInterfaceInfo(ptr, &mut dw_out_buf_len as *mut u32);
            println!("Num adapters: {}", (*ptr).NumAdapters);
            for i in 0..(*ptr).NumAdapters as usize {
                println!(
                    "\tAdapter index: {}\n\tAdapter name: {}",
                    (*ptr).Adapter[i].Index,
                    String::from_utf16(&(*ptr).Adapter[i].Name).unwrap()
                );
            }
        }
    }
}
It crashes when I try to access the second entry (even though there should be two available):
panicked at 'index out of bounds: the len is 1 but the index is 1'
The struct IP_INTERFACE_INFO containing all the data has a field called Adapter which seems to be limited to an array size of 1. Am I reading this correctly? How is it then supposed to hold multiple adapters?
#[repr(C)]
#[doc = "*Required features: `\"Win32_NetworkManagement_IpHelper\"`*"]
pub struct IP_INTERFACE_INFO {
    pub NumAdapters: i32,
    pub Adapter: [IP_ADAPTER_INDEX_MAP; 1],
}
It appears that IP_INTERFACE_INFO uses a C flexible array member, which often uses the [1] syntax. The C++ example in Managing Interfaces Using GetInterfaceInfo corroborates this usage:
for (i = 0; i < (unsigned int) pInterfaceInfo->NumAdapters; i++) {
    printf("  Adapter Index[%d]: %ld\n", i,
           pInterfaceInfo->Adapter[i].Index);
    printf("  Adapter Name[%d]:  %ws\n\n", i,
           pInterfaceInfo->Adapter[i].Name);
}
The equivalent in Rust would be to take the single-element array, get the raw pointer to it, then iterate over that. There are lots of details to be aware of, such as allocation alignment and pointer provenance. Here's an annotated example:
use std::{
    alloc::{GlobalAlloc, Layout, System},
    mem,
    ptr::{self, addr_of},
    slice,
};
use windows::{
    Win32::Foundation::*,
    Win32::NetworkManagement::IpHelper::{
        GetInterfaceInfo, IP_ADAPTER_INDEX_MAP, IP_INTERFACE_INFO,
    },
};

fn main() {
    unsafe {
        // Perform the first call to know how many bytes to allocate
        let mut raw_buf_len = 0;
        let ret_val = GetInterfaceInfo(ptr::null_mut(), &mut raw_buf_len);
        assert_eq!(
            ret_val, ERROR_INSUFFICIENT_BUFFER.0,
            "Expected to get the required buffer size, was {ret_val:?}",
        );

        // Allocate an appropriately sized *and aligned* buffer to store the result
        let buf_len = raw_buf_len.try_into().expect("Invalid buffer length");
        let layout = Layout::from_size_align(buf_len, mem::align_of::<IP_INTERFACE_INFO>())
            .expect("Could not calculate the appropriate memory layout");
        let base_ptr = System.alloc(layout);
        let ip_interface_info = base_ptr.cast();

        // Perform the second call to get the data
        let ret_val = GetInterfaceInfo(ip_interface_info, &mut raw_buf_len);
        assert_eq!(
            ret_val, NO_ERROR.0,
            "Could not get the data on the second call: {ret_val:?}",
        );

        // Construct a pointer to the adapter array that preserves the provenance of the original pointer
        let adapter_ptr = addr_of!((*ip_interface_info).Adapter);
        let adapter_ptr = adapter_ptr.cast::<IP_ADAPTER_INDEX_MAP>();

        // Combine the pointer and length into a Rust slice
        let n_adapters = (*ip_interface_info).NumAdapters;
        let n_adapters = n_adapters.try_into().expect("Invalid adapter count");
        let adapters = slice::from_raw_parts(adapter_ptr, n_adapters);

        println!("Num adapters: {}", adapters.len());
        for adapter in adapters {
            let IP_ADAPTER_INDEX_MAP {
                Index: index,
                Name: name,
            } = adapter;
            // The fixed-size buffer contains data after the UTF-16 NUL character
            let name_end = name.iter().position(|&c| c == 0).unwrap_or(name.len());
            let name = String::from_utf16_lossy(&name[..name_end]);
            println!("Adapter index: {index}\nAdapter name: {name}");
        }

        // Free the allocation. This should be wrapped in a type that
        // implements `Drop` so we don't leak memory when unwinding a panic.
        System.dealloc(base_ptr, layout);
    }
}
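As the closing comment suggests, the alloc/dealloc pair is better wrapped in a guard type so the buffer is freed even when a panic unwinds past it. A minimal sketch of such a guard (AllocGuard is a made-up name for illustration):

// Hypothetical guard type; frees the allocation when dropped, including
// during a panic unwind.
struct AllocGuard {
    ptr: *mut u8,
    layout: Layout,
}

impl Drop for AllocGuard {
    fn drop(&mut self) {
        // Safety: `ptr` was allocated by `System` with exactly this `layout`.
        unsafe { System.dealloc(self.ptr, self.layout) }
    }
}

// Usage: construct it right after a successful allocation, then the explicit
// `System.dealloc` call at the end is no longer needed.
// let _guard = AllocGuard { ptr: base_ptr, layout };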

Windows, add user by Rust

I use winapi's NetUserAdd to add a user. The account is added successfully, and running net user shows it (as in the picture below), but I cannot find the user in Control Panel.
What's wrong with my LPWSTR conversion, or with the cast of the USER_INFO_1 struct to LPBYTE?
use winapi::um::lmaccess::{USER_INFO_1, NetUserAdd, UF_SCRIPT};
use std::iter::once;
use std::os::windows::ffi::OsStrExt;
use std::ffi::OsStr;

pub fn winstr(value: &str) -> Vec<u16> {
    OsStr::new(value).encode_wide().chain(once(0)).collect()
}

fn main() {
    let username: String = "Test".to_string();
    let password: String = "Test******".to_string();
    let mut user = USER_INFO_1 {
        usri1_name: winstr(&username).as_mut_ptr(),
        usri1_password: winstr(&password).as_mut_ptr(),
        usri1_priv: 1,
        usri1_password_age: 0,
        usri1_home_dir: std::ptr::null_mut(),
        usri1_comment: std::ptr::null_mut(),
        usri1_flags: UF_SCRIPT,
        usri1_script_path: std::ptr::null_mut(),
    };
    let mut error = 0;
    unsafe {
        NetUserAdd(std::ptr::null_mut(), 1, &mut user as *mut _ as _, &mut error);
    }
    println!("{}", error); // result is 0, which means success
}
You're passing dangling pointers.
Your winstr(...).as_mut_ptr() calls each create a Vec<u16>, get a pointer to its data, and then drop the Vec<u16>, since it was a temporary value. You need to keep those values alive at least until the call to NetUserAdd has finished:
let mut username = winstr("Test");
let mut password = winstr("Test******");
let mut user = USER_INFO_1 {
    usri1_name: username.as_mut_ptr(),
    usri1_password: password.as_mut_ptr(),
    usri1_priv: 1,
    usri1_password_age: 0,
    usri1_home_dir: std::ptr::null_mut(),
    usri1_comment: std::ptr::null_mut(),
    usri1_flags: UF_SCRIPT,
    usri1_script_path: std::ptr::null_mut(),
};
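With username and password bound to local variables, the two Vec<u16> buffers now outlive the NetUserAdd call, so the pointers stored in USER_INFO_1 remain valid for the duration of the call.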

Segfault when calling GetBinaryTypeA

I tried to import the GetBinaryTypeA function:
use std::ffi::CString;
use ::std::os::raw::{c_char, c_ulong};

extern {
    fn GetBinaryTypeA(s: *const c_char, out: *mut c_ulong) -> i32;
}

fn main() {
    let path = "absolute/path/to/bin.exe";
    let cpath = CString::new(path).unwrap();
    let mut out: c_ulong = 0;
    println!("{:?}", cpath);
    unsafe { GetBinaryTypeA(cpath.as_ptr(), out as *mut c_ulong); }
    println!("{:?}", cpath);
}
Output:
error: process didn't exit successfully: `target\debug\bin_deploy.exe` (exit code: 3221225477)
Process finished with exit code -1073741819 (0xC0000005)
If I set an invalid path then it executes successfully and GetLastError() returns 2 ("The system cannot find the file specified"), so it looks like the imported function works.
I received the same error using the kernel32-sys crate. Where else can the error be?
You are casting the value 0 to a pointer. On the vast majority of computers in use today, the pointer with the value 0 is known as NULL. Thus, you are trying to write to the NULL pointer, which causes a crash.
You want to write to the address of the value:
&mut out as *mut c_ulong
Which doesn't even need the cast:
unsafe {
    GetBinaryTypeA(cpath.as_ptr(), &mut out);
}
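For reference, here is the corrected program in full. One hedged aside: GetBinaryTypeA is a stdcall function, so extern "system" is the more portable declaration; plain extern happens to work on x86-64 because the two calling conventions coincide there.

use std::ffi::CString;
use std::os::raw::{c_char, c_ulong};

extern "system" {
    fn GetBinaryTypeA(s: *const c_char, out: *mut c_ulong) -> i32;
}

fn main() {
    let path = "absolute/path/to/bin.exe";
    let cpath = CString::new(path).unwrap();
    let mut out: c_ulong = 0;
    // Pass a pointer to `out` instead of casting its value to a pointer.
    let ok = unsafe { GetBinaryTypeA(cpath.as_ptr(), &mut out) };
    println!("ok = {ok}, binary type = {out}");
}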
