Trying to read process memory with windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory in Rust

I'm trying to read the health value in Rust from a game called AssaultCube.
However, I always have 0 stored in the buffer. Can anyone explain what I'm doing wrong?
My code looks like this:
fn main() {
    unsafe {
        use std::ffi::c_void;
        use windows_sys::Win32::Foundation::HANDLE;

        let process_id: HANDLE = 13488;
        let health_address = 0x005954FC as *const c_void;
        let buffer: *mut c_void = std::ptr::null_mut();
        let mut number_read: usize = 0;
        windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory(
            process_id,
            health_address,
            buffer,
            4,
            &mut number_read,
        );
        println!("{:?}", buffer as i32);
    }
}
The picture below shows how I got the address with Cheat Engine.
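Two separate problems stand out in the snippet: the value passed as HANDLE is really a process ID (ReadProcessMemory needs a handle obtained from OpenProcess with PROCESS_VM_READ access), and buffer is a null pointer, so the call has nowhere to write; the final println! then prints the pointer itself rather than anything that was read. A minimal sketch of the corrected flow, keeping the question's PID and address (both machine-specific) and assuming a windows-sys version where HANDLE is an integer alias, as in the snippet above:

use std::ffi::c_void;
use std::mem::size_of;
use windows_sys::Win32::Foundation::CloseHandle;
use windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory;
use windows_sys::Win32::System::Threading::{OpenProcess, PROCESS_VM_READ};

fn main() {
    unsafe {
        // 13488 is a process ID, not a HANDLE: open the process first.
        // (PID and address are taken from the question; both change per run.)
        let process = OpenProcess(PROCESS_VM_READ, 0, 13488);
        assert!(process != 0, "OpenProcess failed");

        let health_address = 0x005954FC as *const c_void;
        // Read into real storage instead of a null pointer.
        let mut health: i32 = 0;
        let mut number_read: usize = 0;
        let ok = ReadProcessMemory(
            process,
            health_address,
            &mut health as *mut i32 as *mut c_void,
            size_of::<i32>(),
            &mut number_read,
        );
        assert!(ok != 0, "ReadProcessMemory failed");
        // Print the value that was read, not the pointer.
        println!("health = {}", health);

        CloseHandle(process);
    }
}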

Related

How to use SetDisplayConfig (windows-rs) to force screen re-detection?

I am using windows-rs (the latest version from GitHub, because it contains some fixes the stable version on crates.io doesn't have).
My goal is to develop a small program that automatically forces the screen to be re-detected and set to the highest resolution. (It's for a school with a weird setup where teachers have to turn the projectors on before the PC for the resolutions to be detected, but they often forget, leaving the PCs at a very low resolution with the higher resolutions undetected.)
For re-initializing the screen, I have the following function:
// Some imports may be unused here, I haven't checked them yet, the full file has more functions
use windows::Win32::Graphics::Gdi::{
    ChangeDisplaySettingsA, EnumDisplaySettingsA, DEVMODEA, SDC_FORCE_MODE_ENUMERATION,
    SDC_APPLY, SDC_SAVE_TO_DATABASE, SDC_USE_SUPPLIED_DISPLAY_CONFIG, QDC_ALL_PATHS,
};
use windows::Win32::Media::Audio::Endpoints::IAudioEndpointVolume;
use windows::Win32::Media::Audio::{IMMDeviceEnumerator, MMDeviceEnumerator};
use windows::Win32::Devices::Display::{
    GetDisplayConfigBufferSizes, QueryDisplayConfig, SetDisplayConfig, DISPLAYCONFIG_TOPOLOGY_ID,
};
use windows::core::GUID;
use windows::Win32::System::Com::{CoInitialize, CoCreateInstance, CLSCTX_ALL};

// Forces Windows to reinit display settings
pub fn force_reinit_screen() -> i32 {
    let mut path_count = 0;
    let mut mode_count = 0;
    let result = unsafe { GetDisplayConfigBufferSizes(QDC_ALL_PATHS, &mut path_count, &mut mode_count) };
    println!("GetDisplayConfigBufferSizes returned {}", result);
    let mut path_array = Vec::with_capacity(path_count as usize);
    let mut mode_array = Vec::with_capacity(mode_count as usize);
    let result = unsafe {
        QueryDisplayConfig(
            QDC_ALL_PATHS,
            &mut path_count,
            path_array.as_mut_ptr(),
            &mut mode_count,
            mode_array.as_mut_ptr(),
            ::core::mem::transmute(::core::ptr::null::<DISPLAYCONFIG_TOPOLOGY_ID>()),
        )
    };
    println!("QueryDisplayConfig returned {}", result);
    let flags = SDC_FORCE_MODE_ENUMERATION | SDC_APPLY | SDC_USE_SUPPLIED_DISPLAY_CONFIG | SDC_SAVE_TO_DATABASE;
    let result = unsafe { SetDisplayConfig(Some(&path_array), Some(&mode_array), flags) };
    result
}
However, it does not work on any computer I've tried it on (it returns code 87, which seems to mean bad parameters). What am I doing wrong?
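A likely culprit: Vec::with_capacity reserves memory but leaves the length at zero, so Some(&path_array) and Some(&mode_array) hand SetDisplayConfig empty slices, which fits error 87 (ERROR_INVALID_PARAMETER). Below is a sketch of the repaired function, reusing the question's imports and assuming the windows crate generates Default for the DISPLAYCONFIG_* structs and accepts a plain null pointer for the topology parameter (both of which vary by crate version):

use windows::Win32::Devices::Display::{DISPLAYCONFIG_MODE_INFO, DISPLAYCONFIG_PATH_INFO};

pub fn force_reinit_screen() -> i32 {
    let mut path_count = 0;
    let mut mode_count = 0;
    unsafe { GetDisplayConfigBufferSizes(QDC_ALL_PATHS, &mut path_count, &mut mode_count) };

    // Zero-initialized elements with the *length* set, not just the capacity.
    let mut path_array = vec![DISPLAYCONFIG_PATH_INFO::default(); path_count as usize];
    let mut mode_array = vec![DISPLAYCONFIG_MODE_INFO::default(); mode_count as usize];

    unsafe {
        QueryDisplayConfig(
            QDC_ALL_PATHS,
            &mut path_count,
            path_array.as_mut_ptr(),
            &mut mode_count,
            mode_array.as_mut_ptr(),
            std::ptr::null_mut(), // the topology id must be null when QDC_ALL_PATHS is used
        )
    };

    // QueryDisplayConfig may report fewer entries than were allocated;
    // pass only the filled portion on to SetDisplayConfig.
    path_array.truncate(path_count as usize);
    mode_array.truncate(mode_count as usize);

    let flags = SDC_FORCE_MODE_ENUMERATION | SDC_APPLY | SDC_USE_SUPPLIED_DISPLAY_CONFIG | SDC_SAVE_TO_DATABASE;
    unsafe { SetDisplayConfig(Some(&path_array), Some(&mode_array), flags) }
}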

Rust Win32 FFI: User-mode data execution prevention (DEP) violation

I'm trying to pass an ID3D11Device instance from Rust to a C FFI library (FFmpeg).
I made this sample code:
pub fn create_d3d11_device(
    &mut self,
    device: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11Device>,
    context: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11DeviceContext>,
) {
    let av_device: Box<AVBufferRef> = self.alloc(HwDeviceType::D3d11va);
    unsafe {
        let device_context = Box::from_raw(av_device.data as *mut AVHWDeviceContext);
        let mut d3d11_device_context = Box::from_raw(device_context.hwctx as *mut AVD3D11VADeviceContext);
        d3d11_device_context.device = device.as_mut() as *mut _;
        d3d11_device_context.device_context = context.as_mut() as *mut _;
        let avp = Box::into_raw(av_device);
        av_hwdevice_ctx_init(avp);
        self.av_hwdevice = Some(Box::from_raw(avp));
    }
}
On the Rust side the device does work, but on the C side, when FFmpeg calls ID3D11DeviceContext_QueryInterface, the app crashes with the following error: Exception 0xc0000005 encountered at address 0x7ff9fb99ad38: User-mode data execution prevention (DEP) violation at location 0x7ff9fb99ad38
The address is actually the pointer for the lpVtbl of QueryInterface, as seen here:
The disassembly of the address also looks correct (this was done in another debugging session):
(lldb) disassemble --start-address 0x00007ffffdf3ad38
0x7ffffdf3ad38: addb %ah, 0x7ffffd(%rdi,%riz,8)
0x7ffffdf3ad3f: addb %al, (%rax)
0x7ffffdf3ad41: movabsl -0x591fffff80000219, %eax
0x7ffffdf3ad4a: outl %eax, $0xfd
Do you have any pointers for debugging this further?
EDIT: I made a minimal reproduction sample. Interestingly, this does not cause a DEP violation, but simply a segfault.
On the C side:
int test_ffi(ID3D11Device *device) {
    ID3D11DeviceContext *context;
    device->lpVtbl->GetImmediateContext(device, &context);
    if (!context) return 1;
    return 0;
}
On the Rust side:
unsafe fn main_rust() {
    let mut device = None;
    let mut device_context = None;
    let _ = match windows::Win32::Graphics::Direct3D11::D3D11CreateDevice(
        None,
        D3D_DRIVER_TYPE_HARDWARE,
        OtherHinstance::default(),
        D3D11_CREATE_DEVICE_DEBUG,
        &[],
        D3D11_SDK_VERSION,
        &mut device,
        std::ptr::null_mut(),
        &mut device_context,
    ) {
        Ok(e) => e,
        Err(e) => panic!("Creation Failed: {}", e),
    };
    let mut device = match device {
        Some(e) => e,
        None => panic!("Creation Failed2"),
    };
    let mut f2: ID3D11Device = transmute_copy(&device); // Transmuting the WinAPI into a bindgen ID3D11Device
    test_ffi(&mut f2);
}
The bindgen build.rs:
extern crate bindgen;

use std::env;
use std::path::PathBuf;

fn main() {
    // Tell cargo to tell rustc to link the demo library and d3d11.
    println!("cargo:rustc-link-lib=ffi_demoLIB");
    println!("cargo:rustc-link-lib=d3d11");
    // Tell cargo to invalidate the built crate whenever the wrapper changes
    println!("cargo:rerun-if-changed=library.h");
    // The bindgen::Builder is the main entry point
    // to bindgen, and lets you build up options for
    // the resulting bindings.
    let bindings = bindgen::Builder::default()
        // The input header we would like to generate
        // bindings for.
        .header("library.h")
        // Tell cargo to invalidate the built crate whenever any of the
        // included header files changed.
        .parse_callbacks(Box::new(bindgen::CargoCallbacks))
        .blacklist_type("_IMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY64")
        .blacklist_type("PIMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY")
        .blacklist_type("PIMAGE_TLS_DIRECTORY")
        // Finish the builder and generate the bindings.
        .generate()
        // Unwrap the Result and panic on failure.
        .expect("Unable to generate bindings");
    // Write the bindings to the $OUT_DIR/bindings.rs file.
    let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out_path.join("bindings.rs"))
        .expect("Couldn't write bindings!");
}
The Complete Repo can be found over here: https://github.com/TheElixZammuto/demo-ffi
According to https://github.com/microsoft/windows-rs/issues/1710#issuecomment-1111522946, my error was that I was transmuting the structs, while what I should have done is cast the references:
let f2: &mut ID3D11Device = transmute_copy(&mut device); // Transmuting the WinAPI into a bindgen ID3D11Device
test_ffi(f2);
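The underlying reason, as discussed in the linked issue: a windows-rs COM wrapper such as ID3D11Device is just a single pointer to the COM object, while the bindgen-generated ID3D11Device describes the pointee (the struct holding lpVtbl). Transmuting the wrapper by value therefore hands C the wrong level of indirection, whereas reinterpreting a reference to the wrapper yields the ID3D11Device * the C side expects. An alternative sketch that avoids transmute entirely, assuming a windows-rs version that exposes Interface::as_raw (ID3D11Device below means the bindgen type):

use windows::core::Interface;

// as_raw() returns the wrapped COM pointer as *mut c_void without touching
// the reference count; cast it to the bindgen-generated interface type.
let raw = device.as_raw() as *mut ID3D11Device;
unsafe { test_ffi(raw) };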

Windows: add a user from Rust

I use the winapi NetUserAdd function to add a user. The account is added successfully, but running the command net user shows the result in the picture below, and I cannot find the user in the Control Panel.
What is wrong with my LPWSTR handling, or with the cast of the USER_INFO_1 struct to LPBYTE?
use winapi::um::lmaccess::{NetUserAdd, USER_INFO_1, UF_SCRIPT};
use std::iter::once;
use std::os::windows::ffi::OsStrExt;
use std::ffi::OsStr;

pub fn winstr(value: &str) -> Vec<u16> {
    OsStr::new(value).encode_wide().chain(once(0)).collect()
}

fn main() {
    let username: String = "Test".to_string();
    let password: String = "Test******".to_string();
    let mut user = USER_INFO_1 {
        usri1_name: winstr(&username).as_mut_ptr(),
        usri1_password: winstr(&password).as_mut_ptr(),
        usri1_priv: 1,
        usri1_password_age: 0,
        usri1_home_dir: std::ptr::null_mut(),
        usri1_comment: std::ptr::null_mut(),
        usri1_flags: UF_SCRIPT,
        usri1_script_path: std::ptr::null_mut(),
    };
    let mut error = 0;
    unsafe {
        NetUserAdd(std::ptr::null_mut(), 1, &mut user as *mut _ as _, &mut error);
    }
    println!("{}", error); // prints 0, which means success
}
You're passing dangling pointers.
Each winstr(...).as_mut_ptr() call creates a Vec<u16>, takes a pointer to its data, and then drops the Vec<u16>, since it was only a temporary. You need to keep those vectors alive at least until the call to NetUserAdd has finished:
let mut username = winstr("Test");
let mut password = winstr("Test******");
let mut user = USER_INFO_1 {
    usri1_name: username.as_mut_ptr(),
    usri1_password: password.as_mut_ptr(),
    usri1_priv: 1,
    usri1_password_age: 0,
    usri1_home_dir: std::ptr::null_mut(),
    usri1_comment: std::ptr::null_mut(),
    usri1_flags: UF_SCRIPT,
    usri1_script_path: std::ptr::null_mut(),
};
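With the vectors bound to locals, they stay alive past the call, and the rest of the program can remain as in the question (same call, shown for completeness):

let mut error = 0;
unsafe {
    // `username` and `password` are still in scope here, so the pointers
    // stored in `user` remain valid for the duration of the call.
    NetUserAdd(std::ptr::null_mut(), 1, &mut user as *mut _ as _, &mut error);
}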

Issue with using winapi to get char from scancode

I've been trying to convert between a scancode and a character. This system has worked before, but as of now, for no reason that I can tell, it has stopped working.
static mut SCANCODE_BUFFER: winapi::shared::minwindef::PBYTE = std::ptr::null_mut();
static mut layout: winapi::shared::minwindef::HKL = std::ptr::null_mut();

pub fn SCANCODE_TO_CHAR(scancode: u32) -> char {
    unsafe {
        let mut result = [0 as u16; 2];
        if GetKeyboardState(SCANCODE_BUFFER) == winapi::shared::minwindef::FALSE {
            return 0 as char;
        }
        let vk = MapVirtualKeyExA(scancode, 1, layout);
        ToAsciiEx(vk, scancode, SCANCODE_BUFFER, result.as_mut_ptr(), 0, layout);
        result[0] as u8 as char
    }
}

pub fn initialize() {
    unsafe {
        SCANCODE_BUFFER = [0 as u8; 256].as_mut_ptr();
        layout = GetKeyboardLayout(0);
    }
}
I've done some debugging, and it seems that the function call:
GetKeyboardState(SCANCODE_BUFFER)
is causing the program to end with this:
(exit code: 0xc0000005, STATUS_ACCESS_VIOLATION)
Does anyone know how this might be fixed?
Extra info:
SCANCODE_BUFFER is definitely not a null pointer.
Sorry for posting this. SCANCODE_BUFFER was pointing to dropped memory. I must have been extremely lucky in the past.
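That matches the code above: initialize() takes a pointer into a local [0u8; 256] array, which is dropped as soon as initialize() returns, so every later GetKeyboardState call writes through a dangling pointer. A minimal sketch of the fix along the lines of the self-answer (a hypothetical rewrite, not code from the original post), giving the buffer static storage so the pointer can never dangle:

use winapi::shared::minwindef::{FALSE, HKL};
use winapi::um::winuser::{GetKeyboardLayout, GetKeyboardState, MapVirtualKeyExA, ToAsciiEx};

// The buffer itself has static storage, so it lives for the whole program.
static mut SCANCODE_BUFFER: [u8; 256] = [0; 256];
static mut layout: HKL = std::ptr::null_mut();

pub fn initialize() {
    unsafe {
        layout = GetKeyboardLayout(0);
    }
}

pub fn SCANCODE_TO_CHAR(scancode: u32) -> char {
    unsafe {
        let mut result = [0u16; 2];
        // Pass a pointer to the static array instead of a stale pointer.
        if GetKeyboardState(SCANCODE_BUFFER.as_mut_ptr()) == FALSE {
            return 0 as char;
        }
        let vk = MapVirtualKeyExA(scancode, 1, layout);
        ToAsciiEx(vk, scancode, SCANCODE_BUFFER.as_ptr(), result.as_mut_ptr(), 0, layout);
        result[0] as u8 as char
    }
}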

Segfault when calling GetBinaryTypeA

I tried to import the GetBinaryTypeA function:
use std::ffi::CString;
use std::os::raw::{c_char, c_ulong};

extern {
    fn GetBinaryTypeA(s: *const c_char, out: *mut c_ulong) -> i32;
}

fn main() {
    let path = "absolute/path/to/bin.exe";
    let cpath = CString::new(path).unwrap();
    let mut out: c_ulong = 0;
    println!("{:?}", cpath);
    unsafe { GetBinaryTypeA(cpath.as_ptr(), out as *mut c_ulong); }
    println!("{:?}", cpath);
}
Output:
error: process didn't exit successfully: `target\debug\bin_deploy.exe` (exit code: 3221225477)
Process finished with exit code -1073741819 (0xC0000005)
If I set an invalid path then it executes successfully and GetLastError() returns 2 ("The system cannot find the file specified"), so it looks like the imported function works.
I received the same error using the kernel32-sys crate. Where else can the error be?
You are casting the value 0 to a pointer. On the vast majority of computers in use today, the pointer with the value 0 is known as NULL. Thus, you are trying to write to the NULL pointer, which causes a crash.
You want to write to the address of the value:
&mut out as *mut c_ulong
Which doesn't even need the cast:

unsafe {
    GetBinaryTypeA(cpath.as_ptr(), &mut out);
}
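Putting it together, a sketch of the corrected program (the path stays a placeholder; extern "system" is used here because GetBinaryTypeA is a stdcall Win32 API, which matters on 32-bit targets even though plain extern happens to work on x64):

use std::ffi::CString;
use std::os::raw::{c_char, c_ulong};

extern "system" {
    fn GetBinaryTypeA(s: *const c_char, out: *mut c_ulong) -> i32;
}

fn main() {
    let cpath = CString::new("absolute/path/to/bin.exe").unwrap();
    let mut out: c_ulong = 0;
    // Pass the address of `out`, not its value cast to a pointer.
    let ok = unsafe { GetBinaryTypeA(cpath.as_ptr(), &mut out) };
    if ok != 0 {
        println!("binary type: {}", out);
    } else {
        println!("GetBinaryTypeA failed");
    }
}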
