Windows, add user by Rust - windows

I use the winapi NetUserAdd function to add a user. The account is reported as added successfully, but running net user shows the output in the picture below, and the user does not appear in Control Panel.
What is wrong with my LPWSTR handling, or with the cast of the USER_INFO_1 struct to LPBYTE?
use winapi::um::lmaccess::{USER_INFO_1, NetUserAdd, UF_SCRIPT};
use std::iter::once;
use std::os::windows::ffi::OsStrExt;
use std::ffi::OsStr;

pub fn winstr(value: &str) -> Vec<u16> {
    OsStr::new(value).encode_wide().chain(once(0)).collect()
}

fn main() {
    let username: String = "Test".to_string();
    let password: String = "Test******".to_string();
    let mut user = USER_INFO_1 {
        usri1_name: winstr(&username).as_mut_ptr(),
        usri1_password: winstr(&password).as_mut_ptr(),
        usri1_priv: 1,
        usri1_password_age: 0,
        usri1_home_dir: std::ptr::null_mut(),
        usri1_comment: std::ptr::null_mut(),
        usri1_flags: UF_SCRIPT,
        usri1_script_path: std::ptr::null_mut(),
    };
    let mut error = 0;
    unsafe {
        NetUserAdd(std::ptr::null_mut(), 1, &mut user as *mut _ as _, &mut error);
    }
    println!("{}", error); // result is 0, meaning success.
}

You're sending dangling pointers.
Each winstr(...).as_mut_ptr() call creates a Vec<u16>, gets a pointer to its data, and then drops the Vec<u16>, since it was a temporary value. You need to keep those vectors alive at least until the call to NetUserAdd has finished:
let mut username = winstr("Test");
let mut password = winstr("Test******");

let mut user = USER_INFO_1 {
    usri1_name: username.as_mut_ptr(),
    usri1_password: password.as_mut_ptr(),
    usri1_priv: 1,
    usri1_password_age: 0,
    usri1_home_dir: std::ptr::null_mut(),
    usri1_comment: std::ptr::null_mut(),
    usri1_flags: UF_SCRIPT,
    usri1_script_path: std::ptr::null_mut(),
};
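For completeness, here is a sketch of the whole corrected program (same fields and privilege level as the question; the only behavioral change is keeping the wide strings alive across the call). One extra detail worth checking: the return value of NetUserAdd is the status code, while the final parameter (parm_err) only reports which field was rejected when the call fails.

use std::ffi::OsStr;
use std::iter::once;
use std::os::windows::ffi::OsStrExt;
use winapi::um::lmaccess::{USER_INFO_1, NetUserAdd, UF_SCRIPT};

pub fn winstr(value: &str) -> Vec<u16> {
    OsStr::new(value).encode_wide().chain(once(0)).collect()
}

fn main() {
    // Keep the wide strings alive for the duration of the call.
    let mut username = winstr("Test");
    let mut password = winstr("Test******");

    let mut user = USER_INFO_1 {
        usri1_name: username.as_mut_ptr(),
        usri1_password: password.as_mut_ptr(),
        usri1_priv: 1, // USER_PRIV_USER
        usri1_password_age: 0,
        usri1_home_dir: std::ptr::null_mut(),
        usri1_comment: std::ptr::null_mut(),
        usri1_flags: UF_SCRIPT,
        usri1_script_path: std::ptr::null_mut(),
    };

    let mut parm_err = 0;
    let status = unsafe {
        NetUserAdd(std::ptr::null_mut(), 1, &mut user as *mut _ as _, &mut parm_err)
    };
    // Zero (NERR_Success) means the account was created; on failure,
    // parm_err names the first offending field.
    println!("status = {}, parm_err = {}", status, parm_err);
}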

Related

Trying to Read Process Memory with windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory in rust

I'm trying to read the health value, in Rust, from a game called Assault Cube.
However, I always have 0 stored in buffer. Can anyone explain what I'm doing wrong?
My code looks like this:
fn main() {
    unsafe {
        use std::ffi::c_void;
        use windows_sys::Win32::Foundation::HANDLE;

        let process_id: HANDLE = 13488;
        let health_adress = 0x005954FC as *const c_void;
        let buffer: *mut c_void = std::ptr::null_mut();
        let mut number_read: usize = 0;
        windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory(
            process_id,
            health_adress,
            buffer,
            4,
            &mut number_read,
        );
        println!("{:?}", buffer as i32);
    }
}
The picture below shows how I got the address with Cheat Engine.
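Two things stand out in that snippet: ReadProcessMemory expects a process handle obtained from OpenProcess, not a raw process ID, and it needs a pointer to writable storage rather than a null pointer. A minimal sketch along those lines (reusing the PID and address from the question, and assuming a windows_sys version where HANDLE is an integer type, as the question's code implies):

use std::ffi::c_void;
use windows_sys::Win32::Foundation::CloseHandle;
use windows_sys::Win32::System::Diagnostics::Debug::ReadProcessMemory;
use windows_sys::Win32::System::Threading::{OpenProcess, PROCESS_VM_READ};

fn main() {
    unsafe {
        // 13488 is a process ID, not a handle: open the process first.
        let process_handle = OpenProcess(PROCESS_VM_READ, 0, 13488);
        assert_ne!(process_handle, 0, "OpenProcess failed");

        // Read into real, writable storage instead of a null pointer.
        let mut health: i32 = 0;
        let mut number_read: usize = 0;
        let ok = ReadProcessMemory(
            process_handle,
            0x005954FC as *const c_void,
            &mut health as *mut i32 as *mut c_void,
            std::mem::size_of::<i32>(),
            &mut number_read,
        );
        assert_ne!(ok, 0, "ReadProcessMemory failed");
        println!("health = {}, bytes read = {}", health, number_read);

        CloseHandle(process_handle);
    }
}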

How do I allocate space to call GetInterfaceInfo using the windows crate?

I'm trying to fetch information regarding the network interfaces available on the system via GetInterfaceInfo using Microsoft's windows crate. This requires me to do some unsafe operations, and I get it to work for one interface, but not two:
#[cfg(test)]
mod tests {
    use super::*;
    use windows::{
        core::*, Data::Xml::Dom::*, Win32::Foundation::*, Win32::NetworkManagement::IpHelper::*,
        Win32::System::Threading::*, Win32::UI::WindowsAndMessaging::*,
    };

    #[test]
    fn main() {
        unsafe {
            let mut dw_out_buf_len: u32 = 0;
            let mut dw_ret_val =
                GetInterfaceInfo(std::ptr::null_mut(), &mut dw_out_buf_len as *mut u32);
            if dw_ret_val != ERROR_INSUFFICIENT_BUFFER.0 {
                panic!();
            }
            println!("Size: {}", dw_out_buf_len);

            // allocate that amount of memory, which will be used as a buffer
            let mut ip_interface_info = Vec::with_capacity(dw_out_buf_len as usize);
            let mut ptr = ip_interface_info.as_mut_ptr() as *mut IP_INTERFACE_INFO;

            dw_ret_val = GetInterfaceInfo(ptr, &mut dw_out_buf_len as *mut u32);
            println!("Num adapters: {}", (*ptr).NumAdapters);
            for i in 0..(*ptr).NumAdapters as usize {
                println!(
                    "\tAdapter index: {}\n\tAdapter name: {}",
                    (*ptr).Adapter[i].Index,
                    String::from_utf16(&(*ptr).Adapter[i].Name).unwrap()
                );
            }
        }
    }
}
It crashes when I'm trying to access the second entry (even though there should be two available):
panicked at 'index out of bounds: the len is 1 but the index is 1'
The struct IP_INTERFACE_INFO containing all the data has a field called Adapter, which seems to be limited to an array size of 1. Am I reading this correctly? How is it then supposed to hold multiple adapters?
#[repr(C)]
#[doc = "*Required features: `\"Win32_NetworkManagement_IpHelper\"`*"]
pub struct IP_INTERFACE_INFO {
    pub NumAdapters: i32,
    pub Adapter: [IP_ADAPTER_INDEX_MAP; 1],
}
It appears that IP_INTERFACE_INFO uses a C flexible array member, which often uses the [1] syntax. The C++ example in Managing Interfaces Using GetInterfaceInfo corroborates this usage:
for (i = 0; i < (unsigned int) pInterfaceInfo->NumAdapters; i++) {
    printf("  Adapter Index[%d]: %ld\n", i,
           pInterfaceInfo->Adapter[i].Index);
    printf("  Adapter Name[%d]:  %ws\n\n", i,
           pInterfaceInfo->Adapter[i].Name);
}
The equivalent in Rust would be to take the single-element array, get the raw pointer to it, then iterate over that. There are lots of details to be aware of, such as allocation alignment and pointer provenance. Here's an annotated example:
use std::{
    alloc::{GlobalAlloc, Layout, System},
    mem,
    ptr::{self, addr_of},
    slice,
};

use windows::{
    Win32::Foundation::*,
    Win32::NetworkManagement::IpHelper::{
        GetInterfaceInfo, IP_ADAPTER_INDEX_MAP, IP_INTERFACE_INFO,
    },
};

fn main() {
    unsafe {
        // Perform the first call to know how many bytes to allocate
        let mut raw_buf_len = 0;
        let ret_val = GetInterfaceInfo(ptr::null_mut(), &mut raw_buf_len);
        assert_eq!(
            ret_val, ERROR_INSUFFICIENT_BUFFER.0,
            "Expected to get the required buffer size, was {ret_val:?}",
        );

        // Allocate an appropriately sized *and aligned* buffer to store the result
        let buf_len = raw_buf_len.try_into().expect("Invalid buffer length");
        let layout = Layout::from_size_align(buf_len, mem::align_of::<IP_INTERFACE_INFO>())
            .expect("Could not calculate the appropriate memory layout");
        let base_ptr = System.alloc(layout);
        let ip_interface_info = base_ptr.cast();

        // Perform the second call to get the data
        let ret_val = GetInterfaceInfo(ip_interface_info, &mut raw_buf_len);
        assert_eq!(
            ret_val, NO_ERROR.0,
            "Could not get the data on the second call: {ret_val:?}",
        );

        // Construct a pointer to the adapter array that preserves the
        // provenance of the original pointer
        let adapter_ptr = addr_of!((*ip_interface_info).Adapter);
        let adapter_ptr = adapter_ptr.cast::<IP_ADAPTER_INDEX_MAP>();

        // Combine the pointer and length into a Rust slice
        let n_adapters = (*ip_interface_info).NumAdapters;
        let n_adapters = n_adapters.try_into().expect("Invalid adapter count");
        let adapters = slice::from_raw_parts(adapter_ptr, n_adapters);

        println!("Num adapters: {}", adapters.len());
        for adapter in adapters {
            let IP_ADAPTER_INDEX_MAP {
                Index: index,
                Name: name,
            } = adapter;

            // The fixed-size buffer contains data after the UTF-16 NUL character
            let name_end = name.iter().position(|&c| c == 0).unwrap_or(name.len());
            let name = String::from_utf16_lossy(&name[..name_end]);

            println!("Adapter index: {index}\nAdapter name: {name}");
        }

        // Free the allocation. This should be wrapped in a type that
        // implements `Drop` so we don't leak memory when unwinding a panic.
        System.dealloc(base_ptr, layout);
    }
}
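As the final comment hints, the alloc/dealloc pair can be tied to a small RAII guard so the buffer is freed even if a panic unwinds before the explicit dealloc. A possible shape for such a guard (a sketch, not part of the original answer):

use std::alloc::{GlobalAlloc, Layout, System};
use std::mem;

use windows::Win32::NetworkManagement::IpHelper::IP_INTERFACE_INFO;

// Owns the raw allocation and frees it on drop.
struct InterfaceInfoBuf {
    ptr: *mut u8,
    layout: Layout,
}

impl InterfaceInfoBuf {
    fn new(len: usize) -> Self {
        let layout = Layout::from_size_align(len, mem::align_of::<IP_INTERFACE_INFO>())
            .expect("Could not calculate the appropriate memory layout");
        let ptr = unsafe { System.alloc(layout) };
        assert!(!ptr.is_null(), "Allocation failed");
        Self { ptr, layout }
    }

    fn as_info_ptr(&self) -> *mut IP_INTERFACE_INFO {
        self.ptr.cast()
    }
}

impl Drop for InterfaceInfoBuf {
    fn drop(&mut self) {
        unsafe { System.dealloc(self.ptr, self.layout) };
    }
}

With that in place, the second GetInterfaceInfo call takes buf.as_info_ptr(), and the explicit System.dealloc at the end of the example is no longer needed.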

Rust Win32 FFI: User-mode data execution prevention (DEP) violation

I'm trying to pass an ID3D11Device instance from Rust to a C FFI library (FFmpeg).
I made this sample code:
pub fn create_d3d11_device(
    &mut self,
    device: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11Device>,
    context: &mut Box<windows::Win32::Graphics::Direct3D11::ID3D11DeviceContext>,
) {
    let av_device: Box<AVBufferRef> = self.alloc(HwDeviceType::D3d11va);
    unsafe {
        let device_context = Box::from_raw(av_device.data as *mut AVHWDeviceContext);
        let mut d3d11_device_context =
            Box::from_raw(device_context.hwctx as *mut AVD3D11VADeviceContext);
        d3d11_device_context.device = device.as_mut() as *mut _;
        d3d11_device_context.device_context = context.as_mut() as *mut _;
        let avp = Box::into_raw(av_device);
        av_hwdevice_ctx_init(avp);
        self.av_hwdevice = Some(Box::from_raw(avp));
    }
}
On the Rust side the device does work, but on the C side, when FFmpeg calls ID3D11DeviceContext_QueryInterface, the app crashes with the following error: Exception 0xc0000005 encountered at address 0x7ff9fb99ad38: User-mode data execution prevention (DEP) violation at location 0x7ff9fb99ad38
The address is actually the pointer for the lpVtbl of QueryInterface, as seen here:
The disassembly of the address also looks correct (this is from another debugging session):
(lldb) disassemble --start-address 0x00007ffffdf3ad38
0x7ffffdf3ad38: addb %ah, 0x7ffffd(%rdi,%riz,8)
0x7ffffdf3ad3f: addb %al, (%rax)
0x7ffffdf3ad41: movabsl -0x591fffff80000219, %eax
0x7ffffdf3ad4a: outl %eax, $0xfd
Do you have any pointers for debugging this further?
EDIT: I made a minimal reproduction sample. Interestingly, this does not cause a DEP violation, but simply a segfault.
On the C side:
int test_ffi(ID3D11Device *device) {
    ID3D11DeviceContext *context;
    device->lpVtbl->GetImmediateContext(device, &context);
    if (!context) return 1;
    return 0;
}
On the Rust side:
unsafe fn main_rust() {
    let mut device = None;
    let mut device_context = None;
    let _ = match windows::Win32::Graphics::Direct3D11::D3D11CreateDevice(
        None,
        D3D_DRIVER_TYPE_HARDWARE,
        OtherHinstance::default(),
        D3D11_CREATE_DEVICE_DEBUG,
        &[],
        D3D11_SDK_VERSION,
        &mut device,
        std::ptr::null_mut(),
        &mut device_context,
    ) {
        Ok(e) => e,
        Err(e) => panic!("Creation Failed: {}", e),
    };
    let mut device = match device {
        Some(e) => e,
        None => panic!("Creation Failed2"),
    };
    let mut f2: ID3D11Device = transmute_copy(&device); // Transmuting the WinAPI type into a bindgen ID3D11Device
    test_ffi(&mut f2);
}
The bindgen build.rs:
extern crate bindgen;

use std::env;
use std::path::PathBuf;

fn main() {
    // Tell cargo to tell rustc to link the system bzip2
    // shared library.
    println!("cargo:rustc-link-lib=ffi_demoLIB");
    println!("cargo:rustc-link-lib=d3d11");

    // Tell cargo to invalidate the built crate whenever the wrapper changes
    println!("cargo:rerun-if-changed=library.h");

    // The bindgen::Builder is the main entry point
    // to bindgen, and lets you build up options for
    // the resulting bindings.
    let bindings = bindgen::Builder::default()
        // The input header we would like to generate
        // bindings for.
        .header("library.h")
        // Tell cargo to invalidate the built crate whenever any of the
        // included header files changed.
        .parse_callbacks(Box::new(bindgen::CargoCallbacks))
        .blacklist_type("_IMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY64")
        .blacklist_type("PIMAGE_TLS_DIRECTORY64")
        .blacklist_type("IMAGE_TLS_DIRECTORY")
        .blacklist_type("PIMAGE_TLS_DIRECTORY")
        // Finish the builder and generate the bindings.
        .generate()
        // Unwrap the Result and panic on failure.
        .expect("Unable to generate bindings");

    // Write the bindings to the $OUT_DIR/bindings.rs file.
    let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out_path.join("bindings.rs"))
        .expect("Couldn't write bindings!");
}
The Complete Repo can be found over here: https://github.com/TheElixZammuto/demo-ffi
According to https://github.com/microsoft/windows-rs/issues/1710#issuecomment-1111522946, my error was that I was transmuting the structs, while what I should have done is cast the references. (The windows crate's ID3D11Device is a transparent wrapper holding the raw COM interface pointer, so reinterpreting the reference yields a pointer to the actual COM object, whose first field is the lpVtbl, whereas transmuting the wrapper by value treats that interface pointer as if it were the vtable pointer.)
let f2: &mut ID3D11Device = transmute_copy(&mut device); // Transmuting the WinAPI type into a bindgen ID3D11Device
test_ffi(f2);

Rust FFI with windows CryptoUnprotectData

I'm trying to learn FFI by starting with something simple (and with a practical use), but this doesn't seem to work:
mod bindings {
    ::windows::include_bindings!();
}

use std::{convert::TryFrom, ptr};

use bindings::{
    windows::win32::security::CryptUnprotectData,
    windows::win32::security::CRYPTOAPI_BLOB,
};

// Powershell code to generate the token
// $pw = read-host "Enter Token" -AsSecureString
// ConvertFrom-SecureString $pw
fn main() -> windows::Result<()> {
    // The encrypted string is 'foobar'
    let encrypted_token = "01000000d08c9ddf0115d1118c7a00c04fc297eb01000000c336dca1c99b7d40ae3f797c2b5d2951000000000200000000001066000000010000200000007a87d6ac2fc8037bef45e3dbcb0b652432a22a9b48fc5fa3e4fcfd9aaf922949000000000e8000000002000020000000eeaa76a44b6cd5da837f4b0f7040de8e2795ed846f8abe2c7f2d2365d00cf89c1000000069fcaa7fa475178d623f4adab1b08ac4400000008af807014cba53ed2f1e7b8a54c6ad89ff57f0ee3d8c51ecd8c5b48e99b58d0e738c9fae9fc41b4280938865a047f2724106d34313c88a0f3852d5ba9d75abfd";
    let mut et_bytes = hex::decode(encrypted_token).unwrap();
    let size = u32::try_from(et_bytes.len()).unwrap();
    let mut decrypted = vec![0u8; et_bytes.len()];
    let dt_bytes = &mut decrypted;

    let mut p_data_in = CRYPTOAPI_BLOB {
        cb_data: size,
        pb_data: et_bytes.as_mut_ptr(),
    };
    let mut p_data_out = CRYPTOAPI_BLOB {
        cb_data: size,
        pb_data: dt_bytes.as_mut_ptr(),
    };

    let pin = &mut p_data_in;
    let pout = &mut p_data_out;
    unsafe {
        let result = CryptUnprotectData(
            pin,
            ptr::null_mut(),
            ptr::null_mut(),
            ptr::null_mut(),
            ptr::null_mut(),
            0,
            pout,
        );
        println!("{:?}, {:?}", dt_bytes, result);
    }
    Ok(())
}
Basically it returns an all-zero array, but CryptUnprotectData returns 1, which according to the docs means success: https://learn.microsoft.com/en-us/windows/win32/api/dpapi/nf-dpapi-cryptunprotectdata
I've verified that by mangling the hex string, thus corrupting the encrypted data, which causes it to return 0. I'm not sure if it's writing to the wrong location or something, but presumably success means it wrote somewhere.
The CryptUnprotectData API allocates the output buffer for you. It doesn't write into the buffer you provided, which is why you keep seeing your buffer's original (zeroed) contents, irrespective of the API call's result.
Instead, you'll want to pass in a (default-initialized) CRYPTOAPI_BLOB structure and observe the values the API passes back. Something like the following will do:
fn main() -> windows::Result<()> {
    // The encrypted string is 'foobar'
    let encrypted_token = "01000000d08c9ddf0115d1118c7a00c04fc297eb01000000c336dca1c99b7d40ae3f797c2b5d2951000000000200000000001066000000010000200000007a87d6ac2fc8037bef45e3dbcb0b652432a22a9b48fc5fa3e4fcfd9aaf922949000000000e8000000002000020000000eeaa76a44b6cd5da837f4b0f7040de8e2795ed846f8abe2c7f2d2365d00cf89c1000000069fcaa7fa475178d623f4adab1b08ac4400000008af807014cba53ed2f1e7b8a54c6ad89ff57f0ee3d8c51ecd8c5b48e99b58d0e738c9fae9fc41b4280938865a047f2724106d34313c88a0f3852d5ba9d75abfd";
    let mut et_bytes = hex::decode(encrypted_token).unwrap();
    let size = u32::try_from(et_bytes.len()).unwrap();

    let mut p_data_in = CRYPTOAPI_BLOB {
        cb_data: size,
        pb_data: et_bytes.as_mut_ptr(),
    };
    // Default-initialize; don't allocate any memory
    let mut p_data_out = CRYPTOAPI_BLOB::default();

    let pin = &mut p_data_in;
    let pout = &mut p_data_out;
    unsafe {
        let result = CryptUnprotectData(
            pin,
            ptr::null_mut(),
            ptr::null_mut(),
            ptr::null_mut(),
            ptr::null_mut(),
            0,
            pout,
        );
        // Probably safe to ignore `result`
        if !p_data_out.pb_data.is_null() {
            // Construct a slice from the returned data
            let output = from_raw_parts(p_data_out.pb_data, p_data_out.cb_data as _);
            println!("{:?}", output);
            // Cleanup
            LocalFree(p_data_out.pb_data as _);
        }
        Ok(())
    }
}
That produces the following output for me:
[102, 0, 111, 0, 111, 0, 98, 0, 97, 0, 114, 0]
which is the UTF-16LE encoding for foobar.
Note that you need to generate and import windows::win32::system_services::LocalFree to perform the cleanup.
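If you want the plaintext back as a Rust string, the returned bytes can be decoded as UTF-16LE. A small follow-up sketch, assuming the output slice from the example above:

// `output` is the &[u8] slice built from the returned blob; pair up the bytes
// as little-endian UTF-16 code units and decode them.
let utf16: Vec<u16> = output
    .chunks_exact(2)
    .map(|pair| u16::from_le_bytes([pair[0], pair[1]]))
    .collect();
let decoded = String::from_utf16_lossy(&utf16);
println!("{}", decoded); // prints "foobar"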

Issue with using winapi to get char from scancode

I've been trying to convert a scancode to a character. This system has worked before, but as of now, for no reason that I can tell, it has stopped working.
static mut SCANCODE_BUFFER: winapi::shared::minwindef::PBYTE = std::ptr::null_mut();
static mut layout: winapi::shared::minwindef::HKL = std::ptr::null_mut();

pub fn SCANCODE_TO_CHAR(scancode: u32) -> char {
    unsafe {
        let mut result = [0 as u16; 2];
        if GetKeyboardState(SCANCODE_BUFFER) == winapi::shared::minwindef::FALSE {
            return 0 as char;
        }
        let vk = MapVirtualKeyExA(scancode, 1, layout);
        ToAsciiEx(vk, scancode, SCANCODE_BUFFER, result.as_mut_ptr(), 0, layout);
        result[0] as u8 as char
    }
}

pub fn initialize() {
    unsafe {
        SCANCODE_BUFFER = [0 as u8; 256].as_mut_ptr();
        layout = GetKeyboardLayout(0);
    }
}
I've done some debugging, and it seems that the function call:
GetKeyboardState(SCANCODE_BUFFER)
is causing the program to end with this:
(exit code: 0xc0000005, STATUS_ACCESS_VIOLATION)
Does anyone know how this might be fixed?
Extra info:
SCANCODE_BUFFER is definitely not a null pointer.
Sorry for posting this. SCANCODE_BUFFER was pointing to dropped memory. I must have been extremely lucky in the past.
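For reference, one way to avoid the dangling pointer is to own the key-state buffer in a static array instead of storing a pointer into a temporary. A minimal sketch along those lines (names and structure follow the question; this is an illustration, not the poster's actual fix):

use winapi::shared::minwindef::{FALSE, HKL};
use winapi::um::winuser::{GetKeyboardLayout, GetKeyboardState, MapVirtualKeyExA, ToAsciiEx};

// The buffer lives in static storage, so it can never dangle.
static mut SCANCODE_BUFFER: [u8; 256] = [0; 256];
static mut LAYOUT: HKL = std::ptr::null_mut();

pub fn scancode_to_char(scancode: u32) -> char {
    unsafe {
        let mut result = [0u16; 2];
        if GetKeyboardState(SCANCODE_BUFFER.as_mut_ptr()) == FALSE {
            return 0 as char;
        }
        // 1 == MAPVK_VSC_TO_VK: map the scancode to a virtual-key code.
        let vk = MapVirtualKeyExA(scancode, 1, LAYOUT);
        ToAsciiEx(vk, scancode, SCANCODE_BUFFER.as_ptr(), result.as_mut_ptr(), 0, LAYOUT);
        result[0] as u8 as char
    }
}

pub fn initialize() {
    unsafe {
        LAYOUT = GetKeyboardLayout(0);
    }
}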
