How do I write to a DIBits structure in Go?

I'm using the w32 library to do windowing with the Go language. I'm not quite sure what to do with an unsafe.Pointer that will allow me to start setting pixel values in the pixel buffer.
I use an unsafe.Pointer, because that's what the w32 library expects me to pass in the CreateDIBSection function.
var p unsafe.Pointer
bitmap := w32.CreateDIBSection(srcDC, &bmi, w32.DIB_RGB_COLORS, &p, w32.HANDLE(0), 0)
That code succeeds and gives me a pointer to the memory location where the DIB bits are stored. How can I use that to write values?
p[idx] = 0xff
will give me the error "type unsafe.Pointer does not allow indexing". I've read the relevant docs on unsafe.Pointer, but can't figure out how to treat it as a byte buffer that I can write into.
I'm new to Go and have worked through a lot of the examples at gobyexample.com, but cannot figure this out.

It's just a matter of casting the unsafe.Pointer to a pointer to an array (which is indexable) in the proper way.
After trying various casts, this is the one that worked (assuming wid and hgt are each declared as const):
pixels := (*[wid*hgt*4]uint8)(p)
then I was able to change them with:
pixels[(y*wid+x)*4+0] = 0x00 // Blue
pixels[(y*wid+x)*4+1] = 0x00 // Green
pixels[(y*wid+x)*4+2] = 0x00 // Red
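In newer Go (1.17+), a minimal sketch using unsafe.Slice avoids hard-coding the array bound in the type; this assumes p is the pointer filled in by CreateDIBSection and is valid for wid*hgt*4 bytes:
pixels := unsafe.Slice((*uint8)(p), wid*hgt*4) // []uint8 view over the DIB bits
pixels[(y*wid+x)*4+2] = 0xff                   // Red channel of pixel (x, y)
Unlike the fixed-size array cast, this also works when wid and hgt are not constants.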

Related

Converting raw pointer to 16-bit Unicode character to file path in Rust

I'm replacing a DLL written in C++ with one written in Rust.
Currently the function in the DLL is called as follows:
BOOL calledFunction(wchar_t* pFileName)
I believe that in this context wchar_t is a 16-bit Unicode character, so I chose to expose the following function in my Rust DLL:
pub fn calledFunction(pFileName: *const u16)
What would be the best way to convert that raw pointer to something I could actually use to open the file from the Rust DLL?
Here is some example code:
use std::ffi::OsString;
use std::os::windows::prelude::*;
unsafe fn u16_ptr_to_string(ptr: *const u16) -> OsString {
    let len = (0..).take_while(|&i| *ptr.offset(i) != 0).count();
    let slice = std::slice::from_raw_parts(ptr, len);
    OsString::from_wide(slice)
}
// main example
fn main() {
    let buf = vec![97_u16, 98, 99, 100, 101, 102, 0];
    let ptr = buf.as_ptr(); // raw pointer
    let string = unsafe { u16_ptr_to_string(ptr) };
    println!("{:?}", string);
}
In u16_ptr_to_string, you do 3 things:
get the length of the string by counting the non-zero characters using offset (unsafe)
create a slice using from_raw_parts (unsafe)
transform this &[u16] into an OsString with from_wide
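If the pointer might be null, a hedged variant of the same routine guards first (u16_ptr_to_string_checked is a hypothetical name, not part of the original code):
use std::ffi::OsString;
use std::os::windows::prelude::*;

// Hypothetical variant: return None instead of reading through a null pointer.
unsafe fn u16_ptr_to_string_checked(ptr: *const u16) -> Option<OsString> {
    if ptr.is_null() {
        return None;
    }
    let len = (0..).take_while(|&i| *ptr.offset(i) != 0).count();
    Some(OsString::from_wide(std::slice::from_raw_parts(ptr, len)))
}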
It may be better to use wchar_t and wcslen from the libc crate, and another crate for the conversion; reimplementing something that is already maintained in a crate is probably a bad idea.
You need to use OsString, which represents the native string format used by the operating system. On Windows these are specifically 16-bit character strings (usually UTF-16).
Quoting the doc:
OsString and OsStr are useful when you need to transfer strings to and from the operating system itself, or when capturing the output of external commands. Conversions between OsString, OsStr and Rust strings work similarly to those for CString and CStr.
You first need to convert the pointer into a slice, using unsafe code:
use std::slice;

// manifest a slice out of thin air!
let ptr = 0x1234 as *const u16;
let nb_elements = 10;
unsafe {
    let slice = slice::from_raw_parts(ptr, nb_elements);
}
This assumes you know the size of your string, meaning your function should also take the number of characters as argument.
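As a sketch, the exported function could take that length explicitly (the extra len parameter and the body are illustrative assumptions, not the original DLL contract):
use std::ffi::OsString;
use std::os::windows::prelude::*;

#[no_mangle]
pub unsafe extern "C" fn calledFunction(p_file_name: *const u16, len: usize) -> bool {
    // Build the slice from the caller-supplied length, then widen to an OsString.
    let slice = std::slice::from_raw_parts(p_file_name, len);
    let name = OsString::from_wide(slice);
    !name.is_empty() // placeholder: real code would open the file here
}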
The from_wide method should be the one needed to convert from a native format:
use std::ffi::OsString;
use std::os::windows::prelude::*;
// UTF-16 encoding for "Unicode".
let arr = [0x0055, 0x006E, 0x0069, 0x0063, 0x006F, 0x0064, 0x0065];
let string = OsString::from_wide(&arr[..]);
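Since the original goal was opening a file, the resulting OsString converts into a path directly; a minimal sketch:
use std::fs::File;
use std::path::PathBuf;

// OsString -> PathBuf is a lossless move; File::open accepts it as-is.
let path = PathBuf::from(string);
let file = File::open(&path);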

How much storage does this Swift struct actually use?

Suppose I have the following struct –
struct MyStruct {
    var value1: UInt16
    var value2: UInt16
}
And I use this struct somewhere in my code like so -
var s = MyStruct(value1: UInt16(0), value2: UInt16(0))
I know that the struct will require 32 bits of storage for the two 16-bit integers.
What I am not certain about is whether Swift allocates two additional 64-bit pointers, one for each value, in addition to one 64-bit pointer for the variable s.
Does this mean total storage requirement for the above code would result in the following?
MyStruct.value1 - 16-bits
MyStruct.value1 ptr - 64-bits
MyStruct.value2 - 16-bits
MyStruct.value2 ptr - 64-bits
s ptr - 64-bits
–––––––––––––––––––––––––––––
Total - 224-bits
Can someone please clarify?
MyStruct is 4 bytes because sizeof(UInt16) is 2 bytes. To test this for any given type, use sizeof, which returns the size in bytes.
let size = sizeof(MyStruct) //4
If you want to get the size of a given instance, you can use sizeofValue.
var s = MyStruct(value1: UInt16(0), value2: UInt16(0))
let sSize = sizeofValue(s) //4
I believe the size of a pointer depends on the architecture/compiler: 64 bits on most computers and many newer phones, but older ones might be 32-bit.
I don't think there is a way to actually get a pointer to MyStruct.value1; correct me if I'm wrong (I tried &s.value1).
Pointers
Structs in Swift are created and passed around on the stack; that's why they have value semantics instead of reference semantics.
When a struct is created in a function, it is stored on the stack, so its memory is freed at the end of the function. Its address is just an offset from the stack pointer or frame pointer.
It'll be four bytes on the stack.
Just try it in an Xcode playground:
The answer is 4 bytes.
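For reference, a minimal sketch using the MemoryLayout API that replaced sizeof in Swift 3; it also reports stride and alignment, which matter when structs are packed into arrays:
struct MyStruct {
    var value1: UInt16
    var value2: UInt16
}

print(MemoryLayout<MyStruct>.size)      // 4 - bytes of actual data
print(MemoryLayout<MyStruct>.stride)    // 4 - bytes per element in an array
print(MemoryLayout<MyStruct>.alignment) // 2 - required byte alignment

var s = MyStruct(value1: 0, value2: 0)
print(MemoryLayout.size(ofValue: s))    // 4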

Allocating a buffer on the heap at runtime

I am learning Rust by writing simple binary decoder.
I'm using a BufferedReader with the byteorder crate to read numbers, but I'm having problems with reading byte buffers.
I want to read byte data into a buffer allocated at runtime.
Then I want to pass ownership of this buffer to a struct. When the struct is no longer in use, the buffer should be deallocated.
There seems to be no way to allocate an array with a size determined at runtime on the heap, except for some Vec::with_capacity() hacks. Any ideas how to implement this with proper Rust semantics?
This will create a pre-allocated, mutable 512 MiB (536,870,912-byte) buffer of zeros stored on the heap, with no need for unsafe Rust:
// Correct
let mut buffer = vec![0_u8; 536870912];
Note that the following code is not a good idea and will most likely result in a stack overflow, because the array is created on the stack before being boxed and moved to the heap.
// Incorrect - stack used
let mut bytes: Box<[u8]> = Box::new([0_u8; 536870912]);
// Incorrect - slow
let mut bytes = Vec::with_capacity(536870912);
for _ in 0..bytes.capacity() {
    bytes.push(0_u8);
}
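If the goal is handing ownership of a fixed-length heap buffer to a struct, the Vec can also be converted into a Box<[u8]> without copying; a small sketch:
// vec! allocates on the heap; into_boxed_slice() drops any excess capacity
// (none here) and yields an owned, fixed-length heap buffer.
let buffer: Box<[u8]> = vec![0_u8; 536870912].into_boxed_slice();
assert_eq!(buffer.len(), 536870912);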
Rust is a low-level language; thus you can allocate raw memory and then fill it with objects yourself. Of course, it will require unsafe code, as all fiddling with raw memory does.
Here is a complete example:
use std::{
    alloc::{self, Layout},
    ptr,
};

fn main() {
    unsafe {
        let layout = Layout::from_size_align(512 * 1024, 4 * 1024).expect("Invalid layout");
        // alloc() returns a *mut u8; cast() is clearer (and better checked) than transmute.
        let base: *mut i32 = alloc::alloc(layout).cast();
        assert!(!base.is_null(), "allocation failed");
        let mut raw = base;
        for i in 0..(512 * 1024 / 4) {
            ptr::write(raw, i as i32);
            raw = raw.offset(1);
        }
        // Manually allocated memory must be manually freed.
        alloc::dealloc(base.cast(), layout);
    }
}
Of course, in real code, I would just use Vec to safely manage the memory for me. It's just simpler!
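For comparison, a one-line sketch of the same fill with Vec managing the memory (no unsafe at all):
// The same 512 KiB of consecutive i32 values, allocated and freed by Vec.
let data: Vec<i32> = (0..(512 * 1024 / 4) as i32).collect();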
I tried using box, but it seems it is experimental and I can't use it with the release branch. Any ideas how to implement this with proper Rust semantics?
This is covered in The Rust Programming Language, specifically the section "Using Box<T> to Point to Data on the Heap".
Use Box::new:
fn main() {
    let answer: Box<u8> = Box::new(42);
}
See also:
Allocate array onto heap with size known at runtime
Is there any way to allocate a standard Rust array directly on the heap, skipping the stack entirely?
How to allocate arrays on the heap in Rust 1.0?
Creating a fixed-size array on heap in Rust
How do I allocate an array at runtime in Rust?
Thread '<main>' has overflowed its stack when allocating a large array using Box

C++ builder - convert UnicodeString to UTF-8 encoded string

I'm trying to convert a UnicodeString to a UTF-8 encoded string in C++Builder. I use the UnicodeToUtf8() function to do that.
char *dest;
UnicodeString src;
UnicodeToUtf8(dest, 256, src.w_str(), src.Length());
but at runtime I get an access violation. What am I doing wrong?
Assuming you are using C++Builder 2009 or later (you did not say), and are using the RTL's System::UnicodeString class (and not some other third-party UnicodeString class), then there is a much simpler way to handle this situation. C++Builder also has a System::UTF8String class available (it has been available since C++Builder 6, but did not become a true RTL-implemented UTF-8 string type until C++Builder 2009). Simply assign your UnicodeString to a UTF8String and let the RTL handle the memory allocation and data conversion for you, e.g.:
UnicodeString src = ...;
UTF8String dest = src; // <-- automatic UTF16-to-UTF8 conversion
// use dest.c_str() and dest.Length() as needed...
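For instance, a short sketch of both directions (the C API name is a hypothetical stand-in for any char*-based function):
UnicodeString src = L"Test this";
UTF8String utf8 = src;     // UTF-16 -> UTF-8
// some_c_api(utf8.c_str(), utf8.Length()); // hypothetical char*-based API
UnicodeString back = utf8; // UTF-8 -> UTF-16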
This fixes the problem in the question, but the real way to do a UTF16 to UTF8 conversion is in Remy's answer above.
dest is a pointer to a random space in memory because you do not initialize it. In debug builds it probably points to 0, but in release builds it could be anywhere. Yet you are telling UnicodeToUtf8 that dest is a buffer with room for 256 characters.
Try this
char dest[256]; // room for 256 characters
UnicodeString src = L"Test this";
UnicodeToUtf8(dest, 256, src.w_str(), src.Length());
But in reality you can use the easier:
char dest[256]; // room for 256 characters
UnicodeString src = L"Test this";
UnicodeToUtf8(dest, src.w_str(), 256);
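If hard-coding 256 bytes is a concern, a hedged sketch that sizes the destination from the source instead (3 UTF-8 bytes per UTF-16 code unit is a safe upper bound, plus a null terminator):
#include <vector>

UnicodeString src = L"Test this";
std::vector<char> dest(src.Length() * 3 + 1); // worst-case UTF-8 size + null
UnicodeToUtf8(dest.data(), dest.size(), src.w_str(), src.Length());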

WSASend : Send int or struct

I would like to use the MS functions to send data.
I didn't find any examples that send a type of data other than const char *.
I tried to send an int, among other types, but I failed.
WSASend() and send() both take only char* parameters.
How should I proceed?
Thanks
It's just a pointer to a buffer; the buffer may contain anything you want.
The char pointer is actually the address of a byte array, and the function takes a length parameter too.
An integer is a 2- or 4-byte (short/long) value,
so if you want to send an integer variable (for example), you have to pass its address and its length.
WSASend and send are simple functions that send a memory block.
I assume you are talking about C. You have to understand that C's char variables are bytes (8-bit blocks); a char can hold any value between 0 and 255.
A pointer to a char variable is the address of a byte (which may be the first cell of a byte array).
I think that's what confuses you.
I hope this helps.
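A minimal sketch of what this describes (assuming a connected SOCKET s; htonl() fixes the byte order so both ends agree):
int value = 42;
u_long wire = htonl((u_long)value);           // convert to network byte order
send(s, (const char*)&wire, sizeof(wire), 0); // pass the address and the length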
The const char* parameter indicates that the function takes a pointer to bytes. Which really seems to be the result of the original socket API designers being pedantic: C has a generic type to handle any kind of pointer without explicit casts, void*.
You could make a convenience wrapper for send like this, which would allow you to send any (contiguous) thing you can make a pointer to:
int MySend(SOCKET s, const void* buf, int len, int flags)
{
    return send(s, (const char*)buf, len, flags);
}
Using void* in place of char* actually makes the API safer, as it can now detect when you do something stupid:
int x = 0x1234;
send(s, (const char*)x, sizeof(x), 0); // looks right, but is wrong.
MySend(s, x, sizeof(x), 0);            // this version correctly fails to compile
MySend(s, &x, sizeof(x), 0);           // correct - pass a pointer to the buffer to send.
WSASend is a bit more tricky to make a convenience wrapper for, as you have to pass it an array of structs that contain the char*'s. But again, it's a case of defining an equivalent struct with const void*'s in place of the const char*'s and casting the data structures to the WSA types inside the wrapper. Get it right once, and the rest of the program becomes much easier to verify as correct, because you don't need casts everywhere hiding potential bugs.
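A hedged sketch of such a WSASend wrapper, for the simple single-buffer, non-overlapped case:
int MyWSASend(SOCKET s, const void* buf, ULONG len, DWORD flags)
{
    WSABUF wsaBuf;
    wsaBuf.buf = (CHAR*)buf; // WSABUF holds a non-const CHAR*, hence the cast
    wsaBuf.len = len;
    DWORD sent = 0;
    // one buffer, no OVERLAPPED, no completion routine
    return WSASend(s, &wsaBuf, 1, &sent, flags, NULL, NULL);
}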
