I have a trait, MyTrait, that defines a method called my_func that returns an array of unknown size. The reason for this is that the size of the array returned from this method will depend on the struct that implements the trait. So, the method definition looks like this:
trait MyTrait<T: Clone + Float> {
    fn my_func() -> &[&[T]];
}
Now, I'm trying to implement this trait with the following method:
impl MyStruct {
    const matrix_desc: [[u32; 4]; 4] = [
        [1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 1]
    ];
}
impl MyTrait<u32> for MyStruct {
    fn my_func() -> &[&[u32]] {
        &Self::matrix_desc
    }
}
However, I keep getting errors saying there's a type mismatch: Rust expected a &[&[u32]] but got a &[[u32; 4]; 4]. The error itself is clear to me, but I'm not sure how to fix it. I've also tried using the Sized trait, but I can't get that to work either. Does anyone know how I can return arrays whose size is unknown in the trait definition but fixed at compile time for each implementor?
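For context, here is one way to make something like this compile without an unsized return type: let each implementor name its own concrete matrix type through an associated type. This is a sketch, not the only possible fix; the `Float` bound from the question (presumably from the num-traits crate) is dropped here to keep the example self-contained, and the const is renamed to uppercase per Rust convention.

```rust
// Sketch: each implementor chooses its own fixed-size matrix type
// through an associated type, so the trait never needs an unsized return.
trait MyTrait<T: Clone> {
    type Matrix;
    fn my_func() -> &'static Self::Matrix;
}

struct MyStruct;

impl MyStruct {
    // 4x4 identity matrix for this particular implementor.
    const MATRIX_DESC: [[u32; 4]; 4] = [
        [1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 1],
    ];
}

impl MyTrait<u32> for MyStruct {
    type Matrix = [[u32; 4]; 4];
    fn my_func() -> &'static Self::Matrix {
        // A reference to a const expression like this is promoted to 'static.
        &Self::MATRIX_DESC
    }
}

fn main() {
    let m = <MyStruct as MyTrait<u32>>::my_func();
    assert_eq!(m.len(), 4);
    assert_eq!(m[3][3], 1);
    println!("ok");
}
```

Another struct can then set `type Matrix = [[u32; 8]; 8]` with no change to the trait itself.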
Also, for bonus points, does anyone know how I can force this 2D array to be square?
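For the bonus question: since Rust 1.51, const generics let a single size parameter serve as both dimensions, which makes every implementation square by construction. A sketch, assuming the side length can become a trait parameter (trait and method names here are illustrative):

```rust
// Sketch: one const parameter N is used for both dimensions,
// so every implementation is an N x N (square) matrix by construction.
trait SquareMatrix<T, const N: usize> {
    fn matrix() -> &'static [[T; N]; N];
}

struct MyStruct;

impl SquareMatrix<u32, 4> for MyStruct {
    fn matrix() -> &'static [[u32; 4]; 4] {
        // This array literal is a const expression, so the reference
        // is promoted to 'static.
        &[
            [1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1],
        ]
    }
}

fn main() {
    let m = <MyStruct as SquareMatrix<u32, 4>>::matrix();
    // Both dimensions are guaranteed equal by the single N parameter.
    assert_eq!(m.len(), m[0].len());
    println!("{}x{}", m.len(), m[0].len());
}
```

A non-square impl such as `SquareMatrix<u32, 4>` returning `[[u32; 4]; 8]` simply won't type-check.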
Comments:
- Vec would help. Is there a reason why you can't return a Vec<Vec<T>>? [[T;4];4].
- But he said he wants unknown sizes. That's what Vec does.
- All Vec adds is ownership. If it's not needed, it shouldn't be there.
- (asker) There might be a StructB that implements a [[T;8];8], for example. I do not want the matrix size to change once implemented, but I can't determine what the size is in the trait definition.
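To illustrate the Vec suggestion from the comments: returning an owned Vec<Vec<T>> sidesteps the sizing problem entirely, at the cost of heap allocation and giving the caller ownership, and it cannot enforce a fixed per-implementor size at compile time. A sketch (names are illustrative):

```rust
trait MyTrait<T: Clone> {
    // Owned return: each implementor can produce any size,
    // but the size is only a runtime property.
    fn my_func() -> Vec<Vec<T>>;
}

struct MyStruct;

impl MyTrait<u32> for MyStruct {
    fn my_func() -> Vec<Vec<u32>> {
        // Build a 4x4 identity matrix row by row.
        (0..4)
            .map(|i| (0..4).map(|j| u32::from(i == j)).collect())
            .collect()
    }
}

fn main() {
    let m = <MyStruct as MyTrait<u32>>::my_func();
    assert_eq!(m.len(), 4);
    assert_eq!(m[1][1], 1);
    assert_eq!(m[1][2], 0);
    println!("ok");
}
```

This matches the trade-off raised in the comments: Vec buys flexibility and ownership, while the associated-type and const-generic approaches keep the size in the type system.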