I'm trying to represent an arbitrarily dimensioned matrix in Java as a 1-d array, and I'm struggling with the implementation of the function that calculates the correct index into the array given a location in the matrix. Here's what I have so far:
private int indexOf(int... indices) {
    assert indices.length == rank() :
        "Number of indices does not match rank";
    if (indices.length == 1) return indices[0];
    int x = 0;
    for (int i = 1; i < indices.length; ++i) {
        x += indices[i] % this.dims[i];
    }
    return x;
}
Here rank() returns the number of dimensions in the matrix, and this.dims is an array holding the size of each dimension.
I know i = y * numCols + x works for the 2-d version, but I'm having trouble abstracting that out to a matrix with a variable number of dimensions.
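To illustrate what I mean, here is the 2-d formula I have working and what I believe its 3-d extension would look like, assuming a row-major layout (the class and method names here are just for illustration, not my actual code):

```java
public class IndexSketch {
    // 2-d case: i = y * numCols + x
    static int index2d(int y, int x, int numCols) {
        return y * numCols + x;
    }

    // 3-d case (my guess at the next step): i = (z * numRows + y) * numCols + x,
    // i.e. each earlier index gets multiplied by the sizes of all later dimensions.
    static int index3d(int z, int y, int x, int numRows, int numCols) {
        return (z * numRows + y) * numCols + x;
    }

    public static void main(String[] args) {
        // 2x3 matrix: element (1, 2) lands at 1*3 + 2 = 5
        System.out.println(index2d(1, 2, 3));
        // 2x3x4 matrix: element (1, 2, 3) lands at (1*3 + 2)*4 + 3 = 23
        System.out.println(index3d(1, 2, 3, 3, 4));
    }
}
```

It's that "multiply by the sizes of all later dimensions" pattern that I can't see how to turn into a single loop over indices of arbitrary length.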
Thanks for the help.
int value = arr[4][2][87][5] instead of int value = arr.indexOf(4, 2, 87, 5).