I have a NumPy problem I've been working on for three hours and can't crack. It's a five-part problem and I've figured out four of the five parts, but this last one has me stuck. Given a 3-dimensional array called "X", how would you find the index of the row with the smallest standard deviation in each layer?
So far I have this:
min_std_row_layer = X.std(axis="").argmin()
But I don't know if that's even a good starting point.
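To make the question concrete, here is the brute-force loop version of what I'm after, on a small made-up array, assuming X has shape (layers, rows, columns) so each row's standard deviation is taken across its columns:

    import numpy as np

    # Hypothetical example: 3 layers, 4 rows, 5 columns.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(3, 4, 5))

    # For each layer, compute every row's standard deviation and
    # keep the index of the smallest one.
    expected = []
    for layer in X:                        # layer has shape (rows, columns)
        row_stds = [row.std() for row in layer]
        expected.append(int(np.argmin(row_stds)))

    print(expected)  # one row index per layer

What I can't work out is which axis arguments would make a single X.std(...).argmin(...) expression produce the same result without the loops.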