I’m importing 2-D matrix data for a multi-year climate time series, currently testing on a 5-year annual dataset. A for loop imports each year's 2-D matrix as a (1500, 3600) array and appends it to a single combined (5, 1500, 3600) array, with each year as one slice along the first axis. I then run np.mean and np.std along that axis to produce (1500, 3600) matrices holding the 5-year mean and standard deviation at each grid point. Code is below. The numbers look correct when I test this, but I would like to know:
Is there a faster way to do this? I will eventually need to run this type of analysis on daily data over an 18-year span, which means building and operating on a (6570, 1500, 3600) array. Any suggestions? I’m fairly new to Python and still finding my way.
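For context, here is a tiny self-contained illustration of the stacking-and-reducing pattern I'm describing, using a small toy array in place of the real (5, 1500, 3600) data:

```python
import numpy as np

# Toy stand-in for the real data: 5 "years" of 3x4 grids,
# where year y is a grid filled with the value y.
years = [np.full((3, 4), float(y)) for y in range(5)]

# Stack along a new leading axis -> shape (5, 3, 4),
# one year per slice, matching the (5, 1500, 3600) layout.
qts = np.stack(years, axis=0)

mean = qts.mean(axis=0)  # shape (3, 4); per-cell mean over the 5 years
std = qts.std(axis=0)    # shape (3, 4); per-cell (population) std dev

print(mean[0, 0])  # 2.0 -> mean of 0, 1, 2, 3, 4
```

My actual loop over the yearly files follows.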
import numpy as np

StartYear = 2009
EndYear = 2014  # range() end is exclusive, so the loop covers 2009-2013: 5 years

for x in range(StartYear, EndYear):
    name = "/dir/climate_variable" + str(x) + ".gz"
    Q_WBM = rg.grid(name)  # rg is the grid-reading module used in my code
    Q_WBM.Load()
    q_wbm = Q_WBM.Data  # (1500, 3600); .flatten() not needed here
    q_wbm[np.isnan(q_wbm)] = 0  # treat missing values as zero
    if x == StartYear:
        QTS_array = q_wbm[np.newaxis, ...]  # start the stack: (1, 1500, 3600)
    else:
        # append each year as its own leading slice -> (5, 1500, 3600)
        QTS_array = np.append(QTS_array, q_wbm[np.newaxis, ...], axis=0)

DischargeMEAN = np.mean(QTS_array, axis=0)  # (1500, 3600) 5-year mean
DischargeSTD = np.std(QTS_array, axis=0)    # (1500, 3600) 5-year std dev
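One idea I've been considering: np.append copies the whole accumulated array on every iteration, and a (6570, 1500, 3600) float64 cube is roughly 280 GB, which likely won't fit in memory anyway. A streaming sketch that accumulates per-cell sums instead of holding every time step at once (random arrays stand in for the file reads, and the shapes are scaled down; the real reader and grid size would replace them):

```python
import numpy as np

grid_shape = (30, 40)  # stand-in for (1500, 3600)
n_steps = 5            # stand-in for 6570 daily grids

total = np.zeros(grid_shape)
total_sq = np.zeros(grid_shape)

rng = np.random.default_rng(0)
slabs = [rng.standard_normal(grid_shape) for _ in range(n_steps)]

for slab in slabs:                 # in the real code: load one file per iteration
    slab = np.nan_to_num(slab)     # same NaN -> 0 handling as above
    total += slab                  # running per-cell sum
    total_sq += slab * slab        # running per-cell sum of squares

mean = total / n_steps
# population std dev via E[x^2] - E[x]^2, matching np.std's default ddof=0
std = np.sqrt(total_sq / n_steps - mean * mean)
```

If everything does fit in memory, collecting the yearly arrays in a Python list and calling np.stack once after the loop would also avoid the repeated-copy cost of np.append. Would either of these be the recommended approach?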