
I have saved a large array of complex numbers using Python:

numpy.save(file_name, eval(variable_name))

That worked without any trouble. However, loading it with

variable_name = numpy.load(file_name)

yields the following error:

ValueError: total size of new array must be unchanged

Using Python 2.7.9 (64-bit); the file is 1.19 GB.

2 Answers


There is no problem with the size of your array; you likely didn't open your file the right way. Try this:

import numpy as np

with open(file_name, "rb") as file_:
    variable_name = np.load(file_)
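
For reference, a minimal round trip with a complex array might look like this (the file name example is just an illustration; numpy.save appends the .npy extension to a path string if it is missing):

import numpy as np

# Hypothetical example data: a complex-valued array.
arr = np.arange(6, dtype=np.complex128).reshape(2, 3)

# Saving to a path string appends ".npy" automatically if missing.
np.save("example", arr)

# Loading via an explicitly binary-mode file object, as in the answer above.
with open("example.npy", "rb") as file_:
    loaded = np.load(file_)

assert (loaded == arr).all()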



Alternatively, you can use pickle:

import pickle

# Saving: open in binary mode so the pickle stream isn't corrupted.
with open('filename.bi', 'wb') as data_file:
    pickle.dump(your_data, data_file)

# Loading: again in binary mode.
with open('filename.bi', 'rb') as data_file:
    data = pickle.load(data_file)
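
If memory overhead is a concern, passing a higher pickle protocol is usually more compact than the default text protocol. This is a sketch, not part of the original answer; your_data is the object from the code above:

import pickle

# pickle.HIGHEST_PROTOCOL selects the most efficient binary protocol
# available (protocol 2 on Python 2.7).
with open('filename.bi', 'wb') as data_file:
    pickle.dump(your_data, data_file, pickle.HIGHEST_PROTOCOL)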

3 Comments

This has significant overhead for numpy arrays; numpy.save or the HDF5 file format is preferable (see the sketch after these comments).
Thanks a lot. Yes, pickle.dump gave me a memory error for the big array. I'll try the version above in an hour.
It is weird: on Friday I ran the code repeatedly without any error (even after closing the console); today I got the error repeatedly, even after restarting Python/Spyder. (The data was stored on a local drive.) Now I have re-run the script and can't reproduce the error. Once the error happens again, I will get back here.
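
Since the first comment mentions HDF5, here is a minimal sketch using the h5py package (assuming it is installed; the file and dataset names are placeholders):

import numpy as np
import h5py

# Placeholder data: a complex-valued array.
arr = np.arange(6, dtype=np.complex128).reshape(2, 3)

# Write the array to an HDF5 file under a named dataset.
with h5py.File('data.h5', 'w') as f:
    f.create_dataset('complex_array', data=arr)

# Read it back; [...] pulls the whole dataset into memory as a numpy array.
with h5py.File('data.h5', 'r') as f:
    loaded = f['complex_array'][...]

assert (loaded == arr).all()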
