3

I'm working on a program that modifies a file, and I'm wondering if the way I'm working with it is wrong.

The file is stored in blocks inside another file, with the blocks separated by hashes. It's only about 1 MB in size, so I just calculate its location once, read it into a byte array, and work with it like that.

I'm wondering if it's some kind of horrendous programming habit to read an entire file, regardless of its size, into a byte array in memory. It is the sole purpose of my program, though, and is about the only memory it takes up.
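For concreteness, here is a minimal sketch of the pattern being asked about, written in Python (the question's context is .NET, but the idea is language-independent). The function names and the hard-coded offset/length are illustrative; the question states the location is computed once by other means.

```python
def load_embedded_file(container_path, offset, length):
    """Read one embedded file out of the container into memory."""
    with open(container_path, "rb") as f:
        f.seek(offset)                    # jump to the precomputed location
        data = bytearray(f.read(length))  # mutable in-memory copy
    return data

def save_embedded_file(container_path, offset, data):
    """Write the (possibly modified) bytes back in place."""
    with open(container_path, "r+b") as f:
        f.seek(offset)
        f.write(data)
```

All modifications then happen on the `bytearray` in memory, and the container is only touched on load and save.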

3 Answers

2

This depends entirely on the expected size (range) of the files you will be reading in. If your input files can reach over a hundred MB in size, this approach doesn't make much sense.

If your input files are small relative to the memory of machines your software will run on, and your program design benefits from having the entire contents in memory, then it's not horrendous; it's sensible.

However, if your software doesn't actually require the entire file's contents in memory, then there's not much of an argument for doing this (even for smaller files).


1 Comment

OK, good, that's what I thought. The files usually range from about 500 KB to 1.5 MB, and the people using the program will generally have 2 GB or more of RAM.
1

If you require random read/write access to the file in order to modify it, then reading it into memory is probably fine, as long as you can be sure the file will never exceed a certain size (you don't want to read a few-hundred-MB file into memory).

Usually using a stream reader (like a BinaryReader) and processing the data as you go is a better option.
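To illustrate the streaming alternative: `BinaryReader` is .NET, but the same idea can be sketched in Python. The helper below is an assumed, illustrative design; it reads the file in fixed-size chunks and hands each chunk to a callback, so memory use stays bounded regardless of file size.

```python
def process_in_chunks(path, handle_chunk, chunk_size=64 * 1024):
    """Stream a file through handle_chunk without ever holding it all in memory."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:       # empty read means end of file
                break
            handle_chunk(chunk)
```

This works well for one-pass operations (hashing, counting, filtering); it is a poor fit when, as in the question, you need random read/write access to the data.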

1 Comment

It's also a plus that the file can be opened elsewhere while it's being modified. It won't exceed 1.5 MB either.
0

It's horrendous -- like most memory-/CPU-hogging activities -- if you don't have to do it.

5 Comments

I can see the point about memory hogging (potentially), but how would this hog CPU?
@M.Babcock Marshalling data from disk to memory surely involves some series of read and move instructions.
@robjb - Perhaps, but not enough to be called a 'hog'. Besides, those are read and move instructions that would have to happen whether you read the entire file into memory or not. What's worse is performing random access operations on disk instead of just in memory.
@M.Babcock Good point, I definitely agree. I just thought I'd mention that CPU time is definitely used.
That really depends at what point you call something a "hog". I usually define it as when something uses grossly more of something than it needs to. Reading a whole file when you only need a known section of it qualifies (imo).
