
I've found a curious quirk of the MySqlDataReader.GetBytes implementation, and I'm wondering whether it's well known, as I can't seem to find any articles about it on the net.

If you follow the code example for the SqlDataReader and apply it to the MySqlDataReader, it won't work... unless the number of bytes in the record you are retrieving is exactly divisible by the buffer size. For example, if you are on the last iteration of the loop and there are only 100 bytes left but the loop asks for another 1024 bytes, the MySqlDataReader will fail and throw an exception. The SqlDataReader will not.
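To make the failure mode concrete, here is a minimal sketch of the usual SqlDataReader chunked-read pattern applied to a MySqlDataReader (the column index, buffer size, and `reader` variable are illustrative, not taken from my actual code):

```csharp
// Illustrative sketch of the standard fixed-size-buffer pattern.
// As described above, MySqlDataReader throws on the final iteration
// if fewer than bufferSize bytes remain, because the requested length
// runs past the end of the field; SqlDataReader just returns fewer bytes.
int bufferSize = 1024;
byte[] buffer = new byte[bufferSize];
long fieldOffset = 0;
long bytesRead;

while ((bytesRead = reader.GetBytes(0, fieldOffset, buffer, 0, bufferSize)) > 0)
{
    // ... use buffer[0 .. bytesRead) ...
    fieldOffset += bytesRead;
}
```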

Unless I'm doing something wrong?

D.

  • GetBytes works great, no need for all that casting. Commented Aug 8, 2012 at 6:52

1 Answer


Rather than always requesting a full buffer's worth, ask for at most the buffer size, but also at most the number of bytes you believe are left. To be honest, you might as well create a buffer of exactly the right size rather than a fixed-size one anyway.

// I assume this works for MySqlDataReader too...
int length = (int)reader.GetBytes(column, 0, null, 0, 0);
byte[] buffer = new byte[length];
int index = 0;

while (index < length)
{
    int bytesRead = (int)reader.GetBytes(column, index,
                                    buffer, index, length - index);
    index += bytesRead;
}

But if you wanted a smaller buffer (e.g. if you were processing it a buffer at a time) you could use:

int length = (int)reader.GetBytes(column, 0, null, 0, 0);
byte[] buffer = new byte[8192]; // any fixed size smaller than the full value
int index = 0;

while (index < length)
{
    int bytesRead = (int)reader.GetBytes(column, index, buffer, 0, 
                                    Math.Min(buffer.Length, length - index));
    // Process the buffer, up to bytesRead bytes
    // ...
    index += bytesRead;
}

5 Comments

What I don't understand is that the GetBytes method takes a long for fieldOffset but an int for bufferoffset? Surely that will cause issues when working with very large binary objects?
@David: If you wanted more than 2^31 bytes in a single go, that would certainly be a problem - but this interface lets you get an amount of data from an enormous blob, but still limits you to working with the .NET array types which are limited (at least in some places - it's inconsistent) to signed 32 bit integers for indexes.
So in other words if I was storing something with more than 2147483647 bytes in a database record I'd probably have other problems :)
@David: No, you could store that much - but you wouldn't be able to get it into a .NET process as a single array, that's all. For example, you could have a 10GB file in the database and fetch it out 1MB at a time to write to local disk.
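The streaming approach described in that comment can be sketched as follows; this is a hypothetical illustration (the file name, chunk size, `reader`, and `column` are made up), not code from the answer:

```csharp
// Sketch: stream a large BLOB to disk in 1 MB chunks, never holding
// the whole value in memory. Assumes the reader was opened with
// CommandBehavior.SequentialAccess so the provider doesn't buffer
// the entire field.
const int ChunkSize = 1024 * 1024;
byte[] buffer = new byte[ChunkSize];
long fieldOffset = 0;

using (var output = File.Create("blob.bin"))
{
    long bytesRead;
    while ((bytesRead = reader.GetBytes(column, fieldOffset, buffer, 0, ChunkSize)) > 0)
    {
        output.Write(buffer, 0, (int)bytesRead);
        fieldOffset += bytesRead; // fieldOffset is a long, so it can pass 2^31
    }
}
```

Because only `fieldOffset` needs to exceed int.MaxValue, the int-sized buffer offset and length are never a limitation in this pattern.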
@Jon: I had to add some casts, and change byte index = 0; to int index = 0; so it would work. Although, you might want to treat fieldOffset and bufferoffset separately (I just throw an exception if length is > int.MaxValue and use buffer.Length). Thanks for the great answer (+1).
