
I have a byte array that will always be 10 megs. No matter what data is in it, it will be ten megs. The purpose has to do with the Large Object Heap (more information).

So if a 1 MB file is placed into the byte[], the last nine megs are zeroed out. The problem is that when I upload this to SQL Server, the uploaded file is always 10 megs. I'd much rather upload only the necessary bytes.

I have the size before it's copied into the byte array, so I could trim it if need be. The problem is, I don't know how to do that efficiently without creating a new byte[] on the LOH.

Ideas?
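For context, the setup looks roughly like this (names here are simplified/hypothetical): one 10 MB array allocated once and refilled for each file, with the real byte count tracked separately.

// One reused 10 MB buffer; the LOH only ever sees a single long-lived allocation.
private const int BufferSize = 10 * 1024 * 1024;
private static readonly byte[] _documentBuffer = new byte[BufferSize];

// Refill the reused buffer and return how many bytes are actually meaningful.
private static int LoadDocument(string path)
{
    using (FileStream fs = File.OpenRead(path))
    {
        int length = (int)fs.Length;   // known up front, always <= 10 MB here
        int offset = 0;
        while (offset < length)
        {
            offset += fs.Read(_documentBuffer, offset, length - offset);
        }
        // Zero the tail so stale data from the previous file isn't left behind.
        Array.Clear(_documentBuffer, length, BufferSize - length);
        return length;                 // only the first 'length' bytes matter
    }
}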

Updated #1

Here is some pseudo code. I've removed most of the unnecessary code. It uses the Microsoft.Practices.EnterpriseLibrary.Data library to access the database. (Legacy code, can't change.)

Param[] parametersArray = new Param[10];
// other params
parametersArray[4] = new Param("DocumentData", DbType.Binary, documentToSave.Data);
// other params
DataAccess.ExecuteNonQuery("esMD.proc_WS_UpdateDocument", parametersArray);

public static int ExecuteNonQuery(string spName, params Param[] parameters)
{
    int ret = -1;

    DbCommand storedProcedure = MedicareDatabase.Db.GetStoredProcCommand(spName);
    storedProcedure.CommandTimeout = commandTimeout;

    if (parameters != null)
    {
        foreach (Param parameter in parameters)
        {
            if (parameter != null)
            {
                MedicareDatabase.Db.AddInParameter(storedProcedure, parameter.ParameterName, parameter.DbType, parameter.Value);
            }
        }
    }

    try
    {
        ret = MedicareDatabase.Db.ExecuteNonQuery(storedProcedure);
    }
    catch (Exception)
    {
        throw;
    }
    finally
    {
        if (storedProcedure != null)
        {
            storedProcedure.Dispose();
        }
    }
    return ret;
}

Updated #2

I modified the database call above and also modified how I entered Parameters.

if (parameter.Size != null && parameter.Size > 0)
{
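    // Hedged assumption: for a varbinary input parameter, an explicit Size should
    // make the provider send only the first Size bytes of the reused 10 MB buffer,
    // rather than the whole array.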
    MedicareDatabase.Db.AddParameter(storedProcedure, parameter.ParameterName, DbType.Binary, parameter.Size, ParameterDirection.Input, true, 0, 0, string.Empty, DataRowVersion.Default, parameter.Value); 
}
else
{
    MedicareDatabase.Db.AddInParameter(storedProcedure, parameter.ParameterName, parameter.DbType, parameter.Value);
}

This seems to be working for me. Does anyone see any issues with this?

-Chad

  • Show us the code you use to "upload the file to SQL Server" Commented Feb 7, 2012 at 21:53
  • Updated with the database call. Commented Feb 7, 2012 at 22:02
  • Where did you find the information that it was more memory-efficient to use an array larger than you actually needed? I can't see anything in the answers to the question you linked that suggests so. On the contrary, the answers recommend using several small arrays over one large one, but that's obviously not applicable in your case. Commented Feb 7, 2012 at 22:25
  • The need came from getting out-of-memory exceptions. If I process too many files at once, my LOH gets fragmented and I can't add new files to it, and it errors out. So if I constantly refill an existing array, I don't run into that issue. Commented Feb 7, 2012 at 22:35
  • If you're using SQL Server 2008, FILESTREAM could be an alternative, since it supports seeking and writing from a stream directly. Commented Feb 7, 2012 at 22:35

1 Answer


Create a MemoryStream from the byte[] and desired length. Use the stream to construct a SqlBytes instance. Build the parameter using the SqlBytes.
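For concreteness, a sketch of that approach against the code in the question, assuming documentToSave.Data is the reused 10 MB buffer, documentToSave.Size is the real byte count (a hypothetical property name), and the provider underneath Enterprise Library is System.Data.SqlClient, whose SqlParameter.Value accepts a SqlBytes (from System.Data.SqlTypes):

// MemoryStream(byte[], int, int) wraps the existing array without copying it,
// and SqlBytes wraps the stream, so no new byte[] lands on the LOH and only
// the first 'Size' bytes are exposed to the parameter.
MemoryStream dataStream = new MemoryStream(documentToSave.Data, 0, documentToSave.Size);
SqlBytes documentBytes = new SqlBytes(dataStream);

// Pass the SqlBytes as the parameter value instead of the raw 10 MB array.
parametersArray[4] = new Param("DocumentData", DbType.Binary, documentBytes);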

As a side comment, your large data should never be materialized as a byte[]. It should live as a stream throughout processing. See Download and Upload images from SQL Server via ASP.Net MVC for an actual example of how to keep a large object as a stream, never materialized into a large byte[], end to end on both the download and upload paths.
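A rough sketch of that stream-only idea, assuming a table like dbo.Documents(Id int, Data varbinary(max)) whose Data value has already been initialized to 0x, an open SqlConnection connection, an input Stream source, and a documentId variable (all hypothetical names):

// Append the incoming stream in small chunks with UPDATE ... SET Data.WRITE,
// so a full-size byte[] is never created; the 80 KB chunk stays under the
// 85 KB LOH threshold. Setting the parameter Size to 'read' sends only the
// bytes actually read, even though the chunk array is reused each pass.
byte[] chunk = new byte[80 * 1024];
int read;
while ((read = source.Read(chunk, 0, chunk.Length)) > 0)
{
    using (SqlCommand append = new SqlCommand(
        "UPDATE dbo.Documents SET Data.WRITE(@chunk, NULL, NULL) WHERE Id = @id",
        connection))
    {
        SqlParameter chunkParam = append.Parameters.Add("@chunk", SqlDbType.VarBinary, read);
        chunkParam.Value = chunk;
        append.Parameters.Add("@id", SqlDbType.Int).Value = documentId;
        append.ExecuteNonQuery();
    }
}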


1 Comment

I'm using iTextSharp to cut the large PDFs into smaller chunks. iTextSharp requires a byte[] (I think).
