
I want to upload a large file (about 1 GB) to my Firebird database, but I keep getting a 'System.OutOfMemoryException' thrown by the code below.

FbTransaction trans = conn.BeginTransaction();
string updateSQL = string.Format( @"UPDATE {0} SET {1} = :DATA WHERE {2} RETURNING DATA", tableName, colName, whereStr );
using( FbCommand getBlobCmd = new FbCommand( updateSQL, conn ) )
{
  try
  {
    getBlobCmd.Transaction = trans;
    FbParameter parBlob = new FbParameter( "DATA", FbDbType.Binary );
    parBlob.Direction = ParameterDirection.Output;
    // Loading the whole 1 GB file into a single byte[] is what blows up:
    parBlob.Value = File.ReadAllBytes( filePath ); // System.OutOfMemoryException
    getBlobCmd.Parameters.Add( parBlob );
    getBlobCmd.ExecuteNonQuery();
    trans.Commit();
  }
  catch
  {
    if( trans != null )
      trans.Rollback();
    throw; // don't silently swallow the exception
  }
}

I understand that I need to write the data in chunks, but there doesn't seem to be a class in the .NET data provider with that functionality. What should I use in this situation? Thanks!

  • Saving such large data directly in the DB is usually bad practice. Could you change that to just save a link and put the file in the filesystem? Otherwise there is not much you can do if the data provider does not support streams... Commented Apr 17, 2014 at 13:55
  • I agree, store your BLOB in a file store somewhere and use the database to serve up the path (see the sketch after these comments). This also has the side advantage of not congesting your DB's network traffic with GBs of BLOB traffic. Commented Apr 17, 2014 at 14:26
  • Unfortunately, the data is needed in the database. Commented Apr 17, 2014 at 15:17
  • Unfortunately there doesn't seem to be a way in the Firebird .NET provider to stream the blob to Firebird, and as far as I can tell you can't simply access the low-level API to do this either. Commented Apr 17, 2014 at 18:22
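For reference, a minimal sketch of the file-store approach suggested above; the store location, table and column names (MY_TABLE, FILE_PATH, ID) and rowId are made up for illustration:

    using System.IO;
    using FirebirdSql.Data.FirebirdClient;

    // Copy the large file into a dedicated file store and record only its path in the database.
    string storedPath = Path.Combine( @"\\fileserver\blobstore", Path.GetFileName( filePath ) );
    File.Copy( filePath, storedPath, true );

    using( FbCommand cmd = new FbCommand( "UPDATE MY_TABLE SET FILE_PATH = :PATH WHERE ID = :ID", conn ) )
    {
        cmd.Parameters.Add( "PATH", FbDbType.VarChar ).Value = storedPath;
        cmd.Parameters.Add( "ID", FbDbType.Integer ).Value = rowId;
        cmd.ExecuteNonQuery();
    }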

3 Answers


Currently you have basically only one option (assuming you want to go via the .NET provider): switch your app to x64 (if you haven't done that already) and buy enough RAM (or configure a big swap file). Not the best solution, I know.
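A quick sanity check in code (a minimal sketch, nothing Firebird-specific; it only verifies the process is 64-bit before attempting the large allocation):

    using System;
    using System.IO;

    // File.ReadAllBytes needs one contiguous byte[]; in a 32-bit process a ~1 GB
    // allocation routinely fails even when plenty of physical RAM is free.
    if( !Environment.Is64BitProcess )
        throw new InvalidOperationException(
            "Build the application with PlatformTarget = x64 before loading files this large into memory." );

    byte[] data = File.ReadAllBytes( filePath ); // ~1 GB buffer; fine in a 64-bit process with enough RAM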

You can also give some love to http://tracker.firebirdsql.org/browse/DNET-279. Once that is implemented, you will be able to upload files of basically unlimited size.


I solved the problem by adding a method to BlobBase.cs (in the provider source):

    public void Write( Stream stream )
    {
        try
        {
            this.Create();

            // Note: the (int) cast limits this to streams smaller than 2 GB.
            int length = (int)stream.Length;
            int chunk  = length >= this.segmentSize ? this.segmentSize : length;

            byte[] tmpBuffer = new byte[ chunk ];

            while( length > 0 )
            {
                if( chunk > length )
                {
                    chunk = length;
                    tmpBuffer = new byte[ chunk ];
                }

                // Stream.Read may return fewer bytes than requested,
                // so keep reading until the current segment is full.
                int read = 0;
                while( read < chunk )
                {
                    int n = stream.Read( tmpBuffer, read, chunk - read );
                    if( n == 0 )
                        throw new IOException( "Unexpected end of stream." );
                    read += n;
                }

                this.PutSegment( tmpBuffer );
                length -= chunk;
            }

            this.Close();
        }
        catch (Exception)
        {
            // Cancel the blob and rethrow the exception
            this.Cancel();

            throw;
        }
    }

And in FbCommand.cs, where the blob parameter value is written:

BlobBase blob = this.statement.CreateBlob();
// Stream values are written to the blob in segments; byte[] values keep the old behaviour.
if( this.Parameters[ index ].Value is System.IO.Stream )
{
  blob.Write( (System.IO.Stream)this.Parameters[ index ].Value );
}
else
{
  blob.Write( (byte[])this.Parameters[ index ].Value );
}

And the final usage snippet:

string updateSQL = string.Format( @"UPDATE {0} SET {1} = :DATA WHERE {2}", tableName, colName, whereStr );

using( FbCommand getBlobCmd = new FbCommand( updateSQL, conn ) )
{
  FbParameter parBlob = new FbParameter( "DATA", FbDbType.Binary );
  parBlob.Direction = ParameterDirection.Input;
  parBlob.Value = blobFile.InnerStream; // the input FileStream, streamed to the blob in segments
  getBlobCmd.Parameters.Add( parBlob );
  getBlobCmd.Prepare();
  getBlobCmd.ExecuteNonQuery();
}

Thanks for the help!


You can write and read BLOB data (text and binary) larger than 1 GB :)

"IBProvider"+"LCPI .NET Data Provider" have the native support for System.IO.Stream and System.IO.TextReader for avoiding any limitations of physical memory.

Usage of System.IO.Stream: http://www.ibprovider.com/eng/examples/lcpi_oledb_net__c001__example_0001.html

Sample of working with System.IO.FileStream (in Russian): http://firebirder.ru/firebird_interbase__and__system_io_stream__system_io_textreader
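For illustration, a generic ADO.NET sketch of what passing a stream as a parameter value could look like; the provider invariant name, table name, and parameter syntax below are assumptions, not taken from the LCPI documentation:

    using System.Data.Common;
    using System.IO;

    // Hypothetical invariant name; check the IBProvider/LCPI documentation for the real one.
    DbProviderFactory factory = DbProviderFactories.GetFactory( "LCPI.IBProvider" );

    using( DbConnection conn = factory.CreateConnection() )
    using( DbCommand cmd = conn.CreateCommand() )
    using( FileStream fs = File.OpenRead( filePath ) )
    {
        conn.ConnectionString = connectionString; // assumed to be defined elsewhere
        conn.Open();

        // Hypothetical table/column; the parameter marker syntax may differ for this provider.
        cmd.CommandText = "UPDATE MY_TABLE SET DATA = :DATA WHERE ID = 1";

        DbParameter parBlob = cmd.CreateParameter();
        parBlob.ParameterName = "DATA";
        // Per the answer above, the provider accepts a System.IO.Stream as the parameter value,
        // so the file is streamed into the BLOB instead of being buffered in memory.
        parBlob.Value = fs;
        cmd.Parameters.Add( parBlob );

        cmd.ExecuteNonQuery();
    }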

Have a look at the other samples on the IBProvider web site for more advanced techniques for working with Firebird BLOBs :)

Regards.
