
I'd like to upload big files from ASP.NET to a WCF service. Up to 100 MB there is no problem and my configuration works perfectly, but above 100 MB it throws a System.OutOfMemoryException.

The upload method works with a FileStream, but before calling it I save the file to a temporary folder. I'm not sure whether this is the problem or something else. Here is the code of my controller, which takes care of calling the WCF service.

    [HttpPost]
    public ActionResult Upload()
    {
        if (Request.Files.Count > 0)
        {
            var file = Request.Files[0];

            if (file != null && file.ContentLength > 0)
            {
                string fileName = Path.GetFileName(file.FileName);
                var path = Path.Combine(Server.MapPath("~/App_Data/Images"), fileName);

                file.SaveAs(path);

                FileStream fsSource = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);

                TileService.TileServiceClient client = new TileService.TileServiceClient();
                client.Open();
                client.UploadFile(fileName, fsSource);
                client.Close();

                fsSource.Dispose();
                if (System.IO.File.Exists(path))
                {
                    System.IO.File.Delete(path);
                }
            }
        }

        return RedirectToAction("");
    }

The method is called like this:

@using (Html.BeginForm("Upload", "Home", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
<input type="file" name="FileUploader" />
<br />
<input type="submit" name="Submit" id="Submit" value="Upload file" />
}

In the ASP.NET web.config I have already set executionTimeout, maxRequestLength, requestLengthDiskThreshold and maxAllowedContentLength.
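Roughly like this (the values below are what I'm aiming for, around 2 GB; the exact numbers in my real config may differ slightly):

<system.web>
  <!-- maxRequestLength and requestLengthDiskThreshold are in KB, executionTimeout in seconds -->
  <httpRuntime executionTimeout="3600" maxRequestLength="2097152" requestLengthDiskThreshold="8192" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes (2 GB here) -->
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>

Below is the binding part of the configuration.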

<basicHttpBinding>
    <binding name="BasicHttpBinding_ITileService"
      closeTimeout="24:01:00" openTimeout="24:01:00"
      receiveTimeout="24:10:00" sendTimeout="24:01:00"
      allowCookies="false" bypassProxyOnLocal="false"
      hostNameComparisonMode="StrongWildcard"
      maxBufferPoolSize="4294967295" maxBufferSize="2147483647"
      maxReceivedMessageSize="4294967295"
      textEncoding="utf-8" transferMode="Streamed"
      useDefaultWebProxy="true" messageEncoding="Text">
      <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
        maxArrayLength="2147483647" maxBytesPerRead="2147483647"
        maxNameTableCharCount="2147483647" />
      <security mode="None">
        <transport clientCredentialType="None" proxyCredentialType="None" realm="" />
        <message clientCredentialType="UserName" algorithmSuite="Default" />
      </security>
    </binding>
  </basicHttpBinding>
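
For completeness, the service side uses a streamed message contract along these lines (a simplified sketch, so the exact attributes and member names may differ from my real contract):

using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface ITileService
{
    // Single upload operation; the generated client proxy exposes it
    // roughly as UploadFile(fileName, stream), which is what the controller calls.
    [OperationContract]
    void UploadFile(FileUploadMessage request);
}

[MessageContract]
public class FileUploadMessage
{
    // The file name travels in the SOAP header so the body can stay a single raw stream.
    [MessageHeader]
    public string FileName;

    // The actual file content; with transferMode="Streamed" WCF does not buffer the whole body.
    [MessageBodyMember(Order = 1)]
    public Stream FileData;
}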

2 Answers


I don't think the problem was in the code. The ASP.NET project was hosted in IIS Express instead of local IIS; since I changed that in the project properties, everything works smoothly.

I'm now using @nimeshjm's code though. Thanks for your help!


You can try reading it in chunks using Request.Files[0].InputStream.

Something along these lines:

    public ActionResult Upload()
    {
        if (Request.Files.Count > 0)
        {
            var file = Request.Files[0];

            if (file != null && file.ContentLength > 0)
            {
                string fileName = Path.GetFileName(file.FileName);
                var path = Path.Combine(Server.MapPath("~/App_Data/Images"), fileName);

                using (var fs = new FileStream(path, FileMode.OpenOrCreate))
                {
                    var buffer = new byte[1024];
                    int count;
                    while ((count = file.InputStream.Read(buffer, 0, 1024)) > 0)
                    {
                        fs.Write(buffer, 0, count);
                    }
                }

                FileStream fsSource = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);

                TileService.TileServiceClient client = new TileService.TileServiceClient();
                client.Open();
                client.UploadFile(fileName, fsSource);
                client.Close();

                fsSource.Dispose();
                if (System.IO.File.Exists(path))
                {
                    System.IO.File.Delete(path);
                }
            }
        }

        return RedirectToAction("");
    }

Comments

Might be a good idea, but it throws: An exception of type 'System.ArgumentException' occurred in System.Web.dll but was not handled in user code. Additional information: Destination array was not long enough. Check destIndex and length, and the array's lower bounds. Any ideas? :-/
The exception comes from while ((count += file.InputStream.Read(buffer, count, 1024)) > 0). Not sure if byte[1024] is enough? I should be able to upload files up to 2 GB; I tested with a 270 MB file.
A slight bug in the original sample :) I've updated the example, give that a go.
Now your code works for small files (tested with 33 MB), but I still get System.OutOfMemoryException for big files (tested with 270 MB) :(
Which line throws that exception?