
I had it working, but I noticed that once the files I was uploading got bigger (around 4000 KB) the controller was no longer being called.

So I added chunking, which fixed that problem, but now when I open the uploaded file it is full of garbage characters.

So what is the correct way to upload large files with plupload/MVC 4?

Here is my current code:

    $(document).ready(function () {

        var uploader = new plupload.Uploader({
            runtimes: 'html5',
            browse_button: 'pickfiles',
            container: 'container',
         // max_file_size: '20000mb',
            url: '@Url.Action("Upload", "Home")',
            chunk_size: '4mb',
            //filters: [
            //    { title: "Excel files", extensions: "xls,xlsx" },
            //    { title: "Text files", extensions: "txt" }
            //],
            multiple_queues: true,
            multipart: true,
            multipart_params: { taskId: '' }
        });

        uploader.init();
    });

and the controller:

    [HttpPost]
    public ActionResult Upload(int? chunk, string name, string taskId)
    {
        var fileUpload = Request.Files[0];
        var uploadPath = Server.MapPath("~/App_Data/Uploads");
        chunk = chunk ?? 0;
        string uploadedFilePath = Path.Combine(uploadPath, name);

        try
        {
            // Create the file on the first chunk, append on subsequent chunks
            using (var fs = new FileStream(uploadedFilePath, chunk == 0 ? FileMode.Create : FileMode.Append))
            {
                fileUpload.InputStream.CopyTo(fs);
            }

            // Log to DB for future processing
            InstanceExpert.AddProcessStart(uploadedFilePath, Int32.Parse(taskId));
        }
  • A 4 MB file should be able to be handled without chunking in ASP.NET. You may need to increase your max upload file size, or increase the execution time, though. Commented Apr 29, 2013 at 20:35
  • The biggest file I have that is being uploaded is 7268 KB. Do I need chunking? Or what do I change? Commented Apr 29, 2013 at 20:38
  • Change the maxRequestLength in the web.config. Take a look at stackoverflow.com/a/288675/254973 for an example of what you need to change there. After that, remove the chunk_size option from the JavaScript so plUpload doesn't send chunked uploads. Commented Apr 29, 2013 at 20:43
  • Chunking should really be enabled if you have files larger than 1 MB; I tend to stick with 500 KB chunks. Even though servers generally accept 4 MB chunks, time-outs can occur for people with slow internet, so it's not really proportional. Chunking allows rate limiting and resumable uploads. You can even check the file size and adjust the chunk size based on file size and speed. Changing the POST size value is bad practice. You need to set the client script to enable chunking too, using HTML5 or other fallbacks. Don't POST entire large files! Commented Feb 28, 2015 at 11:47
  • What is the meaning of these lines in the JS: chunk_size: '4mb' and max_file_size: '20000mb'? What will happen when the chunk size is 4 MB? Commented Jun 24, 2015 at 11:54
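To illustrate what `chunk_size: '4mb'` means (a sketch of the idea, not plupload's actual internals): the client splits each file into ceil(fileSize / chunkSize) pieces and POSTs them one at a time, so `max_file_size` caps the whole file while `chunk_size` caps each individual request. The function name below is illustrative, not part of the plupload API.

```javascript
// Sketch: how a chunked uploader splits a file into requests.
function chunkCount(fileSizeBytes, chunkSizeBytes) {
    return Math.ceil(fileSizeBytes / chunkSizeBytes);
}

var fourMb = 4 * 1024 * 1024;             // chunk_size: '4mb'
var file = 7268 * 1024;                   // the 7268 KB file from the comments

console.log(chunkCount(file, fourMb));    // 2 requests of at most 4 MB each
```

So the 7268 KB file that was failing as a single POST becomes two requests, each small enough to stay under the default request-size limit.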

3 Answers


In web.config you need these (2 GB all around; note that `requestFiltering` belongs under `system.webServer`, not `system.web`):

<system.web>
    <compilation debug="true" targetFramework="4.5" />
    <httpRuntime targetFramework="4.5" maxRequestLength="2147483647" executionTimeout="1600" requestLengthDiskThreshold="2147483647" />
    ...
</system.web>
<system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="2147483647" />
      </requestFiltering>
    </security>
</system.webServer>

7 Comments

Thanks Shane! This worked for my "Intranet" app, but I am using targetFramework="4.0", so mine does not have the <security> section. I ended up with a limit somewhere between 300 MB and 700 MB; I have not fully tested to find the exact limit.
I think you mean <system.webServer>, not <system.web>. Maybe it is different in different versions.
For a website it's system.web.
compilation and httpRuntime go into system.web; security goes into system.webServer. Also, maxRequestLength is in kilobytes, while maxAllowedContentLength and requestLengthDiskThreshold are in bytes.
This is pretty bad! What happens if my file fails at 1.8 GB? Restart. Plus, it's open to DDoS attacks: start 100+ uploads (it's pretty easy with client scripting to start 10,000 uploads in a second) and your server is bricked! Do not do this! Implement proper chunking. That allows you to rate limit, too, plus resume uploads. -1

Current Version

According to the detailed error description of IIS 8.0 (the version I was using when I wrote this answer), you need to verify the configuration/system.webServer/security/requestFiltering/requestLimits@maxAllowedContentLength setting in the ApplicationHost.config or Web.config file. That means you need to include:

<requestLimits maxAllowedContentLength="20971520000"></requestLimits>

inside the configuration/system.webServer/security/requestFiltering tag tree. In full, the block looks like the following:

<configuration>
    <system.webServer>
        <security>
            <requestFiltering>
                <requestLimits maxAllowedContentLength="20971520000"></requestLimits>
            </requestFiltering>
        </security>
    </system.webServer>
</configuration>

Visual Studio 2010/.Net Framework 4 and Before

It is also possible that legacy web applications created with VS2008/VS2010 and/or .Net Framework 3.5/4 still read this setting from configuration/system.web/httpRuntime@maxRequestLength, although, as evidenced by the linked page, it is no longer documented. (The HttpRuntime class itself has existed since .Net Framework 1.1, but it doesn't apply to this scenario.) If this is the case, you need to include:

<httpRuntime maxRequestLength="20971520000" />

inside the configuration/system.web tag tree. Once again, the full code block looks like the following:

<configuration>
    <system.web>
        <httpRuntime maxRequestLength="20971520000" />
    </system.web>
</configuration>

The file size number is just an arbitrary demo value (20,000 MB; note this is not 20 GB, which would be 21,474,836,480 bytes). Unless you're building the site for a small, trusted group that genuinely needs to upload large files, you shouldn't allow files this big to be uploaded to your web server.
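A quick sanity check of the units used in the snippets above: maxAllowedContentLength (IIS request filtering) is measured in bytes, while system.web's maxRequestLength is measured in kilobytes, so the two attributes take different numbers for the same limit.

```javascript
// Units sanity check for the limits shown in this answer and the one above.
var MB = 1024 * 1024;

var demoLimitBytes = 20000 * MB;         // the 20,000 MB demo value
console.log(demoLimitBytes);             // 20971520000, as in the answer

var twentyGb = 20 * 1024 * MB;           // a true 20 GB, for comparison
console.log(twentyGb);                   // 21474836480

// 2147483647 bytes (the int max used in the other answer) is just under 2 GB:
console.log(2147483647 / (1024 * MB));   // ~1.9999999990686774
```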



The solution is based on Jonathan's code here. If you want to upload a large file, such as a 1 GB video, you have to chunk the file and send it over several requests (a single request times out). First, set the max limits for the client and server side in Web.config, as discussed in the other answers:

<system.webServer>
 <security>
  <requestFiltering>
    <requestLimits maxAllowedContentLength="2147483647" />
  </requestFiltering>
 </security>
</system.webServer>

and

<system.web>
  <httpRuntime targetFramework="4.5" maxRequestLength="2147483647" />
</system.web>

Then chunk the file and send each chunk, waiting for the response before sending the next one. Here is the HTML (the VideoDiv works as the upload panel), the JavaScript (jQuery), and the controller code.

    <div id="VideoDiv">
        <label>Filename:</label>
        <input type="file" id="fileInput" /><br/><br/>
        <input type="button" id="btnUpload" value="Upload a presentation"/><br/><br/>
        <div id="progressbar_container" style="width: 100%; height: 30px; position: relative; background-color: grey; display: none">
            <div id="progressbar" style="width: 0%; height: 100%; position: absolute; background-color: green"></div>
            <span id="progressbar_label" style="position: absolute; left: 35%; top: 20%">Uploading...</span>
        </div>
    </div>

JavaScript code to chunk the file, call the controller, and update the progress bar:

        var progressBarStart = function() {
            $("#progressbar_container").show();
        }

        var progressBarUpdate = function (percentage) {
            $('#progressbar_label').html(percentage + "%");
            $("#progressbar").width(percentage + "%");
        }

        var progressBarComplete = function() {
            $("#progressbar_container").fadeOut(500);
        }

        var file;

        $('#fileInput').change(function(e) {
            file = e.target.files[0];
        });

        var uploadCompleted = function() {
            var formData = new FormData();
            formData.append('fileName', file.name);
            formData.append('completed', true);

            var xhr2 = new XMLHttpRequest();
            xhr2.onload = function() {
                progressBarUpdate(100);
                progressBarComplete();
            }
            xhr2.open("POST", "/Upload/UploadComplete?fileName=" + file.name + "&complete=" + 1, true);
            xhr2.send(formData);
        }

        var multiUpload = function(count, counter, blob, completed, start, end, bytesPerChunk) {
            counter = counter + 1;
            if (counter <= count) {
                var chunk = blob.slice(start, end);
                var xhr = new XMLHttpRequest();
                xhr.onload = function() {
                    start = end;
                    end = start + bytesPerChunk;
                    if (count == counter) {
                        uploadCompleted();
                    } else {
                        var percentage = (counter / count) * 100;
                        progressBarUpdate(percentage);
                        multiUpload(count, counter, blob, completed, start, end, bytesPerChunk);
                    }
                }
                xhr.open("POST", "/Upload/MultiUpload?id=" + counter.toString() + "&fileName=" + file.name, true);
                xhr.send(chunk);
            }
        }

        $("#VideoDiv").on("click", "#btnUpload", function() {
            var blob = file;
            var bytesPerChunk = 3757000;
            var size = blob.size;

            var start = 0;
            var end = bytesPerChunk;
            var completed = 0;
            var count = size % bytesPerChunk == 0 ? size / bytesPerChunk : Math.floor(size / bytesPerChunk) + 1;
            var counter = 0;
            progressBarStart();
            multiUpload(count, counter, blob, completed, start, end, bytesPerChunk);
        });
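The count/start/end bookkeeping above can be sketched as a pure function (the name chunkRanges is illustrative, not part of the upload code) that computes the [start, end) byte ranges the loop slices; Blob.slice clamps end to the file size, so the last chunk is simply whatever remains.

```javascript
// Sketch of the chunk arithmetic used by multiUpload above.
function chunkRanges(fileSize, bytesPerChunk) {
    var ranges = [];
    for (var start = 0; start < fileSize; start += bytesPerChunk) {
        ranges.push([start, Math.min(start + bytesPerChunk, fileSize)]);
    }
    return ranges;
}

// A 10,000,000-byte file with the 3,757,000-byte chunks used above:
var ranges = chunkRanges(10000000, 3757000);
console.log(ranges.length);   // 3 chunks
console.log(ranges[2]);       // [ 7514000, 10000000 ] - a short final chunk
```

This matches the `count` formula in the click handler: two full chunks plus one partial chunk of 2,486,000 bytes.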

And here is the upload controller, which stores the chunks in "App_Data/Videos/Temp" and later merges them and stores the result in "App_Data/Videos":

public class UploadController : Controller
{
    private string videoAddress = "~/App_Data/Videos";

    [HttpPost]
    public string MultiUpload(string id, string fileName)
    {
        var chunkNumber = id;
        var chunks = Request.InputStream;
        string path = Server.MapPath(videoAddress+"/Temp");
        string newpath = Path.Combine(path, fileName+chunkNumber);
        using (FileStream fs = System.IO.File.Create(newpath))
        {
            byte[] bytes = new byte[3757000];
            int bytesRead;
            while ((bytesRead=Request.InputStream.Read(bytes,0,bytes.Length))>0)
            {
                fs.Write(bytes,0,bytesRead);
            }
        }
        return "done";
    }

    [HttpPost]
    public string UploadComplete(string fileName, string complete)
    {
        string tempPath = Server.MapPath(videoAddress + "/Temp");
        string videoPath = Server.MapPath(videoAddress);
        string newPath = Path.Combine(tempPath, fileName);
        if (complete == "1")
        {
            // Order the chunk files numerically by the number appended to the
            // file name, then append each one to the merged file in turn.
            string[] filePaths = Directory.GetFiles(tempPath)
                .Where(p => p.Contains(fileName))
                .OrderBy(p => Int32.Parse(p.Replace(fileName, "$").Split('$')[1]))
                .ToArray();
            foreach (string filePath in filePaths)
            {
                MergeFiles(newPath, filePath);
            }

            // Move the merged file out of the temp folder
            System.IO.File.Move(newPath, Path.Combine(videoPath, fileName));
        }
        return "success";
    }

    private static void MergeFiles(string file1, string file2)
    {
        FileStream fs1 = null;
        FileStream fs2 = null;
        try
        {
            fs1 = System.IO.File.Open(file1, FileMode.Append);
            fs2 = System.IO.File.Open(file2, FileMode.Open);
            byte[] fs2Content = new byte[fs2.Length];
            fs2.Read(fs2Content, 0, (int) fs2.Length);
            fs1.Write(fs2Content, 0, (int) fs2.Length);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message + " : " + ex.StackTrace);
        }
        finally
        {
            if (fs1 != null) fs1.Close();
            if (fs2 != null) fs2.Close();
            System.IO.File.Delete(file2);
        }
    }
}
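The OrderBy in UploadComplete parses the numeric suffix instead of sorting the file names as strings; a plain lexicographic sort would put chunk 10 before chunk 2 and corrupt the merged file. A sketch of the same idea (variable names are illustrative):

```javascript
// Chunk files are named <fileName><chunkNumber>, e.g. "video.mp41", "video.mp42", ...
var fileName = "video.mp4";
var chunkFiles = ["video.mp410", "video.mp41", "video.mp42"];

// A lexicographic sort misorders chunk 10:
console.log(chunkFiles.slice().sort());
// [ 'video.mp41', 'video.mp410', 'video.mp42' ]

// Sorting by the parsed numeric suffix, like the controller's OrderBy, is correct:
var ordered = chunkFiles.slice().sort(function (a, b) {
    return parseInt(a.replace(fileName, ""), 10) - parseInt(b.replace(fileName, ""), 10);
});
console.log(ordered);
// [ 'video.mp41', 'video.mp42', 'video.mp410' ]
```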

However, if two users upload files with the same name at the same time, there will be a collision in the temp folder, and you have to handle that case (for example, by making the temp file names unique per upload). By reading responseText, you can catch errors and exceptions from the server and report them.

1 Comment

Working great! By the way, the recursion here might hit limits; each browser has its own maximum call-stack depth.
