3

I am implementing file upload using AWS S3. The files I want to upload average 500 MB. The upload process (using the AWS S3 filesystem) is synchronous, so while one user uploads a big file, other people cannot access the website until the upload finishes. How can I make it asynchronous?

Basically, I have two issues:

  1. Uploading large files in chunks so that other people can still use the website
  2. Uploading them asynchronously.

The command I use to handle the upload is:

Storage::put('preview_image/'.$file_name, $file_preview_image_1, 'public');

5 Answers

1

Why don't you use asynchronous multipart uploads, which are recommended for files larger than 100 MB? The code will look something like this:

use Aws\S3\MultipartUploader;

$source = '/path/to/large/file.zip';
$uploader = new MultipartUploader($s3Client, $source, [
    'bucket' => 'your-bucket',
    'key'    => 'my-file.zip',
]);

// promise() returns immediately instead of blocking until the upload finishes
$promise = $uploader->promise();

You can look at the documentation here: Asynchronous multipart uploads
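To actually complete the transfer you still have to resolve that promise somewhere. A minimal continuation sketch, assuming the $promise from the snippet above and the aws/aws-sdk-php package:

```php
<?php

use Aws\Exception\MultipartUploadException;

// Attach handlers for success and failure. Nothing has been transferred yet;
// the promise only resolves once it is driven to completion.
$promise->then(
    function ($result) {
        // The result array contains the URL of the uploaded object.
        echo "Upload complete: {$result['ObjectURL']}\n";
    },
    function (MultipartUploadException $e) {
        echo "Upload failed: {$e->getMessage()}\n";
    }
);

// wait() blocks until the multipart upload finishes (or throws on failure),
// so call it at a point where blocking is acceptable, e.g. the end of the
// request, or combine several promises before waiting on them together.
$result = $promise->wait();
```

Note that "asynchronous" here means the parts are transferred concurrently by the SDK's HTTP handler; the PHP process still has to stay alive until wait() returns.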



1

SOLVED

After being stuck and searching for a good solution for several days, I finally got an explanation for why my project ran single-threaded: it's because I ran php artisan serve, which uses PHP's built-in development server and handles one request at a time. I reported the issue at https://github.com/laravel/framework/issues/22944.


0

I don't understand why other users can't access the website; modern servers are capable of handling multiple concurrent requests very well. Maybe you should consider upgrading your server.

But to get asynchronous behaviour in your project, you can look into Laravel jobs and queues (here is the doc link). You have to configure your queue driver appropriately: by default Laravel's queue driver is sync, which is, as the name suggests, synchronous. Once your queues are set up (Redis, Amazon SQS, or anything else), you can push the file-upload job onto a queue and spare the user the hassle of waiting for the file to upload. There are also packages like Laravel Horizon for monitoring your queues, which even lets you restart a job if it fails.
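As a sketch of that approach, assuming the uploaded file has already been saved to a fast local disk first, and that the class and property names below (UploadToS3, $localPath, $s3Key) are hypothetical placeholders for your own:

```php
<?php
// app/Jobs/UploadToS3.php -- a hypothetical queued job that moves a locally
// stored file to S3 after the HTTP response has already been sent.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class UploadToS3 implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $localPath;
    protected $s3Key;

    public function __construct($localPath, $s3Key)
    {
        $this->localPath = $localPath;
        $this->s3Key = $s3Key;
    }

    public function handle()
    {
        // Stream from the local disk to S3 instead of loading the whole
        // 500 MB file into memory, then clean up the temporary copy.
        $stream = Storage::disk('local')->readStream($this->localPath);
        Storage::disk('s3')->put($this->s3Key, $stream, 'public');
        Storage::disk('local')->delete($this->localPath);
    }
}

// In the controller: save locally (fast), then queue the slow S3 transfer.
// UploadToS3::dispatch($tempPath, 'preview_image/'.$file_name);
```

The trade-off, as discussed below, is that the user's browser-to-server upload itself still happens in the request; the queue only takes the server-to-S3 leg out of it.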

1 Comment

"i don't understand that why other users can't access the website ....." -> So, when user uploading file, they will send file request directly to AWS S3 storage. the progress of "request" is take too much time because file is too large. Thanks for your answer :)
0

The rule of thumb when dealing with jobs that take an excessive amount of time (above 5 seconds) to complete is to process them in the background.

See: https://laravel.com/docs/5.5/queues

So when one user upload big file, others people cannot access the website until the user finished uploading progress

However, yours sounds like a hosting issue: either your upload bandwidth is being consumed completely by the upload, or the PHP script running the upload is consuming too much memory and is therefore blocking other PHP processes from spawning.
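On the memory point: if the controller reads the whole file into a string before calling Storage::put, memory usage grows with file size. Passing a stream instead keeps memory roughly constant, because Laravel streams resources through to the filesystem driver. A minimal sketch, assuming the question's preview_image upload field (the field name is taken from the question's code):

```php
<?php
// Open the uploaded file as a read stream rather than loading it into memory.
$stream = fopen($request->file('preview_image')->getRealPath(), 'r');

// Storage::put accepts a resource and streams it to the configured driver,
// so only a small buffer is held in memory at any one time.
Storage::put('preview_image/'.$file_name, $stream, 'public');

// Close the handle once the transfer is done.
if (is_resource($stream)) {
    fclose($stream);
}
```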

3 Comments

If I process the file upload in the background, what happens if the user closes the browser?
After reading your comments: first you have to solve the issue of keeping the user on the same page while the file uploads. You should look into single-page applications (SPAs) for your frontend, e.g. React or AngularJS.
Thanks for the SPA suggestion, I am looking into that. :)
0

If you are trying to upload a file from the user's disk to a remote location, then it's not possible to do what you want (i.e. queue it to be done later).

The user needs to complete the file upload in the POST request of the form - you can't queue it to be done later. Queueing is for delaying server-side processing tasks, but uploading requires the user to stay on the page to send the data to your server.

To expand further: the best option available to you is a JavaScript asynchronous upload, using a package like DropzoneJS or similar. That way users can upload multiple files simultaneously and get visual progress bars that update as they go.

5 Comments

"uploading needs the user to stay on the page to send the data to your server" -> This is the problem too, so when user exit browser (but the progress of uploading file not start yet) and then uploading progress will fail. btw thank for your asnwer and dropzonejs :)
@fird0s Yes that is the gist of it
I was thinking about first upload locally, and then run queue to upload from local to AWS S3. How do you think?
@fird0s that would not solve the issue you are trying to bypass, the user will have to wait the same amount of time till the upload completes in both cases right?. I suggest you stick with s3 and make use of a js library for async uploads that would at least give the user a visual feedback on the progress.
@fird0s also keep in mind to chunk when uploading using s3 so that you don't run out of memory during the upload. As php tries to load entire file in memory when uploading it, that would easily choke your server for large size uploads
