Uploading Directly To Amazon S3 From Your Laravel 5 Application

/ PHP / by Paul Robinson / 30 Comments
This post was published back on March 30, 2015 and may be outdated. Please use caution when following older tutorials or using older code. After reading be sure to check for newer procedures or updates to code.

So you have your killer new application planned out in Laravel, and you get everything working with local uploads for that fancy image feature you need. Then, bummer, you realise something: with a lot of hosts now shifting to SSD technology, they are no longer offering unlimited hosting space. And what if you use someone like Digital Ocean, who only offer 20GB of precious SSD with their basic accounts?

Amazon S3 To The Rescue

Or any other good cloud storage provider. I’m looking at you, Rackspace.

Amazon S3 is cheap, pretty simple to use, and (along with Rackspace) has built-in support within Laravel 5. We are going to take a look at how you can upload your image straight up to Amazon S3 storage.

Uploading Locally

Let’s get straight into this. I’m assuming you have your own controller set up, so we will only be taking a look at the method that responds to the POST request. Importantly, though, this code will also work if you want to use a bulk uploader such as Plupload. Just be aware that every image uploaded will (obviously) count as a request to your S3 bucket.

The first thing we need to do is get ourselves the package to push to S3. To do that we need to use Composer. Just run:
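The original command block didn’t survive this copy of the post; going by the note added later in the comments, the package for recent Laravel 5 versions is the v3 Flysystem S3 adapter (adjust this if your Laravel version needs a different one):

```shell
composer require league/flysystem-aws-s3-v3 ~1.0
```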

Please double-check the version of Flysystem used by your version of Laravel. Check out Laravel’s Cloud Storage docs for more details on which package to composer require.

Once it’s complete, pop yourself over to the config/filesystems.php file and enter your Amazon S3 information. I’m not going to cover setting up your bucket or anything, as that is covered extremely well over at Amazon. I will, however, tell you to take their advice and use their IAM user system so you aren’t using your root keys in your application. Using root keys is normally (pretty much always) a really bad idea.
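For reference, the s3 disk entry in config/filesystems.php looks roughly like this. Treat it as a sketch: the exact key names varied slightly across Laravel 5 point releases, and the env() variable names here are my own placeholders, not Laravel defaults.

```php
// config/filesystems.php (fragment) -- under the 'disks' array
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_KEY'),
    'secret' => env('AWS_SECRET'),
    'region' => env('AWS_REGION'), // e.g. 'eu-west-1', not 'Ireland'
    'bucket' => env('AWS_BUCKET'),
],
```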

Now we can write our code. I’ve given you all the code needed to upload an image. It isn’t the only way it can be done, and it assumes your response needs to be JSON (handy if you’re using a mass uploader that needs JSON as a response).

The code has been generalised to suit most situations, so you’ll need to customise it where needed. Things like filename generation and entering rows into the database are obviously dependent upon your project.
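The original snippet didn’t survive this copy of the post, so here is a minimal sketch of the kind of controller method described above. The controller name, field name (image), upload path, and filename scheme are all assumptions to adapt to your project; the thumbnail lines are the ones left commented out, as mentioned in the troubleshooting section.

```php
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

class ImageController extends Controller
{
    // Responds to the POST request from your form (or a bulk uploader such as Plupload).
    public function store(Request $request)
    {
        if (! $request->hasFile('image') || ! $request->file('image')->isValid()) {
            return response()->json(['error' => 'No valid file was uploaded.'], 400);
        }

        $file = $request->file('image');

        // Filename generation is project dependent -- this is just a placeholder.
        $filename = time() . '-' . $file->getClientOriginalName();

        // put() copies the file straight up to the 's3' disk from config/filesystems.php.
        Storage::disk('s3')->put(
            'uploads/' . $filename,
            file_get_contents($file->getRealPath())
        );

        // If you want thumbnails, uncomment the move() below, thumbnail the moved
        // file, then push each thumbnail with another Storage::put() call.
        // Always do this *after* the put above -- move() physically relocates the file.
        // $file->move(storage_path('app/tmp'), $filename);

        return response()->json(['success' => true, 'filename' => $filename]);
    }
}
```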

That’s actually all there is to it. Your files will now be placed straight into your bucket on Amazon S3. Awesome, huh?

Troubleshooting

There are a few things that can go wrong when trying to push to S3 with Laravel, so I’m going to cover some common solutions here.

Nothing is Happening / Logs Show Connection Error

This is generally caused by something being incorrect in your config: a misspelling, an incorrect key, or an incorrect region. Remember, if you choose a location like Ireland for your bucket, your region is actually EU West 1 (eu-west-1), not Ireland. There is a list of regions available on Amazon so you can confirm yours.

Image Appears In Storage Folder

You haven’t given a disk name when accessing the Storage facade. This shouldn’t happen if you use the code as written above, but it can happen when writing your own. If you are tired or forgetful when writing, it is a painful mistake to make.

I Want To Make Thumbnails Too

Then uncomment the lines that are commented out in the code snippet and thumbnail as you wish. Once finished, push the thumbnails in the same way the larger image was pushed. Remember, you must always push your larger file to S3 first: the put method on the Storage facade copies the image to the server, while the move method called on the file[1] physically moves the image, so it can no longer be accessed via file_get_contents() without pointing to its new location.

How Can I Link To My Files

Ahh, this is a good one. I might be wrong, but as far as I’m aware there is no ‘simplez’ method of doing this. You could piece together the URL by pulling your region from your filesystem config using Config::get('filesystems.disks.s3.region'), and similar for the bucket, since the URL is essentially https://s3-{region}.amazonaws.com/{bucket}/{filepath}.
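As a sketch, a tiny helper that pieces the URL together from those values could look like this. The function name s3Url is mine, not Laravel’s; in your app you would feed it the Config::get() values above.

```php
<?php

// Hypothetical helper: build a path-style S3 URL from region, bucket and file path.
function s3Url($region, $bucket, $filepath)
{
    return "https://s3-{$region}.amazonaws.com/{$bucket}/{$filepath}";
}

echo s3Url('eu-west-1', 'my-bucket', 'uploads/photo.jpg');
// https://s3-eu-west-1.amazonaws.com/my-bucket/uploads/photo.jpg
```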

I did, however, find another way. It may or may not be a good way, but after some digging through the core code I found you could access the getObjectUrl() method on the Amazon SDK used by Laravel. That is done by doing the following:
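The original snippet wasn’t preserved here, but the call chain (assuming Flysystem v1 with the AWS SDK v3 adapter, so check your versions) is along these lines:

```php
// Dig the underlying Aws\S3\S3Client out of the Laravel/Flysystem stack,
// then ask it for the object's URL. No request is made to Amazon for this.
$client = Storage::disk('s3')->getDriver()->getAdapter()->getClient();
$url = $client->getObjectUrl($bucket, $key);
```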

Where $key is the path to the file inside the bucket. If you have a helper class for something like this, you could make yourself a small method and set the disk and bucket as properties via the constructor. How you do it, though, is completely up to you.

If you come across any other issues, feel free to ask in the comments and I’ll see if I can help. In the meantime, go forth and populate your Amazon S3 bucket with images from your awesome applications.

[1] Yes, I am aware the method is called from the UploadedFile Symfony class, which extends PHP’s SplFileInfo class. That’s a mouthful though, which is why it is down here and not in the main article.

30 Comments


Hi Paul,

Thanks for this helpful post! I used this as a guide to write some code; however, it looks like the Request isn’t coming through. I’m using Postman to test an uploaded file. When sending the Request, the Response goes straight to the error. I tried dd($request->all()); and it returns an empty array. Any idea why this might be?

Thanks!
Chloe


Hi Chloe,

Without any code it is difficult to say exactly; however, I have noticed a lot of cases where people are using Postman but are not passing the extra _method hidden input in their forms. Laravel fakes the PUT/PATCH verbs, since they are not widely supported by browsers, and that hidden form field lets Laravel know it should fake it.
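For reference, the spoofing field is just a hidden input inside the form (the action URL here is a placeholder):

```html
<form method="POST" action="/upload" enctype="multipart/form-data">
    <input type="hidden" name="_method" value="PUT">
    <!-- CSRF token and file input go here -->
</form>
```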

Is that any help? If not, please feel free to drop some code into Laravel’s pastebin (http://laravel.io/bin) and I’m happy to take a further look.


Thanks for getting back to me so quickly, Paul! Unfortunately, that didn’t work when I tried either. I’m thinking there is some trouble not with the code but perhaps how I am using Postman and something weird with the Content Type field. Here’s the code: http://laravel.io/bin/l5GJa

If you have any other thoughts, please let me know. Thank you!


Hi again Chloe,

Hmm. The code you’ve pasted looks fine, assuming all of the Classes you are hinting exist (which I assume they do).

I can only think it is something to do with Postman too. Have you tried pushing an upload through with a simple file upload field to see if it is a problem specific to Postman?

The only other thing I’ve seen is that when testing with Postman some people forget about their php.ini limits. For example, I once tried with a photo from my D-SLR that was 4MB when my local server was set to only allow 2MB uploads. I didn’t get an error or anything; it just didn’t work, which is interesting.

Hopefully that might help, if it doesn’t I’m happy to try and help more, if I can. Or if you do figure out the issue please come back & share. I’m sure it will help a lot of people who have come across this before. 🙂


Thanks again for the speedy response! I will keep playing with it – it’s also baffling because none of the data (I’ve also entered a dummy email address that is just text) will come through, not just the image. Hoping to get an answer soon and I will surely share once I do!


Figured out the answer – such a silly mistake! My headers were incorrect in Postman. I had both “Accept application/json” and “Content-Type application/json” as headers in Postman. Once I deleted “Content-Type application/json” and used form-data, it recognized the file. Thanks so much again for your time and help with this post, Paul!


Hi Paul,

So sorry to bother you again, but now I’m running into another issue! I keep getting a FatalErrorException in my AWS file on line 23. The error reads: FatalErrorException in AWS.php line 23:
Class ‘Northstar\Services\Storage’ not found. Do you know what might fix this? I’m not quite sure why it’s looking for the Storage file. It is true that this doesn’t exist, but I’m trying to send it back to the AvatarController.php file (same code that I linked to before).

Thank you again in advance!


Hi Chloe,

Glad you sorted the issue with Postman and thank you so much for coming back to share the solution. Hopefully it will help out anyone else frustrated with the same issue.

Hmm. I’m not sure. Is Northstar the namespace of your app? Also, what is in your AWS.php file? I don’t use one in this tutorial, as I just reference the Laravel Storage facade to access the S3 disk.


Hi Paul,

Just figured this out as well, and it was another silly mistake: it looks like I didn’t include use Storage; at the top of my file. So sorry to take up your time, and again, thank you for being so helpful!

Best,
Chloe


Hi,

Oh, haha. I never thought to ask if you had listed all of your namespaces. Glad you managed to find the issue.

Also don’t worry, I’ve definitely spent more than an hour looking for an issue before only to find I’ve forgotten to use a particular namespace in my project. 😉


Hi, thanks for your tutorial.

Regarding listing files, I followed this example:
https://gist.github.com/localdisk/5f6608d271c2842255e3

Using getObjectUrl inside a loop will make several requests to get the links to my files. I guess this is not good; correct me if I am wrong, or suggest a better way to improve it.


Hi Leo,

No problem. Hope you found it helpful.

Well, when creating a signed URL the AWS SDK apparently does not make a request to Amazon’s servers, so there is no need to worry about abusing their servers or your request limits.

If you are also worried about performance, I wouldn’t worry too much. Obviously looping through all files if you just want one specific file would be wasteful, but if you need to list all files, I can’t see an issue with it.

P.S. I’m trying to learn Japanese… It’s hard. 🙁


Saved my day. I had set my region to Frankfurt. After setting it to eu-central-1, all my worries vanished.

Thanks a billion, trillion, million x a lot 🙂


Hi Martin,

Totally missed this as my email notification got jumbled in with client emails. Glad this helped you though.


Hi, how do I check if a file exists in S3? $exists = Storage::disk('s3')->has('file.jpg'); doesn’t seem to be working.

I get this error in my browser:
“Call to undefined method Illuminate\\Filesystem\\FilesystemAdapter::has()”.

Any suggestions?


Hello Yoganandan,

It’s odd, as the Laravel docs do say to use $exists = Storage::disk('s3')->has('file.jpg');, but looking in Illuminate\Filesystem\FilesystemAdapter shows the method on the filesystem adapter is actually exists(), so it would be $exists = Storage::disk('s3')->exists('file.jpg');.

It is important to note that the has() method does exist on the actual driver, and exists() just returns the result of running the driver’s has() method, so maybe that’s why the docs say that.

Hope that helps.


Hi, I am trying to upload an audio file (259kb) and I am getting this error:

Error executing “PutObject” on “https://autoscaling.us-west-1.amazonaws.com/suyaya/support-tickets/1449566787.mp3”; AWS HTTP error: Client error: PUT https://autoscaling.us-west-1.amazonaws.com/suyaya/support-tickets/1449566787.mp3 resulted in a 413 Request Entity Too Large response:

(client): 413 Request Entity Too Large –

Any advice please?


Hi Florence,

It’s unusual for that to happen with such a small file, but a 413 error is generally caused by your server software being configured to only allow small request bodies. Fixing it depends on your server software.

Apache: set LimitRequestBody in httpd.conf or .htaccess file

Nginx: set client_max_body_size in nginx.conf

For others you’ll have to do a Google search as I’m only familiar with those two web servers.
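As a sketch of those two settings (the 10 MB limit here is an arbitrary placeholder; pick one that suits your uploads):

```
# Apache -- httpd.conf or .htaccess, value in bytes
LimitRequestBody 10485760

# Nginx -- nginx.conf (http, server, or location block)
client_max_body_size 10m;
```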

Hope that helps.


Hi Paul,
Thanks for the tutorial.

I’m trying to slightly modify what you’ve demonstrated here https://gist.github.com/stevieg83/d64ef7fd10f16e758bf62a178f7bb402

but I’m running into this error:

Fatal error: Class ‘League\Flysystem\AwsS3v3\AwsS3Adapter’ not found


Hi Steven,

Glad you liked it.

Well, as you’ve probably guessed from the error, it looks like it is unable to find a class being referenced. I know it’s a bit like telling you to make sure your PC is turned on, but it’s worth double-checking that you installed Flysystem via Composer.

From looking at the error, though, I think it might be due to following a step in my tutorial that I need to update. When I made this tutorial Laravel used the v2 Flysystem S3 adapter; now Laravel’s docs say to use v3:

composer require league/flysystem-aws-s3-v3 ~1.0

There is a good chance that is the cause of the error. I’ll update my tutorial right now to prevent it from happening to anyone else.

Edit: Tutorial updated. Hopefully that small note will stop this happening to anyone else. I’m sorry for any inconvenience, as that is likely the cause of your error.


Hi Paul,
I followed your tutorial, and it helped me a lot to upload files into Amazon AWS S3 after days of googling here and there.

thanks


Hi, great code. My code is working perfectly, but I have a question:
why is it that when I try to open the file URL in the browser, the file is downloaded automatically? How can I make that URL show the file instead of downloading it?
Thanks,
Damian


Hi Damian,

The solution could depend on which Laravel version you are using as I believe they are up to 5.4 now.

The reason for it, however, is the same: you need to set the mime type when putting the file to S3. You can set it on existing files via the S3 dashboard if that helps, but you should be able to set it programmatically when putting the file to the S3 server.

I’m unfortunately unable to try this myself as I don’t have any apps that are using S3 at the moment, but there are reports that accessing the driver directly allows you to set the mime type as well as other headers like CacheControl.

Note that the mime type is actually ContentType on the S3 servers.
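I haven’t been able to verify this myself, but going by the Flysystem v1 S3 adapter’s meta options, the call would look something like the following. ContentType and CacheControl are real S3 option keys; the path and values are placeholders.

```php
// Writing via the underlying Flysystem driver lets you pass S3 meta options
// alongside the file contents.
Storage::disk('s3')->getDriver()->put('uploads/photo.jpg', $contents, [
    'ContentType'  => 'image/jpeg',
    'CacheControl' => 'max-age=86400',
]);
```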

Hopefully that helps. Let me know if you are still having trouble and I will see if I can help further.


Hi Paul,
Your code is really helpful, but I’m getting error “AWS HTTP error: cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)”, I also tried some solutions like updating my php.ini file with update cacert.pem file from the https://github.com/amazonwebservices/aws-sdk-for-php/tree/master/lib/requestcore, but nothing helped.
Have you any solution regarding this?


Hi Swati,

I’m afraid that is the only solution I am aware of for that problem. The only thing I can suggest is double-checking that your php.ini has updated correctly and that the line specifying the cert is not still commented out.

Beyond that some people have had success downgrading to version 4.0 of Guzzle, but I wouldn’t recommend that if at all possible.


Thanks for your response Paul,

I just want to confirm: will this code work on a live site on a server without an SSL certificate, or do I have to set up SSL first before I’ll be able to upload my files to S3 buckets? I ask because this error is shown when I try to upload my files from localhost (WampServer) to S3.


Hi,

As far as I’m aware the SSL issue is a very common error on Windows, since Guzzle no longer provides the SSL certs and Windows has trouble locating the certificates provided by the AWS library. I use Docker, so I have never actually experienced the issue myself and can only go on tips and posts from other developers.

From what I have read on the issue you shouldn’t have the same problem on your live server, although it may still be a good idea to test first.

If you need to, you can edit the cURL connection lines within the AWS library to turn off SSL checks; however, I very strongly recommend that you do not do that.


I don’t know what else to do with this, it makes me cry, help me please.

Error executing “PutObject” on “https://docus-cu1sm.s3.us-east-2.amazonaws.com/algo.pdf”; AWS HTTP error: cURL error 77: error setting certificate verify locations: CAfile: /path/cacert.pem CApath: none (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)


Hi,

Apologies for the delay in getting to your comment.

This is normally an issue with cURL not being able to find the certificate it uses. You need to check that the path to your certificate is correct. There is a similar error on Stack Overflow that might help.

