PHP class to work with Amazon S3 fails randomly

Solved | Posted on 2009-07-07 | Medium Priority | 1,495 Views | Last Modified: 2013-11-14
I am using the PHP class described in this other question, and most of the time it works beautifully.
http://undesigned.org.za/2007/10/22/amazon-s3-php-class
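
For reference, the upload call looks roughly like this (simplified - the bucket name, key and credentials below are placeholders):

<?php
// Simplified version of the upload code; bucket, key and credentials are placeholders.
require_once 'S3.php';

S3::setAuth('MY_ACCESS_KEY', 'MY_SECRET_KEY');

$file   = '/path/to/video.flv';   // local Flash video
$bucket = 'my-bucket';
$uri    = basename($file);        // object key on S3

if (!S3::putObject(S3::inputFile($file), $bucket, $uri, S3::ACL_PUBLIC_READ)) {
    error_log("S3::putObject() failed for $file");
}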


But then occasionally I get either this:

user warning: S3::putObject(): [55] select/poll returned error

...or this...

user warning: S3::putObject(): [RequestTimeout] Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.


---------------------

Before this I was using a different class that relied on PEAR and some other services to interact with S3, and I got the [RequestTimeout] error nearly every time.

Question by:shambright
14 Comments
 
Expert Comment by gr8gonzo (ID: 24800309)
Are you logging details of which files are failing? In particular, it would be good to know the size of the files that are failing to upload, the types, and any common traits they might have.

Or is it more like you upload the same file 5 times and 1 time it fails?

I've never really heard great things about Amazon's cloud, so it may ultimately be a reliability / service issue on their end, but at the very least, we can explore the possibilities...
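
If you're not logging already, something like this around the existing putObject() call would capture the basics (the log path and variable names are just examples):

<?php
// Example only: record size, type and outcome for every upload attempt
// so the failing files can be compared later. Log path is just an example.
$ok = S3::putObject(S3::inputFile($file), $bucket, $uri, S3::ACL_PUBLIC_READ);

$entry = sprintf(
    "[%s] %s | %d bytes | %s | %s\n",
    date('c'),
    $file,
    filesize($file),
    function_exists('mime_content_type') ? mime_content_type($file) : 'unknown',
    $ok ? 'OK' : 'FAILED'
);
file_put_contents('/tmp/s3-upload.log', $entry, FILE_APPEND);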
Author Comment by shambright (ID: 24804294)
We are moving Flash videos to S3.

The common thread appears to be that files above 20MB don't make it.
Expert Comment by gr8gonzo (ID: 24804897)
Progress! :)

Okay, so have you tested the limits to see if there's an exact file size limit? For example, 15 meg files, 17 meg files, 19.9 meg files, 20.1 meg files?

Also, how are the files making their way to the S3 service? Are users uploading the files to your server first, and your server uploads them to the S3 service right away? If so, are you checking to make sure the file has successfully uploaded to your server before trying to put them on Amazon S3?
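
If users are pushing the files up through a form first, a quick sanity check before handing anything to S3 would look something like this ('video' is just a placeholder field name):

<?php
// Example sanity check on a browser upload before it is sent on to S3.
// 'video' is a placeholder form field name.
if (!isset($_FILES['video']) || $_FILES['video']['error'] !== UPLOAD_ERR_OK) {
    die('Upload to this server failed - not sending to S3.');
}
$tmp = $_FILES['video']['tmp_name'];
if (!is_uploaded_file($tmp) || filesize($tmp) === 0) {
    die('Uploaded file is missing or empty - not sending to S3.');
}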
Author Comment by shambright (ID: 24805048)
Verified: I was able to upload a 7MB Flash file using the above library with no trouble.
Author Comment by shambright (ID: 24805075)
Sorry.... I should have refreshed.

The files rest comfortably on our server before being transported out. Ultimately, this will be a cron function to happen "behind the scenes".

The breakdown occurs somewhere above 11MB. Can't give an exact amount, though.
Expert Comment by gr8gonzo (ID: 24805181)
Can you create a text file that starts at exactly 11 megs (11534336 bytes), test the upload, then increment it by 256k and try again, and keep doing that until it breaks? If it breaks at a fixed size, that's a perfect starting place - it's probably a setting somewhere if that's the case.
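
Something like this would automate the size sweep (bucket name, credentials and temp path are placeholders; assumes the same S3.php class):

<?php
// Example sweep: grow a test file by 256 KB per iteration and try to upload it.
// Bucket/key/credentials are placeholders; assumes S3.php is available.
require_once 'S3.php';
S3::setAuth('MY_ACCESS_KEY', 'MY_SECRET_KEY');

$path   = '/tmp/s3-test.dat';
$bucket = 'my-bucket';
$size   = 11 * 1024 * 1024;   // start at exactly 11 MB (11534336 bytes)
$step   = 256 * 1024;         // grow by 256 KB each round

while (true) {
    file_put_contents($path, str_repeat('x', $size));
    $ok = S3::putObject(S3::inputFile($path), $bucket, basename($path), S3::ACL_PRIVATE);
    printf("%d bytes: %s\n", $size, $ok ? 'OK' : 'FAILED');
    if (!$ok) break;          // the first failing size is the interesting number
    $size += $step;
}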
Author Comment by shambright (ID: 24805858)
Success at 14.4MB (15249829 bytes)
...but then the same file upload fails on the next try.

Larger files appear to fail consistently.
Expert Comment by gr8gonzo (ID: 24805943)
Hmmm, that's not good if the same file fails after succeeding (unless it failed because the file already existed). Can you try the same 14.4MB file 4 times in a row? If it works once and then fails 3 times, then we know it's probably failing due to the file already being uploaded.
Author Comment by shambright (ID: 24807868)
That's not it - I kept adding/subtracting data to the same file, and it uploaded over itself with no problems at smaller sizes.
Author Comment by shambright (ID: 24808307)
I am starting to suspect that my PHP version is out of date, and causing issues communicating with Amazon.

Upgrading PHP and will report back...
Author Comment by shambright (ID: 24809712)
Didn't upgrade PHP, but I found this:

By turning off CURLOPT_NOPROGRESS (i.e. setting it to false) and running my PHP script from the command line, I can see cURL's progress meter.
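
For reference, the relevant option looks like this (standalone illustration; in S3.php the equivalent curl_setopt() line goes next to the class's other cURL options):

<?php
// Standalone illustration of the effect: with CURLOPT_NOPROGRESS set to false,
// libcurl prints its progress meter to stderr when the script runs from the CLI.
$ch = curl_init('https://s3.amazonaws.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOPROGRESS, false);   // enable the progress meter
curl_exec($ch);
curl_close($ch);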

The connection is made to S3 and the headers are sent, then the file begins its upload.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
 19 25.9M    0     0   19 5088k      0   303k  0:01:27  0:00:16  0:01:11  390k



Then after a little while, the "Current" drops to zero.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
 71 25.9M    0   348   71 18.6M      3   215k  0:02:03  0:01:28  0:00:35     0


On this last run it actually stalled and started again a few times before not coming back - and you can see that I actually got 18.6MB to go through...

Now what?
Accepted Solution by gr8gonzo (earned 1500 total points, ID: 24810568)
Well, the problem probably lies in one of these:

#1. Hardware. For example, a router along the way is being overloaded with shared traffic. I'd contact your server/hosting ISP and ask whether they can analyze the traffic and monitor the load on their networking equipment while you run your test. Most of the internet backbones are major enough that networking problems would only happen within the local network or something close to it. This is probably not the problem (the symptoms aren't quite right), but you never know.

#2. Scheduled interference. Again, not probable, but you never know. I've seen some novice sysadmins run into problems that they resolve by having the network interface reset on a scheduled basis. Or there could be a cron job that happens to be running at certain intervals, and the larger files take long enough for you to see the effects. Open 2 console/shell windows side-by-side. On the right-hand console, just type date and don't press enter. On the left-hand side, run the script again to upload a 20 meg file. As SOON as it starts dropping to 0, run the date command on the right-hand window. See if the time is close to a minute marker. Do it a couple times and see if there are any patterns in the times that it stalls out (e.g. every 5 minutes, etc). If so, check your crontab to see what's running. Apps like webmin make it easy to see system-wide cron jobs in case jobs are set up under a different user.

Also, after it stalls, run:
ls -lt /var/log | head

Check which log files have changed recently to see if anything helpful is being logged.

#3. cURL. Usually PHP will just segfault if there's a significant problem, so I don't think it's due to an old version of PHP (but you never know). I think there's a greater chance that cURL is failing in some way internally. You could always recompile the latest cURL, and then upgrade to the latest PHP in the same run - it's almost always good to be on the latest PHP release anyway to avoid memory leaks and security holes. (A quick way to check which PHP and cURL versions are actually in play is sketched after this list.)

#4. Amazon S3. There's always a chance that Amazon's S3 service just isn't handling its load very well and is dropping connections. Try running the same test case / upload on a different server on a different Internet connection. If you still see problems uploading 20 meg files, then you've almost certainly eliminated all the other possibilities.
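
As mentioned in #3, here's a quick way to check which PHP and cURL builds the script is actually using:

<?php
// Print the PHP version and the cURL library version PHP was built against.
echo 'PHP: ' . PHP_VERSION . "\n";

$curl = curl_version();   // returns an array of cURL build details
echo 'cURL: ' . $curl['version'] . ' (SSL: ' . $curl['ssl_version'] . ")\n";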
Author Comment by shambright (ID: 24935942)
Well... without changing anything, the problem went away.

I did have time to rewrite my scripts to serve files locally until a file can be relocated to S3 - so I guess that was a good thing.
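
The fallback logic is basically this (simplified; paths and bucket name are placeholders, and it leans on the same S3.php class's getObjectInfo() helper):

<?php
// Simplified version of the fallback: serve the local copy until the file
// has actually made it to S3, then hand out the S3 URL instead.
// Paths and bucket name are placeholders; assumes S3.php is loaded and auth is set.
function videoUrl($filename) {
    $local  = '/var/www/videos/' . $filename;
    $bucket = 'my-bucket';

    // getObjectInfo() returns false if the object isn't on S3 yet.
    if (S3::getObjectInfo($bucket, $filename) !== false) {
        return "http://$bucket.s3.amazonaws.com/" . rawurlencode($filename);
    }
    return file_exists($local) ? '/videos/' . rawurlencode($filename) : null;
}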

Thanks for giving me things to poke at.
Author Closing Comment by shambright (ID: 31600881)
I actually posted this elsewhere. The problem just went away, so I guess it was something at Amazon.

Thanks for giving lots of options to poke at. Always good to learn "where to look" for future troubleshooting.