Solved

PHP class to work with Amazon S3 fails randomly

Posted on 2009-07-07
14
1,412 Views
Last Modified: 2013-11-14
I am using the PHP class described in this other question, and most of the time it works beautifully.
http://undesigned.org.za/2007/10/22/amazon-s3-php-class


But then occasionally I get either this:

user warning: S3::putObject(): [55] select/poll returned error

...or this...

user warning: S3::putObject(): [RequestTimeout] Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.


---------------------

Before this, I was using a different class that relied on PEAR and some other libraries to interact with S3, and I got the [RequestTimeout] error nearly every time.

Question by:shambright
14 Comments
 
LVL 34

Expert Comment

by:gr8gonzo
Are you logging details of which files are failing? In particular, it would be good to know the size of the files that are failing to upload, the types, and any common traits they might have.

Or is it more like you upload the same file 5 times and 1 time it fails?

I've never really heard great things about Amazon's cloud, so it may ultimately be a reliability / service issue on their end, but at the very least, we can explore the possibilities...
 

Author Comment

by:shambright
We are moving Flash videos to S3.

The common thread appears to be that files above 20 MB don't make it.
 
LVL 34

Expert Comment

by:gr8gonzo
Progress! :)

Okay, so have you tested the limits to see if there's an exact file size limit? For example, 15 meg files, 17 meg files, 19.9 meg files, 20.1 meg files?

Also, how are the files making their way to the S3 service? Are users uploading the files to your server first, and your server uploads them to S3 right away? If so, are you checking that each file has successfully uploaded to your server before trying to put it on Amazon S3?
 

Author Comment

by:shambright
Verified: I was able to upload a 7 MB Flash file using the above library with no trouble.
 

Author Comment

by:shambright
Sorry.... I should have refreshed.

The files rest comfortably on our server before being transported out. Ultimately, this will run as a cron job "behind the scenes".

The breakdown occurs somewhere above 11 MB. I can't give an exact threshold, though.
 
LVL 34

Expert Comment

by:gr8gonzo
Can you create a text file that starts at exactly 11 MB (11534336 bytes), test the upload, then increment it by 256 KB and try again, repeating until it breaks? If it breaks at a fixed size, that's a perfect starting place; a hard cutoff is probably a setting somewhere.
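A quick way to script that test (a sketch; `truncate` zero-fills the files, and the `test-*.bin` names are just placeholders for whatever your upload script expects):

```shell
# Generate a ladder of zero-filled test files starting at exactly
# 11 MB (11534336 bytes) and growing in 256 KB (262144-byte) steps.
# Feed each one to your S3 upload script and note the first size
# that fails.
start=11534336
step=262144
for i in 0 1 2 3 4 5 6 7; do
    size=$((start + i * step))
    truncate -s "$size" "test-$size.bin"
done
ls -l test-*.bin
```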
 

Author Comment

by:shambright
Success at 14.4 MB (15249829 bytes)
...but then the same file upload fails on the next try.

Larger files appear to fail consistently.
 
LVL 34

Expert Comment

by:gr8gonzo
Hmmm, that's not good if the same file fails after succeeding (unless it failed because the file already existed). Can you try the same 14.4 MB file 4 times in a row? If it works once and then fails 3 times, then we know it's probably failing because the file has already been uploaded.
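That repeat test is easy to script; `upload` below is a placeholder shell function, so swap in whatever actually calls S3::putObject:

```shell
# Run the same upload four times in a row and log each outcome.
# upload() is a stand-in: replace its body with the real call,
# e.g. "php upload.php $1" (hypothetical script name).
upload() {
    true   # real S3::putObject call goes here
}

for run in 1 2 3 4; do
    if upload test-14mb.bin; then
        echo "run $run: ok"
    else
        echo "run $run: FAILED"
    fi
done
```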
 

Author Comment

by:shambright
That's not it; I kept adding and subtracting data in the same file, and at smaller sizes it uploaded over itself with no problems.
 

Author Comment

by:shambright
I am starting to suspect that my PHP version is out of date and is causing issues communicating with Amazon.

Upgrading PHP and will report back...
 

Author Comment

by:shambright
Didn't upgrade PHP, but I found this:

By setting CURLOPT_NOPROGRESS to false and running my PHP script from the command line, I can see the progress meter.

The connection is made to S3 and the headers are sent, then the file begins its upload.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
 19 25.9M    0     0   19 5088k      0   303k  0:01:27  0:00:16  0:01:11  390k



Then after a little while, the "Current" drops to zero.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
 71 25.9M    0   348   71 18.6M      3   215k  0:02:03  0:01:28  0:00:35     0


This last run, it actually stalled and started again a few times before not coming back - and you can see that I actually got 18.6Mb to go through...
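One way to turn that silent stall into a fast, retryable failure is curl's low-speed abort; the same knobs exist in PHP's cURL as CURLOPT_LOW_SPEED_LIMIT and CURLOPT_LOW_SPEED_TIME. A sketch with the command-line client (the bucket URL is a placeholder, and a real S3 PUT also needs the Date/Authorization headers the PHP class normally computes):

```shell
# Abort if throughput stays below 1 KB/s for 30 seconds, then retry
# up to 3 times, instead of waiting for the server-side timeout.
# The URL is a placeholder; real S3 PUTs need signed headers.
for attempt in 1 2 3; do
    if curl --fail --speed-limit 1024 --speed-time 30 \
            --upload-file video.flv \
            "http://example-bucket.s3.amazonaws.com/video.flv"; then
        echo "attempt $attempt: ok"
        break
    fi
    echo "attempt $attempt: stalled or failed, retrying"
done
```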

Now what?
 
LVL 34

Accepted Solution

by:
gr8gonzo earned 500 total points
Well, the problem probably lies in either:

#1. Hardware. For example, a router along the way is being overloaded with shared traffic. I'd contact your hosting provider / ISP and ask whether they can monitor the load on their networking equipment while you run your test. Most of the internet backbones are major enough that networking problems would only happen within the local network or something close to it. This is probably not the problem (the symptoms aren't quite right), but you never know.

#2. Scheduled interference. Again, not probable, but you never know. I've seen some novice sysadmins run into problems that they resolve by having the network interface reset on a scheduled basis. Or there could be a cron job that happens to be running at certain intervals, and the larger files take long enough for you to see the effects. Open 2 console/shell windows side-by-side. On the right-hand console, just type date and don't press enter. On the left-hand side, run the script again to upload a 20 meg file. As SOON as it starts dropping to 0, run the date command on the right-hand window. See if the time is close to a minute marker. Do it a couple times and see if there are any patterns in the times that it stalls out (e.g. every 5 minutes, etc). If so, check your crontab to see what's running. Apps like webmin make it easy to see system-wide cron jobs in case jobs are set up under a different user.
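The two-console procedure can also be collapsed into one script that timestamps the upload as it runs (here `sleep 3` stands in for the real upload command):

```shell
# Print a timestamp every second while the upload runs in the
# background; when the progress meter stalls, the nearby timestamps
# show whether the stall lands on a minute boundary (i.e. cron).
sleep 3 &                 # stand-in for: php upload.php big.flv &
pid=$!
while kill -0 "$pid" 2>/dev/null; do
    date '+%H:%M:%S'
    sleep 1
done
```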

Also, after it stalls, run:
ls -lt /var/log | head

See what log files have changed recently to see if anything helpful might be getting logged.

#3. CURL. Usually PHP will just segfault if there's a significant problem, so I don't think it's due to an old version of PHP (but you never know). I think there would be a greater chance that cURL is failing in some way (internally). You could always recompile the latest cURL, and then upgrade to the latest PHP in the same run. It's almost always good to be on the latest PHP release to avoid memory leaks and security holes anyway.
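Before recompiling anything, it's worth recording which versions are actually in play, since the command-line curl, PHP itself, and the libcurl PHP is linked against can all differ:

```shell
# Record the versions currently in use so an upgrade can be verified.
curl --version | head -n 1                  # command-line curl
if command -v php >/dev/null; then
    php -v | head -n 1                      # PHP itself
    php -r '$v = curl_version(); echo $v["version"], "\n";'   # libcurl linked into PHP
fi
```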

#4. Amazon S3. There's always a chance that Amazon's S3 service just isn't handling its load very well and is dropping connections. Try running the same test case / upload on a different server on a different Internet connection. If you still see problems uploading 20 meg files, then you've almost certainly eliminated all the other possibilities.
 

Author Comment

by:shambright
Well... without changing anything, the problem went away.

I did have time to rewrite my scripts to serve files locally until a file can be relocated to S3, so I guess that was a good thing.

Thanks for giving me things to poke at.
 

Author Closing Comment

by:shambright
I actually posted this elsewhere, too. The problem just went away, so I guess it was something on Amazon's end.

Thanks for giving me lots of options to poke at. It's always good to learn "where to look" for future troubleshooting.

