Crazy Horse (South Africa) asked:

Compressing already uploaded files

I have hundreds of already uploaded photos and want to compress their size. Without having to download them all, compress them and upload them again, is there some way of compressing all images in a particular folder already on a live server?
ASKER CERTIFIED SOLUTION
Chris Stanyon (United Kingdom)

(Solution available to Experts Exchange members only.)
You can also expand on the code Chris provided.

You don't post your file extension (jpg, gif, png, etc.) or type of photo (illustration, photograph, animated), so the answer is...

Every combination of image type, like gif + gif/transparent + gif/transparent/animated + png + jpg, has many tools for compression.

Each combination will have a tool + recipe (command line options) to produce the best image.
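
For example, a few common single-format recipes, as a sketch (the exact quality/optimisation levels here are illustrative, not from this thread):

jpegoptim --max=85 --strip-all photo.jpg   # lossy JPEG recompression, strips metadata
optipng -o2 image.png                      # lossless PNG optimisation
gifsicle --batch -O3 anim.gif              # optimise GIF frames in place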

Now a simple way to approach this, which will likely produce good file compression + keep good quality, is to use ImageMagick. It works on all images, independent of transparency or animation or any other factor, so something like this...

# Re-encode at quality 90, then swap the compressed copy in,
# keeping the original in an archive path
convert foo.ext -quality 90 foo.new.ext
mv foo.ext /image-archive-path/foo.original.ext
mv foo.new.ext foo.ext

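To run that same recipe over a whole folder on the live server (the point of the original question), here's a minimal sketch, assuming JPEGs under a hypothetical /var/www/html/photos and the same archive path:

# Archive each original, then write the recompressed copy back in place
mkdir -p /image-archive-path
for f in /var/www/html/photos/*.jpg; do
    orig="/image-archive-path/$(basename "$f" .jpg).original.jpg"
    mv "$f" "$orig"
    convert "$orig" -quality 90 "$f"
done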


You can get far better compression + quality using other tools + image-specific recipes, though sometimes the simple convert trick produces impressive compression results with very little coding effort.
Crazy Horse (ASKER):

Thanks guys, what is standard practice? To compress upon upload?

I have looked at a few services, like the one below, where it doesn't really matter what you upload; it will generate optimized images for you on the fly. The only problem is that you have to pay (the free plan wouldn't be sufficient for my needs).

https://uploadcare.com
There's no "standard practice".

I tend to do compression as images appear in a DocumentRoot file system (using inotifywait -rmq), because...

You can make up a "standard practice" + there's no way to enforce it, so in a rapid site development environment it's easier just to do compression on the fly... as files appear...
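
A minimal sketch of that watch-and-compress approach, assuming inotify-tools is installed and /var/www/html is the DocumentRoot (both assumptions):

# Watch the tree, recompress each image once its write completes
inotifywait -rmq -e close_write --format '%w%f' /var/www/html |
while read -r f; do
    case "$f" in
        *.jpg|*.jpeg) jpegoptim --max=85 --strip-all "$f" ;;
        *.png)        optipng -o2 "$f" ;;
    esac
done
# A real version needs a guard (marker file, size check, etc.) so the
# optimiser's own writes don't re-trigger the watch.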
How and when you optimise your images will depend entirely on your application needs. If the upload is handled through a PHP script, such as a user uploading files, then it makes sense to handle the compression when the file is uploaded.

If you're talking about static images (i.e. site images / CSS etc.) then it makes more sense to add some image optimisation to your build process.
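
As a sketch of such a build step (the asset path and tool choices here are illustrative):

find ./public/img -name '*.png' -exec optipng -o2 {} \;
find ./public/img \( -name '*.jpg' -o -name '*.jpeg' \) -exec jpegoptim --max=85 --strip-all {} \;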
Thanks guys, no, the user uploads all the images for the site. This is where I like WordPress, because there are so many awesome plugins that do this for you right out of the box, but for this project I couldn't use WordPress. I actually purchased the plugin below for the project, and it works really nicely. It offers a little compression if you set the quality to 90 or 80, but even so, when I run a Google PageSpeed test it tells me my images aren't optimized.

So, that is the whole point of this question really. I need to be able to compress the images enough for Google not to mark me down. Maybe I would have to run another compressor in this script?

Sorry, I don't mean to drag this question out or repeat myself, just trying to figure out the best way to do this. If the user is uploading a lot of images themselves, I would rather it just optimize them there and then, instead of me having to periodically go and run a script to compress the already uploaded images.

https://innostudio.de/fileuploader/
Sounds like it makes sense to compress on upload. You'll need to get the balance of compression right (set the 'editor' options in the plugin): quality over size. There's no point in getting a great Google Speed result if the image quality is poor.
Ah... I see...

Generally, if you compress your images so Google says they're compressed, then you hit the problem Chris mentions.

I never pay attention to Google or GTMetrix regards image compression.

Your image quality will be very poor.

Best to do this.

1) Compress images to smallest size, preserving quality.

2) Run the HTTP/2 protocol + SSL with HSTS + OCSP Stapling, which allows for extremely fast serving of all assets/files required to render a page.

You can read about why HTTP/2 is so much faster, especially for image-heavy sites... meaning many image files, big image files, or both.
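
As a quick sanity check that a site really is serving over HTTP/2 (assumes a reasonably recent curl built with HTTP/2 support; the URL is a placeholder):

curl -sI --http2 -o /dev/null -w '%{http_version}\n' https://example.com
# prints "2" when the request was served over HTTP/2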
"I never pay attention to Google or GTMetrix regards image compression."
I don't either. While they have some good recommendations, you cannot satisfy Google PageSpeed. Google doesn't even do it themselves. Do what makes sense. But don't re-compress the images. I never do that.
Dave brings up a good point. I've never seen a 100% score when running Google's test against Google's own pages.

Dave also brings up another good point which can be expanded.

Always make sure you have a copy of your original image somewhere.

The auto-compression tools I run across my sites always save a timestamped original, like foo.png becomes foo-20180817-113737.png, so any time a new original is uploaded, I test it against previous originals + either save the physical file or link to a previous file.

Point is, I always keep originals (never throw them away via an overwrite operation).
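
A minimal sketch of that timestamped-archive scheme (file names and the comparison logic are illustrative):

f=foo.png
ts=$(date +%Y%m%d-%H%M%S)
archive="${f%.*}-${ts}.${f##*.}"                     # e.g. foo-20180817-113737.png
prev=$(ls -t "${f%.*}"-*."${f##*.}" 2>/dev/null | head -n 1)
if [ -n "$prev" ] && cmp -s "$f" "$prev"; then
    ln "$prev" "$archive"   # unchanged upload: hard-link the previous original
else
    cp -p "$f" "$archive"   # new or changed upload: keep a real copy
fi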