Best way to optimise images and pictures to display on the web

Conscious that hosting a website on a Raspberry Pi over a home internet connection may lead to slower load times than you'd want, I've been working on speeding up its response.

Throughout its development I've focused on server speed in a bid to improve load times. I think I've got that cracked – the only thing I'm lacking is a decent workflow for optimising the file size of the images I'm serving up.

I played around with a few web and Windows tools, but the process always seemed time-consuming and cumbersome.


After a while I settled upon a plugin for a tried and tested bit of software that I've been using for as long as I can remember - IrfanView with its 'Save for Web' plugin. It is quick, easy and tunable, and it provides a live preview of what the output will look like.

It is actually really quick if you've associated IrfanView to automatically open image files. My workflow was to drop the required image into the folder on the web server, hit Enter to open it, then select 'Save for Web' - the GUI pops up and you just hit save to overwrite the file with the optimised settings.

The results were good - I was getting some decent file size reductions. Although it doesn't take too long, what is really needed is an automated way to transform an image in a single click.


I've been an ImageMagick user for a while now, so this was my first port of call. After a bit of playing around I soon realised that all I needed was to change the quality to 70. Easy peasy - just run the following and you are done:

convert sample.jpg -quality 70 output.jpg

A note for Windows users: if you try to run convert.exe without specifying the path to the executable it won't work, because Windows finds a different convert.exe first (a system utility for converting FAT volumes to NTFS). To fix this I simply renamed ImageMagick's convert.exe to img_convert.exe and dropped it into my Windows directory. The command now becomes:

img_convert sample.jpg -quality 70 output.jpg


The next trick is to remove all of the EXIF metadata attached to the image. You might not think it amounts to many bytes, but a) they all add up and b) you'd be surprised how much info a modern digital camera spits out.

So go ahead and install ExifTool on your system and run the following:

exiftool -all= output.jpg -overwrite_original -P
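To see how much the metadata alone was costing, you can compare the byte count before and after stripping - a small sketch, assuming output.jpg is the re-encoded file from the previous step:

```shell
# Measure how many bytes the EXIF metadata accounted for.
before=$(wc -c < output.jpg)
exiftool -all= output.jpg -overwrite_original -P
after=$(wc -c < output.jpg)
echo "metadata removed: $((before - after)) bytes"
```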

Bringing it all together

Now all we have to do is cobble the commands together and we are away.


Go ahead and create a new shell script (I'll call it optimise.sh here) and paste the following code in:

convert $1 -quality 70 $2 && exiftool -all= $2 -overwrite_original -q

We then have to make it executable:

chmod +x optimise.sh

and now we can run it with:

./optimise.sh input.jpg output.jpg


It is a similar story for Windows: create a batch file (optimise.bat, say), open it in Notepad and paste the following:

img_convert %1 -quality 70 %2
exiftool -all= %2 -overwrite_original -P

Drag & Drop

With Windows we can add an extra flourish: you can set up the batch file so that you optimise a JPEG just by dragging the image onto the .bat - it converts the file automatically without you going near a command prompt.

Create a new batch file (drag_optimise.bat, say) and enter this code:

@echo off
rem back the original up next to itself, then optimise it in place
copy "%~1" "%~dpn1_%~x1"
img_convert "%~1" -quality 70 "%~1"
exiftool -all= "%~1" -overwrite_original -P

What the code does is copy input.jpg to input_.jpg as a backup, then optimise the original file in place.

The results

I was quite impressed by the results; I tried it on a plethora of files. The most notable reductions come from digicam images. Here's a typical example:

Original digicam JPEG: 2,843 KB
Reduced to quality 70: 650 KB
EXIF info removed: 595 KB

That's an impressive 79% reduction overall.
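That percentage falls straight out of the numbers above; for any pair of sizes the shell can work it out for you:

```shell
# Integer percentage saved, using the sizes from the example above (in KB).
orig=2843
final=595
pct=$(( (orig - final) * 100 / orig ))
echo "${pct}% reduction"
```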

Extra credit

Head over to GitHub for extended versions of the scripts, with command line options that give you control of the quality parameter and also let you resize the image:

Web image optimiser on GitHub

