For some reason my brain has twigged onto this today instead of doing anything else useful, so I’m gonna look at various lossy image formats and see what their story is like. Basically, what stuff exists newer than the good old de-facto standard of JPG, how do they compare to it, and why do they exist? Like 75% of this information is from Wikipedia and the other 25% is from the websites of the projects in question, but I like to think I do a good job of aggregating it into a sensible story.

I was originally gonna look at lossless image formats too, but it turned out that the story for those is pretty boring: PNG has ruled the general-purpose world since the early 2000’s and while QOI is adorable in its niche and TIFF is a high-grade swiss-army-chainsaw for all sorts of raster data, nothing has really bothered dethroning PNG in everyday usage. Lossy image formats, on the other hand, have a lot more drama involved…

Last updated July 2023.


JPG

You know what this is. Made in 1992 by the Joint Photographic Experts Group. One of the first widespread and pretty good lossy image formats.

JPEG 2000

Made in 1997-2000 by JPEG and extended several times since then. Never quite managed to catch on. Has some patent uncertainties, according to Wikipedia at least.

JPEG XR

Open file format developed by Microsoft (originally as HD Photo), later standardized by the actual Joint Photographic Experts Group, so the name is only partly marketing. Its algorithms are similar to JPG’s. Doesn’t seem to get actually used very much?

JPEG XS

Kinda its own thing, developed basically for low-latency video by the Joint Photographic Experts Group. First version published in 2019, work is still ongoing. Claims to be “visually lossless”.

JPEG XL

The actual intended successor to JPEG 2000. Developed by the Joint Photographic Experts Group, development started in 2017, finished in 2022ish. Can be lossy or lossless. Supports HDR and other such fun things. Royalty-free.

WEBP

Developed by Google in 2010. Has both lossy and lossless compression, the lossy one is based on the VP8 video codec while the lossless one is its own thing. The lossy format has been panned a time or two for poor perceptual quality compared to jpg and h.264-based compression.

For a while Google was developing a successor, WEBP2, though in 2021 it backed off and said WEBP2 would remain a research project and playground rather than anything actually specified.

FLIF

Made in 2015. No longer developed as of 2017; the ideas put into it have been folded, by way of its successor FUIF, into JPEG XL.

BPG

Made in 2014 by Fabrice Bellard, creator of FFmpeg and QEMU. Uses an encoding based on the HEVC video codec (aka h.265), which is patented. Supports HDR (up to 14 bits per channel). Claims to support lossless compression and animation. Last released in 2018; seems kinda no longer necessary since HEIC does basically the same thing anyway.

HEIF/HEIC

HEIF is an open-ish image container format. When containing HEVC-compressed still images it gets called HEIC, which seems kinda silly but oh well. You can technically stuff h.264 or JPEG image data into it also but idk why you’d want to. Developed in 2013-2015 by MPEG, adopted in 2017 by Apple which apparently started making it actually popular.

AVIF

Ah, that’s why Wikipedia is careful to make the distinction that HEIF is an image container, it’s actually used by other things! This is an HEIF image container holding data compressed with a royalty-free codec called AV1, descended from VP8 and VP9 – so it was basically developed by Google and some of their cronies. Applying the codec to static images and making a format out of it happened around 2018, with Netflix publishing the first AVIF images, presumably to combat patent bullshit.

Video codecs

This is here purely because the latest gen of image formats appear to be based on video compression algorithms now. Let’s take a look…

h.264/AVC/MPEG-4 Part 10

Published in 2003ish, got popular by 2008ish. Infamous at the time for being a) really much better than most other things, and b) patent-encumbered as shit and aggressively litigated by MPEG LA (a licensing company apparently unrelated to the actual Moving Picture Experts Group). AVC (Advanced Video Coding) is just its name on the ISO/MPEG side, which seems to have won out over time.

HEVC/H.265/MPEG-H Part 2

First version published in 2013. Per Wikipedia the HEVC video codec, like h.264, is patented, with the main patent pool administered by MPEG LA. Based on https://www.ffmpeg.org/legal.html , MPEG LA is basically a “legit” patent troll whose job it is to extract money from people using those patents. Apparently they don’t really care if you use their stuff for noncommercial purposes or make open-source re-implementations, but if you start making money from it they will come after your ass. Google, Mozilla, GPU vendors that implement hardware codecs, etc all presumably pay MPEG LA for you, which is why your browsers and phones and stuff can play HEVC videos.

VP8

Video format originally developed by On2 Technologies and released in 2008; Google bought On2 in 2010 and open-sourced VP8 to compete with h.264, royalty-free. There were big slap-fights over it between MPEG LA and Google, but by 2013 they seem to have been resolved more or less in Google’s favor – or at least, Google paid them off to let people use VP8 without risking getting sued. (The other main contender in that slap-fight was Xiph’s open source Ogg Theora, which lost hard; unfortunately Xiph has not been terribly relevant in the video/image space since, but their audio tech like Opus, Speex, and FLAC remains pretty darn good.)

VP9

Successor to VP8, released by Google in 2013. Seems pretty good but IMO never quite caught the mindshare that VP8 and h.264 did.

AV1

Successor to VP9; work started around 2015 and the first stable spec was released in 2018. By then Google had formed a consortium called the Alliance for Open Media, AOmedia, which is basically there to try to counter-balance MPEG LA’s patent bullshit. AOmedia includes Google, Mozilla, Amazon, and Netflix, among others – ie the people who actually produce and distribute most of the video on the internet.

The story

Okay, after looking at all of this, it is VERY clear that there are 3 separate lineages of image formats, and they are HEAVILY driven by patents for the codecs they are based off of, and this is a fight that’s been going on continually since 2008 or earlier. Moreover, two of the lineages really are about video codecs, because since Youtube, Netflix and Amazon are now our main media distribution companies, that’s where the money is. All the actual image formats apart from the JPEG lineage piggyback off of those video codecs.

So the lineages are:

  • The MPEG Lineage. h.264 -> h.265/HEVC, eventually producing HEIC. Developed and driven by evil IP companies to extract license fees from Google/Amazon/Netflix. Apple bought into it heavily and it’s now “the Mac/iPhone image format”, presumably because before 2020 or so it was actually the best option around.
  • The Web (aka Google) Lineage. VP8 -> VP9 -> AV1, producing WEBP (VP8) -> AVIF (AV1). Developed by Google/Netflix/Amazon so they don’t have to pay the MPEG license fees, and thus are generally freely licensed and any patents are held by people who promise not to litigate on them (as long as you don’t litigate back).
  • The JPEG Lineage. JPG -> JPEG 2000 -> JPEG XL. Developed by the people who actually just want to have good digital photographs, rather than having billion-dollar slap fights about licensing fees. Thus they have the least mindshare and the least pull in the computer and electronics industries.

There’s some offshoots of this set of family trees of course, like FLIF and BPG and so on. But most of them seem to have either withered or gotten merged back into the main lineages.

A tougher story in this generation-long licensing battle has been hardware support. The MPEG Lineage tends to make good tech before other people do, and thus is good at getting its tech into GPU’s for hardware-accelerated encoding and decoding first. NVidia has had HEVC decoding in GPU’s since 2015, while NVidia and AMD have only been putting AV1 decoding into their GPU’s since 2020. (I’m not even gonna TOUCH mobile GPU’s, which are much more cursed than desktop GPU’s.)

The benchmark

Okay, so how well do these even work??? Well… that depends on your priorities. So, let’s discuss my own priorities: I suck at picking out fine visual details, so I can NOT judge these by quality. HEIC, AVIF and JPEG XL all seem to support high-bit-depth color spaces and other fancy features, which I will almost never use because I’m not an artist or photographer. All I can say is that they look pretty similar in terms of bells-and-whistles feature set, while older formats like JPG and WEBP are usually more limited. And finally, the quality and speed of encoding can depend a lot on the encoder, hardware acceleration support, etc, and that’s a rabbit hole I do not want to go down.

(This analysis has much more depth, with cool graphs: https://cloudinary.com/blog/jpeg-xl-and-the-pareto-front . Note that it uses a newer version of libjxl than I do.)

So I’m gonna compare these on the basis of compression speed and compressed size. I will convert the same corpus of images to each format, badly, using ImageMagick with default settings and whatever default codec it decides to use. I sure as heck don’t have the patience to start digging into the ups and downs of different codec implementations and their numerous and sundry quality settings. The image corpus is a chunk of my porn library, which should be a sufficient variety of JPG images with different image characteristics.

The encoding time is measured on my desktop, with something resembling the following command line: date; /bin/ls | parallel convert {} ../jpg2000/{}.jp2; date. I ended up doing it on a fast SSD so it isn’t noticeably I/O bound, or at least I/O doesn’t dominate the times. convert didn’t understand JPEG XL yet, so I had to install libjxl-tools version 0.9 and use cjxl instead. Then THAT defaults to “lossless transcoding” for old JPEGs, which is not something the other codecs are capable of doing, so I had to set --lossless_jpeg=0 to make it actually convert apples to apples.

Decoding time was measured similarly by converting each image from the lossy format to png, so the absolute times are BS but the relative times are true.
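For concreteness, here’s roughly what that benchmark driver looks like as a script. This is a sketch only: it assumes ImageMagick’s convert and libjxl’s cjxl are on the PATH, the directory names are made up, and the real runs used GNU parallel rather than this serial loop.

```python
# Sketch of the benchmark driver. Assumes ImageMagick's `convert` and
# libjxl's `cjxl` are installed; directory names are hypothetical.
import subprocess
import time
from pathlib import Path

def conversion_command(src: Path, out_dir: Path, ext: str) -> list[str]:
    # cjxl needs special-casing: --lossless_jpeg=0 forces a real lossy
    # re-encode instead of losslessly transcoding the source JPG.
    dst = out_dir / f"{src.name}.{ext}"
    if ext == "jxl":
        return ["cjxl", "--lossless_jpeg=0", str(src), str(dst)]
    return ["convert", str(src), str(dst)]

def benchmark(src_dir: str, out_dir: str, ext: str) -> float:
    # Convert every JPG and return wall-clock seconds, I/O included.
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    start = time.monotonic()
    for src in sorted(Path(src_dir).glob("*.jpg")):
        subprocess.run(conversion_command(src, out, ext), check=True)
    return time.monotonic() - start
```

Something like benchmark("jpg", "../avif", "avif") would then mirror the date; …; date timings above.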

Format                       Age        Lineage  Size  Encoding time  Decoding time
JPG (baseline)               Oldschool  JPEG     649M  1 sec (noop)   91 sec
JPEG 2000                    Midschool  JPEG     1.7G  45 sec         118 sec
WEBP                         Midschool  Web      464M  44 sec         85 sec
JPEG XL (--lossless_jpeg=0)  Newschool  JPEG     510M  30 sec         44 sec
JPEG XL                      Newschool  JPEG     1.4G  302 sec        86 sec
AVIF                         Newschool  Web      185M  191 sec        128 sec
HEIC                         Newschool  MPEG     285M  501 sec        78 sec
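Since raw sizes are a bit hard to eyeball, here’s some quick illustrative arithmetic on the numbers from the table, expressing each format’s corpus size relative to the JPG baseline:

```python
# Compressed corpus sizes from the table, in MB (1.7G taken as 1700 MB),
# relative to the 649 MB JPG baseline.
sizes_mb = {
    "JPG": 649,
    "JPEG 2000": 1700,
    "WEBP": 464,
    "JPEG XL (--lossless_jpeg=0)": 510,
    "JPEG XL (default)": 1400,
    "AVIF": 185,
    "HEIC": 285,
}
baseline = sizes_mb["JPG"]
ratios = {fmt: round(size / baseline, 2) for fmt, size in sizes_mb.items()}
# AVIF lands around 0.29x the baseline; JPEG 2000 and default JPEG XL
# actually make the corpus bigger than the JPGs it started from.
```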

So which lossy image format should you use in 2023? I mean… the only good option is “not HEIC”, because fuck those guys. idk what the hell is going on with the various JPEG format encoders, both of JPG’s successors appear mostly incapable of recompressing a JPG into something smaller than it started. But JPEG XL’s tools are still being actively developed so I’ll cut them a little slack for that one. So… just use AVIF. It seems pretty great.

Postscript: my web developer boyfriend listened to these conclusions and instantly said “yeah, AVIF is good if you don’t want anyone on mobile browsers to be able to see anything”. Fortunately, I don’t have to care about mobile browsers, so it’s fine, but apparently if you do care about them then WEBP is currently the way to go.