Not Hacker News! (Beta)
AI companion for Hacker News
Nov 23, 2025 at 1:05 AM EST

Google Revisits JPEG XL in Chromium After Earlier Removal

eln1
60 points
12 comments

Mood: informative
Sentiment: neutral
Category: tech_discussion
Key topics: JPEG XL, Chromium, Google, Image Compression, Browser Support

Discussion activity: active
First comment: 38m after posting
Peak period: 18 comments (Hour 8)
Avg per period: 4.6

Comment distribution: 82 data points (based on 82 loaded comments)

Key moments

  1. Story posted: Nov 23, 2025 at 1:05 AM EST (1d ago)
  2. First comment: Nov 23, 2025 at 1:43 AM EST (38m after posting)
  3. Peak activity: 18 comments in Hour 8 (the hottest window of the conversation)
  4. Latest activity: Nov 23, 2025 at 9:02 PM EST (5h ago)


Discussion (12 comments)
Showing 82 comments
mikae1
1d ago
2 replies
The final piece of the JPEG XL puzzle!
free_bip
1d ago
2 replies
It's a huge piece for sure, but not the only one. For example, Firefox and Windows both don't support it out of the box currently. Firefox requires nightly or an extension, and on Windows you need to download support from the Microsoft store.
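For readers wondering how sites cope with this patchwork today, here is a minimal feature-detection sketch (TypeScript, browser-side). The /test.jxl path and the data-jxl / data-jpeg attributes are placeholders, not anything from the thread.

```typescript
// Minimal sketch: detect JPEG XL support in the current browser at runtime.
// Assumes a tiny JXL test image is served at /test.jxl and that fallback
// JPEG/JXL URLs live in data-jpeg / data-jxl attributes (all hypothetical).
async function supportsJxl(testUrl: string = "/test.jxl"): Promise<boolean> {
  const probe = new Image();
  probe.src = testUrl;
  try {
    await probe.decode(); // rejects if the browser cannot decode the format
    return probe.naturalWidth > 0;
  } catch {
    return false;
  }
}

supportsJxl().then((ok) => {
  document.querySelectorAll<HTMLImageElement>("img[data-jxl]").forEach((img) => {
    img.src = ok ? img.dataset.jxl! : img.dataset.jpeg!;
  });
});
```

Declaratively, a `<picture>` element with a `<source type="image/jxl">` entry and a JPEG fallback achieves the same result without scripting.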
JyrkiAlakuijala
23h ago
1 reply
Would PDF 2.0 (which also depends on JPEG XL and Brotli) put pressure on Firefox and Windows to add easier-to-use support?
jchw
23h ago
2 replies
I don't think so: JPEG 2000, as far as I know, isn't generally supported for web use in web browsers, but it is supported in PDF.
fmajid
23h ago
2 replies
JPEG-XL is recommended as the preferred format for HDR content for PDFs, so it’s more likely to be encountered:

https://www.theregister.com/2025/11/10/another_chance_for_jp...

jchw
23h ago
1 reply
What I mean to say is, I believe browsers do support JPEG 2000 in PDF, just not on the web.
Zardoz84
20h ago
1 reply
The last time I checked, I found that I needed to convert to JPEG to show the image in browsers.
jchw
18h ago
A *PDF* with embedded JPEG 2000 data should, as far as I know, decode in modern browser PDF viewers. PDF.js and PDFium both use OpenJPEG. But despite that, browsers don't currently support JPEG 2000 in general.

I'm saying this to explain how JPEG XL support in PDF isn't a silver bullet. Browsers already support image formats in PDF that are not supported outside of PDF.
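As an illustration of that split, a minimal sketch of rendering a PDF with pdf.js (the pdfjs-dist package, the same engine Firefox ships): any JPEG 2000 image inside the file is decoded by pdf.js's own codec path rather than by the browser's image pipeline. The worker path, canvas id, and test file name are assumptions.

```typescript
// Minimal sketch: render page 1 of a PDF with pdf.js (pdfjs-dist). Embedded
// JPEG 2000 images are decoded by pdf.js itself (it bundles OpenJPEG), not by
// the browser's <img> pipeline. Assumes a <canvas id="page"> element.
import * as pdfjs from "pdfjs-dist";

pdfjs.GlobalWorkerOptions.workerSrc = "/pdf.worker.min.mjs"; // deployment-specific path

async function renderFirstPage(url: string): Promise<void> {
  const doc = await pdfjs.getDocument(url).promise;
  const page = await doc.getPage(1);
  const viewport = page.getViewport({ scale: 1.5 });

  const canvas = document.getElementById("page") as HTMLCanvasElement;
  canvas.width = viewport.width;
  canvas.height = viewport.height;

  await page.render({ canvasContext: canvas.getContext("2d")!, viewport }).promise;
}

renderFirstPage("/embedded-jp2-test.pdf"); // hypothetical test document
```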

bmicraft
17h ago
I'm not convinced HDR PDFs will be a common thing anytime soon, even without this chicken-and-egg problem of support.
RobotToaster
18h ago
2 replies
So Firefox (or others) can't open a PDF with an embedded JPEG 2000/XL image? Or does pdf.js somehow support it?
lxgr
18h ago
Seems like it: https://github.com/mozilla/pdf.js.openjpeg

This test renders correctly in Firefox, in any case: https://sources.debian.org/data/main/p/pdf2djvu/0.9.18.2-2/t...

jchw
12h ago
Apparently I really flubbed my wording for this comment. I'm saying they do support it inside of PDF, just not elsewhere in the web platform.
zinekeller
21h ago
> on Windows you need to download support from the Microsoft store.

To be really fair, on Windows:

- H.264 is the only guaranteed (modern-ish) video codec (HEVC, VP9, and AV1 are not built-in unless the device manufacturer bothered to add them)

- JPEG, GIF, and PNG are the only guaranteed (widely-used) image codecs (HEIF, AVIF, and JXL are also not built-in)

- MP3 and AAC are the only guaranteed (modern-ish) audio codecs (Opus is another module)

... and all of them were already widely used when Windows 7 was released (before the modern codecs), so modules are probably now the modern Windows Method™ for codecs.

Note on pre-8 HEVC support: the codec (when not on VLC or other software bundling its own codecs) often comes from that CyberLink Blu-ray player, not a built-in one.
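A quick way to see which of these decoders a given browser/OS combination actually exposes is `canPlayType`; a minimal sketch follows. The MIME/codec strings are common examples, not an exhaustive or authoritative list.

```typescript
// Minimal sketch: ask the current browser which of the codecs above it (and,
// indirectly, the OS decoders it can reach) will play.
const videoProbe = document.createElement("video");
const videoCodecs: Record<string, string> = {
  "H.264": 'video/mp4; codecs="avc1.42E01E"',
  HEVC: 'video/mp4; codecs="hvc1.1.6.L93.B0"',
  VP9: 'video/webm; codecs="vp09.00.10.08"',
  AV1: 'video/mp4; codecs="av01.0.04M.08"',
};

for (const [name, type] of Object.entries(videoCodecs)) {
  // canPlayType returns "", "maybe", or "probably"
  console.log(name, videoProbe.canPlayType(type) || "no");
}

const audioProbe = document.createElement("audio");
console.log("Opus", audioProbe.canPlayType('audio/ogg; codecs="opus"') || "no");
```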

viktorcode
18h ago
A large and important piece, but not the final one. If it remains a web-only codec, that is, with no Android or iOS support for taking photos in JPEG XL, then web media will still be dominated by JPEGs.
charcircuit
1d ago
1 reply
Here are the direct links:

blink-dev mailing list

https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKc...

Tracking Bug (reopened)

https://issues.chromium.org/issues/40168998

IshKebab
17h ago
1 reply
Yeah, note that Google only said they're now open to the possibility, as long as it is written in Rust (rightly so).

The patch at the end of that thread uses a C++ implementation, so it is a dead end.

surajrmal
16h ago
1 reply
Rick specifically mentioned a commitment to long-term maintenance and meeting the usual standards for shipping. The implementation was abandoned in favor of a new one using Rust, so it is not necessarily a dead end.
IshKebab
13h ago
I meant the C++ patch is a dead end; not JPEG XL support in general. Seems like there's a Rust library that will have to be used instead.
adzm
22h ago
1 reply
jxl-rs https://github.com/libjxl/jxl-rs was referenced as a possibility; what library is Safari using for JPEG XL?
JimDabell
20h ago
libjxl:

https://github.com/libjxl/libjxl

https://github.com/WebKit/WebKit/blob/7879cb55638ec765dc033d...

jiggawatts
22h ago
9 replies
2026 is nearly upon us, and Google, Microsoft, and Apple remain steadfast in their refusal to ever allow anyone to share wide-gamut or HDR images.

Every year, I go on a rant about how my camera can take HDR images natively, but the only way to share these with a wider audience is to convert them to a slideshow and make a Rec.2020 HDR movie that I upload to YouTube.

It's absolutely bonkers to me that we've all collectively figured out how to stream a Hollywood movie to a pocket device over radio with a quality exceeding that of a typical cinema theatre, but these multi-trillion market cap corporations have all utterly failed to allow users to reliably send a still image with the same quality to each other!

Any year now, maybe in the 2030s, someone will get around to a ticket that is currently at position 11,372 on the list, below thousands of internal bullshit tasks that nobody needed done, rearranging a dashboard nobody has ever opened, or whatever, and finally let computers be used for images. You know, utilising the screen, the only part billions of users ever look at, with their human eyes.

I can't politely express my disgust at the ineptitude, the sloth, the foot-dragging, the uncaring unprofessionalism of people who get paid more annually than I get in a decade, who are all too distracted making Clippy 2.0 instead of getting right the most utterly fundamental aspect of consumer computing.

If I could wave a magic wand, I would force a dev team from each of these companies to remain locked in a room until this was sorted out.

mirsadm
22h ago
2 replies
It is incredibly annoying that instead of adopting JPEG XL they decided to use Ultra HDR, a giant hack which works very poorly.
jiggawatts
22h ago
> A giant hack which works very poorly.

Indeed. I tried every possible export format from Adobe Lightroom including JPG + HDR gainmaps, and it looks... potato.

With a narrow gamut like sRGB it looks only slightly better than JPG, but with a wider gamut you get terrible posterization. People's faces turn grey and green, and blue skies get bands across them.

Meanwhile my iPhone creates 10-bit Dolby Vision video.

lxgr
17h ago
That's backwards compatibility for you.

I think Ultra HDR (and Apple's take on it, ISO 21496-1) make a lot of sense in a scenario where shipping alternate formats/codecs is not viable because renderer capabilities are not known or vary, similarly to how HDR was implemented on Blu-Ray 4K discs with the backwards-compatible Dolby Vision profiles.

It's also possible to do what Apple has done for HEIC on iOS: Store the modern format, convert to the best-known supported format at export/sharing time.

geocar
22h ago
1 reply
> the only way to share these with a wider audience is to convert them to a slideshow and make a Rec.2020 HDR movie that I upload to YouTube

i understand some of this frustration, but really you just have to use ffmpeg to convert it to a web format (which can be done by ffmpeg.js running in a service worker if your cpu is expensive) and spell <img as <video muted autoplay playsinline which is only a little annoying

> I can't politely express my disgust at the ineptitude, the sloth, the foot dragging, the uncaring unprofessionalism of people that get paid more annually then I get in a decade who are all too distracted making Clippy 2.0 instead of getting right the most utterly fundamental aspect of consumer computing.

hear hear

> If I could wave a magic wand, I would force a dev team from each of these companies to remain locked in a room until this was sorted out.

i can think of a few better uses for such a wand...
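A minimal sketch of the `<video muted autoplay playsinline>` substitution described above. The data-hdr-video attribute and selector are placeholders; the video URL is whatever your ffmpeg step produced (e.g. an HEVC/AV1 clip of the still).

```typescript
// Minimal sketch: replace an <img> with a muted, autoplaying, inline <video>
// so an HDR clip is displayed where a still image would normally go.
function imgToHdrVideo(img: HTMLImageElement, videoUrl: string): void {
  const video = document.createElement("video");
  video.src = videoUrl;
  video.muted = true;
  video.autoplay = true;
  video.loop = true;
  video.playsInline = true; // the playsinline attribute
  video.width = img.width;
  video.height = img.height;
  img.replaceWith(video);
}

document
  .querySelectorAll<HTMLImageElement>("img[data-hdr-video]")
  .forEach((img) => imgToHdrVideo(img, img.dataset.hdrVideo!));
```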

jiggawatts
21h ago
> <img as <video muted autoplay playsinline which is only a little annoying

Doesn't work for sharing images in text messages, social media posts, email, Teams, Wikipedia, etc...

n8cpdx
20h ago
1 reply
I’m wondering if HDR means something different to me, because I see HDR images all the time. I can share HDR images via phones (this seems to be the default behavior on iPhone/Mac messages), I can see HDR PNG stills on the web (https://github.com/swankjesse/hdr-emojis), I can see wide gamut P3 images on the web as well (https://webkit.org/blog-files/color-gamut/).

What am I missing?

jiggawatts
19h ago
1 reply
> I can share HDR images via phones

Sure, me too! I can take a HDR P3 gamut picture with my iPhone and share it with all my friends and relatives... that have iPhones.

What I cannot do is take a picture with a $4000 Nikon DSLR and share it in the same way... unless I also buy a Mac so I can encode it in the magic Apple-only format[1] that works... for Mac and iOS users. I have a Windows PC. Linux users are similarly out in the cold.

This situation is so incredibly bad that I can pop the SD card of my camera into a reader plugged into my iPhone, process the RAW image on the iPhone with the Lightroom iPhone app in full, glorious HDR... and then be unable to export the HDR image onto the same device for viewing because oh-my-fucking-god-why!?

[1] They claim it is a standards-compliant HEIF file. No, it isn't. That's a filthy lie. My camera produces a HDR HEIF file natively, in-body. Everything opens it just fine, except all Apple ecosystem devices. I suspect the only way to get Apple to budge is to sue them for false advertising. But... sigh... they'll just change their marketing to remove "HEIF" and move on.

gen2brain
19h ago
1 reply
Not that I disagree, but HEIF is a container format. What is inside that container is essential. HEIC in HEIF, AVIF in HEIF, etc.
jiggawatts
19h ago
Sure, but Apple doesn't fully support HEIC either.

They support only a very specific subset of it, in a particular combination.

Some Apple apps can open third-party HEIC-in-HEIF files, and even display the image correctly, but if you try anything more "complex", it'll start failing. Simply forwarding the image to someone else will result in thumbnails looking weirdly corrupted, brightness shifting, etc...

I've even seen outright crashes, hangs, visible memory corruption, etc...

I bet there's at least one exploitable security vulnerability in this code!

fingerlocks
20h ago
1 reply
What are you talking about? You extract 3 exposure values from the raw camera buffer and merge and tone map them manually into a single HDR image. The final exported image format may not have the full supported color space, but that’s on you. Apple uses the P3 space by default.

This has been supported by both Apple and third party apps for over a decade. I’ve implemented it myself.
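For readers unfamiliar with the merge-and-tone-map step being described here, a deliberately simplified sketch follows. It is not any vendor's actual pipeline: it assumes linear-light grayscale frames in [0, 1] with known exposure times, weights mid-tones most, and uses Reinhard tone mapping.

```typescript
// Simplified sketch of "merge exposures, then tone map" (not a real pipeline).
type Frame = { pixels: Float32Array; exposureSeconds: number };

// Hat-shaped weight: trust mid-tones, discount clipped or noisy pixels.
const weight = (v: number): number => 1 - Math.abs(2 * v - 1);

function mergeToRadiance(frames: Frame[]): Float32Array {
  const n = frames[0].pixels.length;
  const radiance = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    let num = 0;
    let den = 0;
    for (const { pixels, exposureSeconds } of frames) {
      const v = pixels[i];
      const w = weight(v);
      num += w * (v / exposureSeconds); // per-frame estimate of scene radiance
      den += w;
    }
    radiance[i] =
      den > 0 ? num / den : frames[0].pixels[i] / frames[0].exposureSeconds;
  }
  return radiance;
}

// Reinhard: L / (1 + L) compresses unbounded radiance for SDR display.
const toneMap = (radiance: Float32Array): Float32Array =>
  radiance.map((L) => L / (1 + L));
```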

jiggawatts
19h ago
2 replies
That's not HDR. That's pretend HDR in an SDR file, an artistic effect, nothing more.

Actual HDR needs at least 10 bits per channel and a modern display with peak brightness far in excess of traditional monitors. Ideally over 1,000 nits compared to typical LCD brightness of about 200.

You also don't need "three pictures". That was a hack used for the oldest digital cameras that had about 8 bits of precision in their analog to digital converters (ADC). Even my previous camera had a 14-bit ADC and in practice could capture about 12.5 bits of dynamic range, which is plenty for HDR imaging.

Lightroom can now edit and export images in "true" HDR, basically the same as a modern HDR10 or Dolby Vision movie.

The problem is that the only way to share the exported HDR images is to convert them to a movie file format, and share them as a slide show.

There is no widely compatible still image format that can preserve 10-bit-per-channel colours, wide-gamut, and HDR metadata.
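For context on the "10 bits and 1,000+ nits" point, here is a small sketch of the PQ transfer function (SMPTE ST 2084, used by HDR10 and Dolby Vision), which maps a 10-bit code value to absolute luminance up to 10,000 nits. The constants are the published PQ ones; the sample code values are arbitrary.

```typescript
// PQ (SMPTE ST 2084) EOTF constants.
const m1 = 2610 / 16384;
const m2 = (2523 / 4096) * 128;
const c1 = 3424 / 4096;
const c2 = (2413 / 4096) * 32;
const c3 = (2392 / 4096) * 32;

// PQ EOTF: normalized signal E' in [0, 1] -> luminance in cd/m² (nits).
function pqToNits(ePrime: number): number {
  const p = Math.pow(ePrime, 1 / m2);
  return 10000 * Math.pow(Math.max(p - c1, 0) / (c2 - c3 * p), 1 / m1);
}

// A 10-bit ramp keeps fine steps in the darks while still reaching highlights
// far beyond a ~200-nit SDR panel; 8 bits would band visibly over this range.
for (const code of [256, 512, 640, 768, 1023]) {
  console.log(code, pqToNits(code / 1023).toFixed(1), "nits");
}
```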

fingerlocks
16h ago
1 reply
I never mentioned a file format. These operations are performed on the raw buffer; there is no hack. There is no minimum bit depth for HDR (except for maybe 2); that's just silly. High dynamic range images just remap the physical light waves to match human perception, but collecting those waves can be done at any resolution or bit depth.

I wrote camera firmware. I've implemented HDR at both the firmware level and, later, at the higher client level when devices became faster. You're either overloading terminology to the point where we are just talking past each other, or you're very confused.

oktoberpaard
12h ago
2 replies
What you are talking about is also called HDR, but has nothing to do with what the other person is talking about. The other person is talking about the still-image equivalent of HDR video formats. When displayed on an HDR-capable monitor, it will map the brightest parts of the image to the extended headroom of the monitor instead of tone mapping them for display on a standard SDR monitor. So to be even more clear: it defines brightness levels beyond what is normally 100%.
fingerlocks
8h ago
[delayed]
fingerlocks
8h ago
[delayed]
alwillis
12h ago
1 reply
> Actual HDR needs at least 10 bits per channel and a modern display with peak brightness far in excess of traditional monitors. Ideally over 1,000 nits compared to typical LCD brightness of about 200.

In the Apple Silicon era, the MacBook Pro has a 1,000 nit display, with peak brightness at 1,600 nits when displaying HDR content.

Affinity Studio [1] also supports editing and exporting "true" HDR images.

[1]: https://www.affinity.studio

jiggawatts
12h ago
I have a 4K HDR OLED plugged into my Windows PC that works just fine for editing and viewing my photos.

I have no way, in general, to share those photos with you, not without knowing ahead of time what software you’re using. I’ll also have to whip up a web server with custom HTML and a bunch of hacks to encode my images that will work for you but not my friends with Android phones or Linux PCs.

lxgr
18h ago
1 reply
> 2026 is nearly upon us, and Google, Microsoft, and Apple remain steadfast in the refusal to ever allow anyone to share wide-gamut or HDR images.

Huh? Safari seems to render HDR JPEG XLs without any issues these days (e.g. [1]), and supports wide gamut in even more formats as far as I remember.

[1] https://jpegxl.info/resources/hdr-test-page.html

jiggawatts
10h ago
"Share" is the key word in my rant. I know spotty support exists here and there for one format or another.

The problem is that I can't, in general, widely share an HDR image and have it be correctly displayed via ordinary chat applications, social media, email, or what have you. If it works at all, it only works with that One Particular Format in One Specific Scenario.

If you disagree, find me something "trivial", such as a photo sharing site that supports HDR image uploads where those images are viewable as wide-gamut HDR on mobile devices, desktops, etc... without any endpoint ever displaying the image incorrectly, such as very dark, very bright, or with shifted colors.

swed420
18h ago
1 reply
> It's absolutely bonkers to me that we've all collectively figured out how to stream a Hollywood movie to a pocket device over radio with a quality exceeding that of a typical cinema theatre, but these multi-trillion market cap corporations have all utterly failed to allow users to reliably send a still image with the same quality to each other!

You act like this is some kind of mistake or limit of technology, but really it's an obvious intentional business decision.

Under late stage capitalism, it'd be weird if this wasn't the case in 2026.

Address the underlying issue, or don't be surprised by the race to the bottom.

lxgr
17h ago
This theory utterly fails Hanlon's razor (or whatever the organizational/societal equivalent is).

On one hand, there have been (and still are!) several competing HDR formats for videos (HDR+, Dolby Vision, "plain" HLG, Dolby Vision in HLG, etc.), and it took years for a winner to pull ahead – that race just started earlier, and the set of stakeholders is different (and arguably a bit smaller) than that for still images.

On the other hand, there are also several still-image HDR formats competing with each other right now (JPEG with gain map metadata, i.e. Ultra HDR and ISO 21496-1, Apple's older custom metadata, HEIF, AVIF, JPEG XL...), and JPEG XL isn't the clear winner yet.

Format wars are messy, and always have been. Yes, to some extent they are downstream of the lack of a central standardization body, but there's no anti-HDR cabal anywhere. If anything, it's the opposite – new AV formats requiring new hardware is just about the best thing that can happen to device manufacturers.

zozbot234
19h ago
Just use PNG: https://www.w3.org/TR/png-3/ (for HDR content, see the cICP, mDCV and cLLI chunks; also note that PNG supports up to 16-bit channel depth out of the box).
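A minimal sketch of checking a PNG for exactly those pieces: the bit depth from IHDR and the presence of the cICP/mDCV/cLLI chunks. It relies only on the public PNG container layout (8-byte signature, then length/type/data/CRC chunks); the input path is a placeholder.

```typescript
// Minimal sketch (Node): report a PNG's bit depth and any HDR-related chunks.
import { readFileSync } from "node:fs";

function scanPng(path: string): void {
  const buf = readFileSync(path);
  let offset = 8; // skip the PNG signature
  const hdrChunks: string[] = [];

  while (offset + 8 <= buf.length) {
    const length = buf.readUInt32BE(offset);
    const type = buf.toString("latin1", offset + 4, offset + 8);
    if (type === "IHDR") {
      // IHDR data: width (4 bytes), height (4 bytes), bit depth (1 byte), ...
      console.log("bit depth per channel:", buf[offset + 8 + 8]);
    }
    if (type === "cICP" || type === "mDCV" || type === "cLLI") {
      hdrChunks.push(type);
    }
    if (type === "IEND") break;
    offset += 8 + length + 4; // chunk header + data + CRC
  }
  console.log("HDR-related chunks:", hdrChunks.length ? hdrChunks.join(", ") : "none");
}

scanPng(process.argv[2] ?? "test.png"); // placeholder input path
```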
charcircuit
11h ago
The web has supported 16-bit PNGs for decades.
baq
19h ago
I wish I could upvote this multiple times. Spot on, the situation is completely batshit bonkers insane.
AshleysBrain
22h ago
3 replies
I think the article is slightly misleading: it says "Google has resumed work on JPEG XL", but I don't think they have - their announcement only says they "would welcome contributions" to implement JPEG XL support. In other words, Google won't do it themselves, but their new position is they're now willing to allow someone else to do the work.
jmgao
20h ago
Describing it as 'Google' is misleading, because different arms of the company might as well be completely different companies. The Chrome org seems to have had the same stance as Firefox with regard to JPEG XL: "we don't want to add 100,000 lines of multithreaded C++ because it's a giant gaping security risk", and the JPEG XL team (in a completely separate org) is addressing those concerns by implementing a Rust version. I'd guess that needing the "commitment to long-term maintenance" is Chrome fighting with Google Research or whatever about long-term headcount allocation towards support: Chrome doesn't want the JPEG XL team to launch and abandon JPEG XL in Chrome, leaving Chrome engineers to deal with the fallout.
jonsneyers
21h ago
It's technically correct. Googlers (at Google Research Zurich) have been working on jxl-rs, a Rust implementation of JPEG XL. Google Research has been involved in JPEG XL from the beginning, both in the design of the codec and in the implementation of libjxl and now jxl-rs.

But until now, the position of other Googlers (in the Chrome team) was that they didn't want to have JPEG XL support in Chrome. And that changed now. Which is a big deal.

IshKebab
17h ago
Yes, and they will also only accept it if the library is written in Rust. The patch to add support that is in the thread and referenced in the article uses libjxl, which is C++ and therefore cannot be used.
lousken
19h ago
1 reply
It is absolutely insane that Google has not implemented this yet. They implement all sorts of unimportant stuff but not the most critical image format of this decade. What a joke.
theandrewbailey
19h ago
1 reply
And the things they do implement, they kill 8 or so years later.

https://killedbygoogle.com/

lxgr
18h ago
If all goes well (which is anything but guaranteed), JPEG XL will take off sufficiently to make any future deprecation as unthinkable as e.g. deprecating GIF rendering support.
ksec
19h ago
3 replies
While being a big supporter of JPEG-XL on HN, I just want to note that AV2 is coming out soon, which should further improve image compression. (Edit: Also worth pointing out that the current JPEG-XL encoder is nowhere near its maximum potential in terms of quality / compression ratio.)

But JPEG-XL is being used quite widely now, from PDF, medical images, and camera lossless, to being evaluated at different stages of cinema / artist production workflows. Hopefully the Rust decoder will be ready soon.

And from the wording, it seems to imply Google Chrome will officially support anything from AOM.

snvzz
18h ago
3 replies
>medical images

Isn't JPEG-XL a lossy codec?

MagnumOpus
18h ago
1 reply
It has both lossy and lossless modes.
snvzz
18h ago
1 reply
Good to hear.

I sure hope they came up with a good, clear system to distinguish them.

lxgr
18h ago
1 reply
As in, a clear way to detect whether a given file is lossy or lossless?

I was thinking that too, but on the other hand, even a lossless file can't guarantee that its contents aren't the result of going through a lossy intermediate format, such as a screenshot created from a JPEG.

snvzz
17h ago
2 replies
I meant like a filename convention, and tags in the file itself.
stavros
15h ago
1 reply
Presumably you can look at the file and tell which mode is used, though why would you care to know from the filename?
crazygringo
14h ago
4 replies
I find it incredibly helpful to know that .jpg is lossy and .png is lossless.

There are so many reasons why it's almost hard to know where to begin. But it's basically the same reason why it's helpful for some documents to end in .docx and others to end in .xlsx. It tells you what kind of data is inside.

And at least for me, for standard 24-bit RGB images, the distinction between lossy and lossless is much more important than between TIFF and PNG, or between JPG and HEIC. Knowing whether an image is degraded or not is the #1 important fact about an image for me, before anything else. It says so much about what the file is for and not for -- how I should or shouldn't edit it, what kind of format and compression level is suitable for saving after editing, etc.

After that comes whether it's animated or not, which is why .apng is so helpful to distinguish it from .png.

There's a good reason Microsoft Office documents aren't all just something like .msox, with an internal tag indicating whether they're a text document or a spreadsheet or a presentation. File extensions carry semantic meaning around the type of data they contain, and it's good practice to choose extensions that communicate the most important conceptual distinctions.

stavros
14h ago
1 reply
But JPEG has a lossless mode as well. How do you distinguish between the two now?

This is an arbitrary distinction. For example, why do MP3 and Ogg (Vorbis) have different extensions? They're both lossy audio formats, so by that requirement the extension should be the same.

Otherwise, we should distinguish between bitrates with different extensions, e.g. mp3128, mp3192, etc.

crazygringo
14h ago
In theory JPEG has a lossless mode (in the standard), but it's not supported by most applications (not even libjpeg) so it might as well not exist. I've certainly never come across a lossless JPEG file in the wild.

Filenames also of course try to indicate technical compatibility as to what applications can open them, which is why .mp3 and .ogg are different -- although these days, extensions like .mkv and .mp4 tell you nothing about what's in them, or whether your video player can play a specific file.

At the end of the day it's just trying to achieve a good balance. Obviously including the specific bitrate in a file extension goes too far.

lxgr
13h ago
1 reply
> Knowing whether an image is degraded or not is the #1 important fact about an image for me

But how can you know that from the fact that it's currently losslessly encoded? People take screenshots of JPEGs all the time.

> After that comes whether it's animated or not, which is why .apng is so helpful to distinguish it from .png.

That is a useful distinction in my view, and there's some precedent for solutions, such as how Office files containing macros have an "m" added to their file extension.

crazygringo
13h ago
Obviously nothing prevents people from taking PNG screenshots of JPEGs. You can make a PNG out of an out-of-focus camera image too. But at least I know the format itself isn't adding any additional degradation over whatever the source was.

And in my case I'm usually dealing with a known workflow. I know where the files originally come from, whether .raw or .ai or whatever. It's very useful to know that every .jpg file is meant for final distribution, whereas every .png file is part of an intermediate workflow where I know quality won't be lost. When they all have the same extension, it's easy to get confused about which stage a certain file belongs to, and accidentally mix up assets.

nothrabannosir
13h ago
Legacy. It’s how things used to be done. Just like Unix permissions, shared filesystems, drive letters in the file system root, prefixing URLs with the protocol, including security designators in the protocol name…

Be careful about ascribing reason to established common practices; it can lead to tunnel vision. Computing is filled with standards which are nothing more than “whatever the first guy came up with”.

https://en.wikipedia.org/wiki/Appeal_to_tradition

Just because metadata is useful doesn’t mean it needs to live in the filename.

ksec
14h ago
>I find it incredibly helpful to know that .jpg is lossy and .png is lossless.

Unfortunately we have been through this discussion, and the authors of JPEG-XL strongly disagree with this. I understand where they are coming from, but I agree with you that it would have been easier to have the two separated in naming and extensions.

149765
16h ago
There is some sort of tag; jxlinfo can tell you if a file is "lossy" or "(possibly) lossless".
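A minimal sketch of automating that check by shelling out to jxlinfo (which ships with libjxl). The "lossy" / "(possibly) lossless" strings are taken from the comment above rather than from the tool's documentation, so treat the string matching as an assumption.

```typescript
// Minimal sketch (Node): run jxlinfo and grep its output for lossy/lossless.
import { execFile } from "node:child_process";

function checkJxl(path: string): void {
  execFile("jxlinfo", [path], (err, stdout) => {
    if (err) {
      console.error("jxlinfo failed:", err.message);
      return;
    }
    if (stdout.includes("lossless")) {
      console.log(`${path}: (possibly) lossless`);
    } else if (stdout.includes("lossy")) {
      console.log(`${path}: lossy`);
    } else {
      console.log(`${path}: could not determine from jxlinfo output`);
    }
  });
}

checkJxl(process.argv[2] ?? "image.jxl"); // placeholder input path
```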
ksec
18h ago
1 reply
JPEG-XL is both a lossy and a lossless codec. It is already being used in the camera DNG format, making RAW images smaller.

While lossy codecs are hard to compare and up for debate, JPEG-XL is actually better as a lossless codec in terms of compression ratio and compression complexity. There is only one other codec that beats it, but it is not open source.

cipehr
15h ago
2 replies
What is the non-open source codec?
ksec
14h ago
HALIC (High Availability Lossless Image Compression)

https://news.ycombinator.com/item?id=38990568

Nanopolygon
12h ago
HALIC is by far the best lossless codec in terms of speed/compression ratio. If a lossy mode were similarly available, we might not be discussing all these issues. I think he stopped developing HALIC a long time ago due to lack of interest.

Its developer is also developing HALAC (High Availability Lossless Audio Compression). He recently released the source code for the first version of HALAC. And I don't think anyone cared.

andybak
16h ago
1 reply
Surely something close to perceptually lossless is sufficient for most use cases?
AlotOfReading
14h ago
Think of all the use cases where the output is going to be ingested by another machine. You don't know that "perceptually lossless" as designed for normal human eyeballs on normal screens in normal lighting environments is going to contain all the information an ML system will use. You want to preserve data as long as possible, until you make an active choice to throw it away. Even the system designer may not know whether it's appropriate to throw that information away, for example if they're designing digital archival systems and having to consider future users who aren't available to provide requirements.
Nanopolygon
12h ago
1 reply
AVIF/AV1 is a codec that encodes both lossy and lossless files very slowly. JXL is significantly faster than AVIF. But AVIF provides better image quality than JXL even at lower settings. However, AV2 will require much more power and system resources for a small bandwidth gain.
spartanatreyu
7h ago
1 reply
> But AVIF provides better image quality than JXL even at lower settings.

I don't think that's strictly true.

The conventional reporting has been that JXL works better at regular web sizes, but AVIF starts to edge out at very low quality settings.

However, the quality per size between the two is so close that there are comparisons showing JXL winning even where AVIF is supposed to outperform JXL. (e.g. https://tonisagrista.com/blog/2023/jpegxl-vs-avif/)

Even at the point where AVIF should shine, when low bandwidth is important, JXL supports progressive decoding (AVIF is still trying to add this), so the user will see images sooner with JXL than with AVIF.

---

There is one part where AVIF does beat JXL hands down, and that's animation (which makes sense considering AVIF comes from the modern AV1 video codec). However, any time you would want an animation in a file, you're better off just using a video codec anyway.

ksec
5h ago
To be fair, those comparison image sizes aren't small enough. Had they been 30 - 50% of the tested sizes, AVIF should have the advantage.

But then the question is whether we should even be presenting this level of quality, or whether it is enough. I guess that is a different set of questions.

eviks
15h ago
> AV2 .... further improve the image compression. ( Edit: Also worth pointing out current JPEG-XL encoder is no where near its maximum potential in terms of quality / compression ratio

But at what cost? From the links below, the encoding/decoding time is much higher for those advanced video codecs, so they wouldn't be very suitable for various lower-powered devices?

Also, can we expect "near max potential" with AV2 in the near future, or is it an ever-unachievable goal that shouldn't stop us from adding "non-max" codecs?

https://res.cloudinary.com/cloudinary-marketing/image/upload...

https://cloudinary.com/blog/time_for_next_gen_codecs_to_deth...

particlo
13h ago
If you want to compare JXL vs AVIF by taking photos yourself and have an Android phone, try this APK: https://github.com/particlo/camataca. I thought JXL was better based on its website benchmarks, but after trying it myself I find JXL generates ugly blocky artifacts.
_ache_
14h ago
It's a small step, but a step forward. JXL is on par with AVIF and WebP2 most of the time, but is very much better for sharing photography.

There is no reason to block its adoption.

eviks
17h ago
Maybe they'll do it right this time

> The team explained that other platforms moved ahead. Safari supports JPEG XL, and Windows 11 users can add native support through an image extension from Microsoft Store. The format is also confirmed for use in PDF documents.

Glad those folks didn't listen to "the format is dead since the biggest browser doesn't support it" (and shame on Firefox for not doing the same).

caminanteblanco
23h ago
My introduction to JPEG-XL was via 2kliksphillip on YouTube; he has a few really good analyses of this topic, including this video: https://youtu.be/FlWjf8asI4Y
cgfjtynzdrfht
16h ago
How quickly things turn. Hard not to support it given Chrome wants to support PDF natively.
View full discussion on Hacker News
ID: 46021179 · Type: story · Last synced: 11/23/2025, 9:45:14 AM


© 2025 Not Hacker News! — independent Hacker News companion.

Not affiliated with Hacker News or Y Combinator. We simply enrich the public API with analytics.