Rio Terminal: a Hardware-Accelerated GPU Terminal Emulator
Posted 3 months ago · Active 3 months ago
rioterm.com · Tech · story
Key topics
Terminal Emulators
GPU Acceleration
Rust
The Rio Terminal is a new hardware-accelerated GPU terminal emulator written in Rust, sparking discussion about its features, use cases, and comparisons to other terminal emulators.
Snapshot generated from the HN discussion
Discussion Activity
- First comment: 52m after posting
- Peak period: 16 comments in the 6-9h window
- Avg per period: 5.2 comments
- Comment distribution: 57 data points (based on 57 loaded comments)
Key moments
1. Story posted: Sep 30, 2025 at 8:29 PM EDT (3 months ago)
2. First comment: Sep 30, 2025 at 9:21 PM EDT (52m after posting)
3. Peak activity: 16 comments in the 6-9h window (hottest window of the conversation)
4. Latest activity: Oct 2, 2025 at 5:00 PM EDT (3 months ago)
ID: 45432977 · Type: story · Last synced: 11/20/2025, 4:11:17 PM
What I do care about is my bitmap font, which all these new terminals don't seem to like supporting.
Is this fixable by using a GPU? Probably not, but I'm willing to give it a try!
- Terminal search and focus (you can list kitty tabs and windows and get the window content from the socket; implementing a BM25-based search is quite easy)
- Giving the current terminal content to an AI, so I can run `ls`, then write "Rename the files (in some way)" and push the whole thing to an LLM that replaces the command line, without me having to write out the full context
I even have a Codex session finder that uses Codex session files to list and select the session I want, then uses the kitty socket to find and focus the window that matches the session content.
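The BM25 idea above really is only a few lines. This is a minimal sketch, assuming the window texts have already been pulled from the kitty socket via its remote-control interface; the sample window contents below are made up:

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Classic BM25: score every document against the query terms."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(t) for t in tokenized) / len(tokenized)
    n = len(docs)
    df = Counter()                       # document frequency per term
    for toks in tokenized:
        df.update(set(toks))
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        s = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            s += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(toks) / avgdl))
        scores.append(s)
    return scores

# Hypothetical window contents, as if fetched from the kitty socket.
windows = [
    "vim main.rs cargo build",
    "git log --oneline feature",
    "cargo test terminal renderer",
]
scores = bm25_scores("cargo test", windows)
best = scores.index(max(scores))   # index of the window to focus
```

In a real setup you would then focus the best-matching window over the same socket.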
Mutt, a personal wiki (fzf searching and neovim editing), and a todo list (one long, never-ending markdown file) each have a dedicated shortcut (cmd+m / cmd+z / cmd+d) to open or switch to their window. These applications always reside in the first tab with a stack layout. For example, I can press cmd+m to switch to mutt (or open it), and press cmd+m again to switch back to the previously focused window.
Depending on which REPL is running, I can usually open vim to edit the current line with the same cmd+e shortcut, which sends C-X C-E in bash/zsh, ESC O in iex, C-O in aichat, and so on. Vertical splits in tmux and kitty tabs also share the same shortcut, cmd+|.
Kitty does not have a command palette, so I use fzf to search my frequently used operations and make that my command palette.
https://beuke.org/terminal-latency/
Without GPU acceleration you just don't have enough FPS. I haven't tested Rio Terminal, but I have tested plenty of other terminals on Windows, and xterm.js was the winner by far due to GPU acceleration.
That said, all pixels are rendered via the GPU anyway. So using a GPU API seems like the most appropriate choice, and if it means the terminal could run at 100500 FPS, why not?
I say this as someone who's written my own terminal emulator too. SDL was a natural choice, not because of GPU acceleration, but because it gave me the APIs I needed for rendering a terminal.
The other ironic thing about all these terminal emulators that advertise themselves as being "GPU accelerated" is that text rendering cannot really be GPU accelerated. Or at least the only way I know of to render text on the GPU is to first rasterize it on the CPU, then push that bitmap to GPU memory. This leads developers to create a "font atlas", which is basically a bitmap with all your characters already cached in GPU memory. So you either need to cache the entirety of Unicode, including alternative glyphs due to variation selectors, different font characteristics (bold, italic, etc.), and even different font colours (if you want nicely anti-aliased fonts ;) ), or you need to software-render a bunch of stuff on demand anyway, because otherwise you're caching every imaginable character and combination on start-up.
So yeah, I don't think "GPU accelerated" is the same flex it used to be.
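The atlas-plus-fallback scheme described above boils down to a cache keyed by glyph and style, rasterizing on the CPU only on a miss. A toy sketch, assuming nothing about any particular terminal's internals (the rasterizer here is a stand-in; a real one would produce a bitmap and upload it to GPU memory):

```python
class GlyphAtlas:
    """On-demand glyph cache: rasterize on the CPU only on a cache miss,
    then keep the result around (in a real terminal, in GPU memory)."""

    def __init__(self, rasterize):
        self._rasterize = rasterize   # CPU-side rasterizer (stand-in)
        self._cache = {}              # (char, style) -> "bitmap"
        self.misses = 0               # each miss = one CPU rasterization

    def get(self, char, style="regular"):
        key = (char, style)
        if key not in self._cache:
            self.misses += 1          # a real atlas would upload to the GPU here
            self._cache[key] = self._rasterize(char, style)
        return self._cache[key]

# A fake rasterizer that returns a placeholder "bitmap" string.
atlas = GlyphAtlas(lambda ch, style: f"<bitmap {ch}/{style}>")
for ch in "hello":   # 'l' repeats, so only 4 distinct rasterizations
    atlas.get(ch)
```

Note how bold/italic variants are separate cache keys, which is exactly why caching "everything" up front explodes combinatorially.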
It could plausibly save some battery on mobile at high resolutions, but for a while Windows Terminal was issuing a separate draw call for every character, so it didn't really help there; I think that has since been addressed.
24-bit color is even more ubiquitous than GPU acceleration. Apparently even the built-in macOS Terminal will support 24-bit color in Tahoe, and I think Windows Terminal has supported it since release (more than 5 years ago). Even image support is kinda old news: many terminals support at the very least Sixel or the iTerm image protocol. Ligatures and splits are also quite common.
It would be more interesting if we started comparing terminals by the details of these features, since the devil really is in the details. For instance, not all image protocols are made equal. Sixels are very slow, while the iTerm protocol is quite limited: you have very little control over where the image is placed. The Kitty Graphics protocol is the most advanced, with several different image placement methods: direct positioning, Unicode placeholders, and placement relative to other images. Besides that, there are a couple of other features such as animation and communicating back with the terminal to get image IDs. I've seen several terminals claiming to have Kitty Graphics protocol support, but I've never seen any of them publish a matrix of which features they support (other than Kitty, which obviously supports everything).
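For concreteness, here is roughly what a single-chunk transmission over the Kitty Graphics protocol looks like: the key `a=T` transmits and displays the image, `f=100` marks the payload as PNG, and the data is base64-encoded. This sketch handles only the simplest case; real payloads over 4 KiB must be split into chunks using the `m` key, which is omitted here.

```python
import base64

def kitty_transmit_png(png_bytes: bytes) -> str:
    """Build a single-chunk Kitty Graphics escape sequence:
    ESC _ G <keys> ; <base64 payload> ESC \\
    with a=T (transmit and display) and f=100 (PNG-encoded data)."""
    payload = base64.standard_b64encode(png_bytes).decode("ascii")
    return f"\x1b_Ga=T,f=100;{payload}\x1b\\"

# Truncated stand-in payload; a real call would pass whole PNG file bytes.
seq = kitty_transmit_png(b"\x89PNG...")
```

Writing `seq` to a supporting terminal's tty is what actually displays the image; the terminal can answer back with image IDs, which is one of the features the comparison matrix above would capture.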
The Kitty Keyboard Protocol is another thing that is quite complex. I've run into issues in the past where both iTerm and WezTerm behaved differently from Kitty, which broke some programs that expected the Kitty behavior.
As I noted in https://news.ycombinator.com/item?id=45433715 , this is likely down to following Kitty's lead and agreeing with Kovid Goyal's rationale.
The world of terminal emulators is diverging, and has been for some time. There are the terminal emulators designed to emulate prior real terminals, with one codepoint per cell, support for original bitmap fonts, support for conventional TUI effects with MouseText/block/line-drawing characters, and so forth. And there are, conversely, the terminal emulators heading towards having an entire WWW browser/word-processor-like Document Object Model, with embedded images, reflowed proportionally-spaced document lines instead of a strict cell matrix, arbitrary font scaling, hyperlinks, and so forth.
https://plan9.io/wiki/plan9/using_rio/index.html
Is this new terminal bringing anything I'd be missing on? Or is it a case of "made in Rust" vs. "made in Zig"?
Coming from iTerm2 I was a bit disappointed at first. Now, I hope this feature stays because it means I can save logs easily, version them, compare them, etc. It's actually fantastic.
It’s not so much the language as it is the people behind it. Hashimoto is just a machine and appears to know how to recruit well and when to delegate well. Potent combo.
DarkTile (https://github.com/liamg/darktile) accepts TTF and OTF, so presumably if you can get your bitmap font of choice converted into those formats you are alright.
GhosTTY (https://github.com/ghostty-org/ghostty) gained proper bitmap font support just last month, the issues list says, but is reliant upon what FreeType and CoreText can handle. It has an open issue discussing those vis-a-vis bitmap fonts.
Trap2 (https://github.com/tale/trap2) likewise relies upon FreeType.
Contour (https://github.com/contour-terminal/contour) at least appears to have bitmap font support. Alacritty, WezTerm, and Kitty get a lot of mentions in the issues list.
...or is this another Ghostty geeksfest that wants me to C-S-whatever to dump history into a file then pipe it into a pager etc etc etc?
[+EDIT: ...apparently not liking Ghostty gets you downvoted too? Weird amounts of fanboyism and hype for an underfeatured and poorly documented terminal]
So your CPU can focus on doing useful computation instead of software-rendering the terminal.
On a 640x480 screen, a terminal will probably render fine with low CPU overhead. On a HiDPI screen, the CPU may have millions of pixels to render for a single terminal window.
It's not that different from other application programs IMO. DAWs and VSTs often offload rendering to GPUs to save as many CPU cycles as possible to get the latency down. I love having low latency in my terminals, so I prefer GPU accelerated emulators.