Modern Linux Tools
Source: ikrima.dev · Posted 3 months ago · Key topics: Linux Tools, Command-Line Utilities, Productivity
A list of modern Linux tools sparks debate among HN users about their usefulness, maintainability, and compatibility with traditional tools.
Snapshot generated from the HN discussion
Key moments
- Story posted: Oct 13, 2025 at 5:44 AM EDT
- First comment: Oct 13, 2025 at 7:19 AM EDT (2h after posting)
- Peak activity: 103 comments in the first 6 hours, the hottest window of the conversation
- Latest activity: Oct 17, 2025 at 5:24 AM EDT
> exa: modern replacement for ls/tree, not maintained
"not maintained" doesn't smell "modern" to me...
eza: https://github.com/eza-community/eza
Yeeeah, nope.
Damned if you do and damned if you don't.
> cat clone with syntax highlighting and git integration
This doesn't make much sense, because cat is not really meant for viewing files. You should be comparing your tool with the more/less/most family of tools, some of which can already do syntax highlighting or even more complex transforms.
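As an illustration of that last point, less can be taught syntax highlighting through an input preprocessor. A minimal sketch, assuming GNU source-highlight is installed (the helper script's path varies by distribution):

    # tell less to filter files through source-highlight before display
    export LESSOPEN="| /usr/share/source-highlight/src-hilite-lesspipe.sh %s"
    export LESS=' -R '   # let less pass the resulting ANSI color codes through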
Actual LOL. Indeed. I was working for a large corporation at one point and a development team was explaining their product. I asked what its differentiators were versus our competitors. The team replied that ours was written in Go. #facepalm
Sure. You can drive really slowly in a sports car. But if you're looking at travel options for a long-distance journey, are you going to pick the sports car or the bicycle?
Also, I have yet to actually find slow, unmaintainable, brittle garbage written in Go or Rust. I'm sure it's possible, but it's vastly less likely.
Citation needed. /s
This is called "the fallacy of grey". Very common.
https://www.lesswrong.com/posts/dLJv2CoRCgeC2mPgj/the-fallac...
An example from my personal experience: I used to think that oxipng was just a faster optipng. I took a closer look recently and saw that it is more than that.
See: https://op111.net/posts/2025/09/png-compression-oxipng-optip...
That's the problem. How good are they? Who can tell? The basic UNIX tools didn't come about in "one day" like most of these "rust tools".
Those "foundational GNU tools" just suck, sure, people are familiar with them and they are everywhere, but they just plain suck.
For many common operations you'd want to do by default with grep/find and so on, you have to type mountains of random gibberish to get it done. And that random gibberish isn't something that rolls off the tongue either, so at minimum you'd end up defining a truckload of aliases.
OR you can use a tool (or tools) with marginally "sane defaults" and a marginally sane UX out of the box.
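As a concrete illustration of the difference, here is one plausible pairing (the pattern and excluded directory are placeholders; the exact grep flags depend on what you want filtered out):

    # classic grep: recurse, skip binaries, skip VCS metadata, show line numbers
    grep -rIn --exclude-dir=.git 'pattern' .

    # ripgrep: roughly that behaviour is the default
    rg 'pattern'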
It really isn't that complicated. This has nothing to do with "rust".
Learn the classic tools, learn them well, and your life will be much easier.
Many folks nowadays don't get how lucky they are, not having to do UNIX development on a time-sharing system, although cloud systems kind of replicate the experience.
ed (pronounced as distinct letters, /ˌiːˈdiː/)[1] is a line editor for Unix and Unix-like operating systems. It was one of the first parts of the Unix operating system that was developed, in August 1969.[2] It remains part of the POSIX and Open Group standards for Unix-based operating systems.
so it is a bug in those distros.
Agreed, but that doesn't stop you from using/learning alternatives. Just use your preferred option, based on what's available. I realise this could be too much to apply to something like a programming language (despite this, many of us know more than one) or a graphics application, but for something like a pager, it should be trivial to switch back and forth.
Awk and sed.
I like the idea of new tools though. But knowing the building blocks is useful. The “Unix Power Tools” book was useful to get me up to speed; there are so many of these useful mini tools.
Miller is one I’ve made use of (it also was available for my distro)
Not everybody is a sysadmin manually logging into lots of independent, heterogeneous servers throughout the day.
[1] https://github.com/charmbracelet/
The infra may be cattle but debugging via anal probe err SSH is still the norm.
Same goes for a bunch of other tools that have "modern" alternatives but the "classic" ones are already installed/available on most default distribution setups.
apt-get/pacman/dnf/brew install <everything that you need>
You'll need to install those and other tools (your favorite browser, your favorite text editor, etc.) anyway if you're changing your OS.
> or SSH anywhere
When you connect through SSH you don't have a GUI, and that's not a reason to avoid using GUI tools elsewhere, for example.
> even use a mix of these on my personal computer and the traditional ones elsewhere
I can't see the problem, really. I use some of those tools and they are convenient, but it's not as if I can't work without them. For example, bat: it doesn't replace cat, it only outputs data with syntax highlighting. It makes my life easier, but if I don't have it, that's ok.
The point is that sometimes you're SSHing to a lightweight headless server or something and you can't (or can't easily) install software.
I personally have an Ansible playbook that sets up roughly all my commonly used tooling on any CLI environment I'll use significantly; (almost) all local installs, to avoid needing root. It runs in about a minute, and I have all the niceties. If it's not worth spending that minute to run, then I won't be on the machine long enough for it to matter.
^^ Yep. Totally this. I've become entirely too accustomed to all the little niceties of a well-crafted toolchain that covers all my needs at any given moment. It was worth the time invested to automate installing and configuring all the fancy newfangled stuff I've built up muscle-memory for. :)
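For reference, applying a playbook like the one described above to the current machine is roughly a one-liner (the playbook file name is a placeholder; assumes Ansible itself is already available):

    # run a personal playbook against localhost, no root needed for user-level installs
    ansible-playbook -i localhost, -c local setup-tools.yml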
One major difference can emerge from the fact that using a tool regularly inevitably builds muscle memory.
You’re accustomed to a replacement command-line tool? Then your muscle memory will punish you hard when you’re logged into an SSH session on another machine because you’re going to try running your replacement tool eventually.
You’re used to a GUI tool? Will likely bite you much less in that scenario.
Yes.
> Then your muscle memory will punish you hard
No.
I'm also used to pt-br keyboards; it's easier to type in my native language, but it's ok if I need to use US keyboards. In terms of muscle memory, keyboards are far harder to adapt to.
A non-tech example: if I go to a Japanese restaurant, I'll use chopsticks and I'm ok with them. At home, I use forks and knives because they make my life easier. I won't force myself to use chopsticks everyday only for being prepared for Japanese restaurants.
If only it were so simple. Not every tool comes from a package with the same name, (delta is git-delta, "z" is zoxide, which I'm not sure I'd remember off the top of my head when installing on a new system). On top of that, you might not like the defaults of every tool, so you'll have config files that you need to copy over or recreate (and hopefully sync between the computers where you use these tools).
That said, I do think Nix provides some good solutions for this. It gives you a nice clean way to list the packages you want in a nixfile and also to set their defaults and/or provide some configuration files. It does still require some maintenance (and I choose to install the config files as editable, which is not very nix-y, but I'd rather edit them and then commit the changes to my configs repo for future deploys than have to edit and redeploy for every minor or exploratory change), but I've found it's much better than trying to maintain some sort of `apt-get install [packages]` script.
But yes, in the eventual case that I don't have Nix I can very much use the classic tools. It is not a binary choice, you can have both.
I can just drop it into the environment and pull in tools that I need using nix.
For example, in $CURRENT_JOB we have a bastion host that gives access to the databases (not going to discuss whether this is a good idea or not; it's how my company does it). 90% of the time I can do whatever I need with just what the bastion host offers (and it doesn't have Nix); if I need to do further analysis I can copy some files from the bastion host to my computer.
IMO, it's worth spending some time to clean up your setup for smooth transition to new machines in the future.
I never asked for such behaviour, and I have no time for pretty "modern" opinions in base software.
Often, when I read "modern", I read "immature".
I am not ready to replace my stable base utilities for some immature ones having behaviour changes.
The scripts I wrote 5 years ago must work as is.
Note: if you want to make ripgrep not do .gitignore filtering, set `RIPGREP_CONFIG_PATH` to point to a config file that contains `-uu`.
Sources:
- https://github.com/BurntSushi/ripgrep/blob/master/GUIDE.md#c...
- https://github.com/BurntSushi/ripgrep/blob/master/GUIDE.md#a...
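A minimal sketch of that setup (the file name ~/.ripgreprc is just a convention; ripgrep reads whatever file the variable points at, one argument per line):

    # contents of ~/.ripgreprc: disable .gitignore filtering and hidden-file skipping
    -uu

    # in your shell startup file:
    export RIPGREP_CONFIG_PATH="$HOME/.ripgreprc"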
That's on me!
My config file has

    --smart-case --no-messages --hidden --ignore-vcs

and then I point to it from .zshenv with

    export RIPGREP_CONFIG_PATH="$HOME/.rgrc"
Not perfect and sometimes I reach for good old fashioned escaped \grep but most of the time it's fine.
> ripgrep is a line-oriented search tool that recursively searches the current directory for a regex pattern. By default, ripgrep will respect gitignore rules and automatically skip hidden files/directories and binary files. (To disable all automatic filtering by default, use rg -uuu.)
https://github.com/BurntSushi/ripgrep
Regarding ripgrep: if it's not bug-for-bug compatible with grep, it’s deemed useless. Yet, if it is identical, then why bother using it at all? What kind of logic is that?
Because hey guess what: you can still use grep! So I built something different.
Wow, that is so cool. This looks a lot more approachable than other sandboxing tools.
Then use grep, what’s your point? grep is not going away because ripgrep is better, but ripgrep might become more available?
I also notice you’re saying "if", so you’re not. So again, what’s your point?
Often there are plenty of paths open to getting a decent environment as you go:
Mostly, I rely on ansible scripts to install and configure the tools I use.
One fallback I haven't seen mentioned, and one you can get a lot of mileage from: use sshfs to mount the target system locally. This allows you to use your local tools & setup effectively against another machine!
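A minimal sketch of that fallback (the host and paths are placeholders; assumes sshfs/FUSE is installed on the local machine):

    mkdir -p ~/mnt/somehost
    sshfs user@somehost:/var/log ~/mnt/somehost   # remote files now appear under the local mount
    rg 'ERROR' ~/mnt/somehost                     # local tools work on them as usual
    fusermount -u ~/mnt/somehost                  # unmount when done (umount on macOS/BSD)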
I have a chef cookbook that sets up all the tools I like to have on my VMs. When I bootstrap a VM it includes all the stuff I want like fish shell and other things that aren’t standard. The chef cookbook also manages my SSH keys and settings.
Only to feel totally handicapped when logging into a busybox environment.
I'm glad I learned how to use vi, grep, sed..
My only change to an environment is the keyboard layout. I learned Colemak when I was young. Still enjoying it every day.
E.g. I have ls aliased to eza as part of my custom set of configuration scripts. eza pretty much works like ls in most scenarios.
If I'm in an environment which I control and is all configured as I like it, then I get a shinier ls with some nice defaults.
If I'm in another environment then ls still works without any extra thought, and the muscle memory is the same, and I haven't lost anything.
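A guarded version of that alias, as a sketch for ~/.bashrc or ~/.zshrc, so it degrades to plain ls wherever eza is missing:

    # only alias when eza is actually installed
    if command -v eza >/dev/null 2>&1; then
        alias ls='eza'
        alias ll='eza -l --git'   # long listing with a git status column
    fi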
If there's a tool which works very differently to the standard suite, then it really has to be pulling its weight before I consider using it.
[1] Not like the time one of my friends "wardialed" every number in my local calling area and posted the list to a BBS and I found that some of them could be logged into with "uucp/uucp" and the like. I think Bell security knew he rang everybody's phone in the area but decided to let billing handle the problem because his parents had measured service.
not really contradicted by:
> exa: modern replacement for ls/tree, not maintained
Some of them are smart, but sometimes I want dumb. For example, ripgrep respects .gitignore, and often I don't want that, though in this case there is an option to turn it off (-uuu). That's a common theme with these tools too: they try to be smart by default and you need an option to make them dumb.
So no, these tools are not "objectively superior", they are generally more advanced, but it is not always what you need. They complement classic tools, but in no way replace them.
I genuinely don't know what is going on here.
> I genuinely don't know what is going on here.
I basically live in my music library. However, every single pop artist offers songs that I don't like, that are not in my library, and that mysteriously have sold many millions of albums.
I genuinely don't know what is going on here.
Joking aside, have you ever tried to use some of these tools? I used to not understand why people were using vim until I really tried.
No.
> I used to not understand why people were using vim until I really tried.
There's your problem. I respectfully suggest installing Emacs.
I use the command line a lot too and this is one of my most common commands, and I don't know of an elegant way to do it with the builtin Unix tools.
(And I have basically the same question for finding files matching a regex or glob [ignoring the stuff I obviously don't want], e.g., `fd '.foo.*'`.)
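For illustration, the kind of invocations being discussed presumably look something like this (the patterns are placeholders):

    rg 'pattern' -g '*.foo'   # search only files matching a glob, honoring ignore rules
    fd '.foo.*'               # find files whose names match a regex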
Hits in hidden files are not really a pain point for me.
If traversing those hidden files/directories were expensive, I'd tell `find` itself to exclude them. This also lets me switch `xargs` for `find`'s own `-exec` functionality:
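(A sketch of what that might look like; 'pattern' and '*.foo' are placeholders:)

    # skip hidden directories entirely, then grep the matching files in place
    find . -path '*/.*' -prune -o -type f -name '*.foo' -exec grep -n 'pattern' {} +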
(I had to look that one up.) For the record, I think `git grep` is probably the best builtin solution to the problem I gave, but personally I don't know off-hand how to only search files matching a glob, or how to use the current directory rather than the repository root, with `git grep` (both of which are must-haves for me). I'd also need to learn the same commands for different source control systems besides git (I use one other VCS regularly).
Makes sense. If I had to do this frequently, I'd add a function/alias encapsulating that `find` incantation to my .bashrc, which I keep in version control along with other configuration files in my home directory. That way, when moving to a new environment, I can just clone that repo into a fresh home directory and most of my customizations work out-of-the-box.
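Such a wrapper might look like this (a sketch; the function name grepfoo is made up, and the search pattern arrives as the first argument):

    # in ~/.bashrc: grep non-hidden *.foo files under the current directory
    grepfoo() {
        find . -path '*/.*' -prune -o -type f -name '*.foo' \
            -exec grep -n -- "$1" {} +
    }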
1. I don't recommend using shell functions or aliases for this (e.g., in `bashrc`) because then these scripts can't be called from other contexts, e.g., Vim's and Emacs's builtin support for shell commands. This can easily be solved by creating scripts that can be called from anywhere (my personal collection of these scripts is here: https://github.com/robenkleene/Dotfiles/tree/master/scripts). Personally, I only use Bash functions for things that have to do with Bash's runtime state (e.g., augmenting PATH is a common one).
2. The more important part though, is I don't always want to search in `*.foo`, I want a flexible, well-designed, API that allows me to on-the-fly decide what to search.
#2 is particularly important and drifts into the philosophy of tooling. A mistake I used to make was building my workflow of today into customizations like scripts. This is a bad idea because then the scripts aren't useful as your tasks change, and hopefully your tasks are growing in complexity over time. I.e., don't choose your tools based on your workflow today, otherwise you're building in limitations. Use powerful tools that will support you no matter what task you're performing, tools that scale practically infinitely. "The measure of a bookshelf is not what has been read, but what remains to be read."
Is that a zshism?
The only thing I use jq for at work is parsing the Copilot API response so I remember what the model names are - that's it! TBH, I could just skip it and read the JSON.
Also if I had to guess, the type of person who criticizes tools like jq, fzf, and ripgrep and struggles to understand some people's need for benchmarking tools like hyperfine would likely disapprove of fish as well.
The OP list is significantly cheapened by the various ls alternatives.
ripgrep is something I have installed, but use only via text editor integrations. fzf is nice for building ad-hoc TUIs. fd may make sense (I’m told it’s faster than find), but I already know enough find.
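A typical one-liner in that ad-hoc spirit (a sketch; assumes you're inside a git repository):

    # interactively pick a branch and check it out
    git checkout "$(git branch --format='%(refname:short)' | fzf)"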
The “next gen ls” family of tools in the article is baffling.
It's useless as a cat replacement, I agree. The article really shouldn't call it that, although the program's GitHub page does self-describe it as "a cat clone". It's more of a syntax highlighter combined with a git diff viewer (I do have an issue with that; it should be two separate programs, not one).
There are two main uses for cat: 1) Piping the contents of some file into a process.
2) Showing the contents of some short file.
Now (1) is better done with redirection (< or >). The only time I use cat is when I'm testing some pipeline where I only want a few lines of input, so I use 'head' or something similar. Once I have the pipeline working right, I edit the command line to replace 'head' with 'cat'. Easier than re-arranging whole words.
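Concretely, that workflow might look like this (the file name and pipeline are placeholders):

    # while developing the pipeline, feed it only a few lines
    head -n 5 data.txt | awk -F, '{print $2}' | sort | uniq -c

    # once it works, swap the first word instead of re-plumbing a redirection
    cat data.txt | awk -F, '{print $2}' | sort | uniq -c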
And it's rare that (2) is the right solution--too often I find that the file was longer than I thought, and I have to use 'more' (actually 'less').
So a replacement for 'cat' that does color coding sounds pretty much useless to me.
I can't see bat as a "useless cat" or a replacement for cat, except for reading source code in the terminal. It's more like a less with syntax highlighting, or a read-only vim.
That said, I've never really cared much about missing syntax highlighting for cases where I'm viewing file contents with cat. So the tool doesn't really serve a purpose for me and instead I'll continue to load up vim/neovim if I want to view a file with syntax highlighting.
50 more comments available on Hacker News