Fernflower Java Decompiler
Posted 3 months ago · Active 3 months ago
github.com · Tech story
Key topics
Java Decompilation
Reverse Engineering
Open-Source Software
The Fernflower Java decompiler, developed by JetBrains, is a highly regarded tool for reverse engineering Java bytecode. Users discuss its capabilities, limitations, and potential applications.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion · First comment: 3d after posting
Peak period: 22 comments (84-96h)
Avg / period: 9.8
Comment distribution: 39 data points (based on 39 loaded comments)
Key moments
1. Story posted — Sep 25, 2025 at 4:20 PM EDT (3 months ago)
2. First comment — Sep 29, 2025 at 12:07 AM EDT (3d after posting)
3. Peak activity — 22 comments in 84-96h (hottest window of the conversation)
4. Latest activity — Sep 30, 2025 at 4:55 AM EDT (3 months ago)
ID: 45378450 · Type: story · Last synced: 11/20/2025, 4:35:27 PM
https://old.reddit.com/r/java/comments/ue8u59/new_open_sourc...
A little more info in this thread as well:
https://old.reddit.com/r/java/comments/ue8u59/new_open_sourc...
(It was earlier named Quiltflower and is actually a combination of multiple Fernflower forks, per its GitHub README.)
Ideally the project site/GitHub would spell out how the fork differs, though.
[0]: https://thejunkland.com/blog/using-llms-to-reverse-javascrip...
[1]: https://github.com/jehna/humanify/blob/main/README.md#exampl...
That really deserves a link. What is an “analytical” decompiler?
The answer is pretty vague, but it sounds like it's about not trying to "reverse" what the compiler did, but rather trying to "analytically" work out what source code would likely have yielded the bytecode it's looking at?
Given a block of code, a compiler will translate a language expression or statement into a particular sequence of assembly/bytecode instructions. For example, it converts `a + b` to `ADD a b`.
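To make this concrete, here is a minimal Java sketch (class and method names are illustrative) whose bytecode, as reported by `javap -c`, shows the direct expression-to-instruction mapping a reversing decompiler inverts:

```java
public class AddExample {
    // javac compiles the expression `a + b` below into the bytecode
    // sequence shown by `javap -c AddExample`:
    //   iload_0   // push a (parameter slot 0 of a static method)
    //   iload_1   // push b (parameter slot 1)
    //   iadd      // pop both, push a + b
    //   ireturn   // return the int on top of the stack
    // A reversing decompiler pattern-matches that sequence back to `a + b`.
    static int add(int a, int b) {
        return a + b;
    }
}
```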
A reversing decompiler will look at the `ADD a b` and produce `a + b` as the output. This is the simplest approach, as it is effectively just a collection of these kinds of mappings. While this works, the output can be harder to read and noisier than the actual source. This is because:
1. it does not handle annotations like @NotNull correctly -- these are shown as `if (arg == null) throw ...` instead of the annotation because the if/throw is what the compiler generated for that annotation;
2. it doesn't make complex expressions readable;
3. it doesn't detect optimizations like unrolling loops, reordering expressions, etc.
For (1) an analytical decompiler can recognize the `if (arg == null) throw` expression at the start of the function and map that to a @NotNull annotation.
Likewise, it could detect other optimizations like loop unrolling and produce better code for that.
> When you compile your project with IntelliJ IDEA build tool, the IDE adds assertions to all code elements annotated with @NotNull. These assertions will throw an error if the elements happen to be null at runtime.
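The desugared shape looks roughly like the sketch below (the class name, exception type, and message text are illustrative; the exact instrumentation varies by tool version). A reversing decompiler shows the null check verbatim; an analytical one can fold it back into an `@NotNull` annotation on the parameter:

```java
public class NotNullShape {
    // What the original source might say:
    //   static int length(@NotNull String s) { return s.length(); }
    //
    // What a reversing decompiler shows after IntelliJ's @NotNull
    // instrumentation has inserted the runtime assertion:
    static int length(String s) {
        if (s == null) {
            throw new IllegalArgumentException(
                "Argument for @NotNull parameter 's' must not be null");
        }
        return s.length();
    }
}
```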
See this comment by an OpenJDK tech lead: https://news.ycombinator.com/item?id=37666793
A better example for Java would be something like lambda expressions on functional interfaces, where the compiler effectively creates an anonymous object that implements the interface. A reversing decompiler will just see the anonymous class instance, whereas an analytical decompiler can detect that it is likely a lambda expression: it is an anonymous class implementing a single-method interface, passed to a function parameter of that interface type.
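The two shapes side by side, as a sketch (note that modern javac actually emits `invokedynamic` plus a synthetic method for lambdas rather than a literal anonymous class, so this is a simplified illustration; the class and method names are made up):

```java
import java.util.function.Supplier;

public class LambdaShape {
    // Desugared form a decompiler might show: an anonymous class
    // implementing the single abstract method of the interface.
    static Supplier<String> desugared() {
        return new Supplier<String>() {
            @Override
            public String get() {
                return "hi";
            }
        };
    }

    // What an analytical decompiler could recover, since the anonymous
    // class only implements one method of a functional interface.
    static Supplier<String> recovered() {
        return () -> "hi";
    }
}
```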
In C# yield is implemented as a state machine, so an analytical decompiler could recognise that construct.
And yes, for JVM decompilers it could have language heuristics to detect (or be specifically for) Lombok, Scala, Groovy, Kotlin, etc.
[1] https://docs.oracle.com/javase/tutorial/java/javaOO/lambdaex...
So if I'd have to give a definition I pulled out of my ass:
* non-analytical decompiler: "local", works only at the instruction or basic-block level, probably done by just pattern-matching templates
* analytical: anything that does non-local transformations, working across basic-blocks to recover logic and control flow
> Stiver decided to write his own decompiler as a side project. To overcome the weaknesses of existing alternatives, he took a different approach. After reading the bytecode, he constructed a control-flow graph in static single-assignment form, which is much better to express the program semantics abstracting the particular shape of bytecode. At the beginning of this project, Stiver knew little about static analysis and compiler design and had to learn a lot, but the effort was worth it. The resulting decompiler produced much better results than anything available at that time. It could even decompile the bytecode produced by some obfuscators without any explicit support.
https://blog.jetbrains.com/idea/2024/11/in-memory-of-stiver/
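To give a feel for the SSA idea mentioned above, here is a toy sketch (nothing like Fernflower's real implementation; `TinySsa` and its representation are invented) that converts straight-line three-address statements into SSA form by giving every assignment a fresh version number and rewriting later uses to the latest version:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TinySsa {
    // Each statement is {target, operand1, operator, operand2},
    // e.g. {"x", "a", "+", "b"} for `x = a + b`.
    public static List<String> toSsa(List<String[]> stmts) {
        Map<String, Integer> version = new HashMap<>();
        List<String> out = new ArrayList<>();
        for (String[] s : stmts) {
            // Rewrite uses first, so `x = x + c` reads the old version of x.
            String a = use(s[1], version);
            String b = use(s[3], version);
            int v = version.merge(s[0], 1, Integer::sum); // fresh version
            out.add(s[0] + v + " = " + a + " " + s[2] + " " + b);
        }
        return out;
    }

    private static String use(String name, Map<String, Integer> version) {
        Integer v = version.get(name);
        return v == null ? name : name + v; // inputs keep their plain name
    }
}
```

Running it on `x = a + b; x = x + c` yields `x1 = a + b` and `x2 = x1 + c`: each value now has exactly one definition, which is what makes later analyses and transformations much simpler.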
[I work at JetBrains]
Maybe I'm just misunderstanding you, but even if the bytecode sequence is reconstructed as the original code that produced it, stuff like whitespace and comments are simply lost with no ways to recover.
(Also, local variable names, certain annotations depending on their retention level, etc)
However, most of the time it'll output perfectly valid Java code that'll compile if you just create the necessary maven/ant/gradle build configuration to get all of the sources loaded correctly.
I had decompiled the class, fixed the issue, checked in the original decompiled source and then the change. Then a coworker pointed out that the original decompiled source also fixed the issue.
After a bit of digging, I learned that the HotSpot compiler had code to detect and fix the issue, but it was looking for the pattern generated by a modern compiler, and the library was compiled with an older one.
(It's been a while, but I think it was the JAI library, and the issue was triggered by long comments in a PNG.)
Over in .NET land, dnSpy (https://github.com/dnSpyEx/dnSpy) works very well, even on many obfuscated binaries.
I found this amusing, from a Java perspective. The 3-character command-line options are also very "not Java-ish". However, since this one is also written in Java, a good test is whether it can decompile itself perfectly, with the result recompiling to a matching binary; much like how bootstrapping a compiler involves compiling itself and checking that a fixed point is reached.