Dependable C
dependablec.org
Key Features
Tech Stack
Inconsistent titles, stuff labelled [TOC].
It might be a work in progress and not really ready to be shared widely.
Vibe coders usually offer zero workmanship (especially with the web), and are enamored with statistically salient generated arbitrary content. https://en.wikipedia.org/wiki/The_Power_of_10:_Rules_for_Dev...
https://www.stroustrup.com/JSF-AV-rules.pdf
Best of luck =3
Follow the 10 rules on the single wiki page, and C becomes a lot less challenging to stabilize. It's also worth looking at why C and assembly are still used where metastability considerations matter. If you spend your days in user-space applications, then don't worry about it... =3
An assembly language program specifies a sequence of CPU instructions. The mapping between lines of code and generated instructions is one-to-one, or nearly so.
A C program specifies run-time behavior, without regard to what CPU instructions might be used to achieve that.
C is at a lower level than a lot of other languages, but it's not an assembly language.
Both ISA-level assembly and C are targeting an abstract machine model, even if the former is somewhat further removed from hardware reality.
Nobody claimed that. It corresponds to the instructions the CPU runs and their observable order.
Also it's really only x86 that uses micro-ops (in the way that you mean), and there are still plenty of in-order CPUs.
Assembly is not about corresponding to exactly which gates open when in the CPU. It's just the human-writable form of whatever the CPU ingests, whereas C is an early take on a language reasonably capable of expressing higher-level ideas with less low-level noise.
I seriously doubt anyone who has written projects in assembly would make such comparisons...
With genuine respect, I believe this type of insinuation is rarely productive.
Someone might still have silly opinions, even if they have been paid to write assembly for 8-, 24-, and 64-bit CISC and RISC, in-order and out-of-order ISAs, and maybe compilers too. Peace :)
This should not be mistaken for an appeal to authority; it is merely reasonable discrimination between those speaking from experience and those forming opinions without experience.
If one believes those with experience have poorly informed opinions, they're always free to gain experience and the associated perspective. They will then either have the fundamentals to properly push their viewpoint, or end up better understanding and aligning with the common viewpoint.
It's still much closer to the input machine code compared to what compiler optimizer passes do to your input C code ;)
If you use

    mov %i0, %l0

instead of

    or %g0, %i0, %l0

then that isn't "the lowest level you can target."

What I meant to say is that since there is no way to directly write microcode, assembly is the lowest level software can target.
Here's a trivial example clang will often implement differently on different systems, producing two different results. Clang x64 will generally mul+add, while clang arm64 is aggressive about fma.
    float fma(float x) {
        return 3.0f * x + 1.0f;
    }

Clang armv8 21.1.0:

    fma(float):
        sub sp, sp, #16
        str s0, [sp, #12]
        ldr s1, [sp, #12]
        fmov s2, #1.00000000
        fmov s0, #3.00000000
        fmadd s0, s0, s1, s2
        add sp, sp, #16
        ret

Clang x86-64 21.1.0:

    .LCPI0_0:
        .long 0x3f800000
    .LCPI0_1:
        .long 0x40400000
    fma(float):
        push rbp
        mov rbp, rsp
        vmovss dword ptr [rbp - 4], xmm0
        vmovss xmm1, dword ptr [rbp - 4]
        vmovss xmm2, dword ptr [rip + .LCPI0_0]
        vmovss xmm0, dword ptr [rip + .LCPI0_1]
        vfmadd213ss xmm0, xmm1, xmm2
        pop rbp
        ret

Now let's say you're working on a game with deterministic lockstep. How do you guarantee precision and rounding with an assembler? Well, you just write the instructions or pseudoinstructions that do what you want. Worst case, you write a thin macro to generate the right instructions based on something else that you also control. In C or C++, you either abuse the compiler or rely on a library to do that for you ([0], [1]).
This is the raison d'etre of modern assemblers: precise control over the instruction stream. C doesn't give you that and it makes a lot of things difficult (e.g. constant time cryptography). It's also not fundamental to language design. There's a long history of lisp assemblers that do give you this kind of precise control, it's just not a guarantee provided by any modern C implementations unless you use the assembly escape hatches. The only portable guarantees you can rely on are those in the standard, hence the original link.
Low level control over the instruction stream is ultimately a spectrum. On one end you can write entirely in hex, then you have simple and macro assemblers. At the far end you have the high level languages. Somewhere in the middle is C and however you want to categorize FASM.
Since optimizing compilers became a thing in the C world, and WG14 never considered modern CPU architectures when deciding what hardware features C should expose, this idea lost meaning.
However, many people hold on to old beliefs: that C is still the same kind of C that they learnt from the first edition of the K&R book.
This ACM article might be interesting to you, https://queue.acm.org/detail.cfm?id=3212479
Before dismissing it as the author having no idea what he is talking about: David Chisnall used to be a GCC contributor, was one of the main GNUstep contributors back in the early days, and is one of the key researchers behind the CHERI project.
1. All the reasons he cites that depend on "what the metal does" being different and quite a bit more complex than what is surfaced in C apply equally to assembly language. So assembly language is not a low-level language? Interesting take, but I don't think so: it is the lowest level language that is exposed by the CPU.
2. The other reasons boil down to "I would like to do this optimization", and that is simply inapplicable.
"C is a high-level assembler" captures it almost perfectly, and it also captures the documented intent of the people who created the ANSI C standard, which was that ANSI C should not preclude the (common) use of C as a high-level assembly language.
Here's the quote:
C code can be non-portable. Although it strove to give programmers the opportunity to write truly portable programs, the C89 Committee did not want to force programmers into writing portably, to preclude the use of C as a “high-level assembler”: the ability to write machine-specific code is one of the strengths of C. It is this principle which largely motivates drawing the distinction between strictly conforming program and conforming program (§4).
https://www.open-std.org/jtc1/sc22/wg14/www/C99RationaleV5.1...
(bottom of page 2)
https://gcc.gnu.org/wiki/boringcc
As a boring platform for the portable parts of boring crypto software, I'd like to see a free C compiler that clearly defines, and permanently commits to, carefully designed semantics for everything that's labeled "undefined" or "unspecified" or "implementation-defined" in the C "standard" (DJ Bernstein)
And yeah I feel this:
The only thing stopping gcc from becoming the desired boringcc is to find the people willing to do the work.
---
And Proposal for a Friendly Dialect of C (2014)
Oof, those passive-aggressive quotes were probably deserved at the time.
The actually interesting stuff happens outside the standard in vendor-specific language extensions (like the clang extended vector extension).
https://www.youtube.com/watch?v=zqHdvT-vjA0
https://www.youtube.com/watch?v=443UNeGrFoM
Vibe coders follow the herd, they don't write any C.
Given the amount of C code in existence for practically any domain you can think of, I imagine LLMs can do a good job of generating it.
It'd also be a good starting point to be more concrete in your ambitions. What version of C is your preferred starting point, the basis for your "Better C"?
I'd also suggest the name "Dependable C" confuses readers about your objective. You don't seek reliability but a return to C's simpler roots. All the more reason to choose a recognized historical version of C as your baseline and call it something like "Essential C".
Of course, the website operator can do something about this as well. On my website I redirect all traffic from mobile devices to the kids section of YouTube.
https://en.wikipedia.org/wiki/The_Power_of_10:_Rules_for_Dev...
A bit of history where these rules came from, and why they matter. =3
"Why Fighter Jets Ban 90% of C++ Features"
Case in point: the article has this somewhere in the example code:

    struct struct s;
    s.member = 42;
    s.other_member = 1138;

(ignore the syntax errors and typos, the article is full of them)

If new members are added to the struct, you end up with uninitialized memory. With C99 designated init at least the new members are zero-initialized.
The problem with post-C89 is that you lose the unique features of old-school C.
For example, it can be compiled on basically any platform that contains a CPU. It has tool support everywhere. And it can be fairly easily adapted to run using older C compilers going back to the 1980s.
Or that (old-school) C is basically a high level assembly language is actually a feature and not a bug. It's trivial to mentally map lines of C89 code to assembly.
No other widely available language can tick these boxes.
So the problem with later versions of C is that you lose these unique features while you are now competing for mindshare with languages that were designed in the modern age. And I just don't see it winning that battle. If I wanted a "modern" C, I'll use Zig or Rust or something.
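The "mentally map lines of C89 code to assembly" point a few lines up can be illustrated with a toy function; the commented instructions are what an unoptimizing x86-64 compiler typically emits for each line (an illustrative assumption, not a guarantee of any particular compiler's output):

```c
/* Old-school C encourages a near line-by-line mapping to instructions. */
long add3(long a, long b, long c)
{
    long sum;
    sum = a + b;     /* e.g. mov rax, rdi ; add rax, rsi */
    sum = sum + c;   /* e.g. add rax, rdx */
    return sum;      /* e.g. ret (result already in rax) */
}
```

With optimizing compilers and post-C89 semantics, that mental model is exactly what breaks down, which is the tension this comment describes.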
Just because a language (C89) doesn't evolve doesn't mean it's dead.
It will be pretty hard to find a platform which doesn't have at least a C99 compiler.
For instance, even SDCC has up-to-date C standard support, and that covers most 8-bit CPUs all the way back to the 70s:
Also let's not forget that C99 is a quarter century old by now. That's about as old as K&R C was in 1999 ;)
And so on. Is this a work-in-progress thing not meant for public consumption yet?