The Grammar According to West
Key topics
The debate rages on about the clarity of mathematical notation, with some commenters lamenting the "abusive language" and "sloppy notation" that can hinder understanding, while others defend the practice of overloading symbols as a convenient and context-dependent shorthand. As one commenter astutely pointed out, memorizing equations without true comprehension can create a false sense of understanding, highlighting the importance of precise notation. Proposals for alternative notations, such as using function parameters like `F{sin}(x)`, were floated, but others countered that complicated formulas can't always be simplified so neatly. Amidst the discussion, a consensus emerged that context is key to deciphering ambiguous notation, with some commenters appreciating the convenience of overloaded symbols like `\Sum_{s \in S}`.
Snapshot generated from the HN discussion
Discussion Activity
- Status: active discussion
- First comment: 3d after posting
- Peak period: 17 comments (72-84h window)
- Avg comments / period: 5.2
- Based on 26 loaded comments
Key moments
- Story posted: Aug 27, 2025 at 9:51 AM EDT (4 months ago)
- First comment: Aug 30, 2025 at 8:28 AM EDT (3d after posting)
- Peak activity: 17 comments in the 72-84h window, the hottest period of the conversation
- Latest activity: Aug 31, 2025 at 11:20 PM EDT (4 months ago)
"let {x_n} be a sequence"
As the author points out, a sequence is a function. The notation {x_n} denotes the set of the sequence's terms, i.e. its range. A function and its range are two different things, and sets have no ordering anyway. It might seem like a minor thing, but I thought we were trying to be precise?
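In symbols, the distinction being drawn is roughly this (a minimal LaTeX sketch; X and x_n are just placeholders):

```latex
% a sequence is a function from the index set into a space X ...
x \colon \mathbb{N} \to X, \qquad n \mapsto x_n
% ... while \{x_n\} names only its range, an unordered set of values
\{\, x_n : n \in \mathbb{N} \,\} \subseteq X
```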
A second example: at the high school level, I'm pretty sure a lot of textbooks don't carefully distinguish between a function and the formula defining it.
The author of this web page has a section on what he calls "double duty definitions". Personally, I don't find anything wrong with the language "let G=(V,E) be a graph". G is the graph and we're simultaneously defining/naming its structure. So, some of this is a matter of taste. And, to some extent, you just have to get used to the way mathematicians write.
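As a loose programming analogy (illustrative names, not from the article or the thread): tuple assignment in Python does the same double duty, naming the whole object and its parts in one statement.

```python
# "let G = (V, E) be a graph": one statement names the object and its structure
V = {1, 2, 3}
E = {(1, 2), (2, 3)}
G = (V, E)            # G is the graph...
vertices, edges = G   # ...and its structure unpacks back out of it
```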
Same in college: when learning the Fourier transform, a stumbling block was that the prof never properly explained that it takes a function as a whole and produces a whole new function as output. When you first learn this concept, it takes a bit of time to wrap your head around, but when it clicks, everything makes more sense. But just writing F{sin(x)} = ... makes it look like F acts on a concrete value. A more explicit way would be F{x -> sin(x)} = {x -> ...}.
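A minimal sketch of that "function in, function out" point, in Python with NumPy (the discretization and sample window here are arbitrary choices, not anything from the thread):

```python
import numpy as np

# A crude discretization of F{f}(k) = integral of f(x) e^{-2*pi*i*k*x} dx over
# a finite window, just to show the shape of the operation: function in, function out.
def fourier_transform(f, xs):
    dx = xs[1] - xs[0]
    def F(k):
        # evaluate f on the whole sample grid, not at a single point
        return np.sum(f(xs) * np.exp(-2j * np.pi * k * xs)) * dx
    return F  # the result is itself a function, of the frequency k

F_sin = fourier_transform(np.sin, np.linspace(-50.0, 50.0, 10_000))
print(F_sin(1 / (2 * np.pi)))  # evaluating the *output* function at one frequency
```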
Of course, once these fundamentals are baked into your brain and you take them for granted, it's hard to see where beginners get confused, and writing in shorthand is so much easier that you get sloppy, while still remaining unambiguous to experienced people.
This is why I always preferred to see coded-up demos and implementations as opposed to formulas on blackboards and slides. If you have to implement it, you can't handwave away things as pedantry. It forces precision by default.
Which is why I'm so favorable toward Jupyter-notebook-like teaching environments. Embed the code (guaranteed to execute, no illegal shorthand) so that learners get a true representation that can be manipulated. Although I think they're still unlikely to reshape education: now you require some coding fluency on top of the niche math topic.
`F{sin}(x) = ...` is just as short and clearer?
In most cases it is not so much abusing notation as overloading it. If you think of the context of a formula (say, the adjacent paragraphs) as its implicit arguments (think lambda captures in C++), then it is natural that curly braces can denote both a set and a sequence, depending on this implicit input.
Such context-dependent use of symbols is actually rather convenient with a little practice.
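A loose sketch of that idea, using Python closures in place of C++ lambda captures (names are illustrative, not from the thread): the captured context decides what the same symbol produces.

```python
# The enclosing scope plays the role of a formula's surrounding paragraphs:
# the same symbol means a sequence or a set depending on captured context.
def make_braces(ordered):
    def braces(*terms):
        return list(terms) if ordered else set(terms)
    return braces

braces = make_braces(ordered=True)
print(braces(3, 1, 3))   # [3, 1, 3] -- read as a sequence
braces = make_braces(ordered=False)
print(braces(3, 1, 3))   # {1, 3}    -- read as a set
```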
?
I don't even know where to begin. Overloading symbols in mathematics occurs all over the place. There's nothing wrong with that. The difference between overloading a symbol and abusing it is whether there is an agreed upon definition/convention regarding its use and to what extent its use conforms to that definition/convention. What I'm saying in my original post is that the statement "{x_n} is a sequence" disagrees with the formal idea of what a sequence is and that most writers don't bother to explain their own notational use.
If you wish to re-define the curly braces to have a context-dependent meaning, knock yourself out. But I would imagine that practice would confuse a lot of people. Math is a human activity. It's not a programming language.
I wasn't aware of its ubiquity! I may only think of it as "abusive" due to lack of familiarity. The way I've seen it used is \Sum_{e \in S} e_i, where 'i' is never explicitly defined, and it still assumes elements indexed by integers. The only utility seems to come from the abbreviation, leaving out the range of indices being iterated over. Not saying that isn't useful, but the rigor of the math probably doesn't benefit from time-saving omissions.
I'm tempted to call that notation simply wrong rather than abusive. Generally, "abusive" notation, while technically wrong, has some redeeming feature in intuition or conciseness.
In this case, the alternative notation would be to simply drop the index and write "\Sum_{e \in S} e", which seems to be all around better.
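Side by side, the variants under discussion (a LaTeX sketch; the last form assumes the elements of S are enumerated as e_1, ..., e_n, which the original expression never states):

```latex
% the form being questioned, with i never defined:
\sum_{e \in S} e_i
% the set-based form suggested above, no index needed:
\sum_{e \in S} e
% or, if indexing is really wanted, quantify the index itself:
\sum_{i=1}^{n} e_i
```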
From having spent way too much time doing technical writing, I'm tempted to say the notation you're recalling really was a mistake. They probably started out with "\Sum_{e \in S} e", then decided to make all summations index-based instead of set-based. Unless you spend a lot of time proofreading, that type of style change can easily lead to half-translated expressions like the one you recall.
Good mathematical writing has this kind of cadence and pattern to it, and that's not a problem. For good writers some personal charm and flavor can still shine through, but it helps the reader to use the familiar trope structures. Unfortunately, this kind of "meta" is not taught much, so many students don't quite understand how to read math books and get frustrated when they progress slowly, expecting to read them at the same speed as a history book or a novel. In a math book it's normal to re-read sentences, jump back half a page, flip the pages back and forth, put down the book for a moment and think, and so on; it may take an afternoon just to digest one or two pages.
Paul Halmos and Donald Knuth are good examples of people who have a very nice mathematical style, imo.
I will always be grateful to Prof Körner for advocating that people who did Part III retroactively get awarded MMath masters degrees, thus making the most hardcore year of my academic life at least somewhat comprehensible to people outside Cambridge.
I found this interesting. SQL and Swift use “where” for restrictions. Any other examples in programming languages?
If a programming language wanted to use a keyword for restrictions that isn’t “where” (and still is a single word, hence “such that” doesn’t qualify), what word would be suitable instead? “With”? “Having”?
As for single-word alternatives, "if" is common. For example, in Python:
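```python
# the "if" clause acts as the restriction, like a "where" (illustrative example)
evens = [x for x in range(10) if x % 2 == 0]
# evens == [0, 2, 4, 6, 8]
```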
Common Lisp uses "if" but also "when"; Perl and Common Lisp both use "unless", of course with the sense inverted. SQL uses both "where" and "having". You could also reasonably use "suppose", "stipulate", "assert", "wolog", or "let". Lisp M-expressions (and, arguably, Dijkstra's guarded command language) used "→".

"If" wouldn't fit that use case, and neither would "when". I suppose "require" would work, but it also feels different from "such that". The intended meaning of the latter example would be "let s be an S such that s.x < s.y". "Given", as the sibling comment by hallole proposes, also doesn't fit.
Provable termination isn't strictly necessary, I think, because compile-time evaluation (or metaprogramming in general) is usually Turing-complete anyway.
I don't know SQL, but going solely off Google Images, that'd read like:
SELECT X GIVEN X > 5
E.g., Haskell also has a "let" notation.

> If a programming language wanted to use a keyword for restrictions that isn't "where" (and still is a single word, hence "such that" doesn't qualify), what word would be suitable instead? "With"? "Having"?

"If"? Python does so with its list comprehensions:
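```python
# "let s be an S such that s.x < s.y", comprehension-style (illustrative values)
pairs = [(x, y) for x in range(3) for y in range(3) if x < y]
# pairs == [(0, 1), (0, 2), (1, 2)]
```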
I'm in no way saying you should DO mathematics in a programming language; rather, translate it for a wider audience. At a sufficiently high level even little kids could understand your argument before drilling down into specifics, and since the spec never actually has to run, it can be as high-level as you want, limited only by verification time.
The whole point of introducing a math equation in a paper is to serve as a completely unambiguous formalism, devoid of the ambiguities of the spoken word.
And yet, it is all too common to read something many times and fail to make sense of it, until it hits me that the author means something completely different from what the symbols would imply in principle, and what looked like a formalism is basically a sloppy direct translation of words into math symbols, combined with abuse of notation, idiomatic but undefined uses of established notation, or outright nonsense.