Show HN: How are Markov chains so different from tiny LLMs?
Mood: thoughtful
Sentiment: positive
Category: tech
Key topics: Markov chains, LLMs, Natural Language Processing
The Markov chain generates text that seems to me at least on par with that of tiny LLMs, such as those demonstrated by NanoGPT. Here is an example:
jplr@mypass:~/Documenti/2025/SimpleModels/v3_very_good$ ./SLM10b_train UriAlon.txt 3
Training model with order 3...
Skip-gram detection: DISABLED (order < 5)
Pruning is disabled
Calculating model size for JSON export...
Will export 29832 model entries
Exporting vocabulary (1727 entries)...
Vocabulary export complete.
Exporting model entries...
Processed 12000 contexts, written 28765 entries (96.4%)...
JSON export complete: 29832 entries written to model.json
Model trained and saved to model.json
Vocabulary size: 1727
jplr@mypass:~/Documenti/2025/SimpleModels/v3_very_good$ ./SLM9_gen model.json
Aging cell model requires comprehensive incidence data. To obtain such a large medical database of the joints are risk factors. Therefore, the theory might be extended to describe the evolution of atherosclerosis and metabolic syndrome. For example, late‐stage type 2 diabetes is associated with collapse of beta‐cell function. This collapse has two parameters: the fraction of the senescent cells are predicted to affect disease threshold . For each individual, one simulates senescent‐cell abundance using the SR model has an approximately exponential incidence curve with a decline at old ages In this section, we simulated a wide range of age‐related incidence curves. The next sections provide examples of classes of diseases, which show improvement upon senolytic treatment tends to qualitatively support such a prediction. model different disease thresholds as values of the disease occurs when a physiological parameter ϕ increases due to the disease. Increasing susceptibility parameter s, which varies about 3‐fold between BMI below 25 (male) and 54 (female) are at least mildly age‐related and 25 (male) and 28 (female) are strongly age‐related, as defined above. Of these, we find that 66 are well described by the model as a wide range of feedback mechanisms that can provide homeostasis to a half‐life of days in young mice, but their removal rate slows down in old mice to a given type of cancer have strong risk factors should increase the removal rates of the joint that bears the most common biological process of aging that governs the onset of pathology in the records of at least 104 people, totaling 877 disease category codes (See SI section 9), increasing the range of 6–8% per year. The two‐parameter model describes well the strongly age‐related ICD9 codes: 90% of the codes show R 2 > 0.9) (Figure 4c). This agreement is similar to that of the previously proposed IMII model for cancer, major fibrotic diseases, and hundreds of other age‐related disease states obtained from 10−4 to lower cancer incidence. A better fit is achieved when allowing to exceed its threshold mechanism for classes of disease, providing putative etiologies for diseases with unknown origin, such as bone marrow and skin. Thus, the sudden collapse of the alveoli at the outer parts of the immune removal capacity of cancer. For example, NK cells remove senescent cells also to other forms of age‐related damage and decline contribute (De Bourcy et al., 2017). There may be described as a first‐passage‐time problem, asking when mutated, impair particle removal by the bronchi and increase damage to alveolar cells (Yang et al., 2019; Xu et al., 2018), and immune therapy that causes T cells to target senescent cells (Amor et al., 2020). Since these treatments are predicted to have an exponential incidence curve that slows at very old ages. Interestingly, the main effects are opposite to the case of cancer growth rate to removal rate We next consider the case of frontline tissues discussed above.
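For readers who want a concrete picture of the technique being compared, here is a minimal sketch of an order-3 word-level Markov chain following the same train-then-generate workflow as the SLM10b_train / SLM9_gen commands above. This is an assumption about the general approach, not the author's actual code: the input file name, model.json output, and order 3 match the transcript, but the JSON layout, function names, and sampling details are hypothetical.

import json
import random
from collections import defaultdict, Counter

ORDER = 3  # context length in words, as in "./SLM10b_train UriAlon.txt 3"

def train(path):
    # Count, for every 3-word context, how often each next word follows it.
    words = open(path, encoding="utf-8").read().split()
    model = defaultdict(Counter)
    for i in range(len(words) - ORDER):
        context = tuple(words[i:i + ORDER])
        model[context][words[i + ORDER]] += 1
    # Export as JSON: contexts joined by spaces, successors with counts
    # (hypothetical layout; the real model.json format may differ).
    out = {" ".join(ctx): dict(nexts) for ctx, nexts in model.items()}
    with open("model.json", "w", encoding="utf-8") as f:
        json.dump(out, f)
    return out

def generate(model, length=200):
    # Start from a random context and repeatedly sample the next word
    # in proportion to its observed count.
    context = random.choice(list(model.keys())).split()
    output = list(context)
    for _ in range(length):
        nexts = model.get(" ".join(context))
        if not nexts:  # dead end: restart from a random context
            context = random.choice(list(model.keys())).split()
            continue
        candidates, counts = zip(*nexts.items())
        nxt = random.choices(candidates, weights=counts)[0]
        output.append(nxt)
        context = context[1:] + [nxt]
    return " ".join(output)

if __name__ == "__main__":
    m = train("UriAlon.txt")
    print(generate(m))

With a single source text and a context of three words, the chain mostly stitches together spans it has seen verbatim, which is why the sample output above reads locally fluent but drifts at sentence boundaries.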
The author compares the text-generation capabilities of Markov chains and tiny LLMs, finding the Markov chain's output roughly on par with that of tiny LLMs in producing coherent text.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion
First comment: 6h after posting
Peak period: 1 comment (Hour 6)
Avg / period: 1
Based on 1 loaded comment
Key moments
- Story posted: 11/17/2025, 8:36:50 PM (1d ago)
- First comment: 11/18/2025, 2:13:09 AM (6h after posting)
- Peak activity: 1 comment in Hour 6 (hottest window of the conversation)
- Latest activity: 11/18/2025, 2:13:09 AM (19h ago)