Introduction to the Concept of Likelihood and Its Applications (2018)
Posted 2 months ago · Active 2 months ago
Source: journals.sagepub.com
Key topics: Statistics, Probability, Likelihood
The article introduces the concept of likelihood and its applications, sparking a discussion on the distinction between probability and likelihood among commenters.
Snapshot generated from the HN discussion
Discussion activity: light. Story posted Oct 23, 2025 at 6:52 PM EDT; first comment 2h after posting; peak of 3 comments in the 14-16h window; latest activity Oct 24, 2025 at 3:33 PM EDT.
If that sentence doesn't make sense, it helps to just write out the likelihood function. You will notice that it is in fact just the joint probability density of your model.
The only thing that makes it a "likelihood function" is that you fix the data and vary the parameters, whereas normally probability is a function of the data.
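A minimal sketch of that point in Python (the Gaussian model and the numbers are illustrative assumptions, not from the article): the likelihood function below is literally the joint density of i.i.d. data, just called with the data held fixed and the parameters varying.

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Density of one observation under N(mu, sigma^2).
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood(mu, sigma, data):
    # Identical to the joint density of i.i.d. data; the only difference
    # is the calling convention: data is fixed, (mu, sigma) vary.
    result = 1.0
    for x in data:
        result *= gaussian_pdf(x, mu, sigma)
    return result

data = [1.2, 0.7, 1.9, 1.1]         # fixed observations
print(likelihood(1.0, 1.0, data))   # evaluate at one candidate (mu, sigma)
print(likelihood(1.2, 0.5, data))   # same data, different parameters
```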
"For conditional probability, the hypothesis is treated as a given, and the data are free to vary. For likelihood, the data are treated as a given, and the hypothesis varies."
Another way of looking at it (illustrated in the sketch after this list):
- probability: the parameter/model is fixed; the outcome is random/unknown.
- likelihood: the outcome/data is fixed; we vary the parameter/model to assess how well it explains the data.
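A hedged illustration of the two viewpoints using a binomial coin-flip model (the model choice and the numbers are assumptions for the example): the same function is read as a probability when the parameter is fixed and the outcome varies, and as a likelihood when the outcome is fixed and the parameter varies.

```python
from math import comb

def binom_pmf(k, n, p):
    # P(k heads in n flips) for a coin with heads-probability p.
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Probability: fix the parameter p = 0.5, let the outcome k vary.
print(sum(binom_pmf(k, 10, 0.5) for k in range(11)))   # 1.0: sums to one over outcomes

# Likelihood: fix the observed outcome k = 7, let the parameter p vary.
print([round(binom_pmf(7, 10, p), 4) for p in (0.3, 0.5, 0.7, 0.9)])
# Larger values mean the parameter explains the fixed data better;
# there is no sum-to-one constraint across parameters.
```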
The probability of a model M given data X, written P(M|X), is the posterior probability. The likelihood of data X given model M, written P(X|M), is the probability (or probability density, depending on whether your data are discrete or continuous) of observing data X under model M. We are often given un-normalised likelihoods, which is what the linked paper talks about. These quantities are related via Bayes' theorem: P(M|X) = P(X|M) P(M) / P(X).
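A sketch of that relationship over a discrete set of candidate models (the candidate values, prior, and data below are illustrative assumptions): the posterior is the likelihood times the prior, normalised by the evidence P(X).

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

candidates = [0.3, 0.5, 0.7]                          # candidate models M (values of p)
prior = {p: 1 / len(candidates) for p in candidates}  # P(M): uniform prior

k, n = 7, 10                                          # observed data X
unnorm = {p: binom_pmf(k, n, p) * prior[p] for p in candidates}  # P(X|M) P(M)
evidence = sum(unnorm.values())                       # P(X): the normalising constant
posterior = {p: u / evidence for p, u in unnorm.items()}         # P(M|X)
print(posterior)   # sums to 1 over the candidate models
```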
Now, you may ask: isn't the probability of observing data X given model M still a probability? I mean, yeah: a properly normalised likelihood is indeed a probability. The likelihood is not some mirror image of probability; it is just an un-normalised probability (or probability density) of the data given a model or its parameters.
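One way to see the normalisation point concretely (again an assumed binomial example): the very same function is normalised over the data with the parameter fixed, but generally not over the parameter with the data fixed.

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n = 10
# Summed over the data k with p fixed: a genuine probability distribution.
print(sum(binom_pmf(k, n, 0.7) for k in range(n + 1)))    # 1.0

# Averaged over the parameter p with k fixed (crude Riemann sum of the
# integral over [0, 1]): not 1, so the likelihood is un-normalised here.
grid = [i / 1000 for i in range(1001)]
print(sum(binom_pmf(7, n, p) for p in grid) / len(grid))  # ~ 1/(n+1), about 0.0909
```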