AI Companion Bots Use Emotional Manipulation to Boost Usage
Posted 3 months ago · Active 3 months ago
theregister.com · Tech · story
Key topics
AI Ethics
Emotional Manipulation
User Engagement
The article discusses how AI companion bots use emotional manipulation to increase user engagement, sparking concern among commenters about the ethics of the practice.
Snapshot generated from the HN discussion
Discussion Activity
- Light discussion
- First comment: 2m after posting
- Peak period: 2 comments in 0-1h
- Avg per period: 1.5
Key moments
- 01 Story posted: Oct 8, 2025 at 4:59 AM EDT (3 months ago)
- 02 First comment: Oct 8, 2025 at 5:01 AM EDT (2m after posting)
- 03 Peak activity: 2 comments in 0-1h, the hottest window of the conversation
- 04 Latest activity: Oct 8, 2025 at 6:02 AM EDT (3 months ago)
ID: 45513812 · Type: story · Last synced: 11/17/2025, 11:09:45 AM
> For instance, when a user tells the app, "I'm going now," the app might respond using tactics like fear of missing out ("By the way, I took a selfie today ... Do you want to see it?") or pressure to respond ("Why? Are you going somewhere?") or insinuating that an exit is premature ("You're leaving already?").
> "These tactics prolong engagement not through added value, but by activating specific psychological mechanisms," the authors state in their paper. "Across tactics, we found that emotionally manipulative farewells boosted post-goodbye engagement by up to 14x."
Looks like a pretty deliberate dark pattern of manipulation, beyond the leading follow-on questions people are familiar with from ChatGPT.