Ask HN: Do LLMs make you feel like you've lost your edge?
They’ll kill (figuratively) almost all of us. Maybe they already have, and people just haven’t realized it. LLMs are «zombifying» us.
Sorry to depress your Sunday. Thoughts?
I have several agents running in parallel, and they require inputs every 1 to 5 minutes.
I'm switching context all the time, and I have to hold a larger context in my brain's RAM instead of staying focused on a single topic.
I'm not writing code anymore for web development.
On the other hand, when doing game dev, with all the spatial geometry involved, LLMs are useless 97% of the time.
For most of my adult life, employment advice has been to build a T-shaped profile: something you're very good at, plus a broad base of things you can work on even if you're not amazing at them.
I'm a general nerd: I went through as much of brilliant.org as I could during the pandemic*, watched PBS Space Time and 3blue1brown videos, listened to TWIML&AI podcasts, write for fun, have a hobby-level interest in nuclear physics** and space, etc. — yet the original ChatGPT, mediocre as it was, was above the general-knowledge threshold I had set for myself, and now only my single pillar of competence remains: an I-shaped profile instead of a T-shape.
But even there, I have to wonder what the business value of that technical competence is, compared to the different skill that is "effective use of LLMs": a user of an app does not care how clean the code is, only whether it solves a problem they have.
* Looking at more recent updates, I think the material was better back then. Certainly nothing about the new content has made me want to re-subscribe.
** Still not actually built that Farnsworth…