Purdue University Approves New AI Requirement for All Undergrads
Key topics
Purdue University has sparked debate by introducing a new AI competency requirement for all undergraduate students, leaving many to wonder if this is a necessary step or a hasty attempt to stay trendy. Commenters are divided, with some like conartist6 and andy99 dismissing it as a "public embarrassment" and others like turtleyacht and gmfawcett seeing potential in leveraging AI to enhance general machine thinking and graduate attributes. The discussion highlights concerns that this might be a superficial fix, with gamblor956 drawing parallels to the "big data" fad, while basch and turtleyacht argue that AI literacy could be a valuable skill. As the university delegates the development of this competency to its provost and deans, the conversation remains lively, reflecting broader questions about the role of AI in education.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 30m after posting
- Peak period: 21 comments in 0-2h
- Avg / period: 5.5
Based on 60 loaded comments
Key moments
- Story posted: Dec 13, 2025 at 3:54 PM EST (20 days ago)
- First comment: Dec 13, 2025 at 4:24 PM EST (30m after posting)
- Peak activity: 21 comments in 0-2h (hottest window of the conversation)
- Latest activity: Dec 15, 2025 at 1:36 AM EST (19 days ago)
I thought Purdue was a good school; these kinds of gimmicks are usually the province of low-tier universities trying to get attention.
Professors can tailor lectures to narrower topics or advanced, current, or more specialized subjects. There may be less need to have a series of beginning or introductory courses--it's assumed learners will avail themselves.
Pessimistically, AI literacy contributes to further erosion of critical thinking, lazy auto-grading, and inability to construct book-length arguments.
Ever-available… until ClosedAI decides that you did something wrong and bans your account.
It's not unrealistic to be selecting for people with strong language skills and the ability to break tasks into discrete components and assemble them into a process, or the skill of being able to define what they do not know.
A lot of what makes a person good with an LLM also makes them good at general problem solving.
What percentage of students who graduated in 2025 have no idea what machine learning is?
Forget Attention Is All You Need and transformers. What percentage can't define machine learning? What percentage have no idea what the question even means? A highly non-trivial percentage.
ChatGPT prompting 101 would obviously be stupid but there is more than enough material to do a fantastic AI 101 class.
You need exposure to philosophical ideas because you need words to be able to think about and describe the similarities and differences between computed language output and a lived experience. You need evolutionary biology to understand that AI is not going to catch up to a billion years of evolutionary progress in 6 months. You need ethics because AI is an invitation to ruin yourself through cheating, bullshitting your responsibilities, and generally failing to consider that improving yourself takes work.
https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...
Where the actual news is:
> To this end, the trustees have delegated authority to the provost, working with deans of all academic colleges, to develop and to review and update continuously, discipline-specific criteria and proficiency standards for a new campuswide “artificial intelligence working competency” graduation requirement for all Purdue main campus students, starting with new beginners in fall 2026.
So the Purdue trustees have "delegated authority" to people at the University to make a new graduation requirement for 2026.
Who knows what will be in the final.
I think it would be the ongoing job of the deans, or at least someone, to set graduation requirements? Why would the trustees have to explicitly delegate it?
After more than a trillion dollars spent, LLMs can replace: (a) a new secretary with one week of experience (b) a junior programmer who just learned that they can install programs on a desktop computer, and (c) James Patterson.
That's the bright future that Purdue is preparing its students for.
Yes, AIs will be a huge thing...eventually...but LLMs are not AI, and they never will be.
“License your chat history” - most of us wouldn’t have any takers, but someone like you might.
(And I say this as someone who is really not a fan of how LLMs are being presented to the world at large)
AI/ML isn't going to completely shift the world, but understanding how to do basic prompt engineering, validate against hallucinations, and know the difference between ChatGPT and GPT-4o is valuable for people who do not have a software background.
However, there's no reason to think any given trick will still be relevant even a year from now. As LLMs get better, why wouldn't we just have them auto-rewrite prompts using the appropriate prompt engineering tricks?
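A minimal sketch of that auto-rewrite idea, in Python, assuming a hypothetical call_llm helper rather than any particular provider's SDK:

    # Sketch: let the model apply prompt-engineering conventions itself,
    # then answer the improved prompt. call_llm is a hypothetical stand-in
    # for whatever chat-completion API is actually available.

    def call_llm(prompt: str) -> str:
        """Hypothetical LLM call; swap in a real client here."""
        raise NotImplementedError

    REWRITE_TEMPLATE = (
        "Rewrite the following prompt so it is specific, states the desired "
        "output format, and lists any assumptions the answer should make:\n\n{prompt}"
    )

    def answer_with_rewritten_prompt(raw_prompt: str) -> str:
        improved = call_llm(REWRITE_TEMPLATE.format(prompt=raw_prompt))  # first pass: rewrite
        return call_llm(improved)                                        # second pass: answer

Under this assumption the "trick" lives in one template the model itself applies, which is why commenters question whether teaching specific prompting tricks will stay relevant.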
That's why you don't understand the dismissive comments. The reality is that the technology sucks for actually doing anything useful. Mandating that kids work with a poor tool just because it's trendy right now is the height of foolishness.
Yeah, yeah, you who knows better than everyone: you already know what they're going to teach from this press release, you already know it all, and that's why you have no use for AI.
With little apology for breaking the HN civility rules. "They did it first."
And I just know this is going to turn into a (pearl-clutching) AI Ethics course...
But I like to think that actually learning the history was important and it certainly was a diversion from math/chemistry/physics. I liked Shakespeare, so reading the plays was also worthwhile and discussing them in class was fun. Yeah, I was bored to tears in medieval history, so AI could have helped there.
If you're going to try to fake being able to write, better to try to dupe any other professor than a professor of English. (source: raised by English majors)
Why do you think it wouldn't do the same for other fields? The purpose of writing essays in school is never to have the finished product; it's to learn and analyze the topic of the essay and/or to go through the process of writing and editing it.
Part of this is very reasonable; AI is upending how students learn (or cheat), so adding a requirement to teach how to do it in a way that improves learning rather than just enhances cheating makes sense. The problem with the broad, top-down approach is it looks like what happens in Corporate America where there's a CEO edict that "we need a ____ strategy," and every department pivots projects to include that, whether or not it makes sense.
For the same reason that elementary schools don't allow calculators in math exams.
You first need to understand how to do the thing yourself.
So no, computers are not required to teach computer science.
Perhaps the world is going the direction of relying on an AI to do half the things we use our own brains for today. But to me that sounds like a sad and worse future.
I’m just rambling here. But at the moment I fail to see how current LLMs help people truly learn things.
When I heard that today, it sounded like self-serving partnership, and, frankly, incompetence.
This is not remotely the kind of thing that a school should be making a requirement at this time. The technology is changing way too fast to even be sure that basic fundamental skills related to it will remain relevant for as many as 4-5 years.
"all as informed by evolving workforce and employer needs"
“At the same time, it’s absolutely imperative that a requirement like this is well informed by continual input from industry partners and employers more broadly."
Purdue is engaging in the oldest profession in the world. And the students pay for this BS.
Not really, you're the one accelerating "reach and pace" based on hype, and you'd naively expect a more educated approach at institutions that educate.
Purdue, not necessarily uniquely but specific to its charter, does a really good job of focusing on workforce development in its engineering programs. They are very highly focused on staffing and training and less so on the science and research part, though that exists as well.
This tracks with what I would expect and is in line with what I think best practice should be.