Computer Science Courses That Don't Exist, but Should (2015)
Posted 2 months ago · Active about 2 months ago
prog21.dadgum.com · Tech · story · High profile
calm · positive
Debate
60/100
Key topics
Computer Science Education
Software Development
Programming
The article discusses hypothetical computer science courses that don't exist but should, sparking a discussion on the gaps in current CS education and the importance of practical skills.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 31m
Peak period: 149 comments (Day 1)
Avg / period: 22.9
Comment distribution: 160 data points
Based on 160 loaded comments
Key moments
- Story posted: Oct 23, 2025 at 10:22 PM EDT (2 months ago)
- First comment: Oct 23, 2025 at 10:52 PM EDT (31m after posting)
- Peak activity: 149 comments in Day 1 (hottest window of the conversation)
- Latest activity: Nov 4, 2025 at 3:31 PM EST (about 2 months ago)
ID: 45690045 · Type: story · Last synced: 11/20/2025, 8:18:36 PM
Although, to be fair, while it was helpful practice at coding, I'm not a game designer, so the resulting game was too terrible to play.
First year of uni, though, I spent too many hours in the lab, competing with friends to recreate arcade games on the PC. Skipping the game design part was helpful. To be fair, by then we had a glorious 640K of RAM. Some Assembly required.
The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.
It was a surprisingly effective course.
(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)
Glad that someone actually did this.
I've never been taught anything more clearly than the lessons from that class.
(I didn't believe him at the time, but in some ways he really didn't go far enough...)
Jenkins, Docker, Kubernetes, none of these sorts of things - and I don’t even mean these specific technologies, but rather nothing even in their ballpark.
Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.
There are enough interesting things here that you wouldn’t even need to make a tool heavy project style software engineering course - you could legitimately make a real life computer science course that studies the algorithms and patterns and things used.
Would make a great course.
It would be easy for me to agree with you. I hold a graduate degree in computer science, and I'm credited for my contributions proofreading/correcting a graduate text about algorithms in computability theory.
I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?
Yes, I very much think a computer science degree should be as close to the foundation of theory as possible. And learning Jenkins and Kubernetes, or even a general course on how to effectively push code, is still far from the things you listed.
There's so much computer science that isn't even covered that I'd include before adding courses on CI/CD.
AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but a vast majority of work in that field can not properly constitute “theory”
I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.
If it were me, I'd get rid of statically defined 4-year programs, and/or definite required courses for degrees, or just degrees in general. Just offer courses and let people come learn what they want.
One of my favorite classes was a Python class that focused on building some simple games with tkinter, making a chat client, hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.
On the other hand, I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, theory of programming languages, and the philosophy behind all of it that got us here.
The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.
It's hard to fit everything a student needs to know into the curriculum. Someone else posted here that they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!
I was a CS professor at a local college. My solution was to ignore CS1 and CS2 curriculum (we were not ABET accredited, so that's okay) in the second course of Java programming. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this stuff, but in the end I wrote my own learning materials and didn't use a book.
The course was a victim of its success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.
If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).
Huh. As a professor, I would not be able to grade this kind of volume in any serious capacity. Especially since proofs need to be scrutinized carefully for completeness and soundness. I wonder how their instructor manages.
a) Practical CI/CD requires understanding (and some practical experience) of many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience in these concepts, I spend about half of the semester teaching such devops basics instead of actual software engineering.
b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.
¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.
Alan Kay, my favorite curmudgeon, spent decades trying to remind us we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since. He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
The effective history of computing spans a lifetime or three.
There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.
Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: CPU speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital to analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.
My dictionary absolutely implies that; it even claims that all the sciences were split off from Philosophy and that a common modern topic of Philosophy is the theory of science. The point of Philosophy is to define truth in all aspects; how is that not science? It's even in the name: "friend of truth". Philosophy is even more fundamental and formal than mathematics. Mathematics asks what sound systems are, what properties they have, and how they can be generalized. Philosophy asks what something truly is, what it means to know, what it means to have a system, and whether it's real. The common trope of going ever more fundamental/abstract goes: "biology -> chemistry -> physics -> mathematics -> philosophy"
I once thought about a series of PHYS classes that focus on historical ideas and experiments. Students are supposed to replicate the experiments. They have to read book chapters and papers.
You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.
(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)
Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.
1: https://www.youtube.com/watch?v=QQhVQ1UG6aM
At 85 he has earned the peace of staying away from anything and everything on the internet.
https://www.quora.com/profile/Alan-Kay-11?share=1
The easy example I used to use to really blow people's minds on what was possible was Mathematica.
That is to say, it isn't so much lack of knowledge of history. It is lack of knowledge of the present. And a seeming unwillingness to want to pay for some things from a lot of folks.
Isn't that pretty much how things like simulink and gnu radio flowgraphs work?
Why? What problem did it solve that we're suffering from in 2025?
And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.
> Computation has overwhelming dependence on the performance of its physical substrate [...].
Computation theory does not.
This was clearly true in 01970, but it's mostly false today.
It's still true today for LLMs and, say, photorealistic VR. But what I'm doing right now is typing ASCII text into an HTML form that I will then submit, adding my comment to a persistent database where you and others can read it later. The main differences between this and a guestbook CGI 30 years ago or maybe even a dialup BBS 40 years ago have very little to do with the performance of the physical substrate. It has more in common with the People's Computer Company's Community Memory 55 years ago (?) using teletypes and an SDS 940 than with LLMs and GPU raytracing.
Sometime around 01990 the crucial limiting factor in computer usefulness went from being the performance of the physical substrate to being the programmer's imagination. This happened earlier for some applications than for others; livestreaming videogames probably requires a computer from 02010 or later, or special-purpose hardware to handle the video data.
Screensavers and demoscene prods used to be attempts to push the limits of what that physical substrate could do. When I saw Future Crew's "Unreal", on a 50MHz(?) 80486, around 01993, I had never seen a computer display anything like that before. I couldn't believe it was even possible. XScreensaver contains a museum of screensavers from this period, which displayed things normally beyond the computer's ability. But, in 01998, my office computer was a dual-processor 200MHz Pentium Pro, and it had a screensaver that displayed fullscreen high-resolution clips from a Star Trek movie.
From then on, a computer screen could display literally anything the human eye could see, as long as it was prerendered. The dependence on the physical substrate had been severed. As Zombocom says, the only limit was your imagination. The demoscene retreated into retrocomputing and sizecoding compos, replaced by Shockwave, Flash, and HTML, which freed nontechnical users to materialize their imaginings.
The same thing had happened with still 2-D monochrome graphics in the 01980s; that was the desktop publishing revolution. Before that, you had to learn to program to make graphics on a computer, and the graphics were strongly constrained by the physical substrate. But once the physical substrate was good enough, further improvements didn't open up any new possible expressions. You can print the same things on a LaserWriter from 01985 that you can print on the latest black-and-white laser printer. The dependence on the physical substrate has been severed.
For things you can do with ASCII text without an LLM, the cut happened even earlier. That's why we still format our mail with RFC-822, our equations with TeX, and in some cases our code with Emacs, all of whose original physical substrate was a PDP-10.
Most things people do with computers today, and in particular the most important things, are things fewer people have been doing with computers in nearly the same way for 30 years, when the physical substrate was very different: 300 times slower, 300 times smaller, a much smaller network.
Except, maybe, mass emotional manipulation, doomscrolling, LLMs, mass surveillance, and streaming video.
A different reason to study the history of computing, though, is the sense in which your claim is true.
Perceptrons were investigated in the 01950s and largely abandoned after Minsky & Papert's book, and experienced some revival as "neural networks" in the 80s. In the 90s the US Postal Service deployed them to recognize handwritten addresses on snailmail envelopes. (A friend of mine who worked on the project told me that they discovered by serendipity that decreasing the learning rate over time was critical.) Dr. Dobb's hosted a programming contest for handwriting recognition; one entry used a neural network, but was disqualified for running too slowly, though it did best on the test data they had the patience to run it on. But in the early 21st century connectionist theories of AI were far outside the mainstream; they were only a matter of the history of computation. Although a friend of mine in 02005 or so explained to me how ConvNets worked and that they were the state-of-the-art OCR algorithm at the time.
Then ImageNet changed everything, and now we're writing production code with agentic LLMs.
Many things that people have tried before that didn't work at the time, limited by the physical substrate, might work now.
See https://longnow.org/ideas/long-now-years-five-digit-dates-an...
But I hate it. It makes most readers stumble over the dates, and it does so to grind an axe that is completely unrelated to the topic at hand.
That argument actually strengthens the original point: Even though it's been that short, youngsters often still don't have a clue.
But the rhyme and reason for who is known is not at all obvious. Outside of "who is getting marketed."
I will agree that the "greats" seem to tend to know all of this. Such that I think I'm agreeing with your parenthetical there. But most practitioners?
I owned at one time a wonderful two-volume anthology called Actors on Acting, which collected analysis and memoir and advice going back... gosh, to Roman theatre, at least. (The Greeks were more quasi-religious, and therefore mysterious - or maybe the texts just haven't survived. I can't remember reading anything first-hand, but there has been a good deal of experimental "original practice" work done exploring "how would this have worked?"). My graduate scholarship delved into Commedia dell'Arte, and classical Indian theatre, as well as 20th century performers and directors like Grotowski, and Michael Chekhov, and Joan Littlewood. Others, of course, have divergent interests, but anyone I've met who cares can geek out for hours about this stuff.
However, acting (or, really, any performance discipline), is ephemeral. It invokes a live experience, and even if you (and mostly you don't, even for the 20th c) have a filmed version of a seminal performance it's barely anything like actually being there. Nor, until very recently, did anyone really write anything about rehearsal and training practice, which is where the real work gets done.
Even for film, which coincidentally covers kinda the same time-period as "tech" as you mean it, styles of performance - and the camera technology which enables different filming techniques - have changed so much, that what's demanded in one generation isn't much like what's wanted in the next. (I think your invocation of film directors is more apt: there are more "universal" principles in composition and framing than there are in acting styles.)
Acting is a personal, experiential craft, which can't be learned from academic study. You've got to put in hours of failure in the studio, the rehearsal room, and the stage or screen to figure out how to do it well.
Now, here's where I'll pull this back to tech: I think programming is like that, too. Code is ephemeral, and writing it can only be learned by doing. Architecture is ephemeral. Tooling is ephemeral. So, yes: there's a lot to be learned (and should be remembered) from the lessons left by previous generations, but everything about the craft pulls its practitioners in the opposite direction. So, like, I could struggle through a chapter of Knuth, or I could dive into a project of my own, and bump up against those obstacles and solve them for myself. Will it be as efficient? No, but it'll be more immediately satisfying.
Here's another thing I think arts and tech have in common: being a serious practitioner is seldom what gets the prize (if by that you mean $$$). Knuth's not a billionaire, nor are any of my favorite actors Stars. Most people in both disciplines who put in the work for the work's sake get out-shined by folks lucky enough to be in the right place at the right time, or who optimize for hustle or politics or fame. (I've got no problem with the first category, to be clear: god bless their good fortune, and more power to them; the others make me sad about human nature, or capitalism, or something.) In tech, at least, pursuing one's interest is likely to lead to a livable wage - but let's see where our AI masters leave us all in a decade, eh?
Anyway, I've gone on much too much, but you provoked an interesting discussion, and what's the internet for if not for that?
Studying history is not just, or even often, a way to rediscover old ways of doing things.
Learning about the people, places, decisions, discussions, and other related context is of intrinsic value.
Also, what does "material substrate" have to do with history? It sounds here like you're using it literally, in which case you're thinking like an engineer and not like a historian. If you're using it metaphorically, well, art and philosophy are absolutely built on layers of what came before.
Dude, watch the original Star Trek from the 1960s; you will be surprised.
You might also be surprised that all the AI stuff that is so hyped nowadays was already invented in the 1960s; they just didn't have our hardware to run large models. Read up on neural networks.
Given the pace of CS (like you mentioned) 50 years might as well be centuries and so early computing devices and solutions are worth studying to understand how the technology has evolved and what lessons we can learn and what we can discard.
"Computer" goes back to 1613 per https://en.wikipedia.org/wiki/Computer_(occupation)
https://en.wikipedia.org/wiki/Euclidean_algorithm was 300 BC.
https://en.wikipedia.org/wiki/Quadratic_equation has algorithms back to 2000 BC.
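For a sense of just how old some of this material is, Euclid's algorithm still fits in a few lines; a minimal Python sketch:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm (~300 BC): repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21
```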
This seems to fundamentally underestimate the nature of most artforms.
That's absolutely false. Do you know why MCM furniture is characterized by bent plywood? It's because we developed the glues that enabled it during World War II. In fashion you had a lot more colors beginning in the mid-1800s because of the development of synthetic dyes. Really odd that oil paints were perfected around Holland (a major place for flax and thus linseed oil), which is what the Dutch masters _did_. Architectural McMansions began because of the development of pre-fab roof trusses in the 70s and 80s.
How about philosophy? Well, the Industrial Revolution and its consequences have been a disaster for the human race. I could go on.
The issue is that engineers think they're smart and can design things from first principles. The problem is that they're really not, and they design things from first principles anyway.
Consider transport. Millennia ago, before the domestication of the horse, the fastest a human could travel was by running. That's a peak of about 45 km/h, but around 20 km/h sustained over a long distance for the fastest modern humans; it was probably a bit less then. Now that's about 900 km/h for commercial airplanes (45x faster) or 3500 km/h for the fastest military aircraft ever put in service (178x faster). Space travel is faster still, but so rarely used for practical transport I think we can ignore it here.
My current laptop, made in 2022 is thousands of times faster than my first laptop, made in 1992. It has about 8000 times as much memory. Its network bandwidth is over 4000 times as much. There are few fields where the magnitude of human technology has shifted by such large amounts in any amount of time, much less a fraction of a human lifespan.
Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).
Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!
Art or philosophy might or might not make progress. No one can say for sure. They are bad role models.
Micro-economics is much more approachable with experiments etc.
Btw, I didn't suggest to completely disregard history. Physics and civil engineering don't completely disregard their histories, either. But they also don't engage in constant navel gazing and re-hashing like a good chunk of the philosophers do.
As opposed to ours, where we're fond of subjective regression. ;-P
https://www.cs.rit.edu/~swm/history/index.html
Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon
>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.
it's Alan Kay running in circles
In my observation the problem rather is that many of the people who want to "learn" computer science actually just want to get a certification to get a cushy job at some MAGNA company, and then they complain about the "academic ivory tower" stuff that they learned at the university.
So, the big problem is not the lack of competent educators, but practitioners actively sabotaging the teaching of topics that they don't consider to be relevant for the job at a MAGNA company. The same holds for the bigwigs at such companies.
I sometimes even entertain the conspiracy theory that if a lot of graduates saw that what their work at these MAGNA companies involves is, judging from the history of computer science, often decades old and has been repeated multiple times over the decades, this might demotivate employees who are supposed to believe that they work on the "most important, soon to be world-changing" thing.
How far we have fallen, but so great the reward if we could "lift ourselves up again." I have hope in people like Bret Victor and Brenda Laurel.
It's funny, my impression had been that Mansfield was legislating from a heavily right-wing standpoint, of 'Why are we tax-and-spending into something that won't help protect the sanctity of my private property?' I was pleased to see the motivations were different. The fact that so few have articulated why they invoke Mansfield underscores why I was able to adopt such an orthogonal interpretation of Mansfield's amendment prior to this discussion. All the best, AfterHIA. -ricksunny
Q: "Sooo... what does this do that Ansible doesn't?"
A: "I've never heard of Ansible until now."
Lots of people think they are the first to come across some concept or need. Like every generation when they listen to songs with references to drugs and sex.
I think software engineering has social problems to a degree that other fields just don't. Dogmatism, superstition, toxicity ... you name it.
You must not know any electricians. These behaviors are far from unique to one field.
Software development/programming is a field where the importance of planning and design lies somewhere between ignored and outright despised. The role of software architect is both ridiculed and vilified, whereas the role of the brave solo developer is elevated to the status of hero.
What you get from that cultural mix is a community that values ad-hoc solutions made up on the spot by inexperienced programmers who managed to get something up and running, and at the same time is hostile towards those who take the time to learn from history and evaluate tradeoffs.
See for example the cliche of clueless developers attacking even the most basic aspects of software architecture such as the existence of design patterns.
With that sort of community, how does anyone expect to build respect for prior work?
I think one of the problems is that if someone uses a word, one still does not know what it means. A person can say 'design patterns' and what he is actually doing is a very good use of them that really helps to clarify the code. Another person can say 'design patterns' and is busy creating an overengineered mess that is not applicable to the actual situation where the program is supposed to work.
I see that more from devs in startup culture, or shipping products at a software-only company.
It is a very different mindset when software is not the main business of a company, or in consulting.
I think assessing a piece of hardware/software is much more difficult and time-consuming than assessing art. So there are no people with really broad experience.
The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
So we can estimate the Cray's scalar performance at 400× the TRS-80's. On that assumption, Quicksort on the TRS-80 beats the Cray somewhere between 10000 items and 100_000 items. This probably falsifies the claim—10000 items only fits in the TRS-80's 48KiB maximum memory if the items are 4 bytes or less, and although external sorting is certainly a thing, Quicksort in particular is not well-suited to it.
But wait, BASIC on the TRS-80 was specified. I haven't benchmarked it, but I think that's about another factor of 40 performance loss. In that case the crossover isn't until between 100_000 and 1_000_000 items.
So the claim is probably wrong, but close to correct. It would be correct if you replaced the TRS-80 with a slightly faster microcomputer with more RAM, like the Apple IIGS, the Commodore 128, or the IBM PC-AT.
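The crossover estimate above can be sketched numerically. This is a toy Python model with assumed speed ratios and no per-comparison constant factors, so the exact numbers are illustrative rather than measurements:

```python
import math

CRAY_SPEEDUP = 400      # assumed: Cray scalar code ~400x faster than the TRS-80
BASIC_PENALTY = 40      # assumed: interpreted TRS-80 BASIC costs another ~40x

def crossover(slowdown):
    """Smallest power-of-two n where n*log2(n) on the slow machine beats n^2 on the fast one."""
    n = 2
    while slowdown * n * math.log2(n) >= n * n:
        n *= 2
    return n

print(crossover(CRAY_SPEEDUP))                  # 8192: compiled-speed TRS-80, just under 10^4
print(crossover(CRAY_SPEEDUP * BASIC_PENALTY))  # 524288: interpreted BASIC, between 10^5 and 10^6
```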
The reason is very simple: Python takes longer for a few function calls than Java takes to do everything. There's nothing I can do to fix that.
I wrote a portion of code that just takes a list of 170ish simple functions and runs them, and they are such that it should be parallelizable, but I was rushing and just slapped the boring serialized version into place to get things working. I'll fix it when we need it to be faster, I thought.
The entire thing runs in a couple nanoseconds.
So much of our industry is writing godawful interpreted code and then having to do crazy engineering to get stupid interpreted languages to go a little faster.
Oh, and this was before I fixed it so the code didn't rebuild a constant regex pattern 100k times per task.
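Whatever the language, the usual fix for that last point is to hoist the constant pattern out of the hot path; a minimal Python sketch with a hypothetical pattern and workload:

```python
import re

lines = ["job-42 ok", "job-7 failed"] * 50_000   # hypothetical workload

# Wasteful: re.compile runs inside the hot loop on every iteration.
def count_failures_naive(lines):
    n = 0
    for line in lines:
        if re.compile(r"job-(\d+) failed").search(line):
            n += 1
    return n

# Better: compile the constant pattern once, up front, and reuse it.
FAILED = re.compile(r"job-(\d+) failed")

def count_failures(lines):
    return sum(1 for line in lines if FAILED.search(line))

assert count_failures_naive(lines) == count_failures(lines) == 50_000
```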
But our computers are so stupidly fast. It's so refreshing to be able to just write code and it runs as fast as computers run. The naive, trivial to read and understand code just works. I don't need a PhD to write it, understand it, or come up with it.
There are many cases where O(n^2) will beat O(n).
Utilising the hardware can make a bigger difference than algorithmic complexity in many cases.
Vectorised code on linear memory vs unvectorised code on data scattered around the heap.
Premature optimisation is a massive issue; spending days working on finding a better algorithm is often not worth the time spent, since the worse algorithm was plenty good enough.
The real world beats algorithmic complexity many, many times: you spend ages building a complex data structure with a bunch of heap allocations scattered all over the heap to get O(N), while it's significantly faster to just do the stupid thing that sits in linear memory.
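A quick way to feel the locality and vectorisation point from Python itself; a rough sketch that assumes NumPy is available, with timings that are machine-dependent:

```python
import math
import time

import numpy as np  # assumed available

N = 5_000_000
xs = np.random.rand(N)   # one contiguous block of 5M doubles
boxed = xs.tolist()      # the same values as individually boxed Python floats

def python_sum(values):
    # Unvectorised: a Python-level loop touching one boxed object at a time.
    total = 0.0
    for v in values:
        total += v
    return total

t0 = time.perf_counter(); slow = python_sum(boxed); t1 = time.perf_counter()
t2 = time.perf_counter(); fast = float(xs.sum()); t3 = time.perf_counter()

print(f"Python loop over boxed floats:        {t1 - t0:.3f} s")
print(f"vectorised sum of contiguous doubles: {t3 - t2:.3f} s")
print("results agree:", math.isclose(slow, fast, rel_tol=1e-6))
```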
Gave a good idea of how python is even remotely useful for AI.
https://www.amazon.ca/Visual-Basic-Algorithms-Ready-Run/dp/0...
https://mpv.io/manual/stable/#options
https://ffmpeg.org/ffmpeg.html#Options
Unlearning Object-Oriented Programming: a course on specific software engineering techniques
Classical Software Studies: a course on the history of software tools
Writing Fast Code in Slow Languages: a course on specific engineering techniques
User Experience of Command Line Tools: an engineering design course
Obsessions of the Programmer Mind: course about engineering conventions and tools.
One day, the name of science will not be so besmirched.
This is just a common meme that often comes from ignorance, or a strawman of what OOP is.
> CSCI 4020: Writing Fast Code in Slow Languages. Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.
I like this one, but see?
Python is heavily OOP; everything in Python is an object, for example.
I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.
I strongly disagree. How is everything being called an object in any way "heavily OOP"? OOP is not just "I organize my stuff into objects".
You can write OOP code with python but most python code I've seen is not organized around OOP principles.
Do I need to spell it out? The O in OOP stands for object. Everything is an object, therefore it is object-oriented. It's not much more complex than that, man.
And I don't mean that it supports users writing oop code, I mean that the lang, interpreter and library are themselves written with oop. Inheritance? Check. Classes? Check. Objects? Check. Even classes are an object of type MetaClass.
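For concreteness, here is the "everything is an object" claim as it plays out in a standard Python interpreter (the default metaclass is `type`):

```python
def greet():
    return "hi"

class Point:
    pass

print(isinstance(42, object))     # True: ints are objects
print(isinstance(greet, object))  # True: functions are objects
print(isinstance(Point, object))  # True: classes are objects too
print(type(Point))                # <class 'type'>: the default metaclass
print(type(type))                 # <class 'type'>: even type is an instance of itself
```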
Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.
Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.
I have seen a disconnect between what is covered in ethics classes and the types of scenarios students will encounter in the working world. My (one) ethics class was useless. But it was not political, even with the redrawn ethical map of the Trump era.
I'm not really sure how you could totally separate politics (forming laws) from ethics anyway.
Other points to note: All AI will always have bias; a) because neural networks literally have a constant called 'bias' in them; and b) because training a prediction algorithm means it is being trained to be biased, and running a clustering algorithm means lumping commonly themed attributes together.
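On point (a), the 'bias' is the additive constant each neuron applies before its activation; a minimal sketch with made-up numbers:

```python
def neuron(weights, bias, inputs):
    """A single artificial neuron: relu(w . x + b); b is literally the 'bias' term."""
    pre_activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return max(0.0, pre_activation)  # ReLU activation

print(neuron([0.4, -0.2], bias=0.1, inputs=[1.0, 2.0]))  # 0.4 - 0.4 + 0.1 = 0.1
```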
If you mean "BIAS" as in "RACE/ETHNICITY/SOCIOECONOMIC_STATUS", then most groups already do this, state they do it, and still deal with the general public not believing them.
If you mean "BIAS" as in avoiding "RACE/ETHNICITY/SOCIOECONOMIC_STATUS", most groups state they do not use them for decision making and still deal with the "general public" not believing them because results do not support their opinions.
This isn't a "Trump Era"/"Right Wing Talking Point" amigo, its the truth; also it's always been this way, listen to some punk rock. Is AI allowed to generate child pornography? Because that is currently being argued as protected by Free Speech in courts because no child was actually involved [1]. Sorry your ethics class was useless, I don't believe mine is.
[1] https://www.usatoday.com/story/news/nation/2024/11/21/pennsy...
139 more comments available on Hacker News