ChatGPT is Killing Homework. That’s a Good Thing.

Last November, the artificial intelligence company OpenAI released ChatGPT, an incredibly accessible chatbot that can answer questions, generate text, and even write poetry. It can simulate conversations between historical figures, play chess (though it likes to cheat), and write in the style of any well-known author. It can also summarize, annotate, and translate human text, do discrete math, write code in most programming languages, solve riddles, play role-playing games, write blog posts, and provide therapy.

But more importantly, it can do my homework. 

I am a double major in philosophy and computer science, but ChatGPT cares not whether the class is STEM or humanities. A discussion post? Done. A programming assignment? Easy. A math problem? No problem. 

When I first discovered that ChatGPT could do my job for me, I was delighted. Then came a stark realization: ChatGPT can do my job for me. If one bot can do the majority of the work for my degree, what the hell am I paying for?

The knee-jerk reaction from some professors is to ban the use of ChatGPT. Here's the problem with that: when you ban something you have no way of monitoring (ChatGPT detection tools are incredibly weak), you just penalize the kids who actually follow the rules. If you're a CS professor and you say, "Don't use AI on this programming assignment," the kids who don't use AI will spend five hours on the assignment, while the kids who do will spend fifteen minutes. It's a horrendously bad incentive structure.

But there's another, deeper problem with banning AI tools like ChatGPT: if they are this useful right now, they will undoubtedly be useful in the real world. Working programmers, consultants, artists, and writers are already using them.

In second grade, we teach kids how to add, subtract, and multiply on paper. By the time they reach middle school, we hand them calculators, and once they have a tool that handles the basics, we start teaching them more complex math so they can build up their knowledge from that foundation. That is precisely how colleges need to treat these AI tools. Anything ChatGPT can do in half a second is not worth doing outside of an introductory course.

So perhaps it's more productive to ask: what can't ChatGPT do? Those are the skills we should be developing through our education.

To a language model, words are purely relational. Drawing on the vast amounts of data it was trained on, ChatGPT generates each word based on its statistical relationship to the prompt and to the words it has already generated. When humans use language, though, it's not just syntax and semantics; we bring observations and insights from the outside world onto the page.
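For the curious, here is a minimal Python sketch of that word-by-word loop. The score_next_word function is a hypothetical stand-in for the model's learned scoring, which in reality comes from billions of trained parameters rather than random weights:

    import random

    def score_next_word(context, vocabulary):
        # Stand-in for a trained language model: weight each candidate
        # word using only the words that came before it. A real model
        # learns these weights from vast amounts of training data.
        return {word: random.random() for word in vocabulary}

    def generate(prompt, vocabulary, length=10):
        words = prompt.split()
        for _ in range(length):
            scores = score_next_word(words, vocabulary)
            # Choose the next word in proportion to its score. Nothing
            # here consults the outside world, only the prior text.
            candidates, weights = zip(*scores.items())
            words.append(random.choices(candidates, weights=weights)[0])
        return " ".join(words)

    print(generate("The homework was", ["easy", "late", "finished", "graded", "long"]))

The point of the sketch is its structure, not its output: every word is chosen from prior text alone, which is exactly why grounding in real experience is the one thing the model cannot supply.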

Many humanities classes, however, focus only on theory and are detached from the outside world. In the words of the author John Warner:

Students are rewarded for, at best, regurgitating existing information. Now we have GPT3, which, in seconds, can generate surface-level correct prose on just about any prompt. That this seems like it could substitute for what students produce in school is mainly a comment on what we value.

Most of my humanities classes have followed a pretty similar format: I read text from a screen and then wrote about that text on a screen. But my best class, the one where I learned and grew the most intellectually, was an ethics course where we would go to nearby middle schools to coach debate. The coursework felt so much more real because it wove together ethical theory with actual volunteer work.

Many STEM classes would also benefit from having students apply math and science more practically. The CS course where I learned the most was one where I had to make a video game; instead of memorizing one concept after another, I had to actually build something. On these bigger, more creative projects, ChatGPT starts to feel far more like a tool: it can help me edit and write parts of my code, but I still have to plan the whole project strategically.

Students, professors, and administrators should focus on the elements of education that are distinctly human, like community engagement and creative projects. If your coursework can be automated, then perhaps you shouldn't be doing it in the first place.
