In a comment on a recent post about learning, a person asked my opinion about ChatGPT, the new artificial intelligence software from OpenAI. This software is causing quite a stir in academic realms and has even made CNN and The Wall Street Journal. Douglas Rushkoff, a media theorist and critic of modern technologies, discussed some of the real dangers of the software, as opposed to the many surface-level issues. He noted that its use ultimately separates us from each other: writer from reader, teacher from student, and so on.
Among the surface-level issues: students can plug a prompt from their class into the software, and it will generate responses and even full academic papers. The student can then copy the AI-written work and submit it as their own. According to several professors who have tried and studied the software, the resulting papers are grammatically correct but miss the mark in other respects. Another concern is that people can use it to create fake news.
The commenter on my post specifically asked if I thought this technology could “actually make us powerless, make us think less, make us achieve less by being reliant on other things.” My answer is a resounding “yes.”
I am a media ecologist. Media, in our perspective, going back to Marshall McLuhan, is any technique or technology that extends human capabilities and central nervous systems. This comes with the reality that as a medium extends, it also obsolesces and amputates.
For instance, while writing allowed humans to transmit messages to future generations, what we might call time-binding (see Alfred Korzybski and General Semantics), it also hampered their ability to memorize. This is the contrast between orality and literacy, oral culture and literate culture (see the work of Walter Ong for more on this topic).
For a short explanation, media ecologists often cite Plato's dialogue Phaedrus, in which Socrates recounts the exchange between King Thamus and the god Theuth. Theuth, the Egyptian god credited with inventing writing, wants to give humanity writing as a way to communicate. King Thamus objects, arguing that it would lessen people's ability to remember things: they could write things down without the need to commit them to memory.
Deferring to (perhaps acquiescing to is a better term) AI and modern technologies amputates our abilities to do intellectual work, much like a calculator relieves us of the struggle of pen/paper or mental computation.
The human condition has nearly always been plagued by the path of least resistance, or what Jacques Ellul called efficiency. But that drive is, in many ways, counterproductive to human longevity and to finding meaning in the struggle. The quote attributed to Viktor Frankl, "When a person can't find a deep sense of meaning, they distract themselves with pleasure," is a great way to put it. We find meaning amid our struggles, not in our pleasures. With AI-written papers, there is no struggle and, therefore, no meaning.
Neil Postman’s book, Amusing Ourselves to Death, is also a great example of what I mean. Postman often asked of a technology: “What purpose does it serve? What problem does it solve?” If the purpose is to streamline writing, I think AI accomplishes that goal. But I don’t believe that ultimately makes us better human beings.
Writing to Learn
In my doctoral program at Duquesne University, the professors emphasize writing to learn, making the process less about outcomes or grades and more about letting the student gain knowledge through the activity itself. When we remove that purpose from the endeavor, the entire process becomes trivial. No learning occurs, which weakens mental capacity and engagement; a dumbing down, if you will.
As Troy Headrick wrote, learning is “the fundamental change that occurs within a human mind and heart when that person grows intellectually and emotionally.” We can’t grow without the struggle. These are a few of my thoughts on the matter.
You can find more of my work at www.thephilosophicalfighter.com. Thanks for reading and I look forward to hearing from you.