I am excited to announce that from now on, important blog posts about AI, education, and digital media will also be available in English. I am looking forward to engaging with international teachers, tutors, and experts. Your feedback, comments, and the sharing of these articles will be greatly appreciated and highly valued. Let’s connect and discuss these vital topics together! The original German version of this article can be found here.
Artificial Intelligence is on everyone’s lips. OpenAI’s ChatGPT is celebrated in thousands of posts and described in numerous articles. And indeed: how the program seemingly “answers” questions is astonishing at first glance. Does this mean traditional education is obsolete? On the contrary.
What kinds of content can be fed into the program has been described in many other places. So let’s take a typical example of a question and the answer that ChatGPT gives us:
What can be done about the teacher shortage?
To combat the teacher shortage, it can be helpful to take measures that encourage teachers to stay in the profession and increase the attractiveness of the teaching profession. Some possibilities are:
– Improving the working conditions and pay of teachers to make the profession more attractive and reduce the migration to other professions.
– Creating continuing education and professional development opportunities to help teachers expand their skills and knowledge and increase their motivation.
– Introducing mentoring programs where experienced teachers support new teachers and help them acclimate to the profession.
– Enhancing the public profile of the teaching profession to increase recognition and appreciation for the work of teachers, thus making the profession more attractive to young people.
These are just some examples of measures that can help address the teacher shortage. It is important that the measures are tailored to the needs and requirements of teachers and schools to be truly effective.
What does this mean?
At first glance, this all sounds good. Does this solve the teacher shortage? No, of course not. The points briefly outlined here are already known – or in part even wrong (the issue of pay is more complex, and professional development is not the main problem here).
Moreover, what can be identified as correct is already known, because what ChatGPT does – and this is a realization that should seep into the discussion – has nothing to do with “intelligence”. Platform expert Vicki Boykis addresses this misunderstanding in a Twitter thread. Her tl;dr conclusion (my translation):
MY PERSONAL TL;DR OPINION IS: IT DOES SOME COOL GENERATIVE STUFF THAT WILL BE GOOD FOR CHATBOTS, CUSTOMER SUPPORT, CODE HELPERS, AND FIRST DRAFTS. ULTIMATELY, IT WON’T “TURN SOCIETY UPSIDE DOWN” FOR A VARIETY OF REASONS.
It would be too lengthy to list all these reasons, so here is one from the thread: the AI does not “answer” a “question”; rather, the question serves as input to an algorithm that arranges words in a sequence resembling how words have been arranged after similar “questions” before. This is impressive. But it has nothing to do with intelligence, nor does it signify the end of “education” as we know it.
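For readers who want to see the principle rather than just read about it, here is a minimal sketch in Python – my own toy illustration, not how ChatGPT is actually built. It continues a text simply by choosing words that have followed the previous word in text it has already seen. ChatGPT replaces these word counts with a huge neural network trained on vast amounts of text, but the task remains this kind of sequence continuation, not “understanding”.

```python
# Toy illustration (my own, hypothetical example): continue a text by picking
# words that have followed the previous word in previously seen text.
import random
from collections import defaultdict

corpus = (
    "to combat the teacher shortage it can be helpful to take measures "
    "that make the teaching profession more attractive"
).split()

# Count which word has followed which.
followers = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    followers[current_word].append(next_word)

def continue_text(start: str, length: int = 8) -> str:
    """Extend a prompt word by word, always choosing a word that has
    followed the previous one somewhere in the corpus."""
    words = [start]
    for _ in range(length):
        candidates = followers.get(words[-1])
        if not candidates:
            break
        words.append(random.choice(candidates))
    return " ".join(words)

print(continue_text("teacher"))
# e.g. "teacher shortage it can be helpful to take measures"
```

The output looks plausible because it recombines what was already there; nothing in the procedure involves understanding the teacher shortage.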
Wrong and Right Questions
Of course, this does not mean that we should not – especially in the educational sector – think about the significance of a program that can seemingly answer any form of question comprehensibly. But in my view, the questions are sometimes being asked incorrectly.
For example, if someone has ChatGPT write a text and believes that this text is just as good as one they would have written themselves, the more sensible first question is what this says about their own text, not about the brilliance of the program. Because we are (still) dealing with a program that, in the sense described above, does no more than produce – mostly coherent and formally correct – rearrangements of known material. Thus, ChatGPT is not much more than an impressive version of Beat Döbeli-Honegger’s Blahfasel generator.
But isn’t one’s own description, one’s own text, also just a form of rearrangement? That would be one of the right questions, one that goes beyond mere word arrangement and into thinking about focus, detail, connections, and so on.
You might also ask: Are you sure that I – Bob Blume – wrote this text? If so, why?
Of course, this does not mean that the next version of the AI, or the one after that, might not go even further: imitate my style, formulate deeper analyses, and so on. It simply means that the current version, impressive as it is, is an impressive imitation of meaning.
What does this mean for schools?
Beyond the shrugging agreement that one can of course work with ChatGPT in school – be it to practice imperatives, to practice detail questions and assess how they are integrated, or to check one’s own skills – one conclusion that I have read time and again seems rather odd to me.
Amid the excited evaluations, one reads that virtually everything one could set as a task is now redundant because the AI can basically solve it. Really?
I don’t quite understand the difference between ChatGPT and Wikipedia (well, I do understand it, but I want to create a bit of cognitive dissonance). Or let’s take a step further back.
Let’s consider a teacher who has explained everything to his class and then realizes that there is an encyclopedia. “Everything changes!” he exclaims. “How will I ever do anything of significance again? It’s all already collected.” Or put another way:
A change in the means of access does not change the fact that the person doing the accessing must be competent.
How do we achieve this competence? This is another important question.
A 180-Degree Turn
First, I offer an answer whose conservative radicalism is not my own; we need it, however, to counter knee-jerk reactions.
One answer to the question of how we respond to the fact that now an “AI” can “answer” everything might be to say: Therefore, we need to know more! Therefore, we need to write more! Therefore, we need to work less with technology!
I would wish for everyone to allow this thought, only to say in the next step: no, a complete retreat can’t be the answer either. So what, then? Exactly! That’s the problem. We need to talk about what this means for learning.
Which Learning Are We Talking About?
The articles I have read so far start from a truncated concept of learning (one that is also prevalent in schools). One might formulate it as: learning is whatever appears at the end as a correct result, as proof of a skill. Learning, in this view, is the result.
Accordingly, the surprise is great: ChatGPT can calculate, can write, seems to know everything. What does this mean for our children and adolescents? Wrong question.
Once again: whether students can think more easily, more quickly, and more efficiently is completely irrelevant. The much more pressing question is:
How can learning become a process that is perceived as so meaningful in its individual and dialogical deepening that it leads to the development of the (young) person’s cognitive, physical, personal, and professional abilities?
Whether ChatGPT is used to support or expand such learning is then almost beside the point, because meaning arises in the learner’s own action and reflection, not in the way a result was produced.
Lost or Won?
If we judge ChatGPT by how well it can imitate the results of young people, we have already lost. It doesn’t matter whether someone thinks the performance level is 8th, 9th, or 10th grade.
I don’t know how it is for others, but the friendly reminder to try a task yourself first instead of sharing it via WhatsApp is not unfamiliar to me. Because the goal is not a formally perfect result; the goal is the acquisition process itself, especially when it results in something we initially judge as “not correct.”
So if we panic a little less and instead consider how we want to accompany a process of learning in which the result is less important, we have already gained a lot.
This text was written by ChatGPT. No, it wasn’t. Or was it? How would you know?