Wednesday, December 03, 2025

Are We Really Teaching Artificial Intelligence

According to Code.org

The Hour of Code is now the Hour of AI

The Hour of AI makes teaching AI literacy easy, engaging, and fun. Empower your students to become the next generation of innovators with AI.

Of course, the website still has a lot of activities labeled “Hour of Code,” but it raises the question: What does it mean to teach artificial intelligence?

In a conversation, a very smart friend recalled how there was a time when we taught Office applications (word processing, spreadsheets, and the like) and called those classes computer science classes. That’s not real computer science, and the CS education community has fought that characterization for years. With some success!

My friend made the comment that much of what is called teaching AI today is the equivalent of teaching the Office applications of AI, not the science of AI. For example, is teaching AI just showing students how to use an existing AI tool and having them train it to recognize some type of object? Is that the same as teaching how the AI works or what sorts of algorithms are behind the training? Of course not.

Likewise, teaching students how to write good prompts for an LLM is not the same as teaching how an LLM works. In fact, I would argue, it’s not any different from teaching students to write good instructions for another person.

Now I am not saying that learning how to write a prompt or train a machine learning tool is not valuable. Clearly it is. Arguably it is even necessary. But is that really teaching artificial intelligence in the same way that a course like AP CS teaches computer science? Or is it more like teaching applications without teaching how they work?

I have heard the argument that students are not ready to learn how AI works or that the algorithms are too complicated. I would agree that students may not be ready to create an AI at the level of ChatGPT, but that doesn’t mean they cannot handle the concepts behind that sort of software.

The truth is that a lot of what AI, especially machine learning and large language models, is doing is not really that new. What is new is that we have lots more data for AI to work with and processing speed that is a lot faster than what we used to have. Machine learning is heuristics on steroids. LLMs are data analysis with vastly more data and faster processors.
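To make that concrete, here is a minimal sketch in Python of “learning” as guess-and-check. The tiny data set and the pass/fail threshold rule are entirely my own toy example, not any particular curriculum’s activity: the program tries every possible rule and keeps the one that makes the fewest mistakes, which is the root idea behind a lot of machine learning.

# A minimal sketch (my own toy example) of "learning" as guess-and-adjust:
# find a score threshold that separates pass/fail examples by trying
# candidate rules and keeping the one with the fewest mistakes.

# Tiny made-up data set: (score, passed?)
data = [(35, False), (42, False), (48, False), (55, True), (61, True), (70, True)]

def mistakes(threshold):
    """Count how many examples the rule 'score >= threshold means pass' gets wrong."""
    return sum((score >= threshold) != passed for score, passed in data)

threshold = 0
best = mistakes(threshold)
for candidate in range(0, 101):      # brute-force "training": try each possible rule
    errors = mistakes(candidate)
    if errors < best:                # keep the rule that makes fewer mistakes
        best, threshold = errors, candidate

print(f"Learned threshold: {threshold} (mistakes on training data: {best})")

Nothing here is beyond a first-year programming student, yet it captures the essence of training: guess, measure errors, keep the better guess.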

We had software learning by asking questions and trying different options 50 years ago. Now we have the software asking itself the questions and finding new paths based on data. We had rule-based software for decades, but now we have better algorithms to evaluate data against rules. We’ve been studying text looking for hidden meanings throughout history. Now we have more text for algorithms to analyze and the ability to analyze it in more ways in less time. We can start with simple data sets and basic concepts that teach the roots of AI. Students can deal with it. They do that sort of thing with human intelligence all the time.
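As one example of the kind of simple starting point I mean, here is a sketch, again with a made-up toy text and helper names of my own, of the root idea behind language models: count which word follows which, then predict the most common follower.

# A minimal sketch of the idea behind language models: count word pairs,
# then predict the next word. The toy text is my own illustration.
from collections import Counter, defaultdict

text = ("the cat sat on the mat the cat ate the fish "
        "the dog sat on the rug the dog ate the bone")
words = text.split()

# Build a table: for each word, count the words that come right after it.
followers = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the most frequent word seen after `word`, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))   # the most common word seen after "the" in the toy text
print(predict_next("cat"))   # "sat" or "ate" (each appears once; ties go to the first seen)

Students who can count word pairs in a toy text can reason about what happens when the text is the whole internet and the counting is done by much fancier math. That is a conversation about how the thing works, not just how to use it.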

If we are serious about preparing students for the future of AI, we really need to get serious about teaching some depth of concepts. Let’s not stick with the Office applications equivalent; let’s move on to the real science.
