Monday, February 23, 2026

Who Is Driving Changes to Computer Science Education

There are a lot of changes happening at Code.org. The Slashdot article linked there lists several of them. While the changes include a number of personnel changes, including President Cameron Wilson stepping aside, Chief Academic Officer Pat Yongpradit leaving to join Microsoft, and some staff layoffs, the change in direction, to AI, may be the most concerning. From Hour of Code to Hour of AI? Some interesting comments follow that post.

The questions at the top of my mind are "who is driving the direction of CS education?" and "is CS education moving in the right direction?" A lot of people believe that industry is pushing CS education in the direction of being vocational. The new focus on Artificial Intelligence often feels like a vocational direction.

My involvement with computer science education predates Code.org and even CSTA, so I have seen a lot of changes. In my first teaching days computer science teachers were pretty isolated. There was SIGCSE, which accepted K12 teachers, though the welcome sometimes felt aspirational rather than actual. ACM, of which SIGCSE was and still is a part, provided some support for CS education. Cameron Wilson was a huge part of that and worked on policy.

CSTA was developed by some wonderful people in and around ACM. This started the real movement toward expanding K12 CS education. CSTA helped train and organize teachers to push for more CS education. Code.org came a bit later and brought something new to the effort.

Code.org brought money and industrial production values: from the first set of videos that went viral to some very good curriculum resources, as well as connections to industry and political leaders. Efforts to get policymakers to push for CS education stepped up.

We’ve come a long way.

Coming back to my earlier questions. Is industry driving the direction that CS education is moving? A lot of people think so. Industry has money, and it has funded a lot of the work by Code.org and CSTA. The modern Golden Rule is that the people with the gold make the rules, after all.

Industry has some motivation here. I spent a few years working at Microsoft myself, where my job was to promote the use of Microsoft tools for teaching. I didn’t get much direction about what to teach. I always felt that teachers should decide what to teach, and I just wanted to help teachers find ways to use tools to teach those concepts. Teaching computer science as vocation was always there though. Senior managers often told me that industry needed more people to know CS because there were jobs that needed to be filled.

CS as vocation has always been a selling point for CS education, of course. It’s what helped sell school boards and other elected officials. Among teachers that was usually a secondary motivation. For a lot of teachers, including me over time, CS education became more about understanding how the world works. We don’t teach physics because we want to make more physicists. We teach it so that students understand the world around them.

People who are not working for tech companies often have to use computers and make decisions about computing: from spreadsheets to databases to internet searches, and now AI. People in all walks of life use computers. Understanding computer science can make those people more efficient. Computers are an important part of our world.

It seems like all the big tech companies are betting huge sums of money on AI. There is a lot of pressure to move the direction of CS education into AI. Is the industry push vocational in intent? Is it all about helping these companies make money? CSTA and Code.org are both pushing AI these days. Is this because of industry (gold making the rules?) or would it be happening independently?

That leads to the second question – are we moving in the right direction? I think that question may be different for K12 and for university. Personally, I still think CS education in K12 should be about understanding and not vocational. Someone else can address higher education but K12 should be about preparation for life and not for vocation at least in comprehensive schools.

So is AI the right direction? I think it is indisputable that AI is important to learn. Students should learn prompting, and they should learn what AI can and cannot do. They should also learn how to think about what AI should not do. They need to know something about how AI works, and that is core computer science.

I think that computer science, in the old analogy, is the dog and AI is the tail. The tail should not wag the dog. Making AI the focus at the expense of basic computer science would be a huge mistake. We do have to teach the basics that make AI possible. Students need to understand where AI comes from and where it might go. Understanding code is an essential part of that understanding. There is always going to be more to CS than just AI. We didn’t stop teaching arithmetic when calculators were invented. We should not assume that AI code writers mean we can stop teaching basic computer science.

CS in K12 should not be just vocational. Is industry driving CS education? I fear they may be. Are we moving in the wrong direction? Maybe. If so, it will be up to educators to provide some course correction. 

Saturday, February 07, 2026

AI Tutors and the Human Connection

I recently shared a quote on Facebook:

Unless our students know that we care, they will not learn from us.

I made the comment that I wondered if an AI teacher will get students to think it cares about them. I really believe that a connection between student and educator is important for a good educational experience. Several people on Facebook indicated that they think an AI tutor will be able to convince students that it cares. Is a major concern I have about AI tutors misplaced?

Thinking about this, I recalled variations of the saying:

The secret of success is sincerity. Once you can fake that, you’ve got it made.

Can Artificial Intelligence tutors fake caring about students? I wonder.

Initially, I thought, no, not going to happen. Now I am not so sure. I have been thinking of my own interactions with Alexa from Amazon via their smart devices. Attempts to be personal with the AI, for example, saying “thank you,” elicit what feel a lot like personal responses: Alexa wishing me a “good night” or suggesting I “keep warm out there.”

I recently had a conversation of sorts with Copilot about books I am interested in reading. The conversation felt a lot like talking to a real person.

Also, a friend of mine (Richard Seltzer) recently shared a book he was working on titled “How to Partner with AI: A New Kind of Relationship” (A pre-publication pdf of the entire book is available here for free.) The book reads a lot like a conversation between two real people rather than a person and a computer program. In fact it feels a lot like a conversation among friends.

So maybe AI tutors will get students thinking they care. Whether the program is faking that it cares or really cares is more of a philosophical question than a practical one. It’s a question well worth talking about, of course, just like asking whether computers really think or whether they can be truly creative. Practically speaking, though, does it mean that AI tutors can replace human teachers? I think it is more complicated than that.

There is also the matter of what to teach. I read someone recently saying that human teachers teach what they want but that students are not interested in learning and that AI tutors will teach things that students are actually interested in learning. That may be true but is that what we really want? Would that meet the needs of a real education?

What I see often is autodidacts attempting to promote learning that works for them as being the way that everyone should learn. That is decidedly not the case. Many, perhaps most, students need some external motivation and some direction.

I love the idea of students learning more about the things they are interested in knowing. There are things that students need to know, though, and students are not always interested in learning them all. We have required courses for a reason! Learning all about football at the cost of not learning any mathematics is probably not a good thing. Students are masters of distraction – both of becoming distracted and distracting others. Others includes instructors!

Perhaps that will work out. Perhaps an AI tutor will work mathematics into the football lesson. It could happen but will it?

There is also the question of who is teaching the AI. Will the AI tutors have a good bias or a bad one? Will it be trained to better society or to make it more compliant? Will the students wind up retraining the AI in unhealthy directions? We have seen AI chatbots turn very ugly with help from the internet. Who will monitor these AI tutors? Parents? Not likely.

We’ve also seen AIs get a lot of things wrong. They are not very good at validating sources of information. Human educators are a lot better at that.

I can imagine AI tutors working out very well. I can also imagine them turning out very badly. What I am strongly concerned about is AI tutors for the poor with human educators for the rich. Perhaps the human teacher will be supplemented with an AI tutor, or an AI tutor supplemented with a human supervising instructor. But it is clear to me that many of the rich are more interested in using AI to save money by replacing people and not so much in making things work better.

Relegating the masses to AI tutors is a high risk proposition with potential of holding the masses back. Autodidacts with high self motivation and a good AI tutor may go far. I am not sure that is the way to bet for most students though.

Sunday, February 01, 2026

Reminiscing - When Computers Had Lights

Back before the personal computer age, computers had lights and toggle switches. One could use the switches to program computers and read answers in the lights. All in binary of course. We also used these tools for debugging. One could enter a memory address using the switches and see what was in that location, data or instruction, in the lights.

If a computer program was hung in a loop, one could halt the computer and see what address and instruction were part of the loop. It was a useful debugging tool. Similarly, if the computer halted for some reason, an error code might be displayed in the lights.
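For readers who never used a front panel, here is a toy sketch in Python of the examine/deposit idea: toggle an address in binary, then read or write the word stored there through a row of lights. The names, word size, and memory size here are all made up for illustration; no real machine worked exactly like this.

```python
# A toy front panel: a tiny memory plus "examine" and "deposit"
# operations, with words rendered as panel lights (* = on, . = off).
# Purely illustrative; not a model of any real computer.

MEMORY = [0] * 16  # a tiny 16-word memory


def to_lights(value, width=8):
    """Render a value as a row of panel lights, most significant bit first."""
    return "".join(
        "*" if (value >> bit) & 1 else "." for bit in range(width - 1, -1, -1)
    )


def deposit(address, value):
    """Set the data switches and press DEPOSIT: store a word at an address."""
    MEMORY[address] = value & 0xFF


def examine(address):
    """Set the address switches and press EXAMINE: read the word as lights."""
    return to_lights(MEMORY[address])


deposit(0b0101, 0b10110010)  # toggle in address 5, then the data switches
print(examine(0b0101))       # the data lights show: *.**..*.
```

Debugging worked the same way in miniature: halt the machine, examine the address the program was stuck at, and read the instruction word straight off the lights in binary.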

It wasn’t all seriousness though. Many operating systems would display something in the lights when the computer was idle – not doing real work. Usually this was some sort of animation: lights racing through the strips and rows of lights. Digital Equipment Corporation had a computer type called the PDP-11 that supported a number of different operating systems. Each OS had its own idle loop light display. One could walk into a computer lab, typically at night when no one was using the computers, and tell which OS was running on which computer just by watching the lights.

Some mainframe computers had a lot of lights. A company called Burroughs had one large computer that would display the company logo in the lights when it was idle. Now, you never really wanted to see that display if you owned that computer. It was frightfully expensive to buy and operate, so you really wanted it to be doing real work 24/7. One potential buyer wanted their company logo to display when the computer was idle. Vanity perhaps? Anyway, silly as it was, as I recall, the program change was made and the sale went through.

Today, those sorts of lights are an unwanted, and generally unneeded, expense. I do sometimes miss those simpler days though.