People continue to share things that they learned at SIGCSE online. Recently Ria Galanos, from Thomas Jefferson HS, shared Cybersecurity Modules: Security Injections|Cyber4All @Towson. These are a bunch of modules for teaching about various security issues like integer errors, input validation, and buffer overflows. Examples are in a number of languages including C++, Java, and Python. Looks like great stuff, really.
From what I understand the majority of security vulnerabilities in software are a result of one of these three types of errors. Clearly they are important concepts but we don’t spend a lot of time talking about them in beginner classes. That probably needs to change.
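To make that concrete, here is the sort of integer error those modules cover, and the input validation that prevents it. This is a minimal C++ sketch of my own, not an example taken from the Towson materials:

#include <iostream>

int main() {
    // A classic integer error: multiplying unchecked user input can
    // overflow int, wrapping a large order into a negative total.
    int quantity = 0;
    const int priceCents = 250;

    std::cout << "How many items? ";
    if (!(std::cin >> quantity)) {           // validation step 1: is it a number at all?
        std::cerr << "Not a number.\n";
        return 1;
    }
    if (quantity < 0 || quantity > 10000) {  // validation step 2: is it in a sane range?
        std::cerr << "Quantity out of range.\n";
        return 1;
    }
    // With both checks in place this multiplication cannot overflow.
    std::cout << "Total: " << quantity * priceCents << " cents\n";
    return 0;
}

Remove the range check and a big enough order quietly goes negative. That is the kind of bug that looks harmless in class and becomes a vulnerability in production.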
It seems like we keep adding things to what we should be teaching beginners. Ethics, Accessibility, Security, and let's not forget the stuff for the AP Exam. How do we fit it all in?
Ultimately we have to do things in parallel. We have to think about ethical computing, safe programs, and accessible software as all parts of good, solid program design. It is a mistake to treat them as separate units taught in isolation. Projects can be created that integrate security, accessibility, and more by defining them all as good design. Not something special we do just for security or just for accessibility, but simply sound design.
Now to go look at my projects and see how to do all of that.
A lot of these problems can theoretically be avoided with good defensive programming, but we also have to ask a deeper question: people make these errors even when they DO know better. Why, and how can we change this?
How many people who leave memory open to a buffer overflow know that buffer overflow is an issue? Based on my experiences, I'd say a lot.
There's something deeper going on here.
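For readers who have not watched one happen, the classic mistake under discussion looks something like this. A sketch of my own for illustration, not code from anyone's real project:

#include <cstring>
#include <iostream>

// The unchecked copy: if name is longer than 15 characters plus the
// terminator, strcpy writes past the end of buf and corrupts whatever
// sits next to it in memory. That is a buffer overflow.
void greet(const char* name) {
    char buf[16];
    std::strcpy(buf, name);                  // no length check
    std::cout << "Hello, " << buf << "\n";
}

// The defensive version: copy at most what fits, then terminate.
void greetSafely(const char* name) {
    char buf[16];
    std::strncpy(buf, name, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';             // strncpy does not always terminate
    std::cout << "Hello, " << buf << "\n";
}

int main() {
    greet("Ada");                            // fine: it fits in the buffer
    greetSafely("a name much longer than sixteen characters");  // safe: truncated
    return 0;
}

The unsafe version compiles cleanly and works on every short test case, which is exactly why people who know better still ship it.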
I agree that there is more going on. Two assumptions I think are common: that no one will discover the vulnerability and that no one will abuse it. With proprietary code, people tend to assume the flaw will not be discovered. With open source code, I think there is an assumption that people will not abuse the flaw. Both assumptions are often wrong, of course. And then there is the lazy factor. It takes more work (which equals time and money) to plug the holes and test the fixes. Add that to the other assumptions and you get security holes.
As an aside, I had a student doing a summer internship testing software. He called to tell me that the professionals were doing things I taught him not to do. Specifically, allocating memory and not releasing it when they were done. I asked him how that worked. He said, "It works fine for one iteration. For ten iterations it gets real slow. For 100 iterations the software crashes." So yeah, people who know better still do dumb stuff. We still need to make sure our students know better. Sharing examples like this one, which I do, may help. It's worth a try.
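The pattern he described probably looked something like this sketch. My reconstruction for illustration, not the actual code he was testing:

// Each call allocates a work buffer and never releases it. One call is
// fine; repeated calls leak memory until the program slows and crashes,
// exactly as the student described.
void processOnceLeaky() {
    double* work = new double[1000000];
    // ... use work ...
    // missing: delete[] work;
}

// The fix is to release what you allocate.
void processOnceFixed() {
    double* work = new double[1000000];
    // ... use work ...
    delete[] work;   // better still: use std::vector<double> and skip new entirely
}

int main() {
    for (int i = 0; i < 100; ++i) {
        processOnceFixed();   // swap in the leaky version and watch it degrade
    }
    return 0;
}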