Do high school or younger computer science students really need to understand number-base conversion and the binary, decimal, and hexadecimal number systems? Obviously most students are comfortable using decimal numbers. How important is it, though, for them to know binary? Or hexadecimal? Especially hexadecimal?
Now in my career there have been times when I used binary, hexadecimal, and even octal (very useful on machines with 12-bit words). But do we need to teach these to secondary school CS students? If so, why?
Do we have students reading hex dumps or looking at data in hexadecimal format? I can’t remember the last time I asked students to do something like that. So why hexadecimal?
Now, OK, binary comes in handy for understanding things like why character variables range from –128 to +127 when expressed as integers. But ASCII tables (and other encodings) are available in decimal as often as in binary, octal, or hexadecimal. So do we really need students to know that a space is 20 in hexadecimal and 32 in decimal? Isn't the decimal enough?
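For what it's worth, a few lines of Python make both of those facts concrete (a quick illustrative sketch):

    # Space is code point 32 (0x20) -- the same value in two notations.
    print(ord(" "))        # 32
    print(hex(ord(" ")))   # 0x20

    # Why signed 8-bit values run from -128 to +127: two's complement.
    import struct
    for u in (0, 127, 128, 255):
        (s,) = struct.unpack("b", struct.pack("B", u))  # reinterpret the byte as signed
        print(u, "->", s)  # 128 -> -128, 255 -> -1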
Binary is useful in other ways of course. Setting and reading flag bits for example can be very efficient and useful. And it is helpful in some cases to get a deeper understanding of data storage.
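A minimal sketch of the flag-bit idea (the flag names here are just illustrative):

    # Pack several booleans into one integer, one bit per flag.
    READ, WRITE, EXEC = 0b001, 0b010, 0b100

    perms = READ | EXEC          # set two flags at once
    print(bin(perms))            # 0b101
    print(bool(perms & WRITE))   # False -- test a flag with bitwise AND
    perms |= WRITE               # set a flag
    perms &= ~READ               # clear a flag
    print(bin(perms))            # 0b110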
Now I do think understanding number systems is important. Just as learning a new natural language often helps people understand their native language, learning about number systems and bases can help students understand decimal better. But is that a math requirement or a computer science requirement?
OK let’s discuss.
6 comments:
Number systems give us a way to "look inside" a computer. I had this conversation with another CS teacher where we were talking about all data being represented as binary numbers in a computer. They kept getting stuck on the idea, accepting that words can be converted to binary but struggling to understand that images and sound can be too.
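One way to make that concrete (a quick sketch with made-up values): a pixel or an audio sample is just an integer, and an integer is just bits.

    pixel = (255, 128, 0)                      # an orange RGB pixel
    print([format(c, "08b") for c in pixel])   # ['11111111', '10000000', '00000000']

    sample = -12345                            # a 16-bit audio sample
    print(format(sample & 0xFFFF, "016b"))     # its two's-complement bit pattern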
I don't think knowing number systems is important to understanding CS, but it is important to understanding computers. It is worth teaching, especially early on when coding is more challenging. Number systems allow for an "unplugged" activity that deepens understanding of computers and how they operate.
I think that is especially important as students study cybersecurity, where you might use a hex editor or look at files in a more raw form. The letters/numbers are not magic, just different representations of numbers.
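A toy stand-in for one line of a hex editor's display (illustrative, not any particular tool):

    data = b"PK\x03\x04secret"                  # e.g. the magic bytes that start a ZIP file
    print(" ".join(f"{b:02X}" for b in data))   # 50 4B 03 04 73 65 63 72 65 74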
At the high school level, I think that covering the idea of binary is important. I think covering other number bases, or going in depth with any of them, isn't.
Most students will, at most, take an intro-level CS course and an AP or college 101-level course. I think knowing about binary is important at that level, but there are so many other concepts to teach or spend time reinforcing that are higher priority (conditionals, loops, functions, project planning, problem solving, etc.).
Further, unless the curriculum builds on the skill and makes use of it at growing levels of depth across a semester, including it for a week-long unit and never mentioning it again doesn't help much. It's teaching something because it feels important, not because it is important.
At most, I'd do as Derek suggests and have an unplugged activity around it, for some deeper learning. I wouldn't make it complex, I wouldn't make it critical to the curriculum, and I wouldn't spend much time on it (unless you really want to spend a lot of time on bit manipulation).
Some thoughts: http://cestlaz.github.io/posts/essentials-for-an-intro-course/
I agree with Derek, but only in part. As I start CS from the idea of information and not the computer, I use number bases to demonstrate different representations of the same information. I also use the written symbols for the English word 'Tree' in different spoken languages (and Morse code), or the notion that we represent a geometric point with a dot (when we could just as easily use a triangle). However, the idea that the notion of 'ten' can be represented as the decimal 10, as 1010b, or as 0xA matters just as much in a practical sense.
And it matters because 0.1 + 0.1 + 0.1 is not 0.3 to a digital computer (it's 0.30000...0004, depending on architecture and language boundary issues). The fact that the computer needs to convert from base-10 to base-2 can matter, and I actually have students complete a simple program that gets the wrong answer in one case if they aren't careful about the rounding error.
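The effect is visible in a couple of lines of Python (a sketch of the pitfall, not the actual assignment):

    import math

    total = 0.1 + 0.1 + 0.1
    print(total)                      # 0.30000000000000004 with IEEE 754 doubles
    print(total == 0.3)               # False -- the == comparison is the trap
    print(math.isclose(total, 0.3))   # True -- compare with a tolerance instead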
Now, do I care if students can convert from Base-2 to Base-16, etc? Not unless we're doing ACSL or some other competition that requires it. I do show them that arithmetic holds across bases, and we discuss why computers use base-2, and how base-10 actually works. But I never test them on it. It's more of a novelty, but the emphasis is *always* on information representation -- which leads easily to ASCII and software design and abstraction later.
And the idea of bases comes up again in my Data Structures class when talking about Big Oh, and why the naive prime-checking algorithm is O(2^n) rather than O(n): the value of an n-bit number, and hence the number of trial divisions, is about 2^n. And bases come up again in Selected Topics when dealing with low-level (circuits, assembler, and C) concepts.
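For reference, the naive check in question looks something like this (a generic version, not the class's exact code):

    def is_prime(m: int) -> bool:
        """Trial division: O(m) in the value m, but m is about 2^n for an
        n-bit input, so the work is exponential in the number of bits."""
        if m < 2:
            return False
        for d in range(2, m):   # try every candidate divisor
            if m % d == 0:
                return False
        return True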
Basically, I sprinkle it in as students need it to facilitate more important concepts. If they want to know more, that's for outside the classroom. The concept of bases is important, but not important enough to require a week of dedicated time and testing.
My 0.3000000000004 cents :-D
--sea
I do not teach bases as a topic. I give my Python class (only one kid this year) a programming assignment of converting bases. They learn the bases as a byproduct. We do discuss why the bases are important, but I do not have them solve conversions by hand or anything strange like that. I am more interested in the kids learning what an algorithm is all about, and converting bases is all about algorithms. Some programming languages (Small Basic) use base-16 for colors. In the tech class we look at IPv6, which writes addresses in base-16. Bases to me are more of a tool than a topic. When you need it, teach it.
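One possible version of such an assignment looks like this (the names are illustrative, not the commenter's actual code):

    DIGITS = "0123456789ABCDEF"

    def to_base(n: int, base: int) -> str:
        """Convert a non-negative integer to a string in the given base (2-16)."""
        if n == 0:
            return "0"
        out = []
        while n > 0:
            out.append(DIGITS[n % base])   # peel off the least-significant digit
            n //= base
        return "".join(reversed(out))

    print(to_base(255, 16))   # FF -- e.g. a full-intensity color component
    print(to_base(10, 2))     # 1010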
Three reasons to teach number bases:
1) It may help students deepen their understanding of decimal, though whether this is true is an empirical question.
2) Certain ideas and facts don't make sense without teaching number bases. What is a byte? Why do attributes like RGB intensities run from 0 to the otherwise-arbitrary 255? Concretely, what is meant by the phrase "numbers and letters are represented by zeros and ones"? Feaster et al. (2012) surveyed 30 CS educators and report that "one interesting observation is that while 30% of respondents indicated that binary is relatively unimportant in the courses they teach, all respondents agreed that understanding number systems other than base-10 is important for computing students."
3) Number bases are intertwined with some problems that students find interesting, such as representing colors in CSS, using least-significant-bit steganography, or finding secret messages or forensic data in images viewed with a hex viewer. The well-researched framework for developing effective curricula, Understanding by Design (UbD), requires that objectives be aligned with student work, and authors Wiggins and McTighe labor to point out that, contrary to many of their readers' understanding of UbD, objectives can be selected specifically because they align with a motivating problem or activity. Measurements by Lambert and Guiffre (2009) and Feaster (2011) report low student engagement for activities teaching number bases, but these measurements were made using activities that treat the concept without a motivating problem or application, and the Feaster (2012) report demonstrates that number bases can be taught in an engaging context.
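As a toy illustration of the least-significant-bit idea (my own sketch, not from any of the cited papers; it also shows why a color channel tops out at 255, the largest value of one byte):

    def hide_bit(byte: int, bit: int) -> int:
        return (byte & 0b11111110) | bit    # clear the LSB, then store the message bit

    def read_bit(byte: int) -> int:
        return byte & 1

    pixel = (200, 120, 33)                  # one RGB pixel: three bytes, each 0-255
    stego = tuple(hide_bit(c, b) for c, b in zip(pixel, (1, 0, 1)))
    print(stego)                            # (201, 120, 33) -- visually indistinguishable
    print([read_bit(c) for c in stego])     # [1, 0, 1] -- the hidden bits come back out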