Tuesday, October 28, 2014

Leave School Now While You Still Know It All

Interesting questions appear on Twitter all the time, sometimes addressed to individuals and sometimes to the Twitterverse as a whole. One earlier today has me thinking a lot, and chatting with people on Twitter and Facebook.

This is a tough question if only because it raises many more. What does it mean to be “programming at a high level?” What is the value of a degree? How does one invest their time and money to make the most of their talents?

People regularly ask about moving into a career in software development right from high school. I never hear anyone asking about moving into a professional career in architecture, engineering, medicine, biology, chemistry or the like. If computer science is one of the hardest HS courses students can take (based on how many are afraid to take it), why is it seen as such an easy field to turn pro in?

In part this is because of the stories we tell. Bill Gates, Mark Zuckerberg and more made billions without finishing their degrees. But wait, LeBron James makes millions of dollars and he jumped to the NBA right from high school. Being Bill Gates is probably about as much a long shot as being the next LeBron James. While a lot of high school seniors think they are ready for the NBA and are wrong, they at least know they have to be found before they can turn pro. Young software developers have the ability to “turn pro” at very little cost or risk these days. The temptation is great to try because the risk appears so low.

There are many people with very successful careers in software development who don’t have degrees, so there is that. Why don’t we see that in other fields? Is it because we don’t do well teaching people professional-level skills in those fields in high school, or because there are artificial barriers to entry in those other fields, or is there just too much more to learn? Possibly a bit of all three.

There is also a real anti-degree feeling among some software developers one could point to as well.

This is far from new. In the late 1970s I had a hiring manager tell me he was initially against bringing me in for an interview because I had a university education in computing. It’s hard to imagine another professional field where there are practitioners who see a university education in the field as detrimental.

One thing that makes computing, or perhaps more specifically software development, different from other fields is that it is easier to learn a lot without a formal education. Between MOOCs, freely available software, shared resources, online forums, and more there seems to be a virtual smorgasbord of resources for autodidacts.

One can quickly learn, on their own, enough to start making apps and applications and even make money. Once in the field one can bootstrap their learning through hard work and online resources to keep improving and growing. This is not really possible in many other fields. Not these days. There was a time when doctors and lawyers and other professionals learned on the job through internships and apprenticeships but those days are long gone. They have been replaced by professional degrees.

So to get back to the question that started this epic long post – degree or not? I’m a big fan of the degree. I think that people benefit from a guided learning experience. It forces one to broaden their knowledge and helps them to learn some things they don’t know that they need to know. There are some who can succeed without it but not as many as think they can.

If you are going to skip the degree you need some way of demonstrating that you really know your stuff though. You need significant projects that have been completed that you can show off as samples of your ability. That can be a profitable app for phones or tablets. I suspect that the students who created YikYak for example will have little trouble showing that they understand phone apps and cloud computing. People looking to get into game development need (and I’ve been told this time and again by hiring managers) to have a significant game project under their belts.

A degree, for good or for ill, is validation that you “know something” and if you don’t have that degree you need to be able to convince people you know as much.

One last thought: if a student says they don’t need a degree because Bill Gates doesn’t have one, remind them that Bill Gates completed two years at Harvard. Then ask them if they can get into Harvard. It may get them to think a bit.

A late thought from someone on Facebook. A good degree includes a lot of courses that are not part of the major but which make for a more well-rounded and complete person. There is some real value in that.

20 comments:

Garth said...

$100K for a CS degree in a field where a portfolio is often more important? The nice thing about college is it is always there. Want to try getting a programming job without it? Give it a try. If it works, great. If it doesn't, go to college. The mistaken idea that kids should go to college directly out of high school is part of this issue. As a college math teacher I would say about half of my classes had no idea why they were in college in the first place. The only ones that really knew why they were there were the non-traditionals. I think the best strategy is to go out and get some experience, in the field if possible, then go to college.

Dan Wiebe said...

We like passionate hobbyist developers straight out of high school because such people understand already that their craftsmanship sucks, and they're looking to find a way to improve it. So we teach them, and they learn. They're happy to.

College graduates tend to be problematic, for two reasons. First, they tend to believe their craftsmanship must be worth something after all the money they spent on it, and we have to break them of that before we can teach them. Second, they tend to have been taught a bunch of bad habits (commenting code, writing code that isn't demanded into existence by a failing test, etc.) that we also have to beat out of them before we can teach them the right habits.

College would be welcome, and our profit margin considerably higher, if it would teach our prospective hires the things about software craftsmanship that we need them to know. Presumably it teaches them something, but nothing--or almost nothing--that pertains to the sort of job we want them to do.

I was once in a position to offer three college computer-science professors the opportunity to spend a week during summer vacation in a real industry team room with real industry developers on a real industry project, so that they could see what went on and what skills were needed and what skills were not.

All three turned me down flat. They weren't interested; they had better things to do. One of them told me, "I don't train software developers. I train computer scientists!"

Until that attitude changes, college computer science will be harmful for students who want to get jobs as software developers.

Geekymom said...

It is true that College CS professors believe they're training Computer Scientists. That's true of any professor of any field, although as academic jobs dwindle, many are starting to focus on the practical. I hope I'm teaching my students how to learn to program. I don't have rote assignments. We have projects. I'm not looking for the "right" answer, I'm looking for interesting results. I tell my students all the time that what I learned in college in terms of an actual language and practice is useless. No one uses it anymore. What I know now about programming, I learned on my own.

But, I do think college--a good college--has something to teach students. Often that's the place where students learn better how to think and learn, to understand the importance of diversity, of social justice, and generally to pull themselves out of the egocentric view of the world they have at the end of high school. If they go off to a programming job, where I'm going to guess a) they're a guy and b) they'll be surrounded by guys (mostly), I worry they'll lose the opportunity to learn some of those softer skills and develop a skewed view of the world, cf. GamerGate.

But you don't have to spend $100k to get that. Go to a community college or state school. If you're already a kick-butt programmer, I'm guessing some school might pony up some money to get you to go to that school. There is value to college, but it's not going to train you to be a developer; it isn't there for job training, it's there for a broader education.

Dan Wiebe said...

I get what you're saying. For example, in college I learned that drinking alcohol isn't for me, and I met the woman I would later marry, and I discovered that a defective bass combo amp that keeps the bass's strings about 24VDC above ground will make your fingers numb after about half an hour of playing, and that if you ride your bike to class every day because you don't have a car, on rainy days the rear wheel will throw water all up and down your back, the front wheel will throw it straight into your crotch, and you'll be squidgy all day.

The part that causes me trouble is that somehow colleges, college students, and employers have all fallen under the impression that a college CS degree means you should be hireable as a junior-level software developer, for at least $30-40K/yr.

If everybody understood that a college degree in computer science gave you a disadvantage rather than an advantage for the work a software developer has to do, it'd make things better for me.

However, it would also cause a mass exodus of university students to technical schools and a concomitant loss of funding for those universities, so probably universities aren't going to be advertising that anytime soon.

But actually, a guy I know who learned to be a software developer in prison has now been released and is going to such a technical school, and he keeps telling me they don't have it right either.

It could be that software craftsmanship is just something that needs to be learned in an apprenticeship rather than in a classroom. So far my company has had better luck with apprenticeships than with anything else we've tried; but they're expensive.

Alfred Thompson said...

I think it is wrong to generalize too much. Several of my professors in my undergraduate days spent sabbaticals working in industry and others did a lot of active consulting in industry while teaching. Most of the professors I had in grad school were industry professionals who were adjunct professors. Nothing quite like learning compiler development from someone who makes their living doing it in a commercial enterprise.

You probably see less of that in an R1 school where they are trying to turn out computer scientists rather than software developers.

As a formally trained person I was regularly asked for help by self-taught developers who knew that there were things they needed to know which were taught in university but that they hadn't learned on their own. That breadth of knowledge can be very useful.

Dan Wiebe said...

Perhaps I _am_ overgeneralizing, but I can only operate from two sources of experience.

First, my own experience with college, from whence I got a BA in computer science, an MS in computer architecture, and an MS in computer graphics. Since then, I've been one kind of professional developer or another for a quarter century, and in that career I've used exactly one thing I spent that $150,000 learning: an algorithm I could have found in a $15 textbook (or, today, for free with Google). If I had gone straight into the industry from high school, not only would I have saved that $150,000, but I probably would have made at least $150,000 in the seven years I actually spent in school; so I would have been $300,000 ahead. No, wait--$299,985: gotta remember that $15 algorithms book.

Second, my experience interviewing potential hires for companies I've worked for--probably hundreds by now. Not without fail, but very frequently, the most promising prospects straight out of college are passionate and smart, but they know almost nothing we need them to know. We have to hire them and pay them to go through our own proprietary Craftsmanship Academy for four or six weeks, and we have to pay two senior developers to teach them.

But maybe it's the particular slice of the industry I'm in (Agile consulting, usually but not always on some kind of business software) or the geographical region (Midwest US).

Michael S. Kirkpatrick said...

So many great, quotable things in this discussion: "Being Bill Gates is probably about as much a long shot as being the next LeBron James." This is reminiscent of the Geek Gene myth that suggests you can't teach programming skills, so college is a waste. "The mistaken idea that kids should go to college directly out of high school is part of this issue." I'm a really big fan of the Gap Year and apprenticeships. Sadly, the former is generally socially unjust (primarily restricted to wealthy privileged), while the latter is just not a part of our society.

But then there's the "I don't train software developers. I train computer scientists!" refrain. All that tells me is that you're looking at the wrong schools. You're probably talking to professors at top-tier research universities. Here's the not-so-hidden truth about the academic world: At those institutions, teaching--particularly undergraduate teaching--is neither valued nor rewarded. And training students in job-related skills? Shudder. The incentive structure of those universities is based on research and prestige. And their research IS computer science, not software development. I know PLENTY of faculty who would gladly take you up on your offer to spend a week with an industry team. (Disclaimer: I have an undergraduate degree from one R1, a M.S. from another, and a Ph.D. from a third. So I have plenty of exposure to the R1 life. I also spent half a decade out of undergrad with IBM, so I've seen a different side of things.)

I teach at a large, primarily undergraduate institution. In my classes (I teach upper-level systems courses such as OS), I won't even look at students' code unless it has been pushed into their Mercurial repository (I prefer Mercurial to Git because it has a shorter learning curve). I have set coding standards that they must adhere to, though I encourage them to use Eclipse's autoformatting tool so they don't have to. Depending on the course, some projects involve writing a lot of code, while other projects involve mostly code reading and comprehension with minor tweaks. In all projects, I set agile-like milestones to encourage a release early, release often pattern. I am not at all under the impression that I'm teaching the next Donald Knuth or Edsger Dijkstra. My students go into professional practice, and I want them to have the skills (beyond just the theoretical concepts) necessary to succeed in that environment.

However, here's where I push back: Your use of the word "training" and the bemoaning of the fact that you have to actually invest in your new hires highlights an industry bias. I am not your employee trainer. If your business requires some particular language, that is your responsibility, not ours. It is foolish for you to expect otherwise. I am preparing my students to go into one of many possible positions in the computing profession, ranging from web developer to DBA to systems analyst to ethical hacker to robotics engineer. I am not preparing them to step into your particular job opening. Unless you agree to hire every single one of our ~100 graduates that we produce every year (many of whom will have absolutely no interest in your company), we will continue to provide a foundation that gives students options for their career paths. We have plenty of excellent students who would be able to learn the skills you need VERY quickly (I get some going from no C background to writing kernel code in the same semester), but you cannot expect it for free.

Michael S. Kirkpatrick said...

(continued...sorry for the length...)

I'm not sure what exact skills you are suggesting that students lack, but I'll say this: Am I going to teach them Node.js or push for our curriculum to be standardized around the Popular Language of the Day? Absolutely not. (I actually had a conversation with a recent graduate who went to work for a web services company that suggested we teach EVERY class in JavaScript.) As I teach systems courses, my students program in C, because it is the most appropriate language for the concepts in this area. You can't teach something like buffer overflow attacks using Python. My goal with my students is to teach them what they need to know so that they can teach themselves. I don't teach computer scientists. I teach future self-guided learners. They won't be masters of any particular language, but they'll get enough exposure to OOP vs. functional vs. declarative languages so that they understand which paradigm might be best for a given problem.

Will every single one of my students go on to become amazing software developers? Um, no. Not even close. But I feel that the experience they get here will put them in a much better situation than when they arrived.

And I'll also back up what Geekymom said. One of the big problems in the software industry right now is monoculture, particularly one that creates an unwelcoming environment for women and underrepresented minorities. There is plenty of evidence that this is particularly the case in the start-up culture and companies that hire "geek prodigies" right out of high school. Oftentimes, the bias is implicit and employers aren't aware of it. At other times, it's explicit, like when the organizers of Black Hat thought it'd be a good idea to have a conference-sponsored strip show. When you try to hire people who just look like you, think like you, talk like you, and learn like you, you are contributing to the problem and you are actually hurting the field overall. After all, the point of diversity isn't just for its own sake. Rather, the point of diversity is so that you have multiple viewpoints, some of whom can point out critical things (such as crappy UI design) that geek geniuses (who are not mature enough to empathize and critically analyze designs) fail to recognize.

Garth said...

Michael, Not too long and well said. As the only CS type person in my high school I constantly have to give advice to my CS students on how they should shape their career. It is conversations like these that help me help them. I have a bias. I went to college for 2 years after HS and did not do well. No focus. I then spent 4 years growing up and learning focus. The next 4 years of college I was on the Dean's list. But my daughter went directly from HS to college and is on the Dean's list and really has focus. As a college math instructor for 13 years I see many more like me than her in my classes. College after high school is the norm for several reasons; things like scholarships and no major life commitments are two biggies that jump out. Good reasons but I still think society emphasizes that automatic direction too much. But I do not want my bias to affect my advice too much. Conversations like these are a big help.

Dan Wiebe said...

Actually, two of the three professors I talked to were from a small Wesleyan Methodist college with surprisingly little emphasis on research, and one was from a huge research-heavy state-funded university with its sports teams named for a poisonous nut.


But we don't hire language programmers: we hire smart programmers and let them learn what they need. A month ago, for example, I knew next to nothing about Python; now I test-drive it every day because of the engagement I'm on. There are many dark corners of the language with which I'm not familiar, but then my objective is to write simple, clear code, not clever, unmaintainable code.

And we don't care about technologies or frameworks or platforms either. Again, we hire smart people, pair them with an expert, and let them learn it. No big deal.

What is crippling to us is the almost complete dearth of college graduates with good practices.

Chief among those practices by several lengths is test-driven development. It's tough even to find somebody who claims he knows how to test-drive, and most of the ones who do make the claim turn out to have pretty horrible habits (asserting mock behavior, asserting tautologies, etc.) and knowledge gaps (for example, how do you write a test that asserts that your production code throws a specific exception?) Knowing how to write good tests (good tests, now, not just tests) is immensely important to, say, the leading third of the industry these days, and that fraction won't do anything but increase.
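[Editor's aside: Dan's interview question about asserting a specific exception can be answered in a few lines of Python's standard unittest module. This is a minimal sketch; the `withdraw` function and `InsufficientFunds` exception are hypothetical stand-ins, not from any codebase mentioned here.]

```python
import unittest

class InsufficientFunds(Exception):
    pass

def withdraw(balance, amount):
    # Hypothetical production code under test.
    if amount > balance:
        raise InsufficientFunds(f"balance {balance} is less than {amount}")
    return balance - amount

class WithdrawTest(unittest.TestCase):
    def test_overdraw_raises_specific_exception(self):
        # assertRaises fails the test unless this exact exception type is thrown.
        with self.assertRaises(InsufficientFunds):
            withdraw(100, 150)

    def test_normal_withdrawal_reduces_balance(self):
        self.assertEqual(withdraw(100, 30), 70)

if __name__ == "__main__":
    unittest.main()
```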

Coming in second would be a facile command of the SOLID principles--you know, Single responsibility, Open/closed, Liskov substitution, Interface segregation, and Dependency inversion. In our work, which involves writing code for clients that may linger dormant for years until somebody completely unconnected with it has to change it, we need it to be well-constructed and maintainable; we can't have developers slapping trash together and pushing it just because it works. Trashy code reflects badly on our reputation and hence our market share.
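[Editor's aside: for readers who haven't met the acronym in code form, here is a minimal Python sketch of the D in SOLID, dependency inversion: the high-level class depends on an abstraction rather than a concrete implementation, which is also what makes it testable without mocks. All names here are illustrative.]

```python
from abc import ABC, abstractmethod

class MessageSink(ABC):
    """The abstraction the high-level policy depends on."""
    @abstractmethod
    def send(self, text: str) -> None: ...

class ConsoleSink(MessageSink):
    """A concrete sink for production use."""
    def send(self, text: str) -> None:
        print(text)

class CollectingSink(MessageSink):
    """A hand-rolled test double; no mocking framework needed."""
    def __init__(self):
        self.sent = []
    def send(self, text: str) -> None:
        self.sent.append(text)

class Alerter:
    # Depends on the MessageSink abstraction, not on ConsoleSink,
    # so tests can substitute CollectingSink and inspect the output.
    def __init__(self, sink: MessageSink):
        self.sink = sink
    def alert(self, level: str, msg: str) -> None:
        self.sink.send(f"[{level}] {msg}")

sink = CollectingSink()
Alerter(sink).alert("WARN", "disk at 90%")
assert sink.sent == ["[WARN] disk at 90%"]
```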

Gimme an interviewee who can demonstrate TDD and SOLID to me in a pairing interview, and I promise I won't ask him a single question about sort algorithms or language specifics or NP-completeness. We'll have to either hire him on the spot or have him killed, because otherwise he'll go to work for the competition, and we can't have that.

Now don't get me wrong, learning to check in frequently, at least whenever the tests are green (maxim: "Always commit on a high five") is an important thing to understand. Releasing early and often is good too, as well as Travel Light and YAGNI and DRY and all those others. (I'm sorry, I don't give a rip about coding standards.)

But to me, out here in the industry, I'd much rather teach those to a young pair partner while trying to produce high-quality client software at high velocity than have to teach TDD and SOLID. Those two take a lot of low-velocity time and intense effort to learn, which is hard to find in a burgeoning, competitive industry but should be exactly the sort of environment people expect from college.

Okay, lunch break is over.

Michael S. Kirkpatrick said...

Again, I'm going to have to split this across multiple posts... Dang character limit...

Garth, if I were Education Benevolent Dictator for Life (EBDfL), NO ONE would go to college directly out of high school. I would mandate a 2 year professional apprenticeship period. A big part of the problem with bad results coming out of college has to do with the simple fact that a very significant percentage of 18-year-old students are not emotionally and intellectually mature enough. They do not understand what the purpose of college is, they do not know what they want to do with their lives, and they do not know what they should be learning. They go to college because it's expected of them, then they waste 2 years NOT learning. By the time they're juniors and seniors, they often have to go back and re-learn what they failed to learn the first time. (...which means they also don't learn the upper-level stuff, too...) That's not true of everyone. But I would echo Garth's observation and argue that a majority of college freshmen could benefit from 2 years of additional maturity.

Dan, my tone was a bit harsher and snippier than I should have used. I tend to get that way when the conversation starts off as (essentially), "College is a waste of time." If you want to have a constructive conversation, it's generally not best to start from a stance that devalues the worth or integrity of another's profession. A better start would have been, "There are particular high-level reusable skills that colleges should emphasize more." We would have gotten along splendidly at that point.

In general, I would argue that your points fall into the category of argument from ignorance. That's honestly not intended as an insult; rather, it is just intended to suggest that many people who criticize academics (as well as teachers at primary and secondary levels, too!) lack sufficient knowledge of the domain to understand WHY their point is off-target. You are criticizing CS faculty for insufficient results but you do not seem to have taken the time to ask what we ARE teaching ("Presumably it teaches them something, but nothing--or almost nothing--that pertains to the sort of job we want them to do.").

For one thing, there is a glaring distinction between teaching and learning. If you have college graduates applying for jobs with you and they don't understand TDD, that doesn't mean faculty aren't teaching it. It is equally likely that they just didn't learn it. You can lead a horse to water but you can't make it drink. (See the first paragraph about wasted time as freshmen and sophomores.) Furthermore, like many people who criticize educators, you do not understand effective teaching practices. I'm sure you have plenty of memories about what teaching styles worked really well for you. The problem is just that: They worked really well for YOU. That doesn't mean they were effective for everyone or even a majority. Your dig on sort algorithms highlights this. We don't teach sort algorithms because they are valuable in and of themselves. Rather, we use them as a scaffolding technique so that students learn to analyze algorithmic complexity. I don't care if my students can do a proof using a polynomial-time reduction. But if they're trying to manage any decently sized data set, they had better be able to know when to use a hash table instead of a tree structure. The point of analyzing sort algorithms (which really only comes down to a very, very small amount of time) is to prepare students for these more advanced forms of critical thinking.
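[Editor's aside: Michael's hash-table point is easy to make concrete. A Python `set` is a hash table, and membership testing against it is average-case O(1), versus an O(n) scan of a list; this sketch just times both, with sizes chosen for illustration.]

```python
import timeit

n = 100_000
data_list = list(range(n))
data_set = set(data_list)   # hash table: average O(1) membership test

target = n - 1              # worst case for the linear scan

scan = timeit.timeit(lambda: target in data_list, number=100)
hashed = timeit.timeit(lambda: target in data_set, number=100)

# The hash lookup should beat the linear scan by orders of magnitude.
assert hashed < scan
print(f"linear scan: {scan:.4f}s  hash lookup: {hashed:.6f}s")
```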

Michael S. Kirkpatrick said...

The same thing holds for coding standards. I also do not care for coding standards, per se. I enforce them for specific reasons: If I leave it to the students, they will continue to write all of their code in the same crappy style that they learned in CS1. And if you truly care about code reuse and maintenance, you do NOT want graduates who do that. Also, some organizations DO care about and mandate standards, and I want my students to be ready for that. I would rather they whine to me about annoying trivialities than have them whining to their future employers (who will turn around and not hire our future graduates for fear that they are also whiny!). But the real problem is this: Students who can only write in one coding style can only READ one coding style. So if you hire a graduate (who may be very bright!) who has spent 4 years writing really crappy Java code and only really crappy Java code, they will get very frustrated and become unproductive if your code base doesn't happen to look exactly like that same crappy Java code style.

As for SOLID, you do not seem to realize that IS a domain-specific skill. I didn't even know what SOLID was because I do not do OOP. While some aspects of it are cross-discipline, others (e.g., the open/closed principle) are very bad for some applications. I teach courses like OS, computer organization, embedded systems, and some security-related material. (If your company did work in that area and you were offering an opportunity to sit in on your development process for a week, I would absolutely jump at the idea. Hell, I'd beg for the opportunity.) For my subdomain, OOP is generally viewed as bloat. For low-level systems code, forcing layers of abstraction, encapsulation, nested method invocations, etc., leads to two results: unusably slow performance and complexity-induced bugs (often timing-related). While I certainly think that students in an OOP course should learn SOLID principles, that's really not relevant to my courses.

TDD is less domain-specific, but still impractical for my courses. In systems courses, students do not yet have the insight to write meaningful test cases. If they're just starting to understand how to use pointers and race conditions, they won't have an appreciation for what can go wrong. By the end of the project, after they have worked to resolve the subtleties of the test cases I give them, they might be in better shape to write similar test cases but there is no point to that: very, very few of them will go on to do systems programming. But all of them should have at least a rudimentary understanding of what a kernel is, how code executes on a CPU, and why high-level synchronization primitives are vital. While pushing TDD is a great idea in theory, it just doesn't work in practice. If I spent my time with my OS students teaching them how to write a meaningful test case for kernel code, I would have to cut out about half of my class.

The problem with TDD (and other practices) is one of economics: there simply isn't enough time. Undergraduate programs generally have 10-15 courses required for the CS major. In our program, we try to strike a balance between theory and practice (emphasizing the latter more). Our requirements consist of 2 intro courses (the first is a general intro to programming and the latter is an intro to OOP), software engineering, databases, 2 discrete math (I'd prefer only 1...but that's a different story), data structures, computer organization, OS, networks, programming languages (a survey of paradigms). In addition, they have choices of electives in areas like robotics and AI, software design patterns, web-based information systems, security, cyber defense, UI design, and a couple of others. IDEALLY, we could find a way to have every course integrate the practices they learned in software engineering. And, as I said earlier, a lot of us make an effort to do what we can. But there's just not enough time.

Michael S. Kirkpatrick said...

If you disagree with our curricular choices, that's fine. Perhaps you think that design patterns course should be required (I actually do, too, but that's a different story as well), that's your prerogative. However, what should we get rid of to make room for it? Data structures is fundamental to every advanced course. Computer org is the foundation of OS and networks, both of which are necessary for our security curriculum. PL? That's the only spot in our curriculum that introduces the functional and declarative paradigms. Considering lambdas are coming to Java and growing in popularity, I would argue functional concepts are core.

Overall, I would say that your argument is either myopic or overstated. You view the type of software development that you do as characteristic of ALL software development, and that simply is not true. Again, your word choice betrays you: "out here in the industry." Your company is not the industry. It is one part of a subfield of the industry. Your argument isn't really about us teaching too much theory or neglecting the needs of industry. Your argument essentially boils down to the mistaken assumption that a CS degree is about preparing students for careers in OOP software development. At this point, I would return to my point in the first paragraph: Your argument illustrates that you also do not understand what the purpose of a college degree, particularly one in CS, is.

If your argument is that a CS degree does not adequately train someone to become an Agile OOP software developer, I would agree. But if you try to extrapolate that into "we discover that CS degrees are useless or worse to devs," then you are simply mistaken.

Michael S. Kirkpatrick said...

I should have said: Dang character limit and dang long-winded professor on a soapbox...

Michael S. Kirkpatrick said...

Gah! I promise I'll shut up about this, but I just thought of one other subtlety: Another reason new CS graduates do not have the skills required for developing and maintaining code on a long-term basis is because they've never had to develop and maintain code on a long-term basis. If the longest you've ever worked on a single project is 3 months, you've never had a reason to adhere to SOLID principles.

Dan Wiebe said...

------
If you have college graduates applying for jobs with you and they don't understand TDD, that doesn't mean faculty aren't teaching it. It is equally likely that they just didn't learn it. You can lead a horse to water but you can't make it drink.
------

Actually, I've talked to a number of these graduates about their experience in college. According to what they say, when TDD is addressed (rarely), it's given one or two class periods. Once I talked to a guy whose professor had taught it for a week. This, when it takes me three to six months of six to eight hours daily, pairing one-on-one with a client developer, to teach him or her to write good tests.

TDD should not be a subject for a few days or a week; it should be a way of life. The first code you ever write should have not a println but an assert.
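That "assert before println" idea can be sketched as a tiny example. This is a hypothetical illustration, not code from the post; the `fizzBuzz` function and class name are invented for the sketch:

```java
// A first program written assert-first: the checks state the desired
// behavior before the implementation exists.
// Run with assertions enabled: java -ea FirstAssert
public class FirstAssert {

    // Step 1: write the assertions (these ARE the "tests").
    static void tests() {
        assert fizzBuzz(3).equals("Fizz");
        assert fizzBuzz(5).equals("Buzz");
        assert fizzBuzz(15).equals("FizzBuzz");
        assert fizzBuzz(7).equals("7");
    }

    // Step 2: write just enough code to make the assertions pass.
    static String fizzBuzz(int n) {
        if (n % 15 == 0) return "FizzBuzz";
        if (n % 3 == 0) return "Fizz";
        if (n % 5 == 0) return "Buzz";
        return Integer.toString(n);
    }

    public static void main(String[] args) {
        tests();
        System.out.println("all asserts passed");
    }
}
```

The point is the order of work: the assertions exist first and fail, and the implementation is grown only until they pass.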

------
I'm sure you have plenty of memories about what teaching styles worked really well for you. The problem is just that: They worked really well for YOU. That doesn't mean they were effective for everyone or even a majority.
------

Don't worry; I have plenty of experience finding teaching methods that work for other people; it's part of my job. Has been to one extent or another for three decades.

------
I also do not care for coding standards, per se. I enforce them for specific reasons: If I leave it to the students, they will continue to write all of their code in the same crappy style that they learned in CS1. And if you truly care about code reuse and maintenance, you do NOT want graduates who do that.
------

Coding standards don't have to do with reuse and maintenance. Coding standards are always about things that don't matter, like spacing and brace placement and blank lines and comment banners. That's how you know it's material for a coding standard: it doesn't matter. And the reason I don't care about coding standards is that they can be so easily automated these days--made part of the build process.

If it's something that matters, like small functions or DRY code or single-indent methods or good names, that's practice or craft, not a coding standard.

------
As for SOLID, you do not seem to realize that IS a domain-specific skill. I didn't even know what SOLID was because I do not do OOP. While some aspects of it are cross-discipline, others (e.g., the open/closed principle) are very bad for some applications.
------

Oh, I do my share of functional and prototypal and straight-line programming as well, and a good knack for the SOLID principles serves me very well there too, even if they may look a little different. For example, for what applications is the principle that a software entity should be open for extension but closed for modification "very bad?"
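For readers following the open/closed exchange, here is a minimal sketch of the principle itself (all names invented for illustration): new behavior is added as a new type, rather than by editing existing, working code.

```java
// Open/closed sketch: Discount is closed for modification but open
// for extension -- new pricing rules arrive as new classes.
interface Discount {
    double apply(double price);
}

class NoDiscount implements Discount {
    public double apply(double price) { return price; }
}

// Added later, without touching Discount or NoDiscount:
class HolidayDiscount implements Discount {
    public double apply(double price) { return price * 0.9; }
}

public class OcpSketch {
    public static void main(String[] args) {
        // Callers depend only on the interface, so they never change
        // when a new discount type is introduced.
        Discount d = new HolidayDiscount();
        System.out.println(d.apply(100.0));
    }
}
```

Code that consumes `Discount` never needs to change when a new discount type is added; that is the "closed for modification" half of the principle.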

------
I teach courses like OS, computer organization, embedded systems, and some security-related material. (If your company did work in that area and you were offering an opportunity to sit in on your development process for a week, I would absolutely jump at the idea. Hell, I'd beg for the opportunity.)
------

As a matter of fact, my current engagement involves writing build tools for system software (such as it is) on an embedded platform. But I can't extend you an invitation, because my client is extremely security-conscious and would not agree.

Dan Wiebe said...

------
TDD is less domain-specific, but still impractical for my courses. In systems courses, students do not yet have the insight to write meaningful test cases. If they're just starting to understand how to use pointers and race conditions, they won't have an appreciation for what can go wrong.
------

Every time I meet a new Agile-transformation client, I hear all about how TDD may be wonderful for the general case, but their particular vertical application isn't susceptible to it for important reasons. It never turns out to be true. The toughest case we had was test-driving cross-compiled code for an 8-bit microcontroller; but we handled it.

------
If I spent my time with my OS students teaching them how to write a meaningful test case for kernel code, I would have to cut out about half of my class.
------

They ought to know how to write meaningful test cases before you encounter them; tuning their skills to kernel code should be a minor operation. As I said, until somebody comes up with something better, TDD is going to become a way of life for people who want to write professional code.

------
If you disagree with our curricular choices, that's fine. Perhaps you think the design patterns course should be required (I actually do, too, but that's a different story as well); that's your prerogative. However, what should we get rid of to make room for it?
------

I don't mind if new hires don't know design patterns; those are easy to teach on the job. But the fact that you're asking what you should get rid of to make room for TDD means that I haven't made myself clear.

TDD should not be a separate subject like data structures or operating systems; it should be a crosscutting ingrained practice, like using an IDE or understanding indentation. When you start to think about a problem, your first thought should be, "What's the simplest test I can write for that?"
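That habit can be sketched concretely (class and method names here are invented for illustration): the test is written before the production class exists, and the class is then written to satisfy it.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// TDD micro-cycle sketch: step 1 is the smallest test we can state;
// step 2 is the minimal implementation that makes it pass.
public class TddSketch {

    // Step 1: the simplest test for the problem at hand.
    static void testPushThenPopReturnsLastValue() {
        IntStack s = new IntStack();
        s.push(42);
        if (s.pop() != 42) throw new AssertionError("pop should return the last push");
    }

    // Step 2: just enough code to make the test pass.
    static class IntStack {
        private final Deque<Integer> items = new ArrayDeque<>();
        void push(int v) { items.push(v); }
        int pop() { return items.pop(); }
    }

    public static void main(String[] args) {
        testPushThenPopReturnsLastValue();
        System.out.println("green");
    }
}
```

From there the cycle repeats: the next simplest test (two pushes? pop on empty?), then the next small change to the code.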

------
You view the type of software development that you do as characteristic of ALL software development, and that simply is not true.
------

Heavens, no! If it were, I'd be out of a job. But it should be.

------
Your argument illustrates that you also do not understand what the purpose of a college degree, particularly one in CS, is.
------

So do your students understand it? Do their parents? Do they know you're not interested in teaching them the skills they'll need to get jobs as software developers?

------
If your argument is that a CS degree does not adequately train someone to become an Agile OOP software developer, I would agree. But if you try to extrapolate that into "we discover that CS degrees are useless or worse to devs," then you are simply mistaken.
------

I'm the world's foremost authority on what I have discovered, and I'm afraid I'll have to pull rank on you and say that no, I'm not mistaken: that's what I've seen. I understand if that's unattractive to you, and I sympathize. I'm just trying to say what's out here.

Michael S. Kirkpatrick said...

"Do they know you're not interested in teaching them the skills they'll need to get jobs as software developers?"

At this point, I just have to write you off as a troll. You don't understand that any teen you have worked with is, by definition, abnormal; they are self-selected high performers. They are like the top 20% of my students. If we only had students like those in our classes, all of our graduates would be TDD masters. Instead, I have to find ways to motivate the 50% of my students who would otherwise wait until the day before their 6-week-long project is due. See Garth's comment about his experiences teaching math at the college level.

You bemoan the fact that 3 professors (not exactly a great sample size) wouldn't take you up on your offer for them to spend a week in a professional development environment. But it's all too obvious from your comments that you have never spent even a single minute actually considering the realities of what professors do. It's easy to criticize when you have no empathy or understanding of others' experiences. Your comments make it clear that you (a) do not understand the distinction between teaching, training, and tutoring, (b) do not understand the opportunity cost factors that influence curricular decisions, (c) have never examined or engaged with discussions relating to the ACM computing curriculum, (d) have made no attempt to consider that the types of projects done in an educational setting serve a different purpose than real software projects, (e) have no knowledge about the actual intellectual and psychological maturity of MOST 18-year-olds, and (f) do not really care to work with academics to improve the situation. It's quite clear that academics aren't the only ones who need a change in attitude.

Dan Wiebe said...

------
At this point, I just have to write you off as a troll.
------

It's not an uncommon response among academics who encounter professionals who say things they'd rather not hear; I've seen it happen before--and no, not just to me.

------
You don't understand that any teen you have worked with is, by definition, abnormal; they are self-selected high performers. They are like the top 20% of my students. If we only had students like those in our classes, all of our graduates would be TDD masters. Instead, I have to find ways to motivate the 50% of my students who would otherwise wait until the day before their 6-week-long project is due. See Garth's comment about his experiences teaching math at the college level.
------

Now you sound like a public-school teacher--as though you're compelled by law to accept whoever wants to be in your class. Is that really the case? Can't you winnow your students until you're left with just the serious ones?

I just got home from Coding in the Clink #10 at Marion Correctional Institution, where a bunch of (mostly) professional software developers spent the day in prison pairing with prisoners who want to be developers when they're released.

There are a bunch of serious students. CITC is an occasional bonus, but the prisoners are in there every day learning to figure stuff out. It's amazing how much and how fast they learn in there, even without an Internet connection. And we accept whoever wants to join; but we run them really hot and make them work very hard, and probably 70% of them wash out after a month or two, even without any exams or grades or flunking--just because they have better things to do with their time than flog their brains all day in front of a computer and then again all evening so as not to be left behind. The few who are left are like gold panned out of gravel.

Can't you do something analogous?

------
But it's all too obvious from your comments that you have never spent even a single minute actually considering the realities of what professors do. It's easy to criticize when you have no empathy or understanding of others' experiences.
------

Actually, my father has been a college computer-science professor ever since I can remember. He's still working hard even into his seventies. (He was at CITC #10 today, pairing with prisoners.) But you're right: I don't think he teaches the right things either. On the plus side, he's at least willing to check out stuff like TDD, even to the point of coming to prison to do it.

------
(b) do not understand the opportunity cost factors that influence curricular decisions, (c) have never examined or engaged with discussions relating to the ACM computing curriculum,
------

Probably true.

------
(d) have made no attempt to consider that the types of projects done in an educational setting serve a different purpose than real software projects,
------

Well, now, I wouldn't say that. I'm involved with lots of tiny educational katas in various contexts; but I always test-drive them.

------
(e) have no knowledge about the actual intellectual and psychological maturity of MOST 18-year-olds,
------

Used to be one!

------
and (f) do not really care to work with academics to improve the situation. It's quite clear that academics aren't the only ones who need a change in attitude.
------

Okay, so what would "working with academics to improve the situation" involve?

Dan Wiebe said...

Pending an answer from Michael, let me give it a try on my own and offer a challenge to you CS teachers and professors reading this.

Show this to your students and spend ten minutes talking it up: http://globalday.coderetreat.org On the 17th, talk to the ones who went and find out what they thought of it. You might even decide it's worth trying out for yourself next year.