Joe Konstan pointed me to this interesting debate around an essay by Dewar and Schonberg about the teaching of computer science. Many essays of this kind are shallow: the authors have a particular style of programming that they think will solve all problems, and they advocate that all of us ought to spend more time teaching and learning their style. Dewar and Schonberg’s presentation is much deeper, perhaps because of their experience both teaching and practicing computer science.
The three basic arguments in their article are:
1) Computer scientists should learn about many different language paradigms (described in detail in the essay), because that gives them a deep understanding of the approaches that might best solve any particular problem.
2) Computer scientists should learn more about formal models and how they can be used in software construction. In particular, computer scientists should learn more of the appropriate math.
3) Computer scientists should learn the discipline from the ground up. The early use of a very high-level language such as Python — or even Java — means that students develop too high-level an understanding of programming, without the details that are sometimes crucial.
I think the authors are absolutely correct on (1), and they do a good job of presenting their arguments. I’ll say no more on this issue, since their article does such a good job.
I think the issue of formal models is a trickier one. I certainly agree with their assertion that it would be valuable for all computer scientists to understand how formal models can be part of the solution for achieving very high reliability. On the other hand, most computer scientists will never work on such systems. How much of their time should be spent studying such systems? My bias would be to create a path for those students who find such work interesting, but to require only the basics for all students.
Finally, I think the authors are dead-wrong on the idea of teaching students to program from the hardware up. I understand the temptation: that’s how we learned, and we’re all awfully good at what we do, so it must be the best way to learn. But this argument misses the most important skill for a computer scientist: effective abstraction. The current approach of beginning with high-level languages starts students on the path to understanding the really deep issues of our discipline, rather than spending this precious formative time on problems only a few of them will face in practice. The authors argue that work in high-level languages is much easier to outsource than lower-level thinking. They have it exactly backwards: the most challenging problems in our discipline today — and the most difficult to outsource — involve mapping user needs onto concrete, implementable requirements. A student who knows what a high-level language can do, along with the power of its attendant libraries, is much better prepared for this sort of work than a student who has spent years learning about machine architecture and machine language.
I agree with the authors that Java is the wrong language for the first course, but for nearly the opposite reason. Java is a difficult language to learn because it requires that so many details be understood before an interesting program can be built. In particular, Java suffers from opaque syntax for simple things — like the basic list and dictionary data structures — and from the lack of lambda expressions to make higher-order programming accessible. Beginning students would be much better off with a language like Python, which gives them the tools to explore both modern imperative programming and functional programming.
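To make the contrast concrete, here is a small sketch of my own (not from the article, and the names in it are made up): a few lines of ordinary Python using list and dictionary literals and a lambda, where the equivalent Java of that era needed explicit collection classes and an anonymous inner class in place of the lambda.

    # List and dictionary written as literals, not built through library classes.
    grades = [93, 67, 88, 75]
    names = {"alice": 93, "bob": 67}

    # Higher-order programming with a lambda: rank students by grade.
    ranked = sorted(names.items(), key=lambda pair: pair[1], reverse=True)
    print(ranked)   # [('alice', 93), ('bob', 67)]

    # A simple transformation in one line: add a 5-point curve, capped at 100.
    curved = [min(100, g + 5) for g in grades]
    print(curved)   # [98, 72, 93, 80]

A beginner can read and write this after a week; it exercises both imperative and functional habits of mind without first wading through classes, interfaces, and static typing.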
John