Computing Curriculum Off Track

by Tim Moran

Do you recall the name Steve Furber? Back in the '80s, he worked at Acorn Computers Ltd., where he was one of the designers of the BBC Micro microcomputer and the ARM 32-bit RISC microprocessor. Not a bad resume.

He was recently interviewed by Nicole Kobie for the U.K. publication, PC Pro, about technology education in the United Kingdom. The article, "Steve Furber: Why Kids Are Turned Off Computing," looks at what appears to be something of a technology-education crisis in Britain: Students are staying away from computing classes in droves because, according to Furber, "they teach nothing but boring basics."

Furber is out to do something about it. He is, according to the article, "working with the Royal Society to figure out why the number of students taking A-Level computing classes has halved in the past eight years, and why students who love technology aren't signing up to study the subject."

Behind the unnerving trend, says Furber: "I [get] the impression that the schools' curriculum has very much focused on ICT [information and communication technologies] skills, and so what everybody does is learn to use a spreadsheet and word processor and PowerPoint and so on." Furber notes that, while these are important skills, they are much too basic for those students who already have an interest in computing technology. Schools are presenting ICT as an academic subject in a most mundane way. "It's as if maths was just arithmetic or English was taught as just spelling," Furber said.

This tack is not going to excite and educate a new generation of computer technologists. What's really needed, thinks Furber, is some good, old-fashioned programming: "In the '80s, when the BBC Micro was introduced into schools, the first thing you got when you turned on the machine was a programming interface. Anybody who was interested could write little programs and understand something about programming and algorithms. With a modern PC, you've got to work quite hard to get yourself in any position where you can write any sort of program. They're designed to be tools to use and not programmed. I certainly think we'd like to see much wider use of the idea of the computer as a programmable device, some understanding of algorithms, and understanding of the insights that computer science has brought in terms of computational thinking."
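To make that contrast concrete, here is the sort of "little program" Furber has in mind, one a curious student could type in, run, and pick apart to get a first feel for algorithms. The sketch below is my own illustration in Python, simply because it is freely available on a modern PC; a BBC Micro would have booted straight into BBC BASIC instead.

    # Euclid's algorithm for the greatest common divisor: a classic
    # first algorithm, short enough to type in and reason about.
    def gcd(a, b):
        # Repeatedly replace the pair (a, b) with (b, a mod b)
        # until the remainder is zero; what remains is the GCD.
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(48, 36))  # prints 12

A dozen lines like these give exactly the hands-on exposure to algorithms and computational thinking that Furber says the spreadsheet-and-PowerPoint curriculum never provides.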

While we know that it's quite possible to get an extremely fine university-level education here in the United States--and one would think the same is true of Britain--what about our secondary schools, where students are at the same stage as the U.K.'s A-level pupils? Is this sort of thing happening here, too, or do we do a better job of getting our high school students interested in computing, programming, and information technology in general?