Just how much programming do we need?

In recent years, there has been a push for more computer education in general, and some school districts are pushing kids to code. Debate has since raged over whether coding is a vital skill or merely a nice-to-have extra. I suspect this will end up much like similar debates in education and in other fields. In other words, it will remain a controversial topic for quite some time.

About a month ago, Jeff Atwood published a piece asking how much programming people really need. Here is the original link:

http://www.nydailynews.com/opinion/jeff-atwood-learning-code-overrated-article-1.2374772

Atwood argues that the emphasis on programming is overrated, and that what we really need is a more general ability to think. Anyway, the article is a worthy read and I think he makes some very interesting points. In particular:

  1. Computers have become more user-friendly, and it is a testament to how good they are that we no longer have to be geeks to use them.
  2. Despite this, computer programming is a very “narrow vocational skill” (Jeff’s words, not mine) – potentially a good skill for one’s career, but a very narrow one nonetheless.
  3. This overemphasis on programming has come at the expense of the “why” behind what we do.

I agree with him for the most part. Certainly, your odds of establishing a good career in, say, computer programming are far better than your odds of, say, becoming a famous actor, but programming is still a very specialized skill. Atwood encourages his children to be skeptical, to not believe what they read without some careful thought. He also notes that a great many programmers would have been much more successful had they built the broader skills he urges; the technical skills, he argues, are the easiest ones to gain by comparison. Finally, he encourages people to build things rather than just learn pedantically (which is likely to happen if programming education becomes widespread, save for the few students fortunate enough to have some truly outstanding teachers).

Far more important is the “why” behind what we do. I think that in our society we have put so much emphasis on “content” that we do not always step back and take the time to appreciate the “why”. Everyone needs that ability to think, no matter what setting they are in.

Most distressingly, I find that when I ask that question, the answer is often along the lines of “it has always been this way”. That is a logical fallacy, an appeal to tradition. A valid answer might be that a certain approach has been tested and found to be the most efficient, or that it offers some other advantage; either way, the why is critical. It forces people to look inward, to reflect, and to ask, “Is this really the best way to do this?” or “How could we be doing this differently?” That can lead to some very disquieting conclusions, and potentially to improvements that were never anticipated. I find that this makes some people very uncomfortable, even if you pose the question in a friendly and diplomatic manner.

What worries me is that not having to learn about a topic often encourages a certain level of ignorance. In my experience, IT people often despair about the state of knowledge of the users they work with. On one hand, it is great that we have all of this technology. On the other, it has become something of a crutch, enabling people not to think. Atwood notes that relatively few people know much about how an automobile functions; you could say the same about modern utilities, human biology, or pretty much any other topic.

I think that encouraging thinking is great because it tends to create that burning desire to know “why”. That is perhaps the ultimate gift to humanity.
