The increasing divergence of technical skill

2013-08-12

This is something of a response to the article recently posted on Hacker News called Kids Can't Use Computers... And This Is Why It Should Worry You.

I'll summarize the thesis of the article if you don't care to read it: the guy is bitter because people in their 20s and younger don't know how to use computers. And by "use", he's not referring to the "knows how to click icons" definition; rather, he means that they're incapable of basic maintenance like setting proxies on a corporate LAN, knowing how to respond appropriately to dialog boxes (specifically: just fucking read them), and being able to open up Windows' Task Manager and determine what process/application is causing your CPU to max out.
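
For readers on a Unix-ish machine, the Task Manager triage he describes has a rough one-line equivalent. This is just an illustrative sketch: the `--sort` flag is GNU `ps`, so on macOS/BSD you'd use `ps aux -r` instead.

```shell
# Sort running processes by CPU usage, highest first, and show the
# top five -- roughly the "what's maxing out my CPU?" exercise.
# Note: --sort is GNU ps; on macOS/BSD use `ps aux -r` instead.
ps aux --sort=-%cpu | head -n 6
```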

I've worked in tech support, so I know the lay of the land, so to speak. You get called out to users who have done unspeakable horrors to their machines without realizing it, people who won't check if their monitor is plugged in until you drive over, people who have 48 applications running and wonder why things are going slow. These anecdotes are not central to the point I mean to make, but they help to qualify what comes next: namely, that you can understand why the guy is bitter.

Most people are capable of relating to a doctor what part of their arm hurts. Maybe it's the wrist, the elbow, somewhere in the forearm, or the shoulder. Imagine you are a doctor, and you are trying to help your patient, and they are incapable of relating to you this information. Further, they are incapable of relating any status information other than "my arm just isn't working". Compare the outcome of such an occurrence to a similar one in tech and you'll immediately grok the tone of the aforementioned article.

Now, let's get to it.

The author is moaning (justifiably, erroneously, I don't care) about the fact that people my age (I'm 24) and younger don't know how to do anything on the computer other than Facebook. I can hear the protests, but no, he's right. How many of your friends can undertake a task as easy-fied as downloading and installing Wordpress on a typical LAMP stack? How many would venture to install and use more than one OS on their machine, even with the help of something automated like Apple's Bootcamp? How many can mock up a ridiculously simple HTML/CSS (no JavaScript) webpage in 20 or 30 minutes, or edit a Tumblr theme...and actually know what they're doing other than following instructions by rote?

Where am I going with this? Well, back in the day (hate to say it, but it applies) this sort of stuff was all under the headline of "assumed". It was assumed that if you were using a computer, you could make small changes to a text file. You could download and run some all-but-autoconfig installation like Wordpress on LAMP. You could use DOS to navigate around and get shit done. A computer wasn't the foreign, terrifying "black box" that it is now. I'm not saying it was easy if you were doing it then, but it wasn't writing your own compiler. It was...using a manual transmission, or knowing how to find something on a map. It was a skill required to navigate that country of bits.
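
For anyone who never lived it, the "assumed" baseline being described is roughly this level of shell work. Shown here in a Unix shell rather than DOS, with made-up file names; `sed -i` as written is GNU sed (macOS wants `sed -i ''`).

```shell
# The sort of thing that was once just "assumed": move around the
# filesystem and make a small change to a text file.
# The path and config file here are hypothetical.
mkdir -p /tmp/demo && cd /tmp/demo
echo "ServerName=oldbox" > app.conf
sed -i 's/oldbox/newbox/' app.conf   # the "small change to a text file"
cat app.conf
```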

Today, these skills are not necessary for use. They're nice, but they aren't needed. iOS is so responsive, so intuitive, that the human involvement in the process is almost gratuitous. Android is 95% of that. Windows XP through 7 (haven't used 8) are all incredibly, ridiculously easy to use. 7 especially. If it wasn't a clusterfuck of non-free/proprietary licensed mystery, it would be something special.

Among many other conditions, ease of use is required for mass market adoption of anything. As more people start to adopt a given product or technology, they start to wonder, "Why aren't more people using this? This should be easier to use." Consider the jump Android has made between 1.0 and 4.2. The usability difference is huge. Likewise between Windows 95 and 7. So, what happens? Things get even easier. They get so easy as to crest the grandmother-boundary. They get so easy that you can put them in the hands of a kid 2 years old and he can successfully navigate around ...without knowing how to read.

What's interesting is that the actual construction of these technological environments has not gotten drastically easier. Yes, most programmers no longer code in assembly; yes, tons of people use interpreted languages; yes, there are libraries and frameworks available to make a lot of common things easier and take less time (e.g., jQuery, Rails). The level of complexity is still there. As far as I'm able to tell, rolling a Linux distro by hand still requires some competency. Writing drivers and embedded systems stuff is still fucking tough. By no means am I saying that everyone who works with a computer is Ken Thompson (in case you had any doubt, I can't do any of those things, but I am curious enough to be attending Dev Bootcamp in a few months). I am, however, saying that if you're into this stuff at all, you're likely much further above the mean computer user today than you would have been if you liked tinkering with computers 30 years ago.

To make this plainer: using a computer at an average or below average level has gotten much, much easier, while using a computer at an advanced level has not eased to the same degree.

This has led, and will continue to lead, to a divergence of skill in society. Rest assured, things for the average user will get easier. And, barring some unforeseen advances of an absolutely hilarious scale, writing code to build systems that can intelligently handle thousands (millions?) of concurrent requests will remain a complex, arduous task well above the individual intelligence, skill, and stamina of 99% of human beings. In effect, there are two forces pulling apart those of us who use computers: if you are skilled, you will have to become more and more skilled to remain relevant, and if you are not, don't worry, because things will eventually come down to your level.

The problem with this scenario, however, is agency. We're faced with a situation where at least a part of Neal Stephenson's Snow Crash has become literally true:

“We have a huge workforce that is illiterate or alliterate and relies on TV—which is sort of an oral tradition. And we have a small, extremely literate power elite—the people who go into the Metaverse, basically—who understand that information is power and who control society because they have this semimystical ability to speak magic computer languages.” (pg. 406)

iOS, Facebook, et al. are our generation's TV. Just as TV made it possible to be functionally illiterate in modern society (able to enjoy news, movies, and culture generally in a society that has attained such feats as space travel and nuclear power, but without the need for higher-order literary comprehension), mass-market adoption and brilliant UX design have allowed the majority of us younger people to be functionally computer illiterate. Perhaps there is some pushback in this regard, with more and more people seeking to learn how the sausage is made. Time will tell.

There are certainly arguments to be made that counter mine: "look at all the cool stuff that we've built!" definitely carries weight. I just ask: is the number of people our age involved in this creation (in both raw numbers and percentage/demographic terms) more, less, or about as many people as you would expect, given our nauseating title of "digital natives"?

I can tell you that, having gone to college from 2008 to 2012 with a host of people my age, I was a rarity: someone who tore apart the family computer at age 10 to mess about with things like RAM and HD upgrades, who built tons of sites on the likes of Angelfire, Geocities, and Tripod for no other reason than "it's fun!", and who set up a proxy to route around my high school's "great firewall". Let us hope my experiences grow less and less strange, rather than the opposite. Even with all of the problems of democracy, would you rather have an oligarchy populated with people like Mark Zuckerberg and these assholes?

Just some thoughts.