My knowledge of computers and operating systems is fairly pedestrian these days, but as a kid I was ahead of the curve for a while, learning BASIC when I was 10 or 11 and later writing some baseball- and running-simulation programs on an IBM PCjr. (It helped that my dad was a programmer.) My earliest efforts were on an Atari 400/800.
In those days, it was a rare thing even for relatively “with it” adults to know anything about computers that didn’t involve playing games. If someone saw you punching keys with a screen in front of you and called out “Hey nerd,” you probably looked his way with an expression not of hurt but of pride. Only nerds knew how to *really* use computers. (I wasn’t a nerd myself, though. I was extremely suave. In addition to spending summer vacations running endless simulations of 5K races involving fictional runners on nonexistent teams at imaginary schools, I could solve a Rubik’s cube, play chess, create my own scaled-down rip-offs of “Choose Your Own Adventure” books, and execute a variety of other social maneuvers that 13- and 14-year-old girls found irresistible.)
I remember wondering, maybe aloud but perhaps to myself, what would happen if one were to somehow locate a tribe of prehistoric cave people and furnish them with computers. (This was in addition to, of course, furnishing them with well-cooked food and reliable shelter, but only after they reached a certain level of proficiency with Astrosmash and Zork.)
Thirty-five years later, I don’t have to wonder anymore. It’s called Twitter, and it has a lot of first-degree relatives.