  • If you want to know what got me started, I'll tell you.

    My generation of children received the first doses of polio vaccine, watched the first network television programs, and witnessed the dawn of the space age. As a teenaged nerd, I built electronic gizmos with vacuum tubes, resistors, capacitors and transformers, and marveled at the first Earth satellites, transistor radios, and silent timepieces driven by tiny quartz crystals. All of this made me want to go to college to study engineering.

    My first full-time employment was in the summer between high school and college, a cool job being a gofer at a university computer center (it was nicely air conditioned). There I learned to feed coded punch cards into hoppers and to spew out tables of numbers onto fan-folded paper amidst rows of softly humming gray cabinets. I schmoozed with the maintenance engineers as they replaced failed vacuum tubes in the IBM 709 mainframe. The best part was learning how to write code in FORTRAN. I submitted my first program, which tabulated values of the Fitzgerald-Lorentz transformation coefficient across a range of velocities (how geeky was that?), taking care not to divide by zero. How proud I was of the neat columns of numbers as they unfurled from the chunking line printer and of having made a huge, mysterious machine obey my commands. I was hooked. That first summer job may have sealed my fate.

    My little program didn't work the first several times my punch cards went down the hopper. I sought the aid of the programming consultant (a woman; very unusual in those early days). She read through my code and found errors, which she called "bugs." The word puzzled me. Much later, I learned that it's an old term (even Edison used it) brought into computing just after World War II at Harvard, where a group led by Howard Aiken was creating a large computer called the Mark II. On that team was another woman, a Navy ensign named Grace Murray Hopper, a pioneering computer programmer. The Mark II was making errors, and she and her team set out to determine why. This machine was electric, not electronic: it computed digits using banks of electromechanical relays organized into circuits such as flip-flops, registers and accumulators. Eventually, Hopper or a colleague found the problem: a moth had alighted on a relay and got its wing caught in one of its contacts, so that gate couldn't conduct electricity. The moth was tweezed out of the works and taped to a duty log, with the note "First actual case of bug being found." Now they are everywhere.
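That little tabulation program can be sketched in modern terms, here in Python rather than FORTRAN, with the guard against dividing by zero made explicit (the velocity steps and column format are illustrative, not the original card deck):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_factor(v: float) -> float:
    """Return gamma = 1 / sqrt(1 - (v/c)^2), refusing speeds at or above c."""
    beta = v / C
    if abs(beta) >= 1.0:
        # At v = c the denominator goes to zero; the old program had to
        # take the same care not to divide by zero.
        raise ValueError("speed must be strictly less than c")
    return 1.0 / math.sqrt(1.0 - beta * beta)

# Tabulate the coefficient across a range of velocities, in neat columns.
for frac in (0.0, 0.10, 0.50, 0.90, 0.99):
    print(f"{frac:>5.2f} c   gamma = {lorentz_factor(frac * C):9.4f}")
```

At rest the coefficient is exactly 1, and it grows without bound as the speed approaches c, which is why the guard matters.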
  • In high school, I had read every science fiction story I could find. Even then, I loved to speculate how the inexorable advance of science and technology might be transforming civilization. What would my future world look like, what would we be capable of doing, and—which concerned me most—how would human beings respond to all the coming changes, which in the 1960s were clearly accelerating? Would robots and other machines produce our goods, serve us, and even heal us? Would we converse with them? What would we do with all our leisure time, and what would socializing with robots do to us? Would "human nature" change? I sensed that it would, even though at that tender age my historical perspective was pretty limited.

    At college, I aimed to become a geek, but somehow I managed to acquire a liberal education that exposed me to history, literature, philosophy, art and social science. My favorite college course was one called Science and Government, taught by a former physicist who had worked on building the atomic bomb and later in life became interested in the social and institutional relations of science and scientists. His curiosity about how scientists think and associate motivated me to understand the motives, assumptions, aspirations and connections of scientists and technologists. He asked me to become his student assistant and I gladly accepted the job.

    In graduate school, I studied urban planning and plunged myself into the future. I created artworks based on math, science, and technology. I read Bucky Fuller, worshipped at the altar of the Whole Earth Catalog, and even was a card-holding member of the World Future Society. I tried to dope out what the future would do for us, let us do, and make us do. I didn't get very far, but I did confirm my belief that the future would be a lot more digital. I got with the program, so to speak, and started coding, first on mainframes and then on minicomputers. Mostly I invented data visualization software focused on displaying various types of maps. One production involved a sequence of 3-D map images made into a rotating holographic movie. I was on the cutting edge, but now far cooler stuff shows up in our browsers, mostly for free.
  • Now I live in the future I tried so earnestly to imagine, full of wonders like Dick Tracy wrist radios, paperless newspapers, video telephony, a pharmacopeia of wonder drugs, and supersonic passenger planes (oops). And everyone—especially young people—seems to take them for granted, accepting them as entitlements of inevitable progress. We still haven't gotten our jet-packs, gyro-cars, food pills, robot butlers or moon colonies, but if we did, we would take them for granted too. Still, I don't recall any people I know lusting after such niceties. Is it just me, or do others hunch down when they hear glib expectations of scientific progress, technological abundance, and carefree lives?

    Most people probably think that socializing with robots is still pretty far out. Or is it? Maybe they're all around us and we don't notice them because they just don't look like us. Place a call to any corporation or government agency, and one will almost surely answer it. We know they work in factories. We know that the military—not to mention Google—is hard at work developing driverless vehicles. Our Apple and Android (great name, what?) cell phones speak to us and proffer advice, and when we're not talking to them they entertain us. Some of us own robots that sweep up our rooms, mercifully without trying to converse. But sooner or later they will. And we will talk back and enjoy it even when they diss us. ("Hal, sweep out the hall closet." "I'm sorry, Dave. I'm afraid I can't do that.")

    Then there are what we politely call "unanticipated consequences" of progress in science and technology. Hydrocarbons and heavy metals in water supplies. Air pollution alerts. A parade of oil spills. Unsafe factory food and untested "frankenfood." Endemic obesity, diabetes and cancer clusters. Vanishing fish and wildlife. Invasions of exotic species. Nuclear meltdowns. If and when news media cover such unfortunate events, most people feel upset for a while, then just mutter "what are you gonna do," without taking time to consider why they have happened in the first place. No one takes responsibility or takes charge. No one goes to jail. Shit just happens. And happens again, even after technologists earnestly go about repairing what went awry.

    Once, when I was struggling with some code that wasn't working, a buddy told me, "The first law of computer programming is: There is a bug." That's even truer now than it was for me or for Grace Murray Hopper, and it applies to all technologies. The more intricate and embedded they become, the more ways to fail they have. In 2002, a Federal Government study estimated that "software bugs, or errors, are so prevalent and so detrimental that they cost the US economy an estimated $59 billion annually, or about 0.6 percent of the gross domestic product." Today, just the cost of dealing with consequences of hacks that exploit software bugs to crash sites and steal sensitive data must be at least as high, and that doesn't count biological harm or civil engineering disasters.

    Still, many of us—including myself—accept new technologies with grim fascination, like moths darting around a porch light. Perhaps the bug is us.


    Image: The first "Computer Bug" – a moth found trapped between points at Relay #70, Panel F, of the Mark II Aiken Relay Calculator while it was being tested at Harvard University, 9 September 1947. The operators taped the moth to the computer log, annotating it: "First actual case of bug being found". U.S. Naval Historical Center Online Library photograph, via wikimedia.org.
