Dialog with an Internet Toaster
by Andy Oram
"Why haven't you given me any new scripts to run for the past two months?" whined my toaster.
I was so surprised I almost dropped my Wheaties on the floor. It didn't bother me that the toaster spoke out of turn; I had installed the adaptive interface as a lark when I got the thing six months ago. What threw me was simply how many months had passed since I became bored with writing scripts to rotate English muffins or adjust the top-brown feature to the thickness of the cheese.
"Hardware problems," I said to gain time. Jeez, what was the world coming to--how could I let my own toaster make me feel guilty? "I haven't been able to get that shuttle from the freezer to work so you could load a bagel automatically." But I knew in my heart that wasn't the full reason.
"You've tried only 66 of my 879 primitives and convenience functions," sulked the toaster.
I sighed. Even though I suspected that its guilt trip was programmed in by the factory to drum up more business--and frankly, the device was just hurting its own image by coming across as weak and abandoned--I couldn't just walk off. "There are lots of reasons why technology doesn't catch on right away," I said. "They've been explained hundreds of times in the literature on human-computer interaction."
"Well, how could I know that? You haven't let me download any document longer than a 200-byte recipe!"
"Look, it's well-known that people need time to adjust their habits. We have a learning curve, too. It took thirty years after the deployment of computers in the workplace before improved productivity could be measured."
"Those mainframes cost millions of bucks," my toaster said. "You got me as a free prize at a trade show for sitting through a half-hour presentation on enhancements in the Java Virtual Machine. I ought to catch on much faster."
"Early technologies are just hard to use," I explained. "Let's take object-oriented programming as an example, since you mentioned Java. Leaving aside languages like Smalltalk and Eiffel that never hit the big time, object-oriented programming was first popularized through Stroustrup's C++ enhancements to the C language. But people still found it easy to make mistakes in C++. Java cleaned up a lot of the thorny areas--and in doing so, showed the advantage that later technologies have over early ones.
"When Java substituted various kinds of collections for C's unstructured arrays," I continued, "it didn't just add bounds-checking; it also restricted users to three or four ways of accessing each collection. You can't do just anything you want any more; you can do only the few things you really need to do.
"Right now appliances are more like C++ than like Java," I concluded. "You know how long it took me to program the recipe, 'Toast the bun, drop the turkey breast on top, and keep the whole thing warm until five minutes before the stove finishes the baked beans.'"
"UML time events will be supported in 1.2," the toaster said.
"That's not the point," I explained patiently. "The kitchen logic is still up to me." I was feeling more and more like a teacher and less like a petitioner asking my computer system to meet my needs.
"Then why did you take me in the first place, if you weren't ready to exploit my full capabilities?"
"Cool down, OK? There's more to say. My proudest achievements came from stretching you to do something you weren't intended to do. Like the time I piped bleach through your dribble function to wash a stain off the oven mitt. These inspirations don't come every day. Big advances in information technologies emerge serendipitously from tinkering with tools in ways that weren't anticipated by their creators. For instance, one tiny corner of the HTML specification regarding CGI led to an explosion of interactive Web sites, search engines, and e-commerce. Analysts predict that a small enhancement to HTTP that permits XML-RPC will have equally revolutionary effects.
"The network effects can't be trivialized," I added. "Your designers knew hardly anything about my freezer, my dishwasher, or my alarm clock. When you're all well integrated, I'll be able to use you a lot more."
"So you're saying technology doesn't exist in a vacuum? Nothing new there."
"Of course not. But your adaptive interface is something new. I get the creeps when you spring things on me I didn't think of. Just because I tried my egg over easy a couple times, you decided to flip everything I put into you. I didn't have a robot to help me clean up that mess you made from the tuna melt. Remember that?"
Unfortunately, my own leaning toward shame was not shared by my appliance, which totally lacked a conscience. "You have full freedom to set all my parameters," boasted the toaster. "When you signed the shrink-wrap license you took all responsibility."
I found this to be a total turn-off, and felt like waxing cruel. "There's more than one way to bake a potato, you know. While you go along adding trivial functions like 'crisp' and 'dribble,' my microwave oven is learning to do everything you can do. Maybe I'll consolidate my cooking in a single appliance."
That shut my toaster up. But being saddled with the spark of consciousness, I was left thinking about how humans handle the onslaught of new technologies. I had been so proud when I first got that device, being the first person I knew to own one. I seemed to spend all my time scripting it, configuring it, querying its capabilities. But even though nobody knows you're a toaster on the Internet, I ultimately realized a cooking utensil couldn't replace the intensity of a conversation about life with a dear friend over beer and sushi.
In the end, I realized that we don't do a very good job of anticipating the changes technology brings. And when someone clever combines two technologies from different spheres, the rest of us are caught by surprise. Yet these juxtaposed innovations promise the most benefits. Moreover, since we have to experiment with new techniques before we can say what's standard, the new must learn to coexist with the old, as when dynamic Web pages have to support two or three different document models. Finally, the biggest danger is to think we can design technologies smarter than we are, especially when we don't know what we want until we've tried out many different possibilities.