Back in 2010, Steven Levitan, creator of the hit U.S. television series “Modern Family,” wrote an episode in which the character Mitchell Pritchett has a conversation with his car’s voice-command system. “CD player, next track,” he says. After two tries, the automated voice replies, “Air-conditioner on,” and the car starts blowing air out of its vents. While the scene was hilarious to watch, it also illustrates how unready automated computers were to interact with human beings at the time.
What about now? Well, let us just say improvements have been made—but are they good enough to efficiently aid us in our daily chores?
Apple’s implementation of Siri on the iPhone 4S in 2011 shows how far human-computer interaction has evolved since the first chatterbot, Eliza, in 1966 [1]. Siri promised to turn human voice into calls and texts without users having to lift a single finger. But Siri’s accuracy was never what Apple and media outlets promised it would be. Accents become unrecognizable words, and we speak to our lifeless gadgets at the slowest possible speed just to see them work their magic. Despite these flaws, we cannot deny how great it would be to own a robotic device of our own—one that interacts with us as if it were alive!
IBM, in its latest “5 in 5” predictions, brings us closer to these dreams. Within five years, IBM envisions computers that can generate the sensation of touch through vibrations, and that are smart enough to detect and analyze sounds, tastes, smells, and pictures better than human beings—able, in effect, to “smell” and “hear” better than we can. If these technologies work as well as they sound, IBM will have created a portable robotic device that predicts trends and understands us better than we understand ourselves. Imagine such devices predicting diseases and illnesses before they even happen to us.
That said, Barb Darrow, a writer at GigaOM, hit the nail on the head when she raised doubts about IBM’s futuristic predictions. While each of us undoubtedly wants a smart personal robotic device to call our own, that reality may still be far away. And even if these predictions come true, we have to start asking ourselves: would silk ever feel as real on a mobile device as it does when we physically run our fingers through the textile? If these gadgets of the future are created to serve human interests, would we reap infinite benefits, or lose our sense of what being human feels like?
Hopefully, that will not be the case. And it will surely be the best of both worlds, human and machine, when Mitchell of “Modern Family” can finally get his car’s voice command to actually play the next track.
[1] Weizenbaum, J. Eliza—A Computer Program for the Study of Natural Language Communication between Man and Machine. Communications of the Association for Computing Machinery 9, 1 (1966), 36–45.