
Watson in your pocket.

Ubiquitous computing (ubicomp), also known as pervasive computing, is defined in Wikipedia in a way that evokes all kinds of science-fiction and utopian overtones. The online encyclopedia describes ubicomp as "a post-desktop model of human-computer interaction in which information processing has been thoroughly integrated into everyday objects and activities."

"Ubiquitous" sounds horizon-wide and somehow optimistic, while "pervasive" sounds almost as ominous as invasive. It appears there are two very different ways of looking at the digital waves of change flowing over us in our ever-evolving digital environment. For some, the question is when will the machines with the AI (artificial intelligence) engines take over? For others, it's when will our computers free us sufficiently from meaningless drudgery to allow us to finally become fully human?


Remember the old story about the best way to boil a frog? Add one degree every few hours to the water temperature of a pot of cold water, and pretty soon the job will be done, almost unnoticed by all--maybe even the frog. It has something to do with definable limits that somehow never get defined.

So when will the warming waves of the digital streams in which we are immersed reach the point when we're sufficiently rendered to be served to the new AI masters or be released like steam into new, freer atmospheres? Who knows? The view from inside the covered pot is dark, and the warmth encourages quiet napping.

What's needed is an accurate thermometer--something that could measure those definable limits for us. For the effects of computing and invisible networks, perhaps we should start with the hardware--the pots and the pans.

Consider tablets and smartphones, for instance. Few devices provide such ever-present connections to digital jet streams, and never have such natural human gestures linked us to them. We tap icons to get the device's attention, slide elements and pages to where they belong, flip-scroll, or use two-finger zoom gestures to open the eye of the screen. The on-board keyboards are an afterthought--they're almost vestigial.

When human beings first learned to communicate with each other, they likely began with hand gestures. The early identification of self with one's hands is evident in cave paintings, which often include numerous impressions of hands outlined in dark paint alongside the less personal images of the animals.

In the last 50 years, we have moved into a different neighborhood, patched into everywhere through networks, and we now hold hands with new companions as we sit on the bench at the train station, checking e-mail or events anywhere in the world through those companions.

If we're looking to take some readings of the heat in the pot, here's a strange scene that might be indicative of something. "A farmer could stand in a field and ask his phone, 'When should I plant my corn?'" The guy offering this scenario is Bernie Meyerson, IBM's vice president of innovation. Talking to a Bloomberg News reporter about a planned pocket-sized version of Watson, Meyerson says, "He [the farmer] would get a reply in seconds, based on location data, historical trends, and scientific studies."

You remember Watson, the IBM program that defeated Ken Jennings and Brad Rutter, the two champion Jeopardy! players. That first version of Watson appeared to have a more comprehensive database of basic information than the TV contestants, and it had a quicker buzzer finger because of its ability to process the question, search its memory, and frame its response in human terms in almost no time. IBM's Meyerson recently explained to Sarah Frier of Bloomberg News that not only is Watson being developed with an impressive financial information database for Citigroup, Inc., and an oncology database for WellPoint, Inc., but the next version of the program, to be called Watson 2.0, will be able to work on both smartphones and tablet computers. "The power it takes to make Watson work is dropping like a stone," he explained. "One day, you will have ready access to an incredible engine with a world knowledge base."

We already talk to our mobile devices. There's Siri on the iPhone, and a similar female voice on Google's Nexus 7 tablet answers questions drawing on information gathered from the largest search engine on the Internet. Now IBM is investing in an effort to put the equivalent of Watson's 10 racks of IBM Power 750 servers in your pocket. Additionally, researchers will be working toward adding image recognition to Watson so that "it can respond to real-world input." By letting Watson "see" through your phone or tablet's camera, it might provide better answers. The unusual way IBM Vice President of Industry Research Katharine Frase described this feature to Bloomberg was, "In 2.0, we hope to give him more senses. A guy could say into his phone, 'Here's where I am, and here's what I see,' lifting it up to take in images of the environment."

We hope to give him more senses?

Speech recognition on mobile devices is nothing new. The Dragon speech program DragonDictate was first demonstrated in the 1990s. Originally, it was slow and required frequent edits and corrections. The latest version, NaturallySpeaking 12, is an amazing program that takes advantage of the much more powerful hardware now available to all and of algorithms that have matured over the last 15 years. You introduce yourself and all your individual speech habits as it creates your own profile in its memory. It continues to learn as you make corrections, and, with time, it gets even better. The iPad has had a free version of Dragon Dictation through all three versions of its hardware, and you can use an Android smartphone as a remote microphone for v. 12 of NaturallySpeaking.


Obviously, the ways we interface with our computers and networks are becoming more human as touch, speech, and images take over from keyboards. But have we arrived at the point where the network is the computer, and do we now look less like nodes floating in that network as our attention is dissolved into the streams? How can we tell without coming up with some definable limits?

Here's a simple test that works with addictive personalities. Pull the plug. Disconnect from your digital universe for a day and see how it goes. Well, maybe keep the computerized circuits in your car so you'll be able to get around. And maybe keep your phone--for emergencies. And since you won't have the Web and all its social connections, maybe keep your cable TV. And the networks for the power grid--I guess you need those. And the computer-controlled filtration systems at the water company--you can't legally shut those down, right? And all those satellite communication systems? Whew, anyone else think it's getting a little warm in here?

By Michael Castelluccio, Editor
COPYRIGHT 2012 Institute of Management Accountants
No portion of this article can be reproduced without the express written permission from the copyright holder.
Copyright 2012 Gale, Cengage Learning. All rights reserved.

Article Details
Title Annotation: TECH FORUM
Author: Castelluccio, Michael
Publication: Strategic Finance
Geographic Code: 1USA
Date: Oct 1, 2012
Previous Article: Nokia Lumia 920.
Next Article: Continuous improvement.
