The idea of "virtual" reality—immersive computer simulations almost indistinguishable from reality—has been a mainstay of modern "cyberpunk" science fiction since the early 1980s, popularized in movies such as The Thirteenth Floor and The Matrix. Typically, a virtual reality environment produces computer simulated sensory inputs which include at least sight and sound, and, perhaps, touch, taste and smell. These inputs are presented to the user through goggles, earphones and gloves or—in the true cyberpunk sci-fi-via direct brain interfaces.
Although the idea of virtual reality captured the public's imagination, attempts to achieve a true virtual reality have been disappointing. The brain's perception of reality relies on a complex synthesis of multi-sensory information, and almost all virtual reality environments create a mismatch between the inner ear's report of movement and the visual input. The result can be "cyber sickness," a special form of motion sickness. Furthermore, the computing power and program complexity required to produce a realistic simulation are generally out of proportion to the benefits of the simulation.
"Augmented reality" appears to be both more achievable and more useful. Rather than creating a virtual reality from scratch, in augmented reality, we overlay digital information on top of the real world. "Head-up" displays in military aircraft are an example of augmented reality—they impose digital information on top of real world sensory data.
The Apple iPhone and similar devices may create a tipping point for augmented reality applications. Because the iPhone combines a GPS receiver, compass, internet connection and camera, it's possible to point the device at an object, have the iPhone determine what is at that location and overlay useful information on the camera image. For instance, you could point the iPhone at a store and see information about items currently available there. You might point the camera at a train station and have the schedule and destinations overlaid. As facial recognition improves, you could conceivably point your iPhone at people and retrieve their personal details.
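To make the idea concrete, here is a minimal sketch, in Python rather than any actual iPhone framework, of the core lookup step such an application might perform: given the GPS position and compass heading reported by the device, decide which nearby points of interest fall within the camera's field of view. The point-of-interest list, the visible_pois function and the 60-degree field of view are all hypothetical illustrations, not a real API.

    import math

    # Hypothetical database of points of interest (POIs); a real
    # application would fetch these over the network.
    POINTS_OF_INTEREST = [
        {"name": "Train Station", "lat": 40.7527, "lon": -73.9772},
        {"name": "Bookstore",     "lat": 40.7531, "lon": -73.9810},
    ]

    def bearing_to(lat1, lon1, lat2, lon2):
        """Initial compass bearing, in degrees, from point 1 to point 2."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
        return math.degrees(math.atan2(y, x)) % 360

    def visible_pois(lat, lon, heading, fov=60):
        """Return (name, bearing) for each POI inside the field of view."""
        hits = []
        for poi in POINTS_OF_INTEREST:
            b = bearing_to(lat, lon, poi["lat"], poi["lon"])
            # Smallest angular difference between the camera heading
            # and the bearing to the POI.
            diff = min(abs(b - heading), 360 - abs(b - heading))
            if diff <= fov / 2:
                hits.append((poi["name"], round(b)))
        return hits

    # Standing south-west of both POIs, facing roughly north-east:
    print(visible_pois(40.7500, -73.9800, heading=30))

A real application would, of course, draw labels over the live camera frame; the geometry above is simply the core of deciding what to label.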
We don't often consider that Moore's Law, which predicts a doubling of computing power every 12 to 18 months, applies just as much to our mobile devices as it does to desktop and server computers. In coming years, mobile phones (mobile computers, really) are going to have astonishing processing, memory and storage capacities: more than enough to provide us with whatever augmentation we might desire.
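As a back-of-the-envelope illustration (a sketch of the arithmetic, not a forecast), the compounding implied by that doubling schedule is easy to work out:

    # Growth implied by Moore's Law: total multiplier after t months,
    # given a doubling period of d months (the 12-18 month range above).
    def capacity_multiplier(t_months, doubling_months):
        return 2 ** (t_months / doubling_months)

    # Over a decade, an 18-month doubling period compounds to roughly
    # 100x the starting capacity...
    print(capacity_multiplier(120, 18))  # ~101.6
    # ...and a 12-month period to 1024x.
    print(capacity_multiplier(120, 12))  # 1024.0

Either way, a decade of compounding puts two to three orders of magnitude more capacity in your pocket.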
New interface devices are emerging that will combine with the increasingly powerful and ubiquitous mobile computing available in mainstream phones to further accelerate the capabilities of augmented reality. Companies such as Myvu and Vuzix produce video "sunglasses" with iPod and iPhone interfaces. Currently, they merely place a sort of virtual TV screen in the user's visual field, but it's not hard to imagine similar glasses providing augmented reality overlays. Indeed, such technologies already have been developed for military and medical applications.
Micro cameras mounted in your clothing or glasses could let your phone determine what you are looking at. Commercial brainwave and eye-movement interfaces already exist that might let you control these displays with, literally, a thought, a glance or a blink of an eye.
The idea of walking down the street and having pop-up windows identify places and people of interest around you might seem far-fetched, and perhaps not even desirable. But we are clearly entering an age in which the technical capability for augmented reality exists. If we want an augmented reality, we can have it.