Monday, April 13, 2015

Tactile Augmented Reality

Dr. Eagleman is an American neuroscientist at the Baylor College of Medicine, Laboratory for Perception and Action. Dr. Eagleman gave a TED talk at the following link:  The talk discusses the principles of sensory perception and how the brain makes sense of and responds to sensory inputs.

The brain is set up to perform pattern recognition and association. It doesn't care how or where the data comes from: give the brain any sensory input and, given some time, it will begin to draw out patterns in the data and then associate those patterns with other patterns. The brain is built to recognize patterns, and to make and then reinforce connections.

Humans have been designed with 5 major senses: sound, sight, taste, touch, and smell. These 5 exclude the inner ear, which is designed to sense orientation and acceleration. There are other refined senses in the animal kingdom, like a bat's echolocation or a pit viper's thermal sense. Birds have magnetite in their heads that helps them orient to the Earth's magnetic field.

For the brain, everything is just electrochemical signals and synapses; the brain doesn't really care what the peripheral input devices are. We see this with the blind and deaf. Many blind people learn braille: instead of seeing words printed on a page and associating meaning with visual symbols, braille readers (blind or sighted) can teach their brains to "read" by associating meaning with tactile bumps on a page instead.

On one hand, scientists have been moderately successful in bionically restoring sight and hearing to the blind and deaf by creating artificial eyes and cochlear implants that stand in for the natural sensory organs. Other experiments have used sensory substitution or augmentation, in which one sense is converted into another. In this way, blind or deaf persons have been able to "see" or "hear" by converting a digital image or audio stream into tactile sensations on the forehead, tongue, or back.

Sensory associations can get a bit messed up too. Synesthesia is when certain people associate particular numbers or letters with a color or shape; people with projecting synesthesia report actually seeing certain numbers as having a color. PTSD is a disabling disorder in which victims experience flashbacks, strong negative emotions, anxiety, and panic in association with certain "trigger" sounds, smells, tastes, or sights. On the positive side, people can have a positive emotional response, where the smell of fresh-baked bread "takes them back" to happy memories of childhood or a vacation.

Dr. Eagleman talked about using a vest that translates sound into tactile stimulation for the deaf. This serves as a cheaper and much less invasive alternative to undergoing a cochlear implant. But the interesting part was the potential adaptation of this "augmented reality" technology for the common man. So far, augmented reality has been envisioned as glasses or contacts that project or overlay information onto our visual field. The downside is that this paradigm may detract and distract from our visual sense.
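The basic idea behind such a vest can be sketched in a few lines: split an audio frame's frequency spectrum into bands and drive one vibration motor per band. This is only a minimal illustration of the concept, not Dr. Eagleman's actual implementation; the motor count and frame length are assumptions.

```python
import numpy as np

def audio_to_motor_levels(samples, n_motors=16):
    """Map one frame of audio to vibration intensities for n_motors.

    Splits the frequency spectrum into n_motors bands and returns one
    normalized intensity (0.0 to 1.0) per band, i.e. per motor.
    """
    spectrum = np.abs(np.fft.rfft(samples))          # magnitude spectrum
    bands = np.array_split(spectrum, n_motors)       # one band per motor
    energy = np.array([band.mean() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy     # normalize to 0..1

# Example: a pure 440 Hz tone sampled at 8 kHz for a 100 ms frame
# should strongly excite only one low-frequency band.
rate = 8000
t = np.arange(rate // 10) / rate
levels = audio_to_motor_levels(np.sin(2 * np.pi * 440 * t))
```

A real system would run this continuously over overlapping frames and send the intensity array to the motor drivers; the brain's pattern-recognition machinery does the rest.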

According to Dr. Eagleman, the visual sense is really very limited. Despite the complexity and volume of data being input to our brains, the brain has to narrowly scan through that ocean of visual data and pick out patterns one at a time.

Instead, Dr. Eagleman suggests augmented reality data could be better fed to us through tactile sensations. The brain receives a deluge of vibratory, touch, and proprioceptive data from our skin, allowing it to know exactly where each part of our body is at any given moment. This spatial sense makes it possible for us to cognitively visualize, project, predict, and make coordinated complex movements.

Dr. Eagleman is of the opinion that augmented reality data may be fed to us more efficiently and less obtrusively through tactile stimulation than visually, using a vest with hundreds, thousands, or potentially millions of vibratory and light-touch stimulators on our back and torso. While our eyes, ears, arms, and legs are always being heavily used, the advantage of the chest, abdomen, and back is that this sensory real estate is not really being used much.

The potential of an augmented reality tactile vest is to unobtrusively deliver needed information: not just turn-by-turn directions (like the Apple Watch's tactile feedback), but also real-time telemetry on the orientation and functioning of a complex machine like an airplane. In this way a pilot could "feel" the status of the airplane at every moment. Similarly, health data could be transmitted through a similar system, allowing nurses to "feel" and continuously monitor the status of their patients.
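The telemetry idea boils down to a fixed mapping from data channels to patches of skin, with each reading rescaled to a vibration intensity. Here is a hedged sketch; the channel names, ranges, and patch labels are all hypothetical placeholders, not a real avionics interface.

```python
# Hypothetical telemetry-to-vest mapping: each channel gets a fixed
# patch of motors, and its value is rescaled to a 0.0-1.0 intensity.
CHANNELS = {
    # name: (motor_patch, expected_min, expected_max) -- illustrative only
    "airspeed_kts": ("upper_back", 0.0, 500.0),
    "altitude_ft":  ("mid_back",   0.0, 40000.0),
    "fuel_pct":     ("lower_back", 0.0, 100.0),
}

def telemetry_to_intensities(readings):
    """Return {motor_patch: intensity} for a dict of channel readings."""
    out = {}
    for name, value in readings.items():
        patch, lo, hi = CHANNELS[name]
        clamped = max(lo, min(hi, value))   # clamp out-of-range readings
        out[patch] = (clamped - lo) / (hi - lo)
    return out

intensities = telemetry_to_intensities(
    {"airspeed_kts": 250.0, "altitude_ft": 10000.0, "fuel_pct": 120.0}
)
```

With a constant mapping like this, the wearer could plausibly learn over time which patch corresponds to which instrument, the same way braille readers learn to associate bumps with letters.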

Are there downsides to this technology? Like any technology, it can be abused and used to manipulate. Imagine a Matrix-like dystopian future with 7 billion people or more all wearing these augmented reality vests. Could central governments be tempted to begin sending unsolicited information and stimulus to users to distort, deceive, modify, and manipulate behavior, emotions, and attitudes?

In the future "internet of things" are we all humans to be plugged into a grid where central governments and corporations begin using people and their processing potential in crowdsourced, distributed computing projects? Will humans together with our appliances, toothbrushes and even forks become yet another "thing" to be controlled through a future global matrix?

Okay, I wasn't really taking this Matrix thing seriously until I just read the following article about a "Human Sat Nav" system controlling movement with electrodes and electrical impulses to the muscles. Is this wearable tech one step closer to the creation of a Borg-like hive and collective? Resistance is futile; you will be assimilated!
