As some of you may know, over the past several years I've been working in the field of neuroprosthetics. This field has seen a number of startling developments in just the past five years. While auditory prosthetics, in the form of the cochlear implant, have been around for more than 20 years, we have yet to come up with a similar prosthetic for the eye. However, as I reported last June, much progress has been made with the Argus II device by Second Sight.
The cochlear implant consists of a microphone connected to a processor that selects and enhances waveforms associated with speech. This processor then drives electrical stimulation within the cochlea of the inner ear. In this manner, nonfunctioning structures, which may include the eardrum and ossicles (small bones that translate vibrations of the eardrum into movements of the fluid inside the cochlea), are bypassed. However, this prosthetic requires that the cochlea be intact, with normal neural function connecting it to the brain.
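The processing chain described above can be sketched in a few lines of Python. This is a toy illustration only, not Second Sight's or any manufacturer's actual algorithm; the function name, band edges, and electrode count are my own assumptions. The idea is to split the audio spectrum into a handful of frequency bands, one per electrode, and map each band's energy to a stimulation level:

```python
import numpy as np

def cochlear_channels(signal, sample_rate, n_electrodes=8,
                      f_lo=200.0, f_hi=7000.0):
    """Toy sketch of cochlear-implant-style channel processing.

    Splits the audio spectrum into n_electrodes log-spaced bands
    (mimicking the cochlea's tonotopic frequency map) and returns
    the energy in each band, which a real processor would convert
    into stimulation current for the corresponding electrode.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Log-spaced band edges approximate the ear's frequency resolution.
    edges = np.geomspace(f_lo, f_hi, n_electrodes + 1)
    energies = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = (freqs >= lo) & (freqs < hi)
        energies.append(spectrum[band].sum())
    return np.array(energies)

# A pure 1 kHz tone should light up the channel whose band contains 1 kHz.
rate = 16000
t = np.arange(rate) / rate
tone = np.sin(2 * np.pi * 1000.0 * t)
levels = cochlear_channels(tone, rate)
```

A real implant processor does much more than this (compression, speech-band emphasis, noise suppression), but the band-splitting step is the heart of how a continuous sound becomes a small set of electrode stimulation levels.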
Likewise, the Second Sight retinal prosthetic requires intact neurons in the retina, but it can bypass a faulty cornea, iris, lens, or photoreceptors. The Argus II collects input images from a camera, processes them to enhance black-and-white contrast and edges, and converts each image to individual pixels that are then delivered as stimulation to discrete retinal ganglion cells. Like the artificial cochlea, it requires that the normal neural connections into the brain are intact. Neither of these devices can replace a missing or totally damaged ear or eye, much less replace the function of damaged brain structures subserving those sensory functions.
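The camera-to-electrode step can be sketched the same way. Again, this is a toy sketch with hypothetical names, not the Argus firmware: a grayscale camera frame is reduced to a small grid of brightness levels, one level per retinal electrode:

```python
import numpy as np

def to_stimulation_grid(image, grid=8):
    """Toy sketch of Argus-II-style preprocessing: reduce a grayscale
    frame to a grid x grid array of stimulation levels, one per
    electrode in the retinal array."""
    h, w = image.shape
    bh, bw = h // grid, w // grid
    # Crop so the frame divides evenly, then average each block:
    # each block's mean brightness becomes one electrode's "pixel".
    cropped = image[:bh * grid, :bw * grid]
    blocks = cropped.reshape(grid, bh, grid, bw)
    levels = blocks.mean(axis=(1, 3))
    # Normalize to a 0..1 stimulation intensity per electrode.
    span = levels.max() - levels.min()
    return (levels - levels.min()) / (span + 1e-9)

# A frame that fades from dark on the left to bright on the right:
frame = np.tile(np.linspace(0, 255, 64), (64, 1))
stim = to_stimulation_grid(frame)
```

The left-edge electrodes end up near zero stimulation and the right-edge electrodes near full stimulation, which is exactly the kind of coarse light/dark/edge information the post describes patients perceiving.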
|"The Six Million Dollar Man," Harve Bennett Productions, 1974|
The current state of the art in retinal and cochlear prosthetics certainly restores at least a portion of the ability to see and hear. However, these devices are highly dependent on microcomputer processors that take an image or a sound and reduce it to very simple elements before transmitting the stimulus to the residual neural tissue in an existing retina or cochlea. The Argus I retinal stimulator produced only a 4 x 4 matrix of stimulation - only 16 spots on the retina could be stimulated, and the only images that could be detected were light, dark, and edges. The Argus II stimulates an 8 x 8 grid, with newer designs aiming for 100 and 1000 total "pixels." And yet, patients with an Argus II are able to read - one letter at a time - presented full size (>15 inches) on a computer screen. When the difference is total blindness vs. one letter at a time, it is a highly significant difference.

Likewise, sound transmitted to a cochlear implant is filtered and preprocessed to remove background sound and enhance the frequencies and tone of human speech. In speaking with a person who has had an implant since a young age, you encounter none of the distortion of pitch and tone normally associated with the speech of a nonhearing person - much the same as when speaking with a person who lost their hearing as an adult. However, the preprocessing for speech has its price: the cochlear implant is inadequate at reproducing music or the subtle background sounds of nature. In addition, the preprocessor can overload in "party" settings, with too many overlapping voices to effectively isolate single conversations. Yet again, this auditory prosthetic is a significant advance over no hearing at all.
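To get a feel for what the jump from a 4 x 4 to an 8 x 8 grid means for reading letters, here is a toy sketch (hypothetical names, not device code) that average-pools a small block letter down to each resolution, the way the electrode array effectively does:

```python
import numpy as np

def downsample(img, grid):
    """Average-pool a square image down to grid x grid, as a
    stand-in for the electrode array's limited resolution."""
    b = img.shape[0] // grid
    return img[:b * grid, :b * grid].reshape(grid, b, grid, b).mean(axis=(1, 3))

# A 16 x 16 letter "T": a horizontal top bar plus a vertical center stem.
letter = np.zeros((16, 16))
letter[1:3, 2:14] = 1.0   # top bar
letter[3:14, 7:9] = 1.0   # stem

coarse = downsample(letter, 4)   # Argus I resolution: 16 "pixels"
fine = downsample(letter, 8)     # Argus II resolution: 64 "pixels"
```

At 4 x 4, the bar and stem smear together into a few gray blobs; at 8 x 8, the T-shape is still recognizable. That is the difference behind reading one large letter at a time versus detecting only light, dark, and edges.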
So what would it take to have a "real" neural prosthetic that directly interfaces with the brain as in the picture above? The answer to that question can be found in the next installment of The LabRats' Guide to the Brain in which we examine Bionics and Brain-Machine Interfaces!
See you next time!