EEG Biocontrol Technology and Applications
German psychiatrist Hans Berger first coined the term electroencephalogram, or EEG, in 1929 to describe the brain's voltage fluctuations as detected from scalp electrodes. Since the 1930s, numerous studies have used the scalp-recorded EEG in efforts to correlate brain activity with sensory and behavioral events. Microelectrode recording techniques introduced in the 1950s and '60s provided another window on brain function, as scientists were able to listen in on the electrical signals directly from single nerve cells. These single-neuron studies, conducted mostly on animals, have assisted in understanding the generators of the EEG signal, as specific electrical components corresponding to functionally different brain regions have been elucidated. Functional maps of the human cerebral cortex have been constructed, and researchers can record EEG signals corresponding to a particular brain area by placing scalp electrodes directly over the area of interest.
The EEG signal arises from a 6-mm-thick sheet of neuronal tissue known as the cerebral cortex. This sheet has a large surface area and is richly convoluted to fit into the relatively small skull cavity. The cortex is subdivided into six layers, stratified according to neuronal cell type, cell density, interconnections with other cortical cells, and connections to other areas of the brain. The pyramidal cells, with their dense apical dendritic processes, are believed to be the source of the EEG voltage fluctuations. The activity of excitatory and inhibitory synaptic endings in the dendritic tree produces current flow in relation to the cell body and other dendritic processes. The cell-dendrite formation is therefore a constantly shifting current dipole, and variations in field strength of the dipole produce fluctuations in the volume conductor of the surrounding tissue, skull, and skin. The summation of electrical activity of this dense forest of cells and dendritic processes produces the voltage waves detectable on the scalp.
In order to use EEG signals directly for a computer interface, it is necessary to find a signal that the user can control or easily learn to control, and this signal must be detectable using scalp electrodes. Currently, there are two general approaches to the development of EEG-computer interfaces. The first is to use a feature of the continuous EEG output, which the user can modify in some reliable way, and the second is to evoke an EEG response with an external stimulus. Let's discuss the continuous EEG features first and come back to the evoked potential research a bit later.
The continuous or resting rhythms of the brain, "brain waves", are categorized by frequency bands. Different brain wave frequencies correspond to behavioral and attentional states of the brain, and a traditional classification system has long been used to characterize these different EEG rhythms:
1. Alpha waves are between 8 and 13 Hz with amplitude in the range of 25-100 µV. They appear mainly from the occipital and parietal brain regions and demonstrate reduced amplitude with afferent stimulation, especially light, and also with intentional visual imagery or mental effort.
2. Beta activity normally occurs in the range of 14 to 30 Hz, and can reach 50 Hz during intense mental activity. Beta arises mainly from the parietal and frontal areas and is associated with the normal alert mental state.
3. Theta waves occur in the 4 to 7 Hz range and arise from the temporal and parietal regions in children, but also occur in adults in response to emotional stress, especially frustration or disappointment.
4. Delta activity includes all brain waves below 3.5 Hz. Delta occurs in deep sleep, during infancy, and in patients with severe organic brain disease.
5. Mu waves, also known as the comb or wicket rhythm, appear in bursts at 9-11 Hz. This activity appears to be associated with the motor cortex and is diminished with movement or the intention to move.
6. Lambda waves are large electropositive sharp or saw-toothed waves that appear mainly from the occipital region and are associated with visual attention.
7. Vertex waves are electronegative waves of 100 µV amplitude which appear in normal individuals, especially children, in the absence of overt stimulation. These waves have been observed to have a higher incidence in patients with epilepsy or other encephalopathy.
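The traditional classification above can be sketched as a simple lookup from a dominant frequency to a band label. This is an illustrative fragment, not part of any system described in the text; the function name and the exact boundary handling are assumptions.

```python
# Sketch: map a dominant EEG frequency (Hz) to the traditional band
# names listed above. Band edges follow the classification in the text;
# boundary handling between bands is an illustrative choice.

def classify_band(freq_hz):
    """Return the traditional band label for a dominant frequency."""
    if freq_hz < 3.5:
        return "delta"
    elif freq_hz < 8:
        return "theta"
    elif freq_hz <= 13:
        return "alpha"   # mu activity (9-11 Hz) overlaps this range
    else:
        return "beta"

print(classify_band(10))   # alpha (mu, if recorded over the motor cortex)
print(classify_band(20))   # beta
```

Note that the bands overlap in practice (mu sits inside the alpha range and is distinguished by scalp location, not frequency alone), so a real system must also consider which electrodes the activity comes from.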
For some of these rhythms, it is possible to change the amplitude of the wave with certain intentional use of imagery or focus of attention, as well as physical movement or speech. The alpha and mu waves are currently being studied for brain signal interfaces because they can be volitionally manipulated.
A mu wave interface system is being developed by Jonathan Wolpaw and Dennis McFarland of the New York State Department of Health's Wadsworth Center in Albany. Their system detects the change in amplitude of the mu wave. The mu wave has been studied since the 1930s and came to be referred to as the "wicket rhythm" because the rounded waves on the EEG record resembled a croquet wicket. In a study in the 1950s, Gian Emilio Chatrian and colleagues showed that the amplitude of this wave could be suppressed by physical movements, and later studies showed that simply the intent to move, or certain other efforts requiring visual or mental activity, would also suppress the amplitude of the mu wave. In Wolpaw and McFarland's lab, subjects can learn to control the amplitude of this waveform by trial and error while visualizing various motor activities, such as smiling, chewing, or swallowing. For different subjects, different images enhance or suppress the voltage of the mu waveform. Upon detection of the voltage change in the mu wave, the system sends output code to drive a cursor up or down on a computer screen. Thus, with a certain amount of feedback training, users can learn to move the cursor with the appropriate mental effort. The researchers hope that this system will eventually provide a communications link for profoundly disabled individuals.
Alpha waves can also be volitionally manipulated. Alpha activity appears with eye closure or defocused attention and is suppressed by light or normal attentive activity. Thus, most people can learn to produce bursts or "epochs" of alpha activity and then return to normal beta activity. This behavioral "switch" between beta and alpha activity can be used as the mental command for a brain wave controller. When the signal processor detects the alpha epoch, by using an FFT to detect the change in the fundamental frequency of the brain rhythm, an instruction is sent to control an output device.
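The FFT-based switch just described can be sketched in a few lines: compute the spectrum of a short EEG window, sum the power in the alpha and beta bands, and treat alpha dominance as the switch closing. The sampling rate, window length, and dominance ratio below are illustrative assumptions, not parameters from any system in the text.

```python
import numpy as np

# Sketch of the alpha-epoch "switch": an FFT of a short occipital EEG
# window yields band powers, and alpha (8-13 Hz) dominance over beta
# (14-30 Hz) is treated as the switch closing. Values are illustrative.

FS = 128  # assumed sampling rate, Hz

def band_power(window, lo, hi):
    """Total spectral power between lo and hi Hz."""
    power = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    return power[(freqs >= lo) & (freqs <= hi)].sum()

def alpha_switch(window):
    """True when alpha power clearly dominates beta power."""
    return band_power(window, 8, 13) > 2 * band_power(window, 14, 30)

t = np.arange(FS) / FS                      # one second of samples
eyes_closed = np.sin(2 * np.pi * 10 * t)    # strong 10 Hz alpha rhythm
eyes_open = np.sin(2 * np.pi * 20 * t)      # beta-dominated activity
print(alpha_switch(eyes_closed), alpha_switch(eyes_open))  # True False
```

The same band-power computation, applied to the 9-11 Hz range over motor-cortex electrodes, would serve for the mu-amplitude detection in the Wadsworth system described above.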
In 1987, the authors (Lusted and Knapp) demonstrated an EEG controller configured to switch settings of a music synthesizer. Music was chosen for the controller's output because sound provided a good demonstration of the real-time capabilities of this technology. Wearing a headband that positioned electrodes on the back of the head to detect the occipital alpha activity, users controlled a switch that responded to the transitions between beta and alpha epochs. More recently, composer Atau Tanaka of the Stanford Center for Computer Research in Music and Acoustics has used this EEG controller in his performance pieces to switch certain synthesizer functions while generating sounds using EMG signals.
Another recent application of the EEG-alpha interface is as a controller for visual keyboard software. In Brazil, Roberto Santini is using a Biomuse system configured to provide him with the EEG switch, since he is immobilized with advanced ALS (amyotrophic lateral sclerosis) and cannot make use of his eye movements to use the EOG controller. With the EEG controller interfaced to the mouse port of his personal computer, Roberto can select letters from the visual keyboard on the screen. The selection process is somewhat laborious because each choice is binary. The word processing software allows him to zoom in on a given letter by dividing the screen in half. Thus, starting from the full keyboard, as many as six steps may be required to move down the branching pattern in order to select a desired letter. Roberto now writes complete letters and is pleased that he can again communicate with others.
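The halving procedure above is a binary search over the keyboard, so reaching one key among n takes about log2(n) switch activations. The sketch below is illustrative, not the actual keyboard software: it halves a list of keys until one remains and counts the steps. With 26 letters this takes five steps; a full keyboard with digits and punctuation would need the six steps mentioned above.

```python
import string

# Sketch of binary "zoom" selection: each switch activation halves the
# remaining keyboard until a single key is left. The list layout and
# helper function are illustrative, not the actual software's design.

def select_letter(keys, target):
    """Halve the candidate list until one key remains; count the steps."""
    steps = 0
    while len(keys) > 1:
        mid = len(keys) // 2
        left, right = keys[:mid], keys[mid:]
        keys = left if target in left else right
        steps += 1
    return keys[0], steps

letter, steps = select_letter(list(string.ascii_lowercase), "r")
print(letter, steps)   # r 5
```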
Currently, the authors and a few other researchers, notably a group headed by Akira Hiraiwa at the NTT Laboratories in Japan, are continuing development of EEG controllers by using pattern recognition algorithms in an attempt to detect signature patterns of EEG activity that correspond to volitional behaviors. The eventual aim is to develop a vocabulary of EEG signals that are recognizable by the computer. The process of pattern recognition is similar to that used for EMG gesture recognition; in this case, the "gesture" is a thought pattern or type of visualization. For instance, attempts have been made to train a neural network to recognize subvocalized letters: subjects think a particular letter as though about to speak it, and over many repetitions the neural net is trained to recognize a brain wave pattern that occurs with this behavior. This is a promising technique, but a laborious training period is required to obtain a high percentage of accuracy in matching letters with brain wave patterns.
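The vocabulary idea can be illustrated at its simplest: reduce each "thought gesture" to a feature vector (for example, band powers per electrode) and match it against stored templates. The systems described above use trained neural networks; the nearest-template matcher below is a deliberately simplified stand-in, and the template values are synthetic.

```python
import numpy as np

# Simplified sketch of EEG "vocabulary" recognition: each thought
# pattern is reduced to a feature vector and matched to the nearest
# stored template. Real systems use trained neural networks; the
# templates and test vector here are synthetic, for illustration only.

templates = {
    "A": np.array([0.9, 0.1, 0.3]),   # averaged training features per letter
    "B": np.array([0.2, 0.8, 0.4]),
    "C": np.array([0.1, 0.2, 0.9]),
}

def recognize(features):
    """Return the label whose template is closest (Euclidean distance)."""
    return min(templates, key=lambda k: np.linalg.norm(templates[k] - features))

print(recognize(np.array([0.85, 0.15, 0.25])))  # A
```

The laborious part in practice is collecting enough repetitions per letter that the templates (or network weights) generalize across the natural variability of the EEG.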
As mentioned earlier, another approach to development of an EEG-computer interface involves the use of an evoked potential (EP) paradigm. Evoked potentials are produced by activating a sensory pathway with a particular type of stimulus, such as a flash of light or a noise burst, and then recording a characteristic waveform from the brain at a particular time interval after the stimulus presentation. Since the characteristic evoked waveform appears at a specific time after the stimulus, researchers can discriminate between the EP and the noise because they know its temporal location in the post-stimulus EEG recording. Other electrical activity which occurs before and after the EP latency window can be ignored.
Eric Sutter at the Smith-Kettlewell Institute in San Francisco has developed a visual EP controller system for physically handicapped users. The user can select words or phrases from a matrix of flashing squares on a computer screen. The flashing square upon which the user is fixating his or her gaze produces a characteristic EP from a particular portion of the visual cortex, and since the amplitude of the EP produced from the foveal portion (point of maximal acuity) of the retina is much larger than the response from surrounding retinal areas, the computer can discriminate which word square the user is watching at any given time. Dr. Sutter has implanted electrodes under the scalp to improve the quality of the EEG signal in these patients. Also, this eliminates the need to put on scalp electrodes for each test session, since the patients simply "plug in" their transdermal connection to interface with the computer.
While progress is being made toward a practical EEG-computer interface, there are still many elements of this technology that prevent it from being user friendly. At present, perhaps the most cumbersome factor is the need for scalp electrodes, which require an electrolyte gel for electrical conductivity and as little hair as possible. Users with normal hair have to deal with electrode prep before use and hair cleaning after use. Arthur C. Clarke proposed an excellent solution to the hair problem in a recent science fiction short story: in his futuristic world, all computer users are bald in order to use the skull cap connection for the brain-computer interface.
The scalp electrodes may always be the limiting factor in the resolution of an EEG-computer interface. There is probably much electrical activity concomitant with thought patterns and sensory images in the brain, but the fine resolution of this activity is not detectable with surface electrodes. One area of research needs to be directed toward the development of better electrode technology. The electrode sensors need to be convenient to use and inexpensive. Ideally, the electrode elements would be inconspicuous and as easy to wear as clothing.
Another difficulty is that the EP systems are quite slow. The EP must be derived by signal averaging; that is, multiple repetitions of the evoked response must be accumulated in order to see the EP signal above the noise. In the case of Dr. Sutter's system, 1.5 seconds is required to discriminate the selection of a particular letter from the alphabet array. The continuous EEG interface systems have faster switch functions because the change in alpha or mu wave amplitude can be detected more quickly.
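Signal averaging works because the EP waveform is fixed in time relative to the stimulus while the background EEG is not, so averaging n time-locked trials cancels the noise (the signal-to-noise ratio grows roughly with the square root of n). The sketch below demonstrates this with a synthetic EP and synthetic noise; the waveform shape, noise level, and trial count are illustrative assumptions.

```python
import numpy as np

# Sketch of evoked-potential signal averaging: the EP is time-locked to
# the stimulus, the background EEG is not, so averaging many trials
# recovers the EP from the noise. All values here are synthetic.

rng = np.random.default_rng(0)
t = np.linspace(0, 0.5, 128)                  # 500 ms post-stimulus window
ep = np.exp(-((t - 0.3) ** 2) / 0.002)        # synthetic EP peak at 300 ms

# 64 stimulus repetitions, each buried in background "EEG" noise
trials = [ep + rng.normal(0, 2.0, t.size) for _ in range(64)]
average = np.mean(trials, axis=0)

single_err = np.abs(trials[0] - ep).mean()    # one trial: EP invisible
avg_err = np.abs(average - ep).mean()         # average: EP recovered
print(avg_err < single_err)                   # True
```

The cost of this noise reduction is exactly the slowness noted above: every selection must wait for enough repetitions to accumulate.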
BioControl Systems
PO Box 19596
Stanford, CA 94309
Phone 707-824-9703
Email:
Ben: knapp@biocontrol.com
Hugh: hugh@biocontrol.com