Researchers from Washington University School of Medicine in St. Louis have recently demonstrated that a computer can decode brain signals to infer the intentions behind a person's thoughts. From now on, people can “talk within their minds” to a computer in order to control it. How is this act of mind reading, usually associated with science-fiction literature, actually possible?
Brain-computer interfaces, in this case temporary surgical implants, detect activity in the brain’s motor networks, which control muscle movements. By directly connecting a patient’s brain to a computer, the scientists showed that humans can control a cursor on a computer screen using words spoken out loud or in their head. Patients were able to control a computer with up to 90% accuracy by saying or thinking a particular sound. Using these implants, the scientists analyzed the frequency of brainwave activity and identified the brainwave patterns that represented the following sounds: oo, e, a (as in ‘say’) and a (as in ‘hat’). The interface is programmed to detect when patients say or think of those sounds, so patients can quickly learn to control a computer cursor by saying or thinking the appropriate sound.
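The researchers’ actual decoding pipeline is more involved than this article can cover, but the rough idea can be sketched in a few lines of code: extract frequency-band features from a short window of brain-signal data, match them against templates of the four sounds, and translate the detected sound into a cursor command. Everything below (sampling rate, band edges, the nearest-centroid classifier, the sound-to-command mapping) is an illustrative assumption, not the method used in the study.

```python
# Hypothetical sketch: frequency-band features -> detected sound -> cursor command.
import numpy as np

SAMPLE_RATE = 1000          # Hz, assumed sampling rate of the implant
SOUND_COMMANDS = {"oo": "left", "ee": "right", "ay": "up", "ae": "down"}  # assumed mapping

def band_power(window, low, high):
    """Average spectral power of the signal between `low` and `high` Hz."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def features(window):
    """Power in a few canonical frequency bands (band edges are assumptions)."""
    bands = [(8, 12), (13, 30), (30, 70), (70, 170)]  # alpha, beta, low/high gamma
    return np.array([band_power(window, lo, hi) for lo, hi in bands])

def classify(window, templates):
    """Nearest-centroid match against per-sound feature templates
    recorded during a calibration session."""
    feats = features(window)
    return min(templates, key=lambda sound: np.linalg.norm(feats - templates[sound]))

def cursor_command(window, templates):
    """Translate the detected sound into a cursor movement."""
    return SOUND_COMMANDS[classify(window, templates)]
```

In this toy version, `templates` would be built by averaging the feature vectors recorded while the patient repeatedly says or imagines each of the four sounds; the real system reaches its reported accuracy with far more sophisticated signal processing.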
The development of such brain-computer interface technology has significant implications for disabled patients: it could restore lost mobility for those with limited movement, and it could enhance the ability of patients who have lost their speech to interact and communicate with their environment. According to Eric Claude Leuthardt, assistant professor of neurosurgery, of biomedical engineering and of neurobiology, those patients could engage the implant to move a robotic arm through the same brain areas they once used to move an arm disabled by injury.
“We can distinguish both spoken sounds and the patient imagining saying a sound, so that means we are truly starting to read the language of thought,” says Leuthardt. “This is one of the earliest examples, to a very, very small extent, of what is called ‘reading minds’ - detecting what people are saying to themselves in their internal dialogue.”
The discovery is fully described by the researchers in their papers “Using the electrocorticographic speech network to control a brain–computer interface in humans” and “Evolution of brain-computer interfaces: going beyond classic motor physiology”. This revolutionary technology could soon be applied in conversational virtual humans. Not only will they be able to read what we write, hear what we say, and detect what we feel, but they will also be able to recognize our thoughts and understand them! ...and they say that artificial intelligence isn’t making progress.