Brain–Computer Interfaces: Brainy Connections

Smart Lifestyle


Computers can do many things far better than the human brain, but there are still tasks we perform easily that remain impossible for computers to accomplish. What if the two systems could cooperate seamlessly? Smart Industry takes a look at some of the amazing developments in brain–computer interfaces.

by Rainer Claassen

Since signals in the human brain are transmitted electrically, the related activity can be measured using technologies such as functional magnetic resonance imaging (fMRI) – which requires very large and expensive machines. As there is no sign that this technology will become more accessible in the near future, neuroscientists are exploring alternatives.
One existing alternative is to apply sensors to the scalp. This method, called electroencephalography (EEG), has been known since the late 19th century; it can capture the overall state of the brain as well as activity in its different regions. EEG enables scientists to find out which regions of the brain are active during different kinds of activities – resulting in maps with detailed information about the functions of different regions of the brain.
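In practice, the regional activity EEG reveals is often summarized as power in standard frequency bands (alpha, beta, and so on). The sketch below is a generic illustration using NumPy and SciPy, not any particular vendor's toolkit: it estimates alpha-band (8–12 Hz) power per channel from a synthetic multichannel recording.

```python
import numpy as np
from scipy.signal import welch

fs = 256                            # sampling rate in Hz
rng = np.random.default_rng(0)
t = np.arange(fs * 4) / fs          # four seconds of data

# Synthetic 4-channel EEG: white noise everywhere, plus a strong
# 10 Hz (alpha-band) rhythm added to channel 0.
eeg = rng.normal(0.0, 1.0, (4, t.size))
eeg[0] += 2.0 * np.sin(2 * np.pi * 10 * t)

def band_power(signals, fs, lo, hi):
    """Mean power spectral density within [lo, hi] Hz, per channel."""
    freqs, psd = welch(signals, fs=fs, nperseg=fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[:, mask].mean(axis=1)

alpha = band_power(eeg, fs, 8, 12)  # one alpha-power value per channel
```

Ranking channels by band power like this is the basic building block behind the activity maps described above.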

Brain–Computer Interfaces - Neurable DK1 system - source ©: Neurable

Look, No Hands! Neurable’s DK1 system uses six dry EEG sensors and can be fitted in minutes, allowing users to move within VR worlds by thought control alone.

Over time, the technology is becoming more accessible. In Cambridge, Massachusetts, for example, Neurable has developed an EEG system that is relatively simple to use. Applying a standard “wet electrode” EEG to a human skull takes more than an hour, using gels to optimize electrode contact, but Neurable’s dry system is attached to a virtual reality headset and can be fitted within minutes. The company claims its DK1 system is noninvasive, quick to set up, and easy to use. The headset uses six dry EEG sensors, which the company says achieve more than 90 percent correlation with wet systems, and includes continuous impedance and signal-quality monitoring.
The company says it is its know-how in pattern recognition and machine learning that allows the DK1 to return stunning results from this rather simple setup. In a demonstration at Augmented World Expo 2019 in Munich, Germany, Neurable showcased a person interacting with virtual items displayed through the VR headset by thought alone. The user was even able to move within the VR setting without using handheld controllers. Neurable’s clients include architects and interior designers, who find it especially interesting that the headset can receive direct feedback from the user’s brain. Without the need for written surveys, they can detect how each user feels about their virtual surroundings – which helps to create places in which people feel good and do better work.

Brain–Computer Interfaces - OpenWater - source ©: Openwater

Window to the Brain: Openwater’s new headset resembles a beanie hat but contains near-infrared light emitters that measure blood flow in the brain. Originally intended to help diagnose brain damage, it could one day enable thought reading.

Neurable’s software tools enable integration with Unity, C++, and C# development environments, and the company also offers data export capabilities and a web portal for 3D data visualization and post-session analysis.

Sooner or later, we will be able to read your thoughts.
Mary Lou Jepsen, Openwater

Another example of this technology entering the mass market is a meditation headset produced by Muse. The company claims that the device, available from about €170, can translate brainwaves into sounds. It aids meditation by giving audible feedback when rising brain activity is detected. This can help users to get into a state of deep relaxation as they learn how to control the sound.
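The feedback loop behind such a device is conceptually simple: track a running baseline of the user's measured brain activity and make the audio louder as activity rises above it. The class below is only a generic neurofeedback sketch, not Muse's actual algorithm; the smoothing factor and the scalar "activity" input are illustrative assumptions.

```python
class Neurofeedback:
    """Toy audible-feedback loop: volume rises with brain activity.

    A sketch of the general neurofeedback idea, not any vendor's
    real algorithm. `activity` is an arbitrary scalar activity index.
    """

    def __init__(self, alpha=0.1):
        self.alpha = alpha      # smoothing factor for the running baseline
        self.baseline = None

    def step(self, activity):
        if self.baseline is None:
            self.baseline = activity
        # Exponential moving average tracks the user's resting level.
        self.baseline += self.alpha * (activity - self.baseline)
        # Feedback volume in [0, 1]: silent at or below baseline.
        return max(0.0, min(1.0, activity - self.baseline))
```

Because the baseline slowly follows the user, sustained calm drives the feedback back toward silence – which is what lets users "learn how to control the sound."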

Brain–Computer Interfaces: Let There Be Light

Increases in the brain’s oxygen consumption can also reveal neuron activity, an approach currently being investigated at Facebook Reality Labs (FRL). Neurons consume far more oxygen from the blood when they are active, and near-infrared light can pass through the skull and back, allowing blood oxygenation in the brain to be measured noninvasively from outside the body – much as a pulse oximeter, the clip-like sensor attached to a patient’s finger, measures blood oxygen levels. Shifts in oxygenation thus give hints about current brain activity. At Facebook’s lab, researchers are experimenting with a portable, wearable device made from consumer-grade parts, with an eye on mass production.
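Quantitatively, this kind of measurement rests on the modified Beer–Lambert law: light at two wavelengths is attenuated differently by oxygenated and deoxygenated hemoglobin, so optical-density changes at both wavelengths can be inverted into concentration changes. The sketch below is illustrative only – the extinction coefficients are placeholder values, not calibrated constants, and this is not FRL's actual pipeline.

```python
import numpy as np

# Illustrative extinction coefficients (rows: wavelengths ~760 nm and
# ~850 nm; columns: [oxy-Hb, deoxy-Hb]). Placeholder numbers, not
# calibrated physical constants.
EPSILON = np.array([[0.6, 1.5],    # ~760 nm: deoxy-Hb absorbs more
                    [1.2, 0.8]])   # ~850 nm: oxy-Hb absorbs more

def hemoglobin_change(delta_od, path_length=1.0):
    """Modified Beer-Lambert law: invert optical-density changes at two
    wavelengths into concentration changes of oxy- and deoxy-hemoglobin."""
    return np.linalg.solve(EPSILON * path_length, delta_od)
```

With two wavelengths and two unknowns, the inversion is just a 2×2 linear solve per measurement.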
Facebook’s researchers have an ambitious goal: to convert thought into text at a real-time decoding speed of 100 words per minute, with a 1,000-word vocabulary and a word error rate of less than 17 percent. To benchmark their progress, the researchers are collaborating with a lab at the University of California, San Francisco, that uses invasive technology – a small patch of tiny recording electrodes temporarily placed on the surface of seizure patients’ brains to map the origins of their seizures in preparation for neurosurgery.
First results are promising: brain activity recorded while people spoke has been converted to text on a computer screen. A small set of spoken words and phrases was decoded in real time – a first in the field of brain–computer interface (BCI) research – and the ongoing work aims to translate much larger vocabularies with dramatically lower error rates.
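The "word error rate" in that goal is the standard metric from speech recognition: the minimum number of word substitutions, insertions, and deletions needed to turn the decoded text into the reference transcript, divided by the reference length. A minimal implementation:

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + insertions + deletions) / reference length,
    computed with the classic dynamic-programming edit distance over words."""
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i                       # delete all reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j                       # insert all hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return d[-1][-1] / len(ref)
```

For example, decoding "turn the light on" as "turn light on" is one deletion in four reference words, a WER of 25 percent – already above Facebook's stated 17 percent target.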
There is still a lot of progress needed within the algorithms and hardware before this will lead to Facebook’s aim of producing an affordable headset that will allow people to dictate with the force of thought alone.
Facebook is not the only company exploring this technology. Silicon Valley hardware engineer Mary Lou Jepsen recently founded Openwater. The company plans to build a headset that resembles a beanie hat to house the near-infrared light emitters for measuring blood flow. Openwater is currently focusing on diagnosing brain injuries and neurodegenerative diseases, but Jepsen believes the technology could be used to read thoughts – sooner or later.
Jepsen’s assumption is supported by experiments performed by Professor Jack Gallant at the University of California, Berkeley, eight years ago. With the help of fMRI, he scanned the brain activity of people as they watched video clips. After analyzing the patterns evoked by different footage, a computer was able to process the activity patterns in the brain and generate images that bore a stunning resemblance to the original videos.

Come Inside

Brain experts often compare noninvasive methods of investigating brain activity to listening to the noise of a crowd from outside a stadium. You may be able to tell when goals are scored, and perhaps deduce which team is winning from the loudness of the reactions – but you can hardly discern any other details of the game.
To learn more, you have to go inside the stadium – and place many microphones at different spots. For the brain, this means getting sensors inside the skull. But measuring the activity of single neurons is difficult even when working with the larger cells of primitive animals in a laboratory – and extremely complicated in the brains of living humans. Very small sensors have to be placed very precisely, and they need to stay in place in conditions comparable to a jungle by the sea: hot, humid, and salty. A tough environment for technology.

Brain–Computer Interfaces - Synchron device - source ©: Synchron

Inside Out: Synchron has developed so-called “Stentrodes” that are implanted in the blood vessels of the brain and can gather detailed information that can then be transmitted wirelessly to an output device.

These conditions don’t deter everyone: Synchron, partnering with Australia’s University of Melbourne, is working on a stent-like device studded with electrodes. Inserted via a small incision in the neck, this “Stentrode” is guided through blood vessels that overlie the brain.

Working from Within

Once in the right location, it expands from the size of a matchstick to fit the vessel and tissue grows into its mesh, keeping it in place. The device is designed to record from multiple locations through the numerous sensors positioned along and around it.
The company claims that human trials of the Stentrode are due to start this year. The device does not make direct contact with single neurons but can gather more detailed information than systems working from outside the skull. The signals it detects are transmitted wirelessly to an output device carried in the subject’s pocket.
Implant specialist NeuroPace is currently using a responsive neurostimulation (RNS) system on seizure patients. It consists of a neurostimulator implanted on the inner surface of the skull, with tiny wires connecting it to up to two seizure-onset areas. The system monitors brainwaves, detects signal patterns that are typical for the onset of a seizure, and responds in real time by sending brief pulses that prevent the seizure from developing further. A data collector can wirelessly acquire data from the stimulator, helping medics understand the causes of the seizures and improve care.
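One simple onset feature widely used in the seizure-detection literature is "line length" – the sum of absolute sample-to-sample differences, which jumps when the signal becomes faster and larger. The sketch below is a generic illustration of threshold-based onset detection, not NeuroPace's proprietary algorithm; the window size and threshold rule are assumptions.

```python
import numpy as np

def line_length(window):
    """Sum of absolute sample-to-sample differences: a cheap feature
    that grows as the signal gets faster and larger at seizure onset."""
    return np.abs(np.diff(window)).sum()

def detect_onset(signal, fs, window_s=1.0, threshold=None):
    """Flag windows whose line length exceeds a baseline-derived threshold."""
    n = int(fs * window_s)
    windows = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
    lls = np.array([line_length(w) for w in windows])
    if threshold is None:
        # Assumed rule: twice the mean of the first few (baseline) windows.
        threshold = lls[:3].mean() * 2.0
    return np.nonzero(lls > threshold)[0]
```

In a real implanted device, a detection like this would trigger the brief stimulation pulses described above within a fraction of a second.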
More sophisticated solutions could soon lead to the possibility of “inserting” thoughts into brains – by stimulating different regions of the brain, scientists have been able to activate certain images and thoughts.

Optical Signals

Researchers at the Massachusetts Institute of Technology (MIT) have developed a completely different approach to measuring electrical activity in the brain. They have embedded light-sensitive proteins into neuron membranes. The proteins emit a fluorescent signal that indicates how much voltage a particular cell is currently experiencing.
This could allow scientists to study how neurons behave, millisecond by millisecond, as the brain performs specific functions.
Edward Boyden, an associate professor of biological engineering and brain and cognitive sciences at MIT, explains: “If you put an electrode in the brain, it’s like trying to understand a phone conversation by hearing only one person talk. Now we can record the neural activity of many cells in a neural circuit and hear them as they talk to each other.”
MIT is now trying to measure brain activity in mice as they perform various tasks. Boyden hopes this will result in maps of neural circuits and help to explain how they manifest specific behaviors. “We will be able to watch a neural computation happen,” he says. “Over the next five years or so we’re going to try to solve some small brain circuits completely. Such results might mean a big step forward to understanding what a thought or a feeling really is.”

Big Promises with Brain–Computer Interfaces

Since the topic leaves room for much futuristic fantasy as well as great opportunities, it is no surprise that Elon Musk (Tesla) is involved too. In July 2019 he outlined plans to connect human brains directly to computers through his company Neuralink, describing a campaign to create “symbiosis with artificial intelligence” and announcing that a first prototype would be implanted in a human by the end of 2020. It involves microfibers that can record and stimulate the activity of up to 1,000 neurons.

We are slowly beginning to understand what a thought really is.
Edward Boyden, MIT Department of Biological Engineering

Because Musk personally fears that artificial intelligence may eventually consider humans no longer necessary, he hopes to enable people to “merge” with AI – and he expects a high-bandwidth brain interface to provide the means of achieving this.
The California-based entrepreneur is even on record saying that the infrastructure in Neuralink’s system could become so simple it wouldn’t need expensive neuroscientists to implant and maintain it – thus making the implantation of the interface relatively cheap. “I really think you will one day be able to repay the loan for such a procedure with superhuman intelligence. I think that’s a safe bet,” he argues.
Some scientists are less optimistic, doubting that the grand announcements will lead to real-life outcomes any time soon. To most brain–computer interface experts, neuroscience is a work in progress involving many different disciplines: materials science, neuroscience, machine learning, engineering, design, and more. They see no shortcuts around clinical trials and regulatory approval.


Theoretical physicist Michio Kaku has said that the human brain is the most complex object in the known universe, and many scientists agree. Science has already discovered a lot about the way it works – but is still far from actually understanding it. Weighing less than three pounds, the brain boasts some impressive figures. Making up only two percent of an adult’s weight, it consumes about 20 percent of the energy a person needs. Almost a hundred billion neurons work within it, and each of these cells may be connected to up to 10,000 others. This adds up to as many as a thousand trillion synaptic connections through which signals are transferred. Each neuron has many tentacle-like projections: numerous dendrites and a single axon – a long, slender nerve fiber that transmits information to other neurons, muscles, and glands.

Brain–Computer Interfaces - Infobox Neural Networks - source ©: Onlinezeitung24

The rather short dendrites, usually less than a millimeter long, receive electrical signals transmitted from neighboring neurons via their axons, which can be up to a meter in length. In addition to the trillions of connections within the brain, there are many more connecting to the sensory cells of the body.

Although brains work quite differently from computers, some comparisons can be made. The memory capacity of a brain is estimated at between one and 1,000 terabytes, with a computing capability equal to a computer with a one-trillion-bit-per-second processor. Compare that to Hewlett Packard’s recently announced single-memory computing system with 160 terabytes – currently the world’s largest – and you see that computers still have a way to go before they catch up.

The same goes for raw computing power. The fastest supercomputer in the world, the Tianhe-2 in Guangzhou, China, has a maximum processing speed of 54.902 petaFLOPS; a petaFLOP is a quadrillion (one thousand trillion) floating-point calculations per second. That is a huge number of calculations, yet it doesn’t come close to the processing speed of the human brain. Although impossible to calculate precisely, the human brain is postulated to operate at 1 exaFLOP, equivalent to a billion billion calculations per second.
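The arithmetic behind that comparison is easy to check. Taking the figures quoted above at face value, the brain's estimated 1 exaFLOP is roughly eighteen times the Tianhe-2's peak speed:

```python
PETA = 1e15
brain_flops = 1e18               # ~1 exaFLOP (order-of-magnitude estimate)
tianhe2_flops = 54.902 * PETA    # peak petaFLOPS quoted in the text

ratio = brain_flops / tianhe2_flops
print(round(ratio, 1))           # roughly an 18x gap
```

These are of course order-of-magnitude estimates on the brain side, so the ratio indicates scale rather than a precise measurement.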


Brain–Computer Interfaces: Should We Do This?

Although there is still a long way to go until these technologies actually allow direct access to secret thoughts, the possible consequences have to be considered early on. As people grow increasingly concerned about all the data some firms collect without asking, many worry that BCIs may one day enable even greater exploitation of personal information.
Many questions remain unanswered, for instance: Do we really want companies to know more about ourselves than we do? Who will be held responsible if a wrong thought leads to fatal consequences when mind controlling a machine? Will a random thought like “I turned my phone off. I must remember to turn it on” get truncated to “Turn it on” and the industrial machine obeys?
The current state of development in brain–computer interfaces is a long way behind science-fiction fantasy. The fourth installment of the Matrix movie franchise is currently in the making, with its hero Neo set to reenter the computer-generated world via a cable into his brain – but in today’s world, even a simple interface to the brain for direct input and output has yet to be developed.
Indeed, many scientists remain doubtful that it will ever be possible to transmit complex thoughts – let alone to upload human consciousness to a computer. So far, BCI technologies look set to have their strongest impact in medical use cases, but they are starting to seep into industry. Many companies are researching in many different directions, and scientific breakthroughs have often been achieved by chance once many players became involved in a specific theme. This could well happen in this field, too. The path may still be full of obstacles, but the outlook for BCIs is starting to confound the doubters and looks more than promising.
