How Scientists Want to Prevent Your Brain From Being Hacked (Originally in Spanish)

By
Paula Molina
July 10, 2019

Google Translation to English 

"Imagine if companies interested in trading with our personal data, not only had data on what we do and say, but on what we think."

The phrase sounds like the opening of an episode of the series "Black Mirror," but it comes from Rafael Yuste, director of the Center for Neurotechnology at Columbia University in New York, speaking to BBC Mundo.

Yuste is one of the scientists determined to regulate the future use of neurotechnologies: tools being developed today to map and modify the activity of the human brain.

As spokesperson for a group of 25 scientists and engineers, Yuste proposed in 2017 incorporating five inalienable neuro-rights into the human rights charter: mental privacy, personal identity, free will, equitable access and non-discrimination.

In June, he visited Santiago de Chile, where every year the Senate's technology commission organizes a Congress of the Future featuring some of the world's leading scientists and intellectuals.

"Chile would be a precedent," he says. "I am happy to be able to work on it."

One hundred billion neurons

Yuste was one of the first advisors to the "Brain" project , launched in 2013 by the then US president, Barack Obama, in order to boost and finance neurotechnologies capable of "mapping" the brain.

A year earlier, the Madrid-born researcher had been named one of the most influential scientists in the world by the British journal Nature.

"Obama launched the 'Brain' project as the start of the space race was launched," recalls Yuste.

In this race, the United States is today joined by other countries: Japan, China, South Korea, Australia, Canada, Israel and Europe, which have their own versions of the project.

Yuste explains the scientific appeal of the project.

"The brain operates electrically: we have 100 .000 million neurons within the skull The number is astronomical and connections in our head more connections and nodes throughout the Internet. ierra " he explains.

"All that complexity of neurons is firing electronically and through processes that we do not understand. Hence the vision, sensations, behavior, ideas, memory, emotions, consciousness, mind, everything we are. That is why it is so important to have neurotechnologies capable of mapping them. "

The risks of an "increased" person

Neurotechnologies use optical, electronic, magnetic and nanotechnology techniques to understand these processes and, in the future, to "read and write" brain activity.

"It's something similar to what we had to decipher the human genome: nobody knows who will arrive first," says Yuste. "But the concrete thing is that someone is going to arrive, opening new opportunities. And also risks."

"In the US, a flexible computer chip of two square centimeters, with a thickness of 100 microns, is being manufactured to be implanted under the skull, in the brain," says Yuste.

"After the intervention, the person could wear a cap or a helmet with the electronic components that allow controlling this chip implanted in his brain."

This neurotechnology is designed, for example, to connect a camera to a blind patient and transmit the images to their brain through the chip.

"We know that vision is generated in the cerebral cortex and that most blindness is caused by problems in the eye. In these blind patients, you could install a visual prosthesis connected to a camera. The camera would work like the eye and the cortex. would receive the signals through the prosthesis, making the person can see, "explains the scientist to BBC Mundo.

"But imagine that you install the same prosthesis to a person who sees well, and that that prosthesis is no longer connected to a camera, but to a group of cameras capable of seeing in infrared, or to a camera installed in another place on the planet, or a television screen where the person could read information, "he adds.

"That person could perceive things that others cannot, and would have access to information that others could not have. They would be an augmented person... Combined with an artificial intelligence system, the person could walk down the street looking at people and retrieving information about each one. This type of use of neurotechnology must be regulated before it is too late."

Another risk of neurotechnology, according to Yuste, is its military use.

The same chip implanted in the brain that allows it to receive information could also transmit signals from the brain to a robotic arm, or to a tank.

There is no standard for these developing neurotechnologies today: there are no laws governing their use, whether among patients with disabilities or among healthy people who want to "augment" their abilities.

Nor are there regulations on the theft or manipulation of brain data.

"I have a very positive opinion about neurotechnologies and I think it is essential to develop them in order to help patients with neurological or mental diseases. But the same tools can be used for better or for worse, " warns the scientist.

Human Neuro-Rights

Yuste describes to BBC Mundo each of the five neuro-rights with which he seeks to avoid the misuse or inequalities that neurotechnologies could generate.

The issue already worries part of the scientific community. In addition to Yuste and his group, in that same year, 2017, the neuroethics expert Marcello Ienca and the Swiss-based human rights lawyer Roberto Andorno published another paper along the same lines, warning about the same issue.

The first neuro-right is mental privacy. "We want it to be a fundamental human right: that the content of your mind cannot be extracted without your consent, and that it receives the same legal treatment as human organs," Yuste explains.

Personal identity and free will are two other rights to safeguard in a world where neurotechnologies can act on cognitive abilities and individual decisions, the scientist explains.

"Imagine the case of a soldier who could be controlled from the outside, his mind connected to a network through a prosthesis. That person's identity could be completely dissolved, and so could his capacity to make decisions."

The fourth right seeks to ensure equitable access to neurotechnologies.

"These technologies are going to be very expensive, and only certain social groups in certain countries will have access to them. In the case of the neurotechnologies used to increase certain sensory or cognitive abilities, we want to avoid a social fracture, where some people have capacities superior to others ".

Yuste proposes the example of transplants.

"Today when you have several patients waiting for an organ, the medical community decides who is transplanted, based on medical and justice criteria. The same criteria should define the possibility of increasing a capacity through neurotechnology," He says.

The fifth right aims to protect people from the discriminatory biases of artificial intelligence algorithms.

"If we decide to use artificial intelligence algorithms that change the functioning of your brain from the outside, we must take care that these algorithms do not project those biases in your brain. Otherwise, there would be no way forward in the creation of fairer, more peaceful societies. ".

"A new rebirth"

Yuste has high hopes both for neurotechnologies and for humanity's ability to regulate them.

"These technologies will impact the entire society, allow us to treat patients, but they will also open new fields of development to countries; they will allow us to change education, justice ."

"Today we educate children with methods that we inherit from the past, but if we understood how the mind works, we could have a much more efficient education," he says.

"Today you catch a criminal and imprison him. But if we understood why he did what he did, that criminal would become a patient," he adds.

"I think we are in a new renaissance: in the first, man began to understand his role in the world. Now, we can understand each other inside, finally understand what we are ."

"But first it is up to us as a society to organize the rules so that these neurotechnologies are used in the sense of the common good. And the time to do so is now," he concludes.