“Companies already read our minds and will know even more thanks to neurotechnology”
Nita Farahany proposes a “right to cognitive freedom”.
Shin Suzuki
BBC News Brasil in São Paulo
https://euro.dayfr.com/trends/935673.html
September 24, 2023
A few years ago, the idea of a “threat to the confidentiality of thought” was dystopian science fiction, like George Orwell’s 1984.
But for Nita Farahany, professor at Duke University (United States), specializing in the study of the consequences of new technologies and their ethical implications, this threat already exists and must be taken seriously.
This year, the Iranian-American professor published the book The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.
But how is it possible to read our brain? There is no super machine yet that can enter a person’s head and provide them with a complete list of ideas and concepts, as is the case in science fiction.
But in fact, Farahany explains, the defenses of our private thoughts began to be broken down without the need to look directly into the brain.
The data era
This is made possible by the vast amount of personal data we share on social media and other applications, which is then analyzed by algorithms and monetized.
Today, technology companies have important information about us: the identities of our friends, the content we are passionate about (and, above all, the type of passion), our political preferences, the products we click on, our movements during the day and some of our financial transactions.
“All of this is used by companies to create very precise profiles of who we are in order to understand our preferences and desires,” Ms. Farahany explains in an interview with BBC News Brazil.
“It’s important for people to understand that they are already in a world where minds are read,” she adds.
With the growing popularity of smartwatches – which collect data on heart rate, stress levels, sleep quality and more – another frontier is beginning to be explored, that of our inner workings.
But advances in neurotechnology and equipment in direct contact with the head make it possible to take things to the next level, with more data and greater precision.
The professor explains that brain sensors work much like the heart-rate sensors in smartwatches or the rings that measure body temperature, except that what they pick up is the electrical activity of the brain.
“Every time you think or feel something, neurons fire in your brain, emitting tiny electrical discharges. You can use characteristic patterns to draw conclusions,” she explains.
“For example, if you see an advertisement and you feel joy, stress, anger, boredom, engagement… all of these reactions can be captured by the electrical activity of your brain and decoded by the most advanced artificial intelligence,” she adds.
In other words, these brain signals translate what we feel, observe, imagine or think.
Ms. Farahany says people need to understand and accept that their brain “is not entirely theirs”.
This calls into question the concept of free will, that is, the power of an individual to choose their actions.
“Imagine that at the beginning of the week you set a goal of spending no more than one hour per day on social media. At the end of the week, you realize that you spent four hours per day. What happened?
“If there are algorithms designed to catch you when you want to disconnect, if you get notifications when you spend too much time away from your phone, if you want to watch just one episode of the show and the next one starts automatically, could you really use your free will? These are tools and techniques designed to undermine what you committed to.”
“The technology itself is rarely the problem”
Each person’s brain characteristics could be used to draw incorrect conclusions about them.
Contrary to what one might expect, Farahany is very enthusiastic about advances in neurotechnology.
Throughout her book, she describes many contexts in which brain monitoring could improve lives and even save them.
“What I’m proposing is a balance. It’s both a way for people to see the positive aspects of technology, but also to protect themselves against the biggest risks,” she explains.
“To achieve this, we need to change the way we think about our relationship with technology. Technology is rarely the problem. It’s almost always misuse.
“It’s not about taking absolute positions like ‘all of this is bad’ or ‘all of this is great’, but trying to define which features of this technology serve the common good and what the risks of misuse are,” she adds.
The list is full of complex cases and double-edged swords.
Neurotechnology could reduce the number of fatal accidents by monitoring the levels of inattention and, above all, fatigue that affect drivers of trucks, trains and subways, for example.
This same feature could be misused by a business or school seeking total productivity, where an employee or student’s distracted moments are monitored, recorded, and potentially punished.
A bracelet that captures the electrical impulses the brain sends to move the arms and hands could translate these impulses into electronic signals and make digital or virtual reality experiences much more intuitive and integrated.
The potential of such a device is even greater: it could help detect the early stages of a neurodegenerative disease. Analyzing brain activity as a whole could represent a breakthrough for medicine and longevity.
On the other hand, Ms. Farahany writes in her book, the same bracelet will also be able to detect “if you engage in intimate activities using your hands in your bedroom.”
And all this data in the hands of governments?
But for Farahany, the biggest concern about individual privacy is that governments possess an increasingly wide range of personal data.
She reports that the US Department of Defense funded a company that developed a biometric system combining data on brain waves, cognitive states, facial recognition, pupil analysis and changes in the amount of sweat produced.
In China, a 2018 report by the South China Morning Post said that workers in various industries and members of the country’s military were already wearing brainwave monitors designed to detect emotional spikes such as depression, anxiety or anger.
In addition to its use to improve performance, and therefore companies’ financial results, the report states that the project also aimed to “maintain China’s social stability”.
According to Farahany, in most countries, privacy laws do not explicitly address the right to mental privacy.
“I think the United Nations should strive to recognize what I call the ‘right to cognitive freedom’: a universal right that would update the right to privacy, explicitly stating that there is a right to mental privacy, a right to be protected from interference with the way we think and feel about things.”
According to her, today “freedom of thought” is applied and understood as referring strictly to freedom of religion and belief.
“I think we need to broaden this understanding to ensure protection against interference, manipulation and punishment of thought.”
The problem is that technology continues to evolve faster than the debate and adoption of legislation, and businesses and governments take advantage of loopholes in the law.
“It’s really about trying to determine, as early as possible and as the technology evolves, what the benefits and risks are. Then you have to clarify the issues and develop a regulatory regime that takes them into account. It’s not always easy to do,” admits Ms. Farahany.
Elon Musk’s project
One of Elon Musk’s various projects is Neuralink, a company that aims to connect brains to computers.
The most well-known neurotechnology project has several controversial elements: it involves implanting a chip in the brain, and it is led by Elon Musk, a figure who frequently makes headlines, often controversially.
One of his companies, Neuralink, wants to implant such devices in the most complex human organ in the future in order to cure diseases such as Alzheimer’s disease and allow people with neurological diseases to control cell phones or computers by thought.
Some experts in the field are concerned about this project and express doubts about the implications of this type of technology developed by a for-profit company.
Last May, the US Food and Drug Administration (FDA) authorized the first human trial.
“I’m not that worried about Musk’s project. In fact, I’m quite optimistic,” Ms. Farahany says.
“Neuralink promises two innovations. The first is to perform the surgery with robots, which would take care of the most delicate and difficult parts of the operation [implanting the neurotechnology]. The second is the development of hair-thin electrodes that could be implanted with much less risk to the human brain.”
Few surgeons in the world today have the necessary skills to perform such an intervention.
“If I were to be severely disabled to the point where I couldn’t communicate or move, I would probably look into having some sort of neural technology implanted,” she concludes.