In a future with more ‘mind reading’, thanks to neurotechnology, we may have to rethink freedom of thought

Socrates, the ancient Greek philosopher, never wrote things down. He warned that writing undermines memory – that it is nothing more than a reminder of some previous thought. Compared to people who discuss and debate, readers will “listen to many things and learn nothing; they will appear to be omniscient and generally know nothing.”

These views may seem strange, but his central fear is a timeless one: that technology threatens thought. In the 1950s, Americans worried that advertisers were using hidden subliminal messages in movies to trick consumers into buying things they didn’t really want. Today, the United States is in the midst of a similar panic over TikTok, with critics worried about the app’s impact on viewers’ freedom of thought.

For many people, neurotechnologies seem especially threatening, even though they are still in their infancy. In January 2024, Elon Musk announced that his company Neuralink had implanted a brain chip in its first human subject – though long after competitors had reached similar milestones. By March, that person could reportedly play chess using only their thoughts.

Brain-computer interfaces, known as BCIs, have sparked debate about the appropriate limits of technologies that interact with the nervous system. Looking ahead to the day when wearable and implantable devices may become more widespread, regulations and restrictions on BCIs and related neurotechnology have been discussed at the United Nations. Chile has even enshrined neurological rights – special protections for brain activity – in its constitution, and other countries are considering doing so.

At the heart of neural rights is the idea that everyone has a fundamental right to decide what goes on in their brain and who is allowed access to that information, much the way people decide what is done with their bodies and property. The idea is often used synonymously with “freedom of thought.”

Many ethicists and policy makers think that this right to mental self-determination is so fundamental that it should never be undermined, and that institutions should impose strict limits on neurotechnology.

But as my research on rights over the mind argues, protecting minds is not nearly as easy as protecting bodies and property.

Thoughts vs things

It is relatively simple to create rules that protect a person’s ability to decide what is done to their body. The body has clear boundaries, and things that cross it without permission are not allowed. It is usually obvious when a person violates laws that prohibit assault or battery, for example.

The same is true of regulations that protect one’s property. Indeed, a major reason people come together to form governments is to protect body and property.

In general, people can enjoy these protections without seriously restricting how others want to live their lives.

The difficulty in establishing neural rights, on the other hand, is that, unlike bodies and property, brains and minds are constantly influenced by outside forces. One cannot fence off one’s mind so that nothing can enter.


Instead, one’s thoughts are largely the result of the thoughts and actions of others. Everything from the way we see colors and shapes to our most basic beliefs is influenced by what others say and do. The human mind is like a sponge, soaking up whatever it happens to be immersed in. Regulations may be able to control the types of liquid in the bucket, but they cannot protect the sponge from getting wet.

Even if that were possible – if there were a way to control people’s actions so that they don’t influence other people’s thoughts at all – the regulations would be so heavy-handed that no one would be able to do much of anything.

If I am not allowed to influence other people’s thoughts, I can never leave my house, because that alone makes people think and act in certain ways. And as the internet increases one’s reach, not only would I not be able to leave the house, but I would not be able to “like” a post on Facebook, leave a product review, or comment on an article.

In other words, protecting one aspect of freedom of thought – a person’s ability to shield themselves from outside influences – can conflict with another aspect of freedom of thought: freedom of speech, or a person’s ability to express ideas.

Neurotech and control

But there is another concern: privacy. People may not be able to fully control what goes into their heads, but they should have considerable control over what goes out – and some believe that societies need “neuro-rights” regulations to ensure that. Neurotech poses a new threat to our ability to control which thoughts we reveal to others.

There are ongoing efforts, for example, to develop wearable neurotechnology that would read and adjust the user’s brain waves to help them improve their mood or get better sleep. Although such devices can only be used with the wearer’s consent, they still take information from the brain, interpret it, store it and use it for other purposes.

In experiments, it is also becoming easier to use technology to measure a person’s thoughts. Functional magnetic resonance imaging, or fMRI, can be used to measure changes in blood flow in the brain and produce images of that activity. Artificial intelligence can then analyze those images to interpret a person’s intentions.

Critics of neurotechnology fear that as the field develops, it will be possible to extract information from brain activity whether a person wants to reveal it or not. Hypothetically, that information could one day be used in a range of contexts, from research for new devices to law courts.


Regulation may be needed to protect people from information harvesting by neurotechnology. For example, nations could ban companies that make commercial neurotechnology devices, such as those intended to improve sleep, from storing the brainwave data collected by those devices.

But I have argued that it might not be necessary, or even possible, to protect against neurotechnology putting information into our brains – although it is difficult to predict what capabilities neurotechnology will have even a few years from now.

In part, this is because I believe people tend to overestimate the difference between neurotech and other forms of external influence. Think about books. Horror novelist Stephen King has said that writing is telepathy: When an author writes a sentence – say, describing a shotgun over the fireplace – they trigger a particular thought in the reader.

In addition, there are already strong protections for bodies and property, which I believe could be used to prosecute anyone who imposes invasive or wearable neurotechnology on another person.

How different societies will respond to these challenges is an open question. But one thing is certain: With or without neurotechnology, our control over our own minds is not as certain as many of us would like to think.

This article is republished from The Conversation, a non-profit, independent news organization that brings you reliable facts and analysis to help you make sense of our complex world. It was written by Parker Crutchfield, Western Michigan University.


Parker Crutchfield does not work for, consult with, own shares in, or receive funding from any company or organization that would benefit from this article, and has not disclosed any relevant connections beyond his academic appointment.
