It’s becoming easier all the time to read signals from the human brain:
Tesla founder Elon Musk’s company Neuralink just this summer announced that human trials will move forward next year for an implantable device that can read a user’s mind; scientists at UCSF recently released the results of a brain activity study, backed by Facebook, that shows it’s possible to use brain-wave technology to decode speech; in 2018, Nissan unveiled Brain-to-Vehicle technology that would allow vehicles to interpret signals from the driver’s brain; and Nielsen is already using neuroscience to capture nonconscious aspects of consumer decision-making. – Allison Duncan, “Mind-reading technology is closer than you think” at Fast Company
There will be good and bad outcomes. Recently, we’ve talked about how high tech can help the blind see and amputees feel by reading brain signals directly. A mind-controlled robotic “arm” can help sufferers from movement disorders with the tasks of daily living.
One workplace use is to monitor drowsiness in, for example, the operators of high-speed trains. In China: “If the driver dozed off, for instance, the cap would trigger an alarm in the cabin to wake him up.” (South China Morning Post). “Smart Caps” have been used in Australia for the same purpose since at least 2015. A drowsiness detector is actually pretty simple in principle: the brain waves of sleep differ measurably from those of wakefulness, so a device that reads brain waves reliably can detect the transition.
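The principle can be sketched in a few lines of code. The following is a minimal, illustrative sketch, not the Smart Cap’s actual algorithm: it compares signal power in the slow theta band (typically elevated during drowsiness) against the alpha band over a one-second EEG window. The sampling rate, band edges, and threshold are all assumptions chosen for illustration, and the input here is synthetic.

```python
import math

FS = 128  # sampling rate in Hz (assumed; real headsets vary)

def band_power(signal, f_lo, f_hi, fs=FS):
    """Signal power in the band [f_lo, f_hi] Hz via a naive DFT.

    Fine for a short demo window; real devices would use an FFT.
    """
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def looks_drowsy(signal, threshold=1.0):
    """Flag drowsiness when theta (4-8 Hz) power overtakes alpha (8-13 Hz)."""
    theta = band_power(signal, 4, 8)
    alpha = band_power(signal, 8.001, 13)  # avoid double-counting the 8 Hz bin
    return theta / (alpha + 1e-12) > threshold

# Synthetic one-second windows: awake = alpha-dominant, drowsy = theta-dominant
t = [i / FS for i in range(FS)]
awake  = [math.sin(2 * math.pi * 10 * x) + 0.3 * math.sin(2 * math.pi * 5 * x) for x in t]
drowsy = [math.sin(2 * math.pi * 5 * x)  + 0.3 * math.sin(2 * math.pi * 10 * x) for x in t]

print(looks_drowsy(awake))   # False
print(looks_drowsy(drowsy))  # True
```

A production system would smooth the ratio over many windows and calibrate the threshold per wearer, but the core idea, a shift of spectral power toward slower frequencies, is this simple.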
Smart Caps, according to reports, have gotten fatigue-related incidents under control at some heavy-industry sites, in part by providing masses of real-time information on when such incidents were most likely to occur:
Clear trends have emerged in companies where the device is used on large numbers of staff, and [software developer Dan] Bongers said those companies were able to adapt working conditions in line with those trends.
“As you would expect, we see shift workforces most fatigued between 2am and 5am. We are also starting to see patterns based on rosters, where we have noticed the first nightshift after a break period is the most difficult to deal with; that transition from day-work to night-work,” he says. – Peter Ker, “Australian workers are starting to have their brains monitored in the workplace” at Sydney Morning Herald
But in China, much more ambitious government-backed surveillance projects to monitor employee emotions are under intensive development:
Speaking on behalf of Neuro Cap, the central government-funded brain surveillance project at Ningbo University, cognitive psychologist Jin Jia explained,
… a highly emotional employee in a key post could affect an entire production line, jeopardising his or her own safety as well as that of others.
“When the system issues a warning, the manager asks the worker to take a day off or move to a less critical post. Some jobs require high concentration. There is no room for a mistake,” she said. – Stephen Chen, “‘Forget the Facebook leak’: China is mining data directly from workers’ brains on an industrial scale” at South China Morning Post
We are told that the new technique boosts profits but, in the non-transparent environment that Stephen Chen describes, it is hard to know how to evaluate the statement. At any rate, we are told that the workers at first resisted but then accepted the devices and now wear them all day at work. Experiments are underway to introduce such devices to hospitals, to “monitor a patient’s emotions and prevent violent incidents.”
Bioethicist Nita Farahany warns:

This may all sound great, and as a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology. But I worry. I worry that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. That we will trade our brain activity for rebates or discounts on insurance, or free access to social-media accounts … or even to keep our jobs. [05:17] – Nita Farahany, “When technology can read minds, how will we protect our privacy?” at TED talk, November 2018, transcript
In sum, she fears we are heading complacently into a world of brain transparency with no protections in place:
Think about it. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one? I worry that people will self-censor in fear of being ostracized by society, or that people will lose their jobs because of their waning attention or emotional instability, or because they’re contemplating collective action against their employers. That coming out will no longer be an option, because people’s brains will long ago have revealed their sexual orientation, their political ideology or their religious preferences, well before they were ready to consciously share that information with other people. [07:53] – Nita Farahany, “When technology can read minds, how will we protect our privacy?” at TED talk transcript
Farahany hopes for a “cognitive liberty revolution” but the main obstacle to her hopes may be a general lack of awareness of the issues.
Qiao Zhian, professor of management psychology at Beijing Normal University, says no laws in China prevent or regulate the use of brain-monitoring equipment. In the United States, likewise, few laws prevent private companies from collecting and selling brain data or “anonymized” medical information to third parties. For now, everywhere is the frontier, where the pushiest push ahead.
Further reading: High tech can help the blind see and amputees feel. It’s not a miracle; the human nervous system can work with electronic information.
“Anonymized” data is not confidential. It’s almost as identifying as your fingerprint, actually.