Huh.
From MIT's Technology Review, July 30:
Big tech firms are trying to read people’s thoughts, and no one’s ready for the consequences.
In 2017, Facebook announced that it wanted to create a headband that would let people type at a speed of 100 words per minute, just by thinking.
Now, a little over two years later, the social-media giant is revealing that it has been financing extensive university research on human volunteers.
Today, some of that research was described in a scientific paper from the University of California, San Francisco, where researchers have been developing “speech decoders” able to determine what people are trying to say by analyzing their brain signals.
The research is important because it could help show whether a wearable brain-control device is feasible, and because it is an early example of a giant tech company being involved in getting hold of data directly from people’s minds.
To some neuro-ethicists, that means we are going to need some rules, and fast, about how brain data is collected, stored, and used.
In the report published today in Nature Communications, UCSF researchers led by neuroscientist Edward Chang used sheets of electrodes, called ECoG arrays, that were placed directly on the brains of volunteers.
The scientists were able to listen in, in real time, as three subjects heard questions read from a list and spoke simple answers. One question was “From 0 to 10, how much pain are you in?” The system was able to detect both the question and the response of 0 to 10 far better than chance.
Another question asked which musical instrument they preferred, and the volunteers, who were undergoing brain surgery for epilepsy, were able to answer “piano” and “violin.”
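As a rough illustration of what “far better than chance” means here, the decoding task can be framed as a supervised classification problem: given features of the recorded brain activity for one trial, predict which of the possible answers was spoken, and compare accuracy against the 1-in-N guessing baseline. The Python sketch below uses synthetic feature vectors and an off-the-shelf linear classifier; it is not the UCSF team’s actual decoder, and the feature and trial counts are assumptions made for illustration.

```python
# Hypothetical sketch: frame "decode the spoken answer from neural activity"
# as classification and compare accuracy to chance. The features below are
# synthetic stand-ins, NOT real ECoG data, and the linear model is an
# assumption; the UCSF paper's actual decoder is different.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_classes = 11           # possible answers "0" through "10"
n_trials_per_class = 40  # trials per spoken answer (assumed)
n_features = 64          # e.g. per-electrode activity features (assumed)

# Synthetic data: each answer class gets a slightly different mean pattern.
class_means = rng.normal(0.0, 1.0, size=(n_classes, n_features))
X = np.vstack([
    rng.normal(class_means[c], 2.0, size=(n_trials_per_class, n_features))
    for c in range(n_classes)
])
y = np.repeat(np.arange(n_classes), n_trials_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = decoder.score(X_test, y_test)

print(f"decoder accuracy: {accuracy:.2f}")
print(f"chance level:     {1 / n_classes:.2f}")  # about 0.09 for 11 answers
```

Anything well above the printed chance level would count as "better than chance" in this toy setup; the real study's statistics are, of course, more involved.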
Facebook says the research project is ongoing, and that it is now funding UCSF in efforts to try to restore the ability to communicate to a disabled person with a speech impairment.
Eventually, Facebook wants to create a wearable headset that lets users control music or interact in virtual reality using their thoughts.
To that end, Facebook has also been funding work on systems that listen in on the brain from outside the skull, using fiber optics or lasers to measure changes in blood flow, similar to an MRI machine.
Such blood-flow patterns represent only a small part of what’s going on in the brain, but they could be enough to distinguish between a limited set of commands.
“Being able to recognize even a handful of imagined commands, like ‘home,’ ‘select,’ and ‘delete,’ would provide entirely new ways of interacting with today's VR systems—and tomorrow's AR glasses,” Facebook wrote in a blog post....
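For a sense of how a handful of imagined commands might be separated using only coarse blood-flow measurements, the sketch below treats each command as a template pattern across a few optical channels and picks whichever template is closest to the observed pattern. The signals are synthetic and the nearest-template rule, channel count, and recognition rate are assumptions for illustration; Facebook has not described its actual decoding method in this excerpt.

```python
# Illustrative sketch only: distinguishing a small set of imagined commands
# from coarse blood-flow features (e.g. per-channel oxygenation changes).
# Synthetic signals; nearest-template classification is an assumed approach.
import numpy as np

rng = np.random.default_rng(1)
commands = ["home", "select", "delete"]
n_channels = 16  # assumed number of optical channels on a headset

# Synthetic "template" blood-flow pattern per command.
templates = {cmd: rng.normal(0.0, 1.0, n_channels) for cmd in commands}

def classify(trial: np.ndarray) -> str:
    """Return the command whose template is closest to the observed pattern."""
    return min(commands, key=lambda cmd: np.linalg.norm(trial - templates[cmd]))

# Simulate a few noisy trials of the user imagining "select".
n_trials = 20
correct = sum(
    classify(templates["select"] + rng.normal(0.0, 0.5, n_channels)) == "select"
    for _ in range(n_trials)
)

print(f"'select' recognized in {correct}/{n_trials} simulated trials")
```

The point of the toy example is simply that a few commands with distinct, repeatable signatures are a far easier target than open-ended speech, which is why a limited command vocabulary is a plausible first goal for a wearable device.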
....MORE