The post title sounds like science fiction meets Captain Obvious, but this is real. Brain implants, or brain-computer interfaces, can help disabled people move prosthetic limbs with their intentions, and paralyzed people who can't speak can talk through a computer just by intending to speak. But as the technology improves, we're finding that the parts of the brain used to attempt speech sit very close to the parts that produce our inner dialogue, meaning our private thoughts. How would someone using this technology to communicate keep their own thoughts to themselves? The interfaces we have now aren't good at reading inner thoughts, but they will get better, and the ethical issues need to be addressed now.
One way to separate the intention to speak from one's inner dialogue would be to improve the technology, and researchers are working on that. Another way that is already being tried is to use a "password" that the user would think of before activating intentional communication. You can see how this might have its own problems: it's akin to telling someone not to think about an elephant. Read about the attempts to ensure privacy as well as communication for those using brain-computer interfaces at Ars Technica. As you would guess, the comments are full of worst-case scenarios.
1 Comment
There are several schools of meditation all built around getting the inner dialog to just shut up. Some people spend years learning to do that. It would be cool if mute cyborgs could someday inform us about some easy, accessible, and direct way.