Would you really want to know what thoughts linger in your mind? When you lie in delirium? When you are in a 'vegetative state'?
Technology like this mind-reader – once it reaches maturity – could have many important applications, allowing traumatized people, paraplegics, and other disabled groups to express their needs, and providing insight into their experiences and the therapy they require.
Yet it also raises ethical questions: to what extent should brain-reading be allowed? As a preventative measure to identify behavioural anomalies in children early on? As a wearable device to 'relive your dreams'? Who stores the data, and how would it be evaluated? What are the risks and consequences of false positive results in an increasingly digitalized analytics environment?
… just a few thoughts to start the day! Enjoy!
/via +Hans Youngmann
Researchers Have Invented A Brain Decoder That Can Hear Your Inner Thoughts
Researchers at the University of California, Berkeley have invented a brain decoder device that’s able to work out what you’re thinking based on neuron activity inside the brain — essentially, the …
The data is there, but the cipher and the catalogue are the challenge. We should look at how we organize the data in our own lives to see how our brain does it; after all, it's our brain that came up with the idea and the method, and that is presumably highly individualized. On the other hand, I'm not sure I want my brain decoded: the past is done, we have today, and we look forward to tomorrow.
There is an old cognitive science thought experiment where an "infallible" brain-reading machine determines someone is guilty of a crime but they know they did not commit it. How do we determine truth if we abdicate to machines? What if the machine says we dreamt of birds, but we remember butterflies? What if the paraplegic wants to say "I'm in pain", but the machine says, "I'm okay"?
I would use it only after taking out or breaking every single networking function it had.
Well, let's turn the question around then: what would happen if privacy did indeed vanish? And I mean not in terms of analytical potential (I'm sure some contrived, imperfect system of access-level rights is technically possible), but in terms of the psychological impact on individuals. Some studies in Scandinavia suggest that privacy invasion alters participant behaviour, and in some cases was so unbearable that the subjects terminated the experiment early. Now that can't be healthy!
Privacy is a concept that can't be defined beyond its core application.
Now the term "private" is continuously vanishing from our lives…
Challenging.