Bridging the gap between ethics and neurotechnology - the case of Brain-Computer Interfaces
We develop and test a novel, interdisciplinary approach for the responsible innovation of Brain-Computer Interfaces.
The fields of neuroscience, biology, psychology and information technology are increasingly converging, and this convergence has given rise to the field of neuro-engineering, or neurotechnology. Engineering techniques are used to understand, repair, replace, enhance, or otherwise exploit the properties of neural systems. Humans are increasingly merging with technologies. One example of such a neurotechnology is a Brain-Computer Interface.
For decades, the field of Brain-Computer Interfacing (BCI) has aimed to provide people with severe motor disabilities a muscle-independent tool to communicate. For example, people with classical Locked-In Syndrome (LIS) can typically communicate only by blinking their eyes. To use a communication device independently, they must control that device using the “power of their brain” instead of their body: the BCI translates brain activity into device commands.
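The core idea of translating brain activity into commands can be illustrated with a deliberately minimal sketch. Everything here is a toy assumption for illustration only (the feature, the threshold, and the two commands are hypothetical, not any actual BCI pipeline); real systems use filtered multi-channel EEG and trained classifiers.

```python
# Toy illustration of the BCI principle: brain signal -> feature -> command.
# All names and thresholds are illustrative assumptions.
from statistics import mean

def extract_feature(eeg_window):
    """Toy feature: mean amplitude of one window of EEG samples."""
    return mean(eeg_window)

def decode_command(eeg_window, threshold=0.5):
    """Map the feature onto one of two communication commands."""
    return "select" if extract_feature(eeg_window) > threshold else "idle"

print(decode_command([0.9, 0.8, 0.7]))  # high activity -> "select"
print(decode_command([0.1, 0.0, 0.2]))  # low activity  -> "idle"
```

In an actual communication device, a decoded "select" command would, for example, choose a highlighted letter on a spelling interface, making the muscle-independent channel usable for typing.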
Many studies have demonstrated the feasibility of such Brain-Computer Interfaces (BCIs), but the transfer of BCIs from the lab to people's daily lives has been relatively slow. We posit that BCIs, like other neurotechnologies, should be developed and designed in a responsible and user-centered manner to increase technology acceptance. To reach this goal, we are developing a novel and highly interdisciplinary approach that draws on methods from psychology, computer science, ethics, innovation studies and the arts.
First, different BCI stakeholders are individually interviewed about the technical, ethical, legal, societal and moral issues they associate with BCI technology. Their input is then used to develop scenarios and narratives. In a final step, stakeholders are brought together through artistic means (for example, improvisational theater) so they can safely experiment with different scenarios and decide together which scenarios are desirable and which are undesirable.