Trust in the Privacy Concerns of Brain Recordings





By Ian Stevens







Ian is a 4th year undergraduate student at Northern Arizona University. He is majoring in Biomedical Sciences with minors in Psychological Sciences and Philosophy to pursue interdisciplinary research on how medicine, neuroscience, and philosophy connect. 




Introduction





Brain recording technologies (BRTs), such as brain-computer interfaces (BCIs) that collect various types of brain signals from on and around the brain, could be creating privacy vulnerabilities for their users.1,2 These privacy concerns have drawn attention as BCIs move from medical and research uses into the consumer marketplace.3,4 They are grounded in the fact that brain signals can already be decoded to interpret mental states such as emotions,5 moral attitudes,6 and intentions.7 What could be interpreted from these brain signals in the future, however, remains ambiguous.






The current uncertainty surrounding future capacities to decode complex mental states, and the privacy vulnerabilities that this uncertainty creates for research subjects, requires that participants place trust in BRT researchers. I argue that “trust as reliability” is insufficient to capture the obligations of BRT researchers and that a more robust understanding of trust is needed. The public deserves to know about the possibility of privacy losses in neuroscientific research.





There is a growing body of literature addressing how developing neurotechnologies may change, harm, or undermine our senses of self,8,9 as well as the potential deleterious effects of neurotechnology on privacy.10 Technologies like positron emission tomography (PET), functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and electrocorticography (ECoG) give scientists numerous ways to “look into” the brain. Here, I will not address every concern that arises with technologies that peer through the skull; my more limited focus is the idea that brain recordings produced as part of today's research protocols may one day be used to decipher the mental states of individual research participants. This narrow topic will serve as the groundwork for examining the importance of trust in brain recording research (and perhaps beyond).





Privacy, Brain Recording Technologies & Big Data








Image courtesy of Flickr.

Privacy scholars have described multiple types of privacy, such as physical, informational, decisional, and associational.11-13 I will focus on informational privacy because it concerns the control of personal information. This broad category can be distilled to information about yourself that you may wish to withhold from certain parties. Such information can be as mundane as your social security number or as sensitive as a possible unconscious racial bias. The concern with informational privacy is that brain recording technology might allow the discovery and revelation of mental content “hidden” within recorded brain data. In other words, research subjects may lose the ability to withhold personal mental information.





This topic is nothing new; the ability to draw additional information from brain data is already a subject of philosophical interest.14,15 As noted above, intentions, moral attitudes, and visual imagery16 can be deciphered through the collection and interpretation of brain signals from various neurotechnologies (such as electroencephalography, recorded on the exterior of the cranium, or electrocorticography, recorded on the surface of the brain).* Depending on which BRT is used, what kind of mental states a study examines, and how researchers interpret the recordings, a research team could discover different things (e.g., emotions or intentions). The type of procedure also shapes what kind of brain recordings are collected (and from which area of the brain), as well as their quantity.
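
To make the decoding step more concrete, the sketch below shows in rough outline how a classifier might be trained to label a mental state from recorded brain signals. It is a minimal, hypothetical illustration: the data are synthetic, the feature choice (band-power values per channel) and the binary labels are my own assumptions, and it does not reproduce the pipeline of any study cited above.

```python
# Hypothetical sketch: decoding a binary "mental state" label from EEG-like
# band-power features. All data are synthetic; no real recordings or published
# decoding pipelines are reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_channels, n_bands = 200, 8, 4               # e.g., delta/theta/alpha/beta power
X = rng.normal(size=(n_trials, n_channels * n_bands))   # one feature vector per trial
y = rng.integers(0, 2, size=n_trials)                   # e.g., "calm" vs. "stressed"

# Inject a weak signal so the classifier has something to find, mimicking the
# modest effect sizes reported in real decoding studies.
X[y == 1, 0] += 0.8

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated decoding accuracy: {scores.mean():.2f}")
```

The point is not the particular classifier but the workflow: once recordings exist as stored feature vectors, nothing technical prevents them from being reanalyzed later with new labels that the participant never anticipated.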





The notion of “big data”18 might come to mind when considering the vast quantities of brain recordings these technologies collect. In research contexts, brain data is actively shared among researchers to “spread the wealth.”19 This practice is efficient, allowing multiple lines of research to proceed when the costs of collecting data are high. The tension arises, however, if and when such data can be re-identified. Although re-identification is currently improbable, arising mainly through publicity and small sample groups,20,21 the chance that it could happen highlights concerns for the future privacy of brain recordings. If brain recordings circulate freely and can hypothetically be read to infer the identity (among other things) of the subject, the privacy concerns seem obvious. Subjects' unknown future privacy vulnerabilities therefore call for guidance that balances current capabilities for decoding brain data against what could be around the bend for brain recording interpretation.
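
To illustrate one reason that sharing does not remove re-identification risk, the following minimal sketch shows a common pseudonymization step and why it falls short of true anonymization. It is hypothetical: the field names, file names, and hashing scheme are my own assumptions, not a description of any actual neuroimaging data-sharing pipeline.

```python
# Hypothetical sketch: pseudonymizing brain-recording metadata before sharing.
import hashlib

def pseudonymize(record: dict, salt: str) -> dict:
    """Replace the direct identifier with a salted hash and drop a quasi-identifier."""
    shared = dict(record)
    raw_id = shared.pop("name")
    shared["subject_id"] = hashlib.sha256((salt + raw_id).encode()).hexdigest()[:12]
    shared.pop("date_of_birth", None)  # drop quasi-identifiers where possible
    return shared

record = {
    "name": "Participant 007",          # illustrative placeholder, not real data
    "date_of_birth": "1985-03-14",
    "session": "ECoG_run_02",
    "recording_file": "sub007_run02.edf",
}
print(pseudonymize(record, salt="study-secret"))

# Note: pseudonymization is not anonymization. If the recording itself can be
# linked to a person through publicity or a small sample group, the hashed ID
# does not prevent re-identification, which is exactly the concern raised above.
```

The design point is that identifying power lives in the recording as much as in the metadata, so stripping names only narrows, rather than closes, the gap that trust must fill.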





Trust








Image courtesy of The Blue Diamond Gallery.

Trust is an important, if often under-theorized, feature of medical research22-24 and of the patient-physician relationship.25,26 It can be defined as the knowledge or feeling that we can place something of significance in another’s control. Philosophically, one important aspect of trust lies in the vulnerability of the truster to the trustee.27 We have all experienced this in some form when building relationships: we “open ourselves up” to others and feel hurt if we are rejected. This rejection is akin to feeling betrayed, and it hints at the importance of a relational dynamic in this schema of trust.28,29 Here, I will focus on our relationships with researchers (the participant-researcher relationship).





Reliability is generally the attribute we associate with people who follow through on their actions. There is an important distinction between trust and reliability that a social understanding of trust can help clarify. Here, I define reliability as following an agreed-upon contract,29-31 while trust is associated with implicit standards that are not explicitly stated.28,32 For example, a researcher who fails to follow the regulations set by HIPAA is being unreliable with someone’s brain recordings. But if that researcher then sold such information to marketing companies, violating a more implicit understanding of what should not be done with brain recordings, it would produce a feeling of betrayal. The value of trust here lies in its ability to cover the implicit privacy vulnerabilities that could be exploited in the future of brain recordings. As the previous section stressed, what brain recordings could one day tell about us is largely unknown. Since a participant cannot completely consent to such exhaustive hypothetical practices, a measure of trust between researcher and subject needs to be in place to account for these unknown vulnerabilities. Learning about this part of the research process will give participants more knowledge of when they are vulnerable and thereby better foster the informed consent process.





Clinical Application








Image courtesy of Wikimedia Commons.

The implementation of trust in the participant-researcher relationship is an epistemological question: how can you know whom you can trust? Initially, the satisfaction of reliability is used to gauge trustworthiness: do the researchers do what they say they will? The more difficult step is creating the knowledge, and the feeling, that if something does go wrong, the researchers have your well-being in mind. A topic that addresses this, though not intended for such a use, is the notion of ancillary care. Ancillary care is the care owed to study participants if something medically concerning is discovered while they are in a study. These research obligations are grounded in a kind of implicit understanding of the participant-researcher relationship.33 Lessons about trust found in the ancillary care debates can be applied to BRT research. They may provide a way to understand the activities of researchers accessing, tracking, and possibly controlling brain data as taking place within trusting relationships. So, while it may be premature to speak of codifying trust-based guidelines for BRT research, this is an avenue worth exploring. This way, even if participants never meet the holders of their brain data, they can still trust them.





*It is important to note that in these practices patients consent to have such information recorded.17 Even when such consented recording is ethically sound, the possibility of capturing brain states that a patient is unaware of marks an important distinction for informational privacy.21 Such a practice does require further ethical evaluation, but for this paper the important point to extrapolate is the subject’s vulnerability.


Unknown vulnerabilities are common in scientific research. Will a patient be harmed by the side effects of this drug? Is this plastic safe around children? Such unknowns are usually the original motivation for conducting research. Here, however, I analyze only possible harms to privacy from brain recordings. Future literature could explore how trust might work with unknown physical harms in other research fields.





Acknowledgements



I want to thank the Neuroethics Thrust at the Center for Sensorimotor Neural Engineering for having me as an intern this summer and especially Dr. Eran Klein and Dr. Sara Goering for mentoring me.





References



  1. Bonaci, T., Herron, J., Matlack, C. & Chizeck, H. J. Securing the exocortex: A twenty-first century cybernetics challenge. in 2014 IEEE Conference on Norbert Wiener in the 21st Century (21CW) 1–8 (2014). doi:10.1109/NORBERT.2014.6893912

  2. Klein, E. & Rubel, A. Privacy and ethics in brain-computer interface research. in Brain-Computer Interfaces Handbook: Technological and Theoretical Advances (eds. Nam, C., Nijholt, A. & Lotte, F.) 653–668 (Taylor & Francis).

  3. Bonaci, T., Calo, R. & Chizeck, H. J. App Stores for the Brain?: Privacy and Security in Brain-Computer Interfaces. IEEE Technol. Soc. Mag. 34, 32–39 (2015).

  4. Biddle, S. Facebook Won’t Say If It Will Use Your Brain Activity for Advertisements. The Intercept (22 May 2017). Available at: https://theintercept.com/2017/05/22/facebook-wont-say-if-theyll-use-your-brain-activity-for-advertisements/. (Accessed: 10th July 2017)

  5. Daly, I. et al. Affective brain–computer music interfacing. J. Neural Eng. 13, 46022 (2016).

  6. Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M. & Cohen, J. D. An fMRI Investigation of Emotional Engagement in Moral Judgment. Science 293, 2105–2108 (2001).

  7. Haynes, J.-D. et al. Reading Hidden Intentions in the Human Brain. Curr. Biol. 17, 323–328 (2007).

  8. Mecacci, G. & Haselager, W. F. G. (Pim). Stimulating the Self: The Influence of Conceptual Frameworks on Reactions to Deep Brain Stimulation. AJOB Neurosci. 5, 30–39 (2014).

  9. de Haan, S., Rietveld, E., Stokhof, M. & Denys, D. Becoming more oneself? Changes in personality following DBS treatment for psychiatric disorders: Experiences of OCD patients and general considerations. PloS One 12, e0175748 (2017).

  10. Ahmadi, M. & Ahmadi, L. Privacy Aspects of Nanoneuroimplants from the Point of View of a Human Dignity Perspective in Related International Conventions. J. Biomater. Tissue Eng. 4, 315–337 (2014).

  11. Decew, J. W. In Pursuit of Privacy: Law, Ethics, and the Rise of Technology. (Cornell University Press, 1997).

  12. Allen, A. L. Uneasy Access: Privacy for Women in a Free Society. (Rowman & Littlefield Publishers, 1988).

  13. Nissenbaum, H. Privacy in Context: Technology, Policy, and the Integrity of Social Life. (Stanford University Press, 2009).

  14. The Oxford Centre for Neuroethics - Neil Levy. Available at: http://www.neuroethics.ox.ac.uk/our_members/neil_levy. (Accessed: 28th July 2017)

  15. Richmond, S. D., Rees, G. & Edwards, S. J. L. (eds.) I Know What You’re Thinking: Brain Imaging and Mental Privacy. (Oxford University Press, 2012).

  16. Schoenmakers, S., Barth, M., Heskes, T. & van Gerven, M. Linear reconstruction of perceived images from human brain activity. NeuroImage 83, 951–961 (2013).

  17. I Know What You’re Thinking: Brain Imaging and Mental Privacy. (Oxford University Press, 2012).

  18. Mittelstadt, B. D. & Floridi, L. The Ethics of Big Data: Current and Foreseeable Issues in Biomedical Contexts. Sci. Eng. Ethics 22, 303–341 (2016).

  19. Poldrack, R. A. & Gorgolewski, K. J. Making big data open: data sharing in neuroimaging. Nat. Neurosci. 17, 1510–1517 (2014).

  20. Obama shakes mind-controlled robot hand wired to sense touch. US News & World Report (13 October 2016). Available at: https://www.usnews.com/news/news/articles/2016-10-13/paralyzed-man-feels-touch-through-mind-controlled-robot-hand. (Accessed: 28th July 2017)

  21. Paralyzed woman uses mind-control technology to operate robotic arm. CBS News (16 May 2012). Available at: http://www.cbsnews.com/news/paralyzed-woman-uses-mind-control-technology-to-operate-robotic-arm/. (Accessed: 28th July 2017)

  22. Perusco, L. & Michael, K. Control, trust, privacy, and security: evaluating location-based services. IEEE Technol. Soc. Mag. 26, 4–16 (2007).

  23. Kerasidou, A. Trust me, I’m a researcher!: The role of trust in biomedical research. Med. Health Care Philos. 20, 43–50 (2017).

  24. Kass, N. E., Sugarman, J., Faden, R. & Schoch-Spana, M. Trust: The Fragile Foundation of Contemporary Biomedical Research. Hastings Cent. Rep. 26, 25–29 (1996).

  25. Mainous, A. G., Baker, R., Love, M. M., Gray, D. P. & Gill, J. M. Continuity of care and trust in one’s physician: evidence from primary care in the United States and the United Kingdom. Fam. Med. 33, 22–27 (2001).

  26. Anderson, L. A. & Dedrick, R. F. Development of the Trust in Physician Scale: A Measure to Assess Interpersonal Trust in Patient-Physician Relationships. Psychol. Rep. 67, 1091–1100 (1990).

  27. McLeod, C. Trust. in The Stanford Encyclopedia of Philosophy (ed. Zalta, E. N.) (Metaphysics Research Lab, Stanford University, 2015).

  28. Baier, A. Trust and Antitrust. Ethics 96, 231–260 (1986).

  29. Hardin, R. Trust and Trustworthiness. (Russell Sage Foundation, 2002).

  30. Black, D. Autonomy and Trust in Bioethics. J. R. Soc. Med. 95, 423–424 (2002).

  31. Dasgupta, P. Trust as a Commodity. in Trust: Making and Breaking Cooperative Relations (ed. Gambetta, D.) 49–72 (Blackwell, 1988).

  32. Jones, K. Trust as an Affective Attitude. Ethics 107, 4–25 (1996).

  33. Richardson, H. S. Gradations of Researchers’ Obligation to Provide Ancillary Care for HIV/AIDS in Developing Countries. Am. J. Public Health 97, 1956–1961 (2007).




Want to cite this post?




Stevens, I. (2018). Trust in the Privacy Concerns of Brain Recordings. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2018/05/trust-in-privacy-concerns-of-brain.html
