Posts

Showing posts from May, 2018

Ethical Implications of fMRI In Utero

By Molly Ann Kluck. Image courtesy of Wikimedia Commons. When my neuroethics mentor approached me with a publication from Trends in Cognitive Sciences called “Functional Connectivity of the Human Brain in Utero” (1) in hand, I was immediately delighted by the idea of performing an ethical analysis of the use of functional Magnetic Resonance Imaging (fMRI) on fetuses in utero. As of right now, I’m still conducting this analysis. Using fMRI to look at human brains as they develop in utero is groundbreaking for a couple of reasons. For one, there is a vast difference between the fMRI method currently used to investigate developing brains and the previous methods used to examine fetal brain development. Research on developing brains had relied on preterm neonates, or babies born prematurely. While these data are valuable, this method raises questions of validity: early exposure to an abnormal environment (e.g., being in the intensive care unit, where many preterm…

Should you trust mental health apps?

By Stephen Schueller. Image courtesy of Pixabay. If you were to search the Google Play or Apple iTunes store for an app to support your mental health, you’d find a bewildering range of options, including nearly 1,000 apps focused on depression, nearly 600 on bipolar disorder, and 900 on suicide (Larsen, Nicholas, & Christensen, 2016). But how much faith should you have that these apps are actually helpful? Or, to take an even grimmer position, might some apps actually be harmful? Evidence suggests the latter might be true. In one study, researchers examining the content of publicly available bipolar apps found one, iBipolar, that instructed people to drink hard liquor during a bipolar episode to help them sleep (Nicholas, Larsen, Proudfoot, & Christensen, 2015). People should therefore approach app stores cautiously when searching for an app to promote their mental health. One reason people might believe such apps could be helpful…

Presenting... The Neuroethics Blog Reader: Black Mirror Edition!

It is our pleasure to present The Neuroethics Blog Reader: Black Mirror Edition! This reader features the seven contributions from the blog's Black Mirror series, in which six student writers explored the technology and neuroethical considerations presented in various episodes of the British science fiction anthology television series. As Dr. Karen Rommelfanger puts it, this reader "... features critical reflections on the intriguing, exciting and sometimes frightful imagined futures for neurotechnology. Every day, in real life, we move closer to unraveling the secrets of the brain and in so doing become closer to understanding how to intervene with the brain in ways previously unimaginable. Neuroscience findings and the accompanying neurotechnologies created from these findings promise to transform the landscape of every aspect of our lives. As neuroethicists, we facilitate discussions on the aspirations of neuroscience and what neuroscience dis…

Regulating Minds: A Conceptual Typology

By Michael N. Tennison. Image courtesy of Wikimedia Commons. Bioethicists and neuroethicists distinguish therapy from enhancement to differentiate the clusters of ethical issues that arise based on the way a drug or device is used. Taking a stimulant to treat a diagnosed condition, such as ADHD, raises different and perhaps fewer ethical issues than taking it to perform better on a test. Using a drug or device to enhance performance—whether in the workplace, the classroom, the football field, or the battlefield—grants the user a positional advantage over their competitors. Positional enhancement raises issues of fairness, equality, autonomy, safety, and authenticity in ways that do not arise in therapy; accordingly, distinguishing enhancement from therapy makes sense as a heuristic to flag these ethical issues. These categories, however, do not capture the entire scope of the reasons for and contexts in which people use drugs or devices to modify their experiences. Consider psy…

Trust in the Privacy Concerns of Brain Recordings

By Ian Stevens. Ian is a fourth-year undergraduate student at Northern Arizona University. He is majoring in Biomedical Sciences, with minors in Psychological Sciences and Philosophy, to pursue interdisciplinary research on how medicine, neuroscience, and philosophy connect. Introduction: Brain recording technologies (BRTs), such as brain-computer interfaces (BCIs) that collect various types of brain signals from on and around the brain, could be creating privacy vulnerabilities in their users.1,2 These privacy concerns have been discussed in the marketplace as BCIs move from medical and research uses to novel consumer purposes.3,4 Privacy concerns are grounded in the fact that brain signals can currently be decoded to interpret mental states such as emotions,5 moral attitudes,6 and intentions.7 However, what can be interpreted from these brain signals in the future is ambiguous. The current uncertainty that surrounds future capacities to decode complex mental states – and the priva…

The Promise of Brain-Machine Interfaces: Recap of March's The Future Now: NEEDs Seminar

Image courtesy of Wikimedia Commons. By Nathan Ahlgrim. If we want to – to paraphrase the classic Six Million Dollar Man – rebuild people, rebuild them to be better, stronger, faster, we need more than fancy motors and titanium bones. Robot muscles cannot help a paralyzed person stand, and robot voices cannot restore communication to the voiceless, without some way for the person to control them. Methods of control need not be cutting-edge. The late Dr. Stephen Hawking’s instantly recognizable voice synthesizer was controlled by a single cheek movement, which seems shockingly analog in today’s world. Brain-machine interfaces (BMIs) are an emerging technology that promises to bypass all external input and allow robotic devices to communicate directly with the brain. Dr. Chethan Pandarinath, assistant professor of biomedical engineering at Georgia Tech and Emory University, discussed the good and the bad of this technology in March’s The Future Now: NEEDs seminar, "To Be Implanted a…