
Your brain is the next tech frontier

Jan 17, 2025 · 50 min

Episode description

We're entering a new era of brain monitoring and enhancement, but what are the ethical implications? This hour, TED speakers explore the potential and pitfalls of merging our minds with machines. Guests include legal scholar and AI ethicist Nita Farahany, neurotechnologist and entrepreneur Conor Russomanno, neuroscientist and physician Sergiu Pașca and sous chef Kate Faulkner.

Original broadcast date: January 26, 2024.

TED Radio Hour+ subscribers now get access to bonus episodes, with more ideas from TED speakers and a behind the scenes look with our producers. A Plus subscription also lets you listen to regular episodes (like this one!) without sponsors. Sign-up at plus.npr.org/ted.

Learn more about sponsor message choices: podcastchoices.com/adchoices

NPR Privacy Policy

Transcript

This is the TED Radio Hour. Each week, groundbreaking TED Talks. Our job now is to dream big. Delivered at TED conferences. To bring about the future we want to see. Around the world. To understand who we are. From those talks, we bring you speakers and ideas that will surprise you. You just don't know what you're gonna find. Challenge you. We truly have to ask ourselves, like, why is it noteworthy? And even change you. I literally feel like I'm a different person. Yes. Do you feel that way?

Ideas worth spreading. From TED and NPR, I'm Manoush Zomorodi. In 2017, Nita Farahany and her family... went through a devastating experience. And it's something she's okay talking about now. Yeah, so we suffered the loss of our second daughter, Calista. She fell ill. And ultimately, after a prolonged hospital stay, she died from complications of it on Mother's Day in 2017.

The day-to-day became almost unbearable for Nita. From the first moment that I took her into the emergency room and the weeks and months in the hospital, there were just so many vivid images that really left me with so much trauma. And I ended up with PTSD because some of the kind of harrowing...

Cries and moments just seared themselves on my brain. And so it got to the point where I couldn't sleep and I was pretty dysfunctional. Nita tried traditional therapy, but it didn't do much. And then eventually... A psychologist helped me through it and was able to help me using exposure therapy, which was also really traumatic. Basically, I would clench up every time I...

you know, tried to go through the exposure therapy and not allow myself to, you know, kind of re-experience the memories. But that's a really painful process to go through. At the time, Nita thought: If only there was a way to reprogram her brain so she could stop having these intrusive thoughts and feelings. And now there is. Today, there are more innovative neurotechnology

procedures that are available, that had they been available to me, I would have opted for in a heartbeat. Neurotechnology, devices that let us peer into the brain, monitoring, even guiding its activity. In Nita's case, she believes a treatment called decoded neurofeedback might have helped her. Basically, you know, you can go into an fMRI machine (functional magnetic resonance imaging),

where, you know, when you recall the memory, it's mapped in your brain. Once doctors know which path your brain is using to recall that memory, they can start using it to do something else. Take it over. That mapping allows for the development of something like a game, like playing basketball, to implicitly reactivate.

those same pathways, where by playing that game, which is a much more pleasant experience than remembering the trauma, you can essentially overwrite those memories with more positive associations instead. So when that path in her brain was activated, instead of feeling total panic, Nita would have felt more relaxed. And so it's a different and, you know... in many ways, less traumatic way of working through PTSD or working through any kind of traumatic memory or experience.

So that's just one of the different techniques that people have been working on to have innovative uses of neurotechnology to enable people to work through trauma or to work through different neurological diseases and disorders. So the memory is still there, but the visceralness of that thought is lessened? That's right. Yeah, so it's...

Like any other memory, it's faded more now. It's not literally reliving it and re-experiencing it. And PTSD, for me at least, and I think for a lot of people who experience it, you're back there, feeling it all again, all of the overwhelming, you know, fear and trauma that you're experiencing in the moment, you're re-experiencing as the memory kind of takes over again, versus how we usually remember, which is

you know, kind of at a distance with reflection. It may hurt still, but it doesn't grip us and take us over and put us back there in that same kind of way. So that does sound amazing. If there was technology that could help with that more easily than the years of therapy that you went through, that would be great. But also, it makes me think there are some things that people would like to forget.

And are we going to get to the point where, you know, like in that movie, Eternal Sunshine of the Spotless Mind, there will be technology that could maybe help us do that? I mean, it could be. So Nita Farahany knows a lot about neurotechnology and its pros and cons because she's a law professor at Duke University studying the ethics of how we should and shouldn't use this technology in the future,

which she says is coming faster than we might think. Neurotechnology in headbands, smartwatches, earbuds will be able to track our health, including, eventually, our thoughts and emotions. Yeah. So, I mean, there are brain sensors that pick up electrical activity in the brain. Major tech companies are really racing to embed these brain sensors

to be like the heart rate sensors and other sensors we have in everyday objects, but tracking something very different, which is being able to tell if a person is happy or sad or if they're paying attention or their mind is wandering or if they're bored or engaged, tired, or falling asleep at the wheel, for example. There are companies like SmartCap that have been selling these EEG headcaps that allow

a driver or a pilot or somebody who's in mining, for example, to track their fatigue levels and give more accurate data about whether they're starting to get to dangerous levels of sleep. I mean... This sounds cool, easier, but where do we start to cross the line between technology understanding our intention based on tiny movements or brain activity to understanding

our thoughts and ideas and our feelings? Yeah, it's a great question. So, you know, there was this amazing study that came out, I think in April of last year, where using more sophisticated neurotechnology, which is like these giant, you know, MRI machines that a person goes into. They had people listen to podcasts and then trained a generative AI classifier using GPT-1

to say like, this is what the person's listening to. This is what their brain activity looks like. This is what they're listening to. This is what their brain activity looks like. And then... had them listen to something and have the classifier try to decode what that was. And it was, with a really high degree of accuracy, able to decode a lot of what the person was hearing or what they were imagining.

just based on brain activity. And that, again, is more sophisticated technology than EEG. And, you know, it's peering more deeply into the brain, but other researchers are applying that same concept to try to decode EEG activity. And so the question of where we cross the line: as you're starting to wear everyday devices, you know, earbuds that are tracking your fatigue levels or picking up your intention to type or to swipe, it's recording that brain activity at all times.
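
To make that concrete, here is a minimal, purely illustrative sketch of the decoding setup Farahany describes: pair recordings of brain activity with the stimulus a person was hearing, fit a classifier on those pairs, then ask it to decode new recordings. Everything below is a hypothetical stand-in (toy random data, a simple scikit-learn classifier); the study she refers to used fMRI recordings and a generative language model, not this code.

```python
# Illustrative sketch only: decode "which passage was the person hearing?"
# from stand-in brain-activity features. Toy data, hypothetical sizes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features, n_stimuli = 600, 200, 4   # hypothetical dimensions

X = rng.normal(size=(n_samples, n_features))      # stand-in "brain activity"
y = rng.integers(0, n_stimuli, size=n_samples)    # which passage was playing
X += np.eye(n_stimuli)[y] @ rng.normal(size=(n_stimuli, n_features))  # weak stimulus signal

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("decoding accuracy on held-out data:", decoder.score(X_test, y_test))
```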

These models are getting more and more sophisticated at being able to decode what that means. It's not that hard to see that we're quickly moving into a world where what you're thinking and feeling is just as transparent and can be just as easily decoded using AI and neurotechnology. Over the past decade, there have been incredible advances in understanding how the brain works. Now we are on the cusp of a new era of brain monitoring and enhancement.

And the exciting potential and pitfalls of merging our minds with machines and manipulating the brain, well, they're mind-boggling. And so today on the show, Brain Hacks. Neurotechnology that could treat devastating cognitive diseases, mental illness, and brain injuries. But as legal scholar Nita Farahany warns,

also put our most private thoughts and emotions in jeopardy. And that's the part that I fear that when... you know, we get to this world of brain transparency, if we don't have the right kind of safeguards in place, that that which I think is so fundamental to what it means to be human, what it means to flourish as a human, may suddenly not be our own. Here's Nita Farahany on the TED stage. This new category of technology presents unprecedented possibility, both good and bad.

Consider how our physical health and well-being are increasing while neurological disease and suffering continue to rise. 55 million people around the world are struggling with dementia, with more than 60 to 70 percent of them suffering from Alzheimer's disease. Nearly a billion people struggle with mental health and drug use disorders. Depression affects more than 300 million.

Consumer neurotech devices can finally enable us to treat our brain health and wellness as seriously as we treat the rest of our physical well-being. Regular use of brain sensors could even enable us to detect the earliest stages of the most aggressive forms of brain tumors like glioblastoma, where early detection is crucial to saving lives.

The same could hold true for Parkinson's disease, Alzheimer's, traumatic brain injury, ADHD, and even depression. But all of this will only be possible... if people can confidently share their brain data without fear that it will be misused against them. You see...

The brain data that will be collected and generated by these devices won't be collected in traditional laboratory environments or in clinical research studies run by physicians and scientists. Instead, it'll be the sellers of these new devices, the very companies who've been commodifying our personal data for years. Which is why we can't go into this new era naive about the risks or complacent about the challenges that the collection and sharing of our brain data will pose.

Brain sensors provide direct access to the part of ourselves that we hold back, that we don't express through our words and our actions. Brain data, in many instances, will be more sensitive than the personal data of the past because it reflects... our feelings, our mental states, our emotions, our preferences, our desires, even our very thoughts. I would never have wanted the data that was collected as I worked through the trauma of my personal loss.

to have been commodified, shared, and analyzed by others. These aren't just hypothetical risks. Take Entertech, a Hangzhou-based company that has collected millions of instances of brain activity data as people have engaged in mind-controlled car racing, sleeping, working, even using neurofeedback with their devices. They've already entered into partnerships with other companies to share and analyze that data.

I mean, what's your biggest fear here, Nita? What is the worst case scenario if we don't start to put some legal safeguards around this tech? I think honestly that... Human flourishing is at risk here, like fundamentally what it means to be human is at risk. In a minute, Nita Farahany explains how our ideas of civil liberty need to change in this new era of neurotechnology to protect our most private ideas and thoughts. On the show today,

Brain Hacks. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR. We'll be right back. On the show today, brain hacks and the future of neurotechnology. We were just talking to Duke law professor Nita Farahany. She's the author of the book The Battle for Your Brain. She says we need to rethink what our right to privacy even means in this new era. So for me, you know,

What I've been talking about is our right to cognitive liberty, our right to self-determination over our brain and our mental experiences. And what I mean by that is, like, fundamentally... your right to develop your own identity, your own thoughts, like to have a space where you're able to reflect, to think, to think about being a child and trying to sort through who you are. I have a third grader, so an almost nine-year-old.

And watching her, you know, more and more come into her own and trying to figure out who she is, all of that happens in the safe space, the safe space of mental reprieve. You know, one of the most jarring things, I think, that I came across in researching my book was... the classrooms in China where children were being required to wear headsets that track their brain activity to track their attention levels throughout the workday.

And the thought of a child, kind of no matter what that headset can or can't pick up, but the chilling effect that that has on the ability to think freely, to be able to develop your own internal sense of self, to dare to think differently at an age where it's so hard already to go against the norm.

That's the space that I worry about us eroding, getting to a place where people are afraid to even think. And if we are afraid to even think freely, the capacity to be able to... figure out who you are and to, you know, dare to dream big or to dare to, you know, help us change the path in the course of humanity or your own path in life is compromised.

When I hear you say that, it makes me particularly worried because I think we're talking about technology that will be very subtle, right? Like, you know, when I tap something or I swipe something, but if I... think something or if someone is collecting a brainwave, that seems very discreet. It's very discreet and hidden and invisible. So one of the risks I worry about...

is a lot of times with an emerging technology, especially a whole new class of technology, it becomes normalized and hidden in ways that we can't see. So, you know, just... To give you an example of that, some of the places in which these EEG headsets have been introduced already are places like you go into an Ikea store to look at a series of rugs and you're given a headset.

And told, like, only if your brain can prove that you love the rug are you going to be able to take it home. This actually happened. What? Yeah, they had this marketing gimmick in Brussels where they had these limited edition rugs. The idea of these limited edition rugs was to bring art to people at a reasonable price, and they were worried that people were buying them and reselling them on places like eBay. And so...

They ran this marketing campaign where you had to wear an EEG headset while looking at the rugs to prove you loved the rug. And if you did, you could take one home. And if you didn't, you couldn't. We can laugh at that in so many different ways. It was clearly a marketing gimmick, but it normalizes it, right? You encounter technology in situations, novel technology in situations that...

are non-threatening, that are invisible. You don't realize what's being collected. And suddenly we've breached this category of giving away, you know, our most intimate selves without even realizing that we're doing so. You know, I don't think people realize what that world might look like. It's not just that you've given up your mental privacy. You've given up the keys to who you are to be able to mentally shape and change you.

That's happening already with algorithms, but the ways in which this can happen so much more precisely and so much more in a customized way. I worry about a world of more almost brainwashing of people in ways that really limit our ability to think freely. Unless people have individual control over their brain data, it will be used for micro-targeting or worse, like... the employees worldwide who've already been subject to brain surveillance in the workplace to track their attention and fatigue.

to governments developing brain biometrics to authenticate people at borders, to interrogate criminal suspects' brains, and even weapons that are being crafted to disable and disorient the human brain. Brain wearables will have not only read but write capabilities, creating risks that our brains can be hacked, manipulated, and even subject to targeted attacks. We must...

Act quickly to safeguard against the very real and terrifying risk to our innermost selves. So somebody listening is like, okay, I don't want that future. Tell me what to do right now. What would you say? Well, I'd say the first place that I've been advocating that we address is like a system of laws. And that's not just because I'm a law professor. I think it's because...

We're starting in a world in which the balance is not in favor of individuals, right? We've gotten to this place where like... The collection of data is the norm, and so the balance is in favor of the tech companies, and we have to reclaim some of that power for individuals. And in the past year, a lot has happened. So UNESCO has launched a new...

project around trying to develop ethical guidelines around neurotechnology. The UN has a committee that's putting together a report on neurotechnologies and their impact on human rights. And a lot of the AI legislation that's been happening... and the conversations that have been happening. Some of those have started to recognize the convergence of these fields and include some specifics around the processing of biometric and personal data, but there needs to be a lot more convergence

between the AI conversations and the neurotechnology conversations. And so I've been advocating for a global right to cognitive liberty, a right to self-determination over our brain and mental experiences. And really, that's just a framework to update three existing human rights: our right to privacy, to explicitly include a right to mental privacy,

the right to freedom of thought, to more explicitly cover a right against interference and manipulation and punishment for our thoughts, and the right to self-determination, which has been recognized as a collective and political right, to also be an individual one. It feels like we are hurtling towards an era, and this sounds so dramatic, but where man and machine are merging in many ways. I agree with you.

How do you think about it? So, you know, I think of it as kind of co-evolution. The way I see that is human thinking is relational. And so... You know, our technology and our dependence and interdependence on technology is increasing, which means our relational thinking with respect to technology is being shaped and changed with that technology.

I want us to be more in the driver's seat of that. Actually knowing what's happening with that coevolution and to be able to drive that coevolution, rather than the technology in the hands of a few powerful people deciding how our brains are relationally going to change with respect to that technology. And I think this is a whole category where...

It has such a huge implication for how it could reshape what it means to be human that it's so important that we get it right. That's Nita Farahany. She's a law professor at Duke University and the author of The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology. You can see her full talk at TED.com.

So to get to a future where all this neurotechnology is even possible, a lot of technical work is being done in labs all over the world. Like one that I visited in Brooklyn at a company called OpenBCI. Wait, what am I going to put on my head? Yeah, so the new Galea headset is a crazy beast. I was there to try out a brain-sensing virtual reality headset called the Galea. I'm just going to adjust this island back here. It's a helmet-

like contraption that is at the bleeding edge of neurotechnology and costs about $25,000. And the one in the back also has three active EEG electrodes. So what are they collecting then? They're collecting brain activity. The headset is aimed at academics, medical researchers, and other tech developers who are using the hardware as a starting point for their own projects.

Founder Conor Russomanno walked me through what was happening as I looked at a screen and all the sensors attached to my scalp and earlobes began processing lots of data and building a very rough profile of how my mind works. It kind of records your EEG, your eye tracking, your heart rate, your heart rate variability, all these things against

these stimuli. The result is a display of all my brain waves: delta, theta, alpha, beta, and gamma. It's like a rainbow of brain activity that I'm looking at. I feel like naked because I feel like you can look at these brainwaves and be like, wow, she's a nervous wreck inside, but she presents as like a normal human being. Conor could not read my mind, but he and the computer

could quickly make sense of the electrical pulses my brain sent out to move tiny muscles. Let's try flexing or like kind of winking your left eye. And then an incredible thing happened. With barely a wink of an eye, I was able to save a virtual cat from evil rats. All right, here I go. All right, so there's left. Try out right.

There it is. Wow, nice. Try right again. You want to collect those bones, but you've got to avoid those pesky rats. Yeah. All this incredibly impressive, complicated, and expensive technology. To play a video game. I know. But it's early days. And the goal is to get to the point where we can just think something and have it happen on a screen. You know, at OpenBCI, we're really focused on...

The path of least resistance to solving the problem, which is decoding the mind, intention, emotions, what makes us who we are, and then how do we augment that? How do we improve that? To do that, Connor and his team are piecing together how the mind works with the data that's easiest to collect, like the brainwaves that show up when I twitch my eye. The computer begins to understand that an eye twitch...

shows a specific wave pattern, and that means move the cat. Eventually, with a lot of repetition, I could just think about moving the cat, and it would know to move it. So this is... Maybe the beginning of mind-reading technology. If we figure it out the right way, it becomes...

the greatest tool that humanity has ever created. When you say the greatest tool, tell me, tell me, give me an example, like to somebody listening, like, what do you mean the greatest tool? How is that going to help me? Okay. So if you knew your computer was keeping track of your emotions, your deepest, deepest, darkest secrets, but also just like, are you focused? Do you need a nap? Do you need to go for a walk?

Are there these kind of like basic things that you don't even know about yourself, but I do because I now understand your subconscious better than you do. We're already there.

in a lot of ways, but, you know, we have the power, the potential, with modern neurotechnology and computers to make the understanding of our subconscious even more powerful. And so when I think about it in a real-world context, like, I do want the ability to walk down the street and not have to bend my neck down to look at my emails and not have to be like, man, I got to open up Google Maps and copy and paste that

address from my text thing and then, like, make sure that it's northeast, not northwest. You know, like, it could be, um, turning down the brightness of the display in your sunglasses because it's noticing that you are getting too much light or you're being stimulated too much, and it knows that. Instead of giving you more of what it thinks you want, it knows that it should be prioritizing what you need and not what you want.

But that's not what Conor thought the purpose of this technology was when he first started the company in 2013. He learned a lot just watching how people in other fields used his hardware. In the beginning, our goal... was to build an inward-pointing telescope and to share the blueprints with the world so that anybody with a computer could begin peering into their own brain. Conor Russomanno continues from the TED stage. At first, we were an EEG-only company.

We sold brain sensors to measure brain activity. But over time, we discovered people doing very strange things with our technology. Some people were connecting the equipment to the stomach to measure the neurons in the gut and study the gut-brain connection and the microbiome. Others were using the tools to build new muscle sensors and controllers for prosthetics and robotics. What we learned from all of this is that the brain, by itself,

is actually quite boring. Turns out, brain data alone lacks context. And what we ultimately care about is not the brain, but the mind, consciousness, human cognition. When we have things like EMG sensors to measure muscle activity or ECG sensors to measure heart activity, eye trackers, and even environmental sensors to measure the world around us, all of this makes the brain data much more useful.

But the organs around our body, our sensory receptors, are actually much easier to collect data from than the brain, and also arguably much more important for determining the things that we actually care about, emotions, intentions, and the mind overall. Additionally, we realized that people weren't just interested in reading from the brain and the body. They were also interested in modulating the mind through various types of sensory stimulation, things like light, sound.

haptics and electricity. It's one thing to record the mind, it's another to modulate it. The idea of a combined system that can both read from and write to the brain or body is referred to as a closed-loop system or bi-directional human interface. This concept is truly profound, and it will define the next major revolution in computing technology.
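
As a toy illustration of that closed-loop idea, here is a small, hypothetical simulation (not OpenBCI's code or API): a "read" step estimates an internal state from a noisy signal, and a "write" step applies a stimulus whose effect feeds back into the next reading. Every function and number below is an invented stand-in; a real system would read actual EEG/EMG hardware and drive light, sound, haptics, or electrical stimulation under strict safety constraints.

```python
# Hypothetical closed-loop sketch: read a noisy estimate of a state,
# write back a stimulus, and let that stimulus shape the next reading.
import random

def read_signal(state: float) -> float:
    """Pretend sensor: the true state plus measurement noise."""
    return state + random.gauss(0, 0.05)

def write_stimulus(strength: float) -> float:
    """Pretend actuator: returns how much the stimulus lowers the state."""
    return 0.3 * strength

fatigue = 0.2  # invented internal state the loop tries to regulate
for step in range(20):
    estimate = read_signal(fatigue)        # "read" half of the loop
    stimulus = max(0.0, estimate - 0.5)    # intervene only above a threshold
    fatigue += 0.05                        # the state drifts upward over time
    fatigue -= write_stimulus(stimulus)    # "write" half closes the loop
    print(f"step {step:2d}  estimate={estimate:.2f}  stimulus={stimulus:.2f}")
```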

It sounds like you think that we are at a moment where on and off are not going to be the way we live anymore. Yeah. There's all these moments where there's friction and there doesn't need to be. Like the computer, if we trusted it, would just access that information for us. And it would understand that it put the wrong thing in and that we're frustrated and it would change it without us changing it for it. And we're already there, right?

Most people feel uncomfortable when their phone is not with them. It's true. You know, most people feel like there's a piece of them missing when they're like, where's my phone? Where's my phone? Oh my gosh, did I lose it? Oh, what does that mean? I've lost part of my brain. There's memories in there that I don't have without my phone. And this is part of our mind right now.

And it changes us. It manipulates us. And I think, you know, we're right at this kind of inflection point where we're figuring out how to put real human emotions directly into the computing loop. When you have... products that not just are designed for the average user, but are designed to actually adapt to their user. That's something truly special. When we know what the data of an emotion or a feeling looks like, and we know how to make that data go up or down.

then using AI, we can build constructive or destructive interference patterns to either amplify or suppress those emotions or feelings. In the very near future, we will have computers that we are... resonantly and subconsciously connected to, enabling empathetic computing for the very first time. Okay. So tell me what that looks like to you. Like if everything goes according to plan for you, what is the potential that you see coming from the technology that you're building?

I think the potential is immense. And so in my mind, success is building a new type of personal computer where the owner, the user of the computer... has total agency over their own data and how the computer is augmenting their mind. We're working on what I believe is the greatest challenge of... This century and maybe the greatest challenge that humans have ever faced, which is understanding intelligence, human intelligence. That's Conor Russomanno, CEO of OpenBCI. You can watch his full talk.

On the show today, brain hacks. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR. Stay with us. It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. On the show today, brain hacks. So far, we have talked about technology augmenting the brain. But what if there was a way to diagnose and even treat severe neurological diseases at the molecular level? So these are really severe conditions for which we have...

very poor understanding of the biology, and no therapeutic insights. So those are the conditions that we're focusing on. This is Dr. Sergiu Pașca. He's a professor of psychiatry and behavioral sciences at Stanford University. He also directs the Stanford Brain Organogenesis Program, where he researches neuropsychiatric disorders, including the most debilitating forms of autism. I've been very interested in autism.

Early on, when I was actually in medical school and met my first patient with autism and was just like struck by... I guess like how little we could do for patients with autism, and still today is the case, we can still do very little. We still don't understand the biology very well. We still diagnose disorders the way we were doing in the 19th century.

Psychiatry, and of course neurology, are, to a large extent, the only branches of medicine where the organ of interest, the brain, is inaccessible. So if you have a patient with cancer... You remove that tumor, right? Or you take a biopsy, you bring it to the lab and you can directly study those cells in a dish and identify what are the molecules and the pathways that are abnormal and come up with treatments.

In psychiatry, we can't just take the human brain out. You know, what the past few decades of medicine have indicated is that to find new treatments that are biologically inspired, you need access to the tissue. And really, this has become my mission. Sergiu Pașca continues from the TED stage. Today, most of what we know about the human brain comes from studies in animals, typically mice.

And while we've learned a lot from these animal brains, the characteristics that make the human brain unique and uniquely susceptible to disease remain mysterious. Dysfunction in the human brain causes brain disorders such as... autism and schizophrenia and Alzheimer's disease, devastating conditions that are poorly understood.

Nearly one in five individuals suffers from a psychiatric disease. What is even more striking is that the lowest success rate for finding new drugs is in psychiatry, out of all the branches of medicine. Until now, we couldn't really access the human brain. So you have found a way of studying brain tissue.

Take us back to how you went from being a clinical psychiatrist to one who is researching disorders at the genetic, even molecular level. Yes, absolutely. So as I was finishing my clinical training about 15 years or so ago, there was a remarkable breakthrough made by a Japanese scientist, Shinya Yamanaka, who reported at that time that under some conditions, you could take a cell that is already fully formed and mature

and push it back in time through a genetic trick to look pretty much like the stem cells from which the entire organism is built. So this was called cell reprogramming. He received the Nobel Prize for this remarkable discovery a few years after. And it essentially involves taking, for instance, skin cells from any individual, take a few skin cells or blood cells, putting them in a dish.

and then inserting in those cells a few genes, just briefly. And the dream was, at that time, that you would be able actually to make human neurons at the bottom of a dish from those patients in a non-invasive way that would not involve taking any neurons from anybody's brain, but rather reconstructing or reverse engineering this process in the laboratory.

I'm here to tell you that we can finally grow parts of the human brain from any individual and then build functioning human circuits in a laboratory dish. We start by asking a patient to provide a small skin sample. We then take those skin cells, reprogram them by putting in a series of genetic factors.

And push them back in time so that those skin cells become stem cells. It's like cellular alchemy. These stem cells have almost magical abilities to turn into any other cell type. So what do we do? We take the stem cells, we dissociate them, we then aggregate them so that they form spheres or tiny balls of cells. We then take those, move them into a special plate where there is a kind of chemical soup.

And that chemical soup will allow them to grow and transform and turn into a brain organoid. And let me be clear, these are not brains in a jar. These are parts of the nervous system in a laboratory dish. Each of them contains millions of cells. And we can even listen as they fire electrical signals. Or we can watch them as they sparkle with electrical activity. Or we can image inside and watch the cells as they communicate with each other. Isn't it remarkable to think that...

Just a few months ago, these cells were skin cells in a patient. And now they are neural cells at the bottom of a dish that we can study at ease. Wow. So you can figure out how your patient's brain cells evolved by using stem cells to grow neural cells that you call brain organoids. I mean, it's amazing. Yes. So for Timothy syndrome, which is very rare,

the patients will not just have autism and epilepsy, but later on, they'll have a heart problem. They're more susceptible to infections. But the patients have literally... just a letter in their genome that has been modified. And that tiny, tiny mutation in a calcium channel has devastating effects. And this mutation that causes Timothy syndrome...

was predicted to cause that channel to stay open slightly longer so that more calcium would get inside the cell. But of course, nobody has ever seen it. So the first experiment that we've done, which I still... vividly remember even today, you know, the night when it actually worked for the first time. And I was in the microscopy room and we finally had neurons

and could actually see that there was more calcium going inside the cells. And that was really a very exciting moment because it actually showed us that you take the cells from patients and you can see a defect, a biological consequence of a mutation. We now have literally hundreds of genes that are associated with autism, intellectual disability, ADHD, schizophrenia. And the reason why that is important... is because genes are closer to the molecular pathways. So they offer a great entry point.

And of course, I am convinced that in understanding the molecules that are important for that, we're going to discover very important pathways that are causing disease. So forecast for me, you know, if you had a fantasy come true, what would that look like? Would it mean that you would be able to stop neurons from...

having dysfunction or problems before they got there? Would it mean that you would do genetic testing? I want to understand the implications of what this might mean for future generations of people. So we've helped close to 250 labs around the world to learn this technology. We're going to start to understand what goes wrong in those cells. And I can tell you that...

for Timothy syndrome in particular, we got such a good understanding that a therapeutic opportunity just arose pretty much naturally out of those experiments. And so we designed a strategy where we essentially destroy that RNA that carries the mutation. And this would likely involve an injection of this therapy. Of course, we're still far away. But in principle, it would involve injecting a piece of DNA that will modulate that. So in theory, you could stop the mutation in its tracks.

But how old would a person have to be to get a treatment like this? I mean, I guess in utero? Certainly for neurodevelopmental disorders, you want to intervene as early as you possibly can because, you know, once damage has been done to the nervous system, it is probably going to be hard to reverse. And so it is foreseeable that some of these patients actually will be diagnosed within the first few weeks after being born,

which means that one could intervene very early on in the course of this disease. Where do we start to think about what the ethical implications are of when we intervene in the development of someone? This is a great issue and something that actually preoccupies me quite a lot. And I spend a lot of time now thinking about the ethical societal implications of the work that we do, both in terms of delivering some of these therapies and how early you would intervene.

For Timothy syndrome, it's likely to be justified to intervene very early with a therapy like this, especially because children are very sick, very debilitated by this condition. But there will be other neuropsychiatric conditions that are perhaps not as severe, where we're not as good at predicting. And I think there we're going to have to think very carefully. When do we want to intervene? And what are the risks for intervention?

I think that's important to keep in mind because I'm sure there are some people listening who think, you know, we've just gotten to a point in society where we've started to appreciate that people's brains work differently. The word neurodiversity is used often, that we start to

understand that, you know, people come in all sorts of shapes and sizes and ways of seeing the world. Absolutely. And certainly our goal has never been to try to either change or cure anybody who doesn't want to change in any way. Most of our work has been focused on addressing these devastating conditions of the human brain: severe intellectual disability, where any improvement, very small improvements there, will make a huge difference because there are no therapeutic approaches

that are even close to being curative in those conditions. I think it's going to be an exciting time for human neurobiology and for psychiatry. And my hope is that slowly but surely... psychiatry will be moving into a molecular era. That's Dr. Sergiu Pașca. He's a professor of psychiatry and behavioral sciences at Stanford University, where he directs the Stanford Brain Organogenesis Program. So we have talked about all kinds of brain hacks, and we want to end our show with a very personal story.

So for people with neurological conditions like epilepsy, the lack of control over their brain can be terrifying. It's very, it's a crushing limitation for sure. This is Kate Faulkner. She's a sous chef in Colorado. She's also the sister of TED Radio Hour producer Rachel Faulkner White. And when Kate was a teenager, she started having tiny seizures called...

absence seizures. Where I would shake a little bit and drop whatever I was holding, but I wouldn't black out or fall over or lose consciousness or anything like that. At the time, she didn't know what they were. And she wasn't formally diagnosed with epilepsy until a few years later. My first big seizure was in the summer of 2017. And luckily it was with other people, but I have no memory of it happening. It was...

One minute I was sitting at the kitchen table and the next minute I was with a bunch of EMTs who were in the living room with me. After that, Kate was started on medications that were supposed to prevent these seizures. And they seemed to help at first. But a few years later, she had another big one. I was driving in my car, which was terrifying. Luckily, it was the only place on the route that I was driving

that didn't have any trees or guardrails, and I just kind of drifted across the field. After that one, I stopped driving. This kept happening more and more frequently. I was having seizures alone at home. I was having seizures at work. I eventually had to leave that chef job because open flames and sharp knives and seizures are not a great combination. It's devastating to feel so limited in what I can do. It's the idea of hurting someone else if I was driving a car.

Like, I don't know how I managed to not hurt anybody. I am haunted by the idea of what could have happened. And I really missed having the freedom to go where I wanted and to go hiking by myself or go to the store by myself. Go swimming or take a bubble bath. But in 2023, Kate's neurologist had a new idea: to surgically implant a vagus nerve stimulation device. Which is a battery that goes in your chest on the left side, just underneath the skin.

And it's connected to a wire that wraps around the vagus nerve, which is part of the parasympathetic nervous system. And the battery is programmed by the neurologist to emit small electrical pulses. It sends an extra burst of electricity into the brain and can increase blood flow to certain areas of the brain and can... prevent seizures from happening. And right now my battery is programmed to send a pulse every five minutes for a 30 second interval.

The interesting side effect is that whenever the electrical pulse goes off, there it goes. So yeah, so as you can hear it happening right now, the battery is going off. The vagus nerve also runs around your vocal cords. And so your voice...

goes a little strange. It's not painful at all. I remember for the first... couple weeks, it felt like it was really hard to catch my breath, even though I was breathing normally and breathing fine. It's kind of that same feeling, so that was hard to get used to. And after 30 seconds, then it goes away, and my voice goes back to normal. It's not a cure for epilepsy, and it's not effective for everyone. But for Kate...

I haven't had a seizure since I got the device put in. I now have control to some extent over a part of my brain that I didn't have before. The possibility of the freedom that... this could potentially bring, like, I might be able to start titrating off the epilepsy drugs that I'm on, which have some not-fun side effects. And I want to be able to go through

a day without having the intrusive thought of I'm carrying a 40 gallon pot of hot soup and thinking on today's episode of bad times to have seizures. It'd be wonderful to have that kind of freedom. I mean, it's an incredible piece of technology, and even though I have a weird voice now, I'm optimistic that this will make life easier.

and open up possibilities. That's Kate Faulkner. We are so grateful to her for telling her story. And you can learn more about her condition and treatment at ted.npr.org. Thank you so much for listening to our show, Brain Hacks. This episode was produced by Rachel Faulkner White, Katie Monteleone, and Fiona Geiran. It was edited by Sanaz Meshkinpour, James Delahoussaye, and me.

Our production staff at NPR also includes Matthew Cloutier and Harsha Nahata. Irene Noguchi is our executive producer. Our audio engineers were Robert Rodriguez, Margaret Luthar, and Ted Mebane. Our theme music was written by Ramtin Arablouei. Our partners at TED are Chris Anderson, Michelle Quint, Alejandra Salazar, and Daniella Balarezo. I'm Manoush Zomorodi, and you've been listening to the TED Radio Hour from NPR.

This transcript was generated by Metacast using AI and may contain inaccuracies. Learn more about transcripts.