Welcome back to the Law School Toolbox podcast. Today we're excited to have lawyer, anthropologist, and author Petra Molnar here to talk with us about her new book, The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence. Your Law School Toolbox host today is Alison Monahan, and typically, I'm with Lee Burgess. We're here to demystify the law school and early legal career experience, so that you'll be the best law student and lawyer you can be.
Together, we're the co-creators of the Law School Toolbox, the Bar Exam Toolbox, and the career-related website CareerDicta. I also run The Girl's Guide to Law School. If you enjoy the show, please leave a review or rating on your favorite listening app. And if you have any questions, don't hesitate to reach out to us. You can always reach us via the contact form on lawschooltoolbox.com, and we would love to hear from you.
And you can check out the Bar Exam Toolbox podcast if the bar exam is on your radar. And with that, let's get started. Welcome, Petra.
Thanks, Alison. Thanks for having me.
Oh, it's definitely my pleasure. To start us off, can you give our listeners some information about your background and your work, just so they have some context?
Yeah, would be happy to. So, I'm a refugee lawyer by training and a social anthropologist, and I've been working on issues around migration for 15, 16, 17 years. It's been a while. But I was focusing more on issues around, let's say, gender-based violence, immigration detention, more traditional, so to speak, issues.
And then about six years ago, I fell into this area of trying to understand how new technologies are impacting immigration and people on the move, and it's been a bit of a journey since.
I would imagine. I was actually a programmer before I went to law school. So, I call myself techno-ambivalent. Some good points, some bad points, so I'm really interested to talk with you about this. If people want to learn more about you or reach out, how can they do that?
I have a website, which is PetraMolnar.com, where a lot of the work lives. And I also run a project called the Migration and Technology Monitor, which might be of interest. That's where we house a lot of the work that we do directly with refugees and people on the move at this intersection of borders and tech. And I'm also on Twitter.
Cool. We will put all that in the show notes for people who are interested. Tell me generally about the book. What was your goal in writing this?
So, the book has been in the works for almost six years, which is wild. And it's very strange to see it out there in the world after all this time. But what motivated me to write it in the way that I did is that I wanted it to be accessible. It's not, I would think, overly academic, but it's rather story-driven. And that was deliberate, because I didn't want to write an academic text that was very expansive and impossible to read and inaccessible, but rather a text that anybody could pick up.
And I have to say, one of the biggest moments, highlights of my career is the fact that it's actually available at Walmart and Target, which is amazing. That's really what I wanted. But I think for me, that was important because I'm not a tech person. Really, six years ago, I barely knew what an algorithm was. We're talking like Wikipedia-level knowledge. But I wanted to understand it from a legal perspective and a human rights perspective, but most importantly, a human perspective.
And that's what the book tries to do, to humanize the issue.
Yeah. We'll talk a little bit about this later, but I definitely loved the story aspect of the book. And I do think that was one of the most interesting points. But we'll talk about that later. One of the first things that struck me reading this, which I think is really important and often misunderstood, is that you say - I'm not sure I'm phrasing this right, but - immigration is really a civil issue, not a criminal one. So, can you tell me a little more about that and how this process is supposed to work?
Yeah, sure. And I think that's a great place to start the kind of context that underpins immigration and immigration decision-making, and also some of the decisions that get made by states to weaponize migration and conflate people who are on the move and people who cross borders with criminality. This is something that scholars have been calling "crimmigration" - kind of a joining of the two words of criminalization of migration. And that's a trend that we've been seeing globally.
We seem to have lost the plot on the fact that, A) people are exercising their internationally protected right to seek asylum, and B) there are civil and administrative procedures in place that are meant to govern the way that decisions get made.
So, there is this kind of tension, where in a lot of jurisdictions like the United States or Canada, where I've been working, we have this legal system that supposedly says, you come here, you are adjudicated based on these metrics imported into our domestic law through the international system - like the Refugee Convention, for example, which has been with us since 1951. And then you go along your merry way.
But what we've been noticing as a trend is that, again, these civil procedures that are meant to be the governing framework behind the way that decisions get made are being eroded, because people who are coming are presupposed to be frauds, criminals, people cheating the system. And it just exacerbates the vast power differentials that are inherent in the space already.
Yeah. I actually took an Asylum Law class in law school, which was actually very interesting. And it's shocking when you see these pictures and kind of the narrative and the media around the stuff that's happening at these borders. And it does, to the average person, I think, look like, "Oh, this is like a criminal thing. These people are trying to break in."
But for people who may not be as familiar with it, can you just give us a really high-level overview of how the asylum process is theoretically supposed to work?
Yeah, sure. So, the asylum process starts with the foundational idea that everybody on the planet has the right to seek asylum. This is something that's protected by international law and then also domestic law. And there's a certain set of tests that you're supposed to meet in order to qualify for refugee protection. There's a refugee convention definition that sets out five categories, at least one of which you're supposed to meet to then be considered for protection.
But it is very interesting, precisely what you just said, Alison, because we seem to have almost reversed the process of how these decisions get made, right? Because in criminal proceedings, at least in most jurisdictions, you're considered innocent unless proven otherwise, right? But there's a reverse onus kind of principle in immigration law, where you are presupposed to not be a refugee, to be lying, unless proven otherwise.
And it puts an amazing amount of the burden on the applicant or the person who is claiming asylum, oftentimes without attorneys, without psychosocial support, and even with the system not really recognizing the nuances and complexities that motivate people to move, because it is very complicated. It's not as clear-cut as the law would like us to believe.
Right. No, I've definitely seen that among people that I personally know, where it's like, you left for this reason, but also for this other reason, and one of those reasons happens to be in the five categories, so that's really what you need to talk about. But it's not the full complexity of their reality, and I think that can be very difficult, to fit into these categories.
And I also found it interesting - and you've alluded to this just now - because this is not a criminal system, you don't have the rights of a criminal defendant, right?
Yeah, that's exactly right. And we see this across the entirety of the system, especially in instances of immigration detention, but also at the border where, as a refugee claimant, you don't have a right to an attorney like you would in the criminal justice system, or even other types of administrative and procedural rights that are present for you. Not to say that the criminal system is perfect either.
Sure, but there's something.
Absolutely. But again, I think because immigration is oftentimes also tied with issues of national security and politics, it becomes this kind of powder keg that loses the humanity in the mix, right? Again, we've lost the plot on why we even have certain legal protections in the first place if people can't actually meaningfully exercise their rights. This is a textbook example of the difference between rights on paper and rights in practice.
Definitely. On that point, can you talk a bit about pushbacks and what's actually happening on some of these borders? Because I will say I wasn't ignorant of the fact that these things happened, but really reading about it in your book was pretty shocking.
Yeah, these are some of the sharpest manifestations of border violence and the politicization of border enforcement that we've seen in recent history.
So pushbacks - or pullbacks, or forcible interdictions, as they're sometimes called - are practices where either a state party or a supranational entity, like the European Union's border force, Frontex, for example, is involved in essentially illegal operations to either prevent people from reaching a particular territory to seek asylum, or interdicting them - stopping them in the middle - and then pushing them back into other territory.
This is something that's been widely documented now along the fringes of Europe, whether on the Mediterranean Sea or Aegean Sea or even the land border, where different, again, state parties will intercept refugees and basically push them back into Turkey or Libya or other places.
And again, from a legal perspective, this is an absolute derogation of the established principles of asylum, of this norm that's called "non-refoulement", which is a fancy French term for not returning somebody to a place where they might be facing persecution - again, the backbone of international refugee law. And these practices are happening on a nearly daily basis, and people have lost their lives as a result.
Yeah, it was pretty horrifying as somebody who thought, "Oh, there are rules around this, right?" And then you read this and you realize, "Wow, there really are not." There are, but they're not being observed.
Yeah, that's exactly it. And it's depressing really, because those of us who are trying to shed some light on this, it's like, how much more are we supposed to do, right? There have been UN hearings, European hearings, U.S. hearings on this, journalists have covered it, books have been written on it, and it still keeps happening.
Yeah, I think "depressing" is a good word for it. Let's switch gears a little bit. Talk to me about some of the technology that you are seeing, whether it might be immigration detention camps around different borders. What's going on?
Yeah, so what helps me sometimes when I think about it is tracing the journey of somebody as if they were on the move. Because at every single point of a person's journey, there are now technologies that are impacting the way that people experience asylum and movement in general. So, before you even move, there're all sorts of things that you might be interacting with already, such as social media scraping and different data sets that are being put together on you based on your online activity.
What mosque you go to, who your friends are, who you associate with, things like this. And it creates this kind of global database, based on your behavior. Then as you're moving and you are getting close to a border or a humanitarian emergency, and you might be in a refugee camp, that's where people come across things like biometrics, like iris scanning, fingerprinting, other types of data that is collected from the body, and again, fed into these databases.
And that's also where we see a lot of the surveillance tech that the book covers - so, things like drones, cameras, even robodogs that were recently announced at the U.S.-Mexico border. That was a pretty recent announcement, I think February 2022, when the Department of Homeland Security rolled out this kind of new tool to be joining this arsenal of global border tech. Robodogs, same thing that you would see on a sci-fi show.
And then there's a whole host of other things too, like AI lie detectors that the European Union has been using, different visa triaging algorithms, carceral technology in immigration detention and many other projects. But really, it's this whole ambit of tech that really is pretty much present in every single point of a person's journey.
And it sounded to me like there's not a ton of outside oversight on any of this. Is that accurate?
Absolutely. That's one of the key issues here, that a lot of the times we don't even know what's happening, why certain projects are being rolled out, sometimes even if your case is being adjudicated by an algorithm or not.
This has happened in Canada, where people's visas were rejected based on an algorithm, and people didn't even know until a lawyer saw the reasons and was like, "Something's off here with the way that it was written", and then filed an access to information request - or freedom of information request, as you guys call it. And then all of a sudden it was like, "Whoa, there's algorithmic decision-making here."
So, oversight and transparency, very important to think about because so much of this happens behind closed doors.
Yeah, let's talk a little bit about that, because you discuss how AI, for example, might or might not be being used to make immigration decisions. I can only imagine that's going to expand in the future, given what we're doing now with AI in everything. Can you just talk a little bit more about what is going on, and also what might be potentially concerning about this?
Yeah. So, AI, or artificial intelligence, is the kind of latest manifestation of a lot of these technologies. So that includes things like algorithms, machine learning, basically using big data sets to make determinations about particular cases. And different jurisdictions are trying out importing elements of this into their immigration decision-making systems.
Again, Canada, for example, since 2018, when I first started working on these issues, has been using some sort of algorithmic decision-making to triage visa applications. But again, the issues around transparency and accountability are huge. And then there are the human rights impacts of this, right? We know that AI is oftentimes very biased, and we also don't know why certain decisions get made the way they do.
And it's really concerning that we don't really have a lot of governance and regulatory mechanisms to put some guardrails around this. It just seems to be full speed ahead, without really, again, bringing it back to what it's doing to real people.
Right. And I think anyone who's even played with some of these AI tools, on the one hand, it's "Oh, this is super interesting, and it's cool what it can do", and then it does something really stupid, and you're just like, "Huh, how did that happen?" And no one can really explain it, which I think is a little terrifying in the context of decisions about people's fundamental rights and things like that.
Yeah, totally. And states sometimes try and get around that by saying, "Don't worry, there's a human in the loop to check everything."
But there's a thing called automation bias, where we know that human decision-makers, when they're making determinations about complicated things, tend to give more credence or weight to a decision that was made by an algorithm, because again, it's this kind of idea of objectivity and somehow technology is able to get at the truth of the matter better than a human.
And so, even if you eventually do check it as a human officer, you might already be predisposed to think of an applicant or an asylum case or a visa triage case in a particular way, because an algorithm kind of predetermined you to think that way. And that's, again, super concerning, because we oftentimes don't know at what point algorithmic decision-making ends and human decision-making begins, or why it always happens in these really high-risk cases.
Right. I thought that was also an interesting point, that it seems almost like some of this stuff has been tried out here in an environment where people might not be paying quite as much attention to other people's privacy rights and things like that. Talk a little bit more about that, if you would.
Oh, totally. This is one of the biggest trends that I've noticed across all these different borders around the world that are these laboratories of technology, because they are this kind of frontier zone, where they're murky, they're opaque. People oftentimes don't know what's happening at the border. There's so much discretion already in the way that immigration decision-makers make decisions.
You can put one set of evidence in front of two officers, and they can make two completely different yet equally legally valid determinations, right? That's how the system works. And so if that's the discretion that's already there, and then you're playing around with this high-risk technology, it can have massive impacts.
But it's totally not an accident that it happens at the border and refugee camps and in immigration, because again, it's these fringe cases, oftentimes also weaponized against marginalized communities. But last thing I'll say there, if I may, is, it also doesn't stop at the border.
Even some of the most draconian uses of tech, like those robodogs I mentioned - first they were announced at the U.S.-Mexico border, and then lo and behold, a year later, the New York City Police Department announced that they want to be using robodogs on the streets of New York. So again, it bleeds over into other facets of public life.
Yeah, and I just saw something in the New York Times today about how consumer technology now is being used on battlefields, so it goes both directions. Talk to me a little bit about where all this technology is even coming from. Who's developing it? Where are people getting it?
Yeah, so there is a huge piece to this puzzle, and that's the private sector - the fact that companies are often the ones who get to say, "You know what you need in this very complicated, complex situation? More technology." And that is done through these kind of public-private partnerships that are the bread and butter, this kind of underpinning, of what some people have called the "border industrial complex". And we're talking about something that's been projected to be worth nearly 70 billion dollars.
It's a huge industry. And not only is that kind of disturbing from just the sheer amount of money that's being pumped into this, but from a normative perspective, it really becomes clear whose priorities take precedence and why. If you have companies who are developing tech like robodogs and AI lie detectors and making a lot of money out of it, they're not going to say, "We need more laws to regulate this and constrain it and stifle innovation", right?
It's not an accident that we're developing these technologies and not actually using AI for other purposes that could actually be beneficial, like auditing immigration decision-makers or rooting out racist border guards. That's a clear choice, because it benefits a very powerful set of actors in this space.
Yeah, and I think the whole tech ethos of, "move fast and break things" is really terrifying in these scenarios.
Yeah, absolutely. And it's so interesting, because it's almost like the technology space is like the sandbox, where you can test out ideas and play around and innovate, which in and of itself sounds like a really beautiful thing, but innovate on whose behalf and who is your test subject, right? And it's interesting because you see so many of these things now - AI for good, tech for good! But good for who, right?
And if we know that this technology is hurting people and we now have evidence from all these different contexts - not just at the border like I study, but in criminal justice, in welfare algorithmic decision-making - we know that this is exacerbating vast power differentials in society. The tech sector has to pay attention to this kind of manifestation, because it's not a neutral exercise.
Yeah. I was a Sociology major, a long time ago, and I remember reading Foucault, I think it was, on the Panopticon, and this idea of being watched at all times was supposed to be a negative scenario, but that's what it seems like we're moving toward at the border, and not only there.
For sure. And we've seen this kind of normalization of surveillance just become part and parcel of everyday life, and the incursion of technology into everything we do, right? Face ID, facial recognition becoming part of our daily lives if you have a smartphone, getting on a plane, applying for social services. Even sports stadiums now have facial recognition and biometric surveillance. It's really becoming part of everything.
Yeah, tell me a little about the origin of the name of the book, The Walls Have Eyes. Where did that come from?
So, that came from one of the spaces I've been working in, the Greek border with Turkey. And somebody said this in passing, but the surveillance element really gets into your head after a little while, right? And you start thinking that every single wall has eyes and somebody's always watching you. It is this kind of psychological phenomenon that, of course, the people who are at the sharpest edges of tech feel the most.
But even those of us who do the work in this space, you really do feel like even trying to shed light on some of these human rights abuses has really deep psychological impacts too. Always being watched.
Right. In some cases you had to take certain risks to tell the stories that are in this book, or to get the information, or to see the things that you were seeing. How does that reality impact your work?
Yeah, thanks for asking that. I feel really privileged to be able to do this kind of work, because it brings me really close to, again, the people who are affected by this, and also trying to understand it from a more kind of holistic perspective. I think that's where the anthropology hat comes in - spending a lot of time in these spaces and working from a relational perspective, working in a slower, deeper way that, again, centers human agency and storytelling as a methodology for the book.
But for sure, it's had psychological impact on me. I am quite open about the fact that I see a therapist regularly for some of the stresses that come with doing this work, and I actually see that as my professional responsibility to do so. I think it's important to be aware of the impacts that this work has.
My team and I have also routinely been at the sharp edges of state surveillance, whether through phones getting tapped and information being shared, or even just recently, one of the labs that houses some of the work that I do - the Refugee Law Lab - was the victim of a cyber attack just two weeks ago. Hopefully it was a bot. At first we thought maybe it was a targeted attack, but it seems like maybe it was just accidental.
But it really makes you think, because the work that we're trying to do sheds light on things that people don't want you to see sometimes. And it definitely makes me think twice about some of the security measures that I put in place as well, for my well-being and the well-being of my own family too.
Oh, that absolutely makes sense. Talk to me a little bit more about the aspect of the stories, because one of the things I really liked was this deep dive in the book on the stories of the people who are caught up in all of this. And in some cases, obviously, people are in very challenging circumstances. What was that like, to talk with them and get them to open up and to, I assume, allow their stories to be told here?
Yeah, I think that's what kind of goes back to the slower way of working, and that's where I think anthropologists and people who work from these slower methodologies have a bit of an advantage, because we are not journalists. We don't go in and try and gather a lot of data, and then just parachute out and leave.
The vast majority of the stories in this book are the result of relationships that have grown out of being in the same space at the same time, oftentimes over multiple visits, different kinds of partnerships through work, connecting people with others.
It gives you a little bit more of a level playing field, so to speak, rather than it being a researcher going in, extracting knowledge, taking it out, writing a book about it, which is of course not without its pitfalls either, because it needs to be queried, like who gets to speak on behalf of whom?
To me it's something that is uncomfortable and has to remain uncomfortable, the way we produce knowledge in this space, because there are vast power differentials that are inherent in the way that we think about these complicated issues.
One way we try and get around this is through this one project that we're now running, which is called the Migration and Technology Monitor, where we fund people on the move directly to do work on border surveillance from their own perspectives, from within the community, working in a very horizontal, participatory way at every element of the project, as a way to try and redistribute resources, and also make space and create opportunities for colleagues on the move to be the experts in this space. If I can become obsolete in this, great. I would be super happy.
Right, I think that's a great idea, and probably, honestly, more effective if it's people who are really there on the ground telling their own stories than somebody who comes in, like you said, just like, "Oh, cool. What's happening? Alright, bye."
Yeah, exactly. And you see a lot of that, whether through journalism or human rights reporting. And sometimes in certain situations, it's necessary to try and get that factual record right away. But I do actually think you're absolutely right, that the power of stories is the depth and the ability to meet each other on that human level. And to me, that also gives me hope in the way that we can think about these really complicated topics, that ultimately it's people talking to people, right?
Yeah. Before we wrap up, I am curious, what impact did COVID have on all of this?
Yeah, COVID was an important piece to the puzzle here, personally because the project grew to a more international focus right around that time when I noticed some really concerning trends in Europe and in Greece, where new technology was being introduced into refugee camps, and also COVID being used as a bit of a weapon to basically say, "We're going to keep people locked up in these camps", under the guise of COVID, so to speak, even though that was really not the case.
And so, I went to Greece thinking, "I'm going to be here for a few months", and then I stayed for two years, because it was one of the epicenters of where a lot of this was happening. But from a more surveillance perspective, COVID also normalized a lot of data collection, different apps, different information sharing that I think we're still trying to figure out the impact of.
And it is oftentimes these times of crises that normalize more and more technology and this kind of turn to techno solutionism that I think we're going to see trending upwards in the years to come.
Yeah, I think definitely just the whole idea of the surveillance state, for lack of a better word, is much more normalized now than probably before. But it feels like there are just a lot of different things colliding in a lot of different ways. And towards the end of the book, you offer some ideas for moving forward. Can you talk a bit about some of these - you call them resistance - and just, what do we do with all of this stuff?
Yeah, that was an important kind of point to end the book on, because otherwise it would be so bleak and depressing, like a lot of it is.
Definitely. It is one of those books you walk away from going, "Huh, okay. I don't know what to do with this."
It was interesting writing it, because I knew that it felt very heavy, but it also felt important to give space to that heaviness. But it is only a part of the story, because absolutely, people are resisting the technology, they're thinking in creative ways about how to use technology to upscale their communities, to get information into the hands of refugees and people on the move, to circumvent really problematic practices, to come together in solidarity.
That to me is a massive kind of way forward, seeing the way that people who either are in border communities or who work in search and rescue show up for each other in these kinds of ways. And that's actually something I'm starting to think about for a follow-up book, because there are so many stories of goodness and joy and contestation and resistance in this kind of messy reality as well, and I think we need to also make space for that.
Interesting. I would be curious to read that, the hopeful follow-up maybe. Any final thoughts that you'd like to share before we wrap up here?
Yeah, just thanks for the opportunity to reflect a bit on this, and I just always like to leave it at this point: We just can't lose sight of the fact that technology and its incursions into all these different spaces might seem really abstract or something that's difficult to understand, especially for people who are not technologists or trained in tech - like me.
But at the center of it all are people who are being harmed by this kind of normalization of tech, and we need to just not lose sight of that, but really look to the stories that are coming from the ground that are illuminating the reality of what's happening.
I agree. Remind us again how people can find out more about you and more about the book.
So, if you're curious to learn more about my work, I have a website, which is PetraMolnar.com. And that's where a lot of the other projects that I mentioned sit. Otherwise, the book is available for purchase wherever you buy books, hopefully a small bookshop, also Amazon and Target and others. And I'm also on Twitter or X or whatever it's called these days.
Who knows? Alright. Well, Petra, thank you so much for joining us. I really appreciate it.
Thanks so much, Alison.
My pleasure. If you enjoyed this episode of the Law School Toolbox podcast, please take a second to leave a review and rating on your favorite listening app. We would really appreciate it. And be sure to subscribe so you don't miss anything. If you have any questions or comments, please don't hesitate to reach out to Lee or Alison at lee@lawschooltoolbox.com or alison@lawschooltoolbox.com. Or you can always contact us via our website contact form at lawschooltoolbox.com.
Thanks for listening, and we'll talk soon!