Silenced, censored, resisting: Feminist struggles in the digital age with Asha Allen


Mar 03, 2025•34 min•Ep. 167

Episode description

📺🔊 Available on YouTube, Spotify, and Apple Podcasts. As International Women’s Day 2025 approaches, feminist movements face mounting threats from rising authoritarianism and Big Tech’s growing role in shaping public discourse. In this episode, Laeticia Thissen (Senior Policy Analyst on Gender Equality, FEPS) speaks with Asha Allen (Director and Secretary General, Center for Democracy & Technology Europe) about the increasing censorship of feminist and LGBTQ+ advocacy by platforms like Meta and X, the rollback of diversity and inclusion commitments by major tech companies, and the growing wave of online gender-based violence. They explore the role of the EU Digital Services Act, the Directive on Violence Against Women, and existing intersectionality methodologies for policy assessments, examining how these measures can effectively curb disinformation, algorithmic bias, and platform-driven suppression. With Trump 2.0 emboldening anti-gender movements and far-right forces gaining ground, what can activists and policymakers do to resist digital suppression and protect equality? Tune in for a crucial conversation on the fight for feminist digital justice.

Transcript

[MUSIC PLAYING] Welcome to this new edition of the FEPS Talks podcast. Ahead of International Women's Day, we are going to discuss the new threats to gender equality that are emerging in this rapidly changing world. My name is Laeticia Thissen. I'm Senior FEPS Policy Analyst on Gender Equality, and I'm delighted to be your host today. In order to help us navigate these very challenging times for equality and human rights advocates, I'm delighted to be joined today by Asha Allen.

Let me extend my very warm welcome to you. Thank you so much. It's wonderful to be here. Thank you for the invitation. So you are the Director and Secretary-General of the Center for Democracy and Technology Europe, CDT. And you are part of the Online Expression and Civic Space team. In this capacity, you are a renowned expert in digital rights, civic space, and online gender-based violence.

So you've been at the forefront of policy advocacy efforts with the aim of making digital platforms more accountable, inclusive, and rights-respecting. Would you like to add anything to that before we start? No, that was amazing. Thank you so much for covering all of that. I've been working on these issues for many, many years now. And CDT, for those of you who don't know us, CDT International has been going for 30 years, so right at the very inception of the internet.

And CDT Europe, we've been fully fledged now as an NGO since 2019. And so we've been at the forefront, particularly in the last mandate, working as a fundamental rights organization, looking at tech policy. And we work on a variety of issues, online expression, as you mentioned, but issues of equity and data, security and surveillance. So it's really been a busy period for us, as you can imagine.

And so moving to the first part of the conversation, analyzing and discussing a little bit the shrinking democratic spaces and risk to equality, we for sure are all aware that we are finding ourselves at a very critical moment. We've seen the return of Trump to the White House, the consolidation of far-right movements in Europe, but also beyond. And this has also come with a significant erosion of democratic norms.

All of this is amplifying the attacks on reproductive rights, feminist movements, LGBTQ rights, and equality at large. And meanwhile, this has also emboldened and enabled major tech companies like Meta, X, Google, and so on to retreat from their commitments to diversity, equity, and inclusion policies, which is then eventually limiting access to information and further entrenching discrimination.

So what are, in your view, the biggest risks that big tech poses to equality and civic participation? No, and it's really an important moment for us to take a step back and look at what's happening. I think there's genuine, real concern about the geopolitical situation we find ourselves in right now. And we've been raising the alarm. Civil society at large has been raising the alarm on the kind of backsliding when it comes to equality.

And also what's happening in terms of a shrinking civic space, but also a more hostile space for civil society organizations. Interestingly, thinking in this context of International Women's Day week, the European Institute for Gender Equality has been doing their fantastic work, of course, publishing their Gender Equality Index every year. And we've slowly been seeing over the last decade how progress towards equality, true gender equity, let's say, has either been slowing or come to a stop.

And I think we have to look at it in that broader context of we are now kind of at a culmination point, let's say, in where fears around kind of the rise of these movements and where we are right now are really real. So there's a real sense of concern, and I fully agree with you. The risks that we see to civic participation, rights of access to information, free expression in the online spaces, they've been well documented for a long time.

We're almost 10 years out from the Cambridge Analytica scandal. We've been looking at this for a really long time. Researchers have been doing amazing work documenting the real risks that these platform services and their products actually pose to society at large. And in the last EU mandate, we spent a lot of time assessing, debating, and finally regulating this space to try and address that.

And we have key legislation like the Digital Services Act, for example, but there are also things like the AI Act and other pieces. But in this particular context, we have regulation in place to try and address this. It tries to address illegal content, but at the same time, it tries to mandate due diligence obligations for these very large online platforms, recognizing the impact that their products and services have on society at large.

And we can't forget that even prior to these EU legislations, there was a requirement for these platforms to respect our human rights under the international business and human rights framework. So nothing that we're asking of these platforms is new. They have an obligation to respect their users' rights and make sure they're not doing harm to society.

So to come to kind of the second part of your question, we have a real concern that there's a rolling back of this commitment being emboldened by the current geopolitical situation. You've mentioned some of the different policies that we're seeing specifically targeted to LGBTQI communities, to migrant communities, and maybe we can get into that in a bit more detail a little later.

But we are concerned that even though these frameworks are in place, and there's in Europe, let's say, but looking at it in an international perspective, that there's this walking back of commitments to ensure the respect for people's rights online, and that there won't be any accountability mechanisms in place to stop that from happening.

So there is a genuine concern, and I think many organizations, not just fundamental rights organizations or digital rights organizations, but those of us who work on a wide range of issues are very concerned about where some of these policies may be leading.

And precisely, going a little bit deeper on this very last point when it comes to rolling back on certain commitments, we've seen, for instance, Meta and X censoring posts on reproductive rights and LGBTQI advocacy. Meta has also abandoned its third-party fact-checking program. So how does this sort of suppression affect feminist movements, and what can be done to challenge that? Yeah, absolutely.

And just to say at the very top, it's always these movements, the feminist movements, the LGBTQI movements, these movements that are intersectional in nature that are first impacted by these policy changes. Just the sheer lack of consideration of these communities and how these policies could affect them always results in or usually results in, let's say, an impact on their communities first.

We've seen this with steps that, say, Meta's different platforms have taken, Instagram and others, where you saw Black feminist movements' posts being censored, among other things. So I just want to say that from the top. But very specifically, on the new hateful conduct policy published by Meta recently: my amazing colleagues in the CDT US office worked together with many of their partners to really push back hard on this.

And there are a couple of factors to take into consideration here. The policy itself was not run past their civil rights advisory body, a body that's there precisely to be consulted when these policies change, on what the specific impact on different groups could be. And so that advisory body, established by Meta itself, wasn't even consulted in this context.

Alongside this, the policy itself is of deep concern, particularly for the LGBTQI community, because it seems very specifically targeted at allowing certain forms of speech towards that community in particular. The policy would seem to indicate that it would be okay to discuss not allowing LGBTQI or migrant communities in public spaces. And that's a dangerous road to go down, talking about not allowing certain communities into different spaces.

But it also has an impact on communities of people with disabilities, for example, and allowing this kind of policy to manifest in this way will inevitably chill their speech. It means people of those communities will self-censor; they will not feel safe on these particular platforms. And that is the opposite of, you know, fostering free expression, as the policy itself proclaims to be doing.

And that has happened in the US context, but I don't want to separate it from the developments that are happening here in the EU and more globally as well. The policies on fact-checking and the hateful conduct policy have not been applied here in the EU. We have the Digital Services Act, and again, we can get into that in more detail. But that doesn't mean we aren't concerned about how platforms are complying with the Digital Services Act.

We're still in very early days, and there are a number of investigations and certain processes going on. But one example I can think of is that the very large online platforms would have had to complete their first round of risk assessments at the end of last year. And from what we've been able to analyze quite early on, they are unsubstantiated.

We're not very enthused, let's say; or let's say there's not a lot of information in there that lets us understand that they have conducted full risk assessments from a human rights perspective, and that the measures that they've put in place or removed will be adequate to address the systemic risks that they're supposed to address. There has been virtually no consultation with civil society in these risk assessments.

They've been very kind of tacit, and the conversation with impacted communities and consistent engagement with civil society just wasn't there in this first round of risk assessments. So this doesn't fill us with a sense of hope and faith that this is going to continue in a really meaningful way that leads to real transparency and real accountability.

So I think for civil society organizations who've been working on this for a long time, we're working very closely and collaboratively, and the pressure is on to ensure that these regulations are followed quite closely and lead to meaningful transparency and accountability, because our concern, of course, is for these communities, and for them to be able to enjoy their rights to free expression as well. - Absolutely.

And what you highlight here also says a lot about inclusivity, and ultimately the complete absence of consultation with the very people concerned by those platforms. And so this also perhaps relates back to other recent initiatives, where we saw Google, for instance, rolling back on DEI policies and initiatives.

And it's also not insignificant, for instance, that they scrapped important elements of their internal calendars, like Black History Month and Pride Month, further invisibilizing these communities, and similarly limited their diversity-based hiring targets and so on. So in your view, what does this say about corporate accountability, and what does this signal for the broader fight for equality?

- Yeah, and I think in terms of what it signals, it's part of this broader movement we're seeing towards pushing back on inclusivity, and on making sure that the rights and concerns of these groups are at the forefront of our policy discussions, as they always should be, because they're the most acutely impacted by many of these aspects.

And we're seeing very much of the same in the discourse politically, right? In terms of some of the conversations that are being had in regards to the prioritization of other aspects of policy over thinking about fundamental rights as a key focus.

So I think, more broadly speaking, it's very difficult for us to feel that we can advance progress in an environment that's becoming increasingly hostile, not only from that lack-of-corporate-accountability perspective, but also from the political perspective. There doesn't seem to be as much enthusiasm, let's say, in this mandate so far from our decision makers to keep fundamental rights at the forefront; there seem to be a lot of other priorities,

and we're concerned that those rights will be deprioritized in that context. And so this lack of accountability will be allowed to slide, let's say. But it does speak to a wider concern. And so I think it's what civil society does now in terms of how we work together, how we come together to kind of push back on this, working more collaboratively in non-traditional ways, let's say, and learning from those who have been pushing this fight for a very long time.

- Well, this also comes back to the pressing issue of accountability. On previous occasions you have very much emphasized the role of the media, which also plays an essential role in achieving gender justice, for instance. How can journalists and media organizations better report on and counteract online disinformation and harassment against women, for instance? - Absolutely.

And it's so important because I think that entire sector is experiencing such a major pushback on just the fundamental job that they do as civic space actors and kind of bringing the information and the stories and the truth to the wider public to foster kind of healthy civic debates.

And in particular, in this context, I've always been a proponent of this idea that when we talk about media pluralism and what that means, and a diversity of media, we have to think about those who are bringing that to the forefront. And women journalists are on the receiving end of a disproportionate amount of online gender-based violence; politically engaged women in the online space are disproportionately impacted.

And CDT has done some amazing research in this area in particular. But on women journalists, the International Center for Journalists has done some amazing research, led by some incredible researchers, looking at how this has impacted women, how they want to get into the media sphere, and what that means for them to be able to do their job.

And imagine that the media sphere has shifted to the online space, and the online space is not safe for women journalists to engage in. What does that mean when you come back full circle, in terms of media pluralism and participation and diversity?

So you have to look at it in that holistic context: if women are disproportionately experiencing harassment and online gender-based violence, and they're removing themselves, then we're moving towards a weaker media environment from a diversity perspective. And then you look at it from that context.

But very specifically, I think if you look at it more broadly also, Plan International has also done some fantastic research over the years looking into how young women and girls are looking at this environment. And very sadly, we've been able to see that more young women and girls are now saying that they don't want to engage in, say, online debates.

They wouldn't go for careers that are more politically engaged, journalism, things like this, because they just understand the sheer level of harassment that they may be subject to. And that is further adding to this pushback, or drop-off, in moving towards real, true gender equity, when you have a younger generation saying, "I'm just not going to participate, because it's not safe for me to participate."

So it has what I call this generational impact in terms of what that means. So it's really concerning, as you also see platforms moving towards removing fact-checking, which a lot of journalist organizations would have done. Just to say at the forefront: fact-checking is not censorship. It has never been censorship, and even Meta's own policies around it show that it didn't foster censorship.

But it seems like the hostility that journalists are facing, women journalists in particular from the online gender-based violence, from the diminishing of their work to lack of access. Again, you have to put it in that context of what's happening as we move towards gender equality and how far are we actually rolling back.

- And perhaps beyond the serious safety issue this poses for women in the public space, you've also multiple times underlined the democratic-participation dimension of the problem. So how does the silencing of women and other marginalized groups erode democratic participation? - Yeah, and I think it's about contextualizing democratic participation in where we are now, right? In terms of what that means in the modern context.

A lot of our democratic debate is happening in the online space, whether this is on social media platforms, or other different platforms, a lot of this is happening in that context.

And so if you imagine, therefore, that it's not a safe place for people from minoritized communities, and broader communities as well, to feel engaged, then that means they're not able to enjoy not only their right to freedom of expression, but also their right of access to information, which is also exercised in the online space, and their democratic participation as well.

So I think often the conversation, especially around online gender-based violence, for example, can get quite focused on what the specific conversation was and what the specific form was. But when you look at it in that broader, more holistic context, you understand it's really having an impact on how women and minoritized communities are able to engage in our society more broadly: where the issues that concern them sit in our discussions,

how they're being politicized, how they're being discussed, and what does democratic participation mean in this more modern context, where people are having to self-censor, they're being silenced, they're having their speech chilled, and they're not able to participate freely. Who has access? That's really the question that it leads us to. - And in that sense, how can we actually strengthen protection for these groups? - Yeah, absolutely.

I think there are things in place, and I think especially those of us who work in the policy space try to look to the regulatory framework, such as the Digital Services Act that we've mentioned before, which put in place specific due diligence obligations for the very largest online platforms operating here. And it's not only that you have an obligation to deal with illegal content on your platforms; you have an obligation to assess societal risks.

There's specific requirements to assess online gender-based violence for the very large online platforms, and you have to mitigate those risks. There is a real recognition in the law that these risks have a huge impact on society, and if you're going to operate here, you have to be able to address those with the rights of users at the forefront of mind.

And we're still in early days, as I've mentioned, in terms of these laws coming into force, but we really want to make sure that they are enforced in a human-rights-respecting way, that rights are at the forefront of enforcement. So, thinking about how regulators approach this, but also so that these obligations are not just empty tick-box exercises for the platforms. We do not want to see a race to the bottom when it comes to compliance. We want to see continuous improvement.

We want to see meaningful engagement with civil society organizations and organizations who represent affected groups to genuinely try and address some of the issues that we see on these platforms. Otherwise, they will continue to proliferate, and the level of safety will continue to diminish.

So, we're hoping to use those mechanisms that we have in place to push for increased transparency, to get behind the curtain, as it were, and understand what's really happening with these systems and with these particular products, and to be able to then make recommendations, to be able to say, "This is how you have to address these issues more comprehensively," but understanding that as these systems evolve, the approach to addressing them will need to evolve as well.

So, it will be an iterative process for all stakeholders involved, but there has to be clear, meaningful engagement. Otherwise, again, all the work that we put into making sure that these frameworks were in place will sadly be in vain.

- And also, perhaps looking at the more general picture, coming back to what we mentioned earlier with the consequences of the return of Donald Trump to the White House, which is further emboldening anti-gender and anti-democratic movements globally, with massive consequences when it comes to equality policies. So we have seen here a big rollback in terms of the regulation of big tech.

So, how might we see a shift in platform accountability measures, and what should Europeans do in order to mitigate these effects? - Absolutely. And this ties into the frameworks that were put in place in the last mandate. A lot of the time, especially in the European context, people refer to the Digital Services Act and the other frameworks in place as the standards that were put in place. They are very clear, first-of-their-kind, transnational regulations.

That's not to say that they're perfect. We haven't seen how they fully operate yet, but there is at least a standard there that we should be aiming for as a baseline to be able to continue improving upon, let's say.

So, in the face of this pushback that we're seeing, the rolling back of accountability that would allow the proliferation of the key issues that we've been talking about, I think the European bodies, and those of us working in this context, need to stand firm on the rules that we have put in place, because the rules are there, they are in place, and the platforms will need to respect them.

And there also need to be more global efforts to keep pushing. We cannot allow the kind of narrative that we're seeing right now, that there doesn't need to be as much accountability, and we cannot allow a very skewed perception of what is meant by facilitating free expression to continue to proliferate.

So, the main message here is just to stand firm in terms of the frameworks that we have in place, to keep bringing civil society in as an equitable partner, and not at the last moment for last-minute consultation, but to keep bringing civil society in, who have been working on these issues for actually decades at this point, and have the expertise that can really inform some of the questions that regulators or other bodies may just be first addressing,

or have less familiarity with, let's say, in terms of the very specific issues that we've been researching for a very long time. So that's really taking a multi-stakeholder approach to addressing those specific issues, but also really thinking about what is the standard that we're trying to maintain here, and not allowing that rolling-back influence to have its intended effect of making compliance as weak as possible. We have to stay firm. Mm-hmm.

And perhaps also in connection to that, but moving on to the last part of our conversation, and more particularly focusing on the intersectional impact of online gender-based violence, CDT has also highlighted that online suppression does not necessarily impact all communities alike. Feminist, anti-racism, and LGBTQI movements face multiple layers of discrimination online. So, what are the most pressing intersectional impacts of these shifts?

Yeah. And it's difficult to kind of pinpoint a very specific thing, because as you mentioned, those groups are inherently experiencing intersectional levels of discrimination. And so, if you're looking at, say, content moderation policies and kind of policies that are put in place, it's like the example we talked about previously, where the hateful conduct policy that Meta's put in place seems very targeted.

You know, the changes that they made seemed very targeted towards the LGBTQI community, but it's having an impact on migrant communities, and also on communities of people with disabilities. There's just a lack of understanding of how that policy is actually going to affect an array of different communities. And so, I think the way to address that is precisely to use an intersectional methodology for the assessment itself, right?

Understanding that we're not looking at a group in isolation, because those groups don't live their lives in isolation, right? You and I could talk at length about what it means to be a woman, or a woman of color, or an LGBTQI woman. It's very multifaceted, but often you see policies that are not even conceptualized in that way.

So, we've been pushing for this concept of using an intersectional methodology when assessing how policies have their impact, because our research has shown, particularly in the U.S., where the research was conducted, that it was women of color who experienced disproportionately more online harassment than their white peers and counterparts, despite the fact that they, as women, also experienced a high level of online gender-based violence.

So, it's about the methodology you use to do the policy development, but also to do the policy assessment. It's a cyclical process; one should always feed into the other. And so, again, it's useful not to think about these issues in isolation, so that the conversations don't then happen in isolation either.

It really should be thought about in that way: who are the communities that might be most affected, how might it affect different communities, and who is not in the room helping us understand how it might affect their communities as well? So, when thinking about intersectional impacts, I always go back to: what was the intersectional methodology used to think about the policies that were put in place, or even to assess them?

If there wasn't one in the first place, you've got your answer as to why these things are happening. But if you can start doing that and bringing it in from inception, then you're able to think about how policy impacts really and truly manifest. So, in that sense, what are the policy measures that should be prioritized in order to address these intersectional challenges? Yeah, absolutely. So, it's good just to use the risk assessments under the DSA as an example.

The law is quite prescriptive in that it says you should talk to affected communities, test mitigation measures on affected communities. And there's been some amazing work done by civil society organizations to say, "Here's what a human rights impact assessment using an intersectional methodology looks like." That in itself is fantastic. That's a great starting point. And these very large online platforms have the resources to engage that expertise to do those assessments properly.

That is a first policy measure. You are not going to do a risk assessment unless you're using a methodology that allows you to assess how this is going to impact the different communities that use your platforms.

And thinking about it from the context of also how important the online communities are to these communities who have been historically marginalized and historically discriminated against, the online space has been fantastic for curating, say, Black feminist movements and different movements. It's been a wonderful place for that.

And so if you understand that your policies are going to negatively impact communities who really could use this space, who have fostered solidarity and community and even legislative change in these online spaces, as we've seen with MeToo, for example, then it's fundamental that you bring in that expertise to do those risk assessments from the beginning, and understand whether the measures you're putting in place to address things are useful.

Just to give one more concrete example: when we were talking about Black feminist movements and the discussions that they were having on Instagram, some policies that were put in place just used keywords, saying if this word is used, then this post can come down, without understanding how those words would be used in the context of Black feminist discussion, without thinking about things in that intersectional way. It meant that their content was being taken down,

and they had to go through a whole process to understand why that policy was detrimental to those specific communities. So as a kind of accountability measure, in content moderation and the due diligence obligations, I think using an intersectional methodology to assess the risks is a very clear policy that can be put in place, thinking about the very large online platforms.

But as the enforcement bodies and the regulatory bodies assess what the platforms have assessed, they too can apply an intersectional methodology and think to themselves: okay, we're looking at a societal, systemic risk; who in society might this impact? Not just thinking about it widely, but who might this have a negative impact on? And again, they can look to research that's been done, and they can also do extensive consultation with civil society.

We're calling for the European Commission and the regulators to consult with us on a more regular and systematic basis, to be able to feed into their enforcement processes. But it's a very similar thing: making sure that you've got the expertise to inform you, and making sure that you're saying, okay, yes, the platforms are doing their job when it comes to implementation and compliance.

So there are a few different policies. We have the Digital Services Act that I've mentioned. We also have the Directive on Violence Against Women, which also touches upon different forms of online cyber violence. And so there's also a role there for member states, in terms of how they transpose that and what that means in their national context, to address this as well.

As we round off this conversation, and in light of the different elements that you have just pointed out, what do you think will be the main challenges ahead? But also, to finish on a positive note, what keeps you enthusiastic and optimistic for the years to come? Yeah, I like to try and stay optimistic at least 60% of the week. It helps us when we're doing our work, especially in civil society.

So just to address those challenges first, I think it's going to be political will and motivation to stay the course, to ensure that the rules and regulations that we've put in place are well adhered to. They are respected and compliance is to the top level that we can possibly push for, especially with things like the Digital Services Act, but things like the directive as well that will be transposed in two years from now.

And I think the geopolitical context, which has been the overall frame for our conversation today, is a challenge too. What we don't want to see is that kind of political discourse facilitating or allowing compliance to be weakened, or rolled back, so that we end up taking steps back from where we've recently come.

In terms of staying motivated, I think we have to also celebrate what we have been able to achieve, particularly in the last mandate. I've been working on these policy issues for a really long time, and seeing the directive come through after 20 years of advocacy from women's rights organizations, and finally seeing the Digital Services Act finalized, with key due diligence obligations in there. We can celebrate the fact that these frameworks are in place.

Everything always has space to be improved upon. Everything always has space to be monitored and watched to make sure that there's human rights-respecting enforcement and implementation. But we have these frameworks in place, and we can use them to the best of our ability as civil society, as activists, as advocates, to ensure that we are really protecting the communities that we've been advocating for this whole time. So progress has been made.

That's kind of the message we've been talking about. Progress has been made. Let's keep moving forward. The geopolitical situation is difficult right now, but we can keep moving forward, even in that context. And so I think we can do that through solidarity, through collaboration, through care. Also, we should be caring for each other in this space because it's difficult.

What we do is difficult, and I think caring for each other, working together and fostering each other's different movements, knowing that we are not separate movements, that we are all pushing for the same thing, an improvement to our society. I think thinking about that over your morning coffee is going to help. Thank you very much. I think that's precisely the type of enthusiastic thought that we need in the present time. So Asha, thank you so much.

This has been such a critical and inspiring conversation. So thanks a lot for sharing your insights. Thank you so much. It was wonderful to speak with you. So as we mark International Women's Day, it's evident that we are at a critical and pivotal moment. The threats, as we have seen in today's conversation, to gender equality, democracy and civic space are intensifying. But we also have the power to resist and to fight back.

Applying feminist principles, as our guest today has just highlighted. And with this, I would like to finish this podcast with a quote from Audre Lorde, who defined herself as a Black lesbian feminist, mother and poet, and who famously said: "In our world, divide and conquer must become define and empower."

And so I think that very much reflects the need to recenter ourselves on feminist principles, like caring, collaboration and solidarity, in order to make this world a better world. So thank you very much for tuning in to FEPS Talks. If you liked it, please share it, subscribe to our podcast, and let's keep fighting for a feminist future. Thank you very much. [Music]

Transcript source: Provided by creator in RSS feed.