Welcome to the Making Sense podcast. This is Sam Harris. Okay, just a little housekeeping here. Over at Waking Up, we just introduced playlists, which has been our most requested feature. Took a while to do that, but it seems like a very auspicious change. You can create your own retreats. You can create playlists for any purpose. Despite the name of the app, there's a lot of content there that is very good for sleep. So you could create a sleep playlist.
Many of us fall asleep to audio these days. So thanks to the team over at Waking Up for producing that feature, among many others; the app is continually improving. And what else? If you haven't seen Coleman Hughes on The View promoting his book, that is worth finding on YouTube. Coleman was recently on the podcast discussing the book, The End of Race Politics.
He went on The View to do that. As you might expect, he was bombarded with a level of moral and political confusion that is genuinely hard to deal with in a confined space when one is short on time. And I have to say, he really did a perfect job. I mean, it was absolutely masterful. So it's worth watching in case there was any doubt in your mind about Coleman's talents. If ever there were a commercial for the equanimity that can be achieved through
mindfulness, that was it. So bravo, Coleman. In the last housekeeping, I acknowledged the death of Danny Kahneman. I went back and listened to my podcast with him, recorded at that event at the Beacon Theatre in New York about five years ago. I was pleasantly surprised. It's often the case that live events don't translate into the best podcasts, but I really thought this was a great conversation, and Danny was really worth listening to there. So that was episode
150, if you want to revisit it. I really enjoyed hearing it again. Okay. Today I'm talking to Will MacAskill. Will is an associate professor in philosophy and a research fellow at the Global Priorities Institute at Oxford University. He is one of the primary voices in a philanthropic movement known as Effective Altruism and the co-founder of three nonprofits based on EA principles: Giving What We Can, 80,000 Hours, and the Centre for Effective
Altruism. He is also the author of several books, including Doing Good Better: Effective Altruism and a Radical New Way to Make a Difference, and most recently What We Owe the Future. However, today we don't talk much philosophy. Rather, we do a post-mortem on the career of Sam Bankman-Fried and the implosion of FTX, and look at the effect that it's had on
the Effective Altruism movement. When we recorded last week, Sam had not yet been sentenced, but he has been since: he was sentenced to 25 years in prison, which is not as much as he could have gotten, but certainly more than the minimum. I must say that strikes me as too long a sentence. You'll hear Will and me struggle to form a theory of mind of Sam
in this podcast. We discuss the possibilities at some length. But when you look at some of the people who don't get 25 years in prison for the malicious things they do, I don't know, it does not strike me as a fair sentence. Perhaps I'll talk about that more some other
time. And then Will and I talk about the effect that this fiasco has had on Effective Altruism, the character of the EA community, and potential problems with long-termism, with a brief sidebar discussion on AI risk. We discuss the effects of the FTX collapse on Will personally, and other topics. There's no paywall for this one, as I thought everyone should
hear what Will has to say on this topic. As you'll hear, despite the size of the crater that Sam Bankman-Fried left on this landscape, I consider the principles of Effective Altruism untouched. And while I've always considered myself a peripheral member of the community, you'll hear me discuss the misgivings I have with it; I've discussed them on previous podcasts as well. I just think the backlash against EA is thoroughly
wrong-headed, and Will and I talk about that. As always, if you want to support the podcast, there's one way to do that: you can subscribe at samharris.org. And if you can't afford a subscription, you can request one for free. Occasionally, I hear rumors that someone has requested a free subscription and didn't get one. That should never happen. If that happens to you, something has gone wrong; check your spam folder, and just request again.
We don't decline any of those requests. And now I bring you Will MacAskill. I am back here with Will MacAskill. Will, thanks for joining me again. Thanks for having me on. So we have a lot to talk about. I've been wanting to do a post-mortem with you on the Sam Bankman-Fried FTX catastrophe. I don't think that's putting it too strongly,
at least in EA circles. So we're going to talk about what happened there, your perception of it, and what it has done to the optics around effective altruism, and perhaps effective altruism itself. Where should we start here? I mean, perhaps you could summarize what Sam Bankman-Fried's position was in the EA community before the wheels came so fully off. Where did you meet him? And, you know, he certainly seemed like a promising young man who was
going to do great things. I mean, perhaps we should take it from the top. Sure, I'm happy to. And yeah, from my perspective, he did seem like a promising young man, even though that's very much not how it turned out. So I first met Sam all the way back in 2012. I was giving talks for a new organization I'd set up, 80,000 Hours, which was about how you can do good with your career. And I was going around college campuses
speaking about this. And I can't remember who, but someone put me and Sam in touch. I think he had been quite active on a forum for people who were interested in utilitarian philosophy, and ideas like earning to give had been discussed on that forum. And as a result, we met up for lunch and he came to my talk. He was interested in a number of different
career paths at the time; politics was one, and earning to give was another. Perhaps we should remind people what earning to give means, because it's really the proper framing for everything Sam was up to. Sure. So earning to give was the idea that rather than, say, directly working for a charity, you could instead deliberately take a career that was higher paying, something you were perhaps particularly good at, in order to donate a significant fraction of your earnings,
where, depending on how much you made, that might be 50% or more. And the core idea was that, well, you could, say, become a doctor in the developing world and do a huge amount of good by doing that. Or you could earn more and donate enough to pay for many doctors working on the same cause, and thereby perhaps do even more good again. And this was one of the things that I was talking about at the time. And he found the ideas compelling.
You know, we discussed it back and forth at the time. I next met him something like six months later at a vegan conference. We hadn't been much in touch in that period, but he told me that he'd gotten an internship at Jane Street, which is this quantitative trading firm. And, you know, that was very impressive. I thought he seemed just this very autonomous, very morally motivated person. Animal welfare was his main focus
at the time. He said later he'd also asked some animal welfare organizations: would they rather have his time, that he work for them, or would they rather he go make money in order to donate it to them? And they said, we'd rather have the money. And so he went and did that, at Jane Street, but then subsequently left and set up a trading firm called Alameda
Research, which was a cryptocurrency trading firm, and then a couple of years later, in 2019, an exchange called FTX, as in a platform where others could trade cryptocurrency. Both seemed to be incredibly successful. So by the end of 2021, he was worth tens of billions of dollars; the company FTX was worth 40 billion dollars. And he seemed to be living up to his claims. He was saying he was going to donate essentially
everything he earned, 99% of his wealth, something like that. And through the course of 2022, he had actually started making those donations; he had donated well north of 100 million dollars. But then, as it turned out, in November, the company was not all that it seemed. There was, you know, what you could call a run on the bank, except it wasn't a
bank. So there was a loss of confidence in FTX, and a lot of people started withdrawing their money. But the money that customers had deposited on the exchange, that should have been there, was not there. And that should not have been possible. It must have been the case, therefore, that Sam and the others leading FTX had misappropriated that money in some way, all the while saying that the assets were perfectly safe, that they were not invested.
That led to the complete collapse of FTX. Three other people who were high up at FTX or Alameda, Caroline Ellison, Gary Wang, and Nishad Singh, all pleaded guilty to fraud a couple of months after the collapse. Sam did not plead guilty, but there was a trial at the end of last year, and he was found guilty.
And what is his current state? He's in jail; is he awaiting an appeal? Do you have up-to-the-minute information on his progress through the criminal justice system? Yes. So he's in jail, and he's awaiting sentencing, which will happen next week, I think. So I guess one thing we should talk about is some theory of mind about Sam: what his
intentions actually were, insofar as we can guess about them. There are really two alternative pictures here, which give a very different ethical sense of him as a person, and of the situation so many people were in, in giving him their trust. Perhaps we can just jump there. Do you think this was a conscious fraud? I mean, perhaps there are other variants
of this, but I'll give you the two that come to mind for me. Either this was a conscious fraud, where he was quite cynically using the concepts of effective altruism but his heart was never really in that place, and he was just trying to get fantastically wealthy and famous, misappropriating people's funds to that end, and it all blew up because of
just bad luck on some level. So he was kind of a Bernie Madoff-style character running something like a Ponzi scheme, or some unethical variant of misappropriating people's funds. Or, alternately, and I think quite differently: he was somebody who, based on what he believed
about the actual ethics of the situation and probability theory, was taking risks that he shouldn't have taken, obviously, in the end, given the outcome, but that may well have paid off. And he was taking these risks because he wanted to do the maximum amount
of good in the world with as much of the resources available as he could get his hands on, and he was just placing, in the end, some silly bets, you know, bets that he was allowed to place in a totally unregulated space, and it catastrophically failed. But it was by no means guaranteed to fail, and he was on some level a good guy who was ruled by some bad, or at least unrealistic, expectations of just how many times you can play a game
of roulette and win. Perhaps there's some middle position between those two cases, but what's your sense of his actual intentions throughout this whole time? Yeah, so this is something that I've now spent many months over the last year and a half
really trying to understand. I didn't know about the fraud, or have suspicions about the fraud, at the time, so my understanding of things here is really me trying to piece together the story on the basis of all that's come out as a result of the trial and media coverage
over the last year and a half. One thing I'll say before we talk about this: it's very easy, once you start trying to inhabit someone's mental state, to start saying things where it sounds like you're defending the person or something. And so, yeah, I just want to be clear on just how bad and how harmful what happened was. So a million
people lost money. The scale of this is just unbelievable. And actually, recently the prosecution released Twitter messages that Sam had received during the collapse, and they're really heartbreaking to read. Like, one is from a Ukrainian man who had fled Ukraine and had put his money on FTX in order to get it out of the country; another person, who was going to be made homeless, had four children he needed to feed. And so on that point, Will, let's just linger for a second. Yeah, I had heard that a lot of
the money was getting recovered. Do you know where that process is and how much has been recovered? Yeah, so as it turns out, all customers will receive all of the money they put on the exchange, as measured in terms of the value of what they put on the exchange in November 2022. So the value as of that date, as opposed to the amount they initially put
in at whatever date. Yes, and also as opposed to the value today. So often people were, say, putting Bitcoin on the exchange, and Bitcoin is now worth more than it was then. The standard narrative of this has been that it happened because crypto has risen and a particular investment that was made has done well. My best understanding is that
actually that's not accurate. That has helped, but even putting that to the side, even as of September last year, when there had not been a crypto rise, customers would have been made whole. So the issue was not that money was taken and then just lost, in the sense of spent, or lost on bad trades or something. Instead, the money was illegally taken and invested into assets that couldn't be liquidated quickly.
So already we're, I think, a far distance from someone like Bernie Madoff, right? Whatever the actual origins of his behavior, whether he was ever a legitimate investor, for the longest time he was making only sham investments and just lying to everyone in sight and running
a proper Ponzi scheme. Yeah, Bernie Madoff was committing the fraud for about eight years, and any time a client of his wanted to withdraw money, he would raise more money in order to give it back. Right. To keep up the fiction, returning, you know, what should have been there to the customers,
which is the prototypical Ponzi scheme. A Ponzi scheme, yeah. With Alameda and FTX, this is one of the things that's so bizarre about the whole story, and even tragic: the companies themselves were making money, in fact, large amounts of money.
So in that sense, they were not Ponzi schemes. But the customer assets held on FTX, which should have been there, should have been, you know, bank-vaulted, separate from everything, got used by Alameda Research, the trading firm, in a way that should not have been possible, should not have even been possible. Right. And so you were asking how to interpret the story, and you gave two interpretations. One was that effective
altruism was a sham, that his commitment to it was a sham; he was just in it for his own power, his own greed. The second was that it was some carefully calculated bet that, you know, may have been illegal, and had good intentions, or perhaps twisted intentions, but didn't pay off. My personal take is that it was neither of those things. And obviously I'll caveat: I've followed this a lot, because I've really tried to dissolve
the confusion in my mind about what happened. But I'm not an expert in this; it's extremely complicated. But I think there are a few pieces of evidence for thinking that this just wasn't a rational or calculated decision. No matter what utility function Sam and the others were following, it did not make sense as an action. And one piece of evidence is actually just learning more about other white-collar crimes. So Bernie Madoff being one
example, but the Enron scandal and many others too. So there's this Harvard Business School professor, Eugene Soltes, who's written this really excellent book called Why They Do It, about white-collar crime. And he argues, on the basis of interviews with many of the most famous white-collar criminals, quite strongly against the idea that these crimes are the result of some sort of careful cost-benefit analysis.
In part because the cost-benefit analysis just does not make sense: often these are actually really quite wealthy, really quite successful people who have not that much to gain but everything to lose. But then secondly, looking at how the decisions actually get made, the word he often uses is mindless. You know, it's like people aren't even paying attention. This was true, for instance, of the CEO of McKinsey, who gets
off a call with the board of Goldman Sachs, I think, and immediately, 23 seconds later, calls a friend to tell him about what happened in the board meeting, in a way that was illegal insider trading. This was not a carefully calculated decision. It was just irrationality.
It was a failure, a failure of intuition rather than of reasoning. And that's my best guess at what happened here as well. Actually, let me lay out a few points that could certainly bias some people in the direction of more calculation and less impetuosity than that. Because I think both within EA circles and more widely, and certainly, it seems, within
Sam's brain, there were some ideas that will strike people as strange but are nonetheless difficult to refute rationally or logically. Which is to say, it all hinges on a topic you and I, I believe, have discussed before on the podcast, which comes up in any discussion of what's called long-termism, which I think we'll get to. But it comes down to just how to rationally integrate any notion of probability, especially probabilities where one side of
the decision tree represents some extraordinarily large possible gains. So I believe Sam at one point was accused of believing, or he may have said something along these lines, that if the expected value is such... I forget if it was in positive terms or negative terms, in terms of avoiding extinction, but it sounds like he was willing to just toss a coin endlessly, with the risk of ruin on one side and a sufficiently large,
positive expected-value outcome on the other, right? It's just like, if you have a chance to win a million dollars on one side and lose, you know, $100,000 on the other, you should just keep tossing that coin, because your expected value is 50% of a million on one side and 50% of losing $100,000 on the other. So your expected value is, you know, $450,000 every time you toss that coin. But of course, if you only have $100,000
to lose, you can lose everything on your first toss, right? And so he just seems to be someone who was looking at the expected value proposition somewhat naively, and looking at it with everyone else's money on the line. Or at least that's what certain things I've heard said of him, or by him, suggested.
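To make that arithmetic concrete, here is a minimal simulation sketch of the hypothetical coin-toss bet just described. The dollar figures are the ones from the conversation; the number of tosses, the trial count, and the ruin condition are illustrative assumptions.

```python
import random

# Minimal sketch of the hypothetical coin toss described above.
# Win $1,000,000 or lose $100,000 with equal probability, so the
# expected value per toss is 0.5 * 1_000_000 - 0.5 * 100_000 = $450,000.
WIN, LOSS, START = 1_000_000, 100_000, 100_000

def ruined(tosses: int) -> bool:
    """Play up to `tosses` rounds; return True if the player goes bust."""
    bankroll = START
    for _ in range(tosses):
        if bankroll < LOSS:  # can no longer cover a losing toss
            return True
        bankroll += WIN if random.random() < 0.5 else -LOSS
    return False

TRIALS = 100_000
busts = sum(ruined(tosses=100) for _ in range(TRIALS))
print(f"Expected value per toss: ${0.5 * WIN - 0.5 * LOSS:,.0f}")
print(f"Players ruined: {busts / TRIALS:.1%}")
# Despite the hugely positive expected value, about half of all players
# are wiped out, almost always on the very first toss: a single loss
# exhausts the $100,000 starting stake.
```

Which is the point being made here: a positive expected value says nothing about the probability of ruin when a single loss can consume the entire stake.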
So perhaps bring in your beliefs about how one should think about probability, and about the kind of ends-justify-the-means thinking that many people believe has corrupted EA more generally, with Sam as kind of the ultimate instance of a cautionary tale there. Sure. So one thing that's just absolutely true is that Sam seemed unusually
risk-tolerant, and was unusually risk-tolerant. And at the outset, when the collapse happened, I absolutely was worried that perhaps what had gone on was some sort of carefully calculated fraud, a carefully calculated willingness to break the law in the service of what he thought was best. You know, I was worried that maybe there would come out a spreadsheet that did a little cost-benefit analysis of fraud and
it was clear for all to see. But I think there are just good reasons for thinking that's not what happened. I'll discuss that first, and then let's come back to the important points about attitudes to risk and ends-justify-the-means reasoning. But just briefly, on why I think that's not what happened. One is just that the overall plan makes so little sense if it was a long con, a con from the very start.
So if that was the case, then why would they be trying so hard to get regulated? Why would they be so incredibly public about what they were doing, in fact very actively courting press attention, in fact having Michael Lewis, one of the world's leading financial writers, following them around with access? And why would they be associating themselves
with EA so much as well, if that wasn't something they actually cared about? And then the second thing is, it's just absolutely agreed by everyone that the companies were a shambles, organizationally. There was not even the most basic accounting, not the most basic corporate controls. Then in June of 2022, there was a meeting among the higher-ups, where they were very worried, because it looked like there was a $16 billion loan from FTX to
Alameda, and they thought that would be their ruin. It turned out that was a bug in the code, and actually it was only an $8 billion loan, and they were apparently elated at the time. So they didn't know what their assets were to within $10 billion. And in fact, it was at that time that they discovered that they had been
double-counting $8 billion. So one way customers could put money on the FTX exchange was by sending money to Alameda, which Alameda then should have given to FTX. And it seems that at that point in time, they realized that had not been happening. Alameda had thought that money was within Alameda, legitimately, I guess, and FTX had thought that money was there as well.
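A stylized sketch of how that kind of double counting produces a hole may help. The $8 billion figure is from the conversation; the ledger logic is a deliberately cartoonish assumption, not an account of the real books.

```python
# Stylized sketch of the deposit flow described above (hypothetical ledger).
# Customers wire deposits to Alameda's bank account, and FTX credits their
# exchange balances, but the cash is never passed on to FTX. Both firms
# then behave as if they hold the same dollars.
deposits_routed_via_alameda = 8_000_000_000

alameda_spendable_cash = deposits_routed_via_alameda  # Alameda sees usable cash
ftx_owed_to_customers = deposits_routed_via_alameda   # FTX owes customers this much
ftx_cash_on_hand = 0                                  # ...but FTX never received the funds

# Each ledger looks fine in isolation; combined, the same $8B is counted
# twice, and the exchange is short by exactly what Alameda can spend.
hole = ftx_owed_to_customers - ftx_cash_on_hand
print(f"Hole at FTX if Alameda uses the cash: ${hole:,}")
```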
Now, I'm not going to claim I know that this wasn't conveniently overlooked or something. But at least Caroline, on the stand, testified that she, at least, didn't know whether Sam knew, prior to that point in time, that that money was in Alameda when it should have been in FTX. And that's the way almost all the money flowed from FTX to Alameda. There
was also a lending program, which was really focused on by the prosecution, and we can go into that in more detail as well. But I think that's actually not where the action was in terms of how the money moved. And if you really get into the heads of the people at this time: okay, let's now suppose they're the most ruthless consequentialists ever, and they want to just make as much money
as possible. Let's say they just want dollars, and they're not risk-averse at all. That wasn't even true, but let's assume it. Why on earth would they take that money and then invest $5.5 billion of it into illiquid venture investments, so companies, basically? It was obviously posing enormous risks to them. And the gains were really quite small compared to the potential loss of not only everything in the company, but also the huge harm that
it would do to the rest of the world, and to the effective altruism movement itself. It really just makes no sense from a utilitarian perspective at all. And that's why, when I try to inhabit this mode of them making carefully calculated, rational decisions, there are just too many facts that seem in tension with, rather than consistent with, that, for it to be my best guess at what happened.
So what do you think was actually going on? Do you ascribe this in the end to some version of incompetence, combined with a dopaminergic attachment to just winning at some kind of gambling task? Yeah, I mean, I see the deep vice that ultimately drove this all as hubris. They were not very experienced; they were very smart; they grew a company in what was, for a while, an impressive way. And Sam in particular, I just think, thought he
was smarter than everyone else. And this was something that I didn't like about Sam and noticed at the time: he would not be convinced of something just because other people believed it. Even if everyone else believed X and he believed Y, that wouldn't give him any pause for doubt. And so I think he got kind of corrupted by his own success. I think he felt like he had made these bets that had paid off in spite of people being
skeptical time and again. And so he just thought he was smarter. And that meant that very basic things got neglected, like having good accounting, having the kind of adult professionals come in who could do risk management and actually point out what on earth was going on with where different stashes of money were. Because this is another thing that shows just how insane the whole venture was: at the time of collapse, they just didn't know
where most of their assets were. You know, there would be hundreds of millions of dollars in a bank account somewhere that they wouldn't even know existed. The bank would have to call them to tell them, by the way, you've got these assets on hand. Again, if it was a carefully calculated ploy, you would want to know where all of your assets were in case there
was a mass withdrawal of customer deposits. And so I think, yeah, there was huge risk, and also not so much a risk calculation as an attitude to risk. Many people, I think, in the position of really quite rapidly running a multi-billion-dollar company, would think: holy shit, I should really get some experienced professionals in here. And they would be quite worried: you know, have I attended to everything?
Have I got this company under control? And that was not at all how Sam and the others were thinking. And then the final thing to say is that this isn't to say they didn't commit fraud. From June onwards, after this hole had been discovered, I think it becomes pretty clear that there were just blatant lies to try to get out of the position
that they'd put themselves in. I think there are also other cases of things that seem like clearly fraud, though not of the eight-billion-dollar scale. And this was fraud, you think, to conceal the hole in the books that was putting everything at risk? Or was this fraud even when things appeared to be going well and there was no risk of oblivion evident? I mean, what was the nature of the fraud, do you think?
Yeah, I mean, again, flagging that there's probably a lot I'm saying that's wrong, because this is, you know, complex, and I'm not confident; there are lots of different stories. My guess is both. In the trial, one thing that came up, and I'm surprised it
didn't get more attention, was that FTX advertised that it had an insurance fund. FTX had a liquidation engine, basically a bit of technology for when a customer was borrowing funds on the exchange in order to make, basically, a bet with borrowed money; on other exchanges, you could easily go negative by doing that,
and that would mean other users would have to pay to cover that loss. FTX's automatic liquidation engine was quite well respected, but they said, even if that fails, there's this insurance fund that will cover any losses. However, the number that was advertised on the website seemed to have been created just by a random number generator.
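For listeners unfamiliar with the mechanics being described, here is a toy sketch of what such a backstop is supposed to do. The numbers and logic are entirely hypothetical; this is not a claim about how FTX's actual engine worked, just an illustration of the role an insurance fund plays when a liquidation fails.

```python
def settle(collateral: float, pnl: float, insurance_fund: float):
    """Close out a leveraged position (toy model).

    If losses exceed the trader's posted collateral, the account goes
    negative; an insurance fund is meant to absorb that shortfall so the
    loss isn't socialized across other customers.
    Returns (trader_equity, fund_remaining, loss_socialized_to_others).
    """
    equity = collateral + pnl
    if equity >= 0:
        return equity, insurance_fund, 0.0  # trader keeps what's left
    shortfall = -equity
    covered = min(shortfall, insurance_fund)
    return 0.0, insurance_fund - covered, shortfall - covered

# A trader posts $10k of collateral; a fast crash leaves the position
# $24k underwater before the liquidation engine can close it.
equity, fund_left, socialized = settle(10_000, -24_000, insurance_fund=50_000)
print(equity, fund_left, socialized)  # 0.0 36000 0 -> the fund absorbs the $14k shortfall
```

If the advertised fund doesn't actually exist at the advertised size, that last backstop is fictional, and the shortfall lands on other customers after all.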
So that seems really clearly a fraud. And I don't know, it hasn't been discussed very much, but it seems totally inexcusable, and it was in place even when the going was good. But then the big fraud, the $8 billion: it seemed like that really started kicking in from June of 2022 onwards. I'll also say, you know,
we can talk about my interactions with the people there. Looking back, it seems to me like they did not know just how bad the situation they were in was. But yeah, well, let's talk about your interactions, and let's focus on Sam
to start. I mean, you bring up the vice of hubris. I only spoke to him once. I believe it's possible I had a call with him before I did a podcast with him, but he was on the podcast once, and this was very much in the moment when he was the darling of the EA community. I think he was described as, I guess, the youngest self-made person to reach something like $30 billion. I think he was, you know,
29 and had $29 billion or something at the time I spoke to him. And again, the purpose of all of this earning was to do the maximum amount of good he could do in the world. He was just earning to give, as far as the eye could see. You know, I didn't really encounter his now famous arrogance in my discussion with him. He just seemed, you know, smart and well-intentioned. And I had no reason to doubt him; I knew nothing about the details of FTX apart
from what he told me. And, you know, I think anyone in our position of talking to him about his business could be forgiven for not immediately seeing the fraudulence of it, or the potential fraudulence of it, given that he had people invest with him, quite sophisticated venture investors, early on. And, you know, they didn't detect the problem, right? And we were not in the position of investing with him.
But in the aftermath, there are details of just how he behaved with people that struck me as arrogant to the point of insanity, really. I mean, in these investor calls, apparently, while describing his business and soliciting what I think was hundreds of millions of dollars at a minimum from firms like Sequoia, he is simultaneously
playing video games. And this is, you know, celebrated as this delightful affectation. But clearly he is someone who thinks he need not give people 100% of his attention because, you know, he's got so much bandwidth that he can just play video games while having these important conversations. And there were some things in Michael Lewis's book
that revealed, or at least seemed to reveal, that he was quite a strange person. You know, someone who claimed, on his own account, at least to Lewis, that he didn't know what people meant when they said they experienced the feeling of love. So he's neuroatypical at a minimum. And perhaps... I just, you know, offer this to you as just a
series of impressions. But how peculiar a person is he? And shouldn't there have been more red flags earlier on, just in terms of his ethical integrity, or his capacity for ethical integrity? I mean, if someone tells me that they have no idea what anyone means when they say they love other people, that is an enormous red flag. It's something that, you know, I would feel compassion for;
the person is obviously missing something. But as far as collaborating with this person or putting trust in them goes, it's an enormous red flag. And I don't know at what point he told Lewis that. But what was your impression, or what is your impression, of Sam as a person? And in retrospect, were there signs of his ethical unreliability far earlier than when the emergency actually occurred? Sure. So there's
a lot to say here. And briefly, on the not feeling love: my descriptions of Sam, and my feelings about Sam, are quite variegated. And the inability to feel love wasn't something that seemed striking or notable to me at the time. After the Michael Lewis book, and lots of things that came out, it seemed like he just had emotional flatness across the board. And whether that's a result of depression or ADHD or autism is
not really that clear to me. But that wasn't something that seemed obvious at the time, at least. I guess I interact quite a lot with people who are, you know, relatively emotionally flat. I certainly wouldn't have said he's a very emotional person. He did seem like a very thoughtful and incredibly morally motivated person, all the way back to 2012. I mean, his main concern was the plight of non-human animals on factory farms for most of that time. That's kind of an
unusual thing to care about if you're some sort of psychopath or something like that. Yeah. You know, when I first reached out to Sam after FTX had been so successful, I talked to him about it: okay, you've started this company; it's a crypto company; isn't crypto, you know, a little sketchy? How much have you thought about risks to
the company, and so on? And there was a narrative that came from him, and that was then echoed and emphasized by Nishad Singh, who in my experience was really a crucial part of the story, a really crucial part of my interface with that world. And the story I got told was: FTX is trying, quite self-consciously, to be much more ethical than the standard crypto exchange, or anything going on in the crypto world. And there are two reasons why we need to do that, even putting
aside the, you know, intrinsic desire to act ethically. One is because they were trying to get regulated. So they were very actively courting regulation in the US, because they thought that was how they could get the edge over their competitors: by being much more open, and open to regulation. They were these center-left people; they were not the libertarians that normally populate crypto.
And then secondly, because they planned to give the proceeds away, they knew that they would face a higher bar for criticism. And that claim got made to me over and
over again, and not just by Sam. Sam was very busy, so, you know, I spoke to him one on one a number of times, like half a dozen times or something, and more times in group settings. But I talked to Nishad, and Nishad really came across... I mean, this is the thing that maybe breaks my heart the most about the whole story: he came across just as this incredibly thoughtful, morally motivated, careful, just kind person. And I would ask him
things like: okay, so why are you in the Bahamas? And there would be an answer, which is that that's where they were able to get licensed. Or I'd ask: why is your apartment so nice? And they would say: well, we can't really get mid-level property in the Bahamas; we just need somewhere that creates a campus feel; and so, yeah, it is nicer than we'd like, and hopefully we can move over time to something a bit less nice. And the same for other ethical issues in
crypto we could go into. Over and over again, he was painting that picture. And something that was just so hurtful and confusing is: was he lying to me that whole time? Was that just all false? Or was I just a gullible fool? I haven't followed the trial in sufficient detail to know what his role was revealed to be in all of this. I mean, where is he? Has he been prosecuted? And what do you think his actual intentions were, at this point? Yeah, I mean,
he pled guilty, I think to fraud, among other things. And he testified. In these pleadings, do you think this was just kind of a classically perverse prisoner's dilemma situation, where you have people, given the shadow of prosecution and prison hanging over them, willing to testify to things that the government wants to hear, but which are not strictly true? I mean, what's your theory of mind for the people who
pled guilty, at this point? Yeah, I mean, again, this is something that comes up in Eugene Soltes' book, where he talks about what a very strange aspect of the US legal system this is. It's not something that happens in the UK, where the government will reward people, literally with years of their lives, for going on the stand, because the people who testify probably will get no jail time. They will reward people in that way for going on the stand and testifying.
And so that just does mean, you know, it's not that they can tell lies, or at least not verifiable lies, but there are very strong incentives to present things in a certain way. And again, I don't want this to sound much more defensive of Sam than I intend, but the Soltes book talks about some people who would just rehearse with their lawyers for hours and hours and hours in order to, you know, display appropriate contrition, confession,
and so on. And so the view of this that Michael Lewis took is just that, you know, people will have said true things throughout, but the tone of it is maybe a little different than it really was. There was a lot of the co-conspirators talking about how bad they felt, and how they knew what they were doing was wrong at the time, that they were really torn up. That, you know, seems quite inconsistent with my experience of them,
but maybe they were just incredible liars. One question about that. So 'they knew what they were doing was wrong' could mean many things. It could mean that they knew that they were taking risks with people's funds that were unconscionable, you know, given the possibility of losing money that customers thought was safely on the exchange. But that's not the same thing as stealing money
and misappropriating it in a way that is purely selfish, right? It's not like: we took money that was not ours, and we bought, you know, luxury condominiums in the Bahamas with it, and hoped
no one would notice, right? That's one style of fraud. This, you tell me: is it possible that they thought they were going to wager this money on other real investments, however shady some of these crypto properties were, but they actually expected enormous returns as a result of that misappropriation, and that money would come back, you know, safely into FTX, and no one would lose anything in the end if everything worked? Yeah, I mean, in terms of how things seemed to me,
I just think they didn't think the company was at risk, not at serious risk. And there are a few reasons why. I mean, one, this is kind of why I was so confused this whole time. Like, you know, I visited the Bahamas a number of times in 2022, and I never saw any kind of change in attitude from them over that time. You would really think, if you're engaged in this major fraud, that something would seep out, some sort of flags. Maybe I'm a fool, but I did not see
that. And in fact, in September, my last trip to the Bahamas, I heard from Michael Lewis that Sam had been courting funding for FTX from Saudi Arabia and other places in the Middle East. And I do not love the idea of taking money from Saudi Arabia; I have issues with that. And it also just struck me as kind of odd, and I was aware of the sector's downturn. So I talked to Nishad, and I asked: look, is there anything up with the company? Are you in trouble? And he
says no. And we talk about this for some time, you know; it's not a passing comment. And that, by any account, is past the point when he, you know, allegedly had learned about the huge hole that the company faced. Similarly, Michael Lewis, around the same time, asked both Caroline and Nishad, as a kind of fun question: oh, what could go wrong with the company? If this all goes to zero, what happened? And again, he said there was no indication
of stress upon hearing that question. They had fun with it. They were like: oh, maybe crypto is just a lot of air and everyone gets turned off; or maybe Sam gets kidnapped, that was kind of one of the big worries. But nothing leaked out there. So given his guilty plea and his testimony, what's your belief about your conversation with Nishad at that point?
Do you think he was unaware of the risk, or do you think he was lying to you? So, another thing that came out during the trial, though I'm not sure if it was admissible as evidence, was that Nishad commented to the government, kind of immediately upon pleading guilty, that in that period he still thought that FTX would last for years.
And, in terms of just giving an indication of what Nishad's personality was like: when the collapse happened, he had to be watched, because he was on the verge of suicide. He was so distraught about what had happened to the customers, and I think was really quite close to taking his own life. So then what sort of fraud was he pleading guilty to, if he was the kind of person who's suicidal when the wheels come off, as though he had no idea that this was in the cards?
Right. I mean, was he just responding to all of the opprobrium and disgrace aimed his way in the aftermath? Or do you think he was actually surprised, fundamentally, by the risk that was being run? And if the latter, in what sense does he claim to be culpable for a conscious fraud? Yeah, I mean, so I don't know whether Nishad at this time was ignorant, as in just really did not know the risks they were running. I don't know if he
was just delusional. So again, this is a thing that Soltes talks about: the capacity for humans to create a narrative in which they're still, you know, the good guys. I don't know, perhaps he thought that, yes, this was bad, but they would get out of it; and so it was a little bit bad, but it'll be fine. Maybe that was there too. Oh yeah, one thing that he
did is he bought this three-and-a-half-million-dollar property for himself in October. Again, that's just not the action of someone who, on the stand, said how distraught he was when he was talking to Sam about this. So all of these things are possible to me. As for him pleading guilty: well, whichever of these is true, I think it would make sense to plead
guilty if you're in that situation, where there are huge costs to going to jail. And plausibly he just thought: look, yes, I knew what I was doing was bad; I thought it was only a little bit bad; actually, I was wrong, it was very bad, extremely bad; and I'm willing to just fess up and take the hit I should get. There are various possible
explanations there. Right. There was a lot made, I don't know if this appeared in the trial, but in the court of public opinion there was a lot made of a text exchange that Sam had with somebody, I think it was a journalist or a quasi-journalist, in the immediate aftermath of the scandal, where he seemed to admit that all the effective altruism lip service was just that.
It was just the thing you say to liberals to make them feel good. I forget the actual language, but it seemed like he was copping to the fact that that part was always a ruse. Honestly, when I read those texts, I didn't know how to interpret them, but it was not obvious to me that they were the smoking gun they appeared to be in so many minds, minds that were ready to bury effective altruism as a scam. Do you know the
thread I'm referring to, and do you have a view? Yeah, I know the thread. And yeah, I mean, in a way, from the perspective of the brand of effective altruism, maybe it would have been better if that part had actually been a big con. But no, I think he believed in these ideas. I think there he was referring to what you might call corporate ethics that are really just PR. So, you know, companies will often make these big
charitable donations to their local community and so on. And in that case, yeah, exactly, everyone knows this is marketing. And I guess I don't really know the details, but presumably FTX was doing stuff like that in the same way other companies do. And my interpretation of those texts is that that's what he was referring to. That actually reminds me
of the one concern I did have about Sam before the scandal broke. I don't know if this was contemporaneous with my conversation with him on the podcast, but I just remember thinking this way when I heard how much money he had given away, and you referenced it earlier: it's something north of a hundred million dollars. You know, I'm always quick to do the math on that, and I recognize what a paltry sum that actually is if you have $30 billion.
Right. I mean, it's an enormous amount of money out in the real world, where people are grateful for whatever you give them. But it's analogous to somebody who has $30 million giving $100,000 away. Right? It's not a sacrifice. It's a rounding error on their actual wealth. And it's certainly not the sort of thing that I would expect of someone for whom the whole point
of becoming fantastically wealthy is to give all of it away. And I forget if I asked him a question about the pace of his giving during that podcast. But I know that some people think: well, the best thing for me to do is to use these assets to make
more assets in the meantime, and then I'll give it all away later on. But given the urgency of so many causes, and given the real opportunity to save lives and mitigate enormous suffering every day of the week, you know, starting now, my spidey sense tingles when I hear a fantastically wealthy person deferring their giving to the far future. And so I'm wondering what you think of that. Sure. Yeah.
I think that wasn't really an issue here, for a couple of reasons. One is just that, you know, his net worth was basically entirely in FTX, and there's no way of converting that. If you're in a startup and all your wealth is in the equity of that startup, there's not really any way of converting that wealth into money, the sort of thing that you could donate. You have to basically keep building the company until you can have an exit, so get acquired or sell the company, and then you can
become more liquid. And then the second thing is that, at least relative to other business people, he was very unusual in wanting to give more and give quickly. So I mean, I advised on the setup of his foundation, and it got a lot of criticism for scaling up its giving too quickly. Going from zero to one or two hundred million in a year is a very big scale-up, and it's actually just quite hard to do. And so, if you were asking me:
yes, it's a tiny fraction. And I agree with the point in general that when someone who's a centibillionaire gives, you know, a hundred million dollars to something, that is just really not very much at all, especially once they've had that money for decades and could really distribute it. But in this case, the way it seemed to me at the time, and I guess still does, was basically consistent with someone trying to scale up their giving as
fast as they can. And in fact, in a way that meant he plausibly should have been paying more attention to the business and not getting distracted by other things. Yeah. So what, if anything, does this say about effective altruism? I mean, I guess there's an additional question
here. But what has been the effect, as you perceive it, on EA and the public perception of it, and on fundraising for good causes? Has it forced a rethinking of any principles of effective altruism, whether it's earning to give or a focus on long-termism, which we haven't talked about here yet, but you and I have discussed before? And how large a crater has this left, and what has been touched by it? And is there any good to come out of this? Just give me the picture
as you see it of EA at the moment. Yeah, I mean, huge harm, huge harm to EA. You know, at the time of the collapse, I put it at, like, 20% or something that the EA movement would just die, that this was a killer blow. And so obviously, in terms of the hit to the brand: so many people think ill of EA now, or are critical of EA now, in all sorts of different ways, driven by this. In a way, that's not surprising; it was this horrific, horrific thing.
I don't think it happened because of EA. I think it happened in spite of EA. I think EA leaders and communicators have been very consistent on the idea that the ends do not justify the means, really since the start. And really, this goes back to John Stuart Mill, essentially. And actually, even Sam knew this. So again, as part of just trying to figure out what happened, I did some Facebook archaeology. There's an essay by Eliezer Yudkowsky called
The Ends Do Not Justify the Means Among Humans, basically making the classic point that you are not good at calculation, even if you're a 100% consequentialist, which I don't think you should be. But even so: follow the heuristics that are tried and true, including heuristics not to violate side constraints. And this was shared on a kind of discussion group, and this was well before FTX.
Sam's response was like: why are you even sharing this? This is obvious. Everyone already knows this. So yeah, this was, in my view, in spite of EA, not because of it. But yes, the damage is huge. There's internal damage as well. Morale was very, very low. Trust was very low. The thought being: well,
if Sam and the others did this, then who knows what other people are like. And there has just been an enormous amount of self-reflection, self-scrutiny, whether that's because of this catastrophe itself or just... if there's ever a point in time for self-reflection, it was in the aftermath of that. And so there's a whole bunch of things that have changed over the last year and a half. Not in terms of the principles, because what is effective altruism? It's the idea of
using evidence and reason to try to make the world better. That principle is still good. I would still love people to increase the amount by which they are benevolent towards others, and increase the amount by which they think extremely carefully and are really quite intense about trying to figure out how they can have more positive impact with their money or with their time. That's just as true as ever. And the actions of one person in no way undermine
that. I mean, take any ideology, take any moral view that you can imagine: you will find advocates of that ideology who are utterly abhorrent. This is the Hitler-was-a-vegetarian principle. Exactly. Sam was one too; vegetarians are having a bad time. Yeah, exactly. But there have been a lot of changes to the institutions within effective altruism. So it has essentially entirely new leadership now, at least on the organizational side:
the Centre for Effective Altruism, Open Philanthropy, 80,000 Hours. And the boards of at least some of these organizations are really quite refreshed. This is partly just that a lot of people had to work exceptionally hard as a result of the fallout and got really quite burned out. In my own case, I've stepped back from being on the boards of any of these main organizations,
and I won't do that again for quite a while. One thing was just that I wasn't able to talk about this stuff in the way I really wanted to for over a year. Like, I spent, again, literally months writing blog posts and rewriting them, and having them knocked back, because there was an investigation being held by Effective Ventures, one of the charities, and the law firm doing that really didn't want me to speak while it was
ongoing. But then also because I think a healthier effective altruism movement is more decentralized than it was. And there was an issue, when the collapse happened, that I was in the roles of being on the board of the charity, also of being, if anyone was, a spokesperson for EA, but also of having advised Sam on the creation of the foundation. And that meant I wasn't able to offer guidance and reassurance to the community at that time of crisis in a way that
I really wanted to, and wish I'd been able to. And so I do think a healthier EA movement has greater decentralization in that way. And there are some other things happening in that direction too. So various organizations are separating out projects, like separating them out legally to become their own entities. Yeah, in the aftermath, I was certainly unhappy to see so many people eager to dance on the grave of effective altruism.
And in the worst cases, these are people who are quite wealthy and cynical and simply looking for an excuse to judge the actual good intentions and real altruism of others as just, you know, patently false, as if it was never really there. No, everyone's just in it for themselves; and therefore I, rich Ayn Randian type that I am, should feel a completely clear conscience in being merely selfish. Right. It's all a scam. Right. And I just think that's an odious
worldview, and a false one. It's not that everyone is just in it for themselves. It's not all just virtue signaling. There are real goods in the world that can be accomplished. There are real harms that can be averted. And, you know, being merely selfish really is a character flaw, and it's possible to be a much better person than that. And we should aspire to that. And I say this as someone who's always been, to some degree, somewhat critical, or at least
leery, of EA as a movement and as a community. I mean, I think I'm, you know, one of the larger contributors to it, just personally, in how much money I give to EA-aligned charities, and in how much I have spread the word about it and inspired others to take the pledge and to also give money to GiveWell and similar organizations. But I've always been... and I've spoken to you about this, and I've said as much on this podcast and elsewhere.
You know, as a movement, it's always struck me as too online, and for some reason attractive to, you know, in the most comedic case, atypical people who are committed to polyamory, right? I mean, there are Silicon Valley cult-like dynamics that I've detected, if not in the center of the movement, certainly at its fringe, that I think are evinced to some degree in the life of Sam Bankman-
Fried too. And we haven't talked about just how they were living in the Bahamas, but, you know, there are certainly some colorful anecdotes there. And so it just seems to me that there's a culture that, you know, I haven't wanted to endorse without caveat. And yet the principles that I've learned from my conversations with you, and in reading books like your own and Toby Ord's book The Precipice, the ideas about existential risk, and about
actually becoming rational around the real effects of efforts to do good, rather than the imagined effects or the hoped-for effects; divorcing a rational understanding of mitigating human suffering and risk of harm from the good feelings we get around specific stories and specific triggers to empathy; and just performing conceptual surgery on all of that so that one can actually do what one wants to do in a clear-headed way, guided by
compassion and a rational understanding of the effects one can have on the world. And, you know, we've talked about many of these issues in previous conversations. I think all of that still stands. None of that was wrong, and none of it was shown to be wrong by the example of Sam Bankman-Fried. And so I just mourn any loss that those
ideas have suffered, you know, in public perception because of this. So yeah, do with that what you will, but that's where I've netted out at this point. Yeah, I mean, I think part of the tragedy of the whole thing is just that, you know, Giving What We Can has over 9,000 people who are pledging to give at least 10% of their
income to highly cost-effective charities, aiming for 10,000 people this year. For those people, generally living normal, middle-class lives, or maybe they're wealthier: in what way do the actions of Sam and the others invalidate that? And the answer is: not at all. That is just as important as ever. And yeah, one of the things that's so sad is that maybe fewer people will be inclined to do so, not for any good rational reasons, but
just because of the, you know, bad odor that surrounds that idea now. And that's just a tragedy. I think donating a fraction of your income to causes that effectively help other people is still a really good way to live. You talk about, yeah, the kind of online, cult-like, shot-through-with-Asperger's side of the movement. I do want to say that, you know, EA is many things, or the EA movement is many things.
And also, of course, you can endorse the ideas without endorsing anything to do with the movement. But I definitely worry that there is a segment that is extremely online and perhaps unusually weird in its culture. And it's a bit of a shame, I think, if people get the impression that that's what everyone within the EA movement is like, on the basis of whoever is loudest on the internet. People can be poly if they want; I have no moral objection to that at all, it's a fine way to live. People can have all sorts of weird beliefs too, and maybe some of them are correct. I think AI risk was extremely weird for many years, and now people are taking it really seriously. So I think that's important. But I think the vast majority of people within the effective altruism movement are normal people: they're people who care a lot, people who are willing to put their money or their time where their mouth is. And because they care, they're really willing to think things through and to go where the arguments and the evidence lead them. And, you know, I'm not someone who's naturally on the internet all the time. I find Twitter and internet forums quite off-putting. And when I meet people in person who are engaged in the project of effective altruism, it feels very, very different than it does if you're just hanging out on Twitter or on some of the forums online
or something. So is there anything that has been rethought at the level of the ideas? I mean, the one other issue here, which I don't think played an enormous role in the coverage of the FTX collapse, but which has come under some scrutiny and become a kind of ideological cause for concern, is the emphasis on long-termism, which you brought out at book length in your last book. Was that part of the problem here? And is there any rethink? Because that certainly brings in this issue of a probability calculus that turns our decisions into a series of trolley problems wherein ends-justify-the-means thinking at least becomes tempting. Which is to say that if you thought a decision you made had implications for the survival of humanity, not just in the near term but out into an endless future, where trillions upon trillions of lives hang in the balance, well, then there's a lot you might do if you really took those numbers seriously, right?
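To make the worry concrete, here is a minimal sketch of the expected-value arithmetic being gestured at. Every figure in it, the size of the hypothetical future population, the probability of influencing it, and the certain near-term benefit it is weighed against, is an illustrative assumption rather than a number from the conversation:

```python
# Toy illustration of "fanaticism" in naive expected-value reasoning.
# All numbers below are made up purely for illustration.

future_lives = 10**14         # assumed size of a very long human future
p_influence = 1e-6            # assumed tiny chance your act tips the outcome
ev_speculative = future_lives * p_influence   # 10^8 expected lives

certain_lives_saved = 10_000  # a large, certain near-term benefit (assumed)

# The speculative long shot "outweighs" the certain good by four orders of
# magnitude, which is exactly how ends-justify-the-means reasoning starts
# to look tempting on paper.
print(ev_speculative / certain_lives_saved)  # 10000.0
```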
Is there anything you have been forced to revise your thinking on as a result of this? Yeah, so I really don't think long-termism was at play. I mean, again, like I've said, I feel like it wasn't. What happened at FTX was not a matter of some rational calculation in pursuit of some end. I think it looks dumb and immoral from any perspective. I also just think that if your concern is with the hundreds of millions of people in extreme poverty, or the tens of billions of animals suffering in factory farms, the scale of those problems is more than enough for the same kinds of worries to arise. And in fact we have seen, in the animal welfare movement, people on the fringes taking violent action in pursuit of what they regarded as the greater good. Long-termism, if anything, actually cuts against it, because this argument, that you should be willing to take more risk if you're using your money philanthropically than if you're spending it on yourself, applies much less strongly in the case of long-termism than it does for global health and development, for example. Because if I have five dollars to spend, that can buy a bed net. If I have a billion and five dollars to spend, that final five dollars still buys a bed net. Global health and development can just absorb huge amounts of money without the cost-effectiveness going down much.
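A rough sketch of the reasoning Will is appealing to here: when the value of a donation is nearly linear in dollars, as with bed nets, a big gamble is roughly a wash in expectation, whereas for personal consumption, where marginal value diminishes, the same gamble is clearly bad. The five-dollars-per-net figure comes from the conversation; the 50/50 double-or-nothing bet and the log-utility model of personal spending are illustrative assumptions:

```python
import math

# Toy model of why risk-neutrality is more defensible for philanthropy
# than for personal spending.

bankroll = 1_000_000_000  # dollars

def personal_value(dollars):
    # Diminishing returns: a second billion adds far less to your life.
    return math.log(dollars)

def bednet_value(dollars):
    # Near-linear returns: the billionth $5 still buys one more net.
    return dollars / 5

for value in (personal_value, bednet_value):
    keep = value(bankroll)
    # A 50/50 double-or-nothing gamble (the loser keeps $1 so log stays defined).
    gamble = 0.5 * value(2 * bankroll) + 0.5 * value(1)
    print(f"{value.__name__}: take the gamble? {gamble > keep}")

# personal_value: take the gamble? False (log utility punishes the downside)
# bednet_value: take the gamble? True (linear value makes it a wash or better)
```

On this toy picture, Will's point is that long-termist causes behave more like the log case than the bed-net case: an extra billion cannot be converted into proportionally more risk reduction, so the usual license to gamble with philanthropic money weakens.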
The same is not true for long-termism. Just to follow a further point along those lines: my concern with long-termism has been the way in which it can seem to devalue the opportunity to alleviate present harms and present suffering, because you can tell yourself a story that the interests of the billion people suffering now are infinitesimal compared to those of the trillions upon trillions who may yet exist if we play our cards right. So it's an argument for perhaps overlooking the immediate suffering of the present out of a concern for the unrealized suffering of the future. Yeah, and I think that's why, in What We Owe the Future, I was very careful to defend only what I call the weak form of long-termism: that positively impacting the long-term future is a moral priority of our time, not claiming it's the only one, nor claiming it's overwhelmingly important, either. I think we should be uncertain about this. Right. In the new preface, I suggest a goal, a way of operationalizing that, of rich countries putting at least 1% of their resources toward issues that distinctively impact future generations, because at the moment they put close to 0%. And I do think the mode of operating in which you think, oh, present catastrophe is nothing compared to the unparalleled good that may come in a trillion years' time, I think that's a very bad way of thinking, even from a purely long-term perspective. It doesn't have a good track record, and it's really not how I would want people to think. There has been a different line of criticism that I got from within EA, from the publication of What We Owe the Future onwards, that I think has a lot of merit. And that line of criticism was that I was misunderstanding,
actually, how near-term the risks we are talking about were. So, in particular, the risk from AI: the risks we face from really advanced artificial intelligence, even artificial general intelligence, these are coming in the next decade, at most the next couple of decades. And secondly, the scale of the problems posed by technological developments like AI is so great that you don't need to think about future generations at all. Even if you just care about the 8 billion people alive today, the size of the risks we are imposing on them via these technologies is more than enough for this to be one of the top problems the world should be facing today. And over the last few years, since the publication of the book, I just think that perspective has been getting more and more vindicated. And so I am now much more worried about really very advanced, very fast advances in AI as a very near-term thing, as in literally the next five or six years, or the next decade. I'm much more worried by the risks from that than I was even just a few years ago. This is a sidebar conversation, but it's interesting.
Are your concerns mostly around the prospect of unaligned AGI, or are they the more piecemeal, near-term, and actually already present concerns around the misuse of AI, at whatever capacity it exists, to essentially render societies ungovernable and more or less guarantee malicious use at a scale that becomes quite harmful? To what degree are you focused on one versus the other? I think I want to say I'm focused on both, but also on other things too. So, misalignment risk: I think it's real, and I think it's serious, and I think we should be working on it, and working on it much more than we currently are. I am an optimist about it, though. As in, I think very probably it will either turn out not to be an issue, because it's just really quite easy to make advanced AI systems that do what we want them to; or we'll put in a big effort and be able to solve the problem; or we'll notice that the problem has not been solved and we'll actually just hold back, put in regulations and other controls for long enough to give us enough time to solve the problem. But there should still be more work on this. However, I think that AI will pose an enormous array of challenges that haven't really been appreciated, and the reason I think this is that I find it increasingly plausible that AI will lead to much accelerated rates of technological progress. So, imagine, as a kind of thought experiment, all the technologies and intellectual developments that you might expect to happen over the coming five centuries,
everything that might happen there. And now imagine all of that happens in the course of three years. Would we expect that to go well? In that period of time, we're developing new weapons of mass destruction. We now have an automated army, an automated police force, so that in principle all military power could be controlled by a single person. We have now created beings that plausibly have moral status themselves: what economic rights, welfare rights, political rights should they have? Potentially, and you talked about misinformation, we now have the ability to deploy superhuman persuasive abilities, far better than even teams of the best, most charismatic lawyers or politicians, in the most targeted ways possible. And there are more challenges too: over this period we'll probably also have new conceptual and intellectual insights, radically changing the game board for us. And all of that might be happening over the course of a very short period of time. Why might it happen in such a short period of time? Well, the classic argument goes back to I. J. Good in the sixties: once you get to the point in time when AI can build better AI, you've got a tight feedback loop, because once you've built the better AI, that can help you build better AI still, and so on. And that argument has been subjected to really quite intense scrutiny over the last few years: building it into leading growth models, really looking at the input-output curves in existing ML development, at how much of a gain you get for an increase in input.
And it really looks like the argument checks out. And that means that it's not long from the point in time when you've got your first AI that can significantly help you with AI research to having trillions upon trillions of AI scientists driving progress forward in all sorts of scientific domains. And that's just a really quite dizzying prospect.
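A minimal sketch of the I. J. Good-style feedback loop being described, in which research effort improves AI capability and improved capability feeds back as additional research effort. The parameters are arbitrary illustrative assumptions, not estimates from the growth models Will refers to:

```python
# Toy version of the recursive-improvement feedback loop: research effort
# improves AI, and better AI contributes more research effort next round.

human_effort = 1.0    # fixed human research effort (arbitrary units)
ai_capability = 0.01  # AI's initial contribution to research

for year in range(10):
    total_effort = human_effort + ai_capability
    # Capability growth is proportional to total research effort.
    ai_capability *= 1 + total_effort
    print(f"year {year}: AI research contribution ~ {ai_capability:,.2f}")

# Growth looks modest while ai_capability is small next to human_effort,
# then compounds explosively once the AI's own contribution dominates:
# the qualitative shape of the intelligence-explosion argument.
```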
I think misalignment and misinformation are some of the challenges we'll have to face. But it's really like facing all of technological progress at once, and doing it in an incredibly short period of time, such that I think the default outcome is not that we handle it well. Handle it well or not, I think we just birthed another topic for a future podcast, Will. Sure. There's a lot to talk about there. Okay, so finally, where has all of this controversy and confusion landed for you? Where does it leave you personally, and in terms of what you're now doing? And what's your view, optimistic, pessimistic, or otherwise, about EA going forward? Yeah, I mean, the collapse was extremely hard for me, there's just no doubt at all. It's been the hardest year and a half of my life. Partly it was just the horror of the harms that were caused, the incredible damage it did to organizations and people that I loved. And I found that very tough to deal with. I was in a really quite dark place. For the first time in my life, actually, for this chunk of time, I kind of lost the feeling of moral motivation. I didn't really know if I could keep going. So I did actually even think about just stepping back, really just giving up on EA as a project in my life, because it just felt kind of tainted.
And that was weird. I mean, it was weird not having that motivation. Well, what would have produced that effect? Is it just the public perception of EA becoming so negative? Is it, practically speaking, fewer funds going into EA organizations that need those funds? Was it funds getting clawed back because Sam Bankman-Fried or FTX had given those funds and that has now become legally challenged? What was the actual impact?
Yeah, the reason I was thinking maybe I was just going to give up was nothing practical like that. It was psychological. I really felt like I'd been punched or stabbed or something. I felt like I'd been building this thing for 15 years, and I'd really worked, you know, unsustainably hard on it. I was tired. And it had just been blown up in one fell swoop, blown up by Sam's actions. And it was just hard then to think, okay, I'm going to get back up off the ground and go into the shallow pond and rescue another child, or something. When you say it's been blown up, though, you're talking essentially about the brand damage to EA. Yes. With which you have been so closely associated, as really one of its progenitors. Exactly. And I was talking about just what was going on in my mind, not about what in fact happened. So we can talk about what the hit was to EA. So yeah, huge brand damage. I think people were definitely keen to disassociate themselves. In my own case, I actually got surprisingly little in the way of being cancelled. But it definitely made it harder for people, especially in the immediate aftermath, to go out and advocate. But I'm now feeling much more optimistic. It was really tough, though; the year afterwards, morale was low.
I think it was hard for people associated with EA to just do the work they wanted to do. But a few months ago there was a conference among many of the leaders of the most core organizations, and I honestly just found it really inspiring, because for so many people, the principles are still there. The problems in the world are still there. Sam's actions do not change in any way how serious the problems we're facing are, how important it is to take action, and how important it is to ensure that our actions have as big an impact as possible. So there really was a sense of people rolling up their sleeves and wanting to get back to it. And then in terms of the brand: I think once you get off Twitter, people again are quite understanding. People understand that the actions of one person don't reflect on an entire movement. And ultimately, not that many people have heard of either FTX or effective altruism, so there's still plenty of room to grow. And so I think we're starting to see things change again. And it's made me reflect on the good things that the effective altruism movement is accomplishing and has continued to accomplish. Just recently, GiveWell has now moved over two billion dollars to its top recommended charities. It's just insane for me to think of that amount, compared to where I was 15 years ago.
That's hundreds of thousands of people who are alive who would otherwise be dead. I live in Oxford, which has a population of about 120,000 people. If I imagine a nuclear bomb going off and killing everyone in Oxford, that would be world news for months. And if a group of altruistically motivated people had managed to come together and prevent that plot, saving the city of Oxford, that would be huge news. People would write about it for decades. And yet that's exactly what has happened, in not nearly as dramatic a way, perhaps, but actually several times over. And so now I think we're getting back to basics, back to the core principles of effective altruism, back to the idea that all people have moral claims upon us, that we in rich countries have an enormous power to do good, and that if we devote just some of our money, or some of our time through our careers, to those big problems, we can make a truly enormous difference. That message is as inspiring to me as ever. Yeah, well, I certainly could say the same for myself. As I've described, I've always viewed myself as being on the periphery of the movement, but increasingly committed to its principles. And those principles have been learned directly from you more than from any other source. As I said in a blurb for your most recent book, no living philosopher has had a greater impact on my ethics than Will MacAskill, and much of the good I now do, in particular by systematically giving money to effective charities, is the direct result of his influence. That remains true, and I remain immensely grateful for that influence. And so you should not read into the noise that resulted from what Sam Bankman-Fried did any lesson that detracts from all the good you have done, which many of us recognize, and all the good you have inspired other people to do. It's extraordinarily rare to see a species of philosophical reasoning result in so much tangible good, and in such an obvious increase in well-being among those whose lives it has touched. So you should keep your chin up and get back to your ultimately necessary work. It remains inspiring. And while you may no longer be the youngest philosopher I talk to, you still are nearly so.
So just keep going. That's my advice. Well, thank you so much, Sam. That really means a lot, that you feel that way. Especially because your advocacy has been unreal in terms of the impact it's had. I was saying that Giving What We Can has over 9,000 members now. Over 1,000 of them cite you, Sam, and this podcast as the reason they have taken the pledge. That's over 300 million dollars of pledged donations. And I guess it just goes to show that the listeners of this podcast are people who take good ideas seriously. You might think people who listen are just interested in ideas for their own sake, that they find them intellectually engaging. But no, actually, people are willing to put those ideas into practice and do something like take the Giving What We Can pledge. And that's just amazing to see. That's great. Well, we have another podcast teed up. Let me know when our robot overlords start making increasingly ominous noises, such that they're now unignorable, and we'll have a podcast talking about all the pressing concerns that AI has birthed. Because, yeah, I share your fears here, and it'll be a great conversation. Sure, I'm looking forward to it.