You know what I did this weekend, Paul? Maybe a little yard work, hanging out with you. Digital detox retreat. Oh, you got off the internet, good for you. I went upstate. They took my phone and they handed me, like, a bunch of carrots, like, with the greens still attached to them, and they said, this is your phone. Here you go. I said, this is not a phone. This is a carrot. But I got to tell you, after three days, what was I going to miss?
Like, what was I going to miss? It was just another weekend of me doomscrolling on X and Threads, war and suffering. What was I going to miss? You missed the single most interesting weekend the tech industry has had in forever. You say that every Monday. No, this one was good. Okay, what happened? Okay, there is a tool, it is a company called OpenAI. Oh, I know, we've talked about OpenAI at the beginning of the year. I like their products. They make ChatGPT and DALL-E and all these
different... all the things that are happening in AI, they are the nexus. They are led by a number of people, but their sort of star is this person Sam Altman, who is just, like, pure Y Combinator. The human embodiment of Silicon Valley. Okay. But they have a strange structure. They are a not-for-profit. I heard this before. Okay. So he owns none of it, is my understanding. Yes. This is not about money as much as it... well, but it is, but it's about
belief. So anyway, let's skip to it. Very strange governance structure, where they exist to create artificial general intelligence and then get it into the world. That is their mission, but they're going to build a business along the way, commercializing their research as they figure stuff out. Okay. And there's the little concern in this world that the AI will take over the
world and eat everything, or create something. We've talked about it. Yeah, exactly. So anyway, kind of this intense belief system, not-for-profit, mission-driven, but the company inside is now worth like $80 billion. Billions of investment from Microsoft, mostly in the form of Azure cloud credits. That's like six hours of server time. Five days ago, Sam Altman and Satya Nadella, the CEO of Microsoft, were on stage together. That's right. So I'm going to... I'll just, I'll do it
at a real fast clip. So the board fired Sam Altman. What? The board of the not-for-profit fired Sam Altman. They can do that? Boards by and large can fire CEOs. Yes. Well, almost always, but... and it's a strange board for an organization at this scale with this many investments. This is going to be a juicy scandal. Yeah, this one. What did Sam do? That's the thing. First everybody's like, what did Sam do exactly? But it just turns out that they didn't like what
he was doing. Like, they just thought that he was maybe a little too... right. It's still not clear. But the vibe that's coming out of this is that he was just too aggro about getting this into the marketplace, and that they felt that was counter to mission, and instead he should be slowing things down to get the artificial intelligence under control before it goes out into the world. I see. You know, ChatGPT 5 will just be too much like a human being, we got to roll this back. So we
don't know. And they fired him for that. They fired him for that. I guess they weren't able to come to an agreement, which I can see. I don't think that's a guy who agrees a lot if he doesn't want to. Okay. And who became the CEO? They named someone at the company, one of the sort of early founders, as CEO. She lasted for like two days. And then... but what happened is, this caught everybody by surprise, including Microsoft, a big investor in OpenAI. Big investor, Satya Nadella,
personally connected, goes to Dev Day. You know, like... so. So Altman is out. And then, like, high-level resignations follow, because people are like, what is this? And the way the board did it, like, they pulled somebody back from the board who would have supported Altman. So it was a little bit of a coup. Like, it definitely had coup vibes, which happens in corporate structures, especially in not-for-profit boards. I mean, what is wild here is that, like, you know, if this was
a food charity in Brooklyn, there would be this kind of drama. There often is. But it doesn't rise to international-news level. Okay. This isn't that exciting, Paul. A board removed the CEO. Well, what do you do? Yeah, except that this guy represents Silicon Valley. And then there was this big conversation, and the employees started to revolt. And then there was this big conversation and the investors were like, whoa, whoa, hold on a minute. That's who we invested in.
So now you have this, like, not-for-profit, and then you have this very profit-driven universe over here. And they're like, you put our boy back in there. We like our Sam. What have you done? What have you done? So employees are rising up. Investors are angry. Microsoft, I'm guessing, is angry. Yeah. And yes, and there's no clear communication out of this board, even to their own employees. Like, everybody is very confused as to what's happening.
All sorts of stuff is leaking. But the board isn't saying, like, here's why. They said that Sam hadn't been consistently candid. But then, like, the COO comes out and is like, no, no, there was nothing actually crooked here. It's just that there was a breakdown in communication. A board is allowed to do that. I mean, that's what a board is for, by the way. You don't have to have, like, a smoking gun for a board to remove a CEO. I mean, and I don't see any evidence that the board has
acted out of, like, some form of corruption. I think it's not that. I just think the universes did not align here. You have... okay. So with respect, Paul, still not very exciting, but keep going. Well, it's exciting because this ridiculous drama is just unfolding tweet by tweet. Altman goes back... maybe he's brought back into the company. Wait, okay, you didn't say that. I was chewing on endive on Sunday. Yeah. They asked him to come back. Well, he comes back to
discuss. Maybe he will be reinstated as CEO, but he's going to want to change the way the board works. So this is bananacakes at this point. Okay. And it's all happening in public. Did he come back? He did. And you had, like, Microsoft kind of clearly in the background going, hey, you need to work this out. But they didn't work it out. And there are all these deadlines being set. And it's all sort of filtering out. They didn't work it out. In the end, they hired the... I can't remember his name,
but a guy who used to be the CEO of... he was one of the co-founders of Twitch. As in the let's-watch-you-play-Minecraft Twitch. Yeah. And what I like, what I like about that... Twitch, Twitch. At which point Microsoft said, okay, we're going to give Altman and his co-founder, you know, carte blanche to build a huge research lab. And then... wait. Altman went to Microsoft. He went, he got a job at Microsoft. They're going to let him build
the research lab of his dreams. No. Yes. While also being heavily invested in OpenAI. And Microsoft's Satya Nadella is tweeting, like, listen, we're all going to get along here. We're all going to figure this out. Meanwhile, you just get the sense of, like, Galactus's hammer coming down in various places. So now 500 out of the 700 employees are saying, we're going to quit and go to Microsoft unless the board resigns. The board is silent. The new CEO, like... I mean, you know,
who knows where he is. Like, he's probably trying to get an Uber right now. Yeah. And that's where we are. That was the whole weekend. So welcome back. Glad to have you back in the city. Whoa, whoa, whoa. You've lost me on the last part. So now Altman and his senior team member, his co-founder, went to Microsoft as employees. Well, let's onboard you. Yes. Okay. And there's a new CEO at OpenAI who is the former CEO of the video-game-recording-videos company.
The Twitch guy? Yes. And in the background of all this, there's one of the AI guys, like, the lead researcher, the sort of heavy-duty engineer... Ilya something. That's right. Ilya is in there. And everybody thought he was sort of... he was connected to this. Like, he really was part of this decision. And now he's tweeting, or Xing, he's Xing, that he feels really bad about how this all happened, and that he's broken up the whole company. He feels terrible.
And he'll do anything to put it all back together. Sure. So that's tricky, because he's on the board. So literally he's going, like... he probably was a key voice in the board decision. Of course. There's like three, four people on the board that fired him. That's worth mentioning. There's a four-member board, a little tiny board. Little board, little guy. Yeah. And so, so he also signed the letter saying that he'll quit and go to Microsoft. He signed the letter saying he'll
quit and go to Microsoft. Wow. Yeah. So that's, that's a lot. Now, either Altman and his team go back and run the show and the board resigns, and maybe people leave, or 500 people say they're going to quit and go to Microsoft, leaving this giant company kind of a shell, with only the remaining people. Okay. And a board that has kind of lost everybody's faith. So it's a very, very volatile set of corporate shenanigans. It is Monday, November 20th when we're recording this podcast.
This hasn't fully played out yet. No, but also, I mean, we're watching this, right? Like, the world is a mess right now. Like a mess, a capital-M mess. And there is just a sense that these people keep saying this is the most important thing in the universe while they're, you know, trying to keep artificial intelligences that don't actually work yet under control. Yeah. No, that's all made-up sci-fi nonsense. So there's this element of just watching people
role-play and get up to all their shenanigans. Sure. And it was, like, both high-stakes but also extremely ridiculous. Yeah. Situationally. Yeah. And it's probably not done. Oh, not finished. No, no, and there'll be lawsuits forever. Yeah. So this was a recap. If you want to jump to whatever minute it is in the podcast or YouTube video, you can. It's worth noting we were so pumped up about talking about this, and I was away all weekend, that we didn't introduce
ourselves or mention what this podcast even is. It's a great point, Rich. Yeah. But that's okay. It doesn't matter. This is the Aboard podcast. I'm Rich Ziade. And I'm Paul Ford. And you know what? There's metadata in the podcast and on the YouTube video. We don't need AI for that. I mean, it's all good. Everybody relax. They know who we are. This, um, you know... first off, I'm not going to get into the predictions game. Don't get into the predictions game. I'm not
going to. I'm going to get into the observation game. And this is not that shocking. Because of the structure of the organization? I think... no, yeah, well, the structure of the organization and the way this played out are kind of both symptoms of the same thing, which is that in the last 10 years, we have watched technologists ride such absolutely astronomical attention growth and value growth that nobody can get their bearings. It's like, you ever see the videos of
astronauts who are getting trained for G-forces? Yeah. And their faces get kind of mushed out. Yeah. And their goal is to not black out. They can't, right? They can't fly the fighter plane. Gotta stay awake when you ride the big horse. Meanwhile, like, all the blood in your head is going down to your toes and then back up to your head, or whatever happens under G-forces. This is happening. And what it does is it creates an absolutely Technicolor
reality for these people who hit this absolute home run. Elon Musk, Sam Bankman-Fried. To the point where money goes from a number to an absolute abstraction, you go from a businessperson to actually the savior of humanity. And you lose all bearings. Well, many engineers have a fantasy of themselves as, like, knowledge doctors. They organize the world using code. And there's a lot of talk about different kinds of codes of thinking and operating and understanding.
So when you and I saw this happening, here's what it felt like to me. It felt like engineering had finally figured out how to get product out of the building. They're like, we will finally be able to do this right and ethically and well. And we will get rid of these horrible deadlines that keep us shipping the software to these people without having a perfect ethical framework in place. Let's do that. And the board said, absolutely. And then, like, humans got involved and it all went
out the window. What ends up happening, I think, is that that tidy algorithm, that tidy library that you so rely on, or that massive computing power... you think you can redirect it and apply it to people. Well, can I go meta for one sec? Because this was incredibly political, what happened over the last three days. Here's the reality. There is this fantasy inside of this discipline, meaning AI, and especially, like, LLM-based AI, large language models, large X models, that they're going to crack
human consciousness by going down this path. And I used to work in AI in my 20s, and I've sort of always kept an interest in all these arguments. It's one of those things that's always right around the corner. And I actually just think that, like, it's so infinitely slippery, and there is always this fantasy that somehow we're going to get humans kind of wrapped up and make sense of them. I think the past few days were incredibly human. They were. They were. Yes.
And I think what it taught us is, and we've seen this lesson again and again, that the comp-sci geniuses think they can sort of universalize their intelligence and apply it to the world. And they're validated so often, with so much money and so much opportunity. That's exactly it. And I want to shift gears here and talk about a story around Michael Jordan. Michael Jordan was the greatest basketball player to ever live at the time when he played. You
can argue easily that he's still the greatest basketball player to ever play. And he decided, all right, I solved this. I won like four championships. I'm going to go play baseball. Right. And then he went to the minor leagues and he did terribly. Like so badly that he went back to basketball. Even in the minor leagues. In the minor leagues. He didn't make it to the majors. And you know, it was great for the, you know, the minor league team because Michael Jordan's
coming to play baseball will sell a lot more corn dogs. Yes. That is true. So he played. And then they asked him, like, years later, what happened with that? You know, you tried baseball. Why? You know, you're an incredible athlete. You know what he said? I couldn't hit a curveball. Just couldn't do it. He just couldn't do it. Yeah. And that was very humbling for him, by the way, because this guy was not just, like, a star player. This was like an iconic
global phenomenon. One of the greatest athletes who ever lived. Well, ever lived, yeah. He jumped to another discipline in some Podunk town in North Carolina and couldn't hit a curveball. No, I know. I'm trying to learn piano. I practice for about an hour. That's incredibly humbling. I'm going to continue to be trying to learn piano for the next five years, and then maybe I'll be pretty good. Yes-men. Politicians know what yes-men are. They understand what they are.
You know who doesn't know what yes-men are? Computer scientists, computer engineers. No, that's right. They're like, holy moly, I am actually right. I'm right. I'm always right. I'm right. I'm right. We don't need to go the same route. Why... get the business people, get the MBAs out of here. We'll do the org chart. Well, I want to come back to this because I think we're talking around it. I'm going to do the org chart in Obsidian. That's real, right?
We're going to, you know... you can see this when they always want to put law in GitHub. So it's like, we'll just track versions and that will make a better society. Yeah. This is a particular culture. It's a monoculture in Silicon Valley. And there is a set of beliefs. And it's almost a religion, almost a millenarian religion, sorry, it took me a minute, about, like, how the world is going to end unless they step in and
manage how the AI is being released. Let me... so, okay. So there's this whole framing. And so they make a not-for-profit and so on. Let's get down a little bit to brass tacks. So I watched your lawyer brain explode over the weekend because of the organizational structure of this business. You saw the org chart and you just laughed and laughed and laughed and laughed. Explain that. Why? Well, I mean, it's worth framing what you described there.
Describe the org. OpenAI is a nonprofit. Okay. That owns another nonprofit. So those are good. I like nonprofits. I give them money. You invest in a lot of not-for-profits. We think they're good. Yeah. But it happens that in the belly of this nonprofit is a for-profit. So how does that... you can't do that? Well, as far as I know, and I don't know every board, like, first off, you've got three major entities here. Right. And the fact that you don't have
separate governance rules for each of them, and the fact that there is no other board for the, like, lower company, which happens to be worth tens of billions of dollars, and instead you have essentially, like, sort of a star chamber sitting up at the top, is bananacakes. By the way, the idea of putting a for-profit box inside of a bigger nonprofit box is incredibly entertaining, and it's just a recipe for disaster. Right. Because what you're going to have is, for-profit businesses are
designed to grow and to move quickly and to navigate commercial waters. Right. And nonprofits are usually very idealistic and mission-driven. And for this one, I think what's interesting about this nonprofit is that the goals around it are actually aspirational. It's not, our goal is to feed a million people. It's not that. Right. In fact, it's, we're going to keep this within certain
parameters and make sure it doesn't kill us. Interesting. So the parent organization has an anti-growth goal, which is, we need to regulate and understand this from the beginning. The child organization is fiendishly focused on all forms of growth. It's a... you know what it's like, you know, there are nonprofits that are, like, trying to cure diseases. Yeah. But you can't put in the charter, like, you must cure it by 2025. Like, that's not in the charter. So
it's a research nonprofit. I mean, so it's like if in the '90s I had an organization dedicated to identifying the patterns that make the best window-based operating system, but really keeping it on rails so that it was the best, most optimal solution for everyone. And then Microsoft came along, giving $10 billion to build Windows 95 inside of it. Yeah. And there are commercial mechanisms for making money. Microsoft has stockholders. They have shareholders.
They have to make money. So whatever... well, I don't know the terms of the deal, but they're going to make money off of this thing. Well, the terms of the deal are, I mean, yes, they get rights to sort of all the IP, unless they come up with an artificial intelligence that really works, like, like it's a baby. And then Microsoft can't have that. They get to keep that. Is that in the agreement between them? Yes, because everyone is bananas.
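(A minimal sketch, not from the episode: the governance shape being described here, written out in Python purely as an illustration. The entity names, board size, and fields below are assumptions, not the actual legal entities or deal terms; the point is just that the only board in the whole tree is the tiny one at the top, even though the thing at the bottom is worth tens of billions.)

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class Entity:
    name: str
    mission: str
    board: list[str] | None = None  # only the top entity has a board at all
    subsidiaries: list[Entity] = field(default_factory=list)


# Illustrative shape only: a nonprofit parent with one tiny board, and a
# capped-profit operating company underneath it with no governance of its own.
openai_shape = Entity(
    name="Nonprofit parent",
    mission="Keep AGI within certain parameters (aspirational, anti-growth)",
    board=["director 1", "director 2", "director 3", "director 4"],
    subsidiaries=[
        Entity(
            name="Capped-profit operating company",
            mission="Commercialize the research, grow fast",
            board=None,
        )
    ],
)


def boards(entity: Entity) -> list[str]:
    """Walk the tree and report which entities actually have their own board."""
    found = [entity.name] if entity.board else []
    for sub in entity.subsidiaries:
        found.extend(boards(sub))
    return found


print(boards(openai_shape))  # only the parent shows up: the 'star chamber' at the top
```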
Okay. So this goes back to the brains in the room. And the brains in the room thought that they could write essentially an SDK around how all this is going to play out, and, let's reboot the way that all business works in America. Yeah. And the truth is, what you described over essentially a 72-hour period is a disaster. Right. And it's a disaster not because they made the wrong decisions. It's a disaster because the implications of codifying these relationships over such a short period
of time will play out in the next two weeks. Essentially when they, like, take away Sam Altman's Mac and give him a Dell Inspiron. Like, these are things that... it takes weeks to get one executive to sort of understand the contours of what the relationship is going to be like. You can't just... these arranged-marriage kind of situations never play out well. They just
don't. There's another element here, which is, I believe that, you know, the board could have said, we are out of alignment with our mission and we must do whatever it takes to get back to our mission. That is our function as a not-for-profit board. I totally get that. Do you agree? I don't know.
I don't know. I think this whole thing is silly, right? Like... so I want to get back to my other point, but I'll just tell you: I think this is a very interesting technology, and the conversation around it is utterly self-indulgent and ridiculous. It is a lot of egos, a lot of incredibly successful... they've seen success. They have become the darlings not just of, like, Silicon Valley and technology. They've become kind of the darlings of the world. They
travel... they travel around the world and meet with leaders. Yeah. They're meeting prime ministers. That's right. They're diplomats, they're sort of like tech diplomats. And they're self-appointed, and they're self-appointed because they have all the money. Yeah. I get all this. I think this is a really fascinating, interesting, enabling technology. I feel that it produces a lot of absolute nonsense, not just in the way that it's... but, like,
it's not really a usable technology yet, but boy, is it novel. It is capable of doing things. It's an interesting time, no doubt about it. Yeah. And so, you know, him being blindsided says as much about him as it says about the board. I mean, I think ultimately here this was some classic... let me rephrase it. Let me boil it down. When you give people power, or the opportunity to use power, they tend to use it. They just tend to use it. Sam became the
darling. He was sort of the face of OpenAI. This board has immense power, right? And I don't think it's as simple as, like, well, you're misaligned with our goals or whatever. They could pick up... they could throw a meeting and say, hey dude, we're the board. Yeah. This is... yes. Had you heard of these people before this weekend? I've heard of Ilya, like, in passing. Yeah. Not really. I have, a little bit.
Yeah. I mean, Altman is, he is the Michael Jordan right now of this whole movement. Right? And so what you have here is, people tend to get real riled up about pushing the red button and thinking, wow, check it out. And then it's, like, two days later and everyone is looking at the scorched earth all around. Let me wrap this in a bow. Put it in a bow for you. Put it in a bow for you. This goes back to, I think, the theme of this podcast. You know what this group of geniuses is not
thinking about? What, HR? No. They're not thinking about any of it. They have put themselves in a domain of, like, humanity is waiting for us. They are not... you say 700 employees. You know how utterly uninteresting and quaint that sounds to that group of people? Because super-robots are going to take over the universe. They are convinced they are stewards of the future of humanity. And you're talking about 700 employees scattered around California and the United States and around the
world or wherever. Not interesting. Really boring. HR handbook? Really, really boring. That's not what's going on here. That's, I think, what has been... by the way, you could argue Sam falls into that category, because he probably hasn't been in the office in six months. Right. So that's worth whatever. But, like, I agree with you. Like, that's the decision-making that's at work here. Do you know how weak the COO or the VP of HR is in that company? Do you have any idea?
It's like, what are you talking about, 500 employees? We have to worry about eight billion. They are... and I think also they're just told, here's another hundred million, go get me three more engineers. Right? Like, just... Yeah. That's all. They were in an alternate reality that was just dripping with, like, gratifying maple syrup all over the walls. It was just another time and place. I'll tell you what, though. If people had been thinking about those 700 people
on all sides, you wouldn't have this kind of chaos. You wake up... I say this to myself every morning: you're not that big of a deal. I say it every morning. I say it to you too. You're getting there. I say it to you. I kind of have to say it to you, like, five times a day. I'm okay with it, and I thank you for that. Well, it's how we keep this under control. You're not that big of a deal. You are not the steward. You know who needed to hear that? Sam Bankman-Fried needed to hear that. You're not that.
I don't think any of these people are... somebody made a good point, which is, like, if Sam Bankman-Fried had had a good board, he would have been fired. He wouldn't be in jail. It's the same script. Elon Musk, Sam Bankman-Fried, these people... these are people who are seeing, like, outlandish success and adoration so quickly. Can I tell you something that bums me out in all of this? They get excited by the abstractions. They're excited about humanity in the abstract.
Exactly. But they're indifferent to humanity on the ground. Absolutely. I feel that that is a genuine failure of their class. Absolutely. They should care about human rights and civil rights, and they should care. And there are lots of previous examples of people who achieved that kind of power. And what they care about instead is this sort of imaginary future state that's very abstract. You're nailing it.
You know, the whole effective altruism message skips over the neighbors. It skips over the 500 employees. Look, Bankman-Fried wanted to, like... he was like, I don't want any of this money. I'm going to use all of it to make the world a better place. I'm not going to buy pants with it, that's for sure. I'm not going to buy pants with it, that's for sure. But what happened? Literally, families who had invested in this fund saw their money disappear because of malfeasance,
right? It's indistinguishable from Madoff, except much, much larger. And also driven not by a scheme, but rather, like, this sort of long-term goal of fixing the world, where you're going to actually cause damage near-term, right? And that's just ego. That's hubris. That's all this is. It's hubris. It's why we have law. Because then you can't go ahead and break the law just for some imagined future state. Or governance. There will be no law
that is going to govern how an OpenAI is run. But there's a reason why those boring operating agreements exist. Also a great argument for the separation of church and state. And especially since AI is approaching religion for real, which is, like, you don't actually want people who can say, here's how we're going to enact policy in order to bring the apocalypse around sooner. No, but people do. They want to. They want to get that going. They want Jesus back.
Yeah. There's a group in Israel that's trying to breed a red heifer to bring the... just Google red heifer, my friend. Incognito first. Yes, definitely. You're in a world here where, like, those belief systems start to get so profound, and people start to act on them, to the point that they trump the basic norms of society. And that's bad stuff. I want to close with this. It feels like the waters of Silicon Valley have sort of leaked into the rest of culture.
Well, it did with crypto, right? Like, crypto was perfect that way, because I would go over to somebody's house and they'd be on Discord trading. And I'd be like, you work... you own a shoe store. I was very confused. Yeah. Yeah. And here we are. I find this all a little... it's a good reboot. Like, better now. Like, imagine if AI was way further along and there was some giant
committee that had a meltdown. Like, this thing is still a baby. Yeah. And we just got reminded that human dynamics and politics... speaking of superalignment, human alignment turns out to be really frickin' tricky. There's another, broader point in this, which is Microsoft. The degree of Microsoft's power that was exerted over the weekend was quite remarkable. Yeah. We may pay the price by having, like, a collection of absolute garbage applications
come out of Microsoft because of this. I'm just going to let you know that all future currency will be Azure credits. That is, like... so I'm going to... So anyway, this is the Aboard podcast. Yeah. What does it have to do with a board? Well, we're building a startup. Yes. We are a for-profit entity, unapologetically. We don't have a board. We don't have a board. We may never. No, not after this. No, you and I are on the board. And you should check us out at
aboard.com. If you want to see something that is kind of the opposite of truly speculative technology, something that is actually designed to help you do stuff. What have you been using it for lately? I'll ask you that, and then I'll tell you what I've been using it for. A lot of, like, notes. I use it to take notes. Me too. And I don't share those
notes. No, I'm with you on that. I do too. They're like Post-its. Aboard has cards, and I treat cards like Post-its, which makes me, like, want to ask for features that are more for that stuff, but I'm not going to. I've been wrestling with the same thing. I went to a community board meeting about climate resilience. You just want to put data in. I just started taking notes in the app. It worked very well. I'm also using it... we're doing a family get-together in DC, because that's
kind of in the middle for everybody. So I'm using it to kind of track the restaurants that we're calling, because we want to get a room. And just... so anyway, check it out. Practical, straightforward, simple things like that. Yeah. It's got a little AI, but it won't get in your way. Check it out at aboard.com. Check it out. All right. Thanks for watching if you're on YouTube, thanks for listening if you're on the podcast.
Like, subscribe, five stars, give us all the love. Please, we need it very, very badly. We're just a little tiny startup in a world of really big startups and stuff like this. Have a lovely week. Goodbye, everybody.