An OpenAI and Shut Case

November 21, 2023  ·  32 min 1 sec

With Sam Altman fired from OpenAI, there was so much drama in tech that it was impossible not to discuss it. With no secret insight but remarkable aplomb, Paul and Rich dive deep on the passionate subject of corporate governance, and try to understand just what the hell is going on.

Transcript

Rich: You know what I did this weekend, Paul?

Paul: Maybe a little yard work, hanging out with the kids. 

Rich: Digital detox retreat.

Paul: Oh you got off the internet. Good for you.

Rich: I went upstate.

Paul: Mmm hmmm.

Rich: They took my phone and they handed me like a bunch of carrots, like with the, like greens still attached to them and they said, “This is your phone.”

Paul: Here you go.

Rich: And I said, this is not a phone. I ate one of the carrots—

Paul: Yeah?

Rich: But I got to tell you after three days, what was I going to miss? Like, what was I going to miss? It was just another weekend of me doomscrolling on, on X and Threads and war and suffering. What did I, what was I going to miss?

Paul: You missed the single most interesting weekend the tech industry has had in forever.

Rich: You say that every Monday.

Paul: No, this one was different.

Rich: Okay. What happened?

Paul: Okay, there’s this tool, it’s by a company called OpenAI.

Rich: Oh, I know, we’ve talked about OpenAI at length.

Paul: I like their products. I use their products. They are…

Rich: They make ChatGPT.

Paul: And DALL-E, and all these different, all the things that are happening in AI, they are the nexus. And they’re led by a number of people, but their sort of star is, slash was, this person, Sam Altman. Who is just, like, pure Y Combinator.

Rich: Right, the CEO.

Paul: The human embodiment of Silicon Valley.

Rich: Okay.

Paul: But they have a strange structure. They’re a not-for-profit.

Rich: I’ve heard this before.

Paul: Okay, so…

Rich: And he owns none of it, is my understanding.

Paul: Yes, this is not about money as much as, well, it is, but it’s about belief. So anyway, let’s skip to it. Very strange governance structure, where they exist to create artificial general intelligence and then get it into the world. That is their mission, but they’re going to build a business along the way, commercializing their research as they figure stuff out.

Rich: Okay.

Paul: And they’re a little concerned in this world that the AI will take over the world and eat everything or create…

Rich: We talked about it.

Paul: Yeah, exactly. So anyway, it’s kind of this intense belief system, not-for-profit, mission-driven, but the company inside is now worth like $80 billion and has billions of investment from Microsoft.

Rich: Yes.

Paul: Mostly in the form of Azure cloud credits. That’s like six hours of server time.

Rich: Five days ago, Sam Altman and Satya Nadella, the CEO of Microsoft, were on stage together.

Paul: That’s right. So I’ll just do it in a real fast clip. The board fires Sam Altman.

Rich: What?!

Paul: They, the board of the not-for-profit fired Sam Altman. They can do that, the board is not…

Rich: Boards can fire CEOs, very often. 

Paul: Yes.

Rich: Uh, well, almost always, but…

Paul: And it’s a strange board for an organization at this scale with this many investments.

Rich: This is gonna be a juicy scandal. 

Paul: Yeah well… 

Rich: What did Sam do?

Paul: That’s the thing, um, first everybody’s like, “What did Sam do?” Exactly. But it turns out that they just didn’t like what he was doing, like they thought that he was maybe a little too com—it’s still not clear, but the vibe that’s coming out of this is that he was just too aggro about getting this into the marketplace, and that they felt that was counter to the mission, and that instead he should be slowing things down to get the artificial intelligence under control before it goes out into—

Rich: I see…

Paul: ChatGPT-5 will just be too much like a human being. We gotta roll this back so we don’t—

Rich: Okay, and they fired him for that?

Paul: They fired him for that. I guess they weren’t able to come to an agreement, which I can see. I don’t think that’s a guy who agrees a lot, if he doesn’t want to.

Rich: Okay, and who became the CEO?

Paul: They named someone at the company, the CTO, Mira Murati, as interim CEO. For, like, two days. And then, what happened is, um, this caught everybody by surprise, including Microsoft.

Rich: A big investor in OpenAI.

Paul: Big investor. Satya Nadella personally connected, goes to DevDay.

Rich: Yup.

Paul: So, so Altman is out. And then, like, high-level resignations followed, because people were like, “What’s this?” And the way the board did it, like, they pulled somebody off the board who would have supported Altman. So it was a little bit of a coup. Like, it definitely had coup vibes.

Rich: Which happens in corporate structures.

Paul: Especially in not-for-profit boards. I mean, this is like…

Rich: Bur, bur, burrr…

Paul: Yeah. What is wild here is that, like, you know, if this was a food charity in Brooklyn, there would be this kind of drama. There often is. But it doesn’t rise to an international news level.

Rich: Okay, this isn’t that exciting, Paul. A board removed the CEO, big whoop-de-doo.

Paul: Well, yeah, except that this guy represents Silicon Valley. And then there was this big conversation, and the employees started to revolt, and the investors were like, “Whoa, hold on a minute, that’s who we invested in.” So now you have this not-for-profit, and then you have this very profit-driven universe over here, and they’re like, “You put our boy back in there. We like our Sam. What have you done? What have you done?”

Rich: So employees are rising up, investors are angry, Microsoft, I’m guessing is angry?

Paul: Yeah, and, yes, and there’s no clear communication out of this board, even to their own employees. Like, everybody is very confused as to what’s happening. All sorts of stuff is leaking, but the board isn’t saying, like, here’s why. They said that Sam hadn’t been consistently candid, but then, like, the COO comes out and is like, “No, no, there was nothing actually crooked here. It’s just like there was a breakdown in communication.”

Rich: A board is allowed—I mean, that’s what a board is for, by the way. You don’t have to have, like, a smoking gun for a board to remove a CEO.

Paul: I mean, I don’t see any evidence that the board has acted out of, like, some form of corruption.

Rich: Malice or…

Paul: No, I just think the universes did not align here.

Rich: Okay, well with respect, Paul, still not very exciting. But keep going…

Paul: Well, it’s exciting because this ridiculous drama is just unfolding tweet by tweet. Altman goes back in to maybe be brought back into the company.

Rich: Wait, okay, you didn’t say that. I was chewing on endive on Sunday. 

Paul: Yeah.

Rich: They asked him to come back?

Paul: Well, he comes back to discuss whether maybe he’ll be reinstated as CEO, but he’s going to want to change the way the board works. So this is banana cakes at this point. And it’s all happening in public.

Rich: Okay, did he come back?

Paul: He did, and he had, like, Microsoft kind of clearly in the background going, “Hey, you need to work this out.” But they didn’t work it out. There were all these deadlines being set, and it was all sort of filtering out, and they didn’t work it out. In the end, uh, they hired, um, I can’t remember his name, but the guy who used to be the CEO of Twitch. He was one of the co-founders of Twitch.

Rich: Twitch, the “let’s watch you play Minecraft” Twitch?

Paul: Yeah. What I like about Twitch…

Rich: Twitch? That twitch?

Paul: That Twitch. At which point Microsoft said, “Okay, we’re going to give Altman and, uh, his co-founder, you know, carte blanche to build a huge research lab.” And then, uh…

Rich: Wait… Altman went to Microsoft? 

Paul: He went, he got a job at Microsoft. They’re going to let him build the research lab of his dreams.

Rich: No.

Paul: Yes, while Microsoft is also heavily invested in OpenAI. And Satya Nadella was tweeting, like, “Listen, we’re all going to get along here. We’re all going to figure this out.” Meanwhile, you just get the sense of, like, Galactus’s hammer coming down in various places.

Rich: Yeah, yeah.

Paul: So now, 500 out of the 700 employees are saying, “We’re going to quit and go to Microsoft unless the board resigns.” The board is silent, the new CEO, like, you know, who knows where he is? Like, he’s probably trying to get an Uber right now.

Rich: Yeah.

Paul: And, um, and that’s where we are. That was the whole weekend, so welcome back. I’m glad you’re back in the city.

Rich: Whoa, whoa, whoa, whoa. So you glossed over the last part. So now, Altman and his, some team member, senior per—

Paul: Co-founder.

Rich: Co-founder, went to Microsoft as employees?

Paul: Well, they’re…

Rich: Like let’s onboard you?

Paul: Yes. Yes.

Rich: Okay, and there’s a new CEO at OpenAI who is the former CEO of the video-game-streaming site Twitch? That guy?

Paul: Yes, and in the background of all this, there’s, um, the lead AI researcher, the sort of heavy-duty engineer.

Rich: Ilya something—

Paul: That’s right. Ilya is in there, and everybody thought he was connected to this. Like, he really was part of this decision. And now he’s tweeting—or X-ing, he’s X-ing—that, uh, he feels really bad about how this all happened and that he’s broken up the whole company. He feels terrible and he’ll do anything to put it all back together.

Rich: Sure.

Paul: So that’s tricky, because he’s on the board. So it’s literally, he’s going like…

Rich: Probably was a key voice in the board decision.

Paul: Of course, there’s like three, four people on the board. They fired everybody else.

Rich: That’s worth mentioning, there’s a four-member board.

Paul: A little tiny board. Little boy, little guy.

Rich: Little guy.

Paul: Yeah, yeah. And so, so he also signed the letter saying that he’ll quit and go to Microsoft.

Rich: He signed the letter.

Paul: Saying he’ll quit and go to Microsoft.

Rich: Wow.

Paul: Yeah, so that’s, that’s a lot. Now, either Altman and his team go back and run the show.

Rich: Mmm hmmm.

Paul: And the board resigns.

Rich: Mmm hmmm.

Paul: Or, and maybe people leave, or 500 people say they’re going to quit and go to Microsoft, leaving this giant company kind of a shell. There’s nothing there but the people.

Rich: Okay.

Paul: Um, with a board that has kind of lost everybody’s faith. So it’s a very, very volatile set of corporate shenanigans.

Rich: Okay, this is Monday, November the 20th, when we’re recording this podcast. This hasn’t fully played out.

Paul: No, but also, I mean, we’re watching this, right? Like, the world is a mess right now. Like a mess, a capital-M Mess. And there is just a sense of, like, while these people keep saying this is the most important thing in the universe, and they’re, you know, trying to keep the artificial intelligences that don’t actually work yet under control, um, none of that is real, that’s all made-up sci-fi nonsense. So there’s this element of just watching people roleplay and get up to all their shenanigans.

Rich: Sure.

Paul: Um, and it was, like, both a high-stakes but also an extremely ridiculous situation.

Rich: Yeah, and which is probably not done.

Paul: Oh no.

Rich: Probably not finished.

Paul: No, no. And there’ll be lawsuits forever, right?

Rich: Yeah. So this was a recap. If you want to jump to whatever minute it is in the podcast or YouTube video, uh, you can. It’s worth noting, we were so pumped up about talking about this—I wasn’t away all weekend—that we didn’t introduce ourselves or mention what this podcast even is.

Paul: That’s a great point, Rich.

Rich: Yeah, uh, but that’s okay. It doesn’t matter. This is the Aboard podcast. I’m Rich Ziade.

Paul: And I’m Paul Ford. And you know what? There’s metadata in the podcast and on the YouTube video.

Rich: We don’t need AI for that.

Paul: I mean…

Rich: It’s all good. 

Paul: Everybody relax. They know who we are.

Rich: This, you know, first off, I’m not going to get into the predictions game.

Paul: Don’t. Don’t get into the predictions game. 

Rich: I’m going to get into the observation game, and this is not that shocking.

Paul: Because of the structure of the organization?

Rich: I think—no, yeah, well, the structure of the organization and the way this played out are kind of both symptoms of the same thing.

Paul: Mmm hmmm.

Rich: Which is that in the last ten years, we have watched technologists ride such astronomical attention growth and value growth that nobody can get their bearings. It’s like, you ever see the videos of astronauts who are getting trained for G-forces?

Paul: Yeah. Their faces—

Rich: And their faces are kind of getting mushed out?

Paul: Yeah. 

Rich: And there, the goal is to not black out. If they black out, they can’t fly the fighter—

Paul: Gotta stay awake when you’re riding the big horse. 

Rich: Meanwhile, like, all the blood in your head is going down to your toes and then back up to your head, or whatever happens under G-forces. And what it does is it creates an absolutely technicolor reality for these people who hit this absolute home run. Elon Musk…

Paul: Mmm hmmm.

Rich: Sam Bankman-Fried. To the point where money goes from a number to an absolute abstraction. You go from a business person to actually the savior of humanity.

Paul: Mmm hmmm.

Rich: And you lose all bearings.

Paul: Money—engineers have a fantasy of themselves as, like, knowledge doctors. They organize the world using code, and there’s a lot of talk about different kinds of codes of thinking and operating and understanding. So when you and I saw this happening, here’s what it felt like to me. It felt like engineering had finally figured out how to get product out of the building. They’re like, “We will finally be able to do this right and ethically and well, and we will get rid of these horrible deadlines that keep us shipping this software to these people without having a perfect ethical framework in place. And let’s do that.” And the board said, “Absolutely.” And then, like, humans got involved and it all went south.

Rich: What ends up happening is that that tidy algorithm, that tidy library that you so rely on, or that massive computing power, you think you can redirect it and apply it to people.

Paul: Well, can I go meta for one sec?

Rich: Because this was incredibly political, what happened over the last week.

Paul: This is… here’s the reality. There is this fantasy inside of this discipline, meaning AI, and especially, like, LLM-based AI, large language models, large X models, that they’re gonna crack human consciousness by going down this path. And, you know, I used to work in AI in my twenties, and I’ve sort of always kept an interest in all these arguments.

Rich: Mmm hmmm.

Paul: It’s one of those things that’s always right around the corner. And, I actually just think that, like, it’s so infinitely slippery and there is always the fantasy that somehow we’re going to get humans kind of wrapped up and make sense of them.

Rich: I think the past few days were incredibly human in a lot of ways. 

Paul: They were. Yes. 

Rich: Right. And I think what it taught us, and we’ve seen this lesson again and again, is that the comp-sci geniuses think they can sort of universalize their intelligence and apply it to the world, right? And the…

Paul: Well, they’re validated so often with so much money and so much opportunity.

Rich: That’s exactly it. But here, I want to shift gears and talk about a story.

Paul: Okay.

Rich: Around Michael Jordan. Michael Jordan was the greatest basketball player to ever live, at the time when he played. You could argue easily that he’s still the greatest basketball player to ever play. And he decided, all right, I solved this. I won three championships. I’m going to go play baseball.

Paul: Right.

Rich: And then he went to the minor leagues and he did terribly, like so badly that he went back to basketball. 

Paul: Even in the minor leagues.

Rich: In the minor leagues, he didn’t make it to the majors. And you know, it was great for the, you know, the minor league team. Cause Michael Jordan’s coming to play baseball, we’ll sell a lot more corn dogs.

Paul: Yes, that is true. 

Rich: So he played, and then they asked him, uh, like, years later: What happened with that? You know, you tried baseball. Why didn’t it work? You’re an incredible athlete. And you know what he said?

Paul: Hmm?

Rich: “I couldn’t hit a curveball.”

Paul: Just couldn’t do it?

Rich: He just couldn’t do it.

Paul: Yeah.

Rich: And what that, it was very humbling for him, by the way. Because this guy was not just, like, a star player. This was, like, an iconic global phenomenon.

Paul: One of the greatest athletes who ever lived.

Rich: Who ever lived.

Paul: Yeah.

Rich: Jumped to another discipline in some podunk town in Alabama and couldn’t hit a curveball.

Paul: No, I know. I’m trying to learn piano. I practice for about an hour a day.

Rich: It’s incredibly humbling.

Paul: I’m going to continue to be trying to learn piano for the next five years and then maybe I’ll be pretty good.

Rich: Yes-men… politicians know what yes-men are. They understand what they are. You know who doesn’t know what yes-men are? Computer scientists, computer engineers.

Paul: No, no that’s true.

Rich: They’re like, “Holy moly. I am actually right. I’m right. I’m always right. I’m right. I’m right.” We don’t need to go the same route. Get the MBAs outta here. We’ll do the org chart.

Paul: Well I want to come back to this because I think we’re talking around—

Rich: I’m gonna do the org chart in Obsidian.

Paul: That’s real, right? You can see this when they, they always wanna put law in GitHub. So, it’s like, we’ll just track versions and that will make a better society.

Rich: Sure, yeah.

Paul: This is a particular culture, it’s a monoculture in Silicon Valley, and there is a set of beliefs, and it’s almost a religion, almost a millen, millen, millennial, millenarian religion, sorry, I took a minute, about, like, how the world is gonna end unless they step in and manage how the AI is being released. Okay, so there’s this whole framing, and so they make a not-for-profit and so on. Let’s get down a little bit to brass tacks, because I watched your lawyer brain explode over the weekend. The organizational structure of this business: you saw the org chart and you just laughed and laughed and laughed. Explain that. Why?

Rich: Well, I mean, it’s worth framing what the structure looked like. 

Paul: Describe that, describe the organization. 

Rich: OpenAI is a non-profit.

Paul: Okay.

Rich: That owns another non-profit.

Paul: So those are good. I like non-profits. I give them money. You invest in a lot of not-for-profits. We think they’re good.

Rich: Yeah, but it happens to be that in the belly of this non-profit is a for-profit.

Paul: So how does that—you can’t do that.

Rich: Well, as far as I know, and I don’t know every board, like, first off, you’ve got three major entities here.

Paul: Right. 

Rich: And the fact that you don’t have separate governance rules for each of them. And the fact that there is no other board for the, like, lower company, which happens to be worth tens of billions of dollars.

Paul: Mmm hmmm.

Rich: And instead you have essentially, like, a sort of star chamber sitting up at the top? Is banana cakes. By the way, the idea of putting a for-profit box inside of a bigger non-profit box is incredibly entertaining, and it’s just a recipe for disaster. Because for-profit businesses are designed to grow and to move quickly and to navigate commercial waters, right? And non-profits are usually very idealistic and mission-driven. And what’s interesting about this non-profit is that the goals around it are actually aspirational. It’s not “our goal is to feed a million people.” It’s not that.

Paul: Right.

Rich: In fact, it’s, “we’re gonna keep this within certain parameters, make sure it doesn’t kill us…”

Paul: Interesting, so the parent organization has an anti-growth goal, which is we need to regulate and understand this from the beginning. The child organization is fiendishly focused on all forms of growth.

Rich: It’s—you know what it’s like? Like, you know, there are non-profits that are, like, trying to cure diseases.

Paul: Yeah.

Rich: But you can’t put in the charter, like, you must cure it by 2025. Like, that’s not in the charter. So it’s a research non-profit.

Paul: So it’s like if, in the nineties, I had an organization dedicated to identifying the patterns that make the best Windows-based operating system, but really keeping it on rails so that it was the best, most optimal solution for everyone. And then Microsoft came along and gave me $10 billion to build Windows 95 inside of it.

Rich: Yeah, and there are commercial mechanisms for making money. Microsoft has shareholders. They have to make money. So whatever, I don’t know the terms of the deal, but they’re gonna make money off of this thing.

Paul: Well, the terms of the deal are, I mean, yes, they get rights to sort of all the IP, unless they come up with an artificial intelligence that really works. Like, it’s like a baby. And then Microsoft can’t have that. They get to keep that.

Rich: Is that in the agreement between them? 

Paul: Yes, because everyone is bananas.

Rich: Okay, so this goes back to the brains in the room. And the brains in the room thought that they could write essentially an SDK around how all this is gonna play out.  

Paul: And let’s reboot the way that all business works in America.

Rich: Yeah. And the truth is, what you described over essentially a 72-hour period is a disaster. And it’s a disaster not because they made the wrong decisions. It’s a disaster because the implications of codifying these relationships over such a short period of time will play out in the next two weeks. Essentially when they, like, take away Sam Altman’s Mac and give him a Dell Inspiron. Like, these are things that don’t happen overnight. It takes weeks to get one executive to sort of understand the contours of what the relationship is going to be like.

Paul: Yes.

Rich: You can’t just have—these arranged-marriage kind of situations never play out well. They just don’t.

Paul: No, there’s another element here, which is, I believe, uh, that, you know, the board could have said, we are out of alignment with our mission and we must do whatever it takes to get back to our mission. That’s our function as a not-for-profit board. I totally get that.

Rich: Do you agree?

Paul: I… I don’t know. I think this whole thing is silly, right? Like, I want to get back to another point, but I’ll just tell you: I think this is a very interesting technology, and the conversation around it is utterly self-indulgent and ridiculous. Uh, it is—

Rich: There are a lot of egos, a lot of incredibly successful people—they’ve seen success. They have become the darlings, not just of, like, Silicon Valley and technology. They’ve become kind of the darlings of the world, in a weird way.

Paul: They travel around, no, they travel around the world and meet with leaders.

Rich: Yeah, they’re meeting, they’re meeting prime ministers. That’s right.

Paul: They’re diplomats, right? They’re sort of like tech diplomats. And they’re self-appointed, and they’re self-appointed because they have all the money.

Rich: Yeah.

Paul: I get all this. I think this is a really fascinating, interesting, enabling technology. I feel that it produces a lot of absolute nonsense—like, it’s not really a usable technology yet, but boy is it novel. It is capable of doing things—

Rich: It’s an interesting time, no doubt about that. And, you know, him being blindsided says as much about him as it does about the board. I mean, I think ultimately, this was some classic… let me boil it down: When you give people power, or the opportunity to use power? They tend to use it.

Paul: Yeah.

Rich: They just tend to use it. Sam became the darling; he was sort of the face of OpenAI. This board, um, has immense power, right? And I don’t think it’s as simple as, like, well, you’re misaligned with our goals or whatever. They could call a meeting and say, hey, dude, we’re the board.

Paul: Yeah, this is… Yes.

Rich: This is… Had you heard of these people before this weekend? I’d heard of Ilya, like, in passing, but…

Paul: Yeah. Not really, a little bit. 

Rich: Altman, I mean, Altman is the Michael Jordan of this whole movement right now, right? And so what you have here is, um, people tend to get real riled up about pushing the red button and thinking, wow, check it out. And then it’s, like, two days later and everyone is looking at this scorched earth all around. Let me wrap this in a bow.

Paul: Put it in a bow.

Rich: Let me put it in a bow for you. And this goes back to, I think, the theme of this podcast: you know what this group of geniuses is not thinking about?

Paul: What?

Rich: HR. 

Paul: No.

Rich: They’re not thinking about any of it. They have put themselves in a domain of, like, humanity is waiting for us. You say 700 employees. You know how utterly uninteresting and quaint that sounds to that group of people?

Paul: Right, because super robots are going to take over the universe.

Rich: They are convinced they are stewards of the future for humanity. And you’re talking about 700 employees scattered around California and the United States, and around the world or wherever. Not interesting. Really boring. HR handbook? Really, really boring. That’s not what’s going on here. That’s, I think, what has been—by the way, you could argue Sam falls into that category, because he probably hasn’t been in the office in six months, right? So…

Paul: Or, or whatever, but like, I agree with you. Like, that’s the decision-making that’s at work here. This is…

Rich: You know how weak the COO or the VP of HR is in that company? Do you have any idea? It’s like, what are you talking about, 500 employees? We have to worry about 8 billion.

Paul: They are. And I think also they’re just told, “Here’s another 100 million, go get me three more engineers.” Right? Like, you know.

Rich: Yeah, that’s all it was. They were in an alternate reality that was just dripping with, like, gratifying maple syrup all over the walls. It was just another time and place, man.

Paul: I’ll tell you what, though: if people had been thinking about those 700 people, on all sides, you wouldn’t have this kind of chaos.

Rich: You aren’t… Waking up, I say this to myself every single morning: “You’re not that big of a deal.” I say it every morning. You say it to me, I say it to you.

Paul: I kind of have to say it to you like five times a day.

Rich: And I’m okay, well thank you for that. 

Paul: Well it’s how we keep this under control.

Rich: You are not that big of a deal. You are not the steward. You know who needed to hear that? Sam Bankman-Fried needed to hear that. I don’t think these people are…

Paul: Somebody made a good point, which is like, if Sam Bankman-Fried had a good board, he would have been fired. He wouldn’t be in jail.

Rich: It’s the same script. Elon Musk, Sam Bankman-Fried, these people—these are people who are seeing, like outlandish success and adoration so quickly.

Paul: Can I tell you something that bums me out, out of all of this? They get excited by the abstractions, they’re excited about humanity in the abstract. 

Rich: Exactly.

Paul: But they’re indifferent to humanity on the ground. 

Rich: Absolutely. 

Paul: I feel that that is a genuine failure of their class. 

Rich: Absolutely, absolutely. 

Paul: They should care about human rights and civil rights, and there are lots of previous examples of people who achieve their power and what they care about instead is this sort of imaginary future state that’s very abstract.

Rich: You’re nailing it. I mean, you know, the whole effective altruism message skips over the neighbors. It skips over the 500 employees. Look, Bankman-Fried was like, “I don’t want any of this money. I’m going to use all of it to make the world a better place.”

Paul: “I am not going to buy pants with it. That’s for sure.”

Rich: “I’m not gonna buy pants with it. That’s for sure.” But what happened? Literally families who had invested in this fund saw their money disappear because of malfeasance, right? 

Paul: And it’s indistinguishable from Madoff, except much, much larger. 

Rich: And also driven not by a scheme, but rather by this sort of long-term goal of fixing the world, where you’re going to actually cause damage near-term, right? And that’s just ego. That’s hubris. That’s all this is, is hubris.

Paul: That’s why we have law. So that you can’t go ahead and break the law just because of some imagined future state.

Rich: Or governance. Like, there will never be a law that governs how an OpenAI is going to run. But there’s a reason why those boring operating agreements exist.

Paul: Also a great argument for the separation of church and state, especially since AI is approaching religion for real, which is like… You don’t actually want people who can say, here’s how we’re going to enact policy in order to bring the apocalypse around sooner. But people do, they want to get that going, they want Jesus back.

Rich: Yeah. 

Paul: Or they want, there’s a group in Israel that’s trying to breed a red heifer to bring the apocalypse. Just google “red heifer,” my friend.

Rich: Incognito first, but it’s definitely…

Paul: Yes, definitely, definitely. You’re in a world here where, like, those belief systems start to get so profound, and people start to act on them to the point that they trump the basic norms of society.

Rich: Yeah. 

Paul: And that’s bad stuff.

Rich: Yeah, I want to close with this thought. It feels like the waters of Silicon Valley have sort of leaked into the rest of the culture.

Paul: Well, it did with crypto, right? Like, crypto was perfect that way, because I would go over to somebody’s house and they’d be on Discord trading, and I’d be like, you know, you own a shoe store. Like, it was very confusing.

Rich: Yeah, yeah. Um, and here we are. And I find this all a little… it’s a good reboot. Like, better now. Imagine if AI was way further along and there was some giant committee that had a meltdown. Like, this thing is still a baby, and we just got reminded that, speaking of superalignment, human alignment turns out to be really freaking tricky, right?

Paul: There’s another broader point in this, which is Microsoft: the degree of Microsoft power that was exerted over the weekend was quite remarkable.

Rich: Yeah, we may pay the price by having, like, a collection of absolute garbage applications come out of Microsoft because of this. [laughter]

Paul: I’m just going to let you know that all future currency will be Azure credits. So, anyway… [chuckles]

Rich: This is the Aboard podcast. Womp womp womp.

Paul: Yeah, what does this have to do with Aboard? We’re building a startup.

Rich: Yes. 

Paul: We are a for-profit entity.

Rich: Unapologetic. 

Paul: We don’t have a board.

Rich: We don’t have a board. We may never have a board [laughter].

Paul: No, not after this. No, you and I are on the board. And you should check us out at aboard.com, if you want to see something that is kind of the opposite of truly speculative technology, something that is actually designed to help you do day-to-day stuff. What have you been using it for lately? I’ll ask you that, and I’ll tell you what I’ve been using it for.

Rich: A lot of, like, notes. I use it to take notes.

Paul: Me too.

Rich: And I don’t share those notes. 

Paul: No, I’m with you on that. I do, too.

Rich: They’re like Post-its. Aboard has cards, and I treat cards like Post-its. Uh, which makes me, like, want to ask for features that are more for that stuff. But I’m not going to.

Paul: I’ve been wrestling with the same thing. I went to a community board meeting about climate resilience.

Rich: You just wanna put data in.

Paul: I just started taking notes in the app. It worked very well. I’m also using it for a family get-together we’re doing in D.C., because it’s kind of in the middle of everybody. So I’m using it to kind of track the restaurants that we’re calling, because we want to get a room, and just sort of… So, anyway: practical, straightforward, simple things like that. It’s got a little AI, but it won’t get in your way. And, uh, check it out: aboard.com.

Rich: Check it out. And, uh, thanks for watching if you’re on YouTube, or listening if you’re on the podcast. Uh, like, subscribe, follow, five stars, give us all the love.

Paul: Please, we need it very, very badly. We’re just a little tiny startup in a world of really big startups and software companies.

Rich: Have a lovely week.

Paul: Goodbye everybody.