February 17, 2026 - 37 min 23 sec

Can Tech CEOs Be Thoughtful?

Anthropic co-founder and CEO Dario Amodei wants AI to be regulated. Will anyone listen? On this week’s podcast, Paul and Rich dive into Amodei’s recent (lengthy) essay, “The Adolescence of Technology,” which argues for social responsibility both from within and around the AI industry. Amodei might have the best intentions, but with less mindful competitors in the space, are his ideas nothing more than wishful thinking?

 

Subscribe to the podcast, or watch the episode on YouTube.



Transcript

Paul Ford: Hi, I’m Paul Ford.

Rich Ziade: And I am Rich Ziade.

Paul: And this is The Aboard Podcast. It’s the podcast about how AI is changing the world of software, brought to you by a company that uses AI to change the world of software. That company is called Aboard!

Rich: Good one. Good energy.

Paul: I’m just getting in there today.

Rich: Get in there.

Paul: Then let’s play the theme song, and let’s get into a conversation about if the world’s gonna end or not.

Rich: Okay…

Paul: Okay!

[intro music]

Paul: Okay, Rich, I’m gonna play a little bit of audio for you.

Rich: Oh…

Paul: Okay? You ready?

Rich: Go ahead. Not the Doobie Brothers again!

Automated voice: Dario Amodei.

Rich: Okay…?

Paul: Dario Amodei is the founder of Anthropic. They make Claude Code. It’s an enormous company. It didn’t really spin out, but he left—he was one of the very early people at OpenAI and he didn’t like the way OpenAI was being run. And he, I think—

Rich: Is that true? I didn’t know the backstory.

Paul: Oh, yeah. He was essentially a person who was like, “We’re not being serious enough about this technology. We’re not being focused on safety.” A real true kind of scientist type, like, a real sort of math person.

Rich: Okay.

Paul: Seemed to have some really fundamental disagreements with Sam Altman at OpenAI and was, like, “I’m going to go do my own thing,” relatively early in the game.

Rich: Yeah.

Paul: And created Anthropic, which is a true competitor, along with Google, and really not too many others. You got to go to China to get the next sort of big thing.

Rich: Yeah.

Paul: To OpenAI, and in some ways I think is a leader, especially in code generation. Claude Code, as we’ve talked about, has taken the world by storm.

Rich: Yeah. I think a good way to sort of summarize how it seems to be materializing is that OpenAI has gotten a foothold on end-user consumer use of AI to shop and answer questions and draw pictures and whatnot. And Anthropic seems more business-oriented, enterprise-oriented. A lot of talk these days about Claude Code. It’s very good at generating code.

Paul: I mean, a good example is that ChatGPT is very, very focused right now on embedding shopping.

Rich: Yeah.

Paul: And whole percentages of, like, Walmart’s revenue are starting to come from ChatGPT.

Rich: You can buy stuff inside of ChatGPT.

Paul: Similar to how you can buy placement on Amazon and you can buy Google ads. And so, like, I don’t know exactly how the pay-to-play works or if this is just them absorbing data—

Rich: Or click out, yeah, I don’t know.

Paul: We’ve talked about this previously on the podcast. I don’t know the exact—

Rich: Same thing.

Paul: Yeah, but what I would say is, like, you know, ChatGPT has leaned into commerce. It has leaned into having lots of different plays.

Rich: Yes.

Paul: It reminds me in some ways of Microsoft, which is both a gaming company and your database company and does Windows, et cetera, et cetera.

Rich: Yeah. But very consumer-oriented.

Paul: Yes, that’s right. Whereas Anthropic actually reminds me a lot of Google in the old days, which is like, we’re going to do these things really well with a whole lot of computers.

Rich: Narrower mission.

Paul: And very, kind of like, quantitatively driven, like, very math and science at its core.

Rich: Okay.

Paul: You can feel it coming off. So Dario Amodei is the CEO and he likes to write really long essays online.

Rich: Really? Without the help of AI?

Paul: You know, we don’t know. I don’t know.

Rich: We really don’t know.

Paul: We really don’t know. So he has a website. It is called darioamodei.com.

Rich: Good URL.

Paul: No pictures. None.

Rich: That’s okay.

Paul: Honestly, and it’s the same color scheme as Claude. Clearly the guy has a relationship to his own product, which is actually really good in a founder, especially at that scale.

Rich: Fine.

Paul: So he wrote a piece that people are buzzing about.

Rich: Hmm!

Paul: And they’re buzzing about it for really interesting reasons. It’s called “The Adolescence of Technology.”

Rich: Okay.

Paul: It’s on his website. It’s long. He has a lot of thoughts. He’s not really a summarizer. He builds one of the world’s better summarizers, but he personally is not a summarizer.

Rich: Yeah.

Paul: I gotta be very frank, I kind of like this guy.

Rich: Okay.

Paul: I doubt I agree with him on a lot of things, but he makes an attempt to be forthright and essentially has a kind of engineer-scientist worldview.

Rich: Mmm hmm.

Paul: That he conveys, which means that occasionally he gets a little first principles. Like it’s a little bit like, “Okay, here’s how we got to economics.” But! He is engaged with the world. He is broadly pro-liberal society and democratic society, and he believes that the risks of those things are more important than his ability to make unbelievable amounts of money every single moment of the day.

Rich: Okay.

Paul: And so in the piece, interestingly—and I think you’d probably have to be a little on the nerdy side to really want to read it—he doesn’t call out Altman by name, but definitely, like, Musk really comes under—because he’s like, “Wait a minute, you know, Grok is producing child porn. So how are we going to tell it—”

Rich: He mentions that?

Paul: Yeah. And he’s just, like, “We have to actually start taking this stuff seriously.” And I think it’s a very—so first of all, I’m going to just set a few things up for you. And he talks about this, too. Right? So Claude Code is moving quickly, and it can write code.

Rich: Yes.

Paul: Okay? We’ve had this conversation going on for years. It’s like, is it artificial general intelligence? Is it AGI yet? We’re also seeing this world where things are getting faster and they’re getting more—

Rich: Reliable.

Paul: And they can do more stuff, right?

Rich: Yeah.

Paul: But what’s really interesting to me about Claude Code is that code has this tendency, once you build a really good, stable platform, it gets easier to accelerate building from that platform.

Rich: Okay.

Paul: That is the nature of the internet, it’s the nature of software. You build a platform that can accelerate the development of other things, right?

Rich: Yeah.

Paul: And Claude Code to me is this really full expression of how an LLM platform can truly operate.

Rich: Okay.

Paul: Okay? So you got Opus underneath it, their LLM, that’s their ChatGPT.

Rich: Yeah.

Paul: And essentially, when you use Claude Code, you’re chatting with it, but it’s putting all these little layers and tools in between and it’s running, like, little loops.

Rich: Make it work better.

Paul: It does a little thinkin’, does a little plannin’, does a little codin’, does a little testin’.

Rich: Yeah.

Paul: Then it goes, “Better do some more plannin’!”

Rich: Yeah.

Paul: Now we’re sort of in this, but the end result is you have this pretty good bundle of code that runs pretty well, and then the next thing you do can build on that. And that used to be often months and a lot of planning, and it was really hard to get the people to do it for you. You had to pay them a lot of money and now you can do it pretty quickly.

Rich: Yeah.

Paul: You still have to have really good ideas.

Rich: Yeah.

Paul: Okay, so what I see is an opportunity, and I think what he sees, too, he’s very explicit about it, is there’s this real opportunity now to radically accelerate. Like, if you, if code changes the world and it has a lot of power?

Rich: Mmm hmm.

Paul: And you can do it not just twice as fast today, but another, let’s say 20% faster than twice as fast tomorrow, and 20%—if you have this kind of compounding interest?

Rich: Yeah.

Paul: In which you can deliver more and more stuff.

Rich: Right.

Paul: At more and more scale, that is an enormous amount of change.

Rich: I mean, he’s essentially—well, I read a lot of the piece. And one of the things he’s sharing is what he’s observing: the pace of change, put “progress” in quotes, seems to be accelerating.

Paul: And he’s also seeing—Anthropic, absolutely, no holds barred, I would say, has some of the best engineers in the world.

Rich: Sure.

Paul: Especially around LLMs and so on. And he’s saying, look, I’m seeing some of the people who work here, they just don’t write code anymore.

Rich: Right.

Paul: Claude does it for them.

Rich: Yeah.

Paul: So he’s like, if we’re there, where else are we going to get? And that’s what the piece is about. And then he has a big—

Rich: He’s looking ahead, mostly, in the piece.

Paul: That’s what this is about. It’s about the risks. And it’s a—

Rich: Yeah.

Paul: He’s had a little bit of utopian sort of… Look, the thing with AI is it’s been sort of, like, it’s either the end of the world or the beginning of everything, and everybody’s sort of like, “Whoa!” But what he is seeing, and I think I saw this, too, you see this, too, is like, at least in the domain that he is closest to, which is technology delivery, a many, many trillion-dollar industry is changing. When an engineer says, “Hey, I’m going to just use this instead of doing the thing that I did for the last 20 years, I’m not going to do that anymore, I’ll just do this.”

Rich: Yeah.

Paul: That’s a pretty radical disruption. And so, if they can do more, faster, because they’re doing it the new way, that is going to change your industry, not just your company.

Rich: It’s happening. It’s starting to happen.

Paul: That’s right. And so he’s trying to take ownership of that responsibility. Now he uses a thought experiment, a big metaphor. So I’m going to give you the metaphor.

Rich: Okay.

Paul: The metaphor is imagine if there was a country with 50 million geniuses.

Rich: Okay.

Paul: Okay? How would you, like, let’s say you’re an American. How would you deal with that country? The country can do genius things. It can hack into computer systems. It can…

Rich: Yeah.

Paul: So how would you regulate and how would you interact with that nation state?

Rich: Yeah.

Paul: And so that’s his way of saying—

Rich: Setting up the fact, the backstory.

Paul: It’s big, and there’s an element of it, too, where it’s like, how do you get control of a nation with 50 million geniuses?

Rich: Yeah.

Paul: Because they got the nuclear codes at that point.

Rich: They got everything.

Paul: First of all, did you read some of this?

Rich: Yeah.

Paul: I found it… There’s a lot of, there’s a million little critiques I would make of it. But I actually, I think it’s relatively open for a, somebody who’s running a $350 billion company? It’s a very clear thinking, open-hearted document. It’s actually a risky document because this is not necessarily—no one who is going to sign up for the Anthropic IPO is excited for him to say, “I welcome government regulation.”

Rich: Yeah, I think it’s a little baffling to me that this guy is in the minority. Okay, this guy…

Paul: It’s baffling to him, too. It’s a bummer.

Rich: It’s a little baffling because if it’s wealth he’s looking for, he’s got it. Like, even if they never IPO, he’s a billionaire. This guy is a billionaire by—

Paul: He cannot spend the amount of money he has.

Rich: He can’t spend the amount of money he has. Now, he doesn’t strike me as unambitious because we’re here. [laughing] We got to here because he went for it.

Paul: Yep.

Rich: And he wants to keep going for it.

Paul: He obviously likes running this company and telling people to do things and seeing what happens.

Rich: Yeah, I mean, in reading the piece, I found that he was sort of expressing almost, like, an internal struggle for himself, because I think he sees—he strikes me as an optimist, deep down.

Paul: Yeah.

Rich: As someone that’s, like, I see, like, we’re gonna cure diseases with this. Let’s go, let’s go and do great things.

Paul: I really want to give this to every doctor in the world so that they can be better doctors.

Rich: Correct. I think his saving grace is that that ambition and optimism hasn’t, like, polluted his own ego. I don’t know… And that’s interesting.

Paul: You know why? It’s because he’s not, like, 22.

Rich: Maybe that’s part of it.

Paul: That’s part of it—

Rich: Yeah, but there’s a lot of knucklehead 40 year olds right now.

Paul: Boy, there are.

Rich: You know…

Paul: I hope this one doesn’t turn. I hope we’re not—

Rich: I gotta say, others have turned. Others that I was a fan of that became infinitely wealthy also turned eventually. I don’t know if that’s what you have here. I’m going to be a little optimistic about where he’s going with this. Look, I think part of what he’s saying as well isn’t… It didn’t strike me as high-minded and moral. It struck me as, like, we could really get into some deep shit here.

Paul: Yeah.

Rich: And while we may be on our way to curing cancer, we maybe also may be on our way to, like, getting into really big trouble.

Paul: Yeah.

Rich: And I don’t want us to get into really big trouble. It didn’t strike me as a lecture. I think it’ll land that way because he’s coming out of an environment—

Paul: Richard, anything more than, like, three sentences in 2026 is understood as a lecture.

Rich: That’s true.

Paul: People can’t—no one’s going to read this because no one can read. [laughter] Like, we just kind of…

Rich: This is true.

Paul: Actually, just, I think, because it’s emblematic, let me read just a couple sentences. First of all, I agree with you. I think he is just trying to organize his thoughts and sort of, essentially—look, at this scale of company, when you write something like this, it’s policy, both internally for the company and how you are trying to express policy moving outward.

Rich: Yeah.

Paul: And so he’s calling for more regulation, more stability. Not too much regulation. And he’s also very paranoid about external things. He’s very paranoid about China. He’s like, that’s an authoritarian state. There is a little bit of an implication, I think, that he sees America’s current disaster as a temporary situation. He’s very sort of pro-democracy.

Rich: Yeah. It struck me, it had a bit of a patriotic tone to it.

Paul: It does. This guy believes in an open society and he’s very concerned about how an authoritarian, surveillance-oriented state—

Rich: Could have a field day with this stuff.

Paul: Because the robots can program the robots to be smarter. That’s, I think, where he’s going.

Rich: Yeah.

Paul: Right? So, like, Claude Code can make a better Claude Code.

Rich: Yeah.

Paul: Okay, so maybe I can make a really good automated drone pilot.

Rich: Yeah.

Paul: And then I go out and I get some video back from it. It’s like, “Mmm, it’s off by, like, 2 degrees.”

Rich: Yeah, yeah, yeah.

Paul: Okay. Right now some guy has to, like, figure out how to solve the 2-degree problem.

Rich: Yeah.

Paul: But maybe I put 10,000 drones out and I get all the little different measurements off of them and then I automatically improve my drone software for tomorrow.

Rich: Yeah.

Paul: Okay? And that used to be—

Rich: And that’s good if you’re using drone software for forest fires. Not good when you have malicious intent behind using those drones.

Paul: And his other point too is that, like, a relatively incompetent actor—used to be, like, if you go get a PhD in biology?

Rich: Yeah.

Paul: By the time you got that PhD, you’re less likely to create a super virus that will kill everyone.

Rich: Yeah.

Paul: You might want to kill only your advisor, your thesis advisor.

Rich: Yeah. Right.

Paul: But like now, essentially, one of these things can kind of walk you through all the steps necessary to make the super virus.

Rich: Yeah.

Paul: So they’re trying to put guardrails in. He’s like, “Hey, we are. We’re spending 5% of our compute sometimes on the way out.”

Rich: Yeah.

Paul: To make sure that, like, you don’t get biological warfare info. But he’s like, if they’re making child porn over at Grok—

Rich: Yeah.

Paul: I don’t think they’re also really concerned about viral issues.

Rich: You know, when he talks about, like, examples like that, I don’t think he’s talking about them in the near-term. I think what he’s talking about is, look, this is a symptom of, like, how this can really, really go off the rails.

Paul: Well, and because it’s a, you know, it’s a recursive process. Right?

Rich: Yeah.

Paul: So when you do things with a computer and the computer crashes and suddenly the screen is all lots of little pixels.

Rich: Yeah.

Paul: He’s thinking of that as, like, a cultural thing.

Rich: He is, yeah. And look, I’m on board with what he’s saying.

Paul: Yeah.

Rich: I worry about what he’s saying because I think a lot of his competitors and a lot of other technologists are not on board.

Paul: Well, we’re in the gravy train zone right now.

Rich: We’re in the gravy train zone. I don’t have a ton of confidence in government and, like, civic society to have the foresight to head off something that I think is far more potent than, like, what social media has done.

Paul: Unfortunately, we know the playbook here. Well, first, let me just read the paragraph.

Rich: Read the paragraph.

Paul: “While all the above private actions can be helpful—” You know, he’s kind of talking about what companies can do. “Ultimately a macroeconomic problem this large will require government intervention. The natural policy response to an enormous economic pie coupled with high inequality (due to a lack of jobs, or poorly paid jobs, for many) is progressive taxation. The tax could be general or could be targeted against AI companies in particular.”

Rich: Mmm.

Paul: Hell of a thing for a guy to say. “Obviously tax design is complicated, and there are many ways for it to go wrong. I don’t support poorly designed tax policies. I think the extreme levels of inequality predicted in this essay justify a more robust tax policy on basic moral grounds, but I can also make a pragmatic argument to the world’s billionaires that it’s in their interest to support a good version of it: if they don’t support a good version, they’ll inevitably get a bad version designed by a mob.”

Rich: Hmm.

Paul: Here’s what human history shows us.

Rich: Hmm.

Paul: The billionaires will build compounds in Nevada, and hopefully they’ll be able to flee to them in time. That’s what they’re thinking.

Rich: I think that’s a thing, isn’t it?

Paul: Yeah.

Rich: A lot of Wyoming, like, underground bunkers.

Paul: Yes, that’s right. And hopefully their security personnel won’t eat them. [laughter] Right? And hopefully AI won’t, like, knock their planes out of the sky. Like, I mean, what he’s saying is, like, whoa, hold on. Don’t go build that society. Let’s build this one. The problem we have, and we’ve seen it with the pandemic, we’re actually seeing it now with sort of the counter movement to Trump. Like, you’re actually starting to see that, like, the authoritarian impulse in America is being roundly rejected.

Rich: Sure.

Paul: But it, I gotta be frank, we could have done this at Democracy Time a couple years ago.

Rich: We never do.

Paul: We never do. And we’re not doing it with climate either. Right?

Rich: We always step in shit.

Paul: You know it is, and I gotta tell you, I gotta tell you something. This is a little personal, but just, like, Mounjaro helped me lose a lot of weight, right?

Rich: Mmm hmm.

Paul: I’m one of those. Type 2 diabetic. I’ve written about it. It’s good. It’s good for me. I weigh less than I used to. I’m healthier. I do more. I ride my bike more. But there was a funny part of me, because I’m very connected to the climate movement, have spent a lot of time on it. You know, climate scientists tend to be pretty skinny, kind of like doctors. And they wear, like, nice jackets.

Rich: Bow ties?

Paul: No, not so much bow ties. More like sort of, they’re almost like UN types. Like, they wear cool all-season jackets and they’re ready to go to a glacier at a moment’s notice.

Rich: [laughing] Okay.

Paul: And they’re, you know, “We gotta just get everybody to calm down.” And I’m like, “You’ve never seen somebody around a can of Pringles just lose it.” [laughter] Right?

Rich: I was wondering, how are you gonna bring this all together?

Paul: You get that can of Pringles, and America is just, like, the world is somebody who is, “You know what? I had a quarter of that can of Pringles. That’s enough Pringles for now.”

Rich: Yeah.

Paul: “I’m gonna put the—” And you put, you seal that can up. In fact, maybe you even left that foil top on it.

Rich: Yeah.

Paul: You know, you’re like, “I’m done. I’m done till tomorrow.” And then—

Rich: 20 minutes later.

Paul: That little guy on the Pringles can is like, [high-pitched voice] “Where did you go?”

Rich: Yeah.

Paul: “Come back.” He doesn’t even have an accent in my mind. I don’t know what he’s supposed to be.

Rich: Yeah, okay, ground this for me.

Paul: No, no, we’re going, we’ve got an hour of this to go. [laughter] No, what I’m saying is like, trans fats, cigarettes, we all know. Everybody knows.

Rich: Yeah.

Paul: “I guess I’m getting cancer then!” I’ve heard people say those words.

Rich: Yeah, yeah.

Paul: Right? And we expect somehow that a larger human society will be more enlightened than your neighbor or yourself.

Rich: Yeah.

Paul: And sometimes it is like, especially after World War II, because I think we were like, “That was real bad. That was, like, super bad.”

Rich: I was actually gonna bring up World War II.

Paul: Yeah.

Rich: I like to look at examples where, like, humanity went ahead and lined up. The truth is, since World War II, the stretch of peace and prosperity, obviously there’s been flare-ups in different places—

Paul: No, no, but it was awesome. And it was directly in response to, “That was literally the worst thing we’ve ever seen.”

Rich: Yeah. And to take it even further, you know, we dropped two nuclear bombs on Japan.

Paul: Yes.

Rich: And then, yes, there was a cold war and an arms race.

Paul: No, no. And we’re coming out—the news about the Holocaust doesn’t come out all at once. It comes out over, like, a decade.

Rich: Exactly. And so I think—

Paul: This is a marketing podcast, by the way.

Rich: I think, look, I think what he’s aiming for, and it’s good that he did it, by the way, because Anthropic is not a little company, and he wrote an op-ed.

Paul: No, this is a—

Rich: Like, he actually has a voice and this is a big deal.

Paul: No, the guess is if they IPO this year, they’re worth at least $350 billion. They’ll be, absolutely, one of, like, the top 50 companies in the world.

Rich: Yeah. I mean, this is a big, big, big voice in this movement. I think what he’s trying to do, which is hard, history dictates that this is really hard, is he’s trying to let everyone know that, you know, yeah, we built a lot of nukes, but we also knew about mutually assured destruction. And then that was the end of it.

Paul: Let’s be clear, we didn’t at first. We had people like Edward Teller saying, “Let’s nuke the moon.”

Rich: Yeah.

Paul: Everybody got real excited about the new—

Rich: Yeah. That was short-lived. And then you had pretty much the world order orienting around not obliterating the whole earth.

Paul: You know, one of the first things that Truman did, after—Truman did not approve Hiroshima. That was military. It was the army.

Rich: Yeah, but he’s got to ultimately sign off.

Paul: No. He said, “In the future, I’ll be signing off on these.”

Rich: Oh, that’s a good loophole.

Paul: Well, no, no, no. It was like, he was like, as president—these are going to go through the president from now on.

Rich: Yeah.

Paul: The army cannot be using nuclear weapons.

Rich: Of course. Right.

Paul: Nobody thought that way. Like, nobody had the thought because they’ve never seen what could happen.

Rich: Correct.

Paul: Right?

Rich: So, look, this is the equivalent of a tobacco executive saying, “Hey, listen, man, I know these, we keep saying they’re minty fresh and it’s 1953, but this is gonna kill a lot of people. And so let’s regulate it. Let’s not let kids smoke.”

Paul: It’s not tobacco. What is it? It’s something that is really great and exciting and good, but, boy, does it go out. It’s—

Rich: I mean, in the 50s, that was tobacco. Here’s my point. This is the CEO of Winston cigarettes—

Paul: Altria.

Rich: Writing an essay.

Paul: Yeah.

Rich: I fear that this is out of the bag and that humans tend to step in shit first before they learn. He’s trying to wrap his head around losing control, I think, is what he’s saying. He’s essentially saying, we’ve got something now that could be great, but frankly, I cannot give you a guarantee sticker on this thing, and it is going to go off the rails. It will happen, and we can try to regulate it.

Paul: So what infrastructure—

Rich: That’s what he’s saying.

Paul: What policy will we have? What will governments be doing?

Rich: That’s what he’s saying. He’s not saying, like, you know—

Paul: Well, he is also saying, like, I personally will not be fighting you tooth and nail on that. If you work with us, we will comply with the regulation.

Rich: Correct.

Paul: There’s one other important thing here, too, which is at the scale, you know, you’re saying this is about him saying, wow, this is going out of control. At the scale of organization, at the speed of growth, he’s also lost control. He doesn’t control his own company.

Rich: He doesn’t.

Paul: No CEO, and I think if you haven’t been around these orgs, no CEO of an organization that’s worth, I don’t know, like, more than a billion dollars, really has control. I mean, every now and then there’s like, the Instagrams of the world that aren’t that big.

Rich: He doesn’t have control over the industry. He has control over his own business.

Paul: Even there, you’re still, he can set policy and agenda, but thousands of people cannot be controlled. They do what they want.

Rich: True, but—and this is actually happening at Anthropic. There are a lot of guardrails inside of Anthropic’s models.

Paul: Yes.

Rich: Like, a lot of them. It’s part of their process.

Paul: It’s kind of their whole thing.

Rich: It’s kind of their whole thing.

Paul: It’s actually, it’s pretty awkward, too, because for obvious reasons, the U.S. government has been like, “We’re gonna bring Grok into the Department of Defense.” It really should be this product.

Rich: Let me ask you a question.

Paul: Go ahead.

Rich: It’s 2014, 15.

Paul: Okay.

Rich: Zuckerberg sits down. And already it’s starting to become clear that Facebook is getting into the hands of malevolent actors and authoritarian regimes and whatnot.

Paul: Yeah.

Rich: And it’s not fire yet, but it’s starting to, it’s starting to flare up—

Paul: This isn’t a hypothetical. This happened.

Rich: No, no. Then Zuckerberg in 2014 says, “Guys, this is going to cause damage. I’ve got 1.3 billion people on it. Another 1.3 billion will be on it in three years or two years or whatever. A lot of the Earth’s people are going to be on this platform, and they’re going to manipulate elections, they’re going to stir up angry sentiment and cause disruption in societies around the world. We need regulation.” Would that have done it? First off, he didn’t do that. Let’s also point out, like, growth was absolutely the number one goal of Facebook. It was a business, unapologetically a business driven by Sheryl Sandberg and—

Paul: It’s been pretty widely reported, right.

Rich: And would you say that now, in hindsight, that there should have been more control, more policy, more guardrails, so that something, like, as vast as, frankly, social media was on the Earth, right, should have been reined in?

Paul: Yeah, absolutely.

Rich: You do?

Paul: I did at the time, and I do now. And then when they went in front of Congress to testify, Congress was basically like, “If we write some regulations, will you help us write them?” They’re like, “Sure, I will, Congress!” Like, I’ve never seen a greater bunch of sucking up and just garbage fires. And we all know what it did to society.

Rich: Okay, next question. Do you think that this wave, AI, could cause more damage more quickly than social media?

Paul: Yes. Not directly. Currently, I would say, I don’t directly buy that, like, the talking robot will.

Rich: Yeah.

Paul: But that ability to accelerate… Social media is a vector for distribution of AI-generated stuff, right?

Rich: Yes.

Paul: Right now, I mean, in some ways, some of our good protections are things like app stores, because I could make a thousand apps an hour.

Rich: Forget that. Well, yeah, I mean, okay, fine. Like, malicious actors that, like, want to take down the northeast electric grid.

Paul: Yeah, sure.

Rich: Which is an act of war.

Paul: Absolutely. Sure.

Rich: Yeah. We don’t have it yet. Like, I don’t know if this guy, who is not Facebook.

Paul: No.

Rich: Facebook had 90% of social media between Facebook and Instagram at the time.

Paul: Yeah, right.

Rich: He’s not that. He’s got, like, 35% of AI. Right?

Paul: Pretty good, but yeah.

Rich: But yeah, right. And everyone else is not on board. Do you think we’re heading down a path where we’re going to have some really bad stories?

Paul: Yes, absolutely. Of course. 100%. How can we not? It’s the most powerful thing we’ve unlocked in tech. It’s probably more powerful than social.

Rich: Yeah.

Paul: And it can accelerate every aspect of cultural production, including software development.

Rich: How do you get an African country to not maliciously use AI? Let’s say the U.S. government agrees to it. How do you get in a Middle Eastern country where there’s unrest everywhere and it’s tribal?

Paul: His point—he actually, this is, like, again, where I feel he’s a very first-principles kind of guy.

Rich: Yeah.

Paul: He’s like, “This is even bad for authoritarian governments.” [laughter] Like, unless they’re just like—no, because you won’t get to keep the power. Right?

Rich: Yeah.

Paul: You know, like, this could be really bad for you. You are vulnerable to this just in the same way that like—and his point, I think is also—

Rich: Like, the meaning it’ll get out of hand in their hands?

Paul: Yeah. Like, you can’t control this. This can come for you.

Rich: Yeah.

Paul: Just way more easily than you can rally this and come for somebody else.

Rich: Yeah, yeah, yeah.

Paul: But you’re asking a level of enlightened self-interest out of humanity that we have absolutely no evidence we can produce. And I know that the AI people think they’re very special, but the only real win we’ve had, and it came after two nuclear weapons were deployed, was nuclear non-proliferation, slowly, over time. But the reality is—I’m gonna be real cynical, dark for a minute—that is a gun that still hasn’t gone off. Like, it’s still a problem.

Rich: It’s still there.

Paul: Yeah.

Rich: Yeah. I mean, there have been reduction treaties to bring stuff down.

Paul: This is where we are. So the best situation you can have is probably—well, first of all, I don’t think anyone will believe it until they see something really bad happen.

Rich: Yeah.

Paul: And then we will overreact. A good example would be September 11th. And then we created the Patriot Act. We created all this stuff.

Rich: Yeah.

Paul: We created this whole—

Rich: Take your shoes off.

Paul: Yeah. We created this whole infrastructure for surveillance because we are—

Rich: We freaked out.

Paul: Well, and you basically, you get a little bit of shame, as a culture, that you didn’t see it coming.

Rich: Yeah.

Paul: So we were ashamed of September 11th, and so we created this absolute overreaction because we didn’t want to admit how badly we’d screwed up to the point that, you know, “Hey, Bin Laden determined to strike in the U.S.” to the Bush administration. Well, that’ll never happen again.

Rich: Question for you. Do you think it’s a naive essay?

Paul: It will prove in history to be incredibly naive. And he knows that. [laughter] Anything you say and write about the future of technology is humiliating about two and a half hours later. And you can take that from me.

Rich: Yeah.

Paul: The thing is, I’ll tell you, I’ve actually recently started resurrecting a lot of my old writing and just kind of trying to make it just like a little personal archive because I have stuff scattered everywhere. Right?

Rich: Yeah.

Paul: The times when I wrote about technology and about culture purely from my values, that had a very good predictive quality.

Rich: Yeah.

Paul: Like, humans are going to continue to be humans, and this is what is good about humans, and this is what’s bad. And, you know, technologies for managing human behavior include things like religion. Like, this is really fundamental stuff. And I think what’s wild about AI is we’ve jammed a new meta-category of culture into an already kind of deeply oversaturated culture. It’s essentially, like, hey, this movie theater popcorn is pretty good. Let’s add half a cup of olive oil to it.

Rich: And butter.

Paul: And butter. And you’re like, “Well, no, I already had enough butter on this.” It’s like, “No, no, no. You also want—”

Rich: We can’t process it.

Paul: It’s more oil and butter and grease than it is popcorn at this point.

Rich: Yeah.

Paul: And we’re trying just to get a kernel in our mouth. So that’s how AI is landing. But to me, it is one of these big transformative cultural blobs that humans are going to have to interact with, one that both tells us how to behave, in a way, but also requires us to figure out the world that we want to live in and what belief system we’re going to be aligned with. So he’s taken a stab.

Rich: Yeah.

Paul: And it’s a $350 billion company and here we are, and we don’t have leadership that can address it. We don’t have national leadership that can address it. And some of the people running the other companies are among the most immature sociopaths we’ve ever seen.

Rich: There’s a bit of that.

Paul: So what can this guy do? He can make a really good product. He can put it on rails. Here’s the real ethical question. Given everything he knows and the risks involved, should Dario Amodei stop?

Rich: Stop building?

Paul: Stop. He knows that he could create an enormous amount of existential risk building Anthropic. He knows that and he’s worried about it. Should he continue to do it or should he stop? He thinks he should continue. What do you think?

Rich: I think he should continue… Well, he’s going to continue to do it.

Paul: But regardless of should. You know that.

Rich: I think he should continue to do it. I’ll tell you why. As a thriving baby capitalist, that’s me, I actually—

Paul: Just a widdle tiny baby capitalwist.

Rich: [laughing] I hold immense value in building things and building goodwill with things that people trust. I think Anthropic is going to be more trusted by governments. I think Anthropic will find an inside lane where they will be understood to be the gold standard in terms of what you’re handing over to their platform.

Paul: There’s a real ethos of transparency.

Rich: I’m not making a moral case here. What I’m saying is that I think this will be understood and perceived to be—now, look, there may be scandals in the future and things may change and—

Paul: No, but it’s the most compliant with sort of the idea of having a civic commons.

Rich: It is where a more positive outcome is a byproduct of actually capitalist and ambitious intent. There are examples of—Apple. Apple’s like, if you really want airtight privacy, you should buy Apple stuff. You shouldn’t get anything that isn’t Apple. And that’s not because they love you and want your privacy, want your information to be private. It’s because they sell hardware and not data and that’s their primary money maker. And so their angle, their inside track, actually turned out to be to have an altruistic side effect. And I think, I think this is so world-changing, this stuff, that Anthropic will thrive in the position they’ve taken. Actually, that’s my view of it. That they will have greater goodwill at their backs because some ugly stories are gonna come out in many places.

Paul: Yes.

Rich: Probably even Anthropic’s. Like, let’s be real—

Paul: Sure, oh, they already helped themselves to all of the written word.

Rich: Exactly.

Paul: Had to pay a billion-plus.

Rich: Exactly. But they’ve at least planted a flag and said, “We’re gonna aim for this because we’re all fabulously successful and we don’t want to see the Earth melt like a blob of Velveeta cheese.” No offense to anyone who likes Velveeta cheese.

Paul: I mean, look, the funny thing is they help themselves to everybody’s prose.

Rich: Yeah, they knew what they were doing. They knew, they knew everyone was gonna come at them, and the economics work. That’s literally what happened. Oh, shit, the New York Times sued them. It’s like, yeah, of course there’s a spreadsheet.

Paul: Well, they’re not suing Anthropic.

Rich: No, I just—

Paul: No, but. But actually with Anthropic, what happened is like, they got in trouble. They didn’t keep appealing. They’re like, “All right, let’s pay it out.”

Rich: Oh, interesting.

Paul: And then what happened is all the writers I know are like, “That’s kind of cool. I just made a couple thousand dollars.”

Rich: Is that right?

Paul: “That I was never going to see again.” [laughter] No, no, no. People were actually kind of like, “This isn’t cool. It sucks. I hate AI.”

Rich: Yeah, but they paid the bill.

Paul: And they’re like, “But mmm, you know, I was never going to see a penny out of this.”

Rich: Dude, they ran the numbers. They were like, “Listen, if everyone comes at us, what is it going to be?” It’s, like, about a billion bucks. “Oh, okay, we can do that.”

Paul: So we’re in this funny zone where, like, they actually ended up in some ways getting goodwill from that disaster. Anyway, regardless, I think—

Rich: I will say it’s happening.

Paul: Yes.

Rich: It’s happening now. Like, we are advocating for this company. We actually, the tools, the quality of the tools is very good. Let me ask you a question.

Paul: Okay.

Rich: I think that the $10, $20 a month model is not enough for the valuations that are out there. And so ads are coming to these platforms. Data marketplaces are coming.

Paul: They’re testing ads and they’re testing all that with ChatGPT right now.

Rich: Exactly. I pay $15 a month to not see ads on YouTube.

Paul: Mmm hmm.

Rich: Is Anthropic’s model going to eventually be an ad-driven one? My sense is no. I don’t know for sure. But would you pay more money to have a quote-unquote “purer experience” that is clearly not polluted by data marketplaces, behavioral marketplaces? And of course you would. Businesses will. That’s for sure.

Paul: I spend $200 a month right now on Anthropic.

Rich: Businesses would. Right?

Paul: Yeah.

Rich: Because he’s essentially, I think the ethos of the place is very tied to his worldview.

Paul: I gotta tell you, and just as a professional, as somebody who is using this to accomplish certain goals and do certain things, I just don’t trust OpenAI. I don’t trust them. I just, they’re not, I actually do, for all their faults, I’m like, all right, I trust Google. Like, I don’t like Google, but I trust them with my email.

Rich: Yeah, right.

Paul: I just kind of like, all right, here we go.

Rich: So I think. I think it’s complicated. I do think there are paths, and then maybe they lead the way. Maybe they lead the way. Like, wow, look at that. They actually are a beast of a business. And at the same time, they have a particular belief system that drives how they think about things. Do we need regulation? Do we need better control over this stuff? We do. I don’t know how that comes together. It’s a lot of old men in Washington that don’t understand anything. Like, I remember those Facebook hearings. It was kind of goofy. It’s like, “Hey, you know, my daughter was doing this last night. What do you think about that?” It was like, “Well, sir, if you turn off this privacy setting, she’ll be fine.” Like, it was literally the—you know, it was scary.

Paul: The politics of the future are gonna incorporate this. I’m seeing new patterns in how people are getting, like—I’m gonna just, I’ll close on this, which is, I’m watching how people are reacting in Minneapolis right now. There’s a lot of stuff going down. Right?

Rich: Yeah.

Paul: And what I see is, like, I’m on Bluesky a lot. Very progressive space, and people are like, “Nobody’s doing anything.”

Rich: Mmm.

Paul: And then you go, and then you realize, like, no, they are. They’re just doing it in the group chat. They’re doing it on Signal. They’re doing it—

Rich: Other places.

Paul: Yeah. And I think that what’s happening is we’re actually seeing a pretty big realignment in how people are using technology. They’re more conscious of privacy. They’re able—and so what’s happening is, like, actual collective action showing up, and everybody’s like, “Where did that come from?”

Rich: Yeah.

Paul: Because I didn’t see it. It wasn’t on Facebook.

Rich: Yeah. It’s not happening there.

Paul: It’s happening, and nobody cares about telling the media. Nobody cares about calling the New York Times. Nobody cares about it being on a Facebook event. And so I think that there are really different patterns emerging, and this is going to come into that. And so there is one thing where authoritarianism is enabled—you get the panopticon and it’s AI-powered. But there’s another thing where human beings are able to build tools for themselves at an incredible pace.

Rich: Yeah.

Paul: And distribute them really, really quickly and to create their own kind of culture.

Rich: Yeah.

Paul: And I don’t know exactly if it’s gonna be—those two poles are going to exist. I just don’t know what comes in the middle. And nobody does. And I think this guy is trying to just put an outline out so that we can—this is his PowerPoint.

Rich: Yeah, it is, it is. I appreciate that he did it, and I’m glad it’s out there because it’s being read and it’s being talked about.

Paul: I wish we had one of these a day.

Rich: Yeah, exactly. You’re listening to The Aboard Podcast. This time we talked about how AI is changing the world.

Paul: Yes.

Rich: We typically talk about how AI is changing the world of software. I mean, it’s all kind of one thing.

Paul: It is to us.

Rich: Fun discussion. Go, Dario. Reach out at hello@aboard.com. We’ll talk about anything related to AI. We’ve got our own platform that ships amazing software and solves problems for people, led by people who are still really important.

Paul: We’re going to start doing more demos of it. It’s actually, it’s really landing.

Rich: Cage match time.

Paul: Cage match.

Rich: So we’re gonna have some fun. Reach out and give us a like, subscribe, thumbs up, stars. I don’t know what the latest thing is.

Paul: Oh, and watch out for events. We’re building a New York City community around AI because we need to protect this city from the evil of Silicon Valley.

Rich: If you sign up to the newsletter, you’ll hear about our events.

Paul: All right.

Rich: Have a lovely week.

Paul: Lots of love.

Rich: Bye.

Paul: Bye.

[outro music]