AI Regulation (and Disregulation)
February 25, 2025 · 34 min 22 sec
Is it too late to regulate AI? On this week’s Reqless, Paul and Rich assess what “regulating AI” could even mean, from controlling training data sources to moderating its ability to spread information—and disinformation. They then zero in on the question in the context of the new American administration, and Paul muses about just how long he’d like to hold his breath underwater given the current state of the news. (Five minutes? Ten?)
Show Notes
- “Regulate” by Warren G feat. Nate Dogg
Transcript
Paul Ford: Hi, I’m Paul Ford.
Rich Ziade: And I’m Warren G.
Paul: No, you’re not! You’re not that.
Rich: Yeah, but he has an excellent song called “Regulate.”
Paul: He does. Okay, we’re going to talk about regulation today. Your name is Rich Ziade, in case you forgot.
Rich: I am Rich Ziade.
Paul: We are the co-founders of a company called Aboard that uses AI in lots of ways.
Rich: And you’re listening to Reqless: AI in the world of changing software?
Paul: Pretty good. That’s fine.
Rich: Okay.
Paul: AI, you know, it’s about how AI is changing the world of software.
Rich: Okay.
Paul: That is what we are. Anyway, let’s play the theme song and then I’ll get us all set up to talk about the future of AI in a civil society.
[intro music]
Paul: Hoo boy, AI in a civil society. That’s, that’s going to get everybody awake.
Rich: It was a stroke of genius.
Paul: How are you doing? How are you doing? You know, here’s what you’re going to say. “Everybody should get off their phone and really focus on what matters.” No one can do that right now.
Rich: Michael McDonald on “Regulate.”
Paul: Mmm.
Rich: The sample.
Paul: Mmm hmm?
Rich: I don’t even know if his voice is in it. It’s just that sample.
Paul: All right, I think we got to give people a little bit of context here before we go and talk about AI, which is that Warren G and Snoop Dogg made a song called “Regulate” that heavily samples Michael McDonald’s, what is it—
Rich: Okay, fatal error.
Paul: What?
Rich: It’s Warren G and Nate Dogg.
Paul: Oh my God.
Rich: Yeah. Wrong Dogg.
Paul: Oh my God!
Rich: Wrong Dogg, dude.
Paul: Ooh. [laughter] It actually is shameful. We’re gonna leave that in so people know.
Rich: It’s one of the, it’s a, it’s a hip-hop classic.
Paul: It really is. Which song, is it “Keep Forgettin’?” What is the song?
Rich: Yeah.
Paul: Yeah.
Rich: It’s “Keep Forgettin’,” but I think they—
Paul: [singing “Keep Forgettin'”]
Rich: —just took this, yeah, I don’t think he does that in the sample.
Paul: No.
Rich: It’s just the beat and the rhythm.
Paul: And apparently Michael McDonald’s kid was like, “Dad, you know, why aren’t you that cool? Like, you’re in this, but you’re not cool.”
Rich: Michael McDonald is a sweetie. He’s a very nice man.
Paul: I don’t have a lot of bad things to say about—
Rich: We’re gonna have to hijack this podcast at some point and talk about what Steely Dan did to America and music, but not today.
Paul: Honestly, it’s probably one of the reasons we’re in the situation we’re in now. [laughter] There is, there is somehow, I don’t know how to, I don’t know how to do it. But there is a direct connection between Steely Dan and America’s current predicament.
Rich: Probably.
Paul: I can’t—
Rich: All right, listen, go. What you’re, you’re concerned…
Paul: I’m not concerned.
Rich: Tell me your thinking.
Paul: Our editor had a good point, which, you know, I was just, like, all right, all right. You’re making a good point here, which is, hey, you fellas on the podcast are always saying, like, you know, AI is new, but there’s going to be a regulatory framework and the government will get involved, and I’m sure there’ll be a way for us to kind of make sense of this in our culture, because it can fake being a human, or it could pilot a drone.
Rich: Yeah.
Paul: So we’re going to need all kinds of rules for this. And other podcast guests have kind of iterated on that, and said the same thing. And as we look at today’s exciting and dynamic political environment, which may be entirely different by the time you listen to this podcast tomorrow, I would say that it’s going to be pretty thin on the ground when it comes to AI regulation. Large language models, who gets to access them, what they get to do with them, what the government does with them. I think that we have people who are going to try to jam this into every aspect of civil society, including defense, including government services, workplace, HR stuff, et cetera.
Rich: Yeah.
Paul: And I think we should talk about, actually, because here we are building our AI business, and we kind of tend to be downstream of the companies like OpenAI and the companies like Claude. We tend to be downstream of big tech players. That’s where you and I have spent our careers.
Rich: Mmm hmm.
Paul: And we always talk about this. We talk about the fact that kind of the law that you and I have to live by is the terms of service in the Apple App Store.
Rich: Yeah.
Paul: Right? I don’t get to do anything that Apple doesn’t want me to do.
Rich: Yeah, which is not regulation, by the way.
Paul: It’s not regulation, but—
Rich: It’s a contractual commitment, essentially.
Paul: But there’s a set of rules. But I would say for the tech industry, it’s indistinguishable. Right?
Rich: Yeah.
Paul: This is the set of rules. I cannot succeed unless I follow these rules. And Apple sets those rules because it wants to make a lot of money, and also because it just wants it to not be a horrible, disgusting mess. And those two things kind of line up.
Rich: Yeah, it’s driven by self-interest, but they want a stable, predictable place.
Paul: That’s right.
Rich: Marketplace.
Paul: So I think that this is real. And so, you know, in some ways, like, I just think we should open this gate a little bit, because I don’t think you’re going to see this big federal, like, here’s how you must use AI move in the next couple of years, because that’s not what this administration or Elon Musk or anybody is really all about.
Rich: I wouldn’t even zoom in on this administration. I think if history is any indicator, the laws and regulations are always too late. They’re almost always historically way too late. In fact, sometimes they never show up. Like, to this day, Facebook and all the pre-AI disinformation stuff is still essentially unregulated. Like, it never got really regulated with another administration. And it’s still to this day not regulated.
Paul: I mean, they do like to, I, I guess, you know, they do like to bring Mark Zuckerberg in and he sits on that booster seat and they yell at him for a while, but.
Rich: Yeah, but nothing ever really comes of it. Like, I mean—
Paul: Well, they did fire—or didn’t fire, but like, Sheryl Sandberg went away. So that was one sort of big change.
Rich: Yeah, but you could still con a lot of people on Facebook.
Paul: Yeah, you can, and you can post AI slop like, you know, Jesus made of shrimp.
Rich: It’s just, it’s just, the truth is there’s a libertarian streak to our society: leave people alone to innovate and grow, and kind of let things go where they want to go.
Paul: And nowhere more than Silicon Valley.
Rich: And nowhere more than Silicon Valley. It has a big lobbying effort and whatnot.
Paul: And look, AI is moving really, really fast. We’re starting to figure out ways that we’re going to succeed because it’s moving fast.
Rich: What do you want to regulate?
Paul: Well, let me throw out like a little bit of—I actually don’t—
Rich: I want to roleplay, because I—
Paul: Me, too.
Rich: I’m willing to be the guy who says, “What are you regulating? You sound ridiculous. It’s a productivity tool.”
Paul: Well, let me throw out a couple different regulatory frameworks that are emerging. So Europe has a whole lot of rules about what can be brought in and turned into an LLM, all sorts of stuff.
Rich: Okay, so, all right, so let’s pick this apart. What you can ingest from…
Paul: Yeah. You can’t, you can’t just help—
Rich: …common knowledge.
Paul: You can’t just help yourself to all of culture.
Rich: Right, right. And there’s actually, as we speak, a lawsuit between the New York Times and OpenAI.
Paul: That’s right.
Rich: Because they slurped in all the New York Times articles.
Paul: It’s also coming out that, like Facebook—so what, there is an enormous database of books, you know, it has a bunch of different names.
Rich: Okay, okay.
Paul: I won’t even try to figure out what the current canonical name is, but, you know, you’ve got to get them through BitTorrent and other pseudo-legal ways.
Rich: Yup.
Paul: And of course academics and people—
Rich: Epubs and stuff.
Paul: A lot of stuff like that, a lot of academic stuff, a lot of research stuff.
Rich: Sure.
Paul: So academics tend to kind of, “Well, we want open knowledge and so on and so forth.” Well, you know, Facebook and maybe others have ingested this. They’ve ingested YouTube videos that maybe people didn’t opt into. And this is where the artists got really upset when things started making images.
Rich: Mmm hmm, mmm hmm.
Paul: You know, the generative stuff that made things in there, in there, kind of—
Rich: Okay, so let’s put it, let’s—
Paul: So that’s, that’s one. So the EU, and there’s sort of lawsuits, and there is a sense of like, “Hey, you all can’t just grab whatever you want and turn that into AI.”
Rich: Okay, so that’s bullet one. That’s regulation one.
Paul: Okay, so bullet two—
Rich: You must get permission from the copyright holder.
Paul: Yes.
Rich: For lack of a better term, for the moment, before you take in their copyrighted works for machine learning, for LLMs to digest. You can’t just take stuff off the internet. You can’t just walk in and take stuff off the internet.
Paul: And look, it’s tricky and fuzzy because, are you building a search engine? Well, you kind of can if you follow the rules of robots.txt.
Rich: Yeah.
Paul: So, you know, I mean, in general, Google—the Times likes when Google indexes their stuff.
Rich: Okay, so where do you stand on that? I haven’t thought it through—
Paul: Well, honestly, it’s a tricky one. To me, this is like—
Rich: It’s probably too late, by the way.
Paul: It is.
Rich: I think the monster ate, you know, Godzilla trounced the city already. And what they’ll probably have to do is pay some money, and Lord knows they have some money to throw back to, like, publishers. [laughing]
Paul: Here’s what’s real—well, no, they’ll never throw any money back to anybody. But here’s what’s real.
Rich: No, I mean, in the context of a lawsuit.
Paul: What’s really, really tricky here, and I don’t know what a judge does, because this is a set of norms and rules in our society that are really fluid. And people don’t like to talk about how fluid they are. But like, hey, if I want to go torrent, like a weird art movie right now that I can’t get anywhere else?
Rich: Yeah.
Paul: Pretty much everybody in my cohort who isn’t, like, associated with the recording industry would be like, “Yeah, man, go to. No big deal.”
Rich: Yeah.
Paul: If I want to go get a weird academic text that’s hard to find otherwise, and I can pirate it on the internet? No problem.
Rich: Yeah.
Paul: If I want to create a search engine of stuff that I can then go back and search.
Rich: Yup.
Paul: All good. I follow robots.txt. There’s all these search—
Rich: You’re allowed to make a search engine.
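[Editor’s note: The robots.txt convention Paul keeps invoking is a real, machine-readable agreement—a crawler fetches a site’s robots.txt and only indexes what the rules allow. A minimal sketch of that check using Python’s standard library, with a made-up rules file (the paths and crawler name here are hypothetical):]

```python
from urllib import robotparser

# Hypothetical robots.txt rules a publisher might serve. A polite crawler
# parses these before fetching anything else from the site.
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)  # in practice, rp.set_url(...) + rp.read() fetches the live file

# The crawler asks permission per URL; rules are matched in order.
print(rp.can_fetch("MyCrawler", "https://example.com/articles/story.html"))  # True
print(rp.can_fetch("MyCrawler", "https://example.com/private/notes.html"))   # False
```

This is the “unstated pact” Rich describes later: it governs indexing, but nothing in the convention itself says anything about training an LLM on what gets crawled.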
Paul: Right? So I think—and then, but here’s the thing that’s going to blow everything up, Rich. Everything compresses to, like, four gigabytes. Right? So I go out—
Rich: Well, I think there’s a broader issue.
Paul: No, but wait, wait, let me just make my point. It used to be that like, hey, you’d go make these search engines or whatever and it really required a lot of technical knowledge and, like, even the people doing the giant pirating of all the books? Like, that’s a pretty small cohort of human beings who can pull that off.
Rich: Yeah.
Paul: But I can make this sort of lossy version of all human knowledge and I can squeeze it into, like, something that fits on your phone?
Rich: Sure. Yeah.
Paul: So what’s different now is the access. Like, everybody can kind of have these things and they can generate as much content as anybody ever wanted and they can run in smaller and smaller devices. And so, like, this natural barrier of hey, I need to have a bunch of servers in a cloud in order to do this thing?
Rich: Has been eliminated, essentially.
Paul: It’s been eliminated. And so I think what you’re going to end up with are probably two sets of rules. One set of rules where OpenAI is spending hundreds of millions of dollars to license stuff and pull it in and make the next version of ChatGPT, and another where there’s, like, open LLMs that are kind of, you know, some are dodgy, some contain all the stolen stuff you like and so on.
Rich: I wouldn’t focus on where it ends up. I think the fundamental issue is this. The fundamental issue is there’s sort of this unstated pact with search engines which is, “Yes, you can index my stuff, but when you reference it, the click is back to me.”
Paul: And in general, where people get upset with the internet is when that transaction gets broken.
Rich: Yeah. And what you have with it—
Paul: Remember AMP pages, like, those—
Rich: Yeah.
Paul: Like, a faster mobile version?
Rich: That’s right. That’s right.
Paul: That got everybody upset.
Rich: And so what you have with AI now is, frankly, the utility of a hyperlink has sort of been eliminated and replaced with something far more convenient, which is like, “Here’s a paragraph for the answer you’re looking for.” And the truth is that got constructed organically through these AI engines, and there’s no click back. In fact, I think I’ve seen, I think OpenAI, or is it Claude, puts a reference back to the source in the answer. But that’s, that’s a, that’s a…
Paul: I mean…you know…
Rich: That’s tossing a little cracker back out to the world, you know?
Paul: You know what you see is like, you know what no one expected, right? It’s like, Stack Overflow is a good example. Stack Overflow published—because all their stuff came from the community—they published all the stuff back out as a database.
Rich: Yeah.
Paul: In an open license for all those questions and answers. Who would ever anticipate that the giant code goblin would come gobble all of it, and make you redundant in that way?
Rich: I’m going to give you another example, and I hate to say it, but the ship has sailed. And what’s going to happen is new market dynamics will take their place. And I’m going to give you a very useful example of how this happened about 25, 30 years ago. When Napster came out—
Paul: Mmm hmm.
Rich: I think two things happened. One, compression happened, because the internet wasn’t that fast yet.
Paul: No.
Rich: Some dudes in Germany came up with the MP3 standard.
Paul: In about two minutes you could get that club track on your—
Rich: You could get that club track.
Paul: Your Performa.
Rich: Napster shows up and essentially blows up the entire copyrighted music world. Right? Like, it was the beginning of the end.
Paul: You wouldn’t download a car.
Rich: You wouldn’t—yeah, exactly. And so what happened—
Paul: What we learned is absolutely everybody would download a car. Not a problem.
Rich: I would download a car.
Paul: In a minute.
Rich: Like a 3D printed car?
Paul: Absolutely. Let’s go.
Rich: So what happened? If you watch what happened there, right, there was an uproar, obviously. There was a lot of turmoil in the industry. It ate away at the industry in, like, probably three years. It just pummeled revenue to the industry. And then you would think, okay, we can pass laws. In fact, the laws existed. You can’t do this. You’re not supposed to take copyrighted work and put it in your pocket.
Paul: And then remember there was like, that $3,000 bounty that showed up.
Rich: Yeah, yeah.
Paul: And like, you know, kids in Idaho with little hats on were like, [sad teenager voice] “I can’t afford it. I just wanted a Blink-182 song.”
Rich: Exactly. And so, but how did the movie end? The movie ended with like, okay, well this technology is here now, and it was sort of barren wasteland for a while. And then who shows up? Streaming apps. And they’re like, you know what? That all got, that’s all scorched earth now. That’s behind us. How about I give you, like, a nickel every time you play a song and let’s see if we can get some artists on this platform.
Paul: Let’s be clear. The artists get a nickel. The labels get contracts and trips to Bermuda and…
Rich: Yeah, but even the, even the lab—yeah, I mean it, it has changed the economics and imposed a brutal new reality for artists, and frankly, labels as well.
Paul: Sure.
Rich: The shift in power to the streaming services is massive. What that means is this. Now let’s apply that to AI. What that means—
Paul: Meanwhile, while this is all going on, those laws are enacted. And so your ISP, which is now the regulatory enforcement body, if it sees, like, “Hey, you’ve been torrenting too much,” it sends you that email. And the email’s like, “Hey, you gotta calm down.”
Rich: No, no, but I’m not even talking—I’m talking about the appropriation of these works by new ways to deliver them.
Paul: No, but what I’m saying is like, there’s also a punishment. There’s a stick here, too.
Rich: Of course.
Paul: You might as well go over to the streaming service. It’s not that expensive and it beats getting yelled at and losing your internet.
Rich: It’s the new reality.
Paul: Yeah.
Rich: And what I’m implying here is that there is a new reality.
Paul: It doesn’t have exactly the shape of streaming. I don’t know what the shape is. Like, I don’t think anybody can tell you.
Rich: I can’t tell you either.
Paul: Yeah.
Rich: What I can tell you is this: New economic mechanisms are going to take hold that embrace this new reality. There’s going to be a middleman who manages fees back to publishers, like a DistroKid for content or some nonsense.
Paul: [sighing] I think this is real. The society we live in.
Rich: Regulation? It’s too late.
Paul: Well, it’s not just that. A lot of times we have the fantasy that, like, you know, Elizabeth Warren is going to call somebody in front of them and yell at them.
Rich: It’s too late.
Paul: But so many cats are out of the bag. And it’s not just that, like, you can download this stuff off the Internet and computers keep getting faster. Like, it is not the same as, you need to have a multi-terabyte archive of every label’s high-quality audio. It’s like, you can go online and download something in 15 minutes that you can then use to build your whole next thing forever.
Rich: Here’s why it’s too late. It’s too late because the consumer appeal, and I don’t mean consumer as in shopper, I mean the user appeal is too good. Back when Napster showed up, and I remember I had like a shitty MP3 CD player.
Paul: Yeah.
Rich: Essentially, it let me hold about 110 songs.
Paul: That was an ugly technology.
Rich: An ugly technology.
Paul: It used the CD as a hard drive and you put it, you filled it with MP3s.
Rich: But I had 100 songs in my pocket, right?
Paul: Yeah.
Rich: And the consumer appeal was so good that the wave was already over us. And what’s happening here is, the truth is, search sucks, and if I need an answer to something, AI’s ability to package it up into something I can take in without clicking a link is better. And because it’s better, the consumer momentum, the user momentum around it is going to dictate these new markets. It is not—there is no putting it back in.
Paul: I think, unfortunately you’re totally right. And on that—
Rich: As usual.
Paul: No, I just like, I would love to—my fantasy is always that, you know, I’d like to see Sam Altman explain what’s going on to the government and, and someone smart in the government to be like, “Hold on a minute.”
Rich: Think he’s gonna have to go to Europe. Right?
Paul: Yeah.
Rich: You know, Europe is funny—
Paul: Look, Europe can’t get its own—like, I’m using Mistral, which is the French AI. It’s just not as good.
Rich: It’s also a little ruder and kind of cold.
Paul: The service is bad and you know—
Rich: It’s not even that—
Paul: But the steak frites are delicious.
Rich: [laughing] Europe is always a step behind, and then they end up never getting a piece of the cake that comes out of all this innovation. They always end up being sort of the angry schoolmaster, and they mess it up every time.
Paul: Well, Europe’s number-one thing that it’s able to manufacture is bureaucracy. And I say that as someone who is like, really likes a nice Nordic lifestyle. But it’s just, like it’s, you know, they love to slow stuff down and sort of send it to Brussels and build a center for it.
Rich: Yeah, yeah.
Paul: I’ve watched European interpretation of web technology and unfortunately the output keeps slowing down and then their smartest people come get academic jobs in the United States. Over and over again.
Rich: Yeah.
Paul: Now I’m going to, I’m going to—let me switch, because I think there’s two—
Rich: So I think we touched on, like, a pretty big bullet point here.
Paul: Yeah, but there’s another aspect of this that’s also really important, which is I would argue that in many ways, good and bad, there’s a huge amount of what I would call, like, auto-regulation happening in AI companies. I’ll give you an example.
Rich: Self-regulation.
Paul: Well, well—
Rich: Is that what you mean?
Paul: Sure. DeepSeek comes out and everybody starts asking—it’s a Chinese-built LLM. Chinese company.
Rich: Yes.
Paul: Everyone starts asking it, “Hey, what about Tiananmen Square?” And it’s like, it literally goes, like, “Thinking!”
Rich: Yeah.
Paul: It’s just like, “Uh uh. Whoa, no, no, not for me, not for me.”
Rich: Right.
Paul: And there are things like that, like I asked, I don’t know, ChatGPT to blow something up the other day. It’s like, “No, I can’t do that. Can’t do that. Can’t make you a picture with an explosion.”
Rich: There are guardrails.
Paul: And it’s possible to have guardrails. So I think we actually—and this is going to be really, really complicated in the current environment—there are very, very complicated and adults-required kind of free speech issues and global trade issues that are going to be really connected to these. And I also see, like, you know, Elon Musk has Grok, which runs on Twitter, and Grok is, you know, the free-speech, anti-woke LLM, right?
Rich: Yeah.
Paul: And so like, and apparently it’ll say everything and it’ll even trash him and everybody—it’s just, sometimes I just want to, like, jump in a swimming pool and stay under it for a while. Just like, just for a little bit—
Rich: You need oxygen.
Paul: I know, but just for like a minute. Just to—
Rich: Oh, you could do that.
Paul: Three, four or five minutes.
Rich: That’s fine.
Paul: Just ten minutes. When the brain damage kicks in, I’ll come back up.
Rich: Okay.
Paul: I just need a little barrier.
Rich: Okay.
Paul: Anyway, regardless of that.
Rich: That was a really nice aside.
Paul: Yeah. You know. There’s too much news anyway.
Rich: I hear that.
Paul: So look, I think there’s that, right? And so I think you could—right now, the way things are going, you could have two things, or they could whiplash after two years or whatever. You could have the AI that consistently is told to say less about certain things. For instance, the AI that is like, “January 6th was a day of patriotism and nothing else.” Right? You can have that.
Rich: Yeah. Yeah.
Paul: Just as you can have the AI that is like, “Tiananmen Square. I don’t know what you’re talking about. I really care about the way that our government is supporting all the Chinese people.”
Rich: Yeah.
Paul: And so I think we have that, and we have that kind of unmonitored right now. Nobody knows how it’s happening, what the tools are, so on and so forth. So we actually have these engines of information and discovery where the thing that we all think of as free speech is going to be hashed out inside of these things. And I don’t think we’ve started to have that conversation yet as a society. Kind of because we can’t right now. It’s gotten a little crazy.
Rich: Yeah. I mean, I think these tools, a good way and a good lens to look at them, because it’s so new, is that they’re publishers. They are actually pushing information. They are actually shaping sentiment.
Paul: Yeah.
Rich: And that—anything that pushes information and can shape sentiment is a propaganda tool, if used in certain ways. And the truth is, these tools are very close to power. And there are going to be things imposed on them that are going to advocate for power, the status quo, and actually diminish sentiment that is against the status quo.
Paul: Let me give you an example.
Rich: That is real.
Paul: As far as I know, this has not happened yet. But I could easily see the current government of the United States going to ChatGPT and saying, “This is what you need to produce when people ask questions about trans rights.”
Rich: Sure.
Paul: Right? I could see that happening.
Rich: Sure.
Paul: That, to me is where we would need a regulatory framework. We would need an understanding of what’s been said, what’s been told.
Rich: My friend, you are in a fantasy world. The news—news—lies every day.
Paul: Sure.
Rich: And now you want to go fix the robot?
Paul: I’m actually—I’m not saying that, actually. I’m saying I want to—I don’t believe you can fix the robot. The robot doesn’t work that way.
Rich: Yeah.
Paul: What I’m saying is I want to know what people are telling the robot it’s allowed to say or not. And this actually ties back to the copyright stuff. And it ties back—I want to know what the robot gets fed and the rules that the robot follows.
Rich: I would love that, too. And I’d like to see that kind of mechanism, that kind of accountability be in place. I would love to see that. Whether we see it or not, I don’t know. I’d love to see it.
Paul: But I gotta tell you, that, to me, is the tech industry I fell in love with, which was, “Open source, and here’s the code, and here’s the pieces—”
Rich: Were you angry at Facebook when they were like, pushing out—
Paul: I’m always angry at Facebook.
Rich: Okay, but like, during elections, when they were pushing out, like, literally customized banner ads, they would let, that would like, get you all riled up so that you voted a certain way. Or like, foreign actors getting involved and actually managing sentiment. Spinning up new, like, fake news sources that people couldn’t distinguish, right? All the manipulation and games that go on.
Paul: That’s just going to get easier and easier—
Rich: What did we, what did we do about that back then?
Paul: Not anywhere near enough.
Rich: Not anywhere near enough. So now you’ve got this tool that’s essentially a chatbot. So when I ask it, you know, “What are the, you know, what’s the hottest periodic—hottest chemical element?” It’ll answer it. So I come to trust this tool. My kids trust the tool.
Paul: Sure.
Rich: They ask it questions all the time.
Paul: It listed all 50 states, no problem.
Rich: Exactly. And now you’re going to have an ability to ask it things that clearly could threaten the status quo, threaten power, like, power, essentially.
Paul: Sure.
Rich: And wealth, in this world, in this country, especially. And you want to have the right guardrails and bumpers on it.
Paul: Well, we have some. If you ask—
Rich: You could even do it with banner ads.
Paul: Go ahead. And I’m not going to do it right now because I don’t really want anybody coming to visit me. But like, if you went and said, “Hey, ChatGPT, which politicians should be assassinated?” It’s going to go, “I cannot answer that.”
Rich: Of course.
Paul: Right?
Rich: Of course. Look, I think there’s an important distinction to make here. And the truth is, information is extremely fluid and extremely malleable. There are exceptions to free speech, like by law, like, you can’t yell fire in a theater.
Paul: Sure, sure.
Rich: You can’t incite riots, et cetera, et cetera. So there are laws. Obscenity, like pornography. Like, there’s actually, like, you can’t say, “I’m free to show porn on the side of a building.” No, you’re not. There’s laws against that.
Paul: No, no, and there’s, there’s actually increasingly laws against, like, child pornography generation.
Rich: Of course.
Paul: Sure.
Rich: So there are laws around things that are just so damn black and white, that there’s no, you can’t even debate them. Now, where things get murky is, as you go down the spectrum, there’s so much savvy manipulation of information, because the truth is the control of that information as, like, an asymmetrical warfare tool—
Paul: Oh, man—
Rich: —is incredibly effective.
Paul: I gotta tell you, we’re just in for it, right? Because what’s going to happen, like, the FBI will generate a bunch of fake child porn to trap people—like, we’re just gonna end up in that kind of world, I think. And like, where just sort of, like, everything is being used to do… [trails off into weary noises]
Rich: Here’s the thing. Here’s where we’re at, my friend.
Paul: Let me, let me—
Rich: We zoomed out big time.
Paul: I need to go get a law degree. It’s a little bit of a drag. And we got to build a business on top of it.
Rich: Oh, I’m not worried about it. We’re not in that business.
Paul: Thank frickin’ God!
Rich: We’re in the getting-work-done business. That’s different.
Paul: That’s true.
Rich: We’re not worrying about other people’s content.
Paul: We’re trying to disrupt, destroy programming, not the government.
Rich: Correct, correct. And, and, and I think, look, I do think there are other mechanisms. Regulation is one. And I think regulation may materialize. Like, everything is always too late, right?
Paul: Well, I would—
Rich: I don’t know when seatbelts showed up. They showed up after a lot of people flew through the windshield. [laughing]
Paul: Regulation, I mean, you know, assuming—you do, I go back and forth—assuming there’s a government, regulation will show up and it will show up late.
Rich: Always, always.
Paul: I think that’s fair. But what I would say is there is already a ton of implicit regulation, and it’s not transparent in any way. And I actually think that that is leading to a lot of confusion and complexity. And it would be really good for our society if we could understand more about what’s going on.
Rich: I have two final thoughts.
Paul: Okay, good.
Rich: One is there is a bit of a movement right now to get young people off of their phones.
Paul: Sure.
Rich: And I think about that—people couch it in terms of mental health.
Paul: Jonathan Haidt. So there’s a whole…
Rich: You know, and people aren’t, you know, kids aren’t socializing, nobody’s going on dates, nobody’s, like, hanging out.
Paul: Yeah.
Rich: Because they’re sitting on their phones. I get that. But the other thing I think about, by the way, is you are effectively training young people to become news-consumption adults who are only able to digest small little bites of, like, incendiary information. Like, if it’s not exciting, you just sort of move on. So that’s a positive step; it’s couched as mental health, but I actually think for us to be, like, good citizens in our world, getting off the phone is a good thing. The second thing I would bring up, and this has historically been the case, is that there are external actors that effectively serve as quasi-regulators. Lawsuits. Fox News lost almost a billion dollars when Dominion settled that suit.
Paul: Sure.
Rich: Right?
Paul: Over the voting machine…
Rich: Over the voting machines.
Paul: The accusations about the voting machines were fake.
Rich: Court cases. Organizations like the ACLU. That are out there effectively making sure that the higher-level guardrails, mostly the Constitution, are somehow imposed on these incredibly powerful tools. Is it perfect? Is it messy? Yes, it’s messy. Is it perfect? No, it’s not perfect. But these are mechanisms that, in the past, actually, have served us really well, just to sort of calm us down. Because we don’t have the laws yet, right? So lawsuits? Someone very powerful could get caught in some misinformation, and they’re like, “You know what? The hell with it. I have money and I’m going to bring you down.” That’s effectively what happened with Dominion.
Paul: You’re a lawyer. Do you think it’s time to be suing OpenAI? Do you think that that would be the right way forward for our culture in order to get people to—
Rich: I don’t think any individual. You could.
Paul: I mean, I mean, sure, anyone can sue anyone. It’s one of our sacred rights.
Rich: Cellino & Barnes. Yeah, you can do that. No, but I do think that government bodies—I could see states suing it. I could see countries suing it. I could see organizations that defend certain liberties and have certain causes, nonprofits, suing it. Like, you have mechanisms like that. Is that enough? I’m not really going to get into that. It wasn’t enough for Facebook. The reason I’m suspicious about all of this, by the way—but we somehow still end up, like, landing on our feet—is nobody did anything about Facebook. It just kind of went by. He fired Sheryl Sandberg. I think he probably fired her because she was getting on his nerves as much as anything else. Like, I don’t think it was just about, “Oh, you fail—” He probably threw her under the bus as a symbolic gesture.
Paul: You’re right. And everything worked out for the best.
Rich: Yeah, exactly.
Paul: No, it did not!
Rich: No, no, no, that’s not what I’m saying. I’m saying it didn’t. And nobody did anything. So now you want to fix this thing? How about we go fix the last thing?
Paul: Oh, no, no, we’re not gonna, we’re not gonna fix anything. Yeah, okay.
Rich: No, but, but, but the thing is, centers of power have always been there, right? Like, it took, what, 40 years before tobacco was finally dragged in front of Congress. Like, “It’s killing people, right?” They’re like, “Yeah, I guess it is.” Millions of people had died already. And now tobacco is some other company, because they don’t want to be known as tobacco.
Paul: Oh, yeah. Altria. It has that really nice logo.
Rich: Altria. Very nice logo.
Paul: It is beautiful.
Rich: All of these things come bearing down on us all the time. And now AI is here as if it’s special. I’m not even sure how special—Facebook to me, by the way, in terms of its ability to blanket across the whole population? To me is much more pervasive. Oh, AI may get there, but nowhere near…
Paul: AI’s not going to get there, because it’s too decentralized.
Rich: It’s too decentralized.
Paul: I think that’s right. I think the damage that AI will do, along with the good that it will do—we’re mostly focusing on damage for this one, and I think that’s okay—but the good it will do is because of that decentralization. I don’t care who the big winner is.
Rich: I agree with that, and I think that’s a positive thing.
Paul: I think overall it will be—I think a lot of this will be kind of a footnote, because this stuff will be infrastructure built into every operating system rather than a couple Facebook-style players.
Rich: We’re seeing it.
Paul: Yeah.
Rich: We’re seeing the commoditization and we’re seeing it, like, it’s hard to distinguish these various LLMs.
Paul: The greater danger is actually AI plus the engine of Facebook without mature oversight saying, “Hey, maybe that’s bad for humanity.”
Rich: Exactly.
Paul: And we’re seeing—you see what Twitter has become, and now we’re going to add, like, Grok just spewing whatever it wants into that sort of pit, right?
Rich: Yeah.
Paul: And it’s like, that is a particular version of society that I don’t think anybody’s actually really that into.
Rich: No, I don’t think so either. And I feel like a lot of this—
Paul: Nobody wants to be screeched at or shown boobies by a robot all day long.
Rich: [laughing] I agree with that. Do you think that—and I don’t know this, and I don’t have any statistics—do you think Facebook’s influence has waned?
Paul: No, I doubt it.
Rich: Okay, so you think it’s just still sitting there humming along.
Paul: You gotta tie Instagram into that. Right? Like, it’s just sort of, you just have an unbelievable power and unbelievable opacity into how that power operates.
Rich: Yup.
Paul: I will go with you on this. I actually think if we were to, if we were to classify the threats, like, Musk is fascinating because he’s such an obvious idiot and he’s wrecked everything, in a way, like you literally know how many babies he’s having, because he can’t get anything under direct control, because he’s just chaos.
Rich: Yeah.
Paul: Zuckerberg’s actually, if you were to have, like, a whole big palace coup kind of situation based on money and control, Zuckerberg would be the one who would come out in the end, and be like the Octavian to Caesar. He’d be like, “I’ll take over here.”
Rich: I think you’re right.
Paul: And he would have 3 billion people who he could say, “Hey, we got to do this. It’s really important. I have acquired a large drone army.”
Rich: I think you’re right.
Paul: And I think that is the one to watch, while we watch all the other stuff.
Rich: Well, it’s what? It’s what? It’s, it’s—my mom doesn’t use AI. My mom goes on Facebook.
Paul: That’s right.
Rich: Right? And my sister goes on Instagram. And she uses AI to, like, answer a question. That’s about it. Like, I mean, AI is out there and it’s gonna—what it can generate, coupled, you just said it earlier. Coupled with the dissemination engine of Facebook and Instagram is incredible, right?
Paul: If they decide to really hit the big red button on that. Like, you can’t, like, yeah, go ahead, put Bernie Sanders in, like, a robot outfit and give him a bunch of—
Rich: Totally.
Paul: What is going to stop that?
Rich: Well, nothing’s going to stop that. Right? And other things—I said earlier, there are things that kick in that do stop it. Whether it’s enough, I don’t know.
Paul: It’s hard to sue a global pseudo-state that exists only on a network.
Rich: True.
Paul: Yeah. Anyway, good times. Having a good time. And we’re going to keep building our product built on AI.
Rich: Yeah, I mean, it’s hard to pitch our company at this very moment. [laughter]
Paul: Nah, look—
Rich: We use AI to build software. We don’t use AI to disseminate misinformation. [laughing]
Paul: It’s the most—unless code is misinformation, but, like, it’s the most interesting technology that has showed up since the web.
Rich: Yes. And we’re kind of on it, and we think we’re onto something really interesting. And in essence, we want to help you ship software and tools that you never thought you could get.
Paul: I’m going to make it really simple. Like, all this stuff, all this drama, all this excitement, we want to do the boring parts and get them out of your way so that you can just manage your stuff.
Rich: Feeling productive and helping people be productive is good medicine.
Paul: I like to do the boring parts for people.
Rich: It keeps us sane.
Paul: Anyway, if you need anything, get in touch: Hello@aboard.com.
Rich: [announcer voice] This is Meet the Press. [laughing]
Paul: Whew! I like these conversations. All right, and I’m sure. I’m sure everyone agrees with us and really feels that we covered it in depth and is very glad that we’ve expressed our opinions.
Rich: I can’t tell if you’re being ironic.
Paul: [laughing] No, no—
Rich: I thought it was a good discussion.
Paul: I’m sure—
Rich: It’s a better discussion than many I’ve read recently.
Paul: I’m sure somebody out there is smashing their phone with a ball-peen hammer, screaming at us.
Rich: Take care of each other. Take care of yourselves. Have a wonderful week.
Paul: One day at a time!
[outro music]