What’s the Opposite of AI Slop?

November 12, 2024  ·  29 min 33 sec

In the wake of the 2024 U.S. presidential election, Paul and Rich look towards the future with an AI lens—especially with the incoming Trump administration unlikely to put any regulatory guardrails on this rapidly evolving technology. What can AI do for people in our deeply fractured state? Are we doomed to poison the information environment forever, or could we use it to start building things that help people make sense of the world?

Transcript

Paul Ford: Hello, I’m Paul Ford.

Rich Ziade: And I’m Rich Ziade.

Paul: And you’re listening to Reqless, R-E-Q-L-E-S-S, the podcast about how AI is changing the world of software.

[intro music]

Paul: Richard, who is the marketing force behind this amazing podcast?

Rich: It’s Aboard.

Paul: Aboard? What’s that?

Rich: Aboard is a rapid-development AI-powered custom software platform. Here’s how it works, man. You just take a week, talk through what you need, you just say what you need, you talk it out with an Aboard solution specialist. They spin up credible, fully baked software for your company or organization, and then you tweak it, you get it over the line.

Paul: Where do I go if I want to learn more?

Rich: Aboard.com.

Paul: That sounds like an amazing website that took a while to build.

Rich: Yes.

Paul: All right, so since we last spoke on the podcast…

Rich: We… Election just happened.

Paul: Yeah.

Rich: Congrats, everyone. [broken laughter] Or condolences, everyone. Depending on where you sit at the table, or not at the table.

Paul: Yeah.

Rich: Whatever you like.

Paul: I’m going to bet most of our audience was not feeling it.

Rich: Well, I mean, look—

Paul: Some are, some are like, “Hey, listen, he’s got good policies.” And I’m going, I’m going to bet most are just like, “Oh Jesus, here we go again.”

Rich: Different people have different feelings about different things.

Paul: Yes. That’s true.

Rich: There’s wars in the world, Paul.

Paul: Ay ay ay.

Rich: Some people don’t like taxes. Some people really are into taxes.

Paul: Love taxes, yeah.

Rich: God bless. What I want to talk about is kind of the implications of where things are going vis-à-vis technology, and…I find California fascinating.

Paul: Okay.

Rich: And I’ll tell you why.

Paul: Why?

Rich: California is a highly regulated state. The tax burden—

Paul: Drives Silicon Valley insane.

Rich: It drives—but yet, Silicon Valley is there.

Paul: Yeah. Oh, no, no. And they’re convinced that they’re there in spite of California.

Rich: Yeah. And it’s a fascinating thing, because they’re like, we know what we’re gonna go to, we’re going to—what’s that state that, like, there’s, all the VCs are going to now, because there’s a lot of land. Oh, Wyoming. Jackson Hole.

Paul: No, there’s, there’s that, and then they were going to go to Austin for a while and then actually—

Rich: They did go to Austin for a while.

Paul: —and then Miami, but Miami didn’t, it’s kind of not working out. Also it’s hard to, like, walk down the street because there’s three inches of water at all times.

Rich: Miami’s problematic for a host of reasons.

Paul: Yeah. [laughing]

Rich: Austin, I’ve heard there’s—

Paul: Swordfish, though. You get a good, just a…yeah.

Rich: I mean I’ve been to, I’ve been to some goofy-looking restaurants in Miami.

Paul: It is, it’s a lot of purple and neon, but the fish is great.

Rich: The fish is delicious.

Paul: If you’re pescatarian—

Rich: It’s a lot of good food.

Paul: Yeah.

Rich: It is the same lounge tracks playing in every single restaurant.

Paul: Oh yeah.

Rich: It’s like this Buddha Bar garbage pile.

Paul: [beatboxing, in a chill way]

Rich: Yeah. It’s like down-tempo. Anyway, off-topic. There’s been a pullback from Austin actually, as well.

Paul: Sure.

Rich: Austin was booming at one point. Property was—

Paul: Back to the Valley.

Rich: It’s a fascinating thing, because, my God, tipping my hat, that little corner—

Paul: Yeah?

Rich: —of California.

Paul: Yeah?

Rich: Has been just an absolutely nuclear source of value creation and innovation. And really, if you’ve been to the Valley, it’s actually kind of sleepy.

Paul: Yeah.

Rich: It’s weirdly calm. It’s very Northern California.

Paul: A lot of taco restaurants and low-slung buildings.

Rich: It’s just a lot of low-slung buildings. And yet what has come out of the Valley has been, its impact on the world, good and bad.

Paul: Yeah.

Rich: Has been absolutely massive.

Paul: You want to know my hypothesis of the Valley? It’s real simple.

Rich: Yeah.

Paul: When you go out there and you’re like, all right, I’m going to go to Stanford Research or I’m going to go to PARC, I’m going to go to, you know, Xerox PARC. And then you’re like, all right, I’ll go from here, I’ll go to Apple. I just want to get the geography in my head. It’s not a long drive, it’s very pretty. But you start to realize, like, every transition, everything that everybody did involved getting into a car and going somewhere.

Rich: Uh huh.

Paul: And I always think of the Valley like this: the fact that the windowing interface got big and all that is literally because I think people got into their cars, looked at dashboards, and went, “Oh!”

Rich: You think, you think the ride.

Paul: I think the ride.

Rich: To work.

Paul: I think—

Rich: Was incredibly influential.

Paul: I think driving between companies is where—with people in the car with you, like, talking, doing stuff.

Rich: Yeah.

Paul: Is what made the Valley. That’s my, it’s not a serious hypothesis, but I think about that whenever I’m out there. All right, all right. So?

Rich: And so part of me loves to see that innovation.

Paul: Sure.

Rich: I do believe that it is, it is a quintessentially American thing. There is no Silicon Valley of anywhere else.

Paul: No. They’re always trying to make one.

Rich: They’re always trying to make one.

Paul: Something Valley. They’re also trying to make other valleys around the United States. There’s always like, a, you know, Electronics Gulch, here in Kentucky.

Rich: Yeah, exactly. And look, there are parts of the world, there’s parts of countries in Europe that like to cultivate innovation in pharma or innovation in other places. But there’s nothing like the Valley on the earth.

Paul: No, everybody knows it.

Rich: And I think that is not, that is not because of the political or sort of economic environment. Because actually California is insanely expensive.

Paul: Very expensive.

Rich: To live in. Northern California is incredibly expensive to live in. The tax burden is huge. It’s notoriously blue. It’s a very, very—it’s like a rock-solid blue state.

Paul: Yes.

Rich: It has its pockets, but it’s a rock-solid blue state. And yet this explosion of innovation happens. And what happens is once it explodes, it explodes out of California.

Paul: Right.

Rich: It has an impact on the whole country.

Paul: Okay.

Rich: And now we are seeing a new administration come in.

Paul: Uh huh.

Rich: The Valley took a real vested interest in getting a right-leaning, anti-government, deregulatory environment going so that as they see it, American innovation can continue to thrive.

Paul: Well, let’s break that down. Like, an enormous number of powerful VC types, Ron Conway, et cetera, are not—or Reid Hoffman—were very pro-Harris, very pro-Democrat, funded all that stuff.

Rich: Yeah.

Paul: But—

Rich: There is an ilk.

Paul: There is a, there was enormous energy brought by Elon Musk, obviously, Peter Thiel, and kind of a host of other people, and people who tend to be a little more aligned around, like, crypto, and aligned around sort of like very libertarian kind of stuff. So to the point that you could say that Elon Musk, with his, he invested $130 million into electing Donald Trump, that he is a linchpin in this election. And JD Vance is Peter Thiel’s protégé. So…

Rich: Something like that.

Paul: So there is a strong argument to be made—Marc Andreessen invested heavily. Ben Horowitz did and then didn’t. But anyway, regardless, there’s a very strong argument to be made, we talked to somebody recently and we were like scheduling an event, and I was like, “Ugh, boy, it’s a little too close to the election. I wonder how it’s going to feel to people.”

Rich: Yeah.

Paul: And—

Rich: We’re in New York City.

Paul: We’re in New York City.

Rich: The other bit here.

Paul: And we’re in a pocket of a pocket in New York City. And he went, “I don’t know. Everybody seems okay with everything out here, man.” And it was this moment, because in New York, everybody was just panicking nonstop, and realizing there’s some guy, like kind of, you know, got a nice car, driving around, talking to his friends, and they’re like, “Eh, whatever happens, we’ll figure it out.”

Rich: Yeah.

Paul: Yeah. So, so, but yes, very real. A sort of libertarian, accelerationist, AI, cognitive, bitcoin, crypto-everything community was very happy when Trump won.

Rich: Let me ask you a couple of questions, then. We are always talking about how this is the real deal. This isn’t crypto.

Paul: AI.

Rich: This isn’t some flash in the pan. AI.

Paul: Well, let’s be real specific. LLM-based technologies that generate things, not necessarily the text or the images, but code and sort of, like, really functional documents that are able to accomplish certain tasks. Like, a lot of the stuff that got big in public, I don’t know if I fully buy that that’s as transformative as everybody made it to be. But this technology is transformative.

Rich: It’s transformative.

Paul: Yes.

Rich: Okay, so do you think it’s also potentially dangerous?

Paul: Yes.

Rich: Okay. Do you feel like… I think when I think about regulation, I think about two things. One is chill on the regulation, otherwise we can’t invent stuff.

Paul: Yes. And that is something, I mean, Europe is very proactively regulatory.

Rich: Yes.

Paul: And it is actually, like, it’s a bad place—doing a startup, like, in France is just not great.

Rich: It’s nearly impossible.

Paul: Yeah.

Rich: It’s very hard. And I don’t, I don’t know all the details and maybe some French startups are crushing it.

Paul: No, there are, there are plenty that are really great, but, no, no, but it’s like a famously difficult regulatory environment. Like, labor laws are really specific and—

Rich: Privacy laws are very strong.

Paul: And so there’s, like, just a lot of threshold—you know another wacky environment to start a business in? New York City. We don’t ever even talk about this.

Rich: Yeah.

Paul: But it’s, like, you get fleeced into—it’s less about what you do and more just that there is a middleman in the form of everything.

Rich: Yes.

Paul: Like, renting an office is an absolute nightmare here. And taxes are extremely, extremely high.

Rich: Extremely high. The legacy of classic New York banking and media?

Paul: Yes.

Rich: Still dominates the mindset in New York. And look, the truth is, a lot of people do want to be here. You can create incredible value in New York City if you’re willing to stomach it.

Paul: It does drive you to a brutal sales culture at your software company. Right? Because you actually have, you have more to make up than somebody in Kansas does.

Rich: Absolutely.

Paul: Yeah. Okay, so you were saying about, like, AI, is it dangerous? Regulatory?

Rich: I guess, I don’t, I don’t know how many times we can step in the same shit.

Paul: You mean by electing Donald Trump twice?

Rich: No, by essentially—

Paul: Oh God…

Rich: —letting like, wow, look at the aura and glow coming off of that new technology invention we just let out into the wild. And then you let it out into the forest and it comes back a giant fire-breathing monster.

Paul: Well, I think this is a really—

Rich: Well, let me clarify.

Paul: All right.

Rich: I went too deep into metaphor land. Social media, phones and kids. Like, these are things where it was, wow, this is awesome. And then, like, people’s brains are melting, people have body image issues, governments are manipulating other governments and other elections. Misinformation, disinformation, the list goes on. So am I naive to think that, hey, there’s an opportunity to get ahead of it here and do better? Because I do still want people to innovate.

Paul: Let me riff for a minute, because I’ve thought a lot about this. So the hypothetical fantasy is that a new technology will emerge and a group of elders who are serious and thoughtful will review the technology and say, here are the risks and here are the benefits. Let us culturally define each one of those, discuss it, reach consensus, and then decide which parts to release.

Rich: Yeah.

Paul: And that is the academic fantasy of danger—of dealing with danger.

Rich: Yeah.

Paul: The way that a capitalist society does that is it goes, let’s test it.

Rich: Yeah.

Paul: And so you end up with situations like the FDA, rest in peace, where you have—now run by RFK, nominally, but—but where you go, “This drug is amazing and it will make people sweat 30% less.”

Rich: Yeah.

Paul: And this is going to be really powerful and great. And they go, “Great. I’m going to need five years.”

Rich: Right. It’s a loooong process.

Paul: But you ever heard of thalidomide? Thalidomide helped with morning sickness for women, but it led to a lot of malformed limbs in babies. They were born with, like, parts of their arms as kind of stumps. And so they were known as thalidomide babies. And that didn’t happen in the U.S. It happened in Europe and other places, because the FDA was like, “We don’t know what to make of this drug.”

Rich: Yeah.

Paul: Right? And I’m sure I’m getting details there wrong. But those regulatory systems are really important. And what’s tricky with AI is that it enters the information environment. If AI is creating drugs in a lab, then there are ways for the FDA to lean in on that.

Rich: Yeah.

Paul: Right? But AI is creating information in a free speech-driven society.

Rich: Yeah.

Paul: And so it’s very, very hard to regulate. Okay, so putting all that aside, right? So I think, like—

Rich: Let—

Paul: But wait, no, no, no, then what happened is there was public outcry about generative images tending toward, you know, if you say “CEO”, it makes a white person, and if you say “criminal”, it puts in, like, a Hispanic person. That happened for a while. And so they started to change the prompts and restructure stuff.

Rich: Mmm hmm.

Paul: What was happening is it was soaking up the web and all this information and all those biases are built into the information.

Rich: Mmm hmm.

Paul: So it was just spitting that back out. It’s just a dumb, you know, Mad Lib robot. But that got people really, really angry, and so they started to tweak and adapt and sort of prepare for that.

Rich: Not because a law was passed.

Paul: No, because they felt, everybody felt bad and weird or they didn’t want to get yelled at anymore. Okay, so, so, so the markets are—

Rich: That’s an argument for, like, self-regulating markets.

Paul: Yeah. So it, it kind of worked because what happens is you just don’t hear about it that much anymore.

Rich: Yeah.

Paul: You just—

Rich: Well, they took care of it.

Paul: They did the best they could.

Rich: Yeah.

Paul: And they’re still biased and so on. But everybody’s like, no—what happened, I think, is that these big AI companies went, “You know, that does feel bad. We don’t like it. So we’re not going to do it anymore. We’re going to, we’re going to try to keep you from doing it. You can still do it. You know, no naked pictures and things like that.”

Rich: Yeah.

Paul: Now we are entering an administration that is probably the most anti-regulation of any—except around abortion—of any we’ve ever seen in our life.

Rich: Well, we don’t know the administration yet.

Paul: But you’ve got to assume a little bit. They do not want to—whether Elon Musk is the puppet master of Trump or vice versa.

Rich: Yeah.

Paul: They do not want a lot of laws regulating AI.

Rich: No, they don’t.

Paul: I think we are basically on our own and no one is coming to help us for the next waves of this technology. I don’t believe that Sam Altman, even though he has been out there saying, like, “Let’s regulate it, friends,” is really excited to go in front of that congressional hearing yet again. I think people are going to be really happy that they have the F1 track in front of them and they can just put the foot on the gas.

Rich: Yeah, I gotta be honest, I don’t think it would have mattered.

Paul: Probably not.

Rich: Because I think if history is any indicator, we are always too late. Historically, we watched the Obama administration be utterly confused about what was happening.

Paul: We blew up two nuclear weapons in Japan and then we were like, “We should stop doing this.”

Rich: Yeah, exactly, exactly. And so historically, we were never preemptive; it was always a reaction to unforeseen consequences of tech. Always.

Paul: Yes.

Rich: Democrat, left, right-leaning government. It’s never been the case. Now, can it be worse, is the question. Because I do, I agree with you. There will, there—whatever attempt there will be to sort of rein it in is going to get pushed aside for now. There’s no way around it. Can it be worse?

Paul: Okay, so a) yes, absolutely. If one thing we’ve learned over the last, like, let’s say 20 years.

Rich: [laughing] You can always do it worse?

Paul: It can actually get worse. I mean, we had September 11th and then we had Trump 1, and now we’re gonna have Trump 2. And this one looks like a bad scene. Like, who knows? Everybody’s pretty incompetent, but RFK running the FDA? Not a great scene.

Rich: That’s a different podcast. We will put it in the links below. That’s a whole—

Paul: Yeah, if you’re upset by that, I’m really sorry. Best of luck to you. But let’s do two things. Let’s make the negative catastrophic case and then I’m gonna make a positive case. I’m gonna throw it back to you. What’s the worst that could happen? AI is completely unregulated. Everybody has access to it and they’re gonna do stuff with it. What are they gonna do that’s so bad?

Rich: There is no oversight and no consequences, so the midterm elections in two years could be utterly flooded with, like, stuff where we just can’t distinguish fake from real.

Paul: And I think in two years we really will be there. Like, you’ll be able to say—

Rich: Not only will we be there, but it will be proliferating in massive ways. Like there will be—

Paul: “Make a post of AOC eating a baby.”

Rich: Yeah.

Paul: Right? Like, just show—

Rich: All of it.

Paul: Just killing kids.

Rich: And I think it’ll be, I mean, you said that as, you know, in a, in a satirical way. It could be something incredibly subtle.

Paul: Yes.

Rich: It’ll be like overheard at the 7-Eleven, she said this. And it’ll look like it was a snuck-in phone cam.

Paul: At a velocity—like, right now, you can reply and say, “You are a large language model” to the bots. But we’re going to be at a velocity, with a level of skill, that I think is real. The misinformation will be absolutely vast.

Rich: That’s the thing that I worry about the most.

Paul: Okay. That is—

Rich: Because I’ll tell you why. Because the appetite for it is clearly there.

Paul: It’s happening now.

Rich: How do we know it’s there? Because the willingness to manipulate is so strong, because it really works.

Paul: Well, let’s be, I want to be really clear about one thing too, which is the—there’s an assumption that, you know, OpenAI or Anthropic are going to put guardrails on. But the reality is these technologies fit on a thumb drive and you can download and you can hack around.

Rich: 100 percent.

Paul: So there is no—like I said, no one is coming to save us.

Rich: I think it’s too soon for robots. Let me get that. Oh my God, there’s, the robot cop is coming.

Paul: No, I don’t think we’re going to be there in two years. So let’s say two years from now, which I think is a useful window because it’s moving unbelievably quickly. Here is what I’ve been thinking about and I’m actually, I’m thinking about this in my way, which is little projects. Right? I saw, I was kind of expecting the outcome that happened in the election. And I’ve been thinking like how to engage. And one of the things I’m going to start with, and this will sound really weird: I’m going to find a way to read links using AI and make little summaries and pull out some statistics and use that as my research assistant for helping write the Aboard newsletter.

Rich: Okay.

Paul: Okay. How is that a counter to misinformation? Well, there are people who are starting to use AI to see if they can flag and tag misinformation.

Rich: Yeah.

Paul: Because it can parse and be like, “Ooh, that doesn’t really—” You can train it to do certain things. So you could start looking at the social feeds and start automatically flagging them. I don’t think you’re going to get that through with X, but I actually think X is going to circle the drain because once everybody is politically monolithic, nobody wants to be there anymore.
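The flag-and-tag idea Paul describes here, scoring posts from a feed and automatically marking suspect ones, can be sketched in miniature. Everything in this sketch is an illustrative assumption: the `Post` shape, the marker phrases, and the stub `misinfo_score`, which stands in for a real trained classifier or LLM call.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    tags: list = field(default_factory=list)

# Illustrative marker phrases; a real system would use a trained
# classifier or an LLM call instead of substring matching.
MARKERS = ["overheard at", "they don't want you to know", "leaked"]

def misinfo_score(text: str) -> float:
    """Crude stand-in for a model: count marker phrases, cap the score at 1.0."""
    hits = sum(1 for marker in MARKERS if marker in text.lower())
    return min(1.0, hits / 2)

def flag_feed(posts, threshold=0.5):
    """Tag posts whose score crosses the threshold; return the flagged ones."""
    flagged = []
    for post in posts:
        if misinfo_score(post.text) >= threshold:
            post.tags.append("possible-misinformation")
            flagged.append(post)
    return flagged
```

With a real model behind `misinfo_score`, the same loop could run over a social feed and surface posts for human review rather than auto-deletion.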

Rich: It’s not as interesting.

Paul: People like fight.

Rich: Yeah.

Paul: Right. There’s not a lot of fight on X right now. Everybody’s just like, to hell with it.

Rich: Yeah, yeah.

Paul: You know, it’s going to read the links, it’s going to make me some summaries. I’m learning how to do that.

Rich: Okay.

Paul: So what comes next? Well, you know, there’s a lot of conversation right now about how the information environment is toxic and what are we going to do about that? What are we going to do about social media? I would say that one, social media is, I almost think, the wrong vector. I think people are getting done. Like I think Instagram will remain really hot. And I think TikTok will come and go or get outlawed or whatever. But you know, the power move in all of this is email newsletters.

Rich: Hmm.

Paul: And when I look at email newsletters—and people are like, the Democrats have to make their own whatever, their own social network or their own Fox News or whatever. Okay? You’re not going to do that in two years.

Rich: Hmm.

Paul: But what you can do is you can send like literally 100 million people some good email newsletters every morning. Don’t saturate their brain. Tell them to ignore all the other stuff and live their lives. But I’ll give you just what you need to know.

Rich: Okay.

Paul: Pretty cheap. And it’s kind of cut off from the drama. You can’t get into a lot of circular stuff.

Rich: There’s no way to scream at each other.

Paul: So now what I could do is start using AI to gather news and indicators by looking at lots of different feeds. Track them, put them into a little bit of a dashboard, and that can become my newsletter. Like, I can start to have custom experiences for people on a per-state basis and so on. I can aggregate news, I can summarize, I can visualize, I can do all kinds of stuff, if I think about the AI-enabled newsroom.
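The AI-enabled newsroom loop Paul describes, gathering items from feeds, summarizing each one, and assembling a per-state digest, can be sketched minimally. All the names here are illustrative assumptions, and `summarize` is a stub where a real LLM call would go.

```python
from dataclasses import dataclass

@dataclass
class Item:
    source: str  # the feed the item came from
    title: str
    body: str

def summarize(text: str, max_words: int = 12) -> str:
    """Stand-in for an LLM summarization call; truncates so the
    pipeline runs end to end without a model."""
    words = text.split()
    out = " ".join(words[:max_words])
    return out + ("…" if len(words) > max_words else "")

def build_digest(items, state=None):
    """Group items by source and emit a plain-text newsletter body."""
    lines = ["Daily digest" + (f" ({state})" if state else "")]
    by_source = {}
    for item in items:
        by_source.setdefault(item.source, []).append(item)
    for source in sorted(by_source):
        lines.append(f"\n## {source}")
        for item in by_source[source]:
            lines.append(f"- {item.title}: {summarize(item.body)}")
    return "\n".join(lines)
```

Swapping the stub for an actual summarization call, and the item list for parsed feeds, is the kind of per-state aggregate-summarize-visualize pipeline being described.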

What I’m saying is there is this very organic way to catastrophize and go, like, I will just feed people the racism and blood libel that they love so that they can stew in hate. And I think we know for a fact that that works great. And actually, right now that vibe is ascendant on the right wing, but it works real good for the left, too. Like everybody loves to be fed prior-assuming rage fuel about how their tribe is being attacked by the other tribe.

Rich: Yeah.

Paul: Okay? So—

Rich: It’s very, it’s very primal.

Paul: Yes, that’s right. And so I think you need to kind of, you need to own that and you need to speak to it and you need to be part of it. But so AI just seems like such a natural vector. Make more of that, right? But you really can go in another direction and you can say—

Rich: Why would you?

Paul: Why would you? Because you are dedicated to improving the information commons and you think that if you could accelerate that, not just get a bunch of people writing newsletters, there have been a million newsrooms that have been started. Right? I don’t think we even need more news. I think we just need better tools and pipelines and fun pictures and more cartoons.

Rich: [laughing] Yeah.

Paul: And, like, aggregate, analyze, dashboard, feed it out on a daily basis, get all those wine moms together and just start, like, driving action. You could incorporate and accelerate a healthier information environment by using these tools. People flame out on hate. It’s exhausting. Hate is exhausting. Knowledge and learning is actually energizing, even for people who you think—

Rich: Yeah. I would argue that hate and anger is more addictive.

Paul: It is, but it’s exhausting. There are certain dads who are watching—there’s a comedian, Shane Gillis, who describes his father watching Fox News screaming at Nancy Pelosi.

Rich: Yeah.

Paul: And just like, it’s unavoid—like, you’re never gonna get that guy back.

Rich: No. That ship has sailed.

Paul: There is no way to sit him down and be like, “Let’s vote for a progressive Board of Ed strategy.”

Rich: Yeah, yeah.

Paul: Right? You’re never going to get him back.

Rich: No.

Paul: But there’s a lot of people who just, like, need an information environment that isn’t toxic. Like, the polls made no sense. The Times is running this sort of, like, kind of engagement bait on the top of the page. Fox is just screaming, like…

Rich: Everything’s all-caps.

Paul: And you just kind of can’t—I’ll tell you, I am a media consumer. I’m as skilled a media consumer as exists on Earth. And I was overwhelmed and confused by this environment in every way.

Rich: Sure. It was overwhelming.

Paul: And it actually turned out that much of what I was fed, much of it, opinion, fact, polling, statistics, was fundamentally wrong, was delivered as absolute fact, and turned out to be complete nonsense, and the biggest political story of the last four years was completely missed, which is the rightward shift of the entire country at multiple points.

Rich: Yeah.

Paul: So, like, I’m like, what, what are we going to, like, let’s just fricking rebuild that environment because whatever they’re doing isn’t working.

Rich: Yeah. I think you’re saying two things here. Let me button it up.

Paul: Okay.

Rich: I’m gonna give it a go. I think, first off, I think what you’re suggesting, which is there are amazing tools in people’s hands right now. Go be productive, not destructive.

Paul: And they’re tools for building tools.

Rich: And they’re tools for building tools. But go be productive because I actually think you’ll feel better, you’ll be mentally healthier. Like, I think there is that. So let’s forget about the rest of the world and worry about you for one second here.

Paul: Yeah.

Rich: And I think if you’re feeling productive, there is nothing that is more satisfying than feeling like you got something done.

Paul: I mean, just think about it. If your real issue in life is LGBTQIA rights, you could build now, in 2024, a gay-rights observatory that goes out, pulls in news and summarizes it on an hourly basis.

Rich: Knock yourself out.

Paul: And you could share that with as many people as you want on the web or via push notifications or things like that.

Rich: Yeah.

Paul: And that could be, if that is the issue that you think is going to bring people together in a community and get them to take action, you can build that framework and you can start that community, and you and a couple of people can get it going. And that used to be six, eight months of work just to, like, put those pieces together. We’re getting to days now. We’re getting to hours.

Rich: Yeah. And also you’re not spending your time in toxic places feeling awful or getting into fights.

Paul: I mean, this is, unfortunately what I think happens is either a war happens and everybody feels really bad because they are being killed, or people just, I think in the same way that there’s like a weird element of vibes to the political move where like, everybody’s like, “Huh, you know, it’s just the bread is too expensive.”

Rich: Yeah.

Paul: And it’s inflation. But also just like, “Eh, I just don’t like it.”

Rich: Yeah.

Paul: Right? I think that the vibes do switch the other way and people can never explain why.

Rich: Yeah. Yeah.

Paul: Anyway, so yes. So that was issue one.

Rich: I think, about sports.

Paul: Boy, do you.

Rich: I want to end this with a little—

Paul: That’s fine. Let’s get a man in the room. There we go.

Rich: There’s plenty of sports writers.

Paul: There sure are.

Rich: They’re a certain type.

Paul: Yeah.

Rich: The cigar hanging out of their mouth. There’s this, like, trend now of videos of just men sitting in large leather chairs watching the ball game. And then you could watch them watch the ball—

Paul: There’s like the Yankees guy. What’s his name?

Rich: Yeah, Jom—Jomboy. Jomboy’s great.

Paul: Jomboy? J-O-M B-O-Y?

Rich: Yeah.

Paul: He’s actually, because he’s, first he’s a knucklehead and then he, he does like—

Rich: He actually knows the bits.

Paul: No, he gets, like, JFK, Dealey Plaza, movie-analysis-level on it.

Rich: You know, I think. I mean, we meandered a bit here. But I do think that, I do think that these tools can make you feel productive. But they can also, I think by not using them purely to put us all in a cage match, but rather feel productive, you’re in such a different place. I don’t, I have no desire to get into a fist fight with a Mets fan, as a Yankees fan.

Paul: No.

Rich: None whatsoever. I see it. I might make a joke about them. And when I make a joke about them, they don’t cry or take it personally. They’re like, “Oh, Yankee fans.” And they’ll make a joke about me. And I think what’s understood there—

Paul: Hey, you went to that fourth game of the World Series. How was that?

Rich: It was the fifth, and it was one of the worst experiences of my life. [laughter] All right, man? The fifth inning, the legendary fifth inning, I was there. What I’m trying to get at here is it is understood that I’m not going to convince that Mets fan to become a Yankee fan.

Paul: Yeah.

Rich: It’s understood that there is a little bit of horsing around.

Paul: Yeah.

Rich: There’s a little bit of joking around, and we get, we can talk to each other, but it’s understood that that’s the game. And I think these tools have made us lose the script here, to the point where we’ve gone beyond—we want to rip the jerseys off of each other, because the tools are so immediate and so visceral.

Paul: There was a coincidental narrative of, kind of, an online left going, “This is really bad, and it encodes a lot of bias,” and an online right going, “Yes, and that’s what we love about it.”

Rich: Yeah.

Paul: And so what’s been lost in that narrative is, could I use this as a filter? Could I incorporate human feedback loops and accelerate things that I think need to be accelerated? There is very little room for that conversation. That’s the conversation we’re having.

Rich: Yeah.

Paul: But there’s no room for it right now.

Rich: There’s no room for it. I’m an optimist by nature.

Paul: Yeah.

Rich: And I do think kind of moments in our lives can sort of help us tilt our trajectory just a little bit to get to better places.

Paul: Yeah.

Rich: Otherwise, we don’t do it on our own. We tend to just keep going in the same direction.

Paul: Yeah. I mean, the path that we’re on right now truly does lead to a terrible place. And I hope that at some level, adults on left and right, like, both sides. We need both sides, including our MAGA friends, to be like, “Uh, I don’t want to be in a scorched-earth nightmare. Forever.”

Rich: Yeah. I don’t, I don’t know if I agree with you. I, like, I think that there are a lot of very happy people right now, and they’re not sure why they’re happy, other than the fact that the World Series just ended and they won.

Paul: Yeah.

Rich: And they’re not really thinking. It was like, well, we just go home now. We do a ticker-tape parade and everybody goes home, right?

Paul: Right.

Rich: It doesn’t work that way.

Paul: Right.

Rich: And so there are very happy, a lot of happy people because they feel enfranchised right now.

Paul: That’s right.

Rich: And that’s a funny thing.

Paul: Yeah.

Rich: Right. Now—

Paul: Because the Dodgers will let you down eventually.

Rich: They sure will. We’re going to make sure of that.

Paul: Yeah.

Rich: Let’s go, Yanks.

Paul: Let’s go.

Rich: 2025. [laughing]

Paul: Okay.

Rich: I think these, the tools that we’ve had in our hands, and I’m kind of mushing together social and AI, I don’t view them that differently, are amplification tools. That’s all I see them as in this context. In social and cultural context, they’re amplification tools.

Paul: I would even, I would even just add in the word acceleration.

Rich: And accelerate.

Paul: Those—

Rich: Accelerate and amplify.

Paul: So, like, amplify is almost old-school computing. And this is just, like, that times 100, because it can go amplify things while you’re not looking.

Rich: To double down on my optimistic posture here, I am an optimist.

Paul: Okay.

Rich: I think the power of these tools can also help us get to a better place and actually understand each other better. And I think that is one of those things that I’m not going to sit here and write the spec on that. But I do believe it can happen because it is an utterly curious thing to go into places you’ve never been to before, even if you think they’re terrible places, or you have assumptions and stereotypes about them. I don’t know what that looks like. I think you’re kind of circling around it in your media, what you just described in terms of the power of these tools.

Paul: You have to learn by, well, it’s the power of software to accelerate this stuff, but you can flow content through it. You can do all kinds of stuff.

Rich: Yeah.

Paul: And so that’s an unbelievable accelerant. The amplification is there; the distribution you have to go find. But yeah, no, I think there is a positive case to be made here. I’m going to continue to explore and play and try to communicate outward about the things that we learn.

Rich: Yeah.

Paul: It is a wild moment and we’re going to have to figure out what is happening as well.

Rich: Yes, this is the Aboard podcast. Reqless.

Paul: Reqless.

Rich: Take care of each other. Have a wonderful week.

Paul: Hello@aboard.com is how you contact us.

Rich: Reach out. We’d love to talk.

Paul: Bye.

[outro music]