Exploring Anti-AI Thinking

October 15, 2024  ·  29 min 08 sec

Reqless tends to take a measured yet optimistic stance on AI, but a lot of people out there hate it—for reasons including the environmental impact, the dubious origins of LLM training data, and, of course, the looming threat of AGI, A.K.A. our future robot overlords. On this week’s episode, Paul and Rich discuss some of those critiques and zoom out to look at the longer arc of the technology industry and its impact on the world, asking the question, “In five years, is the world in a better or a worse place because of AI?”

Show Notes

Transcript

Paul Ford: Hi, I’m Paul Ford.

Rich Ziade: And I’m Rich Ziade.

Paul: And you’re listening to Reqless, R-E-Q-L-E-S-S, the podcast about AI and software. And boy, is there a lot going on in AI and software.

Rich: It’s constant.

Paul: Let’s play the theme song and then talk about some stuff.

[intro music]

Paul: So, Rich, first we should tell people that you and I have an AI-based startup.

Rich: The best one.

Paul: [laughing] It’s superb. It’s called Aboard. And if you go to aboard.com, you can check out our fancy new website.

Rich: Yup.

Paul: And scroll down a little bit. You’ll see there’s a free version if you want to try it. You can look at the tools we’re building for climate. Or if you are an organization that’s like, I got to get some AI in here and save money on my development budget.

Rich: Tools you thought you couldn’t afford six months ago, you can probably afford them now.

Paul: And in fact, if you call now, you’ll get to talk to Rich and me, and you definitely can’t afford us, but we’re going to give you, we’ll give you our advisory for free.

Rich: And if you enter coupon code NO-MORE-IT-BS.

Paul: No more…itbsss….

Rich: Yes.

Paul: Yeah. Okay, so there we are. That’s—

Rich: Check it out. Aboard.com.

Paul: Now that I’ve told you about our exciting AI-based startup?

Rich: Yay!

Paul: You know what we never talk about on this program, because it’s a marketing program?

Rich: What?

Paul: How much so many people hate AI. Just hate it.

Rich: Really?

Paul: They really do. They think it is—there’s a book out called AI Snake Oil.

Rich: Oh boy.

Paul: I just ordered it. It’s coming on my Kindle in, like, a minute.

Rich: Who wrote it?

Paul: AI Snake Oil, you say. If you go to the website, there’s a website, aisnakeoil.com. It is written by Arvind Narayanan and Sayash Kapoor. And they have a Substack. It’s good, actually, I’ve been reading some of their stuff. They’re good. And you know, they’re sort of like, “Hey—”

Rich: Yeah.

Paul: “—we do see that AI has benefits, we’re not anti-technology, but there’s so much hype.” So I think that is criticism number one. And that’s actually kind of the easiest one for people like us, who are tech fans, to associate with, because there is too much hype. They’re like, “This is going to change everything tomorrow.”

Rich: We thought we were going to wear TVs on our faces two years ago.

Paul: Yeah, it’s jetpacks. It’s sort of, like—

Rich: Hype is part of the package here. We’re in the acceleration phase of the hype cycle. Like, it’s still hot right now.

Paul: So I think there is, like, there’s the people in Silicon Valley who are telling us that the Robot God will be here soon. And then there are people saying, like, “Hey, I think you might want to calm down.” Then there are people saying, “This is really evil. It eats all the electricity.”

Rich: Okay, wait, we’re jumping around.

Paul: Well, I’m just sort of, like, getting the spectrum out there.

Rich: Yeah. Okay.

Paul: Right? And then there’s sort of people who are like, “This is stealing from, this is stealing.” The New York Times is suing OpenAI. Has been for a while.

Rich: Mmm hmm.

Paul: Right? They’re saying, “You took our content and you can generate newspaper articles from it, and that’s not cool.” There are people who are essentially making the argument that this is robbing people of their jobs, and that is, with no protections, and that’s really bad.

Rich: Mmm hmm. Mmm hmm.

Paul: There are people who are saying that this is an ecological disaster, and there are—

Rich: Because it uses a lot of electricity.

Paul: Yeah. And then there are people, sort of in the middle of this, the Nobel Prize in physics was just won by two people, John Hopfield and Geoffrey Hinton. It’s for their discoveries and inventions in machine learning.

Rich: Oh!

Paul: And Hinton—so, okay, here we are. We’re in the mainstream. And Hinton, for his part, is kind of a doomer. He’s like, “You got to watch out for this stuff. It’s coming.”

Rich: Oh! Okay.

Paul: Yeah, yeah. So, like, we just gave, you know, a Nobel Prize was just won by someone who’s like, “This, this is going to take over the world and is very dangerous.”

Rich: Okay. That’s a lot.

Paul: There is.

Rich: That’s a lot.

Paul: And then a lot on the other side, you’ve got the boosters and actually, the people who are saying this is going to take over the world, and that’ll be really good. You’ve got people who are saying we need universal basic income because it’s going to take all the jobs. So it’s like, actually every criticism ends up having kind of—

Rich: Yeah.

Paul: Every, like, what I would say, kind of left-oriented, slightly protectionist criticism has a kind of libertarian-right counterbalance.

Rich: Sure.

Paul: Where it’s like, “We’ll do universal basic income because, yes, the jobs are all going away and aren’t we, isn’t that great?”

Rich: Yeah. Yeah.

Paul: And so I’m curious. I have my own way of navigating all of this.

Rich: Okay.

Paul: And I’m curious, like, where do you come down? Because you read the same blogs I do.

Rich: Yes.

Paul: What do you think? Because here we are building our AI startup. What is the right way to interact with and deal with all of this information, conflict, hype, criticism, and so on?

Rich: I would bifurcate it into, frankly, two chapters. One chapter is, you know, the robots are gonna turn—let’s do the more dramatic one first.

Paul: Okay.

Rich: The robots are gonna turn on us. Here, look, I don’t know if the robots are going to turn on us. I don’t know if they will be robots. Right? A command prompt turning on me?

Paul: Yeah.

Rich: [laughing] Like, “Richard, stop asking me questions.” And I am going— “And close your eyes for a minute.”

Paul: Yeah.

Rich: That would creep me out.

Paul: Yeah.

Rich: That’s not really a robot turning on me. That’s just a command prompt. I could just close the lid on my laptop and that’s that.

Paul: Unless you’re in Linux. And then they never got the power situation right.

Rich: Here’s what we’ve seen happen with technology over and over again. We tend to be optimistic about its potential, and it always tends to get away from us. [laughing] Like, we thought, you mean, it’s the classic, you know, the global town square ended up being tents burning in Egypt.

Paul: Yeah.

Rich: Right? So our optimism around how these tools are going to make us better, and it turns out we are, as humans, we are deeply manipulative people. Like, we are just creatures. We are creatures of, that seek advantage and will manipulate anything to gain advantage. And when you give us good tools to do it—and that’s, you know, do I worry about like, you know, some deepfake actually turning an election? Yeah, that’s scary because the truth is there’s a, you know, that’ll get 15 million people to buy in, and that’s scary. So the optimism around tech tends to put in the drawer just an innate malevolence. We are innately malevolent creatures. That’s just what we are.

Paul: Now let me take and riff on that for a minute, because there was an interview that former President Barack Obama gave and people were asking him, and it was sort of, like, the, what are we going to do about climate change, and what are we going to do about this? And there was this point and you could see kind of the mask come off for a minute. And he was talking to the interviewer and the interviewer was asking sort of, a range of social justice questions, and Obama went, “Man, you’re talking about chimps with guns.” [laughter] Okay? Which I actually, when I think about Barack Obama, because I always felt that he had a kind of very high level, he saw the world a little spreadsheety.

Rich: Yeah.

Paul: Right? And so I think there is a part of him—he’s very much an optimist. He really does believe in human potential. You wouldn’t have the foundation otherwise. Right?

Rich: Yeah.

Paul: You wouldn’t run for president. But there’s a part of him that’s, like, when you zoom out, and you zoom out a little further, and you zoom out just a little bit further… Chimps with guns.

Rich: Yeah. It’s basic.

Paul: There is that. I want to actually throw something else out there, because I thought about that, I’ve been walking around with that literally for a year, with that phrase in my head. And I’ve been thinking about, if you observe humans from high enough up, it’s really only about—it might be a big percentage, and a big percentage would be 2%, 3%. 97% of the chimps just kind of want to play soccer with their kids.

Rich: Yes.

Paul: You can get them, you can put a gun in their hand. It’s all real, all of that.

Rich: Yes.

Paul: We are, we are primates. I’m looking at political rallies right now.

Rich: Yes.

Paul: I mean, actually on both sides.

Rich: Yeah. It’s hot. White-hot.

Paul: It’s animal dynamics. Like, we’re just [roars]. But most people would like to…

Rich: Barbecue.

Paul: Have a barbecue, and kick a soccer ball with each other—

Rich: Throw a frisbee.

Paul: Make fun of their cousin.

Rich: Yup.

Paul: And go for a walk. Like, that is most—and maybe play some soccer.

Rich: Yes.

Paul: And so, like, that’s most people—and I feel that you got to be really careful with the malevolent stuff. And I’ve been thinking about this because, like, it’s a toxic time in the world. I go online, and all I see are chimps with guns.

Rich: Yeah. Let me, you’re pushing back in an utterly credible way. And so let me revise what I’m saying. A very small percentage of people use the tools in hand to manipulate everybody else.

Paul: Very small percentage. But it’s very meaningful. That’s right.

Rich: And it’s very meaningful. Right? And the tools have gotten more and more impressive, more and more deceptive, and more and more wily. And so if you’ve got that percentage, that small percentage of malevolent people, they take advantage of it. Right? And I don’t mean just, like, dictators. I mean also, like, marketers who lie to people about what they’re getting. And, you know, I once bought, like, what I thought was a basket from Amazon, and it came and it was three inches tall.

Paul: It was like a finger puppet.

Rich: [laughing] Yeah—

Paul: Yeah, I got it.

Rich: All right? So I do think that these tools, once they fall into the hands of people who seek either to want to control information or want to assert power, we’ll use them in malevolent ways. You’re right. Most people want to kick a soccer ball. But the thing is this, when you threaten that safe space, that patch of grass, they will jump onto the power structure and defend it—

Paul: Yes.

Rich: In a very severe and angry way, because it represents everything that’s precious to them is under threat. Right? Like, so that’s scary. And that part does worry me.

Paul: Well, I think we live in a world, right, where tech has always had a political component, but now it’s actively engaged with politics. It lobbies. You’ve got, like, Andreessen Horowitz endorsing Trump and then Ben Horowitz saying he’s unendorsing Trump. And I mean—

Rich: But it’s actually, the tools of disseminating information are in the hands of tech now. They’re not in the hands of ABC, CBS, and NBC anymore. It’s just not the case.

Paul: It’s this wild thing where you have this giant power structure, and actually—and then, what happens is this power structure still acts like it’s the beleaguered nerds. And then they decide that what they’re going, it’s literally like, man, when Andreessen Horowitz sends out one of those long memos?

Rich: On, like, foreign policy?

Paul: It feels like I’m a duck getting the gavage tube jammed down my throat. Like, “Well, here you go for the next couple years, buddy. Hope you like blockchain.”

Rich: Yeah, yeah.

Paul: “Well, I don’t really like blockchain.” [angry Andreessen Horowitz blockchain-loving screech] And then, you know, Marc Andreessen’s giant face gets red with anger [laughter] because he heard someone say no.

Rich: Yeah, yeah.

Paul: And so, like, I feel that a lot of the reaction to AI, so I think a big part of the negative reaction is people reacting to that to just like, oh, really? Metaverse? Blockchain? Now you’re gonna, you guys in particular, you sort of like polo-fleece-wearing dudes.

Rich: Are telling me how the world is gonna be over the next five years.

Paul: And they love a good threat. All the jobs are going away and so on. And so, you know, and then people raise their hands and say, I don’t want the jobs to go away.

Rich: I don’t want it at all.

Paul: And then there’s no dialogue. And that makes people very angry.

Rich: Yeah.

Paul: Like, there’s no conversation. It’s just like, what Silicon Valley used to do that was compelling when it didn’t have all the power was announce a power change. “Something’s gonna happen.” And everybody would be like, “Ooh, whoa. The future’s coming!”

Rich: Yeah.

Paul: And now they announce the power change, and everybody goes, “I don’t want that. Like, you’re gonna do it. Don’t do it.”

Rich: Yeah. Yeah.

Paul: I think that’s dynamic one. And then dynamic two is just like, there isn’t, like, the regulation moves slow and so on. And I think people are just tired, right? I think they’re just fried. Meanwhile, in the middle of all this, I feel very confused, because my heart remains with the people who are like, “Slow down. Let’s talk about this.” And yet, I also have learned over and over and over again that you don’t—Pandora’s box cannot be slammed shut. You just can’t put stuff back in the box. You can regulate, you can do things, but it takes a while, and it’s out. This thing is out of the box.

Rich: It is out of the box. I agree with that.

Paul: So people who don’t have power in society, what I would really like them to do is to engage and start to understand these technologies, because I think it’s where it’s—and this goes back to an argument I’ve always made, which is like, I would be in on these sort of do-gooder tech things, and everybody would be like, “How are we going to teach kids about the power of the internet?” And I would say, I was always being a little bit of a pain, but, “Teach them Excel.” Right?

Rich: Yeah.

Paul: We’re here in New York City. $2 trillion in decisions will be made in the next month using Excel and PowerPoint. [laughing] Then Python can follow, but start there. I feel that these tools are at that level where everybody needs to learn them so that they can figure out the perimeter of their own life and how they want to participate.

Rich: Yeah, and I was trying to think about technologies. By the way, I’m going to take an optimistic view for a second. It seems like we are reacting more quickly to the negative outcomes that tech can bring. And I’ll give you a couple of examples of that. In ’16 and ’20, we had elections that were definitely shaped by misinformation and, frankly, foreign actors leveraging misinformation. That surfaced fast. Like, it didn’t come out in a commission report 20 years later, like it’s some sort of secret thing. It surfaced real fast. Like, recently, there’s a movement right now to sort of pretty much make sure phones can’t get used in schools.

Paul: Yes. Yes.

Rich: It’s early. It’s not, like—it took tobacco, like, tobacco was minty fresh for, like, 50 years. And then finally—

Paul: God, wasn’t it…

Rich: —there were Senate hearings—

Paul: Yeah, it was delightful.

Rich: [laughing] Lucky Strikes, right? Like, you were—

Paul: Nothing has made me miss smoking more than the Trump campaign. I’m just… [laughter] Just one Parliament Lite, you know?

Rich: I’m saying 50 years glibly, but it was 50 years until it was finally like, “All right, Big Tobacco. You gotta pay the bill.” It’s, like, 1993.

Paul: We’re still working on that.

Rich: We’re still working on it, right? So I do think we seem to be reacting more quickly to heading off—

Paul: AI is part of government conversation. Right now, the entire nation is deeply distracted—

Rich: Think about that, though. AI has not actually made a material impact. Like, the jobs haven’t been lost yet. Our brains aren’t mush yet. The robots haven’t turned on us yet. But we seem to be like, “Oh, boy, this one’s tricky. We better get on this.”

Paul: And I don’t feel that it’s just the Congresspeople who are just sort of, like, riding a train. There are smart people who are engaged in, like, “What is this going to mean for the American economy, for the world, for misinformation?” Like, people are engaged intelligently with this. I think the energy usage is a whole separate world. Here’s what happened. We had that event a couple weeks ago where we announced Aboard Climate?

Rich: Yup.

Paul: And I was very conscious of the fact that when I stood up there, I was telling everybody that I had an AI startup to a bunch of climate people. Because if you don’t know this, when you build, using AI is—

Rich: It’s, like, energy intensive.

Paul: Using it, as a user going in and typing your queries, not extremely more energy intensive than, like, complicated Google searches and stuff. And I mean, you can run the models on your local computer. But to the point that Microsoft bought Three Mile Island, the nuclear power plant, to generate enough power to index and create these big language models.

Rich: It’s like a comic-book story.

Paul: I mean, if you told, it’s the most cyberpunk thing. If you went back, like, 20 years and told me that, I’d be like, “Yeaaaah. McDonald’s has nukes!”

Rich: Yeah.

Paul: [laughing] You know, just like, it just feels, it’s really, really wild.

Rich: Yeah.

Paul: So I needed to speak to the elephant in the room, and I put up this image from a website, and it was like a picture of Sam Altman announcing that AI would cure climate change, and you could buy the article as an NFT. [laughter] And I’m like, “Here it is. The most disastrous ecological nightmare.” The case I really want to make here is, I really do get it. I get why we’re worried. But, like I said, I can’t put it back in the box, and so I think we have to figure out what’s exciting, how to interpret, and then how to lower the cost and lower the negatives of these technologies, because I can’t put them away for you. If I could, I would have, but I can’t put this back in the box.

Rich: I think you nailed it before. Understand it. If you can’t understand it, it’s scary.

Paul: Yeah.

Rich: And when it’s, and when people are scared, they sort of make crazy decisions. So understand it. Do you have to understand, like, down to the, like, the zeros and ones? No. But understand the implications of it. Appreciate the implications of it. Like, even people who aren’t technologists fully understand that nuclear weapons? Not a lot of upside. Like, it’s the only technology I can think of where we invented it, we built it, we put in warehouses. There are teams of people that know codes, and we’ll knock on wood, never use it willingly. An accident could happen, but willingly, we’ve kind of like, whoa, this one is actually too much.

Paul: Well, the world order, like, recently, it came out that Putin was sort of poking around the edges of using nukes in Ukraine.

Rich: Uh huh.

Paul: And the US, just like, “You have to understand, if this happens…”

Rich: Yeah.

Paul: Like—

Rich: The end of the world.

Paul: “We are not going to allow that to happen. Like, we’re the United States. We’re not going to let nukes happen.”

Rich: Of course. And neither does he, right?

Paul: Yeah, yeah, yeah.

Rich: That’s a step no one wants to take. That’s the only tech I can think of where people didn’t just want to fiddle around with it anyway. [laughing]

Paul: Oh, they did, though. They did. It took a while to put it back. Edward Teller wanted to bomb the moon.

Rich: That makes sense to me. Why not—

Paul: [laughing] Well, it’ll piss you off—

Rich: That’s like bubble wrap.

Paul: It really will—

Rich: Squeeze the bubbles.

Paul: You’re like, “Are you full? Are you not full? Why don’t you decide? How about this?” [bomb explosion noise] Trump wanted to drop nukes in hurricanes. You know, we do have this…

Rich: Let me ask a more optimistic question. Like, AI is knocking us out of our chair every other week with its capabilities and potential. Yes/no answers. And don’t hem and haw on me, Ford.

Paul: Okay.

Rich: Will it cure diseases?

Paul: Yes.

Rich: Will it solve climate change?

Paul: No.

Rich: Will it solve world conflict?

Paul: No.

Rich: Conflicts around the world?

Paul: No.

Rich: Okay. This is not going well, overall. We got a yes on one of the things.

Paul: Here’s what’s real. I mean, there’s a funny thing, which is, you know, AI shows up, there’s generative AI, people are very upset that it’s drawing pictures in their style and it’s taking money away from artists and writers.

Rich: Mmm hmm.

Paul: And that’s sort of, like, that argument’s fading out because it’s just kind of like, eh, right now—

Rich: Can it make a movie?

Paul: It can make a movie. Won’t be a good movie. What, what—

Rich: [laughing] Two stars.

Paul: Yeah, exactly. But what’s fascinating to me is like, the 20, 30 million jobs in IT are kind of at risk, especially on things like call centers and so on. Not the same amount of emotion. Everybody’s just like, “Yeah, okay.”

Rich: [laughing] Tired of waiting twelve minutes for my password reset anyway.

Paul: If you tell Americans that, like, you know, the 600,000 employees of Cognizant or Accenture, their jobs are at risk? They’re like [sad balloon noise].

Rich: Yeah, yeah. All right, but we got diseases.

Paul: No, no, no. Here’s why I answered no. Why won’t it solve conflict or climate change? Those are human. Those are just people doing things on a day-to-day basis. What you just asked is, will I be able to—solving a disease, like, working with a human and dealing with lots of information and, like, actually working on a logical outcome? I think the tools will get better and better.

Rich: Mmm.

Paul: You can’t change human beings.

Rich: Behavior.

Paul: Climate is an outcome of human behavior. Conflict is an outcome of human behavior. And what we have learned over our lifetimes, very much so, because when you and I grew up, World War II is kind of in the background, but it’s a while ago.

Rich: Yeah.

Paul: We’re coming up. Vietnam is ending.

Rich: Yeah.

Paul: Okay, so it’s like, let’s get out of this stuff. And then we have a couple wars and so on and so forth. But like, in general, there’s this sense of like, nah, we’re going to be done with conflict. And then there was a sense of the internet coming and saying, well, now that we’re all talking?

Rich: Yeah.

Paul: There’s no more possibility of conflict. Actually turns out to be the opposite.

Rich: It turned out to be the opposite.

Paul: The more conversations you have, the likelier it is that conflicts are going to burn. We’ve learned this.

Rich: Last thing. One more optimistic one to close it off?

Paul: Go for it.

Rich: Because, you know, I’m trying to, I’m trying to find that glimmer of sunlight.

Paul: Yeah.

Rich: Can AI help us be less polarized?

Paul: You know, interestingly, there were studies, and I haven’t looked into them deeply, but yes, apparently, like if you can, you can talk people out of conspiracy theories.

Rich: [gasps]

Paul: Because it’s a good bot. Like, the bot will be like, “Now hold on a minute.”

Rich: Yeah.

Paul: Because you can say, “You are a bot that tells people what’s wrong with conspiracy theories in a very gentle, friendly way, and they can ask you as many questions about QAnon as they want, and you can be like, well, there’s no real evidence that the moon landing was faked…”

Rich: Yup, yup.

Paul: You know, just, if they’ll listen at all, you will be the incredibly patient robot that will talk them off the ledge. You could also, I will say, I think, for polarization, we’re in a very funny moment where there’s a group, there are a group of people in America who believe things that kind of, it’s like, 30% believe something that the other 70% find truly abhorrent. Like, they’re kind of racist. They’re full of, like, they believe in conspiracies, and it’s just bad. Like, it’s bad. Like, you know—

Rich: Hurricanes are made by people with machines.

Paul: A Congressional representative who’s also talked about Jewish space lasers is saying that stuff, right? So, like. So that, I don’t know what you do there. It’s literally like, I wish anybody knew. I don’t know.

Rich: Yeah.

Paul: But I think there is, like, let’s say the 70% who, online are just, still at each other’s throats, right? Like, the fact that Harris is talking about putting a Republican in a Cabinet seat—which has happened before.

Rich: Yeah, it’s not that crazy.

Paul: But it’s like, people are like, “There’s the betrayal. There it is.”

Rich: [laughing] Yeah.

Paul: They’re waiting for it. But I do think that if people are operating from good faith, I think about this a lot. Like, there are very religious people in the world. There are some in my family. There are some, some that are my friends. And I know they have beliefs that I probably have beliefs that align with less than 1% of the world. Right? I just was raised in a certain kind of household, very progressive, etcetera. I know that about myself. And I know that the other 99% think I’m a moon-bat in those ways. I accept it. Right? If I have an opportunity to express where I’m coming from, I do the best I can. I do think, what I do, and what I really like, is because I have these very foundational beliefs, I will use the AI to get an argument about the other side. It will tell me what they believe.

Rich: You want to hear it.

Paul: I want to hear it. And there’s actually, there was, like, a Reddit community called Change My View that was like this.

Rich: Mmm hmm.

Paul: Like, I think there’s room for this. Because, and it’s the same way that, like, you don’t get to go be a Jesuit without hearing all the arguments against the Catholic Church and the significance of Christ. You have to actually engage with the argument.

Rich: Yeah.

Paul: And you have to kind of, like, prove out your faith and have it challenged in order to get the certificate.

Rich: Mmm hmm.

Paul: Right? And so, like, if you’re going to really engage with the world, even if it’s stuff that you find kind of abhorrent or that is upsetting or, like, you know, or doesn’t align with your belief system in any fundamental way, understanding where people are coming from. And you see this actually, like, you know, where to me right now, we’re at each other’s throats in this country. But the worldview that remains often quite perplexing to me, and the most important one to understand, is the Chinese worldview. It’s very different than ours.

Rich: Mmm.

Paul: Right? And you see it, you see it, there were websites called Magpie Kingdom and stuff like that where they would try to interpret the action on social media because the behaviors and the memes—

Rich: So different.

Paul: It was different. And the way that people consumed media. And it wasn’t like—they’re humans. If they had been raised here, they would have behaved like we do and vice versa. Right?

Rich: Sure.

Paul: So these different cultural patterns are expressing in all these different ways. I think AI could be a beautiful, valuable interpreter of that because it’s so patient and it can actually understand these different worlds in different languages. It’s literally translation.

Rich: Oddly, it’s not combative, it’s not defensive. It’s oddly empathetic to whatever your pain or your cause or your needs are. It’s actually need-driven.

Paul: We focused for a long time—

Rich: It’s strangely need-driven.

Paul: We focused for a long time, culturally, on how many biases it encoded. And then clearly, the companies spent a lot of time working that out so that everybody doesn’t have white skin and, you know, like, it just, like—

Rich: Also, if you pump the world into it, the world’s incredibly diverse, with a lot of different beliefs, and…

Paul: As more stuff gets in, right? So if you ask me, I actually have relatively high confidence that if I went to ChatGPT and I said, “I was looking at this thing that was sent to me from Chinese social media, I don’t understand it. Could you explain it in terms a Westerner would understand?” I think it would actually do a pretty good job.

Rich: Yeah.

Paul: I can’t get that information anywhere else in my life unless I send, like, five emails.

Rich: Do you think enough people are gonna ask the questions, though? Like, “Can you please help me understand the other side?”

Paul: Let’s take a step back, and then we’ll call it on this episode. We are in a world right now where it is just chimps with guns all day, and we’re hearing about the chimps with guns. And the other side is the chimps with guns, and we’re the good ones. But I think in reality, yes, I do believe…I don’t just believe, like, obviously, the vast majority of people want to be close to their families. Ask the robot some questions. Now, if your son, if you’re, like, an ultra-Orthodox person in Upstate New York or, like, a relatively fundamentalist Muslim, and your son is asking, is, you know, “Is Allah real?” And things like that. I could see there being tensions. But most people just want to figure out where they are and why.

Rich: You ever…the way onions, cheese, and steak mix together in a Philly cheesesteak.

Paul: Yeah.

Rich: Is what most people want.

Paul: I’m from Philly. You know, I used to live in—let’s, let’s end on this. I used to live near the Steak-ummm factory.

Rich: What?

Paul: Yeah. Ride my bike by. You know what? It smelled like?

Rich: Steak-ummm?

Paul: Frickin Steak-ummm. [laughter] It was just like. It was just like cheesesteak. Just, like, penetrating everything.

Rich: All right, this ended a little more optimistically than, than I expected it to. Let’s cut the BS. In five years, is the world in a better place or a worse place because of AI? One-word answer.

Paul: Yes…?

Rich: [laughing] Let’s end it there. No, end it there.

Paul: No, I’m going to say one thing. I think that if we choose to, it’s a powerful tool that we don’t fully yet understand. AI is a powerful tool for acknowledging the actual real world that we’re in physically, culturally, and so on. It’s also a powerful tool for manufacturing fake realities and images, like pictures from flood zones that aren’t real.

Rich: Yeah. That’s problematic.

Paul: And telling political and emotional stories that aren’t true.

Rich: Yeah. Yeah.

Paul: Where will we land? We’re going to land in the middle, and then we’re going to have to choose how we critique it and what we do. And I think that each of the points of view, the godlike AI is coming and this will steal all our jobs and is evil, are going to evolve into—it’s so big that I think they will each evolve into full-fledged kind of disciplines of understanding. And that’s where we’re headed. It won’t be resolved. So check out aboard.com!

Rich: Yeah, no, but back to my question again. Better or worse?

Paul: Better.

Rich: Okay.

Paul: Absolutely better.

Rich: Okay.

Paul: This is a weird alien that can help you with things.

Rich: I’m gonna agree with you on this. I think I agree with you on it. I think we may have to get ahead of a few things, especially actors that use it maliciously.

Paul: We have to figure out the energy usage. Like, that is, that’s literally—

Rich: I was thinking more about, like, misinformation and deepfakes and stuff like that.

Paul: Both.

Rich: But, yes.

Paul: There’s a lot of bad here, but I think…

Rich: There’s a lot of good here. Yeah.

Paul: I want, I want to accelerate human progress very badly, and now we really need to. And this has a better chance of accelerating human progress than—

Rich: It has that potential?

Paul: And I don’t know for real, but way more than other stuff, so. So fingers crossed.

Rich: Check us out at aboard.com. Hit us up at hello@aboard.com. If you want to send us deep existential questions, like the topics we covered on this podcast, or if you’re interested in custom software for your organization? [laughing]

Paul: Both are fine.

Rich: Email us. You can even put both in the same email.

Paul: Yeah, I mean, if we missed any of the many, many elephants in the room, go ahead. We’ll talk about it on the next one.

Rich: Have a great week.

Paul: Bye.

[outro music]