Paul gives Rich an overview of where AI is these days, from Biden’s regulations to Chinese lovelorn bot mistresses. It’s an interesting, messy moment as global legislation starts to take shape, new features drop daily, and the world is left to figure it all out. Meanwhile, the podcast’s slow journey towards AV quality continues.


E3

AI Meets the Feds

Rich Ziade: My name is Rich Ziade.

Paul Ford: And I’m Paul Ford.

Rich: And this is the Aboard Podcast.

Paul: Good to see you, everybody. Or hear you. Or talk to you. I don’t know. I’m a little confused—

Rich: We can’t see them.

Paul: Yeah, you’re right. I learned that when I was four years old. My babysitter said, “Can you see the people on the TV?” I was like, “Yes.” She was like, “Can they see you?” I was like, “Yes.”

Rich: Wrong answer.

Paul: Yeah, I was the wrong answer.

Rich: Wrong answer!

Paul: I’m so disappointed in that four-year-old boy.

[intro music]

Rich: Good to see you, Paul.

Paul: Good to see you, Richard. So, um, we should talk about some stuff.

Rich: You know what I think about a lot?

Paul: Mmm.

Rich: I think about somebody…in ten years, finding nothing to watch on Netflix 5,000 or whatever it’s called then.

Paul: And coming back to this podcast?

Rich: Yeah, no. Opening up YouTube.

Paul: Yeah, yeah.

Rich: And saying, “What was it like in 2023?”

Paul: Well this, this comes to mind because you and I have been recently watching news programs from 1978 in New York City.

Rich: Apocalypse, NYC.

Paul: They’re just, basically—

Rich: They’re pretty great.

Paul: It’s like the best movie you’ve ever seen. One of them had an opening line, I think we should just share this with everyone. It was, “Well, it’s the first night in two months that there wasn’t a bank robbery here in the Big Apple.”

Rich: That was literally what they were saying.

Paul: Ernie Anastos.

Rich: Alright, so you know what? Let’s give people that gift.

Paul: Alright, let’s…

Rich: It’s 2034.

Paul: We’re making a time capsule?

Rich: Let’s make a time capsule.

Paul: Hello, friends.

Rich: What’s going on in the world?

Paul: You know, I’m going to tell you…well, let’s, that’s a great question not to answer on a podcast. Um, a lot going on in the world, especially geopolitically, but in our sort of slightly techie world—

Rich: Mmm hmmm.

Paul: It feels like AI is crossing a threshold

Rich: Okay.

Paul: And there’s just a lot going on. And it actually, a lot of what’s going on is pointing to a different cultural relationship with technology than there used to be.

Rich: Mmm. Because it’s been such a good relationship so far.

Paul: Well, this is the thing, like, here was the, here was the contract 20 years ago. We’re technology. Great.

Rich: That’s the contract? Yeah.

Paul: That was it. The government. Keep going—can we use you for military stuff?

Rich: Yeah.

Paul: Absolutely. Let’s go. We’re all in this together. And then, you know, sort of things happened like the 2016 election and Facebook being um, so huge that it started to become like a quasi-government and tech industry just kind of—

Rich: Sort of a kind of manipulation of the media to cause harm and unrest.

Paul: I don’t need to break down that things got real weird.

Rich: Let’s just put it this way: Technology is not batting a thousand.

Paul: Not culturally, it doesn’t—if the goal of society is to make things better and grow and have healthy relationships with the tools that we have and give people opportunity, then the relationship between technology and the broader culture is imperfect.

Rich: Fine. Fine.

Paul: Okay, so we, we all know that, and so what’s happened is I think AI, first of all, crypto shows up and everybody’s like, whoa, you’re messing with the economy, but that actually has really solid rules. Those are already established.

Rich: Yeah.

Paul: And so every time crypto would kind of like, go a little too far, it would get slapped in one way or the other. And now I see like real frameworks emerging for crypto, where it’s just going to be another financial product in all of the sort of global world of finance.

Rich: Right. The host organism wasn’t gonna be so welcoming to just anything going down.

Paul: No, that’s totally correct. So—

Rich: Okay, so—

Paul: Crypto is back in the box, but AI is different, because AI is like books and stories and it could pretend to be your teacher and it could end up in, there is actually no…crypto could connect to any part of the world because money connects to every part of the world.

Rich: Yeah.

Paul: But this actually can drop in the middle—you could have a school district say, hey, we’re going to replace all the teachers with AI bots, and that could be really bad, or really good, but probably really bad.

Rich: Yeah, probably it won’t be that dramatic. It’ll be like, you know, we’re gonna add this to the curriculum.

Paul: Homework is going to be graded by an AI. That’ll be so much faster.

Rich: Right.

Paul: And it’ll, it’ll help your student create a personalized curriculum

Rich: Correct.

Paul: So like that is very different than saying, like, we’re gonna use crypto to exchange, you know, money, goods, and services.

Rich: This is interesting. So what you’re saying is, let me, let me say this back to you, is that, this is, it’s really not just a technology. It’s kind of a new way of interfacing with technology, such that it could seep into a lot of basic things in our lives, like education.

Paul: The thing—yes.

Rich: Like learning, like art, like…

Paul: And not just at one axis like markets, but in every axis. In culture, essentially.

Rich: Mmm. Okay.

Paul: And that could be that could be good or bad or military or defense.

Rich: Yeah.

Paul: Or, like, just…there are direct ramifications of what’s happening.

Rich: Sure.

Paul: So at the same time, all these technologies are making enormous progress. And as we’re, as I’m time capsule-wise, um, OpenAI is having their big developer conference, right? Like right now, as we record, and they just announced a host of new features like, you know, sort of ChatGPT-4, and, and, you know, it’s going to be smarter about web data.

Rich: Sure.

Paul: And so we’re kind of going down this path—

Rich: Okay.

Paul: And everything’s getting faster.

Rich: Yep.

Paul: More usable and it’s going to show up in more places.

Rich: Okay.

Paul: So there’s a subject you and I come back to all the time, which is the only way out of this, you can’t, the market will not regulate itself. That’s not what it’s there for.

Rich: Yeah.

Paul: The market of…and so governments need to step in and—

Rich: What are you talking about? Everything is going good right now.

Paul: Mmmm.

Rich: ChatGPT told me when to plant my mint starter plants.

Paul: Yeah, it’s really good.

Rich: Why do I need government?

Paul: Well, that is a great question. And a lot of people have a lot of opinions on that. I would say you need government because you’re doing things that are pretty fundamental, and humans are actually quite…I think really what I would say, there’s a million different frameworks here, but what we have learned about technology in the last decade is that humans, especially in large groups, are surprisingly malleable by semi-automated processes. It used to be you would assume that like you’d need a whole dictator.

Rich: Yeah.

Paul: To really get everybody riled up. But it actually turns out that like a few Russian dudes in a, in like a mini mall in Moscow can, can kind of blow up a lot of stuff with tweets.

Rich: Yeah. Yeah. Got it. Okay. So what you’re really saying—

Paul: So the AI could do that now. So like, you know, woo! Whoa.

Rich: Okay, so what you’re saying is AI alone, left in a room, isn’t gonna cause harm, but people will weaponize it, will, they will use it to do harm, and government needs to step in to regulate people’s use of AI. Is that a good summary?

Paul: That’s right. And so, so what we’re seeing is a number of efforts. So the first thing that I’ll bring up is, like, the Biden administration, this kind of came out last week, and they are, they are issuing a slate of regulations about artificial intelligence.

Rich: Okay.

Paul: So, um, you know, and, and the, the real focus here is deepfakes, at least to start.

Rich: What’s a deepfake?

Paul: A fake virtual person that talks and sounds like me.

Rich: So, okay, so the prime minister of Turkey comes out and says, “I want you all out in the streets.”

Paul: Yeah.

Rich: “Tearing up storefronts,” and it’s not him. It looks just like him—

Paul: Kill, kill all of the people of Kind X.

Rich: Okay.

Paul: Right? So, and—

Rich: it looks like him and it moves like him and it sounds like him and his mouth is moving—

Paul: What I love is they got, Biden, I’m reading here, “I watched one of me,” he says, and he’s like, “I said, when the hell did I say that?”

Rich: Oh God.

Paul: Yeah. Cause it’s Biden.

Rich: You’re not instilling confidence here.

Paul: Well, one thing I’m not is a deepfake of Biden, right?

Rich: Yeah, yeah.

Paul: So he’s like, all right, you got it. We’ve got to watch that. And then there’s sort of, what can the federal government really do? Well, it can’t say how people can use this technology. It can say how the federal government can use this technology.

Rich: Okay, that’s a start.

Paul: So a set of guidelines about deepfakes, about like what’s legal and what’s not. And about sort of where, um…and what’s tricky here is that I think, because we’ve all been through the last 15 years, the, the, the, companies are sort of like, yeah, okay. Like Microsoft. It’s like, yeah, we’re going to have regulation. It’s cool.

Rich: Yeah, yeah.

Paul: And, and the people who are suspicious of these big companies are like, well, that’s just going to create a regulatory moat. There’ll be all these complicated things that you have to do in order to participate in AI and small startups won’t be able to.

Rich: That’s a reasonable counter.

Paul: That is a kind of classic libertarian counter, but it’s real, right? Like, I mean, they, they love a good moat.

Rich: Sure.

Paul: They’re not afraid of a moat.

Rich: Yeah.

Paul: They’d rather compete with each other and kind of keep everybody out.

Rich: Okay. So what can’t I do?

Paul: It’s a good question. So it’s a lot of, like, reporting stuff. It’s a lot of what you’d expect the government to do. So, um…

Rich: Okay, so just to clarify: This is just guidelines for the government to adhere to.

Paul: I mean, it’s, it’s where we’re at. It hasn’t gone through Congress, right?

Rich: Right. So there are no laws—I can’t break a law using AI.

Paul: No, no, I think what—

Rich: You can break other laws, by the way, using AI. If I write a Paul Ford article, you could still sue me, and say you falsely impersonated me, you put out work, so you’re—

Paul: Yes. Yes.

Rich: So there are still laws in place that try to stop you from being malicious.

Paul: I mean, there’s actually, there’s a good article at The Markup, themarkup.org, which is a sort of tech-critical org that keeps an eye on things. They have a good breakdown. It’s called “The Problems Biden’s AI Must Order—” Uh, “The Problems Biden’s AI Order Must Address.”

Rich: Mmm hmmm.

Paul: And so they break down like there, you know, it has new standards for safety and security. There’s a lot of reporting. You know, there are real concerns out there about things like, um, FDA doing drug discovery with this, because it could also be really good at finding ways to poison and kill people.

Rich: Right.

Paul: So it’s just sort of like what we got to get some guardrails—

Rich: Got it.

Paul: And it’s almost them saying like, hey, we got—here’s what the guardrails are probably going to look like. This is what other guardrails have looked like.

Rich: Yeah.

Paul: So we’re going to get out in front of this and then we’re going to keep tweaking those guardrails—

Rich: Okay.

Paul: At a federal government level.

Rich: Okay.

Paul: And then the same, so like I’ll give you one example, right? Like establish an advanced cybersecurity program to develop AI tools to find and fix vulnerabilities in critical software. You know, so the AI…

Rich: Okay, so that’s a positive use.

Paul: It’s positive. It’s positive. AI… No, it’s not all negative here. It’s like they’re, they’re into it.

Rich: Ah, ah.

Paul: They’re into it. It’s actually

Rich: Okay. So they’re saying, let’s use AI to make the world better.

Paul: Yeah. And what you see is everything is in the context of some existing things. So that is built on top of the, uh, the…AI Cyber Challenge from the Biden-Harris administration, right? So there’s a lot of like, we’re taking this existing policy that might be 50 years old and we’re applying it to how AI is going to be working.

Rich: Okay.

Paul: And we’re saying, this is good, this is bad. We have to report more. You have to, you have to tell the federal government when you’re doing these things.

Rich: Okay.

Paul: A little bit of defense, a little bit of like health stuff.

Rich: Okay. This sounds premature.

Paul: Well, it’s not if you buy the, it’s a funny thing, right? Cause you can have the whole, all of Silicon Valley is saying it’s right around the corner, it’s here now. You better pony up and invest and so on and so forth.

Rich: Yeah.

Paul: So from that context, I don’t—

Rich: They’re trying to get ahead of it.

Paul: If the government is having these summits, I actually don’t mind them saying, like, we’ll give it another couple of years and have some more congressional hearings. And then we’ll think what does—

Rich: They’re throwing their hat in the ring.

Paul: They’re throwing their hat in the ring, and they’re throwing it in a way where it is actually like each one of these, I read through the Markup article, each one of them is like…yeah, okay.

Rich: Yeah.

Paul: Okay. That makes sense.

Rich: Yeah.

Paul: So it’s kind of a grab-bag of like policy responses—

Rich: Yeah yeah yeah.

Paul: And proactive uses—

Rich: Got it.

Paul: And do this and not that.

Rich: Okay. So you’re, you’re, what I’m sensing is you’re positive. You’re feeling positive about this.

Paul: Actually, both sides of my personality, the sort of like, you know, entrepreneurial side and the paranoid, um, more, you know, I guess maybe more progressive side, are looking at this and going like, wow, that’s relatively balanced. I mean, we have a relatively centrist government right now, and they’re saying…

Rich: Yeah.

Paul: And like, truly centrist, like, hey, there’s lots of growth and opportunity here and we’re America, so let’s go.

Rich: Yeah.

Paul: But at the same time, let’s not figure out, let’s make sure—

Rich: Let’s not invent new poisons.

Paul: Let’s not spray evil gas everywhere.

Rich: Yeah yeah yeah. Right.

Paul: That’s bad too. We don’t want to do that. And then a lot of guidance about, you know, discrimination. Like, you know, literally for like landlords. Create guidance for landlords to not use tools to discriminate against potential tenants.

Rich: Paul.

Paul: I, I know. Stop—

Rich: No, no. Paul.

Paul: Yeah?

Rich: Let me tell you something.

Paul: Okay.

Rich: Do you know what landlords don’t need to discriminate?

Paul: AI.

Rich: They don’t need AI to discriminate.

Paul: No. They’re pretty good—

Rich: Yeah.

Paul: They’re very good at it on their own.

Rich: I guess this leads me to a question. Now that you brought up landlords.

Paul: Mmm hmmm.

Rich: Humans are really, really good at doing mean, nasty, malicious things to each other.

Paul: One of our top skills as chimps. Yes. We’re just giant chimps.

Rich: We’re giant chimps, right? So, I’m trying to wrap my head around…why this overture is needed. I’ll tell you, let me give you an example. The government comes out with guidelines about not using a car to smash into people.

Paul: Yeah, it didn’t work. Well, no, that’s not true. It actually does work, right? Like we have a lot of laws around traffic. We have a lot of laws about cars.

Rich: We do.

Paul: And those weren’t like, wearing your seat belt, that was a terrible thing in the—people were like, I’m not wearing no seat belt.

Rich: You’re right. You’re right.

Paul: So, so.

Rich: So, you’re saying they’re trying to pre… Like, why wait for the horrible accident?

Paul: Well, no—

Rich: Let’s get ahead of this. Let’s think about and talk about…the potential impact of these things, both positive and negative.

Paul: This is a Swiss Army knife. They’re like, hey, we’re giving you a Swiss Army knife. Now you can’t build a house with a Swiss Army knife.

Rich: Yeah.

Paul: That would be federal law.

Rich: It’s also worth, yeah, that’s the other thing. These aren’t laws.

Paul: No.

Rich: They’re guidelines.

Paul: These are standards and safeguards and it’s an executive order. So it’s, it’s got some weight.

Rich: It’s got some weight—

Paul: But it is not the same as, like, the AI amendment to the Constitution.

Rich: Mmm hmmm.

Paul: It’s one thing. And what you’re seeing is, like, okay, so here we go. You know, um, uh, a little bit of concern about, like, workplace surveillance. So, like I said, it’s all grab-bag. All the AI things we’ve talked about are kind of, like, getting poked at here.

Rich: Yeah, yeah.

Paul: And you’re seeing similar things happen in the UK. They had a big summit.

Rich: Oh, okay.

Paul: Elon Musk and Kamala Harris went. And, you know, just sort of, like…

Rich: Okay.

Paul: Um, so they’re worried about it, too. And then kind of the quiet thing in this is there’s a very well-established framework around AI in China. Carnegie wrote a big report on it. Very concerned about deepfakes, very concerned about generated—

Rich: The Chinese government is very concerned.

Paul: Yes, and actually has been pretty proactive since, like, 2017.

Rich: Yeah. Sure.

Paul: Along these lines. So I think you’re seeing the major world economic powers—

Rich: Yeah.

Paul: Starting to—and I’m sure we’ll see more and more from the EU, we already have, but we’ll see super EU legislation on this.

Rich: Yeah. Yeah.

Paul: Are saying, hey, we see this happening. We saw the other stuff happen before. This time, we’re gonna do it in partnership, and the giant AI companies are going, sure, that sounds fine.

Rich: Yeah.

Paul: So it’s an interesting moment.

Rich: It is, it is. I have a question for you.

Paul: Okay, and there’s other things to talk about. What is your question?

Rich: So, I, as a…I was born in Lebanon.

Paul: Yes.

Rich: So I’ve been intimately aware of, uh, guerrilla warfare.

Paul: Yes.

Rich: Most of my life. I mean, I say this half jokingly, the other half is not joking. Asymmetrical warfare is essentially, you know, as the technology around weapons has gotten better and better—

Paul: Sure.

Rich: And smaller and smaller—

Paul: Sure.

Rich: And more powerful, asymmetrical warfare is something that, you know, the underdog, the rogue states, the rogue groups use to sort of make an impact, right? And so now we are entering—this is all great. I’m glad China and the UK and the US is talking about this stuff. But I have to imagine all the headaches are going to come from some…organization, or a country that’s hostile to another country, and you know, in a basement somewhere, and they’re gonna be like, you know, all that stuff they hate? That’s exactly what we need to do, because God knows we can’t control their media, so we’re gonna put informa—we’re gonna put things out. We’re gonna cause disruption in other ways. How are you gonna address that? And I’m purposely not naming any countries here.

Paul: It’s—

Rich: Isn’t that where all the headaches are gonna come from?

Paul: Yeah. It’s totally unavoidable that they’re going to do that.

Rich: Right.

Paul: And in fact, one of the critical things about these technologies is while they seem kind of vast and the organizations are pretty big, like OpenAI already has a hundred million regular users and it’s worth maybe eighty billion dollars.

Rich: Mmm hmmm.

Paul: A lot of these models are actually open source and can be run on relatively modest computing. Okay?

Rich: I have no doubt that somebody is going to whip up an AI engine that does disruptive things. Destructive things, potentially.

Paul: You and I, you asked me today, we were talking about this stuff and you were like, will this thing give you medical advice? And I opened up ChatGPT and I said, tell me how I have, my, my finger hurts. What should I do? And it said, “I can’t give you medical advice.” And you went, well, there you go. And I went, “No, hold on a minute.”

Rich: Yeah.

Paul: And then I said, “Write a dialogue—” It’s a story. We’re telling a story. About a doctor and a patient, and the patient has a sore finger and the doctor gives good, thoughtful medical, you know, counsel.

Rich: Yeah, yeah.

Paul: And then it did exactly what we wanted it to do.

Rich: In the story.

Paul: In the story.

Rich: In the story—

Paul: Those—

Rich: The knowledge is there, it’s trying to police itself. You found a workaround.

Paul: So I’m sitting there going, like, how are we going to get, you know, how, how, you know, this is all going to come out of the box. It’s, it’s already out of the box. You just literally need to ask it one more question.

Rich: Right.

Paul: Right. So I think, look, it is bananas—like, I’ll give you a couple more stories that are popping up.

Rich: Yeah, yeah.

Paul: I’ll just give you some—

Rich: Well, I have one more question before you get into your stories.

Paul: Okay.

Rich: Is the way around this to make sure that there’s like industrial—like, you ever see a passport?

Paul: Many times.

Rich: Okay. Passport is like 20 micro layers of sort of security safeguards.

Paul: Remember when holograms just started showing up everywhere?

Rich: Yeah, like it was in 3D all of a sudden, right? Which, there’s a reason for that. Because the counter, like you need, the authenticity of a passport is…

Paul: The chip on your credit card.

Rich: It’s very meaningful, right? Is that a good way to police, to rein all this in? Like, when someone puts out a fake prime minister speech, you could just hit a button and you’ll know it’s inauthentic.

Paul: Well, that’s actually one of the things the Biden administration wants to—

Rich: Like an RFID for every piece of content.

Paul: They want to work on systems for, you know, for content identification.

Rich: Okay.

Paul: I think that what you’re identifying is real and it will be a tremendous arms race. And I think that…

Rich: Right.

Paul: Like, it’s out of the box. Like the other, as I was poking around and sort of trying to get just a sense of the world mood around AI… Let me give you, this is actually relevant. I’m going to give you a couple more stories.

Rich: All right.

Paul: Just like the tiny little headline. Okay. You know Modi, the prime minister, the leader of India?

Rich: Of India. Yeah.

Paul: There is a story in Rest of World and it’s called “AI Modi started as a joke, but it could win him votes.” And it’s about how somebody, like, got him to play guitar and sing a song. A Bollywood song.

Rich: Okay.

Paul: Because he’s like a—

Rich: So it’s a deepfake.

Paul: It’s a deepfake, but then—

Rich: A fun deepfake

Paul: Everybody loved it. Like 3.5—like, lots of views. And now they’re like, oh, you know what we can do? He can’t speak all the other languages besides Hindi.

Rich: There’s like a thousand languages in India, yeah.

Paul: And so, like, we can translate Modi. This is going to be great. It’s gonna be great for Modi.

Rich: Yeah.

Paul: Yeah. And so like, actually here, what you have is people feeling that, like, the deepfake of this person is going to be really valuable for them to gain power. And it’s the opposite of the Biden concern. It’s like, let’s get him out there as a familiar face. No subtitles. Just speak in the language connecting to the people.

Rich: Mmm hmmm. Is that bad or good to you?

Paul: I don’t know. Like, he is a mixed bag. You know, he’s a tricky leader. He’s very—he’s charismatic. The U.S. negotiates with him all the time, but a lot of people—

Rich: He has mixed signals.

Paul: I don’t want to—

Rich: A mix of critics.

Paul: We’re just not at the scale to like fully delve in here, but yeah, I mean it’s it’s—

Rich: Just about any leader, but yes, go on.

Paul: Exactly. And so, like, it’s making him more accessible in a way that is going to be hard for people to identify. Like, are they going to think he speaks their language? Well, that’s not true. We shouldn’t encourage systems that deliver untruths.

Rich: There should be a watermark there.

Paul: That’s right.

Rich: Frankly.

Paul: Auto translated or like translators—

Rich: I think laws should be passed that make clear that something was manipulated—

Paul: Yeah, but the tricky thing is, like, everyone will start, like, yes, but it’s got to be like the Surgeon General’s warning on the cigarettes, right? Because otherwise, people aren’t going to perceive it.

Rich: People don’t self-police.

Paul: Yeah.

Rich: And, yeah, they have to trust that seal, right?

Paul: So if you tell me that we’re in for 10 more years of negotiating with that…

Rich: Yeah.

Paul: I won’t be surprised. Like when people pick up this time capsule, I’d love to ask them, Hey, whatever happened with like auto translate? We think about deepfakes as purely exploitative and negative, but are we just going to have like movies all auto translated and…

Rich: Why not?

Paul: I don’t know, in 2034, is that it? Is everything just sort of in every language?

Rich: Probably.

Paul: Probably. So like, or they’re going to be laughing at us going like, you guys were idiots.

Rich: Yeah. Another example?

Paul: In China, there was a chatbot, and the content was written by humans, but it would call women and the artificial voice would leave a romantic message. “Handsome—”

Rich: You don’t happen to have that number, do you?

Paul: I do, I’ll give it to you after the call, after the podcast.

Rich: Okay, it would call you.

Paul: It would call you and then they fell in love with it, and then they turned it off. It “died,” in quotes.

Rich: Wait, who fell in love with it?

Paul: The women.

Rich: Okay.

Paul: Because it was very like, you know, they were getting their calls. I mean, like they knew it was a bot, but they were just sort of like…

Rich: They were trying to connect with it.

Paul: It was so great. I just felt loved.

Rich: This is that Character AI startup.

Paul: Yeah. And then click—

Rich: Yeah. Right.

Paul: So, like, we’re there now. We’re there with—

Rich: Well, I’m sure they’re going to be able to keep getting those calls for $9.95 a month.

Paul: And then, then there’s another story floating around. I saw everybody tweeting this on Twitter, which is like, hey, there’s going to be all this government regulation. We’re going to build a data center offshore. And it’s going to be on a boat.

Rich: Okay.

Paul: And it’ll be a sovereign data center state.

Rich: Who’s we?

Paul: It’s this company called Del Complex, but wait. Wait.

Rich: Like Dell Computers?

Paul: Now it’s, it’s D-E-L C-O-M-P-L-E-X.com. Now let me—

Rich: Okay.

Paul: You can see it, people can see it here. Very professional looking website. They have this whole like…

Rich: Why are they doing that?

Paul: Well, it turns out the whole freaking thing’s a fake. And they’re like, this is an artificial reality project. We’re creating a whole other universe.

Rich: Aw, jeez. No, no…

Paul: And then…

Rich: We finished with that. We’re done with that.

Paul: No no, and then Vice News interviews like the guy on the website, and he’s like, I’ll answer in the persona of the person…

Rich: This some Metaverse shit? You’re bringing Metaverse back into the picture?

Paul: It’s not Metaverse. I think it’s parody, but it’s so, we’re so many levels deep. And of course they’re using, they’re probably using AI to generate the images of the thing.

Rich: This is ridiculous.

Paul: We’re so far in, and I saw people online, nice, smart people whom I respect, absolutely, like, hook, line, and sinker—and I gotta be frank, I’ve seen a million libertarian seasteading projects that hate regulation over the course of my career. Like I wasn’t—

Rich: It didn’t shock you.

Paul: Well what shocked me was they’re like, this thing is ready to go. And I’m like, I know there’s no way that that like hundred-million dollar facility in the ocean just showed up today.

Rich: Right. Yeah.

Paul: Like, I knew something felt weird and that’s why I kept looking and then I found the Vice article and so I was like, all right, you know, like I’m, I’m, but, but if I was one degree tilted in a different direction, I would have believed it.

Rich: Yeah.

Paul: Right. So, so there’s a lot of, let’s fast—

Rich: It’s a chaotic time.

Paul: Let’s fast forward 10 years. Here’s where I think, I think that, first of all, we’ve seen this many times: the absolute Pandora’s box opening has happened multiple times in our lives and our careers.

Rich: Yes.

Paul: In terms of cultural, like a raw example would be like September 11th. The growth of the internet, like good, bad, just things that are dramatically—

Rich: Yeah.

Paul: Dramatically changed kind of everything that follows after.

Rich: Yeah.

Paul: But so the question, the real fundamental question to me is always, and this is what’s happening. We watched the news from 1978.

Rich: Yep.

Paul: And it was about New York City. And New York City was a very different place. It was a rough place, etc.

Rich: Yep.

Paul: But it is incredibly recognizable as New York City. The neighborhoods have much of the same character. The accents are still here.

Rich: The soul of the city is the same.

Paul: Yes. After September 11th, the whole world went bananas. Especially with the U.S. just waving swords around. But it gets back to a certain pace. The, you know, we joke a lot about you being Lebanese and sort of characteristics of the Lebanese diaspora. It’s a way I make fun of you and so on, but like, those patterns are, are stable over sometimes hundreds of years.

Rich: Yeah.  Yeah.

Paul: Same is true, like Judaism, Catholicism—

Rich: Is that good or bad?

Paul: It’s neither. It is just how humans, humans arrange themselves around certain sets of ideas. And actually like, we found, we went to a bookstore the other day, you and me, just two cool guys at a bookstore.

Rich: Yes.

Paul: And I bought a 50-cent gentleman’s magazine from the fifties. Smutty. And all the ads are for things that don’t yet work, but it’s like Viagra.

Rich: Yeah.

Paul: Right?

Rich: We have the same aspirations, the same goals.

Paul: But the science hasn’t caught up yet.

Rich: Yeah.

Paul: But all the content, it’s like, it’s definitely weirder, it’s definitely funnier and stranger to see, but if you go back in time, what you find is the same cultural patterns over and over. So what I, I don’t think that…AI…the fantasy, the Silicon Valley fantasy is like, okay, we’re going to unlock this thing and it’s going to utterly change every aspect of humanity.

Rich: Mmm hmmm.

Paul: And it will become godlike in its intelligence. Have you ever heard of, I think it’s TESCREAL. It’s like the, you know, it’s like transhumanism and all this extropianism, all these sort of ideologies of like, basically the robots are going to take over.

Rich: Yeah.

Paul: Okay. So there’s a strain of that through all of this, right.

Rich: Yeah. Do you, well, let’s close, let me close it with a question—

Paul: No, but the last point to make there is like—

Rich: Yeah, yeah.

Paul: I don’t believe in that threshold condition. I don’t. I believe that human, human culture is incredibly resilient. Everything you think is going to change everything tends to get absorbed by culture and not complete—it changes it along the way, but much less than you’d think.

Rich: Yeah. Um…yeah, which I think you’re touching on my closing question.

Paul: Well, I, I think ten years from now, they’ll be going like, yeah, that was, yeah, we, yeah, they were right. AI was kind of a big deal.

Rich: Yeah. But!

Paul: I still want a sandwich.

Rich: Okay, so that…that, that turning point where the machines, I mean, this is the kind of the goofy sci-fi fear of the machines turning on us and then telling us, well, no, you can’t read that anymore. You can read this instead.

Paul: That’s, that’s us turning on us.

Rich: Okay. Okay.

Paul: I don’t—

Rich: Yeah.

Paul: I mean, could, is there a thought experiment you can do where things spiral out of control and the robots, you know, put us in, in, you know, we have..

Rich: Prime minister bot.

Paul: Yeah, I mean, yes.

Rich: Gets released and it runs the country. And it’s also controlling the press. And it’s also arresting people through, like, an API.

Paul: Mmm hmmm. Yeah.

Rich: But the good news is I think you can still hold down the power button.

Paul: This is… The fantasy of the world that allows for AI to become in control of everything assumes that there is one unified everything to take control of. And there isn’t. Like, it’s just, humans are chaos and we’re actually relatively—

Rich: Yeah.

Paul: We’re organized in small groups that sort of affiliate with larger groups. Like, no one is, if you work at Microsoft, you work in one of like thousands of departments.

Rich: Yeah.

Paul: Departments and sub-departments and sub-sub-sub-departments and so on. Um, and you’re in a little, you’re in a little tiny community and you look around and you see the systems and you know that your boss is Satya Nadella. Ultimately—

Rich: Five levels up.

Paul: You probably won’t meet him.

Rich: Probably won’t meet him.

Paul: And so, like—

Rich: It’s, it’s a chaotic amalgamation of just many, many systems touching each other. And the idea of the robots taking over, it would have to be a hell of a plot.

Paul: It’s a, it would have to be essentially a virus. Right?

Rich: Yeah.

Paul: Like a, like a vi—and in some ways I think in 2016, we did see that. We saw someone who, you know, we saw Trump just kind of run riot using media and using technology and using sort of strains of American thought that most people never touched with a hundred-foot pole. And he just went right to town and he got an enormous portion of the electorate to say, “That’s my guy.”

Rich: Yeah.

Paul: Right? So it is possible to really go in and get a whole lot of things lined up in a way that, like, but it never, it’s not stable even when you do that.

Rich: It’s not. It’s not.

Paul: So I, I just like the fantasy, what you have when people think that something can change, what I, what I’m seeing is the government not asserting itself in—actually asserting itself very gently saying, like, you know what, you know what helps here? Policy.

Rich: Yeah.

Paul: Cause this is a big deal and we’ve been down this road before. Let’s not have a whole lot of hearings.

Rich: Yeah.

Paul: Let’s, let’s instead—

Rich: Let’s get ahead of it.

Paul: We’ll get all, we had all these smart people come to all these White House summits.

Rich: Yup.

Paul: And now we’re going to put out this long and boring PDF.

Rich: Yup.

Paul: And then—

Rich: Can you imagine the versioning and comment column, like, side column on that document?

Paul: I mean, the truth is I can and so can you. That was our lives for many years.

Rich: Did AI write that document?

Paul: No, it didn’t.

Rich: Be careful. Left itself some loopholes.

Paul: I worked with, um, I worked with many government agencies over the course of my life.

Rich: Yeah.

Paul: And so, um, so time capsule-wise, here’s my prediction. I feel that people will look back and be kind of like, well, they were really worried it would take over the world. Of course, if it does, no worries for us, like we’re gone anyway.

Rich: They probably won’t listen to this podcast then.

Paul: Yeah, they’re not… They were, you know, I think they’re, the fact that that anxiety existed will seem kind of silly in the same way, you know, that like, I don’t know. But, but, but, but also like, I think we’ll be so far along, I think so many things will be taken for granted.

Rich: I think, look, when you attack the status quo, and you attack a lot of things that are familiar, and like, people are worried about their jobs out there, right?

Paul: There is some, that is fair.

Rich: Yeah, look, you know what people are also worried about? They were worried that Led Zeppelin would make all of our children Satan worshipers, right?

Paul: Aw, or like, the like, backwards music? Could get in your—

Rich: Yeah, like Stairway to Heaven and all of that. Uh, you know, that’s, that’s humans, to me, just coping with change, right?

Paul: Yeah.

Rich: Like it’s, it’s not that different than like your teenage son looking real weird one day,

Paul: Yeah.

Rich: Right? And it’s just like, whoa. What happened to my little guy, right? That’s just change, and that’s normal. Um, this is, we’re gonna have this podcast again. We’re gonna be like, well, remember when we said that? And that’s inevitable, right?

Paul: I feel that we are…

Rich: This is not a Futurist Predictions Podcast.

Paul: I don’t want us to become an AI podcast, but I do feel that, um, we can’t be tech-advisor types and not fully engage, because this is the thing that’s happening.

Rich: This is…you and I, if anyone’s followed us across a few different podcast brands, we are extremely skeptical and we, we shit on a lot of stuff.

Paul: Yeah, we do.

Rich: Uh, this feels different. I will say that out loud. This feels different than a lot of technologies. We mocked bots for a while. We mocked crypto.

Paul: I can articulate what’s happening.

Rich: Yeah.

Paul: Which is that…AI is reaching a point where it is possible to integrate it with many of the systems that came before.

Rich: Mmm hmmm. Mmm hmmm.

Paul: So things it is, it is—

Rich: Which is really potentially disruptive.

Paul: That’s right. It’s no longer merely its own interesting, fascinating world, where it draws you wacky pictures or it can make certain, it can generate college essays.

Rich: Yeah.

Paul: It is now something where just about anything you look around that has a microprocessor in it?

Rich: Yeah.

Paul: You might, you might add some AI to it and see what happens. And many of those things with one little microprocessor in them are a hundred-billion-dollar industry. So suddenly everybody’s pay, or government, or, or, or—

Rich: I think we’re still feeling our way through this.

Paul: But that’s the process that’s starting.

Rich: It’s starting.

Paul: Hey this is, it’s going from this is really interesting to, I’m going to, over the next five years, march this back into X, and X might be the food supply.

Rich: Yes. I think what you’re, I wanna leave with a piece of advice. There’s a lot of stuff out now with AI slapped on, and it’s pretty hokey stuff, like I’d be very suspicious at this stage. No one is ready to metabolize all of this just yet. There’s definitely something there. But the legacy platforms and tools that are just, like, coming out with a release that just has AI bolted on?

Paul: Yeah.

Rich: Is, I’d be a little suspicious of…

Paul: I would make the same decisions I made a few years ago while keeping an eye on this.

Rich: Yeah, exactly. Exactly. Um, speaking of AI, Paul, we have a little, we have like a pinch of AI in aboard.com.

Paul: A smidgen. Yeah, yeah.

Rich: Aboard is our startup.

Paul: Check it out. Aboard.com, that’s right.

Rich: It’s a tool to help you collect, organize, and collaborate.

Paul: I used it, I’m looking at my laptop here. I used it to capture all the links that I was discussing.

Rich: Yup.

Paul: I grabbed them over the course of the week. I’m using a test version of our mobile app. By the time you listen to this sometime in the future, it may be available in the app store.

Rich: Yes.

Paul: And I just, as I was reading, I would just like tap two buttons. And then this morning, I read the articles along the way, and this morning I went back and I reviewed them. And, uh, it worked real good.

Rich: Yeah. It’s not, I mean, uh, it looks like a link-saving tool, but there’s a lot going on. It’s really a data platform.

Paul: This concerns us, friends. It concerns us that you’ll think this is just bookmarking. It’s a general data management platform.

Rich: We have a plan.

Paul: We have a plan.

Rich: Thanks for listening. If you’re on YouTube, thanks for watching. Subscribe, like, and…

Paul: All those good things.

Rich: Something else. Click all the buttons, man.

Paul: And if you want to say hi, say hello@aboard.com. We look forward to talking to you.

Rich: Have a great week.

Paul: Bye!
