Using AI Respectfully
From copyright violations to environmental concerns to the looming threat of the singularity, AI is a hot-button topic these days. Paul and Rich talk through many facets of this conversation, and discuss how they think about the AI components of Aboard. Plus: A little roleplay in which we learn that Paul thinks Aboard is an earnest mid-century cartoon character.
Read the transcript
- If you were wondering why your AI-powered toaster was writing erotic fanfiction…
- Via waxy.org, that list of AI scrapers.
- Printer & Moonlight 😻
Paul Ford: Hi, I’m Paul Ford, the co-founder of Aboard.
Rich Ziade: And I am Rich Ziade, the other co-founder of Aboard—and CEO. Paul, you report to me.
Paul: I’m the President.
Rich: You’re fired.
Paul: You and I, you like a good, solid line in the org chart. I’m very much a dotted line.
Rich: Not only is it dotted, but it kind of wiggles.
Paul: Yeah.
Rich: It doesn’t connect to any of the other boxes.
Paul: Yeah, no, it is. [laughter] I am like a satellite that orbits the company. We’ll talk about this another time.
Rich: Yes. Joking aside, Paul and I work closely together. Very collaborative. A lot of push and pull, as I like to say.
Paul: I lead when I have to. You lead—
Rich: This podcast is not about us, is it?
Paul: No.
Rich: What’s it about?
Paul: It’s about Aboard, and it’s about the software industry, and it’s about the wacky world we live in. And it’s actually less about us than that intro would make you think. So let me tell you what Aboard is, Richard.
Rich: Go.
Paul: Don’t get confused. Aboard is a software platform. It is data management for everyone. You just go to aboard.com and whatever problem you got, let us try to solve it.
Rich: Yes.
Paul: Let me tell you some things about Aboard. It is a visual tool. It is a collaborative tool. It is an AI-powered tool. It works great at home, it works great at work, and it works on your phone. It takes data and turns it into pretty cards and lets you move them around.
Rich: Check it out. Aboard.com.
Paul: Look at that. Look at that! So this is the Aboard Podcast.
[intro music]
Paul: Well, Rich, today we’re going to talk about something near and dear to—
Rich: Let me guess. AI. Ayyyy-eyyyyye.
Paul: Yeah. But let’s make it zippy and fun—for everybody. [laughter] Okay, so I think we have an interesting perspective here, which is we have integrated artificial intelligence technologies, which means a very specific thing in 2024. It means the kind of chatbotty, image-generating world. That’s what we talk about when we’re talking about AI. We have integrated those technologies into our product. And let me—you know what? Let’s role play. I’ll be the software, you be the user.
Rich: Okay.
Paul: Hey, how you doing? I’m Aboard.com!
Rich: Quiet. Don’t answer unless you’re spoken to.
Paul: [quietly] Okay!
Rich: I need a way to manage the employee-onboarding experience for my company. Can you help me?
Paul: Did you just type those words in? Because if you did, I can!
Rich: I did type those words in. Are you a post-World War II cartoon?
Paul: I am! Okay, let me go ahead, and first thing I’m going to do is show you a nice card that has—oh, sorry, what was it you wanted again?
Rich: A simple way to manage employee onboarding for my company.
Paul: Okay. So I’ve divided that into steps, you know, and documents that people need to read, and each card has the document and a link to it and a list of notes that people should fill out and sort of things like that. Pretty use—looks useful, right?
Rich: Yeah. So wait, are you a set of instructions that I need to now follow?
Paul: No, I’m a suggestion for how you might organize one little unit of data inside of this cool thing you’re building.
Rich: Like ChatGPT.
Paul: Yeah, yeah. But except instead of it being a conversation, I’ve turned it into a card that you can work with. Comment on. Add data to. All kinds of things.
Rich: Do you mean like software?
Paul: I’ve built software for you. You can have a whole workflow for onboarding, all the data, all the things your employees need in order to learn and do things as part of their onboarding experience. Already built for you.
Rich: End scene. This is the hardest thing to share on a podcast. [laughter] It’s that, you know, AI is extremely chatty, for lack of a better term. It gives a lot of instructions out, a lot of information gets thrown at you, and then you’ve got to go and do something with it. We built something that actually generates a tool as the output. And that’s unusual.
Paul: Yes. It’s not a document. It’s not a conversation. Doesn’t draw you a picture. It makes you software.
Rich: Is this one big marketing podcast, Paul?
Paul: No—
Rich: Because we did that last week.
Paul: No, we did. We’re going to move on from talking about us, but this has been our direct experience. AI is a hot-button issue. Let me try to characterize some of the responses to AI out in the world.
Rich: Okay.
Paul: So you have, one—you have, “Hey, we are the company OpenAI. Some of us believe that our product is so dangerous that we might need to fire our own CEO because it’s…”
Rich: Which they tried to do.
Paul: “It’s imminent that we will develop a huge robo-brain that could take over the world, and we have to really, really keep an eye on it. That’s how powerful and amazing and unbelievable the technology that we’re building is.”
Rich: Can I, can I…respond to that?
Paul: Well, that’s characterization one. So, yes, go ahead. And then we’ll get to two.
Rich: Okay, you know, it’s, first off, just general observation with humanity. We tend to go right for the Marvel movie.
Paul: Yeah.
Rich: I imagine the first, like, major news incident that, like, ends up on the nightly news is, like, the AI-powered, like, cheese dispenser at Wendy’s just goes off the rails and floods the kitchen.
Paul: I mean—
Rich: Like, why are we killing each other?
Paul: Yeah, why is my AI-powered toaster writing erotic fanfiction?
Rich: [laughing] I mean, I found the whole drama around everyone, all the executives at OpenAI losing their minds incredibly arrogant.
Paul: Yes, but delightful. [laughter] Delightful. Just absolutely—you know, you gotta understand, as two tech-industry veterans, very rarely does someone come up to you at the party with a huge platter full of candy and be like, “Hey, what do you like?”
Rich: [laughing] This is cupcakes.
Paul: Cupcakes. We got chocolate. You like those little—
Rich: [overlapping] It was…it was good…
Paul: We have like the kind of—yeah, like, what’s the taffy you get at the seaside? You know, just sort of…all of it.
Rich: [still laughing] I mean, you can’t even give me, like, a chicken tenders recipe. Do you really think you’re gonna annihilate the human race?
Paul: This is the thing. I mean, look. Tech—everyone…when there is a new and exciting technology, this happened with crypto, it happened with the web and so on. They assume absolute, utter transformative output. Right? Like, just—
Rich: [laughing] It’s arrogance.
Paul: Constantly. But I also think it’s insularity. When you live in a little box and everybody’s like, “Well, this is it. Here it is.”
Rich: Yeah. Yeah, yeah, yeah.
Paul: And then, you know, honestly, if five people came up to me before I got out of bed in the morning, like, just—or not, like, as I’m having my coffee, I’m like, haven’t really gone outside yet, and I got five text messages saying, “Sky’s purple today.”
Rich: Yeah.
Paul: I would go out and look, I’d be like, “Oh my God, maybe the sky’s purple.”
Rich: Yeah. Yeah.
Paul: If you are around a couple of people who are telling you all the time, “We really should take this seriously.”
Rich: Yeah.
Paul: That is viral wildfire in your brain.
Rich: And it happened.
Paul: Yeah.
Rich: Okay, so that’s one.
Paul: So that’s one. Two, and I think this is the countervailing one. There’s kind of a… it’s not a web, it’s not right and left. It’s a different kind of political categorization, where one group of people is really heavily accelerationist and really believes that technology will completely influence culture. The other side believes the same, but they tend to believe it’ll be a very negative outcome. First of all, they feel that the AI engines have sort of pirated, ingested all of this content, broken it into little pieces, and they are now essentially performing a new kind of theft that isn’t really fair use, and sometimes is explicit copying. Like, the New York Times is suing OpenAI around—
Rich: Yeah, so let’s clarify what’s going on. When you ask, let’s say, ChatGPT, or one of the platforms, a question, it’s essentially dipping into a base of knowledge that is frankly mined from the internet.
Paul: Well, and there was a transaction involved with, like, Google where, yes, they would put an ad on top of it, but they would send you ideally to the original URL. Right? So it’d be like, hey, I’m a new, I published—
Rich: That’s this sort of implicit agreement.
Paul: That’s how robots.txt and search works. And they go out and they get that, they spider the article, but then they send you back to the article.
Rich: Right.
Paul: So they provide service across the entire web. In this case, the way this technology works is it basically ingests all that content, compresses it in an incredibly lossy way into these bizarre databases that we still don’t fully understand what they are yet, culturally. And then when you ask it something, it goes and retrieves it and spits it back to you. It’s like a cover band. It’s like a really bad cover band.
Rich: Yeah, yeah.
Paul: Right? It’s like, you know, you go see, The Song Remains the Same, that’s the name of the band. And they play, like, “Stairway to Heaven,” but they use a synthesizer for the flute.
Rich: Yeah, yeah, yeah.
Paul: That’s what AI sounds like and feels like. And so those are the two camps. One camp is, this is enormous theft. The other camp is, this is the absolute future, but it’s quite dangerous. It’s wild because everybody seems to be afraid of it. There are other interpretations, too. There’s the kind that is very worried about deepfakes and the cultural impact, so less about copyright, more concerned on, like, a political level, with nudes and sort of pornography being generated.
Rich: Manipulation, misinformation…
Paul: Privacy invasion…
Rich: All of it.
Paul: There’s also a huge ecological impact because these models take an enormous amount of computing resources to create, so on and so forth. So I think that like over the last couple years, I’ve been internalizing a lot of these and trying to figure them out. I think you have, too. We’ve talked about them a lot. We’ve talked about them on this podcast. And then at a certain point, though, these tools are—we know a lot of web people. I would say two years ago, they’re like, “Wow, that’s weird.” And I would say in the last year or so, we’ve been having more coffees where people are like, “Yeah, there it is. We got to figure out what to do with it. Because, boy, is it interesting. And it’s up to stuff.”
Rich: Well, I think a lot of the damage that tech causes isn’t because of malevolent actors. In terms of the founders, like, the team at Facebook, I don’t think there was, like, a dark room with a big conference table where they’re like, “We will destroy humanity with Facebook.”
Paul: No, that’s the easy fantasy.
Rich: Right.
Paul: We’d love it if—if that was the case, the moral argument against Facebook… [laughing] What’s tricky is, like, Elon Musk kind of gave us that with X. Like, he’s…
Rich: Well, bless his heart.
Paul: [laughing] Yeah.
Rich: Like, that’s separate. Let’s focus on Facebook for a second.
Paul: He’s the first real villain we’ve had.
Rich: I mean, you look at Zuckerberg, like, awkwardly telling us that he just wants one big, happy human family.
Paul: Yeah.
Rich: I think he means it. Like, in the beginning, I think Twitter had the same kind of aspirations. Now what happens is there is a tipping point in tech where you really pretty much lose control of the thing. Like, it’s not really… You cannot put out a new terms of service—the wild, rabid animal is out of the cage.
Paul: That’s the moment—yes. That’s the moment, when you hire Sheryl Sandberg, is the moment when you’ve admitted you’ve lost control.
Rich: No, that’s when they admitted that they wanted to make money. That’s why they brought her on.
Paul: I guess that is true, actually.
Rich: She didn’t rein anything in. If anything, she probably was an accelerant for a lot of the growth. Right?
Paul: Yeah, I guess you’re right. I guess you’re right. I was thinking of her more as, like, she kind of did come in to provide adult supervision.
Rich: She did. And she turned it into a business. But did she, was she sort of like the keeper of the moral compass? She was not.
Paul: No, the reality is it actually spun further out of control under her watch. Because I think she brought so much money and power into the organization. That’s when it really started to spiral.
Rich: I think she did something else. It’s not just a matter of, like, let’s invite a powerful person into the organization. I think what she did was she was like, “Okay, take the reins off. This thing is an engagement machine. Study where engagement works and go.” Like, I don’t think Facebook… I mean, they have hundreds of people, if not thousands, policing the content on the platform. The truth is, they can’t make it airtight. There’s going to be malevolent actors, there’s going to be disinformation. That’s because it’s too vast and too big.
Paul: Well, it’s also, it’s so big that moral reaction happens at an immoral speed. So a good example would be, you know, Facebook. Let’s not, let’s not—we’re talking about AI, so let’s not double down on Facebook. But, like, you know, they now have this advisory board, which seems pretty toothless overall.
Rich: It is toothless. It is toothless. And AI will have one, too. OpenAI will have an advisory board.
Paul: I mean, kind of—it’s a not-for-profit. It does.
Rich: It does. Right?
Paul: Yeah.
Rich: And so what you’re seeing now is that, oh, human ingenuity created yet another thing they’re not gonna be able to really control. [laughing]
Paul: Yeah.
Rich: So there’s a few things that happen. One is…nothing.
Paul: Well, I think this one’s interesting because there’s actually already government interest in regulation.
Rich: Well, the thing about AI, I think, that makes it slightly different, is Facebook was like, “All right, we’re gonna mess with people’s knowledge, like, where they get information and what that information is and how truthful it is.” I think with AI, I think because its utility is so kind of wide open, it’s like, wait, is this thing landing planes in ten years?
Paul: It’s also weird because it’s very legible to a non-technologist because it acts like a person.
Rich: It acts like a person.
Paul: So, like, a social graph is a very abstract concept.
Rich: Yeah. That’s right. That’s right.
Paul: Right? And so there’s, like, when you watched… I mean, some of the greatest entertainment in the last 20 years is watching congressmen try to understand the internet.
Rich: Oh, it’s the worst.
Paul: It’s just endless humiliation, you know…
Rich: Yeah. It is.
Paul: And hearings that should never have happened and on and on. But, and so, like, they really couldn’t wrap their heads around the social graph. But everybody understands that when you say to the robot, “Make me a list of the ten best people to kill,” and it gives you a list, that’s bad. [laughter] “Oh, whoa. Okay.”
Rich: Tell me, as a co-founder of, frankly, an AI startup now.
Paul: Yeah.
Rich: I would call us that now. How did you bake that into your thinking about what the product should be and what the product should not be?
Paul: Well, here’s where I come down on AI. I come down on a lot of different things. First of all, it’s out of the box, and when things come out of the box, you’ve got to deal with them. You can’t put them back in the box, because they won’t go back in. We’ve seen this. Crypto is not back in the box. The market has kind of decided, like, no, we’ve had enough now. These guys are going to go to jail and we’re going to calm down a little bit.
Rich: Yeah.
Paul: But it’s nowhere near back in the box. It’s coming. It is going to be with us for the rest of our lives.
Rich: Yeah.
Paul: The web is not going back in the box. The media industry is not going to go back to the way it was in 2001, no matter how much everybody would like it to. So there’s that. There’s like, okay, it’s here, let’s deal with it. Then there is, I spent a lot of time researching, trying to understand large language models. I just enjoy it as a technology. I want to just—
Rich: It’s fascinating.
Paul: Well, because also, especially when anybody is like, “This is the most mysterious thing, you can never understand it, it’s so amazing—”
Rich: Makes you want to go in!
Paul: I’m like, “No, it’s not. It’s software. Shut up.”
Rich: Yeah.
Paul: And that happened to me with crypto, too. Okay, so now I’m going in and I’m understanding it, and I got a ChatGPT account, and I started using DALL-E and I started using all that stuff and getting my bearings, and kind of both really starting to enjoy it and also losing any sense of awe and wonder. Right? Because it just became software to me. I started to understand a lot about how it worked.
Rich: Yeah.
Paul: Then there are the ethical concerns, which I think are quite valid. I think it’s a lot fuzzier than the detractors would like to say. Like, you know, it drew me a pretty… you know, draw me a picture of a squirrel in the style of Richard Scarry, who passed away… Like, there’s a lot of complexity that actually kind of has to work its way through the courts.
Rich: Yeah.
Paul: Like, I’m glad the New York Times is suing OpenAI. I hope something gets worked out.
Rich: You are glad? Well, the precedent is important there, right?
Paul: Get it out. Figure it out. I don’t, honestly, I’m at a point where, like, we’re not going to shut this whole thing down. There’s a lot of open-source and so on. Like, it’s like, I’m not going to worry about OpenAI or the New York Times. They’re both very large actors.
Rich: You think we’re headed to a place where…you didn’t answer my question, which is what makes this podcast entertaining.
Paul: [overlapping] We’re getting there. We’re getting there. Hold on.
Rich: Do you think there’s going to be a time where, like, you just got to drop a robots.txt on your website that says, hey AI, stay away, mind your own business?
Paul: Literally, right now there are. We have a friend with a website we like called Waxy. Waxy.org, Andy Baio’s website.
Rich: Mmm hmm. Mmm hmm.
Paul: And the top link on it at this moment, I was looking at it this morning, is to a site that lists all the spiders that AI companies use so you can block them.
Rich: Oh!
Paul: Right? Because that’s what they’re going to do. Here’s what’s going to happen. They’re going to have spiders, and the spiders are going to, you know, respect robots.txt, which is the standard for telling a spider whether it’s allowed to index your content or not.
Rich: Right.
Paul: And you’re going to, you know, start blocking AI stuff if you don’t want your stuff spidered. It’s tricky because then, you know, there’s all these images…that’s one thing. And then there’s the fact that like, you know, copyrighted images from Getty are getting drawn in.
Rich: Yeah. Yeah, yeah.
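For the curious, here’s roughly what that blocking looks like: a robots.txt file at the root of your site with rules aimed at the AI crawlers’ user agents. The agents below (GPTBot, Google-Extended, CCBot) are just a few commonly cited ones; the full, current list is the one linked above via waxy.org, and keep in mind that robots.txt only works if a crawler chooses to respect it.

```
# Block AI training crawlers; ordinary search crawlers are unaffected.
# (robots.txt is advisory: it only helps if the crawler honors it.)
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```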
Paul: And so honestly, dude, there’s some things that are above my pay grade. Culturally, they need to be worked out. I don’t think that an AI bot really should be allowed to take everybody’s stuff and go to town. That said, it’s essentially a bizarre compression algorithm. And when you take a multi, multi, multi-terabyte data set and squeeze it into four gigs of descriptive stuff and then spin back out an image that kind of looks like something else, there’s no precedent for that. It’s going to take a long time for us to understand what’s going on.
Rich: Yeah.
Paul: So it’s not, it’s not as—
Rich: You’ve built, uh, I mean, we’re sounding pretty critical, but we built a startup that leans pretty heavily on AI. What should we—what did we do?
Paul: Well, we made a couple of really concrete decisions. First of all, what does the tool do? It adds links, it adds cards, it makes content…
Rich: Suggests content.
Paul: And it makes demo content, and it does sort of research-style tasks. You can ask it to list all the first ladies. The language models are not up to the minute, but wherever possible, we keep links, so that we send people back to the original content, if it’s known.
Rich: Okay.
Paul: We flag all generated content very clearly. You know what the AI made.
Rich: Why?
Paul: Because it’s important to—honestly, it’s more thematic than anything else. I think AI is a wonderful way to fill in a blank page so you can get started.
Rich: Okay.
Paul: And I think after that, humans are probably better and will be for an awfully long time.
Rich: Okay, so this is a counter-narrative to a lot of, like, AI will take all the jobs.
Paul: I don’t think it will. I think that, like, sending people email about your outsourced firm will be taken over by AI, and then AI will respond and say, not interested. [laughter] Like, I think that there’s… But for the most part, most real things in the world happen because of direct interaction, not because—
Rich: Humans talking to humans?
Paul: So there is a horrible churn world of bots just rambling at bots.
Rich: Yeah, yeah.
Paul: And that’s going to be gross. And that actually, I think the greater risk is that the larger public web will be taken over by web pages that are generated by bots, just infinite, because they can produce infinite amounts of content—
Rich: Sounds lame.
Paul: It’s not great. And it’s like, Google’s going to index that stuff. And will it get better at figuring out what bots gener—Google can’t tell what’s real or not.
Rich: Yeah.
Paul: It can only tell that there’s a certain amount of words on a web page. So anyway, I feel that the ecological concerns I think are valid, but I also see the cost of accessing these models going down drastically, almost like on a monthly basis. I think there’s an enormous amount of burn and churn happening, but I think that the actual cost of a query is going to get really, really low.
Rich: Yup.
Paul: I feel that the copyright stuff—
Rich: As with most things in tech…
Paul: That’s… you just see that coming. And as for the copyright stuff, we’re not engaged with it. We do a little bit of text generation and so on. But it is like, this is sample, for-placement-only stuff, or it’s like, make me a list of software-as-a-service products and it tells you that one of them is Trello. Like, I mean, that’s the kind of thing that we’re doing.
Rich: That’s not—yeah, you’re not ripping off…
Paul: And I mean, look, proof is in the pudding. We went live with this, and, I mean, I get yelled at about all kinds of things. I haven’t been yelled at about our use of AI.
Rich: Yeah.
Paul: I didn’t post it on Mastodon. I left that alone.
Rich: Why’d you do—why not? [laughter] Marketing is marketing, man.
Paul: Not on Mastodon, my friend. No, on Mastodon, it’s…yeah.
Rich: I don’t know. I’m not on there.
Paul: I’m not either. And so, yeah, so for me, what is a good framework for AI? It should be assistive. It should not wholesale copy or attempt to simulate other content. It should be probably relatively close to listings. I think it’s really good at bullet points. And then it should get the hell out of your way and let you clean stuff up and move stuff along. I think it is good for onboarding and for filling in a blank page so that you figure out what your next step is.
Rich: Yep. I think in a future episode we should talk about how AI can empower you and accelerate what you do well. Everyone’s talking about it right now as a replacement for what you do rather than as something that accelerates, or, like, actually enhances what you do. And no one’s having that conversation. And here’s the topic I want to talk about. Maybe not the next podcast, just so we can mix it up. Maybe next podcast we’ll talk about a history of the MP3 algorithm.
Paul: You know, you know, we’ve never talked about my two cats for 20 minutes. I’ll talk about them.
Rich: You can do that.
Paul: Their names are Printer and Moonlight.
Rich: I think a lot of the AI innovation ahead is not, oh, the answers are getting better. I think it’s the interfaces with AI. It started off, frankly, with them just sort of taking a flyer, like, you know, this thing is interesting. Put it out in the world, let’s see what happens. And it just exploded.
Paul: Well, you know, the application was text generation. Like, that’s the way that the algorithm worked.
Rich: Right.
Paul: And so getting that towards chat made a lot of sense.
Rich: That’s right, that’s right. And so I think a lot of the interface innovations are going to be really interesting, meaning, what is the output? Right now the output is mostly words and pictures. And we did something pretty different. We took the AI answer and reinterpreted it, reimagined it as actual software. I think that’s interesting.
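A minimal sketch of the general idea Rich is describing, not Aboard’s actual implementation: ask the model for structured data instead of prose, then turn each item into a card. Every name here (build_cards, call_model, Card) is a hypothetical placeholder.

```python
import json
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Card:
    title: str
    notes: str
    source_url: Optional[str] = None  # keep a link back to the original content when known


def build_cards(user_request: str, call_model: Callable[[str], str]) -> list[Card]:
    """Ask the model for structured JSON instead of chat prose, then turn
    each item into a card the rest of the app can render and edit."""
    prompt = (
        "Return only a JSON array of objects with 'title', 'notes', and an "
        f"optional 'source_url', for this request: {user_request}"
    )
    raw = call_model(prompt)   # stand-in for whichever LLM API is actually in use
    items = json.loads(raw)    # a real product would validate and repair this output
    return [
        Card(
            title=item.get("title", "Untitled"),
            notes=item.get("notes", ""),
            source_url=item.get("source_url"),
        )
        for item in items
    ]
```

In a chat interface, that same response would land as a wall of text; structured this way, it becomes objects the product can render as cards, comment on, and add data to.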
Paul: You know what we did, you know what we did—and what I think, I think this will be the trend going forward. We got rid of the virtual human.
Rich: We got rid of the virtual human.
Paul: There is no—
Rich: You are the human.
Paul: Yeah.
Rich: You’re the only human we want in the room.
Paul: It is very clear when you use Aboard that what the AI does is a software product executing on a CPU on your behalf.
Rich: Yes.
Paul: It is not saying, “As a large language model, I don’t want to take off my clothes.” You know, it’s just—
Rich: [laughing] They’re all named after, like, British butlers, too. It’s so exhausting.
Paul: Yeah. It’s gross. It’s a robot doing robot things, not a person, not a robot pretending to be a person.
Rich: Yeah.
Paul: And I think that that’s the future of this technology. I actually… things always kind of pretend to be more human than they are in the early days.
Rich: And then they kind of dial it back.
Paul: Yeah. I mean, it’s, I don’t know, like a good example, early Mac interfaces were all little smiley faces.
Rich: Yeah.
Paul: Right? And then that sort of slowly fades away, because too much personality gets in the way of you doing the thing you want to do.
Rich: It’s a tool. It’s not another person.
Paul: That’s right. Well, you just, you don’t want to deal with that friction.
[outro music]
Rich: Check it out. We think we’ve done something cool here.
Paul: Yeah, it’s hard to explain in audio, isn’t it?
Rich: Go to aboard.com. I think it explains itself and kind of guides you through its setup process. Board Builder is the big innovation, which is AI-powered. We have big ambitions for this platform. We view it as a platform. We want to solve big problems with it. So if you want to talk, reach out. We think we’ve built something pretty interesting.
Paul: That’s great. And look, share this podcast. Give us a couple stars if you’re in the mood.
Rich: A couple? That’s two, dude.
Paul: Okay. Five.
Rich: [overlapping] Five! “Give us a couple stars.” Oh, God.
Paul: Tell some friends. We should also point out, we have really nice, like, go to the website. We have good transcripts if you like to read instead of listen. I can understand why you might get tired of hearing our voices. And if you want us to talk about anything in particular, your questions or feedback, go ahead. You can email hello@aboard.com. You can also send us any questions. Get in touch. And, you know, tell your friends. We are looking to build things on top of Aboard for all sorts of organizations. That doesn’t mean that you gotta get your checkbook out. We want to talk to everybody and figure out where we can be most helpful. Just get in touch.
Rich: Have a lovely week.
Paul: Bye.