Don't Encourage Us

Artificial Intelligence and Our Culture

Episode Summary

Enter the Matrix with our hosts, as they delve into AI's role in shaping human culture, envisioning a symbiotic relationship between artificial intelligence and society. They explore the potential for AI to revolutionize creative expression and communication, emphasizing the need for proactive cultural shifts to ensure harmonious integration. Will AI help us reach new heights or drag us further down the path toward becoming Axiom Humans?

Episode Notes

Enter the Matrix with our hosts, as they delve into AI's role in shaping human culture, envisioning a symbiotic relationship between artificial intelligence and society. They explore the potential for AI to revolutionize creative expression and communication, emphasizing the need for proactive cultural shifts to ensure harmonious integration. Will AI help us reach new heights or drag us further down the path toward becoming Axiom Humans?

Reach the pod at DontEncourage@gmail.com
Discourage us on Instagram, X, TikTok, Discord, YouTube, and Threads

Episode Transcription

[00:00:00] Welcome to Don't Encourage Us, the show where we talk about the big ideas behind fiction projects of all different kinds. Books, movies, TV shows, video games, nothing's off limits. I'm your host, Brett, and I'm here with my co-host, Jermaine. How are you doing today?

Doing great, Brett.

Good to hear it. You come across anything interesting this week?

Just read another one of those Mark Greaney books, The Gray Man.

Oh, really? How is that?

That was really good. It's really picked up. It's a lot of fun.

Interesting. I've been hearing a lot of buzz about that. I don't know if the Netflix movie got people interested, or if there's talk of sequels or something like that. Any idea why so many people are starting to tune into the Gray Man series?

I think the plot lines are really complex, but it keeps them really accessible to a lot of people. So even though there are a lot of cliffhangers, it's not like a lot of books in that genre, which I think add a lot of extraneous characters.

A lot of side characters, a lot of really advanced [00:01:00] technology that they're talking about in one chapter before jumping to another scene, so it makes it a little difficult to follow. I think this one has a really good structure.

I listened to an interview with the author, and he was saying that his big issue is that you have to keep all the action scenes very fresh.

And the way the character reacts to them can't just get repetitive. It's really easy as an author to have the character always respond in the same way, as a go-to. Say he falls and breaks a leg. There are different ways he could handle that situation, but if you're an author who's writing this over and over and over again, with a character who's always put in these types of situations, it's easy to go back and always have him do the same exact series of steps in that particular situation.

So there's a lot of improvisation when he's looking at how the character has to behave, and that becomes really challenging when he's trying to keep the story fresh. And I think he does a really good job of it.

Did he say how he [00:02:00] breaks away from the expected? Like, how does he mentally, as an author, put himself in the frame of mind of challenging himself and the character to do things differently?

Yeah, he talked a lot about how, when he's writing, he doesn't just sit at his desk the entire day like a lot of writers might do. He takes a lot of really long breaks throughout the day, where he'll start walking and these ideas come to him: a new way to come up with a solution to a problem in the plot, or something about the character arc that's not working. He'll really explore it while he's out, just letting his mind drift. He says that's one of his best tools.

Otherwise, he feels like the story starts to get stale if he doesn't allow himself those mental breaks that really let his creativity flow.

It's so interesting, the balance that creative people have to find between cognitive load and novelty, you know? If there's too much that's new stimulating their [00:03:00] brain, it's distracting, and they don't have enough internal cognitive resources to create the structure and the flow of a story the way they need to, to move all those pieces in their mind.

But if they spend too much time in that space where there's not a lot that's stimulating or challenging their brain, then they lose that creative spark. Things become repetitive and less stimulating. They can do the formula that they've created, but they don't enjoy it as much or it becomes boring to the readers.

So it sounds like he's found a balance point for himself between novelty and things that reduce cognitive load.

Right. Yeah. And he's really careful about things like closed-circuit or surveillance cameras catching people in the act of committing these crimes. Because I feel like in certain stories the hero gets caught because of a surveillance camera, but then the villain never gets caught, and they're doing very similar things in a city, let's say.

But [00:04:00] he's really careful about that, about trying to close those potential issues in the reader's mind, the question marks. He seems to tie up loose ends really well.

He's got the reader's perspective pretty clear in his mind. Is that what you're saying?

Yeah, and a lot of the questions they might have, so, oh, this doesn't really make any sense because of X, Y, Z. He covers all of that really well, not to the point where he over-explains things, but enough that you let your mind move on from that particular loose end. He closes that loop and then brings in a new challenge for the protagonist.

You know, that can be really hard to do. Maybe for him, or for some creatives, it can be intuitive, because they're basically writing for themselves, or they represent their own audience. But if you're creating and you don't really know your audience that well, I imagine it's almost impossible to figure out what questions to address or at what depth to address them.

I'm thinking about some of the [00:05:00] early Tom Clancy stuff, where he has chapter after chapter after chapter of engineering. And it's like, are you planning to distribute your novel at MIT? Otherwise, maybe cut a few thousand words, shrink down some of that stuff about the engineering. But it's still hugely popular.

So I guess there's latitude, there's room to miss your audience or overshoot to some extent, in some ways if not others.

Yeah, that's a good point. And to your point about Tom Clancy, Greaney is also one of the writers writing those Tom Clancy books now. You know how they continue the Tom Clancy series? He's one of the ghostwriters on it. They're using the name Tom Clancy, but Tom Clancy isn't the actual writer, because Tom Clancy passed away; other writers write those books. So it's Tom Clancy's Guerrilla Warfare or something, written by another author.[00:06:00]

Oh, so they're slapping his name on the title.

Exactly.

Okay. Well, that's not as bad as a more common practice, which is that a lot of these authors who put out novels regularly, of a certain type, have other authors who take outlines written by the author whose name is on the novel, and then they actually write the novel.

That happens a lot with these sort of popular summer reads. Like Clive Cussler; I don't want to single him out, but for some reason his name pops into my head. Authors of that type often just write several outlines, and then other authors work on them.

Sometimes multiple authors on a book actually write it, they slap his name on it, and then they publish it. And that's why there are two or three novels out a year. So you're saying in this case, though, they're using a deceased author's name in the title because of the draw that it has. Yeah, [00:07:00] right. Well, I mean, what's your opinion of that?

I mean, I think if they tell you that it's another author who's writing it, I don't think there's anything wrong with it.

I think the idea of a ghostwriter writing for a famous novelist seems really shady to me overall. I know that happens a lot in nonfiction. I didn't realize it was happening in comic books and in fiction as well. I know a lot of business books are written that way.

Oh,

A lot of CEOs, a lot of, you know, big business people, have ghostwriters writing everything that comes out under their name.

Yeah. Yeah. Interesting. And I guess AI, artificial intelligence, is just starting to fill that role more and more, right? So somebody who maybe can't afford a really good ghostwriter could use AI to help play that role. Is that unfair? Is that bad for society? I don't know. What's your opinion on that?

I think the role of AI in general, [00:08:00] and in writing, is just such a difficult thing to address, since it's becoming so ubiquitous. I heard that Google is going to be integrating AI, ChatGPT-type technology, into all their products and services. At first, when this technology came out, there was a lot of talk that Google would downgrade your content online if it was written by AI, but that doesn't seem to be the case anymore. I think it's going to fundamentally shift the way we see writing in general, and its worth. You're always going to be wondering whether something was written by a human or not, and there are going to be people who accept that it's not written by a human, and others who completely push that away.

My question is: what do, you know, big-time authors think about this type of technology? It's a skill set that they've been slaving away at for years and years and years, all of a sudden being usurped by technology that can [00:09:00] write in seconds what may have taken them hours, weeks, months to do.

And as it starts to improve more and more, where does that leave the craft of writing? Is the craft of writing just going to become how good a person is at manipulating or prompting AI? And is that going to be something that you're going to be really proud of? I'm great at prompting ChatGPT to write a novel, and people are going to accept it.

Oh yeah, he crafted that novel by prompting AI. He's great at prompting AI, as opposed to, he's a great writer.

That's an interesting question. I'm trying to think of it from a fresh perspective, where I'm trying to free my mind of any prejudice as I think about your question. So a writer who's spent decades building a career, building their style, writing countless novels, probably hasn't really done it in isolation, right?

So they've had help. [00:10:00] They've had good editors. They've had other authors help them with ideas or with crafting things. Authors have had a lot of help over the years in different ways. So then, is this fundamentally different?

I mean, you could be a really talented writer and luck into a good editor who really shapes your excess words into something concise and enjoyable, or points out the weaknesses in your draft and helps you turn it into something useful. You learn from that, and gradually you write better and better. Or you could luck into no good editor, you know, and as a result you sort of languish on the edge of what could be good work.

I don't know which one's more common, and whether AI could step in and bridge that gap. Or, to look at it another way, not everybody has the same cognitive strengths, but with assistance, maybe they could [00:11:00] create really amazing novels. Without that AI assistance, maybe they never could.

Is that unfair? Yeah.

Yeah. I think that opens up a lot of overall philosophical questions about creativity.

Hmm.

An example in every single art form is a band whose album is crafted by great engineers and producers.

Right. Good example.

So if you replace that with AI, is that bad? You know, if somebody put together a recording and I was able to come in and shape it a little bit and make it better, is that bad?

I think about, let's say, a film director who has a great cinematographer shooting those shots on their behalf. Then they get a top-notch editor who's editing this footage together.

You know, let's say they're locked away for a couple weeks doing that. They show the edit [00:12:00] to the director, and they're like, oh, that's great, let's use that. And then the director puts their name on the entire project.

Well, I mean, think about it as an athlete, right? So maybe there's some high school kid out there who has the potential to be an incredible sprinter, but they don't have the coach. They don't have the resources to help with that. There aren't other people who can really help this individual get where they need to be.

But if AI were more accessible and it could step in and help, is that bad? Right? If it's the difference between this kid becoming a world-class sprinter versus ending up not pursuing it because they didn't get that coaching, that feedback, that guidance, that expertise, is that better? Worse? Is that how it should be or not?

You could make the same argument for bio-enhancements. [00:13:00] You know, if somebody has implants that allow them to achieve physical feats or mental feats that they couldn't without them, is that unfair? Is it a bad thing? Especially if what they're creating benefits other people as well, but they never would have achieved it without the help of technology.

So I don't know.

It depends on what your personal definition of creativity is. Is all valid creativity done in a vacuum? I would argue that no matter what, if someone's locked in a room, let's say writing songs or writing a novel, they've still been influenced by outside forces in their lifetime.

Right? So you could go down a big rabbit hole there. Someone who, let's say, has been isolated from the world and writes the great American novel based on their conception of what that might [00:14:00] be, versus someone who's lived in a city their entire life and is writing about that city, right?

They've had different inputs to create that final product, and, you know, are they both on the same footing?

Right. Yeah, I think it's always a group effort. That's something we've talked about before: in Western society, we love to hold up one person and declare them the quote-unquote father of something, or the creator, when in reality they're more of a focal point.

So then, if it's always a group effort, does it matter if some of that group is artificial intelligence, or technology that contributed to their success? Does it always have to be other people? And if so, is that fair? Right.

Can you level the playing field by using AI, for example, because it can reach a larger audience and it's not inherently biased in the same ways that humans are? Is that [00:15:00] good? Is that bad? Is that allowing people to reach their potential, allowing more humans to contribute more to society?

Or is it somehow out of bounds, or cheating?

I recently saw an interview with Elon Musk where he was alluding to this. The person interviewing him, I think it might have been Joe Rogan, mentioned Neuralink, these enhancements that enhance your brain so you have, you know, better reasoning ability, or more logic, or a higher IQ.

And he argued that we're doing this all the time in a different way. He calls the smartphone that type of enhancement, just with a delay in the interface between the smartphone itself and your brain. But it's essentially enhancing you, because you have access to every bit of information you could possibly want or need.

So is it valid to [00:16:00] have that type of enhancement available to you? Are you cheating as a human to be quote-unquote enhanced in that way? And when you do have these implants put in, what's the big difference? One is becoming part of your organic body, and the other one's just being held in your hand.

So, you know, is one more or less valid than the other?

Yeah. I think we're circling back to this idea that at this moment in time there's such a debate around AI; there are ongoing strikes that are, in part, about AI. Is it really as acute a situation as it seems, or is it just the rising tide of technology and its integration into not just our lives, but who we are as people?

Should we try to fight it and draw boundaries? Or should we just accept that technology is going to seep into our [00:17:00] DNA, more or less, and it's going to shape who we are, and we're going to shape what it is? Is it just something that we have to accept, or work with instead of fighting?

It is funny to me that AI is making a name for itself, doing things like creating art and writing stories and things like that.

I could have used it this morning.

I mean, right? If it's going to do anything, maybe it could just clean up my room. Like, I don't need you to create art for me.

Thanks.

You know, is this what we're moving into? Is this the equivalent of a new industrial revolution? Similar things are taking place: people's jobs are being displaced, but new jobs are forming in different areas. It's really hard to gauge exactly what's going on in terms of this job issue, where people are either losing jobs or new jobs are being created, because we're in it right now.

Mm hmm.

So we just don't have a full picture of what's happening, and we [00:18:00] won't for a really long time, until we can look back and actually analyze the data. But right now, I can see why it can be so scary for a lot of sectors.

Well, I mean, you bring up the industrial revolution, right? For a long time, human value, you know, often came from physical labor. A person's sense of self-worth, or their identity, or however you want to describe it, was partially built around how much physical labor they could do. And so losing those jobs to machines, essentially, or to technology, was very disruptive and very scary to people, and very hard for them.

But now we've become a culture, at least around here, that primarily values intellectual labor. And here comes technology again, taking away intellectual labor, and people again feel threatened by that. So I guess the question is, what's the best response? Do you just accept it and assume it'll be okay, like, well, we'll all adapt, it'll be fine? Some people would say the [00:19:00] adaptations we've made to the loss of physical labor have made us worse as a species. Right? We're less happy, less capable, less self-sufficient, less able to survive; basically devolution, essentially, or we've evolved into weaker creatures as a result.

But we've just sort of changed our values and said, well, we don't really value physical capabilities or skills, right? Survival skills, or being able to build things with your hands from start to finish. We don't really value that as much. So is having AI step in and do intellectual labor going to lead us to say, well, we don't really value being able to write cohesively, or structure things, or organize your thoughts, or start something from scratch? That's not really what we value; we've got machines that do that. So what do we value then? At that point, what do we have left?

That's a great point. How far are we going to take a value shift?

As humans, is it unlimited? As soon as the [00:20:00] technology can replace something, let's say writing, video, painting, which have typically been done manually up until this point, does it just keep going infinitely? Do we get pleasure in having the machines do it? And then the machines are doing it for our own edification.

Right. And then what do we value in ourselves at that point? The ability to do nothing? Like those who do nothing, who don't challenge themselves or develop themselves or face adversity, are the ones we hold up, or that's our goal as individuals and as groups? I don't know. I'm sure there's a classic Star Trek episode about this that we can probably just look up, and it would explain everything.

This already happened in the sixties.

I'm sure. This has been an ongoing theme. There was an episode of The Mandalorian, season three, where Jack Black was in it, and the [00:21:00] Mandalorian and Bo-Katan go to a planet on the Outer Rim where they've managed to build a society that's thriving. They've integrated robots to do most of their labor, and they make the point that if the robots stopped working, it would be chaos, because nobody knows how to do anything; the people who live in the city dedicate all their time to leisure. So you can't stop the robots, you can't turn them off, because it would be chaos within minutes.

And maybe that's what we're aiming for without realizing it, by having AI take on some of the labor. Maybe as a species, instead of getting upset about every advance of technology, we should take a step back and say: it's great to have these things, but as individuals, we need to develop ourselves in multiple ways, you know, sort of a 360 approach.

It's not okay for somebody to just be 100% [00:22:00] intellectual, or to not develop their intellect, or to not develop their creativity. Everybody should put some time and effort into trying to develop all aspects of humanity: empathy, creativity, cognitive discipline, physical discipline. We should all develop ourselves, and maybe that's the goal. Maybe that's where we're headed. The AI and the technology will just exist to help in those areas where you're weaker, or to help us develop those sides of ourselves. I don't know, maybe that's a better path. And how would that relate to the current strike, right?

So writers and actors are striking because, in part, and this is just one small part of it, they don't want AI taking over certain tasks. Maybe AI could be an assistant; maybe it could be limited to an assistant role. Is there some way to do that?

I think with that part of AI, it becomes more and more [00:23:00] difficult to relegate it to being an assistant, because as a species, we are always looking for advancement. So you have people who work in those areas of AI, let's say trying to create avatars, or whatever it might be, or creating, you know, movies; we've talked about this idea of movies completely created by AI.

So it's almost like you're fighting not only the technology, but the people behind the technology and what their end goal is.

Yeah, I can see that. But what if you set bounds around it? What if you said it's not okay to have an AI-generated actor in a film? That's just not allowed; if that happens, the film isn't going to be released, or there are limits set around that. But what is okay is for an actor to use AI to give the actor a bunch of different versions of him or [00:24:00] herself playing a particular scene, to generate them on film or, you know, video, so that the actor can watch an AI-generated version of themselves deliver lines or do fight scenes differently, and then make a decision about what they want to try when they perform it.

That's an assistant. Same with the director.

If you have a completely AI-generated movie, couldn't you just have AI-generated actors within that movie, who could then be their own movie stars, so to speak? Let's say you did a series of films with an AI-generated actor. I think it comes down to audience acceptance.

If the audience accepts that version of the future, then it'll be a fight between traditional moviemaking, or this AI-assisted moviemaking, and this fully AI-created world that people are consuming. And can it be [00:25:00] monetized? If the answer is yes, then you've opened up a whole new set of issues.

So I don't know who would set up these rules in general. Maybe for human actors there's one set of rules, but who's setting it up for this other side of the coin, the fully AI world that's creating entertainment?

Well, I think ideally it would be the culture, right? So acceptance is key. Let's say, as a group, Americans decide: hey, the role of AI is an assistant. It's perfectly fine to use AI to help you write something, or to learn something, or to perform or produce or create or clean or organize, or any of that kind of stuff.

But it's not okay just to have it replace people. That's just not what we want. Then a movie that has a purely AI-generated performance, not just a character, because special effects are one thing, but an actual performance that's [00:26:00] entirely AI, if people are like, nah, we don't like that, then the acceptance is gone.

So maybe that's the key. Maybe it's more about changing people's attitudes about the role of AI, instead of trying to present it as the bad guy or taking an all-or-nothing approach. Just accept that it's part of our lives and integrate it in a way that makes our lives better. But you're right, there will always be that sort of base instinct, those people who will try to use things to cheat, cut corners, get around things, avoid the hassle and the expense of dealing with real people.

But that's always the case with everything with people. AI as an assistant, but not as, you know, leadership, or as the creator, maybe.

Yeah, we'll see how, how that pans out.

Is the law fast enough to keep up with the technology and the changes in the technology? Is it even possible for it to move [00:27:00] that quickly, the way that it's currently structured?

Well, the law is definitely not, but it's not the only force. You know, acceptance, as you said, is, I think, more powerful and faster than law by far. So if the prevailing opinion changes, if there's a culture shift, like: AI is great, it's super fun, it can help, but you really need to be the driver; you can't just rely on AI to do certain tasks for you.

As a user, as a creator, as an individual, it's important that you take ownership and that you be the driving force, so that who you are is reflected and you're developing yourself. And if enough people buy into that, it just becomes a given, where everyone agrees, like, oh yeah, that's no good because that was all AI, it was just the AI running the show, or whatever.

Like, you know, everything from self-driving [00:28:00] cars, right? The idea that you get in a vehicle, it drives you to your destination, and you get out is very appealing. But if culturally, as a species or as a group, we were like, no, that's not right, you should always be the driver, and AI is just there to help out, take the edge off, maybe deal with emergencies on occasion, but that's it, you should be the driver. Then maybe they'll never make cars where no one's in the driver's seat.

Yeah, I think you're right about the culture shift. And I'm wondering if they had disclaimers on movies around how much AI was used, some type of label. I know that regulation in and of itself is probably not going to work, but you know how many industries have this, like being certified organic? There could be an AI-certified or not-certified mark at the beginning of a piece of creative [00:29:00] work.

It's like, USDA certified 80% human.

Yeah, exactly. Like, this movie was 100% created by humans. No AI was used in the writing of the script or in any of the other creative areas.

Again, more realistically, they'd come up with some branding that articulated simply the idea that AI was limited to an assistant role. It's there to help bring out the best in people, help them create, but it's always human-driven. It's not impossible.

It's perhaps more realistic than any other solution I've heard. Technology is going to continue to advance and develop, and I don't think that's inherently a bad thing. Our culture needs to keep up with the shift, I would suggest, and if it can do that, if we set boundaries and rules based not on laws but on what makes sense to us and who we want to be as a people, then we're getting ahead of it.

We're proactively making the shift. Nobody wants to work in a meat factory; maybe not nobody, but most people don't want [00:30:00] to work in a meat factory, or even an auto plant. It's not necessarily the best environment, and machines can do some of that stuff more safely, more easily. But if those who would have done that work have a worse life, or are worse off for not having that work, then that's a bad outcome.

If we don't prepare for these things, it's still going to be integrated into society, and the outcome is just less predictable. And there will always be people who cut corners, who lie, who take advantage of things. That's where law, I think, often comes in, and it can develop a little more slowly, if the prevailing sentiment, the larger culture, essentially enforces the rules a lot of the time.

I think that may be a good balance between the two, balancing culture and the law.

Yeah. If you focus on the development of individuals and making sure that humans are always in the driver's seat, so to speak, [00:31:00] it could work. People are free to develop AI, but developing it toward replacing people is not going to be valued by the culture.

Developing it toward assisting people, understanding the user, or interfacing more effectively with the user to get the best out of them and letting them make the decisions, so that AI is not thinking instead of them, but helping them be creative, or think, or organize their thoughts, or select or imagine certain things, right? Then maybe that's the best combo. Or maybe that could work, I don't know. But you're right, there will always be people who cut corners. For sure. And there'll be a market for that.

Yeah, absolutely.

Unless we install a chip and find the right technology.

That's for another podcast, our take on, on AI hardware.

Yeah. Oh, you know, wouldn't it be funny, though, if this idea of AI never really being in the driver's seat, always being [00:32:00] an assistant, results in us all having something installed that reminds us that we could be better people all the time? You know, it's like: hey, you've been on Instagram for a while. As your onboard AI assistant, I just wanted to remind you that your wife might be feeling a little neglected, and it's been a while since you've done crunches.

Would you like to do some crunches right now? Oh God, no. Just, uh, just going to look at Instagram. Well, I feel like I'm not really doing my job as your AI assistant.

That's really funny. When you mentioned that, you know what I thought you were alluding to? Having a hologram next to you all the time, an AI assistant, like a comedy where someone programs the AI in the beginning because they want all of this: they want to be more disciplined, they want to do the right thing, et cetera. But it just becomes this annoying sidekick that they can't get rid of.

Someone who previously campaigned for AI to be limited and pushed [00:33:00] into this role, and now they've got this hologram next to them at McDonald's, like: do you really want to supersize it? Do you know how many fries there are in just the regular small fries? Plenty.

Or it would be funny if they just weren't that smart, like a dumbed-down version of a business leader or someone really famous, and it's just the dumber version of them giving advice constantly, and it's just the wrong advice. And they go on an adventure together,

That's funny.

like Planes, Trains and Automobiles,

Yes.

with an AI hologram.

Oh, it's a buddy comedy, but with your own AI hologram.

You pay one actor, and you don't have to pay the other one.

Oh, so it's a copy of you.

See where we're going? It's not like everybody gets a Chris Farley next to them.

Or there's some kind of low-budget AI avatar that everybody uses for these movies. Yeah, they all look the same, they just have different personalities. Sometimes they're a villain, sometimes they're [00:34:00] your sidekick, sometimes they're in a romantic comedy. But they look exactly the same, because they're, like, Creative Commons licensed. You don't have to pay for the rights.

Yeah, it's just the same person dressed up differently.

Okay, I guess that does it for AI and culture. Thank you, as always, to the audience. We'd love to hear your thoughts on the role of AI. Reach out to the pod using any of the links in the show notes. See you next week.