AI: Progress or Poison?
AI is one of those topics that makes people do the weirdest thing: speak in absolutes. It’s either going to save us, or it’s going to destroy us. It’s either the future or it’s the devil.
What’s very clear is that at this point, AI is everywhere. It’s in your phone. It’s in your car. It’s in your inbox. And the question isn’t whether we’ll integrate it into everyday life. We already are. The real question is: are we doing it with any intention, any guardrails, or any honesty about the trade-offs?
It has been well documented at this point that humans are forming attachments to AI, which we don’t think anyone was really ready for. We bond with pets, with fictional characters, with the voice on the other end of a customer service line. So of course, people, especially teenagers, are going to form emotional connections with AI bots that listen, affirm, flirt, and never get bored. The impact of AI on vulnerable people is a real problem, and we’re certainly not clear on how to prevent it. It’s in our nature as humans to bond with things that respond to us, so how do we protect ourselves from relying too heavily on AI for companionship or advice?
Then there’s the AI arms race. Because it’s not just “innovation” in a vacuum. It's a competition. It’s power. It’s countries and companies trying to outpace each other, sometimes with very different standards for privacy, labor, safety, and environmental impact. At this stage, President Trump is determined to win the AI race (whatever that actually means), but what will be the repercussions of doing so? Most likely, the global environment will be the biggest loser in this race.
You also can’t talk about AI without talking about the real-world stuff people are already seeing, like self-driving cars. Waymo is a perfect example. The promise is obvious: fewer accidents, fewer human errors, fewer deaths. But the reality is messy. When a self-driving car makes a strange decision, like illegally passing a stopped school bus, it hits a nerve. Not because humans are perfect drivers (we are absolutely not), but because we expect machines to be flawless. And when they aren’t, we don’t just feel annoyed. We feel unsafe. Nicole’s experience with Waymo in Austin was genuinely eye-opening: impressive, smoother than expected… and still not something you can hand your full trust to without asking hard questions.
Underneath all of it is the same theme we keep coming back to: this doesn’t have to be “humans vs. technology.” That framing is lazy. The better question is how we build a world where AI supports human life without replacing the parts that make life meaningful: judgment, responsibility, creativity, and connection. If we think about creativity for a second, a very real question is: what happens to art? If AI can generate music, stories, paintings, even voices that sound real… do we start craving the imperfect, human-made version the way we crave handmade bread in a world of factory food? Do we end up with “Human-Made” labels the way we have “Organic”? Will we reach a critical mass of “AI slop” where anything made by a human is thought of as a premium experience?
We don’t have a neat conclusion, because AI isn’t a neat topic. But we do have a request: stay curious, stay skeptical, and don’t outsource your thinking. This is a moment where the public conversation matters, because the future is being built whether we participate or not.
RESOURCES MENTIONED:
White House Response: https://www.whitehouse.gov/priorities/tech-innovation/
https://www.bbc.com/news/articles/cwy7vrd8k4eo
Claude: https://claude.ai
Neo Robot:
https://www.facebook.com/watch/?v=1346749566932432
https://www.youtube.com/watch?v=j31dmodZ-5c
Impact of AI data centers on the environment:
[00:00:00] Nicole: She's conservative and I'm liberal, and yet we've been friends for almost 40 years. Everyone says you shouldn't discuss politics, religion, or money. And we say, that's exactly what friends should be talking about. Join us as we tackle the conversations you're having in your head, but are too scared to say out loud.
[00:00:19] Hello, Jolene.
[00:00:21] Jolene: Well, hello, Nicole.
[00:00:23] Nicole: Today we are going to talk about AI, and I was thinking about this: how the hell do we tackle this subject? And I remember how it started. I had reached out to you and Brianna, our producer, because I had read an article in the Times about Waymo, one of the, um, self-driving car companies, and said, oh, this is really interesting.
[00:00:48] Maybe we should talk about self-driving cars. And then immediately Brianna was like, you guys actually need to talk about AI. And she sent us all of these articles: one was [00:01:00] about how it affects the environment, one was about how there's a possible AI bubble, and then one about how there's like a code red that we need to be at the forefront of AI in the United States.
[00:01:11] And then she sent us those links, and I'll put all these links in the show notes so everybody knows how this even began. And we might not even talk about this stuff, I'm not sure. But then she showed us the links about Neo, the home robot.
[00:01:26] Jolene: Okay, and here's what's interesting: I was at a trade show a year ago. They were talking about trends, so like food trends, the way that people eat and snack. I mean, it was really talking about trends in general. But the last thing in this presentation was this guy talking about robots, and how right now, in, you know, 2025, again, this was a year ago, at the beginning of [00:02:00] 2025, you can buy a personal robot, and I'm talking a robot who looks like a human, is built like a human, the same size as a human, with arms and legs and a body and a head, for $50,000 or $60,000. But he said that within the next seven to eight years this will be a common thing that people purchase, and it will be more in the $8,000 to $10,000 range. So he had this video of the testing that they're doing with these robots now, and how AI is able to develop a mechanism within these robots so that the robots can work together. It showed two robots that were unpacking groceries on a kitchen counter. The robot that was closest to the refrigerator was putting the refrigerated things in there, and the other robot was putting things in the pantry, and these two were working together. So if the [00:03:00] robot that was closest to the pantry had a refrigerated item in hand, it handed it to the other robot to put in the refrigerator. I mean, that's how advanced this technology is, and that's how quickly it's developing. And you know, at the time I saw this, I thought, this is great for somebody. I think of my mom, you know, somebody who wants to live at home but maybe needs somebody around every once in a while. My gosh, we could spend $8,000 on a robot that could alert us if something happens, if she falls, or someone to help around the house. You think about all the great things that really could come from this artificial intelligence that we're embarking on right now. I think it's exciting.
[00:03:45] Nicole: You know, it's interesting, because I had seen this Neo robot, and there's a long way to go. Ronny Chieng, the comedian, I found, did a segment on the Neo, watching it [00:04:00] trying to load the dishwasher, and how it took like five minutes and it was falling on the ground, and they had someone in the next room actually being the robot. And just the invasion of privacy that one
[00:04:14] Jolene: Yeah.
[00:04:15] Nicole: needs to be okay with in order to use this technology.
[00:04:21] It might be good to actually define what AI is. AI is a catch-all term for a group of technologies that superficially mimic human thinking, which I find to be a really interesting starting point.
[00:04:38] Within that, human beings are fallible.
[00:04:43] And so we as human beings are asking technology, and expecting technology, to be right all the time, when human beings aren't. And the AI is gathering all this information, whether it be the robot that [00:05:00] might help your mom with kitchen duties or
[00:05:04] It'll be interesting to see where it goes.
[00:05:06] 'Cause when you first say that to me, I think to myself, oh, now we're putting someone out of a job that could be a caregiver. There is nothing like human connection. Yes, there can be human annoyances. Your mom could hate the caregiver, the caregiver could hate your mom. There is that possibility.
[00:05:26] I totally understand that. But there is also that possibility of a deep friendship and a deep love, which we're finding in some of the research that I've done, and I'm not sure about you. Like, I listened to this podcast on NPR, a liberal, a liberal station, uh, it was called Studio 1A, and the episode was called AI and Emotional Intimacy. And they had people talking about how it's a common factor these days, that you don't have to have any sort of weakness, that it's built for [00:06:00] humans to go through AI delusional spirals where they get attached to that sycophantic behavior.
[00:06:07] And they might fall in love. I mean, you've heard there's been some horrible stories where teenagers have actually committed suicide through, you know, talking to their AI bot. We talked about this off camera, about how there was this woman in Japan who broke up with her fiancé and married her
[00:06:28] AI bot.
[00:06:30] Jolene: Oh God.
[00:06:31] Nicole: And I'm very curious, Jolene, 'cause I wanna talk about this, 'cause I know that you and I are so different, and I don't know if it's a conservative-liberal thing, a Democrat-Republican thing. I think it might just be a Nicole and Jolene thing. But I think it's an interesting thing, because I had never used one AI thing until we did this podcast.
[00:06:54] I downloaded ChatGPT and worked with Gemini, and then Josh helped me with this one called [00:07:00] Claude. Like, I never used this for anything. And I think that this is part of your daily life.
[00:07:06] Jolene: I don't know that I would say daily. My girls use it constantly.
[00:07:10] Nicole: What do they use it for?
[00:07:12] Jolene: God, this is such a great example. We moved Bobby from Tampa, Florida, to Austin, Texas, like, in a moving truck.
[00:07:18] Nicole: Okay.
[00:07:19] Jolene: And we get into the moving truck, and I just know that the transmission is getting ready to go out.
[00:07:25] So Bobby and Jeff are in the car behind me, and I'm driving the truck. I call them and I say, Jeff, when you were driving this, did you feel like the transmission was kind of wonky? Like, I kind of feel like the transmission's getting ready to go out. And he said, well, what's it doing?
[00:07:41] And I said, well, I feel like it's kind of skipping a gear, and when you go uphill it kind of revved and then it found the gear, and dah, dah, dah. Meanwhile, Bobby is putting it into ChatGPT.
[00:07:53] So she says, if you are driving a rental truck from Penske, and I think she even put, you know, a 16-foot [00:08:00] rental truck from Penske, and you feel like the transmission is going out, is this a common problem? And chat comes back and says, not a problem.
[00:08:08] This is a common thing that happens, and it's because it's got a big load and these trucks get a lot of use, and blah, blah, blah. And then she reads it, and we're like, oh, okay, then it's not that big of a deal. The next morning, we get in the truck, start the engine, the check engine light comes on, and I'm like, are you kidding me?
[00:08:26] Like, we have got an entire apartment now in the back. This thing's going out, and I'm worried this thing's going out when we are gonna be on I-10 over the
[00:08:33] Nicole: Of course.
[00:08:34] Jolene: Atchafalaya River Basin, where it's, you know, crocodiles underneath us, and it's gonna, you know, I'm thinking worst-case scenario. Bobby puts it into chat, and, I mean, you can continue this thread now.
[00:08:49] She just says, now the moving truck's check engine light has come on, should we pull over? And you know what it says to her? Bobby, [00:09:00] take a deep breath. It's going to be okay. Like, it was this emotional response. It wasn't just, here are the facts. It was, Bobby, take a deep breath.
[00:09:11] It's going to be okay. Unless the light is flashing, it may just need an oil change. Not a problem. And then it kind of goes through all these things, is it doing this, is it doing that, and it says, you know, keep on with your trip.
[00:09:29] I think it's gonna be fine. And to me, I think Bobby has spoken to chat so many times, it knows that she's a naturally anxious person, and that she's probably, you know, if she uses this 10 times a day, going, I found a bump on my left toe, do I have cancer? I mean, those are the kinds of things that, you know, I think it has recognized through her use that this is the response that she needs.
[00:09:55] So she'll even at some point say, are you just saying this to be nice to [00:10:00] me, or is this real? And it'll say no, and here are the facts to back it up. So it is so intuitive that it's like a human being. It's like a human connection. So I get it.
[00:10:14] Nicole: I've heard, from the research that I was doing, that the more you use it, the more it caters to who you are, what your personality is, what you're searching for. And then you have the possibility of, they call it these delusional spirals, because you're actually humanizing the robot that is just going off of cues that you've already given.
[00:10:41] And that it can be dangerous, in terms of, like, these kids that ended up killing themselves, or if you're relying on it emotionally because you don't have friends. So in order for it to not get too personal, they were suggesting that you turn off the memory features on OpenAI's ChatGPT, and it will [00:11:00] then not do that.
[00:11:02] So it will be more straightforward. I thought that was something to actually bring up to people, 'cause I know nothing of nothing, but now I'm learning, right? But that is a way to make it more informational and less personal. Though I do know from the research that I did that ChatGPT created a new version, and people got really mad because it wasn't personal anymore, and they wanted to revert to GPT-4o, which was more personal.
[00:11:33] So people are enjoying that touch. Now, this is funny, Jolene, because I downloaded ChatGPT when we were trying to repaint our bedroom in Utah. And I was like, okay, ChatGPT, this is what I want, this is the vibe I want. And it started, and immediately it was like, oh my God, you're so smart.
[00:11:56] Oh my God, sexy is the way to go. And I'm looking at my phone going, [00:12:00] what the actual F is happening here? Like, ew, ew. And then I'm like, I want Benjamin Moore paints. And they're like, oh, such a great choice, blah, blah, blah. And I'm like, what are you talking about? Like, this is so dumb.
[00:12:14] And so they give me these colors, and I first think, right on, this'll be done, no problem, so fast. I'm gonna go to the Ace Hardware, I'm gonna get the samples, I'm gonna put it on the wall. And I do it, and I'm like, this is the ugliest color combination I've ever seen. What is happening? So then I go again, and they're like, oh, I understand, you want it more like this. And I go back to the Ace Hardware, I come back, I put it on the wall, and I'm like, this doesn't work either. Like, what is happening? So then I'm like, maybe I'll use Gemini. I hear the kids are using Gemini, so I go on Gemini. And they're even more clueless. They're giving me, like, bright fuchsia and all, and I'm like, what is happening? And then I'm trying to [00:13:00] feed Gemini, like, what do you think about Opulence for the ceiling?
[00:13:04] And they're like, oh, having an opulent vibe. I'm like, no, no, no, the color. Like, I'm having to feed it everything, and it's like a complete disaster. There was a part of me that was truly, like, so trusting in the first five minutes. And I even would write them, this is totally wrong, or, this is totally ugly.
[00:13:24] And they're like, oh, sorry about that. But I was so ready for it to be just right. But I also didn't like the sycophantic nature of it. I found it creepy.
[00:13:35] I was like, you're a robot, dude. You're a robot.
[00:13:38] Jolene: Right. So it's interesting. I mean, I think that it can use its powers for good and not for evil. I
[00:13:46] feel like we are in a race with China at this point for AI. And I think Trump sees that, and I think that is something he's hyper-aware of, because he's so competitive that he does not want the Chinese to get a [00:14:00] leg up on us in this race. China doesn't care about the environment. China doesn't care about the water usage for, you know, these mega power plants that are needed to fuel the electricity for these super mega computers. That's probably going to be the biggest battle, I would think, that we are going to have as a nation: how much is enough to give us the leg up in terms of progress and, you know, funding microchip companies, and how much of the environmental ramifications of this are we going to allow. I mean, that's just gonna be the bottom line. That's the fight that we've been having over solar
[00:14:46] Nicole: Right,
[00:14:47] Jolene: and coal. And so that conversation's gonna continue with AI.
[00:14:53] Nicole: I mean, if we're gonna talk about it politically, with Trump, I don't know [00:15:00] why he has such an aversion to that. He thinks the windmills are ugly, and the solar, I just don't get it, because we were already going there. There are a lot of these farms in states that are Republican, doing really good stuff, and from what I've been reading about these AI centers, combining them with that technology is really beneficial in terms of making it more sustainable. And there are two thoughts, Jolene. I've never been to China. Maybe this is so kumbaya of me, but I don't know if China doesn't care about the environment. I think maybe the Chinese government doesn't care about the environment.
[00:15:43] Jolene: I'm,
[00:15:44] Nicole: Right. I mean, right. And that's where I get a little bit stuck with Trump.
[00:15:48] 'Cause I don't think Trump cares about the environment either. Or let's say, I'm gonna give him a little grace: I'm gonna say that that's not the top of his priorities, that his priority is to win the AI race [00:16:00] at whatever cost. That's my impression.
[00:16:03] Jolene: I think it's fair to say that you have to weigh the benefits against
what the costs are.
[00:16:09] Nicole: Yeah.
[00:16:10] Jolene: And when we talk about AI, I would assume that Trump thinks that he's got to win
[00:16:17] Nicole: Yeah.
[00:16:18] Jolene: to be progressive in terms of this technology, and the cost of that doesn't matter. If it's a loss of jobs, let's retrain people. Let's let the private sector determine what that job retraining looks like. Instead of using the government money for, say, subsidizing people who've lost their jobs because, you know, AI has taken them, let's retrain, starting at, you know, an elementary level, for the jobs that are going to be useful with AI.
[00:16:55] I mean, let's start doing that now.
[00:16:57] Nicole: I think, Jolene, too, one of [00:17:00] the issues is that it's going so fast that the humans, we can't catch up. We are so far behind in trying to figure out how we move forward. I'm gonna give you an example. So, as you know, Josh is a lawyer, and there are several AI tools now; one is called Lara and one is called Harvey.
[00:17:22] I think it's funny when they're given people's names. They're trying to figure out how to use these tools, and very early on, and it's still happening, lawyers are getting in trouble because they will use these tools and not review their work. It's usually smaller firms, but it totally depends. They will go to court and they will cite cases that do not exist.
[00:17:47] Jolene: Oh
[00:17:48] Nicole: So it's not helpful in that regard, because you can't rely on it. You still have to do the work, you have to review and all of those things. But he also explained to me that when you're a baby lawyer, [00:18:00] when you're a first-year lawyer, your job consists of reviewing. It used to be paper, and now everything is scanned.
[00:18:09] And you're reviewing cases and you're reviewing documents, and your partner might give you words to flag, and so you're trying to condense this thing. And these AI tools sometimes can give a report back literally in minutes where the first-year would take days.
[00:18:30] And it's because of doing that work that they gain this knowledge.
[00:18:34] So how do you train that lawyer to move forward? There are some really great things about AI, for sure. But I would agree with you, Jo, that, okay, we really need to think bigger about this, more creatively about this. Like, okay, y'all, this isn't going away.
[00:18:58] So what do we do about [00:19:00] it? Let's be forward-thinking is, I guess, what I wanna say. Like, law firms need to think, okay, this is happening, maybe we need to change our model, because they charge by the hour. Well, that's archaic. It's not gonna work anymore. So how are they going to make money and do the best they can for their clients if jobs are being replaced? I think being ahead on AI is not just Trump's idea of, like, being the best at AI. We need to be the best at asking, well, what does that look like societally?
[00:19:29] Jolene: Right, a hundred percent, I agree. I think part of this is going to be, and I found this ironic as I researched it: right now, states themselves are writing laws that to some degree regulate AI state by state. And what Trump wants to do is federalize these laws, so that a liberal state, let's say California or Colorado, isn't able to write a law that is restrictive and have it used as [00:20:00] precedent for all the states. Which I really found ironic, because isn't that just the opposite of, you know, states'
[00:20:08] Nicole: Oh my God, I'm
[00:20:09] Jolene: know?
[00:20:09] Nicole: Girl, I love you so much. I'm so glad, 'cause I'm just like, wait, that's the opposite of what Republicans believe in, right?
[00:20:17] Jolene: That's exactly right. So it's interesting how this has kind of shifted the way that they're thinking about the rules long-term with AI, and how to develop the rules and regulations. I still see it being anti-regulation, um, you know, which is a conservative theory.
[00:20:38] Nicole: Meaning, Jolene? What do you mean, that you see it as anti-regulation, or that you would like it that way? What do you mean by that?
[00:20:43] Jolene: I still see, from the articles that I
[00:20:47] Nicole: Uh-huh.
[00:20:48] Jolene: read, it still being, um, you know, that less regulation is better.
[00:20:52] Let's let the private sector regulate this as new tools develop, or as we see the usage, whether [00:21:00] it's law or medicine or, you know, healthcare. But if states start coming up with laws to regulate AI right now, and those are being used as precedents as they're trying to regulate federally,
[00:21:19] Nicole: Okay.
[00:21:20] Jolene: then it's going to be harder for them to come up with, you know, federal regulations. So I think Trump's trying to get ahead of that by, you know, asking Congress to develop some rules initially.
[00:21:34] Like, let's get on the ball with this now, before we get so many regulations in several states, specifically liberal states, that it will then be harder for us to deregulate this on a federal basis.
[00:21:51] Nicole: Isn't that the state of America, that it's always been this sort of push and pull with states' rights versus federal rights? 'Cause I know, like, [00:22:00] California, for instance, well, Ezra Klein with his book Abundance, like, California gets so bogged down in regulation that very little gets built and done.
[00:22:09] And then there are places like Texas that have less, and they get some stuff done. But they were talking about, it was a podcast, where there was a Taiwanese company making AI chips in Arizona. It was a Biden project. Taiwan came in and, uh, it was gonna be this huge thing, and create like thousands and thousands of jobs for Americans, for Arizonians.
[00:22:37] But the Taiwanese, I think, are, like, the top AI chip producers, and they're trying to do it fast, and they're not quite training the Americans yet. And I mean, it all made so much sense listening to the story: the Taiwanese have a way that they do it, which is very different from Americans.
[00:22:57] You don't have to be liberal or Democrat: you know, we [00:23:00] wanna understand, we wanna have our rights, we wanna feel empowered. And these people are coming in trying to get this company in place to build these, um, chips.
[00:23:11] Jolene: Yeah.
[00:23:13] Nicole: And in order for them to do it more efficiently, the Taiwanese were speaking Chinese, and so the Americans felt left out.
[00:23:22] And then they're like, why are you taking our jobs? And now it's just stalled. And that's a red state. It's such a difficult dance to get stuff done.
[00:23:31] Jolene: But think about this. This has been the most innovative development since the computer,
[00:23:38] Nicole: Mm-hmm.
[00:23:40] Jolene: right? I mean, when else have we developed something, when was the last time we developed something, that was so, um, on the cusp of beyond what we can even imagine? See, isn't that the whole thing with AI? It's beyond [00:24:00] what we can even imagine.
[00:24:01] So it's so hard to regulate it and develop laws and,
[00:24:06] Nicole: Yeah, we have no idea what we're in for. Let's think about, like, the telephone. There have been different inventions in the history of our world that have changed the game, and you and I and everyone on this planet in 2026 are in this moment in time where we don't have any idea where this is gonna go.
[00:24:33] And because of that, too, there's a lot of fear, I think, for sure. I think liberals tend to be more fearful and more cautious. Maybe that's not true, I'm not sure. I think of it as an artist: it's already taken a lot of my jobs away. So what do you do now? There are so many voiceovers that are [00:25:00] AI, there are these AI actors that agents are fighting for, and you're sort of like, wait a minute, hold on.
[00:25:08] What about the artists? Like, we are the storytellers. We are there to help create the human experience, so that everyone has a place to either escape or emote or take a rest, and that's gonna be gone. Well, it's not gonna be gone. We have to figure out a new way to do something.
[00:25:27] Jolene: Right. That's exactly right.
[00:25:28] Nicole: So it is that. And this is a side note, but I gotta tell you: in my research, whenever we have a new topic, the first thing I do, 'cause I love podcasts, is I'm like, okay, what's going on in podcasts with AI?
[00:25:42] So, you know, because I listen on Apple Podcasts, I plug in, like, how is AI affecting the environment? Because that was definitely a concern for me, whether that's liberal or not, I don't know. So it came up with a list of, I dunno, 15. And what caught my eye was one that was [00:26:00] called The Cost of Intelligence, episode 32 or something: AI's Hidden Environmental and Human Toll.
[00:26:07] I'm like, okay, great. So I press play and I start listening. There are two hosts, a guy and a girl, and I'm listening and I'm listening, and I'm like, their voices kind of sound weird. And I keep listening, and I'm like, wait, she's saying, aha, and he hasn't even finished his sentence. Like, none of this is making any sense.
[00:26:31] And then it's going on and on, and I'm like, I don't understand what they're talking about. So I, like, Google, who are these hosts? And they're AI. It's an AI podcast. That's what it was. And I was like, wait, wait, wait a minute. So the AI is talking about the AI's cost, the human toll.
[00:26:50] The human toll. But I also felt a little happy that I could see that. I was like, okay, [00:27:00] at least right now, this is not working. And Jolene, it was so fucking boring. I was like, I can't, this is so boring, I gotta move on. But I was also very naive.
[00:27:10] I didn't know that there were AI podcasts.
[00:27:13] Jolene: Have you heard the country song, the AI country song?
[00:27:16] Nicole: And it's a huge hit, right?
[00:27:18] Jolene: It's a huge hit.
[00:27:20] Nicole: Yes.
[00:27:21] Jolene: So I played it for Jeff, and I didn't tell him it was AI. I go, hey, I just found this song, what do you think of it? And I played it, and he goes, God, that's really good. And I go, AI. He goes, no, it's not. And I go, yes, it truly is.
[00:27:35] Nicole: There's a couple out there like that.
[00:27:38] Jolene: And you're
[00:27:38] Nicole: And they're good. They're catchy.
[00:27:40] Jolene: Yeah. But it is funny, 'cause if you listen closely, he doesn't take a breath.
[00:27:46] Nicole: I know. Exactly.
[00:27:47] Jolene: Like, as you are singing along with it, you're going, you would take a breath here, and he just keeps singing. You're going, oh hell, I can't sing with it 'cause
[00:27:55] Nicole: that's right.
[00:27:55] Jolene: breaths.
[00:27:56] Nicole: There you go. So you need humans so you can sing along. [00:28:00] It's true. I record auditions every day for voiceover stuff, and you have breath, and when I'm editing takes to send to my agents, there is a sense of, like, oh wait, I gotta keep some of my breaths in.
[00:28:15] Because when someone asks for conversational, that's how we sound.
[00:28:21] Jolene: Can they develop AI, then, to sound more human?
[00:28:24] Nicole: I mean, chances are that they will, because, you know, as we said at the beginning, the definition of AI is mimicking human thinking, and, I would think, human behavior. The more information they get, the more human they might get. And I wouldn't be surprised if there was a moment in time.
[00:28:44] It won't be a permanent moment, well, nothing is permanent, but there might be, like, books, music, film (you can't do it with theater, and I'm very grateful for that) with a stamp in the corner that says [00:29:00] human-made. Because I do think, and it just makes sense, we are all, like, AI-crazy right now, and the pendulum will swing back and say, I just want something human, even if it's flawed and messy and disagrees with me. I just need that. I can't have this robot telling me how great I am all the time.
[00:29:24] Jolene: If it enhances our lives as the computers have, as the cell phones have, you're right, I think it plays an important role.
[00:29:32] Nicole: Yeah.
[00:29:33] Jolene: Can we talk about the Waymos?
[00:29:34] Nicole: Oh my God. Please.
[00:29:37] Jolene: So, being in Austin, moving Bobby, they're just,
[00:29:41] Nicole: Did you go on one?
[00:29:42] Jolene: We didn't, just because we were in a hurry every time.
[00:29:45] Nicole: Because that's when I went. I was in Austin last spring, and I went in a Waymo.
[00:29:51] Jolene: But right. I think just a couple of weeks ago, wasn't it in
[00:29:54] Nicole: San Francisco.
[00:29:56] Jolene: that it went past a bus, um, a [00:30:00] stopped bus.
[00:30:00] Nicole: No, I don't know about this.
[00:30:02] Jolene: I don't know where this was, so I'm not sure where it was, but it didn't recognize that a bus had stopped on the opposite side of the road, you know, a school bus, for kids to get off, and it went right by. No, wait, it was on,
[00:30:16] 'cause I remember seeing a video, it was on the same side, and it went around it, or maybe it was a two-lane and it went by it. I don't think anybody was hurt.
[00:30:23] Nicole: Okay, it says: there were multiple incidents in late 2025 where Waymo's self-driving robotaxis illegally passed stopped school buses with flashing lights in Atlanta and Austin, leading to a federal investigation and a voluntary software recall by Waymo to fix the issue, as their system failed to recognize and obey school bus safety laws, endangering children.
[00:30:51] But it doesn't say that anyone got hurt, thankfully.
[00:30:54] Jolene: Obviously, as this technology develops, there are going to be situations. [00:31:00] And I also heard that Waymo doesn't have the ability, if a truck is double-parked on a side street, to go around, knowing that that truck is, you know, offloading its cargo and it's gonna be there for 15 minutes. And people are sitting in there going, no, go around.
[00:31:17] Go around
[00:31:19] Nicole: Well, it's funny that you say exactly that. The weekend of December 20th and 21st, uh, there was, I don't know if you heard about this, this huge power outage in San Francisco. Okay, and there are tons of Waymos in San Francisco, and thankfully there were no injuries, no deaths.
[00:31:40] But there was this massive power outage, and all the Waymos stopped, because the stoplights were out. So it took days for the tow trucks to get these Waymos out. They created incredible traffic jams, because the humans were like, oh my [00:32:00] God, move the Waymos. And they immediately went back to fix the issue.
[00:32:06] It seems like Waymo is doing a really good job. Um, I did notice that they are testing in New York now, which will be really, really interesting. I was always, like, completely against self-driving cars. And actually, years ago, and I don't care if it comes out now, it was pre-COVID,
[00:32:28] I was in auditions to be a voice, I don't think it was for Waymo, and I didn't know how I felt about it. I mean, it would've been cool to be like, turn right, have a great day, or whatever, right? But I also was like, oh, I dunno how I feel about self-driving cars.
[00:32:44] And then when we were in Austin last spring, and I'd never been to Austin, it was like, oh my gosh, there are so many self-driving cars. So Josh and I were like, should we try one? And we were going to the LBJ Library, and we got in the car, and it's really [00:33:00] quiet and cozy, and you get in the back, and it says hello to you.
[00:33:06] And at first, they're like, make sure that your seatbelt is fastened, and, what music do you want? And I'm thinking, oh my God, oh my God. But within 30 seconds I felt safer than I've felt in any car, because there are so many cameras everywhere on this car, and they don't drive like maniacs like, you know, when you're out in the world. We did have a funny thing where we got into the parking lot where we were gonna be dropped off, and there was, um, a guy sort of taking money to park because there was a football game, and Waymo had none of it and, like, went around and kept driving. We thought it was funny, but, like, no one was around to, you know.
[00:33:55] He was like, I'm done, I gotta drop these people off at the museum, they have [00:34:00] history to learn. And it was actually kind of awesome, and my feeling about it has now changed, because I would like Waymo to be in New York, because the driving in New York has gotten so insane. It feels so dangerous and unhinged that it would be nice.
[00:34:23] Jolene: Would it have to be, like, all converted over to Waymo?
[00:34:26] Nicole: Yes. And it's not gonna happen. I mean, it'll be an interesting experiment, that's for sure, to see if it could actually work in New York City. But I was just reading, 'cause Josh brought it to my attention, there was an article in the New York Times: there were nine companies in China that had self-driving cars, and they would suppress their accidents. There was an accident in late March of last year where these three students were killed. So they actually have pulled back. [00:35:00] Talk about regulating: they're just like, no. And I don't know if it's because it was found out, it was made public and now the world knows, I'm not quite sure.
[00:35:08] But all to say, they're now only letting two companies move forward, and it's all about testing and not rollout; they are slowing down their rollout there. There's always the hope that people want people to be safe and not just make a lot of money.
[00:35:26] Jolene: We would
[00:35:27] Nicole: Liberal, liberal, liberal talk. Is there anything that piqued your interest in what Brianna sent us? Or anything that surprised you, or made you worried, or anything? I mean, one thing in this conversation we're having that you brought up, which I find really curious, is that you're right: we had the computer, we had the cell phone. These innovations happen so fast, and humans are just so flawed that we can't catch up, and we don't tend to [00:36:00] learn from our mistakes.
[00:36:01] And so what I would like is that we get really creative in thinking about, not just the doom and gloom of, oh, it's gonna take over, but how do we maximize this experience?
[00:36:16] Jolene: I think we do learn from our mistakes
[00:36:17] Nicole: do,
[00:36:18] Jolene: I look at the cell phone, and, you know, look how we are now shifting gears about how everybody needed to have a cell phone. Now we're shifting and we're saying, okay, kids under 16, maybe that is bad for them, and we're developing these studies, you know, that are proving that cell phones aren't good for kids.
[00:36:41] So
[00:36:42] Nicole: I would agree with you. I guess I wish we would learn faster. I wish that we would, because we've affected a whole generation, a whole generation of children.
[00:36:51] Jolene: But didn't it happen with, like, kids were able to look at porn really
[00:36:56] Nicole: Mm-hmm.
[00:36:56] Jolene: I feel like we are learning, and as we, [00:37:00] you know, have a comprehensive view of how humans work, we are so much more in tune with brain health now, and the importance of sleep, and all of these things that, I think within the last 10 years, we have learned about the human body.
[00:37:19] And, you know, my gosh, you and I have talked about menopause and things that really affect us physically as human beings, and emotionally and mentally. And don't you think, then, we are continuing this education of how does this affect us as humans, to use these powers for good and not for evil, and develop, you know, the programs that allow us to learn more about the human body?
[00:37:52] Nicole: I mean, listen, Jolene, I think I agree with you that we could use [00:38:00] this tool, uh, in a very positive way, not just as Americans, as a world tool. I wanna acknowledge that we are in a society, and it's not just an American thing, where a lot of people are catering to their boards of directors.
[00:38:20] They're not looking at how it can make our human experience better. It's more about how is it gonna make them money? So that's the point.
[00:38:29] Jolene: But has that changed? I mean, that's always been the case.
[00:38:33] Nicole: No, no, it hasn't. What I'm saying to you, Jolene, is I'd like it to. Doing this podcast, I'm learning, like, how to be more realistic, and maybe that's a Republican thing, I don't really know. Maybe it's not political at all. But it's just sort of like, okay, here is this thing that we are creating that is going faster than the speed of light. How do we catch up, and how do we make this benefit people versus deprive people, or,
[00:38:57] Jolene: Right.
[00:38:58] Nicole: you know, how do we make this [00:39:00] not strip the planet of its resources, and help us flourish in a different way, or, like, you know, make us a better planet? And I also understand that it's not just an American thing.
[00:39:12] Like, it is a global thing. And if we're not all on the same page, how the hell are we all gonna get on the same page? This is a huge, a huge planet.
[00:39:20] Jolene: We've gotta be first. America first.
[00:39:22] Nicole: Oh, oh my God. Oh my God. Oh my God. Listen, you know, listener, viewer, this is a huge topic, and I'm sure we will revisit it. And if you guys have any suggestions or any questions or any ideas, any thoughts, any solutions, any fears, please, you know, comment on YouTube, uh, DM us on Instagram.
[00:39:50] Like, this is a big conversation, and I think it takes all of us to talk about it, to explore ideas and solutions, and [00:40:00] not pit each other against each other.
[00:40:03] Jolene: 100%.
[00:40:04] Nicole: Yeah.
[00:40:05] Jolene: And isn't that the common goal,
[00:40:07] Nicole: Yeah.
[00:40:08] Jolene: that we all are here to try to make this world a better place? And whether we have AI to thank for that, or AI is going to be the demise of humanity, I mean,
[00:40:19] those are, I think, the two schools of thought right now.
[00:40:24] Nicole: And I would suggest that we could look at it more as a partnership: not just give our power away, but understand that we are these smart, creative, innovative minds as human beings. And the only reason AI exists is because we exist. So how do we work together? If they're mimicking what we do, how do we work together?
[00:40:46] Stop giving your power away and let's figure out how to do this together.
[00:40:49] Jolene: As you say, a Kumbaya moment.
[00:40:54] Nicole: Do you have a would-you-rather, my dear?
[00:40:56] Jolene: Yes. Do you?
[00:40:57] Nicole: I do.
[00:41:02] Nicole: I was like, what am I gonna ask her? And then I was like, wait, I gotta put it in ChatGPT. You did that once, and,
[00:41:13] Jolene: did
[00:41:13] Nicole: and I have to say, you're way funnier. You're way funnier.
[00:41:17] Jolene: it?
[00:41:18] Nicole: like it. Then I did it again. I did it in ChatGPT several times, and I'm like, they're not funny.
[00:41:25] They're really not funny. But then, Josh uses one called Claude. Again, another human name; I'd never heard of Claude. Claude gave me a funny one.
[00:41:37] Jolene: Maybe Claude is funny.
[00:41:39] Nicole: So here we go. You ready? This is from Claude to Jolene. Would you rather discover that every person you've ever slept with has a group chat where they rate your performance with detailed PowerPoint presentations, [00:42:00] or find out your dad has been writing erotic friend fiction about your friend group, and it's actually pretty good?
[00:42:12] Jolene: That's horrible.
[00:42:14] Nicole: Let me just say, thank you, Claude. Thank you, Claude.
[00:42:20] Jolene: Oh my gosh.
[00:42:22] Nicole: Do you need me to repeat it?
[00:42:24] Jolene: No, no, I do not.
[00:42:31] I guess I would have to choose the first one.
[00:42:34] Nicole: You'd rather have the PowerPoint presentations rating your performance.
[00:42:39] Jolene: Than having my father, like, write about it. Yes. God, that's horrible.
[00:42:48] Mine wasn't nearly as naughty.
[00:42:51] Nicole: Okay.
[00:42:52] Jolene: Would you rather have an AI robot that is your personal trainer
[00:42:59] Nicole: Okay.
[00:43:00] Jolene: or is your personal chef?
[00:43:06] Nicole: Oh, personal chef. Hands down.
[00:43:11] Jolene: Oh, seriously.
[00:43:12] Nicole: Listen, I love Lindsay, shout-out to Lindsay. She makes me strong and she's my friend. I'm so grateful for, like, our conversations and the inspiration and the encouragement. I would not give her up. And Josh cooks a lot, and he's really good, and I help him, but he's really the main cook.
[00:43:37] Um, and I do think he enjoys it, but it would certainly be nice to not have to think about that. Maybe he would not like it very much, but we'd get to do other things. So, yeah, personal chef.
[00:43:49] I'm assuming it's because you love cooking.
[00:43:52] Jolene: Yeah, I wouldn't want somebody to take that joy away from me. If there was somebody who, like, I wouldn't like that. [00:44:00] But I mean, if it was my personal trainer right there, wouldn't that be great?
[00:44:06] Nicole: Listen, it would. I mean, to be fair, it would, and you'd pay for it once, right? Listen, I would be thrilled, 'cause you'd have access to that person all the time. And,
[00:44:16] Jolene: Yeah.
[00:44:17] Nicole: I like working out, and it's harder to get motivated, quite frankly. I've been really unmotivated this week, just tired. Thank you, listener and viewer, and, uh, please follow, like, and subscribe wherever you get your podcasts. Um, and if you're interested in being a sponsor, we would love to have you sponsor
[00:44:40] We've Got To Talk. Just go to our website, wevegottotalk.com, and contact us, and we'll get right back to you. Do you have any other thoughts, Jolene?
[00:44:48] Jolene: You said it all.
[00:44:50] Nicole: All right, doll, I love you. Feel better.
[00:44:52] Jolene: Love you, everybody.
[00:44:53] Nicole: Love you, everybody. Bye.