Half Century Hangout

AI: Use It Or Lose!

John, Luke & Chuck Season 1 Episode 15

The line between innovative learning tools and academic shortcuts grows increasingly blurred as artificial intelligence reshapes education. In this thought-provoking episode, we tackle the complex question: Is AI a valuable asset for learning or a technological crutch undermining authentic understanding?

We break down three distinct approaches students take with AI tools: submitting purely AI-generated content (raising serious ethical concerns), using AI as a research assistant to gather information before forming personal conclusions, and employing AI as an advanced editing tool while maintaining ownership of core ideas. Through these examples, we explore how intent fundamentally shapes whether technology enhances or short-circuits the learning process.

The conversation takes an unexpected turn when we confront a challenging paradox: How can educators credibly restrict student AI use while simultaneously employing these same tools for grading and lesson preparation? This examination of double standards offers a compelling perspective on modeling ethical technology use rather than simply imposing restrictions.

Beyond academic settings, we venture into AI's broader societal impact, particularly its potential to transform professional fields like medicine and law. We navigate the tension between embracing efficiency gains and maintaining necessary human oversight, challenging listeners to consider how these technologies might reshape career landscapes across generations.

Rather than settling for simplistic answers, we advocate for a balanced approach—one that neither demonizes nor uncritically celebrates AI. The path forward requires deliberate education about effective AI usage for students and adults alike, ensuring these powerful tools enhance rather than replace the critical thinking that remains essential to genuine learning and innovation.

Luke:

Welcome back, John. Great, hey, thank you.

Chuck:

It is good to have you back on, John.

Luke:

Absolutely. So what's been going on here? You know, it's May: May Day, the end of the school year, all those activities that come along with it. My namesake, Luke, graduated from Aurora University.

John:

Congrats to him. Degree in business and marketing.

Chuck:

That's quite an accomplishment, nice yeah.

John:

He doesn't have a job yet, but you know?

Chuck:

Well, that's the next step, right?

John:

Yeah, it's baby steps, that's what he said, and I said well let's make those baby steps a little bigger.

John:

Let's see what we can do there, for sure, let's try. It was fun, though. It's good to see that. You're always proud of your kids when they reach those milestones. It's cool to go through that, and I got to spend some time with my other kids too, so it's always good. It's good to get together, spend time, you know, release from work a little bit and get out the door. I missed a couple things here at school, but you know what, you guys did a great job of carrying the weight, absolutely, keeping it going.

Chuck:

Yeah, um, I saw a TikTok. Oh yes, you're a TikTok guy? I'm a TikTok guy. And I saw a TikTok that used AI to put Michael Jordan up against the stats of all the greats of today, and even in this AI-generated type of story, he was head and shoulders above everyone else. Oh, 100%, 100%.

John:

There's no comparison.

Chuck:

I don't care what anybody says. They used the type of game that's played today, the different technology type of things when it comes to fitness.

John:

Well, if you think about it, if you watched any of that era when he was playing, and again I'll just bring up the Pistons again, it was the Jordan rules: you go to the hole, he's getting hit, he's getting fouled. Now they touch somebody and, you know, it's a foul. He would have gone off.

Chuck:

This thing said that he even went off getting tackled. Yeah, for sure. And this thing said that with the way they use space these days, you know how it's more of a flow game? Yeah, yeah, the space is broader.

John:

Whatever. They said that he would have, absolutely, oh, he would have, absolutely. He would have eaten them up, he would have torn it up.

John:

It is kind of funny that you brought that up because, as we segue in, the one thing of AI that I will, if I'm in a mindless mode where I don't want to think, I just want to watch something and laugh, those little AI things crack me up. I don't know if you've ever seen them, where it's an AI-generated thing where there's a car and it's going to crash into a tree at 20 miles an hour, at 50 miles an hour, at a hundred. Yeah, the one I saw the other day was a school bus versus a large speed bump, okay, at 10 miles an hour. And so this bus goes over it and it scrapes the bottom, like the driveshaft, and then by the time it gets to like 80 miles an hour, here's this school bus plowing down the road; it hits the thing and it hits the overpass on the highway. Oh my, I was laughing so hard. First of all, seeing a bus going 150 miles an hour just made me laugh by itself, for sure, but that shows you what mood I was in at the time. Some of that stuff with AI does make me laugh, just because it's funny.

John:

But I think the direction we were heading today initially with AI was kind of a question that we get here because of what we're involved with in education: Is AI a tool that supports students in their learning, or is it a crutch that holds them back? It was something that I was talking about with somebody over the weekend. Is it cheating, is it not cheating? That's kind of the question. So I thought maybe we'd chew on that a little bit, because I know that we have people that listen that are parents that probably have views about it, but we also have other educators that listen, and just people in general that may not really quite understand and know what it is. But I think the biggest question I had is that people are concerned about trying to limit student use of AI.

Chuck:

All right. Can you explain, for people who may not know, what AI is and how it's used among students?

John:

I think that's part of the answer that I've looked at: it depends on how you actually use it. All right. Let's just say students were using ChatGPT or something, where they are going to put in a prompt or a topic that the teacher gives them.

Luke:

Right.

John:

For a paper, an essay, whatever you want to call it, and AI generates something for them. And then they use it as their work. Okay, right, that's one thing. That's one way.

Luke:

Example way over here. Yeah, okay.

John:

There's another example where a student might say, okay, here's this topic. What I'd like to see is research on XYZ. Show me research about this. And it'd be just like doing an internet search. It's very similar. It's not doing much except for the fact that the AI, the ChatGPT, is generating information for them.

John:

So you might even look at that and say that was like using an encyclopedia back in the day, or a dictionary or a thesaurus, something where you're gaining more information somehow to then write this essay. Right. Another way that they would use it would be: I have this essay that I wrote, they're going to put it in and they're going to say, hey, could you check this for grammatical errors? Could you check this for better ways that I could write this? Could you check this to make it flow better? Whatever you want to do. I think that's another way. And there are more ways than that, I'm sure, but those are the three ways that I kind of focused on in this conversation that I had over the weekend. But on the flip side of it, how are the teachers using it? Exactly, okay.

John:

So, that's kind of where I wanted to jump in at and see what you thought.

Luke:

So, John, what do you think about using AI in education? I think it can be a wonderful tool. I think you have to learn how to use it, just like you would any other tool that you've got. Teachers can use it to make education more individualized for students. Teachers can use it to help them maybe get some questions put together, or to check for someone's understanding. So there are all kinds of ways teachers can use it. It seems like we're a little bit hesitant to use it as a tool. Look at the calculator.

Luke:

When it first came out, they're like, well, you're going to let kids use a calculator? Really, a calculator? No, they've got to think of it on their own. They've got to do this mental math. They've got to use a slide rule. They've got to do it the way we did it. Yeah. And we eventually got to a point where, you know what, it's okay.

John:

I think part of it is that, and I'm not going to lie, I've fallen into that thought process before, where I don't know that we should use things or invent things that think for us, yeah, or that do the work for us. But, as we know, our society has completely changed toward convenience. And I think with technology like you mentioned, a calculator, some of those things, people said, oh, we don't really want kids to do that, because when they get into the workforce they won't be able to do that. It's like, well, there probably weren't a whole lot of jobs involving numbers that wouldn't allow some sort of computer use or calculator use for people to do work, because it would be faster, and they wanted production, right? Yeah.

John:

So I think I kind of fell along those lines for a while. As a teacher, I would always say, I want to see your work, right, I want to see how you got there, which I do think is important, absolutely. They need to understand the process. But is it wrong to say, okay, if I know that you know how to do the process, okay, now you've graduated and you're able to use this or whatever it is? It's some sort of a progression.

Chuck:

Yeah, is it even necessary to learn some of those things? Because you can Google that answer pretty quick. Like, if you're building a house and you're building a 6/12 pitch and you need to know what the cut is for a truss or something, you can find that answer out pretty quick. Do we

Luke:

need to get your slide rule out and look it up, or look it up on the computer, right? Yeah, that's what you're saying. Sure, I think it's important to know, yeah.

John:

Now, you might not use that in your day-to-day as a carpenter or something in that regard, but I think in certain ways there are benefits to knowing what got us there, wherever we're at, whatever that is: if it's a thought process, if it's some sort of a project, if it's a geometric equation, if it's some sort of financial equation that gets us to gross national product, whatever it is. I think it's important to know those things and the process as we go, because if you're just working for the answer at the end, which I realize is kind of the end product, I don't think that has as much value as knowing what it took to get there. I feel like we asked a lot more questions when I was in high school than kids do now.

John:

I think we have to sometimes pull those out of kids now. We have to lead them to that trough, we've got to get them to that point of query where they say, I'm going to inquire about this. You're getting them to the point where they start to nibble at it, and then they want to take it to the next step and say, hey, how does this happen? Or hey, where does it go from there? And sometimes that's like pulling teeth. Yeah, it is, that's difficult. Yeah, especially if they know, like you said, they could Google something.

John:

But I feel that it's taking away from the process of being able to say, hey, how does this happen? Or, I'm really interested in that, can you tell me how it works? Some of those things I think are important for students to know. So, yes, I would say I feel like we need to teach more of that. Which is why, when we talked about the AI thing in those three examples, I don't necessarily know, like the one example that I gave, where a student puts in a topic and they get something out and they claim it as their own.

John:

That's ethical.

Luke:

That's probably not right.

John:

In the middle example, where it was: I'm going to use this and it's going to help me get some information, I'm going to do some research, and then I can come to my own conclusions and write something up. Right, I don't think that's cheating. That's research, just using a tool. That's smart.

Chuck:

Use your time.

John:

Okay, the other one, which was here's what I wrote Check this for errors or check this for you know.

John:

It's kind of like a glorified spell check, if you want to say, as long as it doesn't change the content too much, which I think is something that we need. And we have tools for that, right? Teachers have tools to be able to put a research paper in and say, how much of this was generated by AI? They can give you a real number. I don't know that I'm always sold on the numbers, because you can put it into two or three different models and it comes out different. But I don't know what that threshold number is. I don't know if I would say, is it

Luke:

10%, 15%, 20%?

Chuck:

I heard somebody say today it's actually 20%.

John:

I mean, it just depends on what you're comfortable with, I think, as far as what the school district would say or what people would say. I wouldn't say more than 20%, because if it's more, I mean, 30 years ago, writing a paper, if you wrote a fifth of the paper and the rest of it you got from somebody else, yeah, that doesn't fly.

Chuck:

Somebody I was talking to today said if you're citing some of those sources, it's actually probably not such a bad thing. Well, yeah, as long as, just like in a normal paper, you're citing your source and you know where it's from and you did the research to do it.

John:

I mean, they directed you to it, but you're actually looking through it and picking out the stuff that you think is important. It's not really any different than what we used to do.

Chuck:

I think we need to teach our students more ways that they can use AI effectively.

John:

I think the biggest thing that came up, and it goes to what you're saying, is that we're in a school system and we're trying to, I don't want to say discourage, but we hold kids accountable for using AI. They use it over a certain percentage, or the teacher looks at it and gives them another chance and says, hey, this is really heavy AI, you need to redo it, put it in your own words, do your own thing. Mm-hmm. What message does it send when we use AI to grade the papers? If we're telling them not to use it, but I, as the teacher, am using it to grade their papers, and I'm not putting in the same amount of work to do it, what is that saying? It's kind of like telling a kid to put his phone away when the teacher's on their phone the whole class period. Does that ever happen? I mean, not here, but I'm just saying in general. That would be the same thing. It's like telling your kid, hey, don't smoke, while you're sitting there chain-smoking in front of them.

Chuck:

Yeah.

John:

It's by your actions, right? And you see that, and it's difficult for high school kids to look past that.

Chuck:

Yeah.

John:

When you, as the educator or as the teacher in the room, are telling them they can't do something, but yet you are. Because that old saying, well, I can do it because I'm the adult, that doesn't fly. You know, and it never did.

Luke:

No, you know, it reminds me of a quote: be the example that you want them to be. It's like you gave your paper to an assistant and had that assistant look over that paper and maybe make some corrections, or look at it and add some content, but you still have to be the author.

Luke:

You have to be the person who's going to look at that content and say, yeah, this is the way I would write it, or, I'm going to change it so that it is the way I would write it. Rather than saying, oh, AI generated all this and I just gave it to you, and it really isn't anything that, you know, I put in some ideas, but it gave me what it gave me.

Luke:

I think you need to be the author, you need to be the person that is making that content. And I think that's where the first one that you talked about, where you just put in a topic and you get one back, and the second one, where you put in and get research, I think that's fine. I think the third one you have to look at and say, well, how do I make this my own? Rather than, I put in what I put in and I got back what I got back and I'm just going to turn that in. You still have to read through it and say, okay, this sounds like me, this doesn't sound like me.

John:

So I do think it's important, just like with any other technology that comes down the line, that it's our responsibility as educators to teach students how to use it correctly, and that students know how to use it correctly, you know, and what those boundaries are. What are the moral or ethically correct ways to use it, what's the right way or the wrong way? I think that, to me, is probably the biggest thing that comes out of the conversation: it's just like any other technology that would come down the pipe for kids.

Chuck:

What about this little spin? How many people are afraid to introduce AI into the education system because we think it might replace some of our efforts in the classroom?

Luke:

Might replace teachers. Is that what you're?

John:

saying? Maybe we're going down a rabbit hole here. We could. I mean, that's a pretty good thought to go down and talk about. The one that I had is: what's one of the main things that people were worried about with AI? It's that AI starts to learn, right. And not only does it learn, but it's taking whatever the majority of the thoughts are that are somehow put into it, it's learning from that, and it goes a direction, it goes an opinion way, right.

John:

It's not just factually based. It starts to form opinions, okay, and a lot of that is based on what input it gets, right. People worry about AI learning, and as it learns, it starts to influence people back as they're using it, because it directs them a certain way, the way that the AI has learned.

Chuck:

Isn't that what teachers do in the classroom?

John:

Oh sure.

Chuck:

Anyways.

John:

But you know as well as I do that students don't always listen to teachers. Sure, you could tell them something as a teacher and they'd be like, eh, whatever. And then a kid tells them in the hallway 10 minutes later and they're like, oh, that's a great idea, because it's a peer. All right, in our age, in our generation. For this technological generation that's here, this, and I'm pointing to my computer at the moment, means more than what we're saying. I mean, look at how hard it is already for them to say, oh, I saw it on the Internet.

John:

It's got to be true.

Luke:

Well, yeah, many people are getting their news off of TikTok or Instagram or that kind of thing, and some of it is AI. What we need to start teaching kids is how to discern that too, and figure out what is good and what is not, and to think on their own. And I think, as you said, we need to get to a point where we're teaching kids to think.

John:

And how many times have we heard the term fake news? Right yeah. It happens all the time it does Okay.

John:

Now you really have to be vigilant. Like, if you're really going to go look for your news, you're going to go through your stuff. I mean, there are trusted news sources that people get their news from, and a lot of people a little younger than us, through our age and older, get it from certain sources, and then from a certain age, maybe the mid-30s into the 40s, they get it more from other sources like TikTok or Instagram or wherever these other things are, where a lot of times you'll see it's individuals that aren't a trusted news source, if you want to say, telling you the news. I think it's crazy sometimes if you ever go down that road. Like I've done that before, where, just today, a colleague told me that Sherrone Moore got suspended for two games, the Michigan coach.

John:

Today. I hadn't heard. Now, I wasn't on my phone either; I was working, right, so I didn't know. I think it's one of those things where we need to be careful, because some of that news, like one of you two said, some of it's AI-generated. But that's where that fear comes in.

John:

And again, I'm not promoting that fear. I'm saying that that is a fear that people have about AI: once it starts to think, and it grows and it learns, who is controlling that? Who's controlling its growth and what information it's disseminating to us?

Chuck:

One of the things that I react to more strongly than I normally react to is fear-generated reactions, and that's kind of what I feel like some people have when it comes to AI: I'm afraid it's going to do A, B or C. That's why I was saying it. Yeah, and I kind of understand that a little bit, but at the same time I don't want to be driven by fear.

Chuck:

But I do understand we've got to guard against this. So there's this fine line of how do we react. Is that something that can actually really happen? I don't know.

John:

I think what you said is true. There's a difference: it's one thing to react to it or be driven by it, and another thing to be aware of it. Yeah, okay. So if you started to see certain things that maybe brought that to light, like, hey, this wasn't happening before, or whatever you would see. I just think that it's important to be educated about it, about possibilities, so you're not driven by that. I mean, I'm not. Yeah, but I know some people are, right. Some people live in that world.

Chuck:

Yeah.

Luke:

I don't live in that world.

John:

Yeah, I don't live in that world, because if you did, you'd never get anything done; you'd always be worried about what's happening around you and behind your back, forever. So I try not to, and I don't really have to try very hard to not live that way, because there are too many things that I want to enjoy and accomplish, as opposed to sitting there waiting for the bad things that I'm worried are going to happen.

Chuck:

Yeah, I do wonder. I was reading an article, I don't know, it was recent, about AI and which age groups it could actually affect more than others. I want to say it was in the Washington Post or one of those types of big national papers, and it said that the potential for it to affect people our age, particularly people who are professionals and who do technical jobs, is pretty realistic. Pretty high. Pretty high, yep. And so, like,

Chuck:

They gave the example, for instance, of somebody who gets an X-ray done, and the doctor or the specialist is trying to work through all these different scenarios based on medical journals and pictures of their X-rays. That actually could be a job that could be replaced by AI, because all it needs is information and pictures in order to make a diagnosis, and I thought that was fascinating.

Luke:

Yeah, there's a book, I think it's called Reimagine, that was talking about the jobs that we have done over time, and whether they are going away because of technology. And this was a while ago; it was before AI.

John:

I think technology is one thing; AI, which is actually built to learn and to expand itself, is a little different. Technology has replaced a lot of jobs. I just laughed because I was in Illinois again and there's nobody at the toll booth anymore. The technology takes care of it because they programmed it. But who are the guys that are programming it, the guys that really know what they're doing, right? That's where the problem comes with the AI thing: who's programming it?

John:

We are. Everybody is, because it's taking information from anybody that it gets. Yeah. And do I want my broken femur to be, you know, diagnosed from a bunch of information that some crazy high school kid was typing in one day about something?

Chuck:

I don't think it works like that.

Luke:

I don't think so. I'm just saying, I think that it takes information from

John:

From what I've understood, there are a lot of different sources; it's obviously not just one thing, sure. But there's information that's there, and at some point AI needs to make a decision as to what it's moving toward, because a lot of times, if you put something in, you'll get something out that's complete gibberish. It makes no sense whatsoever. It's because it hasn't really gotten to that topic or that area of expertise yet.

Luke:

I don't know if you guys watch medical shows. You have to refine it with the way that you put it in. Yeah, yeah. You can't just put in anything. I'm not sold on it, let me just put it that way.

Chuck:

I think it could potentially decrease the cost of health care if something like that were to happen, because instead of paying a specialist, you know, $1,500 to read an X-ray, you'd get that done through AI. And the way this article explained it, it's fishing through dozens and dozens of medical journals that the specialist would probably be fishing through anyways, but it would probably take three times as long. I'm game. I'd be the guinea pig on that.

Luke:

Another way that it's affected jobs: legal research. Yeah, you don't have to spend all that time and have a legal library and all that.

Luke:

You can put your query in, oh boy, see where it's going. You can put your query into AI and get that back. Now, as a lawyer or as a legal aide, you have to read it and look at it and say, yes, this case was this and that. You still have to do the research, but AI is going to do a lot of that for you.

John:

So there have been a couple of times where I've put something in and I've read what it spits out, and then I'll go and do it another way, not necessarily through Google, but I'll go to an online library research clearinghouse, right, and look through some topics. And a lot of times the results are very different, because it paraphrases a lot.

John:

So in some of these things, when you're looking for something specific, it's difficult, because it doesn't necessarily have the specific knowledge that you need, which you would definitely need in a medical sense. It can't be so broadly general. It has to be very, very specific, because one small thing could change the entire diagnosis of anything. Really, you're dealing in very small units of whatever it would be. And we've done this before in our country, where things get going super fast and then we have to try to reel it back in, and I think there are moments where we probably need to try to do that, because the technology piece is always going to be a work in progress. So in the areas where we think it might help the most, I would think we need to be a little careful, just to make sure that we don't let it go too far without checking it somehow to make sure that it's right.

John:

It sounds like a little bit of fear there. Oh, it's not fear. To me it's reality, to say it's just like anything else. It's like, am I going to bring in, let's just say our principal left the building, am I going to bring in somebody who's not had any experience as a principal?

Luke:

Probably not.

John:

So the idea of taking something and putting trust into something that you think might have had some experience with it, but you don't really know for sure, you're taking a risk, depending on what that is. I mean, if I'm doing it to write a paper, obviously there's not a big risk, except for me getting an

Chuck:

F, right?

John:

But if you're making a medical diagnosis, or it's to build an airplane or build a better bomb or whatever it is, I mean, clearances and all these things, there's a lot of stuff. And I don't know if I said this before, but I don't like it. I mean, it's there; it's not like it's a problem, because it's not hurting anybody. But I can't stand watching some movies anymore, because they're computer-generated.

Luke:

Yeah, it's not real. It drives me nuts.

John:

But I watch it and it's like, no, that's just... And you know, you've seen them. Some are really good.

Chuck:

Oh yeah.

John:

And some are really bad. And it's like man, you dumped billions of dollars on this movie.

Chuck:

Well, if you look at Star Wars. Yeah, if you look at

Luke:

Star Wars, one of the last Star Wars movies.

Chuck:

No, it was one of the last Star Wars movies. Princess Leia, what's her name?

Luke:

Carrie Fisher.

Chuck:

Yeah, Carrie Fisher had died, right, and they actually did an AI-generated model of her to place her in that particular movie.

John:

Well, didn't they do that with the guy from Fast and Furious too, Paul Walker?

Chuck:

Yep, they did that too.

John:

Yeah, some of that stuff is a little creepy to me. Even Claymation kind of wigs me out a little bit. Really? That's ancient, I know, but it always has kind of wigged me out.

Luke:

But some of those movies, yeah, I just watch some of that stuff and it's like man, I want to see some Three Stooges action.

John:

I want to see some Three Stooges action. I want to see some real action, nothing generated there, except, you know, ass-hattery, I would say, when it comes to AI.

Chuck:

I've always been pretty much an early adopter of technology throughout the years, and I've adopted AI pretty heavily, to the point where, when I use it, like, I'll have it summarize articles, I'll have it

John:

Give me suggestions of topics. But do you read the article first? Not always, no. So you're getting CliffsNotes, pretty much. But what

Chuck:

I would do in the beginning is I would read the article and then have it summarized.

John:

Okay.

Luke:

So you built some trust.

John:

Yeah, I built some trust, Built some trust yeah.

Chuck:

Yeah, and it was. I mean it's been good for me.

Luke:

So you gave a talk at church. I don't know if you call them sermons. Yeah, we call them sermons. So you gave a sermon at church. Did you use AI?

Chuck:

I did use AI, for the purpose of summarizing different articles. Yeah, that's pretty cool.

Luke:

I don't think there's anything wrong with teaching people to use AI tools properly, so that it enhances their learning rather than replaces it. And so I think this AI thing is here, and it's going to continue, and we need to be willing and able to help people learn how to use it, and how to use it well. And I think, even beyond kids, people in their 30s, 40s and 50s need to learn how to use AI so that they're not left behind and they're not useless when it comes to different things in the workforce.

Luke:

Hey, we might want to wrap it up here. I think we really came to some good conclusions with it too.

Chuck:

I don't know about any conclusions. I still have tons and tons of questions. There's a lot of questions.

Luke:

But you know, I think learning about it and not being afraid of it are two great things.

Chuck:

Yeah, I think, not being afraid of it. Kind of going with that,

Luke:

Luke, you got a quote for us.

John:

I got one.

Luke:

Is it AI generated?

John:

I don't want you to think that I'm all afraid of it. "Some people worry that artificial intelligence will make us feel inferior, but then, anybody in his right mind should have an inferiority complex every time he looks at a flower."

Chuck:

Deep. Think about that one, chew on that a little bit. In my opinion, that is pointing us toward a creator.

John:

Yeah.

Chuck:

And every time you look at a flower you should feel inferior because you know there's a creator behind the flower.

Luke:

Absolutely, amen. That's good stuff. All right, you should feel in fury because you know there's a creator behind it. Absolutely there you go. Well, hey, thanks for hanging out with us here at half century hangout. We appreciate your listening to us and make sure you like us wherever you listen to your best podcast. So happy cinco de mayo.

John:

Peace out.

Chuck:

Peace out.
