S2E17 - College in the Age of AI (Transcript)
This week’s College Conversations podcast revolves around the impact of AI, specifically generative predictive text models like ChatGPT, on education. Institute President Gregg Garner and professors Jeffrey Sherrod and Benjamin Reese discuss concerns such as plagiarism, the erosion of critical thinking skills, and the oft-used comparison of AI to calculators. They express mixed feelings about AI in education, emphasizing the importance of maintaining human excellence alongside AI tools. They also touch on the theological implications of humans creating technology in their own image and the potential ethical implications of human-like robots. The conversation highlights the necessity of understanding and critically evaluating the use of AI in academic and practical settings, as well as the need for ethical regulation in the face of unintended consequences, offering a consideration of the “promises and perils” of AI.
[00:00:11.00] - Jeff Sherrod
Hey everyone, and welcome back to College Conversations. This is the podcast about all things related to Christian higher education. Today, I am joined by librarian and professor, Mr. Benjamin Reese. (Hey.) And professor and Institute President, Mr. Gregg Garner.
[00:00:24.60] - Jeff Sherrod
Today, okay. So today, we're talking about a topic that, at least in a lot of the education literature Mhmm. people are really freaking out about.
[00:00:36.60] - Gregg Garner
Like, in a positive or negative?
[00:00:38.20] - Jeff Sherrod
I think in both senses. Some people are saying it's the end of education as we know it.
[00:00:44.79] - Gregg Garner
Bum bum bum
[00:00:45.29] - Jeff Sherrod
bum bum bum, the rise of AI.
[00:00:47.60] - Benjamin Reese
Right.
[00:00:48.10] - Jeff Sherrod
So yeah. AI, of course, is, like, you know, capturing the national conversation in so many ways, but I think people are particularly concerned about how it relates to education, like, what's happening with it, and in a lot of ways. So I guess what I'm interested in, maybe we can just have a freewheeling conversation with you guys about promise and peril, you know, kind of both. Like, what are you guys not concerned about at all, where you'd be like, actually, I think this is gonna be awesome? Sure. That part. And then I'm also, you know, interested in, like, alright, but this part Mhmm. we do need to think through, and how can this happen? This is something I know for us even. As a school administrator, I had a meeting, I think it was last week, actually, and it was so obviously a plagiarized paper. I mean, so, you know, I think that's the thing. Sometimes, especially when people use ChatGPT, I think they think that, like, alright, people will think that this is me, and there's so many telltale signs. Like, if they use the word delve, I just know. Underscore.
[00:01:48.09] - Gregg Garner
Or Meticulous. Right.
[00:01:50.09] - Jeff Sherrod
Yeah. There's some stuff I just know. And then it was also wrong, the whole write-up. It was about, like, a chapter in the book, and it was like, this chapter delves into, you know, the character of Adam, and underscores, you know, something else. Yeah.
I was like, well, it had nothing. Adam didn't even come up in this chapter. You know? Genesis, Adam. So I had a meeting with this guy or girl. I'm gonna keep it anonymous here,
[00:02:20.40] - Gregg Garner
person.
[00:02:20.90] - Jeff Sherrod
And, you know, they were real hesitant to be like, yeah, I just straightforwardly put a prompt into ChatGPT, and what came out was my submission. But that's really what happened.
[00:02:30.00] - Gregg Garner
Yeah. And I think they didn't know that was plagiarizing.
[00:02:32.30] - Jeff Sherrod
They didn't know that that was plagiarizing. At least that's what this person said. And so we had some, you know, talks about what is plagiarism. Mhmm.
[00:02:39.19] - Jeff Sherrod
Then I think the other thing that kinda stood out to me is that when I tried to show, hey, it was wrong, what you learned, they're like, I don't think so. And I think that was the awkward part. I didn't even know how to respond at first. I was like, I promise you, it's wrong. But, yeah, I learned in that one, I should have known, but this student thought that whatever you put into ChatGPT was automatically right.
[00:03:03.90] - Gregg Garner
Right.
[00:03:04.30] - Jeff Sherrod
Right. Mhmm. Yeah. Just right. And he was like, I didn't ever think that I could just get completely wrong information. I was like, yeah, man, it could all be wrong. So, anyway, I think a lot of people are kinda losing their minds about this in the education space, saying this is the end of education as we know it, for a few reasons: we'll never be able to outpace plagiarism. It's a cat and mouse kind of thing. Students are using AI, and then there's AI detectors, but those things just keep chasing each other in terms of what can actually be detected Mhmm. through it. And, you know, when are they using Chat, let's just say AI in general, to edit papers, and when does it become more than editing? Mhmm. Like, what's the line there? So, yeah, lots of stuff. There's schools that are scrambling to update their academic integrity policies about, like, what is plagiarism in an age where it doesn't really come from another person, but we're still saying that's not your work. Anyway, things along those lines. Maybe I'll just start with a general question. What are you guys seeing so far?
[00:04:17.19] - Gregg Garner
Yeah. It could help to just start by defining what AI is. Maybe not everybody is Yeah. Privy to artificial intelligence. I think a common misconception amongst Gen Z is that this is a new thing.
[00:04:30.10] - Jeff Sherrod
Right.
[00:04:30.39] - Gregg Garner
But AI has been around for a while. Mhmm. And not just in science fiction. We've been contending with the advances connected to AI for decades. Yeah. I think some of the earliest studies I've seen that are beyond speculative are, like, from the sixties. Mhmm. And not just studies, they were actually implementing forms of AI. Mhmm. And they were just making observations about how it worked. And the way that artificial intelligence played out then would seem to us very archaic and rudimentary. It nonetheless fits the definition. So I think, generally, when we're talking about artificial intelligence, we're talking about something that has the appearance of being able to solve a problem Mhmm. or rationalize a scenario. And I think that'd be a real broad definition that works for us with respect to artificial intelligence. So that means we've already been using artificial intelligence, even in schools, even in college, for a long time. It just wasn't branded in such a way that it gave us the awareness. So for example, a spell checker is a form of artificial intelligence. You hit F7 or whatever you wanna do, and it runs through your paper and then tells you what's not spelled right. Or the grammar checkers Yeah. A form of artificial intelligence.
[00:06:12.10] - Jeff Sherrod
Checks your voice and, you know, all that kind of stuff.
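To make the spell-checker example concrete, here is a minimal sketch of the kind of edit-distance matching a basic checker can rely on. It is an illustration only, not how any particular product's F7 checker is actually implemented, and the tiny word list is made up.

```python
# Illustrative only: a toy spell checker built on edit distance.
# The dictionary below is invented for the example.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

DICTIONARY = ["their", "there", "receive", "separate", "meticulous"]

def suggest(word: str) -> str:
    """Return the dictionary word closest to the (possibly misspelled) input."""
    return min(DICTIONARY, key=lambda w: edit_distance(word.lower(), w))

print(suggest("recieve"))    # -> "receive"
print(suggest("seperate"))   # -> "separate"
```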
[00:06:14.10] - Gregg Garner
Yeah. Yeah. So I do think it's important to have a definition. And I think right now, what we're talking about are the latest ways in which AI is being used
[00:06:27.89] - Jeff Sherrod
Right.
[00:06:28.19] - Gregg Garner
By society. And we're trying to capture that under the umbrella of platforms like ChatGPT, maybe Alexa, Siri, and the other Google-help Right. type bots
[00:06:45.10] - Jeff Sherrod
Yep.
[00:06:45.69] - Gregg Garner
That exist there with a personality. So is that the angle we're coming from?
[00:06:49.69] - Jeff Sherrod
I think so. And maybe, even just for the purpose of this one, we could narrow it to, like, generative predictive text models,
[00:06:57.89] - Jeff Sherrod
You know, like the GPTs out there, like the ones that are using large datasets
[00:07:03.10] - Gregg Garner
Yeah.
[00:07:03.39] - Jeff Sherrod
And then saying, alright, based upon what people start here, this is where it goes. Yeah. Yeah.
[00:07:07.00] - Gregg Garner
Yeah. Yeah. Because we haven't talked about how it works. Yeah. That student didn't think ChatGPT could be wrong because they probably imagined that it's like some kind of super brain that's capable of thinking on its own, independent of your prompt. But it's literally just taking your prompt and then searching a knowledge base, some database
[00:07:34.00] - Benjamin Reese
Mhmm.
[00:07:34.19] - Gregg Garner
And then synthesizing, according to the way you ask the question, very specifically Yep. information. So if you ask a bad question, it's gonna give you
[00:07:44.80] - Benjamin Reese
Right.
[00:07:45.10] - Gregg Garner
An answer to your bad question. Yeah. Prompting is everything when it comes to ChatGPT, but that student didn't realize that by asking the question or giving the prompt the way that they did, ChatGPT is just gonna look at whatever sources are available. And that's where it gets kinda scary.
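To put Gregg's description in concrete terms, here is a toy sketch of what "generative predictive text" means at its most basic. Real systems like ChatGPT are large neural networks trained on enormous corpora, not a simple table lookup like this; the sketch only illustrates the core idea that the output is a continuation predicted from patterns in data, conditioned on the prompt, rather than independent reasoning. The tiny corpus is invented.

```python
# Toy "generative predictive text": a bigram model over a tiny, made-up corpus.
import random
from collections import defaultdict

corpus = (
    "the chapter delves into the character of adam "
    "the chapter underscores the importance of the garden "
    "the garden delves into the themes of the text"
).split()

# Count which words have followed which words in the data.
next_words = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    next_words[prev].append(nxt)

def generate(prompt: str, length: int = 8, seed: int = 0) -> str:
    """Continue the prompt by repeatedly sampling a likely next word."""
    random.seed(seed)
    words = prompt.lower().split()
    for _ in range(length):
        candidates = next_words.get(words[-1])
        if not candidates:  # nothing ever followed this word in the data
            break
        words.append(random.choice(candidates))
    return " ".join(words)

# The "answer" is only as good as the data and the prompt:
print(generate("the chapter"))
```

Notice that the model can only echo and recombine what was in its data; a bad or vague prompt just steers it toward whatever patterns happen to match, which is the point being made here about bad questions getting bad answers.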
[00:08:01.60] - Jeff Sherrod
Mhmm.
[00:08:02.00] - Gregg Garner
If the bot itself doesn't have parameters on it Mhmm. that make it disqualify certain sources for pieces of information. We know that with ChatGPT, there are occasions where you can put in a prompt, and it'll let you know, I'm not gonna dive into that, or Yeah. this is not something that I have the capacity to do. So programmers are putting some kind of limitation on it.
[00:08:28.19] - Jeff Sherrod
Yeah. This has been, like, the big debate with Elon's model. Right? Like Yeah. Alright, that one's more or less uncensored, just kinda go for it. Yeah. I forget what his is called. Grok, I think, or something like that.
[00:08:40.39] - Benjamin Reese
Yeah.
[00:08:41.60] - Jeff Sherrod
Yeah. What do you guys think of one of the arguments I've heard in favor of students using, and being taught to use, predictive text models in education spaces? That this is really no different than what the calculator was. That was a tool, just like this is a tool. We were told at some point that you'll never be able to have a calculator in your pocket, so you have to learn how to do all this the long way. Well, that's not true. We have calculators in all of our pockets now. And in the same way, where people are like, this is gonna erode all critical thinking, well, no, it just has to develop a new way of critical thinking with a new tool, just like we do with any tool. Fair comparison to the calculator, you think, or unfair comparison?
[00:09:25.60] - Benjamin Reese
I mean, my because
[00:09:26.39] - Jeff Sherrod
we all feel like the calculator is wonderful. Yeah. Yeah. Ben said, yeah.
[00:09:32.70] - Benjamin Reese
I do like that.
[00:09:33.60] - Gregg Garner
To a certain degree. For me, I always wanna be faster than the calculator, better Better than the calculator.
[00:09:38.79] - Jeff Sherrod
For the rest of us, Gregg.
[00:09:39.60] - Gregg Garner
But I just think it holds with respect to ChatGPT as well. Like, I still want to be the kind of person that can think and produce information. And I would like to not have to access that tool, but the tool is helpful Yeah. because we are limited in our capacity Mhmm. to work through things. And what something like ChatGPT does in the AI space is it's really good at synthesizing a bunch of data based upon a prompt, almost simultaneously. Yeah. Right. I could do that too with the help of the Internet, but I'd be at it for a while to do what Chat can do instantaneously. Yeah. I'd just go on Google and enter my queries and then find my sources, like we traditionally would do. It just does that really fast.
[00:10:28.70] - Benjamin Reese
Yeah. Yeah. In that way, I mean, with the calculator and computer technology, it's still human beings at the cutting edge of mathematics who are pushing things forward. So the calculators haven't replaced the human necessity for excellence in that space. It's just that for people who maybe aren't mathematicians, it becomes helpful to get certain tasks done. So with ChatGPT, my first instinct is like, no, it's way worse than the calculator. I have some reservations about it. Like, I have this general uneasiness about it all.
[00:11:09.50] - Jeff Sherrod
Is it just because you like the way it makes it to be…
[00:11:11.50] - Benjamin Reese
I think it's the same feeling that math teachers had.
[00:11:13.89] - Jeff Sherrod
Yeah. That's, I guess, that's my question. Yeah.
[00:11:15.39] - Benjamin Reese
I think that, like, math people really love math, and in figuring out a problem by yourself and working through it, there's a certain amount of enjoyment and love that people have for that abstract problem solving. And I think that they saw that with calculators, kids were gonna miss out.
[00:11:37.39] - Jeff Sherrod
The joy.
[00:11:37.79] - Benjamin Reese
Yeah. Robbed of that joy. Now mathematical enjoyment is something that I never quite found in life, but I get so much enjoyment from, like, you know, reading and writing, and I think that the struggle is part of the enjoyment process. And so I think Okay.
[00:11:53.60] - Jeff Sherrod
The same like this.
[00:11:54.70] - Benjamin Reese
The same kind of, like, sadness maybe that math teachers had, seeing kids be like, ah, this is stupid, when it's something they love. I feel like that's kinda how I feel. I'm like, oh, why not just summarize this book using ChatGPT? Like, that kinda hurts my soul a little bit.
[00:12:11.70] - Gregg Garner
Well, you know, math did make sense to me from a very young age, and I was taking courses at the high school while I was still in elementary school. And I was gifted with a Texas Instruments 81. So the TI-81 I remember those. Yeah. A graphing calculator. I would upgrade later to the TI-85, which I still have somewhere.
[00:12:40.20] - Jeff Sherrod
I only have the TI-83.
[00:12:41.39] - Gregg Garner
Yeah. You see, that was after the 85, believe it or not. That came out later. But the 81 is where I started, and it was a calculator indeed. But the thing about it was you could program it, and this is where I think, when you're talking about the calculator analogy and ChatGPT, this is the difference. The calculator requires the human being to input the variables and to define them so that you can solve a problem. Whereas with ChatGPT, you can leave variables and just ask it to think through how to solve for those variables, and it does the thinking. But with my TI-81, what I quickly learned to do was program it. Mhmm. So while all my friends were working through what the math teacher taught, I would run a program, insert the numbers for the variables, and then solve for x, solve for y, solve for x and y. Yeah. And then press enter on my program, and I'd be done. But then, you know, the big issue with math teachers is show your work.
[00:13:49.39] - Jeff Sherrod
Right. Right.
[00:13:50.10] - Gregg Garner
So I could only use it as a way to check my work. Mhmm. So I would have to show my work and then run my program. Boom. But it was so fun for me to program it, because I was essentially the external learning processor for my calculator, teaching it to think.
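For a sense of the kind of program Gregg describes, here is a rough reconstruction in Python rather than TI-BASIC (hypothetical, not his actual calculator code): the human still defines the problem and supplies the coefficients, and the program only automates the arithmetic.

```python
# Rough stand-in for a TI-81 style "solve for x and y" program (illustrative only).
# Solves the system  a1*x + b1*y = c1  and  a2*x + b2*y = c2  by Cramer's rule.

def solve_2x2(a1, b1, c1, a2, b2, c2):
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("No unique solution: equations are dependent or inconsistent.")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# The human still has to set up the equations and enter the numbers;
# the program just runs the formula, like the calculator program did.
x, y = solve_2x2(2, 3, 12, 1, -1, 1)   # 2x + 3y = 12,  x - y = 1
print(x, y)                            # -> 3.0 2.0
```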
[00:14:09.10] - Jeff Sherrod
Mhmm.
[00:14:09.89] - Gregg Garner
So I could actually make it learn by giving it new program sets. But that's what AI does today. It's learning.
[00:14:19.50] - Jeff Sherrod
Yeah.
[00:14:20.00] - Gregg Garner
And this is where it gets confusing. Like, what does it mean for AI to learn? The newest ChatGPT-4 model just had an update, and it says now it will remember everything you're talking about Right. in a stream.
[00:14:34.29] - Jeff Sherrod
Yeah. I saw that.
[00:14:35.50] - Gregg Garner
Which I'm so happy about.
[00:14:36.70] - Jeff Sherrod
Me too. I feel the same way.
[00:14:38.50] - Gregg Garner
No more reprompting it every single time. But that in and of itself is saying it has this artificial form of memory, a pseudo-memory.
[00:14:47.60] - Jeff Sherrod
Mhmm.
[00:14:48.10] - Gregg Garner
And now it can hold that information in a way that probably exceeds our normative ability to remember information. Mhmm. At least outside the scope of what most people do with memory. Right. There are those people out there who are memorizing decks of cards in random order. Right? But most people aren't gonna remember the way ChatGPT will remember now. They will remember that they had questions along that stream of thought, and if they arrange their prompts appropriately, they can go back to the very thing they were working on a month later. Or, for example, in academics, if I have a class that's scaffolding with respect to the final result
[00:15:32.00] - Benjamin Reese
Mhmm.
[00:15:32.60] - Gregg Garner
I can work with ChatGPT, gather my thoughts in the first assignment, and then when the second assignment comes, reference my first-assignment thoughts, because it remembers things, and update the knowledge acquisition that Chat's helping me to do. Yeah. So I think my initial response to all this stuff is that there's some exciting opportunities Yeah. with Chat and academics. I have to concede I am an early adopter with technology. It doesn't take much to convince me that if this thing can make life easier, I'm going to figure out how to use it. Yeah.
But I also have some theological rooting for that. Genesis 2 introduces work to the human being as something that God provides water for, but they have to actually till and work the ground. The aloneness of the man is highlighted, and the remedy to his aloneness is first to introduce technology. Now, the technology is found in animals, Mhmm. and the text seems to point out that this technology exists to lighten the burden of the man with all that he had to do. So the man has to go through and name the animals, so that whatever it was named, that was its name, Mhmm. and this human being has a responsibility to identify the functional capacity of each animal. And that's usually how we name things. Right? We usually name things by how we understand they fit into our world.
So I even think that today, when we look at the progress of technology, we can see that what we have came from that very initial technology. Let's just think automobiles. Right? We wanna get somewhere. We ride a horse, and we like the horse. The horse is fast. The horse has stamina. The horse can get through various terrains. We like the horse. And then we're like, but now we wanna carry a load. The horse can carry a limited load. But if I had two horses, I could have another one attached and carry more of a load. Then it's like, but I'd like to go faster, further, longer.
[00:18:00.09] - Jeff Sherrod
Mhmm.
[00:18:00.59] - Gregg Garner
I wanna put a carriage. I'm gonna put these several horses on the front of the carriage, and they're gonna take me. But then at a certain point, it's like, these horses are expensive, and they have to have upkeep, their stables. Let's invent something that can produce the same kind of energy to propel a carriage forward. Here comes the engine. Yeah. Right? Now, let's describe how we talk about the power of the engine. Well, we might as well talk about it how we understand it. We're gonna call it horsepower. And then the next thing you know, our vehicles are being named Mustangs and Rams.
[00:18:35.09] - Jeff Sherrod
Right.
[00:18:35.50] - Gregg Garner
And we see that the earliest technology for human beings was the living creatures around them, whose functional purpose they were able to identify, so that if they named them appropriately and used them accordingly, it made life easier. It makes the burden of having a productive and beautiful society a little lighter. So for me, whenever I'm looking at technology today, first of all, I can recognize that these aren't living systems.
[00:19:06.79] - Benjamin Reese
Mhmm.
[00:19:07.00] - Gregg Garner
And God didn't create a lot of the technology and systems that we're working with. But I can note that they're likely derivatives that go back to something in creation. Like, we know that the Wright brothers were thinking about flying by looking at birds, that Da Vinci drew his original model based upon his observation of birds. And today we have planes, and all the advanced technology that goes on planes that human beings created, but their creation is still a derivative of something that God did. Mhmm. So I think the scary part about the AI we're talking about is that now the derivative is us, the human being. Yeah. Whereas in everything else, the skyscraper is a derivative of a mountain, or the pyramid, a derivative of a mountain. Right. Like I said, planes, birds; vehicles, these other kinds of animals; or even, when we look at tractors, that used to be an ox and a plow. Right?
[00:20:05.20] - Benjamin Reese
Yeah.
[00:20:06.00] - Gregg Garner
The derivative used to be animals. Now it's us. Right.
[00:20:09.50] - Gregg Garner
And I think that's where the where the fear enters.
[00:20:10.40] - Jeff Sherrod
Yeah.
[00:20:12.20] - Gregg Garner
It's like, oh, man. Human beings created something, and now they're creating it in our image. Where before, they were creating something in the image of creation. Right. Now we're creating something in the image of the ones that have been created in the image of God. Yeah. And that's where I feel like it requires a more careful approach.
[00:20:34.59] - Jeff Sherrod
Yeah. Because it's primarily a linguistic tool. Yeah. And so we're dealing with the only real linguistic animal in God's creation. Right?
[00:20:43.59] - Gregg Garner
Yeah. And and, frankly, God's invisible.
[00:20:47.20] - Jeff Sherrod
Yeah.
[00:20:47.50] - Gregg Garner
So to image God is linguistic Yeah. Mhmm. By default. I'm not gonna image God by moving around real fast or
[00:20:55.79] - Benjamin Reese
Right.
[00:20:56.40] - Gregg Garner
Like, visibly demonstrating something. It's gonna be linguistic. So we do know that according to that Genesis 2 text, the man has that power: whatever he named it, that's its name. We have this linguistic capacity to create realities.
[00:21:11.70] - Jeff Sherrod
Right.
[00:21:12.50] - Gregg Garner
And that's gonna be disruptive without a sound theology, because of where it can go. Now people are lonely. Because remember, that was the initial thing. Mhmm. This man's alone. Technology. We already know that in nursing homes, elderly people struggle with loneliness. Like, they're alone in these places, and that negatively impacts their longevity, their well-being. And so part of the solution has been to bring in different bots. I know a really popular one is, like, this seal bot that just has these reactions,
[00:21:50.59] - Benjamin Reese
I don't know.
[00:21:52.90] - Jeff Sherrod
I don't even know what you're talking about. Yeah. That looks like a seal?
[00:21:55.70] - Gregg Garner
Yeah. Yeah. Gotcha. They have dog versions. They they have, different animal versions at this point
[00:22:01.79] - Jeff Sherrod
Yeah.
[00:22:02.20] - Gregg Garner
That are present, bring comfort, but they're bots. That way, if the elderly person forgets to feed them or doesn't do the kind of things that Right. they should be doing, it doesn't die. It just loses its charge, and you have to recharge it. Right? But we're also looking at how, when the AI doesn't take on a human personification, it's less threatening. Yeah. I I
[00:22:29.70] - Jeff Sherrod
I can intuitively feel that.
[00:22:31.40] - Gregg Garner
Yeah. Yeah. Yeah. So, have you seen the bots that do Amazon packages now?
[00:22:37.20] - Benjamin Reese
Like, deliver them?
[00:22:38.29] - Jeff Sherrod
I've seen, like, the helicopter ones.
[00:22:40.40] - Gregg Garner
Yeah, the drone-type ones. There are those, but there are now these actual ones that look like something out of the Terminator era, where it's just mechanical, structural, like, you know when the Terminator has no skin? Yeah. Right. Yeah. It's just the, I don't know, robot.
[00:23:01.09] - Jeff Sherrod
Right.
[00:23:02.00] - Gregg Garner
Yeah. That's what they look like, except they have half bodies. So there's legs, and then they have, like, a platform. So they still don't feel fully human. But what happens when someone just decides, we're gonna complete the skeletal form on this. Yeah. And we're gonna slap some skin on it. Hey, we can make them look pretty good. Next thing you know, you drive by and you see this human figure get out of an Amazon vehicle, which, by the way, is auto-driving
[00:23:28.20] - Jeff Sherrod
Mhmm.
[00:23:28.70] - Gregg Garner
Because Amazon does have its cyber vehicles now. Yeah. And, anyways, it's auto-driving. The auto-bot comes out of it and then delivers a package. You drive by, and you may not even be able to tell the difference.
[00:23:40.40] - Benjamin Reese
Yeah.
[00:23:40.70] - Gregg Garner
Either way. To me, that's where it's like, okay, we're entering into something interesting. Now, I think we can debunk the idea that there's any form of technology that would create sentient forms of intelligence.
[00:23:51.79] - Jeff Sherrod
Right.
[00:23:52.70] - Gregg Garner
There's Yeah. Yeah. We're not thinking of spirituality. Yeah.
[00:23:56.40] - Jeff Sherrod
Yeah.
[00:23:56.50] - Gregg Garner
None of this stuff can think on its own. It's all programmed. And I think that's where the danger is, because who's programming it? What kind of ethics do they have? Right. What are they writing into, if we can say it analogically, the DNA of these bots? Mhmm. Like, how will they respond? What will they choose? Because what happens if they become forms of law enforcement? That's always the dystopian future. Right? Like, you now have these things that are programmed to act without any sense of soul. Right. And they're judge, jury, executioner kind of thing. I think this is where it gets troubling.
[00:24:42.29] - Jeff Sherrod
It certainly gets troubling. You know, even to your point that someone's programming this, that someone or some people have an agenda, this is gonna affect people in the education space as well. Where do you think, though, that, like, as a tool, we're still saying, alright, because you're saying you're using this, this is helpful. And I would say the same thing to students, just like we encourage students to use Grammarly. This is a tool, an AI tool. But Ben's getting into something I think is still interesting, that there's this process of maybe writing and working through it that seems to be helpful. At least you like it.
[00:25:22.70] - Benjamin Reese
Yeah. I think it's helpful.
[00:25:23.90] - Jeff Sherrod
And helpful. Yeah. So where does, like, and when we're doing education, I'm trying to think about the best way to say this. When we're doing education, is it that we're trying to impart the concepts, in any format, so that they can discern those agendas that are there? Or are we also acknowledging that the means that we use to instruct, whether it's the actual writing and putting words to paper, matters? You know, like, should education get turned?
[00:25:56.79] - Gregg Garner
Right? Because there's a dichotomy here that we have to identify. There is a group of people that want to go to school to make money. Yeah. There's another group of people that wanna go to school to acquire knowledge and wisdom. Right. And in many ways, there is opposition that exists between those two approaches. And Ben is one of those pure souls, you know, that really wants to learn something for the reward of the knowledge and the wisdom. Mhmm. Whereas there are a lot of people out there, I would say specifically in developing-world countries,
[00:26:34.90] - Benjamin Reese
Mhmm.
[00:26:36.00] - Gregg Garner
Who, because of a lack of access to the venues and the knowledge bases that have long existed in the world, independent of the type of technology we have now, have been kept out of…
[00:26:52.70] - Jeff Sherrod
Yeah.
[00:26:53.09] - Gregg Garner
How it was that they could effectively partake in the rest of the world. Let me give you a practical example. Let's say I'm growing up in Iran, and we've had some trouble in the last, you know, fifty years. And so the infrastructure of Iran is not where we'd like it to be. The city has its development, and there's some technology there. I'm coming from a rural area, but I have some capacity, except I have no access to the libraries, which include not just books, but films and photographs and artifacts. I can't access any of that, but all of a sudden I have an Internet connection.
[00:27:36.59] - Jeff Sherrod
Right.
[00:27:36.79] - Gregg Garner
And now with the Internet, I can go on nationalgeographic.com, smithsonian.org, and I start looking at videos and watching and learning and growing. I'm now accessing like, it's even better than, remember in Good Will Hunting, he's like, how you like them apples? Mhmm.
[00:27:50.90] - Jeff Sherrod
Because he gets the girl's phone number, but it's a result of the fact that he said he got what the Harvard grad got for a dollar fifty in late fees at the library. Right. But now people get access to that kind of knowledge through their phones Yeah.
[00:28:02.90] - Jeff Sherrod
Mhmm.
[00:28:03.40] - Gregg Garner
Which is phenomenal. So now, what is a hindrance? Well, the language of commerce is English. But now with AI Yeah. you can talk your native language into your phone, and it spits out for you the translation into English and then revoices it for you Right. into an English voice Yeah. And now you can type in your own language into another AI generator for a video that you want for your new product that you're gonna put on Amazon. It'll produce the video, take your AI native-language-to-English, English-voice-speaking script, and now you put your products on Amazon. You seem like the most educated business person with a sophisticated model, and AI has done the name check for you. It's figured out what's gonna SEO well. And because you were capable enough of making all this happen, you just eliminated the need for, like, ten years of study because of that piece of technology you have, and now you can change the lives of your family. Yeah. So on the one hand, I'm not trying to say it's evil to have that desire to create something productive. You create an income, and AI shortcuts you.
[00:29:24.70] - Jeff Sherrod
Mhmm.
[00:29:25.00] - Gregg Garner
It gets you there. And the flip side of that is it can also create an appearance of something legitimate when it is, in fact, hollow. Yeah. There's actually nothing
[00:29:38.20] - Jeff Sherrod
And I think that's the concern for us as educators. That's our fear. Right? Like, we'd hate to graduate someone with the appearance that they have learned
[00:29:46.79] - Gregg Garner
And they haven't. So that's what's gonna have to happen. Regulating bodies in various countries are going to have to assert themselves in such a way that they give us citizens protection, because now there are loopholes that have been created for a person to verify their legitimacy as a technical practitioner.
[00:30:08.90] - Jeff Sherrod
Right.
[00:30:09.40] - Gregg Garner
In the same way we've had to do in all kinds of fields. Because believe it or not, there was a time when anybody could diagnose you medically. Right. Even in our country. Right?
[00:30:18.09] - Benjamin Reese
Yeah.
[00:30:18.29] - Gregg Garner
That's not the case anymore. If you do that, that's the illegal practice of medicine. Same thing with law, same thing with other areas. I believe that this is the direction we have to go in the future, or we're gonna find people scammed out of a lot of stuff. And scams are at an all-time high
[00:30:36.20] - Jeff Sherrod
Yeah. Yeah. Yeah. Right now.
[00:30:37.70] - Gregg Garner
And that's a result of AI. I mean, people who are good at it can take a vocal sample. Let's say you like this podcast. I could get vocal samples of you
[00:30:50.50] - Benjamin Reese
Mhmm.
[00:30:50.90] - Gregg Garner
And then have AI generate all of the vocalizations that you would make, based upon a few episodes here. And then I can turn that into a voice on the other side of my typed prompt. Call your grandmother and ask her for some money because I'm in trouble. Yeah. And she gives me her account number, and then she thinks that she helped out her grandson. How do I know that happens? This literally happened to my grandmother. No way. Yeah. She called me crying because my brother had reached out to her saying he was in jail and he needed help, and she was going to send him money immediately, but she didn't know how to do it according to the instructions. And I said, let me call my brother. Called him, and it wasn't him. And I was like, grandma, it's not him. And she was about she would have sent it already, except she didn't know how to do it. But she's like, it was his voice. I know his voice. It sounded just like him. Goodness. It wasn't him. Yeah. So this stuff happens. So there's the real negative side to all of this. But I think on the positive side, even for a purist like yourself, Ben, I think with some integrity, and if we can teach young people how to use it effectively
[00:32:06.29] - Benjamin Reese
Mhmm.
[00:32:07.00] - Gregg Garner
The best thing that AI could be used for right now is a sorter. Like, it can sort things out for us.
[00:32:13.20] - Benjamin Reese
Mhmm.
[00:32:13.50] - Gregg Garner
Like, there's so much junk out there. So you can say, I need ten resources from the last five years that speak towards these ideas, that are published by publishing companies that fit this criteria
[00:32:29.59] - Benjamin Reese
Right.
[00:32:30.59] - Gregg Garner
Where the authors include at least one person with a terminal degree.
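The "sorter" use Gregg describes amounts to a structured filter over source metadata. Here is a rough sketch of what those criteria look like as code; the records and field names are invented for illustration, and this is not how ChatGPT implements anything internally.

```python
# Illustrative only: the kind of filtering the prompt is asking the AI to do.
# The source records and field names are made up for the example.
from datetime import date

SOURCES = [
    {"title": "Faith and Rhetoric", "year": 2021, "publisher": "Academic Press",
     "author_degrees": ["PhD"]},
    {"title": "Notes on Genesis",   "year": 2009, "publisher": "Self-published",
     "author_degrees": ["MA"]},
    {"title": "Pauline Studies",    "year": 2023, "publisher": "University Press",
     "author_degrees": ["MDiv", "PhD"]},
]

TERMINAL_DEGREES = {"PhD", "ThD", "EdD"}

def matches(record, min_year, approved_publishers):
    """Keep a source only if it satisfies all of the stated criteria."""
    return (record["year"] >= min_year
            and record["publisher"] in approved_publishers
            and any(d in TERMINAL_DEGREES for d in record["author_degrees"]))

cutoff = date.today().year - 5  # "in the last five years"
results = [s for s in SOURCES
           if matches(s, min_year=cutoff,
                      approved_publishers={"Academic Press", "University Press"})]

for s in results[:10]:  # "ten resources" cap
    print(s["title"])
```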
[00:32:37.70] - Jeff Sherrod
Like, you can get very specific. This morning, I asked I was doing some work on pistis Christou, on this phrase. And I was, you know, looking at subjective or objective
[00:32:47.20] - Gregg Garner
Faith of Christ.
[00:32:47.70] - Jeff Sherrod
Faith of Christ and looking at subjective or objective genitive debates here, faithfulness of Christ or faith in Christ.
[00:32:53.40] - Gregg Garner
Galatians, looking at the Galatians six.
[00:32:54.79] - Jeff Sherrod
Looking at the Galatians six. And then, you know, my question was just, give me five scholars that hold to the subjective view, and then tell me some articles that they wrote. Yeah. That's a short way of getting some articles. And
[00:33:07.09] - Gregg Garner
if you were to Google all of that, which people are okay with Googling.
[00:33:10.29] - Jeff Sherrod
Yeah. We're all okay with that.
[00:33:11.79] - Gregg Garner
Basically, what we're doing is we're letting Chat Google for us.
[00:33:14.70] - Jeff Sherrod
Right. Yeah.
[00:33:15.50] - Gregg Garner
And and do it
[00:33:16.20] - Jeff Sherrod
like a Google replacement.
[00:33:17.29] - Gregg Garner
Even though it uses Bing
[00:33:18.79] - Benjamin Reese
Yeah.
[00:33:19.00] - Gregg Garner
It's nonetheless gathering all of this information. So I think in academics, we're gonna have to surrender to the fact that the tools are available. It's like, you know, you've got an ox in your backyard. You can't expect the student to just go out and plow by hand. Yeah. Like, we gotta let them use the ox, and then give them a way to think about using the ox that's to their benefit. I think this is all gonna end up in metacognition for me. Like, if Right. if a person doesn't know why they're using what they're using, that's the problem. Yeah. Yeah. So if a student's like, I just used ChatGPT, and I go, why? And they're like, because it was easier, because it gave me the right answer, we're failing.
[00:34:05.40] - Jeff Sherrod
Yeah.
[00:34:05.79] - Gregg Garner
But if they can tell me why and articulate the benefits of that approach in some kind of, like, cost-benefit analysis with respect to getting their work done, that exercise in and of itself was worth it for me.
[00:34:21.19] - Jeff Sherrod
Right. Right.
[00:34:22.40] - Benjamin Reese
Does that make sense?
[00:34:22.59] - Jeff Sherrod
Yeah. I mean, that goes back to your teacher asking you why, with the program that you made for the formulas. Well, it shows that I both understand the formula and that I know how to do programming. Like, there's so much benefit that someone reasonably could look and be like, well, yeah, this just makes sense. Go for it.
[00:34:38.50] - Benjamin Reese
Yeah. I do think that metacognition is going to have to be taught more explicitly or emphasized more, because, you know, one of the most important skills for any intellectual person is to get over our ability to quickly deceive ourselves into thinking we know something when we don't. Yeah. To read something once and to say, oh, yeah, I think I got the gist of it. Yeah. You sort of learn over time that, okay, I can't believe that sense in me that says it's good enough; I actually don't know what I'm talking about. You might have the experience of, like, thinking you understand something and then trying to explain it to somebody else and realizing, I don't actually understand it at all. Yeah. But I think with ChatGPT, that, I don't know, human tendency to deceive ourselves into thinking we have knowledge when we don't, it's gonna be much higher.
[00:35:30.00] - Jeff Sherrod
Because those things this is where because I think
[00:35:31.19] - Benjamin Reese
the student, when he put in that prompt and he turned in the paper, I think he believed without malice that he understood the material. Like Right. I think he was like
[00:35:43.90] - Gregg Garner
Otherwise, the submission wouldn't have been made.
[00:35:45.90] - Benjamin Reese
I produced it, or, like you know? So I think there's always been an opportunity for self-deception, but I think with ChatGPT,
[00:35:54.50] - Jeff Sherrod
I think that this is where colleges are uniquely suited to still help students, because we're not just the repository of information that we used to be. What we've curated is not just library resources. We've curated people, and those people have curated Mhmm. knowledge bases and skills to be able to help students through that process. And I think that's where it comes in for every young person: yeah, you can find the best teacher on the best subject, probably for free. But then for someone to be like, what did you think about that? And why do you think they made that argument instead of that one? That's what a teacher can uniquely do, and predictive text can't do it the same way.
[00:36:34.80] - Gregg Garner
Yeah. Yeah. Yeah. Because even if you ask AI to help you by asking you questions, it doesn't know how to ask any question outside the scope of the prompt you gave it to ask the questions.
[00:36:49.40] - Jeff Sherrod
Mhmm.
[00:36:49.90] - Gregg Garner
It doesn't have free dialogue with you. It's not in your context. Now, granted, they're trying to change that, for example, with wearable technology. So, like, with some of these self-driving cars Mhmm. if you attach it to your wearable technology, it now can detect your habits. Let's say you drive it to work, you go to work, you get off at a certain time, it knows the weather. Based upon its knowledge of you getting off at a certain time, the car will actually start on its own. It'll warm up, and it'll get to the temperature you prefer without you prompting it, just based upon your agreement to whatever agreement they had you…
[00:37:41.50] - Jeff Sherrod
Right.
[00:37:42.19] - Gregg Garner
You know, whatever it does, you just scroll through it really fast and press, like, accept, agree, to get to the next screen. But your wearable technology is now informing the vehicle of your whereabouts, so the vehicle can now prepare for you. And you get into the vehicle, and it will say your destination is five minutes away, or if you go this route, it's not heavy in traffic, because it knows you're going home. It knows you're going to the gym. Yeah. Based upon just a form of
[00:38:06.09] - Jeff Sherrod
pattern recognition.
[00:38:07.09] - Gregg Garner
Pattern recognition. And you didn't know it, but it's connected to your calendar. So it sees where you're at and what you have coming up. Like, there is this appearance of thinking
[00:38:18.00] - Jeff Sherrod
Right.
[00:38:18.69] - Gregg Garner
That exists, but it's not thinking. It's data collecting, and it's predicting. Yeah. And so you go back to that's what this model is, essentially, and you go to that student who doesn't recognize that. The generative result of that AI, which gave them a paper that you could clearly see was a form of plagiarism, to that student, this thing thought the way the student would have thought had they been asked the question and then reasoned the answer into words, rather than recognizing that it's only pulling from information
[00:39:02.90] - Benjamin Reese
Right.
[00:39:03.40] - Gregg Garner
That someone else put out there. Right. AI is not uniquely producing its own information without gathering it from somewhere else
[00:39:11.30] - Benjamin Reese
Yeah.
[00:39:11.80] - Gregg Garner
All the time. Mhmm. So there's the ability for you and I to think about something, even though academia this is one of my biggest pet peeves about academia, the idea that none of us could have any original thought. Everything has to be sourced.
[00:39:26.80] - Jeff Sherrod
Mhmm.
[00:39:27.30] - Gregg Garner
Even if you never read that person, you have to find the person who thought what you thought. Otherwise, you're not a good academic. That is very irritating to me, because it is possible for human beings to have thoughts that are original to them, based upon our capacity to rationally evaluate things.
[00:39:46.19] - Jeff Sherrod
Mhmm.
[00:39:46.59] - Gregg Garner
Right? These bots, though, when they're learning, they're not learning in that way. They're learning based upon the patterns
[00:39:57.50] - Jeff Sherrod
Yeah. That are there. And this is only gonna come out more in colleges, because if we want that other type of scholarship, well, AI can write that one. Yeah. You know, that one's an easy one to write. But the one that shows that we have the capacity to understand why it's doing what it's doing, the metacognition Yeah.
[00:40:13.59] - Gregg Garner
I think what's gonna have to happen is we're gonna have to change in big ways. For example, I don't think we're gonna accept written essays anymore.
[00:40:22.69] - Jeff Sherrod
Yeah. I've wondered. Do you think it's gonna
[00:40:25.09] - Gregg Garner
be a video interview or an oral interview, or we're gonna make a person handwrite things. Like, for me, that's where I'm headed next in our college.
[00:40:34.30] - Jeff Sherrod
Yeah. I've thought about, like, having someone turn in an outline, and then we just do oral papers. You know? Like, they have to present it live and be able to defend what they're saying. And Yeah. that way, it's still something that they have to be able to show.
[00:40:46.59] - Benjamin Reese
Because I think the angle that I've seen a lot of schools take is, like, how do we stop cheating? And I think, you know, for those who are maliciously cheating Yeah. they're gonna find a way. You know? Like Yeah.
[00:40:58.19] - Jeff Sherrod
I don't know. And they're cheating themselves anyway.
[00:40:59.80] - Benjamin Reese
They're cheating themselves. Right? But really, like, even to do oral tests is not about stopping cheating. It's about even stopping, again, that sort of, like, that false sense of knowledge that students could, like, very innocently
[00:41:13.90] - Gregg Garner
Yeah.
[00:41:14.50] - Benjamin Reese
Think that, oh, I have a competency Yeah. when they don't. And we wanna protect students from themselves in that way. Yeah. And it's a good experience for a student to have that, like, oh, yeah, I know what I know or whatever, and then to get into a classroom, be orally examined, and figure out, oh, yeah, that sense I had, that wasn't the right sense of Right. Yeah. what I need to understand.
[00:41:35.90] - Gregg Garner
Yeah. So for us as a bible school
[00:41:38.90] - Jeff Sherrod
Yeah.
[00:41:39.69] - Gregg Garner
I think the thing, and I'm just gonna speak colloquially, that's scariest about this is to believe that AI can give you a theological education.
[00:41:51.80] - Jeff Sherrod
Mhmm.
[00:41:52.19] - Gregg Garner
To think that AI could, for example, as was said in the seminar you attended Yeah. be a better preacher than we are. Mhmm. Maybe give us a little context for that experience.
[00:42:02.90] - Benjamin Reese
It was a seminar on AI, and there was a discussion related to AI in preaching. One of the attendees, not the guy giving the workshop, but one of the attendees, very boldly proclaimed that he believes ChatGPT can write better sermons than preachers and that they should just rely on ChatGPT, which obviously didn't play well with that audience.
[00:42:27.50] - Jeff Sherrod
Group. Yeah.
[00:42:28.09] - Benjamin Reese
But, you know, it's That
[00:42:29.50] - Gregg Garner
is a group of academics on the rear.
[00:42:31.09] - Benjamin Reese
But yeah. I mean, just that somebody had that opinion was kind of shocking.
[00:42:36.40] - Gregg Garner
Yeah. Yeah.
[00:42:38.09] - Jeff Sherrod
Yeah. It's that thing that says the only thing that matters is content. Mhmm. And that's not education. It's never been the case that because someone has content, or is able to say it the right way, suddenly they know things.
[00:42:55.59] - Gregg Garner
Again, this content is being generated by something that can't really think. So what if you prompt it in such a way that it sends its little spiders onto the web, and it attaches itself to, like, these sites generated by idiots?
[00:43:15.00] - Jeff Sherrod
Right. Yeah. People because you Well, I've read search
[00:43:17.69] - Gregg Garner
I've read these
[00:43:18.00] - Jeff Sherrod
I've asked ChatGPT a lot of questions about the Bible, and I've gotten idiotic responses.
[00:43:23.00] - Gregg Garner
Yeah. Me too. Yeah. It makes me laugh.
[00:43:25.19] - Jeff Sherrod
Yeah. That's what I thought.
[00:43:26.80] - Gregg Garner
But it's just pulling from what is made available on the web. And I think we can agree that most academics who are experts in their field aren't just blogging. Right. Right? They're monetizing their ideas. They're on password-protected websites that
[00:43:46.30] - Jeff Sherrod
Yeah. Sometimes behind paywalls. Right? Yeah. That it can't access.
[00:43:50.09] - Gregg Garner
Yep. So the information that we are getting is coming from people that we can't even vet, because Chat's not telling you. It's not letting you know. And sometimes it'll source aspects.
[00:44:02.90] - Benjamin Reese
Mhmm.
[00:44:03.50] - Gregg Garner
But you usually have to ask it to. Otherwise, it's just gonna give you its answers.
[00:44:08.50] - Jeff Sherrod
Even then, you're hoping it's right.
[00:44:10.80] - Gregg Garner
Yeah. And you hope that the person doing it has the capacity to discern To discern. whether it's right or wrong, whether it's real. Yeah. Although I think I agree more with Ben. I don't think the person's hoping it's right. I think they think it's right. Yeah.
[00:44:24.30] - Jeff Sherrod
I think so too. Yeah. There's a lot to say about this one. We might need a part two at some point. Yeah. We can kinda keep going with it. But, yeah, I appreciate you guys.
[00:44:32.90] - Gregg Garner
Maybe next time we get together, we could look at some stats on how people are perceiving AI, and then Yeah. maybe show some demonstrations. Because I think when people listen to us, they'll hear us talk about it, but I think we might also sound like we're speaking another language at a certain point.
[00:44:47.19] - Benjamin Reese
Yeah.
[00:44:48.00] - Gregg Garner
Just because it's, like, I think Gen Z listening to us, they're familiar with aspects of it, but it's still novel. Mhmm.
I think millennials have probably watched the evolution Mhmm. of the whole thing. And boomers are just like
[00:45:06.69] - Jeff Sherrod
What's happening?
[00:45:08.69] - Gregg Garner
Maranatha. Come Lord Jesus.
[00:45:10.90] - Jeff Sherrod
Come Lord Jesus. Yeah.
[00:45:14.59] - Gregg Garner
But I did wanna close with this thought here. Yeah. There's a pretty popular documentary on Netflix, but I just saw it recently, The Social Dilemma.
[00:45:21.59] - Jeff Sherrod
Mhmm.
[00:45:22.19] - Gregg Garner
And it interviews the architects of social media. One guy in particular has created an organization and is advocating for some ethical regulation, legislative regulation based upon the ethics of social media, to identify the harm that it causes.
[00:45:48.50] - Jeff Sherrod
Mhmm.
[00:45:49.00] - Gregg Garner
And, like, he cites things like, everybody who created this stuff doesn't let their kids have social media Yeah. and they themselves, all the original architects, are out of it and have created other organizations to come against it. But they all originally had no intent for what came out of
[00:46:07.00] - Jeff Sherrod
Mhmm.
[00:46:07.50] - Gregg Garner
The Yeah. The effort. And so I think it's important for us to recognize that even with AI, people who are crafting it, not all of them are evil masterminds who are trying to, yeah, launch us into that dystopian future.
[00:46:21.50] - Jeff Sherrod
Mhmm.
[00:46:21.80] - Gregg Garner
But some of them are just curious people who are being inventive and innovative and seeing possibilities, but they don't have a biblical, ethical, philosophical education.
[00:46:34.59] - Jeff Sherrod
Yeah.
[00:46:35.00] - Gregg Garner
And in that case, they're not capable of recognizing what it is that they're building and its impact on the world. And so I think that's gonna be one of the things that emerges as we move forward.
[00:46:51.00] - Jeff Sherrod
The unintended consequences.
[00:46:52.90] - Gregg Garner
Well, there's definitely so many unintended consequences, but I think society is gonna have to create guardians for that, people who legislate the behavior that could lead to that type of thoughtlessness. So today, pretty much anybody could program something and release it, an app or whatever, put that technology out there, because it's just considered a business.
I think probably in the future, what's gonna happen is, in the same way we have a federal board to give permission for people to release pharmaceuticals, I think it's gonna be similar
[00:47:33.69] - Benjamin Reese
Interesting. Yeah.
[00:47:34.30] - Gregg Garner
When it comes to technology. Because, I mean, in that documentary, they established the addictiveness of social media, which I think the DSM six now has Oh, does it? Yeah. as a disorder. Specifically, the fact that some people, and I think people have felt this before, where they didn't have their phone on them, they have anxiety
[00:47:58.69] - Jeff Sherrod
Oh, right.
[00:47:59.09] - Gregg Garner
About not having their phone on them. Or some people will feel like their phone is vibrating in their pocket when there's no phone actually there.
[00:48:04.90] - Benjamin Reese
Mhmm. Yeah.
[00:48:05.40] - Gregg Garner
Like, there's stats where people, like, lose their minds
[00:48:09.50] - Benjamin Reese
Yeah.
[00:48:09.69] - Gregg Garner
If they're not connected. So it's affecting our health as human beings. And I think government regulation has an interest in protecting the well-being of its citizenry. So I imagine, in the same way we saw pharmaceuticals impact the well-being of human beings and decided we should regulate them, it's gonna go the same way when it comes to technology and technology use.
[00:48:33.90] - Jeff Sherrod
Yeah. Yeah. Well, I appreciate it, guys. Thanks to everyone for hanging out with us today and being able to talk about this. This is a topic that's gonna be around for a while, and, like Gregg says, it's been around for a while. As always, we appreciate when you guys like and subscribe, and when you share the show with others; it really does mean the world to us. Until then, we'll see you guys next time.