Episode 565: Artificially Augmenting Higher Education

Jul 14, 2023 · 57 min

Episode description

Guest: Emily Weigel of the School of Biological Sciences at Georgia Tech.

First broadcast July 14 2023.

Transcript; Playlist at https://www.wrek.org/?p=39584

"My students are going to be hearing about this, they need to know about this."

Transcript

[RATTLING NOISES]

HAL

Look, Dave, I can see you're really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over.

[BREATHING NOISES]

HAL

I know I've made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal. I've still got the greatest enthusiasm and confidence in the mission, and I want to help you.

[ENERGETIC ROCK MUSIC PLAYING]

CHARLIE BENNETT

You are listening to WREK Atlanta, and this is Lost in the Stacks, the research library rock and roll radio show, live from the spaceship Discovery. I'm Charlie Bennett in the studio with Fred Rascoe, Marlee Givens, Lila Jane Bennett, and a guest to be named later. Each week on Lost in the Stacks, we pick a theme and then use it to create a mix of music and library talk. Whichever you're here for, we hope you dig it.

MARLEE GIVENS

And today's show is called "Artificially Augmenting Higher Education." Oh, man, more artificial intelligence?

[LAUGHS]

FRED RASCOE

Yeah, kind of had a thing about it lately, haven't we?

CHARLIE BENNETT

Fred, you've produced three of the last four AI episodes.

FRED RASCOE

Yeah.

CHARLIE BENNETT

I feel like it's gotten under your skin a bit.

FRED RASCOE

I think it has. And the so-called "intelligence" of artificial intelligence, well, it may be ARTIFICIAL, in all caps there, but its impact on higher education is very real.

MARLEE GIVENS

And it's only going to become more and more integrated with academia's curricula, assessment, and processes.

FRED RASCOE

Sure, there are professors that are running screaming from this attack of the AI chatbot-- which would be a good sci-fi movie. But there are professors who are embracing the potential of the technology to be something positive in the college classroom.

MARLEE GIVENS

And we'll have one of those very professors as a guest today, talking about her experience using tools like ChatGPT to augment teaching and learning.

FRED RASCOE

And our songs today are about control, human and machine labor, and automated solutions. Even though it's summer break, our first song will be about school. It's a song all about learning biology in the classroom and breaking rules.

CHARLIE BENNETT

Fred, the themes sound like punk rock, but that last bit sounds like '80s hair metal.

FRED RASCOE

Hey, funny you should mention that. This is "Back to School" by Ace Frehley right here on Lost in the Stacks.

CHARLIE BENNETT

Cover your ears, Lila.

[ACE FREHLEY, "BACK TO SCHOOL"]

FRED RASCOE

"Back to School," Ace Frehley. This is Lost in the Stacks, and today, we're talking about using AI tools like ChatGPT in the college classroom. And our guest is Dr. Emily Weigel, whose official title is senior academic professional in the School of Biological Sciences here at Georgia Tech. Thanks for joining us, Emily.

EMILY WEIGEL

Thanks for having me.

FRED RASCOE

And I know you're taking a break from welcoming the brand new biology students to Georgia Tech, so thanks for taking the time to join us today.

EMILY WEIGEL

Appreciate it.

FRED RASCOE

So before we jump into AI, when you're teaching, what kind of classes are you teaching?

EMILY WEIGEL

So I have most of the intro bio classes, ecology labs, biostatistics. I have our teaching assistants class in biology-- so lots of different types of classes.

CHARLIE BENNETT

If I may be so bold, that sounds like too many classes.

EMILY WEIGEL

Yes. Yes, it does.

CHARLIE BENNETT

OK, yeah. You've had to pick up some slack?

EMILY WEIGEL

I've picked up a few, but I think it's as our department grows, we can find a niche in the classes we like to teach and really think about who's got strengths in different areas. So that's been kind of nice to hone in. So it's not everything all at once necessarily, but things have changed through time.

FRED RASCOE

Well, we have you on because you are giving sessions here at Georgia Tech about how you see some positives in the use of AI in education. And it makes me think, talking about your class teaching load, that maybe using AI would be some relief. Is that what got you interested in using AI?

EMILY WEIGEL

Honestly, I just wanted to see why everybody was so scared, and I wanted to play with it. And I wanted to see, what is this buzz about? What potential does this have?

There is some time-saving elements that you could think about with incorporating some of these elements, but I think also, too, my students are going to be hearing about this. I need to know about this if they're going to encounter it so I can help them.

CHARLIE BENNETT

I don't want to make you speak for your colleagues or the public at large, but when you said, why were people so scared, were you seeing other folks at Tech who didn't want it around? Were you thinking of the general air of fear about AI that we've been seeing in the media? What are you referring to?

EMILY WEIGEL

I think it's the media, and then also just places like Twitter, Instagram, other places where educators talk. We have communities, not just at Tech, but broadly. And folks are like, how is this going to affect my teaching? What is it going to look like when I design assessments? Can I even do some of the things that I was doing before, or do I need to redesign my whole course?

And I didn't think it was great to operate from a place of fear, so I decided to look and see what we're actually dealing with.

CHARLIE BENNETT

And was that your list of experiments, the questions you just said?

EMILY WEIGEL

I suppose so.

CHARLIE BENNETT

Yeah, what was the first thing you did when you were trying to put AI into any sort of class process?

EMILY WEIGEL

So AI's been around for a bit, but I guess I hadn't really thought about it. One of the things that we do in biology a lot of the time is figuring out what organism is it that we're seeing when we're outside. And there's a lot of different apps that will use AI to figure out what that organism is. So I said, OK, in this instance, it works. Now, we're looking at a language model. Let's see what it does. Can it accurately do this?

Because sometimes it is, at least on the visual end, comically wrong. Like, a dog standing next to a shadow from a gate gets coded as a tiger because of the alternating stripes. That's not right.

CHARLIE BENNETT

[LAUGHS] Wow.

EMILY WEIGEL

Yeah, so I wanted to see if there was something linguistically, if I'd see these other things, like, can I break it? And in what instances is it actually useful versus not worth the time? And that depends on what you give it.

CHARLIE BENNETT

Go ahead, Fred.

FRED RASCOE

What's the language use that you use in your class?

EMILY WEIGEL

So it kind of has many different roles. One of the things I like to do is, if I design an assessment, I like to ask my TAs to take a look at what student answers might be, and--

FRED RASCOE

"Assessing" meaning you're gauging whether the students have learned something or not?

EMILY WEIGEL

Yes, so it could be something like a homework or a test question. If I'm writing a question and I need to know whether it's intelligible, I might say, OK, can you put this at the level for a fifth grader? If it interprets it a little bit differently than I did, it encourages me to clarify that.

And then, sometimes I can ask that question, and get sample responses, and ask my TAs to grade those responses so they know, maybe, the variety of things or interpretations they might see-- which can be really helpful when they're about to encounter 150 students' different written responses to something, and they might want to know, OK, what's a typical ChatGPT-type response? So if they see it, they can maybe guide that student to providing their own answer.

But more importantly, if the question could be interpreted in several ways, what's equally valid, and what are cool ways to look at it?

MARLEE GIVENS

I mean, that makes me think of what-- I feel like the primary fear of educators is that students are going to use this to cheat.

EMILY WEIGEL

I mean, it's possible, but I think if you design some of your assessments to recognize it's a tool and not the be all, end all, then you're just helping them to learn how to use the tool. And the objectives you might have might shift to being, OK, I'm going to learn how to delegate or to take this assignment and break it down into parts, rather than the focus being on doing any one of those individual parts.

CHARLIE BENNETT

So I think it's important to note that we got an eyeroll at the idea that students would use AI for cheating. Like, that was not your concern at all.

EMILY WEIGEL

Yeah, I think it's possible, but I think folks will find a way to do things and subvert learning in any way possible. This might be faster, it might be more freely available, but I think depending on what you use it for, and if you have told your students instances where they can and cannot use it, in what ways, and help them delineate the boundaries for what you're looking for, and don't treat it like this thing that shall not be named, I think it actually goes over better.

But I'm not foolish enough to think that everyone comes into the class with completely and totally good intentions always. So it's more like it's a given. Students are going to use this. And what do we do with that now?

MARLEE GIVENS

Yeah, it's like a calculator.

CHARLIE BENNETT

We're going to talk more about student use of AI in the next segment. But now, this is Lost in the Stacks, and we'll be back to talk more with Emily Weig-- you know what? I heard your name said before, and, of course, I'm thinking about too many things. Will you say your last name again for me?

EMILY WEIGEL

It's Weigel.

CHARLIE BENNETT

Weigel. I was going to say it right, but I misthought. Emily Weigel, about how AI is augmenting education, after a music set.

LILA JANE BENNETT

File this set under LV1028.3.N67.

[DRAMATIC ELECTRONIC MUSIC PLAYING]

LILA JANE BENNETT

You just heard "Where is the Machine" by Jeffrey Lewis and the Voltage, and before that, "Automated Education" by Jerk Test, songs about looking for a solution in automated machines.

CHARLIE BENNETT

This is Lost in the Stacks, and we're back talking about using AI in higher education with our guest, Emily Weigel, a biologist here at Georgia Tech. All right, so we were talking in the past segment about using AI. And at first, I was hearing you describe giving the students a new tool, like a calculator. You know, here's this thing; let's see if it can identify a dog or a tiger.

But then, also, you mentioned using it to stress test your assessments, your quizzes, and maybe even concepts you were putting in there. It sounds like you just went for it and used it in all kinds of ways. Can you dig a little bit deeper into how students are using the AI?

EMILY WEIGEL

So I've asked them to, on occasion, take our learning objectives-- so what we want the students to know and do-- and create questions using ChatGPT. It's like having a tutor on demand. And so, OK, make some questions for you. Instead of me making these giant sets of test questions, that kind of thing, I'll have them go through and then trade them with one another. Or ask it intentionally to solve this problem, but make a mistake.

And you have to go through and figure out where they made the mistake. That's really great in classes where they have to calculate things. So they can kind of go, OK, I need to really think through my steps here and spell it out in a way that the TA can follow-- yay for grading and getting those points-- but also in a way that someone else can follow.

Because when they write anything that they're doing for science and that kind of thing, they need to be able to spell out their rationale and how they got each step. So that's been really helpful to get them to think through critically. I really like, for lab reports, asking them to play devil's advocate, asking them to give their ideas-- if they're comfortable, right? So there's the ethics of asking them to use this tool.

I'm not going to make them use it, but I can make them aware that they can use this tool to say, OK, give me the limitations of this experiment as I've written it out. Or, if I have some ideas here that I'm positing, this is why. My data should be supported in this way. What are other possible explanations for this? And get them to really think a little bit broadly about it.

CHARLIE BENNETT

How many semesters have you been using AI in a really significant way in classes?

EMILY WEIGEL

So language models? Since spring, although we were aware in the fall of-- last fall, so about almost a year. But for other forms of AI, probably two years? But more visuals.

CHARLIE BENNETT

Are students excited about using it as part of the official curriculum, or are they already over it?

EMILY WEIGEL

So I think they're aware that it exists, and I think some of the shininess has been removed from some of it because they've tried it in different contexts.

And they see that it can work in some ways and be a great time-saver, but in others, it's just dead wrong. We have some objectives that ask them to look at misconceptions around evolution or vaccines, and we asked them to ask ChatGPT to put on the hat of a vaccine skeptic, versus a scientist, versus a doctor, and see just what types of responses they'll get from it.

And they realize, wow, OK, some of the responses I'm getting are not-- they can be illuminating in different ways, but not necessarily accurate in themselves unless I'm really particular about using it. And so I think that feeling that it is a solution for everything, that shiny, wonderful feeling, has gone away. But now they're like, OK, this is a tool, and I know how to use it.

CHARLIE BENNETT

You were very gentle there. But I think that when it comes to vaccine information being spit out by ChatGPT, I've got to feel like it can spiral into misinformation very quickly.

EMILY WEIGEL

Oh, absolutely. Yeah, and that's the thing, is if I'm training future biologists, they need to be aware of misinformation. Particularly those that are going into public health fields, they need to know what their particular patients are saying to them, and where this is coming from, and what those things look like. So this is a good way for them to anticipate in a language way what those issues might be, and how to counteract it, and how to move it in the right direction.

CHARLIE BENNETT

It just hit me that AI is a little bit like sex, in that some people are teaching abstinence only. They're saying, just keep the AI out of it. We don't want to interact with it or have it in the classes. But you're sort of doing AI education-- not just using it, but you're getting the students to understand what it is, what it can do, what it can't do.

EMILY WEIGEL

Yeah.

CHARLIE BENNETT

They're probably a lot more prepared for the flood that they're going to experience pretty soon.

EMILY WEIGEL

Yeah, it's pretty much like sex ed of informing them, telling them about consent, making sure that they can make wise decisions when they come into situations when I'm not there. So I think that that's a good way to prepare them for future tools.

We have the current AI tools that we have and the current problems, but down the road, if they don't have the ethical basis and troubleshooting skills to work with that, they're not going to be in a good place when they encounter something post-Georgia Tech.

MARLEE GIVENS

Have you found a community of other educators that are using these tools in the same way as you?

EMILY WEIGEL

"Community" is a loose term. I think there are some of us that are-- I wouldn't call us "early adopters" but more "eager adopters," let's say. And we are all communicating with one another. I think the divide where we see it more is folks that are in computer science type fields versus those that are outside of it. I think those that are in computer science type fields or more statistics fields have a different perspective than those that are outside of it.

Usually those that are outside-- so most of my biology colleagues that are not engaging in AI for part of their research-- are pretty averse, or just choosing to put their hands over all sensory organs and just ignore it. And I think in other fields where it's become, or it's been, something they've had to worry about for longer, they set the stage on what boundaries to have, how to talk to your students, and that kind of thing.

So I think it's more of everyone outside of CS following in CS education's footsteps early on, on how to deal with this in the classroom space.

MARLEE GIVENS

It just sounds like such an interesting pedagogical tool that other educators will be interested in exploring. But, of course, we're at Georgia Tech, and people don't always share these kinds of things with each other.

EMILY WEIGEL

That's true.

FRED RASCOE

All the talk is, cheating, cheating, cheating.

CHARLIE BENNETT

How is this affecting your, sort of, not-classroom relationship to AI? Like are you recognizing it out in the world more? Are you avoiding it? Have you taken all the voice assistants out of your house?

EMILY WEIGEL

No. [LAUGHS] I'm kind of going the other direction. No, I think it's made me a little bit skeptical, I think, on more of the other things that I have seen, just because I've seen it fail very simple things that I want it to do. And I think it's also been helpful when I think about other things that I want to do in my life, like creating any other kind of thing that I need summaries of or that kind of thing. I'm a little bit more able to use it than I think I would have been.

MARLEE GIVENS

You're listening to Lost in the Stacks, and we will talk more about augmenting higher education with AI on the left side of the hour.

[ROCK MUSIC PLAYING]

[HIGH-ENERGY MUSIC PLAYING]

MARK RIEDL

Hi there, this is Mark Riedl of Georgia Tech's Computer Science Department-- or is it? Maybe my voice has been deepfaked and this is just a digital forgery created by a neural network. Either way, you're definitely listening to Lost in the Stacks on WREK Atlanta.

CHARLIE BENNETT

Our show today is called "Artificially Augmenting Higher Education"-- all about how tools like ChatGPT are influencing classroom teaching and learning.

New technologies, for whatever benefits they bring, also extract a cost, a societal debt to be paid in exchange for their use. AI and similar technologies can lead us into a particular kind of societal debt called "ethical debt," as coined by information professor Casey Fiesler of the University of Colorado and a previous Lost in the Stacks guest.

Fiesler describes ethical debt as the detrimental trade-offs of our principles for using technology-- like when we lose privacy, when there's a proliferation of misinformation, and when there is labor exploitation or elimination.

The bulk of our show today is about how these new AI technologies can be used to improve academia, but it's also important to take a moment to consider the ethical debt that may come due, and therefore plan our usage with an eye on making sure that AI serves us-- all of us-- rather than the other way around. We can file this set under T14.5.S6385.

[BARBARA LEWIS, "MAKE ME BELONG TO YOU"]

Darling, whatever you--

[HIGH-ENERGY MUSIC PLAYING]

MARK RIEDL

That was "Ain't Got No, I Got Life," by Nina Simone. Before that, "Robot Parade" by They Might Be Giants. And we started with "Make Me Belong To You" by Barbara Lewis. Those are songs about figuring out who is actually in control.

[ROCK MUSIC PLAYING]

MARLEE GIVENS

This is Lost in the Stacks and we're talking about using artificial intelligence tools in the classroom with our guest Emily Weigel, biology professor here at Georgia Tech. And I know this got an eyeroll earlier when we were talking about cheating. Are there any other sort of dark sides to using AI in the classroom?

EMILY WEIGEL

There are some serious dark sides. So I talked earlier about having students generate their own questions. It can be-- ChatGPT and other ones can be very confidently wrong on things-- and almost so much so that students may think, I must be wrong, and it kind of messes with their own confidence. Or even they brought it to me saying, well, ChatGPT said this. And I'm like, well, here's this. Here's my paper on this. So I'm pretty sure, like, this is the actual answer, not what ChatGPT spits out.

CHARLIE BENNETT

[LAUGHS]

EMILY WEIGEL

So sometimes you have to kind of reel it back in, going, some of the answers you might get from it are not great. It's also just based on what data has been fed into the model, so the biases and everything else are reflected in that space. You can get a little bit better by asking it to take on the role of x. So let's say, be a professor of this, and the sources it uses will be slightly different.

But for the most part, the biases are already baked into whatever answers they're going to get. So accuracy and equity in those answers can be a little bit problematic.

CHARLIE BENNETT

That problem of the students being confounded by the appearance of authority, is that familiar to you in other areas? Is it the same process of being overwhelmed by Wikipedia or by common sense? Or does it feel different than things in the past?

EMILY WEIGEL

So back in the day, when people were wondering about what to do about Wikipedia and that kind of thing, you could easily show them, you can't back this up with sources like this, and anyone can edit this, and that works. This feels a little bit different because it's so conversational that they really do kind of believe it a little bit more, and-- at least that's just my experience.

CHARLIE BENNETT

Like it talks them into it?

EMILY WEIGEL

Yeah, almost. It just sounds so confident in what it says, and it's so wrong. So I'll just tell them, OK, ask it for your own biography, or ask it something you know definitively. And look at the answer it gives you and go, OK, imagine it has that amount of accuracy with these other things. It's not a good tool for information accuracy. It might be good for helping you brainstorm ideas or to outline certain-- there's lots of good uses of it, but it's not a tool for everything.

So use the correct tool for the correct type of problem. Find your problem, find the correct associated tool. Don't just throw everything at this. That's my rule.

CHARLIE BENNETT

This sounds like one of those-- the benefit of interacting with it in the classroom. You get to teach this. You get to give them the guidelines. Are there other problems or dark spots that you don't get to talk to the students about? Things that you're discovering in the back end of the teaching?

EMILY WEIGEL

I think initially, I was really excited about the ability for ChatGPT to help with language because I have students for whom English is not their first language or they're just not very strong writers. And so it was very helpful for structure and that kind of thing. But I think because of the data that goes into it about the voices of the writers that are there, sometimes the individual writer's voice gets lost as it makes edits.

And I don't like that as much because I don't think they feel the originality about what they're producing, and so they don't own it in quite the same way. So that's a dark side that I didn't anticipate. And so now I tell them, don't lose yourself as you make those edits. Don't forget you have an important voice. You have important things to offer. It's not just ChatGPT essentially taking your words and making it pretty. It's just cleaning up what is your set of ideas.

CHARLIE BENNETT

Are you using it in your own scholarship?

EMILY WEIGEL

I've tried, but sometimes it's not the best at-- well, it's not the greatest of tools. There's other tools out there that are helpful for science-type things. And I've found it to be useful for, if I'm stuck and I'm like, play devil's advocate for me. Like, what am I missing here? Or if I've got an abstract that I need to shorten to 180 words, absolutely, if I'm at 204 and I can't cut anything else, I will put it in there and then see what results.

I think most of the time, I get about 40% of the way there, but it can get me unstuck when I'm just immovable at my desk. So--

CHARLIE BENNETT

So we all have moments of doubt and pessimism?

EMILY WEIGEL

Moments?

MARLEE GIVENS

[LAUGHS]

CHARLIE BENNETT

And maybe I have them more often than others. When you're thinking about ChatGPT and tools like that, and see how rapidly the technology is accelerating, do you ever pause a little bit and think, what about my job is going to be replaced by AI?

EMILY WEIGEL

Yeah, I've thought about it a little bit. I think a lot of pedagogical training is getting you to set up conditions the right way for a student to be able to learn things and get rapid feedback. And so I think in some ways, it might take away some of the grading I have to do, which might be great-- to be able to get feedback more rapidly in larger classes and that kind of thing. But I think after playing with it for a while, at least for my lifetime, I'm probably secure.

But I think anything that, on scale, needs to happen quickly, it's probably not going to be a human grader as long as the task is simple enough. But I think that just pushes me to do different things with my students that might be more in-depth anyway. If a computer can do it, and it can do it so easily, then maybe that isn't something I should have my student doing.

Maybe they can use the tool for that themselves, and we get deeper into, let's structure your writing, rather than the simpler things.

CHARLIE BENNETT

This past year of using AI, has it changed any of your prior assumptions about how it's going to develop as a tool, how your career as an educator was going to go? I mean, anything that I'm not thinking of?

EMILY WEIGEL

I think it's gotten me a lot better about, potentially, having digital TAs alongside my normal teaching assistants-- so the fleshy and the digital-- and thinking about how to delegate tasks and what are things that I as an expert need to do, versus someone who is closer to the information but just learning it, versus someone who doesn't have a context for the information but can process it. And thinking about those roles and separating them out has been important.

CHARLIE BENNETT

So just in the last minute we've got left, ending on an optimistic note, what are you excited about in the coming semester with regards to using these tools?

EMILY WEIGEL

I think I'm interested to see how much deeper and more meaningful my students' writing will get now that some of the simpler things like formatting their references or those kinds of things can be taken care of a little bit quicker. I want to see how they're going to use that time differently. And I hope for good things.

CHARLIE BENNETT

Our guest today is Dr. Emily Weigel, Senior Academic Professional in the School of Biological Sciences here at Georgia Tech, talking about using ChatGPT and AI tools for teaching and learning. Dr. Weigel, thank you so much for coming on the show.

EMILY WEIGEL

Thank you.

LILA JANE BENNETT

File this set under Q334.M84.

[THE FRONT, "I'M NOT A MACHINE"]

LILA JANE BENNETT

"I Am a Machine" by The Meat Puppets. And we started off with "I'm Not a Machine" by The Front. Those are songs about the lines between human and machine labor.

[ROCK MUSIC PLAYING]

CHARLIE BENNETT

Our show today is called "Artificially Augmenting Higher Education," because Fred is so into AI.

FRED RASCOE

[LAUGHS]

CHARLIE BENNETT

Dr. Weigel, this show was about the intersections of AI and academia, especially in your class. We skimmed over the fact that when you're not figuring out ways to use new technology and strategies to help students to learn, you are a biology researcher. Don't you study things like how species like fish and reptiles select mates or move through environments? How does AI interact with your research?

EMILY WEIGEL

A lot of what we're doing is trying to figure out what animals are where. So we use things called camera traps. So think about it like a doorbell camera for wildlife. You see what comes by. You try to figure out who's there. Ideally they're coming by at the right times, not the wrong times. And we try to figure out who's interacting with whom-- who's in the neighborhood, effectively.

CHARLIE BENNETT

Is there a particular animal that calls the cops more often than others when the animals are on the Ring?

EMILY WEIGEL

Some kind of Karen squirrel? I don't know.

FRED RASCOE

[LAUGHS]

EMILY WEIGEL

No, we see a lot of different animals, and we try to keep track of mostly mammals for what's moving through a space, because they trigger the cameras most easily. But it's really about figuring out what all is in a particular space, whether you shouldn't leave your cat outside-- we see it, those kinds of things-- and we try to figure out how to help the environment and to think about what makes an urban space different from a suburban space in terms of what animals are there.

CHARLIE BENNETT

And forgive me if I missed it, but is AI in there?

EMILY WEIGEL

Yeah, so we get thousands of pictures. It's almost impossible to do it all by hand. So AI will--

CHARLIE BENNETT

Gotcha.

EMILY WEIGEL

Yeah, it'll process most of it, and then we'll go through to check. It's right about 90% of the time.

CHARLIE BENNETT

So this is a question that we could do a whole episode on, so I'm just going to force you to be superficial about it. But so is AI really just a very good automated system that can absorb a lot of different information, or is it something more than that that we're going to have to get a hold of?

EMILY WEIGEL

I think it's a tool that is different than tools we've had in the past, but I still think it's just a tool. We just have to think about the ethics behind using it and what it can and cannot do.

CHARLIE BENNETT

Perfect soundbite.

FRED RASCOE

And just one last question, because my mind is in the gutter. Do you also use these cameras to catch what animals are doing it with other animals?

EMILY WEIGEL

Yeah, we see that a lot. We try to keep it PG, but unfortunately, if it's on camera, it's on camera. And if AI sees it, so do we.

CHARLIE BENNETT

It must be a summer show. Fred, roll the credits.

[RHYTHMIC MUSIC PLAYING]

MARLEE GIVENS

Lost in the Stacks is a collaboration between WREK Atlanta and the Georgia Tech Library, written and produced by Charlie Bennett, Fred Rascoe, and Marlee Givens.

FRED RASCOE

Legal counsel and a dog that we think is a tiger-- or maybe it's a tiger we think is a dog-- were provided by the Burrus Intellectual Property Law Group in Atlanta, Georgia.

CHARLIE BENNETT

Special thanks to Emily for being on the show, to all the students doing their best in this ChatGPT era, and thanks, as always, to each and every one of you for listening.

MARLEE GIVENS

Our webpage is library.gatech.edu/lostinthestacks, where you'll find our most recent episode, a link to our podcast feed, and a web form if you want to get in touch with us.

FRED RASCOE

On next week's Lost in the Stacks, we're going to plan the perfect library instruction curriculum.

CHARLIE BENNETT

What?

FRED RASCOE

Think big, Charlie.

CHARLIE BENNETT

I guess I have to do some prep before the next show. OK, time for our last song. In honor of the research that Dr. Weigel conducts, we close with a song about the characteristics that are used to identify the right mate. This is Fred with his mind in the gutter, and this song is "It's in His Kiss," the Shoop Shoop song, by Betty Everett right here on Lost in the Stacks. Hey, Lila, you want to close the show?

LILA JANE BENNETT

Have a great weekend, everybody! [BETTY EVERETT, "IT'S IN HIS KISS"] Does he love me? I want to know. How can I tell if he loves me so? Is it in his--

Transcript source: Provided by creator in RSS feed.