Here on WREK Atlanta, wow.
[MUSIC PLAYING]
Charlie, I want to ask you a question. CHARLIE BENNETT: Lay it on me, dude. What's the best bit of feedback you've received about your teaching?
Slow down.
That was going to be my answer as well.
[MUSIC PLAYING]
You try to pack it all in. You're going fast. You got the energy. Someone just reminded me to slow down.
Mine is going to be a combination of slow down and maybe cut some stuff out. CHARLIE BENNETT: Yeah, less and slower.
[LAUGHTER]
["LOST IN THE STACKS" THEME MUSIC]
You are listening to WREK Atlanta. And this is Lost in the Stacks, the research library rock and roll radio show. I'm Charlie Bennett in the studio with Marlee Givens, Fred Rascoe, and a bunch of old friends. Each week on Lost in the Stacks, we pick a theme and then use it to create a mix of music and library talk. Whichever you're here for, we hope you dig it.
Our show today is called "Instruction Assessment Check-in."
And-- sorry, some weird audio stuff going on right now. How about you read my line real quick while I try to fix this.
We'll be talking about a new evaluation survey we've been using to assess library and archives instruction for the past year. That is to say, we're going to check in on an assessment project. And Fred, I think, thinks that that's assessing an assessment project.
I think something is up with your mic, and I cannot get it to come up to a good level.
Oh, what if I do this?
No, we'll have to figure it out during the first music set, I think. Live radio.
Live radio, I love it.
All right.
Marlee, why don't you tell us why you named the show this kind of "Instruction Assessment Check-in" weirdness?
Because I couldn't call it "Instruction Assessment Assessment."
I think you could have done that. All right, we're not doing an official assessment. We're just talking. As everyone can tell, this is a very loose panel show today.
And I am delighted that we have multiple assessment experts joining us.
I know. It's a full studio. Our songs today will be about keeping track, limitations of what we can know, and our expectations. And we start with a song-- hey, Fred, I just thought of something. Turn off the operator mic and see if that changes things. How about now?
No, it doesn't change anything.
We start with a song about getting the best information and making crucial decisions. And wouldn't you know it, it's a song called "Assessment." This is "Assessment," by the Beta Band, right here on Lost in the Stacks.
[MUSIC PLAYING]
That was "Assessment," by the Beta Band. Our show today on Lost in the Stacks is "Instruction Assessment Check-in."
And joining us today is Georgia Tech Library's assessment librarian Matt Frizzell. Hi, Matt.
Hey there.
OK, checking in on your mic, that sounds good. Charlie, let's check in on your mic. CHARLIE BENNETT: I'm scared, too. OK, Marlee sounded good. Matt sounded good. Charlie sounded good.
Let's do another sound check with the other person.
Hello, folks.
Doshi.
Sorry. Hi, everyone.
[LAUGHTER]
Hey, everybody.
Ameet Doshi is also in the studio.
He is. He is here being the Ameet.
Longtime listeners are ecstatic. New listeners have no idea why that's important.
So just to bring everyone up to speed, the reason Matt is here is that Matt and I worked on a team to develop a new survey for evaluating instruction in the library. And we launched it as a pilot in spring of 2023. So now we have about a year's worth of assessment data to look at. But I have to say it feels like we're still just getting started.
Yeah, it's kind of amazing. We have over, I believe, 1,500 responses now from students, and they're amazingly good, which I have to say I shouldn't be surprised by. But I am-- just, there should be more outliers and--
Do you mean the ratings that people are giving, or--
Yeah, yeah. On a scale of 1 to 5, we're averaging 4.9. And so there's not a lot to criticize about what you guys are doing.
Yeah. And we designed it to be short and sweet, so that people would be more motivated to fill it out. So I don't know if you've compared it to the actual numbers of students, but I looked at my own results, and I felt like I was getting about a 50% completion rate, which is really good.
Yeah, it's really good. And to back up a little bit, one of the reasons we did this is in the library, we teach a lot of non-credit courses and workshops. And of course, for the credit courses, you have CIOS. And for all of us doing the non-credit courses, we don't have a tool like that to get qualitative data. So this is our way of checking in and saying, are we doing what we're supposed to be doing? Are we meeting the needs of the students? Are the students happy? And all that great stuff.
So professors have CIOS, which is C-I-O-S, and I can't remember what that stands for.
Course Instructor Opinion Survey, I think.
And that's for the professors that give grades A through F, whatever. Library classes are, by nature, a little less structured than that.
Right. Yeah, so we have drop-in classes. For the folks listening at home, you've probably seen Marlee or Charlie or Fred in your class, in English or one of the core classes, speaking about plagiarism, how to do research, all that fun stuff. And those are part of a credit class, but not actually graded or for credit.
Yeah, and we're not even guest instructors so much as we're visitors. We're not recorded in the CIOS scores.
Right.
I want to read the questions that you all put together on the survey. Because I've got the results of mine right here. Maybe we won't read it anytime soon.
Oh, yes, let's.
So on a scale of 1 to 5, that is to say, from strongly disagree to strongly agree, do you, the survey person, think that the instructor demonstrated expected knowledge of the course content? Then I have to flip past my results to get to the next one. The instructor was effective. The content of the workshop met your expectations. And then there's another one. Fred, do you have it? Or Matt, can you remember off the top of your head?
I can apply immediately what I learned at this workshop.
There it is. And then, would you recommend it to your peers? Did you all have to really workshop those questions? How long did it take to put that together?
It took a while, yeah. Because, I mean, honestly, I think we started in 2022. And it all came out of the library promotion review committee. We wanted to figure out, what are the CIOS scores? What is their role in the promotion process? And then that led to, well, why don't we have something that evaluates all the other instruction that we do, so that we get credit for that?
And then, yeah, we put a committee together, and we tried to figure it out, with CIOS as one model-- and then I think we brought in some other ideas, like net promoter scores and other questions that we thought we might want to ask. I know we talked to some outside experts. But yeah, we had several meetings, and we brought it before the rest of the library faculty and had them weigh in on the actual wording.
But yeah, again, we wanted to keep it to five questions to try to get more people to take it. MATT FRIZZELL: Generally speaking, the shorter your survey, the more likely folks are to complete the survey. So that's one of the reasons we wanted to do this. If anybody out there is listening and is doing anything similar, please reach out to us. We want to see what you're doing, what your results have been, your successes, failures. We did speak to some other folks on campus.
And nobody's really doing this comprehensively. They'll do it on occasion for a workshop for something they're doing for a short period of time. But they're not doing it longitudinally, like we are. So it's pretty cool.
So we're going to dig into that more in the second segment. This is Lost in the Stacks. And we'll be back with more on library instruction assessment after a music set.
File this set under LB3051.B66.
Wait a minute. You know better than that.
Dot.
[MUSIC PLAYING]
(SINGING) 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
I love it. I love it.
The "Pinball Number Count," I love it, I love it, I love it, by The Pointer Sisters, from Sesame Street. And "Hey, Who's Keeping Track" started us off, by Tyler Burkhardt. Those are songs about keeping track of things in both quantitative and qualitative ways.
[MUSIC PLAYING]
This is Lost in the Stacks. And today's show is called "Instruction Assessment Check-in." And before that music set, we were talking about the actual questions that we put in our survey and how we came up with them. So I wanted to talk a little bit about what we're not asking. So we're not asking, I think, some basic questions, like, Are the students learning what we teach them?
Yeah. And I mean-- that's supposedly what grades are, right, guys?
Yeah.
I guess. But--
We don't have access to those, right?
We don't have access to those. And do we ask the students-- are we ourselves a good judge of whether we're learning? Do we know what we don't know? So yeah, that's a big one. CHARLIE BENNETT: Matt, you make it sound like you would like to know, and it just doesn't seem like there's any way to. Well, that's not exactly right. But I think, Charlie, it's hard to do in the short amount of time that we're talking about. We're talking about workshops and drop-in classes.
Over a semester, you can definitely gauge if a student's learning. Over four years, you can really judge if somebody's learning. But in an hour, that's tough. I mean, what do you think? Do you think you teach a lot?
Well, I think you can tell. I asked my question too simply. I feel like it's easy to tell if a student has learned if you have time to observe them across the whole class, not the whole class session, but the whole course or--
The whole semester.
--if you do a pre-assessment-- I have never been able to manage doing a pre-assessment before I start the class, then do the class, and then do a post-assessment. Less and slower was the advice that our cold open declared, and I have done so much less and less and less. I can't put a five-minute survey and a five-minute survey at the front and back. Sorry, I just went off on a spiral there. I think it's privacy. We can't track students after we do a single library or archives session.
I think that's the big problem, right?
Yeah, I mean, we could technically be invited into the online course. So there is a librarian role in our course management system, which is called Canvas. But that takes developing a relationship and being invited by the professor into the class. And even then, we don't really know, because it's all correlation. We're not running an experiment-- a pre-test, post-test, as you just mentioned, that would be an actual gauge of learning.
Whereas, if a student makes whatever grade on the assignment that we're coming in to help them with, how do we know that was us?
And then-- and this is a bit of a tangent-- but one project I've always wanted to do is, how much do libraries contribute to learning, to student success? In order to do that, there are so many things, so many experiences and outside influences, that are affecting a student. Where is the line between that and invading somebody's privacy-- tracking all their classes, and do they live on campus, and how much time are they spending in the library?
And so we've got to respect, obviously, the privacy of the students and the ethics of our profession. CHARLIE BENNETT: And a control group would be terrible in that because you'd have to tell some class, you may not use the library. We've got to see if you don't learn as well as your cohort.
[LAUGHTER]
Yeah, that's not going to fly here. Well, another thing that we're also not engaging with this survey is, are we as instructors improving? And honestly, I don't know how we could improve over like a 4.9 out of 5 anyway.
Isn't this appreciation, though?
It is. CHARLIE BENNETT: It's not competence. It's how well we impress the students. It's a satisfaction, like a comment card with a thumbs up, basically.
And I'll go one further. And maybe this is controversial to say, but are the CIOS scores, a lot of times, a popularity contest? CHARLIE BENNETT: Oh, likeability.
Yes. Yes.
And then you're getting into the whole spectrum of bias and these other factors.
I worry that it becomes a customer survey. I mean, there's such a danger of higher education becoming a customer and vendor relationship. And then it's like, were you pleased with your product? You asked, does the library help learning, right?
Yeah. CHARLIE BENNETT: And I think we're so indirect in so many ways. We can say, here are some skills. Please use them when you have a research question. Or here's a particular amount of discipline in terms of sorting information and assessing it. But then you can't really tell someone, so how did you use that advice we gave you to check your sources? And then, if you want to slice it even further, if you're coming into a library, you're using the space.
Is the quiet area beneficial? How much of that is really value we're adding? Is it the advice the librarians are giving you? Is it the resources? So you need to dig a little bit deeper. And that's true of assessment. With almost any subject you're assessing, there's context and multiple layers to it.
You're listening to Lost in the Stacks. And we will talk more about library instruction assessment on the left side of the hour.
["LOST IN THE STACKS" THEME MUSIC]
[SHELLAC, "YOU CAME IN ME"]
Hi, this is Steve Albini. I'm a recording engineer, and I'm in the band Shellac of North America.
[MUSIC PLAYING]
And you are listening to Lost in the Stacks on WREK.
Our show today is called "Instruction Assessment Check-in." Here at Georgia Tech, faculty members who teach for-credit classes automatically receive course evaluations from students through the Office of Academic Effectiveness. Those evaluations are official and get used in reviews and promotion dossiers. Now, librarians and archivists teach a lot. But most of our instruction in the library is not for credit, for us or for the students.
So we don't get any official evaluations through the Institute. Part of why this evaluation survey was developed by Marlee and Matt and a team of others-- but you all get the credit here-- part of why they developed the survey for library instruction was to get credit for the kind of instruction that we do as librarians and archivists, so we can show people what people thought of us. But we should acknowledge that using a student course evaluation as a model for the survey had risks.
Student evaluations are notoriously problematic, especially when they're used to evaluate faculty for promotion and tenure. To get a sense of these problems, here are just some headlines from articles that Marlee found. FRED RASCOE: "Student evaluations, a poor assessment of learning experience." "Superficial assessments hurt professors and students, but reform is hard."
"Student teaching evaluations are racist, sexist, and often useless."
Goodness. In fact, at Ryerson University, the Faculty Association brought the matter to arbitration. The 2018 decision set a precedent for other faculty associations to keep these evaluations from being used for the purposes of measuring teaching effectiveness for promotion or tenure. Arbitrator William Kaplan describes student evaluations as--
Imperfect at best and downright biased and unreliable at worst. CHARLIE BENNETT: With this in mind, it's important for librarians and archivists in the classroom to define effective teaching and hold ourselves to that standard, regardless of the numeric scores or nice comments we earn from student surveys.
File this set under BV 4598.2H33.
[MUSIC PLAYING]
FRED RASCOE: "Limitation," by The Gems, '70s power pop for you, and before that, "Grey Area," by Dumb. Those are songs about the limitations of what we can know or find out.
[MUSIC PLAYING]
This is Lost in the Stacks. And our show today is called "Instruction Assessment Check-in." And we've been talking about the library instruction evaluation survey that we've been using for about a year.
So you're a year in. Has it done what you wanted it to do? Are you going to do something with this data? What's next?
So that's the big question, is we collect all this data, which is good. But what do we do with it other than just enjoy the warm glow of the data?
And I think that's why we are not really worried about it except to say, How are we going to use it? Because it has all been so positive.
Wait, not worried about it, meaning you're not thinking about what to do with it?
There's no action to take because no one is telling us that we need to fix anything. So it's really a matter of, well, can we use this for promotion? Can we use this to show the library's impact? Can we use this for any other purpose?
Yeah, I felt like if the students were saying "I did not learn" at a high rate on this survey, then we would have to take immediate action. As it is, we're at the point of, how do we use this? Do we change it so that we get more critical feedback? Are we trying to shape instruction? Are we going to use this just primarily for promotion? I think we've got a lot of good data here. So that's the question. How are we going to use it? CHARLIE BENNETT: It's all very gentle.
I got a comment on one of my surveys that "he did a great job. He should get a promotion or a raise or a gift basket or something."
[LAUGHTER]
I love it.
I agree, Charlie. CHARLIE BENNETT: Thank you, dude. That's the level of strictness or rigor that's coming through this. Because we're asking students, after we do a course session, hey, did you learn that? Did you like it? Can you work with it now? How was the teacher? Anything you want to tell us? And it's like, yeah, everything was good. I'm delighted to hear that none of our colleagues are problem instructors, that there aren't outliers all the way through.
I will say we have not gotten feedback for everyone, though.
Oh dear.
So we have to rely on the instructor actually giving the survey to the students. So there's no survey that pops up automatically for the student to take.
This is back to it not being official?
Yeah.
When we talk about the official evaluations for the classes for credit, an email goes out from a Georgia Tech Office of Academic Effectiveness, and the students get hassled. Fill this out. Fill this out. Fill this out. I think there's even occasionally the sense that maybe it has something to do with how you'll get your grades if you just fill this out, not that that's true. But we are just doing this ourselves. This is like a DIY.
We put a QR code on a slide and say, please do this.
I do love the QR codes, though.
And like a lot of these surveys, it's self-selecting. If you've got something positive to say, you're going to say it. If you've got something really negative to say, you want to say it. But there's not anything in it for the students to fill this out unless they have a strong opinion. So we're missing a lot of that middle area, I feel like, where there may be juicy bits of information that could help us.
I had one very effective moment coming out of the evaluations. I had a comment where someone said, I would have preferred some color in the PowerPoint, but I was really sleepy-- again, not a lot of rigor-- but I looked at that. And then I found another comment that was something like, pretty spare presentation, but useful. And I put together-- oh, my preferred way of doing my PowerPoint, it's--
It's very barebones.
Yeah, it's less than the students want. So I've started adding colorful bits, and maybe some emojis. I haven't gotten to emojis yet, but I think I might have to find a young person to explain to me how to use those.
[LAUGHTER]
I hear they like Pokemon.
Oh my god. We could do a whole show on Pokemon, now that one of my children has discovered it.
I'm showing how old and out of touch I am. So that was a joke.
No, I'm going to a Pokemon night at a comic book shop tonight--
Oh, nice.
--with my children because they love it.
Cool.
Yeah, I think so. I haven't been yet. We'll find out if it's cool.
Yeah. Hopefully, there are other children there and not just them and--
Oh dear. MATT FRIZZELL: --50-year-old men. Like a Britney Spears concert? OK, let's move on. I don't know--
This got off track a little bit.
Sorry about that.
No, no, that was good. I like the path that we took. So what is next in terms of how you frame it? You were talking about the data. You don't really have a "we're going to use it" yet. It's not really affecting anything. But what's the next piece? Because it's still a pilot.
It is still a pilot. And that's really just because-- I mean, the original purpose for the survey was to see if we could attach this to our promotion package the same way that a professor would attach their CIOS scores. CHARLIE BENNETT: That seems tactical for our promotion reviews. Yes. Yes. And so we've had to wait until this coming year, where someone may potentially be going up for promotion who would have a report to attach.
And then, because no one is forcing them to do it, I'm having to try to persuade them, please, we want to see how this goes through the review process. Because eventually, those go up to the campus level. And we don't know if the campus is going to take this seriously and take it as seriously as they do any other teaching evaluation until it goes through that process. And we won't know that for another year.
And I think another aspect of this, besides evaluating if students are learning-- which, based on the scores, it would seem they are-- is showing value for what we do. Once we get enough of these scores, we can start to use them in impact reports, materials that we put out to the public, saying, hey, not only are we doing this, but we're doing a good job. So I think that's important. CHARLIE BENNETT: It seems like it might be more useful in the aggregate than the individual surveys.
Correct. Yeah, I mean, if we have 50, and they all say we're doing a great job, that's terrific, but it could be an outlier. If we have 4,000 over two years, that's impressive. And I think it really does speak to what a good job we're doing.
Fred, optimism. How about that?
OK, I'll allow it.
This is Lost in the Stacks. You've been listening to our "Instruction Assessment Check-in" panel. I'm delighted that we had visitors in the studio. And I think, Ameet, you should--
File this set under RA785.884.
[KATHY STACK, "EXPECTATION"]
[THE ROLLING STONES, "NO EXPECTATIONS"]
That was "No Expectations," by the Rolling Stones. And we started with "Expectation," by Kathy Stack, songs about managing our expectations.
[MUSIC PLAYING]
Today's show is called "Instruction Assessment Check-in." And since spring semester is over and summer is on its way, I thought we could all share one thing we want to work on over the summer. Fred?
Well, so I've come across some old minidisc recordings from many years ago. So I thought maybe I'd work on transferring those to some other digital form you can actually listen to without having to have a minidisc player.
All right, Charlie?
I'm going to do more drugs.
[LAUGHTER]
All right.
All right. Matt?
How about you? MATT FRIZZELL: Well, you took mine. So--
[LAUGHTER]
--I'm going to wimp out here and say, I'm going to grow, and I'm going to nurture things.
Oh, nice.
I'm going to work on my yard. I've got some dead things that have been staring at me, trees and plants. I'm going to dig those up and replant them. And hopefully, next year, I'll be able to enjoy a nice, green, growing lawn. CHARLIE BENNETT: Don't forget to do a pre-work assessment of the yard so that you can then refer back. MATT FRIZZELL: I will. And I'll ask them to evaluate me as a gardener.
Marlee?
Oh, that's great. Oh, I'm going to try to make future me happier by building some templates.
[LAUGHTER]
That's a very librarian take on things.
I know. I know. I had to get at least one library thing in there.
I think we should all do everything that we just said. Roll the credits.
[MUSIC PLAYING]
MARLEE GIVENS: Lost in the Stacks is a collaboration between WREK Atlanta and the Georgia Tech Library, written and produced by Alex McGee, Charlie Bennett, Fred Rascoe, and Marlee Givens. Legal counsel and-- wow, I don't think we should talk about what Phil gave us this time-- were provided by the Burrus Intellectual Property Law Group in Atlanta, Georgia.
Special thanks to Matt for joining us on the show today, and to all the students who have given us feedback. And thanks, as always, to each and every one of you for listening.
Our page is library.gatech.edu/lostinthestacks, where you'll find our most recent episode, a link to the podcast feed, and a web form, if you want to get in touch with us and, I don't know, assess this show.
Yeah. And on next week's show, we will talk to an archivist who is making their profession a better place for people with disabilities.
All right, it's time for our last song today. And I will be taking that duty from Fred on this announcement. This week, an infamous and magnificent recording engineer passed away. Steve Albini was a guitarist and singer for Big Black, Shellac, and several other bands I won't mention on the radio. He recorded maybe 1,000 albums, some of them for bands you've never heard of, and some of them for world-famous musicians like Nirvana, Bush, Robert Plant and Jimmy Page, Pixies, and PJ Harvey.
He might have been used to interviews with national news outlets and being the subject of features in music magazines. But he also spoke with me for an episode of Lost in the Stacks, all about being a recording engineer and the importance of the master tapes. He was, by all accounts, a great friend to have and a great opponent to face. So let's play a song for Steve Albini. This is "All The Surveyors," by Shellac, from the album Dude Incredible. And it's sort of about assessment.
And Fred, you did cut off the beginning of the song, right?
Yeah, we go straight into the song. We cut out the intro bit.
Good. A lot of people don't know what we're talking about, but the Shellac fans know.
Have a great weekend, everybody.
[SHELLAC, "ALL THE SURVEYORS"]