Campus Conversation

Transforming Teaching and Learning through Student Outcomes Assessment

In July 2020, we spoke with two institutions in Florida, Palm Beach State College and Palm Beach Atlantic University, about how they were using the same technology in different ways to promote assessment and continuous improvement on campus before and through the start of the COVID-19 pandemic. Roughly a year later, we were excited and grateful for the opportunity to reconnect with Palm Beach State College and discuss the state of assessment on their campus. We’re happy to report that, despite all the challenges facing campuses today, Palm Beach State College is reaping what it sowed by building its assessment culture brick by brick before and during the onset of the pandemic. Below are some highlights from our conversation for your reference as you consider how assessment has evolved and will continue to evolve, as well as some best practices for your own journey.

Karen Pain is the associate dean of academic affairs on two of Palm Beach State College’s five campuses. For the six years prior to this position, she worked in Institutional Research & Effectiveness, previously serving as director of assessments and special projects.

Dave Weber is the executive director of the Office of Institutional Research and Effectiveness at Palm Beach State College. He previously served as executive director for Planning and Effectiveness at Lake Sumter State College in Clermont/Leesburg, Florida, and as chief of strategic operations and student affairs at Rochester Community and Technical College in Minnesota.

[Headshot: Karen Pain, Associate Dean of Academic Affairs, Palm Beach State College]

[Headshot: Dave Weber, Executive Director for the Office of Institutional Research and Effectiveness, Palm Beach State College]

Q: How has life changed in the assessment world of Palm Beach State?

Karen: In a way, I would say it’s taking care of itself. Right now, we are about 75% of the way through our assessment cycle, and we were without an official assessment director for six months because I had moved into another role. We have almost as many individual student results as we finished with last year, and we already have more assessments loaded. It looks like we pulled together enough momentum that it has almost been self-sustaining. My observation is that it’s essentially running itself right now, and I would attribute almost all of that to the implementation of Outcomes right before and then during COVID. I’m pretty excited about that.

Dave: I would say that we’re really looking at organizational performance in a variety of ways. Assessment results are, I think, something Karen and I would hope to report on more; it’s probably been an opportunity area, and I hope we can do more of that in the future. You know, when I say we track 12 outcomes, not all of them are currently tracked and analyzed inside of Insight. Many of them are, and I think that’s been very helpful to our planning effort and continued use of the Anthology solutions.

Q: One of the things that you both had mentioned now and in previous conversations is assessment culture. To what do you attribute building an assessment culture, not only technology-wise but from a behavior standpoint as well?

Karen: I can tell you that in my assessment world, “culture of assessment” became a dirty word for me very early on, quite seriously. In my first two months as director of assessment, I had a faculty member come ask me: “Why are we trying to build a culture of assessment? We’re a teaching and learning institution.” It took me a second to grasp the heart of the matter, but what she communicated to me is something that has never left me.

We are an institution of teaching and learning, and we should have a culture of teaching and learning that includes assessing. We can’t teach well, and we don’t know how well students are learning, if we’re not assessing. So, to build a culture of teaching and learning, we really need to look hard and close at how we’re assessing and how we’re using those results.

I think it’s hard to talk about this outside of the technology, because once we got Outcomes, it became a lot more possible to talk about why we are doing this in the first place. We’re not just checking a box if we do it for the right reasons; the box is going to check itself, and it will be golden. But we need to look at how we’re assessing, figure out whether it’s giving us the information we need, and then use those results. I think any success we’ve had here is directly attributable to that. The numbers have grown every year, but it’s still a small percentage of our faculty who really buy into this. That share is growing and growing, and I’m super excited to see it. Nobody will ever convince me that this is anything other than trying to help faculty truly focus on teaching and learning and get better at it.

Dave: The college is focused on getting better. There’s a spirit of continuous improvement and quality that comes from being able to use data correctly, so all these things tie together. Are we there yet? I don’t know if we’re there, but I think we’ve got the right foundations in place and will continue to progress.

Q: One of the things that we're constantly working with campuses on is closing the loop on continuous improvement. How are you all putting the data to work for Palm Beach State?

Karen: One of the things I’ve really tried to emphasize with our faculty is getting them to first do the assessment and then review the results. I use graphics from Outcomes to show them that when all the faculty are on board and entering results, they get an immediate picture of what number and what percent of their students successfully met a benchmark. They see that immediately. Going beyond that, if they’re all using it, it gives them the opportunity to roll that data up as a program or as a department and have meaningful conversations. If 68% of our students are achieving an outcome, is it because of the way we’re teaching it? Was it an unrealistic expectation? Is it because we’re not even focusing on it and we’re doing other things in the class? It paves the way for those kinds of conversations. It also gives an individual faculty member the opportunity to see whether they are doing something differently than other faculty.

That’s the sales pitch I presented to programs and departments when I was working with faculty, and that’s where I was getting the buy-in, because it was visual and tangible. If we all agree on an assessment, we all do it, and we administer it the same way, it gives us more time when we meet to actually talk about the curriculum and whether we have chosen good outcomes. Are they measurable? Are students achieving them? Do we have the right outcomes for the curriculum? Is the curriculum right for the outcomes? All those questions are made possible by assessment, and those conversations were not happening my first couple of years because it was all about checking the boxes. That started changing little by little when we began focusing on teaching and learning and then plugging it into Outcomes to help us do that. We had some good growth among the smaller group of faculty who really did buy into the process.

Dave: It’s unfortunate that we’ve experienced what we have over the last 15 to 18 months; COVID has clearly set us back in some regards. I don’t know whether we would have performed better otherwise, but I think there are two things that separate great organizations from good ones. Good organizations tend to focus operationally on what’s in front of them. Great organizations are able to balance a focus on the future with a focus on the operational. I think as a college we’ve been too stuck on the operational, though I think we’re moving in the right direction, especially as we continue to expand our influence across the institution in terms of assessment, program and services review, continuous improvement, and the use of results.

Q: Karen, in the last webinar you left us with some final thoughts and lessons learned: start small, have resources ready, get visuals early and often, and use virtual presentation technology for training. Do you have any additions or amendments to those lessons learned or best practices from an assessment standpoint?

Karen: I might add that resources are perhaps the most critical; they’re what has carried me through these six months. When someone reaches out, in most cases I’ve been able to send them a link to something I made and say, “The screenshots might not match exactly because they’re from an earlier date, but you can basically follow this.” And then I don’t hear from them again, except for a quick thank-you a week later.

Our sincere thanks to Karen Pain, Ph.D., and Dave Weber for sharing their experiences with us. If you’d like your campus to be showcased, reach out to your consultant.