May 19, 2020

Ask Bigger Questions of your Course Evaluation Data

This content was previously published by Campus Labs, now part of Anthology. Product and/or solution names may have changed.


As we look back on the first half of 2020, one thing is certain: the sudden and unexpected transition of so many college campuses to remote instruction has impacted student learning. Now, as campuses decide whether to use an open, remote, or hybrid teaching model for summer and fall, they’re asking what they can learn from the Spring term. Course evaluations are a big part of that conversation. Our data shows that many campuses are adapting how they use course evaluation results: some are omitting Spring 2020 course evaluation data from tenure and promotion consideration, while others are giving faculty the option to share results with their chairs. Whatever adjustments you make, don’t miss this critical opportunity to use your data to capture how student learning developed. So, how do you do that? Ask bigger questions of your course evaluation data, and you’ll gain greater insights.

If your analysis of course evaluation data has historically focused on response rates and individual faculty or program interventions, you might be unsure of what questions to ask, how to interpret the data, and how to share it with the stakeholders who need to see it. We connected with two campuses that are ahead of the curve to gather their advice for institutions just getting started.

Asking the Right Questions to Guide Curriculum-Level Insight

The University of Rhode Island uses the IDEA Diagnostic Instrument for their course evaluations. With this standard set of questions in place, Sean Krueger, the Coordinator of Course Evaluations in the Office of the Provost, began to realize how distinct questions could be paired for greater understanding. For example, Krueger plotted responses to “My background prepared me for this course” against “Difficulty of subject matter” in a quadrant chart. A difficult course that students felt prepared for could represent a curriculum that successfully challenged students. A difficult course that students felt unprepared for, however, might be worth revisiting when examining prerequisites in the educational structure: an intermediary course could be missing, or the curriculum of prior courses might need to go further.
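If you can export results to a spreadsheet, a pairing like Krueger’s is straightforward to reproduce. Below is a minimal sketch in Python, assuming a hypothetical CSV of per-section mean scores; the file name, column names, and 1–5 scale midpoint are illustrative assumptions, not the actual IDEA export format.

```python
# A minimal sketch of a preparedness-vs-difficulty quadrant chart.
# Assumes a hypothetical CSV export with one row per course section and
# mean Likert scores for the two items; names below are illustrative.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("course_eval_means.csv")  # columns: course, prepared_mean, difficulty_mean

fig, ax = plt.subplots(figsize=(7, 7))
ax.scatter(df["prepared_mean"], df["difficulty_mean"], alpha=0.6)

# Split the plot into quadrants at the scale midpoint (3.0 on a 1-5 scale).
midpoint = 3.0
ax.axvline(midpoint, color="gray", linestyle="--")
ax.axhline(midpoint, color="gray", linestyle="--")

ax.set_xlabel('"My background prepared me for this course" (mean)')
ax.set_ylabel('"Difficulty of subject matter" (mean)')
ax.set_title("Preparedness vs. difficulty by course section")

# Flag the quadrant worth a closer look: difficult courses that
# students felt unprepared for.
flagged = df[(df["prepared_mean"] < midpoint) & (df["difficulty_mean"] > midpoint)]
for _, row in flagged.iterrows():
    ax.annotate(row["course"], (row["prepared_mean"], row["difficulty_mean"]))

plt.tight_layout()
plt.show()
```

Sections landing in the difficult-but-unprepared quadrant are the natural starting point for a prerequisite review.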

Another example is Southern New Hampshire University, where Aaron Flint oversees course evaluation administration. Flint has seen the university’s online division embrace assessment of course evaluation results at an advanced level: a dedicated team incorporates the results into a larger data set and uses that data to inform curriculum strategy. Because course content in these courses is consistent from instructor to instructor, the team can use the data to evaluate potential changes to syllabi and course content.

Other questions you might explore on your campus include:

  • What strengths of remote learning does the data reveal when it comes to achieving learning outcomes?
  • What trends does the data show in students’ learning environments, and how do those environments impact learning?
  • Are there success stories and opportunities to celebrate campus resilience captured within our data?

Interpreting & Sharing Data

Both Krueger and Flint agreed that a compelling visualization is a powerful tool in these conversations. Because many campuses have yet to look at this data on a macro scale, visualizations can zoom the conversation out, making trends in the data easier to see. Krueger has been using visualizations for some time; one of his projects shows faculty how much setting aside in-class time for course evaluation completion improves both data collection and accuracy. Continuing down the road of data visualization was therefore an intuitive next step for him.

Flint has found other ways to interpret and share data results. Flint emphasized how important qualitative questions are in providing faculty genuinely actionable feedback, especially for online courses. While it may be faster to rely on multiple-choice questions when reviewing your course evaluation data at a macro scale, resist the temptation to end your analysis there. Dive into qualitative assessment to identify trends in student needs and concerns. The comments of respondents can often be the most impactful and persuasive aspect of any presentation on results, so be sure to weave these into your narrative.
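One lightweight way to start surfacing those trends is a simple keyword scan over exported comments. The sketch below is illustrative only, assuming a hypothetical CSV of comments; the file name, column name, and theme keywords are placeholders you would tailor to your own campus, not a prescribed taxonomy.

```python
# A minimal sketch for surfacing recurring themes in open-ended comments.
# Assumes a hypothetical CSV export with one comment per row; the column
# name and the keyword groupings below are illustrative assumptions.
import pandas as pd

comments = pd.read_csv("course_eval_comments.csv")["comment"].fillna("").str.lower()

# Simple keyword buckets for common remote-learning concerns.
themes = {
    "technology": ["wifi", "internet", "zoom", "connection", "audio"],
    "workload": ["workload", "too much", "overwhelmed", "deadlines"],
    "instructor presence": ["responsive", "office hours", "feedback", "available"],
}

# Count how many comments mention at least one keyword per theme.
counts = {
    theme: comments.str.contains("|".join(words), regex=True).sum()
    for theme, words in themes.items()
}
print(pd.Series(counts).sort_values(ascending=False))
```

A tally like this won’t replace reading the comments, but it can point you toward the themes worth reading closely and quoting in your presentation.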


Bethany Jacobs, Ph.D.

Senior Adoption Consultant
Anthology

Bethany Jacobs, Ph.D., joined Anthology (formerly Campus Labs) after ten years as a university instructor. She was a Marion L. Brittain Postdoctoral Teaching Fellow in the Writing and Communication Program at the Georgia Institute of Technology, where she also served as a project manager for the Center for Serve-Learn-Sustain, Georgia Tech’s premier sustainability initiative. In both positions, Bethany worked with faculty across multiple disciplines to improve teaching effectiveness. Before that, she was a career instructor at the University of Oregon, where she received her doctorate in American Literature. She earned her BA from Westmont College in Santa Barbara. With experience in small liberal arts colleges as well as flagship tech schools, she has first-hand knowledge of the shared and disparate experiences of faculty and administrators across university settings. She is committed to helping all her contacts leverage data to improve course and programmatic outcomes and to make the university experience more productive and satisfying for faculty and students alike.