February 7, 2022

How Peer Institutions Approach Course Evaluation Response Rates

Response rates are often top-of-mind for those of us who are close to the course evaluations process. After all, survey responses are necessary to understand students’ perceptions of their learning experience. With this in mind, I decided to survey current clients of the Anthology Course Evaluations platform to determine what influences response rates and why some institutions receive more feedback than others. 

One of the first things I discovered is that measuring response rates is not a one-size-fits-all process. Some institutions require reaching a certain percentage; others are just happy that students are responding in some way; some have sequenced and intricate plans, while others just let the process occur organically. I segmented Anthology Course Evaluations clients into three groups based on response rates over 20 months: low (10-25%), middle (26-50%), and high (51-99%). 
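For readers who want to apply the same segmentation to their own data, here is a minimal sketch of the calculation. The function names and the example numbers are illustrative assumptions, not part of any Anthology API; only the three percentage bands come from the study above.

```python
# Illustrative sketch: compute a response rate and map it onto the three
# bands used in this article (low 10-25%, middle 26-50%, high 51-99%).
# Function names and sample figures are hypothetical.

def response_rate(responses: int, enrolled: int) -> float:
    """Percentage of enrolled students who submitted an evaluation."""
    if enrolled == 0:
        return 0.0
    return 100.0 * responses / enrolled

def band(rate: float) -> str:
    """Assign a response rate (in percent) to the article's three segments."""
    if rate <= 25:
        return "low"
    if rate <= 50:
        return "middle"
    return "high"

# Example: 180 responses from 400 enrolled students
rate = response_rate(180, 400)
print(f"{rate:.1f}% -> {band(rate)} segment")
```

One practical detail, echoed later in this article: the denominator should count only actively enrolled students, since leaving withdrawn students in a course deflates the rate.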

To better understand the decisions driving course evaluations and their fulfillment, I sent an email to the three groups with the following four questions: 

  1. Did you have goals for your response rates?

  2. Did you establish a plan and implement different strategies?

  3. What worked?

  4. What did you try that may not have worked as well?

Below are the lessons learned from 17 responding campuses.

Goals for Response Rates

In terms of goals, some clients indicated they hope and dream of increasing response rates, some aspire to maintain the same or a similar average as the last semester, while others expressed interest in achieving a 60-75% response rate or simply acquiring some usable data. According to users, the pandemic has had a slight impact on response rates in both directions. 

Some campuses experienced a decrease in response rate, prompting them to ask, “What can we do to make the process better, so our response rate increases?” Other campuses are looking to integrate Course Evaluations with their LMS and other systems to improve exposure to the faculty and students. In some cases, campuses are looking to incentivize, where applicable, and update the questions they are asking. 

Plans and Strategies 

Examples of strategies applied by institutions to improve response rates: 

  • Monitor response rates every morning during the main survey window 
  • Engagement through email 
    • Add more reminders at different times of the day 
    • Change the language in the emails to faculty, reminding them by asking, “Have you checked your response rates today? Please put time aside in class for completing the evaluations.” 
  • Share response rate information, like a news brief to students and faculty, to remind them to participate in the process 
  • Encourage faculty members to discuss the importance of the evaluations with their class 
  • Create a website with recommendations and FAQs for faculty to reference 
  • Set response rate requirements for faculty 
  • Make response rates a part of the annual performance review and teaching culture 
  • To achieve a better end-of-term response rate, use mid-course evaluations to collect data so faculty can make changes to their delivery, share the information with students, and change what isn’t working well 

What Worked Well

  • Incentives
    • Free cafeteria passes for the courses with the highest response rates 
    • Random prize lottery  
    • Extra credit points to make up for a poor quiz grade 
    • Food 
      • Donuts for the class that completes all evaluations first for lower enrollment courses; pizza for the class that completes all evaluations first for mid-range enrollment courses 
    • Gift cards 
      • Instructors with response rates of 90% or more are entered into a drawing for a gift card. Three entries allowed per drawing. This motivates instructors to encourage students to complete evaluations. 
    • To encourage friendly competition throughout teaching departments, a trophy is handed down. The department that earns the highest response rate for the semester earns the trophy. 
  • Reducing the number of other surveys students are required to complete decreases survey fatigue during the main survey period 
  • Knowing your data - spring and summer semesters typically have a lower response rate than the fall 
  • Providing time in class to complete evaluations (just as time was once given for paper evaluations) 
  • Encouragement from faculty to complete evaluations 
  • Integrating with the LMS, so when students log in, they are instantly reminded they have a task to complete 
  • Health professions schools require students to complete evaluations as part of their professional code of conduct. The students are in a professional program and are taught that constructive feedback to a colleague is a professional duty to that person and their profession. 
  • Engaging departments and department chairs 
    • Departments set a target rate of their own 
    • Maintaining awareness through conversations/meetings with key stakeholders at the right time 
  • Removing withdrawn/non-enrolled students from courses to improve response rates  
  • Content marketing 
    • Campus signage, flyers, posters, slogans, hashtags, videos, newsletters, best practices webpage, and PowerPoint presentations  
    • Creating a YouTube channel 
      • Short videos for use in group email messaging 
        • Showing students how to provide constructive feedback 
        • Navigational videos for faculty and students 
  • Social media marketing 
    • Promotion of course evaluations by the institution on Facebook and Twitter 
    • Other social media platforms for ads/ad campaigns 
    • Example: one campus had a low response rate, but when their communications director implemented a social media campaign, the rate jumped 12% 
  • Email marketing 
    • Timing messages to go out when students might be able to complete them right away 

What Didn’t Work as Well as Expected

  • A shorter email response time window caused students to ignore the messaging 
  • Adding or recommending extra credit incentives appalled some faculty 
  • The impact of the pandemic on response rates: 
    • Too many COVID-19 check-in surveys added to survey fatigue for students 
    • Response rates were impacted for some campuses as they couldn’t get in front of students to talk about the importance of the process and how the data is used 
    • The importance of the evaluation process isn’t being emphasized right now due to pandemic concerns 

More Key Findings

  • Close surveys before final exams to keep responses on a friendlier level 
  • Share the data and how it is being used! Share with institutional research, department chairs, faculty, students, and anyone else who will listen 
  • Make the data a part of the campus culture 
  • Adding a mid-course evaluation helps faculty and students improve teaching and learning. Students are very honest in their feedback when they can see the changes being made. Students complained when faculty didn’t do a mid-course evaluation.  
  • Add a survey or use the feedback option for faculty reflection 
  • Effective practice is to have multiple people involved in the process 
    • Get in front of students using an educational campaign 
    • Encourage people on the team to be the process cheerleader

While there is no single solution to increasing course evaluation response rates, there are multiple approaches every institution can consider based on its goals and resources. If raising response rates is your aim, there are many levers to pull to change behaviors.


Michele Borucki

Principal Adoption Consultant
Anthology

Michele Borucki, MS in Creative Problem Solving, has been working with the Anthology Course Evaluations product since 2002. As a founding member of the Course Evaluations team, Michele has helped hundreds of clients navigate the software, update their processes, and get the results they need. Her specialty areas include experiential learning, creating unique processes to meet campus needs, and helping campuses understand their data. Along with training, she is also involved in high-level support needs, process analysis, support center documentation, and release quality assurance (QA). She obtained a bachelor’s degree from The State University of New York at Fredonia and a master’s degree from The State University of New York at Buffalo. After completing her master’s degree, Michele began her career and worked for several years as a content trainer in the health insurance industry.