A Continued Comparison of Course Evaluation Qualitative Data During COVID-19
Course evaluation data continues to provide comparative insight into student perceptions before and during the pandemic. Over the last few years, we have reviewed student feedback to gain a better understanding of the impact of COVID-19 within the classroom.
In a previous blog post, Comparing Course Evaluation Qualitative Data During COVID-19 (Benton, 2020), we offered a comparison of qualitative data from student-written comments across the two-year period for course evaluations conducted in spring 2019 and spring 2020. As the pandemic continues to influence higher education practices, we found it appropriate to continue monitoring for changes in course evaluation results conducted through spring 2021.
After another full pandemic year in the classroom, our latest data indicate still no meaningful differences in average word count, the proportions of negative and positive words, the content of word clouds, sentiment, linguistic complexity, or readability. Although the differences are not significant, we did identify word usage patterns of interest. For example, “covid” did not appear in spring 2019 evaluations, but it began to appear in 2020 and 2021.
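The summary statistics above can be sketched in a few lines. The following is a minimal illustration, not the original analysis pipeline: the tiny positive/negative lexicons and the sample comments are placeholders chosen for demonstration only.

```python
import re

# Illustrative placeholder lexicons, not the lexicon used in the analysis.
POSITIVE = {"great", "helpful", "flexible", "clear"}
NEGATIVE = {"stressful", "confusing", "overwhelming", "covid"}

def tokenize(comment):
    """Lowercase a comment and split it into word tokens."""
    return re.findall(r"[a-z']+", comment.lower())

def summary_stats(comments):
    """Average word count per comment and positive/negative word proportions."""
    tokens = [tokenize(c) for c in comments]
    total = sum(len(t) for t in tokens)
    pos = sum(1 for t in tokens for w in t if w in POSITIVE)
    neg = sum(1 for t in tokens for w in t if w in NEGATIVE)
    return {
        "avg_word_count": total / len(tokens),
        "pct_positive": pos / total,
        "pct_negative": neg / total,
    }

stats = summary_stats([
    "The instructor was great and very helpful.",
    "Remote weeks felt stressful and confusing at times.",
])
print(stats)  # avg_word_count 7.5, pct_positive 2/15, pct_negative 2/15
```

Comparing these statistics year over year is what allows the claim that student feedback has remained stable.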
The proportions of positive and negative words, which reveal students’ general feelings, were also not meaningfully different from the past. As shown, the same words are among the most frequent positive and negative words in spring 2021 for the third year in a row.
Word clouds of the top 100 positive and negative words remain broadly similar overall, but the negative words “covid” and “pandemic” now appear where they were absent in spring 2019 course evaluations. Each of the word clouds below positions negative words at the top and positive words at the bottom.
While the words “covid” and “pandemic” show the most prominent increase in frequency, 12 words fell outside the top 100 words ranked by frequency in 2019 yet appeared in the top 100 lists for both 2020 and 2021. Many of these words are unsurprising: accessible, covid, flexible, grateful, improvement, overwhelming, pandemic, smooth, stress, stressful, struggled, and support. The numbers in the line charts below denote each word’s frequency rank in the corresponding year.
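The rank comparison described above amounts to a set operation: take the top-N words per year and find those missing from the base year’s list but present in every later year’s list. Here is a toy sketch with invented counts and a small N; the actual analysis used the top 100 words per year.

```python
from collections import Counter

def top_n_words(counts, n):
    """Return the set of the n most frequent words from a Counter."""
    return {word for word, _ in counts.most_common(n)}

def newly_prominent(year_counts, base_year, later_years, n):
    """Words outside the base year's top n but in every later year's top n."""
    base_top = top_n_words(year_counts[base_year], n)
    later_tops = [top_n_words(year_counts[y], n) for y in later_years]
    return set.intersection(*later_tops) - base_top

# Toy word frequencies per year, purely for illustration.
counts = {
    2019: Counter({"great": 9, "helpful": 7, "clear": 5}),
    2020: Counter({"great": 8, "covid": 6, "helpful": 5, "clear": 2}),
    2021: Counter({"covid": 7, "great": 6, "helpful": 4, "clear": 1}),
}
print(newly_prominent(counts, 2019, [2020, 2021], n=3))  # {'covid'}
```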
Despite the continued challenges of the COVID-19 pandemic and the changes in higher education classrooms that instructors and students experienced during spring 2021, the summary statistics are still very similar to the past.
Summary Statistic Table
In our most recent sentiment analysis using sentimentr (Rinker, 2019), we attempted to control for valence shifters that might affect the polarity assigned when detecting positive, negative, or neutral opinions. This analysis is explained in detail in our previous post, Comparing Course Evaluation Qualitative Data During COVID-19.
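sentimentr is an R package, so as a rough illustration of what a valence shifter does, here is a toy Python scorer in which a negator (“not,” “never,” “no”) flips the sign of the next sentiment word. The lexicon and weighting are placeholders and far simpler than sentimentr’s actual algorithm.

```python
# Placeholder polarity lexicon; not sentimentr's lexicon.
LEXICON = {"good": 1.0, "helpful": 1.0, "bad": -1.0, "stressful": -1.0}
NEGATORS = {"not", "never", "no"}

def sentence_polarity(sentence):
    """Average word polarity, flipping the sign after a negator."""
    words = sentence.lower().replace(".", "").split()
    score, flip = 0.0, 1.0
    for word in words:
        if word in NEGATORS:
            flip = -1.0          # the next sentiment word is negated
        elif word in LEXICON:
            score += flip * LEXICON[word]
            flip = 1.0           # the negation is consumed
    return score / len(words)

print(sentence_polarity("The course was helpful"))      # 0.25
print(sentence_polarity("The course was not helpful"))  # -0.2
```

Without this kind of handling, “not helpful” would be scored as positive because it contains the positive word “helpful.”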
Summary Analysis Table
Despite another year in which the global pandemic shifted the culture of higher education, we still find no meaningful difference in the polarity and sentiments expressed in student course evaluations. While this may be surprising, it supports the notion that faculty and students have successfully navigated the continually changing experience of pandemic-influenced learning environments. Students continue to share course evaluation feedback similar to their pre-COVID submissions, a trend we expect to continue.
As the data show, higher education remains resilient and continues to deliver on learners’ expectations for the student experience in the classroom.
1. Benton, S. (2020, October 12). Comparing Course Evaluation Qualitative Data During COVID-19. Anthology Blog.
2. Rinker, T. W. (2019). sentimentr: Calculate Text Polarity Sentiment (version 2.7.1). http://github.com/trinker/sentimentr