November 12, 2018

Student Success as a Measure of Effectiveness

This content was previously published by Campus Labs, now part of Anthology. Product and/or solution names may have changed.

Three Questions That Can Help Measure Student Success

As a faculty member, I always found it tempting to rely on my own personal measures of student success. I could easily point to performance on graded assignments or exams, using these as metrics to tell myself a story about my students’ likelihood of success. I would look to the “experience” of the course or program: were students having what I considered to be rich discussions? Did it feel like they were engaged? The problem was that at the center of these measures stood one person: me. I viewed effectiveness as a concept that began and ended within my sphere of influence.

My experience as a higher education professional tells me that I’m not the only well-intentioned faculty member or program coordinator who treated student success more as a mirror than as a window into my department and my institution as a whole. But if we shift the viewpoint, we can place student success at the heart of institutional effectiveness.

We can achieve this shift in viewpoint by leveraging data to understand the experiences of our students and by asking ourselves three key questions:

  1. What data do you have?
  2. What data do you need?
  3. What data can you generate?

What data do you have?

Typically, the initial responses to this question center on traditional metrics of student success: retention from fall to spring semester, rates of persistence to graduation, and achievement and performance in key courses. These data points usually serve as our calls to action.

For example, the International Center for Supplemental Instruction (SI) recommends its method of academic intervention for especially challenging first-year courses, “especially those with a 30% or higher rate of D, F, and withdrawal (DFW) grades” (Stone & Jacobs, 2008). I used this as a guiding principle when coordinating SI on my campus.
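
To make that threshold concrete, here is a minimal sketch (not from the original article) of how a coordinator might flag courses that meet the 30% DFW criterion. The course names and grade rosters are hypothetical; a real implementation would pull grades from the student information system.

```python
# Illustrative sketch: flag courses at or above the 30% DFW threshold.
# Course names and grade rosters below are hypothetical.

DFW_GRADES = {"D", "F", "W"}  # D, F, or withdrawal

def dfw_rate(grades):
    """Return the fraction of final grades that are D, F, or W."""
    return sum(g in DFW_GRADES for g in grades) / len(grades)

# Hypothetical final-grade rosters for two first-year courses
course_grades = {
    "CHEM 101": ["A", "D", "F", "W", "B", "C", "F", "D", "A", "W"],
    "ENGL 110": ["A", "B", "B", "C", "A", "B", "C", "A", "B", "C"],
}

for course, grades in course_grades.items():
    rate = dfw_rate(grades)
    flag = "candidate for SI" if rate >= 0.30 else "below threshold"
    print(f"{course}: {rate:.0%} DFW ({flag})")
```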

What I didn’t consider at the time was what other data points were available. For each course in which we were ready to offer supplemental instruction, student demographic data was readily available yet unexplored. I was also working on a campus that had administered the Student Strengths Inventory, Campus Labs’ assessment that provides a snapshot of the noncognitive skills students bring to their college experiences.

A recent report of Student Strengths Inventory data provides some fascinating insights. In an analysis of over 750,000 respondents who opted to disclose their college generational status, 14% identified as first-generation students (those for whom neither parent nor guardian held a four-year degree). Further analysis shows that first-generation students outscore their “traditional” counterparts in the areas of academic engagement, educational commitment, and campus engagement.

While demographic and noncognitive data points were accessible to me, at the time I hadn’t thought to leverage them to inform my student success efforts. Consider taking a moment to ask: what data exists on your campus that is going unleveraged?

What data do you need?

If our institutional effectiveness efforts are guided by a culture of continuous improvement, then we arrive at this question with some regularity. We look back upon what we’ve learned, we identify gaps and areas for improvement, and we adjust our practices for the future.

If I had taken advantage of more of the existing data points in my previous example, I would have known the population of first-generation students enrolled in courses with high DFW rates, and I might also have known that these first-generation students were likely to be academically engaged and committed to their studies. It would be tempting to consider this the “aha” moment, but reflecting on the data, I realized there were still more data points I needed to intervene and assist students most effectively.

A report from the Georgetown University Center on Education and the Workforce found that “about 40% of undergraduates…work at least 30 hours a week” (Carnevale, Smith, Melton, & Price, 2015). If we value student success, we need to know data points such as these for our specific institutions. It does our campuses little good to offer solutions that students cannot easily participate in (for example, because of work commitments outside of their academic studies), and we would only know whether that was a barrier if we engaged in cyclical, reflective, data-driven processes. When we seek gaps in what we know about our students, we find more effective approaches to assist them.

What data can you generate?

If your work includes touch points with students, then you have an opportunity to generate data of value to your institution. I see this clearly when reflecting on my experiences coordinating academic success programs and tracking student usage of our services. My approach was decidedly low-tech and somewhat cumbersome: I asked students to sign in to SI sessions on paper, and then I compared their midterm and final course grades against those of peers who did not attend SI sessions.
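
As a rough illustration of that comparison, here is a minimal sketch, assuming attendance has been transcribed from the paper sign-in sheets and grades have been converted to a 4.0 scale. All records and numbers below are hypothetical.

```python
# Illustrative sketch: compare midterm and final grade averages for
# students who attended SI sessions versus those who did not.
# All records below are hypothetical.

from statistics import mean

# (attended_si, midterm_gpa, final_gpa) for each student in the course
records = [
    (True, 2.0, 2.7), (True, 2.3, 3.0), (True, 1.7, 2.3),
    (False, 2.0, 2.0), (False, 2.3, 2.3), (False, 1.7, 1.3),
]

def group_averages(attended):
    """Average midterm and final grades for one attendance group."""
    group = [(m, f) for a, m, f in records if a == attended]
    midterms, finals = zip(*group)
    return mean(midterms), mean(finals)

for label, attended in [("SI attendees", True), ("Non-attendees", False)]:
    mid, fin = group_averages(attended)
    print(f"{label}: midterm avg {mid:.2f}, final avg {fin:.2f}, change {fin - mid:+.2f}")
```

Even a simple before-and-after comparison like this hints at program impact, though a careful analysis would also need to account for self-selection among the students who choose to attend.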

What I didn’t realize at the time was that I was generating data points. The potential value of this data was vast: not only for gauging the efficacy of academic intervention efforts, but also for slicing it in a multitude of ways, all in alignment with institutional strategic initiatives, while taking appropriate steps to protect student anonymity. When we focus on the data we have and the data we need, we similarly begin to grasp the potential of the data we create.

By asking ourselves questions about data, we give our work a credibility that can incorporate, supplement, and exceed the readily available metrics we may have habitually relied on. As our institutions increasingly use analytics in their decision making, taking a data-informed approach allows those of us invested in student success to be part of these conversations. Using tools and thought leadership to guide us in this process allows us to go beyond our hunches when collecting student success data and to create a healthy analytics culture on our campuses. Asking questions gives us new contexts in which to consider our efficacy in our daily work, while simultaneously generating new insights about the students we serve: an endeavor well worth our efforts, with benefits for our students and our institutions alike.

References

  1. Carnevale, A. P., Smith, N., Melton, M., & Price, E. W. (2015). Learning while earning: The new normal. Washington, DC: Georgetown University Center on Education and the Workforce. https://cew.georgetown.edu/cew-reports/workinglearners/
  2. Stone, M. E., & Jacobs, G. (Eds.). (2008). Supplemental instruction: Improving first-year student success in high-risk courses (Monograph No. 7, 3rd ed.). Columbia, SC: University of South Carolina, National Resource Center for The First-Year Experience and Students in Transition.
  3. First-Generation College Students More Engaged Than Peers

Matt Jackson, Ed.D.

Senior Product Manager
Anthology

Matt Jackson, Ed.D., joined the Anthology (formerly Campus Labs) team after a decade serving as an assistant professor with the Academic Writing and Learning Center at the University of Minnesota Duluth (UMD). Jackson completed his doctorate in teaching and learning at UMD, researching the intersections of educational policy, emerging technologies, and student success. His academic and professional experiences give him unique expertise in understanding how data and language are used to navigate complex phenomena in higher education. Jackson’s work has been featured in such notable spaces as Inside Higher Ed, Education Dive, Community College Daily, Diverse Issues in Higher Education, and the Observatory of Educational Innovation.