November 20, 2019

Thought You Were Closing the Loop? Think Again.

This content was previously published by Campus Labs, now part of Anthology. Product and/or solution names may have changed.

“The end of assessment is action.”
Walvoord (2010)

Everyone knows that assessment is all about the use of the data. There’s a reason “closing the loop” is one of the most-used and most-loved phrases in the assessment professional’s handbook – it’s always relevant. The assessment cycle (whatever version of it you use) is here to stay – it’s always a loop, and that final step is what matters.

At least, that’s what you tell us. Our latest user research survey of institutional effectiveness professionals shows that 80 percent of you rated “meaningful use of data to inform improvement” as extremely important, and 84 percent of you said that “successfully closing the loop on use of results” is also extremely important. And of course, accreditors make it crystal clear that this is what’s expected of us.

Well, I have some bad news to share: Despite your best efforts, for four out of five of you, your loop is still broken.

How do we know that? Data, of course!

We looked at the assessment documentation processes from 86 different institutions of varying sizes and types across the United States. Each of these institutions mapped the fields in their assessment documentation templates to the Plan, Do, Check, Act framework, which we use in our assessment software:


Here’s an example of what that may look like. Let’s say your institution has three different assessment documentation forms – one for Academic Assessment, one for Strategic Planning, and one for Student Life Assessment. Each of those forms has a set of fields that your staff and faculty fill out and those fields correspond to a phase in the assessment process. Here’s an example of how a form might map to the phases of the cycle:


Plan
  • Unit Goal
  • Related Strategic Priority
  • Success Criteria

Do
  • Assessment Method
  • Description of Assessment Process
  • Date of Data Collection

Check
  • Summary of Results
  • Were Success Criteria Met?
  • Interpretation and Reflection
  • Recommendations for Action
  • Resources Needed

Act
  • Actions Taken
  • Evidence of Impact

So when we add up all of the assessment processes and forms from those 86 institutions, we have a set of 204 assessment cycles made up of 2,397 fields. When we looked closer at this data, we learned a few things:
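To picture how this kind of tally works, here is a minimal sketch: map each documentation field to its PDCA phase, then count the distribution of fields across phases. The field names echo the example form above, but the mapping and the resulting percentages are purely illustrative – they are not the actual study data.

```python
from collections import Counter

# Hypothetical field-to-phase mapping based on the example form above.
# Real institutions map their own template fields to PDCA phases.
field_phase_map = {
    "Unit Goal": "Plan",
    "Related Strategic Priority": "Plan",
    "Success Criteria": "Plan",
    "Assessment Method": "Do",
    "Description of Assessment Process": "Do",
    "Date of Data Collection": "Do",
    "Summary of Results": "Check",
    "Interpretation and Reflection": "Check",
    "Recommendations for Action": "Check",
    "Actions Taken": "Act",
    "Evidence of Impact": "Act",
}

# Tally how many fields land in each phase and report each phase's share.
counts = Counter(field_phase_map.values())
total = sum(counts.values())
for phase in ("Plan", "Do", "Check", "Act"):
    share = 100 * counts[phase] / total
    print(f"{phase}: {counts[phase]} fields ({share:.0f}%)")
```

Run across many institutions' templates, a tally like this is what reveals the imbalance described below: plenty of fields in the early phases, very few in Act.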

We really, really, really like to plan. Almost half of all the information we document about our assessments is mapped to the PLAN phase.


By comparison, only 20 percent of the fields were mapped to the ACT stage. And when we looked closer at the data on an institution-by-institution level, we found that over 50 percent of the institutions had no fields mapped to the ACT phase at all. (That means these institutions have yet to design or implement an assessment process that even intends to capture evidence of actions taken.)


As you can see above in our Plan, Do, Check, Act model, the ACT phase requires more than just saying we’ll make improvements or changes – it requires that the actions were actually taken. (Recommendations for action are cataloged in the CHECK phase, when results are being analyzed and interpreted.) When we looked even closer at the field names and descriptions that were mapped to the ACT stage, we found something that brought our number down even further – many fields asked about planned actions, and never confirmed what actually happened.


All in all, only one in five of the institutions were collecting evidence of actions taken based on data.

Which, at first, sounds very disheartening. After all, isn’t the point of all of this to drive improvement and action? Schoepp and Benson (2015) talk about this dilemma in their work – the gap between deciding on closing-the-loop actions and actually taking meaningful actions. (They also found that it takes an average of 1.68 years for us to implement one of those actions!)

The truth is, we spend a lot of meaningful time and effort guiding our faculty and staff to do good assessment – to start with an outcome, to pick an appropriate method, to stop and consider the data. So we’ve emphasized that a lot in our reporting and documentation. And, by and large, it looks like that has made an impact on the quality of our assessment efforts – which is certainly an accomplishment and something we should celebrate as a win.

But continuous improvement doesn’t magically come by virtue of having data. How can you ensure that your hard-fought assessment efforts close that loop in a meaningful way? The first step can be to stop and take a closer look at your assessment documents to see if you can make some simple changes in existing processes. Then you can continue the momentum of the cycle all the way through, leveraging the success you’ve had in driving quality assessment in the early phases.

To help with this, we examined the templates from those campus success stories and identified specific techniques for incorporating the crucial “Act” stage into your assessment documentation process.

So, are you committed to putting your #DataInAction and fixing your broken assessment loop?


Annemieke Rice

Vice President of Campus Strategy

A self-professed data geek, Annemieke has spent her time at Anthology (formerly Campus Labs) helping guide and educate member campuses in their journey to use data more effectively. In doing so, she has consulted with hundreds of higher education institutions seeking to accelerate practice in areas including student success, learning assessment and institutional effectiveness. She arrived at Campus Labs via early member campus Northeastern University, where her responsibilities provided her with first-hand experience in strategic planning, retention initiatives, strategic enrollment management, educational technology strategy and accreditation.

She earned a bachelor of arts in behavioral neuroscience and journalism from Lehigh University and a master of science in applied educational psychology from Northeastern University. A prolific and engaging speaker, she has presented at more than 100 national and regional forums.