This content was previously published by Campus Labs now part of Anthology. Product and/or solution names may have changed.
A 2019 Campus Labs Trailblazer Award winner, the University of Tennessee-Knoxville (“UT Knoxville”) impressed us with their organized integration of assessment as a priority across the institution. We connected with Ashley Browning, an integral member of the campus’ Assessment Steering Committee (ASC), who shared her perspectives on how small changes to their assessment templates, real-time monitoring of assessment, and data visualizations helped her team cultivate a stronger climate of assessment and develop a practical model for meta-assessment.
Your campus was recognized in the 2019 Campus Labs Awards for developing a more inclusive, efficient and holistic approach to assessment that expanded buy-in across campus. From your perspective, what role does technology play in creating a culture of continuous assessment?
The ASC has used the Planning tool since 2013 to create, assess, evaluate and improve student learning outcomes. After attending training sessions at the annual Elevate Conference, the committee identified small changes that could be made to:
- Create more transparency and availability of data to faculty
- Empower faculty to use data for student learning improvement
- Create assessment champions in every college across the university
The ASC made slight modifications to templates and reports within the Planning tool, transitioning the feedback process from a college-level review of assessment reports to one driven by faculty report reviewers. This approach delivered feedback to deans, directors and department heads two months earlier than in previous cycles. With Campus Labs tools, buy-in for a culture of continuous assessment was not only possible; it drove our meta-assessment process and results to new heights.
Today, you use our Planning tool across both the Academic and Student Affairs sides of the house under one centralized model. When you reflect on your process so far, how do you think this influences the way you use Planning in particular?
For the culture of assessment at UT Knoxville, it is important for the ASC to work as a centralized team. It comprises representatives who oversee assessment in the Office of Institutional Research and Assessment (OIRA), the Office of Accreditation, Teaching and Learning Innovation and the Division of Student Life.
The ASC uses the Planning tool to assess both academic and student services. This means that data is transparent, accessible and perhaps most importantly—available when needed. Committee members and decision-making administrators can read and navigate data, regardless of what “side of the house” they work in, as long as the proper permissions are provided. This impacts decision-making in a big way, because we don’t have to rely on assumptions when developing new processes and procedures that will impact student learning or experiences on campus.
Last year we introduced Assessment Cycles to Planning and your campus was one of our early adopters. Looking back, in what ways do you think assessment cycles have improved your process management?
Data visualization is a powerful thing. As a team member tasked with managing and supporting the entry of nearly 300 programmatic assessment reports, I can see how the Assessment Cycles tool has the potential to shift the quality and timeliness of reporting for our institution.
The ASC’s first step in adopting Assessment Cycles was identifying content from Planning templates that could be organized by the “Plan, Do, Check, Act” reporting model Campus Labs provided. Once that content was identified, creating our unique cycle within Assessment Cycles was straightforward. Being able to set deadlines for benchmarks saves time, but this tool does much more than that. For each phase of reporting, a dashboard shows which units have completed all tasks, which have partially completed the requirements (with a percentage breakdown of exactly how many benchmarks have been fulfilled) and which haven’t begun that phase of assessment. Reports can be generated to show precisely which components are missing to satisfy a phase of assessment, and whether requirements such as the number of instruments or direct assessments have been met. This level of analysis is powerful for addressing assessment reporting concerns in real time, rather than compiling separate lists of reports that need troubleshooting.