Campus Conversation

Efficient and Effective Gen Ed Assessment

There is much that can be said about the impact of general education on a learner. Wherever you fall within that conversation, one constant is the need to understand the learning occurring within the general education curriculum and whether students are developing the intended skills to succeed. Establishing those learning outcomes, assessing their achievement, and adapting to improve the experience for students and faculty is no small feat. We recently spoke with Linda Silva Thompson and Amanda Strong from Molloy College about their experience establishing a new general education assessment process across their institution and how their approach and use of technology helped them expand adoption successfully.

Anthology Products Showcased: Anthology Collective Review


Linda Silva Thompson, Ph.D.

Associate Dean of Academic Assessment & Associate Professor, School of Business
Molloy College

Amanda Strong, M.A.

Assistant Director of Academic Assessment
Molloy College

Q: What were the driving forces or factors behind the need for a technology solution for Gen Ed assessment at Molloy?

A: Our institution adopted a new program and general education assessment model that uses student artifacts. Starting fresh with a new assessment model and committee structure gave us the opportunity to review all of our assessment practices, including general education. After establishing clear ownership of general education, we reviewed and updated our learning outcomes, and then the question became: how are we going to measure them and provide the data to continually improve? This is where technology played a critical role in helping us remain efficient and effective at assessment. Our goal was to minimize the number of standalone technology solutions, take advantage of much-needed connections, and, where possible, leverage familiar environments that would allow faculty and administration to adopt and embrace the tool more quickly. With Anthology Collective Review, we were able to expand an already familiar solution to college-wide adoption and streamline the data collection, analysis, and reporting used by faculty, deans, and chairs. This demonstrated to faculty how their work is being used to help improve student learning and inform our perspectives on students’ college and career readiness.

Q: What would you consider to be some of those key wins that helped continue to propel this initiative forward at Molloy?

A: A key win for Molloy is helping faculty see the big picture and develop a clear sense of how their assessment work contributes to it. This was enhanced by the training resources and expertise available from Anthology. Another win is our ability to adapt and customize Collective Review to individual program needs while maintaining the ability to report at the institutional level. From an assessment point of view, our learning outcomes are the starting point that creates opportunities for continual improvement. Assessment provides best practices that enable General Education and programs using Collective Review to customize their use of the platform.

Q: What are some of the ways that you have been or are planning to put that data to work for you moving forward?

A: I'd never want Molloy to be in a situation where a visiting accreditation team is on our campus asking for artifacts that we have to scurry to produce. That will never be a problem for us now. Not only do we have artifacts for institutional use, but the programs have them for their program accreditation, because the same trend is happening around using artifacts to demonstrate continual improvement. It’s too common for an institution to fail its assessment-related standard because of weak or absent general education assessment, so I think we've made a lot of progress in this area. Our focus now is the ability to utilize demographic analysis and disaggregate the data by race, first-generation status, adult learner status, etc. This is something that Collective Review has recently released, and we’re excited to utilize it going forward.

Q: What is some advice you would give fellow institutions embarking upon this general education assessment journey based on your experiences?

A: First, establish what the institution is trying to accomplish with its general education program. Second, clearly understand what the institution’s accrediting body requires and how it will assess the institution’s general education program. I think a mistake a lot of institutions make is letting technology decisions happen in a decentralized way. It’s a tightrope you need to walk: picking the right solution to achieve specific Academic Affairs strategic goals while making that selection quickly and efficiently.

Q: Do you have any thoughts about the practice of assessment using Collective Review (CR)?

A: CR can help an institution organize peer review of student artifacts and consistently apply established parameters so that the desired levels of reliability and margins of error are maintained. CR helps create checks and balances that serve as benchmarks throughout the institution’s assessment cycle. Best of all, as higher ed embraces remote work, CR provides an assessment platform that supports it. The result is that faculty are no longer required to sit in the same room, on campus, after hours, with piles of paper artifacts and rubrics to do this work. Faculty can sit in their pajamas on their sunporch with a laptop and complete their peer review of student artifacts.

Q: Have you seen any improvements or behavioral changes in how assessors actually experience and carry out the process?

A: CR has demystified what faculty thought was a complicated process. The result is more faculty buy-in, more positive faculty energy directed toward assessment, more faculty understanding of the difference between grading and assessing, and more faculty-driven curriculum changes that might not otherwise have been identified.

Our sincere thanks to Linda Silva Thompson, Ph.D. and Amanda Strong, M.A. for sharing their experiences with us. If you’d like your campus to be showcased, reach out to your consultant.