Higher education institutions are responsible for large amounts of learner data; the challenge is then figuring out how to use that data to support and guide learners throughout their academic journey. Connecting and sharing the data is an opportunity to help achieve institutional and student goals, but legal and ethical questions arise when you consider the type and volume of data and the number of people and systems that have access to it.
The Anthology team takes many factors into consideration when navigating the legal and ethical concerns around connecting and sharing data. We offer solutions that harness the power of a fully connected data experience, but that doesn't mean we can, or would even where it is legal, share everything with everyone involved. As stewards of the data that passes through our systems, it is our responsibility not only to ensure the security and privacy of our customers' data but also to protect its integrity.
We don't own your data. Anthology does not sell or trade an institution's data with any third parties. We do not use customer data to market our products to students, and we do not use the data in ways that do not serve the needs and goals of customer institutions. The data belongs to the institution; we are merely stewards working for the betterment of the institution and for student success, taking the data that is shared with us, using our innovative solutions to harness its power, and returning value to the institution and its students.
We are clear and transparent. Our higher education customers are our partners, and they are involved in the conversations about what we do with their data and why. We work to inform them of the data we use and what they can expect to see when working with the results. Everything we develop is built with an educationally purposeful mindset and is intended to benefit institutions and their students.
We are thoughtful and intentional. To create a data-fueled solution, we need to understand the dataset: what's included, where it came from, how it was collected and what's missing, to name a few considerations. Our data science team critically examines the data components that are useful for building a data model and identifies where adjustments need to be made. We want institutions to be effective and students to succeed, and we try to be parsimonious, to use a statistical term: we want to build the data model with the fewest variables that explain the most. Therefore, we focus on the quality and necessity of data rather than indiscriminately throwing everything in. Then we ask even more questions about who at the institution needs access to that data and for what purpose, and finally we build a data experience that creates value for our users.
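The parsimony principle described above can be illustrated with a minimal sketch. Here we assume hypothetical candidate models, each summarized by its number of variables and a fit score (log-likelihood); the model names and numbers are invented for illustration and do not reflect any real Anthology model. The Akaike information criterion (AIC) rewards fit while penalizing extra variables, so the lowest-AIC model balances "explains the most" against "fewest variables."

```python
# Hypothetical candidate models: (name, number of variables k, log-likelihood).
# All values are illustrative, not drawn from any real dataset.
candidates = [
    ("attendance_only", 1, -210.0),
    ("attendance_plus_lms_activity", 3, -180.0),
    ("kitchen_sink_all_variables", 12, -176.0),
]

def aic(k, log_likelihood):
    """Akaike information criterion: 2k - 2*ln(L). Lower is better."""
    return 2 * k - 2 * log_likelihood

scored = [(name, aic(k, ll)) for name, k, ll in candidates]
best = min(scored, key=lambda pair: pair[1])
print(best[0])
```

In this toy comparison the 12-variable "kitchen sink" model fits slightly better in raw terms, but the penalty for its extra variables means the leaner three-variable model wins, which is exactly the trade-off parsimony asks us to make.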
We guard against the risks. While our intentions are positive, unintended outcomes can arise, not just from the data itself but also from the conditions of the systems in which the data operates. We are constantly exploring new concepts in predictive modeling, analysis and metric creation that can have a tremendous upside for institutions, but for every one of those ideas we are mindful of potential unintended consequences. We take time to consider the potential downsides or unintended biases within the system. And we do this for existing solutions as well: we continuously monitor outcomes and guard against unexpected, negative results, even in low-stakes situations.
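One simple form of the ongoing outcome monitoring described above can be sketched as follows. The group names, counts and tolerance here are hypothetical, chosen purely to illustrate the idea rather than to describe an actual Anthology implementation: compare a model's outcome rate within each student group against the overall rate and surface any group that diverges beyond a chosen tolerance.

```python
# Hypothetical counts of students a predictive model flagged as "at risk",
# broken out by enrollment group. Purely illustrative numbers.
flagged = {"full_time": 120, "part_time": 90}
totals = {"full_time": 1000, "part_time": 400}

TOLERANCE = 0.05  # surface groups whose rate deviates >5 points from overall

overall_rate = sum(flagged.values()) / sum(totals.values())  # 210/1400 = 0.15

# Groups whose flag rate diverges from the overall rate beyond the tolerance.
alerts = {
    group: flagged[group] / totals[group]
    for group in totals
    if abs(flagged[group] / totals[group] - overall_rate) > TOLERANCE
}
print(alerts)
```

In this example the part-time group is flagged at 22.5% versus 15% overall, so it would be surfaced for human review; a check like this says nothing about why the gap exists, only that someone should look, which is the point of continuous monitoring.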
We look at what supports higher ed. We at Anthology aren't just tech experts; we are also higher ed experts, and everything we do is through that lens. Our products aren't developed for general use; they are developed for the advancement of higher education. We are purposeful and goal-oriented, and we don't lose sight of the positive outcomes institutions want to achieve. We test different ways to do things and stay focused on what could put those institutions in a better position for success.
We never stop having the conversation. The discussion on legal and ethical data use will never be over. It must be ongoing, whether we are talking about new technology or products that have existed for a while. Needs change, goals change, and over time artificially intelligent systems can reveal systemic biases that have existed all along. Regardless of our good intentions at the start of the process, we are responsible for keeping the conversation going and continuing to monitor the outcomes. At Anthology, we make sure this conversation is constant, and we invite our customers to join in: the more we push ourselves, and are pushed, to think through these questions, the better we can solve them in ways that make sense for institutional success.