Published November 18, 2025

The Consortial Approach to Learning Analytics

When Unizin was first organized a decade ago, the consortium’s initial members often referenced a well-known African proverb:

“If you want to go fast, go alone, but if you want to go far, go together.”

A founding tenet of Unizin was to help make innovations born at one member institution widely available to all members. By going together, we could not only go far, but also go faster than a single institution working alone. While proverbs are wonderful tools to make big ideas approachable, putting them into practice is often easier said than done! In Unizin’s case, our members needed some time to work alone before they could fully take advantage of working together.

For the first several years of Unizin's existence, members focused primarily on ways to use the Unizin Data Platform (UDP) to support various aspects of student success and institutional health on their campuses. This inward focus revealed valuable lessons and led to the development of tools and approaches to data utilization that are now being shared and scaled across the consortium.

Over the past four years, Unizin has worked closely with many members to translate the core components of these homegrown applications into data marts that can be shared across the entire consortium. A data mart is a flat database table that combines raw data from the UDP to generate actionable information. The proliferation of digital learning tools and platforms turns even seemingly simple insights into surprisingly complex data problems. Data marts simplify the process and can be updated daily as new data flows into the UDP.

Consider for a moment how to calculate a student's last activity. A student's last activity could be defined as the last time they logged into the LMS. But what about the last time they submitted an assignment? Or the last time they messaged a faculty member, participated in a discussion board, or watched a video? All of these are last activity measures, and all reside in different database tables. The last activity data mart aggregates all these metrics into a single database table that is updated nightly.
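
To make the idea concrete, here is a minimal sketch of that aggregation in Python with pandas. The table and column names (student_id, event_time, and the per-source frames) are illustrative assumptions, not the actual UDP schema; in practice, the equivalent query would run nightly against the UDP and write its output to the mart table that members query.

```python
# Minimal sketch of a "last activity" aggregation. Assumes one DataFrame per
# activity source with hypothetical student_id / event_time columns (not the
# actual UDP schema).
import pandas as pd


def build_last_activity_mart(logins, submissions, messages, discussions, videos):
    """Combine per-source activity timestamps into one row per student."""
    sources = {
        "last_login": logins,            # LMS login events
        "last_submission": submissions,  # assignment submissions
        "last_message": messages,        # messages to instructors
        "last_discussion": discussions,  # discussion-board posts
        "last_video_view": videos,       # video views
    }

    # Reduce each source to its most recent timestamp per student.
    per_source = [
        df.groupby("student_id")["event_time"].max().rename(name)
        for name, df in sources.items()
    ]

    # Align the sources on student_id; students absent from a source get NaT.
    mart = pd.concat(per_source, axis=1)

    # Overall last activity is the most recent timestamp across all sources.
    mart["last_activity"] = mart.max(axis=1)
    return mart.reset_index()
```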

Because every Unizin member has a resident UDP that utilizes a common data model to integrate data from disparate learning tools and platforms, any member can plug in any data mart without reconfiguration, customization, or starting from scratch. Faster and farther.

The Next Step: From Marts to Metrics

On its own, the student last activity data mart doesn't tell us much about a student's overall engagement in the digital learning environment. Enter the Canvas Activity Score, a metric developed by Indiana University to help advisers better identify students who may benefit from a proactive check-in early in a semester. IU not only developed the Canvas Activity Score; the team also spent several years testing, refining, and studying its impact. IU's research concludes that when advisers incorporate the Canvas Activity Score metric into their advising workflow, the students they support have a higher GPA, on average, and a higher rate of persisting to the next semester compared to students whose advisers don't use the metric.

As IU members shared the success of the Canvas Activity Score across the consortium, other Unizin members expressed interest in implementing the metric at their institutions. Unizin staff worked with IU to package the underlying statistics into the Student Activity Score data mart.

Unizin members at the University of Iowa and the University of California, Irvine, have now used IU’s work as a starting point to accelerate their own projects, leveraging more than four years of work at IU to jumpstart their initiatives. More broadly, in 2024 alone, the Student Activity Score data mart was queried more than 1 million times across all Unizin members!

A similar story has emerged from Penn State, where a different engagement metric was developed using entirely different UDP data as inputs. Leveraging the millions of clickstream events taking place each week across multiple digital environments, including Canvas, Penn State developed a rolling 7-day average of clickstream events for every student in every course. With the student average established, Penn State then built a rolling 7-day class average to serve as a comparison point.
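
As an illustration (not Penn State's actual implementation), here is one way the two rolling averages might be computed with pandas, assuming a daily, zero-filled count of clickstream events per student per course; the column names are hypothetical.

```python
# Illustrative sketch of a rolling 7-day engagement metric. Assumes a daily,
# zero-filled event count per student per course with hypothetical columns:
# course_id, student_id, date (datetime), event_count.
import pandas as pd


def rolling_engagement(daily_counts: pd.DataFrame) -> pd.DataFrame:
    df = daily_counts.sort_values(["course_id", "student_id", "date"]).copy()

    def per_student(group: pd.DataFrame) -> pd.DataFrame:
        # Time-based rolling mean over the trailing 7 days for one student/course.
        avg = group.set_index("date")["event_count"].rolling("7D").mean()
        group = group.copy()
        group["student_7day_avg"] = avg.to_numpy()
        return group

    df = df.groupby(["course_id", "student_id"], group_keys=False).apply(per_student)

    # Class-level comparison point: the mean of student averages per course per day.
    class_avg = (
        df.groupby(["course_id", "date"])["student_7day_avg"]
          .mean()
          .rename("class_7day_avg")
          .reset_index()
    )
    return df.merge(class_avg, on=["course_id", "date"], how="left")
```

Comparing a student's own 7-day average against the class average is what surfaces students who appear to be disengaging.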

Penn State has successfully deployed this engagement metric in adviser and faculty-facing applications, helping advisers and faculty identify students who appear to be disengaging from the digital learning environment. This insight has been especially helpful early in a semester to identify students who may benefit from proactive outreach.

Similar to our work with Indiana to package the Canvas Activity Score into a data mart, we collaborated with Penn State to integrate the rolling 7-day average of clickstream events into our student success data mart. All Unizin members can now incorporate this engagement metric into bespoke applications.

Today, you'll find components of the student success data mart at work in the University of Nebraska-Lincoln's faculty-facing dashboard, Course Insights, and in the University of California, Irvine's Spark application, a student-facing tool designed to help students make more informed decisions about time management and self-regulated learning.

Returning to our African proverb, it turns out that you can go both far and fast with this consortial approach to learning analytics!