Published June 24, 2025

Iowa Gets a Jump on Its Student Activity Score

With an Assist from Indiana

Like many research and analytics leaders across higher education, Jane Russell at the University of Iowa is always working to balance requests with reality. As director of research and analytics in the university’s Office of Teaching, Learning, and Technology, Russell leads a team that merges teaching and learning data from Iowa’s vast array of digital learning platforms with institutional data to deliver new insights to a long list of stakeholders: advisors, faculty, administrators, and students themselves.

With resources at a premium and the number of learning tools and campus information systems growing, jumpstarting and streamlining the discovery, development, and deployment process is essential to keep pace with demand. 

“The Unizin Data Platform (UDP) is instrumental in bringing together data from our Student Information System (SIS) and Learning Management System (LMS) to fuel our research projects and help our stakeholders do their work more effectively,” explains Russell. “In the case of our academic advising program, the UDP combined with the experience of a fellow Unizin member helped us accelerate the process of developing a new tool to let our advisors see where a student is today to provide more effective guidance for tomorrow.”

The Unizin Model in Action

The Canvas Activity Score was initially developed at Indiana University (IU) as a simple, UDP-derived metric to power IU’s academic advising efforts. Studies by Matthew Rust and Ben Motz at IU have shown the Canvas Activity Score to be an accurate indicator of student performance with genuine predictive value. Russell and the Iowa team used IU’s model, and the data marts that power it, as a starting point for discovery and development.

Russell and her team first performed a deep analysis of the model’s effectiveness using Iowa UDP data, then evaluated new combinations of student activity data. Ultimately, Iowa landed on a weighted metric that combines gradebook data (which is not included in the IU activity score) with student engagement factors such as course logins, materials viewed, and missed assignments.
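As a rough illustration only (the article does not publish Iowa’s actual weights, factor definitions, or scales, so every number and field name below is a hypothetical stand-in), a weighted metric of this kind might be sketched in Python as:

```python
from dataclasses import dataclass

@dataclass
class StudentCourseActivity:
    """Per-student, per-course inputs of the kind the article describes.

    All field names and scales are illustrative placeholders,
    not Iowa's actual formula.
    """
    current_grade_pct: float     # gradebook score, 0-100
    logins_per_week: float       # course site logins
    materials_viewed_pct: float  # share of posted materials opened, 0-100
    missed_assignments: int      # past-due assignments with no submission

# Hypothetical weights -- the real balance of factors is Iowa's own.
WEIGHTS = {"grade": 0.5, "logins": 0.15, "materials": 0.2, "missed": 0.15}

def activity_score(a: StudentCourseActivity) -> float:
    """Collapse gradebook standing and engagement into one 0-100 score."""
    # Normalize each factor to a 0-100 scale before weighting.
    login_component = min(a.logins_per_week / 7.0, 1.0) * 100.0
    missed_component = max(0.0, 100.0 - 25.0 * a.missed_assignments)
    return (
        WEIGHTS["grade"] * a.current_grade_pct
        + WEIGHTS["logins"] * login_component
        + WEIGHTS["materials"] * a.materials_viewed_pct
        + WEIGHTS["missed"] * missed_component
    )
```

Keeping the formula to a handful of legible factors reflects the accuracy-versus-simplicity tradeoff Russell describes next.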

“We made a conscious investment to find the ideal balance of accuracy and simplicity,” explains Russell.  “We have a duty to the advisors to make the underlying formula easy to understand and easy to trust. While incorporating dozens of variables might boost the predictive power of our metric, our primary objective is to provide an accurate snapshot of a student’s status at a specific moment in time.” 

From Data to Insight

Iowa’s application development team, led by Ross Miller, developed an advisor-facing tool that is integrated into the university’s SIS.  Now, when an advisor calls up their advising roster, a simple icon indicates students who are currently ranked in the bottom 15% of their cohort in terms of engagement and performance.  With one click, an advisor can access an easy-to-understand snapshot of student engagement in a particular course (or courses).  Additional clicks enable advisors to dig deeper to inform outreach or a conversation.
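The roster flag itself amounts to a percentile cut over the cohort. A minimal sketch, assuming a score has already been computed per student (only the 15% threshold comes from the article; the rest is illustrative):

```python
def flag_bottom_of_cohort(scores: dict[str, float],
                          cutoff: float = 0.15) -> set[str]:
    """Return IDs of students in the bottom `cutoff` fraction of a cohort,
    ranked by activity score (lowest first)."""
    ranked = sorted(scores, key=scores.get)   # lowest combined score first
    n_flagged = round(len(ranked) * cutoff)   # e.g. 15 of a 100-student cohort
    return set(ranked[:n_flagged])

# Tiny illustrative cohort: with the default 15% cutoff, only the
# single lowest scorer is flagged for the advisor's roster icon.
cohort = {"s01": 82.5, "s02": 41.0, "s03": 67.3, "s04": 55.8}
print(flag_bottom_of_cohort(cohort))   # {'s02'}
```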

“It’s important to note that we worked very closely with Academic Advising Center director Maureen Schaeffer and her team throughout the application development process,” explains Miller. “Because access to this level of student-specific insight is new for advisors, we didn’t want to be prescriptive about how it should be applied to their day-to-day work. As we’ve observed, some advisors are more inclined to check the dashboards regularly, while others use the tool to confirm what they have already assessed for specific students by other means. Either way, we have received overwhelmingly positive feedback from the initial cohort of users.”

Impact Beyond Advising

For Iowa’s advising department, training and expansion are the next steps. Working with the pilot users and the development team, the advising group is devising a self-directed training program that reflects the flexible nature of the application.

“Because every advisor incorporates these insights into their own workflow, we want to extend that concept through a self-paced learning and application process,” explains Anna Marie Smith, senior analytics specialist in the Office of Teaching, Learning, and Technology. “The activity score data is also powering collaboration on broader efforts within the university. Our retention team is integrating the score into their model for student outreach and support to increase persistence through graduation.”

Three Insights from Iowa

UDP Universality and Uniqueness

The UDP establishes standards that align and unify data emanating from disparate SIS and LMS platforms across an institution. These standards enable analytics teams not only to build the data tables and queries that power tools like Iowa’s Student Activity Score, but also to share models across the consortium. A model built at one institution can be replicated at another.

Access to specific data within the UDP is governed by the data governance policies established at each member institution. For example, the Iowa team could incorporate Canvas gradebook data from the UDP into their Activity Score metric, while Indiana’s policies do not allow advisors to see that level of detail. Still, with the Unizin Common Data Model underlying both institutions, the Iowa team could quickly run Indiana’s Canvas Activity Score model within their UDP, assess its capabilities, and then invest time and resources in refining the model to make maximum use of the data within their system.
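That portability can be pictured with a sketch like the one below. The table and column names are placeholders rather than the actual Unizin Common Data Model schema, and run_query stands in for whatever database client a member institution uses; the point is simply that the model’s query text never changes, only the dataset it is pointed at:

```python
# Placeholder schema: these names illustrate the idea of a shared model,
# not the real Unizin Common Data Model tables.
WEEKLY_ACTIVITY_SQL = """
    SELECT person_id,
           COUNT(*) AS events_last_7_days
    FROM learner_activity
    WHERE event_time >= CURRENT_DATE - INTERVAL '7' DAY
    GROUP BY person_id
"""

def weekly_activity(run_query, dataset: str):
    """Run the same shared model against any UDP instance.

    `run_query` is a hypothetical client callable; only the dataset
    name (say, an Iowa vs. an Indiana deployment) differs.
    """
    return run_query(WEEKLY_ACTIVITY_SQL, dataset=dataset)
```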

Simplification Breeds Trust

Both Indiana and Iowa landed on remarkably simple formulas to represent a decidedly complex notion of student activity and performance.  While developers at both institutions recognized that a more complex algorithm could boost the predictive qualities and accuracy of their models, they also recognized that a simple, easy-to-explain formula would help establish trust among the advisors being asked to incorporate a new tool into their workflow.

Transparency and trust are the key determinants of uptake and application. Neither model overpromises on its predictive capabilities. Rather, each solution is positioned as an accurate representation of where a student stands at a moment in time, relative to others in their cohort. It is left to the advisor to determine if, when, and how to use this information.

Perspective vs. Prescription

Perspective is the name of the game when it comes to tools like the Canvas Activity Score and Student Activity Score.  It takes a collaborative effort between advising groups and technologists to align a data-derived tool to the workflows and approaches of academic advisors. 

On both campuses, the development teams worked hard to avoid being too prescriptive about how student-level insights are shared within their platforms. Both platforms provide new insights that help advisors put data into context while leaving each advisor free to determine the optimal approach for supporting each of their students.

Image created with the help of AI using Canva.