
Exploring Your Academic Progress Data

Developed by Ciji Heiser, Ph.D., with contributions from Mentor Collective

The Institute for Higher Education Policy released a study in 2016, Towards Convergence: A Technical Guide for the Postsecondary Metrics Framework, which identified points of consensus in the field of higher education for measuring performance related to student success, progress, and completion. Credit accumulation, credit completion ratio, program of study selection, retention rate, and persistence rate were all measures of progress indicated in the metrics framework.

Another study showed that students receiving mentorship had better average grades and passed more courses in the first year of their program of study, and these gains persisted into the second year (Leidenfrost, Strassnig, Schutz, Carbon, & Schabmann, 2014). Beyond establishing metrics, the IHEP framework also recommends disaggregating progress data by economic status, preparation, age, race, and ethnicity.

Based on discussions in the field, key indicators of academic progress include credit accumulation (Belfield, Jenkins, & Fink, 2019; UTRGV Strategic Plan) and timely selection of program of study (Benchmarking Institute). Credit accumulation measures the number of credits a student has acquired in a given time frame and signals the percent of coursework successfully completed, which contributes directly to degree attainment. One university measures academic progress using the following two leading indicators:

  • number of first-time, full-time, mentored students who complete 80% of courses attempted each semester 
  • number of first-time, full-time, mentored students who earn 30 degree-applicable hours each academic year

Timely selection of program of study can be measured through:

  • the number of undecided students, 
  • the percent of students who have chosen a major after one semester, and 
  • the percent of students who have chosen a major after one year. 

Lagging indicators of student academic progress include: 

  • GPA
  • Persistence Rate 
  • Retention Rate 

In this guideline, we recommend comparing students' actual progress toward their degrees with where they would be if they were 'on track' for, or ahead of, completion timelines based on national graduation rates (four-year or six-year). According to the USC Rossier School of Education, completion is a measure of students who finish their degree within six years of entering postsecondary education for the first time.

This guide outlines the steps institutions can take to conduct an impact analysis with their Mentor Collective data.

You Have a Dashboard and Data Exports, Now What?

Suggestion 1: Set meaningful targets.

Using the baseline measure of leading indicators established in Suggestion 2 below, identify meaningful targets of academic progress for the students in the mentoring program. Select targets to serve as a benchmark for both mentor and mentee students. If the mentoring program is focused on specific student subpopulations, establishing targets for those subpopulations is useful as well.

Suggestion 2: Establish a baseline measure of leading indicators using historical data.

Use historical data from the institution or national data sets to establish a baseline of leading and lagging indicators of student academic progress. If the mentoring program is focused on specific student subpopulations, establishing baselines for those subpopulations is useful as well.

The following questions use first-time, full-time students in their first term as the population and timeframe. When answered, these questions provide a baseline of leading indicators for academic progress; a minimal sketch of how these baselines could be computed follows the list.

  • What is the number, averaged over five years, of first-time, full-time students who complete 80% of courses attempted each semester?
  • What is the number, averaged over five years, of first-time, full-time students who earn 30 credit hours each academic year? 
  • What is the number, averaged over five years, of undecided students in term 2, term 3, and term 4?
  • What percent, averaged over five years, of students choose a major after one semester?
  • What percent, averaged over five years, of students choose a major after one year?
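
If your historical records live in a flat file rather than a reporting tool, these baselines can be computed with a short script. The sketch below uses Python with pandas; the file name, the column names (cohort_year, courses_completed_term1, credits_earned_year1, declared_major_by_term1, and so on), and the assumption of one row per first-time, full-time student are all illustrative, not fields from any particular export.

```python
import pandas as pd

# Historical SIS extract of first-time, full-time students, one row per student per cohort year.
# File name and column names are assumptions; substitute the fields from your own export.
history = pd.read_csv("first_time_full_time_history.csv")

# Keep the five most recent cohort years to mirror the five-year averages above.
recent = history[history["cohort_year"] >= history["cohort_year"].max() - 4].copy()

# Leading indicator 1: students completing at least 80% of courses attempted in the first semester.
recent["completion_ratio"] = recent["courses_completed_term1"] / recent["courses_attempted_term1"]
completed_80 = recent[recent["completion_ratio"] >= 0.80].groupby("cohort_year")["student_id"].nunique()
print("Average students completing 80% of attempted courses:", completed_80.mean())

# Leading indicator 2: students earning 30 credit hours in their first academic year.
earned_30 = recent[recent["credits_earned_year1"] >= 30].groupby("cohort_year")["student_id"].nunique()
print("Average students earning 30 credit hours:", earned_30.mean())

# Timely major selection: percent of each cohort declaring a major within one semester and one year.
pct_term1 = recent.groupby("cohort_year")["declared_major_by_term1"].mean() * 100
pct_year1 = recent.groupby("cohort_year")["declared_major_by_year1"].mean() * 100
print("Average % declaring a major after one semester:", pct_term1.mean())
print("Average % declaring a major after one year:", pct_year1.mean())
```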

Suggestion 3: Explore relationships between variables.

Explore the relationship between academic progress and other variables for students in your mentoring program, such as demographic data. Three examples are listed below, followed by a sketch of how such comparisons might be run:

Academic Progress and Academic Self-Efficacy

  • What is the most recent academic progress score for students with low, medium, or high academic self-efficacy?

Academic Progress and Academic Help Seeking

  • What is the most recent academic progress score for students with low, medium, or high academic help seeking?

Academic Progress and Demographics

  • What is the most recent academic progress score for participating students based on demographic status, such as gender, versus non-participating students by the same demographic status?
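
One lightweight way to explore these relationships is to band a survey measure into low, medium, and high groups and compare the most recent academic progress score across bands. The sketch below assumes a merged participant and SIS file and hypothetical columns (self_efficacy_score, progress_score, gender, mentored); the cut points are illustrative, and the same approach would work for an academic help-seeking score.

```python
import pandas as pd

# Merged participant + SIS file (see the Impact Analysis steps below).
# All column names here are illustrative assumptions, not guaranteed export fields.
df = pd.read_csv("merged_mentoring_data.csv")

# Band a 1-5 pre-program self-efficacy score into low / medium / high groups.
df["self_efficacy_band"] = pd.cut(
    df["self_efficacy_score"], bins=[0, 2, 3.5, 5], labels=["low", "medium", "high"]
)

# Most recent academic progress score by self-efficacy band.
print(df.groupby("self_efficacy_band", observed=True)["progress_score"].mean())

# Academic progress by a demographic variable, split by program participation.
print(df.groupby(["gender", "mentored"])["progress_score"].mean().unstack())
```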

Suggestion 4: Create an action plan.

Create an action plan for enhanced impact that includes necessary interventions, celebrates target achievement, and shares the program's impact with key campus partners. Action plans are simple but effective tools that focus time and talent on using key data findings to inform one or two action strategies. Include a follow-up analysis in the action items to determine whether the strategies from the plan were effective. Below is an example of what an action plan could look like, including a celebration.

Actionable data: Transfer students who are mentored have higher academic progress indicators than the five-year institutional average.

Target: Maintain our current academic progress indicator targets.

Strategies:

  • Maintain our mentoring program to foster transfer student success. Person responsible: Mentor program coordinator. Due date: 5/30/2025.
  • Share findings with this year's mentees and mentors, next year's mentees, the Provost's Council, the Student Affairs Leadership Team, and student organizations. Person responsible: Dean of Student Success. Due date: 8/30/2024.



Impact Analysis Steps

Step 1: Clearly identify the populations and the timeframe.

Start with a detailed description of which students were invited to participate in the mentoring program. For example: first-time, full-time students in their first term of enrollment were given a mentor. Defining the population prior to analysis is critical for establishing leading indicators of progress and makes it easier to interpret the data once the analysis is complete.

Next, define meaningful timeframes for analysis. For example, when talking about academic progress towards completion, specify that the analysis includes students enrolled in fall of 2023 and spring of 2024.
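
As a sketch of what this looks like in practice, the filter below encodes both the population and the timeframe before any analysis begins. The column names and term codes (first_time, enrollment_status, first_term, 2023FA, 2024SP) are assumptions about a hypothetical SIS export, not a prescribed schema.

```python
import pandas as pd

# Hypothetical SIS export; column names and term codes are assumptions.
sis = pd.read_csv("sis_extract.csv")

# Population: first-time, full-time students in their first term of enrollment.
population = sis[
    (sis["first_time"])                           # flag for first-time students
    & (sis["enrollment_status"] == "full-time")
    & (sis["term"] == sis["first_term"])
]

# Timeframe: limit the analysis to fall 2023 and spring 2024.
analysis_frame = population[population["term"].isin(["2023FA", "2024SP"])]
```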

Step 2: Identify the relevant fields needed to conduct your analysis.

The fields needed to analyze academic progress for mentors and mentees are included in the table below. You will need to pull fields from multiple sources, and having a clear list of fields with definitions (also known as a data dictionary) will help to gather the relevant information more easily. 

Note: Here we are following common convention and using courses attempted and courses earned, but you could also use degree-relevant hours attempted and earned. Declared majors may be designated as a different status in your SIS.

A template for populating these fields is provided here. Columns denoted in blue (B, C, D) are from the Mentor Collective participant data export, while the columns in yellow (A, E, F, G, I, K, L, M, N, O, P) would come from the student information system. Columns denoted in green (J) reflect an area where a calculation is necessary to conduct the analysis. Here’s how to export your participant data through your Partner Dashboard.

Data Field: Data Location

  • Institution ID*: Mentor Collective Participant Export and Student Information System
  • Role: Mentor Collective Participant Export
  • Program Status: Mentor Collective Participant Export
  • Institution-Provided Email: Student Information System
  • Courses Attempted: Student Information System
  • Courses Completed: Student Information System
  • Credits Attempted: Student Information System
  • Credits Completed: Student Information System
  • First Term of Enrollment: Student Information System
  • Current Term of Enrollment: Student Information System
  • Declared Major (Y/N): Student Information System
  • Term GPA: Student Information System
  • Cumulative GPA: Student Information System
  • Meaningful demographic variables (e.g., First-Generation Student (Y/N), Race or Ethnicity, Gender): Student Information System, unless the data was provided to Mentor Collective

*The Institutional ID is what will serve as your unique, common identifier for each participant, necessary to merge your data in the next step. If you did not provide Institutional ID to Mentor Collective, you can alternatively use an institutional email address as the unique identifier. However, please note that email address is a less reliable field, as some students may have registered for the mentorship program under a different email than what is listed in your Student Information System.

Step 3: Obtain and merge data.

To analyze academic progress for mentors and mentees, you will need data from your student information system (SIS) combined with your Mentor Collective data. From your Mentor Collective administrative dashboard, you can export the Mentor Collective Participant Data file. To connect these two files, you will need the student institutional ID number (or, if IDs were not collected, the institution-provided email address). An example of merged data can be seen here.
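
If you are merging the files programmatically rather than in a spreadsheet, a minimal pandas sketch might look like the following. The file names and column labels (institution_id, institution_email, cumulative_gpa) are assumptions; align them with the data dictionary from Step 2 and your own exports.

```python
import pandas as pd

# File names and column labels are assumptions; adjust to your own exports.
participants = pd.read_csv("mentor_collective_participant_export.csv")
sis = pd.read_csv("sis_extract.csv")

# Primary join on the unique institutional ID.
merged = participants.merge(sis, on="institution_id", how="left")

# Check how many participants matched an SIS record before continuing.
match_rate = merged["cumulative_gpa"].notna().mean()
print(f"{match_rate:.0%} of participants matched on institutional ID")

# If IDs were not collected, an email-based join is a less reliable fallback:
# merged = participants.merge(sis, left_on="email", right_on="institution_email", how="left")
```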

Step 4: Create a pivot table.

Use your fields of interest in the template provided here and pictured below. Start with the basic counts of who is mentored and matched and build on this table to include a sum of courses attempted and earned for mentored students. You could also create tables for students serving as mentors or disaggregate the data. 
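
The same pivot can be built in pandas with pivot_table if you prefer to script this step. The role and program status values, along with the column names, are assumptions about how your merged file is labeled.

```python
import pandas as pd

merged = pd.read_csv("merged_mentoring_data.csv")  # output of Step 3

# Basic counts of participants by role (mentor / mentee) and program status (matched / unmatched).
counts = pd.pivot_table(
    merged, index="role", columns="program_status",
    values="institution_id", aggfunc="count", fill_value=0,
)
print(counts)

# Build on the counts: sum of courses attempted and completed for mentored (mentee) students.
mentees = merged[merged["role"] == "Mentee"]  # role label is an assumption
course_totals = pd.pivot_table(
    mentees, index="program_status",
    values=["courses_attempted", "courses_completed"], aggfunc="sum",
)
print(course_totals)
```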

Step 5: Summarize the data and calculate rates.

Below is a screenshot of summary tables with rates calculated. In the academic progress guidelines spreadsheet, the summary tab provides an example of a completed summary table of course completion rates for students who participated in the program.
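
Course completion rates can be calculated directly from the pivot totals: courses completed divided by courses attempted for each group. A minimal sketch, again using the assumed column names from the previous steps:

```python
import pandas as pd

merged = pd.read_csv("merged_mentoring_data.csv")  # same merged file used in Step 4

# Course completion rate = courses completed / courses attempted, by participation group.
summary = merged.groupby("program_status")[["courses_attempted", "courses_completed"]].sum()
summary["completion_rate"] = summary["courses_completed"] / summary["courses_attempted"]
print(summary.round(3))
```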

Step 6: Articulate impact statements.

Bulleted below are examples of impact statements that could be made with the data in the pivot tables; a short sketch after the list shows one way to script the underlying comparison.

  • Students who are mentored have higher course completion rates in their first semester and in their first year than the overall five-year average.
  • Mentored students are more likely to declare their major in the first term or first year than students typically do (based on the five-year average).
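
Each statement is essentially a comparison between the program cohort's rate and the five-year baseline from Suggestion 2. The sketch below shows one way to phrase that comparison programmatically; the rates passed in are placeholders only, not real results.

```python
def impact_statement(mentored_rate: float, baseline_rate: float) -> str:
    """Phrase a comparison between the mentored cohort's rate and the five-year baseline."""
    direction = "higher" if mentored_rate > baseline_rate else "not higher"
    return (
        f"Mentored students' course completion rate ({mentored_rate:.1%}) is {direction} "
        f"than the five-year institutional average ({baseline_rate:.1%})."
    )

# Placeholder values: substitute the rates produced by your own summary (Step 5) and baseline (Suggestion 2).
print(impact_statement(mentored_rate=0.85, baseline_rate=0.80))
```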

Now That You Have Your Exploratory Findings, What Can You Do?

Idea 1: Share the data!

Take your data on a road show by asking to present at committee meetings or standing meetings across campus (e.g., Provost’s council, President’s Cabinet, Student Government, Faculty Senate). For example, mentorship program directors at Florida Atlantic University collaborated with their institutional research office to identify how mentorship had made an impact for specific groups of students. They shared these findings with top leadership at the University, leading to increased and ongoing monetary support for their mentorship initiatives across the institution. Sharing data is one way to promote an institution-wide culture of mentorship.

Idea 2: Shift from sharing to collective problem solving.

Create a space for collective problem-solving or idea storming with those most closely connected to the data, including students. Host a data and doughnuts party and ask attendees to reflect on: "What do you see in this data?" and "How might we respond to these insights?" Augusta University has a committee with staff and faculty from across the institution to do this. Their initiatives include working to ensure that mentorship is leveraged in the right ways in response to student needs.

Idea 3: Identify areas to celebrate.

Take the time to identify and highlight areas where your organization is making strides towards its goals. What does the data reveal about progress? What surprising insights are worthy of celebration?

Idea 4: Empower others with knowledge.

Who else might find this information meaningful? Consider reaching out to community partners, student clubs, or historically underrepresented groups to talk about the data. Sharing data with a wide range of collaborators can lead to fresh perspectives and more comprehensive decision-making. This is one way to ensure a participant-centered approach to mentorship. 

 


Ciji Heiser, Ph.D.


Ciji Heiser, Ph.D. (she/her) is the Founder of Co-Creating Action, an award-winning researcher, and a seasoned professional in education, evaluation, and strategic planning. She teaches antiracist methodologies at American University, contemporary issues in higher education at New England College, and Applying and Leading Assessment in Higher Education for the Student Affairs Assessment Leaders. She holds degrees from Bucknell University, Kent State University, and the University of North Carolina at Greensboro.

She has led assessment, evaluation, and strategic planning initiatives across sectors. Recently, she contributed to NSF grants supporting underrepresented students in STEM and conducted qualitative research on factors influencing students’ choices in pursuing postsecondary education. Currently, she collaborates on reducing racial disparities in prison diversion programs across Illinois and is researching how policies shape equal opportunity work across a state system of community colleges. 

As a volunteer, Ciji co-leads the Grand Challenges in Higher Education plan, leveraging data to promote accessible and quality education. She also shared her expertise as a faculty member at the ACPA Assessment Institute and as past-chair of the Student Affairs Assessment Leaders.

Ciji developed the impact analysis guidelines, as well as the recommendations for using the analysis data to create institutional impact.
