Developed by Ciji Heiser, Ph.D., with contributions from Mentor Collective
*When working with an enrollment program, expand the impact analysis to include melt analysis as a standard practice in impact reporting. In this guideline, melt is defined as the timeframe from yield to enrollment.*
In a study using administrative data from New York City and including over 50,000 students, results showed that near-peer college matriculation support programs increased matriculation by 7% and were especially effective for historically marginalized students: Black and Hispanic students and students from low-income areas (Liu, Haralampoudis, & Polon, 2023). Additional studies show that while peer mentor outreach over the summer can increase student matriculation by 4.5%, students matched to their mentors based on demographic data may increase enrollment even more (Strategic Data Project: Summer Melt Handbook).
Currently, the majority of client institutions do not invite students into the mentoring program early enough to conduct a melt analysis. Some institutions invite students at the time of acceptance, while other institutions invite students at summer orientation. Melt analysis would be relevant for institutions inviting students to a mentorship program over the summer. Prior melt analysis focuses on the timeframe from yield to enrollment rather than acceptance to matriculation.
This guide outlines the steps institutions can take to conduct a melt analysis with their Mentor Collective data.
You Have a Dashboard and Data Exports, Now What?
Suggestion 1: Set a meaningful target.
Use historical data from the institution to identify an average persistence rate for prospective students. If the mentoring program focuses on specific student subpopulations, calculate an average melt rate for each subpopulation as well. Given the institution's historic persistence rate, set a meaningful target for the persistence rate of the mentored population.
- What are your current persistence and melt rates for all students?
- How do these rates differ by subpopulation?
- What is your target persistence rate for mentored students?
- What range of persistence rates signals a successful mentoring intervention?
- What range of persistence rates signals a need for enhancing interventions?
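The benchmark questions above reduce to simple arithmetic once historical counts are in hand. Here is a minimal Python sketch; the counts and the three-point target are hypothetical placeholders, not real institutional data.

```python
# Hypothetical historical counts (replace with your institution's data).
deposited = 1200   # students who confirmed/deposited for the fall term
enrolled = 1020    # of those, students who actually enrolled

persistence_rate = enrolled / deposited   # share who enrolled after depositing
melt_rate = 1 - persistence_rate          # share lost between deposit and enrollment

# Example target: improve persistence by three percentage points for
# mentored students (the target itself is an assumption for illustration).
target_persistence = persistence_rate + 0.03

print(f"Persistence: {persistence_rate:.1%}, Melt: {melt_rate:.1%}")
```

Repeating this calculation per subpopulation gives the subgroup baselines the questions above ask about.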
Suggestion 2: Reflect on responses.
Reflect on student melt and term-to-term persistence responses for mentees, mentors, and demographic groups meaningful to your campus (e.g., by gender, race, and first-generation college student status).
- What do you notice overall about the melt rate?
- For mentees, are melt rates higher than your target? Lower than your target?
- For each demographic group, are melt rates higher or lower than your target?
- Where do you have opportunities to celebrate?
- Where do you have opportunities to provide interventions to improve enrollment?
Suggestion 3: Explore relationships between variables.
Explore the relationship between melt and other variables for students in your mentoring program, such as demographic data. Two examples are listed below:
Persistence and Participant Status
- For students who are mentored and matched: What is the most recent melt or term-to-term persistence score for participating students versus non-participating students?
- What differences or relationships do you notice?
- Are students with higher melt or term-to-term persistence scores also participants?
Persistence and Demographics
- For students who are mentored and matched: What is the most recent melt or term-to-term persistence score for participating students based on demographics, such as gender, versus non-participating students by the same demographic status?
- What differences or relationships do you notice?
- Do participating female students have lower melt scores than participating male students?
- Do participating female students have lower melt scores than non-participating students overall?
- Do participating male students have lower melt scores than non-participating male students?
Suggestion 4: Create an action plan.
Create an action plan for enhanced impact that includes necessary interventions, celebrates target achievement, and shares the program impact with key campus partners. Action plans are simple but effective tools which focus time and talent on using key data findings to inform 1-2 action strategies to enhance impact while simultaneously building in follow up to determine if the action strategies were effective. Below is an example of what an action plan could look like.
| Actionable data | Strategies | Target | Person Responsible | Due Date |
|---|---|---|---|---|
|
Mentored students are significantly more likely to enroll in the fall. |
Maintain our mentoring program to foster enrollment. Share findings with next year’s orientation, Provost’s Council, Student Affairs Leadership Team, and Student Organizations. |
Maintain our target. |
FGCS program team. Dean of Student Success |
5/30/2025
8/30/2024 |
Impact Analysis Steps
Step 1: Clearly identify the populations and the timeframe.
Start with a detailed description of which students were invited to participate in the mentoring program. For example, we invited transfer students with less than 45 credit hours to engage in the peer mentoring program. Defining the population prior to analysis, allows you to quickly determine how you will make meaning of the data once analysis is completed.
Next, define meaningful timeframes for analysis. For example, when talking about persistence, specify that the analysis includes students enrolled in fall of 2023 and spring of 2024. Retention could be defined as enrolled in fall of 2023 and enrolled in fall of 2024.
Step 2: Identify the relevant fields needed to conduct your analysis.
The fields needed to conduct the analysis strategies recommended above are provided in the table below. In many cases, you will need to pull fields from multiple sources, and having a clear list of fields with definitions (also known as a data dictionary) will help to gather the relevant information more easily.
A template for populating these fields is provided here. Columns on tab “Data Fields” that are denoted in blue (B, C, D) are from the Mentor Collective participant data export while the columns in yellow (A, E, F, G, I, J, K) would come from the student information system (SIS). Columns denoted in green (H) reflect an area where a calculation is necessary to conduct the analysis. Here’s how to export your participant data through your Partner Dashboard.
| Data Field | Data Location |
| Institution ID* |
Mentor Collective Participant Export and Student Information System |
| Role | Mentor Collective Participant Export |
| Program Status | Mentor Collective Participant Export |
| Institution-Provided Email | Student Information System |
| Next Term Confirmation/Deposit (Y/N) | Student Information System |
| Next Term Enrollment Status (Y/N) | Student Information System |
Meaningful Demographic Variables
|
Student Information System |
| Melt |
Calculated Data Follow steps in the video above to calculate. |
*The Institutional ID is what will serve as your unique, common identifier for each participant, necessary to merge your data in the next step. If you did not provide Institutional ID to Mentor Collective, you can alternatively use an institutional email address as the unique identifier. However, please note that email address is a less reliable field, as some students may have registered for the mentorship program under a different email than what is listed in your Student Information System.
Step 3: Obtain and merge data.
Because you pulled fields from multiple sources, you will now need to merge the data into one data source for analysis. This is possible by identifying a unique, common data point for each individual that exists in all data sources, and conducting a VLOOKUP or XLOOKUP to pull related data. The recommended common data points are: institutional email address and institution ID*.
Step 4: Create a pivot table.
Use your fields of interest in the template provided here and pictured below. Start with the basic counts of who is matched (MC Status 0/1) and build on this table to include a count of those who enrolled (Count of Melt N/0) and those who did not enroll (Count of Melt Y/1). Also shown below is an example of how to further break down melt counts with gender data as a demographic group of interest.
Step 5: Summarize the data and calculate rates.
Below is a screenshot of summary tables with rates calculated. In the melt guidelines spreadsheet, you have access to a summary tab with an example of a completed summary table comparing mentored students' melt rates to those who were not mentored but invited into a mentoring program.
Step 6: Articulate impact statements.
Bulleted below is an example of an impact statement that could be made with the data in the table above.
- Mentored students are significantly more likely to enroll in the fall than students who are not matched and mentored.
Now That You Have Your Exploratory Findings, What Can You What Can You Do?
Idea 1: Share the data!
Take your data on a road show by asking to present at committee meetings or standing meetings across campus (e.g., Provost’s council, President’s Cabinet, Student Government, Faculty Senate). For example, mentorship program directors at Florida Atlantic University collaborated with their institutional research office to identify how mentorship had made an impact for specific groups of students. They shared these findings with top leadership at the University, leading to increased and ongoing monetary support for their mentorship initiatives across the institution. Sharing data is one way to promote an institution-wide culture of mentorship.
Idea 2: Shift from sharing to collective problem solving.
Create a space for collective problem-solving or idea storming with those most closely connected to the data, including students. Host a data and doughnuts party and ask attendees to reflect on: "What do you see in this data?" and "How might we respond to these insights?" Augusta University has a committee with staff and faculty from across the institution to do this. Among their initiatives include working to ensure that mentorship is leveraged in the right ways in response to student needs.
Idea 3: Identify areas to celebrate.
Take the time to identify and highlight areas where your organization is making strides towards its goals. What does the data reveal about progress? What surprising insights are worthy of celebration?
Idea 4: Empower others with knowledge.
Who else might find this information meaningful? Consider reaching out to community partners, student clubs, or historically underrepresented groups to talk about the data. Sharing data with a wide range of collaborators can lead to fresh perspectives and more comprehensive decision-making. This is one way to ensure a participant-centered approach to mentorship.
Idea 5: Add narrative to the numbers.
Consider interviewing or conducting focus groups with mentees and mentors in order to identify which aspects of the mentoring program had the most impact on their enrollment.
Ciji Heiser, Ph.D.
Ciji Heiser, Ph.D., (she/her) is the Founder of Co-Creating Action, an award winning researcher and seasoned professional in education, evaluation, and strategic planning. She teaches antiracist methodologies at American University, contemporary issues in higher education at New England College, and Applying and Leading Assessment in Higher Education for the Student Affairs Assessment Leaders. She holds degrees from Bucknell University, Kent State University, and the University of North Carolina at Greensboro.
She has led assessment, evaluation, and strategic planning initiatives across sectors. Recently, she contributed to NSF grants supporting underrepresented students in STEM and conducted qualitative research on factors influencing students’ choices in pursuing postsecondary education. Currently, she collaborates on reducing racial disparities in prison diversion programs across Illinois and is researching how policies shape equal opportunity work across a state system of community colleges.
As a volunteer, Ciji co-leads the Grand Challenges in Higher Education plan, leveraging data to promote accessible and quality education. She also shared her expertise as a faculty member at the ACPA Assessment Institute and as past-chair of the Student Affairs Assessment Leaders.
Ciji developed the impact analysis guidelines, as well as the recommendations for using the analysis data to create institutional impact.
Comments
Please sign in to leave a comment.