AzAIR 2026 IPEDS Update
Presenter
Stephnie Hopple, Maricopa Community Colleges
Using the Economic Hardship Index
The Economic Hardship Index is a tool designed to measure and compare the economic distress of communities. Developed to provide a comprehensive picture beyond simple income, it analyzes six key indicators: unemployment, dependency, education, income, housing, and poverty. While often used for studying how economic conditions affect health outcomes, it can also be used in higher education to both identify students in need of additional support services and measure institutional effectiveness.
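The index described above can be sketched in a few lines. In this minimal illustration (community names and indicator values are invented, not taken from the session), each of the six indicators is min-max rescaled to 0–100 across the communities being compared and then averaged with equal weight; in a real analysis, income would be inverted first so that higher values consistently mean greater hardship.

```python
# Illustrative hardship-index calculation: rescale each indicator to 0-100
# across the communities (min-max), then average the six with equal weight.
# All community profiles below are hypothetical.

INDICATORS = ["unemployment", "dependency", "education", "income", "housing", "poverty"]

communities = {
    "A": {"unemployment": 9.1, "dependency": 38.0, "education": 22.5,
          "income": 61.0, "housing": 4.8, "poverty": 18.2},
    "B": {"unemployment": 4.3, "dependency": 30.1, "education": 10.2,
          "income": 35.0, "housing": 2.1, "poverty": 9.5},
    "C": {"unemployment": 6.7, "dependency": 34.4, "education": 15.8,
          "income": 48.0, "housing": 3.3, "poverty": 13.0},
}

def hardship_index(communities):
    scores = {name: 0.0 for name in communities}
    for ind in INDICATORS:
        values = [c[ind] for c in communities.values()]
        lo, hi = min(values), max(values)
        for name, c in communities.items():
            # Min-max rescale this indicator to 0-100 and accumulate.
            scores[name] += 100 * (c[ind] - lo) / (hi - lo)
    # Equal-weight average across the six indicators.
    return {name: round(total / len(INDICATORS), 1) for name, total in scores.items()}

print(hardship_index(communities))
```

With these made-up profiles, community A (worst on every indicator) scores 100.0, B scores 0.0, and C falls in between, which is the comparative reading the index is designed to support.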
Presenters
Dustin Maroney, Central Arizona College
Rebecca Hougland, ZogoTech
Beyond the Dashboard: Aligning Data Silos for Institutional Analytics
Institutions generate countless reports and dashboards across multiple teams, often resulting in an incomplete data story due to disconnected systems, siloed datasets, and inconsistent KPI definitions.
This session explores a central question: How do we align our systems and definitions so everyone sees the same story?
Drawing from the experience at ASU Learning Enterprise, this presentation shares how we moved beyond simply building a dashboard to aligning data sources, standardizing KPI logic, and creating a unified foundation for institutional reporting.
Presenter
Vanshaj Gupta, Arizona State University
Dynamic Course Projections: Leveraging Data to Optimize Enrollment & Academic Planning
Effective enrollment management requires more than static projections built months before registration begins. Institutions increasingly need dynamic, data-informed approaches that integrate admissions pipelines, move-on and progression rates, historical course composition, and real-time enrollment trends to optimize course offerings and resource allocation.
This session presents a practical framework for developing both initial and continuously updated course enrollment projections. Participants will explore how to integrate admissions data (yield rates, student type), student progression metrics (retention, move-on rates), and historical course enrollment behavior (fill rates, majors, modality preferences) to create accurate, responsive forecasts.
The presentation will demonstrate how predictive modeling and dashboard-driven monitoring can support iterative decision-making throughout the active enrollment cycle. Attendees will learn strategies for identifying leading indicators of enrollment shifts, adjusting course capacity in real time, and aligning instructional staffing with evolving demand. Particular attention will be given to cross-functional collaboration among institutional leadership, academic departments, and enrollment management to ensure projections translate into actionable scheduling decisions.
By the end of the session, participants will be able to:
• Construct a multi-source data model to inform initial course enrollment projections.
• Incorporate move-on and progression analytics to anticipate downstream course demand.
• Develop processes for frequent projection updates using live enrollment and registration data.
• Apply trend analysis to optimize course offerings, reduce bottlenecks, and improve student progression.
• Align course planning decisions with broader enrollment and retention goals.
This session is designed for institutional research professionals, enrollment managers, academic planners, and data analysts seeking to enhance forecasting precision and improve institutional agility in course scheduling and enrollment optimization.
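The core of a continuously updated projection like the one described above can be reduced to a simple pace calculation. This sketch is illustrative only (the function, numbers, and weights are assumptions, not the presenters' model): if a course historically has 60% of its final enrollment registered at this point in the cycle, dividing current registrations by 0.60 projects the final count, and blending that with the pre-cycle static projection damps early-cycle noise.

```python
# Minimal pace-based course projection sketch; all figures are hypothetical.

def project_enrollment(current_regs, historical_pace, static_baseline, weight_on_pace):
    """Blend a live pace-based forecast with a pre-cycle static projection.

    historical_pace: share of final enrollment typically registered by now.
    weight_on_pace: trust placed in the live signal (typically grows as the
    registration cycle progresses and the pace estimate stabilizes).
    """
    pace_forecast = current_regs / historical_pace
    return weight_on_pace * pace_forecast + (1 - weight_on_pace) * static_baseline

# Example: 45 students registered for a course where 60% of final enrollment
# is typically booked by this point; the static plan projected 80 seats.
projection = project_enrollment(current_regs=45, historical_pace=0.60,
                                static_baseline=80, weight_on_pace=0.5)
print(round(projection, 1))  # paced forecast 75, blended with 80 -> 77.5
```

In practice the weight would shift toward the pace forecast as registration matures, and the historical pace would be estimated per course (or per modality and student type) from prior cycles.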
Presenters
Maria Willis, Arizona State University
Jason Bradshaw, Arizona State University
Four Pathways Retention ML Model: An Application to Community College Data
This research draws on four certificate programs: Addiction/Substance Abuse Level I, Accounting, Paralegal, and Programming. The longitudinal data set consisted of 535 student observations. A random forest model was built to predict student retention into the following year. Several models were compared on their performance metrics: accuracy, recall, precision, and F1-score, and the random forest model was selected based on those metrics. Model accuracy reached 84%, but more importantly, the false positive and false negative rates were acceptable. A REST API and a Streamlit front end were built for a model deployment pilot hosted on Hugging Face. Additional analysis is provided to facilitate interpretation of the marginal effects of the variables introduced in the model.
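The metric comparison described in the abstract boils down to a handful of ratios over a model's confusion counts. The sketch below uses invented counts (not the study's data) for a hypothetical holdout of 107 students, roughly a 20% split of 535, to show how accuracy, precision, recall, F1, and the false positive / false negative rates would be derived when choosing among candidate models.

```python
# Derive standard classification metrics from confusion counts on a holdout
# set. The counts below are hypothetical, chosen only to illustrate the
# comparison; "positive" means predicted/actual retention into the next year.

def classification_metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)     # of students predicted retained, how many were
    recall = tp / (tp + fn)        # of students actually retained, how many caught
    f1 = 2 * precision * recall / (precision + recall)
    fpr = fp / (fp + tn)           # non-retained students flagged as retained
    fnr = fn / (fn + tp)           # retained students the model missed
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "fpr": fpr, "fnr": fnr}

# Hypothetical holdout of 107 students.
m = classification_metrics(tp=60, fp=8, tn=30, fn=9)
print({k: round(v, 3) for k, v in m.items()})
```

A model with high accuracy but a lopsided false negative rate would miss exactly the at-risk students an intervention is meant to reach, which is why the abstract weighs those error rates alongside the headline accuracy figure.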
Presenters
Fermin Ornelas, Rio Salado College
Zach Lewis, Rio Salado College
Following the North Star: Designing Metrics that Guide Institutional Performance
Higher education institutions track a variety of operational and business data, yet many struggle to clearly define and translate “North Star” metrics that directly advance institutional effectiveness. The challenge is often not access to data, but rather aligning reporting structures with the measures that most meaningfully influence portfolio performance and learner success.
This session will explore how a structured metrics framework at ASU Learning Enterprise strengthened the connection between operational reporting and strategic decision-making and evolved the role of its analytics team from passive contributors to strategic partners.
Presenter
Pallavi Sharma, Arizona State University
Canvas of Success: The Art of Academic Assessment for Student Success
In the evolving landscape of education, effective assessment reporting is vital for illuminating student mastery of learning outcomes. This session will explore the enhanced academic course and program assessment processes implemented by an Arizona college, showcasing tools that drive continuous improvement toward student mastery. We will discuss best practices and successful strategies utilized, emphasizing methodologies that enhance assessment processes while fostering transparency and collaboration among faculty and staff. Participants will gain insight into assessment frameworks and tools ensuring alignment between learning outcomes, key assessments, and evidence-based decision-making. By sharing insights and experiences, we aim to inspire attendees to reflect on their own assessment practices, cultivating a culture of continuous improvement that benefits Arizona's diverse student population. Join us as we paint a picture of success through data-informed assessment, reinforcing our collective mission to provide students with quality education. Together, let’s embrace the art of assessment and enhance the educational journey for all learners.
Presenters
Charity Adams, Mohave College
Shelly Castaneda, Mohave College
Andrea Wange, Mohave College
From Submission to Completion and Beyond: Streamline Data Requests with Microsoft 365
Discover how Microsoft 365 can empower your team to automate and manage data request processes, from initial submission through final deliverables and post-completion management. This session will cover how to design a user-friendly form for submitting data requests, trigger automated workflows using Power Automate, and track every request through a SharePoint list across its full life cycle. Attendees will learn how to configure automated, dynamic email notifications that keep both requesters and assignees informed throughout the process. This session is valuable for professionals seeking to reduce manual effort, improve transparency, and enhance operational efficiency. By the end, participants will understand how to implement a scalable solution that transforms data request handling into a streamlined experience.
Presenter
Trevor Hart, Eastern Arizona College