Updated 01/02/2026
Low populations (n size) or small denominators: Users should pay particular attention to the population (n size) or the denominator when making comparisons and interpreting outcomes. When displayed results are based on only a few students, they can fluctuate widely and may not reflect long-term or typical performance; use caution when analyzing and drawing conclusions from such results.
For the two Comparison views, denominators are displayed on the vertical axis in brackets (n = x) next to the chart labels and in the hover-over. Displaying the data as three-year averages helps mitigate this low-population issue, since the weighted three-year averages are calculated only when denominators are present for all three years, which increases the number of students included. For further information on the lack of deduplication of these students in the three-year averages, please see the Methodologies section on the Overview & Methodologies page. Denominators are not displayed on the Explore Time Trends & Equity view; however, the three-year time trend display allows viewers to spot swings in the data over that period that could be related to low n size.
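As a rough illustration of the rule described above, the following sketch shows one way a weighted three-year average could work: yearly counts are pooled (without deduplication) and the average is suppressed whenever any year's denominator is missing. The function name and the completion-count example are hypothetical, not the dashboard's actual implementation.

```python
def weighted_three_year_average(numerators, denominators):
    """Hypothetical sketch: pooled rate across three years.

    Returns None (i.e., no average is shown) if any year's
    denominator is missing, mirroring the rule that the weighted
    average is calculated only when denominators are present for
    all three years. Students are pooled, not deduplicated.
    """
    if len(numerators) != 3 or len(denominators) != 3:
        raise ValueError("expected exactly three years of data")
    if any(d is None or d == 0 for d in denominators):
        return None  # a year lacks a denominator: suppress the average
    # Weighted average = total outcomes / total students across the 3 years
    return sum(numerators) / sum(denominators)

# Illustrative only: 12, 15, and 9 completers out of 20, 25, and 18 students
rate = weighted_three_year_average([12, 15, 9], [20, 25, 18])  # 36 / 63
```

Pooling the denominators this way gives larger-n years more weight than a simple average of the three yearly rates would, which is why it dampens the volatility of small populations.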
Note: All denominators for all metrics, all years, all programs, and all student populations included are available in the CSV download.
Useful Data: Any dashboard displaying student outcome metrics will have data caveats that users should take into consideration when using the data for analysis. Data caveats for these dashboards can be found on the Data Caveats page. Information from the three data views can help shine a light on persistent equity gaps in achievement for historically marginalized student populations and show where further exploration may be necessary. To strengthen understanding and interpretation of the information provided, users are encouraged to connect with their Institutional Research Office or Centers of Excellence for Labor Market Research, explore other publicly or privately available quantitative and qualitative data sources, or conduct their own surveys of students and other stakeholders.
Comparisons: Comparing a program or college to others in the region, along with benchmarking against the region and the state, places a program or college in context and, ideally, encourages sharing and collaboration. For example, viewing the range of Median Annual Earnings for a TOP code or sector for a student population across all LA colleges, and comparing it to the regional and statewide earnings for that population, can inspire reflection and conversation among faculty, program leads, counselors, deans, and other regional stakeholders. Programs at one college may differ greatly from programs with the same TOP4 or TOP6 code at another college in terms of Student Learning Outcomes (SLOs) and the skills students attain. Student supports may also differ greatly across colleges. Nevertheless, exploring why earnings for a student population at one college differ significantly from those for the same population at other colleges in the region could lead to revelations about the value of the skills attained, or of the career, counseling, or other support services provided for that student population.
Everyone has a role to play in student success, and these three data views are intended to support a shared understanding of both successes and possible areas to explore further to improve student outcomes for California community college students in the LA region.