In practice, more than one year of prior test scores is used in SGP models. In cases where multiple years of prior test scores are used, students are compared with their peers who have the same combination of prior-year test scores.
Some policymakers prefer SGPs over another popular statistical measure of teacher effectiveness, value-added models (VAMs), because they view SGPs as more interpretable. Ranking students of similar baseline academic performance by how much they grew over the year is intuitive for policymakers, practitioners, and the community. Additionally, taking the mean or median of student-level SGPs to obtain a teacher-level score is mathematically simple and retains the interpretability of SGPs.
Although the calculations behind them are sophisticated, SGPs present growth information in percentile terms that are familiar to most teachers and parents, providing a clear indicator of progress for each student.
Unlike VAMs, SGPs typically do not adjust for differences in student characteristics beyond prior achievement, such as income or special education status. This is one reason why SGPs sometimes perform more poorly than VAMs when students are not randomly assigned to classrooms. In other words, teachers who teach more disadvantaged students or students in specialized programs tend to have lower SGP scores than their peers. Although this pattern is also seen in VAMs, which do control for student characteristics, the relationship is often weaker.
Student Growth Percentiles (SGPs) represent one powerful way to quantify the learning of individual students over one or more years. Conceptually, SGPs communicate the degree to which a student has learned in a particular domain, compared to a group of academic peers who had a comparable score on the previous test (or multiple previous tests) in that subject. To calculate SGPs, students are grouped with academic peers throughout the state who had comparable score patterns on past tests. Students in each academic peer group are then ordered based on their score on the current year's test. Each student then receives a percentile rank compared to their academic peers. Like other percentile scores, SGPs range from 1 to 99, where an SGP of 50 indicates that the student demonstrated growth in the content area equal to or greater than half of the students with comparable score histories on that subject-matter test.
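The grouping and ranking steps above can be sketched in a few lines of Python. This is a simplified illustration, not the operational state model: it treats students with an identical single prior score as a peer group, whereas real SGPs condition on multi-year score histories via quantile regression, and the tie-handling and rounding rules below are assumptions made for the sketch.

```python
from collections import defaultdict

def peer_group_sgps(records):
    """records: list of (prior_score, current_score) pairs, one per student.

    Groups students by identical prior score (a stand-in for the
    'comparable score history' peer groups described above), then assigns
    each student a percentile rank within the group, clamped to 1-99.
    """
    # Step 1: form peer groups from prior scores
    groups = defaultdict(list)
    for prior, current in records:
        groups[prior].append(current)

    # Step 2: rank each student's current score within their peer group
    sgps = []
    for prior, current in records:
        peers = groups[prior]
        below = sum(1 for score in peers if score < current)
        pct = round(100 * below / len(peers))
        sgps.append(min(99, max(1, pct)))  # keep within the 1-99 range
    return sgps
```

For example, three students who all scored 500 last year are ranked only against one another this year; a student with no true peers in the sample ends up at the floor of the range, which is one reason real models pool information across score histories rather than requiring exact matches.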
A student growth percentile (SGP) describes a student's growth compared to other students with similar prior test scores (their academic peers). Although the calculations for SGPs are complex, percentiles are a familiar method of measuring students in comparison to their peers.
The student growth percentile allows us to fairly compare students who enter school at different levels. It also demonstrates a student's growth and academic progress, even if she is not yet meeting standard.
A student growth percentile is a number between 1 and 99. If a student has an SGP of 85, we can say that she showed more growth than 85 percent of her academic peers. A student with a low score on a state assessment can show high growth and a student with a high score can demonstrate low growth. Similarly, two students with very different scale scores can have the same SGP.
Student growth percentiles are measured by using a statistical method called quantile regression that describes the relationship between students' previous scores and their current year's scores. For more discussion of the SGP model, please see the technical resources on the Student Growth School and District Resources webpage.
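A minimal sketch of the idea behind quantile regression: instead of minimizing squared error (which estimates a conditional mean), it minimizes the "pinball" (check) loss, which estimates a conditional percentile. The toy code below is not the operational model (which fits quantile regressions over full prior-score histories); it only demonstrates, by brute force over the observed scores, that the constant minimizing average pinball loss at level tau is an empirical tau-quantile of the data.

```python
def pinball_loss(tau, predicted, actual):
    """Check (pinball) loss: under-predictions are weighted by tau,
    over-predictions by (1 - tau)."""
    if actual >= predicted:
        return tau * (actual - predicted)
    return (1 - tau) * (predicted - actual)

def best_constant(tau, scores):
    """The score minimizing total pinball loss at level tau,
    searched by brute force over the observed scores themselves."""
    return min(scores,
               key=lambda c: sum(pinball_loss(tau, c, s) for s in scores))
```

With scores [480, 500, 510, 530, 590], tau = 0.5 recovers the median (510) and tau = 0.75 recovers the upper quartile (530); conditioning this same loss on prior scores is what lets the model locate each student's current score among those of their academic peers.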
For SGPs, a student is compared to his/her academic peers. A student's "academic peers" are all students in Washington State in the same grade and assessment subject that had statistically similar scores in previous years. In other words, they are students that have followed a similar assessment score path. Students are only compared to others based on their score history, not on any other characteristics, such as demographics or program participation. A student's growth percentile represents how much a student grew in comparison to these academic peers.
The median growth percentile summarizes student growth percentiles by district, school, grade level, or other group of interest. The median is calculated by ordering individual student growth percentiles from lowest to highest, and identifying the middle score, which is the median. The median may not be as familiar to people as the average, but it is similar in interpretation - it summarizes the group in a single number that is fairly calculated to reflect the group as a whole. (Medians are more appropriate to use than averages when summarizing a collection of percentile scores.)
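The median calculation described above is straightforward; a hypothetical example in Python using the standard library:

```python
from statistics import mean, median

# SGPs for a hypothetical group of five students
group_sgps = [5, 40, 55, 60, 95]

mgp = median(group_sgps)  # sort, take the middle value -> 55
avg = mean(group_sgps)    # 51 -- pulled toward the extreme scores
```

The median (55) and mean (51) differ here because the two extreme percentiles tug on the average; since percentile scores are ranks rather than an equal-interval scale, the median is the more appropriate summary.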
At the state level, median SGPs are almost always 50, since norms are usually established using student scores from only the current year: half of the state's students have growth below 50 and half above. In certain instances, statewide median SGPs may differ from 50 due to slight model misfit, the assignment of Highest Obtainable Scale Score (HOSS) students to an SGP of 99, or the use of a baseline method for calculating SGPs.
Yes. Students that typically have high scores on state assessments will be compared to all other students in the state that also have high scores. If a student receives the Highest Obtainable Scale Score (HOSS), the student will receive an SGP of 99 in that year. The data show that even students that score at the top of the scale will have varied performance the next year, so the model allows us to identify growth for students at the upper end of the scale.
The students included in the student growth percentile calculations are those that attend public school and took a state assessment during the spring administration. Certain test types and categories of students are excluded from this comparison group. Only students that have at least two years of consecutive scores are included. For example, if a student has a score in 5th grade, but not in 6th grade, she would not be included in the analysis.
All available scores are used in the model, as long as they are consecutive. Washington's student growth percentiles are calculated using assessment data beginning in 2005-06. All students in the state that have valid and consecutive test scores in the same subject and grade form the norming population for the calculation of the SGPs.
Although the table lists the testing grade of students that would receive a student growth percentile, these students are now most likely in the next higher grade. SGPs will not be calculated for Science, Writing, EOC Biology, or EOC Math.
Student growth percentiles are primarily a descriptive model, describing how much growth a student has made over the last year. This growth model is not a value-added model; it does not attempt to isolate a teacher or school effect on student learning. SGPs can, however, help answer the following questions (Yen, 2007):
It is at the discretion of Washington school districts whether or not to distribute student growth reports to families and students. OSPI recognizes that the model is complex, and, given other competing initiatives, investing the necessary time and energy into training on SGPs may be a lower priority.
Washington State student growth percentiles were developed by Damian Betebenner of the Center for Assessment (NCIEA). They were first developed in Colorado for use in its accountability framework in 2007.
Recognizing that student growth percentiles are a complex method of assessing student growth, OSPI is very interested in hearing your questions. We look forward to continued communication. Please email your questions and feedback to Student Information.
The Student Growth Percentile (SGP) model, developed by Dr. Damian Betebenner, provides growth projections based on longitudinal SGP data, including Betebenner's well-known catch-up and keep-up growth projections.
We provide this information in two formats: Window Specific SGP (see the next section), whose purpose is to compare or report student growth between specific time frames, and Current SGP, whose purpose is to provide the most current SGP scores available for a student as a quick check-in for progression of growth.
Current SGP is calculated for students who have taken at least two tests within different testing windows. It uses the most recent test (within the past 18 months) and at least one prior test from an earlier testing window (Fall, Winter, or Spring; the dates for these windows are static and need not correspond to a school or district's school year). The assessments used to calculate SGP are pictured in the table below. Additional rules are considered if there is more than one assessment in a prior window.
SGP utilizes the historical growth trajectories of Star examinees to map out the range of potential growth trajectories for each student, including the growth necessary for each student to reach or maintain proficiency. SGP is updated regularly so that students' projections are based on the most recent data available.
Window Specific SGPs are determined (during report customization) by selecting a current or prior school year as well as the SGP window time frames that you want to view growth for. The tests used to determine Window Specific SGPs are based on the following rules:
SGPs should be reviewed by educators, data teams, and administrators to identify areas of success and areas for improvement. DPI uses mean SGP data in the federal ESSA accountability system. SGPs are not used in the school and district report cards, because state law requires report cards to use a different growth measure (value-added growth).