State-level RISE testing results are strikingly similar to previous years’ SAGE test data, according to state and national assessment experts, but local officials point to “red flags” in some school and district-level data.
The Utah State Board of Education voted unanimously Thursday to direct its staff to ask the Utah Legislature for “flexibility from the state accountability system,” which could mean no assignment of letter grades on upcoming school report cards.
Current Utah law requires that state education officials assign letter grades to schools on state report cards, and Democratic-led efforts to eliminate letter grades as part of the state accountability system have failed the past two legislative sessions.
However, SB220, Student Assessment and School Accountability Amendments, passed by the Utah Legislature in 2017, may give the State School Board latitude to forgo letter grades for the 2018-19 academic year.
The legislation says, in part, that in a school year in which the State School Board determines it is necessary to establish a new testing baseline for measuring student growth due to a transition to a new assessment, “the board is not required to assign an overall rating … to a school to which the new baseline applies.”
The state board also directed staff to add a disclaimer on the Utah School Report Card website that notes interruptions in RISE testing last spring, along with links to reports on three analyses of the data.
RISE stands for Readiness, Improvement, Success and Empowerment; the State School Board selected it as a replacement for SAGE testing, short for Student Assessment of Growth and Excellence.
While individual classrooms and schools experience some challenges with assessment each year, the state board experienced “systemwide, high visibility interruptions of service” in RISE testing’s inaugural year.
Assistant State Superintendent of Student Learning Darin Nielsen presented the statewide results to the board, along with an explanation of three separate analyses of the data and a recommendation that the State School Board move ahead with accountability calculations for 2018-19.
He acknowledged school districts and charter schools had expressed “an uneasy feeling” regarding the test results after testing interruptions in the spring.
But multiple analyses indicated the statewide data was highly similar to SAGE testing data from previous years. One important difference was that fewer students opted out of RISE testing compared to SAGE. Statewide, there was nearly 95% participation in RISE testing, Nielsen said.
“The more data we have in [the] system, the more opportunity for the validity of the data,” Nielsen said.
Scott Marion, executive director of the New Hampshire-based Center for Assessment, which reviewed the RISE results under contract with the State School Board, told the board that its analysis found little difference in mean outcomes between students whose test experiences were interrupted and those who experienced no disruptions.
Marion said growth percentiles between SAGE and RISE testing were also “quite stable.”
One notable difference was that the RISE test had no writing assessment, which typically means higher language arts scores for male students because girls tend to be better writers, he said.