A 21-member group tasked with reviewing the appropriateness of Connecticut’s statewide mastery examination reviewed a draft report today prepared by the State Department of Education (SDE) outlining several findings and recommendations.
Unfortunately, the Mastery Examination Committee’s report fails to address substantial concerns raised by CEA Director of Policy, Research, and Reform Donald Williams and other committee members over the 15-month period during which the group met.
Key objections center on:
- the reliability of the Smarter Balanced Assessment Consortium (SBAC) test as a measure of student growth year over year (SBAC is administered online to students in grades 3–8 in the subjects of math and English language arts)
- the misuse of the test for purposes it was not designed for, such as teacher and administrator evaluation
- the discriminatory computer-adaptive format that disadvantages students in high-needs districts with limited access to technology
“Recent expert research has raised significant questions regarding the test’s validity, reliability, and fairness, and unintended consequences that harm students,” Williams said at the November 30 meeting, referring to various studies and analyses that had been shared with the committee in recent months. “Those findings,” he noted, “are not reflected in this report.”
While the SDE conceded that computer fluency plays a role in SBAC outcomes—raising social justice and test validity issues—it stopped short of addressing those concerns in its final report.
Although SBAC was not designed, intended, or validated as a measure of teacher effectiveness, the Mastery Examination Committee’s report states that aggregate results from the test can be used to inform educator evaluation.
“The SBAC test was not designed or validated for teacher evaluation,” said Williams, “and it is not appropriate to use it for that purpose. That should be straightforward and obvious.”
Williams and AFT-CT Vice President Patti Fusco pressed the group to state explicitly in the final report that SBAC should not be used in teacher evaluations, just as the SDE had acknowledged that SBAC is not valid for informing classroom instruction. The group declined to make that change, instead suggesting that the state’s Performance Evaluation Advisory Council (PEAC) take up the matter. (PEAC’s next meeting is December 15.)
Williams also warned about other potentially damaging high-stakes uses of SBAC—including as a variable in administrator evaluations and numerical school rankings—and the risk of driving instruction toward test prep instead of well-rounded academic standards. “There are already test-prep products being marketed for SBAC, and the test prep industry for the SAT is well known,” he said.
“If you go back to No Child Left Behind, it went terribly wrong, and that wasn’t the intention of those who created it. The law had unintended consequences. We must look at other states’ approaches,” he added, pointing out that of the 31 states that initially adopted SBAC, only 14 continue to use the test. “We should ask, ‘Why did they stop using it, and what are they doing instead? How is it working?’”
In the coming weeks, CEA and the American Federation of Teachers will issue a minority report that addresses these and other shortfalls of the Mastery Examination Committee’s final report. Both reports will be provided to state legislators by January 15, 2017.