The state’s Mastery Examination Committee heard about problems with SBAC from several education stakeholder groups Monday. The Connecticut State Department of Education, the Connecticut Association of Schools (CAS), and the Connecticut Association of Boards of Education (CABE) presented results of surveys they conducted earlier this year gathering reactions and feedback on SBAC administration. The findings covered everything from student anxiety and stress to technical issues and excessive test prep. You can read the survey results here.
Fewer than one hundred principals, largely from suburban districts, responded to the CAS survey. Patrice McCarthy of CABE said her organization’s survey was limited in scope, but added that CABE has had many recent conversations with school board members.
Members of the committee agreed that the information presented should be regarded as anecdotal rather than scientific.
Donald Williams, CEA director of Policy, Research, and Reform, urged the committee to look hard at any information it receives for trends that positively or negatively affect high-quality teaching and learning. CEA has gathered information from teachers across the state about the detrimental impact of SBAC on students and learning, and Williams and Windsor teacher Marcia Ferreira will present CEA’s findings at the committee’s next meeting on January 11, 2016.
Dr. Stephen Hegedus of SCSU said some issues require a “deeper dive” by the committee, citing accommodations for special education students and students with limited English proficiency as examples. He pointed out that if Connecticut’s urban centers had been better represented in the surveys, the committee would likely have heard a different response Monday than it did from the largely suburban principal survey.
Hegedus continued, “From what I have heard, it is not just special education and English language learners whom we need to be concerned about. It is the modality of the SBAC test—the use of a computer versus a pencil and paper test.”
Hegedus urged the committee not to lose its concern about students’ anxiety over SBAC. “Students were crying. That is the reality. I don’t think we should displace that. What are the kinds of anxieties going on?” he said.
Ferreira expressed concerns about technology problems associated with SBAC implementation and about disparities between wealthy and economically challenged districts across the state.
“It’s not just about the money, but also the manpower it takes to get technology in place. The amount of people and time it takes to just load apps onto iPads for students to use and then to maintain and fix the technology on a day-to-day basis causes a wider gap between the haves and have-nots,” said Ferreira.
The usefulness of the SBAC results is another concern. Abe Krisst of the SDE said, “We have heard from practitioners about the usefulness of the results.” He told the group that the SBAC system includes more than an end-of-year summative test. According to Krisst, districts are starting to use the interim assessments in the SBAC system. He said, “There are warts, but we are promoting these interim assessments to be richer, more diagnostic.”
SBAC has been criticized for not providing adequate, actionable information to teachers. SDE officials said Monday that providing such information is not the intent of the SBAC testing program.
Kathy Greider, superintendent of the Farmington schools, said, “In the lack of detail, you start to look within.” She suggested that local school districts will evaluate their interim and daily assessments to build more detailed local systems for assessing student success.
Mary Anne Butler of the SDE told the committee that the state needs a one-page summary explaining what the state test (SBAC) is and is not intended to do.
Ferreira said a big concern for teachers is the wide gap between instruction and assessment in districts across the state.
“Teachers need a direct link between high-quality curriculum and instruction and then the assessment they are using to inform that instruction. That can be a mixed bag across the state because curriculum takes a long time to fix. It takes a lot of expense. It takes a lot of work and districts can be all over the place. You really can’t adjust instruction if you are still working on curriculum,” said Ferreira.
Monday’s meeting opened with State Education Commissioner Dianna Wentzell acknowledging that the U.S. Congress passed, and President Obama signed, the new Every Student Succeeds Act earlier this month. Wentzell said she expects more guidance in the next few weeks from the U.S. Department of Education on implementing the act, which in general transfers more responsibility to the states.
“We welcome this new approach,” said Wentzell. “We are ready for that.”
Many thanks to BLOGCEA for keeping the membership apprised of the workings of the Mastery Examination Task Force. Unfortunately, the legislature-driven list of education stakeholders on this committee is clearly slanted toward past practice and maintaining the status quo, and, even though the recent federal ESSA legislation is supposed to give decision-making power back to the states, it will probably be difficult for the majority of stakeholders on the Task Force to break away from their pro-CCSS advocacy and its aligned, controversial, and flawed testing protocol. I would urge members to check out the links provided in BLOGCEA, as they draw a sharp contrast with the highly professional, comprehensive CEA Teacher Survey, which was undertaken last summer and had over thirty thousand respondents. The “Results of SBAC Implementation Survey” will be reported at the next meeting of the METF in January; good luck to Ms. Ferreira (one of only two practicing classroom teachers on the Task Force) and Mr. Williams, the former state legislator who is now the CEA director of Policy, Research, and Reform.
I can only hope that our membership will call upon our CEA leaders to get into the details of these woefully inadequate, admittedly nonscientific dumps of information, designed to appear official but sadly lacking in professional integrity. The links provided by CSDE are billed as “nonscientific” and include SBAC-developed surveys: the Administrative Survey is really a Smarter Balanced Proctor Survey Report completed by an unreported number of district administrators; the Smarter Balanced Field Test Student Survey was provided by the testing company and had 1,800 respondents (less than one percent of the students who took the test in Connecticut); and the Superintendent’s Survey was specific to the Smarter Balanced testing calendar, and with only 79 respondents listed (out of a possible 169 school districts), it is easy to understand why the information provided by the CSDE should be viewed with caution. The seven-question CAS Feedback Survey of Principals (which appeared to have been developed by the CAS Task Force members) gave building principals fifteen days to respond and did not indicate how many did. CABE’s “Feedback for the Mastery Examination Committee” appeared to summarize a wide range of district demographics across the state while drawing conclusions without making the supporting evidence available. CABE also suggested that the METF look at a report from the National School Boards Association’s Council of Urban Schools, “Must-Know Information for Urban School Board Members,” dated April 2014, a full year before the SBAC test experiment came to an end for the students in our state.