First and foremost, these first-year administration results are just that: first-year results. Since its inception and throughout the multiple generations of Texas student assessments (i.e., TABS, TEAMS, TAAS, TAKS), the first round of results has always been perceived as dismal. Naturally, they appear worse because results under the prior assessment were so much better.
Nevertheless, our results are actually surprisingly high when you consider the context in which teachers and students performed. We began last school year with very limited information. We knew that STAAR was going to be more rigorous and more complex, that a time limit would be imposed for the first time, and that the performance standards would be higher than before.
We knew that students in grades 4 and 7 would be writing two compositions instead of one and that they would be limited to one page per composition. In addition, we changed the architecture of the exam with the introduction of supporting and readiness standards. We also had only a few sample items, released after the start of the school year, for each content area and grade level assessed to help align our instruction and prepare students. Despite these less-than-optimal circumstances, our teachers and students pushed ahead.
Now, we have fewer than 40 days of instruction before the next full round begins on April 1. In fact, the STAAR Alternate assessment window opened Jan. 7. As a system, we are faced with what to do now.
We know that our teachers are focused on purposeful, engaging instruction solidly grounded in the TEKS. These numbers provide the baseline for where we are starting with STAAR in grades 3-8. As you reflect on these baseline results, we offer the following guiding questions:
• How do these results align with what our benchmarks were telling us last year? The degree to which there is a match reflects yet another feedback loop for us this year.
• How do these results align with our intervention strategies? As with every school year, we put interventions in place to support our struggling learners. Now that we have state assessment results, these results can inform the effectiveness of our 2011-2012 interventions. Here we consider whether we saw any return on the investment with individual students or whether there is a need to make adjustments.
Still, we need additional information, such as released tests and items, followed by a deeper analysis of how students performed by reporting category and by performance data at the student-expectation level, to better inform our classroom instruction. This, along with examination of student work throughout the year, will help further inform effective professional development.
As administrators, we have a multi-dimensional challenge. We must inform our public on what these results mean and what they don’t mean. We cannot allow these results to define us; rather, they should inform and refine our craft. Teachers, students and parents will no doubt be disappointed; however, they should not be dismayed. We have many dedicated and experienced professionals in our district and schools committed and in service to students and their communities. Our teachers and our students have consistently risen to the challenge. Our performance data not only implies this, it predicts it.
Over the coming weeks, we will continue working with teachers and campus administrators to ensure that our professional development and technical assistance are informed by best practice and by district and student needs.