“We’re Making The Data Relevant And Exciting To Teachers”: How Local Partnerships Are Transforming Assessment
September 1, 2017
Let’s be honest: it can be difficult to get excited about another data tracker. Corps members track student data in myriad ways to comply with district and school standards as well as Teach For America’s. This is true for several reasons. At the district level, the data draws on district-provided assessments to predict how students will perform in state-tested subject areas. Corps members’ support structures also require these trackers in order to help interpret and apply student assessment data in real time. And real time is what’s required for this data to be relevant, not least so that corps members can be reflective teachers.
This was true for Ethan Smith (‘10) during his corps commitment, as well as during his tenure as a manager of teacher leadership development (MTLD, or coach) and a math specialist with Teach For America-Mississippi. Though he’d always been passionate about seeing One Day on the horizon for Mississippi math classrooms, he admits that his field of vision was limited from his seat at the TFA table alone. That’s why he enrolled at the University of Mississippi in a program to earn his master’s in secondary education with a math focus. “I gained this tremendous awareness of the larger math community in Mississippi,” says Smith, reflecting on how, through this program, he was able to learn from other rural education specialists at the state and even national level. His membership in organizations like the Mississippi Association of Mathematics Teacher Educators (MAMTE) gave Smith a lens into the challenges and wins of math classrooms beyond his own experience, and encouraged him to engage in broader conversations with external stakeholders about math curriculum and assessment. Yet this external perspective also gave Ethan pause about how well TFA’s model was keeping up, particularly in terms of student proficiency versus growth.
So, when one of his corps members, Dylan Jones (‘15), approached him with questions about how to use assessment data to get his students invested in their progress, Smith was on board for making a change. Jones believed in TFA’s dedication to data, but he had been noticing a gap in the treatment of benchmark data, which seemed misaligned to assessment scores where his students were concerned, and he was already working with data leads in Sunflower County Consolidated School District to suss out the gaps in their data. “It’s true, as a corps member and just simply as a teacher, our methods of data analysis could get complicated,” Jones says. “What we needed was an easy, fun, color-coded, fully-automated way to identify the gaps we should address.” In short, TFA-MS needed a tool corps members and their classrooms would be excited to use. So, together, Smith and Jones built it.
They started with a deep dive into the Mississippi Academic Assessment Program (MAAP) accountability model, which outlines five levels of student proficiency: minimal, basic, passing, proficient, and advanced. Then, Smith created a spreadsheet-based tracker with an algorithm to predict what students would score on their state assessments, so that all teachers had to do was enter their students’ assessment scores from the previous year and plug in their most recent practice test scores. From there, the tracker would immediately populate with a prediction of scores, color-coded by level and growth.
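The article does not publish the tracker’s actual formula or MAAP’s cut scores, but the workflow it describes can be sketched in a few lines. Everything below is illustrative: the cut scores, the blending weight, and the function names are assumptions, not the real tool.

```python
# Illustrative sketch of a tracker like the one described above. The real tool
# is a spreadsheet; these cut scores and the blending weight are assumed, not
# the actual MAAP values or Smith's algorithm.

LEVELS = ["Minimal", "Basic", "Passing", "Proficient", "Advanced"]
CUTS = [0, 40, 55, 70, 85]  # hypothetical cut scores on a 0-100 scale

def level_for(score):
    """Map a numeric score to its proficiency level."""
    result = LEVELS[0]
    for cut, level in zip(CUTS, LEVELS):
        if score >= cut:
            result = level
    return result

def predict(prior_year_score, practice_score, weight=0.6):
    """Blend last year's state score with the latest practice test score
    (the 60/40 weighting here is an assumption for illustration)."""
    predicted = weight * practice_score + (1 - weight) * prior_year_score
    prior_level = level_for(prior_year_score)
    new_level = level_for(predicted)
    return {
        "predicted_score": round(predicted, 1),
        "predicted_level": new_level,
        # positive growth = moved up a level; this drives the color coding
        "growth": LEVELS.index(new_level) - LEVELS.index(prior_level),
    }

# A student who scored 52 last year and 68 on the latest practice test:
print(predict(52, 68))
```

In a spreadsheet, the same logic would be a lookup against the cut-score table plus a weighted-average column, with conditional formatting supplying the color coding.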
Once they had a working product, Jones immediately tested it in the classroom. Having the tracker in hand “made the data conversations quantifiable, and gave me the confidence to make the data conversations quantifiable for students,” Jones says. “In my school, it was revolutionary.” As his content specialist, Smith’s close collaboration with Jones illuminated the far-reaching potential of their product.
Jones was instrumental in quickly making the tool useful not only to his students and peers but also to his district as a whole, since it could be used not just for math assessments but for any state-tested content. Where the density of the model’s language had made it inaccessible to most, districts now had a holistic view of student proficiency and growth. By doing the work of interpreting the model, Jones says, the tracker’s prediction of assessment results promised a time- and cost-effective solution with major impact. “I started leading PD [professional development] sessions with principals on the accountability model using the tracker,” Jones says. His leadership resulted in a position at the end of his two-year commitment: Data and Information Technology Specialist for Sunflower County Consolidated School District.
Soon after Jones’ successful introduction of the tool, word spread. Sarah Trimm, an eleven-year veteran teacher and an MTLD for corps members in Clarksdale Municipal School District, brought the tracker to her corps members and their administrators to similar acclaim. “I liked the concept,” says Liz George, principal of George H. Oliver Elementary School. “It was beneficial in that it gave me a real idea about how my students were doing.” Besides, districts that would ordinarily have paid a consultant for this kind of information, which may or may not have been accurate, could now use the tool for free through their partnership with TFA.
Yet the tracker did not always bring happy tidings. “When they first got it, initially they were very disappointed in the data,” says Trimm. “But I saw principals start to play with the numbers: ‘What if we moved the bottom 20 percent of students ‘x’ amount of points, then where will we fall? What if we focus on this after-school group?’ It was motivation, because it showed them what was possible.” Clarksdale administrators began reaching out to Smith for professional development regarding the data they were seeing. What followed were deeper discussions about the possibility that their assessments, and assessment vendors, had been providing inaccurate data, full of “false positives.” Soon, using their own data, classrooms went from an F to a D or C predicted rating within a nine-week period. “We’re making data relevant and exciting for teachers,” says Smith. “It goes to show that when all these people are working together, magical things can happen.”
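The what-if question Trimm describes, moving the bottom 20 percent of students by some number of points and seeing where the school would land, is a small computation on top of the tracker’s predictions. As a hedged sketch (the cut scores and score data below are invented for illustration, not Clarksdale’s):

```python
# Hypothetical what-if analysis like the one principals ran: boost the
# lowest-scoring fraction of students and recount the level distribution.
# The cut scores and sample scores are illustrative, not real MAAP values.

from collections import Counter

LEVELS = ["Minimal", "Basic", "Passing", "Proficient", "Advanced"]
CUTS = [0, 40, 55, 70, 85]  # assumed cut scores on a 0-100 scale

def level_for(score):
    """Map a numeric score to its proficiency level."""
    result = LEVELS[0]
    for cut, level in zip(CUTS, LEVELS):
        if score >= cut:
            result = level
    return result

def what_if(scores, boost, bottom_fraction=0.2):
    """Raise the lowest-scoring fraction of students by `boost` points
    and return the resulting count of students at each level."""
    ranked = sorted(scores)
    cutoff = max(1, int(len(ranked) * bottom_fraction))
    adjusted = [s + boost for s in ranked[:cutoff]] + ranked[cutoff:]
    return Counter(level_for(s) for s in adjusted)

# Ten invented student scores; move the bottom 20% up 12 points:
scores = [30, 38, 45, 52, 58, 61, 66, 72, 80, 90]
print(what_if(scores, boost=12))
```

Running the same function with different boosts or focus groups is exactly the kind of scenario planning the principals describe, showing what becomes possible before committing to an after-school intervention.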
But the work is far from over, and, like any successful tool, the tracker will need to remain nimble to keep up with state- and district-level changes. Fortunately, it’s equipped to do so. When changes were made to the accountability model in the middle of the academic year, Smith quickly adjusted the tracker’s algorithm, and Jones ensured that it would still be usable in his district. After just one year of use, it already promises a “huge win for Delta area schools,” as Jones puts it. With dedicated educators like these working on behalf of kids in Mississippi, such a tool will surely continue to sharpen what proficiency and growth really mean for our students.