Teachers look for solutions. That’s what we do. But data collection often fails when it comes to solutions. You can collect all the information in the world, but if you don’t know how to use it, or it can’t be translated into something meaningful, the information is worthless.
That’s how I feel about yesterday’s article in U.S. News & World Report, “Where Poor Students Are Top of the Class,” about students who are top test takers in schools along the Rio Grande in Texas. Poor students there are outperforming students from high-income households. Ninety-five percent of students are poor and 33 percent are still learning English. They also boast of a 90 percent graduation rate. The report looks at three school districts: McAllen, El Paso, and Brownsville.
When I see an article that says poor students are doing well in school, I want to know why. How are educators, parents, and these communities making this happen? What are they doing differently that other school districts can learn from?
The report, called the Education Equality Index, comes from researchers at Great Schools and Education Cities, two nonprofits backed by the Bill & Melinda Gates Foundation, the Waltons, and other foundations. It says little as to why students are doing better.
In fact, here is what’s stated in the article: “The dataset doesn’t help explain why certain cities are doing a better job preparing low-income students, and researchers don’t attempt to answer the question.”
Well, why not? What’s the point of the data if we can’t figure out from it what worked? Teachers and school districts have always been able to report how their districts did on tests. Learning that students get good test scores doesn’t mean much if we don’t know what was done to help students do well.
They go on to say: “Such school districts typically rely on collaborations with nonprofits and businesses to provide the types of supports the schools cannot afford on their own.” I guess they are taking credit. But that still doesn’t tell us what those businesses did to help students do better on the tests! Perhaps it was supportive parents. Maybe teachers did a great job teaching.
They also say: “Researchers hope policymakers will look to the cities where poor students perform the best to pinpoint what other school districts with similar profiles can replicate.” But how will they determine what really worked? What kind of bias will be involved? Will they pick their pet projects even though there’s no proof they work?
What’s missing? Information that gets to the heart of what reform should really be about: finding out what works! For all the micromanaging these foundations do, all the money they spend on data collection, and all the time and resources teachers spend putting together and examining data walls, finding solutions for teaching poor students seems to escape everyone involved.
The superintendent of Brownsville says they “work hard,” and the principals are “great.” But most superintendents of poor school districts will tell you the same thing! She also says, “The understanding of data is critical. You must be able to desegregate [sic] the data early enough to know what the kids need. Early.” (Disaggregate, presumably.) So where’s the understanding part of the data?
Here is what I was able to extract from the article, which might lend a clue to the real reasons poor students are doing well—but the data doesn’t tell us this. It leaves out the most important details.
- Students attend Head Start.
- Head Start professionals get training.
- Many teachers are “homegrown” and understand the child’s culture.
- They have wraparound programs.
- Teachers may use particular instructional strategies.
- They use social-emotional learning.
When we know children live in poor environments, don’t we already understand all the above are important?
My favorite reason why students likely did well—children are provided breakfast, lunch, and dinner due to an extended school day. The kids are able to learn because they aren’t hungry. The data didn’t tell us this. It didn’t need to.