6 Reading Screener Mistakes Districts Make (And How to Fix Them)


It’s easy to find yourself in a love-hate relationship with reading screeners. 

On one hand, they’re powerful tools that are essential for identifying readers who need support early. On the other hand, they can be tedious, stressful, and pretty darn confusing. It’s so easy to lose sight of what they’re actually supposed to tell us and why it matters.

The only reason we spend time in our busy days thinking about assessments is because of the vital role they play in instruction and intervention for students, says Dr. Jan Hasbrouck, a researcher, consultant, and author who works nationally and internationally to help schools design effective literacy programs. “We’re not going to be able to provide optimal instruction to students if we don’t collect the right data.”

As a veteran literacy expert, Hasbrouck has seen firsthand how simple reading screener mistakes cost districts resources, money, and precious time. Here, she and other literacy educators dig into the details of screening assessments, the most common mistakes they see districts make when using them, and their go-to solutions to get better results.

What’s a Reading Screener Good For, Anyway?

In the simplest terms, a reading screener is a snapshot of a student’s foundational reading skills that you can use to guide instruction.

Think of it like a thermometer, Hasbrouck says. “It’s a quick measure of: Do you have a fever, do you have a high fever, [or] do you not have a fever?”

Teachers can certainly tell a lot from simply listening to kids read, Hasbrouck says, but screening assessments are critical because they can quickly and easily test key skills like oral reading fluency (ORF) and phonemic awareness. 

Typically, screeners are given one to three times a year to every student in a class, grade level, school, or entire district to determine their level of risk or proficiency with certain reading skills. 


Choosing Your Screener

For the most accurate and useful screening data, look for assessments that are widely validated and research-backed, such as DIBELS, FastBridge, or Acadience.


The Most Common Screening Assessment Mistakes (And How to Fix Them)

OK, so you have the right literacy screener in place. Now what?

Let’s make sure that screening assessment really lives up to its potential. Though mistakes are easy to make, every one of those mistakes can be fixed, and often it only requires a few simple shifts.

Here’s what the experts we spoke to say to look out for, plus their go-to fixes for getting the most out of your reading screener data.

Mistake #1 — Thinking Universal Reading Screeners Will Tell You What to Teach

As valuable as screening assessments are, they can feel like a headache when you’re trying to choose the right one, make sense of the results, and figure out how to move forward.

It’s not uncommon for leaders to “get the universal screener and think it’s going to tell them exactly what to do,” says Anna Geiger, the author of Reach All Readers and host of the Triple R Teaching podcast. “But it doesn’t tell us everything.”

What to Do Instead

Just like that thermometer that tells a physician a fever exists but doesn’t identify the cause, reading screeners will alert you to signs of a problem, but they won’t diagnose the issue.

For example, the reading screener may tell you that a student is unable to read a grade-level passage fluently, Hasbrouck says, but it won’t tell you if it’s because a student has issues around simple decoding or background knowledge, or if the child simply needs to work on automaticity.

When the reading screener flags a possible problem, your next step should be administering a diagnostic assessment to get at that root cause.

“The screener is to figure out who’s on track or who’s at risk — so who’s on track to be a good reader or who’s at risk to not be a good reader,” Geiger explains. “And then the diagnostic assessment helps us pin down what to teach next.”

Geiger says one of the most common problems she sees in schools is a misunderstanding of what different assessments are and what they’re meant to show. 

“What I’m seeing from teachers is a misunderstanding of what assessment I’m giving and what it’s doing,” she says. “So I’ll have people say, ‘I’ve got a phonics assessment. How do I use this to find their grade level?’ But if it’s a diagnostic phonics assessment, the point is to figure out what I’m going to teach now. It’s probably not to be used as a benchmark.”

Reading screeners are one part of what should be an entire assessment suite that your district uses to support students who are learning to read. Geiger breaks it down like this: 

  • Screening Assessments — Help us figure out who’s on track and who’s at risk
  • Diagnostic Assessments — Help us pin down what to teach next
  • Progress Monitoring Assessments — Tell us if our instruction is working
  • Outcome-Based Assessments — Tell us how we did at the end of the school year

Mistake #2 — Implementing Screener Assessments Without Teacher Training

“A data point is meaningless if you don’t know what to do with it,” says Cassandra Bell, the director of curriculum and instruction for Richmond City Public Schools in Virginia.

All too often, that’s true for teachers trying to make sense of the screening data in front of them.

When Ignite Reading partnered up with Ed Week for a webinar about early literacy assessments and data in October 2025, 78 percent of the educators in attendance said their staff wasn’t trained to analyze all that data.

Geiger recalls being one of those teachers when she was in a classroom.

“I didn’t really know what kind of data I should be collecting,” she recalls. “I learned about informal reading inventories and running records in graduate school, so running records were primarily what I did and taught teachers to do for a long time as well. And that was just basically to figure out what reading level my students were at.”

It wasn’t until Geiger learned about the Science of Reading and moved on to post-master’s degree work that, she says, she learned how assessments should be administered and how the data should be used.

“Teachers might not have any idea what to do with [the assessment] because no one’s telling them, and then it’s completely a compliance activity,” Geiger adds.

Don’t let your valuable resources become another box to check off the to-do list or (even worse) another source of stress for teachers.

What to Do Instead

Teacher buy-in is a critical part of the assessment process (and a key factor in your district’s success!), so make sure teachers have the support they need from the start.

What does this look like in practice?

  • Build Knowledge in Administration Before Instructional Staff — Start by educating administrators, Geiger suggests. “In an ideal world, we would teach the principals first and then provide coaches for teachers,” she explains.

    “Professional learning at all levels of the system [means] we can all be speaking the same language,” adds Lindsey Gonzalez, M.Ed., director of MTSS for Texas’ Round Rock Independent School District. “We have different interpretations of what response to intervention was or, you know, now MTSS. Sometimes just vocabulary differences can cause a difference of opinion.”
  • Build Training Into Your Schedule — “We stopped treating data as an event and made it a routine,” says Anthony Fitzpatrick, Ed.D., assistant superintendent of the Delsea Regional and Elk Township School Districts in New Jersey and author of the book Blueprint for Success: Implementing MTSS in Your School District. “There is a 40-minute instructional period every single day for my acceleration coaches to collaborate on student data.

    “That’s a really tough thing to schedule. But if we want data to be a priority, and we want education to be responsive, then you have to carve out that time, and it needs to be collaborative.”
  • Bring Teachers Into the Process — Ask teachers for their suggestions and feedback on a regular basis, and take it into account when making decisions and planning training that fits their needs. In Bell’s district, for example, ensuring early literacy data was being put to good use meant doing “some data literacy to support our teachers in understanding how to utilize assessment data to inform their instruction,” Bell says.
  • Focus on both the “How” and the “Why” — Offer training, ongoing workshops, demos, and coaching so teachers have a full understanding of how to use this critical data to inform instruction and feel supported on the journey.

“If we want data to be a priority, and we want education to be responsive, then you have to carve out that time, and it needs to be collaborative.” — Anthony Fitzpatrick, assistant superintendent of the Delsea Regional and Elk Township School Districts


Mistake #3 — Collecting Assessment Data Without Using It

Raise your hand if your district has ever spent time on an assessment and then let the data linger untouched in a spreadsheet. Guess what? You’re not alone.

“Oftentimes, we take the assessment, we collect the data, and then we just hold on to it until it’s time again for us to take another assessment,” Bell says.

When this happens, Geiger says, there’s a tendency for districts to stop administering their reading screeners because they’re not seeing improved reading outcomes. But the problem wasn’t the screener. “It wasn’t producing any outcomes because they weren’t doing anything with it,” she says.

There’s a better way to do this.

What to Do Instead

Reading screener data is only as valuable as the action it inspires.

“If we’re collecting data and not using it wisely, we’ve wasted time,” Hasbrouck points out. “We’ve wasted energy. We’ve wasted our precious resources.”

So, to get the most out of your data:

1. Conduct a Data Audit

“Most districts are actually collecting too much data,” Hasbrouck says. This includes overlapping data, data that isn’t actionable, and data that doesn’t really have a purpose.

She suggests kicking off an audit of your assessment data with an eye on simplifying.

For example, use “the ORF for screening, some quick diagnostic skills [assessments], and know how to progress monitor. Sleek and simple is going to get you where you need to go.”

2. Empower Literacy Teams to Take Ownership of Data and Collaborate

“If you’re seeing all of the data and the students as belonging to everyone, you can work together to meet their needs,” Geiger says. For example, if multiple teachers have kids struggling with the same skill, “we can pool our resources and help those kids together versus all trying to solve it on our own.” That saves time and creates a united front that benefits everyone.

3. Establish Fast Response Protocols

“Within 48 hours of any universal screener, we get together and we start making our lists,” Fitzpatrick says of the process in his district. “Teachers then use the list to begin to match interventions to the student’s precise need, whether it’s decoding, fluency, comprehension.”

The New Jersey district has established a hard-and-fast rule: respond within 10 days.

“If a child is struggling, intervention must begin within 10 instructional days,” Fitzpatrick notes, “and typically we are within five because MTSS only works when the urgency of a student need meets the structure that you build around it.”

Mistake #4 — Not Communicating Screener Data District-Wide

Screening results are often thought of as being for educator eyes only. This makes sense, since it’s the district, administrators, and teachers who are doing most of the work to improve literacy skills.

Unfortunately, the unintended result is that families and other supportive stakeholders can be left out of the process, when it’s far more helpful to have everyone involved.

What to Do Instead

Top secret assessment data? Not in this district. We know students fare better with increased family engagement, and sharing literacy goals, data, and research is a smart way to get the whole community involved in literacy change.

One example? Bell’s district, Richmond City Public Schools, put a program in place called RPS 200 where kids at select schools get an extended school year (20 extra school days, to be exact) to address literacy gaps.

To get this project off the ground, Bell and her colleagues started by sharing reading research and assessment data — in their case, from a Virginia-specific literacy screener — not only with teachers, but with families as well.

“Walking our teachers through that data and talking to our parents, our community, and our other stakeholders about how we could really utilize that [extra school] time, we got more and more buy-in,” she explains. “We also wanted to elevate our teachers’ voices and elevate our family voices so that they felt a sense of ownership and felt really connected to the work that we were doing.”

Consider how you can use screening data to get families and your wider school community engaged in reaching literacy goals, and don’t be afraid to think outside the box.


Mistake #5 — Not Using Screening Assessments Often Enough

Screeners aren’t just one-and-done. They should be administered three times a year to assess students’ growth, and they should also be accompanied by regular progress monitoring. How often all of this takes place will depend on your district policies and the individual students.

“It’s a perfect analogy to the medical world,” Hasbrouck says. “If you’re a reasonably healthy person, your doctor collects data on you once a year… If you have an illness or you broke your leg or something, that progress monitoring is going to happen more frequently. If you’re in intensive care, progress monitoring happens 24/7. Those differences are based on the needs of the patient.”

What to Do Instead

It’s recommended that schools do screening assessments at least three times a year (and many states, including Minnesota, Iowa, Tennessee, Florida, and Michigan, have made this a mandate).

A middle-of-the-year screener helps you avoid end-of-the-year surprises and gives you the information you’ll need to be sure students get timely interventions.

Overall, a winter screener is going to tell you three major things:

  • Who’s maintaining growth
  • Who’s falling behind
  • Whether core instruction is working for the majority of students

This is important, not just as a way of checking in, but also for planning. If you’re going to see an influx of students needing support in the second half of the school year, you’ll want your intervention system ready to absorb them without overwhelming your staff or diluting the quality of support.

If you use a universal screener like DIBELS or Acadience, Geiger says, they come with different 60-second screening exercises teachers can use every six to eight weeks to see how kids are doing.

If teachers see consistent progress during each monitoring period, those students should be on track to show growth on a winter screener, as well.


Mistake #6 — Not Using Year-End Literacy Screener Data Wisely

You got your year-end screening done. The data is in. Now what?

If you aren’t sure how to use that data efficiently, you aren’t alone. If datasets got superlative awards, year-end data would probably be voted most likely to end up locked away on a hard drive forevermore.

But, as Bell tells us, there are actually some really powerful ways to use year-end data to move the literacy needle in your district.

What to Do Instead

If you’re collecting year-end data, think through how that data can be used and what creative solutions you can come up with to offer students more support. When Bell’s district first introduced the idea of RPS 200 and made decisions about how to use the extra school time, they did so by analyzing year-end data. 

“We used the data from the previous school year to show a case for extending the year,” Bell explains. This included reflecting on student progress and discussing meaningful ways to shore up kids’ reading skills for the subsequent school year.

RPS 200, now in its third year, has become an ongoing part of the district’s larger literacy strategy, along with other proven interventions like high-dosage tutoring from Ignite Reading. As a result, Bell says her district is happy to report “significant increases in their student performance, particularly in early literacy.”

Round Rock ISD takes a similarly proactive approach, Gonzalez says. During the summer, principals are provided with students’ year-end reading data so they can begin planning “before we even get those beginning of year screener results back.”

This means the district doesn’t waste the critical first weeks of school waiting for screening results — they’re already prepared to act.


About the Author