Progress Monitoring

Table of contents

  • Key Terms
  • Progress Monitoring and Literacy Assessments
  • Linking Assessment to Intervention
  • Data-Based Individualization (DBI) (tier 3)
  • Voices from the Field
  • Additional Resources
This chapter will look at the big picture and scope of progress monitoring. The first section begins by looking at progress monitoring of student learning in the context of assessing literacy skills; this is an example of tier 2, or secondary-level, instruction. In the second section of the chapter, tier 3 intensive interventions will be examined through the research-based process of Data-Based Individualization (DBI).

Key Terms

  • Progress monitoring – an ongoing series of measures used to gauge how a student is responding to instruction. Is the student making adequate progress? Does the instruction need to be adjusted?
  • Baseline – the current knowledge and skill level of the student.
  • Screeners – used to determine the starting point for instruction and to identify the need for further assessment.
  • Curriculum-based measurement (CBM) – an assessment tool used for periodic progress monitoring, which can range from weekly to monthly to quarterly. E.g., AIMSweb, DIBELS Next.
  • Data points – progress monitoring data collected from a probe or learning activity.
  • Slope – indicates the student's rate of growth (or lack of growth). Students' probe scores are graphed and the slope is calculated over a period of time. See Slope Calculator
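The slope idea can be sketched in a few lines of Python. This is an illustrative least-squares calculation over made-up weekly probe scores, not the output of any particular CBM tool or the linked Slope Calculator:

```python
# Least-squares slope of weekly probe scores (words correct per minute).
# The scores below are hypothetical, for illustration only.

def slope(scores):
    """Return the best-fit growth rate per probe (ordinary least squares)."""
    n = len(scores)
    xs = range(n)                       # probe number: 0, 1, 2, ...
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

weekly_wcpm = [20, 21, 23, 24, 26, 27]  # six weekly probes
print(round(slope(weekly_wcpm), 2))     # ≈ 1.46 words gained per week
```

A positive slope near the goal line (e.g., one word per week) suggests adequate response; a flat slope signals that the instruction may need to be adjusted.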

Steps to Success: Crossing the Bridge Between Literacy Research and Practice. Authored by: Kristen A. Munger, Ed. Provided by: Open SUNY Textbooks. Located at http://textbooks.opensuny.org/steps-to-success/. License: CC BY-NC-SA: Attribution-NonCommercial-ShareAlike

Progress-Monitoring Literacy Assessments

To monitor a student’s progress in literacy, assessments are needed that actually measure growth. Rather than just taking a snapshot of the student’s achievement at a single point in time, progress-monitoring assessments provide a baseline (i.e., the starting point) of a student’s achievement, along with periodic reassessment as he or she is progressing toward learning outcomes. Such outcomes might include achieving a benchmark score of correctly reading 52 words per minute on oral reading fluency passages or a goal of learning to “ask and answer key details in a text” (CCSS.ELA-Literacy.RL.1.2) when prompted, with 85% accuracy. The first outcome of correctly reading 52 words per minute would likely be measured using progress-monitoring assessments, such as DIBELS Next and AIMSweb. These screeners are not only designed to measure the extent to which students are at risk for future literacy-related problems at the beginning of the school year, but also to monitor changes in progress over time, sometimes as often as every one or two weeks, depending on individual student factors. The second outcome of being able to “ask and answer key details in a text” could be monitored over time using assessments such as state tests or responses on a qualitative reading inventory. Being able to work with key details in a text could also be informally assessed by observing students engaged in classroom activities where this task is practiced.

Unlike assessments that are completed only one time, progress-monitoring assessments such as DIBELS Next and AIMSweb feature multiple, equivalent versions of the same tasks, such as having 20 oral reading fluency passages that can be used for reassessments. Using different but equivalent passages prevents artificial increases in scores that would result from students rereading the same passage. Progress-monitoring assessments can be contrasted with diagnostic assessments, which are not designed to be administered frequently. Administering the same subtests repeatedly would not be an effective way to monitor progress. Some diagnostic tests have two equivalent versions of subtests to monitor progress infrequently—perhaps on a yearly basis—but they are simply not designed for frequent reassessments. This limitation of diagnostic assessments is one reason why screeners like DIBELS Next and AIMSweb are so useful for determining how students respond to intervention and why diagnostic tests are often reserved for making other educational decisions, such as whether a student may have an educational disability.

Progress-monitoring assessments have transformed how schools determine how a student is responding to intervention. For example, consider the hypothetical example of Jaime’s progress-monitoring assessment results in second grade, shown in Figure 2. Jaime was given oral reading fluency passages from a universal literacy screener, and then his progress was monitored to determine his response to a small group literacy intervention started in mid-October. Data points show the number of words Jaime read correctly on each of the one-minute reading passages. Notice how at the beginning of the school year, his baseline scores were extremely low, and when compared to the beginning of the year second grade benchmark (Dynamic Measurement Group, 2010) of 52 words per minute (Good & Kaminski, 2011), they signaled he was “at risk” of not reaching later benchmarks without receiving intensive intervention. Based on Jaime’s baseline scores, intervention team members decided that he should receive a research-based literacy intervention to help him read words more easily, so that his oral reading fluency would increase at least one word per week. This learning goal is represented by the “target slope” seen in Figure 2. During the intervention phase, progress-monitoring data points show that Jaime began making improvements toward this goal, and the line labeled “slope during intervention” shows that he was gaining at a rate slightly faster than his one word per week goal.

Ch 5 figure 2

Figure 2. Progress-monitoring graph of response to a reading intervention.

When looking at Jaime’s baseline data, notice how the data points form a plateau. If his progress continued at this same rate, by the end of the school year, he would be even further behind his peers and be at even greater risk for future reading problems. When interpreting the graph in Figure 2, it becomes clear that intensive reading intervention was needed. Notice after the intervention began how Jaime’s growth began to climb steeply. Although he appeared to be responding positively to the intervention, in reality, by the end of second grade, students whose reading ability progresses adequately should be reading approximately 90 words correctly per minute (Good & Kaminski, 2011). Based on this information, Jaime is not likely to reach the level of reading 90 words correctly by the end of second grade and will probably only reach the benchmark expected for a student at the beginning of second grade. These assessment data suggest that Jaime’s intervention should be intensified for the remainder of second grade to accelerate his progress further. It is also likely that Jaime will need to continue receiving intervention into third grade, and progress monitoring can determine, along with other assessment information, when his oral reading fluency improves to the point where intervention may be changed, reduced, or even discontinued. You may wonder how the intervention team would determine whether Jaime is progressing at an adequate pace when he is in third grade. Team members would continue to monitor Jaime’s progress and check to make sure his growth line shows that he will meet benchmark at the end of third grade (i.e., correctly reading approximately 100 words per minute; Good & Kaminski, 2011). If his slope shows a lack of adequate progress, his teachers can revisit the need for intervention to ensure that Jaime does not fall behind again.
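The projection reasoning above can be made concrete with a small sketch. The numbers below are hypothetical but consistent with the chapter's example: a growth rate slightly above one word per week, and benchmarks of 52 words per minute (beginning of second grade) and roughly 90 words per minute (end of second grade):

```python
def projected_score(current_wcpm, slope_per_week, weeks_left):
    """Project an end-of-year score from the current words-correct-per-minute
    and the observed weekly growth rate (a simple linear extrapolation)."""
    return current_wcpm + slope_per_week * weeks_left

# Hypothetical values: ~1.2 words/week gained, 25 instructional weeks remaining.
end_of_year = projected_score(25, 1.2, 25)
print(end_of_year)        # 55.0 — near the beginning-of-year benchmark of 52
print(end_of_year >= 90)  # False — short of the end-of-year benchmark
```

This is the same logic an intervention team applies when deciding whether to intensify an intervention: if the projected score falls short of the benchmark, the current rate of growth is not enough.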

Some schools monitor their students’ progress using computer-adapted assessments, which involve students responding to test items delivered on a computer. Computer-adapted assessments are designed to deliver specific test items to students, and then adapt the number and difficulty of items administered according to how students respond (Mitchell, Truckenmiller, & Petscher, 2015). Computer-adapted assessments are increasing in popularity in schools, in part, because they do not require a lot of time or effort to administer and score, but they do require schools to have an adequate technology infrastructure. The reasoning behind using these assessments is similar to other literacy screeners and progress-monitoring assessments—to provide effective instruction and intervention to meet all students’ needs (Mitchell et al., 2015).

Universal Literacy Screeners Links to additional information
Table 1. Examples of Commonly Used Universal Literacy Screeners
AIMSweb http://www.aimsweb.com/
Dynamic Indicators of Basic Early Literacy Skills—Next https://dibels.uoregon.edu/
STAR Reading http://www.renaissance.com/assess
Phonological Awareness Literacy Screening (PALS) https://pals.virginia.edu/
See the Academic Progress Monitoring Tools Chart from the National Center on Intensive Intervention. The tools chart presents information about progress monitoring tools across three ratings: Performance Level Standards (Reliability), Growth Standards (Validity), and Usability (Bias Analysis Conducted).
See examples of Progress Monitoring Handouts in Reading and Mathematics, from the National Center on Intensive Intervention.

Although many literacy screening and progress-monitoring assessment scores have been shown to be well-correlated with a variety of measures of reading comprehension (see, for example, Goffreda & DiPerna, 2010) and serve as reasonably good indicators of which students are at risk for reading difficulties, a persistent problem with these assessments is that they provide little guidance to teachers about what kind of literacy instruction and/or intervention a student actually needs. A student who scores low at baseline and makes inadequate progress on oral reading fluency tasks may need an intervention designed to increase reading fluency, but there is also a chance that the student lacks the ability to decode words and really needs a decoding intervention (Murray, Munger, & Clonan, 2012). Or it could be that the student does not know the meaning of many vocabulary words and needs to build background knowledge to read fluently (Adams, 2010-2011), which would require the use of different assessment procedures specifically designed to assess and monitor progress related to these skills. Even more vexing is when low oral reading fluency scores are caused by multiple, intermingling factors that need to be identified before the intervention begins. When the problem is more complex, more specialized assessments are needed to disentangle the factors contributing to it.

A final note related to progress-monitoring procedures is the emergence of studies suggesting that there may be better ways to measure students’ progress on instruments such as DIBELS Next compared to using slope (Good, Powell-Smith, & Dewey, 2015), which was depicted in the example using Jaime’s data. In a recent conference presentation, Good (2015) argued that the slope of a student’s progress may be too inconsistent to monitor and adjust instruction, and he suggested a new (and somewhat mathematically complex) alternative using an index called a student growth percentile. A student growth percentile compares the rate at which a student’s achievement is improving in reference to how other students with the same baseline score are improving. For example, a student reading 10 correct words per minute on an oral reading fluency measure whose growth is in the 5th percentile is improving much more slowly compared to the other children who also started out reading only 10 words correctly per minute. In this case, a growth percentile of five means that the student is progressing only as well as or better than five percent of peers who started at the same score, and also means that the current instruction is not meeting the student’s needs. Preliminary research shows some promise in using growth percentiles to measure progress as an alternative to slope, and teachers should be on the lookout for more research related to improving ways to monitor student progress.
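A simplified version of the growth-percentile idea can be sketched as follows. Published student growth percentiles are computed with a more complex statistical model; this toy version, using invented data, just asks what percent of same-baseline peers are growing at or below the student's rate:

```python
def growth_percentile(student_rate, peer_rates):
    """Percent of same-baseline peers whose growth rate is at or below
    the student's rate (a simplified percentile-of-scores definition)."""
    at_or_below = sum(1 for r in peer_rates if r <= student_rate)
    return 100 * at_or_below / len(peer_rates)

# Hypothetical growth rates (words/week) for ten peers who all started
# at the same baseline score of 10 words correct per minute:
peers = [0.5, 0.8, 1.0, 1.1, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0]
print(growth_percentile(0.5, peers))  # 10.0 — slower than 90% of same-baseline peers
```

A low growth percentile, like a flat slope, is a signal that the current instruction is not meeting the student's needs.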

Linking Assessment to Intervention

How can teachers figure out the details of what a student needs in terms of intervention? They would likely use a variety of informal and formal assessment techniques to determine the student’s strengths and needs. The situation might require the use of diagnostic assessments, a reading or writing inventory, the use of observations to determine whether the student is engaged during instruction, and/or the use of assessments to better understand the student’s problem-solving and other thinking skills. It may be a combination of assessment techniques that are needed to match research-based interventions to the student’s needs.

You may be starting to recognize some overlap among different types of assessments across categories. For example, state tests are usually both formal and summative. Literacy screeners and progress-monitoring assessments are often formal and formative. And some assessments, such as portfolio assessments, have many overlapping qualities across the various assessment categories (e.g., portfolios can be used formatively to guide teaching and used summatively to determine if students met an academic outcome).


Tools to make your own Progress Monitoring Assessments/Probes

Response to Intervention (RTI) resources from Intervention Central: https://www.interventioncentral.org/index.php/cbm-warehouse. Includes tools to make progress monitoring assessments, such as a Dolch Wordlist Fluency Generator, Early Math Fluency Generator, Letter Name Fluency Generator, Math Worksheet Generator, Reading Fluency Generator, Writing Probe Generator, and more.


Data-Based Individualization (DBI) (tier 3)

One solution for supporting students with severe and persistent learning difficulties is the research-based process of data-based individualization, or DBI.

DBI, an intensive intervention (tier 3 instruction), is provided in a smaller group setting of one to three students. This instruction is provided in addition to primary instruction in the general education classroom.

How can Data Based Individualization (DBI) help special educators improve outcomes for students with disabilities?

In this video, Amy McKenna, a special educator in the Bristol Warren Regional School District, shares her experience with data-based individualization (DBI). Amy discusses how she learned about DBI, the impact the DBI process had on the students she worked with, and how DBI helped change her practice as a special educator.

[National Center on Intensive Intervention]. (2018, Dec. 13). How can DBI help special educators improve outcomes for students with disabilities? [Video file]. Retrieved from https://youtu.be/0v-IuZ5KdUw (8:08 minutes)

These two IRIS modules on intensive interventions and data-based individualization were developed in collaboration with the National Center on Intensive Intervention at American Institutes for Research and the CEEDAR Center. These resources are designed for individuals who will be implementing intensive interventions (e.g., special education teachers, reading specialists, interventionists).
The two modules focus on students who are not responding to tier 2 interventions, which are typically conducted in small groups in addition to general education classroom instruction.

Resource: the IRIS Center. (2015). Intensive intervention (part 1): Using data-based individualization to intensify instruction. Retrieved from https://iris.peabody.vanderbilt.edu/module/dbi1/

Learning Objectives

  • Understand the purpose of providing intensive intervention
  • Be familiar with the data-based individualization process
  • Understand how to intensify and individualize academic interventions
  • Understand the difference between quantitative and qualitative adaptations

 

Resource: The IRIS Center. (2015). Intensive intervention (part 2): Collecting and analyzing data for data-based individualization. Retrieved from https://iris.peabody.vanderbilt.edu/dbi2/

Learning Objectives

  • Be familiar with the data-based individualization process
  • Understand how to make data-based instructional decisions
  • Be familiar with processes for collecting and analyzing progress monitoring data and diagnostic assessment data
  • Understand how to use these data to make instructional adaptations

Voices from the Field

Teacher candidates share how their schools monitor the progress of students who require tier 2 and 3 interventions and/or have IEPs.

Student data essentially drives the bus for a teacher's instruction. Using consistent, frequent progress monitoring allows a teacher to address skill deficits in students; refine and improve teaching strategies, particularly within areas/concepts a child or children may struggle with; and identify those students who may need additional support outside of a group setting. Of particular note, teachers should avail themselves of CBMs (curriculum-based measurements) as offered through the school's established curriculum tools. One example is the Fresh Reads fluency checks that are part of the Reading Street program. Weekly fluency checks help show growth, or lack of growth, over a sustained period of time. Other progress monitoring tools can include both formative and summative assessments. Formative assessments are ongoing during the instruction of the skill and can include such items as exit tickets, pop quizzes, and informal teacher questioning. Summative assessments such as mid-year tests, end-of-unit tests, and year-end tests do not necessarily help make course adjustments, but they give the teacher over-time feedback on skill areas where instruction should improve. Jacqueline Godin

——————-

This post illustrates the importance of collecting both quantitative and qualitative data.

While speaking with my supervising practitioner (SP), I learned there are many different things that she uses to progress monitor her children. For example, for literacy (reading, phonological awareness, comprehension) she uses DIBELS, PA drills from Reading Success by David Kilpatrick, and Wilson Readers, as well as identifying text structures. She also takes into consideration the Fountas and Pinnell benchmarks that the teachers do within the classroom. For both literacy and math, she creates Google Forms to input information based on DIBELS and other assessments. She also has the children answer questions. She feels that it is important to really understand what they are thinking and have them explain their own thinking. Anonymous

——————-

At  — Elementary School we have a robust tier 2 intervention system in place to support students who require additional time and exposure to build up reading and mathematics skills. The Intervention curriculum for reading is designed to reach students where they are using a mixture of Lively Letters, Spire and Heggerty teachings. To assess each student’s developing knowledge, weekly progress monitoring occurs using the Heggerty Progress Monitoring tools.

Students are given Heggerty benchmarking at the start and end of the intervention cycle, and a mid-year assessment is performed on each student. After the assessment is completed and the data is compiled, the general education teachers, special education case managers, interventionists, and administration meet together to discuss recommendations for the intervention groups. This meeting will be taking place on 2/10, which is an early release day for our school. We will review recommendations for students who can stop receiving intervention due to adequate growth, students who need to be added to the intervention groups due to need based on midyear assessment data, and recommendations of changes to current intervention groups, in case not enough growth is taking place and a student may benefit from a slower-paced group.

For our students needing math intervention support, Bridges is the main curriculum used to supplement our general education math curriculum of Ready Math. To assess student progress or academic standing, beginning-of-year, mid-year, and end-of-year assessments are conducted using iReady for all students (with the exception of students who are excused due to allowances in IEPs). Colleen Mehalko

——————-

We are administering bi-weekly probes to track student growth in each area of need. Behavioral interventions are based more on conversation and putting strategies into place to suit the needs of the student. I track their progress by talking to the student, parents, and teachers and inquiring about challenges that they have faced and strategies that they used to be successful. We are entering the results into spreadsheets and tracking growth on charts over time, and will transfer that data into AimsWeb when the program is up and running for all Special Ed staff.

It is important that teachers assess students regularly to be sure that students are making progress, but more importantly, the instruction that they are being given must be of value to them. If a student has mastered a skill, there is no reason to reteach the concepts to them. The purpose of interventions at all levels, especially tier 3, is to close the gaps that students have with their grade-level peers, and that will not happen if they are not being given instruction at the proper level. Amy Welch

——————-

As the IRIS Center (2006) states, “Progress monitoring is a key component in a response to intervention (RTI) or multi-tiered system of supports (MTSS).” Without consistent and reliable data from progress monitoring, a teacher would not be able to accurately determine if the student is making growth in the areas needed. Allison Gibson

——————-

Some of the tools used for progress monitoring at the school I work at are the Northwest Evaluation Association – Measures of Academic Progress (NWEA-MAP), the Eureka Math Program, and the Fountas and Pinnell Reading Program.  The NWEA-MAP tests reflect the instructional level of each student and measures growth over time.  They are state-aligned, computerized adaptive tests that are administered in the fall, winter, and spring to provide teachers with valuable information regarding where the students are at in math and reading, and how far they have advanced. Eureka is our math curriculum and provides teachers with daily informal assessments, as well as unit assessments.  Judith Moore

——————-

At the — School, progress monitoring is done regularly in Special Education for tier 2 and 3 interventions throughout the year in order to determine student progress toward targeted goals. The frequency of the assessments depends on the assessments themselves as well as the level the students are at. For example, the fifth-grade students requiring tier 2 interventions are assessed through a Diagnostic Decoding Survey every 2 months. This is a running record that evaluates progress with decoding words and determines where the students will start/move on to in their Phonics Blitz lessons. This survey is a short assessment with words containing specific phonics skills. Students read the words while the educator notes which words they read incorrectly as well as why they got the word incorrect (short vowel sound, sound added, final sound, ending sound, digraph, consonant sound, etc.). This helps to break down which decoding skills are improving and which need to be worked on further. A couple of other running records and benchmarks used by the Special Ed. department are Reading A-Z and Fountas & Pinnell.

The Special Education department is always working with teachers to assist and support lessons. Through progress monitoring, the educators can determine what is needed to support students and help get them ready for core learning. This may include pre-teaching activities to set students up for success and to prepare them for the lesson. In doing pre-teaching activities, students have more opportunities to be in the classroom with their peers during the regular lesson without falling behind. Progress monitoring helps with differentiating instruction and determining who may need extra help or support. This is where the Special Educators and Teachers can come up with various reading groups or small group instruction based on the information on areas of need gathered from progress monitoring.  Tate Van Valkenburg

——————-

I talked to my SP, and she said they use a bunch of different tools to monitor progress. First are the more formal assessments: benchmarking, I-Ready scores, sight word inventory… These are done at specific times throughout the year for all students in our district, and we can use these tools to see if there is any growth in these areas. They also use checklists and data tracking charts, running records, progress reports (for the parents)… These are less formal, but are a way to keep track of data when you are working with a student every day to see the smaller growth. Some of the programs that they use for literacy, Wilson and Edmark, have their own data tracking sheets that go right along with what you're directly teaching for tier 2 and 3.

Data tracking should directly impact instruction practice. When you are working with students and they are learning the curriculum, you need to know where your students are to be able to teach in the “just right” area.  Brooke McGibbon

——————-

[High School special educator]

I check in with the students on my caseload to see how their progress is going about four times a year. I check in with them quarterly, and I think it is super helpful when regular education classroom teachers are familiar with their goals as well. At the school that I work for, we also send out different rubrics for goal setting and executive function that the teacher fills out. These rubrics are super helpful because the grading becomes more number-based, which makes collecting accurate data more simple. Throughout this first quarter, I have come to realize that students will need a lot of modifications because it is hard to create an IEP with everything planned for the future. Anything can change in the future, so it is my job to make sure the IEP makes clear connections with the students’ goals and curriculum. Most of the time I am in constant communication with the students on my caseload, so the progress reports I receive are usually not that eye-opening. Being in constant communication with students on our caseload is key for them to succeed in high school. It also benefits them because they learn valuable communication skills that can help them become successful after graduating high school.

Student data has an impact on a teacher’s instructional practice because if the data demonstrates that a student is not growing or developing at the pace they should be, it could be because the material the teacher is teaching is not connecting as it should with an effective instruction plan.  Lucas Fisher

——————-

[Early Childhood special educator]

In the preschool setting, there are few formal methods for progress monitoring, but the one that is used most often is the ASQ (Ages and Stages Questionnaire). This is a questionnaire which looks at different aspects of the child’s development based on what is expected for their age. This serves as a tool to see where deficits lie and catch possible areas of need before the child ages out of early intervention services. The questions on this include areas of fine and gross motor, social, emotional, and cognitive development. At the preschool level, some is based on asked questions, while the rest is based on teacher observation and reflection. For example, the student may be asked to hop on one leg to test their balance and muscle tone, or given two-step directions to see if they can complete both components without having the directions repeated. There is also a math section which focuses on pattern recognition and creation, shape identification, and repeating numbers (out of numerical order) after hearing them spoken once by the teacher. This tool does not show what the issue may be but rather provides clues as to what area one should focus on. The other forms of progress monitoring tend to be more informal and occur in the method of observational notes. If a teacher observes an area of concern for a student, they will begin to create a log of what/when the concerning area is seen. For example, if the child seems to be mixing up their colors, they will note if it is only certain colors, or if they are only accurately identifying the color puzzle and are not able to transfer the information to other objects.

The data the teacher is able to collect is what guides all aspects of the educational experience for the class. Teachers can create a strong curriculum with amazing lesson plans, but if the methods and ideas are not influenced by their specific group of students, then the chance of meaningful learning will be greatly impacted. Students’ data should shape the teaching presented to them. It may be planned to teach the children to write their names, but if the data is showing that the students are not yet able to identify individual letters, the lesson of writing letters would be lost on this class. Stocked lesson plans should be used as a guide rather than a set plan, as each year different children will require different teaching. The data is the insight into the inner workings of each child and so should be used to influence instruction each and every day. The data is the pieces of the picture which create the child; the teaching is the frame to finish the artwork. Both impact one another, but without one, the other is weakened and incomplete. Annie Lewis

——————-

IXL is an example of a progress monitoring tool that is used in the program I work in. Once students have completed a diagnostic assessment, IXL provides a level of where that student is and then produces work based on their skills. These diagnostic assessments happen once a month in the program I work in. We are then able to pull up each student individually and access data that provides insight to each student’s progress and areas of strength and weakness. We utilize IXL for math and reading fluency, and what I find beneficial is that after the student completes a math problem incorrectly, visual, corrective feedback on how the problem should have been completed is provided. The data provided by IXL comes in handy for teachers’ future lesson planning and when writing IEP goals. Visual data is also always helpful not just for teachers, but also for parents to get a good concept of where their child is at academically. This student data drives our teachers’ instruction. It helps teachers to differentiate and modify instruction should the data reflect that a student is not where they should be academically.

Remote learning was a challenging time for the students I work with, as I’m sure many of you experienced with your students as well. While IXL was a great learning tool for the students to use at home, the data showed that it wasn’t super effective for most of our learners. Work provided was based on skills they had been previously taught but the data provided after the work was completed showed that most students did not have skill retention. Most times IXL is worked on in the classroom, students will request help if they need it. When they were at home, they didn’t always have us available when they needed us and may not have had the availability of their parents or sibling to assist either. Our students were remote from March 2020 – June 2020 but returned to school early September 2020 for in person instruction. Our program was the only one in the building and accounted for 16 students and 7 staff. The reasoning for this was solely due to the progress monitoring results we saw during those few remote months in 2020. So we made the instructional change from remote learning to in person instruction and saw student progress increase the first month we were back in person.  Sarah Caroll

IXL is an adaptive learning program that collects progress monitoring data. Sarah’s post illustrates the need for in person direct instruction with students with more intensive disabilities. Without the added structure of the classroom and the direct teaching component, the students did not make progress via the IXL adaptive learning program. The data was useful in illustrating the learning needs of these students. Paula Lombardi


Additional Resources

Brown, J., Skow, K., & the IRIS Center. (2009). RTI: Progress monitoring. Retrieved from
http://iris.peabody.vanderbilt.edu/wp-content/uploads/pdf_case_studies/ics_rtipm.pdf 

Common Core State Standards (CCSS) IEP goal probes and online assessment tools

National Center on Intensive Intervention. (n.d.). Progress monitor. Retrieved from https://intensiveintervention.org/intensive-intervention/progress-monitor

National Center on Response to Intervention. (n.d.). Using progress monitoring data for decision making. Retrieved from https://rti4success.org/sites/default/files/Transcript%20Progress%20Monitoring%20for%20DBDM_2.pdf

The IRIS Center. (2019). Progress monitoring: Mathematics. Retrieved from https://iris.peabody.vanderbilt.edu/module/pmm/

The IRIS Center. (2005, Rev. 2019). Progress monitoring: Reading. Retrieved from https://iris.peabody.vanderbilt.edu/module/pmr/


Updated 3/23/2022