EDVIEW 360
Podcast Series

Look Ahead to Summer and Fall Success Using Spring Assessment Data

Dr. Kelly A. Powell-Smith
Mount St. Joseph University

Kelly A. Powell-Smith, Ph.D., NCSP, is Professor of Reading Science at Mount St. Joseph University. Dr. Powell-Smith is the former Chief Science Officer at Acadience Learning. She is the lead author on Acadience RAN, Acadience Reading Survey, Acadience Reading Diagnostic assessments, and Acadience Spelling. She obtained her doctorate in school psychology from the University of Oregon. She has served as an Associate Professor of School Psychology at the University of South Florida, a faculty associate of the Florida Center for Reading Research, and a consultant with the Eastern Regional Reading First Technical Assistance Center. She currently serves on several editorial boards, including School Psychology Review, School Psychology Forum, and Single-Case in the Social Sciences. Her work has been cited in more than 200 professional journals. Dr. Powell-Smith has provided training in assessment and intervention in 23 states and Canada and conducted 285 national, state, and regional workshops and presentations.

Release Date: Tuesday, May 4, 2021

With vaccination efforts in full swing, September is likely to see a strong return to in-classroom instruction. How can educators prepare for summer school or a new school year after students have seen extended periods outside the classroom? In this important podcast, assessment expert Dr. Kelly Powell Smith—vice president and director of research and development at Acadience Learning—shares the key indicators that educators should look for when reviewing spring literacy assessment scores, along with how to pinpoint where students are struggling so you can plan for summer or fall intervention and instruction.

Listeners will learn:

  • Why end-of-year assessment data matters
  • What to look for when analyzing reading assessment scores
  • How to use literacy assessment scores to drive summer and fall instruction
  • Strategies to use for planning for summer and fall intervention
Transcript

Narrator: Welcome to EDVIEW360!

Dr. Kelly Powell Smith:
We may see some continued impacts on academic performance, such as students ending the year, on average, lower in reading and/or math than would be predicted based on prior years. However, we may also see the positive impact of many educators working extra hard, resilience among students might contribute to better outcomes than you might've initially anticipated, and a marshaling of resources to get students back on track. So it's really my earnest hope that these factors combined—working hard, student resilience, and marshaling of resources available—those things combined would be helpful to moving us back toward what we might typically see in terms of academic performance.

Narrator:
You just heard Dr. Kelly Powell Smith, vice president and director of research and development at Acadience Learning. Dr. Powell Smith is our guest today on EDVIEW 360. Here's our host, Pam Austin.

Pam Austin:
This is Pam Austin. Welcome back to the EDVIEW 360 podcast series. We're so excited to have you back with us. I'm conducting today's podcast from my native New Orleans, channeling the heart [inaudible 00:01:16] in Dallas, Texas. Today, we are honored to have with us Dr. Kelly Powell Smith, vice president and director of research and development at Acadience Learning. Hello, Dr. Powell Smith. We are so happy to have you with us today on EDVIEW 360.

Dr. Kelly Powell Smith:
It's great to be here.

Pam Austin:
Tell us a little bit about your background as an educator and how you became interested in assessment.

Dr. Kelly Powell Smith:
Thank you, Pam. My background and training are in developmental psychology and school psychology, and my interest in assessment goes way back to my undergraduate days while I was earning a degree in developmental psychology. And as part of that experience, I participated in my first course on special education, and that really got me interested. And in that course, I was first introduced to the idea of using assessment data to support students in their development. And that interest was further developed through my graduate school training and experiences at the University of Oregon.

Dr. Kelly Powell Smith:
Now, my graduate training in school psychology was steeped in curriculum-based measurement, which many folks may be familiar with, but within a problem-solving approach to service delivery, which is much like the response to intervention or multi-tiered system of support models that we talk about now, except that this was long before the terms RTI and MTSS were really used. That training also incorporated the notion that assessment data should be used to address specific questions and that we need to consider the consequences of our assessment use. And it's through those experiences that I developed this very strong desire to conduct assessments that would lead to better outcomes for students and for teachers. I believe it's not enough to gather and report scores on assessments; instead, our focus should be on ensuring that the information and the data that are gathered through the assessment process result in actions that ultimately improve outcomes for students and educators.

Pam Austin:
A word I heard over and over again was “outcomes”—to improve outcomes for students and educators. I find it very interesting you put both together. They're tied, aren't they?

Dr. Kelly Powell Smith:
They very much are. Yeah.

Pam Austin:
Well, tell us a little bit about Acadience Learning as well. Their backstory, along with the products they offer. We have heard some confusion out there between Acadience and DIBELS® [8th Edition]. Can you clarify for us?

Dr. Kelly Powell Smith:
Yeah, I'd be glad to do that. Acadience Learning, formerly known as Dynamic Measurement Group, is an educational research company that was founded by the original authors of DIBELS®, Dr. Ruth Kaminski and Dr. Roland Good. And Drs. Good and Kaminski have been really at the forefront of research on assessments that help educators to improve outcomes for students in schools for over 30 years now. And Acadience Learning—or ALI, as we refer to our company—has been in operation since 2002. And through our continued research and development over the past decade and a half, we have really expanded far beyond the early literacy assessments for kindergarten and first grade that were first offered in the early years of Ruth and Roland's work. And our current family of assessments includes our screening, progress-monitoring, and diagnostic reading assessments for grades K–6, so Acadience® Reading K–6 (formerly DIBELS Next®), Acadience® Reading Survey, and Acadience® Reading Diagnostic: Phonemic Awareness & Word Reading and Decoding, as well as comprehension, fluency, and oral language.

Dr. Kelly Powell Smith:
We have our early literacy assessment for preschool [that] we call Acadience® Reading Pre–K: PELI®, which stands for Preschool Early Literacy Indicators. We have content-area reading for grades 7 and 8, Acadience® Reading 7–8. And we have math assessment for grades K–6, Acadience® Math (formerly called DIBELS® Math), which includes universal screening and progress-monitoring measures.

Dr. Kelly Powell Smith:
Our goal with all of these assessments is to improve outcomes for children and support school success by developing and conducting research on effective intervention and assessments and by coupling that with high-quality professional development, so you can gather that theme again of improved outcomes. Drs. Kaminski and Good are the original authors of DIBELS®, including DIBELS® 6th Edition, DIBELS Next® (again, now known as Acadience® Reading K–6), and all prior versions of DIBELS®. In 2018, the name DIBELS® was sold to the University of Oregon, which then used that name for a new work not associated with Drs. Kaminski and Good. As such, Drs. Kaminski and Good and the authoring team at ALI are not associated with DIBELS® 8th Edition. DIBELS® 8th Edition is not a new version of DIBELS Next®, nor is it an extension of the work of Drs. Good and Kaminski, but rather it's a new work from a new and different authoring team using the name previously owned by Drs. Kaminski and Good. Our new name—Acadience—brings together all of our assessments into one comprehensive suite of educational tools, so we're really excited about our new name.

Pam Austin:
It's exciting altogether. I'm listening. This is an extensive list of measures, these assessment measures. In a word, I would say “all-inclusive”—or maybe that was two words. I just think of the idea of assessments and PD is what you mentioned, and the decades-long work. This is the end result of it, isn't it?

Dr. Kelly Powell Smith:
Yes. Yeah, it is. It's been a journey, for sure.

Pam Austin:
How do you think this school year will be different in terms of assessment results?

Dr. Kelly Powell Smith:
Well, I think what we are generally seeing in our data, both [in] reading data and in math data, is that students on average began this year with lower skills than would be anticipated based on watching the patterns of performance over time. And that's likely due to the disruptions in schooling caused by the pandemic. And we aren't the only ones seeing this. Others out there who have assessments have been doing similar analyses, and they're seeing consistent things with respect to what we are seeing as well. We've also seen those impacts felt differentially. So, inequities, in some cases, have been exacerbated—for example, among low-income students or students who are English language learners. So we need to be mindful of this when we consider assessment results and next steps to take.

Dr. Kelly Powell Smith:
What we don't know at this time is how much of that ground will have been made up by the end of this year. And this is something that we'll be taking a look at in our own research. Now, we may see some continued impacts on academic performance, such as students ending the year, on average, lower in reading and/or math than would be predicted based on prior years. However, we may also see the positive impact of many educators working extra hard, resilience among students might contribute to better outcomes than you might've initially anticipated, and a marshaling of resources to get students back on track. So it's really my earnest hope that these factors combined—working hard, student resilience, and marshaling of resources available—those things combined would be helpful to moving us back toward what we might typically see in terms of academic performance.

Pam Austin:
Well, I think you answered my next question, because I was going to ask why is the end-of-the-year assessment data so important? And I think you began to answer that question based on what we hope to see—the possibility of students' improvement and progression, despite them starting off at a lower level than expected. Well, I'm going to let you expound on that anyway. I'll repeat the question: Why is the end-of-the-year assessment data so important?

Dr. Kelly Powell Smith:
Well, I'm glad you actually asked that because I feel like it could be important to us in really two specific ways. First of all, the end-of-the-year data this year could be used to identify students who we would want to receive additional support over the summer. It also could help us to determine what type of support would be provided to those students, as well as the skills to focus on with respect to that support. In particular, I would want to be on the lookout for students who have not yet reached benchmark on skills that should be mastered by that timepoint. So, for example, phonemic awareness in kindergarten or basic phonics and decoding by the end of grade 1. Those would be really critical things to examine. And when thinking about what support to plan for over the summer, options might include summer learning opportunities. And in fact, that might be beneficial for all students if it were a possibility. We could also look at tutoring programs or online programs, or even providing materials to caregivers to use in supporting their students. So that's one way.

Dr. Kelly Powell Smith:
The second specific way is that the end-of-the-year data can help us with planning for next fall. So back when I was a school psychology professor at the University of South Florida, there were several of us who worked two days a week at an elementary school. And we had practicum and intern students who we supervised there, but we also had a very strong, collaborative relationship with the school’s problem-solving team. And one of the things we always did before school let out for the year was to review their universal screening data for end of year and develop what we called the get-go list. And these were students for whom we wanted to prioritize following up as soon as school began in the fall, so educators could address the needs of these students immediately, even before fall data were collected. And I think this kind of practice makes sense now more than ever. These data can help with planning the resource needs in the fall for those students, whether that's personnel, or professional development, or scheduling, or even looking at what materials might be necessary to purchase prior to fall.

Pam Austin:
So, no delays for offering support. And I love that. We're looking at clarity of support for the summer, clarity of support for the school year as well. So, what are some warning signs or alerts regarding literacy scores that educators should look for in examining end-of-the-year reading results?

Dr. Kelly Powell Smith:
That's another great question, and another one where I would say there's sort of two levels to examine here. One is the individual student level, the data there, and the other is the systems level. So, at the student level, warning signs might include things like the student ending the year either below or well below benchmark on those critical skills that are assessed by our Acadience® Reading assessments.

Dr. Kelly Powell Smith:
Another thing that you might look for in an individual student is a student who has not made much progress over the course of the year. So, for example, a student who was below or well below typical progress using our pathways of progress, or a student who has a flat trend on their reading progress over the course of the year and [in] the progress-monitoring data. And, in particular, we might be concerned when we have a student who is part of an intervention group, and that student has not made progress relative to other students in the intervention group—so, the other kids in the group were making progress, but an individual student in that group is not, and it's not always necessarily related to the reading data. There's one other thing I would toss in here for the individual student level, and that would be to consider when student motivation is waning. Like, that would be another warning sign.

Dr. Kelly Powell Smith:
Now, at the systems level, we may wish to examine data that helps us to make decisions about instructional effectiveness, and we could look at each tier of instruction and ask some questions. So, for example, is our core instruction supporting students who began at benchmark to stay at benchmark? We could look at that at the end of the year. And some criteria we might consider is, are we keeping students… like, 95 to 100% of those students who started at benchmark, are they still at benchmark? When it comes to our strategic supports or our Tier 2 interventions, are they supporting most—80% or more—of students who started in that below-benchmark range to have reduced risk and get to the benchmark?

Dr. Kelly Powell Smith:
So, that's another place where we might look at instructional effectiveness at the systems level. And then we could also ask a similar question for more intensive supports, our Tier 3 interventions. Are those interventions and supports helping 80% or more of students who began well below benchmark to reduce risk? Meaning those students either move up to below benchmark or reach the benchmark—or, even better, move above the benchmark. So, if any of these levels of the system are not meeting those targets, we might consider what resource allocation decisions could support us to be more effective in the future.

Pam Austin:
Yes, there's so much to be considered, and you end it with that term “effective,” to be more effective with the resources that you have. The thought of being prescriptive and diagnostic ran through my head as I'm listening to all the details that you have provided for us. Thank you so much, Dr. Powell Smith. How would you use the end-of-the-year results to plan for some instruction? You did mention summer instruction. Well, how can we anticipate that data being used to plan for support for students over the summer?

Dr. Kelly Powell Smith:
Yeah, as I mentioned, those end-of-year data may be used to identify students who we'd want to provide additional support to over the summer. It can also help us to determine the type of support to be provided, as well as the skills on which to focus with respect to that support. So, we could do this much in the same way we would advocate for people to do during the course of the regular school year. So, by looking at our data—like Acadience® Reading screening data, for example, or progress-monitoring data—you could look at your diagnostic data to determine instructional targets for students and then plan how to address them. Options might include, again, those summer learning opportunities, providing materials to send home with the students, getting them set up for tutoring. And as we do this planning, we might want to keep in mind the need to gather resources and consider additional resources that could be funded or supported by money available through the American Rescue Plan Act. So, there's actually money available that schools could take advantage of to help fund some of these summer opportunities for students.

Dr. Kelly Powell Smith:
Really notably and kind of exciting is that there's actually research to suggest that summer learning can boost skills for all students, even helping them to achieve two to four months of growth when the experience is intensive and carried out in small groups by well-trained and experienced teachers. Some studies even show greater gains for students who are disadvantaged. So, summer learning opportunities are a really great option, especially now that we have some special funds set aside that folks can access. Now, it would be desirable to use end-of-year data to determine the best approach to progress-monitoring as well, so it'd be good to continue some progress-monitoring while students are accessing those intervention programs so we know that they’re having the impact that we would like them to have. And those data also could be helpful as we think towards the fall and plan what the next steps might be for those students.

Pam Austin:
All right. Great. Because my next question was the suggestions that you’re making for the summer. Would you suggest they would parallel the needs that we need to provide for students in the fall as well? Would it be more extensive? Would there be any differences?

Dr. Kelly Powell Smith:
I think the process would be very much the same in terms of looking at the data that we have on the student progress, monitoring data, or diagnostic or screening data in the fall and using that information to plan what next steps we might take. So, setting a new goal for a student, deciding on progress-monitoring material, or even if the student is ready to sort of exit that additional support. Maybe they don't need it anymore and those resources could be allocated elsewhere, is another consideration.

Pam Austin:
Great, wonderful. So it boils down to really focusing on student need, whether it's summer, whether it's the fall. That intervention that students need, we just have to make sure we provide it based on the data we collect with our assessments. Am I correct in saying that, Dr. Powell Smith?

Dr. Kelly Powell Smith:
Yes. I think that's a great way to frame it.

Pam Austin:
All right. Thank you. I do have another question. So, what type of assessments should educators use to get the strongest data feedback? Are there any that you recommend? And what about recommended literacy assessments for adolescent students?

Dr. Kelly Powell Smith:
OK, great. I think you're probably not going to be surprised by my response here, but I think the strongest approach is to have a suite of assessments available, beginning with screening and progress-monitoring of essential skills, such as literacy and math, and following up on those data with additional assessments when warranted to address student needs. So, for example, using Acadience® Reading Survey to determine instructional and progress-monitoring levels for students who are scoring well below their grade level, or diagnostic assessments like Acadience® Reading Diagnostic to identify optimal instructional targets for students who are struggling. And then once goals are set and interventions are implemented, we want to monitor student progress to determine if the interventions are having the intended impact. And then those data provide a really important feedback loop. So, if adjustments are made to an intervention, we would be looking at those data to see if those adjustments were resulting in improvement and made a difference.

Dr. Kelly Powell Smith:
And with respect to adolescents, Acadience® Reading 7–8 provides tools—both for universal screening and for progress-monitoring—in content-area reading skills, which are critically important for students in that age range. And importantly, those data can help identify middle school-aged students who are struggling with the mastery of those critical content-area reading skills, and they can also be used to help identify students in need of basic skills instruction in reading. So, I just want to mention a couple of things about that. With Acadience® Reading 7–8, we use a multiple gating process, which saves time. At each gate, the pool of students who are assessed is narrowed. And the first gate is our Maze assessment, and that's given to all students and can be group-administered. And students who score below or well below the benchmark on that assessment move to gate two and are administered the Silent Reading assessment with comprehension questions. And then from that group, students who score below or well below benchmark move to gate three and are administered a one-on-one oral reading assessment with comprehension questions.

Dr. Kelly Powell Smith:
And for those students—so, the students with the greatest needs, the most academic difficulty, those who score below and well below on gate three—we might consider something like Acadience® Reading Survey, which will help us determine an appropriate out-of-level target for instruction and progress-monitoring. So, it's likely those students are going to need to be dropped back into materials that are lower than grade 7, and Survey could help us figure out what the best placement is for that student in terms of progress-monitoring materials and instructional materials. We might also consider the use of something like the diagnostic assessment to help identify instructional targets for those students.

Pam Austin:
A suite of assessments. I think you need them all to do the work accurately.

Dr. Kelly Powell Smith:
Yeah. I would advocate for folks having access to all of them.

Pam Austin:
And at the risk of being a broken record, the detailed focus on assessments brings to mind these two terms: prescriptive and diagnostic. Dr. Powell Smith, that's what keeps running through my head as I listen to you detail the needs of students and trying to give them all of the support that they need, really focusing on those outcomes.

Dr. Kelly Powell Smith:
Yes. I agree.

Pam Austin:
Unfortunately, we are nearing the end of our podcast. Lots of good information that you shared today. Does Acadience have any new features coming out that would help educators in examining their student data?

Dr. Kelly Powell Smith:
Yes, we do. In our Acadience® Data Management data service, we have added a new gate three score calculation to help users of Acadience® Reading 7–8 obtain that score without having to do the manual calculations for it, which is a time-saver for the educator. So, that's a support that's newly available. And also, we will be expanding our training of Acadience® Reading 7–8 by building a data interpretation training workshop, so folks can be looking for that.

Dr. Kelly Powell Smith:
In addition, our pathways of progress technology will be implemented for Acadience® Math within the Acadience® Data Management system, allowing users to have access to a goal-setting utility like the one we have for reading and to be able to examine student progress relative to peers who have the same initial math skills—so, providing that normative comparison when looking at student progress and making decisions about it. And then finally, our progressive web application Acadience® Learning Online has direct entry of paper-and-pencil-administered Acadience® Reading K–6 assessment data coming on the horizon. So, users will want to be sure to check out ALO now so they can learn more about what that dynamic data service has to offer them when it comes to examining their data at both the student and systems level.

Pam Austin:
This is all very exciting. So, Acadience provides both the assessment and the professional development so you can understand what the assessment data means.

Dr. Kelly Powell Smith:
Yes, exactly.

Pam Austin:
All right. Finally, if you could wave a magic wand and change one thing in the world of education, what would you change and why?

Dr. Kelly Powell Smith:
Well, this is a really tough choice if I only get to pick one thing. But seriously, given the negative outcomes that so often come from a student not learning to read—whether thinking about things like the prison pipeline or maintaining inequities in our society to things like being an informed consumer of information, and all of which have these sort of cascading negative impacts on individuals over their lifetime—I would want to wave my magic wand so that all children had the opportunity, the high-quality instruction, and the support necessary so that they could learn to read successfully and with enjoyment.

Pam Austin:
Oh, wow. I just love that answer. Thank you for joining us today, Dr. Powell Smith; it's been a pleasure speaking with you. Please tell our listeners how they can learn more about you and how to follow you on social media.

Dr. Kelly Powell Smith:
Well, I would say that listeners can learn more about my work and the work of the team at ALI by following us on Facebook and Twitter and by going to our web page, acadiencelearning.org. These are good places to look for announcements about presentations or webinars, developments with respect to training opportunities, publications, technical reports, as well as other news and announcements relative to our work. So, those are probably the best places I can think of to go.

Pam Austin:
Thank you. This is Pam Austin, bringing the best thought leaders in education directly to you.

Dr. Kelly Powell Smith:
Thanks, Pam.

Narrator:
This has been an EDVIEW 360 podcast produced by Voyager Sopris Learning. For additional thought-provoking discussions, sign up for our blog, webinars, and podcast series at voyagersopris.com/podcast. If you enjoyed the show, we’d love a five-star review wherever you listen to podcasts; it helps other people like you find our show. Thank you.