EDVIEW360

New Year, New Name, Same Assessment: What Happened to DIBELS Next?

December 14, 2020 · Voyager Sopris Learning with guest Alisa Dorman of Acadience Learning · Season 2, Episode 3

As 2020 began, the assessment you knew as DIBELS Next® acquired a new name, Acadience® Reading K–6. Do you know why the name changed and that Acadience Reading (and its product family) are the only assessments developed by authors Dr. Roland Good III and Dr. Ruth Kaminski, based on 30 years of research? Join us as we discuss how Acadience Learning assessments can be used to support data-driven instructional decisions and improve student outcomes.

If you used DIBELS Next in the past, and/or are thinking about how to use assessment to drive student instruction in your school or district, this is a can’t-miss podcast.

Narrator: Welcome to EDVIEW360!

Alisa Dorman: "...This is a unique year in the same way that beyond the teacher in the classroom, supervisors, leaders, education leaders at the school and district level, are going to have to be using data in a very different way as well. Because they're going to have to assess impact and have to make some really important system-wide change decisions. I think that supporting the teachers and also supporting the systems to make those decisions is really the power that assessment can bring."

Narrator: You just heard Alisa Dorman, the Vice President of Strategic Partnerships at Acadience® Learning. Joining her is Kristen Biadasz, Senior Product Marketing Manager of Assessment at Voyager Sopris Learning®. Mrs. Dorman and Mrs. Biadasz are our guests today on EDVIEW360.

Pam Austin:
This is Pam Austin. Welcome back to the EDVIEW360 podcast series. We are so excited to have you back with us. I'm conducting today's podcast from my native New Orleans, channeling the heart of Voyager Sopris Learning in Dallas, TX. Today, we are honored to have with us Alisa Dorman, Vice President of Strategic Partnerships at Acadience Learning, and Kristen Biadasz, Senior Product Marketing Manager of Assessment at Voyager Sopris Learning.

Hello, Mrs. Dorman and Mrs. Biadasz. Welcome. We're so happy to have you here today.

AD: Thanks, Pam.

Kristen Biadasz: Thank you. I'm excited to be here.

PA: Would each of you tell us a little bit about your backgrounds and how you became involved in education? It's a little something we do here at our EDVIEW360 podcast series. Please include how you became focused on assessment. Alisa, share your education and assessment journey.

AD: Well, my assessment and education journey began, I guess, in my undergraduate preparation. I chose to become an elementary educator with an emphasis in early childhood as well as elementary. I started my work in a classroom and found right away the value of data and assessment. Very soon after I was in the classroom, I found my way to a state department of education, leading a reading initiative for that state. It was through that large-scale implementation that assessment became even more important to me, and to the ways I would be supporting districts, and the schools within those districts, to meet the intended outcomes. So, I think my journey began in undergraduate school and has continued through my college preparation and my professional experience in departments of education, and now here at a research organization, Acadience Learning.

PA: Oh wow. An eagle eye on education and an eagle eye on student outcomes. Data makes sense. Thank you, Alisa. Kristen, please share your personal journey.

KB: Well, thank you. I'd love to. Mine is different from Alisa's. I actually started in the education field in higher ed, where I taught public speaking for close to three years. Then I started to work in the education technology field, and I've been in that field for almost 15 years now. I've had the opportunity to work in literacy, mathematics, professional development, and then assessment. I quickly saw how important assessment was, the data it provided when the assessment was given with fidelity, and how that could really impact the teacher's work and, ultimately, student learning. So, I found assessment very fascinating. I couldn't wait to start working in that area and really dig deeper into it. That's how I came to Voyager Sopris Learning.

PA: Wow. Really spending a lot of time with instructional technology and highlighting assessments. Great. Thank you both. I just think it's so interesting how our journeys to and through education are so varied and unique. Each of us is a unique individual as well. Let's just get right to it. We've had some questions about DIBELS Next®. Please explain to our listeners who are unfamiliar with DIBELS Next what it is and what happened to it.

AD: Well, Pam, DIBELS Next is the name of an assessment that stands for many, many years of research and development. That particular assessment has evolved and its name has changed, but it is the same assessment. It is still a universal screener that can be used to gauge where students are and to monitor student progress. It has evolved to a new name, and that name is Acadience® Reading K–6. The reason the name changed is, I think, important for your listeners: the name evolved because our work as researchers evolved. We have existed as a research organization for 20 years, and our leadership, Dr. Roland Good, Dr. Ruth Kaminski, and the team have been doing this work for nearly three decades now. In that work, we have expanded beyond the initial beginnings of just early literacy. We've expanded our assessments into the preschool area.

AD: We've expanded our assessments upward into the middle school or junior high area. We've also expanded into the area of mathematics, and we do more than just screening assessment. We also have diagnostic assessments and survey-level assessments, as well as data management services. We really outgrew the names DIBELS and DIBELS Next. So, what we did is we evolved and became branded under the name Acadience. What you can know is that it comes from the same authors and authoring team, and that it is the same assessment with the same benchmark goals and the same interpretive framework. It simply has a new name.

PA: So, just to reiterate, Acadience Reading K–6 is the only assessment from Dr. Roland Good and Dr. Ruth Kaminski?

AD: Well, Pam, while Acadience Reading K–6 is the first assessment from Dr. Roland Good and Dr. Ruth Kaminski, it is not the only assessment from Drs. Good and Kaminski and the team at Acadience. In fact, we have an entire suite of assessments. As I mentioned before, we have Acadience® Reading Pre-K PELI, which supports screening for 3-, 4-, and 5-year-olds. We also have Acadience® Reading Survey and Acadience® Reading Diagnostic, which are complements to the Acadience Reading K–6 screener. Survey allows you to provide survey-level assessment and place students at just the right point for progress monitoring.

Acadience Reading Diagnostic lets you go a little bit deeper beyond screening to pinpoint the subskills that students may be missing, the gaps that need to be closed. We also have Acadience® Math, which provides universal screening for elementary mathematics. Very soon, before next school year, we will have the published version of Acadience® Reading 7 and 8, which is the middle school or junior high assessment. All of these are available from Voyager Sopris Learning as our publishing partner. So, while Acadience Reading K–6 is sort of our foundation and maybe what we've been known for, we actually have a very comprehensive assessment family.

PA: What an extensive offering of assessments. I'm sure educators are excited to learn more about them. Can you tell us about any new features that Acadience Reading K–6 has that DIBELS Next did not have? What types of assessments and tools does it have to support student learning?

AD: Great question. We continue to evolve our assessments and conduct research and analysis, and I think some things that are unique and important to mention about Acadience Reading K–6 are these: as we moved through the rebranding process from DIBELS Next to Acadience Reading, we were able to introduce into that assessment family the Acadience® RAN, or Rapid Automatized Naming, measure, which can come with the Acadience Reading K–6 assessment. This is an additional measure that can be used to help catch kids at risk and better support them; it is often used to pinpoint risk factors associated with reading difficulties, including dyslexia. Another thing we've been focused on and are really excited about is bringing an assessment that has historically been administered with paper and pencil to a new digital method. We were able to bring Acadience Reading K–6 to a digital platform, Acadience® Learning Online. Acadience Learning Online offers the educators, schools, and districts who use the assessment the choice between paper-and-pencil and digital administration.

When you choose digital administration, the educator is able to capture information through the testing experience and have that assessment data automatically entered into the associated data management system. From a single point of entry, you can see information about students, groups of students, and classes. You can see schools across the district, all through a progressive web application, a web interface.

PA: So, you're not only offering access to these well-known assessments, you're offering choices from pencil and paper to digital, and you're offering ease of use so that educators have the opportunity to view data reports. It's all very, very exciting. Now, this has been a year like no other. Can you tell us why assessments are so crucial in a school year like this one?

KB: Yes, actually. This year has been one like no other. I think almost every listener here has experienced some disruption, if not total disruption, of their lives, especially in education. Schools basically all switched to online last spring, and all of the states waived their high-stakes summative assessments.

Many educators came into the fall with no previous-year data from a summative assessment and really needed to understand how much learning loss had happened, where their students were in regard to their grade level, and what to do with that information. They needed to be able to design instruction for the type of learning environment they were in, whether it was online only, in person, a hybrid of the two, or asynchronous or synchronous learning. So, assessment is really crucial, because that data is going to help inform the teacher on what to do with those students so that he or she can move their learning along a path of growth in such a unique year for education.

AD: So, I agree Kristen. I think that you've really emphasized some of the most important features and considerations that assessments can offer. I think the thing that I would only add is, I think this is a unique year in the same way that beyond the teacher in the classroom, supervisors, leaders, education leaders at the school and district level, are going to have to be using data in a very different way as well. Because they're going to have to assess impact and have to make some really important system-wide change decisions. I think that supporting the teachers and also supporting the systems to make those decisions is really the power that assessment can bring.

PA: I agree. Oh, yes. Yes. With those varied instructional methods and systems that were put in place, and those varied results, you definitely need to have your finger on the pulse of what students know. Thank you both for sharing. I've got another question for you. How can assessments be used to help pinpoint where students are struggling? You talked about putting your finger on the pulse of where students are and what they know, correct?

KB: Correct. I think with this one, assessments have always been used to help pinpoint where students are struggling, and this year is no different. I think what's happening this year is that the assessment data is not only being used to pinpoint the struggling students, but it's also helping to find the starting point for every student in a grade level, to see whether they are still on grade level or below. So, assessment data is such an important tool to help inform the educator. It can help inform their data team and their professional learning community at the school level, but it's also great information to help someone at the district level know the status of all the students across the district, because that can help to pinpoint where they might need more resources, whether that be teachers, parents being involved this year, or physical resources like new programs.

It's really crucial information to pinpoint not only where students are struggling, but the rest of the student population as well, so that teachers and administrators can plan for the learning to happen in that year.

AD: Oh, Kristen, I think you, again, really have elevated some critical and important points. I think data can help us lift up needs and can help us really go through a problem-solving sort of framework to really begin to take action in response to those needs. I think you've covered that and I completely agree.

PA: Thank you both for your input. For educators out there who may not have an assessment solution and have no idea where to go, or who may be looking for a new assessment solution: what are the first steps here? What are some things they should look for?

AD: If I were going to elevate a few things that I would recommend educators consider, I think one of the top things on my list would be technical adequacy. We really have to know that the assessments we're going to use for important decision making have met the industry standards for technical adequacy. I also think we'd want to consider appropriateness of use, making sure that we're selecting the right type of assessment for the right purpose. Another thing I would elevate is having assessments be standardized, so that every assessor gives the test in the same way and every student has the same opportunity to perform. I think those are other really important considerations. I also think making sure that they are aligned in purpose, aligned to standards, and efficient are all important considerations. Kristen, these are some that popped to mind, but I'm sure you're thinking of some others.

KB: I do, actually, think of a few, especially if we're thinking about the current timeframe we're in. I think the other thing to consider is addressing the human need around it. I know some people might be looking for assessments to do various things this year, but some of the research I've seen from CCSSO and the Center for Assessment says to look at the human need first. Do you need to assess or not? It might do more harm than good. I think everything Alisa outlined for choosing and selecting an assessment is 100 percent spot on. I think you also need to be thinking about why you're looking for an assessment and what you're trying to address, because first and foremost: do you need it? Is it necessary for your job? I think you do; I honestly believe that, because we need to know where kids are.

Then, second of all, can it be implemented in a way that is helpful for the student, the educator, and the parent? So, going back to the technology point Alisa made there, I think it's really worth thinking about what kind of outcomes you're looking for, for your classroom, for your school, and for your district as a whole. How does the assessment, or assessments, you're looking at implementing get you there or help you create a path to get there? So, there are a lot of different items to think about.

PA: That's a heavy question. We are not assessing just for the sake of assessing, right? There's a purpose for it. That purpose ties in with what you both said: the human element, the purpose for assessing, and the reliability, I think, is a big, big issue as well. I do have another question for you. We know research is very important in literacy solutions, but why is research so important for assessment? Why should educators look for assessments that are research based and have reliability and validity? Those are two words I hear all the time with assessments. Why are they so important? Also, please explain what those terms mean.

AD: Well, coming from a research organization, it's very easy for me to elevate the importance of research. If we want to have confidence in what we're doing and the decisions we're making, we have to have confidence in the data that comes from the assessments producing that data. I think that confidence is critical in decision making. When it comes to, as you stated, research being important in the decision making for solutions, and for assessment in the same way, I think educators have to have an understanding of what the terms mean. You mentioned the terms valid and reliable. Briefly, reliability means consistency in a measurement, such as the assessments we're talking about here, and validity means the accuracy of that measure. We want to ensure that we have reliability, that we're able to produce those results repeatedly under the same conditions, across different settings, with different observers, at different times.

If we can do that, then we know that what we've produced is supported and reliable and can be used. It is also important that the assessment be valid, in the sense that it's measuring what we intend for it to measure and that we're able to look at those results in correspondence with other established theories or measures. It's really important that we look at both validity and reliability, and research does that. When you find an assessment that is technically adequate and meets that criteria, again, you have greater confidence in its utility and its application for decision making.

KB: I agree 100 percent with what you said. I've always looked to see what kind of research base was used to create an assessment, for those exact reasons: you want a reliable and valid assessment, because it all comes down to that confidence. Educators need to feel confident that the measure or assessment tool they're using is going to give them data that is useful and can help with student learning and with their teaching. So, I 100 percent agree with all you said. If you're looking for assessments and you're going through a decision-making process, you should be looking for assessments that are based in research, have been validated, and have been found reliable, because that will give you and the rest of your school and district confidence in those tools to help support your work.

PA: Well, I have to tell you, listening to both of you, if I were to summarize the answer on reliability and validity, it would come down to one word: confidence. Confidence in what you're using. Well, we're getting close to the end of our podcast. What do you think assessment will look like in 2021? What can we look forward to in 2021 from Dr. Roland Good and Dr. Ruth Kaminski?

KB: I'll take the first question, if you don't mind, and then I'll let Alisa weigh in on the second one. I think assessment in 2021 will look similar to what we're seeing in fall of 2020. We have educators who are learning how to assess, and they're trying to follow the standardized ways in which assessments are meant to be given. I think they are trying their hardest to make sure they don't stress out a student, or even the parents who might be monitoring at this time. I think that going into 2021, educators, administrators, parents, and students will have a better rhythm. They will have more confidence because they have gone through some of the ups and downs last spring and this fall, and they should, at that point, have, I hate to say it, standardized a way to give an assessment in this type of environment, and the data they get out of that will help them confidently make those decisions for student learning.

I think it's going to look a lot like what we've seen in 2020, except that we now have educators and administrators who have about nine months under their belts of an in-person, online, or hybrid environment, and who have started to put together standardized protocols on how they're going to administer assessments and use that data this year.

AD: To add to that, I would say assessment is still going to have a very important contribution to make, for all the reasons Kristen just outlined. I think assessment is being elevated as a really important contributor. It has often been thought of in a punitive manner, sometimes with consequences that feel high stakes. I think we're changing the conversation about assessment in such a way that people understand its contribution to the conversation and its power in informing.

I hope we're going to build on that energy into '21. You asked a question, and I really want to make sure I get the chance to say something about what we can look forward to in '21 from the team at Acadience. Specifically, what are Drs. Good and Kaminski thinking when it comes to the work of our research team? I will tell you that we are continuing to expand our research. We have a commitment to the community that we will, first and foremost, continue to study the impacts of the pandemic. We continue to evolve our guidance about assessments, and those are ongoing research projects that will continue well into '21 as we continue the analysis. Preliminary data does indicate impact; look for us to reveal some of our findings in early '21.

I also think you can count on us to expand our product inventory, meaning we have a few research projects coming. We have plans, and continue to do work, on expanding through high school. We also have, right now, a measure of spelling, or encoding, that's available, as well as an experimental measure for vocabulary and language. Those are active, ongoing research projects. I think the other exciting thing for us is that we will continue to evolve our Acadience Learning Online digital assessment platform. Our intention is to continue to bring the assessments that you know and can trust from this team to that platform, to, again, give educators options about administration methods and data management services. We're looking forward to '21.

PA: Thank you so much for sharing, both of you. The answers were so detailed, but there were a couple of big takeaways for me that I just want to reiterate for our listening audience. There is the idea Kristen mentioned: the experiences that districts, schools, teachers, and administrators have had, and how they may have put some systems in place for administration. I thought of the idea that, you know what, experience is the best teacher. We are flexible. We move, we shape, we grow from these experiences, and then maybe we take a look at new structures and systems that will support our students, because there's something Alisa said that really struck a chord with me: power in informing.

We always want to have that power of being informed in regard to what our students know, and because of that power in informing, I was not surprised at all, Alisa, to hear about the continued research on the impact of COVID. Research is never ending. I love the idea that Dr. Roland Good and Dr. Ruth Kaminski will be offering guidelines and supports, and that digital expansion of the assessment offerings. We're all looking forward to that as well. You know what? We're truly getting to the end of our podcast. Finally, if both of you could wave a magic wand and change anything in the world of education, what would you change and why?

KB: That's a good one.

AD: I think it's a great question. To hone in on just one thing is really difficult. I think all of us who are in education or education-related work really started that work with a desire to ensure that students, learners, all have the opportunity to meet their greatest potential and to reach important goals. If I could have one wish today, and tomorrow I might name another, I would want to empower teachers, educators, and leaders to be prepared and supported to implement and use evidence-based practices. That would extend from their preparation in colleges and education departments all the way through continued learning in in-service. I think it's through conversations like these that we can elevate the importance of that sort of teacher and educator preparedness and support, so that we can make decisions that are truly guided by evidence. If I had one wish, it would be that we could be guided by evidence. Kristen, what is your wish?

KB: Well, if I could take a magic wand and wave it to change one thing, I think about inequity. I'm sure many listeners have been struggling with the issue of inequity since March. Truth be told, there were always inequities in our school systems, in technology resources and things like that, before COVID, but things like access to technology, experience with technology, and access to reliable Internet have become barriers to education like never before. If I could wave my magic wand and change something, I would remove the barriers to learning for students, whether that is technology, Internet access, or simply getting the right tool or lesson in front of a student to spark their learning so they can continue to grow, no matter whether they're at, above, or below grade level. I would just love to remove that inequity so all kids have a level playing field to truly be their best selves in their education.

AD: I'll ditto that.

PA: Thank you both so much for your wish and for joining us today. Mrs. Dorman and Mrs. Biadasz, it's been a pleasure speaking to you. Please tell our listeners how they can learn more about Acadience and how they can follow on social media.

AD: Thank you, Pam, for having us. For those who want to learn about the work from Acadience Learning, you can follow us on Facebook, Twitter, and LinkedIn @Acadience, or you can visit our website at acadiencelearning.org. Kristen?

KB: Yes. I would also love to offer up our ways to get ahold of us here at Voyager Sopris Learning. You can follow us on Instagram, Facebook, Twitter, and LinkedIn @Voyager Sopris. You can also visit us at voyagersopris.com, to learn more about any of the Acadience assessments or the solutions that we offer as well.

PA: Thank you so much. This is Pam Austin, bringing the best thought leaders in education directly to you.

Narrator: This has been an EDVIEW360 podcast produced by Voyager Sopris Learning. For additional thought-provoking discussions, sign up for our blog, webinars, and podcast series at voyagersopris.com/podcast. If you enjoyed the show, we'd love a five-star review wherever you listen to podcasts and to help other people like you find our show. Thank you.