Most Recent, Power Law, and the Decaying Average

Conceptual Illustration

One of the trickiest and most controversial aspects of mastery-based grading is the way that scores (grades, though I usually try to avoid the word) are calculated in the system. Specifically, determining the level of mastery a student receives on a given standard across multiple pieces of evidence is probably the most complicated part.

In my experience, there’s no perfect way to calculate mastery for a standard. There are pros and cons to each approach, so it’s a matter of finding the proper balance between the philosophical and the practical. As you may know, JumpRope supports several different strategies/algorithms, and it’s up to the teacher, school, or district to choose. Today, I want to offer some thoughts on how to approach this choice if you wish to incorporate the concept that more recent evidence (newer assessments) should “count more” toward a student’s grade.

The first and obvious choice is what we call the Most Recent calculation, which works exactly as you may expect: It always rolls up the most recent score achieved by a given student for each target. Most Recent has a beautiful simplicity to it; it’s really the only method available that can be easily understood and easily “overridden” by teachers (by adding a more recent assessment selectively to students/targets where appropriate). That being said, it’s also the “harshest” of the methods that we offer in that it throws out prior evidence as new evidence arises.
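
To make the roll-up concrete, here is a minimal Python sketch (not JumpRope's actual code) under the assumption that each piece of evidence is a (date, score) pair for a single student and a single learning target:

```python
from datetime import date

def most_recent(evidence):
    """Roll up one target's score as the score on the latest-dated assessment.

    `evidence` is a list of (assessment_date, score) pairs for a single
    student and a single learning target.
    """
    if not evidence:
        return None
    # Sort by date so the newest assessment ends up last, then take its score.
    return sorted(evidence, key=lambda pair: pair[0])[-1][1]

# Example: the 2.0 and 3.5 earned earlier are discarded once the 3.0 arrives.
evidence = [
    (date(2024, 9, 10), 2.0),
    (date(2024, 10, 1), 3.5),
    (date(2024, 11, 4), 3.0),
]
print(most_recent(evidence))  # 3.0
```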

By contrast, one popular calculation method used by our school districts is the Power Law. It carries its own pros and cons, the biggest con being its mathematical complexity and the resulting difficulty teachers/students/parents have in understanding and explaining it. It's also much harder to override, in that you can't really force a specific value to emerge just by adding more recent evidence. See here for more detail.
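
To give a sense of where the mathematical complexity comes from, here is one common textbook-style formulation of a power-law trend fit, sketched in Python. It assumes assessments are numbered 1, 2, 3, … and that all scores are positive, and it is only illustrative; it is not necessarily the exact formula JumpRope (or Marzano) uses:

```python
import math

def power_law(scores):
    """Fit score ≈ a * n**b over assessment numbers n = 1, 2, ... and
    report the fitted value at the most recent assessment.

    Illustrative least-squares fit on log-transformed data; assumes
    `scores` is ordered oldest to newest and every score is positive.
    """
    n = len(scores)
    if n == 0:
        return None
    if n == 1:
        return scores[0]
    xs = [math.log(i + 1) for i in range(n)]
    ys = [math.log(s) for s in scores]
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Ordinary least-squares slope and intercept in log-log space.
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0
    log_a = mean_y - b * mean_x
    # Evaluate the fitted curve at the latest assessment (the n-th point).
    return math.exp(log_a + b * math.log(n))

# Example: an improving trend pulls the rolled-up score toward the later evidence.
print(round(power_law([1.5, 2.0, 3.0, 3.5]), 2))  # ~3.44
```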

Another similar option built into JumpRope is what we call the Decaying Average. This is a calculation method that we designed internally (unlike the Power Law, which is based on Marzano's work) as an alternative with fewer of the Power Law's downsides, one that still honors the concept that more recent evidence is more important when determining a student's level of mastery.
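
Here is a rough Python sketch of how a decaying average can be computed: each new score is blended with the running result so newer evidence carries more weight. The 0.65 weight on the newest score is purely an illustrative assumption, not a claim about JumpRope's actual default:

```python
def decaying_average(scores, weight=0.65):
    """Blend each new score with the running result so that newer
    evidence counts more toward the rolled-up level of mastery.

    `scores` is ordered oldest to newest; `weight` is the share given to
    the newest score at each step (0.65 is an illustrative default).
    """
    if not scores:
        return None
    running = scores[0]
    for score in scores[1:]:
        # The newest score gets `weight`; all prior evidence shares the rest.
        running = weight * score + (1 - weight) * running
    return running

# Example: the same improving trend as above, ordered oldest to newest.
print(round(decaying_average([1.5, 2.0, 3.0, 3.5]), 2))  # ~3.18
```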

I think it’s important for our partner schools and districts to play out the conversation about which calculation method to use, but I always caution folks against pursuing the holy grail/pot of gold when it comes to the calculation. I’ve seen schools and districts change their minds several times only to eventually realize that they’re chasing their own tails. It’s a tough reality of this work, I think: every situation is different, and computers aren’t always going to be able to match a teacher’s intuition about a student. In the meantime, the most effective solution I’ve seen is to provide some standardization/structure for the sake of clear communication and simplicity and to help educators understand the system well enough to know when and how to handle exceptions.