Towards Effective Implementation
I got a note last week from a school that had been thinking about implementing JumpRope for the 2014-15 school year. Over the last four months, we’d had many conversations with folks from this school - a very thoughtful, very progressive bunch - but as one of the staff members wrote me, “we will not be prepared to even consider a pilot for 2015.” He continued: “Unfortunately, this is a case where we love the product but lack the infrastructure to support it.” I appreciated this staff member’s honesty and his understanding that he and his colleagues have work to do around standards-based teaching and learning before they undertake a tool like JumpRope. We know that a JumpRope implementation will be significantly more effective after the team at this school lays its standards-based teaching and learning groundwork.
In fact, we here at JumpRope have been thinking a lot about this issue of readiness for implementation. You might remember my last blog post, when I wrote that “like classrooms full of students who are at different points of readiness, schools and districts are the same way on their standards-based grading paths, and there are ways that we can help those that might be at more nascent stages.” Well, we got to thinking even more deeply about this issue of readiness and wondered if we could be really specific about what it means - at least from our experience. If a school is implementing its standards-based system effectively, just what will that look like, and can we develop a tool to measure the level of implementation? And not just a tool for us - but, if we can be so bold, a tool any school or district might use to get a sense of where it sits on the “standards-based practice” continuum and therefore its readiness for a JumpRope implementation.
So that’s what we’re working on: what we call our implementation rubric, which will help measure the implementation of standards-based teaching and learning in a school or district setting. While the work’s ongoing, we’ve been collecting and analyzing information on successful and not-so-successful implementations for years and have created a draft of the dimensions of the rubric; here are three, to whet your appetite:
My school administration and staff can communicate effectively with students, parents, families, and the community about both the value and the practical aspects (how to read reports, how to engage with the data, what it means for kids going to college, etc.) of standards-based grading.
My school community has the technical resources and expertise to manage and support a new mission-critical technical tool.
My school values and honors the difference between academic learning and habits of work, their relation to one another, and the importance of each in analyzing and communicating mastery data.
Right now, we have ten dimensions, all in draft form, and know that we have more work to do. This coming week, for example, we’re developing the descriptors for the scale levels (we’re using a 1-4 scale), starting with the language that identifies a 3 and then moving on from there. In this work we may find that these dimensions need further simplifying, even excising, so that the rubric is not too long and convoluted - just as teachers work hard to design rubrics that convey all they need to without confusing their students. Comprehensiveness is important, yes, but more important is usefulness to someone in a school or district; this document needs to be a page or two, not ten.
Once we get a draft finished, we look forward to sharing it with our partner schools and districts to get their feedback. We understand that this implementation rubric is being written from our own experiences, and we look forward to hearing from a wider audience about it, particularly about its utility.
Why bother undertaking this task? We’re just some ed tech firm, right? Well, we like to think that we’re not; we like to think that we’re a group committed to improved teaching and learning, through the lens of standards-based grading, and we see our implementation rubric as not only helping us with the implementation of JumpRope at our partner schools and districts but also helping the larger standards-based grading community. We hope to make a difference in that community - to help others improve practice - and we believe that this rubric can do just that.