Leading graded lesson observations part 2

We were moving forward with a fairly standard graded lesson observation form. The top section gathered the contextual information about the class/lesson and the current progress of the pupils. The form was then divided into four key sections. First, “Learning and Progress” – what the pupils are learning and how they are progressing over time – despite my own reservations about whether, in lessons, we are observing “learning” or “performance.” I am proud to say we were not looking for progress but for sustained learning in lessons, and then progress over time. A position heavily influenced by Robert Bjork.

Unsurprisingly, it had a strong focus on “Teaching,” with differentiation and framing the learning both key focus points. It had a strong focus on “Assessment for Learning,” with pupils knowing their work and being able to confidently assess and feed back on their own work or that of their peers, with the development of effective questioning and the use of a marking dialogue between teacher and pupil to prompt ownership. And finally on “Engagement, behaviour and safety”: on high expectations, on staff–pupil relationships, on routines (just implemented), and on supporting and using school processes (also just implemented). On the reverse, the form focused on recording strengths and weaknesses, on taking opportunities for SMSC and RWCM, and on the agreed action for “PD and Support.” And of course, a section for the all-important, unreliable and often crucifying grade. These forms were submitted (with copies for the teacher and curriculum file) and filed away, out of sight and out of mind.

We continued to make minor adjustments in response to staff feedback, moving to identifying strengths only and a tighter focus on fewer areas of professional development. We made minor adjustments to incorporate infrastructure improvements (e.g. the expectation for staff to provide contextual data about their class, now that it was readily available) and to reflect inspection policy.

Two major changes were made in 2013. First, we set up school improvement software (webware), SchooliP. Second, we adjusted our lesson observation form and how we interpreted the aggregated scores of the four sections. During our inspection we were advised to place greater emphasis on the “progress in the lesson” – to make “the progress in the lesson” the driving or deciding factor in the grade. Clearly, we wrestled with this advice. However, given our recent inspection feedback and grade, we needed to heed the instruction. As a result, we thickened the line around the “Learning and Progress” section on the form and shared the feedback from the Ofsted inspector. We also moved from “Most students achieve well (which perhaps should have been learn well) during the lesson and sustained progress over time” to “Most pupils make good progress during the lesson and sustained progress over time.” After being “proud” to say we had side-stepped progress in favour of learning, we were reunited with progress. And what is progress if not performance?

I have a question for you. Where are your graded lesson observations “housed”? How often are they referred to by SLT, Curriculum Leaders and the staff themselves? How is your investment in the process translated into professional development? Our LOPDS forms were reviewed following each cycle, collated, analysed and centrally filed. Although Curriculum Leaders and staff were encouraged to keep a copy and work on their areas for development, I expect some staff never returned to the form commentary but could probably recite their grades for the past six observations. It smacks of double standards: comment-only marking for pupils, grades for teachers. As a result, I doubt we were very successful in encouraging reflection and professional development. I shall admit it: I set about the conundrum of improving personal reflection and professional development, and shared both my concerns and ideas with the SchooliP Development Team.

The 2013–14 Performance Review started without SchooliP. The first LOPDS cycle was completed offline. The forms had been collected, collated and analysed, with the “Areas for Professional Development” on the reverse of the form hidden neatly in a yellow A4 lever arch file. With time to spare, the Performance Reviews, all three lesson observation cycles and the Teachers’ Standards had been set up in SchooliP ready for the second LOPDS cycle; what is more, the first lesson observation cycle had been manually uploaded to SchooliP. Observers and staff could now review their previous lesson observation (and grade) on SchooliP and discuss the focus of the upcoming observation. We were now starting to encourage both a reflective and a developmental philosophy, though judgements were still casting a long shadow. To support our observers, the form could still be completed offline and hard copies handed in to the Vice Principal’s PA (to be typed up), but it could also be typed straight into SchooliP, in real time, during the lesson, then shared with the teacher in preparation for the feedback conversation or simply saved. Likewise, feedback could be added to the form, or straight onto SchooliP (the online version requiring the teacher’s agreement before being authorised, as opposed to being signed). Regardless, the end result was that both teacher and observer had full access to the graded lesson observation commentary, feedback and outcome. Analysis by curriculum area, teacher type, pay range, LOPDS cycle and more was available in a few clicks – and there was space on my shelving. Still, an opportunity to communicate how the teacher could be “even better” was missed.

Every teacher needs to improve, not because they are not good enough, but because they can be even better. – Dylan Wiliam

After a couple of months of development work with the Development Team at SchooliP, they had added a feature to transfer a section of commentary or “Area for Professional Development” from the lesson observation form and automatically add it to the Performance Review. Right there, front and centre. What is more, the Performance Review would then require the teacher and line manager to address the “Areas for Professional Development” in order for the Performance Review to be closed, further encouraging reflection and professional development. Then, by the end of the year, the Development Team at SchooliP had added the ability to forward these to the subsequent Performance Review. After all, not all “Areas for Professional Development” are simply addressed by attending a CPD course, or by the arbitrary end date of the current Performance Review.

All the while, the “grading” swell was darkening the skies. The clouds opened and it poured down. With the removal of grading, we were keen to focus even harder on the personal reflection and professional development of our staff and less on (part) professional accountability as measured by 0.4% of a teacher’s teaching responsibility (straight-talking Kevin Bartle has plenty to contribute here). We were not yet sure what “observations” would look like in the future, or what the outcomes would be, but we were keen to re-design the lesson observation process. Progress in lessons (good or otherwise) would not feature; I had submitted our divorce papers. Quantifiable measures or percentages of “good or better” teaching/teachers were gone with grades. How would we now qualify rather than quantify the Quality of Teaching? Those Teachers’ Standards may hold the key.

We wanted a culture that promoted personal reflection and professional development, a process that encouraged every teacher to be even better. A Performance Review that emphasised and recognised the process of addressing an agreed “Area for Professional Development” represented a positive cultural shift.
