Driven by data – Analysis

Before I spend my time writing this post, and possibly a handful of colleagues invest their time reading it, I need to point out that my colleagues and I have not yet finished Step 1 of a Data Driven Leadership approach.

We have set out and published our data cycling calendar (back in July). We have our first assessment on the horizon, with a meeting on Tuesday covering the aligned, interim, reassess and transparent qualities of effective assessment. And we have our first ever assessment collection cycle, run through VP line management meetings in a fortnight's time, ensuring Senior Heads of Curriculum (SHoC) are assessing the right content, in the right format, to A Level, GCSE or National Curriculum standards, one full week before the assessments themselves take place.

All this is still to be secured before we collect the data and move on to the Analysis phase. There is a one-week window for assessment, given we work on a two-week timetable. Assessments then need to be marked and the data uploaded to our MIS. If staff wait for the whole-school analysis, we will not meet the recommended 48hr rule, nor will we empower teachers to own the process.

We will lead staff to mark and return their assessments on a class-by-class basis, with whole-school analysis shared with SHoC and a six-week cycle of reports home to parents and carers, all within five teaching days.

Halfway through this cycle, we will ask SHoC to lead a scheduled dept meeting on the assessment and the analysis. In many senses, this feels like a SWOT analysis of the course so far, for both students and the dept.

In preparation for this meeting, teachers will be expected to have reviewed the assessment and prepared action plans for each of their teaching groups, possibly for individual students.

Let’s look at question x. Why did the students get it right / wrong?

What did the students need to be able to do to get that question right?

This group got it right; this group did not do as well. Teacher X, how did you teach this section? Teacher Y, what did you do?

Who exceeded expectations? Which students need to review that content again?
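The question-level comparison above (which group got question x right, and what students needed to be able to do) boils down to a per-question "facility" calculation for each teaching group. A minimal sketch of that calculation follows; the class names and mark data are invented purely for illustration, and in practice the figures would come from the assessment data uploaded to the MIS.

```python
# Hypothetical marks per question (1 = correct, 0 = incorrect),
# grouped by teaching class. Invented data, for illustration only.
marks = {
    "10X/Ma1": {"Q1": [1, 1, 0, 1], "Q2": [0, 0, 1, 0]},
    "10Y/Ma2": {"Q1": [1, 1, 1, 1], "Q2": [1, 0, 1, 1]},
}

def facility(scores):
    """Proportion of students answering a question correctly."""
    return sum(scores) / len(scores)

for group, questions in marks.items():
    for q, scores in sorted(questions.items()):
        pct = facility(scores) * 100
        flag = "  <- discuss in dept meeting" if pct < 50 else ""
        print(f"{group} {q}: {pct:.0f}% correct{flag}")
```

A question that one class answers well and another does not is exactly the prompt for the "how did you teach this section?" dialogue above.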

With the assessment analysis underway, what might the possible actions be? What might these include?

  • Recap sections of the curriculum
  • Target and differentiate learning activities at particular students
  • Use ‘Do Now’ tasks as lesson starters to re-teach content that the last interim assessment showed was not secure
  • Self and peer assessment – developing a spirit of collaboration
  • Rewards and praise for achieving and improved student attainment
  • Additional support for identified students
  • Engagement with students: how they did, and what actions they need to take to improve
  • Parent and carer contact or referral
  • Ongoing assessment, shorter cycles of assessment for particular students or classes
  • Teachers plan new lessons collaboratively to develop new strategies based on data analysis

Following the analysis and dialogue, SHoC will be expected to review teacher plans and ensure that each plan meets pre-established expectations. This is a more distributed approach than Bambrick-Santoyo advocates; however, I cannot see any other way it can work at the moment. Accountability measures may arise from this type of dialogue if pre-established expectations are not met, or teacher plans are not fulfilled. These may in turn re-focus senior leadership involvement.

A summary of these plans and actions will then be discussed with the VP during the first line management meeting of the new term.

Whole dept data will by then have been collected and presented in 4Matrix (proprietary software). Commentary and key cases will be recorded in the subject commentary section. Together, this information will be used to direct student effort.

Key Principles for Leading Analysis Meetings

  • Let the data do the talking.
  • Let the teacher do the talking. (Or if necessary, push the teacher to do so!)
  • Always go back to specific questions on the test.
  • Don’t fight the battles on ideological lines. (In the larger picture, you’ll lose.)
  • You’ve got to know the data yourself to lead an analysis meeting effectively.
  • Keep in mind the difference between the first assessment and the third. (That is, if it is not the first meeting.)
  • Make sure the analysis is connected to a concrete action plan you can verify.

So that is the goal of my next six weeks of term: to develop an assessment process with improved rigour (alignment, re-assess(ment) and transparency), and to develop the analysis processes of teachers and the leadership of analysis in SHoC. (There is another aim – rapid improvement of teaching and learning and a communicated Wellington Academy teaching standard – more of that in another post, another time.)

I am also very aware that, somehow, there is an imperative to offer professional development that models assessment analysis and action planning. My personal reflections…

Not forgetting student targets and school estimates models to prepare, reporting to parents, lesson observations, professional development and support, line management, and whatever may arrive next week.

Taking a few steps backwards, rocking on my heels, in preparation to run and….  jummmmppppp.

