Reporting to parents

With just six reporting cycles under my belt at The Wellington Academy, this term saw the release of Progress Reports to all seven year groups within days of the term coming to an end. Reflecting on this process, I hope, might just expedite it for other school leaders embarking on the same journey.

Back in April 2013, we started with the end in mind; our aim was to enable parents and carers to be more involved with their children’s education. In our context that meant clearer, more regular, more informative progress reports: Progress Reports that enabled parents and carers to engage their children in a conversation about their learning, and gave them the confidence either to celebrate their children’s successes or to contact the Academy for further advice and guidance.

We consulted with Heads of Year, Heads of Curriculum and selected parents and carers, and listed what we wanted to include: name, tutor, attendance and lates, Key Stage 2 grades, subject names, teacher, current or ‘Working At Grade’, ‘Most Likely Final Grade’, ‘Target Grade’ and ‘Attitude to Learning Grade’ – formerly known as ‘Effort’. We knew we wanted to chart a flight path, or show learning over time, and designed a report that would present at least three terms’ worth of information.

At the end of Term 1 we released progress reporting to all year groups for the first time. Data entry went relatively smoothly, if a little delayed. Print runs from our MIS (ISAMS) to local printers caused issues, so we moved to producing each year group’s reports as a single PDF document and sent printing to reprographics. This proved a useful way to share year group documents with Heads of Year. All Progress Reports were posted by the Tuesday of the half term. (Only Target Grades and Attitude to Learning, or ATL, grades were sent to Year 7 parents and carers.)

We also added a new data point – breaking down and reporting the ATL grades in deciles, plus one further band, the bottom 5%. Pupils in the top 10%, 20% and 30% bands were recognised in assemblies and with a letter sent home to parents and carers thanking them for their contribution. Finally, pupils securing “All Gold” grades – approximately 25 students – were invited to have lunch with the Principal. The parents and carers of pupils in the bottom 20%, 10% and 5% bands were also notified. Parents and carers of pupils in the bottom 5% were asked to come into school to meet with Heads of Year and to discuss the report cards that students were being issued. All data was added to 4Matrix, and summaries were shared with Heads of Year, Heads of Curriculum and the Curriculum and Achievement Governor.
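The decile banding could be sketched in code. This is a minimal illustration only – the function name, the pupil-to-score mapping and the idea of a single mean ATL score per pupil are my assumptions, not our actual MIS setup – of ranking pupils and assigning each a decile band, with the lowest 5% flagged separately:

```python
def atl_bands(scores):
    """scores: dict of pupil -> mean ATL score (higher is better).

    Returns pupil -> band label ('Top 10%' ... 'Top 100%'), with the
    lowest 5% of the cohort flagged separately as 'Bottom 5%'.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    bands = {}
    for i, pupil in enumerate(ranked):
        pct_from_top = (i + 1) / n * 100   # percentile position from the top
        if pct_from_top > 95:
            bands[pupil] = "Bottom 5%"
        else:
            # Round the percentile up to the nearest 10 to get the decile band
            bands[pupil] = f"Top {int(-(-pct_from_top // 10) * 10)}%"
    return bands
```

In practice this calculation lives wherever the assessment data does (the MIS or a spreadsheet); the point is only that the band boundaries are mechanical once pupils are ranked.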

At the end of Term 2 we added the new data series, plus an indicator of whether the student’s ATL ranking was rising or falling – one arrow per band. Again the same curriculum and pastoral procedures were applied. It would be fair to say that this data point had the most significant follow-up from parents and carers, and was the most focused point of conversation at the Year 8 parents’ evening. The process was repeated in Term 3; however, pupils in the bottom 30% bands for ATL were handled very carefully, as some students had made real personal progress and had either not moved up a band or moved only one band. Of course, while ATL has been slowly improving, as ‘we’ know, ‘every student’ cannot be above average. We now report the last three terms’ data on a single report. We send this report home and issue a copy to tutors, which is then stuck into the pupils’ planners. It enables any staff member to hold a purposeful conversation with a pupil at any time.

We have decided to show only three series, or cycles, of progress. In Term 4, we will report Terms 2, 3 and 4. The only new data we are considering sharing is reading and spelling age, which is reported twice a year internally.

Y7 progress report (image)

Leading progress reports

From a leadership perspective we were influenced by Paul Bambrick-Santoyo’s Driven by Data: A Practical Guide to Improve Instruction. If we were going to report more frequently, we needed more frequent assessment cycles and more frequent assessment collections; written reports were not a viable option. What Driven by Data emphasised was that assessments needed to be carefully designed and moderated before being issued, then promptly marked, analysed and reported, with feedback from the assessment addressed by the teachers and acted upon by the pupils. (To summarise a whole book in a pithy sentence or two is of course unreasonable.) This represented the real change in teaching methodology.

From a teaching and learning perspective, there were two key management tasks. First, Schemes of Learning needed to be reviewed and adjusted to include purposeful assessment time, assessment feedback time, and time to act on pupil achievement through remedial teaching or re-learning. This is perhaps a teaching and learning debate, and it is prudent to set it aside for this post. Second, middle leaders needed to allocate department time to the design and moderation of assessments, and to the moderation of the assessment marking, so that the data being entered was accurate and had integrity – and then to ensure these processes were timely and that data was entered on time. This may be the ‘driven by data’ model; however, it is in fact ‘driven by pupils’ learning, investigating the impact of teaching on learning and reporting that back to parents, carers and pupils’ – but I guess that was not a sufficiently catchy title. This is not the data tail wagging the learning dog; this is sound monitoring to support learning.

What does that look like in practice?

A typical six-week term for staff is obviously influenced by the number of lessons per week; however, in this model it loosely follows this pattern:

Week 1-3 – teaching

Week 4 – teaching, preparing for an assessment

Week 5 – assessing, marking, moderation and data entry

In the time between the assessment and feedback, teachers have chosen to showcase lessons beyond the assessment, or challenge tasks, or supportive work picked up from book marking, or personal targets, or to revisit a lesson that didn’t go so well. In some cases, pupils have received personalised tasks, or used the time to feed back their ideas on the lessons.

Week 6 – feedback, recording actions required, planning response, remedial teaching and / or re-learning

At the end of last term, we started to see staff setting up the next cycle of learning, so that the new term hit the ground running.

Behind the scenes

From a data management perspective, assessment data cycles and data checks need to be established. We were moving from a complicated, tiered reporting calendar that attempted to balance and schedule reports at key times of the year for key year groups, to all year groups, every cycle. That means approximately 10,000 data entries per cycle, from 80+ staff. Expect errors.

In addition to the new data cycles, we moved to a new MIS (ISAMS) and were still getting to know the system from a data management and set up point of view and from a data entry point of view; there were a few minor errors here.

We have had issues with parent and carer contact addresses, and with identifying which parents, or sets of parents and carers, should receive a copy of the report (the same as most schools, I would expect). Second, as the reports are not produced within the MIS (rather as an SQL report), we need to run multiple batches of approximately 40 pupils’ progress reports; selecting and printing these each cycle is time consuming.

Like all progress reports, we had to discuss how we would work with carousel subjects and how best to explain this to parents and carers – when to include and exclude subjects, whether to include them in the report, or whether to use NA or a dash. We have found that blank cells require more explanation.

NA – a grade cannot be reported.

‘-’ – subject not taught.

We explain this in a mini report guide shared via our website. In this document we included the image of the Progress Report (above) and the definitions.

We have experienced parent queries on the published KS2 data, and have recently double-checked all our 900+ pupils’ Key Stage 2 results. Corrections have been made. This may be a legacy or a government issue; though, if starting a new role, a new school, or a new reporting process, I would recommend double-checking this information. We found a number of inconsistencies, and a handful of Key Stage 2 scores had been added to the pupils’ records since we last collected the data from “Keys to Success”.

We made an assumption that staff would know how and what data to input for their classes. Despite verbal communications in briefings and simple instruction sheets for data entry, the most common errors were:

  • missing a class data sheet completely – either an admin error (class not assigned correctly) or a teacher error
  • not entering a grade within a class sheet – no grade, NA or dash
  • not using the appropriate grade set, e.g. B+ instead of Merit
  • entering a grade for a pupil instead of NA or a dash
  • sending conflicting academic messages, e.g. reporting a low Attitude to Learning grade while sending positive postcards
  • inconsistent reporting between data sets
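Most of these errors are mechanical enough to catch automatically before reports go anywhere near a printer. As a rough sketch – the grade set, class codes and data shape here are invented for illustration, not our ISAMS schema – a check for missing sheets, blank cells and wrong-grade-set entries might look like this:

```python
# Example grade set only; substitute the school's actual grade scheme.
VALID_GRADES = {"Distinction", "Merit", "Pass", "NA", "-"}

def check_entries(expected_classes, submitted):
    """expected_classes: iterable of class codes that should have a sheet.
    submitted: dict of class code -> {pupil: grade or None}.

    Returns a list of (class, pupil, problem) tuples for follow-up.
    """
    problems = []
    for cls in expected_classes:
        sheet = submitted.get(cls)
        if sheet is None:
            # Whole sheet missing: admin error or teacher error
            problems.append((cls, None, "missing class sheet"))
            continue
        for pupil, grade in sheet.items():
            if grade is None or grade == "":
                problems.append((cls, pupil, "no grade entered"))
            elif grade not in VALID_GRADES:
                # e.g. B+ entered where the grade set expects Merit
                problems.append((cls, pupil, f"invalid grade '{grade}'"))
    return problems
```

A report like this, run before the deadline rather than after it, turns most of the bullet points above from corrections into reminders.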

From our limited experience, incorrect data is more challenging to correct than missing data.

We have a “missed report check” system that displays the percentage of reports completed on the staff notice board in the lead-up to the deadline. We do not use staff identifiers.

Day 1 – following the deadline, which should have included department checks, we run the missing reports check again, plus a keying error check. Missed deadlines are recorded. Missed reports and keying errors or obvious mistakes (e.g. a Most Likely Final Grade that is lower than a current grade) are reported to Heads of Curriculum to follow up, with a 4pm deadline.
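The keying error check itself is a simple comparison. Sketched in code – with an assumed grade ladder, not our actual scale – flagging any pupil whose predicted final grade sits below their current grade looks like this:

```python
# Assumed example grade ladder, low to high; replace with the real scale.
GRADE_ORDER = ["U", "G", "F", "E", "D", "C", "B", "A", "A*"]
RANK = {g: i for i, g in enumerate(GRADE_ORDER)}

def keying_errors(entries):
    """entries: iterable of (pupil, working_at_grade, most_likely_final_grade).

    Returns the pupils whose Most Likely Final Grade is below their
    current Working At Grade -- an obvious mistake to hand to Heads of
    Curriculum for follow-up.
    """
    return [pupil for pupil, current, final in entries
            if RANK[final] < RANK[current]]
```

The same pattern extends to other sanity checks (a target grade below Key Stage 2 expectations, say); each one is a single rule run over the whole cycle’s data.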

Day 2 – outstanding cases are individually followed up. The ATL league tables are calculated and shared with Heads of Year for assembly celebrations. Y11 reports are processed as a priority. Heads of Year spot-check a sample of reports, particularly the known pupil issues; in some cases, pupil reports are withdrawn from the sample. We also use the Y11 reports to check the layout and that the data is pulling through correctly. The process then moves through Y10, Y9, Y8, Y7, and finally Y12-13. Reports are printed and handed over to the admin team for posting on the last Friday of term. A copy is placed in tutor pigeonholes for the first tutor period, for academic mentoring conversations.

Progress report errors

Management is doing things right. Leadership is doing the right thing.

Progress report errors are costly. Checking progress reports is time consuming; managing error correction is time consuming. Ensure that checking progress report data integrity is defined as a responsibility in middle leader job descriptions. Always follow up errors through line management. Conversations with Heads of Department may or may not lead to conversations with individual staff members, though conversations are always preferable to emails. There are often very plausible reasons why the last 5% of the progress reports are not completed; they may even be system errors. (In many organisations, how these remaining progress reports are handled is often determined by the Principal.)

Errors sent home are three times as expensive in time, are your responsibility (correcting the information, meeting with parents or carers), and, what is more, they hurt the school’s credibility. Let’s hope these are very few and far between, given the checks in place above and a team of supportive people behind you.

Thanks to Jon Adams, Simon Chappell and Sam Chalke.


(We know our parents in the lower years would like to read comments. This is of course not practical when collecting data six times a year. We are considering tutor comments twice a year.)
