“Will I need a snorkel?”
This was our Head of English’s response to the news that Ofsted had chosen her department for one of five ‘deep dives’ into our curriculum. What would a deep dive even entail? And would we regret inviting in the big ‘O’ to pilot their new Inspection Framework in our school?
We had brought this upon ourselves. Feeling proud of our curriculum, our research-informed pedagogy and our recent improvements in GCSE outcomes, our Headteacher had picked up the phone and volunteered us as guinea pigs. It would be great CPD, he insisted. We’d get the ‘benefit’ of three HMI testing our leadership, with no school grading, no written report and only a professional dialogue about the school’s strengths and areas for improvement. The stakes, in theory, were relatively low.
Except, as Curriculum and Teaching and Learning lead, that’s not quite how I felt. I’d invested so much into implementing our curriculum policy, developing our staff and leading on quality assurance that I couldn’t help but feel the usual pre-Ofsted butterflies: the second-guessing and that desire to show off the school at its best. I prepared myself for an inspection process that would be every bit as intense and thought-provoking as the real thing. And that’s what it proved to be. It was, to say the least, a demanding couple of days. When Ofsted say that they are inspecting the ‘quality of education’ in its broadest sense – they mean it. Curriculum is at the heart of this.
Because every school is different and because this was only a pilot of a framework still in draft form, I want to avoid sharing school specifics that might be misleading. I found the inspectors to be professional and insightful, and I want to respect the fact that they too were genuinely trialling something. Anyone reading this blog and assuming that our experience is exactly how Ofsted will inspect any school in future would be unwise, especially as the consultation for the new framework was still underway at the time.
With all the context and caveats out of the way, what follows is an overview of our experience and more detail about those ‘deep dives’ that were at the heart of the inspection.
As per the draft school inspection handbook, the lead inspector arrived for a day prior to the two-day inspection to gather evidence. From this moment on, it was clear that this was a new framework. The first discernible difference was that this evidence gathering was not a trawl through data and paper trails. They did not want our internal data. They only wanted to discuss our GCSE outcomes, which influenced their early hypothesis about curriculum impact and shaped some of their decisions about which departments to spend time with. Instead of beating down the door of our progress and attainment lead, their priority was to meet the Headteacher and the Senior Leader responsible for Curriculum (me!). What they said to me was something to the effect of: ‘We’re going to be doing deep dives into your school curriculum. What do you think we should know before we begin?’ This was my opportunity to set out our stall and to describe our curriculum intent (to use Ofsted’s terminology). I explained the school’s curriculum policy, the evidence base that influenced it and the CPD that supports it. After that, they told me that they would test this. And over the next two days of the inspection, they really did put it to the test.
To do this, they took five departments and conducted ‘deep dives’ into their curriculum. These deep dives consisted of the following:
- Observing the Head of Department
- Meeting the Head of Department
- Joint observations of teachers in the department alongside the HoD
- Meetings with the observed teachers
- Meetings with pupils
- Joint work scrutinies (with the HoDs)
All of the above activities were focused on the curriculum. The inspectors used them to get under the skin of whether the school’s curriculum policy and intent were being implemented. They used them to explore whether SMSC opportunities were being taken or missed, and whether the breadth of opportunity and the value of a subject for its own sake were being pursued, or whether exam outcomes unduly narrowed learning. Book scrutinies were used as a springboard for discussion: Can you tell me where you are pleased with your curriculum? Can you tell me why you chose this work for pupils to do? Is there anything that disappoints you in relation to your curriculum expectations? What is the purpose of this assessment?
SLT had the strange experience of being practically side-lined for much of the first day. The meat of the inspection was those ‘deep dives’. But it wasn’t limited to just these. Many more meetings also took place, particularly on day two. There were meetings specifically about PSHE and Careers. More meetings with other Senior Leaders, Governors and Academy Leaders, usually relating their remit to the ‘quality of education’. Senior Leaders also accompanied inspectors at different points on tours of the school, the conversation always relating back to the curriculum.

Inspectors told us that they wanted to gauge the experience of a typical child at our school. What could a child expect their journey to be? What would be their progression over the years? What would they know, what would they experience and be able to do because of their time with us? What would their curriculum provision entail in the broadest sense? As I walked with an inspector towards our Arts and DT block, he asked me what I would expect to see. What would it please me to see when visiting these classrooms? What would disappoint me?

They had other questions too that related to the wider leadership of the school. They wanted to understand our use of form time, assemblies and enrichment opportunities. They wanted us to justify our leadership strategies. What was the evidence base for our decisions about curriculum, assessment, workload and data? What impact was this having? Personally, I was impressed and pleased with the quality of this dialogue. We discussed metacognition and long-term memory, particularly the guidance that we have been implementing from the Education Endowment Foundation. They were thoughtful and nuanced in this way. Not the ‘super trad’ ideologues that rumours suggest. They played devil’s advocate too: did I see any pitfalls in knowledge organisers? Is there such a thing as too much retrieval practice?
It was a testing, fascinating, and exhausting few days. When it was over, we were glad that the Easter Holidays were waiting for us at the end of the week. I’m holding Ofsted accountable for my unusually high chocolate intake this year.
We were proud of our school when we picked up the phone, and prouder still when the inspection was over. Like any school, we have opportunities to improve and work still to do, but we are glad to have had the input of inspectors that were so focused on the totality of the experience for children in our school – the ‘quality of education’ as a whole.
In case anyone accuses me of being too much of an Amanda Spielman ‘fanboy’ (guilty!), I will add a final thought to temper my enthusiasm slightly. The point is not lost on me that, as a pilot with no grading, we may have seen the best version of Ofsted. This was a purely developmental process, with no crude summary of our school in the form of a category. If there had been one, we probably wouldn’t have been able to resist putting a banner up on our school fences, but I do share the misgivings of others about the consequences Ofsted labels can have for the profession as a whole. Early in my career, I worked at a school that was placed in special measures, and I know the challenges that can follow. As encouraged as I am by the improved new framework, I understand why, for some, it still does not go far enough.