The National Audit Office recently released a report entitled Ofsted’s inspection of schools. The report examines whether the Office for Standards in Education, Children’s Services and Skills’ (Ofsted) school inspections provide value for money.
The report is insightful and a good starting point for a debate about Ofsted; however, it is also critically flawed on school improvement. This blog examines that flaw.
The link between cost, quality and improvement
The report sets itself the goal of examining the impact Ofsted has on “the quality of education, relative to the cost”. The report, however, does not adequately define what constitutes “the quality of education”, who is responsible for improving quality and the link between quality and cost.
The argument presented in this blog is that Ofsted’s role should entail the objective measurement of quality over time. Ofsted should not also be accountable for improving the quality of education because there is an inherent conflict of interest in doing both at the same time.
Ofsted defines its role as “a force for improvement through intelligent, responsible and focused inspection and regulation”. The question is, what is “a force for improvement” and can improvement be achieved through the use of accountability processes such as inspection and regulation? The NAO criticises Ofsted for overly concerning itself with its own processes and being unable to quantify its contribution to the quality of education.
The NAO report addresses Ofsted’s role and the confusion surrounding improvement and accountability:
The system for school improvement and accountability is fragmented and there is some confusion about Ofsted’s role. A range of different bodies are involved in holding schools to account and supporting them to improve, with different arrangements for maintained schools, academies and independent schools. Ofsted does not decide what action should be taken after it has inspected a school and does not intervene to improve schools. These are matters for schools themselves, the Department, local authorities and multi-academy trusts.
There is some overlap between the role of Ofsted and that of the Department’s regional schools commissioners, who oversee academies’ educational performance. The Department recognises the potential for confusion and duplication and, in May 2018, published principles for a clearer system of accountability. It plans to develop these into detailed proposals for consultation in autumn 2018 (paragraphs 1.13 to 1.17).
The NAO also lays the onus of improving the quality of education on various educational bodies such as schools, the Department for Education (DfE) and local authorities. Somewhat contradictorily, the report then suggests that:
The question is, how does Ofsted demonstrate its own impact on the quality of education when it neither decides what action should be taken nor intervenes to improve schools? If Ofsted is held accountable for both measuring and improving education, then what prevents it from simply engaging in inspection grade inflation and claiming to have improved the system?
As Amanda Spielman, Her Majesty’s Chief Inspector (HMCI), suggests:
The NAO makes it clear, however, that Ofsted:
The NAO report identifies the confusion surrounding Ofsted’s role but offers neither clarity nor a way forward. It adds to the confusion by suggesting that Ofsted should be able to measure the impact of its inspection processes on improvement whilst not being involved in the activities undertaken to improve schools.
My own view on Ofsted’s role is that it should be measuring the quality of education over time; it cannot itself be a force for improvement, but it can be a factor in the improvement process by generating data for others. A semantic difference, perhaps, but in a landscape where confusion reigns, semantic differences matter.
Does the NAO report resolve the cost/quality conundrum?
The NAO does not attempt to correlate the cost of inspection and its impact on the quality of education. In the example below, the NAO calculates efficiency based on cost without reference to quality:
Ofsted does not have reliable data on the efficiency of its state-funded school inspections over time. This is because it cannot match the costs of school inspection with activity. Specifically, it cannot separate the costs of inspecting state-funded schools from the wider costs of inspecting the schools sector. In 2017-18, the only year for which this calculation was possible (paragraph 1.4), we estimate that the average total cost per state-funded school inspection was £7,200. For earlier years, based on our work, Ofsted undertook indicative analysis using data on the costs of inspecting the schools sector. This suggested that the total cost per inspection may have decreased overall between 2012-13 and 2017-18, but is likely to have increased in 2014-15 and 2015-16 because Ofsted performed fewer inspections in those years. These changes cannot be reliably quantified.
If the NAO does not attempt to correlate the cost of inspection with the quality of education then neither does the currently serving HMCI who, in defence of Ofsted against the NAO report, suggests that:
The NAO’s conclusion that we cannot prove the value for money we represent is explicitly not the same as demonstrating that we do not provide value, particularly considering that the costs of our school inspection work represents just 0.1% of the overall school budget. We are confident we compare well against other school inspectorates internationally, something the NAO did not look at.
The argument HMCI proffers is one of cost competitiveness with other inspectorates. Is being cheaper than other inspectorates without any attempt to quantify the quality of those inspectorates purposeful?
The NAO report does offer some anecdotal, qualitatively generated evidence of improvement by suggesting that headteachers think the quality of education has improved. The data, however, does not reflect headteachers’ views over time, and the NAO’s own survey casts doubt on the data used in the report. Quite simply, 85 per cent of schools are judged to be good or outstanding, and almost exactly that proportion were satisfied with the outcome of an Ofsted inspection:
In our survey of headteachers, 84% of respondents said that the outcome of their school’s most recent inspection was fair. As would be expected, the better the inspection grade awarded, the higher the proportion of respondents who considered that the outcome of their most recent inspection was fair: the proportion varied from 100% of outstanding schools to 51% of inadequate schools.
Is there much point in including data of this nature in the report? The suspicion is that it lends itself to “perverse incentives”, with Ofsted caught between the need to improve schools and being judged by the satisfaction of the very schools it judges.
The argument that value or efficiency is reflected by cost alone is problematic. Simply being cost competitive, with no understanding of impact, says little about value or efficiency. Neither the NAO nor Ofsted offers a coherent view of cost relative to quality. This does not stop either from offering unconvincing arguments.
Does Ofsted know how to objectively measure improvement over time?
HMCI considers that Ofsted’s role involves objectively measuring quality, and I agree; however, my own view is that Ofsted does not objectively measure the quality of education over time.
Ofsted has an idea of the performance of a school relative to others at a given point in time, but not whether quality is improving over time. Inspection frameworks change, as do exams, pass rates and pedagogic fashions. Objectively measuring quality over time requires a sector-wide, transparent methodological approach to data.
As HMCI states:
Ofsted inspections cannot be the only source of data used to make judgements about whether the quality of education is improving or not. A longer-term policy approach to data, which includes exam results, progress data and the like, is also required.
The challenges of objective measurement and quality improvement
The challenges of objective measurement and quality improvement are painfully evidenced in the myth-busting guides produced by Ofsted and referenced in the NAO report. The myth-busting reports attempt to counter the actions taken by school leaders to second guess Ofsted. As the NAO report describes:
Quite simply, Ofsted has become a factor in the data that it collects. This is a major problem for any methodologically underpinned, evidence-informed approach to data collection and analysis.
Furthermore, it is also clear that, in addition to the pointless processes generated to second-guess Ofsted, there has been widespread fabrication of classroom practices, using discredited methods such as VAK learning styles, to produce evidence that satisfies the requirements of inspection. The suspicion is that far too much of what schools do in response to inspection is proxy action that has little impact but is uncritically accepted by inspection teams.
Robert Coe, in his inaugural lecture at Durham University, entitled Improving Education: A triumph of hope over experience, stated:
Despite the apparently plausible and widespread belief to the contrary, the evidence that levels of attainment in schools in England have systematically improved over the last 30 years is unconvincing. Much of what is claimed as school improvement is illusory.
Ofsted has problems to overcome in measuring quality. The good news is that there has been marked progress over the last few years with Ofsted genuinely interacting with educators and critically embracing evidence-informed ideas. The question is, do policymakers and the education sector as a whole share the same clarity of purpose?
Ofsted’s response to the NAO report
No doubt Ofsted is frustrated by the lack of clarity offered by policymakers. A response to the NAO report by HMCI certainly suggests so (see HMCI responds to National Audit Office report).
It is surprising that such a lack of clarity exists between Ofsted and the NAO. Policymakers have to be clear about Ofsted’s role and its relations with other educational bodies. The education system cannot improve whilst those responsible for it cannot answer the questions that matter.
The problem of data in the NAO report
The NAO does gather an impressive amount of data, but it is hard to see the methodology it adopts to analyse that data or to place it within a body of literature. In my last blog, I criticised Ofsted for gathering data and arriving at conclusions without an adequate methodology; the NAO seems to have done the same in this report.
As I mention above, small-scale qualitative surveys of headteachers’ views are de-contextualised from the literature, with no reference to time, an important factor in the report. In addition, the NAO draws on Ofsted-generated data to form a view on the effectiveness of Ofsted’s processes, seemingly without recognising that Ofsted is capable of managing the data it generates:
In 2016/17, the quality assurance processes led to the overall effectiveness grade being changed following 17 inspections (equivalent to one in 420 inspections). They also identified 36 (one in 200) inspections where a further evidence-gathering visit was required. These results were similar to those in 2015/16. Data from the first eight months of 2017/18 indicate that quality may be improving.
As HMCI alludes to above, the use of Ofsted data to ascertain the quality of Ofsted inspections could lead to “perverse incentives”.
The NAO report is insightful in many respects and does cast light on many of Ofsted’s issues; however, it also reinforces the view that education is systemically opinion-based, riven with poor use of data rather than evidence-informed, and made up of bodies that do not communicate with one another.
It claims that it:
You simply cannot expect a standards agency to be held accountable both for measuring quality and for improving it; the one impacts upon the other. Likewise, you cannot over-rely on the judgements of those being measured as an important indicator of the quality of measurement; again, one impacts on the other.
The NAO criticises Ofsted for a role it cannot perform and fails to scrutinise properly the role it should be performing. There is hope, however, and I agree with Robert Coe, who offers a good starting point for defining Ofsted’s role when he states the following:
(…) we need to do four things: to be clear what kinds of learning we value; to evaluate, and measure properly, teaching quality; to invest in high-quality professional development; and to evaluate robustly the impact of changes we make.
Most sector watchdogs begin with the premise that their job is to answer two questions: is the product of high quality, and does it represent good value for the consumer? Ofsted probably cannot answer those questions, but at least it now seems prepared to do something about it. A way forward would be to:
- adopt a sector-wide approach to evidencing improvements in the quality of education;
- define what “quality of education over time” means in terms of evidence;
- ascertain a replicable methodology to collect and analyse data, both qualitative and quantitative, representing the quality of education over time;
- expect Ofsted to produce reports based on systematically derived data using properly constituted mixed methodologies that are open to external scrutiny;
- fund research by other bodies, such as the NAO or research institutions, using the same methodologies as Ofsted to triangulate the data;
- remove the onus on Ofsted to generate improvements;
- measure the cost of Ofsted against data triangulated by other bodies;
- use the data generated by Ofsted to inform policymaking and instigate action by other stakeholders;
- hold the appropriate bodies and institutions to account for the improvement of quality.
This is supposed to be a blog but has turned into something of an epic, written in the time available for a blog. I rely on that defence should there be any punctuation, grammatical or logical errors or inconsistencies.