Research, Analysis and Insight into National Standards (RAINS) Project
Second Report: Understanding New Zealand’s Very Local National Standards
Martin Thrupp April 2013
Report commissioned by The New Zealand Educational Institute Te Riu Roa (NZEI)
OVERVIEW OF THIS REPORT
1. This is the second report of the Research, Analysis and Insight into National Standards (RAINS) project, a three-year study of the introduction of National Standards into New Zealand primary and intermediate schools. The report builds on the first report published in March 2012 and is best read in conjunction with that earlier report.
2. The report reviews national developments in the policy and its contestation over 2012. It considers how the National Standards policy became less contentious in 2012 as it became overtaken by other events. However, the public release of National Standards data by the media and Government in September 2012 was a much-contested development that is reviewed here. Other more ‘under the radar’ developments discussed include the growth of business involvement as the policy has become targeted by for-profit providers of relevant products and services, and changes to Ngā Whanaketanga Rumaki Māori, the assessment system for Māori-medium settings which is being developed alongside the National Standards.
3. The report argues that it was irresponsible of the Government and media to release such poor quality data for comparative purposes and, in the case of special schools, such obviously inappropriate data. The initial release was rendered relatively impotent by the form of the data, the extensive qualifications wrapped around it as the media sought to justify its role, and the decrying of the release and the data by most principals and others. But the release of the data was not just a passing event; it remains available in a reasonably convenient and unqualified form for comparing between schools on several Fairfax websites. For this reason the release of the data will be more damaging than many people probably recognise. It has also potentially opened the door to further releases, depending on which government is in power in years to come.
4. Aspects of the RAINS research undertaken in 2012 are reviewed including various kinds of fieldwork, dissemination and the politics of the research.
5. A key issue explored by this report is whether or not the comparability of teacher judgements against the standards (called OTJs—Overall Teacher Judgements) can be improved to allow ‘apples with apples’ comparisons of student achievement across schools. It is argued that variation is often a symptom of the deep reach of school-specific factors (school context) along with the incremental processes of change that are all that schools can realistically manage. This mix of context and incremental change is referred to here as creating particular local school trajectories that will not be easily turned. The research discussed here indicates that variation in OTJ-making often relates back to these trajectories.
6. The Maths Technology Ltd research on OTJ-making commissioned by the Ministry of Education is briefly discussed before the report goes on to offer a more comprehensive conceptualisation and illustration of the reasons for variation between and within schools. Sources of variation at three related levels are considered: national/regional, school and classroom. The discussion provides a multi-faceted explanation and some rich illustration of why National Standards are actually very local.
7. Even if some of the national and regional sources of variation could be addressed, there are many sources of variation at the ‘local’ level that are impossible to set aside when it comes to making OTJs. National Standards may be a Government aspiration but they are not national and never will be while there is so much potential for local variation. It is almost comical—if it weren’t so serious—that OTJ data representing such variation has been put into the public domain for comparative purposes when there are such important differences in what it actually represents.
8. If the Progress and Consistency Tool to be made mandatory by the Government is mainly intended as a form of national moderation for OTJ-making, then it can be expected to be an expensive failure. This is because it will not be able to address many of the various influences and pressures on schools and teachers illustrated by this report that will lead schools to take different ‘readings’ of the National Standards and of OTJs.
9. The report provides an update for each of the RAINS schools for 2012. Matters covered include each school’s development of National Standards and perceived impacts, as well as particular activities related to the policy, such as the forwarding of National Standards data to the Ministry and the release of National Standards data by the media and Ministry. The schools’ relationships with the Ministry and views of the wider policy environment and likely prospects for the future are also discussed.
10. The experiences of the six RAINS schools in 2012 as reported by members of their Senior Leadership Teams and the outlooks of those SLTs were still mainly in line with the trajectories noted in the first RAINS report. Most of the schools were not making major changes to their approach to the National Standards. The most abrupt change was Kanuka starting to use ‘well below’, which was a change required by the Ministry but one that was resented. When it came to describing the impact of the National Standards, it was also only the Kanuka SLT that really viewed them in a favourable light, albeit not the use of the ‘well below’ category.
11. None of the SLT members interviewed in the RAINS schools were positive about the public release of National Standards data. They provided numerous arguments against this development and most had resisted forwarding data to the media. They were also mostly sceptical about PaCT to the extent that they knew about it, but those who had seen it were more positive about the new required template for National Standards reporting.
12. Those interviewed in the RAINS schools were generally unhappy with the way the Ministry was relating to schools, in some instances their own school’s specific relationship to the Ministry and in some cases more generally. Deep mistrust and a sense of being misunderstood were dominant features of SLT accounts of the Ministry. It was also clear that wider policy developments were often being viewed with concern by the schools and that they had also distracted from National Standards. Yet all of those interviewed provided nuanced accounts, ones where they were willing to take a favourable view of some developments and give credit where they thought it was due.
13. The report illustrates that popular views in some quarters about National Standards bringing accountability to the school system—views that have been encouraged by Government discourses around the National Standards—are often quite unrealistic. Despite some compulsory elements, the National Standards policy has so far turned out to be more of an exhortative policy than a disciplinary policy. Nevertheless there is growing evidence of schools ‘doing it to themselves’, i.e., the growth of a damaging performativity culture in schools around the National Standards.
14. There will be further research in the RAINS schools in 2013, leading up to the next and final RAINS report.
GUIDE TO ACRONYMS USED
AP/DP Assistant Principal/Deputy Principal
asTTle Assessment Tools for Teaching and Learning
BOT Board of Trustees (also referred to as ‘Boards’)
BTAC Boards Taking Action Coalition
ELL English Language Learner
ERO Education Review Office
ESOL English for Speakers of Other Languages
FRSSNZ Federation of Rudolf Steiner Schools in New Zealand
GERM Global Education Reform Movement
GloSS Global Strategy Stage (Numeracy Project assessment tool)
IKAN Individual Knowledge Assessment for Numeracy
MCSFoRA Ministerial Cross-Sector Forum on Raising Achievement
MoE Ministry of Education (also referred to as ‘the Ministry’)
MTL Maths Technology Ltd
NZC New Zealand Curriculum
NZCER New Zealand Council for Educational Research
NZEI New Zealand Educational Institute Te Riu Roa
NZPA New Zealand Press Association
NZPF New Zealand Principals Federation
NSSAG National Standards Sector Advisory Group
OECD Organisation for Economic Co-operation and Development
OIA Official Information Act
ORS Ongoing Resourcing Scheme
OTJ Overall Teacher Judgement
PAI Public Achievement Information
PaCT Progress and Consistency Tool
PAT Progressive Achievement Test
PD Professional Development
PLD Professional Learning and Development
PRT Provisionally Registered Teacher
RAINS Research, Analysis and Insight into National Standards
RTLB Resource Teacher: Learning and Behaviour
RTLit Resource Teacher: Literacy
SAP Student Achievement Practitioner
SES Socio-Economic Status
SLS Supplementary Learning Support
SMS Student Management System
SLT Senior Leadership Team
STA School Trustees Association
STAR Supplementary Test of Achievement in Reading
TFEA Targeted Funding for Educational Achievement
TKI Te Kete Ipurangi (Ministry of Education website)
TABLE OF CONTENTS
Overview of this report ... 2
Guide to acronyms used ... 4
Table of contents ... 5
1. Introduction: New Zealand’s very local National Standards ... 8
1.1 A brief account of the National Standards ... 9
1.2 An overview of the RAINS project and its initial findings ... 10
1.3 The rest of this second report ... 12
2. Further background ... 13
2.1 Developments in the National Standards policy and its contestation over 2012 ... 13
The introduction of National Standards became in some ways less contentious ... 13
Introduction of National Standards overtaken by events ... 13
The lead-up to the release of the National Standards data ... 14
The release of the National Standards data ... 16
Other National Standards developments as highlighted by the media during 2012 ... 18
‘Under the radar’ National Standards developments ... 21
2.2 The RAINS research approach in 2012 ... 23
Research at Huia Intermediate ... 24
Research in the other five RAINS schools ... 25
The comparability of OTJs ... 25
Quality assurance ... 26
Dissemination ... 26
Politics of the RAINS research in 2012 ... 27
2.3 The MTL research on OTJ-making ... 28
3. Understanding why New Zealand’s National Standards are so local ... 30
3.1 Six schools, six different accounts ... 32
Juniper School ... 32
Seagull School ... 33
Kanuka School ... 33
Magenta School ... 34
Cicada School ... 34
Huia Intermediate ... 35
3.2 Sources of variation at national and regional level ... 36
Ambiguities ... 36
Varying professional development opportunities ... 37
Difficulties around advice ... 38
Weak Ministry requirements ... 39
Crude and misleading reporting ... 39
3.3 Differences in schools’ overall trajectories ... 40
3.4 Differences in each school’s framing of the National Standards ... 42
Approaches to the National Standards categories ... 42
Matching National Standards categories to curriculum levels etc. ... 42
The rigour of data sent to the Ministry ... 45
3.5 Differences in the detail of policies and practices ... 45
Formalising policies and practices ... 45
Discussion about National Standards and National Standards-related areas within the schools .. 46
Intervention by the SLT ... 46
Balance between types of evidence for informing OTJs ... 47
Choice of assessment tools ... 48
Assessment and moderation procedures ... 49
3.6 Sources of variation at the classroom level ... 50
Differences in teacher judgements ... 50
Deviation from school expectations ... 50
Children’s practices ... 50
4. The RAINS schools in 2012 ... 51
4.1 Juniper School ... 51
General changes at Juniper ... 51
Development of National Standards at Juniper ... 51
Impact of National Standards at Juniper ... 52
Forwarding of National Standards data to the Ministry ... 52
Release of National Standards data by the media and Ministry ... 52
Template, PaCT and test changes ... 53
Relationship with the Ministry ... 53
Views of the wider policy environment ... 53
Future directions ... 54
4.2 Seagull School ... 54
General changes at Seagull School ... 54
Development of National Standards at Seagull School ... 54
Impact of National Standards at Seagull School ... 54
Forwarding of National Standards data to the Ministry ... 55
Release of National Standards data by the media and Ministry ... 55
Template, PaCT and test changes ... 56
Relationship with the Ministry ... 56
Views of the wider policy environment ... 56
Future directions ... 57
4.3 Kanuka School ... 57
General changes at Kanuka ... 57
Development of National Standards at Kanuka ... 58
Impact of National Standards at Kanuka ... 59
Forwarding of National Standards data to the Ministry ... 60
Release of National Standards data by the media and Ministry ... 60
Template, PaCT and test changes ... 61
Relationship with the Ministry ... 62
Views of the wider policy environment ... 63
Future directions ... 63
4.4 Magenta School... 64
General changes at Magenta ... 64
Development of National Standards at Magenta ... 64
Impact of National Standards at Magenta ... 65
Forwarding of National Standards data to the Ministry ... 65
Release of National Standards data by the media and Ministry ... 66
Template, PaCT and test changes ... 66
Relationship with the Ministry ... 67
Views of the wider policy environment ... 67
Future directions ... 68
4.5 Cicada School ... 69
General changes at Cicada ... 69
Development of National Standards at Cicada ... 69
Impact of National Standards at Cicada ... 69
Forwarding of National Standards data to the Ministry ... 70
Release of data by the media and Ministry ... 71
Template, PaCT and test changes ... 72
Relationship with the Ministry ... 72
Views of the wider policy environment ... 74
Future directions ... 75
4.6 Huia Intermediate ... 75
General changes at Huia Intermediate ... 75
Development of National Standards at Huia Intermediate ... 76
Impact of National Standards at Huia Intermediate ... 77
Forwarding of National Standards data to the Ministry ... 79
Release of National Standards data by the media and Ministry ... 80
Template, PaCT and test changes ... 82
Relationship with the Ministry ... 82
Views of the wider policy environment ... 83
Future directions ... 85
5. Discussion and next steps ... 87
5.1 The release of the data in 2012 ... 87
5.2 Making a profit with no questions asked ... 88
5.3 New Zealand’s local National Standards ... 88
5.4 The RAINS schools in 2012 ... 89
5.5 Little obvious pressure for compliance ... 90
5.6 Schools ‘doing it to themselves’ ... 90
5.7 Next steps ... 91
References ... 92
Appendix 1: Interview questions for 2012 SLT interviews ... 98
1. INTRODUCTION: NEW ZEALAND’S VERY LOCAL NATIONAL STANDARDS
Over the last few years, the New Zealand Government has been trying to establish National Standards for student achievement in reading, writing and mathematics in New Zealand primary and intermediate schools. For various reasons—good and bad—the policy is highly idiosyncratic, so much so that international observers would often struggle even to recognise the New Zealand approach as a National Standards system.1 Furthermore, the policy is continuing to be developed as the Government’s aspirations around the National Standards policy become clearer. Important developments in 2012 included the announcement mid-year that the Government intended to release schools’ National Standards data, the subsequent and controversial release of data in September 2012 despite its widely acknowledged weaknesses, and the related announcement of new plans to ensure better quality data in the future.
All of this created much debate, continuing the controversy that has surrounded the National Standards policy since 2009 (see Thrupp & Easter, 2012). An issue at the heart of the controversy is whether or not the comparability of teacher judgements against the standards (called OTJs—Overall Teacher Judgements) can be improved so that public release of the data would allow ‘apples with apples’ comparisons of student achievement across schools. (Note that this is still not the same as ‘fair’ comparisons, as the New Zealand National Standards system is based on ‘raw’ data that doesn’t try to account for intake and other contextual differences between schools.) From the perspective of Government and the research programme it has commissioned, there is considerable variation in OTJs, and this problem is seen to be caused by the weak ability of individual teachers to make OTJs (Ward & Thomas, 2012). In contrast, the Research, Analysis and Insight into National Standards (RAINS) project discussed here illustrates that comparability between schools in making OTJs has to be seen as more than just a matter of individual practice. This is because variation is often a symptom of the deep reach of school-specific factors (school context) along with the incremental (i.e. rather than more fundamental) processes of change that are all that schools can realistically manage. This mix of context and incrementalism is referred to here as creating particular local school trajectories that will not be easily turned. The research here indicates that variation in OTJ-making often relates back to these trajectories.
It is this issue that this report addresses most, as part of the wider RAINS research programme into how schools are responding to or ‘enacting’ (Ball, Maguire, & Braun, 2012) the National Standards policy. An understanding of how local context and incrementalism colours schools’ approaches to the National Standards has profound implications for the development of the National Standards policy.
Attempts to shore up national comparability across schools by a mandated reporting template from 2013 will be illusory if the underlying judgements are being heavily influenced by school-specific factors. The same issue can be expected to limit the effectiveness of the Progress and Consistency Tool (PaCT), an online platform intended to increase the consistency of teacher judgements across the country. PaCT is being trialled during 2013 to be ready for 2014 and will become mandatory from 2015. While PaCT might be expected to address some types of variation between schools due to individual teacher judgements, there are other sources of variation between schools discussed in this report that it is unlikely to address, nor will it address the underlying reasons for these variations.
The rest of this introduction provides a brief account of the National Standards, an overview of the RAINS project and its initial findings, and signals the content of this second report. The report focuses on developments in the schools and nationally in 2012; readers are reminded that the first RAINS report (Thrupp & Easter, 2012) contained much detail about the National Standards policy and its contestation up until the beginning of 2012, as well as further details of the RAINS research design and methodology and previous findings. Other information about how the standards system is intended to work and be progressively ‘rolled out’ can be found on the Ministry of Education (‘Ministry’) website and on TKI (Te Kete Ipurangi), the Ministry’s portal website for schools.
1 This has been apparent at conferences where I have needed to explain the New Zealand approach to international audiences.
1.1 A BRIEF ACCOUNT OF THE NATIONAL STANDARDS
New Zealand’s National Standards were introduced in 2009 and involve schools making and reporting judgements about the reading, writing and mathematics achievement of children up to Year 8 (the end of primary school). These judgements are made against a four-point scale (‘above’, ‘at’, ‘below’, or ‘well below’ the Standard) and are made after one, two or three years at school in the junior school and then at each year level from Years 4–8 (i.e., by the end of Year 4, Year 5, etc.). The policy matches up existing curriculum levels and assessment stages and progressions with the National Standards and so, in practice, teachers are supposed to consider students’ achievement against what is required for those levels, progressions and stages and use that understanding for then making overall teacher judgements (OTJs) about achievement against the National Standards. OTJs are therefore intended to be ‘on-balance’ judgements made by using various indications of a child’s level of achievement, such as teachers’ knowledge of each child from daily interactions and in relation to exemplars and assessment tools, tasks and activities. The National Standards policy also requires schools to report to parents about a child’s achievement against the National Standards twice a year. Schools do not need to use the wording of the four-point scale in this reporting, but they are expected to report against the scale when they report annually to the Ministry about student achievement levels in the school.
The National Standards policy has been one of the most controversial school-level educational developments in New Zealand for decades. Although there are many reasons for this,2 a key issue has been the way National Standards represented such a sharp break from earlier approaches to primary assessment, because New Zealand had previously avoided high stakes approaches to assessment and the associated curriculum narrowing and other perverse effects of performativity that have been found in other national settings (Alexander, 2009; Au, 2009; Ball, 2003; Comber, 2012; Hursh, 2008; Lingard, 2010; Nichols & Berliner, 2007; Stobart, 2008). During the decades prior to the election of the current National Government in 2008, there had been an emphasis on formative assessment, backed up after 1995 by the National Educational Monitoring Project (NEMP), which provided a national overview of achievement by sampling all areas of the curriculum over consecutive four-year assessment cycles. There had also been a tradition, especially up to the end of the 1980s, of sector representatives such as teachers and principals being heavily involved in curriculum and assessment policy development. But the National Standards were legislated for and developed by the present Government with little consultation (Thrupp, 2010) and threatened to take New Zealand down a high stakes path that most educators had been pleased to be avoiding.
Yet whether New Zealand’s system of National Standards would lead to the unfortunate outcomes of high stakes assessment found elsewhere was not straightforward. As can be seen from what has already been described, teachers’ judgements against the National Standards were to potentially draw on many sources in an apparent attempt to avoid teachers ‘teaching to the test’. As a senior Ministry official put it in 2010:
New Zealand has taken a different approach to the rest of the world. We have used our national curriculum to determine the standard of achievement that needs to be reached at the end of each year. Other countries’ approach to standards has been to set them in relation to how students have actually performed on national tests. This approach could lead to narrowing the curriculum, and mediocre outcomes. Our approach has been bolder, to look to the future, and to determine what our students need to know in order for them to succeed. It’s not just about where we are today—but where we can be in the future. (Chamberlain, 2010)
Nor was it clear at that time that the National Standards data would be publicly released. As discussed in more detail in Section 2.1, the Government has vacillated on this issue, initially saying in 2009 that the data might have to be released because of the requirements of the Official Information Act (OIA) and then promising by the end of that year (when opposition to the National Standards had intensified) that the Government would not create ‘league tables’ of school performance. But then in August 2012 the Government’s ‘Public Achievement Information’ (PAI) policy was announced. An almost immediate effect of this (September 2012) was to make the 2011 end-of-year National Standards data from nearly all of New Zealand’s primary and intermediate schools available on the Ministry’s Education Counts website, www.educationcounts.govt.nz. At about the same time some media (i.e., the Fairfax Group and the APN-owned Herald On Sunday) also released data in newspaper tables and, in the case of Fairfax, a search-and-compare online database. The Education Counts data does not rank schools in any way and, in the struggle to legitimate their involvement in the face of considerable criticism, the newspapers avoided obvious ranking of schools as well.

2 See Thrupp & Easter (2012) p. 10.
What New Zealand has been left with then is an assessment policy for primary schools that was initially presented as a flexible and relatively low stakes assessment system, albeit with very crude categories, but one that within three years is becoming the basis of publicly released assessments of schools, assessments that are also likely to become more comparative and high stakes in future. At the same time, the policy has so far developed little structure to support high stakes assessment, although the Ministry is now working on this as part of the PAI policy. There has been little professional development (and generally only for the senior leaders in schools, not for all classroom teachers), no requirement to report in a consistent format (but this will be required from 2013) and no national moderation (although in 2014 the Ministry will be bringing in the PaCT assessment tool mentioned above).
1.2 AN OVERVIEW OF THE RAINS PROJECT AND ITS INITIAL FINDINGS
At the heart of the RAINS project is the recognition that schools never just ‘implement’ policy. Rather RAINS is concerned with policy ‘enactment’: how the standards policy will be translated and reinterpreted at the local level by individuals and groups in different ways amidst the messy complexities and uncertainties of diverse school settings and numerous other educational policies and practices (Ball, Maguire, & Braun, 2012). One important reason to think about enactment is that the idiosyncratic features of the National Standards policy mean that context will be very important in how it plays out in schools. For instance, the general paucity of professional development around the National Standards means schools can be expected to draw on their existing approaches to assessment, while the OTJ approach, along with the absence of national moderation, allows for a great deal of local variation in how schools choose to approach the standards. Another reason for taking an enactment perspective is that the National Standards can be expected to require new performances by those in schools as complex social processes are translated into those simple categories of ‘well below’, ‘below’, ‘at’ and ‘above’ standard and reported at different levels within and beyond the school. Based on the international literature on performativity already noted, New Zealand teachers, principals and boards can be expected to be looking for advantageous assessment practices and curricular shifts if they want their schools or particular groups of children to perform well in the standards. They can be expected to look for ways to increase the proportions of children ‘at’ or ‘above’ through the decisions they make around the National Standards, such as choosing ‘easier’ tests. Related to both of the above, a third reason for seeing National Standards as enacted is that it has been such a heavily contested policy. Even if schools are now apparently mostly complying with the standards policy, this does not mean it has captured ‘hearts and minds’ amongst principals, teachers and boards. Their varying perspectives and concerns will continue to influence the way schools approach the National Standards.
These concerns are reflected in the project’s research questions:
1. How are boards, senior leadership teams and teachers in different school contexts enacting the National Standards policy?
2. To what extent is performativity apparent in these enactments of policy?
3. How does the evidence on policy enactments and performativity in relation to New Zealand’s National Standards compare to the international evidence? and
4. What lessons are there from the research for policy and for practice in schools?
In-depth qualitative research has been required to investigate these questions. The RAINS research design has involved case study research illuminating a wide range of perspectives and practices by drawing on multiple data sources. Case studies are of course studies of singularities but multiple cases allow for some level of generalisation (Bassey, 1999; Cohen, Manion, & Morrison, 2000). They are a “prime strategy for developing theory which illuminates educational policy and enhances educational practice” (Sikes, 1999, p. xi). The ways in which the Boards of Trustees, senior leadership teams and individual teachers in the six RAINS schools are enacting policy, as well as the responses of children and parents, are all being investigated. The views and approaches of other education professionals such as ERO reviewers are also of interest where they are in contact with the schools in relevant ways during the period of the research. Semi-structured interviews and other recorded and unrecorded discussions form the mainstay of data collection and there is also observation of classrooms and meetings and collection of relevant school documents and student data.
The six schools were introduced in the first report and are discussed again in Section 3.1 of this report. They were chosen primarily for their diverse characteristics in terms of the socio-economic and ethnic makeup of their intakes, school size and rural or suburban locations. While they vary in their level of support for the National Standards, only one (Cicada School) obviously resisted them. All the schools have had successful ERO reviews in recent years and they all enjoy reasonably favourable (and sometimes excellent) reputations in their local communities. Another feature of the research has been the involvement of an experienced teacher from each school—the RAINS ‘lead teachers’—in the research team. These teachers were chosen by the schools and have a role in facilitating the progress of the project in their respective schools and providing advice on matters such as the contexts of each school, the best areas to explore and questions to ask and whether emerging findings fit with their experiences in the setting under discussion.
The RAINS project aims to provide rich descriptions of how schools are enacting the National Standards. It generates internal validity through a ‘chain of evidence’ approach that allows readers to make their own judgements as to the plausibility of research claims. A ‘chain of evidence’ approach provides “a tight and interconnected path of recording evidence so that the reader who was not present to observe the case can follow the analysis and come to the stated conclusion” (Anderson & Arsenault, 1998, p. 159). For the RAINS project, data is being collected to refute or support existing theories and to add to them if possible. This implies comparative analysis within and across schools and also provides many themes to structure the analysis. At the same time, the analysis has needed to be sensitive to differences between New Zealand and the overseas contexts that have produced many of the previous research findings and open to considering the implications of these differences.
As noted above the RAINS research has taken place against the background of intense contestation of the National Standards. The project has also been controversial because it was funded not by the Ministry but by the New Zealand Educational Institute (NZEI). The effect is that the politics of the project are sometimes pushed more to the fore than in other research. The RAINS response to this situation is to be as reflexive as possible about issues such as media coverage of the project, academic activism and freedom, the effects on fieldwork, quality assurance, the political positioning of other research on the National Standards and the continuing debates around the policy, all of which are being explicitly discussed in project reports.3
One of the central findings of the first RAINS report (Thrupp & Easter, 2012) was that the changes around National Standards over 2009–11 were typically incremental rather than representing substantially new departures from what schools had already been doing. Reasons for this included the way the New Zealand standards system was not yet particularly ‘high stakes’ in terms of reputation, change in schools being tempered by what already-busy teachers could deal with and schools already having a major focus on numeracy and literacy as a result of policy over the last decade. Just as Cowie and colleagues found that the New Zealand Curriculum “did not arrive in a vacuum” (Cowie et al., 2009, p. 7), the same was true of the National Standards. The effect was that even the most obvious responses to the National Standards, such as report formats, tended to involve modifications of what the schools had already been doing.
Another key finding was that the RAINS schools’ approaches to the National Standards were
“intimately shaped and influenced by school-specific [contextual] factors” (Braun, Ball, Maguire, & Hoskins, 2011, p. 585). Such contextual factors include both intake differences (such as socio-economic make-up, ethnicity, transience, the proportion of pupils from migrant families or with
3 See Sections 1 and 2 of Thrupp & Easter (2012), especially pp. 41–43.
special needs) and other school and area characteristics (urban/rural location, market position compared to surrounding schools). There are also important internal contexts, such as the history of approaches to teaching, assessment and school organisation, reputational or recruitment issues and significant staffing changes. This is not an argument that leadership and teaching can’t make an important difference. Instead it recognises that there are internal school factors, especially historical ones, that can advantage or weigh heavily on schools even if there is little that schools can do about them (Lupton & Thrupp, 2013; Thrupp & Lupton, 2011).
1.3 THE REST OF THIS SECOND REPORT
Section 2 provides further background relevant to the 2012 RAINS research and the issue of comparability. There is a discussion of developments in the National Standards policy and its contestation during the year, looking at both the matters that attracted media attention and some developments that went more ‘under the radar’. This section goes on to detail the research activities undertaken as part of the RAINS research in 2012. It also reviews the recent MTL research on variability between teachers in making OTJs, both for its substantive findings and for what it tells us about the limited way this research programme, and the Key Government which commissioned it, is viewing the problem of consistency and comparability between schools.
Section 3 is concerned with better characterising and explaining variations across schools in terms of how OTJs are made. This section does some scene setting by reminding readers how the RAINS schools differ and by providing a flavour of the way their different features have led to varying responses to the National Standards, including different school decisions around the 2011 OTJs. The discussion then provides a more detailed account at national/regional, school and classroom levels. At each level there are sources of variation that are usefully illustrated by the cases of the RAINS schools.
Overall, and as the title of this report suggests, the discussion provides a multi-faceted explanation and some rich illustration of why National Standards are actually very local because of the effects of context and related trajectories on school processes around OTJs.
Section 4 then provides a further update for each of the RAINS schools for 2012. Matters covered include each school’s development of National Standards and perceived impacts, as well as particular activities related to the policy, such as the forwarding of National Standards data to the Ministry as part of each school’s annual report in May 2012 and the release of National Standards data by the media and Ministry in September 2012. Each school’s relationship with the Ministry and views of the wider policy environment and likely prospects for the future held by each school’s senior leadership team (SLT) are also discussed.
The report concludes (Section 5) by reflecting on the main points discussed and signalling matters that remain to be researched and/or discussed in the final RAINS report at the end of 2013. The RAINS research will also be a significant resource for the forthcoming ‘Primary Education: Taking Stock, Moving Forward’ Conference to be held in Wellington in January 2014 (see www.education2014.org.nz/).
2. FURTHER BACKGROUND
The first RAINS report (Thrupp & Easter, 2012) provided extensive background about the National Standards policy and its contestation, the research being undertaken for the Ministry by Maths Technology Ltd (MTL) and the research design and methodology of the RAINS project itself. This section essentially updates that earlier discussion, covering relevant developments in 2012.
2.1 DEVELOPMENTS IN THE NATIONAL STANDARDS POLICY AND ITS CONTESTATION OVER 2012
The introduction of National Standards became in some ways less contentious
In 2012 there were further indications of the spreading influence of the National Standards policy.
More web pages of primary and intermediate schools were mentioning the standards system. There was also more influence from the Ministry as reflected in the expanding coverage of the standards system on its websites as well as the way that the policy was becoming targeted by businesses as they began to see opportunities in the perceived demands from schools and parents (both discussed later).
Also noteworthy was that the requirement to forward National Standards data to the Ministry by May 31 as part of schools’ annual reports went almost without incident when compared with the tensions around submitting National Standards targets in charters the previous year. This was despite an NZEI warning in January 2012 that schools might boycott the provision of data (‘Schools may refuse to hand over student data’, 2012). Finally, and related to this apparent acquiescence by schools, whether or not schools would ‘implement’ the National Standards policy was no longer a matter for almost continuous media coverage as it had been.
To what extent did these patterns signal the National Standards becoming accepted or embedded in schools? It seems clear from the media coverage that accompanied the release of the National Standards data that some schools and teachers were genuinely enthusiastic about National Standards in 2012 and the RAINS research has also picked up some of this enthusiasm. Yet schools such as Kanuka—the most positive about National Standards amongst the RAINS schools—were not ‘buying into’ the policy in any simplistic way: their enthusiasm was contingent on it working for their children and community. Furthermore, most of the 2012 media coverage of schools suggests that they more often continued to see the standards system as a problem but were learning to live with it. The first RAINS report illustrated some of the complexities of this uneasy accommodation as experienced by the RAINS schools and this report reveals more.
Introduction of National Standards overtaken by events
The debate over the introduction of National Standards was also being pushed aside by new developments in 2012. First, a new Minister of Education, Hekia Parata, had been appointed as part of the Cabinet reshuffle after the Key Government had been re-elected for a second term in November 2011. Parata had not been associated with the early defence of the National Standards policy and so the campaign against the National Standards lost the lightning rod for discontent that her predecessor, Anne Tolley, had provided. Second, there were new Government proposals to release National Standards data and so when it came to National Standards the attention of the profession, the media and others tended to focus on these, as well as the eventual release of the data in late September, and closely related matters. Third, there was a series of highly contentious education policy developments in other areas of schooling, especially a Treasury-inspired proposal to increase class sizes (and the need to back down from this after public resistance), the development of ‘Partnership’ (charter) schools, the wholesale ‘reorganisation’ of schools in and around Christchurch following the earthquake there, the intended closure of some ‘special’ schools (unsuccessful in some instances) and major problems with a new pay system for those working in schools, Novopay. By the end of a tumultuous year, Education Secretary Lesley Longstone had resigned and the Minister’s future was also looking uncertain.
These new developments dominated educational politics in the primary sector in 2012, helping to further explain why schools’ responses to the National Standards became less contentious. For instance, returning to the requirement to forward National Standards data to the Ministry by May 31, this occurred at a time of intense debate and concern over class sizes and primary principals are likely to have been picking their battles. The same may be true of the embattled Ministry as well. Even by the end of September 2012 about 9% of schools (188 out of 2087) had still not provided the Ministry with National Standards data that could be put on the Education Counts website (Tapaleao, 2012c).
Such schools were said to be ‘liaising with the Ministry’ (ibid) with about 25 said by the Minister to be “having difficulty complying”—indicating the schools were withholding data in an apparent boycott (Trevett, 2012b). But such cases were dealt with quietly rather than the schools being publicly
‘named and shamed’. The new developments also had the effect of making the handling of National Standards by the Government seem relatively successful. For instance, a New Zealand Herald editorial written just prior to Christmas listed “errors that have embarrassed the Government in education this year”. These included class sizes, the Christchurch school closures, the ongoing Novopay debacle and the attempt to close Salisbury School, a school for girls with special needs. (‘Parata lucky to stay after year of errors’, 2012). The National Standards policy was not mentioned, even though the public release of National Standards data was also controversial.
At the same time, National Standards were never far from discussion about these other developments and surfaced regularly in related commentary and blogs. For those opposing the broad direction of the Government’s education policies, developments like charter schools came on top of National Standards and were often seen as part of the same neo-liberal package, the so-called GERM—Global Education Reform Movement. For those supporting the new developments, the National Standards policy continued to be identified as that which would bring the accountability that teachers and their organisations were refusing to face by their opposition to the Government’s wider educational reform agenda. (In fact such outlooks are deeply challenged by this second report, a point reiterated in Section 5). National Standards were also incorporated into some of the policy responses to the growing tensions in the sector. An example was that the NSSAG became inactive but PAI became the focus of a subgroup of the Ministerial Cross-Sector Forum on Raising Achievement (referred to hereafter as the MCSFoRA). This was a forum set up by the Minister for consultation with key sector representatives and other selected individuals after the Government’s backdown over class sizes.
The lead-up to the release of the National Standards data
The events that dominated media coverage of the National Standards in 2012 were those around the publication of National Standards data online and in newspapers. On this issue the Key Government had initially said in 2009 that the data might have to be released because of the requirements of the Official Information Act, promised by the end of that year (when opposition to the National Standards had intensified) that the Government would not create ‘league tables’ of school performance (Tolley, 2009), then argued in mid-2010 that league tables had become ‘inevitable’ after an advisory group had failed to come up with a means to prevent them (Hartevelt, 2010). In February 2012 plans for the release of the data were raised again when the new Minister of Education Hekia Parata raised the possibility of a Ministry website similar to the Australian ‘My School’ website, with schools compared within the same decile grouping (Hartevelt, 2012a; Young, 2012a), but by May she was still not committing to launching a website in 2012: “I am not going for haste over substance”
(Shuttleworth, 2012). Yet on 18 June, shortly after the Government had had to make the embarrassing backdown over the increased class sizes policy, Prime Minister Key mentioned—casually at a post-cabinet briefing (see Brown, 2012a)—that he supported some kind of government league table of National Standards results (Vance, 2012). Key argued that parents were “desperate” for comparative information on student achievement and that as the media could get the data from schools under the Official Information Act anyway, it would be better if Government became involved in the public release of the data. This suggestion quickly firmed up into a decision to release the data in some form in September (Young, 2012b) despite a much-quoted admission by Key in early July that the early data would be “very ropey”:
The earlier data, in my view, is unlikely to be terribly satisfactory for anybody so it does need a bit more time.… It’s extremely patchy and it’s in different forms and that
will make it very difficult to interpret…. But over time, the Government’s hope would be that it would be more consistent because the purpose of having better information is to give parents, I think, a better sense of how their school is performing. (cited by Hartevelt, 2012b)
The next two months leading up to the eventual release of National Standards data by media and Government saw more plans to release the data announced along with some intense debate over the merits or otherwise of releasing it. In July the MCSFoRA met and considered a report from the PAI subgroup that agreed to “the pro-active publication of National Standards achievement information at the school, regional and national level in September 2012” (MCSFoRA, 2012a, p. 2) and noted that
“through a process of continuous improvement over time, the quality of the information will improve and be increasingly useful” (p. 3). Representatives of NZEI, NZPF and PPTA dissented or expressed reservations about the report of this subgroup. A Ministry briefing on ‘Progressing The Government’s Education Priorities’ involved a target of “100% of schools … reporting high quality achievement information” by 2017 (Ministry of Education, 2012a), an uncompromising target presumably also intended to underpin the Government’s stated aim of “educational success for five out of five learners”.
On 8th August, while the Olympics were under way, the Government revealed how the National Standards data was to be released online in September as part of its continuing ‘Public Achievement Information Plan’ with various steps towards “incrementally improving the quality of the data” (Ministry of Education, n.d.-a; Office of Hekia Parata, 2012a). These steps included mandating the use of a standardised reporting template for the Ministry in 2013, schools’ electronic upload of National Standards data to the Ministry in 2014 and schools’ use of the PaCT tool in 2015.
The intended release of the National Standards data was generally supported by newspaper editorials (e.g., ‘Publish league tables,’ 2012), and newspapers also began to seek data directly from schools in order to publish it themselves. This started with the Dominion Post newspaper (Fairfax Media group) but a letter to schools in the Wellington region elicited only ten replies, not least because the NZEI and NZPF advised schools not to respond (‘Schools refuse to release national standards information’, 2012). This led in turn to Fairfax Media complaining to the Office of the Ombudsman, a ruling from the Ombudsman that schools must release their data under the OIA and advice from the Ministry to all primary and intermediate schools in the country that they must comply (Brown, 2012b). Nevertheless Fairfax was eventually only able to obtain data from about half the schools. It was John Hartevelt, a Fairfax journalist who had been reporting on the developments around league tables and covering a range of views for several years, who became the person to lead Fairfax’s publication of the data. The Herald on Sunday request to schools for the data came later and did not invoke the Official Information Act. As a result this newspaper was only able to publish data on around 600 schools.
There was support for the release of data from right-wing bloggers such as Slater (2012b) and Farrar (2012a) and from pro-market lobby groups the New Zealand Initiative (‘Parents “hungry for information”‘, 2012) and the Maxim Institute (Thomas, 2012a), although the latter also cautioned against the poor quality of the data (see also Thomas, 2012b). There was also support from Pem Bird, president of the newly formed Iwi Educational Authority (‘Iwi Education Authorities supports league tables’, 2012), a member of the MCSFoRA and president of the Māori Party (National’s coalition partner). Bird had previously been highly critical of what he described as the “political self serving scaremongering humbug NZEI are dishing up” (Heuber, 2010). On the other hand, Lorraine Kerr, head of STA, and also a member of the MCSFoRA, was less supportive of the release of the National Standards data than she had previously been of the introduction of National Standards: “We support the rights of parents to know how well their school is meeting their children’s needs.… We are not convinced that National Standards data is the best, or indeed the only relevant way of doing this”
(Kerr, 2012). It also emerged that the Ministry itself had advised the Minister against premature publication of National Standards data:
In a report sent to Ms Parata in June this year, the ministry warned against the wholesale release of the information. It said she should instead release it as part of a detailed report, outlining the problems with the data and how the issues would be addressed in the future. The officials said if Ms Parata released the information alone, she would risk losing buy-in from teachers and education groups and provide
justification for opponents of national standards. (‘Parata warned against publishing national standards data’, 2012)
Stronger public opposition leading up to the release of the National Standards data came from the NZEI (numerous media releases and local activities as part of the ‘Stand up for Kids’ campaign), NZPF (numerous media releases and advice to members), the Greens and Labour (especially MPs Catherine Delahunty and Nanaia Mahuta, questions in the House, media releases and blogging), bloggers such as Russell Brown and Kelvin Smythe (e.g., Brown, 2012a; Smythe, 2012a), academics including those in the Assessment Academy (see Johnston, 2012) and the author (Thrupp, 2012a, b) and many principals. In an exceptional case of academics unifying behind a cause, an open letter organised by the author along with John O’Neill of Massey University was eventually signed by over 170 academics (Thrupp, O’Neill, et al., 2012; Young, 2012c). There was a similar open letter signed by 277 Auckland principals (APPA, 2012) and further public support from principals elsewhere (e.g., Smythe, 2012e). Other forms of resistance prior to the release of the data included school principals and boards refusing to give their National Standards data to the media both before and regardless of OIA requests and many principals making comments in the media and in school newsletters despite being warned by Parata against using school newsletters for political comment (Sutton, 2012).
The release of the National Standards data
The release of the National Standards data gained intense coverage in both traditional and social media: what follows is not comprehensive but does cover many of the main developments.4 The release began on Friday 21 September 2012 with a press release from the Minister. The main points of this release were that 76% of primary-aged children were at or above standard for reading, 72% for maths and 68% for writing, with a “concerning number” of Māori and Pasifika children not achieving the Standard and boys over-represented amongst those not achieving the Standard in reading and writing (Office of Hekia Parata, 2012b). Although the release was scant on detail, it would have gained some publicity and ownership of the release of data for Government ahead of the media’s more substantial coverage at the weekend. As it turned out, it also constituted most of the Government’s public analysis of the data since it was just released by Government in the form provided by schools, discussed further shortly.
The next day, Saturday 22 September, saw the Fairfax release on the Stuff website and in its various regional newspapers. There was no ranking of schools but rather an approach that allowed easy comparisons, through both a searchable online database and tables of data from local schools in the newspapers. The accompanying commentary provided both assertive justifications of the release and frank concessions around the flawed nature of the data:
Many people told us not to publish the information you see on this site.
They fought to stop us. Some sent us bills for the privilege of their school’s data.
Others buried the figures we asked for in complex matrices and pages of indecipherable bumph.
Many more gave up their school’s National Standards data with a grave note of caution about the reliability and usefulness of it. We have not been deterred by the criticisms and the cautions, but neither were we unmoved by them.
Anyone who read the National Standards results as a proxy for quality would be quite foolish. We wouldn’t do that and we don’t suggest you do, either. For starters, they are not moderated, so one school’s “well below” may be another’s “at” or “above”. There is just no way of knowing—yet—exactly how the standards have been applied across schools….
So why publish National Standards data at all? Our critics have already suggested this is a “business decision”. An official in the Education Minister’s office charged that it
4 See Edwards (2012) for another broad discussion of the media coverage around the release of National Standards.
was “solely aimed at gazumping” the Government’s own website. Both accusations reflect the bias of their authors—and both are wrong. Of course we want people to look at what we have published here; to talk about it and to debate it. But that does not mean our decision to publish National Standards data was a “business decision”. This project has been led by journalists from the beginning….
If there are problems with the National Standards—and it’s pretty clear that there are—the Government, teachers, parents and education leaders are going to have to figure out how to fix them. If they have to be scrapped, then those that would have them scrapped will have to win the argument. In the meantime, the public should expect that the media will work to turn over National Standards information and report on it as best it can.
We cannot lose faith in our readers so much that we feel we have to censor them from information just because it is challenging. They are smarter than that and they deserve better. (Hartevelt, 2012c)
The Fairfax approach also included case studies (of ‘higher’ and ‘lower’ achieving schools), ‘health warnings’, tips for choosing schools, the facility to download the data, statements from various opponents of the release of the data and some contextual data such as school decile and links to ERO reports on the searchable database.
The Herald on Sunday published its data and commentary the next day (Sunday 23 September). The Herald’s approach did not include a searchable database or put the data online. Its tables were organised by deciles and regions and again there were various justifications, qualifications, tips and some (shorter) case studies and professional and other perspectives. Some of the professionals and academics cited supported the release5 but most were against (Wynn & Jillings, 2012). The Herald made up for its inclusion of fewer schools with larger claims, particularly that “children in bigger classes and bigger schools get better grades”. Its editorial praised the president of the Waikato Principals Association who, unlike all the “tunnel-visioned ideologues”, was “courageous enough to listen to the arguments” and “on discussion, had accepted it was better to talk through school results with parents than hide information from them” (‘Won’t someone please think of the children’, 2012).
On the other hand, in an article called ‘Lessons from the motherland’, deputy editor Jonathan Milne complained about the calibre of what the Herald on Sunday had nevertheless decided to report:
England’s stringent assessment regime has been widely panned. Yet strangely it is actually better in some respects than New Zealand’s new and shonky national standards. At least in England, the test results are checked and moderated before the inspectors print off the spreadsheets and decide which schools to close down. English schools are ranked on value-added data—how much children improve from one year to the next—rather than having the raw test results of privileged kids from the leafy suburbs compared directly with those from the concrete council estates. (Milne, 2012)

The schools that were most obviously devalued by the way the data was reported were special schools for children with various kinds of intellectual disability. As the Herald on Sunday put it, “despite being told they would be exempt from national standards … many show a line of noughts for the numbers of pupils achieving at or above standards” (Wynn & Jillings, 2012). This was referring to a change of Ministry of Education policy in late 2011 that saw all students at state schools, regardless of background characteristics, having to be entered for the National Standards or Ngā Whanaketanga, the Māori-medium assessment system. In the Waikato Times (Fairfax) published on 22 September, one of the schools in the table of Waikato schools (p. 4) was Hamilton North School. It stood as the only special school in the table as it had 100% ‘well below’ in all categories. The Waikato Times tried to put some context around this online by providing a sympathetic case study. But the case study and accompanying video clip was not in the hardcopy version of the paper containing the offending table, just a short quote from the principal: “‘We’ve talked about national standards, we know they’re there and we know that, due to the intellectual disability of our students, none of them are actually going to
5 Liz McKinley, the only university-based member of the MCSFoRA apart from Gary Hawke, was found to be in favour.
attain level one,’ [the principal] said. ‘It’s disappointing that we should be lumped in with all those other schools.’” (‘How our region’s schools stack up’, 2012). Oddly, this school was included when many other local schools were excluded because of privacy issues around their data. The video clip accompanying the online case study even discussed how parents of the children at this school had made it clear to the school that they did not want to be told three times a year in personal reports that their children were ‘well below’ standard; the school’s data was put in front of the public anyway.
As with the approach of the newspapers more generally, there was huge faith in qualifying commentary being able to make the publication of such data acceptable, a point taken up in Section 5.
The week following the media’s release of the data was marked by much commentary and analysis as to what conclusions could be really drawn from it, if any. There was further discussion of the various claims highlighted by the newspapers (e.g., ‘National standards data confirms boys are lagging behind’, 2012) and new coverage of the theme that children at lower decile schools were less likely to achieve at or above the standard. While some, including some opponents of National Standards, were gratified that at least the National Standards seemed to reflect the strong relationship between social inequalities and achievement (e.g., Hartevelt & Francis, 2012; McLauchlan, 2012a), there were numerous warnings from principals and bloggers, including some statistical analyses, that the data was not to be trusted (e.g., Crampton, 2012; Manning, 2012; McLauchlan, 2012b; McLauchlan, 2012c;
McNabb, 2012; Ng, 2012). Cases also came to light where newspapers seemed to have simply misreported the figures as provided by schools (Mahuta, 2012). Meanwhile the Minister continued to point to support from parents for the release of data (Hansard, 2012) and suggested that National Standards could eventually form part of a performance pay system (Hartevelt, 2012d).
Edwards (2012) claimed that, “… the blogosphere really added value to the debate. Shallow or self-serving analysis will be ruthlessly examined by people who know their way around a scientific calculator and/or a classroom”. Russell Brown’s Hard News blog considered the various claims from media and other commentators around what the data was showing and how misleading those claims might be (Brown, 2012c). He suggested that the release of the data could have been worse if there hadn’t been such contestation of the National Standards policy:
Opponents of the standards process might wish to reflect on what they have achieved in the process that began when John Key’s new National government shoved through national standards under urgency shortly after winning the election in 2008. Neither of the big newspaper groups has actually published a ‘league table’ of schools: both made positive decisions not to do so. It’s doubtful that we would be seeing so many obvious caveats on the reporting had the issue not been pursued.
On Friday 28 September, a week after the initial release, came the Government’s release of individual school data on the Education Counts website. As it turned out, this simply involved providing a pdf of whatever schools had sent the Ministry, sometimes even including handwritten notes. The website also provided some contextual information such as school decile, student ethnicity and ERO report, as well as a pop-up ‘health warning’ before the data could be accessed. Although Edwards (2012) had predicted “a fresh round of analysis”, there was relatively little. Fairfax remained resolute about the value of release (Hartevelt, 2012e), but the Herald was beginning to question whether a little knowledge could be a dangerous thing (Cumming, 2012). Within a few days the release of National Standards data had been overtaken by the story that research had found most teachers unable to make reliable OTJs (see Section 2.3).
Other National Standards developments as highlighted by the media during 2012
Stepping back in protest: In mid-February came news of the resignation of Louis Guy, principal of New Windsor School in Auckland, in protest at National Standards. Guy, who had been critical of National Standards, went to work for NZEI. A statement from the Minister implied Guy’s stance had been unprofessional:
If Mr Guy feels he is unable to require the teachers under his authority and
‘leadership’ to implement Government policy, then he has made the right decision to