Closing the emergency facility: Moving schools from literacy triage to better literacy outcomes

Joanne O’Mara

Abstract This article focuses on the tensions between national and international testing, educational policy and professionalism for middle school English teachers. I argue that state and federal governments are responding to the publication of international testing results such as PISA through NAPLAN and the publication of NAPLAN results on the MySchool website. I also argue that the international testing provides better markers of how we are doing as a nation, and of what might be done to improve our international standing with respect to our literacy scores. I use qualitative data from the survey conducted for the Victorian Association for the Teaching of English (VATE) response to the Australian government Senate inquiry into the unintended consequences of the National Assessment Program—Literacy and Numeracy (NAPLAN), as well as drawing cases from my own research and the literature. I use the metaphor of the hospital emergency department to explore this situation, with different and multiple educational professionals playing the role of the triage nurse: successive federal and state education ministers responding to international and state test results in triage, and principals of poorly performing schools operating their schools as though they were emergency departments, with poor literacy results triaged Code Red, receiving immediate focus and attention, but "treated" only in terms of immediate survival and a focus on basic skills.

Literacy Triage: Emergency Response to Code Red NAPLAN@MySchool
• Blame the workers under you (Prime Minister → Education Minister → Department → Regional Directors → Principals → Teachers)
• Teach to the test: make NAPLAN the curriculum
• Reduce other activities so the school can focus on NAPLAN
• Keep poorly performing students away or move them on
• Give prizes for good NAPLAN results

Literacy as Redesign: Long-term response to PISA trends
• Increase student inclusion and educational equality
• Develop a culture of reading
• Read and write longer, more complex texts
• Engage boys in literacy
• Keep up the good work in digital literacy

I argue that, as is well established in the literature and in my review of PISA 2009, true gains in literacy and the development of more complex literacy skills can be made through a redesign approach to schools and systems: long-term responses that increase student inclusion and educational equality, develop a culture of reading, build skills in reading and writing longer, more complex texts, engage all students, and retain our strong position in digital literacy.

Schools in Triage Mode

Researching a school animation program that had been identified as an outstanding case of school curriculum redesign, I interviewed a visiting Vice Principal, whom I shall call "Ken", who had been placed in the school for training purposes. The school he belonged to was 25 minutes further out of the city and in a much lower socio-economic area than this very middle-class school, which was devoting much time and many resources to the innovative and creative animation program. Ken "loved" every aspect of the program and was extremely enthusiastic about it. However, when I asked him how he might adapt it for his school, he told me that there was no way he could run a program like that there. Having just heard him wax lyrical about the benefits of the program for some minutes, I was surprised, so I probed further. He told me that because the National Assessment Program—Literacy and Numeracy (NAPLAN) results were very low at his school, they were in crisis and had to "focus on the basics". I tried to talk him through the numerous literacy skills that the students were developing through this program, the richness of the task, the complexity of the products, the cooperation skills, the creativity, the attention to good quality writing and textual structure, and the relationship between this middle-school rich task and later employment prospects, but he was unmoved. "It is a great program, but (at my school) we need to focus on the basics. Our kids need the basics. We have to improve our test results." From his determination and distress, it was obvious that he saw his school as being in emergency crisis mode, the measure of which was the NAPLAN results. This conversation took place before the Australian Labor federal government made the policy decision to publish the test results of every school in Australia on a searchable and comparable website, called MySchool, which I believe has intensified the constructed emergency around NAPLAN scores. Even then, the impact of this national testing was shaping the experience of teaching and learning at Ken's school. The school was already operating in triage mode in response to the "crisis" put upon it by its literacy results and the way in which the principal-class team were held responsible for them: as this leadership team moved from one crisis to another, the literacy and numeracy results became a form of emergency to deal with.

Ken is not alone. All over Australia, the leadership teams of schools that do not perform well in NAPLAN are operating in triage mode as a result of the publication of their NAPLAN scores on the MySchool website. The stakes are high for these schools, as parents "shop" for schools based on NAPLAN results, making decisions about where to enrol their child by trawling the MySchool website from the comfort of their own home, rather than bothering with school tours, discussions with teachers and principals, and consideration of what will really suit their child. As the mother of school-aged children in a middle-class suburb, I have been surprised and dismayed by the ways in which some of my neighbours and friends draw on the green and red bars of the MySchool website to measure how well a school is doing, rather than visiting, meeting with parents, and looking at how the students are faring. In this article, I identify some of the tensions between national and international testing, educational policy and professionalism for middle-school English teachers. Drawing on data collected from schools and teachers in Victoria as well as an analysis of the PISA reports, I identify a potential relationship between the impact of NAPLAN testing on school curriculum and performance and Australia's declining position and scores in the PISA testing.

Testing Times: Literacy, NAPLAN, PISA and MySchool

"Literacy" is frequently associated with basic skills, and the term itself is often used to mean the basic skills of something, as in "emotional literacy" or "computer literacy". Literacy is also often (tacitly, if not directly) regarded as the responsibility of early primary education, and can be neglected in middle schooling (Years 5-9). However, the focus on literacy learning and the development of a repertoire of complex literacy skills should continue in a systematic way through secondary education, and the attention that has been paid to this in the new national curriculum is encouraging. In Australia, students in Years 3, 5, 7 and 9 have been tested annually through NAPLAN since 2008. The Australian Curriculum, Assessment and Reporting Authority (ACARA) website states that NAPLAN "tests the sorts of skills that are essential for every child to progress through school and life, such as reading, writing, spelling and numeracy. The assessments are undertaken nationwide, every year, in the second full week in May" (ACARA, 2014). From the site, it is evident that ACARA has been addressing some of the issues for teachers and schools around the delay between the sitting of the tests and the production of results. In 2016, NAPLAN is being moved online; the tests will be able to be accessed on a variety of devices, and the results will be available to schools quickly enough to be used as part of their ongoing summative and diagnostic testing.

The major study that enables international comparisons of middle school literacy is the OECD Programme for International Student Assessment (PISA). PISA is conducted every three years; PISA 2009 (OECD, 2010) was the most recent set of published and fully analysed results at the time of writing, and is used as the discussion point in this article. In PISA, "Students are not assessed on the most basic reading skills, as it is assumed that most 15-year-old students will have acquired these. Rather, they are expected to demonstrate their proficiency in accessing and retrieving information, forming a broad general understanding of the text, interpreting it, reflecting on its contents and reflecting on its form and features" (Thomson et al., 2011). The OECD states that PISA is unique because its tests are not directly linked to the school curriculum: "The tests are designed to assess to what extent students at the end of compulsory education, can apply their knowledge to real-life situations and be equipped for full participation in society". PISA also collects sociological information and, importantly, information about students' reading habits and dispositions (OECD, 2014).

In 2009, Australia was listed as a high-performing country in reading literacy (with a score of 515); however, this result was statistically lower than in previous cycles (OECD, 2010). In their analysis of Australia's PISA results, Thomson et al. (2011) note that "Australia was the only high performing country to show a significant decline (13 score points) in reading literacy" between PISA 2000 and PISA 2009 (pp. vi-vii). This was largely due to falls in the results of students at the top levels of schooling (COAG, 2010, p. 27). PISA assesses student achievement at levels from 1a to 6, with 6 the highest and level 2 the minimum required for baseline participation in society. The proportion of students who achieved levels 5 and 6 declined over the decade from 18 per cent to 13 per cent. Barry McGaw, Chair of the Board of the Australian Curriculum, Assessment and Reporting Authority, noted that "The reasons for this are not immediately evident from the data but it is at least clear that it is due to schools focusing more on basic achievement levels and not so much on the development of sophisticated reading of complex text" (McGaw, 2010, p. 5). The other significant trends in PISA 2009 were a 17-point decrease in males' scores, with 4% more males not achieving the basic levels (Thomson et al., 2011), and an increase in the number of students who do not read for enjoyment. The losses here are awful: 4% more males not having the basic skills for employment and everyday life, and the follow-on effects that this has for their families and communities. In addition, the average score of students in remote locations was almost two years of schooling lower than that of students in metropolitan schools, and the gap between the performance of students from the highest and lowest socio-economic backgrounds can be up to three years of schooling.

Both the definition of literacy and the range of text types described and used in the print and digital PISA tests are compatible with the DEECD literacy definition (Masters, 2013). Literacy in PISA is defined as "an individual's capacity to understand, use, reflect on and engage with written texts, in order to achieve one's goals, to develop one's knowledge and potential and to participate in society" (OECD, 2009, p. 14). The print literacy testing covers three separate areas: text format, reading processes and the situation for which the text was constructed. The texts distinguish between "continuous texts" (organised in sentences and paragraphs) and "non-continuous texts" (organised in graphs, charts, forms etc.), and reading literacy is assessed in relation to text format, reading processes (aspects) and situations. The texts are selected based on the range of texts used in adult life. As noted above, students are not assessed on the most basic reading skills, which most 15-year-olds are assumed to have acquired, but on accessing and retrieving information, forming a broad general understanding of the text, interpreting it, and reflecting on its contents, form and features (Thomson et al., 2011). Situations concern textual purpose: for instance, reports and manuals are for occupational use, while novels and personal letters are for personal use.

For the first time, in 2009 the OECD ran a test of digital literacy skills with a smaller sample of 19 countries. In this test, Australia achieved a mean of 537 points, ranking second in the trial. Australian students performed on average 22 points higher in digital literacy than in print literacy, "except for students who attended schools in remote areas, whose digital and print reading literacy performances were not significantly different" (Thomson and De Bortoli, 2012, p. ii). Australian students were relatively highly engaged by the digital assessment and there was a very high completion rate. Australia had high usage of computers at school as well as at home.

The unintended consequences of NAPLAN

In June 2013, the Australian Senate announced an inquiry into the "unintended consequences" of NAPLAN. VATE does not have a set position on the NAPLAN program. Nevertheless, VATE council made a submission to the Senate Inquiry to communicate some of the experiences we had heard from the membership, or experienced ourselves, both as English teachers in schools and as English teacher educators in universities. As a council member, I prepared a short anonymous survey (using an online survey tool, Survey Monkey) and distributed it via email to the membership, expecting a few teachers to respond. We were overwhelmed with 88 responses over the four days that the survey was open, many of them very lengthy and extremely affecting. An overwhelming theme in our data is that NAPLAN is having very different effects on different schools, teachers and jurisdictions, some of these effects being extreme and very negative. We did not collect demographic data with our survey, so other than knowing that respondents are members of the association, we cannot map the responses to school type, socio-economic advantage or any other factors. That said, the teachers whose responses were the most distressed and most passionate, and who reported the greatest effects, also mentioned aspects of their school and student body, such as the adverse circumstances their students were in, that suggest schools in lower socio-economic areas. For instance:

…NAPLAN has created a state of forged data, manipulated statistics, the withdrawal of certain students, the coaching of students, teaching has become nothing more than teaching to the test... this is Australia wide. It has become such a frightful test that school programs are driven by it and we all saw it coming so it can only be concluded that the intent of the NAPLAN focus over the past few years has been to forge data at all costs - driven from the Prime Minister down. Furthermore the amount of taxpayers’ money that is splurged by schools needing 'literacy and numeracy' coaching, and help etc. because the data is not increasing enough (the teachers aren't cheating enough) is repugnant - where is the accountability?? I would like to know who is looking at the data associated with these programs as closely as they are looking at the NAPLAN data. Many students at our school are lucky to have breakfast, lucky to have been asleep by midnight, lucky to have a pen and lucky to have ever read a book in their lives and yet they are being taught to achieve better data in a test that they won't even take seriously to begin with. We had a regional director come in and talk about the data not increasing enough, our assistant principal chastised our staff over it - no mention of the crazy and dysfunctional students that need help - and then the programs are destroyed as funding gets syphoned into rubbish programs - I really want an audit nationwide into all of the programs at schools and curriculum that was destroyed because of the NAPLAN hype. I would like to know how much money has been wasted - and I would argue that many schools have actually gone backwards because of this focus.

This teacher and school are clearly suffering as the school responds to the regional director's criticism, leading to the principal's chastisement of staff, who are dealing with no breakfast, not enough sleep and not the right equipment as their starting point for working with the students. Like the opening vignette of Ken's school, this school is changing where it invests and the programs it is prepared to run because of the fallout from NAPLAN. I have selected some of the themes we developed, and quotes from teachers who were very deeply affected by NAPLAN in their school, and consider how these teachers described their experiences in terms of the impact of the testing on their sense of professionalism and autonomy, and on leading their schools into operating in emergency mode. In doing this, however, I would reinforce our major finding that the effects are very unequal across schools and jurisdictions. The VATE submission is publicly available from the VATE website and was published, with all the other submissions, on the Senate Inquiry website; I would encourage you to read it in full.

Unintended Consequences of NAPLAN: Code Red

A major and extremely negative unintended consequence of NAPLAN, and of the publication of the NAPLAN results on the MySchool website, is that it has pushed some school leadership teams into operating in emergency mode, triaging their curriculum and turning the study of NAPLAN tests and topics into the major focus of the curriculum. While this is unfortunate, I do not blame them for it, as the construction of the website invites the reader to read low scores as an emergency, resulting in a loss of confidence in the school by the community and a loss of reputation. Principals need to deal with that. On the site, a series of bar graphs shows the test results coded as dark green (very high), light green (high), white (neutral), light red (low) and dark red (very low). These are all compared to the Australian average, and schools are also ranked against "like" schools. Any school that falls under the Australian average is going to look bad, and this is particularly the case for schools that also do not do well compared to "like" schools. VATE members reported on some of the fallout from how these graphs were being read by the community, resulting at times in communities losing confidence in themselves, seeing themselves as failing and their school as "really bad".

We are now being compared unfavourably with 'like' schools when the comparison methodology is questionable and most likely inaccurate. The 'value adding' that is achieved at this school is higher than most other schools but that is not taken into consideration.

The school and community sees itself as lacking in ability due to low scores on NAPLAN. The principal and many of the staff concentrate on NAPLAN scores as the deciding indicator of success of student learning and teacher competence is viewed through the lens of NAPLAN success. Teaching and learning has become defined by NAPLAN performance.

No matter how much information is communicated to parents they choose to believe the inaccurate reporting in the media both newspapers & radio and the school has been left to deal with concerns.

I now have students asking me why our school is so bad. I have been told by students that I shouldn't expect better results from them because, "It's just the way things are here because it's a really bad school."

I think the broader social picture has also intensified the issues around NAPLAN for teachers. The fact that NAPLAN preparation books are available in supermarkets and that 'health food' companies are marketing herbal supplements as useful for NAPLAN only intensifies the broader social pressure on teachers and students around NAPLAN. These products also target parents, making them feel that they need to be working with their children at home to prepare them for the tests. This in turn intensifies the pressure on teachers and schools to 'prepare' students, as parents begin asking why children haven't had any preparation.

The publication of the NAPLAN test results on the MySchool website has affected many schools' reputations, in both positive and negative ways depending on their results. The emotional toll is high in schools with low results. One respondent described how the publication of the results "has made the teaching and learning practices much more test driven and exam focused. It can be devastating to a school in a low socio-economic area to have these results published on this website". Many teachers cited examples where loss of reputation gave NAPLAN great significance in shaping the school's teaching and learning processes. One teacher wrote that, "There are the usual complaints from on high (region) each year about how bad we all are, which is discouraging". This pushes down onto the principals, who begin triaging:

I have had executive members of staff comment openly on how the results look to the rest of the community although they are aware that it is not the only measure.

As teachers, we are held responsible for this. We are told that we have the responsibility to keep these results good or our reputation will be destroyed and "public education is in crisis".

The principal places pressure on the teacher leaders to improve NAPLAN scores and be able to explain from year to year why there has not been a significant improvement. It over-simplifies the process of teaching and learning, ignores the social and cultural aspects of schools, learning and teacher/student and student/ student relationships and has caused the principal to demand that teachers work in a standardised and measurable way.

There has been very interesting conversations and comparisons made between different cohorts of students which many of the senior admin team look at as showing 'improvement' etc., when it's a different group of students altogether.

Principals and school management were reported as sometimes resorting to desperate measures to preserve their school's reputation. Schools were reported to have encouraged low-achieving students not to attend school on the day of the testing, or not to sit the test. One teacher even reported that at their school, students with very poor results are shifted to "alternative pathways".

My school was going to encourage low achieving students to not sit the test, so their results did not impact the overall results of the school.

In 2012 20% of our non-indigenous students were exempt or absent for the tests. I believe this was because we had a low Yr 3 cohort and maybe students were encouraged not to attend school.

…the school data published on MySchool website is of the utmost importance to leadership, and basically dictates the inane things that they do to lift these scores. My school, like many public schools in lower socio-economic areas, tries to shift kids to 'alternative pathways' - that is to say, dump them on somebody else so that they do not bring down the schools’ overall data. Immoral, ineffective and sickening. There is a silent but pervasive awareness that these tests and published results drive everything at this school from the top down, that leadership makes extravagant promises about what our kids will 'achieve' on tests, and they will do anything to get those improvements to secure and advance their careers at the expense [of] learning, student well-being and teacher motivation. These tests and the publication of school results undermine the motivation and professionalism of teachers, particularly in schools like mine that are burdened with the bureaucratic pressure that falls on school communities lacking the cultural, social and financial capital to resist it.

Sometimes damage control takes the form of teachers being "instructed as to what to tell the public". Having positive NAPLAN results reported can lead to parents wanting to move their children to the school, and poor results can lead to parents removing their children from a school. In a competitive school environment such as we have in Victoria, where parents can choose any government school that has places, this can mean school numbers are both positively and adversely affected.

Our school has good NAPLAN results, so more students want to come here. This meant this year we lost our music room, with a loss in the quality of education for all students at the school. Next year we may lose our art room. Our specialist teachers may be lost as well. This would be a disaster for our school, all because of the government wanting some figures. Parents don’t understand what NAPLAN is, and it creates tension and stress for the school community as a whole - an external intervention in teaching and learning that has no benefit for the students, only harm.

Our school is struggling with numbers of enrolments and performance issues so NAPLAN is used as evidence of what needs to be done, whether staff are able to deliver or not, in places where resources and staffing are limited.

Parental/Community impacts

Respondents saw a "lack of understanding of the nature of the tests by parents": the results are interpreted by the community out of context, often with little knowledge of what the scores might mean or what the tests might be measuring, beyond whether the bars are green or red. One teacher wrote that NAPLAN "is a detrimental tool that is understood and used poorly by parents and the community". The importance of the NAPLAN results to parents is extremely frustrating to many teachers, who feel undermined in their teaching and learning:

As we know the results are made public and compared to other schools on the one test on the one day, we make every effort to create an environment conducive to the students doing the best that they can. It creates stress and is frustrating because the public do not necessarily understand the context.

Respondents wrote of the publication of results as undermining parental "trust of the school", despite the parents not really understanding what the tests were about: "Parents lack trust in what is being taught if their student doesn't do well".

The bottom line for many teachers was that, “Our school is perceived as deficit and the judgments made about the value of our school have become score oriented”. Positive NAPLAN results can likewise lead to a skewing of a school’s reputation that is “score-orientated”:

I am working with a school that topped the state in some areas of the literacy NAPLAN. The principal told me that 3 parents phoned her the day after the results were released wanting to change their children to her school. She felt quite insulted by this, because she saw NAPLAN as one very small part of what the school did and stood for. These parents did not care about the enormous energy the school was putting in to inter-cultural understanding or developing their art program. It was insulting to her.

Schools are responsible for far more than the range of skills tested in NAPLAN, but many respondents felt that this scoring was narrowing how the community perceived their worth. Schools with positive results are likewise affected, as the community in general turns to the MySchool site as the place to compare, judge and rank schools, neglecting what schools do beyond what is tested in NAPLAN. Many teachers feel passionately about the effects this is having on their school and on their ability to work effectively in their classroom.

It just reinforces the abhorrent attitude to education. Careers have been destroyed by this. Good people who care about their students have been destroyed. No one takes into account the difficult students that some take on and make a difference to; if their little data dots aren't increased enough then a teacher is considered a failure. MySchool needs to take into account the amount of abuse that happens from students and parents at a given school and the amount of students that have sociopathic issues, suspensions etc. And please can we let future teachers know that if they are in a government high school it is likely that they will be treated as dirt.

I worry that we are creating a working childhood for kids (I think of coal mining Victorian urchins). We need to cope with grey areas, yet we are only offering black and white.

The process is also intensifying competition between schools: respondents wrote about how the comparisons between schools were leading to negative competition between nearby schools, rather than collaboration, as schools competed for students and reputation:

There is some pressure from above to ensure that our results are as good as if not better than those of our close and similar competitors.

All the schools near us are private schools with select entry so we always look bad to local parents.

The VATE members who responded to our survey put forward a variety of positions and experiences regarding the impact of NAPLAN on teaching and learning at their schools. Clearly, NAPLAN is having a destructive effect on English curriculum and pedagogy in our schools. Even in schools where the reported effects of the testing were not as devastating, teachers still reported little actual benefit from the time and resources given to test preparation and processes, because of the delay between the sitting of the tests and the publication of results. The overemphasis on these tests, caused by their publication on the MySchool website and by the structure of the site itself, which encourages comparison of schools' results, has undermined public trust in teachers and their professional judgement. This mistrust has reached the stage where some parents place more value on what the NAPLAN results say about their child's progress than on what the teacher who has worked with the child in class every day throughout the school year has recorded. While these negative consequences have not been spread equally through the membership, the reports we received from members who responded to our brief survey gave us deep concerns about continuing with this testing program in its current form.


Closing the emergency facility: Lessons from PISA

The secondary literacy education section of the review focuses on the two OECD Programme for International Student Assessment (PISA) reading studies of 2009: Print Reading and Digital Reading. The rationale for focusing on PISA 2009 is multi-dimensional: PISA is the major international benchmark for literacy standards in secondary education; PISA 2009 is the most recent set of published and analysed results (OECD, 2010); the DEECD definition of literacy values digital and print literacies; in PISA 2009, Australia's position and its actual scores in Print Reading were statistically in decline from previous PISA tests; and PISA 2009 included digital literacy tests in which Australia ranked second (with New Zealand). The review uses the detailed findings from PISA 2009 to develop a framework of specific areas that the tests indicate need attention in order to improve secondary literacy practices in Australia. These areas are then examined through the lens of systems that have attained or maintained PISA success (OECD, 2009, 2010, 2011): Shanghai, Korea, Finland, Hong Kong, Singapore and Ontario.

These systems, along with Australia as a high-performing nation on the digital literacy tests, provide examples to support the recommendations, with a particular focus on Ontario's extremely successful strategies. As a very successful "like" system, Ontario offers strategies with great potential for use in the Victorian context.

The model adopted by Ontario has five aspects: focus, build relationships, persist, develop capacity, and spread quality implementation.

Increase student inclusion and educational equality

Across all cycles of PISA, Australia has shown larger than average effects for socio-economic status. Ontario's recent reforms have seen considerable gains in literacy outcomes and school retention for all students, regardless of socio-economic background. The increase in the number of students not achieving level 2 is of as much concern as the decrease in the number of high-performing students. Luke (2010) warns:
• that the closure of the 'equity gap' in Australian education cannot be addressed by a principal policy emphasis on the teaching and high stakes testing of basic autonomous skills and behaviours; and
• that longitudinally sustained improvement in the performance of students from low socioeconomic and Indigenous communities will require an enacted curriculum that features: intellectually challenging, demanding and interesting knowledge; sustained and scaffolded linguistic interaction around and about that knowledge; and demonstrable links between school knowledge and the everyday realities of Australian life, cultures and work.
"Educational inequality is not a given. Some schools, some school systems, and some countries do more to mitigate inequality than others. Australia has chosen to participate in PISA in order to monitor national outcomes on a regular basis – the challenge is to act on these findings as other countries have, to lift educational outcomes for all students" (Thomson et al., 2011, p. xiv). Ontario has shown that it is possible to turn around some of these inequalities and improve outcomes for all students, and the similarities between Canada and Australia make it possible to implement similar approaches in Victoria.

An example of such a jurisdiction is Ontario, where reforms have shifted the focus away from punitive accountability measures, performance pay and course completion, and towards building shared sociocultural and leadership values and purposes within a system that aspires to improve (Fullan, 2006). The key to this strategy is professionally-driven rather than market-driven system change. Levin (2007) describes the approach as follows:

The Province of Ontario's education change strategy embodies vital principles, grounded in research, that are associated with meaningful and sustainable change. Changes are respectful of professional knowledge and practice. Main elements of change are coherent and aligned at the provincial, district and school level. Key partners – the provincial Ministry of Education, school boards, schools, and provincial and local organizations of teachers, principals, and other partners – work together. Change strategies are comprehensive and emphasise professional learning, strong leadership, necessary resources, and effective engagement of parents and the broader community. We believe that this strategy provides an example of large-scale change that is effective and sustainable.

On the basis of this professionally-driven approach, Ontario balances administrative and professional accountability and has seen a dramatic reduction in low-performing schools (Levin & Fullan, 2008). The underlying assumption of Ontario's leaders seems to be that teachers are professionals who are generally motivated but require ongoing support and investment in their capacity building. Consequently, teachers seem to take more responsibility for performance than is often the case in countries with a more punitive approach to external accountability (Audet, Barnes, Clegg, Heggie, Jamieson, Klinger, Levine, Martinussen & Wade-Woolley, 2009; Hargreaves & Shirley, 2009, 2012; Leithwood & Massey, 2012). Ontario has made substantial improvements in its reading proficiency levels, with 14% more students reaching proficiency in reading and 13% more students graduating from high school since 2004. Moreover, the improvements have been sustained, as the numbers continue to rise, and have been inclusive of all students. Ontario's policies focus on literacy, numeracy and school completion. Rather than working against teacher unions and relying on punitive interventions, the government worked with teacher unions and provided extensive support to help teachers and schools. It formed a Literacy and Numeracy Secretariat (LNS), which developed, in tandem with unions, teachers and schools, a variety of initiatives across a range of issues. A review of the reforms concluded that: "The consistent finding across all components of the study is that over its brief history, Ontario's Literacy and Numeracy Secretariat (LNS) has had a major, and primarily highly positive, impact on Ontario's education system. Overall, the level of activity associated with and generated by the LNS is very high". The review cited impressive numbers of documented initiatives, and facilitative and direct roles played by the LNS, which have led to "... a significant shift in the culture of Ontario schools that is focused on enabling the success of all students. There has also been sustained improvement in student achievement. These are major accomplishments".

The increased understanding of the importance of literacy has been recognized at every level, leading to changes in attitudes and behaviors at every level. Many schools have developed strong Professional Learning Communities (PLCs) within and across many schools. The Ministry of Education has been key to the development of the initiatives, and through the LNS is providing much needed resources and opportunities for local schools to move forward. There has also been an intensification of Ministry research within the province.

Develop a culture of reading

PISA also collected data on reading for enjoyment, which is a strong marker of student success. The Australian report noted that one third of Australian students said that they did not read for enjoyment (OECD, 2010). This suggests that there needs to be a renewed focus on reading for pleasure, and more work done on creating a reading culture in Australian schools. The provision of time for reading for pleasure in English classes is an effective way of creating a classroom culture of reading, improving student attitude and affinity towards reading (Australian School Library Association, 2003; Krashen, 2004; Pilgreen, 2000). Finland has a traditionally strong reading culture, with free libraries, and reading is part of everyday community practice (Sahlberg, 2011; Johnson, Adams et al., 2012). In Hong Kong, the development of a reading culture in schools was one of the specific aims of the educational reforms (Cheng, 2001), and class time dedicated to reading, discussions about texts, and resources were part of this strategy. A priority of recent reforms in Ontario has been to increase the engagement of secondary students in reading. Notably, the relationship between enjoyment of reading and digital reading performance is stronger in Australia than in other countries, which suggests that this is an area that deserves particular attention.

Read and write longer, more complex texts

In their analysis of Australia's PISA 2009 results, Thomson et al. (2011) note that "Australia was the only high performing country to show a significant decline (13 score points) in reading literacy between PISA 2000 (with a mean score of 528 points) and PISA 2009 (with a mean score of 515 points)" (pp. vi-vii). This was largely due to falls in the results of students at the top levels of schooling (COAG, 2010, p. 27). PISA assesses student achievement at levels from 1a to 6, with 6 the highest and level 2 the minimum required for baseline participation in society. The proportion of students who achieved levels 5 and 6 declined over the decade from 18 per cent to 13 per cent. "The reasons for this are not immediately evident from the data but it is at least clear that it is due to schools focusing more on basic achievement levels and not so much on the development of sophisticated reading of complex text" (McGaw, 2010, p. 5). While the OECD average is consistent across the text types and the skills tested, Australian students are less proficient with continuous texts and with the skills of accessing and retrieving information and of integration and interpretation. While the teaching of the reading and composition of these kinds of texts is embedded in the Victorian curriculum, they are vulnerable to oversight when the focus is on basic skills, and changes at VCE level, such as the removal of the writing folio in 2000, have knock-on effects on the amount of time dedicated to text composition further down the school. Higher performing countries such as Finland have a more extensive literature curriculum as well as an emphasis on composition (Sahlberg, 2011). These countries hold a conviction that all students can perform at high levels, and focusing on the inclusion of all students was an important part of the Ontario strategy (Canadian Council on Learning, 2007). The Australian PISA 2009 results indicate that teachers need a renewed focus on the development of sophisticated reading and composition skills, with attention paid to the direct teaching of literacy skills across the curriculum, and an acknowledgement that the repertoire of literacy practices needed to engage with contemporary texts continues to increase exponentially, transforming reading and other engagements with text (Coiro, Knobel, Lankshear & Leu, 2008; OECD, 2011).

One Australian jurisdiction that has focused on this since 2009 is the Catholic Education Office Melbourne (CEOM). It has implemented the Secondary Literacy Improvement Project (SLIP) as part of its literacy strategy goal of improving the educational experience and learning outcomes of all students. Flexible in its delivery, SLIP provides schools with targeted support directed towards empowering teachers and building leadership capacity. The project also clearly demonstrates the benefits of an enabling school culture in developing teachers' and students' knowledge and practice about language and literacy, and students' ability to demonstrate subject-specific knowledge in assessments and in turn transfer what they learn into future contexts. The early results of this work are very promising, and the approach could be scaled up around the country.

Engage boys in literacy

The other significant trends in PISA 2009 for Australia were a 17-point decrease in males' scores, with 4% more males not achieving the basic levels (Thomson et al., 2011), and an increase in the number of students who do not read for enjoyment. In Australia there has been concern about the lower literacy levels of male students for some time (Alloway, Freebody, Gilbert & Muspratt, 2002). Alloway et al.'s 2002 study found that boys made significant gains in their literacy when teachers gave them opportunities to (1) represent the self, (2) relate to others and (3) negotiate and engage with culture. Teachers can expand the repertoire of literacy practices to ensure these three areas are included. Ontario has made significant gains in boys' literacy practices after a significant series of efforts under the umbrella of The Road Ahead: Boys' Literacy Teacher Inquiry Project. Eight key learnings were identified:

  1. The power of teaching with a wide variety of materials
  2. The role of social interaction in learning
  3. The importance of regular and consistent provision of choice
  4. The importance of student talk
  5. The value of using differentiated approaches
  6. The importance of clear assessment strategies
  7. The benefits of information and communication technology
  8. The need to engage parents/guardians and the community as partners

These learnings are primarily about teacher practices and curriculum, although some (such as engaging parents and the community) require a whole-school approach. It should be noted that both the Ontario project and the earlier Australian study cited found that these practices, which are successful with boys, are also successful with girls, and can be seen as examples of good practice for engaging adolescents and extending their repertoire of literacy practices.

Keep up the good work in digital literacy

Digital literacy skills are extremely important to contemporary students, who are living in a rapidly changing world demanding ever-increasing abilities to use new technological tools and innovations (Carrington & Marsh, 2008; Knobel & Lankshear, 2010; Lankshear & Knobel, 2003a, 2003b; Merchant, 2007b). As noted above, in 2009 the OECD ran a test of digital literacy skills for the first time, with a smaller sample of 19 countries. In this test, Australia achieved a mean of 537 points, ranking second in the trial. Australian students performed on average 22 points higher in digital literacy than in print literacy, "except for students who attended schools in remote areas, whose digital and print reading literacy performances were not significantly different" (Thomson and De Bortoli, 2012, p. ii). Australia has had robust policies and support for digital literacy for some time, and considerable resources have been devoted to it (DEEWR, 2012; DEECD, 2009). Australian teachers have knowledge and skills in this area, and the results show that students are more engaged with digital texts. PISA 2009 revealed that girls and boys both perform better with digital texts, but this is particularly the case for boys. "In the digital medium, girls are still performing relatively well as readers in comparison with boys, but the gap is narrower. Finding some way to harness the reading interests and strengths of boys would have great national benefits" (Mendelovits, 2012, p. 8).

Moving schools from literacy triage to better literacy outcomes


It should be noted that while the PISA results are compelling, and their analysis can show us many useful things about both Australian education and that of other nations, we cannot simply borrow policy from other nations. Policies "do not travel very well"; they are developed in situ, and "many of the effective educational policy suites on display are not methods or approaches that can be wrenched out of context" (Luke, 2011, p. 375). There is still much that we can learn from other nations, however, and similar jurisdictions such as Ontario, Canada, can give us models for improving outcomes that translate reasonably well to the Australian context.

Bibliography

ACARA (2014). NAPLAN. http://www.nap.edu.au/naplan/naplan.html

Alloway, N., Freebody, P., Gilbert, P., & Muspratt, S. (2002). Boys, Literacy and Schooling: Expanding the Repertoires of Practice. Curriculum Corporation and DEST.

Australian School Library Association (2003). Impact of School Libraries on Student Achievement: A Review of the Research. Camberwell: ACER.

Bodkin, B., Clemens, M., Dotten, R., Lafleur, C., Stagg Peterson, S., & Swartz, L. (2009). The Road Ahead: Boys' Literacy Teacher Inquiry Project Final Report. Funded by the Ontario Ministry of Education. http://www.edu.gov.on.ca/eng/curriculum/RoadAhead2009.pdf

Canadian Council on Learning (2007). State of Learning in Canada: No Time for Complacency. Ottawa, Ont.: Canadian Council on Learning. http://site.ebrary.com/lib/mcgill/Doc?id=10195001

Carrington, V. & Marsh, J. (2008). Forms of literacy. Document commissioned by the Beyond Current Horizons Project: Technology, Children, Schools and Families (pp. 1-20). http://www.beyondcurrenthorizons.org.uk/forms-of-literacy/

Coiro, J., Knobel, M., Lankshear, C. & Leu, D.J. (Eds). (2008). Handbook of Research on New Literacies. Mahwah, NJ: Lawrence Erlbaum.

DEECD (Department of Education and Early Childhood Development) (2009). Digital Learning Statement. Melbourne: Innovation and Next Practice Division. http://www.education.vic.gov.au/researchinnovation/digitallearning

DEEWR (Department of Education, Employment & Workplace Relations) (2012). Digital Education Revolution. http://www.deewr.gov.au/SCHOOLING/DIGITALEDUCATIONREVOLUTION/Pages/default.aspx

Knobel, M. & Lankshear, C. (Eds). (2010). DIY Media: Creating, Sharing and Learning with New Technologies. New York: Peter Lang.

Krashen, S. (2004). The Power of Reading: Insights from the Research. Portsmouth, NH: Heinemann.

Lankshear, C., & Knobel, M. (2003a). New Literacies: Changing Knowledge and Classroom Learning. Philadelphia, PA: Open University Press.

Lankshear, C., & Knobel, M. (2003b). New technologies in early childhood literacy research: A review of research. Journal of Early Childhood Literacy, 3, 59-82. doi:10.1177/14687984030031003

Merchant, G. (2007b). Writing the future in the digital age. Literacy, 41(3), 118-128.

OECD (2011). PISA 2009 Results: Students On Line: Digital Technologies and Performance (Vol. VI). http://dx.doi.org/10.1787/9789264112995-en

OECD (2014). About PISA. Accessed 10.1.14. http://www.oecd.org/pisa/aboutpisa/

Pilgreen, J.L. (2000). The SSR Handbook: How to Organize and Manage a Sustained Silent Reading Program. Portsmouth, NH: Boynton/Cook.

Thomson, S., De Bortoli, L., Nicholas, M., Hillman, K., & Buckley, S. (2011). Challenges for Australian Education: Results from PISA 2009. Camberwell: ACER.

VATE submission to the Inquiry into the unintended effects of NAPLAN (2014). Dr Joanne O'Mara, Council Member; Ms Monika Wagner, President; Ms Nicoll Heaslip, Acting Executive Officer; and Ms Kate Gillespie, Education Officer, on behalf of the Association.