Improving students' writing through feedback

Feedback is one of the most powerful strategies for advancing student learning. When implemented effectively, high quality feedback can deliver the equivalent of an additional eight months' learning progress (Education Endowment Foundation, 2020a).

The key to effective feedback is ensuring it is targeted at the student's next step in learning or self-regulation. Feedback needs to be specific, accurate and clear in providing guidance on strategies for improvement, which can eventually become metacognitive strategies through the learning process (Education Endowment Foundation, 2020a; Wiliam, 2016). This means that effective feedback is not merely given; it must also be received by the student (Hattie et al., 2016).

The impact of feedback on student learning can be quantified through meta-analysis (Education Endowment Foundation, 2020b). The spread of these impacts across several meta-analyses is shown in Figure 1. A key difference in the Kluger and DeNisi (1996) meta-analysis – which shows a relatively small impact of 0.41 compared with the mean weighted effect size of 0.63 – is that a third of its studies focused on praise, which we now know can have a detrimental impact on students' learning (Dweck, 2006; Hattie & Clarke, 2018).

Figure 1: Spread of effect sizes for feedback (adapted from Education Endowment Foundation, 2020a).

Feedback can be targeted at four different levels that are known to have varying impact on students' learning growth, as outlined in Table 1. These results should be seen as a general guide, as some students may respond to more positive support than others, due to social and emotional factors.

Table 1: Relative effectiveness of the four levels of feedback (adapted from AITSL and E4L, 2017; Evidence for Learning, 2020; Hattie & Timperley, 2007).

Feedback in writing in action: A school example

In 2019, Evidence for Learning (E4L) and the Queensland Catholic Education Commission commenced a partnership with the Science of Learning Research Centre (SLRC), based at The University of Queensland, to deliver the Research Partner School Project (RPSP). This work, based on the SLRC Triadic Partnership Model of Research Translation (MacMahon et al., 2020), involves 21 schools and 84 educators across Queensland from Brisbane to Cairns. The RPSP is a collaborative professional learning opportunity for teachers, school leaders, and sector leaders to engage in an ongoing dialogue with researchers about evidence-informed approaches to learning and teaching relevant to a problem of practice in their school.

In early 2019, Robyn, Luke and Elisa, educators from Our Lady Help of Christians School (OLHOC) in the outer regional location of Earlville, 1678 kilometres north of Brisbane, began their work with the RPSP. During a two-day workshop in Cairns facilitated by the SLRC and E4L, the team from OLHOC decided that their focus would be on teacher feedback on students' writing. They examined the research on teacher feedback, metacognition, and self-regulation through the Teaching & Learning Toolkit along with other literature on feedback and implementation (AITSL and E4L, 2017b; Hattie & Timperley, 2007; Sharples et al., 2019). Through examining the evidence, they determined their research question: To what extent does our [teacher] feedback intervention affect or improve self-regulation and metacognitive skills when students are writing?

Upskilling teachers on key concepts

The OLHOC research project team then formed a small team to implement strategies for upskilling Year 2 and Year 4 teachers on key concepts relating to high quality feedback for developing metacognitive strategies, specifically in relation to their writing program. Teachers in Years 2 and 4 were the target of the intervention, as any impact could benefit student NAPLAN results in Years 3 and 5. Strategically, the potential long-term benefit of upskilling a small group of early adopters was to facilitate a later, broader implementation developing all staff on effective feedback processes (dependent on positive data from the initial intervention). This approach has been used to build teacher capacity at OLHOC in the past and is aligned with a key tool of effective implementation: cultivating leaders throughout the school (Sharples et al., 2019).

Teacher observations and student interviews

The team collected a mix of qualitative and quantitative baseline data, including writing achievement data, student writing samples, teacher observations and student interviews on feedback. This is considered practice-based data – helping educators determine where their school currently is and whether an intervention has had an impact (Vaughan et al., 2017).

All teacher observations were carried out with the teachers' consent and, to ensure data integrity, the purpose of the observations was not disclosed to the teachers involved. Three observations per teacher were carried out using scoreboards. Verbatim transcripts were created to capture the verbal feedback teachers gave to students during a targeted 30-minute writing lesson. Each piece of feedback in the transcripts was then coded as Task, Process, Self-Regulation or Self (Praise) level. Teachers were also interviewed after the intervention.

Students were grouped according to their current achievement in writing (high, average, and low). Twenty-four students were randomly selected and interviewed in groups of four. Samples of students' writing pre- and post-intervention were collected. These writing samples are what OLHOC refers to as an Unaided Writing Task and are conducted under NAPLAN-style conditions. The interviewed students were asked eight questions about feedback, taken from Visible Learning Plus. The questions, listed below, initially referred only to ‘feedback' in general. However, after the first 12 interviews, the OLHOC researchers decided to include the phrase ‘about your writing', to better elicit information about writing in particular:

  • What does feedback look like?
  • What does feedback about your writing look like?
  • What does feedback sound like?
  • What does feedback about your writing sound like?
  • Give examples of feedback that has helped you?
  • Give examples of feedback about your writing that has helped you?
  • Give examples of feedback that has not helped you?
  • Give examples of feedback about your writing that has not helped you?

Preliminary data analysis

Year 4 teachers' scoreboards showed the most common form of feedback students received from their teacher was ‘task' feedback (49 per cent), followed by praise (26 per cent), process level (18 per cent) and self-regulation (7 per cent). This indicated that teachers needed guidance to develop feedback processes that facilitated student self-monitoring and self-regulation to achieve learning goals, supporting the research team's hypothesis.
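As a minimal illustration of this step, scoreboard tallies can be converted to the reported percentages as follows. The counts below are hypothetical, chosen only so that the rounded proportions match those reported for the Year 4 teachers:

```python
from collections import Counter

# Hypothetical scoreboard tallies across the observed lessons
# (counts are invented; only the rounded percentages match the article).
tallies = Counter({
    "Task": 70,
    "Self (Praise)": 37,
    "Process": 26,
    "Self-Regulation": 10,
})

total = sum(tallies.values())
percentages = {level: round(100 * n / total) for level, n in tallies.items()}
print(percentages)
# {'Task': 49, 'Self (Praise)': 26, 'Process': 18, 'Self-Regulation': 7}
```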

Student interview transcripts (analysed according to the same coding used in teacher observations) suggested students were unable to identify examples of teacher feedback specific to writing, despite the rewording of the interview questions.

Implementing improved feedback strategies

The team designed a ‘waitlist control group' approach, with teachers from the two different year levels starting to implement new feedback practices at different times of the year. The Year 4 teachers were part of phase one (starting Term 3, 2019), while the Year 2 teachers were part of phase two (Term 4, 2019). Three high impact feedback strategies were chosen as an intervention, based on their effect sizes.

The first was learning intentions and success criteria (AITSL and E4L, 2017a): starting each writing lesson with a shared learning intention and success criteria that explicitly describe what success looks like. The Year 4 and Year 2 teachers identified vocabulary as an area to be explicitly targeted, so the second strategy, ‘Bump-it-up Walls' (Sharratt, 2018), focused on vocabulary. Students were explicitly taught how to use the wall to self-monitor and regulate their writing. The third feedback strategy was the use of Question Stems (Hattie & Timperley, 2007). Three questions were used to create a dialogue of feedback between teachers and students: Where am I going? How am I going? Where to next?

Both sets of teachers undertook a professional development day to increase their repertoire of feedback strategies. The impacts of the phase one and two interventions are described below. Unfortunately, the final phase – which was whole school implementation of improved feedback strategies – has been delayed due to the COVID-19 pandemic.

Outcomes and teacher reflection

Despite this setback, teachers who undertook the RPSP training have had the opportunity to share the positive impact of this work and their own learnings with colleagues at a general staff meeting.

There was an overall improvement in how Year 2 and Year 4 students performed in the Unaided Writing Task. Pre- and post-intervention data show that 74 per cent of students achieved an effect size above 0.4, indicating progress at or above five months of learning in one term – compared to 26 per cent achieving the same growth in the term prior to the intervention. In addition, 79 per cent of students achieved at or above five months of learning growth in vocabulary (one of the target areas for instruction), compared to 7 per cent of students in the term prior to the intervention.
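For readers unfamiliar with the metric, a growth effect size of this kind is commonly computed as the change in mean score divided by a standard deviation. The sketch below is a minimal illustration using Cohen's d with a pooled standard deviation; the scores are invented for illustration, not the OLHOC data, and meta-analyses often use more refined variants (e.g. Hedges' g):

```python
import statistics

def growth_effect_size(pre, post):
    """Cohen's d for pre/post scores: mean difference over pooled SD."""
    mean_diff = statistics.mean(post) - statistics.mean(pre)
    # Pooled SD: square root of the average of the two sample variances
    pooled_sd = ((statistics.variance(pre) + statistics.variance(post)) / 2) ** 0.5
    return mean_diff / pooled_sd

# Invented writing scores for a small class (not the OLHOC data)
pre = [12, 14, 15, 13, 16, 14, 12, 15]
post = [15, 16, 18, 15, 19, 17, 14, 18]

d = growth_effect_size(pre, post)
print(round(d, 2))  # well above the 0.4 benchmark used in the article
```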

Teacher reflection on the RPSP (gathered through interviews), included: ‘It made a huge difference in the results at the end in their [students'] writing'; ‘It has completely changed my teaching practice'; and ‘I'm using it in all areas of my teaching now'.

The researchers at OLHOC believe that these comments, together with the student data, affirm the power of teacher feedback and point to a potential answer to their research question.

Conclusion

When a group of school leaders engages in making evidence-informed decisions at their school, drawing on qualitative and quantitative sources of practice-based evidence, real change that improves students' outcomes is possible. This is an example of a collaboration between university researchers and a national evidence organisation scaffolding a school's improvement journey. It is part of a larger effort to embed evidence within the education profession across Australia.

Two of the authors of this article, Dr Tanya Vaughan and Robyn Arri, will be sharing more details about this work at a free webinar, Feedback to increase student learning, on 18 August 2020.

References

AITSL and E4L. (2017a). Learning intentions and success criteria. https://www.aitsl.edu.au/teach/improve-practice/feedback

AITSL and E4L. (2017b). Reframing feedback to improve teaching and learning. https://www.aitsl.edu.au/teach/improve-practice/feedback

Dweck, C. S. (2006). Mindset: The new psychology of success. Random House Incorporated.

Education Endowment Foundation. (2020a). Teaching & Learning Toolkit: Feedback. Evidence for Learning. https://www.evidenceforlearning.org.au/teaching-and-learning-toolkit/feedback/

Education Endowment Foundation. (2020b). Teaching & Learning Toolkit. Evidence for Learning. https://www.evidenceforlearning.org.au/the-toolkits/the-teaching-and-learning-toolkit/full-toolkit/

Evidence for Learning. (2020). Improving literacy in upper primary. Evidence for Learning. https://evidenceforlearning.org.au/guidance-reports/improving-literacy-in-upper-primary

Hattie, J., & Clarke, S. (2018). Visible Learning: Feedback. Routledge.

Hattie, J., Gan, M., & Brooks, C. (2016). Instruction Based on Feedback. In R. E. Mayer & P. A. Alexander (Eds.), Handbook of research on learning and instruction (pp. 290-324). Routledge.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.

Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254–284.

MacMahon, S., Nugent, A., & Carroll, A. (2020). Developing a Model for the Translation of Science of Learning Research to the Classroom. In A. Carroll, R. Cunnington & A. Nugent (Eds.), Learning under the Lens – Applying findings from the Science of Learning to the Classroom. Routledge.

Sharples, J., Albers, B., Fraser, S., Deeble, M., & Vaughan, T. (2019). Putting Evidence to Work: A School's Guide to Implementation. Evidence for Learning and the Education Endowment Foundation. https://evidenceforlearning.org.au/guidance-reports/putting-evidence-to-work-a-schools-guide-to-implementation/

Sharratt, L. (2018). Clarity: What matters most in learning, teaching, and leading. Corwin Press.

Vaughan, T., Deeble, M., & Bush, J. (2017). Evidence-informed decision making. Australian Educational Leader, 39(4), 32.

Wiliam, D. (2016). The secret of effective feedback. Educational Leadership, 73(7), 10–15.

Think about a recent lesson where you gave a student feedback on their work. Take a look at Table 1 in this article. Using your example, which of the four levels (praise, task, process, self-regulation) did this feedback fall into? How can you provide more feedback at the process and self-regulation level?