I’ve rushed out this blog to try to take control of the narrative, because in my view there are two damaging statements in the executive summary that are backed by no evidence, and school leaders may seize upon them. So forgive any grammar/punctuation mistakes!
1. The use of targets to make marking as specific and actionable as possible is likely to increase pupil progress
2. Pupils are unlikely to benefit from marking unless some time is set aside to enable pupils to consider and respond to marking
Let’s look at the targets section of the report:
It opens with:
“Very few studies appear to focus specifically on the impact of writing targets on work.”
This seems to be a theme throughout the report, which raises the question of why it was published in the first place. More concerning is that this statement is followed by this:
“However, a large number of studies and syntheses consistently identify the impact of making other types of feedback specific.”
So, basically, read it as follows: there is no evidence on written marking, but we are going to borrow findings from verbal feedback and use those conclusions so we have something to write about.
That theme is continued:
“Wider evidence on effective feedback – including studies of verbal and peer feedback in schools, as well as studies from related fields such as psychology – consistently finds that the specificity of feedback is a key determinant of its impact on performance, while feedback that is imprecise may be viewed by pupils as useless or frustrating”
So evidence from something completely different is being used to justify the following position:
“Given this wider evidence, setting clear targets in marking, and reminding pupils of these before they complete a similar piece of work in the future, appears to be a promising approach, which it would be valuable to evaluate further.”
Further evaluation is a must, because no evidence is being put forward: just fumbling in the dark and looking for parallels in different research areas. What is clear is that the first sentence is entirely without foundation, and calling it a “promising approach” has no basis WHATSOEVER. This section should be the other way around and read: “We don’t have any evidence, but if we look at a related area there are pointers (though not direct ones) which MUST BE EVALUATED BEFORE ANY CONCLUSIONS OR CLAIMS CAN BE MADE.”
Again, perhaps in order to fill space, the report goes on to say:
“Consistent with evidence about specificity, it is likely that short-term targets are more effective than longer-term goals, and when pupils are only working towards a small number of targets at any given time. Some studies indicate that different age groups may respond to targets in different ways, but no studies appear to have robustly evaluated this difference.”
Again, the other way around: “we have no evidence, but we have to write something about this, so we are guessing rather than leaving this space blank”.
To top it all off we have the following:
“In some cases, targets may be more effective if pupils have a role in setting them, or are required to re-write them in their own words. Studies from schools and universities suggest that teachers can overestimate the degree to which pupils understand targets or success criteria in the same way that they do, which may act as a barrier to improvement.”
By this time I’m tempted to cry, because I know how this will all be received by SLTs across the land. They will say the report says the following:
1. We need targets on work
2. They need to be short-term, i.e. on every piece of work
3. Pupils have to set their own targets
This DESPITE THE FACT THAT NONE OF THESE CLAIMS HAS ANY EVIDENCE IN ANY STUDIES BASED ON WHAT THE REPORT IS TALKING ABOUT, NAMELY WRITTEN MARKING. I am appalled that the authors could be so cavalier in their choice of words for this section.
Response to marking
The section on “dialogue marking”, after the usual paragraph saying there are virtually no studies, is more blunt than the others and states:
“No high-quality studies appear to have evaluated the impact of triple impact marking.”
For me, that should have been the end of this section.
The case study, which attempts to show something, mentions a project on verbal feedback via iPads, which has absolutely no relevance to the issue of triple marking in books. At any rate, the study mentions no substantive data whatsoever, other than that there were 231 pupils involved across a number of schools. There is no evidence of how they gathered information or evaluated it. In fact, it has no reason for being there AT ALL.
However, we have to get to the “response” part of the report to (try to) drill down for the evidence to back up the executive summary claim (point 2 above).
Let’s unpick this statement:
“The most basic question related to pupil responses is whether pupils should be given time in class to consider comments. While no high-quality experimental studies appear to have looked at this question, surveys in schools and higher education settings consistently suggest that pupils do not engage with or find it hard to act on the feedback they are given, and that pupils value the opportunity to respond to feedback.”
Read: no studies exist, but kids have told us they don’t read/understand written marking, yet they like responding (?)
Then we have this MASSIVE leap:
“Given this, it appears that there is a strong case for providing dedicated time to consider and respond to marking in class.”
Can we find the evidence that there is a strong case? Because some surveys have said so? Does it work? Does it help pupil progress? Does it increase kids’ learning? You decide. The evidence, or lack of it, is clear.
Can we go back to the statement above which said:
“Pupils are unlikely to benefit from marking unless some time is set aside to enable pupils to consider and respond to marking”
Is there any evidence for this? It may seem like common sense, but it’s how this will be taken out of context by some ineffective school leaders and inspectors who do not understand ‘research’ that worries me. If we flip this sentence and say, “pupils are unlikely to benefit from responding to marking because there is no evidence that they do”, can we be proved wrong by what we know?
This report is a living, breathing example of why you should NEVER read only the executive summary. But that aside, the report has no evidence about anything useful (to do with written marking) and should never have been published. In fact, it could have been a single paragraph saying the following: “we can’t find anything to look at, so we are saying to the research community that they MUST research this. In the meantime we will not be publishing a report, just in case school leaders take it out of context.”
IF this report is quoted at you in your workplace to make you do something you feel is daft, say the following: “there is no evidence to back up any conclusions you draw from this report. We are still waiting for decent studies in this area of research.”