or ‘The tale of why my classroom is full of students doing retests’
Students join us from a variety of secondary schools onto a range of different Maths programs (we are a 16-18 provider). They come with different experiences, strengths and methods that vary hugely based on where they have come from, what ability set they were in, and whether they had a specialist teacher for any, part or all of their later secondary education.
All of this compounds into problems at the start of the course. Both AS Maths and AS Use of Maths are incredibly algebra-heavy courses, and as such a level of competence and speed is necessary for success. Over the years we have found getting students up to speed on algebra to be a sink of class time early in the course, and frankly unfair on those students who were already competent and wanted to learn something new at their new college. So over the last two years we changed how we run the start of the year, from:
“We will spend the first week getting students up to speed and filling in gaps”

to:
“We will spend a fraction of time in the first two weeks checking students are up to speed and highlighting areas they need to improve. We will provide opportunity outside of class for remediation”
We put together a list of algebra skills we expected students to come to us with from GCSE, and that they should be able to demonstrate accuracy and efficiency with. I wrote 4 assessments for each skill, and we asked all students to sit one per lesson, for about 10 minutes at the beginning or end. If a student didn’t get 100%, we asked them to return outside of class for help and a resit.
Doesn’t sound like anything new to people who do SBG I guess, but it was a big change for us.
The first year we had 10 skills, shown below (with slight variation later between the two courses to take into account the lower entry requirement of AS Use of Maths). All were tested with 8 questions (except AS UoM skill 10). They looked something like this:
I’ll be honest, those first few weeks were a fairly harrowing experience for students and teachers. The marking load was high, especially for those teachers with 4 first-year classes, and students were frustrated with the timing (too many questions on the later tests) and the difficulty (particularly the surds example above). The organization also grew messier as the third week drew on (only four lessons a week meant it spilled over into week 3).
So for this year we changed things. We reduced from 10 skills to 8, cut the number of questions per assessment to 6, and kept the 10-minute target. We also, critically, kept the bar set at 100%. As I said to my classes: ‘If I ask you to write your name 6 times and you get it wrong once, we’d both be worried. These skills are just as important’.
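For anyone curious how simple the bookkeeping is, here is a minimal sketch of the pass/resit logic described above. The skill names, the `needs_resit` helper and the example scores are all hypothetical; the only rules taken from the post are the 6 questions per check and the 100% pass bar.

```python
# Hypothetical bookkeeping for the skills checks described above.
# Rules from the post: 6 questions per check, and a check only
# counts as passed on a perfect score (the 100% bar).

QUESTIONS_PER_CHECK = 6

def passed(score: int) -> bool:
    """A check is only passed with full marks."""
    return score == QUESTIONS_PER_CHECK

def needs_resit(scores: dict) -> list:
    """Return the skills a student must resit outside of class."""
    return [skill for skill, score in scores.items() if not passed(score)]

# Example: one (made-up) student's first attempts across the 8 skills.
student = {
    "skill_1": 6, "skill_2": 5, "skill_3": 6, "skill_4": 6,
    "skill_5": 4, "skill_6": 6, "skill_7": 6, "skill_8": 6,
}
print(needs_resit(student))  # → ['skill_2', 'skill_5']
```

Even 5 out of 6 lands a student on the resit list, which is the whole point of the name-writing analogy.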
So this is what they look like now:
We’re now winding down week 3, so all classes have tried them all once, and we have had a much better uptake on student resits. Sure, I still have students who haven’t passed any of them yet, and yes, I’m worried about them. But now, crucially, I have something concrete to start a discussion about support with managers, parents and, most importantly, the students themselves.
The fact is, I’m not sure how to truly measure the impact here. I’m certainly not sure it would work anywhere or everywhere else. All I know is that the students who got all, or at least most, of the skills done and sorted last year turned out to be the more successful ones, and that there’s a pretty strong correlation between those who didn’t bother and those who failed. I’m looking forward to a proper analysis once we have two years of data.
That’s all I have on that. If you want to see more of the assessments let me know.
update: actually, while this was sitting in my drafts folder, we finished up. I set a deadline, and on the last day I was positively snowed under with resits. Some students still didn’t make an attempt, or didn’t get them right the last time around – what to do next for those students is the next big job!