I hate "grades" in education systems
Aug. 8th, 2020 08:45 am

A few days ago danieldwilliam sent me this thread (on their blog here) discussing the Scottish exams fiasco*.
What the thread points out is that the exam results have _always_ been a rubbish approach. They're a way of slicing the kids into layers of competence, so that employers and educators can skim off the layer they want. If everyone does well then the grades are adjusted to make the results fit a curve.
And, on top of that, judging everything you've learned in a course by a single letter makes no sense. A course may teach you dozens of skills; representing all of them with one letter is ludicrous.
My preferred method would be that you either understand something or you don't. Break a subject down into atoms of understanding. People then learn those, and build new ones on top of ones they've learned. This would have two effects: (1) you'd be able to see if the person had the actual skills an employer needed (e.g. needing long division, not caring about geometry) and (2) it would highlight places where teachers were trying to teach more complex knowledge on top of basic knowledge that the person didn't actually understand yet. ("Bob, you're trying to teach my kid long division, but he clearly hasn't understood basic division yet"). This was something I saw fairly regularly as a child - kids left behind as the classroom swept on, when a little more work on a few basics might have given them the chance to catch up. If they were lucky then a parent might help them get caught up.
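The "atoms of understanding" idea above can be pictured as a prerequisite graph: each skill depends on more basic skills, and a learner is ready for a skill only once its prerequisites are mastered. A minimal sketch of that idea — the skill names and the prerequisite map here are invented for illustration, not taken from any real curriculum:

```python
# Sketch: skills as a prerequisite graph. A learner may attempt a
# skill only once every prerequisite skill is already mastered.

# Hypothetical prerequisite map: skill -> set of direct prerequisites.
PREREQS = {
    "counting": set(),
    "addition": {"counting"},
    "subtraction": {"counting"},
    "multiplication": {"addition"},
    "basic_division": {"multiplication", "subtraction"},
    "long_division": {"basic_division"},
}

def ready_to_learn(skill, mastered):
    """True if every direct prerequisite of `skill` is in `mastered`."""
    return PREREQS[skill] <= set(mastered)

def missing_basics(skill, mastered):
    """All prerequisites (direct and indirect) not yet mastered --
    the '...but he clearly hasn't understood basic division yet' report."""
    missing = set()
    stack = list(PREREQS[skill])
    while stack:
        s = stack.pop()
        if s not in mastered and s not in missing:
            missing.add(s)
            stack.extend(PREREQS[s])  # walk further down the graph
    return missing
```

With that invented map, `missing_basics("long_division", {"counting", "addition", "subtraction"})` reports that multiplication and basic division still need work before long division is attempted — exactly the kind of diagnosis the paragraph above is asking for.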
I'm not convinced our current teaching setup would work well with this. But then I think our current teaching setup doesn't work for a ton of people**, and I'd like to see us look at ways of making it work for far more of us. I hate to think how many other people are let down by a system designed to mush everything into "This person, overall, understands this huge area better than 90% of other people." - a grade which seems useful for gatekeeping people moving on to the next level of the same subject, and not much else.
Edit: A grading system I would be comfortable with would be to mark people as "Can repeat by rote", "understands concepts", and "can explain concepts to others".
*No exams this year, so grades were given based on prelim exams and teacher predictions. But the exam authority moderated down the predicted grades for children from schools where exam grades were historically lower, effectively saying "You come from a poor area, so we don't believe that your results could be that good." The defence offered was that, even after this downgrading, more people from those schools got good grades than last year - in other words, that the predictions were unreasonably generous. There are clearly a lot of unhappy people, and this is going to take a while to unpick. More details here. For clarity, this is not just a Scottish problem: the front pages of the papers today say that 40% of A level (English and Welsh exams at 18) results will be lowered.
**To give a personal example, I took three attempts to pass my English language exams aged 16. I have no idea why. I read voraciously. I understood English pretty much as well as I do now. But the exams were asking questions where I literally did not understand what they wanted me to write. Instead, measuring my understanding of specific points, making it clear what it was they were asking me to prove I understood, would have been a huge benefit to me.
no subject
Date: 2020-08-08 08:28 am (UTC)
The problem is how people in that position ever get that far.
Being dysnumeric means I have no maths qualifications, for example, which means that the roads not taken are vulcanology, seismology and geology, which utterly fascinate me.
no subject
Date: 2020-08-09 06:38 pm (UTC)
Hand-drawn example of this effect from the doctor who diagnosed me:
http://www.deltaworld.info/media/events/conference.outandequal.autism.20031002/SquarePegs-20031002-Luskin.jpg
My high school English writing can be summarized by the frequent grade, "B- You strayed from the assignment." It suggests that my letter grade would have been higher based on the quality of my writing, but I failed to stay on the teacher-intended topic. I liked my topics more than theirs, apparently. :)
CIMA Exams
Date: 2020-08-08 10:16 am (UTC)
But.
One of the things that CIMA exams are trying to assess is whether you can work out what the actual question is, because real-life business problems don't walk up and announce themselves by reference to the curriculum.
Re: CIMA Exams
Date: 2020-08-08 10:50 am (UTC)
And that it contributes to "actually understands the concepts" as opposed to "rote learning".
no subject
Date: 2020-08-08 10:43 am (UTC)
(We had this at work where we trained a bunch of developers on a new language that they then didn't get to touch for six months, and it was just a waste, because the practice embeds the training.)
no subject
Date: 2020-08-08 10:55 am (UTC)
Or, failing that, a system like music grades, where you pick up about a level a year in whatever subjects you're learning *but you don't have to do them continuously*. So 1-5 roughly map onto GCSEs, and 7 is A-level, and 10 is a degree, and 11 is a master's, and 27 is a PhD (just joking...). So I'd be level 1 Mandarin Chinese and level 5 French and level 7 Latin and ooo, joint honours are interesting. Anyway. Level 0 in music, but AFAICT nobody makes you start music at a particular age and keep with the programme.
I like the specifications stuff more, but it's explicitly obvious right now that they don't trust the teachers to do the grading, and the current one-size-fits-all, sit-it-once system of public examinations is already unmanageably complex. I used to do invigilation for secondary school exams, and my contemporaneous notes say things like "We had the physics International Baccalaureate exams this morning. Twelve kids; each had a multiple choice paper followed by a written problems paper. This required *41* timed announcements. (Because some were Higher and some were Standard, and some were normal time and some were 25% extra, and everyone gets a half hour warning and a five minute warning, and the written papers have five minutes reading time at the beginning, and no two papers were the same length. This isn't entirely true: the Higher multiple choice with extra time is the same length as the Standard written paper without, but these are sequential so it doesn't help.)" That gets more complicated still if anyone gets to use a laptop, or arrives late, or has a nosebleed, and there aren't enough rooms in the school to be doing it for every year group.
I think all the government actually needs is to be able to identify a tranche of adequately competent kids to go on to university and then keep the system running, and the exams they've got are by and large up to that. They don't need to find the best kids, and they'd rather have ones from rich backgrounds than poor backgrounds because they'll be less inclined to start thinking about inconvenient reforms if they've experienced the system as it is as working for them. And then they explain to that tranche of kids that they've been selected on MERIT and obviously the kids agree and don't look too closely, and the kids who aren't selected get told they weren't good enough and, again, don't look too closely.
no subject
Date: 2020-08-08 01:06 pm (UTC)
Absolutely. Breaking down a subject makes it easier for everyone to understand what a child has understood or not. Global grades can be awfully demotivating too, making it seem like you're not able to do anything well in that subject, which is rarely true.
"Can repeat by rote", "understands concepts", and "can explain concepts to others".
The problem with "can repeat stuff by rote" is that it mostly tells me whether a student is able to work on their own and whether they have help at home. I prefer giving a student a task and seeing how well they can do it. Grading then comes down to a simple "not at all/still struggling/pretty well done". Now, if only I could provide individual help and more training to the students who need it...
no subject
Date: 2020-08-08 08:21 pm (UTC)
Previously I worked as an admissions advisor, so that means I saw every transcript of every person who applied to my institution. There are almost as many types of transcripts and grading systems as there are students. Not quite, but by gummy there's lots.
Admissions people and human resources people are looking for shorthand, and we also read between the lines. How many times did the person attempt a course; does the person load up with lots of similar courses because they do well in a particular thing; does the person have odd time gaps; are there odd things on the transcript.
It becomes a battle of attrition. As gatekeepers focus on a thing, educators develop cunning ways to manipulate a thing, to massage their students' thing into a better shaped thing. Eventually the thing measured no longer tells you anything useful, and the educators at the applied-to place stage a revolt and want admission to be based on "something real" like a portfolio or an entrance exam, and then the massaging begins on those new things.
Education isn't based on the apprenticeship model where the apprentice moves forward as they master certain skills. Education is an artificial way to reduce access to deliberately limited opportunities. It is about class and not about knowledge.
no subject
Date: 2020-08-10 10:09 am (UTC)
However, counter-theory: the current system does an adequate job, cheaply.
I wonder if there is a question of value engineering and good enough is close enough.
I'm assuming that an assessment scheme such as you are proposing would be more costly to operate. By "operate" I mean not just the assessment process itself but the use of its results. (This is an assumption, not a fact.) What is the overall system burden of a better assessment system?
Does the fact that someone has a bunch of Highers tell me if they are a great match for the admin job I have? Not especially, but it quickly and cheaply allows me to sort potentially suitable candidates from probably unsuitable ones. Do I really want the cost of a) working through a more detailed assessment to see if there is a great match in aptitude and b) paying taxes for the overhead of more detailed, more costly assessments which don't add value to a hiring decision? There will always be exceptions, but on average a quick and cheap assessment tool might get society what it needs in terms of matching suitable candidates to suitable jobs. Particularly where, if someone isn't that good at the job, they can be fired.
Similarly for entry to university. Does having 6 A's at Highers tell you that I'm going to be a great law student? Not extraordinarily well, but it's cheap, and if 10% of the class flunk out that might be okay.
Where more detailed assessments are valuable, I suspect they are going to be very tightly tailored to a specific role in a specific organisation and/or valuable enough to be done by the recruiting organisation.
Improved assessment might be useful for students, and more useful for some users of the assessment outcomes, but my guess (for the sake of this discussion) is that the cost of providing them to a recruiting organisation, and of that organisation using them, outweighs the benefits.
So my follow-up question is: what 21st-century technology could you deploy to make the improved assessment structure cheaper to operate?
no subject
Date: 2020-08-10 10:14 am (UTC)
And I think that's the fundamental issue. I believe that our education system is largely slanted towards producing signals for further education/employers, and I'd like it to be focussed on actually educating people, which I don't think it's doing very well, because it's "teaching to the test".